I am having trouble finding the source of lag in my code. I believe I have narrowed the possible source down to this method.
Essentially I start a script, store it in a Process variable p, grab the script's output with a BufferedReader, and put each line into an ArrayList.
Somehow I get lag whenever the script produces output (it writes at a 5-minute interval).
Any ideas?
public void runCommand(String path)
{
if (SystemUtils.IS_OS_WINDOWS)
{
ProcessBuilder builder = new ProcessBuilder("cmd.exe", "/c", "cd " + path + " && " + this.getCommand());
builder.redirectErrorStream(true);
try
{
p = builder.start();
}
catch (IOException e)
{
e.printStackTrace();
}
}
else
{
try
{
String name = ManagementFactory.getRuntimeMXBean().getName();
String pid = name.substring(0, name.indexOf("#"));
p = Runtime.getRuntime().exec("./btrace.sh " + pid + " " + path + " " + this.getConfig().getPort());
}
catch (Exception e)
{
e.printStackTrace();
}
}
BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()));
String line;
try
{
// Print out everything that's happening.
while (true)
{
line = r.readLine();
if (line == null)
{
break;
}
if (this.isDebugEnabled)
{
System.out.println("[Script Output]: " + line);
}
lines.add(line);
}
r.close();
}
catch (IOException e)
{
e.printStackTrace();
}
}
Seems like I found the cause of the lag.
After researching how ArrayLists resize, I realized the repeated resizing could be performance-taxing, so I tried a LinkedList instead.
So far it seems the issue is fixed.
Thanks!
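For reference, a minimal sketch of the two alternatives discussed above (the initial capacity of 4096 is just an illustrative guess, not a measured value):
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

List<String> lines = new ArrayList<>(4096);    // pre-sizing avoids most of the resize-and-copy work
List<String> linesLinked = new LinkedList<>(); // a LinkedList never copies a backing array on add, at the cost of cache locality and random access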
I have a really suspicious case here, involving a simple method which is supposed to write into a .txt file.
public void extractCoNLL(int n, String outputFile) throws IOException {
String msg;
PrintWriter pr = new PrintWriter(outputFile);
FileInputStream fConlliN = new FileInputStream(this.txt_CoNLL_in);
BufferedReader readBufferData = new BufferedReader(new InputStreamReader(fConlliN));
try {
while ((msg = readBufferData.readLine()) != null) {
String aMsg[] = msg.split("\\s+");
if (!msg.startsWith("#")) {
//pr.println(msg);
if (aMsg.length >= n) {
pr.print(aMsg[n] + "_"); // DOES NOT WORK
pr.println(aMsg[n] + "_"); // WORKS ?????
System.out.println(aMsg[4] + aMsg.length);
} else {
pr.println();
}
}
}
this.txt_CoNLL = out_Extracted_txt_CoNLL;
} catch (Exception e) {
System.err.println("Error Exception: " + e.getMessage());
}
}
Also, why is it not possible for me to use a simple " " (space), so that I am forced to use "_" to separate the words?
Very grateful for your help.
Thank you in advance!
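One thing that stands out, though it is only a guess since the calling code is not shown: pr is never flushed or closed, so anything still sitting in the PrintWriter's buffer when the method returns can be silently lost, which can make print vs. println look inconsistent. A minimal sketch of the same extraction using try-with-resources (the field names txt_CoNLL_in, txt_CoNLL and out_Extracted_txt_CoNLL are taken from your snippet; everything else is an assumption):
public void extractCoNLL(int n, String outputFile) throws IOException {
    // try-with-resources flushes and closes both the writer and the reader, even on an exception
    try (PrintWriter pr = new PrintWriter(outputFile);
         BufferedReader readBufferData = new BufferedReader(
                 new InputStreamReader(new FileInputStream(this.txt_CoNLL_in)))) {
        String msg;
        while ((msg = readBufferData.readLine()) != null) {
            if (msg.startsWith("#")) {
                continue; // skip comment lines
            }
            String[] aMsg = msg.split("\\s+");
            if (aMsg.length > n) { // length must be greater than n for aMsg[n] to exist
                pr.print(aMsg[n] + " "); // a plain space separator should come through fine once output is reliably flushed
            } else {
                pr.println();
            }
        }
    }
    this.txt_CoNLL = out_Extracted_txt_CoNLL;
}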
This program is meant to find two files located in a particular folder, merge them, and create a third file, which it does. From the merged file it then searches for a keyword such as "test"; once it finds the keyword, it prints out the position and the line it was found on, which it mostly does. The problem is that when I run the program it stops after finding the keyword the first time in a line and does not keep searching that line, so if "test" appears several times in one line it only reports the first occurrence and its position. I want it to print all occurrences. I think the indexOf logic is causing the issue.
import com.sun.deploy.util.StringUtils;
import java.io.*;
import java.lang.*;
import java.util.Scanner;
public class Concatenate {
public static void main(String[] args) {
String sourceFile1Path = "C:/Users/me/Desktop/test1.txt";
String sourceFile2Path = "C:/Users/me/Desktop/test2.txt";
String mergedFilePath = "C:/Users/me/Desktop/merged.txt";
File[] files = new File[2];
files[0] = new File(sourceFile1Path);
files[1] = new File(sourceFile2Path);
File mergedFile = new File(mergedFilePath);
mergeFiles(files, mergedFile);
stringSearch(args);
}
private static void mergeFiles(File[] files, File mergedFile) {
FileWriter fstream = null;
BufferedWriter out = null;
try {
fstream = new FileWriter(mergedFile, true);
out = new BufferedWriter(fstream);
} catch (IOException e1) {
e1.printStackTrace();
}
for (File f : files) {
System.out.println("merging: " + f.getName());
FileInputStream fis;
try {
fis = new FileInputStream(f);
BufferedReader in = new BufferedReader(new InputStreamReader(fis));
String aLine;
while ((aLine = in.readLine()) != null) {
out.write(aLine);
out.newLine();
}
in.close();
} catch (IOException e) {
e.printStackTrace();
}
}
try {
out.close();
} catch (IOException e) {
e.printStackTrace();
}
}
private static void stringSearch(String args[]) {
try {
String stringSearch = "test";
BufferedReader bf = new BufferedReader(new FileReader("C:/Users/me/Desktop/merged.txt"));
int linecount = 0;
String line;
System.out.println("Searching for " + stringSearch + " in file");
while (( line = bf.readLine()) != null){
linecount++;
int indexfound = line.indexOf(stringSearch);
if (indexfound > -1) {
System.out.println(stringSearch + " was found at position " + indexfound + " on line " + linecount);
System.out.println(line);
}
}
bf.close();
}
catch (IOException e) {
System.out.println("IO Error Occurred: " + e.toString());
}
}
}
It's because you are searching for the word once per line in your while loop. Each iteration of the loop takes you to the next line of the file because you are calling bf.readLine(). Try something like the following. You may have to tweak it but this should get you close.
while (( line = bf.readLine()) != null){
linecount++;
int indexfound = line.indexOf(stringSearch);
while(indexfound > -1)
{
System.out.println(stringSearch + " was found at position " + indexfound + " on line " + linecount);
System.out.println(line);
indexfound = line.indexOf(stringSearch, indexfound + stringSearch.length()); // start the next search after this match so the loop terminates
}
}
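An equivalent sketch using java.util.regex, in case you prefer to let Matcher do the scanning (Pattern.quote keeps the keyword literal; the variable names match the snippet above):
import java.util.regex.Matcher;
import java.util.regex.Pattern;

Matcher m = Pattern.compile(Pattern.quote(stringSearch)).matcher(line);
while (m.find()) {
    // m.start() is the index of this occurrence within the line
    System.out.println(stringSearch + " was found at position " + m.start() + " on line " + linecount);
}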
I'm trying to count the number of lines in a text file using a Unix command from Java code.
My code looks like:
String filePath = "/dir1/testFile.txt";
Runtime rt = Runtime.getRuntime();
Process p;
try {
System.out.println("No: of lines : ");
String findLineCount = "cat " + filePath + " | wc -l";
p = rt.exec(findLineCount);
p.waitFor();
} catch (Exception e) {
//code
}
But nothing is displayed in the console. When I execute the command directly, it works. What could be the issue in the above code?
I suggest you use a ProcessBuilder instead of Runtime.exec. You can also simplify your command by passing the filePath straight to wc. Please don't swallow Exceptions. Finally, you can use ProcessBuilder.inheritIO() (which sets the source and destination for the subprocess's standard I/O to be the same as those of the current Java process), like:
String filePath = "/dir1/testFile.txt";
try {
System.out.println("No: of lines : ");
ProcessBuilder pb = new ProcessBuilder("wc", "-l", filePath);
pb.inheritIO();
Process p = pb.start();
p.waitFor();
} catch (Exception e) {
e.printStackTrace();
}
Of course, it's more efficient to count the lines in Java without spawning a new process. Perhaps like this:
int count = 0;
String filePath = "/dir1/testFile.txt";
try (Scanner sc = new Scanner(new File(filePath))) {
while (sc.hasNextLine()) {
String line = sc.nextLine();
count++;
}
} catch (Exception e) {
e.printStackTrace();
}
System.out.printf("No: of lines : %d%n", count);
When I execute the command directly
I doubt you're executing it "directly"; you're probably running it in a shell.
Your code should run that script in a shell too.
rt.exec(new String[]{"bash", "-c", findLineCount});
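Even then, nothing will show up in your console unless you read the process's standard output yourself, because Runtime.exec does not forward it. A small sketch, reusing rt and findLineCount from the question:
Process p = rt.exec(new String[]{"bash", "-c", findLineCount});
try (BufferedReader out = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
    String result;
    while ((result = out.readLine()) != null) {
        System.out.println(result); // wc -l writes the line count here
    }
}
p.waitFor();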
This is how I printed the number of lines:
public static void main(String[] args) {
try {
Runtime run = Runtime.getRuntime();
String[] env = new String[] { "path=%PATH%;" + "your shell path" }; // path of the Cygwin bin or a similar application; this is needed only on Windows
Process proc = run.exec(new String[] { "bash.exe", "-c", "wc -l < yourfile" }, env);
BufferedReader reader = new BufferedReader(new InputStreamReader(
proc.getInputStream()));
String s;
while ((s = reader.readLine()) != null) {
System.out.println("Number of lines " + s);
}
proc.waitFor();
int exitValue = proc.exitValue();
System.out.println("Status {}" + exitValue);
} catch (IOException | InterruptedException e) {
e.printStackTrace();
}
}
A simple example of using PhantomJS from Java blocks indefinitely:
public void runPhantomJs(String path, String command) {
Process process;
String outFile = "a11.txt";
try {
process = Runtime.getRuntime().exec(path+ " " + command + " > " +outFile);
int exitStatus = process.waitFor();
//String status = (exitStatus == 0 ? "SUCCESS:" : "ERROR:");
File f = new File(outFile);
if (f.exists()) {
BufferedReader in = new BufferedReader(new InputStreamReader(new FileInputStream(f),"UTF-8"));
String str;
while ((str = in.readLine()) != null) {
System.out.println(str);
}
in.close();
System.out.println(str);
}
} catch (IOException e) {
e.printStackTrace();
} catch (InterruptedException e) {
e.printStackTrace();
}
}
The script I execute is very simple, but it prints a whole page to the console:
var webPage = require('webpage');
var page = webPage.create();
page.open('http://www.google.com/', function(status) {
if (status !== 'success') {
console.log('1');
phantom.exit();
} else {
console.log(page.content);
phantom.exit();
}
});
Note that in the pasted code I've added "> a11.txt" to see whether it would work better to read a file instead of reading the output directly. It should be faster, but for some reason it doesn't work. I suppose the redirection > isn't being applied.
So I got my code to work. Apparently the output of phantomjs has to be read or the buffer will fill up completely, blocking further execution.
So I think your code will work if you modify it like so:
process = Runtime.getRuntime().exec(path+ " " + command + " > " +outFile);
BufferedInputStream bis = new BufferedInputStream(process.getInputStream());
bis.close();
process.waitFor();
...
If it doesn't work, try using ProcessBuilder. This is my working code:
try {
String phantomJsExe = configuration.getPhantomJsExe().toString();
String phantomJsScript = configuration.getPhantomJsScript().toString();
String urlsTextFile = configuration.getPhantomJsUrlsTextFile().toString();
Process process = new ProcessBuilder(phantomJsExe, phantomJsScript, urlsTextFile).start();
BufferedInputStream bis = new BufferedInputStream(process.getInputStream());
bis.close();
process.waitFor();
} catch (Exception ex) {
ex.printStackTrace();
}
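One more note on the original attempt: Runtime.exec does not run the command through a shell, so the "> a11.txt" part is handed to phantomjs as a literal argument rather than being treated as a redirection. If you do want the page dumped to a file, ProcessBuilder can perform the redirection itself, which also keeps the output pipe from filling up. A rough sketch, assuming command is a single argument such as the script path (split it further if it is not):
ProcessBuilder pb = new ProcessBuilder(path, command); // the phantomjs binary and its script, as in the question's parameters
pb.redirectErrorStream(true);                          // fold stderr into stdout so neither pipe can fill up
pb.redirectOutput(new File(outFile));                  // the OS streams stdout straight into a11.txt
Process process = pb.start();
int exitStatus = process.waitFor();                    // safe to wait now: there is no pipe left for the JVM to drain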
The problem is that I have a Windows Excel-exported CSV containing the Swedish letters åäöÅÄÖ. When I upload it and convert it to a String, those letters come out completely garbled. The server is Tomcat 7 on Linux, and it's set to use ISO-8859-1.
I have tried different byte[] conversions, but none seem to work; I have removed all the conversions I tried from this code.
public void run(InputStreamReader is) {
BufferedReader br = null;
String line = "";
String cvsSplitBy = ";";
try {
br = new BufferedReader(is);
while ((line = br.readLine()) != null) {
// use semicolon as separator
String[] playerInfo = line.split(cvsSplitBy);
System.out.println("Förnamn: " + playerInfo[0]
+ "Efternamn: " + playerInfo[1]
+ "Klubb= " + playerInfo[7]
+ " , datum=" + playerInfo[10]
+ " , Total= " + playerInfo[14]
+ " , serier= " + playerInfo[15]);
saveInfo(playerInfo);
}
} catch (FileNotFoundException e) {
e.printStackTrace();
} catch (IOException e) {
e.printStackTrace();
} finally {
if (br != null) {
try {
br.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
System.out.println("Done");
}
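Since run() receives an already-constructed InputStreamReader, the bytes have in effect been decoded before this method ever sees them, so the fix belongs at the call site. A sketch of building the reader with an explicit charset (windows-1252 is a guess for an Excel export on a Swedish Windows machine, and "players.csv" stands in for the uploaded file's stream, which the question does not show):
import java.io.FileInputStream;
import java.io.InputStreamReader;
import java.nio.charset.Charset;

// decode the upload explicitly instead of relying on the platform default charset
InputStreamReader reader = new InputStreamReader(
        new FileInputStream("players.csv"), Charset.forName("windows-1252"));
run(reader);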