Here's my code:
Runtime re = Runtime.getRuntime();
try {
    Process cmd = re.exec("java -jar myProg.jar " + myArgument);
    BufferedReader output = new BufferedReader(new InputStreamReader(cmd.getInputStream()));
    String line;
    while ((line = output.readLine()) != null) {
        // process line
    }
} catch (Exception e) {
    e.printStackTrace();
}
When debugging this code snippet, I find that certain lines are skipped when reading from output.
If I run myProg.jar from the command line, the text I see there is not 100% the same as what I get when I process the output from inside my Java program.
What could cause this? The output is all text.
You only appear to be reading standard out, whereas you may be getting output on standard error as well. I would read both.
Note that you need to read both streams concurrently, to avoid blocking. See this answer for more details.
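For example, here is a minimal sketch of that concurrent reading, assuming the same myProg.jar and myArgument from the question; the "stdout:"/"stderr:" prefixes are just for illustration:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class RunMyProg {
    public static void main(String[] args) throws IOException, InterruptedException {
        String myArgument = args.length > 0 ? args[0] : ""; // placeholder for the real argument
        final Process cmd = Runtime.getRuntime().exec("java -jar myProg.jar " + myArgument);

        // Drain standard error on its own thread so neither buffer can fill up and block the child.
        Thread errReader = new Thread(new Runnable() {
            @Override
            public void run() {
                try {
                    BufferedReader err = new BufferedReader(new InputStreamReader(cmd.getErrorStream()));
                    String line;
                    while ((line = err.readLine()) != null) {
                        System.err.println("stderr: " + line);
                    }
                } catch (IOException ignored) {
                }
            }
        });
        errReader.start();

        // Read standard output on the main thread.
        BufferedReader output = new BufferedReader(new InputStreamReader(cmd.getInputStream()));
        String line;
        while ((line = output.readLine()) != null) {
            // process line
            System.out.println("stdout: " + line);
        }

        errReader.join();
        cmd.waitFor();
    }
}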
Related
I have the code example below, in which you can enter a command to the bash shell, e.g. echo test, and have the result echoed back. However, after the first read, further reads from the output stream don't work.
Why is this, or am I doing something wrong? My end goal is to create a threaded scheduled task that periodically executes a command against /bin/bash, so the OutputStream and InputStream would have to work in tandem and not stop working. I have also been getting the error java.io.IOException: Broken pipe. Any ideas?
Thanks.
String line;
Scanner scan = new Scanner(System.in);

Process process = Runtime.getRuntime().exec("/bin/bash");
OutputStream stdin = process.getOutputStream();
InputStream stderr = process.getErrorStream();
InputStream stdout = process.getInputStream();

BufferedReader reader = new BufferedReader(new InputStreamReader(stdout));
BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(stdin));

String input = scan.nextLine();
input += "\n";
writer.write(input);
writer.flush();

input = scan.nextLine();
input += "\n";
writer.write(input);
writer.flush();

while ((line = reader.readLine()) != null) {
    System.out.println("Stdout: " + line);
}

input = scan.nextLine();
input += "\n";
writer.write(input);
writer.close();

while ((line = reader.readLine()) != null) {
    System.out.println("Stdout: " + line);
}
Firstly, I would recommend replacing the line
Process process = Runtime.getRuntime ().exec ("/bin/bash");
with the lines
ProcessBuilder builder = new ProcessBuilder("/bin/bash");
builder.redirectErrorStream(true);
Process process = builder.start();
ProcessBuilder is new in Java 5 and makes running external processes easier. In my opinion, its most significant improvement over Runtime.getRuntime().exec() is that it allows you to redirect the standard error of the child process into its standard output. This means you only have one InputStream to read from. Before this, you needed to have two separate Threads, one reading from stdout and one reading from stderr, to avoid the standard error buffer filling while the standard output buffer was empty (causing the child process to hang), or vice versa.
Next, the loops (of which you have two)
while ((line = reader.readLine()) != null) {
    System.out.println("Stdout: " + line);
}
only exit when the reader, which reads from the process's standard output, returns end-of-file. This only happens when the bash process exits. It will not return end-of-file if there happens at present to be no more output from the process. Instead, it will wait for the next line of output from the process and not return until it has this next line.
Since you're sending two lines of input to the process before reaching this loop, the first of these two loops will hang if the process hasn't exited after these two lines of input. It will sit there waiting for another line to be read, but there will never be another line for it to read.
I compiled your source code (I'm on Windows at the moment, so I replaced /bin/bash with cmd.exe, but the principles should be the same), and I found that:
- after typing in two lines, the output from the first two commands appears, but then the program hangs;
- if I type in, say, echo test, and then exit, the program makes it out of the first loop since the cmd.exe process has exited. The program then asks for another line of input (which gets ignored), skips straight over the second loop since the child process has already exited, and then exits itself;
- if I type in exit and then echo test, I get an IOException complaining about a pipe being closed. This is to be expected: the first line of input caused the process to exit, and there's nowhere to send the second line.
I have seen a trick that does something similar to what you seem to want, in a program I used to work on. This program kept around a number of shells, ran commands in them and read the output from these commands. The trick used was to always write out a 'magic' line that marks the end of the shell command's output, and use that to determine when the output from the command sent to the shell had finished.
I took your code and I replaced everything after the line that assigns to writer with the following loop:
while (scan.hasNext()) {
    String input = scan.nextLine();
    if (input.trim().equals("exit")) {
        // Putting 'exit' amongst the echo --EOF--s below doesn't work.
        writer.write("exit\n");
    } else {
        writer.write("((" + input + ") && echo --EOF--) || echo --EOF--\n");
    }
    writer.flush();

    line = reader.readLine();
    while (line != null && !line.trim().equals("--EOF--")) {
        System.out.println("Stdout: " + line);
        line = reader.readLine();
    }
    if (line == null) {
        break;
    }
}
After doing this, I could reliably run a few commands and have the output from each come back to me individually.
The two echo --EOF-- commands in the line sent to the shell are there to ensure that output from the command is terminated with --EOF-- even in the event of an error from the command.
Of course, this approach has its limitations. These limitations include:
- if I enter a command that waits for user input (e.g. another shell), the program appears to hang;
- it assumes that each process run by the shell ends its output with a newline;
- it gets a bit confused if the command being run by the shell happens to write out a line reading --EOF--;
- bash reports a syntax error and exits if you enter some text with an unmatched ).
These points might not matter to you if whatever it is you're thinking of running as a scheduled task is going to be restricted to a command or a small set of commands which will never behave in such pathological ways.
EDIT: improved exit handling and made other minor changes after running this on Linux.
I think you can use a daemon thread for reading your input, while your output reader stays in its while loop on the main thread, so you can read and write at the same time. You can modify your program like this:
Thread t = new Thread(new Runnable() {
    @Override
    public void run() {
        while (true) {
            String input = scan.nextLine();
            input += "\n";
            try {
                writer.write(input);
                writer.flush();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }
});
t.start();
and your reader will remain the same as above, i.e.
while ((line = reader.readLine()) != null) {
    System.out.println("Stdout: " + line);
}
Make your writer final, otherwise it won't be accessible from within the inner class.
You have writer.close(); in your code, so bash receives EOF on its stdin and exits. You then get Broken pipe when trying to read from the stdout of the now-defunct bash.
I am calling a bash script from Java.
The script does the following:
cat /home/user/Downloads/bigtextfile.txt | grep 'hello'
This particular command, when run from the command line, takes about 1 second to complete on the text file, which is 150 MB.
When calling the bash script via Java using the following call:
command = "sh /home/user/bashfiletocall";
p = Runtime.getRuntime().exec(command);
It takes so long to complete that I give up waiting.
Am I doing something very wrong, and if not, can you explain the reason for the huge difference in performance?
NOTE: I was running it in NetBeans, and this seems to be the problem: when I ran the file from the command line it was quick. The performance difference between executing in NetBeans and on the command line is huge.
Many thanks.
private String executeCommand(String command) {
    StringBuilder output = new StringBuilder();
    BufferedReader reader = null;
    Process p;
    try {
        p = Runtime.getRuntime().exec(command);
        p.waitFor();
        reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String line = "";
        while ((line = reader.readLine()) != null) {
            output.append(line + "\n");
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    return output.toString();
}
After starting your process you need to start reading from its input stream. Otherwise the buffers fill up and p.waitFor() waits forever.
Javadoc of the Process class:
Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, or even deadlock.
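For illustration, here is a minimal sketch of the same helper with the read loop moved before waitFor(), keeping the rest of the method and its imports as in the question; if the command also writes a lot to stderr, that stream should be drained or merged as well:
private String executeCommand(String command) {
    StringBuilder output = new StringBuilder();
    BufferedReader reader = null;
    try {
        Process p = Runtime.getRuntime().exec(command);

        // Consume stdout while the process is still running, so its buffer never fills up.
        reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String line;
        while ((line = reader.readLine()) != null) {
            output.append(line).append("\n");
        }

        // Only wait for the exit code once the output has been drained.
        p.waitFor();
    } catch (Exception e) {
        e.printStackTrace();
    }
    return output.toString();
}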
When I run this code and the call graph is really large, the program prints up to the last line that opt outputs and then blocks at readLine, even though there is nothing left to read. Does anyone know what the problem is? opt -print-callgraph file sends the call graph to the error stream. I tried executing opt -print-callgraph file 2> callgraph so that I could read from a file instead, but it complains that there are too many positional arguments.
Oddly enough, the code runs fine for call graphs that are small in size.
I tried using ProcessBuilder as well but I get the same problem.
Runtime runtime = Runtime.getRuntime();
Process process = runtime.exec("opt -print-callgraph " + file);
BufferedReader in = new BufferedReader(new InputStreamReader(process.getErrorStream()));
String s = null;
try {
    // Gets stuck at readLine after printing out the last line.
    while ((s = in.readLine()) != null) {
        System.out.println(s);
    }
} catch (Exception e) {
    e.printStackTrace();
} finally {
    in.close();
}
You need to read both streams, in separate threads, or else merge them so you're reading them both at the same time. Otherwise the process can block if output is unconsumed. In this case there must be unconsumed output in stdout which is blocking the process, which means it won't finish, which means it won't close stderr, which means reading stderr will block.
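As a sketch of the merge approach, ProcessBuilder can fold stderr (where opt writes the call graph) into stdout so a single read loop consumes everything; the input-file name below is just a placeholder:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class CallGraphDump {
    public static void main(String[] args) throws IOException, InterruptedException {
        String file = args.length > 0 ? args[0] : "input.bc"; // placeholder input file

        // Pass the arguments separately and merge stderr into stdout,
        // so one read loop sees all of the process's output.
        ProcessBuilder builder = new ProcessBuilder("opt", "-print-callgraph", file);
        builder.redirectErrorStream(true);

        Process process = builder.start();
        BufferedReader in = new BufferedReader(new InputStreamReader(process.getInputStream()));
        String s;
        while ((s = in.readLine()) != null) {
            System.out.println(s);
        }
        in.close();
        process.waitFor();
    }
}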
I'm trying to use the Java Runtime.getRuntime().exec(String) command to run Festival, then use OutputStreamWriter to write some commands to the output stream of the process.
This works great, and I'm able to do something like this:
Process p = Runtime.getRuntime().exec("festival");
Writer w = new OutputStreamWriter(p.getOutputStream());
w.append("(SayText \"Hello World\")");
w.flush();
Obviously the way I can tell this works is that it speaks the text through the speakers.
What I am having a really hard time doing is getting the text output that I would see in the terminal. I'm trying to run some other commands (such as (voice.list)) which output text, presumably to stdout.
For example, I've tried using a BufferedReader in the following way:
BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
w.append("(voice.list)");
w.flush();

String output = "";
String line = reader.readLine();
System.out.println(line);
while ((line = reader.readLine()) != null) {
    System.out.println("Reading: " + line);
    output += line;
}
(The System.out.println calls are just for debugging; I would do the whole thing in a cleaner way if I were able to get it to work.)
No matter what code I try, I'm never able to get any output from Festival, although I can get output from other commands. E.g. I have tried this code as well, http://en.allexperts.com/q/Java-1046/2008/2/Runtime-getRuntime-exec-cmd.htm, and it works with many other commands (like ls), but not Festival.
Does anyone have any idea how I can get this to work?
Thanks.
Festival may output its text on stderr instead of stdout. Try replacing
p.getInputStream()
with
p.getErrorStream()
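For example, here is a rough sketch that keeps the writer from the question but goes slightly beyond the literal swap by reading Festival's error stream on a background thread, so that writing commands and reading output don't block each other (this assumes festival is on the PATH):
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.Writer;

public class FestivalOutput {
    public static void main(String[] args) throws IOException {
        Process p = Runtime.getRuntime().exec("festival");
        Writer w = new OutputStreamWriter(p.getOutputStream());

        // Read whatever Festival prints on stderr in the background;
        // this non-daemon thread also keeps the JVM alive while festival runs.
        final BufferedReader err = new BufferedReader(new InputStreamReader(p.getErrorStream()));
        new Thread(new Runnable() {
            @Override
            public void run() {
                try {
                    String line;
                    while ((line = err.readLine()) != null) {
                        System.out.println("Reading: " + line);
                    }
                } catch (IOException ignored) {
                }
            }
        }).start();

        w.append("(voice.list)");
        w.flush();
    }
}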
In a Java program, I am generating an sh script for use on a CentOS machine, which will use lame and sox to decode an MP3 audio file and then apply some gain to it, respectively. I'm having trouble getting the Process.waitFor() method to do anything other than hang indefinitely. Here is the code:
try
{
    // TODO code application logic here
    String reviewPath = "/SomeDirectory/";
    String fileName = "FileName";
    String extension = ".mp3";

    StringBuilder sb = new StringBuilder();
    sb.append("#!/bin/bash\n");
    sb.append("cd " + reviewPath + "\n");
    sb.append("lame --decode " + fileName + extension + "\n");

    File script = new File(reviewPath + fileName + ".sh");
    script.createNewFile();
    script.setExecutable(true);

    FileWriter writer = new FileWriter(script);
    writer.write(sb.toString());
    writer.close();

    Process p = Runtime.getRuntime().exec(script.getAbsolutePath());

    String line;
    BufferedReader bri = new BufferedReader(new InputStreamReader(p.getInputStream()));
    BufferedReader bre = new BufferedReader(new InputStreamReader(p.getErrorStream()));
    while ((line = bri.readLine()) != null) {
        System.out.println(line);
    }
    bri.close();
    while ((line = bre.readLine()) != null) {
        System.out.println(line);
    }
    bre.close();

    p.waitFor();
    System.out.println("Done.");
}
catch (Exception e)
{
    System.out.println(e.getMessage());
}
The odd part is that when I run the generated .sh file by hand, it runs and exits nicely, but when I execute it from a Process object in Java, it never exits. exitValue() always reports "Process has not exited". I've tried adding set -e to the script and an exit at the end of the script. Short of using the kill command (which I don't really think I can do here), I'm at a loss as to what is going on. Any suggestions?
Add something like while (p.getInputStream().read() != -1); after starting the process. Otherwise the buffer fills up and the process blocks, waiting for something (in this case, your program) to read from it and free up space.
I figured it out! The problem here was indeed that the output streams needed to be drained for the application to exit, but simply reading them one after the other is not enough. I used Suresh Koya's suggestion to use the ProcessBuilder API, redirected the error stream into the output stream before starting the process, and read from the merged stream. This fixed the issues I was having :D
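For reference, here is a rough sketch of that final shape, assuming the same generated script path as in the question; redirectErrorStream(true) merges the decoder's stderr chatter into stdout, and the process is only waited on after its output has been drained:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class RunDecodeScript {
    public static void main(String[] args) throws IOException, InterruptedException {
        String scriptPath = "/SomeDirectory/FileName.sh"; // placeholder path, built as in the question

        ProcessBuilder builder = new ProcessBuilder(scriptPath);
        builder.redirectErrorStream(true); // merge stderr (progress output) into stdout

        Process p = builder.start();
        BufferedReader br = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String line;
        while ((line = br.readLine()) != null) {
            System.out.println(line);
        }
        br.close();

        p.waitFor();
        System.out.println("Done.");
    }
}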