I want to execute an external command from Java (e.g. "ls") and capture its output, both the stdout and stderr streams, as a string, in real time, and in the order the lines were generated.
So, if the output from the command is something like:
blah <- to stdout
foo <- to stderr
bar <- to stdout
Then, ideally, I want the output string, and the real-time output, to look like:
blah
foo
bar
Naive methods produce either:
blah
bar
(ie no stderr output)
or:
blah
bar
foo
(ie the order is rearranged so that the stdout and stderr messages are not interlaced with each other).
Use ProcessBuilder and set redirectErrorStream(true).
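A minimal sketch of that approach, merging stderr into stdout so lines are read in the order the pipe delivers them (the class name and command are just for illustration):

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class MergedOutput {
    // run a command with stderr merged into stdout and return everything it printed
    static String run(String... command) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.redirectErrorStream(true); // stderr lines now arrive on the same stream as stdout
        Process p = pb.start();
        StringBuilder sb = new StringBuilder();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);     // real time: handle each line as it arrives
                sb.append(line).append('\n');
            }
        }
        p.waitFor();
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        run("ls");
    }
}
```

Note that the interleaving you get is only as good as the child process's own flushing behaviour, as the next answer explains.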
This isn't always possible.
If you use the ProcessBuilder API, you can merge the stdout and stderr streams into one (using redirectErrorStream(true)), so you can read both outputs from a single InputStream. But then you can't tell which stream the data originally came from.
If you read stdout and stderr as two streams, you will need NIO or two Java threads. In both cases, processing one output will block the other (or rather, the other stream won't be processed in a timely manner). That will lead to swapped lines.
If the child process doesn't flush the output and you use pipes, then this gets worse because stdout will be sent to your process in 4KB blocks while stderr will usually arrive line-by-line.
There is no platform-independent workaround, I'm afraid. If you only need a solution which works on Unix, you can use Pseudo TTYs (man 4 pty) to simulate a shell but these are hard to set up from Java since you need to call OS functions.
One approach might be to use Perl or similar to run your command in a PTY (this causes stdout to become line buffered), read stdout and stderr from there and prefix each line with 1 for stdout and 2 for stderr.
I would suggest using the Apache Commons Exec API, as it is a more sophisticated API to use.
See DefaultExecutor. You can:
set a current working directory for the subprocess
provide a set of environment variables passed to the subprocess
capture the subprocess output of stdout and stderr using an ExecuteStreamHandler
kill long-running processes using an ExecuteWatchdog
define a set of expected exit values
terminate any started processes when the main process is terminating using a ProcessDestroyer
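As a sketch of what those features look like together (assuming the commons-exec jar is on the classpath; the command and the 10-second timeout are placeholders):

```java
import java.io.ByteArrayOutputStream;
import org.apache.commons.exec.CommandLine;
import org.apache.commons.exec.DefaultExecutor;
import org.apache.commons.exec.ExecuteWatchdog;
import org.apache.commons.exec.PumpStreamHandler;

public class CommonsExecDemo {
    static String run(String executable, String arg) throws Exception {
        CommandLine cmdLine = new CommandLine(executable);
        cmdLine.addArgument(arg);

        ByteArrayOutputStream out = new ByteArrayOutputStream();
        DefaultExecutor executor = new DefaultExecutor();
        // capture both stdout and stderr into the same buffer
        executor.setStreamHandler(new PumpStreamHandler(out));
        // kill the child if it runs longer than 10 seconds
        executor.setWatchdog(new ExecuteWatchdog(10_000));
        executor.setExitValue(0); // the expected exit value
        executor.execute(cmdLine);
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.print(run("echo", "hello"));
    }
}
```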
Related
Using zt-exec, how can I write to and read from a process that waits for console input in an infinite loop and responds on its console output?
I believe the easiest way to describe this is with a Python script:
while True:
    javaSaid = raw_input("Hey Java, Say Something: ")  ## wait for input from java
    print "Python Heard Java Say: " + str(javaSaid)    ## java needs to be able to get this output
Note: Executing the python process multiple times is what I am trying to avoid as the initialization time on the real python script makes this unacceptable.
You need to call redirectInput as well as redirectOutput on ProcessExecutor.
Have a look at ProcessExecutorInputStreamTest.java. It's just an example. It writes data to the process input via PipedOutputStream -> PipedInputStream -> ProcessExecutor and reads data from the process via an OutputStream.
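The same write-then-read round trip can be sketched with the plain JDK (using `cat` as a stand-in for the long-lived Python script; with zt-exec you would hand the streams to redirectInput/redirectOutput instead of touching the Process directly):

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;

public class TalkToProcess {
    // send one line to a long-running child and read its one-line reply
    static String roundTrip(String message) throws Exception {
        // cat echoes each input line back, like the Python loop in the question
        Process p = new ProcessBuilder("cat").start();
        BufferedWriter stdin =
                new BufferedWriter(new OutputStreamWriter(p.getOutputStream()));
        BufferedReader stdout =
                new BufferedReader(new InputStreamReader(p.getInputStream()));

        stdin.write(message);
        stdin.newLine();
        stdin.flush();                 // flush, or the child may never see the line
        String reply = stdout.readLine();
        p.destroy();                   // child loops forever, so stop it explicitly
        return reply;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip("Hello Python"));
    }
}
```

Starting the process once and keeping it alive this way avoids the repeated initialization cost mentioned in the question.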
I want to read what gets written to stdout by a Java process that spawns other processes using inheritIO. I cannot use redirectOut as I have no control over the code that starts the process. Also note that resetting System.setOut doesn't work in this case.
Also I don't have access to the Process object.
Example:
new ProcessBuilder().command("/bin/echo", "FooBar").inheritIO().start();
// read "FooBar" from standard out
By definition, inheritIO causes the output of the subprocess to be the same as the output of the caller. So as soon as you call it on the ProcessBuilder, the calling process can read neither the standard output nor the error stream of the callee.
As you say you cannot change that, the only way I can imagine is to use an external launcher that redirects output to a pipe (the default for ProcessBuilder), starts the program containing the line you showed (new ProcessBuilder().command("/bin/echo", "FooBar").inheritIO().start()), and processes that output.
I have started a process from my Java code. The process takes a very long time to run and can generate output from time to time. I need to react to every piece of output as it is generated; what is the best way to do this?
What kind of reaction are you talking about? Is the process writing to its standard output and/or standard error? If so, I suspect Process.getInputStream and Process.getErrorStream are what you're looking for. Read from both of those and react accordingly. Note that you may want to read from them on different threads, to prevent the individual buffer for either stream from filling up.
Alternatively, if you don't need the two separately, just set redirectErrorStream in ProcessBuilder to true, so the error and output streams are merged.
You should start a thread which reads from Process.getInputStream() and getErrorStream() (or alternatively use ProcessBuilder.redirectErrorStream(true)) and handle whatever shows up in the stream. There are many ways to handle it - the right way depends on how the data is being used. Please tell us more details.
Here is one real-life example: SbtRunner uses ProcessRunner to send commands to a command line application and wait for the command to finish execution (the application will print "> " when a command finishes execution). There is some indirection happening to make it easier to read from the process' output (the output is written to a MulticastPipe from where it is then read by an OutputReader).
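A sketch of the reader-thread approach described above (the class name and tags are illustrative):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class StreamGobbler extends Thread {
    private final InputStream in;
    private final String tag;

    StreamGobbler(InputStream in, String tag) {
        this.in = in;
        this.tag = tag;
    }

    @Override public void run() {
        try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = r.readLine()) != null) {
                // react to each line as soon as it appears
                System.out.println(tag + line);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    // drain stdout and stderr on separate threads so neither pipe fills up
    static int runProcess(String... command) throws Exception {
        Process p = new ProcessBuilder(command).start();
        Thread out = new StreamGobbler(p.getInputStream(), "OUT: ");
        Thread err = new StreamGobbler(p.getErrorStream(), "ERR: ");
        out.start();
        err.start();
        out.join();
        err.join();
        return p.waitFor();
    }

    public static void main(String[] args) throws Exception {
        runProcess("ls");
    }
}
```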
As far as I know, both out and err are instances of the same class, PrintStream. Can anybody tell me how they differ and how their behaviour changes?
The difference is not evident because, by default in most operating systems, both are written to the console (the same file; the console is also a file). However, you can have System.out write to a file and System.err write to the console (monitor) - this is just one scenario.
Write a program that emits both System.out and System.err messages, and try this:
java MyProgram > out.txt 2> err.txt # On a *NIX.
System.out messages will go to out.txt and System.err messages will go to err.txt. The basic point to remember is to think of System.out and System.err as streams to files (which is what they are) instead of as a mechanism to output to the monitor, which is what I assumed as a beginner.
They go to the system stdout and stderr streams respectively. On most OSes these are distinct and can be sent to different places. For example, this might be useful if your program's output was to be parsed by another program - if it needed to report an error, stderr would normally be the better place for it as you might have set it up to get a human's attention.
They have the same behavior. But the first (out) is a reference to the standard output stream (by default, the console), and the second (err) is a reference to the standard error stream (by default, also the console).
But if you want, you can change either reference, or you can add a wrapper/filter to each of them.
My IDE, for example, shows output from err stream in red colors.
System.out sends the output to the standard output stream. System.err sends the output to the standard error stream.
By default both of these write to the console.
However, the benefit is that the two streams can be redirected, so you could have the System.out output redirected to your normal log file and the System.err output redirected to an error log.
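A small sketch of that redirection from within the program itself (the file name is arbitrary):

```java
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.PrintStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class RedirectErr {
    public static void main(String[] args) throws IOException {
        Path log = Paths.get("error.log");
        // second argument enables auto-flush on println
        PrintStream err = new PrintStream(new FileOutputStream(log.toFile()), true);
        System.setErr(err); // from here on, System.err goes to error.log

        System.out.println("to the console");
        System.err.println("to the error log");

        System.out.println(Files.readAllLines(log)); // prints [to the error log]
    }
}
```

The same effect can be achieved from outside the program with shell redirection (`2> err.txt`), as shown in the earlier answer.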
We have a Perl program, invoked from a Java program, that validates XML. It is not able to write to standard error and hangs at the print statement.
Perl is writing to STDERR and the Java program is reading STDERR using the getErrorStream() function. But the Perl program hangs when writing to STDERR. I suspect the Java side is blocking the STDERR stream completely and Perl is waiting for this stream to be released.
Is there a way in Perl to overcome this blockage and write to standard error anyway? Since Java is only reading, the API should not be locking the STDERR stream, as per the Java docs.
Perl Code snippet is:
sub print_error
{
print STDERR shift;
}
Java code snippet is:
while ( getErrorStream() != null )
{
    SOP errorMessage;
}
Appreciate the help in advance.
Thanks,
Mathew Liju
getErrorStream does not read the error stream, it just obtains a handle to it. As it's a pipe, if you never actually read it, it will fill up and force the Perl program to block.
You need something like:
InputStream errors = process.getErrorStream();
byte[] buffer = new byte[1024];
int n;
while ((n = errors.read(buffer)) > 0) {
    System.out.write(buffer, 0, n); // actually consume what was read
}
Ideally, I think that to avoid deadlock, in Java you need to spawn separate threads to read the STDERR and the STDOUT. It sounds like Perl is blocking when writing to STDERR because for one reason or another you are never reading from it in Java.
An additional factor to consider is the buffering that occurs with piped processes.
There is by default, about a 30-line-ish buffer that is maintained by the shell creating the inter-process pipe, so if the Perl app has not created enough data, it won't have been sent to the Java application yet to process.
Maybe this thread has a possible cause for your problem:
Add 3 lines to the top of the Perl script:
use IO::Handle;
STDOUT->autoflush(1);
STDERR->autoflush(1);
The problem in the mentioned thread was related to "the way Perl is buffering its output".
However here, Adrian Pronk mentions in the comments that "Perl is hanging because Java is never reading its output".
STDOUT->autoflush(1);
STDERR->autoflush(1);
This is the information I needed!
I have a Java app running some Perl scripts and I'd only get the output after it was finished.
By adding the autoflush(1) I get it right away.
BTW, I do have separate threads for reading STDERR and STDOUT, and that's the way to go.
Thanks.