As far as I know, both out and err are instances of the same class, PrintStream. Can anybody tell me how they differ, and how their behaviour is changed?
The difference is not obvious because, by default, most operating systems send both to the console (which is also just a file). However, you can have System.out write to a file while System.err still writes to the console - that is just one scenario.
Write a program that emits both System.out and System.err messages. A minimal sketch of such a program (MyProgram is just an example name):
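public class MyProgram {
    public static void main(String[] args) {
        System.out.println("this goes to standard output");  // -> stdout
        System.err.println("this goes to standard error");   // -> stderr
    }
}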
Then try this:
java MyProgram > out.txt 2> err.txt # on a *nix shell
System.out messages will go to out.txt and System.err messages will go to err.txt. The basic point to remember is to think of System.out and System.err as streams to files (which is what they are) rather than as a mechanism for writing to the monitor, which is what I assumed as a beginner.
They go to the system stdout and stderr streams respectively. On most OSes these are distinct and can be sent to different places. This is useful, for example, if your program's output is parsed by another program: if it needs to report an error, stderr is normally the better place for it, as you might have routed stderr somewhere that gets a human's attention.
They behave the same way, but out is a reference to the standard output stream and err is a reference to the standard error stream (by default, both are the console).
But if you want, you can change where either reference points, or add a wrapper/filter to each of them.
My IDE, for example, shows output from the err stream in red.
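To see this in action, redirecting both streams to files takes only a couple of lines; a minimal sketch (the log file names are just placeholders):

import java.io.FileOutputStream;
import java.io.PrintStream;

public class Redirect {
    public static void main(String[] args) throws Exception {
        // point the two references somewhere else (autoflush enabled)
        System.setOut(new PrintStream(new FileOutputStream("out.log"), true));
        System.setErr(new PrintStream(new FileOutputStream("err.log"), true));
        System.out.println("ends up in out.log");
        System.err.println("ends up in err.log");
    }
}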
System.out sends the output to the standard output stream. System.err sends the output to the standard error stream.
By default both of these write to the console.
However, the benefit is that the two streams can be redirected, so you could send the System.out output to your normal log file and the System.err output to an error log.
I want to read what gets written to stdout in a Java process that spawns other processes using inheritIO. I cannot use redirectOutput, as I have no control over the code that starts the process. Also note that resetting the stream with System.setOut doesn't work in this case.
Also, I don't have access to the Process object.
Example:
new ProcessBuilder().command("/bin/echo", "FooBar").inheritIO().start();
// read "FooBar" from standard out
By definition, inheritIO() causes the subprocess's standard I/O to be the same as the calling process's. So as soon as you call it on the ProcessBuilder, the calling process cannot read the standard output or error streams of the callee.
As you say you cannot change that, the only way I can imagine is to use an external launcher that redirects output to a pipe (the default for ProcessBuilder), starts the program containing the line you showed, and processes that output.
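A minimal sketch of that idea (Launcher and ProgramUnderTest are hypothetical names): the launcher starts the JVM running the code you can't change, with the default pipe redirection, so anything that program - or the processes it spawns with inheritIO() - writes to stdout is readable here:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class Launcher {
    public static void main(String[] args) throws Exception {
        // The child's stdout is piped to us by default; any grandchild
        // started with inheritIO() inherits that same pipe.
        Process p = new ProcessBuilder("java", "ProgramUnderTest").start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println("child wrote: " + line);
            }
        }
        p.waitFor();
    }
}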
I have a java application taking data from stdin and writing results to stdout. Like this:
app1 | java_app | app2
The amount of data moved through the pipe is large and takes a long time (hours and days is not uncommon or unexpected).
It seems that if app1 dies and closes its end of the pipe, I receive a pipe-related exception. However, if app2 dies and closes its end of the pipe, I do not receive an exception. This means java_app keeps consuming input that will never produce any useful output, running for hours or days.
The difference in pipe exception reporting seems to stem from System.in being an InputStream, while System.out is a PrintStream, and PrintStreams swallow any errors. I know you can get the error state of a PrintStream with stream.checkError(), but using that everywhere is clunky (and it forces a flush of your output stream).
I'm happy to forgo the bulk of PrintStream's functionality to get better error reporting.
Is there another way to get access to stdout that isn't wrapped in a PrintStream, but is instead a plain OutputStream?
It turns out one of my colleagues found a nice way to do this:
OutputStream stream = new FileOutputStream(FileDescriptor.out);
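Because this is a plain OutputStream, a broken pipe surfaces as an IOException instead of being silently swallowed. A minimal sketch of how you might use it:

import java.io.BufferedOutputStream;
import java.io.FileDescriptor;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;

public class RawStdout {
    public static void main(String[] args) throws IOException {
        // Unlike System.out (a PrintStream), this stream throws on write failure
        OutputStream stream = new BufferedOutputStream(new FileOutputStream(FileDescriptor.out));
        stream.write("some output\n".getBytes());
        stream.flush(); // an IOException here means the downstream reader went away
    }
}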
I want to be able to execute an external command from Java, e.g. "ls", and get its output - both the output and error streams - as a string, in real time, and in the order in which the lines were generated.
So, if the output from the command is something like:
blah <- to stdout
foo <- to stderr
bar <- to stdout
Then, ideally I want the output string, and the realtime output, to look like:
blah
foo
bar
Naive methods produce either:
blah
bar
(i.e. no stderr output)
or:
blah
bar
foo
(i.e. the order is rearranged so that the stdout and stderr messages are not interleaved with each other).
Use ProcessBuilder and set redirectErrorStream(true).
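A minimal sketch, using ls as a stand-in for the real command:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class MergedOutput {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder("ls", "-l");
        pb.redirectErrorStream(true);  // merge stderr into stdout
        Process p = pb.start();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                // lines arrive roughly in write order, subject to the child's own buffering
                System.out.println(line);
            }
        }
        p.waitFor();
    }
}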
This isn't always possible.
If you use the ProcessBuilder API, you can merge the stdout and stderr streams into one (using redirectErrorStream(true)), so you can read both outputs from a single InputStream. But that means you can't tell which stream the data originally came from.
If you read stdout and stderr as two streams, you will need NIO or two Java threads. In both cases, processing one output can block the other (or rather, the other stream won't be processed in a timely fashion). That leads to swapped lines.
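A minimal sketch of the two-thread variant (the timeliness caveat above still applies; the command is just an example):

import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;

public class TwoStreamReader {
    // Pump one stream on its own thread so neither stdout nor stderr blocks the other
    static Thread pump(InputStream in, String prefix) {
        Thread t = new Thread(() -> {
            try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = r.readLine()) != null) {
                    System.out.println(prefix + line);
                }
            } catch (Exception ignored) {}
        });
        t.start();
        return t;
    }

    public static void main(String[] args) throws Exception {
        Process p = new ProcessBuilder("ls", "-l", "/nonexistent", ".").start();
        Thread out = pump(p.getInputStream(), "OUT> ");
        Thread err = pump(p.getErrorStream(), "ERR> ");
        out.join();
        err.join();
        p.waitFor();
    }
}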
If the child process doesn't flush its output and you use pipes, this gets worse, because stdout will be sent to your process in 4 KB blocks while stderr will usually arrive line by line.
There is no platform-independent workaround, I'm afraid. If you only need a solution which works on Unix, you can use Pseudo TTYs (man 4 pty) to simulate a shell but these are hard to set up from Java since you need to call OS functions.
One approach might be to use Perl or similar to run your command in a PTY (this causes stdout to become line buffered), read stdout and stderr from there and prefix each line with 1 for stdout and 2 for stderr.
I would suggest using the Apache Commons Exec API, as it is a more sophisticated API for this kind of work.
See DefaultExecutor. With it you can (a minimal sketch follows the list):
set a current working directory for the subprocess
provide a set of environment variables passed to the subprocess
capture the subprocess output of stdout and stderr using an ExecuteStreamHandler
kill long-running processes using an ExecuteWatchdog
define a set of expected exit values
terminate any started processes when the main process is terminating using a ProcessDestroyer
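A minimal sketch of this approach (the command and the 60-second timeout are just examples):

import org.apache.commons.exec.CommandLine;
import org.apache.commons.exec.DefaultExecutor;
import org.apache.commons.exec.ExecuteWatchdog;
import org.apache.commons.exec.PumpStreamHandler;

public class ExecExample {
    public static void main(String[] args) throws Exception {
        CommandLine cmd = CommandLine.parse("ls -l");
        DefaultExecutor executor = new DefaultExecutor();
        // pump the child's stdout/stderr into ours
        executor.setStreamHandler(new PumpStreamHandler(System.out, System.err));
        // kill the child if it runs longer than 60 seconds
        executor.setWatchdog(new ExecuteWatchdog(60_000L));
        int exitValue = executor.execute(cmd);
        System.out.println("exit value: " + exitValue);
    }
}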
I am creating a GUI using Java. This GUI launches a program from the command line using the ProcessBuilder class.
A little information on the process being launched: from the command line, it creates another window and prints information to said window.
In my GUI window, I have a text area to which I would like to redirect that output. I originally intended to use a SwingWorker object to continually check for more output without holding up the GUI. To make sure I had the basic syntax down (without even bringing the GUI into it), I thought I would print the output from the secondary process's window to System.out. However, something seems to be wrong, as I can see the output in the secondary process's window, but not in the terminal from which I am working.
Excerpt of code is as follows:
Process p = pb.start();
Scanner s = new Scanner(p.getInputStream());
SwingWorker<String, Void> pipe = new SwingWorker<String, Void>() {
    @Override
    protected String doInBackground() {
        while (run) {
            if (s.hasNextLine()) {
                System.out.println("S has next!");
                System.out.println(s.nextLine());
            }
        }
        return null;
    }
};
pipe.execute();
The boolean run is defined elsewhere in the program and is set to false when the process p exits or is force quit (additional question: is that a really bad idea? I feel like it might be...).
Does anyone have an idea as to why I never get any output when I can see it being printed to the other window? My initial reaction was to use p.getOutputStream(), but Scanner does not take an OutputStream as a parameter.
Thank you for your time.
You should also scan p.getErrorStream() - some programs write to STDERR which is indistinguishable from STDOUT when run from the command line. It is generally good practice to consume both streams, as if either one is not consumed it can cause the external process to hang.
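For example, a second background reader could drain the error stream alongside the one in the question; a sketch, reusing p and Scanner from the question's excerpt:

// drain stderr too, e.g. on its own background thread
Scanner err = new Scanner(p.getErrorStream());
new Thread(() -> {
    while (err.hasNextLine()) {
        System.err.println(err.nextLine());
    }
}).start();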
If the external process is writing its output to its own window, it is almost certain that the output is NOT being written to STDOUT, which is what you are reading with your code. If it did so, then the external program's output would be appearing both in its window and in the command line session from which it was launched (if one existed). Without access to the source of the external program it's unlikely you will be able to intercept its output unless the authors made provisions for that functionality (i.e. a command-line switch that redirects output to STDOUT instead of the window).
As to p.getOutputStream(), that returns a stream which is "output" from YOUR point of view -- i.e. you write to it to send data to the process' STDIN. Your use of p.getInputStream() would be correct for the case where the external program writes to its STDOUT.
We have a Perl program that validates XML, invoked from a Java program. It is unable to write to standard error and hangs at the print location.
Perl is writing to STDERR, and the Java program reads STDERR using the getErrorStream() method. But the Perl program hangs when writing to STDERR. I suspect the Java side is blocking the STDERR stream completely and Perl is waiting for the stream to be released.
Is there a way in Perl to overcome this blockage and write to standard error forcefully? Since Java is only doing a read, the API should not be locking the STDERR stream, as per the Javadoc.
Perl Code snippet is:
sub print_error
{
    print STDERR shift;
}
Java code snippet is:
while ( getErrorStream() != null )
{
    SOP errorMessage;
}
Appreciate the help in advance.
Thanks,
Mathew Liju
getErrorStream() does not read the error stream; it just obtains a handle to it. Since it's a pipe, if you never actually read from it, it will fill up and force the Perl program to block.
You need something like:
// "process" here is the java.lang.Process you started
InputStream errors = process.getErrorStream();
byte[] buffer = new byte[4096];
int n;
while ((n = errors.read(buffer)) > 0) {
    System.out.write(buffer, 0, n);
}
Ideally, I think that to avoid deadlock, in Java you need to spawn separate threads to read STDERR and STDOUT. It sounds like Perl is blocking when writing to STDERR because, for one reason or another, you are never reading from it in Java.
An additional factor to consider is the buffering that occurs with piped processes.
There is, by default, an OS-maintained pipe buffer (typically a few kilobytes), so if the Perl app has not produced enough data, it won't have reached the Java application for processing yet.
Maybe this thread has a possible cause for your problem:
Add 3 lines to the top of the Perl script:
use IO::Handle;
STDOUT->autoflush(1);
STDERR->autoflush(1);
The problem in the mentioned thread was related to "the way Perl is buffering its output".
However, here Adrian Pronk mentions in the comments that "Perl is hanging because Java is never reading its output".
STDOUT->autoflush(1);
STDERR->autoflush(1);
This is the information I needed!
I have a Java app running some Perl scripts and I'd only get the output after it was finished.
By adding the autoflush(1) I get it right away.
BTW, I do have separate threads for reading STDERR and STDOUT, and that's the way to go.
Thanks.