When I execute a command in a separate process, for example by using the Runtime.getRuntime().exec(...) method, whose JavaDoc states:
Executes the specified command and arguments in a separate process.
What do I need to do with the streams of this process, given that the process will live until the Java program exits? (This is a detail, but the Java program takes care of killing this process, and the process itself has a built-in safety where it kills itself should it notice that the Java program that spawned it is no longer running.)
If we consider that this process produces no output at all (for example because all error messages and stdout are redirected to /dev/null and all communications are done using files/sockets/whatever), what do I need to do with the input stream?
Should I have one (or two?) Java threads running for nothing, trying to read stdout/stderr?
What is the correct way to deal with a long-living external process spawned from a Java program that produces no stdout/stderr at all?
EDIT
Basically I wrap the shell script in another shell script that makes sure to redirect everything to /dev/null. I'm pretty sure my Un*x would be non-compliant if the outer shell script (the one redirecting everything to /dev/null) still generated anything on stdout or stderr. Yet I find it mind-boggling that I would somehow be supposed to have threads running for the entire lifecycle of the app "for nothing".
If everything is as you say, then you can probably ignore them.
However, things rarely work out so cleanly. It may be worth it in the long run to spawn a single thread to drain stdout/stderr, just in case. The one day it fails and actually produces output is the day you need to know what came out. One or two threads (I think it could be done with just one) won't be a large overhead, especially if you are correct and nothing ever comes out of those streams.
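A minimal sketch of that single-thread approach (class and method names are mine): merging stderr into stdout with redirectErrorStream(true) lets one daemon thread drain everything.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class OutputDrainer {
    /** Starts a daemon thread that drains the process's combined output into sink. */
    public static Thread drain(Process p, StringBuilder sink) {
        Thread t = new Thread(() -> {
            try (BufferedReader r = new BufferedReader(
                    new InputStreamReader(p.getInputStream()))) {
                String line;
                while ((line = r.readLine()) != null) {
                    synchronized (sink) { sink.append(line).append('\n'); }
                }
            } catch (IOException ignored) {
                // the stream ends when the child exits
            }
        });
        t.setDaemon(true); // the JVM can still exit even if the child lingers
        t.start();
        return t;
    }

    public static void main(String[] args) throws Exception {
        // redirectErrorStream(true) merges stderr into stdout, so one thread suffices
        ProcessBuilder pb = new ProcessBuilder("sh", "-c", "echo out; echo err 1>&2");
        pb.redirectErrorStream(true);
        Process p = pb.start();
        StringBuilder log = new StringBuilder();
        Thread t = drain(p, log);
        p.waitFor();
        t.join();
        System.out.print(log);
    }
}
```

If the streams really stay silent forever, the thread just blocks cheaply on read; if something does come out, you have it logged.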
I believe the correct way to deal with a process's input and output if you are not interested in them is to close them promptly. If the child process subsequently tried to call read or write on stdin or stdout respectively, then an IOException would be thrown. It would be the responsibility of the child process to deal with the fact that it cannot read or write.
Most processes will ignore the fact that they cannot write and silently discard the writes. This is true in Java, where System.out is a PrintStream, so any IOException thrown by the underlying stdout is swallowed. This is pretty much what happens when you redirect output to /dev/null: all output is silently discarded.
It sounds like you've read the API documentation on Process and why it's important to read from and write to the process if it expects to do any writing or reading of its own. But I'll reiterate: the problem is that some OSes allocate only very limited buffers for (specifically) stdout, so it is important not to let these buffers fill up. That means either reading any output of the child process promptly, or notifying the OS that you do not require the output, so that it can release any resources held and reject any further attempts to write to stdout or read from stdin (rather than just hanging until resources become available).
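A sketch of the "tell the OS you don't want the output" route (class name is mine; ProcessBuilder.Redirect.DISCARD requires Java 9 or later):

```java
import java.io.IOException;

public class DiscardStreams {
    /** Starts a process whose output goes to the OS null device, never to the JVM. */
    public static Process startQuiet(String... cmd) throws IOException {
        ProcessBuilder pb = new ProcessBuilder(cmd);
        // Since Java 9: the OS-level equivalent of redirecting to /dev/null,
        // so no pipe buffer can ever fill up and no drain thread is needed.
        pb.redirectOutput(ProcessBuilder.Redirect.DISCARD);
        pb.redirectError(ProcessBuilder.Redirect.DISCARD);
        Process p = pb.start();
        p.getOutputStream().close(); // we never feed the child any input
        return p;
    }

    public static void main(String[] args) throws Exception {
        Process p = startQuiet("sh", "-c", "echo ignored; echo ignored 1>&2");
        System.out.println("exit=" + p.waitFor());
    }
}
```

On older JVMs without Redirect.DISCARD, closing the streams promptly (as described above) or draining them in a thread remain the options.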
Related
I want to run a Java program on a remote machine and intercept its logs. I also want to be able to know if the program has completed execution, and whether execution was successful or was halted due to an error.
Is there any ready-made Java library available for this purpose? I would also like to be able to use this program for obtaining logs and checking execution completion for remote programs in different languages, like Java/Ruby/Python etc.
If you're only looking to determine when it has completed (and not looking to really capture all the output, as in your other question), you can simply check for the existence of the process ID and, when you fail to find it, phone home. You really don't need the logs for that.
You should take a look at the Apache Commons Tailer for reading from another process's logs. As for checking if a process completed successfully, that's a little bit trickier. I would wrap the execution of the process in a shell script that writes out the status to a file that the Java program could then check. See here for more info on how to do that.
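The status-file idea might look like this on the Java side (a sketch; the class name and file convention are mine, and it assumes the wrapper script writes the child's numeric exit code to a known file when it finishes):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class StatusFileCheck {
    /**
     * Returns the exit code the wrapper script wrote to statusFile,
     * or -1 if the file doesn't exist yet (i.e. the process is still running).
     */
    public static int readStatus(Path statusFile) throws IOException {
        if (!Files.exists(statusFile)) {
            return -1; // wrapper hasn't finished, so the process is still going
        }
        return Integer.parseInt(Files.readString(statusFile).trim());
    }
}
```

The Java program can poll readStatus() periodically; a 0 means the remote program exited cleanly, anything else is the failing exit code.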
I want to do something along these lines.
Process shell = Runtime.getRuntime().exec("/bin/bash");
Then I want to use the streams of the shell process to talk to the bash shell. However, this doesn't seem to work at all, and it totally stumps me.
I found this link which seems to talk about the same problem. Why exactly does this happen and are there better solutions than the one outlined in the link?
It can be necessary to flush your writes from the JVM to the child process to make sure it's getting its input. IIRC I didn't need to do this on Windows, but did on Linux. I also ran into issues where I had to force the child process to flush its writes so the JVM would see them right away.
Also, make sure that you have JVM threads reading from stdout and stderr before you do anything; if either of those buffers fills up, it can lock the process. This is a huge problem on Windows. You will only need one thread if you use the option to combine the streams when launching the process.
Also, your example above doesn't have a newline; wouldn't bash require one? E.g. "touch blah\n".
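Putting the newline and flush() advice together, a minimal working round-trip with a bash child might look like this (a sketch, assuming /bin/bash is available):

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.nio.charset.StandardCharsets;

public class BashTalk {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder("/bin/bash");
        pb.redirectErrorStream(true); // one combined stream, one reader
        Process shell = pb.start();

        BufferedWriter in = new BufferedWriter(
                new OutputStreamWriter(shell.getOutputStream(), StandardCharsets.UTF_8));
        BufferedReader out = new BufferedReader(
                new InputStreamReader(shell.getInputStream(), StandardCharsets.UTF_8));

        in.write("echo hello\n"); // the trailing newline makes bash execute the line
        in.flush();               // without flush() the command can sit in the buffer
        System.out.println(out.readLine()); // prints "hello"

        in.write("exit\n");
        in.flush();
        shell.waitFor();
    }
}
```

Leaving out either the "\n" or the flush() is exactly what makes the shell appear to ignore you.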
I have a simple script that writes output to the console using ProcessBuilder and OutputStreamWriter; see:
Java Process with Input/Output Stream
An earlier thread. I believe my problem stems from the fact that a subprocess is spawned and the initial parent process is killed, resulting in the OutputStreamWriter throwing a java.io.IOException: Broken pipe. Given that the application is spawning a second process, how can I connect my OutputStreamWriter to this new process, including being able to read back the output it is generating? Is this even possible in Java? Surely Java should be able to follow onto the spawned process.
Thanks
Are you calling waitFor() on the process you are running? That should ensure your parent doesn't complete before the child. You may also want to look at Commons Exec, an open-source library designed to make your life easier when running separate processes in Java.
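The waitFor() pattern is short (a sketch; the class name is mine, and inheritIO() simply forwards the child's streams to the parent's console):

```java
import java.io.IOException;

public class WaitForChild {
    /** Runs a command, blocks until it exits, and returns its exit code. */
    public static int run(String... cmd) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(cmd)
                .inheritIO() // child's stdin/stdout/stderr go to the parent's
                .start();
        return p.waitFor(); // blocks until the child terminates
    }

    public static void main(String[] args) throws Exception {
        System.out.println("exit=" + run("sh", "-c", "exit 3")); // prints exit=3
    }
}
```

Without the waitFor(), the JVM can reach the end of main() and tear down while the child is mid-write, which is one way to get a broken pipe.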
I'm attempting to launch an instance of the VideoLAN program from within a java application. One of the ways I've tried to do this is shown here:
Process p = Runtime.getRuntime().exec("\"C:\\Program Files\\VideoLAN\\VLC\\vlc.exe\" \"http://www.dr.dk/Forms/Published/PlaylistGen.aspx?qid=1316859&odp=true\" :sout=#std{access=udp,mux=ts,dst=127.0.0.1:63928}");
If I execute the above command the vlc program will be launched, and will start a streaming operation (it goes through connect, buffering and then streaming phases).
When the command is executed by Runtime.exec (or ProcessBuilder.start), the vlc program hangs when it reaches the end of the buffering phase. If all threads in the Java program are terminated/run to completion, the vlc program progresses to the streaming phase. The Java process will not terminate until the vlc process is closed, so this behavior is obviously the result of some sort of coupling between the processes.
I have tried to execute the command indirectly by writing it to a .cmd file and then executing that, but it results in the same behavior.
Any ideas for how I can avoid the external process hanging?
Hmm, my guess would be that VLC filled your stdout buffer and is hung in a printf statement, waiting for that buffer to empty.
You need to get the stream for the process's output and read it (even if you discard it).
I recommend you read this article
On the 4th page is a good example of how to read the streams in threads so your child process won't block.
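The thread-per-stream pattern that article describes is commonly written along these lines (a sketch; the class names are mine):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

/** Drains one stream of a child process so its buffer can never fill up. */
class StreamGobbler extends Thread {
    private final InputStream in;
    private final String tag;

    StreamGobbler(InputStream in, String tag) {
        this.in = in;
        this.tag = tag;
        setDaemon(true); // don't keep the JVM alive just for draining
    }

    @Override public void run() {
        try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(tag + "> " + line); // or discard/log as needed
            }
        } catch (IOException ignored) {
            // the stream closes when the child exits
        }
    }
}

public class GobblerDemo {
    public static void main(String[] args) throws Exception {
        Process p = new ProcessBuilder("sh", "-c", "echo out; echo err 1>&2").start();
        new StreamGobbler(p.getInputStream(), "OUT").start(); // child's stdout
        new StreamGobbler(p.getErrorStream(), "ERR").start(); // child's stderr
        System.out.println("exit=" + p.waitFor());
    }
}
```

Both gobblers must be started before waitFor(); otherwise a chatty child can block exactly as described above.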
This site is fantastic :). For some reason an approach I thought had already been tried suddenly started working.
The problem is that vlc writes to its stderr/stdout (which is not visible when executed from a prompt). It then blocks once some output buffer is full. A solution is to have stderr redirected to stdout and then have a thread empty the input stream of the process object.
It is, however, not an optimal solution, since I need a fair number of external processes, and you can't do non-blocking I/O on their input streams. I will experiment a bit with having a timer service drive empty-reading for a number of processes. Other suggestions for how to decouple the processes to avoid this problem are very welcome.
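The timer-driven empty-reading idea could be sketched like this (class name is mine): since available() reports how many bytes can be read without blocking, one scheduled task can service many processes without ever blocking on any single stream.

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.List;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class PollingDrainer {
    /**
     * Periodically drains the (combined) output streams of many processes.
     * Reads only what available() reports, so the task never blocks.
     */
    public static ScheduledExecutorService drainAll(List<Process> procs) {
        ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();
        byte[] buf = new byte[4096];
        timer.scheduleWithFixedDelay(() -> {
            for (Process p : procs) {
                try {
                    InputStream in = p.getInputStream();
                    while (in.available() > 0) {
                        int n = in.read(buf, 0, Math.min(buf.length, in.available()));
                        if (n <= 0) break; // discarded; log here if wanted
                    }
                } catch (IOException ignored) {
                    // the process likely exited and its stream is gone
                }
            }
        }, 0, 50, TimeUnit.MILLISECONDS);
        return timer;
    }
}
```

The caller shuts the returned executor down once all processes are gone. The trade-off is latency: buffers are only emptied every polling interval, so a very chatty child could still stall briefly between ticks.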
I'm using Java's ProcessBuilder class to run an external process. The process should not terminate before the Java program does; it must stay alive in command/response mode.
I know that the process streams may easily 'jam' if neglected, so I've done the following:
The program reads the process's combined output and error streams in a "reader" thread, and uses a "writer" thread to manage the commands. The reader thread does blocking character reads from process output, buffers them up into Strings and dispatches the results. The writer thread writes complete "command" lines via a PrintWriter; it uses a queue to ensure that no two command writes are "too close together" (currently 100ms), and that no new command gets written before the output of the previous command is complete. I also call flush() and checkError() after every println().
This scheme works fine for a few seconds or minutes, then the reader thread hangs on the blocking read(). No errors, no exceptions thrown, no more process output. Thereafter nothing will revive the external process (short of restarting it). (BTW this happens on both Linux and Windows.)
I've looked at the code and test-cases in Jakarta Commons Exec and in Plexus Utils http://plexus.codehaus.org/plexus-utils/ but (a) neither gives an example of using a long-lived Process and (b) neither appears to be doing anything basically different from what I've described.
Does anyone have a clue what's happening here please?
Thanks!
Do you also have a thread managing stderr? You only mention the two streams.
I implemented the error, input, and output streams in three separate threads, and I can read from and write to external processes without any problem.
I tested on both Windows and Linux with a multitude of built-in apps (cmd/bash) as well as other command-line binaries, and it works fine, except that on some occasions it just throws an I/O stream exception. What I do is catch the exception and restart the thread, so that the program keeps working.
If you are trying to, e.g., ssh in Linux, then you might run into a problem where you won't be able to write to the same stdin; this is for security reasons.
Try taking input from System.in and see if it works; it worked in my case.
Just a guess, but have you tried un-combining the error and output streams?