Java ProcessBuilder: external process hangs

I'm using Java's ProcessBuilder class to run an external process. The process should not terminate before the Java program does; it must stay alive in command/response mode.
I know that the process streams may easily 'jam' if neglected, so I've done the following:
The program reads the process's combined output and error streams in a "reader" thread, and uses a "writer" thread to manage the commands. The reader thread does blocking character reads from process output, buffers them up into Strings and dispatches the results. The writer thread writes complete "command" lines via a PrintWriter; it uses a queue to ensure that no two command writes are "too close together" (currently 100ms), and that no new command gets written before the output of the previous command is complete. I also call flush() and checkError() after every println().
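In sketch form, the writer side of this scheme looks something like the following (illustrative only -- the class name and queue discipline are mine, and the wait for the previous command's output to complete is omitted):

    import java.io.PrintWriter;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    class CommandWriter implements Runnable {
        private final BlockingQueue<String> commands = new LinkedBlockingQueue<>();
        private final PrintWriter toProcess;   // wraps process.getOutputStream()

        CommandWriter(PrintWriter toProcess) { this.toProcess = toProcess; }

        void submit(String command) { commands.add(command); }

        public void run() {
            try {
                while (!Thread.currentThread().isInterrupted()) {
                    String command = commands.take();   // next queued command
                    toProcess.println(command);
                    toProcess.flush();
                    if (toProcess.checkError()) {
                        throw new IllegalStateException("write to process failed");
                    }
                    Thread.sleep(100);   // keep writes at least 100 ms apart
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }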
This scheme works fine for a few seconds or minutes, then the reader thread hangs on the blocking read(). No errors, no exceptions thrown, no more process output. Thereafter nothing will revive the external process (short of restarting it). (BTW this happens on both Linux and Windows.)
I've looked at the code and test-cases in Jakarta Commons Exec and in Plexus Utils http://plexus.codehaus.org/plexus-utils/ but (a) neither gives an example of using a long-lived Process and (b) neither appears to be doing anything basically different from what I've described.
Does anyone have a clue what's happening here please?
Thanks!

Do you also have a thread managing stderr? You only mention the two streams.

I had implemented the error, input, and output streams in three separate threads, and I can read from and write to external processes without any problem.
I tested on both Windows and Linux with a multitude of built-in apps (cmd/bash) as well as other command-line binaries, and it works fine, except that on some occasions it just throws an I/O stream exception. What I do is catch the exception and restart the thread, so the program keeps working.
If you are trying to, e.g., ssh in Linux, you might run into a problem where you won't be able to write to the same stdin; this is for security reasons.
Try taking input from System.in and see if it works; it worked in my case.
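For what it's worth, a skeleton of that three-thread arrangement might look like this (a sketch, not my actual code; "bash" is just an example command, and the writer thread is left out):

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.InputStreamReader;

    public class ThreeThreadDemo {
        public static void main(String[] args) throws IOException {
            Process p = new ProcessBuilder("bash").start();
            drain(p.getInputStream(), "out");   // the child's stdout
            drain(p.getErrorStream(), "err");   // the child's stderr
            // A third thread would write commands to p.getOutputStream().
        }

        static void drain(InputStream in, String tag) {
            new Thread(() -> {
                try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
                    String line;
                    while ((line = r.readLine()) != null) {
                        System.out.println("[" + tag + "] " + line);
                    }
                } catch (IOException e) {
                    // as described above: catch the exception and restart the drainer
                }
            }).start();
        }
    }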

Just a guess, but have you tried un-combining the error and output streams?

Related

How to find status of producer using consumer

I am reading a large file, approximately 2 GB, but I don't know whether the writer has written the file completely. Is there any way to check from the reader side whether the file has been written completely?
Without any code it's going to be hard to make concrete suggestions, but why not just extend whatever class you're using as a writer, add a simple CountDownLatch to it (I'm assuming that the reader and writer are running in two different threads), and call countDown() once writing is done? On the reader side, you simply use the latch's await() method.
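A minimal sketch of the latch idea (the writing and reading bodies are placeholders):

    import java.util.concurrent.CountDownLatch;

    public class WriteThenRead {
        public static void main(String[] args) throws InterruptedException {
            CountDownLatch writeDone = new CountDownLatch(1);

            new Thread(() -> {
                try {
                    // ... write the 2 GB file here ...
                } finally {
                    writeDone.countDown();   // signal: file completely written
                }
            }).start();

            writeDone.await();               // the reader blocks here until the signal
            // ... safe to read the file now ...
        }
    }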
AFAIK, there isn't a portable way to do it. However, on Linux you can test whether some other process has a file open by running the lsof command and processing its output.
For example:
lsof /home/fred/someFile
will output the pids of the process or processes that have that file open. If you can assume that the process writing the file closes it once it has finished (and doesn't reopen it), you can use the lsof output to test whether the file has been fully written.
You can run the lsof command from Java using Process / ProcessBuilder.
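For example, something along these lines (a sketch; assumes lsof is on the PATH, and uses its -t flag to print pids only):

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;

    // Returns true if lsof reports any process holding the file open (Linux only).
    static boolean isFileOpen(String path) throws IOException, InterruptedException {
        Process p = new ProcessBuilder("lsof", "-t", path).start();
        boolean anyPid;
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            anyPid = r.readLine() != null;   // any line at all means some process has it open
        }
        p.waitFor();   // lsof exits non-zero when nothing has the file open
        return anyPid;
    }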

Need to kill a Java program started by Runtime.getRuntime().exec()

In a Java program I am running another Java program using the following command:
Process p = Runtime.getRuntime().exec("echo.bat | java -Xms64M -Xmx1G -cp "+execFilePath+" "+inputFileName+" "+inputParam);
The invocation works fine, but if the executed Java file (inputFileName) hangs due to bad coding, say an infinite loop, the newly started process never ends. I need to kill that process from my code.
I can detect that the Java program is hanging by using a timeout, but I don't know how to get the process id of the executed Java program and kill it once the timeout happens.
Any help is appreciated!
Generally, you can call destroy() on your Process instance.
I notice, however, that you are starting one process and piping its output to another. The simple approach will most probably only kill the former (your echo.bat process).
Therefore, you need a more complex scenario. Do the following:
Start a process calling echo.bat only
Wait until it is finished
Read all of its output (via Process.getInputStream(), which reads the child's stdout)
Start another process, calling the java program only.
Write the captured data to its input (via Process.getOutputStream(), which feeds the child's stdin)
This second process instance will be your java process
As fge mentioned above, take a look at ProcessBuilder, as it simplifies some of the steps. In particular, setting up the input stream can be done before actually starting the program.
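A rough sketch of those steps (untested; execFilePath, inputFileName, and inputParam are the question's variables, and waitFor with a timeout and destroyForcibly() require Java 8+):

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.util.concurrent.TimeUnit;

    static void runPipeline(String execFilePath, String inputFileName, String inputParam)
            throws IOException, InterruptedException {
        // Step 1: run echo.bat on its own.
        Process echo = new ProcessBuilder("cmd", "/c", "echo.bat").start();

        // Steps 2-3: capture everything it prints, then wait for it to finish.
        // (Reading before waitFor() avoids a deadlock if the pipe buffer fills up.)
        ByteArrayOutputStream captured = new ByteArrayOutputStream();
        InputStream from = echo.getInputStream();   // reads the child's stdout
        byte[] buf = new byte[4096];
        int n;
        while ((n = from.read(buf)) != -1) captured.write(buf, 0, n);
        echo.waitFor();

        // Steps 4-5: start the java program and feed it the captured data.
        Process java = new ProcessBuilder("java", "-Xms64M", "-Xmx1G",
                "-cp", execFilePath, inputFileName, inputParam).start();
        OutputStream to = java.getOutputStream();   // writes to the child's stdin
        to.write(captured.toByteArray());
        to.close();

        // This second Process instance is the one to kill on timeout:
        if (!java.waitFor(60, TimeUnit.SECONDS)) {
            java.destroy();   // or destroyForcibly() if destroy() isn't enough
        }
    }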

What to do with unneeded streams from an external process?

When I execute a command in a separate process, for example by using the Runtime.getRuntime().exec(...) method, whose JavaDoc states:
Executes the specified command and arguments in a separate process.
What do I need to do with the streams from this process, knowing that the process will live until the Java program exits? (This is a detail, but the Java program takes care of killing this process, and the process itself has a built-in safety where it kills itself should it notice that the Java program that spawned it is no longer running.)
If we consider that this process produces no output at all (for example because all error messages and stdout are redirected to /dev/null and all communications are done using files/sockets/whatever), what do I need to do with the input stream?
Should I have one (or two?) Java threads running for nothing, trying to read stdout/stderr?
What is the correct way to deal with a long-living external process spawned from a Java program that produces no stdout/stderr at all?
EDIT
Basically I wrap the shell script in another shell script that makes sure to redirect everything to /dev/null. I'm pretty sure my Un*x would be non-compliant if my "outer" shell script (the one redirecting everything to /dev/null) still generated anything on stdout or stderr. Yet I find it mind-boggling that I would somehow be expected to have threads running during the lifecycle of the app "for nothing". Really boggles the mind.
If everything is as you say, then you can probably ignore them.
However, things rarely work out so cleanly. It may be worth it in the long run to spawn a single thread to pull stdout/stderr, just in case: the one day it fails and actually puts something out is the day you need to know what came out. One or two threads (I think it can be done with just one) won't be much overhead, especially if you are correct and nothing ever comes out of those streams.
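That single drainer thread could look something like this (a sketch; "some-command" is a placeholder, and stderr is merged into stdout so one thread suffices):

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.io.InputStreamReader;

    public class ChildDrainer {
        public static void main(String[] args) throws IOException {
            ProcessBuilder pb = new ProcessBuilder("some-command");
            pb.redirectErrorStream(true);   // merge stderr into stdout: one stream, one thread
            Process p = pb.start();

            new Thread(() -> {
                try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
                    String line;
                    while ((line = r.readLine()) != null) {
                        // Log rather than discard: the day something shows up here
                        // is the day you need to know what it was.
                        System.err.println("[child] " + line);
                    }
                } catch (IOException ignored) {
                    // stream closed; the child has exited
                }
            }, "child-output-drainer").start();
        }
    }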
I believe the correct way to deal with a process's input and output, if you are not interested in them, is to close them promptly. If the child process subsequently tried to read from stdin or write to stdout, an IOException would be thrown. It would be the responsibility of the child process to deal with the fact that it cannot read or write.
Most processes will ignore the fact that they cannot write and silently discard the writes. This is true in Java, where System.out is a PrintStream, so any IOExceptions raised by writing to stdout are swallowed. This is pretty much what happens when you redirect output to /dev/null -- all output is silently discarded.
It sounds like you've read the API docs on Process and why it's important to read from and write to a process that expects to do any writing or reading of its own. But I'll reiterate: the problem is that some OSes allocate only very limited buffers for (specifically) stdout, so it is important not to let these buffers fill up. That means either reading any output of the child process promptly, or notifying the OS that you do not require the output of the process, so it can release any resources held and reject any further attempts to write to stdout or read from stdin (rather than just hanging until resources become available).
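Closing the unneeded streams would look something like this (a sketch; "some-command" is a placeholder):

    import java.io.IOException;

    public class DiscardChildStreams {
        public static void main(String[] args) throws IOException {
            Process p = new ProcessBuilder("some-command").start();
            // Tell the OS we will never touch the pipes: the child sees EOF on its
            // stdin and write failures on stdout/stderr, instead of blocking on a
            // full buffer.
            p.getOutputStream().close();   // the child's stdin
            p.getInputStream().close();    // the child's stdout
            p.getErrorStream().close();    // the child's stderr
        }
    }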

Java Dealing with child Process

I have a simple script that writes output to the console; I run it with ProcessBuilder and write to it via an OutputStreamWriter, as in this earlier thread:
Java Process with Input/Output Stream
I believe my problem rests with the fact that a sub-process is spawned and the initial parent process is killed, resulting in the OutputStreamWriter throwing a java.io.IOException: Broken pipe exception. Given that the application spawns a second process, how can I connect my OutputStreamWriter to this new process, including being able to read back the output it is generating? Is this even possible in Java? Surely Java should be able to follow on to the spawned process.
Thanks
Are you calling waitFor() on the process you are running? That should ensure your parent doesn't complete before the child. You may also want to look at Commons Exec, an open-source library designed to make your life easier when running separate processes in Java.
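The waitFor() part in sketch form ("child-command" is a placeholder):

    import java.io.IOException;

    public class WaitForChild {
        public static void main(String[] args) throws IOException, InterruptedException {
            Process p = new ProcessBuilder("child-command").start();
            int exit = p.waitFor();   // the parent blocks here until the child exits
            System.out.println("child exited with code " + exit);
        }
    }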

External program blocks when run by Runtime exec

I'm attempting to launch an instance of the VideoLAN program from within a Java application. One of the ways I've tried to do this is shown here:
Process p = Runtime.getRuntime().exec("\"C:\\Program Files\\VideoLAN\\VLC\\vlc.exe\" \"http://www.dr.dk/Forms/Published/PlaylistGen.aspx?qid=1316859&odp=true\" :sout=#std{access=udp,mux=ts,dst=127.0.0.1:63928}");
If I execute the above command, the vlc program is launched and starts a streaming operation (it goes through connect, buffering, and streaming phases).
When the command is executed by Runtime.exec (or ProcessBuilder.start), the vlc program hangs when it reaches the end of the buffering phase. If all threads in the Java program terminate or run to completion, the vlc program progresses to the streaming phase. The Java process will not terminate until the vlc process is closed, so this behavior is obviously the result of some sort of coupling between the processes.
I have tried to execute the command indirectly by writing it to a .cmd file and then executing that, but it results in the same behavior.
Any ideas for how I can avoid the external process hanging?
Hmm, my guess would be that VLC has filled its stdout buffer and is hung in a printf statement, waiting for that buffer to empty.
You need to get the stream for the process's output and read it (even if you discard it).
I recommend you read this article
On the 4th page is a good example of how to read the streams in threads so your child process won't block.
This site is fantastic :). For some reason an approach I thought had already been tried suddenly started working.
The problem is that vlc writes to its stderr (which is not visible when executed in a prompt) and then blocks once some output buffer is full. A solution is to redirect stderr to stdout and then have a thread empty the process object's input stream.
It is not an optimal solution, however, since I need a fair number of external processes and you can't do non-blocking I/O on their input streams. I will experiment a bit with having a timer service drive empty-reading for a number of processes, as sketched below. Other suggestions for how to de-couple the processes and avoid this problem are very welcome.
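The timer-service idea might be sketched like this (untested; it polls available() so the single thread never blocks, and assumes each process was built with redirectErrorStream(true)):

    import java.io.IOException;
    import java.io.InputStream;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    public class TimerDrainer {
        // The external processes to keep drained; populated elsewhere.
        static final List<Process> processes = new ArrayList<>();

        public static void main(String[] args) {
            ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();
            timer.scheduleWithFixedDelay(TimerDrainer::drainAll, 0, 50, TimeUnit.MILLISECONDS);
        }

        // Read only what is already buffered, so this never blocks.
        static void drainAll() {
            byte[] buf = new byte[4096];
            for (Process p : processes) {
                try {
                    InputStream in = p.getInputStream();
                    while (in.available() > 0) {
                        int n = in.read(buf, 0, Math.min(buf.length, in.available()));
                        if (n <= 0) break;   // bytes are discarded (they could be logged instead)
                    }
                } catch (IOException ignored) {
                    // the process has exited and its stream is closed
                }
            }
        }
    }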
