process.waitFor() not returning - java

I would like to migrate data from a Postgres SQL script.
I am using the code below.
void importDatabase() throws IOException {
    final Process process = Runtime.getRuntime()
            .exec("psql -f ./test.sql postgresql://postgres:root@127.0.0.1:5432/testdb");
    // Wait to get the exit value
    try {
        if (process.waitFor() != 0) {
            String line;
            BufferedReader input = new BufferedReader(
                    new InputStreamReader(process.getErrorStream()));
            while ((line = input.readLine()) != null) {
                log.error(line);
            }
        }
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}
waitFor() never returns; it just waits forever.
Note: My test.sql has many queries so it will display many log entries.
Question:
How to solve this problem?
If this relates to the buffer memory, how do I clear the buffer?
Thank you.

If a process executed by Runtime.exec() produces output -- either to stdout or stderr -- then for it to operate robustly you need to start consuming the output before calling Process.waitFor(). Most likely, your psql process is blocked because it's waiting for something to read its output, and you don't start to do that until Process.waitFor() is complete.
You probably need to consume the output in a separate thread, unless you know exactly what output is expected. In any case, there may, or may not, be a problem with your invocation of psql -- unless you capture its output you probably won't know.
Robust and portable use of Runtime.exec() is surprisingly difficult. I've written about this at length, with code samples, here:
http://kevinboone.me/exec.html
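A minimal sketch of that advice applied to the question (an assumption, not the author's exact code; the class and method names are made up for illustration): drain stdout and stderr on background threads *before* calling waitFor(), so the child can never block on a full pipe.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

public class DrainBeforeWait {

    /** Runs cmd, draining stdout and stderr concurrently, then returns the exit code. */
    public static int run(String[] cmd, StringBuilder out, StringBuilder err)
            throws IOException, InterruptedException {
        Process process = Runtime.getRuntime().exec(cmd);
        Thread outDrain = drain(process.getInputStream(), out);
        Thread errDrain = drain(process.getErrorStream(), err);
        int exit = process.waitFor();   // safe: both pipes are being emptied
        outDrain.join();
        errDrain.join();
        return exit;
    }

    private static Thread drain(InputStream in, StringBuilder sink) {
        Thread t = new Thread(() -> {
            try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
                String line;
                while ((line = r.readLine()) != null) {
                    synchronized (sink) {
                        sink.append(line).append('\n');
                    }
                }
            } catch (IOException ignored) {
                // pipe closed because the process exited; normal shutdown
            }
        });
        t.start();
        return t;
    }
}
```

For the psql case you would pass something like new String[]{"psql", "-f", "./test.sql", "postgresql://..."} and log err when the exit code is non-zero.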

Related

Java - Handling Process (Builder) Output

In my program I have a SwingWorker starting a background process. The background process's error stream is redirected to stdout, and the stdout stream is written out (line by line) to a JTextArea. I thought I was consuming the stdout stream with this
BufferedReader processOut = new BufferedReader(
new InputStreamReader(p.getInputStream()));
And with this inside the SwingWorker's doInBackground:
String line;
while ((line = processOut.readLine()) != null)
    process(line);
On my machine, the process executes to completion and the text area is periodically updated. On other computers, though, the process freezes in the background. I've increased the size of the default command window, which might be why I can't reproduce the freeze on my machine (though from the more I've read, that explanation is probably wrong).
I tried redirecting the output inside the ProcessBuilder command with > log.txt (I'm on Windows 7 currently) but I think that's causing the p.getInputStream() call to crash.
How can I consume the stdout of the subprocess properly inside my SwingWorker class? Alternatively, is it possible to pipe the output to a file and still have it printed to the JTextArea?
Edit:
I read here that an input stream needs to be read promptly to be consumed. I've provided the loop that processes the input stream below. I would assume that the pipe doesn't get more than 4K of data before it is read, but I can't assume anything at this point.
while (processIsAlive()) {
    if (isCancelled()) {
        try {
            p.destroy();
            p.waitFor();
            return 1;
        } catch (Exception e) {
            // ignored: the worker is being cancelled anyway
        }
    }
    try {
        // Update the text area with output from the process
        while ((line = processOut.readLine()) != null) {
            // print output to text area
            publish(line);
        }
        sleep(1000);
    } catch (Exception e) {
        // ignored: retry on the next iteration
    }
}
EDIT 2:
I thought that having the process redirected to a file, then putting an InputStream on that same file, would be impossible, but it's not crashing my program and the GUI is still updated properly. I'm going to go test this on the problematic machine, then I'm going to mark it as an answer if it works.
All I did was redirect the process to a file:
pb.redirectOutput(new File("log.txt"));
And put the InputStream on that file instead of the process's stdout stream.
processOut = new BufferedReader(new FileReader("log.txt"));
When starting processes, it's best to read stdout and stderr each in its own thread, because reading must be done through a call to InputStream.read(), which blocks the calling thread.
You said that your background process redirects errors to stdout; fine. Then read only stdout, but always do it in a separate thread, because the main thread is blocked in the waitFor call until the process ends, and the process itself will block if its output buffer fills up and nobody is reading it.
In this case, the difference between your local environment (success) and the others (failing) could be that, in yours, the amount of data fits in the output buffer.
My second edit will work on any machine with Java 7 or later. Reading from the input stream is blocking, so it is not appropriate on a GUI thread, but the process streams are consumed appropriately.
The actual problem on that machine was that, for some reason, the program wasn't flushing its output, so I added fflush(stdout) to the C source code of the problematic program, and it worked perfectly.

Error stream blocking when running external command with Java

Working on SEAndroid, I call Setools commands from my Java application.
It works perfectly with small SEAndroid policy and now I need to test my tool with real
SEAndroid policy. But unfortunately, I face a problem with an error stream.
Here is the code I use to call external commands:
public static BufferedReader runCommand(final String[] args)
        throws IOException {
    BufferedReader stdInput = null;
    BufferedReader stdError = null;
    try {
        Process p = Runtime.getRuntime().exec(args);
        stdInput = new BufferedReader(new InputStreamReader(p.getInputStream()));
        stdError = new BufferedReader(new InputStreamReader(p.getErrorStream()));
        // read any errors from the attempted command
        String s;
        StringBuilder err = new StringBuilder();
        while ((s = stdError.readLine()) != null) {
            err.append(s).append('\n');
        }
        if (err.length() != 0) {
            throw new IOException(err.toString());
        }
        return stdInput;
    } finally {
        if (stdError != null) {
            stdError.close();
        }
    }
}
So, as you can see, I call an external command, read the error stream, and throw an exception if there are any errors; otherwise I return the InputStream so I can parse it later.
With a real SEAndroid policy, reading the error stream seems to block (even if I read a single char) and I can't parse the result of the command. If I close the error stream without reading anything, the application works fine, but I want to handle errors if there are any.
If I type the command in a console, it works fine too.
In the first case (with small SEAndroid policy), the output of the command is small ( ~350 lines).
In the second case (with a real SEAndroid policy), the output of the command is larger ( >1500 lines).
Is it possible that the size of the output stream influences the error stream? The two streams are distinct resources, aren't they?
Does the fact that I do not read the output stream immediately matter?
I fear it's not a "programming" problem but more a system problem...
Any suggestion?
Thanks in advance for your help=)
Edit:
I tried reading the output stream before the error stream and it works. But I need to check the error stream before performing any parsing of the output stream, so the problem still stands.
First, it's probably better to use the newer ProcessBuilder class as opposed to Runtime exec. If you want to go a step further, you can even use Apache commons-exec which takes care of stream handling and other things for you.
Next, as you've discovered, process control is a tricky thing in Java and you've run into one of its tricky issues. From the documentation for java's Process class:
The parent process uses these streams to feed input to and get output from the subprocess. Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, and even deadlock.
You need to have something consuming both (Error and Output) streams or you risk deadlock - these should each be read on their own threads. Using something like a StreamGobbler (google it, there are plenty out there) would be a good step, or you can roll your own if you're so inclined. It isn't too hard to get it right but if you're unfamiliar with multithreading you may want to look at someone else's implementation or go the Apache commons-exec route.
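Rolling your own gobbler is short. A sketch under the usual assumptions (not any particular library's implementation): a Runnable that pumps one stream line by line into a caller-supplied callback, so each process stream gets its own thread.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.function.Consumer;

public class StreamGobbler implements Runnable {
    private final InputStream in;
    private final Consumer<String> sink;

    public StreamGobbler(InputStream in, Consumer<String> sink) {
        this.in = in;
        this.sink = sink;
    }

    @Override
    public void run() {
        try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = r.readLine()) != null) {
                sink.accept(line);   // deliver each line to the caller
            }
        } catch (IOException ignored) {
            // the pipe closed because the process exited; normal shutdown
        }
    }
}
```

Start one gobbler per stream before waitFor(): new Thread(new StreamGobbler(p.getInputStream(), lines::add)).start(); and likewise for p.getErrorStream().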
The processing of output is so annoying that I wrote a little library called jproc that deals with the problem of consuming stdout and stderr. It can simply filter strings through external programs like this:
ProcBuilder.filter("x y z", "sed", "s/y/a/")
It also lets you specify a timeout for completion and will convert non-zero exit codes into exceptions.

Java stuck in infinite loop executing a wmic command on Windows Server 2003

I'm trying to get a list of running processes and their file paths on a Windows Server 2003 machine. I'm using the following code to try and do that:
protected Map<String, String> getProcesses() {
    Map<String, String> processes = new HashMap<String, String>();
    try {
        String line;
        Process p = null;
        // Windows
        if (OS.indexOf("win") >= 0) {
            p = Runtime.getRuntime().exec("wmic process get description,executablepath");
            BufferedReader input = new BufferedReader(new InputStreamReader(p.getInputStream()));
            LOG.info("Entering while loop");
            while ((line = input.readLine()) != null) {
                LOG.info("blah");
                String[] array = line.split("\\s+");
                if (array.length > 1) {
                    processes.put(array[0], array[1]);
                }
            }
            LOG.info("Exited while loop");
            input.close();
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    return processes;
}
The program gets stuck in an infinite loop at the while condition. "blah" and "Exited while loop" are never output to the log. I've run the command in a command prompt on both my Win7 local machine and the server, and it outputs the information just fine. I've also run the above code on my local machine, where it also works fine. It looks like some issue between Java and Windows Server 2003 that I haven't been able to find in the past 3 hours of googling. Any help would be much appreciated.
You will need to get and close your OutputStream before getting and using your InputStream. That will confirm to the process that you've started that you have finished sending input (in this case, no input) to the process.
p.getOutputStream().close();
Remember that on the Process object, getInputStream() input comes from the output stream of the process, and getOutputStream() output goes to the input stream of the process.
Remember that the BufferedReader.readLine() operation will block if the end of input has not been reached; see here.
I think what you are experiencing is explained in the API for Process:
The methods that create processes may not work well for special processes on certain native platforms, such as native windowing processes, daemon processes, Win16/DOS processes on Microsoft Windows, or shell scripts. The created subprocess does not have its own terminal or console. All its standard io (i.e. stdin, stdout, stderr) operations will be redirected to the parent process through three streams (getOutputStream(), getInputStream(), getErrorStream()). The parent process uses these streams to feed input to and get output from the subprocess. Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, and even deadlock.
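To illustrate the first point, here is a sketch (an assumption: it uses the Unix cat command rather than wmic, since cat hangs forever on open stdin, which makes the effect easy to see). A child that reads stdin never exits unless the parent closes its end of the pipe, which is exactly the getOutputStream().close() advice above.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class CloseStdinDemo {

    /** Runs cmd with stdin closed immediately, drains stdout, and returns the exit code. */
    public static int run(String... cmd) throws IOException, InterruptedException {
        Process p = Runtime.getRuntime().exec(cmd);
        p.getOutputStream().close();   // tell the child: no input is coming
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            while (r.readLine() != null) {
                // discard; we only care that the pipe is drained
            }
        }
        return p.waitFor();
    }
}
```

Without the close(), cat would block reading stdin and waitFor() would never return; with it, cat sees end-of-file and exits immediately.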

Read output from external process

I am trying to run a .csh script and read its output into a StringBuffer.
The output sometimes comes back empty, although running the script from a console produces output. The same flow can sometimes return output and sometimes not, although nothing is changed in the way the process starts (same script, path, args) and the script isn't changed either.
I'm not getting any exceptions thrown.
What might cause the output to sometimes not be read correctly?
the code segment is
public static String getOutputScript(Process p) {
    InputStream output = p.getInputStream();
    logger.info("Retrieved script output stream");
    BufferedReader buf = new BufferedReader(new InputStreamReader(output));
    String line;
    StringBuffer write = new StringBuffer();
    try {
        while ((line = buf.readLine()) != null) {
            write.append(line);
        }
    } catch (IOException e) {
        // do something
    }
    return write.toString().trim();
}
Aside from the fact that not closing the streams is bad practice, could this or something else in the code prevent the output from being read correctly under some circumstances?
thanks,
If you launch it with ProcessBuilder, you can combine the error stream into the output stream. This way, if the program prints to stderr, you'll capture this too. Alternatively, you could just read both. Additionally, you may not want to use readLine: you could be stuck for a while if the program does not print an end-of-line character at the end.
Maybe you must replace p.getInputStream() with p.getOutputStream()
Besides this, sometimes processes can block waiting on input, so you must read and write asynchronously. One possible solution is to use different threads, e.g. one thread reading, another writing, and one monitoring the process.
If you have an error, this will write to getErrorStream() by default. If you have a problem, I would ensure you are reading this somewhere.
If the buffer for this stream fills, the subprocess will stop, waiting for you to read it.
A simple way around these issues is to use ProcessBuilder.redirectErrorStream(true)
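A sketch of that last suggestion (the class name is made up for illustration): with redirectErrorStream(true), stderr is merged into stdout, so a single read loop drains everything and nothing can deadlock on an unread error pipe.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

public class MergedOutput {

    /** Runs cmd with stderr merged into stdout and returns every line produced. */
    public static List<String> run(String... cmd) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(cmd);
        pb.redirectErrorStream(true);   // stderr now arrives on getInputStream()
        Process p = pb.start();
        List<String> lines = new ArrayList<>();
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                lines.add(line);
            }
        }
        p.waitFor();
        return lines;
    }
}
```

Because there is only one stream to consume, no second thread is needed here; the single loop cannot leave either pipe unread.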

call lynx from jsp script

I have an execute(String cmd) method in a JSP page that calls the exec method of the Runtime class.
It works when I call a local command, like a PHP script stored on the server, for example: /usr/bin/php /path/to/php/script arg1 arg2
So I guess my execute command is ok, since it is working with that.
Now when I try to call lynx, the text-based web browser, it does not work.
If I call it in a terminal, it works fine:
/usr/bin/lynx -dump -accept_all_cookies 'http://www.someurl.net/?arg1=1&arg2=2'
But when I call this from my execute command, nothing happens...
Any idea why?
This is my execute method:
public String execute(String cmd) {
    Runtime r = Runtime.getRuntime();
    Process p = null;
    String res = "";
    try {
        p = r.exec(cmd);
        InputStreamReader isr = new InputStreamReader(p.getInputStream());
        BufferedReader br = new BufferedReader(isr);
        String line = null;
        //out.println(res);
        while ((line = br.readLine()) != null) {
            res += line;
        }
        p.waitFor();
    } catch (Exception e) {
        res += e;
    }
    System.out.println(p.exitValue());
    return res;
}
You need to read from the Process' output stream.
Since you're not, the underlying lynx process is likely blocking while writing output, waiting for someone to empty the output stream's buffer. Even if you're going to ignore the output, you need to read it anyway for the process to execute as you'd expect.
As the javadocs of Process say, "Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, and even deadlock."
See http://www.javaworld.com/javaworld/jw-12-2000/jw-1229-traps.html for some examples of how to handle this.
Edit: in case you are wondering, chances are that when you invoked the PHP script it didn't produce a great deal of output, and so was able to terminate before filling the output buffer and blocking. The lynx command is presumably producing more output and hence hitting this issue.
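One more thing worth checking in this particular case (my observation, not part of the answer above): Runtime.exec(String) splits the command on whitespace and does not interpret shell quoting, so the single quotes around the lynx URL are passed to lynx literally. The String[] overload passes each argument verbatim. A small sketch of the difference, using echo instead of lynx:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class ExecQuoting {

    /** Returns the first line a process prints to stdout. */
    public static String firstLine(Process p) throws IOException, InterruptedException {
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line = r.readLine();
            p.waitFor();
            return line;
        }
    }

    public static void main(String[] args) throws Exception {
        // exec(String) tokenizes on whitespace; the quote characters are NOT stripped
        String quoted = firstLine(Runtime.getRuntime().exec("echo 'hello world'"));
        System.out.println(quoted);

        // exec(String[]) passes each argument verbatim; no quoting is needed
        String clean = firstLine(Runtime.getRuntime().exec(new String[]{"echo", "hello world"}));
        System.out.println(clean);
    }
}
```

So the lynx call may work better as exec(new String[]{"/usr/bin/lynx", "-dump", "-accept_all_cookies", "http://www.someurl.net/?arg1=1&arg2=2"}), with no quotes around the URL, in addition to draining the output as described above.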
I solved it.... by calling lynx into a php script, php script that I called from the Jsp script...
It's a shitty solution but at least it works... I still do not really understand why the exec command from Java works that way...
Thanks for your help anyway Andrzej (Czech I guess from the name..? ^_^), somehow you put me on the way!
