Process.getInputStream() seems to overwrite the output from the subprocess? - java

I am trying to get the output of a program (written in C++) from Java by listening to its STDOUT/STDERR. When I run the C++ program from the console, I can see the expected output and everything works fine. When the program ends (I am basically looking for error codes), it spits out 3-4 lines almost immediately and then stops producing output. Viewed from the console, this is fine. However, when listening to STDOUT from Java, I am missing the final line of the output, and lines seem to be "colliding", like this:
INFO: max_encoded_bytes = 76475 aERROR: RTMP_Connect0, failed to connect socket. 110 (Connection timed out).
When they should look like this:
INFO: max_encoded_bytes = 76475 at [line 939: program.cpp]
ERROR: RTMP_Connect0, failed to connect
FATAL: Error code 200
My code currently looks like this:
processBuilder.redirectErrorStream(true);
processLock.lock();
logger.debug("Starting " + name + " process");
try {
    Process process = processBuilder.start();
    InputStream is = process.getInputStream();
    InputStreamReader isr = new InputStreamReader(is);
    BufferedReader br = new BufferedReader(isr);
    Future<?> outputFuture = executor.submit(() -> {
        br.lines().forEach(line -> {
            outputHandler.handleOutput(line);
        });
    });
    try {
        process.waitFor();
        outputFuture.get();
        logger.debug(name + " process completed naturally");
    } catch (ExecutionException e) {
        logger.error("ExecutionException during " + name + " process: " + e.getMessage());
    } catch (InterruptedException e) {
        logger.debug(name + " process interrupted");
        process.destroy();
    } finally {
        br.close();
        isr.close();
        is.close();
    }
} catch (IOException e) { } // etc.
Which, based on the resources I have found on the web, seems to be a pretty standard pattern for processing output.
I was reviewing some other SO questions that seemed similar, but I wasn't exactly sure since their situations are not identical to mine (see here and here).
I have tried changing how I read from STDOUT in a whole bunch of different ways (removing the BufferedReader, using the InputStream directly, using a ByteArrayOutputStream, changing buffer sizes to some absurdly large and small values, etc.). My goal is to match the console output, printing each line as it happens. What can I do to get the output I think I should be getting?
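For reference, one variant I have been experimenting with (a sketch, not yet verified) drops redirectErrorStream(true) and reads the two streams on separate threads, in case the merged stream is what lets lines collide mid-line:
Process process = processBuilder.start();
Runnable pumpStdout = () -> {
    try (BufferedReader r = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
        r.lines().forEach(outputHandler::handleOutput);
    } catch (Exception e) {
        logger.error("stdout reader failed: " + e.getMessage());
    }
};
Runnable pumpStderr = () -> {
    try (BufferedReader r = new BufferedReader(new InputStreamReader(process.getErrorStream()))) {
        r.lines().forEach(outputHandler::handleOutput);
    } catch (Exception e) {
        logger.error("stderr reader failed: " + e.getMessage());
    }
};
Future<?> outFuture = executor.submit(pumpStdout);
Future<?> errFuture = executor.submit(pumpStderr);
process.waitFor();
// both futures complete only once their stream hits EOF,
// so no trailing lines should be lost here
outFuture.get();
errFuture.get();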

Related

Client-Thread mysteriously leaves out Java code, thus not answering

Every line of code is executed -> Event -> Important Java code lines skipped
# Client-Server # AssumeNoCodeOptimization # IDE:Processing (Processing.org by MIT)
There is a Thread "ConnectionHandler" that stays in a loop where it either sends or receives data. Before it receives the first data, every line of code is executed. After that, the program seems to pause at a System.out.println()!?
Then, when more data arrives, certain lines of Java code are executed again, but the very same System.out.println() I mentioned before - among other things - is skipped.
The real problem is that the method sendOutput() is skipped. This is what really grinds my gears. Please help.
I am programming in an environment called Processing, developed at MIT. As far as I know, it only wraps more Java code around your code.
Below the following code snippets I will explain why I think code optimization by the JIT or AOT compiler is not the problem (in the end I will probably turn out to be wrong; I hope not, because you cannot pass system variables to Processing to suppress optimization).
private void runConnectionHandler() {
    final BufferedReader inFromClient = getBufferedReader(socket);
    final DataOutputStream dataOutputStream = getDataOutputStream(socket);
    while (true) {
        getInput(inFromClient);
        sendOutput(dataOutputStream);
        System.out.println("Cycle ends");
    }
}

private void getInput(final BufferedReader input) {
    System.out.println("Get");
    try {
        String clientSentence = null;
        if (input.ready()) {
            while ((clientSentence = input.readLine()) != null) {
                inputQueue.add(new GameData(clientSentence));
                System.out.println("Received se: \"" + clientSentence + "\"");
                System.out.println("1");
            }
            System.out.println("2");
        }
        System.out.println("3");
    } catch (final Exception e) {
        e.printStackTrace();
    }
}

private void sendOutput(final DataOutputStream dataOutputStream) {
    System.out.println("Send");
    while (outputQueue.peek() != null) {
        try {
            String out = outputQueue.poll().toString();
            dataOutputStream.writeBytes(out + "\r\n");
            dataOutputStream.flush();
            System.out.println("Sent \"" + out + "\"");
        } catch (final Exception e) {
            e.printStackTrace();
        }
    }
}
OUTPUT: Before first data:
Having no idea about optimization, I would think it should also take effect here, but it does not?
...
Get
3
Send
Cycle ends
...
OUTPUT: First data arrives
After the "1" the output stops. Why is System.out.println("2"); and so on not executed?! Why does it stop?!?!
...
Get
3
Send
Cycle ends
Get
Received se: "SET PLAY MODE"
1
OUTPUT: Manually sent more data
Gad dayium, I mean where did my System.out.println("Get"); and everything else go?!
...
Get
Received se: "SET PLAY MODE"
1
Received se: "Hey Ho1"
1
OUTPUT: Server shuts down
When I shut down the other side (the server), every line of code is executed again?!?! Furthermore, it neither stops nor is an exception thrown (but that may be a different issue).
...
Get
3
Send
Cycle ends
...
Also, if my concept is deeply broken, I would appreciate any hints.
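One idea I have not tried yet: since readLine() blocks until a full line (or end of stream) arrives, the "skipped" lines might just be this thread parked inside readLine() waiting for the server's next newline. A sketch (untested; note that ready() only promises the next read of a single character won't block, not that a whole line is available):
private void getInput(final BufferedReader input) {
    System.out.println("Get");
    try {
        // re-check ready() before every readLine() so the thread
        // does not park inside readLine() waiting for the next line
        while (input.ready()) {
            final String clientSentence = input.readLine();
            if (clientSentence == null) {
                break; // end of stream: the server closed the connection
            }
            inputQueue.add(new GameData(clientSentence));
            System.out.println("Received se: \"" + clientSentence + "\"");
        }
        System.out.println("3");
    } catch (final Exception e) {
        e.printStackTrace();
    }
}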

Detecting completion of process inside microservice

I have a Java microservice which shells out to execute a program and then monitors stderr until nothing is returned:
Process p = Runtime.getRuntime().exec("/bin/sh /var/task/bin/iTMSTransporter " + commandLine);
BufferedReader stdError = new BufferedReader(new InputStreamReader(p.getErrorStream()));
logger.log("StdErr:\n");
while ((s = stdError.readLine()) != null && p.isAlive()) {
    System.out.println("stderr: " + s);
    if (s.indexOf("DBG-X") == -1) {
        result.stdErr += s + "\n";
    }
}
this.logger.log("finished reading stderr\n");
This was effective at detecting the completion of the external program, and it always worked. Now the external program has been updated; it starts outputting to stderr but then just stops (there should be more to the stderr stream), and eventually it times out.
I then added the p.isAlive() check as an attempt to detect the shelled program's completion. This seemed to have no impact. Now here's the frustrating part... in an older version of my microservice I used NodeJS instead of Java to shell out to run this program. The NodeJS version still works by listening for the close event:
shell.stdout.on('data', data => {
    stdout += data;
});
shell.stderr.on('data', data => {
    stderr += data;
});
shell.on('error', error => {
    reject(error);
});
shell.on('close', () => {
    resolve({
        stdout: stdout,
        stderr: stderr
    });
});
Is there something equivalent I can do with Java?
---- Addition -----
I tried something I'd seen online:
Runtime rt = Runtime.getRuntime();
rt.addShutdownHook(new Thread(new Runnable() {
    public void run() {
        System.out.println("\nGOT HERE\n");
    }
}));
Process p = rt.exec("/bin/sh /var/task/bin/iTMSTransporter " + commandLine);
Process p = rt.exec("/bin/sh /var/task/bin/iTMSTransporter " + commandLine);
thinking I'd detect the exit in a new thread and output "GOT HERE", but that is never sent to the console.
If you just want to wait for the process to finish, waitFor should be appropriate.
Process p = Runtime.getRuntime().exec("/bin/sh /var/task/bin/iTMSTransporter " + commandLine);
try {
    p.waitFor();
    // now, the process has terminated
} catch (InterruptedException e) {
    // something went wrong
    e.printStackTrace();
}
If you do want to capture the out- and err-streams, that's actually somewhat complicated because you'd need to create new threads for that (not that that's a huge problem, but it's OTT if you just want to know that the process has finished).
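For example, something along these lines (a minimal sketch; readAll is a hypothetical helper and error handling is omitted) is roughly the Java equivalent of your NodeJS close handler:
ExecutorService pool = Executors.newFixedThreadPool(2);
Process p = Runtime.getRuntime().exec("/bin/sh /var/task/bin/iTMSTransporter " + commandLine);
// one thread per stream, so neither pipe can fill up and stall the child
Future<String> stdout = pool.submit(() -> readAll(p.getInputStream()));
Future<String> stderr = pool.submit(() -> readAll(p.getErrorStream()));
int exitCode = p.waitFor(); // returns once the process terminates
String out = stdout.get();  // both streams are at EOF by this point
String err = stderr.get();
pool.shutdown();

// hypothetical helper: drain a stream into a String
static String readAll(InputStream in) throws IOException {
    StringBuilder sb = new StringBuilder();
    try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
        String line;
        while ((line = r.readLine()) != null) {
            sb.append(line).append('\n');
        }
    }
    return sb.toString();
}
On Java 9 and later, p.onExit() also returns a CompletableFuture<Process>, which is probably the closest direct analogue of the NodeJS close event.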
Also maybe see process.waitFor() never returns.

Java: Write to and read from same process multiple times

I've gone through so many related StackOverflow questions for this that I'm getting lost in them, and I've coded this multiple ways, but none seem to solve this problem in a way that works for me: How can I send output to the same command and process multiple times while at the same time receiving input from this same process?
(See Input various strings to same process in Java for a similar question, but this ended with only a theoretical answer.)
The command (command line, from a C++ executable) loads a large file, and then I want to send input to it very quickly, get back the answer, do other stuff in between, then send different input and get the corresponding answer. Multiply this by thousands or millions of times.
One implementation, with threads:
ProcessBuilder pb = new ProcessBuilder(command.split(" "));
kenLMProcess = pb.start();

KenLMInThread lmInput = new KenLMInThread(kenLMProcess.getInputStream());
KenLMInThread lmError = new KenLMInThread(kenLMProcess.getErrorStream());
KenLMOutThread lmOutput = new KenLMOutThread(kenLMProcess.getOutputStream());

lmOutput.inStr = "Test . \n";
lmInput.start();
lmOutput.start();
lmError.start();

lmOutput.join();
lmInput.join();
lmError.join();

outStr = lmInput.newStr;
But join() waits until the thread ends. What if I don't want to wait for it to end? I can't seem to figure out how to use wait() for that purpose. For one, I'd prefer not to have to keep opening and closing a new output stream and input stream every time I query the command. But at least that's better than starting a new ProcessBuilder every time.
Here's what run() looks like for KenLMOutThread:
public void run() {
    try {
        pw.write(inStr + "\n");
        pw.write('\n');
    } catch (Exception e) {
        System.out.println("Error while inputting to KenLM.");
        e.printStackTrace();
    } finally {
        pw.flush();
        try {
            pw.flush();
            bw.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Supposedly flush() should let it move on, and the "\n" at the end is supposed to help, but it just hangs unless I use close(). And if I use close(), I can't use the OutputStream anymore, and I'm then also unable to make a new OutputStream from the Process.
If it helps, here's a more simple implementation with everything together (taken from How to send EOF to a process in Java?):
Note that close() is used, and using flush() without close() causes the program to hang.
public static String pipe(String str, String command2) throws IOException, InterruptedException {
    Process p2 = Runtime.getRuntime().exec(command2);
    OutputStream out = p2.getOutputStream();
    out.write(str.getBytes());
    out.close();
    p2.waitFor();
    BufferedReader reader = new BufferedReader(new InputStreamReader(p2.getInputStream()));
    StringBuilder sb = new StringBuilder();
    String line;
    while ((line = reader.readLine()) != null) {
        sb.append(line + "\n");
    }
    return sb.toString();
}
Other things I've tried:
Using exec(): Process kenLMProcess=Runtime.getRuntime().exec(command);
Putting the command process in its own thread: KenLMProcessThread procThread = new KenLMProcessThread(pb.start());
If the target process is hanging unless you close the output stream, the problem is at that end: it is reading until end of stream before doing anything. Nothing you can do about that at the sending end.
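That said, when the child really is line-oriented (it answers each request line with exactly one reply line and flushes as it goes), a keep-alive pattern along these lines can work. This is a sketch only, with a hypothetical wrapper class; it falls apart as soon as the child buffers its output or insists on reading to EOF first:
class LineOrientedProcess implements Closeable {
    private final Process proc;
    private final BufferedWriter toProc;
    private final BufferedReader fromProc;

    LineOrientedProcess(String... command) throws IOException {
        proc = new ProcessBuilder(command).start();
        toProc = new BufferedWriter(new OutputStreamWriter(proc.getOutputStream()));
        fromProc = new BufferedReader(new InputStreamReader(proc.getInputStream()));
    }

    // one request line in, one reply line out; flush but never close,
    // so the same process can be queried thousands of times
    String query(String line) throws IOException {
        toProc.write(line);
        toProc.newLine();
        toProc.flush();
        return fromProc.readLine();
    }

    @Override
    public void close() throws IOException {
        toProc.close(); // sends EOF; the child should then exit
        proc.destroy();
    }
}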

Redirecting output of a process with Process Builder

I am trying to run plink in my own console window. I started by using Runtime.exec() and that worked fine. Then I moved to using ProcessBuilder, and now the output is not sent out until I kill the process.
My code looks like this:
class ConsoleOutputThread extends Thread {

    public void start(String processName) {
        // this was old code: Runtime r = Runtime.getRuntime();
        try {
            builder = new ProcessBuilder("plink", "-ssh", "192.168.3.21");
            builder.redirectErrorStream(true);
            process = builder.start();
            // this was old code: process = r.exec(processName);
        } catch (IOException ex) {
        }
        start();
    }

    @Override
    public void run() {
        try {
            BufferedReader is = new BufferedReader(new InputStreamReader(process.getInputStream()));
            writer = new BufferedWriter(new OutputStreamWriter(process.getOutputStream()));
            try {
                process.waitFor();
            } catch (InterruptedException ex) {
            }
            char[] b = new char[1];
            while (is.read(b, 0, 1) > 0) {
                // this is for debug, normally sent to console
                System.out.println("Got character: " + b[0]);
            }
        } catch (IOException ex) {
        }
    }
}
So, when using Runtime.exec(), everything worked fine. Now, with ProcessBuilder, the read blocks forever (actually until I kill the process, when everything is spat out at once). However, the error stream works, i.e. if I pass a bad option I get the messages in the console.
I am probably missing something here and looking for help.
Thank you
You've set the plink process to write its output to a pipe which is connected to the Java process. Anything output by the plink process will be saved in an operating-system buffer until your process reads it. The OS buffer has a limited capacity, and if plink writes too much data, it will block until your process reads some data from the buffer.
Unfortunately, the Java process waits for the plink process to complete before reading anything from the pipe. So, if the plink process writes too much output, it will block indefinitely.
You should change the Java logic to read the plink process's output before calling waitFor().
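Concretely, the reordering looks something like this (a sketch; exception handling omitted):
process = builder.start();
BufferedReader br = new BufferedReader(new InputStreamReader(process.getInputStream()));
// drain the pipe while plink is still running...
String line;
while ((line = br.readLine()) != null) {
    System.out.println(line);
}
// ...and only then wait; the pipe is already empty, so this cannot deadlock
process.waitFor();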

Process Builder waitFor() issue and Open file limitations

I have inherited some code:
Process p = new ProcessBuilder("/bin/chmod", "777", path).start();
p.waitFor();
Basically, for some ancient and highly voodoo-based reason, key/value pairs are stored on disk as files. I don't really want to go into it.
However, I am left with a bunch of IO exceptions:
Exception :Cannot run program "/bin/chmod": java.io.IOException: error=24, Too many open files
Message: Cannot run program "/bin/chmod": java.io.IOException: error=24, Too many open files
And by a bunch I mean in the realm of 10k to millions.
I get the feeling the waitFor() call was meant to stop these from occurring by waiting for the process to complete and exit, however I think chmod returns a result before the file is actually closed. Does anyone know if that would be the cause of these exceptions?
My other inclination is that the opening and closing of thousands of files is not happening quickly enough on the Java end, and that something else is going on, perhaps some form of file buffer that isn't getting cleared out when fw.close() is called.
I am pretty new to Java and this is a hellishly weird one that has me stumped. (Thankfully the app still runs somehow... after spitting out a very large log file, that is.)
Can anyone think of a way to get around this, such as clearing buffers or increasing the open-file limit to something where the JVM can keep up with itself (assuming that is the problem)?
I presume you are running these chmod commands in a loop - otherwise I don't see why you'd get so many exceptions. It's possible that you're hitting a deadlock because you're not reading the output of the spawned processes. That certainly used to bite me back in the pre-ProcessBuilder, Runtime.exec() days.
Change your code snippet to the following pattern:
try {
    ProcessBuilder pb = new ProcessBuilder("/bin/chmod", "777", path);
    pb.redirectErrorStream(true); // merge stdout, stderr of process
    Process p = pb.start();
    InputStreamReader isr = new InputStreamReader(p.getInputStream());
    BufferedReader br = new BufferedReader(isr);
    String lineRead;
    while ((lineRead = br.readLine()) != null) {
        // swallow the line, or print it out - System.out.println(lineRead);
    }
    int rc = p.waitFor();
    // TODO error handling for non-zero rc
} catch (IOException e) {
    e.printStackTrace(); // or log it, or otherwise handle it
} catch (InterruptedException ie) {
    ie.printStackTrace(); // or log it, or otherwise handle it
}
(credit: this site) and see if that helps the situation.
Thanks for the help guys; this should sort out a load of weirdness going on elsewhere because of it.
Using your (Vinay's) example with the stream closings:
try {
    fw.close();
    ProcessBuilder pb = new ProcessBuilder("/bin/chmod", "777", path);
    pb.redirectErrorStream(true); // merge stdout, stderr of process
    p = pb.start();
    InputStreamReader isr = new InputStreamReader(p.getInputStream());
    BufferedReader br = new BufferedReader(isr);
    String lineRead;
    while ((lineRead = br.readLine()) != null) {
        // swallow the line, or print it out - System.out.println(lineRead);
    }
} catch (Exception ioe) {
    Logger.logException(Logger.WARN, ioe.getMessage(), ioe);
} finally {
    try {
        // waitFor() is here because some snipped code was causing a different
        // exception which stopped it from getting processed
        p.waitFor();
        // missing these closes was causing the mass amounts of open 'files'
        p.getInputStream().close();
        p.getOutputStream().close();
        p.getErrorStream().close();
    } catch (Exception ioe) {
        Logger.logException(Logger.WARN, ioe.getMessage(), ioe);
    }
}
Got the idea from John B Mathews' post.
It seems unlikely that the process would actually complete without closing the files. Could this be happening in a very large # of threads? Or perhaps some of them are not actually completing (ie, it is hanging at waitFor in some cases)?
Otherwise, I think you will be stuck with increasing the open files limit. Assuming that this is a Unix-like system, the "ulimit" command is probably what you are looking for.
If you're using Java 6, you could also try the new setters (for read, write, execute) on the File object. They might be slower, but they should work.
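For what it's worth, the Java 6 route avoids spawning a process per file entirely. A sketch of the rough equivalent of chmod 777:
File f = new File(path);
// the second argument (ownerOnly = false) applies each permission
// to everyone, approximating chmod 777
f.setReadable(true, false);
f.setWritable(true, false);
f.setExecutable(true, false);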
