Java Process Builder always giving data in Error Stream [duplicate] - java

Can anyone clarify whether the procedure below is the correct way to handle a process's streams, without running into buffer-full or blocking problems?
I'm invoking an external program from a Java program. I'm using ProcessBuilder to build the process, and after that I perform
Process gpgProcess = processBuilder.start();
I'm handling the process using a method
String executionResult = verifyExecution(gpgProcess);
and in my method I'm trying to handle the process streams:
private String verifyExecution(Process gpgProcess) throws IOException, InterruptedException {
    String gpgResult = null;
    BufferedReader stdOut = new BufferedReader(new InputStreamReader(gpgProcess.getInputStream()));
    BufferedReader stdErr = new BufferedReader(new InputStreamReader(gpgProcess.getErrorStream()));
    gpgProcess.waitFor();
    if(stdErr.ready()) {
        gpgResult = "Exit code: " + gpgProcess.exitValue() + "\n" + readStream(stdErr);
    } else if(stdOut.ready()) {
        gpgResult = "Exit code: " + gpgProcess.exitValue() + "\n" + readStream(stdOut);
    } else {
        gpgResult = "Exit code: " + gpgProcess.exitValue();
    }
    int exitCode = gpgProcess.exitValue();
    this.setExitCode(exitCode);
    stdOut.close();
    stdErr.close();
    if(exitCode != 0) {
        throw new RuntimeException("Pgp Exception: " + gpgResult);
    }
    return gpgResult;
}
The readStream method is used to read my stream text.
private String readStream(BufferedReader reader) throws IOException {
    StringBuilder result = new StringBuilder();
    try {
        while(reader.ready()) {
            result.append(reader.readLine());
            if(reader.ready()) {
                result.append("\n");
            }
        }
    } catch(IOException ioe) {
        System.err.println("Error while reading the stream: " + ioe.getMessage());
        throw ioe;
    }
    return result.toString();
}

No, that is not the correct way to do it.
First, on some systems, your code will be stuck on the gpgProcess.waitFor() call forever, because the process cannot finish until its standard out and standard error have been fully read and consumed.
Second, you are not using the ready() method of Reader correctly. The documentation states that the method returns true only if reading a character is guaranteed not to block. Returning false does not mean that the end of the stream has been reached; it just means the next read might block (meaning, it might not return immediately).
The only ways to know when you have reached the end of a Reader’s data stream are:
check whether any of its read methods return a negative number
check whether the readLine method of BufferedReader returns null
So your readStream method should look like this:
String line;
while ((line = reader.readLine()) != null) {
    result.append(line).append("\n");
}
As of Java 8, you can make it even shorter:
return reader.lines().collect(Collectors.joining("\n"));
Similarly, you should not be calling stdErr.ready() or stdOut.ready(). Either or both methods might or might not return true, even when there are no characters available; the only guarantee for the ready() method is that returning true means the next read will not block. It is possible for ready() to return true even at the end of the character stream, when the next read would immediately return -1, as long as that read does not block.
In summary, don't use ready() at all. Consume all of both streams, and check whether the error stream is empty:
String output = readStream(stdErr);
if (output.isEmpty()) {
    output = readStream(stdOut);
}
gpgResult = "Exit code: " + gpgProcess.exitValue() + "\n" + output;
That would address the case your question appears to present: Either the Process produces standard error and no lines on standard output, or the other way around. However, this will not properly handle Processes in general.
For the general case, the easiest solution is to have the process merge its standard error with standard output using redirectErrorStream, so there is only one stream to consume:
processBuilder.redirectErrorStream(true);
Process gpgProcess = processBuilder.start();
The verifyExecution method could then contain:
String output;
try (BufferedReader stdOut = new BufferedReader(new InputStreamReader(gpgProcess.getInputStream()))) {
    output = readStream(stdOut);
}
if (output.isEmpty()) {
    gpgResult = "Exit code: " + gpgProcess.waitFor();
} else {
    gpgResult = "Exit code: " + gpgProcess.waitFor() + "\n" + output;
}
If you absolutely must have separate standard error and standard output, you need at least one background thread. I find an ExecutorService makes passing a value from a background thread easier:
ExecutorService background = Executors.newSingleThreadExecutor();
Future<String> stdOutReader = background.submit(() -> readStream(stdOut));
String output = readStream(stdErr);
if (output.isEmpty()) {
    output = stdOutReader.get();
}
background.shutdown();
if (output.isEmpty()) {
    gpgResult = "Exit code: " + gpgProcess.waitFor();
} else {
    gpgResult = "Exit code: " + gpgProcess.waitFor() + "\n" + output;
}
Finally, you should not catch and re-throw IOException just to print it out. Whatever code calls verifyExecution will have to catch IOException anyway; it is that code’s job to print, log, or otherwise handle the IOException. Intercepting it like that will probably result in its being printed twice.

There's no reliable way to tell whether a stream has data available without a call to read(), but that call will block if no data is available. Methods like available() and ready() aren't reliable, because they can give false negatives: they can report that no data is available even when there is.
A general-purpose facility that works with any process requires a separate thread to consume each InputStream. This is because, in general, a process can interleave output to stdout and stderr, and unblocking one pipe can cause the other to block, and so on. The process might write partial standard output, then block on a write to standard error. If your parent process uses just one thread, it will hang regardless of which stream it reads first. Independent threads consuming both streams make sure the child process runs smoothly (see the sketch below).
If you are running a specific process, and you can guarantee it has certain output in every case, you could take some shortcuts… keeping in mind that, "Short cuts make long delays."
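To make the two-thread approach concrete, here is a minimal sketch; the StreamGobbler class name and the gpg command line are illustrative assumptions, not part of the original question:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;

// Hypothetical helper: drains one InputStream on its own thread so the child
// process can never block on a full stdout or stderr pipe.
class StreamGobbler extends Thread {
    private final InputStream in;
    private final StringBuilder captured = new StringBuilder();

    StreamGobbler(InputStream in) {
        this.in = in;
    }

    @Override
    public void run() {
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = reader.readLine()) != null) {
                captured.append(line).append('\n');
            }
        } catch (IOException e) {
            // The pipe closed; nothing more to read.
        }
    }

    String awaitOutput() throws InterruptedException {
        join(); // wait until the stream has been fully drained
        return captured.toString();
    }
}

class StreamGobblerDemo {
    public static void main(String[] args) throws Exception {
        Process p = new ProcessBuilder("gpg", "--version").start(); // placeholder command
        StreamGobbler out = new StreamGobbler(p.getInputStream());
        StreamGobbler err = new StreamGobbler(p.getErrorStream());
        out.start();
        err.start();
        int exitCode = p.waitFor(); // safe: both pipes are being consumed concurrently
        System.out.println("Exit code: " + exitCode);
        System.out.println("stderr:\n" + err.awaitOutput());
        System.out.println("stdout:\n" + out.awaitOutput());
    }
}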

Related

Java - Use Input and OutputStream of ProcessBuilder continuously

I want to use an external tool while extracting some data (looping through lines).
For that I first used Runtime.getRuntime().exec() to execute it, but then my extraction got really slow. So I am searching for a way to exec the external tool in each iteration of the loop, using the same instance of the shell.
I found out that I should use ProcessBuilder, but it's not working yet.
Here is my code to test the execution (already incorporating input from answers here in the forum):
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.util.Properties;

public class ExecuteShell {
    ProcessBuilder builder;
    Process process = null;
    BufferedWriter process_stdin;
    BufferedReader reader, errReader;

    public ExecuteShell() {
        String command;
        command = getShellCommandForOperatingSystem();
        if(command.equals("")) {
            return; //Error! No error handling yet
        }
        //init shell
        builder = new ProcessBuilder(command);
        builder.redirectErrorStream(true);
        try {
            process = builder.start();
        } catch (IOException e) {
            System.out.println(e);
        }
        //get stdout of shell
        reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
        errReader = new BufferedReader(new InputStreamReader(process.getErrorStream()));
        //get stdin of shell
        process_stdin = new BufferedWriter(new OutputStreamWriter(process.getOutputStream()));
        System.out.println("ExecuteShell: Constructor successfully finished");
    }

    public String executeCommand(String commands) {
        StringBuffer output;
        String line;
        try {
            //single execution
            process_stdin.write(commands);
            process_stdin.newLine();
            process_stdin.flush();
        } catch (IOException e) {
            System.out.println(e);
        }
        output = new StringBuffer();
        line = "";
        try {
            if (!reader.ready()) {
                output.append("Reader empty \n");
                return output.toString();
            }
            while ((line = reader.readLine()) != null) {
                output.append(line + "\n");
                return output.toString();
            }
            if (!reader.ready()) {
                output.append("errReader empty \n");
                return output.toString();
            }
            while ((line = errReader.readLine()) != null) {
                output.append(line + "\n");
            }
        } catch (Exception e) {
            System.out.println("ExecuteShell: error in executeShell2File");
            e.printStackTrace();
            return "";
        }
        return output.toString();
    }

    public int close() {
        // finally close the shell by executing the exit command
        try {
            process_stdin.write("exit");
            process_stdin.newLine();
            process_stdin.flush();
        }
        catch (IOException e) {
            System.out.println(e);
            return 1;
        }
        return 0;
    }

    private static String getShellCommandForOperatingSystem() {
        Properties prop = System.getProperties();
        String os = prop.getProperty("os.name");
        if (os.startsWith("Windows")) {
            //System.out.println("WINDOWS!");
            return "C:/cygwin64/bin/bash";
        } else if (os.startsWith("Linux")) {
            //System.out.println("Linux!");
            return "/bin/sh";
        }
        return "";
    }
}
I want to call it from another class, like this test class:
public class TestExec {
    public static void main(String[] args) {
        String result = "";
        ExecuteShell es = new ExecuteShell();
        for (int i = 0; i < 5; i++) {
            // do something
            result = es.executeCommand("date"); //execute some command
            System.out.println("result:\n" + result); //do something with result
            // do something
        }
        es.close();
    }
}
My Problem is, that the output stream is always empty:
ExecuteShell: Constructor successfully finished
result:
Reader empty
result:
Reader empty
result:
Reader empty
result:
Reader empty
result:
Reader empty
I read the thread here: Java Process with Input/Output Stream
But the code snippets were not enough to get me going; I am missing something. I have not really worked with threads much, and I am not sure if/how a Scanner would help me. I would really appreciate some help.
Ultimately, my goal is to call an external command repeatedly and make it fast.
EDIT:
I changed the loop so that es.close() is outside of it. I also wanted to add that this is not the only thing I want to do inside the loop.
EDIT:
The timing problem was that the command I called caused an error. When the command does not cause an error, the time is acceptable.
Thank you for your answers
You are probably experiencing a race condition: after writing the command to the shell, your Java program continues to run, and almost immediately calls reader.ready(). The command you wanted to execute has probably not yet output anything, so the reader has no data available. An alternative explanation would be that the command writes nothing to stdout, only to stderr (or perhaps the shell failed to start the command?). In practice, however, you are not reading from stderr at all.
To properly handle output and error streams, you cannot check reader.ready() but need to call readLine() (which waits until data is available) in a loop. With your code, even if the program would come to that point, you would read only exactly one line from the output. If the program would output more than one line, this data would get interpreted as the output of the next command. The typical solution is to read in a loop until readLine() returns null, but this does not work here because this would mean your program would wait in this loop until the shell terminates (which would never happen, so it would just hang infinitely).
Fixing this would be pretty much impossible, if you do not know exactly how many lines each command will write to stdout and stderr.
However, your complicated approach of using a shell and sending commands to it is probably completely unnecessary. Starting a command from within your Java program is just as fast as starting it from within the shell, and much easier to write. Similarly, there is no performance difference between Runtime.exec() and ProcessBuilder (the former just calls the latter); you only need ProcessBuilder if you need its advanced features.
If you are experiencing performance problems when calling external programs, you should find out exactly where they are and try to solve them, but not with this approach. For example, one normally starts a thread for reading from each of the output and error streams (if you do not start separate threads and the command produces large output, everything might hang). Starting threads repeatedly could be slow, so you could use a thread pool to avoid spawning them again and again.
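To make that suggestion concrete, here is a minimal sketch of running the command directly on each iteration instead of keeping a shell open; the class name DirectExec is made up, and "date" is just the placeholder command from the question:

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.stream.Collectors;

public class DirectExec {
    // Runs one external command to completion and returns its combined output.
    static String run(String... command) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.redirectErrorStream(true);   // merge stderr into stdout: one stream to drain
        Process p = pb.start();
        String output;
        try (BufferedReader r = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            output = r.lines().collect(Collectors.joining("\n"));  // read until EOF
        }
        p.waitFor();                    // the stream is drained, so this cannot hang
        return output;
    }

    public static void main(String[] args) throws Exception {
        for (int i = 0; i < 5; i++) {
            System.out.println("result:\n" + run("date"));
        }
    }
}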

BufferedReader blocking when reading from process.getErrorStream()

When I run this code and the call graph is really large, the program prints up to the last line that opt outputs and then blocks at readLine, even though there is nothing left. Does anyone know what the problem is? opt -print-callgraph file sends the call graph to the error stream. I tried executing opt -print-callgraph file 2> callgraph so that I could read from a file instead, but it complains that there are too many positional arguments.
Oddly enough, the code runs fine for call graphs that are small in size.
I tried using ProcessBuilder as well but I get the same problem.
Runtime runtime = Runtime.getRuntime();
Process process = runtime.exec("opt -print-callgraph " + file);
BufferedReader in = new BufferedReader(new InputStreamReader(process.getErrorStream()));
String s = null;
try {
    // Gets stuck at readLine after printing out the last line.
    while ((s = in.readLine()) != null) {
        System.out.println(s);
    }
} catch (Exception e) {
    e.printStackTrace();
} finally {
    in.close();
}
You need to read both streams, in separate threads, or else merge them so you're reading them both at the same time. Otherwise the process can block if output is unconsumed. In this case there must be unconsumed output in stdout which is blocking the process, which means it won't finish, which means it won't close stderr, which means reading stderr will block.
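As an illustration of the merging option for the command in this question (a sketch; the class name and argument handling are assumptions, not the answerer's code):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class CallGraphDump {
    public static void main(String[] args) throws IOException, InterruptedException {
        String file = args[0];  // the bitcode file passed to opt, as in the question
        ProcessBuilder pb = new ProcessBuilder("opt", "-print-callgraph", file);
        pb.redirectErrorStream(true);   // the call graph written to stderr is merged into stdout
        Process process = pb.start();
        try (BufferedReader in = new BufferedReader(new InputStreamReader(process.getInputStream()))) {
            String s;
            while ((s = in.readLine()) != null) {
                System.out.println(s);  // readLine() now returns null once the process exits
            }
        }
        process.waitFor();
    }
}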

Java: No input from Process object until the program closes

I'm trying to get input from the console of a .exe process started by a Java program. Nothing appears in the console window, and nothing is read by the program until the process is terminated.
blServ = new ProcessBuilder(blPath + "Blockland.exe", "ptlaaxobimwroe", "-dedicated", "-port " + port, "-profilepath " + blPath.substring(0, blPath.length() - 1)).start();
System.out.println("Attempting to start server...\n" + blPath);
consoleIn = new BufferedReader(new InputStreamReader(blServ.getInputStream()));
'blServ' is a Process object. And yes, the program is starting successfully.
public void blStreamConsole() //called once every 500 milliseconds
{
    String lineStr = "";
    String line = "";
    int lines = 0;
    try
    {
        if (consoleIn != null)
        {
            while ((line = consoleIn.readLine()) != null)
            {
                //if (!line.equals("%"));
                //{
                lineStr += line + wordSym;
                lines++;
                //}
            }
        }
    }
    catch (IOException e)
    {
        netOut.println("notify" + wordSym + "ERROR: An I/O exception occured when trying to get data from the remote console. Some lines may not be displayed.");
    }
    if (!lineStr.equals("") && !(lineStr == null))
        netOut.println("streamconsole" + wordSym + lines + wordSym + lineStr);
}
Basically, this method sees if there is more input waiting in the consoleIn object, and if there is, it appends every line it has to another string, and that other string is sent to a client. Unfortunately, it is all sent in one big chunk right when Blockland.exe is closed. Sorry about the indenting issues. The Stackoverflow editor re-arranged all of the code.
It seems to me that there are two possibilities here:
readLine blocks, waiting for input (and doesn't return null as you expect). You may be able to fix it by not using BufferedReader and instead using the InputStream
The output stream doesn't flush until all the input has been written. Try putting a flush there:
Also note that if lineStr were null, you would get a NullPointerException as your code currently stands (you would need to swap your conditions), but it can never actually be null here.
if (!lineStr.isEmpty())
{
    netOut.println("streamconsole" + wordSym + lines + wordSym + lineStr);
    netOut.flush();
}
while ((line = consoleIn.readLine()) != null){
    lineStr += line + wordSym;
    lines++;
}
The problem with this piece of code is that it will keep running until the program exits. It will append every single line to lineStr until the program exits (when console.readLine() is null). The whole lineStr is then printed afterwards, containing the whole console.
If you want to continuously print the output, you will need to print it immediatly:
while ((line = consoleIn.readLine()) != null){
    netOut.println(line);
}
You can run this in a separate thread, and it will keep forwarding the console output to the output stream until the program exits.
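A sketch of that separate thread, using a small hypothetical helper (the consoleIn reader and netOut writer correspond to the question's fields; the class and method names are made up):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.PrintWriter;

// Hypothetical helper (not from the original answer): forwards console output
// to the client line by line on its own thread, as soon as each line appears.
class ConsoleForwarder {
    static Thread start(BufferedReader consoleIn, PrintWriter netOut) {
        Thread t = new Thread(() -> {
            try {
                String line;
                while ((line = consoleIn.readLine()) != null) {
                    netOut.println(line);
                    netOut.flush();
                }
            } catch (IOException e) {
                // The process has exited and the pipe is closed; stop forwarding.
            }
        });
        t.setDaemon(true);  // don't keep the JVM alive just for forwarding
        t.start();
        return t;
    }
}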

Java Process.waitFor() and Readline hangs

First, this is my code :
import java.io.*;
import java.util.Date;
import com.banctecmtl.ca.vlp.shared.exceptions.*;
public class PowershellTest implements Runnable {
public static final String PATH_TO_SCRIPT = "C:\\Scripts\\ScriptTest.ps1";
public static final String SERVER_IP = "XX.XX.XX.XXX";
public static final String MACHINE_TO_MOD = "MachineTest";
/**
* #param args
* #throws OperationException
*/
public static void main(String[] args) throws OperationException {
new PowershellTest().run();
}
public PowershellTest(){}
#Override
public synchronized void run() {
String input = "";
String error = "";
boolean isHanging = false;
try {
Runtime runtime = Runtime.getRuntime();
Process proc = runtime.exec("powershell -file " + PATH_TO_SCRIPT +" "+ SERVER_IP +" "+ MACHINE_TO_MOD);
proc.getOutputStream().close();
InputStream inputstream = proc.getInputStream();
InputStreamReader inputstreamreader = new InputStreamReader(inputstream);
BufferedReader bufferedreader = new BufferedReader(inputstreamreader);
proc.waitFor();
String line;
while (!isHanging && (line = bufferedreader.readLine()) != null) {
input += (line + "\n");
Date date = new Date();
while(!bufferedreader.ready()){
this.wait(1000);
//if its been more then 1 minute since a line has been read, its hanging.
if(new Date().getTime() - date.getTime() >= 60000){
isHanging = true;
break;
}
}
}
inputstream.close();
inputstream = proc.getErrorStream();
inputstreamreader = new InputStreamReader(inputstream);
bufferedreader = new BufferedReader(inputstreamreader);
isHanging = false;
while (!isHanging && (line = bufferedreader.readLine()) != null) {
error += (line + "\n");
Date date = new Date();
while(!bufferedreader.ready()){
this.wait(1000);
//if its been more then 1 minute since a line has been read, its hanging.
if(new Date().getTime() - date.getTime() >= 60000){
isHanging = true;
break;
}
}
}
inputstream.close();
proc.destroy();
} catch (IOException e) {
//throw new OperationException("File IO problem.", e);
} catch (InterruptedException e) {
//throw new OperationException("Script thread problem.",e);
}
System.out.println("Error : " + error + "\nInput : " + input);
}
}
I'm currently trying to run a powershell script that will start/stop a vm (VMWARE) on a remote server. The script works from the command line, and so does this code. The thing is, I hate how I have to use a thread (and make it wait for the script to respond, as explained further down) for such a job. I had to do it because both BufferedReader.readLine() and proc.waitFor() hang forever.
The script, when run from cmd, takes a long time to execute; it stalls for 30 seconds to 1 minute between validating authentication with the server and executing the actual script. From what I saw while debugging, readLine hangs when it starts hitting those delays from the script.
I'm also pretty sure it's not a memory problem since I never had any OOM error in any debugging session.
Now I understand that Process.waitFor() requires me to drain both the error stream and the regular output stream for it to work, and that's mainly why I don't use it (I need the output to manage VM-specific errors, certificate issues, etc.).
I would like to know if someone could explain why it hangs, and whether there is a way to just use a typical readLine() without having it hang so hard. Even when the script should have ended a while ago, it still hangs (I tried running both the Java application and a cmd command using exactly what I use in the Java application at the same time, left them running for an hour, and nothing worked). It is not just stuck in the while loop; the readLine() is where the hanging happens.
Also, this is a test version, nowhere close to the final code, so please spare me the "this should be a constant, this is useless", etc. I will clean the code later. Also, the IP is not XX.XX.XX.XXX in my code, obviously.
Either an explanation or a suggestion on how to fix it would be greatly appreciated.
Oh, by the way, here is the script I currently use:
Add-PSSnapin vmware.vimautomation.core
Connect-VIServer -server $args[0]
Start-VM -VM "MachineTest"
If you need more details I will try to give as much as I can.
Thanks in advance for your help!
EDIT: I also previously tested the code with a less demanding script, whose job was to get the contents of a file and print them. Since no waiting was needed to get the information, readLine() worked well. I'm thus fairly certain that the problem lies in the wait time coming from the script execution.
Also, forgive my errors, English is not my main language.
Thanks in advance for your help!
EDIT2: Since I cannot answer my own question:
Here is my "final" code, after switching to threads:
import java.io.*;

public class PowershellTest implements Runnable {

    public InputStream is;

    public PowershellTest(InputStream newIs){
        this.is = newIs;
    }

    @Override
    public synchronized void run() {
        String input = "";
        String error = "";
        try {
            InputStreamReader inputstreamreader = new InputStreamReader(is);
            BufferedReader bufferedreader = new BufferedReader(inputstreamreader);
            String line;
            while ((line = bufferedreader.readLine()) != null) {
                input += (line + "\n");
            }
            is.close();
        } catch (IOException e) {
            //throw new OperationException("File IO problem.", e);
        }
        System.out.println("Error : " + error + "\nInput : " + input);
    }
}
And the main simply creates and starts two threads (PowershellTest instances), one with the errorStream and one with the inputStream.
I believe I made a dumb error when I first coded the app and fixed it somehow as I reworked the code over and over. It still takes a good 5-6 minutes to run, which is similar to, if not longer than, my previous code (which is logical, since the errorStream and inputStream get their information sequentially in my case).
Anyway, thanks for all your answers, and especially to Miserable Variable for the hint on threading.
First, don't call waitFor() until after you've finished reading the streams. I would highly recommend you look at ProcessBuilder instead of simply using Runtime.exec, and split the command up yourself rather than relying on Java to do it for you:
ProcessBuilder pb = new ProcessBuilder("powershell", "-file", PATH_TO_SCRIPT,
        SERVER_IP, MACHINE_TO_MOD);
pb.redirectErrorStream(true); // merge stdout and stderr
Process proc = pb.start();
redirectErrorStream merges the error output into the normal output, so you only have to read proc.getInputStream(). You should then be able to just read that stream until EOF, then call proc.waitFor().
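Continuing the snippet above, a minimal sketch of that order of operations (the StringBuilder is an illustrative detail, not the answerer's exact code; imports as in the question's class):

StringBuilder output = new StringBuilder();
try (BufferedReader reader = new BufferedReader(new InputStreamReader(proc.getInputStream()))) {
    String line;
    while ((line = reader.readLine()) != null) {   // read the merged stream until EOF
        output.append(line).append("\n");
    }
}
int exitCode = proc.waitFor();   // safe now: the stream has been fully drained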
You are currently waiting to complete reading from inputStream before starting to read from errorStream. If the process writes to its stderr before stdout maybe you are getting into a deadlock situation.
Try reading from both streams from concurrently running threads. While you are at it, also remove proc.getOutputStream().close();. It shouldn't affect the behavior, but it is not required either.

Java - Flushing the OutputStream of a process doesn't send the data immediately if it's too small

I'm firing up an external process from Java and grabbing its stdin, stdout and stderr via process.getInputStream() etc. My issue is: when I want to write data to my output stream (the proc's stdin) it's not getting sent until I actually call close() on the stream. I am explicitly calling flush().
I did some experimenting and noticed that if I increased the number of bytes I was sending, it would eventually go through. The magic number, on my system, is 4058 bytes.
To test I'm sending the data over to a perl script which reads like this:
#!/usr/bin/perl
use strict;
use warnings;

print "Perl starting";

while(<STDIN>) {
    print "Perl here, printing this: $_"
}
Now, here's the java code:
import java.io.InputStream;
import java.io.IOException;
import java.io.OutputStream;

public class StreamsExecTest {

    private static String readInputStream(InputStream is) throws IOException {
        int guessSize = is.available();
        byte[] bytes = new byte[guessSize];
        is.read(bytes); // This call has side effect of filling the array
        String output = new String(bytes);
        return output;
    }

    public static void main(String[] args) {
        System.out.println("Starting up streams test!");
        ProcessBuilder pb;
        pb = new ProcessBuilder("./test.pl");
        // Run the proc and grab the streams
        try {
            Process p = pb.start();
            InputStream pStdOut = p.getInputStream();
            InputStream pStdErr = p.getErrorStream();
            OutputStream pStdIn = p.getOutputStream();
            int counter = 0;
            while (true) {
                String output = readInputStream(pStdOut);
                if (!output.equals("")) {
                    System.out.println("<OUTPUT> " + output);
                }
                String errors = readInputStream(pStdErr);
                if (!errors.equals("")) {
                    System.out.println("<ERRORS> " + errors);
                }
                if (counter == 50) {
                    // Write to the stdin of the execed proc. The \n should
                    // in turn trigger it to treat it as a line to process
                    System.out.println("About to send text to proc's stdin");
                    String message = "hello\n";
                    byte[] pInBytes = message.getBytes();
                    pStdIn.write(pInBytes);
                    pStdIn.flush();
                    System.out.println("Sent " + pInBytes.length + " bytes.");
                }
                if (counter == 100) {
                    break;
                }
                Thread.sleep(100);
                counter++;
            }
            // Cleanup
            pStdOut.close();
            pStdErr.close();
            pStdIn.close();
            p.destroy();
        } catch (Exception e) {
            // Catch everything
            System.out.println("Exception!");
            e.printStackTrace();
            System.exit(1);
        }
    }
}
So when I run this, I get effectively nothing back. If immediately after calling flush(), I call close() on pStdIn, it works as expected. This isn't what I want though; I want to be able to continually hold the stream open and write to it whenever it so pleases me. As mentioned before, if message is 4058 bytes or larger, this will work without the close().
Is the operating system (running on 64bit Linux, with a 64bit Sun JDK for what it's worth) buffering the data before sending it? I could see Java having no real control over that, once the JVM makes the system call to write to the pipe all it can do is wait. There's another puzzle though:
The Perl script prints a line before going into the while loop. Since I check for any input from Perl's stdout on every iteration of my Java loop, I would expect to see it on the first run through the loop, then see the attempt at sending data from Java to Perl, and then nothing. But I actually see the initial message from Perl (after the OUTPUT marker) only when the write to the output stream happens. Is something blocking that I'm not aware of?
Any help greatly appreciated!
You haven't told Perl to use unbuffered output. Look in perlvar and search for $| for different ways to set unbuffered mode. In essence, one of:
HANDLE->autoflush( EXPR )
$OUTPUT_AUTOFLUSH
$|
Perl may be buffering it before it starts printing anything.
is.read(bytes); // This call has side effect of filling the array
No, it doesn't. It reads between 1 and bytes.length bytes into the array, and returns how many were actually read (or -1 at end of stream). See the Javadoc.
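A sketch of a readInputStream variant that keeps the question's polling style but honors the return value of read() (a hedged illustration, not the answerer's code; it drops into the question's StreamsExecTest class with the same imports):

private static String readInputStream(InputStream is) throws IOException {
    StringBuilder sb = new StringBuilder();
    while (is.available() > 0) {                 // poll without blocking
        byte[] buffer = new byte[is.available()];
        int n = is.read(buffer);                 // may read fewer bytes than requested
        if (n < 0) {
            break;                               // end of stream
        }
        sb.append(new String(buffer, 0, n));
    }
    return sb.toString();
}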
I don't see any obvious buffering in your code, so it may be on the Perl side. What happens if you put a newline \n at the end of your print statement?
Note also that you can't, in general, read the process's stdout and stderr on the main thread like that. You'll be subject to deadlock; for example, if the child process writes a lot to stderr while the parent is reading stdout, the stderr buffer will fill, the child process will block, and the parent will stay blocked forever trying to read stdout.
You need to use separate threads to read stdout and stderr (also separate from the main thread, which here is used to pump input to the process).
