Java 6 : ProcessBuilder inheritIO for java 6 [duplicate] - java

I'm building a process in Java using ProcessBuilder as follows:
ProcessBuilder pb = new ProcessBuilder()
        .command("somecommand", "arg1", "arg2")
        .redirectErrorStream(true);
Process p = pb.start();
InputStream stdOut = p.getInputStream();
Now my problem is the following: I would like to capture whatever is going through stdout and/or stderr of that process and redirect it to System.out asynchronously. I want the process and its output redirection to run in the background. So far, the only way I've found to do this is to manually spawn a new thread that will continuously read from stdOut and then call the appropriate write() method of System.out.
new Thread(new Runnable() {
    public void run() {
        byte[] buffer = new byte[8192];
        int len = -1;
        while ((len = stdOut.read(buffer)) > 0) {
            System.out.write(buffer, 0, len);
        }
    }
}).start();
While that approach kind of works, it feels a bit dirty. And on top of that, it gives me one more thread to manage and terminate correctly. Is there any better way to do this?

Use ProcessBuilder.inheritIO; it sets the source and destination for subprocess standard I/O to be the same as those of the current Java process.
Process p = new ProcessBuilder().inheritIO().command("command1").start();
If Java 7 is not an option:
public static void main(String[] args) throws Exception {
    Process p = Runtime.getRuntime().exec("cmd /c dir");
    inheritIO(p.getInputStream(), System.out);
    inheritIO(p.getErrorStream(), System.err);
}
private static void inheritIO(final InputStream src, final PrintStream dest) {
    new Thread(new Runnable() {
        public void run() {
            Scanner sc = new Scanner(src);
            while (sc.hasNextLine()) {
                dest.println(sc.nextLine());
            }
        }
    }).start();
}
The threads will die automatically when the subprocess finishes, because src will reach EOF.
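If you also need the subprocess's exit code, a minimal sketch (reusing the inheritIO helper above) is to call waitFor() once the streams are wired up; the reader threads then exit on their own at EOF:
Process p = Runtime.getRuntime().exec("cmd /c dir");
inheritIO(p.getInputStream(), System.out);
inheritIO(p.getErrorStream(), System.err);
// Blocks until the command finishes; both pipes are already being drained.
int exitCode = p.waitFor();
System.out.println("exit code: " + exitCode);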

For Java 7 and later, see Evgeniy Dorofeev's answer.
For Java 6 and earlier, create and use a StreamGobbler:
StreamGobbler errorGobbler =
        new StreamGobbler(p.getErrorStream(), "ERROR");
// any output?
StreamGobbler outputGobbler =
        new StreamGobbler(p.getInputStream(), "OUTPUT");
// start gobblers
outputGobbler.start();
errorGobbler.start();
...
private class StreamGobbler extends Thread {
    InputStream is;
    String type;

    private StreamGobbler(InputStream is, String type) {
        this.is = is;
        this.type = type;
    }

    @Override
    public void run() {
        try {
            InputStreamReader isr = new InputStreamReader(is);
            BufferedReader br = new BufferedReader(isr);
            String line = null;
            while ((line = br.readLine()) != null)
                System.out.println(type + "> " + line);
        } catch (IOException ioe) {
            ioe.printStackTrace();
        }
    }
}

A flexible solution using a Java 8 lambda that lets you provide a Consumer to process the output (e.g. log it) line by line. run() is a one-liner that throws no checked exceptions. Instead of implementing Runnable, the class could extend Thread, as other answers suggest.
class StreamGobbler implements Runnable {
    private InputStream inputStream;
    private Consumer<String> consumeInputLine;

    public StreamGobbler(InputStream inputStream, Consumer<String> consumeInputLine) {
        this.inputStream = inputStream;
        this.consumeInputLine = consumeInputLine;
    }

    public void run() {
        new BufferedReader(new InputStreamReader(inputStream)).lines().forEach(consumeInputLine);
    }
}
You can then use it for example like this:
public void runProcessWithGobblers() throws IOException, InterruptedException {
    Process p = new ProcessBuilder("...").start();
    Logger logger = LoggerFactory.getLogger(getClass());
    StreamGobbler outputGobbler = new StreamGobbler(p.getInputStream(), System.out::println);
    StreamGobbler errorGobbler = new StreamGobbler(p.getErrorStream(), logger::error);
    new Thread(outputGobbler).start();
    new Thread(errorGobbler).start();
    p.waitFor();
}
Here the output stream is redirected to System.out and the error stream is logged at the error level by the logger.

It's as simple as the following:
File logFile = new File(...);
ProcessBuilder pb = new ProcessBuilder()
        .command("somecommand", "arg1", "arg2");
pb.redirectErrorStream(true);
pb.redirectOutput(logFile);
With .redirectErrorStream(true) you tell the process to merge the error and output streams, and with .redirectOutput(file) you redirect the merged output to a file.
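If you would rather append to an existing log file than truncate it, ProcessBuilder.Redirect.appendTo (also Java 7+) works the same way. A minimal sketch, assuming a hypothetical log file name and the same placeholder command:
File logFile = new File("process.log");
ProcessBuilder pb = new ProcessBuilder()
        .command("somecommand", "arg1", "arg2")
        .redirectErrorStream(true)                                  // merge stderr into stdout
        .redirectOutput(ProcessBuilder.Redirect.appendTo(logFile)); // append instead of overwrite
Process p = pb.start();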
Update:
I did manage to do this as follows:
public static void main(String[] args) {
    // Async part
    Runnable r = () -> {
        ProcessBuilder pb = new ProcessBuilder().command("...");
        // Merge System.err and System.out
        pb.redirectErrorStream(true);
        // Inherit System.out as redirect output stream
        pb.redirectOutput(ProcessBuilder.Redirect.INHERIT);
        try {
            pb.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    };
    new Thread(r, "asyncOut").start();
    // here goes your main part
}
Now you're able to see the output of both the main and asyncOut threads in System.out.

There is a library that provides a better ProcessBuilder, zt-exec. This library can do exactly what you are asking for and more.
Here's what your code would look like with zt-exec instead of ProcessBuilder:
Add the dependency:
<dependency>
    <groupId>org.zeroturnaround</groupId>
    <artifactId>zt-exec</artifactId>
    <version>1.11</version>
</dependency>
The code:
new ProcessExecutor()
        .command("somecommand", "arg1", "arg2")
        .redirectOutput(System.out)
        .redirectError(System.err)
        .execute();
Documentation of the library is here: https://github.com/zeroturnaround/zt-exec/

A simple Java 8 solution that captures both outputs and allows reactive processing using CompletableFuture:
static CompletableFuture<String> readOutStream(InputStream is) {
    return CompletableFuture.supplyAsync(() -> {
        try (
            InputStreamReader isr = new InputStreamReader(is);
            BufferedReader br = new BufferedReader(isr);
        ) {
            StringBuilder res = new StringBuilder();
            String inputLine;
            while ((inputLine = br.readLine()) != null) {
                res.append(inputLine).append(System.lineSeparator());
            }
            return res.toString();
        } catch (Throwable e) {
            throw new RuntimeException("problem with executing program", e);
        }
    });
}
And the usage:
Process p = Runtime.getRuntime().exec(cmd);
CompletableFuture<String> soutFut = readOutStream(p.getInputStream());
CompletableFuture<String> serrFut = readOutStream(p.getErrorStream());
CompletableFuture<String> resultFut =
        soutFut.thenCombine(serrFut, (stdout, stderr) -> {
            // print to current stderr the stderr of process and return the stdout
            System.err.println(stderr);
            return stdout;
        });
// get stdout once ready, blocking
String result = resultFut.get();
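If the exit status is also needed, it is safe to wait for the process at this point, since both pipes are already being drained by the futures. A small sketch:
int exitCode = p.waitFor(); // cannot deadlock: stdout and stderr are consumed asynchronously
System.out.println("exit code: " + exitCode);
System.out.println(result);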

I too can use only Java 6. I used @EvgeniyDorofeev's thread scanner implementation. In my code, after a process finishes, I have to immediately execute two other processes that each compare the redirected output (a diff-based unit test to ensure stdout and stderr are the same as the blessed ones).
The scanner threads don't finish soon enough, even if I waitFor() the process to complete. To make the code work correctly, I have to make sure the threads are joined after the process finishes.
public static int runRedirect(String[] args, String stdout_redirect_to, String stderr_redirect_to) throws IOException, InterruptedException {
    ProcessBuilder b = new ProcessBuilder().command(args);
    Process p = b.start();
    Thread ot = null;
    PrintStream out = null;
    if (stdout_redirect_to != null) {
        out = new PrintStream(new BufferedOutputStream(new FileOutputStream(stdout_redirect_to)));
        ot = inheritIO(p.getInputStream(), out);
        ot.start();
    }
    Thread et = null;
    PrintStream err = null;
    if (stderr_redirect_to != null) {
        err = new PrintStream(new BufferedOutputStream(new FileOutputStream(stderr_redirect_to)));
        et = inheritIO(p.getErrorStream(), err);
        et.start();
    }
    p.waitFor();    // ensure the process finishes before proceeding
    if (ot != null)
        ot.join();  // ensure the thread finishes before proceeding
    if (et != null)
        et.join();  // ensure the thread finishes before proceeding
    int rc = p.exitValue();
    return rc;
}

private static Thread inheritIO(final InputStream src, final PrintStream dest) {
    return new Thread(new Runnable() {
        public void run() {
            Scanner sc = new Scanner(src);
            while (sc.hasNextLine())
                dest.println(sc.nextLine());
            dest.flush();
        }
    });
}

It's really surprising to me that the redirection methods in ProcessBuilder don't accept an OutputStream, only File. Yet another example of boilerplate code that Java forces you to write.
That said, let's look at a list of comprehensive options:
If you want the process output to simply be redirected to its parent's output stream, inheritIO will do the job.
If you want the process output to go to a file, use redirect*(file).
If you want the process output to go to a logger, you need to consume the process InputStream in a separate thread. See the answers that use a Runnable or CompletableFuture. You can also adapt the code below to do this.
If you want the process output to go to a PrintWriter, which may or may not be stdout (very useful for testing), you can do the following:
static int execute(List<String> args, PrintWriter out) {
    ProcessBuilder builder = new ProcessBuilder()
            .command(args)
            .redirectErrorStream(true);
    Process process = null;
    boolean complete = false;
    try {
        process = builder.start();
        redirectOut(process.getInputStream(), out)
                .orTimeout(TIMEOUT, TimeUnit.SECONDS);
        complete = process.waitFor(TIMEOUT, TimeUnit.SECONDS);
    } catch (IOException e) {
        throw new UncheckedIOException(e);
    } catch (InterruptedException e) {
        LOG.warn("Thread was interrupted", e);
    } finally {
        if (process != null && !complete) {
            LOG.warn("Process {} didn't finish within {} seconds", args.get(0), TIMEOUT);
            process = process.destroyForcibly();
        }
    }
    return process != null ? process.exitValue() : 1;
}

private static CompletableFuture<Void> redirectOut(InputStream in, PrintWriter out) {
    return CompletableFuture.runAsync(() -> {
        try (
            InputStreamReader inputStreamReader = new InputStreamReader(in);
            BufferedReader bufferedReader = new BufferedReader(inputStreamReader)
        ) {
            bufferedReader.lines()
                    .forEach(out::println);
        } catch (IOException e) {
            LOG.error("Failed to redirect process output", e);
        }
    });
}
Advantages of the code above over the other answers thus far:
redirectErrorStream(true) redirects the error stream to the output stream, so that we only have to bother with one.
CompletableFuture.runAsync runs from the ForkJoinPool. Note that this code doesn't block by calling get or join on the CompletableFuture but sets a timeout instead on its completion (Java 9+). There's no need for CompletableFuture.supplyAsync because there's nothing really to return from the method redirectOut.
BufferedReader.lines is simpler than using a while loop.

As an addition to msangel's answer, I would like to add the following code block:
private static CompletableFuture<Boolean> redirectToLogger(final InputStream inputStream, final Consumer<String> logLineConsumer) {
    return CompletableFuture.supplyAsync(() -> {
        try (
            InputStreamReader inputStreamReader = new InputStreamReader(inputStream);
            BufferedReader bufferedReader = new BufferedReader(inputStreamReader);
        ) {
            String line = null;
            while ((line = bufferedReader.readLine()) != null) {
                logLineConsumer.accept(line);
            }
            return true;
        } catch (IOException e) {
            return false;
        }
    });
}
It allows you to redirect the process's streams (stdout, stderr) to some other consumer. This might be System.out::println or anything else that consumes strings.
Usage:
...
Process process = processBuilder.start();
CompletableFuture<Boolean> stdOutRes = redirectToLogger(process.getInputStream(), System.out::println);
CompletableFuture<Boolean> stdErrRes = redirectToLogger(process.getErrorStream(), System.out::println);
System.out.println(stdOutRes.get());
System.out.println(stdErrRes.get());
System.out.println(process.waitFor());

Thread thread = new Thread(() -> {
    new BufferedReader(
            new InputStreamReader(inputStream,
                    StandardCharsets.UTF_8))
            .lines().forEach(...);
});
thread.start();
Your custom code goes in place of the ...
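For example, if each line should simply be echoed to the console (an illustrative choice, assuming inputStream is the process's stdout stream), the placeholder could be filled like this:
Thread thread = new Thread(() -> {
    new BufferedReader(
            new InputStreamReader(inputStream,
                    StandardCharsets.UTF_8))
            .lines().forEach(System.out::println); // echo each subprocess line
});
thread.start();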

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class Main {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder("script.bat");
        pb.redirectErrorStream(true);
        Process p = pb.start();
        BufferedReader logReader = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String logLine = null;
        while ((logLine = logReader.readLine()) != null) {
            System.out.println("Script output: " + logLine);
        }
    }
}
By using the line pb.redirectErrorStream(true); we combine the InputStream and the ErrorStream.

By default, the created subprocess does not have its own terminal or console. All its standard I/O (i.e. stdin, stdout, stderr) operations will be redirected to the parent process, where they can be accessed via the streams obtained using the methods getOutputStream(), getInputStream(), and getErrorStream(). The parent process uses these streams to feed input to and get output from the subprocess. Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, or even deadlock.
https://www.securecoding.cert.org/confluence/display/java/FIO07-J.+Do+not+let+external+processes+block+on+IO+buffers
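In practice that means either merging the streams with redirectErrorStream(true) or draining stdout and stderr concurrently before waiting. A minimal sketch of the latter, with a placeholder command:
Process p = new ProcessBuilder("somecommand").start();
// Drain stderr on its own thread so neither pipe can fill up and stall the subprocess.
new Thread(() -> {
    try (BufferedReader err = new BufferedReader(new InputStreamReader(p.getErrorStream()))) {
        err.lines().forEach(System.err::println);
    } catch (IOException e) {
        e.printStackTrace();
    }
}).start();
// Drain stdout on the current thread, then wait for the exit code.
try (BufferedReader out = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
    out.lines().forEach(System.out::println);
}
int exitCode = p.waitFor();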

Related

Starting external application inside Java

I'm having trouble starting an application from my JavaFX GUI. I'm using ProcessBuilder. It creates the process, but the application won't launch until I close my Java program. Is it because that specific program is waiting for arguments, or is something wrong with my code?
@FXML
private void runWorldpac() {
    try {
        ProcessBuilder process = new ProcessBuilder("C:\\speedDIAL\\speedDIAL.exe");
        Process p = process.start();
    } catch (IOException e) {
        e.printStackTrace();
    }
}
The external application starts but won't allow any interaction with the original application until I close the external program. I tried running a new thread, same result.
Here's the new code:
try {
    ProcessBuilder process = new ProcessBuilder("C:\\speedDIAL\\speedDIAL.exe");
    Map<String, String> environ = process.environment();
    Process p = process.start();
    InputStream is = p.getInputStream();
    InputStreamReader isr = new InputStreamReader(is);
    BufferedReader br = new BufferedReader(isr);
    String line;
    while ((line = br.readLine()) != null) {
        //System.out.println(line);
    }
    System.out.println("Program terminated!");
} catch (IOException e) {
    e.printStackTrace();
}
I read that article, good info. I also read another good example on here. It's running in a new thread now, but my program waits for the external application to finish before it continues. I understand that's usually desired, but not in this case; how can I disable that?
Wait for the production of the exit value in a new thread. Something like:
try {
    ProcessBuilder pBuilder = new ProcessBuilder("C:\\speedDIAL\\speedDIAL.exe");
    // don't forget to handle the error stream, and so
    // either combine error stream with input stream, as shown here
    // or gobble it separately
    pBuilder.redirectErrorStream(true);
    final Process process = pBuilder.start();
    final InputStream is = process.getInputStream();
    // in case you need to send information back to the process
    // get its output stream. Don't forget to close when through with it
    final OutputStream os = process.getOutputStream();
    // thread to handle or gobble text sent from input stream
    new Thread(() -> {
        // try with resources
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(is))) {
            String line = null;
            while ((line = reader.readLine()) != null) {
                // TODO: handle line
            }
        } catch (IOException ex) {
            ex.printStackTrace();
        }
    }).start();
    // thread to get exit value from process without blocking
    Thread waitForThread = new Thread(() -> {
        try {
            int exitValue = process.waitFor();
            // TODO: handle exit value here
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    });
    waitForThread.start();
    // if you want to join after a certain time:
    long timeOut = 4000;
    waitForThread.join(timeOut);
} catch (IOException | InterruptedException e) {
    e.printStackTrace();
}

Freezing window when using multithread in java

I've got a GUI in Java, and I want to run an executable inside my GUI when a button is pressed. My code for running the executable is the following:
Process pr;
Runtime rt = Runtime.getRuntime();
new Thread(() -> {
    try {
        Process proc = rt.exec("Release\\face.exe", null, new File("Release\\"));
    } catch (Exception e1) {
        e1.printStackTrace();
    }
}).start();
The executable runs when I press the button; however, I notice that the executable freezes. I tried adding multithreading to check whether that was the reason for the freezing, but I still experience the same thing. What could be wrong here?
EDIT:
I tried to add the following lines with an InputStreamReader inside my thread:
new Thread(() -> {
    try {
        Runtime rt = Runtime.getRuntime();
        Process proc = rt.exec("face.exe", null, new File("Release\\"));
        BufferedReader in = new BufferedReader(new InputStreamReader(proc.getInputStream()));
        BufferedReader err = new BufferedReader(new InputStreamReader(proc.getErrorStream()));
    } catch (Exception e1) {
        // TODO Auto-generated catch block
        e1.printStackTrace();
    }
}).start();
I got the message InputStreamReader cannot be resolved to a type.
As bowmore mentioned, your executable likely blocks because the output is not handled and "gets stuck" after a while as the executable cannot write more to its standard out (likely the console).
I usually use code like the following to run external commands from Java:
Process p = Runtime.getRuntime().exec(args);
BufferedReader in = new BufferedReader(new InputStreamReader(p.getInputStream()));
BufferedReader err = new BufferedReader(new InputStreamReader(p.getErrorStream()));
StreamHandler outputHandler = new StreamHandler(in);
outputHandler.start();
StreamHandler errorHandler = new StreamHandler(err);
errorHandler.start();
where StreamHandler is defined as follows:
class StreamHandler extends Thread {
    private final BufferedReader in;

    public StreamHandler(final BufferedReader in) {
        this.in = in;
    }

    @Override
    public void run() {
        try {
            String line = null;
            while ((line = this.in.readLine()) != null) {
                System.out.println(line);
            }
        } catch (IOException e) {
            System.err.println("StreamHandler got interrupted");
            e.printStackTrace();
        }
    }
}
Naturally, the stream handler could be more sophisticated, e.g. write errors to System.err instead of System.out, write to a JTextArea, or go somewhere else entirely.
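For instance, a variant that takes the destination as a constructor argument (a sketch with a hypothetical PrintStreamHandler class, not part of the original code) might look like this:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.PrintStream;

class PrintStreamHandler extends Thread {
    private final BufferedReader in;
    private final PrintStream dest; // e.g. System.out for stdout, System.err for stderr

    PrintStreamHandler(BufferedReader in, PrintStream dest) {
        this.in = in;
        this.dest = dest;
    }

    @Override
    public void run() {
        try {
            String line;
            while ((line = in.readLine()) != null) {
                dest.println(line); // forward each line to the chosen destination
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}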
Since you do not read the InputStreams for this process object, it is possible that it's stuck because the stream's buffer is full, causing the exe to block on the write to stdout.

Cannot get the getInputStream from Runtime.getRunTime.exec()

public class LinuxInteractor {
    public static void executeCommand(String command) {
        System.out.println("Linux command: " + command);
        try {
            Process p = Runtime.getRuntime().exec(command);
            p.waitFor();
            BufferedReader bf = new BufferedReader(new InputStreamReader(p.getInputStream()));
            String str = bf.readLine();
            System.out.println("inputStream is::" + str);
            while ((str = bf.readLine()) != null) {
                System.out.println("input stream is::" + str);
            }
            System.out.println("process started");
        } catch (Exception e) {
            System.out.println("Error occurred while executing Linux command. Error Description: "
                    + e.getMessage());
            e.printStackTrace();
        }
    }
}
When I run the script through the console, it works. But through the Java program, the InputStream (str) comes back as null.
Is there any other approach I can use?
Solution
You should try to do the reading and the executing on different threads.
A better alternative is to use a ProcessBuilder, which takes care of the "dirty" work for you.
The code inside the try block could look something like this:
/* Create the ProcessBuilder */
ProcessBuilder pb = new ProcessBuilder(commandArr);
pb.redirectErrorStream(true);

/* Start the process */
Process proc = pb.start();
System.out.println("Process started !");

/* Read the process's output */
String line;
BufferedReader in = new BufferedReader(new InputStreamReader(
        proc.getInputStream()));
while ((line = in.readLine()) != null) {
    System.out.println(line);
}

/* Clean-up */
proc.destroy();
System.out.println("Process ended !");
See also this short demo.
Cause of the problem
According to the Java Docs, waitFor():
causes the current thread to wait, if necessary, until the process represented by this Process object has terminated.
So you are trying to read the process's output stream after it has terminated, hence the null.
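In other words, the read needs to happen before the early waitFor(). A minimal reordering sketch of the original snippet:
Process p = Runtime.getRuntime().exec(command);
BufferedReader bf = new BufferedReader(new InputStreamReader(p.getInputStream()));
String str;
while ((str = bf.readLine()) != null) {
    System.out.println("input stream is::" + str);
}
p.waitFor(); // wait only after the output has been consumed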
(Sorry for the major revamp of the answer.)
You need to do this in a separate thread:
Process process = Runtime.getRuntime().exec(command);
LogStreamReader lsr = new LogStreamReader(process.getInputStream());
Thread thread = new Thread(lsr, "LogStreamReader");
thread.start();

public class LogStreamReader implements Runnable {

    private BufferedReader reader;

    public LogStreamReader(InputStream is) {
        this.reader = new BufferedReader(new InputStreamReader(is));
    }

    public void run() {
        try {
            String line = reader.readLine();
            while (line != null) {
                System.out.println(line);
                line = reader.readLine();
            }
            reader.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
Then you need a second thread for input handling. And you might want to deal with stderr just like stdout.
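For example, a sketch that reuses the same LogStreamReader class for the error stream:
LogStreamReader errorReader = new LogStreamReader(process.getErrorStream());
new Thread(errorReader, "ErrorStreamReader").start();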


Draining Standard Error in Java

When launching a process from Java, both stderr and stdout can block on output if I don't read from the pipes. Currently I have a thread that pro-actively reads from one and the main thread blocks on the other.
Is there an easy way to join the two streams or otherwise cause the subprocess to continue while not losing the data in stderr?
Set the redirectErrorStream property on ProcessBuilder to send stderr output to stdout:
ProcessBuilder builder = new ProcessBuilder(command);
builder.redirectErrorStream(true);
You should then create a thread to deal with the process stream, something like the following:
Process p = builder.start();
InputHandler outHandler = new InputHandler(p.getInputStream());
Where InputHandler is defined as:
private static class InputHandler extends Thread {

    private final InputStream is;
    private final ByteArrayOutputStream os;

    public InputHandler(InputStream input) {
        this.is = input;
        this.os = new ByteArrayOutputStream();
    }

    public void run() {
        try {
            int c;
            while ((c = is.read()) != -1) {
                os.write(c);
            }
        } catch (Throwable t) {
            throw new IllegalStateException(t);
        }
    }

    public String getOutput() {
        try {
            os.flush();
        } catch (Throwable t) {
            throw new IllegalStateException(t);
        }
        return os.toString();
    }
}
Alternatively, just create two InputHandlers for the InputStream and ErrorStream. Knowing that the program will block if you don't read them is 90% of the battle :)
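That alternative might look roughly like this (a sketch without redirectErrorStream; exception handling omitted):
Process p = new ProcessBuilder(command).start();
InputHandler outHandler = new InputHandler(p.getInputStream());
InputHandler errHandler = new InputHandler(p.getErrorStream());
outHandler.start();
errHandler.start();
p.waitFor();        // both pipes are being drained, so the subprocess cannot block
outHandler.join();  // make sure each handler has consumed everything
errHandler.join();
String stdout = outHandler.getOutput();
String stderr = errHandler.getOutput();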
Just have two threads, one reading from stdout, one from stderr?
