Draining Standard Error in Java

When launching a process from Java, the subprocess can block writing to either stderr or stdout if I don't read from the pipes. Currently I have a thread that proactively reads from one while the main thread blocks reading the other.
Is there an easy way to join the two streams or otherwise cause the subprocess to continue while not losing the data in stderr?

Set the redirectErrorStream property on ProcessBuilder to send stderr output to stdout:
ProcessBuilder builder = new ProcessBuilder(command);
builder.redirectErrorStream(true);
You should then create a thread to deal with the process stream, something like the following:
Process p = builder.start();
InputHandler outHandler = new InputHandler(p.getInputStream());
Where InputHandler is defined as:
private static class InputHandler extends Thread {
private final InputStream is;
private final ByteArrayOutputStream os;
public InputHandler(InputStream input) {
this.is = input;
this.os = new ByteArrayOutputStream();
}
public void run() {
try {
int c;
while ((c = is.read()) != -1) {
os.write(c);
}
} catch (Throwable t) {
throw new IllegalStateException(t);
}
}
public String getOutput() {
try {
os.flush();
} catch (Throwable t) {
throw new IllegalStateException(t);
}
return os.toString();
}
}
Alternatively, just create two InputHandlers for the InputStream and ErrorStream. Knowing that the program will block if you don't read them is 90% of the battle :)
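For that two-handler variant, a minimal sketch (it reuses the InputHandler class above and assumes you only need both captures as strings after the process has exited):
Process p = builder.start();
InputHandler outHandler = new InputHandler(p.getInputStream());
InputHandler errHandler = new InputHandler(p.getErrorStream());
outHandler.start();
errHandler.start();
int exitCode = p.waitFor();  // wait for the subprocess to finish
outHandler.join();           // and for both drains to reach EOF
errHandler.join();
String stdout = outHandler.getOutput();
String stderr = errHandler.getOutput();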

Just have two threads, one reading from stdout, one from stderr?
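A minimal, self-contained sketch of that two-thread approach (hedged: "somecommand" is a placeholder and the output simply goes to System.out/System.err):
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.PrintStream;

public class TwoThreadDrain {
    public static void main(String[] args) throws Exception {
        Process p = new ProcessBuilder("somecommand", "arg1").start();
        Thread outDrainer = new Thread(() -> drain(p.getInputStream(), System.out));
        Thread errDrainer = new Thread(() -> drain(p.getErrorStream(), System.err));
        outDrainer.start();
        errDrainer.start();
        int exitCode = p.waitFor(); // both drainer threads end once the pipes hit EOF
        outDrainer.join();
        errDrainer.join();
        System.out.println("exit code: " + exitCode);
    }

    private static void drain(InputStream in, PrintStream dest) {
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = reader.readLine()) != null) {
                dest.println(line); // nothing is lost because each stream has its own reader
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}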

Related

Java 6 : ProcessBuilder inheritIO for java 6 [duplicate]

I'm building a process in Java using ProcessBuilder as follows:
ProcessBuilder pb = new ProcessBuilder()
.command("somecommand", "arg1", "arg2")
.redirectErrorStream(true);
Process p = pb.start();
InputStream stdOut = p.getInputStream();
Now my problem is the following: I would like to capture whatever is going through stdout and/or stderr of that process and redirect it to System.out asynchronously. I want the process and its output redirection to run in the background. So far, the only way I've found to do this is to manually spawn a new thread that will continuously read from stdOut and then call the appropriate write() method of System.out.
new Thread(new Runnable() {
    public void run() {
        try {
            byte[] buffer = new byte[8192];
            int len;
            while ((len = stdOut.read(buffer)) > 0) {
                System.out.write(buffer, 0, len);
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}).start();
While that approach kind of works, it feels a bit dirty. And on top of that, it gives me one more thread to manage and terminate correctly. Is there any better way to do this?
Use ProcessBuilder.inheritIO; it sets the source and destination for subprocess standard I/O to be the same as those of the current Java process.
Process p = new ProcessBuilder().inheritIO().command("command1").start();
If Java 7 is not an option:
public static void main(String[] args) throws Exception {
Process p = Runtime.getRuntime().exec("cmd /c dir");
inheritIO(p.getInputStream(), System.out);
inheritIO(p.getErrorStream(), System.err);
}
private static void inheritIO(final InputStream src, final PrintStream dest) {
new Thread(new Runnable() {
public void run() {
Scanner sc = new Scanner(src);
while (sc.hasNextLine()) {
dest.println(sc.nextLine());
}
}
}).start();
}
The threads will die automatically when the subprocess finishes, because src will hit EOF.
For Java 7 and later, see Evgeniy Dorofeev's answer.
For Java 6 and earlier, create and use a StreamGobbler:
StreamGobbler errorGobbler =
new StreamGobbler(p.getErrorStream(), "ERROR");
// any output?
StreamGobbler outputGobbler =
new StreamGobbler(p.getInputStream(), "OUTPUT");
// start gobblers
outputGobbler.start();
errorGobbler.start();
...
private class StreamGobbler extends Thread {
InputStream is;
String type;
private StreamGobbler(InputStream is, String type) {
this.is = is;
this.type = type;
}
@Override
public void run() {
try {
InputStreamReader isr = new InputStreamReader(is);
BufferedReader br = new BufferedReader(isr);
String line = null;
while ((line = br.readLine()) != null)
System.out.println(type + "> " + line);
}
catch (IOException ioe) {
ioe.printStackTrace();
}
}
}
A flexible solution using a Java 8 lambda that lets you provide a Consumer to process the output (e.g. log it) line by line. run() is a one-liner with no checked exceptions thrown. Instead of implementing Runnable, it can extend Thread, as other answers suggest.
class StreamGobbler implements Runnable {
private InputStream inputStream;
private Consumer<String> consumeInputLine;
public StreamGobbler(InputStream inputStream, Consumer<String> consumeInputLine) {
this.inputStream = inputStream;
this.consumeInputLine = consumeInputLine;
}
public void run() {
new BufferedReader(new InputStreamReader(inputStream)).lines().forEach(consumeInputLine);
}
}
You can then use it for example like this:
public void runProcessWithGobblers() throws IOException, InterruptedException {
Process p = new ProcessBuilder("...").start();
Logger logger = LoggerFactory.getLogger(getClass());
StreamGobbler outputGobbler = new StreamGobbler(p.getInputStream(), System.out::println);
StreamGobbler errorGobbler = new StreamGobbler(p.getErrorStream(), logger::error);
new Thread(outputGobbler).start();
new Thread(errorGobbler).start();
p.waitFor();
}
Here the output stream is redirected to System.out and the error stream is logged on the error level by the logger.
It's as simple as the following:
File logFile = new File(...);
ProcessBuilder pb = new ProcessBuilder()
    .command("somecommand", "arg1", "arg2");
pb.redirectErrorStream(true);
pb.redirectOutput(logFile);
With .redirectErrorStream(true) you tell the process builder to merge the error and output streams, and with .redirectOutput(file) you redirect the merged output to a file.
Update:
I did manage to do this as follows:
public static void main(String[] args) {
// Async part
Runnable r = () -> {
ProcessBuilder pb = new ProcessBuilder().command("...");
// Merge System.err and System.out
pb.redirectErrorStream(true);
// Inherit System.out as redirect output stream
pb.redirectOutput(ProcessBuilder.Redirect.INHERIT);
try {
pb.start();
} catch (IOException e) {
e.printStackTrace();
}
};
new Thread(r, "asyncOut").start();
// here goes your main part
}
Now you're able to see both outputs, from the main and asyncOut threads, in System.out.
There is a library that provides a better ProcessBuilder, zt-exec. This library can do exactly what you are asking for and more.
Here's what your code would look like with zt-exec instead of ProcessBuilder.
Add the dependency:
<dependency>
<groupId>org.zeroturnaround</groupId>
<artifactId>zt-exec</artifactId>
<version>1.11</version>
</dependency>
The code:
new ProcessExecutor()
.command("somecommand", "arg1", "arg2")
.redirectOutput(System.out)
.redirectError(System.err)
.execute();
Documentation of the library is here: https://github.com/zeroturnaround/zt-exec/
A simple Java 8 solution that captures both outputs and processes them reactively using CompletableFuture:
static CompletableFuture<String> readOutStream(InputStream is) {
return CompletableFuture.supplyAsync(() -> {
try (
InputStreamReader isr = new InputStreamReader(is);
BufferedReader br = new BufferedReader(isr);
) {
StringBuilder res = new StringBuilder();
String inputLine;
while ((inputLine = br.readLine()) != null) {
res.append(inputLine).append(System.lineSeparator());
}
return res.toString();
} catch (Throwable e) {
throw new RuntimeException("problem with executing program", e);
}
});
}
And the usage:
Process p = Runtime.getRuntime().exec(cmd);
CompletableFuture<String> soutFut = readOutStream(p.getInputStream());
CompletableFuture<String> serrFut = readOutStream(p.getErrorStream());
CompletableFuture<String> resultFut =
soutFut.thenCombine(serrFut, (stdout, stderr) -> {
// print to current stderr the stderr of process and return the stdout
System.err.println(stderr);
return stdout;
});
// get stdout once ready, blocking
String result = resultFut.get();
I too can use only Java 6. I used @EvgeniyDorofeev's thread scanner implementation. In my code, after a process finishes, I have to immediately execute two other processes that each compare the redirected output (a diff-based unit test to ensure stdout and stderr are the same as the blessed ones).
The scanner threads don't finish soon enough, even if I waitFor() the process to complete. To make the code work correctly, I have to make sure the threads are joined after the process finishes.
public static int runRedirect (String[] args, String stdout_redirect_to, String stderr_redirect_to) throws IOException, InterruptedException {
ProcessBuilder b = new ProcessBuilder().command(args);
Process p = b.start();
Thread ot = null;
PrintStream out = null;
if (stdout_redirect_to != null) {
out = new PrintStream(new BufferedOutputStream(new FileOutputStream(stdout_redirect_to)));
ot = inheritIO(p.getInputStream(), out);
ot.start();
}
Thread et = null;
PrintStream err = null;
if (stderr_redirect_to != null) {
err = new PrintStream(new BufferedOutputStream(new FileOutputStream(stderr_redirect_to)));
et = inheritIO(p.getErrorStream(), err);
et.start();
}
p.waitFor(); // ensure the process finishes before proceeding
if (ot != null)
ot.join(); // ensure the thread finishes before proceeding
if (et != null)
et.join(); // ensure the thread finishes before proceeding
int rc = p.exitValue();
return rc;
}
private static Thread inheritIO (final InputStream src, final PrintStream dest) {
return new Thread(new Runnable() {
public void run() {
Scanner sc = new Scanner(src);
while (sc.hasNextLine())
dest.println(sc.nextLine());
dest.flush();
}
});
}
It's really surprising to me that the redirection methods in ProcessBuilder don't accept an OutputStream, only File. Yet another example of boilerplate code that Java forces you to write.
That said, let's look at a list of comprehensive options:
If you want the process output to simply be redirected to its parent's output stream, inheritIO will do the job.
If you want the process output to go to a file, use redirect*(file).
If you want the process output to go to a logger, you need to consume the process InputStream in a separate thread. See the answers that use a Runnable or CompletableFuture. You can also adapt the code below to do this.
If you want the process output to go to a PrintWriter, which may or may not be stdout (very useful for testing), you can do the following:
static int execute(List<String> args, PrintWriter out) {
ProcessBuilder builder = new ProcessBuilder()
.command(args)
.redirectErrorStream(true);
Process process = null;
boolean complete = false;
try {
process = builder.start();
redirectOut(process.getInputStream(), out)
.orTimeout(TIMEOUT, TimeUnit.SECONDS);
complete = process.waitFor(TIMEOUT, TimeUnit.SECONDS);
} catch (IOException e) {
throw new UncheckedIOException(e);
} catch (InterruptedException e) {
LOG.warn("Thread was interrupted", e);
} finally {
if (process != null && !complete) {
LOG.warn("Process {} didn't finish within {} seconds", args.get(0), TIMEOUT);
process = process.destroyForcibly();
}
}
return process != null ? process.exitValue() : 1;
}
private static CompletableFuture<Void> redirectOut(InputStream in, PrintWriter out) {
return CompletableFuture.runAsync(() -> {
try (
InputStreamReader inputStreamReader = new InputStreamReader(in);
BufferedReader bufferedReader = new BufferedReader(inputStreamReader)
) {
bufferedReader.lines()
.forEach(out::println);
} catch (IOException e) {
LOG.error("Failed to redirect process output", e);
}
});
}
Advantages of the code above over the other answers thus far:
redirectErrorStream(true) redirects the error stream to the output stream, so that we only have to bother with one.
CompletableFuture.runAsync runs on the common ForkJoinPool. Note that this code doesn't block by calling get or join on the CompletableFuture but instead sets a timeout on its completion (Java 9+). There's no need for CompletableFuture.supplyAsync because there's nothing to return from the redirectOut method.
BufferedReader.lines is simpler than using a while loop.
As an addition to msangel's answer I would like to add the following code block:
private static CompletableFuture<Boolean> redirectToLogger(final InputStream inputStream, final Consumer<String> logLineConsumer) {
return CompletableFuture.supplyAsync(() -> {
try (
InputStreamReader inputStreamReader = new InputStreamReader(inputStream);
BufferedReader bufferedReader = new BufferedReader(inputStreamReader);
) {
String line = null;
while((line = bufferedReader.readLine()) != null) {
logLineConsumer.accept(line);
}
return true;
} catch (IOException e) {
return false;
}
});
}
It allows you to redirect an input stream of the process (stdout or stderr) to some other consumer. This might be System.out::println or anything else that consumes strings.
Usage:
...
Process process = processBuilder.start();
CompletableFuture<Boolean> stdOutRes = redirectToLogger(process.getInputStream(), System.out::println);
CompletableFuture<Boolean> stdErrRes = redirectToLogger(process.getErrorStream(), System.out::println);
System.out.println(stdOutRes.get());
System.out.println(stdErrRes.get());
System.out.println(process.waitFor());
Thread thread = new Thread(() -> {
new BufferedReader(
new InputStreamReader(inputStream,
StandardCharsets.UTF_8))
.lines().forEach(...);
});
thread.start();
Your custom code goes instead of the ...
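A hedged, self-contained version of that fragment (the command, the "out> " prefix, and the per-line handling are placeholders for illustration; stderr is merged into stdout so a single reader suffices):
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class LineDrainExample {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder("somecommand", "arg1");
        pb.redirectErrorStream(true); // merge stderr into stdout
        Process p = pb.start();
        InputStream inputStream = p.getInputStream();
        Thread thread = new Thread(() ->
                new BufferedReader(
                        new InputStreamReader(inputStream, StandardCharsets.UTF_8))
                        .lines()
                        .forEach(line -> System.out.println("out> " + line))); // your custom code goes here
        thread.start();
        p.waitFor();
        thread.join();
    }
}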
import java.io.BufferedReader;
import java.io.InputStreamReader;
public class Main {
public static void main(String[] args) throws Exception {
ProcessBuilder pb = new ProcessBuilder("script.bat");
pb.redirectErrorStream(true);
Process p = pb.start();
BufferedReader logReader = new BufferedReader(new InputStreamReader(p.getInputStream()));
String logLine = null;
while ( (logLine = logReader.readLine()) != null) {
System.out.println("Script output: " + logLine);
}
}
}
By using the line pb.redirectErrorStream(true) we combine the process's InputStream and ErrorStream into one.
By default, the created subprocess does not have its own terminal or console. All its standard I/O (i.e. stdin, stdout, stderr) operations will be redirected to the parent process, where they can be accessed via the streams obtained using the methods getOutputStream(), getInputStream(), and getErrorStream(). The parent process uses these streams to feed input to and get output from the subprocess. Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, or even deadlock.
https://www.securecoding.cert.org/confluence/display/java/FIO07-J.+Do+not+let+external+processes+block+on+IO+buffers
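A minimal illustration of that rule (hedged; it sidesteps the pipe-buffer problem entirely by redirecting both subprocess streams to files, so there is nothing for the parent to drain):
import java.io.File;
import java.io.IOException;

public class NoPipeBlocking {
    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder("somecommand", "arg1");
        pb.redirectOutput(new File("out.log")); // stdout goes straight to a file
        pb.redirectError(new File("err.log"));  // stderr too, so neither pipe can fill up
        Process p = pb.start();
        int exitCode = p.waitFor();             // safe to block: no pipes to drain
        System.out.println("exit code: " + exitCode);
    }
}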

EOF handling in Drozer (python) interactive session

I am trying to write a Java program which uses Drozer (a tool written in Python to test vulnerabilities in Android apps). I need to execute commands directly from Java and so far everything goes pretty well, but I have a problem when an interactive session of Drozer starts. It seems that the problem occurs when EOF needs to be handled, since Ctrl+D also can't stop the session. Here is what I get after hitting Ctrl+D:
*** Unknown syntax: EOF
Here is the code I use to connect from Java to Drozer. After running it, my program enters an infinite loop printing the same error: *** Unknown syntax: EOF.
Any other command works like a charm. Any ideas what I'm doing wrong?
Cheers
public class test1 {
public static void main(String a[]) throws InterruptedException, IOException {
List<String> commands = new ArrayList<String>();
List<String> commands1 = new ArrayList<String>();
commands.add("/usr/local/bin/drozer");
commands.add("console");
commands.add("connect");
ProcessBuilder pb = new ProcessBuilder(commands);
pb.redirectErrorStream(true);
try {
Process prs = pb.start();
Thread inThread = new Thread(new In(prs.getInputStream()));
inThread.start();
Thread.sleep(1000);
OutputStream writeTo = prs.getOutputStream();
writeTo.write("oops\n".getBytes());
writeTo.flush();
writeTo.close();
}catch (IOException e) {
e.printStackTrace();
}
}
}
class In implements Runnable {
private InputStream is;
public In(InputStream is) {
this.is = is;
}
@Override
public void run() {
try {
byte[] b = new byte[1024];
int size = 0;
while ((size = is.read(b)) != -1) {
System.out.println(new String(b, 0, size));
}
is.close();
} catch (IOException ex) {
Logger.getLogger(In.class.getName()).log(Level.SEVERE, null, ex);
}
}
}

Runtime Process BufferedReader not outputting all lines (Psexec)

I am trying to read the output of Psexec into Java, using a BufferedReader on a Process InputStream, for use on a network; however, it only outputs the first line.
Runtime rt = Runtime.getRuntime();
try {
Process p = rt.exec("C:\\Users\\*****\\Desktop\\PS\\Psexec \\\\" + "******" + " -u ****** -p ****** cmd /c dir D:\\");
BufferedReader stdInput = new BufferedReader(new InputStreamReader(p.getInputStream()));
log.add("Computer: " + address);
String s = null;
while ((s = stdInput.readLine()) != null) {
log.add(s);
}
} catch (IOException e) {
e.printStackTrace();
}
What would be the reason for this happening and how would this be fixed?
The process is probably producing some of its output on stderr. Either read both the output and the error streams, in separate threads, or use ProcessBuilder to create the Process and merge the two streams before starting it, with redirectErrorStream().
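A minimal sketch of the merged-stream variant for this case (hedged; the psexec path, target machine, and credentials are placeholders standing in for the masked values in the question):
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class PsexecMerged {
    public static void main(String[] args) throws Exception {
        ProcessBuilder pb = new ProcessBuilder(
                "C:\\path\\to\\PsExec.exe", "\\\\target",
                "-u", "user", "-p", "password",
                "cmd", "/c", "dir", "D:\\");
        pb.redirectErrorStream(true); // fold stderr (where psexec writes its banner and errors) into stdout
        Process p = pb.start();
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
        System.out.println("exit code: " + p.waitFor());
    }
}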
So, I spent some time playing around with this, using ProcessBuilder.
I tried redirecting the IO through the Redirect.INHERIT and Redirect.PIPE options, but could not get it to display the output of the remote command (the psexec content itself was fine).
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
public class Test1 {
public static void main(String[] args) {
ProcessBuilder pb = new ProcessBuilder(
"C:\\Users\\shane\\Downloads\\PSTools\\PsExec.exe",
"\\\\builder",
"-u",
"xxx",
"-p",
"xxx",
"cmd",
"/c", "dir", "c:\\"
);
try {
Process p = pb.start();
StreamConsumer.consume(p.getErrorStream());
StreamConsumer.consume(p.getInputStream());
System.out.println("Exited with :" + p.waitFor());
} catch (IOException | InterruptedException exp) {
exp.printStackTrace();
}
}
public static class StreamConsumer implements Runnable {
private InputStream is;
public StreamConsumer(InputStream is) {
this.is = is;
}
public static void consume(InputStream is) {
StreamConsumer consumer = new StreamConsumer(is);
new Thread(consumer).start();
}
@Override
public void run() {
try {
int in = -1;
while ((in = is.read()) != -1) {
System.out.print((char)in);
}
} catch (IOException exp) {
exp.printStackTrace();
}
}
}
}
I even tried redirecting the InputStreams to a File, without any success. It would seem that whatever mechanism psexec uses to stream the results from the remote machine isn't picked up by Java.
You might try PAExec, which did work, but it didn't seem to wait to exit after the remote command finished...
It could be the case that you started the process and didn't wait for it to finish before checking its output. If this is the case, your main thread will exit your while loop because it reads null even though the subprocess is still executing. I would suggest using Process.waitFor() so that all of the output ends up in the stream before you begin polling it.

Running a java program from another java program

I am working on a simple Java program. It simply compiles and executes another Java program. I am using the Runtime.exec() function to compile and run. There is no problem with compilation, but when the second program runs and needs to read input from the keyboard, I can't supply it from the master process. I used the getOutputStream() function, but it didn't help. I will provide my code.
public class sam {
public static void main(String[] args) throws Exception {
try {
Process p = Runtime.getRuntime().exec("javac sam2.java");
Process p2 = Runtime.getRuntime().exec("java sam2");
BufferedReader in = new BufferedReader(
new InputStreamReader(p2.getInputStream()));
OutputStream out = p.getOutputStream();
String line = null;
line = in.readLine();
System.out.println(line);
input=input+"\n";
out.write(input.getBytes());
p.wait(10000);
out.flush();
}catch (IOException e) {
e.printStackTrace();
}
}
}
This is my master program(sam.java).
The following is the code of sam2.java
public class sam2 {
public static void main(String[] args) throws Exception {
BufferedReader br = new BufferedReader(new InputStreamReader(System.in));
String str;
System.out.println("Enter the number..\n");
str = br.readLine();
System.out.println(Integer.parseInt(str));
}
}
There is no problem if my second program has only printing statements. But the problem arises when I have to read something in the other program.
It is a bit strange, but you can run the second program without forking it: just call its main method directly. So forget the runtime section and do this:
sam2.main(new String[0]);
Of course, this way you must compile sam2 at compile time.
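If you do need to compile it at runtime without forking javac, the JDK's javax.tools API can do that in-process. A hedged sketch (it assumes you are running on a full JDK, and the class-loading path "." is illustrative):
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

public class CompileInProcess {
    public static void main(String[] args) throws Exception {
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler(); // null if only a JRE is installed
        int rc = compiler.run(null, null, null, "sam2.java");         // equivalent to: javac sam2.java
        System.out.println("in-process javac returned " + rc);
        if (rc == 0) {
            // load the freshly compiled class and invoke its main method reflectively
            URLClassLoader loader = new URLClassLoader(new URL[]{ new File(".").toURI().toURL() });
            Class<?> cls = loader.loadClass("sam2");
            cls.getMethod("main", String[].class).invoke(null, (Object) new String[0]);
        }
    }
}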
Each process needs to be allowed to run and finish. You can use Process#waitFor for this purpose. Equally, you need to consume any output from the process at the same time. waitFor will block, so you will need to use a Thread to read the input (and, if you need to, write output to the process).
Depending on the location of the java/class file, you may also need to specify a starting folder from which the execution of the process can start.
Most of this is significantly easier using ProcessBuilder.
import java.io.File;
import java.io.IOException;
import java.io.InputStream;
public class CompileAndRun {
public static void main(String[] args) {
new CompileAndRun();
}
public CompileAndRun() {
try {
int result = compile("compileandrun/HelloWorld.java");
System.out.println("javac returned " + result);
result = run("compileandrun.HelloWorld");
} catch (IOException | InterruptedException ex) {
ex.printStackTrace();
}
}
public int run(String clazz) throws IOException, InterruptedException {
ProcessBuilder pb = new ProcessBuilder("java", clazz);
pb.redirectErrorStream(true); // merge stderr into stdout so the consumer sees errors too
pb.directory(new File("src"));
Process p = pb.start();
InputStreamConsumer consumer = new InputStreamConsumer(p.getInputStream());
consumer.start();
int result = p.waitFor();
consumer.join();
System.out.println(consumer.getOutput());
return result;
}
public int compile(String file) throws IOException, InterruptedException {
ProcessBuilder pb = new ProcessBuilder("javac", file);
pb.redirectErrorStream(true); // merge stderr into stdout so the consumer sees errors too
pb.directory(new File("src"));
Process p = pb.start();
InputStreamConsumer consumer = new InputStreamConsumer(p.getInputStream());
consumer.start();
int result = p.waitFor();
consumer.join();
System.out.println(consumer.getOutput());
return result;
}
public class InputStreamConsumer extends Thread {
private InputStream is;
private IOException exp;
private StringBuilder output;
public InputStreamConsumer(InputStream is) {
this.is = is;
}
@Override
public void run() {
int in = -1;
output = new StringBuilder(64);
try {
while ((in = is.read()) != -1) {
output.append((char) in);
}
} catch (IOException ex) {
ex.printStackTrace();
exp = ex;
}
}
public StringBuilder getOutput() {
return output;
}
public IOException getException() {
return exp;
}
}
}
Now obviously, you should check the return results of the processes, and maybe build a better mechanism for interacting with the processes, but that's the basic idea...
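Since the original question is about feeding keyboard input to the child (sam2 reads a line from System.in), here is a hedged sketch of a variant of the run method above that also writes to the child's stdin. It reuses the InputStreamConsumer class from this answer; the extra import of java.io.OutputStream and the "42" input value are illustrative assumptions, not part of the original code:
public int runWithInput(String clazz, String input) throws IOException, InterruptedException {
    ProcessBuilder pb = new ProcessBuilder("java", clazz);
    pb.redirectErrorStream(true);     // merge stderr into stdout
    pb.directory(new File("src"));
    Process p = pb.start();
    InputStreamConsumer consumer = new InputStreamConsumer(p.getInputStream());
    consumer.start();
    try (OutputStream stdin = p.getOutputStream()) {
        stdin.write((input + "\n").getBytes()); // what the child reads from System.in
        stdin.flush();
    }                                           // closing stdin signals EOF to the child
    int result = p.waitFor();
    consumer.join();
    System.out.println(consumer.getOutput());
    return result;
}
Under these assumptions, something like runWithInput("sam2", "42") would satisfy the br.readLine() call in sam2.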
You can just call the main method of the second class. The main method is just like any other static method.
This is what worked for me:
try {
single.main(new String[0]);
} catch (Exception e) {
JOptionPane.showMessageDialog(null, e);
}
Just call the other class's main method. For example, if your Java class file name is xyz.java, you can call and execute it in a Java Swing application on the click of a JButton; the code is:
private void Btn_createdatabaseActionPerformed(java.awt.event.ActionEvent evt) {
xyz.main(new String[0]);
}
That's it...

