InputStream blocks when reading a large amount of data from an external process - java

I have a problem getting data from a Process started with ProcessBuilder.
There is a process that I must call from Java that can run for a very long time and can produce a large amount of data.
What I have been doing so far is this:
public InputStream exec() throws IOException, InterruptedException {
    ProcessBuilder pb = new ProcessBuilder(args);
    Process p = pb.start();
    p.waitFor();
    return p.getInputStream();
}
This code blocks after it runs for some time. I assume the pipe buffer for the process's output fills up and the process is waiting for me to read from it.
I want to return an InputStream because the output may or may not be compressed, so that I can decompress it later, and I must read it as a byte stream.
How can I run this process and get its output data?

Related

Is there a way to both read a process's output stream and redirect its output to standard IO?

I have not managed to both read from a process's output stream and redirect it to standard IO at the same time. I can do either one, but not both.
I tried using both inheritIO() and redirectOutput(ProcessBuilder.Redirect.PIPE), and it didn't work: I can read the output, but it doesn't appear on standard output.
@Test
void testRedirectOutput() throws IOException, InterruptedException {
    // when
    Process proc = new ProcessBuilder()
            .redirectErrorStream(true)
            .inheritIO()
            .redirectOutput(ProcessBuilder.Redirect.PIPE)
            .command("where", "where")
            .start();
    proc.waitFor();
    // then
    String output = readAllOutput(proc);
    assertNotNull(output);
}
private static String readAllOutput(Process process) throws IOException {
    StringBuilder builder = new StringBuilder();
    BufferedReader reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
    String line;
    do {
        line = reader.readLine();
        builder.append(line);
    } while (line != null);
    reader.close();
    return builder.toString();
}
If I remove .redirectOutput(ProcessBuilder.Redirect.PIPE), the output appears on standard IO but I can't read it (output is null).
Is there an elegant way to achieve this rather than calling System.out.println(output) myself?
Thanks.
A Process runs in parallel to your Java program. While it runs, it reads something from standard input and writes something to standard output and standard error. Typically, a process does not read all of its input at once and then write all of its output at once. Instead, it reads some input, does something with it, and then produces some output. This repeats until the process detects end of input, at which point it writes the remaining output to standard output and closes it. In such cases, if you tried to write everything at once and then read everything at once, the OS buffers would fill up and your streams would block in a read or write operation.
This means you need a separate Thread to handle each stream. In addition to the thread that is running this code (probably main), you will need two other threads. If the input and output can fit into memory, you can simply pass the data around as ByteArrayInputStream and ByteArrayOutputStream to corresponding threads, and let them do the "pumping" of the data to/from the process.
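For example, here is a minimal sketch of that pumping approach. The "cat" command, the in-memory buffers, and the use of InputStream.transferTo (Java 9+) are illustrative assumptions, not part of the original code:
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class ProcessPumpExample {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Hypothetical command for illustration; replace with the program you actually run.
        Process p = new ProcessBuilder("cat").start();

        InputStream stdinSource = new ByteArrayInputStream("hello world".getBytes());
        ByteArrayOutputStream stdoutSink = new ByteArrayOutputStream();

        // Thread 1: pump data into the process's stdin, then close it so the process sees EOF.
        Thread writer = new Thread(() -> {
            try (OutputStream procStdin = p.getOutputStream()) {
                stdinSource.transferTo(procStdin);   // transferTo requires Java 9+
            } catch (IOException ignored) {
            }
        });

        // Thread 2: pump the process's stdout into memory so the pipe never fills up.
        Thread reader = new Thread(() -> {
            try (InputStream procStdout = p.getInputStream()) {
                procStdout.transferTo(stdoutSink);
            } catch (IOException ignored) {
            }
        });

        writer.start();
        reader.start();
        writer.join();
        reader.join();
        int exitCode = p.waitFor();   // safe to wait now: both pipes have been drained

        System.out.println("exit=" + exitCode + ", output=" + stdoutSink);
    }
}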

process.waitFor() returns on one of my computers but not on my other computer

I have this code
private static void restartTor() throws IOException, InterruptedException {
    String killTor = "killall tor";
    String startTor = "/opt/local/bin/tor -f /dev/torrc";
    Runtime run = Runtime.getRuntime();
    Process pr = run.exec(killTor);
    pr.waitFor();
    BufferedReader buf = new BufferedReader(new InputStreamReader(pr.getInputStream()));
    String line = "";
    while ((line = buf.readLine()) != null) {
        System.out.println(line);
    }
    pr = run.exec(startTor);
    pr.waitFor();
    buf = new BufferedReader(new InputStreamReader(pr.getInputStream()));
    line = "";
    while ((line = buf.readLine()) != null) {
        System.out.println(line);
    }
}
When I run this on computer A it executes as expected, but when I run it on my second computer, B, it gets stuck at the second pr.waitFor().
I have read a bunch of questions here on SE, such as process.waitFor() never returns and Java process.waitFor() does not return, and the main issue there seems to be that the output buffer isn't being read, but I do read it (don't I?).
A and B are similar, but not identical (Macs, running 10.15, A has 32 GB RAM, B has 16 GB RAM).
I use the same version of tor and the torrc:s are identical on A and B.
I am stumped. What is the problem here?
Edit: On B, if I manually kill the process from a regular terminal, it returns and everything continues as expected.
Edit 2: Now it fails on computer A as well. I had run it dozens of times there without problems before, but now it fails constantly.
I don't know if this is the ultimate cause of your problem, but you should call waitFor after reading the output (and errors) from the external process. If a process writes more to its standard output and error streams than the OS is prepared to buffer, then your code could deadlock:
The external process is blocked while trying to write output
Your Java code is blocked waiting for the external process to exit.
You need to consume both streams at the same time if you are experiencing this stream deadlock on waitFor(). You can do this with background threads reading pr.getErrorStream(), or by setting up stderr handling before calling waitFor().
To do this, replace the use of Runtime.exec with ProcessBuilder.
// Note that ProcessBuilder does not tokenize a command-line string, so pass each argument separately:
ProcessBuilder pb = new ProcessBuilder("/opt/local/bin/tor", "-f", "/dev/torrc");
The easiest options are either sending the error stream to a file:
File stderr = new File("stderr.log");
pb.redirectError(stderr);
... or simply redirecting the error stream to merge it with stdout:
pb.redirectErrorStream(true);
then
pr = pb.start();
pr.waitFor();
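Putting these pieces together, a sketch of restartTor along those lines might look like the following. It mirrors the structure of the code in the question, the commands and paths are taken from it, and runAndDrain is just an illustrative helper name:
private static void restartTor() throws IOException, InterruptedException {
    // ProcessBuilder does not split a command line for you, so pass each argument separately.
    runAndDrain(new ProcessBuilder("killall", "tor"));
    runAndDrain(new ProcessBuilder("/opt/local/bin/tor", "-f", "/dev/torrc"));
}

private static void runAndDrain(ProcessBuilder pb) throws IOException, InterruptedException {
    pb.redirectErrorStream(true);               // merge stderr into stdout so one reader is enough
    Process pr = pb.start();
    try (BufferedReader buf = new BufferedReader(new InputStreamReader(pr.getInputStream()))) {
        String line;
        while ((line = buf.readLine()) != null) {
            System.out.println(line);
        }
    }
    pr.waitFor();                               // wait only after the output has been drained
}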

Java: Wait for subprocess of process to finish before reading process's InputStream

I have a process created as follows:
Process p = Runtime.getRuntime().exec(new String[]{"su"});
In my program, I only want to create this process once. I am developing a root file explorer application for Android, and whenever this process is created, the Android device will prompt the user to grant root permissions. This is a very slow operation, and as this is a file browser, it will need root permissions often. So, I have decided to create this process once and write commands to its OutputStream in the following manner (stdin is this OutputStream):
stdin.writeBytes(command + "\n");
Before I can read the output of the command, I need my program to wait until the command written by writeBytes has terminated. I have tried p.waitFor(), but this causes the program to hang.
Here is how I read bytes from the InputStream:
final int BUFF_LEN = 262144;
int read;
String out = "";
stdout = p.getInputStream();
byte[] buffer = new byte[BUFF_LEN];
while (true) {
    read = stdout.read(buffer);
    out += new String(buffer, 0, read);
    if (read < BUFF_LEN) {
        // we have read everything
        break;
    }
}
Note that although the read(buffer) method blocks until input data is available, it does not block in this case because it thinks it has reached the end of the InputStream.
I have tried to include only relevant portions of my code in this post, but if you would like to take a look at the entire source code of the class where this is contained, see here: http://pastebin.com/t6JdWmQr.
How can I make sure the command has finished running before reading the process' InputStream?
I also encountered a similar problem, and I found the answer here:
Wait until a command in su finishes
If you don't need to read anything from this shell process, simply attaching a reader to the shell's output stream may be enough to let the command complete.
Alternatively, XDA has a better way:
[HowTo]Execute Root Commands and read output
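Since the linked answers aren't reproduced here, this is a sketch of the usual marker trick (my assumption, not taken from those links): write the command, then echo a unique marker, and read stdout until the marker shows up. runRootCommand is a hypothetical helper, and stdin/stdout are assumed to be the streams of the single long-lived su process from the question:
// Hypothetical helper; stdin and stdout belong to the long-lived "su" process.
private static String runRootCommand(DataOutputStream stdin, BufferedReader stdout, String command)
        throws IOException {
    String marker = "---EOC-" + System.nanoTime() + "---";    // unique end-of-command marker
    stdin.writeBytes(command + "\n");
    stdin.writeBytes("echo " + marker + "\n");                // echoed only after the command finishes
    stdin.flush();

    StringBuilder out = new StringBuilder();
    String line;
    while ((line = stdout.readLine()) != null) {
        if (line.equals(marker)) {
            break;                                            // the command has terminated
        }
        out.append(line).append('\n');
    }
    return out.toString();
}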

IOUtils.copy() hangs when copying big stream?

I want to parse the content of a file with the srcML parser, which is an external Windows program. I'm doing this in the following way:
String command = "src2srcml.exe --language java";
Process proc = Runtime.getRuntime().exec(command);
InputStream fileInput = Files.newInputStream(file);
OutputStream procOutput = proc.getOutputStream();
IOUtils.copy(fileInput, procOutput);
IOUtils.copy() is from the Commons IO 2.4 library.
When my file is small (several KB), everything works fine. However, when I try to copy a relatively big file (~72 KB), my program hangs.
Moreover, when I execute the parser 'manually' in cmd:
src2srcml.exe --language Java < BigFile.java
everything works fine, too.
Any ideas why this is happening?
You should buffer the OutputStream:
OutputStream procOutput = proc.getOutputStream();
BufferedOutputStream bos = new BufferedOutputStream(procOutput);
IOUtils.copy(fileInput, bos);
Moreover, why not simply redirect the file itself as the process's input?
// ProcessBuilder does not tokenize a command line, so pass the arguments separately.
ProcessBuilder pb = new ProcessBuilder("src2srcml.exe", "--language", "java");
pb.redirectInput(file); // redirectInput takes a java.io.File
Process proc = pb.start();
proc.waitFor();
The problem is most likely that you are not consuming the output of the external program in a separate thread. You need to start a separate thread to consume the output so that the external program does not get blocked.
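For instance, a rough sketch of that idea, keeping the Runtime.exec call and command string from the question (the surrounding method is assumed to declare IOException and InterruptedException):
String command = "src2srcml.exe --language java";
Process proc = Runtime.getRuntime().exec(command);

// Drain the parser's stdout on a background thread so its output pipe never fills up.
Thread outputDrainer = new Thread(() -> {
    try (InputStream procInput = proc.getInputStream()) {
        IOUtils.copy(procInput, System.out);
    } catch (IOException ignored) {
    }
});
outputDrainer.start();

// Feed the file to the parser's stdin, then close it so the parser sees end of input.
try (InputStream fileInput = Files.newInputStream(file);
     OutputStream procOutput = proc.getOutputStream()) {
    IOUtils.copy(fileInput, procOutput);
}

outputDrainer.join();
proc.waitFor();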

How to write in Java to stdin of ssh?

Everything works fine on the command line, but when I translate what I want into Java, the receiving process never gets anything on stdin.
Here's what I have:
private void deployWarFile(File warFile, String instanceId) throws IOException, InterruptedException {
    Runtime runtime = Runtime.getRuntime();
    // FIXME(nyap): Use Jsch.
    Process deployWarFile = runtime.exec(new String[]{
            "ssh",
            "gateway",
            "/path/to/count-the-bytes"});
    OutputStream deployWarFileStdin = deployWarFile.getOutputStream();
    InputStream deployWarFileStdout = new BufferedInputStream(deployWarFile.getInputStream());
    InputStream warFileInputStream = new FileInputStream(warFile);
    IOUtils.copy(warFileInputStream, deployWarFileStdin);
    IOUtils.copy(deployWarFileStdout, System.out);
    warFileInputStream.close();
    deployWarFileStdout.close();
    deployWarFileStdin.close();
    int status = deployWarFile.waitFor();
    System.out.println("************ Deployed with status " + status + " file handles. ************");
}
The script 'count-the-bytes' is simply:
#!/bin/bash
echo "************ counting stdin bytes ************"
wc -c
echo "************ counted stdin bytes ************"
The output indicates that the function hangs at the 'wc -c' line -- it never gets to the 'counted stdin bytes' line.
What's going on? Would using Jsch help?
You might try closing the output stream before you expect wc -c to return.
IOUtils.copy(warFileInputStream, deployWarFileStdin);
deployWarFileStdin.close();
IOUtils.copy(deployWarFileStdout, System.out);
warFileInputStream.close();
deployWarFileStdout.close();
Would using Jsch help?
Using JSch would only help if you used the setInputStream() and setOutputStream() methods of the channel instead of IOUtils.copy, since they manage the copying on a separate thread.
ChannelExec deployWarFile = (ChannelExec)session.openChannel("exec");
deployWarFile.setCommand("/path/to/count-the-bytes");
deployWarFile.setOutputStream(System.out);
deployWarFile.setInputStream(new BufferedInputStream(new FileInputStream(warFile)));
deployWarFile.connect();
(Here you somehow have to wait until the other side closes the channel.)
If you simply replaced the Runtime.exec with opening a ChannelExec (and starting it after getting the streams), the problem would be exactly the same, and could be solved by the same solution mentioned by antlersoft, i.e. closing the input before reading the output:
ChannelExec deployWarFile = (ChannelExec)session.openChannel("exec");
deployWarFile.setCommand("/path/to/count-the-bytes");
OutputStream deployWarFileStdin = deployWarFile.getOutputStream();
InputStream deployWarFileStdout = new BufferedInputStream(deployWarFile.getInputStream());
InputStream warFileInputStream = new FileInputStream(warFile);
deployWarFile.connect();
IOUtils.copy(warFileInputStream, deployWarFileStdin);
deployWarFileStdin.close();
warFileInputStream.close();
IOUtils.copy(deployWarFileStdout, System.out);
deployWarFileStdout.close();
(Of course, if you have longer output, you will want to do input and output in parallel, or simply use the first method.)
You probably get an error, but the process hangs because you are not reading the error stream.
Taken from the Process JavaDoc
All its standard io (i.e. stdin, stdout, stderr) operations will be redirected to the parent process through three streams (Process.getOutputStream(), Process.getInputStream(), Process.getErrorStream()). The parent process uses these streams to feed input to and get output from the subprocess. Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, and even deadlock.
So you need to read all of them. Using ProcessBuilder is probably easier.
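For example, a sketch of what that could look like for the deployment above, with the command taken from the question and stderr merged into stdout (not a drop-in replacement):
private void deployWarFile(File warFile, String instanceId) throws IOException, InterruptedException {
    ProcessBuilder pb = new ProcessBuilder("ssh", "gateway", "/path/to/count-the-bytes");
    pb.redirectErrorStream(true);   // fold stderr into stdout so nothing is left unread
    Process deployWarFile = pb.start();

    // Write the war file to ssh's stdin and close it, otherwise wc -c never sees end of input.
    // Writing everything first is fine here because count-the-bytes prints almost nothing before
    // stdin is closed; for chattier commands, drain stdout on a separate thread instead.
    try (InputStream warFileInputStream = new FileInputStream(warFile);
         OutputStream deployWarFileStdin = deployWarFile.getOutputStream()) {
        IOUtils.copy(warFileInputStream, deployWarFileStdin);
    }

    // Now drain the merged stdout/stderr.
    try (InputStream deployWarFileStdout = deployWarFile.getInputStream()) {
        IOUtils.copy(deployWarFileStdout, System.out);
    }

    int status = deployWarFile.waitFor();
    System.out.println("************ Deployed with status " + status + ". ************");
}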
