How to improve the performance of a Java Process?

Hello stackoverflow community,
I am trying to run a command on the console and read the input stream of this process:
String command[] = {"ffmpeg"}; //minimal command example
Process proc = processBuilder.command(command).start();
byte[] bytes = IOUtils.toByteArray(proc.getInputStream());
proc.waitFor();
The command is a very long ffmpeg command, which prints bytes to the output. Everything works just fine, but it is very slow from Java. When I run this command in my regular command line tool, it takes roughly 20 ms and everything is printed and done.
With the Java process it takes more than 2 s. I also tried to redirect all I/O streams to the std streams, since I thought my reading might be too slow, but the performance is the same. Reading the stream in additional threads or using other stream readers etc. did not change anything either. The only thing that has an effect is adding cmd (currently working with Windows 10) to the command:
String command[] = {"cmd","/C","ffmpeg"};
resulting in an execution time of roughly 400 ms. I did not know before that this makes a difference, but it really does.
Since this is a Spring Boot web application and the command is used to output images from a video, 400 ms is still a lot. The issue here is that the frontend/browser requests a bunch of images (let's say 36). Apparently the number of simultaneous requests a browser makes to one host is limited (Chrome: 6 requests), see here. Therefore it takes at best 6 × 400 ms to deliver the content.
So is there a way to improve the performance of the Java process, or maybe keep it open and fire commands at it to avoid the startup overhead?

Here is the pseudo code answering your question in the comments above:
ByteArrayOutputStream out = new ByteArrayOutputStream();
ProcessBuilder pb = new ProcessBuilder(command);
// Consuming STDOUT and STDERR in separate reads on the same thread can lead to
// the process freezing if it writes large amounts, so merge STDERR into STDOUT.
pb.redirectErrorStream(true);
Process process = pb.start();
try (var infoStream = process.getInputStream()) {
    infoStream.transferTo(out);
}
int status = process.waitFor();
if (status > 0) {
    // error handling
}
InputStream#transferTo(out) consumes the process's output stream on the calling thread. That's it: the stream is drained while the process runs.
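Wrapped into a small reusable helper, the same pattern might look like this (a sketch; `echo` stands in for the real ffmpeg invocation so the example runs anywhere):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;

public class ProcRunner {

    // Runs the command, merges stderr into stdout, and returns all output bytes.
    static byte[] run(String... command) throws IOException, InterruptedException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ProcessBuilder pb = new ProcessBuilder(command);
        pb.redirectErrorStream(true); // one stream to drain, no extra thread needed
        Process process = pb.start();
        try (var infoStream = process.getInputStream()) {
            infoStream.transferTo(out); // drain while the process runs
        }
        int status = process.waitFor();
        if (status != 0) {
            throw new IOException("process exited with status " + status);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws Exception {
        // 'echo' is a placeholder for the real ffmpeg invocation.
        System.out.print(new String(run("echo", "hello")));
    }
}
```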

Related

Force BufferedInputStream to return captured content

I have a Spring Boot REST application with 2 endpoints: the first starts the Process (a db shell) with a command like 'mysql -e root'. The second accepts a command (query) and writes it to the OutputStream (which is a BufferedOutputStream in the Process implementation).
Starting the Process (MySQL shell):
ProcessBuilder builder = new ProcessBuilder(commands);
builder.redirectErrorStream(true);
process = builder.start();
out = process.getInputStream();
in = process.getOutputStream();
Executing a command (e.g. 'select * from db.some_table;\n'):
byte[] commandBytes = command.getBytes(Charset.defaultCharset());
in.write(commandBytes, 0, commandBytes.length);
in.flush();
After running a command (query) I want to return its result (or at least, output it to the console) with:
int no = out.available();
if (no > 0) {
    int n = out.read(buffer, 0, Math.min(no, buffer.length));
    System.out.println(new String(buffer, 0, n));
}
The problem is that out.available() is always 0.
If I call close() on the output stream, out.available() returns the full length of the input stream and I can read from it. But that is not what I want.
Can I somehow force the BufferedInputStream to make the result available to be read without closing the stream?
I see that internally BufferedInputStream uses FileInputStream and FileChannel, but I haven't found a way to capture the result when output stream is not closed.
I think what's happening is that the mysql client detects that standard input is not a terminal, and runs in batch mode rather than in interactive mode. This isn't caused by the behaviour of BufferedInputStream: it blocks indefinitely on read, and reports 0 bytes available, because there genuinely isn't anything to read yet from the output of the subprocess.
In batch mode, the client expects to read a list of commands from standard input, and only executes them once the end of file is reached. In other words, the subprocess will not produce any output on the InputStream you see in your parent process until your parent process closes the OutputStream of the subprocess.
It appears that there's no way to force mysql to run in interactive mode (according to this question: "How to force mysql.exe to run in "interactive" mode?", and the documentation of command line options).
The mysqlsh client can be forced into interactive mode, but it is worth considering whether this is really the best solution for your use case. Other alternatives include:
Embracing batch mode and executing all of the commands together, if the form of each command does not depend on the results of previous ones
Sequentially invoking the subprocess multiple times in batch mode, if subsequent commands do depend on the results of previous ones
Performing the queries using JDBC (as g00se recommended in the comments)
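The first alternative (embracing batch mode) can be sketched as follows. Here `cat` stands in for the `mysql` client so the example is runnable anywhere; the real command would be something like the mysql invocation from the question:

```java
import java.io.IOException;
import java.io.OutputStream;

public class BatchModeDemo {
    // Writes all commands to the subprocess's stdin, closes it (the EOF is what
    // makes batch mode execute), then reads the full output.
    static String runBatch(String[] procCmd, String... inputLines)
            throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(procCmd);
        pb.redirectErrorStream(true);
        Process p = pb.start();
        try (OutputStream stdin = p.getOutputStream()) {
            for (String line : inputLines) {
                stdin.write((line + "\n").getBytes());
            }
        } // closing stdin sends EOF to the subprocess
        String output = new String(p.getInputStream().readAllBytes());
        p.waitFor();
        return output;
    }

    public static void main(String[] args) throws Exception {
        // 'cat' echoes stdin back, standing in for the mysql client reading queries.
        System.out.print(runBatch(new String[]{"cat"},
                "select * from db.some_table;"));
    }
}
```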

How to wait for a process in Java [duplicate]

This question already has answers here:
Wait for process to finish before proceeding in Java
(4 answers)
Closed 3 years ago.
In the following code, I tried to execute a script which takes quite a while to finish. I already tried process.waitFor(), but it didn't let the script finish. Any ideas how to make it work?
ProcessBuilder pb = new ProcessBuilder(osShell);
Process process = pb.start();
PrintWriter pyCon = new PrintWriter(process.getOutputStream());
pyCon.println("cd " + videoDir);
System.out.println("Executing python file: "+ command);
pyCon.println(command);
//Here, I need a piece of code which lets my command run in peace
pyCon.close();
System.out.println(convertStreamToString(process.getInputStream()));
process.waitFor();
By closing the stdin of the shell process, you pretty much told it that it's over. Chances are the shell ended up killing the child python command, or at least stopped draining and forwarding the python stdout/stderr to its own stdout/stderr.
Try waiting for some expected end marker from your python command; only then close stdin. Drain all the stdout/stderr from the process, of course.
It is common to see 2 more threads just to pump the bytes out of stdout/stderr while the main thread is controlling/waiting for the process. Keep in mind that if you don't pump the bytes out, the pipes may fill up and block on the child side, preventing it from terminating. These pipes are often pretty small (512 bytes to 2 KB, for instance; that is obviously testable).
Sending commands to a shell's standard input is unlikely to work, especially on Windows.
The correct way to execute something in a particular directory is not by trying to send a cd command, but by specifying the directory in the ProcessBuilder itself:
ProcessBuilder pb = new ProcessBuilder("python", pythonFileName);
pb.directory(new File(videoDir));
Process process = pb.start();
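To make the working-directory setting visible, here is a runnable sketch in which `pwd` stands in for the python command and `/tmp` is an arbitrary placeholder for `videoDir`:

```java
import java.io.File;

public class DirDemo {
    // Runs 'pwd' with the given working directory and returns what it prints.
    static String pwdIn(String dir) throws Exception {
        ProcessBuilder pb = new ProcessBuilder("pwd"); // stand-in for the python command
        pb.directory(new File(dir));                   // the key line: set the working dir
        pb.redirectErrorStream(true);
        Process process = pb.start();
        String output = new String(process.getInputStream().readAllBytes());
        process.waitFor();
        return output;
    }

    public static void main(String[] args) throws Exception {
        System.out.print(pwdIn("/tmp")); // prints /tmp
    }
}
```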

Java: Wait for subprocess of process to finish before reading process's InputStream

I have a process created as follows:
Process p = Runtime.getRuntime().exec(new String[]{"su"});
In my program, I only want to create this process once. I am developing a root file explorer application for Android, and whenever this process is created, the Android device will prompt the user to grant root permissions. This is a very slow operation, and as this is a file browser, it will need root permissions often. So, I have decided to create this process once and write commands to its OutputStream in the following manner (stdin is this OutputStream):
stdin.writeBytes(command + "\n");
Before I can read the output of the command, I need my program to wait until the command written by writeBytes has terminated. I have tried p.waitFor(), but this causes the program to hang.
Here is how I read bytes from the InputStream:
int read;
String out = "";
stdout = p.getInputStream();
byte[] buffer = new byte[262144];
while (true) {
    read = stdout.read(buffer);
    out += new String(buffer, 0, read);
    if (read < BUFF_LEN) {
        //we have read everything
        break;
    }
}
Note that although the read(buffer) method blocks until input data is available, it does not block in this case because it thinks it has reached the end of the InputStream.
I have tried to include only relevant portions of my code in this post, but if you would like to take a look at the entire source code of the class where this is contained, see here: http://pastebin.com/t6JdWmQr.
How can I make sure the command has finished running before reading the process' InputStream?
I also encountered a similar problem, and I found the answer here:
Wait until a command in su finishes
If you don't need to read anything from this shell process, simply draining the shell's output stream may let the shell process complete.
Alternatively, XDA also has a better way:
[HowTo]Execute Root Commands and read output
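The common trick behind those links is to echo a known end marker after each command and read output until the marker appears. A minimal sketch, with `sh` standing in for `su` so it runs without root (the marker string is an arbitrary assumption):

```java
import java.io.BufferedReader;
import java.io.DataOutputStream;
import java.io.InputStreamReader;

public class ShellSession {
    static final String MARKER = "---CMD-DONE---"; // arbitrary end marker

    // Sends one command to an already-running shell and reads its output
    // up to (but not including) the echoed end marker.
    static String exec(DataOutputStream stdin, BufferedReader stdout, String command)
            throws Exception {
        stdin.writeBytes(command + "\n");
        stdin.writeBytes("echo " + MARKER + "\n"); // the marker tells us the command finished
        stdin.flush();
        StringBuilder out = new StringBuilder();
        String line;
        while ((line = stdout.readLine()) != null && !line.equals(MARKER)) {
            out.append(line).append("\n"); // everything before the marker is command output
        }
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // 'sh' stands in for 'su' so the example runs without root.
        Process p = Runtime.getRuntime().exec(new String[]{"sh"});
        DataOutputStream stdin = new DataOutputStream(p.getOutputStream());
        BufferedReader stdout = new BufferedReader(new InputStreamReader(p.getInputStream()));
        System.out.print(exec(stdin, stdout, "echo hello")); // prints: hello
        stdin.writeBytes("exit\n"); // shut the shell down once finished
        stdin.flush();
        p.waitFor();
    }
}
```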

Why does java.lang.Process's readLine() behave differently when reading the InputStream on different boxes with the same OS

I tested this code (below) on several different Linux boxes (4+) and it worked fine. However, on one Linux box I ran into an issue with readLine() hanging on the error input stream (errorStream). This stream should be empty, so I suspected that box was not writing a line terminator to the errorStream for the error. I changed my code to use read() instead of readLine()... but read() also hung.
I tried retrieving the regular input stream first, and that worked: there were no hangs with readLine()/read() on the error input stream. But I could not do this, since I needed to obtain possible errors first. Since this appeared to be a deadlock, I was able to resolve it by having each input stream read from its own thread. Why did I only see this issue on one box? Is there a kernel setting or some other setting specific to this box that could have caused this?
ProcessBuilder processBuilder = new ProcessBuilder();
try
{
    Process processA = null;
    synchronized (processBuilder)
    {
        processBuilder.command("/bin/sh", "-c", " . /Home/SomeScript.ksh");
        processA = processBuilder.start();
    }
    inputStream = processA.getInputStream();
    reader = new BufferedReader(new InputStreamReader(inputStream));
    errorStream = processA.getErrorStream();
    errorReader = new BufferedReader(new InputStreamReader(errorStream));
    String driverError;
    while ((driverError = errorReader.readLine()) != null)
    {
        //some code
    }
Why did I only see this issue on one box?
Most likely because of something in the script that is being run ... and its interactions with its environment (e.g. files, environment variables, etc)
Is there a kernel setting or some other setting specific to this box that could have caused this?
It is possible but unlikely that it is a kernel setting. It might be "something else". Indeed, it has to be "something" outside of the Java application that is to blame, at least in part.
I suggest you do the following temporarily (at least):
ProcessBuilder processBuilder = new ProcessBuilder();
processBuilder.command("/bin/sh", "-c", " . /Home/SomeScript.ksh");
processBuilder.redirectErrorStream(true);
processA = processBuilder.start();
inputStream = processA.getInputStream();
reader = new BufferedReader(new InputStreamReader(inputStream));
String line;
while ((line = reader.readLine()) != null) {
    System.out.println(line);
}
// waitFor() blocks until the process has actually exited, so the exit status
// is guaranteed to be available here.
System.out.println("Return code is " + processA.waitFor());
That way you can see what all of the output is.
There should not be a problem if the external process fails to put a newline at the end of the last line. The Java process will see an EOF on the input stream, and the BufferedReader will return what characters it has ... and return null on the next call.
Another possibility is that the external process is blocking because it is trying to read from its standard input.
UPDATE
The redirectErrorStream also resolved the issue, but I need the error stream separate.
OK, so if it did (reliably) solve the problem, then that (most likely) means that you have to read the external process's stdout and stderr streams in parallel. The simple way to do this is to create 2 threads to read and buffer the two streams separately. For example: Capturing stdout when calling Runtime.exec
(Your problem is due to the fact that pipes have a finite buffering capacity. The external process is most likely alternating between writing stuff to stdout and stderr. If it tries to write to one of the pipes when that pipe is "full", it will block. But if your application is reading all of the other pipe (to EOF) before it reads the blocked pipe, then everything will deadlock. The fact that the external process is stuck in PIPE_W state is more evidence for this explanation.
One possible reason that you are seeing different behaviour on different systems is that the amount of buffering in a pipe is system dependent. But it could also be due to differences in what the external process is doing; e.g. its inputs.)
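A minimal sketch of the two-thread approach, using a stand-in command that writes to both streams (the real command would be the `/bin/sh -c . /Home/SomeScript.ksh` call from the question):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class TwoThreadPump {
    // Drains a stream into a buffer on its own thread so the pipe never fills up.
    static Thread pump(InputStream in, ByteArrayOutputStream sink) {
        Thread t = new Thread(() -> {
            try {
                in.transferTo(sink);
            } catch (IOException ignored) {
            }
        });
        t.start();
        return t;
    }

    // Runs a command and returns {stdout, stderr}, each read on its own thread.
    static String[] run(String... command) throws Exception {
        Process p = new ProcessBuilder(command).start();
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ByteArrayOutputStream err = new ByteArrayOutputStream();
        Thread tOut = pump(p.getInputStream(), out);
        Thread tErr = pump(p.getErrorStream(), err);
        tOut.join();
        tErr.join();
        p.waitFor();
        return new String[]{out.toString(), err.toString()};
    }

    public static void main(String[] args) throws Exception {
        // A stand-in command that writes to both streams.
        String[] result = run("/bin/sh", "-c", "echo out; echo err 1>&2");
        System.out.print("stdout: " + result[0]);
        System.out.print("stderr: " + result[1]);
    }
}
```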
You are running OS-specific commands in a script, any one of which could be holding the error output. You can avoid this by discarding the errors, but that is unlikely to be a good idea.
I would check that the versions of the OS are the same and whether you have any significant differences in the commands you run in the script. If this doesn't help, take commands out of the script until it starts working. I assume an empty script doesn't do this.

How to wait for a batch file processing to end which is executed using Runtime.exec or ProcessBuilder.start, before moving to the next statement?

I have written a small application for a project that will do the following tasks:
Writes a commands.bat file. This bat file has some source-code-server commands that will take some time to process.
Executes the commands.bat using ProcessBuilder and gets outputfile.txt using the redirectOutput(File file) method.
Reads the outputfile.txt and gets the desired output.
When I run this application, the program control starts with step 1 and executes it completely. In step 2 the control starts a process that drives the batch file. Now the commands.bat file takes some time to finish (depending on the response from the source code server). Sometimes this batch takes a little more than the reasonable time, for which the control does not wait, and it starts executing step 3; this way I am not getting the complete stream in outputfile.txt.
I also tried things like:
waitFor(): even with this, the control does not wait for the process to end (technically I may be wrong)
Thread.sleep(): this does not work, as the time taken by the batch file processing is not certain.
Please help.
This is how I am waiting for a batch file to execute. Hopefully you have solved the problem by now, but it might help someone else who comes across this question.
// Any command you want to run; in my case I'm executing a batch file
String cmd = "load_execute.bat";
// FILE_PATH is the directory to start from
ProcessBuilder builder = new ProcessBuilder("cmd.exe", "/c", cmd).redirectErrorStream(true);
builder.directory(new File(FILE_PATH));
Process process = builder.start();
// Redirect the process's output stream to a local print stream
BufferedReader input = new BufferedReader(new InputStreamReader(process.getInputStream()));
String line;
while ((line = input.readLine()) != null) {
    System.out.println(line);
}
input.close();
int res = process.waitFor();
