Send multiple commands with ProcessBuilder? - java

I want to spawn a child process in Java and send it different commands from inside my app. The child process has authentication, and each user can have a variety of internal commands.
For example:
> login myuser password
OK
> list certs
cert1 abc
cert2 efg
> logout
> exit
Well, to simulate that I will build my example with "node" as the stdin/stdout CLI.
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.io.OutputStreamWriter;

public class JAVAMain {
    public static void main(String[] args) throws Exception {
        action();
    }

    public static String action() throws Exception {
        ProcessBuilder pb = new ProcessBuilder("node");
        pb.redirectErrorStream(true);
        Process process = pb.start();

        // streams
        InputStream stdout = process.getInputStream();
        OutputStream stdin = process.getOutputStream();
        BufferedReader reader = new BufferedReader(new InputStreamReader(stdout));
        BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(stdin));

        String buff = "";
        String res = "";

        System.out.println("1");
        writer.write("console.log('OK');\n"); // simulate "login answer"
        writer.flush();

        System.out.println("2");
        res = "";
        while ((buff = reader.readLine()) != null) {
            res += buff;
        }
        if (!res.equals("OK")) {
            reader.close();
            writer.close();
            throw new Exception("Invalid auth");
        }

        System.out.println("3");
        writer.write("console.log('any text...');\n");
        writer.flush();

        System.out.println("4");
        res = "";
        while ((buff = reader.readLine()) != null) {
            res += buff;
        }

        reader.close();
        writer.close();
        return res;
    }
}
I expect it to print 1, 2, 3, 4 and return any text... as res for this example. But the program never stops and stays at 1, 2. If I close the writer right after the first flush, I get this output:
1
2
3
Exception in thread "main" java.io.IOException: Stream closed
at java.io.BufferedWriter.ensureOpen(BufferedWriter.java:116)
at java.io.BufferedWriter.write(BufferedWriter.java:221)
at java.io.Writer.write(Writer.java:157)
at com.keynua.kades.JAVAMain2.action(JAVAMain2.java:48)
at com.keynua.kades.JAVAMain2.main(JAVAMain2.java:14)
That's because once I close the writer, the reader works, but I can't write again. So how can I send multiple commands to the child app and read the output between them, to continue the flow with other commands?

One problem is that your while loops
while ((buff = reader.readLine()) != null) { ... }
only terminate when the reader has reached the end of the input stream.
The end of the input stream is only reached when the subprocess terminates.
The second problem is that you are using NodeJS as the sample command executor.
If NodeJS is started from a console, you can enter JavaScript statements and they are executed one by one.
You are, however, starting NodeJS not from a console but from another application. In this case, NodeJS wants to read a complete script from stdin and executes the complete script at once.
You could start NodeJS with the -i parameter (force interactive mode), at the expense of some additional output.
To achieve this, you would create the ProcessBuilder with
ProcessBuilder pb = new ProcessBuilder("node", "-i");
Communicating with a subprocess in this way only works when you know how many lines to read from the reader before sending the next command.
Knowing how many lines to read can mean:
- knowing how many lines of output a command produces (login: 1 line of output, logout: no output)
- knowing that a command produces a distinct last line, for example an empty line or a line with only "END" in it (see the sketch after this list)
- that a command produces a line count as the first result
- executing another command first that returns the result line count of the subsequent command
The list certs command could either:
- produce
  cert1 abc
  cert2 def
  END
- produce (where the last line would be empty instead of containing a dot)
  cert1 abc
  cert2 def
  .
- produce
  2 certs
  cert1 abc
  cert2 def
- or you could execute a count certs command before executing list certs
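For illustration, here is a minimal sketch of the "distinct last line" variant. It assumes the child protocol prints a line containing only END after each response; the sentinel and the helper class are assumptions for the sketch, not part of node or the original app.

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;

public class CommandClient {
    private final BufferedReader reader;
    private final BufferedWriter writer;

    CommandClient(Process process) {
        reader = new BufferedReader(new InputStreamReader(process.getInputStream()));
        writer = new BufferedWriter(new OutputStreamWriter(process.getOutputStream()));
    }

    // Sends one command and reads lines until the sentinel "END" appears.
    // The stream is flushed but never closed, so further commands can follow.
    String sendCommand(String command) throws IOException {
        writer.write(command);
        writer.newLine();
        writer.flush();
        StringBuilder result = new StringBuilder();
        String line;
        while ((line = reader.readLine()) != null) {
            if (line.equals("END")) break;   // sentinel: this command's output is complete
            result.append(line).append('\n');
        }
        return result.toString();
    }
}

With such a helper, the flow from the question becomes a sequence of sendCommand("login myuser password"), sendCommand("list certs"), and so on, each call returning just that command's output.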

Java - Redirecting Process I/O (piping) Is Stalling [duplicate]

I am trying to redirect the output of a process to the input of another, i.e., piping. When doing this in the Windows DOS shell it looks like
C:\> dir /s/b . | findstr dat$
However, I am trying to run a command like this in Java, and so far it looks like:
Stopwatch sw = Stopwatch.createStarted();
ProcessBuilder source = new ProcessBuilder("cmd", "/S/D/c", "dir", "/s/b", ".");
ProcessBuilder target = new ProcessBuilder("cmd", "/S/D/c", "findstr", "dat$");
source.directory(new File("C:/"));
target.directory(source.directory());
// I am messing with the below lines, nothing is working
source.redirectInput(target.redirectInput());
source.redirectOutput(ProcessBuilder.Redirect.PIPE);
source.redirectOutput(target.redirectInput());
source.redirectInput(target.redirectOutput());
target.redirectOutput(source.redirectInput());
Process pSource = source.start();
Process pTarget = target.start();
log.debug("Running {} | {}", source.command(), target.command());
try (BufferedReader br = new BufferedReader(new InputStreamReader(pTarget.getInputStream()))) {
    String line;
    while ((line = br.readLine()) != null)
        log.debug("{}", line);
} finally {
    log.debug("Ending process {} with exit code {} in time {}", target.command(),
            pTarget.destroyForcibly().exitValue(), sw);
}
But I am finding the code stalls on the readLine, so something isn't working here. How do I properly use IO redirects?
The objects accepted or returned by redirectInput() and redirectOutput() describe a certain policy; they do not represent actual channels.
So with the statement source.redirectInput(target.redirectInput()) you are just specifying that both processes should have the same policy; you are not linking any channels.
In fact, directly linking the channels of two processes is impossible in Java 8. The best you can do to achieve a similar effect is to start a background thread that reads the first process' output and writes it to the second process' input:
static List<Process> doPipeJava8() throws IOException {
    Process pSource = new ProcessBuilder("cmd", "/S/D/c", "dir", "/s/b", ".")
        .redirectInput(ProcessBuilder.Redirect.INHERIT)
        .redirectError(ProcessBuilder.Redirect.INHERIT)
        .start();
    Process pTarget;
    try {
        pTarget = new ProcessBuilder("cmd", "/S/D/c", "findstr", "dat$")
            .redirectErrorStream(true)
            .redirectOutput(ProcessBuilder.Redirect.INHERIT)
            .start();
    } catch(Throwable t) {
        pSource.destroyForcibly();
        throw t;
    }
    new Thread(() -> {
        try(InputStream srcOut = pSource.getInputStream();
            OutputStream dstIn = pTarget.getOutputStream()) {
            byte[] buffer = new byte[1024];
            while(pSource.isAlive() && pTarget.isAlive()) {
                int r = srcOut.read(buffer);
                if(r > 0) dstIn.write(buffer, 0, r);
            }
        } catch(IOException ex) {}
    }).start();
    return Arrays.asList(pSource, pTarget);
}
This configures the error channels, the input channel of the first process, and the output channel of the last process to INHERIT so they will use our initiating process’ console. The first process’ output and the second process’ input are kept at the default PIPE which means establishing a pipe to our initiating process, so it’s our duty to copy the data from one pipe to another.
The method can be used as
List<Process> sub = doPipeJava8();
Process pSource = sub.get(0), pTarget = sub.get(1);
pSource.waitFor();
pTarget.waitFor();
If we remove the .redirectOutput(ProcessBuilder.Redirect.INHERIT) from the builder of the pTarget process, we could read the final output:
List<Process> sub = doPipeJava8();
Process pSource = sub.get(0), pTarget = sub.get(1);
List<String> result = new BufferedReader(new InputStreamReader(pTarget.getInputStream()))
.lines().collect(Collectors.toList());
Java 9 is the first Java version with support for establishing a pipeline between sub-processes. It simplifies the solution to
static List<Process> doPipeJava9() throws IOException {
    return ProcessBuilder.startPipeline(
        List.of(new ProcessBuilder("cmd", "/S/D/c", "dir", "/s/b", ".")
                    .redirectInput(ProcessBuilder.Redirect.INHERIT)
                    .redirectError(ProcessBuilder.Redirect.INHERIT),
                new ProcessBuilder("cmd", "/S/D/c", "findstr", "dat$")
                    .redirectErrorStream(true)
                    .redirectOutput(ProcessBuilder.Redirect.INHERIT)));
}
It does the same as the other solution¹; the example above is configured to let the first process read from the console (if needed) and the last process write to the console. Again, if we omit the .redirectOutput(ProcessBuilder.Redirect.INHERIT) from the last process builder, we can read the last process’ output.
¹ except that it will use the system’s native piping capability when possible
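A minimal sketch of that read-the-output variant, assuming Java 9+ and an enclosing method that declares throws IOException, InterruptedException (the variable names are mine, not from the original answer):

// Read the final output of the pipeline instead of inheriting the console.
List<ProcessBuilder> builders = List.of(
        new ProcessBuilder("cmd", "/S/D/c", "dir", "/s/b", ".")
                .redirectInput(ProcessBuilder.Redirect.INHERIT)
                .redirectError(ProcessBuilder.Redirect.INHERIT),
        new ProcessBuilder("cmd", "/S/D/c", "findstr", "dat$")
                .redirectErrorStream(true));          // no INHERIT on stdout of the last process
List<Process> pipeline = ProcessBuilder.startPipeline(builders);
Process last = pipeline.get(pipeline.size() - 1);
try (BufferedReader br = new BufferedReader(new InputStreamReader(last.getInputStream()))) {
    br.lines().forEach(System.out::println);          // consume the last process' output
}
for (Process p : pipeline) p.waitFor();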

java ProcessBuilder: run program with multiple input

I use ProcessBuilder to run a system command from Java. The command may ask the user for input, and my program fails when the command asks for input multiple times. Example of running such a command directly from the command line:
>test-input
Continue? Y/N
y
Entered y
Again: Continue? Y/N
y
Entered y
If I use my ProcessBuilder-based program to run "test-input", it either hangs or fails to take input a second time. Here is the reading/writing logic. Reading from the input stream (exception handling and stream-close logic omitted):
ProcessBuilder pb = new ProcessBuilder(cmdList);
pb.redirectErrorStream(true);
pb.directory(new File("some-test-dir"));
process = pb.start();
InputStream is = process.getInputStream();
int value = -1;
while ((value = is.read()) != -1) {
    reader.append((char) value);
}
int result = process.waitFor();
Write to output stream:
public void write(String s) {
    OutputStream os = null;
    try {
        os = process.getOutputStream();
        os.write(s.getBytes(Charset.forName("UTF-8")));
    }
    catch (IOException e) {
        //...
    }
    finally {
        // Problematic
        os.close();
    }
}
The problem occurs at the line os.close(). If I put it there, the output stream is closed after the first input is processed; it cannot be re-opened, so the program cannot take the second input. If I do not close the output stream, the program hangs because is.read() blocks forever. How can I solve this? Thanks.
The problem is fixed by writing a newline character after each input and flushing, as described in: Writing to InputStream of a Java Process
os.write(s.getBytes(Charset.forName("UTF-8")));
os.write('\n');
os.flush();
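For completeness, a sketch of the write helper with that fix applied: keep the child's stdin open between inputs, append the newline, and flush. The finishInput name is hypothetical, standing in for wherever the dialogue with the subprocess actually ends.

// Corrected helper: newline + flush after each input, no close() until all input is sent.
public void write(String s) throws IOException {
    OutputStream os = process.getOutputStream();
    os.write(s.getBytes(Charset.forName("UTF-8")));
    os.write('\n');   // terminate the line so the child sees the input
    os.flush();       // push it through the pipe, but keep the stream open
}

// Close the child's stdin only once the whole dialogue is over.
public void finishInput() throws IOException {
    process.getOutputStream().close();
}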

Java reader does not start printing until closing the program [duplicate]

I have the code example below, where you can enter a command for the bash shell, e.g. echo test, and have the result echoed back. However, after the first read, further writes don't work.
Why is this, or am I doing something wrong? My end goal is to create a threaded scheduled task that periodically executes a command against bash, so the OutputStream and InputStream would have to work in tandem and not stop working. I have also been getting the error java.io.IOException: Broken pipe. Any ideas?
Thanks.
String line;
Scanner scan = new Scanner(System.in);

Process process = Runtime.getRuntime().exec("/bin/bash");

OutputStream stdin = process.getOutputStream();
InputStream stderr = process.getErrorStream();
InputStream stdout = process.getInputStream();

BufferedReader reader = new BufferedReader(new InputStreamReader(stdout));
BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(stdin));

String input = scan.nextLine();
input += "\n";
writer.write(input);
writer.flush();

input = scan.nextLine();
input += "\n";
writer.write(input);
writer.flush();

while ((line = reader.readLine()) != null) {
    System.out.println("Stdout: " + line);
}

input = scan.nextLine();
input += "\n";
writer.write(input);
writer.close();

while ((line = reader.readLine()) != null) {
    System.out.println("Stdout: " + line);
}
Firstly, I would recommend replacing the line
Process process = Runtime.getRuntime ().exec ("/bin/bash");
with the lines
ProcessBuilder builder = new ProcessBuilder("/bin/bash");
builder.redirectErrorStream(true);
Process process = builder.start();
ProcessBuilder is new in Java 5 and makes running external processes easier. In my opinion, its most significant improvement over Runtime.getRuntime().exec() is that it allows you to redirect the standard error of the child process into its standard output. This means you only have one InputStream to read from. Before this, you needed to have two separate Threads, one reading from stdout and one reading from stderr, to avoid the standard error buffer filling while the standard output buffer was empty (causing the child process to hang), or vice versa.
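For context, a minimal sketch of the kind of drainer thread the answer alludes to for the case where the streams are not merged; the helper name drain and the lambda style are mine, not from the original answer.

// Drain a stream on a background thread so the child never blocks on a full
// stderr (or stdout) pipe buffer.
static Thread drain(InputStream in, PrintStream sink) {
    Thread t = new Thread(() -> {
        try (BufferedReader r = new BufferedReader(new InputStreamReader(in))) {
            String line;
            while ((line = r.readLine()) != null) {
                sink.println(line);
            }
        } catch (IOException ignored) {
        }
    });
    t.setDaemon(true);
    t.start();
    return t;
}

// Usage: drain(process.getErrorStream(), System.err);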
Next, the loops (of which you have two)
while ((line = reader.readLine ()) != null) {
System.out.println ("Stdout: " + line);
}
only exit when the reader, which reads from the process's standard output, returns end-of-file. This only happens when the bash process exits. It will not return end-of-file if there happens at present to be no more output from the process. Instead, it will wait for the next line of output from the process and not return until it has this next line.
Since you're sending two lines of input to the process before reaching this loop, the first of these two loops will hang if the process hasn't exited after these two lines of input. It will sit there waiting for another line to be read, but there will never be another line for it to read.
I compiled your source code (I'm on Windows at the moment, so I replaced /bin/bash with cmd.exe, but the principles should be the same), and I found that:
- after typing in two lines, the output from the first two commands appears, but then the program hangs,
- if I type in, say, echo test, and then exit, the program makes it out of the first loop since the cmd.exe process has exited. The program then asks for another line of input (which gets ignored), skips straight over the second loop since the child process has already exited, and then exits itself,
- if I type in exit and then echo test, I get an IOException complaining about a pipe being closed. This is to be expected: the first line of input caused the process to exit, and there's nowhere to send the second line.
I have seen a trick that does something similar to what you seem to want, in a program I used to work on. This program kept around a number of shells, ran commands in them and read the output from these commands. The trick used was to always write out a 'magic' line that marks the end of the shell command's output, and use that to determine when the output from the command sent to the shell had finished.
I took your code and I replaced everything after the line that assigns to writer with the following loop:
while (scan.hasNext()) {
    String input = scan.nextLine();
    if (input.trim().equals("exit")) {
        // Putting 'exit' amongst the echo --EOF--s below doesn't work.
        writer.write("exit\n");
    } else {
        writer.write("((" + input + ") && echo --EOF--) || echo --EOF--\n");
    }
    writer.flush();

    line = reader.readLine();
    while (line != null && !line.trim().equals("--EOF--")) {
        System.out.println("Stdout: " + line);
        line = reader.readLine();
    }
    if (line == null) {
        break;
    }
}
After doing this, I could reliably run a few commands and have the output from each come back to me individually.
The two echo --EOF-- commands in the line sent to the shell are there to ensure that output from the command is terminated with --EOF-- even in the result of an error from the command.
Of course, this approach has its limitations. These limitations include:
- if I enter a command that waits for user input (e.g. another shell), the program appears to hang,
- it assumes that each process run by the shell ends its output with a newline,
- it gets a bit confused if the command being run by the shell happens to write out a line --EOF--,
- bash reports a syntax error and exits if you enter some text with an unmatched ).
These points might not matter to you if whatever it is you're thinking of running as a scheduled task is going to be restricted to a command or a small set of commands which will never behave in such pathological ways.
EDIT: improve exit handling and other minor changes following running this on Linux.
I think you can use a daemon-like thread for reading your input; your output reader will already be in the while loop in the main thread, so you can read and write at the same time. You can modify your program like this:
Thread T = new Thread(new Runnable() {
    @Override
    public void run() {
        while (true) {
            String input = scan.nextLine();
            input += "\n";
            try {
                writer.write(input);
                writer.flush();
            } catch (IOException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
        }
    }
});
T.start();
and your reader will be the same as above, i.e.
while ((line = reader.readLine()) != null) {
    System.out.println("Stdout: " + line);
}
Make your writer final, otherwise it won't be accessible from the inner class.
You have writer.close(); in your code. So bash receives EOF on its stdin and exits. Then you get Broken pipe when trying to read from the stdout of the defunct bash.

Slow System Commands From Java

I am calling a bash script from Java.
The script does the following:
cat /home/user/Downloads/bigtextfile.txt | grep 'hello'
This particular command, when run from the command line, takes about 1 second to complete on the text file, which is 150 MB.
When calling the bash script via Java using the following call:
command = "sh /home/user/bashfiletocall"
p = Runtime.getRuntime().exec(command);
It takes so long to complete that I don't wait for it.
Am I doing something very wrong, and if not, can you explain the reason for the huge difference in performance?
NOTE: I was running it in NetBeans and this seems to be the problem. When I ran the file from the command line it was quick. The performance difference between execution in NetBeans and the command line is huge.
Many thanks.
private String executeCommand(String command) {
    StringBuilder output = new StringBuilder();
    BufferedReader reader = null;
    Process p;
    try {
        p = Runtime.getRuntime().exec(command);
        p.waitFor();
        reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
        String line = "";
        while ((line = reader.readLine()) != null) {
            output.append(line + "\n");
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
    return output.toString();
}
After starting your process you need to start reading from the input stream. Otherwise the buffers fill up and p.waitFor() waits forever.
Javadoc of the Process class:
Because some native platforms only provide limited buffer size for standard input and output streams, failure to promptly write the input stream or read the output stream of the subprocess may cause the subprocess to block, or even deadlock.
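In other words, drain the output first and call waitFor() afterwards. A minimal sketch of the reordered method follows; it keeps the structure of the question's code, and the fix is simply the order of operations.

private String executeCommand(String command) {
    StringBuilder output = new StringBuilder();
    try {
        Process p = Runtime.getRuntime().exec(command);
        try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                output.append(line).append('\n');   // drain stdout while the child runs
            }
        }
        p.waitFor();   // safe to wait now; the pipe can no longer fill up
    } catch (Exception e) {
        e.printStackTrace();
    }
    return output.toString();
}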

User input to the command line when using Runtime.getRuntime().exec(command);

I don't think this is possible, but I have been using:
Process p = Runtime.getRuntime().exec(command);
to run commands on the command line, but now I have come across a situation where the command I am running will, part way through, ask for some user input, for example a username.
This cannot be resolved by an argument to the command being executed; is there any way I can pass the username to the same command-line instance and continue?
---EDIT---
I still cant get this to work. These are the steps on the command line:
C:\someProgram.exe
Login:
Password:
So I need to pass the login and password when it prompts at runtime. Here is the code I've got that doesn't work:
try {
    String CMD = "\"C:\\someProgram\"";
    Scanner scan = new Scanner(System.in);
    ProcessBuilder builder = new ProcessBuilder(CMD);
    builder.redirectErrorStream(true);
    Process process = builder.start();

    InputStream is = process.getInputStream();
    BufferedReader reader = new BufferedReader(new InputStreamReader(is));
    OutputStream out = process.getOutputStream();
    BufferedWriter writer = new BufferedWriter(new OutputStreamWriter(out));

    String line;
    try {
        while (scan.hasNext()) {
            String input = scan.nextLine();
            if (input.toLowerCase().startsWith("login")) {
                writer.write("myUsername");
            } else if (input.toLowerCase().startsWith("password")) {
                writer.write("myPassword");
            }
            writer.flush();

            line = reader.readLine();
            while (line != null) {
                System.out.println("Stdout: " + line);
                line = reader.readLine();
            }
            if (line == null) {
                break;
            }
        }
        process.waitFor();
    } finally {
        writer.close();
        reader.close();
    }
} catch (Exception err) {
    System.err.println("some message");
}
I've tried things like:
writer.write("myUsername\n");
Any help? I can see that someProgram.exe is called and is running in the process list, but it just hangs.
Just write to p.getOutputStream(). That'll send the username to the process's standard input, which should do what you want.
out = p.getOutputStream();
out.write("fooUsername\n".getBytes());
out.flush();
You should redirect the command's input and send your parameters there. Use process.getOutputStream() (which is connected to the child's stdin), then write into this stream.
As part of your command String, if you are running on Unix/Linux/OSX (and maybe PowerShell), you could prepend the cat shell command to have the shell dump the contents of a file into the input stream for your intended executable to read as input.
A command something like cat user_input.txt | myAppWantsInput.pl.
This will take the content of user_input.txt and dump it into standard-in, so when myAppWantsInput.pl in your command executes and reads from standard-in, it will be reading the contents of the file as if it had been entered from the keyboard.
Of course, if you don't know a priori what values you intend to pass, you could generate the files you need dynamically before invoking the command. This won't work if you can't determine all the input you'll want before you run the command.
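A rough sketch of that idea on the Java side, assuming a POSIX shell is available; the file name and target program are the placeholders from the example above, not from the original question, and the snippet assumes the enclosing method declares throws IOException.

// Pre-generate the answers, then let the shell pipe them into the program.
Files.write(Paths.get("user_input.txt"),
        Arrays.asList("myUsername", "myPassword"), StandardCharsets.UTF_8);
Process p = new ProcessBuilder("sh", "-c", "cat user_input.txt | myAppWantsInput.pl")
        .redirectErrorStream(true)
        .start();

On Java 7+ you could also skip the cat entirely and use builder.redirectInput(new File("user_input.txt")), which feeds the file to the child's stdin directly.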
