How to use ExpectJ in Jython? - java

In our company we use Jython for some reason. I need to extend it with ExpectJ, but I could not figure out how to do it.
I managed to download the expectj-2.0.7.jar, expectj-2.0.7-sources.jar and expectj-2.0.7-javadoc.jar files and made them accessible to Jython and Java itself as well.
So I can import it in my Python script and the JVM also finds the jars (by using a classpath loader hack). But according to ExpectJ's docs, something is still wrong.
import expectj
ex = expectj.ExpectJ()  # I cannot use the second form of the
                        # constructor where I can pass a timeout
                        # parameter
sh = ex.spawn(targetshell, 22, usr, passw)  # There is no spawn method in
                                            # ExpectJ - but why???
This is where I'm getting stuck. Why doesn't the ExpectJ object have a spawn method? Does anyone have a solution for this?

The following solution ensures that a spawned process completes before the next command is executed. It guarantees the cycle 'send - expect - wait for completion of the command sent in send', then 'send again - expect again - wait for completion'.
To wait at the command prompt for the spawned process to finish executing, use shell.expect(""). If there are further ExpectJ send and expect calls after this, sequential execution is ensured. Without shell.expect(""), the next step shell.send("exit\n") is executed without waiting for the completion of the process that has already been spawned; in the following case, the scp command is allowed to run to completion before the next command is issued.
import expectj
expectinator = expectj.ExpectJ()
shell = expectinator.spawn(expectj.SshSpawn(host_name, 22, usr_name, ssh_pwd))
shell.send("scp -r src:usr:dest" + "\r")
shell.expect(remote_box_usr + "'s password:")
shell.send(ssh_pwd + "\r")
shell.expect("")
shell.send("exit\n")
shell.expectClose()

Related

Java/JavaFX ProcessHandle possibly not finding all processes (Linux/Debian)

I have a JavaFX application with a list of script files. Once the application loads, it reads the list and checks which ones are running.
To do that I use a ProcessHandle, as mentioned in various examples here on StackOverflow and other guides/tutorials on the internet.
The problem is, it never finds any of them. Therefore I programmatically started one which I know for a fact is running, via Process process = new ProcessBuilder("/path/to/file/my_script.sh").start(); - and it won't find this one either.
Contents of my_script.sh:
#!/bin/bash
echo "Wait for 5 seconds"
sleep 5
echo "Completed"
Java code:
// List of PIDs which correspond to the processes shown after "INFO COMMAND:"
System.out.println("ALL PROCESSES: " + ProcessHandle.allProcesses().toList());
Optional<ProcessHandle> scriptProcessHandle = ProcessHandle.allProcesses().filter(processHandle -> {
    System.out.println("INFO COMMAND: " + processHandle.info().command());
    Optional<String> processOptional = processHandle.info().command();
    return processOptional.isPresent() && processOptional.get().equals("my_script.sh");
}).findFirst();
System.out.println("Script process handle is present: " + scriptProcessHandle.isPresent());
if (scriptProcessHandle.isPresent()) { // Always false
    // Do stuff
}
Thanks to the good old fashioned System.out.println(), I noticed that I get this in my output console every time:
ALL PROCESSES: [1, 2, 28, 85, 128, 6944, 21174, 29029, 29071]
INFO COMMAND: Optional[/usr/bin/bwrap]
INFO COMMAND: Optional[/usr/bin/bash]
INFO COMMAND: Optional[/app/idea-IC/jbr/bin/java]
INFO COMMAND: Optional[/app/idea-IC/bin/fsnotifier]
INFO COMMAND: Optional[/home/username/.jdks/openjdk-17.0.2/bin/java]
INFO COMMAND: Optional[/usr/bin/bash]
INFO COMMAND: Optional[/home/username/.jdks/openjdk-17.0.2/bin/java]
INFO COMMAND: Optional[/home/username/.jdks/openjdk-17.0.2/bin/java]
INFO COMMAND: Optional[/usr/bin/bash]
Script process handle is present: false
The first line in the Javadoc of ProcessHandle.allProcesses() reads:
Returns a snapshot of all processes visible to the current process.
So how come I can't see the rest of the operating system's processes?
I'm looking for a non-OS-dependent solution, if possible. Why? For better portability and hopefully less maintenance in the future.
Notes:
A popular solution for GNU/Linux seems to be to check the /proc entries, but I don't know whether that would work for at least the majority of the most popular distributions - if it doesn't, adding support for them in a different way would create more testing and maintenance workload.
I'm aware of ps, windir, tasklist.exe possible solutions (worst comes to worst).
I found the JavaSysMon library but it seems dead and unfortunately:
CPU speed on Linux only reports correct values for Intel CPUs
Edit 1:
I'm on Pop!_OS and installed IntelliJ via the Pop!_Shop as a flatpak.
In order to start it as root as suggested by mr mcwolf, I went to /home/username/.local/share/flatpak/app/com.jetbrains.IntelliJ-IDEA-Community/x86_64/stable/active/export/bin and found com.jetbrains.IntelliJ-IDEA-Community file.
When I run sudo ./com.jetbrains.IntelliJ-IDEA-Community or sudo /usr/bin/flatpak run --branch=stable --arch=x86_64 com.jetbrains.IntelliJ-IDEA-Community in my terminal, I get error: app/com.jetbrains.IntelliJ-IDEA-Community/x86_64/stable not installed
So I opened the file and ran its contents:
exec /usr/bin/flatpak run --branch=stable --arch=x86_64 com.jetbrains.IntelliJ-IDEA-Community "$@"
This opens IntelliJ, but not as root, so instead I ran:
exec sudo /usr/bin/flatpak run --branch=stable --arch=x86_64 com.jetbrains.IntelliJ-IDEA-Community "$@"
Which prompts for a password and when I write it in, the terminal crashes.
Edit 1.1:
(╯°□°)╯︵ ┻━┻ "flatpak run" is not intended to be run with sudo
Edit 2:
As mr mcwolf said, I downloaded IntelliJ from the official website, extracted it, and ran idea.sh as root.
Now a lot more processes are shown. 1/3 of them show up as INFO COMMAND: Optional.empty.
scriptProcessHandle.isPresent() unfortunately still returns false. I searched through them and my_script.sh is nowhere to be found. I also tried processOptional.isPresent() && processOptional.get().equals("/absolute/path/to/my_script.sh"), but I still get false from isPresent() and it's not in the list of shown processes.
Though the last sentence might be a different problem. I'll do more digging.
Edit 3:
Combining .commandLine() and .contains() (instead of .equals()) solves the problem mentioned in "Edit 2".
Optional<ProcessHandle> scriptProcessHandle = ProcessHandle.allProcesses().filter(processHandle -> {
    System.out.println("INFO COMMAND LINE: " + processHandle.info().commandLine());
    Optional<String> processOptional = processHandle.info().commandLine();
    return processOptional.isPresent() && processOptional.get().contains("/absolute/path/to/my_script.sh");
}).findFirst();
System.out.println("Script process handle is present: " + scriptProcessHandle.isPresent());
if (scriptProcessHandle.isPresent()) { // Returns true
    // Do stuff
}
.commandLine() also shows script arguments, so that must be kept in mind.
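For reference, a quick hedged illustration of the difference between the two Info accessors used above; the sample outputs in the comments are made up, and on some platforms either value may be empty:

ProcessHandle.Info info = ProcessHandle.current().info();
System.out.println(info.command());     // executable only, e.g. Optional[/usr/bin/java]
System.out.println(info.commandLine()); // executable plus arguments, e.g. Optional[/usr/bin/java -jar app.jar --flag]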

How to wait for multi-threaded shell script execution to finish called inside my web service?

I have a Java RESTful service method which executes myscript.sh using ProcessBuilder. My script takes one input (example: myscript.sh /path/to-a/folder).
Inside the script is something like this:
-> execute a command which is multi-threaded, i.e. parallel processing
-> echo "my message"
Now when I call my script from a Linux command line it executes fine: first all the running threads finish, then some text output from the threaded command execution is shown on the terminal, and then the echoed message is shown.
But when I call the same script from Java using ProcessBuilder, the last echo message comes immediately and execution ends.
This is the way I call my script from Java:
ProcessBuilder processBuilder = new ProcessBuilder("/bin/bash","/path/to/myscript.sh","/path/to/folder/data");
Process proc = processBuilder.start();
StringBuffer output = new StringBuffer();
BufferedReader reader = new BufferedReader(new InputStreamReader(proc.getInputStream()));
String line = "";
while((line = reader.readLine()) != null){
output.append(line + "\n");
}
System.out.println("### " + output);
I don't know what's happening or how to debug it.
Can someone enlighten me on how to get the same behaviour from the shell script whether it is run from the terminal or from Java's ProcessBuilder?
Use ProcessBuilder.redirectErrorStream(boolean redirectErrorStream) with argument true to merge the errors into output. Alternatively, you could also use the shell command syntax cmd 2>&1 to merge the error with output.
These are some of the reasons why you may be getting the output of the last echo statement immediately (instead of the script taking time to run and returning proper results):
Missing environment variables
The launched bash needs to source .bashrc or some such resource file
The launched bash may not be running in right directory (you can set this in ProcessBuilder)
The launched bash may not be finding some script/executable in its PATH
The launched bash may not be finding proper libraries in the path for any of the executables
Once you merge the error stream, you will be able to debug and see the errors for yourself.
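As a hedged illustration of that first suggestion, here is a minimal sketch of the question's snippet with redirectErrorStream(true) enabled, so anything the script writes to stderr shows up in the stream that is already being read (the paths are the question's placeholders):

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class RunScript {
    public static void main(String[] args) throws Exception {
        ProcessBuilder processBuilder =
                new ProcessBuilder("/bin/bash", "/path/to/myscript.sh", "/path/to/folder/data");
        processBuilder.redirectErrorStream(true); // merge stderr into stdout
        Process proc = processBuilder.start();

        StringBuilder output = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(proc.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                output.append(line).append("\n"); // error messages are now visible here too
            }
        }
        int exitCode = proc.waitFor(); // wait for the script itself to finish
        System.out.println("### exit " + exitCode + "\n" + output);
    }
}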
In your context, separate processes may be spawned in two ways:
1) Bash
/path/to/executables/executable &
This will spawn executable as a new background process, and you need to wait for it to finish. Here's an answer that will help you.
2) Java
Process exec = Runtime.getRuntime().exec(command);
status = exec.waitFor();
Essentially, you need to wait for the process to end before you start reading its std/err streams.
If I understand the problem correctly, adding just this line to your code should suffice: status = exec.waitFor() (before you obtain the streams). A minimal sketch follows the Javadoc excerpt below.
Here's the JavaDoc for Process.waitFor() :
Causes the current thread to wait, if necessary, until the process represented by this Process object has terminated. This method returns immediately if the subprocess has already terminated. If the subprocess has not yet terminated, the calling thread will be blocked until the subprocess exits.
Returns:
the exit value of the subprocess represented by this Process object. By convention, the value 0 indicates normal termination.
Throws:
InterruptedException - if the current thread is interrupted by another thread while it is waiting, then the wait is ended and an InterruptedException is thrown
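A sketch of that waitFor() pattern, in the same bare-snippet style as the code above (imports and exception handling omitted). It assumes, which the answer does not state, that the command produces only a little output, so waiting before reading will not fill the pipe buffer and block:

// Hypothetical command, used only for illustration
Process exec = Runtime.getRuntime().exec(new String[]{"/bin/bash", "-c", "echo done; echo oops 1>&2"});

// Wait for the process to terminate before touching its streams.
// Assumption: output is small; for large output, read (or redirect) the
// streams first, otherwise the child may block on a full pipe buffer.
int status = exec.waitFor();

try (BufferedReader out = new BufferedReader(new InputStreamReader(exec.getInputStream()));
     BufferedReader err = new BufferedReader(new InputStreamReader(exec.getErrorStream()))) {
    String line;
    while ((line = out.readLine()) != null) System.out.println("out: " + line);
    while ((line = err.readLine()) != null) System.out.println("err: " + line);
}
System.out.println("exit status: " + status);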

How can I programmatically terminate a running process in the same script that started it?

How do I start processes from a script in a way that also allows me to terminate them?
Basically, I can easily terminate the main script, but terminating the external processes that this main script starts has been the issue. I googled like crazy for Perl 6 solutions. I was just about to post my question and then thought I'd open the question up to solutions in other languages.
Starting external processes is easy with Perl 6:
my $proc = shell("possibly_long_running_command");
shell returns a process object after the process finishes. So, I don't know how to programmatically find out the PID of the running process because the variable $proc isn't even created until the external process finishes. (side note: after it finishes, $proc.pid returns an undefined Any, so it doesn't tell me what PID it used to have.)
Here is some code demonstrating some of my attempts to create a "self destructing" script:
#!/bin/env perl6
say "PID of the main script: $*PID";
# limit run time of this script
Promise.in(10).then( {
    say "Took too long! Killing job with PID of $*PID";
    shell "kill $*PID"
} );
my $example = shell('echo "PID of bash command: $$"; sleep 20; echo "PID of bash command after sleeping is still $$"');
say "This line is never printed";
This results in the following output which kills the main script, but not the externally created process (see output after the word Terminated):
[prompt]$ ./self_destruct.pl6
PID of the main script: 30432
PID of bash command: 30436
Took too long! Killing job with PID of 30432
Terminated
[prompt]$ my PID after sleeping is still 30436
By the way, the PID of sleep was also different (i.e. 30437) according to top.
I'm also not sure how to make this work with Proc::Async. Unlike the result of shell, the asynchronous process object it creates doesn't have a pid method.
I was originally looking for a Perl 6 solution, but I'm open to solutions in Python, Perl 5, Java, or any language that interacts with the "shell" reasonably well.
For Perl 6, there seems to be the Proc::Async module
Proc::Async allows you to run external commands asynchronously, capturing standard output and error handles, and optionally write to its standard input.
# command with arguments
my $proc = Proc::Async.new('echo', 'foo', 'bar');
# subscribe to new output from out and err handles:
$proc.stdout.tap(-> $v { print "Output: $v" });
$proc.stderr.tap(-> $v { print "Error: $v" });
say "Starting...";
my $promise = $proc.start;
# wait for the external program to terminate
await $promise;
say "Done.";
Method kill:
kill(Proc::Async:D: $signal = "HUP")
Sends a signal to the running program. The signal can be a signal name ("KILL" or "SIGKILL"), an integer (9) or an element of the Signal enum (Signal::SIGKILL).
An example on how to use it:
#!/usr/bin/env perl6
use v6;
say 'Start';
my $proc = Proc::Async.new('sleep', 10);
my $promise= $proc.start;
say 'Process started';
sleep 2;
$proc.kill;
await $promise;
say 'Process killed';
As you can see, $proc has a method to kill the process.
Neither Perl, Perl 6, nor Java, but bash:
timeout 5 bash -c "echo hello; sleep 10; echo goodbye" &
In Java you can create a process like this:
ProcessBuilder processBuilder = new ProcessBuilder("C:\\Path\program.exe", "param1", "param2", "ecc...");
Process process = processBuilder.start(); // start the process
process.waitFor(timeLimit, timeUnit); // This causes the current thread to wait until the process has terminated or the specified time elapses
// when you want to kill the process
if (process.isAlive()) {
    process.destroy();
}
Or you can use process.destroyForcibly();, see the Process documentation for more info.
To execute a bash command point to the bash executable and set the command as a parameter.
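To make that last point concrete, here is a hedged sketch reusing the command from the bash answer above; the 5-second limit simply mirrors the `timeout 5` example and is not a requirement:

import java.util.concurrent.TimeUnit;

public class BashWithTimeout {
    public static void main(String[] args) throws Exception {
        // point to the bash executable and pass the command with -c
        ProcessBuilder processBuilder =
                new ProcessBuilder("/bin/bash", "-c", "echo hello; sleep 10; echo goodbye");
        Process process = processBuilder.start();

        // give it at most 5 seconds, mirroring `timeout 5` in the bash answer
        boolean finished = process.waitFor(5, TimeUnit.SECONDS);
        if (!finished) {
            process.destroy(); // ask it to terminate; destroyForcibly() if it ignores the request
            process.waitFor(); // wait until it is actually gone
        }
    }
}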

using python subprocess to run javaw.exe

I use javaw.exe in a Windows command prompt and it returns immediately after spawning my Swing java program.
But if I use Python's subprocess.call() to do the same thing, it hangs.
import subprocess
retval = subprocess.call(['javaw.exe','-jar','myjar.jar',arg1,arg2])
What am I doing wrong and why is there this difference?
subprocess.call will wait for the process (javaw) to complete, as it says in the docs:
Run the command described by args. Wait for command to complete, then return the returncode attribute.
You should probably use subprocess.Popen instead.
Check out the docs for replacing the os.spawn family:
pid = os.spawnlp(os.P_NOWAIT, "/bin/mycmd", "mycmd", "myarg")
==>
pid = Popen(["/bin/mycmd", "myarg"]).pid
In your case, this is probably
pid = subprocess.Popen(["javaw.exe", "-jar", "myjar.jar", arg1, arg2]).pid
perhaps adjusted to get the absolute path to javaw.exe, or shell=True, depending on your mood and needs.

Why does output appear in wrong order?

I'm trying to write a Groovy script that wraps another command and am having trouble with the stdout/stderr order. My script is below:
#!/usr/bin/env groovy
synchronized def output = ""
def process = "qrsh ${args.join(' ')}".execute()
def outTh = Thread.start {
    process.in.eachLine {
        output += it
        System.out.println "out: $it"
    }
}
def errTh = Thread.start {
    process.err.eachLine {
        output += it
        System.err.println "err: $it"
    }
}
outTh.join()
errTh.join()
process.waitFor()
System.exit(process.exitValue())
My problem is that the output doesn't appear on the terminal in the correct order. Below is the wrapper's output.
[<cwd>] wrap.groovy -cwd -V -now n -b y -verbose ant target
waiting for interactive job to be scheduled ...
Your interactive job 2831303 has been successfully scheduled.
Establishing builtin session to host <host> ...
Buildfile: build.xml
BUILD FAILED
Target "target" does not exist in the project "null".
Total time: 0 seconds
Your job 2831303 ("wrap.groovy") has been submitted
Below is the unwrapped command output.
[<cwd>] qrsh -cwd -V -now n -b y -verbose ant target
Your job 2831304 ("ant") has been submitted
waiting for interactive job to be scheduled ...
Your interactive job 2831303 has been successfully scheduled.
Establishing builtin session to host host ...
Buildfile: build.xml
BUILD FAILED
Target "target" does not exist in the project "null".
Total time: 0 seconds
Why does the "Your job has been submitted" message appear as the first line in one cast and the last line in another? I'm guessing it's related to Java libraries, not Groovy.
This is because of buffering. The threads which read stdout and stderr will not process the output the moment it is written by the child process. Instead, both streams are buffered, so your process won't see anything unless the child flushes the streams.
When the data is on the way, which thread gets the CPU first? There is no way to tell. Even if the data for stderr arrives a few milliseconds before stdout, if the stdout thread has the CPU right now, it will get its data first.
What you could do is use Java NIO (channels) and a single thread, and process all output from stderr first, but that still wouldn't guarantee that the order is preserved. Because of the buffering between child and parent process, you could get 4 KB of text from one stream before you see a single byte of the other.
Unfortunately, there is no cross-platform solution because Java doesn't have an API to merge the two streams into one. On Unix, you could run the command with sh -c cmd 2>&1. That would redirect stderr to stdout. In the parent process, you could then just read stdout and ignore stderr.
The same works for OS X (since it's Unix based). On Windows, you could install Perl or a similar tool to run the process; that allows you to mess with the file descriptors.
PS: Pray that args never contains spaces. String.execute() is a really bad way to run a process; use java.lang.ProcessBuilder instead.
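Not the answer author's code, just a hedged sketch of that PS: rebuilding the same qrsh invocation with java.lang.ProcessBuilder so that arguments containing spaces survive intact, and using redirectErrorStream(true) (mentioned in an earlier answer on this page) to read a single merged stream:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.List;

public class QrshWrapper {
    public static void main(String[] args) throws Exception {
        // each argument is passed as-is, so spaces inside args are preserved
        List<String> command = new ArrayList<>();
        command.add("qrsh");
        command.addAll(List.of(args));

        ProcessBuilder builder = new ProcessBuilder(command);
        builder.redirectErrorStream(true); // one stream carrying both stdout and stderr

        Process process = builder.start();
        try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
        System.exit(process.waitFor());
    }
}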
Try putting System.out.flush() after you do your println. If I am right, the messages are appearing in different orders because System.out is being buffered.
