I have a JavaFX application that holds a list of script files. Once the application loads, it reads the list and checks which of the scripts are running.
To do that I use a ProcessHandle, as shown in various examples here on Stack Overflow and in other guides/tutorials on the internet.
The problem is, it never finds any of them. Therefore I programmatically started one, which I know for a fact is running, via Process process = new ProcessBuilder("/path/to/file/my_script.sh").start(); - and it won't find this one either.
Contents of my_script.sh:
#!/bin/bash
echo "Wait for 5 seconds"
sleep 5
echo "Completed"
Java code:
// List of PIDs which correspond to the processes shown after "INFO COMMAND:"
System.out.println("ALL PROCESSES: " + ProcessHandle.allProcesses().toList());
Optional<ProcessHandle> scriptProcessHandle = ProcessHandle.allProcesses().filter(processHandle -> {
    System.out.println("INFO COMMAND: " + processHandle.info().command());
    Optional<String> processOptional = processHandle.info().command();
    return processOptional.isPresent() && processOptional.get().equals("my_script.sh");
}).findFirst();
System.out.println("Script process handle is present: " + scriptProcessHandle.isPresent());
if (scriptProcessHandle.isPresent()) { // Always false
    // Do stuff
}
Thanks to the good old fashioned System.out.println(), I noticed that I get this in my output console every time:
ALL PROCESSES: [1, 2, 28, 85, 128, 6944, 21174, 29029, 29071]
INFO COMMAND: Optional[/usr/bin/bwrap]
INFO COMMAND: Optional[/usr/bin/bash]
INFO COMMAND: Optional[/app/idea-IC/jbr/bin/java]
INFO COMMAND: Optional[/app/idea-IC/bin/fsnotifier]
INFO COMMAND: Optional[/home/username/.jdks/openjdk-17.0.2/bin/java]
INFO COMMAND: Optional[/usr/bin/bash]
INFO COMMAND: Optional[/home/username/.jdks/openjdk-17.0.2/bin/java]
INFO COMMAND: Optional[/home/username/.jdks/openjdk-17.0.2/bin/java]
INFO COMMAND: Optional[/usr/bin/bash]
Script process handle is present: false
The first line in the Javadoc of ProcessHandle.allProcesses() reads:
Returns a snapshot of all processes visible to the current process.
So how come I can't see the rest of the operating system's processes?
I'm looking for a non-OS-dependent solution, if possible. Why? For better portability and, hopefully, less maintenance in the future.
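As a side experiment (not part of the code above), the process I start programmatically can also be inspected through its own handle, which shows what command() and commandLine() actually report for a shell script. A minimal sketch, with a made-up class name:
import java.io.IOException;

public class ScriptHandleDebug {
    public static void main(String[] args) throws IOException {
        // Start the script directly and inspect its handle instead of searching
        // ProcessHandle.allProcesses() for it.
        Process process = new ProcessBuilder("/path/to/file/my_script.sh").start();
        ProcessHandle.Info info = process.toHandle().info();
        System.out.println("pid:         " + process.pid());
        System.out.println("command:     " + info.command());     // on Linux: the interpreter, e.g. Optional[/usr/bin/bash]
        System.out.println("commandLine: " + info.commandLine()); // interpreter + script path + arguments
    }
}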
Notes:
A popular solution for GNU/Linux seems to be checking the /proc entries, but I don't know whether that works on at least the majority of the most popular distributions - and if it doesn't, adding support for them in a different way would create more testing and maintenance workload.
I'm aware of the possible ps, windir, and tasklist.exe solutions (worst comes to worst).
I found the JavaSysMon library but it seems dead and unfortunately:
CPU speed on Linux only reports correct values for Intel CPUs
Edit 1:
I'm on Pop!_OS and installed IntelliJ via the Pop!_Shop as a Flatpak.
In order to start it as root, as suggested by mr mcwolf, I went to /home/username/.local/share/flatpak/app/com.jetbrains.IntelliJ-IDEA-Community/x86_64/stable/active/export/bin and found the com.jetbrains.IntelliJ-IDEA-Community file.
When I run sudo ./com.jetbrains.IntelliJ-IDEA-Community or sudo /usr/bin/flatpak run --branch=stable --arch=x86_64 com.jetbrains.IntelliJ-IDEA-Community in my terminal, I get error: app/com.jetbrains.IntelliJ-IDEA-Community/x86_64/stable not installed
So I opened the file and ran its contents:
exec /usr/bin/flatpak run --branch=stable --arch=x86_64 com.jetbrains.IntelliJ-IDEA-Community "$@"
This opens IntelliJ, but not as root, so instead I ran:
exec sudo /usr/bin/flatpak run --branch=stable --arch=x86_64 com.jetbrains.IntelliJ-IDEA-Community "$@"
That prompts for a password, and when I enter it, the terminal crashes.
Edit 1.1:
(╯°□°)╯︵ ┻━┻ "flatpak run" is not intended to be run with sudo
Edit 2:
As mr mcwolf said, I downloaded IntelliJ from the official website, extracted it, and ran idea.sh as root.
Now a lot more processes are shown. About a third of them show up as INFO COMMAND: Optional.empty.
Unfortunately, scriptProcessHandle.isPresent() still returns false. I searched through the listed processes and my_script.sh is nowhere to be found. I also tried processOptional.isPresent() && processOptional.get().equals("/absolute/path/to/my_script.sh"), but isPresent() is still false and the script is not in the list of shown processes.
Though that last part might be a different problem. I'll do more digging.
Edit 3:
Combining .commandLine() and .contains() (instead of .equals()) solves the problem mentioned in "Edit 2".
Optional<ProcessHandle> scriptProcessHandle = ProcessHandle.allProcesses().filter(processHandle -> {
    System.out.println("INFO COMMAND LINE: " + processHandle.info().commandLine());
    Optional<String> processOptional = processHandle.info().commandLine();
    return processOptional.isPresent() && processOptional.get().contains("/absolute/path/to/my_script.sh");
}).findFirst();
System.out.println("Script process handle is present: " + scriptProcessHandle.isPresent());
if (scriptProcessHandle.isPresent()) { // Returns true
    // Do stuff
}
.commandLine() also shows script arguments, so that must be kept in mind.
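If the substring check ever becomes too loose (for example, another process happening to carry the path inside a larger argument), a stricter variant is to compare against the individual argument values from info().arguments(). A sketch with a made-up class name, assuming arguments() is populated on the platform in question (it can be empty, in which case commandLine() remains the fallback):
import java.util.Arrays;
import java.util.Optional;

public class FindScriptByArgument {
    public static void main(String[] args) {
        // Match the script path as an exact argument value rather than as a
        // substring of the whole command line.
        Optional<ProcessHandle> scriptProcessHandle = ProcessHandle.allProcesses()
                .filter(handle -> handle.info().arguments()
                        .map(arguments -> Arrays.asList(arguments).contains("/absolute/path/to/my_script.sh"))
                        .orElse(false))
                .findFirst();
        System.out.println("Script process handle is present: " + scriptProcessHandle.isPresent());
    }
}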
Related
I'm trying to use Apache Commons Exec to run a git command which uses a regex.
When I form my CommandLine and print it out it looks like this:
[git, --no-pager, grep, --line-number, --untracked, --extended-regexp, "^\s*public void\s+(testFindByAdAccount).*", --, *Test.java]
However, when I execute this, git returns no results, resulting in an exit code of 1.
When I run this command manually, though, it returns plenty of results and succeeds. Changing the --extended-regexp argument to a plain string like testFindByAdAccount does yield results when run via Exec, so I think Apache Commons Exec is doing something to the regexp argument that makes it invalid. Any ideas what is going on?
EDIT: Adding a reproducible example
Clone https://github.com/ragurney/min-example
Run gradlew shadowJar to produce the jar file for the project
Run the app with java -jar app/build/libs/app-all.jar
Note the output, which shows that the printed command fails with an exit code of 1 (because the git command returns no results)
$ java -jar app/build/libs/app-all.jar
HELLOOOOOO
WD::: null
[git, --no-pager, grep, --line-number, --untracked, --extended-regexp, "^\s*public void\s+(testAppHasAGreeting)\(\).*", --, *Test.java]
WD::: /Users/rgurney/Src/personal/min-example
Exception in thread "main" java.lang.RuntimeException: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1)
at min.example.App.lambda$runCommand$1(App.java:74)
at io.vavr.control.Try.getOrElseThrow(Try.java:748)
Running the command manually does produce expected results:
$ git --no-pager grep --line-number --untracked --extended-regexp "^\s*public void\s+(testAppHasAGreeting)\(\).*" -- "*Test.java"
app/src/test/java/min/example/AppTest.java:11: public void testAppHasAGreeting() {
I got a clue as to what's going on here when the sample you provided worked just fine on my Windows laptop but failed on my Linux desktop.
Once I made sure the git version wasn't the culprit (tested several versions between 2.17 and 2.39 on both machines), I figured the difference must be in the way different shells handle quoting. Specifically, the only argument here that has any potential quoting issues is the regex ("^\s*public void\s+(testFindByAdAccount).*"), which is added to the command line by commandLine.addArgument(regex);.
addArgument may look innocuous, but under the hood it allows the CommandLine to handle the quoting itself (i.e., addArgument(String argument) calls addArgument(String argument, true)). Since you've handled the quoting yourself, you should not let the CommandLine handle it as well, and should explicitly call the two-argument version with false, i.e.:
public static List<String> grep(String regex, String filePattern, String wd) {
    CommandLine commandLine = CommandLine.parse("git");
    commandLine.addArgument("--no-pager");
    commandLine.addArgument("grep");
    commandLine.addArgument("--line-number");
    commandLine.addArgument("--untracked");
    commandLine.addArgument("--extended-regexp");
    commandLine.addArgument(regex, false);
    // Here -----------------------^
    commandLine.addArgument("--");
    commandLine.addArgument(filePattern);
    System.out.println(commandLine);
    return List.of(runCommand(commandLine, wd).split("\n"));
}
This takes the quote-handling logic away and ensures the same code runs smoothly both on Windows and Linux (at least those I've tested).
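For completeness, here is a self-contained variant of the fixed call. The runCommand helper from the original project is replaced with a plain DefaultExecutor, and the regex/file pattern values are just examples; the important part is still addArgument(regex, false):
import java.io.IOException;
import org.apache.commons.exec.CommandLine;
import org.apache.commons.exec.DefaultExecutor;

public class GitGrepExample {
    public static void main(String[] args) throws IOException {
        String regex = "^\\s*public void\\s+(testAppHasAGreeting)\\(\\).*";

        CommandLine commandLine = new CommandLine("git");
        commandLine.addArgument("--no-pager");
        commandLine.addArgument("grep");
        commandLine.addArgument("--line-number");
        commandLine.addArgument("--untracked");
        commandLine.addArgument("--extended-regexp");
        commandLine.addArgument(regex, false); // pass the pattern through verbatim, no extra quoting
        commandLine.addArgument("--");
        commandLine.addArgument("*Test.java");

        // Runs in this process's working directory; output goes to stdout/stderr.
        DefaultExecutor executor = new DefaultExecutor();
        int exitValue = executor.execute(commandLine); // throws ExecuteException on a non-zero exit
        System.out.println("git grep exit value: " + exitValue);
    }
}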
When I build my project with gradle, the test outputs are huge and I would like to keep them. Therefore, I activated showing the output streams:
test {
    testLogging.showStandardStreams = true
}
Unfortunately, Gradle does not seem to be able to handle big standard outputs. For demonstration, I created a minimal example: https://github.com/DaGeRe/stdout-test It is a project that creates big standard output with the following test:
@Test
public void test() {
    long start = System.currentTimeMillis();
    for (int i = 0; i < 200; i++) {
        for (int j = 0; j < 10000; j++) {
            long current = System.currentTimeMillis() - start;
            System.out.println("This is a simple logging output: " + i + " " + j + " " + current);
        }
    }
}
If I run this in a standard maven project, it finishes in about 2 minutes:
reichelt#reichelt-desktop:~/workspaces/stdout-test$ time mvn test &> mvn.txt
real 1m34,130s
user 0m31,333s
sys 1m12,296s
If I run it in Gradle by time ./gradlew test &> gradle.txt, it does not finish at all (in reasonable time) and the output contains many Expiring Daemon because JVM heap space is exhausted messages. A way to solve this temporarily would be to increase the heap memory (as suggested here: JVM space exhausted when building a project through gradle), but -Xmx4g does not change anything according to my experiments, and this obviously will not scale for bigger outputs. Also, running ./gradlew -i test does not change the behavior.
The project also contains example output files from Maven (https://github.com/DaGeRe/stdout-test/blob/master/mvn.txt) and Gradle (https://github.com/DaGeRe/stdout-test/blob/master/gradle.txt - process aborted after ~10 minutes), and one of the heap dumps Gradle created (https://github.com/DaGeRe/stdout-test/blob/master/java_pid10812.hprof.tar). There are only minor increases in the current time in the Gradle log (the third number of every output line). Therefore, I assume that Gradle mainly has problems printing to stdout, not executing the program.
This shows that, while Gradle has some problems printing to stdout, it does not seem to block the test execution. Is there any switch or parameter I could give Gradle that forces it to print directly to stdout instead of doing its memory-intensive processing? Unfortunately, I did not find any in the documentation (https://docs.gradle.org/current/dsl/org.gradle.api.tasks.testing.Test.html).
EDIT: Just finished a test run on a server:
reichelt#r147:~/workspaces/dissworkspace/stdout-test$ time ./gradlew test &> gradle.txt
real 28m17,959s
user 216m37,351s
sys 0m12,410s
Ends with an exception:
This is a simple logging output: 89 7842 1416
This is a simple logging output: 89 7843 1416
This is a simple logging output: 89 7844 1416
This is a simple logging output: 89 7845 1416
This is a simple logging output: 89 7846 1416
FAILURE: Build failed with an exception.
* What went wrong:
GC overhead limit exceeded
I don't really have a solution for you, but I want to share a few observations that are too long to put into a comment.
First of all, I can reproduce your OutOfMemory problem from your GitHub repository. I googled it a bit, and while there are other reports of OOM with this setting out there, none had a solution. I think it is just a limitation in Gradle when enabling showStandardStreams. I tried fiddling with the console output type and a few other parameters, but none had an effect.
However, with showStandardStreams disabled, I did not get an OOM, not even after bumping the number of iterations from the 200*10000 you specified to 1000*10000. It worked fine, and the output got saved to .bin, .xml, and .html files for later inspection.
What's more, Gradle ran it more than twice as fast as Maven on my machine:
λ time ./gradlew test &> gradle.txt
real 1m23.113s
user 0m0.015s
sys 0m0.031s
λ time mvn test &> mvn.txt
real 3m6.671s
user 0m0.183s
sys 0m0.566s
Not sure why there is such a big difference between the two.
While I completely agree that it would be nice to use showStandardStreams for large outputs, just like Maven does by default, it appears that is just not possible unless you can afford to raise the maximum heap size accordingly. On the other hand, having the output saved in the report is also rather nice, which is something you don't get from the Surefire plugin in Maven.
I am working on an application which first requires checking the available free disk space before running any operation. We have set a default required space limit of 512 MB, so if the working drive does not have more than 512 MB of free space, my program should prompt that there is not enough disk space available and that sufficient space must be freed to run the program.
I am using the following code for it:
long freeSpace = FileSystemUtils.freeSpaceKb() * 1024;
Here I am converting the size into bytes first, to compare it with our required size.
The above statement gives me the following exception:
Error - Command line returned OS error code '3' for command [cmd.exe, /C, dir /-c "F:\MyApp\"]

Stacktrace:
java.io.IOException: Command line returned OS error code '3' for command [cmd.exe, /C, dir /-c "F:\MyApp"]
    at org.apache.commons.io.FileSystemUtils.performCommand(FileSystemUtils.java:506)
    at org.apache.commons.io.FileSystemUtils.freeSpaceWindows(FileSystemUtils.java:303)
    at org.apache.commons.io.FileSystemUtils.freeSpaceOS(FileSystemUtils.java:270)
    at org.apache.commons.io.FileSystemUtils.freeSpaceKb(FileSystemUtils.java:206)
    at org.apache.commons.io.FileSystemUtils.freeSpaceKb(FileSystemUtils.java:240)
    at org.apache.commons.io.FileSystemUtils.freeSpaceKb(FileSystemUtils.java:222)
    ...
The OS returned error code '3', which means it was not a normal termination.
So how can I resolve this issue?
I also found an alternative method, available since Java 1.6 - How to find how much disk space is left using Java?
new File("c:\\").getFreeSpace();
More details:
OS Architecture : amd64
Temp Dir : c:\temp\
OS Name : Windows 7
OS Version : 6.1 amd64
Jre Version : 1.6.0_45-b06
User Home : C:\Users\Tej.Kiran
User Language : en
User Country: US
File Separator : \
Current Working Directory : F:\MyApp\
You can try executing that command from a prompt. Run cmd.exe and enter the following:
cmd.exe /C dir /-c "F:\MyApp\"
echo %errorlevel%
Error code 3 means the path doesn't exist, but in this case I wonder if it is related to permissions. Any non-zero errorlevel is a problem. If your Java app needs to know the free space on the drive it is installed on, you can do something like this:
// returns something like "file:/C:/MyApp/my/pkg/MyClass.class"
// -OR- "jar:file:/C:/MyApp/myjar.jar!/my/pkg/MyClass.class"
String myPath = my.pkg.MyClass.class.getResource("MyClass.class").toString();
int start = myPath.indexOf("file:/") + 6;
FileSystemUtils.freeSpaceKb(myPath.substring(start, myPath.indexOf("/", start)));
Obviously this code wouldn't work in an applet, but that shouldn't be surprising. The substring logic should also be more robust, but this is just a simple example.
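If you would rather avoid the string slicing, a variation (sketched here with a made-up class name; getProtectionDomain()/getCodeSource() can return null for some class loaders) is to resolve the code location to a File and ask it for the usable space on its partition directly:
import java.io.File;
import java.net.URISyntaxException;

public class InstallDriveSpace {
    public static void main(String[] args) throws URISyntaxException {
        // Resolve the directory or jar this class was loaded from, then query the
        // partition it lives on. Works for both exploded classes and jars.
        File codeLocation = new File(InstallDriveSpace.class
                .getProtectionDomain()
                .getCodeSource()
                .getLocation()
                .toURI());
        long freeBytes = codeLocation.getUsableSpace();
        System.out.println("Usable space on install drive: " + (freeBytes / (1024 * 1024)) + " MB");
    }
}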
In our company we use Jython for some reason. I need to extend it with ExpectJ, but I could not figure out how to do it.
I managed to download the expectj-2.0.7.jar, expectj-2.0.7-sources.jar and expectj-2.0.7-javadoc.jar files and made them accessible to Jython and Java itself as well.
So I can import it in my python script and JVM also finds the jars (by using a classpath loader hack). But according to ExpectJ's docs, something is still wrong.
import expectj

ex = expectj.ExpectJ()  # I cannot use the second form of the
                        # constructor where I can pass a timeout
                        # parameter
sh = ex.spawn(targetshell, 22, usr, passw)  # There is no spawn method in
                                            # ExpectJ - but why???
This is where I'm getting stuck. Why doesn't the ExpectJ object have a spawn method? Does anyone have a solution for this?
The following solution ensures that a spawned process completes before the next command is executed. It guarantees the cycle 'send - expect - wait for completion of the command sent' and then 'send again - expect again - wait for completion'.
To wait for the command prompt to finish executing the spawned process, use shell.expect(""). If there are further ExpectJ send and expect commands after this, sequential execution is ensured. Without shell.expect(""), the next step shell.send("exit\n") would be executed without waiting for the completion of the process already spawned; in the following case, the scp command is allowed to run to completion before the next command is issued.
import expectj
expectinator = expectj.ExpectJ();
shell = expectinator.spawn(expectj.SshSpawn(host_name, 22, usr_name, ssh_pwd));
shell.send("scp -r src:usr:dest" + "\r")
shell.expect(remote_box_usr + "'s password:");
shell.send(ssh_pwd + "\r");
shell.expect("");
shell.send("exit\n");
shell.expectClose();
I'm trying to write a Groovy script that wraps another command and am having trouble with the stdout/stderr order. My script is below:
#!/usr/bin/env groovy
synchronized def output = ""
def process = "qrsh ${args.join(' ')}".execute()
def outTh = Thread.start {
    process.in.eachLine {
        output += it
        System.out.println "out: $it"
    }
}
def errTh = Thread.start {
    process.err.eachLine {
        output += it
        System.err.println "err: $it"
    }
}
outTh.join()
errTh.join()
process.waitFor()
System.exit(process.exitValue())
My problem is that the output doesn't appear on the terminal in the correct order. Below is the wrapper's output.
[<cwd>] wrap.groovy -cwd -V -now n -b y -verbose ant target
waiting for interactive job to be scheduled ...
Your interactive job 2831303 has been successfully scheduled.
Establishing builtin session to host <host> ...
Buildfile: build.xml
BUILD FAILED
Target "target" does not exist in the project "null".
Total time: 0 seconds
Your job 2831303 ("wrap.groovy") has been submitted
Below is the unwrapped command output.
[<cwd>] qrsh -cwd -V -now n -b y -verbose ant target
Your job 2831304 ("ant") has been submitted
waiting for interactive job to be scheduled ...
Your interactive job 2831303 has been successfully scheduled.
Establishing builtin session to host host ...
Buildfile: build.xml
BUILD FAILED
Target "target" does not exist in the project "null".
Total time: 0 seconds
Why does the "Your job has been submitted" message appear as the first line in one cast and the last line in another? I'm guessing it's related to Java libraries, not Groovy.
This is because of buffering. The threads which read stdout and stderr will not process the output the moment it is written by the child process. Instead, both streams are buffered, so your process won't see anything unless the child flushes the streams.
When the data is on the way, which thread gets the CPU first? There is no way to tell. Even if the data for stderr arrives a few milliseconds before stdout, if the stdout thread has the CPU right now, it will get its data first.
What you could do is use Java NIO (channels) with a single thread and process all output from stderr first, but that still wouldn't guarantee that the order is preserved. Because of the buffering between the child and parent process, you could get 4 KB of text from one stream before you see a single byte of the other.
Unfortunately, there is no cross-platform fix once you already have two separate streams, because Runtime.exec() and Groovy's String.execute() give you no way to merge them into one after the fact. On Unix, you could run the command with sh -c 'cmd 2>&1'. That redirects stderr to stdout; in the parent process, you could then just read stdout and ignore stderr.
The same works for OS X (since it's Unix based). On Windows, you could install Perl or a similar tool to run the process; that allows you to mess with the file descriptors.
PS: Pray that args never contains spaces. String.execute() is a really bad way to run a process; use java.lang.ProcessBuilder instead.
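A sketch of that last suggestion in plain Java: ProcessBuilder takes the arguments as a list (so spaces survive), and redirectErrorStream(true) merges stderr into stdout on the Java side, much like the 2>&1 redirect above, so only one stream has to be read. The qrsh command and the class name are assumptions for illustration:
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class QrshWrapper {
    public static void main(String[] args) throws IOException, InterruptedException {
        // Build the argument list explicitly so arguments containing spaces stay intact.
        List<String> command = new ArrayList<>();
        command.add("qrsh");
        command.addAll(Arrays.asList(args));

        ProcessBuilder builder = new ProcessBuilder(command);
        builder.redirectErrorStream(true); // merge stderr into stdout, like 2>&1

        Process process = builder.start();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
        System.exit(process.waitFor());
    }
}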
Try putting a System.out.flush() after you do your println. If I am right, the messages are appearing in different orders because System.out is being buffered.