I'm trying to stop and remove containers programmatically from a Kotlin process that executes the following command: docker-compose down --remove-orphans, but the process never terminates, even after I call process.waitFor(). In contrast, when I execute the same command in a PowerShell/cmd window, it terminates successfully.
Does anyone know whether there are any issues with 'Runtime.getRuntime().exec()' or the docker-compose down command, and whether there's another way to execute cmd/PowerShell commands (and receive their exit value) in Kotlin?
Thanks in advance.
My code:
val process: Process?
val sb = StringBuilder()
val command = "powershell.exe -command docker-compose down --remove-orphans"
process = Runtime.getRuntime().exec(command)
process.waitFor()
val processStream: InputStream = if (process.exitValue() == 0) {
    process.inputStream
} else {
    process.errorStream
}
Scanner(processStream).use {
    while (it.hasNextLine())
        sb.append(it.nextLine() + System.lineSeparator())
}
println(sb.toString())
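One thing worth trying (a minimal sketch, not a confirmed fix): drain the process output before calling waitFor(), because a child process can block forever once its output pipe fills up and nothing reads it. The sketch below uses ProcessBuilder with redirectErrorStream(true) and a bounded wait; the 120-second timeout is an arbitrary choice, not something from the original question:
import java.util.concurrent.TimeUnit

// Sketch: read the output *before* waiting, so a full output pipe cannot block docker-compose.
val process = ProcessBuilder("powershell.exe", "-command", "docker-compose", "down", "--remove-orphans")
    .redirectErrorStream(true) // merge stderr into stdout
    .start()

// Consume the merged output while the command runs.
val output = process.inputStream.bufferedReader().readText()

// Wait with a timeout instead of indefinitely.
val finished = process.waitFor(120, TimeUnit.SECONDS)
println(output)
println(if (finished) "exit code: ${process.exitValue()}" else "timed out")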
I'm currently trying to execute a .jar file programmatically. But to test out Java, I tried running the following:
val p = ProcessBuilder("cmd.exe", "/c", "java", "-version").start()
val results: List<String> = p.inputStream.bufferedReader().readLines()
assertThat("Results should contain java version: ", results, hasItem(containsString("java version")))
However, there is no output.
I am successfully able to run:
val pb = ProcessBuilder("cmd.exe", "/c", "echo", "hello world")
I have tried adding a working directory where the java executable is located, but nothing happens.
I am running out of ideas on how to make this work. If I open cmd and type java -version, I get the version information.
What else could I do to get this to work?
The java -version command writes its output to standard error, so it appears on Process.errorStream, not Process.inputStream.
Try this code:
val results: List<String> = p.errorStream.bufferedReader().readLines()
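For completeness, the question's test with that single line swapped in might look like this (a sketch; it assumes Hamcrest is on the test classpath, as implied by the original assertion):
import org.hamcrest.MatcherAssert.assertThat
import org.hamcrest.Matchers.containsString
import org.hamcrest.Matchers.hasItem

// 'java -version' writes its banner to stderr, so read errorStream instead of inputStream.
val p = ProcessBuilder("cmd.exe", "/c", "java", "-version").start()
val results: List<String> = p.errorStream.bufferedReader().readLines()
p.waitFor()
assertThat("Results should contain java version: ", results, hasItem(containsString("java version")))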
You may also try the Koproc lib.
It's a small Kotlin library for running processes and executing commands, based on Java's ProcessBuilder.
You can run a java process with a timeout of 120 seconds:
val koproc = "java -jar some.jar".startProcess { timeoutSec = 120 }
koproc.use {
println("Out: ${it.readAvailableOut}")
println("Err: ${it.readAvailableErrOut}")
}
println("Full result after closing: ${koproc.result}")
Run a cmd command:
// 'cmd.exe' process will be closed after timeout
val commandResult = "cmd.exe dirs".startCommand { timeoutSec = 1 }
// But you will get the output
println("Out: ${commandResult.out}")
See examples in unit tests: https://github.com/kochetkov-ma/koproc/blob/main/src/test/kotlin/ru/iopump/koproc/ExtensionKtIT.kt
I am facing an issue that I'm not able to figure out. What I want to achieve is a Gradle task that spawns a docker-compose process running an MSSQL server, and then uses Liquibase to run all migrations and seed the database.
The problem is that Docker takes some time to get the server up, and Liquibase runs before it is ready.
What I did was start docker-compose detached using the -d flag, and then use a loop to ping the server until port 1433 responds, and only then let Gradle continue with the other dependent tasks (which actually create and seed the database).
Here is what I did:
task checkDbStatusAndGetsItUp() {
    group "localEnvironment"
    description "Check current local db is up or sets it up"
    dependsOn 'cloneEntityProject'

    println 'Checking db Status and setting it up'
    println '---------------------------'

    def stdoutDocker = new ByteArrayOutputStream()
    exec {
        executable 'sh'
        args "-c", """
            docker ps | grep microsoft | wc -c
        """
        standardOutput = stdoutDocker
    }

    doLast {
        if (stdoutDocker.toString().trim() == '0') {
            exec {
                executable 'sh'
                workingDir 'setup/dp-entidades'
                args "-c", """
                    docker-compose up -d
                """
            }
        }

        def shouldStop = false
        while (shouldStop == false) {
            def stdoutPing = new ByteArrayOutputStream()
            exec {
                workingDir 'setup/dp-entidades'
                executable 'sh'
                args """
                    nc -zv localhost 1433
                """
                ignoreExitValue = true
                standardOutput = stdoutPing
            }
            println stdoutPing.toString()
            sleep(1000)
        }
    }
}
What I get from the above code is a loop showing that Docker never gets the server up. But if I open another terminal and ping it manually, it works, and the database is actually up. (I even tried telnet, with the same results.)
What do I need to do to make the ping work from Gradle and, on a successful connection to the database, let the task continue?
The -c flag of sh is missing in the last exec block. Another problem is that you never set shouldStop to true, so the loop will never terminate. You can, for example, check the exit status of exec:
def result = exec { ... }
shouldStop = result.exitValue == 0
Note that you should also limit the number of tries to propagate server failure instead of waiting forever.
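For illustration, here is a rough sketch of what the corrected polling loop could look like. It is written with Gradle's Kotlin DSL (the question uses the Groovy DSL); the task name, the 30-attempt cap, and nc -z are assumptions, not part of the original build:
// Sketch: poll port 1433 until the SQL Server container answers, with a bounded number of attempts.
tasks.register("waitForDb") {
    doLast {
        var attempts = 0
        var up = false
        while (!up && attempts < 30) {
            val result = project.exec {
                commandLine("sh", "-c", "nc -z localhost 1433")
                isIgnoreExitValue = true   // a failed ping is expected while the server is still starting
            }
            up = result.exitValue == 0     // stop as soon as the port answers
            if (!up) {
                attempts++
                Thread.sleep(1000)
            }
        }
        if (!up) throw GradleException("Database did not come up after $attempts attempts")
    }
}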
I have a Java RESTful service method which executes myscript.sh using ProcessBuilder. My script takes one input (for example: myscript.sh /path/to-a/folder).
Inside the script is something like this:
-> execute a command which is multithreaded, i.e. parallel processing
-> echo "my message"
Now, when I call my script from a Linux command line, it executes fine: first all the running threads finish, then some text output from the threaded command execution is shown on the terminal, and then the echoed message is shown.
But when I call the same script from Java using ProcessBuilder, the last echo message comes immediately and execution ends.
This is how I call my script from Java:
ProcessBuilder processBuilder = new ProcessBuilder("/bin/bash", "/path/to/myscript.sh", "/path/to/folder/data");
Process proc = processBuilder.start();

StringBuffer output = new StringBuffer();
BufferedReader reader = new BufferedReader(new InputStreamReader(proc.getInputStream()));
String line = "";
while ((line = reader.readLine()) != null) {
    output.append(line + "\n");
}
System.out.println("### " + output);
I don't know what's happening, or how to debug it.
Can someone enlighten me on how to get the same behaviour from the shell script when it is run from Java's ProcessBuilder as when it is run from the terminal?
Use ProcessBuilder.redirectErrorStream(boolean redirectErrorStream) with the argument true to merge stderr into stdout. Alternatively, you could use the shell syntax cmd 2>&1 to achieve the same thing.
These are some of the reasons why you may be immediately getting the output of the last echo statement (instead of the script taking time to run and returning proper results):
Missing environment variables
The launched bash needs to source .bashrc or some such resource file
The launched bash may not be running in right directory (you can set this in ProcessBuilder)
The launched bash may not be finding some script/executable in its PATH
The launched bash may not be finding proper libraries in the path for any of the executables
Once you merge the error stream, you will be able to debug and see the errors for yourself.
In your context, separate processes may be spawned in two ways:
1) Bash
/path/to/executables/executable &
This will spawn a new process for executable, and you need to wait for it to finish. Here's an answer that will help you.
2) Java
Process exec = Runtime.getRuntime().exec(command);
status = exec.waitFor();
Essentially, you need to wait for the process to end before you start reading its std/err streams.
If I understand the problem correctly, adding just this line to your code should suffice: status = exec.waitFor() (Before you obtain the streams)
Here's the JavaDoc for Process.waitFor():
Causes the current thread to wait, if necessary, until the process represented by this Process object has terminated. This method returns immediately if the subprocess has already terminated. If the subprocess has not yet terminated, the calling thread will be blocked until the subprocess exits.
Returns:
the exit value of the subprocess represented by this Process object. By convention, the value 0 indicates normal termination.
Throws:
InterruptedException - if the current thread is interrupted by another thread while it is waiting, then the wait is ended and an InterruptedException is thrown
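Putting the two suggestions together, the launch code could look roughly like this (rendered in Kotlin here; the equivalent Java calls are the same). Note that this sketch reads the output before waitFor(), so a script that prints a lot cannot fill the pipe and block:
// Sketch: merge stderr into stdout, read everything the script prints, then wait for it to exit.
val proc = ProcessBuilder("/bin/bash", "/path/to/myscript.sh", "/path/to/folder/data")
    .redirectErrorStream(true)   // script errors show up in the same stream
    .start()

val output = proc.inputStream.bufferedReader().readText()
val status = proc.waitFor()      // block until the script finishes

println("### exit=$status")
println(output)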
I am trying to run a script with apache-commons-exec, using the Java approach to launch it. The script is executed on the production server (Linux), but I need to test it on my localhost to check that everything works OK.
Here is my code to launch Cygwin. The command works in cmd.exe, but it does not work when I launch it through commons-exec:
OutputStream outputStream = new ByteArrayOutputStream();
DefaultExecutor exec = new DefaultExecutor();
exec.setWatchdog(new ExecuteWatchdog(1000));
PumpStreamHandler streamHandler = new PumpStreamHandler(outputStream);
exec.setStreamHandler(streamHandler);
CommandLine cmdLine = CommandLine.parse("C:\\cygwin64\\bin\\bash");
cmdLine.addArgument("-c");
cmdLine.addArgument("/cygdrive/c/dev/launch.sh");
int exit = exec.execute(cmdLine);
logger.warn("Job exit: " + exit);
It returns 1 with no output and no error log:
org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1)
Is there anything missing? How can I catch the output properly?
A bit of a guess, but this might help.
Sometimes an exit code of 1 represents "success". However, Apache Commons Exec by default interprets an exit code of 1 as a failure, and will throw an ExecuteException if the script in question exits with an exit code of 1.
You can tell your DefaultExecutor that "exit code = 1 = success" using the following code:
exec.setExitValue(1);
Might not be the reason but worth a go.
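On the second part of the question (catching the output): the PumpStreamHandler already copies everything into the ByteArrayOutputStream, so the captured text can be printed once execute() returns. A rough sketch combining that with setExitValue, rendered in Kotlin; the 60-second watchdog is an assumption (the original 1000 ms watchdog would kill any script that runs longer than a second):
import org.apache.commons.exec.CommandLine
import org.apache.commons.exec.DefaultExecutor
import org.apache.commons.exec.ExecuteWatchdog
import org.apache.commons.exec.PumpStreamHandler
import java.io.ByteArrayOutputStream

// Sketch: capture stdout/stderr into a buffer and treat exit code 1 as success.
val outputStream = ByteArrayOutputStream()
val executor = DefaultExecutor()
executor.setExitValue(1)                          // exit code 1 no longer throws ExecuteException
executor.watchdog = ExecuteWatchdog(60_000)       // give the script time to finish (assumed value)
executor.streamHandler = PumpStreamHandler(outputStream)

val cmdLine = CommandLine("C:\\cygwin64\\bin\\bash").apply {
    addArgument("-c")
    addArgument("/cygdrive/c/dev/launch.sh")
}

val exit = executor.execute(cmdLine)
println("Job exit: $exit")
println("Captured output: $outputStream")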
I want to perform an svn delete from my Grails app. I tested out both of the following in the Grails console:
"svn delete /usr/share/mydir".execute()
Runtime.getRuntime().exec("svn delete /usr/share/mydir")
In both cases, an instance of java.lang.Process is returned, but the command does not get executed (/usr/share/mydir is not deleted).
This behaviour only happens when the app is running on Linux (Ubuntu). If I run it on Windows, the command does get executed.
Update
Following Tim's advice in the comments, I changed the command so that it captures the process output:
def process = "svn delete /usr/share/mydir".execute()
def out = new StringBuilder()
process.waitForProcessOutput(out, new StringBuilder())
println "$out"
I now see that it's failing because:
error svn: Can't open file '/usr/share/mydir/.svn/lock': Permission denied
The below code works fine for me on CentOS.
def scriptCom = "/folderlocation/shellscript.sh"
println "[[Running $scriptCom]]"

def proc = scriptCom.execute()
def oneMinute = 60000
proc.waitForOrKill(oneMinute)   // wait up to one minute, then kill the process

if (proc.exitValue() != 0) {
    println "[[return code: ${proc.exitValue()}]]"
    println "[[stderr: ${proc.err.text}]]"
    return null
} else {
    def output = proc.in.text   // read stdout once and reuse it
    println "[[stdout: $output]]"
    return output.readLines()
}