I am trying to execute a script file (located at /home/id/scripts/) from my Java code. Below is my Java code:
Process process = null;
String scriptfileName = "myScript.sh";
String executeCmd = "/home/id/scripts/" + scriptfileName;
process = new ProcessBuilder(executeCmd).start();
When I try to run the script using the above code, only the first few lines get executed. For example, I placed two echo statements: only the first one gets printed, and the lines below it, which contain database update statements, do not execute. If I run the same script file directly with the command ksh scriptfileName, it executes successfully and updates the DB.
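For reference, a minimal sketch of launching the script through ksh (the way it runs fine manually) while consuming its output and waiting for it to finish; the paths are the ones from the question, and the reading/waiting part is an assumption about what may be missing:

import java.io.BufferedReader;
import java.io.InputStreamReader;

String scriptfileName = "myScript.sh";
ProcessBuilder pb = new ProcessBuilder("ksh", "/home/id/scripts/" + scriptfileName);
pb.redirectErrorStream(true);                      // surface the script's errors on stdout

Process process = pb.start();
try (BufferedReader reader =
         new BufferedReader(new InputStreamReader(process.getInputStream()))) {
    String line;
    while ((line = reader.readLine()) != null) {
        System.out.println(line);                  // echo/DB-update output from the script
    }
}
int exitCode = process.waitFor();
System.out.println("script exited with " + exitCode);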
I'm trying to train a neural network to play Halite 3. The provided interface is a bash script which:
1. compiles my bot
2. calls the binary file with a string that runs the bot (java myBot)
I'm trying to run this script from Java to train the network.
I've tried using a ProcessBuilder to run the script as well as the binary referenced in the script. Running the script produces no output, and using echo I've determined that the program terminates when javac is called in the script. If I remove that call, it terminates when the program is run.
I've also tried calling the binary directly with ProcessBuilder, and this does indeed produce output. The issue is that it doesn't run the bots properly, saying it can't find the file. I've tried changing the path to be relative to different directory levels, as well as using the absolute path (the java command doesn't seem to like absolute paths?).
Calling the binary directly:
List<String> cmd = new ArrayList<>();
cmd.add(dir+ "/src/halite");
// Replay
cmd.add("--replay-directory");
cmd.add(dir+"/replays/");
// Options
cmd.add("--results-as-json");
cmd.add("--no-logs");
// Dimensions
cmd.add("--width");
cmd.add("16");
cmd.add("--height");
cmd.add("16");
// Players
cmd.add("\"java -cp . myBot\"");
cmd.add("\"java -cp . myBot\"");
Process proc = new ProcessBuilder(cmd).start();
InputStream is = proc.getInputStream();
Scanner s = new Scanner(is);
while (s.hasNext()) {
    System.out.println(s.next());
}
This code does produce JSON output; however, I get an error in my logs saying that the bots do not run.
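Since ProcessBuilder passes each list element to the child process as-is (no shell is involved), the escaped quotes around the bot command end up as literal characters in the argument. A hedged sketch of the same call with each player command as one plain, unquoted argument (dir stands for the same directory variable as above, and the omitted flags are unchanged):

import java.io.File;
import java.util.ArrayList;
import java.util.List;

String dir = "/path/to/project";       // stand-in for the dir variable used above
List<String> cmd = new ArrayList<>();
cmd.add(dir + "/src/halite");
// ... replay, option and dimension flags exactly as in the original snippet ...
cmd.add("java -cp . myBot");           // one argument, no extra quote characters
cmd.add("java -cp . myBot");

Process proc = new ProcessBuilder(cmd)
        .directory(new File(dir))      // resolve relative paths such as "." from dir
        .redirectErrorStream(true)
        .start();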
We are creating a GroovyShell object and passing the bindings to the shell, then parsing the Groovy code using the shell and initializing a Script object as below:
GroovyShell shell = new GroovyShell(binding);
Script script = shell.parse(groovyCode); // groovyCode: placeholder for the Groovy source string
Then we store the Script object in a ConcurrentHashMap, fetch the script from this map and run it with script.run(). But the Groovy code in the script does not execute completely in, say, 1 run out of 100. We placed logs in the Groovy code which show that the code did not run completely, and no exception is thrown.
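A rough reconstruction of the pattern being described, assuming one parsed Script instance is cached in the map and then run from several threads (the key and variable names are illustrative, not taken from the actual code):

import groovy.lang.Binding;
import groovy.lang.GroovyShell;
import groovy.lang.Script;

import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// One Script instance per key, shared between all callers.
ConcurrentMap<String, Script> scriptCache = new ConcurrentHashMap<>();

Binding binding = new Binding();
binding.setVariable("input", "some value");          // example variable

GroovyShell shell = new GroovyShell(binding);
scriptCache.put("myScript", shell.parse("println input"));

// Later, possibly from several request threads at once:
scriptCache.get("myScript").run();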
When you run the same instance of Script in different threads at the same time, it could be stopped just by the logic of your script, since all the threads share the one Script instance and its binding.
If you want to cache the parsed script, then store the parsed class in your map rather than the Script instance, and re-bind the variables for each run.
The following code snippet should give you an idea of how to do that:
scriptMap = new HashMap()

Script getScript(String code){
    Class<Script> scriptClass = scriptMap.get(code);
    if (scriptClass) return scriptClass.newInstance();   // reuse the cached class, fresh instance
    GroovyShell shell = new GroovyShell();
    Script script = shell.parse( code );
    scriptMap.put(code, script.getClass());
    return script;
}

Object runScript(String code, Map variables){
    Script script = getScript(code);
    script.setBinding( new Binding(variables) );         // fresh binding for each run
    return script.run();
}
println runScript("a+b", [a:2,b:7])
println runScript("(b-a)*3", [a:7,b:9])
println scriptMap
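Since the question is on the Java side and already uses a ConcurrentHashMap, here is a sketch of the same idea in Java: cache the compiled script class, never a shared Script instance, and create a fresh Script with its own Binding for every run. The class and method names are illustrative; it assumes the generated script class has a no-arg constructor, which GroovyShell-generated scripts do.

import groovy.lang.Binding;
import groovy.lang.GroovyShell;
import groovy.lang.Script;

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

public class CachedScriptRunner {
    private final GroovyShell shell = new GroovyShell();
    // Cache the parsed script *class*; Script instances are created per run.
    private final ConcurrentMap<String, Class<? extends Script>> cache = new ConcurrentHashMap<>();

    public Object run(String code, Map<String, Object> variables) throws Exception {
        Class<? extends Script> scriptClass =
                cache.computeIfAbsent(code, src -> shell.parse(src).getClass());
        Script script = scriptClass.getDeclaredConstructor().newInstance();
        script.setBinding(new Binding(variables));   // per-run variables, nothing shared
        return script.run();
    }
}

With this, each call to run() gets its own Script object, so concurrent runs cannot interfere with each other's variables.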
I have a Java RESTful service method which executes myscript.sh using ProcessBuilder. My script takes one input (example: myscript.sh /path/to-a/folder).
Inside the script it does something like this:
-> execute a command which is multithreaded, i.e. parallel processing
-> echo "my message"
Now when I call my script from a Linux command line, it executes fine. First all the running threads finish, then some text output from the threaded command execution is shown on the terminal, and then the echoed message is shown.
But when I call the same script from Java using ProcessBuilder, the last echo message comes immediately and execution ends.
This is the way I call my script from Java:
ProcessBuilder processBuilder = new ProcessBuilder("/bin/bash","/path/to/myscript.sh","/path/to/folder/data");
Process proc = processBuilder.start();
StringBuffer output = new StringBuffer();
BufferedReader reader = new BufferedReader(new InputStreamReader(proc.getInputStream()));
String line = "";
while((line = reader.readLine()) != null){
    output.append(line + "\n");
}
System.out.println("### " + output);
I don't know what's happening, or how to debug it either.
Can someone enlighten me on how to get the same behaviour from the shell script whether it is run from the terminal or from Java's ProcessBuilder?
Use ProcessBuilder.redirectErrorStream(boolean redirectErrorStream) with argument true to merge the error stream into the output stream. Alternatively, you could also use the shell syntax cmd 2>&1 to merge stderr with stdout.
These are some of the reasons why you may be getting the output of the last echo statement immediately (instead of the script taking time to run and returning proper results):
Missing environment variables
The launched bash needs to source .bashrc or some such resource file
The launched bash may not be running in right directory (you can set this in ProcessBuilder)
The launched bash may not be finding some script/executable in its PATH
The launched bash may not be finding proper libraries in the path for any of the executables
Once you merge the error stream, you will be able to debug and see the errors for yourself.
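A minimal sketch of the first suggestion applied to the ProcessBuilder call from the question:

ProcessBuilder processBuilder =
        new ProcessBuilder("/bin/bash", "/path/to/myscript.sh", "/path/to/folder/data");
processBuilder.redirectErrorStream(true);   // stderr now shows up in getInputStream()
Process proc = processBuilder.start();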
In your context, separate processes may be spawned in two ways:
1) Bash
/path/to/executables/executable &
This will spawn the executable as a new background process, and you need to wait for it to finish. Here's an answer that will help you.
2) Java
Process exec = Runtime.getRuntime().exec(command);
status = exec.waitFor();
Essentially, you need to wait for the process to end before you start reading its std/err streams.
If I understand the problem correctly, adding just this line to your code should suffice: status = exec.waitFor() (Before you obtain the streams)
Here's the JavaDoc for Process.waitFor() :
Causes the current thread to wait, if necessary, until the process represented by this Process object has terminated. This method returns immediately if the subprocess has already terminated. If the subprocess has not yet terminated, the calling thread will be blocked until the subprocess exits.
Returns:
the exit value of the subprocess represented by this Process object. By convention, the value 0 indicates normal termination.
Throws:
InterruptedException - if the current thread is interrupted by another thread while it is waiting, then the wait is ended and an InterruptedException is thrown
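Putting the two suggestions together, a hedged sketch of the adjusted Java code might look like the following; it reads the merged output to the end first and only then calls waitFor(), so the pipe buffer cannot fill up and stall the script:

import java.io.BufferedReader;
import java.io.InputStreamReader;

ProcessBuilder processBuilder =
        new ProcessBuilder("/bin/bash", "/path/to/myscript.sh", "/path/to/folder/data");
processBuilder.redirectErrorStream(true);              // merge stderr into stdout

Process proc = processBuilder.start();

StringBuilder output = new StringBuilder();
try (BufferedReader reader =
         new BufferedReader(new InputStreamReader(proc.getInputStream()))) {
    String line;
    while ((line = reader.readLine()) != null) {
        output.append(line).append("\n");
    }
}

int status = proc.waitFor();                           // block until the script finishes
System.out.println("### exit status " + status + "\n" + output);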
I'm using the net.neoremind.sshxcute SSH Java API library to connect to an SFTP server and execute a shell script present on that server.
My shell script does the simple job of moving files from that SFTP location to an HDFS location on another machine.
Currently, there's no way to report whether any of the files were not moved for some reason, such as a connection failure, a file with an illegal name, an empty file, etc.
I wonder, how can I send that information about each failed file move from the shell command back to the Java code?
This is my sample code:
// e.g sftpScriptPath => /abc/xyz
// sftpScriptCommand => sudo ./move.sh
// arguments => set of arguments to shell script.
task = new ExecShellScript(sftpScriptPath, sftpScriptCommand, arguments);
result = m_SshExec.exec(task);
if(result.isSuccess && result.rc == 0)
{
    isSuccessful = true;
    s_logger.info("Shell script executed successfully");
    s_logger.info("Return code : " + result.rc);
    s_logger.info("Sysout : " + result.sysout);
}
else
{
    isSuccessful = false;
    s_logger.info("Shell script execution failed");
    s_logger.info("Return code : " + result.rc);
    s_logger.info("Sysout : " + result.sysout);
}
The Result object returned from the exec method call includes:
exit status or return code (Result.rc),
standard output (stdout) (Result.sysout),
standard error (stderr) (Result.error_msg), and
an indication of success, based on return code and output (Result.isSuccess).
So, if you are committed to the current method of executing a shell script using the sshxcute framework, then the simplest way would be to have the move.sh script provide information about any failures while moving files. This could be done via a combination of return codes and standard output (stdout) and/or standard error (stderr) messages. Your Java code would then obtain this information from the returned Result object.
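As a hedged sketch of what the Java side could then look like, assuming move.sh were changed to print one line such as FAILED: <filename> for every file it could not move (the FAILED: marker is an illustrative convention, not something sshxcute provides):

import java.util.ArrayList;
import java.util.List;

// result comes from m_SshExec.exec(task) as in the code above.
List<String> failedFiles = new ArrayList<>();
for (String line : result.sysout.split("\n")) {
    if (line.startsWith("FAILED:")) {
        failedFiles.add(line.substring("FAILED:".length()).trim());
    }
}

if (!failedFiles.isEmpty()) {
    s_logger.info("Files not moved: " + failedFiles);
}

The overall return code (Result.rc) can still be used to signal whether any failure occurred at all.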
I want to perform an svn delete from my Grails app. I tested out both of the following in the Grails console:
"svn delete /usr/share/mydir".execute()
Runtime.getRuntime().exec("svn delete /usr/share/mydir")
In both cases, an instance of java.lang.Process is returned, but the command does not get executed (/usr/share/mydir is not deleted).
This behaviour only happens when the app is running on Linux (Ubuntu). If I run it on Windows, the command does get executed.
Update
Following Tim's advice in the comments, I changed the command so that it captures the process output:
def process = "svn delete /usr/share/mydir".execute()
def out = new StringBuilder()
process.waitForProcessOutput(out, new StringBuilder())
println "$out"
I now see that the reason it's failing is:
error svn: Can't open file '/usr/share/mydir/.svn/lock': Permission denied
The below code works fine for me on CentOS.
def scriptCom = "/folderlocation/shellscript.sh"
println "[[Running $scriptCom]]"
def proc = scriptCom.execute()
def oneMinute = 60000
proc.waitForOrKill(oneMinute)
if (proc.exitValue() != 0) {
    println "[[return code: ${proc.exitValue()}]]"
    println "[[stderr: ${proc.err.text}]]"
    return null
} else {
    def output = proc.in.text            // read stdout once
    println "[[stdout: $output]]"
    return output.readLines()
}