Groovy Shell script object not executed entirely - java

We create a GroovyShell object, pass a Binding to the shell, and then parse the Groovy code with the shell to obtain a Script object, as below:
GroovyShell shell = new GroovyShell(binding);
Script script = shell.parse(/* groovy code */);
We then store the Script object in a ConcurrentHashMap and run it by fetching it from that map and calling script.run(). But in roughly 1 run out of 100 the Groovy code in the script does not execute completely. We placed logs in the Groovy code, and they show that the code did not run to the end, yet no exception is thrown.

When you run the same Script instance in different threads at the same time, it can be cut short simply by the logic of your script, because all the threads share one instance and one binding.
If you want to cache the parsed script, store the parsed class in your map rather than the Script instance, and re-bind the variables for each run.
The following code snippet should give you an idea of how to do that:
scriptMap = new HashMap()   // binding variable, so it is visible inside the methods below

Script getScript(String code){
    Class<Script> scriptClass = scriptMap.get(code);
    if (scriptClass) return scriptClass.newInstance();   // reuse the already compiled class
    GroovyShell shell = new GroovyShell();
    Script script = shell.parse(code);
    scriptMap.put(code, script.getClass());
    return script;
}

Object runScript(String code, Map variables){
    Script script = getScript(code);
    script.setBinding(new Binding(variables));           // fresh binding for every run
    return script.run();
}
println runScript("a+b", [a:2,b:7])
println runScript("(b-a)*3", [a:7,b:9])
println scriptMap
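For the Java side of the original question, here is a minimal sketch of the same approach, caching the compiled class in a ConcurrentHashMap keyed by the script source and creating a fresh Script instance per run. The class and method names are illustrative, not from the original post:

import groovy.lang.Binding;
import groovy.lang.GroovyShell;
import groovy.lang.Script;

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class ScriptCache {
    // Cache the compiled class, never a Script instance, so concurrent runs
    // each get their own instance and their own Binding.
    private final ConcurrentHashMap<String, Class<? extends Script>> cache = new ConcurrentHashMap<>();
    private final GroovyShell shell = new GroovyShell();

    public Object run(String code, Map<String, Object> variables) throws Exception {
        Class<? extends Script> scriptClass =
                cache.computeIfAbsent(code, c -> shell.parse(c).getClass());
        Script script = scriptClass.getDeclaredConstructor().newInstance();
        script.setBinding(new Binding(variables));   // fresh variables per run
        return script.run();
    }
}

Because every run gets its own Script instance and its own Binding, the map itself can safely be shared between threads.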

Related

Is there a way to have Java call a bash script which calls javac and java?

I'm trying to train a neural network to play Halite 3. The provided interface is a bash script which:
1. compiles my bot
2. calls the halite binary, passing it a string that runs the bot (java myBot)
I'm trying to run this script from Java to train the network.
I've tried using a ProcessBuilder to run the script, as well as the binary called inside the script. Running the script produces no output, and using echo I've determined that the program terminates when javac is called in the script. Removing that call, it terminates when the bot is run.
I've also tried calling the halite binary directly with ProcessBuilder, and that does produce output. The issue is that it doesn't run the bots properly, saying it can't find the file. I've tried making the path relative to different directory levels as well as absolute (the java command doesn't seem to like absolute paths?).
Calling the binary directly:
List<String> cmd = new ArrayList<>();
cmd.add(dir+ "/src/halite");
// Replay
cmd.add("--replay-directory");
cmd.add(dir+"/replays/");
// Options
cmd.add("--results-as-json");
cmd.add("--no-logs");
// Dimensions
cmd.add("--width");
cmd.add("16");
cmd.add("--height");
cmd.add("16");
// Players
cmd.add("\"java -cp . myBot\"");
cmd.add("\"java -cp . myBot\"");
Process proc = new ProcessBuilder(cmd).start();
InputStream is = proc.getInputStream();
Scanner s = new Scanner(is);
while (s.hasNext()) {
    System.out.println(s.next());
}
This code does produce a JSON; however, I get an error in my logs saying that the bots do not run.
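Not an answer from the original thread, but a hedged sketch of the same launch with two details worth checking: ProcessBuilder passes each argument to the program verbatim (there is no shell), so literal quote characters become part of the bot command, and setting the working directory plus merging stderr usually makes "bot did not run" errors visible. The directory layout and bot name are taken from the question; whether the halite binary tokenises the player string itself is an assumption:

import java.io.File;
import java.util.ArrayList;
import java.util.List;
import java.util.Scanner;

public class HaliteRunner {
    public static void main(String[] args) throws Exception {
        String dir = System.getProperty("user.dir");   // assumption: project root

        List<String> cmd = new ArrayList<>();
        cmd.add(dir + "/src/halite");
        cmd.add("--replay-directory");
        cmd.add(dir + "/replays/");
        cmd.add("--results-as-json");
        cmd.add("--no-logs");
        cmd.add("--width");
        cmd.add("16");
        cmd.add("--height");
        cmd.add("16");
        // No escaped quotes: since ProcessBuilder does not go through a shell,
        // literal '"' characters would become part of the bot command string.
        cmd.add("java -cp . myBot");
        cmd.add("java -cp . myBot");

        Process proc = new ProcessBuilder(cmd)
                .directory(new File(dir))        // bots resolve relative paths from here
                .redirectErrorStream(true)       // merge stderr so bot errors show up
                .start();

        try (Scanner s = new Scanner(proc.getInputStream())) {
            while (s.hasNextLine()) {
                System.out.println(s.nextLine());
            }
        }
        proc.waitFor();
    }
}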

Run ksh command from java code to run script file

I am trying to execute a script file (located at /home/id/scripts/) from my Java code. Below is my Java code:
Process process = null;
String scriptfileName = "myScript.sh";
String executeCmd = "/home/id/scripts/" + scriptfileName;
process = new ProcessBuilder(executeCmd).start();
When I run the script with the code above, only the first few lines execute. For example, I placed two echo statements: only the first one is printed, and the lines below it, which contain database update statements, are not executed. If I run the same script file directly with the command ksh scriptfileName, it executes successfully and updates the DB.
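There is no answer in the original thread, but a hedged sketch of one way to launch it: invoke ksh explicitly (as on the command line), consume the script's output, and wait for it to finish. Not draining the output and not waiting are the usual reasons only the first lines appear; whether that is the cause here is an assumption:

import java.io.BufferedReader;
import java.io.InputStreamReader;

public class RunKshScript {
    public static void main(String[] args) throws Exception {
        String scriptfileName = "myScript.sh";
        String executeCmd = "/home/id/scripts/" + scriptfileName;

        // Run through ksh, merge stderr into stdout, then read everything
        // and wait so the script can run to completion.
        Process process = new ProcessBuilder("ksh", executeCmd)
                .redirectErrorStream(true)
                .start();

        try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(process.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
        int exitCode = process.waitFor();
        System.out.println("Script exited with code " + exitCode);
    }
}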

CasperJS script not calling a function only when executed from a shell script which itself is executed from Java

I have a script written in Casper.js. I am using a Java Process to execute a shell script which executes the Casper script.
Almost all of the Casper script runs when I execute it this way (confirmed by console logs and this.echo() commands) except for one function, and I don't know why. Here's a snippet:
this.then(function() {
    this.click('input[type="type"]');
    this.then(function() {
        this.page.uploadFile('input[type="type"]', 'picture.jpg');
        this.then(function() {
            this.wait(1000);
            this.then(function() {
                var test = this.evaluate(function() {
                    return document.getElementById('elementid').value;
                });
                this.echo(test);
                return test;
            });
        });
    });
});
The function in question that is not being called is the callback function for assigning a value to the variable test.
It works perfectly when I execute the shell script on my computer. This only fails when it is called from my Java program.
Any insight on the matter would be greatly appreciated, thank you!
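Not part of the original post, but one common difference when the same shell script is launched from Java rather than an interactive terminal is the child process environment, PATH in particular, so casperjs/phantomjs may resolve differently or not at all. A small sketch that makes the PATH explicit and keeps all output visible; the script name, working directory, and extra PATH entry are assumptions:

import java.io.File;
import java.util.Map;

public class RunCasper {
    public static void main(String[] args) throws Exception {
        // Hypothetical script name; the post does not give one.
        ProcessBuilder pb = new ProcessBuilder("/bin/bash", "run_casper.sh");
        pb.directory(new File(System.getProperty("user.home")));

        // Make the environment explicit so casperjs/phantomjs resolve the same
        // way they do in an interactive shell (adjust the extra path as needed).
        Map<String, String> env = pb.environment();
        env.put("PATH", env.get("PATH") + File.pathSeparator + "/usr/local/bin");

        pb.inheritIO();   // forward stdout/stderr to the Java process's console
        int exit = pb.start().waitFor();
        System.out.println("casper script exited with " + exit);
    }
}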

GroovyShell: embedded execution

I'm trying to embed groovy into a large Java application.
The Java application should load some utility Groovy scripts at startup.
The application should then run other scripts multiple times. There is also a need to enter some code at a GUI and execute it at user request.
The problem I'm facing is this:
I am loading the startup script like this:
GroovyShell gShell = new GroovyShell();
gShell.evaluate(new FileReader("scripts/autoload.groovy"));
Suppose my autoload.groovy contains:
def prnt(m) {
println("From Groovy: " + m);
}
This works fine. But when I want to run a user command using:
gShell.evaluate("prnt 66");
I get the error:
groovy.lang.MissingMethodException: No signature of method: Script2.prnt() is applicable for argument types: (java.lang.Integer) values: [66]
How can my user script access the methods already loaded?
Note: I have also tried "autoload.prnt 88", and still get the error.
Each evaluate call is compiled and run as a separate Script, and
def prnt(m) {
println("From Groovy: " + m);
}
defines a method in the Script class generated from autoload.groovy, which is not accessible from the subsequent "calling" script. However, the scripts run by the same GroovyShell share the same binding, so you can store values in the binding from one script and access them in another. Storing a value in the binding is simply a case of assigning the value to an otherwise undeclared variable:
prnt = { m ->
println("From Groovy: " + m);
}
This stores a closure in the binding variable prnt, and you can call the closure from other scripts in the same shell. Note that
def prnt = { m ->
or
Closure prnt = { m ->
would not work, because the def or type makes it a local variable declaration (private to this particular script) rather than an assignment to the binding.
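Putting it together from the Java side, a small sketch using the file name from the question: autoload.groovy stores the closure in the binding, and later user code evaluated by the same GroovyShell can call it because the binding is shared:

import groovy.lang.GroovyShell;

import java.io.FileReader;

public class EmbeddedGroovy {
    public static void main(String[] args) throws Exception {
        GroovyShell gShell = new GroovyShell();

        // autoload.groovy contains:  prnt = { m -> println("From Groovy: " + m) }
        gShell.evaluate(new FileReader("scripts/autoload.groovy"));

        // Later user code runs as a separate Script but shares the same binding,
        // so the closure stored there is visible and callable.
        gShell.evaluate("prnt 66");
        gShell.evaluate("prnt('hello from the GUI')");
    }
}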

Shell script not running R (Rhipe) program from Java code

I have a simple shell script which looks like this:
R --vanilla < myMRjob.R
hadoop fs -get /output_03/ /home/user/Desktop/hdfs_output/
This shell script runs myMRjob.R and then copies the output from HDFS to the local file system. It executes fine from the terminal.
When I try to run the shell script from Java code, I am unable to launch the MapReduce job, i.e. the first line isn't executed, while the "hadoop fs -get .." line runs fine from the Java code.
The Java code which I used is:
import java.io.*;

public class Dtry {
    public static void main(String[] args) {
        File wd = new File("/home/dipesh/");
        System.out.println("Working Directory: " + wd);
        Process proc = null;
        try {
            proc = Runtime.getRuntime().exec("./Recomm.sh", null, wd);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
The reason behind this whole exercise is that I want to trigger myMRjob.R and display its result in a JSP.
Please help!
The reason your shell script isn't running from the exec call is because shell scripts are really just text files and they aren't native executables. It is the shell (Bash) that knows how to interpret them. The exec call is expecting to find a native executable binary.
Adjust your Java like this in order to call the shell and have it run your script:
proc = Runtime.getRuntime().exec("/bin/bash Recomm.sh", null, wd);
When you called hadoop directly from Java, it is a native executable and that's why it worked.
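As a follow-up, since the goal is to show the result in a JSP: a small sketch, not from the original answer, that also waits for the script and collects its output into a String, assuming the same Recomm.sh and working directory as the question:

import java.io.BufferedReader;
import java.io.File;
import java.io.InputStreamReader;

public class Dtry {
    public static void main(String[] args) throws Exception {
        File wd = new File("/home/dipesh/");
        Process proc = Runtime.getRuntime().exec("/bin/bash Recomm.sh", null, wd);

        // Collect stdout (read or merge stderr as well in real code) so it can
        // later be handed to the JSP instead of being lost.
        StringBuilder output = new StringBuilder();
        try (BufferedReader reader =
                     new BufferedReader(new InputStreamReader(proc.getInputStream()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                output.append(line).append('\n');
            }
        }
        int exitCode = proc.waitFor();   // make sure the R job has finished
        System.out.println("exit=" + exitCode);
        System.out.println(output);
    }
}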
