I have 5 shell scripts. Each runs a java command, and each job's output is the input to the next job.
I created a superScript.sh:
# mail - to inform beginning
sh script1.sh;
sh script2.sh;
sh script3.sh;
sh script4.sh;
sh script5.sh;
# mail - to inform end
Sample script1.sh:
cd toBaseDirectory;
java -cp /path/to/application.jar main.class parameter
But all the jobs start at the same time. How can I make them run sequentially?
Try running the java commands like this:
java -cp /path/to/application.jar main.class parameter & wait
If you want to run the second command only if the first exited successfully, join them with &&:
command1 && command2
A simple example:
~$ cat abc.sh
#!/usr/bin/env bash
echo "hi"
~$ cat pqr.sh
#!/usr/bin/env bash
echo "batman say : $1"
pqr.sh is executed only if abc.sh executes successfully:
~$ retval=$(./abc.sh) && result=$(./pqr.sh "$retval") && echo "$result"
batman say : hi
You can also try a similar approach with the java command execution in your shell scripts.
Note:
To execute shell scripts sequentially, waiting for each one to finish before the next one starts, separate them with ;
command1; command2
wait waits for a background process to finish.
You can also choose to run the second command only if the first exited successfully. To do so, join them with &&
cmd1 && cmd2
But in repetitive cases like this I recommend using a simple loop in the body of your script:
for n in {1..5} ; do sh script${n}.sh ; done
The loop not only runs them in order but is also easier to tweak and reuse when needed, thanks to the brace expansion.
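Putting the pieces together, superScript.sh could look roughly like this (a sketch only; the mail command, address, and script names are assumptions based on the question):

#!/usr/bin/env bash
# Sketch of superScript.sh: run the five jobs strictly in order and stop
# at the first failure. The mail command and address are placeholders.
echo "Batch starting" | mail -s "batch started" me@example.com
for n in {1..5} ; do
    sh "script${n}.sh" || { echo "script${n}.sh failed" >&2 ; exit 1 ; }
done
echo "Batch finished" | mail -s "batch finished" me@example.com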
I have this script php-pull-script.php written:
<?php
$output1 = shell_exec('git pull');
$output2 = shell_exec('pkill java');
$output3 = shell_exec('mvn package');
$output4 = shell_exec('java -jar ./target/compute-0.0.1-SNAPSHOT.jar');
echo "<pre>$output1</pre>";
echo "<pre>$output2</pre>";
echo "<pre>$output3</pre>";
echo "<pre>$output4</pre>";
?>
When executing it in a shell, I am not seeing any output to verify that it is running; I am not sure it is working at all. Is there a better way to do this automation script?
How can I send off the java command with shell_exec and leave it running in the background (is & possible with shell_exec)?
test.php
<?php
shell_exec('test.sh');
test.sh
echo "Do something"
/bin/sh -c 'sleep 10' >> /dev/null 2>&1 &
exit 0
/dev/null can also be a path to a logfile.
All paths here and in the PHP script should be absolute, like /path/to/my/test.sh.
Here test.php doesn't wait 10 seconds for the subcall.
Hope that helps a little ;)
You might not be in the correct working directory. You may need to set that manually to make the commands run. Otherwise, those commands look right. Though you really should just have this set up as a shell script. PHP's not a good language for this sort of thing, and if you are calling this from a REST endpoint there are far better solutions like Jenkins.
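For instance, the steps in php-pull-script.php could be moved into a standalone shell script along these lines (a sketch only; the project directory and log path are placeholders, and the jar name is taken from the question):

#!/usr/bin/env bash
# deploy.sh - sketch of the same steps as php-pull-script.php, run from a
# shell (or cron/Jenkins) instead of PHP. Paths below are placeholders.
set -e                          # stop at the first failing step
cd /absolute/path/to/project
git pull
pkill java || true              # do not abort if no java process was running
mvn package
nohup java -jar ./target/compute-0.0.1-SNAPSHOT.jar >> deploy.log 2>&1 &
echo "Started java process $!"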
UPDATE: Based on the discussion below I have edited this question for a more accurate description.
I am trying to run a nohup command from Jenkins. The full command is:
nohup java -jar /home/.../jar/server-process-0.35.jar prod >> /var/../server-process-prod.log 2>&1 &
This command does not work. I can see the status as success in Jenkins, but when I do ps -ef | grep java there is no java process on the Linux machine.
However, when I remove the last '&', that is, I run it in the foreground instead of the background, it starts working. I can see the java process started.
The original command works fine if I run it in a Linux console.
I need to run it from Jenkins in its original form, that is, as a background process, so that it is independent of Jenkins.
Any clues why this is happening?
Long story short, Jenkins kills all processes spawned by a job once that job finishes. To override this behavior, you need to set an environment variable.
The variable appears to vary from job type to job type. It used to be BUILD_ID, but for Pipeline jobs it is JENKINS_NODE_COOKIE, and there are several others mentioned in this answer.
So if you're running your command in Pipeline, it would look like this:
sh 'JENKINS_NODE_COOKIE=dontKillMe nohup java -jar /home/.../jar/server-process-0.35.jar prod >> /var/../server-process-prod.log 2>&1 &'
See the wiki on ProcessTreeKiller and this comment in the Jenkins Jira for more information.
In your Jenkins shell script try:
export BUILD_ID=dontKillMe
nohup java -jar your_java_app.jar &
It worked for me!
I tried every possible combination with BUILD_ID but it didn't work.
I made it work, though, by putting "nohup command > output.txt &" inside a shell script run by the Execute shell step in Jenkins; it worked perfectly!
Got the same problem, added:
BUILD_ID=dontKillMe python /var/lib/jenkins/release.py
into Execute Shell -> Command and inside release.py there is:
os.system('nohup java -jar ' + new_jars_on_server + '/' + generated_jar_by_mvn_name + '&')
and it works
The simplest solution is to use "at now" instead of "nohup".
In your Jenkins job (Execute shell) put:
set +e #so "at now" will run even if java -jar fails
#Run java app in background
echo "java -jar $(ls | grep *.jar | head -n 1)" | at now + 1 min
What worked for me was wrapping the nohup java -jar ... command in an sh file inside the Execute shell command, and running that same sh file right after:
echo "starting java jar..."
cd [some location where jar is]
echo "nohup java -jar [jar_name].jar &" > start-jar-in-background.sh
sh start-jar-in-background.sh
echo "started java jar"
If I had nohup java -jar ... inline in the Execute shell command, then it didn't start for some reason. I spent quite some time on this; hope it helps someone ;)
Simplest way:
nohup java -jar [jar_name].jar > log_file_you_want 2> another_file &
set +e #so "at now" will run even if java -jar fails
#Run java app in background
echo "java -jar $(ls | grep *.jar | head -n 1)" | at now + 1 min
The above command worked, thanks @walid, with the (+ 1 min) removed at the end.
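That is, something like this in the Execute shell step (a sketch; it picks the first jar in the workspace with a plain glob and drops the one-minute delay):

echo "java -jar $(ls *.jar | head -n 1)" | at now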
I have developed a web app in Java which uses Scrapy to get some data. To do that, I invoke a shell script from Java:
Process p = Runtime.getRuntime().exec("sh myPath/myScript.sh");
p.waitFor();
which contains
#!/bin/bash
cd mySpiderPath
echo "We are going tu run scrapy"
scrapy crawl mySpider
echo "done!"
After running it, both echo lines are printed but scrapy does nothing. If I run myScript.sh from a shell it works perfectly... I'm confused!
What can I do to debug this strange behavior?
EDIT
I have changed myScript.sh to run python --version instead of the scrapy command, and it doesn't work either... so the conclusion is that it is not a "scrapy problem" but a bash script problem when it's invoked from Java... any ideas? (If I execute myScript.sh from a shell it works fine.)
#!/bin/bash
cd mySpiderPath
echo "We are going tu run scrapy"
python --version
echo "done!"
Try changing:
Process p = Runtime.getRuntime().exec("sh myPath/myScript.sh");
to:
Process p = Runtime.getRuntime().exec("bash myPath/myScript.sh");
This will run the script with bash instead of sh, which often points to a simpler shell.
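If switching to bash does not change anything, one way to debug (a sketch; the log path is an assumption) is to make myScript.sh record where it runs and what it can see, so a run started from Java can be compared with a run started from the terminal:

#!/bin/bash
# Debugging sketch for myScript.sh: write everything to an assumed log file
# so the output can be inspected regardless of how the Java side handles
# the child's streams.
exec >> /tmp/myscript-debug.log 2>&1
echo "pwd: $(pwd)"
echo "PATH: $PATH"
which scrapy || echo "scrapy not found on PATH"
cd mySpiderPath || exit 1
scrapy crawl mySpider
echo "scrapy exited with status $?"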
I'm creating a Java application that calls a shell script.
ProcessBuilder pb = new ProcessBuilder("./script.sh", path1, path2);
Process p = pb.start();
This starts the script.sh file.
The PROBLEM occurs when script.sh calls another shell script which is at path2.
======script.sh=====
1.#some code
2.
3.
4.
5.
6. cd $2 # works till here, and changes directory
7. chmod +x script2.sh
8. ./script2.sh
.
.#remaining code
===========
The script exits at line 7 without any error or warning.
Please guide.
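One way to make the failure point visible (a debugging sketch, not a diagnosis; the log path is an assumption) is to trace script.sh into a log file and report the exit status of script2.sh explicitly:

#!/usr/bin/env bash
# Debugging sketch: trace every command into an assumed log file, since the
# output of a script started from Java may never reach a terminal.
exec >> /tmp/script-debug.log 2>&1
set -x
cd "$2" || exit 1
chmod +x script2.sh
./script2.sh
echo "script2.sh exited with status $?"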
I have a Java program in which I am reading from stdin:
BufferedInputStream bis = new BufferedInputStream(System.in);
byte[] b = new byte[1];
int cmd = bis.read(b);
System.out.println("Read command: " + new String(b));
And a shell script to start/stop this program:
'start')
if [ -p myfifo ]; then
rm myfifo
rm myfifo-cat-pid
fi
mkfifo myfifo
cat > myfifo &
echo $! > myfifo-cat-pid
java -jar lib/myJar.jar >/dev/null 2>&1 0<myfifo &
echo `date +%D-%T` $! >> process.pid
echo "Started process: "$!
;;
'stop')
echo 0 > myfifo
echo "Stopped process: "
rm myfifo
;;
When I run the commands in 'start' one by one, the program waits until I echo to the fifo. But when I run it from the .sh file it immediately reads from stdin. I don't understand the difference between running a command directly at the command prompt and putting it in a .sh file and running that.
The difference is not on the Java side, but in the fact that your shell handles job control differently when launching a script. From man bash:
JOB CONTROL
Job control refers to the ability to selectively stop (suspend) the
execution of processes and continue (resume) their execution at a later
point. A user typically employs this facility via an interactive
interface supplied jointly by the operating system kernel's terminal
driver and bash.
As explained here, by default job control is disabled in a script.
When cat > myfifo & is executed in an interactive shell, it remains in "Stopped" mode waiting to be put in the foreground again (with fg). When launched in a script, instead, job control is disabled, so as soon as cat tries to read from the (detached) terminal, it exits, closing the pipe (and your Java process reads EOF).
If you use set -m at the top of your shell script (forcefully enabling job control), you should see consistent behavior.
set [+abefhkmnptuvxBCEHPT] [+o option-name] [arg ...]
    -m  Monitor mode. Job control is enabled. This option is on by
        default for interactive shells on systems that support it
        (see JOB CONTROL above). Background processes run in a
        separate process group and a line containing their exit
        status is printed upon their completion.
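For completeness, a minimal sketch of the start/stop script with job control enabled as suggested above (only the set -m line is new; the rest is condensed from the script in the question):

#!/usr/bin/env bash
set -m    # enable job control so the backgrounded cat behaves as it does interactively

case "$1" in
'start')
    mkfifo myfifo
    cat > myfifo &
    echo $! > myfifo-cat-pid
    java -jar lib/myJar.jar >/dev/null 2>&1 0<myfifo &
    echo `date +%D-%T` $! >> process.pid
    echo "Started process: "$!
    ;;
'stop')
    echo 0 > myfifo
    rm myfifo
    ;;
esac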