Linux command for pseudo terminal - java

I have a csh script which requires a TTY to work properly.
I need to execute this script from a program which does not have a TTY (like a Java program started from Eclipse).
I tried the "script" Linux command and it works in general, but it has a side effect.
This command forks itself with a PTY and my csh script runs under the child (slave) process.
But when the parent (master) process is killed (by the Java program), the child process remains running and is reparented to the "init" process.
So, is there any way to run csh in a pseudo-terminal, so that all processes are killed when the parent process (the one the Java program has access to) is killed?
I know that some Java libraries exist for this (pty4j, etc.) but I am not able to use them, so I need a Linux command.
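A rough sketch of one direction this could take, not a verified solution: the wrapper name, script path and cleanup logic below are all illustrative assumptions. The idea is that the Java program launches a small wrapper instead of `script` directly, and the wrapper's signal trap also kills the slave-side process group before it dies.
#!/bin/sh
# pty-wrapper.sh (name and paths are placeholders): run the csh script under a PTY
# via util-linux "script", and on termination also kill the slave-side process group.
script -q -c "csh /path/to/myscript.csh" /dev/null &
master=$!
cleanup() {
    slave=$(pgrep -P "$master")                              # the child "script" forked on the PTY slave side
    [ -n "$slave" ] && kill -TERM -- "-$slave" 2>/dev/null   # kill the slave's whole process group
    kill -TERM "$master" 2>/dev/null
}
trap cleanup TERM INT EXIT
wait "$master"
The Java side would then start and destroy only the wrapper process; whether the slave group really contains every descendant of the csh script depends on how csh handles process groups, so this needs to be verified on the target system.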

Related

DTEXEC doesn't trigger dtsx package in PowerShell when triggered from Java but works from command prompt on local machine

I'm calling a dtsx file in PowerShell (test.ps1) using the following:
& $dtexec /f "$dtsx"
This works fine when I run test.ps1 from the command prompt on the local machine, but the same script doesn't even trigger when test.ps1 is launched from a Java application.
Please help.
I'd check whether any other PowerShell script runs for you when started from Java, and only this particular one fails.
At some point I had a similar problem. It turned out that I was using a PowerShell module that required a 64-bit PowerShell instance while I was running a 32-bit Java process. This in turn spawned a 32-bit PowerShell process (you can check that with [Environment]::Is64BitProcess) that could not run what I was asking it to run.

Spawning process in background on Jenkins - job that won't stay in queue

I want to make a job on Jenkins that starts a server (MockServer on WireMock).
The server is launched from a *.jar file, from the terminal, like this:
java -jar serverLaunch.jar
It takes over my console. To avoid that I modify the command and do:
java -jar serverLaunch.jar &>/dev/null &
And that works for me on my local PC. Now I want to move it to Jenkins.
If I try to do this from the "Shell command" block in a Jenkins job, then:
a) java -jar serverLaunch.jar
The task is locked in the queue in my Jenkins, which I don't want, but the server starts and works.
b) java -jar serverLaunch.jar &>/dev/null &
The job ends with success but my server is not alive.
I have also wrapped this command in a .sh script and a .rb script. Any idea how to make it work?
I've tried this:
https://wiki.jenkins-ci.org/display/JENKINS/Spawning+processes+from+build
And then in Jenkins "Shell script":
daemonize -E BUILD_ID=dontKillMe /bin/bash launch.sh
But it also passes, and the server is still not alive.
I had to check "Inject environment variables to the build process" and add:
BUILD_ID=dontKillMe
Now it is working.
Try using nohup, e.g.:
nohup java -jar serverLaunch.jar &
That should prevent the process from being terminated when the parent shell process exits (which I suspect is your problem).
Another effective approach would be to add a post-build action that executes a shell script spawning the server.
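For what it's worth, the two suggestions can be combined in a single "Execute shell" step. A rough sketch (the log file name is a placeholder; the jar name is taken from the question):
# stop Jenkins' ProcessTreeKiller from reaping the server once the build finishes
export BUILD_ID=dontKillMe
nohup java -jar serverLaunch.jar > server.log 2>&1 &
echo "server started with PID $!"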

Checking if java code is running and attempt to restart if not (unix)

I have some Java code running continuously on a Raspberry Pi (from the terminal), listening to a Twitter stream and saving data to disk/USB.
I would like to know the preferred method of detecting whether the program is still running, so that I can take appropriate action and attempt to restart the app.
I hope that in this manner I could detect that the program has failed, send an email to notify me, and attempt to rerun the code. Would running this in a server environment be the best way to go?
Have a look at the forever project. If you have npm installed, you can use it to install the forever package with the -g (global install) flag:
npm install forever -g
Then use the start argument to start the script. In your case this could be a bash file (.sh) with the required java commands.
forever start name-of-script-here
If the script fails (System.exit in Java or any fatal error) it will be restarted by forever. You can also get a list of all the running scripts managed by forever with:
forever list
In Unix, let a parent process create the child Java process and have it monitor the child. If the child terminates, the parent can restart it.
The Unix fork returns the child pid to the parent.
Using this technique: Tracking the death of a child process, the parent can monitor the child's death.
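A minimal watchdog along those lines can be a plain shell loop started at boot. This is only a sketch: the jar name, mail address and delay are placeholders, and it assumes a working mail command on the Pi.
#!/bin/sh
# restart the listener whenever it exits, and send a notification each time
while true; do
    java -jar twitter-listener.jar
    echo "listener exited with status $?, restarting" | mail -s "listener restarted" you@example.com
    sleep 10    # avoid a tight restart loop if it keeps crashing immediately
done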

Usage of sshexec task in Jenkins build doesn't stop its execution

I have an Ant task in the Jenkins Ant Execution Plugin, as a Post Build Step, to remotely run a shell script on one of our servers. The shell script starts a java process in the background. When I execute the shell script on the server directly, it starts the java process in the background and returns. When I run it from Jenkins via the sshexec task, the shell script is run, but it never returns and the Jenkins build waits.
Later, when I added the timeout attribute to the sshexec, it timed out after the given number of milliseconds, but the Jenkins build was shown as failed. How do I make the sshexec task come out cleanly from the shell script execution?
Here is my sshexec task:
<sshexec host="${deploy.host}" username="${deploy.username}" password="${deploy.password}" command=". /etc/profile; cd ${deploy.path}; sh start.sh i1" trust="true" timeout="10000" />
The start.sh file is as follows:
nohup java -Xms512m -Xmx1024m -cp calculation.jar com.tes.StartCalculation $1 &
echo $! > calculation-$1-java.pid
It looks like the ssh-executed job is not fully daemonized. Starting it with nohup is not sufficient in many cases.
See this discussion related to it (in a different context):
The issue is that you are not closing your file descriptors when you
push something into the background. The & is fine when you are in a
shell, but is not enough when you want to disconnect and leave a
process running, you need the process to disconnect from the shell.
.... The fix is to correct the script.
If someone writes a naive service script that does not properly detach
from the terminal, I want to know the first time that that script is
used in a deployment - the SCM changes will enable the breaking change
to be quickly identified.
It is wrong to hide the problem to enable incorrect code to be
released to production - and I would not be happy if the first I knew
about it was when a production system administrator complained.
If this is the same problem, you need to daemonize the script.
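Concretely, that usually means start.sh should detach the java process from the descriptors inherited from the ssh session, along these lines (a sketch based on the start.sh above; the log file name is an assumption):
# redirect stdin/stdout/stderr so the remote shell can exit as soon as the script finishes
nohup java -Xms512m -Xmx1024m -cp calculation.jar com.tes.StartCalculation $1 \
    < /dev/null > calculation-$1.log 2>&1 &
echo $! > calculation-$1-java.pid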

Running a java application through shell script in a JSP/Servlet

I am running a shell script through a web application. The shell script looks something like this:
#!/bin/bash
user=""
pass=""
db_url=""
db_instance=""
sqlplus -s $user/$pass@$db_url/$db_instance @./SqlScripts/foo.sql
sqlplus -s $user/$pass@$db_url/$db_instance @./SqlScripts/bar.sql
CLASS_PATH="./lib/*"
java -classpath $CLASS_PATH package.Main ./Data/inputfile
I am using ProcessBuilder to run the script, and everything but the last line works fine. Am I creating a problem by calling a shell from the JVM and then calling the JVM again to run the application?
The problem was the environment that the script execution process was running in. I changed some of the environment variables of the process and everything is working fine now. The script was initially a standalone shell script, but I wrote one script for each of the databases used. In order to control the workflow, I wrote a web application which calls separate threads for each script and can manage the threads. Thanks for the responses!
Often, app servers run their servlets in a 'clean room' environment - e.g. they strip away all the variables that would normally be set from the outside for security reasons. Try using a fully qualified path to the java binary, and also try setting a full/absolute path for your CLASS_PATH variable.
The parent JVM and the child JVM should be separate processes; there is no particular reason why they should interfere.
What error do you get?
Is java on your PATH?
OK, adding more questions in response to your comments ...
Which thread is waiting? Presumably the parent?
The child java process - do you have any evidence as to whether it successfully initialises? My guess would be that the child is in some way blocked. If you kill the child, does the parent then come back to life?
Suppose it was a simple "hello world" application, would that work?
Most likely the problem is the line:
CLASS_PATH="./lib/*"
and the later reference
$CLASS_PATH
It won't be expanded by ProcessBuilder, because that is usually the shell's job, and no shell is being invoked in this situation.
Try creating the complete list of jars in ./lib/ and appending it directly to the last line of your script:
java -classpath ./lib/a.jar:./lib/b.jar
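One hedged way to do that is to let the script build the colon-separated list itself (a sketch; it assumes the jars sit directly under ./lib and reuses the main class from the question):
CLASS_PATH=""
for jar in ./lib/*.jar; do
    CLASS_PATH="$CLASS_PATH:$jar"        # append each jar, colon-separated
done
CLASS_PATH="${CLASS_PATH#:}"             # strip the leading colon
java -classpath "$CLASS_PATH" package.Main ./Data/inputfile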
Side note:
Invoking all this from Java looks just bad to me. I would rather have it in a standalone script and invoke it by other means, but that's me.
