I am trying to deploy my Java build to a CentOS host using the Gradle SSH plugin:
remotes {
    ftc {
        role 'masterNode'
        host = '173.199.123.42'
        user = 'root'
        password = 'myPass'
    }
}
ssh.settings {
    knownHosts = allowAnyHosts
}
task deploy {
    doLast {
        ssh.run {
            session(remotes.ftc) {
                execute 'pkill -f \'java.*chat\'', ignoreError: true
                remove 'build/libs/chat.jar'
                put from: file('build/libs/chat.jar'), into: '/root/test'
                execute 'nohup java -jar /root/test/chat.jar &'
            }
        }
    }
}
It works, except that it never finishes; it gets stuck on execute 'nohup java -jar /root/test/chat.jar &'.
How can I make it run in the background?
Using nohup usually also involves redirecting the IO to a file, so that the file descriptors held by the connection can be closed; otherwise the remote shell keeps the channel open and the task never returns:
execute 'nohup java -jar /root/test/chat.jar > /tmp/log 2>&1 &'
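Putting the fix in context, a minimal sketch of the deploy session from the question (abridged; /tmp/chat.log is an arbitrary log path):

task deploy {
    doLast {
        ssh.run {
            session(remotes.ftc) {
                execute 'pkill -f \'java.*chat\'', ignoreError: true
                put from: file('build/libs/chat.jar'), into: '/root/test'
                // redirecting both streams lets the remote shell close the
                // SSH channel, so the Gradle task can finish
                execute 'nohup java -jar /root/test/chat.jar > /tmp/chat.log 2>&1 &'
            }
        }
    }
}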
I am trying to get a New Relic Java agent to run in a Docker container to monitor a HiveMetaStore server running in that container.
In order to get the New Relic agent started during startup of the JVM, I have to pass the -javaagent:/path/to/newrelic.jar flag to the JVM.
I tried:
hive --service metastore -javaagent /path/to/newrelic.jar
This failed with "Unrecognized Option" in the HiveMetaStore server code, where it should not have ended up at all.
The hive script invokes the bin/ext/metastore.sh script which in turn invokes
exec $HADOOP jar $JAR $CLASS "$@"
So I tried to patch this invocation:
exec $HADOOP -javaagent /path/to/newrelic.jar jar $JAR $CLASS "$@"
This failed as well.
Then I took a deeper look at the hadoop script. Eventually I found that the function hadoop_java_exec in libexec/hadoop-functions.sh invokes:
exec "${JAVA}" "-Dproc_${command}" ${HADOOP_OPTS} "${class}" "$#"
So I patched this code:
exec "${JAVA}" "-javaagent /path/to/newrelic.jar" "-Dproc_${command}" ${HADOOP_OPTS} "${class}" "$#"
This again failed.
Last but not least, I recognized that one can pass Java properties via HADOOP_OPTS (in libexec/hadoop-functions.sh):
function hadoop_finalize_hadoop_opts
{
    hadoop_translate_cygwin_path HADOOP_LOG_DIR
    hadoop_add_param HADOOP_OPTS hadoop.log.dir "-Dhadoop.log.dir=${HADOOP_LOG_DIR}"
    hadoop_add_param HADOOP_OPTS hadoop.log.file "-Dhadoop.log.file=${HADOOP_LOGFILE}"
    …
}
But I could not figure out how to pass -javaagent:/path/to/newrelic.jar using this mechanism.
Has anyone out there tried this before and can help with it?
My apologies if this is a stupid question. Thanks upfront, Ute
function hadoop_finalize_hadoop_opts
{
    hadoop_translate_cygwin_path HADOOP_LOG_DIR
    hadoop_add_param HADOOP_OPTS hadoop.log.dir "-Dhadoop.log.dir=${HADOOP_LOG_DIR}"
    hadoop_add_param HADOOP_OPTS hadoop.log.file "-Dhadoop.log.file=${HADOOP_LOGFILE}"
    …
    # added: inject the New Relic agent into the JVM options
    hadoop_add_param HADOOP_OPTS java.javaagent "-javaagent:${NEWRELIC_AGENT_HOME}/newrelic.jar"
}
Adding the last statement got the agent started. I see in the container:
/usr/lib/jvm/default-jvm/bin/java -Dproc_jar -Dproc_metastore , … , NullAppender -javaagent:/opt/newrelic-agent-5.10.0/newrelic.jar org.apache.hadoop.util.RunJar /opt/apache-hive-3.1.2-bin/lib/hive-metastore-3.1.2.jar org.apache.hadoop.hive.metastore.HiveMetaStore
I don't understand the "NullAppender" yet but at least the agent seems to be running now.
I have a git repository with two modules in it: a Spring Boot based backend module and a Vue.js based frontend module.
app-root
- backend
- frontend
I have a declarative style Jenkinsfile to build my backend module with relevant maven commands. I want to execute all those maven commands from inside backend directory.
One option is to use dir("backend") { ... } for every command separately (see the sketch below), which looks ugly to me.
Is there any other option to instruct Jenkins to execute the entire pipeline from inside a sub-directory?
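For reference, the per-command dir() option mentioned above looks like this (a declarative-pipeline sketch); a single dir block can wrap all the steps of a stage, so it need not be repeated for each command:

stage('build') {
    steps {
        dir('backend') {
            sh 'mvn clean package'
        }
    }
}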
I ended up with a "prepare project" stage that puts the subdirectory content into the root.
It's also probably a good idea to remove all the root contents (stage "clean") to be absolutely sure there are no leftovers from previous builds.
node {
    def dockerImage
    stage('clean') {
        sh "rm -rf *"
    }
    stage('checkout') {
        checkout scm
    }
    // need to have only the 'backend' subdir content on the root level
    stage('prepare project') {
        // The -a option is an improved recursive option that preserves all
        // file attributes, including symlinks.
        // The . at the end of the source path is specific cp syntax that
        // copies all files and folders, including hidden ones.
        sh "cp -a ./backend/. ."
        sh "rm -rf ./backend"
        // list the final content
        sh "ls -la"
    }
    stage('build docker image') {
        dockerImage = docker.build("docker-image-name")
    }
    stage('publish docker image') {
        docker.withRegistry('https://my-private-nexus.com', 'some-jenkins-credentials-id') {
            dockerImage.push 'latest'
        }
    }
}
You can have Jenkinsfiles inside the backend and frontend modules and point each pipeline job to the corresponding file. On the pipeline itself you just cd into the submodule and execute its commands:
pipeline {
    agent any
    stages {
        stage('build') {
            steps {
                // each sh step runs in its own shell, so the cd and the
                // build command must be part of the same step
                sh 'cd backend && mvn clean package'
            }
        }
        // other stages
    }
}
If you don't want to cd into a submodule you can use a sparse checkout, but in that case you have to change the path to the Jenkinsfile accordingly, because it will be in the root folder.
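A hedged sketch of such a sparse checkout in scripted syntax; the GitSCM extension class names come from the Jenkins git plugin and may differ across versions, and the repository URL is a placeholder:

// check out only the backend/ subtree of the repository
checkout([$class: 'GitSCM',
          branches: [[name: '*/master']],
          userRemoteConfigs: [[url: 'https://example.com/repo.git']],
          extensions: [[$class: 'SparseCheckoutPaths',
                        sparseCheckoutPaths: [[path: 'backend/']]]]])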
I can get the working directory of the current Java program using this code:
Path path = Paths.get(ClassName.class.getProtectionDomain().getCodeSource().getLocation().toURI());
I can also get the command-line parameters of running Java processes (but the working directory is not in the output) using the command wmic process get CommandLine where name='java.exe' /value.
Is it possible to get the working directory of another Java process (ideally programmatically)? Perhaps it can be solved with some of the jdk/bin utilities?
You can get this information via the Attach API. To use it, you have to add your JDK's tools.jar to the class path. Then, the following code will print the current working directories of all recognized JVM processes:
import com.sun.tools.attach.AttachNotSupportedException;
import com.sun.tools.attach.VirtualMachine;
import com.sun.tools.attach.VirtualMachineDescriptor;
import java.io.Closeable;
import java.io.IOException;

for(VirtualMachineDescriptor d: VirtualMachine.list()) {
    System.out.println(d.id()+"\t"+d.displayName());
    try {
        VirtualMachine vm = VirtualMachine.attach(d);
        try(Closeable c = vm::detach) {
            System.out.println("\tcurrent dir: "+vm.getSystemProperties().get("user.dir"));
        }
    }
    catch(AttachNotSupportedException|IOException ex) {
        System.out.println("\t"+ex);
    }
}
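Note that from Java 9 on the Attach API lives in the jdk.attach module, so tools.jar is no longer needed. A minimal sketch of attaching to a single, known JVM (the PID 12345 is a placeholder):

// attach to one JVM by PID and read its working directory
def vm = com.sun.tools.attach.VirtualMachine.attach('12345') // placeholder PID
try {
    println vm.systemProperties.getProperty('user.dir')
} finally {
    vm.detach()
}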
I use the script by PbxMan from this answer to run a Java app as a service.
Everything works perfectly when I launch my app in a terminal, but when it runs as a service, external commands are not executed.
Example of how I execute commands:
String[] startStreamWCTour = {
    "/usr/local/bin/mjpg_streamer",
    "-i", "/usr/local/lib/input_uvc.so -f 10 -r VGA",
    "-o", "/usr/local/lib/output_http.so -w /var/www/cam"};
Process p = Runtime.getRuntime().exec(startStreamWCTour);
...
and nothing is executed...
Maybe I have to add something to my commands? (I'm a little newbish on Linux ;))
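When the program runs as a service there is no terminal to show errors, so a first diagnostic step is to capture the child's output and exit code. A hedged sketch (same command and paths as above) using ProcessBuilder:

// merge stderr into stdout and print everything the child writes,
// then print the exit code; this surfaces errors a service would hide
def cmd = ['/usr/local/bin/mjpg_streamer',
           '-i', '/usr/local/lib/input_uvc.so -f 10 -r VGA',
           '-o', '/usr/local/lib/output_http.so -w /var/www/cam']
def pb = new ProcessBuilder(cmd)
pb.redirectErrorStream(true)
def p = pb.start()
p.inputStream.eachLine { println it }
println "exit code: ${p.waitFor()}"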
Dear all,
I want a Java program which can connect to a remote host running Linux and is able to start and stop any application on that remote host. Is it possible to do this with a Java program, with or without a service wrapper?
Check the Ant SSHEXEC task.
It looks like there's no AntBuilder implementation in plain Java; Groovy is the ideal tool for this kind of task. You can write a Groovy class and access it just like any other class from Java. If you are using NetBeans, Groovy and Java can co-exist in the same package.
String execute(def cmd, def host, def uname, def pwd) throws Exception {
    def ant = new AntBuilder()
    ant.sshexec(host : host,
                username : uname,
                password : pwd,
                command : cmd,   // use the cmd parameter instead of a hard-coded "ls -l"
                trust : "true",
                outputproperty : "result")
    return ant.project.properties."result"
}
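A hypothetical usage of the method above (host and credentials are placeholders); note that Ant's sshexec task additionally needs the JSch library on the class path:

// run a remote command and print its output
def listing = execute('ls -l', '192.0.2.10', 'deploy', 'secret')
println listing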