I have an automation project written in Java, using JUnit, and I'm trying to create my new Jenkins pipeline job.
I've created the pipeline and a new Jenkinsfile, but I don't know what this file should contain.
I need to:
- build the project
- run the tests by category (I don't want to run all the tests in one job)
- deploy
I've found this example in the Jenkins documentation:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                /* `make check` returns non-zero on test failures,
                 * using `true` to allow the Pipeline to continue nonetheless
                 */
                sh 'make check || true'
                junit 'pom.xml'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}
But I got this message:
"Test reports were found but none of them are new. Did leafNodes run?"
So how do I make it work? And how can I specify an exact category to run?
There are a couple of things you will need to set up first, like the JDK, Maven, and Git credentials. Then your pipeline will look something like this:
pipeline {
    agent {
        node {
            label 'the label of your agent, or "any" if you did not specify one'
        }
    }
    environment {
        // Maven home as it is configured in the Global Tool Configuration
        mvnHome = tool 'maven'
    }
    options {
        // remove older builds and artifacts if they exceed 100 builds
        buildDiscarder(logRotator(numToKeepStr: '100', artifactNumToKeepStr: '100'))
        // add timestamps to the logs
        timestamps()
    }
    stages {
        stage("Git CheckOut") {
            steps {
                script {
                    // check out from the repository
                    def scmVars = checkout([$class: 'GitSCM',
                        branches: [[name: 'master']], // here you can enter a branch name or a commit SHA
                        userRemoteConfigs: [[credentialsId: 'the credential id that you set up for your git',
                                             url: "your git url here"]]])
                }
            }
        }
        stage('Build Artifacts') {
            steps {
                sh "echo Packaging the artifacts!"
                // packaging the project
                sh "${mvnHome}/bin/mvn clean package"
                // archiving the artifacts after the build
                sh "echo Archiving the artifacts!"
                archiveArtifacts 'target/*.war' // you can also deploy to Nexus if you have it set up
            }
        }
        stage('Unit Test') {
            steps {
                // running the unit tests
                sh "${mvnHome}/bin/mvn test"
            }
        }
        stage('Transfer war file to Servers') {
            steps {
                sshagent(['the ssh credential set up for the server you want to deploy the artifacts to']) {
                    sh "echo Transferring files to servers!"
                    // copy the war file to the servers
                    sh 'scp -o StrictHostKeyChecking=no $projPath/target/your war file /your server path'
                }
            }
        }
    }
    post {
        always {
            sh "echo Jenkins Job is Done"
        }
        success {
            sh "echo Sending Success Email!"
        }
        failure {
            sh "echo Sending Failed Email!"
        }
    }
}
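As for running a specific category: if the tests use JUnit 4 categories, Maven Surefire can filter by category class through its `groups` parameter, so each stage (or each separate Jenkins job) runs just one category. A sketch, where `com.example.SmokeTests` is a hypothetical category marker interface from your project:

```groovy
stage('Smoke Tests') {
    steps {
        // run only tests annotated with @Category(SmokeTests.class);
        // Surefire maps the "groups" parameter to JUnit 4 categories
        sh "${mvnHome}/bin/mvn test -Dgroups=com.example.SmokeTests"
        // publish the Surefire XML reports; the junit step expects
        // report XML files, not pom.xml
        junit '**/target/surefire-reports/*.xml'
    }
}
```

Pointing the `junit` step at the freshly generated Surefire reports should also resolve the "Test reports were found but none of them are new" message, which appears when the step is given files that were not produced by the current run.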
I'm trying to push a Docker image using Gradle to a private repo. gradle-docker-plugin does exactly that. Using the docker-java-application plugin I can build an image, but I can't push it to my own repository.
In the debug logging I see the following line:
[DEBUG] [com.github.dockerjava.core.command.PushImageResultCallback] ResponseItem(stream=null, status=The push refers to repository [docker.io/project-name/project-app], progressDetail=null, progress=null, id=null, from=null, time=null, errorDetail=null, error=null, aux=null)
And later the breaking error:
> Could not push image: unauthorized: incorrect username or password
Which makes sense, because my configured account is not for docker.io but only for my own repository.
My Gradle files look like this:
Buildscript:
buildscript {
    dependencies {
        classpath 'com.bmuschko:gradle-docker-plugin:7.0.1'
    }
    ...
}
build.gradle:
plugins {
    id 'java'
    id 'com.bmuschko.docker-java-application'
}
...
mainClassName = 'com.project.app.Main'
...
docker {
    javaApplication {
        baseImage = 'jre-11.0.11_9-alpine'
        ports = [8080]
        jvmArgs = ['-Xms256m', '-Xmx2048m']
    }
    registryCredentials {
        url = 'https://private.repository.example'
        username = 'user'
        password = 'password'
    }
}
I can build and run the image locally, only the push fails. Is this the correct way of configuring the remote repository for the docker-java-application plugin?
Can you try:
docker login https://private.repository.example --username user --password password
And then check that your ~/.docker/config.json has the entries:
{
    "auths": {
        "https://index.docker.io/v1/": {},
        "https://private.repository.example": {}
    },
    "credsStore": "desktop.exe"
}
After that, remove the "https://index.docker.io/v1/": {} entry and restart Docker.
It should then default all pushes and pulls to https://private.repository.example.
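It may also help to name the built image with the private registry host, since `docker push` picks the target registry from the image name's prefix (the `docker.io/project-name/project-app` in your debug log suggests the name currently has no registry prefix). A sketch, assuming your plugin version exposes the `images` property on `javaApplication` (the property name has changed between plugin versions, so verify against your version's docs):

```groovy
docker {
    javaApplication {
        baseImage = 'jre-11.0.11_9-alpine'
        ports = [8080]
        // prefixing the image name with the registry host makes the push
        // target private.repository.example instead of docker.io
        images = ['private.repository.example/project-name/project-app:latest']
    }
    registryCredentials {
        url = 'https://private.repository.example'
        username = 'user'
        password = 'password'
    }
}
```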
My project root directory is:
D:/Project/Node_Project
I am using a Gradle plugin to install Node.js temporarily in my project root directory, so that some Node.js commands can run in the project while the project builds. The plugin is as below:
plugins {
    id "com.github.node-gradle.node" version "2.2.4"
}

node {
    download = true
    version = "10.10.0"
    distBaseUrl = 'https://nodejs.org/dist'
    workDir = file("${project.buildDir}/nodejs")
}
So, nodejs is getting installed inside the project in the location:
D:/Project/Node_Project/build/nodejs/node-v10.10.0-win-x64
Now, I am using the String.execute(List envp, File dir) method (where envp is the list of environment variables to set, and dir is the working directory) to run a Windows command that depends on Node.js. Code below:
cmd = "node connect.js"
def process = cmd.execute(["PATH=${project.projectDir}/build/nodejs/node-v10.10.0-win-x64"], null)
In the above .execute method, is there a way to auto-populate the "build/nodejs/node-v10.10.0-win-x64" part of the string instead of hardcoding it into the method?
Something like:
def process = cmd.execute(["PATH=${project.projectDir}/.*"], null)
Syntax of the .execute method:
https://docs.groovy-lang.org/latest/html/groovy-jdk/java/lang/String.html#execute(java.lang.String[],%20java.io.File)
All the code is inside the "build.gradle" file. Please help!
I asked why you don't just write a task of type NodeTask, but I understand that you'd like to run it in the background, which you can't do with that.
You could list the contents of a directory and use that as part of the command. But you can also just grab it from the extension provided by the plugin.
This is not documented and it might break in future releases of the plugin, but you can do something like this (Groovy DSL):
task connectJS {
    dependsOn nodeSetup
    doFirst {
        def connectProcess = "$node.variant.nodeExec $projectDir/src/js/connect.js".execute()
        // Blocking readers (if async, pipe to a log file instead)
        connectProcess.in.eachLine { logger.info(it) }
        connectProcess.err.eachLine { logger.error(it) }
    }
}
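Alternatively, since the install path is derived from the plugin's own configuration, you could compose it from the same `node` extension properties you set in the `node { }` block. A sketch; note the `node-v...-win-x64` directory name follows the plugin's install layout on 64-bit Windows, which is an assumption here:

```groovy
task connectJS2 {
    dependsOn nodeSetup
    doFirst {
        // node.workDir and node.version are the values configured in the
        // node { } block, so the path tracks whatever version you configure
        def nodeDir = "${node.workDir}/node-v${node.version}-win-x64"
        def process = "node connect.js".execute(["PATH=$nodeDir"], null)
    }
}
```

This way, bumping `version` in the `node { }` block updates the PATH entry automatically.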
I'm building a Gradle script that runs Tomcat (sadly I cannot use the Gretty or Cargo plugin). After launching Tomcat ($TOMCAT_HOME/bin/startup.sh) I want to open a file in Gradle/Groovy and then print all lines as they come in. In other words: open the file, track whether something new has arrived, and print it.
Now my task looks like this:
task startTomcat(dependsOn: ...) << {
    def catalinaOut = "${project.TOMCAT_HOME}/logs/catalina.out"
    delete { catalinaOut }
    exec {
        workingDir '.'
        executable "${project.TOMCAT_HOME}/bin/${tomcatStartScript()}"
        environment CATALINA_OPTS: tomcatArgs.join(' ')
    }
    new File(catalinaOut).eachLine { line -> println(line) }
}
Of course it won't work, because eachLine reads the file once and immediately closes it.
What you are looking for is basically the behaviour of tail -f <file> on Unix. So one obvious way to handle this would just be to call that tool (e.g. ['tail', '-f', '<file>'].execute()) if you have access to it.
Otherwise, Java IO implementation of unix/linux "tail -f" holds several answers.
So this is a trivial example using the Apache commons-io Tailer:
buildscript {
    repositories.jcenter()
    dependencies {
        classpath 'commons-io:commons-io:2.4'
    }
}

task tail << {
    def tailer = new org.apache.commons.io.input.Tailer(
        "/tmp/mylog" as File,
        [handle: { String l -> println l }] as org.apache.commons.io.input.TailerListenerAdapter
    )
    try {
        tailer.run()
    }
    finally {
        tailer.stop()
    }
}
Run with gradle tail and add lines to the mentioned file /tmp/mylog. Stop with CTRL-C.
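Since tailer.run() blocks the task until interrupted, an alternative is to let commons-io run the tailing on a background daemon thread via the static Tailer.create factory, so the build can continue doing other work. A sketch; the one-second polling delay is an arbitrary choice:

```groovy
task tailInBackground << {
    // Tailer.create spawns a daemon thread that polls the file every 1000 ms
    def tailer = org.apache.commons.io.input.Tailer.create(
        "/tmp/mylog" as File,
        [handle: { String l -> println l }] as org.apache.commons.io.input.TailerListenerAdapter,
        1000
    )
    // ... do other work here; call tailer.stop() when finished
}
```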
I'm using Gradle to build my software. However, I find the output it produces a bit too minimal. I don't want to use --debug or --info, since that logging is much too verbose. I just want to know what the result of the Gradle build is in terms of artifacts (zip, jar, dmg, etc.). For example, when I run 'gradle jar', I'd like to print where the jar is created.
I did that using:
jar {
    doLast {
        println "Jar has been created in ${archivePath}"
    }
}
And it nicely prints that the jar has been created in the build/lib directory. However, when I run 'gradle distZip', the artifact is not created in the lib dir but in the distributions directory, and the message above is still printed. I'd rather not have that: when I run distZip, I'd like to know where I can find the output of that command, not of every step that distZip depends on.
Never mind, the following will work just nicely:
def artifacts = []

addListener(new TaskExecutionListener() {
    void afterExecute(Task task, TaskState state) {
        if (task in AbstractArchiveTask) {
            artifacts << task.outputs.files.singleFile
        }
    }
    void beforeExecute(Task task) { }
})

addBuildListener(new BuildAdapter() {
    void buildFinished(BuildResult result) {
        if (artifacts) {
            println "\nOutput location: ${artifacts.last()}\n"
        }
    }
})
This is also available as a gist here.
I want to use the Apache Ant sshexec task in a Gradle custom task. The problem is that this task doesn't work (output is not shown in the console and the sshexec action is not executed). This is how I use it:
configurations {
    sshexecAntTask
}

repositories {
    mavenCentral()
}

dependencies {
    sshexecAntTask 'org.apache.ant:ant-jsch:1.7.0'
}
// ----------------------------------------------------
import java.nio.file.FileAlreadyExistsException
import java.nio.file.Files

class MyCustomTask extends DefaultTask {

    @TaskAction
    def build() {
        String command = 'cmd.exe /C mkdir C:\\aadd'
        runSshCommand(command)
    }

    private void runSshCommand(String command) {
        String host = "host"
        String username = "username"
        String password = "password"
        ant.taskdef(name: 'sshexec',
                    classname: 'org.apache.tools.ant.taskdefs.optional.ssh.SSHExec',
                    classpath: project.configurations.sshexecAntTask.asPath)
        // this command is not executed; why?
        ant.sshexec(host: host, username: username, password: password,
                    command: command, trust: 'true', failonerror: 'true')
    }
}
[EDIT]
I've tested sshexec, and these are my results:
- The command cmd.exe /C echo test > C:\testresult.txt started from Ant works correctly and output is returned to the file.
- The command cmd.exe /C echo test > C:\testresult.txt started from Gradle works correctly and output is returned to the file. Great!
- The command cmd.exe /C echo test started from Ant works correctly and output is returned to stdout.
- The command cmd.exe /C echo test started from Gradle works correctly, but output is not returned to stdout!
- The command cmd.exe /C mkdir C:\\\\Inetpub\\\\ftproot\\\\temp\\\\jakisnowykatalog started from Ant works correctly and the directory is created (I need to use \\\\ as the path separator because \\, \, and / don't work).
- The command cmd.exe /C mkdir C:\\\\Inetpub\\\\ftproot\\\\temp\\\\jakisnowykatalog started from Gradle doesn't work and the directory is not created.
I should add that I want to connect to a Windows ssh server (not unix/mac), but I've also tested these commands against a Mac ssh server without success. Please help!
[Another edit]
I've created Groovy test code which uses the JSch library to execute the command, and it works. I still don't know why the Ant task doesn't work.
import com.jcraft.jsch.*
import java.util.Properties

private void jschTest() {
    Session session = null
    Channel channel = null
    try {
        JSch jsch = new JSch()
        // JSch#getSession takes (username, host, port)
        session = jsch.getSession("login", "host", 22)
        session.setPassword("password")
        Properties config = new Properties()
        config.put("StrictHostKeyChecking", "no")
        session.setConfig(config)
        session.connect()
        String command = "cmd.exe /C mkdir C:\\gradledir"
        channel = session.openChannel("exec")
        ((ChannelExec) channel).setCommand(command)
        channel.connect()
    }
    catch (Exception e) {
        println e.getMessage()
    }
    finally {
        if (channel != null) {
            channel.disconnect()
        }
        if (session != null) {
            session.disconnect()
        }
    }
}
Assuming you declare a task of type MyCustomTask and execute it correctly, I see no reason why the Ant task wouldn't get executed. The problem is more likely elsewhere (e.g. wrong configuration of the Ant task).
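One thing worth checking regarding the missing stdout: Gradle captures output from Ant tasks and logs it at a lower level than Ant's own console does, so the command may in fact run while its output stays hidden unless you run with --info. If your Gradle version supports it, you can raise the level at which Ant messages surface (a sketch; verify the property exists in your Gradle version's AntBuilder docs):

```groovy
// surface Ant task messages logged at INFO priority in Gradle's
// normal (lifecycle) output, so sshexec output shows up without --info
ant.lifecycleLogLevel = "INFO"
```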