How to run job-dsl-plugin locally with additional plugins

We are using the Jenkins Job-DSL plugin for creating a number of jobs and having the actual job-configuration as part of the version-controlled source-code.
In order to test the resulting XML files locally, I currently use something like the following:
java -jar /opt/job-dsl-plugin/job-dsl-core/build/libs/job-dsl-core-1.78-SNAPSHOT-standalone.jar create_jobs.groovy
This lets me inspect the resulting XML while making changes.
However some DSL elements are failing in the local build, but still work on the actual Jenkins installation.
E.g. "batchFile", "pullRequestBuildTrigger" and a few others.
As far as I understand these are separate Jenkins plugins which contribute some additional elements to the DSL, so the core job-dsl-plugin does not know about them.
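For illustration, a minimal create_jobs.groovy that triggers the problem might look like this (the job name and command are made up; batchFile is one of the elements mentioned above):

job('example-windows-build') {
    steps {
        // works on the real Jenkins, fails in the standalone run
        batchFile('gradlew.bat build')
    }
}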
I tried various ways of adding the code from these plugins to the job-dsl-plugin so that I can run the local transformation, but I could not find a way that actually works. Adding the plugins to the job-dsl-plugin, to the classpath, ... nothing fixed it.
I also looked at How to import and run 3rd party Jenkins Plugin's extension DSL (githubPullRequest) with Gradle tool locally?, but the suggestions there did not work for me as I do not want to run a local Jenkins instance here.
So how can I run the job-dsl-plugin manually with the DSL elements from additional plugins available?

Related

Run tests with multiple dependency versions?

My library depends on another library; let's call it "lib". I want to test my library with multiple versions of lib, in an automated manner.
Test if my library compiles for each version of lib.
Run JUnit 5 tests for each version of lib.
Are there any existing solutions for this?
I could write a script that changes the version number of lib in my pom.xml and executes mvn compile and mvn surefire:test. I could also use profiles and automate this with a script. I was hoping there is a better way, through something like a Maven plugin.
Maven focuses on reproducible builds, which means that if you repeat the build at a later date you should get the same results, which in turn requires that the dependency versions are fixed.
This fundamental mindset is what you want to challenge. Maven won't like it even if it is for a good reason, and you will most likely need to have a separate full run for each version instead of looping inside Maven.
The way I would approach this is to have a bill-of-materials POM with a dependencyManagement section that lists the exact version you want, generated on the local filesystem before each run, and then to orchestrate a run for each version you want to test.
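As a minimal sketch of that orchestration, simplified to a version property instead of a generated BOM (the property name lib.version and the version numbers are made up), the outer loop could be a plain shell script:

for v in 1.2.0 1.3.0 2.0.0; do
  # compiles the library and runs the JUnit tests against this lib version
  mvn clean verify -Dlib.version=$v || exit 1
done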
You can also leverage your build system and have a repository which orchestrates this. GitHub Actions can do matrix builds, which might be what you need.

create a Groovy script distribution/executable via Maven

I have a project module full of Groovy scripts which are run via the embedded IntelliJ Groovy shell. For a new issue I need one of those scripts to run in combination with crontab. Needless to say, I cannot just run groovy myScript.groovy dev to have the script executed out of the box; the dependencies are missing for sure.
I now need a way to have this one particular Groovy script compiled and ready to run out of the box (with the "dev" parameter).
Assuming that I put the myScript.groovy into a directory
main/
|_src/
  |_groovy/
What do I need for a Maven build to create a usable executable that I can drop onto my machine and let crontab run accordingly?
I tried a lot of Maven plugins but never got far enough. Also, I'm sure there must be a far more trivial way to achieve this, since it's a simple build operation in my opinion.
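One common approach, offered here only as an assumption since the question does not name a solution, is to let Maven build a self-contained "fat" jar (e.g. with the maven-shade-plugin) and point crontab at it:

# hypothetical crontab entry: run the script daily at 06:00 with the "dev" parameter
0 6 * * * java -jar /opt/scripts/myScript-all.jar dev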

Jenkins and Maven profiles

We are working on a legacy project, and the first task is to set up DevOps for it.
The important thing is that we are very new to this area.
We are planning to use Jenkins and SonarQube for the purpose initially. Let me start with the requirements.
Currently the project is subdivided into multiple projects (not modules).
We had to follow this build structure as there are no plans for reorganising it as a single multi-module Maven project.
Currently the builds and dependencies are managed manually.
E.g. the project is subdivided into 5 multi-module Maven projects,
say A, B, C, D and E
1. A and C are completely independent and can be easily built
2. B depends on the artifact generated by A (a jar) and has multiple Maven profiles (say tomcat and websphere; it is a webservice module)
3. D depends on the artifact generated by C
4. E depends on A, B and D and has multiple Maven profiles (say tomcat and websphere; it is a web project)
Based on the Jenkins documentation for handling this scenario, we are thinking about parameterized builds using the "Parameterized Build plugin" and the "Extended Choice Parameter plugin". With the help of these plugins we are able to parameterize the profile name, but before each build the builder waits for the profile parameter.
So we are still searching for a good solution to
1. keep the dependency between projects and build all of the projects if there is any change in SCM (SVN). For that we used "Build whenever a SNAPSHOT dependency is built" and the "SCM polling" option. Unfortunately this does not seem to work in our case (we set an interval of 5 min for SCM polling, but no build happens on test commits).
2. Even though we are able to parameterize the profile, this still seems to be a manual step (is there an option to automate this part too, i.e. so that builds with the tomcat profile and the websphere profile happen sequentially?).
We are struggling to find a solution that covers all these core requirements. Any pointers would be greatly appreciated.
Thanks,
San
My Maven knowledge is limited; however, since you didn't get any response yet, I'll try to give some general advice.
There are usually multiple ways to reach a given aim in Jenkins, each with its pros and cons. Choosing the most fitting solution depends on your specific requirements and your environment/setup.
However, you first need something that just works; then you can refine it.
You get a quick result with the following:
Everything in one job
Configure your Subversion repo (multiple repos are possible) to be checked out into your workspace
Enable the Poll SCM trigger
Build your modules/projects via Execute shell build steps (a failed build can be propagated to the job result by using exit 1 in an Execute shell build step), for example:
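A sketch of such an Execute shell build step, assuming the five project checkouts sit side by side in the workspace and respecting the dependency order from the question:

mvn -f A/pom.xml clean install || exit 1
mvn -f C/pom.xml clean install || exit 1
mvn -f B/pom.xml clean install -P tomcat || exit 1
mvn -f D/pom.xml clean install || exit 1
mvn -f E/pom.xml clean install -P tomcat || exit 1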
However, keep in mind that this prevents advanced functionality on a per-project/module basis, such as mail notifications to the dev to blame, or trends of metrics like warnings or static code analysis.
The following solution is easier to extend in that direction.
Wrapper job around your various build jobs
Use the build step Trigger/call builds on other projects to build A; archive the needed artifacts
Use the build step Trigger/call builds on other projects with some parameter tomcat to build the B tomcat version; use the Copy Artifact Plugin to copy over the jar from A
...
Use the build step Trigger/call builds on other projects with some parameter tomcat to build the E tomcat version. Use the Copy Artifact Plugin to copy all needed artifacts; you can specify the parameter there if you need the artifact of, e.g., the B tomcat version
In this setup, monitoring the SVN is an issue: if you trigger from Poll SCM, the sources will be checked out into your wrapper workspace, while you don't actually need them checked out there, but in your build jobs.
Possible solution: share the workspace between the wrapper job and your build jobs, so the duplicate checkouts in the build jobs will find the files already at the right revision. However, then you need to make sure the downstream jobs are executed on the same machine (there are plugins to do so).
Or even more elegant: use a post-commit hook (see here, section Post-commit hook) on your SVN to notify Jenkins of changes.
Edit: For the future, it's worth looking into the Pipeline Plugin and its documentation for more complex builds; this is the engine for the upcoming Jenkins version 2.0, see here.
I would create 5 different jobs for A, B, C, D and E.
As you mentioned, A and C would be standalone jobs, so I would just do mvn clean install/package/verify based on your need.
For B, I would first build A and then invoke another Maven target in the build to build B.
For D, I would first build C and then build D.
Finally, for E, I would invoke top-level Maven targets 5 times: A, B, C, D and finally E.
Edit:
Jenkins 2 is out and has built-in support for pipelines.
A few pointers for your requirements:
"built the whole projects if there is any change in SCM"
Although Poll SCM usually requires less work, the proper way to do it is to use SVN hooks.
The solution works as follows:
First you enable the Trigger builds remotely feature and enter a random token in Authentication Token.
This allows you to trigger builds remotely using Jenkins REST API (http[s]://JENKINS_URL/job/BUILD_NAME/build?token=TOKEN)
Then you create an SVN hook (a script that runs whenever you commit) which triggers the build by sending a request to that URL (using curl, wget, Python, ...).
There are a lot of manuals on how to create SVN hooks; here's the first Google result for "SVN hooks".
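A minimal post-commit hook along those lines (URL, job name and token are placeholders):

#!/bin/sh
# notify Jenkins that a commit happened; Jenkins then checks out and builds
curl -s "https://jenkins.example.com/job/my-build/build?token=SECRET_TOKEN"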
"keep the dependency between projects"
I would create a separate Jenkins job for each project, then make sure the builds are executed in the required order.
I think the best way to order your builds (dependencies) is to create a build pipeline using the Pipeline Plugin (previously known as the Workflow Plugin).
There is a lot to explain here, so it's better you read up on your own. You can start here.
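To give a rough idea, a scripted Pipeline that encodes the dependency order from the question could be as short as this (job and parameter names are assumptions):

// A and C are independent and go first
build 'A'
build 'C'
// B needs A's artifact, E needs A, B and D
build job: 'B', parameters: [string(name: 'MVN_PROFILE', value: 'tomcat')]
build 'D'
build job: 'E', parameters: [string(name: 'MVN_PROFILE', value: 'tomcat')]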
There are also other (simpler) solutions, like the Build Flow Plugin or the Parameterized Trigger Plugin, which can help create dependencies between builds, but I think Pipeline is the newest and considered best practice (it's definitely the most advanced solution).
Still, having said that, if you feel Pipeline is overkill for you, go for the alternatives.
I would recommend making sure each build does a mvn install to the same local repo, and also deploys the artifact to Artifactory (hopefully you have one).
Automate parameterized builds: "build with tomcat profile and websphere profile"
To do that you'll need to create parameterized builds.
That's pretty easy to do, you just check This build is parameterized in your build config and add a MVN_PROFILE string/choice parameter.
After that you can trigger each build several times, with different parameters, using any one of the plugins mentioned in the previous bullet.
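Inside each job, the Maven invocation would then pick up the parameter along the lines of

mvn clean install -P $MVN_PROFILE

and the upstream trigger fires the job once with MVN_PROFILE=tomcat and once with MVN_PROFILE=websphere.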
Extra Tip:
While hacking your way through this, consider using the Job Configuration History Plugin; it can help review and revert changes made to the configuration.
Good luck, hope this helps :)
I would consider a slightly different approach to fully decouple the projects.
If you are able to create your own internal Artifactory, then in the Maven build I would treat each of the dependencies as a third-party library, exactly as is done with any other external libraries you are using.
This way, each such project can be separately built and stored in the Artifactory, and when a dependent project is built it will just take the right version as specified in the POM file.
This way you'll have a different build process for each of the projects, and only relevant projects (relevant = changed) will be built.

Resolve jar dependencies automatically

A project was using various libraries, e.g. a.jar, b.jar, c.jar, d.jar, etc.
Some of the jars have been refactored and are now ab.jar, cd.jar, etc.
What I need is an automatic way to find which jars in my installation are now obsolete so that I can delete them.
Is this possible?
So with LooseJar you can detect unused jar files by adding:
-javaagent:loosejar.jar
to your java command when you invoke it from the command line (or as a VM option in Eclipse). I guess this isn't technically automatic, because lines of code that dynamically load classes at runtime will need to be invoked in order for LooseJar to know that the class, and therefore the jar, is needed. A good method might be to invoke your unit tests with this Java agent (assuming your unit tests have good code coverage).
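Put together, an invocation could look like this (main class and classpath are placeholders):

java -javaagent:loosejar.jar -cp "app.jar:lib/*" com.example.Main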
The best way is to use Maven. If dependencies are defined in Maven, you can just run mvn dependency:tree to retrieve the needed information. Please refer to this article for details.
If you do not use Maven, you probably have to use tools like JDepend. But be careful: such tools cannot really retrieve all dependencies. It is impossible to detect a dependency on a dynamically loaded class, or on an API called via reflection, using static analysis only. A full solution can be achieved only if you run your application, test it with all possible scenarios, and check which classes are loaded by the class loader. If you have 100% test coverage, you can run your application with the option -verbose:class and run all your unit tests against it. You will get a list of all loaded classes. Now put this list into a file and write a shell script that analyses the class list and transforms it into a list of jars.
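As a rough sketch of that last step, the loaded-class log can be reduced to a sorted list of jars with standard shell tools (the application jar is a placeholder):

java -verbose:class -jar app.jar 2>&1 \
  | grep '^\[Loaded' \
  | awk '{print $4}' | tr -d ']' \
  | grep '\.jar$' | sort -u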

Maven requires manual dependency update?

I'm new to Maven, using the m2e plugin for Eclipse. I'm still wrapping my head around Maven, but it seems like whenever I need to import a new library, like java.util.List, I now have to manually go through the hassle of finding the right repository for the jar and adding it to the dependencies in the POM. This seems like a major hassle, especially since some jars can't be found in public repositories and have to be uploaded into the local repository.
Am I missing something about Maven in Eclipse? Is there a way to automatically update the POM when Eclipse automatically imports a new library?
I'm trying to understand how using Maven saves time/effort...
You picked a bad example: portions of the Java class library that come with the Java standard runtime, such as java.util.List, are there regardless of the Maven configuration, so they need no POM entry at all.
With that in mind, if you wanted to add something external, say Log4j, then you would need to add a project dependency on Log4j. Maven would then take the dependency information and create a "signature" to search for, first in the local cache, and then in the external repositories.
Such a signature might look like
groupId:artifactId:version
or perhaps
groupId:artifactId:version:classifier
This identifies a Maven "module" which will then be downloaded and configured into your system. Once in place, it adds all of the classes within the module to your configured project.
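For example, Log4j's signature would be log4j:log4j:1.2.17 (version picked arbitrarily here), which in the POM is declared as:

<dependency>
  <groupId>log4j</groupId>
  <artifactId>log4j</artifactId>
  <version>1.2.17</version>
</dependency>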
Maven principally saves time in downloading and organizing JAR files in your build. By defining a "standard" project layout and a "standard" build order, Maven eliminates a lot of the guesswork in the "why isn't my project building" sweepstakes. Also, you can use neat commands like "mvn dependency:tree" to print out a list of all the JARs your project depends on, recursively.
Warning note: if you are using the m2e plugin and Eclipse, you may also run into problems with the plugin itself. The 1.0 version (hosted at eclipse.org) was much less friendly than the previous 0.12 version (hosted at Sonatype). You can get around this to some extent by downloading and installing the standalone version of Maven from Apache (maven.apache.org) and running it from the command line. This is actually much more stable than trying to run Maven inside Eclipse (in my personal experience) and may save you some pain as you try to learn about Maven.
