Maven: running unit tests remotely

We are currently working on a distributed Java EE application and therefore have separate test and production systems.
Compiling and bundling are done via an Ant task. Now we want to deploy the JAR files for the different servers to the test servers and run the JUnit integration/function tests there. If they succeed, the current version should be deployed to the live servers.
Plain unit tests are executed by Hudson.
Is this possible with Maven, and is there any information or best practice available?

Yes. Hudson has Maven integration; take a look at the Hudson wiki and its Maven integration pages.
You can set unit-test thresholds for your job, so that if it fails more than a certain number of test cases, the deploy plugin is not invoked and the app is not deployed.

1. Take the JAR built by Ant and reuse it. I would add a Maven repository manager to your environment, such as Artifactory, Archiva, or Nexus, and deploy to it using Ivy. You almost certainly need a Maven repository to be happy with Maven for anything beyond small-scale personal projects. http://ant.apache.org/ivy/
2. Use Maven to grab the JAR from the Maven repository. For this, just use a normal Maven dependency declaration (see the sketch after this list).
3. Run Maven on the QA server, with the JUnit tests declared in that project. If that succeeds, deploy the JAR to the production server. Here the details depend on the production server: if it's a WAR, I would use Cargo; but if it's a JAR, it really depends on what executes the JAR - you might need some sort of file copy, scp, etc. http://cargo.codehaus.org/
4. Hudson and TeamCity both have deployment features as well. You just set up a job to run (in this case the Maven job) and tell the CI server to deploy on success.
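As a sketch of step 2: once the Ant-built JAR has been published to the repository manager, a normal dependency declaration in the consuming project's pom.xml is all that is needed (the coordinates below are hypothetical):

<dependencies>
  <dependency>
    <!-- hypothetical coordinates; use whatever you published from Ant/Ivy -->
    <groupId>com.example.app</groupId>
    <artifactId>server-core</artifactId>
    <version>1.0.0</version>
  </dependency>
</dependencies>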


Jenkins and Maven profiles

We are working on a legacy project, and the first task is to set up a DevOps pipeline for it.
The important thing is that we are very new to this area.
We are planning to use Jenkins and SonarQube for this purpose initially. Let me start with the requirements.
Currently the project is subdivided into multiple projects (not modules).
We had to follow this build structure as there are no plans for reorganising it as a single multi-module Maven project.
Currently the builds and dependencies are managed manually.
E.g. the project is subdivided into 5 multi-module Maven projects,
say A, B, C, D and E:
1. A and C are completely independent and can be easily built
2. B depends on the artifact generated by A (a jar) and has multiple Maven profiles (say tomcat and websphere; it is a webservice module)
3. D depends on the artifact generated by C
4. E depends on A, B and D and has multiple Maven profiles (say tomcat and websphere; it is a web project)
Based on the Jenkins documentation for handling this scenario, we are thinking about parameterized builds using the "Parameterized Build plugin" and the "Extended Choice Parameter plugin". With the help of these plugins we are able to parameterize the profile name, but before each build, the builder waits for the profile parameter.
So we are still searching for a good solution to:
1. Keep the dependency between projects and build all the projects if there is any change in SCM (SVN). For that we used "Build whenever a SNAPSHOT dependency is built" and the SCM polling option. Unfortunately this option seems not to work in our case (we have set an interval of 5 minutes for SCM polling, but no build happens on test commits).
2. Even though we are able to parameterize the profile, this still seems to be a manual step (is there an option to automate this part too, i.e. builds with the tomcat profile and the websphere profile should happen sequentially?).
We are struggling to find a solution to cater all these core requirements. Any pointer would be greatly appreciated.
Thanks,
San
My Maven knowledge is limited; however, since you didn't get any response yet, I'll try to give some general advice.
There are usually multiple ways to reach a given aim in Jenkins, each with its pros and cons. Choosing the most fitting solution depends on your specific requirements and your environment/setup.
However, you first need something that just works; then you can refine it.
You can get a quick result with the following:
Everything in one job
Configure your Subversion repo (multiple repos are possible) to be checked out into your workspace
Enable the Poll SCM trigger
Build your modules/projects via Execute shell build steps; see the sketch below. (A failed build can be propagated to the job result by using exit 1 in an Execute shell build step.)
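A minimal sketch of such an Execute shell build step, assuming the projects are checked out into subdirectories A and B of the workspace:

# Execute shell build step: build the projects in dependency order.
# set -e aborts on the first failing command, which fails the Jenkins
# job (same effect as calling exit 1 manually).
set -e
cd "$WORKSPACE/A" && mvn clean install
cd "$WORKSPACE/B" && mvn clean install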
However, keep in mind that this prevents advanced functionality on a per-project/module basis, such as mail notifications to the developer to blame, or trends of metrics like warnings or static code analysis.
The following solution is easier to extend in that direction.
Wrapper job around your various build jobs
Use the build step Trigger/call builds on other projects to build A; archive the needed artifacts
Use the build step Trigger/call builds on other projects with some parameter tomcat to build the tomcat version of B; use the Copy Artifact Plugin to copy over the jar from A
...
Use the build step Trigger/call builds on other projects with some parameter tomcat to build the tomcat version of E. Use the Copy Artifact Plugin to copy all needed artifacts; you can specify the parameter there if you need the artifact of, e.g., the tomcat version of B
In this setup, monitoring the SVN is an issue: if you trigger it from Poll SCM, the sources are checked out into your wrapper job's workspace, while you don't actually need them checked out there but in your build jobs.
Possible solution: share the workspace between the wrapper job and your build jobs, so the duplicate checkouts in the build jobs will find the files already at the right revision. However, you then *need* to make sure the downstream jobs are executed on the same machine (there are plugins to do so).
Or, even more elegant: use a post-commit hook (see here, section Post-commit hook) on your SVN to notify Jenkins of changes.
Edit: For the future, it's worth looking into the Pipeline Plugin and its documentation for more complex builds; it is the engine for the upcoming Jenkins version 2.0, see here.
I would create 5 different jobs for A, B, C, D and E.
As you mentioned, A and C would be standalone jobs, so I would just do mvn clean install/package/verify based on your need.
For B, I would first build A and then invoke another Maven target in the build to build B.
For D, I would first build C and then build D.
Finally, for E, I would invoke top-level Maven targets 5 times: A, B, C, D and finally E (see the sketch below).
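A minimal sketch of that ordering as a single shell step, assuming each project is checked out into a directory named after it; mvn install publishes each artifact to the shared local repository so the next project in the chain can resolve it:

# Build the projects in dependency order; abort on the first failure.
set -e
for project in A B C D E; do
  (cd "$project" && mvn clean install)
done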
Edit:
Jenkins 2 is out and has built-in support for pipelines.
A few pointers for your requirements:
"built the whole projects if there is any change in SCM"
Although Poll SCM usually requires less work, the proper way to do it is to use SVN hooks.
The solution works as follows:
First you enable the Trigger builds remotely feature and enter a random token in Authentication Token.
This allows you to trigger builds remotely using Jenkins REST API (http[s]://JENKINS_URL/job/BUILD_NAME/build?token=TOKEN)
Then you create an SVN hook (a script that runs whenever you commit) which triggers the build by sending a request to that URL (using curl, wget, python, ...).
There are a lot of manuals on how to create SVN hooks; here's the first result from Google for "SVN Hooks".
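As a sketch, a post-commit hook along these lines does the job (JENKINS_URL, BUILD_NAME and TOKEN are placeholders for your own server, job name and authentication token):

#!/bin/sh
# hooks/post-commit in the SVN repository.
# SVN passes the repository path and revision as $1 and $2; this
# minimal sketch ignores them and simply triggers the Jenkins job.
curl -s "http://JENKINS_URL/job/BUILD_NAME/build?token=TOKEN" >/dev/null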
"keep the dependency between projects"
I would create a separate Jenkins job for each project, then make sure the builds are executed in the required order.
I think the best way to order your builds (dependencies) is to create a build pipeline using the Pipeline Plugin (previously known as the Workflow Plugin).
There is a lot to explain here, so it's better you read up on your own. You can start here.
There are also other (simpler) solutions, like the Build Flow Plugin or the Parameterized Trigger Plugin, which can help create dependencies between builds, but I think Pipeline is the newest and is considered best practice (it's definitely the most advanced solution).
Still, having said that, if you feel Pipeline is overkill for you, go for the alternatives.
I would recommend making sure each build does an mvn install to the same local repo, and also deploys the artifact to Artifactory (hopefully you have one).
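For the deploy part, a minimal distributionManagement sketch for the pom.xml could look like this (the ids and Artifactory URLs are placeholders; each id must match a <server> entry with credentials in settings.xml):

<distributionManagement>
  <repository>
    <id>company-releases</id>
    <url>http://artifactory.example.com/artifactory/libs-release-local</url>
  </repository>
  <snapshotRepository>
    <id>company-snapshots</id>
    <url>http://artifactory.example.com/artifactory/libs-snapshot-local</url>
  </snapshotRepository>
</distributionManagement>

With this in place, mvn deploy pushes the artifact to Artifactory after installing it locally.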
Automate parameterized builds: "build with tomcat profile and websphere profile"
To do that you'll need to create parameterized builds.
That's pretty easy to do: you just check This build is parameterized in your build config and add an MVN_PROFILE string/choice parameter.
After that you can trigger each build several times, with different parameters, using any one of the plugins mentioned in the previous bullet (see the sketch below).
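A sketch of how the triggered build consumes such a parameter; Jenkins exposes the MVN_PROFILE parameter to the build as an environment variable:

# Execute shell build step of the parameterized job; MVN_PROFILE is
# set by Jenkins to e.g. "tomcat" or "websphere".
mvn clean package -P "$MVN_PROFILE"

The upstream job then triggers this job twice, once with MVN_PROFILE=tomcat and once with MVN_PROFILE=websphere, which gives you the sequential profile builds without any manual input.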
Extra Tip:
While hacking your way through this, consider using the Job Configuration History Plugin; it can help review and revert changes made to the configuration.
Good luck, hope this helps :)
I would consider a somewhat different approach to fully decouple the projects.
If you are able to create your own internal Artifactory, then in the Maven build I would treat each one of the dependencies as a third-party library, exactly as is done with any other external libraries you are using.
This way, each such project can be separately built and stored in the repository, and when a dependent project is built it will just take the right version as mentioned in its pom file.
This way you'll have a different build process for each of the projects, and only relevant projects (relevant = changed) will be built.

Running a multi-project Maven-built webapp on Heroku?

I'm migrating my multi-project Java webapp from CloudBees to Heroku. I have one main webapp that depends on 3 other library projects that I have written (also Maven+Java). On CloudBees this was simple: you just build everything through Jenkins, the JARs get poked into their Maven repos, then the main webapp gets built, taking its dependencies from that Maven repo.
However, I can't for the life of me find a way of doing this on Heroku without doing something horrendous like this:
https://devcenter.heroku.com/articles/local-maven-dependencies
Doing so would mean that every time I changed a library I would have to remember to deploy the built JAR into the main webapp project and push it to Git, somewhat defeating the point of using Maven!
I understand that the way to deploy your webapp on Heroku is essentially by doing a git push to their repo, but how can I tell Heroku at the other end to find my library dependencies without having to bundle them in the main webapp as this article suggests?
I would assume that there is some way to set up a private Maven repo on your Heroku account, but I can't find anything like this.
I think the best solution is to use a private Maven repository and then add the config for that repo in the settings.xml of your main project. This article on using a custom settings.xml describes how to do that.
There are some services that host private Maven repos for you, such as JFrog. But I've even read of people using GitHub to host a Maven repo, too.
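A minimal sketch of such a settings.xml, assuming a hypothetical private repository URL (how Heroku picks the file up is described in the article mentioned above):

<settings>
  <profiles>
    <profile>
      <id>private-repo</id>
      <repositories>
        <repository>
          <!-- hypothetical private repository -->
          <id>my-private-repo</id>
          <url>https://repo.example.com/releases</url>
        </repository>
      </repositories>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>private-repo</activeProfile>
  </activeProfiles>
</settings>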

How to do a deployment pipeline with Maven

We want to accelerate our build pipeline for a multi-module Java web application, which roughly consists of
compile/code analysis
unit tests
integration tests
GUI tests
At the moment each of these build steps starts from scratch, compiling and building again and again, which costs time and means that the files we actually deploy to production are not the ones that have gone through the tests. Is it possible to get Maven to not recompile everything on subsequent steps but instead run the tests against the previously compiled classes?
We are using Maven 3 to manage our dependencies and TeamCity as a build server (version 7 at the moment, planning to upgrade to 8 soon). I have tried to set up a build chain, doing a
mvn clean install
on the first step and then exporting all the */target/ folders to the following builds. Then ideally I would only do a
mvn test
mvn integration-test
Unfortunately, I have not been able to persuade Maven to do this properly: either it compiles the classes again or it produces errors.
Has anyone successfully done this kind of setup and has any pointers for me? Is this possible with Maven and is this even the right way to do things?

What is the process with Maven project to compile and test the code?

I have a Maven project imported into my Eclipse workspace. Now I need to start making changes to it and test it with the integration tests (outside the app server). Currently, the integration tests run outside the server using an OpenEJB container.
My basic question is: what is the regular process to compile, build and test with Maven?
mvn install
Maven -> Update Project.
Run my tests from the command line
Is that how it is done? I am specifically interested in the mvn install command.
So should I do all three steps before I can test it?
Example: I just want to print something and see what the output is. For this I guess I need to do all these steps?
The OpenEJB container needs the classes so it can load them.
There is a wonderful Maven quick-reference sheet at http://maven.apache.org/guides/MavenQuickReferenceCard.pdf
First, you should be aware that unit tests and integration tests are separate; they are run by separate plugins and in separate parts of the Maven lifecycle. Unit tests are run with Surefire and integration tests with Failsafe.
You want to run integration tests and the failsafe documentation says:
NOTE: when running integration tests, you should invoke maven with the (shorter to type too)
mvn verify
rather than trying to invoke the integration-test phase directly...
This is the best way to run integration tests directly in Maven. It will run all the preceding steps necessary (e.g. compile) in order to run the integration tests. It won't waste time doing an install, because install happens immediately after verify.
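For Failsafe to run anything, it has to be bound in the pom.xml; a minimal sketch (by default Failsafe picks up test classes matching patterns such as *IT.java):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <!-- example version; use a current release -->
  <version>2.22.2</version>
  <executions>
    <execution>
      <goals>
        <!-- run the tests in integration-test, check results in verify -->
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>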
But if you're running the tests locally, it may be a better idea to run your integration tests directly in your IDE. That will give you a much faster feedback loop.
If it is an Eclipse project, the most reasonable thing is to do everything not from the command line but from Eclipse. Assuming you have the m2e plugin installed, go to your_project -> Run As -> Maven test and run it.
You need neither the install nor the package phase to run Maven tests: package creates a jar which is not needed for tests, and install copies this jar to the local repo, which is also useless here. When Maven runs tests it uses the compiled classes from the target dir and ignores the project's jar even if it exists.
Yes, mvn install is the most popular option. It compiles, packages and tests your project.

Deployment from Nexus to different environment (Test, Prerelease, Production)

I have Java projects built with Maven, with artifacts (.jar, .war) deployed to a Nexus release repository. Jenkins is also used for CI (building every hour) and automatically deploys to Tomcat (the integration testing environment). We are using the maven-release-plugin for artifact deployment to Nexus; that is done on a local PC.
I need to automate deploying to other 3 environments: Test, Prerelease, Production.
There are 2 problems:
It is unlikely that I can use Jenkins for that, as Jenkins can't know when the current version has been promoted as good and released.
The location of the .jar/.war is different after every release, e.g.:
http://nexusserver:8081/nexus/content/repositories/releases/com/company/projectname/component/0.2.4/
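One way around the changing location is to resolve the artifact by its coordinates rather than by URL, for example with the maven-dependency-plugin's dependency:get goal; this is only a sketch, with the coordinates and repository URL taken from the example path above:

mvn dependency:get \
  -DgroupId=com.company.projectname \
  -DartifactId=component \
  -Dversion=0.2.4 \
  -DremoteRepositories=http://nexusserver:8081/nexus/content/repositories/releases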
A somewhat similar question is:
Deploying from Nexus to Tomcat (via Jenkins/Hudson)
It sounds like you need what is often referred to as a "build pipeline" or "build pipeline manager", a term which I believe became popular with the (excellent) book "Continuous Delivery".
There is an open source Jenkins plugin called Build Pipeline Plugin that may meet your needs.
https://wiki.jenkins-ci.org/display/JENKINS/Build+Pipeline+Plugin
