We want to accelerate our build pipeline for a multi-module Java web application, which roughly consists of
compile/code analysis
unit tests
integration tests
GUI tests
At the moment each of these build steps starts from scratch, compiling and building everything again and again. That costs time, and it also means that the files we deploy to production are not the exact artifacts that went through the tests. Is it possible to get Maven not to recompile everything on the subsequent steps, but instead to run the tests against the previously compiled classes?
We are using Maven 3 to manage our dependencies and TeamCity as a build server (version 7 at the moment, planning to upgrade to 8 soon). I have tried to set up a build chain, doing a
mvn clean install
on the first step and then exporting all the */target/ folders to the following builds. Then ideally I would only do a
mvn test
mvn integration-test
Unfortunately I have not been able to persuade Maven to do this properly. Either it compiles the classes again or produces errors.
Has anyone successfully done this kind of setup and has any pointers for me? Is this possible with Maven and is this even the right way to do things?
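For reference, the kind of chain I am aiming for would look roughly like this. It is only a sketch: it assumes TeamCity restores the */target/ folders unchanged between the steps, and it relies on -Dmaven.compiler.useIncrementalCompilation=false to keep the compiler plugin from rebuilding classes that were restored from the previous step.
# TeamCity build step 1: compile, analyse and package everything once
# (skip the tests here; they run in the later steps)
mvn clean install -DskipTests
# TeamCity build step 2, with */target/ restored from step 1: unit tests only
mvn test -Dmaven.compiler.useIncrementalCompilation=false
# TeamCity build step 3, with */target/ restored again: integration tests
mvn integration-test -Dmaven.compiler.useIncrementalCompilation=false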
Related
I'm using the JaCoCo Maven plugin and agent to measure and retrieve the code coverage data of an application which is tested nightly.
(Architecture diagram omitted: Jenkins runs the Maven build, the JMeter tests hit a remote test server, and the JaCoCo agent running on that server records the coverage.)
My Maven project is configured with the JMeter Maven plugin to execute some API tests during the Maven verify phase.
The Maven command executed by the Jenkins server is the following
mvn verify org.jacoco:jacoco-maven-plugin:0.7.8:dump sonar:sonar \
    -Djacoco.address=TEST_SERVER \
    -Djacoco.destFile=/proj/coverage-reports/jacoco-it.exec \
    -Dsonar.projectKey=sonar_test \
    -Dsonar.projectName=sonar_test \
    -Dsonar.branch=sonar_test \
    -Dsonar.jacoco.itReportPath=/proj/coverage-reports/jacoco-it.exec \
    -Dsonar.java.coveragePlugin=jacoco \
    -Dsonar.language=java
As you can see, the tests are executed first through the verify phase, then the jacoco:dump goal retrieves the coverage data from the test server (I configured the server to run the JaCoCo agent), and finally the data is uploaded to my Sonar server.
The "strange" behaviour I'm having is that if I run this command on my computer and then on Jenkins (configuring the Jenkins project accordingly) in the SonarQube page I get different coverage results. Moreover, if I configure the Jenkins project and then I simply COPY it creating a new (but equivalent) Jenkins project, the results are different.
I tried different configurations and cases, but I cannot understand what the problem could be. Am I overlooking some JaCoCo constraint (e.g. something related to the Jenkins project name)?
As said in the question comments, the artifact deployed on the test server and the one compiled during the verify phase (against which the report is generated) must be exactly the same; it is not enough that the source code is the same.
To solve my problem I had to implement this workflow with Jenkins:
Do a mvn package on the project
Deploy the generated WAR on the remote server using Ansible (we already use Ansible for nightly deploys and other tasks on remote machines)
Run the remote tests without recompiling the WARs. To do this I had to add the Maven flag -Dmaven.compiler.useIncrementalCompilation=false (thanks to this and this for the hints) so that the artifacts are not recompiled during the verify phase
Retrieve (dump) the JaCoCo coverage data
So the Maven command described in the question has been split into two commands: one that creates the package, and one that runs the tests and retrieves the JaCoCo data without recompiling the artifacts.
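Concretely, the single command from the question ends up split roughly like this (a sketch only; the Ansible deployment happens between the two commands, and the -Dsonar.* project properties from the question stay the same and are omitted here for brevity):
# Step 1: build the WAR that Ansible then deploys to the test server
mvn package
# Step 2: run the remote tests without recompiling, dump the coverage, send it to Sonar
mvn verify org.jacoco:jacoco-maven-plugin:0.7.8:dump sonar:sonar \
    -Dmaven.compiler.useIncrementalCompilation=false \
    -Djacoco.address=TEST_SERVER \
    -Djacoco.destFile=/proj/coverage-reports/jacoco-it.exec \
    -Dsonar.jacoco.itReportPath=/proj/coverage-reports/jacoco-it.exec \
    -Dsonar.java.coveragePlugin=jacoco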
I'm trying to run a complete Maven build of multiple projects for an automated build tool. If unit tests fail, but the project itself builds correctly, I want to be able to continue the build and detect this after the build completes. I tried doing this:
mvn clean package -Dmaven.test.failure.ignore=true -Dmaven.test.error.ignore=true -Dmaven.test.reportsDirectory=/Users/bfraser/misc/reports
The "maven.test.failure.ignore" and "maven.test.error.ignore" properties work fine. However, surefire seems to ignore the "maven.test.reportsDirectory" completely (in fact, if you look at the documentation for the test goal, the reportsDirectory property is not documented to be tied to the system variable). This may be because I'm building a multi-module project? All reports seem to go in the target/ folder of the subprojects.
It is very difficult for me to edit the POMs in an automated way, since many of them have parent POMs that might live in a Nexus repo somewhere, etc. I need to be able to do this externally to the project (preferably via command-line switches, but if I need to create some files, so be it... as long as I don't have to edit the project POMs, it's all good).
I just need to know if any test failed. I'm not particularly fussy about what/how many tests failed.
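One idea I'm considering (just a sketch, not something I've verified) is to ignore where the reports land and instead scan every module's default target/surefire-reports directory after the build; the TEST-*.xml files carry the failure and error counts as attributes of the testsuite element.
# run the build, ignoring test failures/errors as above
mvn clean package -Dmaven.test.failure.ignore=true -Dmaven.test.error.ignore=true
# fail the automation step if any module produced a report with a non-zero
# failures or errors count (assumes GNU grep and the default report layout)
if grep -rlE '<testsuite[^>]*(failures|errors)="[1-9]' --include='TEST-*.xml' . > /dev/null; then
    echo "at least one test failed or errored"
    exit 1
fi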
I am currently thinking about potential solutions for building and running a Jenkins Maven project. I am a Jenkins noob, and what I currently have in mind is writing a Maven plugin that runs the project right after the build and test phases. This feels wrong...
So my basic question is: is it possible to configure Jenkins to build a Maven project and execute it right away, while making sure that a new build triggered by an incoming change does not interfere with the instance that is already running?
If this is possible, it would ease the task by sparing me the "let's write a Maven plugin" part.
What do you mean by 'execute'? Is your program a jar? If you have to deploy it, there are tools in Jenkins to deploy as part of the build. If not, I think you can always add a 'post build' command like java -jar nameofprogram.jar
In Jenkins you can configure jobs to execute multiple Maven targets when they run. I don't know if this answers your question, but you should be able to accomplish what you want by using "post build steps" and triggering the desired behaviour from there.
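As a minimal sketch of that post-build idea (the jar name and paths are placeholders, and it assumes an "Execute shell" post-build step): stop whatever the previous build started, then launch the jar that was just built.
# Jenkins "Execute shell" post-build step (placeholder names)
# stop the instance launched by the previous build, if any
pkill -f nameofprogram.jar || true
# start the freshly built jar in the background so the Jenkins job can finish
nohup java -jar target/nameofprogram.jar > app.log 2>&1 &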
What I want to achieve is:
Build the Maven project and push the jar to the repo, using Maven and Jenkins.
Deploy the application using a script.
Run the JMeter test cases and display the test results in the Jenkins dashboard.
First Jenkins builds my project and pushes it to the repo.
Then I have defined a post-build step in Jenkins to run a script on a remote server; this script deploys and starts my application.
Then I have created a post-build action in Jenkins to invoke the top-level Maven target mvn verify, which triggers the jmeter-maven plugin, which runs the test cases against my already running application.
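Put together, the three steps boil down to something like this (only a sketch; the host name, script path and repository settings are placeholders):
# Jenkins build step: build the project and push the jar to the artifact repo
mvn clean deploy
# Jenkins post-build step: deploy and start the application on the remote server
ssh deploy@test-server '/opt/myapp/deploy_and_start.sh'
# Jenkins post-build action "Invoke top-level Maven targets": run the JMeter tests
mvn verify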
Is this a good approach, and if not, please let me know a better way to do this.
Thanks in advance.
The bit that may be missing here is how Jenkins knows whether the build should be marked as passed or failed. Even if the jmeter-maven-analysis plugin execution exits with a zero exit code, that doesn't mean the application is fine performance-wise. It may be, but it doesn't have to be. I came across this kind of concern some time ago and provided a solution. Check the project wiki for a usage example and more information.
I have a Maven project imported into my Eclipse workspace. Now I need to start making changes to it and test them with the integration tests (outside of an app server). Currently, the integration tests run out of the server using an OpenEJB container.
My basic question is, what is the regular process to compile, build and test with Maven?
mvn install
Maven -> Update Project.
Run my test from command line
Is this how it is done? I am specifically interested in understanding the mvn install command.
So should I do all three steps before I can test it?
Example: I just want to print something and see what the output is. Do I need to do all these steps just for that?
The OpenEJB container needs the compiled classes so that it can load them.
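For a quick "print something and look at the output" round trip, I gather a single test class can be run on its own, which avoids the full install cycle (the class names below are just placeholders):
# compile main and test sources, then run one test class with Surefire
mvn test -Dtest=MyQuickTest
# or, if the class is an integration test run by the Failsafe plugin
mvn verify -Dit.test=MyQuickIT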
There is a wonderful Maven quick-reference sheet at http://maven.apache.org/guides/MavenQuickReferenceCard.pdf
First, you should be aware that unit tests and integration tests are separate: they are run by separate plugins and in separate parts of the Maven lifecycle. Unit tests are run by Surefire and integration tests are run by Failsafe.
You want to run integration tests and the failsafe documentation says:
NOTE: when running integration tests, you should invoke maven with the (shorter to type too)
mvn verify
rather than trying to invoke the integration-test phase directly...
This is the best way to run integration tests directly in Maven. It will run all the preceding steps necessary (e.g. compile) in order to run the integration tests. It won't waste time doing an install, because install happens immediately after verify.
But if you're running the tests locally, it may be a better idea to run your integration tests directly in your IDE. That will give you a much faster feedback loop.
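To make the split concrete (assuming Failsafe is configured in the POM and picks up its default *IT test classes):
# unit tests only (Surefire, bound to the test phase)
mvn test
# unit tests plus integration tests (Failsafe runs them in integration-test, verify checks the results)
mvn verify
# build and install without the integration tests (Failsafe's skipITs switch)
mvn install -DskipITs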
If it is an Eclipse project, the most reasonable thing is to do everything from Eclipse rather than from the command line. Assuming you have the m2e plugin installed, go to your_project -> Run As -> Maven test and run it.
You need neither the install nor the package phase to run Maven tests: package creates a jar, which is not needed for the tests, and install copies that jar to the local repo, which is also unnecessary here. When Maven runs tests it uses the compiled classes from the target dir and ignores the project's jar even if it exists.
Yes, mvn install is the most popular option. It compiles, tests, packages and installs your project into your local repository.