I'm using the maven-cobertura-plugin to calculate code coverage in my project. As I understand it, this plugin starts a new/forked build cycle in order to compile and test the code base, and when that's done it calculates code coverage. As I understand it, this is the only approach the plugin can use, and that's fine with me.
The problem is that by the time the cobertura plugin runs, my code base has already been compiled and tested, so compilation and testing happen twice. Is it possible to avoid compilation and testing before cobertura? Or is there some other workaround?
There are several open issues about this (see MCOBERTURA-83 and MCOBERTURA-76), but AFAIK there is no perfect workaround, due to the way the lifecycle is constructed (things might improve in Maven 3).
The only one I'm aware of (it works with CI servers) is to run:
mvn clean install -Dmaven.test.skip=true
and then
mvn cobertura:check
instead of binding cobertura:check to the build lifecycle.
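(For reference, the lifecycle binding being replaced would look roughly like the sketch below; the phase chosen here is an assumption:)

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>cobertura-maven-plugin</artifactId>
  <version>2.7</version>
  <executions>
    <execution>
      <!-- this is the binding the two manual commands above replace -->
      <phase>verify</phase>
      <goals>
        <goal>check</goal>
      </goals>
    </execution>
  </executions>
</plugin>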
Note that compiling twice shouldn't be an issue as all classes should be up to date.
As far as I know, cobertura needs to do bytecode weaving on your code to be able to work.
The only way I was able to work around that was to instrument the bytecode as part of my build, by binding the cobertura:instrument goal to the verify phase and also binding the default-test execution of maven-surefire-plugin to the verify phase, so the tests don't get executed in the test phase on every cobertura goal execution.
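A rough pom.xml sketch of that setup (the version, execution ids and declaration order are assumptions; adapt them to your build):

<!-- declared before surefire so instrumentation happens before the tests within the verify phase -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>cobertura-maven-plugin</artifactId>
  <version>2.7</version>
  <executions>
    <execution>
      <id>instrument-classes</id>
      <phase>verify</phase>
      <goals>
        <goal>instrument</goal>
      </goals>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <executions>
    <execution>
      <!-- override the default execution so the tests run in verify instead of test -->
      <id>default-test</id>
      <phase>verify</phase>
      <goals>
        <goal>test</goal>
      </goals>
    </execution>
  </executions>
</plugin>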
I am trying to run a single integration test. I have a lot of *IT classes and I want to run only one test. I tried this:
mvn -Dit.test=XControllerIT verify
Am I doing something wrong? Is there another way to do this? I am using Maven.
There are two main options depending on your project setup:
Integration Tests are run with a dedicated Failsafe plugin
Integration Tests are run with a regular surefire plugin
If you have the failsafe plugin (and you should, it's the recommended approach for integration tests), then use the following snippet:
mvn -Dit.test=MySampleIntegrationTest failsafe:integration-test
If you're on surefire, then run:
mvn -Dtest=MySampleUnitTest surefire:test
In both cases the plugin goal is executed directly, bypassing the lifecycle that your initial example (mvn verify) goes through.
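If the failsafe plugin is not configured yet, a minimal setup looks roughly like this (the version is an assumption; by default it picks up test classes named *IT, IT* or *ITCase):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <version>2.22.2</version>
  <executions>
    <execution>
      <goals>
        <!-- integration-test runs the ITs, verify fails the build on test failures -->
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>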
In Maven it is possible to run the whole lifecycle; see the Default Lifecycle documentation for more information.
Basically, the lifecycle consists of phases, with plugin goals bound to each phase.
So when you run mvn verify, all the phases before verify will also run.
As a consequence, the code gets compiled (the compile phase, with the maven-compiler-plugin bound to it, does that job), the tests run (surefire plugin), and so on.
If your sources and test sources are not compiled yet, you can't use the approach shown above, because the code has to be compiled first.
However, if everything is already compiled, it makes sense to run just the one test without recompiling, and in that case you can use the suggested goal for whichever plugin you are on.
This is especially useful for local debugging, or on CI in multi-step build setups (which you see in fairly complicated projects).
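For example (a sketch using the class name from the question, assuming the failsafe plugin is configured and the compiled classes end up in target/):

mvn test-compile
mvn -Dit.test=XControllerIT failsafe:integration-test

The first command (re)compiles the main and test sources without running any tests; the second invokes only the one integration test.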
I want to know how to link production code and test code, i.e. I'd like to answer this question: which production classes are exercised by this test code?
I'd like to do this automatically.
My project uses Maven, I have used SonarQube, and the source code is written in Java.
If needed, I will try other tools.
How can I link production code and test code?
Please let me know how to do it.
What you want is effectively the coverage of your tests, i.e. an answer to the question: which lines/branches of my code are covered by my tests?
Maven and SonarQube are perfectly suited for this; the only thing you need to add to the mix is JaCoCo. A good explanation of the JaCoCo/JUnit configuration is here. JaCoCo is an agent that is attached to the execution of your tests and monitors them, recording which lines/branches have been executed (covered) and which have not.
The important part is to configure the JaCoCo plugin and the surefire/failsafe plugin(s) (the latter is for integration tests) to use JaCoCo. This generates JaCoCo report files, which can then be read by SonarQube during the sonar:sonar goal (you might have to set the path to these files either in your Maven pom.xml as a sonar property or directly in the SonarQube server properties; both work fine).
You can test it step by step, first getting JaCoCo to run, since it already creates nice HTML reports on its own. Getting the reports into SonarQube is then the easier part.
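A minimal sketch of the JaCoCo side in pom.xml (the version is an assumption; adapt it to your build):

<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.8.11</version>
  <executions>
    <execution>
      <!-- sets the argLine property so surefire/failsafe start with the JaCoCo agent attached -->
      <id>prepare-agent</id>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
    </execution>
    <execution>
      <!-- writes the report (target/site/jacoco) that SonarQube can read -->
      <id>report</id>
      <phase>verify</phase>
      <goals>
        <goal>report</goal>
      </goals>
    </execution>
  </executions>
</plugin>

Depending on your SonarQube version, the property that points the analysis at these reports is sonar.coverage.jacoco.xmlReportPaths (newer versions) or the older sonar.jacoco.* properties.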
I am just looking at the cobertura maven plugin, and I wasn't sure if the following is possible:
Instrument classes
Run junit tests
Generate Cobertura report without reinstrumenting classes and running tests
I have a multi-module Maven project, and the coverage of the domain module shows up as 0% even though it is used by every other module.
I have tried different combinations of things, but the coverage of my domain module always stays at 0%.
People have mentioned writing separate tests for the domain classes, but I don't want to do this, since you could easily write a test for a function that isn't actually used anywhere in the codebase.
Any pointers would be greatly appreciated
In order to do so, you would have to execute the Maven goals in the correct order:
cobertura:instrument
test
goalToAskCoberturaToGenerateReport
But then comes the trouble: there is no such goal as cobertura:report. If you look at the documentation and source code of the Maven plugin, cobertura:cobertura is the only goal that generates the report. I suspect this is because of some Maven internal limitation.
So in short, given the state of the maven plugin it is not possible.
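What the plugin does support is the all-in-one goal, which forks the lifecycle and instruments, runs the tests, and generates the report in a single invocation (so it repeats exactly the work you were hoping to skip):

mvn cobertura:cobertura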
You might have a chance to manage what you want to achieve by executing cobertura from the command line.
For multi-module Maven projects, cross-module coverage does not seem to be available off-the-shelf with Cobertura.
A solution using a mixture of Maven and Ant is described by Thomas Sundberg:
http://thomassundberg.wordpress.com/2012/02/18/test-coverage-in-a-multi-module-maven-project/
See also this related question:
Maven2 Multiproject Cobertura Reporting Problems During mvn site Build
We're migrating some ant build scripts to gradle and are diagnosing issues along the way. One problem that has popped up is that on the CI server (jenkins running gradle) we occasionally get test failures. We think the issue is related to test execution order because one of the tests that is failing uses thread local storage in some library code.
I would like to be able to reproduce the problem locally before fixing the broken tests. However, I can't reproduce the problem locally, because Gradle always runs the tests in an order that happens to work.
So, is there a way to force Gradle to run test class X before test class Y? The tests need to run in the same JVM, one test right after the other.
In case it matters, the tests are JUnit tests.
Yes, it is possible. One option is to move test class X into its own Test task and make the main test task depend on it, so X runs first:
task XTest(type: Test) {
    // this task runs only test class X
    include '**/X.*'
}

test {
    // keep X out of the regular test task so it isn't run twice
    exclude '**/X.*'
}

// the main test task (which contains Y) now runs only after XTest,
// so X is executed before Y; note that each Test task forks its own JVM
test.dependsOn XTest
Per Peter's response on the Gradle forum, running two tests in a specific desired order in the same JVM is not possible.
http://forums.gradle.org/gradle/topics/can_gradle_run_two_tests_in_a_specific_desired_order_in_the_same_jvm?utm_content=reply_link&utm_medium=email&utm_source=reply_notification#reply_13187620
An alternative is to use a test suite.
TestNG (a JUnit alternative, also supported by Gradle) also gives you control over test order. See this post for more details.
We are running our continuous builds on Hudson currently with "mvn clean verify". That is what we always did and so we never questioned it.
The question is: Is it safe to run a continuous build with "mvn verify" only?
So that would mean the maven-compiler-plugin would only compile classes that got changed since the last build and save precious time.
Will the feedback be the same quality as with "clean" or are there any drawbacks to be expected?
The product being tested is a typical Java web application with lots of generated code (JSPs, reports). There is also code around using Dependency Injection.
No, it's not safe! The Maven compiler plugin is not smart enough to figure out that the API of a class A has changed and that it should check all other classes which use this API, too. It will only compile A and create a jar with a lot of broken classes.
Note: it's usually better to run mvn clean as a separate step and then run the actual build (verify/install). That allows you to re-run the second command several times without cleaning every time.
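For example, as two separate invocations:

mvn clean
mvn verify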