I am building a Spring Boot/Maven project. JaCoCo is configured to check coverage and creates 3 reports (plugin setup sketched below):
from unit tests (surefire) target/jacoco.exec (xml: target/site/jacoco)
from integration tests (failsafe) target/jacoco-it.exec (xml: target/site/jacoco-it)
aggregate (jacoco-aggregate) target/jacoco-aggregate.exec (xml: target/site/jacoco-aggregate)
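For reference, the setup behind this is roughly the standard jacoco-maven-plugin executions (a simplified sketch, not my literal pom; the version is a placeholder and the aggregate report comes from additional merge/report-aggregate executions not shown here):
<plugin>
    <groupId>org.jacoco</groupId>
    <artifactId>jacoco-maven-plugin</artifactId>
    <version>0.8.5</version>
    <executions>
        <!-- unit tests (surefire): agent writes target/jacoco.exec, report goes to target/site/jacoco -->
        <execution>
            <id>prepare-agent</id>
            <goals><goal>prepare-agent</goal></goals>
        </execution>
        <execution>
            <id>report</id>
            <goals><goal>report</goal></goals>
        </execution>
        <!-- integration tests (failsafe): agent writes target/jacoco-it.exec, report goes to target/site/jacoco-it -->
        <execution>
            <id>prepare-agent-integration</id>
            <goals><goal>prepare-agent-integration</goal></goals>
        </execution>
        <execution>
            <id>report-integration</id>
            <goals><goal>report-integration</goal></goals>
        </execution>
    </executions>
</plugin>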
Without any additional configuration, SonarQube seems to create its own aggregate:
[INFO] 13:14:27.153 Analysing C:\Users\code\dras-mt\target\jacoco-it.exec
[INFO] 13:14:27.248 Analysing C:\Users\code\dras-mt\target\jacoco.exec
[INFO] 13:14:27.414 Analysing C:\Users\code\dras-mt\target\sonar\jacoco-merged.exec
And it warns about deprecated use of properties:
[DEBUG] 13:14:27.000 Property 'sonar.junit.reportsPath' is deprecated and will be ignored, as property 'sonar.junit.reportPaths' is also set.
(they are not set via my pom.xml)
The problem with the generated report is that SonarQube seems to show the aggregated coverage but only counts the unit tests.
My question is how to get a consistent report (coverage matches the test count) and, in the best case, how to export all 3 reports to SonarQube (SonarQube's own aggregate could be the 4th).
I tried to use the property:
<properties>
    ....
    <sonar.coverage.jacoco.xmlReportPaths>
        ${project.build.directory}/site/jacoco-aggregate/,${project.build.directory}/site/jacoco/,${project.build.directory}/site/jacoco-it/
    </sonar.coverage.jacoco.xmlReportPaths>
</properties>
But this property seems to be completely ignored (I also tried absolute and relative paths, setting it via -D, and a single path), so the log output (even with -X) does not change.
(First I tried this: https://community.sonarsource.com/t/coverage-test-data-importing-jacoco-coverage-report-in-xml-format/12151)
So how do I correctly configure the sonar-maven-plugin (3.7.0.1746) to show the 3 reports?
This Maven command launches an older version (1.3.162) of the H2 database:
mvn -debug com.edugility:h2-maven-plugin:1.0:spawn
As recommended here, I'm trying to override a property of the plugin on the command line, so I can instead use a more recent H2 version:
mvn -debug -Dh2Version=1.4.200 com.edugility:h2-maven-plugin:1.0:spawn
This h2Version property with the older H2 version is defined here on GitHub in the plugin's pom.
Here is the end of the verbose Maven output:
[DEBUG] Process arguments: [C:\java\jdk-9.0.4\bin\java, -cp, C:\Users\eoste\.m2\repository\com\h2database\h2\1.3.162\h2-1.3.162.jar, org.h2.tools.Server, -tcp, -tcpPassword, h2-maven-plugin, -tcpPort, 9092]
[INFO] H2 server spawned at tcp://localhost:9092
Not only does the old 1.3.162 version launch, but there is zero mention anywhere of the h2Version property that I placed on the command line.
I tried moving the -Dh2Version parameter to the end of the command line. I also tried deleting the plugin from the local repo to force a download, so that maybe the h2Version would get re-evaluated. None of these things worked.
This blog shows how to embed a dependency inside a plugin, but that's far more complicated than my simple command-line invocation.
What am I doing wrong?
Using Windows 10, Java 9, Maven 3.6.2.
1) Beware when you want to use plugins/libraries that are no longer maintained. This plugin's source code has not been updated for about 8 years; that may cause important issues.
2) To learn how to use a Maven plugin, don't look only at its pom declaration. You can find some information there, but you will find much more in the Mojo implementation/specification. In fact, you should not even need to rely on that to understand how to use a plugin.
3) A Maven plugin may indeed support configurable properties, both directly in the pom.xml and exposed for command-line usage, but that is not automatic.
In both cases it has to be foreseen by the plugin developer, and it is generally documented on the plugin or source repository homepage (a sketch of the mechanism follows).
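For illustration only (this is not the h2-maven-plugin's actual source, which predates this style), this is roughly how a plugin author exposes a field as a command-line user property with the maven-plugin-annotations API:
import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;

// hypothetical goal, just to show the mechanism
@Mojo(name = "spawn")
public class SpawnMojo extends AbstractMojo {

    // the "property" attribute is what makes -Dh2.port=... work on the command line;
    // without it, the field can only be set in the pom's <configuration> block
    @Parameter(property = "h2.port", defaultValue = "9092")
    private int port;

    @Override
    public void execute() {
        getLog().info("would spawn H2 on port " + port);
    }
}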
In fact, in your case, if you go into the Mojo implementation, AbstractH2Mojo, you can see how the configuration is set.
All properties have default values set in the Mojo constructor:
protected AbstractH2Mojo() {
    super();
    final Service tcpService = new Service("tcp", Service.getDefaultPort("tcp"), false, false);
    this.setServices(Collections.singletonList(tcpService));
    this.setPort(Service.getDefaultPort("tcp"));
    this.setShutdownPassword("h2-maven-plugin");
    this.setJava(new File(new File(new File(System.getProperty("java.home")), "bin"), "java"));
}
The Mojo's no-arg constructor is invoked first, then all setters are invoked on the created instance.
It means that you can override any of the properties defined in that class at runtime by providing a property of the form ${artifactIdPrefixWithoutMavenPlugin}.field.
Since the Maven plugin is h2-maven-plugin, the prefix to use is h2.
If you run this:
mvn -X com.edugility:h2-maven-plugin:1.0:spawn -Dh2.port=8084 -Dh2.useSSL=false
you should see in the output:
[DEBUG] Configuring mojo 'com.edugility:h2-maven-plugin:1.0:spawn' with basic configurator -->
[DEBUG] (s) port = 8084
[DEBUG] (s) shutdownHost = localhost
[DEBUG] (s) shutdownPassword = h2-maven-plugin
[DEBUG] (s) useSSL = false
[DEBUG] -- end configuration --
[DEBUG] Process arguments: [/usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java, -cp, /home/david/.m2/repository/com/h2database/h2/1.3.162/h2-1.3.162.jar, org.h2.tools.Server, -tcp, -tcpPassword, h2-maven-plugin, -tcpPort, 8084]
Concerning the H2 JAR used: if you look further in the same class, you will see the part that retrieves the JAR file from the classpath:
public final File getH2() {
    final ProtectionDomain pd = Server.class.getProtectionDomain();
    assert pd != null;
    final CodeSource cs = pd.getCodeSource();
    assert cs != null;
    final URL location = cs.getLocation();
    assert location != null;
    try {
        return new File(location.toURI());
    } catch (final URISyntaxException wontHappen) {
        throw (InternalError)new InternalError().initCause(wontHappen);
    }
}
It means that you have no way to change the H2 JAR used, neither from the command line when you run the plugin nor from the plugin declaration in the pom.xml, since no property is defined in the Mojo to achieve that.
If you want to change the H2 version, you need to change the version embedded by the plugin. As a starting point, you could fork the plugin's Git repository, change the H2 dependency in its pom to match your requirement, and check whether, despite the version gap, the plugin still works.
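As a sketch of that change in the forked plugin's own pom.xml, assuming its h2 dependency references the h2Version property mentioned in the question (1.4.200 is just the version you are targeting):
<!-- in the forked h2-maven-plugin pom.xml -->
<properties>
    <!-- was 1.3.162; rebuild and install the forked plugin after bumping it -->
    <h2Version>1.4.200</h2Version>
</properties>

<dependencies>
    <dependency>
        <groupId>com.h2database</groupId>
        <artifactId>h2</artifactId>
        <version>${h2Version}</version>
    </dependency>
</dependencies>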
Note that you could also add a new property to the Mojo to make the H2 version completely configurable, such as:
mvn ... -Dh2Version=1.4.200
But in that case you would need to retrieve that JAR yourself, for example by downloading the dependency from the Maven central repository.
And you should also ensure that only valid H2 version ranges are accepted.
I don't think you'll be able to do that using this particular Maven plugin. Here is another answer for someone who had a similar issue: How to pass parameter to Maven plugin from CLI?
Basically, the h2Version property is not defined as a user property.
When you start the plugin using the command you've mentioned, the debug output lists the pre-defined configuration properties:
<configuration>
    <allowOthers>${h2.allowOthers}</allowOthers>
    <baseDirectory>${h2.baseDirectory}</baseDirectory>
    <forceShutdown>${h2.forceShutdown}</forceShutdown>
    <ifExists>${h2.ifExists}</ifExists>
    <java>${h2.java}</java>
    <port default-value="9092">${h2.port}</port>
    <shutdownAllServers>${h2.shutdownAllServers}</shutdownAllServers>
    <shutdownHost default-value="localhost">${h2.shutdownHost}</shutdownHost>
    <shutdownPassword default-value="h2-maven-plugin">${h2.shutdownPassword}</shutdownPassword>
    <trace>${h2.trace}</trace>
    <useSSL>${h2.useSSL}</useSSL>
</configuration>
Only these properties can be overridden by the user. For example, changing the running port:
mvn -debug com.edugility:h2-maven-plugin:1.0:spawn -Dh2.port=9090
Let me show you the problem with the following use case:
Let's assume I have a class Example and want to write unit and integration tests for it.
Once the unit tests in the file ExampleTest (with methods annotated with @Test) are done, the coverage report displays the correct result.
Once the integration tests in the file ExampleTestInt (with methods annotated with @Test) are done, the coverage report does not display the correct result, as if the class was excluded from the coverage process.
That means if I have just integration tests for the Example class, then I cannot see the correct coverage result from JaCoCo.
Is there some way to have the ExampleTestInt class counted for coverage similarly to the unit test? I would like to keep the same name.
You can configure custom excludes like this:
<configuration>
    <excludes>
        <exclude>**/*Config.*</exclude>
        <exclude>**/*Dev.*</exclude>
    </excludes>
</configuration>
Please check out the JaCoCo docs: https://www.eclemma.org/jacoco/trunk/doc/report-mojo.html
I have found out that the root problem was the wrong filename used for the integration tests.
As per my observation, the coverage tooling was only looking for files with the suffix Test, and hence the file ExampleTestInt couldn't be seen by the plugin.
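For reference, those filename patterns are configurable. A sketch, assuming the integration tests run via maven-failsafe-plugin (whose default includes are **/IT*.java, **/*IT.java and **/*ITCase.java), that would also pick up the original ExampleTestInt name:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-failsafe-plugin</artifactId>
    <configuration>
        <includes>
            <!-- matches ExampleTestInt in addition to the default *IT pattern -->
            <include>**/*TestInt.java</include>
            <include>**/*IT.java</include>
        </includes>
    </configuration>
</plugin>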
I need to set code coverage exclusions for XSD-generated classes, Lombok and SLF4J logger generated classes, and data objects.
I know SonarQube allows setting coverage exclusions at the file level. Is there also a global setting we can use to exclude coverage analysis for data objects? As an example:
These classes can be identified by scanning for the annotations @XmlAccessorType and @XmlRootElement, for "implements Serializable", etc. These classes usually don't contain any business logic except setters/getters. Also, a few generated classes extend BaseResponseEdge and BaseResponseMiddle.
I also want to exclude SLF4J Logger objects, static constants and final class variables from integration test coverage.
BTW, I use mvn + jacoco + surefire + failsafe for the code coverage implementation.
Thanks,
Manny
You can set exclusions at both the project and global levels.
Since you want to exclude by file contents, take a look at the Ignore Issues section of the docs. It shows you how to ignore issues raised (example sketched after this list):
* on files that contain a string matching your regex
* between regex-specified start and end markers
* from specific rules on file paths matching a pattern
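For example, the first case (files containing a string matching a regex) maps to the sonar.issue.ignore.allfile analysis parameters; these are usually set in the project's UI settings, but passing them as analysis properties should work too. A sketch with an illustrative regex:
<properties>
    <!-- ignore all issues on files that contain this string -->
    <sonar.issue.ignore.allfile>e1</sonar.issue.ignore.allfile>
    <sonar.issue.ignore.allfile.e1.fileRegexp>@XmlRootElement</sonar.issue.ignore.allfile.e1.fileRegexp>
</properties>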
Code coverage exclusions aren't as fine-grained. You can only exclude by file path pattern.
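For the coverage side, the file path patterns go into sonar.coverage.exclusions. A sketch (the patterns are only examples and must match your generated/data classes):
<properties>
    <!-- exclude generated sources and simple data holders from coverage only -->
    <sonar.coverage.exclusions>**/generated/**/*,**/*Dto.java,**/*Response*.java</sonar.coverage.exclusions>
</properties>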
I have just started to automate my web service project with Java. I need to run it in Jenkins, so I built the project with the TestNG framework and Maven. I am able to run some automated test suites with Jenkins without any issues.
Now I have a problem: I need to get user input at runtime (say HostName, UserName, Password, etc.), and then the test suite should run accordingly.
The problem I am facing is that while entering the input in the console, the cursor does not return to the program, and it simply goes into an idle state.
PFA screenshots:
Test case running as Maven build
Test case running as TestNG
Making your tests interactive is an absolutely bad approach: the build is not reproducible at all, it forces everyone running it to know which parameters are expected, and so on.
If your test cases need to take parameters at runtime (or, more accurately, parameters that need to be set before the Maven build is started), you have several options.
Option #1. Properties file and Maven resource filtering.
The idea is just to have a properties file in your project and tell Maven to resolve the variable values there. Example:
pom.xml:
<resources>
    <resource>
        <directory>src/test/resources/data</directory>
        <filtering>true</filtering>
    </resource>
</resources>
/src/test/resources/data/myproperties.properties:
myVar = ${myVar}
Then run the Maven build as:
mvn -DmyVar=value clean install
You can use not only properties files, but any other file types (for example XML).
Hint: you can use the maven-enforcer-plugin to force the user to pass -DmyVar; this plugin will fail the build if the required properties are not set.
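A sketch of that enforcer rule (the plugin version and message are placeholders):
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <version>3.0.0-M3</version>
    <executions>
        <execution>
            <id>require-myvar</id>
            <goals>
                <goal>enforce</goal>
            </goals>
            <configuration>
                <rules>
                    <!-- fails the build unless -DmyVar=... (or a pom property) is set -->
                    <requireProperty>
                        <property>myVar</property>
                        <message>Please run the build with -DmyVar=...</message>
                    </requireProperty>
                </rules>
            </configuration>
        </execution>
    </executions>
</plugin>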
Option #2. Properties file and Spring DI.
The idea is almost the same as above, but Spring is used to inject all the necessary values into the test class. Something like:
@Component
public class MyTest {

    // value resolved from the properties file at context startup
    @Value("${myProperty}")
    private String myValue;

    // ... test methods which can use myValue ...
}
See this answer for a full example of a possible Spring configuration.
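A minimal sketch of the Spring side (class, package and file names are assumptions) that makes the ${myProperty} placeholder resolvable:
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;
import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;

@Configuration
@ComponentScan("com.example.tests")                  // package is an assumption
@PropertySource("classpath:myproperties.properties") // assuming the filtered file from option #1 ends up at the classpath root
public class TestConfig {

    // required so that @Value("${...}") placeholders get resolved
    @Bean
    public static PropertySourcesPlaceholderConfigurer propertyConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }
}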
Btw, your test is not a unit test but an integration test, given that it involves deploying your web service or something similar. Therefore, it would be wise to split unit tests (maven-surefire-plugin) and integration tests (maven-failsafe-plugin).
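A sketch of that split (versions are placeholders): surefire picks up *Test classes in the test phase, failsafe picks up *IT classes during integration-test/verify:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.22.2</version>
</plugin>
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-failsafe-plugin</artifactId>
    <version>2.22.2</version>
    <executions>
        <execution>
            <goals>
                <goal>integration-test</goal>
                <goal>verify</goal>
            </goals>
        </execution>
    </executions>
</plugin>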
In my code I start multiple OSGi frameworks using EquinoxFactory. By setting the property "org.osgi.framework.storage" to "@user.home/osgi-frameworks/framework-x", where x is different for every framework, each framework uses a different directory:
frameworkProperties.put("osgi.clean", "true");
frameworkProperties.put("osgi.console", "true");
frameworkProperties.put("org.osgi.framework.storage",
        "@user.home/osgi-frameworks/osgi-framework-" + numberOfFramework);
framework = new EquinoxFactory().newFramework(frameworkProperties);
This works perfectly when running the actual application. Also the JUnit tests in the IDE run without any problems.
However, when I start the Maven build for my project, the JUnit tests fail since all frameworks use the same directory ("osgi-frameworks/framework-0").
I added logging to the application to check whether the property "org.osgi.framework.storage" really has a different value in each OSGi property map. Everything looks fine in the log, but when checking the file system, only one directory has been created.
Since I would like to include the application in Jenkins, I would rather not skip the tests.
Does anyone have an idea what could be wrong? Do I have to set other parameters for the framework? Is there any significant difference between JUnit in the IDE and in Maven?