Converting Maven dependency to Gradle - java

I am currently following a tutorial here and came across a Maven plugin that I couldn't convert to Gradle; you can find the plugin below. I tried to follow a couple of tutorials, but they didn't seem to help. The part I am confused about is the executions block and the general syntax that Gradle expects.
<plugin>
  <groupId>com.github.temyers</groupId>
  <artifactId>cucumber-jvm-parallel-plugin</artifactId>
  <version>5.0.0</version>
  <executions>
    <execution>
      <id>generateRunners</id>
      <phase>generate-test-sources</phase>
      <goals>
        <goal>generateRunners</goal>
      </goals>
      <configuration>
        <!-- Mandatory -->
        <!-- List of package names to scan for glue code. -->
        <glue>
          <package>com.example</package>
          <package>com.example.other</package>
        </glue>
      </configuration>
    </execution>
  </executions>
</plugin>

As of version 4.x, Cucumber supports parallel execution natively. This makes the cucumber-jvm-parallel-plugin for Maven obsolete.
You'll have to create a task that uses Gradle's JavaExec to call Cucumber's CLI directly and pass its --threads option (e.g. --threads 4) to run scenarios in parallel.
I don't believe you can use Cucumber's JUnit runner with Gradle to achieve parallel execution, because Gradle doesn't install a parallel computer into JUnit but instead forks the JVM.
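A minimal sketch of such a task, assuming a standard Gradle Java project with Cucumber on the test runtime classpath, step definitions in the com.example packages from the Maven snippet above, and feature files under src/test/resources. The main class depends on your Cucumber version (io.cucumber.core.cli.Main for 5.x+, cucumber.api.cli.Main for 4.x), and the thread count is only an example:
// build.gradle (Groovy DSL) -- illustrative sketch, adjust packages and paths to your project
task cucumberParallel(type: JavaExec, dependsOn: 'testClasses') {
    // run Cucumber's own CLI instead of the JUnit runner
    classpath = sourceSets.test.runtimeClasspath
    main = 'io.cucumber.core.cli.Main'   // use cucumber.api.cli.Main on Cucumber 4.x
    args = [
        '--threads', '4',                // run scenarios in parallel
        '--glue', 'com.example',         // packages containing step definitions
        '--glue', 'com.example.other',
        '--plugin', 'pretty',
        'src/test/resources'             // where the .feature files live
    ]
}
Run it with gradle cucumberParallel. On newer Gradle versions you may need mainClass.set('io.cucumber.core.cli.Main') instead of the deprecated main property.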

Related

JaCoCo Offline Instrumentation and Integration Test Coverage Reports

I have been trying to implement JaCoCo offline code coverage in a JBoss server, using an instrumented EAR for deployment and the jacocoagent.jar, in order to track code coverage of external integration tests running against said JBoss.
I have been following guides such as these:
http://www.eclemma.org/jacoco/trunk/doc/offline.html
http://automationrhapsody.com/code-coverage-with-jacoco-offline-instrumentation-with-maven/
I feel I am pretty close, as everything SEEMS to be working; however, when I load the coverage report in Eclipse's EclEmma plugin, it reports 0 coverage for everything (which I know is wrong).
Here's my setup:
Here's the maven plugin configuration:
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>${jacoco.version}</version>
  <configuration>
    <!-- <destFile>${sonar.jacoco.reportPath}</destFile> -->
    <append>true</append>
    <excludes>
      <exclude>**/dao/**/*Dao*</exclude>
      <exclude>**/dao/**/*DAO*</exclude>
      <exclude>**/dao/**/*Vo*</exclude>
      <exclude>**/dao/**/*VO*</exclude>
      <exclude>**/ui/**/*</exclude>
      <exclude>**/*Vo.*</exclude>
      <exclude>**/*VO.*</exclude>
      <exclude>**/test/**/*</exclude>
      <exclude>**/tester/**/*</exclude>
    </excludes>
  </configuration>
  <executions>
    <execution>
      <id>pre-unit-test</id>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
      <configuration>
        <destFile>${sonar.jacoco.reportPath}</destFile>
      </configuration>
    </execution>
    <execution>
      <id>unit-test-report</id>
      <phase>test</phase>
      <goals>
        <goal>report</goal>
      </goals>
      <configuration>
        <dataFile>${sonar.jacoco.reportPath}</dataFile>
      </configuration>
    </execution>
    <execution>
      <id>jacoco-instrument</id>
      <phase>test</phase>
      <goals>
        <goal>instrument</goal>
      </goals>
      <configuration>
        <skip>${jacoco.skip.instrument}</skip>
        <!-- <skip>false</skip> -->
      </configuration>
    </execution>
  </executions>
</plugin>
Here's my jacoco-agent.properties file:
destfile=/stage/live_integration_jacoco.exec
output=file
dumponexit=true
append=true
I'm bundling the JaCoCo Agent JARs right inside the EAR as these dependencies (the second one is just what jacocoagent.jar is labelled as in our repository):
<dependency>
  <groupId>org.jacoco</groupId>
  <artifactId>org.jacoco.agent</artifactId>
  <version>${jacoco.version}</version>
</dependency>
<dependency>
  <groupId>org.jacoco.build</groupId>
  <artifactId>org.jacoco.jacocoagent</artifactId>
  <version>${jacoco.version}</version>
</dependency>
Here's my process:
I run this on the project: mvn clean install -U -Djacoco.skip.instrument=false
And that generates my instrumented EAR artifact. I have verified that the classes in there are indeed instrumented by JaCoCo by decompiling a few of them.
I take that EAR with the instrumented code, the jacocoagent.jar included in it, and the jacoco-agent.properties file included as well, and deploy it to JBoss. JBoss starts just fine (it used to get a ClassNotFoundException before I started bundling jacocoagent.jar in it directly).
The "/stage/live_integration_jacoco.exec" file is created at this point with a size of '0'.
I run some tests on and against the server, even some manual testing, then stop the application.
The "/stage/live_integration_jacoco.exec" file now has data (30-60kb of data so far in my observations).
I import that exec file into Eclipse and it loads without any errors and shows the classes in the project; however, it reports 0 coverage on everything.
Well, I'm not sure what else to try at this point.
Does anyone have some thoughts on how to get it correctly generating the coverage report in my situation?
Thanks!
I suspect that the classes deployed on the server are compiled with the Oracle Java compiler, while the classes in Eclipse are compiled with the Eclipse Java compiler, and hence JaCoCo can't associate them since they differ. To confirm this, try to generate a report from the exec file that you are importing, but outside of Eclipse, using Ant or Maven. Make sure that you generate the report against the original (non-instrumented) classes, otherwise they also won't match.
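One hedged way to run that check with the jacoco-maven-plugin already in the build: execute a plain build with instrumentation skipped (-Djacoco.skip.instrument=true, per the setup above) and add a report execution that points at the exec file copied back from the server. The execution id, phase, and output directory below are illustrative:
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>${jacoco.version}</version>
  <executions>
    <execution>
      <id>integration-coverage-report</id>
      <phase>verify</phase>
      <goals>
        <goal>report</goal>
      </goals>
      <configuration>
        <!-- the exec file produced on the JBoss server -->
        <dataFile>/stage/live_integration_jacoco.exec</dataFile>
        <outputDirectory>${project.reporting.outputDirectory}/jacoco-it</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
If this report shows the expected coverage, the exec file is fine and the problem is the class mismatch in Eclipse; if it is also empty, the agent on the server is not recording execution data for those classes.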

How to generate separate jar files for application, source, and documentation (for central.sonatype.org)

Sonatype has a repository that I want to deploy a jar file to, and they ask for separate files for application, sources, and javadocs:
Example:
example-application-1.4.7.pom
example-application-1.4.7.jar
example-application-1.4.7-sources.jar
example-application-1.4.7-javadoc.jar
In Scala SBT, I have a command called "package" that generates the jar file for the project, but that only generates "example-application-1.4.7.jar".
Question: What should I do to generate the other two jar files?
In Maven, in order to get the additional -sources and -javadoc artifacts, add to your POM file the following:
<build>
  <plugins>
    <!-- additional plugin configurations, if any.. -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-source-plugin</artifactId>
      <version>3.0.0</version>
      <executions>
        <execution>
          <goals>
            <goal>jar</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-javadoc-plugin</artifactId>
      <version>2.10.3</version>
      <executions>
        <execution>
          <goals>
            <goal>jar</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
Note in the snippet above:
We are invoking the Maven Source Plugin to create an additional jar file for the sources
We are invoking the Maven Javadoc Plugin to create an additional jar file for the javadoc
Executing
mvn clean package
You will find these two additional jars in the target folder.
The .pom file, instead, is generated during the install phase, but it is not placed under the target folder. Basically, it is a copy of your pom.xml file with a different extension, used by Maven during dependency mediation to work out which transitive dependencies are required by the artifact concerned.
Executing
mvn clean install
Maven will install the artifact in your local cache (in your machine), under path_to_cache/.m2/repository/your_groupId/your_artifactId/your_version/. In this folder, you will also find the .pom file, which normally you don't need to distribute (it is created automatically by Maven).
Further note: you probably don't want to generate these additional jar files on each and every build, so to speed up normal builds and have them only on demand, you can wrap the snippet above in a Maven profile.
You can achieve this by removing the snippet above from your build section and adding a further profiles section at the end of your POM:
<profiles>
  <profile>
    <id>prepare-distribution</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-source-plugin</artifactId>
          <version>3.0.0</version>
          <executions>
            <execution>
              <goals>
                <goal>jar</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-javadoc-plugin</artifactId>
          <version>2.10.3</version>
          <executions>
            <execution>
              <goals>
                <goal>jar</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
So that normal builds no longer create these jars, but when executing the following:
mvn clean install -Pprepare-distribution
you get them back. The -P option activates, on demand, the profile defined with the id prepare-distribution.
With Maven 3, a default profile already comes as part of the super POM which performs exactly the same actions (sources and javadoc artifacts), hence there is no need to add anything to your existing project. Simply run:
mvn clean install -Prelease-profile
Or, to activate it via a property
mvn clean install -DperformRelease=true
However, as also specified in the super POM, this profile may be removed in future releases (although it has been there since the first Maven 3 version and is still present as of version 3.3.9):
NOTE: The release profile will be removed from future versions of the super POM
The main reason behind this warning is most probably to push for the usage of the Maven Release Plugin, which indirectly makes use of this profile via the useReleaseProfile option of the release:perform goal.
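For reference, a hedged sketch of how that option can appear in a POM (the plugin version is illustrative; useReleaseProfile defaults to true, so this is usually implicit):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-release-plugin</artifactId>
  <version>2.5.3</version>
  <configuration>
    <!-- when true, release:perform activates the profile that builds the -sources and -javadoc jars -->
    <useReleaseProfile>true</useReleaseProfile>
  </configuration>
</plugin>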
As highlighted in the comments, if you are not familiar with Maven (especially from the command line), I would definitely recommend that you:
Go through the official Maven in 5 Minutes documentation for a quick but worthwhile look.
Play with Maven from the command line; that is where Maven is at its best. IDE integrations are great, but the command line is the real turning point.
Then play with the POM customization above to get familiar with some concepts and behaviors, first directly as part of your default build, then moved to a profile.
Then, and only then, move on to the Maven Release Plugin. I recommend it as the last step because by then you will have acquired more confidence and understanding, and will see it as less magic and a more reasonable approach.

How to make SonarQube module analyze the project only once when sonar analysis is bound to maven lifecycle in a multi-module project?

What I am trying to achieve is to integrate SonarQube analysis into the build process, so that whenever mvn clean install is run, the code is analyzed with SonarQube. We want to use it for local analysis and also for builds on Jenkins. If new issues are found, then the build should fail (we want to use the build breaker plugin for that). This way the developer would know that his code is going to introduce new issues, and will have to fix them for the build to pass.
When I run mvn sonar:sonar, the analysis takes 30 seconds, which is OK.
However, the problem occurs when I try to bind the sonar goal to Maven build phases. I bind sonar to the verify phase. The build now takes 5 minutes, which is too long; it should take about 1 minute. The build itself, without SonarQube analysis, takes 30 seconds.
Note (this may help to figure out what the problem is): the project on which the build is run has multiple modules in it, and I guess that is the problem. It looks like the sonar:sonar goal is executed multiple times, once for each submodule, and the whole project is analyzed multiple times (not only the submodules). So, we have 4 submodules, and the report is generated 5 times during the build.
Instead, we want to analyze the whole project only once, not 5 times. It's also important for this 1 analysis to be run at the end of the build, after the cobertura reports are generated for all modules.
So, how do I integrate SonarQube analysis into the build, so that it analyzes my multi-module project only once, in the end, after cobertura reports are generated for all the submodules?
SonarQube plugin properties in parent pom:
<!-- Sonar plugin properties -->
<sonar.jdbc.url>jdbc:url</sonar.jdbc.url>
<sonar.analysis.mode>preview</sonar.analysis.mode>
<sonar.issuesReport.html.enable>true</sonar.issuesReport.html.enable>
<sonar.issuesReport.console.enable>true</sonar.issuesReport.console.enable>
<sonar.host.url>sonar.host:9000</sonar.host.url>
<sonar.language>java</sonar.language>
<sonar.buildbreaker.skip>false</sonar.buildbreaker.skip>
<sonar.qualitygate>Sonar%20way%20with%20Findbugs</sonar.qualitygate>
<sonar.preview.includePlugins>buildbreaker</sonar.preview.includePlugins>
<sonar.exclusions>file:**/target/**</sonar.exclusions>
<branch>development</branch>
Plugins configuration in the project pom:
<!-- Run cobertura analysis during package phase -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>cobertura-maven-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>cobertura</goal>
      </goals>
    </execution>
  </executions>
</plugin>
<!-- Run sonar analysis (preview mode) during verify phase. Cobertura reports need to be generated already -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>sonar-maven-plugin</artifactId>
  <version>2.5</version>
  <executions>
    <execution>
      <phase>verify</phase>
      <goals>
        <goal>sonar</goal>
      </goals>
    </execution>
  </executions>
</plugin>
IMO, this is just a Maven configuration issue: you're missing the <inherited>false</inherited> element on the execution of sonar:sonar:
<!-- Run sonar analysis (preview mode) during verify phase. Cobertura reports need to be generated already -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>sonar-maven-plugin</artifactId>
  <version>2.5</version>
  <executions>
    <execution>
      <phase>verify</phase>
      <goals>
        <goal>sonar</goal>
      </goals>
      <inherited>false</inherited>
    </execution>
  </executions>
</plugin>

Maven code coverage

I am new to the Java world. Our team is using Maven to build everything into a single .war file. I am looking for tools to instrument the .war file to enable code coverage; the idea is to manually instrument the .war file and then run the tests.
I looked at a couple of tools (e.g. Emma, Jester, Cobertura), but none of them gives me exactly what I am looking for. I'm looking for simple instructions.
If you want to measure code coverage you should use JaCoCo. It allows measuring coverage for unit tests and integration tests as well.
All you have to do is add the dependency:
<dependency>
  <groupId>org.jacoco</groupId>
  <artifactId>org.jacoco.core</artifactId>
  <version>0.6.2.201302030002</version>
  <scope>test</scope>
</dependency>
and add the jacoco-maven-plugin. Please note that if you won't be using Sonar, you have to replace the ${sonar.jacoco.reportPath} properties with raw file paths (a short illustration follows the plugin configuration below):
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.6.2.201302030002</version>
  <executions>
    <!-- prepare agent for measuring unit tests -->
    <execution>
      <id>prepare-unit-tests</id>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
      <configuration>
        <destFile>${sonar.jacoco.reportPath}</destFile>
      </configuration>
    </execution>
    <!-- prepare agent for measuring integration tests -->
    <execution>
      <id>prepare-integration-tests</id>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
      <phase>pre-integration-test</phase>
      <configuration>
        <destFile>${sonar.jacoco.itReportPath}</destFile>
        <propertyName>itCoverageAgent</propertyName>
      </configuration>
    </execution>
  </executions>
</plugin>
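For example, without Sonar the destFile values in the two executions above might simply point at files under the build directory (the paths here are illustrative):
<configuration>
  <!-- in place of ${sonar.jacoco.reportPath} / ${sonar.jacoco.itReportPath} -->
  <destFile>${project.build.directory}/jacoco-unit.exec</destFile>
</configuration>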
If you also want to use Sonar, then specify these properties:
<properties>
  <!-- select JaCoCo as a coverage tool -->
  <sonar.core.codeCoveragePlugin>jacoco</sonar.core.codeCoveragePlugin>
  <!-- force sonar to reuse reports generated during build cycle -->
  <sonar.dynamicAnalysis>reuseReports</sonar.dynamicAnalysis>
  <!-- set path for unit tests reports -->
  <sonar.jacoco.reportPath>${project.basedir}/target/jacoco-unit.exec</sonar.jacoco.reportPath>
  <!-- all modules have to use the same integration tests report file -->
  <sonar.jacoco.itReportPath>${project.basedir}/../target/jacoco-it.exec</sonar.jacoco.itReportPath>
</properties>
You can find more details on http://www.kubrynski.com/2013/03/measuring-overall-code-coverage-in.html
Cobertura would support that. See the answer to this question.
Java: measure code coverage for remote scripting tests
If you want to do this during development rather than on your build server, you might want to give EclEmma a try. You can launch your webapp in your IDE with EclEmma and then simply run whatever test you want to run (outside of EclEmma), and it will nicely annotate the code that is executed in green.

Managing JAXB-generated classes in a Maven project

I have a Maven-based project, in which I am trying to add some JAXB classes automatically generated by the "jaxb2-maven-plugin" Maven plugin. However, my first cut has me in a circular dependency loop:
Because these JAXB classes aren't generated yet, my other sources which reference them have compilation errors.
Because those other sources have compilation errors, these JAXB classes don't get generated.
It seems like there are two obvious possibilities for solving this:
Comment-out the broken references, so that the project builds and the JAXB classes are automatically generated. Then copy those generated sources from /target into /src/main/java, so that references to them won't cause compilation errors.
Create an entirely separate project, consisting of nothing but the JAXB stuff. Include it as a dependency in my main project.
Am I missing something here? Option #1 seems flat-out ridiculous... that just can't be the manner in which people use JAXB. Option #2 seems more rational, but still rather inefficient and cumbersome. Do I really have to take on the overhead of an entirely separate project just to use JAXB?
Are there any more elegant approaches that developers use to reference JAXB-generated classes in the same project where the Maven plugin generates them?
UPDATE: By request, here is the relevant portion of my POM:
<build>
  <plugins>
    <plugin>
      <!-- configure the compiler to compile to Java 1.6 -->
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <configuration>
        <source>1.6</source>
        <target>1.6</target>
      </configuration>
    </plugin>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>jaxb2-maven-plugin</artifactId>
      <version>1.4</version>
      <executions>
        <execution>
          <phase>generate-sources</phase>
          <goals>
            <goal>xjc</goal>
          </goals>
        </execution>
      </executions>
      <configuration>
        <!-- The name of your generated source package -->
        <packageName>com.mypackage</packageName>
      </configuration>
    </plugin>
  </plugins>
</build>
When I run mvn clean package, I DO see my JAXB sources being generated beneath the /target subdirectory. However, those generated sources are not being automatically added to the classpath for the compile phase.
POST-RESOLUTION UPDATE: It turns out that my compilation issues had more to do with the fact that I was running in Eclipse, and its Maven integration has some issues with "jaxb2-maven-plugin". See this StackOverflow question for more detail on that issue and its resolution.
How did you configure your JAXB Maven plugin? Normally it runs in the generate-sources phase, which comes before the compile phase, so your JAXB-generated classes should already be there when your own code gets compiled; Maven puts them in target/generated-sources and adds that folder to the classpath.
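If the generated folder is not being added automatically (which is the symptom described in the question), one hedged workaround is to register it explicitly with the build-helper-maven-plugin. A sketch, assuming the jaxb2-maven-plugin's default output directory (adjust the path and plugin version to your setup):
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>1.9.1</version>
  <executions>
    <execution>
      <id>add-jaxb-sources</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>add-source</goal>
      </goals>
      <configuration>
        <sources>
          <!-- assumed default output directory of the jaxb2-maven-plugin; verify against your build -->
          <source>${project.build.directory}/generated-sources/jaxb</source>
        </sources>
      </configuration>
    </execution>
  </executions>
</plugin>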
Edit:
This is the code we use at work (and it works as expected):
<plugin>
  <groupId>com.sun.tools.xjc.maven2</groupId>
  <artifactId>maven-jaxb-plugin</artifactId>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>generate</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <schemaDirectory>src/main/resources/<companyname>/xsd</schemaDirectory>
    <includeSchemas>
      <includeSchema>retrieval.xsd</includeSchema>
      <includeSchema>storage.xsd</includeSchema>
    </includeSchemas>
  </configuration>
</plugin>
Apparently we use yet another jaxb plugin... (see also this thread: Difference of Maven JAXB plugins).
I would suggest you split the JAXB-generated classes (API) and your BL classes (implementation) into 2 Maven projects, with a separate pom.xml for each and a main root pom.xml defining the compilation order. That way, you will be able to build api.jar; Maven will install it in the local repo, and after that you can use it as a dependency of your implementation. So it will look like:
-API\
--pom.xml - for api, jaxb generation
-IMPL\
--pom.xml - for impl, api dependency is here
pom.xml - main pom.xml with references to the projects above
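A minimal sketch of what that root pom.xml could contain (the coordinates are illustrative; the module names match the layout above):
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <!-- illustrative coordinates -->
  <groupId>com.example</groupId>
  <artifactId>parent</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <!-- the reactor builds API before IMPL because IMPL declares a dependency on the API artifact -->
    <module>API</module>
    <module>IMPL</module>
  </modules>
</project>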
Maybe try using the maven-jaxb2-plugin instead:
<plugin>
  <groupId>org.jvnet.jaxb2.maven2</groupId>
  <artifactId>maven-jaxb2-plugin</artifactId>
  <version>0.8.2</version>
  <executions>
    <execution>
      <goals>
        <goal>generate</goal>
      </goals>
    </execution>
  </executions>
</plugin>
The answer from dfuse is correct, though. Either plugin should generate sources before compiling, and the result of the source generation will be on the classpath. I tested this with both plugins. Is it possible for you to post your schema, or at least the schema for the type that your code is failing to pick up on the classpath?
