I have a Sonar profile in my Maven build. Everything works fine except the code coverage metric: I want Sonar to ignore some classes for the code coverage calculation only. I have the following profile:
<profile>
<id>sonar</id>
<properties>
<sonar.exclusions>**/beans/jaxb/**</sonar.exclusions>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>${maven.surefire.plugin.version}</version>
<configuration>
<redirectTestOutputToFile>true</redirectTestOutputToFile>
<excludes>
<exclude>**/*Suite*.java</exclude>
<exclude>**/*RemoteTest.java</exclude>
<exclude>**/*SpringTest.java</exclude>
<exclude>**/*CamelTest.java</exclude>
<exclude>**/*FunctionalTest.java</exclude>
<exclude>**/*IntegrationTest.java</exclude>
<exclude>**/*DaoBeanTest.java</exclude>
</excludes>
</configuration>
</plugin>
</plugins>
</build>
</profile>
Please help. I tried to add something like
<exclude>com/qwerty/dw/publisher/Main.class</exclude>
but it didn't help.
UPDATE
I have a working Cobertura profile. I tried adding it to the Sonar profile, but I still get 53% coverage instead of about 95% as in the Cobertura profile:
<profile>
<id>sonar</id>
<properties>
<sonar.exclusions>**/beans/jaxb/**</sonar.exclusions>
<sonar.core.codeCoveragePlugin>cobertura</sonar.core.codeCoveragePlugin>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>${maven.surefire.plugin.version}</version>
<configuration>
<redirectTestOutputToFile>true</redirectTestOutputToFile>
<excludes>
<exclude>**/*Suite*.java</exclude>
<exclude>**/*RemoteTest.java</exclude>
<exclude>**/*SpringTest.java</exclude>
<exclude>**/*CamelTest.java</exclude>
<exclude>**/*FunctionalTest.java</exclude>
<exclude>**/*IntegrationTest.java</exclude>
<exclude>**/*DaoBeanTest.java</exclude>
</excludes>
</configuration>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>cobertura-maven-plugin</artifactId>
<version>${cobertura.maven.plugin.version}</version>
<configuration>
<instrumentation>
<excludes>
<exclude>com/qwerty/dw/dao/*</exclude>
<exclude>com/qwerty/dw/domain/*</exclude>
<exclude>com/qwerty/dw/beans/**/*</exclude>
<exclude>com/qwerty/dw/daemon/exception/*</exclude>
<exclude>com/qwerty/dw/daemon/Main.class</exclude>
<exclude>com/qwerty/dw/sink/Main.class</exclude>
<exclude>com/qwerty/dw/publisher/Main.class</exclude>
<exclude>com/qwerty/dw/publisher/dao/*</exclude>
<exclude>com/qwerty/dw/publisher/domain/*</exclude>
</excludes>
</instrumentation>
<formats>
<format>html</format>
</formats>
<aggregate>true</aggregate>
<check>
<haltOnFailure>true</haltOnFailure>
<branchRate>60</branchRate>
<lineRate>60</lineRate>
<totalBranchRate>60</totalBranchRate>
<totalLineRate>60</totalLineRate>
</check>
</configuration>
<executions>
<execution>
<goals>
<goal>clean</goal>
<goal>check</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
At the time of this writing (SonarQube 4.5.1), the correct property to set is sonar.coverage.exclusions, e.g.:
<properties>
<sonar.coverage.exclusions>foo/**/*,**/bar/*</sonar.coverage.exclusions>
</properties>
This seems to be a change from just a few versions earlier. Note that this excludes the given classes from coverage calculation only. All other metrics and issues are calculated.
In order to find the property name for your version of SonarQube, you can try going to the General Settings section of your SonarQube instance and look for the Code Coverage item (in SonarQube 4.5.x, that's General Settings → Exclusions → Code Coverage). Below the input field, it gives the property name mentioned above ("Key: sonar.coverage.exclusions").
For me this worked (basically global properties at the pom.xml level):
<properties>
<sonar.exclusions>**/Name*.java</sonar.exclusions>
</properties>
According to http://docs.sonarqube.org/display/SONAR/Narrowing+the+Focus#NarrowingtheFocus-Patterns, it appears you can end the pattern either with ".java" or with "*" to match the Java classes you're interested in.
According to this document, for SonarQube 7.1:
sonar.exclusions - comma-delimited list of file path patterns to be excluded from analysis
sonar.coverage.exclusions - comma-delimited list of file path patterns to be excluded from coverage calculations
This document gives some examples on how to create path patterns
# Exclude all classes ending by "Bean"
# Matches org/sonar.api/MyBean.java, org/sonar/util/MyOtherBean.java, org/sonar/util/MyDTO.java, etc.
sonar.exclusions=**/*Bean.java,**/*DTO.java
# Exclude all classes in the "src/main/java/org/sonar" directory
# Matches src/main/java/org/sonar/MyClass.java, src/main/java/org/sonar/MyOtherClass.java
# But does not match src/main/java/org/sonar/util/MyClassUtil.java
sonar.exclusions=src/main/java/org/sonar/*
# Exclude all COBOL programs in the "bank" directory and its sub-directories
# Matches bank/ZTR00021.cbl, bank/data/CBR00354.cbl, bank/data/REM012345.cob
sonar.exclusions=bank/**/*
# Exclude all COBOL programs in the "bank" directory and its sub-directories whose extension is .cbl
# Matches bank/ZTR00021.cbl, bank/data/CBR00354.cbl
sonar.exclusions=bank/**/*.cbl
If you are using Maven for your project, the exclusions can also be passed as command-line parameters, for example -Dsonar.coverage.exclusions=**/config/*,**/model/*
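A complete invocation might then look like the line below (the sonar:sonar goal of the SonarQube Maven scanner is assumed here; the patterns are quoted so the shell does not try to expand them):
mvn clean verify sonar:sonar -Dsonar.coverage.exclusions='**/config/*,**/model/*'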
I was having a problem with excluding a single class explicitly. Below are my observations:
**/*GlobalExceptionhandler.java - not working for some reason; I expected this syntax to work
com/some/package/name/GlobalExceptionhandler.java - not working
src/main/java/com/some/package/name/GlobalExceptionhandler.java - good, class excluded explicitly without using wildcards
When using sonar-scanner for Swift, use sonar.coverage.exclusions in your sonar-project.properties to exclude files from code coverage only. If you want to exclude files from analysis as well, you can use sonar.exclusions. This has worked for me in Swift:
sonar.coverage.exclusions=**/*ViewController.swift,**/*Cell.swift,**/*View.swift
For JaCoCo, use this property:
-Dsonar.jacoco.excludes=**/*View.java
I think you're looking for the solution described in this answer: Exclude methods from code coverage with Cobertura
Keep in mind that if you're using Sonar 3.2, you have to specify that your coverage tool is Cobertura and not JaCoCo (the default), which doesn't support this kind of feature yet.
Sometimes, Clover is configured to provide code coverage reports for all non-test code. If you wish to override these preferences, you may use configuration elements to exclude and include source files from being instrumented:
<plugin>
<groupId>com.atlassian.maven.plugins</groupId>
<artifactId>maven-clover2-plugin</artifactId>
<version>${clover-version}</version>
<configuration>
<excludes>
<exclude>**/*Dull.java</exclude>
</excludes>
</configuration>
</plugin>
Also, you can include the following Sonar configuration:
<properties>
<sonar.exclusions>
**/domain/*.java,
**/transfer/*.java
</sonar.exclusions>
</properties>
I had to struggle a little bit, but I found a solution. I added
-Dsonar.coverage.exclusions=**/*.*
and the coverage metric disappeared from the dashboard entirely, so I realized that the problem was the path I was passing, not the property. In my case the path to exclude was java/org/acme/exceptions, so I used:
-Dsonar.coverage.exclusions=**/exceptions/**/*.*
and it worked. Since I don't have sub-folders, it also works with
-Dsonar.coverage.exclusions=**/exceptions/*.*
I was able to achieve the necessary code coverage exclusions by updating the jacoco-maven-plugin configuration in pom.xml:
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
<version>0.8.6</version>
<executions>
<execution>
<id>pre-test</id>
<goals>
<goal>prepare-agent</goal>
</goals>
<configuration>
<propertyName>jacocoArgLine</propertyName>
<destFile>${project.test.result.directory}/jacoco/jacoco.exec</destFile>
</configuration>
</execution>
<execution>
<id>post-test</id>
<phase>test</phase>
<goals>
<goal>report</goal>
</goals>
<configuration>
<dataFile>${project.test.result.directory}/jacoco/jacoco.exec</dataFile>
<outputDirectory>${project.test.result.directory}/jacoco</outputDirectory>
</configuration>
</execution>
</executions>
<configuration>
<excludes>
<exclude>**/GlobalExceptionHandler*.*</exclude>
<exclude>**/ErrorResponse*.*</exclude>
</excludes>
</configuration>
</plugin>
This configuration excludes GlobalExceptionHandler.java and ErrorResponse.java from the JaCoCo coverage. The following two lines do the same for the Sonar coverage:
<sonar.exclusions>**/*GlobalExceptionHandler*.*, **/*ErrorResponse*.*</sonar.exclusions>
<sonar.coverage.exclusions>**/*GlobalExceptionHandler*.*, **/*ErrorResponse*.*</sonar.coverage.exclusions>
If you are looking for exclusions and are using Gradle, add the lines below to your build.gradle file:
sonarqube {
    properties {
        property "sonar.exclusions", "'**/example/your/service/impl/**',"
    }
}
Note the inner single quotes. Good luck.
Related
I have a project that consists of 3 different libraries. When I run the install script, it takes all the libraries from the repo and runs mvn clean install on them. But this version of the library is already installed in the repo. Is there a way to skip the install phase if the version in pom.xml equals the version in my local repo?
I know that I can use a local repo and just set dependencies, but my boss wants our project to build only with public repos and without any of our own repos.
You can bypass it like this:
-Dmaven.install.skip=true
<profiles>
<profile>
<id>skipInstall</id>
<activation>
<property>
<name>maven.install.skip</name>
<value>true</value>
</property>
</activation>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-install-plugin</artifactId>
<executions>
<execution>
<id>default-install</id>
<phase>none</phase>
</execution>
</executions>
</plugin>
</plugins>
</pluginManagement>
</build>
</profile>
</profiles>
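With that profile in place, the same build command can be run while skipping the actual install execution, for example:
mvn clean install -Dmaven.install.skip=true # the install phase still runs, but default-install is bound to no phase and is skipped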
Last week Olivier Lamy patched this JIRA issue:
MINSTALL-73
Most Maven plugins can be skipped by specifying something like:
<plugin>
<artifactId>maven-install-plugin</artifactId>
<version>X.Y</version>
<configuration>
<skip>true</skip>
</configuration>
</plugin>
You can also set up build profiles to set properties and use those to determine the value. For example, running the command mvn -Pexample would select the "example" profile. The POM would then contain:
...
<properties>
<skip.install>false</skip.install>
...
</properties>
...
<profile>
<id>example</id>
<properties>
<skip.install>true</skip.install>
</properties>
</profile>
...
<plugin>
<artifactId>maven-install-plugin</artifactId>
<version>X.Y</version>
<configuration>
<skip>${skip.install}</skip>
</configuration>
</plugin>
...
Using these POM additions, the default behavior for the install plugin will be to perform its default goal, but if the example profile is selected, then the install plugin will skip its goal.
Using what I learned from the other answers, this was the cleanest result for me.
In my super POM I added a pluginManagement entry to disable the default-install and default-test executions when the property deployOnly is set.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-install-plugin</artifactId>
<version>2.5.2</version>
<executions>
<execution>
<id>default-install</id>
<configuration>
<skip>${deployOnly}</skip>
</configuration>
</execution>
<execution>
<id>default-test</id>
<configuration>
<skip>${deployOnly}</skip>
</configuration>
</execution>
</executions>
</plugin>
So on the command line, I can disable install and test phases by adding -DdeployOnly.
mvn clean install #build and test everything
mvn deploy -DdeployOnly #just deploy it
I know that I can use a local repo and just set dependencies. But my boss wants our project to build only with public repos and without any of our own repos.
Are you sure you understood correctly what your boss meant? I interpret the above as "don't install third-party libraries in your local repository; use only libraries available in public repositories". This is different from "don't use your local repository", which is basically impossible; that's just not how Maven works. I'd try to clarify this point.
Apart from that, I don't get the question, which is very confusing (what repo are you talking about? What is the install script doing? Why do you call clean install on the libraries? etc.).
Extending the other answers, from the future:
Maven plugins have a surprising amount of freedom in how they run. If they want, they can ignore or override the typical pom.xml settings. Furthermore, <configuration><skip>true</skip></configuration> is only a convention; nothing obligates a plugin to follow it, except that most of them are developed that way.
My experiments with this problem show that both @Cemo's and @MiloshBoroyevich's solutions should be used; the plugin requires both to really leave us in peace. More concretely, the only configuration that worked for me was this:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-install-plugin</artifactId>
<version>2.5.2</version>
<executions>
<execution>
<id>default-install</id>
<phase>none</phase>
</execution>
</executions>
<configuration>
<skip>true</skip>
</configuration>
</plugin>
One of your options is to put the deployment into another module, i.e. have one pom.xml build the artifact and install it to the local repo, and another pom.xml deploy it. This separation is quite common in larger projects, where the test suite is sometimes a separate module or even a separate project, the packaging happens in several stages, etc. A sketch of such a layout follows the list below.
- pom.xml - myProject-root - type=pom
- pom.xml - myProject-artifact - type=jar
- pom.xml - myProject-deploy - type=pom, does the deployment, skips its own install goal
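For illustration only, the aggregator POM of this layout might look roughly like the following sketch (the group id, version, and module names are assumptions based on the list above):
<project xmlns="http://maven.apache.org/POM/4.0.0">
<modelVersion>4.0.0</modelVersion>
<groupId>com.example</groupId>
<artifactId>myProject-root</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>pom</packaging>
<modules>
<!-- builds the jar and installs it into the local repo -->
<module>myProject-artifact</module>
<!-- performs the deployment and skips its own install execution -->
<module>myProject-deploy</module>
</modules>
</project>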
I need to deploy my project into Artifactory. For this purpose I use the maven-assembly-plugin together with the artifactory-maven-plugin.
The only command I can use to build with Maven is this one (small updates are possible):
mvn -e -B -U clean deploy -DskipIntegrationTests=false -DskipCoverageReport=false -Dservice_name=sample_service
What I cannot do in the mvn command is change the service name. It will always be "sample_service" or some other constant that represents the name of the service.
Because I do not know the name of the service in advance (there could be more services), the base part of my pom.xml looks like this (the artifactId is created dynamically from the property service_name):
<groupId>my.group.id</groupId>
<artifactId>${service_name}</artifactId>
<version>2.0.0-SNAPSHOT</version>
The problem is that the parameter -Dservice_name will always contain underscores. By convention, the artifactId has to contain dashes instead of underscores.
Is there any way (some plugin, for instance) to do something like this?
<groupId>my.group.id</groupId>
<artifactId>${service_name}.replaceAll("_","-")</artifactId>
<version>2.0.0-SNAPSHOT</version>
In short, for the property service_name I need to replace underscores with dashes before building the artifact. Thanks for any answers.
This cannot be done.
Properties used inside <artifactId> can only be set through the command line; you cannot manipulate them within Maven. The only option I see is to change the command line so that you do the replacement before you pass the parameter to Maven.
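As an illustration only, assuming a POSIX shell wrapper around the command from the question (the SERVICE_NAME variable is hypothetical), the replacement could happen before Maven is invoked:
# hypothetical wrapper: turn underscores into dashes before calling Maven
service_name=$(printf '%s' "$SERVICE_NAME" | tr '_' '-')
mvn -e -B -U clean deploy -DskipIntegrationTests=false -DskipCoverageReport=false -Dservice_name="$service_name"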
I found a solution to my problem, but I am not sure it is the correct way to solve it. I used the gmaven-plugin:
<plugin>
<groupId>org.codehaus.groovy.maven</groupId>
<artifactId>gmaven-plugin</artifactId>
<executions>
<execution>
<phase>pre-clean</phase>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<source>
project.getModel().setArtifactId(project.properties["service_name"].replaceAll('_', '-'))
project.getArtifact().setArtifactId(project.properties["service_name"].replaceAll('_', '-'))
</source>
</configuration>
</execution>
</executions>
</plugin>
After that I use the maven-assembly-plugin, which uploads the data into Artifactory. This plugin reads the artifact id from the project instance ("project.getArtifact()"), so I update it directly.
In other words, I updated the artifact id directly in the Maven model.
As I said, it is not 100 percent correct, but in my case it helps.
You can do this with the build-helper-maven-plugin; it has a regex-property goal, which can set a property based on an initial value (your service_name property), a regular expression, and a replacement value.
Example from the usage page (adapted, because the value used there made no sense):
<project>
...
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>3.0.0</version>
<executions>
<execution>
<id>regex-property</id>
<goals>
<goal>regex-property</goal>
</goals>
<configuration>
<name>human.version</name>
<value>${project.version}</value>
<regex>-SNAPSHOT</regex>
<replacement> pre-release development version</replacement>
<failIfNoMatch>false</failIfNoMatch>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
...
</project>
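Adapted to the question's service_name property, the configuration might look roughly like the sketch below (untested; the derived property name is hypothetical, and note that a property set this way is only available from the bound phase onward, so it cannot feed the <artifactId> itself, which is resolved when the POM is read):
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>3.0.0</version>
<executions>
<execution>
<id>dashed-service-name</id>
<phase>validate</phase>
<goals>
<goal>regex-property</goal>
</goals>
<configuration>
<!-- hypothetical name for the derived property -->
<name>service_name.dashed</name>
<value>${service_name}</value>
<regex>_</regex>
<replacement>-</replacement>
<failIfNoMatch>false</failIfNoMatch>
</configuration>
</execution>
</executions>
</plugin>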
I have spent hours on this problem, searching several Google and SO entries; I have some ideas but I am not getting the result I want.
I have a Maven build that does something like this:
It grabs a jar containing JSON schemas and unpacks them.
Using the Maven Replacer plugin (v1.5.3), it replaces a line in a schema file called "MySchema.json", changing
"Hello" :
to
"HelloWorld" :
Then Maven uses another plugin to compile a class called "converter.java" and runs it to output a Java file based on "MySchema.json"; let's call the generated Java file "MyPojo.java".
Now, I want Maven to replace a line in “MyPojo.java”, but no matter what I do I cannot achieve this.
I tried:
Including a separate replacer plugin entry for step 4 after the plugin that converts schemas to Java, but of course this caused Maven to complain about an existing replacer plugin with the same artifact/group id from step 2.
Adding a separate execution id to the "replace" goal of the second plugin entry, but this is invalid for this plugin.
There is a parent project above my current project folder; I tried putting another replacer plugin in the parent POM and setting the phase to any of "package", "generate-resources", "compile", etc., but it did not work. Note: the phase for the replacements inside "MySchema.json" (in my current project POM) is generate-sources.
Giving an absolute path to the Java file, but it kept complaining that the path does not exist, even though I could copy and paste that path into the Windows Explorer address bar after the file was generated and open it from there. Note that the generated Java file "MyPojo.java" goes under "target/generated-sources", which is added as a source folder by a parent POM above this project using a Maven Helper plugin, so this folder should be visible as a source for further compilation. That Maven Helper plugin runs in the generate-sources phase.
Use with same result as above
In my current project (the non-parent one), this is the POM code:
<build>
<plugins>
<!-- execute a plugin to grab the schemas jar and unpack the schemas -->
...
<plugin>
<groupId>com.google.code.maven-replacer-plugin</groupId>
<artifactId>replacer</artifactId>
<version>1.5.3</version>
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>replace</goal>
</goals>
</execution>
</executions>
<configuration>
<includes>
<include>${project.basedir}/target/schemas/MySchema.json</include>
</includes>
<replacements>
<replacement>
<token>"Hello":</token>
<value>"Hello World":</value>
</replacement>
</replacements>
</configuration>
</plugin>
<!-- execute a plugin for converting schemas to POJOs -->
. . .
</plugins>
</build>
</project>
You should be able to declare the plugin only once and run two replace executions at different Maven Build Lifecycle phases:
Before the Json -> POJO conversion
After the Json -> POJO conversion
So, translating that into code would result in something like:
<plugin>
<!-- (unique) plugin declaration -->
<groupId>com.google.code.maven-replacer-plugin</groupId>
<artifactId>maven-replacer-plugin</artifactId>
<version>1.3.5</version>
<executions>
<!-- first execution: replace on json file -->
<execution>
<id>replace-for-json</id>
<phase>some-phase-before-conversion</phase>
<goals>
<goal>replace</goal>
</goals>
<configuration>
<filesToInclude>${project.basedir}/target/schemas/MySchema.json</filesToInclude>
<preserveDir>true</preserveDir>
<outputDir>target</outputDir>
<replacements>
<replacement>
<token>"Hello":</token>
<value>"Hello World (Json)":</value>
</replacement>
</replacements>
</configuration>
</execution>
<!-- second execution: replace on java file -->
<execution>
<id>replace-for-pojo</id>
<phase>some-phase-after-conversion</phase>
<goals>
<goal>replace</goal>
</goals>
<configuration>
<filesToInclude>${project.basedir}/target/generated-sources/MyPojo.java</filesToInclude>
<preserveDir>true</preserveDir>
<outputDir>target</outputDir>
<replacements>
<replacement>
<token>"Hello":</token>
<value>"Hello World (Java)":</value>
</replacement>
</replacements>
</configuration>
</execution>
</executions>
</plugin>
Source: Configuration for the maven-replacer-plugin on two separate executions
I'm new to annotation processing and I'm trying to automate it with Maven. I've put this in my pom.xml:
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.0</version>
<configuration>
<annotationProcessors>
<annotationProcessor>
co.aurasphere.revolver.annotation.processor.InjectAnnotationProcessor</annotationProcessor>
<annotationProcessor>
co.aurasphere.revolver.annotation.processor.RevolverContextAnnotationProcessor</annotationProcessor>
</annotationProcessors>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
The problem is that when I try to build the project I get a CompilationFailureException because Maven can't find the processors.
I've found other questions like this, solved by putting the dependency outside the plugin. I tried that, but nothing changed for me.
Am I missing something?
Thank you.
EDIT
Here is my dependency on another project which contains both the processor and the annotations:
<dependencies>
<dependency>
<groupId>co.aurasphere</groupId>
<artifactId>revolver-annotation-processor</artifactId>
<version>0.0.3-SNAPSHOT</version>
</dependency>
</dependencies>
EDIT 2:
After further investigation, I decided to decompile the processor JAR (built with Maven), and it turns out that... my classes are not there. For some reason, Maven is not packaging my classes into the JAR, and that's why they are not found. I've tried to figure out what's wrong with that build (this has never happened to me before, and I've used Maven for a while...).
First of all, the packaging on that project is jar.
The classes are all under src/main/java.
I've checked in my pom.xml that the classpath and the source path are the same.
Here's the processor pom:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>co.aurasphere</groupId>
<artifactId>revolver-annotation-processor</artifactId>
<version>0.0.3-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<!-- https://mvnrepository.com/artifact/javax.inject/javax.inject -->
<dependency>
<groupId>javax.inject</groupId>
<artifactId>javax.inject</artifactId>
<version>1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.velocity/velocity -->
<dependency>
<groupId>org.apache.velocity</groupId>
<artifactId>velocity</artifactId>
<version>1.7</version>
</dependency>
</dependencies>
</project>
EDIT 3
Here's the output of mvn clean install on the processor project. Unfortunately the output is too long, so I had to post it as an external link, even though I know that's not good.
EDIT 4
Here are some screenshots of my dependency hierarchy (images omitted).
Since the project was originally created as a simple Eclipse Java project and then converted to a Maven one, I tried creating a new Maven project and moving everything over, in the hope that the Eclipse plugin had messed something up, but the error was still there.
This is an extended version of the accepted answer above, provided by @Aurasphere. Hopefully this will give some explanation of how the proposed solution works.
First, some background on what is happening here. Say we want a custom annotation processor. We implement it and put it into a JAR as a Maven artifact, so that it can be consumed by other projects. When such projects are compiled, we want our annotation processor to be recognised by the Java compiler and used appropriately. To make this happen, one needs to tell the compiler about the new custom processor. The compiler looks in the resources and checks the fully qualified names of the classes listed in the META-INF/services/javax.annotation.processing.Processor file. It tries to find these classes on the classpath and loads them to run the processing of annotations used on the classes that are currently being compiled.
So, we want our custom class to be mentioned in this file. We could ask users of our library to create this file manually, but this is not intuitive, and users could be frustrated that the promised annotation processing doesn't work. That's why we might want to prepare this file in advance and deliver it together with the processor inside the JAR of our Maven artifact.
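For reference, the descriptor is just a plain text file at src/main/resources/META-INF/services/javax.annotation.processing.Processor listing the processors' fully qualified names, one per line; using the class names from the question it would contain:
co.aurasphere.revolver.annotation.processor.InjectAnnotationProcessor
co.aurasphere.revolver.annotation.processor.RevolverContextAnnotationProcessor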
The problem is that if we simply put this file with the FQN of the custom processor in it, it will trigger the compiler during the compilation of our own artifact, and since the processor itself is not yet compiled, the compiler will report an error about it. So we need to skip annotation processing to avoid this. This can be done using -proc:none, or with Maven:
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<proc>none</proc>
</configuration>
</plugin>
We might have unit tests that need our annotation processor. In Maven, test compilation is carried out after the main sources are built, so all classes are already available, including our processor. We just need to add a special step during the processing of test sources that uses our annotation processor. This can be done using:
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<executions>
<execution>
<id>process-test-annotations</id>
<phase>generate-test-resources</phase>
<goals>
<goal>testCompile</goal>
</goals>
<configuration>
<proc>only</proc>
<annotationProcessors>
<annotationProcessor>fully.qualified.Name</annotationProcessor>
</annotationProcessors>
</configuration>
</execution>
</executions>
</plugin>
I've found the answer myself. I figured out that the problem was the file javax.annotation.processing.Processor in META-INF/services/, which holds the registration of the annotation processor's class. In order to fix the problem, I had to add the following to the pom.xml of my processor project:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.5.1</version>
<configuration>
<compilerArgument>
-proc:none
</compilerArgument>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
</build>
This let Maven build the classes into the actual jar and fixed the problem. I don't know whether this is a bug or not, but it certainly looks strange to me. Thank you everybody for the help!
The easiest way is to register the annotation processor in the META-INF/services directory of the revolver-annotation-processor artifact. No Maven compiler configuration is needed.
Check whether it's already registered; if not, register it yourself if you control the source code.
https://docs.oracle.com/javase/8/docs/api/java/util/ServiceLoader.html
If you control the source code, I also recommend packaging the processor in the same artifact as the annotations. That way, whenever you're using one of the annotations, the annotation processor is also picked up by the compiler.
The accepted answer here works by disabling all annotation processing, which may not be suitable if other annotation processors need to run during the compilation. Instead, the SPI configuration file listing the newly compiled annotation processor can be added in a post-processing step. I added a directory src/main/post-resources to my project and this plugin configuration:
<plugin>
<artifactId>maven-resources-plugin</artifactId>
<version>3.3.0</version>
<executions>
<execution>
<id>annotation-processor-spi</id>
<phase>process-classes</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<outputDirectory>${project.build.outputDirectory}</outputDirectory>
<resources>
<resource>
<directory>src/main/post-resources</directory>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>
I have a Maven project that has different environments for development and production. There are some parts of the web.xml that need to be removed for production. Until today, there was an antrun-based XML task that did exactly that. Now I want to remove the scripting part and introduce two versions of the web.xml. Basically, I would extend the path "WEB-INF/web.xml" to "WEB-INF/dev/web.xml". This was configured in both profiles:
<profile>
<id>dev</id>
<activation>
<activeByDefault>true</activeByDefault>
</activation>
<properties>
<webxml.path>${project.basedir}/src/main/webapp/WEB-INF/dev/WEB.xml</webxml.path>
</properties>
</profile>
The same applies to the other profile, just with a different path.
Later, I use the maven-war-plugin to make sure Maven finds the file:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.4</version>
<executions>
<execution>
<id>default-war</id>
<phase>package</phase>
<goals>
<goal>war</goal>
</goals>
<configuration>
<warSourceDirectory>${project.build.directory}/${project.build.finalName}</warSourceDirectory>
<webXml>${webxml.path}</webXml>
<archive>
<manifest>
<addClasspath>true</addClasspath>
</manifest>
</archive>
</configuration>
</execution>
</executions>
</plugin>
And everything works fine, but only as long as I build from the command line.
I told Spring Tool Suite 3.7 to update with the dev profile; sadly, it somehow refuses to resolve that path.
When I use "create Descriptor-Stub" from the context menu, it even creates a file called ${webxml.path} with default web.xml content.
The provided error message is:
web.xml is missing and failOnMissingWebXml is set to true
I don't want to disable that check (failOnMissingWebXml) if possible.
Is there any way to get STS to be a bit more eager about resolving variables?
Update:
While writing this, I found out that it is possible to use the maven-resources-plugin to copy the file manually to the respective path when building from the IDE. It would be great to get m2e to respect the attribute from the war plugin. Or is this simply not possible? Is the build process not executed by Maven itself, even though I explicitly specified the external version?
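For what it's worth, a minimal sketch of that copy step might look like the following (the phase, the dev directory, the web.xml file name, and the output path mirroring the warSourceDirectory above are assumptions based on the question; untested):
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<executions>
<execution>
<id>copy-profile-webxml</id>
<phase>prepare-package</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<!-- assumption: matches the warSourceDirectory configured for the maven-war-plugin -->
<outputDirectory>${project.build.directory}/${project.build.finalName}/WEB-INF</outputDirectory>
<resources>
<resource>
<!-- assumption: the dev profile's directory from the question -->
<directory>${project.basedir}/src/main/webapp/WEB-INF/dev</directory>
<includes>
<include>web.xml</include>
</includes>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>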