I need to keep the @version tag of all class Javadocs in my project in sync, as well as the @author tag. However, I don't know an easy way to do this.
Is there a plugin (preferably a Maven plugin) that could accomplish this? And no, the maven-release-plugin will not do this for me.
The way I use @version is in conjunction with @since. IMHO, @version represents the version of the software when this class was last modified, and @since represents the version of the software when this file/class was created.
On @author, my policy is that each developer who has contributed to that class (in some major way) should append his/her name.
So, as you can see, all these processes are manual and need to be done by the class creator/modifier at the time of coding. And obviously you will end up with files at unequal versions, which I guess makes sense.
I would like to hear if someone differs on this.
Of course there's a Maven way to do it, but it's rather unusual:
Define your src/main/java folder as a <resource> with a fixed target path, then reconfigure the javadoc and jar plugins, something like this:
<build>
  <resources>
    <resource>
      <directory>src/main/java</directory>
      <targetPath>sources</targetPath>
      <filtering>true</filtering>
    </resource>
  </resources>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-javadoc-plugin</artifactId>
      <version>2.6.1</version>
      <configuration>
        <sourcepath>${project.build.outputDirectory}/sources</sourcepath>
      </configuration>
      <!-- other config stripped -->
    </plugin>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <version>2.3</version>
      <configuration>
        <excludes>
          <exclude>sources/**</exclude>
        </excludes>
      </configuration>
      <!-- other config stripped -->
    </plugin>
  </plugins>
</build>
Now you can use placeholders in your source files and interpolate them with Maven properties (see Maven resource filtering for reference).
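For example, a filtered source file could carry the version in its Javadoc as a placeholder. A minimal sketch (the class is made up; ${project.version} is a standard Maven property that filtering interpolates into the copy under target/classes/sources, which the javadoc plugin then reads):

/**
 * Example service class.
 *
 * @author  Jane Developer
 * @version ${project.version}
 * @since   1.0
 */
public class ExampleService {
    // nothing special in the body; only the Javadoc placeholder above is
    // replaced when the sources are copied as filtered resources
}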
I have a Maven plugin that exposes a Mojo with a goal that runs at the compile phase. The project was generated using mvn archetype:generate, and the POM contains all the standard stuff that comes with running that, with very little deviation. The project includes a couple of resource files, e.g. filea.txt and fileb.txt, that are packaged up as part of the jar.
When the plugin is used in a project, I'd like the files that are included in the jar to be extracted and copied to the target\test-classes directory of the host project. I'm trying to use the plugin jar both to distribute some files and to expose some functionality that can then use those files.
Is this a valid approach, and if so, are there settings I can add to the plugin POM to indicate that content from the plugin should be extracted and copied? I want to centralise this logic in the plugin, rather than having to do it in the plugin host.
I feel like it's something involving maven-dependency-plugin, maven-resources-plugin, or build-helper-maven-plugin:attach-artifact; I have tried a couple of different approaches but think I'm missing something obvious.
E.g. something like this in the plugin POM?
<plugins>
  <plugin>
    <artifactId>maven-clean-plugin</artifactId>
    <version>3.1.0</version>
  </plugin>
  <plugin>
    <artifactId>maven-resources-plugin</artifactId>
    <version>3.0.2</version>
    <configuration>
      <outputDirectory>${basedir}/target/test-classes</outputDirectory>
      <resources>
        <resource>
          <directory>src/main/resources</directory>
          <includes>
            <include>filea.txt</include>
            <include>fileb.txt</include>
          </includes>
        </resource>
      </resources>
    </configuration>
  </plugin>
  <plugin>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>3.8.0</version>
  </plugin>
  <plugin>
    <artifactId>maven-plugin-plugin</artifactId>
    <version>3.6.0</version>
  </plugin>
  <!-- etc etc -->
Google-fu has let me down; I keep ending up on the Maven resources page. I can post the directory structure / more information if needed.
Cheers
First, I would suggest putting the resources which need to be distributed into src/main/resources, which it looks like you have done... but remove the configuration for the maven-resources-plugin and let Maven do its work. That content is automatically copied into target/classes/, which in turn is packaged into the resulting jar later.
If your plugin needs to get at those files, they can be accessed as a usual resource via this.getClass().getResourceAsStream("/...") and then read and written to a new location, preferably somewhere under target/...
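A minimal sketch of how that logic could live inside the plugin's Mojo (the goal name, the bound phase, and the hard-coded file names are assumptions taken from the question, not a prescribed API):

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugins.annotations.LifecyclePhase;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;

// Hypothetical goal name and phase; adjust to whatever your plugin already defines.
@Mojo(name = "copy-bundled-files", defaultPhase = LifecyclePhase.GENERATE_TEST_RESOURCES)
public class CopyBundledFilesMojo extends AbstractMojo {

    /** The host project's build directory, normally "target". */
    @Parameter(defaultValue = "${project.build.directory}", readonly = true)
    private String buildDirectory;

    @Override
    public void execute() throws MojoExecutionException {
        Path outDir = Paths.get(buildDirectory, "test-classes");
        try {
            Files.createDirectories(outDir);
            for (String name : new String[] { "filea.txt", "fileb.txt" }) {
                // The files are on the plugin's own classpath because they were
                // packaged from src/main/resources into the plugin jar.
                try (InputStream in = getClass().getResourceAsStream("/" + name)) {
                    if (in == null) {
                        throw new MojoExecutionException(name + " not found on the plugin classpath");
                    }
                    Files.copy(in, outDir.resolve(name), StandardCopyOption.REPLACE_EXISTING);
                    getLog().info("Copied " + name + " to " + outDir);
                }
            }
        } catch (IOException e) {
            throw new MojoExecutionException("Failed to copy bundled files", e);
        }
    }
}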
I'm new to annotation processing and I'm trying to automate it with Maven. I've put this in my pom.xml:
<plugins>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>3.0</version>
    <configuration>
      <annotationProcessors>
        <annotationProcessor>co.aurasphere.revolver.annotation.processor.InjectAnnotationProcessor</annotationProcessor>
        <annotationProcessor>co.aurasphere.revolver.annotation.processor.RevolverContextAnnotationProcessor</annotationProcessor>
      </annotationProcessors>
      <source>1.7</source>
      <target>1.7</target>
    </configuration>
  </plugin>
</plugins>
The problem is that when I try to build the project I get a CompilationFailureException because Maven can't find the processors.
I've found other questions like this, solved by putting the dependency outside the plugin. I tried that, but nothing changed for me.
Am I missing something?
Thank you.
EDIT
Here is my dependency on another project which contains both the processor and the annotations:
<dependencies>
  <dependency>
    <groupId>co.aurasphere</groupId>
    <artifactId>revolver-annotation-processor</artifactId>
    <version>0.0.3-SNAPSHOT</version>
  </dependency>
</dependencies>
EDIT 2:
After further investigation, I decided to decompile the processor JAR (built with Maven) and it turns out that... my classes are not there. For some reason, Maven is not compiling my classes into the JAR, and that's why the processors are not found. I've tried figuring out what's wrong with that build (this has never happened to me before, and I've used Maven for a while...).
First of all, the packaging on that project is jar.
The classes are all under src/main/java.
I've checked in my pom.xml that the classpath and the source path are the same.
Here's the processor pom:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>co.aurasphere</groupId>
  <artifactId>revolver-annotation-processor</artifactId>
  <version>0.0.3-SNAPSHOT</version>
  <build>
    <plugins>
      <plugin>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.1</version>
        <configuration>
          <source>1.7</source>
          <target>1.7</target>
        </configuration>
      </plugin>
    </plugins>
  </build>
  <dependencies>
    <!-- https://mvnrepository.com/artifact/javax.inject/javax.inject -->
    <dependency>
      <groupId>javax.inject</groupId>
      <artifactId>javax.inject</artifactId>
      <version>1</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.velocity/velocity -->
    <dependency>
      <groupId>org.apache.velocity</groupId>
      <artifactId>velocity</artifactId>
      <version>1.7</version>
    </dependency>
  </dependencies>
EDIT 3
Here's the output of mvn clean install on the processor project. Unfortunately the output is too long and I had to post an external link, even though I know that's not ideal.
EDIT 4
Here are some screenshots of my dependency hierarchy (screenshots omitted).
Since the project was originally created as a plain Java project in Eclipse and then converted to a Maven one, I tried to create a new Maven project and move everything over, in the hope that the problem was the Eclipse plugin messing something up, but the error was still there.
This is an extended version of the accepted answer above provided by @Aurasphere. Hopefully this will give some explanation of how the proposed solution works.
First, some background on what is happening here. Say we want a custom annotation processor. We implement it and put it into a JAR as a Maven artifact, so that it can be consumed by other projects. When such projects are being compiled, we want our annotation processor to be recognised by the Java compiler and used appropriately. To make this happen, one needs to tell the compiler about the new custom processor. The compiler looks among the classpath resources and checks the fully qualified names of the classes listed in the META-INF/services/javax.annotation.processing.Processor file. It tries to find these classes on the classpath and loads them to run the processing of annotations used in the classes that are currently being compiled.
So, we want our custom class to be mentioned in this file. We could ask users of our library to add this file manually, but that is not intuitive, and users could be frustrated that the promised annotation processing doesn't work. That's why we might want to prepare this file in advance and deliver it, together with the processor, inside the JAR of our Maven artifact.
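For reference, that registration is just a plain text file on the classpath; a minimal sketch using the processor names from the question:

# src/main/resources/META-INF/services/javax.annotation.processing.Processor
co.aurasphere.revolver.annotation.processor.InjectAnnotationProcessor
co.aurasphere.revolver.annotation.processor.RevolverContextAnnotationProcessor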
The problem is that if we simply put this file with the FQN of the custom processor in it, it will be picked up by the compiler during compilation of our own artifact, and since the processor itself is not yet compiled, the compiler will report an error about it. So we need to skip annotation processing to avoid this. This can be done with the -proc:none compiler option, or in Maven:
<plugin>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <proc>none</proc>
  </configuration>
</plugin>
We might have unit tests that will need our annotation processor. In Maven, test compilation is carried out after the main sources are built, and all classes are already available, including our processor. We just need to add a special step during processing of the test sources that uses our annotation processor. This can be done using:
<plugin>
  <artifactId>maven-compiler-plugin</artifactId>
  <executions>
    <execution>
      <id>process-test-annotations</id>
      <phase>generate-test-resources</phase>
      <goals>
        <goal>testCompile</goal>
      </goals>
      <configuration>
        <proc>only</proc>
        <annotationProcessors>
          <annotationProcessor>fully.qualified.Name</annotationProcessor>
        </annotationProcessors>
      </configuration>
    </execution>
  </executions>
</plugin>
I've found the answer myself. I've figured out that the problem was the file javax.annotation.processing.Processor in META-INF/services/ with the configuration of the annotation processor's class. In order to fix the problem I had to add the following to the pom.xml configuration of my processor project:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.5.1</version>
      <configuration>
        <compilerArgument>-proc:none</compilerArgument>
        <source>1.7</source>
        <target>1.7</target>
      </configuration>
    </plugin>
  </plugins>
</build>
This let Maven build the classes into the actual jar and fixed the problem. I don't know if this is a bug or not but it surely looks strange to me. Thank you everybody for the help!
The easiest way is to register the annotation processor in the META-INF/services directory of the revolver-annotation-processor artifact. No Maven compiler configuration is needed.
Check if it's already registered; if not, register it yourself if you control the source code.
https://docs.oracle.com/javase/8/docs/api/java/util/ServiceLoader.html
If you control the source code, I also recommend packaging the processor in the same artifact as the annotations. That way, whenever you're using one of the annotations, the annotation processor is also picked up by the compiler.
The accepted answer here works by disabling all annotation processing, which may not be suitable if other annotation processors need to run during the compilation. Instead, the SPI configuration file listing the newly compiled annotation processor can be added in a post-processing step. I added a directory src/main/post-resources to my project and this plugin configuration:
<plugin>
  <artifactId>maven-resources-plugin</artifactId>
  <version>3.3.0</version>
  <executions>
    <execution>
      <id>annotation-processor-spi</id>
      <phase>process-classes</phase>
      <goals>
        <goal>copy-resources</goal>
      </goals>
      <configuration>
        <outputDirectory>${project.build.outputDirectory}</outputDirectory>
        <resources>
          <resource>
            <directory>src/main/post-resources</directory>
          </resource>
        </resources>
      </configuration>
    </execution>
  </executions>
</plugin>
I am using Maven to compile my project with this configuration:
<resources>
  <resource>
    <directory>src/main/resources</directory>
    <filtering>true</filtering>
  </resource>
</resources>
<plugins>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <configuration>
      <encoding>UTF-8</encoding>
      <source>${java.version}</source>
      <target>${java.version}</target>
      <showDeprecation>true</showDeprecation>
      <showWarnings>true</showWarnings>
    </configuration>
  </plugin>
</plugins>
The project should be in UTF-8, but by convention .properties files should be Latin-1 (ISO 8859-1), and Eclipse treats them that way (I know I can change how Eclipse behaves, but that's not the point). I use .properties files for internationalization.
The problem is that when I deploy to Tomcat from Eclipse I can see my special characters fine, but when compiling through Maven (for instance, through Jenkins), I get garbled characters, as if Maven were converting all my .properties files to UTF-8, thus breaking all my i18n messages.
What is the proper way to solve this? It feels like this should be a very common problem but I haven't found a valid solution online.
Just make a supplemental <resource> entry for the directory which contains the ISO Latin-1 files and turn off filtering explicitly. Then those files should be kept as they are...
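A minimal sketch of what that could look like, assuming the Latin-1 files are the *.properties files under src/main/resources (adjust the patterns to your layout):

<resources>
  <!-- everything except .properties is filtered as before -->
  <resource>
    <directory>src/main/resources</directory>
    <filtering>true</filtering>
    <excludes>
      <exclude>**/*.properties</exclude>
    </excludes>
  </resource>
  <!-- supplemental entry: .properties files are copied verbatim, untouched -->
  <resource>
    <directory>src/main/resources</directory>
    <filtering>false</filtering>
    <includes>
      <include>**/*.properties</include>
    </includes>
  </resource>
</resources>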
BTW: You should use the encoding property like:
<properties>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
which is recognized by a large number of plugins, for example maven-compiler-plugin, maven-resources-plugin, etc.
I have a JEE6 web application project. The project structure follows the Maven convention.
Now I have introduced additional web.xml files for this project.
So they are now stored in WEB-INF as below:
WEB-INF/
|__ A/web.xml
|__ B/web.xml
What is the Maven way to build a WAR that includes the proper web.xml depending upon a property?
I know how to add custom properties in Maven, but I cannot find how to configure the WAR plugin so that it chooses the appropriate file while building the WAR.
Any hints/suggestions/maven best practices in such cases are most welcome.
Thanks!!
The maven-war-plugin can be configured to add and filter some external resources. See http://maven.apache.org/plugins/maven-war-plugin/examples/adding-filtering-webresources.html.
So I would make two Maven profiles with two war plugin configurations, like this:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-war-plugin</artifactId>
  <version>2.3</version>
  <configuration>
    <webResources>
      <resource>
        <!-- this is relative to the pom.xml directory -->
        <directory>src/main/webapp/WEB-INF/A</directory>
        <includes>
          <include>web.xml</include>
        </includes>
        <targetPath>WEB-INF</targetPath>
      </resource>
    </webResources>
  </configuration>
</plugin>
<!-- repeat for your second profile -->
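Wrapped in profiles, the whole thing could look roughly like this (the profile ids webA and webB are hypothetical; activate one with e.g. mvn package -PwebA):

<profiles>
  <profile>
    <id>webA</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-war-plugin</artifactId>
          <version>2.3</version>
          <configuration>
            <webResources>
              <resource>
                <directory>src/main/webapp/WEB-INF/A</directory>
                <includes>
                  <include>web.xml</include>
                </includes>
                <targetPath>WEB-INF</targetPath>
              </resource>
            </webResources>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>
  <!-- a second profile "webB" repeats the block with WEB-INF/B -->
</profiles>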
BUT I think a better solution (if your project permits it) would be to keep only one web.xml file with some filtered properties inside. In this case, you just configure the war plugin to enable filtering of the deployment descriptor, like this:
<plugin>
  <artifactId>maven-war-plugin</artifactId>
  <version>2.3</version>
  <configuration>
    <filteringDeploymentDescriptors>true</filteringDeploymentDescriptors>
  </configuration>
</plugin>
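With filtering enabled, web.xml itself can carry the varying values as placeholders and each profile supplies its own value; a minimal sketch (the property name and profile ids are hypothetical):

<!-- web.xml fragment -->
<context-param>
  <param-name>appMode</param-name>
  <param-value>${webxml.app.mode}</param-value>
</context-param>

<!-- pom.xml: each profile defines the property -->
<profiles>
  <profile>
    <id>modeA</id>
    <properties>
      <webxml.app.mode>A</webxml.app.mode>
    </properties>
  </profile>
  <profile>
    <id>modeB</id>
    <properties>
      <webxml.app.mode>B</webxml.app.mode>
    </properties>
  </profile>
</profiles>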
I want to compile only selected files or directories (including subdirectories) within the source directory. I was pretty sure I could do this using the <includes> element of the maven-compiler-plugin's configuration, but it does not seem to work as I expect, since it still compiles all classes into target/classes. What is really strange is that the Maven output suggests the setting actually does its work, because with:
<plugin>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>2.5.1</version>
  <configuration>
    <includes>
      <include>com/example/dao/bean/*.java</include>
    </includes>
  </configuration>
</plugin>
I have:
[INFO] Compiling 1 source file to c:\Projects\test\target\classes
but with no compiler configuration I have:
[INFO] Compiling 14 source file to c:\Projects\test\target\classes
In both cases however, all 14 classes are compiled into target/classes as I mentioned. Can you explain that or suggest another solution to compile only selected files?
Simple app with 3 classes.
com/company/Obj1.java
com/company/Obj2.java
com/company/inner/Obj3.java
The build section in pom.xml:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>2.0.2</version>
      <configuration>
        <source>1.6</source>
        <target>1.6</target>
        <includes>
          <include>com/company/inner/*.java</include>
        </includes>
      </configuration>
    </plugin>
  </plugins>
</build>
Result: 1 class is compiled.
And any combination of includes works well.
Or did you mean something else?
I have faced a similar situation. We needed to hot-swap only modified files to our remote Docker container in order to improve deploy time after changes. This is how we solved it.
Add an includes option in the build plugin, driven by a command-line argument.
Note: since we wanted to add multiple files, we have used includes and not include.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.7.0</version>
  <configuration>
    <compilerVersion>1.8</compilerVersion>
    <source>1.8</source>
    <target>1.8</target>
    <includes>${changed.classes}</includes>
  </configuration>
</plugin>
Now run the compile phase with the argument, for example:
mvn compile -Dchanged.classes=com/demo/ClassA.java,com/demo/ClassB.java,com/demo2/*
The maven-compiler-plugin uses Ant-like inclusion/exclusion notation.
You can see examples in the Ant documentation for the Ant FileSet type.
If you want to include only files from one directory, you need to write it like you did:
<include>com/example/dao/bean/*.java</include>
To also include subdirectories under that path, it would be:
<include>com/example/dao/bean/**/*.java</include>
I had no difficulty including or excluding files for compilation with maven-compiler-plugin 2.5.1. Here is the dummy project I used for the purpose. Perhaps the include pattern that you use is different.