I have a Maven-based project, in which I am trying to add some JAXB classes automatically generated by the "jaxb2-maven-plugin" Maven plugin. However, my first cut has me in a circular dependency loop:
Because these JAXB classes aren't generated yet, my other sources which reference them have compilation errors.
Because those other sources have compilation errors, these JAXB classes don't get generated.
It seems like there are two obvious possibilities for solving this:
Comment out the broken references so that the project builds and the JAXB classes are automatically generated. Then copy those generated sources from /target into /src/main/java, so that references to them won't cause compilation errors.
Create an entirely separate project, consisting of nothing but the JAXB stuff. Include it as a dependency in my main project.
Am I missing something here? Option #1 seems flat-out ridiculous... that just can't be the manner in which people use JAXB. Option #2 seems more rational, but still rather inefficient and cumbersome. I really have to take on the overhead of an entirely separate project just to use JAXB?
Are there any more elegant approaches that developers use to reference JAXB-generated classes in the same project where the Maven plugin generates them?
UPDATE: By request, here is the relevant portion of my POM:
<build>
<plugins>
<plugin>
<!-- configure the compiler to compile to Java 1.6 -->
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.6</source>
<target>1.6</target>
</configuration>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>jaxb2-maven-plugin</artifactId>
<version>1.4</version>
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>xjc</goal>
</goals>
</execution>
</executions>
<configuration>
<!-- The name of your generated source package -->
<packageName>com.mypackage</packageName>
</configuration>
</plugin>
</plugins>
</build>
When I run mvn clean package, I DO see my JAXB sources being generated beneath the /target subdirectory. However, those generated sources are not being automatically added to the classpath for the compile phase.
POST-RESOLUTION UPDATE: It turns out that my compilation issues had more to do with the fact that I was running in Eclipse, and its Maven integration has some issues with "jaxb2-maven-plugin". See this StackOverflow question for more detail on that issue and its resolution.
How did you configure your JAXB Maven plugin? Normally it runs in the generate-sources phase, which comes before the compile phase, so your JAXB-generated classes should already be there when your own code gets compiled: Maven puts them in target/generated-sources and adds that folder as a source root for compilation.
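If for some reason that folder is not picked up automatically (some plugin versions and IDE integrations miss it), a common workaround is to register it explicitly as an extra source root with the build-helper-maven-plugin. A minimal sketch, assuming the plugin's default output directory and an example plugin version; adjust both to your setup:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>1.9.1</version>
  <executions>
    <execution>
      <id>add-jaxb-sources</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>add-source</goal>
      </goals>
      <configuration>
        <sources>
          <!-- assumed default output directory of the jaxb2-maven-plugin; change if you configured outputDirectory -->
          <source>${project.build.directory}/generated-sources/jaxb</source>
        </sources>
      </configuration>
    </execution>
  </executions>
</plugin>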
Edit:
This is the configuration we use at work (and which works as expected):
<plugin>
<groupId>com.sun.tools.xjc.maven2</groupId>
<artifactId>maven-jaxb-plugin</artifactId>
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>generate</goal>
</goals>
</execution>
</executions>
<configuration>
<schemaDirectory>src/main/resources/<companyname>/xsd</schemaDirectory>
<includeSchemas>
<includeSchema>retrieval.xsd</includeSchema>
<includeSchema>storage.xsd</includeSchema>
</includeSchemas>
</configuration>
</plugin>
Apparently we use yet another jaxb plugin... (see also this thread: Difference of Maven JAXB plugins).
I would suggest splitting the JAXB-generated classes (the API) and your BL classes (the implementation) into two Maven projects, each with its own pom.xml, plus a main root pom.xml that defines the build order. That way you can build api.jar, Maven will install it into the local repository, and you can then use it as a dependency of your implementation (a minimal root pom sketch follows the layout below). So it will look like:
-API\
--pom.xml - for api, jaxb generation
-IMPL\
--pom.xml - for impl, api dependency is here
pom.xml - main pom.xml with references to the projects above
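A minimal sketch of that root aggregator pom.xml (all coordinates and module names here are placeholders):
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>parent</artifactId>
  <version>1.0.0</version>
  <packaging>pom</packaging>
  <modules>
    <!-- the reactor builds API before IMPL because IMPL declares a dependency on the API artifact -->
    <module>API</module>
    <module>IMPL</module>
  </modules>
</project>
IMPL/pom.xml would then declare the API artifact as a regular dependency:
<dependency>
  <groupId>com.example</groupId>
  <artifactId>api</artifactId>
  <version>1.0.0</version>
</dependency>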
Maybe try using the maven-jaxb2-plugin instead:
<plugin>
<groupId>org.jvnet.jaxb2.maven2</groupId>
<artifactId>maven-jaxb2-plugin</artifactId>
<version>0.8.2</version>
<executions>
<execution>
<goals>
<goal>generate</goal>
</goals>
</execution>
</executions>
</plugin>
The answer from dfuse is correct, though. Either plugin should generate sources before compiling, and the result of the source generation will be on the classpath. I tested this with both plugins. Is it possible for you to post your schema, or at least the schema for the type that your code is failing to pick up on the classpath?
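As a quick sanity check (plain Maven, nothing plugin-specific), you can also run only the generation step and inspect the output:
mvn clean generate-sources
and then look under target/generated-sources to confirm that the expected packages were actually produced.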
Related
Is it possible to use annotation processor in the same project where it is defined?
Example:
src/
MyAnnotation.java
path_to_MyAnnotationProcessor.MyAnnotationProcessor.java
other classes
resources
META-INF/services/javax.annotation.processing.Processor
pom
When I run mvn clean install, I expect my processor to process the classes annotated with MyAnnotation.
I don't want to import an already compiled processor from another lib; I just want to use it once I have defined it in my src.
For now, I get this error:
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.6.1:compile (default-compile) on project my-project: Compilation failure
[ERROR] Annotation processor 'path_to_MyAnnotationProcessor' not found
The part of my pom.xml where I reference my processor:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>${maven.plugin.compiler}</version>
<configuration>
<source>${version.java}</source>
<target>${version.java}</target>
<annotationProcessors>
<proc>path_to_MyAnnotationProcessor.MyAnnotationProcessor</proc>
</annotationProcessors>
</configuration>
</plugin>
Thanks to everybody, especially to Stefan Ferstl and yegodm. The solution that came from yegodm is:
"One way is to have two modules in the same project. One module would define annotations and processor. Another would have it as a dependency to establish build order."
The easiest way to solve this problem is to convert your project into a multi-module project where the annotation processor is in its own module. Having a separate module for the annotation processor, you can use the quite new <annotationProcessorPaths> option to reference the annotation processor via groupId/artifactId.
The module using the annotation processor might need a dependency on the annotation-processor module so that it gets built first.
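A minimal sketch of what that could look like in the module that uses the processor (the processor module's coordinates below are hypothetical placeholders):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.6.1</version>
  <configuration>
    <annotationProcessorPaths>
      <path>
        <!-- hypothetical coordinates of the separate annotation-processor module -->
        <groupId>com.example</groupId>
        <artifactId>my-annotation-processor</artifactId>
        <version>1.0.0</version>
      </path>
    </annotationProcessorPaths>
  </configuration>
</plugin>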
Note: In a previous version of this answer I described an additional way to solve this problem, which apparently didn't work out of the box. That part has been deleted.
You could compile your processor earlier with a separate compiler execution.
<build>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<executions>
<execution>
<id>compile-generator</id>
<phase>generate-sources</phase>
<goals>
<goal>compile</goal>
</goals>
<configuration>
<includes>
<include>com/example/YourProcessor.java</include>
</includes>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
I've tested this, and it works - the processor does get invoked later during the actual compile phase.
If you precompile some other classes from the same project too, then you could directly reference and use them in the processor. That could be useful.
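For example, the includes of that early execution could be widened to also precompile helper classes that the processor uses (the second path below is purely hypothetical):
<includes>
  <include>com/example/YourProcessor.java</include>
  <!-- hypothetical helper classes referenced by the processor -->
  <include>com/example/processor/**/*.java</include>
</includes>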
I'm looking to see whether we can migrate from the current legacy (mojo) GWT Maven plugin to the new-generation (ltgt) Maven plugin. I've read documentation such as http://www.g-widgets.com/2016/12/02/gwt-tip-working-with-maven-multi-modules-projects/ which outlines how to set up the code as separate Maven (POM) modules. Considering we already have a project setup where the application has multiple GWT modules that are all part of the same POM, is there any way we can get the plugin to compile the code successfully, or does each module have to be separated into a Maven module of its own?
There is no need to change the structure of your project, though you would be missing out on the clean separation of client and server code via Maven modules (not to be confused with the GWT modules).
That said, here is an example of how to use the new GWT Maven plugin without having multiple Maven modules:
Example Project structure with only one Maven Module: https://github.com/branflake2267/Archetypes/tree/master/archetypes/gwt-basic-rpc
In case you have multiple GWT modules inside one Maven module, you have to specify multiple executions (unlike in the old plugin):
Example Plugin config with multiple GWT modules:
<plugin>
<groupId>net.ltgt.gwt.maven</groupId>
<artifactId>gwt-maven-plugin</artifactId>
<executions>
<execution>
<id>compile-module1</id>
<goals>
<goal>compile</goal>
</goals>
<configuration>
<moduleName>com.example.module1.Module1</moduleName>
<moduleShortName>module1</moduleShortName>
<compilerArgs>
<compilerArg>-localWorkers</compilerArg>
<compilerArg>4</compilerArg>
<compilerArg>-draftCompile</compilerArg>
</compilerArgs>
</configuration>
</execution>
<execution>
<id>compile-module2</id>
<goals>
<goal>compile</goal>
</goals>
<configuration>
<moduleName>com.example.module2.Module2</moduleName>
<moduleShortName>module2</moduleShortName>
<compilerArgs>
<compilerArg>-draftCompile</compilerArg>
</compilerArgs>
</configuration>
</execution>
</executions>
</plugin>
There is also a little migration guide on the plugin website.
If you are ever interested in what the proper multi-module setup would look like, see here.
I want to deploy two jar artifacts with different classifiers, but at the moment that fails because both supply their own version of pom.xml. How can I fix that, so that both pom.xmls can be uploaded along with their artifacts?
Example - I have com.test.company.somelib-1.0.0-cmp1.jar and com.test.company.somelib-1.0.0-cmp2.jar, where cmpX is a classifier. Both packages contain (logically) the same code and classes (of the same version); they only differ slightly in how they were preprocessed. The classifier is there due to backwards compatibility we need to maintain.
Long story short, first artifact uploads fine, second one fails with Forbidden, because our repository does not allow overwriting artifacts (and I want to keep it that way).
There is a slightly different pipeline that creates both the packages, so it is easier to have their builds separate. I just want to deploy them as two packages of the same name and different classifier.
Thanks for the help.
Edit: it has been suggested to use Maven profiles. I can see that they would work, but they would not be ideal.
Consider the setup I have depicted in the picture below - there is a CI server (TeamCity).
There is a "starter" build (Sources). This build checkouts all required source files.
From this starter build several other builds are triggered (processing using x.x.x/compile). Each of those builds adjusts a template-pom.xml (fills in particular classifier and other info), and then builds and deploys its artifact to our Artifactory.
With the setup I want to achieve, if I decide to add another processing build, all I need to do is add another "branch". If I were using profiles, I would also need to add a new profile to the pom.xml file.
Correct me if I am wrong please. Profiles seem to be able to achieve the goal, but not ideally, at least in my case.
I strongly discourage having 2 (or more) different pom files with the same GAV.
But I understand your need is raised by legacy reasons.
I have not tried this myself, but it could work:
Leave one build (= Maven project) as you have it now. On the other build, skip the normal deployment and manually invoke the deploy-file goal of the deploy plugin like so:
<build>
<plugins>
<!-- skip normal execution of deploy plugin -->
<plugin>
<artifactId>maven-deploy-plugin</artifactId>
<executions>
<execution>
<id>default-deploy</id>
<configuration>
<skip>true</skip>
</configuration>
</execution>
</executions>
</plugin>
<!-- invoke with goal: deploy-file -->
<plugin>
<artifactId>maven-deploy-plugin</artifactId>
<executions>
<execution>
<id>someId</id>
<phase>deploy</phase>
<goals>
<goal>deploy-file</goal>
</goals>
<inherited>false</inherited>
<configuration>
<file>path-to-your-artifact-jar</file>
<generatePom>false</generatePom>
<artifactId>xxx</artifactId>
<groupId>xxx</groupId>
<version>xxx</version>
<classifier>xxx</classifier>
<packaging>xxx</packaging>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
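Note that the deploy-file goal also needs to know which repository to deploy to; unless that is already covered by your setup, you would likely add something like the following to its configuration (both values are placeholders for your own repository):
<url>https://your.repository.host/your-repo</url>
<repositoryId>your-repo-id</repositoryId>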
Sonatype has a repository that I want to deploy a jar file to, and they ask for separate files for application, sources, and javadocs:
Example:
example-application-1.4.7.pom
example-application-1.4.7.jar
example-application-1.4.7-sources.jar
example-application-1.4.7-javadoc.jar
In Scala SBT, I have a command called "package" that generates the jar file for the project, but that only generates "example-application-1.4.7.jar".
Question: What should I do to generate the other two jar files?
In Maven, in order to get the additional -sources and -javadoc artifacts, add to your POM file the following:
<build>
<plugins>
<!-- additional plugin configurations, if any.. -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-source-plugin</artifactId>
<version>3.0.0</version>
<executions>
<execution>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>2.10.3</version>
<executions>
<execution>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
Note the snippet above:
We are invoking the Maven Source Plugin to create an additional jar file for the sources
We are invoking the Maven Javadoc Plugin to create an additional jar file for the javadoc
Executing
mvn clean package
You will find these two additional jars in the target folder.
The .pom file instead is generated during the install phase, but it is not placed under the target folder. Basically, it is a copy of your pom.xml file, with a different extension and used by Maven during the dependency mediation process to check which transitive dependencies are required by the concerned artifact.
Executing
mvn clean install
Maven will install the artifact in your local cache (in your machine), under path_to_cache/.m2/repository/your_groupId/your_artifactId/your_version/. In this folder, you will also find the .pom file, which normally you don't need to distribute (it is created automatically by Maven).
Further note: you probably don't want to generate these additional jar files at each and every build, so to speed up normal builds and have them only on demand, you could wrap the snippet above in a Maven profile.
You can achieve this by removing the snippet above from your build section and adding a further section at the end of your pom:
<profiles>
<profile>
<id>prepare-distribution</id>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-source-plugin</artifactId>
<version>3.0.0</version>
<executions>
<execution>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>2.10.3</version>
<executions>
<execution>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
That way normal builds would not create these jars anymore, but when executing the following:
mvn clean install -Pprepare-distribution
You would instead get them back. The -P option activates on demand the profile defined with the id prepare-distribution.
With Maven 3, a default profile already comes as part of the super POM which performs exactly the same actions (sources and javadoc artifacts), hence there is no need to add anything to your existing project. Simply run:
mvn clean install -Prelease-profile
Or, to activate it via a property
mvn clean install -DperformRelease=true
However, as also specified in the super POM, this profile may be removed in future releases (although it has been there since the first Maven 3 version and is still present as of version 3.3.9):
NOTE: The release profile will be removed from future versions of the super POM
The main reason behind this warning is most probably to push for the usage of the Maven Release Plugin, which indirectly makes use of this profile via the useReleaseProfile option of the release:perform goal.
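For orientation only (it needs SCM information configured in the POM and a few more prerequisites), the typical invocation of that plugin is:
mvn release:prepare
mvn release:perform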
As highlighted by the comments, if you are not familiar with Maven (especially via the console), I would definitely recommend that you:
Go through the official Maven in 5 minutes documentation for a quick but worthy look.
Play with Maven from the command line; that is where Maven gives you its best. IDE integrations are great, but the command line is the real turning point.
Then play with the POM customization above, to get familiar with some concepts and behaviors, first directly as part of your default build, then moved to a profile.
Then, and only then, move to the Maven Release Plugin usage. I recommend it as the last step because you will already have acquired more confidence and understanding, and will see it as less magic and a more reasonable approach.
I am new to Maven. I have created a Maven project which will be packaged as a JAR. I ran clean package and the jar was created. When I extracted that same jar, I could not see inside it any of the dependencies (jars) I added in pom.xml. If I give this jar to third-party clients, how will the code work without any of the dependent jars? Please help me understand how Maven manages the jars.
Thanks!
Maven handles dependencies based on how you configure the dependency plugin.
See this reference for a simple example of how to do this.
In this example, the following code configures where your dependencies will end up:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.5.1</version>
<executions>
<execution>
<id>copy-dependencies</id>
<phase>package</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<includeGroupIds>log4j</includeGroupIds>
<outputDirectory>${project.build.directory}/dependency-jars/</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
Then this code sets up the classpath for your main jar, which will allow anyone running it to find these dependencies:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.4</version>
<configuration>
<archive>
<manifest>
<addClasspath>true</addClasspath>
<mainClass>com.mkyong.core.App</mainClass>
<classpathPrefix>dependency-jars/</classpathPrefix>
</manifest>
</archive>
</configuration>
</plugin>
Your other option would be to create a single jar, with all dependencies included, by following this example here
You could distribute the jar and the POM file if you want to try and provide your users with the files in that manner, but they'd need to be able to access your Maven repository where those dependencies are kept.
Core Maven doesn't handle this. Maven is a build tool; its job is to build an artifact (a jar in your case). The dependencies you define in your module's pom.xml file are needed to get the code compiled. You'll need Maven plugins to go beyond that.
Now, you're asking not about the build, but about the distribution of your compiled binaries.
If I understand correctly, that should be a lot of jars (yours and your dependencies'). Alternatively, you may distribute the code as a single jar with the dependencies inside.
Example:
A first case:
If your code resides in module a (say, its classes are in packages org.a.*) and depends on some third party (say, log4j, whose classes reside in org.apache.log4j), then you can expect that your jar will only contain the classes of module a, and that log4j will be added automatically by the user of your module (the first case).
A second case:
module a.jar will contain both the org.a.* and the org.apache.log4j.* classes, everything in the same jar.
In general the first approach is the "healthier" one, and in this case you shouldn't do anything special in Maven. Maybe your distribution tool/documentation should contain this information.
If someone uses module a in his/her code as a third party (if you develop a framework or something) and his/her project is mavenized, then the fact that you've defined a dependency on log4j will make Maven download log4j as well as your a.jar (in Maven terminology, these are called "transitive dependencies").
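To make the first case concrete: module a's pom.xml simply declares the dependency, and consumers pick it up transitively (the version here is only an example):
<dependency>
  <groupId>log4j</groupId>
  <artifactId>log4j</artifactId>
  <version>1.2.17</version>
</dependency>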
If you're interested in the second case (this can be relevant if you provide some "client application", like a "JNDI client for some server", for example), you might want to take a look at the Maven Shade Plugin.
Beware that this can lead to dependency hell (what if the application that uses your client also makes use of log4j? What if the log4j versions differ?).
Bottom line, you probably want the first approach, think twice before you decide the second approach :)
One more tip: if you just want to download all the dependencies of your module "a", you might want to use the Maven Dependency Plugin - type the following in the command prompt:
mvn dependency:copy-dependencies
and you'll find all the dependencies in the target/dependency folder.
Hope this helps and happy mavening
The simplest solution to the problem is to use the maven-assembly-plugin which can create such jar with dependencies like the following:
<build>
<plugins>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.4</version>
<executions>
<execution>
<id>distro-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
Afterwards you can distribute the created jar xyz-1.0-jar-with-dependencies.jar, which contains the defined dependencies.
If you need more control over how the resulting artifact is created, or if some files need to be overwritten etc., you might take a deeper look at the maven-shade-plugin.
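For reference, a rough shade sketch that likewise produces a single jar with the dependencies merged in (the version shown is just an example; no relocation or resource filtering is configured here):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>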