Maven: how to configure javadoc to generate private and package-private documentation too - java

I am working with Maven 3.6.3 on a single-module project. For the generation of the Javadoc, the pom.xml file contains:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>${maven.javadoc.plugin.version}</version>
<configuration>
<source>${jdk.version}</source>
</configuration>
</plugin>
Where maven.javadoc.plugin.version is 3.2.0.
The plugin works as expected:
it generates documentation for all public and protected classes and methods by default.
Now, for development purposes, I need to include private and package-private classes and methods as well. What is the correct extra configuration? It is possible in Gradle, so I am assuming it is possible in Maven too.

Add
<show>private</show>
to your <configuration/>.
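Applied to the plugin configuration from the question, that would look roughly like this:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <version>${maven.javadoc.plugin.version}</version>
  <configuration>
    <source>${jdk.version}</source>
    <show>private</show>
  </configuration>
</plugin>
With <show>private</show>, Javadoc documents all classes and members: private, package-private, protected and public.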

Related

maven-javadoc-plugin: How to update the module path dynamically

I have a Java (11.0.7) Maven (3.0.6) multi-module project that contains the following module declarations:
<modules>
<module>jdrum-commons</module>
<module>jdrum-datastore-base</module>
<module>jdrum-datastore-simple</module>
<module>jdrum</module>
</modules>
Each of these Maven modules contains a module-info that defines the necessary requirements and exports to restrict access and visibility.
As such, jdrum-datastore-simple has some test utility classes that I reuse in jdrum's tests. By configuring the surefire plugin in jdrum's config via the code snippet below I am able to package the whole project without any issues.
<build>
<plugins>
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<argLine>
<!-- Allow the unnamed module access to the tests at test-time -->
--add-opens jdrum/at.rovo.drum.impl=ALL-UNNAMED
--illegal-access=deny
</argLine>
</configuration>
</plugin>
</plugins>
</build>
Within the parent POM I've also configured the generation of a report via the site goal, which also generates the Javadoc of the respective projects. The configuration for the JAR containing the Javadoc and the configuration for the Javadoc generation as part of the report are the same and look like this:
<!-- Generate Javadoc while reporting -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>3.2.0</version>
<inherited>true</inherited>
<configuration>
<verbose>true</verbose>
<source>${maven.compiler.source}</source>
<show>protected</show>
<failOnWarnings>false</failOnWarnings>
<release>${maven.compiler.release}</release>
<stylesheet>java</stylesheet>
</configuration>
<reportSets>
<reportSet>
<id>html</id>
<reports>
<report>javadoc</report>
</reports>
</reportSet>
</reportSets>
</plugin>
The Javadoc generation as part of the package step, which produces project-version-javadoc.jar as output, succeeds, as both the jdrum-datastore-simple dependency and its tests are only included at test time:
<!-- Test data store to use for testing -->
<dependency>
<groupId>at.rovo</groupId>
<artifactId>jdrum-datastore-simple</artifactId>
<version>${project.parent.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>at.rovo</groupId>
<artifactId>jdrum-datastore-simple</artifactId>
<version>${project.parent.version}</version>
<scope>test</scope>
<type>test-jar</type>
</dependency>
If I changed the scope from test to compile or provided, the Javadoc generation would also fail with an error such as
Exit code: 1 - javadoc: error - The code being documented uses packages in the unnamed module, but the packages defined in https://github.com/RovoMe/JDrum/jdrum-datastore-simple/apidocs/ are in named modules.
The issue here, as far as I understand the problem, is that the jdrum-datastore-simple module is not added to Javadoc's module path. The next logical step was therefore to add that module to the configuration like so:
<reporting>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<configuration>
<additionalOptions>
<option>--add-modules</option>
<option>jdrum.datastore.simple</option>
</additionalOptions>
</configuration>
</plugin>
</plugins>
</reporting>
This adds the jdrum-datastore-simple module to the Javadoc configuration string, which can be seen in the jdrum/target/site/apidocs/options file that now contains an
...
--add-modules
jdrum.datastore.simple
...
entry. On further analysis of the generated options file it is apparent that the module path is missing a reference to the actual JAR file, and hence the Javadoc generation, and thus the Maven process, fails because Javadoc is not able to locate the defined module. If I update that options file, add the path to the missing JAR file and then only perform mvn package site, the whole process succeeds and all is fine (as would the pure invocation of the javadoc.bat located in the target/site/apidocs folder).
Now, in order to make the whole process more dynamic I wanted to add or update the module path. However, the maven-javadoc-plugin does not directly allow this. Therefore I came up with adding a further maven-javadoc-plugin option, --module-path, and a further option entry that contains the whole path. By the whole path I mean the path to every single dependency, not only the path to jdrum-datastore-simple. This also works, but due to hardcoding the paths to the respective JAR files, the project is not usable by other users unless they have the same system and path structure I used. To fix this I quickly replaced the respective path segments with the ${settings.localRepository} and ${project.parent.basedir} properties for the respective modules on the module path. Unfortunately Javadoc is rather nitpicky about the path structure it accepts, and it turns out that on my Windows machine Maven returns a path structure starting with C:\Users\..., which Javadoc can't handle. If the path structure looks like C:/Users/... however, Javadoc is fine with the values.
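To illustrate, the additional options I ended up with look roughly like this (only one path entry is shown; the real value lists every dependency, separated by the platform's path separator):
<additionalOptions>
  <option>--add-modules</option>
  <option>jdrum.datastore.simple</option>
  <option>--module-path</option>
  <option>${settings.localRepository}/at/rovo/jdrum-datastore-simple/${project.parent.version}/jdrum-datastore-simple-${project.parent.version}.jar</option>
</additionalOptions>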
On further research I stumbled upon this thread, which suggests using Maven's build-helper-maven-plugin to define new properties for, e.g., the M2 repository, and using its built-in regex capability to replace \ characters with /. However, adding a configuration such as
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>3.1.0</version>
<executions>
<execution>
<id>replace-local-repo-characters</id>
<goals>
<goal>regex-property</goal>
</goals>
<configuration>
<name>tag.m2repo</name>
<value>${settings.localRepository}</value>
<regex>\\</regex>
<replacement>/</replacement>
<failIfNoMatch>false</failIfNoMatch>
</configuration>
</execution>
<execution>
<id>replace-local-path-characters</id>
<goals>
<goal>regex-property</goal>
</goals>
<configuration>
<name>tag.basedir</name>
<value>${project.parent.basedir}</value>
<regex>\\</regex>
<replacement>/</replacement>
<failIfNoMatch>false</failIfNoMatch>
</configuration>
</execution>
</executions>
</plugin>
and using the introduced properties instead does not work at all, as Maven complains about an invalid value being provided. If I use $\{settings.localRepository} Maven accepts the value, but in the final options file it is not the actual value of settings.localRepository that ends up there but the provided string itself, and I end up with something like $/{settings.localRepository}/org/slf4j/..., which Javadoc can't resolve and which therefore still misses the correct location of the jdrum-datastore-simple dependency.
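For clarity, "using the introduced properties" means referencing them in the additional options, roughly like this (again with only a single, exemplary path entry):
<additionalOptions>
  <option>--module-path</option>
  <option>${tag.m2repo}/at/rovo/jdrum-datastore-simple/${project.parent.version}/jdrum-datastore-simple-${project.parent.version}.jar</option>
</additionalOptions>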
So, how can I add the path to the missing dependency to maven-javadoc-plugin's module path defined in the generated options file, so that Maven is actually able to generate the whole report?
It seems that with Java 11 Update 9 (maybe also with Update 8; not tested) the maven-javadoc-plugin is able to correctly generate the Javadoc for multi-module projects without the need to alter the module path.
For those interested in how the actual Maven POMs look:
Parent POM
POM for a shared module
POM for a sharing and consuming module
POM for the consuming module

Maven annotation processing processor not found

I'm new to annotation processing and I'm trying to automate it with Maven. I've put this in my pom.xml:
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.0</version>
<configuration>
<annotationProcessors>
<annotationProcessor>co.aurasphere.revolver.annotation.processor.InjectAnnotationProcessor</annotationProcessor>
<annotationProcessor>co.aurasphere.revolver.annotation.processor.RevolverContextAnnotationProcessor</annotationProcessor>
</annotationProcessors>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
The problem is that when I try to build the project I get a CompilationFailureException because Maven can't find the processors.
I've found other questions like this, solved by putting the dependency outside the plugin. I tried that, but nothing changed for me.
Am I missing something?
Thank you.
EDIT
Here is my dependency on another project which contains both the processor and the annotations:
<dependencies>
<dependency>
<groupId>co.aurasphere</groupId>
<artifactId>revolver-annotation-processor</artifactId>
<version>0.0.3-SNAPSHOT</version>
</dependency>
</dependencies>
EDIT 2:
After further investigation, I decided to decompile the processor JAR (built with Maven) and it turns out that... my classes are not there. For some reason, Maven is not compiling my classes into the JAR, and that's why the processors are not found. I've tried to figure out what's wrong with that build (this has never happened to me before, and I've used Maven for a while...).
First of all, the packaging on that project is jar.
The classes are all under src/main/java.
I've checked in my pom.xml that the classpath and source path is the same.
Here's the processor pom:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>co.aurasphere</groupId>
<artifactId>revolver-annotation-processor</artifactId>
<version>0.0.3-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<!-- https://mvnrepository.com/artifact/javax.inject/javax.inject -->
<dependency>
<groupId>javax.inject</groupId>
<artifactId>javax.inject</artifactId>
<version>1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.velocity/velocity -->
<dependency>
<groupId>org.apache.velocity</groupId>
<artifactId>velocity</artifactId>
<version>1.7</version>
</dependency>
</dependencies>
EDIT 3
Here's the output of a mvn clean install on the processor project. Unfortunately the output is too long, so I had to post it as an external link, even though I know that's not ideal.
EDIT 4
Here are some screenshots of my dependency hierarchy.
Since the project was originally created as an Eclipse simple Java project and then converted to a Maven one, I tried to create a new Maven project and move everything to the new one in the hope that the problem was the Eclipse plugin that messed something up, but the error was still there.
This is an extended version of the accepted answer above provided by @Aurasphere. Hopefully this will give some explanation of how the proposed solution works.
First, some background on what is happening here. Say we want a custom annotation processor. We implement it and put it into a JAR as a Maven artifact, so that it can be consumed by other projects. When such projects are compiled, we want our annotation processor to be recognised by the Java compiler and used appropriately. To make this happen, one needs to tell the compiler about the new custom processor. The compiler looks in the resources and checks the fully qualified names of the classes listed in the META-INF/services/javax.annotation.processing.Processor file. It tries to find these classes on the classpath and loads them to run the processing of annotations used on the classes that are currently being compiled.
So we want our custom class to be mentioned in this file. We could ask users of our library to add this file manually, but this is not intuitive, and users could be left wondering why the promised annotation processing doesn't work. That's why we might want to prepare this file in advance and deliver it together with the processor inside the JAR of our Maven artifact.
The problem is that if we simply put this file with the FQN of the custom processor in it, it will trigger the compiler during compilation of our own artifact, and since the processor itself is not yet compiled, the compiler will report an error about it. So we need to skip annotation processing to avoid this. This can be done with -proc:none, or with Maven:
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<proc>none</proc>
</configuration>
</plugin>
We might have unit tests that need our annotation processor. In Maven, test compilation is carried out after the main sources are built, so all classes, including our processor, are already available. We just need to add a special step during the processing of test sources that uses our annotation processor. This can be done using:
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<executions>
<execution>
<id>process-test-annotations</id>
<phase>generate-test-resources</phase>
<goals>
<goal>testCompile</goal>
</goals>
<configuration>
<proc>only</proc>
<annotationProcessors>
<annotationProcessor>fully.qualified.Name</annotationProcessor>
</annotationProcessors>
</configuration>
</execution>
</executions>
</plugin>
I've found the answer myself. The problem was the file META-INF/services/javax.annotation.processing.Processor, which registers the annotation processor's class. In order to fix the problem I had to add the following to the pom.xml of my processor project:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.5.1</version>
<configuration>
<compilerArgument>
-proc:none
</compilerArgument>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
</build>
This let Maven build the classes into the actual jar and fixed the problem. I don't know if this is a bug or not but it surely looks strange to me. Thank you everybody for the help!
The easiest way is to register the annotation processor in the META-INF/services directory of the revolver-annotation-processor artifact. No Maven compiler configuration is needed.
Check if it's already registered, if not, register it yourself if you control the source code.
https://docs.oracle.com/javase/8/docs/api/java/util/ServiceLoader.html
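For reference, such a registration is just a plain-text file at src/main/resources/META-INF/services/javax.annotation.processing.Processor listing one fully qualified processor class name per line; with the processors from the question it would presumably contain:
co.aurasphere.revolver.annotation.processor.InjectAnnotationProcessor
co.aurasphere.revolver.annotation.processor.RevolverContextAnnotationProcessor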
If you control the source code, I also recommend packaging the processor in the same artifact as the annotations. That way, whenever you're using one of the annotations, the annotation processor is also picked up by the compiler.
The accepted answer here works by disabling all annotation processing, which may not be suitable if other annotation processors need to run during the compilation. Instead, the SPI configuration file listing the newly compiled annotation processor can be added in a post-processing step. I added a directory src/main/post-resources to my project and this plugin configuration:
<plugin>
<artifactId>maven-resources-plugin</artifactId>
<version>3.3.0</version>
<executions>
<execution>
<id>annotation-processor-spi</id>
<phase>process-classes</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<outputDirectory>${project.build.outputDirectory}</outputDirectory>
<resources>
<resource>
<directory>src/main/post-resources</directory>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>

How does a Maven jar project work when it is packaged to a jar?

I am new to Maven. I have created a Maven project which will be packaged as a JAR. I ran clean package and the jar was created. When I extracted that jar, I could not find any of the dependencies (jars) I added in the pom.xml inside the packaged jar. If I give this jar to third-party clients, how will the code work without any of its dependent jars? Please help me understand how Maven manages the jars.
Thanks!
Maven handles dependencies based on how you configure the dependency plugin.
See this reference for a simple example of how to do this.
In this example, the following code configures where your dependencies will end up:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.5.1</version>
<executions>
<execution>
<id>copy-dependencies</id>
<phase>package</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<includeGroupIds>log4j</includeGroupIds>
<outputDirectory>${project.build.directory}/dependency-jars/</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
Then this code sets up the classpath for your main jar, which will allow anyone running it to find these dependencies:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.4</version>
<configuration>
<archive>
<manifest>
<addClasspath>true</addClasspath>
<mainClass>com.mkyong.core.App</mainClass>
<classpathPrefix>dependency-jars/</classpathPrefix>
</manifest>
</archive>
</configuration>
</plugin>
Your other option would be to create a single jar, with all dependencies included, by following this example here
You could distribute the jar and the POM file if you want to try and provide your users with the files in that manner, but they'd need to be able to access your Maven repository where those dependencies are kept.
Core Maven doesn't handle this. Maven is a build tool; its job is to build an artifact (a jar in your case). The dependencies you define in your module's pom.xml file are needed to get the code compiled. You'll need Maven plugins to go beyond that.
Now, you're asking not about the build, but about the distribution of your compiled binaries.
If I understand correctly, that should be a lot of jars (yours and your dependencies'). Alternatively, you may distribute the code as a single jar with the dependencies inside.
Example:
The first case:
If your code resides in module A (say, the code is in packages org.a.*) and depends on some third party (say, log4j, whose classes reside in org.apache.log4j), then you can expect that your jar will only contain the classes of module A, and that log4j will be added by the user of your module automatically.
The second case:
module a.jar will contain both the org.a.* and the org.apache.log4j.* classes, everything in the same module.
In general the first approach is "healthier", and in this case you shouldn't do anything in Maven. Maybe your distribution tool/documentation should contain this information.
If someone uses module A in his/her code like a third party (if you develop a framework or something) and his/her project is mavenized, then the fact that you've defined a dependency on log4j will make Maven download log4j as well as your a.jar (in Maven terms, these are called "transitive dependencies").
If you're interested in the second case (this can be relevant if you define some "client application", like a "jndi client for some server" for example), you might want to take a look at the Maven shade plugin.
Beware that this can lead to dependency hell (what if the application that uses your client also makes use of log4j? What if the log4j versions differ?).
Bottom line: you probably want the first approach; think twice before you decide on the second one :)
One more tip: if you just want to collect all the dependencies of your module A, you might want to use the maven-dependency-plugin. Type the following at the command prompt:
mvn dependency:copy-dependencies
and you'll find all the dependencies in the target/dependency folder.
Hope this helps and happy mavening
The simplest solution to the problem is to use the maven-assembly-plugin, which can create such a jar with dependencies like the following:
<build>
<plugins>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.4</version>
<executions>
<execution>
<id>distro-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
Afterwards you can distribute the created jar xyz-1.0-jar-with-dependencies.jar, which contains the defined dependencies.
If you need more control over how the resulting artifact is created, or if some files need to be overwritten, etc., you might take a deeper look into the maven-shade-plugin.
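A minimal sketch of such a shade setup could look like the following (the version shown is just an example):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
    </execution>
  </executions>
</plugin>
By default this produces an uber-jar containing the project's classes together with the classes of its runtime dependencies.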

Custom second packaging artifact from single pom.xml, like jar and test-jar?

I have a group of projects that have (a) generated beans, and (b) code to work with those beans. I'd like each such project to create two different artifacts: a regular jar artifact that contains all classes, and a custom beans artifact that contains only the generated types.
I put together a quick plugin that creates a second beans artifact using artifact attachments and the "beans" classifier, but it doesn't work well in m2e. For this reason, I think creating a custom packaging type (e.g., "test-jar") is The Right Thing.
To be totally clear about what I'm imagining, this pom works today and creates two different artifacts with two different packaging types:
<project>
<groupId>${groupId}</groupId>
<artifactId>${artifactId}</artifactId>
<version>${version}</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>test-jar</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
which you could import with either of the following:
<dependency>
<groupId>${groupId}</groupId>
<artifactId>${artifactId}</artifactId>
<version>${version}</version>
<!-- <type>jar</type> -->
</dependency>
<dependency>
<groupId>${groupId}</groupId>
<artifactId>${artifactId}</artifactId>
<version>${version}</version>
<type>test-jar</type>
</dependency>
I'd like to create a plugin that will let me use (for example) beans instead of test-jar to create a similar "paired" artifact.
I've poked around in the maven source code, and you can create custom types. However, "test-jar" seems to be "baked in" to maven, so I can't tell if it has some special features and I can't duplicate this behavior with my own plugin.
Of course, if there's another way to get this kind of behavior without custom types that m2e understands -- for example, by getting m2e to understand my classifier, although that seems hard -- I'm all ears! :)
How can I make a similar paired packaging type? I've seen this answer regarding how to create custom types, but it only seems to create one artifact from a pom with the given custom packaging type.
OK, figured out how to get a custom type working with an additional artifact from the same POM.
You do use attached artifacts to generate the additional artifact. For my example, I used this call in my goal in my plugin (after I was done building my JAR file):
import java.io.File;

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugins.annotations.Component;
import org.apache.maven.plugins.annotations.LifecyclePhase;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.project.MavenProject;
import org.apache.maven.project.MavenProjectHelper;
import org.codehaus.plexus.archiver.Archiver;
import org.codehaus.plexus.archiver.jar.JarArchiver;

@Mojo(name = "goal-name", defaultPhase = LifecyclePhase.PACKAGE)
public class MyMojo extends AbstractMojo {

    @Component
    private MavenProject project;

    @Component
    private MavenProjectHelper projectHelper;

    @Component(role = Archiver.class, hint = "jar")
    private JarArchiver archiver;

    public void execute() throws MojoExecutionException {
        // Do work...
        // Create JAR file (createJarFile is my own helper)...
        File jarFile = createJarFile(archiver);
        // Attach the jar under the custom "beans-jar" type, with no classifier
        projectHelper.attachArtifact(project, "beans-jar", jarFile);
    }
}
Note that I specified my custom type beans-jar, and no classifier.
Next, I dropped a components file into my plugin at src/main/resources/plexus/components.xml:
<component-set>
<components>
<component>
<role>org.apache.maven.artifact.handler.ArtifactHandler</role>
<role-hint>beans-jar</role-hint>
<implementation>org.apache.maven.artifact.handler.DefaultArtifactHandler</implementation>
<configuration>
<classifier>beans</classifier>
<extension>jar</extension>
<type>beans-jar</type>
<packaging>jar</packaging>
<language>java</language>
<addedToClasspath>true</addedToClasspath>
</configuration>
</component>
</components>
</component-set>
Here, I specify my custom type beans-jar and a classifier, which appears to be used to name the new artifact in the repository.
This file was based on artifact-handlers.xml from the maven-core project in the main maven repository. At the moment, that file is located here. (I found this file by grepping for test-jar in all .xml files in the maven repository.)
To import that dependency, you use:
<dependency>
<groupId>${groupId}</groupId>
<artifactId>${artifactId}</artifactId>
<type>beans-jar</type>
</dependency>
To import the dependency, you don't need to include the custom plugin.
I would suggest trying a simpler way, like this:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<executions>
<execution>
<id>second-jar</id>
<goals>
<goal>jar</goal>
</goals>
<configuration>
<classifier>second</classifier>
<includes>
<include>**/service/*</include>
</includes>
</configuration>
</execution>
</executions>
</plugin>
Via the <include> elements you can define which classes will be packaged into the supplemental jar file.
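The supplemental jar can then be consumed via its classifier, along these lines:
<dependency>
  <groupId>${groupId}</groupId>
  <artifactId>${artifactId}</artifactId>
  <version>${version}</version>
  <classifier>second</classifier>
</dependency>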

Managing JAXB-generated classes in a Maven project

I have a Maven-based project, in which I am trying to add some JAXB classes automatically generated by the "jaxb2-maven-plugin" Maven plugin. However, my first cut has me in a circular dependency loop:
Because these JAXB classes aren't generated yet, my other sources which reference them have compilation errors.
Because those other sources have compilation errors, these JAXB classes don't get generated.
It seems like there are two obvious possibilities for solving this:
Comment-out the broken references, so that the project builds and the JAXB classes are automatically generated. Then copy those generated sources from /target into /src/main/java, so that references to them won't cause compilation errors.
Create an entirely separate project, consisting of nothing but the JAXB stuff. Include it as a dependency in my main project.
Am I missing something here? Option #1 seems flat-out ridiculous... that just can't be the manner in which people use JAXB. Option #2 seems more rational, but still rather inefficient and cumbersome. I really have to take on the overhead of an entirely separate project just to use JAXB?
Are there any more elegant approaches that developers use to reference JAXB-generated classes in the same project where the Maven plugin generates them?
UPDATE: By request, here is the relevant portion of my POM:
<build>
<plugins>
<plugin>
<!-- configure the compiler to compile to Java 1.6 -->
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.6</source>
<target>1.6</target>
</configuration>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>jaxb2-maven-plugin</artifactId>
<version>1.4</version>
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>xjc</goal>
</goals>
</execution>
</executions>
<configuration>
<!-- The name of your generated source package -->
<packageName>com.mypackage</packageName>
</configuration>
</plugin>
</plugins>
</build>
When I run mvn clean package, I DO see my JAXB sources being generated beneath the /target subdirectory. However, those generated sources are not being automatically added to the classpath for the compile phase.
POST-RESOLUTION UPDATE: It turns out that my compilation issues had more to do with the fact that I was running in Eclipse, and its Maven integration has some issues with "jaxb2-maven-plugin". See this StackOverflow question for more detail on that issue and its resolution.
How did you configure your JAXB Maven plugin? Normally it runs in the generate-sources phase, which comes before the compile phase. So your JAXB-generated classes should already be there when your own code gets compiled; Maven puts them in target/generated-sources and puts that folder on the classpath.
Edit:
This is the code we use at work (and which works as expected):
<plugin>
<groupId>com.sun.tools.xjc.maven2</groupId>
<artifactId>maven-jaxb-plugin</artifactId>
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>generate</goal>
</goals>
</execution>
</executions>
<configuration>
<schemaDirectory>src/main/resources/<companyname>/xsd</schemaDirectory>
<includeSchemas>
<includeSchema>retrieval.xsd</includeSchema>
<includeSchema>storage.xsd</includeSchema>
</includeSchemas>
</configuration>
</plugin>
Apparently we use yet another jaxb plugin... (see also this thread: Difference of Maven JAXB plugins).
I would suggest splitting the JAXB-generated classes (API) and your BL classes (implementation) into 2 Maven projects with a separate pom.xml for each, plus a main root pom.xml defining the compilation order. That way, you will be able to build api.jar, Maven will install it into the local repo, and after that you can use it as a dependency of your implementation. So it will look like:
-API\
--pom.xml - for api, jaxb generation
-IMPL\
--pom.xml - for impl, api dependency is here
pom.xml - main pom.xml with references to the projects above
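A minimal sketch of the root pom.xml's aggregation part, assuming the directory names above (the root project must use pom packaging):
<packaging>pom</packaging>
<modules>
  <module>API</module>
  <module>IMPL</module>
</modules>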
Maybe try using the maven-jaxb2-plugin instead:
<plugin>
<groupId>org.jvnet.jaxb2.maven2</groupId>
<artifactId>maven-jaxb2-plugin</artifactId>
<version>0.8.2</version>
<executions>
<execution>
<goals>
<goal>generate</goal>
</goals>
</execution>
</executions>
</plugin>
The answer from dfuse is correct, though. Either plugin should generate sources before compiling, and the result of the source generation will be on the classpath. I tested this with both plugins. Is it possible for you to post your schema, or at least the schema for the type that your code is failing to pick up on the classpath?
