I have a Java (11.0.7) Maven (3.0.6) multi-module project that contains the following module declarations:
<modules>
<module>jdrum-commons</module>
<module>jdrum-datastore-base</module>
<module>jdrum-datastore-simple</module>
<module>jdrum</module>
</modules>
Each of these Maven modules contains a module-info that defines the necessary requirements and exports to restrict access and visibility.
jdrum-datastore-simple contains some test utility classes that I reuse in jdrum's tests. By configuring the Surefire plugin in jdrum's POM via the code snippet below, I am able to package the whole project without any issues.
<build>
<plugins>
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<configuration>
<argLine>
<!-- Allow the unnamed module access to the tests at test-time -->
--add-opens jdrum/at.rovo.drum.impl=ALL-UNNAMED
--illegal-access=deny
</argLine>
</configuration>
</plugin>
</plugins>
</build>
Within the parent POM I've also configured report generation via the site goal, which also generates the Javadoc of the respective modules. The configuration for the JAR containing the Javadoc as well as the configuration for the Javadoc generation as part of the report are the same and look like this:
<!-- Generate Javadoc while reporting -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>3.2.0</version>
<inherited>true</inherited>
<configuration>
<verbose>true</verbose>
<source>${maven.compiler.source}</source>
<show>protected</show>
<failOnWarnings>false</failOnWarnings>
<release>${maven.compiler.release}</release>
<stylesheet>java</stylesheet>
</configuration>
<reportSets>
<reportSet>
<id>html</id>
<reports>
<report>javadoc</report>
</reports>
</reportSet>
</reportSets>
</plugin>
The Javadoc generation as part of the package step, which produces project-version-javadoc.jar as output, succeeds because both the jdrum-datastore-simple dependency and its test JAR are only included at test time:
<!-- Test data store to use for testing -->
<dependency>
<groupId>at.rovo</groupId>
<artifactId>jdrum-datastore-simple</artifactId>
<version>${project.parent.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>at.rovo</groupId>
<artifactId>jdrum-datastore-simple</artifactId>
<version>${project.parent.version}</version>
<scope>test</scope>
<type>test-jar</type>
</dependency>
If I changed the scope from test to compile or provided, the Javadoc generation would also fail with an error such as
Exit code: 1 - javadoc: error - The code being documented uses packages in the unnamed module, but the packages defined in https://github.com/RovoMe/JDrum/jdrum-datastore-simple/apidocs/ are in named modules.
The issue here, as far as I understand the problem, is that the jdrum-datastore-simple module is not added to Javadoc's module path. The next logical step was therefore to add that module to the configuration like so:
<reporting>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<configuration>
<additionalOptions>
<option>--add-modules</option>
<option>jdrum.datastore.simple</option>
</additionalOptions>
</configuration>
</plugin>
</plugins>
</reporting>
This adds the jdrum-datastore-simple module to the Javadoc configuration string, which can be seen in the jdrum/target/site/apidocs/options file that now contains an
...
--add-modules
jdrum.datastore.simple
...
entry. On further analysis of the generated options file it is apparent that the module path is missing a reference to the actual JAR file, and hence the Javadoc generation, and with it the Maven build, fails because Javadoc is unable to locate the defined module. If I update that options file, add the path to the missing JAR file and then only run mvn package site, the whole process succeeds and all is fine (as would a plain invocation of the javadoc.bat located in the target/site/apidocs folder).
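For illustration, the manual fix amounted to appending a --module-path entry pointing at the missing JAR to the options file, along these lines (the repository path below is a made-up Windows example, not the actual value):
--add-modules
jdrum.datastore.simple
--module-path
C:/Users/<user>/.m2/repository/at/rovo/jdrum-datastore-simple/<version>/jdrum-datastore-simple-<version>.jar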
Now, in order to make the whole process more dynamic, I wanted to add to or update the module path myself. However, the maven-javadoc-plugin does not directly allow this. Therefore I came up with adding a further maven-javadoc-plugin option of --module-path plus a further option entry that contains the whole path. By the whole path I mean the path to every single dependency, so not only the path to jdrum-datastore-simple. This also works, but because the paths to the respective JAR files are hardcoded, the project is not usable by other users unless they have the same system and path structure I used. To fix this I replaced the respective path segments with the ${settings.localRepository} and ${project.parent.basedir} properties for the respective modules on the module path. Unfortunately Javadoc is rather picky about the path structure it accepts, and it turns out that on my Windows machine Maven returns paths starting with C:\Users\..., which Javadoc can't handle. If the path looks like C:/Users/... instead, Javadoc is fine with the values.
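Roughly, that attempt looked like the sketch below (only the jdrum-datastore-simple entry is shown; in reality every dependency's JAR had to be listed, and the property-based paths are exactly the ones whose backslash expansion Javadoc rejected on Windows):
<additionalOptions>
  <option>--add-modules</option>
  <option>jdrum.datastore.simple</option>
  <option>--module-path</option>
  <option>${settings.localRepository}/at/rovo/jdrum-datastore-simple/${project.parent.version}/jdrum-datastore-simple-${project.parent.version}.jar</option>
</additionalOptions>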
On further research I stumbled upon a thread which suggests using Maven's build-helper-maven-plugin to define new properties for e.g. the M2 repository and using its built-in regex capability to replace \ characters with /. However, adding a configuration such as
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>3.1.0</version>
<executions>
<execution>
<id>replace-local-repo-characters</id>
<goals>
<goal>regex-property</goal>
</goals>
<configuration>
<name>tag.m2repo</name>
<value>${settings.localRepository}</value>
<regex>\\</regex>
<replacement>/</replacement>
<failIfNoMatch>false</failIfNoMatch>
</configuration>
</execution>
<execution>
<id>replace-local-path-characters</id>
<goals>
<goal>regex-property</goal>
</goals>
<configuration>
<name>tag.basedir</name>
<value>${project.parent.basedir}</value>
<regex>\\</regex>
<replacement>/</replacement>
<failIfNoMatch>false</failIfNoMatch>
</configuration>
</execution>
</executions>
</plugin>
and using the introduced properties instead does not work at all, as Maven complains about an invalid value being provided. If I use $\{settings.localRepository} Maven is fine with the provided value, however in the final options file it is not the actual value of settings.localRepository that ends up there but the provided string itself, and I end up with something like $/{settings.localRepository}/org/slf4j/... which Javadoc can't resolve, so it still misses the correct location of the jdrum-datastore-simple dependency.
So, how can I add the path of the missing dependency to the maven-javadoc-plugin's module path in the generated options file so that Maven is actually able to generate the whole report?
It seems that with Java 11 update 9 (maybe also with update 8; not tested) the maven-javadoc-plugin is able to correctly generate the Javadoc for multi-module projects without the need to alter the module path.
For those interested in how the actual Maven POMs look:
Parent POM
POM for a shared module
POM for a sharing and consuming module
POM for the consuming module
I'm new to annotation processing and I'm trying to automate it with Maven. I've put this in my pom.xml:
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.0</version>
<configuration>
<annotationProcessors>
<annotationProcessor>
co.aurasphere.revolver.annotation.processor.InjectAnnotationProcessor</annotationProcessor>
<annotationProcessor>
co.aurasphere.revolver.annotation.processor.RevolverContextAnnotationProcessor</annotationProcessor>
</annotationProcessors>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
The problem is that when I try to build the project I get a CompilationFailureException because Maven can't find the processors.
I've found other questions like this, solved by putting the dependency outside the plugin. I tried that, but nothing changed for me.
Am I missing something?
Thank you.
EDIT
Here is my dependency on another project which contains both the processor and the annotations:
<dependencies>
<dependency>
<groupId>co.aurasphere</groupId>
<artifactId>revolver-annotation-processor</artifactId>
<version>0.0.3-SNAPSHOT</version>
</dependency>
</dependencies>
EDIT 2:
After further investigation, I decided to decompile the processor JAR (built with Maven) and it turns out that... my classes are not there. For some reason, Maven is not packaging my classes into the JAR, and that's why they are not found. I've tried to figure out what's wrong with that build (this has never happened to me before and I've used Maven for a while...).
First of all, the packaging on that project is jar.
The classes are all under src/main/java.
I've checked in my pom.xml that the classpath and the source path are the same.
Here's the processor pom:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>co.aurasphere</groupId>
<artifactId>revolver-annotation-processor</artifactId>
<version>0.0.3-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<!-- https://mvnrepository.com/artifact/javax.inject/javax.inject -->
<dependency>
<groupId>javax.inject</groupId>
<artifactId>javax.inject</artifactId>
<version>1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.velocity/velocity -->
<dependency>
<groupId>org.apache.velocity</groupId>
<artifactId>velocity</artifactId>
<version>1.7</version>
</dependency>
</dependencies>
EDIT 3
Here's the output of mvn clean install on the processor project. Unfortunately the output is too long, so I had to post it as an external link, even though I know that's not ideal.
EDIT 4
Here are some screenshots of my dependency hierarchy (screenshots not reproduced here).
Since the project was originally created as an Eclipse simple Java project and then converted to a Maven one, I tried to create a new Maven project and move everything to the new one in the hope that the problem was the Eclipse plugin that messed something up, but the error was still there.
This is an extended version of the accepted answer above provided by @Aurasphere. Hopefully it gives some explanation of how the proposed solution works.
First, some background on what is happening here. Say we want a custom annotation processor. We implement it and put it into a JAR as a Maven artifact so that it can be consumed by other projects. When such projects are compiled, we want our annotation processor to be recognised by the Java compiler and used appropriately. To make this happen, one needs to tell the compiler about the new custom processor. The compiler looks at its classpath resources and checks the fully qualified names of the classes listed in the META-INF/services/javax.annotation.processing.Processor file. It then tries to find these classes on the classpath and loads them to run the processing of annotations used on the classes currently being compiled.
So we want our custom class to be mentioned in this file. We could ask users of our library to create this file manually, but that is not intuitive, and users would be frustrated that the promised annotation processing doesn't work. That's why we might want to prepare this file in advance and deliver it together with the processor inside the JAR of our Maven artifact.
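For the processors from the question, for instance, this boils down to shipping a file src/main/resources/META-INF/services/javax.annotation.processing.Processor whose content is just the fully qualified processor names, one per line:
co.aurasphere.revolver.annotation.processor.InjectAnnotationProcessor
co.aurasphere.revolver.annotation.processor.RevolverContextAnnotationProcessor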
The problem is that if we simply put this file with the FQN of the custom processor in it, it will trigger the compiler during compilation of our own artifact, and since the processor itself is not yet compiled, the compiler will report an error about it. So we need to skip annotation processing to avoid this. This can be done using -proc:none, or with Maven:
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<proc>none</proc>
</configuration>
</plugin>
We might have unit tests that need our annotation processor. In Maven, test compilation is carried out after the main sources are built, so all classes are already available, including our processor. We just need to add a special step during the processing of test sources that uses our annotation processor. This can be done using:
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<executions>
<execution>
<id>process-test-annotations</id>
<phase>generate-test-resources</phase>
<goals>
<goal>testCompile</goal>
</goals>
<configuration>
<proc>only</proc>
<annotationProcessors>
<annotationProcessor>fully.qualified.Name</annotationProcessor>
</annotationProcessors>
</configuration>
</execution>
</executions>
</plugin>
I've found the answer myself. I figured out that the problem was the file javax.annotation.processing.Processor in META-INF/services/, which contains the configuration of the annotation processor's class. In order to fix the problem I had to add the following to the pom.xml of my processor project:
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.5.1</version>
<configuration>
<compilerArgument>
-proc:none
</compilerArgument>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
</plugins>
</build>
This let Maven build the classes into the actual jar and fixed the problem. I don't know if this is a bug or not but it surely looks strange to me. Thank you everybody for the help!
The easiest way is to register the annotation processor in the META-INF/services directory of the revolver-annotation-processor artifact. No Maven compiler configuration is needed.
Check if it's already registered; if not, register it yourself if you control the source code.
https://docs.oracle.com/javase/8/docs/api/java/util/ServiceLoader.html
If you control the source code, I also recommend packaging the processor in the same artifact as the annotations. That way, whenever you use one of the annotations, the annotation processor is also picked up by the compiler.
The accepted answer here works by disabling all annotation processing, which may not be suitable if other annotation processors need to run during the compilation. Instead, the SPI configuration file listing the newly compiled annotation processor can be added in a post-processing step. I added a directory src/main/post-resources to my project and this plugin configuration:
<plugin>
<artifactId>maven-resources-plugin</artifactId>
<version>3.3.0</version>
<executions>
<execution>
<id>annotation-processor-spi</id>
<phase>process-classes</phase>
<goals>
<goal>copy-resources</goal>
</goals>
<configuration>
<outputDirectory>${project.build.outputDirectory}</outputDirectory>
<resources>
<resource>
<directory>src/main/post-resources</directory>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>
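With this setup the registration file can live at, say, src/main/post-resources/META-INF/services/javax.annotation.processing.Processor (the processor class name below is the one from the question) and is only copied into target/classes during the process-classes phase, i.e. after compilation has already finished:
co.aurasphere.revolver.annotation.processor.InjectAnnotationProcessor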
<build>
<plugins>
<plugin>
<groupId>org.jvnet.jax-ws-commons</groupId>
<artifactId>jaxws-maven-plugin</artifactId>
<version>2.3</version>
<executions>
<execution>
<phase>generate-sources</phase>
<goals>
<goal>wsimport</goal>
</goals>
<id>generate-sei</id>
<configuration>
<sourceDestDir>${project.basedir}/src/main/java</sourceDestDir>
</configuration>
</execution>
</executions>
<dependencies>...</dependencies>
</plugin>
</plugins>
</build>
The above XML snippet is from a POM file in a Java project. In this snippet I've configured the jaxws-maven-plugin to use a WSDL file to generate the SEI code and place it in the src/main/java directory. The plugin is bound to the generate-sources phase and works fine.
I want to make it so that if I issue the plugin directly, using:
mvn jaxws:wsimport
it should place the files in the above-mentioned folder. From the plugin's reference site (https://jax-ws-commons.java.net/jaxws-maven-plugin/wsimport-mojo.html) I can't figure out how to pass the parameter (sourceDestDir) as a command line argument. Is there some way I can do this?
WARNING /!\
You are trying to generate sources under the source folder src/main/java. Unless there is a very strong reason, don't do this. All generated content should always be placed under the build directory (target by default) and not be version-controlled. You can always add the generated sources as source folder using the build-helper-maven-plugin:add-source, if the plugin does not do it already itself.
To be able to set parameters directly on the command line, the plugin needs to define a user property. However, the org.jvnet.jax-ws-commons:jaxws-maven-plugin does not define a user property for the sourceDestDir parameter. You can see this in the documentation, which does not list a "User Property" for it.
You can also find this in the source code:
@Parameter(defaultValue = "${project.build.directory}/generated-sources/wsimport")
private File sourceDestDir;
The @Parameter annotation, used to declare the parameter of the Maven plugin, does not have a corresponding property.
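For comparison, a parameter that can be overridden from the command line declares a property on the annotation, roughly like this (a hypothetical declaration, not taken from the plugin's actual source):
@Parameter(property = "jaxws.sourceDestDir", defaultValue = "${project.build.directory}/generated-sources/wsimport")
private File sourceDestDir;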
As such, you will need to have the following:
Define a Maven property jaxws.sourceDestDir with a value of ${project.basedir}/src/main/java with
<properties>
<jaxws.sourceDestDir>${project.basedir}/src/main/java</jaxws.sourceDestDir>
</properties>
Preferably, you would have ${project.build.directory}/some/path instead of src/main/java.
Configure the plugin to use this Maven property:
<configuration>
<sourceDestDir>${jaxws.sourceDestDir}</sourceDestDir>
</configuration>
If you want to override it, you can now do so directly on the command line with -Djaxws.sourceDestDir=/my/new/value. This system property will take precedence over the value of the Maven property.
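For the invocation from the question this would, for example, be:
mvn jaxws:wsimport -Djaxws.sourceDestDir=/my/new/value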
I am new to Maven. I have created a Maven project which is packaged as a JAR. I ran clean package and the JAR was created. When I extracted that JAR, I could not see inside it any of the dependencies (JARs) I added in the pom.xml. If I give this JAR to third-party clients, how will the code work without any of the dependent JARs? Please help me understand how Maven manages the JARs.
Thanks!
Maven handles dependencies based on how you configure the dependency plugin.
See this reference for a simple example of how to do this.
In this example, the following code configures where your dependencies will end up:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.5.1</version>
<executions>
<execution>
<id>copy-dependencies</id>
<phase>package</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<includeGroupIds>log4j</includeGroupIds>
<outputDirectory>${project.build.directory}/dependency-jars/</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
Then this code sets up the classpath for your main jar, which will allow anyone running it to find these dependencies:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.4</version>
<configuration>
<archive>
<manifest>
<addClasspath>true</addClasspath>
<mainClass>com.mkyong.core.App</mainClass>
<classpathPrefix>dependency-jars/</classpathPrefix>
</manifest>
</archive>
</configuration>
</plugin>
Your other option would be to create a single jar, with all dependencies included, by following this example here
You could distribute the jar and the POM file if you want to try and provide your users with the files in that manner, but they'd need to be able to access your Maven repository where those dependencies are kept.
Core Maven doesn't handle this. Maven is a build tool; its job is to build an artifact (a JAR in your case). The dependencies you define in your module's pom.xml are needed to get the code compiled. You'll need Maven plugins to go beyond that.
Now, you're asking not about the build, but about the distribution of your compiled binaries.
If I understand correctly, that would be a lot of JARs (yours and your dependencies'). Alternatively you may distribute the code as a single JAR with the dependencies inside.
Example:
A first case:
If your code resides in module a (say, the code is in packages org.a.*) and depends on some third party (say, log4j, whose classes reside in org.apache.log4j), then your JAR will only contain the classes of module a, and you expect that log4j will be added by the user of your module automatically (the first case).
A second case:
module a.jar will contain both org.a.* and org.apache.log4j.* classes, everything in the same module.
In general the first approach is "healthier", and in this case you shouldn't do anything special in Maven. Maybe your distribution tool/documentation should contain this information.
If someone uses module a in their code as a third party (if you develop a framework or something) and their project is mavenized, then the fact that you've defined a dependency on log4j will make Maven download log4j as well as your a.jar (in Maven terms these are called "transitive dependencies").
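For example (the coordinates below are made up for illustration), a mavenized consumer only declares a dependency on your module, and log4j comes along transitively because a's own pom.xml declares it:
<dependency>
  <groupId>org.a</groupId>
  <artifactId>a</artifactId>
  <version>1.0</version>
</dependency>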
If you're interested in the second case (this can be relevant if you provide some "client application", like a "JNDI client for some server" for example), you might want to take a look at the Maven Shade plugin.
Beware that this can lead to dependency hell (what if the application that uses your client also makes use of log4j? What if the log4j versions differ?).
Bottom line: you probably want the first approach; think twice before you decide on the second one :)
One more tip: if you just want to download all the dependencies of your module a, you might want to use the Maven Dependency plugin. Type the following at the command prompt:
mvn dependency:copy-dependencies
and you'll find all the dependencies in the target/dependency folder.
Hope this helps and happy mavening
The simplest solution to the problem is to use the maven-assembly-plugin which can create such jar with dependencies like the following:
<build>
<plugins>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.4</version>
<executions>
<execution>
<id>distro-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
Afterwards you can distribute the created JAR xyz-1.0-jar-with-dependencies.jar, which contains the defined dependencies.
If you need more control over how the resulting artifact is created, or if some files need to be overwritten etc., you might take a deeper look at the maven-shade-plugin.
I have a group of projects that have (a) generated beans, and (b) code to work with those beans. I'd like each such project to create two different artifacts: a regular jar artifact that contains all classes, and a custom beans artifact that contains only the generated types.
I put together a quick plugin that creates a second beans artifact using artifact attachments and the "beans" classifier, but it doesn't work well in m2e. For this reason, I think creating a custom packaging type (e.g., "test-jar") is The Right Thing.
To be totally clear about what I'm imagining, this POM works today and creates two different artifacts with two different packaging types:
<project>
<groupId>${groupId}</groupId>
<artifactId>${artifactId}</artifactId>
<version>${version}</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>test-jar</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
which you could import with either of the following:
<dependency>
<groupId>${groupId}</groupId>
<artifactId>${artifactId}</artifactId>
<version>${version}</version>
<!-- <type>jar</type> -->
</dependency>
<dependency>
<groupId>${groupId}</groupId>
<artifactId>${artifactId}</artifactId>
<version>${version}</version>
<type>test-jar</type>
</dependency>
I'd like to create a plugin that will let me use (for example) beans instead of test-jar to create a similar "paired" artifact.
I've poked around in the Maven source code, and you can create custom types. However, "test-jar" seems to be "baked in" to Maven, so I can't tell whether it has some special features, and I haven't been able to duplicate this behavior with my own plugin.
Of course, if there's another way to handle this kind of behavior without custom types that m2e understands -- for example, by getting m2e to understand my classifier, although that seems hard -- I'm all ears! :)
How can I make a similar paired packaging type? I've seen this answer regarding how to create custom types, but it only seems to create one artifact from a pom with the given custom packaging type.
OK, figured out how to get a custom type working with an additional artifact from the same POM.
You do use attached artifacts to generate the additional artifact. For my example, I used this call in my goal in my plugin (after I was done building my JAR file):
import java.io.File;

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugin.MojoExecutionException;
import org.apache.maven.plugins.annotations.Component;
import org.apache.maven.plugins.annotations.LifecyclePhase;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.project.MavenProject;
import org.apache.maven.project.MavenProjectHelper;
import org.codehaus.plexus.archiver.Archiver;
import org.codehaus.plexus.archiver.jar.JarArchiver;

@Mojo(name = "goal-name", defaultPhase = LifecyclePhase.PACKAGE)
public class MyMojo extends AbstractMojo {

    @Component
    private MavenProject project;

    @Component
    private MavenProjectHelper projectHelper;

    @Component(role = Archiver.class, hint = "jar")
    private JarArchiver archiver;

    public void execute() throws MojoExecutionException {
        // Do work...
        // Create JAR file...
        File jarFile = createJarFile(archiver);
        // Attach the JAR with the custom type "beans-jar" and no classifier
        projectHelper.attachArtifact(project, "beans-jar", jarFile);
    }
}
Note that I specified my custom type beans-jar, and no classifier.
Next, I dropped a components file into my plugin at src/main/resources/META-INF/plexus/components.xml:
<component-set>
<components>
<component>
<role>org.apache.maven.artifact.handler.ArtifactHandler</role>
<role-hint>beans-jar</role-hint>
<implementation>org.apache.maven.artifact.handler.DefaultArtifactHandler</implementation>
<configuration>
<classifier>beans</classifier>
<extension>jar</extension>
<type>beans-jar</type>
<packaging>jar</packaging>
<language>java</language>
<addedToClasspath>true</addedToClasspath>
</configuration>
</component>
</components>
</component-set>
Here, I specify my custom type beans-jar and a classifier, which appears to be used to name the attached artifact in the repository.
This file was based on artifact-handlers.xml from the maven-core project in the main maven repository. At the moment, that file is located here. (I found this file by grepping for test-jar in all .xml files in the maven repository.)
To import that dependency, you use:
<dependency>
<groupId>${groupId}</groupId>
<artifactId>${artifactId}</artifactId>
<type>beans-jar</type>
</dependency>
To import the dependency, you don't need to include the custom plugin.
I would suggest trying a simpler way like this:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<executions>
<execution>
<id>second-jar</id>
<goals>
<goal>jar</goal>
</goals>
<configuration>
<classifier>second</classifier>
<includes>
<include>**/service/*</include>
</includes>
</configuration>
</execution>
</executions>
</plugin>
Via the <include> elements you can define which classes get packaged into the supplemental jar file.
I'd like to add *.dll files as third-party libs to my repository and, during the packaging process, just pack them into *.jar files, sign them and copy them to some specific folder.
Signing and copying work correctly (as expected, using the maven-dependency-plugin and maven-jarsigner-plugin). But I didn't find any way to automatically pack a single dll into a jar (without any sources, like maven-assembly-plugin does).
The solution I see for the time being: add to my repository not a "pure" dll, but a lib already packed into a jar (packed by myself)... but that's not a good idea, I guess.
It sounds like you've successfully retrieved your .dll (with dependency plugin) and signed it (jarsigner plugin), and it's somewhere in your ${project.build.directory} (which defaults to target).
If that's correct, give this a try:
Define the packaging of your project as jar
Retrieve dlls
Make sure the jarsigner:sign goal is bound to the prepare-package phase. It binds to package by default and we need to ensure jarsigner:sign runs before jar:jar.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jarsigner-plugin</artifactId>
<version>1.2</version>
<executions>
<execution>
<id>sign</id>
<phase>prepare-package</phase> <!-- important -->
<goals>
<goal>sign</goal>
</goals>
</execution>
</executions>
</plugin>
Configure the jar plugin to include the signed dll(s)
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.4</version>
<executions>
<execution>
<!-- using this ID merges this config with default -->
<!-- So it should not be necessary to specify phase or goals -->
<!-- Change classes directory because it will look in target/classes
by default and that probably isn't where your dlls are. If
the dlls are in target then directoryContainingSignedDlls is
simply ${project.build.directory}. -->
<id>default-jar</id>
<configuration>
<classesDirectory>directoryContainingSignedDlls</classesDirectory>
<includes>
<include>**/*.dll</include>
</includes>
</configuration>
</execution>
</executions>
</plugin>
Now, running mvn clean package should give you a jar containing your signed dlls.
If JACOB requires manifest config there are docs explaining how to do this.
Good luck!
I would recommend packing your DLLs into a zip archive via the maven-assembly-plugin and letting that module deploy the zip archive as an attached artifact of your usual POM. The packaging of that project should be pom instead of the default.
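A minimal sketch of that idea, assuming the DLLs sit in a lib folder of the module and the descriptor is stored at src/assembly/dlls.xml (both names are made up for illustration): the descriptor selects the DLLs and packs them as a zip,
<assembly>
  <id>dlls</id>
  <formats>
    <format>zip</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <fileSets>
    <fileSet>
      <directory>lib</directory>
      <outputDirectory>/</outputDirectory>
      <includes>
        <include>**/*.dll</include>
      </includes>
    </fileSet>
  </fileSets>
</assembly>
and the POM (with pom packaging) binds the assembly plugin's single goal to the package phase, which attaches the resulting zip to the build automatically:
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <executions>
    <execution>
      <id>dll-zip</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
      <configuration>
        <descriptors>
          <descriptor>src/assembly/dlls.xml</descriptor>
        </descriptors>
      </configuration>
    </execution>
  </executions>
</plugin>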
I would be a little bit confused if I downloaded a jar and found DLLs inside it, but if you prefer, you could create the jar via the maven-assembly-plugin or use the maven-jar-plugin.