Smarter Native Dependency Handling with Maven

I'm currently in the midst of converting a large multi-module project (~100 sub-modules) to use Maven. Currently we use Ant + Ivy.
So far no major issues have cropped up and I'm comfortable that Maven is still a good fit. However, I wonder if there is a better way to handle native dependencies.
So far I have come to the following conclusions.
It's best to install each native dependency into the maven repo either as a standalone library or an archived package containing multiple dependencies.
Rather than get lost in declaring each and every dependency with the Maven dependency plugin, I opted to give each a classifier (e.g. natives-win32) and use the following in the parent POM:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <version>2.4</version>
  <executions>
    <execution>
      <id>copy</id>
      <phase>compile</phase>
      <goals>
        <goal>copy-dependencies</goal>
      </goals>
      <configuration>
        <includeScope>runtime</includeScope>
        <includeClassifiers>natives-win32</includeClassifiers>
        <outputDirectory>${project.build.directory}/natives</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
So far this seems like a simple all-round solution that doesn't require too much messing about when adding new native dependencies, and it gives me a single place to manage them. The only thing I must do is ensure that my /natives/ directory is on java.library.path.
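For what it's worth, one way I can satisfy the java.library.path requirement for unit tests is to pass it through the Surefire plugin; a rough sketch (the plugin version shown is just an example):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.12</version>
  <configuration>
    <!-- point the test JVM at the natives copied by copy-dependencies above -->
    <argLine>-Djava.library.path=${project.build.directory}/natives</argLine>
  </configuration>
</plugin>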
One thing that bothers me (a little) about this approach is that all my native dependencies get copied into each sub-module that expresses a transitive dependency on them, whilst my happy jar libraries are added to the classpath by reference to where they sit in my local repository (no copy required).
Is there no way to be smarter about this and have my natives referenced from within their repository location (assuming they aren't archived, i.e. plain .dll files)? That would save a bunch of unnecessary copying about.
Are there any other potential gotchas that I should be concerned about with the above approach?

Your snippet shows a goal attached to a build phase, not a dependency. Is the 'copy dependencies' goal in a super pom and inherited by all modules? There's no way to move it only to the modules which are going to be run/packaged as an app?

It could be that I didn't get it, but why don't you deploy all your native libs into the repository first? If the native libs are stable and change seldom, that could be done in a separate reactor.
Afterwards you reference those native dependencies simply via their GAV coordinates, like any other dependency. The problem of unnecessary copying is also solved by that.
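For example, once a native library has been deployed that way, referencing it is just an ordinary dependency declaration; a rough sketch with made-up coordinates:
<dependency>
  <groupId>com.example.natives</groupId>
  <artifactId>some-native-lib</artifactId>
  <version>1.0.0</version>
  <classifier>natives-win32</classifier>
  <!-- type could be dll for a bare library, or zip/jar for an archive of natives -->
  <type>dll</type>
</dependency>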

I ended up using the maven natives plugin and living with the fact that I have redundant copies of the native libraries around the place. The reason was primarily the simplicity the plugin offers, plus the fact that it has a related Eclipse plugin that sets up the natives in developers' Eclipse environments without any intervention.

Related

Maven build failure involving MANIFEST.MF classpath in dependency JARs

First off, I'm a long time user (and beneficiary) of StackOverflow, but this is my first question. I've done plenty of searching on this topic, but most of the articles I've turned up talk about generating JAR files, not working with 3rd party JARs from the Maven Central repo which I don't really have the power to fix. The few solutions I've seen floating around aren't really acceptable.
The problem is that most of the jaxb JARs found in the Maven Central repository contain classpath entries (in the MANIFEST.MF file) that point to dependencies, and those dependencies are specified with relative paths -- typically assuming that dependency bar.jar exists in the same directory as foo.jar.
This is fine if you have all your JAR dependencies in a single lib directory, for example. But:
Maven wants to maintain its own local repository, so every single packaged JAR lives in its own directory (with each version in a separate subdirectory).
Maven JARs are typically named with the version info embedded in the filename, whereas the classpath entry in MANIFEST.MF specifies the dependency with just the base filename (no version).
The net result is an error message like this:
[ERROR] bad path element
"C:\Users\rpoole\.m2\repository\com\sun\xml\bind\jaxb-impl\2.2.11\jaxb-core.jar":
no such file or directory
One solution is to write a script or small app to go through all the JARs and strip out the classpath info from the embedded MANIFEST.MF file. But that is not very clean, and requires an extra step before doing the actual build.
Another potential solution is that some newer published versions of the JARs in question have supposedly fixed this classpath problem, so the obvious fix would be to use the latest and greatest. Unfortunately, the app I'm working on is legacy and is being developed for a 3rd party, so I can't update the dependencies beyond a certain version. So far, all the jaxb JARs that I have poked into seem to have issues.
Is there a way to tell Maven to ignore the embedded classpath in the JAR and only rely on Maven's own dependency resolution? I've tried things like reordering dependencies, but that doesn't work (or moves the build problem from one subproject to another).
One additional annoyance: There is a "blessed" Maven repo we have that seems to let the build complete with no problem, but so far I've been unable to figure out why this particular set of JARs builds OK. I suspect someone may have gone in and tweaked some JARs or POMs manually, but there's scant information, and diff tools aren't really helping much.
Regardless, the project should build from scratch.
What would be nice is if I could specify something like an exclusion block in the pom.xml for the subproject that's breaking, but for dealing with a JAR's embedded classpath instead of Maven's own transitive dependencies (specified by groupId/artifactId).
Edit: Apparently, some people believe that this is impossible, and that Maven ignores the Class-Path entry in Manifest.MF. However, this is a known issue, as discussed in this StackOverflow article. There's also another good article which explains some of the history of this a bit better.
The problem is that I can't go through the JARs and edit the MANIFEST.MF files on each as part of the build process. That's just not a practical approach, even if automated by script. I need a solution that actually will work for code that is already in production. These issues were supposedly fixed in later versions of the JARs in question, but I may not be able to use newer versions of those.
Additionally, one of the proposed fixes is to add -Xlint:-path to the compiler args, which suppresses the error message. However, the build simply fails at another point for me, so at first blush this does not appear to be a good solution either. I'll be trying this again because according to this, the syntax for compiler arguments inside POM files is a bit wonky.
I hate answering my own question, but I finally did manage to get past this problem. For those who keep insisting that Maven builds can't possibly be affected by the Class-Path entry in a jar's MANIFEST.MF, please read this article by Michael Bayne. It summarizes the problem and the solution rather nicely. Hint: javac certainly does care about the embedded classpath in jars.
The trick is to add -Xlint:-path to the compiler arguments, although I was dubious of this solution initially. Doing this will suppress the bad path element warnings/errors, and therefore Maven won't prematurely halt the build. The problem I had was figuring out the correct syntax for embedding these arguments in the pom.xml. That's where this article comes in handy. Therefore, the compiler plugin's configuration has to look like this to be understood properly:
<plugins>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <configuration>
      <source>${java-version}</source>
      <target>${java-version}</target>
      <compilerArguments>
        <Xlint/>
        <Xlint:-path/>
      </compilerArguments>
      <showWarnings>true</showWarnings>
      <showDeprecation>true</showDeprecation>
    </configuration>
  </plugin>
</plugins>
Note that this syntax is not very good XML (as Bayne points out), and most IDEs will flag the second Xlint line as an error, but it will work with Maven. Using the syntax given in some Maven tutorials may result in the arguments not being passed to the compiler at all, or only the last argument being passed.
Once this problem is taken care of, you may discover other build problems (I certainly did), but at least those problems won't be hidden from you any longer.
The problem is that you are referencing a post which is seven years old and not using a more recent version of the maven-compiler-plugin.
The arguments to the javac compiler are better passed like this:
<project>
  [...]
  <build>
    [...]
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.6.1</version>
        <configuration>
          <compilerArgs>
            <arg>-Xlint</arg>
            <arg>-Xlint:-path</arg>
          </compilerArgs>
        </configuration>
      </plugin>
    </plugins>
    [...]
  </build>
  [...]
</project>

Solving a jar hell with maven?

I'm using two jars, A and B. B is a library and A has classes that use some old classes from library B. This causes a problem when I include both jars in my project classpath: two classes share the same name, but one of them is older than the other and behaves differently.
One solution I found is to first import library B into Eclipse, click OK and let the project build, and only then add jar A. This way all my existing code uses the newer classes from B while the classes of A are untouched.
However now I want to use Maven for my projects but I'm unable to know how to make this trick again using Maven. Please help.
Maybe you can solve your problem by renaming the package of the class that you don't want.
You can do it by using the Maven Shade Plugin.
This plugin can relocate (i.e. rename) packages when the jar is packaged.
Usage:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.example.package.name.YourClass</pattern>
            <shadedPattern>com.example.rename.package.name.YourClass</shadedPattern>
          </relocation>
        </relocations>
        <promoteTransitiveDependencies>true</promoteTransitiveDependencies>
      </configuration>
    </execution>
  </executions>
</plugin>
This isn't necessarily a Maven problem. The default classloader will search for classes according to the order of the jars on the classpath. When you are adding the jars to Eclipse you are doing so in a way that their order will ensure the correct classes are loaded - specifically B appears on the classpath before A and therefore, when the same class is in both jars, it will be loaded from B.
Since version 2.0.9 of Maven, the classpath is built according to the order of the dependencies in the POM. So, provided dependency B is declared before dependency A, you should get the same behaviour as with Eclipse.
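In POM terms that just means declaring B before A; a rough sketch with placeholder coordinates:
<dependencies>
  <!-- B first, so its (newer) classes win when the same class exists in both jars -->
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>B</artifactId>
    <version>2.0</version>
  </dependency>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>A</artifactId>
    <version>1.0</version>
  </dependency>
</dependencies>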
Needless to say, relying on classpath order in this way is rather fragile and personally I'd look to clean-up the jars if that's possible.
The problem is very real (at least it was for me).
If you have identified which libraries are in conflict, you can use exclusions to avoid importing the ones you don't want.
If you don't know which libraries are in conflict, in Eclipse with the default Maven plugin you can open the POM file and select the "Dependency Hierarchy" tab: the right column shows all the resolved dependencies for your project, and the left column shows which library pulls in each dependency.
I hope it can help you.

Maven: Different library versions in one JVM

I have a project whose dependency tree is huge i.e. it packs in modules from several teams.
Now there are some commonly used dependencies which are common across several modules.
A simplified example can be:
TopModule.jar
  ChildModule.jar
    CommonModule-v1.jar
  CommonModule-v2.jar
When I build my project, I specify the latest version of the common dependencies, but it's very hard to ask the same from every other team.
So, frequently, the TopModule is built using different versions of CommonModule (v1 and v2 in the above example).
My question is:
If the final jar file contains both CommonModule-v1.jar and CommonModule-v2.jar, how does it affect the runtime?
Can the runtime erroneously load versions v2 where v1 is required and vice versa?
Maven will only use one version of each artifact in the end -- it doesn't do any fancy classloader isolation tricks. You can see which version it'll use with mvn dependency:resolve.
If you need to use specific versions within dependencies, you can use the shade plugin. It'll do renaming trickery so that dependencies get their own versions of libraries.
To fight this problem globally, use the DependencyConvergence rule:
This rule requires that dependency version numbers converge. If a project has two dependencies, A and B, both depending on the same artifact, C, this rule will fail the build if A depends on a different version of C than the version of C depended on by B.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <version>1.3.1</version>
  <executions>
    <execution>
      <id>enforce</id>
      <configuration>
        <rules>
          <DependencyConvergence/>
        </rules>
      </configuration>
      <goals>
        <goal>enforce</goal>
      </goals>
    </execution>
  </executions>
</plugin>
After this all teams work together with consistent versions of dependencies.
This depends on the way the modules are named in Maven. Usually, Maven resolves conflicting libraries by picking a single version (the declaration nearest to your project in the dependency tree). But if the libraries are different artifacts in terms of artifactId, then Maven will not see that they are the same library and thus will not resolve the ambiguity.
Usually you resolve this with a common parent POM, where you define the versions of commonly used libraries throughout the project. If you have no control over the other projects (they are not part of your build, only dependencies), you may be lucky and have your final project work fine. If the library breaks compatibility in the newer version, you will not be able to use it.
So, does your final project actually contain both versions of the library; did you check? The dependency tree may show both versions, but Maven will only use one version of a given dependency in the hierarchy.
The classloader will load classes from the first JAR in which it finds them on your classpath. In more detail: it searches the classpath entries in order, so every lookup for one of the duplicated classes will resolve to whichever jar appears first, e.g. CommonModule-v2.jar. So the answer is yes: it can erroneously load the v2 class where v1 is required if v2 appears earlier on your classpath.
If your pom.xml is only an aggregator of already packaged modules, then this applies. If that is not the case and your project actually compiles and packages all of those modules as submodules, then Maven will choose one version. If it compiles every project on its own, each will be packaged using its own dependency; but if all of them end up in the same classloader, it won't work correctly.
At runtime it can cause errors, think NoSuchMethodError and the like. Your classes were compiled and linked against the correct dependencies, but since the classloader finds two candidates it just loads one of them at runtime.
What you can do is set up a parent POM that defines a <dependencyManagement> section, then ask all teams to use it as a parent and omit <version> in their own POMs so that it is inherited from the parent pom.xml.
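A minimal sketch of what such a parent POM could look like (coordinates and versions are placeholders):
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>team-parent</artifactId>
  <version>1.0.0</version>
  <packaging>pom</packaging>

  <dependencyManagement>
    <dependencies>
      <!-- every team inherits this version and omits <version> in its own POM -->
      <dependency>
        <groupId>com.example</groupId>
        <artifactId>CommonModule</artifactId>
        <version>2.0</version>
      </dependency>
    </dependencies>
  </dependencyManagement>
</project>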
Reference: http://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#Dependency_Management
On top of what @yshavit said, ideally you'd exclude the earlier version of CommonModule so that only v2 is on the classpath. This is only possible if the CommonModule v2 API is backwards compatible with CommonModule v1.
Here's an example of how you exclude:
<dependency>
  <groupId>ChildModuleGroupid</groupId>
  <artifactId>ChildModuleArtifactid</artifactId>
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <groupId>CommonModuleGroupId</groupId>
      <artifactId>CommonModuleArtifactId</artifactId>
    </exclusion>
  </exclusions>
</dependency>
You'd put that in the TopModule pom.xml.

Eclipse/Maven and "Resolve dependencies from workspace projects" can't mix jars and source?

I've got what seems like a corner case for Eclipse/Maven and "Resolve dependencies from workspace projects". My project has a mix of written code and generated code, with the generated code coming from a dependency which uses JAXWS.
The problem is that if I check "Resolve dependencies", Eclipse/Maven ignores any JAR dependencies and tries to resolve everything by only looking at the workspace, which results in Eclipse showing errors like "Package/Class not found" (related to the generated code) even though the project will build fine with Maven from the command line.
On the other hand, if I uncheck it, it resolves everything by only looking at the JARs in the Maven repository. The second option generally works, but when I do something like Ctrl-click on a class or variable, I get the Class File Editor and "Source not found", which isn't terribly useful. Also, it can get out of sync if I edit code in the IDE but don't run "maven install" after that.
I suppose this is mainly an inconvenience with Eclipse but it's annoying. I am considering resolving this by modifying the Maven dependencies to build with source (or debug) but I can't necessarily do this with everything. Is the "Resolve dependencies" option intended to work exclusively one way or the other as I've described?
You might want to have a look at the build-helper-maven-plugin.
You can configure it like this:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>1.5</version>
  <executions>
    <execution>
      <id>add-source</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>add-source</goal>
      </goals>
      <configuration>
        <sources>
          <source>target/generated-sources</source>
          <source>target/jaxws/wsimport/java</source>
        </sources>
      </configuration>
    </execution>
  </executions>
</plugin>
This will tell the Eclipse Maven plugin to have a look at the generated sources and include them in your project's classpath.
You can also add the generated sources manually to your classpath in eclipse. (right-click on the generated folder -> add to build path)
I think that since you want to reference files that only exist after a build, you need to somehow force the build to happen before the references are resolved. You could cheat by just doing a build from within Eclipse; that would leave the generated source files in place, ready to be referenced. I think, however, that the Maven philosophy would have you move the generated code to another Maven artifact entirely. That would let you separate the lifecycles of the two groups of code, so that when you're ready to use Eclipse to edit the hand-written code, references to generated classes resolve because you've already generated that code in the build of a separate, independent module.
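If you go that route, the aggregator could look roughly like this (module names are made up):
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>parent</artifactId>
  <version>1.0.0</version>
  <packaging>pom</packaging>
  <modules>
    <!-- built first: runs the JAXWS/wsimport generation and packages the generated classes -->
    <module>generated-ws-client</module>
    <!-- hand-written code, depends on generated-ws-client like any other jar -->
    <module>application</module>
  </modules>
</project>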
I know this is an old issue. But I encountered the same thing in Juno with an updated "m2e-wtp" plugin. So I'm answering solely for other readers' benefit.
This was only happening in war projects. The only thing that eventually resolved it was removing the ".settings" folder under the war project's folder and restarting Eclipse.

Moving to Maven from GNUMake

I've been a long-time user of the Make build system, but I have decided to begin learning Maven. While I've read through most of the online docs, none seem to give me the analogies I'm looking for. I understand the build lifecycle, but I have not seen one reference to compile-step dependencies. For example, I want to generate code from a JFlex grammar as part of the compile lifecycle, and currently I see no way of making that a pre-compile step. Documentation seems to be limited on this. In general, the concept of step dependencies seems to be baked into Maven and to require a plugin for any alteration. Is this the case? What am I missing? Currently the Maven build system seems very limited in how you can set up compilation steps.
You can do anything in Maven. It generally has a default way to do each thing, and then you can hook in and override if you want to do something special. Sometimes it takes a lot of Stack Overflowing and head scratching to figure it out.
There is even an official JFlex Maven plugin.
Whenever possible, find someone who has made a Maven plugin do what you want. Even if it isn't 100% right, it may at least give you an idea on how to make maven do something.
Minimal configuration
This configuration generates the Java code of a parser for all grammar files (*.jflex, *.jlex, *.lex, *.flex) found in src/main/jflex/ and its sub-directories. The name and package of the generated Java source code are the ones defined in the grammar. The generated Java source code is placed in target/generated-sources/jflex, in sub-directories following the Java convention on package names.
pom.xml
<project>
  <!-- ... -->
  <build>
    <plugins>
      <plugin>
        <groupId>de.jflex</groupId>
        <artifactId>jflex-maven-plugin</artifactId>
        <version>1.6.0</version>
        <executions>
          <execution>
            <goals>
              <goal>generate</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
    <!-- ... -->
  </build>
  <!-- ... -->
</project>
This feels like the Maven way to do things. Put your grammars in the right folder (src/main/jflex), and this plugin will automatically build them into your project. If you want to do fancier custom stuff there are some options, but Maven is all about favoring convention over configuration.
To be frank I think that your current mindset maps much better to ant than to maven, and I would suggest starting with that.
