I have a multi-module Maven project where the modules share dependencies; by share I mean they use the same dependencies, but each module declares its dependencies itself. To keep my sanity (yeah, Maven, sanity, I know) and to have all modules use the same dependency versions, the parent pom declares properties with the version numbers:
<properties>
  <dependency1.version>1.0-SNAPSHOT</dependency1.version>
  <dependency2.version>1.1-SNAPSHOT</dependency2.version>
</properties>
and all modules use them like this:
<dependency>
  <groupId>group</groupId>
  <artifactId>dependency1</artifactId>
  <version>${dependency1.version}</version>
</dependency>
I'm quite happy with this setup, as it allows me to change dependency versions in one place.
Now I have a bunch of dependencies that I maintain myself. Releasing those is automated and very simple, basically:
mvn release:prepare release:perform -B
Now I want to automate further, so in the main project I run:
mvn versions:update-properties
(I also run mvn versions:use-releases to switch regular dependencies to released versions if needed, but that's out of the scope of this question.)
After this update-properties run, the properties in my main project's pom point to releases (which is good). However, if my modules use properties to define versions of other dependencies, and newer versions of those dependencies are available, those properties are also changed.
Is there any way to limit the damage from update-properties? versions:use-releases takes an includes property, so I can run it only on my own artifacts. I cannot find anything similar for update-properties.
I could revert all poms besides the parent one and commit/push only that, but it doesn't seem elegant.
It sounds like you haven't quite understood how Maven is meant to handle this.
In such circumstances you should use dependencyManagement in the parent pom like the following:
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.inject</groupId>
      <artifactId>guice</artifactId>
      <version>3.0</version>
    </dependency>
    ...
  </dependencies>
</dependencyManagement>
In your modules you then declare the dependency like this:
<dependencies>
  <dependency>
    <groupId>com.google.inject</groupId>
    <artifactId>guice</artifactId>
  </dependency>
  ...
</dependencies>
The important point is not to define the version in the module; the version defined in the dependencyManagement block will be used instead. That way you don't need to define properties at all, and you still have a single place where you define and change the dependencies, in particular their versions.
Apart from that, it is possible to limit which properties will be changed by specifying them on the command line when calling versions:update-properties, as shown below.
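For example, assuming the property names from the question and a recent versions-maven-plugin (whose update-properties goal accepts an includeProperties parameter listing the properties to touch), something like this should restrict the update to your own artifacts' properties:

mvn versions:update-properties -DincludeProperties=dependency1.version,dependency2.version

All other properties are then left untouched.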
We are trying to centralize the versions of all the artifacts that we are using in our code base to remove duplication and ease the task of bumping versions.
We have created a BOM pom with the versions of all of our own artifacts and third-party artifacts, and imported it (scope import) into the dependencyManagement section of the pom of each of our artifacts.
To avoid having to update each artifact every time the BOM version changes, we have tried to use a version range when importing the BOM:
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.acme</groupId>
      <artifactId>bom</artifactId>
      <version>[1.0,)</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
However, Maven does not seem to recognize version ranges in the dependencyManagement section of a pom.
I'm aware that if the relationship between our artifacts were hierarchical, we could use modules and release from a parent POM. Unfortunately, this is not the case.
This must be a common use case for Maven. What are we doing wrong, or what other solutions exist?
Judging by the number of relevant issues in the Maven issue tracker, this looks like a long-standing limitation in Maven.
Based on the most recent and relevant ticket, it should be resolved in the next major Maven release (4.0.0).
I have a Maven Java project. I don't want my project's dependencies to be satisfied by chance through a chain of sub-dependencies when compiling the project. It is fine when building the final WAR, where Maven must gather all used dependencies and add the necessary libs to the WAR, but when compiling the code I want to be sure that only direct dependencies are used. Why?
Let's say I have two dependencies:
<dependency>
  <groupId>com.package</groupId>
  <artifactId>module-1</artifactId>
</dependency>
<dependency>
  <groupId>com.package</groupId>
  <artifactId>module-2</artifactId>
</dependency>
In our project, module-1 and module-2 serve completely different purposes, but somewhere in the dependency tree of module-2, module-1 is used. I delete the module-1 dependency, but Maven continues to build my project without compilation errors, because it resolves module-1 from module-2's sub-dependencies. This change goes unnoticed.
After some time we decide to remove module-2 because we don't need it. Strangely enough, we can no longer compile the classes that use imports from module-1, even though they are not connected to module-2's logic.
This is a simple case, but in a big project this can make quite a dependency mess.
You can use the Maven dependency plugin goal dependency:analyze to get a report of all used dependencies that are not declared in the current module (i.e., that are only pulled in transitively). That way Maven will still use transitive dependencies (no way around that, I guess), but you can force yourself via the plugin to make sure these are also declared explicitly. It will also warn you about unnecessary dependencies. Mind that the plugin analyzes the compiled classes; at times you may need to configure it, because occasionally it does not detect that a dependency is required at compile time but not at runtime, e.g. because a constant was inlined. A configuration sketch follows below.
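As an illustration, here is a minimal sketch of binding that check into the build so it fails on violations (the analyze-only goal and the failOnWarning parameter belong to the maven-dependency-plugin; the plugin version is only an example):

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-dependency-plugin</artifactId>
      <version>3.6.1</version>
      <executions>
        <execution>
          <id>analyze-deps</id>
          <phase>verify</phase>
          <goals>
            <goal>analyze-only</goal>
          </goals>
          <configuration>
            <!-- fail on used-but-undeclared and declared-but-unused dependencies -->
            <failOnWarning>true</failOnWarning>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

Alternatively, just run mvn dependency:analyze manually from time to time and review the report.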
If you really need to do this, then you can set up exclusions in the pom.
For example, here's an exclusion in one of my poms where I don't want it to automatically pull in commons-logging, because I'm using a different logging provider:
<dependency>
  <groupId>org.springframework</groupId>
  <artifactId>spring-context</artifactId>
  <version>${org.springframework-version}</version>
  <exclusions>
    <!-- Exclude Commons Logging in favor of SLF4j -->
    <exclusion>
      <groupId>commons-logging</groupId>
      <artifactId>commons-logging</artifactId>
    </exclusion>
  </exclusions>
</dependency>
You could do something like this (untested)
<dependency>
  <groupId>com.package</groupId>
  <artifactId>module-2</artifactId>
  <exclusions>
    <exclusion>
      <groupId>com.package</groupId>
      <artifactId>module-1</artifactId>
    </exclusion>
  </exclusions>
</dependency>
I wouldn't necessarily recommend this, though. It makes sense in the case of my logging exclusion because I'm using SLF4J instead of commons-logging. I've seen other examples where this is used to exclude Spring 2 if the project as a whole is using Spring 3.
It's a bit difficult to tell from your example because it's so vague. In general, you should keep your dependencies to a minimum. If module-2 depends on module-1, then it implies that your application won't compile or run without module-1. If in fact it can live happily without it, then it's not really a dependency.
As a side note, it's a bit alarming that you don't have a version number against the dependencies. You'll probably find Maven warns you about this. It's good practice to always include a version number. If you depend on a module that is currently in development, then you should use the -SNAPSHOT suffix on the version to get the latest build of that version.
There seems to be no way to tell Maven not to resolve dependencies transitively: How to exclude all transitive dependencies of a Maven dependency. One of the reasons, I think, is that the user can quickly run into runtime trouble when some of the artifacts are not resolved at runtime or there are artifact version problems. However, if you check the link out, you can make each of the dependencies 'standalone' with a wildcard exclusion pattern; a sketch follows below.
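Here is a minimal sketch of that wildcard pattern applied to the example from the question (wildcard exclusions need a reasonably recent Maven version, 3.2.1 or newer as far as I know):

<dependency>
  <groupId>com.package</groupId>
  <artifactId>module-2</artifactId>
  <exclusions>
    <!-- exclude everything module-2 would otherwise pull in transitively -->
    <exclusion>
      <groupId>*</groupId>
      <artifactId>*</artifactId>
    </exclusion>
  </exclusions>
</dependency>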
One other option is to mark such sub-dependencies as <optional> where they are declared (i.e., in module-2's pom, declare its dependency on module-1 as optional). This makes sure the project still compiles, but none of your module-X artifacts will be resolved transitively. Like this:
<dependency>
  <groupId>com.package</groupId>
  <artifactId>module-1</artifactId>
  <optional>true</optional>
</dependency>
Still, analyzing the dependency tree might be the safest and most predictable choice.
What you plan to do does sound a bit strange; in a way you would be sabotaging the dependency management you want to use.
If your module-2 depends on module-1 and declares a dependency on it, then any module that depends on module-2 only needs to declare that one.
You may be able to restrict the depth of the resolution using exclusions: Exclude all transitive dependencies of a single dependency
Newer versions of Maven allow wildcards in those.
But you will then need to re-add the ones you actually need, which means repeating the dependencies you have in other modules. This duplicates the work.
If there are artifacts that cause weirdness, it may be possible to define a scope (see http://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html) so that they are not propagated to dependent modules.
I'm new to Maven and I'm trying to build a project for the first time. I want to write some code that depends on Apache Lucene. Here's a list of artifacts in Maven that I'm trying to get.
Is there any way that, instead of explicitly listing each artifact, I could simply depend on all artifacts of a given group and version? I tried this:
<dependency>
  <groupId>org.apache.lucene</groupId>
  <artifactId>*</artifactId>
  <version>3.6.1</version>
</dependency>
which gave me the error
'dependencies.dependency.artifactId' for org.apache.lucene::jar with value '' does not match a valid id pattern. # line 19, column 19
I can verify that I can download dependencies when I explicitly state them, i.e. this works fine:
<dependency>
  <groupId>org.apache.lucene</groupId>
  <artifactId>lucene-core</artifactId>
  <version>3.6.1</version>
</dependency>
I realize depending on everything in Lucene is probably sub-optimal, but for something quick-and-dirty I'd hate to have to manually list all these little Lucene libraries. What is the typical practice for getting a large set of related dependencies in Maven?
Short answer: you can't. Remember, you only do this once, and later you can simply copy-paste the dependencies (not very DRY, though). Also consider creating an archetype that will quickly create a skeleton with all the required dependencies (for quick-and-dirty projects).
Longer answer: well, you can work around that. Create a separate pom.xml with:
<packaging>pom</packaging>
and declare all Lucene dependencies there manually, one after another, once and for all. Later you can simply add a dependency in your project's pom.xml on that artifact (that is, on the groupId/artifactId/version defined there), which will transitively include all dependencies of that pom.xml (see the sketch below).
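A minimal sketch of such a dependency-bundling pom (the lucene-deps coordinates are made up, and lucene-analyzers is just one example of an additional Lucene 3.x artifact):

<!-- lucene-deps/pom.xml: an artifact whose only purpose is to bundle dependencies -->
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.mycompany</groupId>
  <artifactId>lucene-deps</artifactId>
  <version>1.0</version>
  <packaging>pom</packaging>
  <dependencies>
    <dependency>
      <groupId>org.apache.lucene</groupId>
      <artifactId>lucene-core</artifactId>
      <version>3.6.1</version>
    </dependency>
    <dependency>
      <groupId>org.apache.lucene</groupId>
      <artifactId>lucene-analyzers</artifactId>
      <version>3.6.1</version>
    </dependency>
  </dependencies>
</project>

Then, in the real project, a single dependency pulls all of them in transitively:

<dependency>
  <groupId>com.mycompany</groupId>
  <artifactId>lucene-deps</artifactId>
  <version>1.0</version>
  <type>pom</type>
</dependency>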
Talking about transitivity: if you depend on a JAR in Maven and that JAR has other dependencies, you get those transitive dependencies implicitly. Examine the Lucene poms; maybe it's enough to import a few of them and rely on transitive dependencies.
Inside a single dependency for a groupId, add the different artifactIds:
<dependency>
  <groupId>org.springframework</groupId>
  <artifactId>spring-beans</artifactId>
  <artifactId>spring-context</artifactId>
  <version>4.3.7.RELEASE</version>
</dependency>
I am new to Maven and am setting up my first Maven project. I am also creating some Maven assets in the form of poms that can be inherited from, or used as dependencies, in future projects. I want to group dependencies together and be able to selectively add them to a project as needed.
I read this article on POM best practices. I like the idea of grouping related dependencies together into poms and then adding such a pom as a dependency to a project as needed. This approach works great for compile-scoped dependencies. However, it fails for provided-scoped ones, since as transitive dependencies they get omitted.
Here's an example of what I mean: let's say I group together web dependencies for my projects into a web-deps pom.xml. These include compile-scoped Spring Framework dependencies and also a provided-scoped Java EE one:
<modelVersion>4.0.0</modelVersion>
<groupId>com.xyz</groupId>
<artifactId>mvn-web-deps</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>pom</packaging>
<dependencies>
  <dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-web</artifactId>
    <version>${org.springframework.version}</version>
  </dependency>
  <dependency>
    <groupId>org.springframework</groupId>
    <artifactId>spring-webmvc</artifactId>
    <version>${org.springframework.version}</version>
  </dependency>
  <dependency>
    <groupId>javaee</groupId>
    <artifactId>javaee-api</artifactId>
    <version>${javaee.version}</version>
    <scope>provided</scope>
  </dependency>
</dependencies>
I then add this pom as a dependency in another project:
<modelVersion>4.0.0</modelVersion>
<groupId>com.xyz</groupId>
<artifactId>project-a</artifactId>
<version>0.0.1-SNAPSHOT</version>
<dependencies>
  <dependency>
    <groupId>com.xyz</groupId>
    <artifactId>mvn-web-deps</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <type>pom</type>
  </dependency>
</dependencies>
The dependencies in mvn-web-deps now become transitive dependencies of project-a. Since the dependency reference above is compile-scoped, the provided transitive dependency gets omitted.
I want to avoid adding them to the dependencies section of a parent, since there can only be one parent and a project may need only some of these dependency groups, not all. I could perhaps add them to the dependencyManagement section, but then I would have to redeclare each dependency (sans the version) in each child project.
What is the correct/better way of grouping dependencies while avoiding issues like the above?
The short answer to your question is that you should only declare 'provided' dependencies locally, in the module whose code needs them to compile, not in a parent pom.xml or other shared structures. Declaring a 'provided' dependency in a global pom.xml makes no sense to Maven, because nothing in that pom.xml needs it to compile.
Here is the long answer:
When I started using Maven, I had the same idea of trying to group artifacts and dependencies into pom.xml modules, hoping they would be useful in the future. Now that I have a bit more experience, I have come to understand that it is a complete waste of time. For me, this was a form of over-engineering.
I have learned to split my big projects into separate modules, each in its own Subversion repository. I include dependencies as necessary for each local module in its pom.xml. I release versioned tags of each module as I go and as necessary (i.e., when tested and stable).
I build my big projects by creating a separate Maven project with its own pom.xml and importing my modules as dependencies. From time to time, I update a module's version in the dependency when I have made a release. Then I let Maven do the job of pulling whatever it has to pull, transitively or not, when compiling/releasing the big project.
Maven allows all sorts of complex constructions and hierarchies between pom.xml files, but IMHO this creates unnecessary mess and complexity. So far it has not proved to be a real benefit for me. At the beginning, I was hoping that compiling one pom.xml would compile the rest properly in a cascading way. I did get some results, but what a mess to maintain across all the global pom.xml files.
Releasing my modules' artifacts separately and building my project on those releases has saved me so much time that I can only recommend it. In total, I have fewer pom.xml files to maintain, and they are also less complex. For the same final result...
So, if your only reason for building global/structural pom.xml files is the hope of saving time, I recommend abandoning the idea... Separate your code into separate projects, release them, and THEN compile globally.
I concluded that Maven was not designed for this kind of use case. I ended up having a parent pom.xml with all the libraries I use added to its <dependencyManagement> section. Any new project/module I create has its pom.xml inherit from the parent pom.xml and adds each dependency it needs to its own <dependencies> section, minus the version. This scheme allows me to manage the versions of the libraries I use, and the repository declarations they need, in a single place. Another advantage (over trying to create dependency bundles somehow) is that it gives more fine-grained control over the libraries added to child poms: only the dependencies that are actually needed are added.
Provided-scope dependencies are indeed inherited from a parent POM, but NOT from a POM declared as a dependency, and I consider that a Maven weakness.
Given that Maven also has difficulties with adding modules as dependencies across module hierarchies, I can't say Maven is a sophisticated tool for managing multi-module projects. Maven expects a strict single-rooted hierarchy that is only suitable for the simplest projects.
We're using Maven 2.1.0. I have multiple modules that are completely separate, but still have many common dependencies, like log4j, though some modules don't need it. I am wondering if it is a good idea to declare all common dependencies in one parent file in the <dependencyManagement> section, or is there a better way to deal with this?
A follow-up question about <dependencyManagement>: if I declare log4j in the <dependencyManagement> section of the parent and a sub-project does not use it, will it be included anyway?
If you have a parent project, you can declare all dependencies and their versions in the dependencyManagement section of the parent pom. This doesn't mean that all projects will use all those dependencies; it means that if a project does declare one of the dependencies, it will inherit its configuration, so it only needs to declare the groupId and artifactId of the dependency. You can even declare your child projects in the parent's dependencyManagement without introducing a cycle.
Note that you can do something similar with plugins by declaring them in the pluginManagement section; any child declaring the plugin will then inherit its configuration (see the sketch just below).
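As an illustration, a minimal pluginManagement sketch, assuming you want to pin the maven-surefire-plugin version for all modules (the version number and configuration value are only examples):

In the parent:

<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.22.2</version>
        <configuration>
          <!-- shared configuration inherited by every child that uses the plugin -->
          <argLine>-Xmx512m</argLine>
        </configuration>
      </plugin>
    </plugins>
  </pluginManagement>
</build>

In a child, only the coordinates are needed; the version and configuration come from the parent:

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
    </plugin>
  </plugins>
</build>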
Returning to dependencies: for example, if you have four projects, parent, core, ui and utils, you could declare all the external dependencies and the internal project versions in the parent. The child projects then inherit that configuration for any dependencies they declare. If all modules are to have the same version, the versions can even be declared as properties in the parent.
An example parent is as follows:
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>name.seller.rich</groupId>
  <artifactId>parent</artifactId>
  <version>1.0.0</version>
  <packaging>pom</packaging>
  <dependencyManagement>
    <dependencies>
      <dependency>
        <groupId>commons-io</groupId>
        <artifactId>commons-io</artifactId>
        <version>1.4</version>
      </dependency>
      <dependency>
        <groupId>name.seller.rich</groupId>
        <artifactId>ui</artifactId>
        <version>${project.version}</version>
      </dependency>
      <dependency>
        <groupId>name.seller.rich</groupId>
        <artifactId>core</artifactId>
        <version>${project.version}</version>
      </dependency>
      <dependency>
        <groupId>name.seller.rich</groupId>
        <artifactId>utils</artifactId>
        <version>${project.version}</version>
      </dependency>
    </dependencies>
  </dependencyManagement>
  <modules>
    <module>utils</module>
    <module>core</module>
    <module>ui</module>
  </modules>
</project>
And the utils, core, and ui projects inherit all the relevant versions.
utils:
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>name.seller.rich</groupId>
  <artifactId>utils</artifactId>
  <!-- note: version not declared, as it is inherited from the parent -->
  <parent>
    <artifactId>parent</artifactId>
    <groupId>name.seller.rich</groupId>
    <version>1.0.0</version>
  </parent>
  <dependencies>
    <dependency>
      <groupId>commons-io</groupId>
      <artifactId>commons-io</artifactId>
    </dependency>
  </dependencies>
</project>
core:
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>name.seller.rich</groupId>
  <artifactId>core</artifactId>
  <parent>
    <artifactId>parent</artifactId>
    <groupId>name.seller.rich</groupId>
    <version>1.0.0</version>
  </parent>
  <dependencies>
    <dependency>
      <groupId>name.seller.rich</groupId>
      <artifactId>utils</artifactId>
    </dependency>
  </dependencies>
</project>
ui:
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>name.seller.rich</groupId>
  <artifactId>ui</artifactId>
  <parent>
    <artifactId>parent</artifactId>
    <groupId>name.seller.rich</groupId>
    <version>1.0.0</version>
  </parent>
  <dependencies>
    <dependency>
      <groupId>name.seller.rich</groupId>
      <artifactId>core</artifactId>
    </dependency>
  </dependencies>
</project>
I wrote up a list of best practices. Here are the most important ones.
Always use the maven-enforcer-plugin (a configuration sketch follows at the end of this list).
Enforce dependency convergence. Otherwise it's possible that you depend on two different jars which both depend on log4j; which one gets used at compile time depends on a set of rules that you shouldn't have to remember, and both (!) can get exported as transitive dependencies.
Require plugin versions (for all plugins, even the built-in ones), and define them in pluginManagement in the parent pom. Otherwise a new version of maven-surefire-plugin could break your build.
Use dependencyManagement in the parent pom to use versions consistently across all modules
Periodically run mvn dependency:analyze
It's possible that you're getting a dependency transitively that you actually depend on directly at compile time. If so, it's important to add it to your pom with the version you require. This plays nicely with the enforcer plugin.
It's also possible that you're declaring extra dependencies that you don't use. This check doesn't work properly 100% of the time, especially with libraries that are designed to have optional pieces (e.g. slf4j-api gets detected properly, but slf4j-log4j12 fails).
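Here is a minimal sketch of the enforcer setup mentioned above (dependencyConvergence and requirePluginVersions are standard enforcer rules; the plugin version is only an example):

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-enforcer-plugin</artifactId>
      <version>3.4.1</version>
      <executions>
        <execution>
          <id>enforce</id>
          <goals>
            <goal>enforce</goal>
          </goals>
          <configuration>
            <rules>
              <!-- fail the build if two paths bring in different versions of the same artifact -->
              <dependencyConvergence/>
              <!-- fail the build if any plugin, including the built-in ones, has no explicit version -->
              <requirePluginVersions/>
            </rules>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>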
Each module should have its own POM in which it declares its own dependencies. This tracks not only external dependencies, but also internal ones.
When you use Maven to build a project, it will sort the whole lot out. So if many modules (perhaps all) depend on log4j, it will only be included once. There are some problems if your modules depend on different versions of log4j, but this approach usually works fine.
It is also useful (if there are more than one or two developers working together) to set up an internal repository (like Artifactory) and use that internally. It makes it much easier to deal with libraries that are not in the public repos (just add them to your internal repo!), and you can also use build tools to push builds of your own code there so others can use the modules without checking out the code (useful in larger projects).
A follow-up question about <dependencyManagement>: if I declare log4j in the <dependencyManagement> section of the parent and a sub-project does not use it, will it be included anyway?
No. Dependency management only sets the default version and possibly the scope (I've seen the scope both appear to be inherited and appear not to be inherited, so you will need to look that one up on your own). To include the dependency in a child module, you need to declare it as a dependency of the module and omit the version element. You can override the default in a child module simply by including a version number in the dependency element of the child module's POM.
I have multiple modules that are completely separate, but still have many common dependencies.
In this case, yes and no.
For modules that are built, versioned, and deployed together as a unified project, for instance the modules that compose a single web application, most definitely yes. You want to relieve yourself of the headache of changing the version in more than one POM when you decide to move to a new version of a dependency. It can also save you work when you need to exclude certain transitive dependencies: if you declare the dependency with its exclusions in the <dependencyManagement> section, you don't have to maintain the exclusions in multiple POMs.
For modules that are not directly related but are built by a single team within the company, you may want to consider declaring default versions for common libraries (testing utilities, logging utilities, etc.) in order to keep the team working with the standard versions of the tools that you have defined as part of your best practices. Remember, you can always increase the version of your super POM when you standardize on a new set of common libraries. Where you draw the line between standardized libraries and tools and project-specific libraries and tools is up to you, but it should be easy for your team to find.
We use a single common parent with a dependencyManagement block for all our projects. This is starting to break down as we move more projects into Maven: if a project needs a different version, then we have to either declare it as a dependency for all children or explicitly define the version for each pertinent child.
We're trying out a model where we split the dependencyManagement out from our common parent and then import our corporate dependencyManagement pom into the top-level project pom. This allows us to selectively define project-level defaults that override the corporate defaults.
Here is the original scenario:
A defines version 1.0 of foo.jar as the corporate default
B child of A
C1, C2, C3 children of B
D1, D2, D3 children of C1, C2, C3 respectively
If D1 and D2 require version 1.1 of foo.jar, then our choices used to be:
Declare foo.jar version 1.1 as a dependency in B, making it appear that C1, C2, C3 and D3 also depended upon version 1.1
Declare foo.jar version 1.1 as a dependency in D1 and D2, moving the dependency declaration into multiple places deeper in our project hierarchy.
Here is what we're trying out:
A defines version 1.0 of foo.jar as the corporate default
B dependencyManagement: imports A, declares a default of foo.jar version 1.1
C1, C2, C3 children of B
D1, D2, D3 children of C1, C2, C3 respectively
Now D1 and D2 just declare a dependency on foo.jar and pick up version 1.1 from the dependencyManagement block of B (a sketch of B's pom follows below).
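A minimal sketch of B's pom under this scheme (the com.acme coordinates and version numbers are placeholders):

<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.acme</groupId>
  <artifactId>B</artifactId>
  <version>1.0.0</version>
  <packaging>pom</packaging>
  <dependencyManagement>
    <dependencies>
      <!-- pull in the corporate defaults defined in A -->
      <dependency>
        <groupId>com.acme</groupId>
        <artifactId>A</artifactId>
        <version>1.0.0</version>
        <type>pom</type>
        <scope>import</scope>
      </dependency>
      <!-- project-level override: declared directly, so it wins over the imported 1.0 default -->
      <dependency>
        <groupId>com.acme</groupId>
        <artifactId>foo</artifactId>
        <version>1.1</version>
      </dependency>
    </dependencies>
  </dependencyManagement>
</project>

D1 and D2 then declare foo without a version and resolve it to 1.1.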
In a multi-module project I place any common dependencies in the parent pom.xml. I'm not sure if this would be best practice if the modules were not related to the same project, though.