Building Maven Projects, Independent of IDE (esp. Eclipse) - java

I inherited a .NET/Java project combo in which there are different "modules"; some modules depend on others, some are independent. It is not a true multi-module project (no aggregator POM).
I am using IntelliJ (Community) to compile/debug these and I am unable to do so. These were previously built with Eclipse (.classpath, .project), and now the team (management) wants to move to IntelliJ and VSTS CI.
Some more Info
The web application is packaged as a WAR and deployed to Tomcat using WiX on Windows.
The application uses some custom jars (no source available).
I have been asked to avoid using local repositories.
The multi-module project samples on the Maven website / GitHub either do not compile or require me to run mvn clean install for every dependent Java project.
All Java projects are Maven projects with a pom.xml.
Questions :
How do I create a multi-module project using this structure (which can also be imported into IntelliJ/Eclipse)?
How do I compile the resulting multi-module project with Maven, and compile/debug it from IntelliJ?
Is Gradle more suitable here?
As a new Java programmer, it gets really difficult to change and deploy an already existing setup when you cannot talk to the person who made it :(.
Any help/direction/suggestions will be really helpful.
Project Structure (JAVA)
C:\SomeProj\
├── Jars
| ├── Java
| | └── extJar2.jar
| | └── extJar3.jar
| | └── extJar4.jar
| | └── extJar5.jar
| | └── javax.servlet-3.0.jar (Why is this externally required)
| └── SomeFolder1
| | └── extJar1.jar
├── Source
| ├── Common
| | ├── DAL (JAVA - jar) [Depends On Logger]
| | ├── Logger (JAVA - Jar)
| ├── API
| | ├── DataInput
| | | ├── InfoPuller (JAVA - Jar) [Depends On Logger]
| | ├── Info
| | | └── MyWebApi (JAVA-WAR) [Depends On DAL and Logger]{Application to Build/Run/Deploy from Intellij/MAVEN}
| | └── Util
| | └── JobManager (JAVA - Jar) [Depends On Logger]
| └── WebApp (Js App) {FrontEnd}

I think your best bet is to create an aggregator project, in the most standard way possible (for example following the recommendations here: https://maven.apache.org/guides/mini/guide-multiple-modules.html).
If you're new to Maven it might look scary, but this can be as little as a single file (which can even live in a subfolder of one of your projects, although I don't recommend this and suggest giving it its own project instead). Having an aggregator will help a lot with modules that depend on other modules, etc.
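If it helps, here is a minimal sketch of what such an aggregator pom.xml could look like for the structure you posted (the groupId, version and relative module paths are assumptions; adjust them to your real layout):

<!-- C:\SomeProj\Source\pom.xml -->
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>com.example.someproj</groupId>        <!-- assumed -->
  <artifactId>someproj-aggregator</artifactId>   <!-- assumed -->
  <version>1.0.0-SNAPSHOT</version>
  <packaging>pom</packaging>   <!-- an aggregator must use pom packaging -->

  <!-- module paths are relative to this file -->
  <modules>
    <module>Common/Logger</module>
    <module>Common/DAL</module>
    <module>API/DataInput/InfoPuller</module>
    <module>API/Util/JobManager</module>
    <module>API/Info/MyWebApi</module>
  </modules>
</project>

Maven orders the reactor by inter-module dependencies, so Logger is built before DAL, DAL before MyWebApi, and so on.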
The resulting aggregated project can easily be imported in Eclipse/IntelliJ (Eclipse for example has a feature where it knows to import/create project files out of a pom.xml; IntelliJ has a similar feature).
The resulting project (if you set it up correctly) can be compiled/packaged via the default maven lifecycle, either with
mvn compile or mvn package. Obviously you can also run stuff via IDE, but you might need a little spadework (setting up runtimes/servers, etc).
For example I build my projects for CI via maven, but I am also running them via eclipse, for development purposes. I'm using my pom.xmls as the Single Source of Truth.
You've just described what I believe to be a classic use-case for multi-module projects; You're asking for an opinionated answer which in the end boils down to "it depends what you prefer". I personally don't know Gradle, but I don't see elements in your description that look strange. So I think you can use Gradle if members of your team have better knowledge of Gradle vs Maven, etc.
Additional things to consider:
On the application's custom jars: if they do not exist in the Maven Central repository, or somewhere in the provider's repository, and you want to avoid local repositories, you can include a repository along with your project. This is basically just a flavor of local Maven repository, with the distinct advantage that it can be packaged along with your projects, so other devs don't need to execute additional steps (i.e. they don't need to add the artifacts to their local repo manually). Some people don't like this approach, but if your constraints of "no local repo, no company repo" are immovable, then I don't see any other way.
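As a rough sketch of that idea (the folder name and coordinates below are assumptions, not something from your project): put the jars into a folder that follows the Maven repository layout, for example with

mvn install:install-file -Dfile=Jars/Java/extJar2.jar -DgroupId=ext.thirdparty -DartifactId=extJar2 -Dversion=1.0 -Dpackaging=jar -DlocalRepositoryPath=Jars/maven-repo

and then declare that folder as a repository in the POM(s) that need the jars:

<repositories>
  <repository>
    <id>project-local-repo</id>
    <url>file://${project.basedir}/../../Jars/maven-repo</url>   <!-- relative path is an assumption -->
  </repository>
</repositories>

After that the jars can be referenced as normal <dependency> entries and the repository folder is checked in together with the rest of the code.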
I don't understand very well what you say here:
The multi-module project samples on the Maven website / GitHub either do not compile or require me to run mvn clean install for every dependent Java project.
I am going to assume that you mean you've found some multi-module sample somewhere and it didn't work for you. Without a concrete example I can't really comment, except to mention that I have used multi-module builds in the past (and in the present), and they work fine for me.
Regarding your comment about mvn clean install - that's the Maven command that installs your project's/module's artifacts into the local repository, and it is not REALLY needed unless those artifacts have to be consumed by projects external to your current multi-module build.
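One convenience worth mentioning once the aggregator is in place: you can build a single module together with everything it depends on in one reactor run, without installing each dependency separately. Assuming the module path from the aggregator sketch above, something like

mvn -pl API/Info/MyWebApi -am package

run from the aggregator's folder builds Logger, DAL and then the war (-pl selects the module, -am also builds the modules it requires).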

Related

Ways to organize Maven project with lots of artifacts

I'm trying to organise my maven project.
Let's say my project is called "awesome". "awesome" has several artifacts, each of them built differently (e.g., some of them may be built with one plugin, others with other plugins). In general these build configurations are finite and limited (let's say there are at most 3 different ways to build an artifact); however, each artifact can only be built in exactly one way (e.g., the utility artifact is built with maven-jar-plugin configured in a particular way, while the client-ui artifact is built with maven-war-plugin configured in a particular way).
Now, I know I could organize the maven project as follows:
awesome-root
|---jars
| |--- utility
| |--- client-model
| |--- task-model
| |--- supplier-model
| |--- client-logic
| |--- task-logic
| ---- supplier-logic
|---wars
|--- client-ui
|--- task-ui
---- supplier-ui
This way, each particular build configuration can be put inside the build > plugins section of the jars and wars projects, while general properties / dependency management / plugin management can be put in awesome-root.
Problem:
I quickly realized that developers generate artifacts closely related to each other but with different builds. In the previous example, we can see that the artifacts can also be grouped in this other way:
awesome-root
|--- tasks
| |--- task-model
| |--- task-logic
| ---- task-ui
|--- clients
| |--- client-model
| |--- client-logic
| ---- client-ui
|--- supplier
| |--- supplier-model
| |--- supplier-logic
| ---- supplier-ui
|--- others
|--- utility
The main advantage of this grouping is that tasks, clients and suppliers are 3 different, independent software sectors. When the developer needs to make a change in, let's say, the client sector, she has everything she needs in a small part of the file system (or in the project explorer tab of an IDE, like Eclipse). Conversely, in the first mapping, the client software sector is scattered all over the project repository.
While this may not be a big deal, if the "awesome" project starts to get really big, with a lot of artifacts and so on, finding all the related parts of the client sector starts to be annoying (not impossible, since IDEs offer searches for this purpose).
I'd say the second structure is much better, developer wise.
However, it seems difficult to implement this strategy in Maven: the main difficulty is where to put the different build configurations for each artifact (e.g., *-ui needs to be built in a different way than *-model).
One may be tempted to put such configurations in client-ui, client-logic, client-model, but this would mean duplicating build configuration everywhere (e.g., client-ui, supplier-ui and task-ui have the same build configuration): if a build configuration needs to change, you need to change all the other copies;
Another solution might be to declare pluginManagement in awesome-root and then write the plugin definition in each module (a sketch follows this list): while this seems better, it still suffers from the same duplication problem as option 1;
Use an archetype to generate POMs with the correct build configuration: same as above;
Profiles: profiles are not inherited, and their activation depends only on system properties, not Maven ones;
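For reference, a minimal sketch of option 2 above (pluginManagement in the root, a short plugin reference in each module); the plugin and version shown are only an illustration:

<!-- awesome-root/pom.xml -->
<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-war-plugin</artifactId>
        <version>3.3.2</version>
        <configuration>
          <!-- the shared *-ui build configuration, declared once -->
        </configuration>
      </plugin>
    </plugins>
  </pluginManagement>
</build>

<!-- client-ui/pom.xml, task-ui/pom.xml, supplier-ui/pom.xml -->
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-war-plugin</artifactId>
      <!-- version and configuration are inherited from pluginManagement -->
    </plugin>
  </plugins>
</build>

Each *-ui module still has to reference the plugin, but the configuration itself lives in one place.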
My questions are:
Is the second structure impossible to achieve in Maven? Is there a way?
If not, do I need to bite the bullet and settle on the first structure?
Is there any alternative? (I'm trying not to pose an XY problem; any alternative is appreciated.)
Additional information:
OS: Ubuntu 18.04.3 (bionic), 64 bit
java version: openjdk 11.0.4 2019-07-16
IDE: Eclipse 4.10.0
m2e plugin: 1.10.0.20181127-2120
Thanks for any kind reply

get maven clean install to work like maven clean + maven install

I have the following project hierarchy:
app
|-module1
| |-pom.xml
|-module2
| |-pom.xml
|-pom.xml
Module1 and module2 both copy files to the same target directory, so I'm using the app's pom.xml to clear that directory. My problem is that the execution order right now is module1[clean], module1[install], module2[clean], module2[install], app[clean], app[install], so everything module1 and module2 put into that directory gets deleted.
I would like it to execute all the clean phases first, then all the installs, even when I run mvn clean install. Or if there is another way to execute app[clean] before module1[install] and module2[install], that would work too.
EDIT
I ended up making a separate module (NetBeans POM project) for cleaning alone. Not the solution I was hoping for, but it works for now.
The root of the problem here is that you're trying to make Maven do something that sort-of contradicts Maven's multi-module "conventions", as well as conflicting with Maven's "understanding" of a "target directory". There is a reason why Maven's reactor is operating the way that it does, and it is to preserve the Maven "spirit" (or "convention") of how modules are structured in a multi-module build.
In Maven, the target directory is supposed to belong only to one project: each project has its own target directory. In your scenario, there should really be a different target directory for app, module1 and module2.
I suppose your best bet, in order to both achieve your objective and keep your build process flexible, is to:
Have module1 output its own JAR into its own target directory (module1/target).
Have module2 output its own JAR into its own target directory (module2/target).
Add a plugin to app (the parent module) that will collect whatever it needs from module1/target and module2/target into app/target, and do whatever processing on those artifacts.
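One hedged way to implement that collection step (module names, coordinates and the plugin version below are illustrative, not taken from your build): because the parent itself is built before its modules in the reactor, it is usually easier to do the copying in a module that is built after module1 and module2, i.e. one that declares them as dependencies, and let maven-dependency-plugin's copy-dependencies goal gather the jars:

<!-- collector/pom.xml : a small module that depends on module1 and module2 -->
<dependencies>
  <dependency>
    <groupId>com.example</groupId>   <!-- assumed groupId -->
    <artifactId>module1</artifactId>
    <version>${project.version}</version>
  </dependency>
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>module2</artifactId>
    <version>${project.version}</version>
  </dependency>
</dependencies>

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-dependency-plugin</artifactId>
      <version>3.6.1</version>
      <executions>
        <execution>
          <id>collect-module-jars</id>
          <phase>package</phase>
          <goals>
            <goal>copy-dependencies</goal>
          </goals>
          <configuration>
            <includeGroupIds>com.example</includeGroupIds>
            <outputDirectory>${project.build.directory}/collected</outputDirectory>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

This way mvn clean install keeps its normal ordering, each module cleans only its own target, and the collecting module assembles the pieces at the end.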

How to install the impl module for a api dependency in maven

I have a multi-module maven project structured in something like this:
parent
|
|-presentation
|+services
| |-services-api
| |-services-impl
|+data-access
| |-data-access-api
| |-data-access-impl
|-+connector
| |-connector-api
| |-connector-implA
| |-connector-implB
|-...
The presentation module is packaged in a war and it depends only on the api modules.
When I run the install goal, the only dependencies that end up in the war are the api modules. To choose which impl modules go into the presentation module, I'm using profiles that add the dependency on the impl modules at build time, depending on the profiles selected.
From what I've been reading, I don't think this is the correct usage of Maven profiles.
What is the best way to tell maven to add a chosen impl to the presentation module?
I have the same usage of profiles but only for specific changes (dependencies mostly).
You do not have to put everything in profiles. Most of the implementation dependencies are common and are therefore declared directly, without profiles.
Depending on the targeted application server, I use profiles to override properties, add specific dependencies (CommonJ for WebSphere, for instance), ...
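As an illustration of that kind of profile (the dependency coordinates here are placeholders, not the real CommonJ ones):

<profiles>
  <profile>
    <id>websphere</id>
    <dependencies>
      <dependency>
        <groupId>com.example.commonj</groupId>   <!-- placeholder coordinates -->
        <artifactId>commonj-api</artifactId>
        <version>1.1</version>
        <scope>provided</scope>
      </dependency>
    </dependencies>
  </profile>
</profiles>

Building with mvn package -Pwebsphere then adds the server-specific dependency, while the common implementation dependencies stay outside any profile.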
I got a solution from the Maven users mailing list that I think is the right way to use Maven in my scenario.
I use runtime dependencies for the impl modules and one war project for each implementation of the api. Using war overlays, the resources are merged, and I end up with the application running with the correct module implementations depending on which war I run.
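A rough sketch of that setup, with made-up coordinates (one thin war per implementation, overlaying the common presentation war):

<!-- presentation-implA/pom.xml -->
<artifactId>presentation-implA</artifactId>
<packaging>war</packaging>

<dependencies>
  <!-- the common presentation war, merged in as a war overlay -->
  <dependency>
    <groupId>com.example</groupId>   <!-- assumed groupId -->
    <artifactId>presentation</artifactId>
    <version>${project.version}</version>
    <type>war</type>
  </dependency>
  <!-- the chosen implementation, needed only at runtime -->
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>connector-implA</artifactId>
    <version>${project.version}</version>
    <scope>runtime</scope>
  </dependency>
</dependencies>

The maven-war-plugin merges war-type dependencies into the final war automatically, so presentation-implA ends up containing the presentation resources plus implementation A's jars in WEB-INF/lib.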

How to clean old dependencies from maven repositories?

I have too many files in the .m2 folder where Maven stores downloaded dependencies. Is there a way to clean out all old dependencies? For example, if a dependency exists in 3 different versions (1, 2 and 3), after cleaning only version 3 should remain. How can I do that for all dependencies in the .m2 folder?
If you are on Unix, you could use the access time of the files in there. Just enable access time for your filesystem, then run a clean build of all your projects you would like to keep dependencies for and then do something like this (UNTESTED!):
find ~/.m2 -amin +5 -iname '*.pom' | while read pom; do parent=`dirname "$pom"`; rm -Rf "$parent"; done
This will find all *.pom files which have last been accessed more than 5 minutes ago (assuming you started your builds max 5 minutes ago) and delete their directories.
Add "echo " before the rm to do a 'dry-run'.
Short answer -
Delete the .m2 folder in {user.home}. E.g. on Windows 10 the user home is C:\Users\user1. Rebuild your projects using mvn clean package. Only those dependencies that are required by the projects will remain.
Long Answer -
The .m2 folder is just a normal folder whose content is built up from different projects. I think there is no way to figure out automatically which library is "old". In fact "old" is a vague word; there are many reasons a previous version of a library might still be used in a project, so determining which one is unused is not possible.
All you can do is delete the .m2 folder and rebuild all of your projects; the folder will then automatically be repopulated with all the required libraries.
If you are concerned about only a particular version of a library being used across all projects, it is important that the projects' POMs are also updated to the latest version; i.e. if different POMs refer to different versions of the library, all of them will get downloaded into .m2.
Given a POM file for a Maven project, you can remove all its dependencies from the local repository (by default ~/.m2/repository) using the Apache Maven Dependency Plugin.
It includes the dependency:purge-local-repository functionality that removes the project dependencies from the local repository, and optionally re-resolves them.
To only clean the local dependencies, use the optional parameter reResolve and set it to false, since it is true by default.
This command line call should work:
mvn dependency:purge-local-repository -DreResolve=false
Download all actual dependencies of your projects
find your-projects-dir -name pom.xml -exec mvn -f '{}' dependency:resolve \;
Move your local maven repository to temporary location
mv ~/.m2 ~/saved-m2
Rename all maven-metadata-central.xml* files in the saved repository to maven-metadata.xml*:
find . -type f -name "maven-metadata-central.xml*" -exec rename -v -- 's/-central//' '{}' \;
To set up the modified copy of the local repository as a mirror, create the directory ~/.m2 and the file ~/.m2/settings.xml with the following content (replacing user with your username):
<settings>
  <mirrors>
    <mirror>
      <id>mycentral</id>
      <name>My Central</name>
      <url>file:/home/user/saved-m2/</url>
      <mirrorOf>central</mirrorOf>
    </mirror>
  </mirrors>
</settings>
Resolve your projects dependencies again:
find your-projects-dir -name pom.xml -exec mvn -f '{}' dependency:resolve \;
Now you have a local Maven repository with only the necessary artifacts. Remove the local mirror from the config file and from the file system.
It's been more than 6 years since this question was asked, but I still hadn't found any tool that satisfactorily cleans up my repository, so I wrote one myself in Python to get rid of old local artifacts. Maybe it will be useful for someone else too:
repo-cleaner.py:
from os.path import isdir
from os import listdir
import shutil
import semver
import Constants

# Change to True to get a log of what will be removed without deleting anything
dry_run = False


def check_and_clean(path):
    # List the directory; a directory containing only files is left alone
    files = listdir(path)
    only_files = True
    for index, file in enumerate(files):
        if isdir('/'.join([path, file])):
            only_files = False
        else:
            files[index] = None
    if only_files:
        return
    directories = [d for d in files if d is not None]
    latest_version = check_if_versions(directories)
    if latest_version is None:
        # Not a version-number level; recurse into the subdirectories
        for directory in directories:
            check_and_clean('/'.join([path, directory]))
    elif len(directories) == 1:
        return
    else:
        # Several versions of the same artifact: keep only the latest one
        print('Update ' + path.split(Constants.m2_path)[1])
        for directory in directories:
            if directory == latest_version:
                continue
            print(directory + ' (Has newer version: ' + latest_version + ')')
            if not dry_run:
                shutil.rmtree('/'.join([path, directory]))


def check_if_versions(directories):
    # Return the highest semantic version among the names, or None if a name is not a version
    if len(directories) == 0:
        return None
    latest_version = ''
    for directory in directories:
        try:
            current_version = semver.VersionInfo.parse(directory)
        except ValueError:
            return None
        if latest_version == '':
            latest_version = directory
        if current_version.compare(latest_version) > 0:
            latest_version = directory
    return latest_version


if __name__ == '__main__':
    check_and_clean(Constants.m2_path)
Constants.py (edit to point to your own local Maven repo):
# Paths
m2_path = '/home/jb/.m2/repository/'
Make sure that you have Python 3.6+ installed and that the semver package has been installed into your global environment or venv (use pip install semver if missing).
Run the script with python repo-cleaner.py.
It recursively searches within the local Maven repository you configured (normally ~/.m2/repository), and if it finds a directory where different versions reside, it removes all of them but the newest.
Say you have the following tree somewhere in your local Maven repo:
.
└── antlr
├── 2.7.2
│   ├── antlr-2.7.2.jar
│   ├── antlr-2.7.2.jar.sha1
│   ├── antlr-2.7.2.pom
│   ├── antlr-2.7.2.pom.sha1
│   └── _remote.repositories
└── 2.7.7
├── antlr-2.7.7.jar
├── antlr-2.7.7.jar.sha1
├── antlr-2.7.7.pom
├── antlr-2.7.7.pom.sha1
└── _remote.repositories
Then the script removes version 2.7.2 of antlr and what is left is:
.
└── antlr
└── 2.7.7
├── antlr-2.7.7.jar
├── antlr-2.7.7.jar.sha1
├── antlr-2.7.7.pom
├── antlr-2.7.7.pom.sha1
└── _remote.repositories
Any old versions, even ones that you actively use, will be removed. It can easily be restored with Maven (or other tools that manage dependencies).
You can get a log of what is going to be removed without actually removing it by setting dry_run = True. The output will look like this:
update /org/projectlombok/lombok
1.18.2 (newer version: 1.18.6)
1.16.20 (newer version: 1.18.6)
This means that versions 1.16.20 and 1.18.2 of lombok will be removed and 1.18.6 will be left untouched.
The latest version of the above files can be found on my github.
I came up with a utility, hosted on GitHub, to clean old versions of libraries in the local Maven repository. By default it removes all older versions of artifacts, leaving only the latest ones. Optionally, it can remove all snapshots, sources and javadocs, and specific groups or artifacts can be forced or excluded in the process. This cross-platform tool also supports date-based removal based on last access / download dates.
https://github.com/techpavan/mvn-repo-cleaner
I wanted to remove old dependencies from my Maven repository as well. I thought about just running Florian's answer, but I wanted something that I could run over and over without remembering a long linux snippet, and I wanted something with a little bit of configurability -- more of a program, less of a chain of unix commands, so I took the base idea and made it into a (relatively small) Ruby program, which removes old dependencies based on their last access time.
It doesn't remove "old versions" but since you might actually have two different active projects with two different versions of a dependency, that wouldn't have done what I wanted anyway. Instead, like Florian's answer, it removes dependencies that haven't been accessed recently.
If you want to try it out, you can:
Visit the GitHub repository
Clone the repository, or download the source
Optionally inspect the code to make sure it's not malicious
Run bin/mvnclean
There are options to override the default Maven repository, ignore files, set the threshold date, but you can read those in the README on GitHub.
I'll probably package it as a Ruby gem at some point after I've done a little more work on it, which will simplify matters (gem install mvnclean; mvnclean) if you already have Ruby installed and operational.
Just clean out everything under the .m2/repository folder. When you build a project, all its dependencies are downloaded there.
In your case, maybe your project was previously using an old version of a dependency and the version has now been upgraded, so it is better to clean the .m2 folder and build your project with mvn clean install.
Afterwards, only the latest-version dependencies will be downloaded into this folder.
I spent some hours looking at this problem and at the answers; many of them rely on the atime (the last access time on UNIX systems), which is an unreliable solution for two reasons:
Most UNIX systems (including Linux and macOS) update the atime irregularly at best, and for a reason: a complete implementation of atime would slow down the whole file system, since the atime would have to be updated (i.e. written to disk) every time a file is read; moreover, such an extreme number of writes would rapidly wear out modern, high-performance SSD drives.
In a CI/CD environment, the VM used to build your Maven project will have its Maven repository restored from shared storage, which in turn sets the atime to a "recent" value.
I therefore created a Maven repository cleaner and made it available at https://github.com/alitokmen/maven-repository-cleaner/. The bash maven-repository-cleaner.sh script has one function, cleanDirectory, a recursive function that loops through ~/.m2/repository/ and does the following:
When the subdirectory is not a version number, it digs into that subdirectory for analysis
When a directory has subdirectories which appear to be version numbers, it deletes all but the highest version
In practice, if you have a hierarchy such as:
artifact-group
artifact-name
1.8
1.10
1.2
... maven-repository-cleaner.sh script will:
Navigate to artifact-group
In artifact-group, navigate to artifact-name
In artifact-name, delete the subfolders 1.8 and 1.2, as 1.10 is superior to both 1.2 and 1.8
This is hence very similar to the solutions Andronicus and Pavan Kumar have provided, the difference is that this one is written as a Shell script. To run the tool on your CI/CD platform (or any other form of UNIX system), simply use the below three lines, either at the beginning or at the end of the build:
wget https://raw.githubusercontent.com/alitokmen/maven-repository-cleaner/main/maven-repository-cleaner.sh
chmod +x maven-repository-cleaner.sh
./maven-repository-cleaner.sh
Copy the <dependency> entries your project needs.
With those in hand, clear all the <dependency> tags inside the <dependencies> tag
of your project's pom.xml.
After saving the file you will no longer see Maven Dependencies in your Libraries.
Then paste back the <dependency> entries you copied earlier.
The required jars will be downloaded automatically by Maven; you can see them in
the regenerated Maven Dependencies library after saving the file.
Thanks.

Maven parent pom vs modules pom

There seem to be several ways to structure parent poms in a multiproject build, and I was wondering if anyone has any thoughts on the advantages / drawbacks of each way.
The simplest method of having a parent pom would be to put it in the root of the project, i.e.
myproject/
myproject-core/
myproject-api/
myproject-app/
pom.xml
where the pom.xml is both the parent project and also describes the -core, -api and -app modules
The next method is to separate out the parent into its own subdirectory as in
myproject/
mypoject-parent/
pom.xml
myproject-core/
myproject-api/
myproject-app/
Where the parent pom still contains the modules but they're relative, e.g. ../myproject-core
Finally, there's the option where the module definition and the parent are separated as in
myproject/
mypoject-parent/
pom.xml
myproject-core/
myproject-api/
myproject-app/
pom.xml
Where the parent pom contains any "shared" configuration (dependencyManagement, properties etc.) and the myproject/pom.xml contains the list of modules.
The intention is to be scalable to a large scale build so should be scalable to a large number of projects and artifacts.
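To make the third option concrete, a minimal sketch of how the two POMs might split responsibilities (all coordinates are placeholders):

<!-- myproject/myproject-parent/pom.xml : shared configuration only -->
<groupId>com.example</groupId>
<artifactId>myproject-parent</artifactId>
<version>1.0.0-SNAPSHOT</version>
<packaging>pom</packaging>
<properties>
  <!-- shared properties -->
</properties>
<dependencyManagement>
  <!-- shared dependency versions -->
</dependencyManagement>

<!-- myproject/pom.xml : module list only -->
<artifactId>myproject-reactor</artifactId>
<packaging>pom</packaging>
<modules>
  <module>myproject-parent</module>
  <module>myproject-core</module>
  <module>myproject-api</module>
  <module>myproject-app</module>
</modules>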
A few bonus questions:
Where is the best place to define the various shared configuration as in source control, deployment directories, common plugins etc. (I'm assuming the parent but I've often been bitten by this and they've ended up in each project rather than a common one).
How do the maven-release-plugin, Hudson and Nexus deal with how you set up your multi-projects (possibly a giant question, it's more if anyone has been caught out by how a multi-project build has been set up)?
Edit: Each of the sub-projects has its own pom.xml; I've left them out to keep it terse.
In my opinion, to answer this question, you need to think in terms of project life cycle and version control. In other words, does the parent pom have its own life cycle, i.e. can it be released separately from the other modules or not?
If the answer is yes (and this is the case for most projects mentioned in the question or in comments), then the parent pom needs its own module from a VCS and from a Maven point of view, and you'll end up with something like this at the VCS level:
root
|-- parent-pom
| |-- branches
| |-- tags
| `-- trunk
| `-- pom.xml
`-- projectA
|-- branches
|-- tags
`-- trunk
|-- module1
| `-- pom.xml
|-- moduleN
| `-- pom.xml
`-- pom.xml
This makes the checkout a bit painful and a common way to deal with that is to use svn:externals. For example, add a trunks directory:
root
|-- parent-pom
| |-- branches
| |-- tags
| `-- trunk
| `-- pom.xml
|-- projectA
| |-- branches
| |-- tags
| `-- trunk
| |-- module1
| | `-- pom.xml
| |-- moduleN
| | `-- pom.xml
| `-- pom.xml
`-- trunks
With the following externals definition:
parent-pom http://host/svn/parent-pom/trunk
projectA http://host/svn/projectA/trunk
A checkout of trunks would then result in the following local structure (pattern #2):
root/
parent-pom/
pom.xml
projectA/
Optionally, you can even add a pom.xml in the trunks directory:
root
|-- parent-pom
| |-- branches
| |-- tags
| `-- trunk
| `-- pom.xml
|-- projectA
| |-- branches
| |-- tags
| `-- trunk
| |-- module1
| | `-- pom.xml
| |-- moduleN
| | `-- pom.xml
| `-- pom.xml
`-- trunks
`-- pom.xml
This pom.xml is a kind of "fake" pom: it is never released, so it doesn't contain a real version; it only contains a list of modules. With this file, a checkout would result in this structure (pattern #3):
root/
parent-pom/
pom.xml
projectA/
pom.xml
This "hack" allows to launch of a reactor build from the root after a checkout and make things even more handy. Actually, this is how I like to setup maven projects and a VCS repository for large builds: it just works, it scales well, it gives all the flexibility you may need.
If the answer is no (back to the initial question), then I think you can live with pattern #1 (do the simplest thing that could possibly work).
Now, about the bonus questions:
Where is the best place to define the various shared configuration as in source control, deployment directories, common plugins etc. (I'm assuming the parent but I've often been bitten by this and they've ended up in each project rather than a common one).
Honestly, I don't know how to not give a general answer here (like "use the level at which you think it makes sense to mutualize things"). And anyway, child poms can always override inherited settings.
How do the maven-release-plugin, Hudson and Nexus deal with how you set up your multi-projects (possibly a giant question, it's more if anyone has been caught out by how a multi-project build has been set up)?
The setup I use works well, nothing particular to mention.
Actually, I wonder how the maven-release-plugin deals with pattern #1 (especially with the <parent> section since you can't have SNAPSHOT dependencies at release time). This sounds like a chicken or egg problem but I just can't remember if it works and was too lazy to test it.
From my experience and Maven best practices, there are two kinds of "parent poms":
"Company" parent pom - this pom contains your company-specific information and configuration, which every pom inherits and which doesn't need to be copied. This information includes:
repositories
distribution management sections
common plugin configurations (like maven-compiler-plugin source and target versions)
organization, developers, etc.
Preparing this parent pom needs to be done with caution, because all your company poms will inherit from it, so it has to be mature and stable (releasing a new version of the parent pom should not force you to release all your company projects!). A minimal sketch of such a pom follows.
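A minimal sketch of such a company parent pom (ids, URLs and versions are placeholders):

<!-- company-parent/pom.xml -->
<groupId>com.mycompany.build</groupId>
<artifactId>company-parent</artifactId>
<version>3</version>
<packaging>pom</packaging>

<organization>
  <name>My Company</name>
</organization>

<distributionManagement>
  <repository>
    <id>company-releases</id>
    <url>https://repo.mycompany.example/releases</url>   <!-- placeholder URL -->
  </repository>
  <snapshotRepository>
    <id>company-snapshots</id>
    <url>https://repo.mycompany.example/snapshots</url>
  </snapshotRepository>
</distributionManagement>

<build>
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.11.0</version>
        <configuration>
          <source>11</source>
          <target>11</target>
        </configuration>
      </plugin>
    </plugins>
  </pluginManagement>
</build>

Project parent poms then declare this company pom in their <parent> section and inherit all of it.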
The second kind of parent pom is a multi-module parent. I prefer your first solution - this is the default Maven convention for multi-module projects and very often mirrors the VCS code structure.
The intention is to be scalable to a large scale build so should be scalable to a large number of projects and artifacts.
Multi-projects have a tree structure - so you aren't narrowed down to one level of parent pom. Try to find a suitable project structure for your needs - a classic example is how to distribute multi-module projects:
distibution/
documentation/
myproject/
myproject-core/
myproject-api/
myproject-app/
pom.xml
pom.xml
A few bonus questions:
Where is the best place to define the various shared configuration as in source control, deployment directories, common plugins etc. (I'm assuming the parent but I've often been bitten by this and they've ended up in each project rather than a common one).
This configuration has to be wisely split between a "company" parent pom and project parent pom(s). Things related to all your projects go into the "company" parent, and things related to the current project go into the project one.
How do the maven-release-plugin, Hudson and Nexus deal with how you set up your multi-projects (possibly a giant question, it's more if anyone has been caught out by how a multi-project build has been set up)?
The company parent pom has to be released first. For multi-projects the standard rules apply. The CI server needs to know about all of them to build the project correctly.
An independent parent is the best practice for sharing configuration and options across otherwise uncoupled components. Apache has a parent pom project to share legal notices and some common packaging options.
If your top-level project has real work in it, such as aggregating javadoc or packaging a release, then you will have conflicts between the settings needed to do that work and the settings you want to share out via parent. A parent-only project avoids that.
A common pattern (ignoring #1 for the moment) is to have the projects-with-code use a parent project as their parent, and have that parent use the top-level as its parent. This allows core things to be shared by all, but avoids the problem described in #2.
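A sketch of that chain, with placeholder coordinates (each code module points at the project parent, which in turn points at the top-level, parent-only pom):

<!-- myproject-core/pom.xml -->
<parent>
  <groupId>com.example</groupId>
  <artifactId>myproject-parent</artifactId>
  <version>1.0.0-SNAPSHOT</version>
</parent>
<artifactId>myproject-core</artifactId>

<!-- myproject-parent/pom.xml -->
<parent>
  <groupId>com.example</groupId>
  <artifactId>myproject-top</artifactId>   <!-- the top-level, parent-only pom -->
  <version>1.0.0-SNAPSHOT</version>
</parent>
<artifactId>myproject-parent</artifactId>
<packaging>pom</packaging>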
The site plugin will get very confused if the parent structure is not the same as the directory structure. If you want to build an aggregate site, you'll need to do some fiddling to get around this.
Apache CXF is an example of the pattern in #2.
There is one little catch with the third approach. Since aggregate POMs (myproject/pom.xml) usually don't have a parent at all, they do not share configuration. That means all those aggregate POMs will have only the default repositories.
That is not a problem if you only use plugins from Central; however, it will fail if you run a plugin from your internal repository using the plugin:goal format. For example, you might have foo-maven-plugin with groupId org.example providing the goal generate-foo. If you try to run it from the project root using a command like mvn org.example:foo-maven-plugin:generate-foo, it will fail on the aggregate modules (see compatibility note).
Several solutions are possible:
Deploy the plugin to Maven Central (not always possible).
Specify the repository section in all of your aggregate POMs (breaks the DRY principle).
Have the internal repository configured in settings.xml (either the local settings at ~/.m2/settings.xml or the global settings at ${maven.home}/conf/settings.xml); a sketch follows this list. The build will fail without that settings.xml (which could be OK for large in-house projects that are never supposed to be built outside the company).
Use a parent with repository settings in your aggregate POMs (could be too many parent POMs?).
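A sketch of option 3 (the settings.xml route), with placeholder ids and URLs; the repository is declared in a profile that is always active, so both dependencies and plugins such as org.example:foo-maven-plugin can be resolved from it:

<settings>
  <profiles>
    <profile>
      <id>internal-repo</id>
      <repositories>
        <repository>
          <id>internal</id>
          <url>https://repo.mycompany.example/maven</url>   <!-- placeholder URL -->
        </repository>
      </repositories>
      <pluginRepositories>
        <pluginRepository>
          <id>internal-plugins</id>
          <url>https://repo.mycompany.example/maven</url>
        </pluginRepository>
      </pluginRepositories>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>internal-repo</activeProfile>
  </activeProfiles>
</settings>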
