There seem to be several ways to structure parent POMs in a multi-project build, and I'm wondering if anyone has thoughts on the advantages and drawbacks of each approach.
The simplest method of having a parent POM is to put it in the root of the project, i.e.
myproject/
    myproject-core/
    myproject-api/
    myproject-app/
    pom.xml
where the pom.xml is both the parent project and the aggregator that declares the -core, -api and -app modules.
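As a rough sketch (the groupId and version are made up for illustration), that single pom.xml is both the parent and the aggregator:

<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>myproject</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>pom</packaging>

    <!-- aggregator part: the modules to build -->
    <modules>
        <module>myproject-core</module>
        <module>myproject-api</module>
        <module>myproject-app</module>
    </modules>

    <!-- parent part: configuration inherited by every module -->
    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    </properties>
</project>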
The next method is to separate out the parent into its own subdirectory as in
myproject/
    myproject-parent/
        pom.xml
    myproject-core/
    myproject-api/
    myproject-app/
Where the parent POM still lists the modules, but with relative paths, e.g. ../myproject-core
Finally, there's the option where the module definition and the parent are separated as in
myproject/
    myproject-parent/
        pom.xml
    myproject-core/
    myproject-api/
    myproject-app/
    pom.xml
Where the parent POM contains any "shared" configuration (dependencyManagement, properties, etc.) and myproject/pom.xml contains the list of modules.
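To illustrate (the coordinates and the managed dependency are made up), myproject-parent/pom.xml would carry only the shared configuration:

<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>myproject-parent</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>pom</packaging>

    <properties>
        <junit.version>4.13.2</junit.version>
    </properties>

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <version>${junit.version}</version>
                <scope>test</scope>
            </dependency>
        </dependencies>
    </dependencyManagement>
</project>

The aggregator at myproject/pom.xml then contains only a <modules> list (and, typically, no <parent> at all).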
The intention is for this to scale to a large build, i.e. to a large number of projects and artifacts.
A few bonus questions:
Where is the best place to define the various shared configuration, such as source control details, deployment directories, common plugins, etc.? (I'm assuming the parent, but I've often been bitten by this and they've ended up in each project rather than a common one.)
How do the maven-release-plugin, Hudson and Nexus deal with how you set up your multi-projects? (Possibly a giant question; it's more whether anyone has been caught out by how a multi-project build has been set up.)
Edit: Each of the sub-projects has its own pom.xml; I've left them out to keep it terse.
In my opinion, to answer this question you need to think in terms of project life cycle and version control. In other words, does the parent POM have its own life cycle, i.e. can it be released separately from the other modules or not?
If the answer is yes (and this is the case for most projects that have been mentioned in the question or in comments), then the parent POM needs its own module from both a VCS and a Maven point of view, and you'll end up with something like this at the VCS level:
root
|-- parent-pom
| |-- branches
| |-- tags
| `-- trunk
| `-- pom.xml
`-- projectA
|-- branches
|-- tags
`-- trunk
|-- module1
| `-- pom.xml
|-- moduleN
| `-- pom.xml
`-- pom.xml
This makes the checkout a bit painful and a common way to deal with that is to use svn:externals. For example, add a trunks directory:
root
|-- parent-pom
| |-- branches
| |-- tags
| `-- trunk
| `-- pom.xml
|-- projectA
| |-- branches
| |-- tags
| `-- trunk
| |-- module1
| | `-- pom.xml
| |-- moduleN
| | `-- pom.xml
| `-- pom.xml
`-- trunks
With the following externals definition:
parent-pom http://host/svn/parent-pom/trunk
projectA http://host/svn/projectA/trunk
A checkout of trunks would then result in the following local structure (pattern #2):
root/
    parent-pom/
        pom.xml
    projectA/
Optionally, you can even add a pom.xml in the trunks directory:
root
|-- parent-pom
| |-- branches
| |-- tags
| `-- trunk
| `-- pom.xml
|-- projectA
| |-- branches
| |-- tags
| `-- trunk
| |-- module1
| | `-- pom.xml
| |-- moduleN
| | `-- pom.xml
| `-- pom.xml
`-- trunks
`-- pom.xml
This pom.xml is a kind of "fake" POM: it is never released and doesn't need a real version; it only contains a list of modules. With this file, a checkout would result in this structure (pattern #3):
root/
    parent-pom/
        pom.xml
    projectA/
        pom.xml
This "hack" allows you to launch a reactor build from the root after a checkout, which makes things even more convenient. Actually, this is how I like to set up Maven projects and a VCS repository for large builds: it just works, it scales well, and it gives you all the flexibility you may need.
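For what it's worth, a minimal sketch of that "fake" trunks/pom.xml (coordinates are placeholders) would be nothing more than pom packaging plus the module list:

<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.example</groupId>
    <artifactId>trunks</artifactId>
    <!-- the version is irrelevant since this POM is never released -->
    <version>1.0-SNAPSHOT</version>
    <packaging>pom</packaging>

    <modules>
        <module>parent-pom</module>
        <module>projectA</module>
    </modules>
</project>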
If the answer is no (back to the initial question), then I think you can live with pattern #1 (do the simplest thing that could possibly work).
Now, about the bonus questions:
Where is the best place to define the various shared configuration, such as source control details, deployment directories, common plugins, etc.? (I'm assuming the parent, but I've often been bitten by this and they've ended up in each project rather than a common one.)
Honestly, I can't give anything but a general answer here (like "use the level at which you think it makes sense to share things"). And anyway, child POMs can always override inherited settings.
How do the maven-release-plugin, Hudson and Nexus deal with how you set up your multi-projects? (Possibly a giant question; it's more whether anyone has been caught out by how a multi-project build has been set up.)
The setup I use works well, nothing particular to mention.
Actually, I wonder how the maven-release-plugin deals with pattern #1 (especially with the <parent> section, since you can't have SNAPSHOT dependencies at release time). This sounds like a chicken-and-egg problem, but I just can't remember whether it works, and I was too lazy to test it.
In my experience, and following Maven best practices, there are two kinds of "parent POMs":
A "company" parent POM - this POM contains company-specific information and configuration that every POM inherits and that doesn't need to be copied. This information includes:
repositories
distribution management section
common plugin configuration (like maven-compiler-plugin source and target versions)
organization, developers, etc.
Preparing this parent POM needs to be done with caution, because all your company POMs will inherit from it, so it has to be mature and stable (releasing a new version of the parent POM should not force you to release all your company projects!).
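A hedged sketch of such a company parent (ids, URLs and versions below are placeholders, not a recommendation):

<project xmlns="http://maven.apache.org/POM/4.0.0">
    <modelVersion>4.0.0</modelVersion>
    <groupId>com.mycompany</groupId>
    <artifactId>company-parent</artifactId>
    <version>1</version>
    <packaging>pom</packaging>

    <organization>
        <name>My Company</name>
    </organization>

    <!-- where releases and snapshots get deployed -->
    <distributionManagement>
        <repository>
            <id>company-releases</id>
            <url>http://nexus.mycompany.example/repository/releases</url>
        </repository>
        <snapshotRepository>
            <id>company-snapshots</id>
            <url>http://nexus.mycompany.example/repository/snapshots</url>
        </snapshotRepository>
    </distributionManagement>

    <!-- common plugin configuration, applied only when a child uses the plugin -->
    <build>
        <pluginManagement>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-compiler-plugin</artifactId>
                    <version>3.8.1</version>
                    <configuration>
                        <source>1.8</source>
                        <target>1.8</target>
                    </configuration>
                </plugin>
            </plugins>
        </pluginManagement>
    </build>
</project>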
The second kind of parent POM is a multi-module parent. I prefer your first solution - it is the default Maven convention for multi-module projects and very often mirrors the VCS code structure.
The intention is for this to scale to a large build, i.e. to a large number of projects and artifacts.
Multi-projects have a tree structure - so you aren't narrowed down to one level of parent POM. Try to find a suitable project structure for your needs - a classic example is how to distribute multi-module projects:
distribution/
documentation/
myproject/
    myproject-core/
    myproject-api/
    myproject-app/
    pom.xml
pom.xml
A few bonus questions:
Where is the best place to define the various shared configuration, such as source control details, deployment directories, common plugins, etc.? (I'm assuming the parent, but I've often been bitten by this and they've ended up in each project rather than a common one.)
This configuration has to be wisely split between the "company" parent POM and the project parent POM(s). Things related to all your projects go into the "company" parent, and things related to the current project go into the project parent.
How do the maven-release-plugin, Hudson and Nexus deal with how you set up your multi-projects? (Possibly a giant question; it's more whether anyone has been caught out by how a multi-project build has been set up.)
The company parent POM has to be released first. For multi-projects, the standard rules apply. The CI server needs to know about all of them to build the project correctly.
An independent parent is the best practice for sharing configuration and options across otherwise uncoupled components. Apache has a parent pom project to share legal notices and some common packaging options.
If your top-level project has real work in it, such as aggregating javadoc or packaging a release, then you will have conflicts between the settings needed to do that work and the settings you want to share out via parent. A parent-only project avoids that.
A common pattern (ignoring #1 for the moment) is to have the projects-with-code use a parent project as their parent, and have that parent use the top-level as its parent. This allows core things to be shared by all, but avoids the problem described in #2.
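A rough sketch of that layering, with made-up coordinates: the code modules name the shared parent as their parent, and the shared parent in turn names the top-level aggregator as its parent.

myproject-core/pom.xml:

    <parent>
        <groupId>com.example</groupId>
        <artifactId>myproject-parent</artifactId>
        <version>1.0-SNAPSHOT</version>
    </parent>

myproject-parent/pom.xml:

    <parent>
        <!-- the top-level aggregator -->
        <groupId>com.example</groupId>
        <artifactId>myproject</artifactId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <packaging>pom</packaging>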
The site plugin will get very confused if the parent structure is not the same as the directory structure. If you want to build an aggregate site, you'll need to do some fiddling to get around this.
Apache CXF is an example of the pattern in #2.
There is one little catch with the third approach. Since aggregate POMs (myproject/pom.xml) usually don't have a parent at all, they do not share configuration. That means all those aggregate POMs will only have the default repositories.
That is not a problem if you only use plugins from Central; however, this will fail if you run a plugin from your internal repository using the plugin:goal format. For example, you could have a foo-maven-plugin with the groupId org.example providing the goal generate-foo. If you try to run it from the project root using a command like mvn org.example:foo-maven-plugin:generate-foo, it will fail to run on the aggregate modules (see the compatibility note).
Several solutions are possible:
Deploy the plugin to Maven Central (not always possible).
Specify a repositories section in all of your aggregate POMs (breaks the DRY principle).
Have the internal repository configured in settings.xml, either in the local settings at ~/.m2/settings.xml or in the global settings in the Maven installation's conf/settings.xml (see the sketch after this list). This will make the build fail without that settings.xml, which could be OK for large in-house projects that are never supposed to be built outside of the company.
Use a parent with the repositories settings in your aggregate POMs (could lead to too many parent POMs?).
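For the settings.xml option, a minimal sketch (the repository id and URL are placeholders) would be:

<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
    <profiles>
        <profile>
            <id>internal-repo</id>
            <!-- makes the internal plugin repository visible to every build -->
            <pluginRepositories>
                <pluginRepository>
                    <id>internal</id>
                    <url>http://repo.example.com/maven2</url>
                </pluginRepository>
            </pluginRepositories>
        </profile>
    </profiles>
    <activeProfiles>
        <activeProfile>internal-repo</activeProfile>
    </activeProfiles>
</settings>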
I'm trying to organise my maven project.
Let's say my project is called "awesome". "awesome" has several artifacts, each of them built differently (e.g., some of them may be built with one plugin, others with other plugins). In general these build configurations are finite and limited (let's say there are at most 3 different ways to build an artifact); however, each artifact can only be built in exactly one way (e.g., the utility artifact is built with maven-jar-plugin configured in a particular way, while the client-ui artifact is built with maven-war-plugin configured in a particular way).
Now, I know I could organize the maven project as follows:
awesome-root
|---jars
| |--- utility
| |--- client-model
| |--- task-model
| |--- supplier-model
| |--- client-logic
| |--- task-logic
| ---- supplier-logic
|---wars
|--- client-ui
|--- task-ui
---- supplier-ui
This way, each particular build configuration can be put inside the build --> plugins section of the jars and wars projects, while general properties / dependencyManagement / pluginManagement can go in awesome-root.
Problem:
I quickly realized that the developers generate artifacts that are closely related to each other but have different builds. In the previous example, we can see that the artifacts could instead be grouped this way:
awesome-root
|--- tasks
| |--- task-model
| |--- task-logic
| ---- task-ui
|--- clients
| |--- client-model
| |--- client-logic
| ---- client-ui
|--- supplier
| |--- supplier-model
| |--- supplier-logic
| ---- supplier-ui
|--- others
|--- utility
The main advantage of this grouping is that tasks, clients and suppliers are 3 different, independent software sectors. When a developer needs to make a change in, say, the client sector, she has everything she needs in a small part of the file system (or in the project explorer tab of an IDE, like Eclipse). Conversely, in the first mapping, the client software sector is scattered across the project repository.
While this may not be a big deal, if the "awesome" project starts to get really big, with a lot of artifacts and so on, finding all the related parts of the client sector starts to be annoying (not impossible; IDEs offer searches for this purpose).
I'd say the second structure is much better, developer wise.
However, it seems difficult to implement this strategy in Maven: the main difficulty is deciding where to put the different build configurations for each artifact (e.g., *-ui needs to be built in a different way than *-model).
One may be tempted to put such configurations in client-ui, client-logic, client-model, but this would mean duplicating build configuration everywhere (e.g., client-ui, supplier-ui, task-ui have the same build configuration): if a build configuration needs to be changed, you need to change all the other copies;
Another solution might be to declare pluginManagement in awesome-root and then write the plugin definition in each module (see the sketch after this list): while this seems better, it still suffers from the same duplication problem as option 1;
Use an archetype to generate POMs with the correct build configuration: same problem as above;
Profiles: profiles are not inherited, and their activation depends only on system properties, not Maven ones.
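As a sketch of the second option above (using the question's hypothetical module names; the plugin version and configuration are just examples), awesome-root would declare the war build once under pluginManagement, and each *-ui module would still have to reference the plugin:

awesome-root/pom.xml:

    <build>
        <pluginManagement>
            <plugins>
                <plugin>
                    <groupId>org.apache.maven.plugins</groupId>
                    <artifactId>maven-war-plugin</artifactId>
                    <version>3.2.3</version>
                    <configuration>
                        <failOnMissingWebXml>false</failOnMissingWebXml>
                    </configuration>
                </plugin>
            </plugins>
        </pluginManagement>
    </build>

client-ui/pom.xml (and likewise task-ui, supplier-ui):

    <packaging>war</packaging>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-war-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

Each *-ui POM still repeats the plugin reference, which is exactly the duplication the question is pointing at.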
My questions are:
Is the second structure impossible to achieve in Maven? Is there a way?
If not, do I need to bite the bullet and settle on the first structure?
Is there any alternative? (I'm trying not to propose a XY problem, any alternative is appreciated);
Additional information:
OS: Ubuntu 18.04.3 (bionic), 64 bit
java version: openjdk 11.0.4 2019-07-16
IDE: Eclipse 4.10.0
m2e plugin: 1.10.0.20181127-2120
Thanks for any kind reply
In the project I am currently working on, we have different Maven projects in different SVN directories, something like this:
(simplified)
...service/rest-api/trunk/project1
...service/common/trunk/project2
...service/common/trunk/parent-aggregator
The last one (parent-aggregator) is a maven pom project that contains the shared dependencies and the multi-module configuration.
Since I am using the Eclipse SVN client (Subclipse), I can import all those projects into my Eclipse workspace with all the projects in the same directory, so the configuration I created can use relative paths:
parent-aggregator pom.xml:
<modules>
    <module>../project1</module>
    <module>../project2</module>
</modules>
The issue came when one of my colleagues checked out the projects using the TortoiseSVN client and then imported them into Eclipse; TortoiseSVN replicates the SVN directory structure in his local file system.
So whenever he tries to run mvn clean install on the parent-aggregator, it fails because project1 is not reachable. This makes sense, as on his machine the projects are not in the same directory.
Is there a way to reference modules so both structures can work?
So far I have tried using the artifactId:
<module>artifactId</module>
it doesn't work.
I also tried adding the project name, by first defining:
<name>project1</name>
inside project1 pom.xml
And then referencing it in module:
<module>project1</module>
But it keeps telling me that it cannot find the child module.
The temporary solution we are using is to have a different relative path on his local machine:
<modules>
    <module>../../../rest-api/trunk/project1</module>
    <module>../project2</module>
</modules>
I don't like this at all, as we should have a single approach that we can keep in SVN.
The first thing you should do is change your structure, in SVN as well, because a multi-module build expresses that those modules belong together, so your directory structure should reflect that:
+--- root (pom.xml)
     +--- mod-rest-api (pom.xml)
     +--- mod-war (pom.xml)
     +--- mod-p1 (pom.xml)
If you change your project according to the above, you only need entries like this:
<modules>
    <module>mod-rest-api</module>
    <module>mod-war</module>
    <module>mod-p1</module>
</modules>
This simplifies your module entries, and you don't need relativePath entries for your parents.
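For example, a child such as mod-rest-api would then only need a parent declaration like this (coordinates are placeholders), because Maven's default relativePath of ../pom.xml already resolves to the root POM:

    <parent>
        <groupId>com.example</groupId>
        <artifactId>root</artifactId>
        <version>1.0-SNAPSHOT</version>
        <!-- no <relativePath> needed: the default ../pom.xml points at the root POM -->
    </parent>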
Furthermore, you can have the same structure in SVN as well:
URL/project/trunk
  +--- root (pom.xml)
       +--- mod-rest-api (pom.xml)
       +--- mod-war (pom.xml)
       +--- mod-p1 (pom.xml)
So you have URL/project/tags and URL/project/branches
Having .. entries in your modules is, from my point of view, a build smell that indicates something is wrong with your folder structure in relation to the project architecture.
I have the following project hierarchy:
app
|-module1
| |-pom.xml
|-module2
| |-pom.xml
|-pom.xml
Module1 and module2 both copy files to the same target directory, so I'm using the app's pom.xml to clear that directory. My problem is that the execution order right now is module1[clean], module1[install], module2[clean], module2[install], app[clean], app[install], so everything module1 and module2 put into that directory gets deleted.
I would like it to execute all the clean goals first, then all the installs, even when I run mvn clean install. Or, if there is another way to execute app[clean] before module1[install] and module2[install], that would work too.
EDIT
I ended up making a separate module (NetBeans POM project) for cleaning alone. Not the solution I was hoping for, but it works for now.
The root of the problem here is that you're trying to make Maven do something that sort-of contradicts Maven's multi-module "conventions", as well as conflicting with Maven's "understanding" of a "target directory". There is a reason why Maven's reactor is operating the way that it does, and it is to preserve the Maven "spirit" (or "convention") of how modules are structured in a multi-module build.
In Maven, the target directory is supposed to belong only to one project: each project has its own target directory. In your scenario, there should really be a different target directory for app, module1 and module2.
I suppose your best bet, in order to both achieve your objective and keep your build process flexible, is to:
Have module1 output its own JAR into its own target directory (module1/target).
Have module2 output its own JAR into its own target directory (module2/target).
Add a plugin to app (the parent module) that collects whatever it needs from module1/target and module2/target into app/target, and does whatever processing is required on those artifacts (a sketch follows).
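One hedged way to do that collection step (the module names come from the question; the plugin setup below is an assumption, not the only option) is the maven-dependency-plugin's copy goal. In practice this kind of step often sits in a dedicated distribution module that declares module1 and module2 as dependencies, so the reactor builds them before the collection runs:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <version>3.1.1</version>
    <!-- if placed in the parent POM, keep it from running in every module -->
    <inherited>false</inherited>
    <executions>
        <execution>
            <id>collect-modules</id>
            <phase>package</phase>
            <goals>
                <goal>copy</goal>
            </goals>
            <configuration>
                <artifactItems>
                    <artifactItem>
                        <groupId>${project.groupId}</groupId>
                        <artifactId>module1</artifactId>
                        <version>${project.version}</version>
                    </artifactItem>
                    <artifactItem>
                        <groupId>${project.groupId}</groupId>
                        <artifactId>module2</artifactId>
                        <version>${project.version}</version>
                    </artifactItem>
                </artifactItems>
                <!-- gathers the module JARs here instead of sharing one target directory -->
                <outputDirectory>${project.build.directory}/collected</outputDirectory>
            </configuration>
        </execution>
    </executions>
</plugin>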
I have a multi-module Maven project in a Subversion repository with many developers working on it with Eclipse + M2Eclipse. Now if a developer adds a module, others need to do an SVN update from the command line (as Eclipse doesn't see the common root of the Maven project), and import the new module manually as an Eclipse project.
Is there a way to do this automatically?
My project structure looks like this:
Working Copy                Eclipse Workspace
working copy root     -X->
+- parent             --->  +- parent
|  \- pom.xml               |  \- pom.xml
+- child1             --->  +- child1
|  \- pom.xml               |  \- pom.xml
+- child2             --->  +- child2
   \- pom.xml                  \- pom.xml
You can have a POM in the root that has parent, child1, child2, etc. as modules. After an SVN update, if a new module was added, you can run
mvn eclipse:clean eclipse:m2eclipse
from the Eclipse tools button (to the right of the debug and run buttons).
If you are using TortoiseSVN, you can set a post-update client-side hook, but each developer will have to set it up independently.
Maybe the Buckminster project can help you; check its FAQ. Hope it helps.
Is there a way to do this automatically?
To do what? To avoid importing the new module manually as an Eclipse project? AFAIK, this is currently not supported; you'll have to add it manually (it should be possible to do it programmatically, though; there is such a request for the Maven Eclipse plugin, MECLIPSE-75, but I couldn't find one for Maven Integration for Eclipse).
That said, does adding a module really happen that often? Your situation might be different but, in my experience, you'll reach a stable point quite fast and adding a module will become unusual.
Nevertheless, good team communication is the best solution I have found to deal with this. When a developer adds a new module, it is their duty to let the other team members know that they introduced a change and to describe the steps required to take the modification into account. Nobody is omniscient, nobody can read others' minds; active communication is the key to good collaboration.
You can also use the maven-eclipse-plugin:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-eclipse-plugin</artifactId>
    <configuration>
        <wtpversion>2.0</wtpversion>
        <projectNameTemplate>[artifactId]-[version]</projectNameTemplate>
    </configuration>
</plugin>
Refresh the project after importing it from SVN.
Or you can right-click the project and click "Enable Dependency Management".
What's the best way to set up Maven for a project that has a SmartClient architecture? Consider the following packages:
myproject.core
myproject.server
myproject.client
Of course there are several sub-packages in each. Client and Server both use core. I see two main options:
Make an uber-POM in myproject to cover all three and have some sort of build parameter to identify what to build.
Make a POM in each package above (one for core, another for server and another for client).
Here are the outputs we need to build (at a minimum):
Standalone.jar: A test application that will launch the server and a client.
Server.war: A WAR file that can be deployed to Tomcat.
Client.jar: The SmartClient without any server code.
Is option #1 even possible? If so, is it good practice? From my initial research, option #2 sounds like best practice. However, jumping from POM to POM when all the code is intimately related sounds like extra work and extra clutter we may not need. Should I just stick with option #2?
Maven has a general rule that there should be only a single artifact per project. In other words, option #1 wouldn't allow you to produce a server.war, a client.jar, etc. without fighting against Maven. This would be a big mess and you wouldn't be able to take advantage of Maven plugins. No, really, you don't want this. So just go for option #2, with a structure like this (omitting the src directories):
.
|-- core
| `-- pom.xml
|-- server
| `-- pom.xml
|-- client
| `-- pom.xml
`-- pom.xml
Regarding your concern about jumping from POM to POM, well, just import all modules into your IDE and you won't really notice it. This just works pretty well for lots of people.
UPDATE (to cover questions from the OP in comments):
Fighting against Maven doesn't sound fun.
No, and you will lose :)
What is in the pom.xml at the root level?
This is a parent POM used for Project Aggregation. Quoting the Introduction to the POM document:
Project Aggregation is similar to Project Inheritance. But instead of specifying the parent POM from the module, it specifies the modules from the parent POM. By doing so, the parent project now knows its modules, and if a Maven command is invoked against the parent project, that Maven command will then be executed to the parent's modules as well. To do Project Aggregation, you must do the following:
Change the parent POM's packaging to the value "pom".
Specify in the parent POM the directories of its modules (children POMs).
Project aggregation and project inheritance are often used together. Refer to the mentioned document for more details.
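To make that concrete for this project (the groupId and version are made up), the root POM aggregates the modules while each module inherits from it, e.g.:

pom.xml (root):

    <groupId>com.example</groupId>
    <artifactId>myproject</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>pom</packaging>
    <modules>
        <module>core</module>
        <module>server</module>
        <module>client</module>
    </modules>

server/pom.xml:

    <parent>
        <groupId>com.example</groupId>
        <artifactId>myproject</artifactId>
        <version>1.0-SNAPSHOT</version>
    </parent>
    <artifactId>server</artifactId>
    <packaging>war</packaging>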
By "single artifact per project" do you mean that there should be a separate POM for Standalone.jar, Server.war, and Client.jar (three total POMs)?
Yes, this is what I mean: one project generates one artifact (there are some exceptions, but this is true 99% of the time). This is a Maven best practice that you should (must?) follow.
What if I also want a Server.jar, a simple server based with Grizzly included? Wouldn't server need two POM's?
I think the Maven way to handle this would be to use assemblies, and there is no single answer to your question (this might be one of the exceptions to the rule mentioned above). But this shouldn't prevent you from getting started.
Also, how would one kick off a build that would result in all three artifacts getting produced?
Launch your Maven command from the aggregating project, as we saw (a.k.a. a "multi-module build").