I have the following project hierarchy:
app
|-module1
| |-pom.xml
|-module2
| |-pom.xml
|-pom.xml
Module1 and module2 both copy files to the same target directory, so I'm using the app's pom.xml to clear that directory. My problem is that the current execution order is module1[clean], module1[install], module2[clean], module2[install], app[clean], app[install], so everything module1 and module2 put into that directory gets deleted.
I would like it to execute all the clean phases first, then all the install phases, even when I run mvn clean install. Alternatively, if there is a way to execute app[clean] before module1[install] and module2[install], that would work too.
EDIT
I ended up making a separate module (NetBeans POM project) for cleaning alone. Not the solution I was hoping for, but it works for now.
The root of the problem here is that you're trying to make Maven do something that sort-of contradicts Maven's multi-module "conventions", as well as conflicting with Maven's "understanding" of a "target directory". There is a reason why Maven's reactor is operating the way that it does, and it is to preserve the Maven "spirit" (or "convention") of how modules are structured in a multi-module build.
In Maven, the target directory is supposed to belong only to one project: each project has its own target directory. In your scenario, there should really be a different target directory for app, module1 and module2.
I suppose your best bet, in order to both achieve your objective and keep your build process flexible, is to:
Have module1 output its own JAR into its own target directory (module1/target).
Have module2 output its own JAR into its own target directory (module2/target).
Add a plugin to app (the parent module) that will collect whatever it needs from module1/target and module2/target into app/target, and do whatever processing on those artifacts.
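For step 3, a sketch of what that collection plugin could look like in app/pom.xml, using the maven-dependency-plugin (the groupId com.example is a placeholder, and it assumes module1 and module2 share the parent's version):

```xml
<!-- In app/pom.xml: copy the module jars into app/target after they are built -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>collect-modules</id>
      <phase>package</phase>
      <goals>
        <goal>copy</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>com.example</groupId>
            <artifactId>module1</artifactId>
            <version>${project.version}</version>
          </artifactItem>
          <artifactItem>
            <groupId>com.example</groupId>
            <artifactId>module2</artifactId>
            <version>${project.version}</version>
          </artifactItem>
        </artifactItems>
        <outputDirectory>${project.build.directory}</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```

With this in place, module1/target and module2/target stay untouched, and app/target is the only directory that app itself cleans and rebuilds.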
Related
My project is structured with a parent POM and lots of modules, each having a child POM. Currently I'm doing a Maven build from my parent POM, which builds the entire project. How can I convert this to a delta build, i.e. build only the modules that have changed files?
I'm unable to find any plugins suiting this need.
Another approach is writing a script to find the deltas and build individually.
Can someone please advise.
Thanks in advance.
Slightly unrelated, but if you know which module you are working on, you can use the Maven reactor to build that module with its related dependencies by running:
mvn install -pl :my-module -am
-pl, --projects
Build specified reactor projects instead of all projects
-am, --also-make
If project list is specified, also build projects required by the list
I am currently trying to migrate a multi-app project from Ant to Maven.
At the moment the project consists of multiple packages, forming a kind of dependency tree without circular dependencies. The leaves of this tree are "application" packages, containing a Main. Intermediate nodes are "library" packages, used by other "library" packages or "application" packages.
The nodes are allowed to "grow together" into a single node or leaf.
I figured out that those packages should probably be grouped into Maven modules, and I now have a structure similar to this:
root
- lib1
- lib1A (depends on lib1)
- lib1B (depends on lib1)
- app1A (depends on lib1A)
- lib2 (depends on lib1B)
- lib2A (depends on lib2)
- lib2B (depends on lib2)
- app2 (depends on lib2A and lib2B)
- lib3 (depends on lib2A and lib2B)
- app3A (depends on lib3)
- app3B (depends on lib3)
Basically a library and an application can depend on one or more other libraries.
Now I would like to be able to build each app on its own and create an executable jar for it.
The way I am trying to do it now is the following:
Configure the pom.xml of every app to use the maven-assembly-plugin to create an executable jar.
Build each needed module for a specific app.
Build the app module, which results in an executable jar.
So the build for app2 would build lib1, lib1A and lib1B, lib2, lib2A and lib2B and finally app2.
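The assembly-plugin configuration from step 1 could look roughly like this in each app's pom.xml (a sketch; the main class name is a placeholder):

```xml
<!-- In app2/pom.xml: build an executable "jar-with-dependencies" -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <mainClass>com.example.app2.Main</mainClass>
      </manifest>
    </archive>
  </configuration>
  <executions>
    <execution>
      <id>make-executable-jar</id>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```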
However, to automate the build, I would need to create a build script for every app, which takes care of building all needed dependencies, something Maven should already do by itself.
Also, if I want to build multiple apps at once, I would need to build all libraries multiple times, or track the already built modules by myself.
As I am new to Maven, I am not sure whether that's the correct way to manage such a multi-app project.
So I am asking for some advice on how to correctly manage this use case.
EDIT:
To clarify what I would like to be able to do:
build a single app with its dependencies,
without building all apps (running maven on the parent pom).
build multiple apps (not all) with their dependencies, without building the dependencies multiple times.
If you define the dependencies in the respective POMs and build the whole project (at the root level), then Maven automatically orders the modules topologically, which means it builds every module exactly once and everything is done in the right order.
I am now using a parent project, which defines the maven version, the common dependencies and the common plugin configurations.
The parent project also declares its child modules in <module> tags.
Every child module references this parent project and uses the same version.
To build the applications, I am running Maven inside the parent project, using the -pl and -am flags, as mentioned in this comment.
The -pl flag tells maven to only build the listed modules, instead of building the whole project.
The -am flag tells maven to also build the needed dependencies.
Example:
Given the following structure:
parent
---- lib1
---- lib1A (depends on lib1)
---- lib1B (depends on lib1)
---- lib2 (depends on lib1B)
---- lib2A (depends on lib2)
---- lib2B (depends on lib2)
---- app1A (depends on lib1A)
---- app2A (depends on lib2A)
---- app2B (depends on lib2B)
Executing mvn clean install -pl app1A,app2A -am would build all modules except app2B and lib2B.
Also, the module lib1, which is used by app1A and app2A would only be built once.
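For reference, the parent POM for the structure above could declare its children roughly like this (groupId and version are placeholders):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>parent</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <module>lib1</module>
    <module>lib1A</module>
    <module>lib1B</module>
    <module>lib2</module>
    <module>lib2A</module>
    <module>lib2B</module>
    <module>app1A</module>
    <module>app2A</module>
    <module>app2B</module>
  </modules>
</project>
```

The order of the <module> entries does not matter; Maven sorts them by their declared dependencies.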
Usually, you want to version each module independently, but in our case this would introduce a huge effort, since we have a lot of modules building on top of each other. Small changes in the "lowest" module (lib1 in the example) would cause changes in almost every module.
So in this case we would need to increase every version number and update all referenced dependencies in all modules.
We instead decided to always rebuild all dependencies, resulting in an always up-to-date JAR. That's why we only manage the version in the parent project.
I think with a Maven multi-module build, you just need to navigate to the module that you want to build, then run the Maven command from there. Maven will build that module with its associated dependencies automatically, without building unnecessary modules.
Edited: the original answer was incorrect; after doing some tests I found the following should be helpful:
Maven Modules + Building a Single Specific Module
You just need to add the -pl and -am flags when building a specific module from the parent level.
I have a multi-module Java project (Jodd). The main module, i.e. the root project, is not a Java project; only a large subset of the submodules are Java projects:
root
|-- jodd-core
|-- jodd-bean
...
I wanted to apply bintray plugin. So first thing I did is:
1. Apply bintray to just java submodules.
This worked fine, except that, since bintray is not enabled from the root, I am not able to invoke just:
gradle bintrayUpload
since this task does not exist in the root. Then I tried this:
2. Apply bintray to all modules, including the root.
This worked, except that now on Bintray I have an empty package for the root that does not contain any files.
Question
What would be the right way to upload to bintray? I think I would go with the solution 1 and create my own custom task that will depend on bintrayUpload tasks on all Java modules. Am I missing something?
Solution #1 seems the way to go. I've created a task bintray that depends on all the bintrayUpload tasks from the submodules that have something to publish.
If someone could help me out here it would save me a lot of time.
I maintain an open source library that gets pushed out to a Sonatype repository. I make changes to that library a few times a day and push them out to the 1.0-SNAPSHOT build using mvn deploy.
Let's call it project1
I constantly work in another project that uses that library; let's call it project2.
Right now, whenever I make changes to project1 or project2, I need to first build and deploy project1 to the repo, then build project2 so it downloads a fresh copy of project1.jar.
Project2 has Project1 as a dependency in a pom.xml:
<dependency>
<groupId>com.group</groupId>
<artifactId>project1</artifactId>
<version>1.0-SNAPSHOT</version>
</dependency>
In order to build in a way where all of my changes can be tested, I have to do something like this:
mvn -f ./project1/pom.xml clean deploy
mvn -U -f ./project2/pom.xml clean package
This uploads my project1 jar to Sonatype; then project2 downloads the new snapshot and builds against it.
This is a simplified picture of what I'm doing on a larger scale, where my builds spend about 5 minutes on uploads and downloads.
Question: What is the proper way to use Maven so that it knows to use the source of project1 as a dependency of project2?
IDE:
install m2e in Eclipse
import both of your projects into the workspace
from the consumer project: right click > Maven > Enable Workspace Resolution
this will put project1's classes on project2's classpath from target/classes instead of the actual jar
native straight maven:
You can create a Maven project tree. If these two are open source projects going through the same build cycle, such a tree probably exists already; if they are unrelated but connected for your use case, you can temporarily create a Maven tree on top of the two projects and build the top parent, which builds everything from the bottom up in one command.
It will find the leaf project, build it, and install it into Maven's local cache; then, while building project2, it will resolve project1 from that cache, so there is no need to deploy to Sonatype.
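A minimal aggregator POM placed above the two checkouts could look like this (a sketch; the artifactId is made up):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.group</groupId>
  <artifactId>aggregator</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <module>project1</module>
    <module>project2</module>
  </modules>
</project>
```

Running mvn install from this directory builds project1 first, installs it into the local repository, and then builds project2 against that freshly installed snapshot.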
You can also point project2 to build using offline mode :
mvn -o package
Then you can drop the upload to the remote repo from the project1 build.
Check the following links: Intro to repositories and How do I configure Maven for offline development?
Background
one trunk and one branch in a standard svn layout, /trunk and /branches/prod
about 20 modules in the maven project
When I want to compile trunk, I just type cd trunk; mvn clean package, but it compiles every module. Something similar happens when production needs to be built.
a central Maven repo
Problem
Is it possible to compile only the updated modules? Actually, I want to compile only the modified source files if possible. I know this is a simple thing with Ant or Make; however, a Maven compile always seems to start from scratch.
Should the pom.xml in trunk and prod keep the same artifactId and version? The same artifactId and version cannot work with mvn deploy, because trunk and prod will overwrite each other's deployed packages.
You can try the -am and -pl options of mvn, which limit the build to a given module and the modules it needs.
+-- root (pom.xml)
+-- client (pom.xml) dep: core
+-- server (pom.xml) dep: client
+-- core (pom.xml)
+-- cli (pom.xml) dep:core
If you do a
mvn -am -pl server LifeCycle
only the server module and the modules which are used by the server will be run through the appropriate lifecycle.
mvn -am -pl cli LifeCycle
Only the cli module and, in this case, the core module will be run through the given lifecycle.
mvn compile does not always build from scratch; it only compiles changed Java files. Unless you run mvn clean compile. Then, of course, all projects are cleaned and compiled from scratch.
You can always cd into the project you want to and compile that by itself.
edit Maven does always run all the phases up to the compile phase if you invoke mvn compile, and there may be plugin executions there that are time consuming. But the actual compilation of Java files is not done more than needed. Run mvn compile twice and you will see the message "nothing to compile all classes are up to date".
Or is it "all files are up to date"..?
edit - question nbr 2
Why are you releasing from both trunk and prod if the artifacts generated from the two are not different versions? Yes, they should have different versions, unless you create an experimental branch that you might merge back into trunk and never intend to release on its own. In that case the experimental branch does not need a version number of its own.
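For example, the version elements in the two POMs could simply differ (the values below are illustrative):

```xml
<!-- trunk/pom.xml -->
<version>1.1-SNAPSHOT</version>

<!-- branches/prod/pom.xml -->
<version>1.0.2</version>
```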