I am working on my little OSS project, in which I am using Maven as the build tool. I split the project into smaller sub-projects to simplify development, so I have the following structure:
project
+-- main-module
|   |
|   +-- pom.xml
|
+-- submodule1
|   |
|   +-- pom.xml
|
+-- pom.xml
My thought was that main-module should provide interfaces which each submodule implements in order to be plugged into the whole application. Therefore submodule1/pom.xml contains a compile-scope dependency on main-module. In turn, I also need to be able to test the whole application, so main-module/pom.xml contains a test-scope dependency on submodule1. As a result, Maven refuses to compile the projects, saying that they contain cyclic references.
My thought was that Maven could first compile the classes of main-module, since they do not require a compile-time dependency on any of the submodules; then, using the compiled classes of main-module, it could compile the classes of submodule1; and after that it could compile the test classes of main-module (to be able to run the tests). But it seems that the Maven compiler does not take the scope of a dependency into account, and I somehow need to work around that.
The only solution I can see is to move the tests out of main-module, which doesn't really make sense to me, as only that module provides the main logic.
My question: is there any other way around this issue besides moving the tests away? Or is something wrong with my understanding of how the Maven reactor should work?
Instead of moving your tests away, you could move all of your API into its own module. Then your main module would contain the application, and you can freely distribute your application's API to allow others to access it. If they want to develop new functionality, they do not necessarily need the sources of your app.
I consider this a much better style, because submodules with specific functionality can now clearly separate between what is your application's API and what is the code your application needs to start up, shut down, etc.
This is, to my mind, the intended way for Maven projects to look. It also sticks to the single responsibility principle: the main module's responsibility is to start up, shut down, etc. your application, the API module's responsibility is to show other developers how to access your application, and the other submodules provide specific functionality for your application.
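As a sketch (the group id and module names below are hypothetical), the dependency declarations after extracting the API would look like this, with no cycle anywhere:

```xml
<!-- submodule1/pom.xml: implementations depend only on the API module -->
<dependency>
  <groupId>com.example.project</groupId>
  <artifactId>main-api</artifactId>
  <version>${project.version}</version>
</dependency>

<!-- main-module/pom.xml: the application depends on the API at compile
     scope and can now also depend on submodule1 at test scope -->
<dependency>
  <groupId>com.example.project</groupId>
  <artifactId>submodule1</artifactId>
  <version>${project.version}</version>
  <scope>test</scope>
</dependency>
```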
I know this is more of a comment, but it might provide you with a (not so pretty) solution:
You can set your submodule1 as a dependency of the maven-surefire-plugin (so that the reactor is forced to build it) and then play with its settings, e.g. childDelegation or additionalClasspathElements.
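For illustration only (the coordinates and the relative path are assumptions), that workaround might look like:

```xml
<!-- main-module/pom.xml: force the reactor to build submodule1 first by
     declaring it as a plugin dependency of Surefire, then put its
     compiled classes on the test classpath by hand -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <dependencies>
    <dependency>
      <groupId>com.example.project</groupId>
      <artifactId>submodule1</artifactId>
      <version>${project.version}</version>
    </dependency>
  </dependencies>
  <configuration>
    <additionalClasspathElements>
      <additionalClasspathElement>../submodule1/target/classes</additionalClasspathElement>
    </additionalClasspathElements>
  </configuration>
</plugin>
```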
I have multiple app projects, each with roughly this layout:
example app (Java)
    Java Wrapper with additional functionality
        C++ + Shallow Java Wrapper
2nd example app (Flutter)
    Flutter wrapper
        Java Wrapper with additional functionality
            C++ + Shallow Java Wrapper
3rd example app
    Flutter wrapper
        Java Wrapper with additional functionality
            C++ + Shallow Java Wrapper
All apps share the same main dependency (the Java wrapper with additional functionality) and its dependency tree. I am developing on each app all the way down to the C++ code. The dependencies are managed as git submodules in their respective parent projects.
As there is a high rate of change along the whole chain, I want the final examples to be built for testing from all sources.
I tried several approaches to tying this together into one Gradle build:
1. Preferred (but failing) solution: a settings.gradle in each project, where each project includes only its direct dependencies
Now I want this full tree to be managed in one Gradle build. So I add the direct dependencies in each project's settings.gradle, just to learn that Gradle only supports one top-level settings.gradle. So this does not work. The solutions presented in the aforementioned question mostly try to emulate support for multiple settings.gradle files.
2. Functioning but ugly: all dependency projects are included in the top-level settings.gradle
Do I really have to include all subprojects manually in the top-level settings.gradle, when each of the subprojects knows its dependencies perfectly well? Furthermore, since there are multiple projects depending on this, do I have to do this manually for each of them?
(And don't even get me started about Gradle not telling me I have a wrong projectDir because of a typo in the 100th level of the recursive descent!)
3. Probably Working Solution: Use composite builds
This will trigger the builds, but now I have to resolve the build artifacts instead of the projects. So it's the same problem, just with artifacts.
4. Probably working solution: publish dependency projects to a Maven (or other) repository and pull them into the app
I did not try this because I find the idea abhorrent: I want to test one small change in the C++ code, and now I have to push it to a repository and potentially do the same in every project above?
This works for a stable project, but not for flexible, exploratory development. Sure, I want to publish something at the end, but I don't want to publish every little step in between.
This left me wondering: am I doing something unusual? Is there really nobody else with the same requirements, which Gradle does not seem able to meet:
live updates from all the way down, to quickly test local changes
no repetition of transitive dependencies at the top level
What is the common practice in this case?
After Lukas Körfer's comment I took a closer look at composite builds and noticed that I had a misconception about them: I had not understood that their dependency resolution would find the build artifacts for me.
Now I use the composite builds to tie together the whole build while using
implementation 'my.group:project'
to import the code of the subprojects and
includeBuild '../path/to/subproject/'
to pull them in.
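A minimal sketch of that setup (project names, paths, and coordinates below are invented for illustration):

```groovy
// settings.gradle of the app build
rootProject.name = 'example-app'

// Pull in the subproject's own build; Gradle's composite-build
// dependency substitution maps the subproject's published coordinates
// onto the locally built project automatically.
includeBuild '../path/to/subproject/'
```

```groovy
// build.gradle of the app build
dependencies {
    // Resolved against the included build, not against a repository.
    implementation 'my.group:project'
}
```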
I develop a JavaEE app in two separate maven projects:
- one project is the API
- the other is a JSF app where calls are made to the API of the 1st project.
They both share the same Model (the classes describing my data structure).
How can I have these two projects share the same Model? Should I create a third Maven project containing these common classes, which would then be a dependency of the other two? It seems a bit heavy-handed. Is there a better design?
I guess that your model classes will be compiled into a separate deliverable, for example a JAR file. Since every Maven project object model describes a single deliverable, the most reasonable option for properly reusing the code and including the binary in both web apps is the one you have already proposed.
This way, your project may be like this:
+ pom.xml (parent, "pom" type packaging with 3 sub-modules)
|_ /ModelJAR/pom.xml (model classes, "jar" type packaging)
|_ /API/pom.xml (REST? API, maybe "war" type packaging)
|_ /WebAPP/pom.xml (web application, maybe "war" type packaging)
And as you say, both the API and WebAPP projects would depend on the model project. It may look cumbersome, but (and I admit this is a matter of taste) it states clearly the number, nature, and location of every deliverable in your code. I hope you find this useful.
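For illustration (the group id is hypothetical), both web apps would then declare the same dependency:

```xml
<!-- In API/pom.xml and in WebAPP/pom.xml -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>ModelJAR</artifactId>
  <version>${project.version}</version>
</dependency>
```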
I'm writing a Scala "plugin" to a Java project. (Essentially I'm writing a Scala class that is a subclass of a Java class. It is intended to run in the Java application.) I can create both the Java project and the Scala project in the Scala/eclipse IDE, and I can tell one of them to refer to the other. But when I try to tell each to refer to the other I get an error message saying that I have a cycle in my build path. When I leave out the reference from the Java project to the Scala project (but include the reference from the Scala project to the Java project) I'm able to write the subclass and refer to the required Java classes. But now the Java application can't see the Scala subclass. Is there a good way to accomplish something like this?
Thanks.
In general it is impossible to create cyclic references between projects, and this is quite natural. Suppose you have interdependent projects 'A' and 'B'. To compile 'B' you need 'A' already compiled, but to compile 'A' you need 'B' compiled. How do you break such a loop?
That said, in some particular cases it may be possible. For example, if there are no dependency cycles between the classes in these projects, i.e. no loops like com.a.X -> com.b.Y -> com.b.Z -> com.a.X, then it may be possible to compile both projects in chunks until both of them are fully compiled. But this would require sophisticated algorithms and a dependency tracking system, which is just not worth it.
Plugin-based systems are usually structured in the following way. The main project is separated in two: the actual program and its plugin API (the actual program depends on this API). Then there are plugins which depend on the API but not on the main program. The main program can now depend on the plugins, because there are no loops anymore.
           +--> Plugin 1 --+
Project ---+--> Plugin 2 --+---> API
           +--> Plugin 3 --+
(Project also depends directly on the API.)
Even better is to link the program with the plugins at runtime. Then Project has no compile-time dependencies on the plugins at all. A runtime system (e.g. OSGi or JBoss Modules, or something custom) becomes responsible for finding and loading plugins at runtime. This approach will require some changes in the build system and in the way you launch your project, but it may be worth it. That depends on your actual requirements, of course.
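As a minimal sketch of the runtime-linking idea using only the JDK (the Plugin interface here is hypothetical): the host compiles against the API alone, and implementations are discovered at runtime via java.util.ServiceLoader, provided they register themselves in a META-INF/services provider file.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.ServiceLoader;

// Hypothetical plugin contract; in a real project this would live in
// the separate API module that both the host and the plugins depend on.
interface Plugin {
    String name();
}

public class PluginHost {
    // Collect the names of all Plugin implementations registered via
    // META-INF/services provider files on the classpath.
    static List<String> discoverPluginNames() {
        List<String> names = new ArrayList<>();
        for (Plugin p : ServiceLoader.load(Plugin.class)) {
            names.add(p.name());
        }
        return names;
    }

    public static void main(String[] args) {
        System.out.println("Plugins found: " + discoverPluginNames());
    }
}
```

With no providers registered the list is simply empty; dropping a plugin jar onto the classpath makes it appear without recompiling the host.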
This StackOverflow question seems to solve my problem. Its solution is to "Add Scala Nature" to the Java project: right click the project and select Configure > Add Scala Nature. After doing that it's possible to include Scala files within a Java project! That's it.
I have a multi-module Maven project structured something like this:
parent
|
|-- presentation
|-- services
|   |-- services-api
|   |-- services-impl
|-- data-access
|   |-- data-access-api
|   |-- data-access-impl
|-- connector
|   |-- connector-api
|   |-- connector-implA
|   |-- connector-implB
|-- ...
The presentation module is packaged in a war and it depends only on the api modules.
When I run the install goal, the only dependencies the war installs are the api modules. To choose which impl modules to install in the presentation module, I'm using profiles that add the dependency on the impl modules at build time, depending on the profiles selected.
From what I've been reading, I don't think this is correct usage of Maven profiles.
What is the best way to tell Maven to add a chosen impl to the presentation module?
I make the same use of profiles, but only for specific changes (mostly dependencies).
You do not have to put everything in profiles. Most of the implementation dependencies are common and are therefore declared directly, without profiles.
Depending on the targeted application server, I use profiles to override properties, add specific dependencies (CommonJ for WebSphere, for instance), and so on.
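For instance (the profile id and coordinates are invented for illustration), a profile that only adds a server-specific dependency:

```xml
<profiles>
  <profile>
    <id>websphere</id>
    <dependencies>
      <!-- Only pulled in when building with -Pwebsphere -->
      <dependency>
        <groupId>com.example</groupId>
        <artifactId>commonj-support</artifactId>
        <version>1.0</version>
      </dependency>
    </dependencies>
  </profile>
</profiles>
```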
I got a solution from the Maven users mailing list that I think is the right way to use Maven in my scenario.
I use runtime dependencies for the impl modules and one war project for each implementation of the api. Using war overlays, Maven merges the resources and lets me run the application with the correct module implementations, depending on which war I deploy.
What's the best way to setup Maven for a project that has a SmartClient architecture? Consider the following packages:
myproject.core
myproject.server
myproject.client
Of course there are several sub-packages in each. Client and Server both use core. I see two main options:
Make an uber-POM in myproject to cover all three and have some sort of build parameter to identify what to build.
Make a POM in each package above (one for core, another for server and another for client).
Here are the outputs we need to build (at a minimum):
Standalone.jar: A test application that will launch the server and a client.
Server.war: A WAR file that can be deployed to Tomcat.
Client.jar: The SmartClient without any server code.
Is option #1 even possible? If so, is it good practice? From my initial research, option #2 sounds like the best practice. However, jumping from POM to POM when all the code is intimately related sounds like extra work and extra clutter we may not need. Should I just stick with option #2?
Maven has a general rule that there should be only a single artifact per project. In other words, option #1 wouldn't allow you to produce a server.war, a client.jar, etc. without fighting against Maven. This would be a big mess, and you wouldn't be able to take advantage of Maven plugins. No, really, you don't want this. So just go for option #2, with a structure like this (omitting the src directories):
.
|-- core
| `-- pom.xml
|-- server
| `-- pom.xml
|-- client
| `-- pom.xml
`-- pom.xml
Regarding your concern about jumping from POM to POM: just import all the modules into your IDE and you won't really notice it. This works pretty well for lots of people.
UPDATE (to cover questions from the OP in comments):
Fighting against Maven doesn't sound fun.
No, and you will lose :)
What is in the pom.xml at the root level?
This is a parent POM used for Project Aggregation. Quoting the Introduction to the POM document:
Project Aggregation is similar to Project Inheritance. But instead of specifying the parent POM from the module, it specifies the modules from the parent POM. By doing so, the parent project now knows its modules, and if a Maven command is invoked against the parent project, that Maven command will then be executed for the parent's modules as well. To do Project Aggregation, you must do the following:
Change the parent POM's packaging to the value "pom".
Specify in the parent POM the directories of its modules (children POMs).
Project aggregation and project inheritance are often used together. Refer to the mentioned document for more details.
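The root pom.xml would then look roughly like this (the coordinates are placeholders):

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example.myproject</groupId>
  <artifactId>myproject-parent</artifactId>
  <version>1.0-SNAPSHOT</version>
  <!-- "pom" packaging marks this as an aggregator/parent project -->
  <packaging>pom</packaging>
  <modules>
    <module>core</module>
    <module>server</module>
    <module>client</module>
  </modules>
</project>
```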
By "single artifact per project" do you mean that there should be a separate POM for Standalone.jar, Server.war, and Client.jar (three total POMs)?
Yes, this is what I mean: one project generates one artifact (there are some exceptions, but this is true 99% of the time). This is a Maven best practice that you should (must?) follow.
What if I also want a Server.jar, a simple server based with Grizzly included? Wouldn't server need two POM's?
I think the Maven way to handle this would be to use assemblies, and there is no unique answer to your question (this might be one of the exceptions to the rule mentioned above). But this shouldn't prevent you from getting started.
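One possible shape for that (the main class name and versions are made up) is to let the server module additionally produce a self-contained jar with the maven-assembly-plugin:

```xml
<!-- server/pom.xml -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-assembly-plugin</artifactId>
  <configuration>
    <descriptorRefs>
      <!-- Built-in descriptor: one jar bundling all dependencies -->
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
    <archive>
      <manifest>
        <mainClass>com.example.server.Standalone</mainClass>
      </manifest>
    </archive>
  </configuration>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```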
Also, how would one kick off a build that would result in all three artifacts getting produced?
Launch your Maven command from the aggregating project, as we saw above (aka a "multi-module build").