Using Maven to rebuild local sources from other projects in a dev environment - java

We have a microservice architecture where the entire project looks as follows:
+- Utilities
+- Service A REST
+- Service A Backend
+- Service B REST
+- Service B Backend
...
+- Service X REST
+- Service X Backend
Each of these is an independent Maven project that can be independently developed.
It so happens, of course, that some of the projects may have to use classes from another one (e.g. to be able to return the corresponding Exception classes in error messages).
Thus one of the projects may have the following dependencies:
Service A Backend
+- dependency 1
+- dependency 2
...
+- Utilities
\- Service B Backend
In a standard deployment, we would use a Maven repository and simply add the latest jars as dependencies. The problem is during development: if we make a change in Service B Backend, a dependency of Service A Backend, we cannot simply run mvn compile on Service A Backend, because Service B Backend will not be recompiled. Doing this manually for every single project during development is extremely error-prone.
Eclipse might be able to work around this by having the project on the build path, but we do not want to bind ourselves to an IDE and would ideally like to solve the issue with Maven itself.
Can Maven be used in the above scenario so that Service B Backend is listed as a source dependency, meaning that when we compile or package Service A Backend, its local source dependencies also get recompiled if there have been any changes? If not, can it be done with Gradle or Ant?

You should consider looking at the Maven modules feature. Here is an example of use in a POM file:
<modules>
  <module>A</module>
  ...
</modules>
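For the project above, a minimal sketch of such an aggregator POM could look like this (the directory names and coordinates are assumptions, not taken from the question):
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>services-aggregator</artifactId>
  <version>1.0-SNAPSHOT</version>
  <!-- packaging "pom" marks this as an aggregator rather than a deployable artifact -->
  <packaging>pom</packaging>
  <modules>
    <module>utilities</module>
    <module>service-a-rest</module>
    <module>service-a-backend</module>
    <module>service-b-rest</module>
    <module>service-b-backend</module>
  </modules>
</project>
With that in place, running mvn -pl service-a-backend -am install from the root builds Service A Backend together with the reactor modules it depends on (-am stands for "also make"), so a change in Service B Backend is picked up without installing it separately.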
In addition to Maven modules (for compilation), I personally use Git submodules to pull/push the code base across multiple projects.

Related

cloudbuild.yaml for javafx (jdk8u201) with other dependencies

I'm setting up a Git repo with different modules of a project. It has some legacy code, some C/C++ for Arduino, and a JavaFX project with some dependencies and Kotlin files in it.
What I actually need is to build that JavaFX project on pull requests targeting the develop branch.
I already have an empty cloudbuild.yaml in my repository root. What I want is non-Docker continuous integration: on a pull request I need an artifact build, so that the executable can be downloaded by the other project members. GitHub and Google Cloud are connected; only the config is needed.
Another specific point is that I want to build with jdk8u201 (because of the licensing).
The folder structure is something like that:
+- legacy
+- Arduino_codes
+- JavaFX_project
| +- FILES...
+- cloudbuild.yaml
+- .git
If it is possible, it would be great if the built version were downloadable, or stored in a specific place in the repository.
Try adding :3.5.0-jdk-8 to gcr.io/cloud-builders/mvn if you use Maven, giving gcr.io/cloud-builders/mvn:3.5.0-jdk-8.
It helped me. You can find more info and examples here.

Maven compile multi-module project with cyclic dependencies

I am working on my little OSS project in which I am using Maven as the build tool. I have split the project into smaller sub-projects to simplify development. Thus I have the following structure:
project
+-- main-module
| |
| +- pom.xml
|
+-- submodule1
| |
| +- pom.xml
|
+ pom.xml
My thought was that main-module should provide interfaces which each submodule should implement in order to be plugged into the whole application. Therefore submodule1/pom.xml contains a compile-time dependency on main-module. In turn, I also need to be able to test the whole application, and thus main-module/pom.xml contains a test-scoped dependency on submodule1. As a result, Maven refuses to compile the projects, saying that they contain cyclic references.
My thought was that Maven could first compile the classes of main-module, since it does not require a compile-time dependency on any of the submodules; then, using the compiled classes of main-module, it could compile the classes of submodule1; and after that compile the test classes of main-module (to be able to run the tests). But it seems that Maven does not take the scope of a dependency into account when detecting cycles, and I somehow need to work around that.
The only solution I can see is to move the tests out of main-module, which doesn't really make sense to me, as only that module provides the main logic.
My question: is there any other way around this issue except moving the tests out? Or maybe something is wrong with my understanding of how the maven-reactor-plugin should work?
Instead of moving your tests away, you could move all of your API into its own module. Then your main module would contain the application, and you can freely distribute your application's API to allow others to access it. If they want to develop new functionality, they do not necessarily need the sources of your app.
I consider this a much better style, because submodules with specific functionality can now clearly separate what is your application's API from the code your application needs to start up, shut down, etc.
This is, in my mind, the intended way Maven projects should look. It also sticks to the single responsibility principle: the main module's responsibility is to start up and shut down your application, the API module's responsibility is to show other developers how to access your application, and the other submodules provide specific functionality for your application.
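A hedged sketch of the resulting dependency declarations, assuming the extracted module is called api and all modules share one version: submodule1 compiles against api only, and main-module keeps its test-scoped dependency on submodule1 without creating a compile-time cycle.
<!-- submodule1/pom.xml: implement against the extracted API only -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>api</artifactId>
  <version>${project.version}</version>
</dependency>

<!-- main-module/pom.xml: the implementation is only needed to run the tests -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>submodule1</artifactId>
  <version>${project.version}</version>
  <scope>test</scope>
</dependency>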
I know this is more of a comment, but it might provide you with a (not so pretty) solution:
You can set your submodule1 as a dependency of the maven-surefire-plugin (so that the reactor is forced to build it) and then play with its settings, e.g. childDelegation or additionalClasspathElements.
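A rough sketch of what that could look like in main-module/pom.xml (coordinates and the classpath location are assumptions, and as noted this is not pretty):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <!-- declaring submodule1 as a plugin dependency pulls it into the build before the tests run -->
  <dependencies>
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>submodule1</artifactId>
      <version>${project.version}</version>
    </dependency>
  </dependencies>
  <configuration>
    <!-- additionalClasspathElements takes raw paths, which makes this setup fragile -->
    <additionalClasspathElements>
      <additionalClasspathElement>${project.basedir}/../submodule1/target/classes</additionalClasspathElement>
    </additionalClasspathElements>
  </configuration>
</plugin>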

How to install the impl module for an api dependency in maven

I have a multi-module Maven project structured something like this:
parent
|
+- presentation
+- services
|  +- services-api
|  \- services-impl
+- data-access
|  +- data-access-api
|  \- data-access-impl
+- connector
|  +- connector-api
|  +- connector-implA
|  \- connector-implB
\- ...
The presentation module is packaged in a war and it depends only on the api modules.
When I run the install goal, the only dependencies that the war installs are the api modules. To choose which impl modules to install in the presentation module, I'm using profiles that add the dependency on the impl modules at build time, depending on the profiles selected.
From what I've been reading, I don't think this is correct usage of Maven profiles.
What is the best way to tell Maven to add a chosen impl to the presentation module?
I have the same usage of profiles, but only for specific changes (mostly dependencies).
You do not have to put everything in profiles. Most of the implementation dependencies are common and are therefore declared directly, without profiles.
Depending on the targeted application server, I use profiles to override properties, add specific dependencies (CommonJ for WebSphere, for instance), ...
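For illustration, a profile that adds one implementation dependency might look roughly like this (using the connector-implA module from the question; groupId and version are assumptions):
<profiles>
  <!-- selected with: mvn install -P implA -->
  <profile>
    <id>implA</id>
    <dependencies>
      <dependency>
        <groupId>com.example</groupId>
        <artifactId>connector-implA</artifactId>
        <version>${project.version}</version>
        <scope>runtime</scope>
      </dependency>
    </dependencies>
  </profile>
</profiles>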
I got a solution from the Maven users mailing list that I think is the right way to use Maven in my scenario.
I use runtime dependencies for the impl modules and one war project for each implementation of the api. Using war overlays, Maven merges the resources and enables me to have the application running with the correct module implementations depending on the war I run.
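A hedged sketch of one such per-implementation WAR POM (module names are illustrative): it declares the shared presentation WAR as an overlay and the chosen implementation as a runtime dependency, and the maven-war-plugin merges the overlay's content during packaging.
<!-- presentation-implA/pom.xml (hypothetical per-implementation WAR) -->
<packaging>war</packaging>

<dependencies>
  <!-- the shared presentation WAR, merged into this WAR as an overlay -->
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>presentation</artifactId>
    <version>${project.version}</version>
    <type>war</type>
  </dependency>
  <!-- the implementation chosen for this flavour of the application -->
  <dependency>
    <groupId>com.example</groupId>
    <artifactId>connector-implA</artifactId>
    <version>${project.version}</version>
    <scope>runtime</scope>
  </dependency>
</dependencies>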

How to remove dependency: maven sub-modules create implicit dependency on parent module

I have a Maven project:
Parent Project (with the following modules) (packaging type pom)
  API Project (packaging type jar)
  Packaging Project (has a dependency on API Project) (packaging type custom)
I want to be able to deploy the API project to Nexus repositories so others can leverage that code. The Packaging Project is more of a supporting utility project for a smaller set of deployment use-cases. The Parent Project wraps it all together for me.
When I deploy the API project to the Nexus repo, it deploys fine. If I try to make a brand new project that has a dependency on API, it finds the API dependency in Nexus, but then it also wants the Parent project as well. Is there any way to get around publishing the parent project, as it really isn't necessary for use of the API lib when consumed via the Nexus repo?
Any tips on how to organize my Maven project to support this?
When you add a <parent> reference to a Maven project, what you are doing is saying: "Take all the configuration from that parent and inject it into my model, then override with the following."
Therefore, in order for Maven to build the model of your project, it is necessary for Maven to retrieve the parent itself. In other words, adding a <parent> tag creates an explicit hard dependency between the parent and the child.
The good news is that Inheritance does not have to follow Aggregation. What does that exactly mean?
Aggregation is when you list <modules> in your pom. It tells Maven that the reactor (i.e. the set of projects that Maven builds) should also include the following (sub)projects.
Inheritance is when you set a project's <parent>.
Nowhere does Maven enforce that a project's <parent> has to list its children as <modules>, and nowhere does Maven enforce that a project's <modules> must list the project as a <parent>.
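As a minimal illustration of the two separate mechanisms (coordinates here are placeholders):
<!-- Inheritance: injects the parent's configuration into this POM, so the
     parent must be resolvable wherever this artifact's POM is consumed -->
<parent>
  <groupId>com.example</groupId>
  <artifactId>parent</artifactId>
  <version>1.0-SNAPSHOT</version>
</parent>

<!-- Aggregation: only tells the reactor to build these directories as well;
     it creates no dependency for consumers of the child artifacts -->
<modules>
  <module>api</module>
  <module>packaging</module>
</modules>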
Some people will set up their project like so
ROOT/
+- pom.xml
+- parent/
|   \- pom.xml
+- api/
|   \- pom.xml
\- packaging/
    \- pom.xml
where the POM that acts as the <parent> of ROOT, api and packaging is actually a child module of ROOT. Or sometimes ROOT will be a standalone project with no parent. [In fact this is a pattern I use a lot myself. When I am working on several related projects I will throw together an aggregating pom.xml on my local disk and open that with my IDE, and that way all the related code is available as one single "project", even though the actual modules may come from different sources.]
So in your case the solution would be to remove the <parent> tag from your "API" module.
Now! There is a downside. When you remove the <parent> tag from your "API" module, you remove all the defaults that your parent project was providing, so you will need to copy over those defaults that are relevant to the "API" project, or else you may find subtle changes in behaviour. For example, you should definitely copy over the pinning of plugin versions and any <dependencyManagement> that is relevant to the "API" dependencies. There are other bits you may have to copy, but you can use the Maven command mvn help:effective-pom before and after removing the <parent> tag as an aid to seeing the effective differences.
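A hedged sketch of what the standalone API POM could look like once the <parent> tag is gone (coordinates and the copied defaults are assumptions):
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <!-- coordinates that used to be inherited now have to be declared here -->
  <groupId>com.example</groupId>
  <artifactId>api</artifactId>
  <version>1.0.0</version>
  <packaging>jar</packaging>

  <!-- copy over whatever defaults the old parent provided, e.g. pinned plugin
       versions in pluginManagement and any relevant dependencyManagement entries -->
  <properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>
</project>
The aggregator (or the old parent) can still list api in its <modules> so it is built in the same reactor, while consumers resolving api from Nexus no longer need any parent POM.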
For my own development I use the uber-pom Maven plugin, which merges the information in the Maven project hierarchy and produces an independent POM as a result; I publish artifacts to Maven Central without any extra links to parents.

Import new Maven module in Eclipse automatically

I have a multi-module Maven project in a Subversion repository with many developers working on it with Eclipse + M2Eclipse. Now if a developer adds a module, others need to do an SVN update from the command line (as Eclipse doesn't see the common root of the Maven project), and import the new module manually as an Eclipse project.
Is there a way to do this automatically?
My project structure looks like this:
Working Copy             Eclipse Workspace
working copy root  -X->
+- parent          --->  +- parent
|  \- pom.xml            |  \- pom.xml
+- child1          --->  +- child1
|  \- pom.xml            |  \- pom.xml
+- child2          --->  +- child2
   \- pom.xml               \- pom.xml
You can have a pom in the root that has parent, child1, child2, etc. as modules. After an SVN update, if a new module was added, you can run
mvn eclipse:clean eclipse:m2eclipse
from the Eclipse tools button (to the right of the debug and run buttons).
If you are using TortoiseSVN you can set a post-update client side hook, but each developer will have to set it independently.
Maybe the Buckminster project can help you; check its FAQ. Hope it helps.
Is there a way to do this automatically?
To do what? To avoid importing the new module manually as an Eclipse project? AFAIK, this is currently not supported; you'll have to add it manually (it should be possible to do it programmatically though, there is such a request for the Maven Eclipse plugin - MECLIPSE-75 - but I couldn't find one for Maven Integration for Eclipse).
That said, does adding a module really happen so often? Your situation might be different but, in my experience, you'll reach a stable point quite fast and adding a module will become something unusual.
Nevertheless, good team communication is the best solution I have found to deal with this. When a developer adds a new module, it is their duty to let the other team members know that they introduced a change and to describe the required steps to take the modification into account. Nobody is omniscient, nobody can read others' minds; active communication is the key to good collaboration.
You can also use the maven-eclipse-plugin:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-eclipse-plugin</artifactId>
  <configuration>
    <wtpversion>2.0</wtpversion>
    <projectNameTemplate>[artifactId]-[version]</projectNameTemplate>
  </configuration>
</plugin>
Refresh the project after importing it from SVN.
Or you can also right-click the project and click 'Enable Dependency Management'.
