Our company builds an EJB as two artifacts. The impl artifact contains the implementations and the client artifact contains all the interfaces, so the impl artifact has a compile dependency on the client artifact.
At runtime, however, the client artifact needs the impl artifact - otherwise the container cannot inject the required objects. This means that an EAR needs to contain the impl artifacts for all client artifacts.
Does this mean that the client artifact should have a runtime dependency on the impl artifact? Or should these "circular" dependencies be avoided, even if one direction is compile and the other is runtime?
Does this mean that the client artifact should have a runtime dependency on the impl artifact?
No, and there is no such dependency (or rather, there should not be). Take a look at the import statements in the client artifact's classes and interfaces and you will see that the client artifact does not depend on any implementations.
If the client depended on the implementation, it would violate the dependency inversion principle, which is part of the SOLID principles.
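For illustration, here is a minimal sketch of what the client artifact's pom might look like (the com.example:foo-ejb-client coordinates are purely hypothetical): it declares only the Java EE API, and deliberately no implementation artifact.
<project>
<modelVersion>4.0.0</modelVersion>
<groupId>com.example</groupId>
<artifactId>foo-ejb-client</artifactId>
<version>1.0.0</version>
<packaging>jar</packaging>
<dependencies>
<!-- the interfaces only need the EJB annotations and types from the API -->
<dependency>
<groupId>javax</groupId>
<artifactId>javaee-api</artifactId>
<version>7.0</version>
<scope>provided</scope>
</dependency>
<!-- note: no dependency on a foo-ejb-impl artifact -->
</dependencies>
</project>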
Or should these "circular" dependencies be avoided, even if one direction is compile and the other is runtime?
At runtime an implementation is indeed needed, but that is a question of component assembly. One might want to replace the implementation some day, or swap it out for tests. So it would not be a good idea to introduce a Maven dependency from the client artifact to the implementation just to make component assembly a little bit easier.
Instead you should declare the implementation dependency in the EAR deployment unit, because the EAR is the assembly of the enterprise application.
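As a rough sketch (artifact names are hypothetical), the EAR pom is where the implementation gets pulled in, while ordinary consumers keep depending on the client artifact only:
<!-- in the EAR module's pom.xml: the assembly decides which implementation ships -->
<dependencies>
<dependency>
<groupId>com.example</groupId>
<artifactId>foo-ejb-impl</artifactId>
<version>1.0.0</version>
<type>ejb</type>
</dependency>
</dependencies>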
EDIT
Our developers complain that making sure every client has a corresponding impl in the EAR is tedious manual work: one looks for all client artifacts in the output of dependency:list, adds all corresponding impl artifacts, runs dependency:list again, adds the impl artifacts that are still missing, and so on.
I think they take the Java EE development roles description too literally.
A software developer performs the following tasks to deliver an EAR file containing the Java EE application:
Assembles EJB JAR and WAR files created in the previous phases into a Java EE application (EAR) file
Specifies the deployment descriptor for the Java EE application (optional)
Verifies that the contents of the EAR file are well formed and comply with the Java EE specification
Nevertheless the specification also says
The assembler or deployer can edit the deployment descriptor directly or can use tools that correctly add XML tags according to interactive selections.
I would say that an EAR pom is an example of an assembly description using a tool.
JF Meier also mentioned
Some developers write scripts for this process, but then again, after one changes the versions of some EJBs, one needs to repeat the process, because somewhere deep down in the dependency tree EJB clients may have been removed or added, so additional impls might be necessary.
To me these scripts are the same as the EAR pom - maybe more flexible, but at the price of standards and conventions. The fact that they have to update the scripts with every release makes it clear that it would be better if these versions were also updated by Maven.
Furthermore, since the EAR pom is just a Maven artifact, it can be deployed to a repository as well. This is better than private scripts that no one except the author has access to.
I hope these arguments will help you when discussing the deployment strategy with your colleagues.
You do not need to be concerned with the client's implicit dependency upon the implementation because the server will manage that.
The EJB container creates a proxy through which the implementation is invoked, so there is never a direct reference to it from the client.
If you have the pom for the EJB containing:
<groupId>com.stackoverflow</groupId>
<artifactId>Q43120825-server</artifactId>
<packaging>ejb</packaging>
<dependencies>
<dependency>
<groupId>javax</groupId>
<artifactId>javaee-api</artifactId>
</dependency>
<dependency>
<groupId>com.stackoverflow</groupId>
<artifactId>Q43120825-api</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>maven-ejb-plugin</artifactId>
<configuration>
<ejbVersion>3.2</ejbVersion>
</configuration>
</plugin>
</plugins>
</build>
and the EAR file pom containing:
<dependencies>
<dependency>
<groupId>com.stackoverflow</groupId>
<artifactId>Q43120825-server</artifactId>
<version>1.0-SNAPSHOT</version>
<type>ejb</type>
</dependency>
<dependency>
... other modules
</dependency>
</dependencies>
<build>
<finalName>${project.artifactId}</finalName>
<plugins>
<plugin>
<artifactId>maven-ear-plugin</artifactId>
<version>2.10.1</version>
<configuration>
<version>7</version>
<defaultLibBundleDir>lib</defaultLibBundleDir>
<modules>
<ejbModule>
<groupId>com.stackoverflow</groupId>
<artifactId>Q43120825-server</artifactId>
<bundleFileName>Q43120825-server.jar</bundleFileName>
</ejbModule>
... other modules that might have the API jar as a dependency
</modules>
</configuration>
</plugin>
</plugins>
</build>
then this will build a correct EAR file with the API jar in its lib directory.
I've added my local dependency library (jar file):
<dependency>
<groupId>com.oracle.jdbc</groupId>
<artifactId>ojdbc7</artifactId>
<version>12.1.0.2</version>
<scope>system</scope>
<systemPath>${project.basedir}/libs/ojdbc7-12.1.0.2.jar</systemPath>
</dependency>
Everything works fine until Maven generates the WAR artifact.
I've looked inside the generated WAR file, but the JAR dependency is not in there.
Any ideas?
I know I could use the Maven install-file goal; here I want to focus on the problem with this kind of dependency declaration.
From the Maven documentation:
system: This scope is similar to provided except that you have to provide the JAR which contains it explicitly. The artifact is always available and is not looked up in a repository.
provided: This is much like compile, but indicates you expect the JDK or a container to provide the dependency at runtime. For example, when building a web application for the Java Enterprise Edition, you would set the dependency on the Servlet API and related Java EE APIs to scope provided because the web container provides those classes. This scope is only available on the compilation and test classpath, and is not transitive.
It seems that the system scope, like the provided scope, expects the container or the JDK to supply the dependency at runtime. Because of that, the dependency is not packaged into the WAR file.
You can package the JAR into the lib folder using the maven-war-plugin like this:
<plugin>
<artifactId>maven-war-plugin</artifactId>
<version>2.4</version>
<configuration>
...
<webResources>
<resource>
<directory>libs</directory>
<targetPath>WEB-INF/lib</targetPath>
<includes>
<include>ojdbc7-12.1.0.2.jar</include>
</includes>
</resource>
</webResources>
</configuration>
</plugin>
A WAR is a web archive for servlet containers like Tomcat, GlassFish, JBoss (...). It is specified by the Servlet specification, which points out that data sources (databases) are the responsibility of the servlet container.
(...) type javax.sql.DataSource for which the reference to the data source is injected by the container prior to the component being made available to the application.
You should place the database driver in the servlet container, not in the web application.
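As a sketch of what that looks like in practice with Tomcat 8 (resource name, URL and credentials below are placeholders): put ojdbc7.jar into Tomcat's lib directory and declare the pool in conf/context.xml (or the application's META-INF/context.xml); the web application then only refers to the JNDI name.
<!-- conf/context.xml: container-managed connection pool; attribute names follow Tomcat 8's default pooling factory -->
<Context>
<Resource name="jdbc/myDataSource"
auth="Container"
type="javax.sql.DataSource"
driverClassName="oracle.jdbc.OracleDriver"
url="jdbc:oracle:thin:@//dbhost:1521/ORCL"
username="scott"
password="secret"
maxTotal="20"/>
</Context>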
Dependencies in compile scope are automatically added to the WAR's WEB-INF/lib as part of the Maven build. Dependencies in system scope are not; by definition, dependencies with system scope must be provided explicitly. More information at the URL below:
Maven 2 assembly with dependencies: jar under scope "system" not included
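In other words, once the JAR has been installed into a repository the build can reach (via install:install-file, a corporate repository, or a project-local file repository as shown later in this thread), a plain default-scope declaration is enough and the driver does end up in WEB-INF/lib - a sketch reusing the coordinates from the question:
<dependency>
<groupId>com.oracle.jdbc</groupId>
<artifactId>ojdbc7</artifactId>
<version>12.1.0.2</version>
<!-- default (compile) scope: packaged into WEB-INF/lib by the maven-war-plugin -->
</dependency>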
I have a Maven Java web app (.WAR) project that includes several libraries, including the Wicket libraries (but I don't think the problem is Wicket itself, rather Maven).
Here's the problem: even though I only include Wicket 6.20.0, the resulting WAR contains two copies of the Wicket libraries, 6.20.0 and 6.18.0, as you can see in this screenshot:
Suspecting some conflicting imports, I printed the dependency tree using the
mvn dependency:tree
command... but there is no mention of Wicket 6.18.0 in the dependency tree! I also double-checked using Eclipse's "dependency hierarchy" view and I can confirm there's no trace of that import.
I even did a search for string "6.18.0" across the entire workspace with Eclipse, but it's nowhere to be found!
How can I find out what is causing the inclusion of that duplicate version of the library?
Maven doesn't work this way.
Resolving more than one dependency with the same groupId and artifactId but different versions results in a single dependency (which version is used is not deterministic).
The presence of two artifacts with the same groupId and artifactId but two distinct versions in the same lib folder of the WAR is probably caused by one of the following:
you don't execute mvn clean package but only mvn package.
you use a buggy version of the Maven WAR plugin. Try updating it to check.
you have a Maven plugin that copies the Wicket 6.18.0 JARs into the WEB-INF/lib folder of the target folder during the build.
the Maven WAR project you are building has a dependency on an artifact of type WAR. In this case, the dependencies of that WAR are overlaid onto the WAR project you are building.
An interesting Maven issue about duplicated JARs caused by WAR dependencies:
JARs with different versions can be in WEB-INF/lib with war as dependencies
Your answer and your comment indicate that you actually have a WAR dependency in your build.
Unfortunately, there is no really good, long-term solution to bypass this limitation.
As said in my comment, using the packagingExcludes property of the maven-war-plugin is a valid workaround for the issue at hand:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.4</version>
<configuration>
<!-- ... -->
<packagingExcludes>WEB-INF/lib/wicket-*-6.18.0.jar</packagingExcludes>
</configuration>
</plugin>
But beware: this will make your build less robust over time.
The day you update the version of the WAR dependency and its new version again pulls in a different version of Wicket, you still risk having duplicate JARs with two distinct versions in your built WAR.
Using the overlay feature, by specifying the overlay element of the maven-war-plugin, is generally better, as it focuses on the overlay applied for the WAR dependency. It fixes the problem early.
For example, you could exclude any Wicket JARs coming from the WAR dependency:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.4</version>
<configuration>
<overlays>
<overlay>
<groupId>com.whatever.youlike</groupId>
<artifactId>myArtifact</artifactId>
<excludes>
<exclude>WEB-INF/lib/wicket-*.jar</exclude>
</excludes>
</overlay>
</overlays>
</configuration>
</plugin>
This way is better, but it is still a workaround.
The day the WAR dependency is updated and it pulls in new dependencies (other than Wicket) that are also declared in your own build but with different versions, you may end up with the same kind of issue.
I think that declaring a dependency on a WAR artifact should only be done when there is no other choice.
If refactoring the poms and projects is possible, introducing a common JAR artifact that both WARs depend on, containing only the sources and resources shared by the two WARs, really makes things simpler.
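A sketch of that refactoring, with purely hypothetical coordinates: the shared sources and resources move into a plain JAR module, and both WAR projects depend on it instead of one WAR depending on the other.
<!-- common-stuff/pom.xml: ordinary jar packaging holding the shared code and resources -->
<groupId>com.example</groupId>
<artifactId>common-stuff</artifactId>
<version>1.0.0</version>
<packaging>jar</packaging>

<!-- in each WAR's pom.xml -->
<dependency>
<groupId>com.example</groupId>
<artifactId>common-stuff</artifactId>
<version>1.0.0</version>
</dependency>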
Well, I figured it out while poking around.
I had a dependency of type "war" in the project:
<dependency>
<groupId>com.whatever.youlike</groupId>
<artifactId>myArtifact</artifactId>
<version>1.0.7-SNAPSHOT</version>
<type>war</type>
</dependency>
Apparently (I wasn't aware of this, my fault here) these types of dependencies include themselves in the classpath by copying all their libs into the main WAR's WEB-INF/lib folder, but they will NOT show up in the dependency tree / dependency hierarchy.
I solved it by configuring an explicit exclusion in the WAR plugin:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.4</version>
<configuration>
<!-- ... -->
<packagingExcludes>WEB-INF/lib/wicket-*-6.18.0.jar</packagingExcludes>
</configuration>
</plugin>
Use clean install and the double dependency will probably be gone.
Other libraries can pull in the same library with a different version, or you tried a different version and didn't run mvn clean.
The command mvn dependency:tree is telling you the correct information - what you are looking at here is an Eclipse / build issue.
Clear out all the target and build areas in your project. If need be, check it out from source control to a new folder.
Alternatively you can build your project in IntelliJ IDEA, and see if you get the correct dependencies (most likely you will).
I already found an answer here on Stack Overflow on how to include a 3rd-party JAR in a project without installing it to a "local repository":
Can I add jars to maven 2 build classpath without installing them?
But, when I use the Maven Shade Plugin to create a JAR that includes all the dependencies of the project as well, the 3rd party JAR is not included automatically.
How can I make the Maven Shade Plugin add such a 3rd party JAR in to the shaded JAR?
Based on the answer I got, I made it work. What I did was add this snippet to the beginning of my pom.xml:
<repositories>
<repository>
<id>repo</id>
<url>file://${basedir}/repo</url>
</repository>
</repositories>
Then I added a dependency for my project, also in pom.xml:
<dependencies>
<dependency>
<groupId>dummy</groupId>
<artifactId>dummy</artifactId>
<version>0.0.0</version>
<scope>compile</scope>
</dependency>
</dependencies>
And then I ran a command to install the package into 'repo':
mvn org.apache.maven.plugins:maven-install-plugin:2.3.1:install-file \
-Dfile=<my-jar>.jar -DgroupId=dummy -DartifactId=dummy \
-Dversion=0.0.0 -Dpackaging=jar -DlocalRepositoryPath=`pwd`/repo/
(Not sure if the repo path needs to be a full path, but didn't want to take chances.)
The contents of the repo subdirectory are now:
repo/dummy/dummy/0.0.0/dummy-0.0.0.jar
repo/dummy/dummy/0.0.0/dummy-0.0.0.pom
repo/dummy/dummy/maven-metadata-local.xml
Now I can check this in to version control, and have no local or remote dependencies.
But, when I use the Maven Shade Plugin to create a JAR that includes all the dependencies of the project as well, the 3rd party JAR is not included automatically.
Yes, because system-scoped dependencies are assumed to be always present (this is exactly what the system scope is about), so they won't be included. People actually don't understand what system-scoped dependencies are; they just keep abusing them (yes, this is abuse), and then get side effects and wonder why (as Brian pointed out in his answer).
I have already written many, many times about this here on SO, and in 99% of the cases system-scoped dependencies should be avoided. I'll repeat what the Dependency Scopes mini guide says one more time:
system: This dependency is required in some phase of your project's lifecycle, but is system-specific. Use of this scope is discouraged: This is considered an "advanced" kind of feature and should only be used when you truly understand all the ramifications of its use, which can be extremely hard if not actually impossible to quantify. This scope by definition renders your build non-portable. It may be necessary in certain edge cases. The system scope includes the <systemPath> element which points to the physical location of this dependency on the local machine. It is thus used to refer to some artifact expected to be present on the given local machine and not in a repository; and whose path may vary machine-to-machine. The systemPath element can refer to environment variables in its path: ${JAVA_HOME} for instance.
So, instead of using the system scope, either:
Add your libraries to your local repository via install:install-file. This is a quick and dirty way to get things working, and it might be an option if you're working alone, but it makes your build non-portable.
Install and run an "enterprise repository" like Nexus, Archiva, or Artifactory and add your libraries via deploy:deploy-file. This is the ideal scenario.
Setup a file based repository as described in this previous answer and put your libraries in there. This is the best compromise if you don't have a corporate repository but need to work as a team and don't want to sacrifice portability.
Please, stop using the system scope.
The Maven addjars plugin solves this problem - see
http://code.google.com/p/addjars-maven-plugin/wiki/UsagePage
I used <resources> to include my lib folder with all its JARs, i.e.:
<build>
<resources>
<resource>
<directory>${project.basedir}</directory>
<includes>
<include>lib/*.jar</include>
</includes>
</resource>
</resources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.3</version>
<configuration>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
If you only need a quick and dirty solution, you can add the content of the extracted JAR file to your src/main/resources directory.
We have several projects that are microservices; each project is independent (running on a separate Spring Boot server, exposing REST services, using a separate DB schema...).
We use Maven to manage the dependencies.
Is it a good idea to have a parent pom declaring each microservice as a module, thereby helping to manage the common dependencies (like servlet-api, which is used in every project), removing them from all of the services and declaring them only in the parent pom?
The 'problem' with a multi-module parent pom is that, without complicated profiles, it locks the modules into the same release cycle (assuming you're using the Release Plugin, which you should be).
The way I work with Maven is to have a parent pom that declares:
common dependencies (logging APIs, JUnit, etc).
common plugins.
all dependencies in the dependencyManagement section.
all plugins in the pluginManagement section.
Each module declares the parent pom as its parent, but the parent knows nothing about the modules.
The benefit of this comes from the last two bullets above, the 'management' sections. Anything contained in a 'management' section needs to be redeclared in a module that wants to use a particular dependency or plugin.
For example the parent might look like this:
<project>
<groupId>com.example</groupId>
<artifactId>parent</artifactId>
<version>1.0.00-SNAPSHOT</version>
...
<dependencies>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>1.7.7</version>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
</dependencies>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>commons-lang</groupId>
<artifactId>commons-lang</artifactId>
<version>2.6</version>
</dependency>
<dependency>
<groupId>commons-collections</groupId>
<artifactId>commons-collections</artifactId>
<version>2.1</version>
</dependency>
</dependencies>
</dependencyManagement>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.4</version>
<configuration>
<appendAssemblyId>false</appendAssemblyId>
<descriptors>
<descriptor>src/main/assembly/assembly.xml</descriptor>
</descriptors>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</pluginManagement>
</build>
</project>
And the module might look like this:
<project>
<parent>
<groupId>com.example</groupId>
<artifactId>parent</artifactId>
<version>1.0.00-SNAPSHOT</version>
</parent>
<groupId>com.example</groupId>
<artifactId>module</artifactId>
<version>1.0.00-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>commons-lang</groupId>
<artifactId>commons-lang</artifactId>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
</plugin>
</plugins>
</build>
</project>
The module would:
have dependencies on org.slf4j:slf4j-api:1.7.7:compile, junit:junit:4.11:test and commons-lang:commons-lang:2.6:compile.
have the plugin org.apache.maven.plugins:maven-assembly-plugin:2.4, configured as declared in the parent's pluginManagement section.
I would avoid dependencies in the parent pom. It's awkward if one of your (independent) microservices wants something different, and it's weird to have the parent know about each microservice.
You can stick with dependencyManagement, though, to suggest default versions/scopes if you want. A parent pom is nonetheless very convenient for defining plugins, repositories and the like.
Instead, I would group a common set of dependencies into dedicated artifacts, each of which may be just a single pom with dependencies. Then you can depend on, say, com.example/common-db-dependencies/1.2 to include a standard set of database dependencies, like Hibernate, Apache Derby and the JPA specs (or whatever you're using). A service that does not use JPA/SQL can avoid that dependency altogether.
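A sketch of such an artifact, with hypothetical coordinates and purely illustrative versions: a pom-packaged module that only lists dependencies, which a service pulls in as a single dependency of type pom.
<!-- common-db-dependencies/pom.xml -->
<groupId>com.example</groupId>
<artifactId>common-db-dependencies</artifactId>
<version>1.2</version>
<packaging>pom</packaging>
<dependencies>
<dependency>
<groupId>org.hibernate</groupId>
<artifactId>hibernate-core</artifactId>
<version>4.3.11.Final</version>
</dependency>
<dependency>
<groupId>org.apache.derby</groupId>
<artifactId>derby</artifactId>
<version>10.12.1.1</version>
</dependency>
</dependencies>

<!-- in a service that needs database support -->
<dependency>
<groupId>com.example</groupId>
<artifactId>common-db-dependencies</artifactId>
<version>1.2</version>
<type>pom</type>
</dependency>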
Don't entangle yourself, though. It's easy to overwork dependency structures when you're trying to cover every case. So only try to standardize things that are really used by a majority of the services.
I would definitely use a parent project.
I've been working for years with both structures... microservices and not, modular or not, Ant, Maven and Gradle...
We need to understand that using a parent pom does not mean the microservices stop being decoupled and independent:
they can still be independent and decoupled when using a parent pom,
they can still be built, released and updated in isolation even if you are using a parent pom.
I have heard it said that "a microservice may need to use different versions for a dependency"; well, you can, just override the dependency in the specific microservice's pom.
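A minimal sketch of such an override (version numbers are illustrative): if the parent pins slf4j-api to 1.7.7, a single microservice can still declare the dependency with an explicit version, and the version declared directly in the module wins over the inherited or managed one.
<!-- in the specific microservice's pom.xml -->
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<!-- overrides the 1.7.7 declared in the parent -->
<version>1.7.25</version>
</dependency>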
We need to focus on what the benefits and the cons are:
Control and standardization: I can manage the common dependencies (with dependencyManagement) in a single place, which makes it easier to roll out dependency changes across all the modules. Yes, we may need different third-party versions here and there, but at the same time we need to avoid losing control over all the dependencies, so exceptions may be allowed but they need to be balanced against the standardization.
Group management: I can still release just a single module, but I can also manage multi-module releases in an easier way, without having to release module by module - simply releasing the modules that are under development. In this case I still have a single entry point, and all the common dependencies can be overviewed within the parent.
And much more:
common third-party and platform dependency management
common third-party and platform standardization
full control of the dependency ecosystem within the whole application (structured as microservices)
common plugin management and standardization
reduced duplication and redundant logic
configuration management and standardization
easier maintenance: change in one place instead of potentially 30 places!
easier to test and roll out common changes
What about the cons?
I don't see any for the moment: exceptions can be managed by overriding common behaviour in the specific microservice's pom, and I can still manage everything in isolation (build in isolation, release in isolation, deploy in isolation...).
Nothing is coupled.
I'm not sure what is meant by "it locks the modules in the same release cycle". It does not, unless you are using external SNAPSHOTs; I can release a microservice in isolation, reusing the same parent version.
For example, I can have module 1 declare parent 1.0 and release it in isolation without having to run the release process from the parent - I can run it directly on the submodule - but I must not declare any external SNAPSHOT within the submodule project (you would have the same issue with or without a parent).
There is one issue here with dependencies and dependency management. Say one of your microservices wants to upgrade to a newer version of a common library for some reason... you can't do that easily, because you have a parent. I do understand the temptation to reduce duplication of redundant things like plugin configuration, but with microservices we need to think more about the independence of each service.
Some configuration - say, your repository or release configuration - can still be common.
Most books on microservice architecture recommend autonomy as a principle. Using a parent pom violates that principle.
First of all, with a parent pom you can no longer adopt a polyglot approach and write your microservices in different languages.
You'll also be forced to use the dependencies prescribed by the parent, especially if the enforcer plugin is employed.
The microservices will no longer be independently deployable.
There is also the risk that your work on any one microservice may break others if that work involves altering the parent.
A major drawback of using a parent pom approach with microservices is that it makes release management for the microservices a slightly tricky affair. A few related pointers:
The parent pom should not be changed frequently and should be managed as a separate project in a separate repo.
Every change to the parent pom should increment the parent pom version. Once the changes are finalized, the parent pom repo should also be tagged (treating it as a separate library with independent releases).
Moreover, the child poms of all the microservices being touched should ideally be updated to point to the latest parent pom version (affecting the autonomy of the microservices to some extent). This may also force a microservice to upgrade to newer versions of the libraries, which may not always be feasible.
Even if the only change in a microservice is to point to the new parent pom version, it would call for a new (usually minor) version release of the service.
Suggestions:
You can use the Maven enforcer plugin to check for duplicate or conflicting dependency versions between the parent and child poms (see the sketch after these suggestions).
The parent pom is not a good place for extensive dependencies and dependency management, but it can certainly be used for things like repositories, distribution management and plugin management, which generally do not clash between microservices.
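A minimal sketch of how that check could be wired into the parent pom (the plugin version is illustrative); dependencyConvergence is one of the standard enforcer rules and fails the build when conflicting versions of the same artifact are pulled in.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-enforcer-plugin</artifactId>
<version>1.4.1</version>
<executions>
<execution>
<id>enforce-versions</id>
<goals>
<goal>enforce</goal>
</goals>
<configuration>
<rules>
<dependencyConvergence/>
</rules>
</configuration>
</execution>
</executions>
</plugin>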
I want to use the slf4j-gwt library: https://github.com/FinamTrade/slf4j-gwt
But I only want to include it in my GWT compile, not in the WAR that is built, as I'm having issues with Tomcat startup calling GWT.create...
Is there a simple way to do this? I would expect the Maven GWT compiler plugin to support this, but I can't see that it does.
<scope>provided</scope> or <optional>true</optional>.
Neither one is semantically satisfying but that's what Maven gives us.
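For example (a sketch only; the groupId, artifactId and version below are placeholders, check the library's own documentation for the real coordinates), marking the dependency as provided keeps it available at compile time but out of the packaged WAR:
<dependency>
<groupId>com.example</groupId>
<artifactId>slf4j-gwt</artifactId>
<!-- placeholder version -->
<version>1.0.0</version>
<scope>provided</scope>
</dependency>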
That being said, there are many reasons why you should rather split your project into several modules, with one module containing only client-side code and producing JavaScript and associated resources (through the GWT compiler) and one with only server-side dependencies; that way you never risk putting client-side classes or dependencies into your WAR.
See http://blog.ltgt.net/announcing-gwt-maven-archetypes-project/ for more about it, and sample projects (in the form of Maven archetypes).
If I understand you correctly, you could exclude the artifact when building the WAR. Something like
<build>
...
<plugin>
<artifactId>maven-war-plugin</artifactId>
<version>2.5</version>
...
<configuration>
<packagingExcludes>
WEB-INF/lib/slf4j-gwt-*.jar
</packagingExcludes>
</configuration>
</plugin>
...
</build>
should work.