Excluding Mock Datasource In Maven Profile - java

I have a java backend project that includes services to import data from a database. While working on new features, I sometimes need to deploy and run the code on my local machine. Since I don't actually want to connect to the production db while running experimental code, I set up a mock datasource class using Mockito.
The mock datasource works fine and does what I want when running locally. The problem I'm running into is that I don't want to include that class and its associated dependencies when doing a production deployment. I added an <excludes> section to the configuration section of maven-compiler-plugin, and added the mock-specific dependencies to a 'local' profile section. When I actually try to compile using Maven, however, I get compile errors on the mock datasource class that was supposed to be excluded. I'll post the relevant snippets from my pom file below. I've tried putting the excludes statement in a specific profile and in the default build section as shown below. Any help with this would be greatly appreciated.
<profiles>
  <profile>
    <id>local</id>
    <properties>
      <config>config.dev.properties</config>
    </properties>
    <dependencies>
      <dependency>
        <groupId>org.mockito</groupId>
        <artifactId>mockito-core</artifactId>
        <version>1.9.5</version>
      </dependency>
    </dependencies>
  </profile>
  ...
</profiles>
<build>
  <finalName>order</finalName>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <version>3.1</version>
      <configuration>
        <excludes>
          <exclude>com/tura/order/guice/mock/**</exclude>
        </excludes>
        <compilerId>groovy-eclipse-compiler</compilerId>
      </configuration>
    </plugin>
  </plugins>
</build>

As a simpler alternative, you could configure an alternative version of your app to be run from the src/test/java source directory instead of the normal directory.
You would also remove the profile and only declare the mockito dependency with the test scope (adding <scope>test</scope>).
This way, you could launch your app on your computer, but this code and mockito would not appear in the final build.
I think it would be a lot simpler if you can easily configure your app to be run from the tests, and I don't see why it couldn't be. Usually, avoiding Maven profiles when there are alternative ways is considered good practice.
EDIT: following your question...
So first, make sure mockito is defined with the "test" scope in your pom. Like this:
<dependency>
  <groupId>org.mockito</groupId>
  <artifactId>mockito-core</artifactId>
  <version>1.10.19</version>
  <scope>test</scope>
</dependency>
Then your code should not compile anymore as it is under src/main/java and needs Mockito.
Transfer your code into src/test/java/ where it will be able to benefit from the test dependencies (thus Mockito).
You have to know that test dependencies and testing code (in src/test/) will not be part of the final jar. So this is what you want.
And I forgot to say that the code in src/test/ may be whatever you like: unit tests, or applications with a main(..) method.
The only tricky part may be making your code work from the tests. But test code "sees" the main code (the opposite is not true), so you will have to call the main code and pass it your mock, with the mock instantiated in the test code.
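For illustration, a minimal runnable sketch of such a test-side launcher (every name in it is hypothetical, standing in for your real classes):

// src/test/java -- mockito-core is visible here thanks to the test scope
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.util.Collections;
import java.util.List;

public class LocalLauncher {

    // Stand-in for the datasource interface defined in src/main/java
    interface OrderDatasource {
        List<String> fetchOrders();
    }

    // Stand-in for the main application entry point in src/main/java
    static class OrderApp {
        private final OrderDatasource datasource;

        OrderApp(OrderDatasource datasource) {
            this.datasource = datasource;
        }

        void run() {
            System.out.println(datasource.fetchOrders());
        }
    }

    public static void main(String[] args) {
        // The mock is instantiated in test code...
        OrderDatasource mockDs = mock(OrderDatasource.class);
        when(mockDs.fetchOrders()).thenReturn(Collections.singletonList("sample-order"));

        // ...and passed to the main code, which only knows the interface
        new OrderApp(mockDs).run();
    }
}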
Hope it helps

Although I like Francois Marot's answer, this other choice is cleaner (though more complicated): split your current project into several ones:
One project containing the core code of the application: it must publish pure APIs with no dependencies.
Another project, which must have a dependency on the core, and include the mockito infrastructure as well as your "local" environment facade.
The last one, if necessary, must have a dependency on the core, and add the proper infrastructure and classes for the "production" environment (depending on the complexity, maybe you could decide to include this one in the core itself).
In this way, you will package your code into 100% reusable libraries and make one distribution for each required target environment, so that none of them will be polluted with code aimed at the other target environments.
And the POMs will become simpler and won't need profiles.
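A rough sketch of that split (artifactIds invented; the com.tura group id follows the package in the question). An aggregator pom lists the three projects, and the local project pulls in the core plus Mockito:

<!-- aggregator pom -->
<modules>
  <module>order-core</module>
  <module>order-local</module>
  <module>order-production</module>
</modules>

<!-- order-local/pom.xml: depends on the pure-API core and adds the mock infrastructure -->
<dependencies>
  <dependency>
    <groupId>com.tura</groupId>
    <artifactId>order-core</artifactId>
    <version>${project.version}</version>
  </dependency>
  <dependency>
    <groupId>org.mockito</groupId>
    <artifactId>mockito-core</artifactId>
    <version>1.10.19</version>
  </dependency>
</dependencies>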

Related

How to exclude error prone from being run on unit tests?

When maven-compiler-plugin:3.8.0:testCompile @ foo-child runs, thread dumps show errorprone is taking an extremely long time. I believe there is a bug with errorprone, but for now I'd rather just have errorprone not run on unit tests.
I have a parent pom.xml:
<modules>
  <module>foo-child</module>
</modules>
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.errorprone</groupId>
      <artifactId>error_prone_annotations</artifactId>
    </dependency>
    <!-- also has dependencies for io.norberg auto-matter and com.google.auto.value auto-value -->
  </dependencies>
</dependencyManagement>
<build>
  <plugins>
    <plugin>
      <artifactId>maven-compiler-plugin</artifactId>
      <!-- also has annotationProcessorPaths configuration for auto-matter and auto-value -->
    </plugin>
  </plugins>
</build>
Is there anything I can put in the foo-child pom.xml that will allow me to exclude maven-compiler-plugin:3.8.0:testCompile @ foo-child from being run at all?
I cannot exclude error prone completely because other things like guava depend on it.
EDIT: Seems like this user is trying to solve the same problem. Do you know how I could apply the solution given there to my case?
Use error prone's command line flag to disable checks: -XepDisableAllChecks
Similar answer for disabling error prone in bazel
add --javacopt="-XepDisableAllChecks" to your bazelrc
For specific test(s) use -XepExcludedPaths:
you can completely exclude certain paths from any Error Prone checking via the -XepExcludedPaths flag
-XepExcludedPaths:.*/build/generated/.*
You can use the Inclusions and Exclusions of Tests feature of the Surefire plugin for this.
You can add the -XepExcludedPaths compiler option to your maven build.
https://errorprone.info/docs/flags
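Putting that together, a hedged sketch for foo-child's pom, assuming error prone is already wired into maven-compiler-plugin by the parent (the path regex is an example and may need adjusting to your layout):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <compilerArgs>
      <!-- skip error prone checking for anything under src/test/java -->
      <arg>-XepExcludedPaths:.*/src/test/java/.*</arg>
    </compilerArgs>
  </configuration>
</plugin>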
You can add a Maven profile to the foo-child module which runs the build without errorprone. You can also make the activation of that profile depend on some parameter and set that parameter's value in the parent pom.
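For example (a sketch; the profile id and activation property are invented):

<profiles>
  <profile>
    <id>no-errorprone</id>
    <activation>
      <property>
        <!-- activate with -DskipErrorprone (e.g. set in the parent or on the CLI) -->
        <name>skipErrorprone</name>
      </property>
    </activation>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-compiler-plugin</artifactId>
          <configuration>
            <compilerArgs>
              <arg>-XepDisableAllChecks</arg>
            </compilerArgs>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>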

ejb with client artifact - runtime dependency?

Our company creates an ejb in two artifacts. The impl artifact contains the implementations and the client artifact contains all the interfaces. This means that the impl artifact has a compile dependency on the client artifact.
Now at runtime, the client artifact needs the impl artifact - otherwise the container cannot inject the required objects. This means that an ear needs to contain the impl artifacts for all client artifacts.
Does this mean that the client artifact should have a runtime dependency on the impl artifact? Or should these "circular" dependencies be avoided, even if one direction is compile and the other is runtime?
Does this mean that the client artifact should have a runtime dependency on the impl artifact?
No, and there is no dependency (or better: there should not be one). Take a look at the import statements in the client artifact's classes and interfaces and you will see that the client artifact does not depend on implementations.
If the client would depend on the implementation it would violate the dependency inversion principle which is part of the SOLID principles.
Or should these "circular" dependencies be avoided, even if one direction is compile and the other is runtime?
In fact at runtime an implementation is needed, but that is a question of component assembly. One might want to replace the implementation some day or for test reasons. So it wouldn't be a good idea to introduce a maven dependency in the client artifact to the implementation only to make component assembly a little bit easier.
Instead you should declare the implementation dependency in the EAR deployment unit, because the EAR is the assembly of the enterprise application.
EDIT
Our developers complain that making sure that every client has a corresponding impl in the ear is tedious manual work. One looks for all client artifacts in the dependency:list, adds all corresponding impl artifacts, calls dependency:list again, again adds all missing impl artifacts, and so on.
I think they take the JEE development roles description word for word.
A software developer performs the following tasks to deliver an EAR file containing the Java EE application:
Assembles EJB JAR and WAR files created in the previous phases into a Java EE application (EAR) file
Specifies the deployment descriptor for the Java EE application (optional)
Verifies that the contents of the EAR file are well formed and comply with the Java EE specification
Nevertheless the specification also says
The assembler or deployer can edit the deployment descriptor directly or can use tools that correctly add XML tags according to interactive selections.
I would say that the EAR pom is an example of an assembly description using a tool.
JF Meier also mentioned
Some developers write scripts for this process, but then again, after one changes versions of some ejbs, one needs to repeat the process because maybe somewhere deep down in the dependency tree, ejb-clients were erased or added, so additional impls might be necessary.
To me these scripts are the same as the EAR pom. Maybe more flexible, but at the price of standards and conventions. The fact that they have to update the scripts with every release makes clear that it would be better if these versions were also updated by Maven.
Furthermore, since the EAR pom is just a Maven artifact, it can be deployed to a repository as well. This is better than private scripts that no one except the author has access to.
I hope these arguments will help you when discussing the deployment strategy with your colleagues.
You do not need to be concerned with the client's implicit dependency upon the implementation because the server will manage that.
The EJB container creates a proxy through which the implementation is invoked, so there is never a direct reference to it from the client.
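To make the direction of the references concrete, here is a minimal client-side sketch (the OrderService interface and its method are invented for illustration):

import javax.ejb.EJB;

// In the real setup this interface lives in the client/API artifact
interface OrderService {
    void placeOrder();
}

public class OrderClient {

    @EJB
    private OrderService orders; // the container injects a proxy, never the impl class

    public void submit() {
        orders.placeOrder(); // the call goes through the container-managed proxy
    }
}

The client compiles against the interface alone; no import of the implementation appears anywhere.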
If you have the pom for the EJB containing:
<groupId>com.stackoverflow</groupId>
<artifactId>Q43120825-server</artifactId>
<packaging>ejb</packaging>
<dependencies>
  <dependency>
    <groupId>javax</groupId>
    <artifactId>javaee-api</artifactId>
  </dependency>
  <dependency>
    <groupId>com.stackoverflow</groupId>
    <artifactId>Q43120825-api</artifactId>
  </dependency>
</dependencies>
<build>
  <plugins>
    <plugin>
      <artifactId>maven-ejb-plugin</artifactId>
      <configuration>
        <ejbVersion>3.2</ejbVersion>
      </configuration>
    </plugin>
  </plugins>
</build>
and the EAR file pom containing:
<dependencies>
  <dependency>
    <groupId>com.stackoverflow</groupId>
    <artifactId>Q43120825-server</artifactId>
    <version>1.0-SNAPSHOT</version>
    <type>ejb</type>
  </dependency>
  <dependency>
    ... other modules
  </dependency>
</dependencies>
<build>
  <finalName>${project.artifactId}</finalName>
  <plugins>
    <plugin>
      <artifactId>maven-ear-plugin</artifactId>
      <version>2.10.1</version>
      <configuration>
        <version>7</version>
        <defaultLibBundleDir>lib</defaultLibBundleDir>
        <modules>
          <ejbModule>
            <groupId>com.stackoverflow</groupId>
            <artifactId>Q43120825-server</artifactId>
            <bundleFileName>Q43120825-server.jar</bundleFileName>
          </ejbModule>
          ... other modules that might have the API jar as a dependency
        </modules>
      </configuration>
    </plugin>
  </plugins>
</build>
then this will build a correct EAR file with the API jar in its lib directory.

Deploying two different versions of the same jars through Jenkins on different servers

I have two servers for my Java application and I'm using Jenkins to deploy my code on those servers. The application is the same, but because of the nature of the work we are doing we need different versions of the same custom jars on each server.
1: I've tried to set environment variables and to use them for the artifact and group IDs in pom.xml, but we cannot access environment variables in pom.xml.
2: I've tried changing their names and importing both jars, but that's insane: one of them is ignored and both servers use only one version.
I've been struggling with it for a long time now. The only possible solution that comes to my mind is to create two different git repositories and make different Jenkins jobs for every server.
Can anyone help me figure out how I can import different versions on different servers? That'd mean a lot. Thanks in advance.
If I get you correctly,
different versions of some custom jars
means different versions of your dependencies. This can be easily achieved using Maven profiles. Your pom.xml would look similar to this (XML simplified to a minimum):
<project>
  <!-- Basic info like model, artifact, group etc. -->
  <dependencies>
    <!-- Your usual deps as you are used to -->
  </dependencies>
  <profiles>
    <profile>
      <id>profile1</id>
      <dependencies>
        <!-- Extra deps for this profile -->
      </dependencies>
    </profile>
    <profile>
      <id>profile2</id>
      <dependencies>
        <!-- Extra deps for this profile -->
      </dependencies>
    </profile>
  </profiles>
</project>
IDEs commonly provide a way to set the profile, so devs should not have a problem. From Jenkins, or while building from the command line, you would invoke the command with the given profile. You can have separate jobs or you can create your job with parameters.
mvn install -P profile1
Alternatively, a profile can be activated by an environment variable. The problem may be that this variable must be available during compilation.
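A sketch of such an activation block (the environment variable name TARGET_SERVER is invented for illustration):

<profile>
  <id>profile1</id>
  <activation>
    <property>
      <!-- activates when the environment variable TARGET_SERVER equals server1 -->
      <name>env.TARGET_SERVER</name>
      <value>server1</value>
    </property>
  </activation>
  ...
</profile>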
Another approach would be branching your code for different customers, as Abhilas mentioned in a comment.

Parent pom and microservices

We have several projects that are microservices; every project is independent (running on a separate Spring Boot server, exposing REST services, using a separate DB schema...).
We use maven to manage the dependencies.
Is it a good idea to have a parent pom declaring each microservice as a module? It would help to manage the common dependencies (like the servlet-api lib, which is used in every project): remove it from all of them and declare it only in the parent pom.
The 'problem' with a multi-module parent pom is that, without complicated profiles, it locks the modules in the same release cycle (assuming you're using the Release Plugin, which you should be).
The way I work with Maven is to have a parent pom that declares:
common dependencies (logging APIs, JUnit, etc).
common plugins.
all dependencies in the dependencyManagement section.
all plugins in the pluginManagement section.
Each module declares the parent pom as its parent, but the parent knows nothing about the modules.
The benefit of this comes from the last two bullets above, the 'management' sections. Anything contained in a 'management' section needs to be redeclared in a module that wants to use a particular dependency or plugin.
For example the parent might look like this:
<project>
  <groupId>com.example</groupId>
  <artifactId>parent</artifactId>
  <version>1.0.00-SNAPSHOT</version>
  ...
  <dependencies>
    <dependency>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
      <version>1.7.7</version>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
  </dependencies>
  <dependencyManagement>
    <dependencies>
      <dependency>
        <groupId>commons-lang</groupId>
        <artifactId>commons-lang</artifactId>
        <version>2.6</version>
      </dependency>
      <dependency>
        <groupId>commons-collections</groupId>
        <artifactId>commons-collections</artifactId>
        <version>2.1</version>
      </dependency>
    </dependencies>
  </dependencyManagement>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.1</version>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
        </configuration>
      </plugin>
    </plugins>
    <pluginManagement>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-assembly-plugin</artifactId>
          <version>2.4</version>
          <configuration>
            <appendAssemblyId>false</appendAssemblyId>
            <descriptors>
              <descriptor>src/main/assembly/assembly.xml</descriptor>
            </descriptors>
          </configuration>
          <executions>
            <execution>
              <id>make-assembly</id>
              <phase>package</phase>
              <goals>
                <goal>single</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </pluginManagement>
  </build>
</project>
And the module might look like this:
<project>
  <parent>
    <groupId>com.example</groupId>
    <artifactId>parent</artifactId>
    <version>1.0.00-SNAPSHOT</version>
  </parent>
  <groupId>com.example</groupId>
  <artifactId>module</artifactId>
  <version>1.0.00-SNAPSHOT</version>
  <dependencies>
    <dependency>
      <groupId>commons-lang</groupId>
      <artifactId>commons-lang</artifactId>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
      </plugin>
    </plugins>
  </build>
</project>
The module would:
have dependencies on org.slf4j:slf4j-api:1.7.7:compile, junit:junit:4.11:test and commons-lang:commons-lang:2.6:compile.
have the plugin org.apache.maven.plugins:maven-assembly-plugin:2.4.
I would avoid dependencies in the parent pom. It's awkward if one of your (independent) microservices wants something different. It's weird to have the parent know of each microservice.
You can stick with dependencyManagement, though, to suggest default versions/scopes if you want. A parent pom is nonetheless very convenient for defining plugins, repositories and the like.
Instead, I would group a common set of dependencies into a specific artifact (or artifacts), which may be only a single pom with dependencies. Then you can depend on, say, com.example/common-db-dependencies/1.2 to include a standard set of database dependencies, like hibernate, apache derby and the JPA specs (or whatever you're using). A service that does not use JPA/SQL could avoid that dependency altogether.
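A sketch of that grouping artifact (names invented): common-db-dependencies is a pom-packaging project whose dependencies arrive transitively in any service that depends on it:

<!-- com.example:common-db-dependencies, packaging "pom" -->
<project>
  <groupId>com.example</groupId>
  <artifactId>common-db-dependencies</artifactId>
  <version>1.2</version>
  <packaging>pom</packaging>
  <dependencies>
    <!-- hibernate, the JPA spec, apache derby, etc. -->
  </dependencies>
</project>

A service then declares a single dependency on com.example:common-db-dependencies:1.2 and inherits the whole set.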
Don't entangle yourself though. It's easy to overwork dependency structures if you're trying to cover each case. So, only try to standardize things that really get used by a majority of the services.
I would definitely use a parent project.
I've been working for years with both structures: microservices and not, modular or not, Ant, Maven and Gradle.
We need to understand that using a parent pom does not mean the microservices become coupled and lose independence:
they can still be independent and decoupled while using a parent pom,
they can still be built, released and updated in isolation even if you are using a parent pom.
I have heard it said that "a microservice may need to use different versions for a dependency". Well, you can: just override the dependency in the specific microservice pom, as sketched below.
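For instance, a sketch of such an override in one microservice's pom (the artifact and version here are invented): a version declared directly in the child wins over the version managed in the parent's dependencyManagement:

<dependencies>
  <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <!-- overrides whatever version the parent manages -->
    <version>31.1-jre</version>
  </dependency>
</dependencies>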
We need to focus on what the benefits and the cons are:
Control and standardization: I can manage the common dependencies (with dependencyManagement) in a single point, which makes it easier to roll out dependency changes across all the modules. Yes, we may sometimes need different third-party versions, but at the same time we need to avoid losing control over all the dependencies, so exceptions may be allowed, but they need to be balanced against the standardization.
Group management: I can still release just a single module, but I can also manage multi-module releases in an easier way, without having to release module by module: simply the modules that are under development. In this case I still have a single entry point, and all the common dependencies can be overviewed within the parent.
And much more:
common third-party and platform dependencies management
common third-party and platform standardization
full control of the dependencies ecosystem within the whole application (structured in microservices)
common plugins management and standardization
reduced duplication and redundant logic
configuration management and standardization
easier maintenance: change in one place instead of potentially 30 places!!
easier testing and roll-out of common changes
What about the cons?
I don't see any for the moment, as exceptions can be managed by overriding common behaviour in the specific microservice's pom, and I can still manage everything in isolation (build in isolation, release in isolation, deploy in isolation...).
Nothing is coupled.
I'm not sure yet what we mean by "it locks the modules in the same release cycle". It does not, unless you are using external SNAPSHOTs: I can release a microservice in isolation, re-using the same parent version.
For example, I can have module 1 declaring Parent 1.0 and release it in isolation without having to run the release process from the parent; I can run it directly on the submodule. But I need to not declare any external SNAPSHOT within the submodule project (you would have the same issue with or without a parent).
There is one issue here with dependencies and dependencyManagement. Say one of your microservices wants to upgrade to a newer version of commons for some reason... you can't do that, as you have the parent. I do understand the temptation of reducing duplication of redundant things like plugin configuration. With microservices we need to think more about the independence of each service.
Some config, like your repository or release configuration, can be common.
Most books on microservice architecture recommend autonomy as a principle. Using a parent pom violates that principle.
First of all, with a parent pom you can no longer adopt a polyglot approach and write your microservices in different languages.
You'll also be forced to use the dependencies prescribed by the parent, especially if the enforcer plugin is employed.
The microservices will no longer be independently deployable.
There is also the risk that your work on any one microservice may break others if that work involves altering the parent.
A major drawback of using a parent pom approach with microservices is that it makes release management for the microservices a slightly tricky affair. A few related pointers:
The parent pom should not be changed frequently and should be managed as a separate project in a separate repo.
Every change to the parent pom should increment the parent pom version. Once the changes are finalized, the parent pom repo should also be tagged (treating it as a separate library with independent releases).
Moreover, the child poms of all the microservices being touched should ideally be updated to point to the latest parent pom version (affecting the autonomy of the microservices to some extent). This may also force a microservice to upgrade to newer versions of the libraries, which may not always be feasible.
Even if the only change in a microservice is to point to the new parent pom version, it would call for a new (mostly minor) version release of the service.
Suggestions:
You can use the Maven enforcer plugin to check for duplicate dependency versions specified between parent and child poms (see the sketch after these suggestions).
The parent pom will not be a good option for extensive dependencies and dependency-management, but can certainly be used for things like repositories, distribution management, and plugin management which shall generally not have clashes between microservices.
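As a sketch of that enforcer idea: the plugin's built-in dependencyConvergence rule is one readily available check (it flags diverging versions anywhere in the tree rather than parent/child duplication specifically, so treat this as an approximation):

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-enforcer-plugin</artifactId>
  <executions>
    <execution>
      <id>enforce-versions</id>
      <goals>
        <goal>enforce</goal>
      </goals>
      <configuration>
        <rules>
          <!-- fails the build when two versions of the same artifact collide -->
          <dependencyConvergence/>
        </rules>
      </configuration>
    </execution>
  </executions>
</plugin>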

unit testing - advice needed

I have a Spring project that consists of a parent and a few child projects.
The build is Maven-based and the continuous integration server is Hudson.
I need to choose the way the unit tests will run.
Currently, there are several good and also quite a few garbage JUnit tests.
I prefer not to mess with the test packages of the child projects since it would be time-consuming, I am not familiar with all the JUnit tests, and, last but not least: I'm lazy.
My questions are:
Should I use maven-surefire-plugin and do a heavy cleanup in the test packages?
Is there any way to tell Hudson (not in the pom.xml of the project being built) to run specific unit tests and ignore others?
Should I create some other build (Ant?) and use it for running unit tests on Hudson?
Are there any other good options in the market I am not aware of?
Any piece of advice would be appreciated.
aviad
Should I use maven-surefire-plugin and do a heavy cleanup in the test packages?
Is there any way to tell Hudson (not in the pom.xml of the project being built) to run specific unit tests and ignore others?
If you just want to run unit tests in a single-module project, you can do
mvn test
That will run all maven lifecycle phases up to test, including compile and test-compile (as you can see in built-in lifecycle bindings, the surefire:test goal is executed in phase test). Now if you want to restrict the unit tests that are executed, you can configure the surefire:test plugin execution with the test parameter:
mvn test -Dtest=Funky*Test
(This will execute all test classes that start with Funky and end with Test)
Unfortunately, the command line configuration is limited; if you want multiple includes and excludes you will have to do some XML configuration. I'd suggest using a dedicated profile for Hudson:
<profiles>
  <profile>
    <id>hudson</id>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-surefire-plugin</artifactId>
          <version>2.6</version>
          <configuration>
            <includes>
              <include>**/TestA.java</include>
              <include>**/Funky*Test.java</include>
              <include>**/Test**.java</include>
            </includes>
            <excludes>
              <!-- overrides above include -->
              <exclude>**/TestButNotMe.java</exclude>
            </excludes>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
Now you can let your Hudson call Maven like this:
mvn -Phudson test
1) Should I use maven-surefire-plugin and do a heavy cleanup in the test package?
yes
2) Is there any way to tell Hudson (not in pom.xml of the project being built) to run specific unit tests and ignore others?
I don't know of any if you use JUnit. But you could check the "hidden" features of versions 4.7, 4.6, 4.5... (there are many; you can find them in the release notes). If there is nothing that fits your need, you could program it yourself, checking for a system property and skipping the test, as in the sketch below.
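A minimal JUnit 4 sketch of that idea (the system property name runExtraTests is invented):

import org.junit.Assume;
import org.junit.Before;
import org.junit.Test;

public class GarbageSuspectTest {

    @Before
    public void onlyWhenEnabled() {
        // Skips (rather than fails) every test in this class unless
        // the build was started with -DrunExtraTests=true
        Assume.assumeTrue(Boolean.getBoolean("runExtraTests"));
    }

    @Test
    public void suspiciousScenario() {
        // ... the actual test logic
    }
}

Hudson could then pass -DrunExtraTests=true only in the jobs that should run these tests.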
3) Should I create some other build (Ant?) and use it for running unit tests on Hudson?
This would be the same as specifying it in the pom. But in my personal opinion, mixing Ant and Maven is one of the ugliest things you can do.
4) Are there any other good options in the market I am not aware of?
Have a look at TestNG.
