When maven-compiler-plugin:3.8.0:testCompile # foo-child runs, thread dumps show errorprone is taking an extremely long time. I believe there is a bug with errorprone, but for now I'd rather just have errorprone not run on unit tests.
I have a parent pom.xml:
<modules>
<module>foo-child</module>
</modules>
<dependencyManagement>
<dependency>
<groupId>com.google.errorprone</groupId>
<artifactId>error_prone_annotations</artifactId>
</dependency>
<!-- also has dependencies for io.norberg:auto-matter and com.google.auto.value:auto-value -->
</dependencyManagement>
<build>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<!-- also has annotationProcessorPaths configuration for auto-matter and auto-value -->
</plugin>
</plugins>
</build>
Is there anything I can put in the foo-child pom.xml that will prevent maven-compiler-plugin:3.8.0:testCompile # foo-child from being run at all?
I cannot exclude error prone completely because other things like guava depend on it.
EDIT: Seems like this user is trying to solve the same problem. Do you know how I could apply the solution given there to my case?
Use error prone's command line flag to disable checks: -XepDisableAllChecks
Similar answer for disabling error prone in bazel
add --javacopt="-XepDisableAllChecks" to your bazelrc
For specific test(s) use -XepExcludedPaths:
you can completely exclude certain paths from any Error Prone checking via the -XepExcludedPaths flag
-XepExcludedPaths:.*/build/generated/.*
You can use Surefire's Inclusions and Exclusions of Tests feature for this.
You can add the -XepExcludedPaths compiler option to your maven build.
https://errorprone.info/docs/flags
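A minimal sketch of wiring that flag into foo-child's pom.xml, assuming Error Prone runs through the maven-compiler-plugin so the flag can be passed as a compiler argument (the exclusion regex is an assumption; adjust it to your source layout):

```xml
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <configuration>
        <compilerArgs>
          <!-- skip Error Prone checks for this module's test sources -->
          <arg>-XepExcludedPaths:.*/src/test/java/.*</arg>
        </compilerArgs>
      </configuration>
    </plugin>
  </plugins>
</build>
```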
You can add a Maven profile to the foo-child module which will run the build without errorprone. You can also make the activation of that profile depend on some parameter and set that parameter's value in the parent pom.
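A hedged sketch of the profile approach (the profile id and property name here are made up; the flag is Error Prone's real -XepDisableAllChecks):

```xml
<profiles>
  <profile>
    <id>no-errorprone</id>
    <activation>
      <property>
        <name>skip.errorprone</name>
        <value>true</value>
      </property>
    </activation>
    <build>
      <plugins>
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-compiler-plugin</artifactId>
          <configuration>
            <compilerArgs>
              <!-- turn off all Error Prone checks in this profile -->
              <arg>-XepDisableAllChecks</arg>
            </compilerArgs>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
```

The build would then be run with something like `mvn test -Dskip.errorprone=true`, or the property could be set in the parent pom to activate the profile by default.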
Problem Description
I'm working with a collection of old projects from defects4j. My problem is that I want to combine those projects with a newer Maven plugin (a regression testing tool), and there are some issues with the Maven Surefire plugin version.
In the pom.xml files that come with the projects, no Surefire version is specified:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<executions>
<execution>
<id>plain</id>
<configuration>
<includes>
<include>**/*Test.java</include>
</includes>
<runOrder>random</runOrder>
</configuration>
</execution>
</executions>
</plugin>
However, the regression tool (packaged as a Maven plugin) requires Surefire version 2.14 or above, so I get an error like this:
[ERROR] Failed to execute goal edu.illinois:starts-maven-plugin:1.4-SNAPSHOT:select (default-cli) on project commons-lang: Unsupported Surefire version: 2.12.4. Use version 2.13 and above
Efforts Done
I checked several stackoverflow posts, and they talked about the effective pom. When I run mvn help:effective-pom, I can see that the version of surefire used is
<plugin>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12.4</version>
Question
Since the project collection in defects4j does not specify the Surefire version in their pom.xml files, is there a way to set the Surefire version to 2.14 or above from the command line? I want to avoid manually editing the pom every time.
Update
By running mvn dependency:resolve-plugins, I get
Plugin Resolved: maven-surefire-plugin-2.12.4.jar
So it seems that Maven somehow uses 2.12.4 as a default. The reason may be that I used this version previously. How do I fix this?
Without modifying the pom manually?
Any advice will be welcomed!
Update:
Problem solved by editing maven's super pom.
Maven takes the newest version from the repository if there was no version fixed in your POM, parent POM or the super POM (from which every Maven project inherits).
It is best practice to fix a version "manually" in the POM. The best place for this is a parent POM from which the projects inherit (this means only one place to change).
You cannot just supply a version from the command line, unless you use a trick like putting <version>${surefire.version}</version> into the plugin definition and setting this property from the command line.
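A sketch of that property trick (the version numbers below are just examples):

```xml
<properties>
  <!-- default used when nothing is passed on the command line -->
  <surefire.version>2.19.1</surefire.version>
</properties>
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-surefire-plugin</artifactId>
      <version>${surefire.version}</version>
    </plugin>
  </plugins>
</build>
```

With that in place, `mvn test -Dsurefire.version=2.22.2` would override the default for a single run.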
I'm 4+ years removed from working with poms so don't remember everything, but consider a couple of things.
First, since the pom you show isn't specifying the version of surefire to use, I don't think the 2.12.4 version can be coming from it directly. Try getting a dependency tree to see where things are coming from; see How can you display the Maven dependency tree for the *plugins* in your project? for that and a few other suggestions.
Second, I think I recall that in your own pom you should be able to specify the version of plugin to associate with a dependency that doesn't specify one. You'll have to research that option yourself.
I think your best bet is the dependency tree to find what's using what and where things are coming from. If you get the tree and still can't figure out what to do try adding the tree output to your question. (You can edit out parts that are proprietary, or clearly unrelated.)
I'm working on a large Java codebase that's split into multiple modules, each with a separate pom.xml, all parented by a top-level pom.xml.
I'm currently in the process of bringing in a couple of library dependencies. The transitive set of dependencies is large, and as luck would have it, there are conflicting dependency versions for different modules.
Here's a simplification of my situation:
project/pom.xml
/module-a/pom.xml # references library-a, depends on library-c:v1
/module-b/pom.xml # references library-b, depends on library-c:v2
/module-c/pom.xml # references module-a and module-b
Now the unit tests for module-a will exercise library-a in the presence of library-c:v1, while module-b will exercise library-b in the presence of library-c:v2.
The trouble is that module-a and module-b need to live together on the same classpath when module-c is deployed, but whatever version of library-c is chosen when module-c is packaged, at least one combination of libraries hasn't been unit tested!
I'd like to pin the version of library-c at the parent pom level somehow, rather than repeating myself in every module that transitively depends on library-c; ideally it would be added in such a way indicating that it's a transitive dependency that is allowed to go away should library-a and library-b no longer rely on it.
I'd like a guarantee that there is exactly one version selected for
every transitive dependency across the entire project rooted from this parent pom, and I'd like the build to blow up if this isn't true. I wrote a tool to parse the output of mvn dependency:tree (turning the leaves of the tree into a forest of paths from leaf to root, then finding all different versions of leaf with the dependency path) so I can see the problem, but without explicitly resolving the transitive dependencies for every conflict and bloating out poms with redundant declarations, this doesn't seem fruitful. It's what I'll do if I have no alternative, naturally.
What's the best way to handle this transitive dependency conflict problem with Maven?
How severe is this problem? Quite apart from getting unconvincing test coverage, in practice I see JVM-killing NoSuchMethodError at runtime from the wrong versions getting deployed. I'd prefer to see these at test time at the very least.
Looks like there are two aspects to this:
You need to insist on a single version of a dependency, whether it is declared explicitly or acquired transitively
You can use <dependencyManagement/> here. For example in the top-level pom.xml you can pin the version of library-c:
<dependencyManagement>
<dependencies>
<dependency>
<groupId>your.group.id</groupId>
<artifactId>library-c</artifactId>
<version>2</version>
</dependency>
</dependencies>
</dependencyManagement>
And then in module-a and module-b you would declare the dependency on library-c as follows:
<dependencies>
<dependency>
<groupId>your.group.id</groupId>
<artifactId>library-c</artifactId>
</dependency>
</dependencies>
By declaring this dependency in the parent's dependencyManagement, you ensure that both child modules use the version declared in the parent.
You want to protect yourself from unhappy dependency additions occurring in future
You can use the Maven Enforcer plugin here, specifically the dependencyConvergence rule. For example:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-enforcer-plugin</artifactId>
<version>3.0.0-M1</version>
<executions>
<execution>
<id>enforce</id>
<configuration>
<rules>
<dependencyConvergence/>
</rules>
</configuration>
<goals>
<goal>enforce</goal>
</goals>
</execution>
</executions>
</plugin>
The enforcer can be configured to either fail or warn if it discovers a non-convergent dependency.
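For the warn-only behavior, the enforce goal's fail parameter (true by default) can be set to false in the execution's configuration, for example:

```xml
<configuration>
  <rules>
    <dependencyConvergence/>
  </rules>
  <!-- report convergence violations without failing the build -->
  <fail>false</fail>
</configuration>
```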
I have a java backend project that includes services to import data from a database. While working on new features, I sometimes need to deploy and run the code on my local machine. Since I don't actually want to connect to the production db while running experimental code, I set up a mock datasource class using Mockito.
The mock datasource works fine and does what I want when running locally. The problem I'm running into is that I don't want to include that class and its associated dependencies when doing a production deployment. I added an <excludes> section to the configuration section of maven-compiler-plugin, and added the mock-specific dependencies to a 'local' profile section. When I actually try to compile with Maven, however, I get compile errors on the mock datasource class that was supposed to be excluded. I'll post the relevant snippets from my POM file below. I've tried putting the excludes statement in a specific profile and in the 'default' as shown below. Any help with this would be greatly appreciated.
<profiles>
<profile>
<id>local</id>
<properties>
<config>config.dev.properties</config>
</properties>
<dependencies>
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
<version>1.9.5</version>
</dependency>
</dependencies>
</profile>
...
</profiles>
<build>
<finalName>order</finalName>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<excludes>
<exclude>com/tura/order/guice/mock/**</exclude>
</excludes>
<compilerId>groovy-eclipse-compiler</compilerId>
</configuration>
</plugin>
</plugins>
</build>
As a simpler alternative, you could configure an alternative version of your app to be run from the src/test source directory instead of the normal one.
You would also remove the profile and declare the mockito dependency only with the test scope (adding <scope>test</scope>).
This way, you could launch your app on your computer but this code and mockito would not appear in the final build.
I think it would be a lot simpler if you can easily configure your app to be run from the tests, and I don't see why you couldn't. Usually, avoiding Maven profiles is considered good practice when there are alternative ways.
EDIT: following your question...
So first, make sure mockito is defined with the "test" scope in you pom. Like this:
<dependency>
<groupId>org.mockito</groupId>
<artifactId>mockito-core</artifactId>
<version>1.10.19</version>
<scope>test</scope>
</dependency>
Then your code should not compile anymore as it is under src/main/java and needs Mockito.
Transfer your code into src/test/java/ where it will be able to benefit from the test dependencies (thus Mockito).
You have to know that test dependencies and testing code (in src/test/) will not be part of the final jar. So this is what you want.
And I forgot to say that the code in src/test/ may be whatever you like: unit tests, or applications with a main(...) method.
The only tricky part may be to make your code work from the tests. But test code "sees" the main code (the opposite is not true), so you can call the main code and pass it your mock, with the mock instantiated in the test code.
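A minimal sketch of that arrangement; every class name here is hypothetical, and a plain stub stands in for what could equally be a Mockito mock:

```java
// Hypothetical sketch. The launcher lives under src/test/java, so neither
// it nor any test-scoped dependency ends up in the packaged jar.

// Abstraction in src/main/java that the production datasource implements.
interface DataSource {
    String fetch();
}

// Main code in src/main/java: depends only on the DataSource abstraction.
class ImportService {
    private final DataSource ds;

    ImportService(DataSource ds) {
        this.ds = ds;
    }

    String importData() {
        return "imported: " + ds.fetch();
    }
}

// In src/test/java: a main() you run locally, never shipped.
public class LocalLauncher {
    public static void main(String[] args) {
        DataSource stub = () -> "stub-row"; // a Mockito mock would also work
        System.out.println(new ImportService(stub).importData());
    }
}
```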
Hope it helps
Although I like Francois Marot's answer, this other choice is cleaner (though more complicated): split your current project into several ones:
One project containing the core code of the application: It must publish pure APIs with no dependencies.
Another project, which must have a dependency on the core, and include the mockito infrastructure as well as your "local" environment facade.
The last one, if necessary, must have a dependency on the core, and add the proper infrastructure and classes for the "production" environment (depending on the complexity, you could decide to include this one in the core itself).
In this way, you will package your code into 100% reusable libraries, and make one distribution for each required target environment, so that no one of them will be polluted with code aimed for the other target environments.
And the POMs will become simpler and won't need profiles.
What are the pros and cons of configuring Maven plugins through properties as opposed to configuration?
For example, maven-compiler-plugin documentation explicitly shows configuring source and target as
shown below, presumably going even further with pluginManagement.
https://maven.apache.org/plugins/maven-compiler-plugin/examples/set-compiler-source-and-target.html
<project>
[...]
<build>
[...]
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.3</version>
<configuration>
<source>1.4</source>
<target>1.4</target>
</configuration>
</plugin>
</plugins>
[...]
</build>
[...]
</project>
Wouldn't it be more succinct to use user properties instead, with no dependency on specific version?
<properties>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
First of all, many parameters of many goals do not have any associated user property, so can only be set via <configuration>.
For those which do have a user property (such as these examples), it depends. A few user properties are specifically coordinated between multiple goals or even plugins, as @khmarbaise points out.
User properties can be overridden from the command line, as @kdoteu points out, which is useful in certain cases—though very likely not for Java source/target level, which would normally be intrinsic to the project (something you would change in a versioned commit alongside source file changes). User properties can also be overridden from external profiles such as in settings.xml, which is occasionally important.
On the other hand, plugin configuration is more explicit: it is expressly associated with a particular goal (or all goals in a plugin using the same parameter name). Some IDEs can offer code completion. You need not worry about accidentally having some other unrelated plugin interpret a property name (though plugin authors try to use unique-looking prefixes for most user property names). Surprisingly, Maven (3.8.1) will not fail the build if you mistype the parameter name, however—it will just quietly ignore the extra element.
You can influence the properties during build time with command-line parameters, and you can use them in multi-module projects.
So we are using them to configure FindBugs or some URLs for deploying.
There are some properties which are automatically picked up by plugins. One example is the given target/source information. Another is project.build.sourceEncoding, which is taken into account by several plugins such as maven-compiler-plugin, maven-resources-plugin, etc. So it makes sense to use properties, which reduces the size and number of your plugin configurations.
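For instance, a properties block like the following replaces separate <configuration> entries for the compiler and resources plugins, since both honor these well-known property names:

```xml
<properties>
  <!-- read by maven-compiler-plugin and maven-resources-plugin, among others -->
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  <!-- read by maven-compiler-plugin -->
  <maven.compiler.source>1.8</maven.compiler.source>
  <maven.compiler.target>1.8</maven.compiler.target>
</properties>
```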
I have a POM file with one dependency, on freemarker.jar. In the library folder there are several versions of the freemarker jar. I am wondering if there is an easier way to switch which freemarker jar is being used without having to open the pom and change the name of the jar, or having to find the jar and rename it manually. A JComboBox with the different freemarker jars would be best, but I have no idea how to make it change during runtime. I would be fine with having to restart the application, as long as all I have to do is change the selection in the combobox and restart.
I have read a few similar questions and I believe it might not be possible.
Here's my dependency:
<dependency>
<groupId>org.freemarker</groupId>
<artifactId>freemarker</artifactId>
<version>2.3.19</version>
</dependency>
You can use the exec-maven-plugin to start the application together with a dependency management in maven. The version of the freemarker dependency must be overridable by the command line. For that you can use maven properties.
Then your user can restart the application with a different freemarker version by choosing it through a command line parameter.
For example something like this:
mvn exec:java -Dfreemarker.version=2.3.19
But there are 3 limitations:
Your users need to restart the application
This solution is only possible if the freemarker versions are binary compatible
If the freemarker versions are only source compatible, your users additionally need to re-compile the application before starting it.
If you try this solution, you should begin with two freemarker versions that are very close, e.g. 2.3.19 and 2.3.18, and check whether they are compatible.
Step 1: Add the freemarker dependency to the dependency management.
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.freemarker</groupId>
<artifactId>freemarker</artifactId>
<version>${freemarker.version}</version>
</dependency>
</dependencies>
</dependencyManagement>
Step 2 Add a default version property for the case that the user does not specify one at the command line.
<properties>
<freemarker.version>2.3.19</freemarker.version>
</properties>
Step 3 Configure the exec-maven-plugin
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.2.1</version>
<configuration>
<mainClass>org.your.fully.qualified.MainClass</mainClass>
</configuration>
</plugin>
Step 4 Try to execute it with the default freemarker version
mvn exec:java
Step 5 Try to execute it with another freemarker version
mvn exec:java -Dfreemarker.version=2.3.18
I don't think you could use Maven for this, since Maven is (normally) not used at runtime, only during compile/build. You could change the scope of your dependency to "provided", and then tweak the mechanism you're using to start your application to add the correct jar to your classpath. However, without more details on how you run your application, it's hard to say more.
EDIT: changed to the correct scope.
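A sketch of the provided-scope idea: the jar is available at compile time but excluded from the packaged artifact, so the launch script (the -cp paths below are hypothetical) decides which freemarker jar goes on the runtime classpath:

```xml
<dependency>
  <groupId>org.freemarker</groupId>
  <artifactId>freemarker</artifactId>
  <version>2.3.19</version>
  <!-- compile against this version, but don't package it -->
  <scope>provided</scope>
</dependency>
```

The launcher would then add the chosen jar itself, e.g. `java -cp app.jar:lib/freemarker-2.3.18.jar your.MainClass`.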