There is an issue with JaCoCo and Multi-Release JAR files. Since the same class name exists in two places, JaCoCo complains:
Caused by: java.lang.IllegalStateException: Can't add different class with same name: jodd/core/JavaBridge
at org.jacoco.core.analysis.CoverageBuilder.visitCoverage(CoverageBuilder.java:107)
at org.jacoco.core.analysis.Analyzer$1.visitEnd(Analyzer.java:96)
How can we tell JaCoCo (in Gradle) to skip the classes under the META-INF path? Or to behave as it should (use the correct class and ignore the other versions), depending on the JVM version?
As explained by @nullpointer, JaCoCo doesn't support Multi-Release JAR files.
My workaround is to ignore the versioned classes. I was not able to ignore just that class by explicitly setting its name; it looks like JaCoCo scans all of them first and only applies the exclusion filters later (but maybe I am wrong).
Therefore, the only way to remove the versioned classes was to exclude all resources, since they are not used anyway. Like this:
task codeCoverage(type: JacocoReport) {
    executionData fileTree("${buildDir}/jacoco/").include("*.exec")
    //sourceSets it.sourceSets.main <--- REPLACED WITH FOLLOWING LINES!!!
    sourceDirectories = it.sourceSets.main.java
    classDirectories = it.sourceSets.main.output.classesDirs
    reports {
        xml.enabled true
        html.enabled true
    }
}
So I changed this:
sourceSets it.sourceSets.main
to this:
sourceDirectories = it.sourceSets.main.java
classDirectories = it.sourceSets.main.output.classesDirs
The difference here is that we explicitly state sourceSets.main.output.classesDirs, which excludes resources.
I had the same problem on JaCoCo 0.8.8 (I guess they haven't fixed it yet). But I use Maven, not Gradle, so even though the accepted answer is correct, it was very hard for me to follow. First, files should be excluded in the report goal, not in the prepare-agent goal. That was not at all obvious to me and took careful reading of the Maven JaCoCo help, which can be seen using the following command:
mvn help:describe -Dplugin=org.jacoco:jacoco-maven-plugin -Ddetail
Second, it wasn't obvious to me whether the exclude value was a path or a package reference and, if a path, what kind of path. By experimenting I found it is a path relative to the target/classes folder. Also note that foo/* excludes only the files directly in the foo folder; to exclude all files recursively under foo, use foo/**/*. Based on all that, this is what my unit test report goal looks like:
<!-- Use unit test coverage data to generate report -->
<execution>
    <id>after-unit-tests-generate-report</id>
    <phase>test</phase>
    <goals>
        <goal>report</goal>
    </goals>
    <configuration>
        <!-- Exclude alternate versions for multi-release modules -->
        <excludes>
            <exclude>META-INF/**/*</exclude>
        </excludes>
        <dataFile>${jacoco.data.file.ut}</dataFile>
        <outputDirectory>${jacoco.report.folder.ut}</outputDirectory>
    </configuration>
</execution>
That configuration excludes all the files under target/classes/META-INF, in other words, all the other versions besides the base. I was worried because my tests use Java 11 while my base is Java 8, but my coverage results seem correct.
Note the use of the properties jacoco.data.file.ut and jacoco.report.folder.ut. Those are defined earlier in my POM file and are otherwise not relevant to this discussion. Also note that this is defined in the parent POM of a project with lots of child modules. Even though it is not inside a pluginManagement tag (only a plugins tag), it still applies to all the children.
JaCoCo doesn't yet provide support for Java 9 Multi-Release JAR Files.
This seems to be in their plans though as tracked at jacoco/issues#407.
Related
I want to write a piece of Java code which can be executed with two different kinds of dependencies (or versions of a dependency), specifically org.apache.poi. The code must run on a system with version 2 as well as version 3 of org.apache.poi.
Unfortunately, between versions 2 and 3 some interfaces have changed, the code must be built slightly differently, and there is no way to upgrade both systems to the same org.apache.poi version.
So my questions are:
Is there a way to compile the code against both versions without running into compiler errors?
Is there a way to execute the right code based on the available org.apache.poi version?
What would be an appropriate approach to solve this issue?
As an amendment:
I'm building code which shall work for two applications that provide this interface in different versions (the Maven scope of the dependency is provided).
If I declare both dependencies in Maven, it picks one of them, and one of the if clauses below will fail to compile because either Cell.CELL_TYPE_STRING or CellType.STRING is not available in the chosen dependency.
I would like the code to work regardless of which dependency is plugged into the application.
// working with old poi interface
if (cell != null && cell.getCellType() == Cell.CELL_TYPE_STRING
        && cell.getRichStringCellValue().getString().trim().equals(cellContent)) {
    return row;
}

// working with new poi interface
if (cell != null && cell.getCellType() == CellType.STRING
        && cell.getRichStringCellValue().getString().trim().equals(cellContent)) {
    return row;
}
This is probably opinion-based, but it seems legitimate.
First, you will have to create a common interface that you will use to do your job.
Second, you will have to create adapter classes that implement that interface and do the required job using a particular version of the POI library.
Third, you will write an adapter factory that returns the proper adapter instance.
The adapter itself should provide an isSupported method that detects whether the given adapter can be used, based on what kind of POI classes are currently loaded (detected by reflection - there must be some version-specific classes or other markers).
Then you will put each adapter into a separate Maven module, so each module can be compiled independently (thus you will have no class conflicts). Each module will have the POI dependency in provided scope, in the version that the adapter is going to support.
Either each module registers itself with the factory in your main module, or the factory itself detects all available adapters (like @ComponentScan in Spring).
Then you will pack everything into a single app bundle. The main module will use only the common interface. All in all, it will be a kind of extensible plugin system. A rough sketch of the idea follows below.
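To make that shape concrete, here is a minimal, hedged sketch of the interface/adapter/factory trio. All names (SpreadsheetAdapter, NewPoiAdapter, AdapterFactory, isStringCell) are made up for illustration, and using the presence of org.apache.poi.ss.usermodel.CellType as a version marker is an assumption, not an official detection mechanism:

import java.util.List;

// Common interface the rest of the application codes against
interface SpreadsheetAdapter {
    boolean isSupported();              // can this adapter run against the POI version on the classpath?
    boolean isStringCell(Object cell);  // version-specific behaviour, implemented by each adapter module
}

// Adapter compiled against the newer POI API in its own Maven module (POI in provided scope)
class NewPoiAdapter implements SpreadsheetAdapter {
    @Override
    public boolean isSupported() {
        try {
            // the CellType enum only exists in newer POI versions, so its presence is used as a marker
            Class.forName("org.apache.poi.ss.usermodel.CellType");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    @Override
    public boolean isStringCell(Object cell) {
        // the real adapter would cast to Cell and compare getCellType() == CellType.STRING here
        throw new UnsupportedOperationException("placeholder for the version-specific POI call");
    }
}

// Factory that returns the first adapter whose POI version is actually present
final class AdapterFactory {
    private static final List<SpreadsheetAdapter> CANDIDATES =
            List.of(new NewPoiAdapter() /*, new OldPoiAdapter() for the older POI module */);

    static SpreadsheetAdapter get() {
        return CANDIDATES.stream()
                .filter(SpreadsheetAdapter::isSupported)
                .findFirst()
                .orElseThrow(() -> new IllegalStateException("No supported POI version on the classpath"));
    }
}

Application code then only ever calls AdapterFactory.get().isStringCell(cell) and never touches the POI API directly.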
I do not think there is a single "best way".
Nonetheless, we faced a similar issue in a few of our apps that share a common library. I ended up with a variant of @Antoniossss's approach, except that the library itself does not use dependency injection (the parent app may or may not, but the library is free of it).
To be more specific, and due to transitive dependencies, some of our apps need a certain version of Apache Lucene (e.g. 7.x.y, or newer) and others are stuck on older versions (5.5.x).
So we needed a way to build one of our libs against those versions, using Maven in our case.
What we ended up with uses the following principles:
We share some code, which is common between all versions of Lucene
We have specific code, for each target version of Lucene that has an incompatible API (e.g. package change, non existing methods, ...)
We build as many jars as there are supported versions of lucene, with a naming scheme such as groupId:artifact-luceneVersion:version
Where the lib is used, direct access to the Lucene API is replaced by access to our specific classes
For example, in Lucene v5 there is an org.apache.lucene.analysis.synonym.SynonymFilterFactory facility. In v7 the same functionality is implemented using org.apache.lucene.analysis.synonym.SynonymGraphFilterFactory, i.e. the same package, but a different class.
What we end up providing is a com.mycompany.SynonymFilterFactoryAdapter. In the v5 JAR, this class extends the Lucene v5 class, and respectively for v7 or any other version.
In the final app, we always instantiate the com.mycompany object, which behaves just the same as the native org.apache class.
Project structure
The build system being maven, we build it as follow
project root
|- pom.xml
|-- shared
|---|- src/main/java
|---|- src/test/java
|-- v5
|---|- pom.xml
|-- v7
|---|- pom.xml
Root pom
The root pom is a classic multimodule pom, but it does not declare the shared folder (notice that the shared folder has no pom).
<modules>
    <module>v5</module>
    <module>v7</module>
</modules>
The shared folder
The shared folder stores all non-version-specific code and the tests. On top of that, when a version-specific class is needed, it does not code against the API of that class (e.g. it does not import org.apache.VersionSpecificStuff); it codes against com.mycompany.VersionSpecificStuffAdapter instead.
The implementation of this adapter is left to the version-specific folders.
Version specific folders
The v5 folder declares the Lucene version it compiles against in its artifact id, and of course declares that version as a dependency:
....
<artifactId>myartifact-lucene-5.5.0</artifactId>
....
<dependency>
    <groupId>org.apache.lucene</groupId>
    <artifactId>lucene-analyzers-common</artifactId>
    <version>5.5.0</version>
</dependency>
But the real "trick" is that it declares an external source folder for classes and tests using the build-helper-maven-plugin: see below how the source code from the shared folder is imported as if it were part of this project itself.
<build>
    <plugins>
        <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>build-helper-maven-plugin</artifactId>
            <version>3.0.0</version>
            <executions>
                <execution>
                    <id>add-5.5.0-src</id>
                    <phase>generate-sources</phase>
                    <goals>
                        <goal>add-source</goal>
                    </goals>
                    <configuration>
                        <sources>
                            <source>../shared/src/main/java</source>
                        </sources>
                    </configuration>
                </execution>
                <execution>
                    <id>add-5.5.0-test</id>
                    <phase>generate-test-sources</phase>
                    <goals>
                        <goal>add-test-source</goal>
                    </goals>
                    <configuration>
                        <sources>
                            <source>../shared/src/test/java</source>
                        </sources>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
For the whole implementation to work, it provides the Adapter implementations in its own source folder src/main/java, e.g.
package com.mycompany;

public class VersionSpecificStuffAdapter extends org.apache.VersionSpecificStuff {
}
If both the v5 and the v7 package do it the same way, then client code using the com.mycompany.xxxAdapter will always compile, and under the hood, get the corresponding implementation of the library.
This is one way to do it. You can also, as already suggested, define your own new interfaces and have clients of your lib code against those interfaces. This is somewhat cleaner, but depending on the case, may imply more work.
In your edit, you mention referring to constants that are not defined the same way, e.g. CellType.TYPE_XX.
In the version-specific code, you could either produce another constant, MyCellType.TYPE_XX, that duplicates the actual constant under a stable name.
Or, in the case of an enum, you could create a CellTypeChecker util with a method isCellTypeXX(cell), implemented in a version-specific way; a small sketch follows below.
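For instance, here is a hedged sketch of that CellTypeChecker idea, mirroring the two POI spellings quoted in the question. The interface and class names are made up for illustration, and each version-specific module compiles its own copy of the implementation against the POI version it supports:

// In the shared folder: the only type the shared code refers to
public interface CellTypeChecker {
    boolean isStringCell(org.apache.poi.ss.usermodel.Cell cell);
}

// In the old-API module, compiled against the POI version with int cell type constants
public class CellTypeCheckerImpl implements CellTypeChecker {
    @Override
    public boolean isStringCell(org.apache.poi.ss.usermodel.Cell cell) {
        return cell.getCellType() == org.apache.poi.ss.usermodel.Cell.CELL_TYPE_STRING; // old int constant
    }
}

// In the new-API module, compiled against the POI version with the CellType enum
public class CellTypeCheckerImpl implements CellTypeChecker {
    @Override
    public boolean isStringCell(org.apache.poi.ss.usermodel.Cell cell) {
        return cell.getCellType() == org.apache.poi.ss.usermodel.CellType.STRING; // new enum
    }
}

Shared code only ever calls checker.isStringCell(cell), so it compiles the same way no matter which version-specific module it is built with.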
v7 folder
It's pretty much the same structure, you just swap what changed between v5 and v7.
Caveats
This may not always scale.
If you have hundreds of types you need to adapt, this is cumbersome to say the least.
If you have 2 or more libs you need to cross-compile against (e.g. mylib-poi-1.0-lucene-5.5-guava-19-....) it's a mess.
If you have final classes to adapt, it gets harder.
You have to test to make sure every JAR has all the adapters. I do that by testing each Adapted class in the shared test folder.
It looks like it is possible to get the path/to/a/dependency.jar as an expandable variable within a Maven pom.xml: see Can I use the path to a Maven dependency as a property? You can expand, e.g., an expression into a string like /home/pascal/.m2/repository/junit/junit/3.8.1/junit-3.8.1.jar.
What I want instead of the full path to the dependency JAR within my local Maven repository is just the bare name of the JAR, for example junit-3.8.1.jar.
So for example, within my pom.xml, I would like to be able to use a value like ${maven.dependency.junit.junit.jar.name} to expand to junit-3.8.1.jar.
Can I do this, and how?
You can use the maven-antrun-plugin to get the file name of a dependency. Ant has a <basename> task which extracts the file name from a path. As described in Can I use the path to a Maven dependency as a property? the full path name of a dependency is available in ant as ${maven.dependency.groupid.artifactid.type.path}. This enables us to extract the file name with the ant task like this:
<basename file="${maven.dependency.groupid.artifactid.type.path}" property="dependencyFileName" />
This stores the file name in a property named dependencyFileName.
In order to make this property available in the POM, the exportAntProperties configuration option of the maven-antrun-plugin needs to be enabled. This option is only available as of version 1.8 of the plugin.
This example shows the plugin configuration for retrieving the artifact file name of the junit dependency:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-antrun-plugin</artifactId>
    <version>1.8</version>
    <executions>
        <execution>
            <phase>initialize</phase>
            <configuration>
                <exportAntProperties>true</exportAntProperties>
                <tasks>
                    <basename file="${maven.dependency.junit.junit.jar.path}"
                              property="junitArtifactFile"/>
                </tasks>
            </configuration>
            <goals>
                <goal>run</goal>
            </goals>
        </execution>
    </executions>
</plugin>
No, I'm sorry to say that it isn't possible. So, you have two options before you.
1) modify the maven source code and contribute the modification.
2) write your own plug-in.
I recommend the second option. Writing plug-ins is not that hard. As a philosophical principle, select a frequently-used plug-in which has functionality close to what you want to accomplish. Read and understand the code, and then modify it to do what you desire.
So for your example, you might look at the filter plugin. There's also some interesting syntax going on in the Ant plugin. It allows you to name dependencies and get those jar filenames into the embedded Ant script.
Good luck. :-)
As a more practical alternative, you might just break down and manually code the property value with the exact version number you're using. You're not going to switch the version number that often, right? And this is only one jar you're dealing with, right?
I'm trying to upgrade our Spring version and use the Spring IO Platform BOM to do so, but a few of our classes have gone missing (moved into other artifacts) or are no longer dependencies of something I was pulling in. I'm trying to find out which package they were originally part of (one example is CSVStrategy). Some of these dependencies, such as WhitespaceTokenizer, have over a dozen artifact names that could be supplying them, and in order to find the correct upgrade path I need to figure out where each is currently coming from.
One possible way could be to get the resource (class) location. If the class comes from a jar file you would at least get the jar name. From that you should be able to identify the maven artifact.
someClass.getProtectionDomain().getCodeSource().getLocation().toURI();
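As a small, self-contained sketch of how that one-liner might be wrapped to trace a class (the class name WhereIsIt and the command-line-argument approach are just illustrative; note that getCodeSource() can return null for JDK classes):

import java.net.URI;

public class WhereIsIt {
    public static void main(String[] args) throws Exception {
        // pass the fully-qualified name of the class you are tracing as the first argument
        Class<?> target = Class.forName(args[0]);
        URI location = target.getProtectionDomain().getCodeSource().getLocation().toURI();
        System.out.println(target.getName() + " was loaded from " + location);
    }
}

The printed URI usually ends with the jar file name, which is normally enough to identify the Maven artifact.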
Or with a ResourceLoader and a logger you could print a list of all classes on the classpath / servlet-path.
@Autowired
ResourceLoader resourceLoader;

public void printResourceLocations() throws IOException {
    PathMatchingResourcePatternResolver resolver = new PathMatchingResourcePatternResolver(resourceLoader);
    Resource[] resources = resolver.getResources("classpath*:com/**/*.class");
    for (Resource resource : resources) {
        log.info(resource.getURI().toString());
        // Not sure if that works, probably getFile() is ok?
    }
}
I have used JBoss Tattletale for this type of task in the past. I don't think it's being actively maintained any longer, however it still works for me. Here's the config I use. Note, I had to add this to my POM's build section, even though the goal 'report' seems to imply it is a report plugin.
<plugin>
    <groupId>org.jboss.tattletale</groupId>
    <artifactId>tattletale-maven</artifactId>
    <!-- The version of the plugin you want to use -->
    <version>1.2.0.Beta2</version>
    <executions>
        <execution>
            <goals>
                <goal>report</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <!-- This is the location which will be scanned for generating tattletale reports -->
        <source>${project.build.directory}/${project.artifactId}/WEB-INF/lib</source>
        <!-- This is where the reports will be generated -->
        <destination>${project.build.directory}/site/tattletale</destination>
    </configuration>
</plugin>
You could also try jHades. I haven't had a chance to use it yet, it is on my list of things to investigate.
Attempting to modify an existing Java/Tomcat app for deployment on Heroku following their tutorial and running into some issues with AppAssembler not finding the entry class. Running target/bin/webapp (or deploying to Heroku) results in Error: Could not find or load main class org.stopbadware.dsp.Main
Executing java -cp target/classes:target/dependency/* org.stopbadware.dsp.Main runs properly however. Here's the relevant portion of pom.xml:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>appassembler-maven-plugin</artifactId>
    <version>1.1.1</version>
    <configuration>
        <assembleDirectory>target</assembleDirectory>
        <programs>
            <program>
                <mainClass>org.stopbadware.dsp.Main</mainClass>
                <name>webapp</name>
            </program>
        </programs>
    </configuration>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>assemble</goal>
            </goals>
        </execution>
    </executions>
</plugin>
My guess is mvn package is causing AppAssembler to not use the correct classpath, any suggestions?
Your artifact's packaging must be set to jar, otherwise the main class is not found.
<pom>
    ...
    <packaging>jar</packaging>
    ...
</pom>
The artifact itself is added at the end of the classpath, so nothing other than a JAR file will have any effect.
Try:
mvn clean package jar:jar appassembler:assemble
Was able to solve this by adding "$BASEDIR"/classes to the CLASSPATH line in the generated script. Since the script gets rewritten on each call of mvn package I wrote a short script that calls mvn package and then adds the needed classpath entry.
Obviously a bit of a hack, but after 8+ hours of attempting a more "proper" solution this will have to do for now. Will certainly entertain any more elegant ways of correcting the classpath suggested here.
I was going through that tutorial some time ago and had very similar issue. I came with a bit different approach which works for me very nicely.
First of all, as mentioned before, you need to keep your POM's packaging as jar (<packaging>jar</packaging>) - thanks to that, the appassembler plugin will generate a JAR file from your classes and add it to the classpath, so your error will go away.
Please note that in this tutorial Tomcat is instantiated from the application source directory. In many cases that is enough, but please note that with that approach you will not be able to utilize Servlet @WebServlet annotations, as /WEB-INF/classes in the sources is empty and Tomcat will not be able to scan your servlet classes. So the HelloServlet servlet from that tutorial will not work, unless you add some additional Tomcat initialization (resource configuration) as described here (BTW, you will find more SO questions talking about that resource configuration).
I took a slightly different approach:
I run the org.apache.maven.plugins:maven-war-plugin plugin (exploded goal) during the package phase and use the generated directory as the source directory of the application. With that approach my web application directory will have /WEB-INF/classes "populated" with classes. That in turn allows Tomcat to perform the scanning job correctly (i.e. Servlet @WebServlet annotations will work).
I also had to change the source of my application in the launcher class:
public static void main(String[] args) throws Exception {
    // Web application is generated in directory name as specified in build/finalName
    // in maven pom.xml
    String webappDirLocation = "target/embeddedTomcatSample/";
    Tomcat tomcat = new Tomcat();
    // ... remaining code does not change
Changes I made to the POM: I added the maven-war-plugin just before the appassembler plugin:
...
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-war-plugin</artifactId>
    <version>2.5</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>exploded</goal>
            </goals>
        </execution>
    </executions>
</plugin>
...
Please note that the exploded goal is called.
I hope that small change will help you.
One more comment on that tutorial and the Maven build: note that the tutorial was written to show how simple it is to build an application and run it on Heroku. However, that is not the best approach to the Maven build.
Maven's recommendation is that you should adhere to producing one artifact per POM. In your case there should be two artifacts:
Tomcat launcher
Tomcat web application
Both should be built as separate POMs and referenced as modules from your parent POM. If you look at the complexity of that tutorial, it does not make much sense to split it into two modules. But if your application gets more and more complex (and the launcher gets some additional configuration etc.) it will make a lot of sense to make that "split". As a matter of fact, there are some "Tomcat launcher" libraries already created, so alternatively you could use one of them.
You can set the CLASSPATH_PREFIX environment variable:
export CLASSPATH_PREFIX=target/classes
which will get prepended to the classpath of the generated script.
The first thing is that you are using an old version of the appassembler-maven-plugin; the current version is 1.3.
What I don't understand is why you are defining the
<assembleDirectory>target</assembleDirectory>
folder. There is a good default value for that, so usually you don't need it. Apart from that, you don't need to define an explicit execution bound to the package phase, because the appassembler-maven-plugin is bound to the package phase by default.
Furthermore you can use the useWildcardClassPath configuration option to make your classpath shorter.
<configuration>
    <useWildcardClassPath>true</useWildcardClassPath>
    <repositoryLayout>flat</repositoryLayout>
    ...
</configuration>
As for the generated script showing that error when called: it happens when the location of the repository folder containing all the dependencies is different from the location defined in the generated script.
I am using the Google Reflections library for querying certain resources in the classpath. Those resources are located in the same location than the classes in my project.
I wrote some unit tests that succeed when executed as a unit test in Eclipse, but when I try to execute them with Maven (with a maven install for example), they are not working as expected.
After some debugging, apparently the problem is that when executed with Maven, the Reflections library cannot find the classpath url where the resources are located.
I arrived at that conclusion by researching how Reflections determines the classpath URLs that should be inspected. As an example, the following method shows how Reflections finds the available classpath URLs given a class loader (the original Reflections method has been simplified a bit):
public static Set<URL> forClassLoader(ClassLoader... classLoaders) {
    final Set<URL> result = Sets.newHashSet();
    for (ClassLoader classLoader : classLoaders) {
        while (classLoader != null) {
            if (classLoader instanceof URLClassLoader) {
                URL[] urls = ((URLClassLoader) classLoader).getURLs();
                if (urls != null) {
                    result.addAll(Sets.<URL>newHashSet(urls));
                }
            }
            classLoader = classLoader.getParent();
        }
    }
    return result;
}
In short, it is traversing the class loader hierarchy asking for the URLs of each individual classloader.
When in Eclipse I invoke the previous method from a unit test with something like this:
// <MyClass> is in the same classpath url as the resources I need to find
ClassLoader myClassClassLoader = <MyClass>.class.getClassLoader();
Set<URL> urls = forClassLoader(myClassClassLoader);
for (URL url : urls) {
    System.out.println("a url: " + url);
}
as expected, I can see (among many other URLs) the classpath URLs that are configured as part of my project:
file:<MY_PROJECT_PATH>/target/classes/
file:<MY_PROJECT_PATH>/target/test-classes/
and Reflections works like a charm (the resources Reflections should find are located in file:<MY_PROJECT_PATH>/target/classes/).
However, when the test is executed by Maven, I realized that these URL entries are missing from the set returned by the forClassLoader method, and the rest of the Reflections methods are not working as expected for this problem.
The "surprising" thing is that if I write this when the unit test is executed by maven:
ClassLoader myClassClassLoader = <MyClass>.class.getClassLoader();
URL url = myClassClassLoader.getResource("anExistingResource");
System.out.println("URL: " + url); // a valid URL
I can see that the class loader still can resolve the resource I am trying to find.
I am puzzled about why, when executed with Maven, the forClassLoader method does not include my project's classpath URLs in the returned set, while at the same time the class loader is able to resolve resources located in those URLs(!).
What is the reason for this behavior? Is there any workaround I can try to make the Reflections library work when invoked as part of a unit test run by Maven?
Solved it. Posting the solution in case someone finds the same problem in the future.
When executing the unit tests of a project, Maven does not (explicitly) include all of its dependencies in the classpath.
Instead, it declares a dependency on a temporary jar located at "target/surefire/surefirebooter_NUMBER_THAT_LOOKS_LIKE_TIME_STAMP.jar".
This jar contains only a manifest file that declares a classpath for the project.
The method forClassLoader in the Reflections library does not return a set of URLs with the effective classpath (i.e., classpath entries in manifest files are ignored).
To overcome this, I just implemented this simple method:
public static Set<URL> effectiveClassPathUrls(ClassLoader... classLoaders) {
    return ClasspathHelper.forManifest(ClasspathHelper.forClassLoader(classLoaders));
}
The method forManifest (also part of the Reflections library) adds, to the set of classpath URLs passed as a parameter, the missing classpath entries declared in the manifest files of any jar files contained in that set. In this way the method returns a set of URLs with the effective classpath of the project.
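As a rough usage sketch (assuming the Reflections 0.9.x ConfigurationBuilder API; the factory class and method names here are made up for illustration), the helper can be fed into a Reflections instance like this:

import org.reflections.Reflections;
import org.reflections.util.ClasspathHelper;
import org.reflections.util.ConfigurationBuilder;

public class ReflectionsFactory {
    public static Reflections forEffectiveClassPath(ClassLoader... classLoaders) {
        // forManifest expands Class-Path entries found in jar manifests (such as surefirebooter's),
        // so Reflections scans the same classpath the tests actually run with
        return new Reflections(new ConfigurationBuilder()
                .setUrls(ClasspathHelper.forManifest(ClasspathHelper.forClassLoader(classLoaders))));
    }
}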
You're probably using M2Eclipse, which adds stuff to the classpath on its own. Command-line Maven works differently. You might find some options that will help.
I had the exact same problem. Adding the following URL did the trick for me:
ConfigurationBuilder cb = new ConfigurationBuilder();
cb.setUrls(...);
cb.addUrls(YourClassName.class.getProtectionDomain().getCodeSource().getLocation());
I just encountered the same problem with Reflections library (version 0.9.11), only when executing unit tests from Maven builds. The link provided in the accepted answer pointed me in the right direction.
A simple POM file change to my Surefire plugin fixed this:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-surefire-plugin</artifactId>
    <version>2.21.0</version>
    <configuration>
        <useSystemClassLoader>false</useSystemClassLoader>
    </configuration>
</plugin>
<plugin>
    <artifactId>maven-failsafe-plugin</artifactId>
    <version>2.21.0</version>
    <configuration>
        <useSystemClassLoader>false</useSystemClassLoader>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>integration-test</goal>
                <goal>verify</goal>
            </goals>
        </execution>
    </executions>
</plugin>
The <useSystemClassLoader> config parameter defaults to 'true'. Forcing it to 'false' seems to resolve the classloader problem in my unit tests.
There are a few issues you can create that make Surefire fail.
there is a naming convention; test suites should be called 'TestBlaBla' or 'BlaBlaTest'; beginning or ending with the word 'Test'.
as mentioned earlier, the classpath in Maven is more restricted than it is in Eclipse as Eclipse (stupidly) does not separate the compile classpath from the test classpath.
Surefire is free to run test cases from different test suites in any order. When running multiple test suites that initialize some common base (such as an in-memory database or a JNDI context), that can create conflicts where test suites start to influence each other. You need to take care to properly isolate test suites. Tricks I use are separate in-memory databases per suite, and initializing shared things per unit test instead of per test suite (a small sketch follows below).
Number 3 is the hardest to debug, I can tell you; whenever something works in Eclipse and not in Maven, I naturally assume I'm doing something wrong in isolating the test suites.
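For what it's worth, here is a minimal sketch of the "per-test, uniquely named in-memory database" trick, assuming JUnit 4 and the H2 driver are on the test classpath (the class and table names are just illustrative):

import java.sql.Connection;
import java.sql.DriverManager;

import org.junit.Before;
import org.junit.Test;

public class RepositoryIsolationTest {
    private Connection db;

    @Before
    public void setUp() throws Exception {
        // per-test initialization: every test gets its own uniquely named in-memory database,
        // so the order in which Surefire runs suites and tests cannot matter
        db = DriverManager.getConnection("jdbc:h2:mem:test-" + System.nanoTime());
    }

    @Test
    public void startsFromACleanDatabase() throws Exception {
        // would fail with "table already exists" if another test had shared this database
        db.createStatement().execute("CREATE TABLE customer(id INT)");
    }
}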