Manage different cglib/ASM versions in Maven - Java

I have a multi-module Maven project that uses (amongst others) glassfish-jersey, jersey-moxy, wicket-ioc, lucene and lambdaj. These all pull in ASM, but each in a different version.
Lately I have been running into a lot of trouble when running my tests. A typical error I get is:
java.lang.VerifyError: class net.sf.cglib.core.DebuggingClassWriter overrides final method visit.(IILjava/lang/String;Ljava/lang/String;Ljava/lang/String;[Ljava/lang/String;)V
I have read that this can be caused by mixing different ASM versions. Is there a way to 'sandbox' these different ASM versions inside their respective dependencies, so they don't get mixed up?
Edit:
My current solution is to use jarjar, like this:
<build>
  <plugins>
    <plugin>
      <groupId>org.sonatype.plugins</groupId>
      <artifactId>jarjar-maven-plugin</artifactId>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>jarjar</goal>
          </goals>
          <configuration>
            <includes>
              <include>cglib:cglib-nodep</include>
            </includes>
            <rules>
              <rule>
                <pattern>net.sf.cglib.asm.**</pattern>
                <result>com.myproject.lambda4j.asm.@1</result>
              </rule>
              <rule>
                <pattern>net.sf.cglib.**</pattern>
                <result>com.myproject.lambda4j.cglib.@1</result>
              </rule>
            </rules>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

Try explicitly adding some cglib 3.* to your class path, as it uses a recent version of ASM. The problem you are encountering is that cglib modifies the behavior of the ASM class writer by inheritance rather than by delegation. ASM enforces the latter best practice from version 4.* onward by making its ClassWriter's methods final, whereas overriding them was still possible in version 3.*. The error you see is the result of combining cglib 2.* with ASM 4.*.
Fortunately (for you), cglib has been rather stable in its recent versions, i.e. there were only minor API changes while the newer releases mostly consisted of ASM updates. If you are lucky, explicitly depending on cglib 3.* therefore solves your problem. This holds as long as none of your project dependencies has a direct dependency on ASM, which seems plausible for the dependencies you named, such as Jersey or Lucene.
If this does not work, you need to recompile some of your dependencies with a tool like jarjar in order to repack their direct ASM dependencies into different namespaces, resolving the version conflicts. An alternative would be to isolate the different ASM versions with some conditional child-first ClassLoader magic, but that is not recommendable: the effects are unpredictable and it also incurs a performance penalty.
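For illustration, a minimal sketch of what pinning cglib could look like in the parent POM; the 3.x version number here is an assumption, check for the latest release:
<!-- Sketch: force a single cglib 3.x (bundling a recent ASM) across all
     modules. The version number is an assumption. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>cglib</groupId>
      <artifactId>cglib</artifactId>
      <version>3.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>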

Related

Maven shade plugin - don't relocate excluded/optional scope dependencies

I'm using maven shade plugin in order to create an uber jar while relocating all classes. That's because I'm getting an external jar and I don't want to have classpath collisions. So the idea is to create a new uber (relocated) jar and use it in my application.
So the shade plugin takes all the classes and relocates them under the new package prefix. My issue is that, as far as I understand it, the plugin also rewrites references to classes of dependencies that are not themselves included in the shaded jar [*].
Let's say I'm relocating all com to shade.com:
<executions>
  <execution>
    <id>rename-all</id>
    <phase>package</phase>
    <goals>
      <goal>shade</goal>
    </goals>
    <configuration>
      <shadedArtifactAttached>true</shadedArtifactAttached>
      <keepDependenciesWithProvidedScope>false</keepDependenciesWithProvidedScope>
      <relocations>
        <relocation>
          <pattern>com</pattern>
          <shadedPattern>shade/com</shadedPattern>
        </relocation>
      </relocations>
    </configuration>
  </execution>
</executions>
So if one of my dependencies, A, has an <optional> dependency on B (whose package is com.B), the plugin will rewrite the imports within A to point to shade.com.B. But com.B is optional, which means those classes won't be available under shade.com.B, because they never went through the relocation process. The classes actually available when using this jar are the "normal" ones - com.B.
And then I get class not found exceptions on shade.com.B classes when I try to use the shaded jar in my application.
Am I missing something in my understanding? Is there any solution to this?
[*] Some examples: I'm not sure yet about the exact cases where this happens. In my case, I depend on the spark-sql dependency. I dug into 3 of the classes where I saw this issue (there are many more):
org/apache/html/dom/HTMLIsIndexElementImpl imports org.w3c.dom.html.HTMLIsIndexElement which is in rt.jar - the import was relocated so now it points to shade.org.w3c.dom.html.HTMLIsIndexElement and can't be found.
io/netty/handler/codec/marshalling/ChannelBufferByteOutput imports org.jboss.marshalling.ByteOutput, so the relocation changes this to import shade.org.jboss.marshalling.ByteOutput. But as can be seen here, the dependency on jboss-marshalling is marked <optional>true</optional>, so it's not in the jar; ByteOutput itself is never relocated and hence won't be available.
jasper-runtime is included here but excluded in the upper one, here. So the outcome is that the files in hadoop-hdfs get relocated references to these dependencies, even though the classes will never be available under the relocated path since they aren't in the jar at all. E.g. org/apache/hadoop/hdfs/server/datanode/browseBlock_jsp now has a reference to "shade/org/apache/jasper/runtime/JspSourceDependent".
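One possible mitigation, assuming the affected packages are known up front, is to narrow the relocation with excludes so that references to classes that will never be inside the shaded jar are left untouched. A sketch, with com.B standing in for the optional dependency's package:
<!-- Sketch: exclude packages that won't be bundled from the relocation, so
     their imports keep pointing at the original names. com.B.** is a
     placeholder for the optional dependency's package. -->
<relocation>
  <pattern>com</pattern>
  <shadedPattern>shade.com</shadedPattern>
  <excludes>
    <exclude>com.B.**</exclude>
  </excludes>
</relocation>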

Different Java code compilation based on Dependency

I want to write a piece of Java code which can be executed against two different dependencies (or versions of a dependency), namely org.apache.poi. The code must run on one system with version 2 and on another with version 3 of org.apache.poi.
Unfortunately some interfaces changed between versions 2 and 3, the code must be built slightly differently, and there is no way to upgrade both systems to the same org.apache.poi version.
So my questions are:
Is there a way to compile the code against both versions without running into compiler errors?
Is there a way to execute the right code based on the available org.apache.poi version?
What would be an appropriate approach to solve this issue?
As an amendment:
I'm building code which shall work with two applications that provide this interface in different versions (the Maven scope of the dependency is provided).
If I declare both dependencies in Maven, it picks one of them, and one of the if clauses will fail to compile because either Cell.CELL_TYPE_STRING or CellType.STRING is not available in the chosen version.
I would like the code to work regardless of which dependency is plugged into the application.
// working with old poi interface
if (cell != null && cell.getCellType() == Cell.CELL_TYPE_STRING
        && cell.getRichStringCellValue().getString().trim().equals(cellContent)) {
    return row;
}
// working with new poi interface
if (cell != null && cell.getCellType() == CellType.STRING
        && cell.getRichStringCellValue().getString().trim().equals(cellContent)) {
    return row;
}
This is probably opinion-based, but it seems legitimate.
First, you will have to create a common interface that you will use to do your job.
Second, you will have to create adapter classes that implement that interface and do the required job using a particular version of the POI library.
Third, you will write an adapter factory that returns the proper adapter instance.
Each adapter should provide an "isSupported" method that detects whether that adapter can be used, based on what kind of POI classes are currently loaded (detected via reflection - there must be some version-specific classes or other markers).
Then you put each adapter into a separate Maven module, so each module can be compiled independently (thus you will have no class conflicts). Each module has the POI dependency in "provided" scope, in the version that its adapter is going to support.
Either each module registers itself with the factory in your main module, or the factory itself detects all available adapters (like @ComponentScan in Spring).
Then you pack everything into a single app bundle. The main module uses only the common interface. All in all, it will be a kind of extensible plugin system.
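A minimal sketch of this structure, with all class names hypothetical; the isSupported detection probes via reflection for a class that only exists in one POI version:
import java.util.Arrays;
import java.util.List;

// Common interface (lives in the shared module); the method set is whatever
// your common job needs.
public interface PoiAdapter {
    boolean isSupported();
}

// Factory (main module). Adapters could also register themselves or be
// discovered automatically; a hard-coded list keeps the sketch short.
final class PoiAdapterFactory {
    private static final List<PoiAdapter> ADAPTERS =
            Arrays.asList(new PoiV2Adapter(), new PoiV3Adapter());

    static PoiAdapter get() {
        for (PoiAdapter adapter : ADAPTERS) {
            if (adapter.isSupported()) {
                return adapter;
            }
        }
        throw new IllegalStateException("No supported POI version on classpath");
    }
}

// One version-specific module (PoiV2Adapter is analogous). isSupported()
// probes for a marker class that only exists in the newer POI.
final class PoiV3Adapter implements PoiAdapter {
    public boolean isSupported() {
        try {
            Class.forName("org.apache.poi.ss.usermodel.CellType");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }
}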
I do not think there is a single "best way".
Nonetheless, we faced a similar issue in a few of our apps that share a common library. I ended up with a variant of @Antoniossss's approach, except that the library itself does not use dependency injection (the parent app may or may not; the library is free of it).
To be more specific: due to transitive dependencies, some of our apps need a certain version of Apache Lucene (e.g. 7.x.y or later) while others are stuck on older versions (5.5.x).
So we needed a way to build one of our libs against each of those versions, using Maven in our case.
What we ended up with uses the following principles:
We share some code, which is common between all versions of Lucene
We have version-specific code for each target version of Lucene that has an incompatible API (e.g. package changes, non-existent methods, ...)
We build as many jars as there are supported versions of lucene, with a naming scheme such as groupId:artifact-luceneVersion:version
Where the lib is used, direct access to the Lucene API is replaced by access to our specific classes
For example, in Lucene v5 there is an org.apache.lucene.analysis.synonym.SynonymFilterFactory facility. In v7 the same functionality is implemented by org.apache.lucene.analysis.synonym.SynonymGraphFilterFactory, i.e. same package, but a different class.
What we end up providing is a com.mycompany.SynonymFilterFactoryAdapter. In the v5 JAR, this class extends the Lucene v5 class, and correspondingly for v7 or any other version.
In the final app, we always instantiate the com.mycompany object, that behaves just the same as the native org.apache class.
Project structure
The build system being maven, we build it as follow
project root
|- pom.xml
|-- shared
|---|- src/main/java
|---|- src/test/java
|-- v5
|---|- pom.xml
|-- v7
|---|- pom.xml
Root pom
The root pom is a classic multimodule pom, but it does not declare the shared folder (notice that the shared folder has no pom).
<modules>
  <module>v5</module>
  <module>v7</module>
</modules>
The shared folder
The shared folder stores all non-version-specific code and the tests. On top of that, when a version-specific class is needed, the shared code does not code against the API of that class (e.g. it does not import org.apache.VersionSpecificStuff); it codes against com.mycompany.VersionSpecificStuffAdapter instead.
The implementation of this adapter is left to the version-specific modules.
Version specific folders
The v5 folder declares in its artifact id the Lucene version it compiles against, and of course declares that version as a dependency:
....
<artifactId>myartifact-lucene-5.5.0</artifactId>
....
<dependency>
  <groupId>org.apache.lucene</groupId>
  <artifactId>lucene-analyzers-common</artifactId>
  <version>5.5.0</version>
</dependency>
But the real "trick" is that it declares an external source folder for classes and tests using the build-helper-maven-plugin: see below how the source code from the shared folder is imported "as if" it belonged to this project itself.
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>build-helper-maven-plugin</artifactId>
      <version>3.0.0</version>
      <executions>
        <execution>
          <id>add-5.5.0-src</id>
          <phase>generate-sources</phase>
          <goals>
            <goal>add-source</goal>
          </goals>
          <configuration>
            <sources>
              <source>../shared/src/main/java</source>
            </sources>
          </configuration>
        </execution>
        <execution>
          <id>add-5.5.0-test</id>
          <phase>generate-test-sources</phase>
          <goals>
            <goal>add-test-source</goal>
          </goals>
          <configuration>
            <sources>
              <source>../shared/src/test/java</source>
            </sources>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
For the whole implementation to work, each version module provides the adapter implementations in its own source folder src/main/java, e.g.
package com.mycompany;

public class VersionSpecificStuffAdapter extends org.apache.VersionSpecificStuff {
}
If both the v5 and the v7 package do it the same way, then client code using the com.mycompany.xxxAdapter will always compile, and under the hood, get the corresponding implementation of the library.
This is one way to do it. You can also, as already suggested, define your own brand-new interfaces and have clients of your lib code against those interfaces. That is somewhat cleaner but, depending on the case, may imply more work.
In your edit, you mention referring to constants that are not defined the same way across versions, e.g. CellType.TYPE_XX.
In the version-specific code, you could produce another constant, MyCellType.TYPE_XX, that duplicates the actual constant under a stable name.
In the case of an enum, you could create a CellTypeChecker util with a method isCellTypeXX(cell) that is implemented in a version-specific way.
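A sketch of that checker idea, mirroring the if clauses from the question; the interface lives in the shared code and each version-specific module ships its own implementation (class names are hypothetical, POI imports omitted):
// Shared code: no reference to the constant that moved between versions.
public interface CellTypeChecker {
    boolean isStringCell(Cell cell);
}
// Old-POI module, compiled against the int-constant API:
public class OldPoiCellTypeChecker implements CellTypeChecker {
    public boolean isStringCell(Cell cell) {
        return cell != null && cell.getCellType() == Cell.CELL_TYPE_STRING;
    }
}
// New-POI module, compiled against the enum API:
public class NewPoiCellTypeChecker implements CellTypeChecker {
    public boolean isStringCell(Cell cell) {
        return cell != null && cell.getCellType() == CellType.STRING;
    }
}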
v7 folder
It's pretty much the same structure, you just swap what changed between v5 and v7.
Caveats
This may not always scale.
If you have hundreds of types you need to adapt, this is cumbersome to say the least.
If you have 2 or more libs you need to cross-compile against (e.g. mylib-poi-1.0-lucene-5.5-guava-19-....) it's a mess.
If you have final classes to adapt, it gets harder.
You have to test to make sure every JAR has all the adapters. I do that by testing each Adapted class in the shared test folder.
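For instance, a test like this in the shared test folder (JUnit 4; the adapter name comes from the example above) fails in any version module that forgot to provide the adapter:
// Compiled and run once per version-specific module, since the shared test
// folder is added to every module; looking the class up by name keeps the
// test independent of constructor signatures that differ between versions.
import static org.junit.Assert.assertNotNull;
import org.junit.Test;

public class AdapterPresenceTest {
    @Test
    public void synonymFilterFactoryAdapterIsPresent() throws Exception {
        assertNotNull(Class.forName("com.mycompany.SynonymFilterFactoryAdapter"));
    }
}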

Strange exception / error in gwt build when I extend ArrayList or other common collections

I am using Eclipse IDE Version: Luna Service Release 1 (4.4.1).
Using GWT 2.7 and Java 1.7.
I have implemented a custom collection class using the replace-with rule provided by GWT.
public class CustomArrayList<E> extends ArrayList<E> {
    // ...... overridden methods here
}
ArrayList arrayList = GWT.create(ArrayList.class);
Whenever I create an instance of ArrayList using GWT.create, it uses my own CustomArrayList class.
<!-- gwt.xml contains this configuration -->
<replace-with class="xxxxxxxx.yyyyyyy.CustomArrayList">
  <when-type-is class="java.util.ArrayList" />
</replace-with>
In my GWT Maven project I instantiate the array list on the client side as shown above, but during project compilation I get a stack overflow error with AbstractTreeLogger-related messages printed to the console.
customcollection.CustomArrayList<java.lang.Object>
[INFO] [WARN] Checking all subtypes of Object which qualify for serialization
[ERROR] Exception in thread "main" java.lang.StackOverflowError
[ERROR] at com.google.gwt.dev.util.log.AbstractTreeLogger.commitMyBranchEntryInMyParentLogger(AbstractTreeLogger.java:252)
[ERROR] at com.google.gwt.dev.util.log.AbstractTreeLogger.commitMyBranchEntryInMyParentLogger(AbstractTreeLogger.java:252)
[ERROR] at
It produces this error repeatedly, at least a few thousand times, and then the compiler crashes.
After the failure, if I try again it sometimes works, but not every time.
Can anyone help me figure out the issue?
Just a guess, but since GWT needs to make sure that every class that could ever be contained in a List is Serializable... By creating a custom List, you just doubled the compiler's work, and now it's hitting a memory limit.
You might be able to solve this by increasing the memory available to the compiler, but your best choice is to be more specific with the types that you use.
See How can I keep GWT from trying to include every serializable class when I use ArrayList and gwt - Using List<Serializable> in a RPC call?
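To illustrate "be more specific with the types": a sketch of a GWT RPC interface that uses a concrete element type, so the compiler only needs serializers for that one class (PersonDto is a made-up DTO implementing IsSerializable):
import java.util.ArrayList;
import com.google.gwt.user.client.rpc.RemoteService;

// Sketch: declaring ArrayList<PersonDto> instead of List<Serializable>
// keeps the set of types GWT must analyze for serialization small.
// PersonDto is a hypothetical class implementing IsSerializable.
public interface PersonService extends RemoteService {
    ArrayList<PersonDto> fetchPeople();
}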
Finally, an updated plugin configuration that addresses the various memory requirement issues seems to work:
<!-- GWT Maven Plugin -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>gwt-maven-plugin</artifactId>
  <version>2.7.0</version>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <!-- <goal>test</goal> -->
      </goals>
    </execution>
  </executions>
  <!-- Plugin configuration. There are many available options, see gwt-maven-plugin
       documentation at codehaus.org -->
  <configuration>
    <runTarget>TrupublicBuilder.html</runTarget>
    <modules>
      <module>com.trupublic.TrupublicBuilder</module>
    </modules>
    <!-- extraJvmArgs is a single string parameter, so all JVM flags belong
         in one element -->
    <extraJvmArgs>-Xms512m -Xmx8192m -Xss2048k -XX:PermSize=1024m -XX:MaxPermSize=8192m</extraJvmArgs>
  </configuration>
</plugin>
We were missing the -Xss configuration; this was aptly pointed out by Jens in the GWT users Google group here.

How to properly use the server stubs generated from a swagger specification?

I'm using Swagger 2.0 and swagger-codegen (actually the swagger-codegen plugin for Maven) to specify, document and generate an API, with Java as the target language.
The project is already set up to build the server stubs (JAX-RS) and documentation, and Eclipse recognizes the generated code on the project build path.
I'm not sure what the proper workflow is from here. :-/
I don't think I should modify the generated classes, otherwise my changes would be overwritten whenever I change the Swagger spec, and I expect it will change as I think more about the API as development goes on.
What should I do then? Inherit from the generated classes (which ones?) or include them in my own classes?
There are two steps to the solution here.
Add **/*Controller.java or **/*Impl.java to the .swagger-codegen-ignore file. Depending on the language used, the default implementation is provided in a *Controller.java or *Impl.java file. Once the default implementation is excluded from generation, you can implement the generated interfaces in your own class. The code in your own class will then not get overwritten on mvn clean.
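For reference, the resulting .swagger-codegen-ignore can be as small as this (a sketch using the .gitignore-like syntax of that file; adjust the patterns to what your generator actually emits):
# .swagger-codegen-ignore (sketch): regenerate everything except the
# default implementations, which are maintained by hand.
**/*Controller.java
**/*Impl.java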
The .swagger-codegen-ignore file itself is auto-generated, hence whatever you add in step 1 gets overwritten when you do a mvn clean. To avoid this, keep your version of .swagger-codegen-ignore in your resources folder and add the plugin below to your pom, to copy the file at the start of the Maven lifecycle:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-resources-plugin</artifactId>
  <executions>
    <execution>
      <id>copy-resources</id>
      <phase>initialize</phase>
      <goals>
        <goal>copy-resources</goal>
      </goals>
      <configuration>
        <outputDirectory>${project.build.directory}/generated/swagger</outputDirectory>
        <resources>
          <resource>
            <directory>${basedir}/src/main/resources</directory>
            <includes>
              <include>.swagger-codegen-ignore</include>
            </includes>
          </resource>
        </resources>
      </configuration>
    </execution>
  </executions>
</plugin>
I believe you will need to update the Impl classes, e.g. PetApiServiceImpl.
If you want to skip certain files (e.g. Impl classes) during code regeneration, you can add the files to .swagger-codegen-ignore.
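A hedged sketch of such an Impl class, assuming the JAX-RS server stubs from the petstore sample; the exact generated class names and method signatures vary between codegen versions:
import javax.ws.rs.core.Response;

// Sketch: hand-written implementation of a generated service class. The
// PetApiService base class and the method signature are assumptions; adapt
// them to what your generator actually produced. This file is listed in
// .swagger-codegen-ignore, so regeneration leaves it alone.
public class PetApiServiceImpl extends PetApiService {
    @Override
    public Response getPetById(Long petId) {
        // real logic goes here
        return Response.ok().build();
    }
}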

using maven coordinate style episodes in wsimport

I'm building (multiple) complex web services with base XSD types from all kinds of standards (GML, SWE, XLINK, etc.). Now I would like to break up the compilation into more steps, preferably one for each of the standards I'm using.
Advantages:
1) I can create tooling libraries for each of the standards that I can re-use in all of my web services.
2) I can make use of the power of the JAXB2 Basics plugin, which seems to work very nicely with the maven-jaxb2-plugin (org.jvnet.jaxb2.maven2), and create for instance interface bindings. This is in contrast with the jaxws-maven-plugin.
The final step would be using the org.jvnet.jax-ws-commons:maven-jaxb2-plugin to create the actual web service that I can implement in an EJB (or call as a client).
Now, the org.jvnet.jaxb2.maven2:maven-jaxb2-plugin allows me to refer to episodes by means of their Maven coordinates, as part of its configuration, like this:
<episodes>
  <episode>
    <groupId>org.example</groupId>
    <artifactId>jaxb2-basics-test-episodes-a</artifactId>
  </episode>
</episodes>
How can I do this by means of the org.jvnet.jax-ws-commons:maven-jaxb2-plugin? I've searched a lot, and experimented like this:
<plugin>
  <groupId>org.jvnet.jax-ws-commons</groupId>
  <artifactId>maven-jaxb2-plugin</artifactId>
  <version>2.1</version>
  <executions>
    <execution>
      <goals>
        <goal>wsimport</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <wsdlDirectory>src/main/resources/</wsdlDirectory>
    <wsdlFiles>
      <wsdlFile>example.wsdl</wsdlFile>
    </wsdlFiles>
    <xjcArgs>
      <xjcArg>-b</xjcArg>
      <xjcArg>../cpt-xsd/target/generated-sources/xjc/META-INF/sun-jaxb.episode</xjcArg>
    </xjcArgs>
    <verbose>true</verbose>
  </configuration>
</plugin>
This takes the episode file from the target dir of the (compiled) JAXB-dependent project. It sometimes even fails the Maven build (why, I have not yet figured out).
I've tried to use catalog files to set up a mapping (I think I saw somewhere a catalog mapping that took Maven coordinates as the destination), but I haven't succeeded yet.
Are you aware of the OGC Schemas and Tools Project? (Disclaimer: I'm the author.)
Now, to your question. My guess is that org.jvnet.jax-ws-commons:maven-jaxb2-plugin does not support the "Maven coordinates" as you call them. This was a feature I specifically implemented for my org.jvnet.jaxb2.maven2:maven-jaxb2-plugin (disclaimer: I'm the author).
On the other hand, an episode file is nothing but a JAXB binding file. So you can simply extract this file from the JAR artifact (for instance using the maven-dependency-plugin) and then include it more or less like you already do. Just don't point to directories in other modules; that is not reliable.
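A sketch of that extraction using the maven-dependency-plugin; the coordinates and paths below are illustrative placeholders, not taken from your build:
<!-- Sketch: unpack the episode file from the dependency JAR before wsimport
     runs; coordinates are placeholders. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack-episodes</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>unpack</goal>
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>org.example</groupId>
            <artifactId>cpt-xsd</artifactId>
            <version>${project.version}</version>
          </artifactItem>
        </artifactItems>
        <includes>META-INF/sun-jaxb.episode</includes>
        <outputDirectory>${project.build.directory}/episodes</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
The unpacked ${project.build.directory}/episodes/META-INF/sun-jaxb.episode can then be passed to wsimport via the -b xjcArg, instead of pointing into a sibling module's target directory.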
