NoClassDefFoundError due to different versions of jars - java

I am working on a Maven product that uses a commons-collections jar with version v3.2.1, which gets downloaded from our repository. Throughout the project we are using v1.1. Now I have to use a third-party jar which uses commons-collections v3.2.2, due to which I'm getting a NoClassDefFoundError.
Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/commons/collections/map/ReferenceMap
I can't change the version in my project. How can I solve this issue?

I can't change the version in my project, which is v1.1
Now I have to use a third-party jar which uses commons-collections v3.2.2
You have a hard choice to make. Either change (upgrade) the version in your project, or don't use the 3rd-party library. (This assumes that the 3rd-party library's dependency is a hard one ... which seems likely if API classes have been moved, etcetera.)
The first alternative is probably better. The longer you stay on an outdated version of the commons-collections library, the more problems like this you will encounter.
Actually, there is a third possibility, but it is asking for trouble. You could try to build your own version of commons-collections that is compatible with both v1.1 and v3.2.2. But here's the problem:
You are buying into extra work maintaining this custom version of commons-collections for as long as you need it in your codebase. (And that could be a long time if versions of your code are long-lived; e.g. if they are released to customers who have long-term support requirements.)
It might not work. Suppose that one part of the code requires ReferenceMap in one package, and another part requires it in another package.
Another possibility (another bad idea!) might be to do tricky things with classloaders, but that can lead to problems as well. If two versions of the same class are loaded by different class loaders into an application, the type system will insist that they are different types. They won't be assignment compatible. Type-casts will fail unexpectedly, etcetera.
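To make the class-loader pitfall concrete, here is a minimal sketch (the jar path below is made up; any jar containing the class would do) showing that the same class loaded through two independent URLClassLoaders ends up as two distinct, incompatible types:

import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Paths;

public class ClassLoaderIsolationDemo {
    public static void main(String[] args) throws Exception {
        // Hypothetical jar location; point this at a jar you actually have on disk.
        URL[] jars = { Paths.get("lib/commons-collections-3.2.2.jar").toUri().toURL() };

        // Two independent loaders with no shared parent for this class.
        ClassLoader a = new URLClassLoader(jars, null);
        ClassLoader b = new URLClassLoader(jars, null);

        Class<?> fromA = a.loadClass("org.apache.commons.collections.map.ReferenceMap");
        Class<?> fromB = b.loadClass("org.apache.commons.collections.map.ReferenceMap");
        Object instanceFromA = fromA.getDeclaredConstructor().newInstance();

        System.out.println(fromA == fromB);                  // false: same bytes, different defining loaders
        System.out.println(fromB.isInstance(instanceFromA)); // false: not assignment compatible
        // fromB.cast(instanceFromA) would throw ClassCastException.
    }
}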

It seems the class has been moved to commons-collections4:
<!-- https://mvnrepository.com/artifact/org.apache.commons/commons-collections4 -->
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-collections4</artifactId>
<version>4.1</version>
</dependency>
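Note that the package name changes along with the artifact, so this is not a drop-in replacement. A small sketch, assuming the commons-collections4 dependency above is on the classpath:

// Old location (commons-collections 3.x): org.apache.commons.collections.map.ReferenceMap
// New location (commons-collections4):    org.apache.commons.collections4.map.ReferenceMap
import org.apache.commons.collections4.map.ReferenceMap;

public class ReferenceMapCheck {
    public static void main(String[] args) {
        ReferenceMap<String, String> map = new ReferenceMap<>();
        map.put("answer", "42");
        System.out.println(map.get("answer"));
    }
}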

Related

Is it possible to solve org.jetbrains.kotlin:kotlin-stdlib vulnerability from OkHttp?

I'm using Snyk service to check my projects for vulnerabilities.
Projects with OkHttp dependency have one common vulnerability:
Vulnerable module: org.jetbrains.kotlin:kotlin-stdlib
Introduced through: com.squareup.okhttp3:okhttp#4.10.0
You can check the full report here: https://snyk.io/test/github/yvasyliev/deezer-api
In Overview section there is a note:
Note: As of version 1.4.21, the vulnerable functions have been marked as deprecated. Due to still being useable, this advisory is kept as "unfixed".
I have two questions:
Can I fix this vulnerability in Maven project and how?
If the vulnerability cannot be fixed, does that mean that every single Kotlin application has this vulnerability by default (since it's coming from kotlin-stdlib)?
The latest stable version of OkHttp is added to the project by Maven:
<dependency>
<groupId>com.squareup.okhttp3</groupId>
<artifactId>okhttp</artifactId>
<version>4.10.0</version>
</dependency>
As with all vulnerable software libraries, you need to assess whether or not you're actually affected by the vulnerability that is included.
Details on the vulnerability are listed in your Snyk report.
The problematic functions are createTempDir and createTempFile from the package kotlin.io. As outlined in your report as well as the Kotlin documentation, these functions are a possible source of information leakage, because the created file / directory has overly broad permissions; that is, everyone with access to the file system can read the files.
Is this a problem?
If you (and any dependencies you're including in your software) are NOT using one of the aforementioned functions, you're not vulnerable.
Also, if you (or the dependency) adjust the file permissions after calling one of these functions and before writing any information, you're not affected.
Even if the functions are used and the permissions are not adjusted, that still might not pose a problem, as long as the data stored in the files does not need to be protected, i.e. is NOT secrets or personal information.
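For illustration only (this is not from the Snyk report, and the class name and prefixes below are made up): one common mitigation is to create temporary files and directories through java.nio.file with explicit owner-only permissions instead of kotlin.io's createTempDir / createTempFile. The POSIX permission attributes are only supported on POSIX file systems:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.FileAttribute;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Set;

public class OwnerOnlyTempFiles {
    public static void main(String[] args) throws IOException {
        // Owner-only permissions: rwx------ for the directory, rw------- for the file.
        FileAttribute<Set<PosixFilePermission>> dirPerms =
                PosixFilePermissions.asFileAttribute(PosixFilePermissions.fromString("rwx------"));
        FileAttribute<Set<PosixFilePermission>> filePerms =
                PosixFilePermissions.asFileAttribute(PosixFilePermissions.fromString("rw-------"));

        Path tempDir = Files.createTempDirectory("app-", dirPerms);               // prefix is illustrative
        Path tempFile = Files.createTempFile(tempDir, "data-", ".tmp", filePerms);

        System.out.println("Created " + tempFile + ", readable only by the owner");
    }
}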
To address your questions directly:
Unfortunately, there is no easy way to fix this. You would either have to use a version of kotlin-stdlib where the functions were not yet introduced, exclude kotlin-stdlib from your classpath entirely, or use a version where the functions are no longer included; the latter is not released yet. However, options 1 and 2 do not make any sense, because if the software keeps working, that means no one is using the functions and you're not affected anyway.
No and yes. Everyone relying on kotlin-stdlib in one of the affected versions has the functions on their classpath. However, as long as they are not used, or the usage does not pose a problem as explained above, the software is not vulnerable.
The OkHttp project seems to know of the vulnerability, but seems not to be affected.

How do big companies tackle the package dependency conflict problem?

As shown in the picture, one app (Java) references two third-party jars (packageA and packageB), which reference packageC-0.1 and packageC-0.2 respectively. It would work well if packageC-0.2 were compatible with packageC-0.1. However, sometimes packageA uses something that is not supported in packageC-0.2, and Maven will only put one version of the jar on the classpath. This issue is also known as "Jar Hell".
It would be difficult in practice to rewrite packageA or force its developers to update packageC to 0.2.
How do you tackle these problems? This often happens in large-scale companies.
I should point out that this problem mostly occurs in BIG companies, because a big company has many departments and it would be very expensive to make the whole company update a dependency every time some developers want to use new features from a newer version of a jar. This is not a big deal in small companies.
Any response will be highly appreciated.
Let me throw out a brick to attract a jade, i.e. offer a rough idea first in the hope of drawing out better ones.
Alibaba is one of the largest e-commerce companies in the world, and we tackle these problems by creating an isolation container named Pandora. Its principle is simple: package the middleware together and load each piece with a different ClassLoader, so that they can work well together even when they reference the same packages with different versions. But this needs a runtime environment provided by Pandora, which runs as a Tomcat process. I have to admit that this is a heavyweight approach. Pandora is built on the fact that the JVM identifies a class by its class loader plus its class name.
If you know someone who might know the answer, please share the link with them.
We are a large company and we have this problem a lot. We have large dependency trees that span several developer groups. What we do:
We manage versions through BOMs (Maven dependencyManagement lists) of "recommended versions" that are published by the maintainers of the jars (a sketch of how such a BOM is consumed follows below). This way, we make sure that recent versions of the artifacts are used.
We try to reduce the large dependency trees by separating the functionality that is used inside a developer group from the functionality that the group offers to other groups.
But I admit that we are still trying to find better strategies. Let me also mention that using "microservices" is a strategy against this problem, but in many cases it is not a valid strategy for us (mainly because we would no longer be able to have global transactions on databases).
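As a rough sketch of the BOM approach (the coordinates below are invented placeholders), the consuming project imports the published BOM into its dependencyManagement and then declares the managed dependencies without versions:

<dependencyManagement>
  <dependencies>
    <!-- Placeholder coordinates: a BOM published by the jar maintainers -->
    <dependency>
      <groupId>com.mycompany.platform</groupId>
      <artifactId>recommended-versions-bom</artifactId>
      <version>1.2.3</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>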
This is a common problem in the Java world.
Your best option is to regularly maintain and update the dependencies of both packageA and packageB.
If you have control over those applications, make time to do it. If you don't have control, demand that the vendor or author make regular updates.
If both packageA and packageB are used internally, you can use the following practice: have all internal projects in your company refer to a parent pom.xml that defines "up to date" versions of commonly used third-party libraries.
For example:
<framework.jersey>2.27</framework.jersey>
<framework.spring>4.3.18.RELEASE</framework.spring>
<framework.spring.security>4.2.7.RELEASE</framework.spring.security>
Therefore, if your projects "A" and "B" both use Spring and both use the latest version of your company's "parent" POM, they should both end up on 4.3.18.RELEASE.
When a new version of Spring is released and desirable, you update your company's parent POM and have all other projects move to that latest version.
This will solve many of these dependency mismatch issues.
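For example (spring-context is just one illustrative artifact), the parent POM can wire those properties into its dependencyManagement so that child projects inherit the versions:

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-context</artifactId>
      <version>${framework.spring}</version>
    </dependency>
  </dependencies>
</dependencyManagement>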
Don't worry, it's common in the Java world; you're not alone. Just google "jar hell" and you can understand the issue in the broader context.
By the way, mvn dependency:tree is your friend for isolating these dependency problems.
I agree with the answer of JF Meier. In a Maven multi-module project, a dependencyManagement node is usually defined in the parent POM to manage versions in one place. The dependencies declared inside dependencyManagement fix the versions once, so modules that declare those dependencies directly do not need to repeat the version. It looks like this:
In the parent POM:
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.devzuz.mvnbook.proficio</groupId>
      <artifactId>proficio-model</artifactId>
      <version>${project.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>
In your module, you do not need to set the version:
<dependencies>
  <dependency>
    <groupId>com.devzuz.mvnbook.proficio</groupId>
    <artifactId>proficio-model</artifactId>
  </dependency>
</dependencies>
This avoids version inconsistencies.
This question can't be answered in general.
In the past we usually just didn't use dependencies with different versions. If a version changed, team- or company-wide refactoring was necessary. I doubt using two versions at once is possible with most build tools.
But to answer your question:
Simple answer: Don't use two versions of one dependency within one compilation unit (usually a module)
But if you really have to do this, you could write a wrapper module that references the legacy version of the library.
But my personal opinion is that within one module there should not be the need for these constructs because "one module" should be relatively small to be manageable. Otherwise it might be a strong indicator that the project could use some modularization refactoring. However, I know very well that some projects of "large-scale companies" can be a huge mess where no 'good' option is available. I guess you are talking about a situation where packageA is owned by a different team than packageB... and this is generally a very bad design decision due to the lack of separation and inherent dependency problems.
First of all, try to avoid the problem. As mentioned in Henry's comment, don't use third-party libraries for trivial tasks.
However, we all use libraries. And sometimes we end up with the problem you describe, where we need two different versions of the same library. If library 'C' has removed and added some APIs between the two versions, and the removed APIs are needed by 'A', while 'B' needs the new ones, you have an issue.
In my company, we run our Java code inside an OSGi container. Using OSGi, you can modularize your code into "bundles", which are jar files with some special directives in their manifest file. Each bundle jar has its own classloader, so two bundles can use different versions of the same library. In your example, you could split the application code that uses 'packageA' into one bundle, and the code that uses 'packageB' into another. The two bundles can call each other's APIs, and it will all work fine as long as your bundles do not use 'packageC' classes in the signatures of the methods used by the other bundle (known as API leakage).
To get started with OSGi, you can e.g. take a look at OSGi enRoute.

NoSuchMethodError - method's return type has changed - Want to accept both types

Background
I have a commons library that I have to update. This commons library has a third party dependency (jgroups) which was changed significantly in newer versions. Through transitive dependencies, the newer version of jgroups is sometimes required and this breaks the commons library. I need to update some classes for compatibility with newer versions, while maintaining backwards compatibility.
The Problem
JGroups provides a View class, which has a method getMembers(). In the old version (2.10.0) this method returns Vector<Address>, and in the newer version (3.2.7) it returns List<Address>. Any implementation of java.util.Collection will work for me, but the problem is that I'm getting a NoSuchMethodError. As I understand it, the call site expects the legacy Vector<Address> return type (based on the JGroups dependency the commons library was compiled against), but I am dragging in a newer JGroups version whose View class returns List<Address> from getMembers().
Stacktrace
I get the following error when starting up my application in Eclipse.
Caused by: java.lang.NoSuchMethodError: org.jgroups.View.getMembers()Ljava/util/Vector;
at com.mycompany.commons.messaging.events.impl.distributed.JGroupsEventDistributionProvider$JGroupsEventReceiver.viewAccepted(JGroupsEventDistributionProvider.java:136) ~[classes/:na]
at org.jgroups.JChannel.invokeCallback(JChannel.java:752) ~[jgroups-3.2.7.Final.jar:3.2.7.Final]
at org.jgroups.JChannel.up(JChannel.java:710) ~[jgroups-3.2.7.Final.jar:3.2.7.Final]
at org.jgroups.stack.ProtocolStack.up(ProtocolStack.java:1020) ~[jgroups-3.2.7.Final.jar:3.2.7.Final]
at org.jgroups.protocols.pbcast.FLUSH.up(FLUSH.java:466) ~[jgroups-3.2.7.Final.jar:3.2.7.Final]
....
Where it breaks
Collection<Address> viewMembers = view.getMembers();
Question
Is it possible to support both versions, even though they are different implementations of Collection? How can I handle this scenario where I don't know the method return type until runtime?
Note:
I have tried to exclude the older version of JGroups that is being pulled in by adding an exclusion in my maven pom. This has not worked.
<dependency>
<groupId>com.mycompany.commons</groupId>
<artifactId>mycompany-commons-event-distributed-jgroups</artifactId>
<!-- Note: JGroups dependency is provided by infinispan -->
<version>1.0.2-SNAPSHOT</version>
<type>jar</type>
<scope>compile</scope>
<exclusions>
<exclusion>
<groupId>org.jgroups</groupId>
<artifactId>jgroups</artifactId>
</exclusion>
</exclusions>
</dependency>
How about using reflection? The field View.members is a Vector in 2.10.x and an Address[] array in 3.x. You could access the field View.members and, depending on its type, return all members as a collection of addresses. Not nice, but it should work.
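For illustration, a rough sketch of that reflection approach (not from the original answer; the field name "members" is taken from the statement above, so verify it against the JGroups versions you actually ship):

import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collection;

import org.jgroups.Address;
import org.jgroups.View;

public final class ViewMembers {

    // Reads View.members reflectively so the caller does not depend on the
    // compile-time return type of getMembers() (Vector in 2.10.x, List in 3.x).
    @SuppressWarnings("unchecked")
    public static Collection<Address> membersOf(View view) {
        try {
            Field membersField = View.class.getDeclaredField("members"); // field name assumed, see above
            membersField.setAccessible(true);
            Object value = membersField.get(view);

            if (value instanceof Collection) {
                // 2.10.x: a Vector<Address> (Vector implements Collection)
                return new ArrayList<>((Collection<Address>) value);
            }
            if (value instanceof Address[]) {
                // 3.x: a plain Address[] array
                return Arrays.asList((Address[]) value);
            }
            throw new IllegalStateException("Unexpected type for View.members: "
                    + (value == null ? "null" : value.getClass().getName()));
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException("Cannot read View.members reflectively", e);
        }
    }
}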
If you are using "plain java" for the app framework, I think you are generally out of luck.
Without using some sort of module framework such as OSGi, you only have a single pool of classes, and every class loaded onto your classpath goes into that pool. This means you can't have multiple versions of the same JAR or the same class in your JVM at the same time.
Also, you effectively need both versions of the same JAR, because you have at least two pieces of code, each compiled against a different version: one expects a return value of Vector and the other of List. So even if you could isolate the undesired version from your build environment, the code built against it would then not link to the proper binary and you would continue to get the runtime error you see.
Unfortunately, you mentioned this is a "library" and not just a single app, which may make it even more difficult to apply a solution. Off the top of my head, I see these options for you moving forward, none of which are trivial and some may not be possible:
downgrade your code as necessary so that only a single version of jgroups appears in all dependency chains
rearchitect your app to use OSGi or a similar framework that supports multiple versions of the same library, so dependency chains may diverge
rearchitect your app and split it into multiple applications that each run in their own JVM, communicating via sockets or any other means
For example, we have used the third option to split out a small portion of an app so it could depend on libraries with licenses unfriendly to our entire codebase but that portion could be licensed along with the library.
I am also not sure how the Java 9 module system would behave with this, but it might support multiple versions of the same module in the runtime simultaneously. If it's an option for you to use that beta or investigate, that may be worth your effort. However, you mentioned that the point was backward compatibility, so that may not be a viable option either.

Why does Maven require the same version of different dependencies?

I'm a student with quite some experience in Java but totally new to Maven.
I was trying to implement a RESTful service provider and client with jersey-server and jersey-client. Both also depend on jersey-json, to make use of the automatic conversion between POJOs and JSON. Both of them also depend on a service model I implemented myself, where the POJO definitions reside.
However, the code didn't work for me. I spent quite a few hours looking for solutions everywhere on the Internet. It turns out the reason for the failure is that I accidentally specified the version of jersey-server and jersey-client as 1.14, but jersey-json as 1.9.1.
The server didn't work at the beginning, but at some point suddenly started working. (I have no idea how this happened.) The client never worked until I changed the jersey-json version to 1.14.
Why do I need to have the same version for these different dependencies?
Because one depends on the other or otherwise has a compatibility issue. This is what dependency management is all about. Run mvn dependency:tree to see exactly how these libraries relate to each other.
In this case, it seems Jersey libraries are all released together as a "bundle" - and you need to use the versions from those bundles together. See: http://jersey.java.net/nonav/documentation/latest/chapter_deps.html
Note that this is an attribute of the Jersey libraries, not Maven.
Often different jars from the same distribution are tested together and given the same version number.
If you try to mix different versions it might work, or it might not, as it's not a combination that was intended or tested.

Declaring JAXB as a dependency. Why?

Why would you want to declare JAXB as a dependency of your Java application, since it's shipped with the JRE anyway and cannot be overridden in your application's classpath?
Using jersey-json as an example: in their POM file they declare a dependency on jaxb-impl without even specifying the exact version. What do they gain by doing this?
Also, when I add a dependency on jersey-json in my own POM file, I end up having jaxb-api.jar and jaxb-impl.jar in my classpath. Why would I want this to happen? Isn't the default JVM implementation getting loaded anyway if I don't put the files in the endorsed libraries directory?
I see this in the linked POM:
<dependency>
<groupId>com.sun.xml.bind</groupId>
<artifactId>jaxb-impl</artifactId>
</dependency>
But there is also a parent pom, so you might want to follow that trail up, to see if an actual version and/or scope was specified there (and propagated down).
But if it isn't, then I'm guessing the Jersey team felt that the JAXB 2 API is mature enough to allow them to specify a loose dependency on the JAXB implementation.
The JRE ships with a specific JAXB implementation, which was equivalent to JAXB RI 2.1.7 for the longest time. At the same time, there is a mechanism to easily swap in another implementation of your choice in your apps.
Sure, you can use the built-in JRE JAXB implementation, and if it works for your app, you surely should.
Nonetheless, some reasons that could lead you to want a separate JAXB implementation include: a left-over bug (addressed in a newer release); needing a newer JAXB API such as JAXB 2.2.x (which only comes with more recent versions of the JRE); wanting to use a different implementation altogether (because it happens to have a better API and/or performance for your particular usage); etc.
So back to your jersey question, I'm guessing again that they wanted to give developers the flexibility to pull in their JAXB impl of choice. I'd think they have some level of recommendations somewhere in their guides.
However, the fact that the JAXB RI is specifically marked as a dependency undercuts that argument.
That sounds like the Jersey people defined their dependency wrong. If a dependency is provided by the environment where you run, it should be declared with the scope "provided".
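A sketch of what that would look like in the POM (the version number here is only an example):

<dependency>
  <groupId>com.sun.xml.bind</groupId>
  <artifactId>jaxb-impl</artifactId>
  <version>2.2.11</version>
  <scope>provided</scope>
</dependency>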
