Maven subdependency conflict at runtime with Twilio - java

Getting a ClassNotFoundException at runtime due to a Maven sub-dependency issue:
I am working on integrating the Twilio SDK (com.twilio.sdk:twilio:7.35.0) into a multi-module Maven (3.x) / Java 8 project.
I first added the Twilio Maven dependency to the corresponding module,
and I am getting a ClassNotFoundException at runtime for org.apache.http.conn.HttpClientConnectionManager.
I looked into it and found out that this class is part of org.apache.httpcomponents:httpclient (which is a sub-dependency of the Twilio SDK) and that an earlier version of this dependency is in my project.
This earlier version does not have the HttpClientConnectionManager class.
From this point, I tried to exclude the old version of the dependency, first with an <exclusion> tag and then with the Maven Enforcer Plugin, while at the same time declaring the dependency directly, but nothing worked (the exclusion attempt is sketched below).
I also tried declaring the dependency in the parent POM and in the other modules that use my Twilio module.
I am using Twilio 7.35, which uses org.apache.httpcomponents:httpclient:4.5.6, but in my multi-module project I am also using org.apache.cassandra:cassandra-thrift:3.0.0, which depends on thrift:0.9.2, which contains the old version of httpclient (4.2.5).
The latest version of this Cassandra module does not support the latest version of httpclient, so I need to make sure this older httpclient dependency does not mess up the Twilio one.
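For reference, the exclusion attempt looked roughly like this (simplified; in the real project the declarations are spread across modules):

<dependency>
    <groupId>org.apache.cassandra</groupId>
    <artifactId>cassandra-thrift</artifactId>
    <version>3.0.0</version>
    <exclusions>
        <!-- keep thrift's old httpclient 4.2.5 out of the tree -->
        <exclusion>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpclient</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<!-- and declare the version Twilio needs directly -->
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.5.6</version>
</dependency>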
I also analysed the output of mvn dependency:tree -Dverbose, and it seems that 4.5.6 is getting picked up correctly. And when I tried adding it to the parent module or the calling module, I can see that the old version is getting overridden by the Twilio one, but it does not solve my issue.
I am starting to wonder if it is even possible to have two versions of the same dependency in the same Maven project.

It sounds like you are experiencing something similar to a related question dealing with jar hell: Jar hell: how to use a classloader to replace one jar library version with another at runtime
In this case you need to use a classloader separate from the default one in your project. Perhaps you could use a URLClassLoader and load some or all of your newer dependencies from the filesystem.
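A minimal sketch of that idea (the jar path is illustrative; in practice anything that needs the newer httpclient, including the Twilio classes, would also have to be loaded through the same classloader):

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

public class IsolatedLoaderDemo {
    public static void main(String[] args) throws Exception {
        URL[] urls = { new File("/path/to/httpclient-4.5.6.jar").toURI().toURL() };
        // null parent: do not delegate to the application classpath,
        // so the old httpclient 4.2.5 can never shadow this one
        try (URLClassLoader isolated = new URLClassLoader(urls, null)) {
            Class<?> cm = isolated.loadClass("org.apache.http.conn.HttpClientConnectionManager");
            System.out.println("Loaded from: "
                    + cm.getProtectionDomain().getCodeSource().getLocation());
        }
    }
}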

Related

NoSuchMethodError: org.apache.hadoop.conf.Configuration.getPassword

I have been struggling with this NoSuchMethodError in a Spark project for a while now without getting anywhere. Currently, this project is running locally using SparkNLP 3.3.0 and Spark-Core/SparkMLLib 3.1.2, both with Scala 2.12.4. Hadoop 3.2.0 is pulled in as a transitive dependency via spark-core.
What I have tried so far:
check that this method is indeed present by stepping through the code
verify uniform Scala version across all dependencies
verify that the Spark and Hadoop versions are the same throughout (using mvn dependency:tree and the Enforcer Plugin; see the sketch after this list)
manually remove other versions of Hadoop from local .m2 directory
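The enforcer check was along the lines of the dependencyConvergence rule (a minimal sketch; the plugin version shown is illustrative):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <version>3.0.0</version>
    <executions>
        <execution>
            <id>enforce-convergence</id>
            <goals>
                <goal>enforce</goal>
            </goals>
            <configuration>
                <rules>
                    <!-- fail the build if the tree requires two different
                         versions of the same artifact -->
                    <dependencyConvergence/>
                </rules>
            </configuration>
        </execution>
    </executions>
</plugin>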
The code is running from an executable JAR which pulls in other jars to the classpath that are provided at runtime. Java version is 1.8.0_282. Maven is version 3.6.3. OS is Big Sur, 11.6 (M1).
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.getPassword(Ljava/lang/String;)[C
at org.apache.spark.SSLOptions$.$anonfun$parse$8(SSLOptions.scala:188)
at scala.Option.orElse(Option.scala:289)
at org.apache.spark.SSLOptions$.parse(SSLOptions.scala:188)
at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:98)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:252)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:458)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2672)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:945)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:939)
[...]
at com.mymainpackage.Main.main(Main.java:157)
I have finally been able to figure this one out.
The root cause was that an older version of hadoop-core was being pulled in (1.2.1 instead of 2.6.5), which in fact does not have the Configuration.getPassword() method. I found this out after setting up a test project where the SparkContext was initialized correctly and then checking the source jars of the two Configuration classes in the two projects (using Configuration.class.getProtectionDomain().getCodeSource().getLocation().getPath()).
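A self-contained version of that check:

import org.apache.hadoop.conf.Configuration;

public class WhichJar {
    public static void main(String[] args) {
        // prints the jar that Configuration was actually loaded from at runtime
        System.out.println(Configuration.class.getProtectionDomain()
                .getCodeSource().getLocation().getPath());
    }
}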
After forcing version 2.6.5 using Maven's dependency management and manually deleting the older 1.2.1 jar from the local Maven repo, it worked fine.
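In POM terms the fix was essentially this (coordinates assumed from the artifact name above; adjust to your actual setup):

<dependencyManagement>
    <dependencies>
        <!-- pin the version used everywhere, including transitively -->
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-core</artifactId>
            <version>2.6.5</version>
        </dependency>
    </dependencies>
</dependencyManagement>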
The only thing that I still don't understand is why hadoop-core was not showing up in the Maven dependency tree. Otherwise, I would probably have found it sooner.

Maven different package name in newer dependency

The project I am working on produces a jar that I deploy on Azure, where Spark runs the job.
It uses an internal dependency A, which depends on org.apache.commons:commons-configuration:1.10, yet when I deploy on Azure the 2.1.1 version is used by default.
On Azure we have version 2.1.1, in which the package name (org.apache.commons.configuration2) differs from the 1.10 version (org.apache.commons.configuration).
So having this line in the dependency A caused an error when using the 2.1.1 version:
import org.apache.commons.configuration
It needs to have a "2" at the end, which I can't add since A is a dependency.
I tried excluding org.apache.commons:commons-configuration from A and then using the Maven Shade Plugin to rename the package, but the jar file became double its actual size; besides, the shaded jar is produced on its own, not inside the zip with the workflow and the sh file, which my team may not like.
Updating from commons-configuration 1 to 2 is a major change; the new version is not a drop-in replacement. As you have already pointed out, the top-level package changes, and this will most likely break library A. The correct solution will probably be to update library A to use commons-configuration 2.
You can still try to hack the Maven project setup to see if it works:
Exclude commons-configuration 1 from the library A dependency using an <exclusion> tag.
Add commons-configuration 2 as a direct project dependency with provided scope in module B. The provided scope is needed to avoid packaging the dependency.
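A rough sketch of both steps (library A's coordinates are placeholders; note that the 1.x artifact is often published under the commons-configuration groupId rather than org.apache.commons):

<dependency>
    <groupId>com.mycompany</groupId>
    <artifactId>library-a</artifactId>
    <version>1.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-configuration</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<!-- provided scope: compile against it, but do not package it -->
<dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-configuration2</artifactId>
    <version>2.1.1</version>
    <scope>provided</scope>
</dependency>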
If you want to avoid using the maven-shade-plugin, an alternative solution might be to:
Exclude commons-configuration 1 in the library A dependency declaration;
Work out which classes and methods from commons-configuration 1 library A uses (easy enough if you have the source code; otherwise a modern IDE will disassemble it for you);
Write your own versions of these classes and methods in your application that delegate to the commons-configuration2 implementation.
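For illustration, such a delegating shim might look like this (the class and methods shown are hypothetical; what you actually need depends on what library A calls):

// Lives in the package library A expects (configuration 1),
// but delegates to the commons-configuration2 implementation.
package org.apache.commons.configuration;

public class PropertiesConfiguration {

    private final org.apache.commons.configuration2.PropertiesConfiguration delegate =
            new org.apache.commons.configuration2.PropertiesConfiguration();

    public String getString(String key) {
        return delegate.getString(key);
    }

    public void setProperty(String key, Object value) {
        delegate.setProperty(key, value);
    }
}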
Note that commons-configuration2 is a part of the Apache Spark distribution and it cannot be ignored. It would need to be added to your project with <scope>provided</scope>.
If this is too hard then the maven-shade-plugin is your only viable solution.
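For completeness, the relocation the question attempted would look roughly like this (the shaded package name is a placeholder):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <relocations>
                    <relocation>
                        <pattern>org.apache.commons.configuration</pattern>
                        <shadedPattern>myapp.shaded.commons.configuration</shadedPattern>
                        <!-- configuration2 also matches the pattern prefix,
                             so keep it out of the rewrite -->
                        <excludes>
                            <exclude>org.apache.commons.configuration2.**</exclude>
                        </excludes>
                    </relocation>
                </relocations>
            </configuration>
        </execution>
    </executions>
</plugin>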

MessageBodyProviderNotFoundException after building with maven shade plugin [solved]

Hi, I'm building a Java REST client application that uses JSON. There's a conflict in my dependencies: one essential dependency requires Jackson databind/core/annotations 2.10.1, another essential dependency uses an older version, 2.2.2.
When running the app in Eclipse, this led to a MessageBodyProviderNotFoundException. The problem was that Maven included version 2.2.2 instead of 2.10.1, so I used dependency management to resolve the conflict and include version 2.10.1. Within Eclipse this solved the issue and the app runs fine.
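The pinning looks roughly like this (jackson-databind shown; jackson-core and jackson-annotations are pinned the same way):

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>jackson-databind</artifactId>
            <version>2.10.1</version>
        </dependency>
    </dependencies>
</dependencyManagement>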
But after building the app with the maven shade plugin, the standalone app still fails with the MessageBodyProviderNotFoundException. The exception is thrown in a part of the code requiring version 2.10.1, not in the part depending on version 2.2.2.
Any thoughts? Thanks in advance!
EDIT: I checked my local maven repository and it only contains version 2.10.1. So the problem is not that the jar somehow still includes version 2.2.2.
Problem solved. Running the app within Eclipse apparently doesn't require explicitly registering a JacksonJsonProvider, but running the standalone app does:
ClientConfig config = new ClientConfig()
        .connectorProvider(new ApacheConnectorProvider())
        .register(new JacksonJsonProvider());
Client client = ClientBuilder.newBuilder().withConfig(config) ...

Jersey dependency conflict

I'm implementing a Dropwizard server app on top of an existing project, building with Maven.
I'm currently in jar hell, and it's not fun. I have a pom file that's rather convoluted, and a big problem with Jersey dependencies packaged with hadoop/glassfish/com.sun: com.sun.jersey:jersey-core:jar:1.19 is conflicting with org.glassfish.jersey.core:jersey-client:jar:2.22.2. I tried adding some exclusions to make it use the 2.22.x version, but it's still giving me the error seen here. I've been combing through the dependency tree and adding exclusions where I see fit, but can't seem to get it right. Here is my pom file.
Personal experience tells me that you should check ALL your dependencies (especially the ones you developed in-house) for whether the old Jersey version is a dependency in there.
That's what solved a similar problem for me.
If you run:
mvn dependency:list -Dverbose
(grep to filter the results)
it gives you the list of dependencies (transitive ones included). Check the version of Sun Jersey or GlassFish Jersey that's being used in your application.
If you run:
mvn dependency:tree -Dverbose -Dincludes=jersey-server
you will see the graph of where each version of jersey-server is coming from.
I had a hadoop-client and an in-house rest-bus-client using a version of Sun Jersey (1.9.x) which I needed to remove. Excluding it in Maven simply worked.
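The exclusion looked roughly like this (the hadoop-client version is a placeholder):

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>2.7.3</version>
    <exclusions>
        <!-- keep Jersey 1.x off the classpath so the 2.22.x artifacts win -->
        <exclusion>
            <groupId>com.sun.jersey</groupId>
            <artifactId>jersey-core</artifactId>
        </exclusion>
        <exclusion>
            <groupId>com.sun.jersey</groupId>
            <artifactId>jersey-server</artifactId>
        </exclusion>
    </exclusions>
</dependency>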
Also, this Jersey version mismatch caused the following issue for me in Dropwizard:
java.lang.NoSuchMethodError: javax.ws.rs.core.Application.getProperties()Ljava/util/Map;

Gradle unable to find downloaded dependency packaged as OSGi bundle (non-OSGi app)

(Edited for clarification)
My (non-OSGi) application build is in Gradle, and I am trying to upgrade from a very old version of Jersey (1.1.4.1) to something much newer (1.12?). I do not pretend to know anything about using OSGi. But when I point my Gradle dependencies (with $JERSEY_VERSION set to "1.12") to:
[group: 'com.sun.jersey', name: 'jersey-server', version: "$JERSEY_VERSION"]
it downloads jersey-server-1.12.jar into my Gradle dependencies cache under a "bundles" directory instead of the normal "jars" directory, and then Gradle does not seem to include this jar in its classpath as it would if it were under a "jars" subdirectory.
I discovered it went under "bundles" because the POM has it labeled as an OSGi-enabled jar. I do not think we are going to want to OSGi-ify our project. Am I stuck with older versions of Jersey, or is there anything else I can do to get Gradle to see the Jersey jar? I would prefer not to manually copy the file to a local repo, but rather depend on the dependency management capabilities of Gradle if it is up to the task.
OSGi bundles are normal jars with extra manifest entries. You should be able to use them in a non-OSGi project as you would any other dependency. Is it a problem that they end up in the cache's bundles directory?
'Twas a silly oversight: moving from 1.1.4.1 to 1.12, the POM dependencies changed, so that jersey-core.jar was no longer being brought in implicitly. I had to add jersey-core.jar explicitly. I had assumed the problem was the fact that jersey-server.jar was being imported as a bundle, but I was really just getting a ClassNotFoundException for a class that was in jersey-core.jar.
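In Gradle terms the fix amounted to declaring both artifacts (same notation as above; 'compile' was the standard configuration for Gradle of that era):

dependencies {
    compile group: 'com.sun.jersey', name: 'jersey-server', version: "$JERSEY_VERSION"
    // no longer pulled in transitively by jersey-server after the upgrade,
    // so it has to be declared explicitly
    compile group: 'com.sun.jersey', name: 'jersey-core', version: "$JERSEY_VERSION"
}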
