NoSuchMethodError: org.apache.hadoop.conf.Configuration.getPassword - java

I have been struggling with this NoSuchMethodError in a Spark project for a while now without getting anywhere. The project currently runs locally using Spark NLP 3.3.0 and spark-core/spark-mllib 3.1.2, all built against Scala 2.12.4. Hadoop 3.2.0 is pulled in as a transitive dependency via spark-core.
What I have tried so far:
- checked that the method is indeed present, by stepping through the code
- verified a uniform Scala version across all dependencies
- verified that the Spark and Hadoop versions are the same throughout (using the Maven dependency tree and the Enforcer plugin)
- manually removed other versions of Hadoop from the local .m2 directory
The code runs from an executable JAR which pulls other JARs, provided at runtime, onto the classpath. Java is 1.8.0_282, Maven is 3.6.3, and the OS is macOS Big Sur 11.6 (M1).
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.getPassword(Ljava/lang/String;)[C
at org.apache.spark.SSLOptions$.$anonfun$parse$8(SSLOptions.scala:188)
at scala.Option.orElse(Option.scala:289)
at org.apache.spark.SSLOptions$.parse(SSLOptions.scala:188)
at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:98)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:252)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:458)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2672)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:945)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:939)
[...]
at com.mymainpackage.Main.main(Main.java:157)

I have finally been able to figure this one out.
The root cause was that an older version of hadoop-core was being pulled in (1.2.1 instead of 2.6.5), which in fact does not have the Configuration.getPassword() method. I found this out by setting up a test project in which the SparkContext initialized correctly, and then comparing the source JARs of the Configuration class in the two projects (using Configuration.class.getProtectionDomain().getCodeSource().getLocation().getPath()).
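For reference, here is a minimal sketch of that check (the class and main method are hypothetical scaffolding; the call chain itself is standard Java):

import org.apache.hadoop.conf.Configuration;

public class WhereIsMyClass {
    public static void main(String[] args) {
        // Prints the path of the JAR that actually provided Configuration,
        // which makes a stale version on the classpath easy to spot.
        System.out.println(Configuration.class
                .getProtectionDomain()
                .getCodeSource()
                .getLocation()
                .getPath());
    }
}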
After forcing version 2.6.5 through Maven's dependency management, and after manually deleting the older 1.2.1 JAR from the local Maven repo, it works fine.
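A sketch of that pin, using the hadoop-core coordinates named above (adjust the artifactId to whatever your dependency tree actually shows):

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-core</artifactId>
      <version>2.6.5</version>
    </dependency>
  </dependencies>
</dependencyManagement>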
The only thing I still don't understand is why hadoop-core was not showing up in the Maven dependency tree; otherwise I would (probably) have found it sooner.

Related

Maven Eclipse plugin bug? failing to auto-add in-workspace dependency JAR to Run Configuration module-path

So here is a quick rundown of my situation:
I have two Java projects: one in Java 8 (so not modular) and one in Java 11 that is modular.
The modular/non-modular distinction may not be relevant, but I've stated it for the sake of clarity.
For reference, the Java 8 project is a game library I made, and the Java 11 project is the game implementation I'm making.
I need to reference the Java 8 library from my Java 11 game project.
Both projects are Maven projects, and I have my dependency defined in my game's POM file.
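For reference, the dependency entry is the usual kind of thing (a sketch; the coordinates are placeholders):

<dependency>
  <groupId>com.example</groupId>
  <artifactId>my-library</artifactId>
  <version>1.0.0</version>
</dependency>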
I'm using the latest version of Eclipse (2020-03, 4.15.0) and Maven 3.6.3, with Java 11.0.7 (Oracle JDK).
My Problem:
My understanding is that my Java 8 library project becomes an automatic module. Adding it to my Java 11 game project's module-info file works (with a warning about the automatic module name being unstable, but no error), and I can compile my game project code with no issues in Eclipse.
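For illustration, the module-info entry has this shape (module names here are placeholders; the automatic module name is derived from the library's JAR file name):

module my.game {
    requires my.library;
}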
When I attempt to run the game, I get: Module <my-library> not found, required by <my-game>. Since Maven is managing the dependencies, this should just work.
How can I get my game to run?
I Can Fix It Three Ways...
First, I can manually add the library project's JAR file (in its target folder) to the Run Configuration module-path of my game project.
Second, I can delete the library project from my workspace. Maven then fetches the JAR from the local m2 repo (it has been installed with mvn install), and in this situation Maven DOES automatically add the JAR to the Run Configuration module-path correctly.
Third, I can change the version of the library project in its POM file, which, like option two, means the in-workspace project no longer satisfies the dependency and Maven looks for the JAR in the local m2 repo.
But...
All three of these options seem to me like they should be unnecessary. This feels like a bug with Maven failing to add the in-workspace project dependency to the module path in the Run Configuration in Eclipse.
To be fair, it is a Maven Eclipse plugin (m2e) feature that automatically detects when an in-workspace project is a dependency and uses that "live" version instead of the m2 repo version. This is very handy when development on a library happens in parallel.
But until this bug is fixed (or unless it's not a bug and I'm missing something), it will keep causing frustration. I've posted this in hopes of helping anyone else who may be facing the same issue.

Maven subdependency conflict at runtime with Twilio

I am getting a class-not-found error at runtime due to a Maven sub-dependency issue:
I am working on integrating the Twilio SDK (com.twilio.sdk:twilio:7.35.0) into a multi-module Maven (3.x) / Java 8 project.
I first added the Twilio Maven dependency to the corresponding module, and I am getting a class-not-found exception at runtime on org.apache.http.conn.HttpClientConnectionManager.
I looked into it and found that this class is part of org.apache.httpcomponents:httpclient (a sub-dependency of the Twilio SDK), and that an earlier version of this dependency is in my project.
This earlier version does not have the HttpClientConnectionManager class.
From this point, I tried to exclude the old version of the dependency, first with an <exclusion> tag (a sketch follows below), then with the Maven Enforcer plugin, while at the same time importing the dependency directly, but nothing worked.
I also tried importing the dependency in the parent POM and in the other modules that use my Twilio module.
I am using Twilio 7.35, which uses org.apache.httpcomponents:httpclient:4.5.6, but in my multi-module project I am also using org.apache.cassandra:cassandra-thrift:3.0.0, which depends on thrift 0.9.2, which contains the old version of httpclient (4.2.5).
The latest version of this Cassandra module does not support the latest version of httpclient, so I need to make sure the older httpclient dependency does not mess up the Twilio one.
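For reference, the exclusion attempt looks like this (a sketch; coordinates taken from the description above):

<dependency>
  <groupId>org.apache.cassandra</groupId>
  <artifactId>cassandra-thrift</artifactId>
  <version>3.0.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.httpcomponents</groupId>
      <artifactId>httpclient</artifactId>
    </exclusion>
  </exclusions>
</dependency>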
I also analysed the output of mvn dependency:tree -Dverbose, and it seems that 4.5.6 is getting picked up correctly. When I tried adding it to the parent module or the calling module, I could see the old version being overridden by the Twilio one, but it did not solve my issue.
I am starting to wonder if it is even possible to have two versions of the same dependency in one Maven project.
It sounds like you are experiencing something similar to a related question dealing with Jar Hell: Jar hell: how to use a classloader to replace one jar library version with another at runtime
In this case you need to use a separate classloader from your project's default one. Perhaps you could use a URLClassLoader and load some or all of your newer dependencies from the filesystem.
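A minimal sketch of that idea, assuming the newer httpclient JAR sits at a known filesystem path (the path and class name below are illustrative):

import java.net.URL;
import java.net.URLClassLoader;

public class IsolatedLoaderDemo {
    public static void main(String[] args) throws Exception {
        URL[] jars = { new URL("file:/path/to/httpclient-4.5.6.jar") };
        // A null parent skips the application classpath entirely, so the old
        // httpclient 4.2.5 on the default classloader cannot shadow this one.
        try (URLClassLoader loader = new URLClassLoader(jars, null)) {
            Class<?> cls = loader.loadClass(
                    "org.apache.http.conn.HttpClientConnectionManager");
            System.out.println("Loaded from: "
                    + cls.getProtectionDomain().getCodeSource().getLocation());
        }
    }
}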

What is an OSGI version qualifier

I need to confirm something I suspect, as I cannot find any documentation on it, so this may seem a silly question; I am still learning Eclipse PDE.
Initially,
I had the parent POM of an Eclipse plugin project with
<version>1.1.0-SNAPSHOT</version>
and two child projects, whose POMs refer to the parent POM as version 1.1.0-SNAPSHOT.
I was able to build the projects successfully and had a site which I use to install the plugin into eclipse.
Then I wanted a personal temp version called 1.1.1-mine, so I modified the three POMs to
1.1.1-mine
I also updated META-INF/MANIFEST.MF and feature.xml from
1.1.0.qualifier
to
1.1.1.qualifier
However, the build encountered the following error.
[ERROR] Failed to execute goal org.eclipse.tycho:tycho-packaging-plugin:0.15.0:validate-version (default-validate-version) on project org.sonatype.m2e.subclipse: OSGi version 1.1.1.qualifier in META-INF/MANIFEST.MF does not match Maven version 1.1.1-mine in pom.xml
Does the qualifier have to map to a Maven version keyword? The build proceeded without error after I changed -mine to -SNAPSHOT in the POMs.
If not, what did I do wrong?
What can I do to allow me to have version 1.1.1-mine?
In a nutshell, OSGi .qualifier means the same thing as -SNAPSHOT.
Since OSGi doesn't allow for more than 3 numbers in a version (+ qualifier), creating a -mine version is a bit tricky.
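Concretely, Tycho's version validation expects the Maven and OSGi versions to line up like this (a sketch using the versions from this question):

pom.xml: <version>1.1.1-SNAPSHOT</version>
META-INF/MANIFEST.MF: Bundle-Version: 1.1.1.qualifier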
According to the FAQ, you can tell Tycho a string that it should use to replace the qualifier with:
mvn -DforceContextQualifier=mine
Note that this disables all the goodness you get from SNAPSHOT versions (namely that you can deploy the bundle several times).

Gradle unable to find downloaded dependency packaged as OSGi bundle (non-OSGi app)

(Edited for clarification)
My (non-OSGi) application builds with Gradle, and I am trying to upgrade from a very old version of Jersey (1.1.4.1) to something much newer (1.12?). I do not pretend to know anything about using OSGi. But when I point my Gradle dependencies (with $JERSEY_VERSION set to "1.12") to:
[group: 'com.sun.jersey', name: 'jersey-server', version: "$JERSEY_VERSION"]
it downloads jersey-server-1.12.jar into my Gradle dependency cache under a "bundles" directory instead of the normal "jars" directory, and Gradle then seems not to include this JAR on its classpath the way it would if the JAR were under a "jars" subdirectory.
I discovered it went under "bundles" because the POM labels it as an OSGi-enabled JAR. I do not think we are going to want to OSGi-ify our project. Am I stuck with older versions of Jersey, or is there anything else I can do to get Gradle to see the Jersey JAR? I would prefer not to copy the file manually to a local repo if possible, but rather rely on Gradle's dependency management capabilities if they are up to the task.
OSGi bundles are normal JARs with extra manifest entries. You should be able to use them in a non-OSGi project as you would any other dependency. Is it a problem that they end up in the cache's bundles directory?
'Twas a silly oversight: moving from 1.1.4.1 to 1.12, the POM dependencies changed, so jersey-core.jar was no longer being brought in implicitly; I had to add jersey-core explicitly. I had assumed the problem was that jersey-server.jar was being imported as a bundle, but I was really just getting a ClassNotFoundException for a class that lives in jersey-core.jar.
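In the same notation as the dependency line in the question, the fix amounts to one extra entry (a sketch; jersey-core is the artifact named above):

[group: 'com.sun.jersey', name: 'jersey-core', version: "$JERSEY_VERSION"]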

Why do my tests fail to run when migrating from maven2 to buildr?

I have a straightforward Maven 2 Java project (a JMS relaying system). After we released the first version, we found we had spent more time configuring Maven than actually coding.
For the next release we wanted to clean up the build process, and someone suggested migrating to Buildr. So I was tasked with doing just that.
I set up Buildr (1.3.4) according to the documentation on their website. Then, from the root of the project, I ran the buildr command and told Buildr to create the build file based upon my pom.xml. That processed fine and compiled all the code. All was gravy until Buildr started running the tests. Here is the output:
Test framework error: taskdef class org.apache.tools.ant.taskdefs.optional.junit.JUnitTask cannot be found
Obviously the specified class isn't on my classpath. However, the Buildr documentation says that everything required for basic testing is included, and it doesn't say that any specific Ant libraries or Ant version are needed. I do have Ant 1.7.0 installed (though not on my classpath).
Has anyone seen this before?
Update
I located the infamous ant-optional JAR in the Maven repository. Including it in my test.with options did not resolve the issue.
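For reference, the test.with attempt looked roughly like this (a sketch; the exact artifact spec is an assumption):

test.with 'ant:ant-optional:jar:1.5.4'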
Running the buildr command with --trace gives this extra information...
Tests failed!
/pathtoruby/buildr-1.3.4/lib/buildr/core/test.rb:455:in `run_tests'
/pathtoruby/buildr-1.3.4/lib/buildr/core/test.rb:199:in `initialize'
Found the issue... Apparently an ant-junit.jar is needed, but for whatever reason the copy in my local repository was owned by root rather than my local user account (OS X system), so it wasn't accessible to Buildr. I deleted the items from my local repository and reran buildr (it downloaded the needed items).
Update
This also caused a few other issues; it seems a few other items in my local repository had strange permissions. I ended up just archiving my repository and letting Maven reconstruct it, which resolved all my issues. I now have a nice build file that is 25 lines long, compared to my previous pom.xml of over 100 lines.
You get that error because JUnitTask isn't on the classpath. I'm not very familiar with Buildr, so I can't say whether you are required to specify the JUnit JARs yourself, but if Buildr uses the system classpath, try adding JUnit to it and see what happens.
Once you've confirmed your builds will run with JUnit hacked in to the classpath, you can then try varying your configuration until it runs as you expected, or leave it as is.
Can you post a reference to the relevant part of the documentation? I didn't see anything (in my very brief reading of the site) saying that the required items are included.
Is it possible that you've not downloaded all the gems? Running "gem update --system" (to update RubyGems) and then "gem update buildr" will ensure that the required dependencies have all been installed.
