com.fasterxml.jackson.core:jackson-databind upgraded to latest 2.14.2 version, yet application war shows vulnerabilities - java

I'm using Gradle wrapper 6.9.2, and my application's war file is flagged with the following jackson-databind vulnerabilities by WIZ Security scanning: CVE-2022-42004, CVE-2022-42003, CVE-2020-36518, and CVE-2022-25649.
To solve this, I upgraded the old vulnerable version 2.10.0 to the latest 2.14.2 in build.gradle and excluded the jackson-databind module from some of the implementation dependencies, but the final scan results did not improve.
Even after the upgrade, the Wiz scan still reports the vulnerabilities against the old 2.10.0 version, although when I check inside the war file, the jackson-databind jar is version 2.14.2.
Can you please help me?
I fail to understand what's going wrong.
For reference, the screenshots below show how I upgraded using Gradle constraints and excluded the module, along with the final dependency tree.
Dependency constraints added for jackson-databind
Module exclusions
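The screenshots are not reproduced here, but a minimal sketch of the build.gradle approach described above might look like this (the library coordinates in the exclusion example are made up for illustration):

```groovy
dependencies {
    constraints {
        // Raise jackson-databind to 2.14.2 wherever it is resolved transitively
        implementation('com.fasterxml.jackson.core:jackson-databind:2.14.2') {
            because 'CVE-2022-42003, CVE-2022-42004, CVE-2020-36518, CVE-2022-25649'
        }
    }

    // Excluding the transitive copy from a library that pins the old version
    // ('some.group:some-library' is a hypothetical placeholder)
    implementation('some.group:some-library:1.0') {
        exclude group: 'com.fasterxml.jackson.core', module: 'jackson-databind'
    }
}
```

If the scanner still reports 2.10.0 even though the war contains only the 2.14.2 jar, it may be matching on stale metadata (for example a pom.properties file embedded in a shaded or repackaged jar) rather than on the jar actually shipped, so checking what the scanner fingerprinted is worthwhile.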

Related

Cannot resolve symbol 'annotations' with OpenAPI v3

When I use the Swagger v3 imports like this:
import io.swagger.annotations.Api;
import io.swagger.annotations.Operation;
IntelliJ IDEA shows:
Cannot resolve symbol 'annotations'
I have already added the OpenAPI v3 plugin in build.gradle:
id "org.springdoc.openapi-gradle-plugin" version "1.3.4"
and applied it like this:
apply plugin: 'org.springdoc.openapi-gradle-plugin'
Am I missing something? What should I do to fix it? I also added this dependency:
api "org.springdoc:springdoc-openapi-ui:1.6.9"
The 'annotations' symbol failing to resolve when using the org.springdoc.openapi-gradle-plugin plugin in a Spring Boot project may be caused by a few different issues:

1. The springdoc-openapi-data-rest dependency may not be on your project's classpath. This dependency is required for the org.springdoc.openapi-gradle-plugin plugin to work properly, and it provides the annotations used to generate the OpenAPI documentation. Make sure it is included in your project's build file (e.g. build.gradle).
2. You may be using an older version of the org.springdoc.openapi-gradle-plugin plugin that is not compatible with your version of Spring Boot. Check the plugin's documentation to see which Spring Boot versions are supported.
3. There may be a conflict with another library or plugin in your project that is causing the 'annotations' symbol to be unresolved. Resolving any other library conflicts or errors you are encountering may also fix this issue.
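One more thing worth checking: io.swagger.annotations.Api is the old Swagger v2 annotation package, while springdoc 1.x works with the v3 package (io.swagger.v3.oas.annotations, where @Api is replaced by @Tag). A hedged sketch of the two options in build.gradle (the swagger-annotations version shown is an assumption; check it against your springdoc release):

```groovy
dependencies {
    // springdoc 1.x pulls in the v3 annotation package transitively:
    //   io.swagger.v3.oas.annotations.Operation, io.swagger.v3.oas.annotations.tags.Tag
    implementation 'org.springdoc:springdoc-openapi-ui:1.6.9'

    // Only needed if you really want the legacy io.swagger.annotations.* package
    implementation 'io.swagger:swagger-annotations:1.6.9'  // version is an assumption
}
```

Migrating the imports to the v3 package is usually the cleaner fix when using springdoc.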

NoSuchMethodError: org.apache.hadoop.conf.Configuration.getPassword

I have been struggling with this NoSuchMethodError in a Spark project for a while now without getting anywhere. Currently, this project is running locally using SparkNLP 3.3.0 and Spark-Core/SparkMLLib 3.1.2, both with Scala 2.12.4. Hadoop 3.2.0 is pulled in as a transitive dependency via spark-core.
What I have tried so far:
check that this method is indeed present by stepping through the code
verify uniform Scala version across all dependencies
verify that spark and hadoop versions are the same throughout (using maven dep tree and enforcer plug-in)
manually remove other versions of Hadoop from local .m2 directory
The code is running from an executable JAR which pulls in other jars to the classpath that are provided at runtime. Java version is 1.8.0_282. Maven is version 3.6.3. OS is Big Sur, 11.6 (M1).
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.conf.Configuration.getPassword(Ljava/lang/String;)[C
at org.apache.spark.SSLOptions$.$anonfun$parse$8(SSLOptions.scala:188)
at scala.Option.orElse(Option.scala:289)
at org.apache.spark.SSLOptions$.parse(SSLOptions.scala:188)
at org.apache.spark.SecurityManager.<init>(SecurityManager.scala:98)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:252)
at org.apache.spark.SparkEnv$.createDriverEnv(SparkEnv.scala:189)
at org.apache.spark.SparkContext.createSparkEnv(SparkContext.scala:277)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:458)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2672)
at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$2(SparkSession.scala:945)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:939)
[...]
at com.mymainpackage.Main.main(Main.java:157)
I have finally been able to figure this one out.
The root cause was that an older version of hadoop-core was being pulled in (1.2.1 instead of 2.6.5), which in fact does not have the Configuration.getPassword() method. I found this out after setting up a test project where the SparkContext was initialized correctly, and then comparing the source jars of the two Configuration classes in the two projects (using Configuration.class.getProtectionDomain().getCodeSource().getLocation().getPath()).
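That jar-location trick generalizes to any class; a small self-contained sketch (the class name here is made up for illustration, and on the author's setup the interesting call would be locationOf(org.apache.hadoop.conf.Configuration.class)):

```java
public class WhichJar {
    // Return the jar or directory a class was loaded from, or a
    // placeholder for JDK bootstrap classes (which have a null code source).
    static String locationOf(Class<?> cls) {
        var source = cls.getProtectionDomain().getCodeSource();
        return source == null ? "(JDK runtime)" : source.getLocation().getPath();
    }

    public static void main(String[] args) {
        // Prints the path this class itself was loaded from
        System.out.println(locationOf(WhichJar.class));
    }
}
```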
After forcing version 2.6.5 using Maven's dependency management, and after manually deleting the older 1.2.1 jar from the local Maven repo, it works fine.
The only thing I still don't understand is why hadoop-core was not showing up in the Maven dependency tree; otherwise I would probably have found it sooner.
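The "forcing version 2.6.5 using Maven's dependency management" step might look like the following pom fragment (the groupId is an assumption about the artifact involved):

```xml
<dependencyManagement>
  <dependencies>
    <!-- Pin the version so no transitive path can drag in 1.2.1 -->
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-core</artifactId>
      <version>2.6.5</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Entries in dependencyManagement override versions chosen by transitive resolution without adding the artifact to modules that do not already depend on it.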

Maven subdependency conflict at runtime with Twilio

Getting a class-not-found error at runtime due to a Maven sub-dependency issue:
I am working on integrating the Twilio SDK (com.twilio.sdk:twilio:7.35.0) into a multi-module Maven (3.x) / Java 8 project.
I first added the Twilio Maven dependency to the corresponding module, and I am getting a class-not-found exception at runtime on org.apache.http.conn.HttpClientConnectionManager.
I looked into it and found out that this class is part of org.apache.httpcomponents:httpclient (which is a sub-dependency of the Twilio SDK), and that an earlier version of this dependency is in my project. This earlier version does not have the HttpClientConnectionManager class.
From this point, I tried to exclude the old version of the dependency, first with an exclusions tag and then with the Maven enforcer plugin, while at the same time declaring the dependency directly, but nothing worked.
I tried to import the dependency in the parent pom and in the other modules that are using my twilio module as well.
I am using Twilio 7.35, which uses org.apache.httpcomponents:httpclient:4.5.6, but in my multi-module project I am also using org.apache.cassandra:cassandra-thrift:3.0.0, which uses thrift:0.9.2, which contains the old version of httpclient (4.2.5).
The latest version of this cassandra module does not support the latest version of httpClient, so I need to make sure this httpclient older dependency does not mess up the twilio one.
I also analysed the output of mvn dependency:tree -Dverbose, and it seems that 4.5.6 is getting picked up correctly. When I tried adding it to the parent module or the calling module, I can see that the old version is getting overwritten by the Twilio one, but it does not solve my issue.
I am starting to wonder if it is even possible to have two versions of the dependencies in the same maven project.
It sounds like you are experiencing something similar to a related question dealing with Jar Hell: Jar hell: how to use a classloader to replace one jar library version with another at runtime
In this case you need to use a separate classloader from the default one in your project. Perhaps you could use a URLClassLoader and load some or all of your newer dependencies from the filesystem.
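A minimal sketch of that idea, assuming the newer httpclient jar sits at a hypothetical filesystem path (the path and jar name are assumptions; adjust for your setup):

```java
import java.io.IOException;
import java.net.URL;
import java.net.URLClassLoader;

public class IsolatedLoader {
    // Build a classloader that sees only the given jars. Passing a null
    // parent skips the application classpath entirely, so only JDK
    // (bootstrap) classes and the listed jars are visible.
    static URLClassLoader forJars(URL... jars) {
        return new URLClassLoader(jars, null);
    }

    // Even with a null parent, JDK classes still resolve via bootstrap delegation.
    static boolean resolvesJdkClass(URLClassLoader loader) {
        try {
            return loader.loadClass("java.lang.String") == String.class;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical location of the newer httpclient jar
        URL jar = new URL("file:///opt/libs/httpclient-4.5.6.jar");
        try (URLClassLoader loader = forJars(jar)) {
            // Classes loaded through this loader resolve against 4.5.6,
            // regardless of the older 4.2.5 on the main classpath, e.g.:
            // loader.loadClass("org.apache.http.conn.HttpClientConnectionManager")
            System.out.println(resolvesJdkClass(loader));
        }
    }
}
```

The catch is that objects created through the isolated loader cannot be cast to types loaded by the default loader, so all interaction has to go through reflection or JDK-only interfaces.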

Maven, can I force a dependency to use a specific dependency

I've recently upgraded com.amazonaws to version 1.11.xxx.
From version 1.11.0 onwards amazon dropped support for com.amazonaws.util.json in favour of Jackson.
One of my dependencies (over which I have no control), imports
com.amazonaws.util.json.JSONException
This is causing a NoClassDefFoundError for JSONException.
Is there a way to configure maven so that this dependency can continue to use an older version of com.amazonaws so that it can find this class?
Note that I've not added an exclusion for amazonaws to the dependency in question; however, it seems that upgrading the version in my pom is overriding the dependency's older version.
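That observation matches Maven's resolution rule: only one version of an artifact is resolved per project, and a version declared in your own pom ("nearest" to the root) always shadows a transitive one. If the legacy class is required, one hedged option is pinning the whole build to the last SDK release that still ships com.amazonaws.util.json (the version number below is an assumption and should be checked):

```xml
<dependencyManagement>
  <dependencies>
    <!-- Last pre-1.11.0 line still containing com.amazonaws.util.json;
         exact version is an assumption -->
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-java-sdk</artifactId>
      <version>1.10.77</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Having both the old and new SDK resolve side by side in the same classpath is not something plain Maven supports; that would require shading/relocation or a separate classloader.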

appengine 1.7.4 sdk and com.google.appengine.datanucleus.DatastoreManager not found

I have upgraded to App Engine 1.7.4 and something is now broken.
When deploying to the local dev server, App Engine errors with the following:
Class "com.google.appengine.datanucleus.DatastoreManager" was not found in the CLASSPATH. Please check your specification and your CLASSPATH.
org.datanucleus.exceptions.ClassNotResolvedException: Class "com.google.appengine.datanucleus.DatastoreManager" was not found in the CLASSPATH. Please check your specification and your CLASSPATH.
This is NOT a java.lang.ClassNotFoundException but an org.datanucleus.exceptions.ClassNotResolvedException.
My libs are:
appengine-api-1.0-sdk-1.7.4
appengine-api-labs-1.7.4
datanucleus-api-jdo-3.2.0-m3
datanucleus-api-jpa-3.2.0-m3
datanucleus-appengine-2.1.1
datanucleus-core-3.2.0-m3
datanucleus-enhancer-3.1.1
It is built using Maven.
The DataNucleus App Engine Plugin Compatibility page
(http://code.google.com/p/datanucleus-appengine/wiki/Compatibility)
states:
3.0:
Requires DataNucleus 3.2+ (core, api-jdo, api-jpa).
Requires SDK 1.7.0+
The datanucleus-appengine-2.1.1 pom has dependencies on:
org.datanucleus datanucleus-api-jdo [3.1.1, 3.2)
org.datanucleus datanucleus-api-jpa [3.1.1, 3.2)
org.datanucleus datanucleus-core [3.1.1, 3.2)
org.datanucleus datanucleus-enhancer [3.1.0-release, )
Something is wrong with the versions of the libs, but I can't determine what.
What is the correct dependency for the DN plugin and SDK 1.7.4?
-lp
Why have you got "datanucleus-api-jdo" AND "datanucleus-api-jpa" in the CLASSPATH? Decide which API you're using and use that one. Where are the other dependencies? jdo-api.jar or persistence-api.jar?
You can't use DataNucleus 3.2 unless you're using the SVN trunk of the datanucleus-appengine plugin, as shown clearly on http://code.google.com/p/datanucleus-appengine/wiki/Compatibility
OK, the issue is that the 2.1.1 plugin's dependency ranges pull in DN 3.2.x milestones.
As @DataNucleus has mentioned, this is incorrect; it should be limited to DN 3.1.1.
By manually setting the plugin's dependencies to DN 3.1.1, everything now works.
Thanks @DataNucleus.
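"Manually setting the dependency to DN 3.1.1" could look like the following pom fragment; only datanucleus-core is shown, and the same pin would be repeated for datanucleus-api-jdo or datanucleus-api-jpa as needed:

```xml
<dependencyManagement>
  <dependencies>
    <!-- Pin to 3.1.1 so the plugin's [3.1.1, 3.2) range cannot
         resolve to a 3.2.0 milestone -->
    <dependency>
      <groupId>org.datanucleus</groupId>
      <artifactId>datanucleus-core</artifactId>
      <version>3.1.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```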
