My application depends on the following artifact:
com.oracle.jdbc:ojdbc8
which has a circular dependency with
com.oracle.jdbc:ucp
The build fails with the following error:
ERROR: /private/var/tmp/_bazel_me/4f1994ece960b360388a372b5e6aa4b2/external/maven/BUILD:2757:11: in jvm_import rule @maven//:com_oracle_jdbc_ojdbc8: cycle in dependency graph:
//package/java:MyClass
.-> @maven//:com_oracle_jdbc_ojdbc8
|   @maven//:com_oracle_jdbc_ucp
`-- @maven//:com_oracle_jdbc_ojdbc8
Is there a way to get around this?
Looking at the artifact here:
https://mvnrepository.com/artifact/com.oracle.jdbc/ojdbc8
there's only one version:
https://mvnrepository.com/artifact/com.oracle.jdbc/ojdbc8/12.2.0.1
and it depends on com.oracle.jdbc:ucp, which indeed depends back on com.oracle.jdbc:ojdbc8:12.2.0.1:
https://mvnrepository.com/artifact/com.oracle.jdbc/ucp/12.2.0.1
This was surely a mistake, because I don't believe Maven allows circular dependencies either.
Looking back at https://mvnrepository.com/artifact/com.oracle.jdbc/ojdbc8, it says this artifact was moved to https://mvnrepository.com/artifact/com.oracle.database.jdbc/ojdbc8, and the version of ucp there has no dependencies:
https://mvnrepository.com/artifact/com.oracle.database.jdbc/ucp/12.2.0.1
So maybe com.oracle.database.jdbc:ojdbc8 will work for you (or the specific version com.oracle.database.jdbc:ojdbc8:12.2.0.1, since the previous artifact was at 12.2.0.1 and the latest version is 21.4.0.0.1).
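If the build uses rules_jvm_external, swapping the relocated coordinates into maven_install might look something like this (a sketch, assuming rules_jvm_external; adjust to your actual WORKSPACE):
# WORKSPACE: replace the old com.oracle.jdbc coordinates with the
# relocated ones, whose ucp no longer declares a dependency back on ojdbc8
maven_install(
    artifacts = [
        "com.oracle.database.jdbc:ojdbc8:12.2.0.1",
        "com.oracle.database.jdbc:ucp:12.2.0.1",
    ],
    repositories = ["https://repo1.maven.org/maven2"],
)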
Interestingly, com.oracle.database.jdbc:ojdbc8 says it was also moved: https://mvnrepository.com/artifact/com.oracle.database.jdbc/ojdbc8
I am having issues trying to get Spark to load, read, and query a parquet file. The infrastructure seems to be set up (Spark standalone 3.0): the cluster is visible and picks up jobs.
The issue appears when this line is called:
Dataset<Row> parquetFileDF = sparkSession.read().parquet(parquePath);
the following error is thrown:
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Scala module 2.10.0 requires Jackson Databind version >= 2.10.0 and < 2.11.0
at com.fasterxml.jackson.module.scala.JacksonModule.setupModule(JacksonModule.scala:61)
I looked into JacksonModule.setupModule, and when it gets to context.getMapperVersion, the version being passed is 2.9.10. It appears to me that the DefaultScalaModule is pulling in some older version.
I'm using Gradle to build and have the dependencies set up like this:
implementation 'com.fasterxml.jackson.core:jackson-core:2.10.0'
implementation 'com.fasterxml.jackson.core:jackson-databind:2.10.0'
implementation 'org.apache.spark:spark-core_2.12:3.0.0'
implementation 'org.apache.spark:spark-sql_2.12:3.0.0'
implementation 'org.apache.spark:spark-launcher_2.12:3.0.0'
implementation 'org.apache.spark:spark-catalyst_2.12:3.0.0'
implementation 'org.apache.spark:spark-streaming_2.12:3.0.0'
That didn't work, so I tried forcing databind:
implementation('com.fasterxml.jackson.core:jackson-databind') {
    version {
        strictly '2.10.0'
    }
}
I've tried a few different versions and still keep hitting this issue. Maybe I'm missing something super simple, but right now, I can't seem to get past this error.
Any help would be appreciated.
I was able to figure out the issue. I was pulling in a jar file from another project. The functionality in the jar file wasn't being used at all, so it wasn't suspect. Unfortunately, that project hadn't been updated, and some older Spark libraries in it were somehow being picked up by my currently running app. Once I removed it, the error went away. What's interesting is that the dependency graph didn't show anything about the libraries the other jar file was using.
I suppose if you run into a similar issue, double check any jar files being imported.
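And if the mismatch turns out to be an ordinary version conflict rather than a rogue jar, one way to hold every Jackson module to a single release is a resolution rule (a sketch; use whichever version your Spark build expects):
configurations.all {
    resolutionStrategy.eachDependency { details ->
        // pin every com.fasterxml.jackson.* module to one release
        if (details.requested.group.startsWith('com.fasterxml.jackson')) {
            details.useVersion '2.10.0'
        }
    }
}
Running ./gradlew dependencyInsight --dependency jackson-databind can also help trace where a stray 2.9.10 is coming from.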
I have an older widget app that uses JAXB and is now being migrated to Java 11.
I am aware that java.xml.bind/JAXB has been removed in this version, so we are trying to replace it with Jakarta.
We have jakarta.activation.jar and jakarta.xml.bind-api.jar, and it works fine for compilation but not at runtime.
When the app starts I get this:
javax.xml.bind.JAXBException: Implementation of JAXB-API has not been found on module path or classpath.
- with linked exception:
[java.lang.ClassNotFoundException: com.sun.xml.internal.bind.v2.ContextFactory]
at java.xml.bind/javax.xml.bind.ContextFinder.newInstance(ContextFinder.java:232)
... ...
The com.sun.xml.internal.* classes were part of rt.jar, which I believe was removed in Java 9,
but the latest jakarta source still refers to them... how is that supposed to work?
I saw some posts with a similar problem, and the typical solution is to add a Maven dependency.
I'm not sure about the details, but in any case we don't use Maven or Gradle and don't have a pom.xml.
Is there anything I can do to make it work?
Turned out it is not as bad as I thought.
rt.jar was broken down into multiple modules, and I started looking for the ones I needed.
jaxb-runtime.jar looked right to me, and I ran strings / grep on it - it indeed had all the com.sun.xml.bind classes in it!
When I added this jar to the ones I had, the original error changed to "some-other-class not found", and I had to add more jars until the app was happy.
In the end I have this:
jakarta.activation.jar, jakarta.xml.bind-api.jar, jaxb-runtime.jar,
istack-commons-runtime.jar, stax-ex.jar, FastInfoset.jar, txw2.jar
No changes to the manifest, makefile, or anything else related to the build; it was purely a packaging issue.
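For reference, the launch command then looks roughly like this (a sketch; the lib/ directory and com.example.Main are placeholders for the real layout and main class):
java -cp "app.jar:lib/jakarta.activation.jar:lib/jakarta.xml.bind-api.jar:\
lib/jaxb-runtime.jar:lib/istack-commons-runtime.jar:lib/stax-ex.jar:\
lib/FastInfoset.jar:lib/txw2.jar" com.example.Main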
Big sigh...
I'm starting from the examples in the Spark distribution (v1.6.2). I added my own "Hello World" example and that worked fine. When I try to add something that uses a 3rd-party dependency (com.google.cloud:gcloud-java-nio:0.2.5), here is what happens:
16/07/22 13:05:26 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[Executor task launch worker-7,5,main]
java.lang.NoClassDefFoundError: org/spark-project/guava/base/MoreObjects
    at com.google.cloud.ServiceOptions.activeGoogleCloudConfig(ServiceOptions.java:282)
    at com.google.cloud.ServiceOptions.googleCloudProjectId(ServiceOptions.java:294)
    at com.google.cloud.ServiceOptions.defaultProject(ServiceOptions.java:270)
    at com.google.cloud.ServiceOptions.(ServiceOptions.java:206)
    at com.google.cloud.HttpServiceOptions.(HttpServiceOptions.java:153)
    at com.google.cloud.storage.StorageOptions.(StorageOptions.java:69)
(...)
I ran my code as follows:
spark-1.6.2$ mvn -DskipTests clean package
(lots of time passes...)
spark-1.6.2$ ./bin/run-example JavaGcsTest
And to add the dependency, I added these lines to examples/pom.xml:
<dependency>
  <groupId>com.google.cloud</groupId>
  <artifactId>gcloud-java-nio</artifactId>
  <version>0.2.5</version>
</dependency>
It looks like the root cause is that both gcloud-java-nio and Spark depend on guava, and perhaps they depend on different versions of it.
I looked at related questions, and the answers suggest making a fat jar. I'm not sure how to apply that here, though, as the examples are already bundled into a fat jar (examples/target/scala-2.10/spark-examples-1.6.2-hadoop2.2.0.jar).
I tried changing the version of guava that was used, raising it from 14 to 19 (the latest), but of course then the compilation failed (SparkEnv.scala:84: method softValues in class MapMaker cannot be accessed in com.google.common.collect.MapMaker).
Hopefully someone has advice on how to get Spark to work with this 3rd party library!
A way out of this problem is to compile a shaded version of the third-party library and use that jar as the dependency.
In the case of gcloud-java-nio, the project already includes a shaded jar target, so after running mvn package it can be found at target/gcloud-java-nio-0.2.7-SNAPSHOT-shaded.jar.
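If the library you need does not ship a shaded jar, relocating Guava yourself with the maven-shade-plugin is the usual fallback. A sketch (the shaded package name is illustrative):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- move Guava into a private namespace so it cannot
                 collide with the copy Spark repackaged -->
            <pattern>com.google.common</pattern>
            <shadedPattern>repackaged.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>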
Disclaimer:
I'm new to Gradle, have read a lot of docs, and I don't know whether my maven-style understanding is tripping me out, or whether it's the sleep dep (kids - don't ask), but I'm still not getting it.
Problem Background:
I have a project that consists of several modules:
- One module, let's call it data-structure, defines a data structure.
- Another module, data-structure-fabsearch, defines a data-source implementation for the data structure.
- A third module, fabsearch-common, defines some common data source classes (e.g. connection management to a fabsearch data source).
The reason I've done it like this is that there's actually another module that also uses the fabsearch-common stuff.
Anyway, my data-structure-fabsearch build.gradle looks something like this:
dependencies {
    compile project(':data-structure')
    compile project(':fabsearch-common')
}
The fabsearch-common module declares dependencies for the fabsearch api (let's call it fabsearch-api-1.0.0).
So, the dependency tree for data-structure-fabsearch should look like this:
- data-structure-fabsearch
- data-structure
- fabsearch-common
- fabsearch-api-1.0.0
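For that tree to hold, fabsearch-common has to expose the api jar through its compile configuration, e.g. (a sketch; the fabsearch-api coordinates are illustrative):
// fabsearch-common/build.gradle
dependencies {
    // 'compile' is transitive, so projects depending on fabsearch-common
    // also get fabsearch-api on their compile classpath
    compile 'com.fabsearch:fabsearch-api:1.0.0'
}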
This was all working wonderfully last night. This morning I came to work and all of a sudden those dependencies don't resolve anymore. References to fabsearch-api-1.0.0 classes are no longer found.
What I've Tried
1. In the parent build.gradle:
project(':data-structure-fabsearch') {
    apply plugin: 'java'
    dependencies {
        compile project(path: ':data-structure', configuration: 'compile')
        compile project(path: ':fabsearch-common', configuration: 'compile')
    }
}
I've tried this with and without the configuration setting.
2. In the data-structure-fabsearch build.gradle file, adding the configuration parameter.
3. Restarting IntelliJ
4. Clicking the refresh icon in the Gradle tool window (repeatedly)
5. Reading all about transitive dependencies in the Gradle user guides
6. Drinking tea (repeatedly)
None of the above work.
What I'm Expecting
I'm expecting that the fabsearch-common dependencies (the fabsearch-api jars) should also be included in the data-structure-fabsearch dependency tree, and that all references to fabsearch-api classes in data-structure-fabsearch should resolve.
My Question[s]
Whilst this is possible in Maven, is it possible in Gradle?
What do I have to do to get it to work?
How much sleep dep can you take without dying?
Many thanks for any help.
Turns out the problem wasn't Gradle at all. The problem was IntelliJ.
It got its knickers into a proper twist!
Solution:
1. Close the project in IntelliJ
2. Delete the .idea directory
3. Delete all .iml files and any other IntelliJ cra-useful files
4. Open project in IntelliJ, choose same directory. Problem disappears.
I am writing a Maven plugin and I would like to automatically resolve specific dependencies and add them as dependencies to the project based on the parameters given to the plugin.
I have been able to successfully resolve dependencies through Aether, but there seems to be a disconnect between Aether and the MavenProject.
There is a method, MavenProject#addAttachedArtifact, which I'm guessing is what I want to use. However, it takes an org.apache.maven.artifact.Artifact, while the one retrieved from Aether is an org.sonatype.aether.artifact.Artifact. I found a plugin that has a conversion method between the two, but I figure there ought to be a more standard approach.
I have also tried using DefaultArtifactFactory to create an org.apache.maven.artifact.Artifact, but I get a NullPointerException when it tries to get an ArtifactHandler.
code:
DefaultArtifactFactory factory = new DefaultArtifactFactory();
Artifact mavenArtifact = factory.createBuildArtifact("com.beust", "jcommander", "1.27", "jar");
result:
Caused by: java.lang.NullPointerException
at org.apache.maven.artifact.factory.DefaultArtifactFactory.createArtifact(DefaultArtifactFactory.java:155)
at org.apache.maven.artifact.factory.DefaultArtifactFactory.createArtifact(DefaultArtifactFactory.java:117)
at org.apache.maven.artifact.factory.DefaultArtifactFactory.createArtifact(DefaultArtifactFactory.java:111)
at org.apache.maven.artifact.factory.DefaultArtifactFactory.createBuildArtifact(DefaultArtifactFactory.java:75)
at com.foo.bar.module.IncludeModuleFrontEndMojo.execute(IncludeModuleFrontEndMojo.java:165)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
... 20 more
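The NPE is consistent with the hand-built factory never receiving an ArtifactHandlerManager (createArtifact is where it blows up, per line 155 in the trace). One sketch, assuming maven-plugin-annotations is available, is to let Maven inject the manager into the mojo instead of instantiating the factory with new:
import org.apache.maven.artifact.handler.ArtifactHandler;
import org.apache.maven.artifact.handler.manager.ArtifactHandlerManager;
import org.apache.maven.plugins.annotations.Component;

// inside the mojo: Maven wires this component in, unlike a factory
// created with `new`, whose internal handler manager stays null
@Component
private ArtifactHandlerManager artifactHandlerManager;

// later, when building or converting an artifact:
ArtifactHandler handler = artifactHandlerManager.getArtifactHandler("jar");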
So really, these are the things I've tried; a resolution to these issues would be great, but I'm really after the right way to do this. Any ideas?
UPDATE
I wrote my own conversion method between the two classes:
private static org.apache.maven.artifact.Artifact aetherToMavenArtifactBasic(Artifact artifact, String scope, ArtifactHandler artifactHandler) {
    // map the Aether coordinates onto Maven's DefaultArtifact
    DefaultArtifact mavenArtifact = new DefaultArtifact(
            artifact.getGroupId(), artifact.getArtifactId(), artifact.getVersion(),
            scope, artifact.getExtension(), artifact.getClassifier(), artifactHandler);
    mavenArtifact.setFile(artifact.getFile());
    mavenArtifact.setResolved(true);
    return mavenArtifact;
}
and found that the MavenProject#addAttachedArtifact method is for attaching an artifact to an existing artifact (i.e. attaching sources/javadoc jars to it), which is not my goal. Instead I got the artifacts from the Maven project and added my artifact:
project.getArtifacts().add(mavenArtifact);
which adds my artifact to the project (my artifact is then shown when I call the project's getArtifactMap() and getCompileClasspathElements()). However, this change does not persist. This is the problem I was really worried about. So the question has evolved into:
Can I make changes to the MavenProject and have it persist?
I don't think this is possible, and for my purposes I decided instead to require the user to add the dependency in the project's pom file (and to error out if they don't have it).
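The fail-fast check can be as small as a lookup in the project's artifact map (a sketch; the jcommander coordinates are just the example from above):
// getArtifactMap() keys artifacts by "groupId:artifactId"
if (!project.getArtifactMap().containsKey("com.beust:jcommander")) {
    throw new MojoExecutionException(
            "Missing required dependency: add com.beust:jcommander to the project's POM.");
}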
It seems to be by design that a plugin is not allowed to let the user muck with the project configuration to the point where it could break the build. I found a good post on advanced MOJO development here. A quote from it:
If this parameter could be specified separately from the main dependencies section, users could easily break their builds – particularly if the mojo in question compiled project source code. In this case, direct configuration could result in a dependency being present for compilation, but being unavailable for testing. Therefore, the @readonly annotation functions to force users to configure the POM, rather than configuring a specific plugin only.