The project I am working on produces a jar that I deploy on Azure, where Spark runs the job.
It uses an internal dependency A, which in turn depends on org.apache.commons:commons-configuration:1.10, yet when I deploy on Azure, version 2.1.1 is used by default.
On Azure we have version 2.1.1, in which the package name (org.apache.commons.configuration2) differs from the one in the 1.10 version (org.apache.commons.configuration).
So this line in dependency A causes an error when the 2.1.1 version is used:
import org.apache.commons.configuration
It would need a "2" at the end, which I can't add since A is a dependency.
I tried excluding org.apache.commons:commons-configuration from A and then using the Maven Shade Plugin to rename the package, but the jar file becomes double its actual size; besides, the shaded jar is produced on its own, not inside the zip with the workflow and the .sh file, which my team may not like.
Updating from commons-configuration 1 to 2 is a major change; the new version is not a drop-in replacement. As you have already pointed out, the top-level package changes, and this will most likely break library A. The correct solution is probably to update library A to use commons-configuration 2.
You can still try to hack the Maven project setup to see if it works (sketched below):
Exclude commons-configuration 1 from the library A dependency using an <exclusions> block.
Add commons-configuration 2 as a direct project dependency with provided scope in module B. The provided scope is needed to avoid packaging the dependency.
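A minimal sketch of that setup, assuming placeholder coordinates for library A (use the group/artifact IDs exactly as mvn dependency:tree reports them; for the 1.x artifact the groupId may be commons-configuration rather than org.apache.commons):
<dependencies>
  <!-- library A, with the old commons-configuration excluded (coordinates for A are placeholders) -->
  <dependency>
    <groupId>com.mycompany</groupId>
    <artifactId>library-a</artifactId>
    <version>1.0.0</version>
    <exclusions>
      <exclusion>
        <groupId>commons-configuration</groupId>
        <artifactId>commons-configuration</artifactId>
      </exclusion>
    </exclusions>
  </dependency>
  <!-- commons-configuration 2 is already present in the Spark runtime on Azure,
       so provided scope keeps it out of the packaged jar -->
  <dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-configuration2</artifactId>
    <version>2.1.1</version>
    <scope>provided</scope>
  </dependency>
</dependencies>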
If you want to avoid using the maven-shade-plugin, then an alternative solution might be to:
Exclude commons-configuration 1 in the library A dependency declaration;
Work out which classes and methods from commons-configuration 1 library A uses (easy enough if you have the source code; otherwise a modern IDE will disassemble it for you);
Write your own versions of these classes and methods in your application that delegate to the commons-configuration2 implementation (a sketch of such a shim follows after the note below).
Note that commons-configuration2 is a part of the Apache Spark distribution and it cannot be ignored. It would need to be added to your project with <scope>provided</scope>.
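If you go down that route, a shim class might look roughly like the sketch below. Everything here is illustrative: PropertiesConfiguration and the two getters are just examples, and you would only implement whichever constructors and methods library A actually calls.
package org.apache.commons.configuration;

import java.io.File;
import org.apache.commons.configuration2.builder.fluent.Configurations;
import org.apache.commons.configuration2.ex.ConfigurationException;

// Same package and class name that library A was compiled against,
// delegating internally to the commons-configuration2 implementation.
public class PropertiesConfiguration {

    private final org.apache.commons.configuration2.PropertiesConfiguration delegate;

    public PropertiesConfiguration(String fileName) throws ConfigurationException {
        // Configurations is the commons-configuration2 convenience factory
        this.delegate = new Configurations().properties(new File(fileName));
    }

    public String getString(String key) {
        return delegate.getString(key);
    }

    public int getInt(String key, int defaultValue) {
        return delegate.getInt(key, defaultValue);
    }
}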
If this is too hard then the maven-shade-plugin is your only viable solution.
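For completeness, the relocation would look something like the sketch below (the plugin version and the shaded prefix are arbitrary choices). The excludes entry keeps references to the configuration2 packages untouched, and minimizeJar may help with the jar-size complaint, though it can strip classes that are only loaded reflectively, so test it.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- rename the bundled commons-configuration 1 packages... -->
            <pattern>org.apache.commons.configuration</pattern>
            <shadedPattern>shaded.org.apache.commons.configuration</shadedPattern>
            <!-- ...but leave references to the provided configuration2 alone -->
            <excludes>
              <exclude>org.apache.commons.configuration2.**</exclude>
            </excludes>
          </relocation>
        </relocations>
        <!-- optionally trim unused classes to keep the jar size down -->
        <minimizeJar>true</minimizeJar>
      </configuration>
    </execution>
  </executions>
</plugin>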
I have a similar question to the one here. However, that question does not mention which build tool is being used, and I assume it is Maven, as I previously didn't have problems using Java 9 modules with Maven.
I am using Hibernate Validator and I want to use Java 9 modules, so I added a module-info file to the module where I depend on the validation API (the classes I am using are Validator, ValidatorFactory, ... from packages like javax.validation).
I searched for these classes and found that they reside in this jar in my project dependencies: validation-api-2.0.1.Final.jar; the classes I am using are inside the javax.validation package.
I ran the command jar --file=/path-to-the-jar-on-my-pc/validation-api-2.0.1.Final.jar --describe-module in the terminal and got the following description of the module derived from that jar:
No module descriptor found. Derived automatic module.
java.validation#2.0.1.Final automatic
requires java.base mandated
contains javax.validation
contains javax.validation.bootstrap
contains javax.validation.constraints
contains javax.validation.constraintvalidation
contains javax.validation.executable
contains javax.validation.groups
contains javax.validation.metadata
contains javax.validation.spi
contains javax.validation.valueextraction
So now when I put, for example, requires javax.validation in my module-info file, the IDE complains that the module is not found. I even added the dependency manually in the project structure (Ctrl+Shift+Alt+S in IntelliJ), pointing it to the path where the jar is stored on my machine, but I still get the same result.
I also tried IntelliJ's quick-fix and found that it added requires java.validation; to my module-info rather than requires javax.validation;, but neither of them works.
I searched the pom.xml of that artifact and found the element <Automatic-Module-Name>java.validation</Automatic-Module-Name>, so now I am almost sure that Gradle is causing the problem, but I am no expert in Gradle or in how build tools work. How can I solve this while staying with Gradle as the build tool?
Gradle didn't add proper support for the Java Platform Module System until version 6.4. However, you have to explicitly configure Gradle to work with modules.
// build.gradle
java {
    modularity.inferModulePath = true
}
Though if I'm not mistaken, inferModulePath is true by default as of Gradle 7.0.
For more information regarding Java modules and Gradle, see https://docs.gradle.org/current/samples/sample_java_modules_multi_project.html
I had the same issue. Changing javax.validation to jakarta.validation resolved it.
build.gradle.kts:
implementation("jakarta.validation:jakarta.validation-api:3.0.1")
module-info.java:
requires jakarta.validation;
I am using the Gradle Java modularity plugin:
https://plugins.gradle.org/plugin/org.javamodularity.moduleplugin
Here is what I set in my build.gradle.kts:
java {
    modularity.inferModulePath.set(false)
}

application {
    // Define the main class for the application.
    mainModule.set("modules.demo.gradle.consumer")
    mainClass.set("com.demo.gradle.consumer.Consumer")
}
I know modularity.inferModulePath.set(false) contradicts what Slaw wrote, but it is necessary if you use the plugin. See the documentation:
https://github.com/java9-modularity/gradle-modules-plugin
Getting a class not found error at runtime due to a Maven sub-dependency issue:
I am working on integrating the Twilio SDK (com.twilio.sdk:twilio:7.35.0) into a multi-module Maven (3.x) / Java 8 project.
I first added the Twilio Maven dependency to the corresponding module,
and I am getting a class not found exception at runtime for org.apache.http.conn.HttpClientConnectionManager.
I looked into it and found that this class is part of org.apache.httpcomponents:httpclient (which is a transitive dependency of the Twilio SDK) and that an earlier version of this dependency is in my project.
And this earlier version does not have the HttpClientConnectionManager class.
From this point, I tried to exclude the old version of the dependency, first with an <exclusions> tag and then with the Maven Enforcer Plugin, while at the same time declaring the newer dependency directly, but nothing worked.
I also tried declaring the dependency in the parent pom and in the other modules that use my Twilio module.
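For reference, the exclusion plus direct declaration combination looks roughly like this (a sketch; the versions are the ones mentioned below, and the exact layout in my poms may differ):
<!-- exclude the old httpclient pulled in through cassandra-thrift -->
<dependency>
  <groupId>org.apache.cassandra</groupId>
  <artifactId>cassandra-thrift</artifactId>
  <version>3.0.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.httpcomponents</groupId>
      <artifactId>httpclient</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<!-- and declare the newer httpclient directly so it wins dependency mediation -->
<dependency>
  <groupId>org.apache.httpcomponents</groupId>
  <artifactId>httpclient</artifactId>
  <version>4.5.6</version>
</dependency>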
I am using Twilio 7.35, which uses org.apache.httpcomponents:httpclient:4.5.6, but in my multi-module project I am also using org.apache.cassandra:cassandra-thrift:3.0.0, which uses thrift 0.9.2, which contains the old version of httpclient (4.2.5).
The latest version of this Cassandra module does not support the latest version of httpclient, so I need to make sure the older httpclient dependency does not mess up the Twilio one.
I also analysed the output of mvn dependency:tree -Dverbose, and it seems that 4.5.6 is getting picked up correctly. When I tried adding it to the parent module or the calling module, I could see that the old version was being overridden by the Twilio one, but it did not solve my issue.
I am starting to wonder if it is even possible to have two versions of the same dependency in one Maven project.
It sounds like you are experiencing something similar to a related question dealing with Jar Hell: Jar hell: how to use a classloader to replace one jar library version with another at runtime
In this case you need to use a classloader separate from the default one in your project. Perhaps you could use a URLClassLoader and load some or all of your newer dependencies from the filesystem.
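A minimal sketch of that idea, with placeholder jar paths; note that everything that must see the newer httpclient has to be loaded through the same classloader:
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Paths;

public class IsolatedHttpClientDemo {
    public static void main(String[] args) throws Exception {
        // Placeholder paths: point these at the newer httpclient (and its httpcore)
        // somewhere on the filesystem, outside the normal application classpath.
        URL[] jars = {
            Paths.get("lib/httpclient-4.5.6.jar").toUri().toURL(),
            Paths.get("lib/httpcore-4.4.10.jar").toUri().toURL()
        };

        // A null parent skips the application classpath, so the old httpclient 4.2.5
        // that cassandra-thrift drags in is never consulted by this loader.
        try (URLClassLoader isolated = new URLClassLoader(jars, null)) {
            Class<?> managerInterface =
                    isolated.loadClass("org.apache.http.conn.HttpClientConnectionManager");
            System.out.println(managerInterface + " loaded by " + managerInterface.getClassLoader());
        }
    }
}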
When developing Java libraries we're currently using the Apache Maven Shade Plugin to hide internal dependencies on other libraries (jars) by renaming their package names.
Is it possible to hide these internal library-dependencies by using the Java 9 module system and not exporting the name of the internally used libraries?
I.e.:
Both module A and module B include, but do not export, the class org.codehaus.jackson.map.ObjectMapper (bundled using e.g. the Maven Shade Plugin), with different versions of the class
Module A uses module B
Will each module still use its own implementation of org.codehaus.jackson.map.ObjectMapper?
I believe it should be so, but I have found no documentation explicitly confirming this, nor any texts/examples recommending this approach for this quite common versioning issue.
This issue is described at http://openjdk.java.net/projects/jigsaw/spec/issues/#MultiModuleExecutableJARs and there is no support for it yet. If all dependencies were modules, it would make sense to use jlink to solve this, but as long as there is at least one non-module, no solution is available yet. This is something that needs to be solved within the JDK/JRE.
It is still a valid case, so I would suggest asking this question on the jigsaw-dev mailing list and referring to #MultiModuleExecutableJARs.
The module declaration defines, among other things, a module's dependencies. If I use Maven as a build tool, this is redundant, because the pom.xml already contains this information (and more). Based on that, couldn't Maven generate module-info.java for me?
One might expect that most of the dependencies are indeed required modules as well. However, requirements can also point to modules of the JDK/JRE, which are not specified in the pom.xml. So yes, if you only look at the dependencies, probably most of them could be transformed to a required module reference.
But a module descriptor contains much more information, all of which is based on decisions to be made by the developer.
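As a purely hypothetical illustration (module, package, and service names are invented), only the first requires lines below could plausibly be derived from the pom.xml; everything after that reflects decisions only the developer can make:
module com.example.orders {
    // these two could in principle be derived from <dependencies> in pom.xml
    requires com.fasterxml.jackson.databind;
    requires org.slf4j;

    // but not which JDK modules are needed,
    requires java.sql;

    // which packages form the public API,
    exports com.example.orders.api;

    // which packages are opened for reflection (e.g. to a JSON or DI framework),
    opens com.example.orders.model to com.fasterxml.jackson.databind;

    // or which services are consumed and provided.
    uses com.example.orders.spi.PaymentProvider;
    provides com.example.orders.spi.PaymentProvider
            with com.example.orders.internal.DefaultPaymentProvider;
}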
I've written an article about it which describes in detail why it is not possible to fully generate this file.
As far as I know, the bnd-maven-plugin can generate module-info.class based on the configured dependencies. If you are working with the maven-bundle-plugin, you need to specify the version of bndlib manually, because the latest version of the maven-bundle-plugin (5.1.3) still uses the 5.x line of bndlib, and bndlib 6.x is required for JPMS support.
Documentation: https://bnd.bndtools.org/releases/6.1.0/chapters/330-jpms.html
I have several Gradle projects in my Eclipse workspace. For the sake of simplicity I'm only really interested in two of them; let's just call them A and B.
The problem I'm having is that Project A has a dependency on JBoss, which pulls in javax validation-api 1.0.0.GA, and Project B has a dependency on javax validation-api 1.1.0.Final. Since Gradle resolves the conflict by preferring the newer library, B is happy when built by Gradle, but Eclipse itself shows errors which are very distracting while editing.
The correct version of the validation-api jar ends up on B's classpath, but the problem is that the Gradle IDE plugin changes the project(':A') dependency into a project reference, and Eclipse seems to give the project reference precedence over the external jar, so the old jar is preferred by extension.
I tried adding { exclude module: 'validation-api' } to the dependency on A in B's build.gradle, which works according to the output of gradle dependencies; however, since Eclipse only gets as far as making it a project reference, it won't exclude the jar and the problem remains.
Also, per this question, I tried adding { transitive = false } and the same thing happens. I don't think even the hack posed there would work for me, since the .classpath contains a single reference to the Gradle container, so there is nothing to remove.
I've managed to get around this by explicitly including a reference to the correct version of the jar from my Gradle cache and then moving it above the Gradle Classpath Container, so that Eclipse sees that version first.
My question is: is there a better/more generic way to do this? Preferably one that I can commit to source control without breaking other people's builds or requiring them to manually modify paths or properties somewhere. There is another project with what appears to be a similar issue, so something I can fix in the build.gradle file would be awesome.
Worst-case scenario, I could probably switch to IntelliJ, if that behaves better than the Eclipse-Gradle integration?
These kinds of transitive dependency issues are a long-standing problem with the Gradle-Eclipse integration (both in the STS tooling and in the command-line-generated .classpath metadata from Gradle's Eclipse plugin). The problem is the way that Eclipse computes transitive classpaths.
Only recently have we found a reasonable solution to this problem. Actually, there are now two solutions, one better than the other, but depending on your situation you might want to use either of them.
The first solution is a bug fix that changes the classpath order of project dependencies so that they are no longer 'preferred' over jar dependencies (PR-74). To get this fix you may need to install the Gradle tooling from a snapshot update site, because the fix went in after 3.6.3.
This solution doesn't fix the real problem (you still have the 'wrong' stuff on the classpath) but just makes it less likely to cause real problems in your projects.
The second solution is to enable use of the 'Custom Tooling API model' (PR-55) introduced in STS 3.6.3. This is a bit experimental and only works with recent versions of Gradle, at least 1.12, though it is probably better to use 2.x. It also only works for projects that have 'Dependency Management' enabled (if it is not enabled, you are using the .classpath generated by Gradle's Eclipse plugin, which has the same 'broken' classpath issues as the STS tooling).
The 'custom tooling model' is really the better solution in principle, as it fixes the way Gradle classpaths get mapped to Eclipse projects, so that project dependencies are no longer exported and each project gets its own classpath with dependency conflict resolution taken into account.
To enable it, go to "Window >> Preferences >> Gradle" and enable the "Use Custom Tooling Model" checkbox.