OK, Log4j has a vulnerability (CVE-2021-44228) affecting versions 2.0 up to but not including 2.15. I've been charged with updating the version to 2.15 in some Java applications we have. Digging in, this is pretty easy with Gradle:
compile 'org.apache.logging.log4j:log4j-api:2.15.0'
compile 'org.apache.logging.log4j:log4j-core:2.15.0'
compile('io.sentry:sentry-log4j2:5.4.3') {
    exclude group: 'org.apache.logging.log4j'
}
solves the issue. But everything can't be that simple, right? OF COURSE! We have an application that references an internal artifact whose source code DOESN'T EXIST. It would be easy to make the change above in the internal artifact and publish a new version, but no. The internal artifact requires Spring Boot, so updating the main application like this does not solve the issue:
compile('org.apache.logging.log4j:log4j-api:2.15.0')
compile('org.apache.logging.log4j:log4j-core:2.15.0')
compile('com.xxxx:xxxxx:0.0.1') { // <--
    exclude group: 'org.apache.logging.log4j'
}
While the internal artifact no longer brings in log4j with this setup, Spring Boot cannot find log4j at all, because Spring Boot is encapsulated inside the internal artifact.
I've been working at this for some time. I've tried implementation constraints. I've tried downloading the artifact, unzipping it, and decompiling the class files back into Java, but the decompiler was doing some optimization and I couldn't determine the target Java version from the decompiled classes. That's scary and would require a lot of testing before going into production.
How the hell do I either make the aforementioned log4j version available to this mysterious artifact, or force the artifact to use a different version?
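One technique worth trying for the "force a different version" route is Gradle's resolution strategy, which overrides whatever version the opaque artifact pulls in transitively instead of excluding the module entirely. A minimal sketch for the main application's build.gradle:

```groovy
// Force every configuration to resolve log4j to 2.15.0, regardless of
// what the internal artifact (or the Spring Boot inside it) asks for.
configurations.all {
    resolutionStrategy {
        force 'org.apache.logging.log4j:log4j-api:2.15.0'
        force 'org.apache.logging.log4j:log4j-core:2.15.0'
    }
}
```

Because this rewrites the version during resolution rather than removing the dependency, the classes Spring Boot looks for are still on the classpath. You can confirm what was actually selected with `gradle dependencyInsight --dependency log4j-core`.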
P.S. I've run gradle dependencies and it shows 2.x -> 2.15. I've confirmed everything works fine with this upgrade.
P.P.S. The artifact is built with Maven. I don't know if that matters, and I don't think it does.
P.P.P.S. I've edited this a few times to improve clarity, if this is not your first time here, please re-read.
Related
We are running our Java EE applications on WAS 8.5 and using Gradle 5.* to build them.
In the past we packaged our .war application in an .ear archive, which we then deployed on our server. We had to separate our libraries from our applications and include them as shared libraries, because in our experience it made deploying much slower and in some cases used up all system memory, crashing the server.
After some experimentation, we realized that we don't need to extract the dependencies into shared libraries, because we can include them in the lib folder of our .ear archive.
Currently, we get this done by defining the dependencies of our .war application as compileOnly and redefining them as earlib in the root project (which generates the .ear archive). I'm looking for a way to automate this procedure.
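The manual procedure described above looks roughly like this (a sketch; the project and artifact names are placeholders):

```groovy
// webapp/build.gradle -- the .war subproject
dependencies {
    // needed at compile time, but deliberately kept out of WEB-INF/lib
    compileOnly 'com.example:some-library:1.0'
}

// build.gradle of the root project that assembles the .ear
dependencies {
    deploy project(path: ':webapp', configuration: 'archives')
    // the same library, redeclared so it lands in the .ear's lib folder
    earlib 'com.example:some-library:1.0'
}
```

The `deploy` and `earlib` configurations come from Gradle's ear plugin; the duplication between the two files is exactly what I want to automate.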
The script I used looks something like this:
project.configurations.named('deploy').get().allDependencies
    .withType(ProjectDependency.class).forEach { dependency ->
        dependency.dependencyProject.configurations.named('earlib').get()
            .allDependencies.forEach { dep ->
                project.dependencies.add('earlib', dep)
            }
    }
// This loosely resembles the actual code I used. The thought process is
// right, it just might have a couple of syntax errors.
// Obviously, I defined an `earlib` configuration in the subproject.
I tried running this code in the configuration phase, as well as in the doFirst {} block of the ear task. Each had different problems.
The former didn't work because, at the point in the configuration phase when this code ran, the dependencies weren't configured yet.
The latter didn't work because dependencies can't be added during the execution phase (thinking back, it sounds ridiculous that I even tried it).
My question is: Can I find a phase in the build lifecycle, where I can find and modify the dependencies? Is there another workaround to solve my problem?
The technical answer to your question is that you can use either:
A configuration.incoming.beforeResolve hook to do it at the last minute, only when the configuration really needs to be resolved.
An afterEvaluate block, assuming the other dependencies are not themselves defined in an afterEvaluate.
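A minimal sketch of the first option, assuming the `deploy`/`earlib` configuration names from the question (this is an illustration, not tested against your build):

```groovy
// Just before 'earlib' is resolved, copy each deployed subproject's own
// 'earlib' dependencies into the root project's 'earlib' configuration.
configurations.earlib.incoming.beforeResolve {
    configurations.deploy.allDependencies
        .withType(ProjectDependency)
        .each { projDep ->
            projDep.dependencyProject.configurations.earlib
                .allDependencies.each { dep ->
                    configurations.earlib.dependencies.add(dep)
                }
        }
}
```

The hook fires once, right before resolution, which is late enough for all subproject dependencies to have been declared.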
However, the right solution would be to leverage the dependency management engine of Gradle and effectively declare that your root project, the one building the EAR, has dependencies on the specific configurations of the subprojects.
Not knowing your full setup and details, I believe the above would still be the more correct solution, though you may have to filter the subproject artifacts from the resulting graph.
Ideas on how this works in recent Gradle versions: https://docs.gradle.org/6.2/userguide/cross_project_publications.html Most of the things explained there should also work with the latest 5.x versions.
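The declarative approach would look something like this in the root project (subproject name is a placeholder):

```groovy
// Depend directly on the subproject's 'earlib' configuration instead of
// copying its dependencies around by hand. Gradle then resolves that
// configuration, transitives included, into the .ear's lib folder.
dependencies {
    earlib project(path: ':webapp', configuration: 'earlib')
}
```

As noted above, you may still need to filter the subproject's own artifact out of the resulting graph.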
I've recently returned to working on a Scala project, after having spent some time working with the nodejs ecosystem. After getting used to package managers like npm and yarn, I feel like I need to rethink the tools/processes I've been using for Java/Scala. With that in mind, there are several problems which appear to exist in the JVM world for which I'd like to know if there's some automated solution:
Given some list of dependencies without versions (but with group, module), is there some automated way to detect what the valid combinations (if any exist) of versions for each dependency are? Specifically, ensuring that there are no conflicting transitive dependencies?
I believe Java Modules should reduce/eliminate this issue, but I'm currently limited to Java 8 so can't use them.
Aside from manually changing version numbers in my build.gradle, is there any automated way to update a dependency from cli?
For example, I can do yarn install <package>#<version> to record the new version of a nodejs library I depend on and install it in one step - does anything similar exist for JVM projects?
Are there any tools similar to updtr for Java/Scala projects? Basically, a tool that will automatically try to update my dependencies, run the tests with the new versions, and roll back if anything fails.
In case it matters, I'm currently using Gradle as my build tool in a Scala 2.11 project, but I'm curious to know about answers that would apply to any mixed-language project using any build tool. Ultimately I just want to avoid checking every one of my dependencies against every other dependency by hand - anything else is an extra nicety.
I can answer only point 3 of your question, and even that only partially.
You might be interested in Gradle Versions Plugin by Ben Manes.
This plugin does not update your dependencies (so all the more it does not have the test-running + rollback functionality).
However, it will list all the dependencies that can be upgraded, like that (it's only one of the possible formats):
The following dependencies are using the latest integration version:
 - backport-util-concurrent:backport-util-concurrent:3.1
 - backport-util-concurrent:backport-util-concurrent-java12:3.1

The following dependencies exceed the version found at the integration revision level:
 - com.google.guava:guava [99.0-SNAPSHOT <- 3.0]
     http://code.google.com/p/google-guice/
 - com.google.inject.extensions:guice-multibindings [2.0 -> 3.0]
     http://code.google.com/p/google-guice/

Gradle updates:
 - Gradle: [4.6 -> 4.7 -> 4.8-rc-2]
Source: Report format
Moreover, the plugin can be configured to control what is not considered an upgradeable version:
boolean rejected = ['alpha', 'beta', 'rc', 'cr', 'm'].any { qualifier ->
    selection.candidate.version ==~ /(?i).*[.-]${qualifier}[.\d-]*/
}
Source: Revisions
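Putting it together, applying the plugin and wiring in that rejection rule looks roughly like this (the plugin version is an example; check the plugin's README for the current one and the exact configuration syntax for your plugin version):

```groovy
// build.gradle
plugins {
    id 'com.github.ben-manes.versions' version '0.20.0'
}

dependencyUpdates.resolutionStrategy {
    componentSelection { rules ->
        rules.all { selection ->
            // treat pre-release qualifiers as non-upgradeable candidates
            boolean rejected = ['alpha', 'beta', 'rc', 'cr', 'm'].any { qualifier ->
                selection.candidate.version ==~ /(?i).*[.-]${qualifier}[.\d-]*/
            }
            if (rejected) {
                selection.reject('pre-release candidate')
            }
        }
    }
}
```

Running `gradle dependencyUpdates` then produces a report like the one shown above.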
Let's assume that we have a project which uses Maven and has some dependencies which are developed in the same company/team or even by the same person. It is clear that when some developer wants to compile the project, the specified dependencies will be fetched from the repo and downloaded locally if they are not there yet.
Now let's assume the following scenario:
The developer does not care about the dependencies, and either:
the version of the dependency is x.x.x-SNAPSHOT => Maven will fetch the latest version from the repo every 24 hours (by default). Problem: if that version is not compatible with your project, you basically don't even know what happened, because you did not change anything in your project. The only possible solution here is to compile and manage the dependency locally.
or the version of the dependency is "x.x.x.y" => Maven will fetch exactly this version and nothing else. So, to update this dependency, I need to change the version. Problem: that seems to mean that every time this dependency changes and code is pushed to the server, the version must be changed. But this sounds just ridiculous.
Possible solution:
It seems that the only possible solution in this case is to handle the internal dependencies manually (get the source from the repo and compile locally). However, two issues bother me:
This solution breaks the whole idea of Maven, which is supposed to fetch all the dependencies for you.
This solution creates difficulties for developers who just want to start development on the project but do not care about those dependencies (because those dependencies are not used in the part of the project they work on).
Is there a better solution?
You can also keep dependencies at known stable versions by pointing Maven at your own administered repository. That way you can be sure the whole team has the plugin and framework versions you intend. Our team uses a TeamCity server for building; .m2/settings.xml is configured to resolve from our TeamCity repo as well, and the team lead controls all the versions.
I have several gradle projects in my eclipse workspace. For the sake of simplicity I'm only really interested in 2 of them, let's just use A and B for this.
So the problem I'm having is that Project A has an included dependency on JBoss, which pulls in javax validation-api 1.0.0.GA, and Project B has a dependency on javax validation-api 1.1.0.Final. Since Gradle itself resolves the conflict by using the newer library, B is happy when built by Gradle. But Eclipse itself shows errors, which are very distracting while editing.
The correct version of the validation-api jar ends up in B's class path but the problem is that the Gradle IDE plugin changes the project(':A') dependency to a project reference, and Eclipse seems to give the project reference precedence over the external jar. So the old jar is preferred by extension.
I tried adding { exclude module: 'validation-api' } in B's build.gradle for the dependency on A, which works according to the output of 'gradle dependencies'. However, since Eclipse only gets as far as making it a project reference, it won't exclude the jar, and the problem remains.
Also, per this question, I tried adding { transitive = false }, and the same thing happens. I don't think even the hack posed there would work for me, since the .classpath contains a single reference to the Gradle container, so there's nothing to remove.
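For reference, the two attempts in B's build.gradle looked roughly like this (a sketch, using the era-appropriate `compile` configuration):

```groovy
dependencies {
    // first attempt: exclude the conflicting module from A's graph
    compile(project(':A')) {
        exclude module: 'validation-api'
    }

    // second attempt: drop A's transitive dependencies entirely
    // compile(project(':A')) {
    //     transitive = false
    // }
}
```

Both behave correctly under `gradle dependencies`; the failure is purely on the Eclipse side.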
I've managed to get around this by explicitly including a reference to the correct version of the jar from my gradle cache and then moving it above the Gradle Classpath Container so that eclipse sees that version first.
My question is: Is there a better/more generic way to do this? Preferably one that I can commit to source control without breaking other people's builds or requiring them to manually modify paths or properties somewhere? There is another project with what appears to be a similar issue so something I can fix in the build.gradle file would be awesome.
Worst case scenario, I could probably switch to IntelliJ if that behaves itself better than the Eclipse-Gradle integration?
These kinds of transitive dependency issues are a long-standing problem with Gradle-Eclipse integration (both in the STS tooling and in the command-line-generated .classpath metadata from Gradle's Eclipse plugin). The problem is the way Eclipse computes transitive classpaths.
Only recently did we find a reasonable solution to this problem. Actually, there are now two solutions, one better than the other, but depending on your situation you might want to use either of them.
The first solution is a bug fix that changes the classpath order of project dependencies so that they are no longer 'preferred' over jar dependencies (PR-74). To get this fix you may need to install the Gradle tooling from a snapshot update site, because the fix went in after 3.6.3.
This solution doesn't fix the real problem (you still have the 'wrong' stuff on the classpath); it just makes it less likely to cause real problems in your projects.
The second solution is to enable the 'Custom Tooling API model' (PR-55) introduced in STS 3.6.3. This is a bit experimental and only works with recent versions of Gradle, at least 1.12, though preferably 2.x. It also only works for projects that have 'Dependency Management' enabled (if it's not enabled, you are using the .classpath generated by Gradle's Eclipse plugin, which has the same 'broken' classpath issues as the STS tooling).
The 'custom tooling model' is really the better solution in principle, as it fixes the way Gradle classpaths are mapped to Eclipse projects: project dependencies are no longer exported, and each project gets its own classpath that takes dependency conflict resolution into account.
To enable this go to "Window >> Preferences >> Gradle" and enable checkbox "Use Custom Tooling Model".
Our project is essentially an adapter/facade over a huge number of other libraries. Their dependencies overlap, sometimes conflict, and sometimes even break the project silently, because a wrong version of a dependency provides wrong behavior behind the same interface.
We are using Ivy and Ant for basic dependency management.
What's the best practice to manage dependencies and detect wrong behavior early?
The important part of this question is about process, not tools.
If a project's dependencies are owned by other teams or third parties, that project must explicitly accept each new version of each dependency. Allowing dependencies to upgrade themselves would allow them to break the depending project without warning, which is what it sounds like is happening.
Each dependency must release known versions, whether as binaries or tags in version control or whatever is appropriate to your stack. Each time a project wants to upgrade a dependency, it must test the result of the upgrade. (As usual comprehensive automated testing will be a big help.) If it fails (either because the dependency is just broken or because the dependency brings in an incompatible version of a transitive dependency), abandon the upgrade, report the problem to the owners of the dependencies, and try again after they've released a version which fixes the problem. If it succeeds, change the version of the dependency that the project uses in its build configuration.
Ideally a project will upgrade dependencies one at a time and fully test each upgrade separately. If it's necessary to upgrade more than one dependency all at once (perhaps because two dependencies both depend on a third dependency of which there can only be one version in the system) that's fine, although it's a bigger change and thus riskier. If your project has transitive dependencies like this, it will be worth the engineering effort to make them backward-compatible over as many versions as is reasonable.
Of course, many tools support this process easily enough: just pin each dependency to a specific version.
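In Gradle, for example, pinning simply means declaring exact versions and making conflict resolution deliberate (a sketch; the coordinates are placeholders):

```groovy
dependencies {
    // exact, explicitly accepted versions; upgrading means editing a
    // line here only after the new version has been tested
    compile 'com.example:shared-library:2.3.1'
    compile 'com.example:other-library:1.0.4'
}

configurations.all {
    // surface transitive version conflicts as build failures instead of
    // letting Gradle silently pick a winner
    resolutionStrategy.failOnVersionConflict()
}
```

Ivy supports the same discipline by declaring fixed revisions rather than dynamic ones like `latest.integration`.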