Maven versioning of internal dependencies - java

Let's assume that we have a project which uses Maven and has some dependencies that are developed in the same company/team, or even by the same person. Clearly, when a developer wants to compile the project, the specified dependencies will be fetched from the repository and downloaded locally if they are not there yet.
Now let's assume the following scenario:
The developer does not care about the dependencies, and either:
the version of the dependency is x.x.x-SNAPSHOT => Maven will fetch the latest version from the repo every 24 hours (by default). Problem: if that version is not compatible with your project, you basically don't even know what happened, because you did not change anything in your project. The only possible solution here is to compile and manage the dependency locally.
the version of the dependency is "x.x.x.y" => Maven will fetch exactly this version and nothing else, so to update this dependency I need to change the version. Problem: this seems to mean that every time this dependency gets some changes and code is pushed to the server, the version must be changed. But that just sounds ridiculous.
Possible solution:
It seems that the only possible solution in this case is to handle the internal dependencies manually (get the source from the repo and compile locally). However, two issues bother me:
This solution breaks the whole idea of Maven, which is supposed to fetch all the dependencies for you.
This solution creates difficulties for developers who just want to start working on the project but do not care about those dependencies (because the dependencies are not used in the part of the project they are working on).
Is there a better solution?

You can also keep control over the latest stable dependencies by making Maven work with your own administered repository. That way you can be sure the whole crew has the plugins, frameworks, etc. at the versions needed. Our team uses a TeamCity server for its builds; .m2/settings.xml is configured to work with our TeamCity repository as well, and the team lead controls all versions.
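As a rough sketch of that setup (the id and URL here are placeholders for your own server), .m2/settings.xml can route every download through the administered repository:
<settings>
    <mirrors>
        <mirror>
            <id>internal-repo</id>
            <name>Company repository manager</name>
            <!-- placeholder URL: point this at your own administered repository -->
            <url>http://repo.mycompany.example/maven</url>
            <mirrorOf>*</mirrorOf>
        </mirror>
    </mirrors>
</settings>
With mirrorOf set to *, every request goes through the internal server, so only the versions the team lead has published (or cached) are available to the crew.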

Related

Upgrade distant transitive dependencies in Gradle

OK, Log4j has a vulnerability in versions from 2.0 up to (but not including) 2.15. I've been charged with updating the version to 2.15 in some Java applications we have. Digging in, this is pretty easy with Gradle:
compile 'org.apache.logging.log4j:log4j-api:2.15.0'
compile 'org.apache.logging.log4j:log4j-core:2.15.0'
compile('io.sentry:sentry-log4j2:5.4.3') {
    exclude group: 'org.apache.logging.log4j'
}
solves the issue. But everything can't be that simple, right? OF COURSE! We have an application that references an internal artifact whose source code DOESN'T EXIST. It'd be easy to do the above in the internal artifact and publish a new version, but noooo. The internal artifact requires Spring Boot, so updating the main application like this does not solve the issue:
compile('org.apache.logging.log4j:log4j-api:2.15.0')
compile('org.apache.logging.log4j:log4j-core:2.15.0')
compile('com.xxxx:xxxxx:0.0.1') { // <--
    exclude group: 'org.apache.logging.log4j'
}
While the internal artifact does not include Log4j with this setup, Spring Boot cannot find its reference to Log4j, because Spring Boot is encapsulated inside the internal artifact.
I've been working at this for some time. I've tried implementation constraints. I've tried downloading the artifact, unzipping it, and decompiling the class files back into Java, but the decompiler was doing some optimization and I couldn't determine the target Java version from the decompiled classes. Which is scary and would require a lot of testing before going into production.
How the hell do I either make the aforementioned Log4j version available to this mysterious artifact, or force the artifact to use a different version?
P.S. I've run gradle dependencies and it's 2.x -> 2.15. I've confirmed everything works fine with this upgrade.
P.P.S. The artifact is built with Maven. I don't know if that matters, and I don't think it does.
P.P.P.S. I've edited this a few times to improve clarity; if this is not your first time here, please re-read.
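For reference, the standard Gradle lever for this situation is a forced resolution strategy, which overrides the version requested anywhere in the dependency graph, including inside an opaque artifact. A minimal sketch (not claimed to be the fix the asker ended up with):
configurations.all {
    resolutionStrategy {
        // Force the patched Log4j modules everywhere, including in the
        // transitive graph of the internal artifact.
        force 'org.apache.logging.log4j:log4j-api:2.15.0'
        force 'org.apache.logging.log4j:log4j-core:2.15.0'
    }
}
Unlike an exclude, this keeps Log4j on the classpath (so Spring Boot can still find it) while pinning it to the patched version.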

Issues excluding transitive dependency of project reference from Eclipse classpath

I have several gradle projects in my eclipse workspace. For the sake of simplicity I'm only really interested in 2 of them, let's just use A and B for this.
So the problem I'm having is that Project A has an included dependency on JBoss, which pulls in javax validation-api 1.0.0.GA, and Project B has a dependency on javax validation-api 1.1.0.Final. Since Gradle itself resolves the conflict by using the newer library first, B is happy when built by Gradle. But Eclipse itself shows errors, which are very distracting while editing.
The correct version of the validation-api jar ends up in B's class path but the problem is that the Gradle IDE plugin changes the project(':A') dependency to a project reference, and Eclipse seems to give the project reference precedence over the external jar. So the old jar is preferred by extension.
I tried adding { exclude module: 'validation-api' } in B's build.gradle for the dependency on A, which works according to the output of 'gradle dependencies'; however, since Eclipse only gets as far as making it a project reference, it won't exclude the jar, and the problem remains.
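Presumably the attempted exclusion in B's build.gradle looked something like this (a sketch using the names from the question):
dependencies {
    compile(project(':A')) {
        // keep A's classes but drop its transitive validation-api jar
        exclude module: 'validation-api'
    }
}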
Also, per this question, I tried adding { transitive = false }, and the same thing happens. I don't think even the hack posed there would work for me, since the .classpath contains a single reference to the Gradle container, so there's nothing to remove.
I've managed to get around this by explicitly including a reference to the correct version of the jar from my Gradle cache and then moving it above the Gradle Classpath Container, so that Eclipse sees that version first.
My question is: Is there a better/more generic way to do this? Preferably one that I can commit to source control without breaking other people's builds or requiring them to manually modify paths or properties somewhere? There is another project with what appears to be a similar issue so something I can fix in the build.gradle file would be awesome.
Worst case scenario, I could probably switch to IntelliJ if that behaves itself better than the Eclipse-Gradle integration?
These kinds of transitive dependency issues are a long-standing problem with Gradle-Eclipse integration (both in the STS tooling and in the command-line-generated .classpath metadata from Gradle's Eclipse plugin). The problem is the way Eclipse computes transitive classpaths.
Only recently did we find a reasonable solution to this problem. Actually, there are now two solutions, one better than the other, but depending on your situation you might want to use either of them.
The first solution is a bug fix that changes the classpath order of project dependencies so that they are no longer 'preferred' over jar dependencies (PR-74). To get this fix you may need to install the Gradle tooling from a snapshot update site, because the fix went in after 3.6.3.
This solution doesn't fix the real problem (you still have the 'wrong' stuff on the classpath); it just makes it less likely to cause real problems in your projects.
The second solution is to enable the 'Custom Tooling API model' (PR-55) introduced in STS 3.6.3. This is a bit experimental and only works for recent versions of Gradle, at least 1.12, but probably better to use 2.x. It also only works for projects that have 'Dependency management' enabled (if it is not enabled, you are using the .classpath generated by Gradle's Eclipse plugin, which has the same 'broken' classpath issues as the STS tooling).
The 'custom tooling model' is really the better solution in principle, as it fixes the way the Gradle classpath gets mapped to Eclipse projects: project dependencies are no longer exported, and each project gets its own classpath with dependency conflict resolution taken into account.
To enable this go to "Window >> Preferences >> Gradle" and enable checkbox "Use Custom Tooling Model".

Best practice for dependency management in a project with a huge number of dependencies

Our project is essentially an adapter/facade over a huge number of other libraries. The dependencies overlap, are sometimes in conflict, and sometimes even break the project silently, because a wrong dependency version provides the wrong behavior behind the same interface.
We are using Ivy and Ant to do basic dependency management.
What's the best practice to manage dependencies and detect wrong behavior early on?
The important part of this question is about process, not tools.
If a project's dependencies are owned by other teams or third parties, that project must explicitly accept each new version of each dependency. Allowing dependencies to upgrade themselves would let them break the depending project without warning, which is what it sounds like is happening.
Each dependency must release known versions, whether as binaries or tags in version control or whatever is appropriate to your stack. Each time a project wants to upgrade a dependency, it must test the result of the upgrade. (As usual comprehensive automated testing will be a big help.) If it fails (either because the dependency is just broken or because the dependency brings in an incompatible version of a transitive dependency), abandon the upgrade, report the problem to the owners of the dependencies, and try again after they've released a version which fixes the problem. If it succeeds, change the version of the dependency that the project uses in its build configuration.
Ideally a project will upgrade dependencies one at a time and fully test each upgrade separately. If it's necessary to upgrade more than one dependency all at once (perhaps because two dependencies both depend on a third dependency of which there can only be one version in the system) that's fine, although it's a bigger change and thus riskier. If your project has transitive dependencies like this, it will be worth the engineering effort to make them backward-compatible over as many versions as is reasonable.
Of course many tools support this process easily enough: just pin each dependency to a specific version.
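With Ivy, which this question uses, pinning amounts to giving every dependency an exact rev instead of a dynamic revision such as latest.integration. A minimal ivy.xml fragment (the artifacts shown are only illustrative):
<dependencies>
    <!-- exact revisions: upgrades only happen when someone edits this file -->
    <dependency org="commons-io" name="commons-io" rev="2.2"/>
    <dependency org="org.slf4j" name="slf4j-api" rev="1.7.5"/>
</dependencies>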

Apache Maven: where do dependencies and libraries installed locally end up?

I'm building a Java project that has a dependency on a library. Running mvn.bat clean install on the library produced the target subdirectories as expected, and the outer project then built fine with mvn.bat clean install as well.
What's not expected is that when I deleted the entire directory of the library, the outer project still built fine, although the library it depends on was gone.
How does this work?
UPDATE: Turns out Maven keeps some sort of cache in %USERPROFILE%\.m2.
You are most likely thinking of your local repository, where everything you install locally (and everything Maven downloads for you from the Central repository) is put for later use.
The behavior you describe is intentional: it allows building A once and then letting B reference it whenever needed, without having to recompile A every time. This is usually very desirable, especially in teams or with large code bases.
Note that for code that is still changing you should be using -SNAPSHOT artifacts; they are treated slightly differently.
Your dependencies are always downloaded into .m2/repository.
If you want some predictability about the libraries your team downloads, you can put in place a repository manager like Nexus: https://repository.apache.org/index.html#welcome
Instead of downloading dependencies from Maven Central, your developers will download their dependencies from this repository manager.
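As a sketch, the project side then declares the manager instead of Maven Central; the URL below is hypothetical and stands in for your own server:
<repositories>
    <repository>
        <id>company-repo</id>
        <!-- hypothetical URL; substitute your repository manager's address -->
        <url>http://nexus.mycompany.example/repository/maven-public/</url>
    </repository>
</repositories>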

Maven: repositories

I have a small Java project in a version control system (git), shared by 4 developers. I'm thinking about using Maven in this project as a build tool.
Dependency management is a wanted feature, but I don't want:
- automatic updates of dependencies (as this could break my software).
- to rely on an Internet connection to download dependencies from a remote repository and be able to compile my code.
Therefore, the questions:
1) May I configure Maven to use local dependencies (e.g. jars shared in a VCS)? I don't have several dependencies shared among several projects, and my dependencies will rarely be updated, so using Maven repositories is not worth it to me, imho.
2) If I choose to use a Maven repository, may I configure one in my local network? I don't want a remote repository mirror or a portal to the remote repository. I want a standalone repository with my dependencies, located at a server in my local network.
3) If I use the default Maven approach with the remote repository, could I turn off dependency updates after all dependencies are downloaded the first time?
Thanks in advance for any help.
Answer to 1:
Yes you can; google for system-scope dependencies. BUT: it is not a good idea to do this, because you will lose one of Maven's key features.
<dependency>
    <groupId>com.mycompany</groupId>
    <artifactId>app-lib</artifactId>
    <version>3.1.0</version>
    <scope>system</scope>
    <systemPath>${project.basedir}/libs/app-lib-3.1.0.jar</systemPath>
</dependency>
Answer to 2:
Yes you can:
- Artifactory
- Nexus
Answer to 3:
Yes you can. For that case you can use the --offline flag, OR, the better approach: release all dependencies.
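For example, once everything has been downloaded into the local repository, a full build can run with no network access at all (-o is the short form):
mvn --offline clean install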
Some thoughts:
You want to use a dependency-management system without actually using dependency management; that sounds strange to me.
If you fear that changes within your libs may break your code, just don't use SNAPSHOTs.
Try a versioning scheme of some kind. We use
x.y.z
if z changes in a release, the jar should be compatible.
if y changes, you'll have to change your code.
if x changes... well, everything needs to be renewed.
Your concern about being dependent on Internet connectivity is a valid one, but I don't think it's as bad as you think.
After a dependency is downloaded from the Central Repository, it is saved to a cache on your hard drive (located at "~/.m2/repository"). From then on, the copy in the cache is used and Internet connectivity is no longer required to compile your application.
When you compile your first project in Maven, it will have to download a crap-load of stuff. But after that, all subsequent compilations will go much faster and they won't need to download anything.
Also, Maven's versioning scheme makes it so that all "release" versions of a dependency cannot change once they are deployed to the Central repository. For example, if I'm using version "2.2" of "commons-io", I know that this version of the library will always stay the same. A change cannot be made without releasing a new version.
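Pinning such a release is just the ordinary dependency declaration; once resolved, these coordinates will always yield the same artifact:
<dependency>
    <groupId>commons-io</groupId>
    <artifactId>commons-io</artifactId>
    <version>2.2</version>
</dependency>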
"Snapshot" versions, however, can change. If a library's version is a snapshot, then it will end in "-SNAPSHOT" (for example, "1.2-SNAPSHOT" means the library will eventually be released as "1.2"). I don't think the Central repository allows snapshot builds though. You shouldn't use them in production code anyway.
I thought that Internet connectivity was only needed for the first compile, but I get several download messages whenever I change code. Messages like these:
Downloading: http://repo.maven.apache.org/maven2/org/eclipse/core/resources/maven-metadata.xml
Downloading: http://repository.springsource.com/maven/bundles/external/org/eclipse/core/resources/maven-metadata.xml
Downloading: http://repository.springsource.com/maven/bundles/release/org/eclipse/core/resources/maven-metadata.xml
Downloading: https://oss.sonatype.org/content/repositories/snapshots/org/eclipse/core/resources/maven-metadata.xml
Why is that? Is Maven looking for updates in these repositories, or is there another reason it downloads these metadata XMLs so often?
