How does the Spring BOM approach to releases and dependencies work?

I am relatively new to Maven and the JVM and am curious how Spring's approach to releases actually works? What is a BOM?
For example, the Spring Cloud page says: "Use your dependency management tools to control the version. If you are using Maven remember that the first version declared wins, so declare the BOMs in order, with the first one usually being the most recent (e.g. if you want to use Spring Boot 1.3.6 with Brixton.RELEASE, put the Boot BOM first)."
Can someone give an example of what this means in practice?

You have discovered the Maven concept of Dependency Management. The Maven documentation illustrates that concept quite well. You will also find information about the Bill of Materials (BOM) in the Maven documentation.
In short, a BOM defines dependencies (with versions), and projects that import the BOM get the version information for the artifacts they depend on from it. Thus a BOM can ensure that a set of dependencies is used with versions that are "compatible" with each other.
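In practice, the "declare the BOMs in order" advice from the question looks something like the following sketch in the consuming project's POM. The coordinates use the versions named in the question (Boot 1.3.6 with Brixton.RELEASE); because the first version declared wins in Maven, the Boot BOM comes first:

```xml
<dependencyManagement>
  <dependencies>
    <!-- Declared first, so its versions win wherever the two BOMs overlap -->
    <dependency>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-dependencies</artifactId>
      <version>1.3.6.RELEASE</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
    <!-- Declared second: fills in versions the Boot BOM does not manage -->
    <dependency>
      <groupId>org.springframework.cloud</groupId>
      <artifactId>spring-cloud-dependencies</artifactId>
      <version>Brixton.RELEASE</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```

With the BOMs imported, individual `<dependency>` entries elsewhere in the POM can omit `<version>` entirely and pick it up from dependency management.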

Related

ActiveMQ version 5.16.0 has vulnerable dependency jar

I am using ActiveMQ 5.16.0 downloaded from Apache. I see a few of the jars are older versions which have vulnerabilities, e.g.:
guava-19.0.jar (com.google.guava:guava)
jackson-databind-2.9.10.4.jar (com.fasterxml.jackson.core:jackson-databind)
shiro-core-1.5.3.jar
log4j-1.2.17.jar
I see all above vulnerable jars are located under apache-activemq-5.16.0\lib\optional\. What is use of jars under the optional directory? Is there any latest release of ActiveMQ which has all latest dependencies?
Optional dependencies are just that: Optional.
Using Shiro in your case as reference.
Optional Dependencies: Typically an optional dependency is not required for the core functionality of the library you are referencing. In this case, Shiro is only required if you intend to make use of Apache Shiro features or functionality. Shiro is used for security, so it makes sense that not everyone using ActiveMQ will need it.
Versions: Many times (not always) optional dependency versions are not set in stone, and it may be possible to use newer versions without breaking functionality. This is not always the case, so if you aim to do this, start with the preferred version, get the functionality working, and only then test an upgrade.
Vulnerabilities: Simply because a vulnerability exists, does not make it applicable to your use case. Just because there is a known vulnerability in a dependency that can do XYZ, it will likely not affect you if your use case does not make use of XYZ. A security report such as the Apache Shiro one could help in understanding this.
Additionally: I would suggest that you look into Maven or Gradle for your Java projects. This will take away some of the need to worry about these types of dependency management issues, as optional dependencies are not included in the dependency hierarchy by default.
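For reference, the "optional" mechanism described above is a single flag in the library's own POM. A minimal sketch, using the Shiro coordinates and version from the question:

```xml
<!-- In the library's POM (e.g. ActiveMQ's): marking a dependency optional
     means consumers do NOT inherit it transitively; anyone who actually
     needs the feature must declare the dependency themselves. -->
<dependency>
  <groupId>org.apache.shiro</groupId>
  <artifactId>shiro-core</artifactId>
  <version>1.5.3</version>
  <optional>true</optional>
</dependency>
```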

Security of outdated maven artifacts

Maven allows developers to have their artifacts depend on ancient artifacts as old as 10 years (e.g. commons-el:commons-el:1.0 released in 2005, or jetty:javax.servlet:5.1.11 released in 2007). It seems to be a common practice in the Java ecosystem to depend on specific old-versioned artifacts because their updates often break the API silently.
Are those old artifacts patched if a security flaw is found? Who is going to take care of this?
If I pull in, say, the newest release of Spark (org.apache.spark:spark-core_2.11:2.0.0), after Maven downloads 3 GiB of its dependencies, I can see that a couple of them are even older than 2005. If the resulting Spark is executed, will those outdated dependencies expose potential security flaws?
Note: this is neither about security of java itself, nor security of maven, but artifacts delivered by maven.
Maven's central repository requirements do not speak to transitive dependency security issues.
The responsibility for updating transitive dependencies lies with the owner of the dependency. The owner/maintainer would need to implement any fixes to issues caused in their codebase when updating their dependencies (the ones with the security flaws).
As a user of dependencies in your application that may have insecure transitive dependencies, you have a few options:
Update to the latest version of the dependency, the dependency owner may have already implemented a fix.
Exclude insecure transitive dependencies. Use at your own risk, as this may have unintended effects. Often this does work, as the insecure dependency may not actually be used by the dependency you needed.
Fork the dependency codebase, update the insecure transitive dependency, fix any issues, and submit a pull request.
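The second option, excluding an insecure transitive dependency, is a small change in the consuming POM. A sketch, assuming (as the question suggests) that spark-core transitively pulls in the old commons-el artifact:

```xml
<!-- Exclude an old transitive dependency pulled in by spark-core.
     Use at your own risk: if Spark actually calls into the excluded
     classes, this fails at runtime with NoClassDefFoundError. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.0.0</version>
  <exclusions>
    <exclusion>
      <groupId>commons-el</groupId>
      <artifactId>commons-el</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

You can verify whether the exclusion is safe by running `mvn dependency:tree` before and after, and by exercising the relevant code paths in your tests.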
Also, if you want a detailed report on the security of the dependencies in your Java application, you can check out OWASP Dependency-Check, which checks your project's dependencies (including transitive ones) against the NIST NVD database.
If security flaws are discovered in a particular package, the expectation is that a new patched version is published by the authors. To your point the older vulnerable version remains in Maven Central and at first glance this would appear to be a very bad thing.
It leads to the following obvious questions:
Why doesn't someone patch these vulnerable versions?
Why doesn't someone remove these vulnerable versions?
Let's explore the consequences....
If someone is changing a library version that I'm using, how certain am I that the code remains functionally the same? This is why patches are handled as new versions. It's a lot less work for the author.
So if old vulnerable versions are not being patched, surely they should be deleted? Well... if users don't want to use the latest patched version of a library, for fear it would break their code, they would certainly be just as unhappy if someone removed the library version they did depend on. Damned if you do, and damned if you don't.
So in conclusion it's a case of user beware. We all need to manage our dependencies and adapt to changes in the various underlying APIs. If we ignore change, we run the risk of exposure to a vulnerability without an option to upgrade. Welcome to software development :-)

Why does HK2 repackage everything?

I've recently switched from Jersey 1 to Jersey 2 for some projects I work on. The biggest annoyance I've run across with Jersey 2 is that it uses HK2, which for some reason repackages standard Maven artifacts. To avoid potential annoying-to-debug issues, I try to avoid pulling in the same classes from different projects. I use the Ban Duplicate Classes Maven enforcer rules from the Extra Enforcer Rules dependency to break the build if this occurs.
According to the aforementioned Ban Duplicate Classes enforcer rule, switching to Jersey 2 has introduced the following conflicts between its artifacts and standard ones I was previously using:
HK2 artifact | Conflicting artifact
org.glassfish.hk2.external:aopalliance-repackaged:2.3.0-b07 | aopalliance:aopalliance:1.0
org.glassfish.hk2.external:bean-validator:2.3.0-b07 | com.fasterxml:classmate:0.8.0 (used by org.hibernate:hibernate-validator:5.0.0.Final)
org.glassfish.hk2.external:bean-validator:2.3.0-b07 | javax.validation:validation-api:1.1.0.Final
org.glassfish.hk2.external:bean-validator:2.3.0-b07 | org.hibernate:hibernate-validator:5.0.0.Final
org.glassfish.hk2.external:bean-validator:2.3.0-b07 | org.jboss.logging:jboss-logging:3.1.0.GA
org.glassfish.hk2.external:javax.inject:2.3.0-b07 | javax.inject:javax.inject:1
My solution has been to exclude the standard artifacts from the dependencies that transitively pull them in, and therefore use only the hk2 artifacts. I figure this is safer: I don't know what else the hk2 artifacts are pulling in that I might be missing if I were to exclude them instead (for example, the bean-validator artifact appears to be repackaging at least four artifacts). The downsides are that, first, I have a ton of exclusions peppering my dependencies that were bringing in otherwise innocuous API dependencies, such as validation-api. Secondly, my artifacts are now exporting HK2 repackaged dependencies, rather than the actual API classes I would prefer to be exporting.
Ultimately, my questions are:
Why does HK2 repackage everything? Is there some good reason for this?
What is HK2 actually repackaging, and can I just use the standard API versions instead? How would I figure this out? I've cloned the HK2 project, and I've had a bit of trouble figuring out where to begin to find this out.
Barring the actual answer to these questions, what would be a good forum for contacting the developers behind HK2 so I can ask the question directly? I've looked through the website, and while I've found some mailing lists, I'm not seeing anything obviously appropriate for asking this question.
HK2 runs in an OSGi environment for products such as GlassFish. Unfortunately most of the standard jars such as javax.inject, bean-validator and aopalliance do not come with proper OSGi headers. So hk2 needs to repackage them with OSGi headers so that they will work properly in that environment.
Also since GlassFish is the RI for Java EE there are certain legal requirements that are made about the availability of the source code, so some of the repackaging that is done is to satisfy the availability of source code requirements.
That being said, if you are NOT running in an OSGi environment, it is safe to replace these jars with the standard versions (though I have not tried this myself).
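Going the other way (keeping the standard artifacts rather than HK2's repackaged copies) would look roughly like the sketch below, excluding the repackaged javax.inject listed in the conflict table above. The Jersey coordinates and the `2.x` version placeholder are illustrative, and I have not verified this against a real Jersey 2 build:

```xml
<!-- Keep the standard javax.inject API and drop HK2's repackaged copy.
     Per the answer above, only safe outside an OSGi environment. -->
<dependency>
  <groupId>org.glassfish.jersey.core</groupId>
  <artifactId>jersey-server</artifactId>
  <version>2.x</version>
  <exclusions>
    <exclusion>
      <groupId>org.glassfish.hk2.external</groupId>
      <artifactId>javax.inject</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<!-- The standard API artifact, declared explicitly in its place -->
<dependency>
  <groupId>javax.inject</groupId>
  <artifactId>javax.inject</artifactId>
  <version>1</version>
</dependency>
```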

How to find where a library is used across multiple pom files

We have multiple Maven projects depending on our own common libraries.
When we upgrade a library it would be useful to quickly find out which projects have a dependency on the library (and might need to use the new version)
Obviously I can manually look in all the pom files or write a script to do it but this is less than ideal.
Are there any tools that provide this functionality. e.g. a hudson plugin, Nexus, artifactory etc?
EDIT:
Some clarifications:
I don't want to upgrade all projects at once. The regression testing and release effort makes this impractical and often unnecessary (even with automated testing and releasing). I just want a report showing me which projects may need to have the library upgraded.
Many answers focus around the project itself flagging what version is used. Ideally the solution would work so that for a given library I can ask what uses this. This looks like what the Nexus issue below is talking about.
Hudson does something similar with automated downstream maven builds. I may look into extending this with a hudson plugin.
I know that, albeit being a good tool for managing POMs and jars, Artifactory has no such feature as the one you're asking for :(
You might want to check this StackOverflow question: Look for new versions of dependencies
Alternatively you might want to check the Versions Maven Plugin. Among other things, it allows you to scan through a project's dependencies and produce a report of those dependencies which have newer versions available.
A quite usual practice is to declare all the versions in the dependencyManagement section of the parent module and reference dependencies without versions everywhere else. In that case you only need to update one POM.
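That practice looks like the following sketch; the `com.example:common-lib` coordinates and version are hypothetical placeholders:

```xml
<!-- Parent POM: the single place the shared library's version lives. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.example</groupId>
      <artifactId>common-lib</artifactId>
      <version>1.4.2</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

```xml
<!-- Child module POM: no version element; it is inherited from the
     parent, so bumping the library means editing one line in one file. -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>common-lib</artifactId>
</dependency>
```

This doesn't directly answer "which projects use this library", but it reduces the search to the set of parent POMs.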
I solved this issue by using dependency version ranges to fetch the newest versions automatically.
<dependency>
    <groupId>foo.xyzzy</groupId>
    <artifactId>bar</artifactId>
    <version>[1.0.0,2.0.0)</version>
</dependency>
It might make sense to use a version range such as [1.0,) to include the newest version after 1.0.
The trick to making this work is to have dev and prod profiles that include and exclude the snapshot repos for your commons, and to use Hudson to automatically build projects when a dependency is rebuilt.
Not exactly what you're asking for but the Dependency Convergence report might be helpful. Here are some examples.
Update: I'm aware that I'm not exactly answering the question with my suggestion but doing it the other way (finding all projects that reference a given artifact) would require gathering all POMs of all projects and I'm not aware of a solution currently offering such a referenced by feature (a corporate repository would be the perfect candidate to implement this though).
My search is over: http://www.sonarsource.org/sonar-2-1-in-screenshots/
Sonar now provides this.

Maven best practices: including timestamps for snapshot releases or not?

I recently added Maven snapshot build capability to a project, configured to use a unique timestamp version on the deployed artifact. But there is some confusion regarding whether this is the right thing to do (the snapshots in question are deployed to one of the public repos, not just within an entity like a company): some say it causes problems when trying to use snapshots.
So: given how much of Maven is convention based, and following perceived best practices, I am hoping there are some guidelines as to which option to choose.
(NOTE: I slightly edited the title -- I am specifically interested in the benefits (or lack thereof) of including a unique timestamp, via the deploy option, for public snapshot versions; not so much whether to make use of timestamps if they are included, although that is obviously a somewhat related question)
As a rule you should always build against the -SNAPSHOT dependency. However, you should avoid releasing your product if it includes -SNAPSHOT dependencies. If you use the Maven Release Plug-in to automate your release, it will check to make sure you are not using snapshot plug-ins or dependencies.
But that is not always possible. In cases where I need to release something based on a snapshot build that is when I use the explicit timestamp/build number rather than the -SNAPSHOT version naming scheme.
You can automate this using the Versions Maven Plugin. It provides goals to lock and unlock snapshot versions in your POM.
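The lock/unlock goals come from the Versions Maven Plugin; a minimal declaration looks like the sketch below (the plugin version shown is an assumption — check for the current release):

```xml
<!-- Versions Maven Plugin: versions:lock-snapshots rewrites -SNAPSHOT
     versions in the POM to the concrete timestamped versions currently
     resolved, and versions:unlock-snapshots reverses the change. -->
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>versions-maven-plugin</artifactId>
      <version>2.16.2</version>
    </plugin>
  </plugins>
</build>
```

With that in place, run `mvn versions:lock-snapshots` before cutting a build based on snapshots, and `mvn versions:unlock-snapshots` to return to -SNAPSHOT versions afterwards.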
The whole point of a snapshot is to let someone use the latest version of the code. Why would you want to use a snapshot five versions back?
With that in mind, what do timestamps in the artifact name buy you?
I've been using Maven for about 5 years now. I've never seen a snapshot jar with a timestamp in the name.
