Maven allows developers to have their artifacts depend on ancient artifacts as old as 10 years (e.g. commons-el:commons-el:1.0, released in 2005, or jetty:javax.servlet:5.1.11, released in 2007). Depending on specific old-versioned artifacts seems to be a common practice in the Java ecosystem, because their updates often break APIs silently.
Are those old artifacts patched if a security flaw is found? Who is going to take care of this?
If I pull in, say, the newest release of Spark, org.apache.spark:spark-core_2.11:2.0.0, then after Maven downloads the 3 GiB of its dependencies, I can see that a couple of them are even older than 2005. If the resulting Spark build is executed, will those outdated dependencies expose potential security flaws?
Note: this is neither about security of java itself, nor security of maven, but artifacts delivered by maven.
Maven's central repository requirements do not speak to transitive dependency security issues.
The responsibility for updating transitive dependencies lies with the owner of the dependency. The owner/maintainer would need to update their dependencies (the ones with the security flaws) and implement fixes for any issues this causes in their codebase.
As a user of dependencies in your application that may have insecure transitive dependencies, you have a few options:
Update to the latest version of the dependency; the dependency owner may have already implemented a fix.
Exclude insecure transitive dependencies (a sketch follows this list). Use at your own risk, as this may have unintended effects. Often it does work, as the insecure dependency may not actually be used by the dependency you needed.
Fork the dependency codebase, update the insecure transitive dependency, fix any issues, and submit a pull request.
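For option 2, the exclusion goes on the dependency that transitively pulls in the insecure artifact. A minimal sketch, reusing the coordinates from the question as placeholders:

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.0.0</version>
        <exclusions>
            <!-- hypothetical example: drop an old transitive dependency -->
            <exclusion>
                <groupId>commons-el</groupId>
                <artifactId>commons-el</artifactId>
            </exclusion>
        </exclusions>
    </dependency>

If the excluded classes are actually needed at runtime, you can then declare a patched version as a direct dependency so that it wins over the excluded one.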
Also, if you want a detailed report on the security of dependencies in your Java application, you can check out the OWASP Dependency Checker, which checks your project's dependencies (including transitive ones) against the NIST NVD database.
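For reference, wiring the checker into a build looks roughly like this (a sketch; the version number is illustrative, pick a current one):

    <plugin>
        <groupId>org.owasp</groupId>
        <artifactId>dependency-check-maven</artifactId>
        <version>8.4.0</version>
        <executions>
            <execution>
                <goals>
                    <!-- scans direct and transitive dependencies against the NVD -->
                    <goal>check</goal>
                </goals>
            </execution>
        </executions>
    </plugin>

Running mvn verify then produces a report of known CVEs; the plugin can also be configured to fail the build above a chosen severity threshold.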
If security flaws are discovered in a particular package, the expectation is that a new, patched version is published by the authors. To your point, the older vulnerable version remains in Maven Central, and at first glance this would appear to be a very bad thing.
It leads to the following obvious questions:
Why doesn't someone patch these vulnerable versions?
Why doesn't someone remove these vulnerable versions?
Let's explore the consequences.
If someone patches, in place, a library version that I'm using, how certain am I that the code remains functionally the same? This is why fixes are handled as new versions. It's also a lot less work for the author.
So if old vulnerable versions are not being patched, surely they should be deleted? Well... if users don't want to use the latest patched version of a library, for fear it would break their code, they would certainly be just as unhappy if someone removed the version they do depend on. Damned if you do, and damned if you don't.
So in conclusion it's a case of user beware. We all need to manage our dependencies and adapt to changes in the various underlying APIs. If we ignore change, we run the risk of exposure to a vulnerability without an option to upgrade. Welcome to software development :-)
I'm using the Snyk service to check my projects for vulnerabilities.
Projects with OkHttp dependency have one common vulnerability:
Vulnerable module: org.jetbrains.kotlin:kotlin-stdlib
Introduced through: com.squareup.okhttp3:okhttp#4.10.0
You can check the full report here: https://snyk.io/test/github/yvasyliev/deezer-api
In the Overview section there is a note:
Note: As of version 1.4.21, the vulnerable functions have been marked as deprecated. Due to still being useable, this advisory is kept as "unfixed".
I have two questions:
Can I fix this vulnerability in a Maven project, and how?
If the vulnerability cannot be fixed, does that mean that every single Kotlin application has this vulnerability by default (since it's coming from kotlin-stdlib)?
The latest stable version of OkHttp is added to project by Maven:
    <dependency>
        <groupId>com.squareup.okhttp3</groupId>
        <artifactId>okhttp</artifactId>
        <version>4.10.0</version>
    </dependency>
As with all vulnerable software libraries, you need to assess whether or not you're actually affected by the vulnerability that is included.
Details on the vulnerability are listed in your Snyk report.
The problematic functions are createTempDir and createTempFile from the package kotlin.io. As outlined in your report, as well as in the Kotlin documentation, these functions are a possible source of information leakage, due to the created file/directory having too-wide permissions; that is, everyone with access to the file system can read the files.
Is this a problem?
If you (and any dependencies you're including in your software) are NOT using one of the aforementioned functions, you're not vulnerable.
Also, if you (or the dependency) adjust the file permissions after calling one of these functions and before writing any information, you're not affected.
Even in cases where the functions are used and the permissions are not adjusted, that still might not pose a problem, as long as the data stored in the files does not need to be protected, i.e. is NOT secrets or personal information.
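For illustration, a common mitigation is to bypass the Kotlin helpers and create the file via java.nio.file with explicit owner-only permissions. This is a minimal sketch, assuming a POSIX file system (the permission attribute throws UnsupportedOperationException on Windows):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.attribute.FileAttribute;
    import java.nio.file.attribute.PosixFilePermission;
    import java.nio.file.attribute.PosixFilePermissions;
    import java.util.Set;

    public class PrivateTempFile {
        public static Path create() throws IOException {
            // rw------- : only the owning user can read or write the file,
            // unlike kotlin.io.createTempFile, which relies on platform defaults
            Set<PosixFilePermission> perms = PosixFilePermissions.fromString("rw-------");
            FileAttribute<Set<PosixFilePermission>> attr = PosixFilePermissions.asFileAttribute(perms);
            return Files.createTempFile("app-", ".tmp", attr);
        }
    }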
To address your questions directly:
Unfortunately, there is no easy way to fix this. You would either have to use a version of kotlin-stdlib in which the functions had not yet been introduced, exclude kotlin-stdlib from your classpath entirely, or use a version in which the functions are no longer included, which has not been released yet. However, options 1 and 2 do not make much sense: if the software keeps working without the functions, that means no one is using them and you were not affected anyway.
No and yes. Everyone relying on kotlin-stdlib in one of the affected versions has the functions on their classpath. However, as long as they are not used, or the usage does not pose a problem as explained above, the software is not vulnerable.
The OkHttp project seems to know of the vulnerability, but seems not to be affected.
I've recently switched from Jersey 1 to Jersey 2 for some projects I work on. The biggest annoyance I've run across with Jersey 2 is that it uses HK2, which for some reason repackages standard Maven artifacts. To avoid potential annoying-to-debug issues, I try to avoid pulling in the same classes from different projects. I use the Ban Duplicate Classes Maven enforcer rules from the Extra Enforcer Rules dependency to break the build if this occurs.
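For context, that setup looks roughly like this in the pom.xml (a sketch; both version numbers are illustrative):

    <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-enforcer-plugin</artifactId>
        <version>3.4.1</version>
        <dependencies>
            <dependency>
                <groupId>org.codehaus.mojo</groupId>
                <artifactId>extra-enforcer-rules</artifactId>
                <version>1.7.0</version>
            </dependency>
        </dependencies>
        <executions>
            <execution>
                <id>ban-duplicate-classes</id>
                <goals>
                    <goal>enforce</goal>
                </goals>
                <configuration>
                    <rules>
                        <!-- fails the build when two artifacts ship the same class -->
                        <banDuplicateClasses/>
                    </rules>
                </configuration>
            </execution>
        </executions>
    </plugin>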
According to the aforementioned Ban Duplicate Classes enforcer rule, switching to Jersey 2 has introduced the following conflicts between its artifacts and standard ones I was previously using:
hk2 Artifact                                                   Conflicting Artifact
org.glassfish.hk2.external:aopalliance-repackaged:2.3.0-b07    aopalliance:aopalliance:1.0
org.glassfish.hk2.external:bean-validator:2.3.0-b07            com.fasterxml:classmate:0.8.0 (used by org.hibernate:hibernate-validator:5.0.0.Final)
org.glassfish.hk2.external:bean-validator:2.3.0-b07            javax.validation:validation-api:1.1.0.Final
org.glassfish.hk2.external:bean-validator:2.3.0-b07            org.hibernate:hibernate-validator:5.0.0.Final
org.glassfish.hk2.external:bean-validator:2.3.0-b07            org.jboss.logging:jboss-logging:3.1.0.GA
org.glassfish.hk2.external:javax.inject:2.3.0-b07              javax.inject:javax.inject:1
My solution has been to exclude the standard artifacts from the dependencies that transitively pull them in, and therefore use only the hk2 artifacts. I figure this is safer: I don't know what else the hk2 artifacts are pulling in that I might be missing if I were to exclude them instead (for example, the bean-validator artifact appears to be repackaging at least four artifacts). The downsides to this are that, first, I have a ton of exclusions peppering my dependencies that were bringing in otherwise innocuous API dependencies, such as validation-api. Secondly, my artifacts are now exporting HK2-repackaged dependencies, rather than the actual API classes I would prefer to be exporting.
Ultimately, my questions are:
Why does HK2 repackage everything? Is there some good reason for this?
What is HK2 actually repackaging, and can I just use the standard API versions instead? How would I figure this out? I've cloned the HK2 project, and I've had a bit of trouble figuring out where to begin to find this out.
Barring the actual answer to these questions, what would be a good forum for contacting the developers behind HK2 so I can ask the question directly? I've looked through the website, and while I've found some mailing lists, I'm not seeing anything obviously appropriate for asking this question.
HK2 runs in an OSGi environment for products such as GlassFish. Unfortunately most of the standard jars such as javax.inject, bean-validator and aopalliance do not come with proper OSGi headers. So hk2 needs to repackage them with OSGi headers so that they will work properly in that environment.
Also since GlassFish is the RI for Java EE there are certain legal requirements that are made about the availability of the source code, so some of the repackaging that is done is to satisfy the availability of source code requirements.
That being said, if you are NOT running in an OSGi environment, it is safe to replace these jars with the standard versions (though I myself have not tried this).
I have a Java project, built with Maven, that aggregates several components, each one in its own Maven project. Any one of these components may evolve separately.
The structure of my project can be described as follows:
my-main-project that depends on:
my-component-1
my-component-2
etc.
Nowadays, all the pom.xml files use "snapshot" versions, so they are all using the "latest" version available in my repository.
But once I send a release version to my customer, I'm supposed to freeze the versions and make a TAG (or equivalent) in my source-control, so I can restore a previous state in case of maintenance.
So, my question is: should I change all pom.xml files before each release, give version numbers to the components, and tie everything together with these dependency versions? Also, if I have many components (my project currently has 30+ small subcomponents), would I have to renumber/reversion each one before each release? When a single component evolves (due to a bug fix or enhancement), I must increase its version so that the changes do not affect pre-existing releases, right?
How do people using Maven generally handle this many-component versioning case?
Of course, I could just rely on my version-control tags to restore to a previous point in time and just tag every component on each release, but I don't like this approach, since dependency versioning (with Maven) gives me much more control and visibility over what is packaged, the relations of (broken) compatibility, and much more.
General Considerations
You should consider the relations between your components.
Are they really independent (each one vs. each other)? Or are there some kinds of relations between them, some common lifecycles?
If you find some relationship between them, consider using Maven multi-modules: http://www.sonatype.com/books/mvnex-book/reference/multimodule.html. In a few words, you will have a parent with one version, and some modules (some jars, much like Spring and its submodules). This will help you reduce version management.
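A minimal parent POM sketch, reusing the names from the question as placeholders:

    <project>
        <modelVersion>4.0.0</modelVersion>
        <groupId>com.example</groupId>
        <artifactId>my-main-project</artifactId>
        <version>1.0.0-SNAPSHOT</version>
        <packaging>pom</packaging>
        <!-- all modules are built, versioned and released together -->
        <modules>
            <module>my-component-1</module>
            <module>my-component-2</module>
        </modules>
    </project>

Each module then declares this parent and inherits its version, so a release bumps everything in one place.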
You may also consider using the maven-release-plugin. It will help you tag, build, and deploy your modules automatically, dealing more easily with versioning and the links to your SCM and repository.
Moreover, combined with multi-modules, it will help you drastically!
There are a lot of topics dealing with this on Stack Overflow.
I don't know if you already know all that. I could explain it a lot further if you want, but you may have enough elements to search by yourself if you don't.
Straight Answers
So, my question is: should I change all pom.xml files before each release, give version numbers to the components, and tie everything with this dependency versions?
Yes, you should. In application lifecycle management, following the changes is REALLY important. So, as you can imagine, and as you point out, you really should build and tag each of your components. It can be painful, but the maven-release-plugin and multi-modules (even with a continuous integration platform) make it easier.
would I have to renumber/reversion each one before each release?
For exactly the same reasons: yes!
must I increase its version so that the changes do not affect pre-existing releases, right?
Yes, you should. Assuming you choose a common versioning scheme like MAJOR.minor.correction, the first number indicates compatibility breaks. Minor versions might bring some breaks, but should not. Corrections should NEVER affect compatibility.
How people using maven generally handle this many-component versioning case?
I cannot reply for everyone, but my previous comments on the release plugin and multi-modules are considered best practices. If you want to go a little bit further, you could imagine using a more powerful SCM (ClearCase, Perforce, ...), but the Maven integration is rarer, not as well documented, and the community provides fewer examples than for SVN or Git.
Maven Release Plugin
If you are using a multi-module pom.xml you should be able to run the release plugin with -DautoVersionSubmodules=true and have it do a "release" build of all your modules, remove the -SNAPSHOT versions, and upload the artifacts to your repository. That is exactly what the release plugin and its workflow exist to do.
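For reference, the usual two-step invocation looks like this (a sketch; see the plugin documentation for the full set of options):

    # bump versions, run the build, commit and tag in your SCM
    mvn release:prepare -DautoVersionSubmodules=true
    # check out the tag, rebuild, and deploy the artifacts to your repository
    mvn release:perform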
We have multiple Maven projects depending on our own common libraries.
When we upgrade a library it would be useful to quickly find out which projects have a dependency on the library (and might need to use the new version)
Obviously I can manually look in all the pom files or write a script to do it but this is less than ideal.
Are there any tools that provide this functionality, e.g. a Hudson plugin, Nexus, Artifactory, etc.?
EDIT:
Some clarifications:
I don't want to upgrade all projects at once. The regression-testing and release effort makes this impractical and often unnecessary (even with automated testing and releasing). I just want a report showing me which projects may need to have the library upgraded...
Many answers focus on the project itself flagging which version it uses. Ideally the solution would work so that, for a given library, I can ask what uses it. This looks like what the Nexus issue below is talking about.
Hudson does something similar with automated downstream Maven builds. I may look into extending this with a Hudson plugin.
I know that, albeit a good tool for managing POMs and jars, Artifactory has no such feature as the one you're asking for :(
You might want to check this StackOverflow question: Look for new versions of dependencies
Alternatively you might want to check out the Versions Maven Plugin. Among other things, it allows you to scan through your project's dependencies and produce a report of those dependencies that have newer versions available.
A quite common practice is to declare all the versions in the dependencyManagement section of the parent module and reference dependencies without versions everywhere else. In this case you'll need to update only one POM.
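A minimal sketch of that setup, with placeholder coordinates. In the parent POM:

    <dependencyManagement>
        <dependencies>
            <dependency>
                <groupId>com.example</groupId>
                <artifactId>commons-lib</artifactId>
                <version>1.2.3</version>
            </dependency>
        </dependencies>
    </dependencyManagement>

In each child module the version is then inherited:

    <dependency>
        <groupId>com.example</groupId>
        <artifactId>commons-lib</artifactId>
    </dependency>

Upgrading the library everywhere means editing the single version element in the parent.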
I solved this issue by using dependency version ranges to fetch the newest versions automatically.
    <dependency>
        <groupId>foo.xyzzy</groupId>
        <artifactId>bar</artifactId>
        <version>[1.0.0,2.0.0)</version>
    </dependency>
It might make sense to use a version range such as [1.0,) to include the newest version after 1.0.
The trick to making this work is to have dev and prod profiles that include and exclude the snapshot repos for your commons, and to use Hudson to automatically build projects when a dependency is rebuilt.
Not exactly what you're asking for but the Dependency Convergence report might be helpful. Here are some examples.
Update: I'm aware that I'm not exactly answering the question with my suggestion but doing it the other way (finding all projects that reference a given artifact) would require gathering all POMs of all projects and I'm not aware of a solution currently offering such a referenced by feature (a corporate repository would be the perfect candidate to implement this though).
My search is over: http://www.sonarsource.org/sonar-2-1-in-screenshots/
Sonar now provides this.
I recently added Maven snapshot build capability to a project, configured to use a unique timestamp version on the deployed artifact. But there is some confusion about whether this is the right thing to do (the snapshots in question are deployed to a public repo, not just within an entity like a company): some say it causes problems when trying to use snapshots.
So: given how much of Maven is convention based, and following perceived best practices, I am hoping there are some guidelines as to which option to choose.
(NOTE: I slightly edited the title. I am specifically interested in the benefits (or lack thereof) of including a unique timestamp, via the deploy option, for public snapshot versions; not so much in whether to make use of timestamps if they are included, although that is obviously a somewhat related question.)
As a rule you should always build against the -SNAPSHOT dependency. However, you should avoid releasing your product if it includes -SNAPSHOT dependencies. If you use the Maven Release Plug-in to automate your release, it will check to make sure you are not using -SNAPSHOT plug-ins or dependencies.
But that is not always possible. In cases where I need to release something based on a snapshot build, I use the explicit timestamp/build number rather than the -SNAPSHOT version naming scheme.
You can automate this using the Versions Maven Plugin. It provides goals to lock and unlock snapshot versions in your POM.
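For reference, the goals in question (a sketch; the behavior details are in the plugin documentation):

    # replace -SNAPSHOT versions in the POM with the current timestamped builds
    mvn versions:lock-snapshots
    # revert the locked versions back to plain -SNAPSHOT
    mvn versions:unlock-snapshots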
The whole point of a snapshot is to let someone use the latest version of the code. Why would you want to use a snapshot five versions back?
With that in mind, what do timestamps in the artifact name buy you?
I've been using Maven for about 5 years now. I've never seen a snapshot jar with a timestamp in the name.