We are using spring-web in one of our projects that runs on Java 8, and Veracode has reported a very-high-severity flaw in the spring-web framework.
I know the updated 6.x.x line has been released and is free of this vulnerability, but upgrading to Spring 6 requires upgrading Java as well. Considering our project is very old and on a path to EOL by this September, upgrading it to Java 17 is not really a feasible option.
Are we expecting any release for 5.x.x version with this vulnerability fixed?
The problem highlighted by this CVE is that deserialization of Java serialized objects from an untrusted source is dangerous.
However, flagging all versions of Spring-Web 5.x.x as being vulnerable is misleading. Sure, the Spring-Web codebase has code that allows that to happen. But so do all versions of Java ... until they deprecate and remove Java object serialization entirely. (That is described as a "long term goal", but I doubt that will happen any time soon.)
The correct way to deal with this is not to "upgrade" ... but to audit your code carefully to look for places where you are using Java object deserialization. Then check each place to see whether you are (potentially) getting the data from an untrusted source, or from a source that could be compromised and thereby become risky.
If you are not using Java object serialization you are fine.
If you are not getting the serialized data from an untrusted (or risky) source you are (probably) fine ... modulo the accuracy of your assumptions and your analysis.
Otherwise you need to change your application to stop using Java object serialization / deserialization in a risky way. Do it another way; e.g. use JSON and a JSON binding technology.
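If part of the application must keep deserializing trusted Java-serialized data, one way to make that usage less risky is a JEP 290 serialization filter (available since Java 9, and backported to Java 8u121+). This is a minimal sketch, not a complete defense; the allow-list pattern is illustrative and would need to name your application's actual classes:

```java
import java.io.*;

public class FilteredDeserialization {
    // Deserialize with an explicit allow-list: only classes in java.lang
    // are accepted; the trailing "!*" pattern rejects everything else.
    public static Object readFiltered(byte[] bytes)
            throws IOException, ClassNotFoundException {
        try (ObjectInputStream in =
                     new ObjectInputStream(new ByteArrayInputStream(bytes))) {
            in.setObjectInputFilter(
                    ObjectInputFilter.Config.createFilter("java.lang.*;!*"));
            return in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(buf)) {
            out.writeObject(Integer.valueOf(42));
        }
        System.out.println(readFiltered(buf.toByteArray())); // prints 42
    }
}
```

A gadget-chain payload referencing any class outside the allow-list would be rejected with an InvalidClassException before its readObject logic ever runs.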
Note that upgrading to Spring-Web 6 or higher doesn't actually solve the problem anyway. Sure they removed the "vulnerable" classes, but the "ability" to use Java serialization unsafely exists in Java. Spring-Web cannot prevent it.
For more background, I recommend that you read issue 24434 on the Spring issue tracker.
In particular, read this comment from one of the developers which explains their stance on the issue.
I'm using Snyk service to check my projects for vulnerabilities.
Projects with OkHttp dependency have one common vulnerability:
Vulnerable module: org.jetbrains.kotlin:kotlin-stdlib
Introduced through: com.squareup.okhttp3:okhttp#4.10.0
You can check the full report here: https://snyk.io/test/github/yvasyliev/deezer-api
In Overview section there is a note:
Note: As of version 1.4.21, the vulnerable functions have been marked as deprecated. Due to still being useable, this advisory is kept as "unfixed".
I have two questions:
Can I fix this vulnerability in Maven project and how?
If the vulnerability cannot be fixed, does that mean that every single Kotlin application has this vulnerability by default (since it's coming from kotlin-stdlib)?
The latest stable version of OkHttp is added to project by Maven:
<dependency>
<groupId>com.squareup.okhttp3</groupId>
<artifactId>okhttp</artifactId>
<version>4.10.0</version>
</dependency>
As with all vulnerable software libraries, you need to assess whether or not you're actually affected by the vulnerability that is included.
Details on the vulnerability are listed in your Snyk report.
The problematic functions are createTempDir and createTempFile from the package kotlin.io. As outlined in your report as well as the Kotlin documentation, these functions are a possible source of information leaks, because the created file / directory has overly permissive permissions; that is, everyone with access to the file system can read the files.
Is this a problem?
If you (and any dependencies you're including in your software) are NOT using one of the aforementioned functions, you're not vulnerable.
Also, if you (or the dependency) adjust the file permissions after calling one of these functions and before writing any information, you're not affected.
Even if the functions are used and the permissions are not adjusted, that still might not pose a problem, as long as the data stored in the files does not need to be protected, e.g. is NOT secret or personal information.
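For comparison, here is a minimal sketch in plain Java of creating a temp file restricted to the owner, which is the behavior the deprecated kotlin.io helpers lack. It assumes a POSIX file system, and the prefix/suffix strings are arbitrary:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.attribute.FileAttribute;
import java.nio.file.attribute.PosixFilePermission;
import java.nio.file.attribute.PosixFilePermissions;
import java.util.Set;

public class SafeTempFiles {
    // Create a temp file readable and writable by the owner only.
    // On POSIX systems Files.createTempFile is already restrictive by
    // default; passing the attribute explicitly makes the intent obvious.
    public static Path createPrivateTempFile() throws IOException {
        Set<PosixFilePermission> ownerOnly =
                PosixFilePermissions.fromString("rw-------");
        FileAttribute<Set<PosixFilePermission>> attr =
                PosixFilePermissions.asFileAttribute(ownerOnly);
        return Files.createTempFile("app-", ".tmp", attr);
    }
}
```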
To address your questions directly:
Unfortunately, there is no easy way to fix this. You would either have to use a version of kotlin-stdlib where the functions had not yet been introduced, exclude kotlin-stdlib from your classpath entirely, or use a version where the functions are no longer included, which has not been released yet. However, options 1 and 2 do not make any sense: if the software keeps working without them, no one is using the functions and you're not affected anyway.
No and yes. Everyone relying on kotlin-stdlib in one of the affected versions has the functions on their classpath. However, as long as they are not used, or the usage does not pose a problem as explained above, the software is not vulnerable.
The OkHttp project seems to know of the vulnerability, but seems not to be affected.
I need to support two builds with some set of differences in libraries versions, so I made two build profiles and this works fine, but now I have a versioning problem while preparing a release.
I use the major.minor.revision-qualifier versioning schema, where:
major - a breaking change
minor - a backward-compatible changes (new features)
revision - a bug fix
qualifier - the only qualifier I use now is SNAPSHOT to mark unreleased versions.
But since I now have two builds, I need to add some qualifier to the release versions, e.g. 1.8.0-v1 and 1.8.0-v2, but then I won't be able to have two SNAPSHOT versions. Alternatively, I could break the "rules" about major/minor version usage and make two "branches", e.g. release 1.8.0 and 1.9.0, and then increase only the last number whether I'm fixing a bug or adding new features.
I have a feeling I am following an antipattern here; could anyone please give me some advice?
P.S. I already have a heavily reworked 2.x version, so I can't have separate "branches" as 2.x and 1.x versions, unless I change this new version to 3.0.
Update:
I guess I can't make this story short, so here we go.
In my project I used to have the ojdbc6 and aqapi jars (Oracle libraries); my app was working on Java 7 and Apache ServiceMix 5 with an Oracle 11 database. But then some clients updated to Oracle 12 and I needed new libraries for that, which only work on Java 8, while the ActiveMQ I am using as part of ServiceMix 5 doesn't work on Java 8. So I updated to ServiceMix 7, and after some changes it works fine. The rest of the differences between the build profiles are versions of ServiceMix-provided libraries (a complete list is redundant here, I guess).
In the end, despite the fact that the new JDBC driver is fully compatible with the old database (not completely sure about aqapi and the client side of ActiveMQ, but they should also be compatible), I can't force every client to update and reinstall Java/ServiceMix at the same time, but I still want to be able to fix/add stuff for all of them.
So I need to support two builds for different versions of ServiceMix, at least for now (it's a temporary solution, but as the proverb says: there is nothing more permanent than the temporary, so I want to do it in the most correct way possible).
P.S.
I decided to use profiles instead of a separate branch in the VCS because it looks like a much easier solution, but it doesn't matter in terms of the versioning problem.
So, as @Software Engineer said, after thinking about the reasons and writing the post update I realised it's not a multi-profile problem, it's purely a versioning problem; it would be absolutely the same if I made a branch in the VCS.
So in the end I decided to make 1.x.x and 2.x.x versions, despite the fact that the changes are not that "breaking"; they are not fully backward-compatible (even though the new version can work with the old database, it still needs the new ServiceMix).
This multi-profile workaround doesn't look pretty, but I left it there: it allows me to build both versions in one go (I use the mvn versions:set -DnewVersion command after the first build) and I don't need to maintain two branches this way, so it saves some time.
I read here that Spring and many other popular libraries will break if Oracle removes sun.misc.Unsafe in Java 9. However, there are no static references to this class in Spring or Hibernate. So, is that claim true?
BTW there are 64 references to Unsafe in Java 8, but if Oracle removes that class they will update all of them and no library will be affected (unless they use Unsafe directly that is).
Mark Reinhold had a talk during JVM Language Summit 2015 titled The Secret History and Tragic Fate of sun.misc.Unsafe. Although these talks have plenty of disclaimers on them, you can see the proposed approach at 10:23, which is described in JEP260.
The general idea is:
replace existing functionality with safer, supported APIs
deprecate the previously existing Unsafe APIs that have been replaced
remove the deprecated code in the next version
Here is some relevant text from JEP260 (taken from October 20th 2015):
In JDK 9 we propose to:
Encapsulate all non-critical internal APIs by default: The modules that define them will not export their packages for outside use. (Access to such APIs will be available, as a last resort, via a command-line flag at both compile time and run time, unless those APIs are revised or removed for other reasons.)
Encapsulate critical internal APIs for which supported replacements exist in JDK 8, in the same manner and with the same last-resort workaround. (A supported replacement is one that is either part of the Java SE 8 standard (i.e., in a java.* or javax.* package) or else JDK-specific and annotated with @jdk.Exported (typically in a com.sun.* or jdk.* package).)
Not encapsulate critical internal APIs for which supported replacements do not exist in JDK 8 and, further, deprecate those which have supported replacements in JDK 9 with the intent to encapsulate them, or possibly even remove them, in JDK 10.
...
Critical internal APIs for which replacements are introduced in JDK 9 will be deprecated in JDK 9 and either encapsulated or removed in JDK 10.
Maybe the references are not in the core of Spring or Hibernate, but somewhere else. The document linked says with regard to Spring
Spring Framework (via Objenesis, with a fallback)
I searched for usages of Unsafe in the project I am currently working on, and there are still quite a few libraries which may break.
Result of a quick search:
Guava
GWT
Netty
Jersey-Common
Infinispan
Jboss-Modules
This resource provides a proper understanding of the current status of JDK 9 and its features. The community started a discussion related to Unsafe and its role in the future of Java. The given document is the effort of the community to react to JEP 260, which proposes hiding some internal APIs but leaving some critical APIs accessible, among which Unsafe. As extracted from the document itself:
The critical internal APIs proposed to remain accessible in JDK 9 are:
sun.misc.Cleaner
sun.misc.{Signal,SignalHandler}
sun.misc.Unsafe (The functionality of many of the methods in this class is now available via variable handles (JEP 193).)
sun.reflect.Reflection::getCallerClass (The functionality of this method may be provided in a standard form via JEP 259.)
sun.reflect.ReflectionFactory
So to conclude, at least based on the given JEP, Unsafe should remain.
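To illustrate the variable-handles replacement (JEP 193) mentioned in the quote, here is a minimal sketch of a lock-free counter using a VarHandle where older code would have reached for Unsafe's compare-and-swap / get-and-add primitives. The Counter class is a made-up example, not code from any of the libraries above, and it requires Java 9+:

```java
import java.lang.invoke.MethodHandles;
import java.lang.invoke.VarHandle;

public class Counter {
    private volatile int value;

    // A VarHandle bound to the "value" field: the supported replacement
    // for Unsafe-based atomic field access.
    private static final VarHandle VALUE;
    static {
        try {
            VALUE = MethodHandles.lookup()
                    .findVarHandle(Counter.class, "value", int.class);
        } catch (ReflectiveOperationException e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    // Atomically add 1 and return the new value; getAndAdd returns
    // the previous value, hence the + 1.
    public int increment() {
        return (int) VALUE.getAndAdd(this, 1) + 1;
    }
}
```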
The answer is in the linked document. Spring does not have a direct dependency on Unsafe, but Spring depends on Objenesis and Objenesis depends on Unsafe.
Dependency for Objenesis on Unsafe: https://github.com/easymock/objenesis/blob/master/main/src/main/java/org/objenesis/instantiator/sun/UnsafeFactoryInstantiator.java
Spring's dependency on Objenesis is itself a bit strange. Spring's build script fetches the Objenesis binary and makes bytecode-level changes using the JarJar tool. You can see what it does in the following build script: https://github.com/spring-projects/spring-framework/blob/master/build.gradle (at time of writing, see lines 326-343, and 347).
This essentially means that Spring's "spring-core" binary ends up containing a load of classes under the org.springframework.objenesis.* package structure, but those classes were originally stored in source in Objenesis GitHub, published as a binary by the Objenesis team, fetched during Spring's build, repackaged to org.springframework.* packages and then republished as part of Spring. That's why you are having trouble finding them.
Spring uses Unsafe (via Objenesis) to create classes without first calling the constructor.
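To illustrate what that means, here is a minimal sketch (not Spring's or Objenesis's actual code) that obtains Unsafe via the well-known reflection hack and allocates an object without running its constructor; the Widget class is invented for the demonstration:

```java
import java.lang.reflect.Field;

import sun.misc.Unsafe;

public class AllocateDemo {
    static class Widget {
        boolean constructed = false;
        Widget() { constructed = true; }
    }

    // Obtain the Unsafe singleton via reflection (the classic hack;
    // sun.misc is opened by the jdk.unsupported module, so this still
    // works on current JDKs).
    static Unsafe unsafe() throws ReflectiveOperationException {
        Field f = Unsafe.class.getDeclaredField("theUnsafe");
        f.setAccessible(true);
        return (Unsafe) f.get(null);
    }

    // Instantiate Widget WITHOUT running its constructor: the field
    // stays at its default (false), which is roughly what Objenesis
    // relies on when creating proxy instances for Spring.
    static Widget allocateWithoutConstructor() throws Exception {
        return (Widget) unsafe().allocateInstance(Widget.class);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(new Widget().constructed);                 // true
        System.out.println(allocateWithoutConstructor().constructed); // false
    }
}
```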
I am developing/maintaining a Java library, and would like to keep track of backwards-incompatible changes between releases. This list could contain changes in class declarations, method signatures etc.
For example, if I (accidentally) changed a constructor by adding a parameter, then I would like to have it included in the list and be warned about the change.
// before
public MyCar(String name) { ... }
// after (some accidental change)
public MyCar(String name, long mileage) { ... }
// an application using my library depending on this constructor would be broken
// when it updates to the new version
Is there an automated way to generate this list? It feels like something that IntelliJ or Gradle should be able to do.
My team has tried reviewing pull requests and maintaining a CHANGELOG manually (which seems to be a common approach), but that's prone to human errors. I seek an automated way that can ideally be part of the build system.
I've always maintained the compatibility list manually but sometimes I forget something.
A quick look around shows several open source libraries but they haven't had new versions released in nearly 10 years. So I don't know if they would work with new Java 7 or 8 features.
Note: I've never used any of these!
CLIRR - Apache project used by some other Apache projects to show what has changed (example output from Apache commons-lang here). Last updated in 2005; doesn't even build with Maven 2 (or 3).
JDiff - Javadoc doclet comparator. Might support Java 5. Last updated in 2008.
Japitools - apparently was used by the GNU Classpath project to compare their APIs for signature compatibility with different versions of the Sun Java class libraries. Doesn't look like it's been updated since 2006.
There's a better way to do it.
Preserve backwards compatibility for a time by annotating your methods with @Deprecated, and indicate when they'll be unsupported. Then add the @deprecated piece to your Javadoc, and that will automatically generate a list of deprecated features that the end user needs to care about.
This has the added benefit of allowing you to record when a feature was introduced (@since) and when it will be removed, without having to fuss too much with a lot of other tools.
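A minimal sketch of that workflow, with invented class names and version numbers; the deprecated entry point delegates to its replacement so both keep working during the transition:

```java
public class CarFactory {
    public static class Car {
        final String name;
        final long mileage;
        Car(String name, long mileage) {
            this.name = name;
            this.mileage = mileage;
        }
    }

    /**
     * @deprecated since 1.8, scheduled for removal in 2.0;
     *             use {@link #create(String, long)} instead.
     */
    @Deprecated
    public static Car create(String name) {
        // Keep the old entry point alive by delegating to the new one.
        return create(name, 0L);
    }

    /** @since 1.8 */
    public static Car create(String name, long mileage) {
        return new Car(name, mileage);
    }
}
```

Callers of the old overload get a compiler deprecation warning, and the generated Javadoc's deprecated-API page lists the method with its replacement.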
Since you've added a more concrete code example, I'll add one more note: those sorts of changes are the result of a conscious design decision, and they bring to the forefront two issues:
Regression testing (as in, a test should have caught this)
Ease of transition into the newer API (as in, if I need to suddenly give a new parameter to this to gain functionality, isn't it a new thing rather than it being attached to the old, legacy thing?)
Those issues can't be teased away with any conventional tools; that requires an earnest conversation about the amount of time it takes to transition from one API to another. If you find that you need to introduce new functionality to the core, then you had better make darn certain that hasn't broken the legacy case.
This is what it means to have an API - you have to have the older version lurking around for a while.
I'm working on a Java library and would like to remove some functions from it. My reasons for this is public API and design cleanup. Some objects have setters, but should be immutable, some functionality has been implemented better/cleaner in different methods, etc.
I have marked these methods 'deprecated', and would like to remove them eventually. At the moment I'm thinking about removing these after few sprints (two week development cycles).
Are there any 'best practices' about removing redundant public code?
/JaanusSiim
Set a date and publicize it in the @deprecated tag. The amount of time given to the removal depends on the number of users your code has, how well connected you are with them, and the reason for the change.
If you have thousands of users and you barely talk to them, the time frame should probably be in the decades range :-)
If your users are your 10 coworkers and you see them daily, the time frame can easily be in the weeks range.
/**
 * @deprecated
 * This method will be removed after Halloween!
 * @see #newLocationForFunctionality
 */
Consider it this way: customer A downloads the latest version of your library or framework. He hits compile on his machine and suddenly sees thousands of errors because a member or function no longer exists. From this point on, you've given the customer a reason not to upgrade to your new version and to stay with the old one.
Raymond Chen answers this best in his blog about the Win32 API.
Our experience in our software house has been that once an API has been written, we have to carry it to the end of the product's life cycle. To help users move to new versions, we provide backwards compatibility with the old commands in the new framework.
It depends on how often the code is rebuilt. For example, if there are 4 applications using the library, and they are rebuilt daily, a month is long enough to fix the deprecated calls.
Also, if you use the deprecated tag, provide some comment on which code replaces the deprecated call.
Use #deprecated tag. Read the Deprecation of APIs document for more info.
After everyone using the code tells you they have cleaned up on their side, start removing the deprecated code and wait and see if someone complains - then tell them to fix their own code...
Given that this is a library, consider archiving a version with the deprecated functions. Make this version available in both source code and compiled form, as a backup solution for those who haven't modernized their code to your new API. (The binary form is needed, because even you may have trouble compiling the old version in a few years.) Make it clear that this version will not be supported and enhanced. Tag this version with a symbolic tag in your version control system. Then move forward.
It certainly depends at which scale your API is used and what you promised upfront to your customers.
As described by Vinko Vrsalovic, you should set a date by which they should expect the function to be removed.
In production, if it's "just" a matter of getting cleaner code, I tend to leave things in place even past the deprecating date as long as it doesn't break anything.
On the other hand in development I do it immediately, in order to get things sorted out quickly.
You may be interested in examples of how deprecation works in some other projects. For example, here follows what the policy in the Django project for function deprecation is:
A minor release may deprecate certain features from previous releases. If a feature in version A.B is deprecated, it will continue to work in version A.B+1. In version A.B+2, use of the feature will raise a PendingDeprecationWarning but will continue to work. Version A.B+3 will remove the feature entirely.
Too bad you are not using .NET :(
The built-in Obsolete attribute generates compiler warnings.