Removal of sun.misc.Unsafe in Java 9 will break Spring, Hibernate

I read here that Spring and many other popular libraries will break if Oracle removes sun.misc.Unsafe in Java 9. However, there are no static references to this class in Spring or Hibernate. So, is that claim true?
BTW, there are 64 references to Unsafe within JDK 8 itself, but if Oracle removes that class they will update all of them, and no library will be affected (unless it uses Unsafe directly, that is).

Mark Reinhold gave a talk at the JVM Language Summit 2015 titled The Secret History and Tragic Fate of sun.misc.Unsafe. Although these talks carry plenty of disclaimers, you can see the proposed approach at 10:23; it is described in JEP 260.
The general idea is:
replace existing functionality with safer, supported APIs
deprecate the previously existing Unsafe APIs that have been replaced
remove the deprecated code in the next version
Here is some relevant text from JEP 260 (taken on October 20th, 2015):
In JDK 9 we propose to:
Encapsulate all non-critical internal APIs by default: The modules that define them will not export their packages for outside use. (Access to such APIs will be available, as a last resort, via a command-line flag at both compile time and run time, unless those APIs are revised or removed for other reasons.)
Encapsulate critical internal APIs for which supported replacements exist in JDK 8, in the same manner and with the same last-resort workaround. (A supported replacement is one that is either part of the Java SE 8 standard (i.e., in a java.* or javax.* package) or else JDK-specific and annotated with @jdk.Exported (typically in a com.sun.* or jdk.* package).)
Not encapsulate critical internal APIs for which supported replacements do not exist in JDK 8 and, further, deprecate those which have supported replacements in JDK 9 with the intent to encapsulate them, or possibly even remove them, in JDK 10.
...
Critical internal APIs for which replacements are introduced in JDK 9 will be deprecated in JDK 9 and either encapsulated or removed in JDK 10.

Maybe the references are not in the core of Spring or Hibernate, but somewhere else. The linked document says, with regard to Spring:
Spring Framework (via Objenesis, with a fallback)
I searched for usages of Unsafe in the project I am currently working on, and there are still quite a few libraries that may break.
Results of a quick search:
Guava
GWT
Netty
Jersey-Common
Infinispan
Jboss-Modules

This resource provides a good overview of the current status of JDK 9 and its features. The community started a discussion about Unsafe and its place in the future of Java. The linked document is the community's effort to react to JEP 260, which proposes hiding some internal APIs but leaving some critical APIs accessible, among which Unsafe. As extracted from the document itself:
The critical internal APIs proposed to remain accessible in JDK 9 are:
sun.misc.Cleaner
sun.misc.{Signal,SignalHandler}
sun.misc.Unsafe (The functionality of many of the methods in this class is now available via variable handles (JEP 193).)
sun.reflect.Reflection::getCallerClass (The functionality of this method may be provided in a standard form via JEP 259.)
sun.reflect.ReflectionFactory
So to conclude, at least based on the given JEP, Unsafe should remain.
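To illustrate what "available via variable handles" means in practice, here is a minimal sketch (the Counter class is my own example, not from the JEP) of the JEP 193 replacement for a typical Unsafe use case, an atomic compare-and-set on a field:

import java.lang.invoke.MethodHandles;
import java.lang.invoke.VarHandle;

public class Counter {
    private volatile int count;

    // A VarHandle bound to 'count', obtained through a supported API
    // instead of raw field offsets from sun.misc.Unsafe.
    private static final VarHandle COUNT;
    static {
        try {
            COUNT = MethodHandles.lookup()
                    .findVarHandle(Counter.class, "count", int.class);
        } catch (ReflectiveOperationException e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    public boolean compareAndSetCount(int expected, int newValue) {
        // Same semantics as Unsafe.compareAndSwapInt, but type-checked and supported.
        return COUNT.compareAndSet(this, expected, newValue);
    }
}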

The answer is in the linked document. Spring does not have a direct dependency on Unsafe, but Spring depends on Objenesis and Objenesis depends on Unsafe.
Dependency for Objenesis on Unsafe: https://github.com/easymock/objenesis/blob/master/main/src/main/java/org/objenesis/instantiator/sun/UnsafeFactoryInstantiator.java
Spring's dependency on Objenesis is itself a bit unusual. Spring's build script fetches the Objenesis binary and makes bytecode-level changes to it using the JarJar tool. You can see what it does in the following build script: https://github.com/spring-projects/spring-framework/blob/master/build.gradle (at the time of writing, see lines 326-343 and 347).
This essentially means that Spring's "spring-core" binary ends up containing a load of classes under the org.springframework.objenesis.* package structure, but those classes were originally stored in source in Objenesis GitHub, published as a binary by the Objenesis team, fetched during Spring's build, repackaged to org.springframework.* packages and then republished as part of Spring. That's why you are having trouble finding them.
Spring uses Unsafe (via Objenesis) to create instances of classes without first calling their constructor.
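For the curious, here is a minimal sketch of that technique (the Widget demo class is my own; the reflective grab of the theUnsafe field is how libraries such as Objenesis typically obtain the instance, since Unsafe.getUnsafe() rejects untrusted callers):

import java.lang.reflect.Field;
import sun.misc.Unsafe;

public class AllocateDemo {
    static class Widget {
        final String label;
        Widget(String label) {
            this.label = label;
            System.out.println("constructor ran");
        }
    }

    public static void main(String[] args) throws Exception {
        // Obtain the Unsafe singleton reflectively.
        Field f = Unsafe.class.getDeclaredField("theUnsafe");
        f.setAccessible(true);
        Unsafe unsafe = (Unsafe) f.get(null);

        // allocateInstance skips the constructor entirely: nothing is printed,
        // and the final field is left at its default value.
        Widget w = (Widget) unsafe.allocateInstance(Widget.class);
        System.out.println(w.label); // null
    }
}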

Related

Spring-Web Very High Vulnerability in 5.x.x Version

We are using spring-web in one of our projects, which uses Java 8, and Veracode has reported a very high severity flaw in the spring-web framework.
I know that 6.x.x has been released and is free of this vulnerability, but updating to Spring 6 requires updating Java as well. Considering that our project is very old and on a path to EOL by this September, upgrading it to Java 17 is not really a feasible option.
Are we expecting any release of the 5.x.x line with this vulnerability fixed?
The problem highlighted by this CVE is that deserialization of Java serialized objects from an untrusted source is dangerous.
However, flagging all versions of Spring-Web 5.x.x as being vulnerable is misleading. Sure, the Spring-Web codebase has code that allows that to happen. But so do all versions of Java ... until they deprecate and remove Java object serialization entirely. (That is described as a "long term goal", but I doubt that will happen any time soon.)
The correct way to deal with this is not to "upgrade" ... but to audit your code carefully, looking for places where you use Java object deserialization. Then check each place to see whether you are (potentially) getting the data from an untrusted source, or from a source that could be compromised and thereby rendered risky.
If you are not using Java object serialization you are fine.
If you are not getting the serialized data from an untrusted (or risky) source you are (probably) fine ... modulo the accuracy of your assumptions and your analysis.
Otherwise you need to change your application to stop using Java object serialization / deserialization in a risky way. Do it another way; e.g. use JSON and a JSON binding technology.
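For example, here is a minimal sketch using Jackson's data binding (assuming jackson-databind is on the classpath; the Order type is made up for illustration):

import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonInsteadOfSerialization {
    // A simple payload type; Jackson binds its public fields by default.
    public static class Order {
        public String id;
        public int quantity;
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        Order order = new Order();
        order.id = "A-42";
        order.quantity = 3;

        // Serialize to JSON text instead of Java's native object serialization.
        String json = mapper.writeValueAsString(order);

        // Deserializing JSON only populates declared fields; it cannot trigger
        // the readObject() gadget chains that ObjectInputStream can.
        Order roundTripped = mapper.readValue(json, Order.class);
        System.out.println(roundTripped.id + " x " + roundTripped.quantity);
    }
}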
Note that upgrading to Spring-Web 6 or higher doesn't actually solve the problem anyway. Sure, they removed the "vulnerable" classes, but the "ability" to use Java serialization unsafely still exists in Java itself. Spring-Web cannot prevent it.
For more background, I recommend that you read issue 24434 on the Spring issue tracker.
In particular, read this comment from one of the developers which explains their stance on the issue.

Is Eclipse's ecj compiler extensible?

I am interested in modifying Java syntax and some implicit paradigms. Since I develop with Eclipse, which provides its own compiler that can also be used standalone, I was wondering whether it is possible to extend ecj to respect additional grammar rules (and handle them correctly).
My syntactical changes are all resolvable by removing elements from the AST and creating some new ones, so I assume that what I want to do is possible without diving into bytecode.
Essentially, what I want to do could be done by 'virtually' modifying the source code before the actual compilation. However, I suspect that doing so would mess up the source mapping, which would make debugging hell.
On a sidenote: I am aware of project Lombok, which extends and alters class compilation, however Lombok uses annotations only, and does not modify syntax, strictly speaking. So what I want to do is more invasive to the language specs.
As Object Teams has been mentioned in comments:
(1) Object Teams itself extends JDT for its own language OT/J which is an extension of Java. This is done in a dual strategy:
We maintain a fork of org.eclipse.jdt.core. While this is quite heavy lifting it successfully demonstrates that the JDT architecture is suitable for modification.
We use our own concept of role objects to non-invasively adapt the behavior of other parts of the IDE (notably org.eclipse.jdt.ui) to reflect the semantics of OT/J.
(2) I have a few (oldish) blog posts that demonstrate how OT/J can be used for creating non-invasive variants of JDT including support for extended syntax:
IDE for your own language embedded in Java? (part 1)
IDE for your own language embedded in Java? (part 2)
Get for free what Coin doesn’t buy you
Disclaimer: I am the author of OT/J and lead of its implementation, and later became a committer on Eclipse JDT.
For further questions, there's a forum.

Java package naming for versioning external APIs

Are there any conventions of how to name Java packages containing classes that consume an external, versioned API?
Let's assume we have a major-minor semantic versioning scheme of a service and we have to implement a consumer that is compatible with and bound to a specific version of that API. What are the best practices to name the packages (and classes)?
Currently, we're using the scheme ${service}_${M}_${N} (with M = major version, N = minor version). For example: com.example.theService_1_0.
But SonarQube complains that this does not match naming conventions.
Of course I can just disable this rule, but I wonder if there are any best-practices?
I'm looking for a general approach, not something specific to REST, because I've encountered consumer implementations for WebServices, REST and CORBA. And I'm not sure artifact versioning (as in Maven) works well here, because it relates to the version of the implementation, not of the API.
I know there are questions around api versioning, but those are about the producer, not the consumer.
Yes, Java package names have a strong and unambiguous convention for indicating dependencies’ versions: don’t.
If you change your application to use a new version of an external API, you create a new version of your application. The version number you are looking for is your application’s version number. On many Java projects, dependencies’ versions are managed by a Maven configuration file.
In any case, when classes use a particular API version, the class and package names have no business exposing this information, which would violate encapsulation, apart from anything else. These names have a different purpose.
Note that this is no different when you use HTTP/REST APIs or Java APIs. After all, I don’t think you’d name your class TheServiceWithLog4J_12_1_5. I hope not, at least.
You didn’t mention whether you have a requirement to support multiple versions of this external API at the same time. Even if you did, I still wouldn’t recommend exposing the API version number in the package name. Instead, use the package name to indicate why you have two versions, and the important difference.
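For instance (all names here are hypothetical), if you must keep a consumer for an old authentication scheme alongside the current one, let the package name carry that reason rather than an API version number:

// The package name records *why* this variant exists, not which version of
// the external API it binds to; the current consumer would live in
// com.example.theservice.consumer.
package com.example.theservice.consumer.legacyauth;

// Kept only while the old authentication scheme is still deployed.
public class TheServiceClient {
    // ...
}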

Does adding a new dependency to a library, with compatible API changes, affect binary compatibility?

My question:
Does adding a new dependency to a library affect binary compatibility, as long as the library's external API is otherwise backwards compatible?
My situation:
My CBOR library contains classes for arbitrary-precision arithmetic (in the PeterO namespace). (It's in C# and Java; the Java version is in a separate repository, but the same issue applies to both versions.)
I have moved these classes to a new namespace (in PeterO.Numbers), and renamed them (retaining the original classes for backward compatibility), because the namespace where they are now is meant to contain only utility classes. I plan to move the new classes to a separate library and make the CBOR library call that library as a dependency, since the arbitrary-precision classes are obviously useful outside of CBOR. (I plan to eventually deprecate the old classes.)
But I'm concerned that making a separate library this way is a binary compatibility issue, such that I can't just bump the minor version but would have to bump the major version. The CBOR library is at version 2.3.1 at the time of this writing. Can I do this and change the version to 2.4, or only to 3.0?
As long as you started out with an interface to begin with, and all your library's clients are aware of that interface, you will be fine. It does not matter whether your code resides in your library or in a library outside of it, as long as your library has an interface which its clients understand and which it implements.
This is an age old problem and has been solved 15 years ago by COM (component object model). Keep your interfaces separate from implementation and you are golden.
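A minimal sketch of that idea, with hypothetical names: clients compile against the interface alone, so the implementing class can later move into a separate library without breaking them at the binary level:

// Published contract: clients compile and link against this interface only.
// Where the implementing class lives (this library or a new one) is invisible
// to them, as long as the interface itself stays put.
public interface ArbitraryPrecisionInteger {
    ArbitraryPrecisionInteger add(ArbitraryPrecisionInteger other);
    String toDecimalString();
}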
I'll answer for the Java version. This section of the Java Language Specification describes in detail the changes that can be done to applications while preserving binary compatibility.
From what I understand, your changes (although they may affect a great portion of the source) are simple refactorings that move some utility classes to another module and redirect the old classes to call this new module. This is described in the section on Evolution of Packages:
A new top level class or interface type may be added to a package without breaking compatibility with pre-existing binaries, provided the new type does not reuse a name previously given to an unrelated type.
So this does not break binary compatibility with existing classes that use your library. Any existing class CBORClient that used to call CBORUtil.doArithmetic() will continue to work without the need to re-compile it, since the method is still there, only its body has changed to call another module to do the computation.
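A sketch of what that refactoring looks like (the names are hypothetical, loosely following the question): the old method keeps its exact signature, so binaries compiled against 2.3.1 still link, and only its body changes to delegate to the relocated class:

package com.example.cbor;

// Old utility class, retained for binary compatibility with existing 2.x clients.
public final class CBORUtil {

    private CBORUtil() {}

    // The signature is unchanged, so already-compiled callers still link;
    // only the body now forwards to the new arbitrary-precision library.
    @Deprecated // to be removed in the next major version
    public static int doArithmetic(int a, int b) {
        return com.example.numbers.ArithmeticOps.add(a, b); // hypothetical new module
    }
}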
It's better to avoid adding a new dependency until the next major version. Until then, make the changes internally, and create your new arbitrary-precision library with the same classes, keeping the two in sync without introducing the dependency.
So for version 2.4, add the changes in a new namespace and call them from the old classes, and create a separate class library for the arbitrary-precision classes, keeping them synchronized until the next major version of the CBOR library.

Is there an automatic way to generate a list of interface changes?

I am developing/maintaining a Java library, and would like to keep track of backwards-incompatible changes between releases. This list could contain changes in class declarations, method signatures etc.
For example, if I (accidentally) changed a constructor by adding a parameter, then I would like to have it included in the list and be warned about the change.
// before
public MyCar(String name) { ... }
// after (some accidental change)
public MyCar(String name, long mileage) { ... }
// an application using my library depending on this constructor would be broken
// when it updates to the new version
Is there an automated way to generate this list? It feels like something that IntelliJ or Gradle should be able to do.
My team has tried reviewing pull requests and maintaining a CHANGELOG manually (which seems to be a common approach), but that's prone to human errors. I seek an automated way that can ideally be part of the build system.
I've always maintained the compatibility list manually but sometimes I forget something.
A quick look around shows several open source libraries, but they haven't had new releases in nearly 10 years, so I don't know whether they would work with newer Java 7 or 8 features.
Note: I've never used any of these!
CLIRR - Apache project used by some other Apache projects to show what has changed (example output from Apache commons-lang here). Last updated in 2005; doesn't even build with Maven 2 (or 3).
JDiff - a Javadoc doclet comparator. Might support Java 5. Last updated in 2008.
Japitools - apparently was used by the GNU Classpath project to compare their APIs for signature compatibility with different versions of the Sun Java class libraries. Doesn't look like it's been updated since 2006
There's a better way to do it.
Preserve backwards compatibility for a time by annotating your methods with @Deprecated, and indicate when they'll be unsupported. Then add the @deprecated tag to your Javadoc, and that will automatically generate a list of deprecated features that the end user needs to care about.
This has the added benefit of letting you document when a feature was introduced (@since) and when a feature will be removed, without having to fuss too much with a lot of other tools.
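Applied to the constructor from the question, that looks something like this (the version numbers are made up):

public class MyCar {
    private final String name;
    private final long mileage;

    /**
     * @deprecated since 2.1, use {@link #MyCar(String, long)} instead;
     *             scheduled for removal in 3.0.
     */
    @Deprecated
    public MyCar(String name) {
        this(name, 0L);
    }

    /**
     * @param mileage the car's mileage at construction time
     * @since 2.1
     */
    public MyCar(String name, long mileage) {
        this.name = name;
        this.mileage = mileage;
    }
}

The javadoc tool then collects every @deprecated tag into the generated "Deprecated API" page, which serves as the list of changes the question asks for.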
Since you've added a more concrete code example, I'll add one more note: those sorts of changes...are the result of a conscious design decision, and they bring to the forefront two issues:
Regression testing (as in, a test should have caught this)
Ease of transition into the newer API (as in, if I need to suddenly give a new parameter to this to gain functionality, isn't it a new thing rather than it being attached to the old, legacy thing?)
Those issues can't be teased away with any conventional tools; they require an earnest conversation about the amount of time it takes to transition from one API to another. If you find that you need to introduce new functionality to the core, then you had better make darn certain it hasn't broken the legacy case.
This is what it means to have an API - you have to have the older version lurking around for a while.
