In what cases would legacy Java code not compile on newer versions - java

Java strives to be backward compatible (to such an extent that it crippled its generics for it).
But are there situations when old code would not compile on newer versions (more importantly Java 5, and the forthcoming Java 7)?

There seem to be quite a few of them actually - well, not all of them result in a compilation error, but this is the official word from Sun: http://java.sun.com/j2se/JM_White_Paper_R6A.pdf
These are the incompatibilities I typically check for:
Prior to 1.4, URLConnection.getInputStream threw a FileNotFoundException if the file type was known and the response code was greater than or equal to 400. Otherwise, no exception was thrown.
HttpURLConnection.getErrorStream can be used to read the error page returned from the server. Prior to 1.4, getErrorStream() always returned null.
New methods have been added to the DOM interfaces, so some existing applications will not be able to compile with the new interfaces.
ErrorHandler, EntityResolver, ContentHandler, and DTDHandler can now (as of SAX 2.0.1) be set to null by applications. SAX 2.0 required the XML processor to throw java.lang.NullPointerException in this case. (The JAXP parser implemented in 5.0, like most implementations, reacts to null by using the default settings.)
The resolveEntity method in DefaultHandler and in the EntityResolver interface now throws IOException as well as SAXException. Previously it threw only SAXException.
As of 5.0, XSLTC is the default transformer. XSLTC does not support all the extensions that Xalan does; those extensions are beyond the definition of the JAXP and XSLT specifications.
In 5.0, the org.apache classes moved to com.sun.org.apache.package.internal so that they won't clash with more recent, developer-downloaded versions of the classes.
The BigDecimal.toString() method changed its behavior between 1.4 and 5.0 (it can now return scientific notation; the old behavior moved to toPlainString()), causing some JDBC drivers to malfunction.
As of 5.0, comparing a java.sql.Timestamp to a java.util.Date by invoking compareTo on the Timestamp results in a ClassCastException (see the sketch after this list).
The java.net.Proxy class was added in 5.0, making two classes named Proxy: java.lang.reflect.Proxy and java.net.Proxy.
The following words were added to the Java language between 1.3 and 5.0, so they are no longer available for use as field or method identifiers: assert (added in 1.4) and enum (added in 5.0).
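For instance, the Timestamp issue can be reproduced with a few lines (a minimal sketch; the class name is mine, and the fix details are from memory):

import java.sql.Timestamp;
import java.util.Date;

public class TimestampVsDate {
    public static void main(String[] args) {
        Timestamp ts = new Timestamp(System.currentTimeMillis());
        Date d = new Date();
        // On a 5.0 JVM this threw ClassCastException at runtime;
        // later JDKs fixed Timestamp.compareTo(java.util.Date) to
        // handle a plain Date argument.
        System.out.println(ts.compareTo(d));
    }
}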

Yeah, for example when using enum as an identifier in older JDKs:
Enumeration enum = ...
would compile with JDKs prior to 1.5, but fails from 5.0 on because enum became a reserved word.

New versions might not "break" anything and still make your code fail to compile.
For example, in JDK 5 the method Thread.getId() was added, which returns long.
We actually had a class that subclassed Thread and had its own getId method that returned a String. This of course caused compilation problems, because all of a sudden we were attempting to override a method and change the type of its return value.
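A minimal sketch of that clash (the class is hypothetical):

public class Worker extends Thread {
    // Compiled fine before JDK 5. From 5.0 on, javac rejects it:
    // "getId() in Worker cannot override getId() in java.lang.Thread;
    //  attempting to use incompatible return type"
    public String getId() {
        return "worker-1";
    }
}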

At one point they took away getenv, but then the next version they put it back.
I once had a problem where a new library class name conflicted with the name of a class we had created. We used "import java.whateveritwas.*" so we dragged in the new class without even knowing it. I forget what the class name was, but it could happen to you with any new class, especially one with a fairly generic name like "List" or "Map" (see the sketch below).
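A minimal sketch of the effect, using the classic java.awt.List vs. java.util.List clash as a stand-in for whichever class it actually was:

import java.awt.*;   // has contained java.awt.List since 1.0
import java.util.*;  // java.util.List appeared in 1.2

public class Ambiguous {
    List items;  // error: reference to List is ambiguous
}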
Those are the only problems with new versions that I recall running into.

I once had a related issue with Class#getResource() - some code that compiled well under 1.4.2 and 1.5+ but didn't work on JVMs > 1.4.2.
And I remember some issues with third-party libraries (some versions of BEA WebLogic 8.1.4, if I remember right) that refused to cooperate in a Java 1.5 environment because some interface had been moved to a different package (it's long ago, correct me if the details are not accurate).

Methods and classes can be labeled deprecated, which produces a compile-time warning (not an error, unless you treat warnings as errors), and you can tell the compiler to ignore it. Other than the enum case above, such code would still compile.
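For example (a minimal sketch; the names are made up):

@Deprecated
class OldApi {
    void legacyCall() { }
}

class Caller {
    @SuppressWarnings("deprecation")  // tells the compiler to ignore it
    void run() {
        new OldApi().legacyCall();
    }
}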

The nastiest problems I've had recently [1] with migrating code were with Eclipse on OS X. The problem was with the Java 5 → 6 migration, and was due to the fact that on OS X the default build of Java 5 was 32-bit and the only build of Java 6 was 64-bit. This caused a lot of problems because SWT (which Eclipse is built on) uses native code.
The other thing I'm aware of is the tangle you can get into with the various libraries that support web services, but the fix I've usually found there is to upgrade to Java 6 and use the system libraries wherever possible. It's an area where Java 6 was massively better than 5.
[1] To be fair, this was a while ago and newer builds of Eclipse come with the required workarounds built in.

Related

Spring-Web Very High Vulnerability in 5.x.x Version

We are using spring-web in one of our projects, which uses Java 8, and Veracode has reported a very high severity flaw in the spring-web framework.
I know 6.x.x has been released and is free of this vulnerability, but updating to Spring 6 requires updating Java as well. Considering our project is very old and on the path to EOL by this September, upgrading it to Java 17 is not really a feasible option.
Are we expecting any release of the 5.x.x line with this vulnerability fixed?
The problem highlighted by this CVE is that deserialization of Java serialized objects from an untrusted source is dangerous.
However, flagging all versions of Spring-Web 5.x.x as being vulnerable is misleading. Sure, the Spring-Web codebase has code that allows that to happen. But so do all versions of Java ... until they deprecate and remove Java object serialization entirely. (That is described as a "long term goal", but I doubt that will happen any time soon.)
The correct way to deal with this is not to "upgrade" ... but to audit your code carefully to look for places where you are using Java object deserialization. Then check each place to see if you are (potentially) getting the data from an untrusted source, or a source that could be compromised, thereby rendering it risky.
If you are not using Java object serialization you are fine.
If you are not getting the serialized data from an untrusted (or risky) source you are (probably) fine ... modulo the accuracy of your assumptions and your analysis.
Otherwise you need to change your application to stop using Java object serialization / deserialization in a risky way. Do it another way; e.g. use JSON and a JSON binding technology.
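To make the audit concrete, this is the kind of pattern to look for (a minimal sketch; the method and stream source are hypothetical):

import java.io.InputStream;
import java.io.ObjectInputStream;

public class DeserializationAudit {
    static Object risky(InputStream untrusted) throws Exception {
        try (ObjectInputStream in = new ObjectInputStream(untrusted)) {
            // Dangerous if the bytes are attacker-controlled: deserialization
            // can instantiate unexpected classes and trigger gadget chains.
            return in.readObject();
        }
    }
}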
Note that upgrading to Spring-Web 6 or higher doesn't actually solve the problem anyway. Sure they removed the "vulnerable" classes, but the "ability" to use Java serialization unsafely exists in Java. Spring-Web cannot prevent it.
For more background, I recommend that you read issue 24434 on the Spring issue tracker.
In particular, read this comment from one of the developers which explains their stance on the issue.

Compatibility of a Java runtime retention annotation in previous Java versions

I want to use the @FunctionalInterface annotation from Java 8 in my code, but I want to be able to use the generated class files with Java 6. I think then that I should set the source version to 1.8, and the target version to 1.6.
I would be using @FunctionalInterface just for documentation, but I note that it has @Retention(RetentionPolicy.RUNTIME). If no one ever uses that annotation, will it cause problems?
If someone iterates over the annotations of my object at runtime, will it cause a missing-class exception? But if that is true, how is it that Google Guava can declare the JSR 305 annotation dependency with a Maven <scope> of provided, which means annotations such as javax.annotation.Nonnull are missing at runtime in Guava, too, without causing problems?
Let me ask it another way: if I use Google Guava in my project but don't include a JSR 305 dependency, do I really risk some error if I use reflection on the code? If so, what error will occur? If no error will occur, then analogously can I use the @FunctionalInterface annotation in source compiled with Java version 1.8 yet targeted to version 1.6 without any risk of runtime errors, even using reflection?
I think then that I should set the source version to 1.8, and the target version to 1.6.
Actually, it is not possible to compile Java source files of newer source versions for older JVM target versions. Oracle's and OpenJDK's javac will reject a compilation attempt where the -source version is higher than the -target version. (However, I couldn't find a specification denying it; even the manual doesn't mention it.) The sole idea of javac's cross-compiling feature is that you can compile your old (e.g. 1.6) Java files for the old 1.6 JVM even when you are using a newer JDK for compilation.
The issue you are describing is part of the reason for this. Since Java loads dependencies lazily, the compiler can't guarantee that there will be an appropriate class at runtime for every dependency. This also applies to the standard library.
However, there are (unofficial) tools that compile the newer source idioms or bytecode down to older bytecode versions. That doesn't cover the standard library, though: if you want to use newer classes, you have to provide them on your own. For this purpose, there exist some backports of specific parts of the standard library.
Specifically about your annotation question:
I was not able to find any reliable specification of what should/might happen if the JVM encounters an annotated construct for which it cannot retrieve the class file (I searched the Java Virtual Machine Specification SE 8). However, I found a somewhat related statement in the Java Language Specification SE 8:
An annotation is a marker which associates information with a program construct, but has no effect at run time.
From JLS 9.7
This statement rather indicates that an annotation (present or not) should not have an influence on the execution of a JVM. Therefore, an exception (such as NoClassDefFoundError) because of a missing annotation would rather contradict this.
Finally, through the answers to this question, I found even more specific statements:
An annotation that is present in the binary form may or may not be available at run time via the reflection libraries of the Java SE platform.
From JLS 9.6.4.2
And
Adding or removing annotations has no effect on the correct linkage of the binary representations of programs in the Java programming language.
From JLS 13.5.7
This quite clearly states that missing annotations will not cause an error, but instead will be just ignored if examined by reflection.
And if you deliver a class annotated with a Java 1.8 standard library annotation, and it is (somehow) executed on e.g. a Java 1.6 JVM where that annotation is just not present, then this specification denies that any error is generated.
This is also supported by the following test which I wrote: (notice the usage of reflection)
import java.lang.annotation.Annotation;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

@TestAnno
public class Test {
    public static void main(String[] args) {
        Annotation[] annos = Test.class.getAnnotations();
        for (Annotation a : annos) {
            System.out.println(a);
        }
    }
}

@Retention(RetentionPolicy.RUNTIME)
@interface TestAnno {
}
If compiled, this yields Test.class and TestAnno.class. When executed, the program outputs:
@TestAnno()
Because that is the one annotation applied to Test. Now, if TestAnno.class is removed without any modifications to Test.class (which refers to TestAnno via the LTestAnno; descriptor in the bytecode) and Test is executed again, it just does not output anything. So my JVM is indeed ignoring the missing annotation and does not generate any error or exception (tested with OpenJDK version 1.8.0_131 on Linux).
As with any class loading situation, if the class isn't needed (or rather, doesn't need to be loaded), it doesn't matter if the class doesn't exist at runtime. Runtime annotations normally have the same problem, since if they're retained at runtime, it usually means that there's logic based on them, meaning their classes are loaded too.
But @FunctionalInterface doesn't have runtime logic, so...
Why does @FunctionalInterface have RUNTIME retention? Apparently not for any particularly compelling reason, just a side effect of it also being a @Documented annotation.
So if you want to make sure there are no potential problems if someone (or some tool, more likely (I don't mean a "tool", like a co-worker)) decides to enumerate the annotations in your classes, I guess you'd need to remove the annotations in a pre-processing step.
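Incidentally, the RUNTIME retention is easy to observe for yourself (a small probe; on Java 8+ Runnable itself is annotated):

import java.lang.annotation.Annotation;

public class Probe {
    public static void main(String[] args) {
        // Prints the @FunctionalInterface annotation (the exact
        // format varies slightly between JDK versions).
        for (Annotation a : Runnable.class.getAnnotations()) {
            System.out.println(a);
        }
    }
}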

Removal of sun.misc.Unsafe in Java 9 will break Spring, Hibernate

I read here that Spring and many other popular libraries will break if Oracle removes sun.misc.Unsafe in Java 9. However, there are no static references to this class in Spring or Hibernate. So, is that claim true?
BTW there are 64 references to Unsafe in Java 8, but if Oracle removes that class they will update all of them, and no library will be affected (unless it uses Unsafe directly, that is).
Mark Reinhold gave a talk at the JVM Language Summit 2015 titled The Secret History and Tragic Fate of sun.misc.Unsafe. Although these talks have plenty of disclaimers on them, you can see the proposed approach at 10:23; it is described in JEP 260.
The general idea is:
replace existing functionality with safer, supported APIs
deprecate the previously existing Unsafe APIs that have been replaced
remove the deprecated code in the next version
Here is some relevant text from JEP 260 (taken from October 20th, 2015):
In JDK 9 we propose to:
Encapsulate all non-critical internal APIs by default: The modules that define them will not export their packages for outside use. (Access to such APIs will be available, as a last resort, via a command-line flag at both compile time and run time, unless those APIs are revised or removed for other reasons.)
Encapsulate critical internal APIs for which supported replacements exist in JDK 8, in the same manner and with the same last-resort workaround. (A supported replacement is one that is either part of the Java SE 8 standard (i.e., in a java.* or javax.* package) or else JDK-specific and annotated with @jdk.Exported (typically in a com.sun.* or jdk.* package).)
Not encapsulate critical internal APIs for which supported replacements do not exist in JDK 8 and, further, deprecate those which have supported replacements in JDK 9 with the intent to encapsulate them, or possibly even remove them, in JDK 10.
...
Critical internal APIs for which replacements are introduced in JDK 9 will be deprecated in JDK 9 and either encapsulated or removed in JDK 10.
Maybe the references are not in the core of Spring or Hibernate, but somewhere else. The linked document says, with regard to Spring:
Spring Framework (via Objenesis, with a fallback)
I tried to search for usages of Unsafe in the project I am currently working on; there are still quite a few libraries that may break.
The result of a quick search:
Guava
GWT
Netty
Jersey-Common
Infinispan
Jboss-Modules
This resource provides a proper understanding of the current status of JDK 9 and its features. The community started a discussion about Unsafe and its role in the future of Java. The linked document is the community's effort to react to JEP 260, which proposes hiding some internal APIs but leaving some critical APIs accessible, among which is Unsafe. As extracted from the document itself:
The critical internal APIs proposed to remain accessible in JDK 9 are:
sun.misc.Cleaner
sun.misc.{Signal,SignalHandler}
sun.misc.Unsafe (The functionality of many of the methods in this class is now available via variable handles (JEP 193).)
sun.reflect.Reflection::getCallerClass (The functionality of this method may be provided in a standard form via JEP 259.)
sun.reflect.ReflectionFactory
So to conclude, at least based on the given JEP, Unsafe should remain.
The answer is in the linked document. Spring does not have a direct dependency on Unsafe, but Spring depends on Objenesis and Objenesis depends on Unsafe.
Dependency for Objenesis on Unsafe: https://github.com/easymock/objenesis/blob/master/main/src/main/java/org/objenesis/instantiator/sun/UnsafeFactoryInstantiator.java
Spring's dependency on Objenesis is itself a bit strange. Spring's build script fetches the Objenesis binary and makes bytecode-level changes using the JarJar tool. You can see what it does in the following build script: https://github.com/spring-projects/spring-framework/blob/master/build.gradle (at time of writing, see lines 326-343, and 347).
This essentially means that Spring's "spring-core" binary ends up containing a load of classes under the org.springframework.objenesis.* package structure, but those classes were originally stored in source in the Objenesis GitHub repository, published as a binary by the Objenesis team, fetched during Spring's build, repackaged into org.springframework.* packages, and then republished as part of Spring. That's why you are having trouble finding them.
Spring uses Unsafe (via Objenesis) to create classes without first calling the constructor.
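A hedged sketch of that pattern (simplified; Objenesis itself probes several strategies and falls back when Unsafe is unavailable. This assumes a JDK 8 JVM, where sun.misc.Unsafe is still accessible):

import java.lang.reflect.Field;
import sun.misc.Unsafe;

public class NoConstructorDemo {
    static class Bean {
        Bean() { throw new AssertionError("constructor would have run"); }
    }

    public static void main(String[] args) throws Exception {
        Field f = Unsafe.class.getDeclaredField("theUnsafe");
        f.setAccessible(true);
        Unsafe unsafe = (Unsafe) f.get(null);
        // Allocates an instance WITHOUT running any constructor;
        // the AssertionError above is never thrown.
        Bean b = (Bean) unsafe.allocateInstance(Bean.class);
        System.out.println(b);
    }
}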

Error while compiling thrift generated classes with Java 1.5

Platform: Windows 7, MinGW, MSYS, Java 1.5
I have the Thrift 0.9.1 compiler (prebuilt for Windows) and its source. I use Ant to build the Java library.
I create one Thrift IDL and compile it with the compiler. No problem generating the code files.
I add these files to my project, and also add slf4j (downloaded from their site) and libthrift.
Most of the errors that I had previously (imports etc.) are gone, except for errors related to overriding methods.
So basically it complains like:
The method clear() of type Server must override a superclass method
and similarly for compareTo, write, read, etc. In short, it complains about all methods that are overridden. This is all Thrift-compiler-generated code and I haven't changed anything.
Is there any incompatibility? I cannot really find any mention of that. I have tried removing and then adding the libraries, I have also tried cleaning, refreshing, validating the project but the errors are still there.
I have also tried to compile the Thrift source itself, but MinGW is also a huge headache. It cannot find configure even though I have installed it. And if I run the MSYS console, it is able to configure but cannot make, complaining that inttypes.h is not present (it is not in the MSYS include directory but is present in the MinGW include directory).
Any suggestion would be appreciated.
Are you using Java 5? With Java 5, @Override doesn't search for methods on interfaces, only on superclasses.
If you are using a Java 5 compiler, try using a more recent javac (preferably 7 or 8) and see if that works.
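A minimal sketch of the difference (the interface and class names are hypothetical):

interface Resettable {
    void clear();
}

class Server implements Resettable {
    @Override  // rejected by javac -source 1.5, accepted from Java 6 on
    public void clear() { }
}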
EDIT:
Not sure if this is in your version of Thrift, but in mine it looks like there is a flag called java5 that you can specify when generating code, indicating that you want the generated code to be Java 5 compliant (see the example invocation after the option list):
java (Java):
beans: Members will be private, and setter methods will return void.
private-members: Members will be private, but setter methods will return 'this' like usual.
nocamel: Do not use CamelCase field accessors with beans.
fullcamel: Convert underscored_accessor_or_service_names to camelCase.
android: Generated structures are Parcelable.
android_legacy: Do not use java.io.IOException(throwable) (available for Android 2.3 and above).
java5: Generate Java 1.5 compliant code (includes android_legacy flag).
reuse-objects: Data objects will not be allocated, but existing instances will be used (read and write).
sorted_containers: Use TreeSet/TreeMap instead of HashSet/HashMap as the implementation of set/map.
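With that flag, the invocation would look something like this (assuming your Thrift version supports it):
thrift --gen java:java5 yourfile.thrift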

Were Java @Override annotations ever required/used for interface implementations?

I've downloaded an open-source Java project, JMapViewer.
http://svn.openstreetmap.org/applications/viewer/jmapviewer/
After importing it into Eclipse, there are numerous compiler errors, all regarding @Override annotations preceding methods which are being implemented from an interface. I completely understand this error, since I would only use the @Override annotation for a method which overrides a superclass method (an extension, not an implementation), which I believe is the only intended usage (and even then I don't think it's required).
This project has not been maintained for 4 months, but it does have a long history of revisions and community contributions. I cannot figure out why those @Override annotations are there if they prevent it from compiling, but in my inexperience I have to consider that those who put them there, the previous project contributors, had some good reason that is not clear to me. The project documentation says it was intended for use with JDK 1.5, so I've tried compiling it under 1.5, 1.6, and 1.7 alternately in Eclipse, and in each case the result is the same... the compiler is very unhappy with those annotations being where they are.
So... what am I missing?
The documentation you've seen is correct. @Override has been accepted on methods that implement an interface method since JDK 1.6.
Try compiling from the command line to make sure it's not Eclipse still using the 1.5 compiler.
It depends whether you are using Java 5 or Java 6. Support for @Override on interface-implementing methods was only added in Java 6.
