Were Java @Override annotations ever required/used for interface implementations? - java

I've downloaded an open-source Java project, JMapViewer.
http://svn.openstreetmap.org/applications/viewer/jmapviewer/
After importing it into Eclipse, there are numerous compiler errors, all regarding @Override annotations preceding methods which are being implemented from an interface. I completely understand this error, since I would only use the @Override annotation on a method which overrides a superclass method (an extension, not an implementation), which I believe is the only intended usage (and even then I don't think it's required).
This project has not been maintained for 4 months, but it does have a long history of revisions and community contributions. I cannot figure out why those @Override annotations are there if they prevent it from compiling, but in my inexperience I have to consider that those who put them there, the previous project contributors, had some good reason that is not clear to me. The project documentation says it was intended for use with JDK 1.5, so I've tried compiling it under 1.5, 1.6, and 1.7 alternately in Eclipse, and in each case the result is the same: the compiler is very unhappy with those annotations being where they are.
So... what am I missing?

The documentation you've seen is correct. @Override has been accepted by the compiler on methods that implement an interface method since JDK 1.6.
Try compiling from the command line to make sure it's not Eclipse still using the 1.5 compiler.
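For illustration, a minimal sketch (class names are made up) that a 1.6-level compiler accepts but a 1.5-level compiler rejects:
// Hypothetical example: @Override on an interface implementation.
// Accepted under -source 1.6 and later; rejected by a 1.5-level compiler.
interface Painter {
    void paint();
}

class CirclePainter implements Painter {
    @Override  // no superclass method here, only the interface method
    public void paint() {
        System.out.println("painting a circle");
    }
}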

It depends on whether you are using Java 5 or Java 6. @Override on methods implemented from interfaces was only allowed starting with Java 6.

Related

Compatibility of a Java runtime retention annotation in previous Java versions

I want to use the @FunctionalInterface annotation from Java 8 in my code, but I want to be able to use the generated class files with Java 6. I think then that I should set the source version to 1.8, and the target version to 1.6.
I would be using @FunctionalInterface just for documentation, but I note that it has @Retention(RetentionPolicy.RUNTIME). If no one ever uses that annotation, will it cause problems?
If someone iterates over the annotations of my object at runtime, will it cause a missing-class exception? But if that is true, how is it that Google Guava can declare the JSR 305 annotation dependency with a Maven <scope> of provided, which means annotations such as javax.annotation.Nonnull are missing at runtime in Guava too, without causing problems?
Let me ask it another way: if I use Google Guava in my project but don't include a JSR 305 dependency, do I really risk some error if I use reflection on the code? If so, what error will occur? If no error will occur, then, analogously, can I use the @FunctionalInterface annotation in source compiled with Java version 1.8 yet targeted to version 1.6 without any risk of runtime errors, even when using reflection?
I think then that I should set the source version to 1.8, and the target version to 1.6.
Actually, it is not possible to compile Java source files of a newer source version for an older JVM target version. Oracle's and OpenJDK's javac will reject a compilation attempt where the -source version is higher than the -target version. (However, I couldn't find a specification that forbids this; even the manual doesn't mention it.) The whole idea of javac's cross-compiling feature is that you can still compile your old (e.g. 1.6) Java files for the old 1.6 JVM even when you are using a newer JDK for compilation.
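For example, with Oracle's/OpenJDK's javac an invocation like the following is rejected (the exact message wording varies by JDK version; the file name is hypothetical):
javac -source 1.8 -target 1.6 Test.java
javac: source release 1.8 requires target release 1.8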
The issue you are describing is exactly the sort of reason for this. Since Java uses a kind of lazy dependency loading, the compiler can't guarantee that an appropriate class will be available at runtime for every dependency. This also applies to the standard library.
However, there are (unofficial) tools that compile newer source idioms or byte code down to older byte code versions. That doesn't cover the standard library, though: if you want to use newer classes, you have to provide them yourself. For this purpose, there are backports of specific parts of the standard library.
Specifically about your annotation question:
I was not able to find any reliable specification of what should or might happen if the JVM encounters an annotated construct for which it could not retrieve the annotation's class file (I searched the Java Virtual Machine Specification SE 8). However, I found a somewhat related statement in the Java Language Specification SE 8:
An annotation is a marker which associates information with a program construct, but has no effect at run time.
From JLS 9.7
This statement rather indicates that an annotation (present or not) should not have any influence on the execution of a JVM. Therefore, an exception (such as a NoClassDefFoundError) caused by a missing annotation would run counter to this.
Finally, through the answers to this question, I found even more specific statements:
An annotation that is present in the binary form may or may not be available at run time via the reflection libraries of the Java SE platform.
From JLS 9.6.4.2
And
Adding or removing annotations has no effect on the correct linkage of the binary representations of programs in the Java programming language.
From JLS 13.5.7
This quite clearly states that missing annotations will not cause an error, but instead will be just ignored if examined by reflection.
And if you deliver a class annotated with a Java 1.8 standard library annotation, and it is (somehow) executed on, e.g., a Java 1.6 JVM where that annotation is simply not present, then these specifications deny that any error is generated.
This is also supported by the following test which I wrote: (notice the usage of reflection)
import java.lang.annotation.Annotation;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

@TestAnno
public class Test {
    public static void main(String[] args) {
        // Enumerate the runtime-visible annotations on Test via reflection.
        Annotation[] annos = Test.class.getAnnotations();
        for (Annotation a : annos) {
            System.out.println(a);
        }
    }
}

@Retention(RetentionPolicy.RUNTIME)
@interface TestAnno {
}
If compiled, it yields a Test.class and a TestAnno.class. When executed the program outputs:
@TestAnno()
Because that is the one annotation applied to Test. Now, if TestAnno.class is removed without any modification to Test.class (which refers to TestAnno via an LTestAnno; descriptor in the byte code) and Test is executed again, it simply does not output anything. So my JVM is indeed ignoring the missing annotation and does not generate any error or exception (tested with OpenJDK 1.8.0_131 on Linux).
As with any class loading situation, if the class isn't needed (or rather, doesn't need to be loaded), it doesn't matter if the class doesn't exist at runtime. Runtime annotations normally have the same problem, since if they're retained at runtime, it usually means that there's logic based on them, meaning their classes are loaded too.
But @FunctionalInterface doesn't have runtime logic, so...
Why does @FunctionalInterface have RUNTIME retention? Apparently not for any particularly compelling reason; it is just a side effect of it also being a @Documented annotation.
So if you want to make sure there are no potential problems if someone (or, more likely, some tool; and I don't mean a "tool" like a co-worker) decides to enumerate the annotations in your classes, I guess you'd need to strip the annotations in a pre-processing step.
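For reference, the JDK's declaration of @FunctionalInterface looks roughly like this (a sketch of java.lang.FunctionalInterface with the Javadoc omitted):
package java.lang;

import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Marker annotation for functional interfaces; note the RUNTIME retention.
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.TYPE)
public @interface FunctionalInterface {}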

Providing Dummy-Annotation for older Java compilers

I'm working on a Java library that I would like to be able to use across a couple of different Java compiler versions. Some annotations (specifically @SafeVarargs) only exist on some of these compiler versions and generate errors on others.
Especially for something like @SafeVarargs, which serves mostly as a marker to suppress warnings rather than actually changing the output of the compiler, I would like to be able to use these annotations and simply provide a dummy implementation if an earlier compiler is missing them.
How would I go about doing this?
I guess you could just create surrogate implementations of those annotations and put them in a jar that is added to the classpath, making sure that the system/compiler-provided ones take priority when resolved by the corresponding class loader.
For example, you can just copy the code of SafeVarargs from here.
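A sketch of such a surrogate, mirroring the JDK 7 declaration; whether the compiler actually picks it up rather than the platform one depends on your bootclasspath setup, and JVMs will normally refuse to load user-defined java.* classes at runtime, so treat it as a compile-time aid only:
package java.lang;

import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Compile-time surrogate for java.lang.SafeVarargs (JDK 7+), for older compilers.
@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.CONSTRUCTOR, ElementType.METHOD})
public @interface SafeVarargs {}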

How to avoid deprecation warnings when @SuppressWarnings("deprecation") doesn't work?

We have a Java project. We enable the -Xlint (enable warnings) and -Werror (treat warnings as errors) flags for javac to make sure our code is warning-free. Recently we decided to deprecate a class. The problem is that in some cases @SuppressWarnings("deprecation") will not suppress the deprecation warning at all, resulting in a build failure. Below is a list of the cases I ran into:
1. Imported in other non-deprecated classes.
2. Imported in other deprecated classes.
3. Parent class.
4. Type parameter. For example:
@SuppressWarnings("deprecation")
public class Foo extends Bar<DeprecatedClass>
{ ... }
However, this one produces no warning even without suppression:
@Deprecated
public class DeprecatedClass extends Bar<DeprecatedClass>
{ ... }
AFAIK, there is no syntax for annotating imports, so for cases 1 and 2 our solution is to either import * or avoid importing and use fully qualified names. For cases 3 and 4, neither Java 6 nor Java 7 suppresses the warning; Java 8 suppresses it correctly (presumably a bug was fixed). So far we have no solution for those.
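A minimal sketch of the avoid-the-import variant (class and package names are hypothetical):
// No import of the deprecated type; refer to it by its fully qualified name
// and suppress at the use site.
public class Client {
    @SuppressWarnings("deprecation")
    void useOldApi() {
        com.example.OldService s = new com.example.OldService();
        s.doWork();
    }
}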
Unfortunately, we have to support Java 6, 7 and 8 at this point. Is there a way to deal with the problem? It is a roadblock for our Java API evolution.
ADDENDUM
Many people ask why we still use the deprecated class in our own codebase. The reason is that the project is a library supporting many different clients. When introducing a new replacement API, we have to first deprecate our old API, keep it in our codebase, wait for all clients to migrate, and then remove it. There are three common use cases:
We deprecate classes Foo and Bar, where Foo extends Bar. These are cases 2 and 3 in my question.
We deprecate classes Foo and Bar, where Foo extends Collection<Bar>. These are cases 2 and 4.
We must keep all test code for classes Foo and Bar. The test code imports these classes. This is case 1.
Why keep the tests? Don't forget that if a serious bug (e.g. a memory leak or security issue) is discovered and the clients can't easily migrate to the new version, we still need to provide bug fixes for the old API. And all changes must be tested.
I feel our situation should be fairly common in software library development and API evolution. Surprisingly, it took Java such a long time (until Java 8) to fix the bug.
I'm sorry to say that I don't have a solution to the problem you're facing, though as you've observed, there has been some progress. We've been trying to get rid of all the Java compilation warnings in the JDK itself, and this has been a long, difficult process. During JDK 8 development in 2011 I helped kick off the warnings cleanup effort and I later co-presented a JavaOne talk (slides and audio) on the subject.
More recently, my colleague Joe Darcy has continued the warnings cleanup work, has worked through the different warnings categories, and has finally reached deprecation warnings. As you noted, there have been some bugs in the compiler's handling of suppression of deprecation warnings, such as JDK-6480588, which was fixed in JDK 8. Unfortunately, it is still not possible in JDK 8 to suppress warnings on imports of deprecated items. That bug, JDK-8032211, was fixed quite recently in our JDK 9 development line. In fact, we're still tuning up the handling of the @Deprecated annotation. For example, bug JDK-6481080 clarifies that attempting to use @Deprecated in a package-info.java file does not in fact deprecate the package; this bug was fixed just last week. And there is more work to be done, but it's somewhat speculative at this point.
The JDK is facing similar problems to yours, in that we have to maintain deprecated APIs for clients that are still using them. But since we use and implement such APIs internally, we have a lot of deprecation warnings to suppress. As of this writing, in our JDK 9 development line, we still have not been able to compile the system without deprecation warnings. As a result, the javac options for lint warnings are still:
-Xlint:all,-deprecation
You will probably have to disable deprecation warnings in your compilation as well, especially if you are still building on JDK 6. I don't see a way around it at this point.
One final note on one of your deprecation cases:
@Deprecated
public class DeprecatedClass extends Bar<DeprecatedClass> { ... }
This does not issue a deprecation warning, nor should it. The Java Language Specification, section 9.6.4.6, specifies that deprecation warnings are not issued if the use of a deprecated entity is within an entity that is itself deprecated.
Consider using -Xmaxwarns, which lets you control how many warnings javac reports.
Or collect the number of warnings and fail the integration process rather than the compilation.
For example: https://issues.apache.org/jira/browse/HADOOP-11252.
Every code commit to the Hadoop project needs to pass the automated CI, and it gives a -1 for an increased number of warnings.
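A rough sketch of such an invocation (paths are made up): drop -Werror, cap the printed warnings with -Xmaxwarns, and let CI count and gate them instead, as in the Hadoop setup linked above.
javac -Xlint:all -Xmaxwarns 100 -d build/classes src/main/java/com/example/*.java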
Normally, when you deprecate a class, you don't want anybody to use it in later versions. Also, YOUR codebase should stop using the deprecated class too. It looks strange when you tell everybody not to use the MySuperDeprecatedUtil class but continue using it in your own codebase.
If you need to use MySuperDeprecatedUtil in some other class, you should mark the class where you use it as @Deprecated as well: every class that uses deprecated code should either be deprecated itself, produce compilation warnings, be removed, or stop using the deprecated code.
If you can't stop using the class, maybe it's too early to deprecate it?
In my practice, when I want to deprecate a class, I create a replacement class, e.g. MySuperFreshUtil, and switch all classes using MySuperDeprecatedUtil over to MySuperFreshUtil, preserving interfaces where possible (if not possible, use the FQCN and mark the method as deprecated). I mark MySuperDeprecatedUtil as @Deprecated and add a comment saying which class should be used instead and how. Then I commit these changes in a single changelist.
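A minimal sketch of that pattern, reusing the hypothetical names from above:
/**
 * @deprecated Use {@link MySuperFreshUtil} instead; this class will be removed
 *             once all clients have migrated.
 */
@Deprecated
public class MySuperDeprecatedUtil {
    // Existing methods are kept for compatibility and, where possible,
    // delegate to MySuperFreshUtil.
}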

Error while compiling thrift generated classes with Java 1.5

Platform: Windows 7, MinGW, MSYS, Java 1.5
I have the Thrift 0.9.1 compiler (prebuilt for Windows) and its source. I use Ant to build the Java library.
I create one Thrift IDL and compile it with the compiler. No problem generating the code files.
I add these files to my project, and also add slf4j (downloaded from their site) and libthrift.
Most of the errors that I had previously (imports etc.) are gone, except for errors related to overridden methods.
So basically it complains like:
The method clear() of type Server must override a superclass method
and similarly for compareTo, write, read etc. In short it complains about all methods that are overridden. This is all thrift compiler generated code and I haven't changed anything.
Is there any incompatibility? I cannot really find any mention of that. I have tried removing and then adding the libraries, I have also tried cleaning, refreshing, validating the project but the errors are still there.
I have also tried to compile the Thrift code itself, but MinGW is also a huge headache. It cannot find configure even though I have installed it. And if I run the MSYS console, it is able to configure but cannot make, complaining that inttypes.h is not present (it is not in the MSYS include directory but is present in the MinGW include directory).
Any suggestion would be appreciated.
Are you using Java 5? With Java 5, @Override doesn't apply to methods implemented from interfaces, only to methods overridden from superclasses.
If you are using a Java 5 compiler, try using a more recent javac (preferably 7 or 8) and see if that works.
EDIT:
Not sure if this is in your version of Thrift, but in mine it looks like there is a flag called java5 that you can specify when generating code, to indicate that you want the generated code to be Java 5 compliant (see the invocation sketch after the option list):
java (Java):
beans: Members will be private, and setter methods will return void.
private-members: Members will be private, but setter methods will return 'this' like usual.
nocamel: Do not use CamelCase field accessors with beans.
fullcamel: Convert underscored_accessor_or_service_names to camelCase.
android: Generated structures are Parcelable.
android_legacy: Do not use java.io.IOException(throwable) (available for Android 2.3 and above).
java5: Generate Java 1.5 compliant code (includes android_legacy flag).
reuse-objects: Data objects will not be allocated, but existing instances will be used (read and write).
sorted_containers: Use TreeSet/TreeMap instead of HashSet/HashMap as a implementation of set/map.
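If your Thrift version has it, the invocation would look roughly like this (the IDL file name is hypothetical):
thrift --gen java:java5 myservice.thrift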

Deprecating a Java JRE method

I would like to mark usage of certain methods provided by the JRE as deprecated. How do I do this?
You can't. Only code within your control can have the @Deprecated annotation added. Any attempt to reverse engineer the bytecode will result in a non-portable JRE. This is contrary to Java's write once, run anywhere methodology.
You can't deprecate JRE methods, but you can add warnings or even compile errors through your build system (e.g. using AspectJ), or forbid the use of given methods in the IDE.
For example in Eclipse:
Go to Project properties --> Java Compiler --> Errors/Warnings, then enable project-specific settings and expand the "Deprecated and restricted APIs" category:
"Forbidden reference (access rule)"
Obviously you could instrument or override the class, adding the @Deprecated annotation, but it's not a clean solution.
Add such restrictions to your coding guidelines, and enforce as part of your code review process.
You can do it if, and only if, you are building your own JRE! In that case, just add @Deprecated above the corresponding code block. But if you are using Oracle's JRE, there is no way to do so!
In what context? Do you mean you want to be able to easily configure your IDE to inhibit use of certain API? Or are you trying to dictate to the world what APIs you prohibit? Or are you trying to do something at runtime?
If the first case, Eclipse, and I assume other IDEs, allow you to mark any API as forbidden, discouraged, or accessible at the package or class level.
If you mean the second, you can't, of course. That would be silly.
If you are trying to prohibit certain methods from being called at runtime, you can configure a security policy to prevent code loaded from specified locations from being able to call specific methods that check with the SecurityManager, if one is installed.
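A rough sketch of that last option (the class name is made up); note that a SecurityManager can only veto operations that actually perform a security check, such as System.exit():
// Hypothetical SecurityManager that forbids System.exit() at runtime.
public class NoExitSecurityManager extends SecurityManager {
    @Override
    public void checkExit(int status) {
        throw new SecurityException("System.exit is not allowed here");
    }

    @Override
    public void checkPermission(java.security.Permission perm) {
        // Permit everything else in this sketch.
    }
}

// Installed, e.g., early in main():
// System.setSecurityManager(new NoExitSecurityManager());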
You can compile your own version of the class and add it to the boot class path or lib/ext directory. http://docs.oracle.com/javase/tutorial/ext/basics/install.html This will change the JDK and the JRE.
In fact, you can remove the method from the copy used for compiling, and then your program won't compile if the method is used.
Snihalani: Just so that I get this straight ...
You want to 'deprecate methods in the JRE' in order to 'make sure people don't use Java's implementation and use my implementation from now on'?
First of all: you can't change anything in the JRE, nor are you allowed to; it's the property of Oracle. You might be able to change something locally if you want to go through the trouble, but that'll just be in your local JRE, not in the ones that can be downloaded from the Oracle web page.
Next to that, nobody has your implementation, so how would we be able to use it anyway? The implementations provided by Oracle do exactly what they should do, and when a flaw/bug/... is found, it'll be corrected or replaced by a new method (at which point the original method becomes deprecated).
But what mostly worries me is that you would go and replace implementations with something you came up with. It reminds me quite a lot of phishing and similar techniques: having us run your code, without knowing what it does, without even knowing we are running your code. After all, if you had access to the original code and could "build" the JRE, what's to stop you from altering the code in the original method?
Deprecated is a way for the author to say:
"Yup... I did this in the past, but it seems that there are problems with the method.
Just in order not to change the behaviour of existing applications using this method, I will not change this method, but rather mark it as deprecated and add a method that solves this problem."
You are not the author, so it isn't up to you to decide whether or not the methods work the way they should anyway.
