Summary:
Is there any way to configure IntelliJ to not add errors to existing un-annotated code, but never allow a null to be passed into a @NonNull parameter?
Details: Eclipse actually has three "nullable" states: @NonNull, @Nullable, and un-annotated.
IntelliJ appears to have only two: you treat un-annotated code as either @NonNull or @Nullable; un-annotated does not have its own behavior.
All three states are pretty important if you have a large existing project and want to start using the annotations.
Question: Am I missing a configuration option or are the annotations assumed to be an all-or-nothing in IntelliJ?
tl;dr: (why all three types are needed)
In Eclipse, un-annotated code will act as:
@NonNull when interacting with other un-annotated items
@Nullable when being passed to something annotated with @NonNull
The first is needed so that warnings will not be created on existing un-annotated code like "str.length()".
The second is so that you will be warned if you attempt to pass an un-annotated potential null to a method annotated with @NonNull.
This combination allows you to slowly push annotations through your otherwise un-annotated code.
In IntelliJ, an option called "Treat non-annotated members and parameters as @Nullable" defines the behavior of un-annotated variables/code: if it is off, they act as @NonNull; if it is on, they are @Nullable.
Turning on the "Treat as @Nullable" option is great if you assume your entire codebase is annotated, but if you are working with existing code it will cause endless warnings in un-annotated cases like this:
public void test(String s) {
    System.out.println(s.length());
}

public void callTest(String maybeNull) {
    test(maybeNull);
}
If I use IntelliJ's setting specifying that un-annotated should be treated as Nullable, this un-annotated code gains a warning because s is nullable and must be null-checked before s.length() can be called. I don't want to see that kind of warning throughout my code until I'm ready to add the annotations.
On the other hand, if I assume the default settings, then when I start to apply annotations by changing this line:
public void test(@NonNull String s) {
It does NOT cause the expected error in the caller, because IntelliJ assumes that the un-annotated maybeNull should be treated as @NonNull. We absolutely want a warning here because maybeNull actually could be null; otherwise the annotations aren't doing anything for us.
Related
I have a Java method in which the method itself is annotated with a cross-parameter constraint and the arguments are also annotated with constraints (@NotNull, @NotEmpty, etc.).
Is the method constraint validated after method argument validation or is the order of validation not specified?
Annotations don't inherently do anything; they just mark things. javac itself knows what @Deprecated, @Override, and @FunctionalInterface mean, but the effect is always either to do nothing or to generate a compiler error: these annotations do not cause the compiler to generate any code.
Aside from Project Lombok this is a general principle of annotations, and even Project Lombok is an annotation processor: you have to put Lombok on your classpath during compilation, or nothing happens.
In other words, it is not possible for the @NonNull annotation in your code to generate any null check all on its own. The constraints are applied elsewhere, or by a code-generating annotation processor that you explicitly include by putting it on the classpath or passing it along as an annotation processor. For example, it IS possible for code you invoke to introspect your method and notice things. Thus, you could for example have:
class Example {
    @NotEmpty String name;

    Example(String name) { this.name = name; }
}
and then you can do:
new Example("");
and this won't cause an exception. But you could do:
Validator validator = SomeHypotheticalValidationLibrary.newValidator();
validator.validate(new Example(""));
and then this validator would produce an error stating that the instance you provided fails verification. This is an example of annotations being introspected.
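A minimal, self-contained sketch of that kind of introspection (the @NotEmpty annotation and validator here are hypothetical illustrations, not a real library):

```java
import java.lang.annotation.*;
import java.lang.reflect.Field;

public class ReflectiveValidation {
    // Hypothetical constraint annotation, retained at runtime so reflection can see it.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.FIELD)
    @interface NotEmpty {}

    static class Example {
        @NotEmpty String name;
        Example(String name) { this.name = name; }
    }

    // Walk the object's fields and flag @NotEmpty violations via reflection.
    static boolean isValid(Object o) {
        try {
            for (Field f : o.getClass().getDeclaredFields()) {
                if (f.isAnnotationPresent(NotEmpty.class)) {
                    f.setAccessible(true);
                    Object value = f.get(o);
                    if (value == null || value.toString().isEmpty()) {
                        return false;
                    }
                }
            }
            return true;
        } catch (IllegalAccessException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(isValid(new Example("")));    // false: empty name fails
        System.out.println(isValid(new Example("Ada"))); // true
    }
}
```

The annotation alone does nothing; only the validator's explicit reflection pass gives it meaning.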
And now to answer your question:
The order in which such constraints are validated depends entirely on the validation library you use to do the validation; out of the box, the annotation itself does not and cannot produce any validation code. You'd have to check the documentation of your validation library, and provide the context within which you are validating.
If you're talking specifically about Lombok's @NonNull: Lombok scans your code for null checks (either of the form if (x == null) throw new Something(); or a call such as Objects.requireNonNull or Guava's Preconditions.checkNotNull). If it finds a null check for an @NonNull-annotated parameter, Lombok does nothing. If it doesn't, it generates a null check after all your explicit null checks. Lombok stops scanning for null checks once it hits a line that is NOT a null check (so, neither an if (x == null) nor a methodInvocation(x, "optional text");). @NonNull is currently the only annotation that causes Lombok to generate validation code (there is no @lombok.NotEmpty).
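As a rough illustration of the shape of that generated check, written out as plain hand-written Java (this is an equivalent sketch, not Lombok's actual output):

```java
public class NonNullSketch {
    // Roughly what a generated check for: void greet(@NonNull String name)
    // looks like after annotation processing.
    static String greet(String name) {
        if (name == null) {
            throw new NullPointerException("name is marked non-null but is null");
        }
        return "Hello, " + name;
    }

    public static void main(String[] args) {
        System.out.println(greet("world")); // Hello, world
        try {
            greet(null);
        } catch (NullPointerException e) {
            System.out.println("rejected: " + e.getMessage());
        }
    }
}
```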
We may be able to give more insights if you explain which annotation processors / validation frameworks you are using.
I have a very simple class and using Immutables library. The auto-generated code defines equals method like so:
@Override
public boolean equals(@Nullable Object another) {
The @Nullable annotation causes the following FindBugs error:
NP_METHOD_PARAMETER_TIGHTENS_ANNOTATION: Method tightens nullness annotation on parameter
A method should always implement the contract of a method it overrides. Thus, if a method takes a parameter that is marked as @Nullable, you shouldn't override that method in a subclass with a method where that parameter is @Nonnull. Doing so violates the contract that the method should handle a null parameter.
I am using Immutables-value-2.5.6.jar
Has anyone seen this error?
I have mitigated the issue temporarily by adding:
@SuppressFBWarnings
to the Immutables class. But I don't think this is a long term solution. There must be something else I am missing.
This appears to be an open bug in the FindBugs project (https://sourceforge.net/p/findbugs/bugs/1385/), so I would say that disabling the warning using an annotation is fine until the next release.
This class suggests that the SpotBugs project, the successor to FindBugs, has addressed the issue. Perhaps consider migrating?
Update: the FindBugs issue has since been closed.
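If you do migrate, the SpotBugs Maven plugin is a drop-in replacement for the FindBugs one; a minimal sketch of the plugin coordinates (the version number here is illustrative; check for the latest release):

```xml
<plugin>
  <groupId>com.github.spotbugs</groupId>
  <artifactId>spotbugs-maven-plugin</artifactId>
  <version>4.8.5.0</version>
</plugin>
```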
I found something weird in Eclipse IDE. Let's say I have the following classes:
public class Super {
    @Deprecated
    public void doNotUseThisMethod() {
        // do magic
    }
}

public class Sub extends Super {
    @Override
    public void doNotUseThisMethod() {
        // why is this not deprecated?
    }
}
Surely overriding a deprecated method should result in a warning (because the same reason for not using it applies). Still, in brand-new workspaces for Eclipse Luna and Mars the above code does not yield a warning at all, and I can't find any way to enable one either.
I found this bug for interfaces (I get that if Super was an interface, there should be no warning) which implies that there was a warning once upon a time.
So what happened? Is there any reason overriding a deprecated method should not result in a warning? Can I do something to enable this functionality again?
The JLS is quite clear on this point (as it is on most points).
A program element annotated @Deprecated is one that programmers are discouraged from using, typically because it is dangerous, or because a better alternative exists.
A Java compiler must produce a deprecation warning when a type, method, field, or constructor whose declaration is annotated with @Deprecated is used (overridden, invoked, or referenced by name) in a construct which is explicitly or implicitly declared, unless:
The use is within an entity that is itself annotated with the annotation @Deprecated; or
The use is within an entity that is annotated to suppress the warning with the annotation @SuppressWarnings("deprecation"); or
The use and declaration are both within the same outermost class.
(https://docs.oracle.com/javase/specs/jls/se8/html/jls-9.html#jls-9.6.4.6)
To be fully compliant with the JLS, Eclipse must mark your doNotUseThisMethod() with a warning. And it seems that it used to do that, a long time ago, but in 2003 bug 48335 came along and that warning became subject to a preference. The default, as noted in the bug comments, is "disabled". (The default should be "enabled" to be even remotely compliant with the JLS. Feel free to file a bug to get it changed.) I haven't done an exhaustive search, but since you are seeing behavior consistent with that, I'm going to go out on a limb and say that this is how it has remained ever since.
Since it's a preference, you can change it. Just go to "Window -> Preferences", then select "Java -> Compiler -> Errors/Warnings". Scroll down to "Deprecated and restricted API" and check the box next to "Signal overriding or implementing deprecated method". There's a dropdown to the right and slightly above where you can choose "Error", "Warning" or "Ignore". Select the one you prefer.
That setting affects only your local IDE.
To get your entire team using that setting, you need to set Project Specific Settings. On that same preferences screen, in the upper right there's a link to "Configure Project Specific Settings". Click it, select the project for which you want to set the preferences, and do the same thing described above. This will create a file named "org.eclipse.jdt.core.prefs" in your project's ".settings" folder. (You can't see them in Package Explorer. You'll need to look at them in the Navigator view.) Once created, you can then add them to your source control and they'll apply to everyone on your team working on that project.
Yes, you have to do that for every project.
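For reference, the resulting entries in .settings/org.eclipse.jdt.core.prefs look roughly like this (a sketch; the key names are JDT compiler options and may vary between versions):

```
# .settings/org.eclipse.jdt.core.prefs (sketch)
org.eclipse.jdt.core.compiler.problem.deprecation=warning
org.eclipse.jdt.core.compiler.problem.deprecationWhenOverridingDeprecatedMethod=enabled
```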
Actually the behaviour seems right to me: when you override a method, you implement new code for it.
To avoid errors, if you use a deprecated super method, Eclipse will raise a warning:
class Sub extends Super {
    @Override
    public void doNotUseThisMethod() { // <---------------*
        super.doNotUseThisMethod(); // deprecated! <----*
    }
}
But if you override a deprecated method's functionality without using the deprecated super.method(), then the new method is not deprecated by definition, because you implemented it without using the old one.
public class Sub extends Super {
    @Override
    public void doNotUseThisMethod() {
        // new logic + not using super.method() = not a deprecated method!!
    }
}
If the method that overrides the deprecated one shouldn't be used either, it must also be annotated as deprecated, as follows:
public class Super {
    @Deprecated
    public void doNotUseThisMethod() {
        // do magic
    }
}

public class Sub extends Super {
    @Override
    @Deprecated // <---------------------------*
    public void doNotUseThisMethod() { // |
        // this is deprecated also!!!! // <-----*
    }
}
Side note: another question is whether the name of your method can cause confusion.
But doesn't @Deprecated also mean "we're going to delete this method soon", which will break the code of the subclass thanks to the @Override annotation? (That's exactly why I need the warning: to prevent other developers from overriding a method I'm going to delete. But as it is, they won't even get a warning.) – Steffi S.
Actually, it could happen that developers delete the method, but deprecation is more about not using the code in some method because it is old; developers then make a NEW one, with a NEW name, and leave the deprecated one as legacy.
But... What would happen in the case this method is deleted?
First of all, this would mean a change of Java (or framework, or utils) version, which should be done carefully; deprecated methods won't be your only problem.
In the case you move to a major version and some deprecated methods are deleted, you won't have much of a problem. Why? Your IDE will clearly mark these methods with an error, and you just need to delete the @Override annotation, keeping your logic safe in your overridden method.
You need a @SuppressWarnings("deprecation") at the beginning of your classes that implement the deprecated interface/function.
It is considered to be good practice to use the @Override annotation on methods which are being overridden in a subclass.
But why is the same not applied to the classes that come with the Java library? For example, the String class: it overrides methods of the Object class but does not use the @Override annotation on those methods.
Is this to maintain backward compatibility with previous releases of Java such as 1.4, etc.?
Thanks
Within an API, it does not offer much to the user (of that API). However, when you implement a method and you intend to override one from a super class, it is easy to get the method signature, which is supposed to match, wrong.
In this case the @Override annotation comes to the rescue: at compile time, the build will fail or give a warning when the override does not happen. Also, many IDEs recognize @Override and give you enough support to flag and correct those situations before you even compile.
So @Override in essence declares your intention that this method overrides something. The user of the API could care less what your intent is, as long as it works.
Actually, probably the true reason is this: the retention of the @Override annotation is set to SOURCE, which means the @Override flag is discarded when compiled into a class file.
@Target(value=METHOD)
@Retention(value=SOURCE)
public @interface Override
It is not much more than a cosmetic annotation: it's useful when generating documentation, to give hints through your Java IDE, and to explicitly state when a method is overridden.
From the runtime/standard library implementors' point of view, it was not worth the effort to modify all existing classes just to add something cosmetic.
Furthermore, regarding backward compatibility of annotations in general: annotations are an optional, extended attribute in a .class file (when their retention policy is CLASS or RUNTIME they are stored for classes and methods as Runtime(In)VisibleAnnotations and for parameters as Runtime(In)VisibleParameterAnnotations), so previous releases of the JVM would simply ignore that attribute during the .class file parsing performed the first time the class is needed.
But actually, a 1.4 JVM class parser will not even reach the point where those annotation attributes are located inside the structure, because parsing will end abruptly when the JVM notices that the .class version is greater than the supported one.
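The SOURCE retention can be observed directly: reflection never sees @Override on a compiled method, while @Deprecated (which has RUNTIME retention) remains visible. A minimal sketch:

```java
import java.lang.annotation.Annotation;

public class RetentionDemo {
    static class Base { void m() {} }

    static class Child extends Base {
        @Override
        @Deprecated
        void m() {}
    }

    // Report whether the given annotation is visible on Child.m() at runtime.
    static boolean visible(Class<? extends Annotation> a) {
        try {
            return Child.class.getDeclaredMethod("m").isAnnotationPresent(a);
        } catch (NoSuchMethodException e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        // @Override has SOURCE retention: discarded by javac, invisible here.
        System.out.println(visible(Override.class));   // false
        // @Deprecated has RUNTIME retention: visible via reflection.
        System.out.println(visible(Deprecated.class)); // true
    }
}
```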
The @Override annotation is used to provide some extra information, mainly while generating documentation, and also to inform the developer that the code intends to override a method from the superclass.
This is mentioned in the Oracle documentation.
@Override: the @Override annotation informs the compiler that the element is meant to override an element declared in a superclass. Overriding methods will be discussed in Interfaces and Inheritance.

// mark method as a superclass method
// that has been overridden
@Override
int overriddenMethod() { }

While it is not required to use this annotation when overriding a method, it helps to prevent errors. If a method marked with @Override fails to correctly override a method in one of its superclasses, the compiler generates an error.
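The error-prevention point is easy to demonstrate with a small sketch (the class names here are illustrative): a misspelled method name silently becomes a new overload without @Override, but fails to compile with it.

```java
public class OverrideCheck {
    static class Shape {
        public String describe() { return "shape"; }
    }

    static class Circle extends Shape {
        @Override
        public String describe() { return "circle"; }

        // A typo like the following would fail to compile, because with
        // @Override present the compiler notices it overrides nothing:
        //
        // @Override
        // public String descibe() { return "circle"; }  // compile error
    }

    public static void main(String[] args) {
        Shape s = new Circle();
        System.out.println(s.describe()); // circle
    }
}
```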
Refer to this discussion in SO itself.
I want to create an annotation that restricts a developer from specifying null as a parameter which has been annotated with @NoNull.
For example, if I create this method:
public void printLine(@NoNull String line) {
    System.out.println(line);
}
On a method call, I want an error to appear if the user specifies null for line: printLine(null);
I have been using APT for only a little bit of time, and am wondering how to do this (if possible)?
This is the annotation I have created so far:
@Target(ElementType.PARAMETER)
@Retention(RetentionPolicy.SOURCE)
public @interface NoNull {}
Compile time will be tough to check, since you're really dealing with runtime values. If you want to create annotations that automatically add code to check this stuff, you should look at Project Lombok:
http://projectlombok.org/
It uses an annotation processor to add code to your beans to do various things.
For example:
@Getter @Setter
private int id;
The annotation processor would automatically add get/set methods to your bean.
I don't think it has null checks, but you should be able to add this in and contribute it.
Another option is to use the validation JSR, though this requires you to explicitly validate at runtime; you could accomplish this with proxies or AOP.
@NotNull @Min(1)
public void setId(Integer id)
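As a sketch of the proxy approach mentioned above, a JDK dynamic proxy can enforce a runtime-retained null constraint on interface method parameters. The @NotNull annotation and Printer interface here are illustrative stand-ins, not the validation JSR's actual types:

```java
import java.lang.annotation.*;
import java.lang.reflect.*;

public class NotNullProxy {
    // Illustrative constraint annotation; must be RUNTIME-retained to be seen here.
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.PARAMETER)
    @interface NotNull {}

    interface Printer {
        void printLine(@NotNull String line);
    }

    // Wrap an implementation in a proxy that rejects nulls for @NotNull parameters.
    @SuppressWarnings("unchecked")
    static <T> T guarded(Class<T> iface, T target) {
        return (T) Proxy.newProxyInstance(
                iface.getClassLoader(), new Class<?>[]{iface},
                (proxy, method, args) -> {
                    Parameter[] params = method.getParameters();
                    for (int i = 0; i < params.length; i++) {
                        if (params[i].isAnnotationPresent(NotNull.class) && args[i] == null) {
                            throw new IllegalArgumentException(
                                    params[i].getName() + " must not be null");
                        }
                    }
                    return method.invoke(target, args);
                });
    }

    public static void main(String[] args) {
        Printer p = guarded(Printer.class, line -> System.out.println(line));
        p.printLine("ok");
        try {
            p.printLine(null);
        } catch (IllegalArgumentException e) {
            System.out.println("rejected null");
        }
    }
}
```

Unlike APT, this checks nothing at compile time; it simply moves the null check into one reusable interception point at runtime.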
The point isn't to use the annotation only for readability, but to enforce the annotation at compile time with APT.
Considering that null is a runtime artifact, I don't see how you will enforce a null check at "compile time."
Instead, you'll have to modify your classes, and apt is not the tool to do this, at least not by itself. It exists to extract information about annotated elements from source files. But to enforce your @NoNull restriction, you need to modify the running class.
One thing that you could do is use apt to extract information about annotated parameters, then use a tool like aspectj to modify those classes at runtime to check the parameter value.
But that's a topic that's way too broad for a single SO question.
@Nullable and @Nonnull are located in the package javax.annotation.
Check out Guava; it has some nice things for type safety:
http://code.google.com/p/guava-libraries/wiki/GuavaExplained