I have code in my project that looks like this:
public interface Bar<T extends Foo<?>> {
//...
}
public class MyFoo implements Foo<String> {
private List<Bar<Foo<String>>> barFoo = ...
public <U extends Foo<String>> boolean addBar(Bar<? extends U> b) {
return barFoo.add((Bar<Foo<String>>) b); //safe cast?
}
}
Eclipse gives a warning for the cast in addBar that the cast is unsafe. However, am I correct in assuming that the cast will not throw given the restrictions that I have put on the type parameters, and therefore the cast is indeed safe?
Not in general.
Suppose there are two implementations of Foo<String>, MyFoo and YourFoo, and a caller calls addBar on a value of type Bar<MyFoo>. This works: when U = Foo<String>, we have that Bar<MyFoo> is a subtype of Bar<? extends U>. Now we cast that value to a Bar<Foo<String>>.
Now if Bar has no methods that accept T's as arguments, there's no problem. But suppose it has a method void process(T value). The implementation we called has T = MyFoo, so it only has a process(MyFoo value) method. Once we cast it to a Bar<Foo<String>>, though, we might call it with a YourFoo instead. This is illegal.
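A minimal, self-contained sketch of that failure mode (the process method, MyBar, and YourFoo are hypothetical, introduced only to illustrate the argument above):
import java.util.ArrayList;
import java.util.List;

public class UnsafeCastDemo {
    interface Foo<T> {}
    interface Bar<T extends Foo<?>> { void process(T value); }

    static class MyFoo implements Foo<String> {}
    static class YourFoo implements Foo<String> {}
    static class MyBar implements Bar<MyFoo> {
        @Override public void process(MyFoo value) { /* relies on MyFoo specifics */ }
    }

    private final List<Bar<Foo<String>>> barFoo = new ArrayList<>();

    @SuppressWarnings("unchecked")
    <U extends Foo<String>> boolean addBar(Bar<? extends U> b) {
        return barFoo.add((Bar<Foo<String>>) b); // the unchecked cast from the question
    }

    void feedAll(Foo<String> foo) {
        for (Bar<Foo<String>> bar : barFoo) {
            bar.process(foo); // a Bar<MyFoo> stored above may now receive a YourFoo
        }
    }

    public static void main(String[] args) {
        UnsafeCastDemo demo = new UnsafeCastDemo();
        demo.addBar(new MyBar());    // compiles: Bar<MyFoo> matches Bar<? extends U>
        demo.feedAll(new YourFoo()); // ClassCastException inside MyBar's bridge method
    }
}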
Stab in the dark, but I suspect that what you really wanted to do was declare barFoo as a List<? extends Bar<? extends Foo<String>>>.
This is not a safe cast. Eclipse is correct.
Imagine you had a class MyFoo that implements Foo<String> and you passed in a Bar<MyFoo>. After the cast, callers see a method of Bar with a myMethod(Foo x) signature, when only a myMethod(MyFoo x) implementation was compiled, so such a call would fail at run time.
The cast is not safe, because although U extends Foo<String>, it is not (necessarily) the case that Bar<U> extends Bar<Foo<String>>. In fact, Bar<U> will only extend Bar<Foo<String>> when they are the same thing, i.e., when U is Foo<String>.
Intuitively, it may seem that (for example) List<String> should be a subtype of List<Object>, but this is not how generics work. List<String> is a subtype of List<? extends Object>, but it is not a subtype of List<Object>. (It may make more sense to consider an example like Comparable<T>: Comparable<String> means "can be compared to any String", whereas Comparable<Object> means "can be compared to any Object". It should be clear that Comparable<String> should not be a subtype of Comparable<Object>.)
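For example (a small sketch of the Comparable intuition; the class name is made up):
public class ComparableDemo {
    public static void main(String[] args) {
        Comparable<String> cs = "hello";       // String implements Comparable<String>
        Comparable<? extends Object> ok = cs;  // fine: wildcard subtyping
        // Comparable<Object> co = cs;         // does not compile, and rightly so:
        // co.compareTo(Integer.valueOf(42));  // ...it would let us "compare" a String to an Integer
        System.out.println(ok.equals(cs));     // just to use the variables; prints true
    }
}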
[…] the cast will not throw […], and therefore the cast is indeed safe?
I think you're misunderstanding the nature of the warning. Eclipse is warning you that this cast will not throw even when it should, and this is actually why it's not safe. For example, this code:
final Object o = Integer.valueOf(7);
final String s = (String) o;
is perfectly safe, because the cast will throw an exception. But this code:
final List<?> wildcardList = new ArrayList<Integer>(List.of(Integer.valueOf(7)));
final List<String> stringList = (List<String>) wildcardList;
is unsafe, because the runtime has no way of checking the cast (due to erasure), so it won't throw an exception, even though it's wrong: stringList is now a List<String> whose first element is of type Integer. (What happens is, at some point later on, you can get a spontaneous ClassCastException when you try to do something with that element.)
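To make that "spontaneous" failure concrete, here is a runnable sketch of the same example; note that the exception surfaces at the read, far away from the unchecked cast:
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    public static void main(String[] args) {
        final List<?> wildcardList = new ArrayList<Integer>(List.of(Integer.valueOf(7)));
        final List<String> stringList = (List<String>) wildcardList; // unchecked warning, no exception
        final String s = stringList.get(0); // ClassCastException is thrown here instead
        System.out.println(s);
    }
}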
Related
How to safely cast Class<?> (returned by Class.forName()) to Class<Annotation> without issuing "Unchecked cast" warning?
private static Class<? extends Annotation> getAnnotation() throws ClassNotFoundException {
final Class<?> loadedClass = Class.forName("java.lang.annotation.Retention");
if (!Annotation.class.isAssignableFrom(loadedClass)) {
throw new IllegalStateException("@Retention is expected to be an annotation.");
}
@SuppressWarnings("unchecked")
final Class<? extends Annotation> annotationClass = (Class<? extends Annotation>) loadedClass;
return annotationClass;
}
Multiple misconceptions need to be explained before delving into the answer.
You're using the wrong variance
final Class<Annotation> annotationClass = (Class<Annotation>) loadedClass;
Leaving the unchecked cast aside, treating a Class<SomeSpecificAnno> as a Class<Annotation> is illegal in any case. Try it:
Class<Number> n = Integer.class;
That won't compile.
Generics are invariant. It means that within the <>, you can't use a supertype as a stand-in for a subtype or vice versa.
Normal Java (when <> are not involved) is covariant: any subtype is a stand-in for one of its supertypes. This:
Number n = Integer.valueOf(5);
is perfectly legal Java. But in the generics world it isn't. If you want variance, you have to opt into it: X extends Y is how you opt into covariance, and X super Y is how you opt into contravariance (contravariance is as if Integer i = new Number(); were legal - a SUPERtype can stand in for a subtype).
This is all because that's just how the universe ends up working out. If generics were naturally covariant, this would compile:
List<Integer> listOfInts = new ArrayList<>();
List<Number> listOfNums = listOfInts;
listOfNums.add(Double.valueOf(1.0));
int i = listOfInts.get(0);
but follow along with your own eyes and you realize that code is a walking type violation: it shoves a non-integer into a list of integers. That's why opting into covariance or contravariance closes doors. If you opt into covariance, the add method is disabled *1:
List<? extends Number> list = new ArrayList<Integer>(); //legal
list.add(Integer.valueOf(5)); // will not compile
similarly, if you opt into contravariance, add works great, but get is disabled. 'disabled' in the type system sense: You can call it. But the expression list.get(i) would be of type Object:
List<? super Integer> list = new ArrayList<Number>(); // legal
list.add(Integer.valueOf(5)); // legal
Integer i = list.get(0); // won't compile
Object o = list.get(0); // this will.
With Class objects, where what counts as a 'write' is not exactly clear, it's harder to see why Class<Annotation> c = SomeSpecificAnno.class; should fail to compile, but it does; that's the first important realization.
Why are you using reflection here?
You can make class literals in java. This works great:
Class<? extends Number> c = Integer.class;
That's real java: You can stick .class at the end of any type and that will be an expression of type java.lang.Class, in fact, it's of type Class<TheExactThing>. So:
private static Class<? extends Annotation> getAnnotationType() {
return Retention.class;
}
works and compiles fantastically. I had to update the return type because as I explained above, returning the instance of j.l.Class that represents the Retention annotation for a method that is specced to return Class<Annotation> is as broken as returning an integer from a method that is specced to return a string.
The answer
If your code example is using java.lang.annotation.Retention only as a stand-in, and your actual string is a dynamic value that you do not know at compile time, so that the return Retention.class; option is off the table, then:
private static Class<? extends Annotation> getAnnotationType(String fqn) throws ClassNotFoundException {
return Class.forName(fqn).asSubclass(Annotation.class);
}
Again, do not use reflection unless there is no other way, and if you have the class in a string constant, generally you do not need reflection.
*1 ) You can call add, but only with a null literal; list.add(null); compiles, because null is trivially a valid value for any type. However, that's not particularly useful, of course.
Due to type erasure, generic type information is not accessible anymore at runtime. This means: Class<?> and Class<? extends Annotation> are indistinguishable at runtime - so you cannot do a runtime check to make the unchecked cast a checked one.
This means: you have to live with the warning (mind: a warning is not an error, it just means "this is problematic, make sure you know what you're doing!").
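As a small illustration of the erasure point (a sketch, not part of the original question): there is no run-time test you could even write against the parameterized type, which is why the code above has to combine isAssignableFrom with a suppressed cast:
import java.lang.annotation.Annotation;

public class ErasedClassCheck {
    static void check(Class<?> loaded) {
        // if (loaded instanceof Class<? extends Annotation>) { } // does not compile:
        // Class<? extends Annotation> is not a reifiable type, so instanceof rejects it.
        System.out.println(Annotation.class.isAssignableFrom(loaded)); // the run-time equivalent
    }

    public static void main(String[] args) {
        check(java.lang.annotation.Retention.class); // prints true
    }
}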
Why does the following not work?
void execute() {
Integer a = Integer.valueOf(1);
a = reassign(a);
D.log("a: " + a);
}
<T extends Integer> T reassign(T t) {
t = Integer.valueOf(2); // error: incompatible types: Integer cannot be converted to T
// t = (T) Integer.valueOf(2); // This works but with warning: [unchecked] unchecked cast
return t;
}
<T extends Integer> T reassign2(T t, T anotherT) {
t = anotherT; // This works without any warning.
return t;
}
My understanding is that generic methods/classes/interfaces will be compiled to a single class file where the type parameter is replaced with most appropriate lower bound (Integer in the above case).
Java env: java 11.0.4 2019-07-16 LTS
My understanding is that generic methods/classes/interfaces will be compiled to a single class file where the type parameter is replaced with most appropriate lower bound
Your understanding is correct, but compilers are designed to handle generics more intelligently than that. If compilers were designed exactly the way you described, what would be the point of generics? I could just write a method taking an Integer instead. There would be no need for generics, since the compiler would just replace whatever type parameter I have with Integer anyway.
You have specified that T must be Integer or a subclass of Integer. Think about the situation when T is a subclass of Integer: would the following assignment still work? It wouldn't!
t = Integer.valueOf(2); // you are assigning an instance of a superclass to a subclass variable
You could argue that Integer cannot have any subclasses as it is final, but the compiler is not designed to check for the finalness of classes in this situation. Using Integer as a bound here probably means that reassign shouldn't be generic at all.
Another thing that the compiler does is insert casts where necessary, but that's not really relevant to this question.
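(For completeness, a small sketch of what those inserted casts amount to; the example is unrelated to reassign itself:)
import java.util.ArrayList;
import java.util.List;

public class InsertedCastDemo {
    public static void main(String[] args) {
        List<String> names = new ArrayList<>();
        names.add("x");
        String s = names.get(0);              // what you write
        // String s = (String) names.get(0);  // roughly what the compiler emits after erasure
        System.out.println(s);
    }
}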
The reason it doesn't work is that the compiler can't prove that T is actually Integer and not a proper subtype of Integer. Integer is a bad example here because it is final and no one can extend it, but the compiler is not smart enough to know that and reason about it.
Imagine you have the following
class Foo{
}
class Bar extends Foo {
}
and you call reassign like this
reassign(new Bar());
Where reassign was allowed to do
<T extends Foo> T reassign(T t){
t = new Foo();
return t;
}
then this would be the equivalent of saying
Bar b = new Foo();
which is not valid, of course.
import java.util.List;
import java.util.ArrayList;
interface Canine {}
class Dog implements Canine {}
public class Collie extends Dog {
public static void main(String[] args){
List<Dog> d = new ArrayList<Dog>();
List<Collie> c = new ArrayList<Collie>();
d.add(new Collie());
c.add(new Collie());
do1(d); do1(c);
do2(d); do2(c);
}
static void do1(List<? extends Dog> d2){
d2.add(new Collie());
System.out.print(d2.size());
}
static void do2(List<? super Collie> c2){
c2.add(new Collie());
System.out.print(c2.size());
}
}
The answer for this question says that when a method takes a wildcard generic type, the collection can be accessed or modified, but not both. (Kathy and Bert)
What does it mean that 'when a method takes a wildcard generic type, the collection can be accessed or modified, but not both'?
As far as I know,
The method do1 has List<? extends Dog> d2, so d2 can only be accessed but not modified.
The method do2 has List<? super Collie> c2, so c2 can be accessed and modified, and there is no compilation error.
Generic guidelines
You cannot add a Cat to a List<? extends Animal> because you don't know what kind of list that is. That could be a List<Dog> also. So you don't want to throw your Cat into a Black Hole. That is why modification of List declared that way is not allowed.
Similarly when you fetch something out of a List<? super Animal>, you don't know what you will get out of it. You can even get an Object, or an Animal. But, you can add an Animal safely in this List.
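A short sketch of those guidelines, with hypothetical Animal/Dog/Cat classes:
import java.util.ArrayList;
import java.util.List;

public class WildcardGuidelines {
    static class Animal {}
    static class Dog extends Animal {}
    static class Cat extends Animal {}

    public static void main(String[] args) {
        List<? extends Animal> producers = new ArrayList<Dog>();
        // producers.add(new Cat()); // does not compile: the list might really be a List<Dog>
        // producers.add(new Dog()); // does not compile either, for the same reason
        Animal a = producers.isEmpty() ? null : producers.get(0); // reading as Animal is fine

        List<? super Animal> consumers = new ArrayList<Object>();
        consumers.add(new Dog());    // adding an Animal (or any subtype) is fine
        Object o = consumers.get(0); // reading only gives you Object
        System.out.println(a + " " + o);
    }
}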
I pasted your code into my IDE. The following error was signalled inside do1:
The method add(capture#1-of ? extends Dog) in the type List is not applicable for the arguments (Collie)
This is, of course, as expected.
You simply cannot add a Collie to a List<? extends Dog> because this reference may hold for example a List<Spaniel>.
The answer for this question says that when a method takes a wildcard generic type, the collection can be accessed or modified, but not both. (Kathy and Bert)
That's a fair first approximation, but not quite correct. More correct would be:
You can only add null to a Collection<? extends Dog> because its add method takes an argument of ? extends Dog. Whenever you invoke a method, you must pass parameters that are of a subtype of the declared parameter type; but for the parameter type ? extends Dog, the compiler can only be sure that the argument is of compatible type if the expression is null. However, you can of course modify the collection by calling clear() or remove(Object).
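In code, the compile-time behaviour just described looks like this (a sketch that reuses the Dog and Collie classes from the question, assumed to be in the same package):
import java.util.List;

class ExtendsDogCalls {
    static void demo(List<? extends Dog> dogs) {
        // dogs.add(new Collie());  // does not compile: the parameter type is "capture of ? extends Dog"
        dogs.add(null);             // compiles: null is assignable to every reference type
        dogs.remove("not a dog");   // compiles: remove(Object) does not mention the type parameter
        dogs.clear();               // compiles: no type parameter in the signature
        Dog d = dogs.isEmpty() ? null : dogs.get(0); // reading as Dog is always fine
        System.out.println(d);
    }
}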
On the other hand, if you read from a Collection<? super Dog>, its iterator has return type ? super Dog. That is, it will return objects of some unknown supertype of Dog. Put differently, the Collection could be a Collection<Object> containing only instances of String. Therefore
for (Dog d : collection) { ... } // does not compile
so the only thing we know is that instances of Object are returned, i.e. the only type-correct way of iterating such a Collection is
for (Object o : collection) { ... }
but it is possible to read from the collection; you just don't know what type of objects you will get.
We can easily generalize that observation to: Given
class G<T> { ... }
and
G<? extends Something> g;
we can only pass null to method parameters with declared type T, but we can invoke methods with return type T and assign the result to a variable of type Something.
On the other hand, for
G<? super Something> g;
we can pass any expression of type Something to method parameters with declared type T, and we can invoke methods with return type T, but only assign the result to a variable of type Object.
To summarize, the restrictions on the use of wildcard types only depend on the form of the method declarations, not on what the methods do.
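A compact sketch of that generalization, with a hypothetical G<T> that has one T-consuming and one T-producing method:
class G<T> {
    private T value;
    void set(T t) { this.value = t; } // method parameter with declared type T
    T get() { return value; }         // method with return type T

    public static void main(String[] args) {
        G<? extends Number> producer = new G<Integer>();
        // producer.set(Integer.valueOf(1)); // does not compile; only producer.set(null) would
        Number n = producer.get();           // fine: the result is assignable to Number

        G<? super Number> consumer = new G<Object>();
        consumer.set(Integer.valueOf(1));    // fine: any Number (or subtype) can be passed
        Object o = consumer.get();           // compiles, but the result is only an Object
        System.out.println(n + " " + o);
    }
}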
I pasted your code into IDEONE http://ideone.com/msMcQ. It did not compile for me - which is what I expected. Are you sure you did not have any compilation errors?
I have the following code:
public <T extends SomeObject> long doSomething(T someObject){
List<? extends SomeObject> l = new LinkedList<>();
l.add(someObject);
}
this causes a compilation error, telling me that there is no suitable method found for add(T),
why is that?
If l accepts things that extend SomeObject, shouldn't it accept someObject, since it is bounded to extend SomeObject?
List<? extends SomeObject> l
What do you mean by that? Of course it will generate an error.
Take this example: SomeObject is Fruit, and you have two derived classes, Apple and Orange.
What will your list contain, Apples or Oranges? The compiler cannot tell, so it generates an error.
If you replace List<? extends SomeObject> l with List<SomeObject> l, then this will work, because Apple and Orange are both Fruit.
I would advise you to use this statement:
List<T> l = new LinkedList<T>();
This is no less type-safe than
List<SomeObject> l = new LinkedList<SomeObject>();
and additionally gives you an opportunity to get objects of type T from the list without casting. T is already SomeObject so no casting required to call methods of SomeObject on T.
And all that with less typing!
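Putting that advice back into the original method, a sketch (a minimal SomeObject is declared here only so the snippet is self-contained, and the return value is made up just so the method compiles):
import java.util.LinkedList;
import java.util.List;

class SomeObject {}

class DoSomethingExample {
    public <T extends SomeObject> long doSomething(T someObject) {
        List<T> l = new LinkedList<>();
        l.add(someObject); // compiles: the list's element type is exactly T
        return l.size();   // hypothetical return value, just to complete the method
    }
}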
Back to the problem.
First thing to note is that the wildcard type "?" means unknown; this is important.
You may, however, specify an upper (? extends) or a lower (? super) constraint on it.
You declared the list as "List<? extends SomeObject>".
Such a list is known to have objects of SomeObject inside, but (!) the exact type of those objects is unknown.
The compiler cannot say whether there are instances of "class A extends SomeObject" or instances of "class B extends SomeObject" inside the list.
If you call list.get(), it can only say that there will be an object of type SomeObject.
SomeObject obj = list.get(1); // Ok
But inserting an object of any(!) type is unsafe because the actual type of elements in the list is unknown.
You might wonder why the wildcard type exists at all.
It is there to relax typing restrictions that would otherwise be too strict.
Sample
class A { }
class A2 extends A { }
class B<T> {
    private T value;
    void change(T a) { this.value = a; }
    T read() { return value; }
}
Without wildcards we would not be able to do this kind of assignment at all: B<A> b = new B<A2>(); does not work.
This is because the type conversion from B<A2> to B<A> is unsafe.
Why? Let's look (copied from http://en.wikipedia.org/wiki/Generics_in_Java)
List<Integer> ints = new ArrayList<Integer>();
ints.add(2);
List<Number> nums = ints; // valid if List<Integer> were a subtype of List<Number>
nums.add(3.14);
Integer x = ints.get(1); // now 3.14 is assigned to an Integer variable!
What is the solution? Sometimes, we want to do such assignments or pass parameters in a general way!
Wildcard type helps here: B<? extends A> b = new B<A2>();
The method void change(T a) of B is now disabled - this is what your question was about and what was explained in the first part.
The method T read() is still available and returns A: A a = b.read();. Yes, it actually returns an A2, but to the caller of b.read() it is visible as an A.
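Spelled out with the sample classes above (a short sketch):
class WildcardOnB {
    public static void main(String[] args) {
        B<? extends A> b = new B<A2>();
        // b.change(new A());  // does not compile: change(T) is "disabled" through this wildcard view
        // b.change(new A2()); // does not compile either; the compiler only knows "? extends A"
        A a = b.read();        // fine: read() still produces something usable as an A
        System.out.println(a);
    }
}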
Wildcard types are widely used in Collections Framework.
Why can I use super only with wildcards and not with type parameters?
For example, in the Collection interface, why is the toArray method not written like this
interface Collection<T>{
<S super T> S[] toArray(S[] a);
}
Using super to bound a named type parameter (e.g. <S super T>), as opposed to a wildcard (e.g. <? super T>), is ILLEGAL simply because even if it were allowed, it wouldn't do what you'd hope it would do: since Object is the ultimate supertype of all reference types, and everything is an Object, in effect there is no bound.
In your specific example, since any array of reference type is an Object[] (by Java array covariance), it could therefore be used as an argument to <S super T> S[] toArray(S[] a) (if such a bound were legal) at compile time, and it wouldn't prevent an ArrayStoreException at run time.
What you're trying to propose is that given:
List<Integer> integerList;
and given this hypothetical super bound on toArray:
<S super T> S[] toArray(S[] a) // hypothetical! currently illegal in Java
the compiler should only allow the following to compile:
integerList.toArray(new Integer[0]) // works fine!
integerList.toArray(new Number[0]) // works fine!
integerList.toArray(new Object[0]) // works fine!
and no other array type arguments (since Integer only has those 3 types as super). That is, you're trying to prevent this from compiling:
integerList.toArray(new String[0]) // trying to prevent this from compiling
because, by your argument, String is not a super of Integer. However, Object is a super of Integer, and a String[] is an Object[], so the compiler still would let the above compile, even if hypothetically you can do <S super T>!
So the following would still compile (just as the way they are right now), and ArrayStoreException at run-time could not be prevented by any compile-time checking using generic type bounds:
integerList.toArray(new String[0]) // compiles fine!
// throws ArrayStoreException at run-time
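A runnable sketch of that point against today's <T> T[] toArray(T[] a): the compiler accepts every one of these calls, and the bad one only fails at run time:
import java.util.Arrays;
import java.util.List;

public class ToArrayDemo {
    public static void main(String[] args) {
        List<Integer> integerList = Arrays.asList(1, 2, 3);
        Integer[] a = integerList.toArray(new Integer[0]); // fine
        Number[] b = integerList.toArray(new Number[0]);   // fine
        Object[] c = integerList.toArray(new Object[0]);   // fine
        System.out.println(a.length + b.length + c.length);
        integerList.toArray(new String[0]); // compiles, but throws ArrayStoreException here
    }
}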
Generics and arrays don't mix, and this is one of the many places where it shows.
A non-array example
Again, let's say that you have this generic method declaration:
<T super Integer> void add(T number) // hypothetical! currently illegal in Java
And you have these variable declarations:
Integer anInteger
Number aNumber
Object anObject
String aString
Your intention with <T super Integer> (if it's legal) is that it should allow add(anInteger), and add(aNumber), and of course add(anObject), but NOT add(aString). Well, String is an Object, so add(aString) would still compile anyway.
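By contrast, the use-site wildcard does give you a meaningful restriction, because it constrains the container rather than the argument (a sketch with the same variables):
import java.util.ArrayList;
import java.util.List;

public class SuperWildcardDemo {
    public static void main(String[] args) {
        Integer anInteger = 1;
        Number aNumber = 2.0;
        Object anObject = new Object();
        String aString = "s";

        List<? super Integer> sink = new ArrayList<Number>();
        sink.add(anInteger);    // fine: an Integer fits into any list of some supertype of Integer
        // sink.add(aNumber);   // does not compile: the list might really be a List<Integer>
        // sink.add(anObject);  // does not compile, for the same reason
        // sink.add(aString);   // does not compile: a String is not an Integer at all
        System.out.println(sink);
    }
}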
See also
Java Tutorials/Generics
Subtyping
More fun with wildcards
Related questions
On generics typing rules:
Any simple way to explain why I cannot do List<Animal> animals = new ArrayList<Dog>()?
java generics (not) covariance
What is a raw type and why shouldn’t we use it?
Explains how raw type List is different from List<Object> which is different from a List<?>
On using super and extends:
Java Generics: What is PECS?
From Effective Java 2nd Edition: "producer extends consumer super"
What is the difference between super and extends in Java Generics
What is the difference between <E extends Number> and <Number>?
How can I add to List<? extends Number> data structures? (YOU CAN'T!)
As no one has provided a satisfactory answer, the correct answer seems to be "for no good reason".
polygenelubricants provided a good overview of bad things happening with the java array covariance, which is a terrible feature by itself. Consider the following code fragment:
String[] strings = new String[1];
Object[] objects = strings;
objects[0] = 0;
This obviously wrong code compiles without resorting to any "super" construct, so array covariance should not be used as an argument.
Now, here I have a perfectly valid example of code requiring super in the named type parameter:
class Nullable<A> {
private A value;
// Does not compile!!
public <B super A> B withDefault(B defaultValue) {
return value == null ? defaultValue : value;
}
}
Potentially supporting some nice usage:
Nullable<Integer> intOrNull = ...;
Integer i = intOrNull.withDefault(8);
Number n = intOrNull.withDefault(3.5);
Object o = intOrNull.withDefault("What's so bad about a String here?");
The latter code fragment does not compile if I remove the B altogether, so B is indeed needed.
Note that the feature I'm trying to implement is easily obtained if I invert the order of type parameter declarations, thus changing the super constraint to extends. However, this is only possible if I rewrite the method as a static one:
// This one actually works and I use it.
public static <B, A extends B> B withDefault(Nullable<A> nullable, B defaultValue) { ... }
The point is that this Java language restriction is indeed restricting some otherwise possible useful features and may require ugly workarounds. I wonder what would happen if we needed withDefault to be virtual.
Now, to correlate with what polygenelubricants said, we use B here not to restrict the type of object passed as defaultValue (see the String used in the example), but rather to restrict the caller expectations about the object we return. As a simple rule, you use extends with the types you demand and super with the types you provide.
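For completeness, here is a runnable sketch of that static workaround (the field access goes through a hypothetical getter, since only the method signature above comes from the original):
class Nullable<A> {
    private final A value;
    Nullable(A value) { this.value = value; }
    A get() { return value; }

    // The static workaround: the constraint points the other way (A extends B).
    static <B, A extends B> B withDefault(Nullable<A> nullable, B defaultValue) {
        A v = nullable.get();
        return v == null ? defaultValue : v;
    }

    public static void main(String[] args) {
        Nullable<Integer> intOrNull = new Nullable<>(null);
        Integer i = withDefault(intOrNull, 8);
        Number n = withDefault(intOrNull, 3.5);
        Object o = withDefault(intOrNull, "What's so bad about a String here?");
        System.out.println(i + " " + n + " " + o);
    }
}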
The "official" answer to your question can be found in a Sun/Oracle bug report.
BT2:EVALUATION
See
http://lampwww.epfl.ch/~odersky/ftp/local-ti.ps
particularly section 3 and the last paragraph on page 9. Admitting
type variables on both sides of subtype constraints can result in a
set of type equations with no single best solution; consequently,
type inference cannot be done using any of the existing standard
algorithms. That is why type variables have only "extends" bounds.
Wildcards, on the other hand, do not have to be inferred, so there
is no need for this constraint.
####.### 2004-05-25
Yes; the key point is that wildcards, even when captured, are only used
as inputs of the inference process; nothing with (only) a lower bound needs
to be inferred as a result.
####.### 2004-05-26
I see the problem. But I do not see how it is different from the problems
we have with lower bounds on wildcards during inference, e.g.:
List<? super Number> s;
boolean b;
...
s = b ? s : s;
Currently, we infer List<X> where X extends Object as the type of the
conditional expression, meaning that the assignment is illegal.
####.### 2004-05-26
Sadly, the conversation ends there. The paper to which the (now dead) link used to point is Inferred Type Instantiation for GJ. From glancing at the last page, it boils down to: If lower bounds are admitted, type inference may yield multiple solutions, none of which is principal.
The only reason is that it makes no sense to declare a type parameter with the super keyword at the class (or method) level.
The only logical type-erasure strategy for such a parameter would have been to fall back to the supertype of all objects, which is the Object class.
A great example and explanation can be found here:
http://www.angelikalanger.com/GenericsFAQ/FAQSections/TypeParameters.html#Why%20is%20there%20no%20lower%20bound%20for%20type%20parameters?
A simple example for rules of type-erasure can be found here:
https://www.tutorialspoint.com/java_generics/java_generics_type_erasure.htm#:~:text=Type%20erasure%20is%20a%20process,there%20is%20no%20runtime%20overhead.
Suppose we have:
basic classes A > B > C and D
class A{
void methodA(){}
};
class B extends A{
void methodB(){}
}
class C extends B{
void methodC(){}
}
class D {
void methodD(){}
}
job wrapper classes
interface Job<T> {
void exec(T t);
}
class JobOnA implements Job<A>{
@Override
public void exec(A a) {
a.methodA();
}
}
class JobOnB implements Job<B>{
@Override
public void exec(B b) {
b.methodB();
}
}
class JobOnC implements Job<C>{
@Override
public void exec(C c) {
c.methodC();
}
}
class JobOnD implements Job<D>{
@Override
public void exec(D d) {
d.methodD();
}
}
and one manager class with 4 different approaches to executing a job on the object
class Manager<T>{
final T t;
Manager(T t){
this.t=t;
}
public void execute1(Job<T> job){
job.exec(t);
}
public <U> void execute2(Job<U> job){
U u= (U) t; //not safe
job.exec(u);
}
public <U extends T> void execute3(Job<U> job){
U u= (U) t; //not safe
job.exec(u);
}
//desired feature, not compiled for now
public <U super T> void execute4(Job<U> job){
U u= (U) t; //safe
job.exec(u);
}
}
with usage
void usage(){
B b = new B();
Manager<B> managerB = new Manager<>(b);
//TOO STRICT
managerB.execute1(new JobOnA());
managerB.execute1(new JobOnB()); //compiled
managerB.execute1(new JobOnC());
managerB.execute1(new JobOnD());
//TOO MUCH FREEDOM
managerB.execute2(new JobOnA()); //compiled
managerB.execute2(new JobOnB()); //compiled
managerB.execute2(new JobOnC()); //compiled !!
managerB.execute2(new JobOnD()); //compiled !!
//NOT ADEQUATE RESTRICTIONS
managerB.execute3(new JobOnA());
managerB.execute3(new JobOnB()); //compiled
managerB.execute3(new JobOnC()); //compiled !!
managerB.execute3(new JobOnD());
//SHOULD BE
managerB.execute4(new JobOnA()); //compiled
managerB.execute4(new JobOnB()); //compiled
managerB.execute4(new JobOnC());
managerB.execute4(new JobOnD());
}
Any suggestions on how to implement execute4 now?
========== edited ==========
public void execute4(Job<? super T> job){
job.exec( t);
}
Thanks to all :)
========== edited ==========
private <U> void execute2(Job<U> job){
U u= (U) t; //now it's safe
job.exec(u);
}
public void execute4(Job<? super T> job){
execute2(job);
}
Much better: inside execute2, the super type U becomes a named type parameter, so any code there can work with U.
Interesting discussion :)
I really like the accepted answer, but I would like to put a slightly different perspective on it.
super is supported in a type parameter only to allow contravariance capabilities. When it comes to covariance and contravariance, it's important to understand that Java only supports use-site variance, unlike Kotlin or Scala, which allow declaration-site variance. The Kotlin documentation explains it very well here. Or if you're more into Scala, here's one for you.
It basically means that in Java, you cannot limit, in terms of PECS, the way your class is going to be used when you declare it. The class can both consume and produce, and some of its methods can do both at the same time, like toArray(T[]), by the way.
Now, the reason extends is allowed in class and method declarations is that it's more about polymorphism than it is about variance. And polymorphism is an intrinsic part of Java and OOP in general: if a method can accept some supertype, a subtype can always safely be passed to it. And if a method, at its declaration site, as its "contract", should return some supertype, it's totally fine if it returns a subtype instead in its implementations.