Given some class SomeBaseClass, are these two method declarations equivalent?
public <T extends SomeBaseClass> void myMethod(Class<T> clz)
and
public void myMethod(Class<? extends SomeBaseClass> clz)
For the caller: yes, they are equivalent.
For the code inside the method: no.
The difference is that within the code of the first example you can use the type T (for example to hold an object created by clz.newInstance()), while in the second you can't.
No, they're not. With the first definition, you can use the type T inside the method definition, e.g. create an ArrayList<T> or return T. With the second definition, that's not possible.
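As a quick sketch of that difference (my own illustration, not from the question; it assumes java.util imports and that SomeBaseClass has a no-arg constructor), here are return-value variants of the two declarations:
// First form: T has a name, so the body can declare variables, collections and a return type of T.
public <T extends SomeBaseClass> T createWithTypeParam(Class<T> clz) throws Exception {
    T instance = clz.getDeclaredConstructor().newInstance(); // typed as T
    List<T> created = new ArrayList<>();                     // a collection of T
    created.add(instance);
    return instance;
}

// Second form: the element type has no name; the best the body can say is SomeBaseClass.
public SomeBaseClass createWithWildcard(Class<? extends SomeBaseClass> clz) throws Exception {
    return clz.getDeclaredConstructor().newInstance();
}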
Bounded wildcards are subject to certain restrictions to avoid heap pollution.
When you use the wildcard ? extends X you know you can read generic information, but you cannot write.
For instance
List<String> jedis = new ArrayList<String>();
jedis.add("Obiwan");
List<? extends CharSequence> ls = jedis;
CharSequence obiwan = ls.get(0); //Ok
ls.add(new StringBuffer("Anakin")); //Not Ok
The compiler prevents heap pollution when you try to add a CharSequence (here a StringBuffer) to the collection: because of the wildcard, it cannot know the actual element type of the list (in this example it is really String), so it cannot allow the write.
When you use ? super X you know you can write generic information, but you cannot be sure of the type of what you read.
For instance
List<Object> jedis = new ArrayList<Object>();
jedis.add("Obiwan");
List<? super String> ls = jedis;
ls.add("Anakin"); //Ok
String obiwan = ls.get(0); //Not Ok, we can't be sure the list is of Strings.
In this case, due to the wildcard, the compiler only knows that the actual collection could be of any supertype of String, so it cannot guarantee that what you get out will be a String.
These same restrictions apply to any declaration that uses bounded wildcards. They are typically known as the get/put principle.
By using a type parameter T you change the story: from the method's standpoint you are no longer dealing with a bounded wildcard but with an actual (if unknown) type, and therefore you can both "get" and "put" things into instances of the class without the compiler complaining.
For instance, consider the code of the Collections.sort method. If we wrote it as follows, we would get a compile error:
public static void sort(List<? extends Number> numbers) {
    Object[] a = numbers.toArray();
    Arrays.sort(a);
    ListIterator<? extends Number> i = numbers.listIterator();
    for (int j = 0; j < a.length; j++) {
        i.next();
        i.set((Number) a[j]); // Not Ok: the list's actual element type might be a subtype of Number
    }
}
But if you write it like this, it works:
public static <T extends Number> void sort(List<T> numbers) {
    Object[] a = numbers.toArray();
    Arrays.sort(a);
    ListIterator<T> i = numbers.listIterator();
    for (int j = 0; j < a.length; j++) {
        i.next();
        i.set((T) a[j]); // unchecked cast, but the elements are known to be of type T
    }
}
And you could even invoke the method with collections bounded with wildcards thanks to a thing called capture conversion:
List<? extends Number> ints = new ArrayList<Integer>();
List<? extends Number> floats = new ArrayList<Float>();
sort(ints);
sort(floats);
This could not be achieved otherwise.
In summary: as others said, from the caller's standpoint they are alike; from the implementation standpoint, they are not.
No. Off the top of my head, I can think of the following differences:
The two versions are not override-equivalent. For instance,
class Foo {
    public <T extends SomeBaseClass> void myMethod(Class<T> clz) { }
}
class Bar extends Foo {
    public void myMethod(Class<? extends SomeBaseClass> clz) { }
}
does not compile:
Name clash: The method myMethod(Class) of type Bar has the same erasure as myMethod(Class) of type Foo but does not override it
If a type parameter appears more than once in a method signature, it always represents the same type, but if a wildcard appears more than once, each occurrence may refer to a different type. For instance,
<T extends Comparable<T>> T max(T a, T b) {
return a.compareTo(b) > 0 ? a : b;
}
compiles, but
Comparable<?> max(Comparable<?> a, Comparable<?> b) {
return a.compareTo(b) > 0 ? a : b;
}
does not, because the latter may be called by
max(Integer.MAX_VALUE, "hello");
The method body may refer to the actual type used by the caller using a type parameter, but not using a wildcard type. For instance:
<T extends Comparable<T>> T max(T... ts) {
    if (ts.length == 0) {
        return null;
    }
    T max = ts[0];
    for (int i = 1; i < ts.length; i++) {
        if (max.compareTo(ts[i]) < 0) { // ts[i] is larger, so it becomes the new max
            max = ts[i];
        }
    }
    return max;
}
compiles.
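For contrast, a wildcard version of the same method (my own sketch, not from the answer) has no way to name the caller's element type, and does not compile:
// Does not compile: max is a Comparable<?>, and compareTo expects the captured (unknown) type.
static Comparable<?> max(Comparable<?>... ts) {
    Comparable<?> max = ts[0];
    for (int i = 1; i < ts.length; i++) {
        if (max.compareTo(ts[i]) < 0) { // compile error here
            max = ts[i];
        }
    }
    return max;
}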
@Mark @Joachim @Michael: see the example in JLS3 §5.1.10, Capture Conversion:
public static void reverse(List<?> list) { rev(list);}
private static <T> void rev(List<T> list){ ... }
So the <?> version can do anything the <T> version can do.
This would be easy to accept if the runtime were reified: a List<?> object must be a List<X> for some specific non-wildcard X anyway, and we could access this X at runtime, so there would be no difference between using a List<?> and a List<T>.
With type erasure, we have no access to T or X, so there's no difference either. We can insert a T into a List<T> - but where can you get a T object, if T is private to the invocation, and erased? There are two possibilities:
The T object is already stored in the List<T>, so we are manipulating the list's own elements. As the reverse/rev example shows, there's no problem doing this with a List<?> either.
It comes out-of-band: some other arrangement made by the programmer guarantees that an object somewhere else is of type T for this invocation. An unchecked cast must be used to override the compiler. Again, there's no problem doing the same thing with a List<?>.
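As a sketch of that second, out-of-band case (my own example, not from the original answer), the same capture trick used by reverse/rev works for a List<?> as well, with the unchecked cast made explicit:
public static void addUnchecked(List<?> list, Object element) {
    addUncheckedHelper(list, element);
}

// the cast relies on the caller's out-of-band guarantee that element really has the list's element type
@SuppressWarnings("unchecked")
private static <T> void addUncheckedHelper(List<T> list, Object element) {
    list.add((T) element);
}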
Related
<? extends T> makes for a read-only collection
<? super T> makes for a write-only collection
I somewhat get why one would use a read-only collection, for instance in a multithreaded environment (any other cases?).
But why use a write-only collection? What's the point if you cannot read from it and use its values at some point? I know that you can get an Object out of it but that defies type safety.
Edit:
@Thomas: the linked question (Difference between <? super T> and <? extends T> in Java) does show how to make a write-only collection, but it does not answer why you would need one in the first place. So it's not a duplicate.
Statements like
<? extends T> makes for a read-only collection
<? super T> makes for a write-only collection
are just wrong. Wildcard element types do not say anything about the ability to read or write.
To show counter examples:
static <T> void modify(List<? extends T> l) {
    l.sort(Comparator.comparing(Object::toString));
    l.remove(l.size() - 1);
    Collections.swap(l, 0, l.size() - 1);
    l.add(null);
    duplicateFirst(l);
}
static <U> void duplicateFirst(List<U> l) {
    U u = l.get(0);
    l.add(u);
}
shows quite some modifications possible for the List<? extends T>, without problems.
Likewise, you can read a List<? super T>.
static <T> void read(List<? super T> l) {
for(var t: l) System.out.println(t);
}
Usage restrictions imposed by ? extends T or ? super T are only in relation to T. You can not take an object of type T, e.g. from another method parameter, and add it to a List<? extends T>, because the list’s actual type might be a subtype of T. Likewise, you can not assume the elements of a List<? super T> to be of type T, because the list’s actual type might be a supertype of T, so the only assumption you can make, is that the elements are instances of Object, as every object is.
So when you have a method like
public static <T> void copy(List<? super T> dest, List<? extends T> src)
the method can not take elements from dest and add them to src (in a typesafe way), but only the other way round.
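A minimal sketch of why only that direction type-checks (my own illustration, not the real Collections.copy):
static <T> void copyFirst(List<? super T> dest, List<? extends T> src) {
    dest.add(src.get(0));    // OK: src produces (at least) a T, and dest consumes any T
    // src.add(dest.get(0)); // does not compile: dest only produces Object, src consumes an unknown subtype of T
}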
It’s important to emphasize that, unlike some other programming languages, Java has use-site variance, so the relationship between the two lists described above only applies to the copy method declaring this relationship. The lists passed to this method do not have to be “consumer of T” and “producer of T” throughout their entire lifetime.
So you can use the method like
List<Integer> first = List.of(0, 1, 2, 3, 7, 8, 9);
List<Number> second = new ArrayList<>(Collections.nCopies(7, null));
Collections.copy(second, first);
List<Object> third = new ArrayList<>(Collections.nCopies(11, " x "));
Collections.copy(third.subList(2, 9), second);
System.out.println(third);
Yes, copy was a real life example. Online demo
Note how the second list changes its role from consumer of Integer to producer of Object for the two copy invocations while its actual element type is Number.
Other examples for ? super T
Collections.fill(List<? super T> list, T obj)
Collections.addAll(Collection<? super T> c, T... elements)
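For instance, a quick usage sketch (my own, calling the real java.util.Collections methods named above):
List<Object> sink = new ArrayList<>(Collections.nCopies(3, null));
Collections.fill(sink, "x");       // T = String; a List<Object> is a List<? super String>
Collections.addAll(sink, 1, 2, 3); // T = Integer; the same list works as a Collection<? super Integer>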
To sum it up, in Java, rules like PECS are relevant for the declaration of methods, to determine the (typical) roles of the arguments within the method itself. This raises the flexibility for the caller, as it allows combining different invariant types, like the example of copying from a List<Integer> to a List<Number>.
But never assume that the generic types tell anything about the ability to read or write a collection.
Note that "write only collection" depends on the point of view.
Let's write a method that adds a bunch of numbers to a collection:
public static void addNumbers(List<? super Integer> target, int count) {
    for (int i = 0; i < count; i++) {
        target.add(i);
    }
}
For this method the list target is a write-only list: the method can only add numbers to it; it cannot use the values that it has added to the list.
On the other side there is the caller:
public static void caller() {
    List<Number> myList = new ArrayList<>();
    addNumbers(myList, 10);

    double sum = 0;
    for (Number n : myList) {
        sum += n.doubleValue();
    }
    System.out.println(sum);
}
This method works with a specific list (myList) and therefore can read the values that addNumbers stuffed into it.
For this method the list is not a write-only list; for this method it is an ordinary list.
I can't see why size(test); won't compile. Can someone help me understand?
public class TestContainer<T extends Object> {
}

public class Main {
    public static int size(List<TestContainer<?>> list) {
        return list.size();
    }

    public static void main(String[] args) {
        List<TestContainer<Object>> test = new ArrayList<TestContainer<Object>>();
        size(test);  // this does not compile
        test.size(); // of course this works fine
    }
}
List<TestContainer<Object>> is not a List<TestContainer<?>>.
A TestContainer<Integer> is a TestContainer<?>. Therefore, you can add a TC<Integer> to a List<TC<?>>. Similarly, you can add a TC<String> to a List<TC<?>>.
However, a TC<Integer> is not a TC<Object> (because generics are invariant in Java), so you mustn't be allowed to add a TC<Integer> to a List<TC<Object>>. As such, a List<TC<Object>> isn't a List<TC<?>>.
If you make it so that you are unable to add anything (except literal null) to the List - by making the method parameter List<? extends TC<?>> (or List<?>, if you are really uninterested in the elements) - it is safe, and thus allowed.
A method with a generic List parameter can accept a List whose element type is a subtype of the declared element type only if the List parameter is declared with an upper-bounded wildcard.
You don't declare any upper-bounded wildcard (<? extends T>) for the List<TestContainer<?>> parameter here:
public static int size(List<TestContainer<?>> list) {
return list.size();
}
So you can only pass exactly this type as the parameter:
List<TestContainer<?>>
To achieve what you want, you should write:
public static int size(List<? extends TestContainer<?>> list) {
return list.size();
}
Note that the upper bounded wildcard is designed to ensure the type safety of the passed List.
As a consequence, inside the method, you cannot add anything to the list but null.
But for your use case (returning the size of the list), it is not a problem.
I am new to Java. In this document they give this as a use case for using a wildcard:
static void printCollection(Collection c) {
    Iterator i = c.iterator();
    for (int k = 0; k < c.size(); k++) {
        System.out.println(i.next());
    }
}
This is their solution:
static void printCollection(Collection<?> c) {
    for (Object e : c) {
        System.out.println(e);
    }
}
But I could do the same without a wildcard:
static <T> void printCollection(Collection<T> c) {
    Iterator i = c.iterator();
    for (int k = 0; k < c.size(); k++) {
        System.out.println(i.next());
    }
}
Can someone show me a simple use case where regular generics won't work but a wildcard will?
Update: The answers over at When to use wildcards in Java Generics? do NOT tell us the need for wildcards. In fact it's the other way around.
One thing wildcards allow us to do is declare types that are agnostic towards a particular type parameter, for example a "list of any kind of list":
List<List<?>> listOfAnyList = ...;
listOfAnyList.add( new ArrayList<String>() );
listOfAnyList.add( new ArrayList<Double>() );
This is impossible without a wildcard,* because the element lists may have different types from each other.
And if we try to capture it, we will find that we can't:
static <E> void m(List<List<E>> listOfParticularList) {}
m( listOfAnyList ); // <- this won't compile
Another thing wildcards allow us to do that type parameters cannot is set a lower bound. (A type parameter can be declared with an extends bound, but not a super bound.**)
class Protector {
    private String secretMessage = "abc";

    void pass(Consumer<? super String> consumer) {
        consumer.accept(secretMessage);
    }
}
Suppose pass was instead declared to take a Consumer<String>. Now suppose we had a Consumer<Object>:
class CollectorOfAnything implements Consumer<Object> {
    private List<Object> myCollection = new ArrayList<>();

    @Override
    public void accept(Object anything) {
        myCollection.add(anything);
    }
}
The problem is: we can't pass it to a method accepting Consumer<String>. Declaring Consumer<? super String> means that we can pass any consumer which accepts a String. (Also see Java Generics: What is PECS?.)
Most of the time, wildcards just let us make tidy declarations.
If we don't need to use a type, we don't have to declare a type parameter for it.
* Technically also possible with a raw type, but raw types are discouraged.
** I don't know why Java doesn't allow super for a type parameter. 4.5.1. Type Arguments of Parameterized Types may hint that it has something to do with a limitation of type inference:
Unlike ordinary type variables declared in a method signature, no type inference is required when using a wildcard. Consequently, it is permissible to declare lower bounds on a wildcard […].
T stands for the generic type of that data structure. In your last example you don't use it, and it is NOT an actual type (for example String); since you don't use it, it doesn't really matter in this case.
For example, if you had a Collection<String> and tried to pass it to a method that accepts a Collection<T>, that works, because T is not a concrete type; it is considered a variable. If you tried passing the same Collection<String> to a method that accepts a Collection<Integer>, that would not work, because Integer there is a concrete type, not a variable.
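A sketch of what this answer seems to be getting at (my own, with hypothetical method names):
static <T> void acceptsAnything(Collection<T> c) { }   // T is a type variable
static void acceptsIntegers(Collection<Integer> c) { } // Integer is a concrete type

Collection<String> strings = new ArrayList<>();
acceptsAnything(strings);     // compiles: T is inferred as String
// acceptsIntegers(strings); // does not compile: Collection<String> is not a Collection<Integer>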
Take List as the example.
List<?> is a supertype of List<A> (for any A).
for instance,
List<B> bList = new ArrayList<>(); // B is a class defined in advance
List<?> list = bList;
You can never use <T> in this situation.
<?> supports wildcard capture.
Here,
void foo(List<?> i) {
    i.set(0, i.get(0));
}
The code above does not compile. You can fix it:
void foo(List<?> i) {
    fooHelper(i);
}

// the wildcard can be captured through type inference
private <T> void fooHelper(List<T> l) {
    l.set(0, l.get(0));
}
See more at http://docs.oracle.com/javase/tutorial/java/generics/capture.html
I can only think of these two currently; I may update later.
So I am reading about generic methods and I am getting confused. Let me state the problem here first:
In this example: suppose that I need a version of selectionSort that works for any type T, by using an external comparator supplied by the caller.
First attempt:
public static <T> void selectionSort(T[] arr, Comparator<T> myComparator){....}
Suppose that I have:
defined a Vehicle class,
created a VehicleComparator implementing Comparator<Vehicle>, which compares vehicles by their price,
created Truck extends Vehicle,
instantiated Truck[] arr and a VehicleComparator myComparator.
Now, I do:
selectionSort(arr, myComparator);
and it won't work, because myComparator is not available for any subclass of Vehicle.
Then, I do this:
public static <T> void selectionSort(T[] arr, Comparator<? super T> myComparator){....}
This declaration will work, but I'm not completely sure what I've been doing... I know "? super T" is the way to go. If "? super T" means "an unknown supertype of T", am I imposing an upper or a lower bound? Why is it super? My intention is to let any subclass of T use myComparator, so why "? super T"? So confused... I'd appreciate any insight into this.
Thanks ahead!
Firstly, you could have solved it by having Vehicle[] which you then added Trucks to.
The reason you need <? super T> goes back to the generics rule that Comparator<Vehicle> is not a subtype of Comparator<Truck> (generic types are invariant); a plain type parameter T must match exactly, which it doesn't.
In order for a suitable Comparator to be passed in, it must be a Comparator of the class being compared or of any superclass of it, because in OO languages any instance may be treated as an instance of its superclass. Thus, it doesn't matter what the generic type of the Comparator is, as long as it's a supertype of the array's component type.
The quizzical phrase ? super T means that the destination list may have elements of any type
that is a supertype of T, just as the source list may have elements of any type that is a
subtype of T.
We can see a pretty simple example, copy from Collections:
public static <T> void copy(List<? super T> dst, List<? extends T> src) {
    for (int i = 0; i < src.size(); i++) {
        dst.set(i, src.get(i));
    }
}
And call:
List<Object> objs = Arrays.<Object>asList(2, 3.14, "four");
List<Integer> ints = Arrays.asList(5, 6);
Collections.copy(objs, ints);
assert objs.toString().equals("[5, 6, four]");
As with any generic method, the type parameter may be inferred or may be given explicitly. In this case, there are four possible choices, all of which type-check and all of which have the same effect:
Collections.copy(objs, ints);
Collections.<Object>copy(objs, ints);
Collections.<Number>copy(objs, ints);
Collections.<Integer>copy(objs, ints);
Your method signature
public static <T> void selectionSort(T[] arr, Comparator<? super T> myComparator)
means that if you invoke it with an array of type T, then you must also provide a Comparator of type T or of a supertype of T.
For example if you have the following classes
class Vehicle {}
class Truck extends Vehicle {}
class BigTruck extends Truck {}

class VehicleComparator implements Comparator<Vehicle> {
    public int compare(Vehicle o1, Vehicle o2) {
        return 0;
    }
}

class BigTruckComparator implements Comparator<BigTruck> {
    public int compare(BigTruck o1, BigTruck o2) {
        return 0;
    }
}

class TruckComparator implements Comparator<Truck> {
    public int compare(Truck o1, Truck o2) {
        return 0;
    }
}
then this will work
Truck[] trucks = ...;
selectionSort(trucks, new TruckComparator());
selectionSort(trucks, new VehicleComparator());
Because
TruckComparator implements Comparator<Truck> and a Truck is equal to the array's type Truck
VehicleComparator implements Comparator<Vehicle> and a Vehicle is a super type of the array's type Truck
This will NOT WORK
selectionSort(trucks, new BigTruckComparator());
Because a BigTruckComparator is a Comparator<BigTruck> and a BigTruck is not a super type of the array's type Truck.
The two signatures are equivalent in terms of power -- for any set of arguments, if there exists a choice of type arguments that works for one of them, there exists a choice of type arguments that works for the other one, and vice versa.
You are simply running into limited inference in your compiler. Simply explicitly specify the desired type argument:
YourClass.<Vehicle>selectionSort(arr, myComparator);
Why can I use super only with wildcards and not with type parameters?
For example, in the Collection interface, why is the toArray method not written like this
interface Collection<T>{
<S super T> S[] toArray(S[] a);
}
Using super to bound a named type parameter (e.g. <S super T>), as opposed to a wildcard (e.g. <? super T>), is ILLEGAL simply because even if it were allowed, it wouldn't do what you'd hope it would do: since Object is the ultimate supertype of all reference types, and everything is an Object, in effect there would be no bound.
In your specific example, since any array of a reference type is an Object[] (by Java array covariance), it could therefore be used as an argument to <S super T> S[] toArray(S[] a) (if such a bound were legal) at compile time, and it wouldn't prevent an ArrayStoreException at run time.
What you're trying to propose is that given:
List<Integer> integerList;
and given this hypothetical super bound on toArray:
<S super T> S[] toArray(S[] a) // hypothetical! currently illegal in Java
the compiler should only allow the following to compile:
integerList.toArray(new Integer[0]) // works fine!
integerList.toArray(new Number[0]) // works fine!
integerList.toArray(new Object[0]) // works fine!
and no other array type arguments (since Integer only has those 3 types as super). That is, you're trying to prevent this from compiling:
integerList.toArray(new String[0]) // trying to prevent this from compiling
because, by your argument, String is not a super of Integer. However, Object is a super of Integer, and a String[] is an Object[], so the compiler still would let the above compile, even if hypothetically you can do <S super T>!
So the following would still compile (just as the way they are right now), and ArrayStoreException at run-time could not be prevented by any compile-time checking using generic type bounds:
integerList.toArray(new String[0]) // compiles fine!
// throws ArrayStoreException at run-time
Generics and arrays don't mix, and this is one of the many places where it shows.
A non-array example
Again, let's say that you have this generic method declaration:
<T super Integer> void add(T number) // hypothetical! currently illegal in Java
And you have these variable declarations:
Integer anInteger
Number aNumber
Object anObject
String aString
Your intention with <T super Integer> (if it's legal) is that it should allow add(anInteger), and add(aNumber), and of course add(anObject), but NOT add(aString). Well, String is an Object, so add(aString) would still compile anyway.
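In other words, an unbounded type parameter already accepts everything, which is all that the hypothetical <T super Integer> could ever amount to; a small sketch (mine, with a hypothetical method name):
static <T> void addAnything(T number) { } // no bound at all

addAnything(Integer.valueOf(1));  // T = Integer
addAnything(Double.valueOf(1.5)); // T = Double
addAnything("hello");             // T = String; nothing is rejected, so a super bound would buy nothing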
See also
Java Tutorials/Generics
Subtyping
More fun with wildcards
Related questions
On generics typing rules:
Any simple way to explain why I cannot do List<Animal> animals = new ArrayList<Dog>()?
java generics (not) covariance
What is a raw type and why shouldn’t we use it?
Explains how raw type List is different from List<Object> which is different from a List<?>
On using super and extends:
Java Generics: What is PECS?
From Effective Java 2nd Edition: "producer extends consumer super"
What is the difference between super and extends in Java Generics
What is the difference between <E extends Number> and <Number>?
How can I add to List<? extends Number> data structures? (YOU CAN'T!)
As no one has provided a satisfactory answer, the correct answer seems to be "for no good reason".
polygenelubricants provided a good overview of bad things happening with the java array covariance, which is a terrible feature by itself. Consider the following code fragment:
String[] strings = new String[1];
Object[] objects = strings;
objects[0] = 0;
This obviously wrong code compiles without resorting to any "super" construct, so array covariance should not be used as an argument.
Now, here I have a perfectly valid example of code requiring super in the named type parameter:
class Nullable<A> {
    private A value;

    // Does not compile!!
    public <B super A> B withDefault(B defaultValue) {
        return value == null ? defaultValue : value;
    }
}
Potentially supporting some nice usage:
Nullable<Integer> intOrNull = ...;
Integer i = intOrNull.withDefault(8);
Number n = intOrNull.withDefault(3.5);
Object o = intOrNull.withDefault("What's so bad about a String here?");
The latter code fragment does not compile if I remove the B altogether, so B is indeed needed.
Note that the feature I'm trying to implement is easily obtained if I invert the order of type parameter declarations, thus changing the super constraint to extends. However, this is only possible if I rewrite the method as a static one:
// This one actually works and I use it.
public static <B, A extends B> B withDefault(Nullable<A> nullable, B defaultValue) { ... }
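A usage sketch of that static variant (my own, assuming it is declared somewhere accessible alongside the Nullable class above):
Nullable<Integer> intOrNull = new Nullable<>();
Integer i = withDefault(intOrNull, 8);     // B inferred as Integer
Number n = withDefault(intOrNull, 3.5);    // B inferred as Number
Object o = withDefault(intOrNull, "text"); // B inferred as Object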
The point is that this Java language restriction is indeed restricting some otherwise possible useful features and may require ugly workarounds. I wonder what would happen if we needed withDefault to be virtual.
Now, to correlate with what polygenelubricants said, we use B here not to restrict the type of object passed as defaultValue (see the String used in the example), but rather to restrict the caller expectations about the object we return. As a simple rule, you use extends with the types you demand and super with the types you provide.
The "official" answer to your question can be found in a Sun/Oracle bug report.
BT2:EVALUATION
See
http://lampwww.epfl.ch/~odersky/ftp/local-ti.ps
particularly section 3 and the last paragraph on page 9. Admitting type variables on both sides of subtype constraints can result in a set of type equations with no single best solution; consequently, type inference cannot be done using any of the existing standard algorithms. That is why type variables have only "extends" bounds. Wildcards, on the other hand, do not have to be inferred, so there is no need for this constraint.
####.### 2004-05-25
Yes; the key point is that wildcards, even when captured, are only used as inputs of the inference process; nothing with (only) a lower bound needs to be inferred as a result.
####.### 2004-05-26
I see the problem. But I do not see how it is different from the problems we have with lower bounds on wildcards during inference, e.g.:
List<? super Number> s;
boolean b;
...
s = b ? s : s;
Currently, we infer List<X> where X extends Object as the type of the conditional expression, meaning that the assignment is illegal.
####.### 2004-05-26
Sadly, the conversation ends there. The paper to which the (now dead) link used to point is Inferred Type Instantiation for GJ. From glancing at the last page, it boils down to: If lower bounds are admitted, type inference may yield multiple solutions, none of which is principal.
The only reason is that it makes no sense to declare a type parameter with the super keyword when defining it at the class level.
The only logical type-erasure strategy for Java would have been to fall back to the supertype of all objects, which is the Object class.
A great example and explanation can be found here:
http://www.angelikalanger.com/GenericsFAQ/FAQSections/TypeParameters.html#Why%20is%20there%20no%20lower%20bound%20for%20type%20parameters?
A simple example for rules of type-erasure can be found here:
https://www.tutorialspoint.com/java_generics/java_generics_type_erasure.htm#:~:text=Type%20erasure%20is%20a%20process,there%20is%20no%20runtime%20overhead.
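As a rough sketch of the erasure rule those links describe (my own illustration): a type parameter bounded with extends erases to its bound, while an unbounded one erases to Object, which is also where any hypothetical super-bounded parameter would have to end up.
class BoundedBox<T extends Number> {
    T value; // after erasure: Number value;
}

class PlainBox<T> {
    T value; // after erasure: Object value;
}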
Suppose we have:
basic classes A > B > C and D
class A {
    void methodA() {}
}
class B extends A {
    void methodB() {}
}
class C extends B {
    void methodC() {}
}
class D {
    void methodD() {}
}
job wrapper classes
interface Job<T> {
    void exec(T t);
}
class JobOnA implements Job<A> {
    @Override
    public void exec(A a) {
        a.methodA();
    }
}
class JobOnB implements Job<B> {
    @Override
    public void exec(B b) {
        b.methodB();
    }
}
class JobOnC implements Job<C> {
    @Override
    public void exec(C c) {
        c.methodC();
    }
}
class JobOnD implements Job<D> {
    @Override
    public void exec(D d) {
        d.methodD();
    }
}
and one manager class with four different approaches to executing a job on the object
class Manager<T> {
    final T t;

    Manager(T t) {
        this.t = t;
    }

    public void execute1(Job<T> job) {
        job.exec(t);
    }

    public <U> void execute2(Job<U> job) {
        U u = (U) t; //not safe
        job.exec(u);
    }

    public <U extends T> void execute3(Job<U> job) {
        U u = (U) t; //not safe
        job.exec(u);
    }

    //desired feature, not compiled for now
    public <U super T> void execute4(Job<U> job) {
        U u = (U) t; //safe
        job.exec(u);
    }
}
with usage
void usage(){
B b = new B();
Manager<B> managerB = new Manager<>(b);
//TOO STRICT
managerB.execute1(new JobOnA());
managerB.execute1(new JobOnB()); //compiled
managerB.execute1(new JobOnC());
managerB.execute1(new JobOnD());
//TOO MUCH FREEDOM
managerB.execute2(new JobOnA()); //compiled
managerB.execute2(new JobOnB()); //compiled
managerB.execute2(new JobOnC()); //compiled !!
managerB.execute2(new JobOnD()); //compiled !!
//NOT ADEQUATE RESTRICTIONS
managerB.execute3(new JobOnA());
managerB.execute3(new JobOnB()); //compiled
managerB.execute3(new JobOnC()); //compiled !!
managerB.execute3(new JobOnD());
//SHOULD BE
managerB.execute4(new JobOnA()); //compiled
managerB.execute4(new JobOnB()); //compiled
managerB.execute4(new JobOnC());
managerB.execute4(new JobOnD());
}
Any suggestions on how to implement execute4 now?
==========edited =======
public void execute4(Job<? super T> job){
job.exec( t);
}
Thanks to all :)
========== edited ==========
private <U> void execute2(Job<U> job){
U u= (U) t; //now it's safe
job.exec(u);
}
public void execute4(Job<? super T> job){
execute2(job);
}
Much better: in any code inside execute2, the super type U becomes a named type!
Interesting discussion :)
I really like the accepted answer, but I would like to put a slightly different perspective on it.
super is supported only at the use site (on wildcards) in order to provide contravariance capabilities. When it comes to covariance and contravariance, it's important to understand that Java only supports use-site variance, unlike Kotlin or Scala, which allow declaration-site variance. The Kotlin documentation explains it very well here, or if you're more into Scala, here's one for you.
It basically means that in Java you cannot limit, when you declare a class, how it is going to be used in terms of PECS. The class can both consume and produce, and some of its methods can do it at the same time, like toArray(T[] a), by the way.
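For instance (my own minimal illustration): a java.util.List both consumes and produces its element type, and toArray does both in one call.
List<Number> numbers = new ArrayList<>();
numbers.add(42);                                 // the list consumes a Number
Number first = numbers.get(0);                   // ... and produces one
Number[] array = numbers.toArray(new Number[0]); // toArray consumes an array and produces one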
Now, the reason extends is allowed in class and method declarations is that it's more about polymorphism than about variance. And polymorphism is an intrinsic part of Java and OOP in general: if a method can accept some supertype, a subtype can always safely be passed to it. And if a method, at the declaration site, promises as its "contract" to return some supertype, it's totally fine if its implementations return a subtype instead.