Java thenComparing wildcard signature

Why does the declaration look like this:
default <U extends Comparable<? super U>> Comparator<T> thenComparing(
Function<? super T, ? extends U> keyExtractor)
I understand most of it. It makes sense that U can be anything as long as it's comparable to a superclass of itself, and thus also comparable to itself.
But I don't get this part: Function<? super T, ? extends U>
Why not just have: Function<? super T, U>
Can't the U just parameterize to whatever the keyExtractor returns, and still extend Comparable<? super U> all the same?

Why is it ? extends U and not U?
Because of code conventions. Check out #deduper's answer for a great explanation.
Is there any actual difference?
When writing your code normally, your compiler will infer the correct T for things like Supplier<T> and Function<?, T>, so there is no practical reason to write Supplier<? extends T> or Function<?, ? extends T> when developing an API.
But what happens if we specify the type manually?
void test() {
    Supplier<Integer> supplier = () -> 0;
    this.strict(supplier); // OK (1)
    this.fluent(supplier); // OK
    this.<Number>strict(supplier); // compile error (2)
    this.<Number>fluent(supplier); // OK (3)
}

<T> void strict(Supplier<T> supplier) {}
<T> void fluent(Supplier<? extends T> supplier) {}
As you can see, strict() works fine without an explicit declaration because T is inferred as Integer to match the local variable's generic type.
Then it breaks when we try to pass Supplier<Integer> as Supplier<Number> because Integer and Number are not compatible.
And then it works with fluent() because ? extends Number and Integer are compatible.
In practice this can matter only when you have multiple generic type parameters, need to specify one of them explicitly, and get the other one (the Supplier one) wrong, for example:
void test() {
    Supplier<Integer> supplier = () -> 0;
    // If one wants to specify T, then they are forced to specify U as well:
    System.out.println(this.<List<?>, Number>method(supplier)); // compile error
    // And if U happens to be incorrect, then the code won't compile.
}

<T, U> T method(Supplier<U> supplier);
Example with Comparator (original answer)
Consider this variant of the Comparator.comparing method signature, with the upper-bounded wildcard removed from the Function's result type:
public static <T, U extends Comparable<? super U>> Comparator<T> comparing(
Function<? super T, U> keyExtractor
)
Also, here is a small test class hierarchy:
class A implements Comparable<A> {
    public int compareTo(A object) { return 0; }
}
class B extends A { }
Now let's try this:
Function<Object, B> keyExtractor = null;
Comparator.<Object, A>comparing(keyExtractor); // compile error
error: incompatible types: Function<Object,B> cannot be converted to Function<? super Object,A>
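For contrast, the real JDK signature uses Function<? super T, ? extends U>, and the equivalent call compiles. A minimal sketch (comparingFlex is a hypothetical stand-in that copies the JDK-style bound; A and B are the classes above):
import java.util.Comparator;
import java.util.function.Function;

class FlexDemo {
    // Same as comparing() above, but with the JDK-style upper-bounded wildcard
    static <T, U extends Comparable<? super U>> Comparator<T> comparingFlex(
            Function<? super T, ? extends U> keyExtractor) {
        return (x, y) -> keyExtractor.apply(x).compareTo(keyExtractor.apply(y));
    }

    void demo() {
        Function<Object, B> keyExtractor = obj -> new B();
        // Compiles: Function<Object, B> is a Function<? super Object, ? extends A>
        Comparator<Object> cmp = FlexDemo.<Object, A>comparingFlex(keyExtractor);
    }
}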

TL;DR:
Comparator.thenComparing(Function< ? super T, ? extends U > keyExtractor) (the method your question specifically asks about) might be declared that way as an idiomatic/house coding convention thing that the JDK development team is mandated to follow for reasons of consistency throughout the API.
The long-winded version
„…But I don't get this part: Function<? super T, ? extends U>…“
That part is placing a constraint on the specific type that the Function must return. It sounds like you got that part down already though.
The U the Function returns is not just any old U, however. It must have the specific properties (a.k.a „bounds“) declared in the method's parameter section: <U extends Comparable<? super U>>.
„…Why not just have: Function<? super T, U>…“
To put it as simply as I can (because I only think of it simply, not formally): the reason is that U is not the same type as ? extends U.
Changing Comparable< ? super U > to List< ? super U > and Comparator< T > to Set< T > might make your quandary easier to reason about…
default < U extends List< ? super U > > Set< T > thenComparing(
Function< ? super T, ? extends U > keyExtractor ) {
T input = …;
/* Intuitively, you'd think this would be compliant; it's not! */
/* List< ? extends U > wtf = keyExtractor.apply( input ); */
/* This doesn't comply to „U extends List< ? super U >“ either */
/* ArrayList< ? super U > key = keyExtractor.apply( input ); */
/* This is compliant because key is a „List extends List< ? super U >“
* like the method declaration requires of U
*/
List< ? super U > key = keyExtractor.apply( input );
/* This is compliant because List< E > is a subtype of Collection< E > */
Collection< ? super U > superKey = key;
…
}
„Can't the U just parameterize to whatever the keyExtractor returns, and still extend Comparable<? super U> all the same?…“
I have established experimentally that Function< ? super T, ? extends U > keyExtractor could indeed be refactored to the more restrictive Function< ? super T, U > keyExtractor and still compile and run perfectly fine. For example, comment/uncomment the /*? extends*/ on line 27 of my experimental UnboundedComparator to observe that all of these calls succeed either way…
…
Function< Object, A > aExtractor = ( obj )-> new B( );
Function< Object, B > bExtractor = ( obj )-> new B( ) ;
Function< Object, C > cExtractor = ( obj )-> new C( ) ;
UnboundedComparator.< Object, A >comparing( aExtractor ).thenComparing( bExtractor );
UnboundedComparator.< Object, A >comparing( bExtractor ).thenComparing( aExtractor );
UnboundedComparator.< Object, A >comparing( bExtractor ).thenComparing( bExtractor );
UnboundedComparator.< Object, B >comparing( bExtractor ).thenComparing( bExtractor );
UnboundedComparator.< Object, B >comparing( bExtractor ).thenComparing( aExtractor );
UnboundedComparator.< Object, B >comparing( bExtractor ).thenComparing( cExtractor );
…
Technically, you could do the equivalent bound-removal in the real code. From the simple experimentation I've done (on thenComparing() specifically, since that's what your question asks about), I could not find any practical reason to prefer ? extends U over U.
But, of course, I have not exhaustively tested every use case for the method with and without the ? extends bound.
I would be surprised if the developers of the JDK haven't exhaustively tested it though.
My experimentation — limited, I admit — convinced me that Comparator.thenComparing(Function< ? super T, ? extends U > keyExtractor) might be declared that way for no other reason than as an idiomatic/house coding convention thing that the JDK development team follows.
Looking at the code base of the JDK it's not unreasonable to presume that somebody somewhere has decreed: «Wherever there's a Function< T, R > the T must have a lower bound (a consumer/you input something) and the R must have an upper bound (a producer/you get something returned to you)».
For obvious reasons though, U is not the same as ? extends U. So the former should not be expected to be substitutable for the latter.
Applying Occam's razor: It's simpler to expect that the exhaustive testing the implementers of the JDK have done has established that the U -upper bounded wildcard is necessary to cover a wider number of use cases.

It seems like your question is about type arguments in general, so for simplicity I will separate the type arguments you provided from the types they belong to.
First, we should note that code using a reference of a wildcard-parameterized type cannot access the members whose type is the (unknown) type argument, except in very limited ways. This is why, in your specific case, ? extends U can be substituted for U and still work fine.
That won't hold in every case, though. The type argument U does not have the versatility and additional type safety that ? extends U has. Wildcards are a unique kind of type argument: instantiations of parameterized types with wildcard arguments are less restricted by the type argument than they would be if the argument were a concrete type or a type parameter. A wildcard is basically a placeholder that is more general than a type parameter or a concrete type. The first sentence of the Java tutorial on wildcards reads:
In generic code, the question mark (?), called the wildcard, represents an unknown type.
To illustrate this point, take a look at this:
class A<T> {}
Now let's make two declarations of this class, one with a concrete type argument and the other with a wildcard, and then instantiate them:
A<Number> aConcrete = new A<Integer>(); // Compile time error
A<? extends Number> aWild = new A<Integer>(); // Works fine
So that should illustrate how a wildcard type argument does not restrict the instantiation as much as a concrete type. But what about a type parameter? The problem with using type parameters is best seen in a method. To illustrate, examine this class:
class C<U> {
    void parameterMethod(A<U> a) {}
    void wildMethod(A<? extends U> a) {}
    void test() {
        C<Number> c = new C();
        A<Integer> a = new A();
        c.parameterMethod(a); // Compile time error
        c.wildMethod(a); // Works fine
    }
}
Notice how the references c and a are concrete types. This was addressed in another answer, but what wasn't addressed there is how the concept of type arguments relates to the compile-time error (why one type argument causes a compile-time error and the other doesn't). That relation is the reason the declaration in question is written with the syntax it has: it is the additional type safety and versatility that wildcards provide over type parameters, NOT some typing convention. To illustrate this point, we will have to give A a member of the parameter type, so:
class A<T> { T something; }
The danger of using a type parameter in the parameterMethod() is that the type parameter can be referred to in the form of a cast, which enables access to the something member.
class C<U> {
    void parameterMethod(A<U> a) { a.something = (U) "Hi"; } // unchecked cast
}
Which in turn enables the possibility of heap pollution. With this implementation of parameterMethod, the statement C<Number> c = new C(); in the test() method could cause heap pollution. For this reason, the compiler issues a compile-time error when a method whose argument is of the parameter type is passed any object without a cast from outside the type parameter's declaring class; equally, a member of the parameter type will trigger a compile-time error if it is assigned any object without a cast from outside the type parameter's declaring class. The really important thing to stress here is without a cast: you can still pass objects to a method with an argument of the parameter type, but they must be cast to that type parameter (or, in this case, cast to the type containing the type parameter). In my example
void test() {
    C<Number> c = new C();
    A<Integer> a = new A();
    c.parameterMethod(a); // Compile time error
    c.wildMethod(a); // Works fine
}
the c.parameterMethod(a) would work if a were cast to A<U>, so if the line looked like this: c.parameterMethod((A<U>) a); no compile-time error would occur, but you would get a run-time ClassCastException if you tried to set an int variable equal to a.something after parameterMethod() is called (and again, the compiler requires the cast because U could represent anything). This whole scenario would look like this:
void test() {
    C<Number> c = new C();
    A<Integer> a = new A();
    c.parameterMethod((A<U>) a); // No compile time error because of the cast
    int x = a.something; // compiles, but causes a run-time ClassCastException
}
So, because a type parameter can be referenced in the form of a cast, it is illegal to pass an object from outside the type parameter's declaring class to a method whose argument is of (or contains) the parameter type without such a cast. A wildcard cannot be referenced in the form of a cast, so the a in wildMethod(A<? extends U> a) could not access the T member of A. Because of this additional type safety, because this possibility of heap pollution is avoided with a wildcard, the Java compiler does permit a concrete type being passed to wildMethod without a cast when it is invoked through the reference c in C<Number> c = new C();. Equally, this is why a wildcard-parameterized type can be instantiated to a concrete type without a cast. When I say versatility of type arguments, I am talking about which instantiations they permit in their role in a parameterized type; and when I say additional type safety, I am talking about the inability to reference wildcards in the form of a cast, which is what circumvents heap pollution.
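To make that concrete, here is a minimal sketch (reusing the A and C shapes from above) of why the wildcard version cannot pollute the heap:
class C<U> {
    void wildMethod(A<? extends U> a) {
        U read = a.something; // reading works: whatever the wildcard is, it is a U
        // a.something = ...; // impossible to write: the wildcard cannot be named,
        //                    // so no cast exists and nothing (except null) can be stored
    }
}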
I don't know why someone would cast a type parameter, but I do know a developer would at least enjoy the versatility of wildcards over a type parameter. I may have written this confusingly, or perhaps misunderstood your question; it seems to me to be about type arguments in general rather than this specific declaration. Also, if keyExtractor from the declaration Function<? super T, ? extends U> keyExtractor is used in a way that never accesses the members of Function typed by the second type parameter, then again, wildcards are ideal, because they can't access those members anyway; so why wouldn't a developer want the versatility wildcards provide? It's only a benefit.

Related

Method taking an Object and any of its superclasses

The goal: a method signal should take
any Object N
any of the Object's superclasses Class<? super N>
<N> void signal(N n, Class<? super N> n_super) {
    /*...*/
}
It should be ok to call
Object object=new Object();
signal(object, object.getClass());
since Object is a supertype of object. But calling it gives a warning. In the IDEs' words:
IntelliJ (Android Studio)
Wrong 2nd argument type. Found Class<? extends Object>, required: Class<? super Object>
Eclipse
The method signal(N, Class<? super N>)
is not applicable for the arguments (Object, Class<? extends Object>)
Questions.
Can the goal be achieved the way I have tried, and if yes,
how can the warning be eliminated?
Turning my comments into an answer.
Two approaches I can quickly think of:
Use Object.class directly here. signal(object, Object.class); compiles fine with no warnings on my Java version.
Change your method signature to something like:
<N, M extends N> void signal(N n, Class<? super M> n_super)
which should let you call it the way you already are (as in signal(object, object.getClass());).
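Here is a minimal self-contained sketch of approach 2 (the printing body is just an assumption for illustration):
class SignalDemo {
    static <N, M extends N> void signal(N n, Class<? super M> n_super) {
        System.out.println(n + " via " + n_super.getName());
    }

    public static void main(String[] args) {
        Object object = new Object();
        signal(object, object.getClass()); // compiles without the warning
    }
}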
Use something like this (the raw Class type sidesteps the generic check):
Object object=new Object();
Class c = object.getClass();
signal(object, c);

Not able to understand a complex parameterized return type - Java

I have come across a snippet similar to this in Java
public <H extends ABC<I, U>, I, U> Set<U> get(Type<H, I, U> type) {
    // ...
}
I do not understand this.
I only understand that it takes a parameter of the parameterized type Type<H,I,U> and that it returns a Set<U> which is the return type.
But, I do not understand the part:
<H extends ABC<I,U>, I,U>
Can anybody clarify it?
It means that
The method takes three type parameters: H, I, and U
I and U can be anything
H must extend ABC<I, U> (or it can actually be ABC<I, U>)
So presumably you have a type ABC which has two type parameters. This is saying that, for instance, this would be valid:
Type<ABC<String, Date>, String, Date> type = new Type<>();
Set<Date> set = instance.get(type);
...because there H would be ABC<String, Date>, I would be String, and U would be Date.
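A self-contained sketch of that scenario (the ABC and Type shapes and the placeholder body are assumptions based on the question):
import java.util.Collections;
import java.util.Date;
import java.util.Set;

class ABC<I, U> { }

class Type<H extends ABC<I, U>, I, U> { }

class Service {
    public <H extends ABC<I, U>, I, U> Set<U> get(Type<H, I, U> type) {
        return Collections.emptySet(); // placeholder body
    }
}

class TypeDemo {
    void demo() {
        Type<ABC<String, Date>, String, Date> type = new Type<>();
        Set<Date> set = new Service().get(type); // U is inferred as Date
    }
}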

Where does the Java spec say List<T> assigns to List<? super T>?

Assume class B inherits from class A. The following is legal Java:
List<A> x;
List<? super B> y = x;
In terms of the specification, this means that List<A> assignsTo List<? super B>. However, I am having trouble finding the part of the spec that says this is legal. In particular, I believe we should have the subtype relation
List<A> <: List<? super B>
but section 4.10 of the Java 8 spec defines the subtype relation as the transitive closure of a direct supertype relation S >1 T, and it defines the direct supertype relation in terms of a finite function which computes a set of supertypes of T. There is no bounded function which on input List<A> can produce List<? super B> since there might be an arbitrary number of Bs that inherit from A, so the spec's subtype definition seems to break down for super wildcards. Section 4.10.2 on "Subtyping among class and interface types" does mention wildcards, but it handles only the other direction where the wildcard appears in the potential subtype (this direction fits into the computed direct supertype mechanism).
Question: What part of the spec says that the above code is legal?
The motivation is for compiler code, so it's not enough to understand why it is legal intuitively or come up with an algorithm that handles it. Since the general subtyping problem in Java is undecidable, I would like to handle exactly the same cases as the spec, and therefore want the part of the spec that handles this case.
List<? super B> is defined to be a supertype of List<A> by §4.10.2. Subtyping among Class and Interface Types:
The direct supertypes of the parameterized type C<T1,...,Tn>, where Ti
(1 ≤ i ≤ n) is a type, are all of the following:
D<U1 θ,...,Uk θ>, where D<U1,...,Uk> is a direct supertype of C<T1,...,Tn> and θ is the substitution [F1:=T1,...,Fn:=Tn].
C<S1,...,Sn>, where Si contains Ti (1 ≤ i ≤ n) (§4.5.1).
Let C<T1,...,Tn> = List<A> and C<S1,...,Sn> = List<? super B>.
According to the second bullet, List<? super B> is a supertype of List<A> if ? super B contains A.
The contains relation is defined in §4.5.1. Type Arguments and Wildcards:
A type argument T1 is said to contain another type argument T2, written T2 <= T1, if the set of types denoted by T2 is provably a subset of the set of types denoted by T1 under the reflexive and transitive closure of the following rules (where <: denotes subtyping (§4.10)):
? extends T <= ? extends S if T <: S
? super T <= ? super S if S <: T
T <= T
T <= ? extends T
T <= ? super T
By the second bullet, we can see that ? super B contains ? super A. By the last bullet, we see that ? super A contains A. Transitively, we therefore know that ? super B contains A.
What does assigning the list to <? super B> actually mean?
Consider the following program:
import java.util.ArrayList;
import java.util.List;

public class Generics {
    static class Quux { }
    static class Foo extends Quux { }
    static class Bar extends Foo { }

    public static void main(String... args) {
        List<Foo> fooList = new ArrayList<>();
        // This is legal Java
        List<? super Bar> superBarList = fooList;
        // So is this
        List<? super Foo> superFooList = fooList;
        // However, this is *not* legal Java
        superBarList.add(new Quux());
        // Neither is this
        superFooList.add(new Quux());
        // Or this:
        superFooList.add(new Object());
        // But this is fine
        superFooList.add(new Foo());
    }
}
Why would this be? First of all, let's talk about what the JLS says
From the JLS, §4.5.1:
A type argument T1 is said to contain another type argument T2, written T2 <= T1, if the set of types denoted by T2 is provably a subset of the set of types denoted by T1 under the reflexive and transitive closure of the following rules (where <: denotes subtyping (§4.10)):
? super T <= ? super S if S <: T
T <= ? super T
Therefore, T <= ? super S if S <: T.
... but what does THAT mean?
So what does List<? super Foo> mean, if I can't add a new Quux() or a new Object()? It means the list's element type is some particular supertype of Foo (possibly Foo itself), but I don't know which one it happens to be. In other words, I can declare the list with such a type, but I cannot add elements to it unless I am 100% certain they are assignable to whatever that unknown type is, and the only such values are Foos (and subtypes of Foo). Quux could be that type, but it might also not be.
For this reason, assigning a List<Foo> to a List<? super Bar> doesn't allow heap pollution, and ultimately isn't a problem.
Further reading: the relevant section of Angelika Langer's generics FAQ

Diamond in Generics Java 1.7 - how to write this for Java Compiler in 1.6

How can I write Java 1.7 code for a Java 1.6 compiler, where the diamond cannot be used?
Example:
private ReplacableTree<E> convertToIntended(Tree<? extends E> from, ReplacableTree<E> to) {
    TreeIterator<? extends E> it = new TreeIterator<>(from.getRoot());
    while (it.hasNext()) {
        E e = it.next().getElem();
        to.add(e);
    }
    return to;
}
public class TreeIterator<E> implements TreeIter<Node<E>> {
....
}
It is not allowed to write...
TreeIterator<? extends E> it = new TreeIterator<?>(from.getRoot());
TreeIterator<? extends E> it = new TreeIterator<E>(from.getRoot());
TreeIterator<? extends E> it = new TreeIterator<? extends E>(from.getRoot());
Especially the third one is confusing to me. Why doesn't it work? I just want to read elements from a Tree (which could be a tree of a subtype), and then push each of them into a new Tree with elements of type E.
Wildcard types are not permitted as type arguments in class instance creation expressions:
It is a compile-time error if any of the type arguments used in a class instance creation expression are wildcard type arguments (§4.5.1).
so the first and third variants are not valid.
Variant 2 is invalid because the TreeIterator<E> constructor wants a Node<E>, but you give it a Node<? extends E>.
As for the solution: Java 5 and 6 did not have type inference for constructors, but they did have type inference for methods, and in particular capture conversion. The following ought to compile:
TreeIterator<? extends E> it = makeIterator(from.getRoot());
where
private <E> TreeIterator<E> makeIterator(Node<E> node) {
    return new TreeIterator<E>(node);
}
Edit: You asked in the comment:
The constructor parameter type for TreeIterator is Node<E>. The constructor parameter of Node<E> therefore is E. When writing variant two, Eclipse says the following: The constructor TreeIterator<E>(Node<capture#2-of ? extends E>) is undefined. What does that mean?
Being a wildcard type, the type Node<? extends E> represents a family of types. Node<capture#2-of ? extends E> refers to a specific type in that family. That distinction is irrelevant in this case. What matters is that Node<? extends E> is not a subtype of Node<E>, and hence you can't pass an instance of Node<? extends E> to a constructor expecting a Node<E>.
In short, you don't write Java 7 code for a Java 6 compiler; you have to use the old, duplicative non-diamond syntax. And no, you can't specify a target of 1.6 with source 1.7; it won't work!
meriton already explained it well. I just want to suggest that you could as well do it without the wildcard declaration:
TreeIterator<E> it = new TreeIterator<E>(from.getRoot());
Usually, <> means to just use the same type parameter as in the declaration to the left. But in this case, that declaration is a wildcard.
It doesn't make sense to make a constructor with a wildcard type parameter, new TreeIterator<? extends E>(...) because, usually, if you don't care what parameter to use, you should just pick any type that satisfies that bound; which could be E, or any subtype thereof.
However, in this case, that doesn't work because the constructor of TreeIterator<E> takes an object with the type parameter <E>. You didn't show the source code of TreeIterator, so I can't see what it does, but chances are that its bound is too strict. It could probably be refactored to make the type parameter <? extends E>.
But there are some cases where that is not possible. In such a case, you can still eliminate the need for the type parameter E through a "capture helper" (what meriton suggests above) to turn something which takes parameter E into something that takes a wildcard ? extends E.
I know this is an old question, but in case someone stumbles on this, I would have thought the most obvious way of writing it would have been:
private <U extends E> ReplaceableTree<E> convertToIntended(Tree<U> from, ReplaceableTree<E> to) {
    TreeIterator<U> it = new TreeIterator<U>(from.getRoot());
    while (it.hasNext()) {
        E e = it.next().getElem();
        to.add(e);
    }
    return to;
}
I don't think such a change would break existing code as the type constraints are the same from the existing signature to this one.

Bounding generics with 'super' keyword

Why can I use super only with wildcards and not with type parameters?
For example, in the Collection interface, why is the toArray method not written like this
interface Collection<T> {
    <S super T> S[] toArray(S[] a);
}
Using super to bound a named type parameter (e.g. <S super T>), as opposed to a wildcard (e.g. <? super T>), is ILLEGAL simply because even if it were allowed, it wouldn't do what you hope it would do: since Object is the ultimate supertype of all reference types, and everything is an Object, in effect there is no bound.
In your specific example, since any array of a reference type is an Object[] (by Java array covariance), it could therefore be used as an argument to <S super T> S[] toArray(S[] a) (if such a bound were legal) at compile-time, and it wouldn't prevent an ArrayStoreException at run-time.
What you're trying to propose is that given:
List<Integer> integerList;
and given this hypothetical super bound on toArray:
<S super T> S[] toArray(S[] a) // hypothetical! currently illegal in Java
the compiler should only allow the following to compile:
integerList.toArray(new Integer[0]) // works fine!
integerList.toArray(new Number[0]) // works fine!
integerList.toArray(new Object[0]) // works fine!
and no other array type arguments (since those are the only 3 classes an Integer is an instance of). That is, you're trying to prevent this from compiling:
integerList.toArray(new String[0]) // trying to prevent this from compiling
because, by your argument, String is not a super of Integer. However, Object is a super of Integer, and a String[] is an Object[], so the compiler still would let the above compile, even if hypothetically you can do <S super T>!
So the following would still compile (just as the way they are right now), and ArrayStoreException at run-time could not be prevented by any compile-time checking using generic type bounds:
integerList.toArray(new String[0]) // compiles fine!
// throws ArrayStoreException at run-time
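A minimal runnable sketch of that failure, using the real <T> T[] toArray(T[] a) signature:
import java.util.Arrays;
import java.util.List;

class ToArrayDemo {
    public static void main(String[] args) {
        List<Integer> integerList = Arrays.asList(1, 2, 3);
        // Compiles fine (T is inferred as String), but copying Integers
        // into a String[] throws ArrayStoreException at run-time:
        Object[] result = integerList.toArray(new String[0]);
    }
}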
Generics and arrays don't mix, and this is one of the many places where it shows.
A non-array example
Again, let's say that you have this generic method declaration:
<T super Integer> void add(T number) // hypothetical! currently illegal in Java
And you have these variable declarations:
Integer anInteger;
Number aNumber;
Object anObject;
String aString;
Your intention with <T super Integer> (if it's legal) is that it should allow add(anInteger), and add(aNumber), and of course add(anObject), but NOT add(aString). Well, String is an Object, so add(aString) would still compile anyway.
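In other words, the hypothetical lower bound would buy you nothing over a plain Object parameter, which already accepts all four variables:
void add(Object number) {} // accepts anInteger, aNumber, anObject, and also aString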
See also
Java Tutorials/Generics
Subtyping
More fun with wildcards
Related questions
On generics typing rules:
Any simple way to explain why I cannot do List<Animal> animals = new ArrayList<Dog>()?
java generics (not) covariance
What is a raw type and why shouldn’t we use it?
Explains how raw type List is different from List<Object> which is different from a List<?>
On using super and extends:
Java Generics: What is PECS?
From Effective Java 2nd Edition: "producer extends consumer super"
What is the difference between super and extends in Java Generics
What is the difference between <E extends Number> and <Number>?
How can I add to List<? extends Number> data structures? (YOU CAN'T!)
As no one has provided a satisfactory answer, the correct answer seems to be "for no good reason".
polygenelubricants provided a good overview of the bad things that happen with Java array covariance, which is a terrible feature by itself. Consider the following code fragment:
String[] strings = new String[1];
Object[] objects = strings;
objects[0] = 0; // compiles, but throws ArrayStoreException at run-time
This obviously wrong code compiles without resorting to any "super" construct, so array covariance should not be used as an argument.
Now, here I have a perfectly valid example of code requiring super in the named type parameter:
class Nullable<A> {
    private A value;

    // Does not compile!!
    public <B super A> B withDefault(B defaultValue) {
        return value == null ? defaultValue : value;
    }
}
Potentially supporting some nice usage:
Nullable<Integer> intOrNull = ...;
Integer i = intOrNull.withDefault(8);
Number n = intOrNull.withDefault(3.5);
Object o = intOrNull.withDefault("What's so bad about a String here?");
The latter code fragment does not compile if I remove the B altogether, so B is indeed needed.
Note that the feature I'm trying to implement is easily obtained if I invert the order of type parameter declarations, thus changing the super constraint to extends. However, this is only possible if I rewrite the method as a static one:
// This one actually works and I use it.
public static <B, A extends B> B withDefault(Nullable<A> nullable, B defaultValue) { ... }
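Usage of the static variant then looks like this (a quick sketch; inference picks B from both arguments):
Nullable<Integer> intOrNull = ...;
Integer i = withDefault(intOrNull, 8);     // B = Integer
Number n = withDefault(intOrNull, 3.5);    // B = Number
Object o = withDefault(intOrNull, "text"); // B = Object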
The point is that this Java language restriction is indeed restricting some otherwise possible useful features and may require ugly workarounds. I wonder what would happen if we needed withDefault to be virtual.
Now, to correlate with what polygenelubricants said, we use B here not to restrict the type of object passed as defaultValue (see the String used in the example), but rather to restrict the caller expectations about the object we return. As a simple rule, you use extends with the types you demand and super with the types you provide.
The "official" answer to your question can be found in a Sun/Oracle bug report.
BT2:EVALUATION
See
http://lampwww.epfl.ch/~odersky/ftp/local-ti.ps
particularly section 3 and the last paragraph on page 9. Admitting
type variables on both sides of subtype constraints can result in a
set of type equations with no single best solution; consequently,
type inference cannot be done using any of the existing standard
algorithms. That is why type variables have only "extends" bounds.
Wildcards, on the other hand, do not have to be inferred, so there
is no need for this constraint.
####.### 2004-05-25
Yes; the key point is that wildcards, even when captured, are only used
as inputs of the inference process; nothing with (only) a lower bound needs
to be inferred as a result.
####.### 2004-05-26
I see the problem. But I do not see how it is different from the problems
we have with lower bounds on wildcards during inference, e.g.:
List<? super Number> s;
boolean b;
...
s = b ? s : s;
Currently, we infer List<X> where X extends Object as the type of the
conditional expression, meaning that the assignment is illegal.
####.### 2004-05-26
Sadly, the conversation ends there. The paper to which the (now dead) link used to point is Inferred Type Instantiation for GJ. From glancing at the last page, it boils down to: If lower bounds are admitted, type inference may yield multiple solutions, none of which is principal.
The only reason is that it makes no sense to declare a type parameter with a super bound when defining it at the class level.
The only logical type-erasure strategy for Java would have been to fall back to the supertype of all objects, which is the Object class.
A great example and explanation can be found here:
http://www.angelikalanger.com/GenericsFAQ/FAQSections/TypeParameters.html#Why%20is%20there%20no%20lower%20bound%20for%20type%20parameters?
A simple example for rules of type-erasure can be found here:
https://www.tutorialspoint.com/java_generics/java_generics_type_erasure.htm#:~:text=Type%20erasure%20is%20a%20process,there%20is%20no%20runtime%20overhead.
Suppose we have:
basic classes A > B > C and D
class A {
    void methodA() {}
}
class B extends A {
    void methodB() {}
}
class C extends B {
    void methodC() {}
}
class D {
    void methodD() {}
}
job wrapper classes
interface Job<T> {
    void exec(T t);
}
class JobOnA implements Job<A> {
    @Override
    public void exec(A a) {
        a.methodA();
    }
}
class JobOnB implements Job<B> {
    @Override
    public void exec(B b) {
        b.methodB();
    }
}
class JobOnC implements Job<C> {
    @Override
    public void exec(C c) {
        c.methodC();
    }
}
class JobOnD implements Job<D> {
    @Override
    public void exec(D d) {
        d.methodD();
    }
}
and one manager class with 4 different approaches to execute a job on an object
class Manager<T> {
    final T t;
    Manager(T t) {
        this.t = t;
    }
    public void execute1(Job<T> job) {
        job.exec(t);
    }
    public <U> void execute2(Job<U> job) {
        U u = (U) t; // not safe
        job.exec(u);
    }
    public <U extends T> void execute3(Job<U> job) {
        U u = (U) t; // not safe
        job.exec(u);
    }
    // desired feature, does not compile for now
    public <U super T> void execute4(Job<U> job) {
        U u = (U) t; // safe
        job.exec(u);
    }
}
with usage
void usage() {
    B b = new B();
    Manager<B> managerB = new Manager<>(b);

    // TOO STRICT
    managerB.execute1(new JobOnA());
    managerB.execute1(new JobOnB()); // compiled
    managerB.execute1(new JobOnC());
    managerB.execute1(new JobOnD());

    // TOO MUCH FREEDOM
    managerB.execute2(new JobOnA()); // compiled
    managerB.execute2(new JobOnB()); // compiled
    managerB.execute2(new JobOnC()); // compiled !!
    managerB.execute2(new JobOnD()); // compiled !!

    // NOT ADEQUATE RESTRICTIONS
    managerB.execute3(new JobOnA());
    managerB.execute3(new JobOnB()); // compiled
    managerB.execute3(new JobOnC()); // compiled !!
    managerB.execute3(new JobOnD());

    // SHOULD BE
    managerB.execute4(new JobOnA()); // compiled
    managerB.execute4(new JobOnB()); // compiled
    managerB.execute4(new JobOnC());
    managerB.execute4(new JobOnD());
}
Any suggestions on how to implement execute4 now?
========== edited ==========
public void execute4(Job<? super T> job) {
    job.exec(t);
}
Thanks to all :)
========== edited ==========
private <U> void execute2(Job<U> job) {
    U u = (U) t; // now it's safe
    job.exec(u);
}
public void execute4(Job<? super T> job) {
    execute2(job);
}
Much better: inside execute2, the supertype U becomes a named type parameter, so any code there can work with it.
Interesting discussion :)
I really like the accepted answer, but I would like to put a slightly different perspective on it.
super is supported on wildcard type arguments only, to allow contravariance. When it comes to covariance and contravariance, it's important to understand that Java only supports use-site variance, unlike Kotlin or Scala, which allow declaration-site variance. The Kotlin documentation explains it very well here. Or if you're more into Scala, here's one for you.
It basically means that in Java, you cannot limit, at declaration time, the way your class is going to be used in terms of PECS. The class can both consume and produce, and some of its methods can even do both at the same time, like toArray(T[]), by the way.
Now, the reason extends is allowed in class and method declarations is that it's more about polymorphism than about variance. And polymorphism is an intrinsic part of Java and OOP in general: if a method can accept some supertype, a subtype can always safely be passed to it; and if a method, at the declaration site, promises in its "contract" to return some supertype, it's totally fine if its implementations return a subtype instead.
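To make the use-site idea concrete, here is a small Java sketch (names are illustrative): List declares an invariant element type, and each use site picks its own variance with a wildcard:
import java.util.List;

class UseSiteVariance {
    // Covariant use site: this method only produces (reads) Numbers
    static double sum(List<? extends Number> producer) {
        double total = 0;
        for (Number n : producer) {
            total += n.doubleValue();
        }
        return total;
    }

    // Contravariant use site: this method only consumes (writes) Integers
    static void fill(List<? super Integer> consumer) {
        for (int i = 0; i < 3; i++) {
            consumer.add(i);
        }
    }
}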
