Let's say I have two classes and two methods:
class Scratch {
    private class A {}
    private class B extends A {}

    public Optional<A> getItems(List<String> items) {
        return items.stream()
                .map(s -> new B())
                .findFirst();
    }

    public Optional<A> getItems2(List<String> items) {
        return Optional.of(
                items.stream()
                        .map(s -> new B())
                        .findFirst()
                        .get()
        );
    }
}
Why does getItems2 compile, while getItems gives the compiler error
incompatible types: java.util.Optional<Scratch.B> cannot be converted to java.util.Optional<Scratch.A>
So when I get the value of the Optional returned by findFirst and wrap it again with Optional.of, the compiler recognizes the inheritance, but not if I use the result of findFirst directly.
An Optional<B> is not a subtype of Optional<A>. Unlike other programming languages, Java’s generic type system does not know “read only types” or “output type parameters”, so it doesn’t understand that Optional<B> only provides an instance of B and could work at places where an Optional<A> is required.
When we write a statement like
Optional<A> o = Optional.of(new B());
Java’s type inference uses the target type to determine that we want
Optional<A> o = Optional.<A>of(new B());
which is valid as new B() can be used where an instance of A is required.
The same applies to
return Optional.of(
        items.stream()
                .map(s -> new B())
                .findFirst()
                .get()
);
where the method’s declared return type is used to infer the type arguments to the Optional.of invocation and passing the result of get(), an instance of B, where A is required, is valid.
Unfortunately, this target type inference doesn’t work through chained invocations, so for
return items.stream()
        .map(s -> new B())
        .findFirst();
it is not used for the map call. So for the map call, type inference uses the type of new B(), and the result type will be Stream<B>. The second problem is that findFirst() is not generic; calling it on a Stream<T> invariably produces an Optional<T> (and Java's generics do not allow declaring a type variable like <R super T>, so it is not even possible to produce an Optional<R> with the desired type here).
The solution is to provide an explicit type argument for the map call:
public Optional<A> getItems(List<String> items) {
    return items.stream()
            .<A>map(s -> new B())
            .findFirst();
}
Just for completeness: as said, findFirst() is not generic and hence can't use the target type. Chaining a generic method that allows a type change would also fix the problem:
public Optional<A> getItems(List<String> items) {
    return items.stream()
            .map(s -> new B())
            .findFirst()
            .map(Function.identity());
}
But I recommend using the solution of providing an explicit type for the map invocation.
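Putting the two working variants side by side, here is a minimal self-contained sketch; A and B are placeholder classes mirroring the question, and the method names are kept for comparison:

```java
import java.util.List;
import java.util.Optional;
import java.util.function.Function;

class ScratchDemo {
    static class A {}
    static class B extends A {}

    // Variant 1: an explicit type argument on map makes the pipeline a Stream<A>
    static Optional<A> getItems(List<String> items) {
        return items.stream()
                .<A>map(s -> new B())
                .findFirst();
    }

    // Variant 2: Optional.map with an identity function widens Optional<B> to Optional<A>
    static Optional<A> getItems2(List<String> items) {
        return items.stream()
                .map(s -> new B())
                .findFirst()
                .map(Function.identity());
    }

    public static void main(String[] args) {
        System.out.println(getItems(List.of("x")).isPresent());  // true
        System.out.println(getItems2(List.of()).isPresent());    // false
    }
}
```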
The issue you have is with inheritance for generics.
Optional<B> doesn't extend Optional<A>, so it can't be returned as such.
I'd imagine that something like this:
public Optional<? extends A> getItems(List<String> items) {
    return items.stream()
            .map(s -> new B())
            .findFirst();
}
Or:
public Optional<?> getItems(List<String> items) {
    return items.stream()
            .map(s -> new B())
            .findFirst();
}
Would work fine, depending on your needs.
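A quick sketch of how the wildcard version behaves at the call site (A and B are placeholders as above): callers can still read the value as an A through the wildcard.

```java
import java.util.List;
import java.util.Optional;

class WildcardDemo {
    static class A {}
    static class B extends A {}

    // The covariant wildcard accepts the Optional<B> produced by findFirst()
    static Optional<? extends A> getItems(List<String> items) {
        return items.stream()
                .map(s -> new B())
                .findFirst();
    }

    public static void main(String[] args) {
        Optional<? extends A> result = getItems(List.of("x"));
        // Reading through the wildcard is fine: the value can be used as an A
        A a = result.orElseThrow(IllegalStateException::new);
        System.out.println(a instanceof B);  // true
    }
}
```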
An Optional<B> is not a sub-class of Optional<A>.
In the first case, you have a Stream<B>, so findFirst returns an Optional<B>, which cannot be converted to an Optional<A>.
In the second case, you have a stream pipeline that returns an instance of B. When you pass that instance to Optional.of(), the compiler sees that the declared return type of the method is Optional<A>, so Optional.of() returns an Optional<A>; an Optional<A> can hold an instance of B as its value, since B extends A.
Look at this similar example:
Optional<A> optA = Optional.of(new B()); //OK
Optional<B> optB = Optional.of(new B()); //OK
Optional<A> optA2 = optB; //doesn't compile
You can make the second method fail by rewriting it as:
public Optional<A> getItems2(List<String> items) {
    return Optional.<B>of(items.stream().map(s -> new B()).findFirst().get());
}
This is simply because generic types are invariant.
Why the difference? See the declaration of Optional.of:
public static <T> Optional<T> of(T value) {
    return new Optional<>(value);
}
The type of the optional is picked up from the target of the assignment (or return type in this case).
And Stream.findFirst():
//T comes from Stream<T>, it's not a generic method parameter
Optional<T> findFirst();
In this case, however, return items.stream().map(s -> new B()).findFirst(); doesn't type the result of findFirst() based on the declared return type of getItems (T is determined strictly by the type argument of Stream<T>).
If class B inherits from class A, that doesn't mean Optional<B> inherits from Optional<A>. Optional is a different class.
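The same invariance applies to every generic type, not just Optional. A small sketch (A and B are stand-ins) contrasting plain subtyping, invariant generics, and covariant wildcards:

```java
import java.util.List;
import java.util.Optional;

class InvarianceDemo {
    static class A {}
    static class B extends A {}

    public static void main(String[] args) {
        B b = new B();
        A a = b;                            // fine: plain subtyping
        List<B> listB = List.of(b);
        // List<A> listA = listB;           // does not compile: generics are invariant
        List<? extends A> readable = listB; // fine: covariant wildcard
        Optional<B> optB = Optional.of(b);
        // Optional<A> optA = optB;         // does not compile for the same reason
        Optional<? extends A> optRead = optB;
        System.out.println(readable.size() + " " + optRead.isPresent());  // 1 true
    }
}
```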
What is the proper usage of the Comparator method thenComparing(), and why is it not working properly in my code? I don't quite understand the reason for the error I'm getting:
no instance(s) of type variable(s) U exist so
that Object conforms to Comparable<? super U>
That is the code that produces the error:
Map<Integer, Long> sortedCards = new LinkedHashMap<>();
List<Card> cards = // initializing the list

cards.stream()
        .collect(Collectors.groupingBy(
                card -> card.kind.rank,
                Collectors.counting()))
        .entrySet().stream()
        .sorted(Map.Entry.comparingByValue(Comparator.reverseOrder())
                .thenComparing(Map.Entry::getKey))
        .forEachOrdered(e -> sortedCards.put(e.getKey(), e.getValue()));
My Card class:
public static class Card implements Comparable<Card> {
    private Kind kind;
    // constructors, getters, etc.
}
Kind enum:
public enum Kind {
    TWO(1, "2"), THREE(2, "3"), FOUR(3, "4"); // etc.
    public int rank;
    // constructors, getters, etc.
}
The reason for the compilation error you've encountered is a type-inference issue.
When you're chaining methods while constructing a comparator, the compiler fails to infer the type of the method reference from the target type.
You can resolve it by using a type witness, i.e. providing the types of the key and value explicitly in the angle brackets <K,V> (these are the types declared by comparingByValue(), which produces the first comparator in the chain):
Map.Entry.<Integer, Long>comparingByValue(Comparator.reverseOrder())
        .thenComparing(Map.Entry::getKey)
Also, instead of forEachOrdered(), it would be better to use collect() and generate a Map as the result of the stream execution, rather than populating a pre-created map via side effects. The side-effect approach makes your code more verbose and less expressive, because you need to instantiate the resulting collection separately, and it goes against the guidelines listed in the documentation.
Methods forEach() and forEachOrdered() exist as a last resort; it's discouraged to use them as a substitute for reduction operations like collect() or reduce(). For more information, refer to the API documentation and pay attention to the code examples.
That's how it would look if we make use of collect():
Map<Integer, Long> sortedCards = cards.stream()
        .collect(Collectors.groupingBy(card -> card.kind.rank, Collectors.counting()))
        .entrySet().stream()
        .sorted(Map.Entry.<Integer, Long>comparingByValue(Comparator.reverseOrder())
                .thenComparing(Map.Entry::getKey))
        .collect(Collectors.toMap(
                Map.Entry::getKey,
                Map.Entry::getValue,
                (left, right) -> {
                    throw new AssertionError("duplicate keys are not expected");
                },
                LinkedHashMap::new
        ));
I'm having trouble with typing the accumulator of a reduce operation, for two very distinct reasons.
First, I get from this article that when using the (identity, accumulator) signature, if the return type is different from the type inside the collection being reduced, it is necessary to have an explicit combiner to help the compiler. However, the return type is already made explicit by the identity we pass! Why can't the compiler infer by itself that this is also the same?
List<String> productNames =
        products.stream()
                .reduce(
                        new ArrayList<>(),
                        (acc, elm) -> {
                            List<String> newList = new ArrayList<>(acc);
                            newList.add(elm.getName());
                            return newList; // Won't compile!
                        });
Second, when I try to create a copy of my currently accumulated value, I have to explicitly pass the concrete type and not just the interface, as we usually do with Lists.
List<String> productNames =
        products.stream()
                .reduce(
                        new ArrayList<>(),
                        (acc, elm) -> {
                            List<String> newList = new ArrayList<>(acc); // Won't compile! needs ArrayList<String>
                            newList.add(elm.getName());
                            return newList;
                        },
                        (list1, list2) -> {
                            ArrayList<String> newList = new ArrayList<>(list2);
                            newList.addAll(list1);
                            return newList;
                        });
In this case, to perform the reduction, you should use the method <U> U reduce(U identity, BiFunction<U, ? super T, U> accumulator, BinaryOperator<U> combiner) instead of T reduce(T identity, BinaryOperator<T> accumulator).
With this signature, the identity U and the result of the BiFunction<U, ? super T, U> have to be exactly the same type U, e.g. ArrayList. We can instantiate it as the identity, but we cannot declare the intermediate results as List.
List<String> productNames =
        products.stream()
                .reduce(new ArrayList<>(),
                        (acc, elm) -> {
                            ArrayList<String> newList = new ArrayList<>(acc);
                            newList.add(elm.getName());
                            return newList;
                        },
                        (list1, list2) -> { list1.addAll(list2); return list1; });
P.S. It is easier to reach this goal by collecting:
List<String> productNames =
        products.stream()
                .collect(ArrayList::new,
                        (list, product) -> list.add(product.getName()),
                        ArrayList::addAll);
or you can use a map operation:
List<String> productNames =
        products.stream()
                .map(Product::getName)
                .collect(Collectors.toList());
A colleague provided me with both answers shortly after I posted, so I'll try to explain it this way.
The second snippet does not compile because reduce is trying to infer the return type based on what it is given. In this case, the identity has the explicit type ArrayList, so reduce expects this precise type to be returned, not a supertype. Hence, List<String> is too broad.
As for the first case, the generic type T in reduce's signature is the same as the element type of the Stream. Hence, when we use the basic form with only an identity and an accumulator, Java expects T to be the same as the type inside the Stream. If we want to deviate from this, we have to pass a third parameter, the combiner.
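To make that concrete, here is a compilable sketch of the three-argument reduce. Product is a hypothetical stand-in for the question's class. Note that the identity's element type is written out and the accumulator declares ArrayList<String>, matching the inferred U; alternatively, a type witness .<List<String>>reduce(...) would let the accumulator declare List<String> instead:

```java
import java.util.ArrayList;
import java.util.List;

class ReduceDemo {
    // Hypothetical stand-in for the question's Product type
    static class Product {
        private final String name;
        Product(String name) { this.name = name; }
        String getName() { return name; }
    }

    static List<String> names(List<Product> products) {
        return products.stream()
                .reduce(new ArrayList<String>(),  // identity: U is inferred as ArrayList<String>
                        (acc, elm) -> {
                            ArrayList<String> newList = new ArrayList<>(acc);
                            newList.add(elm.getName());
                            return newList;
                        },
                        (list1, list2) -> {
                            ArrayList<String> merged = new ArrayList<>(list1);
                            merged.addAll(list2);
                            return merged;
                        });
    }

    public static void main(String[] args) {
        System.out.println(names(List.of(new Product("pen"), new Product("ink"))));  // [pen, ink]
    }
}
```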
For the first: the signature is T reduce(T identity, BinaryOperator<T> accumulator), so the identity must have the same type as your stream element. Your identity has type ArrayList<?>, while your stream element, as used in
(acc, elm) -> {
    List<String> newList = new ArrayList<>(acc);
    newList.add(elm.getName());
is most likely of a different type.
For the second, your accumulator function should be corrected too, and the combiner function can be simplified:
(list1, list2) -> {
    list2.addAll(list1);
    return list2;
}
Apologies, this is probably the worst title I've used, but I can't quite think how to word it.
I'm calling a method table.getColData(COL_1) which returns a generic
public <T extends Object> List<T> getColData(final String col)
I am calling it twice and getting two lists of strings. I want to concatenate these and have ended up with this -
List<String> a = table.getColData(col1);
List<String> b = table.getColData(col2);
List<String> c = Stream.concat(a.stream(), b.stream()).collect(Collectors.toList());
which works nicely, I think. I can't find a way to avoid the two declarations, though, without getting an error, as the concat thinks it has a list of objects that are not Strings (prompt: change type of c to List<Object>).
Is there an easy way to do this to make it look a little more polished?!
You are limited by the inference of the compiler.
List<String> c = Stream.concat(table.getColData(col1).stream(), table.getColData(col2).stream()).collect(Collectors.toList());
cannot compile, because getColData()'s return type is inferred as List<Object> when you don't specify a target type from the invocation side.
You can concatenate two streams of List<Object>, but that will not produce a Stream<String>, only a Stream<Object>.
Introducing two intermediary variables is not necessarily the best way.
1) Alternatively, as suggested by Holger, you could specify the type T from the target side:
Stream.concat(table.<String>getColData(col1).stream(), table.<String>getColData(col2).stream())
        .collect(Collectors.toList());
2) You could also transform the Stream<Object> into a Stream<String> in a map() operation:
List<String> c = Stream.concat(table.getColData(col1).stream(), table.getColData(col2).stream())
        .map(s -> (String) s)
        .collect(Collectors.toList());
3) Or introduce an additional method that avoids any explicit cast by concatenating the streams of the lists and collecting them into a List that it returns:
public <T> List<T> concatAndCollectToList(final List<T> a, List<T> b) {
    return Stream.concat(a.stream(), b.stream())
            .collect(Collectors.toList());
}
You can now do just :
List<String> c = concatAndCollectToList(table.getColData(col1), table.getColData(col2));
To be honest, I would add a type witness (a method argument that is not otherwise used) to make this method type safe:
public static <T> List<T> getColData(String col, Class<T> clazz) {
    // whatever you did as before
}
And in such a case your type-safety would be in place:
List<String> set = Stream.concat(
                getColData("", String.class).stream(),
                getColData("", String.class).stream())
        .collect(Collectors.toList());
Firstly, the T extends Object is redundant, since every type T extends Object by definition.
Since the method returns List<T> and it's unknown at compile time what type T is, it's not possible to pass those two generic results to Stream without a target type; the collected result is List<Object> regardless of T.
You have to map each of the values to String; at this point you have to ensure that all the list items are convertible to String using a cast or a conversion:
List<String> c = Stream
        .concat(table.getColData(col1).stream(), table.getColData(col2).stream())
        .map(s -> (String) s)
        .collect(Collectors.toList());
I recommend you the shorter and more readable way which uses Stream::flatMap:
List<String> c = Stream.of(table.getColData(col1), table.getColData(col2))
        .flatMap(list -> list.stream().map(s -> (String) s))
        .collect(Collectors.toList());
The result depends on what exactly the method getColData returns and whether it is convertible to String (@Eugene).
The simplest would be:
Stream.of(col1, col2)
        .map(table::<String>getColData)
        .flatMap(List::stream)
        .collect(Collectors.toList());
Since your getColData method returns a List<T>, you can specify what type T is by using a type witness, <String>. The rest is Java syntax for method references.
Also, the use of generics can be questioned here. This is equivalent to having
public List<Object> getColData(final String col)
and casting your list to a list of String:
Stream.of(col1, col2)
        .map(table::getColData)
        .flatMap(List::stream)
        .map(o -> (String) o)
        .collect(Collectors.toList());
The simplest approach would be to just add an explicit type argument to the method call using <>:
Stream.concat(
                table.<String>getColData(col1).stream(),
                table.<String>getColData(col2).stream())
        .collect(toList());
I have a requirement to validate a field against some predefined values (which can grow in the future). For this, I have created an enum and defined a method that returns a stream of the allowed values.
public enum EnumDemo {
    VERSION("1.0.0", "2.0.3");

    private List<String> ver;

    EnumDemo(String... ver) {
        this.ver = Arrays.asList(ver);
    }

    public List<String> getVer() {
        return ver;
    }

    public static Stream<EnumDemo> stream() {
        return Arrays.stream(EnumDemo.values());
    }
}
Now I need to validate a field against the values defined in this Enum.
I'm using:
Optional<EnumDemo> ab = EnumDemo.stream()
        .map(l -> {l.getVer().stream()
                .filter(c -> c.equals("2.0.3"))
                .findFirst();})
        .findFirst();
System.out.println(ab.get().getVer());
But it is giving me compilation error. Any help would be appreciated.
Edit:
Compilation Error:
The method map(Function<? super EnumDemo,? extends R>) in the type Stream<EnumDemo> is not applicable for the arguments ((<no type> l) -> {})
You should write it this way:
Optional<EnumDemo> ab = EnumDemo.stream()
        .filter(l -> l.getVer().contains("2.0.3"))
        .findFirst();
By the way, it wasn't working because you used {} for the lambda expression, so a return statement was expected inside the {}. You could either remove the {} (along with the ;) or add the return.
Anyway, the original code looked confusing; I'm not sure if I guessed the intention correctly, but this implementation should be clearer.
Edit
Based on your comment, this is what you need:
EnumDemo.stream()
        .flatMap(l -> l.getVer().stream())
        .filter("2.0.3"::equals)
        .findAny()
        .ifPresent(System.out::println);
Update
Holger commented that there is a shorter and more meaningful way, with better performance:
if (EnumDemo.stream()
        .anyMatch(l -> l.getVer().contains(userString))) {
    System.out.println(userString);
}
To understand it, you have to think about lambdas. Lambdas implement functional interfaces but are treated specially by the JVM, so not every lambda needs its own class to represent it (stateless lambdas can be compiled to plain methods).
Now when looking at the map() method in the Stream interface:
<R> Stream<R> map(Function<? super T, ? extends R> mapper);
You see that it expects an implementation of the Function interface. You have many different ways to provide that mapper. In this example, let's map from Object to String:
1. Using an inline lambda:
.map(o -> o.toString())
2. Using a multiline lambda:
.map(o -> {
    return o.toString();
})
3. Using method references:
.map(Object::toString)
4. Using an anonymous class:
.map(new Function<Object, String>() {
    @Override
    public String apply(Object o) {
        return o.toString();
    }
})
Your current code uses approach 2, but without a return statement. This is even easier to see in the anonymous class of approach 4: it seems natural that a method without a return statement returns no value.
And that's why you get the compilation error.
You just have to add the return statement:
.map(l -> {
    return l.getVer().stream()
            .filter(c -> c.equals("2.0.3"))
            .findFirst();
});
Or remove the brackets {}:
.map(l -> l.getVer().stream()
        .filter(c -> c.equals("2.0.3"))
        .findFirst());
Or even use the approach provided by @Jai in his answer, which works even better than what you currently have.
You are using a block lambda expression without returning any value, which is why it gives a compilation error. It is better to use ifPresent():
String val = "2.0.3";
EnumDemo.stream()
        .flatMap(l -> l.getVer().stream())
        .filter(c -> c.equals(val))
        .findAny()
        .ifPresent(x -> System.out.println(x));
I have written following code:
Set<Pair<Predicate>> interestingPairs = ...
...
interestingPairs.stream()
        .filter(pair -> pair.getFirst().isNegated() == pair.getSecond().isNegated())
        .flatMap(pair -> findUnification(pair)
                .map(u -> {
                    Set<String> aVars = pair.getFirst().getAllVariables();
                    Set<String> bVars = pair.getSecond().getAllVariables();
                    if (u.sigma.isEmptyOrVarByVar(aVars) && !u.sigmaPrime.isEmptyOrVarByVar(bVars))
                        return Stream.of(pair.getFirst());
                    if (!u.sigma.isEmptyOrVarByVar(bVars) && u.sigmaPrime.isEmptyOrVarByVar(aVars))
                        return Stream.of(pair.getSecond());
                    return Stream.empty();
                })
                .orElse(Stream.empty()))
        .forEach(toRemove -> this.predicates.remove((Predicate) toRemove));
I'm using the IntelliJ IDE. After the flatMap operation, the elements of my stream have type Object instead of Predicate, so in the forEach, toRemove does not have the desired type. When I change return Stream.empty() to return Stream.<Predicate>empty(), the element type after flatMap is Predicate. I would even understand this, if not for .orElse(Stream.empty()), where I don't have to add the <Predicate> explicitly. What's the thing I don't get here?
I assume that your Predicate is a custom non-generic type, not java.util.function.Predicate. It seems that the compiler fails to infer the type of the flatMap argument. You may help it by specifying the type explicitly:
interestingPairs.stream()
        .filter(pair -> pair.getFirst().isNegated() == pair.getSecond().isNegated())
        .<Predicate>flatMap(pair -> findUnification(pair)
        ... and so on
To my understanding, the current JLS type-inference rules do not cover this case, so it's not a compiler bug. You just have to specify type arguments explicitly sometimes, or extract complex lambdas into intermediate variables.
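For instance, extracting the lambda into a variable with an explicit Function type pins down the stream's element type up front, instead of relying on inference through the chained Optional and Stream calls. This is sketched here with Integer in place of the custom Predicate type, and an even/odd filter as a placeholder for findUnification's logic:

```java
import java.util.List;
import java.util.Optional;
import java.util.function.Function;
import java.util.stream.Collectors;
import java.util.stream.Stream;

class ExtractLambdaDemo {
    static List<Integer> evens(List<Integer> nums) {
        // The explicit Function type fixes flatMap's result type to Stream<Integer>
        Function<Integer, Stream<Integer>> mapper =
                n -> Optional.of(n)
                        .map(v -> v % 2 == 0 ? Stream.of(v) : Stream.<Integer>empty())
                        .orElse(Stream.empty());
        return nums.stream()
                .flatMap(mapper)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(evens(List.of(1, 2, 3, 4)));  // [2, 4]
    }
}
```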