Generic check in the stream collect - java

List<String> namesOfMaleMembersCollect = roster
    .stream()
    .filter(p -> p.getGender() == Person.Sex.MALE)
    .map(p -> p.getName())
    .collect(Collectors.toList());
I've got this code, where roster is defined as List<Person>. At which point does the JVM check that the returned List consists of Strings? I mean, we've got the List declared, but there is no explicit information about the String element type of the returned value. Is this:
.map(p -> p.getName())
.collect(Collectors.toList());
the place where the JVM sees that .map() produces Strings and knows that the type of the list returned by .collect() will be the same?

Type inference is a powerful tool that comes with generics. When you call .map(p -> p.getName()), it returns a Stream<String>; the Stream now has the type parameter String instead of T. (This all happens at compile time, not in the JVM at runtime.)
Now you call collect, which takes a Collector with the following signature:
<R, A> R collect(Collector<? super T, A, R> collector)
For a Stream<String>, it will be inferred as
Collector<String, ?, List<String>>
giving us a List<String>.
You can rewrite your code to the following
Collector<String, ?, List<String>> collector = Collectors.toList();
...map(p -> p.getName())
.collect(collector);
Meaning the type is inferred from the type of the variable the result is being assigned to.
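A minimal, self-contained sketch of this target typing; the Person/roster types from the question are replaced here by plain Strings:

```java
import java.util.List;
import java.util.stream.Collector;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class TargetTyping {
    public static void main(String[] args) {
        // Collectors.toList() has no arguments to infer from, so the compiler
        // infers its type parameters from the assignment target on the left.
        Collector<String, ?, List<String>> collector = Collectors.toList();

        List<String> names = Stream.of("alice", "bob")
                .map(String::toUpperCase)   // Stream<String>
                .collect(collector);        // R is inferred as List<String>

        System.out.println(names); // [ALICE, BOB]
    }
}
```

If you deleted the assignment and wrote Collectors.toList() inline, inference would work the same way, just driven by the collect call's target type instead of a variable declaration.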

The map method returns a stream of the results of applying the function passed as a parameter, so it returns a stream of Strings, and then we collect the result as a List using the collector.
In this case it is this map function:
map(p -> p.getName())
It can equivalently be written with an explicitly typed lambda:
map((Person p) -> { return p.getName(); })

Related

How to collect data to List<Object> from Map<Object,Integer> using the Java Stream API?

I have a Map<Nominal, Integer> with objects and their counts:
a -> 3
b -> 1
c -> 2
And I need to get such a List<Nominal> from it:
a
a
a
b
c
c
How can I do this using the Stream API?
We can use Collections::nCopies to achieve the desired result:
private static <T> List<T> transform(Map<? extends T, Integer> map) {
    return map.entrySet().stream()
        .map(entry -> Collections.nCopies(entry.getValue(), entry.getKey()))
        .flatMap(Collection::stream)
        .collect(Collectors.toList());
}
Ideone demo
Remark
In the demo, I changed the key-type of the Map from Nominal to Object since the definition of Nominal was not provided. Changing the key-type, however, does not influence the solution.
Stream the entries and use flatMap to generate multiple copies of each key based on the value.
List<Nominal> expanded = map.entrySet().stream()
    .flatMap(e -> Stream.generate(e::getKey).limit(e.getValue()))
    .collect(toList());
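The answers above can be put together into a runnable sketch; since the definition of Nominal was not given, String stands in for it here:

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class ExpandCounts {
    public static void main(String[] args) {
        // String stands in for the undefined Nominal type.
        Map<String, Integer> counts = new LinkedHashMap<>();
        counts.put("a", 3);
        counts.put("b", 1);
        counts.put("c", 2);

        // nCopies builds [key, key, ...] per entry; flatMap flattens the lists.
        List<String> expanded = counts.entrySet().stream()
                .flatMap(e -> Collections.nCopies(e.getValue(), e.getKey()).stream())
                .collect(Collectors.toList());

        System.out.println(expanded); // [a, a, a, b, c, c]
    }
}
```

A LinkedHashMap is used only so the output order is predictable; with a plain HashMap the multiset content is the same but the order may differ.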

Typing a list List<String> inside a reduce accumulator not compiling

I'm having trouble with typing the accumulator of a reduce operation, for two very distinct reasons.
First, I get from this article that, when using the (identity, accumulator) signature, if the return type differs from the element type of the collection being reduced, an explicit combiner is necessary to help the compiler. However, the return type is already made explicit by the identity we pass! Why can't the compiler infer by itself that the result type is the same?
List<String> productNames =
    products.stream()
        .reduce(
            new ArrayList<>(),
            (acc, elm) -> {
                List<String> newList = new ArrayList<>(acc);
                newList.add(elm.getName());
                return newList; // Won't compile!
            });
Second, when I try to create a copy of my currently accumulated value, I have to explicitly pass the concrete type and not just the interface, as we usually do with Lists.
List<String> productNames =
    products.stream()
        .reduce(
            new ArrayList<>(),
            (acc, elm) -> {
                List<String> newList = new ArrayList<>(acc); // Won't compile! needs ArrayList<String>
                newList.add(elm.getName());
                return newList;
            },
            (list1, list2) -> {
                ArrayList<String> newList = new ArrayList<>(list2);
                newList.addAll(list1);
                return newList;
            });
In this case, to perform the reduction you should use the method <U> U reduce(U identity, BiFunction<U, ? super T, U> accumulator, BinaryOperator<U> combiner) instead of T reduce(T identity, BinaryOperator<T> accumulator).
With that signature, the identity U and the result type of the BiFunction<U, ? super T, U> have to be exactly the same type U, e.g. ArrayList<String>. We can instantiate an ArrayList as the identity, but not a List.
List<String> productNames =
    products.stream()
        .reduce(new ArrayList<>(),
            (acc, elm) -> {
                ArrayList<String> newList = new ArrayList<>(acc);
                newList.add(elm.getName());
                return newList;
            },
            (list1, list2) -> { list1.addAll(list2); return list1; });
P.S. It is easier to reach this goal by collecting:
List<String> productNames =
    products.stream()
        .collect(ArrayList::new,
            (list, product) -> list.add(product.getName()),
            ArrayList::addAll);
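The three-argument collect form runs like this in isolation; the Product class below is a stand-in invented for the example, since the question does not define it:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Stream;

public class CollectNames {
    // Minimal stand-in for the Product type from the question.
    static class Product {
        private final String name;
        Product(String name) { this.name = name; }
        String getName() { return name; }
    }

    public static void main(String[] args) {
        // collect's three-argument form: supplier, accumulator, combiner.
        // Unlike reduce, the accumulator mutates the container in place,
        // so no defensive copy is needed per element.
        List<String> productNames = Stream.of(new Product("pen"), new Product("ink"))
                .collect(ArrayList::new,
                        (list, product) -> list.add(product.getName()),
                        ArrayList::addAll);

        System.out.println(productNames); // [pen, ink]
    }
}
```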
or you can use a map operation:
List<String> productNames =
    products.stream()
        .map(Product::getName)
        .collect(Collectors.toList());
A colleague provided me with both answers shortly after I posted this, so I'll try to explain it this way.
The second snippet does not compile because reduce tries to infer the return type from what it is given. In this case, the identity has an explicit type, ArrayList, so reduce expects this precise type to be returned, not a supertype. Hence, List<String> is too broad.
As for the first case, the generic type T in reduce's signature is the same as the element type of the Stream. When we use the basic form with only an identity and an accumulator, Java expects T to be exactly the type inside the Stream. If we want to deviate from this, we have to pass a third parameter, the combiner.
For the first: the signature is T reduce(T identity, BinaryOperator<T> accumulator), so the identity must have the same type as your stream element. Your identity has type ArrayList<?>, while your stream element in
(acc, elm) -> {
    List<String> newList = new ArrayList<>(acc);
    newList.add(elm.getName());
is most likely of a different type.
For the second, your accumulator function should be corrected too, and the combiner function can be simplified:
(list1, list2) -> {
    list2.addAll(list1);
    return list2;
}
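Putting the corrected three-argument reduce together, here is a minimal runnable sketch; the Product class is again a stand-in invented for the example:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Stream;

public class ReduceToNames {
    // Minimal stand-in for the Product type from the question.
    static class Product {
        private final String name;
        Product(String name) { this.name = name; }
        String getName() { return name; }
    }

    public static void main(String[] args) {
        List<String> productNames = Stream.of(new Product("pen"), new Product("ink"))
                .reduce(new ArrayList<String>(),      // identity: U = ArrayList<String>
                        (acc, elm) -> {               // accumulator must return the same U
                            ArrayList<String> next = new ArrayList<>(acc);
                            next.add(elm.getName());
                            return next;
                        },
                        (list1, list2) -> {           // combiner: only invoked for parallel streams
                            list1.addAll(list2);
                            return list1;
                        });

        System.out.println(productNames); // [pen, ink]
    }
}
```

Note the per-element copy of the accumulated list makes this O(n²); collect with a mutable container is the idiomatic way to do this.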

casting & generics to enable list concatenation in Java 8

Apologies, this is probably the worst title I've used, but I can't quite think how to word it.
I'm calling a method table.getColData(COL_1) which returns a generic
public <T extends Object> List<T> getColData(final String col)
I am calling it twice and getting two lists of Strings. I want to concatenate them, and have ended up with this:
List<String> a = table.getColData(col1);
List<String> b = table.getColData(col2);
List <String> c = Stream.concat(a.stream(), b.stream()).collect(Collectors.toList());
which works nicely, I think. I can't find a way to avoid the two declarations, though, without getting an error, because concat thinks it has a list of Objects that are not Strings (the IDE prompts: change type of c to List<Object>).
Is there an easy way to do this to make it look a little more polished?
You are limited by the compiler's inference.
List<String> c = Stream.concat(table.getColData(col1).stream(), table.getColData(col2).stream()).collect(Collectors.toList());
does not compile because the return type of getColData() is inferred as List<Object>, since you don't specify a target type from the invocation side.
You can concatenate two streams over List<Object>, but that produces a Stream<Object>, not a Stream<String>.
Introducing two intermediary variables is not necessarily the best way.
1) Alternatively, as suggested by Holger, you could specify the type T from the target side:
Stream.concat(table.<String>getColData(col1).stream(), table.<String>getColData(col2).stream())
    .collect(Collectors.toList());
2) You could also transform the Stream<Object> into a Stream<String> in a map() operation:
List<String> c = Stream.concat(table.getColData(col1).stream(), table.getColData(col2).stream())
    .map(s -> (String) s)
    .collect(Collectors.toList());
3) Or introduce an additional method that avoids any explicit cast: it concatenates the streams of the two lists and collects them into a List that it returns:
public <T> List<T> concatAndCollectToList(final List<T> a, final List<T> b) {
    return Stream.concat(a.stream(), b.stream())
        .collect(Collectors.toList());
}
You can now do just:
List<String> c = concatAndCollectToList(table.getColData(col1), table.getColData(col2));
To be honest, I would add a type witness (a method argument) that is not used, to make this method type safe:
public static <T> List<T> getColData(String col, Class<T> clazz) {
// whatever you did as before
}
And in such a case your type-safety would be in place:
List<String> set = Stream.concat(
        getColData("", String.class).stream(),
        getColData("", String.class).stream())
    .collect(Collectors.toList());
Firstly, T extends Object is redundant, since every type T extends Object by definition.
Since the method returns List<T>, and it is unknown at compile time what T is, you cannot get a Stream<String> out of those two generic results directly: the collected result is a List<Object> regardless of T.
You have to map each of the values to String; at this point you must ensure that all the list items are convertible into String, using a cast or a conversion:
List<String> c = Stream
    .concat(table.getColData(col1).stream(), table.getColData(col2).stream())
    .map(s -> (String) s)
    .collect(Collectors.toList());
I recommend the shorter and more readable way using Stream::flatMap:
List<String> c = Stream.of(table.getColData(col1), table.getColData(col2))
    .flatMap(list -> list.stream().map(s -> (String) s))
    .collect(Collectors.toList());
The result depends on what exactly the method getColData returns and whether it is convertible to String (as @Eugene notes).
The simplest would be:
Stream.of(col1, col2)
    .map(table::<String>getColData)
    .flatMap(List::stream)
    .collect(Collectors.toList());
Since your getColData method returns a List<T>, you can specify what type T is by using a type witness, <String>. The rest is the Java syntax for method references.
Also, the use of generics can be questioned here. This is equivalent to having
public List<Object> getColData(final String col)
and casting your list to a list of String:
Stream.of(col1, col2)
    .map(table::getColData)
    .flatMap(List::stream)
    .map(o -> (String) o)
    .collect(Collectors.toList());
The simplest approach would be to just add an explicit type argument to the method call using <>:
Stream.concat(
        table.<String>getColData(col1).stream(),
        table.<String>getColData(col2).stream())
    .collect(toList());
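A runnable sketch of the type-witness approach; the getColData stub below is an assumption standing in for the question's table method:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class WitnessConcat {
    // Stand-in for table.getColData: an unchecked generic, as in the question.
    @SuppressWarnings("unchecked")
    static <T> List<T> getColData(String col) {
        return (List<T>) List.of(col + "1", col + "2");
    }

    public static void main(String[] args) {
        // Without the explicit <String> witnesses, T would be inferred as Object
        // (a method-call receiver is not a target-typing context), and the
        // concatenated stream could only collect to a List<Object>.
        List<String> c = Stream.concat(
                        WitnessConcat.<String>getColData("a").stream(),
                        WitnessConcat.<String>getColData("b").stream())
                .collect(Collectors.toList());

        System.out.println(c); // [a1, a2, b1, b2]
    }
}
```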

Java 8 streams.reduce() with combiner

I would like a breakdown of the following code.
I have a map, and I want to replace certain substrings with the map's values using the following reduce() call:
Map<String, String> environmentMap = new HashMap<>();
Function<String, String> replaceFunction = environmentMap.entrySet()
    .stream()
    .reduce(Function.identity(),
        (function, entrySet) -> stringToReplace ->
            function.apply(stringToReplace.replaceAll(entrySet.getKey(), entrySet.getValue())),
        Function::andThen);
Somehow I'm confused by the replaceFunction part, but let me break it down based on my understanding.
The line environmentMap.entrySet().stream() will create a stream of Entry<String,String>.
The reduce() method that I am using takes an identity, an accumulator, and a combiner.
Since I am not using a parallel stream, I thought of omitting the combiner, but then the compiler throws an error. Is there any way I could transform this accumulator into a BinaryOperator<T>?
Function.identity() will always return a Function that returns its input argument, in this case of type String.
The second argument, which takes a BiFunction, is one of the things I'm confused about: (function, entrySet) -> stringToReplace -> function.apply(stringToReplace.replaceAll(entrySet.getKey(), entrySet.getValue())).
What does Function::andThen do in this process?
Lastly, where does the return type Function<String,String> in the BiFunction<T, U, R> come from?
Function<String, String> replaceFunction = environmentMap.entrySet()
    .stream()
    .reduce(Function.identity(),
        (function, entrySet) -> stringToReplace ->
            function.apply(stringToReplace.replaceAll(entrySet.getKey(), entrySet.getValue())),
        Function::andThen);
The line environmentMap.entrySet().stream() will create a stream of Entry<String,String>.
The reduce() method that I am using takes an identity, an accumulator, and a combiner.
Up to this point you are right, but the identity, accumulator, and combiner get resolved as follows:
identity - Function.identity() gets resolved to Function<String, String>.
accumulator - (function,entrySet) -> stringToReplace -> function.apply(stringToReplace.replaceAll(entrySet.getKey(),entrySet.getValue())).
Here the following part,
`stringToReplace -> function.apply(stringToReplace.replaceAll(entrySet.getKey(),entrySet.getValue()))`
gets resolved to a Function<String, String>.
And so the entire accumulator gets resolved to
`BiFunction<Function<String,String>, Entry<String,String>, Function<String,String>>`
combiner - Function::andThen gets resolved to BinaryOperator<Function<String,String>>.
So, to answer your last two questions: Function::andThen acts as the combiner, and the accumulator works as I described above.
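Here is the whole thing as a runnable sketch; the map entries are invented example data:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.function.Function;

public class ReplaceAllDemo {
    public static void main(String[] args) {
        // Invented example data: regex keys mapped to their replacements.
        Map<String, String> environmentMap = new LinkedHashMap<>();
        environmentMap.put("\\$HOME", "/home/user");
        environmentMap.put("\\$SHELL", "/bin/bash");

        // Each entry contributes one String -> String replacement function;
        // andThen chains them all into a single composed function.
        Function<String, String> replaceFunction = environmentMap.entrySet()
                .stream()
                .reduce(Function.identity(),
                        (function, entry) -> stringToReplace ->
                                function.apply(stringToReplace.replaceAll(entry.getKey(), entry.getValue())),
                        Function::andThen);

        System.out.println(replaceFunction.apply("$HOME uses $SHELL"));
        // prints: /home/user uses /bin/bash
    }
}
```

Nothing is replaced until replaceFunction.apply(...) is called; the reduce only builds the composed function.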

Better way to create a stream of functions?

I wish to do lazy evaluation on a list of functions I've defined as follows:
Optional<Output> output = Stream.<Function<Input, Optional<Output>>>of(
        classA::eval, classB::eval, classC::eval)
    .map(f -> f.apply(input))
    .filter(Optional::isPresent)
    .map(Optional::get)
    .findFirst();
where, as you see, each class (a, b & c) has an Optional<Output> eval(Input in) method defined. If I try to do
Stream.of(...)....
omitting the explicit type, it gives a
T is not a functional interface
compilation error: the compiler does not accept a functional interface type for the generic type T in .of(T... values).
Is there a snappier way of creating a stream of these functions? I hate having to explicitly parameterize the of method with Function and its input/output types. Wouldn't it work in a more generic manner?
This issue stems from the topic of the following question:
Lambda Expression and generic method
You can break it into two lines:
Stream<Function<Input, Optional<Output>>> stream =
    Stream.of(classA::eval, classB::eval, classC::eval);
Optional<Output> out = stream
    .map(f -> f.apply(input))
    .filter(Optional::isPresent)
    .map(Optional::get)
    .findFirst();
or use casting:
Optional<Output> out = Stream.of(
        (Function<Input, Optional<Output>>) classA::eval,
        classB::eval,
        classC::eval)
    .map(f -> f.apply(input))
    .filter(Optional::isPresent)
    .map(Optional::get)
    .findFirst();
but I don't think you can avoid specifying the type of the Stream element - Function<Input, Optional<Output>> - somewhere, since otherwise the compiler can't infer it from the method references.
There is a way that allows to omit the Function<Input, Optional<Output>> type, but it’s not necessarily an improvement
Optional<Output> o =
Stream.concat(Stream.of(input).map(classA::eval),
Stream.concat(Stream.of(input).map(classB::eval),
Stream.of(input).map(classC::eval)))
.filter(Optional::isPresent)
.map(Optional::get)
.findFirst();
and it doesn’t scale.
It seems the best option is to wait for Java 9, where you can use Optional.or:
Optional<Output> o = classA.eval(input)
.or(() -> classB.eval(input))
.or(() -> classC.eval(input));
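For completeness, the stream-of-functions pattern as a runnable sketch; the eval methods below are invented stand-ins for classA, classB, and classC:

```java
import java.util.Optional;
import java.util.function.Function;
import java.util.stream.Stream;

public class LazyEval {
    // Invented stand-ins for classA::eval, classB::eval, classC::eval.
    static Optional<String> evalA(String in) { return Optional.empty(); }
    static Optional<String> evalB(String in) { return Optional.of("B:" + in); }
    static Optional<String> evalC(String in) { return Optional.of("C:" + in); }

    public static void main(String[] args) {
        String input = "x";
        // The pipeline is evaluated lazily: findFirst short-circuits, so once
        // evalB yields a present Optional, evalC is never applied.
        Optional<String> output = Stream.<Function<String, Optional<String>>>of(
                        LazyEval::evalA, LazyEval::evalB, LazyEval::evalC)
                .map(f -> f.apply(input))
                .filter(Optional::isPresent)
                .map(Optional::get)
                .findFirst();

        System.out.println(output.get()); // B:x
    }
}
```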
