Java 8 Stream.reduce() with combiner - java

I would like a breakdown of the following code.
I have a map, and I want to replace certain strings with the map's values using the following reduce() call:
Map<String, String> environmentMap = new HashMap<>();
Function<String, String> replaceFunction = environmentMap.entrySet()
    .stream()
    .reduce(Function.identity(),
            (function, entrySet) -> stringToReplace ->
                    function.apply(stringToReplace.replaceAll(entrySet.getKey(), entrySet.getValue())),
            Function::andThen);
I'm somewhat confused by the replaceFunction part, but let me break it down based on my understanding:
1. The line environmentMap.entrySet().stream() will create a stream of Entry<String, String>.
2. The reduce() method that I am using takes an identity, an accumulator, and a combiner.
3. Since I am not using a parallel stream, I thought of omitting the combiner, but then the compiler throws an error. Is there any way I could transform this accumulator into a BinaryOperator<T>?
4. Function.identity() will always return a Function that returns its input argument, in this case of type String.
5. The second argument, which takes a BiFunction, is one of the things I'm confused about: (function, entrySet) -> stringToReplace -> function.apply(stringToReplace.replaceAll(entrySet.getKey(), entrySet.getValue())).
6. What does Function::andThen do in this process?
7. Lastly, where did the return type Function<String, String> in the BiFunction<T, U, R> come from?

Function<String, String> replaceFunction = environmentMap.entrySet()
    .stream()
    .reduce(Function.identity(),
            (function, entrySet) -> stringToReplace ->
                    function.apply(stringToReplace.replaceAll(entrySet.getKey(), entrySet.getValue())),
            Function::andThen);
The line environmentMap.entrySet().stream() will create a stream of Entry<String, String>. The reduce() method that I am using takes an identity, an accumulator, and a combiner.
Up to this point you are right, but the identity, accumulator, and combiner get resolved as follows:
identity - Function.identity() gets resolved to Function<String, String>.
accumulator - (function,entrySet) -> stringToReplace -> function.apply(stringToReplace.replaceAll(entrySet.getKey(),entrySet.getValue())).
Here the following part,
`stringToReplace -> function.apply(stringToReplace.replaceAll(entrySet.getKey(),entrySet.getValue()))`
gets resolved into Function<String, String>.
And so this entire accumulator gets resolved to
`BiFunction<Function<String, String>, Entry<String, String>, Function<String, String>>`
combiner - Function::andThen gets resolved to BinaryOperator<Function<String, String>> (BinaryOperator takes a single type parameter).
So, to answer your questions 6 and 7: Function::andThen acts as the combiner (6), and the return type Function<String, String> comes from the accumulator described above (7).
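To see the whole reduction at work, here is a minimal, self-contained sketch; the map contents and the template string are made up purely for illustration:
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class ReduceDemo {
    public static void main(String[] args) {
        Map<String, String> environmentMap = new HashMap<>();
        environmentMap.put("HOST", "localhost"); // keys are treated as regexes by replaceAll
        environmentMap.put("PORT", "8080");

        // Each entry is folded into the function chain: the accumulator wraps the function
        // built so far with one more replaceAll, and andThen chains partial results.
        Function<String, String> replaceFunction = environmentMap.entrySet()
                .stream()
                .reduce(Function.identity(),
                        (function, entry) -> stringToReplace ->
                                function.apply(stringToReplace.replaceAll(entry.getKey(), entry.getValue())),
                        Function::andThen);

        System.out.println(replaceFunction.apply("http://HOST:PORT/api")); // http://localhost:8080/api
    }
}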

Related

Create a Map<Key,Object> based on a List<Object> - Converting Loops into Java streams

I'm facing an error while converting a List<Object> to a HashMap<Key, Object>.
I can do it with traditional code, but I'm getting an error when using Java 8 streams and ignoring the duplicate keys.
Equivalent code:
Map<String, Field> uniqueFields = new HashMap<>();
for (Field field : allFields) {
uniqueFields.put(field.getName(), field);
}
I tried the stream below, but it doesn't compile:
Map<String, Field> uniqueFields1 = allFields.stream()
.collect(Collectors.toMap(
Field::getName,
(oldValue, newValue) -> oldValue));
You just need one more argument to toMap to tell the collector what the values of the Map should be. You can use Function.identity(), which will just pass the Field right through.
Map<String,Field> uniqueFields = allFields.stream()
.collect(Collectors.toMap(Field::getName, Function.identity(),
(oldValue,newValue) -> oldValue));
The flavor of Collectors.toMap() that expects two arguments should be provided with two Functions: keyMapper and valueMapper (which are responsible for extracting a key and a value, respectively, from a stream element).
But in your code snippet, the second argument isn't of type Function, because you've provided two arguments oldValue, newValue, but Function takes only one.
You need to use a version of toMap that in addition to keyMapper and valueMapper functions expects the third argument BinaryOperator mergeFunction which is meant to resolve values mapped to the same key:
Map<String, Field> uniqueFields = allFields.stream()
    .collect(Collectors.toMap(
        Field::getName,          // generating a key
        Function.identity(),     // generating a value
        (oldValue, newValue) -> oldValue)); // resolving duplicates
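As a rough, self-contained illustration of the merge function kicking in on a duplicate key (the Field class below is just a stand-in mirroring the one from the question):
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.function.Function;
import java.util.stream.Collectors;

public class ToMapDemo {
    // stand-in for the question's Field type
    static class Field {
        private final String name;
        Field(String name) { this.name = name; }
        String getName() { return name; }
    }

    public static void main(String[] args) {
        List<Field> allFields = Arrays.asList(new Field("id"), new Field("name"), new Field("id"));

        Map<String, Field> uniqueFields = allFields.stream()
                .collect(Collectors.toMap(
                        Field::getName,                      // key mapper
                        Function.identity(),                 // value mapper
                        (oldValue, newValue) -> oldValue));  // on the duplicate "id" key, the first Field wins

        System.out.println(uniqueFields.size()); // 2
    }
}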

How to merge two Maps <String, LocalDateTime> and calculate difference between values in nanoseconds

Hello everyone.
I've encountered a problem while trying to merge two maps of type Map<String, LocalDateTime>. The IDE shows me a compile error:
non-static method cannot be referenced from a static context
The thing is that I need to merge them into a new map with a different value type; in this case I need to calculate the difference between the LocalDateTime of the first map and the LocalDateTime of the second, so the result should be a Map<String, Long>.
Map<String, Long> result = Stream.concat(logStartTimeMap.entrySet().stream(), logEndTimeMap.entrySet().stream())
.collect(Collectors.toMap(
Map.Entry::getKey,
Map.Entry::getValue,
(value1, value2) -> new Long(ChronoUnit.NANOS.between(value1,value2))));
I get the error on the lines Map.Entry::getKey and Map.Entry::getValue.
And another one on the line ChronoUnit.NANOS.between(value1, value2), as the IDE thinks that value1 and value2 are of type Object instead of Temporal. That's quite strange, because previously I got that error when I chose an incorrect parameter type for the resulting Map. But here it looks fine: ChronoUnit.NANOS.between(value1, value2) returns a long, and the result Map has Long as its value type.
The way you're using Collectors.toMap means you're attempting to return a Map<String, LocalDateTime>. That's because Map.Entry::getKey returns a String and Map.Entry::getValue returns a LocalDateTime. Then in your merge function you attempt to return a Long which does not match the generic signature.
Instead, you can do something like the following:
Map<String, LocalDateTime> map1 = ...;
Map<String, LocalDateTime> map2 = ...;
Map<String, Duration> result = map1.entrySet().stream().collect(Collectors.toMap(
Map.Entry::getKey,
entry -> Duration.between(entry.getValue(), map2.get(entry.getKey()))
));
The above computes a Duration which gives you more flexibility than just having the nanoseconds, but you can obviously change that to fit your needs:
Map<String, LocalDateTime> map1 = ...;
Map<String, LocalDateTime> map2 = ...;
Map<String, Long> result = map1.entrySet().stream().collect(Collectors.toMap(
Map.Entry::getKey,
entry -> entry.getValue().until(map2.get(entry.getKey()), ChronoUnit.NANOS)
));
Note both examples above assume that map1's key set is a subset of map2's key set. In a comment you say:
I have the same keys in both maps. The matching key has to be in the second map. So the situation where there is no matching key is eliminated
Which means that assumption is valid. However, if that assumption becomes incorrect, all you need to do is add a filter operation that excludes any entry of map1 whose key is not present in map2. Ravindra shows an example of this in his answer.
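In that case, a minimal sketch of such a filter (reusing the logStartTimeMap and logEndTimeMap names from the question, with ChronoUnit.NANOS for the Long values) might look like this:
Map<String, Long> result = logStartTimeMap.entrySet().stream()
    .filter(entry -> logEndTimeMap.containsKey(entry.getKey())) // drop keys with no match in the second map
    .collect(Collectors.toMap(
        Map.Entry::getKey,
        entry -> ChronoUnit.NANOS.between(entry.getValue(), logEndTimeMap.get(entry.getKey()))));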
I see no way to do it by combining the two maps into one stream.
The mergeFunction (third parameter of Collectors.toMap) has to be a BinaryOperator<U>. That means something like (U, U) -> U. But in your code it takes two parameters of type LocalDateTime and returns a Long.
You can do it with streams like this:
Map<String, Long> result = logStartTimeMap.entrySet().stream()
.map(e -> new AbstractMap.SimpleEntry<>(e.getKey(), ChronoUnit.NANOS.between(e.getValue(), logEndTimeMap.get(e.getKey()))))
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));

Group, Collectors, Map (Int to String), Map (Map to Object)

This is a continuation of my previous question at Group, Sum byType then get diff using Java streams.
As suggested, I should post as a separate thread instead of updating the original one.
So, with my previous set of questions I have achieved that, and now for the continuation.
Background:
I have the following dataset
Sample(SampleId=1, SampleTypeId=1, SampleQuantity=5, SampleType=ADD),
Sample(SampleId=2, SampleTypeId=1, SampleQuantity=15, SampleType=ADD),
Sample(SampleId=3, SampleTypeId=1, SampleQuantity=25, SampleType=ADD),
Sample(SampleId=4, SampleTypeId=1, SampleQuantity=5, SampleType=SUBTRACT),
Sample(SampleId=5, SampleTypeId=1, SampleQuantity=25, SampleType=SUBTRACT)
Sample(SampleId=6, SampleTypeId=2, SampleQuantity=10, SampleType=ADD),
Sample(SampleId=7, SampleTypeId=2, SampleQuantity=20, SampleType=ADD),
Sample(SampleId=8, SampleTypeId=2, SampleQuantity=30, SampleType=ADD),
Sample(SampleId=9, SampleTypeId=2, SampleQuantity=15, SampleType=SUBTRACT),
Sample(SampleId=10, SampleTypeId=2, SampleQuantity=35, SampleType=SUBTRACT)
I am currently using this:
sampleList.stream()
    .collect(Collectors.groupingBy(Sample::getTypeId,
        Collectors.summingInt(
            sample -> SampleType.ADD.equalsIgnoreCase(sample.getSampleType())
                ? sample.getSampleQuantity()
                : -sample.getSampleQuantity()
        )));
And also this
sampleList.stream()
    .collect(Collectors.groupingBy(Sample::getSampleTypeId,
        Collectors.collectingAndThen(
            Collectors.groupingBy(Sample::getSampleType,
                Collectors.summingInt(Sample::getSampleQuantity)),
            map -> map.getOrDefault(SampleType.ADD, 0)
                 - map.getOrDefault(SampleType.SUBTRACT, 0))));
as the accepted answer, to get the desired output grouped into a Map<Long, Integer>:
{1=15, 2=10}
With that, I was wondering if this could be expanded into something more.
First, how could I have it return a Map<String, Integer> instead of the original Map<Long, Integer>? Basically, for the SampleTypeId, 1 refers to HELLO and 2 refers to WORLD.
So I would need something like a .map (or maybe another function) to transform the data from 1 to HELLO and 2 to WORLD by calling a function, say convertType(sampleTypeId). The expected output would then be {"HELLO"=15, "WORLD"=10}. Is that right? How should I edit the currently suggested solution to do this?
Lastly, I would like to know if it is also possible to return an object instead of a Map. Let's say I have an object SummaryResult with a (String) name and an (int) result, so that it returns a List<SummaryResult> instead of the original Map<Long, Integer>. How can I use .map (or something else) to do this? Or is there another way of doing so? The expected output would be something along these lines:
SummaryResult(name="hello", result=15),
SummaryResult(name="world", result=10),
I would really appreciate an explanation in steps, as given previously by @M. Prokhorov.
Update:
After updating to
sampleList.stream()
    .collect(Collectors.groupingBy(sample -> convertType(sample.getSampleTypeId()),
        Collectors.collectingAndThen(
            Collectors.groupingBy(Sample::getSampleType,
                Collectors.summingInt(Sample::getSampleQuantity)),
            map -> map.getOrDefault(SampleType.ADD, 0)
                 - map.getOrDefault(SampleType.SUBTRACT, 0))));

private String convertType(int id) {
    return (id == 1) ? "HELLO" : "WORLD";
}
For the first part, considering that you have somewhere the method
String convertType(int typeId)
you simply need to change the first classifier from
groupingBy(Sample::getTypeId)
to
groupingBy(sample -> convertType(sample.getTypeId()))
Everything else remains the same.
The latter is a little trickier, and technically doesn't benefit from being a stream-based solution at all.
What you need is this:
public List<SummaryResult> toSummaryResultList(Map<String, Integer> resultMap) {
List<SummaryResult> list = new ArrayList<>(resultMap.size());
for (Map.Entry<String, Integer> entry : resultMap.entrySet()) {
String name = entry.getKey();
Integer value = entry.getValue();
// replace below with construction method you actually have
list.add(SummaryResult.withName(name).andResult(value));
}
return list;
}
You can use this as part of collector composition, where your whole collector will get wrapped into a collectingAndThen call:
collectingAndThen(
    groupingBy(sample -> convertType(sample.getTypeId()),
        collectingAndThen(
            groupingBy(Sample::getSampleType,
                summingInt(Sample::getSampleQuantity)),
            map -> map.getOrDefault(SampleType.ADD, 0)
                 - map.getOrDefault(SampleType.SUBTRACT, 0))),
    result -> toSummaryResultList(result))
However, as you can see, it is the whole collector that gets wrapped, so in my eyes there is no real benefit of the above version over the simpler and (at least to me) easier to follow version below, which uses an intermediate variable but isn't such a wall of code:
// do the whole collecting thing like before
Map<String, Integer> resultMap = sampleList.stream()
    .collect(Collectors.groupingBy(sample -> convertType(sample.getTypeId()),
        Collectors.collectingAndThen(
            Collectors.groupingBy(Sample::getSampleType,
                Collectors.summingInt(Sample::getSampleQuantity)),
            map -> map.getOrDefault(SampleType.ADD, 0)
                 - map.getOrDefault(SampleType.SUBTRACT, 0))));

// return the "beautified" result
return toSummaryResultList(resultMap);
Another point to consider in the above: the convertType method will be called as many times as there are elements in sampleList, so if the convertType call is "heavy" (for example, it uses a database or IO), then it's better to call it as part of the toSummaryResultList conversion, not as the stream element classifier. In that case you would still be collecting into a map of type Map<Integer, Integer> and using convertType inside the loop. I will not add any code for this, as I view the change as trivial.
You could indeed use a map() function:
sampleList.stream()
    .collect(Collectors.groupingBy(Sample::getSampleTypeId,
        Collectors.collectingAndThen(
            Collectors.groupingBy(Sample::getSampleType,
                Collectors.summingInt(Sample::getSampleQuantity)),
            map -> map.getOrDefault(SampleType.ADD, 0)
                 - map.getOrDefault(SampleType.SUBTRACT, 0))))
    .entrySet()
    .stream()
    .map(entry -> new SummaryResult(entry.getKey(), entry.getValue()))
    .collect(Collectors.toList());
ToIntFunction<Sample> signedQuantityMapper = sample -> sample.getQuantity()
        * (sample.getType() == Type.ADD ? 1 : -1);
Function<Sample, String> keyMapper = s -> Integer.toString(s.getTypeId());

Map<String, Integer> result = sampleList.stream().collect(
        Collectors.groupingBy(
                keyMapper,
                Collectors.summingInt(signedQuantityMapper)));

Java 8 stream "Cannot use this in a static context"

I am new to Java 8 streams, so sorry about the stupid question. Here is my code, in which I am trying to create a map of id and value, but I am getting this error and can't fix it. Can anyone tell me what the alternative is?
public static Map<Integer, String> findIdMaxValue() {
    Map<Integer, Map<String, Integer>> attrIdAttrValueCountMap = new HashMap<>();
    Map<Integer, String> attrIdMaxValueMap = new HashMap<>();
    attrIdAttrValueCountMap.forEach((attrId, attrValueCountMap) -> {
        attrValueCountMap.entrySet().stream().sorted(this::compareAttrValueCountEntry).findFirst().ifPresent(e -> {
            attrIdMaxValueMap.put(attrId, e.getKey());
        });
    });
    return attrIdMaxValueMap;
}
and the sorting method:
public static int compareAttrValueCountEntry(Map.Entry<String, Integer> e1, Map.Entry<String, Integer> e2) {
int diff = e1.getValue() - e2.getValue();
if (diff != 0) {
return -diff;
}
return e1.getKey().compareTo(e2.getKey());
}
I am getting this error
"Cannot use this in a static context"
There are several issues with your code. While this::compareAttrValueCountEntry would be easy to
fix by changing it to ContainingClassName::compareAttrValueCountEntry, this method is unnecessary
as there are several factory methods like Map.Entry.comparingByKey, Map.Entry.comparingByValue,
Comparator.reversed and Comparator.thenComparing, which can be combined to achieve the same goal.
This guards you from the errors made within compareAttrValueCountEntry. It’s tempting to compare int
values by subtracting, but this is error prone as the difference between two int values doesn’t always
fit into the int range, so overflows can occur. Also, negating the result for reversing the order is
broken, as the value might be Integer.MIN_VALUE, which has no positive counterpart, hence, negating it
will overflow back to Integer.MIN_VALUE instead of changing the sign.
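As a small illustration of that pitfall (the values are picked only to trigger the overflow):
int v1 = Integer.MIN_VALUE;   // an extreme, contrived count
int v2 = 1;
System.out.println(v1 - v2);  // 2147483647: the subtraction overflows, so the "difference" is positive although v1 < v2
System.out.println(-Integer.MIN_VALUE == Integer.MIN_VALUE); // true: negation overflows back to MIN_VALUE
System.out.println(Integer.compare(v1, v2));                 // -1: the safe way to compare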
Instead of looping via forEach to add to another map, you may use a cleaner stream operation producing
the map and you can simplify sorted(…).findFirst() to min(…), which is not only shorter, but also a
potentially cheaper operation.
Putting it together, we get
Map<Integer, String> attrIdMaxValueMap =
    attrIdAttrValueCountMap.entrySet().stream()
        .filter(e -> !e.getValue().isEmpty())
        .collect(Collectors.toMap(Map.Entry::getKey,
            e -> e.getValue().entrySet().stream()
                .min(Map.Entry.<String, Integer>comparingByValue().reversed()
                    .thenComparing(Map.Entry.comparingByKey())).get().getKey()));
Note that I prepended a filter operation rejecting empty maps, which ensures that there will always be
a matching element, so there is no need to deal with ifPresent or the like. Instead, Optional.get
can be called unconditionally.
Since this method is called findIdMaxValue, there might be a desire to reflect that by calling max
on the Stream instead of min, which is only a matter of which comparator to reverse:
Map<Integer, String> attrIdMaxValueMap =
    attrIdAttrValueCountMap.entrySet().stream()
        .filter(e -> !e.getValue().isEmpty())
        .collect(Collectors.toMap(Map.Entry::getKey,
            e -> e.getValue().entrySet().stream()
                .max(Map.Entry.<String, Integer>comparingByValue()
                    .thenComparing(Map.Entry.comparingByKey(Comparator.reverseOrder())))
                .get().getKey()));
Unfortunately, such constructs hit the limitations of type inference, which requires us to either
use nested constructs (like Map.Entry.comparingByKey(Comparator.reverseOrder()) instead of
Map.Entry.comparingByKey().reversed()) or to insert explicit types, like with
Map.Entry.<String, Integer>comparingByValue(). In the second variant, reversing the second comparator,
we are hitting the limitation twice…
In this specific case, there might be a point in creating the comparator only once, keeping it in a variable and reuse it within the stream operation:
Comparator<Map.Entry<String, Integer>> valueOrMinKey
    = Map.Entry.<String, Integer>comparingByValue()
        .thenComparing(Map.Entry.comparingByKey(Comparator.reverseOrder()));

Map<Integer, String> attrIdMaxValueMap =
    attrIdAttrValueCountMap.entrySet().stream()
        .filter(e -> !e.getValue().isEmpty())
        .collect(Collectors.toMap(Map.Entry::getKey,
            e -> e.getValue().entrySet().stream().max(valueOrMinKey).get().getKey()));
Since the method compareAttrValueCountEntry is declared static,
replace the method reference
this::compareAttrValueCountEntry
with
<Yourclass>::compareAttrValueCountEntry
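For example, assuming the enclosing class is called AttrStats (the name here is just a placeholder), the stream line from the question becomes:
attrValueCountMap.entrySet().stream()
    .sorted(AttrStats::compareAttrValueCountEntry) // class name instead of 'this'
    .findFirst()
    .ifPresent(e -> attrIdMaxValueMap.put(attrId, e.getKey()));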

Generic check in the stream collect java

List<String> namesOfMaleMembersCollect = roster
.stream()
.filter(p -> p.getGender() == Person.Sex.MALE)
.map(p -> p.getName())
.collect(Collectors.toList());
I've got this code, where roster is defined as List<Person>. At which point is it checked that the returned List consists of Strings? I mean, we've got the List defined, but then there is no information about the returned values being Strings. Is this:
.map(p -> p.getName())
.collect(Collectors.toList());
the place where it is determined that .map() produces Strings, so that the element type of the list returned by .collect() will be the same?
Type inference is a powerful tool that comes with generics. When you call .map(p -> p.getName()) it returns a Stream<String>; the Stream now has the type parameter String instead of T.
Now you call collect, which takes a Collector with the following signature:
<R, A> R collect(Collector<? super T, A, R> collector)
And in the case of a Stream<String> it will be inferred as
Collector<String, ?, List<String>>
giving us a List<String>.
You can rewrite your code to the following:
Collector<String, ?, List<String>> collector = Collectors.toList();
...map(p -> p.getName())
.collect(collector);
Meaning the type is inferred from the type of the variable the result is being assigned to.
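The inferred type can also be spelled out explicitly at the call site with a type witness; this is only a minor variation on the question's snippet, shown to make the inferred type visible:
List<String> namesOfMaleMembersCollect = roster
    .stream()
    .filter(p -> p.getGender() == Person.Sex.MALE)
    .map(p -> p.getName())
    .collect(Collectors.<String>toList()); // explicit type argument instead of relying on inference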
The map method returns a stream of the results of applying the given function, so this returns a stream of Strings, and then we collect the result into a List using the collector.
In this case it's in this map function:
map(p -> p.getName())
which can also be written like this:
map((Person p) -> { return p.getName(); })
