I have the following maps:
{21=0, 22=2, 11=0, 12=0}
{21=3, 22=0, 11=6, 12=3}
{21=6, 22=0, 11=7, 12=0}
{21=5, 22=7, 11=9, 12=1}
The following code returns these maps:
for (Chrom t : obj.getChroms()) {
Map<Integer, Integer> result = t.getExecutionCount();
}
The method getExecutionCount() returns a single map. In the example above, I have four chroms, and each chrom returns a single map.
I would like to sum the values for each key separately, so that the final result will be:
21 = 14
22 = 9
11 = 22
12 = 4
Is it possible to do that with a stream? If not, how else can I do it?
Try this:
List<Map<Integer, Integer>> maps;
Map<Integer, Integer> result = maps.stream()
.map(Map::entrySet)
.flatMap(Collection::stream)
.collect(Collectors.groupingBy(
Map.Entry::getKey,
Collectors.summingInt(Map.Entry::getValue)));
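The maps list isn't built anywhere in the question, so as a rough sketch (assuming obj.getChroms() returns a Collection<Chrom>, as your loop suggests), you could fill it first:
// Assumption: obj.getChroms() returns a Collection<Chrom>, as in the question's loop.
List<Map<Integer, Integer>> maps = new ArrayList<>();
for (Chrom t : obj.getChroms()) {
    maps.add(t.getExecutionCount());
}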
You can create a Stream of the maps and then use flatMap:
Stream.of(map1, map2, map3)
.flatMap(m -> m.entrySet()
.stream())
.collect(Collectors.groupingBy(
Map.Entry::getKey,
Collectors.summingInt(Map.Entry::getValue)
)
);
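If the maps come straight from the chroms in the question, you can also skip the intermediate variables and stream them directly. A minimal sketch, assuming getChroms() returns a Collection<Chrom> and getExecutionCount() returns a Map<Integer, Integer> as described:
// Sketch only: obj.getChroms() and Chrom.getExecutionCount() are assumed from the question.
Map<Integer, Integer> result = obj.getChroms().stream()
        .map(Chrom::getExecutionCount)           // Stream<Map<Integer, Integer>>
        .flatMap(m -> m.entrySet().stream())     // Stream<Map.Entry<Integer, Integer>>
        .collect(Collectors.groupingBy(
                Map.Entry::getKey,
                Collectors.summingInt(Map.Entry::getValue)));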
It's possible using a Stream. This should work (I can't compile it right now to check, unfortunately):
Map<Integer, Integer> result = Stream.of(map1, map2, map3, map4)
.map(Map::entrySet)
.flatMap(Set::stream)
.collect(Collectors.toMap(
Map.Entry::getKey,
Map.Entry::getValue,
Integer::sum)
);
A little explanation:
Stream over all your maps
Flatten them into a single Stream of all the entries contained in your maps
Collect the entries into a map, using each entry's key
For collisions (the same key occurring in several maps), use Integer::sum as the merge function, which adds the values for that key
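As a quick illustration with the four maps from the question (a sketch only; Map.of needs Java 9+, otherwise build the maps with put):
Map<Integer, Integer> map1 = Map.of(21, 0, 22, 2, 11, 0, 12, 0);
Map<Integer, Integer> map2 = Map.of(21, 3, 22, 0, 11, 6, 12, 3);
Map<Integer, Integer> map3 = Map.of(21, 6, 22, 0, 11, 7, 12, 0);
Map<Integer, Integer> map4 = Map.of(21, 5, 22, 7, 11, 9, 12, 1);

Map<Integer, Integer> result = Stream.of(map1, map2, map3, map4)
        .map(Map::entrySet)
        .flatMap(Set::stream)
        .collect(Collectors.toMap(
                Map.Entry::getKey,
                Map.Entry::getValue,
                Integer::sum));

System.out.println(result); // key 21: 0 + 3 + 6 + 5 = 14, and so on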
I have a list of items as below
List<SomeModel> smList = new ArrayList<>();
smList.add(new SomeModel(1L,6.0f));//id = 1L and capacity = 6.0f
smList.add(new SomeModel(2L,7.0f));
smList.add(new SomeModel(3L,7.0f));
smList.add(new SomeModel(4L,7.0f));
Now I am converting this list into
Map<Float, Set<Long>> complexList = new HashMap<>();
complexList = smList.stream().collect(Collectors.groupingBy(SomeModel::getCapacity,
Collectors.mapping(SomeModel::getId, Collectors.toSet())));
Printing complexList gives this output:
7.0=[2, 3, 4]
6.0=[1]
Now I need to count the number of values for each "capacity", giving output like:
7.0=3
6.0=1
I tried
Map<Float, Long> complexCount = complexList.entrySet().stream().
collect(Collectors.groupingBy(Map.Entry::getKey,
Collectors.mapping(Map.Entry::getValue, Collectors.counting())));
complexCount.forEach((k,v)->System.out.println(k+"="+v));
and it outputs
6.0=1
7.0=1
I must be misunderstanding streams or not using the right methods. Can anyone suggest an approach or a solution? A reference link for streams would also be helpful.
If all you want to do is print each key of the map along with the size of the corresponding value, then there is no need to stream again, as that causes unnecessary overhead. Simply iterate over complexList and print it like so:
complexList.forEach((k,v)->System.out.println(k+"="+v.size()));
Or, if you really want a map, you could also do:
Map<Float, Integer> accumulator = new HashMap<>();
complexList.forEach((k,v)->accumulator.put(k, v.size()));
You are making it very complex. Easier solution below:
Map<Float, Long> complexCount = complexList
.entrySet()
.stream()
.collect(Collectors.toMap(
Map.Entry::getKey,
entry -> (long) entry.getValue().size() // widen to long; avoids the deprecated Long constructor
)
);
Here, you just need to call Collectors.toMap. It takes two functions: one producing the key and one producing the value of the resulting map.
If you are not required to use Long as the map's value type, then:
Map<Float, Integer> complexCount = complexList
.entrySet()
.stream()
.collect(Collectors.toMap(
Map.Entry::getKey,
entry -> entry.getValue().size()
)
);
You can make use of nested Collectors and collectingAndThen(), and you don't even need to collect into an intermediate map:
import static java.util.stream.Collectors.*;
/* ... */
Map<Float, Integer> collect = smList.stream()
.collect(groupingBy(SomeModel::getCapacity,
collectingAndThen(
mapping(SomeModel::getId, toSet()),
Set::size
)
));
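With the smList from the question, printing that map gives the desired counts (HashMap iteration order is not guaranteed):
collect.forEach((capacity, count) -> System.out.println(capacity + "=" + count));
// 6.0=1
// 7.0=3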
There might be one abc for several payments. Right now I have:
//find abc id for each payment id
Map<Long, Integer> abcIdToPmtId = paymentController.findPaymentsByIds(pmtIds)
.stream()
.collect(Collectors.toMap(Payment::getAbcId, Payment::getPaymentId));
But then I realized this could have duplicate keys, so I want it to return a
Map<Long, List<Integer>> abcIdToPmtIds
in which an entry contains one abc and its several payments.
I know I might be able to use groupingBy, but then I think I can only get Map<Long, List<Payment>>.
Use the groupingBy overload that takes a downstream collector, combined with mapping.
// with static imports of Collectors.groupingBy, Collectors.mapping and Collectors.toList
Map<Long, List<Integer>> abcIdToPmtIds = paymentController.findPaymentsByIds(pmtIds)
        .stream()
        .collect(groupingBy(Payment::getAbcId,
                 mapping(Payment::getPaymentId, toList())));
Problem statement: Converting SimpleImmutableEntry<String, List<String>> -> Map<String, List<String>>.
For instance, you have SimpleImmutableEntry elements of the form [A,[1]], [B,[2]], [A,[3]] and you want your map to look like this: A -> [1,3], B -> [2].
This can be done with Collectors.toMap, but Collectors.toMap works only with unique keys unless you provide a merge function to resolve the collisions, as described in the Javadoc:
https://docs.oracle.com/javase/8/docs/api/java/util/stream/Collectors.html#toMap-java.util.function.Function-java.util.function.Function-java.util.function.BinaryOperator-
So the example code looks like this:
.map(returnSimpleImmutableEntries)
.collect(Collectors.toMap(SimpleImmutableEntry::getKey,
SimpleImmutableEntry::getValue,
(oldList, newList) -> { oldList.addAll(newList); return oldList; } ));
The returnSimpleImmutableEntries method returns entries of the form [A,[1]], [B,[2]], [A,[3]], on which you can use your collectors.
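A self-contained sketch of the same idea, with the entries built inline instead of via the (hypothetical) returnSimpleImmutableEntries method, and with mutable value lists so the merge function's addAll works:
// Uses java.util.AbstractMap.SimpleImmutableEntry; the data mirrors the example above.
List<AbstractMap.SimpleImmutableEntry<String, List<String>>> entries = Arrays.asList(
        new AbstractMap.SimpleImmutableEntry<String, List<String>>("A", new ArrayList<>(Arrays.asList("1"))),
        new AbstractMap.SimpleImmutableEntry<String, List<String>>("B", new ArrayList<>(Arrays.asList("2"))),
        new AbstractMap.SimpleImmutableEntry<String, List<String>>("A", new ArrayList<>(Arrays.asList("3"))));

Map<String, List<String>> merged = entries.stream()
        .collect(Collectors.toMap(
                AbstractMap.SimpleImmutableEntry::getKey,
                AbstractMap.SimpleImmutableEntry::getValue,
                (oldList, newList) -> { oldList.addAll(newList); return oldList; }));

System.out.println(merged); // {A=[1, 3], B=[2]}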
With Collectors.toMap:
Map<Long, List<Integer>> abcIdToPmtIds = paymentController.findPaymentsByIds(pmtIds)
.stream()
.collect(Collectors.toMap(
Payment::getAbcId,
p -> new ArrayList<>(Arrays.asList(p.getPaymentId())),
(o, n) -> { o.addAll(n); return o; }));
Though it's clearer and more readable to use Collectors.groupingBy along with Collectors.mapping, as shown in the answer above.
You don't need streams to do it though:
Map<Long, List<Integer>> abcIdToPmtIds = new HashMap<>();
paymentController.findPaymentsByIds(pmtIds).forEach(p ->
abcIdToPmtIds.computeIfAbsent(
p.getAbcId(),
k -> new ArrayList<>())
.add(p.getPaymentId()));
LinkedList<Double> list = new LinkedList<Double>();
list.add(9.5);
list.add(4.9);
list.add(3.2);
list.add(4.9);
I want to count the duplicate elements in the list with a stream and put them into a HashMap representing the number of occurrences of each number in the list:
e.g.: {9.5=1, 4.9=2, 3.2=1}
Does anybody know how this works?
Using Collections.frequency
Stream the distinct values and, for each of them, count its occurrences using the Collections.frequency method. Then collect into a Map:
Map<Double, Integer> result = list.stream()
.distinct()
.collect(Collectors.toMap(
Function.identity(),
v -> Collections.frequency(list, v))
);
Using Collectors.groupingBy
I think it is not as nice as the example above.
Map<Double, Integer> result2 = list.stream()
.collect(Collectors.groupingBy(Function.identity())) // this makes {3.2=[3.2], 9.5=[9.5], 4.9=[4.9, 4.9]}
.entrySet().stream()
.collect(Collectors.toMap(
Map.Entry::getKey,
e -> e.getValue().size())
);
Plain for loop
A plain for loop is quite short; you might not need streams and lambdas at all:
Map<Double, Integer> map = new HashMap<>();
for(Double d : list)
map.put(d, map.containsKey(d) ? map.get(d)+1 : 1);
Using forEach
Even shorter with forEach
Map<Double, Integer> map = new HashMap<>();
list.forEach(d -> map.put(d, map.containsKey(d) ? map.get(d)+1 : 1));
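The same forEach can also be written with Map.merge, a common shorthand for this initialize-or-increment pattern (an equivalent sketch, not part of the original answer):
Map<Double, Integer> map = new HashMap<>();
list.forEach(d -> map.merge(d, 1, Integer::sum)); // put 1 on first occurrence, add 1 afterwards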
Another way is Collectors.counting, which doesn't need the distinct step (note that the values are Long here):
Map<Double, Long> frequencies = list.stream()
.collect(Collectors.groupingBy(Function.identity(), Collectors.counting()));
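If you specifically want Integer values as in the earlier variants, you can convert the Long count downstream. A sketch using collectingAndThen (not in the original answer):
Map<Double, Integer> frequencies = list.stream()
        .collect(Collectors.groupingBy(
                Function.identity(),
                Collectors.collectingAndThen(Collectors.counting(), Long::intValue)));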
I have a collection of maps:
Collection<Map<String,Double>> myCol = table.values();
I would like to transform this into a Map
Map<String, Double>
such that, for a matching key, values are summed up. Using a for loop, it is rather simple:
Map<String, Double> outMap = new HashMap<>();
for (Map<String, Double> map : myCol) {
outMap = mergeMaps(outMap, map);
}
and mergeMaps() is defined as:
Map<String, Double> mergeMaps(Map<String, Double> m1, Map<String, Double> m2) {
Map<String, Double> outMap = new TreeMap<>(m1);
m2.forEach((k,v) -> outMap.merge(k,v,Double::sum)); /*sum values if key exists*/
return outMap;
}
However, I would like to use streams to get the map from the collection. I have tried the following:
Map<String, Double> outMap = new HashMap<>();
myCol.stream().forEach(e-> outMap.putAll(mergeMaps(outMap,e)));
return outMap;
This works without a problem. However, can I still improve it? I mean, how can I use collectors in it?
From your input, you can fetch the stream of maps and then flatmap it to have a Stream<Map.Entry<String, Double>>. From there, you collect them into a new map, specifying that you want to sum the values mapped to the same key.
import static java.util.stream.Collectors.groupingBy;
import static java.util.stream.Collectors.summingDouble;
import static java.util.stream.Collectors.toMap;
....
Map<String, Double> outMap =
myCol.stream()
.flatMap(m -> m.entrySet().stream())
.collect(toMap(Map.Entry::getKey, Map.Entry::getValue, Double::sum));
Alternatively, you can use groupingBy instead of toMap:
.collect(groupingBy(Map.Entry::getKey, summingDouble(Map.Entry::getValue)));
Map<String, Double> outMap = myCol.stream()
        .flatMap(x -> x.entrySet().stream())
        .collect(Collectors.groupingBy(
                Map.Entry::getKey,
                TreeMap::new,
                Collectors.summingDouble(Map.Entry::getValue)));
Well, the other proposed solutions show that a pure stream solution is short, but if you want to keep using your existing mergeMaps method (because in other cases it might be more complex, for example), you can just hand it over to Stream.reduce:
Optional<Map<String, Double>> outMap = myCol.stream().reduce((m1, m2) -> mergeMaps(m1, m2));
Your initial approach with forEach is pretty much a stream-ified for loop and violates the idea of functions having no side effects. The reduce (or the collects above) handles all the data merging internally, without changing the input collection.
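If the Optional wrapper is inconvenient, a small sketch of unwrapping it (still assuming the mergeMaps method from the question):
Map<String, Double> outMap = myCol.stream()
        .reduce((m1, m2) -> mergeMaps(m1, m2))
        .orElseGet(TreeMap::new); // empty collection -> empty map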
With streams:
Map<String, Double> outMap = myCol.stream()
.flatMap(map -> map.entrySet().stream())
.collect(Collectors.toMap(
Map.Entry::getKey, // key of the result map
Map.Entry::getValue, // value of the result map
Double::sum, // how to merge values for equal keys
TreeMap::new)); // the type of map to be created
This uses Collectors.toMap to create the result TreeMap.
You can do it without streams, though. I think your version is a little complicated; you could refactor it as follows:
Map<String, Double> outMap = new TreeMap<>();
myCol.forEach(map -> map.forEach((k, v) -> outMap.merge(k, v, Double::sum)));
This is shorter, easier, and more readable.
How to convert a Map<String, Double> to List<Pair<String, Double>> in Java 8?
I wrote this implementation, but it is not efficient
Map<String, Double> implicitDataSum = new ConcurrentHashMap<>();
//....
List<Pair<String, Double>> mostRelevantTitles = new ArrayList<>();
implicitDataSum.entrySet()
.stream()
.sorted(Comparator.comparing(e -> -e.getValue()))
.forEachOrdered(e -> mostRelevantTitles.add(new Pair<>(e.getKey(), e.getValue())));
return mostRelevantTitles;
I know that it should work using .collect(Collectors.someMethod()), but I don't understand how to do that.
Well, you want to collect Pair elements into a List. That means that you need to map your Stream<Map.Entry<String, Double>> into a Stream<Pair<String, Double>>.
This is done with the map operation:
Returns a stream consisting of the results of applying the given function to the elements of this stream.
In this case, the function will be a function converting a Map.Entry<String, Double> into a Pair<String, Double>.
Finally, you want to collect that into a List, so we can use the built-in toList() collector.
List<Pair<String, Double>> mostRelevantTitles =
implicitDataSum.entrySet()
.stream()
.sorted(Comparator.comparing(e -> -e.getValue()))
.map(e -> new Pair<>(e.getKey(), e.getValue()))
.collect(Collectors.toList());
Note that you could replace the comparator Comparator.comparing(e -> -e.getValue()) by Map.Entry.comparingByValue(Comparator.reverseOrder()).
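That replacement makes the pipeline read like this (everything else unchanged):
List<Pair<String, Double>> mostRelevantTitles =
        implicitDataSum.entrySet()
                .stream()
                .sorted(Map.Entry.comparingByValue(Comparator.reverseOrder()))
                .map(e -> new Pair<>(e.getKey(), e.getValue()))
                .collect(Collectors.toList());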
Note that if you want efficient implementation, you should consider this:
List<Pair<String, Double>> mostRelevantTitles =
implicitDataSum.entrySet()
.stream()
.map(e -> new Pair<>(e.getKey(), e.getValue()))
.collect(Collectors.toList());
mostRelevantTitles.sort(Comparator.comparing(Pair::getSecond, Comparator.reverseOrder()));
I assume that your Pair class has a getSecond getter.
Using the sorted() stream pipeline step, you create an intermediate buffer, store everything in that buffer, convert it into an array, sort that array, and then store the result in the ArrayList. My approach, though less functional, stores the data directly in the target ArrayList and then sorts it in place without any additional copying, so it takes less time and less intermediate memory.
public List<TeamResult> process(final Map<String, Team> aggregatedMap) {
return aggregatedMap.entrySet()
.stream()
.map(e -> new TeamResult(e.getKey(),e.getValue()))
.collect(Collectors.toList());
}
Sort the map entries by value in reverse order, collect the keys into a list, and limit it to the first 2 results:
List<String> list = map.keySet().stream()
        .sorted(Comparator.comparing(map::get).reversed()) // descending by value; avoids the overflow-prone subtraction comparator
        .limit(2)
        .collect(Collectors.toList());