How to convert Map to List in Java 8

How to convert a Map<String, Double> to List<Pair<String, Double>> in Java 8?
I wrote this implementation, but it is not efficient
Map<String, Double> implicitDataSum = new ConcurrentHashMap<>();
//....
List<Pair<String, Double>> mostRelevantTitles = new ArrayList<>();
implicitDataSum.entrySet()
.stream()
.sorted(Comparator.comparing(e -> -e.getValue()))
.forEachOrdered(e -> mostRelevantTitles.add(new Pair<>(e.getKey(), e.getValue())));
return mostRelevantTitles;
I know that it should work using .collect(Collectors.someMethod()), but I don't understand how to do that.

Well, you want to collect Pair elements into a List. That means that you need to map your Stream<Map.Entry<String, Double>> into a Stream<Pair<String, Double>>.
This is done with the map operation:
Returns a stream consisting of the results of applying the given function to the elements of this stream.
In this case, the function will be a function converting a Map.Entry<String, Double> into a Pair<String, Double>.
Finally, you want to collect that into a List, so we can use the built-in toList() collector.
List<Pair<String, Double>> mostRelevantTitles =
implicitDataSum.entrySet()
.stream()
.sorted(Comparator.comparing(e -> -e.getValue()))
.map(e -> new Pair<>(e.getKey(), e.getValue()))
.collect(Collectors.toList());
Note that you could replace the comparator Comparator.comparing(e -> -e.getValue()) by Map.Entry.comparingByValue(Comparator.reverseOrder()).
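For example, the same pipeline with that comparator would look like this (a sketch reusing the names from the question):
List<Pair<String, Double>> mostRelevantTitles =
implicitDataSum.entrySet()
.stream()
.sorted(Map.Entry.comparingByValue(Comparator.reverseOrder()))
.map(e -> new Pair<>(e.getKey(), e.getValue()))
.collect(Collectors.toList());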

Note that if you want an efficient implementation, you should consider this:
List<Pair<String, Double>> mostRelevantTitles =
implicitDataSum.entrySet()
.stream()
.map(e -> new Pair<>(e.getKey(), e.getValue()))
.collect(Collectors.toList());
mostRelevantTitles.sort(Comparator.comparing(Pair::getSecond, Comparator.reverseOrder()));
I assume that your Pair class has a getSecond getter.
Using the sorted() stream pipeline step, you create an intermediate buffer, store everything into that buffer, convert it into an array, sort that array, and then store the result into the ArrayList. My approach, though less functional, stores the data directly into the target ArrayList and then sorts it in place without any additional copying. So my solution would take less time and intermediate memory.
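For reference, here is a minimal Pair class with the shape these snippets assume (purely illustrative; your actual Pair class may differ):
// Illustrative only: a tiny immutable Pair with the constructor and getter used above.
public final class Pair<A, B> {
private final A first;
private final B second;
public Pair(A first, B second) {
this.first = first;
this.second = second;
}
public A getFirst() { return first; }
public B getSecond() { return second; }
}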

public List<TeamResult> process(final Map<String, Team> aggregatedMap) {
return aggregatedMap.entrySet()
.stream()
.map(e -> new TeamResult(e.getKey(),e.getValue()))
.collect(Collectors.toList());
}

Sort the map by value in reverse order, collect the keys into a list, and limit the result to the first two entries:
List<String> list = map.keySet().stream()
.sorted((k1, k2) -> Double.compare(map.get(k2), map.get(k1)))
.limit(2)
.collect(Collectors.toList());


Java-Stream, toMap with duplicate keys

So there might be one abc for several payments. Right now I have:
//find abc id for each payment id
Map<Long, Integer> abcIdToPmtId = paymentController.findPaymentsByIds(pmtIds)
.stream()
.collect(Collectors.toMap(Payment::getAbcId, Payment::getPaymentId));
But then I realized this could have duplicate keys, so I want it to return a
Map<Long, List<Integer>> abcIdToPmtIds
where each entry contains one abc and its several payments.
I know I could probably use groupingBy, but then I think I can only get Map<Long, List<Payment>>.
Use the other groupingBy overload (with static imports of Collectors.groupingBy, Collectors.mapping and Collectors.toList):
Map<Long, List<Integer>> abcIdToPmtIds = paymentController.findPaymentsByIds(pmtIds)
.stream()
.collect(groupingBy(Payment::getAbcId, mapping(Payment::getPaymentId, toList())));
Problem statement: Converting SimpleImmutableEntry<String, List<String>> -> Map<String, List<String>>.
For instance, you have SimpleImmutableEntry objects of the form [A,[1]], [B,[2]], [A,[3]] and you want your map to look like this: A -> [1,3], B -> [2].
This can be done with Collectors.toMap, but Collectors.toMap works only with unique keys unless you provide a merge function to resolve the collisions, as described in the Javadoc:
https://docs.oracle.com/javase/8/docs/api/java/util/stream/Collectors.html#toMap-java.util.function.Function-java.util.function.Function-java.util.function.BinaryOperator-
So the example code looks like this:
.map(returnSimpleImmutableEntries)
.collect(Collectors.toMap(SimpleImmutableEntry::getKey,
SimpleImmutableEntry::getValue,
(oldList, newList) -> { oldList.addAll(newList); return oldList; } ));
The returnSimpleImmutableEntries method returns entries of the form [A,[1]], [B,[2]], [A,[3]], on which you can use your collectors.
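Here is a small self-contained sketch of that idea; the sample entries are made up for illustration, and the merge function copies rather than mutates so the sample lists can stay immutable:
// needs: import java.util.*; import java.util.stream.Collectors;
List<AbstractMap.SimpleImmutableEntry<String, List<String>>> entries = Arrays.asList(
new AbstractMap.SimpleImmutableEntry<String, List<String>>("A", Arrays.asList("1")),
new AbstractMap.SimpleImmutableEntry<String, List<String>>("B", Arrays.asList("2")),
new AbstractMap.SimpleImmutableEntry<String, List<String>>("A", Arrays.asList("3")));
Map<String, List<String>> merged = entries.stream()
.collect(Collectors.toMap(
AbstractMap.SimpleImmutableEntry::getKey,
AbstractMap.SimpleImmutableEntry::getValue,
(oldList, newList) -> {
// concatenate into a fresh list instead of mutating oldList
List<String> combined = new ArrayList<>(oldList);
combined.addAll(newList);
return combined;
}));
System.out.println(merged); // e.g. {A=[1, 3], B=[2]}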
With Collectors.toMap:
Map<Long, List<Integer>> abcIdToPmtId = paymentController.findPaymentsByIds(pmtIds)
.stream()
.collect(Collectors.toMap(
Payment::getAbcId,
p -> new ArrayList<>(Arrays.asList(p.getPaymentId())),
(o, n) -> { o.addAll(n); return o; }));
Though it's clearer and more readable to use Collectors.groupingBy along with Collectors.mapping.
You don't need streams to do it though:
Map<Long, List<Integer>> abcIdToPmtId = new HashMap<>();
paymentController.findPaymentsByIds(pmtIds).forEach(p ->
abcIdToPmtId.computeIfAbsent(
p.getAbcId(),
k -> new ArrayList<>())
.add(p.getPaymentId()));

How to merge a collection of Maps using streams

I have a collection of maps:
Collection<Map<String,Double>> myCol = table.values();
I would like to transform this into a Map
Map<String, Double>
such that, for a matching key, values are summed up. Using a for loop, it is rather simple:
Map<String, Double> outMap = new HashMap<>();
for (Map<String, Double> map : myCol) {
outMap = mergeMaps(outMap, map);
}
and mergeMaps() is defined as
Map<String, Double> mergeMaps(Map<String, Double> m1, Map<String, Double> m2) {
Map<String, Double> outMap = new TreeMap<>(m1);
m2.forEach((k,v) -> outMap.merge(k,v,Double::sum)); /*sum values if key exists*/
return outMap;
}
However, I would like to use streams to get a map from the collection. I have tried the following:
Map<String, Double> outMap = new HashMap<>();
myCol.stream().forEach(e-> outMap.putAll(mergeMaps(outMap,e)));
return outMap;
This works without a problem. However, can I still improve it? I mean, how can I use collectors in it?
From your input, you can fetch the stream of maps and then flatmap it to have a Stream<Map.Entry<String, Double>>. From there, you collect them into a new map, specifying that you want to sum the values mapped to the same key.
import static java.util.stream.Collectors.groupingBy;
import static java.util.stream.Collectors.summingDouble;
import static java.util.stream.Collectors.toMap;
....
Map<String, Double> outMap =
myCol.stream()
.flatMap(m -> m.entrySet().stream())
.collect(toMap(Map.Entry::getKey, Map.Entry::getValue, Double::sum));
Alternatively, you can use groupingBy instead of toMap:
.collect(groupingBy(Map.Entry::getKey, summingDouble(Map.Entry::getValue)));
Map<String, Double> outMap = myCol.stream()
.flatMap(x -> x.entrySet().stream())
.collect(Collectors.groupingBy(
Entry::getKey,
TreeMap::new,
Collectors.summingDouble(Entry::getValue)));
Well, the other proposed solutions show that a pure stream solution is short, but if you want to reuse your existing mergeMaps function (because in other cases it might be more complex, for example), you can just hand it over to Stream.reduce:
Optional<Map<String, Double>> outMap = myCol.stream().reduce((m1, m2) -> mergeMaps(m1, m2));
Your initial approach with the forEach is pretty much a stream-ified for loop and violates the principle that functions should have no side effects. The reduce (or the collect variants above) handles all the data merging internally, without mutating state outside the pipeline.
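If you would rather avoid the Optional, you can seed the reduction with an empty map; a sketch reusing the mergeMaps from the question (safe here because mergeMaps never mutates its arguments):
Map<String, Double> outMap = myCol.stream()
.reduce(new TreeMap<>(), (m1, m2) -> mergeMaps(m1, m2));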
With streams:
Map<String, Double> outMap = myCol.stream()
.flatMap(map -> map.entrySet().stream())
.collect(Collectors.toMap(
Map.Entry::getKey, // key of the result map
Map.Entry::getValue, // value of the result map
Double::sum, // how to merge values for equal keys
TreeMap::new)); // the type of map to be created
This uses Collectors.toMap to create the result TreeMap.
You can do it without streams, though. I think your version is a little complicated; you could refactor it as follows:
Map<String, Double> outMap = new TreeMap<>();
myCol.forEach(map -> map.forEach((k, v) -> outMap.merge(k, v, Double::sum)));
This is shorter, easier, and arguably the most readable.

Flatten a Map<Integer, List<String>> to Map<String, Integer> with stream and lambda

I would like to flatten a Map which associates an Integer key with a list of Strings, without losing the key mapping.
I am curious as to whether it is possible and useful to do so with streams and lambdas.
We start with something like this:
Map<Integer, List<String>> mapFrom = new HashMap<>();
Let's assume that mapFrom is populated somewhere, and looks like:
1: a,b,c
2: d,e,f
etc.
Let's also assume that the values in the lists are unique.
Now, I want to "unfold" it to get a second map like:
a: 1
b: 1
c: 1
d: 2
e: 2
f: 2
etc.
I could do it like this (or very similarly, using foreach):
Map<String, Integer> mapTo = new HashMap<>();
for (Map.Entry<Integer, List<String>> entry: mapFrom.entrySet()) {
for (String s: entry.getValue()) {
mapTo.put(s, entry.getKey());
}
}
Now let's assume that I want to use lambda instead of nested for loops. I would probably do something like this:
Map<String, Integer> mapTo = mapFrom.entrySet().stream().map(e -> {
e.getValue().stream().?
// Here I can iterate on each List,
// but my best try would only give me a flat map for each key,
// that I wouldn't know how to flatten.
}).collect(Collectors.toMap(/*A String value*/,/*An Integer key*/))
I also gave a try to flatMap, but I don't think that it is the right way to go, because although it helps me get rid of the dimensionality issue, I lose the key in the process.
In a nutshell, my two questions are:
Is it possible to use streams and lambdas to achieve this?
Is it useful (performance, readability) to do so?
You need to use flatMap to flatten the values into a new stream, but since you still need the original keys for collecting into a Map, you have to map to a temporary object holding key and value, e.g.
Map<String, Integer> mapTo = mapFrom.entrySet().stream()
.flatMap(e->e.getValue().stream()
.map(v->new AbstractMap.SimpleImmutableEntry<>(e.getKey(), v)))
.collect(Collectors.toMap(Map.Entry::getValue, Map.Entry::getKey));
The Map.Entry is a stand-in for the nonexistent tuple type, any other type capable of holding two objects of different type is sufficient.
An alternative not requiring these temporary objects, is a custom collector:
Map<String, Integer> mapTo = mapFrom.entrySet().stream().collect(
HashMap::new, (m,e)->e.getValue().forEach(v->m.put(v, e.getKey())), Map::putAll);
This differs from toMap in that it silently overwrites duplicate keys, whereas toMap without a merge function will throw an exception if there is a duplicate key. Basically, this custom collector is a parallel-capable variant of
Map<String, Integer> mapTo = new HashMap<>();
mapFrom.forEach((k, l) -> l.forEach(v -> mapTo.put(v, k)));
But note that this task wouldn't benefit from parallel processing, even with a very large input map. Only if there were an additional computationally intensive task within the stream pipeline that could benefit from SMP would there be a chance of getting a benefit from parallel streams. So perhaps the concise, sequential Collection API solution is preferable.
You should use flatMap as follows:
mapFrom.entrySet().stream()
.flatMap(e -> e.getValue().stream()
.map(s -> new SimpleImmutableEntry<>(e.getKey(), s)));
SimpleImmutableEntry is a nested class in AbstractMap.
Hope this does it in the simplest way. :))
mapFrom.forEach((key, values) -> values.forEach(value -> mapTo.put(value, key)));
This should work. Note that you may lose some keys from the original map in the process (for example, keys whose lists are empty won't appear in the result).
Map<Integer, List<String>> mapFrom = new HashMap<>();
Map<String, Integer> mapTo = mapFrom.entrySet().stream()
.flatMap(integerListEntry -> integerListEntry.getValue()
.stream()
.map(listItem -> new AbstractMap.SimpleEntry<>(listItem, integerListEntry.getKey())))
.collect(Collectors.toMap(AbstractMap.SimpleEntry::getKey, AbstractMap.SimpleEntry::getValue));
Same as the previous answers with Java 9:
Map<String, Integer> mapTo = mapFrom.entrySet()
.stream()
.flatMap(entry -> entry.getValue()
.stream()
.map(s -> Map.entry(s, entry.getKey())))
.collect(toMap(Entry::getKey, Entry::getValue));

How to sort Map<String, Set<String>> by another Set size

I have a map
Map<String, Set<String>> map
I'd like to write a function that returns a List<String> containing the map's keys, sorted by the size of the set each key maps to. What's the best way to do this?
I know I can use set.size(), but how do I keep the relation between the size of a set and its corresponding key string?
You can achieve it this way using Stream API:
List<String> sortedKeys = map.entrySet().stream()
.sorted((a, b) -> Integer.compare(a.getValue().size(), b.getValue().size()))
.map(Map.Entry::getKey)
.collect(Collectors.toList());
Frostbit's answer is correct, but you could use comparingByValue() instead:
List<String> sortedKeys = map.entrySet().stream()
.sorted(Map.Entry.comparingByValue(Comparator.comparingInt(Set::size)))
.map(Map.Entry::getKey)
.collect(Collectors.toList());

Merge two List-valued maps

Does anybody know how to merge two maps of this type with Java 8?
Map<String, List<String>> map1--->["a",{1,2,3}]
Map<String, List<String>> map2--->["a",{4,5,6}]
And obtain as result of the merge
Map<String, List<String>> map3--->["a",{1,2,3,4,5,6}]
I'm looking for a non-verbose way, if one exists. I know how to do it the old-fashioned way.
Regards.
The general idea is the same as in this post. You create a new map from the first map, iterate over the second map and merge each key with the first map thanks to merge(key, value, remappingFunction). In case of conflict, the remapping function is applied: in this case, it takes the two lists and merges them; if there is no conflict, the entry with the given key and value is put.
Map<String, List<String>> mx = new HashMap<>(map1);
map2.forEach((k, v) -> mx.merge(k, v, (l1, l2) -> {
List<String> l = new ArrayList<>(l1);
l.addAll(l2);
return l;
}));
You could try this, which gradually flattens the structure until you have a stream of tuples of the maps' keys versus the lists' values:
Map<K,List<V>> result = Stream.of(map1,map2) // Stream<Map<K,List<V>>>
.flatMap(m -> m.entrySet().stream()) // Stream<Map.Entry<K,List<V>>>
.flatMap(e -> e.getValue().stream() // Inner Stream<V>...
.map(v -> new AbstractMap.SimpleImmutableEntry<>(e.getKey(), v)))
// ...flatmapped into an outer Stream<Map.Entry<K,V>>>
.collect(Collectors.groupingBy(e -> e.getKey(), Collectors.mapping(e -> e.getValue(), Collectors.toList())));
Another option would be to avoid streaming the inner lists by using Collectors.reducing() as the second argument of groupingBy. However, I would consider the accepted answer first.
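For completeness, here is a sketch of that reducing-based variant (assuming both maps are Map<String, List<String>> as in the question):
Map<String, List<String>> result = Stream.of(map1, map2)
.flatMap(m -> m.entrySet().stream())
.collect(Collectors.groupingBy(
Map.Entry::getKey,
Collectors.reducing(
Collections.<String>emptyList(), // identity: the empty list
Map.Entry::getValue, // map each entry to its list
(l1, l2) -> { // concatenate without mutating the inputs
List<String> merged = new ArrayList<>(l1);
merged.addAll(l2);
return merged;
})));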
Alternatively, using Set instead of List, you can do it like this:
Map<String, Set<String>> map1--->["a",{1,2,3}]
Map<String, Set<String>> map2--->["a",{4,5,6}]
map1.forEach((k, v) -> v.addAll(map2.get(k) == null ? new HashSet<>() : map2.get(k)));
