Java 8 streams: Transpose a map with list values

I have a map with String keys and List values; each list can contain 10 unique values. I need to convert this into a map with Integer keys and List values. Example below:
Input :
"Key-1" : 1,2,3,4
"Key-2" : 2,3,4,5
"Key-3" : 3,4,5,1
Expected output :
1 : "Key-1","Key-3"
2 : "Key-1","Key-2"
3 : "Key-1", "Key-2", "Key-3"
4 : "Key-1", "Key-2", "Key-3"
5 : "Key-2", "Key-3"
I am aware that I can achieve this using for loops, but I wanted to know whether it can be done via streams/lambdas in Java 8.
Thanks.

An idea could be to generate all value-key pairs from the original map and then group the keys by these values:
import java.util.AbstractMap.SimpleEntry;
import static java.util.stream.Collectors.groupingBy;
import static java.util.stream.Collectors.mapping;
import static java.util.stream.Collectors.toList;
...
Map<Integer, List<String>> transposeMap =
    map.entrySet()
       .stream()
       .flatMap(e -> e.getValue().stream().map(i -> new SimpleEntry<>(i, e.getKey())))
       .collect(groupingBy(Map.Entry::getKey, mapping(Map.Entry::getValue, toList())));
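A quick way to try the snippet, assuming the sample input from the question (the map setup below is illustrative):

Map<String, List<Integer>> map = new HashMap<>();
map.put("Key-1", Arrays.asList(1, 2, 3, 4));
map.put("Key-2", Arrays.asList(2, 3, 4, 5));
map.put("Key-3", Arrays.asList(3, 4, 5, 1));
// after collecting as above, transposeMap contains e.g.
// {1=[Key-1, Key-3], 2=[Key-1, Key-2], 3=[Key-1, Key-2, Key-3], 4=[Key-1, Key-2, Key-3], 5=[Key-2, Key-3]}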

Alexis’ answer contains the general solution for this kind of task, using flatMap and a temporary holder for the combination of key and flattened value. The only alternative that avoids creating the temporary holder objects is to re-implement the logic of the groupingBy collector, moving the loop over the value list into the accumulator function:
Map<Integer, List<String>> mapT = map.entrySet().stream().collect(
    HashMap::new,
    (m, e) -> e.getValue().forEach(
        i -> m.computeIfAbsent(i, x -> new ArrayList<>()).add(e.getKey())),
    (m1, m2) -> m2.forEach((k, v) -> m1.merge(k, v, (l1, l2) -> { l1.addAll(l2); return l1; })));

It's a bit scary (I generally try to break it down to make it more readable) but you could do it this way:
Map<Integer, List<String>> transposeMap = new HashMap<>();
map.forEach((key, list) -> list.stream().forEach(
    elm -> transposeMap.put(elm,
        transposeMap.get(elm) == null
            ? Arrays.asList(key)
            : Stream.concat(transposeMap.get(elm).stream(), Arrays.asList(key).stream())
                    .collect(Collectors.toList()))));
This assumes Map<String, List<Integer>> map is the original map you want to transpose; transposeMap will hold the transposed map you need.

You can achieve it this way.
Suppose I have a Person class with gender and age, and I want to get the result in this form:
Map<SEX, List<Person>>
I would simply write
Map<SEX, List<Person>> map = personList.stream()
        .collect(Collectors.groupingBy(Person::getGender));
It will get me something like the output below (one key against multiple values):
key:MALE
age31sexMALE
age28sexMALE
key:FEMALE
age40sexFEMALE
age44sexFEMALE
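For reference, a minimal sketch of a Person class that would produce output like the above (this class is assumed for illustration, not taken from the original post):

enum SEX { MALE, FEMALE }

class Person {
    private final int age;
    private final SEX sex;

    Person(int age, SEX sex) {
        this.age = age;
        this.sex = sex;
    }

    public SEX getGender() { return sex; }

    public int getAge() { return age; }

    @Override
    public String toString() { return "age" + age + "sex" + sex; }
}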

With teeing (available since Java 12) you can work on keys and values in two separate streams:
Map<Integer, List<String>> to = from.entrySet().stream()
    .collect(teeing(
        flatMapping(e -> e.getValue().stream(), toList()),
        flatMapping(e -> (Stream<String>) e.getValue().stream().map(i -> e.getKey()), toList()),
        (k, v) -> IntStream.range(0, k.size()).boxed().collect(
            groupingBy(i -> k.get(i), mapping(i -> v.get(i), toList())))));

Related

Using Collectors.toMap how to convert the map values

I have a Map<String, List<StartingMaterial>> and I want to convert the objects in the lists to another type, i.e. get a Map<String, List<StartingMaterialResponse>>.
Can I do this using the Java stream Collectors.toMap()?
I tried something like the code below.
Map<String, List<StartingMaterial>> startingMaterialMap = xxxx;
startingMaterialMap.entrySet().stream().collect(Collectors.toMap( Map.Entry::getKey, Function.identity(), (k, v) -> convertStartingMaterialToDto(v.getValue())));
And my conversion code to change the object is as below:
private StartingMaterialResponse convertStartingMaterialToDto(StartingMaterial sm) {
    final StartingMaterialMatrix smm = sm.getStartingMaterialMatrix();
    final StartingMaterial blending1Matrix = smm.getBlending1Matrix();
    final StartingMaterial blending2Matrix = smm.getBlending2Matrix();
    return new StartingMaterialResponse(
            sm.getId(),
            sm.getComponent().getCasNumber(),
            sm.getDescription(),
            sm.getPriority(),
            String.join(" : ",
                    Arrays.asList(
                            smm.getCarryInMatrix().getComponent().getMolecularFormula(),
                            blending1Matrix != null ? blending1Matrix.getComponent().getMolecularFormula() : "",
                            blending2Matrix != null ? blending2Matrix.getComponent().getMolecularFormula() : "")
                        .stream().distinct().filter(m -> !m.equals("")).collect(Collectors.toList())),
            smm.getFamily(),
            smm.getSplitGroup());
}
You can use the toMap collector since your source is a map. However, you have to stream over the values and convert each of them into the DTO format inside the value mapper.
Map<String, List<StartingMaterialResponse>> result = startingMaterialMap.entrySet().stream()
    .collect(Collectors.toMap(Map.Entry::getKey,
        e -> e.getValue().stream()
            .map(s -> convertStartingMaterialToDto(s))
            .collect(Collectors.toList())));
I think you mean to do:
startingMaterialMap.entrySet().stream()
    .collect(Collectors.toMap(Map.Entry::getKey,
        e -> e.getValue().stream()
            .map(this::convertStartingMaterialToDto)
            .collect(Collectors.toList()))
    );
Here is my approach to this problem:
Map<String, List<Integer>> deposits = new HashMap<>();
deposits.put("first", Arrays.asList(1, 2, 3));
deposits.forEach((depositName, products) -> {
    products.stream()
            .map(myIntegerProduct -> myIntegerProduct.toString())
            .collect(Collectors.toList());
});
The above example converts each List<Integer> to a list of Strings; in your case, the convertStartingMaterialToDto method takes the place of myIntegerProduct.toString().
The forEach method iterates through every key-value pair in the map, and you can name the key and value parameters to keep the code understandable for everyone who reads it. In my example, forEach((depositName, products) -> ...), depositName is the key (in my case a String) and products is the value of that key (in my case a List of integers).
Finally you stream over the list as well and map every item to the new type:
products.stream()
.map(myIntegerProduct -> myIntegerProduct.toString())
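If you also want to keep the converted lists, a small sketch (still using the illustrative deposits map from above) that collects them into a new result map:

Map<String, List<String>> converted = new HashMap<>();
deposits.forEach((depositName, products) ->
        converted.put(depositName, products.stream()
                .map(myIntegerProduct -> myIntegerProduct.toString())
                .collect(Collectors.toList())));
// converted now maps each deposit name to its stringified products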

Grouping By without using a POJO in java 8

I have a use case where I need to read a file and get the grouping of a sequence and a list of values associated with that sequence. The format of these records in the file is sequence-val, for example:
10-A
10-B
11-C
11-A
I want the output to be a map (Map<String,List<String>>) with the sequence as the key and list of values associated with it as value, like below
10,[A,B]
11,[C,A]
Is there a way I can do this without creating a POJO for these records? I have been trying to explore the usage of Collectors.groupingBy and most of the examples I see are based on creating a POJO.
I have been trying to write something like this
Map<String, List<String>> seqCpcGroupMap = pendingCpcList.stream().map(rec ->{
String[] cpcRec = rec.split("-");
return new Tuple2<>(cpcRec[0],cpcRec[1])
}).collect(Collectors.groupingBy(x->x.))
or
Map<String, List<String>> seqCpcGroupMap = pendingCpcList.stream().map(rec ->{
String[] cpcRec = rec.split("-");
return Arrays.asList(cpcRec[0],cpcRec[1]);
}).collect(Collectors.groupingBy(x->(ArrayList<String>)x[0]));
I am unable to provide a key for the groupingBy function to group on. Is there a way to do this, or do I have to create a POJO to use groupingBy?
You may do it like so,
Map<String, List<String>> result = source.stream()
        .map(s -> s.split("-"))
        .collect(Collectors.groupingBy(a -> a[0],
                Collectors.mapping(a -> a[1], Collectors.toList())));
Alternatively, you can use Map.computeIfAbsent directly:
List<String> pendingCpcList = List.of("10-A", "10-B", "11-C", "11-A");
Map<String, List<String>> seqCpcGroupMap = new HashMap<>();
pendingCpcList.stream().map(rec -> rec.split("-"))
        .forEach(a -> seqCpcGroupMap.computeIfAbsent(a[0], k -> new ArrayList<>()).add(a[1]));
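For a quick sanity check with the sample list above (HashMap does not guarantee key order):

System.out.println(seqCpcGroupMap); // e.g. {10=[A, B], 11=[C, A]}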

Java-Stream, toMap with duplicate keys

So there might be one abc for several payments. Right now I have:
//find abc id for each payment id
Map<Long, Integer> abcIdToPmtId = paymentController.findPaymentsByIds(pmtIds)
        .stream()
        .collect(Collectors.toMap(Payment::getAbcId, Payment::getPaymentId));
But then I realized this could have duplicate keys, so I want it to return a
Map<Long, List<Integer>> abcIdToPmtIds
in which an entry will contain one abc and its several payments.
I know I could use groupingBy, but then I think I can only get Map<Long, List<Payment>>.
Use the other groupingBy overload.
paymentController.findPaymentsByIds(pmtIds)
        .stream()
        .collect(
            groupingBy(Payment::getAbcId, mapping(Payment::getPaymentId, toList())));
Problem statement: converting SimpleImmutableEntry<String, List<String>> -> Map<String, List<String>>.
For instance, you have SimpleImmutableEntry objects of the form [A,[1]], [B,[2]], [A,[3]] and you want your map to look like this: A -> [1,3], B -> [2].
This can be done with Collectors.toMap, but Collectors.toMap works only with unique keys unless you provide a merge function to resolve the collision, as described in the Javadoc:
https://docs.oracle.com/javase/8/docs/api/java/util/stream/Collectors.html#toMap-java.util.function.Function-java.util.function.Function-java.util.function.BinaryOperator-
So the example code looks like this:
.map(returnSimpleImmutableEntries)
.collect(Collectors.toMap(SimpleImmutableEntry::getKey,
        SimpleImmutableEntry::getValue,
        (oldList, newList) -> { oldList.addAll(newList); return oldList; }));
The returnSimpleImmutableEntries method returns entries of the form [A,[1]], [B,[2]], [A,[3]] on which you can use your collectors.
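A self-contained sketch of that idea, with a hard-coded entry list standing in for the output of returnSimpleImmutableEntries (names and data are illustrative):

List<SimpleImmutableEntry<String, List<String>>> entries = Arrays.asList(
        new SimpleImmutableEntry<>("A", new ArrayList<>(Arrays.asList("1"))),
        new SimpleImmutableEntry<>("B", new ArrayList<>(Arrays.asList("2"))),
        new SimpleImmutableEntry<>("A", new ArrayList<>(Arrays.asList("3"))));

Map<String, List<String>> merged = entries.stream()
        .collect(Collectors.toMap(SimpleImmutableEntry::getKey,
                SimpleImmutableEntry::getValue,
                (oldList, newList) -> { oldList.addAll(newList); return oldList; }));
// merged -> {A=[1, 3], B=[2]}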
With Collectors.toMap:
Map<Long, List<Integer>> abcIdToPmtIds = paymentController.findPaymentsByIds(pmtIds)
        .stream()
        .collect(Collectors.toMap(
                Payment::getAbcId,
                p -> new ArrayList<>(Arrays.asList(p.getPaymentId())),
                (o, n) -> { o.addAll(n); return o; }));
Though it's clearer and more readable to use Collectors.groupingBy along with Collectors.mapping.
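For comparison, a sketch of that groupingBy/mapping version (assuming the same Payment accessors as above):

Map<Long, List<Integer>> abcIdToPmtIds = paymentController.findPaymentsByIds(pmtIds)
        .stream()
        .collect(Collectors.groupingBy(Payment::getAbcId,
                Collectors.mapping(Payment::getPaymentId, Collectors.toList())));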
You don't need streams to do it though:
Map<Long, List<Integer>> abcIdToPmtIds = new HashMap<>();
paymentController.findPaymentsByIds(pmtIds).forEach(p ->
        abcIdToPmtIds.computeIfAbsent(
                p.getAbcId(),
                k -> new ArrayList<>())
        .add(p.getPaymentId()));

Flatten a Map<Integer, List<String>> to Map<String, Integer> with stream and lambda

I would like to flatten a Map which associates an Integer key with a list of Strings, without losing the key mapping.
I am curious whether it is possible and useful to do so with streams and lambdas.
We start with something like this:
Map<Integer, List<String>> mapFrom = new HashMap<>();
Let's assume that mapFrom is populated somewhere, and looks like:
1: a,b,c
2: d,e,f
etc.
Let's also assume that the values in the lists are unique.
Now, I want to "unfold" it to get a second map like:
a: 1
b: 1
c: 1
d: 2
e: 2
f: 2
etc.
I could do it like this (or very similarly, using foreach):
Map<String, Integer> mapTo = new HashMap<>();
for (Map.Entry<Integer, List<String>> entry : mapFrom.entrySet()) {
    for (String s : entry.getValue()) {
        mapTo.put(s, entry.getKey());
    }
}
Now let's assume that I want to use lambda instead of nested for loops. I would probably do something like this:
Map<String, Integer> mapTo = mapFrom.entrySet().stream().map(e -> {
e.getValue().stream().?
// Here I can iterate on each List,
// but my best try would only give me a flat map for each key,
// that I wouldn't know how to flatten.
}).collect(Collectors.toMap(/*A String value*/,/*An Integer key*/))
I also gave a try to flatMap, but I don't think that it is the right way to go, because although it helps me get rid of the dimensionality issue, I lose the key in the process.
In a nutshell, my two questions are:
Is it possible to use streams and lambdas to achieve this?
Is it useful (performance, readability) to do so?
You need to use flatMap to flatten the values into a new stream, but since you still need the original keys for collecting into a Map, you have to map to a temporary object holding key and value, e.g.
Map<String, Integer> mapTo = mapFrom.entrySet().stream()
        .flatMap(e -> e.getValue().stream()
                .map(v -> new AbstractMap.SimpleImmutableEntry<>(e.getKey(), v)))
        .collect(Collectors.toMap(Map.Entry::getValue, Map.Entry::getKey));
The Map.Entry is a stand-in for the nonexistent tuple type; any other type capable of holding two objects of different types is sufficient.
An alternative not requiring these temporary objects is a custom collector:
Map<String, Integer> mapTo = mapFrom.entrySet().stream().collect(
HashMap::new, (m,e)->e.getValue().forEach(v->m.put(v, e.getKey())), Map::putAll);
This differs from toMap in overwriting duplicate keys silently, whereas toMap without a merge function will throw an exception if there is a duplicate key. Basically, this custom collector is a parallel-capable variant of
Map<String, Integer> mapTo = new HashMap<>();
mapFrom.forEach((k, l) -> l.forEach(v -> mapTo.put(v, k)));
But note that this task wouldn’t benefit from parallel processing, even with a very large input map. Only if there were additional computationally intensive tasks within the stream pipeline that could benefit from SMP would there be a chance of profiting from parallel streams. So perhaps the concise, sequential Collection API solution is preferable.
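If duplicate values across the lists were possible and should not fail the collection, a merge function could be passed to toMap; a sketch that keeps the first key encountered (an assumption, since the question states the values are unique):

Map<String, Integer> mapTo = mapFrom.entrySet().stream()
        .flatMap(e -> e.getValue().stream()
                .map(v -> new AbstractMap.SimpleImmutableEntry<>(e.getKey(), v)))
        .collect(Collectors.toMap(Map.Entry::getValue, Map.Entry::getKey,
                (first, second) -> first));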
You should use flatMap as follows:
entrySet.stream()
        .flatMap(e -> e.getValue().stream()
                .map(s -> new SimpleImmutableEntry<>(e.getKey(), s)));
SimpleImmutableEntry is a nested class in AbstractMap.
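To actually produce the Map<String, Integer>, the fragment above still needs a terminal collect; a minimal sketch, assuming mapFrom as defined in the question:

Map<String, Integer> mapTo = mapFrom.entrySet().stream()
        .flatMap(e -> e.getValue().stream()
                .map(s -> new AbstractMap.SimpleImmutableEntry<>(e.getKey(), s)))
        .collect(Collectors.toMap(Map.Entry::getValue, Map.Entry::getKey));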
Hope this does it in the simplest way. :))
mapFrom.forEach((key, values) -> values.forEach(value -> mapTo.put(value, key)));
This should work. Note that some of the original keys can be lost in the process (for example, a key with an empty list will not appear in the result).
Map<Integer, List<String>> mapFrom = new HashMap<>();
Map<String, Integer> mapTo = mapFrom.entrySet().stream()
        .flatMap(integerListEntry -> integerListEntry.getValue()
                .stream()
                .map(listItem -> new AbstractMap.SimpleEntry<>(listItem, integerListEntry.getKey())))
        .collect(Collectors.toMap(AbstractMap.SimpleEntry::getKey, AbstractMap.SimpleEntry::getValue));
Same as the previous answers with Java 9:
Map<String, Integer> mapTo = mapFrom.entrySet()
        .stream()
        .flatMap(entry -> entry.getValue()
                .stream()
                .map(s -> Map.entry(s, entry.getKey())))
        .collect(toMap(Entry::getKey, Entry::getValue));

Merge two List value maps

Does anybody know how to merge two maps of this type with Java 8?
Map<String, List<String>> map1--->["a",{1,2,3}]
Map<String, List<String>> map2--->["a",{4,5,6}]
And obtain as the result of the merge:
Map<String, List<String>> map3--->["a",{1,2,3,4,5,6}]
I'm looking for a non-verbose way, if one exists. I know how to do it the old-fashioned way.
Regards.
The general idea is the same as in this post. You create a new map from the first map, iterate over the second map and merge each key with the first map thanks to merge(key, value, remappingFunction). In case of conflict, the remapping function is applied: in this case, it takes the two lists and merges them; if there is no conflict, the entry with the given key and value is put.
Map<String, List<String>> mx = new HashMap<>(map1);
map2.forEach((k, v) -> mx.merge(k, v, (l1, l2) -> {
    List<String> l = new ArrayList<>(l1);
    l.addAll(l2);
    return l;
}));
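A quick check under the sample data from the question (Java 9 collection factories are used here for brevity; the merge above copies into a new ArrayList, so the immutable source lists are fine):

Map<String, List<String>> map1 = Map.of("a", List.of("1", "2", "3"));
Map<String, List<String>> map2 = Map.of("a", List.of("4", "5", "6"));
// after the merge above: mx = {a=[1, 2, 3, 4, 5, 6]}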
You could try this, which gradually flattens the structure until you have a stream of tuples of the maps' keys versus the lists' values:
Map<K, List<V>> result = Stream.of(map1, map2)          // Stream<Map<K, List<V>>>
        .flatMap(m -> m.entrySet().stream())            // Stream<Map.Entry<K, List<V>>>
        .flatMap(e -> e.getValue().stream()             // inner Stream<V>...
                .map(v -> new AbstractMap.SimpleImmutableEntry<>(e.getKey(), v)))
                                                         // ...flat-mapped into an outer Stream<Map.Entry<K, V>>
        .collect(Collectors.groupingBy(e -> e.getKey(),
                Collectors.mapping(e -> e.getValue(), Collectors.toList())));
Another option would avoid the internal streaming of the lists by using Collectors.reducing() as the second parameter of groupingBy, I guess. However, I would consider the accepted answer first.
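A sketch of that reducing-based variant, under the assumption that String keys and values are used as in the question:

Map<String, List<String>> result = Stream.of(map1, map2)
        .flatMap(m -> m.entrySet().stream())
        .collect(Collectors.groupingBy(Map.Entry::getKey,
                Collectors.reducing(Collections.<String>emptyList(),
                        Map.Entry::getValue,
                        (l1, l2) -> {
                            List<String> merged = new ArrayList<>(l1);
                            merged.addAll(l2);
                            return merged;
                        })));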
You have to use Set instead of List and can do it like this:
Map<String, Set<String>> map1--->["a",{1,2,3}]
Map<String, Set<String>> map2--->["a",{4,5,6}]
map1.forEach((k, v) -> v.addAll(map2.get(k) == null ? new HashSet<>() : map2.get(k)));
