Input: List<Foo> rawData
Desired Output: Map<Foo, Bar>
Method implementation: Java 8
public Map<Foo, Bar> getMap(List<Foo> rawData) {
    rawData.stream().flatMap(d -> {
        Foo key = method1(d);
        Bar value = method2(d);
    })
    // Now I want to create a map from the key/value
    // pair obtained by calling method1 and method2.
}
I did try this:
public Map<Foo, Bar> getMap(List<Foo> rawData) {
    return rawData.stream()
            .flatMap(d -> {
                Foo key = method1(d);
                Bar value = method2(d);
                return Stream.of(new Object[][]{{key, value}});
            })
            .collect(Collectors.toMap(
                    arr -> (Foo) arr[0],
                    arr -> (Bar) arr[1]));
}
However, I want to know if there is a less expensive or otherwise better way of doing this.
There's no point in creating an intermediate stream and then using flatMap().
You can just collect() directly into a map:
public Map<Foo, Bar> getMap(List<Foo> rawData) {
return rawData.stream()
.collect(Collectors.toMap(foo -> method1(foo), foo -> method2(foo)));
// OR: Collectors.toMap(this::method1, this::method2);
}
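For illustration, here is a minimal, self-contained sketch of that approach. Foo, Bar, method1 and method2 below are filled-in stand-ins (the originals are not shown), so treat their bodies as assumptions:

import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class ToMapExample {
    static class Foo { final int id; Foo(int id) { this.id = id; } }
    static class Bar { final String label; Bar(String label) { this.label = label; } }

    // hypothetical key and value mappers standing in for the original method1/method2
    static Foo method1(Foo foo) { return foo; }
    static Bar method2(Foo foo) { return new Bar("bar-" + foo.id); }

    static Map<Foo, Bar> getMap(List<Foo> rawData) {
        return rawData.stream()
                .collect(Collectors.toMap(ToMapExample::method1, ToMapExample::method2));
    }

    public static void main(String[] args) {
        Map<Foo, Bar> map = getMap(Arrays.asList(new Foo(1), new Foo(2)));
        map.forEach((k, v) -> System.out.println(k.id + " -> " + v.label));
    }
}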
You can use the three-parameter toMap(keyMapper, valueMapper, mergeFunction) method, which lets you merge duplicates into a list, for example:
List<String[]> list = List.of(
new String[]{"Foo1", "Bar1"},
new String[]{"Foo1", "Bar2"},
new String[]{"Foo2", "Baz"});
Map<String, List<String>> map1 = list.stream()
.collect(Collectors.toMap(
e -> e[0],
e -> new ArrayList<>(List.of(e[1])),
(list1, list2) -> {
list1.addAll(list2);
return list1;
}));
System.out.println(map1);
// {Foo1=[Bar1, Bar2], Foo2=[Baz]}
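If the explicit merge function feels noisy, a roughly equivalent sketch uses Collectors.groupingBy with Collectors.mapping (assuming the same List<String[]> list as above):

Map<String, List<String>> grouped = list.stream()
        .collect(Collectors.groupingBy(
                e -> e[0],
                Collectors.mapping(e -> e[1], Collectors.toList())));
System.out.println(grouped);
// e.g. {Foo1=[Bar1, Bar2], Foo2=[Baz]} (a HashMap, so key order is not guaranteed)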
If you are sure there are no duplicate keys, you can use the two-parameter toMap(keyMapper, valueMapper) method (it throws an IllegalStateException if a duplicate key is encountered):
List<String[]> list = List.of(
new String[]{"Foo1", "Bar1"},
new String[]{"Foo2", "Baz"});
Map<String, String> map2 = list.stream()
.collect(Collectors.toMap(e -> e[0], e -> e[1]));
System.out.println(map2);
// {Foo1=Bar1, Foo2=Baz}
Alternatively, you can collect a list of Map.Entry objects:
List<String[]> list = List.of(
new String[]{"Foo1", "Bar1"},
new String[]{"Foo1", "Bar2"},
new String[]{"Foo2", "Baz"});
List<Map.Entry<String, String>> list1 = list.stream()
.map(arr -> Map.entry(arr[0], arr[1]))
.collect(Collectors.toList());
System.out.println(list1);
// [Foo1=Bar1, Foo1=Bar2, Foo2=Baz]
See also: Ordering Map<String, Integer> by List<String> using streams
Consider the following data structures:
ArrayList<HashMap<String, String>> entries = new ArrayList<>();
ArrayList<String> keyNamesToInclude = new ArrayList<>();
This code creates a copy of entries, but with each HashMap including only the keys in keyNamesToInclude:
ArrayList<HashMap<String, String>> copies = new ArrayList<>();
for (HashMap<String, String> entry: entries) {
HashMap<String, String> copy = new HashMap<>();
for (String keyName: keyNamesToInclude) {
copy.put(keyName, entry.get(keyName));
}
copies.add(copy);
}
How would one create this with Streams in a functional way?
It is better to convert keyNamesToInclude into a Set to make key lookups efficient.
Then use List::stream to get a Stream<HashMap>, and for each map filter its entries and re-collect them into a new map; finally collect the new maps into a list.
Set<String> keys = new HashSet<>(keyNamesToInclude); // removes possible duplicates
List<Map<String, String>> copies = entries.stream() // Stream<HashMap>
.map(m -> m.entrySet()
.stream()
.filter(e -> keys.contains(e.getKey()))
.collect(Collectors.toMap(
Map.Entry::getKey, Map.Entry::getValue
))
)
.collect(Collectors.toList());
If it is very important for copies to hold the concrete ArrayList and HashMap implementations, casting or special forms of the collectors may be needed, even though Collectors.toList() currently returns an ArrayList and Collectors.toMap() a HashMap (neither is guaranteed by the specification):
// casting
ArrayList<HashMap<String, String>> copies2 = (ArrayList) entries.stream() // Stream<HashMap>
.map(m -> (HashMap<String, String>) m.entrySet()
.stream()
.filter(e -> keys.contains(e.getKey()))
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue))
)
.collect(Collectors.toList());
// special collectors
// toMap(keyMapper, valueMapper, mergeFunction, mapFactory)
// toList -> toCollection(ArrayList::new)
ArrayList<HashMap<String, String>> copies3 = entries.stream() // Stream<HashMap>
.map(m -> m.entrySet()
.stream()
.filter(e -> keys.contains(e.getKey()))
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue, (a, b) -> a, HashMap::new))
)
.collect(Collectors.toCollection(ArrayList::new));
You can do something like:
Set<String> setNamesToInclude = new HashSet<>(keyNamesToInclude);
List<Map<String, String>> copies = entries.stream()
.map(hm -> hm.keySet()
.stream()
.filter(setNamesToInclude::contains)
.collect(Collectors.toMap(Function.identity(), hm::get))
)
.collect(Collectors.toList());
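One caveat with both stream versions: Collectors.toMap throws a NullPointerException if the value mapper returns null, so if any of the maps may hold a null value for an included key, filter those keys out first (a sketch based on the snippet above):

List<Map<String, String>> copies = entries.stream()
        .map(hm -> hm.keySet()
                .stream()
                .filter(setNamesToInclude::contains)
                .filter(k -> hm.get(k) != null)
                .collect(Collectors.toMap(Function.identity(), hm::get)))
        .collect(Collectors.toList());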
A possibly more efficient solution that looks the wanted keys up in each HashMap instead of filtering every entry against the key collection. Note that it also skips keys that are absent or mapped to null, unlike the original loop, which puts null values for missing keys.
List<HashMap<String, String>> copies = entries.stream().map(e -> {
    HashMap<String, String> copy = new HashMap<>();
    keys.forEach(k -> {
        String value = e.get(k);
        if (value != null)
            copy.put(k, value);
    });
    return copy;
}).collect(Collectors.toList());
You can do something along these lines:
return entries.stream()
        .map(hashmap -> hashmap.entrySet().stream()
                .filter(e -> keyNamesToInclude.contains(e.getKey()))
                .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue)))
        .collect(Collectors.toList());
I recommend using a Set for keyNamesToInclude since lookups are a little more efficient. This solution also filters empty maps out of the final list.
List<Map<String, String>> entries = List.of(
Map.of("1", "one", "2", "two", "3", "three"),
Map.of("2", "two", "3", "three"), Map.of("1", "one"),
Map.of("6", "six", "7", "seven"));
entries.forEach(System.out::println);
Set<String> keyNamesToInclude = Set.of("2", "3", "6");
- stream the original list of maps
- for each map, filter the required keys
- build the new map
- filter out empty maps
- create the list
List<Map<String, String>> copies = entries.stream()
        .map(m -> m.entrySet().stream()
                .filter(e -> keyNamesToInclude.contains(e.getKey()))
                .collect(Collectors.toMap(Entry::getKey, Entry::getValue)))
        .filter(m -> !m.isEmpty())
        .toList(); // Java 16+; on earlier versions use .collect(Collectors.toList())
copies.forEach(System.out::println);
This prints the four source maps followed by the three result maps:
{3=three, 2=two, 1=one}
{3=three, 2=two}
{1=one}
{7=seven, 6=six}
{2=two, 3=three}
{2=two, 3=three}
{6=six}
Given a list of strings like this:
"Y:Yes",
"N:No",
"A:Apple"
I have something like
Map<String, String> updated = values.stream().map(v -> v.split(":")).collect(Collectors.toMap(v1 -> v1[0], v1 -> v1.length > 1 ? v1[1] : v1[0]));
But this gives me a map like this:
{
"Y":"Yes",
"N":"No",
"A":"Apple"
}
How can I get a list of maps as such:
[
{
name:"Y",
display:"Yes"
},
{
name:"N",
display:"No"
},
{
name:"A",
display:"Apple"
}
]
If you are using Java 9, you can use the new immutable map static factory methods, as follows:
List<Map<String, String>> updated = values.stream()
.map(v -> v.split(":"))
.map(a -> Map.of("name", a[0], "display", a[1]))
.collect(Collectors.toList());
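Note that a[1] assumes every string contains a ':'. If that is not guaranteed, a guard like the one in the question keeps the fall-back-to-key behavior (a sketch reusing the same values list):

List<Map<String, String>> updated = values.stream()
        .map(v -> v.split(":"))
        .map(a -> Map.of("name", a[0], "display", a.length > 1 ? a[1] : a[0]))
        .collect(Collectors.toList());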
Since you want a List, not a Map, your final collector cannot be Collectors.toMap; it needs to be Collectors.toList. Each invocation of the map method should then produce a new Map, so something like this would do:
List updated = values.stream()
.map(v -> {
String[] parts = v.split(":");
Map<String, String> map = new HashMap<>();
map.put("name", parts[0]);
map.put("display", parts[1]);
return map;
})
.collect(Collectors.toList());
Some people would prefer:
List updated = values.stream()
.map(v -> {
String[] parts = v.split(":");
return new HashMap<String, String>() {{
put("name", parts[0]);
put("display", parts[1]);
}};
})
.collect(Collectors.toList());
which creates an extra anonymous inner class. Or, if you can use Guava:
List updated = values.stream()
.map(v -> {
String[] parts = v.split(":");
return ImmutableMap.of("name", parts[0], "display", parts[1]);
})
.collect(Collectors.toList());
BTW: In the examples I used List, but the complete type of what you describe would be List<Map<String, String>>.
You can use the following if you're still on Java 8; if you happen to use Java 9, then have a look at Federico's answer:
final List<Map<String,String>> updated = values.stream()
.map(v -> v.split(":"))
.map(arr -> {
Map<String, String> map = new HashMap<>();
map.put("name", arr[0]);
map.put("display", arr[1]);
return map;
})
.collect(Collectors.toList());
Map<A, List<B>> xFunction() {
Map<A, List<B>> mapList = new HashMap<>();
List<A> aList = x.getAList();
for (A a : aList) {
List<B> bList = getBList(a.getId());
mapList.put(a, bList);
}
return mapList;
}
How can I convert this to Java 8 using collect with groupingBy or mapping?
I tried something like:
x.getAList()
        .stream()
        .map(a -> getBList(a.getId())) // returns a list of B
        .collect(Collectors.groupingBy(...))
Cheers
You need Collectors.toMap:
Map<A, List<B>> map =
x.getAList()
.stream()
.collect(Collectors.toMap(Function.identity(), a -> getBList(a.getId())));
@Eran was first, but to reproduce the behavior you should use the toMap collector with a merge function for duplicate elements in the list, because by default Java will throw an IllegalStateException for entries that map to the same key:
Map<A, List<B>> map = x.getAList()
        .stream()
        .collect(Collectors.toMap(Function.identity(), a -> getBList(a.getId()), (u, u2) -> u2));
I have a Map<T1, Set<T2>>, and I want to invert it: each T2 inside the sets should become a key of the new map, and the original T1 keys should become the new map's values (collected into sets).
For example, given a map containing two entries
{key1: [a, b, c], key2: [c, d]}
The resulting map would be
{a: [key1], b: [key1], c: [key1, key2], d: [key2]}
[ ] denotes Set in the above examples.
Java 8:
map.entrySet()
.stream()
.flatMap(e -> e.getValue()
.stream()
.map(v -> new SimpleEntry<>(v, e.getKey())))
.collect(Collectors.groupingBy(Entry::getKey,
Collectors.mapping(Entry::getValue, Collectors.toSet())))
Guava:
Multimaps.asMap(map.entrySet()
.stream()
.collect(ImmutableSetMultimap.flatteningToImmutableSetMultimap(
Entry::getKey, e -> e.getValue().stream()))
.inverse())
StreamEx:
EntryStream.of(map)
.flatMapValues(Set::stream)
.invert()
.grouping(Collectors.toSet())
One way could be:
private static <T1,T2> Map<T1, Set<T2>> invertMap(Map<T2, Set<T1>> data) {
Map<T1, Set<T2>> output = data.entrySet().stream().collect(() -> new HashMap<T1, Set<T2>>(),
(mapLeft, leftEntry) -> {
for (T1 i : leftEntry.getValue()) {
Set<T2> values = mapLeft.get(i);
if (values == null)
values = new HashSet<>();
values.add(leftEntry.getKey());
mapLeft.put(i, values);
}
}, (mapLeft, mapRight) -> mapLeft.putAll(mapRight));
return output;
}
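One caveat: the combiner (mapLeft, mapRight) -> mapLeft.putAll(mapRight) is only invoked for parallel streams, and putAll would overwrite a set rather than merge it when both partial maps contain the same key. If parallel execution matters, a merging combiner along these lines (a sketch) would be safer:

(mapLeft, mapRight) -> mapRight.forEach((key, values) ->
        mapLeft.merge(key, values, (a, b) -> { a.addAll(b); return a; }))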
One way to do it could be:
Map<V1,Set<V2>> inputHashMap = new HashMap<>(); // initialized with your input
Map<V2,Set<V1>> outputHashMap = new HashMap<>();
inputHashMap.forEach((val, keys) -> keys.forEach(key -> {
if (outputHashMap.containsKey(key)) {
outputHashMap.get(key).add(val);
} else {
outputHashMap.put(key, new HashSet<>() {{
add(val);
}});
}
}));
Using a stream (which might be useful for parallel processing via parallel()):
Map<String, Set<String>> inMap = new HashMap<>();
inMap.put("key1", new HashSet<>(Arrays.asList("a", "b", "c")));
inMap.put("key2", new HashSet<>(Arrays.asList("c", "d")));
Map<String, Set<String>> outMap = inMap.entrySet().stream().collect(
HashMap::new,
(m, e) -> e.getValue().forEach(v -> m.computeIfAbsent(v, ignore -> new HashSet<>())
.add(e.getKey())),
(m1, m2) -> m2.forEach((key, value) -> m1.merge(key, value,
(s1, s2) -> { s1.addAll(s2); return s1; })));
System.out.println(outMap);
// {a=[key1], b=[key1], c=[key1, key2], d=[key2]}
Of course, an old-school for loop is much cleaner:
Map<String, Set<String>> outMap = new HashMap<>();
for (Entry<String, Set<String>> e : inMap.entrySet())
for (String v : e.getValue())
outMap.computeIfAbsent(v, key -> new HashSet<>()).add(e.getKey());
Less LOC
Map<String, Set<String>> outMap = new HashMap<>();
inMap.forEach((k, v) -> v.forEach(e ->
outMap.computeIfAbsent(e, __ -> new HashSet<>()).add(k)));
I have List<Person> persons = new ArrayList<>(); and I want to list all unique names. I mean, if the names are "John", "Max", "John", "Greg", then I want to list only "Max" and "Greg". Is there some way to do it with Java streams?
We can use streams and Collectors.groupingBy in order to count how many occurrences we have of each name, then filter out any name that appears more than once:
List<String> res = persons.stream()
.collect(Collectors.groupingBy(Function.identity(), Collectors.counting()))
.entrySet()
.stream()
.filter(e -> e.getValue() == 1)
.map(e -> e.getKey())
.collect(Collectors.toList());
System.out.println(res); // [Max, Greg]
List<String> persons = new ArrayList<>();
persons.add("Max");
persons.add("John");
persons.add("John");
persons.add("Greg");

List<String> uniqueNames = persons.stream()
        .filter(person -> Collections.frequency(persons, person) == 1)
        .collect(Collectors.toList());
A first-guess solution:
persons.stream()
.collect(Collectors.groupingBy(Function.identity(), Collectors.counting()))
.entrySet()
.stream()
.filter(entry -> entry.getValue() == 1)
.map(Map.Entry::getKey)
.collect(Collectors.toList());
Here is my solution:
List<String> persons = new ArrayList<>();
persons.add("John");
persons.add("John");
persons.add("MAX");
persons.add("Greg");
List<String> result = persons.stream()
        .distinct()
        .sorted()
        .collect(Collectors.toList());
This is an old post, but I'd like to propose yet another approach based on a custom collector:
public static <T> Collector<T, ?, List<T>> excludingDuplicates() {
return Collector.<T, Map<T, Boolean>, List<T>>of(
LinkedHashMap::new,
(map, elem) -> map.compute(elem, (k, v) -> v == null),
(left, right) -> {
right.forEach((k, v) -> left.merge(k, v, (o, n) -> false));
return left;
},
m -> m.keySet().stream().filter(m::get).collect(Collectors.toList()));
}
Here I'm using Collector.of to create a custom collector that will accumulate elements on a LinkedHashMap: if the element is not present as a key, its value will be true, otherwise it will be false. The merge function is only applied for parallel streams and it merges the right map into the left map by attempting to put each entry of the right map in the left map, changing the value of already present keys to false. Finally, the finisher function returns a list with the keys of the map whose values are true.
This method can be used as follows:
List<String> people = Arrays.asList("John", "Max", "John", "Greg");
List<String> result = people.stream().collect(excludingDuplicates());
System.out.println(result); // [Max, Greg]
And here's another approach simpler than using a custom collector:
Map<String, Boolean> duplicates = new LinkedHashMap<>();
people.forEach(elem -> duplicates.compute(elem, (k, v) -> v != null));
duplicates.values().removeIf(v -> v);
Set<String> allUnique = duplicates.keySet();
System.out.println(allUnique); // [Max, Greg]
You can try the below code.
List<Person> uniquePersons = personList.stream()
        .collect(Collectors.groupingBy(Person::getName))
        .entrySet().stream()
        .filter(entry -> entry.getValue().size() == 1)
        .map(entry -> entry.getValue().get(0))
        .collect(Collectors.toList());
This should remove all the duplicate elements.
List<String> persons = new ArrayList<>();
persons.add("John");
persons.add("John");
persons.add("MAX");
persons.add("Greg");
Set<String> set = new HashSet<String>();
Set<String> duplicateSet = new HashSet<String>();
for (String p : persons) {
if (!set.add(p)) {
duplicateSet.add(p);
}
}
System.out.println(duplicateSet.toString());
set.removeAll(duplicateSet);
System.out.println(set.toString());
You can simply use Collections.frequency to check how many times an element occurs in the list, as shown below, to filter out the duplicates:
List<String> listInputs = new ArrayList<>();
//add your users
List<String> listOutputs = new ArrayList<>();
for(String value : listInputs) {
if(Collections.frequency(listInputs, value) ==1) {
listOutputs.add(value);
}
}
System.out.println(listOutputs);