I have two lists of maps:
orders
[
{
item_id=1,
item=item-1,
user_id=1
},
{
item_id=2,
item=item-2,
user_id=2
},
{
item_id=3,
item=item-3,
user_id=3
}
]
users
[
{
user_id=1,
name=abh,
email=abh@bit.com
},
{
user_id=2,
name=pol,
email=pol@bit.com
},
{
user_id=3,
name=tre,
email=tre@bit.com
}
]
They are initialized as
List<Map<String, String>> data
I want to do an SQL-equivalent inner join on these lists of maps using Streams.
I tried this:
List<Map<String, String>> collect = leftData.stream().flatMap(t1 -> rightData.stream())
.filter(t -> t.get(joinColumnTableLeft).equals(t.get(joinColumnTableRight)))
.collect(Collectors.toList());
This gives me a result of size size(users) * size(orders), which is 9.
And the collected list contains only the order maps.
But I want both maps to be merged into a single map and then create a list out of those merged maps.
I cannot use any library as of now.
Assuming that you don't have duplicate entries (by the join column key), you can use a method like this to merge.
It first builds a map from the join column key to the full row map for one of the lists, then uses that map for lookups while iterating through the other list and merging.
static List<Map<String, String>> merge(List<Map<String, String>> left,
List<Map<String, String>> right, String joinColumnTableLeft,
String joinColumnTableRight) {
Map<String, Map<String, String>> rightById = right.stream()
.collect(Collectors.toMap(m -> m.get(joinColumnTableRight),
Function.identity()));
return left.stream()
.filter(e -> rightById.containsKey(e.get(joinColumnTableLeft)))
.map(l -> {
Map<String, String> all = new HashMap<>();
all.putAll(l);
all.putAll(rightById.get(l.get(joinColumnTableLeft)));
return all;
})
.collect(Collectors.toList());
}
As a test:
Map<String, String> left1 = new HashMap<>(), right1 = new HashMap<>();
left1.put("a", "A");
left1.put("b", "B");
left1.put("c", "C");
right1.put("a", "A");
right1.put("d", "B");
Map<String, String> left2 = new HashMap<>(), right2 = new HashMap<>();
left2.put("a", "AA");
left2.put("b", "BB");
left2.put("c", "CC");
right2.put("a", "AA");
right2.put("d", "BB");
System.out.println(merge(Arrays.asList(left1, left2),
Arrays.asList(right1, right2), "a", "a"));
The output is: [{a=A, b=B, c=C, d=B}, {a=AA, b=BB, c=CC, d=BB}]
The order of entries within each map isn't significant, though. Just note that this assumes there are no overlapping keys other than the join column; otherwise, you may want to collect pairs of maps instead of calling putAll on a new map, as sketched below.
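For example, a minimal sketch of that pairs-of-maps variant, reusing rightById and the parameters of the merge method above (java.util.AbstractMap.SimpleEntry works on Java 8; the names here are only illustrative):
// Keep each matching left/right row side by side instead of merging them,
// so overlapping column names cannot silently overwrite each other.
List<Map.Entry<Map<String, String>, Map<String, String>>> pairs = left.stream()
        .filter(l -> rightById.containsKey(l.get(joinColumnTableLeft)))
        .map(l -> new AbstractMap.SimpleEntry<>(l, rightById.get(l.get(joinColumnTableLeft))))
        .collect(Collectors.toList());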
The following will support duplicate join keys (and will produce a cartesian product for all entries per key):
static List<Map<String, String>> merge(List<Map<String, String>> left,
List<Map<String, String>> right,
String joinColumnTableLeft, String joinColumnTableRight) {
Map<String, List<Map<String, String>>> rightById = right.stream()
.collect(Collectors.groupingBy(m -> m.get(joinColumnTableRight)));
return left.stream()
.filter(e -> rightById.containsKey(e.get(joinColumnTableLeft)))
.flatMap(l -> rightById.get(l.get(joinColumnTableLeft)).stream()
.map(r -> {
Map<String, String> all = new HashMap<>();
all.putAll(l);
all.putAll(r);
return all;
}
)
).collect(Collectors.toList());
}
Consider the following data structures:
ArrayList<HashMap<String, String>> entries = new ArrayList<>();
ArrayList<String> keyNamesToInclude = new ArrayList<>();
This code creates a copy of entries, but with hashmaps only including the keys in keyNamesToInclude:
ArrayList<HashMap<String, String>> copies = new ArrayList<>();
for (HashMap<String, String> entry: entries) {
HashMap<String, String> copy = new HashMap<>();
for (String keyName: keyNamesToInclude) {
copy.put(keyName, entry.get(keyName));
}
copies.add(copy);
}
How would one create this with Streams in a functional way?
It is better to convert keyNamesToInclude into a Set to make key lookup cheaper.
Then use List::stream to get a Stream<HashMap>, and for each map stream its entries, filter them, and re-collect them into a new map and a new list accordingly.
Set<String> keys = new HashSet<>(keyNamesToInclude); // removes possible duplicates
List<Map<String, String>> copies = entries.stream() // Stream<HashMap>
.map(m -> m.entrySet()
.stream()
.filter(e -> keys.contains(e.getKey()))
.collect(Collectors.toMap(
Map.Entry::getKey, Map.Entry::getValue
))
)
.collect(Collectors.toList());
If it is very important to have concrete implementations of List and Map in copies, casting or special forms of collectors may be needed, even though Collectors.toList() currently returns an ArrayList and Collectors.toMap a HashMap (neither is guaranteed by the specification):
// casting
ArrayList<HashMap<String, String>> copies2 = (ArrayList) entries.stream() // Stream<HashMap>
.map(m -> (HashMap<String, String>) m.entrySet()
.stream()
.filter(e -> keys.contains(e.getKey()))
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue))
)
.collect(Collectors.toList());
// special collectors
// toMap(keyMapper, valueMapper, mergeFunction, mapFactory)
// toList -> toCollection(ArrayList::new)
ArrayList<HashMap<String, String>> copies3 = entries.stream() // Stream<HashMap>
.map(m -> m.entrySet()
.stream()
.filter(e -> keys.contains(e.getKey()))
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue, (a, b) -> a, HashMap::new))
)
.collect(Collectors.toCollection(ArrayList::new));
You can do something like:
Set<String> setNamesToInclude = new HashSet<>(keyNamesToInclude);
List<Map<String, String>> copies = entries.stream()
.map(hm -> hm.keySet()
.stream()
.filter(setNamesToInclude::contains)
.collect(Collectors.toMap(Function.identity(), hm::get))
)
.collect(Collectors.toList());
A perhaps somewhat more efficient solution that uses the map's hash lookup per key instead of a contains check against the key list:
List<HashMap<String, String>> copies = maps.stream().map(e -> {
HashMap<String, String> copy = new HashMap<>();
keys.forEach(k -> {
String value = e.get(k);
if (value != null)
copy.put(k, value);
});
return copy;
}).collect(Collectors.toList());
You can do something like the following:
return entries.stream()
        .map(hashmap -> hashmap.entrySet().stream()
                .filter(e -> keyNamesToInclude.contains(e.getKey()))
                .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue)))
        .collect(Collectors.toList());
I recommend using a Set for keyNamesToInclude since lookups are a little more efficient. This solution also filters empty maps out of the final list.
List<Map<String, String>> entries = List.of(
Map.of("1", "one", "2", "two", "3", "three"),
Map.of("2", "two", "3", "three"), Map.of("1", "one"),
Map.of("6", "six", "7", "seven"));
entries.forEach(System.out::println);
Set<String> keyNamesToInclude = Set.of("2", "3", "6");
stream the original list of maps
for each map, filter the required keys
build the new map
filter out the empty maps
create the list
List<Map<String, String>> copies = entries.stream()
.map(m -> m.entrySet().stream().filter(
e -> keyNamesToInclude.contains(e.getKey()))
.collect(Collectors.toMap(Entry::getKey,
Entry::getValue))).filter(m->!m.isEmpty())
.toList(); // java 16 - can be replaced with a collector
copies.stream().forEach(System.out::println);
This prints four lines of source and three lines of result:
{3=three, 2=two, 1=one}
{3=three, 2=two}
{1=one}
{7=seven, 6=six}
{2=two, 3=three}
{2=two, 3=three}
{6=six}
I have a List<Map<String,String>>
such as:
Map<String, String> m1 = new HashMap<>();
m1.put("date", "2020.1.5");
m1.put("B", "10");
Map<String, String> m2 = new HashMap<>();
m2.put("date", "2020.1.5");
m2.put("A", "20");
Map<String, String> m3 = new HashMap<>();
m3.put("date", "2020.1.6");
m3.put("A", "30");
Map<String, String> m4 = new HashMap<>();
m4.put("date", "2020.1.7");
m4.put("C", "30");
List<Map<String, String>> before = new ArrayList<>();
before.add(m1);
before.add(m2);
before.add(m3);
before.add(m4);
My expected result is to generate a new list of maps, grouped by date, where all the entries for the same date are put together, like:
[{"A":"20","B":"10","date":"2020.1.5"},{"A":"30","date":"2020.1.6"},{"C":"30","date":"2020.1.7"}]
I tried something like the following, but it never gives my expected result:
stream().flatMap().collect(Collectors.groupingBy())
Some additional comments for this problem:
I worked this out with a for loop, but the application hangs when the list size is around 50000, so I am looking for a more performant way to do this. As far as I know, Java 8 stream flatMap is perhaps a way to do it.
So the key point is not only to remap this but also to do it in the most performant way.
before
.stream()
.collect(Collectors.toMap((m) -> m.get("date"), m -> m, (a,b) -> {
Map<String, String> res = new HashMap<>();
res.putAll(a);
res.putAll(b);
return res;
}))
.values();
This is the solution you're looking for.
The toMap function receives 3 parameters:
the key mapper, which in your case is the date
the value mapper, which is the map itself that's being processed
the merge function, which takes 2 maps with the same date and puts all the keys together
Output:
[{date=2020.1.5, A=20, B=10}, {date=2020.1.6, A=30}, {date=2020.1.7, C=30}]
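For the record, values() returns a Collection view rather than a List; if a List is required, a small wrapper over the same pipeline (a sketch using the before list from the question) could be:
List<Map<String, String>> result = new ArrayList<>(before.stream()
        .collect(Collectors.toMap(m -> m.get("date"), m -> m, (a, b) -> {
            Map<String, String> res = new HashMap<>();
            res.putAll(a);
            res.putAll(b);
            return res;
        }))
        .values());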
You can do it this way using groupingBy and Collector.of:
List<Map<String, String>> list = new ArrayList<>(before.stream()
.collect(Collectors.groupingBy(
k -> k.get("date"),
Collector.of( HashMap<String,String>::new,
(m,e)-> m.putAll(e),
(map1,map2)->{ map1.putAll(map2); return map1;}
))).values());
Here, first use Collectors.groupingBy to group by date. Then define a custom collector using Collector.of to collect each group's List<Map<String, String>> into a single Map<String, String>. Afterwards, create the list from the grouped map's values.
And using Collectors.flatMapping from Java 9
List<Map<String, String>> list = new ArrayList<>(before.stream()
.collect(Collectors.groupingBy(
k -> k.get("date"),
Collectors.flatMapping(m -> m.entrySet().stream(),
Collectors.toMap(k -> k.getKey(), v -> v.getValue(), (a,b) -> a))))
.values());
You can achieve the very same result by chaining a number of Collectors, in order:
Collectors.groupingBy to group by the date
Collectors.reducing to merge the Map<String, String> items
Collectors.collectingAndThen to transform the Map<String, Optional<Map<String, String>>> produced by the previous reducing into the final output List<Map<String, String>>.
List<Map<String, String>> list = before.stream()
.collect(Collectors.collectingAndThen(
Collectors.groupingBy(
m -> m.get("date"),
Collectors.reducing((l, r) -> {
l.putAll(r);
return l; })
),
o -> o.values().stream()
.flatMap(Optional::stream)
.collect(Collectors.toList())));
The list contains what you are looking for:
[{date=2020.1.5, A=20, B=10}, {date=2020.1.6, A=30}, {date=2020.1.7, C=30}]
Important: This solution has two disadvantages:
It looks clumsy and might not be clear to an independent reader.
It mutates (modifies) the original maps included in the List<Map<String, String>> before.
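If the mutation is a concern, one possible non-mutating variant (a sketch, not part of the original answer) copies the left map inside the reducing step instead of modifying it:
List<Map<String, String>> list = before.stream()
        .collect(Collectors.collectingAndThen(
                Collectors.groupingBy(
                        m -> m.get("date"),
                        Collectors.reducing((l, r) -> {
                            // copy l so the source maps in 'before' stay untouched
                            Map<String, String> merged = new HashMap<>(l);
                            merged.putAll(r);
                            return merged; })
                ),
                o -> o.values().stream()
                        .flatMap(Optional::stream)
                        .collect(Collectors.toList())));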
It can be done as follows:
List<Map<String, String>> remapped = before.stream()
.collect(Collectors.groupingBy(m -> m.get("date")))
.values().stream()
.map(e -> e.stream()
.flatMap(m -> m.entrySet().stream())
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue, (x1, x2) -> x1)))
.collect(Collectors.toList());
remapped.forEach(System.out::println);
Output:
{date=2020.1.5, A=20, B=10}
{date=2020.1.6, A=30}
{date=2020.1.7, C=30}
I am trying to merge a Stream<Map<String, Map<String, String>>> into a single map containing the keys from all the maps in the stream.
For example,
final Map<String, someOtherObjectToProcess> someObject;
final List<Map<String, Map<String, String>>> list = someObject.entrySet()
.stream()
.flatMap(this::getInfoStream)
.collect(Collectors.toList());
The signature for getInfoStream is
public Stream<Map<String, Map<String, String>>> getInfoStream(Map.Entry<String, someOtherObjectToProcess> entry)
If I use Collectors.toList() I am able to get a list of these Map objects.
Sample output if I use the above code:
[{
"name" : {
"property":"value"
}
},
{
"name2" : {
"property":"value"
}
}]
But I want to collect into a Map with the structure
{
"name" : {
"property":"value"
},
"name2" : {
"property":"value"
}
}
Provided that the keys will be unique.
How can I do this with Collectors.toMap() or any other alternative way?
When you have
Stream<Map<String, Map<String, String>>> stream = ...
(which I am assuming is the result of .flatMap(this::getInfoStream)), you can call
.flatMap(map -> map.entrySet().stream())
to create a stream of entries from all the maps, which will produce a Stream<Map.Entry<String, Map<String, String>>>.
Now, from that stream, all you need to do is collect the key and value of each entry into a map. Assuming each key is unique across all maps, you could use
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
but if the keys are not unique you need to decide which value should be placed in the new map for a given key. We can do that by filling in the ... part of
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue, (vOld, vNew) -> ...));
// ^^^
where vOld holds the value currently stored in the result map under that key, and vNew holds the new value (from the current stream "iteration").
For instance, if you want to ignore the new value, you can simply return the old/currently held one with (vOld, vNew) -> vOld.
So in short (assuming unique keys):
Map<String, Map<String, String>> combinedMap =
/*your Stream<Map<String, Map<String, String>>>*/
.flatMap(map -> map.entrySet().stream())
.collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
Another way to solve this would be to use not a collector (toList()) but the other overloaded .collect() method that takes a supplier, an accumulator, and a combiner:
Stream<Map<String, Map<String, String>>> stream = ...
Map<String, Map<String, String>> result = stream
.collect(HashMap::new, HashMap::putAll, HashMap::putAll);
The most readable way in my opinion is to map everything to a Map.Entry and then collect everything back to a Map using Collectors::toMap
import static java.util.stream.Collectors.toMap;
// ...
someObject.entrySet()
.stream()
.flatMap(this::getInfoStream)
.flatMap(map -> map.entrySet().stream())
.collect(toMap(Map.Entry::getKey, Map.Entry::getValue, (one, two) -> one));
(one, two) -> one is the merge function: if you have duplicates, you just arbitrarily take the first one that comes up.
TL;DR:
var merged = Stream.of(map1, map2, ..., mapN).reduce(new HashMap<>(), (a, b) -> {
a.putAll(b);
return a;
});
You can use reduce to combine a stream of Map<String, Map<String, String>> elements into one:
import java.util.HashMap;
import java.util.Map;
import java.util.stream.Stream;
public class Main {
public static void main(String[] args) {
alternative1();
alternative2();
}
// Use reduce without an initial identity value
public static void alternative1() {
Map<String, Map<String, String>> m1 = new HashMap<>();
m1.put("name", Map.of("property", "value"));
Map<String, Map<String, String>> m2 = new HashMap<>();
m2.put("name2", Map.of("property", "value"));
Stream<Map<String, Map<String, String>>> mapStream = Stream.of(m1, m2);
Map<String, Map<String, String>> m3 = mapStream.reduce((a, b) -> {
Map<String, Map<String, String>> temp = new HashMap<>();
temp.putAll(a);
temp.putAll(b);
return temp;
}).orElseThrow();
System.out.println(m3);
}
// Use reduce with an initial empty map as the identity value
public static void alternative2() {
Map<String, Map<String, String>> m1 = new HashMap<>();
m1.put("name", Map.of("property", "value"));
Map<String, Map<String, String>> m2 = new HashMap<>();
m2.put("name2", Map.of("property", "value"));
Stream<Map<String, Map<String, String>>> mapStream = Stream.of(m1, m2);
Map<String, Map<String, String>> m3 = mapStream.reduce(new HashMap<>(), (a, b) -> {
a.putAll(b);
return a;
});
System.out.println(m3);
}
}
Output:
{name={property=value}, name2={property=value}}
{name={property=value}, name2={property=value}}
But beware that these solutions assume the outer keys (name and name2) are unique; otherwise duplicate keys would make map entries overwrite each other (see the sketch after the next example for one way to merge them instead).
The same logic with a more modern syntax:
import java.util.HashMap;
import java.util.Map;
import java.util.stream.Stream;
public class Main {
public static void main(String[] args) {
alternative1();
alternative2();
}
// Use reduce without an initial identity value
public static void alternative1() {
var m1 = Map.of("name", Map.of("property", "value"));
var m2 = Map.of("name2", Map.of("property", "value"));
var m3 = Stream.of(m1, m2).reduce((a, b) -> {
var temp = new HashMap<String, Map<String, String>>();
temp.putAll(a);
temp.putAll(b);
return temp;
}).orElseThrow();
System.out.println(m3);
}
// Use reduce with an initial empty map as the identity value
public static void alternative2() {
var m1 = Map.of("name", Map.of("property", "value"));
var m2 = Map.of("name2", Map.of("property", "value"));
var m3 = Stream.of(m1, m2).reduce(new HashMap<>(), (a, b) -> {
a.putAll(b);
return a;
});
System.out.println(m3);
}
}
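If duplicate outer keys can occur and you would rather combine the inner maps than let one overwrite the other, one possible sketch (reusing m1 and m2 from the example above; the inner-map merge policy here is an assumption, and java.util.stream.Collectors must be imported as well):
Map<String, Map<String, String>> merged = Stream.of(m1, m2)
        .flatMap(m -> m.entrySet().stream())
        .collect(Collectors.toMap(
                Map.Entry::getKey,
                e -> new HashMap<String, String>(e.getValue()), // copy the inner map (Map.of is immutable) so it can be merged
                (a, b) -> { a.putAll(b); return a; }));         // combine inner maps when the same outer key appears twice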
I have a list containing maps, and I want to filter the list using some (multiple) keys and values inside the map objects. For example, WHERE ID > 1 AND Name <> "cc" (key > 1, Name <> "cc").
How can I do that in Java?
I have imported the Guava libraries, which have Collections2 for filtering collections.
But I didn't find any example that filters the Map objects inside the list.
Here is some of my example code:
List<Map<String, Object>> baseList = new ArrayList<>();
Map<String, Object> map1 = new HashMap<>();
map1.put("ID", 1);
map1.put("Name", "aa");
baseList.add(map1);
Map<String, Object> map2 = new HashMap<>();
map2.put("ID", 2);
map2.put("Name", "bb");
baseList.add(map2);
Map<String, Object> map3 = new HashMap<>();
map3.put("ID", 3);
map3.put("Name", "cc");
baseList.add(map3);
List<Map<String, Object>> filteredList = new ArrayList<>();
filteredList = Collections2.filter() ???
I want to filter with something like ID >= 1 AND Name <> "cc", which will result in a list containing Map objects like this: [{ID=1, Name="aa"}, {ID=2, Name="bb"}]
Can anyone help?
I have no idea what you need Guava for. I'd do it in the following way:
List<Map<String, Object>> filteredList = baseList.stream()
        .filter(map -> (Integer) map.get("ID") >= 1 && !"cc".equals(map.get("Name")))
        .collect(Collectors.toList());
Do you use Java 8? You can do:
List<Map<String, Object>> filteredList = maps.stream()
.filter(map -> (Integer) map.get("ID") >= 1 && !"cc".equals(map.get("Name")))
.collect(Collectors.toList());
to have new list with filtered maps.
If you want a collection view using Guava goodies (or you don't have Java 8), you should use Collections2.filter:
Collection<Map<String, Object>> filteredList = Collections2.filter(
maps, new Predicate<Map<String, Object>>() {
@Override
public boolean apply(@Nullable Map<String, Object> map) {
return (Integer) map.get("ID") >= 1 && !"cc".equals(map.get("Name"));
}
});
There's no Lists.filter (see Guava's IdeaGraveyard for the explanation), hence only the Collection view is provided.
Do you really need a list of maps instead of a Map<Integer, String> (or maybe Map<Integer, YourDomainObject>)? If a map works for you, you could do:
final Map<Integer, String> map = ImmutableMap.of(
1, "aa",
2, "bb",
3, "cc");
final Map<Integer, String> filteredMap = Maps.filterEntries(map,
e -> e.getKey() >= 1 && !"cc".equals(e.getValue()));
Given that we have the following function:
public Map<String, List<String>> mapListIt(List<Map<String, String>> input) {
Map<String, List<String>> results = new HashMap<>();
List<String> things = Arrays.asList("foo", "bar", "baz");
for (String thing : things) {
results.put(thing, input.stream()
.map(element -> element.get("id"))
.collect(Collectors.toList()));
}
return results;
}
Is there some way I could clean this up by binding "id" to a Map::get method reference?
Is there a more stream-y way to write this functionality?
As far as I can tell, what you intend is for this function to return a map from a fixed list of strings to a list of all the values stored under the key "id" in the input maps. Is that correct?
If so, it can be significantly simplified, as the value for all keys will be the same:
public Map<String, List<String>> weirdMapFunction(List<Map<String, String>> inputMaps) {
List<String> ids = inputMaps.stream()
.map(m -> m.get("id")).collect(Collectors.toList());
return Stream.of("foo", "bar", "baz")
.collect(Collectors.toMap(Function.identity(), s -> ids));
}
If you wish to use a method reference (which is my interpretation of your question about 'binding') then you will need a separate method to reference:
private String getId(Map<String, String> map) {
return map.get("id");
}
public Map<String, List<String>> weirdMapFunction(List<Map<String, String>> inputMaps) {
List<String> ids = inputMaps.stream()
.map(this::getId).collect(Collectors.toList());
return Stream.of("foo", "bar", "baz")
.collect(Collectors.toMap(Function.identity(), s -> ids));
}
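Alternatively (just a sketch, not part of the original answer), the "binding" can live in a Function variable instead of a named method, using the inputMaps parameter from the method above:
// a lambda stored in a Function plays the role of the "bound" Map::get
Function<Map<String, String>, String> getId = m -> m.get("id");
List<String> ids = inputMaps.stream()
        .map(getId)
        .collect(Collectors.toList());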
However, I'm guessing that you intended to use the items in the list as the keys (rather than "id"), in which case:
public Map<String, List<String>> weirdMapFunction(List<Map<String, String>> inputMaps) {
return Stream.of("foo", "bar", "baz")
.collect(Collectors.toMap(Function.identity(), s -> inputMaps.stream()
.map(m -> m.get(s)).collect(Collectors.toList())));
}