I have a method that returns Mono<Output>:
interface Processor {
Mono<Output> process(Input input);
}
And I want to execute this processor method for a collection:
List<Input> inputs = // get inputs
Processor processor = // get processor
List<Mono<Output>> outputs = inputs.stream().map(processor::process).collect(toList());
But instead of a List<Mono<Output>>, I want to get a Mono<List<Output>> that contains the aggregated results.
I tried reduce, but the final result looks very clumsy:
Mono<List<Output>> result = inputs.stream().map(processor::process)
.reduce(Mono.just(new ArrayList<>()),
(monoListOfOutput, monoOfOutput) ->
monoListOfOutput.flatMap(list -> monoOfOutput.map(output -> {
list.add(output);
return list;
})),
(left, right) ->
left.flatMap(leftList -> right.map(rightList -> {
leftList.addAll(rightList);
return leftList;
})));
Can I achieve this with less code?
If you don't have to use a Stream for any other reason, you can create a Flux from your inputs, flatMap it through the processor, and collect the results into a list:
Flux.fromIterable(inputs).flatMap(processor::process).collectList();
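One caveat worth noting: flatMap subscribes to the inner Monos eagerly and may interleave their results, so the collected list is not guaranteed to follow the input order. If ordering matters, a minimal sketch using Reactor's flatMapSequential (or concatMap for one-at-a-time processing):
Mono<List<Output>> ordered = Flux.fromIterable(inputs)
        .flatMapSequential(processor::process) // eager like flatMap, but emits in input order
        .collectList();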
// first merge all the `Mono`s:
List<Mono<Output>> outputs = ...
Flux<Output> merged = Flux.empty();
for (Mono<Output> out : outputs) {
merged = merged.mergeWith(out);
}
// then collect them
return merged.collectList();
or (inspired by Alexander's answer)
Flux.fromIterable(outputs).flatMap(x -> x).collectList();
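If you already have the List<Mono<Output>> from the question, Reactor also has a static merge that accepts an Iterable of publishers, so the loop above can be collapsed (a sketch, assuming the Flux.merge(Iterable) overload):
Mono<List<Output>> result = Flux.merge(outputs).collectList();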
There's a Mono<A> and Flux<B>, and we need to create a flux of tuples like this:
Mono<A> monoA = createMono(); // {a}
Flux<B> fluxB = createFlux(); // {b1, b2, ... b100, ...}
Flux<Tuple<A,B>> zippedTuples = magicZip(monoA, fluxB); // { (a:b1), (a:b2), ... (a:b100), ...}
What is the proper (or standard) way to write the magicZip function?
You can create this method:
private <T> Flux<Tuple2<T, T>> magicZip(Mono<T> mono, Flux<T> flux) {
Flux<T> repeatableMono = mono.repeat();
return flux.zipWith(repeatableMono);
}
Example for the String type:
Flux<Tuple2<String, String>> test = magicZip(getMono(), getFlux())
        .doOnNext(objects -> System.out.println(objects.getT1() + objects.getT2()));
test.blockLast();
I think this is not possible with the zip function alone, because zip only produces as many elements as the smaller of the two sources.
The way I think you can achieve this is:
Flux<A> fluxA = monoA.flux();
Flux<Tuple2<A, B>> zippedTuples = fluxB.flatMap(b -> fluxA.map(a -> Tuples.of(a, b)));
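One thing to hedge here: flatMap re-subscribes to fluxA for every element of fluxB, so if monoA is expensive (a network call, say) its work is repeated per element. A small sketch, assuming Reactor's Mono.cache(), that resolves the mono once and replays the value:
Flux<A> fluxA = monoA.cache().flux(); // cache() replays the resolved value instead of re-running the mono
Flux<Tuple2<A, B>> zippedTuples = fluxB.flatMap(b -> fluxA.map(a -> Tuples.of(a, b)));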
I am trying to convert an iterative block of Java 8 code to a functional style. The functional approach is unable to find the matching message in the shared set.
List<Optional<Message>> allMessages = new ArrayList<>();
Set<Status> allStatuses = getAllStatuses();
//Iterative : Working
Set<StatusMessage> set = new HashSet<>(STATUS_MESSAGE.values());
for (StatusMessage statusMessage : set) {
for (Status status : statusMessage.getStatusAndInfo().keySet()) {
Optional<Message> message = MessageBuilder.createMessage(allStatuses, status, this::createMessage);
if (message.isPresent()) {
allMessages.add(message);
break;
}
}
}
//Functional : Not working - never adds anything to the
//list even when a matching status is present
STATUS_MESSAGE.values().stream()
.distinct()
.map(statusMessage -> statusMessage.getStatusAndInfo().keySet())
.flatMap(Collection::stream)
.map(key -> MessageBuilder.createMessage(allStatuses, key, this::createMessage))
.anyMatch(allMessages::add);
The MessageBuilder.createMessage looks like this:
Optional<Status> matchingStatus = statuses.stream()
.filter(status::equals)
.findFirst();
System.out.println("Found : " + matchingStatus.toString());
return matchingStatus.flatMap(creator);
Also, for debugging purposes, how can I see what is happening at each step of the stream? The stack in the IntelliJ debugger wasn't showing anything inside the stream.
This should do it:
STATUS_MESSAGE.values().stream()
.distinct()
.forEach(statusMessage ->
statusMessage.getStatusAndInfo().keySet().stream()
.map(status -> MessageBuilder.createMessage(allStatuses, status, this::createMessage))
.filter(Optional::isPresent)
.findFirst()
.ifPresent(allMessages::add)
);
UPDATE
To build the result list using toList instead of adding to a list:
List<Optional<Message>> allMessages = STATUS_MESSAGE.values().stream()
.distinct()
.flatMap(statusMessage ->
statusMessage.getStatusAndInfo().keySet().stream()
.map(status -> MessageBuilder.createMessage(allStatuses, status, this::createMessage))
.filter(Optional::isPresent)
.limit(1)
)
.collect(Collectors.toList());
This should be a comment, but it's too long...
It seems like your MessageBuilder.createMessage method is overcomplicated.
Below is a simplified and more readable version of the same logic:
if (allStatuses.contains(status)) {
System.out.println("Found : " + status.toString());
return creator.apply(status);
}
return Optional.empty();
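For context, here is how that body might sit in the full method. The signature is only inferred from how the method is called above, so treat this as a sketch rather than the original code:
// Sketch: parameter types assumed from the call MessageBuilder.createMessage(allStatuses, status, this::createMessage)
static Optional<Message> createMessage(Set<Status> allStatuses,
                                       Status status,
                                       Function<Status, Optional<Message>> creator) {
    if (allStatuses.contains(status)) {
        System.out.println("Found : " + status);
        return creator.apply(status); // delegate message creation to the supplied function
    }
    return Optional.empty();
}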
You should not use forEach for accumulating operations, so this should be more idiomatic:
Function<StatusInfo, Optional<Message>> messageForStatus = statusInfo ->
statusInfo.keySet().stream()
.map(status -> MessageBuilder.createMessage(allStatuses, status, this::createMessage))
.filter(Optional::isPresent)
.findFirst()
.orElse(Optional.empty());
allMessages = STATUS_MESSAGE.values().stream()
.distinct()
.map(StatusMessage::getStatusAndInfo)
.map(messageForStatus)
.filter(Optional::isPresent)
.collect(toList());
As a side note, you have too many optionals, you may want to consider unwrapping some earlier, as a list of optionals may just as well be the list of only the present values.
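On Java 9 or later, Optional::stream gives you exactly that unwrapped list, so the filter-then-collect of Optionals disappears. A sketch built on the messageForStatus function above:
List<Message> presentMessages = STATUS_MESSAGE.values().stream()
        .distinct()
        .map(StatusMessage::getStatusAndInfo)
        .map(messageForStatus)
        .flatMap(Optional::stream) // keeps only the present values, already unwrapped
        .collect(toList());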
I have a simple User class with a String and an int property.
I would like to add two Lists of users this way:
if the name Strings are equal, the numbers should be added together and become the new value.
The new list should include all users with the proper values.
Like this:
List1: { [a:2], [b:3] }
List2: { [b:4], [c:5] }
ResultList: {[a:2], [b:7], [c:5]}
User definition:
public class User {
private String name;
private int comments;
}
My method:
public List<User> addTwoList(List<User> first, List<User> sec) {
List<User> result = new ArrayList<>();
for (int i=0; i<first.size(); i++) {
Boolean bsin = false;
Boolean isin = false;
for (int j=0; j<sec.size(); j++) {
isin = false;
if (first.get(i).getName().equals(sec.get(j).getName())) {
int value= first.get(i).getComments() + sec.get(j).getComments();
result.add(new User(first.get(i).getName(), value));
isin = true;
bsin = true;
}
if (!isin) {result.add(sec.get(j));}
}
if (!bsin) {result.add(first.get(i));}
}
return result;
}
But it adds a whole lot of things to the list.
This is better done via the toMap collector:
Collection<User> result = Stream
.concat(first.stream(), second.stream())
.collect(Collectors.toMap(
User::getName,
u -> new User(u.getName(), u.getComments()),
(l, r) -> {
l.setComments(l.getComments() + r.getComments());
return l;
}))
.values();
First, concatenate both lists into a single Stream<User> via Stream.concat.
Second, use the toMap collector to merge users that happen to have the same name, and get back a Collection<User>.
If you strictly want a List<User>, pass the result into the ArrayList constructor, i.e. List<User> resultSet = new ArrayList<>(result);
Kudos to @davidxxx: you could collect to a list directly from the pipeline and avoid creating an intermediate variable:
List<User> result = Stream
.concat(first.stream(), second.stream())
.collect(Collectors.toMap(
User::getName,
u -> new User(u.getName(), u.getComments()),
(l, r) -> {
l.setComments(l.getComments() + r.getComments());
return l;
}))
.values()
.stream()
.collect(Collectors.toList());
You have to use an intermediate map to merge users from both lists by summing their comment counts.
One way is with streams, as shown in Aomine's answer. Here's another way, without streams:
Map<String, Integer> map = new LinkedHashMap<>();
list1.forEach(u -> map.merge(u.getName(), u.getComments(), Integer::sum));
list2.forEach(u -> map.merge(u.getName(), u.getComments(), Integer::sum));
Now, you can create a list of users, as follows:
List<User> result = new ArrayList<>();
map.forEach((name, comments) -> result.add(new User(name, comments)));
This assumes User has a constructor that accepts name and comments.
EDIT: As suggested by @davidxxx, we could improve the code by factoring out the first part:
BiConsumer<List<User>, Map<String, Integer>> action = (list, map) ->
list.forEach(u -> map.merge(u.getName(), u.getComments(), Integer::sum));
Map<String, Integer> map = new LinkedHashMap<>();
action.accept(list1, map);
action.accept(list2, map);
This refactoring keeps the code DRY, since the merge logic is written only once.
There is a pretty direct way using Collectors.groupingBy and Collectors.reducing which doesn't require setters; that is the biggest advantage, since you can keep User immutable:
Collection<Optional<User>> d = Stream
.of(first, second) // start with Stream<List<User>>
.flatMap(List::stream) // flatting to the Stream<User>
.collect(Collectors.groupingBy( // Collecting to Map<String, List<User>>
User::getName, // by name (the key)
// and reducing the list into a single User
Collectors.reducing((l, r) -> new User(l.getName(), l.getComments() + r.getComments()))))
.values(); // return values from Map<String, List<User>>
Unfortunately, the result is Collection<Optional<User>>, since the reducing collector wraps its result in an Optional (the reduction might produce no value at all). You can stream the values and use map() to get rid of the Optional, or use Collectors.collectingAndThen*:
Collection<User> d = Stream
.of(first, second) // start with Stream<List<User>>
.flatMap(List::stream) // flatting to the Stream<User>
.collect(Collectors.groupingBy( // Collecting to Map<String, List<User>>
User::getName, // by name (the key)
Collectors.collectingAndThen( // reduce the list into a single User
Collectors.reducing((l, r) -> new User(l.getName(), l.getComments() + r.getComments())),
Optional::get))) // and extract from the Optional
.values();
* Thanks to @Aomine
As an alternative, a fairly direct and efficient approach:
stream the elements
collect them into a Map<String, Integer> that associates each name with the sum of its comments (int)
stream the entries of the collected map to create the List of User.
Alternatively, for the third step you could apply a finishing transformation to the Map collector with collectingAndThen(groupingBy(...), m -> ...) (see the sketch after the code below),
but I don't always find that very readable, and here we can do without it.
It would give:
List<User> users =
Stream.concat(first.stream(), second.stream())
.collect(groupingBy(User::getName, summingInt(User::getComments)))
.entrySet()
.stream()
.map(e -> new User(e.getKey(), e.getValue()))
.collect(toList());
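For completeness, the collectingAndThen variant mentioned above might look roughly like this (a sketch, using the same static imports plus Collectors.collectingAndThen):
List<User> users =
    Stream.concat(first.stream(), second.stream())
          .collect(collectingAndThen(
                  groupingBy(User::getName, summingInt(User::getComments)),
                  m -> m.entrySet().stream()
                        .map(e -> new User(e.getKey(), e.getValue()))
                        .collect(toList())));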
I am using RxJava, and I want to dynamically create a number of Observables based on some condition. Once I'm done creating them, I want to do some processing on the different values returned by the observables and then emit the result as a single Observable that I can subscribe to. Here is what my code looks like:
List<String> valueList = ....
List<Observable<String>> listOfObservables = new ArrayList<Observable<String>>();
for (int i = 0; i < valueList.size(); i++) {
listOfObservables.add(SomeClass.doOperation(valueList.get(i)));
// SomeClass.doOperation will return an Observable<String>
}
return Observable.merge(listOfObservables);
But here, I want to do some operation on the values emitted by the different Observables in listOfObservables and finally return them as a single Observable<String>.
With Observable.zip() I could do something like:
return Observable.zip(observable1, observable2, (string1, string2) -> {
// joining final string here
return string1 + string2;
});
But that only works when I know the number of arguments in advance, which I don't here. Please let me know how I can achieve this.
Use the zip overload that handles a variable number of Observables via an Iterable; it has the signature:
<R> Observable<R> zip(Iterable<? extends Observable<?>> ws,
                      FuncN<? extends R> zipFunction)
Example usage:
List<String> valueList = ....
return Observable.from(valueList)
.map(string -> SomeClass.doOperationThatReturnsObservable(string))
.toList()
.flatMap(listOfObs -> Observable.zip(listOfObs, (Object[] results) -> {
// do something with the strings in the array.
return Arrays.stream(results)
.map(Object::toString)
.collect(Collectors.joining(","));
}));
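If you are on RxJava 2 or 3 rather than 1.x, the same pipeline can be expressed with the renamed operators; a hedged sketch (fromIterable, toList returning a Single, and flatMapObservable are assumed from those versions):
return Observable.fromIterable(valueList)
        .map(string -> SomeClass.doOperationThatReturnsObservable(string))
        .toList() // Single<List<Observable<String>>> in RxJava 2/3
        .flatMapObservable(listOfObs -> Observable.zip(listOfObs, results ->
                Arrays.stream(results)
                      .map(Object::toString)
                      .collect(Collectors.joining(","))));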
I'd like to know if there is a good way of reusing a common stream operation that varies only at the end for different outputs.
The example below is exactly what I'm trying to compact into a one-step operation:
public static DepartmentInfo extractDepartmentInfo(BaselinePolicy resource) throws ResourceProcessorError {
Function<Exception, Exception> rpe = e -> new ResourceProcessorError(e.getMessage());
List<String> parents =
Objects.requireNonNull(
Exceptions.trying(
() -> Arrays.asList(Exceptions.dangerous(resource::getParentIds).expecting(CMException.class).throwing(rpe))
.stream()
.map(cId -> Exceptions.dangerous(cId, resource.getCMServer()::getPolicy).expecting(CMException.class).throwing(rpe))
.filter(policy -> PagePolicy.class.isAssignableFrom(policy.getClass()))
.map(PagePolicy.class::cast)
.filter(page -> Exceptions.dangerous(page,
p -> Boolean.valueOf(p.getComponentNotNull(ComponentConstants.POLOPOLY_CLIENT,
ComponentConstants.IS_HOME_DEPARTMENT,
Boolean.FALSE.toString())).booleanValue())
.expecting(CMException.class).throwing(rpe))
.map(page -> Exceptions.dangerous(page, p -> p.getExternalId().getExternalId()).expecting(CMException.class).throwing(rpe)), ResourceProcessorError.class)
.collect(Collectors.toList()));
String externalId = parents.get(parents.size()-1).toString();
List<String> list =
Objects.requireNonNull(
Exceptions.trying(
() -> Arrays.asList(Exceptions.dangerous(resource::getParentIds).expecting(CMException.class).throwing(rpe))
.stream()
.map(cId -> Exceptions.dangerous(cId, resource.getCMServer()::getPolicy).expecting(CMException.class).throwing(rpe))
.filter(policy -> PagePolicy.class.isAssignableFrom(policy.getClass()))
.map(PagePolicy.class::cast)
.map(page ->
Exceptions.dangerous(page,
p -> p.getChildPolicy(PATH_SEGMENT) != null &&
StringUtils.hasLength(SingleValued.class.cast(p.getChildPolicy(PATH_SEGMENT)).getValue())?
SingleValued.class.cast(p.getChildPolicy(PATH_SEGMENT)).getValue(): p.getName()).expecting(CMException.class).throwing(rpe))
.filter(val -> val != null && !val.isEmpty()), ResourceProcessorError.class)
.collect(Collectors.toList()));
if(list.size() > 3) {
list = list.subList(list.size() - 3, list.size()-1);
}
switch(list.size()) {
case 0: {
throw new ResourceProcessorError("br.com.oesp.XMLRender.error.noProduct");
}
case 1: {
return DepartmentInfo.withProduct(list.get(0), externalId);
}
case 2: {
return DepartmentInfo.withProduct(list.get(0), externalId).withDepartment(list.get(1));
}
default: {
return DepartmentInfo.withProduct(list.get(0), externalId).withDepartment(list.get(1)).withSubDepartment(list.get(2));
}
}
}
Notice that the first step is repeated for both:
List<String> parents =
Objects.requireNonNull(
Exceptions.trying(
() -> Arrays.asList(Exceptions.dangerous(resource::getParentIds).expecting(CMException.class).throwing(rpe))
.stream()
.map(cId -> Exceptions.dangerous(cId, resource.getCMServer()::getPolicy).expecting(CMException.class).throwing(rpe))
.filter(policy -> PagePolicy.class.isAssignableFrom(policy.getClass()))
.map(PagePolicy.class::cast)
It's not only a readability problem: I'm also redoing a heavy operation twice, whereas in a more imperative style I'd do it only once.
There are two things you're trying to do:
avoid the redundant work of creating the input array
avoid the redundant code of the map/filter/map
The first is easy:
List<Id> list = Arrays.asList(Exceptions.dangerous(resource::getParentIds)
.expecting(CMException.class)
.throwing(rpe));
Now you can pull streams from this source twice without rematerializing it.
The next bit is simply a Function from List to Stream:
Function<List<Id>, Stream<Something>> asStream =
list -> list.stream().map(...).filter(...).map(...);
Now, just start your stream with this:
asStream.apply(list).moreStuff().moreStuff()
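To make the idea concrete without dragging in the question's domain classes, here is a tiny self-contained illustration of the pattern (plain strings, hypothetical data): one Function produces the shared front of the pipeline, and each caller appends only its own tail.
// Shared front of the pipeline, reused by both callers below
Function<List<String>, Stream<String>> normalized =
        l -> l.stream()
              .map(String::trim)
              .filter(s -> !s.isEmpty())
              .map(String::toUpperCase);

List<String> ids = Arrays.asList(" a ", "", "b", "a");
List<String> all = normalized.apply(ids).collect(Collectors.toList()); // [A, B, A]
long distinct = normalized.apply(ids).distinct().count();              // 2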