How to keep an app reactive when needing to subscribe? (Java)

I have a fully reactive web app that aggregates the information from two other backend-services.
Incoming request -> sends request to service A and B -> aggregates responses -> response is emitted.
pseudocode:
public Mono<ResponseEntity<List<String>>> getValues() {
    return Mono.zip(getValuesA(), getValuesB(),
            (a, b) -> Stream.concat(a.stream(), b.stream()).collect(Collectors.toList()))
        .map(result -> ResponseEntity.ok(result));
}

public Mono<List<String>> getValuesA() {
    return webClient.get()
        .uri(uriA)
        .retrieve()
        .bodyToMono(new ParameterizedTypeReference<List<String>>() {});
}
// getValuesB is the same as A, but with uriB.
Because of the high request frequency, I want to bundle requests to the backend services. I thought using Sinks would be the right way to go: every requesting party gets a sink's Mono, and once a threshold of 10 pending requests is exceeded, a single bundled request is made and the response is emitted to every sink.
public Mono<ResponseEntity<List<String>>> getValues() {
    return Mono.zip(getValuesA(), getValuesB(),
            (a, b) -> Stream.concat(a.stream(), b.stream()).collect(Collectors.toList()))
        .map(result -> ResponseEntity.ok(result));
}

public Mono<List<String>> getValuesA() {
    Sinks.One<List<String>> sink = Sinks.one();
    queue.add(sink);
    if (queue.size() > 10) {
        webClient.get()
            .uri(uriA)
            .retrieve()
            .bodyToMono(new ParameterizedTypeReference<List<String>>() {})
            .subscribe(response -> {
                for (Sinks.One<List<String>> sinkItem : queue) {
                    sinkItem.tryEmitValue(response);
                }
            });
    }
    return sink.asMono();
}
// getValuesB is the same as A, but with uriB.
The problem in this code is the subscribe part: as soon as we subscribe to the WebClient's response, it blocks the thread. This only happens in 10% of the requests, but that is already too much for an endpoint that is called very frequently. What can I do to 'unblock' this part? And if using sinks wasn't the best choice, what would have been a better one?
PS. All pseudocode used is NOT production code. It may have many flaws and it is only meant to visualize the problem I'm facing at this moment.

Because of the high request frequency, I want to bundle requests to the backend-services. I thought using Sinks would be the right way to go.
You shouldn't need a sink to do this at all - assuming a Flux as input, you should be able to do this in 3 steps with a standard reactive chain:
1. Buffer the input with a length of 10, which transforms your Flux<Foo> into a Flux<List<Foo>> where each element is a list of size 10 (or smaller, if the flux completes with fewer than 10 elements remaining);
2. Flatmap to a zipped mono containing the original list, the "A" web service response for that list, and the "B" web service response for that list;
3. Implement a method (let's call it expand()) that takes the original list of 10 items, the A service response, and the B service response, and splits them back out into a flux of multiple items. Flatmap to this method.
The end result would be a reactive chain that looked something like:
input.buffer(10)
.flatMap(list -> Mono.zip(Mono.just(list), getResponseFromA(list), getResponseFromB(list)))
.flatMap(response -> expand(response.getT1(), response.getT2(), response.getT3()))
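Assuming stand-ins for the two service calls and a per-item expand() (all names here are illustrative, not from the original), the three steps can be sketched end to end:

```java
import java.util.List;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class BatchingSketch {

    // Stand-ins for the two backend calls; a real version would use WebClient.
    static Mono<String> getResponseFromA(List<String> batch) {
        return Mono.just("A(" + batch.size() + ")");
    }

    static Mono<String> getResponseFromB(List<String> batch) {
        return Mono.just("B(" + batch.size() + ")");
    }

    // Re-emit one result per original item, tagged with both batch responses.
    static Flux<String> expand(List<String> batch, String a, String b) {
        return Flux.fromIterable(batch).map(item -> item + ":" + a + ":" + b);
    }

    public static void main(String[] args) {
        List<String> results = Flux.just("r1", "r2", "r3", "r4", "r5", "r6",
                                         "r7", "r8", "r9", "r10", "r11")
            .buffer(10) // Flux<String> -> Flux<List<String>> with lists of size <= 10
            .flatMap(list -> Mono.zip(Mono.just(list),
                                      getResponseFromA(list),
                                      getResponseFromB(list)))
            .flatMap(t -> expand(t.getT1(), t.getT2(), t.getT3()))
            .collectList()
            .block();
        System.out.println(results.size()); // 11 items in, 11 items out
    }
}
```

Note that buffer(10) emits one batch of 10 and one batch of 1 here, so the two backends are called twice instead of eleven times.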

Related

doOnFirst element in Reactor Flux?

I would like to perform some action (a side effect) only on the first element emitted by the Flux.
Is there a way to do that?
Some context: I want to call .elapsed() on a Flux and log only the first elapsed time.
It turns out I can perform conditional logic using the .switchOnFirst operator.
So I have:
flux
    .elapsed()
    .switchOnFirst { signal, flux ->
        if (signal.hasValue()) {
            meterRegistry.timer("my.latency", tags).record(signal.get()!!.t1, TimeUnit.MILLISECONDS)
        }
        flux // returns the whole flux
    }
    .flatMap { Mono.just(it.t2) } // back to the original flux
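For a Java reader, roughly the same pattern looks like this (a minimal sketch without Micrometer; it just prints the first elapsed time):

```java
import java.util.List;
import reactor.core.publisher.Flux;

public class FirstElapsedSketch {
    public static void main(String[] args) {
        List<Integer> values = Flux.just(1, 2, 3)
            .elapsed() // Flux<Tuple2<Long, Integer>>: (millis since previous signal, value)
            .switchOnFirst((signal, flux) -> {
                if (signal.hasValue()) {
                    // Side effect runs once, on the first element only.
                    System.out.println("first elapsed ms: " + signal.get().getT1());
                }
                return flux; // pass the full flux through unchanged
            })
            .map(tuple -> tuple.getT2()) // drop the timing, back to the original values
            .collectList()
            .block();
        System.out.println(values); // [1, 2, 3]
    }
}
```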

return computed Mono from completing Flux

I am new to Spring WebFlux and have a problem with aggregating a Flux into a Mono.
ProductController has a method Flux<Product> get(List<UUID> ids) returning a stream of Products for a given list of ids. When all products have been fetched, the flux completes.
Aggregator fetches a list of products, computes a new ProductAggregateDTO from the stream, and sends it to an accountingService, which then processes it and assigns a UUID to the accounting process.
class Aggregator {
    Mono<UUID> process(List<UUID> ids) {
        ProductAggregateDTO adto = new ProductAggregateDTO();
        productAdapter.getProducts(ids)
            .doOnNext(e -> {
                adto.consume(e);
            })
            .doOnComplete(() -> {
                Mono<UUID> processId = accountAdapter.process(adto);
            })
            .subscribe();
    }
}
I want to return processId from the process function. I don't think that's a big problem, but I cannot find out how.
Thanks for your help!
Kind Regards,
Andreas
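For what it's worth, one way to shape this so the Mono<UUID> can be returned (a sketch with stubbed adapters, not the original services): build the DTO with reduce instead of mutating it in doOnNext, and chain the accounting call with flatMap rather than firing it from doOnComplete.

```java
import java.util.List;
import java.util.UUID;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class AggregatorSketch {

    // Minimal stand-in for the DTO from the question.
    static class ProductAggregateDTO {
        int count; // whatever aggregate state consume() builds up
        void consume(String product) { count++; }
    }

    // Stubs for productAdapter.getProducts(...) and accountAdapter.process(...).
    static Flux<String> getProducts(List<UUID> ids) {
        return Flux.fromIterable(ids).map(UUID::toString);
    }

    static Mono<UUID> process(ProductAggregateDTO dto) {
        return Mono.just(UUID.randomUUID());
    }

    // Return the Mono instead of subscribing here; the caller subscribes.
    static Mono<UUID> processIds(List<UUID> ids) {
        return getProducts(ids)
            .reduce(new ProductAggregateDTO(), (dto, p) -> { dto.consume(p); return dto; })
            .flatMap(AggregatorSketch::process);
    }

    public static void main(String[] args) {
        UUID id = processIds(List.of(UUID.randomUUID(), UUID.randomUUID())).block();
        System.out.println(id);
    }
}
```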

How can I wait until multiple WebClient Flux requests are finished?

I want to:
1. subscribe to multiple endpoints returning Flux and output the messages I receive;
2. wait until all messages have been output from all endpoints before continuing;
3. avoid processing messages from multiple endpoints "together" (e.g. Flux.zip), because the endpoints will return an uneven number of messages and are not logically connected to each other;
4. block forever if one or more endpoints generate an infinite number of messages.
The following code satisfies 1 and 3 but not 2 and 4:
Stream.of("http://service1.com", "http://service2.com", "http://service3.com")
    .forEach(service -> {
        webClient.get()
            .uri(service)
            .retrieve()
            .bodyToFlux(String.class)
            .map(message -> service + ": " + message)
            .subscribe(System.out::println);
    });
System.out.println("Received all messages");
The line "Received all messages" should not be printed until all endpoints have finished. However, because subscribe is asynchronous, that line is instead printed almost immediately and my application continues instead of waiting.
What should I do differently?
I believe the following code snippet achieves 3 out of the 4 points in your question, though I do not feel like I fully understand the 3rd requirement. Let me know if this example meets what is needed and, if not, what is missing.
Flux.just("http://service1.com", "http://service2.com", "http://service3.com")
    .flatMap(url -> webClient.get()
        .uri(url)
        .retrieve()
        .bodyToFlux(String.class)
        .map(body -> url + ":" + body))
    .collectList()
    .doOnNext(list -> LOG.info("Received all messages"))
    .subscribe(list -> LOG.info("" + list));
flatMap is one way to merge fluxes together, but you can also use Flux.merge:
List<Flux<String>> individualResults =
    Stream.of("http://service1.com", "http://service2.com", "http://service3.com")
        .map(service ->
            webClient
                .get()
                .uri(service)
                .retrieve()
                .bodyToFlux(String.class))
        .collect(toList());

Flux<String> mergedResults = Flux.merge(individualResults); // will not complete until all individual fluxes have completed
mergedResults
    .doOnNext(System.out::println)
    .then()
    .block(); // block this thread until mergedResults completes
System.out.println("Received all messages");

Java Loop until condition for webclient returning Mono

I have Java WebClient code whose response I convert to a Mono. I want to keep calling the API until the Mono response matches a certain condition. Of course I don't want to iterate to infinity: I want to retry every 5 seconds, up to 30 seconds. So far I have tried this:
client.get()
    .uri("https://someUri")
    .retrieve()
    .bodyToMono(Response.class)
    .delayElement(Duration.ofSeconds(5))
    .retryBackoff(5, Duration.ofSeconds(5))
    .delayUntil(r -> {
        System.out.print("Looping");
        if (condition) {
            System.out.print(r.getStatus());
            return Mono.just(r);
        }
        return Mono.empty();
    })
But no use.
You can use filter, repeatWhenEmpty, and Repeat, like so:
client.get()
    .uri("https://someUri")
    .retrieve()
    .bodyToMono(Response.class)
    .filter(response -> condition)
    .repeatWhenEmpty(Repeat.onlyIf(r -> true)
        .fixedBackoff(Duration.ofSeconds(5))
        .timeout(Duration.ofSeconds(30)))
The Repeat class is part of the reactor-extra library:
<dependency>
    <groupId>io.projectreactor.addons</groupId>
    <artifactId>reactor-extra</artifactId>
</dependency>
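If adding reactor-extra is not desirable, plain reactor-core can express a similar loop with repeatWhenEmpty: the companion flux delays each retry and take(...) caps the number of polls. A sketch with a stubbed call in place of the WebClient (short delays here to keep it fast; the real case would use Duration.ofSeconds(5) and 6 repeats for roughly 30 seconds):

```java
import java.time.Duration;
import java.util.concurrent.atomic.AtomicInteger;
import reactor.core.publisher.Mono;

public class PollUntilSketch {

    // Poll the given call until it returns "DONE": when the filter leaves the
    // Mono empty, resubscribe after `delay`, at most `maxRepeats` times.
    static String pollUntilDone(Mono<String> call, Duration delay, int maxRepeats) {
        return call
            .filter("DONE"::equals)                 // empty while the condition is not met
            .repeatWhenEmpty(companion -> companion
                .delayElements(delay)               // wait between polls
                .take(maxRepeats))                  // cap the repeats (6 * 5 s ~= 30 s)
            .block();                               // null if the cap is hit first
    }

    public static void main(String[] args) {
        AtomicInteger attempts = new AtomicInteger();
        // Stand-in for the WebClient call: the status becomes DONE on the 3rd attempt.
        Mono<String> call = Mono.fromSupplier(
            () -> attempts.incrementAndGet() >= 3 ? "DONE" : "PENDING");
        System.out.println(pollUntilDone(call, Duration.ofMillis(50), 6)
            + " after " + attempts.get() + " attempts");
    }
}
```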

Execute a parallel flux after another flux has ended

BTW, I'm still learning WebFlux.
I don't know if this is possible or if I have the wrong approach, but given this parallel flux,
Flux<String> enablers = Flux.fromIterable(enablersList)
    .parallel()
    .runOn(Schedulers.elastic())
    .flatMap(element -> service.getAMono(string, entity, element))
    .sequential();
which calls a method that makes a WebClient request (service.getAMono),
webClient.post()
    .uri(url)
    .headers(headers -> headers.addAll(httpHeaders))
    .body(BodyInserters.fromObject(request))
    .retrieve()
    .bodyToMono(entity2.class);
I need to wait until the enablers flux's flow ends and process all the responses inside it; the reason is that if one of them gives me an error or a negative response, I won't run this other parallel flux for the blockers:
Flux<String> blockers = Flux.fromIterable(blockersList)
    .parallel()
    .runOn(Schedulers.elastic())
    .flatMap(element -> service.callAMono(string, entity, element))
    .sequential();
I thought about the "zip" method, but it merges both responses, which is not what I want.
I'd be glad if anybody could help me with this.
UPDATE
The idea: handle the enablers' responses and, if there is an error, return a custom Mono<response> with .reduce; if there is no error in handling the enablers, proceed with .thenMany to the other flux.
I found a way to do it with a conditional any on the first flux, like this:
Flux.fromIterable(enablersList)
    .parallel()
    .runOn(Schedulers.elastic())
    .flatMap(element -> service.getAMono(string, entity, element))
    .sequential()
    .any(element -> *stuff here) // condition
    .flatMap(condition -> {
        if (condition.equals(Boolean.FALSE)) {
            return Flux.fromIterable(blockersList)
                .parallel()
                .runOn(Schedulers.elastic())
                .flatMap(element -> service.callAMono(string, entity, element))
                .sequential()
                .reduce(**stuff here); // handle the no-error response and return
        }
        return Mono.just(**stuff here); // handle the error response and return
    });
If there is another way to do this, I'll be glad if you post it here. Thanks! :D
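Another option, under the assumption that a positive response can be recognized with a predicate: all(...) collapses the enablers to a single Mono<Boolean>, filter(...) turns a failure into an empty Mono, and flatMapMany(...) runs the blockers only when every enabler succeeded. A self-contained sketch with stubbed service calls (parallel()/runOn omitted for brevity; they can be reinserted around each flatMap):

```java
import java.util.List;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class GateSketch {

    // Stand-ins for the two service calls from the question.
    static Mono<String> getAMono(String element) { return Mono.just("ok-" + element); }
    static Mono<String> callAMono(String element) { return Mono.just("blocked-" + element); }

    public static void main(String[] args) {
        List<String> enablersList = List.of("e1", "e2");
        List<String> blockersList = List.of("b1", "b2");

        List<String> result = Flux.fromIterable(enablersList)
            .flatMap(GateSketch::getAMono)
            .all(response -> response.startsWith("ok")) // Mono<Boolean>: true only if every response is positive
            .filter(Boolean::booleanValue)              // empty when any enabler failed
            .flatMapMany(ok -> Flux.fromIterable(blockersList)
                .flatMap(GateSketch::callAMono))        // only runs when all enablers succeeded
            .collectList()
            .block();

        System.out.println(result);
    }
}
```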
