Sometimes I want to peek at the value that is flowing through the stream.
I cannot just attach a debugger from my IDE, because I will see unresolved objects instead of values, and if I try .await().indefinitely() it will raise an exception.
So I'm looking for something like Java streams' peek(e -> System.out.println(e)), which would simply print the value.
I have something like below
public Uni<TenantDraft> getTenantById(@PathParam("tenantKey") String tenantKey) {
return tenantService.findByTenantKey(tenantKey)
.onItem().ifNotNull().transform(TenantMapper.INSTANCE::tenantToTenantDraft)
.onItem().ifNull().failWith(ForbiddenException::new);
}
You can use either:
.log(), which will log all the events, or
.invoke(item -> System.out.println(item)), which runs a side effect with the item without modifying it.
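For example, applied to the pipeline from the question (a sketch; you would typically pick one of the two operators, both are shown here for illustration):
public Uni<TenantDraft> getTenantById(@PathParam("tenantKey") String tenantKey) {
    return tenantService.findByTenantKey(tenantKey)
            .log()                                                      // logs all events (subscription, item, failure, ...)
            .invoke(tenant -> System.out.println("peeked: " + tenant))  // side effect only; the item passes through unchanged
            .onItem().ifNotNull().transform(TenantMapper.INSTANCE::tenantToTenantDraft)
            .onItem().ifNull().failWith(ForbiddenException::new);
}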
I'm trying to get my head around an issue I'm currently facing.
I have a function that returns an Optional of an object with a few properties.
One of the properties is a URL that might or might not be present. I extract that URL in order to make an HTTP request:
injectedClass.method(tenant.clientKey()).flatMap(optionalProperty ->
optionalProperty.ifPresentOrElse(fi -> {
Blocking.get(() -> httpClientProvider.withHttpClient((HttpClient httpClient) ->
httpClient.request(URI.create(optionalProperty.webTriggerUrl()), (RequestSpec spec) -> {
LogstashMarker markers = append("webTriggerUrl", fi.webTriggerUrl()).and(append("method", "Post").and(append("behaviour", objectMapper.writeValueAsString(baseDTO))));
logger.debug(markers, "Executed a Post request to something webTriggerUrl");
spec.method(HttpMethod.POST);
spec.getBody().type(HttpHeaderValues.APPLICATION_JSON).text(objectMapper.writeValueAsString(baseDTO), CharsetUtil.UTF_8);
final MutableHeaders headers = spec.getHeaders();
headers
.set(HttpHeaderNames.USER_AGENT, userAgent);
headers.set(CorrelationId.HEADER_NAME, correlationId.id());
})
)).then(resp -> logger.info("ok"));
}, () -> logger.error("something"))
Blocking.get returns a Promise, and I get an error in my code basically saying that the expected return type in ifPresentOrElse should be void and not Promise.
Is there a functional and better way to achieve this?
Yes, there are ways, but you also have to decide what to do if the Optional is empty. Currently you want to return a Promise if the Optional is present, and return nothing ("void") if it is empty. This doesn't work; the types of both branches need to be the same.
You can just use optionalProperty.map() to map your original Optional to an Optional<Promise>, and then use ifPresentOrElse to do something with either the Promise or the empty Optional, e.g. logging, as you seem to be doing in your case.
But you also have a higher-level flatMap, and I'm unclear which type it belongs to. Does it flatMap a Promise? Then you must also return a Promise from the other branch of the Optional, and you could use optionalProperty.map(...).orElse( <create empty Promise here> ).
Also check out orElseGet() instead of orElse() if you want to create the empty-branch value lazily (via a Supplier).
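A sketch of the first option (keeping ifPresentOrElse, but on an Optional<Promise>), assuming Ratpack's ratpack.exec.Promise and a hypothetical helper buildRequestPromise(fi) that wraps the HTTP call from the question:
optionalProperty
        .map(fi -> buildRequestPromise(fi))                         // Optional<Promise<String>>
        .ifPresentOrElse(
                promise -> promise.then(resp -> logger.info("ok")), // present: consume the Promise
                () -> logger.error("something"));                   // empty: just log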
ifPresentOrElse returns void. What you probably want is a combination of map and orElseGet:
optionalProperty.map(/* code to return a Promise */)
.orElseGet(() -> /* code to return a Promise that is immediately resolved */);
Inside the supplier to orElseGet() you can put your logger.error statement.
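A minimal sketch of that combination, reusing the assumed buildRequestPromise(fi) helper from above; both branches now produce a Promise, so the result can be returned from the surrounding flatMap:
Promise<String> result = optionalProperty
        .map(fi -> buildRequestPromise(fi))              // present: build the Promise for the HTTP call
        .orElseGet(() -> {
            logger.error("something");                   // empty: log the error ...
            return Promise.value("no webTriggerUrl");    // ... and return an already-resolved Promise
        });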
I'm quite new to WebFlux, and I want to do the following:
I want to make parallel HTTP requests to the same URL with different parameter values and stop when I get the first non-null (and non-exceptional) result.
I'm following the example from here: https://www.baeldung.com/spring-webclient-simultaneous-calls
but I have no idea how to stop once I've got the result. Can anybody help me?
Currently I have something like this:
public Mono<MyObject> getMyObject(String id) {
    RetrySpec retrySpec = Retry.max(3);
    return webClient.get().uri("/getMyObject/{id}", id)
            .retrieve()
            .bodyToMono(MyObject.class)
            .retryWhen(retrySpec);
}
public Flux<> getMyObjs(List<String> ids) {
return Flux.fromIterable(ids)
.parallel(Runtime.getRuntime().availableProcessors())
.runOn()
.flatMap(this::getMyObject)
.;//// Stop when I get first non exceptional value
}
Try the next() operator in Flux.
public Mono<MyObject> getMyObjs(List<String> ids) {
    return Flux.fromIterable(ids)
            .parallel(Runtime.getRuntime().availableProcessors())
            .runOn(Schedulers.parallel())  // pick a Scheduler to run the parallel rails on
            .flatMap(this::getMyObject)
            .sequential()                  // merge the parallel rails back into a single Flux
            .next();                       // Emit only the first item emitted by this Flux, into a new Mono. If called on an empty Flux, emits an empty Mono.
}
Reference: https://projectreactor.io/docs/core/release/api/reactor/core/publisher/Flux.html#next--
However, check out the firstWithSignal and firstWithValue operators as well:
https://projectreactor.io/docs/core/release/api/reactor/core/publisher/Flux.html#firstWithSignal-java.lang.Iterable-
https://projectreactor.io/docs/core/release/api/reactor/core/publisher/Flux.html#firstWithValue-java.lang.Iterable-
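For example, a minimal sketch using Mono.firstWithValue (Reactor 3.4+), assuming the getMyObject(String id) method from above; it races all the requests and emits the first one that produces a value (sources that fail or complete empty are ignored, unless all of them do):
public Mono<MyObject> getFirstAvailable(List<String> ids) { // hypothetical variant of getMyObjs
    return Mono.firstWithValue(
            ids.stream()
               .map(this::getMyObject)          // one Mono<MyObject> per id
               .collect(Collectors.toList()));
}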
When I get a problem like this, I normally check the documentation to find a suitable operator in the Flux API.
Sorry for the somewhat theoretical question, but I'd like to find a way to quickly read someone else's functional code by building templates for common method chains.
For example:
Case 1.
When I see the .peek method or .wireTap from Spring Integration being used, I primarily expect logging, triggering monitoring, or just running some transient external side effect, for instance:
.peek(params ->
log.info("creating cache configuration {} for key class \"{}\" and value class \"{}\"",
params.getName(), params.getKeyClass(), params.getValueClass()))
or
.peek(p ->
Try.run(() -> cacheService.cacheProfile(p))
.onFailure(ex ->
log.warn("Unable to cache profile: {}", ex.toString())))
or
.wireTap(sf -> sf.handle(msg -> {
    monitoring.profileRequestsReceived();
    log.trace("Client info request(s) received: {}", msg);
}))
Case 2.
When I see the .map method or .transform from Spring Integration being used, I understand that I'm going to get the result of someFunction(input), for instance:
.map(e -> GenerateTokenRs.builder().token(e.getKey()).phoneNum(e.getValue()).build())
or
.transform(Message.class, msg -> {
ErrorResponse response = (ErrorResponse) msg.getPayload();
MessageBuilder builder = /* some transforming */;
return builder.build();
})
Current case.
But I don't have such a common mental template for the .flatMap method.
Would you give me your opinion about this, please?
Add 1:
To Turamarth: I know the difference between the .map and .flatMap methods. I actively use both .map and .flatMap in my code.
But I'm asking the community for their experience and coding templates.
It always helps to study the signature/javadoc of the streamish methods to understand them:
The flatMap() operation has the effect of applying a one-to-many transformation to the elements of the stream, and then flattening the resulting elements into a new stream.
So, typical code I expect to see, or have written myself:
return someMap.values().stream().flatMap(Collection::stream);
The values of that map are sets, and I want to pull the entries of all these sets into a single stream for further processing here.
In other words: it is about "pulling out things", and getting them into a stream/collection for further processing.
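A self-contained sketch of that pattern (the map and its contents are made up for illustration):
import java.util.*;
import java.util.stream.Collectors;

Map<String, Set<String>> tagsByUser = Map.of(
        "alice", Set.of("admin", "dev"),
        "bob",   Set.of("dev"));

// flatMap "pulls out" the entries of every Set and flattens them into one stream
List<String> allTags = tagsByUser.values().stream()
        .flatMap(Collection::stream)
        .distinct()
        .sorted()
        .collect(Collectors.toList());   // [admin, dev]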
I've found one more use template for .flatMap.
Let's have a look at the following code:
String s = valuesFromDb
.map(v -> v.get(k))
.getOrElse("0");
where Option<Map<String, String>> valuesFromDb = Option.of(.....).
If there's an entry k=null in the map, then we'll get null as the result of the code above.
But we'd like to have "0" in this case as well.
So let's add .flatMap:
String s = valuesFromDb
.map(v -> v.get(k))
.flatMap(Option::of)
.getOrElse("0");
Regardless of the map containing null as the value, we will now get "0".
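The mechanism in isolation, assuming Vavr's Option (Option.of(null) is None, while map() happily wraps a null mapping result in Some):
import io.vavr.control.Option;

Option<String> some = Option.some((String) null);   // Some(null): what .map() produces when the mapper returns null
Option<String> none = some.flatMap(Option::of);     // None, because Option.of(null) == Option.none()
String s = none.getOrElse("0");                     // "0"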
I have some code that looks like this (simplified pseudo-code):
[...]
// stream constructed of series of web service calls
Stream<InputStream> slowExternalSources = StreamSupport.stream(spliterator, false);
[...]
then this
public Stream<String> getLines(Stream<InputStream> slowExternalSources) {
    return slowExternalSources.flatMap(is -> {
        BufferedReader reader = new BufferedReader(new InputStreamReader(is));
        return reader.lines().onClose(() -> { try { reader.close(); } catch (IOException e) { throw new UncheckedIOException(e); } });
    });
}
and later this
Stream<String> lineStream = getLines(slowExternalSources);
lineStream.parallel().forEach(line -> { /* ... do some fast CPU-intensive stuff here ... */ });
I've been struggling to make this code execute with some level of parallelisation.
Inspection with jps/jstack/jmc shows that all the InputStream reading is occurring in the main thread, not in parallel at all.
Possible culprits:
BufferedReader.lines() uses a Spliterator with parallel=false to construct the stream (source: see Java sources)
I think I read an article saying that flatMap does not interact well with parallel(), but I am not able to locate it right now.
How can I fix this code so that it runs in parallel?
I would like to retain the Java 8 Streams if possible, to avoid rewriting existing code that expects a Stream.
NOTE I added java.util.concurrent to the tags because I suspect it might be part of the answer, even though it's not part of the question.
I have an issue while processing a Flux that is built from a Stream.generate construct.
The Java stream fetches some data from a remote source, so I implemented a custom Supplier with the data-fetching logic embedded, and then used it to populate the stream:
Stream.generate(new SearchSupplier(...))
My idea is to detect an empty list and use the Java 9 takeWhile feature:
Stream.generate(new SearchSupplier(this, queryBody))
.takeWhile(either -> either.isRight() && either.get().nonEmpty())
(using Vavr's Either construct)
The repository layer Flux will then do:
return Flux.fromStream (
this.searchStream(...) //this is where the stream gets generated
)
.map(Either::get)
.flatMap(Flux::fromIterable);
The "service" layer is composed of some transformation steps on the flux, but the method signature is something like Flux<JsonObject> search(...).
Finally, the controller layer has a GetMapping:
@GetMapping(produces = "application/stream+json")
public Flux<JsonObject> search(...) {
return searchService.search(...) //this is the Flux<JsonObject> parth
.subscriberContext(...) //stuff I need available during processing
.doOnComplete(() -> log.debug("DONE"));
}
My problem is that the Flux seems to never terminate.
Doing a call from Postman, for example, just shows the 'Loading...' indicator in the response section. When I terminate the process from my IDE, the results are then flushed to Postman and I see what I'm expecting. Also, the doOnComplete lambda never gets called.
What I noticed is that if I change the source of the Flux to:
Flux.fromArray(...) // hardcoded array of lists of JSONs
then the doOnComplete lambda is called, the HTTP connection closes, and the results are displayed in Postman.
Any idea of what might be the issue?
Thanks.
You could create the Flux directly using code that looks like this. Note that I'm adding some assumed methods which you would need to implement based on how your SearchSupplier works:
Flux<SearchResultType> flux = Flux.generate(
() -> new SearchSupplier(this, queryBody),
(supplier, sink) -> {
SearchResultType current = supplier.next();
if (isNotLast(current)) {
sink.next(current);
} else {
sink.complete();
}
return supplier;
},
supplier -> anyCleanupOperations(supplier)
);
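With that in place, the repository layer could return this generated Flux in place of Flux.fromStream, keeping the rest of the chain from the question; because the generator calls sink.complete(), the Flux terminates on its own, so doOnComplete should fire and the HTTP response should close. A sketch, assuming SearchResultType is the Either from the question:
return flux
        .map(Either::get)             // unwrap the Vavr Either, as in the original chain
        .flatMap(Flux::fromIterable); // flatten each result list into individual JsonObjects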