I'm using a Feign client in reactive Java. The Feign client has an interceptor that sends a blocking request to fetch an auth token and adds it as a header to the Feign request.
The Feign request is wrapped in Mono.fromCallable with Schedulers.boundedElastic().
My question is: is the inner call that gets the auth token considered a blocking call?
I get that both calls will run on a thread from Schedulers.boundedElastic(), but I'm not sure whether it's OK to execute them on the same thread or whether I should change it so they run on different threads.
Feign client:
@FeignClient(name = "remoteRestClient", url = "${remote.url}",
             configuration = AuthConfiguration.class, decode404 = true)
@Profile({ "!test" })
public interface RemoteRestClient {

    @GetMapping(value = "/getSomeData")
    Data getData();
}
interceptor:
public class ClientRequestInterceptor implements RequestInterceptor {

    private IAPRequestBuilder iapRequestBuilder;
    private String clientName;

    public ClientRequestInterceptor(String clientName, String serviceAccount, String jwtClientId) {
        this.iapRequestBuilder = new IAPRequestBuilder(serviceAccount, jwtClientId);
        this.clientName = clientName;
    }

    @Override
    public void apply(RequestTemplate template) {
        try {
            HttpRequest httpRequest = iapRequestBuilder.buildIapRequest(); // <---- blocking call
            template.header(HttpHeaders.AUTHORIZATION, httpRequest.getHeaders().getAuthorization());
        } catch (IOException e) {
            log.error("Building an IAP request has failed: {}", e.getMessage(), e);
            throw new InterceptorException(String.format("failed to build IAP request for %s", clientName), e);
        }
    }
}
feign configuration:
public class AuthConfiguration {

    @Value("${serviceAccount}")
    private String serviceAccount;

    @Value("${jwtClientId}")
    private String jwtClientId;

    @Bean
    public ClientRequestInterceptor getClientRequestInterceptor() {
        return new ClientRequestInterceptor("Entitlement", serviceAccount, jwtClientId);
    }
}
and feign client call:
private Mono<Data> getData() {
    return Mono.fromCallable(() -> remoteRestClient.getData())
               .publishOn(Schedulers.boundedElastic());
}
You can sort of tell that it is a blocking call, since it returns a concrete class and not an asynchronous type (Mono or Flux). To be able to return a concrete class, the thread has to wait for the response before returning it.
So yes it is most likely a blocking call.
Reactor recommends that you use the subscribeOn operator when doing blocking calls; this will place the entire chain of operators on its own thread pool.
You have chosen to use publishOn, and it is worth pointing out the following from the docs:
affects where the subsequent operators execute
In practice this means that everything up until the publishOn operator is executed on whatever thread the subscription starts on, and all operators after it are executed on the specified thread pool.
private Mono<Data> getData() {
    return Mono.fromCallable(() -> remoteRestClient.getData())
               .publishOn(Schedulers.boundedElastic());
}
You have chosen to place it after the fromCallable, so the switch to the bounded-elastic pool only happens after the call to getData().
publishOn's placement in the chain matters, while subscribeOn affects the entire chain of operators, which means its placement does not matter.
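For comparison, a minimal sketch of the subscribeOn variant (same call as above, only the operator swapped); here the blocking Feign call, including the interceptor's token request, runs on the bounded-elastic pool:

private Mono<Data> getData() {
    return Mono.fromCallable(() -> remoteRestClient.getData())
               .subscribeOn(Schedulers.boundedElastic()); // the Callable and everything upstream run on this scheduler
}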
So to answer your question again: yes, it is most likely a blocking call (I can't confirm it 100% since I have not looked into the source code), and whether you solve it with publishOn or subscribeOn is up to you.
Or look into whether there is a reactive alternative library to use.
I have a Spring Boot application that will call several microservice URLs using the GET method. These microservice endpoints are all implemented as @RestControllers. They don't return Flux or Mono.
I need my application to capture which URLs are not returning 2xx HTTP status.
I'm currently using the following code to do this:
List<String> failedServiceUrls = new ArrayList<>();
for (String serviceUrl : serviceUrls.getServiceUrls()) {
    try {
        ResponseEntity<String> response = rest.getForEntity(serviceUrl, String.class);
        if (!response.getStatusCode().is2xxSuccessful()) {
            failedServiceUrls.add(serviceUrl);
        }
    } catch (Exception e) {
        failedServiceUrls.add(serviceUrl);
    }
}

// all checks are complete so send email with the failedServiceUrls.
mail.sendEmail("Service Check Complete", failedServiceUrls);
The problem is that each URL call is slow to respond and I have to wait for one URL call to complete prior to making the next one.
How can I change this so that the URL calls are made concurrently? After all calls have completed, I need to send an email with any URLs that had an error, collected in failedServiceUrls.
Update
I revised the above post to state that I just want the calls to be made concurrently. I don't care that the rest.getForEntity call blocks.
Using an ExecutorService, you can call all the microservices in parallel this way:
// synchronised, as per Maciej's comment (declared here so it is effectively final for the anonymous class):
List<String> failedServiceUrls = Collections.synchronizedList(new ArrayList<>());

ExecutorService executorService = Executors.newFixedThreadPool(serviceUrls.getServiceUrls().size());

List<Callable<String>> callables = serviceUrls.getServiceUrls().stream()
        .map(serviceUrl -> new Callable<String>() {
            @Override
            public String call() throws Exception {
                ResponseEntity<String> response = rest.getForEntity(serviceUrl, String.class);
                // do something with the response
                if (!response.getStatusCode().is2xxSuccessful()) {
                    failedServiceUrls.add(serviceUrl);
                }
                return response.getBody();
            }
        })
        .collect(toList()); // static import of java.util.stream.Collectors.toList

List<Future<String>> result = executorService.invokeAll(callables);
for (Future<String> f : result) {
    String resultFromService = f.get(); // blocks until the execution is over
}
If you just want to make the calls concurrently and you don't care about blocking threads, you can:
1. Wrap the blocking service call using Mono#fromCallable.
2. Transform serviceUrls.getServiceUrls() into a reactive stream using Flux#fromIterable.
3. Concurrently call and filter the failed services with Flux#filterWhen, using the Flux from step 2 and the asynchronous service call from step 1.
4. Wait for all calls to complete using Flux#collectList and send the email with the failed URLs in subscribe.
void sendFailedUrls() {
    Flux.fromIterable(serviceUrls.getServiceUrls())
        .filterWhen(url -> responseFailed(url))
        .collectList()
        .subscribe(failedUrls -> mail.sendEmail("Service Check Complete", failedUrls));
}
Mono<Boolean> responseFailed(String url) {
    return Mono.fromCallable(() -> rest.getForEntity(url, String.class))
               .map(response -> !response.getStatusCode().is2xxSuccessful())
               .onErrorReturn(true) // a thrown exception also counts as a failure, matching the original try/catch
               .subscribeOn(Schedulers.boundedElastic());
}
Blocking calls with Reactor
Since the underlying service call is blocking, it should be executed on a dedicated thread pool. The size of this thread pool should be equal to the number of concurrent calls if you want to achieve full concurrency. That's why we need .subscribeOn(Schedulers.boundedElastic()).
See: https://projectreactor.io/docs/core/release/reference/#faq.wrap-blocking
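If the URL list can be larger than boundedElastic's default thread cap, one option is a dedicated scheduler sized to the list. A minimal sketch, assuming the same serviceUrls and rest collaborators as above (the scheduler name and sizing are illustrative):

// Illustrative: a dedicated scheduler with one thread per URL, instead of the shared boundedElastic scheduler.
Scheduler serviceCheckScheduler = Schedulers.newBoundedElastic(
        serviceUrls.getServiceUrls().size(), // thread cap = number of concurrent calls
        Integer.MAX_VALUE,                   // queued task cap
        "service-check");

Mono<Boolean> responseFailed(String url) {
    return Mono.fromCallable(() -> rest.getForEntity(url, String.class))
               .map(response -> !response.getStatusCode().is2xxSuccessful())
               .onErrorReturn(true)
               .subscribeOn(serviceCheckScheduler);
}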
Better solution using WebClient
Note however, that blocking calls should be avoided when using reactor and spring webflux. The correct way to do this would be to replace RestTemplate with WebClient from Spring 5 which is fully non-blocking.
See: https://docs.spring.io/spring-boot/docs/2.0.3.RELEASE/reference/html/boot-features-webclient.html
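For illustration, a rough sketch of the same check using WebClient (this assumes a configured WebClient bean named webClient; both non-2xx statuses and connection errors count as failures):

void sendFailedUrls() {
    Flux.fromIterable(serviceUrls.getServiceUrls())
        .filterWhen(url -> webClient.get().uri(url)
                .retrieve()
                .toBodilessEntity()
                .map(response -> !response.getStatusCode().is2xxSuccessful())
                .onErrorReturn(true)) // 4xx/5xx and network errors surface as errors here
        .collectList()
        .subscribe(failedUrls -> mail.sendEmail("Service Check Complete", failedUrls));
}

No subscribeOn is needed in this variant, because WebClient does not block the calling thread.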
I have two Singles (getChannels and getEPGs), both running in parallel. In most cases getChannels completes before getEPGs and I can connect the EPGs to the channels. However, I would like to handle the cases where getEPGs completes before getChannels.
In other words:
1. Both Singles run in parallel.
2. To connect the EPGs, the channels must have been loaded.
3. If getEPGs completes before getChannels, it must wait for getChannels, and only then will a method be invoked.
4. If getEPGs fails, the app flow continues regardless.
How can I accomplish this without relying on callbacks and while loops? I guess there should be a reactive way to handle this case. Thanks in advance.
@GET
Single<ResponseBody> getChannels(@Url String url);

@Streaming
@GET
Single<ResponseBody> getEPGs(@Url String url);
getChannels [.............................]
getEPGs [..........................................]
Define your functions in a repository and access them through a ViewModel.
public interface Repository {

    @GET
    Single<ResponseBody> getChannels(@Url String url);

    @Streaming
    @GET
    Single<ResponseBody> getEPGs(@Url String url);
}
Now the ViewModel class:
public class SiteListViewModel extends BaseViewModel {

    private CompositeDisposable mDisposable = new CompositeDisposable();
    private Repository mRepository;

    public void getData() {
        mDisposable.add(
                mRepository.getChannels(channelsUrl) // channelsUrl: placeholder for your channels endpoint
                        .subscribeOn(Schedulers.io())
                        .observeOn(AndroidSchedulers.mainThread())
                        .subscribe(channels -> {
                            // CALL THE SECOND FUNCTION (getEPGs) OVER HERE
                        }, throwable -> Log.e("Error", throwable.getMessage(), throwable))
        );
    }
}
2. To connect the EPGs, the channels must have been loaded.
Okay,
ResponseBody channels = getChannels.blockingGet();
3. If getEPGs completes before getChannels, it must wait for getChannels, and only then will a method be invoked.
Okay, wait for channels and call a method:
ResponseBody channels = getChannels.blockingGet();
aMethod(channels);
4. If getEPGs fails, the app flow continues regardless.
The above statements indeed ignore the result of getEPGs().
So strictly speaking, all your conditions can be expressed simply as: wait for getChannels() and ignore getEPGs(). Those conditions are fully satisfied by the proposed two lines of code.
I'm working on a backend Spring Boot project which is called by multiple clients. One of the functionalities is to merge data from two different databases and return the result, which may take up to 2 minutes.
I would like to be able to make concurrent calls to this endpoint wait for an already running request and return the same result without running the query again.
As shown below, I've tried to set up a CompletableFuture field in the service singleton bean (which I know is a code smell, since singleton service beans should be stateless).
// RestController
@Async
@GetMapping
public CompletableFuture<List<Foo>> getSyncedFoo() {
    return service.syncFoo();
}

// ServiceImpl
private CompletableFuture<List<Foo>> syncTask;

@Override
@Async
@Transactional
public CompletableFuture<List<Foo>> syncFoo() {
    if (this.syncTask == null || this.syncTask.isDone()) {
        this.syncTask = CompletableFuture.supplyAsync(() -> {
            // long running task
            return new ArrayList<>();
        });
    }
    return this.syncTask;
}
I expected multiple frontend clients calling the api endpoint to receive the same response at roughly the same time, resulting in the backend performing the long running operation just once.
The operation was in fact executed just once but one of the clients received a 503 (Service Unavailable) while the other client received the expected response.
I suspect it's due to the shared use of the CompletableFuture, but I'm at a loss on what approach I should take. Could RxJava be of any use with the Observable strategy?
I've found a functional answer, for now.
@Service
public class FooServiceImpl implements FooService {

    private CompletableFuture<List<Foo>> syncFuture;
    private Observable<List<Foo>> syncObservable;

    @Override
    public Single<List<Foo>> syncFoo() {
        if (syncFuture == null || syncFuture.isDone()) {
            syncFuture = syncFooAsync();
            syncObservable = Observable.fromFuture(syncFuture).share();
        }
        return Single.fromObservable(syncObservable);
    }

    private CompletableFuture<List<Foo>> syncFooAsync() {
        return CompletableFuture.supplyAsync(() -> {
            try {
                Thread.sleep(10_000); // stand-in for the long-running merge
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
            return new ArrayList<>();
        });
    }
}
By using the RxJava library it is possible to multicast the created Observable to multiple listeners using the Observable::share method, and the @RestController will happily work with the returned Single(s).
Sadly it still keeps state in a singleton that is accessed concurrently by multiple threads, so I fear concurrency issues such as the Observable completing while a new request is still in the process of creating a new subscription.
Hence I do not recommend this as a best practice, and I'm not marking this as the final answer.
I am working on a library which takes a DataRequest object as an input parameter. Based on that object, I construct a URL, make a call to our app servers using the Apache HTTP client, and return the response to the customer who is using our library. Some customers will call the executeSync method and some will call the executeAsync method to get the data.
executeSync() - waits until I have a result, returns the result.
executeAsync() - returns a Future immediately which can be processed after other things are done, if needed.
Below is my DataClient class, which has the above two methods:
public class DataClient implements Client {

    private final ForkJoinPool forkJoinPool = new ForkJoinPool(16);
    private CloseableHttpClient httpClientBuilder;

    // initializing httpclient only once
    public DataClient() {
        try {
            RequestConfig requestConfig =
                    RequestConfig.custom().setConnectionRequestTimeout(500).setConnectTimeout(500)
                            .setSocketTimeout(500).setStaleConnectionCheckEnabled(false).build();
            SocketConfig socketConfig =
                    SocketConfig.custom().setSoKeepAlive(true).setTcpNoDelay(true).build();
            PoolingHttpClientConnectionManager poolingHttpClientConnectionManager =
                    new PoolingHttpClientConnectionManager();
            poolingHttpClientConnectionManager.setMaxTotal(300);
            poolingHttpClientConnectionManager.setDefaultMaxPerRoute(200);
            httpClientBuilder =
                    HttpClientBuilder.create().setConnectionManager(poolingHttpClientConnectionManager)
                            .setDefaultRequestConfig(requestConfig).setDefaultSocketConfig(socketConfig).build();
        } catch (Exception ex) {
            // log error
        }
    }

    @Override
    public List<DataResponse> executeSync(DataRequest key) {
        List<DataResponse> responseList = null;
        Future<List<DataResponse>> responseFuture = null;
        try {
            responseFuture = executeAsync(key);
            responseList = responseFuture.get(key.getTimeout(), key.getTimeoutUnit());
        } catch (TimeoutException | ExecutionException | InterruptedException ex) {
            responseList =
                    Collections.singletonList(new DataResponse(DataErrorEnum.CLIENT_TIMEOUT,
                            DataStatusEnum.ERROR));
            responseFuture.cancel(true);
            // logging exception here
        }
        return responseList;
    }

    @Override
    public Future<List<DataResponse>> executeAsync(DataRequest key) {
        DataFetcherTask task = new DataFetcherTask(key, this.httpClientBuilder);
        return this.forkJoinPool.submit(task);
    }
}
Below is my DataFetcherTask class, which also has a static class DataRequestTask that calls our app servers by building a URL:
public class DataFetcherTask extends RecursiveTask<List<DataResponse>> {

    private final DataRequest key;
    private final CloseableHttpClient httpClientBuilder;

    public DataFetcherTask(DataRequest key, CloseableHttpClient httpClientBuilder) {
        this.key = key;
        this.httpClientBuilder = httpClientBuilder;
    }

    @Override
    protected List<DataResponse> compute() {
        // Create subtasks for the key and invoke them
        List<DataRequestTask> requestTasks = requestTasks(generateKeys());
        invokeAll(requestTasks);

        // All tasks are finished if invokeAll() returns.
        List<DataResponse> responseList = new ArrayList<>(requestTasks.size());
        for (DataRequestTask task : requestTasks) {
            try {
                responseList.add(task.get());
            } catch (InterruptedException | ExecutionException e) {
                Thread.currentThread().interrupt();
                return Collections.emptyList();
            }
        }
        return responseList;
    }

    private List<DataRequestTask> requestTasks(List<DataRequest> keys) {
        List<DataRequestTask> tasks = new ArrayList<>(keys.size());
        for (DataRequest key : keys) {
            // the client is passed in because a static nested class cannot see the outer instance field
            tasks.add(new DataRequestTask(key, httpClientBuilder));
        }
        return tasks;
    }

    // In this method I am making an HTTP call to another service
    // and then I will build the List<DataRequest> accordingly.
    private List<DataRequest> generateKeys() {
        List<DataRequest> keys = new ArrayList<>();
        // use the key object which is passed in the constructor to make an HTTP call to another service
        // and then build the list of DataRequest objects and return keys.
        return keys;
    }

    /** Inner class for the subtasks. */
    private static class DataRequestTask extends RecursiveTask<DataResponse> {

        private final DataRequest request;
        private final CloseableHttpClient httpClientBuilder;

        public DataRequestTask(DataRequest request, CloseableHttpClient httpClientBuilder) {
            this.request = request;
            this.httpClientBuilder = httpClientBuilder;
        }

        @Override
        protected DataResponse compute() {
            return performDataRequest(this.request);
        }

        private DataResponse performDataRequest(DataRequest key) {
            MappingHolder mappings = DataMapping.getMappings(key.getType());
            List<String> hostnames = mappings.getAllHostnames(key);
            for (String hostname : hostnames) {
                String url = generateUrl(hostname);
                HttpGet httpGet = new HttpGet(url);
                httpGet.setConfig(generateRequestConfig());
                httpGet.addHeader(key.getHeader());
                try (CloseableHttpResponse response = httpClientBuilder.execute(httpGet)) {
                    HttpEntity entity = response.getEntity();
                    String responseBody =
                            TestUtils.isEmpty(entity) ? null : IOUtils.toString(entity.getContent(),
                                    StandardCharsets.UTF_8);
                    return new DataResponse(responseBody, DataErrorEnum.OK, DataStatusEnum.OK);
                } catch (IOException ex) {
                    // log error
                }
            }
            return new DataResponse(DataErrorEnum.SERVERS_DOWN, DataStatusEnum.ERROR);
        }
    }
}
For each DataRequest object there is a DataResponse object. Once someone calls our library with a DataRequest object, we internally build a List<DataRequest>, invoke each DataRequest in parallel, and return a List<DataResponse> in which each DataResponse holds the response for the corresponding DataRequest.
Below is the flow:
The customer calls the DataClient class with a DataRequest object. They can call the executeSync() or executeAsync() method depending on their requirements.
In the DataFetcherTask class (a RecursiveTask, one of ForkJoinTask's subtypes), given a key object which is a single DataRequest, I generate a List<DataRequest> and then invoke a subtask in parallel for each DataRequest object in the list. These subtasks are executed in the same ForkJoinPool as the parent task.
In the DataRequestTask class, I execute each DataRequest by building a URL and returning its DataResponse.
Problem Statement:
Since this library is called in a very high-throughput environment, it has to be very fast. For the synchronous call, is executing in a separate thread OK here? It incurs the extra cost and resources of a thread, along with the cost of context switching, so I am a little bit confused. Also, I am using a ForkJoinPool here, which saves me from creating an extra thread pool, but is it the right choice?
Is there a better and more efficient way to do the same thing? I am using Java 7 and have access to the Guava library as well, so if that can simplify anything I am open to it.
It looks like we are seeing some contention under very heavy load. Is there any way this code can run into thread contention under very heavy load?
I think in your situation it's better to use an async HTTP call; see: HttpAsyncClient. Then you don't need to use a thread pool.
In the executeAsync method, create an empty CompletableFuture<DataResponse> and pass it to the client call; in the callback, set the result of the CompletableFuture by calling complete on it (or completeExceptionally if an exception is raised).
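A minimal sketch of that idea, assuming Apache HttpAsyncClient 4.x and Java 8 (the asyncClient field and the single-request signature are illustrative; the DataResponse constructor follows the question's code):

// asyncClient is an assumed CloseableHttpAsyncClient, created and started once;
// uses org.apache.http.concurrent.FutureCallback and org.apache.http.util.EntityUtils.
public CompletableFuture<DataResponse> executeAsync(HttpGet httpGet) {
    CompletableFuture<DataResponse> future = new CompletableFuture<>();
    asyncClient.execute(httpGet, new FutureCallback<HttpResponse>() {
        @Override
        public void completed(HttpResponse response) {
            try {
                String body = EntityUtils.toString(response.getEntity(), StandardCharsets.UTF_8);
                future.complete(new DataResponse(body, DataErrorEnum.OK, DataStatusEnum.OK));
            } catch (IOException ex) {
                future.completeExceptionally(ex);
            }
        }

        @Override
        public void failed(Exception ex) {
            future.completeExceptionally(ex);
        }

        @Override
        public void cancelled() {
            future.cancel(true);
        }
    });
    return future;
}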
The executeSync method implementation looks good.
edit:
For Java 7, you only need to replace the CompletableFuture with a promise implementation from Guava, like SettableFuture/ListenableFuture or anything similar.
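A short sketch of the same completion calls with Guava, for the Java 7 case (names are illustrative):

// Guava equivalent of the CompletableFuture completion calls (works on Java 7):
final SettableFuture<DataResponse> future = SettableFuture.create();
// inside the HttpAsyncClient callback:
//   future.set(dataResponse);         // instead of future.complete(...)
//   future.setException(exception);   // instead of future.completeExceptionally(...)
// callers receive it as a ListenableFuture<DataResponse>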
The choice to use the ForkJoinPool is correct; it's designed for efficiency with many small tasks:
A ForkJoinPool differs from other kinds of ExecutorService mainly by virtue of employing work-stealing: all threads in the pool attempt to find and execute tasks submitted to the pool and/or created by other active tasks (eventually blocking waiting for work if none exist). This enables efficient processing when most tasks spawn other subtasks (as do most ForkJoinTasks), as well as when many small tasks are submitted to the pool from external clients. Especially when setting asyncMode to true in constructors, ForkJoinPools may also be appropriate for use with event-style tasks that are never joined.
I suggest trying asyncMode = true in the constructor, since in your case the tasks are never joined:
public class DataClient implements Client {
    private final ForkJoinPool forkJoinPool =
            new ForkJoinPool(16, ForkJoinPool.defaultForkJoinWorkerThreadFactory, null, true);
    ...
}
For executeSync() you can use forkJoinPool.invoke(task); this is the managed way to execute a task synchronously in the pool, optimising resources:
@Override
public List<DataResponse> executeSync(DataRequest key) {
    DataFetcherTask task = new DataFetcherTask(key, this.httpClientBuilder);
    return this.forkJoinPool.invoke(task);
}
If you can use Java 8 then there is a common pool already optimised: ForkJoinPool.commonPool()
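For illustration, the same executeSync() on the common pool (keep in mind that blocking HTTP calls can starve the shared pool, so a dedicated pool may still be preferable):

// Java 8: reuse the shared common pool instead of creating a dedicated one.
@Override
public List<DataResponse> executeSync(DataRequest key) {
    DataFetcherTask task = new DataFetcherTask(key, this.httpClientBuilder);
    return ForkJoinPool.commonPool().invoke(task);
}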
I want to read a message at a specific position in a class other than the InboundHandler. I can't find a way to read it except in the channelRead0 method, which is called by the Netty framework.
For example:
context.writeMessage("message");
String msg = context.readMessage();
If this is not possible, how can I map a result that I get in the channelRead0 method to a specific call I made in another class?
The Netty framework is designed to be asynchronously driven. Thanks to this model, it can handle a large number of connections with minimal thread usage. If you are creating an API that uses the Netty framework to dispatch calls to a remote location, you should use the same model for your calls.
Instead of making your API return the value directly, make it return a Future<?> or a Promise<?>. There are different ways of implementing this in your application; the simplest is to create a custom handler that maps the incoming responses to the pending Promises in a FIFO queue.
An example of this could be the following:
This is heavily based on this answer that I submitted in the past.
We start with our handler, which matches each incoming response to the next queued Promise in our pipeline:
public class MyLastHandler extends SimpleChannelInboundHandler<String> {

    private final SynchronousQueue<Promise<String>> queue;

    public MyLastHandler(SynchronousQueue<Promise<String>> queue) {
        super();
        this.queue = queue;
    }

    // The following is called messageReceived(ChannelHandlerContext, String) in 5.0.
    @Override
    public void channelRead0(ChannelHandlerContext ctx, String msg) {
        this.queue.remove().setSuccess(msg);
        // Or setFailure(Throwable)
    }
}
We then need to have a method of sending the commands to a remote server:
Channel channel = ....;
SynchronousQueue<Promise<String>> queue = ....;

public Future<String> sendCommandAsync(String command) {
    return sendCommandAsync(command, new DefaultPromise<>(channel.eventLoop()));
}

public Future<String> sendCommandAsync(String command, Promise<String> promise) {
    synchronized (channel) {
        queue.offer(promise);
        channel.write(command);
    }
    channel.flush();
    return promise;
}
Having defined our methods, we need a way to call them:

sendCommandAsync("USER anonymous",
        new DefaultPromise<String>(channel.eventLoop()).addListener(
                (Future<String> f) -> {
                    String response = f.get();
                    if (response.startsWith("331")) {
                        // do something
                    }
                    // etc
                }
        )
);
If the caller would like to use our API as a blocking call, they can also do that:

String response = sendCommandAsync("USER anonymous").get();
if (response.startsWith("331")) {
    // do something
}
// etc

Notice that Future.get() can throw an InterruptedException if the thread is interrupted, unlike a socket read operation, which can only be cancelled by some interaction on the socket. This exception should not be a problem in the FutureListener.