I am new to Java threads, and I am trying to learn how the CompletableFuture API works. When I run the code below, I get the thread-name output shown below. supplyAsync and thenApplyAsync seem to be using the same thread, ForkJoinPool.commonPool-worker-1. My understanding was that if I use thenApplyAsync, it runs on a different thread than supplyAsync. Can you tell me what is going on here? Thanks!
Code:
public static void main(String[] args) throws InterruptedException, ExecutionException {
    System.out.println("Current Thread : " + Thread.currentThread().getName());
    CompletableFuture<String> future = CompletableFuture.supplyAsync(() -> {
        System.out.println("Current Thread (SupplyAsync) : " + Thread.currentThread().getName());
        try {
            TimeUnit.SECONDS.sleep(1);
        } catch (InterruptedException ex) {
            throw new IllegalStateException(ex);
        }
        return "Result";
    }).thenApplyAsync(result -> {
        System.out.println("Current Thread (ThenApplyAsync) : " + Thread.currentThread().getName());
        return result.toUpperCase();
    });
    System.out.println("CompletableFuture Result : " + future.get());
}
Output:
Current Thread : main
Current Thread (SupplyAsync) : ForkJoinPool.commonPool-worker-1
Current Thread (ThenApplyAsync) : ForkJoinPool.commonPool-worker-1
CompletableFuture Result : RESULT
You are wrong to assume that thenApplyAsync will use a different thread than the previous completion stage.
<U> CompletableFuture<U> thenApplyAsync(Function<? super T,? extends U> fn)
Returns a new CompletionStage that, when this stage completes normally, is executed using this stage's default asynchronous execution facility, with this stage's result as the argument to the supplied function.
It uses the same execution facility as the previous stage, i.e. ForkJoinPool.commonPool(). Beyond that, there is no guarantee about which thread in the pool the function runs on.
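If you need the continuation to run on a specific pool rather than the default execution facility, you can pass an Executor to thenApplyAsync explicitly. A minimal sketch (the pool and its size are arbitrary, and the usual java.util.concurrent imports are assumed):
ExecutorService pool = Executors.newFixedThreadPool(2);

CompletableFuture<String> future = CompletableFuture.supplyAsync(() -> {
    System.out.println("SupplyAsync on : " + Thread.currentThread().getName());
    return "Result";
}).thenApplyAsync(result -> {
    // two-argument overload: this stage runs on the supplied executor, not the common pool
    System.out.println("ThenApplyAsync on : " + Thread.currentThread().getName());
    return result.toUpperCase();
}, pool);

System.out.println(future.join());
pool.shutdown();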
Here's a short code version of the problem I'm facing:
public static void main(String[] args) {
    CompletableFuture.supplyAsync(() -> {
        /*
        try {
            Thread.sleep(2000);
        } catch (InterruptedException ignored) {}
        */
        //System.out.println("supplyAsync: " + Thread.currentThread().getName());
        return 1;
    })
    .thenApply(i -> {
        System.out.println("apply: " + Thread.currentThread().getName());
        return i + 1;
    })
    .thenAccept((i) -> {
        System.out.println("accept: " + Thread.currentThread().getName());
        System.out.println("result: " + i);
    }).join();
}
This is the output that I get:
apply: main
accept: main
result: 2
I'm surprised to see main there! I expected something like the following, which is what happens when I uncomment the Thread.sleep() call, or even just the single println statement:
supplyAsync: ForkJoinPool.commonPool-worker-1
apply: ForkJoinPool.commonPool-worker-1
accept: ForkJoinPool.commonPool-worker-1
result: 2
I understand that thenApplyAsync() will make sure it won't run on the main thread, but I want to avoid handing the data returned by the supplier from the thread that ran supplyAsync over to a different thread that runs thenApply and the other subsequent thens in the chain.
The method thenApply evaluates the function in the caller's thread because the future has already been completed. Of course, when you insert a sleep into the supplier, the future has not been completed by the time thenApply is called. Even a print statement might slow down the supplier enough for the main thread to invoke thenApply and thenAccept first. But this is not reliable behavior; you may get different results when running the code repeatedly.
Not only does the future not remember which thread completed it, there is also no way to tell an arbitrary thread to execute particular code. The thread might be busy with something else, be entirely uncooperative, or even have terminated in the meantime.
Just consider
ExecutorService s = Executors.newSingleThreadExecutor();
CompletableFuture<Integer> cf = CompletableFuture.supplyAsync(() -> {
    System.out.println("supplyAsync: " + Thread.currentThread().getName());
    return 1;
}, s);
s.shutdown();
s.awaitTermination(1, TimeUnit.DAYS);
cf.thenApply(i -> {
    System.out.println("apply: " + Thread.currentThread().getName());
    return i + 1;
})
.thenAccept((i) -> {
    System.out.println("accept: " + Thread.currentThread().getName());
    System.out.println("result: " + i);
}).join();
How could we expect the functions passed to thenApply and thenAccept to be executed in the already terminated pool’s worker thread?
We could also write
CompletableFuture<Integer> cf = new CompletableFuture<>();
Thread t = new Thread(() -> {
    System.out.println("completing: " + Thread.currentThread().getName());
    cf.complete(1);
});
t.start();
t.join();
System.out.println("completer: " + t.getName() + " " + t.getState());
cf.thenApply(i -> {
    System.out.println("apply: " + Thread.currentThread().getName());
    return i + 1;
})
.thenAccept((i) -> {
    System.out.println("accept: " + Thread.currentThread().getName());
    System.out.println("result: " + i);
}).join();
which will print something like
completing: Thread-0
completer: Thread-0 TERMINATED
apply: main
accept: main
result: 2
Obviously, we can’t insist on this thread processing the subsequent stages.
But even when the thread is a still-alive worker thread of a pool, it doesn't know that it has completed a future, nor does it have any notion of “processing subsequent stages”. Following the Executor abstraction, it has just received an arbitrary Runnable from the queue, and after processing it, it proceeds with its main loop, fetching the next Runnable from the queue.
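To make that concrete, here is a conceptual sketch of such a worker loop (not the actual ForkJoinPool code, just an illustration): the worker only ever sees opaque Runnables and has no idea whether one of them completed a future.
class WorkerLoop implements Runnable {
    private final BlockingQueue<Runnable> queue = new LinkedBlockingQueue<>();

    void submit(Runnable task) {
        queue.add(task); // dependent actions arrive here as plain Runnables
    }

    @Override
    public void run() {
        while (!Thread.currentThread().isInterrupted()) {
            try {
                Runnable task = queue.take(); // fetch the next arbitrary task
                task.run();                   // run it; no notion of futures or "stages"
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }
}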
So once the first future has been completed, the only way to make it do the work of completing other futures is by enqueuing the tasks. This is what happens when you use thenApplyAsync specifying the same pool, or perform all actions with the …Async methods without an executor, i.e. using the default pool.
When you use a single-threaded executor for all …Async methods, you can be sure that all actions are executed by the same thread, but they will still go through the pool's queue. And since even then it is the main thread that actually enqueues the dependent actions when the future is already completed, a thread-safe queue, and hence synchronization overhead, is unavoidable.
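A minimal sketch of that setup, assuming one single-threaded executor passed to every …Async method: every stage runs on the same worker thread, yet each one is still handed over through the pool's queue.
ExecutorService single = Executors.newSingleThreadExecutor();

CompletableFuture.supplyAsync(() -> 1, single)
    .thenApplyAsync(i -> {
        System.out.println("apply: " + Thread.currentThread().getName());
        return i + 1;
    }, single)
    .thenAcceptAsync(i ->
        System.out.println("accept: " + Thread.currentThread().getName() + ", result: " + i), single)
    .join();

single.shutdown();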
But note that even if you manage to create the chain of dependent actions first, before a single worker thread processes them all sequentially, this overhead is still there. Each future’s completion is done by storing the new state in a thread safe way, making the result potentially visible to all other threads, and atomically checking whether a concurrent completion (e.g. a cancelation) has happened in the meanwhile. Then, the dependent action(s) chained by other threads will be fetched, of course, in a thread safe way, before they are executed.
All these actions with synchronization semantics make it unlikely that there are benefits of processing the data by the same thread when having a chain of dependent CompletableFutures.
The only way to get actual local processing, potentially with performance benefits, is by using
CompletableFuture.runAsync(() -> {
    System.out.println("supplyAsync: " + Thread.currentThread().getName());
    int i = 1;
    System.out.println("apply: " + Thread.currentThread().getName());
    i = i + 1;
    System.out.println("accept: " + Thread.currentThread().getName());
    System.out.println("result: " + i);
}).join();
Or, in other words, if you don’t want detached processing, don’t create detached processing stages in the first place.
I am making multiple async calls to my database. I store all those async calls on a List<CompletableFuture<X>> list. I want to collect all the results together, so I need to wait for all of those calls to complete.
One way is to create a CompletableFuture.allOf(list.toArray(...))...
Another way is to use: list.stream().map(cf -> cf.join())...
I was just wondering if there are any advantages of creating the global CompletableFuture and waiting for it to complete (when all the individual CompletableFutures complete) over directly waiting for the individual CompletableFutures to complete.
The main thread gets blocked either way.
static CompletableFuture<Void> getFailingCF() {
    return CompletableFuture.runAsync(() -> {
        System.out.println("getFailingCF :: Started getFailingCF.. ");
        throw new RuntimeException("getFailingCF:: Failed");
    });
}

static CompletableFuture<Void> getOkCF() {
    return CompletableFuture.runAsync(() -> {
        System.out.println("getOkCF :: Started getOkCF.. ");
        LockSupport.parkNanos(TimeUnit.SECONDS.toNanos(3));
        System.out.println("getOkCF :: Completed getOkCF.. ");
    });
}

public static void main(String[] args) {
    List<CompletableFuture<Void>> futures = new ArrayList<>();
    futures.add(getFailingCF());
    futures.add(getOkCF());

    // using CompletableFuture.allOf
    var allOfCF = CompletableFuture.allOf(futures.toArray(new CompletableFuture[0]));
    allOfCF.join();

    // invoking join on individual CF
    futures.stream()
            .map(CompletableFuture::join)
            .collect(Collectors.toList());
}
In the code snippet above, the difference lies in exception handling: CompletableFuture.allOf(..) wraps any exception thrown by any of the CompletableFutures while allowing the rest of the threads (those executing the other CompletableFutures) to continue their execution.
The list.stream().map(cf -> cf.join())... way throws the exception immediately and terminates the app (and all threads executing the CFs in the list).
Note that invoking join() on the allOf future throws the wrapped exception too, and will also terminate the app. But by that time, unlike the list.stream().map(cf -> cf.join())... way, the rest of the threads have completed their processing.
allOfCF.whenComplete(..) is one of the graceful ways to handle the execution result (normal or exceptional) of all the CFs:
allOfCF.whenComplete((v, ex) -> {
    System.out.println("In whenComplete...");
    System.out.println("----------- Exception Status ------------");
    System.out.println(" 1: " + futures.get(0).isCompletedExceptionally());
    System.out.println(" 2: " + futures.get(1).isCompletedExceptionally());
});
With the list.stream().map(cf -> cf.join())... way, one needs to wrap the join() call in a try/catch, for example:
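A minimal sketch of that try/catch, reusing the futures list from the snippet above; join() reports failures as unchecked exceptions, so only CompletionException (and possibly CancellationException) needs to be caught:
for (CompletableFuture<Void> cf : futures) {
    try {
        cf.join();
    } catch (CompletionException | CancellationException ex) {
        System.out.println("CF failed: " + ex);
    }
}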
I'm starting to be comfortable with Java CompletableFuture composition, having worked with JavaScript promises. Basically the composition just schedules the chained commands on the indicated executor. But I'm unsure of which thread is running when the composition is performed.
Let's say I have two executors, executor1 and executor2; for simplicity let's say they are separate thread pools. I schedule a CompletableFuture (to use a very loose description):
CompletableFuture<Foo> futureFoo = CompletableFuture.supplyAsync(this::getFoo, executor1);
Then when that is done I transform the Foo to Bar using the second executor:
CompletableFuture<Bar> futureBar = futureFoo.thenApplyAsync(this::fooToBar, executor2);
I understand that getFoo() will be called from a thread in the executor1 thread pool. I understand that fooToBar() will be called from a thread in the executor2 thread pool.
But what thread is used for the actual composition, i.e. after getFoo() finishes and futureFoo() is complete; but before the fooToBar() command gets scheduled on executor2? In other words, what thread actually runs the code to schedule the second command on the second executor?
Is the scheduling performed as part of the same thread in executor1 that called getFoo()? If so, would this completable future composition be equivalent to my simply scheduling fooToBar() manually myself in the first command in the executor1 task?
This is intentionally unspecified. In practice, it will be handled by the same code that also handles the chained operations when the variants without the Async suffix are invoked and exhibits similar behavior.
So when we use the following test code
CompletableFuture.supplyAsync(() -> {
        LockSupport.parkNanos(TimeUnit.SECONDS.toNanos(1));
        return "";
    }, r -> new Thread(r, "A").start())
    .thenAcceptAsync(s -> {}, r -> {
        System.out.println("scheduled by " + Thread.currentThread());
        new Thread(r, "B").start();
    });
it will likely print
scheduled by Thread[A,5,main]
as the thread that completed the previous stage was used to schedule the depending action.
However when we use
CompletableFuture<String> first = CompletableFuture.supplyAsync(() -> "",
    r -> new Thread(r, "A").start());
LockSupport.parkNanos(TimeUnit.SECONDS.toNanos(1));
first.thenAcceptAsync(s -> {}, r -> {
    System.out.println("scheduled by " + Thread.currentThread());
    new Thread(r, "B").start();
});
it will likely print
scheduled by Thread[main,5,main]
as by the time the main thread invokes thenAcceptAsync, the first future is already completed and the main thread will schedule the action itself.
But that is not the end of the story. When we use
CompletableFuture<String> first = CompletableFuture.supplyAsync(() -> {
    LockSupport.parkNanos(TimeUnit.MILLISECONDS.toNanos(5));
    return "";
}, r -> new Thread(r, "A").start());

Set<String> s = ConcurrentHashMap.newKeySet();
Runnable submitter = () -> {
    String n = Thread.currentThread().getName();
    do {
        for(int i = 0; i < 1000; i++)
            first.thenAcceptAsync(x -> s.add(n + " " + Thread.currentThread().getName()),
                Runnable::run);
    } while(!first.isDone());
};
Thread b = new Thread(submitter, "B");
Thread c = new Thread(submitter, "C");
b.start();
c.start();
b.join();
c.join();
System.out.println(s);
It may not only print the combinations B A and C A from the first scenario, and B B and C C from the second. On my machine it reproducibly also prints the combinations B C and C B, indicating that an action passed to thenAcceptAsync by one thread got submitted to the executor by the other thread, which was calling thenAcceptAsync with a different action at the same time.
This matches the scenarios for the thread evaluating the function passed to thenApply (without the Async) described in this answer. As said at the beginning, that is what I expected, as both things are likely handled by the same code. But unlike the thread evaluating the function passed to thenApply, the thread invoking the execute method on the Executor is not even mentioned in the documentation. So in theory, another implementation could use an entirely different thread that neither calls a method on the future nor completes it.
At the end is a simple program that mirrors your code snippet and lets you play with it.
The output confirms that the executor you supply is the one used to complete the stage once the thing it is waiting on is ready (unless you explicitly call complete() early enough, in which case the work happens in the thread calling complete()). The get() on a Future blocks until the Future is finished.
Supply an argument and there are two executors (executor 1 and executor 2); supply no argument and there is just one. The output is either (same executor: things run as separate tasks on the same executor, sequentially) -
In thread Thread[main,5,main] - getFoo
In thread Thread[main,5,main] - getFooToBar
In thread Thread[pool-1-thread-1,5,main] - Supplying Foo
In thread Thread[pool-1-thread-1,5,main] - fooToBar
In thread Thread[main,5,main] - Completed
OR (two executors - things again run sequentially but using different executors) -
In thread Thread[main,5,main] - getFoo
In thread Thread[main,5,main] - getFooToBar
In thread Thread[pool-1-thread-1,5,main] - Supplying Foo
In thread Thread[pool-2-thread-1,5,main] - fooToBar
In thread Thread[main,5,main] - Completed
Remember: the code handed to the executors can start running in another thread immediately; in this example, getFoo was called before we even got to setting up the fooToBar stage.
Code follows -
package your.test;

import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executor;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;
import java.util.function.Function;
import java.util.function.Supplier;

public class TestCompletableFuture {

    private static void dumpWhichThread(final String msg) {
        System.err.println("In thread " + Thread.currentThread().toString() + " - " + msg);
    }

    private static final class Foo {
        final int i;
        Foo(int i) {
            this.i = i;
        }
    };

    public static Supplier<Foo> getFoo() {
        dumpWhichThread("getFoo");
        return new Supplier<Foo>() {
            @Override
            public Foo get() {
                dumpWhichThread("Supplying Foo");
                return new Foo(10);
            }
        };
    }

    private static final class Bar {
        final String j;
        public Bar(final String j) {
            this.j = j;
        }
    };

    public static Function<Foo, Bar> getFooToBar() {
        dumpWhichThread("getFooToBar");
        return new Function<Foo, Bar>() {
            @Override
            public Bar apply(Foo t) {
                dumpWhichThread("fooToBar");
                return new Bar("" + t.i);
            }
        };
    }

    public static void main(final String args[]) throws InterruptedException, ExecutionException, TimeoutException {
        final TestCompletableFuture obj = new TestCompletableFuture();
        obj.running(args.length == 0);
    }

    private String running(final boolean sameExecutor) throws InterruptedException, ExecutionException, TimeoutException {
        final Executor executor1 = Executors.newSingleThreadExecutor();
        final Executor executor2 = sameExecutor ? executor1 : Executors.newSingleThreadExecutor();

        CompletableFuture<Foo> futureFoo = CompletableFuture.supplyAsync(getFoo(), executor1);
        CompletableFuture<Bar> futureBar = futureFoo.thenApplyAsync(getFooToBar(), executor2);
        try {
            // Try putting a complete here before the get ..
            return futureBar.get(50, TimeUnit.SECONDS).j;
        }
        finally {
            dumpWhichThread("Completed");
        }
    }
}
Which thread triggers the Bar stage to progress? In the above, it's executor1. In general, the thread completing the future (i.e. giving it a value) is what releases the things depending on it. If you completed futureFoo immediately on the main thread, it would be the one triggering it.
So you have to be careful with this. If you have "N" things all waiting on the future's results but use only a single-threaded executor, then the first one scheduled will block that executor until it completes. You can extrapolate to M threads and N futures: it can decay into "M" blocked threads preventing everything else from progressing.
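A minimal sketch of that caution, assuming one single-threaded executor shared by two dependent stages: each stage parks for about 2 seconds, so the whole thing takes about 4 seconds because the dependents cannot overlap on the single worker.
ExecutorService single = Executors.newSingleThreadExecutor();
CompletableFuture<Integer> base = new CompletableFuture<>();

CompletableFuture<Void> d1 = base.thenAcceptAsync(i -> {
    System.out.println("dependent 1 on " + Thread.currentThread().getName());
    LockSupport.parkNanos(TimeUnit.SECONDS.toNanos(2)); // occupies the only worker thread
}, single);

CompletableFuture<Void> d2 = base.thenAcceptAsync(i -> {
    System.out.println("dependent 2 on " + Thread.currentThread().getName());
    LockSupport.parkNanos(TimeUnit.SECONDS.toNanos(2)); // also needs the single worker, so it cannot overlap
}, single);

base.complete(1);
CompletableFuture.allOf(d1, d2).join(); // roughly 4 seconds in total
single.shutdown();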
The official documentation of Mono#block() says:
Subscribe to this Mono and block indefinitely until a next signal is received. Returns that value, or null if the Mono completes empty. In case the Mono errors, the original exception is thrown (wrapped in a RuntimeException if it was a checked exception).
So it is clear that the block() method is blocking: it will not execute the next line until block() resolves.
But my confusion is that I was using toFuture() expecting it to be non-blocking, yet it behaves exactly like the block() method. And the documentation of Mono#toFuture() states:
Transform this Mono into a CompletableFuture completing on onNext or onComplete and failing on onError.
That is not very clear. Nowhere in the doc does it say that Mono#toFuture() is blocking.
Please confirm whether the toFuture() method is blocking or non-blocking.
Also, if it is non-blocking, which thread is responsible for executing the code inside the CompletableFuture?
Update: added code snippet
Using the Mono.block() method:
long time = System.currentTimeMillis();
String block = Mono.fromCallable(() -> {
    logger.debug("inside in fromCallable() block()");
    // Upstream http call with Apache HttpClient,
    // which takes at least 1 sec to complete.
    return "Http response as string";
}).block();
logger.info("total time needed {}", (System.currentTimeMillis() - time));
return CompletableFuture.completedFuture(block);
Using the Mono.toFuture() method:
long time = System.currentTimeMillis();
CompletableFuture<String> toFuture = Mono.fromCallable(() -> {
    logger.debug("inside in fromCallable() block()");
    // Upstream http call with Apache HttpClient,
    // which takes at least 1 sec to complete.
    return "Http response as string";
}).toFuture();
logger.info("total time needed {}", (System.currentTimeMillis() - time));
return toFuture;
These two code snippets behave exactly the same.
-- EDIT: I was wrong. mono.toFuture() doesn't block --
mono.toFuture() isn't blocking. Look at this test:
@Test
void testMonoToFuture() throws ExecutionException, InterruptedException {
    System.out.println(LocalTime.now() + ": start");
    Mono<String> mono = Mono.just("hello StackOverflow")
            .delayElement(Duration.ofMillis(500))
            .doOnNext((s) -> System.out.println(LocalTime.now() + ": mono completed"));
    Future<String> future = mono.toFuture();
    System.out.println(LocalTime.now() + ": future created");
    String result = future.get();
    System.out.println(LocalTime.now() + ": future completed");
    assertThat(result).isEqualTo("hello StackOverflow");
}
This is the result:
20:18:49.557: start
20:18:49.575: future created
20:18:50.088: mono completed
20:18:50.088: future completed
The future is created almost immediately. Half a second later, the mono completes and immediately after that, the future completes. This is exactly what I would expect to happen.
So why does the Mono seem blocking in the example provided in the question? It's because of the way Mono.fromCallable() works. When and where does that callable actually run? Mono.fromCallable() doesn't spawn an extra thread to do the work. From my tests, it seems that the callable runs when you first call subscribe() or block() or something similar on the Mono, and it runs in the thread that does that.
Here is a test showing that if you create a Mono with fromCallable(), subscribing will cause the callable to be executed in the main thread, and even the subscribe() call will appear to block.
@Test
void testMonoToFuture() throws ExecutionException, InterruptedException {
    System.out.println(LocalTime.now() + ": start");
    System.out.println("main thread: " + Thread.currentThread().getName());
    Mono<String> mono = Mono.fromCallable(() -> {
                System.out.println("callable running in thread: " + Thread.currentThread().getName());
                Thread.sleep(1000);
                return "Hello StackOverflow";
            })
            .doOnNext((s) -> System.out.println(LocalTime.now() + ": mono completed"));
    System.out.println("before subscribe");
    mono.subscribe(System.out::println);
    System.out.println(LocalTime.now() + ": after subscribe");
}
result:
20:53:37.071: start
main thread: main
before subscribe
callable running in thread: main
20:53:38.099: mono completed
Hello StackOverflow
20:53:38.100: after subscribe
Conclusion: mono.toFuture() isn't any more blocking than mono.subscribe(). If you want to execute some piece of code asynchronously, you shouldn't be using Mono.fromCallable(). You could consider using Executors.newSingleThreadExecutor().submit(someCallable)
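If you want to stay within Reactor rather than dropping down to a raw ExecutorService, a common alternative (not part of the original answer) is to move the callable onto a scheduler with subscribeOn. A minimal sketch, assuming Reactor 3.3+ for Schedulers.boundedElastic():
CompletableFuture<String> future = Mono.fromCallable(() -> {
        Thread.sleep(1000);                   // stands in for the slow HTTP call
        return "Http response as string";
    })
    .subscribeOn(Schedulers.boundedElastic()) // run the callable on a worker thread
    .toFuture();                              // returns immediately; completes about 1s later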
For reference, here is my original (wrong) answer where I belittle the mono.toFuture() method that was assuredly written by people who know a lot more about Java and coding than I do. A personal lesson in humility, I guess.
EVERYTHING BELOW THIS IS NONSENSE
I wanted to verify exactly how this works so I wrote some tests. Unfortunately, it turns out that mono.toFuture() is indeed blocking and the result is evaluated synchronously. I honestly don't know why you would ever use this feature. The whole point of a Future is to hold the result of an asynchronous evaluation.
Here is my test:
@Test
void testMonoToFuture() throws ExecutionException, InterruptedException {
    Mono<Integer> mono = Mono.fromCallable(() -> {
        System.out.println("start mono");
        Thread.sleep(1000);
        System.out.println("mono completed");
        return 0;
    });
    Future<Integer> future = mono.toFuture();
    System.out.println("future created");
    future.get();
    System.out.println("future completed");
}
Result:
start mono
mono completed
future created
future completed
Here is an implementation of monoToFuture() that works the way that I would expect it to:
@Test
void testMonoToFuture() throws ExecutionException, InterruptedException {
    Mono<Integer> mono = Mono.fromCallable(() -> {
        System.out.println("start mono");
        Thread.sleep(1000);
        System.out.println("mono completed");
        return 0;
    });
    Future<Integer> future = monoToFuture(mono, Executors.newSingleThreadExecutor());
    System.out.println("future created");
    future.get();
    System.out.println("future completed");
}

private <T> Future<T> monoToFuture(Mono<T> mono, ExecutorService executorService) {
    return executorService.submit((Callable<T>) mono::block);
}
Result:
future created
start mono
mono completed
future completed
TL;DR
Mono.toFuture() is not blocking but Mono.toFuture().get() is blocking. block() is technically the same as toFuture().get() and both are blocking.
Mono.toFuture() just transforms the Mono into a CompletableFuture by subscribing to it and returning immediately. But that doesn't mean you can access the result (in your case the String) of the corresponding Mono right after this call. The CompletableFuture is still async, and you can use methods like thenApply(), thenCompose(), thenCombine(), ... to continue async processing.
CompletableFuture<Double> result = getUserDetail(userId)
        .toFuture()
        .thenCompose(user -> getCreditRating(user));
where getUserDetail is defined as
Mono<User> getUserDetail(String userId);
Mono.toFuture() is useful when you need to combine different async APIs. For example, the AWS Java SDK v2 API is async but based on CompletableFuture; we can combine such APIs using Mono.toFuture() or Mono.fromFuture().
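A minimal sketch of going the other way with Mono.fromFuture; the client and method names are hypothetical stand-ins for a CompletableFuture-based SDK call:
// hypothetical CompletableFuture-based client call, e.g. in the style of the AWS SDK v2
CompletableFuture<String> userJsonFuture = asyncClient.getUserAsJson(userId);

Mono<Integer> length = Mono.fromFuture(userJsonFuture) // CompletableFuture -> Mono
        .map(String::length);                          // continue reactively

CompletableFuture<Integer> backToFuture = length.toFuture(); // Mono -> CompletableFuture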
I'm trying to use subscribeOn and observeOn with an Executor to let me get back to the main thread once the async task finishes.
I ended up with this code, but it does not work:
@Test
public void testBackToMainThread() throws InterruptedException {
    processValue(1);
    processValue(2);
    processValue(3);
    processValue(4);
    processValue(5);
    // while (tasks.size() != 0) {
    //     tasks.take().run();
    // }
    System.out.println("done");
}

private LinkedBlockingQueue<Runnable> tasks = new LinkedBlockingQueue<>();

private void processValue(int value) throws InterruptedException {
    Observable.just(value)
            .subscribeOn(Schedulers.io())
            .doOnNext(number -> processExecution())
            .observeOn(Schedulers.from(command -> tasks.add(command)))
            .subscribe(x -> System.out.println("Thread:" + Thread.currentThread().getName() + " value:" + x));
    tasks.take().run();
}

private void processExecution() {
    System.out.println("Execution in " + Thread.currentThread().getName());
    try {
        Thread.sleep(2000);
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}
Any idea how to accomplish what I want?
When I run it, it only prints:
Execution in RxIoScheduler-2
Execution in RxIoScheduler-3
Execution in RxIoScheduler-4
Execution in RxIoScheduler-5
Execution in RxIoScheduler-6
done
The problem with your approach is that you can't know how many tasks should be executed at a given time, and you also must not deadlock waiting for tasks that are supposed to happen only after you unblock the main thread.
Returning to the Java main thread is not supported by any extension to 1.x that I know of. For 2.x, there is the BlockingScheduler from the extensions project that allows you to do that:
public static void main(String[] args) {
    BlockingScheduler scheduler = new BlockingScheduler();
    scheduler.execute(() -> {
        Flowable.range(1, 10)
                .subscribeOn(Schedulers.io())
                .observeOn(scheduler)
                .doAfterTerminate(() -> scheduler.shutdown())
                .subscribe(v -> System.out.println(v + " on " + Thread.currentThread()));
    });
    System.out.println("BlockingScheduler finished");
}
Note the call to scheduler.shutdown(), which has to be called eventually to release the main thread; otherwise your program may never terminate.
The problem in your question will not happen in RxJava 2; it is recommended to use RxJava 2.
I compared RxJava 1.2.7 and RxJava 2.0.7 and found the root cause; now I am looking for the solution.
In RxJava 1.2.7, you can look at ObservableObserveOn#145 and see that it schedules the task when request is called. That means it calls Executor.execute when you subscribe, so your task queue accepts the Runnable immediately. You then take and run that Runnable (which is actually an ExecutorSchedulerWorker), but the upstream's onNext hasn't been called yet (because you sleep for 2000 ms), so it returns null at ObserveOnSubscriber#213. When the upstream later calls onNext(Integer), the task will never be run.
I just updated my code with akanord's suggestion, but this approach seems to block one task until the other finishes, so everything just ends up running sequentially.
With the code:
@Test
public void testBackToMainThread() throws InterruptedException {
    processValue(1);
    processValue(2);
    processValue(3);
    processValue(4);
    processValue(5);
    System.out.println("done");
}

private void processValue(int value) throws InterruptedException {
    BlockingScheduler scheduler = new BlockingScheduler();
    scheduler.execute(() -> Flowable.just(value)
            .subscribeOn(Schedulers.io())
            .doOnNext(number -> processExecution())
            .observeOn(scheduler)
            .doAfterTerminate(() -> scheduler.shutdown())
            .subscribe(v -> System.out.println(v + " on " + Thread.currentThread())));
}

private void processExecution() {
    System.out.println("Execution in " + Thread.currentThread().getName());
    try {
        Thread.sleep(2000);
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}
And the output
Execution in RxCachedThreadScheduler-1
1 on Thread[main,5,main]
Execution in RxCachedThreadScheduler-1
2 on Thread[main,5,main]
Execution in RxCachedThreadScheduler-1
3 on Thread[main,5,main]
Execution in RxCachedThreadScheduler-1
4 on Thread[main,5,main]
Execution in RxCachedThreadScheduler-1
5 on Thread[main,5,main]
done
What I want to achieve is this output
Execution in RxCachedThreadScheduler-1
Execution in RxCachedThreadScheduler-1
Execution in RxCachedThreadScheduler-1
Execution in RxCachedThreadScheduler-1
Execution in RxCachedThreadScheduler-1
1 on Thread[main,5,main]
2 on Thread[main,5,main]
3 on Thread[main,5,main]
4 on Thread[main,5,main]
5 on Thread[main,5,main]
done
So, each time the main thread runs the pipeline, onNext should run in another thread and the method should return immediately; only when that other thread finishes should the main thread be brought back into the pipeline.