Does Runnable.run() create a thread every time I call it? - java

I wrote a class which I use as follows:
EventWrapperBuilder.newWrapperBuilder().
addSync(this::<some_method>).
addSync(this::<some_method>).
addSync(this::<some_method>).
addAsync(() -> <some_method>, Duration.ofSeconds(10)).
GET();
My code is the following:
private final List<EventWrapper> _wrappers = new ArrayList<>();
public EventWrapperBuilder addSync(final Runnable task)
{
_wrappers.add(new EventWrapper(task, Duration.ZERO));
return this;
}
public EventWrapperBuilder addAsync(final Runnable task, final Duration duration)
{
_wrappers.add(new EventWrapper(task, duration));
return this;
}
/**
* @return {@code List} of all {@code Future}
*/
public List<Future<?>> GET()
{
final List<Future<?>> list = new ArrayList<>();
for (final EventWrapper wrapper : getWrappers())
{
if (!wrapper.getDuration().isZero())
{
list.add(ThreadPoolManager.getInstance().scheduleEvent(wrapper.getTask(), wrapper.getDuration().toMillis()));
}
else
{
wrapper.getTask().run();
}
}
return list;
}
/**
* @param builder
* @return {@code EventWrapperBuilder}
*/
public EventWrapperBuilder COMBINE(final EventWrapperBuilder builder)
{
_wrappers.addAll(builder.getWrappers());
return this;
}
/**
* @return {@code List} of all {@code EventWrapper}
*/
public List<EventWrapper> getWrappers()
{
return _wrappers;
}
//@formatter:off
private static record EventWrapper (Runnable getTask, Duration getDuration) {}
//@formatter:on
public static EventWrapperBuilder newWrapperBuilder()
{
return new EventWrapperBuilder();
}
My question is: Do I create a new thread every time I execute this if it is instant, i.e. the EventWrapper's duration is zero?
I obviously know that the
list.add(ThreadPoolManager.getInstance().scheduleEvent(wrapper.getTask(), wrapper.getDuration().toMillis()));
creates a thread and executes it after the scheduled time, but the
wrapper.getTask().run();
runs immediately, without a new thread, right?
I don't want my code to create threads and therefore be heavy when it executes run().

No, calling run() of the Runnable interface will not spawn a new thread.
On the other hand, when you wrap the Runnable in a Thread and call start(), the JVM will spawn a new thread and execute the Runnable in its context.
In your code, only the async tasks will run on separate threads, since the thread pool manages their execution.

In your code, calling run() will execute the Runnable in the current thread, while calling start() will run the Runnable in a new thread.
https://docs.oracle.com/javase/7/docs/api/java/lang/Runnable.html#run()
https://docs.oracle.com/javase/7/docs/api/java/lang/Thread.html#start()
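For illustration, here is a small self-contained sketch (plain JDK classes, not the ThreadPoolManager from the question) that prints where each variant runs:
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;
public class RunVsStartDemo {
    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> System.out.println("running on " + Thread.currentThread().getName());
        task.run(); // executes synchronously on the calling (main) thread, no new thread
        new Thread(task, "worker").start(); // spawns a new thread and runs the task there
        // a scheduled pool runs the task later on one of its pooled threads
        ScheduledExecutorService pool = Executors.newScheduledThreadPool(1);
        pool.schedule(task, 1, TimeUnit.SECONDS);
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
    }
}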

Related

Processing tasks in parallel and sequentially Java

In my program, the user can trigger different tasks via an interface, which take some time to process. Therefore they are executed by threads. So far I have implemented it so that I have an executor with one thread that executes all tasks one after the other. But now I would like to parallelize everything a little bit.
i.e. I would like to run tasks in parallel, except if they have the same path, then I want to run them sequentially. For example, I have 10 threads in my pool and when a task comes in, the task should be assigned to the worker which is currently processing a task with the same path. If no task with the same path is currently being processed by a worker, then the task should be processed by a currently free worker.
Additional info: A task is any type of task that is executed on a file in the local file system, for example renaming a file. Therefore the tasks have a path attribute, and I don't want to execute two tasks on the same file at the same time, so tasks with the same path should be performed sequentially.
Here is my sample code but there is work to do:
One of my problems is that I need a safe way to check whether a worker is currently running and to get the path of the task it is currently processing. By safe I mean that no problems of simultaneous access or other threading problems occur.
public class TasksOrderingExecutor {
public interface Task extends Runnable {
//Task code here
String getPath();
}
private static class Worker implements Runnable {
private final LinkedBlockingQueue<Task> tasks = new LinkedBlockingQueue<>();
//some variable or mechanic to give the actual path of the running tasks??
private volatile boolean stopped;
void schedule(Task task) {
tasks.add(task);
}
void stop() {
stopped = true;
}
@Override
public void run() {
while (!stopped) {
try {
Task task = tasks.take();
task.run();
} catch (InterruptedException ie) {
// perhaps, handle somehow
}
}
}
}
private final Worker[] workers;
private final ExecutorService executorService;
/**
* @param queuesNr nr of concurrent task queues
*/
public TasksOrderingExecutor(int queuesNr) {
Preconditions.checkArgument(queuesNr >= 1, "queuesNr >= 1");
executorService = new ThreadPoolExecutor(queuesNr, queuesNr, 0, TimeUnit.SECONDS, new SynchronousQueue<>());
workers = new Worker[queuesNr];
for (int i = 0; i < queuesNr; i++) {
Worker worker = new Worker();
executorService.submit(worker);
workers[i] = worker;
}
}
public void submit(Task task) {
Worker worker = getWorker(task);
worker.schedule(task);
}
public void stop() {
for (Worker w : workers) w.stop();
executorService.shutdown();
}
private Worker getWorker(Task task) {
//check here if a running worker with a specific path exists? If yes return it, else return a free worker. How do I check if a worker is currently running?
return workers[task.getPath() //HERE I NEED HELP//];
}
}
Seems like you have a pair of problems:
You want to check the status of tasks submitted to an executor service
You want to run tasks in parallel, and possibly prioritize them
Future
For the first problem, capture the Future object returned when you submit a task to an executor service. You can check the Future object for its completion status.
Future< Task > future = myExecutorService.submit( someTask , someTask ) ;  // the submit( Runnable , T result ) form returns a typed Future
…
boolean isCancelled = future.isCancelled() ; // Returns true if this task was cancelled before it completed normally.
boolean isDone = future.isDone(); // Returns true if this task completed.
The Future is of a type, and that type can be your Task class itself. Calling Future::get yields the Task object. You can then interrogate that Task object for its contained file path.
Task task = future.get() ;
String path = task.getPath() ; // Access field via getter from your `Task` object.
Executors
Rather than instantiating new ThreadPoolExecutor, use the Executors utility class to instantiate an executor service on your behalf. Instantiating ThreadPoolExecutor directly is not needed for most common scenarios, as mentioned in the first line of its Javadoc.
ExecutorService es = Executors.newFixedThreadPool( 3 ) ; // Instantiate an executor service backed by a pool of three threads.
For the second problem, use an executor service backed by a thread pool rather than a single thread. The executor service automatically assigns the submitted task to an available thread.
As for grouping or prioritizing, use multiple executor services. You can instantiate more than one. You can have as many executor services as you want, provided you do not overload the demand on your deployment machine for CPU cores and memory (think about your maximum simultaneous usage).
ExecutorService esSingleThread = Executors.newSingleThreadExecutor() ;
ExecutorService esMultiThread = Executors.newCachedThreadPool() ;
One executor service might be backed by a single thread to limit the demands on the deployment computer, while others might be backed by a thread pool to get more work done. You can use these multiple executor services as your multiple queues. No need for you to be managing queues and workers as seen in the code of your Question. Executors were invented to further simplify working with multiple threads.
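As a minimal sketch of the "multiple executor services as your multiple queues" idea (not code from this Answer; the class name and the routing-by-hash choice are illustrative assumptions), tasks could be routed to one of several single-thread executor services keyed on the task's path, so same-path tasks run sequentially while different paths may run in parallel:
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
public class PathRoutedExecutors {
    private final List<ExecutorService> queues = new ArrayList<>();
    public PathRoutedExecutors(int queuesNr) {
        for (int i = 0; i < queuesNr; i++) {
            queues.add(Executors.newSingleThreadExecutor()); // each queue runs its tasks sequentially
        }
    }
    // Tasks with the same path always land on the same single-thread executor,
    // so they run one after another; different paths may run in parallel.
    public void submit(String path, Runnable task) {
        int index = Math.floorMod(path.hashCode(), queues.size());
        queues.get(index).submit(task);
    }
    public void shutdown() {
        queues.forEach(ExecutorService::shutdown);
    }
}
Note that unrelated paths may occasionally share a queue when their hashes land on the same index; that only reduces parallelism, it does not break the same-path ordering.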
Concurrency
You said:
And I don't want to execute two tasks on the same file at the same time, so such tasks with the same paths should be performed sequentially.
You should have a better way to handle the concurrency conflict than just scheduling tasks on threads.
Java has ways to manage concurrent access to files. Search to learn more, as this has been covered on Stack Overflow already.
Perhaps I have not understood fully your needs, so do comment if I am off-base.
It seems that you need some sort of "Task Dispatcher" that executes or holds some tasks depending on some identifier (here the Path of the file the task is applied to).
You could use something like this:
public class Dispatcher<I> implements Runnable {
/**
* The executor used to execute the submitted task
*/
private final Executor executor;
/**
* Map of the pending tasks
*/
private final Map<I, Deque<Runnable>> pendingTasksById = new HashMap<>();
/**
* Set of the ids that are currently being executed
*/
private final Set<I> runningIds = new HashSet<>();
/**
* Action to be executed by the dispatcher
*/
private final BlockingDeque<Runnable> actionQueue = new LinkedBlockingDeque<>();
public Dispatcher(Executor executor) {
this.executor = executor;
}
/**
* Tasks in the same group will be executed sequentially (but not necessarily on the same thread)
* @param id the id of the group the task belongs to
* @param task the task to execute
*/
public void submitTask(I id, Runnable task) {
actionQueue.addLast(() -> {
if (canBeLaunchedDirectly(id)) {
executeTask(id, task);
} else {
addTaskToPendingTasks(id, task);
ifPossibleLaunchPendingTaskForId(id);
}
});
}
@Override
public void run() {
while (!Thread.currentThread().isInterrupted()) {
try {
actionQueue.takeFirst().run();
} catch (InterruptedException e) {
Thread.currentThread().interrupt(); // restore the interrupt flag before exiting
break;
}
}
}
private void addTaskToPendingTasks(I id, Runnable task) {
this.pendingTasksById.computeIfAbsent(id, i -> new LinkedList<>()).add(task);
}
/**
* @param id an id of a group
* @return true if a task of the group with the provided id is currently executed
*/
private boolean isRunning(I id) {
return runningIds.contains(id);
}
/**
* @param id an id of a group
* @return an optional containing the first pending task of the group,
* an empty optional if no such task is available
*/
private Optional<Runnable> getFirstPendingTask(I id) {
final Deque<Runnable> pendingTasks = pendingTasksById.get(id);
if (pendingTasks == null) {
return Optional.empty();
}
assert !pendingTasks.isEmpty();
final Runnable result = pendingTasks.removeFirst();
if (pendingTasks.isEmpty()) {
pendingTasksById.remove(id);
}
return Optional.of(result);
}
private boolean canBeLaunchedDirectly(I id) {
return !isRunning(id) && pendingTasksById.get(id) == null;
}
private void executeTask(I id, Runnable task) {
this.runningIds.add(id);
executor.execute(() -> {
try {
task.run();
} finally {
actionQueue.addLast(() -> {
runningIds.remove(id);
ifPossibleLaunchPendingTaskForId(id);
});
}
});
}
private void ifPossibleLaunchPendingTaskForId(I id) {
if (isRunning(id)) {
return;
}
getFirstPendingTask(id).ifPresent(r -> executeTask(id, r));
}
}
To use it, you need to launch it in a separate thread (or you can adapt it for a cleaner solution) like this:
final Dispatcher<Path> dispatcher = new Dispatcher<>(Executors.newCachedThreadPool());
new Thread(dispatcher).start();
dispatcher.submitTask(path, task1);
dispatcher.submitTask(path, task2);
This is a basic example; you might need to keep a reference to the thread, and even better, wrap all of this in a class.
All you need is a hash map of actors, with the file path as the key. Different actors run in parallel, while each concrete actor handles its tasks sequentially.
Your solution is flawed because the Worker class uses the blocking operation take() but is executed in a limited thread pool, which may lead to thread starvation (a kind of deadlock). Actors do not block while waiting for the next message.
import org.df4j.core.dataflow.ClassicActor;
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.*;
public class TasksOrderingExecutor {
public static class Task implements Runnable {
private final String path;
private final String task;
public Task(String path, String task) {
this.path = path;
this.task = task;
}
//Task code here
String getPath() {
return path;
}
@Override
public void run() {
System.out.println(path+"/"+task+" started");
try {
Thread.sleep(500);
} catch (InterruptedException e) {
}
System.out.println(path+"/"+task+" stopped");
}
}
static class Worker extends ClassicActor<Task> {
@Override
protected void runAction(Task task) throws Throwable {
task.run();
}
}
private final ExecutorService executorService;
private final Map<String,Worker> workers = new HashMap<String,Worker>(){
@Override
public Worker get(Object key) {
return super.computeIfAbsent((String) key, (k) -> {
Worker res = new Worker();
res.setExecutor(executorService);
res.start();
return res;
});
}
};
/**
* @param queuesNr nr of concurrent task queues
*/
public TasksOrderingExecutor(int queuesNr) {
executorService = ForkJoinPool.commonPool();
}
public void submit(Task task) {
Worker worker = getWorker(task);
worker.onNext(task);
}
public void stop() throws InterruptedException {
for (Worker w : workers.values()) {
w.onComplete();
}
executorService.shutdown();
executorService.awaitTermination(10, TimeUnit.SECONDS);
}
private Worker getWorker(Task task) {
//check here if a running worker with a specific path exists? If yes return it, else return a free worker. How do I check if a worker is currently running?
return workers.get(task.getPath());
}
public static void main(String[] args) throws InterruptedException {
TasksOrderingExecutor orderingExecutor = new TasksOrderingExecutor(20);
orderingExecutor.submit(new Task("path1", "task1"));
orderingExecutor.submit(new Task("path1", "task2"));
orderingExecutor.submit(new Task("path2", "task1"));
orderingExecutor.submit(new Task("path3", "task1"));
orderingExecutor.submit(new Task("path2", "task2"));
orderingExecutor.stop();
}
}
The execution log shows that tasks with the same key are executed sequentially and tasks with different keys are executed in parallel:
path3/task1 started
path2/task1 started
path1/task1 started
path3/task1 stopped
path2/task1 stopped
path1/task1 stopped
path2/task2 started
path1/task2 started
path2/task2 stopped
path1/task2 stopped
I used my own actor library DF4J, but any other actor library can be used.
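If you prefer to stay with the standard library instead of an actor framework, a similar keyed-sequential behavior can be sketched by chaining tasks per key with CompletableFuture. This is only an illustrative sketch: it never cleans up finished chains, and a task that throws would stop the rest of its chain.
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
public class KeyedSequentialExecutor<K> {
    private final ExecutorService pool = Executors.newCachedThreadPool();
    private final ConcurrentMap<K, CompletableFuture<Void>> tails = new ConcurrentHashMap<>();
    // Chain each new task onto the previous task for the same key, so tasks sharing
    // a key run sequentially while different keys proceed in parallel on the pool.
    public void submit(K key, Runnable task) {
        tails.compute(key, (k, tail) ->
                tail == null
                        ? CompletableFuture.runAsync(task, pool)
                        : tail.thenRunAsync(task, pool));
    }
    public void shutdown() {
        pool.shutdown();
    }
}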

ScheduledThreadPoolExecutor.remove: Is it safe to use?

I am trying to schedule a bunch of tasks to execute periodically. Under certain circumstances some tasks need to be stopped from being scheduled, so I remove them from the internal queue of the ThreadPoolExecutor. I do that from within the task itself.
Below is my approach. I am not sure whether removing the task from the executor service from inside the task itself can cause any problems (look at the synchronized method named 'removeTask'). Is there a better way to accomplish what I am trying to do here?
public class SchedulerDaemon {
private ScheduledExecutorService taskScheduler;
private ScheduledFuture taskResult1, taskResult2;
private Task1 task1;
private Task2 task2;
public SchedulerDaemon(Task1 task, Task2 task2)
{
this.task1 = task;
this.task2 = task2;
taskScheduler = new ScheduledThreadPoolExecutor(1);
}
public void start() {
if(taskScheduler == null) {
taskScheduler = new ScheduledThreadPoolExecutor(1);
taskResult1 = taskScheduler.scheduleAtFixedRate(new TaskWrapper(task1) , 60000,60000, TimeUnit.MILLISECONDS);
taskResult2 = taskScheduler.scheduleAtFixedRate(new TaskWrapper(task2) , 60000,60000, TimeUnit.MILLISECONDS);
}
}
public void stop() {
if(taskScheduler != null) {
taskScheduler.shutdown();
taskResult1.cancel(false);
taskResult2.cancel(false);
taskScheduler = null;
taskResult1 = null;
taskResult2 = null;
}
}
public synchronized void removeTask( TaskWrapper task){
((ScheduledThreadPoolExecutor) taskScheduler).remove(task);
}
class TaskWrapper implements Runnable {
private Task myTask;
public TaskWrapper(Task task) {
myTask = task;
}
@Override
public void run() {
try {
boolean keepRunningTask = myTask.call();
if(!keepRunningTask) {
//Should this cause any problem?
removeTask(this);
}
} catch (Exception e) {
//the task threw an exception remove it from execution queue
//Should this cause any problem?
removeTask(this);
}
}
}
}
public class Task1 implements Callable<Boolean> {
public Boolean call() {
if(<something>)
return true;
else
return false;
}
}
public class Task2 implements Callable<Boolean> {
public Boolean call() {
if(<something>)
return true;
else
return false;
}
}
Whenever you schedule a task,
ScheduledFuture<?> future = schedulerService.scheduleAtFixedRate(new AnyTask(), 0, 1, TimeUnit.SECONDS);
a ScheduledFuture object is returned.
Use this Future object to cancel the task.
try this
future.cancel(true);
from JavaDocs
/**
* Attempts to cancel execution of this task. This attempt will
* fail if the task has already completed, has already been cancelled,
* or could not be cancelled for some other reason. If successful,
* and this task has not started when <tt>cancel</tt> is called,
* this task should never run. If the task has already started,
* then the <tt>mayInterruptIfRunning</tt> parameter determines
* whether the thread executing this task should be interrupted in
* an attempt to stop the task.
*
* <p>After this method returns, subsequent calls to {@link #isDone} will
* always return <tt>true</tt>. Subsequent calls to {@link #isCancelled}
* will always return <tt>true</tt> if this method returned <tt>true</tt>.
*
* @param mayInterruptIfRunning <tt>true</tt> if the thread executing this
* task should be interrupted; otherwise, in-progress tasks are allowed
* to complete
* @return <tt>false</tt> if the task could not be cancelled,
* typically because it has already completed normally;
* <tt>true</tt> otherwise
*/
Cancelling a task by force is dangerous; that is why Thread.stop() is deprecated and marked for removal from Java.
As an alternative, you should have a shared flag that your task checks periodically: "may I keep running? No? OK, return!" It may seem clumsy, but it is the safe way.
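A minimal sketch of that shared-flag (cooperative cancellation) idea -- the task checks the flag on each iteration and returns on its own instead of being killed:
import java.util.concurrent.atomic.AtomicBoolean;
public class CooperativeTask implements Runnable {
    private final AtomicBoolean keepRunning = new AtomicBoolean(true);
    public void requestStop() {
        keepRunning.set(false); // ask the task to finish at the next check
    }
    @Override
    public void run() {
        while (keepRunning.get()) {
            // do one small unit of work, then loop back and re-check the flag
        }
        // clean up and return normally; nothing is stopped by force
    }
}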

ExecutorService.execute() does not return the thread type

I have something like this
public static void runThread(Thread t){
ExecutorService threadExecutor = Executors.newSingleThreadExecutor();
threadExecutor.execute(t);
}
if I do Thread.currentThread(), then I get back weblogic.work.ExecuteThread or sometimes java.lang.Thread (I used Weblogic as my AppServer), but if I do
public static void runThread(Thread t){
//ExecutorService threadExecutor = Executors.newSingleThreadExecutor();
//threadExecutor.execute(t);
t.start();
}
then when I do Thread.currentThread(), I get back com.my.thread.JSFExecutionThread, which is the Thread that I passed in, and this is what I want. Is there a way to make ExecutorService#execute() run on the correct Thread type, like Thread#start() does? The thing is that I want to use ExecutorService because I want to leverage shutdown() and shutdownNow().
EDIT
Is there anything wrong with this implementation?
/**
* Run {@code Runnable runnable} with {@code ExecutorService}
* @param runnable {@code Runnable}
* @return
*/
public static ExecutorService runThread(Thread t){
ExecutorService threadExecutor = Executors.newSingleThreadExecutor(
new ExecutionThreadFactory(t));
threadExecutor.execute(t);
return threadExecutor;
}
private static class ExecutionThreadFactory implements ThreadFactory{
private JSFExecutionThread jsfThread;
ExecutionThreadFactory(Thread t){
if(t instanceof JSFExecutionThread){
jsfThread = (JSFExecutionThread)t;
}
}
@Override
public Thread newThread(Runnable r) {
if(jsfThread != null){
return jsfThread;
}else{
return new Thread(r);
}
}
}
Is there anything wrong with this implementation?
Yes.
First, the ExecutorService manages the lifetime of each Thread from the time the ThreadFactory creates it until the executor is done with it... and the punchline: a Thread is not re-usable; once it has terminated it cannot be started again.
Second
public Thread newThread(Runnable r) {
if(jsfThread != null){
return jsfThread;
}else{
return new Thread(r);
}
}
This code violates the contract of ThreadFactory.newThread because the Runnable r is never set as the runnable to be executed by the jsfThread.
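To keep shutdown()/shutdownNow() and still run on a custom thread type, the factory has to create a fresh thread around the Runnable the executor hands it. A self-contained sketch (ExecutionThread is a stand-in for JSFExecutionThread; the assumption is that such a class can take the Runnable to execute in its constructor):
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.ThreadFactory;
public class CustomThreadTypeDemo {
    // Stand-in for the asker's JSFExecutionThread: a Thread subclass with a Runnable constructor.
    static class ExecutionThread extends Thread {
        ExecutionThread(Runnable r) { super(r); }
    }
    public static void main(String[] args) {
        ThreadFactory factory = r -> new ExecutionThread(r); // contract honored: r is the runnable the pool executes
        ExecutorService executor = Executors.newSingleThreadExecutor(factory);
        executor.execute(() ->
                System.out.println("running on " + Thread.currentThread().getClass().getName()));
        executor.shutdown(); // shutdown()/shutdownNow() remain available
    }
}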

Java: ExecutorService thread synchronization with CountDownLatch causes dead lock?

I have written a game of life for programming practice. There are 3 different implementations of the generator. First: One main thread + N sub threads, Second: SwingWorker + N sub threads, Third: SwingWorker + ExecutorService.
N is the number of availableProcessors or user defined.
The first two implementations run fine, with one or more threads.
The implementation with the ExecutorService runs fine with one thread, but locks up with more than one. I have tried everything, but I can't find the solution.
Here is the code of the working implementation (the second one):
package example.generator;
import javax.swing.SwingWorker;
/**
* AbstractGenerator implementation 2: SwingWorker + sub threads.
*
* @author Dima
*/
public final class WorldGenerator2 extends AbstractGenerator {
/**
* Constructor.
* @param gamePanel The game panel
*/
public WorldGenerator2() {
super();
}
/* (non-Javadoc)
* @see main.generator.AbstractGenerator#startGenerationProcess()
*/
@Override
protected void startGenerationProcess() {
final SwingWorker<Void, Void> worker = this.createWorker();
worker.execute();
}
/**
* Creates a swing worker for the generation process.
* #return The swing worker
*/
private SwingWorker<Void, Void> createWorker() {
return new SwingWorker<Void, Void>() {
@Override
protected Void doInBackground() throws InterruptedException {
WorldGenerator2.this.generationProcessing();
return null;
}
};
}
/* (non-Javadoc)
* @see main.generator.AbstractGenerator#startFirstStep()
*/
@Override
public void startFirstStep() throws InterruptedException {
this.getQueue().addAll(this.getLivingCells());
for (int i = 0; i < this.getCoresToUse(); i++) {
final Thread thread = new Thread() {
#Override
public void run() {
WorldGenerator2.this.fistStepProcessing();
}
};
thread.start();
thread.join();
}
}
/* (non-Javadoc)
* @see main.generator.AbstractGenerator#startSecondStep()
*/
@Override
protected void startSecondStep() throws InterruptedException {
this.getQueue().addAll(this.getCellsToCheck());
for (int i = 0; i < this.getCoresToUse(); i++) {
final Thread thread = new Thread() {
@Override
public void run() {
WorldGenerator2.this.secondStepProcessing();
}
};
thread.start();
thread.join();
}
}
}
Here is the code of the implementation with the ExecutorService that does not work:
package example.generator;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import javax.swing.SwingWorker;
/**
* AbstractGenerator implementation 3: SwingWorker + ExecutorService.
*
* @author Dima
*/
public final class WorldGenerator3 extends AbstractGenerator {
private CountDownLatch countDownLatch;
private ExecutorService executor;
/**
* Constructor.
* @param gamePanel The game panel
*/
public WorldGenerator3() {
super();
}
/* (non-Javadoc)
* @see main.generator.AbstractGenerator#startGenerationProcess()
*/
@Override
protected void startGenerationProcess() {
this.executor = Executors.newFixedThreadPool(this.getCoresToUse());
final SwingWorker<Void, Void> worker = this.createWorker();
worker.execute();
}
/**
* Creates a swing worker for the generation process.
* @return The swing worker
*/
private SwingWorker<Void, Void> createWorker() {
return new SwingWorker<Void, Void>() {
@Override
protected Void doInBackground() throws InterruptedException {
WorldGenerator3.this.generationProcessing();
return null;
}
};
}
/* (non-Javadoc)
* @see main.generator.AbstractGenerator#startFirstStep()
*/
@Override
public void startFirstStep() throws InterruptedException {
this.getQueue().addAll(this.getLivingCells());
this.countDownLatch = new CountDownLatch(this.getCoresToUse());
for (int i = 0; i < this.getCoresToUse(); i++) {
this.executor.execute(new Runnable() {
@Override
public void run() {
WorldGenerator3.this.fistStepProcessing();
WorldGenerator3.this.countDownLatch.countDown();
}
});
}
this.countDownLatch.await();
}
/* (non-Javadoc)
* @see main.generator.AbstractGenerator#startSecondStep()
*/
@Override
protected void startSecondStep() throws InterruptedException {
this.getQueue().addAll(this.getCellsToCheck());
this.countDownLatch = new CountDownLatch(this.getCoresToUse());
for (int i = 0; i < this.getCoresToUse(); i++) {
this.executor.execute(new Runnable() {
@Override
public void run() {
WorldGenerator3.this.secondStepProcessing();
WorldGenerator3.this.countDownLatch.countDown();
}
});
}
this.countDownLatch.await();
}
}
Here you can download a sample of my application with a small launcher; it only prints the result of an iteration on the console: Link
Now my code looks like this:
/* (non-Javadoc)
* @see main.generator.AbstractGenerator#startFirstStep()
*/
@Override
public void startFirstStep() throws InterruptedException {
this.getQueue().addAll(this.getLivingCells());
final ArrayList<Callable<Void>> list = new ArrayList<Callable<Void>>(this.getCoresToUse());
for (int i = 0; i < this.getCoresToUse(); i++) {
list.add(new Callable<Void>() {
@Override
public Void call() throws Exception {
WorldGenerator3.this.fistStepProcessing();
return null;
}
}
);
}
this.executor.invokeAll(list);
}
But here the same problem occurs again. If I run it with one core (thread) there are no problems. If I set the number of cores to more than one, it locks up. In my first question there is a link to an example which you can run (in Eclipse). Maybe I overlooked something in the previous code.
I find your usage of the Executors facilities a little bit odd...
I.e. the idea is to have an Executor with a pool of threads, whose size is usually related to the number of cores your CPU supports.
Then you submit any number of parallel tasks to the Executor, letting it decide what to execute when and on which available Thread from its pool.
As for the CountDownLatch... Why not use ExecutorService.invokeAll? This method will block until all submitted tasks are completed or the timeout is reached, so it does the counting of remaining work on your behalf.
Or a CompletionService, which "decouples the production of new asynchronous tasks from the consumption of the results of completed tasks", if you want to consume a task's result as soon as it becomes available, i.e. not wait for all tasks to complete first.
Something like
private static final int WORKER_THREAD_COUNT_DEFAULT = Runtime.getRuntime().availableProcessors() * 2;
ExecutorService executor = Executors.newFixedThreadPool(WORKER_THREAD_COUNT_DEFAULT);
// your tasks may or may not return a result, so consuming the invokeAll return value may not be necessary in your case
List<Future<T>> futuresResult = executor.invokeAll(tasksToRunInParallel, EXECUTE_TIMEOUT,
TimeUnit.SECONDS);
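And for the CompletionService mentioned above, a rough sketch (the squaring tasks and the pool size are arbitrary, for illustration only) that consumes each result as soon as it completes:
import java.util.concurrent.Callable;
import java.util.concurrent.CompletionService;
import java.util.concurrent.ExecutorCompletionService;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
public class CompletionServiceSketch {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        CompletionService<Integer> completion = new ExecutorCompletionService<>(pool);
        int taskCount = 10;
        for (int i = 0; i < taskCount; i++) {
            final int n = i;
            Callable<Integer> squaring = () -> n * n;
            completion.submit(squaring);
        }
        // take() returns the next *completed* future, in completion order,
        // so each result is consumed as soon as it becomes available
        for (int i = 0; i < taskCount; i++) {
            System.out.println("completed: " + completion.take().get());
        }
        pool.shutdown();
    }
}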
In all variants you are executing threads in serial rather than parallel because you join and await inside the for-loop. That means that the for-loop cannot move on to the next iteration until the thread just started is complete. This amounts to having only one thread live at any given time -- either the main thread or the one thread created in the current loop iteration. If you want to join on multiple threads, you must collect the refs to them and then, outside the loop where you started them all, enter another loop where you join on each one.
As for using CountDownLatch in the Executors variant, what was said for threads goes for the latch here: don't use an instance var; use a local list that collects all latches and await them in a separate loop.
But, you shouldn't really be using the CountDownLatch in the first place: you should put all your parallel tasks in a list of Callables and call ExecutorService.invokeAll with it. It will automatically block until all the tasks are done.
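To make the earlier point about joining outside the loop concrete, here is a small sketch (doStepProcessing stands in for the generator's per-thread work): start every thread in one loop, then wait for all of them in a second loop.
import java.util.ArrayList;
import java.util.List;
public class StartThenJoin {
    public static void main(String[] args) throws InterruptedException {
        int coresToUse = Runtime.getRuntime().availableProcessors();
        List<Thread> threads = new ArrayList<>();
        // first loop: start every worker so they all run concurrently
        for (int i = 0; i < coresToUse; i++) {
            Thread t = new Thread(() -> doStepProcessing());
            t.start();
            threads.add(t);
        }
        // second loop: only now wait for all of them to finish
        for (Thread t : threads) {
            t.join();
        }
    }
    private static void doStepProcessing() {
        // placeholder for the generator's per-thread work
    }
}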

Why is UncaughtExceptionHandler not called by ExecutorService?

I've stumbled upon a problem, that can be summarized as follows:
When I create the thread manually (i.e. by instantiating java.lang.Thread) the UncaughtExceptionHandler is called appropriately. However, when I use an ExecutorService with a ThreadFactory the handler is omitted. What did I miss?
public class ThreadStudy {
private static final int THREAD_POOL_SIZE = 1;
public static void main(String[] args) {
// create uncaught exception handler
final UncaughtExceptionHandler exceptionHandler = new UncaughtExceptionHandler() {
@Override
public void uncaughtException(Thread t, Throwable e) {
synchronized (this) {
System.err.println("Uncaught exception in thread '" + t.getName() + "': " + e.getMessage());
}
}
};
// create thread factory
ThreadFactory threadFactory = new ThreadFactory() {
@Override
public Thread newThread(Runnable r) {
// System.out.println("creating pooled thread");
final Thread thread = new Thread(r);
thread.setUncaughtExceptionHandler(exceptionHandler);
return thread;
}
};
// create Threadpool
ExecutorService threadPool = Executors.newFixedThreadPool(THREAD_POOL_SIZE, threadFactory);
// create Runnable
Runnable runnable = new Runnable() {
@Override
public void run() {
// System.out.println("A runnable runs...");
throw new RuntimeException("Error in Runnable");
}
};
// create Callable
Callable<Integer> callable = new Callable<Integer>() {
@Override
public Integer call() throws Exception {
// System.out.println("A callable runs...");
throw new Exception("Error in Callable");
}
};
// a) submitting Runnable to threadpool
threadPool.submit(runnable);
// b) submit Callable to threadpool
threadPool.submit(callable);
// c) create a thread for runnable manually
final Thread thread_r = new Thread(runnable, "manually-created-thread");
thread_r.setUncaughtExceptionHandler(exceptionHandler);
thread_r.start();
threadPool.shutdown();
System.out.println("Done.");
}
}
I expect: Three times the message "Uncaught exception..."
I get: The message once (triggered by the manually created thread).
Reproduced with Java 1.6 on Windows 7 and Mac OS X 10.5.
Because the exception does not go uncaught.
The Thread that your ThreadFactory produces is not given your Runnable or Callable directly. Instead, the Runnable that you get is an internal Worker class, for example see ThreadPoolExecutor$Worker. Try System.out.println() on the Runnable given to newThread in your example.
This Worker catches any RuntimeExceptions from your submitted job.
You can get the exception in the ThreadPoolExecutor#afterExecute method.
Exceptions which are thrown by tasks submitted to ExecutorService#submit get wrapped into an ExecutionException and are rethrown by the Future.get() method. This is because the executor considers the exception part of the result of the task.
If you however submit a task via the execute() method which originates from the Executor interface, the UncaughtExceptionHandler is notified.
Quote from the book Java Concurrency in Practice (page 163); hope this helps:
Somewhat confusingly, exceptions thrown from tasks make it to the uncaught
exception handler only for tasks submitted with execute; for tasks submitted
with submit, any thrown exception, checked or not, is considered to be part of the
task’s return status. If a task submitted with submit terminates with an exception,
it is rethrown by Future.get, wrapped in an ExecutionException.
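To illustrate the submit() path described in the quote, here is a small sketch that retrieves the wrapped exception via Future.get():
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
public class SubmitExceptionDemo {
    public static void main(String[] args) throws InterruptedException {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        Runnable task = () -> {
            throw new RuntimeException("Error in Runnable");
        };
        Future<?> future = pool.submit(task);
        try {
            future.get(); // rethrows the task's exception wrapped in an ExecutionException
        } catch (ExecutionException e) {
            System.err.println("task failed: " + e.getCause().getMessage());
        }
        pool.shutdown();
    }
}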
Here is the example:
public class Main {
public static void main(String[] args){
ThreadFactory factory = new ThreadFactory(){
@Override
public Thread newThread(Runnable r) {
// TODO Auto-generated method stub
final Thread thread =new Thread(r);
thread.setUncaughtExceptionHandler( new Thread.UncaughtExceptionHandler() {
@Override
public void uncaughtException(Thread t, Throwable e) {
// TODO Auto-generated method stub
System.out.println("in exception handler");
}
});
return thread;
}
};
ExecutorService pool=Executors.newSingleThreadExecutor(factory);
pool.execute(new TestTask());
}
private static class TestTask implements Runnable {
@Override
public void run() {
// TODO Auto-generated method stub
throw new RuntimeException();
}
}
}
I use execute to submit the task and the console outputs "in exception handler"
I just browsed through my old questions and thought I might share the solution I implemented in case it helps someone (or I missed a bug).
import java.lang.Thread.UncaughtExceptionHandler;
import java.util.concurrent.Callable;
import java.util.concurrent.Delayed;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.FutureTask;
import java.util.concurrent.RunnableScheduledFuture;
import java.util.concurrent.ScheduledThreadPoolExecutor;
import java.util.concurrent.ThreadFactory;
import java.util.concurrent.TimeUnit;
/**
* @author Mike Herzog, 2009
*/
public class ExceptionHandlingExecuterService extends ScheduledThreadPoolExecutor {
/** My ExceptionHandler */
private final UncaughtExceptionHandler exceptionHandler;
/**
* Encapsulating a task and enable exception handling.
* <p>
* <i>NB:</i> We need this since {@link ExecutorService}s ignore the
* {@link UncaughtExceptionHandler} of the {@link ThreadFactory}.
*
* @param <V> The result type returned by this FutureTask's get method.
*/
private class ExceptionHandlingFutureTask<V> extends FutureTask<V> implements RunnableScheduledFuture<V> {
/** Encapsulated Task */
private final RunnableScheduledFuture<V> task;
/**
* Encapsulate a {@link Callable}.
*
* @param callable
* @param task
*/
public ExceptionHandlingFutureTask(Callable<V> callable, RunnableScheduledFuture<V> task) {
super(callable);
this.task = task;
}
/**
* Encapsulate a {@link Runnable}.
*
* @param runnable
* @param result
* @param task
*/
public ExceptionHandlingFutureTask(Runnable runnable, RunnableScheduledFuture<V> task) {
super(runnable, null);
this.task = task;
}
/*
* (non-Javadoc)
* @see java.util.concurrent.FutureTask#done() The actual exception
* handling magic.
*/
@Override
protected void done() {
// super.done(); // does nothing
try {
get();
} catch (ExecutionException e) {
if (exceptionHandler != null) {
exceptionHandler.uncaughtException(null, e.getCause());
}
} catch (Exception e) {
// never mind cancelation or interruption...
}
}
@Override
public boolean isPeriodic() {
return this.task.isPeriodic();
}
@Override
public long getDelay(TimeUnit unit) {
return task.getDelay(unit);
}
@Override
public int compareTo(Delayed other) {
return task.compareTo(other);
}
}
/**
* @param corePoolSize The number of threads to keep in the pool, even if
* they are idle.
* @param eh Receiver for unhandled exceptions. <i>NB:</i> The thread
* reference will always be <code>null</code>.
*/
public ExceptionHandlingExecuterService(int corePoolSize, UncaughtExceptionHandler eh) {
super(corePoolSize);
this.exceptionHandler = eh;
}
@Override
protected <V> RunnableScheduledFuture<V> decorateTask(Callable<V> callable, RunnableScheduledFuture<V> task) {
return new ExceptionHandlingFutureTask<V>(callable, task);
}
@Override
protected <V> RunnableScheduledFuture<V> decorateTask(Runnable runnable, RunnableScheduledFuture<V> task) {
return new ExceptionHandlingFutureTask<V>(runnable, task);
}
}
In addition to Thilo's answer: I've written a post about this behavior, if one wants to have it explained a little more verbosely: https://ewirch.github.io/2013/12/a-executor-is-not-a-thread.html.
Here is an excerpt from the article:
A Thread is generally capable of processing only one Runnable. When the Thread.run() method exits, the Thread dies. The ThreadPoolExecutor implements a trick to make a Thread process multiple Runnables: it uses its own Runnable implementation. The threads are started with a Runnable implementation which fetches other Runnables (your Runnables) from the ExecutorService and executes them: ThreadPoolExecutor -> Thread -> Worker -> YourRunnable. When an uncaught exception occurs in your Runnable implementation, it ends up in the finally block of Worker.run(). In this finally block the Worker class tells the ThreadPoolExecutor that it “finished” the work. The exception has not yet arrived at the Thread class, but the ThreadPoolExecutor has already registered the worker as idle.
And here's where the fun begins. The awaitTermination() method will be invoked when all Runnables have been passed to the Executor. This happens very quickly, so probably none of the Runnables has finished its work yet. A Worker will switch to "idle" if an exception occurs, before the exception reaches the Thread class. If the situation is similar for the other threads (or if they finished their work), all Workers signal "idle" and awaitTermination() returns. The main thread reaches the code line where it checks the size of the collected exception list. And this may happen before any (or some) of the Threads had the chance to call the UncaughtExceptionHandler. It depends on the order of execution whether, or how many, exceptions will be added to the list of uncaught exceptions before the main thread reads it.
A very unexpected behavior. But I won’t leave you without a working solution. So let’s make it work.
We are lucky that the ThreadPoolExecutor class was designed for extensibility. There is an empty protected method afterExecute(Runnable r, Throwable t). This will be invoked directly after the run() method of our Runnable, before the worker signals that it has finished the work. The correct solution is to extend the ThreadPoolExecutor to handle uncaught exceptions:
public class ExceptionAwareThreadPoolExecutor extends ThreadPoolExecutor {
private final List<Throwable> uncaughtExceptions =
Collections.synchronizedList(new LinkedList<Throwable>());
@Override
protected void afterExecute(final Runnable r, final Throwable t) {
if (t != null) uncaughtExceptions.add(t);
}
public List<Throwable> getUncaughtExceptions() {
return Collections.unmodifiableList(uncaughtExceptions);
}
}
There is a little bit of a workaround.
In your run method, you can catch every exception, and later on do something like this (ex: in a finally block)
Thread.getDefaultUncaughtExceptionHandler().uncaughtException(Thread.currentThread(), ex);
//or, same effect:
Thread.currentThread().getUncaughtExceptionHandler().uncaughtException(Thread.currentThread(), ex);
This will "ensure a firing" of the current exception, delivering it to your UncaughtExceptionHandler (or to the default uncaught exception handler).
You can also always rethrow caught exceptions so the pool worker sees them.
