I have a multi-threaded, multi-object system with a manager and multiple workers. I need to synchronize the workers with the manager, like this:
The manager does something, gives the order to the workers, and then lets the workers run in parallel, independently of each other. When they finish this round, they must wait for the manager to give them a new task or order. The manager issues the new order only when all the workers have finished their previous job.
I need to implement this with threads and avoid busy-waiting. However, the synchronization is confusing.
Any ideas?
EDIT: I missed an important part: new tasks should arrive only when all workers have finished. Therefore a LinkedBlockingQueue alone is not the best solution. I recommend the CyclicBarrier that boris-the-spider has recommended.
You can use a LinkedBlockingQueue.
Set a fixed capacity.
The manager can put tasks, and the workers can call take(), which blocks until a task is available; a minimal sketch follows.
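For what it's worth, here is a small self-contained sketch of that hand-off (the class name QueueHandOff, the capacity, and the demo task strings are made up for illustration):
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class QueueHandOff {
    public static void main(String[] args) {
        BlockingQueue<String> tasks = new LinkedBlockingQueue<>(4); // fixed capacity

        // Worker: blocks on take() until the manager puts something.
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    String task = tasks.take();
                    System.out.println("working on " + task);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        worker.setDaemon(true);
        worker.start();

        // Manager: put() blocks if the queue is full.
        try {
            for (int i = 0; i < 4; i++) {
                tasks.put("task-" + i);
            }
            Thread.sleep(500); // demo only: give the worker time to drain the queue
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}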
As @boristhespider suggested, I used a CyclicBarrier for both the manager and the workers.
After each worker finishes its task, it calls barrier.await(). The manager then checks whether barrier.getNumberWaiting() == NumWorkers; if it is true, it updates the tasks of each worker and then calls barrier.await() itself.
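For reference, here is a minimal sketch of a closely related variant, where the barrier has numWorkers + 1 parties so the manager simply awaits alongside the workers instead of polling getNumberWaiting() (all names are illustrative):
import java.util.concurrent.CyclicBarrier;

public class BarrierRounds {
    static final int NUM_WORKERS = 3;
    // NUM_WORKERS worker parties plus 1 manager party
    static final CyclicBarrier barrier = new CyclicBarrier(NUM_WORKERS + 1);

    public static void main(String[] args) throws Exception {
        for (int i = 0; i < NUM_WORKERS; i++) {
            final int id = i;
            new Thread(() -> {
                try {
                    while (true) {
                        barrier.await();   // wait for the manager to hand out work
                        System.out.println("worker " + id + " running");
                        barrier.await();   // signal that this round is finished
                    }
                } catch (Exception e) {
                    Thread.currentThread().interrupt();
                }
            }).start();
        }

        for (int round = 0; round < 2; round++) {
            System.out.println("manager: assigning round " + round);
            barrier.await();   // release the workers for this round
            barrier.await();   // block until every worker has finished the round
        }
        System.exit(0);        // demo only: stop the endlessly looping workers
    }
}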
Maintain 2 blocking queues:
1. one for tasks
2. one for free workers
Let each worker notify the manager via a callback, which adds it to the free-worker queue.
Inside the manager thread you can then check for available workers.
Quick implementation:
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
public class ManagerWorker {
public static void main(String[] args) {
ExecutorService service = Executors.newCachedThreadPool();
BlockingQueue<String> taskQueue = new LinkedBlockingQueue<>();
Manager m = new Manager(5, taskQueue);
service.submit(m);
for (int i = 0; i < 5; i++) {
service.submit(new Worker(m, taskQueue));
}
}
}
class Manager implements Runnable {
int workerCount;
BlockingQueue<Worker> workerqueue = new LinkedBlockingQueue<>();
BlockingQueue<String> taskQueue;
public Manager(int workerCount, BlockingQueue<String> taskQueue) {
this.workerCount = workerCount;
this.taskQueue = taskQueue;
}
public void callBackForFreeNotification(Worker worker) {
workerqueue.add(worker);
}
@Override
public void run() {
while (true) {
try {
int i = 0;
while (i < workerCount) {
workerqueue.take();
i++;
}
System.out.println("Manager Worker available");
// add task to task queue here
for (int j = 0; j < workerCount; j++) {
taskQueue.add("task");
}
System.out.println("Manager task added");
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}
}
class Worker implements Runnable {
private Manager manager;
private BlockingQueue<String> taskQueue;
public Worker(Manager manager, BlockingQueue<String> taskqueue) {
this.manager = manager;
this.taskQueue = taskqueue;
}
@Override
public void run() {
while(true){
try {
System.out.println("Worker - i have no work");
manager.callBackForFreeNotification(this);
taskQueue.take();
System.out.println("Worker working");
Thread.sleep(2000);
System.out.println("Worker Done with work");
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}
}
Related
I have a need to run some threads concurrently, but need to force each process to run in a new Thread (this is due to some ThreadLocal bleed that I don't have full control over). To do so, I have been using the SimpleAsyncTaskExecutor. However, the issue with this is that it doesn't maintain a queue that allows new tasks to be submitted once it's reached the concurrency limit. What I really need to do is have functionality like the SimpleAsyncTaskExecutor but where tasks can still be submitted even after the concurrency limit has been reached - I just want those tasks to wait in the queue until another slot frees up. This is what I have right now:
SimpleAsyncTaskExecutor taskExecutor = new SimpleAsyncTaskExecutor();
taskExecutor.setConcurrencyLimit(maxThreads);
return taskExecutor;
Is there some out-of-the-box solution for this, or do I need to write something custom?
Since you need to execute every task in a new Thread, you are basically ruling out the use of any thread pool (ThreadLocal behavior in a thread pool is something you need to get rid of sooner or later).
To overcome this, you can simply write something like this:
class ThreadPerTaskExecutor implements Executor {
public void execute(Runnable r) {
Thread t = new Thread(r);
t.start();
try {
t.join();
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}
which always executes the Runnable in a new Thread.
As a crude implementation, we can do something like this:
final Executor executor = new ThreadPerTaskExecutor();
final ExecutorService service = Executors.newFixedThreadPool(3);
for (int i = 0; i < 100; i++) {
service.submit(new Runnable() {
public void run() {
try {
System.out.println("Executed inside Thread pool with concurrency level 3"
+ Thread.currentThread().toString());
executor.execute(new Runnable() {
public void run() {
try {
Thread.sleep(3000); //Some expensive operations here.
System.out.println(
"Executed inside new Thread always" + Thread.currentThread().toString());
} catch (InterruptedException e) {
e.printStackTrace();
}
}
});
} catch (Exception e) {
e.printStackTrace();
}
}
});
}
This can be written more concisely with lambdas on Java 8 and later, as sketched below. Hope this conveys the basic idea.
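For illustration, here is the same submission loop expressed with lambdas; it reuses the ThreadPerTaskExecutor defined above, and everything else is unchanged:
// Same idea as the crude implementation above, written with Java 8 lambdas.
final Executor executor = new ThreadPerTaskExecutor();
final ExecutorService service = Executors.newFixedThreadPool(3);
for (int i = 0; i < 100; i++) {
    service.submit(() -> {
        System.out.println("Executed inside thread pool with concurrency level 3: "
                + Thread.currentThread());
        executor.execute(() -> {
            try {
                Thread.sleep(3000); // some expensive operation here
                System.out.println("Executed inside a new thread every time: "
                        + Thread.currentThread());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
    });
}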
Is there some out-of-the-box solution for this, or do I need to write something custom?
I think there is no out-of-the-box solution for this; you need to write your own code.
You can extend SimpleAsyncTaskExecutor for a simpler/quicker implementation. Example:
import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.atomic.AtomicInteger;
import org.springframework.core.task.SimpleAsyncTaskExecutor;
import org.springframework.util.Assert;
public class SimpleAsyncQueueTaskExecutor extends SimpleAsyncTaskExecutor {
private Queue<Runnable> queue = new ConcurrentLinkedQueue<Runnable>();
private AtomicInteger concurrencyValue = new AtomicInteger(0);
private void checkAndExecuteFromQueue() {
int count = concurrencyValue.get();
if (isThrottleActive() && !queue.isEmpty() &&
(count < getConcurrencyLimit())) {
Runnable task = queue.poll();
concurrencyValue.incrementAndGet();
doExecute(new ConcurrencyThrottlingRunnable(task));
}
}
private void afterExecute(Runnable task) {
queue.remove(task);
concurrencyValue.decrementAndGet();
// Check and execute other tasks
checkAndExecuteFromQueue();
}
@Override
public void execute(Runnable task, long startTimeout) {
Assert.notNull(task, "Runnable must not be null");
if (isThrottleActive() && startTimeout > TIMEOUT_IMMEDIATE) {
queue.offer(task);
checkAndExecuteFromQueue();
} else {
doExecute(task);
}
}
private class ConcurrencyThrottlingRunnable implements Runnable {
private final Runnable target;
public ConcurrencyThrottlingRunnable(Runnable target) {
this.target = target;
}
@Override
public void run() {
try {
this.target.run();
}
finally {
afterExecute(this.target);
}
}
}
}
This example code just adds a queue and overrides the execute method.
Hope this helps.
I have a loop that assigns tasks to an ExecutorService backed by a fixed-size thread pool. I want the main program to wait for the pool to free one of its threads before assigning another task to it.
Here is my sample code. In this sample I want finished! to be printed at the end, and I want to use an ExecutorService.
public static void main(String[] args) {
ExecutorService ex = Executors.newFixedThreadPool(3);
for(int i=0; i< 100; i++) {
ex.execute(new TestThread(i)); // I want the program wait here for at least one thread to free
}
System.out.println("finished!");
}
private static class TestThread implements Runnable {
private int i;
public TestThread(int i) {
this.i = i;
}
@Override
public void run() {
System.out.println("hi: " + i);
try {
Thread.sleep(5000);
} catch (InterruptedException e) {
e.printStackTrace();
}
}
}
I understand that you want the thread submitting a job to block when there is no free, readily available worker thread in the executor service. This can be useful to apply back-pressure.
At its core, the executor service is "simply" composed of a queue of runnables and a pool of worker threads.
You can obtain this behaviour by building an executor service with a work queue of fixed size (in your case, size one).
In code (note that your caller thread will still continue after submitting the last job; it will not wait for that job to be completed):
package stackOv;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
public class BackPressure {
public static void main(String[] args) {
// this is the backing work queue; in this case, it is of bounded size
ArrayBlockingQueue<Runnable> q = new ArrayBlockingQueue<>(1);
ExecutorService ex = new ThreadPoolExecutor(3, 3, 30, TimeUnit.SECONDS, q,
new ThreadPoolExecutor.CallerRunsPolicy());
for(int i=0; i< 100; i++) {
ex.execute(new TestWork(i));
}
System.out.println("finished!");
}
private static class TestWork implements Runnable {
private int i;
public TestWork(int i) {
this.i = i;
}
@Override
public void run() {
System.out.println("hi: " + i);
try {
Thread.sleep(100);
} catch (InterruptedException e) { e.printStackTrace(); }
}
}
}
All you need is to shut the pool down after the loop and wait for it to terminate:
ex.shutdown();
ex.awaitTermination(Long.MAX_VALUE, TimeUnit.SECONDS);
Multiple workers are processing from a queue. When a database failure occurs, a worker contacts a supervisor that locks all worker threads and polls the database at an interval until it is back up, then releases the threads so they can continue processing. The worker threads can either advance or wait, and the supervisor thread can lock or unlock.
I was thinking of an interface like the one below. What synchronization primitives would you use? Actors would be a good solution, but I don't have the time for a rewrite.
public interface Latch {
/**
* This method will cause a thread(s) to only advance if the latch is in an open state. If the
* latch is closed the thread(s) will wait until the latch is open before they can advance.
*/
void advanceWhenOpen();
/**
* Close the latch forcing all threads that reaches the latch's advance method to wait until
* its open again.
*/
void close();
/**
* Opens the latch allowing blocked threads to advance.
*/
void open();
boolean isOpen();
}
What you want is not really a "latch" - at least the "Java Concurrency in Practice" book says that "Once the latch reaches the terminal state, it cannot change state again, so it remains open forever."
But you can use CountDownLatch objects in the background - whenever your "Latch" needs to be closed, you can create a new CountDownLatch object with a count of one and await() on it in your advanceWhenOpen(). I think that from a readability point of view this would be the best solution.
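A minimal sketch of that idea, implementing the Latch interface from the question (the AtomicReference swap and the class name CountDownGate are my own additions to keep open/close thread-safe; treat it as an assumption, not the definitive implementation):
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.atomic.AtomicReference;

// Sketch: a reusable gate built from disposable CountDownLatch instances.
public class CountDownGate implements Latch {
    // An already-released latch means "open".
    private final AtomicReference<CountDownLatch> gate =
            new AtomicReference<>(new CountDownLatch(0));

    @Override
    public void advanceWhenOpen() {
        try {
            gate.get().await();          // returns immediately while open
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    @Override
    public void close() {
        // Replace the released latch with a fresh one so await() blocks again.
        if (isOpen()) {
            gate.set(new CountDownLatch(1));
        }
    }

    @Override
    public void open() {
        gate.get().countDown();          // releases every thread waiting in advanceWhenOpen()
    }

    @Override
    public boolean isOpen() {
        return gate.get().getCount() == 0L;
    }
}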
I would use a ReadWriteLock as the synchronization primitive for this purpose. The advantage of a read/write lock over a simple monitor or mutex is that multiple threads can hold the read lock at any given time. This is advantageous when you have lots of readers (e.g. your thread pool in this case) and only one or a few writers (e.g. the thread checking for open/close of the database).
With a single monitor or mutex, your threads will serialize on the one lock, making that section of code contentious.
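Here is a rough sketch of how the Latch interface from the question could sit on top of a ReentrantReadWriteLock, assuming close() and open() are always called by the same supervisor thread (the write lock is owner-bound, so this is a real constraint of the sketch):
import java.util.concurrent.locks.ReentrantReadWriteLock;

// Sketch: workers "pass through" the read lock; the supervisor holds the write lock while closed.
public class ReadWriteLatch implements Latch {
    private final ReentrantReadWriteLock lock = new ReentrantReadWriteLock();
    private volatile boolean open = true;

    @Override
    public void advanceWhenOpen() {
        lock.readLock().lock();    // blocks while the supervisor holds the write lock
        lock.readLock().unlock();  // release immediately; we only wanted to wait for "open"
    }

    @Override
    public void close() {
        if (open) {
            lock.writeLock().lock();   // keeps all readers (workers) out
            open = false;
        }
    }

    @Override
    public void open() {
        if (!open) {
            open = true;
            lock.writeLock().unlock(); // lets the waiting workers advance
        }
    }

    @Override
    public boolean isOpen() {
        return open;
    }
}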
One option is to proxy the queue to make it pausable when the database is unavailable. Workers can check the paused state of the queue while processing and, if necessary, wait for it to unpause. A basic code demonstration:
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.atomic.AtomicReference;
public class PausableQueue<T> {
LinkedBlockingQueue<T> q = new LinkedBlockingQueue<T>();
AtomicReference<CountDownLatch> pause = new AtomicReference<CountDownLatch>(new CountDownLatch(0));
public T take() throws InterruptedException {
awaitPause();
return q.take();
}
public void awaitPause() throws InterruptedException {
pause.get().await();
}
public void setPaused(boolean paused) {
if (paused) {
// only update if there are no threads waiting on current countdown-latch
if (!isPaused()) {
pause.set(new CountDownLatch(1));
}
} else {
pause.get().countDown();
}
}
public boolean isPaused() {
return (pause.get().getCount() > 0L);
}
/* *** Test the pausable queue *** */
public static void main(String[] args) {
ExecutorService executor = Executors.newCachedThreadPool();
try {
testPause(executor);
} catch (Exception e) {
e.printStackTrace();
}
executor.shutdownNow();
}
private static void testPause(ExecutorService executor) throws Exception {
final PausableQueue<Object> q = new PausableQueue<Object>();
for (int i = 0; i < 3; i++) {
q.q.add(new Object());
}
final CountDownLatch tfinished = new CountDownLatch(1);
Runnable taker = new Runnable() {
@Override
public void run() {
println("Taking an object.");
try {
Object o = q.take();
println("Got an object: " + o);
} catch (Exception e) {
e.printStackTrace();
} finally {
tfinished.countDown();
}
}
};
executor.execute(taker);
tfinished.await();
final CountDownLatch tstarted2 = new CountDownLatch(2);
final CountDownLatch tfinished2 = new CountDownLatch(2);
taker = new Runnable() {
@Override
public void run() {
println("Taking an object.");
tstarted2.countDown();
try {
Object o = q.take();
println("Got an object: " + o);
} catch (Exception e) {
e.printStackTrace();
} finally {
tfinished2.countDown();
}
}
};
q.setPaused(true);
println("Queue paused");
executor.execute(taker);
executor.execute(taker);
tstarted2.await();
// Pause to show workers pause too
Thread.sleep(100L);
println("Queue unpausing");
q.setPaused(false);
tfinished2.await();
// "Got an object" should show a delay of at least 100 ms.
}
private static void println(String s) {
System.out.println(System.currentTimeMillis() + " - " + s);
}
}
I wrote a producer/consumer based program using Java's BlockingQueue. I'm trying to find a way to stop the consumer if all producers are done. There are multiple producers, but only one consumer.
I found several solutions for the "one producer, many consumers" scenario, e.g. using a "done packet / poison pill" (see this discussion), but my scenario is just the opposite.
Are there any best practice solutions?
The best-practice approach is to use a count-down latch. Whether this works for you is the more interesting question...
Perhaps each producer should register and deregister with the consumer, and when all producers are deregistered (and the queue is empty) then the consumer can terminate too.
Presumably your producers are working in different threads in the same VM, and they exit when done. I would make another thread that calls join() on all the producers in a loop; when it exits that loop (because all the producer threads have ended), it notifies the consumer that it's time to exit. This has to run in a separate thread because the join() calls will block. Incidentally, if I understand it correctly, rolfl's suggestion of using a count-down latch would have the same problem. A sketch of this follows.
Alternatively, if the producers are Callables, the consumer can call isDone() and isCancelled() on their Futures in its loop, which won't block, so it can be used right in the consumer thread.
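A rough sketch of the join-then-signal idea, using a poison-pill object to tell the single consumer to stop (the STOP sentinel, the ProducerWatcher name, and the String queue type are illustrative assumptions):
import java.util.List;
import java.util.concurrent.BlockingQueue;

// Sketch: a watcher thread joins every producer, then hands the consumer a poison pill.
class ProducerWatcher implements Runnable {
    static final String STOP = "STOP";            // poison pill understood by the consumer

    private final List<Thread> producers;
    private final BlockingQueue<String> queue;

    ProducerWatcher(List<Thread> producers, BlockingQueue<String> queue) {
        this.producers = producers;
        this.queue = queue;
    }

    @Override
    public void run() {
        try {
            for (Thread producer : producers) {
                producer.join();                  // blocks until this producer has exited
            }
            queue.put(STOP);                      // all producers done: tell the consumer to stop
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}
The consumer then simply breaks out of its take() loop when it sees the STOP sentinel.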
You could use something like the following; I use registerProducer() and unregisterProducer() to keep track of the producers. Another possible solution could make use of WeakReferences.
It's worth mentioning that this solution will not consume the events that are still queued when the consumer is shut down, so some events may be lost at shutdown.
You would have to drain the queue if the consumer gets interrupted and then process the remaining events.
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicInteger;
public class TestConsumerShutdown {
private static interface SomeEvent {
String getName();
}
private static class Consumer implements Runnable {
private final BlockingQueue<SomeEvent> queue = new ArrayBlockingQueue<>(10);
private final ExecutorService consumerExecutor = Executors.newSingleThreadExecutor();
private final AtomicBoolean isRunning = new AtomicBoolean();
private final AtomicInteger numberProducers = new AtomicInteger(0);
public void startConsumer() {
consumerExecutor.execute(this);
}
public void stopConsumer() {
consumerExecutor.shutdownNow();
try {
consumerExecutor.awaitTermination(Long.MAX_VALUE, TimeUnit.SECONDS);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
}
public void registerProducer() {
numberProducers.incrementAndGet();
}
public void unregisterProducer() {
if (numberProducers.decrementAndGet() < 1) {
stopConsumer();
}
}
public void produceEvent(SomeEvent event) throws InterruptedException {
queue.put(event);
}
@Override
public void run() {
if (isRunning.compareAndSet(false, true)) {
try {
while (!Thread.currentThread().isInterrupted()) {
SomeEvent event = queue.take();
System.out.println(event.getName());
}
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
} finally {
System.out.println("Consumer stopped.");
isRunning.set(false);
}
}
}
}
public static void main(String[] args) {
final Consumer consumer = new Consumer();
consumer.startConsumer();
final Runnable producerRunnable = new Runnable() {
@Override
public void run() {
final String name = Thread.currentThread().getName();
consumer.registerProducer();
try {
for (int i = 0; i < 10; i++) {
consumer.produceEvent(new SomeEvent() {
@Override
public String getName() {
return name;
}
});
}
System.out.println("Produver " + name + " stopped.");
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
} finally {
consumer.unregisterProducer();
}
}
};
List<Thread> producers = new ArrayList<>();
producers.add(new Thread(producerRunnable, "producer-1"));
producers.add(new Thread(producerRunnable, "producer-2"));
producers.add(new Thread(producerRunnable, "producer-3"));
for (Thread t : producers) {
t.start();
}
}
}
I have a thread which must wait for several objects from different threads.
@Override
public void run() {
while (true) {
for (BackgroundTask task : tasks) {
synchronized (task) {
if (task.isReady()) {
task.doTask();
}
}
}
}
}
But this busy-waits and is a wasteful use of CPU time.
How can I wait on several objects?
IMO CountDownLatch would be a good way of going about it. Quoting from the Javadoc:
class Driver2 { // ...
void main() throws InterruptedException {
CountDownLatch doneSignal = new CountDownLatch(N);
Executor e = ...
for (int i = 0; i < N; ++i) // create and start threads
e.execute(new WorkerRunnable(doneSignal, i));
doneSignal.await(); // wait for all to finish
}
}
class WorkerRunnable implements Runnable {
private final CountDownLatch doneSignal;
private final int i;
WorkerRunnable(CountDownLatch doneSignal, int i) {
this.doneSignal = doneSignal;
this.i = i;
}
public void run() {
try {
doWork(i);
doneSignal.countDown();
} catch (InterruptedException ex) {} // return;
}
void doWork(int i) throws InterruptedException { ... }
}
Please use notifyAll() instead of notify(), because notify() wakes up a single thread whereas notifyAll() wakes up all the waiting threads.
If you can modify the BackgroundTask class, have it notify your runner when it is ready. Add a queue to your runner class, and each time a task is ready, it can add itself to the queue and notify it.
The runner class then waits on the queue when it is empty, and pulls items out of it to run when it is not.
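A minimal sketch of that hand-off, using a BlockingQueue so the runner blocks instead of polling (the onTaskReady hook and the ReadyQueueRunner name are illustrative assumptions; BackgroundTask is the class from the question):
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch: tasks enqueue themselves when ready; the runner blocks on take() instead of polling.
public class ReadyQueueRunner implements Runnable {
    private final BlockingQueue<BackgroundTask> readyTasks = new LinkedBlockingQueue<>();

    // Called by a BackgroundTask (from its own thread) once it becomes ready.
    public void onTaskReady(BackgroundTask task) {
        readyTasks.add(task);
    }

    @Override
    public void run() {
        try {
            while (true) {
                BackgroundTask task = readyTasks.take(); // blocks until some task is ready
                task.doTask();
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }
}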
You can utilize notify() and wait() on the Object. How you use it depends on the structure of your program.
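For completeness, here is a bare-bones guarded-block sketch of wait()/notifyAll() (the ready flag and class name are just illustrative):
// Sketch: the classic guarded block. Always wait() in a loop, on the same monitor you notify on.
public class Guard {
    private final Object monitor = new Object();
    private boolean ready = false;

    public void awaitReady() throws InterruptedException {
        synchronized (monitor) {
            while (!ready) {        // guard against spurious wakeups
                monitor.wait();
            }
        }
    }

    public void signalReady() {
        synchronized (monitor) {
            ready = true;
            monitor.notifyAll();    // wake every waiting thread, as suggested above
        }
    }
}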