I want to implement a variant of the publisher/subscriber pattern in Java and I am running out of ideas.
There is 1 publisher and N subscribers. The publisher publishes objects, and each subscriber needs to process each object once and only once, in the correct order. The publisher and each subscriber run in their own thread.
In my original implementation, each subscriber has its own blocking queue, and the publisher puts objects into each subscriber's queue. This works OK, but the publisher blocks if any subscriber's queue is full. This degrades performance, since each subscriber takes a different amount of time to process an object.
In another implementation, the publisher holds the objects in its own queue. Each object carries an AtomicInteger counter initialized to the number of subscribers. Each subscriber peeks at the head of the queue and decrements the counter, and the object is removed from the queue when the counter reaches zero.
This way the publisher never blocks, but now the subscribers have to wait for each other to process an object and remove it from the queue before the next object can be peeked at.
Is there any better way to do this? I assume this is a fairly common pattern.
Your "many queues" implementation is the way to go. I don't think you need to be concerned about one full queue blocking the producer, because the overall time to completion won't be affected.

Say you have three consumers: two consume at a rate of 1 per second and the third consumes at a rate of 1 per five seconds, while the producer produces at a rate of 1 per two seconds. Eventually the third queue will fill up, so the producer will block on it and also stop putting items into the first and second queues. There are ways around this, but they're not going to change the fact that the third consumer will always be the bottleneck.

If you're producing/consuming 100 items, this will take at least 500 seconds because of the third consumer (5 seconds times 100 items), and that is the case whether the first and second consumers finish after 200 seconds (because you've done something clever to let the producer keep filling their queues even after the third queue is full) or after 500 seconds (because the producer blocked on the third queue).
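For illustration, the many-queues design might be sketched like this (a minimal sketch; class names, queue capacity, and the sleep are illustrative, not from the question):

```java
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Sketch of the per-subscriber queue design: each subscriber drains its
// own bounded queue; the publisher blocks only when a queue is full.
public class ManyQueuesDemo {
    static class Subscriber implements Runnable {
        final BlockingQueue<String> queue = new ArrayBlockingQueue<>(16);
        public void run() {
            try {
                while (true) {
                    String item = queue.take();   // blocks until an item arrives
                    System.out.println(Thread.currentThread().getName() + " got " + item);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }

    public static void main(String[] args) throws InterruptedException {
        List<Subscriber> subscribers = List.of(new Subscriber(), new Subscriber());
        subscribers.forEach(s -> {
            Thread t = new Thread(s);
            t.setDaemon(true);   // demo only: let the JVM exit when main ends
            t.start();
        });

        for (int i = 0; i < 3; i++) {
            for (Subscriber s : subscribers) {
                s.queue.put("item-" + i);   // blocks only if this queue is full
            }
        }
        Thread.sleep(300);   // demo only: give consumers time to drain
    }
}
```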
Definitely,

"each subscriber has its own blocking queue, and the publisher puts objects into each subscriber's queue"

is the way to go.
You can use a threaded approach to put items in the queues, so that if one queue is full, the publisher will not wait.
For example: s1, s2, s3 are subscribers, and addToQueue is a method on each subscriber that adds a message to its corresponding queue.
addToQueue blocks until the queue is non-full, ideally in synchronized code.
In the publisher you can then do something similar to the code below.
NOTE: the code might not work as-is, but it should give you the idea.
List<Subscriber> slist; // assume it is initialised

public void publish(final String message) {
    for (final Subscriber s : slist) {
        Thread t = new Thread(new Runnable() {
            public void run() {
                // blocks this helper thread if the queue is full, not the publisher
                s.addToQueue(message);
            }
        });
        t.start();
    }
}
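One caveat with a new thread per message: thread scheduling can deliver messages to a subscriber's queue out of order, and creating a thread per message is expensive. A sketch of an alternative that keeps the publisher non-blocking while preserving per-subscriber ordering is a single-threaded executor per subscriber (names here are illustrative, not from the answer):

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Sketch: one single-threaded executor per subscriber keeps delivery
// order intact while the publisher never blocks on a slow subscriber.
public class OrderedPublisher {
    interface Subscriber { void addToQueue(String message); }

    private final List<Subscriber> slist;
    private final List<ExecutorService> executors;

    OrderedPublisher(List<Subscriber> slist) {
        this.slist = slist;
        this.executors = slist.stream()
                .map(s -> Executors.newSingleThreadExecutor())
                .toList();
    }

    public void publish(final String message) {
        for (int i = 0; i < slist.size(); i++) {
            final Subscriber s = slist.get(i);
            // Tasks for one subscriber run sequentially on its own executor,
            // so per-subscriber order is preserved.
            executors.get(i).submit(() -> s.addToQueue(message));
        }
    }

    public void shutdown() {
        executors.forEach(ExecutorService::shutdown);
    }
}
```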
There is 1 publisher and N subscribers. The publisher publishes objects, and each subscriber needs to process each object once and only once, in the correct order. The publisher and each subscriber run in their own thread.
I would change this architecture. I initially considered the queue per subscriber but I don't like that mechanism. For example, if the first subscriber takes longer to run, all of the jobs will end up in that queue and you will only be doing 1 thread of work.
Since you have to run the subscribers in order, I'd have a pool of threads which run each message through all of the subscribers. The calls to the subscribers will need to be reentrant which may not be possible.
So you would have a pool of 10 threads (let's say) and each one dequeues from the publisher's queue, and does something like the following:
public void run() {
    while (!shutdown && !Thread.currentThread().isInterrupted()) {
        try {
            Article article = publisherQueue.take();
            for (Subscriber subscriber : subscriberList) {
                subscriber.process(article);
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // restore the flag and exit the loop
        }
    }
}
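Wiring that up might look like the following sketch (`Article`, `publisherQueue`, and `subscriberList` are stand-ins matching the snippet above; the pool size and sleep are illustrative). Note the trade-off: with more than one worker, different articles can be processed concurrently, so strict global ordering across articles is lost.

```java
import java.util.List;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.LinkedBlockingQueue;

// Sketch of the pooled design: each worker takes an article from the
// shared queue and runs it through every subscriber in order.
public class PooledDispatch {
    record Article(int id) {}
    interface Subscriber { void process(Article a); }

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<Article> publisherQueue = new LinkedBlockingQueue<>();
        List<Subscriber> subscriberList =
                List.of(a -> System.out.println("s1 saw " + a.id()),
                        a -> System.out.println("s2 saw " + a.id()));

        ExecutorService pool = Executors.newFixedThreadPool(10);
        for (int i = 0; i < 10; i++) {
            pool.submit(() -> {
                try {
                    while (!Thread.currentThread().isInterrupted()) {
                        Article article = publisherQueue.take();
                        for (Subscriber subscriber : subscriberList) {
                            subscriber.process(article);
                        }
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
        }

        publisherQueue.put(new Article(1));
        Thread.sleep(300);     // demo only: let a worker pick it up
        pool.shutdownNow();    // interrupts workers blocked in take()
    }
}
```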
I have a service class:

@Path("/")
public class ABC {
    @GET
    @Path("/process/{param}")
    public String processRequest(@PathParam("param") String param) {
        Thread t = new Thread(() -> {
            // Do some processing with the param parameter
            System.out.println("Processing started for -> " + param);
            // Do many more things:
            // DB transactions, etc.
        });
        t.start();
        return "Your request will be processed";
    }
}
I accept a parameter and start processing it in a new thread; the processing should complete within 30 seconds, so I immediately break the connection with the client by acknowledging that the request will be processed.
It works fine so far without any issues and can currently handle more than 5k requests. The problem starts when a lot of requests arrive at the same time, maybe more than 50k: my application creates a new thread for every request, which allocates a lot of memory and sometimes exhausts the JVM's memory.
Is there another way to start the processing immediately regardless of the number of requests, process all requests within 30 seconds, and also limit the number of active worker threads?
One approach I found is a producer-consumer implementation: I accept all requests and put them into the producer side, and my consumers pick up requests and process them. For this I need to specify the maximum number of requests the producer can accept (e.g. 100,000) and the number of consumers that process them (e.g. 1,000), so that only 1,000 threads are active, processing one request after another. The issue with this approach is that if any consumer (worker) thread locks up for some reason and is never released, only the remaining unlocked threads are left to process requests while the incoming backlog keeps growing in the producers. Increasing the number of consumers creates more worker threads, but more of them can end up locked on the same task as well.
Please let me know any other approach I could use.
Note: all requests should be processed within 30 seconds; otherwise the success criteria are not met.
You probably want a queueing mechanism, like RabbitMQ.
Your application will run like this:
request -> push to queue -> return ACK to client
queue -> worker threads
The queue consumption speed is determined by your worker threads' speed, so you will never exhaust your system.
Under load, lots of messages will be queued, while your workers reliably take messages from the queue and process them.
Your need is to serve a large number of (possibly concurrent) requests while also capping the number of threads spawned. Your best option is an ExecutorService, a managed thread pool where you can specify the pool size and submit multiple Runnable or Callable objects for processing.

ExecutorService executorService = Executors.newFixedThreadPool(10);

It's explained very well here: Thread Concurrency using ExecutorService in Java 8.
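A sketch of that idea for the scenario in the question, using a `ThreadPoolExecutor` with a bounded work queue so a flood of requests queues up (or is rejected) instead of exhausting memory. The pool size, queue capacity, and class name are illustrative:

```java
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class RequestProcessor {
    // At most 1000 worker threads; up to 100_000 requests may wait in the
    // queue; beyond that, submit() rejects instead of exhausting memory.
    private static final ThreadPoolExecutor POOL = new ThreadPoolExecutor(
            1000, 1000, 60L, TimeUnit.SECONDS,
            new LinkedBlockingQueue<>(100_000));

    public static String processRequest(final String param) {
        POOL.submit(() -> {
            // Do some processing with the param parameter
            System.out.println("Processing started for -> " + param);
        });
        return "Your request will be processed";
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(processRequest("42"));
        POOL.shutdown();                          // demo only
        POOL.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```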
You can use a queuing system to put requests in a queue, acknowledge the client that the request will be processed, and then process the queue later.
Let's say we have two threads that are connected by a ConcurrentLinkedQueue. What I want is something like a handler on the queue, so that one thread knows when the other thread has added something to the queue and can poll it. Is that possible?
Normally such a queue is used when there is at least one producer on one thread and at least one consumer on a different thread. Note, though, that ConcurrentLinkedQueue itself is non-blocking: its poll() simply returns null when the queue is empty. What you want is a BlockingQueue, such as LinkedBlockingQueue.
With a blocking queue, the consumer processes elements as soon as they are available; the read operation on the queue blocks, optionally for a limited amount of time. Depending on the application you can have a single producer and many consumers, or vice versa.
Blocking achieves exactly your requirement (the consumer thread knows when an element is inserted).
The fact that the consumer thread blocks is not a problem unless it is your main thread, or unless you are planning to run several hundred concurrent consumers.
So BlockingQueue#take() or BlockingQueue#poll(long timeout, TimeUnit unit) is your friend here, if you just run it on a dedicated thread.
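A minimal sketch of that: the consumer parks in take() and is woken the moment the producer inserts an element (class name and the one-second join are illustrative).

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class HandlerDemo {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>();

        Thread consumer = new Thread(() -> {
            try {
                // take() blocks until the producer inserts an element
                String item = queue.take();
                System.out.println("received " + item);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        queue.put("hello");   // wakes the blocked consumer
        consumer.join(1000);
    }
}
```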
On Android I have a normal consumer-producer scenario:

1. Different producer threads can add objects to the list at different times.
2. When a certain event (trigger event) happens, a consumer starts to take elements from the list (there is only one consumer thread).
3. When the list is empty, the consumer stops taking elements from the list.
4. As soon as the list is not empty, the consumer must take elements from the list.
5. The consumer must be fast at taking data: as soon as an element is inserted into the list by a producer, the consumer has to take it.
6. I have this scenario in a singleton, and I have to stop the thread only when the app shuts down.
7. One of the producers is sometimes the UI thread.

What type of synchronization and list do you suggest using? I would like to do this without wasting CPU load.
I'm scared of point 7: I don't want to block the UI thread for a long time.
EDIT: adding details for @Alex. I'm writing it in pseudocode:
Thread C producer: calls EventTracker.trackEvent(C)
UI producer: calls EventTracker.trackEvent(A)

EventTracker
{
    BlockingQueue<Event> blockingQueue

    trackEvent(Event x)
    {
        blockingQueue.offer(x, 500, ms)
    }

    Thread consumer
    {
        while (true) {
            Event p = blockingQueue.poll(100, ms)
        }
    }
}
If the timeout is triggered in trackEvent(A), the UI producer does not wait for long, but is the event "A" lost?
You could try a SEDA approach to this problem, with queuing and an implementation like a BlockingQueue.
In your case, the producers insert events into the queue using offer and the consumer takes them using poll (use the timeouts on those methods to exit the producer/consumer cleanly when the user quits the application).
Note that there are a few things to get right on the threading side when using this approach.
There's an example of the concept in the Android developer documentation.
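A sketch of that, based on the pseudocode above (the capacity and timeouts are illustrative). It also answers the question about event "A": a timed offer returns false when it gives up, so the event is dropped unless the caller handles the return value.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.TimeUnit;

public class EventTracker {
    private final BlockingQueue<String> blockingQueue = new LinkedBlockingQueue<>(1000);

    public EventTracker() {
        Thread consumer = new Thread(() -> {
            try {
                while (!Thread.currentThread().isInterrupted()) {
                    // Blocks up to 100 ms; returns null on timeout so the
                    // loop can re-check the interrupt flag.
                    String event = blockingQueue.poll(100, TimeUnit.MILLISECONDS);
                    if (event != null) {
                        System.out.println("consumed " + event);
                    }
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.setDaemon(true);
        consumer.start();
    }

    public boolean trackEvent(String event) throws InterruptedException {
        // false means the queue stayed full for 500 ms: the event is dropped
        return blockingQueue.offer(event, 500, TimeUnit.MILLISECONDS);
    }
}
```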
I'm working on a producer-consumer pattern that should work with a queue: as usual, a consumer thread and a producer thread, where the producer adds an item to the queue at certain intervals (every 3 to 5 seconds) and the consumer waits and processes an item as soon as the queue isn't empty.
As a requirement, the producer should and will produce items non-stop: even if the queue is full, it keeps producing. That's why I can't use bounded BlockingQueue implementations, as they either wait for the queue to have available space or throw an exception.
My current implementation is the following:

// consumer's Runnable
public void run() {
    while (true) {
        if (!queue.isEmpty()) {
            currentItem = queue.poll();
            process(currentItem);
        }
    }
}
This thread keeps looping (busy-waiting) even when no item has been produced by the producer thread.
How can the consumer be made to wait until the producer adds an item to the queue, and what is a good queue implementation with no cap limit?
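For what it's worth, an unbounded LinkedBlockingQueue addresses both points: put never blocks on capacity, and take parks the consumer until an item arrives, so the loop stops spinning. A minimal sketch (names and the sleep are illustrative):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class NoSpinConsumer {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> queue = new LinkedBlockingQueue<>(); // no cap limit

        Thread consumer = new Thread(() -> {
            try {
                while (!Thread.currentThread().isInterrupted()) {
                    String item = queue.take();   // parks until an item exists
                    System.out.println("processed " + item);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        queue.put("item-1");   // never blocks: the queue is unbounded
        Thread.sleep(200);     // demo only
        consumer.interrupt();  // wakes take() so the consumer can exit
        consumer.join();
    }
}
```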
n threads produce to a BlockingQueue.
When the queue is full, the consumer drains the queue and does some processing.
How should I decide between the following two implementation choices?
Choice A :
The consumer regularly polls the queue to check whether it is full, while all the writers wait (it is a blocking queue, after all).
Choice B :
I implement my own queue with a synchronized put method. Before putting the provided element, I test whether the queue is nearly full (capacity minus one element); if so, I put the element and notify my consumer (which was waiting).
The first solution is the easiest but involves polling, which annoys me.
The second solution is, in my opinion, more error-prone and requires more coding.
I would suggest writing a proxy queue that internally wraps a queue instance along with an Exchanger instance. The proxy methods delegate calls to the internal queue. When an add finds the internal queue full, exchange the filled internal queue with the consumer thread: the consumer hands back an empty queue in return for the filled one. The proxy then continues filling the empty queue while the consumer processes the filled one, so both activities run in parallel; they exchange again when both parties are ready.
class MyQueue<E> implements BlockingQueue<E> {
    Queue<E> internalQueue = ... // a bounded queue, so add() can throw IllegalStateException
    Exchanger<Queue<E>> exchanger;

    MyQueue(Exchanger<Queue<E>> ex) {
        this.exchanger = ex;
    }
    .
    .
    .
    public boolean add(E e) {
        try {
            return internalQueue.add(e);
        } catch (IllegalStateException ise) {
            // queue is full: swap the filled queue for the consumer's empty one
            try {
                internalQueue = exchanger.exchange(internalQueue);
            } catch (InterruptedException ie) {
                Thread.currentThread().interrupt();
                return false;
            }
        }
        return internalQueue.add(e);
    }
}
class Consumer implements Runnable {
    public void run() {
        Queue<Object> currentQueue = new LinkedList<>(); // starts empty
        while (...) {
            Object o = currentQueue.poll(); // poll() returns null instead of throwing when empty
            if (o == null) {
                try {
                    // hand back the emptied queue, receive the filled one
                    currentQueue = exchanger.exchange(currentQueue);
                } catch (InterruptedException ie) {
                    Thread.currentThread().interrupt();
                    return;
                }
                continue;
            }
            // cast and process the element
        }
    }
}
The second solution is obviously better, and it is not that complicated. You can inherit from or wrap any other BlockingQueue and override its offer() method as follows: call the "real" offer(); if it returns true, exit. Otherwise, trigger the worker thread and immediately call offer() with a timeout.
Here is the near-pseudocode:
public boolean offer(E e) {
    if (queue.offer(e)) {
        return true;
    }
    worker.doYourJob(); // trigger the worker so it dequeues at least one task
    try {
        // e.g. 20 sec. - enough for the worker to free a place in the queue
        return queue.offer(e, timeout, unit);
    } catch (InterruptedException ie) {
        Thread.currentThread().interrupt();
        return false;
    }
}
I don't know of an existing queue implementation that does what you need: consumers wait while the queue fills, and only when it is full do they drain it and start processing.
Your queue should block consumers until it becomes full. I think you need to override the drainTo() method so that it waits until the queue is full; your consumers then just call drainTo() and wait. No notification from producer to consumer is needed.
Use an observer pattern: have your consumers register as listeners with the queue. When a producer does a put, the queue decides whether to notify any listeners.
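A minimal sketch of that idea, assuming a bounded queue that notifies a registered listener once it fills up (the class and listener shape are hypothetical, not from the answer):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;
import java.util.function.Consumer;

// Hypothetical sketch: a bounded queue that notifies a listener when full.
public class NotifyingQueue<E> {
    private final BlockingQueue<E> queue;
    private final int capacity;
    private final Consumer<BlockingQueue<E>> fullListener;

    public NotifyingQueue(int capacity, Consumer<BlockingQueue<E>> fullListener) {
        this.capacity = capacity;
        this.queue = new LinkedBlockingQueue<>(capacity);
        this.fullListener = fullListener;
    }

    public void put(E e) throws InterruptedException {
        queue.put(e);
        // The queue, not the producer, decides when to notify the listener.
        if (queue.size() == capacity) {
            fullListener.accept(queue);
        }
    }
}
```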
I used a CountDownLatch, which is simple and works great.
Thanks for the other ideas :)