Manage Queue System in multiple Threads in Java

I have a queue of requests. There are two threads: one thread adds items to the queue, and the second gets the requests from the queue and executes them. So the second thread waits for the first thread to put a request in the list, which I am currently doing in a while loop. I don't think this is the best way to do it; it is CPU intensive. I can think of a way to notify the second thread whenever I add a request, but there is the problem that a request may not execute successfully, in which case I have to ask the second thread to execute the request again.
So is there any approach you can think of that will work?

Use one of the available blocking queues in Java: http://docs.oracle.com/javase/7/docs/api/java/util/concurrent/BlockingQueue.html
Busy waiting is indeed not recommended (unless you want to use your computer for heating).
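For example, something along these lines (an untested sketch; the Runnable request type and the re-queue-on-failure step are illustrative assumptions, not part of the original question):

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class RequestPipeline {
    // Thread-safe queue shared by the producer thread and the worker thread
    private final BlockingQueue<Runnable> requests = new LinkedBlockingQueue<>();

    // Called by the first thread whenever a new request arrives
    public void submit(Runnable request) throws InterruptedException {
        requests.put(request);
    }

    // Run by the second thread: take() blocks instead of busy-waiting
    public void workerLoop() throws InterruptedException {
        while (!Thread.currentThread().isInterrupted()) {
            Runnable request = requests.take(); // waits until a request is available
            try {
                request.run();
            } catch (RuntimeException e) {
                requests.put(request); // execution failed: re-queue it to be retried
            }
        }
    }
}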

You can make use of Semaphores to solve this problem.
The second thread, which is the worker thread, waits on the semaphore. Every time the first thread pushes new task info onto the queue structure, it also releases a permit on the semaphore, so the second thread can safely go and execute.
This may also need some synchronization along the way if there are multiple reader/writer threads.
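A rough sketch of that idea, assuming a single worker thread and a thread-safe queue (class and method names are illustrative):

import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.concurrent.Semaphore;

public class SemaphoreQueue {
    private final Queue<Runnable> tasks = new ConcurrentLinkedQueue<>();
    private final Semaphore available = new Semaphore(0); // one permit per queued task

    // Producer thread
    public void push(Runnable task) {
        tasks.add(task);
        available.release(); // signal the worker that a task is ready
    }

    // Worker thread
    public void workerLoop() throws InterruptedException {
        while (true) {
            available.acquire();          // blocks until at least one task was pushed
            Runnable task = tasks.poll(); // safe: a permit implies a queued task
            if (task != null) {
                task.run();
            }
        }
    }
}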

Related

JAVA waking up threads in specific order

Let's say that I have 10 active threads and only 3 resources (of something).
While the first three threads hold the resources, I want all other threads that try to get a resource to wait, but the wake-up/notify should happen in FIFO order, i.e. the first thread that started waiting will be the first to wake up.
Thank you all.
I think this link explains it quite well: https://www.safaribooksonline.com/library/view/java-threads-second/1565924185/ch04s03.html
When using notify() it is impossible to determine in advance which thread will be allowed to execute. I see two solutions to this:
Use notifyAll() and let each thread check for itself whether it is its turn (e.g. by using a synchronized FIFO queue).
Use the method described in the link: let each thread wait on a different object, and use one thread whose sole purpose is to notify the correct object. This seems like the best solution to me.
Java generally doesn't decide these things; however, if you use a fair lock, e.g.
Lock lock = new ReentrantLock(true);
then threads will acquire the lock in the order in which they requested it. This works regardless of the order in which threads would be notified, because the lock is not granted unless the requesting thread is next in the FIFO queue.
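For illustration, a minimal sketch of guarding the shared resource with a fair lock (for the 3-resource case in the question, a fair Semaphore such as new Semaphore(3, true) would give the same FIFO ordering across three permits; that substitution is my suggestion, not part of the answer above):

import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReentrantLock;

public class FairAccess {
    // true = fair: waiting threads acquire the lock in FIFO order
    private final Lock lock = new ReentrantLock(true);

    public void useResource() {
        lock.lock(); // threads queue up here and are granted the lock in arrival order
        try {
            // ... work with the shared resource ...
        } finally {
            lock.unlock();
        }
    }
}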

Synchronizers between two different execution blocks

The message-processing-task should stop processing a NEW message when it detects a client-login-task. However, the message processor should complete the message it was already processing before pausing. This basically means the client-login-task should give the message processor some breathing space (i.e. wait itself) before it can continue. So the scenario is this:
1) message-processing-task is processing messages from a queue (one at a time)
2) It detects a client-login in the middle of processing a message
3) message-processing-task should complete processing the message before it waits for the client-login-task to complete. This is where the client-login-task must wait.
4) client-login-task complete and signals message-processing-task
5) message-processing-task goes about its business.
My question is: are there any ready-made synchronizers for two threads which are executing different paths and yet must wait on each other? My understanding is that CyclicBarrier, Semaphore, and CountDownLatch synchronize threads which are on the same path of execution.
EDIT- There is a single message-processing thread. However, there can be multiple login threads.
A solution I have in mind is to use a ReentrantLock. Before processing each message, a lock is acquired and the message processor checks whether any client logins are in progress. An AtomicInteger tells me the number of login requests in progress. If there are one or more login requests in progress, the message processor awaits on the lock's condition. The condition on which the message processor is signalled to resume its work is that the AtomicInteger count has come down to 0.
The only caveat with this solution is that if a message is already being processed when the login request comes in, the login thread does not wait. That is where I would need another lock for the client login, one which must be released when the message processor has finished processing the message. This makes the solution far too complex, and I would like to avoid the unnecessary complexity.
Any suggestions appreciated.
Not sure I picked this up correctly, but I would use a ThreadPoolExecutor with a single thread, passing a PriorityBlockingQueue to it. All your tasks will go to that queue; the login tasks will have higher priority, so they will all be processed before any message-processing tasks.
Also, if a message-processing-task is in progress, it will complete before the client-login-task kicks in. Would that be what you need?
A simple approach with a Semaphore (1 permit, fairness true) would be to share one between login/message processing tasks. If the message processing thread were to be in the middle of a task, it would finish before the login task could process (fairness would guarantee that the login task would execute right after).
If there are multiple message processing threads, this approach won't work.
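A minimal sketch of that single-permit approach (class and method names are illustrative; it assumes one message-processing thread, as in the EDIT above):

import java.util.concurrent.Semaphore;

public class LoginMessageCoordinator {
    // One permit, fair: whichever thread started waiting first gets it next
    private final Semaphore turn = new Semaphore(1, true);

    // Called by the single message-processing thread for each message
    public void processMessage(Runnable message) throws InterruptedException {
        turn.acquire();
        try {
            message.run(); // a login arriving now waits until this message is done
        } finally {
            turn.release();
        }
    }

    // Called by a login thread
    public void handleLogin(Runnable login) throws InterruptedException {
        turn.acquire(); // waits for the in-flight message, then runs before the next one
        try {
            login.run();
        } finally {
            turn.release();
        }
    }
}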

Multiple threads submitting actions to be done in order

A question on using threads in java (disclaimer - I am not very experienced with threads so please allow some leeway).
Overview:
I was wondering whether there is a way for multiple threads to add actions to a queue which another thread would take care of. The order does not really matter; what matters is that the actions in the queue are handled one at a time.
Explanation:
I plan to host a small server (using servlets). I want each connection to a client to be handled by a separate thread (so far ok). However, each of these threads/clients will be making changes to a single XML file, and those changes cannot be made at the same time.
Question:
Could I have each thread submit the changes to be made to a queue which another thread will continuously manage? As I said, the order of the changes does not matter, just that they do not happen at the same time.
Also, please advise if this is not the best way to do this.
Thank you very much.
This is a reasonable approach. Use an unbounded BlockingQueue (e.g. a LinkedBlockingQueue): the thread performing I/O on the XML file calls take on the queue to remove the next message (blocking if the queue is empty) and then processes the message to modify the XML file, while the threads submitting changes to the XML file call offer on the queue to add their messages to it. The BlockingQueue is thread-safe, so there's no need for your threads to perform synchronization on it.
You could have the threads submit tasks to an ExecutorService that has only one thread. Or you could have a lock that allows only one thread to alter the file at once. The latter seems more natural, as the file is a shared resource; the queue is then the implicit queue of threads awaiting the lock.
The Executor interface provides the abstraction you need:
"An object that executes submitted Runnable tasks. This interface provides a way of decoupling task submission from the mechanics of how each task will be run, including details of thread use, scheduling, etc. An Executor is normally used instead of explicitly creating threads."
A single-threaded executor service seems like exactly the right tool for the job. See Executors.newSingleThreadExecutor(), whose javadoc says:
Creates an Executor that uses a single worker thread operating off an unbounded queue. (Note however that if this single thread terminates due to a failure during execution prior to shutdown, a new one will take its place if needed to execute subsequent tasks.) Tasks are guaranteed to execute sequentially, and no more than one task will be active at any given time. Unlike the otherwise equivalent newFixedThreadPool(1) the returned executor is guaranteed not to be reconfigurable to use additional threads.
Note that in a JavaEE context, you need to take into consideration how to terminate the worker thread when your webapp is unloaded. There are other questions here on SO that deal with this.
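As a sketch of what that could look like (assuming the javax.servlet API — jakarta.servlet in newer containers — with illustrative names):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

public class XmlWriterLifecycle implements ServletContextListener {
    // Single worker thread: submitted changes run one at a time, in submission order
    private final ExecutorService xmlWriter = Executors.newSingleThreadExecutor();

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        // Make the executor available to the servlets handling client connections
        sce.getServletContext().setAttribute("xmlWriter", xmlWriter);
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        xmlWriter.shutdown(); // stop the worker thread when the webapp is unloaded
    }
}

// In a servlet handling a client request:
//   ExecutorService xmlWriter = (ExecutorService) getServletContext().getAttribute("xmlWriter");
//   xmlWriter.submit(() -> { /* apply this client's change to the XML file */ });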

Java: sharing workers in thread pool for several recursive tasks

There is one fixed thread pool (say, with size=100) that I want to use for all tasks across my app.
It is used to limit server load.
Task = a web crawler that submits its first job to the thread pool.
That job can generate more jobs, and so on.
One job = one HTTP I/O request.
Problem
Suppose that there is only one executing task, that generated 10000 jobs.
Those jobs are now queued in thread pool queue, and all 100 threads are used for their execution.
Suppose that I now submit a second task.
The first job of the second task will be 10,001st in the queue.
It will be executed only after the 10000 jobs that the first task queued up.
So, this is a problem - I don't want the second task to wait so long to start its first job.
Idea
The first idea on my mind is to create a custom BlockingQueue and pass it to the thread pool constructor.
That queue will hold several blocking queues, one for each task.
Its take method will then choose a random queue and take an item from it.
My problem with this is that I don't see how to remove an empty queue from this list when its task is finished. This would mean some or all workers could get blocked on the take method, waiting for jobs from tasks that are finished.
Is this the best way to solve this problem?
I was unable to find any patterns for it in books or on the Internet :(
Thank you!
I would use multiple queues and draw from a random queue that contains items. Alternatively, you could prioritize the queues and always draw from the one with the highest priority.
I would suggest using a single PriorityBlockingQueue and using the 'depth' of the recursive tasks to compute the priority. With a single queue, workers get blocked when the queue is empty and there is no need for randomization logic around the multiple queues.
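A rough sketch of that suggestion (the CrawlJob class is illustrative, and "shallower jobs first" is my reading of how the depth-based priority would let a newly submitted task's root job jump ahead of the deep jobs already queued):

import java.util.Comparator;
import java.util.concurrent.PriorityBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class DepthPriorityPool {

    // A crawl job that records how deep in the recursion it was created
    public static abstract class CrawlJob implements Runnable {
        final int depth;
        protected CrawlJob(int depth) { this.depth = depth; }
    }

    public static ThreadPoolExecutor newPool(int size) {
        // Lower depth = higher priority, so a new task's first job is taken early
        PriorityBlockingQueue<Runnable> queue = new PriorityBlockingQueue<>(
                11, Comparator.comparingInt((Runnable r) -> ((CrawlJob) r).depth));
        return new ThreadPoolExecutor(size, size, 0L, TimeUnit.MILLISECONDS, queue);
    }
}

// Usage (use execute(), not submit(), so the queue sees the CrawlJob itself
// rather than a wrapping FutureTask that the comparator cannot handle):
//   pool.execute(new DepthPriorityPool.CrawlJob(parentDepth + 1) {
//       public void run() { /* fetch one page, submit child jobs */ }
//   });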

Any available design pattern for a thread that is capable of executing a specific job sent by other threads?

I'm working on a project where execution time is critical. In one of the algorithms I have, I need to save some data into a database.
What I did is call a method that does that; it fires a new thread every time it is called. I ran into an out-of-memory problem since more than 20,000 threads ended up being loaded ...
My question now is: I want to start only one thread; when the method is called, it adds the job to a queue and notifies the thread, which sleeps when no jobs are available, and so on. Are there any design patterns or examples available online?
Run, do not walk to your friendly Javadocs and look up ExecutorService, especially Executors.newSingleThreadExecutor().
ExecutorService myXS = Executors.newSingleThreadExecutor();
// then, as needed...
myXS.submit(myRunnable);
And it will handle the rest.
Yes, you want a worker thread or thread pool pattern.
http://en.wikipedia.org/wiki/Thread_pool_pattern
See http://www.ibm.com/developerworks/library/j-jtp0730/index.html for Java examples
I believe the pattern you're looking for is called producer-consumer. In Java, you can use the blocking methods on a BlockingQueue to pass tasks from the producers (that create the jobs) to the consumer (the single worker thread). This will make the worker thread automatically sleep when no jobs are available in the queue, and wake up when one is added. The concurrent collections should also handle using multiple worker threads.
Are you looking for java.util.concurrent.Executor?
That said, if you have 20,000 concurrent inserts into the database, using a thread pool will probably not save you: if the database can't keep up, the queue will get longer and longer until you run out of memory again. Also, note that an executor's queue lives only in memory, i.e. if the server crashes, the data in it will be gone.
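One way to keep memory bounded in that situation (my own addition, not something the answer above spells out) is a bounded queue plus a rejection policy that applies backpressure to the producers:

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class BoundedDbWriter {
    // Single writer thread, but at most 1000 pending inserts are queued.
    // When the queue is full, CallerRunsPolicy makes the submitting thread run
    // the insert itself, which slows producers down instead of letting the
    // queue grow until memory runs out again.
    private final ThreadPoolExecutor executor = new ThreadPoolExecutor(
            1, 1, 0L, TimeUnit.MILLISECONDS,
            new ArrayBlockingQueue<>(1000),
            new ThreadPoolExecutor.CallerRunsPolicy());

    public void saveAsync(Runnable insertJob) {
        executor.execute(insertJob);
    }
}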
