Analysing a BlockingQueue usage example - java

I was looking at the "usage example based on a typical producer-consumer scenario" at:
http://download.oracle.com/javase/1.5.0/docs/api/java/util/concurrent/BlockingQueue.html#put(E)
Is the example correct?
I think the put and take operations need a lock on some resource before proceeding to modify the queue, but that is not happening here.
Also, had this been a concurrent kind of queue, the lack of locks would have been understandable, since atomic operations on a concurrent queue do not need locks.

I do not think there is anything to add to what is written in the API:
A Queue that additionally supports operations that wait for the queue to become non-empty when retrieving an element, and wait for space to become available in the queue when storing an element.
BlockingQueue implementations are thread-safe. All queuing methods achieve their effects atomically using internal locks or other forms of concurrency control.

BlockingQueue is just an interface. An implementation could use synchronized blocks, a Lock, or be lock-free. AFAIK most implementations use a Lock internally.
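To make this concrete, below is a minimal sketch (the class name, queue capacity and -1 sentinel are my own choices, not taken from the linked example) of a producer and a consumer sharing a LinkedBlockingQueue. put() and take() are called without any external locking, because the synchronization lives inside the queue implementation:

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class ProducerConsumerSketch {
        public static void main(String[] args) throws InterruptedException {
            // Bounded queue: put() blocks when full, take() blocks when empty.
            BlockingQueue<Integer> queue = new LinkedBlockingQueue<>(10);

            Thread producer = new Thread(() -> {
                try {
                    for (int i = 0; i < 100; i++) {
                        queue.put(i);      // no synchronized block around this
                    }
                    queue.put(-1);         // sentinel telling the consumer to stop
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            Thread consumer = new Thread(() -> {
                try {
                    int value;
                    while ((value = queue.take()) != -1) {   // blocks until an element is available
                        System.out.println("consumed " + value);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            producer.start();
            consumer.start();
            producer.join();
            consumer.join();
        }
    }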

Related

Can ThreadPoolExecutor switch its BlockingQueue after start?

Can ThreadPoolExecutor change its BlockingQueue after start? I am using multiple ThreadPoolExecutors in my process. I don't want the number of threads in my process to exceed a certain maximum. That is why I thought of switching the BlockingQueue of my thread pool to a busier BlockingQueue. But I don't see any method in the ThreadPoolExecutor class that provides the facility of switching BlockingQueues. What could be the reason behind this?
Apparently ThreadPoolExecutor gives access to its BlockingQueue. I can achieve the same behavior by transferring tasks from one queue to another.
Immutable objects are usually favoured in modern programming practices. Immutability usually makes things simpler with regard to object-model growth and future enhancements (and no, I don't consider Python's "let's all be responsible adults" approach modern, for the sake of this argument).
As for solving your problem, you could perhaps pass in a smart "delegating" BlockingQueue implementation that implements the standard interface but backs it with some queue-switching mechanism, controlled internally or externally as your specification requires.
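Building on the task-transfer idea above, a rough sketch (the pool sizes and variable names are purely illustrative) would use ThreadPoolExecutor.getQueue() together with BlockingQueue.drainTo(); note this only moves tasks that have not yet been picked up by a worker thread:

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;
    import java.util.concurrent.ThreadPoolExecutor;
    import java.util.concurrent.TimeUnit;

    public class QueueTransferSketch {
        public static void main(String[] args) {
            // Two illustrative pools, each with its own work queue.
            ThreadPoolExecutor busyPool = new ThreadPoolExecutor(
                    2, 2, 0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<>());
            ThreadPoolExecutor idlePool = new ThreadPoolExecutor(
                    2, 2, 0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<>());

            // ... tasks are submitted to busyPool elsewhere ...

            // Drain the tasks still waiting in the busy pool's queue
            // and resubmit them to the other pool.
            BlockingQueue<Runnable> backlog = new LinkedBlockingQueue<>();
            busyPool.getQueue().drainTo(backlog);
            for (Runnable task : backlog) {
                idlePool.execute(task);
            }

            busyPool.shutdown();
            idlePool.shutdown();
        }
    }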

Why ConcurrentHashMap cannot be locked for exclusive access?

A quote from JCIP (Java Concurrency in Practice):
"Since a ConcurrentHashMap cannot be locked for exclusive access, we
cannot use client-side locking to create new atomic operations such as
put-if-absent, as we did for Vector"
Why can't we just acquire the lock in order to implement additional atomic methods and keep the collection thread-safe (like the synchronized collections returned by the Collections.synchronizedXxx factories)?
The whole point of the ConcurrentHashMap is that read operations never block, i.e. do not have to check for locks. That precludes the ability to have such a lock.
Why can't we just acquire the lock?
You could do that, but you would have to do it consistently for all access paths to the map, and then you would have completely negated the purpose of a concurrent data structure. Its reads are supposed to be lock-free.
Why? Because the implementation does not support it. Straight from the ConcurrentHashMap JavaDocs:
There is not any support for locking the entire table in a way that prevents all access
...which is, by definition, "exclusive access."
Any lock you add is part of your own code, and if you use it that way then all other operations must work the same way, i.e. every access must acquire the same lock.
But the main point here is that Java did not provide ConcurrentHashMap for this purpose; its purpose is to allow multiple threads to work on the map simultaneously.
For your requirement, go for Hashtable (or a synchronized Map wrapper) instead.
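As a small illustration of the JCIP point, here is a sketch (the class and method names are mine) contrasting client-side locking, which other threads are free to bypass, with the atomic operation ConcurrentMap already provides:

    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.ConcurrentMap;

    public class PutIfAbsentSketch {
        private final ConcurrentMap<String, Integer> map = new ConcurrentHashMap<>();

        // Broken as a general technique: this lock only excludes threads that also
        // synchronize on 'map'; plain map.put()/map.get() calls elsewhere ignore it,
        // so the check-then-act is not atomic with respect to them.
        public Integer brokenPutIfAbsent(String key, Integer value) {
            synchronized (map) {
                Integer existing = map.get(key);
                if (existing == null) {
                    map.put(key, value);
                    return null;
                }
                return existing;
            }
        }

        // Correct: use the atomic operation the map itself provides.
        public Integer atomicPutIfAbsent(String key, Integer value) {
            return map.putIfAbsent(key, value);
        }
    }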

Does my implementation of LinkedBlockingQueue need to be synchronized?

To begin with, I have used search and found n topics related to this question. Unfortunately, they didn't help me, so this will be topic n++ :)
Situation: I will have a few worker threads (the same class, just many duplicates; let's call them WT) and one result-writing thread (RT).
WT will add objects to the queue, and RT will take them. Since there will be many WTs, won't there be memory problems (independent of the max queue size)? Will those operations wait for each other to complete?
Moreover, as I understand it, BlockingQueue is quite slow, so maybe I should drop it and use a normal Queue inside synchronized blocks? Or should I consider using SynchronizedQueue?
LinkedBlockingQueue is designed to handle multiple threads writing to the same queue. From the documentation:
BlockingQueue implementations are thread-safe. All queuing methods achieve their effects atomically using internal locks or other forms of concurrency control. However, the bulk Collection operations addAll, containsAll, retainAll and removeAll are not necessarily performed atomically unless specified otherwise in an implementation.
Therefore, you are quite safe (unless you expect the bulk operations to be atomic).
Of course, if thread A and thread B are writing to the same queue, the order of A's items relative to B's items will be indeterminate unless you synchronize A and B.
As to the choice of queue implementation, go with the simplest that does the job, and then profile. This will give you accurate data on where the bottlenecks are so you won't have to guess.
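For the WT/RT setup specifically, a sketch along these lines (the worker count, queue capacity and poison-pill value are my own illustrative choices) shows several workers calling put() on the same LinkedBlockingQueue while one writer thread take()s, with no extra synchronization around the queue; bounding the capacity is also what keeps fast workers from exhausting memory:

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class WorkersAndWriterSketch {
        private static final String POISON = "POISON";   // illustrative shutdown marker

        public static void main(String[] args) throws InterruptedException {
            // Bounded, so put() blocks instead of letting the queue grow without limit.
            BlockingQueue<String> results = new LinkedBlockingQueue<>(1000);
            int workerCount = 4;

            Thread[] workers = new Thread[workerCount];
            for (int w = 0; w < workerCount; w++) {
                final int id = w;
                workers[w] = new Thread(() -> {
                    try {
                        for (int i = 0; i < 10; i++) {
                            results.put("worker-" + id + " result " + i);
                        }
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                });
                workers[w].start();
            }

            Thread writer = new Thread(() -> {
                try {
                    String item;
                    while (!(item = results.take()).equals(POISON)) {
                        System.out.println(item);        // write the result somewhere
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            writer.start();

            for (Thread worker : workers) {
                worker.join();                           // wait for all workers to finish
            }
            results.put(POISON);                         // then tell the writer to stop
            writer.join();
        }
    }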

infinite fifo in java

I am looking for a thread-safe, infinite, blocking FIFO which is backed by a fixed buffer such as an array. The semantics would be that multiple reader and writer threads can access it safely. Writers block if the buffer is full and overwrite the oldest item. Readers block if the buffer is empty. FIFO ordering must be maintained even when the counters of total added and total removed have wrapped around the internal buffer size one or many times.
Interestingly, the usual places I would look for this (Java's own concurrent collections, Commons Collections, Guava) don't seem to have an instant answer to such an 'obvious' requirement.
You are actually describing an ArrayBlockingQueue.
It is thread-safe and has been designed for that exact purpose:
writers wait for space to become available if the queue is full
readers can wait up to a specified wait time if necessary for an element to become available
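For reference, the timed variants look roughly like this (the capacity and timeouts are arbitrary):

    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.TimeUnit;

    public class TimedQueueOps {
        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<String> queue = new ArrayBlockingQueue<>(2);

            queue.put("a");                                          // blocks only while the queue is full
            boolean added = queue.offer("b", 1, TimeUnit.SECONDS);   // waits up to 1s for space
            String head = queue.poll(500, TimeUnit.MILLISECONDS);    // waits up to 500ms for an element

            System.out.println(added + " / " + head);
        }
    }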
It sounds like you're looking for ArrayBlockingQueue.
What about java.util.concurrent.ArrayBlockingQueue?
It's not clear if you are looking for an infinite blocking queue or a bounded blocking queue.
Bounded blocking queue: java.util.concurrent.ArrayBlockingQueue
Infinite blocking queue (limited only by available RAM): java.util.concurrent.LinkedBlockingQueue
For all these cases I would suggest using ArrayBlockingQueue.
For an infinite queue you would have to create your own class implementing the BlockingQueue interface using a queue delegate (maybe an ArrayBlockingQueue) and, when the queue runs full, adapt the size by creating a new, bigger delegate. This would be "infinite" up to Integer.MAX_VALUE elements and avoids the GC overhead of linked queues (which need to create a node for each inserted object). If needed, you can shrink the delegate, too.
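A minimal sketch of that idea (the class name and starting capacity are mine, and this is deliberately not a full BlockingQueue implementation): grow the ArrayBlockingQueue delegate whenever an offer fails, and block takers with wait/notify on the wrapper's own monitor rather than on the delegate:

    import java.util.concurrent.ArrayBlockingQueue;

    public class GrowableArrayQueue<E> {
        private final Object lock = new Object();
        private int capacity = 16;                       // illustrative starting size
        private ArrayBlockingQueue<E> delegate = new ArrayBlockingQueue<>(capacity);

        // Never blocks: when the delegate is full, replace it with one twice the size.
        public void put(E element) {
            synchronized (lock) {
                if (!delegate.offer(element)) {
                    capacity *= 2;
                    ArrayBlockingQueue<E> bigger = new ArrayBlockingQueue<>(capacity);
                    delegate.drainTo(bigger);            // drainTo preserves FIFO order
                    bigger.add(element);
                    delegate = bigger;
                }
                lock.notifyAll();                        // wake any waiting takers
            }
        }

        // Blocks until an element is available.
        public E take() throws InterruptedException {
            synchronized (lock) {
                E element;
                while ((element = delegate.poll()) == null) {
                    lock.wait();
                }
                return element;
            }
        }
    }

Note that every operation here funnels through a single monitor, so this trades LinkedBlockingQueue's two-lock throughput for array-backed storage; whether that is actually a win is something you would have to profile.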

Using LinkedBlockingQueue good enough for multi thread java program?

I have a consumer and a producer that add and delete Item objects from the queue. If I use the put() and take() methods, are there any thread-safety issues I still need to cover? This is similar to the bounded-buffer problem, and I was wondering whether using the blocking queue replaces the need for semaphores or monitors. The Item object itself would probably need synchronization (setters, but getters don't need a lock), am I right? And lastly, I'm not quite sure how to test whether it is thread-safe, since I can't make both threads call take() simultaneously because the order of execution is nondeterministic. Any ideas? Thanks.
It is perfectly thread-safe for what you're doing, in fact this is what it's designed for. The description of BlockingQueue (which is the interface implemented by LinkedBlockingQueue) states:
BlockingQueue implementations are thread-safe. All queuing methods achieve their effects atomically using internal locks or other forms of concurrency control.
LinkedBlockingQueue does use two different locks internally (one for put, one for take), but simultaneous put() and take() are still thread-safe; the two-lock design is precisely what lets producers and consumers proceed without blocking each other.
This is already answered here: Are LinkedBlockingQueue's insert and remove methods thread safe?
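On the Item question specifically: the BlockingQueue documentation guarantees that actions in a thread prior to placing an object into the queue happen-before actions subsequent to its removal in another thread. So if the producer fully populates an Item before put() and does not touch it afterwards, the consumer can read it after take() without any locks; synchronization on Item is only needed if both threads keep mutating the same instance after the handoff. A sketch (the Item fields are invented for illustration):

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class ItemHandoffSketch {

        // Immutable item: safe to read after take() with no further locking,
        // because the queue's happens-before edge publishes it safely.
        static final class Item {
            private final int id;
            private final String payload;

            Item(int id, String payload) {
                this.id = id;
                this.payload = payload;
            }

            int getId() { return id; }
            String getPayload() { return payload; }
        }

        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<Item> queue = new LinkedBlockingQueue<>();

            Thread consumer = new Thread(() -> {
                try {
                    Item item = queue.take();             // blocks until the producer puts
                    System.out.println(item.getId() + ": " + item.getPayload());
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });
            consumer.start();

            queue.put(new Item(1, "hello"));              // fully constructed before put()
            consumer.join();
        }
    }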
