Is using LinkedBlockingQueue good enough for a multi-threaded Java program? - java

I have a consumer and a producer that add and remove Item objects from the queue. If I use the put() and take() methods, are there any thread safety issues I still need to cover? This is similar to the bounded buffer problem, and I was wondering whether using the blocking queue replaces the need for semaphores or monitors. The Item object itself would probably still need synchronization (setters, but getters don't need a lock), am I right? And lastly, I'm not sure how to test whether it is thread safe, since I can't make both threads call take() simultaneously because the order of execution is nondeterministic. Any ideas? Thanks.

It is perfectly thread-safe for what you're doing; in fact, this is what it's designed for. The description of BlockingQueue (the interface implemented by LinkedBlockingQueue) states:
BlockingQueue implementations are thread-safe. All queuing methods
achieve their effects atomically using internal locks or other forms
of concurrency control.
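For illustration, a minimal producer/consumer sketch along those lines (the class name, queue capacity and item counts are made up for the example):

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class ProducerConsumerDemo {
        public static void main(String[] args) {
            // Bounded queue: put() blocks when full, take() blocks when empty.
            BlockingQueue<String> queue = new LinkedBlockingQueue<>(10);

            Thread producer = new Thread(() -> {
                try {
                    for (int i = 0; i < 100; i++) {
                        queue.put("item-" + i);           // blocks if the queue is full
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            Thread consumer = new Thread(() -> {
                try {
                    for (int i = 0; i < 100; i++) {
                        System.out.println(queue.take()); // blocks if the queue is empty
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            producer.start();
            consumer.start();
        }
    }

Note that no external locks are needed around put() and take(); the queue handles the locking internally.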

Note that put() and take() use two different internal locks (a put lock and a take lock), so they can proceed at the same time; the implementation is still thread-safe.
This is already answered here: Are LinkedBlockingQueue's insert and remove methods thread safe?

Related

Making a non-thread safe queue thread safe

Say I have a queue implementation which is not thread safe. How can I make it thread safe (without modifying the original code)? In other words:
How can I write a SynchronizedQueueWrapper which is backed by a regular non-thread-safe queue?
Plus: my queue does not implement the Collection interface; it only has add, remove, peek and size methods.
If your queue implemented Collection, you could simply use Collections.synchronizedCollection(queue). Otherwise I suggest you check the code of synchronizedCollection and do something similar (essentially: guarding all operations on the queue with a mutex on this).
Note that it will still require the users to properly access the synchronized queue, for example during iteration.
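A minimal sketch of such a wrapper, assuming a hypothetical SimpleQueue interface with only the add, remove, peek and size operations mentioned in the question:

    // Hypothetical minimal queue interface from the question (add, remove, peek, size only).
    interface SimpleQueue<E> {
        void add(E e);
        E remove();
        E peek();
        int size();
    }

    // Wrapper that guards every operation of the backing queue with a mutex on "this",
    // the same idea Collections.synchronizedCollection uses internally.
    class SynchronizedQueueWrapper<E> implements SimpleQueue<E> {
        private final SimpleQueue<E> delegate;

        SynchronizedQueueWrapper(SimpleQueue<E> delegate) {
            this.delegate = delegate;
        }

        public synchronized void add(E e) { delegate.add(e); }
        public synchronized E remove()    { return delegate.remove(); }
        public synchronized E peek()      { return delegate.peek(); }
        public synchronized int size()    { return delegate.size(); }
    }

Callers that need compound operations (for example check-then-remove) can synchronize on the wrapper instance itself across the whole operation.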

Threadsafe vs Synchronized

I'm new to Java.
I'm a little bit confused about the difference between thread safe and synchronized.
Thread safe means that a method or class instance can be used by multiple threads at the same time without any problems occurring.
Whereas synchronized means only one thread can operate at a single time.
So how are they related to each other?
The definition of thread safety given in Java Concurrency in Practice is:
A class is thread-safe if it behaves correctly when accessed from multiple threads, regardless of the scheduling or interleaving of the execution of those threads by the runtime environment, and with no additional synchronization or other coordination on the part of the calling code.
For example, a java.text.SimpleDateFormat object has internal mutable state that is modified when a method that parses or formats is called. If multiple threads call the methods of the same SimpleDateFormat object, one thread can modify the state needed by the others, so some threads may get incorrect results. The possibility of the internal state getting corrupted and producing bad output is what makes this class not thread-safe.
There are multiple ways of handling this problem. You can have every place in your application that needs a SimpleDateFormat object instantiate a new one every time it needs one, you can make a ThreadLocal holding a SimpleDateFormat object so that each thread of your program can access its own copy (so each thread only has to create one), you can use an alternative to SimpleDateFormat that doesn't keep state, or you can do locking using synchronized so that only one thread at a time can access the dateFormat object.
Locking is not necessarily the best approach; avoiding shared mutable state is best whenever possible. That's why Java 8 introduced a date formatter (java.time.format.DateTimeFormatter) that doesn't keep mutable state.
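A sketch of the per-thread-copy option mentioned above (the class name and date pattern are just illustrative); the Java 8 DateTimeFormatter, by contrast, is immutable and can simply be shared:

    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class DateFormatting {
        // Each thread gets its own SimpleDateFormat, so its mutable state is never shared.
        private static final ThreadLocal<SimpleDateFormat> FORMAT =
                ThreadLocal.withInitial(() -> new SimpleDateFormat("yyyy-MM-dd"));

        public static String format(Date date) {
            return FORMAT.get().format(date);
        }
    }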
The synchronized keyword is one way of restricting access to a method or block of code so that otherwise thread-unsafe data doesn't get corrupted. It protects the method or block by requiring that a thread acquire exclusive ownership of a certain lock (the object instance if synchronized is on an instance method, the Class object if synchronized is on a static method, or the specified lock object if using a synchronized block) before it can enter the method or block, while also providing memory visibility so that threads don't see stale data.
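A short sketch of the three forms described above (instance method, static method, and block on an explicit lock); the class and field names are made up for the example:

    public class Counter {
        private static int classCount = 0;
        private int count = 0;
        private final Object lock = new Object();

        // Locks on this Counter instance.
        public synchronized void increment() {
            count++;
        }

        // Locks on the Counter.class object.
        public static synchronized void incrementClassCount() {
            classCount++;
        }

        // Locks on the explicitly chosen lock object.
        public void incrementWithBlock() {
            synchronized (lock) {
                count++;
            }
        }
    }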
Thread safety is a desired behavior of the program, and the synchronized block is one way to help you achieve that behavior. There are other ways of obtaining thread safety, e.g. immutable classes/objects. Hope this helps.
Thread safety: a thread-safe program protects its data from memory consistency errors. In a highly multi-threaded program, a thread-safe program does not cause any side effects from multiple threads performing read/write operations on shared data (objects). Different threads can share and modify object data without consistency errors.
synchronized is one basic way of achieving thread-safe code.
Refer to below SE questions for more details:
What does 'synchronized' mean?
You can achieve thread safety by using the high-level concurrency APIs. This documentation page describes programming constructs that help achieve thread safety:
Lock Objects support locking idioms that simplify many concurrent applications.
Concurrent Collections make it easier to manage large collections of data, and can greatly reduce the need for synchronization.
Atomic Variables have features that minimize synchronization and help avoid memory consistency errors.
ThreadLocalRandom (in JDK 7) provides efficient generation of pseudorandom numbers from multiple threads.
Refer to java.util.concurrent and java.util.concurrent.atomic packages too for other programming constructs.
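For instance, a tiny lock-free counter using an atomic variable (a minimal sketch, not taken from the linked page):

    import java.util.concurrent.atomic.AtomicInteger;

    public class HitCounter {
        // incrementAndGet() performs the read-modify-write atomically, so no synchronized is needed.
        private final AtomicInteger hits = new AtomicInteger();

        public int recordHit() {
            return hits.incrementAndGet();
        }

        public int current() {
            return hits.get();
        }
    }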
Related SE question:
Synchronization vs Lock
Synchronized: only one thread can operate at the same time.
Thread safe: a method or class instance can be used by multiple threads at the same time without any problems occurring.
If you rephrase the question as "Why are synchronized methods thread safe?" you can get a better idea.
Going by the definitions alone, this appears confusing, but it isn't if you look at it analytically.
Synchronized means: sequentially, one by one, in order; not concurrently (not at the same time).
A synchronized method does not allow another thread to act on it while one thread is already working on it. This avoids concurrent access.
Example of synchronization: if you want to buy a movie ticket, you stand in a queue. You will get your ticket only after the person in front of you has got theirs.
Thread safe means: a method is safe to be accessed by multiple threads at the same time without any problem. The synchronized keyword is one way to achieve thread safety. But remember: when multiple threads try to access a synchronized method, they follow an order, so the access becomes safe. Even if they act at the same time, they cannot access the same resource (method/block) at the same time, because of the synchronized behavior of the resource.
If a method is synchronized, it becomes safe to allow multiple threads to act on it without any problem. Remember: multiple threads do not act on it at the same time, hence we call synchronized methods thread safe.
Hope this helps to understand.
After patiently reading through a lot of answers, and without getting too technical, I could say something definite that is close to what Nayak had already replied to fastcodejava above, and which comes later in my answer:
synchronization is not even close to brute-forcing thread safety; it's just making a piece of code (or method) safe and incorruptible for a single authorized thread by preventing it from being used by any other thread.
Thread safety is about all threads accessing a certain element behaving and getting their desired results as if they had run sequentially (or even if not), without any form of undesired corruption, as in an ideal world.
One of the ways of getting close to thread safety would be using the classes in java.util.concurrent.atomic.
Sad that they don't have final methods though!
Nayak, when we declare a method as synchronized, all other calls to it from other threads are blocked and can wait indefinitely. Java now also provides other means of locking with Lock objects.
You can also declare a field final or volatile to guarantee its visibility to other concurrent threads.
ref: http://www.javamex.com/tutorials/threads/thread_safety.shtml
In practice, performance-wise, thread-safe, synchronized, and non-synchronized classes are ordered (from slowest to fastest) as:
Hashtable (slowest) < Collections.synchronizedMap < HashMap (fastest)

Difference in ownership pattern in List and Queue

From the code, it seems that there is a difference in the way ownership of members is handled in the case of a LinkedList and an ArrayBlockingQueue.
(It may be the same in others too, but for now I am focusing only on the above.)
While in the case of ArrayBlockingQueue ownership seems to be transferred from the inserting thread to the extracting thread, in LinkedList the thread that puts in an object maintains a reference to it even after it has (potentially) been retrieved by a separate thread.
Is my understanding correct?
Why do we have this difference in behaviour?
(Here I am using instance and thread synonymously, as an instance would be running in a particular thread.)
LinkedList does not provide any thread safety or synchronization at all. You are responsible for doing that yourself.
The concurrent package collections do provide thread safety on the collection itself, you are still responsible for managing any modifications you might make to objects within the collection though.
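For example (a sketch; the Item class is a placeholder echoing the first question on this page): even though the queue hands references between threads safely, an object that both threads keep mutating needs its own synchronization:

    // The queue safely hands the same Item reference from producer to consumer, but the
    // Item's own mutable state still needs guarding if both threads keep touching it.
    class Item {
        private int value;

        public synchronized void setValue(int v) { this.value = v; }

        // The getter is synchronized too, so a reader is guaranteed to see the latest write.
        public synchronized int getValue() { return value; }
    }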
There is no concept of "ownership" of objects in Java.
Looking at the Oracle documentation:
http://docs.oracle.com/javase/6/docs/api/java/util/concurrent/ArrayBlockingQueue.html
http://docs.oracle.com/javase/6/docs/api/java/util/LinkedList.html
You can read that they differ with regard to thread safety.
The first one is inside the java.util.concurrent package.
For the second (LinkedList) you can read "Note that this implementation is not synchronized".
This should be the cause of the differences you are observing.

Does my implementation of LinkedBlockingQueue need to be synchronized?

To begin with, I searched and found n topics related to this question. Unfortunately, they didn't help me, so this will make it n++ topics :)
Situation: I will have a few worker threads (the same class, just many duplicates; let's call them WT) and one result-writing thread (RT).
The WTs will add objects to the queue, and the RT will take them. Since there will be many WTs, won't there be memory problems (independent of the max queue size)? Will those operations wait for each other to complete?
Moreover, as I understand it, BlockingQueue is quite slow, so maybe I should drop it and use a normal Queue inside synchronized blocks? Or should I consider using SynchronizedQueue?
LinkedBlockingQueue is designed to handle multiple threads writing to the same queue. From the documentation:
BlockingQueue implementations are thread-safe. All queuing methods achieve their effects atomically using internal locks or other forms of concurrency control. However, the bulk Collection operations addAll, containsAll, retainAll and removeAll are not necessarily performed atomically unless specified otherwise in an implementation.
Therefore, you are quite safe (unless you expect the bulk operations to be atomic).
Of course, if thread A and thread B are writing to the same queue, the order of A's items relative to B's items will be indeterminate unless you synchronize A and B.
As to the choice of queue implementation, go with the simplest that does the job, and then profile. This will give you accurate data on where the bottlenecks are so you won't have to guess.
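A rough sketch of that setup, assuming a handful of worker threads and one writer thread (the names and counts are illustrative):

    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.LinkedBlockingQueue;

    public class WorkersAndWriter {
        public static void main(String[] args) {
            // Bounded capacity keeps memory in check: workers block in put() if the writer falls behind.
            BlockingQueue<String> results = new LinkedBlockingQueue<>(1000);

            for (int w = 0; w < 4; w++) {                 // several worker threads (WT)
                final int id = w;
                new Thread(() -> {
                    try {
                        for (int i = 0; i < 250; i++) {
                            results.put("worker-" + id + " result " + i);
                        }
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                    }
                }).start();
            }

            new Thread(() -> {                            // the single result-writing thread (RT)
                try {
                    for (int i = 0; i < 1000; i++) {      // 4 workers x 250 results each
                        System.out.println(results.take());
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            }).start();
        }
    }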

Analysing a BlockingQueue usage example

I was looking at the "usage example based on a typical producer-consumer scenario" at:
http://download.oracle.com/javase/1.5.0/docs/api/java/util/concurrent/BlockingQueue.html#put(E)
Is the example correct?
I think the put and take operations need a lock on some resource before proceeding to modify the queue, but that is not happening here.
Also, had this been a Concurrent kind of a queue, the lack of locks would have been understandable since atomic operations on a concurrent queue do not need locks.
I do not think there is anything to add to what is written in the API:
A Queue that additionally supports operations that wait for the queue to become non-empty when retrieving an element, and wait for space to become available in the queue when storing an element.
BlockingQueue implementations are thread-safe. All queuing methods achieve their effects atomically using internal locks or other forms of concurrency control.
BlockingQueue is just an interface. An implementation could use synchronized blocks, Lock, or be lock-free. AFAIK most methods use a Lock in the implementations.
