Synchronizing LinkedHashMap externally - Java

What is the best way to implement synchronization of a LinkedHashMap externally, without using Collections.synchronizedMap?
When Collections.synchronizedMap is used, the entire data structure is locked, so performance is badly impacted.
What is the best way to lock only the required part of the data structure? E.g. if a thread is accessing key K1, it should lock only the key (K1) and value (V1) part of the data structure.

You can't get a fine-grained-locking, FIFO-eviction concurrent map from the built-in Java implementations.
Check out Guava's Cache or the open-source ConcurrentLinkedHashMap project.
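A rough sketch of what the Guava option could look like (the cache size, key/value types and class name here are just placeholders):
import com.google.common.cache.Cache;
import com.google.common.cache.CacheBuilder;

public class CacheSketch {
    public static void main(String[] args) {
        // Size-bounded cache; entries are evicted roughly in LRU order and the
        // internal storage is segmented, so there is no single global lock.
        Cache<String, String> cache = CacheBuilder.newBuilder()
                .maximumSize(1000)
                .build();

        cache.put("k1", "v1");
        System.out.println(cache.getIfPresent("k1")); // "v1", or null if absent/evicted
    }
}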

I think you may want to synchronize the subsequent operation you do, just on the value coming from the map:
Object value = map.get(key);
synchronized(value) {
doSomethingWith(value);
}
Synchronizing on values retrieved from the Map makes sense, since they can be shared and accessed concurrently; the example I posted above should do what you need. That should be enough.
By the way, you can also synchronize on the key using two nested synchronized blocks:
synchronized(key) {
Object value = map.get(key);
synchronized(value) {
doSomethingWith(value);
}
}
The key is usually just used to access the object (by hashing). Keys are matched by hash value and equality, so synchronizing on the key doesn't make complete sense to me.
Or, maybe you can subclass ConcurrentHashMap adding what is missing from LinkedHashMap.

Louis Wasserman's suggestion is probably the best because it gives you a lot of useful functionality. However, even if you lock on the entire map, you have to be hitting it really, really hard to make that a bottleneck (as in, your code is mostly doing read/write on the map). If you don't need the additional functionality of Guava's Cache, a synchronized map could be simpler & better. You could also use a ReadWriteLock if you mostly read from the map.
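If reads dominate, a minimal sketch of the ReadWriteLock idea over a plain LinkedHashMap could look like this (class and method names are illustrative; note that a LinkedHashMap in access order mutates its links even on get, and would then need the write lock):
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.concurrent.locks.ReadWriteLock;
import java.util.concurrent.locks.ReentrantReadWriteLock;

public class RwLockedMap<K, V> {
    private final Map<K, V> map = new LinkedHashMap<>(); // insertion-ordered
    private final ReadWriteLock lock = new ReentrantReadWriteLock();

    public V get(K key) {
        lock.readLock().lock();          // many readers may hold this at once
        try {
            return map.get(key);
        } finally {
            lock.readLock().unlock();
        }
    }

    public V put(K key, V value) {
        lock.writeLock().lock();         // writers get exclusive access
        try {
            return map.put(key, value);
        } finally {
            lock.writeLock().unlock();
        }
    }
}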

The best option would be to use java.util.concurrent.ConcurrentHashMap.
I can't see how it would be possible to externally lock only parts of your Map, since you cannot control which shared data structures are accessed internally by a call to any of the map's functions.

If you don't need a LinkedHashMap, use a ConcurrentHashMap from the java.util.concurrent package.
It is specifically designed for both speed and thread safety. It uses the minimal possible locking to achieve its thread safety.

An insertion in a HashMap, or LinkedHashMap, can cause a rehash because it increases the ratio between the size and the number of buckets. Having two or more threads rehash simultaneously would be a disaster.
Even if you are only doing a get, another thread may be removing an entry from the same bucket, so you are scanning a linked list that is being modified under you. You could also have two or more threads appending to the main linked list at the same time.
If you can do without the linking, use java.util.concurrent.ConcurrentHashMap, as already suggested.

Related

Should I use ConcurrentHashMap or HashMap or SynchronizedMap?

I have created a HashMap outside my multithreading code. There are going to be no changes to this HashMap later.
After this, I am starting two threads which will both be reading from this HashMap (yes, only read operations). If thread 1 is reading from my HashMap object, can thread 2 also read at the same time? Or do I need a ConcurrentHashMap or some other kind of Map?
If thread1 is reading from my hashmap object, can thread 2 also read at the same time?
If you are sure that there are no write operations then you don't need any synchronization at all; go for a normal Map.
You can also use Guava's ImmutableMap:
A Map whose contents will never change, with many other important properties detailed at ImmutableCollection.
No, you don't need a ConcurrentHashMap, but only because you do not make any modifications. You can read it without any problem.
If you only require read operations then there is no need for synchronization. If you are not doing any writes after creation, make the map immutable so nobody can change it. No synchronization needed.
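A small sketch of that populate-then-share pattern (the keys, values and thread bodies are just placeholders):
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class ReadOnlySharing {
    public static void main(String[] args) throws InterruptedException {
        // Populate fully before any reader thread is started.
        Map<String, Integer> m = new HashMap<>();
        m.put("a", 1);
        m.put("b", 2);
        final Map<String, Integer> shared = Collections.unmodifiableMap(m);

        Runnable reader = () -> System.out.println(shared.get("a"));
        Thread t1 = new Thread(reader);
        Thread t2 = new Thread(reader);
        t1.start();   // Thread.start() establishes a happens-before edge,
        t2.start();   // so both threads see the fully populated map
        t1.join();
        t2.join();
    }
}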

Is CopyOnWriteArrayList enough for keeping shopping cart thread-safe in Servlet Session scope

Is CopyOnWriteArrayList enough to use as a collection for a shopping cart? As I understand it, it is thread-safe and the iterator is guaranteed not to throw ConcurrentModificationException when another thread removes a product during iteration. For example:
...
CopyOnWriteArrayList<Product> products = (CopyOnWriteArrayList<Product>)session.getAttribute("PRODUCTS");
products.addIfAbsent(aProduct);
...
P.S. I found synchronization approaches using synchronized (session) {...}, but it seems a little ugly to have to synchronize session access everywhere I work with the shopping cart, as suggested in this article.
You need to understand what CopyOnWriteArrayList provides.
It provides you a snapshot and does not give you a real-time view of the backing array.
It weakens the visibility contract: it says that you will not get a ConcurrentModificationException, but it also says that if another thread removes an element, the effect may not be visible to a thread that is iterating, because on addition or removal the original array is not mutated or touched; a new one is created by every operation that mutates the backing array.
Is CopyOnWriteArrayList enough to use as a collection for a shopping cart?
Depends.
If this behavior is acceptable in your scenario then you can use it, but if you want a visibility guarantee you may have to use explicit locking.
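A small sketch of that snapshot behaviour (the product names are just placeholders):
import java.util.Iterator;
import java.util.concurrent.CopyOnWriteArrayList;

public class SnapshotDemo {
    public static void main(String[] args) {
        CopyOnWriteArrayList<String> products = new CopyOnWriteArrayList<>();
        products.add("book");
        products.add("pen");

        Iterator<String> it = products.iterator(); // snapshot of the array at this point
        products.remove("pen");                    // mutates a fresh copy, not the snapshot

        while (it.hasNext()) {
            System.out.println(it.next());         // still prints "book" and "pen"
        }
        System.out.println(products);              // [book] - the current list
    }
}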
I think you are good to go with CopyOnWriteArrayList in the scenario you described.
It has sufficient guarantees to work as a thread-safe implementation, including visibility. Yes, it is true that it gives you a snapshot of the data when you iterate.
But there is always a race between removing an element before it is read and reading it before it is removed. CopyOnWriteArrayList is a fine implementation for cases where reads vastly outnumber writes, which I think is the case for a shopping cart.
It is just that while iterating you will not see concurrent changes (write operations). You should understand that nothing is free: if you want to see the changes while traversing, you need to properly synchronize every iteration with every write operation, which will compromise performance, and trust me, you will gain nothing. Most of the concurrent data structures give a weakly consistent view on iteration (see ConcurrentSkipListSet).
So use either CopyOnWriteArrayList or ConcurrentSkipListSet and you are good to go.
I think a set is better for your use case, i.e. to avoid duplicate orders.
Is CopyOnWriteArrayList enough to use as a collection for a shopping cart?
No, because it depends on what you need to synchronize. Think about what must not happen at the same time.
As I understand it, it is thread-safe and the iterator is guaranteed not to throw ConcurrentModificationException when another thread removes a product during iteration.
You will not get a ConcurrentModificationException because every modification you make to the list creates a copy of the list. A thread that iterates uses the copy that was current when its iterator was created. But that thread can't assume that a product is still actually in the list when it sees it; it might have been removed in the most current version.
Or maybe use "heavier artillery" like the following, in all places where the shopping-cart collection is accessed.
AtomicReference<List<Product>> productListRef =
        (AtomicReference<List<Product>>) session.getAttribute("PRODUCTS");
List<Product> oldList;
List<Product> newList;
do {
    oldList = productListRef.get();
    newList = new ArrayList<>(oldList);
    newList.add(aProduct);
} while (!productListRef.compareAndSet(oldList, newList));
Thanks a lot for the previous answers!

Sort ConcurrentHashMap with thread safety

I am using a ConcurrentHashMap in my multithreaded application. I was able to sort it as described here, but since I am converting the map to a list I am a bit worried about thread safety. My ConcurrentHashMap is a static variable, hence I can guarantee there will be only one instance of it. But when I am going to sort it, I convert it to a list, sort it, and then put it back into a new ConcurrentHashMap.
Is this good practice in a multithreaded environment?
Please let me know your thoughts and suggestions.
Thank you in advance.
You should use a ConcurrentSkipListMap. It is thread-safe, fast, and maintains ordering according to the keys' Comparable implementation.
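For illustration, a ConcurrentSkipListMap keeps its entries sorted by key at all times, so no separate sort step is needed (keys and values here are just placeholders):
import java.util.Map;
import java.util.concurrent.ConcurrentSkipListMap;

public class SortedConcurrentDemo {
    public static void main(String[] args) {
        // Sorted by the keys' natural ordering; thread-safe without external locking.
        ConcurrentSkipListMap<String, Integer> scores = new ConcurrentSkipListMap<>();
        scores.put("charlie", 3);
        scores.put("alice", 1);
        scores.put("bob", 2);

        for (Map.Entry<String, Integer> e : scores.entrySet()) {
            System.out.println(e.getKey() + " = " + e.getValue()); // alice, bob, charlie
        }
    }
}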
If you don't change it a lot and all you want is to have it sorted, you should use a TreeMap wrapped by a Collections.synchronizedMap() call.
Your code would be something like this:
public class YourClass {
public static final Map<Something,Something> MAP = Collections.synchronizedMap( new TreeMap<Something,Something>() );
}
My ConcurrentHashMap is a static variable, hence I can guarantee there will be only one instance of it. But when I am going to sort it, I convert it to a list, sort it, and then put it back into a new ConcurrentHashMap.
This is not a simple problem.
I can tell you for a fact that using a ConcurrentHashMap won't make this thread-safe. Nor will using a synchronizedMap wrapper. The problem is that sorting is not supported as a single atomic operation. Rather, it involves a sequence of Map API operations, probably with significant time gaps between them.
I can think of two approaches to solving this:
Avoid the need for sorting in the first place by using a Map that keeps the keys in order; e.g. use ConcurrentSkipListMap.
Wrap the Map class in a custom synchronized wrapper class with a synchronized sort method. The problem with this approach is that you are likely to reintroduce the concurrency bottleneck that you avoided by using ConcurrentHashMap.
And it is worth pointing out that it doesn't make any sense to sort a HashMap or a ConcurrentHashMap, because these maps will not preserve the order you sort the elements into. You could use a LinkedHashMap, which preserves entry insertion order.
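If a one-off sorted view is all that's needed, one common approach is to sort a snapshot of the keys into a LinkedHashMap; a sketch, assuming the keys are Comparable (the method name is illustrative):
import java.util.ArrayList;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class SortSnapshot {
    // Returns a sorted, insertion-ordered copy; the source map may keep changing,
    // so the result is only a point-in-time view, not a live sorted map.
    static <K extends Comparable<K>, V> Map<K, V> sortedCopy(ConcurrentHashMap<K, V> source) {
        List<K> keys = new ArrayList<>(source.keySet());
        Collections.sort(keys);
        Map<K, V> result = new LinkedHashMap<>();
        for (K key : keys) {
            V value = source.get(key);
            if (value != null) {          // the entry may have been removed meanwhile
                result.put(key, value);
            }
        }
        return result;
    }
}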

Do I have to use a thread-safe Map implementation when only reading from it?

If I do the following:
Create a HashMap (in a final field)
Populate HashMap
Wrap HashMap with unmodifiable wrapper Map
Start other threads which will access but not modify the Map
As I understand it the Map has been "safely published" because the other threads were started after the Map was fully populated so I think it is ok to access the Map from multiple threads as it cannot be modified after this point.
Is this right?
This is perfectly fine as far as the map itself is concerned. But you need to realize that making the map unmodifiable will only make the map itself unmodifiable, not its keys and values. So if you have, for example, a Map<String, SomeMutableObject> such as Map<String, List<String>>, then threads will still be able to alter the values, for example by map.get("foo").add("bar");. To avoid this, you'd want to make the keys/values immutable/unmodifiable as well.
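A short sketch of that pitfall (keys and values are just placeholders):
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class UnmodifiableValuesDemo {
    public static void main(String[] args) {
        Map<String, List<String>> m = new HashMap<>();
        m.put("foo", new ArrayList<String>());
        Map<String, List<String>> readOnly = Collections.unmodifiableMap(m);

        // readOnly.put("x", new ArrayList<String>());  // would throw UnsupportedOperationException
        readOnly.get("foo").add("bar");                 // still allowed: only the map is unmodifiable
        System.out.println(readOnly.get("foo"));        // [bar]

        // To close the hole, make the values unmodifiable as well:
        m.put("foo", Collections.unmodifiableList(m.get("foo")));
    }
}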
As I understand it the Map has been "safely published" because the other threads were started after the Map was fully populated so I think it is ok to access the Map from multiple threads as it cannot be modified after this point.
Yes. Just make sure that the other threads are started in a synchronized manner, i.e. make sure you have a happens-before relation between publishing the map, and starting the threads.
This is discussed in this blog post:
[...] This is how Collections.unmodifiableMap() works.
[...]
Because of the special meaning of the keyword "final", instances of this class can be shared with multiple threads without using any additional synchronization; when another thread calls get() on the instance, it is guaranteed to get the object you put into the map, without doing any additional synchronization. You should probably use something that is thread-safe to perform the handoff between threads (like LinkedBlockingQueue or something), but if you forget to do this, then you still have the guarantee.
In short, no you don't need the map to be thread-safe if the reads are non-destructive and the map reference is safely published to the client.
In the example there are two important happens-before relationships established here. The final-field publication (if and only if the population is done inside the constructor and the reference doesn't leak outside the constructor) and the calls to start the threads.
Any modification made to the map after these calls is not safely published with respect to a client reading from the map.
We have, for example, a CopyOnWriteMap that wraps a non-thread-safe map which is copied on each write. This is as fast as possible in situations where there are many more reads than writes (caching configuration data is a good example).
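A simplified sketch of that copy-on-write idea (illustrative only, not the actual CopyOnWriteMap class referred to above):
import java.util.HashMap;
import java.util.Map;

public class SimpleCopyOnWriteMap<K, V> {
    private volatile Map<K, V> map = new HashMap<>();

    public V get(K key) {
        return map.get(key);                  // lock-free read against the current snapshot
    }

    public synchronized V put(K key, V value) {
        Map<K, V> copy = new HashMap<>(map);  // copy the backing map on every write
        V old = copy.put(key, value);
        map = copy;                           // volatile write publishes the new snapshot
        return old;
    }
}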
That said, if the intention really is to not change the map, setting an immutable version of the map into the field is always the best way to go as it guarantees the client will see the correct thing.
Lastly, there are some Map implementations that have destructive reads such as a LinkedHashMap with access ordering, or a WeakHashMap where entries can disappear. These types of maps must be accessed serially.
You are correct. There is no need to ensure exclusive access to the data structure by different threads, whether by mutexes or otherwise, since it's immutable. This usually greatly increases performance.
Also note that if you only wrap the original Map rather than creating a copy, i.e. the unmodifiable Map delegates method calls to the inner HashMap, then modifying the underlying Map may introduce race conditions.
An immutable map is inherently thread-safe. You could use Guava's ImmutableMap.

Is there any concurrent LinkedHashSet in JDK 6.0 or other libraries?

My code throws the following exception:
java.util.ConcurrentModificationException
at java.util.LinkedList$ListItr.checkForComodification(LinkedList.java:761)
at java.util.LinkedList$ListItr.next(LinkedList.java:696)
at java.util.AbstractCollection.addAll(AbstractCollection.java:305)
at java.util.LinkedHashSet.<init>(LinkedHashSet.java:152)
...
I want a ConcurrentLinkedHashSet to fix it,
but I only found ConcurrentSkipListSet in java.util.concurrent, which is a sorted set like TreeSet, not a LinkedHashSet.
Is there an easy way to get a ConcurrentLinkedHashSet in JDK 6.0?
Thanks for the help :)
A ConcurrentModificationException has nothing to do with concurrency in the form you're thinking of. This just means that while iterating over the Collection, someone (probably your own code - that happens often enough ;) ) is changing it, i.e. adding/removing some values.
Make sure you're using the Iterator to remove values from the collection and not the collection itself.
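For example, roughly like this (the list contents are just placeholders): removing through the iterator is safe, whereas removing through the collection mid-iteration triggers the exception.
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

public class IteratorRemoveDemo {
    public static void main(String[] args) {
        List<String> names = new ArrayList<>();
        names.add("a");
        names.add("b");

        for (Iterator<String> it = names.iterator(); it.hasNext(); ) {
            if ("a".equals(it.next())) {
                it.remove();          // safe: the iterator stays consistent
                // names.remove("a"); // would throw ConcurrentModificationException on the next it.next()
            }
        }
        System.out.println(names);    // [b]
    }
}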
Edit: If another thread really is accessing the Collection at the same time, the weak synchronization you get from the standard library is useless anyhow, since you have to lock the Collection for the whole duration of the operation, not just for a single add/remove! I.e. something like
synchronized(collection) {
// do stuff here
}
You can always create a synchronized collection with Collections.synchronizedMap(myMap). However, trying to alter the map while you're iterating (which I'm assuming is the cause of your error) will still be a problem.
From the docs for synchronizedMap:
Returns a synchronized (thread-safe) map backed by the specified map. In order to guarantee serial access, it is critical that all access to the backing map is accomplished through the returned map.
It is imperative that the user manually synchronize on the returned map when iterating over any of its collection views ... Failure to follow this advice may result in non-deterministic behavior.
This is because:
A concurrent collection normally guarantees atomic get/put but does not lock the entire collection during iteration, which would be too slow. There is no concurrency guarantee over iteration, which is actually many operations against the map.
Altering the collection during iteration is not well-defined, because it's impossible to determine the correct behavior; for example, how do you reconcile your iterator returning hasNext() == true with deleting a value (possibly the next one) from the collection?
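If you do go with Collections.synchronizedMap, iteration over its views has to be locked manually, roughly like this (a minimal sketch; the LinkedHashMap and keys here are just placeholders):
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class SynchronizedIterationDemo {
    public static void main(String[] args) {
        Map<String, Integer> map =
                Collections.synchronizedMap(new LinkedHashMap<String, Integer>());
        map.put("a", 1);
        map.put("b", 2);

        // Individual calls (put/get) are synchronized for us, but iteration is many
        // calls, so we must hold the map's lock for the whole traversal ourselves.
        synchronized (map) {
            for (Map.Entry<String, Integer> e : map.entrySet()) {
                System.out.println(e.getKey() + " = " + e.getValue());
            }
        }
    }
}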
There is ConcurrentLinkedHashMap - https://code.google.com/p/concurrentlinkedhashmap/
You can create a Set out of it with java.util.Collections.newSetFromMap(map).
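A minimal sketch of the newSetFromMap mechanism, shown here with a plain ConcurrentHashMap as the backing map (swap in the ConcurrentLinkedHashMap build from the project above if you also need insertion order and eviction):
import java.util.Collections;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

public class ConcurrentSetDemo {
    public static void main(String[] args) {
        // Any concurrent Map<E, Boolean> can back a concurrent Set this way.
        // ConcurrentHashMap gives thread safety but no predictable iteration order;
        // the ConcurrentLinkedHashMap project adds the ordering/eviction part.
        Set<String> set =
                Collections.newSetFromMap(new ConcurrentHashMap<String, Boolean>());
        set.add("a");
        set.add("b");
        System.out.println(set.contains("a")); // true
    }
}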
Unfortunately not. You could implement your own, wrapping a ConcurrentHashMap and a ConcurrentLinkedQueue, but this wouldn't allow you to remove values easily (removal would be O(N), since you'd have to iterate through everything in the queue) ...
What are you using the LinkedHashSet for though? Might be able to suggest alternatives...
