I have a SortedSet holding my ordered data.
I use the .first() method to return the first record, and pass it to another window.
When the other window finishes, an event callback is invoked, and I want to pass the next element from the SortedSet to the window. How do I move to the next element?
launchWindow(this.set.first());
Then I have this:
onActivityResult(...) {
    if (this.set.hasNext()) launchWindow(this.set.next()); // hasNext/next don't exist on SortedSet
}
What options do I have?
Instead of the Set you should pass an Iterator; the next consumer would then just call next().
Don't you want to use an Iterator on the SortedSet?
The iterator solution:
You should probably have something like this:
class WindowLauncherClass {

    SortedSet set = null;
    Iterator setIterator = null;

    public WindowLauncherClass(SortedSet set) {
        this.set = set; // or you can copy it if that's what you need
    }

    protected void launchWindow(Object item) {
        // impl
    }

    public void onActivityResult() {
        if (setIterator != null && setIterator.hasNext()) {
            launchWindow(setIterator.next());
        }
    }

    public void start() {
        setIterator = set.iterator();
        onActivityResult();
    }
}
A question appeared in the comments about updates to the set: will the iterator see them?
The usual answer is that it depends on the application requirements. In this case I don't have all the information, so I'll try to guess.
Up to JDK 1.5 there was only one SortedSet implementation (TreeSet), and it has a fail-fast iterator.
JDK 6 added a new implementation, ConcurrentSkipListSet, whose iterator is not fail-fast (it is weakly consistent).
If you add an element into the set that is "smaller" than the currently displayed element, then you will not see it anyway through a "good" (not fail-fast) iterator. If you add an element "bigger" than the currently displayed element, a proper iterator will see it.
The final solution is to actually reset the set and the iterator when a relevant change is made. Using a ConcurrentSkipListSet you will initially see only the "bigger" changes, and using a TreeSet you will fail at every update.
If you can afford to miss updates "smaller" than the current one, go for JDK 6 and ConcurrentSkipListSet. If not, you'll have to keep track of what you have already displayed and rebuild a proper set from the new items and the undisplayed items.
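As an illustrative sketch (not from the original answer; class name and printed output are my own assumptions), the difference between the two iterator behaviours looks roughly like this:

import java.util.Iterator;
import java.util.TreeSet;
import java.util.concurrent.ConcurrentSkipListSet;

public class SortedSetIteratorDemo {
    public static void main(String[] args) {
        // Weakly consistent iterator: adds during iteration do not throw, and
        // elements "bigger" than the current position may still be seen.
        ConcurrentSkipListSet<Integer> skipListSet = new ConcurrentSkipListSet<>();
        skipListSet.add(1);
        skipListSet.add(3);
        Iterator<Integer> weaklyConsistent = skipListSet.iterator();
        weaklyConsistent.next();   // returns 1
        skipListSet.add(2);        // "smaller" than the next element: may be missed
        skipListSet.add(4);        // "bigger": will typically be seen
        while (weaklyConsistent.hasNext()) {
            System.out.println(weaklyConsistent.next()); // likely prints 3 and 4
        }

        // Fail-fast iterator: the same sequence on a TreeSet typically throws
        // ConcurrentModificationException on the next() after the add().
        TreeSet<Integer> treeSet = new TreeSet<>();
        treeSet.add(1);
        treeSet.add(3);
        Iterator<Integer> failFast = treeSet.iterator();
        failFast.next();
        treeSet.add(2);
        try {
            failFast.next();
        } catch (java.util.ConcurrentModificationException expected) {
            System.out.println("TreeSet iterator failed fast");
        }
    }
}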
Unless you're using some SortedSet from a third-party library, your set is also a NavigableSet (every SortedSet in java.util also implements NavigableSet). If you can make the event pass back the element it just finished working on, NavigableSet has a method higher which will get the next element higher than the one you pass in:
public void onActivityResult(Event event) {
    Element element = event.processedElement;
    Element next = set.higher(element);
    if (next != null) {
        launchWindow(next);
    }
}
Related
In the Head First Design Patterns book, the authors describe using an iterator to traverse over composite data structures. They provide some sample code which, when executed, prints out a series of menu items stored within the composite. However, if you try to call the iterator more than once, it no longer works as expected and won't produce any results. The following code appears to be causing the problem:
public Iterator<MenuComponent> createIterator() {
    if (iterator == null) {
        iterator = new CompositeIterator(menuComponents.iterator());
    }
    return iterator;
}
In essence, they are creating a singleton iterator that cannot be reset for future iterations. Unfortunately, simply replacing this logic to return a new instance of the CompositeIterator also breaks the algorithm. An issue was raised on GitHub several years ago, although it has yet to be resolved. Does anyone have any suggestions on how to overcome this issue?
As the linked issue says in the comments:
return iterator; // the `iterator' never resets to null once it's set.
We need to reset the iterator when we are done with it, but not while it still has elements left, because CompositeIterator depends on that.
One way to do this is to add another condition on which iterator is reset - when the iterator has no more elements:
public Iterator<MenuComponent> createIterator() {
    if (iterator == null || !iterator.hasNext()) {
        iterator = new CompositeIterator(menuComponents.iterator());
    }
    return iterator;
}
Consider the following code snippet:
private List<Listener<E>> listenerList = new CopyOnWriteArrayList<Listener<E>>();

public void addListener(Listener<E> listener) {
    if (listener != null) {
        listenerList.add(listener);
    }
}

public void removeListener(Listener<E> listener) {
    if (listener != null) {
        listenerList.remove(listener);
    }
}

protected final void fireChangedForward(Event<E> event) {
    for (Listener<E> listener : listenerList) {
        listener.changed(event);
    }
}

protected final void fireChangedReversed(Event<E> event) {
    final ListIterator<Listener<E>> li = listenerList.listIterator(listenerList.size());
    while (li.hasPrevious()) {
        li.previous().changed(event);
    }
}
There is a listener list that can be modified and iterated.
I think the forward iteration (see method #fireChangedForward) should be safe.
The question is: is the reverse iteration (see method #fireChangedReversed) also safe in a multi-threaded environment?
I doubt that, because there are two calls involved: #size and #listIterator.
If it's not thread-safe, what is the most efficient way to implement #fireChangedReversed under the following circumstances:
optimize for traversal
avoid usage of locking if possible
avoid usage of javax.swing.event.EventListenerList
prefer a solution without a third-party lib, i.e. an implementation in my own code is acceptable
Indeed, listenerList.listIterator(listenerList.size()) is not thread-safe, for exactly the reason you suggested: the list could change size between the calls to size() and listIterator(), resulting in either the omission of an element from the iteration, or IndexOutOfBoundsException being thrown.
The best way to deal with this is to clone the CopyOnWriteArrayList before getting the iterator:
CopyOnWriteArrayList<Listener<E>> listenerList = ... ;

@SuppressWarnings("unchecked")
List<Listener<E>> copy = (List<Listener<E>>) listenerList.clone();
ListIterator<Listener<E>> li = copy.listIterator(copy.size());
The clone makes a shallow copy of the list. In particular, the clone shares the internal array with the original. This isn't entirely obvious from the specification, which says merely
Returns a shallow copy of this list. (The elements themselves are not copied.)
(When I read this, I thought "Of course the elements aren't copied; this is a shallow copy!" What this really means is that neither the elements nor the array that contains them are copied.)
This is fairly inconvenient; in particular, the lack of a covariant override of clone() requires an unchecked cast.
Some potential enhancements are discussed in JDK-6821196 and JDK-8149509. The former bug also links to a discussion of this issue on the concurrency-interest mailing list.
One simple way to do it is to call the toArray method and iterate over the resulting array in reverse order.
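For example, a minimal sketch assuming the listenerList and event fields from the question:

// Snapshot the CopyOnWriteArrayList once, then walk the array backwards.
Object[] snapshot = listenerList.toArray();
for (int i = snapshot.length - 1; i >= 0; i--) {
    @SuppressWarnings("unchecked")
    Listener<E> listener = (Listener<E>) snapshot[i];
    listener.changed(event);
}

Because toArray() returns a snapshot of the copy-on-write list, no concurrent modification can affect the traversal.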
You could always just get a ListIterator and "fast-forward" to the end of the list as such:
final ListIterator<Listener<E>> li = listenerList.listIterator();
if (li.hasNext()) {
    do {
        li.next();
    } while (li.hasNext());
}
while (li.hasPrevious()) {
    li.previous().changed(event);
}
EDIT I switched the quirky exception handling of my previous answer for a do/while loop that places the cursor of the ListIterator after the last element, ready for the subsequent previous() calls.
RE-EDIT As pointed out by @MikeFHay, a do/while loop on an iterator will throw a NoSuchElementException on an empty list. To prevent this, I wrapped the do/while loop with if (li.hasNext()).
Background Information
You can make an LRU cache with a LinkedHashMap as shown at this link. Basically, you just do the following (a minimal sketch follows the list):
Extend LinkedHashMap.
Provide a capacity parameter.
Initialize the super class (LinkedHashMap) with parameters telling it its capacity, its load factor (which should never come into play), and to keep items in access order.
Override removeEldestEntry to remove the oldest entry when the capacity is exceeded.
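Here is a minimal sketch of those four steps (the class name is my own; the answers below build on the same idea):

import java.util.LinkedHashMap;
import java.util.Map;

// Minimal LRU cache sketch following the steps above.
class SimpleLruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    SimpleLruCache(int capacity) {
        super(capacity, 0.75f, true); // true = access order, which gives LRU behaviour
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict the eldest entry once capacity is exceeded
    }
}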
My Question
This is a pretty standard LRU cache implementation. But one thing I can't figure out is how to be notified when the LinkedHashMap removes an entry because it hasn't been used recently enough.
I know I can make removeEldestEntry provide some form of notification... but is there any way to retrieve the element that is removed from the cache right when a new one is inserted (put) into the underlying map? Alternatively, is there a way to query for the last item that was removed from the cache?
You can get it to work with some creative use of thread local storage:
class LRUCacheLHM<K,V> extends LinkedHashMap<K,V> {

    private int capacity;

    public LRUCacheLHM(int capacity) {
        // 1 extra element as add happens before remove (101), and a load factor
        // big enough to avoid triggering a resize. true = keep in access order.
        super(capacity + 1, 1.1f, true);
        this.capacity = capacity;
    }

    private ThreadLocal<Map.Entry<K,V>> removed = new ThreadLocal<Map.Entry<K,V>>();

    private ThreadLocal<Boolean> report = new ThreadLocal<Boolean>();
    {
        report.set(false);
    }

    @Override
    public boolean removeEldestEntry(Map.Entry<K,V> eldest) {
        boolean res = size() > capacity;
        if (res && report.get()) {
            removed.set(eldest);
        }
        return res;
    }

    public Map.Entry<K,V> place(K k, V v) {
        report.set(true);
        put(k, v);
        try {
            return removed.get();
        } finally {
            removed.set(null);
            report.set(false);
        }
    }
}
The idea behind the place(K,V) method is to signal to removeEldestEntry that we would like to be told about the eldest entry, by setting a thread-local report flag to true. When removeEldestEntry sees this flag and knows that an entry is being removed, it places the eldest entry in the removed variable, which is thread-local as well.
The call to removeEldestEntry happens inside the call to the put method. After that, the eldest entry is either null or sitting inside the removed variable, ready to be harvested.
Calling set(null) on removed is important to avoid lingering memory leaks.
is there any way to retrieve the element that is removed from the cache right when a new one is inserted (put) into the underlying map?
The removeEldestEntry method is notified of the entry to be removed. You can add a listener that this method calls if you want to make it dynamically configurable.
From the Javadoc
protected boolean removeEldestEntry(Map.Entry eldest)
eldest - The least recently inserted entry in the map, or if this is an access-ordered map, the least recently accessed entry. This is the entry that will be removed if this method returns true. If the map was empty prior to the put or putAll invocation resulting in this invocation, this will be the entry that was just inserted; in other words, if the map contains a single entry, the eldest entry is also the newest.
is there a way to query for the last item that was removed from the cache?
The last item removed is gone from the map; however, you could have the subclass store this entry in a field that you can retrieve later.
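If thread-safety is not a concern, a hedged sketch of that idea (class and method names are my own) could look like this:

import java.util.LinkedHashMap;
import java.util.Map;

// Sketch: remember the entry that removeEldestEntry is about to evict.
class RecordingLruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;
    private Map.Entry<K, V> lastEvicted; // last entry evicted by a put, or null

    RecordingLruCache(int capacity) {
        super(capacity, 0.75f, true);
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        boolean evict = size() > capacity;
        lastEvicted = evict ? eldest : null; // record what is about to be removed
        return evict;
    }

    public Map.Entry<K, V> getLastEvicted() {
        return lastEvicted;
    }
}

A caller can then query getLastEvicted() right after a put() to see what, if anything, was dropped.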
I have the following scenario: I have an existing iterator Iterator<String> it and I iterate over its head (say first k elements, which are flagged elements, i.e. they start with '*' ). The only way to know that the flagged elements are over, is by noticing that the (k+1)th element is not flagged.
The problem is that if I do that, the iterator it will not provide me the first value anymore on the next call to next().
I want to pass this iterator to a method as its only argument, and I would like to avoid changing its signature and its implementation. I know I could do this:
public void methodAcceptingIterator(Iterator<String> it) //current signature
//change it to
public void methodAcceptingIterator(String firstElement, Iterator<String> it)
But this looks like a workaround/hack that decreases the elegance and generality of the code, so I don't want to do this.
Any ideas how I could solve this problem?
You could use Guava's PeekingIterator (link contains the javadoc for a static method which, given an Iterator, will return a wrapping PeekingIterator). That includes a method T peek() which shows you the next element without advancing to it.
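A hedged sketch of how that might look for the flagged-head scenario (the wrapper method name is my own; methodAcceptingIterator is from the question):

import java.util.Iterator;
import com.google.common.collect.Iterators;
import com.google.common.collect.PeekingIterator;

void processFlaggedHead(Iterator<String> it) {
    // Wrap the existing iterator; peek() inspects the next element without consuming it.
    PeekingIterator<String> peekingIt = Iterators.peekingIterator(it);
    while (peekingIt.hasNext() && peekingIt.peek().startsWith("*")) {
        String flagged = peekingIt.next(); // consume only the flagged head elements
        // ... handle the flagged element ...
    }
    // The first unflagged element has not been consumed, so the next consumer still sees it.
    methodAcceptingIterator(peekingIt);
}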
The solution is to create your own Iterator implementation which stores the firstElement and uses the existing iterator as an underlying Iterator to delegate the requests for the rest of the elements to.
Something like:
public class IteratorMissingFirst<E> implements Iterator<E> {

    private Iterator<E> underlyingIterator;
    private E firstElement;
    private boolean firstElOffered;

    public IteratorMissingFirst(E firstElement, Iterator<E> it) {
        this.firstElement = firstElement;
        this.underlyingIterator = it;
        this.firstElOffered = false;
    }

    public boolean hasNext() {
        if (!firstElOffered && firstElement != null) {
            return true;
        } else {
            return underlyingIterator.hasNext();
        }
    }

    public E next() {
        if (!firstElOffered) {
            firstElOffered = true;
            return firstElement;
        } else {
            return underlyingIterator.next();
        }
    }

    public void remove() {
        throw new UnsupportedOperationException(); // removal isn't meaningful for the stored first element
    }
}
Why don't you just have methodAcceptingIterator store the first element it gets out of the iterator in a variable? Or -- in a pinch -- just copy the contents of the Iterator into an ArrayList at the beginning of your method; now you can revisit elements as often as you like.
With Guava, you can implement Razvan's solution in an easier way by using some methods from the Iterators class:
Iterators.concat(Iterators.singletonIterator(firstElement), it)
This gives you an iterator that works similarly to IteratorMissingFirst, and it's easy to extend if you need to look at more than one element in front (but it creates two objects instead of only one).
I need to update some fixed-priority elements in a PriorityQueue based on their ID. I think it's quite a common scenario, here's an example snippet (Android 2.2):
for (Entry e : mEntries) {
    if (e.getId().equals(someId)) {
        e.setData(newData);
    }
}
I've then made Entry "immutable" (no setter methods) so that a new Entry instance is created and returned by setData(). I modified my method into this:
for (Entry e : mEntries) {
    if (e.getId().equals(someId)) {
        Entry newEntry = e.setData(newData);
        mEntries.remove(e);
        mEntries.add(newEntry);
    }
}
The code seems to work fine, but someone pointed out that modifying a queue while iterating over it is a bad idea: it may throw a ConcurrentModificationException, and that I'd need to add the elements I want to remove to an ArrayList and remove them later. He didn't explain why, and it looks like quite an overhead to me, but I couldn't find any specific explanation on the internet.
(This post is similar, but there priorities can change, which is not my case)
Can anyone clarify what's wrong with my code, how should I change it and - most of all - why?
Thanks,
Rippel
PS: Some implementation details...
PriorityQueue<Entry> mEntries = new PriorityQueue<Entry>(1, new Entry.EntryComparator());
with:
public static class EntryComparator implements Comparator<Entry> {
    public int compare(Entry my, Entry their) {
        if (my.mPriority < their.mPriority) {
            return 1;
        } else if (my.mPriority > their.mPriority) {
            return -1;
        }
        return 0;
    }
}
This code is in the Java 6 implementation of PriorityQueue:
private class Itr implements Iterator<E> {
    /**
     * The modCount value that the iterator believes that the backing
     * Queue should have. If this expectation is violated, the iterator
     * has detected concurrent modification.
     */
    private int expectedModCount = modCount;

    public E next() {
        if (expectedModCount != modCount) {
            throw new ConcurrentModificationException();
        }
    }
}
Now, why is this code here? If you look at the Javadoc for ConcurrentModificationException you will find that the behaviour of an iterator is undefined if modification occurs to the underlying collection before iteration completes. As such, many of the collections implement this modCount mechanism.
To fix your code
You need to ensure that you don't modify the queue mid-loop. If your code is single-threaded (as it appears to be) then you can simply do as your coworker suggested and copy the new entries into a list for later inclusion. Also, using the Iterator.remove() method is documented to prevent a ConcurrentModificationException for the removal itself. An example:
List<Entry> toAdd = new ArrayList<Entry>();
Iterator<Entry> it = mEntries.iterator();
while (it.hasNext()) {
    Entry e = it.next();
    if (e.getId().equals(someId)) {
        Entry newEntry = e.setData(newData);
        it.remove();
        toAdd.add(newEntry);
    }
}
mEntries.addAll(toAdd);
The Javadoc for PriorityQueue says explicitly:
"Note that this implementation is not synchronized. Multiple threads should not access a PriorityQueue instance concurrently if any of the threads modifies the list structurally. Instead, use the thread-safe PriorityBlockingQueue class."
This seems to be your case.
What's wrong with your code was already explained: implementing an iterator that can consistently iterate through a collection while it is being modified is a rather hard task. You need to specify how to deal with removed items (will they be seen through the iterator?), added items, modified items... Even if you can do it consistently, it will be a rather complex and inefficient implementation, and mostly not very useful, since the "iterate without modifications" use case is much more common. So the Java architects chose to forbid modification while iterating, and most collections in the Java collections API follow this and throw ConcurrentModificationException if such a modification is detected.
As for your code: in my opinion, you just should not make the items immutable. Immutability is a great thing, but it should not be overused. If the Entry objects you use here are some kind of domain object and you really want them to be immutable, you can create some kind of temporary data-holder object (MutableEntry), use it inside your algorithm, and copy the data back into an Entry before returning. From my point of view that would be the best solution.
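A heavily hedged sketch of that idea (the field names, Entry accessors, and constructor are assumptions, since the full Entry class isn't shown in the question):

// Hypothetical mutable companion to the immutable Entry from the question.
class MutableEntry {
    String id;
    int priority;
    Object data;

    MutableEntry(Entry e) {                   // copy out of the immutable object
        this.id = e.getId();
        this.priority = e.getPriority();      // assumed accessor
        this.data = e.getData();              // assumed accessor
    }

    Entry toEntry() {                         // copy back once the algorithm is done
        return new Entry(id, priority, data); // assumed constructor
    }
}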
A slightly better implementation is:
List<Entry> toAdd = new ArrayList<Entry>();
for (Iterator<Entry> it = mEntries.iterator(); it.hasNext(); ) {
    Entry e = it.next();
    if (e.getId().equals(someId)) {
        Entry newEntry = e.setData(newData);
        it.remove();
        toAdd.add(newEntry);
    }
}
mEntries.addAll(toAdd);
This uses the iterator's remove() and a bulk add afterwards.