I'm looking for the collection with the fastest element removal. I tested an ArrayList with 1 million rows, and it turns out removing the first element is faster than removing the last one. It takes about 50 seconds to remove one million elements.
import java.util.ArrayList;

public class TestArray {

    final int numberOfElements = 1000000;

    public void testArray() {
        // test array
        ArrayList<String> testArray = new ArrayList<String>();
        for (int i = 0; i < numberOfElements; i++) {
            testArray.add("" + Math.random());
        }
        // testing speed when removing the first element
        long startTime = System.currentTimeMillis();
        while (true) {
            if (testArray.isEmpty()) {
                System.out.println("Milliseconds to finish when removing the first element " + (System.currentTimeMillis() - startTime));
                break;
            } else {
                String testString = testArray.get(0);
                testArray.remove(testString);
            }
        }
        testArray = new ArrayList<String>();
        for (int i = 0; i < numberOfElements; i++) {
            testArray.add("" + Math.random());
        }
        // testing speed when removing the last element
        long startTime2 = System.currentTimeMillis();
        while (true) {
            if (testArray.isEmpty()) {
                System.out.println("Milliseconds to finish when removing the last element " + (System.currentTimeMillis() - startTime2));
                break;
            } else {
                String testString = testArray.get(testArray.size() - 1);
                testArray.remove(testString);
            }
        }
    }
}
But I'm not sure if this is the fastest possible way. Is 50 seconds as fast as it gets? Or is there a better collection, for example, would a LinkedList do it faster? What is the fastest collection for removing elements one by one?
1) You should consider LinkedList, which has O(1) performance for the remove operation (explanation below), while ArrayList's remove is O(n).
2) You can try HashSet if you are not interested in duplicates.
LinkedList remove:
1) LinkedList removal at the beginning and end is constant time, since no traversal is required.
2) Removing middle elements takes longer, because the element has to be found first.
3) If you already have an iterator at the position you want to remove, the removal itself is constant time (see the sketch below).
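For illustration, a minimal sketch of these constant-time removals (the list contents are arbitrary, and this code is only meant to show the API, not to benchmark it):

import java.util.LinkedList;
import java.util.ListIterator;

public class LinkedListRemoveSketch {
    public static void main(String[] args) {
        LinkedList<String> list = new LinkedList<>();
        for (int i = 0; i < 1_000_000; i++) {
            list.add("" + Math.random());
        }

        // Removing at the ends never traverses the list.
        list.removeFirst(); // O(1)
        list.removeLast();  // O(1)

        // Removing while already positioned on an element is also O(1) per remove.
        ListIterator<String> it = list.listIterator();
        while (it.hasNext()) {
            it.next();
            it.remove(); // unlinks the current node, no searching or shifting
        }
        System.out.println("remaining: " + list.size());
    }
}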
The best-performing collection in my tests was TreeSet: if you insert objects according to a Comparable / Comparator, the collection stays ordered.
My times:
ArrayList
Milliseconds to finish when removing the first element 698
Milliseconds to finish when removing the last element 121960
TreeSet:
Milliseconds to finish when removing the first element 55
Milliseconds to finish when removing the last element 50
WARNING: With this solution you can't have duplicate objects in the collection.
@Test
public void testTreeSet() {
    /* RESULTS
     * Milliseconds to finish when removing the first element 55
     * Milliseconds to finish when removing the last element 50
     */
    // test set
    TreeSet<String> testArray = new TreeSet<String>();
    int numberOfElements = 100000;
    for (int i = 0; i < numberOfElements; i++) {
        testArray.add("" + Math.random());
    }
    // testing speed when removing the first element
    long startTime = System.currentTimeMillis();
    while (true) {
        if (testArray.isEmpty()) {
            System.out.println("Milliseconds to finish when removing the first element "
                    + (System.currentTimeMillis() - startTime));
            break;
        } else {
            // String testString = testArray.get(0);
            String testString = testArray.first();
            testArray.remove(testString);
        }
    }
    testArray = new TreeSet<String>();
    for (int i = 0; i < numberOfElements; i++) {
        testArray.add("" + Math.random());
    }
    // testing speed when removing the last element
    long startTime2 = System.currentTimeMillis();
    while (true) {
        if (testArray.isEmpty()) {
            System.out.println("Milliseconds to finish when removing the last element "
                    + (System.currentTimeMillis() - startTime2));
            break;
        } else {
            // String testString = testArray.get(testArray.size() - 1);
            String testString = testArray.last();
            testArray.remove(testString);
        }
    }
}
First: there must be something wrong with your benchmark; an ArrayList removes elements much more slowly than it adds them. This is because the underlying array must not have gaps, so elements need to be shifted whenever you remove from anywhere but the end.
The answer depends on whether you want to remove elements index-based or value-based.
In general, index-based operations are faster, because no expensive value comparisons need to be made.
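For example, in the benchmark from the question the slow part of "removing the last element" is the value-based remove(testString), which must search the whole list before removing; an index-based remove of the last element avoids both the search and the shifting. A rough sketch of that variant, reusing the setup from the question:

import java.util.ArrayList;

public class RemoveLastByIndexSketch {
    public static void main(String[] args) {
        ArrayList<String> testArray = new ArrayList<String>();
        for (int i = 0; i < 1_000_000; i++) {
            testArray.add("" + Math.random());
        }
        long start = System.currentTimeMillis();
        while (!testArray.isEmpty()) {
            // Index-based removal of the last element: no search, nothing to shift.
            testArray.remove(testArray.size() - 1);
        }
        System.out.println("Milliseconds removing last by index: "
                + (System.currentTimeMillis() - start));
    }
}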
Since any element you remove must have been added at some point, it is helpful to consider the add complexity as well:
ArrayList: add is O(n) in the worst case but amortized O(1) (cheap in practice); remove is O(n) (only removing the last element avoids shifting); find is O(1) index-based, O(n) value-based.
An example of the effect of amortized analysis in practice: adding one million elements in a row triggers only O(log n) reallocations of the backing array, so the total copying work stays linear and each individual add is cheap on average.
LinkedList: add at an arbitrary index is O(n) on average, addFirst/addLast are O(1), removeFirst/removeLast are O(1), find is O(n), getFirst/getLast are O(1). Note: you have to know that the element you want is at the beginning or end and call the corresponding method.
So, if you have a lot of consecutive add/remove operations and few search operations (other than getting the first or last element), I recommend using LinkedList.
If you never need two equal objects in the collection (i.e. objects for which equals() returns true), you should use LinkedHashSet. It has O(1) add, remove and contains, but an equal object can only be contained once.
Unfortunately index-based access is not possible here, and the methods are not synchronized either. But there is always a trade-off.
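A minimal sketch of draining a LinkedHashSet one element at a time through its iterator, assuming (as stated above) that duplicates are not needed:

import java.util.Iterator;
import java.util.LinkedHashSet;

public class LinkedHashSetDrainSketch {
    public static void main(String[] args) {
        LinkedHashSet<String> set = new LinkedHashSet<>();
        for (int i = 0; i < 1_000_000; i++) {
            set.add("" + Math.random()); // duplicates would be silently dropped
        }
        long start = System.currentTimeMillis();
        Iterator<String> it = set.iterator();
        while (it.hasNext()) {
            it.next();
            it.remove(); // O(1) removal of the current element
        }
        System.out.println("Milliseconds to drain: " + (System.currentTimeMillis() - start));
    }
}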
Some theory: according to the papers mentioned here, we cannot do better than amortized Ω(log n) for arbitrary insertion and removal of elements.
Related
for(int i = 0; i < points.size(); i = i+lines) {
points.remove(i);
}
The idea here is that a user can remove every other element, every third element, every fourth element, and so forth, of an ArrayList by entering an int lines that determines how many positions to skip. However, I realize the list gets smaller each time, which messes with the skip value. How do I account for this? I'm using java.util.ArrayList, so I don't have the option of just adding a method to the ArrayList class. Any help would be greatly appreciated.
I've performed a benchmark of all the answers proposed to this question so far.
For an ArrayList with ~100K elements (each a string), the results are as follows:
removeUsingRemoveAll took 15018 milliseconds (sleepToken)
removeUsingIter took 216 milliseconds (Arvind Kumar Avinash)
removeFromEnd took 94 milliseconds (WJS)
Removing an element from an ArrayList is a Θ(n) operation, as it has to shift all remaining elements in the array to the left (i.e. it's slow!). WJS's suggestion of removing elements from the end of the list first appears to be the fastest in-place method proposed so far.
However, for this problem I'd strongly suggest considering alternative data structures such as a LinkedList, whose iterator makes removing (or adding) elements in the middle of the list fast. Another alternative, if you have sufficient memory, is to build up the results in a separate list rather than trying to modify the list in place:
removeUsingIterLinked took 12 milliseconds
removeUsingSecondList took 3 milliseconds (sleepToken with WJS's comment)
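For reference, a rough sketch of what the second-list approach might look like here. The method name matches the benchmark above, but the exact filter used in the benchmark isn't shown, so the i % lines condition below is just an illustration of "drop every lines-th element":

import java.util.ArrayList;
import java.util.List;

public class SecondListSketch {
    // Returns a new list with every lines-th element (indices 0, lines, 2*lines, ...) dropped.
    static <T> List<T> removeUsingSecondList(List<T> points, int lines) {
        List<T> kept = new ArrayList<>(points.size());
        for (int i = 0; i < points.size(); i++) {
            if (i % lines != 0) { // keep everything we don't want to remove
                kept.add(points.get(i));
            }
        }
        return kept; // single pass, no per-removal shifting
    }
}

Because nothing is ever shifted, this stays a single O(n) pass no matter how many elements are dropped; the caller can then reassign or replace the original list.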
Use an Iterator with a counter, e.g. the following code will remove every other (i.e. every 2nd) element, starting with index 0:
Iterator<Point> itr = points.iterator();
int i = 0;
while(itr.hasNext()) {
itr.next();
if(i % 2 == 0) {
itr.remove();
}
i++;
}
Here, I've used i as a counter.
Similarly, you can use the condition i % 3 == 0 to remove every 3rd element (starting with index 0).
Here is a different approach: simply start from the end and remove in reverse. That way you won't mess up the index synchronization. To guarantee that removal starts with the second item from the front, ensure you start with the last odd index, which is (list.size() & ~1) - 1. If the size is 10, you will start with 9; if the size is 11, you will also start with 9.
List<Integer> list = IntStream.rangeClosed(1,11)
.boxed().collect(Collectors.toList());
for(int i = (list.size()&~1)-1; i>=0; i-=2) {
list.remove(i);
}
System.out.println(list);
Prints
[1, 3, 5, 7, 9, 11]
You could add them to a new ArrayList and then remove them all after iterating.
You could set count to remove every count-th element.
import java.util.ArrayList;
public class Test {
static ArrayList<String> test = new ArrayList<String>();
public static void main(String[] args) {
test.add("a");
test.add("b");
test.add("c");
test.add("d");
test.add("e");
ArrayList<String> toRemove = new ArrayList<String>();
int count = 2;
for (int i = 0; i < test.size(); i++) {
if (i % count == 0) {
toRemove.add(test.get(i));
}
}
test.removeAll(toRemove);
System.out.print(test);
}
}
I used the following code to compare the performance of a plain array, ArrayList and LinkedList:
import java.util.ArrayList;
import java.util.LinkedList;
public class Main3 {
public static void main(String[] args) throws Exception{
int n = 20000000;
long bt = 0, et = 0;
int[] a0 = new int[n];
ArrayList<Integer> a1 = new ArrayList<>(n);
LinkedList<Integer> a2 = new LinkedList<>();
Integer[] a3 = new Integer[n];
bt = System.currentTimeMillis();
for(int i=0; i<n; i++){
a0[i] = i;
}
et = System.currentTimeMillis();
System.out.println("===== loop0 time =======" + (et - bt));
bt = System.currentTimeMillis();
for(int i=0; i<n; i++){
a1.add(i);
}
et = System.currentTimeMillis();
System.out.println("===== loop1 time =======" + (et - bt));
bt = System.currentTimeMillis();
for(int i=0; i<n; i++){
a2.add(i);
}
et = System.currentTimeMillis();
System.out.println("===== loop2 time =======" + (et - bt));
bt = System.currentTimeMillis();
for(int i=0; i<n; i++){
a3[i] = i;
}
et = System.currentTimeMillis();
System.out.println("===== loop3 time =======" + (et - bt));
}
}
The result is
===== loop0 time =======11
===== loop1 time =======6776
===== loop2 time =======17305
===== loop3 time =======56
Why are the ArrayList and LinkedList so much slower than the array?
How can I improve the performance?
env:
Java: jdk1.8.0_231
Thanks
There are potential inaccuracies in your benchmark, but the overall ranking of the results is probably correct. You may get faster results for all of the benchmarks if you "warm-up" the code before taking timings to allow the JIT compiler to generate native code and optimise it. Some benchmark results may be closer or even equal.
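For example, a very rough way to add a warm-up phase to this kind of benchmark (a sketch only; a proper harness such as JMH is more reliable, and the class and method names here are made up for illustration):

import java.util.ArrayList;

public class WarmupSketch {
    // Times one fill of an ArrayList with n boxed ints.
    static long timeArrayListFill(int n) {
        ArrayList<Integer> list = new ArrayList<>(n);
        long start = System.currentTimeMillis();
        for (int i = 0; i < n; i++) {
            list.add(i);
        }
        return System.currentTimeMillis() - start;
    }

    public static void main(String[] args) {
        int n = 20_000_000;
        for (int warmup = 0; warmup < 3; warmup++) {
            timeArrayListFill(n); // warm-up runs, results discarded
        }
        System.out.println("measured time = " + timeArrayListFill(n) + " ms");
    }
}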
Iterating over an int array is going to be much faster than iterating over a List of Integer objects. A LinkedList is going to be slowest of all. These statements assume the optimiser doesn't make radical changes.
Let's look at why:
An int array (int[]) is a contiguous area of memory containing your 4 byte ints arranged end-to-end. The loop to iterate over this and set the elements just has to work its way through the block of memory setting each 4 bytes in turn. In principle an index check is required, but in practice the optimiser can realise this isn't necessary and remove it. The JIT compiler is well able to optimise this kind of thing based on native CPU instructions.
An ArrayList of Integer objects contains an array of references which point to individual Integer objects (or are null). Each Integer object has to be allocated separately (although Java caches Integer objects for small values). There is an overhead to allocating new objects, and in addition the reference may be 8 bytes instead of 4. Also, if the list size is not specified (though it is in your case) the internal array may need to be reallocated. There is also an overhead due to calling the add method instead of assigning to the array directly (though the optimizer may remove this).
Your array of Integer benchmark is similar to the array list but doesn't have the overhead of the list add method call (which has to track the list size). Probably your benchmark overstates the difference between this array and the array list though.
A LinkedList is the worst case. Linked lists are optimised for inserting in the middle. Each entry needs a node object holding references to the neighbouring items, in addition to the Integer object that needs allocating. This is a big memory overhead that also requires some initialisation, so you would not use a linked list unless you were expecting to insert a lot of elements into the middle of the list.
I did some research and wrote the following article: http://www.heavyweightsoftware.com/blog/linkedlist-vs-arraylist/ and wanted to post a question here.
import java.time.LocalDateTime
import java.time.temporal.ChronoUnit
import spock.lang.Specification

class ListPerformanceSpec extends Specification {
def "Throwaway"() {
given: "A Linked List"
List<Integer> list
List<Integer> results = new LinkedList<>()
when: "Adding numbers"
Random random = new Random()
//test each list 100 times
for (int ix = 0; ix < 100; ++ix) {
list = new LinkedList<>()
LocalDateTime start = LocalDateTime.now()
for (int jx = 0; jx < 100000; ++jx) {
list.add(random.nextInt())
}
LocalDateTime end = LocalDateTime.now()
long diff = start.until(end, ChronoUnit.MILLIS)
results.add(diff)
}
then: "Should be equal"
true
}
def "Linked list"() {
given: "A Linked List"
List<Integer> list
List<Integer> results = new LinkedList<>()
when: "Adding numbers"
Random random = new Random()
//test each list 100 times
for (int ix = 0; ix < 100; ++ix) {
list = new LinkedList<>()
LocalDateTime start = LocalDateTime.now()
for (int jx = 0; jx < 100000; ++jx) {
list.add(random.nextInt())
}
long total = 0
for (int jx = 0; jx < 10000; ++jx) {
for (Integer num : list) {
total += num
}
total = 0
}
LocalDateTime end = LocalDateTime.now()
long diff = start.until(end, ChronoUnit.MILLIS)
results.add(diff)
}
then: "Should be equal"
System.out.println("Linked list:" + results.toString())
true
}
def "Array list"() {
given: "A Linked List"
List<Integer> list
List<Integer> results = new LinkedList<>()
when: "Adding numbers"
Random random = new Random()
//test each list 100 times
for (int ix = 0; ix < 100; ++ix) {
list = new ArrayList<>()
LocalDateTime start = LocalDateTime.now()
for (int jx = 0; jx < 100000; ++jx) {
list.add(random.nextInt())
}
long total = 0
for (int jx = 0; jx < 10000; ++jx) {
for (Integer num : list) {
total += num
}
total = 0
}
LocalDateTime end = LocalDateTime.now()
long diff = start.until(end, ChronoUnit.MILLIS)
results.add(diff)
}
then: "Should be equal"
System.out.println("Array list:" + results.toString())
true
}
}
Why does ArrayList outperform LinkedList by 28% for sequential access when LinkedList should be faster?
My question is different from When to use LinkedList over ArrayList? because I'm not asking when to choose it, but why it's faster.
Array-based lists, such as Java's ArrayList, use much less memory for the same amount of data than link-based lists (LinkedList), and that memory is organized sequentially. This greatly reduces CPU cache thrashing. Since an access to RAM takes 10-20 times longer than an L1/L2 cache access, this alone causes a significant time difference.
You can read more about these cache issues in books like this one, or similar resources.
On the other hand, link-based lists outperform array-based ones in operations like inserting into or deleting from the middle of the list.
For a structure that offers both memory economy (and therefore fast iteration) and fast insertion/deletion, look at combined approaches such as in-memory B⁺-trees, or an array of array lists with proportionally increasing sizes.
From LinkedList source code:
/**
 * Appends the specified element to the end of this list.
 *
 * <p>This method is equivalent to {@link #addLast}.
 *
 * @param e element to be appended to this list
 * @return {@code true} (as specified by {@link Collection#add})
 */
public boolean add(E e) {
    linkLast(e);
    return true;
}

/**
 * Links e as last element.
 */
void linkLast(E e) {
    final Node<E> l = last;
    final Node<E> newNode = new Node<>(l, e, null);
    last = newNode;
    if (l == null)
        first = newNode;
    else
        l.next = newNode;
    size++;
    modCount++;
}
From ArrayList source code:
/**
 * Appends the specified element to the end of this list.
 *
 * @param e element to be appended to this list
 * @return <tt>true</tt> (as specified by {@link Collection#add})
 */
public boolean add(E e) {
    ensureCapacityInternal(size + 1);  // Increments modCount!!
    elementData[size++] = e;
    return true;
}

private void ensureExplicitCapacity(int minCapacity) {
    modCount++;
    // overflow-conscious code
    if (minCapacity - elementData.length > 0)
        grow(minCapacity);
}
So the linked list has to create a new node for each element added, while the array list does not. The ArrayList does not reallocate/resize for each new element either, so most of the time the array list simply sets the object in the array and increments its size, while the linked list does much more work.
You also commented:
When I wrote a linked list in college, I allocated blocks at a time and then farmed them out.
I do not think this would work well in Java. You cannot do pointer tricks in Java, so you would have to allocate a lot of small arrays, or create empty nodes ahead of time. In both cases the overhead would probably be a bit higher.
Why does ArrayList outperform LinkedList by 28% for sequential access when LinkedList should be faster?
You're assuming that, but don't provide anything to back it up. But it's not really a great surprise. An ArrayList has an array as the underlying data store. Accessing this sequentially is extremely fast, because you know exactly where every element is going to be. The only slowdown comes when the array grows beyond a certain size and needs to be expanded, but that can be optimised.
The real answer would probably be: check the Java source code, and compare the implementations of ArrayList and LinkedList.
One explanation is that your base assumption (that multiplication is slower than memory fetches) is questionable.
Based on this document, an AMD Bulldozer takes 1 clock cycle to perform a 64-bit integer multiply instruction (register x register) with 6 cycles of latency1. By contrast, a memory-to-register load takes 1 clock cycle with 4 cycles of latency. But that assumes you get a cache hit for the memory fetch. If you get a cache miss, you need to add a number of cycles. (20 clock cycles for an L2 cache miss, according to this source.)
Now that is just one architecture, and others will vary. We also need to consider other issues, like constraints on the number of multiplications that can be overlapped, and how well the compiler can organize the instructions to minimize instruction dependencies. But the fact remains that for a typical modern pipelined chip architecture, the CPU can execute integer multiplies as fast as it can execute memory-to-register moves, and much faster if there are more cache misses in the memory fetches.
Your benchmark is using lists with 100,000 Integer elements. When you look at the amount of memory involved, and the relative locality of the heap nodes that represent the lists and the elements, the linked list case will use significantly more memory, and have correspondingly worse memory locality. That will lead to more cache misses per cycle of the inner loop, and worse performance.
Your benchmark results are not surprising2 to me.
The other thing to note is that if you use Java's LinkedList, a separate heap node is allocated to represent each list entry. You can implement your own linked lists more efficiently if your element class has its own next field that can be used to chain the elements. However, that brings its own limitations; e.g. an element can only be in one list at a time.
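A minimal sketch of such an "intrusive" list, where the element itself carries the link, with the stated limitation that an element can only sit in one list at a time:

// Sketch of an intrusive singly linked list: no separate node object per element.
class Element {
    int value;
    Element next; // link lives inside the element

    Element(int value) { this.value = value; }
}

class IntrusiveList {
    private Element head;

    void addFirst(Element e) { // O(1), no extra node allocation
        e.next = head;
        head = e;
    }

    long sum() { // sequential traversal
        long total = 0;
        for (Element e = head; e != null; e = e.next) {
            total += e.value;
        }
        return total;
    }
}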
Finally, as @maaartinus points out, a full IMUL is not required in the case of a Java ArrayList. When reading or writing the ArrayList's array, the indexing multiplication will be either x 4 or x 8, and that can be performed by a MOV with one of the standard addressing modes; e.g.
MOV EAX, [EDX + EBX*4 + 8]
This multiplication can be done (at the hardware level) by shifting with much less latency than 64 bit IMUL.
1 - In this context, the latency is the number of cycles delay before the result of the instruction is available ... to the next instruction that depends on it. The trick is to order the instructions so that other work is done during the delay.
2 - If anything, I am surprised that LinkedList appears to be doing so well. Maybe calling Random.nextInt() and autoboxing the result is dominating the loop times?
So I am trying to solve this problem: http://oj.leetcode.com/problems/merge-intervals/
My solution is:
public class Solution {
public ArrayList<Interval> merge(ArrayList<Interval> intervals) {
// Start typing your Java solution below
// DO NOT write main() function
// ArrayList<Interval> result = new ArrayList<Interval>();
//First sort the intervals
Collections.sort(intervals,new Comparator<Interval>(){
public int compare(Interval interval1, Interval interval2) {
if(interval1.start > interval2.start) return 1;
if(interval1.start == interval2.start) return 0;
if(interval1.start < interval2.start) return -1;
return 42;
}
});
for(int i = 0; i < intervals.size() - 1; i++){
Interval currentInterval = intervals.get(i);
Interval nextInterval = intervals.get(i+1);
if(currentInterval.end >= nextInterval.start){
intervals.set(i,new Interval(currentInterval.start,nextInterval.end));
intervals.remove(i+1);
i--;
}
}
return intervals;
}
}
I have seen some blogs using exactly the same solution and getting accepted, but mine is rejected because it takes too long. Can you enlighten me as to why it takes longer than expected?
Cheers
EDIT: Solved. remove is too costly; using a new ArrayList to store the result is faster.
Initially you are sorting all your intervals; per the Javadoc, this operation has complexity O(N*log(N)).
But after that, as I have noticed, you are iterating over the ArrayList and sometimes removing elements from it.
Removing an element from an ArrayList has complexity O(N) (as the underlying implementation of ArrayList is a plain array, removing any element from the middle requires shifting the entire right part of the array).
As you do that in a loop, the overall complexity of your algorithm becomes O(N^2).
I'd suggest using a LinkedList (removing through its ListIterator) instead of an ArrayList in this case.
You could also improve your sorting by using a single comparison call instead of three:
Collections.sort(intervals, new Comparator<Interval>(){
    public int compare(Interval interval1, Interval interval2) {
        // Integer.compare avoids the overflow that interval1.start - interval2.start could cause
        return Integer.compare(interval1.start, interval2.start);
    }
});
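Alternatively, as the edit to the question notes, you can avoid in-place removal entirely by building the merged result in a new list. A rough sketch, with a minimal stand-in for the Interval class assumed from the question (public start/end fields and a two-argument constructor):

import java.util.ArrayList;
import java.util.Collections;
import java.util.Comparator;

public class MergeSketch {

    // Minimal stand-in for LeetCode's Interval class from the question.
    static class Interval {
        int start, end;
        Interval(int start, int end) { this.start = start; this.end = end; }
    }

    public static ArrayList<Interval> merge(ArrayList<Interval> intervals) {
        ArrayList<Interval> result = new ArrayList<Interval>();
        if (intervals.isEmpty()) {
            return result;
        }
        Collections.sort(intervals, new Comparator<Interval>() {
            public int compare(Interval a, Interval b) {
                return Integer.compare(a.start, b.start);
            }
        });
        Interval current = intervals.get(0);
        for (int i = 1; i < intervals.size(); i++) {
            Interval next = intervals.get(i);
            if (current.end >= next.start) {
                // Overlap: extend the current interval instead of removing from the list.
                current = new Interval(current.start, Math.max(current.end, next.end));
            } else {
                result.add(current);
                current = next;
            }
        }
        result.add(current);
        return result;
    }
}

Since nothing is ever removed from the sorted list, the pass after sorting is O(N) instead of O(N^2).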
I've got a weird problem.
I thought this would cost me a few minutes, but I have been struggling for a few hours now...
Here is what I got:
for (int i = 0; i < size; i++){
if (data.get(i).getCaption().contains("_Hardi")){
data.remove(i);
}
}
data is the ArrayList.
In the ArrayList I have some strings (14 in total or so), and 9 of them have the name _Hardi in them.
With the code above I want to remove them.
If I replace data.remove(i); with a System.out.println, it prints something 9 times, which is good, because _Hardi is in the ArrayList 9 times.
But when I use data.remove(i); it doesn't remove all 9, only a few.
I did some tests and I also saw this:
When I rename the Strings to:
Hardi1
Hardi2
Hardi3
Hardi4
Hardi5
Hardi6
Then it removes only the odd-numbered ones (1, 3, 5 and so on).
It skips one every time, but I can't figure out why.
How do I fix this? Or is there another way to remove them?
The problem here is that you are iterating from 0 to size and deleting items inside the loop. Deleting items reduces the size of the list, which will fail when you try to access indexes greater than the effective size (the size after the deleted items).
There are two approaches to this.
Delete using an iterator if you do not want to deal with indexes:
for (Iterator<Object> it = data.iterator(); it.hasNext();) {
if (it.next().getCaption().contains("_Hardi")) {
it.remove();
}
}
Else, delete from the end.
for (int i = size-1; i >= 0; i--){
if (data.get(i).getCaption().contains("_Hardi")){
data.remove(i);
}
}
You shouldn't remove items from a List while you iterate over it. Instead, use Iterator.remove() like:
for (Iterator<Object> it = list.iterator(); it.hasNext();) {
if ( condition is true ) {
it.remove();
}
}
Every time you remove an item, you change the indexes of the ones after it (so when you delete list[1], the old list[2] becomes list[1]), hence the skip.
Here's a really easy way around it: (count down instead of up)
for(int i = list.size() - 1; i>=0; i--)
{
if(condition...)
list.remove(i);
}
It's because when you remove an element from a list, the list's elements move up. So if you remove the first element, i.e. the one at index 0, the element at index 1 is shifted to index 0, but your loop counter keeps increasing on every iteration; so instead of getting the updated element at index 0 you get the element at index 1. So just decrease the counter by one every time you remove an element from your list.
You can use the below code to make it work fine :
for (int i = 0; i < data.size(); i++){
if (data.get(i).getCaption().contains("_Hardi")){
data.remove(i);
i--;
}
}
It makes perfect sense if you think it through. Say you have a list [A, B, C]. The first pass through the loop, i == 0. You see element A and then remove it, so the list is now [B, C], with element 0 being B. Now you increment i at the end of the loop, so you're looking at list[1] which is C.
One solution is to decrement i whenever you remove an item, so that it "canceles out" the subsequent increment. A better solution, as matt b points out above, is to use an Iterator<T> which has a built-in remove() function.
Speaking generally, it's a good idea, when facing a problem like this, to bring out a piece of paper and pretend you're the computer -- go through each step of the loop, writing down all of the variables as you go. That would have made the "skipping" clear.
I don't understand why most people consider this solution the best:
for (Iterator<Object> it = data.iterator(); it.hasNext();) {
if (it.next().getCaption().contains("_Hardi")) {
it.remove();
}
}
The third argument of the for loop is empty, because the advance has been moved into the body. Moreover, it.next() not only advances the iterator but is also used to get the data. To me the for loop is misleading here; why not use while?
Iterator<Object> it = data.iterator();
while (it.hasNext()) {
Object obj = it.next();
if (obj.getCaption().contains("_Hardi")) {
it.remove();
}
}
Because your index isn't valid anymore once you delete a value.
Moreover, you won't be able to go up to size, since if you remove one element the size has changed.
You may use an iterator to achieve that:
for (Iterator<Object> it = data.iterator(); it.hasNext();) {
    if (it.next().getCaption().contains("_Hardi")) {
        it.remove(); // performance is low, O(n), on an ArrayList
    }
}
If you need to do a lot of removals on a list, it's better to use a LinkedList, where removal through the iterator is roughly O(1).
In an ArrayList, each removal is roughly O(n), so the impact of the remove operations is much higher.
This is late, but it might help someone.
Iterator<YourObject> itr = yourList.iterator();
// remove the objects from list
while (itr.hasNext())
{
YourObject object = itr.next();
if (Your Statement) // id == 0
{
itr.remove();
}
}
In addition to the existing answers, you can use a regular while loop with a conditional increment:
int i = 0;
while (i < data.size()) {
if (data.get(i).getCaption().contains("_Hardi"))
data.remove(i);
else i++;
}
Note that data.size() must be called every time in the loop condition, otherwise you'll end up with an IndexOutOfBoundsException, since every item removed alters your list's original size.
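On Java 8 and later you can also let the library do the bookkeeping: Collection.removeIf removes all matching elements in a single pass. A small self-contained example (using plain strings instead of the question's caption objects):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class RemoveIfSketch {
    public static void main(String[] args) {
        List<String> data = new ArrayList<>(Arrays.asList("a_Hardi", "b", "c_Hardi", "d"));
        data.removeIf(s -> s.contains("_Hardi")); // single pass, no index bookkeeping
        System.out.println(data); // [b, d]
    }
}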
This happens because by deleting elements you change the indexes of the remaining elements in the ArrayList.
import java.util.ArrayList;
public class IteratorSample {
public static void main(String[] args) {
// TODO Auto-generated method stub
ArrayList<Integer> al = new ArrayList<Integer>();
al.add(1);
al.add(2);
al.add(3);
al.add(4);
System.out.println("before removal!!");
displayList(al);
for(int i = al.size()-1; i >= 0; i--){
if(al.get(i)==4){
al.remove(i);
}
}
System.out.println("after removal!!");
displayList(al);
}
private static void displayList(ArrayList<Integer> al) {
for(int a:al){
System.out.println(a);
}
}
}
output:
before removal!!
1
2
3
4
after removal!!
1
2
3
There is an easier way to solve this problem without explicitly creating an iterator object. Here is the concept. Suppose your ArrayList contains a list of names:
names = [James, Marshall, Susie, Audrey, Matt, Carl];
To remove everything from Susie forward, simply get the index of Susie and assign it to a new variable:
int location = names.indexOf("Susie"); // index equals 2
Now that you have the index, tell Java how many times you want to remove values from the ArrayList:
for (int i = 0; i < 4; i++) { // remove Susie through Carl (4 elements)
    names.remove(names.get(location)); // remove the value at index 2
}
Every time the loop runs, the ArrayList gets shorter. Since you have fixed the index and are simply counting the number of removals, you're all set. Here is what the list looks like after each pass:
[2]
names = [James, Marshall, Susie, Audrey, Matt, Carl]; // first pass gets the index, i = 0
[2]
names = [James, Marshall, Audrey, Matt, Carl]; // after the first pass the list shrank; Audrey is now at index 2 and i = 1
[2]
names = [James, Marshall, Matt, Carl]; // Matt is now at index 2 and i = 2
[2]
names = [James, Marshall, Carl]; // Carl is now at index 2 and i = 3
names = [James, Marshall]; // for loop ends
Here is a snippet of what your final method may look like:
public void remove_user(String name) {
    int location = names.indexOf(name); // index of name in the list
    if (names.remove(name)) {
        for (int i = 0; i < 7; i++) {
            names.remove(names.get(location));
        } // end for
        System.out.println(name + " is no longer in the Group.");
    } // end if
} // end method
This is a common problem when using ArrayLists, and it happens because the length (size) of an ArrayList can change. While deleting, the size changes too, so after the first iteration your code goes haywire. The best advice is either to use an Iterator or to loop from the back; I recommend the backward loop because I think it's less complex and it still works fine with numerous elements:
//Let's decrement!
for(int i = size-1; i >= 0; i--){
if (data.get(i).getCaption().contains("_Hardi")){
data.remove(i);
}
}
Still your old code, only looped differently!
I hope this helps...
Merry coding!!!