This question already has answers here:
Is there a concurrent List in Java's JDK?
(8 answers)
Closed 5 years ago.
I have a list of accounts that will be updated, though not too frequently, roughly 1-2 times a day.
There will be a 'contains' lookup on this data at much more regular intervals.
An ideal data structure would have been a ConcurrentLinkedList, which unfortunately doesn't exist.
Is CopyOnWriteArrayList the sole preferred option?
If you are mostly reading, why do you need a concurrent data structure? You can use a HashSet or HashMap instead, which will make reads faster. Since it is updated so infrequently, you can synchronize the writing part explicitly.
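A minimal sketch of that idea, assuming account IDs are Strings (your own Account type works the same way): keep an immutable HashSet behind a volatile reference for the frequent lookups, and rebuild it under synchronization on the rare updates.

    import java.util.Collections;
    import java.util.HashSet;
    import java.util.Set;

    // Sketch only: read-mostly lookup backed by a plain HashSet.
    // The String account IDs are an assumption; use your own key type.
    class AccountRegistry {

        // volatile so readers always see the most recently published set, without locking
        private volatile Set<String> accounts = Collections.emptySet();

        // frequent, lock-free lookup
        boolean contains(String accountId) {
            return accounts.contains(accountId);
        }

        // rare update (~1-2 times a day), so the synchronization cost is negligible
        synchronized void replaceAll(Set<String> newAccounts) {
            accounts = Collections.unmodifiableSet(new HashSet<>(newAccounts));
        }
    }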
This question already has answers here:
What is the difference between LRU and LFU
(4 answers)
Closed 6 years ago.
I was asked this question in an interview: first the difference between LRU and LFU, and then to implement both. I knew LRU can be implemented with a LinkedHashMap, but I got confused with LFU. Can anyone tell me how to implement it with the simplest data structures, with a good explanation?
Also, can it be implemented with LinkedHashMap too?
Assuming the cache entries were keyed, you could do it with a queue (LinkedList) and a map (HashMap).
Pick a bound b for the queue. On every cache request, push the key for the requested page onto the queue and then poll the queue (assume the queue and map are already full, so this keeps the queue at size b).
If the page you want is in the map, return it. If it isn't in the map, find the least frequently occurring key in the queue that is either in the map or is the key for the page you want. If that key is the key for the page you want, do nothing; otherwise remove the entry for that key from the map and insert your page into the map.
Return your page.
Complexity for a cache hit is O(1), O(b) for a miss.
This assumes you want to bound the frequency, i.e. "least frequently used in the last b requests" instead of "least frequently used ever".
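A minimal sketch of that bounded-frequency approach, assuming keyed pages and a caller-supplied loader for misses (the class name BoundedLfuCache and the loader parameter are just illustrative, not part of the answer above):

    import java.util.ArrayDeque;
    import java.util.Deque;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.function.Function;

    // Sketch only: LFU bounded to the last b requests, as described above.
    class BoundedLfuCache<K, V> {

        private final int bound;                              // b
        private final Deque<K> recent = new ArrayDeque<>();   // keys of the last b requests
        private final Map<K, V> pages = new HashMap<>();      // the cached pages

        BoundedLfuCache(int bound) {
            this.bound = bound;
        }

        V get(K key, Function<K, V> loader) {
            // record the request and keep only the last b keys
            recent.addLast(key);
            if (recent.size() > bound) {
                recent.pollFirst();
            }

            V page = pages.get(key);
            if (page != null) {
                return page;                                   // hit: O(1)
            }

            // miss: O(b) scan for the least frequently requested key that is cached
            // (or is the requested key itself)
            Map<K, Integer> counts = new HashMap<>();
            for (K k : recent) {
                if (pages.containsKey(k) || k.equals(key)) {
                    counts.merge(k, 1, Integer::sum);
                }
            }
            K victim = key;
            int fewest = counts.getOrDefault(key, 1);
            for (Map.Entry<K, Integer> e : counts.entrySet()) {
                if (e.getValue() < fewest) {
                    fewest = e.getValue();
                    victim = e.getKey();
                }
            }

            page = loader.apply(key);
            if (!victim.equals(key)) {
                pages.remove(victim);                          // evict the least frequent entry
                pages.put(key, page);                          // cache the new page
            }
            // if the requested key itself is least frequent, "do nothing": return it uncached
            return page;
        }
    }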
This question already has answers here:
Closing Streams in Java
(6 answers)
Closed 6 years ago.
With huge advancements in CPUs able to process massive amounts of information in fractions of a second, why is it important that I close a file stream?
Remember that not all devices are the same; platforms like mobile (smartphones and tablets) need to be as efficient as possible. Or if the application has a big user base: maybe when 400 people are logged in there won't be many problems, but what happens when it grows to ~40k? You have to make your code as versatile as possible and always think about scalability.
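Beyond efficiency, every open stream also holds an operating-system file handle until it is closed. A minimal sketch of the usual way to guarantee release (the file name "data.txt" is just a placeholder):

    import java.io.BufferedReader;
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class ReadFile {
        public static void main(String[] args) throws IOException {
            // try-with-resources closes the reader even if an exception is thrown
            try (BufferedReader reader = Files.newBufferedReader(Paths.get("data.txt"))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                }
            } // reader.close() runs automatically here, releasing the file handle
        }
    }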
This question already has answers here:
implement AbstractTableModel for a Java collection
(6 answers)
Closed 8 years ago.
I have created a JTable using AbstractTableModel, to which I added a collection of objects (an ArrayList).
I want to be able to search through the objects and return, in the same JTable, only the ones that meet a condition (for example, the names start with "St"). Theoretically, how can I do that? Do I have to make new ArrayLists for every condition and store the searched (and returned) objects there? Is there a better/simpler way? Thanks
As shown here, you can access a Collection in your implementation of AbstractTableModel. As shown here, you can sort and filter the results without modifying the original data structure. A complete example is examined here.
Here is the thing:
if you are looking for syntactic (text-based) filtering, this functionality already exists in Java as TableRowSorter, which can be combined with a RowFilter to define the visible subset of rows and show it.
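A minimal sketch of that approach, assuming column 0 holds the name (the class and method names here are just illustrative):

    import javax.swing.JTable;
    import javax.swing.RowFilter;
    import javax.swing.table.TableModel;
    import javax.swing.table.TableRowSorter;

    // Sketch only: filter the JTable view without touching the AbstractTableModel.
    class NameFilterSupport {

        static TableRowSorter<TableModel> installSorter(JTable table) {
            TableRowSorter<TableModel> sorter = new TableRowSorter<>(table.getModel());
            table.setRowSorter(sorter);
            return sorter;
        }

        static void showNamesStartingWith(TableRowSorter<TableModel> sorter, String prefix) {
            // keep only rows whose column 0 matches the regex; the prefix is assumed
            // to contain no regex metacharacters
            sorter.setRowFilter(RowFilter.regexFilter("^" + prefix, 0));
            // sorter.setRowFilter(null) would show all rows again
        }
    }

The model keeps all of the objects; only the view changes, so you don't need a separate ArrayList per condition.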
This question already has answers here:
Memory overhead of Java HashMap compared to ArrayList
Closed 10 years ago.
I tried to search on Google but didn't find an answer. I need to store 160 entries in a collection, and I don't want to iterate over them; I want to get the value by one entry, a 2D point. What is the best choice,
with less memory consumption and faster access?
Thanks a lot in advance ;)
For 160 entries? Who cares. Use whichever API is better for this – probably the HashMap.
Also, with data structures, very frequently the tradeoff is speed versus memory use – as a rule you can't have both. (E.g. a hash table that uses chaining for collision resolution will have memory overhead for the buckets over an array; accessing an element by key in it will be faster than searching the array for the key.)
With less memory consumption and faster access?
HashMap probably gives faster access. This could be computationally important, but only if the application does a huge number of lookups using the collection.
ArrayList probably gives least memory usage. However, for a single collection containing 160 elements, the difference is probably irrelevant.
My advice is to not spend a lot of time trying to decide which is best. Toss a coin if you need to, and then move on to more important problems. You only should be worrying about this kind of thing (where the collection is always small) if CPU and/or memory profiling tells you that this data structure is a critical bottleneck.
If HashMap vs ArrayList are your only two options, obviously it is going to be a HashMap for speed of retrieval. I am not sure about memory usage. But ArrayList has the ability to maintain order.
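A minimal sketch of the HashMap option, assuming an immutable 2D point as the key (the Point class and String values below are just placeholders for your own types):

    import java.util.HashMap;
    import java.util.Map;

    // Sketch only: keyed lookup by a 2D point instead of iterating a list.
    class PointLookup {

        // a small immutable key with proper equals/hashCode so HashMap lookups work
        static final class Point {
            final int x, y;
            Point(int x, int y) { this.x = x; this.y = y; }

            @Override public boolean equals(Object o) {
                return o instanceof Point && ((Point) o).x == x && ((Point) o).y == y;
            }
            @Override public int hashCode() { return 31 * x + y; }
        }

        public static void main(String[] args) {
            Map<Point, String> entries = new HashMap<>(); // ~160 entries is tiny either way
            entries.put(new Point(3, 7), "value at (3,7)");

            // O(1) expected lookup by key, no iteration needed
            System.out.println(entries.get(new Point(3, 7)));
        }
    }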
This question already has answers here:
How to determine the size of an object in Java
(28 answers)
Closed 9 years ago.
I am using an LRU cache which has a limit on memory usage. The LRU cache consists of two data structures: a hash map and a linked list. The hash map holds the cache objects and the linked list keeps a record of the cache objects' access order. In order to determine Java object memory usage, I use an open source tool, the Classmexer agent,
which is a simple Java instrumentation agent at http://www.javamex.com/classmexer/.
I also tried another tool named SizeOf at http://sizeof.sourceforge.net/.
The problem is that the performance cost is very high. The time for a single object-size measurement is around 1.3 seconds, which is very slow. This can reduce the benefits of caching to zero.
Is there any other solution for measuring a Java object's memory size while it is alive?
Thanks.
Getting an accurate, to-the-byte value will be expensive, as it needs to use reflection to get a nested size, but it can be done.
In Java, what is the best way to determine the size of an object?
http://docs.oracle.com/javase/7/docs/api/java/lang/instrument/Instrumentation.html
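For completeness, a minimal sketch of the Instrumentation approach from the linked docs: the class is packaged as a -javaagent, the JVM calls premain with an Instrumentation instance, and getObjectSize reports the shallow size of one object (nested/retained size still requires walking references, which is the expensive part). The class and jar names are illustrative.

    import java.lang.instrument.Instrumentation;

    // Sketch only: a tiny agent exposing Instrumentation.getObjectSize.
    // The jar manifest must declare Premain-Class: SizeAgent.
    public class SizeAgent {

        private static volatile Instrumentation instrumentation;

        public static void premain(String agentArgs, Instrumentation inst) {
            instrumentation = inst;
        }

        // shallow size in bytes of this one object, not of anything it references
        public static long shallowSizeOf(Object o) {
            if (instrumentation == null) {
                throw new IllegalStateException("Run the JVM with -javaagent:size-agent.jar");
            }
            return instrumentation.getObjectSize(o);
        }
    }

Run your application with java -javaagent:size-agent.jar YourApp so the JVM hands the agent its Instrumentation instance.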
Look at this: How to calculate the memory usage of a Java array. Maybe it helps you in the analysis.
Since you already have a tool that does it...the reality is that there's no fast way to do this in Java.