How to add all integers for duplicate elements in a HashMap? - Java

I have the following HashMap:
Map<String, Integer> map = new HashMap<>();
How can I sum up all the integers for duplicate Strings? Or is there a better way to do it using a Set?
for example, if I add these elements:
car 100
TV 140
car 5
charger 10
TV 10
I want the map to end up with:
car 105
TV 150
charger 10

I believe your question is: how do I put key/value pairs into a map in a way that, for an existing key, combines the new value with the current one rather than replacing it?
Java has a Map method specifically for this purpose:
map.merge(key, value, (v, n) -> v + n);
This will add the value if the key isn't in the map. Otherwise it'll replace the current value with the sum of the current and new values.
The merge method was introduced in Java 8.
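For example, applying merge to the data from the question (a small self-contained sketch; the class name is just for this demo):
import java.util.HashMap;
import java.util.Map;

public class SumDuplicatesDemo {
    public static void main(String[] args) {
        Map<String, Integer> map = new HashMap<>();
        map.merge("car", 100, Integer::sum);
        map.merge("TV", 140, Integer::sum);
        map.merge("car", 5, Integer::sum);
        map.merge("charger", 10, Integer::sum);
        map.merge("TV", 10, Integer::sum);
        // prints something like {charger=10, car=105, TV=150}; HashMap iteration order is unspecified
        System.out.println(map);
    }
}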

First of all, you cannot add duplicate keys to a map.
But if I understood what you want, the code below may help you:
if (map.containsKey(key))
map.put(key, map.get(key) + newValue);
else
map.put(key, newValue);
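If you are on Java 8 or later but prefer not to use merge, getOrDefault expresses the same check-then-put logic in one line (an equivalent sketch):
map.put(key, map.getOrDefault(key, 0) + newValue);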

For Java 8 and higher
You may just want to use the Map#merge method. It is the easiest option: if the key does not exist, it adds it; if it does exist, it applies the merge function to the existing and new values.
map.merge("car", 100, Integer::sum);
map.merge("car", 20, Integer::sum);
System.out.println(map); // {car=120}

When you add "TV" for the second time, the first value (140) will be overridden, because you cannot have duplicate keys in a Map implementation. If you want to increment the value, you will need to check whether the key "TV" already exists and then increment/add the value.
For example:
if (map.containsKey(key)) {
value += map.get(key);
}
map.put(key, value);

HashMap doesn't store duplicate keys!
You can extend the HashMap class (Java >= 8):
public class MyHashMap2 extends HashMap<String, Integer>{
@Override
public Integer put(String key, Integer value) {
return merge(key, value, (v, n) -> v + n);
}
public static void main (String[] args) throws java.lang.Exception
{
MyHashMap2 list3=new MyHashMap2();
list3.put("TV", 10);
list3.put("TV", 20);
System.out.println(list3);
}
}
Or you can wrap the HashMap (aggregation) and implement put so that it adds the new value to the previous value.
HashMap<String, Integer> list = new HashMap<>();
list.put("TV", 10);
list.put("TV", 20);
System.out.println(list);
MyHashMap list2 = new MyHashMap();
list2.put("TV", 10);
list2.put("TV", 20);
System.out.println(list2);
//OUTPUT:
//{TV=20}
//MyHashMap [List={TV=30}]
public class MyHashMap implements Map<String, Integer>{
HashMap<String, Integer> list = new HashMap<>();
public MyHashMap() {
super();
}
@Override
public int size() {
return list.size();
}
@Override
public boolean isEmpty() {
return list.isEmpty();
}
@Override
public boolean containsKey(Object key) {
return list.containsKey(key);
}
@Override
public boolean containsValue(Object value) {
return list.containsValue( value);
}
@Override
public Integer get(Object key) {
return list.get(key);
}
@Override
public Integer put(String key, Integer value) {
if(list.containsKey(key))
list.put(key, list.get(key)+value);
else
list.put(key, value);
return value;
}
@Override
public Integer remove(Object key) {
return list.remove(key);
}
@Override
public void putAll(Map<? extends String, ? extends Integer> m) {
list.putAll(m);
}
@Override
public void clear() {
list.clear();
}
@Override
public Set<String> keySet() {
return list.keySet();
}
@Override
public Collection<Integer> values() {
return list.values();
}
@Override
public Set<java.util.Map.Entry<String, Integer>> entrySet() {
return list.entrySet();
}
@Override
public String toString() {
return "MyHashMap [list=" + list + "]";
}
}
You can try the code here: https://ideone.com/Wl4Arb

Related

Java - Ordered HashMap implementation with a change-key-name function

I am trying to create a user interface backed by a HashMap. Users could change values and change the names of the keys without disturbing the order of the keys. I searched and found LinkedHashMap, which kept the order of the keys in most cases. But when I remove a key and add it back after renaming it, it is always added at the end. So I've extended the LinkedHashMap class and added a changeKeyName() function.
Now it works (in my case), but I was wondering if it could be improved and made foolproof. I've only overridden the functions I was using. What other functions have to be overridden in order to make it complete?
Thanks in advance.
Here is the code:
private static class OrderedHashMap<K, V> extends LinkedHashMap<K, V> {
ArrayList<K> keys = new ArrayList<K>();
@Override
public V put(K key, V value) {
if (!keys.contains(key))
keys.add(key);
return super.put(key, value);
}
@Override
public V remove(Object key) {
keys.remove(key);
return super.remove(key);
}
@Override
public Set<K> keySet() {
LinkedHashSet<K> keys = new LinkedHashSet<K>();
for (K key : this.keys) {
keys.add(key);
}
return keys;
}
public void changeKeyName(K oldKeyName, K newKeyName) {
int index = keys.indexOf(oldKeyName);
keys.add(index, newKeyName);
keys.remove(keys.get(index + 1));
V value = super.get(oldKeyName);
super.remove(oldKeyName);
super.put(newKeyName, value);
}
@Override
public Set<Map.Entry<K, V>> entrySet() {
final OrderedHashMap<K, V> copy = this;
LinkedHashSet<Map.Entry<K, V>> keys = new LinkedHashSet<Map.Entry<K, V>>();
for (final K key : this.keys) {
final V value = super.get(key);
keys.add(new Map.Entry<K, V>() {
@Override
public K getKey() {
return key;
}
@Override
public V getValue() {
return value;
}
@Override
public V setValue(V value) {
return copy.put(getKey(), value);
}
});
}
return keys;
}
}
EDIT: I don't think the "why" was clear enough. Let's say we added the keys below.
{"key1":"value1"},
{"key2":"value2"},
{"key3":"value3"},
{"key4":"value4"}
And, for example, I want to change the key name of "key2". But since this is a user interface, the order of the keys should stay the same.
I did some research and found out that, apart from removing the key and re-putting the new key name with the same value, nothing could be done. So if we do that and change "key2" to "key2a":
{"key1":"value1"},
{"key3":"value3"},
{"key4":"value4"},
{"key2a":"value2"}
And what I want is this:
{"key1":"value1"},
{"key2a":"value2"},
{"key3":"value3"},
{"key4":"value4"}
So I just kept the keys in an ArrayList and returned them when the entrySet() and keySet() methods are called.
Have you considered simply using the TreeMap class instead of a custom subclass of LinkedHashMap? It will maintain order (sorted by key) as long as the keys implement the Comparable interface.
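A minimal sketch of that suggestion (the demo class name is mine); note that a TreeMap keeps keys in sorted order rather than insertion order, so a renamed key simply falls back into its sorted position:
import java.util.Map;
import java.util.TreeMap;

public class TreeMapRenameDemo {
    public static void main(String[] args) {
        Map<String, String> map = new TreeMap<>(); // String already implements Comparable
        map.put("key1", "value1");
        map.put("key2", "value2");
        map.put("key3", "value3");
        // "rename" key2 to key2a: remove and re-insert; sorting places it between key1 and key3
        map.put("key2a", map.remove("key2"));
        System.out.println(map); // {key1=value1, key2a=value2, key3=value3}
    }
}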
If you want to be able to change keys without affecting the hashing of the collection where the values are stored, try a custom class such as this:
private class VariableKeyMap<K, V> {
private LinkedHashMap<Integer, V> myCollection = new LinkedHashMap<Integer, V>();
private HashMap<K, Integer> aliases = new HashMap<K, Integer>();
int id = 0;
public void addEntry(K key, V value) {
id += 1;
aliases.put(key, id);
myCollection.put(id, value);
}
public V getValue(K key) {
return myCollection.get(aliases.get(key));
}
...
}
Then you can update your key alias without affecting where the value is actually stored:
public void changeKey(K oldKey, K newKey) {
int currentId = aliases.get(oldKey);
aliases.remove(oldKey);
aliases.put(newKey, currentId);
}

Use LinkedHashMap to implement LRU cache

I was trying to implement a LRU cache using LinkedHashMap.
In the documentation of LinkedHashMap (http://docs.oracle.com/javase/7/docs/api/java/util/LinkedHashMap.html), it says:
Note that insertion order is not affected if a key is re-inserted into the map.
But when I do the following puts
public class LRUCache<K, V> extends LinkedHashMap<K, V> {
private int size;
public static void main(String[] args) {
LRUCache<Integer, Integer> cache = LRUCache.newInstance(2);
cache.put(1, 1);
cache.put(2, 2);
cache.put(1, 1);
cache.put(3, 3);
System.out.println(cache);
}
private LRUCache(int size) {
super(size, 0.75f, true);
this.size = size;
}
@Override
protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
return size() > size;
}
public static <K, V> LRUCache<K, V> newInstance(int size) {
return new LRUCache<K, V>(size);
}
}
The output is
{1=1, 3=3}
This indicates that the re-insertion did affect the order.
Does anybody have an explanation?
As pointed out by Jeffrey, you are using accessOrder. When you created the LinkedHashMap, the third parameter specifies how the order is changed.
"true for access-order, false for insertion-order"
For a more detailed implementation of an LRU cache, you can look at this:
http://www.programcreek.com/2013/03/leetcode-lru-cache-java/
But you aren't using insertion order, you're using access order.
order of iteration is the order in which its entries were last
accessed, from least-recently accessed to most-recently (access-order)
...
Invoking the put or get method results in an access to the
corresponding entry
So this is the state of your cache as you modify it:
LRUCache<Integer, Integer> cache = LRUCache.newInstance(2);
cache.put(1, 1); // { 1=1 }
cache.put(2, 2); // { 1=1, 2=2 }
cache.put(1, 1); // { 2=2, 1=1 }
cache.put(3, 3); // { 1=1, 3=3 }
Here is my implementation using LinkedHashMap in access order. It moves the most recently accessed element to the most-recently-used end of the order, which incurs only O(1) overhead because the underlying entries are kept in a doubly-linked list while also being indexed by the hash function. So get, put, and reading off the newest entry all cost O(1).
class LRUCache extends LinkedHashMap<Integer, Integer>{
private int maxSize;
public LRUCache(int capacity) {
super(capacity, 0.75f, true);
this.maxSize = capacity;
}
//return -1 if miss
public int get(int key) {
Integer v = super.get(key);
return v == null ? -1 : v;
}
public void put(int key, int value) {
super.put(key, value);
}
@Override
protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
return this.size() > maxSize; //must override it if used in a fixed cache
}
}
Technically, LinkedHashMap has the following constructor, which lets us set access order to true or false. If it is false, it keeps insertion order.
LinkedHashMap(int initialCapacity, float loadFactor, boolean accessOrder)
(Constructs an empty LinkedHashMap instance with the specified initial capacity, load factor and ordering mode.)
Following is a simple implementation of an LRU cache:
class LRUCache {
private LinkedHashMap<Integer, Integer> linkHashMap;
public LRUCache(int capacity) {
linkHashMap = new LinkedHashMap<Integer, Integer>(capacity, 0.75F, true) {
@Override
protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
return size() > capacity;
}
};
}
public void put(int key, int value) {
linkHashMap.put(key, value);
}
public int get(int key) {
return linkHashMap.getOrDefault(key, -1);
}
}
I used the following code and it works.
I have taken the window size to be 4, but any value can be used.
For insertion order:
1: Check if the key is present.
2: If yes, then remove it (by using lhm.remove(key)).
3: Add the new key/value pair (see the sketch after this list).
For access order:
No need to remove keys; put and get do everything automatically.
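A rough sketch of that insertion-order variant (my own illustration, not code from this answer; note that plain get() does not refresh recency in this mode):
import java.util.LinkedHashMap;
import java.util.Map;

class InsertionOrderLruCache<K, V> extends LinkedHashMap<K, V> {
    private final int windowSize;

    InsertionOrderLruCache(int windowSize) {
        super(windowSize, 0.75f, false); // false = insertion order
        this.windowSize = windowSize;
    }

    @Override
    public V put(K key, V value) {
        V old = remove(key);      // steps 1-2: drop the old entry if present
        super.put(key, value);    // step 3: re-insert so the key moves to the end
        return old;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > windowSize; // evict the oldest entry once the window is full
    }
}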
This code is for ACCESS ORDER:
import java.util.LinkedHashMap;
public class LRUCacheDemo {
public static void main(String args[]){
LinkedHashMap<String,String> lhm = new LinkedHashMap<String,String>(4,0.75f,true) {
@Override
protected boolean removeEldestEntry(Map.Entry<String,String> eldest) {
return size() > 4;
}
};
lhm.put("test", "test");
lhm.put("test1", "test1");
lhm.put("1", "abc");
lhm.put("test2", "test2");
lhm.put("1", "abc");
lhm.put("test3", "test3");
lhm.put("test4", "test4");
lhm.put("test3", "test3");
lhm.put("1", "abc");
lhm.put("test1", "test1");
System.out.println(lhm);
}
}
I also implemented an LRU cache with a small change in the code. I have tested it and it works correctly as an LRU cache.
package com.first.misc;
import java.util.LinkedHashMap;
import java.util.Map;
public class LRUCachDemo {
public static void main(String aa[]){
LRUCache<String, String> lruCache = new LRUCache<>(3);
lruCache.cacheable("test", "test");
lruCache.cacheable("test1", "test1");
lruCache.cacheable("test2", "test2");
lruCache.cacheable("test3", "test3");
lruCache.cacheable("test4", "test4");
lruCache.cacheable("test", "test");
System.out.println(lruCache.toString());
}
}
class LRUCache<K, T>{
private Map<K,T> cache;
private int windowSize;
public LRUCache( final int windowSize) {
this.windowSize = windowSize;
this.cache = new LinkedHashMap<K, T>(){
@Override
protected boolean removeEldestEntry(Map.Entry<K, T> eldest) {
return size() > windowSize;
}
};
}
// put data in cache
public void cacheable(K key, T data){
// check whether the key exists; if it does, remove it and re-add it so it counts as recently used
// the LinkedHashMap's removeEldestEntry evicts an entry once the window size is exceeded
if(cache.containsKey(key)){
cache.remove(key);
}
cache.put(key,data);
}
// evict functionality
@Override
public String toString() {
return "LRUCache{" +
"cache=" + cache.toString() +
", windowSize=" + windowSize +
'}';
}
}

HashMap with a .class object as a key

I've created a hashmap with .class objects for keys.
HashMap<Class<? extends MyObject>, Object> mapping = new HashMap<Class<? extends MyObject>, Object>();
This is all well and fine, but I'm getting strange behaviour that I can only attribute to strangeness with the hash function. Randomly during runtime, iterating through the hashmap will not hit every value; it will miss one or two. I think this may be due to the .class object not being final, and therefore changing and mapping to a different hash value. With a different hash value, the hashmap wouldn't be able to correctly correlate the key with the value, making it appear to have lost the value.
Am I correct that this is what is going on? How can I work around this? Is there a better way to accomplish this form of data structure?
Edit: I really thought I was onto something with the hash function thing, but I'll post my real code to try and figure this out. It may be a problem with my implementation of a multimap. I've been using it for quite some time and haven't noticed any issues until recently.
/**
* My own implementation of a map that maps to a List. If the key is not present, then
* the map adds a List with a single entry. Every subsequent addition to the key
* is appended to the List.
* @author
*
* @param <T> Key
* @param <K> Value
*/
public class MultiMap<T, K> implements Map<T, List<K>>, Serializable, Iterable<K> {
/**
*
*/
private static final long serialVersionUID = 5789101682525659411L;
protected HashMap<T, List<K>> set = new HashMap<T, List<K>>();
@Override
public void clear() {
set = new HashMap<T, List<K>>();
}
@Override
public boolean containsKey(Object arg0) {
return set.containsKey(arg0);
}
@Override
public boolean containsValue(Object arg0) {
boolean output = false;
for(Iterator<List<K>> iter = set.values().iterator();iter.hasNext();) {
List<K> searchColl = iter.next();
for(Iterator<K> iter2 = searchColl.iterator(); iter2.hasNext();) {
K value = iter2.next();
if(value == arg0) {
output = true;
break;
}
}
}
return output;
}
@Override
public Set<Entry<T, List<K>>> entrySet() {
Set<Entry<T, List<K>>> output = new HashSet<Entry<T,List<K>>>();
for(Iterator<T> iter1 = set.keySet().iterator(); iter1.hasNext();) {
T key = iter1.next();
for(Iterator<K> iter2 = set.get(key).iterator(); iter2.hasNext();) {
K value = iter2.next();
List<K> input = new ArrayList<K>();
input.add(value);
output.add(new AbstractMap.SimpleEntry<T,List<K>>(key, input));
}
}
return output;
}
@Override
public boolean isEmpty() {
return set.isEmpty();
}
@Override
public Set<T> keySet() {
return set.keySet();
}
@Override
public int size() {
return set.size();
}
@Override
public Collection<List<K>> values() {
Collection<List<K>> values = new ArrayList<List<K>>();
for(Iterator<T> iter1 = set.keySet().iterator(); iter1.hasNext();) {
T key = iter1.next();
values.add(set.get(key));
}
return values;
}
@Override
public List<K> get(Object key) {
return set.get(key);
}
@Override
public List<K> put(T key, List<K> value) {
return set.put(key, value);
}
public void putValue(T key, K value) {
if(set.containsKey(key)) {
set.get(key).add(value);
}
else {
List<K> setval = new ArrayList<K>();
setval.add(value);
set.put(key, setval);
}
}
@Override
public List<K> remove(Object key) {
return set.remove(key);
}
public K removeValue(Object value) {
K valueRemoved = null;
for(T key:this.keySet()) {
for(K val:this.get(key)) {
if(val.equals(value)) {
List<K> temp = this.get(key);
temp.remove(value);
valueRemoved = val;
this.put(key, temp);
}
}
}
return valueRemoved;
}
@Override
public void putAll(Map<? extends T, ? extends List<K>> m) {
for(Iterator<? extends T> iter = m.keySet().iterator(); iter.hasNext();) {
T key = iter.next();
set.put(key, m.get(key));
}
}
@Override
public Iterator<K> iterator() {
return new MultiMapIterator<K>(this);
}
}
Perhaps there is an issue with my iterator? I'll post that code as well.
import java.util.Iterator;
import java.util.List;
import java.util.NoSuchElementException;
public class MultiMapIterator<T> implements Iterator<T> {
private MultiMap <?, T> map;
private Iterator<List<T>> HashIter;
private Iterator<T> govIter;
private T value;
public MultiMapIterator(MultiMap<?, T> map) {
this.map = map;
HashIter = map.values().iterator();
if(HashIter.hasNext()) {
govIter = HashIter.next().iterator();
}
if(govIter.hasNext()) {
value = govIter.next();
}
}
@Override
public boolean hasNext() {
if (govIter.hasNext()) {
return true;
}
else if(HashIter.hasNext()) {
govIter = HashIter.next().iterator();
return this.hasNext();
}
else {
return false;
}
}
@Override
public T next() {
if(!this.hasNext()) {
throw new NoSuchElementException();
}
else {
value = govIter.next();
return value;
}
}
@Override
public void remove() {
map.remove(value);
}
}
Sorry for the long tracts of code. Thank you for spending time helping me with this.
You pull a value out of govIter in the constructor, but never return it.
Your iterator's remove method is completely wrong. You are iterating values, but calling map.remove, which removes by key. You simply want to call govIter.remove() (unless you need to avoid empty lists, in which case it's more complicated).
Your hasNext() method could also have problems depending on whether or not you allow empty List values in your multimap.
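Putting those points together, a hedged sketch of how the constructor and remove() in the MultiMapIterator above could look (assuming the map is non-empty and lists are never empty):
public MultiMapIterator(MultiMap<?, T> map) {
    this.map = map;
    HashIter = map.values().iterator();
    // do not pre-fetch a value here; let next() advance the iterators
    if (HashIter.hasNext()) {
        govIter = HashIter.next().iterator();
    }
}

@Override
public void remove() {
    // remove the last value returned by next() from its list,
    // rather than removing an entire key from the map
    govIter.remove();
}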

how to convert HashMultiset<String> to Map<String,Integer>

Is there some trick to convert a HashMultiset<String> to a Map<String, Integer>, other than iterating over all the entries in the set?
Update: the Integer should represent the count of the String in the multiset.
You can use Maps.asMap. With a lambda expression (Java 8) it is a one-liner:
Maps.asMap(multiset.elementSet(), elem -> multiset.count(elem));
In Java 7 and below:
final Multiset<String> multiset = HashMultiset.create();
Map<String, Integer> freqMap = Maps.asMap(multiset.elementSet(),
new Function<String, Integer>() {
@Override
public Integer apply(String elem) {
return multiset.count(elem);
}
});
Updated to Java 8, here is what I found to be the best answer (based on other answers):
public static <E> Map<E, Integer> convert(Multiset<E> multiset) {
return multiset.entrySet().stream().collect(
Collectors.toMap(x->x.getElement(),x->x.getCount()));
}
or:
public static <E> Map<E, Integer> convert(Multiset<E> multiset) {
return multiset.entrySet().stream().collect(
Collectors.toMap(Entry::getElement,Entry::getCount));
}
With Eclipse Collections you can use the method toMapOfItemToCount on a Bag (aka Multiset), which will return a Map with a key of the same type in the Bag and an Integer count.
Note: I am a committer for Eclipse Collections.
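For illustration, the conversion looks roughly like this (a sketch; the factory and interface names follow recent Eclipse Collections versions and may differ in older ones):
import org.eclipse.collections.api.bag.MutableBag;
import org.eclipse.collections.api.map.MutableMap;
import org.eclipse.collections.impl.factory.Bags;

public class BagCountDemo {
    public static void main(String[] args) {
        MutableBag<String> bag = Bags.mutable.of("a", "a", "a", "b", "c");
        MutableMap<String, Integer> counts = bag.toMapOfItemToCount();
        System.out.println(counts); // {a=3, b=1, c=1} (map order not guaranteed)
    }
}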
You could simply loop through the elements and put each element and its count into a map.
public class MultisetToMap {
public static <E> Map<E, Integer> convert(Multiset<E> multiset) {
Map<E, Integer> map = Maps.newHashMap();
for (E e : multiset.elementSet()) {
map.put(e, multiset.count(e));
}
}
return map;
}
}
Below is the (passing) JUnit test.
@Test
public void testConvert() {
HashMultiset<String> hashMultiset = HashMultiset.create();
hashMultiset.add("a");
hashMultiset.add("a");
hashMultiset.add("a");
hashMultiset.add("b");
hashMultiset.add("c");
Map<String, Integer> map = MultisetToMap.convert(hashMultiset);
assertEquals((Integer) 3, map.get("a"));
assertEquals((Integer) 1, map.get("b"));
assertEquals((Integer) 1, map.get("c"));
}
If you really want to avoid looping through the entries of the Multiset, you can create a view of it as a Map:
public class MultisetMapView<E> implements Map<E, Integer> {
private Multiset<E> delegate;
public MultisetMapView(Multiset<E> delegate) {
this.delegate = delegate;
}
public int size() {
return delegate.size();
}
public boolean isEmpty() {
return delegate.isEmpty();
}
public boolean containsKey(Object key) {
return delegate.contains(key);
}
public boolean containsValue(Object value) {
throw new UnsupportedOperationException();
}
public Integer get(Object key) {
return delegate.count(key);
}
public Integer put(E key, Integer value) {
return delegate.setCount(key, value);
}
public Integer remove(Object key) {
int count = delegate.count(key);
delegate.remove(key);
return count;
}
public void putAll(Map<? extends E, ? extends Integer> m) {
for (Entry<? extends E, ? extends Integer> entry : m.entrySet()) {
delegate.setCount(entry.getKey(), entry.getValue());
}
}
public void clear() {
delegate.clear();
}
public Set<E> keySet() {
return delegate.elementSet();
}
public Collection<Integer> values() {
throw new UnsupportedOperationException();
}
public Set<java.util.Map.Entry<E, Integer>> entrySet() {
Set<java.util.Map.Entry<E, Integer>> entrySet = Sets.newHashSet();
for (E e : delegate.elementSet()) {
entrySet.add(Maps.immutableEntry(e, delegate.count(e)));
}
}
return entrySet;
}
}
In my implementation, I declined to implement the containsValue and values methods, as they are not useful in this context. If desired, they could be implemented by looping through the element set and inspecting the count of each element.
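If you did want them, a hedged sketch of the two methods (drop-in replacements for the unsupported ones above; needs java.util.ArrayList and java.util.List imports), each linear in the number of distinct elements:
public boolean containsValue(Object value) {
    for (E e : delegate.elementSet()) {
        if (Integer.valueOf(delegate.count(e)).equals(value)) {
            return true;
        }
    }
    return false;
}

public Collection<Integer> values() {
    List<Integer> counts = new ArrayList<Integer>();
    for (E e : delegate.elementSet()) {
        counts.add(delegate.count(e));
    }
    return counts;
}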
And again, you can see this working in this JUnit case:
@Test
public void testConvert() {
HashMultiset<String> hashMultiset = HashMultiset.create();
hashMultiset.add("a");
hashMultiset.add("a");
hashMultiset.add("a");
hashMultiset.add("b");
hashMultiset.add("c");
Map<String, Integer> map = new MultisetMapView<String>(hashMultiset);
assertEquals((Integer) 3, map.get("a"));
assertEquals((Integer) 1, map.get("b"));
assertEquals((Integer) 1, map.get("c"));
}
This is possible, but only with reflection and it looks very unsafe.
HashMultiset<String> hashMultiset = HashMultiset.create();
hashMultiset.add("a");
hashMultiset.add("a");
hashMultiset.add("a");
hashMultiset.add("b");
hashMultiset.add("c");
System.out.println(hashMultiset);
Method method = hashMultiset.getClass().getSuperclass().getDeclaredMethod("backingMap");
method.setAccessible(true);
Map<String, Integer> map = (Map<String, Integer>) method.invoke(hashMultiset);
System.out.println(map);
Result:
[b, c, a x 3]
{b=1, c=1, a=3}

How to declare a final HashMap that should not allow elements to be updated or removed

I have a hash map like this:
public static void main(String[] args) {
final Map<String, String> daysMap = new HashMap<>(7);
daysMap.put("1", "Sunday");
daysMap.put("2", "Monday");
daysMap.put("3", "Tuesday");
daysMap.put("4", "Wednesday");
daysMap.put("5", "Thursday");
daysMap.put("6", "Friday");
daysMap.put("7", "Saturday");
}
In this map:
1. It should not allow putting more than 7 elements.
2. It should not allow updating the value for an existing key [like daysMap.put("5", "xxx");].
3. It should not allow removing any key.
How can I do this?
You can extend HashMap:
public class CoolMap<K, V> extends HashMap<K, V> {
@Override
public V put(K key, V value) {
if (size() == 7) {
throw new IllegalStateException("Size is at max!");
} else {
// If there is something already with that key
if (containsKey(key)) {
// do nothing
return value;
} else {
// put inside
return super.put(key, value);
}
}
}
@Override
public void putAll(Map<? extends K, ? extends V> collection) {
if (collection.size() > 7) {
throw new IllegalStateException("Size is at max!");
} else {
super.putAll(collection);
}
}
@Override
public V remove(Object key) {
return null;// doesn't remove anything
}
}
Points 2 and 3 are covered by Collections.unmodifiableMap. To cover the first point, you can add a hand-written test.
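A minimal sketch of that approach (the size check is the hand-written part; the demo class name is mine):
import java.util.Collections;
import java.util.HashMap;
import java.util.Map;

public class DaysMapDemo {
    private static final int MAX_SIZE = 7;

    public static void main(String[] args) {
        Map<String, String> builder = new HashMap<>(MAX_SIZE);
        builder.put("1", "Sunday");
        builder.put("2", "Monday");
        builder.put("3", "Tuesday");
        builder.put("4", "Wednesday");
        builder.put("5", "Thursday");
        builder.put("6", "Friday");
        builder.put("7", "Saturday");
        if (builder.size() > MAX_SIZE) {
            throw new IllegalStateException("Too many entries");
        }
        // covers points 2 and 3: put/remove on this view throw UnsupportedOperationException
        final Map<String, String> daysMap = Collections.unmodifiableMap(builder);
        System.out.println(daysMap.get("1")); // reads are fine
        // daysMap.put("8", "Someday"); // would throw UnsupportedOperationException
    }
}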
As has already been discussed, points 2 and 3 are covered like this:
import java.util.*;
public class OP2 {
public static void main(String[] s) {
//object hash table
Hashtable<String,String> table = new Hashtable<String,String>();
// populate the table
table.put("1", "Sunday");
table.put("2", "Monday");
table.put("3", "Tuesday");
table.put("4", "Wednesday");
table.put("5", "Thursday");
table.put("6", "Friday");
table.put("7", "Saturday");
System.out.println("Initial collection: "+table);
// create unmodifiable map
Map<String, String> m = Collections.unmodifiableMap(table);
// try to modify the collection
// m.put("key3", "value3");
// Uncomment the above line and an UnsupportedOperationException is thrown
}
}
Moreover, for the first problem it would be better to call a method like this when you populate your map:
public boolean putAndTest(MyKey key, MyValue value) {
if (map.size() >= MAX && !map.containsKey(key)) {
return false;
} else {
map.put("Whatever you want");
return true;
}
}
Why don't you create your own object that contains a private hashmap, and then decide which methods of that private hashmap are exposed publicly?
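Roughly what that wrapper might look like (a sketch; the class name is mine, and only read methods are exposed):
import java.util.HashMap;
import java.util.Map;

public final class FixedDaysMap {
    private final Map<String, String> days = new HashMap<>();

    public FixedDaysMap() {
        days.put("1", "Sunday");
        days.put("2", "Monday");
        days.put("3", "Tuesday");
        days.put("4", "Wednesday");
        days.put("5", "Thursday");
        days.put("6", "Friday");
        days.put("7", "Saturday");
    }

    // only read operations are public; there is no way to put or remove entries
    public String get(String key) {
        return days.get(key);
    }

    public boolean containsKey(String key) {
        return days.containsKey(key);
    }

    public int size() {
        return days.size();
    }
}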
