I have the two enums below; one is for the type and another is for the value.
public enum DifferentiatorType {
    @JsonProperty("differentiatorType")
    MARKETPLACE_ID("MarketplaceId");

    private final String value;

    DifferentiatorType(String value) {
        this.value = value;
    }
}

public enum DifferentiatorValue {
    VALUE_ONE("valueOne"),
    VALUE_TWO("valueTwo"),
    VALUE_THREE("valueThree"),
    VALUE_FOUR("valueFour");

    private final String value;

    DifferentiatorValue(String value) {
        this.value = value;
    }
}
Now the situation is, I have a payload which holds both fields, DifferentiatorType and DifferentiatorValue. I want to check that the type belongs to DifferentiatorType, and then check that the value is present in DifferentiatorValue; only then should we proceed further. So basically I want to create a map between the two enums.
PS: In the future a new object/item can be added to DifferentiatorType, and in that case we have to create a new enum for the values accepted by that type.
Here is the sample payload:
{\"referenceId\":\"B01-2776421-8482453\",\"preferenceType\":\"CREDITCARD\",\"differentiatorValue\":\"ABXWY75J\",\"customerId\":\"A37I50ASYHT\",\"differentiatorType\":\"MarketplaceId\"}
Use EnumMap and EnumSet to hold all the relationships.
import java.util.Map;
import java.util.EnumMap;
import java.util.EnumSet;
import java.util.Collection;
import java.util.Collections;
public class DifferentiatorMappings {
private final Map<DifferentiatorType, Collection<?>> values;
public DifferentiatorMappings() {
Map<DifferentiatorType, Collection<?>> map =
new EnumMap<>(DifferentiatorType.class);
map.put(DifferentiatorType.MARKETPLACE_ID,
Collections.unmodifiableCollection(
EnumSet.allOf(DifferentiatorValue.class)));
// Example for a future type: requires adding ANOTHER_ID to DifferentiatorType
// and creating a corresponding AnotherValue enum.
map.put(DifferentiatorType.ANOTHER_ID,
Collections.unmodifiableCollection(
EnumSet.allOf(AnotherValue.class)));
values = Collections.unmodifiableMap(map);
}
public Collection<?> getValuesFor(DifferentiatorType type) {
if (type == null) {
return Collections.emptySet();
}
return values.getOrDefault(type, Collections.emptySet());
}
}
Then you can use the returned values collection to check for a correct value:
DifferentiatorType type = payload.getDifferentiatorType();
Object value = payload.getDifferentiatorValue();
DifferentiatorMappings mappings = new DifferentiatorMappings();
if (mappings.getValuesFor(type).contains(value)) {
// proceed
}
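Note that the payload's differentiatorValue arrives as a raw string (e.g. "ABXWY75J"), so if getDifferentiatorValue() returns a String rather than an enum constant, the contains check above will never match. A minimal sketch of one way to bridge that gap, assuming a hypothetical fromValue helper added inside DifferentiatorValue (requires java.util.Arrays and java.util.Optional):

// Hypothetical helper inside DifferentiatorValue (not part of the original enum):
// resolves a raw payload string against the enum's backing value.
public static Optional<DifferentiatorValue> fromValue(String raw) {
    return Arrays.stream(values())
            .filter(v -> v.value.equalsIgnoreCase(raw))
            .findFirst();
}

// Usage: resolve the raw string first, then check membership.
Optional<DifferentiatorValue> candidate =
        DifferentiatorValue.fromValue(payload.getDifferentiatorValue());
if (candidate.isPresent() && mappings.getValuesFor(type).contains(candidate.get())) {
    // proceed
}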
I have the following Student class
import javax.persistence.Entity;
import javax.persistence.Id;
@Entity
public class Student {
    @Id
    private int studentId;
    private int groupId;
    private String firstName;
    private String lastName;
    // getters and setters
}
What I am trying to do is to get the field value and the field type through reflection and put the results into lists. The field value is retrieved through Field#get and the type is retrieved through Method#getReturnType of the getter, as shown below.
import java.lang.reflect.Field;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;
public class MethodFieldOrder {
public static void main(String[] args) {
Student entity = new Student();
entity.setGroupId(1);
entity.setFirstName("Jackson");
entity.setLastName("Romanovich");
List<String> valuesToInsert = new ArrayList<>();
Field[] field = entity.getClass().getDeclaredFields();
for (Field field1 : field) {
field1.setAccessible(true);
try {
Object value = field1.get(entity);
valuesToInsert.add(value.toString());
} catch (IllegalAccessException e) {
e.printStackTrace();
}
}
valuesToInsert.remove(0);
Method[] methods = entity.getClass().getDeclaredMethods();
List<String> returnTypes = new ArrayList<>();
for (Method method : methods) {
if (method.getName().startsWith("get") || method.getName().startsWith("is")) {
returnTypes.add(String.valueOf(method.getReturnType()));
}
}
returnTypes.remove(0);
System.out.println("Values to insert : " + valuesToInsert);
System.out.println("RETURN TYPES : " + returnTypes);
// Values to insert : [1, Jackson, Romanovich]
// RETURN TYPES : [int, int, class java.lang.String]
// What I want is: [int, class java.lang.String, class java.lang.String]
}
}
As shown in the program, the problem is that the order of fields from entity.getClass().getDeclaredFields() is not consistent with the order of entity.getClass().getDeclaredMethods(). How can I make sure the field value list and the field type list have the same ordering?
As per the docs, getDeclaredFields() and getDeclaredMethods() (and likewise getFields() and getMethods()) both state the following:
The elements in the returned array are not sorted and are not in any particular order.
So, you don't have any guarantee about the order of the Fields or Methods in the returned arrays. If the order is important for you, or you just want a consistent order, you have to sort them yourself. In your example, you use a List to hold field values and method return types; one solution is to use a TreeSet (or to sort explicitly) so that your collections always have the same order, no matter how the Field[] or Method[] arrays you get are ordered.
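As an alternative, here is a minimal sketch that avoids matching getters to fields altogether: sort the declared fields by name once, and read both the value and the type from the same Field, so the two lists are always index-aligned. If some fields (like the id) should be skipped, filter them by name before the loop.

import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

static void collect(Object entity) throws IllegalAccessException {
    Field[] fields = entity.getClass().getDeclaredFields();
    Arrays.sort(fields, Comparator.comparing(Field::getName)); // fixed, repeatable order

    List<String> valuesToInsert = new ArrayList<>();
    List<String> returnTypes = new ArrayList<>();
    for (Field f : fields) {
        f.setAccessible(true);
        valuesToInsert.add(String.valueOf(f.get(entity)));   // value from the Field
        returnTypes.add(String.valueOf(f.getType()));        // type from the same Field
    }
    System.out.println("Values to insert : " + valuesToInsert);
    System.out.println("RETURN TYPES : " + returnTypes);
}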
Is there a mechanism to apply a standard set of checks to detect and then transform a String to the detected type, using one of Jackson's standard text-related libs (csv, json, or even jackson-core)? I can imagine using it along with a label associated with that value (a CSV header, for example) to do something sort of like the following:
JavaTypeAndValue typeAndValue = StringToJavaType.fromValue(Object x, String label);
typeAndValue.type() // FQN of Java type, maybe
typeAndValue.label() // where label might be a column header value, for example
typeAndValue.value() // returns Object of typeAndValue.type()
A set of 'extractors' would be required to apply the transform, and the consumer of the class would have to be aware of the 'ambiguity' of the 'Object' return type, but still capable of consuming and using the information, given its purpose.
The example I'm currently thinking about involves constructing SQL DDL or DML, like a CREATE Table statement using the information from a List derived from evaluating a row from a csv file.
After more digging, hoping to find something out there, I wrote the start of what I had in mind.
Please keep in mind that my intention here isn't to present something 'complete', as I'm sure there are several things missing here, edge cases not addressed, etc.
The parse(List<Map<String, String>> rows, List<String> headers) signature comes from the idea that this could be a sample of rows from a CSV file read in with Jackson, for example.
Again, this isn't complete, so I'm not looking to pick at everything that's wrong with the following. The question isn't 'how would we write this?', it's 'is anyone familiar with something that exists that does something like the following?'.
import gms.labs.cassandra.sandbox.extractors.Extractor;
import gms.labs.cassandra.sandbox.extractors.Extractors;
import lombok.Builder;
import lombok.Getter;
import lombok.Setter;
import lombok.experimental.Accessors;
@Accessors(fluent = true, chain = true)
public class TypeAndValue {

    @Builder
    TypeAndValue(Class<?> type, String rawValue) {
        this.type = type;
        this.rawValue = rawValue;
        label = "NONE";
    }

    @Getter
    final Class<?> type;
    @Getter
    final String rawValue;
    @Setter
    @Getter
    String label;

    public Object value() {
        return Extractors.extractorFor(this).value(rawValue);
    }

    static final String DEFAULT_LABEL = "NONE";
}
A simple parser; the parse method came from a context where I have a List<Map<String,String>> from a CSVReader.
import org.apache.commons.lang3.ObjectUtils;
import org.apache.commons.lang3.math.NumberUtils;
import java.util.*;
import java.util.function.BiFunction;
public class JavaTypeParser
{
public static final List<TypeAndValue> parse(List<Map<String, String>> rows, List<String> headers)
{
    List<TypeAndValue> typesAndVals = new ArrayList<>();
    for (Map<String, String> row : rows) {
        for (String header : headers) {
            String val = row.get(header);
            TypeAndValue typeAndValue =
                    // try isNull, then isBoolean, then isNumber, then fall back to String
                    isNull(val).orElse(
                            isBoolean(val).orElse(
                                    isNumber(val).orElse(
                                            _typeAndValue.apply(String.class, val).get())));
            typesAndVals.add(typeAndValue.label(header));
        }
    }
    return typesAndVals;
}
public static Optional<TypeAndValue> isNumber(String val)
{
if (!NumberUtils.isCreatable(val)) {
return Optional.empty();
} else {
return _typeAndValue.apply(NumberUtils.createNumber(val).getClass(), val);
}
}
public static Optional<TypeAndValue> isBoolean(String val)
{
boolean bool = (val.equalsIgnoreCase("true") || val.equalsIgnoreCase("false"));
if (bool) {
return _typeAndValue.apply(Boolean.class, val);
} else {
return Optional.empty();
}
}
public static Optional<TypeAndValue> isNull(String val){
if(Objects.isNull(val) || val.equals("null")){
return _typeAndValue.apply(ObjectUtils.Null.class,val);
}
else{
return Optional.empty();
}
}
static final BiFunction<Class<?>, String, Optional<TypeAndValue>> _typeAndValue = (type, value) -> Optional.of(
TypeAndValue.builder().type(type).rawValue(value).build());
}
Extractors. Just an example of how the 'extractors' for the values (contained in strings) might be registered somewhere for lookup. They could be referenced any number of other ways, too.
import gms.labs.cassandra.sandbox.TypeAndValue;
import org.apache.commons.lang3.ObjectUtils;
import org.apache.commons.lang3.math.NumberUtils;
import java.math.BigDecimal;
import java.math.BigInteger;
import java.util.Arrays;
import java.util.List;
public class Extractors
{
private static final List<Class> NUMS = Arrays.asList(
BigInteger.class,
BigDecimal.class,
Long.class,
Integer.class,
Double.class,
Float.class);
public static final Extractor<?> extractorFor(TypeAndValue typeAndValue)
{
if (NUMS.contains(typeAndValue.type())) {
return (Extractor<Number>) value -> NumberUtils.createNumber(value);
} else if(typeAndValue.type().equals(Boolean.class)) {
return (Extractor<Boolean>) value -> Boolean.valueOf(value);
} else if(typeAndValue.type().equals(ObjectUtils.Null.class)) {
return (Extractor<ObjectUtils.Null>) value -> null; // should we just return the raw value. some frameworks coerce to null.
} else if(typeAndValue.type().equals(String.class)) {
return (Extractor<String>) value -> typeAndValue.rawValue(); // just return the raw value. some frameworks coerce to null.
}
else{
throw new RuntimeException("unsupported");
}
}
}
I ran this from within the JavaTypeParser class, for reference.
public static void main(String[] args)
{
Optional<TypeAndValue> num = isNumber("-1230980980980980980980980980980988009808989080989809890808098292");
num.ifPresent(typeAndVal -> {
System.out.println(typeAndVal.value());
System.out.println(typeAndVal.value().getClass()); // BigInteger
});
num = isNumber("-123098098097987");
num.ifPresent(typeAndVal -> {
System.out.println(typeAndVal.value());
System.out.println(typeAndVal.value().getClass()); // Long
});
num = isNumber("-123098.098097987"); // Double
num.ifPresent(typeAndVal -> {
System.out.println(typeAndVal.value());
System.out.println(typeAndVal.value().getClass());
});
num = isNumber("-123009809890898.0980979098098908080987"); // BigDecimal
num.ifPresent(typeAndVal -> {
System.out.println(typeAndVal.value());
System.out.println(typeAndVal.value().getClass());
});
Optional<TypeAndValue> bool = isBoolean("FaLse");
bool.ifPresent(typeAndVal -> {
System.out.println(typeAndVal.value());
System.out.println(typeAndVal.value().getClass()); // Boolean
});
Optional<TypeAndValue> nulll = isNull("null");
nulll.ifPresent(typeAndVal -> {
System.out.println(typeAndVal.value());
//System.out.println(typeAndVal.value().getClass()); would throw null pointer exception
System.out.println(typeAndVal.type()); // ObjectUtils.Null (from apache commons lang3)
});
}
I don't know of any library that does this, and I've never seen anything working this way on an open set of possible types.
For a closed set of types (you know all the possible output types), the easier way would be to have the class FQN written in the string (from your description I couldn't tell whether you are in control of the written string).
The complete FQN, or an alias for it.
Otherwise I think there is no way around writing all the checks.
Furthermore, it will be very delicate; I'm already thinking of edge cases.
Suppose you use JSON as the serialization format in the string: how would you differentiate between a String value like Hello World and a Date written in some ISO format (e.g. 2020-09-22)? To do that you would need to introduce some priority into the checks (first check whether it is a date, using some regex or a strict parse; if not, go on to the next check, with the plain String check last).
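For instance, a strict ISO date check could be tried before the plain String fallback; a small sketch using java.time (not part of the original checks):

import java.time.LocalDate;
import java.time.format.DateTimeParseException;
import java.util.Optional;

// Try the stricter check first; only fall back to String if it fails.
static Optional<LocalDate> isIsoDate(String val) {
    try {
        return Optional.of(LocalDate.parse(val)); // ISO-8601, e.g. 2020-09-22
    } catch (DateTimeParseException e) {
        return Optional.empty();
    }
}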
What if you have two objects:
class Person { // class name assumed; the declaration was missing from the original snippet
    String name;
    String surname;
}
class Employee {
    String name;
    String surname;
    Integer salary;
}
And you receive a serialized value of the second type, but with a null salary (null, or the property missing completely).
And how can you tell the difference between a set and a list?
I don't know if what you intend is so dynamic, or whether you already know all the possible deserializable types; maybe some more details in the question would help.
UPDATE
Just saw the code; now it seems clearer.
If you know all the possible outputs, that is the way to go.
The only change I would make is to make it easier to add new types by abstracting the extraction process.
To do this I think a small change should be done, like:
interface Extractor {
Boolean match(String value);
Object extract(String value);
}
Then you can define an extractor per type:
class NumberExtractor implements Extractor {
public Boolean match(String val) {
return NumberUtils.isCreatable(val);
}
public Object extract(String value) {
return NumberUtils.createNumber(value);
}
}
class StringExtractor implements Extractor {
public Boolean match(String s) {
return true; //<-- catch all
}
public Object extract(String value) {
return value;
}
}
And then register the extractors and automate the checks:
public class JavaTypeParser {

    private static final List<Extractor> EXTRACTORS = List.of(
            new NullExtractor(),
            new BooleanExtractor(),
            new NumberExtractor(),
            new StringExtractor()
    );

    public static List<TypeAndValue> parse(List<Map<String, String>> rows, List<String> headers) {
        List<TypeAndValue> typesAndVals = new ArrayList<>();
        for (Map<String, String> row : rows) {
            for (String header : headers) {
                String val = row.get(header);
                typesAndVals.add(extract(header, val));
            }
        }
        return typesAndVals;
    }

    public static TypeAndValue extract(String header, String value) {
        for (Extractor e : EXTRACTORS) {
            if (e.match(value)) {
                Object v = e.extract(value);
                return TypeAndValue.builder()
                        .label(header)
                        .value(v) // <-- you can put the real value here and remove the type field
                        .build();
            }
        }
        throw new IllegalStateException("Can't find an extractor for: " + header + " | " + value);
    }
}
To parse CSV I would suggest https://commons.apache.org/proper/commons-csv, as CSV parsing can run into nasty issues.
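For reference, a minimal commons-csv read that produces the List<Map<String, String>> shape consumed by parse above might look like this (a sketch; the file path is a placeholder):

import org.apache.commons.csv.CSVFormat;
import org.apache.commons.csv.CSVRecord;
import java.io.IOException;
import java.io.Reader;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

static List<Map<String, String>> readCsv(String path) throws IOException {
    List<Map<String, String>> rows = new ArrayList<>();
    try (Reader reader = Files.newBufferedReader(Paths.get(path))) {
        for (CSVRecord record : CSVFormat.DEFAULT.withFirstRecordAsHeader().parse(reader)) {
            rows.add(record.toMap()); // header -> raw cell value
        }
    }
    return rows;
}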
What you are actually trying to do is write a parser: you translate a fragment into a parse tree. The parse tree captures the type as well as the value. For hierarchical types like arrays and objects, each tree node contains child nodes.
One of the most commonly used parser generators (albeit a bit overkill for your use case) is ANTLR. ANTLR brings out-of-the-box support for JSON.
I recommend taking the time to digest all the concepts involved. Even though it might seem overkill initially, it quickly pays off when you do any kind of extension. Changing a grammar is relatively easy; the generated code is quite complex. Additionally, all parser generators verify your grammar and surface logic errors.
Of course, if you are limiting yourself to just parsing CSV or JSON (and not both at the same time), you should rather use the parser of an existing library. For example, Jackson has ObjectMapper.readTree to get the parse tree. You could also use ObjectMapper.readValue(<fragment>, Object.class) to simply get the canonical Java classes.
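For example, binding to Object.class lets Jackson choose the canonical Java types on its own (a small sketch):

import com.fasterxml.jackson.databind.ObjectMapper;

public class ReadValueDemo {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        Object parsed = mapper.readValue("{\"a\": 1, \"flag\": true, \"items\": [\"x\", 2.5]}", Object.class);
        // JSON objects become LinkedHashMap, arrays become ArrayList,
        // and scalars become Integer/Long/Double/Boolean/String as appropriate.
        System.out.println(parsed.getClass()); // class java.util.LinkedHashMap
    }
}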
Try this:
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.Iterator;
import java.util.Map;

String j = ...; // json string
JsonFactory jsonFactory = new JsonFactory();
ObjectMapper jsonMapper = new ObjectMapper(jsonFactory);
JsonNode jsonRootNode = jsonMapper.readTree(j);
Iterator<Map.Entry<String, JsonNode>> jsonIterator = jsonRootNode.fields();
while (jsonIterator.hasNext()) {
    Map.Entry<String, JsonNode> jsonField = jsonIterator.next();
    String k = jsonField.getKey();
    String v = jsonField.getValue().toString();
    ...
}
I'm having a tough time coming up with a class that represents the following JSON. "familyRelationships" is an array of arrays. Each array has person1 and person2 identifiers and the relationship between person1 and person2. As you might've already guessed, the order of the values in an array is very important. For example, if "12345" and "31142" switched positions, it would mean "31142" is the PARENT of "12345", which is totally wrong.
{
"familyRelationships": [
[
"12345",
"SPOUSE",
"67890",
{
"careTaker": false,
"liveTogether": true
}
],
[
"12345",
"PARENT",
"31142",
{
"careTaker": true,
"liveTogether": true
}
],
[
"67890",
"PARENT",
"31142",
{
"careTaker": true,
"liveTogether": true
}
]
]
}
This should be doable using a custom deserializer.
A relationship, in my opinion, should be modeled as a proper Java class with proper names. Note that the constructor takes a JsonNode as an argument and that I have left out any getters and setters:
public class Relationship {
private final String id1;
private final String id2;
private final Relation relation;
private final boolean careTaker;
private final boolean liveTogether;
public Relationship(JsonNode base) {
this.id1 = base.get(0).asText();
this.id2 = base.get(2).asText();
this.relation = Relation.valueOf(base.get(1).asText());
this.careTaker = base.get(3).get("careTaker").asBoolean();
this.liveTogether = base.get(3).get("liveTogether").asBoolean();
}
public enum Relation {
PARENT,
SPOUSE;
}
}
We also need a class which stores the collection. This is the one that you would deserialize the top level object into (again leaving out getters and setters):
@JsonDeserialize(using = FamillyRelationshipsDeserializer.class)
public class FamillyRelationships {
public List<Relationship> familyRelationships = new ArrayList<>();
}
Finally we need to implement the actual JsonDeserializer referenced in the above class. It should look something like the following. I used this question as a reference:
class FamillyRelationshipsDeserializer extends JsonDeserializer<FamillyRelationships> {

    @Override
    public FamillyRelationships deserialize(JsonParser jp, DeserializationContext ctxt)
            throws IOException, JsonProcessingException {
        FamillyRelationships relationships = new FamillyRelationships();
        JsonNode node = jp.readValueAsTree();
        JsonNode rels = node.get("familyRelationships");

        for (int i = 0; i < rels.size(); i++) {
            relationships.familyRelationships.add(new Relationship(rels.get(i)));
        }

        return relationships;
    }
}
I hope this helps, I haven't actually tested any of this, it probably will not even compile, but the principles should be right. I have also assumed that the JSON is of the format you supplied. If that's not a guarantee then you will need to make the proper checks and deal with any deviations.
If you also want to be able to serialize everything then you will need to implement JsonSerializer see this question for more details.
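For completeness, a rough sketch of what that serializer could look like, assuming Relationship exposes getters for the fields shown above (they were omitted from the class) and that FamillyRelationships is registered with @JsonSerialize(using = FamillyRelationshipsSerializer.class):

import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.SerializerProvider;
import java.io.IOException;

class FamillyRelationshipsSerializer extends JsonSerializer<FamillyRelationships> {

    @Override
    public void serialize(FamillyRelationships value, JsonGenerator gen, SerializerProvider serializers)
            throws IOException {
        gen.writeStartObject();
        gen.writeArrayFieldStart("familyRelationships");
        for (Relationship r : value.familyRelationships) {
            gen.writeStartArray();
            gen.writeString(r.getId1());               // person1 (getters assumed)
            gen.writeString(r.getRelation().name());   // relationship
            gen.writeString(r.getId2());               // person2
            gen.writeStartObject();
            gen.writeBooleanField("careTaker", r.isCareTaker());
            gen.writeBooleanField("liveTogether", r.isLiveTogether());
            gen.writeEndObject();
            gen.writeEndArray();
        }
        gen.writeEndArray();
        gen.writeEndObject();
    }
}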
That JSON isn't going to nicely become a POJO because the types contained within the arrays are not consistent:
e.g.
[
"12345",
"SPOUSE",
"67890",
{
"careTaker": false,
"liveTogether": true
}
]
is String,String,String,Object. The only way that works is if you have a Object[] or List<Object>. So familyRelationships should actually be a List<List<Object>>. The end result is going to require a bunch of casting (and probably some checks to make sure that the item at a given index is the class you expect), but it will work.
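A rough sketch of that approach with Jackson, assuming json holds the document shown above (the generic shape is carried via TypeReference, and the trailing object in each array comes back as a Map):

import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;
import java.util.Map;

static void readRelationships(String json) throws Exception {
    ObjectMapper mapper = new ObjectMapper();
    Map<String, List<List<Object>>> root =
            mapper.readValue(json, new TypeReference<Map<String, List<List<Object>>>>() {});

    for (List<Object> entry : root.get("familyRelationships")) {
        String person1 = (String) entry.get(0);
        String relation = (String) entry.get(1);
        String person2 = (String) entry.get(2);
        Object extras = entry.get(3);
        if (extras instanceof Map) { // check before casting
            Boolean careTaker = (Boolean) ((Map<?, ?>) extras).get("careTaker");
            Boolean liveTogether = (Boolean) ((Map<?, ?>) extras).get("liveTogether");
        }
    }
}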
There are some tools online to do that for you
package com.example;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import com.fasterxml.jackson.annotation.JsonAnyGetter;
import com.fasterxml.jackson.annotation.JsonAnySetter;
import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonPropertyOrder;
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonPropertyOrder({
        "familyRelationships"
})
public class Example {

    @JsonProperty("familyRelationships")
    private List<List<String>> familyRelationships = null;
    @JsonIgnore
    private Map<String, Object> additionalProperties = new HashMap<String, Object>();

    @JsonProperty("familyRelationships")
    public List<List<String>> getFamilyRelationships() {
        return familyRelationships;
    }

    @JsonProperty("familyRelationships")
    public void setFamilyRelationships(List<List<String>> familyRelationships) {
        this.familyRelationships = familyRelationships;
    }

    @JsonAnyGetter
    public Map<String, Object> getAdditionalProperties() {
        return this.additionalProperties;
    }

    @JsonAnySetter
    public void setAdditionalProperty(String name, Object value) {
        this.additionalProperties.put(name, value);
    }
}
This is only one possibility, generated with http://www.jsonschema2pojo.org/
I have a class with various properties and I would like to write a wrapper method around them in order to loop over them more easily.
Some properties return a collection of values, some a single value, and I'm looking for the best approach for this.
My first approach is to let the wrapper method return whatever the property getters return.
import java.util.Arrays;
import java.util.Collection;
import java.util.List;
import java.util.stream.Stream;

public class Test {

    public Object getValue(String propName) {
        if ("attr1".equals(propName)) return getAttribute1();
        else if ("attr2".equals(propName)) return getAttribute2();
        else return null;
    }

    public List<String> getAttribute1() {
        return Arrays.asList("Hello", "World");
    }

    public String getAttribute2() {
        return "Goodbye";
    }

    public static void main(String[] args) {
        final Test test = new Test();
        Stream.of("attr1", "attr2")
                .forEach(p -> {
                    Object o = test.getValue(p);
                    if (o instanceof Collection) {
                        ((Collection<?>) o).forEach(v -> System.out.println(v));
                    } else {
                        System.out.println(o);
                    }
                });
    }
}
The bad point of this approach is that the caller has to test whether the result is a collection or not.
Another approach, seamless for the caller, is to always return a collection, i.e. the wrapper method wraps single values into a Collection. Here a HashSet, but we could imagine an ad hoc, minimum one-element list.
import java.util.Arrays;
import java.util.Collection;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.stream.Stream;

public class TestAlt {

    public Collection getValue(String propName) {
        if ("attr1".equals(propName)) {
            return getAttribute1();
        } else if ("attr2".equals(propName)) {
            Set s = new HashSet();
            s.add(getAttribute2());
            return s;
        } else {
            return null;
        }
    }

    public List<String> getAttribute1() {
        return Arrays.asList("Hello", "World");
    }

    public String getAttribute2() {
        return "Goodbye";
    }

    public static void main(String[] args) {
        final TestAlt test = new TestAlt();
        Stream.of("attr1", "attr2")
                .forEach(p -> test.getValue(p).forEach(v -> System.out.println(v)));
    }
}
Performance-wise, design-wise, ... what's your opinion on these approaches? Do you have better ideas?
Well, you could pass the action to be performed on each attribute to the object and let the object decide on how to handle it. E.g.:
in Class Test:
public void forEachAttribute(String propName, Handler h) {
    if ("attr1".equals(propName)) {
        getAttribute1().forEach(o -> h.handle(o)); // collection attribute: handle each element
    } else if ("attr2".equals(propName)) {
        h.handle(getAttribute2());                 // single-valued attribute: handle directly
    }
}
And a class Handler with a function handle(String s) that does what you want to do.
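A minimal sketch of that Handler (names are illustrative, not from the original code); since it has a single abstract method, the caller can pass a lambda or method reference:

// Illustrative callback interface supplied by the caller.
interface Handler {
    void handle(String s);
}

// Usage: the caller no longer needs to know whether the attribute is a single value or a collection.
Test test = new Test();
test.forEachAttribute("attr1", System.out::println);
test.forEachAttribute("attr2", System.out::println);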
If you cannot edit Test, you can also move the function outside Test
public void forEachTestAttribute(Test t, String propName, Handler h)...
Performance-wise: This removes an if-clause
Design-wise: This removes a cast, but creates more classes.
Edit: It also maintains type safety, and if there are multiple kinds of attributes (String, int, etc.) you could add more handle functions to keep that type safety.
Regarding the design I would rewrite your code into this:
TestAlt.java
import java.util.*;
import java.util.stream.Stream;
public class TestAlt {
private Map<String, AttributeProcessor> map = AttributeMapFactory.createMap();
public Collection getValue(String propName) {
return Optional
.ofNullable(map.get(propName))
.map(AttributeProcessor::getAttribute)
.orElse(Arrays.asList("default")); //to avoid unexpected NPE's
}
public static void main(String[] args) {
final TestAlt test = new TestAlt();
Stream.of("attr1", "attr2")
.forEach(p -> test.getValue(p).forEach(v -> System.out.println(v)));
}
}
AttributeMapFactory.java
import java.util.HashMap;
import java.util.Map;
public class AttributeMapFactory {
public static Map<String, AttributeProcessor> createMap() {
Map<String, AttributeProcessor> map = new HashMap<>();
map.put("attr1", new HiAttributeProcessor());
map.put("attr2", new ByeAttributeProcessor());
return map;
}
}
AttributeProcessor.java
import java.util.Collection;
public interface AttributeProcessor {
Collection<String> getAttribute();
}
HiAttributeProcessor.java
import java.util.Arrays;
import java.util.Collection;
public class HiAttributeProcessor implements AttributeProcessor{
#Override
public Collection<String> getAttribute() {
return Arrays.asList("Hello", "World");
}
}
ByeAttributeProcessor.java
import java.util.Arrays;
import java.util.Collection;
public class ByeAttributeProcessor implements AttributeProcessor{
#Override
public Collection<String> getAttribute() {
return Arrays.asList("Goodbye");
}
}
The main point is that you get rid of the if-else statements by using a map and dynamic dispatch.
The main advantage of this approach is that your code becomes more flexible to further changes. In the case of this small program it does not really matter and is overkill. But if we are talking about a large enterprise application, then yes, it becomes crucial.
I have to use a map which stores keys of type Integer, String and Long only.
One solution: store Object and check with the instanceof operator in the put method. Is there any better solution, maybe with an enum?
You can use a map and store Long keys as Strings in it,
or you can use two different HashMaps and duplicate the put/get methods. If you have two types, it is probably for two different things, and having two different maps should probably be the correct answer.
Create a class that has a map as a member and add methods that will store and retrieve int and long as Strings.
import java.util.HashMap;
import java.util.Map;

class MyMap {
    private final Map<String, Object> mapObject = new HashMap<>();

    public void add(long key, Object value) {
        mapObject.put(Long.toString(key), value);
    }

    public void add(String key, Object value) {
        mapObject.put(key, value);
    }

    public Object get(long key) {
        return mapObject.get(Long.toString(key));
    }

    public Object get(String key) {
        return mapObject.get(key);
    }
}
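Usage would then look roughly like this (a sketch against the class above):

MyMap map = new MyMap();
map.add(42L, "some value");        // long key, stored internally as "42"
map.add("name", "another value");  // String key, stored as-is

Object byLong = map.get(42L);      // "some value"
Object byString = map.get("name"); // "another value"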
I agree with Paul Boddington's comment, and the need for such a trick shows that the code smells.
Just as a fun exercise (not for production code), I've made an example that shows what we can do at compile time to limit the types of keys in a map.
For example, we can create a wrapper allowing only values of specific classes.
common/map/Wrap.java
package common.map;
import java.util.Arrays;
import java.util.List;
public class Wrap<T> {
private T value;
private Wrap(T value){
this.value = value;
}
public T get() {
return this.value;
}
/*
 * It's important to implement this method
 * if we intend to use Wrap instances as map keys.
 *
 * Note that hash codes are computed differently by different classes,
 * and depending on the `allowedClasses` contents we can face unexpected collisions,
 * so if you care about performance, test your map usage carefully.
 */
public int hashCode() {
return this.value.hashCode();
}
/*
* static
*/
private static List<Class> allowedClasses = Arrays.asList(Long.class, String.class);
public static <T> Wrap<T> create(Class<? extends T> clazz, T value) {
if (!allowedClasses.contains(clazz)) {
throw new IllegalArgumentException("Unexpected class " + clazz);
}
return new Wrap<>(value);
}
public static <T> Wrap<T> create(AllowedClasses allowedClass, T value) {
return create(allowedClass.clazz, value);
}
public enum AllowedClasses {
LONG(Long.class),
STRING(String.class);
private Class clazz;
AllowedClasses(Class clazz) {
this.clazz = clazz;
}
}
}
And let's run it
common/map/Example.java
package common.map;
import common.map.Wrap.AllowedClasses;
import java.util.HashMap;
import java.util.Map;
public class Example {
public static void main(String... args) {
Map<Wrap, Object> map = new HashMap<>();
// next two lines create wrappers for values of types we added to enum AllowedClasses
// but since enums cannot have type parameters, we are not able to check
// if the second parameter type is compatible with a type associated with given enum value
// so I think usage of enum is useless for your purpose
Wrap<?> valLong0 = Wrap.create(AllowedClasses.LONG, "the string in place of Long is OK");
Wrap<?> valString0 = Wrap.create(AllowedClasses.STRING, 12345);
// from the next lines you can see how we can use the Wrap class to keep
// only allowed types to be associated with the map keys
Wrap<Long> valLong = Wrap.create(Long.class, 1L); // legal
Wrap<String> valString = Wrap.create(String.class, "abc"); // legal
Wrap<String> valWrong = Wrap.create(String.class, 123); // doesn't compile
Wrap<Object> valWrong2 = Wrap.create(Object.class, 123); // compiles but throws exception in runtime
Object obj = ThirdParty.getObjectOfUnknownClass();
Wrap<?> valDynamic = Wrap.create(obj.getClass(), obj); // compiles but MAYBE throws exception in runtime
// so we get to this point only if all the wrappers are legal,
// and we can add them as keys to the map
map.put(valLong, new Object());
map.put(valString, new Object());
map.put(valDynamic, new Object());
}
}
HashMap<DataType1, DataType2> hm = new HashMap<DataType1, DataType2>();
or
Map<DataType1,DataType2> m = new HashMap<DataType1,DataType2>();
m.put(key, value);
Instead of DataType1 and DataType2 you can use Integer, String, Long, etc., and use the put(key, value) method to enter keys and values into the HashMap.