I have this snippet of code:
public abstract class Repository<Entity extends BaseObject> {
...
public void readFromJson(){
String content = "JSON content here";
Gson gson = new Gson();
Type entityType = new TypeToken<JSONObject<Entity>>(){}.getType();
jsonObject = gson.fromJson(content, entityType);
for (Entity ent : jsonObject.getEntities()) ;
}
}
When I try to run the for-each loop, my entities are no longer of type Entity but LinkedHashMap, and I get this exception: java.lang.ClassCastException: java.util.LinkedHashMap cannot be cast to com.tranca.bookstore.domain.shared.BaseObject
Here is the JSONObject class (created by me):
public class JSONObject<Entity> {
private List<Entity> entities = new ArrayList<Entity>();
private long lastId = -1;
public List<Entity> getEntities() {
return entities;
}
public void setEntities(List<Entity> entities) {
this.entities = entities;
}
public long getLastId() {
return lastId;
}
public void setLastId(long lastId) {
this.lastId = lastId;
}
public void incrementLastId() {
this.lastId++;
}
}
Maybe the base object is relevant, so I will put the code here:
public abstract class BaseObject implements Serializable {
protected long id = (long) -1;
protected int version = 0;
protected BaseObject(){}
public long getId() {
return id;
}
public void setId(long id) {
this.id = id;
}
public int getVersion() {
return version;
}
public void setVersion(int version) {
this.version = version;
}
}
I had the same / a similar problem. To give a clearer answer in a slightly different context:
I had the following method, which produced the error "com.google.gson.internal.LinkedTreeMap cannot be cast to MyType":
/**
* Reads a LinkedHashMap from the specified parcel.
*
* @param <TKey>
* The type of the key.
* @param <TValue>
* The type of the value.
* @param in
* The in parcel.
* @return Returns an instance of linked hash map or null.
*/
public static <TKey, TValue> LinkedHashMap<TKey, TValue> readLinkedHashMap(Parcel in) {
Gson gson = JsonHelper.getGsonInstance();
String content = in.readString();
LinkedHashMap<TKey, TValue> result = gson.fromJson(content, new TypeToken<LinkedHashMap<TKey, TValue>>(){}.getType());
return result;
}
I wanted an easy generic way to read/write a linked hash map. The above solution does not work because the type information of the TypeToken with TKey and TValue is lost after compilation: due to type erasure, inside the generic method TKey and TValue are just type variables, so Gson cannot resolve them and falls back to its default map representation. If you change your code to the following example, it works, because the caller now explicitly defines the type token with concrete type arguments, and the anonymous TypeToken subclass records those arguments in its class metadata, where they can be read back at runtime.
/**
* Reads a LinkedHashMap from the specified parcel.
*
* @param <TKey>
* The type of the key.
* @param <TValue>
* The type of the value.
* @param in
* The in parcel.
* @return Returns an instance of linked hash map or null.
*/
public static <TKey, TValue> LinkedHashMap<TKey, TValue> readLinkedHashMap(Parcel in, TypeToken<LinkedHashMap<TKey, TValue>> typeToken) {
Gson gson = JsonHelper.getGsonInstance();
Type type = typeToken.getType();
String content = in.readString();
LinkedHashMap<TKey, TValue> result = gson.fromJson(content, type);
return result;
}
And now you would call the above function like:
readLinkedHashMap(in, new TypeToken<LinkedHashMap<UUID, MyObject>>(){});
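Why the caller-side token works can be demonstrated with plain reflection, independently of Gson: an anonymous subclass records its generic superclass in class metadata, so concrete type arguments survive erasure, while the type variables of an enclosing generic method do not. A minimal sketch (SimpleToken is a hypothetical stand-in, not Gson's actual TypeToken):

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;
import java.util.LinkedHashMap;
import java.util.UUID;

public class ErasureDemo {

    // Hypothetical stand-in for Gson's TypeToken: the anonymous subclass
    // bakes its type argument into the class metadata.
    public abstract static class SimpleToken<T> {
        public Type captured() {
            return ((ParameterizedType) getClass().getGenericSuperclass())
                    .getActualTypeArguments()[0];
        }
    }

    // Inside a generic method TKey/TValue are erased: the recorded type
    // argument still contains unresolved type VARIABLES, which is why Gson
    // falls back to its generic map type for the values.
    public static <TKey, TValue> Type tokenInsideGenericMethod() {
        return new SimpleToken<LinkedHashMap<TKey, TValue>>() {}.captured();
    }

    public static void main(String[] args) {
        // prints a type still containing the variables TKey and TValue
        System.out.println(tokenInsideGenericMethod());
        // prints a type with the concrete arguments UUID and String
        System.out.println(new SimpleToken<LinkedHashMap<UUID, String>>() {}.captured());
    }
}
```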
A sidenote 1: When writing the linked hash map, you do not need to specify any type token at all; toJson(map) is sufficient.
A sidenote 2 (a problem which I had): By default Gson uses toString() to serialize the key. If you register a type adapter for the key type, which may be a more complex type, then this type adapter is not applied when serializing, but it is when deserializing. This leads to an inconsistent and therefore failing process. The following option activates complex map key serialization:
gsonBuilder.enableComplexMapKeySerialization()
Finally got it!
The problem was that:
new TypeToken<JSONObject<Entity>>(){}.getType();
returns the type of JSONObject<T>, not the specific entity of the subclass that extends Repository (e.g. UserRepository extends Repository<User>).
The trick was to create an abstract method to force the subclasses to set the Type for deserialization.
In conclusion, if you get this error, be sure you have the right class type (if you use subclasses, be sure the token returns the type of your subclass, not the superclass).
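A sketch of that abstract-method trick (hypothetical names; with Gson, the subclass would return new TypeToken<JSONObject<User>>(){}.getType() so deserialization gets the concrete type):

```java
import java.lang.reflect.ParameterizedType;
import java.lang.reflect.Type;

public class RepoTypeDemo {
    static class User {}

    // The base class forces every concrete repository to supply
    // the concrete type used for deserialization.
    public abstract static class Repository<E> {
        protected abstract Type entityType();
    }

    public static class UserRepository extends Repository<User> {
        @Override
        protected Type entityType() {
            // With Gson this would be: new TypeToken<JSONObject<User>>(){}.getType();
            // here we read Repository<User> from the generic superclass instead.
            return ((ParameterizedType) getClass().getGenericSuperclass())
                    .getActualTypeArguments()[0];
        }
    }

    public static void main(String[] args) {
        // prints the concrete User type, not a type variable
        System.out.println(new UserRepository().entityType());
    }
}
```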
We are putting all our data points for one row into a hashmap. We don't want to use a pojo because the values are a different set each time. For example, we might get "place" on some records and we might get "hometown" on others. Actually we have thousands of different column names to choose from. Our code looks like this:
Map<String, Object> aMap = new HashMap<>();
aMap.put("id", Integer.valueOf(1));
aMap.put("age", Integer.valueOf(45));
aMap.put("name", "mark");
aMap.put("place", "home");
final GenericRecord record = new GenericData.Record(avroSchema);
aMap.forEach((k, v) -> {
record.put(k, v);
});
writer.write(record);
We would like to put all the values in a map and then generate a schema. Since it can be done for a POJO using the reflect API, I was wondering if it could be done from a HashMap as well.
As a side question, is there any way to eliminate the forEach above and just write the map?
Here is what we came up with. We also had nested columns.
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import org.apache.avro.Schema;
import org.apache.avro.Schema.Parser;
import org.apache.avro.Schema.Type;
/**
* This does NOT do all types. So far just the types we think we need. See
* https://docs.oracle.com/database/nosql-12.1.3.0/GettingStartedGuide/avroschemas.html
* <p>
* We need some error handling here, and when we don't have the correct type, call it out!
* <p>
* This runs in 1-2ms even with a large payload.
*/
public class AvroSchemaBuilder {
/**
* Private constructor; all methods are static.
*/
private AvroSchemaBuilder() {
}
/**
* Build the Avro schema and return it.
*
* @param name Name of object.
* @param nameTypeConsumer The nameTypeConsumer of objects being saved.
* @return the Avro schema.
*/
public static Schema getAvroSchema(String name, NameTypeConsumer nameTypeConsumer) {
String json = Lson.toJson(getAvroSchemaAsMap(name, nameTypeConsumer, true));
Parser parser = new Parser().setValidate(true);
return parser.parse(json);
}
/**
* Returns the map with all the attributes to build a schema. This would be recursive if we need
* to build a complex schema. For example for Trends this would build a complex schema where some
* of the types are maps that are themselves described as another nested schema.
*/
private static Map<String, Object> getAvroSchemaAsMap(String name,
NameTypeConsumer nameTypeConsumer,
boolean addNameSpace) {
Map<String, Object> schemaMap = new LinkedHashMap<>();
schemaMap.put("type", "record");
schemaMap.put("name", name);
if (addNameSpace) {
schemaMap.put("namespace", "com.blah.blah");
}
List<Field> fields = new ArrayList<>();
nameTypeConsumer.consumeNestedNameType((columnName, nestedNameType) -> {
Object avroType;
if (nestedNameType.getNameTypeConsumer() != null) {
avroType = getAvroSchemaAsMap(columnName, nestedNameType.getNameTypeConsumer(), false);
} else {
avroType = getAvroType(nestedNameType.getType()).getName();
}
Object[] types = {"null", avroType}; //adding null first always.
fields.add(new Field(columnName, types));
});
schemaMap.put("fields", fields);
return schemaMap;
}
/**
* Finds the Avro type for a Java class.
*
* @param type the Java class of the value.
* @return the Avro type constant.
*/
private static Type getAvroType(Class<?> type) {
if (type.equals(Integer.class)) {
return Type.INT;
}
if (type.equals(Long.class)) {
return Type.LONG;
}
if (type.equals(Float.class)) {
return Type.FLOAT;
}
if (type.equals(Double.class)) {
return Type.DOUBLE;
}
if (type.equals(String.class)) {
return Type.STRING;
}
if (type.equals(Boolean.class)) {
return Type.BOOLEAN;
}
throw new GenericRuntimeException("Cannot get Avro type for type " + type.getName());
}
/**
* Nested class to make our field.
*/
private static class Field {
public final String name;
public final Object[] type;
public Field(String name, Object[] type) {
this.name = name;
this.type = type;
}
}
}
I have to use a map which stores keys of type Integer, String and Long only.
One solution: store keys of type Object and check them with the instanceof operator in the put method. Is there any better solution, maybe with an enum?
You can use a map and store Long keys as Strings in it,
or you can use two different hashmaps and duplicate the put/get methods. If you have two key types, they are probably for two different things, and having two different maps is probably the correct answer.
Create a class that has a map as a member and add overloaded methods that store and retrieve long keys as Strings.
class MyMap {
private Map<String, Object> mapObject = new HashMap<>();
public void add(long key, Object value) {
mapObject.put(Long.toString(key),value);
}
public void add(String key, Object value) {
mapObject.put(key, value);
}
public Object get(long key) {
return mapObject.get(Long.toString(key));
}
public Object get(String key) {
return mapObject.get(key);
}
}
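Usage then looks like the condensed sketch below; note the caveat that a long key and its decimal-string form end up in the same slot:

```java
import java.util.HashMap;
import java.util.Map;

public class MyMapDemo {
    // Condensed version of the wrapper class above.
    public static class MyMap {
        private final Map<String, Object> mapObject = new HashMap<>();
        public void add(long key, Object value) { mapObject.put(Long.toString(key), value); }
        public void add(String key, Object value) { mapObject.put(key, value); }
        public Object get(long key) { return mapObject.get(Long.toString(key)); }
        public Object get(String key) { return mapObject.get(key); }
    }

    public static void main(String[] args) {
        MyMap m = new MyMap();
        m.add(42L, "by long");
        m.add("name", "by string");
        System.out.println(m.get(42L));    // by long
        System.out.println(m.get("name")); // by string
        // Caveat: a long key and its decimal string share one slot,
        // so m.add("42", x) would overwrite the entry stored under 42L.
    }
}
```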
I agree with Paul Boddington's comment; needing such a trick is a sign of a code smell.
Just as a fun exercise (not for production code), I've made an example that shows what we can do at compile time to limit the types of keys in a map.
For example, we can create a wrapper allowing only values of specific classes.
common/map/Wrap.java
package common.map;
import java.util.Arrays;
import java.util.List;
public class Wrap<T> {
private T value;
private Wrap(T value){
this.value = value;
}
public T get() {
return this.value;
}
/*
* It's important to implement this method
* if we intend to use Wrap instances as a map's key.
*
* Note that hash codes are computed differently in different classes,
* and depending on the `allowedClasses` contents we can face unexpected
* collisions, so if you care about performance, test your map usage carefully.
*/
public int hashCode() {
return this.value.hashCode();
}
/*
* static
*/
private static List<Class> allowedClasses = Arrays.asList(Long.class, String.class);
public static <T> Wrap<T> create(Class<? extends T> clazz, T value) {
if (!allowedClasses.contains(clazz)) {
throw new IllegalArgumentException("Unexpected class " + clazz);
}
return new Wrap<>(value);
}
public static <T> Wrap<T> create(AllowedClasses allowedClass, T value) {
return create(allowedClass.clazz, value);
}
public enum AllowedClasses {
LONG(Long.class),
STRING(String.class);
private Class clazz;
AllowedClasses(Class clazz) {
this.clazz = clazz;
}
}
}
And let's run it
common/map/Example.java
package common.map;
import common.map.Wrap.AllowedClasses;
import java.util.HashMap;
import java.util.Map;
public class Example {
public static void main(String... args) {
Map<Wrap, Object> map = new HashMap<>();
// next two lines create wrappers for values of types we added to enum AllowedClasses
// but since enums cannot have type parameters, we are not able to check
// whether the second parameter's type is compatible with the type associated
// with the given enum value, so I think an enum is useless for your purpose
Wrap<?> valLong0 = Wrap.create(AllowedClasses.LONG, "the string in place of Long is OK");
Wrap<?> valString0 = Wrap.create(AllowedClasses.STRING, 12345);
// from the next lines you can see how we can use the Wrap class to keep
// only allowed types to be associated with the map keys
Wrap<Long> valLong = Wrap.create(Long.class, 1L); // legal
Wrap<String> valString = Wrap.create(String.class, "abc"); // legal
Wrap<String> valWrong = Wrap.create(String.class, 123); // doesn't compile
Wrap<Object> valWrong2 = Wrap.create(Object.class, 123); // compiles but throws exception in runtime
Object obj = ThirdParty.getObjectOfUnknownClass();
Wrap<?> valDynamic = Wrap.create(obj.getClass(), obj); // compiles but MAYBE throws exception in runtime
// so we get to this point only if all the wrappers are legal,
// and we can add them as keys to the map
map.put(valLong, new Object());
map.put(valString, new Object());
map.put(valDynamic, new Object());
}
}
HashMap<DataType1, DataType2> hm = new HashMap<DataType1, DataType2>();
or
Map<DataType1,DataType2> m = new HashMap<DataType1,DataType2>();
m.put(key, value);
Instead of DataType1 and DataType2 you can use Integer, String, Long, etc., and use the put(key, value) method to add keys and values to the HashMap.
I know there are quite a few CCE questions on SO. I have read the majority of them either in detail or briefly and I cannot find anything that applies to my situation. My exact error is:
Exception in thread "pool-1-thread-1" java.lang.ClassCastException: datastructures.instances.JClass cannot be cast to java.util.ArrayList
thrown at the line: if ((results = mCallsDownstreamCache.get(origin)) == null) {
As you will see in the code, what I'm doing is asking for an ArrayList from a cache (HashMap) and then making a decision on that. The odd behavior here is that datastructures.instances.JClass is in no way referenced in the piece of code that generates the error.
To give you some context, I have a database "model" which fulfills requests from a "controller". The results are stored in a cache local to the model, and if they exist the model returns the cached copy, thus not having to hit the db. My caching elements are effectively decorators for Apache Commons JCS.
The offending line is marked between a block comment and an inline comment:
public class AnalyzeModel extends Model {
public final String TAG = getClass().getSimpleName();
public CacheDecorator<Integer, JClass> mClassCache = new CacheDecorator<Integer, JClass>();
public CacheDecorator<Integer, JMethod> mMethodCache = new CacheDecorator<Integer, JMethod>();
public CacheDecorator<Integer, ArrayList<Integer>> mCallsUpstreamCache =
new CacheDecorator<Integer, ArrayList<Integer>>();
public CacheDecorator<Integer, ArrayList<Integer>> mCallsDownstreamCache =
new CacheDecorator<Integer, ArrayList<Integer>>();
public void close() {
super.close();
}
public Pair<Integer, ArrayList<Integer>> selectCallGraphDownstream(int origin) {
ArrayList<Integer> results = new ArrayList<Integer>();
/**
* This is the offending line
*/
if ((results = mCallsDownstreamCache.get(origin)) == null) {
// End error line
results = new ArrayList<Integer>();
for (Record r : mQuery.select(
mQuery.mQuerier.select(Calls.CALLS.TID)
.from(Calls.CALLS)
.where(Calls.CALLS.SID.eq(origin)))) {
results.add(r.getValue(Calls.CALLS.TID));
}
mCallsDownstreamCache.put(origin, results);
Statistics.CACHE_MISS++;
} else {
Statistics.CACHE_HITS++;
}
return new Pair<Integer, ArrayList<Integer>>(origin, results);
}
}
public class CacheDecorator<K, V> {
public final String TAG = getClass().getSimpleName();
private CacheAccess<K, V> mCache;
public CacheDecorator() {
try {
mCache = JCS.getInstance("default");
} catch (CacheException e) {
BillBoard.e(TAG, "Error getting cache configuration: " + e.toString());
e.printStackTrace();
}
}
/**
* Get an object from cache
* @param obj object to retrieve from cache
* @return generic object of retrieved value, null if not found
*/
public V get(K obj) {
return mCache.get(obj);
}
/**
* Place an object in cache
* @param key generic key for reference
* @param obj generic object to be cached
*/
public synchronized void put(K key, V obj) {
try {
if(obj != null) {
mCache.putSafe(key, obj);
}
} catch( CacheException e) {
//BillBoard.d(TAG, obj.toString());
//BillBoard.e(TAG, "Error placing item in cache: " + e.toString());
//e.printStackTrace();
}
}
/**
* Get the stats from our cache manager
* @return String of our cache object
*/
public String getStats() {
shutDownCache();
return mCache.getStats();
}
public static void shutDownCache() {
CompositeCacheManager.getInstance().shutDown();
}
}
Some additional details that may, or may not, be helpful:
The Pair<V, K> datastructure is just an immutable 2-pair tuple class
CacheDecorator.get(K key) returns null if the object doesn't exist in cache
I've tried quite a bit in regards to casting and such
JClass does have references elsewhere in the code, but no reference in the offending method
JClass is a representation of a java class, it's a custom structure
Altering your config to include region-specific configurations, as presented in the documentation, and passing a region name to your wrapper should resolve the problem.
It looks like you are using the same cache region for all your wrappers and hence referencing the same underlying cache across them. Could you change mCache = JCS.getInstance("default"); to something like mCache = JCS.getInstance("uniqueNameForWrapper"); for each wrapper?
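The failure mode can be reproduced with plain collections: if two generically typed wrappers share one untyped store (one cache region), a value cached through one wrapper comes back through the other, and the unchecked cast only blows up at the call site, far from where the JClass was put in. A stdlib sketch with hypothetical names:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.Map;

public class SharedRegionDemo {
    // Both wrappers share ONE untyped store, like two CacheDecorators
    // built on the same "default" JCS region.
    static final Map<Object, Object> SHARED_REGION = new HashMap<>();

    static class Cache<K, V> {
        @SuppressWarnings("unchecked")
        V get(K key) { return (V) SHARED_REGION.get(key); } // unchecked: trusts the region
        void put(K key, V value) { SHARED_REGION.put(key, value); }
    }

    public static boolean reproduce() {
        Cache<Integer, String> classCache = new Cache<>();            // stands in for mClassCache
        Cache<Integer, ArrayList<Integer>> listCache = new Cache<>(); // mCallsDownstreamCache

        classCache.put(42, "JClass stand-in"); // cached under the shared region
        try {
            // The compiler inserts a checkcast to ArrayList at this assignment,
            // which is exactly where the reported ClassCastException appears.
            ArrayList<Integer> results = listCache.get(42);
            return results == null; // not reached: the cast above throws
        } catch (ClassCastException e) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println(reproduce()); // true: the cast fails at the call site
    }
}
```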
Explanation of the situation:
I want to instantiate an object that can have basically two parameters: a set of 'meta-parameters' and a value. The type of the value is determined by the 'meta-parameters'.
Example:
public class Meta
{
public static final Meta META_FIRST = new Meta(String.class, new byte[] {0x00, 0x01});
public static final Meta META_SECOND = new Meta(Float.class, new byte[] {0x00, 0x02});
public static final Meta META_THIRD = new Meta(Double.class, new byte[] {0x00, 0x03});
private Class<?> type;
private byte[] prelude;
private Meta(Class<?> type, byte[] prelude)
{
this.type = type;
this.prelude = prelude;
}
public Class<?> getType()
{
return this.type;
}
public byte[] getPrelude()
{
return this.prelude;
}
}
public class Record
{
private # value;
private byte[] prelude;
public Record(Meta meta, # value)
{
this.prelude = meta.getPrelude();
}
public void doSomeWork()
{
//Do some work with prelude and value
}
}
Expected usage:
Record recordString = new Record(Meta.META_FIRST, "hello");
Record recordDouble = new Record(Meta.META_THIRD, 12.8);
My doubt is how to determine the type of 'value' (symbolized by '#' above).
I think generics or reflection could solve my problem, but I can't figure out how a parameter in the constructor can influence the type of another parameter.
I would like to avoid using the generic notation when instantiating a Record (that's the reason why I put this 'generic' information in the Meta class).
Does anyone have an idea how to solve this? (Feel free to suggest another approach.)
Note: it is also acceptable for me to initialize the record value later with a setter.
In order to get it to compile, you have to make the Record class generic (parameterized by the type of the value):
public class Record<T> {
private T value;
public Record(Meta meta, T value) {
//Initialization
}
}
However, I don't see a reason to have a Meta class, since it does nothing but hold the Class type of the value. In order to simplify the hierarchy and to make sure the Meta is compatible with the value type, I would remove the Meta class and keep a Class<T> in Record, which will represent the metadata about the value.
public class Record<T> {
private T value;
private Class<T> meta;
public Record(T value, Class<T> meta) {
//Initialization
}
public Class<T> getMeta() {
return meta;
}
}
and will use it like this:
Record<String> recordString = new Record<>("hello", String.class);
Class<String> recordStringMeta = recordString.getMeta();
Record<Double> recordDouble = new Record<>(12.8, Double.class);
Class<Double> recordDoubleMeta = recordDouble.getMeta();
Update:
Since you don't want to have the Record class generic (which I don't advise, but ...), you can introduce three constructors there and copy the passed value to an Object member. Unfortunately, this will force you to cast when extracting the value back:
public class Record {
private Object value;
public Record(Meta meta, String value) { ... }
public Record(Meta meta, Double value) { ... }
public Record(Meta meta, Float value) { ... }
}
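If the Meta constants are wanted anyway, a middle ground (my sketch, not part of the answer above) is to give Meta its own type parameter, so each constant carries the value type and the compiler rejects mismatched values while keeping the Meta.META_FIRST call style:

```java
public class TypedMetaDemo {
    // Each Meta constant now carries the type of the value it allows.
    static final class Meta<T> {
        final Class<T> type;
        final byte[] prelude;
        Meta(Class<T> type, byte[] prelude) { this.type = type; this.prelude = prelude; }
        static final Meta<String> META_FIRST  = new Meta<>(String.class, new byte[]{0x00, 0x01});
        static final Meta<Float>  META_SECOND = new Meta<>(Float.class,  new byte[]{0x00, 0x02});
        static final Meta<Double> META_THIRD  = new Meta<>(Double.class, new byte[]{0x00, 0x03});
    }

    static final class Record<T> {
        final T value;
        final byte[] prelude;
        Record(Meta<T> meta, T value) {
            this.prelude = meta.prelude;
            this.value = value;
        }
    }

    public static void main(String[] args) {
        Record<String> recordString = new Record<>(Meta.META_FIRST, "hello");
        Record<Double> recordDouble = new Record<>(Meta.META_THIRD, 12.8);
        // new Record<>(Meta.META_FIRST, 12.8); // would not compile: Double vs String
        System.out.println(recordString.value + " " + recordDouble.value);
    }
}
```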
I need a data structure to store objects of different types, e.g. String, Boolean and other classes.
Is using a Map<String, Object>, where the key retrieves an object that you are assumed to know how to cast, good practice?
Is there a better solution?
That's a perfect use case for a PropertyHolder I wrote a while ago. You can read about it at length on my blog. I developed it with immutability in mind; feel free to adapt it to your needs.
In general I'd say that if you want to profit from type safety in Java, you need to know your keys. What I mean by that: it is hardly possible to develop a type-safe solution where the keys come from an external source.
Here's a special key that knows the type of its value (it's not complete; please download the source for the complete version):
public class PropertyKey<T> {
private final Class<T> clazz;
private final String name;
public PropertyKey(Class<T> valueType, String name) {
this.clazz = valueType;
this.name = name;
}
public boolean checkType(Object value) {
if (null == value) {
return true;
}
return this.clazz.isAssignableFrom(value.getClass());
}
... rest of the class
}
Then you develop a data structure that utilizes it:
public class PropertyHolder {
private final ImmutableMap<PropertyKey<?>, ?> storage;
/**
* Returns value for the key of the type extending-the-one-declared-in-the {@link PropertyKey}.
*
* @param key {@link PropertyKey} instance.
* @return Value of the type declared in the key.
*/
@SuppressWarnings("unchecked")
public <T extends Serializable> T get(PropertyKey<T> key) {
return (T) storage.get(key);
}
/**
* Adds key/value pair to the state and returns new
* {@link PropertyHolder} with this state.
*
* @param key {@link PropertyKey} instance.
* @param value Value of type specified in {@link PropertyKey}.
* @return New {@link PropertyHolder} with updated state.
*/
*/
public <T> PropertyHolder put(PropertyKey<T> key, T value) {
Preconditions.checkNotNull(key, "PropertyKey cannot be null");
Preconditions.checkNotNull(value, "Value for key %s is null",
key);
Preconditions.checkArgument(key.checkType(value),
"Property \"%s\" was given "
+ "value of a wrong type \"%s\"", key, value);
// Creates ImmutableMap.Builder with new key/value pair.
return new PropertyHolder(filterOutKey(key)
.put(key, value).build());
}
/**
* Returns {@link Builder} with all the elements from the state except for the given key.
*
* @param key The key to remove.
* @return {@link Builder} for further processing.
*/
private <T> Builder<PropertyKey<? extends Serializable>, Serializable> filterOutKey(PropertyKey<T> key) {
Builder<PropertyKey<? extends Serializable>, Serializable> builder = ImmutableMap
.<PropertyKey<? extends Serializable>, Serializable> builder();
for (Entry<PropertyKey<? extends Serializable>, Serializable> entry : this.storage.entrySet()) {
if (!entry.getKey().equals(key)) {
builder.put(entry);
}
}
return builder;
}
... rest of the class
}
I omit a lot of unnecessary details here; please let me know if something is not clear.
A typesafe heterogeneous container can be used for this purpose:
import java.util.HashMap;
import java.util.Map;
public class Container {
private Map<Class<?>, Object> container = new HashMap<Class<?>, Object>();
public <T> void putElement(Class<T> type, T instance) {
if (type == null) {
throw new NullPointerException("Type is null");
}
//container.put(type, instance); // 'v1'
container.put(type, type.cast(instance)); // 'v2' runtime type safety!
}
public <T> T getElement(Class<T> type) {
return type.cast(container.get(type));
}
public static void main(String[] args) {
Container myCont = new Container();
myCont.putElement(String.class, "aaa");
myCont.putElement(Boolean.class, true);
myCont.putElement(String[].class, new String[] {"one", "two"});
System.out.println(myCont.getElement(String.class));
System.out.println(myCont.getElement(String[].class)[1]);
}
}
Limitation: in this form, the container can store only one instance per type.
In putElement() you can achieve runtime type safety by using a dynamic cast. This will, however, add extra overhead.
E.g.: try to pass a raw class object to the container, and note where the exception occurs:
Class raw = Class.forName("MyClass");
myCont.putElement(raw, "aaa"); //ClassCastException if using 'v2'
System.out.println(myCont.getElement(raw)); //ClassCastException if using 'v1'