I'm learning generics and trying to write a generic method in a project that I'm working on. I have a recursive method that is exactly the same in two overloads except that the parameter type is either Document or BasicDBObject. These parameters contain nested lists of objects, so each of them again holds objects of type Document or BasicDBObject. So I have overloaded the method for both types. Can I use generics to change this method to accept either type?
public <T> String convertToRules(T rulesObj) {
    final StringBuilder expressionBuilder = new StringBuilder("${");
    if (rulesObj.getString("condition") != null) {
        String condition = rulesObj.getString("condition");
        if (rulesObj.get("rules") != null) {
            // rules can be ArrayList<BasicDBObject> or ArrayList<Document>, so both casts
            // are shown here in the question to avoid confusion about what type they are.
            ArrayList<BasicDBObject> rules = (ArrayList<BasicDBObject>) rulesObj.get("rules");
            ArrayList<Document> rules = (ArrayList<Document>) rulesObj.get("rules");
            rules.forEach(rule -> {
                // Some code
                expressionBuilder.append(convertToRules(rule));
            });
        }
    }
    return expressionBuilder.toString();
}
There are compiler errors in the above code, as I was trying to convert the existing methods into a generic method. Below are the actual methods.
public String convertToRules(BasicDBObject rulesObj) {
    final StringBuilder expressionBuilder = new StringBuilder("${");
    if (rulesObj.getString("condition") != null) {
        String condition = rulesObj.getString("condition");
        if (rulesObj.get("rules") != null) {
            ArrayList<BasicDBObject> rules = (ArrayList<BasicDBObject>) rulesObj.get("rules");
            rules.forEach(rule -> {
                // Some code
                expressionBuilder.append(convertToRules(rule));
            });
        }
    }
    return expressionBuilder.toString();
}
public String convertToRules(Document rulesObj) {
    final StringBuilder expressionBuilder = new StringBuilder("${");
    if (rulesObj.getString("condition") != null) {
        String condition = rulesObj.getString("condition");
        if (rulesObj.get("rules") != null) {
            ArrayList<Document> rules = (ArrayList<Document>) rulesObj.get("rules");
            rules.forEach(rule -> {
                // Some code
                expressionBuilder.append(convertToRules(rule));
            });
        }
    }
    return expressionBuilder.toString();
}
I'm pretty sure this is not the right way to do it, so I need some guidance on how this can be achieved. I want to know whether I'm thinking about generics the right way or whether this implementation is wrong.
RuleType - the contract shared by the two objects.
import java.util.List;
public interface RuleType<T> {
String getString(String parameter);
List<T> get(String param);
}
BasicDBObject
import java.util.ArrayList;
import java.util.List;
public class BasicDBObject implements RuleType<BasicDBObject> {
@Override
public String getString(String parameter) {
return "Something";
}
@Override
public List<BasicDBObject> get(String param) {
return new ArrayList<>();
}
}
Document
import java.util.ArrayList;
import java.util.List;
public class Document implements RuleType<Document> {
@Override
public String getString(String parameter) {
return "Something";
}
@Override
public List<Document> get(String param) {
return new ArrayList<>();
}
}
Generic Method (Logic implementation)
public <T extends RuleType<T>> String convertToRules(T rulesObj) {
    final StringBuilder expressionBuilder = new StringBuilder("${");
    if (rulesObj.getString("condition") != null) {
        String condition = rulesObj.getString("condition");
        if (rulesObj.get("rules") != null) {
            List<T> rules = rulesObj.get("rules");
            rules.forEach(rule ->
                // Some code
                expressionBuilder.append(convertToRules(rule))
            );
        }
    }
    return expressionBuilder.toString();
}
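For what it's worth, here is a quick usage sketch of the generic method above. It assumes the stub Document and BasicDBObject classes shown here, and that convertToRules lives in a class I'll call RuleConverter (a made-up name):
public class RuleConverterDemo {
    public static void main(String[] args) {
        RuleConverter converter = new RuleConverter(); // hypothetical class holding convertToRules
        // Both calls resolve to the single generic convertToRules method.
        String fromDocument = converter.convertToRules(new Document());
        String fromBasicDBObject = converter.convertToRules(new BasicDBObject());
        System.out.println(fromDocument);
        System.out.println(fromBasicDBObject);
    }
}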
You can make an abstract superclass or an interface for both classes. Let's say SuperClass, with the common methods of BasicDBObject and Document, such as getString(). Then override those methods in the two classes. For the generic method, you can then declare it like <? extends SuperClass>, and you can read the properties of SuperClass easily. I think it should work fine.
Related
Currently, I have three classes, Apple, Grape and Banana, and I created a converter class (which converts a string to a list) for each of them. Then I realized all those converter classes are basically the same except that they take different class types.
The example of AppleConverter.java
public class AppleConverter extends StrutsTypeConverter {
@SuppressWarnings("rawtypes")
@Override
public Object convertFromString(Map context, String[] values, Class toClass) {
final List<Apple> appleList = new ArrayList<>();
try {
for (String appleString : values) {
Apple apple = Apple.valueOf(appleString);
appleList.add(apple);
}
}
catch (IllegalArgumentException e) {
throw new TypeConversionException(e);
}
return appleList;
}
@SuppressWarnings("rawtypes")
@Override
public String convertToString(Map context, Object o) {
if (o == null) {
return null;
}
if (o instanceof List<?>) {
List<Apple> list = (List<Apple>) o;
String appleString = list
.stream()
.map(e -> String.valueOf(e))
.collect(Collectors.joining(","));
return appleString;
}
throw new TypeConversionException("wrong");
}
}
Basically, GrapeConverter.java and BananaConverter.java are the same as AppleConverter.java except that they take different class types.
So I tried to create a generic class then those converter classes can inherit it.
My generic class code:
public class FruitConverter<T> extends StrutsTypeConverter {
@SuppressWarnings("rawtypes")
@Override
public Object convertFromString(Map context, String[] values, Class toClass) {
if (ArrayUtils.isEmpty(values)) {
return null;
}
final List<T> objs = new ArrayList<>();
try {
for (String objStr : values) {
T obj = T.valueOf(objStr); // this is the line that does not compile
objs.add(obj);
}
}
catch (IllegalArgumentException e) {
throw new TypeConversionException(e);
}
return objs;
}
@SuppressWarnings("rawtypes")
@Override
public String convertToString(Map context, Object o) {
if (o == null) {
return null;
}
if (o instanceof List<?>) {
List<?> list = (List<?>) o;
String fruitString = list
.stream()
.map(e -> String.valueOf(e))
.collect(Collectors.joining(","));
return fruitString;
}
throw new TypeConversionException("object is not a list of objs");
}
}
When I called T obj = T.valueOf(objStr);, the compiler reports an error: cannot resolve method valueOf.
Can anyone tell me how to apply generics correctly in this case?
The compilation error occurs because Java's generics are implemented with type erasure. In this case, T.valueOf is treated the same as if you had tried to call Object.valueOf, i.e. the compiler doesn't know which class to search for the valueOf method other than Object.
You can pass the method in as a function using a method reference, as in the following example:
import java.util.function.Function;
class FruitConverter<T> {
private final Function<String, T> valueOf;
FruitConverter(Function<String, T> valueOf) {
this.valueOf = valueOf;
}
// ...
}
class AppleConverter extends FruitConverter<Apple> {
AppleConverter() {
super(Apple::valueOf);
}
}
Then you can use this.valueOf.apply(...) when you want to call the method.
Also see https://docs.oracle.com/javase/tutorial/java/javaOO/methodreferences.html.
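Putting it together, the abstract converter's convertFromString could then look roughly like this (a sketch that assumes the StrutsTypeConverter and TypeConversionException types from the question):
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

public abstract class FruitConverter<T> extends StrutsTypeConverter {

    private final Function<String, T> valueOf;

    protected FruitConverter(Function<String, T> valueOf) {
        this.valueOf = valueOf;
    }

    @SuppressWarnings("rawtypes")
    @Override
    public Object convertFromString(Map context, String[] values, Class toClass) {
        final List<T> objs = new ArrayList<>();
        try {
            for (String value : values) {
                // Delegate to the method reference supplied by the subclass, e.g. Apple::valueOf.
                objs.add(valueOf.apply(value));
            }
        } catch (IllegalArgumentException e) {
            throw new TypeConversionException(e);
        }
        return objs;
    }

    // convertToString is unchanged from the question, so it is omitted here.
}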
You cannot use T in such expressions; I doubt that you even need generics for this task.
A quick and dirty way to generalize is to use reflection: instead of T obj = T.valueOf(objStr);, use this:
Object obj = toClass.getDeclaredMethod("valueOf", String.class).invoke(null, objStr);
then you will be able to use the same function for all such objects.
Another approach is to use factories, but it will require much more code.
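For completeness, the factory approach mentioned above could look something like this. This is only a sketch: FruitFactory, AppleFactory and FruitParser are hypothetical names, and Apple.valueOf is assumed to exist as in the question.
import java.util.ArrayList;
import java.util.List;

// Hypothetical factory contract: one tiny implementation per fruit instead of reflection.
interface FruitFactory<T> {
    T fromString(String value);
}

class AppleFactory implements FruitFactory<Apple> {
    @Override
    public Apple fromString(String value) {
        return Apple.valueOf(value);
    }
}

class FruitParser {
    static <T> List<T> parseAll(String[] values, FruitFactory<T> factory) {
        List<T> result = new ArrayList<>();
        for (String value : values) {
            result.add(factory.fromString(value));
        }
        return result;
    }
}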
Edit: I was trying to simplify my problem at hand a little, but it turns out that created more confusion instead. Here's the real deal:
I am working with AWS's Java SDK for DynamoDB. Using the DynamoDBMapper class, I am trying to query DynamoDB to retrieve items from a particular table. I have several objects that map to my DynamoDB tables, and I was hoping to have a generic method that could accept the mapped objects, query the table, and return the item result.
Pseudo-code:
@DynamoDBTable(tableName="testTable")
public class DBObject {
private String hashKey;
private String attribute1;
@DynamoDBHashKey(attributeName="hashKey")
public String getHashKey() { return this.hashKey; }
public void setHashKey(String hashKey) { this.hashKey = hashKey; }
@DynamoDBAttribute(attributeName="attribute1")
public String getAttribute1() { return this.attribute1; }
public void setAttribute1(String attribute1) { this.attribute1 = attribute1; }
}
public class DatabaseRetrieval {
public DatabaseRetrieval()
{
DBObject dbObject = new DBObject();
dbObject.setHashKey("12345");
DBRetrievalAgent agent = new DBRetrievalAgent();
dbObject = agent.retrieveDBObject(dbObject.getClass(), dbObject);
}
}
public class DBRetrievalAgent {
public Object retrieveDBObject(Class<?> classType, Object dbObject)
{
DynamoDBQueryExpression<classType> temp = new DynamoDBQueryExpression<classType>().withHashKeyValues(dbObject);
return this.dynamoDBMapper.query(classType, temp);
}
}
Use a type witness within your method:
public <T> String getResult(Class<T> type) {
List<T> newList = new ArrayList<>();
//other code
}
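Applied to the retrieval agent from the question, that could look roughly like this. This is a sketch assuming the AWS SDK v1 DynamoDBMapper and DynamoDBQueryExpression API; the dynamoDBMapper field is assumed to be initialized elsewhere:
import java.util.List;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBQueryExpression;

public class DBRetrievalAgent {

    private DynamoDBMapper dynamoDBMapper; // assumed to be initialized elsewhere

    public <T> List<T> retrieveDBObjects(Class<T> classType, T hashKeyObject) {
        DynamoDBQueryExpression<T> query =
                new DynamoDBQueryExpression<T>().withHashKeyValues(hashKeyObject);
        return dynamoDBMapper.query(classType, query);
    }
}
A call such as List<DBObject> results = agent.retrieveDBObjects(DBObject.class, dbObject); then keeps the element type without any casting.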
Try this
ArrayList<T> newList = new ArrayList<>();
You can specify the type as T for your getResult() to make it generic (i.e., it accepts any class), as shown below:
public <T> String getResult(T t) {
String result = "";
List<T> newList = new ArrayList<>();
// perform actions
return result;
}
I have two unrelated java classes (only *.class, no *.java) like this:
public class Trick {
public String getName() { return "Jack"; }
public String trick() { ... }
}
and
public class Treat {
public String getName() { return "John"; }
public String treat() { ... }
}
and I would like to generate a sort of proxy class at runtime that represents the union of both classes and forwards calls to the respective instance, or maybe throws if that's not possible. I assume that can be done with cglib, but I don't know where to start.
This is what I would like to do (pseudocode):
// prepare: generate a common interface
TrickOrTreat trickOrTreat = magic.createUnionInterface(Trick.class, Treat.class);
// use with concrete interface A:
Trick trick = new Trick();
TrickOrTreat proxyA = magic.createProxy(trickOrTreat.class, trick);
System.out.println("trick name: " + proxyA.getName());
// use with concrete interface B:
Treat treat = new Treat();
TrickOrTreat proxyB = magic.createProxy(trickOrTreat.class, treat);
System.out.println("treat name: " + proxyB.getName());
Or something to that effect. I would like to do it completely dynamically, probably cglib-based. If that's not possible, I would do it with a code generation step in between.
If you are willing to trade in cglib, you can do this with Byte Buddy. I typically refuse to call it magic but here you go:
class Magic {
Class<?> createUnionInterface(Class<?> a, Class<?> b) {
DynamicType.Builder<?> builder = new ByteBuddy().makeInterface();
Set<MethodDescription.SignatureToken> tokens = new HashSet<>();
for (MethodDescription m : new TypeDescription.ForLoadedType(a)
.getDeclaredMethods()
.filter(ElementMatchers.isVirtual())) {
tokens.add(m.asSignatureToken());
builder = builder.defineMethod(m.getName(),
m.getReturnType(),
m.getModifiers()).withoutCode();
}
for (MethodDescription m : new TypeDescription.ForLoadedType(b)
.getDeclaredMethods()
.filter(ElementMatchers.isVirtual())) {
if (!tokens.contains(m.asSignatureToken())) {
builder = builder.defineMethod(m.getName(),
m.getReturnType(),
m.getModifiers()).withoutCode();
}
}
return builder.make()
.load(Magic.class.getClassLoader())
.getLoaded();
}
Object createProxy(Class<?> m, final Object delegate) throws Exception {
return new ByteBuddy()
.subclass(m)
.method(new ElementMatcher<MethodDescription>() {
@Override
public boolean matches(MethodDescription target) {
for (Method method : delegate.getClass()
.getDeclaredMethods()) {
if (new MethodDescription.ForLoadedMethod(method)
.asSignatureToken()
.equals(target.asSignatureToken())) {
return true;
}
}
return false;
}
}).intercept(MethodDelegation.to(delegate))
.make()
.load(Magic.class.getClassLoader())
.getLoaded()
.newInstance();
}
}
Note that you cannot reference a runtime-generated type at compile-time. This is however a given constraint with runtime code generation.
Magic magic = new Magic();
Class<?> trickOrTreat = magic.createUnionInterface(Trick.class, Treat.class);
Trick trick = new Trick();
Object proxyA = magic.createProxy(trickOrTreat, trick);
System.out.println("trick name: " + trickOrTreat.getDeclaredMethod("getName").invoke(proxyA));
Treat treat = new Treat();
Object proxyB = magic.createProxy(trickOrTreat, treat);
System.out.println("trick name: " + trickOrTreat.getDeclaredMethod("getName").invoke(proxyB));
You can overcome this by generating your TrickOrTreat class prior to runtime such that you can reference the type at runtime.
As for the suggested union-type approach, this would require at least one of the classes to be an interface type, as Java does not support multiple inheritance.
If you need the functionality of both classes/interfaces, you can use
public <TT extends Trick & Treat> void process(TT thing){
//...
}
edit:
Implement a new interface MyProxyHandler
public interface MyProxyHandler {}
Extend it with interfaces for your classes, say TreatInterface and TrickInterface
Create a class ProxyManager that implements java.lang.reflect.InvocationHandler
public abstract class ProxyManager<T extends MyProxyHandler> implements InvocationHandler {
protected static String LOCK_OBJECT = new String("LOCK");
protected T proxyHandler;
protected List<T> handlers = new ArrayList<>();
@SuppressWarnings("unchecked")
public ProxyManager(Class<T> _clazz) {
proxyHandler = (T) Proxy.newProxyInstance(_clazz.getClassLoader(), new Class[]{_clazz}, this);
}
public T getProxy() {
return proxyHandler;
}
public List<T> getHandlers() {
return handlers;
}
public void setHandlers(List<T> handlers) {
this.handlers = handlers;
}
public boolean registerHandler(T handler) {
synchronized (LOCK_OBJECT) {
boolean add = true;
for (T item : this.handlers) {
if (item.getClass().equals(handler.getClass())) {
add = false;
}
}
if (add)
this.handlers.add(handler);
return add;
}
}
@Override
public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
String result = "";
for (MyProxyHandler handler : getHandlers()) {
try {
// I recommend that the methods return some enum like HANDLED/NOTHANDLED
result = (String) method.invoke(handler, args);
if (result.equals("Some flag"))
break;
} catch (InvocationTargetException e) {
throw e.getCause();
}
}
return result;
}
}
Extend that class with your concrete class
public class TreatTrickProxyManager<T extends TrickInterface & TreatInterface> extends ProxyManager<T> {
public TreatTrickProxyManager(Class<T> _clazz) {
super(_clazz);
}
}
In your business logic class, get an instance of TreatTrickProxyManager
In your method
public void retrieveSomeData(){
((TreatTrickProxyManager) getTreatTrickProxyManager().getProxy()).someMethodInvocation();
}
What I need is a unit test that checks whether a given class MyClass is serializable. But I can't just check that the class implements Serializable: all the attributes of the class (and the attributes of the attributes, etc.) have to implement Serializable as well.
Also, a solution like creating an instance of MyClass and trying to serialize and deserialize it is not satisfactory: if someone adds an attribute and does not update the unit test, the unit test won't test the new attribute.
One solution would probably be to use reflection recursively to test whether the classes of all the attributes implement Serializable, but that seems really heavy.
Isn't there a better solution?
Thanks.
This is an old question, but I thought I would add my 2c.
Even if you recurse through the whole object graph, you cannot guarantee that your object can be serialized. If you have a member variable whose type is abstract or an interface, you cannot say whether the object you eventually store there will or will not serialize. If you know what you are doing, it will serialize.
This answer provides a solution by populating your objects with random data, but this is probably overkill.
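For example, a field declared with an interface type passes any static check but can still fail at runtime (a contrived illustration; the class names are made up):
import java.io.Serializable;

interface Payload { }                              // does not extend Serializable

class NonSerializablePayload implements Payload {
    private final Thread worker = new Thread();    // Thread is not Serializable
}

class Message implements Serializable {
    // A reflective check only sees the interface type; whether serialization
    // actually works depends on which Payload implementation is assigned at runtime.
    private Payload payload = new NonSerializablePayload();
}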
I have created a class that recurses through the structure, ignoring interfaces, java.lang.Object and abstract classes.
package au.com.tt.util.test;
import java.io.Serializable;
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;
public final class SerializableChecker
{
public static class SerializationFailure
{
private final String mContainingClass;
private final String mMemberName;
public SerializationFailure(String inNonSerializableClass, String inMemberName)
{
mContainingClass = inNonSerializableClass;
mMemberName = inMemberName;
}
public String getContainingClass()
{
return mContainingClass;
}
public String getMemberName()
{
return mMemberName;
}
public String getBadMemberString()
{
if (mMemberName == null)
return mContainingClass;
return mContainingClass + "." + mMemberName;
}
@Override
public String toString()
{
return "SerializationFailure [mNonSerializableClass=" + mContainingClass + ", mMemberName=" + mMemberName + "]";
}
}
private static class SerializationCheckerData
{
private Set<Class<?>> mSerializableClasses;
SerializationCheckerData()
{
mSerializableClasses = new HashSet<Class<?>>();
}
boolean isAlreadyChecked(Class<?> inClass)
{
return mSerializableClasses.contains(inClass);
}
void addSerializableClass(Class<?> inClass)
{
mSerializableClasses.add(inClass);
}
}
private SerializableChecker()
{ }
public static SerializationFailure isFullySerializable(Class<?> inClass)
{
if (!isSerializable(inClass))
return new SerializationFailure(inClass.getName(), null);
return isFullySerializable(inClass, new SerializationCheckerData());
}
private static SerializationFailure isFullySerializable(Class<?> inClass, SerializationCheckerData inSerializationCheckerData)
{
for (Field field : declaredFields(inClass))
{
Class<?> fieldDeclaringClass = field.getType();
if (field.getType() == Object.class)
continue;
if (Modifier.isStatic(field.getModifiers()))
continue;
if (field.isSynthetic())
continue;
if (fieldDeclaringClass.isInterface() || fieldDeclaringClass.isPrimitive())
continue;
if (Modifier.isAbstract(field.getType().getModifiers()))
continue;
if (inSerializationCheckerData.isAlreadyChecked(fieldDeclaringClass))
continue;
if (isSerializable(fieldDeclaringClass))
{
inSerializationCheckerData.addSerializableClass(inClass);
SerializationFailure failure = isFullySerializable(field.getType(), inSerializationCheckerData);
if (failure != null)
return failure;
else
continue;
}
if (Modifier.isTransient(field.getModifiers()))
continue;
return new SerializationFailure(field.getDeclaringClass().getName(), field.getName());
}
return null;
}
private static boolean isSerializable(Class<?> inClass)
{
Set<Class<?>> interfaces = getInterfaces(inClass);
if (interfaces == null)
return false;
boolean isSerializable = interfaces.contains(Serializable.class);
if (isSerializable)
return true;
for (Class<?> classInterface : interfaces)
{
if (isSerializable(classInterface))
return true;
}
if (inClass.getSuperclass() != null && isSerializable(inClass.getSuperclass()))
return true;
return false;
}
private static Set<Class<?>> getInterfaces(Class<?> inFieldDeclaringClass)
{
return new HashSet<Class<?>>(Arrays.asList(inFieldDeclaringClass.getInterfaces()));
}
private static List<Field> declaredFields(Class<?> inClass)
{
List<Field> fields = new ArrayList<Field>(Arrays.asList(inClass.getDeclaredFields()));
Class<?> parentClasses = inClass.getSuperclass();
if (parentClasses == null)
return fields;
fields.addAll(declaredFields(parentClasses));
return fields;
}
}
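Usage from a unit test could then look like this (a sketch assuming JUnit 4 and the MyClass from the question):
import static org.junit.Assert.fail;

import org.junit.Test;

import au.com.tt.util.test.SerializableChecker;
import au.com.tt.util.test.SerializableChecker.SerializationFailure;

public class MyClassSerializationTest {

    @Test
    public void myClassIsFullySerializable() {
        SerializationFailure failure = SerializableChecker.isFullySerializable(MyClass.class);
        if (failure != null)
            fail("Not serializable: " + failure.getBadMemberString());
    }
}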
As a general check, you can use SerializationUtils from Apache Commons, as follows:
byte [] data = SerializationUtils.serialize(obj);
Object objNew = SerializationUtils.deserialize(data);
but it does not guarantee that other instances of that class will serialize properly; for complex objects there is a risk that particular members will not be serializable depending on their members.
Using reflection recursively is not perfect, but it can partially do the job.
To do that, you can re-use a utility class like this.
Usage will look like this:
public class SerializationTests {
@Test
public void ensure_MyClass_is_serializable() {
assertIsSerializable(MyClass.class);
}
@Test
public void ensure_MyComplexClass_is_serializable() {
// We exclude "org.MyComplexClass.attributeName"
// because we cannot be sure it is serializable in a
// reliable way.
assertIsSerializable(MyComplexClass.class, "org.MyComplexClass.attributeName");
}
private static void assertIsSerializable(Class<?> clazz, String... excludes) {
Map<Object, String> results = SerializationUtil.isSerializable(clazz, excludes);
if (!results.isEmpty()) {
StringBuilder issues = new StringBuilder();
for (String issue : results.values()) {
issues.append("\n");
issues.append(issue);
}
fail(issues.toString());
}
}
}
I think serializing and deserializing an instance is OK. If the class is changed, the unit test will still be valid; you just need to compare the original and the deserialized objects.
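A minimal round-trip test along those lines, using the Apache Commons SerializationUtils mentioned above; this sketch assumes MyClass implements Serializable and has a meaningful equals():
import static org.junit.Assert.assertEquals;

import org.apache.commons.lang3.SerializationUtils;
import org.junit.Test;

public class MyClassRoundTripTest {

    @Test
    public void serializedCopyEqualsOriginal() {
        MyClass original = new MyClass();
        // populate the fields you care about here
        MyClass copy = SerializationUtils.clone(original);
        assertEquals(original, copy);
    }
}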
An easy way to test if an object is serializable in Kotlin:
/**
* Check if an object is Serializable
*/
fun assertIsSerializable(obj: Any) {
val serialized = SerializationUtils.serialize(obj as Serializable)
val unSerialized = SerializationUtils.deserialize<Any>(serialized)
}
I'm using Dozer version 5.4.0.
I have one class with a Map in it, and another class with a List. I'm trying to write a custom converter that will take the values of the map and put them in the List. But the problem is, the converter always gets passed a null source object for the Map, even if the parent source has a populated Map. I can't figure out why this is happening; I think the converter should be passed a populated Map object.
Here is some source code that compiles and shows the problem in action:
package com.sandbox;
import org.dozer.DozerBeanMapper;
import org.dozer.loader.api.BeanMappingBuilder;
import org.dozer.loader.api.FieldsMappingOptions;
public class Sandbox {
public static void main(String[] args) {
DozerBeanMapper mapper = new DozerBeanMapper();
mapper.addMapping(new MappingConfig());
ClassWithMap parentSource = new ClassWithMap();
ClassWithList parentDestination = mapper.map(parentSource, ClassWithList.class);
int sourceMapSize = parentSource.getMyField().size();
assert sourceMapSize == 1;
assert parentDestination.getMyField().size() == 1; //this assertion fails!
}
private static class MappingConfig extends BeanMappingBuilder {
@Override
protected void configure() {
mapping(ClassWithMap.class, ClassWithList.class)
.fields("myField", "myField",
FieldsMappingOptions.customConverter(MapToListConverter.class, "com.sandbox.MyMapValue"));
}
}
}
As you can see, that second assertion fails. Here are the other classes I'm using.
MapToListConverter.java:
package com.sandbox;
import org.dozer.DozerConverter;
import org.dozer.Mapper;
import org.dozer.MapperAware;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
public class MapToListConverter extends DozerConverter<Map, List> implements MapperAware {
private Mapper mapper;
public MapToListConverter() {
super(Map.class, List.class);
}
@Override
public List convertTo(Map source, List destination) { //source is always null, why?!
List convertedList = new ArrayList();
if (source != null) {
for (Object object : source.values()) {
Object mappedItem = mapper.map(object, getDestinationClass());
convertedList.add(mappedItem);
}
}
return convertedList;
}
private Class<?> getDestinationClass() {
try {
return Class.forName(getParameter());
} catch (ClassNotFoundException e) {
throw new IllegalArgumentException(e);
}
}
@Override
public Map convertFrom(List source, Map destination) {
throw new UnsupportedOperationException();
}
@Override
public void setMapper(Mapper mapper) {
this.mapper = mapper;
}
}
ClassWithMap.java:
package com.sandbox;
import java.util.HashMap;
import java.util.Map;
public class ClassWithMap {
private Map<String, MyMapValue> myField;
public Map<String, MyMapValue> getMyField() { //this method gets called by dozer, I've tested that with a break point
if (myField == null) {
myField = new HashMap<String, MyMapValue>();
myField.put("1", new MyMapValue());
}
return myField; //myField has an entry in it when called by dozer
}
public void setMyField(Map<String, MyMapValue> myField) {
this.myField = myField;
}
}
ClassWithList.java:
package com.sandbox;
import java.util.List;
public class ClassWithList {
private List<MyMapValue> myField;
public List<MyMapValue> getMyField() {
return myField;
}
public void setMyField(List<MyMapValue> myField) {
this.myField = myField;
}
}
MyMapValue.java
package com.sandbox;
public class MyMapValue {
}
The problem seems to be in the MapFieldMap.getSrcFieldValue method of dozer. These comments are added by me:
@Override
public Object getSrcFieldValue(Object srcObj) {
DozerPropertyDescriptor propDescriptor;
Object targetObject = srcObj;
if (getSrcFieldName().equals(DozerConstants.SELF_KEYWORD)) {
propDescriptor = super.getSrcPropertyDescriptor(srcObj.getClass());
} else {
Class<?> actualType = determineActualPropertyType(getSrcFieldName(), isSrcFieldIndexed(), getSrcFieldIndex(), srcObj, false);
if ((getSrcFieldMapGetMethod() != null)
|| (this.getMapId() == null && MappingUtils.isSupportedMap(actualType) && getSrcHintContainer() == null)) {
// Need to dig out actual map object by using getter on the field. Use actual map object to get the field value
targetObject = super.getSrcFieldValue(srcObj);
String setMethod = MappingUtils.isSupportedMap(actualType) ? "put" : getSrcFieldMapSetMethod();
String getMethod = MappingUtils.isSupportedMap(actualType) ? "get" : getSrcFieldMapGetMethod();
String key = getSrcFieldKey() != null ? getSrcFieldKey() : getDestFieldName();
propDescriptor = new MapPropertyDescriptor(actualType, getSrcFieldName(), isSrcFieldIndexed(), getDestFieldIndex(),
setMethod, getMethod, key, getSrcDeepIndexHintContainer(), getDestDeepIndexHintContainer());
} else {
propDescriptor = super.getSrcPropertyDescriptor(srcObj.getClass());
}
}
Object result = null;
if (targetObject != null) {
result = propDescriptor.getPropertyValue(targetObject); //targetObject is my source map, but the result == null
}
return result;
}
I figured out how to fix this. Still not sure if it's a bug or not, but I think it is. The solution is to change my configuration to say this:
mapping(ClassWithMap.class, ClassWithList.class, TypeMappingOptions.oneWay())
.fields("myFields", "myFields"
, FieldsMappingOptions.customConverter(MapToListConverter.class, "com.sandbox.MyMapValue")
);
The fix is in the TypeMappingOptions.oneWay(). When it's bidirectional, the dozer MappingsParser tries to use a MapFieldMap which causes my problem:
// iterate through the fields and see whether or not they should be mapped
// one way class mappings we do not need to add any fields
if (!MappingDirection.ONE_WAY.equals(classMap.getType())) {
for (FieldMap fieldMap : fms.toArray(new FieldMap[]{})) {
fieldMap.validate();
// If we are dealing with a Map data type, transform the field map into a MapFieldMap type
// only apply transformation if it is map to non-map mapping.
if (!(fieldMap instanceof ExcludeFieldMap)) {
if ((isSupportedMap(classMap.getDestClassToMap())
&& !isSupportedMap(classMap.getSrcClassToMap()))
|| (isSupportedMap(classMap.getSrcClassToMap())
&& !isSupportedMap(classMap.getDestClassToMap()))
|| (isSupportedMap(fieldMap.getDestFieldType(classMap.getDestClassToMap()))
&& !isSupportedMap(fieldMap.getSrcFieldType(classMap.getSrcClassToMap())))
|| (isSupportedMap(fieldMap.getSrcFieldType(classMap.getSrcClassToMap())))
&& !isSupportedMap(fieldMap.getDestFieldType(classMap.getDestClassToMap()))) {
FieldMap fm = new MapFieldMap(fieldMap);
classMap.removeFieldMapping(fieldMap);
classMap.addFieldMapping(fm);
fieldMap = fm;
}
}