Orika: Map from String to a List of SomeObjects - java

Consider the following situation:
public class A {
private String stringA;
public String getStringA() {
return stringA;
}
public void setStringA(String stringA) {
this.stringA = stringA;
}
}
public class B {
List<SomeObject> someObjects;
public List<SomeObject> getSomeObjects() {
if (someObjects == null) {
someObjects = new ArrayList<SomeObject>();
}
return someObjects;
}
}
public class SomeObject {
private String stringSomeObject;
public String getStringSomeObject() {
return stringSomeObject;
}
public void setStringSomeObject(String stringSomeObject) {
this.stringSomeObject = stringSomeObject;
}
}
I want to map from A to B. Whilst mapping these, stringA needs to be mapped to stringSomeObject in SomeObject. I tried to write an Orika mapper for this:
public class MyMapper extends ConfigurableMapper {
@Override
protected void configure(MapperFactory factory) {
ConverterFactory converterFactory = factory.getConverterFactory();
converterFactory.registerConverter(new StringToSomeObjectConverter());
factory.classMap(A.class, B.class) //
.field("stringA", "someObjects") //
.byDefault() //
.register();
}
}
It maps class A to B and, whenever it encounters a conversion from String to List<SomeObject>, it calls a custom converter:
public class StringToSomeObjectConverter extends CustomConverter<String, List<SomeObject>> {
private static final String BORROWER_PARTY_TYP_CODE = "147";
@Override
public List<SomeObject> convert(String source, Type<? extends List<SomeObject>> destinationType) {
SomeObject someObject = new SomeObject();
someObject.setStringSomeObject(source);
return Arrays.asList(someObject);
}
}
I wrote a unit test to ensure that this works:
@Test
public void testMap() throws Exception {
A a = new A();
a.setStringA("a");
B outcome = new MyMapper().map(a, B.class);
assertThat(outcome.getSomeObjects().size(), is(1));
}
Sadly this test fails with:
java.lang.AssertionError:
Expected: is <1>
but: was <0>
It seems like the converter is never executed, so I tried to debug it. And indeed: the debugger never reaches the converter. Am I doing something wrong? It seems like it. I know there are other methods one could go with, like mapAtoB, etc.
Ok I found a solut...nah! It's not a solution, it's just a workaround. I defined stringA as a List<String> as well and defined a converter extending CustomConverter<String, LoanContrReqERPCrteReqLoanContrBrrwrPty>.
Because this feels a little "hacky", I am still interested in a nice solution. (Though I am just thinking that this solution might be fine: now the data structures of both objects are more alike than before. The problem is that object B comes from an external service, so I can't modify it.)

Your mapping doesn't work because you don't have a setter for someObjects.
When Orika tries to generate code for the mapper, it checks every fieldMap in the classMap to verify that the source property is readable and the destination property is assignable. If these checks pass, the generator puts the field conversion into the generated mapper. If a check fails, Orika simply skips that field conversion.
A few options you can use to solve the problem:
You can add a setter for the someObjects field in class B:
public static class B {
List<SomeObject> someObjects;
public List<SomeObject> getSomeObjects() {
if (someObjects == null) {
someObjects = new ArrayList<SomeObject>();
}
return someObjects;
}
public void setSomeObjects(List<SomeObject> someObjects) {
this.someObjects = someObjects;
}
}
Use a custom mapper instead of a converter:
factory.classMap(A.class, B.class)
.customize(
new CustomMapper<A, B>() {
@Override
public void mapAtoB(A a, B b, MappingContext context) {
SomeObject someObject = new SomeObject();
someObject.setStringSomeObject(a.getStringA());
b.getSomeObjects().add(someObject);
}
}
)
.byDefault()
.register();
Orika will place the customMapper invocation after the resolved field maps.
The generated mapper will look like:
b.setOtherField(a.getOtherField());
if (customMapper != null) {
customMapper.map(source, destination); <-- Your mapper invocation
}
Use the following syntax for the fields:
factory.classMap(A.class, B.class)
.field("stringA", "someObjects[0].stringSomeObject")
.byDefault()
.register();
The generated mapper will look like:
if (source.getStringA() != null) {
if (((((java.util.List) destination.getSomeObjects()).size() <= 0 || ((List) destination.getSomeObjects()).get(0) == null))) {
((java.util.List) destination.getSomeObjects()).add(0, ((BoundMapperFacade) usedMapperFacades[0]).newObject(((String) source.getStringA()), mappingContext));
}
}
if (!(((java.lang.String) source.getStringA()) == null)) {
(((java.util.List) destination.getSomeObjects()).get(0)).setStringSomeObject(source.getStringA());
} else if (!(((java.util.List) destination.getSomeObjects()) == null) && !((((java.util.List) destination.getSomeObjects()).size() <= 0 || ((List) destination.getSomeObjects()).get(0) == null))) {
( ((java.util.List) destination.getSomeObjects()).get(0)).setStringSomeObject(null);
}
Also, there was a bug in Orika when mapping from a single property to a property of a collection element using the syntax .field("stringA", "elements{stringB}") (Incorrect mapper code generated for mapping from a single property to property of collection element). The bug was closed on 31 Dec 2016: Fix for bug
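For completeness, a minimal sketch of that element-expansion syntax applied to this question's classes, assuming an Orika release that contains the fix (the {} notation mirrors the issue referenced above, so verify it against your Orika version):
factory.classMap(A.class, B.class)
    // maps stringA into the stringSomeObject property of a generated list element
    .field("stringA", "someObjects{stringSomeObject}")
    .byDefault()
    .register();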

What is the correct way to use Spring Boot ConversionService in order to convert a retrieved list of entity objects into a list of DTOs objects?

I am working on a Spring Boot application and I have the following doubt.
I have this service method (that works fine) that inserts an object into the DB by calling the repository:
@Override
@Transactional
public CoinDTO createCoin(CoinDTO coin) throws DuplicateException {
Coin checkCoinExists = coinRepository.findByCode(coin.getCode());
if (checkCoinExists != null) {
String MsgErr = String.format("Coin %s already registered in the system !!! "
+ "Impossible to use POST", coin.getCode());
log.warning(MsgErr);
throw new DuplicateException(MsgErr);
}
Coin result = coinRepository.save(conversionService.convert(coin,Coin.class));
return conversionService.convert(result,CoinDTO.class);
}
As you can see, the save() method returns the inserted Coin object (a Hibernate entity class mapping my table). The service method then converts this Coin object into a CoinDTO object in order to return the DTO instead of the entity instance. It works fine and it is the expected behavior.
Now I created this second service method that simply retrieves the list of all Coin objects and must return the list of the related CoinDTO objects:
@Override
public List<CoinDTO> getCoinList() {
List<Coin> coinsList = this.coinRepository.findAll();
return null;
}
and here I have the following doubt: I think that I can implement this entity-to-DTO conversion by iterating over the coinsList elements, converting each element of the list one by one and then adding it to a new list. It should work.
Does a more modern and smarter way to do it exist? Maybe using a lambda function? Can you help me to implement this behavior in a modern and smart way?
You may create a generic abstract class like this:
public abstract class AbstractConverter<T, DTO> {
public abstract T fromDTO(DTO dto);
public abstract DTO toDTO(T t);
public List<T> fromDTOs(List<DTO> dtos) {
if (dtos == null || dtos.isEmpty()) {
return null;
} else {
return dtos.stream().map(this::fromDTO).collect(Collectors.toList());
}
}
public List<DTO> toDTOs(List<T> ts) {
if (ts == null || ts.isEmpty()) {
return null;
} else {
return ts.stream().map(this::toDTO).collect(Collectors.toList());
}
}
}
Then create another class that extends the abstract class above, assigning your desired values like this:
@Component(value = "coinConverter")
public class CoinConverter extends AbstractConverter<Coin, CoinDTO> {
@Override
public Coin fromDTO(CoinDTO dto) {
if (dto == null) {
return null;
} else {
Coin coin = new Coin();
// Assign all values you wanted to consume
// of the following form
// coin.setYourAttribute(dto.getYourAttribute())
return coin;
}
}
@Override
public CoinDTO toDTO(Coin coin) {
if (coin == null) {
return null;
} else {
CoinDTO coinDTO = new CoinDTO();
// Assign all values you wanted to expose
// of the following form
// coinDTO.setYourAttribute(coin.getYourAttribute())
return coinDTO;
}
}
}
In the controller layer you may change your existing code to this:
@Autowired
@Qualifier("coinConverter")
AbstractConverter<Coin, CoinDTO> abstractConverter;
@Override
public List<CoinDTO> getCoinList() {
List<Coin> coinsList = this.coinRepository.findAll();
return abstractConverter.toDTOs(coinsList);
}
This way your code is flexible to add more converters without changing the existing ones.
As far as I understand, you are looking for a way that makes the conversion process shorter and more convenient. If so, use the ModelMapper class in this case; read the documentation at http://modelmapper.org/. ModelMapper uses TypeTokens to allow mapping of generic parameterized types.
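A minimal sketch of that idea, assuming the ModelMapper dependency is on the classpath and that Coin and CoinDTO fields match by name (the modelMapper field and its placement inside the service class are assumptions, not part of the question):
import java.lang.reflect.Type;
import java.util.List;
import org.modelmapper.ModelMapper;
import org.modelmapper.TypeToken;
// ... inside the service class
private final ModelMapper modelMapper = new ModelMapper();
@Override
public List<CoinDTO> getCoinList() {
    List<Coin> coinsList = this.coinRepository.findAll();
    // TypeToken captures the generic List<CoinDTO> target type for ModelMapper
    Type listType = new TypeToken<List<CoinDTO>>() {}.getType();
    return modelMapper.map(coinsList, listType);
}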
Not sure if I understood your question, but is the following what you are looking for:
@Override
public List<CoinDTO> getCoinList() {
return this.coinRepository.findAll().stream()
.map(coin -> conversionService.convert(coin, CoinDTO.class))
.collect(Collectors.toList());
}

ValidatorUtils of list inside a list

I'm currently writing some validations in my code, and one of the main problems is that I have an object list that contains another object list, and so on.
public class BigObject{
private Long idObject;
private String idLanguage;
private Date dateGeneration;
private List<FirstObject> firstObject;
//getters and setters
}
public class FirstObject{
private List<SecondObject> secondObject;
//getters and setters
}
public class SecondObject{
private Long order;
private String titol;
private int floatProperty;
//getters and setters
}
These are my classes, and each one is nested inside another. I set up my Validator in the main method and created its respective class. Now, in the validator class I have this:
public class BigObjectValidator implements Validator {
@Override
public boolean supports(Class clazz) {
return BigObject.class.equals(clazz)
|| FirstObject.class.equals(clazz)
|| SecondObject.class.equals(clazz);
}
@Override
public void validate(Object obj, Errors e) {
BigObject bigObject = (BigObject) obj;
ValidationUtils.rejectIfEmptyOrWhitespace(e, "idObject", "empty.id");
ValidationUtils.rejectIfEmptyOrWhitespace(e, "idLanguage", "empty.id");
ValidationUtils.rejectIfEmptyOrWhitespace(e, "dateGeneration", "empty.id");
if (!(bigObject.getFirstObject().isEmpty())) {
for (FirstObject firstObject : bigObject.getFirstObject()) {
if (firstObject.getSecondObject() != null) {
for (SecondObject secondObject : firstObject.getSecondObject()) {
if (secondObject != null){
validateSecondObject(secondObject,e);
}
}
}
}
}
}
private void validateSecondObject(SecondObject secondObject, Errors e) {
ValidationUtils.rejectIfEmptyOrWhitespace(e, "order", "order.empty");
ValidationUtils.rejectIfEmptyOrWhitespace(e, "titol", "order.empty");
ValidationUtils.rejectIfEmptyOrWhitespace(e, "floatProperty", "order.empty");
}
}
The main problem is that I'm getting an org.springframework.beans.NotReadablePropertyException: Invalid property 'order' of bean class. I'm trying to work out why; I think it's because the validator is set up for the BigObject class and not the other ones. Now I don't know whether I have to create another class inside BigObjectValidator or something like that.
Edit:
Main
try {
BigObject object = new BigObject();
List<FirstObject> firstObj = new ArrayList<FirstObject>();
FirstObject firstObject = new FirstObject();
SecondObject secondObj = new SecondObject();
object.setIdObject(1L); // originally "something"
object.setIdLanguage("En");
object.setDateGeneration(new Date()); // originally "05-18-2018"
secondObj.setOrder(null);
firstObject.setSecondObject(Arrays.asList(secondObj));
firstObj.add(firstObject);
object.setFirstObject(firstObj);
BeanPropertyBindingResult result = new BeanPropertyBindingResult(object, "Object");
BigObjectValidator validateObject = new BigObjectValidator();
validateObject.validate(object, result);
if (result.hasErrors()) {
System.out.println(result.getAllErrors().toString());
}
} catch (Exception e) {
System.out.println(e);
}
Please look here
https://docs.spring.io/spring-framework/docs/current/javadoc-api/org/springframework/validation/Errors.html
public interface Errors Stores and exposes information about
data-binding and validation errors for a specific object. Field names
can be properties of the target object (e.g. "name" when binding to a
customer object), or nested fields in case of subobjects (e.g.
"address.street"). Supports subtree navigation via
setNestedPath(String): for example, an AddressValidator validates
"address", not being aware that this is a subobject of customer.
If you pass the same Errors object to your validateSecondObject method, it still references the original obj, not your nested object. You must validate this differently: either get a new instance of Errors (e.g. org.springframework.validation.BindException) or do it by manually throwing exceptions.
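Another option hinted at by the Errors javadoc quoted above is subtree navigation, so the field names in validateSecondObject resolve against the right element. A minimal sketch of the validate method of BigObjectValidator, assuming the question's classes, a java.util.List import, and Spring's standard pushNestedPath/popNestedPath API (the loop indexes are only illustrative):
@Override
public void validate(Object obj, Errors e) {
    BigObject bigObject = (BigObject) obj;
    ValidationUtils.rejectIfEmptyOrWhitespace(e, "idObject", "empty.id");
    ValidationUtils.rejectIfEmptyOrWhitespace(e, "idLanguage", "empty.id");
    ValidationUtils.rejectIfEmptyOrWhitespace(e, "dateGeneration", "empty.id");
    List<FirstObject> firsts = bigObject.getFirstObject();
    for (int i = 0; firsts != null && i < firsts.size(); i++) {
        List<SecondObject> seconds = firsts.get(i).getSecondObject();
        for (int j = 0; seconds != null && j < seconds.size(); j++) {
            // point the Errors object at the nested element so that "order" and
            // "titol" are resolved against this SecondObject instance
            e.pushNestedPath("firstObject[" + i + "].secondObject[" + j + "]");
            ValidationUtils.rejectIfEmptyOrWhitespace(e, "order", "order.empty");
            ValidationUtils.rejectIfEmptyOrWhitespace(e, "titol", "order.empty");
            e.popNestedPath();
        }
    }
}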

Fallback Class<?> for mapper.readValue

I am using Jackson for de/serialization in my app.
I have a situation where I need to convert a JSON string to one of my 3 classes. In case the string can't be converted to any one of the 3 classes, it will be considered an unrecognized case.
However, if the schema of the JSON string and the class provided in mapper.readValue(jsonString, MyClass1.class) do not match, it throws an UnrecognizedPropertyException.
Currently I am using something like below, but it seems to be pretty messy.
try {
obj = mapper.readValue(jsonString, MyClass1.class);
} catch (UnrecognizedPropertyException e1) {
try {
obj = mapper.readValue(jsonString, MyClass2.class);
} catch (UnrecognizedPropertyException e2) {
try {
obj = mapper.readValue(jsonString, MyClass3.class);
} catch (Exception e) {
//handle unrecognized string
}
} catch (Exception e) {
//handle unrecognized string
}
} catch (Exception e) {
//handle unrecognized string
}
Is this how it needs to be done, or is there any other alternative? Is there any way to configure the mapper to return null in case of unrecognized properties, as that would result in a simple series of if blocks instead of nested try-catch blocks?
You can try this method to do the deserialization; it will return null on UnrecognizedPropertyException:
private <T> T deserialize(ObjectMapper mapper, Class<T> type, String jsonString) {
T t = null;
try {
t = mapper.readValue(jsonString, type);
} catch (UnrecognizedPropertyException e) {
//handle unrecognized string
} catch (IOException e) {
//handle other errors
}
return t;
}
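A minimal usage sketch of that helper, assuming the three classes from the question; it simply tries them in order and falls through to the unrecognized case when all three return null:
Object obj = deserialize(mapper, MyClass1.class, jsonString);
if (obj == null) {
    obj = deserialize(mapper, MyClass2.class, jsonString);
}
if (obj == null) {
    obj = deserialize(mapper, MyClass3.class, jsonString);
}
if (obj == null) {
    // handle unrecognized string
}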
If jsonString is generated by you, you can consider adding type info and then using it to convert the deserialized object. You could refer to this post for how to do it.
If jsonString is generated by other services beyond your control, then there's no type info you can get, so you can only try the classes one by one; @Sachin Gupta's answer would be a nice choice.
I'd like to provide an additional option: define an all-in-one entity including all fields of MyClass1, MyClass2 and MyClass3, and make MyClass1, MyClass2 and MyClass3 separate wrappers that only expose the related fields of each. Code as follows:
Class AllInOne:
public class AllInOne {
protected String a;
protected String b;
protected String c;
public A asA() {
return new A(this);
}
public B asB() {
return new B(this);
}
public C asC() {
return new C(this);
}
}
Class A:
public class A {
private AllInOne allInOne;
public A(AllInOne allInOne) {
this.allInOne = allInOne;
}
public String getA() {
return allInOne.a;
}
}
Class B:
public class B {
private AllInOne allInOne;
public B(AllInOne allInOne) {
this.allInOne = allInOne;
}
public String getB() {
return allInOne.b;
}
}
Class C:
public class C {
private AllInOne allInOne;
public C(AllInOne allInOne) {
this.allInOne = allInOne;
}
public String getC() {
return allInOne.c;
}
}
Test code:
public class Main {
public static void main(String[] args) throws IOException {
ObjectMapper om = new ObjectMapper();
om.setVisibility(PropertyAccessor.FIELD, JsonAutoDetect.Visibility.ANY);
String jsonA = "{\"a\":\"a value\"}";
String jsonB = "{\"b\":\"b value\"}";
String jsonC = "{\"c\":\"c value\"}";
needTypeA(om.readValue(jsonA, AllInOne.class).asA());
needTypeB(om.readValue(jsonB, AllInOne.class).asB());
needTypeC(om.readValue(jsonC, AllInOne.class).asC());
}
private static void needTypeA(A a) {
System.out.println(a.getA());
}
private static void needTypeB(B b) {
System.out.println(b.getB());
}
private static void needTypeC(C c) {
System.out.println(c.getC());
}
}
With an implementation like this, we erase the specific type info at the deserialization step and bring it back at the moment we really need/use it. And as you can see, there's not too much extra code, because all we actually did was move all the field declarations together and add a couple of methods.
Notes:
I declared the fields in AllInOne as protected; putting all the POJO classes in the same package lets A, B and C access them directly, but not other classes outside the package.
Setting om.setVisibility(PropertyAccessor.FIELD, JsonAutoDetect.Visibility.ANY); makes Jackson deserialize by field, so that we can leave the duplicate setters and getters out of the AllInOne class.
If you do need to know the type info, you could add methods like isA inside AllInOne based on which fields are populated, as sketched below.
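A minimal sketch of such a probe, assuming (as in the test JSON above) that the field a only appears in A-shaped input; the method name isA is just illustrative:
// inside AllInOne
public boolean isA() {
    // the field "a" is only populated when the JSON had the A shape
    return a != null;
}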
If the json contains some defining property, then you can try to use @JsonTypeInfo and @JsonSubTypes. The classes MyClass1, ... must implement the same interface (the <interface> placeholder below). Also, I don't remember exactly how to map unknown implementations to null.
@JsonTypeInfo(
use = JsonTypeInfo.Id.NAME,
include = JsonTypeInfo.As.EXISTING_PROPERTY, // level of the defining property
property = <property_name>,
visible = true,
defaultImpl = NoClass.class)
@JsonSubTypes({@JsonSubTypes.Type(value = <interface-impl>.class, name = <property_value>)})
private <interface> value;
// getters and setters
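A concrete sketch of that setup for this question's classes, assuming they share a hypothetical common interface MyPayload and that the JSON carries a "type" property (all names here are illustrative, not from the question):
@JsonTypeInfo(
    use = JsonTypeInfo.Id.NAME,
    include = JsonTypeInfo.As.EXISTING_PROPERTY,
    property = "type",
    visible = true,
    defaultImpl = NoClass.class)
@JsonSubTypes({
    @JsonSubTypes.Type(value = MyClass1.class, name = "class1"),
    @JsonSubTypes.Type(value = MyClass2.class, name = "class2"),
    @JsonSubTypes.Type(value = MyClass3.class, name = "class3")
})
private MyPayload value;
// getters and setters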

Multiple #QueryParam keys for a single value in Jersey

Is it possible to allow multiple #QueryParam keys for a single object/variable in Jersey?
Actual:
@POST
public Something getThings(@QueryParam("customer-number") Integer n) {
...
}
so, if I add ?customer-number=3 after the URL it works.
Expected:
I want to get the behavior above if I add any of the following values:
?customer-number=3
?customerNumber=3
?customerNo=3
Obs:
The QueryParam annotation looks like:
...
public @interface QueryParam {
String value();
}
so, it cannot accept multiple String values (like @Produces).
The approach below allows the user to use multiple keys with the same meaning at the same time (whereas I want an "OR" condition between them):
@POST
public Something getThings(@QueryParam("customer-number") Integer n1,
@QueryParam("customerNumber") Integer n2,
@QueryParam("customerNo") Integer n3) {
...
}
Something like this doesn't work:
@POST
public Something getThings(@QueryParam("customer-number|customerNumber|customerNo") Integer n) {
...
}
How can I do this?
Details:
Jersey 2.22.1
Java 8
To be honest: this is not how webservices are supposed to be designed. You lay down a strict contract that both client and server follow; you define one parameter and that's it.
But of course, only in a perfect world would you have the freedom to dictate what is going to happen. So if you must allow three parameters in, then you'll have to make that the contract. This is one way, following your approach #2, which I have to provide without being able to test it for goofs:
public Something getThings(@QueryParam("customer-number") Integer n1,
@QueryParam("customerNumber") Integer n2,
@QueryParam("customerNo") Integer n3) throws YourFailureException {
Integer customerNumber = getNonNullValue("Customer number", n1, n2, n3);
// things with stuff
}
private static Integer getNonNullValue(String label, Integer... params) throws YourFailureException {
Integer value = null;
for(Integer choice : params){
if(choice != null){
if(value != null){
// this means there are at least two query parameters passed with a value
throw new YourFailureException("Ambiguous " + label + " parameters");
}
value = choice;
}
}
if(value == null){
throw new YourFailureException("Missing " + label + " parameter");
}
return value;
}
So basically reject any call that does not pass specifically one of the parameters, and let an exception mapper translate the exception you throw into a HTTP response code in the 4xx range of course.
(I made the getNonNullValue() method static as it strikes me as a reusable utility function.)
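For the exception mapper mentioned above, a minimal sketch assuming the hypothetical YourFailureException and JAX-RS 2.x as shipped with Jersey 2.22.1 (the choice of status code is illustrative):
import javax.ws.rs.core.Response;
import javax.ws.rs.ext.ExceptionMapper;
import javax.ws.rs.ext.Provider;
@Provider
public class YourFailureExceptionMapper implements ExceptionMapper<YourFailureException> {
    @Override
    public Response toResponse(YourFailureException e) {
        // translate the validation failure into a 400 Bad Request with the message as the body
        return Response.status(Response.Status.BAD_REQUEST)
                .entity(e.getMessage())
                .build();
    }
}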
Maybe the simplest and easiest way would be to use a custom @BeanParam:
First define the custom bean merging all the query parameters as:
class MergedIntegerValue {
private final Integer value;
public MergedIntegerValue(
#QueryParam("n1") Integer n1,
#QueryParam("n2") Integer n2,
#QueryParam("n3") Integer n3) {
this.value = n1 != null ? n1
: n2 != null ? n2
: n3 != null ? n3
: null;
// Throw an exception if value == null ?
}
public Integer getValue() {
return value;
}
}
and then use it with @BeanParam in your resource method:
public Something getThings(
@BeanParam MergedIntegerValue n) {
// Use n.getValue() ...
}
Reference: https://jersey.java.net/documentation/latest/user-guide.html#d0e2403
You can create a custom annotation. I won't go into too much detail about how to do it; you can see this post, or this post. Basically it relies on a different infrastructure than the usual dependency injection with Jersey. You can see this package from the Jersey project. This is where all the injection providers live that handle the @XxxParam injections. If you examine the source code, you will see that the implementations are fairly similar. The two links I provided above follow the same pattern, as does the code below.
What I did was create a custom annotation:
@Target({ElementType.FIELD, ElementType.PARAMETER})
@Retention(RetentionPolicy.RUNTIME)
public @interface VaryingParam {
String value();
@SuppressWarnings("AnnotationAsSuperInterface")
public static class Factory
extends AnnotationLiteral<VaryingParam> implements VaryingParam {
private final String value;
public static VaryingParam create(final String newValue) {
return new Factory(newValue);
}
public Factory(String newValue) {
this.value = newValue;
}
@Override
public String value() {
return this.value;
}
}
}
It may seem odd that I have a factory to create it, but this was required for the implementation of the below code, where I split the value of the String, and end up creating a new annotation instance for each split value.
Here is the ValueFactoryProvider (which, if you've read either of the above articles, you will see is required for custom method parameter injection). It's a large class, only because I put all the required classes into a single one, following the pattern you see in the Jersey project.
public class VaryingParamValueFactoryProvider extends AbstractValueFactoryProvider {
@Inject
public VaryingParamValueFactoryProvider(
final MultivaluedParameterExtractorProvider mpep,
final ServiceLocator locator) {
super(mpep, locator, Parameter.Source.UNKNOWN);
}
@Override
protected Factory<?> createValueFactory(final Parameter parameter) {
VaryingParam annotation = parameter.getAnnotation(VaryingParam.class);
if (annotation == null) {
return null;
}
String value = annotation.value();
if (value == null || value.length() == 0) {
return null;
}
String[] variations = value.split("\\s*\\|\\s*");
return new VaryingParamFactory(variations, parameter);
}
private static Parameter cloneParameter(final Parameter original, final String value) {
Annotation[] annotations = changeVaryingParam(original.getAnnotations(), value);
Parameter clone = Parameter.create(
original.getRawType(),
original.getRawType(),
true,
original.getRawType(),
original.getRawType(),
annotations);
return clone;
}
private static Annotation[] changeVaryingParam(final Annotation[] annos, final String value) {
for (int i = 0; i < annos.length; i++) {
if (annos[i] instanceof VaryingParam) {
annos[i] = VaryingParam.Factory.create(value);
break;
}
}
return annos;
}
private class VaryingParamFactory extends AbstractContainerRequestValueFactory<Object> {
private final String[] variations;
private final Parameter parameter;
private final boolean decode;
private final Class<?> paramType;
private final boolean isList;
private final boolean isSet;
VaryingParamFactory(final String[] variations, final Parameter parameter) {
this.variations = variations;
this.parameter = parameter;
this.decode = !parameter.isEncoded();
this.paramType = parameter.getRawType();
this.isList = paramType == List.class;
this.isSet = paramType == Set.class;
}
@Override
public Object provide() {
MultivaluedParameterExtractor<?> e = null;
try {
Object value = null;
MultivaluedMap<String, String> params
= getContainerRequest().getUriInfo().getQueryParameters(decode);
for (String variant : variations) {
e = get(cloneParameter(parameter, variant));
if (e == null) {
return null;
}
if (isList) {
List list = (List<?>) e.extract(params);
if (value == null) {
value = new ArrayList();
}
((List<?>) value).addAll(list);
} else if (isSet) {
Set set = (Set<?>) e.extract(params);
if (value == null) {
value = new HashSet();
}
((Set<?>) value).addAll(set);
} else {
value = e.extract(params);
if (value != null) {
return value;
}
}
}
return value;
} catch (ExtractorException ex) {
if (e == null) {
throw new ParamException.QueryParamException(ex.getCause(),
parameter.getSourceName(), parameter.getDefaultValue());
} else {
throw new ParamException.QueryParamException(ex.getCause(),
e.getName(), e.getDefaultValueString());
}
}
}
}
private static class Resolver extends ParamInjectionResolver<VaryingParam> {
public Resolver() {
super(VaryingParamValueFactoryProvider.class);
}
}
public static class Binder extends AbstractBinder {
@Override
protected void configure() {
bind(VaryingParamValueFactoryProvider.class)
.to(ValueFactoryProvider.class)
.in(Singleton.class);
bind(VaryingParamValueFactoryProvider.Resolver.class)
.to(new TypeLiteral<InjectionResolver<VaryingParam>>() {
})
.in(Singleton.class);
}
}
}
You will need to register this class' Binder (bottom of class) with Jersey to use it.
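For instance, registration might look like this, a sketch assuming a standard Jersey 2 ResourceConfig setup (the package name is a placeholder):
import org.glassfish.jersey.server.ResourceConfig;
ResourceConfig config = new ResourceConfig()
        .packages("your.resources.package") // hypothetical resource package
        .register(new VaryingParamValueFactoryProvider.Binder());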
What differentiates this class from Jersey QueryParamValueFactoryProvider is that instead of just processing a single String value of the annotation, it splits the value, and tries to extract the values from the query param map. The first value found will be returned. If the parameter is a List or Set, it just continues to keep looking up all the options, and adding them to the list.
For the most part this keeps all the functionality you would expect from an @XxxParam annotation. The only thing that was difficult to implement (so I left out supporting this use case) is multiple parameters, e.g.
@GET
@Path("multiple")
public String getMultipleVariants(@VaryingParam("param-1|param-2|param-3") String value1,
@VaryingParam("param-1|param-2|param-3") String value2) {
return value1 + ":" + value2;
}
I actually don't think it would be that hard to implement if you really need it; it's just a matter of creating a new MultivaluedMap and removing a value once it is found. This would be implemented in the provide() method of the VaryingParamFactory above. Until then, if you need this use case, you could just use a List or Set instead.
See this GitHub Gist (it's rather long) for a complete test case, using Jersey Test Framework. You can see all the use cases I tested in the QueryTestResource, and where I register the Binder with the ResourceConfig in the test configure() method.

Why is dozer passing in a null source object to my Configurable Custom Converter?

I'm using Dozer version 5.4.0.
I have one class with a Map in it, and another class with a List. I'm trying to write a custom converter that will take the values of the Map and put them in the List. But the problem is that the converter always gets passed a null source object for the Map, even if the parent source has a populated Map. I can't figure out why this is happening; I think the converter should be passed a populated Map object.
Here is some source code that compiles and shows the problem in action:
package com.sandbox;
import org.dozer.DozerBeanMapper;
import org.dozer.loader.api.BeanMappingBuilder;
import org.dozer.loader.api.FieldsMappingOptions;
public class Sandbox {
public static void main(String[] args) {
DozerBeanMapper mapper = new DozerBeanMapper();
mapper.addMapping(new MappingConfig());
ClassWithMap parentSource = new ClassWithMap();
ClassWithList parentDestination = mapper.map(parentSource, ClassWithList.class);
int sourceMapSize = parentSource.getMyField().size();
assert sourceMapSize == 1;
assert parentDestination.getMyField().size() == 1; //this assertion fails!
}
private static class MappingConfig extends BeanMappingBuilder {
@Override
protected void configure() {
mapping(ClassWithMap.class, ClassWithList.class)
.fields("myField", "myField",
FieldsMappingOptions.customConverter(MapToListConverter.class, "com.sandbox.MyMapValue"));
}
}
}
As you can see, that second assertion fails. Here are the other classes I'm using.
MapToListConverter.java:
package com.sandbox;
import org.dozer.DozerConverter;
import org.dozer.Mapper;
import org.dozer.MapperAware;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
public class MapToListConverter extends DozerConverter<Map, List> implements MapperAware {
private Mapper mapper;
public MapToListConverter() {
super(Map.class, List.class);
}
@Override
public List convertTo(Map source, List destination) { //source is always null, why?!
List convertedList = new ArrayList();
if (source != null) {
for (Object object : source.values()) {
Object mappedItem = mapper.map(object, getDestinationClass());
convertedList.add(mappedItem);
}
}
return convertedList;
}
private Class<?> getDestinationClass() {
try {
return Class.forName(getParameter());
} catch (ClassNotFoundException e) {
throw new IllegalArgumentException(e);
}
}
@Override
public Map convertFrom(List source, Map destination) {
throw new UnsupportedOperationException();
}
@Override
public void setMapper(Mapper mapper) {
this.mapper = mapper;
}
}
ClassWithMap.java:
package com.sandbox;
import java.util.HashMap;
import java.util.Map;
public class ClassWithMap {
private Map<String, MyMapValue> myField;
public Map<String, MyMapValue> getMyField() { //this method gets called by dozer, I've tested that with a break point
if (myField == null) {
myField = new HashMap<String, MyMapValue>();
myField.put("1", new MyMapValue());
}
return myField; //myField has an entry in it when called by dozer
}
public void setMyField(Map<String, MyMapValue> myField) {
this.myField = myField;
}
}
ClassWithList.java:
package com.sandbox;
import java.util.List;
public class ClassWithList {
private List<MyMapValue> myField;
public List<MyMapValue> getMyField() {
return myField;
}
public void setMyField(List<MyMapValue> myField) {
this.myField = myField;
}
}
MyMapValue.java
package com.sandbox;
public class MyMapValue {
}
The problem seems to be in the MapFieldMap.getSrcFieldValue method of dozer. These comments are added by me:
@Override
public Object getSrcFieldValue(Object srcObj) {
DozerPropertyDescriptor propDescriptor;
Object targetObject = srcObj;
if (getSrcFieldName().equals(DozerConstants.SELF_KEYWORD)) {
propDescriptor = super.getSrcPropertyDescriptor(srcObj.getClass());
} else {
Class<?> actualType = determineActualPropertyType(getSrcFieldName(), isSrcFieldIndexed(), getSrcFieldIndex(), srcObj, false);
if ((getSrcFieldMapGetMethod() != null)
|| (this.getMapId() == null && MappingUtils.isSupportedMap(actualType) && getSrcHintContainer() == null)) {
// Need to dig out actual map object by using getter on the field. Use actual map object to get the field value
targetObject = super.getSrcFieldValue(srcObj);
String setMethod = MappingUtils.isSupportedMap(actualType) ? "put" : getSrcFieldMapSetMethod();
String getMethod = MappingUtils.isSupportedMap(actualType) ? "get" : getSrcFieldMapGetMethod();
String key = getSrcFieldKey() != null ? getSrcFieldKey() : getDestFieldName();
propDescriptor = new MapPropertyDescriptor(actualType, getSrcFieldName(), isSrcFieldIndexed(), getDestFieldIndex(),
setMethod, getMethod, key, getSrcDeepIndexHintContainer(), getDestDeepIndexHintContainer());
} else {
propDescriptor = super.getSrcPropertyDescriptor(srcObj.getClass());
}
}
Object result = null;
if (targetObject != null) {
result = propDescriptor.getPropertyValue(targetObject); //targetObject is my source map, but the result == null
}
return result;
}
I figured out how to fix this. Still not sure if it's a bug or not, but I think it is. The solution is to change my configuration to say this:
mapping(ClassWithMap.class, ClassWithList.class, TypeMappingOptions.oneWay())
.fields("myFields", "myFields"
, FieldsMappingOptions.customConverter(MapToListConverter.class, "com.sandbox.MyMapValue")
);
The fix is in the TypeMappingOptions.oneWay(). When the mapping is bidirectional, the Dozer MappingsParser tries to use a MapFieldMap, which causes my problem:
// iterate through the fields and see wether or not they should be mapped
// one way class mappings we do not need to add any fields
if (!MappingDirection.ONE_WAY.equals(classMap.getType())) {
for (FieldMap fieldMap : fms.toArray(new FieldMap[]{})) {
fieldMap.validate();
// If we are dealing with a Map data type, transform the field map into a MapFieldMap type
// only apply transformation if it is map to non-map mapping.
if (!(fieldMap instanceof ExcludeFieldMap)) {
if ((isSupportedMap(classMap.getDestClassToMap())
&& !isSupportedMap(classMap.getSrcClassToMap()))
|| (isSupportedMap(classMap.getSrcClassToMap())
&& !isSupportedMap(classMap.getDestClassToMap()))
|| (isSupportedMap(fieldMap.getDestFieldType(classMap.getDestClassToMap()))
&& !isSupportedMap(fieldMap.getSrcFieldType(classMap.getSrcClassToMap())))
|| (isSupportedMap(fieldMap.getSrcFieldType(classMap.getSrcClassToMap())))
&& !isSupportedMap(fieldMap.getDestFieldType(classMap.getDestClassToMap()))) {
FieldMap fm = new MapFieldMap(fieldMap);
classMap.removeFieldMapping(fieldMap);
classMap.addFieldMapping(fm);
fieldMap = fm;
}
}
