ModelMapper map value from Map - java

I am trying to get a grasp of ModelMapper for the following use case:
class A {
    public String name;
    public Map<String, ATranslation> translations;
}

class ATranslation {
    public String desc;
    public String content;
}

class DTO {
    public String name;
    public String desc;
    public String content;
}
Assume Constructors, Getters and Setters.
public class App {
    public static void main(String[] args) {
        Map<String, ATranslation> translations = new HashMap<>();
        translations.put("en", new ATranslation("en-desc", "content1"));
        translations.put("nl", new ATranslation("nl-desc", "content2"));
        A entity = new A("John Wick", translations);
        System.out.println(App.toDto(entity, "en"));
        System.out.println(App.toDto(entity, "nl"));
    }

    private static DTO toDto(A entity, String lang) {
        ModelMapper modelMapper = new ModelMapper();
        // how to set up ModelMapper?
        return modelMapper.map(entity, DTO.class);
    }
}
Without any setup the output is:
DTO(name=John Wick, desc=null, content=null)
DTO(name=John Wick, desc=null, content=null)
A converter does not work:
modelMapper
    .createTypeMap(A.class, DTO.class)
    .setConverter(new Converter<A, DTO>() {
        public DTO convert(MappingContext<A, DTO> context) {
            A s = context.getSource();
            DTO d = context.getDestination(); // null here: setConverter replaces the whole mapping, so no destination is instantiated
            d.setDesc(s.getTranslations().get(lang).getDesc());
            d.setContent(s.getTranslations().get(lang).getContent());
            return d;
        }
    });
A postConverter does work, but it does not seem to be the most idiomatic ModelMapper way...
modelMapper
    .createTypeMap(A.class, DTO.class)
    .setPostConverter(new Converter<A, DTO>() {
        public DTO convert(MappingContext<A, DTO> context) {
            A s = context.getSource();
            DTO d = context.getDestination();
            d.setDesc(s.getTranslations().get(lang).getDesc()); // tedious, if many fields...
            d.setContent(s.getTranslations().get(lang).getContent()); // feels redundant already
            return d;
        }
    });
DTO(name=John Wick, desc=en-desc, content=content1)
DTO(name=John Wick, desc=nl-desc, content=content2)
Is there a better way to use ModelMapper here?

Did not test it! This is just a theoretical answer composed from research.
Design considerations
Led by your concerns about the tedious implementation of a post-converter, the following solution is designed out of small components.
I tried to decompose your mapping use case into smaller problems and solve each using a component from the mapping framework.
Mechanics of a mapper
Looking at how a bean-, object-, or model-mapper typically works may shed some light on the issue at hand.
A mapper maps an object of a type or source-class A to a new object of another type or target-class B.
ModelMapper - components to use
I tried to re-use examples from Baeldung's tutorial: Guide to Using ModelMapper. We need 3 steps:
(a) inferred property- or type-mapping for String name to equivalent target
(b) Expression Mapping: customized property-mapping for Map translations to String desc
(c) a parameterized converter that translates, i.e. looks up the value ATranslation by key String language and extracts or converts it to String desc
The features we use are:
Property Mapping for (a) and (b)
Converters for (c)
1. property-mapping
In its simple form it infers the mapping of properties, mapping the properties of the source onto the target. By default most mappers map properties to ones equivalent by type or name:
field types and names perfectly match
In your case this worked for the property String name.
// GIVEN
Map<String, ATranslation> translations = Map.of(
    "en", new ATranslation("en-desc", "content1"),
    "nl", new ATranslation("nl-desc", "content2")
);
A entity = new A("John Wick", translations);

// SETUP (a) property-mapping by type
TypeMap<A, DTO> typeMap = modelMapper.createTypeMap(A.class, DTO.class);

// WHEN mapping the properties
DTO dto = modelMapper.map(entity, DTO.class);

// THEN desc expected to be null
assertNull(dto.getDesc());
assertEquals(entity.getName(), dto.getName());
2. conversion
In some use-cases you need some kind of conversion, when type or name of the properties to map can not be inferred by simple equivalence.
Define a translation factory-method to configure the converter. It creates a new converter, or let's say an "interpreter for the specified language", on demand.
// Converter: translates for the specified language, i.e. looks up in the map using a passed parameter
Converter<Map<String, ATranslation>, String> translateToLanguage(final String language) {
    // the lambda receives a MappingContext; language needs to be (effectively) final inside the lambda
    return ctx -> ctx.getSource().getOrDefault(language, new ATranslation("", "")).getDesc();
}
This factory method can then be used to translate or convert:
// SETUP (a) property-mapping as default type-map
TypeMap<A, DTO> typeMap = modelMapper.createTypeMap(A.class, DTO.class);

// (b) + (c) map the property translations (even though it is another type) to desc,
// using the parameterized converter for the lookup
typeMap.addMappings(
    mapper -> mapper.using(translateToLanguage("nl")).map(A::getTranslations, DTO::setDesc)
);

// WHEN mapping the properties
DTO dto = modelMapper.map(entity, DTO.class);

// THEN desc expected to be mapped to the specified language's translation
assertEquals(entity.getTranslations().get("nl").getDesc(), dto.getDesc());
assertEquals(entity.getName(), dto.getName());
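Putting the pieces together, the toDto method from the question could then be set up as below. This is an untested sketch under the same assumptions (A exposes getTranslations(); the content field gets an analogous converter):
private static DTO toDto(A entity, String lang) {
    ModelMapper modelMapper = new ModelMapper();
    TypeMap<A, DTO> typeMap = modelMapper.createTypeMap(A.class, DTO.class);

    // desc: look up the translation for the requested language
    Converter<Map<String, ATranslation>, String> descConverter =
        ctx -> ctx.getSource().getOrDefault(lang, new ATranslation("", "")).getDesc();
    // content: same lookup, extracting the other field
    Converter<Map<String, ATranslation>, String> contentConverter =
        ctx -> ctx.getSource().getOrDefault(lang, new ATranslation("", "")).getContent();

    typeMap.addMappings(mapper -> mapper.using(descConverter).map(A::getTranslations, DTO::setDesc));
    typeMap.addMappings(mapper -> mapper.using(contentConverter).map(A::getTranslations, DTO::setContent));

    return modelMapper.map(entity, DTO.class);
}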

Related

How convert string field to hashmap value with mapstruct java?

I want to map accountNo to the argument value with a specific key (I mean a key of the hashmap), with MapStruct. Any ideas?
Event is my target
private Map<String, Object> argument;
private LocalDateTime dateTime;
AccountRequest is my source
private String accountNo;
private LocalDateTime dateTime;
I have the mapper as below but I want the opposite one too.
#Mapping(target = "accountNo", expression = "java(event.getArgument().get(key).toString())")
AccountRequest eventToAccountRequest(Event event, String key);
Event accountRequestToEvent(AccountRequest accountRequest); // this is my question
Currently there is no out-of-the-box support for mapping from a Bean into a Map. There is however an open feature request for this functionality.
This means that for your example you'll need to do it in a custom mapping.
e.g.
#Mapping(target = "argument", source = "accountNo", qualifiedByName = "accountNoToArgument")
Event accountRequestToEvent(AccountRequest accountRequest);
#Named("accountNoToArgument")
default Map<String, Object> accountNotToArgument(String accountNo) {
return accountNo == null ? null : Collections.singletonMap("accountNo", accountNo);
}
This would make sure that the rest of the parameters are automatically mapped by MapStruct.
Side note about a better mapping for eventToAccountRequest: instead of using an expression you can improve it with a custom mapping method and @Context. e.g.
#Mapping(target = "accountNo", source = "argument", qualifiedByName = "argumentToAccountNo")
AccountRequest eventToAccountRequest(Event event, #Context String key);
#Named("argumentToAccountNo")
default String argumentToAccountNo(Map<String, Object> argument, #Context key) {
return argument == null : null : argument.get(key).toString();
}
Using this you will avoid a potential NPE if event.getArgument() is null.
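For completeness, these methods would live on an ordinary MapStruct mapper interface. A minimal sketch of the wiring (the interface name AccountMapper and the Mappers factory lookup are illustrative assumptions, not part of the question):
@Mapper
public interface AccountMapper {

    AccountMapper INSTANCE = Mappers.getMapper(AccountMapper.class);

    @Mapping(target = "argument", source = "accountNo", qualifiedByName = "accountNoToArgument")
    Event accountRequestToEvent(AccountRequest accountRequest);

    @Named("accountNoToArgument")
    default Map<String, Object> accountNoToArgument(String accountNo) {
        return accountNo == null ? null : Collections.singletonMap("accountNo", accountNo);
    }
}

// usage:
// Event event = AccountMapper.INSTANCE.accountRequestToEvent(request);
// event.getArgument() -> {accountNo=<request.getAccountNo()>}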
Why not use a default method? Something like:
default Event accountRequestToEvent(AccountRequest accountRequest) {
    Event event = new Event();
    event.setArgument(Collections.singletonMap(accountRequest.getAccountNo(), "value"));
    return event;
}

Using decorator pattern without adding "different" behaviour

I have a facade interface where users can ask for information about, let's say, Engineers. That information should be transferred as JSON, for which we made a DTO. Now keep in mind that I have multiple datasources that can provide an item to this list of DTOs.
So I believe right now that I can use the Decorator Pattern by adding a handler of the datasource to the myEngineerListDTO of type List<EngineerDTO>. By that I mean all the datasources share the same DTO.
This picture below shows that VerticalScrollbar and HorizontalScrollBar have different behaviours added. Which means they add behaviour to the WindowDecorator interface.
My question: does my situation fit the decorator pattern? Do I specifically need to add a behaviour to use this pattern? And is there another pattern that fits my situation better? I have already considered the Chain of Responsibility pattern, but because I don't need to terminate my chain at any given moment, I thought maybe the Decorator pattern would be better.
Edit:
My end result should be: a List<EngineersDTO> from all datasources. The reason I want to add this pattern is so that I can easily add another datasource behind the rest of the "pipeline". This datasource, just like the others, will have an addEngineersDTOToList method.
To further illustrate how you can use the Chain-of-Responsibility pattern, I put together a small example. I believe you should be able to adapt this solution to suit the needs of your real-world problem.
Problem Space
We have an unknown set of user requests which contain the name of properties to be retrieved. There are multiple datasources which each have varying amounts of properties. We want to search through all possible data sources until all of the properties from the request have been discovered. Some data types and data sources might look like below (note I am using Lombok for brevity):
@lombok.Data
class FooBarData {
    private final String foo;
    private final String bar;
}

@lombok.Data
class FizzBuzzData {
    private final String fizz;
    private final String buzz;
}

class FooBarService {
    public FooBarData invoke() {
        System.out.println("This is an expensive FooBar call");
        return new FooBarData("FOO", "BAR");
    }
}

class FizzBuzzService {
    public FizzBuzzData invoke() {
        System.out.println("This is an expensive FizzBuzz call");
        return new FizzBuzzData("FIZZ", "BUZZ");
    }
}
Our end user might require multiple ways to resolve the data. The following could be a valid user input and expected response:
// Input
"foobar", "foo", "fizz"

// Output
{
    "foobar" : {
        "foo" : "FOO",
        "bar" : "BAR"
    },
    "foo" : "FOO",
    "fizz" : "FIZZ"
}
A basic interface and simple concrete implementation for our property resolver might look like below:
interface PropertyResolver {
    Map<String, Object> resolve(List<String> properties);
}

class UnknownResolver implements PropertyResolver {
    @Override
    public Map<String, Object> resolve(List<String> properties) {
        Map<String, Object> result = new HashMap<>();
        for (String property : properties) {
            result.put(property, "Unknown");
        }
        return result;
    }
}
Solution Space
Rather than the usual "Decorator pattern", a better fit may be the "Chain-of-Responsibility pattern". This pattern is similar to the decorator pattern; however, each link in the chain is allowed to either work on the item, ignore the item, or end the execution. This is helpful for deciding whether a call needs to be made, or for terminating the chain once the work for the request is complete. Another difference from the decorator pattern is that resolve will not be overridden by each of the concrete classes; our abstract class can call out to the subclass when required using abstract methods.
Back to the problem at hand... For each resolver we need two components: a way to fetch data from our remote service, and a way to extract all the required properties from the data retrieved. For fetching the data we can provide an abstract method. For extracting a property from the fetched data we can make a small interface and maintain a collection of these extractors, seeing as multiple properties can be pulled from a single piece of data:
interface PropertyExtractor<Data> {
    Object extract(Data data);
}

abstract class PropertyResolverChain<Data> implements PropertyResolver {
    private final Map<String, PropertyExtractor<Data>> extractors = new HashMap<>();
    private final PropertyResolver successor;

    protected PropertyResolverChain(PropertyResolver successor) {
        this.successor = successor;
    }

    protected abstract Data getData();

    protected final void setBinding(String property, PropertyExtractor<Data> extractor) {
        extractors.put(property, extractor);
    }

    @Override
    public Map<String, Object> resolve(List<String> properties) {
        ...
    }
}
The basic idea of the resolve method is to first evaluate which properties can be fulfilled by this PropertyResolver instance. If there are eligible properties then we fetch the data using getData. For each eligible property we extract the property value and add it to a result map. Each property which cannot be resolved is passed on to the successor to resolve. Once all properties are resolved, the chain of execution ends.
@Override
public Map<String, Object> resolve(List<String> properties) {
    Map<String, Object> result = new HashMap<>();
    List<String> eligibleProperties = new ArrayList<>(properties);
    eligibleProperties.retainAll(extractors.keySet());
    if (!eligibleProperties.isEmpty()) {
        Data data = getData();
        for (String property : eligibleProperties) {
            result.put(property, extractors.get(property).extract(data));
        }
    }
    List<String> remainingProperties = new ArrayList<>(properties);
    remainingProperties.removeAll(eligibleProperties);
    if (!remainingProperties.isEmpty()) {
        result.putAll(successor.resolve(remainingProperties));
    }
    return result;
}
Implementing Resolvers
When we go to implement a concrete class for PropertyResolverChain we will need to implement the getData method and also bind PropertyExtractor instances. These bindings can act as an adapter for the data returned by each service. This data can follow the same structure as the data returned by the service, or have a custom schema. Using the FooBarService from earlier as an example, our class could be implemented like below (note that we can have many bindings which result in the same data being returned).
class FooBarResolver extends PropertyResolverChain<FooBarData> {
    private final FooBarService remoteService;

    FooBarResolver(PropertyResolver successor, FooBarService remoteService) {
        super(successor);
        this.remoteService = remoteService;
        // return the whole object
        setBinding("foobar", data -> data);
        // accept different spellings
        setBinding("foo", data -> data.getFoo());
        setBinding("bar", data -> data.getBar());
        setBinding("FOO", data -> data.getFoo());
        setBinding("__bar", data -> data.getBar());
        // create new properties altogether!!
        setBinding("barfoo", data -> data.getBar() + data.getFoo());
    }

    @Override
    protected FooBarData getData() {
        return remoteService.invoke();
    }
}
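The FizzBuzzResolver used in the example below is not spelled out in this answer; by analogy with FooBarResolver, a minimal sketch might look like this:
class FizzBuzzResolver extends PropertyResolverChain<FizzBuzzData> {
    private final FizzBuzzService remoteService;

    FizzBuzzResolver(PropertyResolver successor, FizzBuzzService remoteService) {
        super(successor);
        this.remoteService = remoteService;
        // bind the two properties this datasource can supply
        setBinding("fizz", data -> data.getFizz());
        setBinding("buzz", data -> data.getBuzz());
    }

    @Override
    protected FizzBuzzData getData() {
        return remoteService.invoke();
    }
}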
Example Usage
Putting it all together, we can invoke the Resolver chain as shown below. We can observe that the expensive getData call is performed at most once per Resolver, and only if a requested property is bound to that resolver, and that the user gets only the exact fields they require:
PropertyResolver resolver =
    new FizzBuzzResolver(
        new FooBarResolver(
            new UnknownResolver(),
            new FooBarService()),
        new FizzBuzzService());

Map<String, Object> result = resolver.resolve(Arrays.asList(
    "foobar", "foo", "__bar", "barfoo", "invalid", "fizz"));

ObjectMapper mapper = new ObjectMapper();
mapper.enable(SerializationFeature.INDENT_OUTPUT);
System.out.println(mapper
    .writerWithDefaultPrettyPrinter()
    .writeValueAsString(result));
Output
This is an expensive FizzBuzz call
This is an expensive FooBar call
{
    "foobar" : {
        "foo" : "FOO",
        "bar" : "BAR"
    },
    "__bar" : "BAR",
    "barfoo" : "BARFOO",
    "foo" : "FOO",
    "invalid" : "Unknown",
    "fizz" : "FIZZ"
}

Modelmapper: Map an element in a list to field in POJO

public class SimpleDTO {
    private String firstElement;
    private String lastElement;
}

public class ComplexSource {
    private List<String> elementList;
}
I tried to map it using map().setFirstElement(source.getElementList().get(0)) but I get an error stating "1) Invalid source method java.util.List.get(). Ensure that method has zero parameters and does not return void."
How do I map an element in a list to a field in a POJO using ModelMapper or any other alternative?
In this case you can't use a PropertyMap. If you want to map it using ModelMapper, you must use a Converter instead of the PropertyMap you have used.
First, your Converter would be as follows, where ComplexSource is the source and SimpleDTO is the destination:
Converter<ComplexSource, SimpleDTO> converter = new AbstractConverter<ComplexSource, SimpleDTO>() {
    @Override
    protected SimpleDTO convert(ComplexSource source) {
        SimpleDTO destination = new SimpleDTO();
        List<String> sourceList = source.getElementList();
        if (null != sourceList && !sourceList.isEmpty()) {
            int sizeList = sourceList.size();
            destination.setFirstElement(sourceList.get(0));
            destination.setLastElement(sourceList.get(sizeList - 1));
        }
        return destination;
    }
};
Then you just need to add the converter to your ModelMapper instance:
ModelMapper mapper = new ModelMapper();
mapper.addConverter(converter);
If you try the mapping, it works perfectly:
ComplexSource complexSource = new ComplexSource();
complexSource.setElementList(Arrays.asList("firstElement", "lastElement"));
SimpleDTO simpleDto = mapper.map(complexSource, SimpleDTO.class);
System.out.println(simpleDto);
Output
SimpleDTO [firstElement=firstElement, lastElement=lastElement]
Regarding your comment: you need to check for nulls where needed in your source instance (in this case a null pointer is possible if the list is null). But ModelMapper initializes the destination instance for you, and you can even configure how the destination instance is created with a Provider (Providers documentation).
In special use cases like this you need to take care of null checks and exception handling yourself, because a Converter is, I would say, ModelMapper's way of mapping POJOs manually.
The advantages of using ModelMapper are explained on its website:
If you configure it correctly, in many cases there is no need to do the mapping manually.
It centralizes the mapping.
It provides a mapping API for handling special use cases. (This is your case.)
And so on (take a look at its website).

XmlHttpContent serializer sorts fields alphabetically

I need strict compliance with the order of the elements in my xml document. If I use the XmlHttpContent serializer to form xml content, the fields are sorted alphabetically.
Is there any way to specify explicitly order of the elements in xml? Or are there other ways to create and post http requests with the xml body?
I know this answer isn't ideal but I recently came across this issue when trying to use the http client library for serialisation to xml. The solution I've found that works is to have my DTO classes provide a method to convert them into a sorted map of some kind.
In my case this is an ImmutableMap<String, Object> as I'm also using Guava but any map with controllable order will do. The basic idea is to work with the java objects to construct your data but then when the time comes to serialise them you serialise the map instead.
public interface OrderedXml {
    ImmutableMap<String, Object> toOrderedMap();
}

public class Parent implements OrderedXml {
    @Key("First") String first;
    @Key("Second") String second;
    @Key("Child") Child child;

    @Override
    public ImmutableMap<String, Object> toOrderedMap() {
        return ImmutableMap.of(
            // the order of elements in this map will be the order they are serialised
            "First", first,
            "Second", second,
            "Child", child.toOrderedMap()
        );
    }
}

public class Child implements OrderedXml {
    @Key("@param1") String param1;
    @Key("@param2") String param2;
    @Key("text()") String value;

    @Override
    public ImmutableMap<String, Object> toOrderedMap() {
        return ImmutableMap.of(
            // the same goes for attributes, these will appear in this order
            "@param1", param1,
            "@param2", param2,
            "text()", value
        );
    }
}
public class Main {
    public static void main(String[] args) {
        // make the objects
        Parent parent = new Parent();
        parent.first = "Hello";
        parent.second = "World";
        parent.child = new Child();
        parent.child.param1 = "p1";
        parent.child.param2 = "p2";
        parent.child.value = "This is a child";

        // serialise the object to xml
        String xml = new XmlNamespaceDictionary()
            .toStringOf("Parent", parent.toOrderedMap()); // the important part
        System.out.println(xml); // should have the correct order
    }
}
I know this solution isn't ideal, but at least you can reuse toOrderedMap to make a nice toString :-).

ORM supporting immutable classes

Which ORM supports a domain model of immutable types?
I would like to write classes like the following (or the Scala equivalent):
class A {
    private final C c; // not mutable

    A(B b) {
        // init c
    }

    A doSomething(B b) {
        // build a new A
    }
}
The ORM has to initialize the object with the constructor, so it is possible to check invariants in the constructor. A default constructor plus field/setter access for initialization is not sufficient and complicates the class's implementation.
Working with collections should be supported. If a collection is changed, it should create a copy from the user's perspective. (Rendering the old collection state stale. But user code can still work on, or at least read, it.) Much like how persistent data structures work.
Some words about the motivation. Suppose you have an FP-style domain object model. Now you want to persist this to a database. How do you do that? You want to do as much as you can in a pure functional style until the evil side effects come in. If your domain object model is not immutable you cannot, for example, share the objects between threads. You have to copy, cache or use locks. So unless your ORM supports immutable types, you're constrained in your choice of solution.
UPDATE: I created a project focused on solving this problem called JIRM:
https://github.com/agentgt/jirm
I just found this question after implementing my own solution using Spring JDBC and Jackson's ObjectMapper. Basically I just needed some bare-minimum SQL <-> immutable object mapping.
In short I just use Spring's RowMapper and Jackson's ObjectMapper to map objects back and forth from the database. I use JPA annotations just for metadata (like column name etc...). If people are interested I will clean it up and put it on github (right now it's only in my startup's private repo).
Here is a rough idea of how it works; here is an example bean (notice how all the fields are final):
//skip imports for brevity
public class TestBean {

    @Id
    private final String stringProp;
    private final long longProp;
    @Column(name = "timets")
    private final Calendar timeTS;

    @JsonCreator
    public TestBean(
            @JsonProperty("stringProp") String stringProp,
            @JsonProperty("longProp") long longProp,
            @JsonProperty("timeTS") Calendar timeTS) {
        super();
        this.stringProp = stringProp;
        this.longProp = longProp;
        this.timeTS = timeTS;
    }

    public String getStringProp() {
        return stringProp;
    }

    public long getLongProp() {
        return longProp;
    }

    public Calendar getTimeTS() {
        return timeTS;
    }
}
Here is what the RowMapper looks like (notice it mainly delegates to Spring's ColumnMapRowMapper and then uses Jackson's ObjectMapper):
public class SqlObjectRowMapper<T> implements RowMapper<T> {

    private final SqlObjectDefinition<T> definition;
    private final ColumnMapRowMapper mapRowMapper;
    private final ObjectMapper objectMapper;

    public SqlObjectRowMapper(SqlObjectDefinition<T> definition, ObjectMapper objectMapper) {
        super();
        this.definition = definition;
        this.mapRowMapper = new SqlObjectMapRowMapper(definition);
        this.objectMapper = objectMapper;
    }

    public SqlObjectRowMapper(Class<T> k) {
        this(SqlObjectDefinition.fromClass(k), new ObjectMapper());
    }

    @Override
    public T mapRow(ResultSet rs, int rowNum) throws SQLException {
        Map<String, Object> m = mapRowMapper.mapRow(rs, rowNum);
        return objectMapper.convertValue(m, definition.getObjectType());
    }
}
Now I just took Spring's JdbcTemplate and gave it a fluent wrapper. Here are some examples:
@Before
public void setUp() throws Exception {
    dao = new SqlObjectDao<TestBean>(new JdbcTemplate(ds), TestBean.class);
}

@Test
public void testAll() throws Exception {
    TestBean t = new TestBean(IdUtils.generateRandomUUIDString(), 2L, Calendar.getInstance());
    dao.insert(t);
    List<TestBean> list = dao.queryForListByFilter("stringProp", "hello");
    List<TestBean> otherList = dao.select().where("stringProp", "hello").forList();
    assertEquals(list, otherList);
    long count = dao.select().forCount();
    assertTrue(count > 0);
    TestBean newT = new TestBean(t.getStringProp(), 50, Calendar.getInstance());
    dao.update(newT);
    TestBean reloaded = dao.reload(newT);
    assertTrue(reloaded != newT);
    assertTrue(reloaded.getStringProp().equals(newT.getStringProp()));
    assertNotNull(list);
}

@Test
public void testAdding() throws Exception {
    // This will do a UPDATE test_bean SET longProp = longProp + 100
    int i = dao.update().add("longProp", 100).update();
    assertTrue(i > 0);
}

@Test
public void testRowMapper() throws Exception {
    List<Crap> craps = dao.query("select string_prop as name from test_bean limit ?", Crap.class, 2);
    System.out.println(craps.get(0).getName());
    craps = dao.query("select string_prop as name from test_bean limit ?")
            .with(2)
            .forList(Crap.class);
    Crap c = dao.query("select string_prop as name from test_bean limit ?")
            .with(1)
            .forObject(Crap.class);
    Optional<Crap> absent =
            dao.query("select string_prop as name from test_bean where string_prop = ? limit ?")
            .with("never")
            .with(1)
            .forOptional(Crap.class);
    assertTrue(!absent.isPresent());
}

public static class Crap {

    private final String name;

    @JsonCreator
    public Crap(@JsonProperty("name") String name) {
        super();
        this.name = name;
    }

    public String getName() {
        return name;
    }
}
Notice in the above how easy it is to map any query onto immutable POJOs. That is, you don't need a 1-to-1 mapping of entity to table. Also notice the use of Guava's Optional (last query... scroll down). I really hate how ORMs either throw exceptions or return null.
Let me know if you like it and I'll spend the time putting it on github (only tested with PostgreSQL). Otherwise, with the info above you can easily implement your own using Spring JDBC. I'm starting to really dig it because immutable objects are easier to understand and think about.
Hibernate has the @Immutable annotation.
And here is a guide.
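For illustration, a minimal sketch of an entity using it (the Snapshot class and its fields are made up; note that Hibernate still requires a no-args constructor, so the immutability here is only from the application's point of view):
import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.annotations.Immutable;

@Entity
@Immutable // Hibernate will never issue UPDATEs for this entity
public class Snapshot {

    @Id
    private Long id;
    private String payload;

    protected Snapshot() { } // still required by Hibernate
    public Snapshot(Long id, String payload) {
        this.id = id;
        this.payload = payload;
    }

    public Long getId() { return id; }
    public String getPayload() { return payload; }
}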
Though not a real ORM, MyBatis may be able to do this. I didn't try it though.
http://mybatis.org/java.html
AFAIK, there are no ORMs for .NET supporting this feature exactly as you wish. But you can take a look at BLToolkit and LINQ to SQL - both provide update-by-comparison semantics and always return new objects on materialization. That's nearly what you need, but I'm not sure about collections there.
Btw, why do you need this feature? I'm aware of pure functional languages and the benefits of purely immutable objects (e.g. complete thread safety). But in the case of an ORM, all the things you do with such objects are finally transformed into a sequence of SQL commands anyway. So I admit the benefits of using such objects seem vaporous here.
You can do this with Ebean and OpenJPA (and I think you can do this with Hibernate, but I'm not sure). The ORM (Ebean/OpenJPA) will generate a default constructor (assuming the bean doesn't have one) and actually set the values of the 'final' fields. This sounds a bit odd, but final fields are not always strictly final per se.
SORM is a new Scala ORM which does exactly what you want. The code below will explain it better than any words:
// Declare a model:
case class Artist ( name : String, genres : Set[Genre] )
case class Genre ( name : String )

// Initialize SORM, automatically generating schema:
import sorm._
object Db extends Instance (
  entities = Set() + Entity[Artist]() + Entity[Genre](),
  url = "jdbc:h2:mem:test"
)

// Store values in the db:
val metal = Db.save( Genre("Metal") )
val rock = Db.save( Genre("Rock") )
Db.save( Artist("Metallica", Set() + metal + rock) )
Db.save( Artist("Dire Straits", Set() + rock) )

// Retrieve values from the db:
val metallica = Db.query[Artist].whereEqual("name", "Metallica").fetchOne() // Option[Artist]
val rockArtists = Db.query[Artist].whereEqual("genres.name", "Rock").fetch() // Stream[Artist]
