How to map a List into a single field with MapStruct? - java

I have this POJO :
public class PlayerDto {
private Long id;
private String name;
private String past;
}
And I have this entity :
public class Player {
private Long id;
private String name;
private List<String> past;
}
How can I map the List<String> past into the String past of the DTO with MapStruct? For example, if the List contains [Monty, Boto, Flaouri], the String of the DTO has to contain "Monty, Boto, Flaouri" as a single String.
This classic way with target and source doesn't work:
@Mappings({
    @Mapping(target = "past", source = "past"),
})
PlayerDto entityToDto(final Player entity);
Thanks

I guess you need to define a default method in your mapper interface to handle the conversion from List<String> to String. MapStruct will automatically use the default method.
The default method signature for your mapping should look like this:
String map(List<String> past)
Example:
default String map(List<String> past) {
    return past.stream().collect(Collectors.joining(", "));
}
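For completeness, here is a minimal sketch of how the whole mapper interface could look (the name PlayerMapper is just a placeholder, and the null guard is an addition); because the default method's signature matches List<String> -> String, MapStruct applies it to the past property without any explicit @Mapping:
@Mapper
public interface PlayerMapper {

    PlayerDto entityToDto(final Player entity);

    // MapStruct uses this for every List<String> -> String conversion in this mapper
    default String map(List<String> past) {
        // join with ", " to get "Monty, Boto, Flaouri"; return null for a null list
        return past == null ? null : String.join(", ", past);
    }
}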

Related

Converting unstructured objects in Java

I'm using MongoDB for unstructured documents. When I do aggregations, I get the final output as unstructured objects. I'm posting some sample data for simplicity; the actual objects have many more fields.
Eg :
[
{ _id : "1", type: "VIDEO", videoUrl : "youtube.com/java"},
{ _id : "2", type: "DOCUMENT", documentUrl : "someurl.com/spring-boot-pdf"},
{ _id : "3", type: "ASSESSMENT", marks : 78}
]
The respective classes for the types of the above objects are:
@Data
public class Video {
    private String _id;
    private String type;
    private String videoUrl;
}

@Data
public class Document {
    private String _id;
    private String type;
    private String documentUrl;
}

@Data
public class Assessment {
    private String _id;
    private String type;
    private Integer marks;
}
Since I can't specify a converter class, I get all objects as a list of Object, which is a general type for all of them.
List<Object> list = mongoTemplate.aggregate(aggregation, mongoTemplate.getCollectionName(YOUR_COLLECTION.class), Object.class).getMappedResults();
It's working, but this is not readable and not maintainable for backend and front-end developers (e.g. Swagger UI). So I came up with a solution: put all the fields into one class.
@Data
@JsonInclude(JsonInclude.Include.NON_NULL)
class MyConverter {
    private String _id;
    private String type;
    private String videoUrl;
    private String documentUrl;
    private Integer marks;
}
Here Jackson helps to ignore all null fields
Now I can use MyConverter as Type
List<MyConverter> list = mongoTemplate.aggregate(aggregation, mongoTemplate.getCollectionName(YOUR_COLLECTION.class), MyConverter.class).getMappedResults();
But I feel this is not good practice when implementing a standard application. I'd like to know, is there any way to avoid the general type class (e.g. by extending some abstract class)? Or is this the only way to do it?
I don't think (or at least I don't know whether) MongoDB in Java provides this kind of dynamic conversion based on some field (it would require specifying which field and which classes). But you can do it by hand.
First, you need to define your types (enum values or some map) for matching a string to a class. You can create an abstract parent class (e.g. TypedObject) for easier usage and for binding all the target classes (Video, Document, Assessment).
Next, you have to read and map the values from Mongo into something, because you want to read all the data in code. Object is fine, but I recommend Map<String, Object> (your Object actually is that Map - you can check it by invoking list.get(0).toString()). You could also map to String, DBObject or some JSON object - you have to read the "type" field by hand and get all the data from the object.
At the end you can convert the "bag of data" (Map<String, Object> in my example) to the target class.
Now you can use the converted objects through their target classes. To prove that these really are the target classes, I print the objects with toString over all fields.
Example implementation
Classes:
@Data
public abstract class TypedObject {
    private String _id;
    private String type;
}

@Data
@ToString(callSuper = true)
public class Video extends TypedObject {
    private String videoUrl;
}

@Data
@ToString(callSuper = true)
public class Document extends TypedObject {
    private String documentUrl;
}

@Data
@ToString(callSuper = true)
public class Assessment extends TypedObject {
    private Integer marks;
}
Enum for mapping string types to classes:
@RequiredArgsConstructor
public enum Type {
    VIDEO("VIDEO", Video.class),
    DOCUMENT("DOCUMENT", Document.class),
    ASSESSMENT("ASSESSMENT", Assessment.class);

    private final String typeName;
    private final Class<? extends TypedObject> clazz;

    public static Class<? extends TypedObject> getClazz(String typeName) {
        return Arrays.stream(values())
                .filter(type -> type.typeName.equals(typeName))
                .findFirst()
                .map(type -> type.clazz)
                .orElseThrow(IllegalArgumentException::new);
    }
}
Method for converting "bag of data" from JSON to your target class:
private static TypedObject toClazz(Map<String, Object> objectMap, ObjectMapper objectMapper) {
Class<? extends TypedObject> type = Type.getClazz(objectMap.get("type").toString());
return objectMapper.convertValue(objectMap, type);
}
Read JSON to "bags of data" and use of the above:
String json = "[\n" +
" { _id : \"1\", type: \"VIDEO\", videoUrl : \"youtube.com/java\"},\n" +
" { _id : \"2\", type: \"DOCUMENT\", documentUrl : \"someurl.com/spring-boot-pdf\"},\n" +
" { _id : \"3\", type: \"ASSESSMENT\", marks : 78}\n" +
"]";
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.configure(JsonParser.Feature.ALLOW_UNQUOTED_FIELD_NAMES, true);
List<Map<String, Object>> readObjects = objectMapper.readValue(json, new TypeReference<>() {});
for (Map<String, Object> readObject : readObjects) {
TypedObject convertedObject = toClazz(readObject, objectMapper);
System.out.println(convertedObject);
}
Remarks:
In the example I use Jackson's ObjectMapper for reading the JSON. This makes the example and testing simpler. I think you can replace it with mongoTemplate.aggregate(), but I still need the ObjectMapper in the toClazz method for converting the "bags of data".
I use Map<String, Object> instead of just Object. It is a bit more involved: List<Map<String, Object>> readObjects = objectMapper.readValue(json, new TypeReference<>() {});. If you want, you can do something like this instead: List<Object> readObjects2 = (List<Object>) objectMapper.readValue(json, new TypeReference<List<Object>>() {});
Result:
Video(super=TypedObject(_id=1, type=VIDEO), videoUrl=youtube.com/java)
Document(super=TypedObject(_id=2, type=DOCUMENT), documentUrl=someurl.com/spring-boot-pdf)
Assessment(super=TypedObject(_id=3, type=ASSESSMENT), marks=78)
Of course you can cast the TypedObject to the target class you need (I recommend checking instanceof before casting) and use it:
Video video = (Video) toClazz(readObjects.get(0), objectMapper);
System.out.println(video.getVideoUrl());
I assumed you read the whole collection at once and get all the types mixed up in one list (as in the example in your question). But you could instead query MongoDB by the "type" field and fetch the data separately for each type; that way you can easily convert each type on its own, as sketched below.
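For example, here is a minimal sketch of that per-type approach with MongoTemplate (a sketch only, reusing the YOUR_COLLECTION placeholder from your question; Query and Criteria come from org.springframework.data.mongodb.core.query):
// fetch only the documents of one type, so they can be mapped straight to the target class
Query videoQuery = new Query(Criteria.where("type").is("VIDEO"));
List<Video> videos = mongoTemplate.find(videoQuery, Video.class,
        mongoTemplate.getCollectionName(YOUR_COLLECTION.class));

Query assessmentQuery = new Query(Criteria.where("type").is("ASSESSMENT"));
List<Assessment> assessments = mongoTemplate.find(assessmentQuery, Assessment.class,
        mongoTemplate.getCollectionName(YOUR_COLLECTION.class));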

MapStruct: How to map property from "java.lang.Object" to "java.lang.String"

New to MapStruct; Object to String error:
[ERROR] /util/LicenseMapper.java:[11,23] Can't map property "java.lang.Object license.customFields[].value" to "java.lang.String license.customFields[].value". Consider to declare/implement a mapping method: "java.lang.String map(java.lang.Object value)".
Code:
@Mapper
public interface LicenseMapper {
    List<License> jsonToDao(List<com.integrator.vo.license.License> source);
}
The vo.license contains a List of CustomField objects with a property declared as:
@SerializedName("Value")
@Expose
private Object value;
The JSON input for this field can be a boolean, a string or anything else, so I mapped it to Object, whereas the DAO layer has the same field as a String. (In a custom mapper I would just use String.valueOf, but I'm not sure how to achieve this with MapStruct.)
Can anyone tell me what settings are required in LicenseMapper to convert Object to String?
License Structure - Source and destination:
.
.
private String notes;
private Boolean isIncomplete;
private List<CustomField> customFields = null;
private List<Allocation> allocations = null;
Custom Field Structure in Source (removed gson annotations):
.
.
private String name;
private Object dataType;
private Object value;
Custom Field Structure in Destination:
private String name;
private String datatype;
private String value;
You can try to use the @Mapping annotation with an expression:
@Mapping(expression = "java( String.valueOf(source.getValue()) )", target = "value")
List<License> jsonToDao(List<com.integrator.vo.license.License> source);
UPDATE
@Mapper
public interface LicenseMapper {

    LicenseMapper MAPPING = Mappers.getMapper(LicenseMapper.class);

    List<License> entityListToDaoList(List<com.integrator.vo.license.License> source);

    License entityToDao(com.integrator.vo.license.License source);

    List<CustomField> customFieldListToCustomFieldList(List<*your custom field path*CustomField> source);

    @Mapping(expression = "java( String.valueOf(source.getValue()) )", target = "value")
    CustomField customFieldToCustomField(*your custom field path*CustomField source);
}
IN YOUR CODE
import static ***.LicenseMapper.MAPPING;
***
List<License> myList = MAPPING.jsonToDao(mySource);
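Alternatively, based on the hint in the error message itself (this is only a sketch, not part of the answer above): declare a default Object-to-String method in the mapper and MapStruct will use it wherever an Object property has to become a String, with no expression needed:
@Mapper
public interface LicenseMapper {

    List<License> jsonToDao(List<com.integrator.vo.license.License> source);

    // MapStruct picks this up whenever it needs to map an Object property to a String
    default String map(Object value) {
        return value == null ? null : String.valueOf(value);
    }
}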
You can do this:
@Mapping(target = "yourTarget", source = "yourClass.custField.value")

MapStruct: Mapping collection objects based on their type's properties

I have the following Java beans in my application:
public class Status{
private String statusType;
private String status;
private String statusCode;
//getters and setters
}
public class Associate{
private String name;
private String id;
private List<Status> statuses;
//getters & setters
}
The possible values for Status.statusType are {"O", "P", "R", "S", "A"}. Now I need to map statuses as part of the Associate mapper, but the mapping should only return status and statusType if statusType matches any of {"O", "A", "P"}.
As of now I'm doing it with a custom predicate and an expression in the mapper, like below:
public interface AssociateMapper {

    Predicate<Status> status = (sts) -> null != sts && null != sts.getStatusType()
            && "O|A|P".contains(sts.getStatusType());

    @Mappings({
        //some mappings
        @Mapping(target = "statuses", expression = "java(associate.getStatuses().stream().filter(status).collect(Collectors.toList()))")
    })
    Associate mapAssociate(Associate associate);
}
Is there a more elegant way to do this using the MapStruct config itself?
A more elegant way to achieve what you are trying to do would be to use a custom mapping between statuses.
In your case it would look like:
@Mapper
public interface AssociateMapper {

    Associate mapAssociate(Associate associate);

    default List<Status> mapStatuses(List<Status> statuses) {
        return statuses.stream()
                .filter(Objects::nonNull)
                .filter(status -> Objects.equals("O", status.getStatusType()) || Objects.equals("A", status.getStatusType()) || Objects.equals("P", status.getStatusType()))
                .collect(Collectors.toList());
    }
}
What you are doing in the expression can be part of a custom mapping for a list of Statuses. If there is a custom method that maps between List<Status>, MapStruct will use that one.
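As a small refinement (a sketch, not part of the answer above; it assumes Java 9+ for Set.of), the allowed status types can be kept in a constant so the filter stays readable, and the extra null check avoids the NullPointerException that Set.of(...).contains(null) would throw:
@Mapper
public interface AssociateMapper {

    // implicitly public static final on an interface
    Set<String> ALLOWED_TYPES = Set.of("O", "A", "P");

    Associate mapAssociate(Associate associate);

    default List<Status> mapStatuses(List<Status> statuses) {
        return statuses.stream()
                .filter(Objects::nonNull)
                .filter(status -> status.getStatusType() != null
                        && ALLOWED_TYPES.contains(status.getStatusType()))
                .collect(Collectors.toList());
    }
}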

Java Spring + MongoTemplate cannot retrieve List of Object

I had a Java Class linked to a MongoDB Collection:
@Document(collection = "my_collection")
public class Ev extends MyDTO {
    @Id
    private String id;
    @Indexed
    private String sessionId;
    private List<String> findings;
}
I had to change findings to this:
private List<MyObject> findings;
Declared as
public class MyObject {
private String find;
private String description;
private int number;
private List<SecondaryObj> details;
}
Here are the constructors
public MyObject(String find, int number) {
    super();
    this.find = find;
    this.number = number;
}

public MyObject(String find, int number, List<SecondaryObj> details) {
    super();
    this.find = find;
    this.details = details;
    this.number = number;
}
So in mongoDB I have a situation similar to
{
"_id" : ObjectId("5b487a2667a1aa18f*******"),
"sessionId" : "abc123mySessionId",
"findings" : [
{
"find" : "HTTPS",
"description" : "I found HTTPS",
"number" : 10,
"details": [
{"a":"1", "b":"2"},
{"a":"2", "b":"3"}
]
},
{
"find" : "NAME",
"description" : "I found name",
"number" : 3,
"details": [
{"a":"1", "b":"2"},
{"a":"2", "b":"3"}
]
}
]
}
I obviously updated all the methods to match the new data set, but if I try to retrieve
Query searchQuery = new Query(Criteria.where("sessionId").is("abc123mySessionId"));
Ev result = mongoTemplate.findOne(searchQuery, Ev.class);
I obtain this error
Request processing failed; nested exception is org.springframework.data.mapping.model.MappingInstantiationException: Failed to instantiate com.my.project.domain.MyObject using constructor NO_CONSTRUCTOR with arguments
with root cause
java.lang.NoSuchMethodException: om.my.project.domain.MyObject.<init>()
I'm using spring-data-mongodb version 2.0.8 and mongo-java-driver version 3.8.0
I think I should declare MyObject somewhere, but I'm pretty new to Spring, so I'm going at this somewhat blindly... Any suggestions?
You have two non-zero-argument constructors and Spring does not know which one to call. It tries to call a no-args constructor, but your class does not have one.
Check the Spring Data Mongo docs.
You can create a no-args constructor and mark it with the @PersistenceConstructor annotation. This way Spring calls it to create the object and sets the fields via reflection based on the document's field names, so no setters are required.
@Document(collection = "my_collection")
public class Ev extends MyDTO {
    @Id
    private String id;
    @Indexed
    private String sessionId;
    private List<MyObject> findings;
}

public class MyObject {
    private String find;
    private String description;
    private int number;
}
This works fine for me with spring-boot-starter-data-mongodb version 2.0.3.RELEASE.
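For illustration, a minimal sketch of that suggestion applied to MyObject from the question (not taken from the answer; @PersistenceConstructor comes from org.springframework.data.annotation):
public class MyObject {

    private String find;
    private String description;
    private int number;
    private List<SecondaryObj> details;

    // Spring Data calls this constructor and then populates the fields
    // via reflection from the document's field names
    @PersistenceConstructor
    public MyObject() {
    }

    // the existing constructors stay available for application code
    public MyObject(String find, int number) {
        this.find = find;
        this.number = number;
    }

    public MyObject(String find, int number, List<SecondaryObj> details) {
        this.find = find;
        this.number = number;
        this.details = details;
    }
}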

ModelMapper integration with Jooq Record

===== POJO =====
// Employee POJO
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonNaming(PropertyNamingStrategy.LowerCaseWithUnderscoresStrategy.class)
public class Employee implements Serializable {
    private Integer id;
    private String name;
    private Integer companyId;
    // assume getters, setters and Serializable implementation
}

// Company POJO
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonNaming(PropertyNamingStrategy.LowerCaseWithUnderscoresStrategy.class)
public class Company implements Serializable {
    private Integer id;
    private String name;
    // assume getters, setters and Serializable implementation
}

// EmployeeVO POJO
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonNaming(PropertyNamingStrategy.LowerCaseWithUnderscoresStrategy.class)
public class EmployeeVO implements Serializable {
    private Employee employee;
    private Company company;
    // assume getters, setters and Serializable implementation
}
===== My DAO layer class =====
public List<EmployeeVO> getEmployees(){
// configuring model mapper.
ModelMapper modelMapper = new ModelMapper();
modelMapper.getConfiguration()
.addValueReader(new RecordValueReader())
.setSourceNameTokenizer(NameTokenizers.UNDERSCORE);
//property map configuration.
PropertyMap<Record, EmployeeVO> employeeVOMap = new PropertyMap<Record, EmployeeVO>() {
    protected void configure() {
        map().getEmployee().setName(this.<String>source("name"));
        map().getEmployee().setId(this.<Integer>source("id"));
        map().getCompany().setName(this.<String>source("comp_name"));
        map().getCompany().setId(this.<Integer>source("comp_id"));
    }
};
// TypeMap config
modelMapper.createTypeMap(Record.class, EmployeeVO.class);
// adding employeeVOMap .
modelMapper.addMappings(employeeVOMap);
// JOOQ query
List<Field<?>> fields = Lists.newArrayList();
// fields includes, id, name, comp_name, comp_id
SelectJoinStep query = select(dslContext, fields).from(EMPLOYEE)
.join(COMPANY)
.on(COMPANY.ID.equal(EMPLOYEE.COMPANY_ID));
Result<Record> records = query.fetch();
Record record = null;
Iterator<Record> it = records.iterator();
List<EmployeeVO> employeeList= Lists.newArrayList();
while (it.hasNext()) {
record = it.next();
EmployeeVO employeeVOObj =
modelMapper.map(record, EmployeeVO.class);
employeeList.add(employeeVOObj);
}
return employeeList;
}
===== Error log =====
1) Error mapping org.jooq.impl.RecordImpl to com.myportal.bingo.db.model.EmployeeVO
1 error] with root cause
java.lang.ArrayIndexOutOfBoundsException: -1
Note:
ModelMapper throws the above exception when it reaches the method below:
private void matchSource(TypeInfo<?> sourceTypeInfo, Mutator destinationMutator)
in ImplicitMappingBuilder.java, where sourceTypeInfo.getAccessors() is null.
Any help?
I had the same problem, or at least one that looked the same. (You can skip directly to my solution in the last paragraph.) A lot of debugging showed the following:
If the accessors on that line (mentioned in your question) are null, then the line accessors = PropertyInfoSetResolver.resolveAccessors(source, type, configuration) in the TypeInfoImpl class is executed, and the reason for the exception in my case was this call:
valueReader.get(source, memberName) in the following piece of code in the resolveAccessors method of the PropertyInfoSetResolver class:
if (valueReader == null)
resolveProperties(type, true, configuration, accessors);
else {
NameTransformer nameTransformer = configuration.getSourceNameTransformer();
for (String memberName : valueReader.memberNames(source))
accessors.put(nameTransformer.transform(memberName, NameableType.GENERIC),
new ValueReaderPropertyInfo(valueReader, valueReader.get(source, memberName),
memberName));
This ends up in source.getValue(memberName.toUpperCase()), where source is jOOQ's Record; InvoiceRecord in my case. And - tada - for some reason invoice.getValue("INVOICE_ID") ends up in the exception (there is no such field, so indexOf returns -1, which causes the ArrayIndexOutOfBoundsException), while invoice.getValue("invoice_id") is totally fine.
So the else branch (in the same piece of code above) was not the right path to execute, and the if case turned out to be fine.
So this is what helped in my particular case: removing the line modelMapper.getConfiguration().addValueReader(new RecordValueReader()). I hope this helps you too.
