I have the following entities:
public class Manufacturer
{
int id;
String name;
Country country;
List<Model> models;
}
public class Model
{
int id;
String name;
}
And the following DTO
public class ManufacturerLastModelDto
{
Integer id;
String name;
ModelDto model;
}
public class ModelDto
{
int id;
String name;
}
Now I want to map the Manufacturer to the ManufacturerLastModelDto, like this:
modelMapper.map(manufacturer, ManufacturerLastModelDto.class)
so that only the first entry of the manufacturer's models list is assigned to the DTO's model field.
My previous solution was to keep a List of ModelDtos in the DTO as well and remove all entries after index 0. That worked, because ModelMapper mapped the child Model to ModelDto automatically.
But the response wasn't so nice:
models: [
{...}
]
because it was sent as an array.
Do I need a custom ModelMapper here? If so, how do I build one? The tutorial is really complex. Do I need a converter or a TypeMap (or both)?
I am not too familiar with ModelMapper but have used it occasionally.
Yes, you need to create a converter for the property, and you can register it either on the ModelMapper itself or on a TypeMap. The only caveat is that you have to map the property yourself, for instance:
The converter
Converter<List<Model>, ModelDto> modelConverter = new AbstractConverter<List<Model>, ModelDto>() {
@Override
protected ModelDto convert(List<Model> models) {
if (models == null || models.isEmpty()) {
return null;
}
Model model = models.get(0);
ModelDto dto = new ModelDto();
dto.setId(model.getId());
dto.setName(model.getName());
return dto;
}
};
Now, using the default ModelMapper:
ModelMapper modelMapper = new ModelMapper();
modelMapper.addConverter(modelConverter);
ManufacturerLastModelDto result = modelMapper.map(manufacturer, ManufacturerLastModelDto.class);
Or, using a TypeMap:
TypeMap<Manufacturer, ManufacturerLastModelDto> typeMap = modelMapper.typeMap(Manufacturer.class, ManufacturerLastModelDto.class)
.addMappings(mapper ->
mapper.using(modelConverter).map(Manufacturer::getModels, ManufacturerLastModelDto::setModel)
);
ManufacturerLastModelDto result = typeMap.map(manufacturer);
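With either registration, the mapped DTO serializes with a single nested model object instead of an array, so the response should look roughly like this (illustrative values):
{
  "id": 1,
  "name": "SomeManufacturer",
  "model": {
    "id": 42,
    "name": "LatestModel"
  }
}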
Related
I am using Spring Boot 1.5.6 and ModelMapper 1.1.0. I would like to map the entity object to OrderDTO,
but I don't know how to do it via ModelMapper. Please find the code below.
Order.java
public class Order {
private String orderUid;
private Invoice invoice;
private List<Item> items;
//Getter & setter
}
Item.java
public class Item {
private String itemId;
private String itemName;
private String itemUid;
private String isbn;
//other details of item
//Getter & setter
}
OrderDTO.java
public class OrderDTO {
private String orderUid;
private Invoice invoice;
private String itemId;
private String itemName;
private String itemUid;
private String isbn;
//other details of item
//Getter & setter
}
I would like to return an OrderDTO populated with the item matching the itemId we are getting from the client (front end):
public Page<OrderDTO> convertOrderEntityToDTO (Page<Order> orderList,String itemId) {
ModelMapper modelMapper = new ModelMapper();
Type listType = new TypeToken<Page<OrderDTO>>() {}.getType();
modelMapper.addConverter((MappingContext<Order, OrderDTO> context) -> {
Item item = context.getSource().getItems().stream()
.filter(i -> equalsIgnoreCase(i.getItemId(), itemId))
.findAny().orElse(null);
if (item != null) {
OrderDTO orderDTO = context.getDestination();
orderDTO.setItemId(item.getItemId());
orderDTO.setItemName(item.getItemName());
orderDTO.setItemUid(item.getItemUid());
orderDTO.setIsbn(item.getIsbn());
return orderDTO;
}
return null;
});
Page<OrderDTO> addonServices = modelMapper.map(orderList, listType);
return addonServices;
}
In the above method, the converter is never called (maybe because of an incorrect TypePair in the ModelMapper), and the item-related attributes in OrderDTO are always null. I would like to set the item values based on the itemId.
Any help or hint would be appreciated, with or without ModelMapper.
As far as I understand, you use the Page class from org.springframework.data, or something like it. Anyway, this generic Page class contains a generic List with your data. I would say this construction is just "too generic" for ModelMapper; you'd better get the list of your data, convert it, and create a new Page.
Furthermore:
You should create a TypeMap to register a new converter.
context.getDestination() returns non-null only if you provide a destination object via modelMapper.map(source, destination); in your case you'll get null.
This is my take on your convertOrderEntityToDTO method:
public Page<OrderDTO> convertOrderEntityToDTO(Page<Order> orderList, String itemId) {
ModelMapper modelMapper = new ModelMapper();
modelMapper.createTypeMap(Order.class, OrderDTO.class)
.setConverter(context -> context.getSource().getItems().stream()
.filter(o -> equalsIgnoreCase(o.getItemId(), itemId))
.findAny().map(o -> {
OrderDTO orderDTO = new OrderDTO();
orderDTO.setItemId(o.getItemId());
orderDTO.setItemName(o.getItemName());
orderDTO.setItemUid(o.getItemUid());
orderDTO.setIsbn(o.getIsbn());
return orderDTO;
}).orElse(null));
List<OrderDTO> orderDtoList = orderList.getContent().stream()
.map(o -> modelMapper.map(o, OrderDTO.class))
.collect(Collectors.toList());
return new PageImpl<>(orderDtoList, orderList.getPageable(), orderList.getTotalElements());
}
If you are using Spring Data JpaRepositories to retrieve data from the database you can make a repository method that creates a DTO for you. Such a method would look something like this:
@Query("SELECT new full.path.OrderDTO(s.itemId, s.itemName, s.itemUid) FROM Item s WHERE s.itemId = :id")
public OrderDTO getOrderDTOByItemId(@Param("id") String id);
The method should go in the JpaRepository interface you use to retrieve Item instances from the database. To make this work, your OrderDTO needs a constructor with the parameter list (String itemId, String itemName, String itemUid), in the same order as (s.itemId, s.itemName, s.itemUid). Also make sure you always keep a default constructor when you add a constructor with parameters.
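For illustration, a sketch of the constructor pair OrderDTO would need (the fields are the ones already declared in the question; only the constructors are new):

// inside OrderDTO.java
public OrderDTO() {
    // default constructor, still required by JPA / Jackson
}

// matches the SELECT new ... expression: (s.itemId, s.itemName, s.itemUid)
public OrderDTO(String itemId, String itemName, String itemUid) {
    this.itemId = itemId;
    this.itemName = itemName;
    this.itemUid = itemUid;
}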
I have a JPA entity with a List of custom objects as one of its fields. Using a Jackson-based converter, I've managed to persist this list as a JSON array in a MySQL database, but I am unable to insert into this list after its initial creation.
I can successfully retrieve the existing list, add a new object in memory (and verify that it has been added), then save it via a Spring REST repository. However, the change never seems to persist. Any ideas? Here is my code (this is a Spring Boot project, FYI):
Candidate entity with a List inside
@Entity
@Table(name = "Candidates", schema = "Candidate")
public class Candidate extends ResourceSupport {
@Id
@Column(name = "CandidateID")
private Long candidateID;
// More fields
@Column(name = "Fields")
@Convert(converter = CollectionConverter.class)
private List<CandidateField> fields;
//Getters & setters
}
CandidateField class which makes up the List above. CandidateField is simply a POJO that models the JSON stored in a single field of the Candidate table; it is not an independent entity.
public class CandidateField {
private Long fieldID;
private String name;
private boolean current;
public CandidateField () {
}
public CandidateField (Long fieldID, String name, boolean current) {
this.fieldID = fieldID;
this.name = name;
this.current = current;
}
//Getters & Setters
}
Converter
public class CollectionConverter implements AttributeConverter<List<CandidateField>, String> {
private ObjectMapper objectMapper = new ObjectMapper();
@Override
public String convertToDatabaseColumn(List<CandidateField> object) {
try {
return objectMapper.writeValueAsString(object);
} catch (JsonProcessingException e) {
e.printStackTrace();
return "";
}
}
@Override
public List<CandidateField> convertToEntityAttribute(String data) {
try {
return objectMapper.readValue(data, new TypeReference<List<CandidateField>>() {});
} catch (Exception e) {
e.printStackTrace();
return null;
}
}
}
Code that persists to database
public void addField(Long fieldID, Long candidateID) {
Candidate candidate = repository.findOne(candidateID);
candidate.getFields().add(new CandidateField(fieldID, "", true));
repository.saveAndFlush(candidate);
}
Repository
@RepositoryRestResource
public interface CandidateRepository extends JpaRepository<Candidate, Long> {}
I can't seem to figure out why this won't persist. Any help will be very much appreciated. Cheers!
Consider defining the cascade type for your collection.
When you persist your Candidate objects the operation is not cascaded by default and thus you need to define it yourself unless you persist your CandidateField objects directly.
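For reference, this is roughly what declaring a cascade type on a collection looks like; note that it assumes the collection is mapped as a JPA relationship (i.e. CandidateField would itself have to be an @Entity), which differs from the @Convert approach shown in the question:

@OneToMany(cascade = CascadeType.ALL, orphanRemoval = true)
private List<CandidateField> fields;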
I'm trying to distinguish between null values and not-provided values when partially updating an entity with the PUT request method in a Spring REST controller.
Consider the following entity, as an example:
@Entity
private class Person {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Long id;
/* let's assume the following attributes may be null */
private String firstName;
private String lastName;
/* getters and setters ... */
}
My Person repository (Spring Data):
@Repository
public interface PersonRepository extends CrudRepository<Person, Long> {
}
The DTO I use:
private class PersonDTO {
private String firstName;
private String lastName;
/* getters and setters ... */
}
My Spring RestController:
@RestController
@RequestMapping("/api/people")
public class PersonController {
@Autowired
private PersonRepository people;
@Transactional
@RequestMapping(path = "/{personId}", method = RequestMethod.PUT)
public ResponseEntity<?> update(
@PathVariable String personId,
@RequestBody PersonDTO dto) {
// get the entity by ID
Person p = people.findOne(personId); // we assume it exists
// update ONLY entity attributes that have been defined
if (/* dto.getFirstName() is defined */)
p.setFirstName(dto.getFirstName());
if (/* dto.getLastName() is defined */)
p.setLastName(dto.getLastName());
return ResponseEntity.ok(p);
}
}
Request with missing property
{"firstName": "John"}
Expected behaviour: update firstName= "John" (leave lastName unchanged).
Request with null property
{"firstName": "John", "lastName": null}
Expected behaviour: update firstName="John" and set lastName=null.
I cannot distinguish between these two cases, since lastName in the DTO is always set to null by Jackson.
Note:
I know that REST best practices (RFC 6902) recommend using PATCH instead of PUT for partial updates, but in my particular scenario I need to use PUT.
Another option is to use java.util.Optional.
import com.fasterxml.jackson.annotation.JsonInclude;
import java.util.Optional;
@JsonInclude(JsonInclude.Include.NON_NULL)
private class PersonDTO {
private Optional<String> firstName;
private Optional<String> lastName;
/* getters and setters ... */
}
If firstName is not present in the request, the field stays null and is ignored thanks to the @JsonInclude annotation. If it is explicitly set to null in the request, firstName itself is not null, but the Optional it holds is empty. I found this while browsing the solution @laffuste linked to a little lower down in a different comment (garretwilson's initial comment saying it didn't work turns out to work).
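Applied to the update method from the question, the three cases can then be told apart roughly like this (a sketch; it assumes PersonDTO exposes the Optional fields through getters):

// null    -> property was absent from the request: leave the entity untouched
// empty   -> property was explicitly sent as null: clear the entity value
// present -> property was sent with a value: overwrite it
if (dto.getFirstName() != null) {
    p.setFirstName(dto.getFirstName().orElse(null));
}
if (dto.getLastName() != null) {
    p.setLastName(dto.getLastName().orElse(null));
}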
You can also map the DTO to the Entity with Jackson's ObjectMapper, and it will ignore properties that were not passed in the request object:
import com.fasterxml.jackson.databind.ObjectMapper;
class PersonController {
// ...
@Autowired
ObjectMapper objectMapper;
@Transactional
@RequestMapping(path = "/{personId}", method = RequestMethod.PUT)
public ResponseEntity<?> update(
@PathVariable String personId,
@RequestBody PersonDTO dto
) {
Person p = people.findOne(personId);
objectMapper.updateValue(p, dto);
people.save(p);
// return ...
}
}
Validating a DTO using java.util.Optional is a little different as well. It's documented here, but took me a while to find:
// ...
import javax.validation.constraints.NotNull;
import javax.validation.constraints.NotBlank;
import javax.validation.constraints.Pattern;
// ...
private class PersonDTO {
private Optional<@NotNull String> firstName;
private Optional<@NotBlank @Pattern(regexp = "...") String> lastName;
/* getters and setters ... */
}
In this case, firstName may be left out entirely, but if it is present it must not be null when PersonDTO is validated.
//...
import javax.validation.Valid;
//...
public ResponseEntity<?> update(
@PathVariable String personId,
@RequestBody @Valid PersonDTO dto
) {
// ...
}
It might also be worth mentioning that using Optional as a field type is highly debated, and as of writing Lombok's maintainers won't support it (see this question, for example). This means using lombok.Data/lombok.Setter on a class with constrained Optional fields doesn't work: it attempts to generate setters with the constraints intact, so using @Setter/@Data causes an exception because both the setter and the member variable carry constraints. It also seems better form to write the setter without an Optional parameter, for example:
//...
import lombok.Getter;
//...
@Getter
private class PersonDTO {
private Optional<@NotNull String> firstName;
private Optional<@NotBlank @Pattern(regexp = "...") String> lastName;
public void setFirstName(String firstName) {
this.firstName = Optional.ofNullable(firstName);
}
// etc...
}
There is a better option that does not involve changing your DTOs or customizing your setters.
It lets Jackson merge data into an existing data object, as follows:
MyData existingData = ...
ObjectReader readerForUpdating = objectMapper.readerForUpdating(existingData);
MyData mergedData = readerForUpdating.readValue(newData);
Any fields not present in newData will not overwrite data in existingData, but if a field is present it will be overwritten, even if it contains null.
Demo code:
ObjectMapper objectMapper = new ObjectMapper();
MyDTO dto = new MyDTO();
dto.setText("text");
dto.setAddress("address");
dto.setCity("city");
String json = "{\"text\": \"patched text\", \"city\": null}";
ObjectReader readerForUpdating = objectMapper.readerForUpdating(dto);
MyDTO merged = readerForUpdating.readValue(json);
Results in {"text": "patched text", "address": "address", "city": null}
Note that text and city were patched (city is now null) and that address was left alone.
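For completeness, the demo assumes a plain DTO along these lines (hypothetical class, with only the three fields used above):

public class MyDTO {
    private String text;
    private String address;
    private String city;

    public String getText() { return text; }
    public void setText(String text) { this.text = text; }
    public String getAddress() { return address; }
    public void setAddress(String address) { this.address = address; }
    public String getCity() { return city; }
    public void setCity(String city) { this.city = city; }
}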
In a Spring Rest Controller you will need to get the original JSON data instead of having Spring deserialize it in order to do this. So change your endpoint like this:
@Autowired ObjectMapper objectMapper;
@RequestMapping(path = "/{personId}", method = RequestMethod.PATCH)
public ResponseEntity<?> update(
@PathVariable String personId,
@RequestBody JsonNode jsonNode) {
RequestDTO existingData = getExistingDataFromSomewhere();
ObjectReader readerForUpdating = objectMapper.readerForUpdating(existingData);
RequestDTO mergedData = readerForUpdating.readValue(jsonNode);
...
}
Use boolean flags, as Jackson's author recommends.
class PersonDTO {
private String firstName;
private boolean isFirstNameDirty;
public void setFirstName(String firstName){
this.firstName = firstName;
this.isFirstNameDirty = true;
}
public String getFirstName() {
return firstName;
}
public boolean hasFirstName() {
return isFirstNameDirty;
}
}
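In the update endpoint from the question, the flag can then drive the partial update, for example (a sketch; hasFirstName() is the accessor defined above):

if (dto.hasFirstName()) {
    // the client sent firstName, possibly as an explicit null
    p.setFirstName(dto.getFirstName());
}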
Actually, if you ignore validation, you can solve your problem like this.
public class BusDto {
private Map<String, Object> changedAttrs = new HashMap<>();
/* getter and setter */
}
First, write a superclass for your DTO, like BusDto.
Second, change your DTO to extend the superclass, and change the DTO's setters to put the attribute name and value into changedAttrs (because Spring invokes the setter whenever the attribute is present in the request, whether its value is null or not).
Third, traverse the map, as sketched below.
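A minimal sketch of the idea, using a hypothetical PersonDto that extends BusDto (field name chosen to match the earlier examples):

public class PersonDto extends BusDto {
    private String firstName;

    public void setFirstName(String firstName) {
        this.firstName = firstName;
        // record that the client actually sent this attribute, even if it was null
        getChangedAttrs().put("firstName", firstName);
    }

    public String getFirstName() {
        return firstName;
    }
}

When applying the update, iterating over changedAttrs tells you exactly which properties were provided, including those explicitly set to null.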
I have tried to solve the same problem. I found it quite easy to use JsonNode as the DTO. This way you only get what was submitted.
You will need to write a MergeService yourself that does the actual work, similar to the BeanWrapper. I haven't found an existing framework that can do exactly what is needed. (If you only handle JSON requests, you might be able to use Jackson's readerForUpdating method.)
We actually use another node type, as we need the same functionality for "standard form submits" and other service calls. Additionally, the modifications should be applied within a transaction inside something called an EntityService.
This MergeService will unfortunately become quite complex, as you will need to handle properties, lists, sets and maps yourself :)
The most problematic piece for me was to distinguish between changes within an element of a list/set and modifications or replacements of whole lists/sets.
And validation will not be easy either, as you need to validate some properties against another model (the JPA entities in my case).
EDIT - Some mapping code (pseudo-code):
class SomeController {
@RequestMapping(value = { "/{id}" }, method = RequestMethod.POST, consumes = MediaType.APPLICATION_JSON_VALUE)
@ResponseBody
public void save(
@PathVariable("id") final Integer id,
@RequestBody final JsonNode modifications) {
modifierService.applyModifications(someEntityLoadedById, modifications);
}
}
class ModifierService {
public void applyModifications(Object updateObj, JsonNode node)
throws Exception {
BeanWrapperImpl bw = new BeanWrapperImpl(updateObj);
Iterator<String> fieldNames = node.fieldNames();
while (fieldNames.hasNext()) {
String fieldName = fieldNames.next();
Object valueToBeUpdated = node.get(fieldName);
Class<?> propertyType = bw.getPropertyType(fieldName);
if (propertyType == null) {
if (!ignoreUnknown) {
throw new IllegalArgumentException("Unknown field " + fieldName + " on type " + bw.getWrappedClass());
}
} else if (Map.class.isAssignableFrom(propertyType)) {
handleMap(bw, fieldName, valueToBeUpdated, ModificationType.MODIFY, createdObjects);
} else if (Collection.class.isAssignableFrom(propertyType)) {
handleCollection(bw, fieldName, valueToBeUpdated, ModificationType.MODIFY, createdObjects);
} else {
handleObject(bw, fieldName, valueToBeUpdated, propertyType, createdObjects);
}
}
}
}
Maybe too late for an answer, but you could:
By default, not unset null values. Provide an explicit list via query params of the fields you want to unset (see the sketch below). That way you can still send JSON that corresponds to your entity and keep the flexibility to unset fields when you need to.
Depending on your use case, some endpoints may explicitly treat all null values as unset operations. A little dangerous for patching, but in some circumstances it might be an option.
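A rough sketch of the first idea, reusing the Person example from the question (hypothetical endpoint; unsetFields is just an illustrative parameter name):

@RequestMapping(path = "/{personId}", method = RequestMethod.PUT)
public ResponseEntity<?> update(
        @PathVariable Long personId,
        @RequestBody PersonDTO dto,
        @RequestParam(name = "unsetFields", required = false) List<String> unsetFields) {
    Person p = people.findOne(personId);
    if (dto.getFirstName() != null) {
        p.setFirstName(dto.getFirstName());
    }
    if (dto.getLastName() != null) {
        p.setLastName(dto.getLastName());
    }
    // explicitly requested unsets override whatever was (not) sent in the body
    if (unsetFields != null && unsetFields.contains("lastName")) {
        p.setLastName(null);
    }
    return ResponseEntity.ok(p);
}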
Another solution is to deserialize the request body imperatively. That way you can collect the user-provided fields and validate them selectively.
So your DTO might look like this:
public class CatDto {
@NotBlank
private String name;
@Min(0)
@Max(100)
private int laziness;
@Max(3)
private int purringVolume;
}
And your controller can be something like this:
@RestController
@RequestMapping("/api/cats")
@io.swagger.v3.oas.annotations.parameters.RequestBody(
content = @Content(schema = @Schema(implementation = CatDto.class)))
// ^^ this passes your CatDto model to swagger (you must use springdoc to get it to work!)
public class CatController {
@Autowired
SmartValidator validator; // we'll use this to validate our request
@PatchMapping(path = "/{id}", consumes = "application/json")
public ResponseEntity<String> updateCat(
@PathVariable String id,
@RequestBody Map<String, Object> body
// ^^ no Valid annotation, no declarative DTO binding here!
) throws MethodArgumentNotValidException {
CatDto catDto = new CatDto();
WebDataBinder binder = new WebDataBinder(catDto);
BindingResult bindingResult = binder.getBindingResult();
List<String> patchFields = new ArrayList<>();
binder.bind(new MutablePropertyValues(body));
// ^^ imperatively bind to DTO
body.forEach((k, v) -> {
patchFields.add(k);
// ^^ collect user provided fields if you need
validator.validateValue(CatDto.class, k, v, bindingResult);
// ^^ imperatively validate user input
});
if (bindingResult.hasErrors()) {
throw new MethodArgumentNotValidException(null, bindingResult);
// ^^ this can be handled by your regular exception handler
}
// Here you can do normal stuff with your catDto.
// Map it to cat model, send to cat service, whatever.
return ResponseEntity.ok("cat updated");
}
}
No need for Optionals, no extra dependencies, your normal validation just works, and your Swagger docs look good. The only problem is that you don't get a proper merge patch on nested objects, but in many use cases that's not even required.
Probably too late, but the following code works for me to distinguish between null and not-provided values:
if(dto.getIban() == null){
log.info("Iban value is not provided");
}else if(dto.getIban().orElse(null) == null){
log.info("Iban is provided and has null value");
}else{
log.info("Iban value is : " + dto.getIban().get());
}
I have these two classes:
@JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "id", scope = Rol.class)
public class Rol extends MyEntity implements Serializable {
private Integer id;
private String rolName;
public Rol(Integer id, String rolName) {
this.id = id;
this.rolName = rolName;
}
...
}
@JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "id", scope = User.class)
public class User extends MyEntity implements Serializable {
private Integer id;
private String name;
private List<Rol> rolList;
public User(Integer id, String name, List<Rol> rolList) {
this.id = id;
this.name = name;
this.rolList = rolList;
}
...
}
and I try to serialize and deserialize the User object as follows:
Rol rol1 = new Rol(1, "MyRol");
Rol rol2 = new Rol(1, "MyRol");
List<Rol> rolList = new ArrayList<>();
rolList.add(rol1);
rolList.add(rol2);
User user = new User(1, "MyUser", rolList);
ObjectMapper mapper = new ObjectMapper();
String jsonString = mapper.writeValueAsString(user);
User userJson = mapper.readValue(jsonString, User.class);
and a JsonMappingException ("Already had POJO for id") is produced. Why?
When I review the JSON result of the serialization, I see that it is
{"id": 1,"name": "MyUser","rolList": [{"id": 1,"rolName": "MyRol"},{"id": 1,"rolName": "MyRol"}]}
when the result should be
{"id": 1,"name": "MyUser","rolList": [{"id": 1,"rolName": "MyRol"},1]}
because rol1 and rol2 are different instances of the same POJO, identified by id 1.
How can I avoid the JsonMappingException? In my project I have several different instances of the same POJO, and I can guarantee that if the ids are equal, the objects are equal.
Excuse me for my bad English.
For anyone returning to this question, it looks like there's an option to do this with a custom ObjectIdResolver in Jackson. You can specify it on the @JsonIdentityInfo annotation, e.g.:
@JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "name",
resolver = CustomObjectIdResolver.class)
Then, to get going, perhaps wrap the normal SimpleObjectIdResolver class and customise bindItem().
In my case I wanted to avoid overlapping object ids, so I cleared the references whenever I started a new Something:
public class CustomObjectIdResolver implements ObjectIdResolver {
private ObjectIdResolver objectIdResolver;
public CustomObjectIdResolver() {
clearReferences();
}
@Override
public void bindItem(IdKey id, Object pojo) {
// Time to drop the references?
if (pojo instanceof Something)
clearReferences();
objectIdResolver.bindItem(id, pojo);
}
@Override
public Object resolveId(IdKey id) {
return objectIdResolver.resolveId(id);
}
@Override
public boolean canUseFor(ObjectIdResolver resolverType) {
return resolverType.getClass() == getClass();
}
@Override
public ObjectIdResolver newForDeserialization(Object context) {
return new CustomObjectIdResolver();
}
private void clearReferences() {
objectIdResolver = new SimpleObjectIdResolver();
}
}
In this case, Jackson expects a different id for each distinct class instance. There has been a previous discussion about this on GitHub here. Overriding hashCode and equals will not help; object references must match for an equal id.
Options
Reuse Rol instances instead of creating new ones with equal fields (see the sketch below). As a bonus you will also save memory.
Modify the application logic so that it doesn't depend on @JsonIdentityInfo.
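A minimal sketch of the first option, reusing the classes from the question:

Rol rol = new Rol(1, "MyRol");
List<Rol> rolList = new ArrayList<>();
// the same instance is referenced twice, so the object id 1 resolves to a single POJO
rolList.add(rol);
rolList.add(rol);
User user = new User(1, "MyUser", rolList);

ObjectMapper mapper = new ObjectMapper();
String jsonString = mapper.writeValueAsString(user);
User userJson = mapper.readValue(jsonString, User.class);

With a single shared instance, the serialized list contains the full object once and just the id 1 for the second entry, which is exactly the output the question expected.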
===== POJO =====
// Employee POJO
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonNaming(PropertyNamingStrategy.LowerCaseWithUnderscoresStrategy.class)
public class Employee implements Serializable {
private Integer id;
private String name;
private Integer companyId;
// assume getters ,setters and serializable implementations.
}
// Company POJO
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonNaming(PropertyNamingStrategy.LowerCaseWithUnderscoresStrategy.class)
public class Company implements Serializable {
private Integer id;
private String name;
// assume getters ,setters and serializable implementations.
}
// EmployeeVO POJO
@JsonInclude(JsonInclude.Include.NON_NULL)
@JsonNaming(PropertyNamingStrategy.LowerCaseWithUnderscoresStrategy.class)
public class EmployeeVO implements Serializable {
private Employee employee;
private Company company;
// assume getters ,setters and serializable implementations.
}
===== My DAO layer class =====
public List<EmployeeVO> getEmployees(){
// configuring model mapper.
ModelMapper modelMapper = new ModelMapper();
modelMapper.getConfiguration()
.addValueReader(new RecordValueReader())
.setSourceNameTokenizer(NameTokenizers.UNDERSCORE);
//property map configuration.
PropertyMap<Record, EmployeeVO> employeeVOMap = new PropertyMap<Record, EmployeeVO>() {
protected void configure() {
map().getEmployee().setName(this.<String>source("name"));
map().getEmployee().setId(this.<Integer>source("id"));
map().getCompany().setName(this.<String>source("comp_name"));
map().getCompany().setId(this.<Integer>source("comp_id"));
}
};
// TypeMap config
modelMapper.createTypeMap(Record.class, EmployeeVO.class);
// adding employeeVOMap .
modelMapper.addMappings(employeeVOMap);
// JOOQ query
List<Field<?>> fields = Lists.newArrayList();
// fields includes, id, name, comp_name, comp_id
SelectJoinStep query = select(dslContext, fields).from(EMPLOYEE)
.join(COMPANY)
.on(COMPANY.ID.equal(EMPLOYEE.COMPANY_ID));
Result<Record> records = query.fetch();
Record record = null;
Iterator<Record> it = records.iterator();
List<EmployeeVO> employeeList= Lists.newArrayList();
while (it.hasNext()) {
record = it.next();
EmployeeVO employeeVOObj =
modelMapper.map(record, EmployeeVO.class);
employeeList.add(employeeVOObj);
}
return employeeList;
}
===== Error log =====
1) Error mapping org.jooq.impl.RecordImpl to com.myportal.bingo.db.model.EmployeeVO
1 error] with root cause
java.lang.ArrayIndexOutOfBoundsException: -1
Note:
ModelMapper throws the above exception when it reaches the method
private void matchSource(TypeInfo<?> sourceTypeInfo, Mutator destinationMutator)
in ImplicitMappingBuilder.java, where sourceTypeInfo.getAccessors() is null.
Any help?
I had the same problem, or at least one that looked the same. (You can skip directly to my solution in the last paragraph.) A lot of debugging showed the following:
If accessors on that line (mentioned in your question) are null, then the line accessors = PropertyInfoSetResolver.resolveAccessors(source, type, configuration) in the TypeInfoImpl class is executed. The reason for the exception in my case was the call
valueReader.get(source, memberName) in the following piece of code in the resolveAccessors method of the PropertyInfoSetResolver class:
if (valueReader == null)
resolveProperties(type, true, configuration, accessors);
else {
NameTransformer nameTransformer = configuration.getSourceNameTransformer();
for (String memberName : valueReader.memberNames(source))
accessors.put(nameTransformer.transform(memberName, NameableType.GENERIC),
new ValueReaderPropertyInfo(valueReader, valueReader.get(source, memberName),
memberName));
which ends up in source.getValue(memberName.toUpperCase()), where source is a JOOQ Record (an InvoiceRecord in my case). And, tada: for some reason invoice.getValue("INVOICE_ID") ends up in the exception (there is no such field, so indexOf returns -1, which causes the ArrayIndexOutOfBoundsException), while invoice.getValue("invoice_id") is totally fine.
So the else branch (in the piece of code above) was not the right path for the code to take; the if branch turned out to be fine.
So this is what helped in my particular case: removing the line modelMapper.getConfiguration().addValueReader(new RecordValueReader()). Hope this helps you too.
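In other words, the configuration from the question would shrink to something like this (a sketch; whether the UNDERSCORE tokenizer is still needed depends on how the rest of the mapping resolves):

ModelMapper modelMapper = new ModelMapper();
// no RecordValueReader registered here
modelMapper.getConfiguration()
        .setSourceNameTokenizer(NameTokenizers.UNDERSCORE);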