I have a JPA entity with a List of custom objects as one of its fields. Using a Jackson converter, I've managed to persist this list as a JSON array in a MySQL database, but I am unable to insert into this list after its initial creation.
I can successfully retrieve the existing list, add a new object in memory (and verify that it has been added), then save it via a Spring REST repository. However, it never seems to persist. Any ideas? Here is my code (this is a Spring Boot project, FYI):
Candidate entity with a List inside
@Entity
@Table(name = "Candidates", schema = "Candidate")
public class Candidate extends ResourceSupport {
    @Id
    @Column(name = "CandidateID")
    private Long candidateID;
    // More fields
    @Column(name = "Fields")
    @Convert(converter = CollectionConverter.class)
    private List<CandidateField> fields;
    // Getters & setters
}
CandidateField class which makes up the List above. CandidateField is simply a POJO that models the JSON stored in a single column of the Candidate table; it is not an independent entity.
public class CandidateField {
    private Long fieldID;
    private String name;
    private boolean current;

    public CandidateField() {
    }

    public CandidateField(Long fieldID, String name, boolean current) {
        this.fieldID = fieldID;
        this.name = name;
        this.current = current;
    }
    // Getters & setters
}
Converter
public class CollectionConverter implements AttributeConverter<List<CandidateField>, String> {
    private ObjectMapper objectMapper = new ObjectMapper();

    @Override
    public String convertToDatabaseColumn(List<CandidateField> object) {
        try {
            return objectMapper.writeValueAsString(object);
        } catch (JsonProcessingException e) {
            e.printStackTrace();
            return "";
        }
    }

    @Override
    public List<CandidateField> convertToEntityAttribute(String data) {
        try {
            return objectMapper.readValue(data, new TypeReference<List<CandidateField>>() {});
        } catch (Exception e) {
            e.printStackTrace();
            return null;
        }
    }
}
Code that persists to database
public void addField(Long fieldID, Long candidateID) {
    Candidate candidate = repository.findOne(candidateID);
    candidate.getFields().add(new CandidateField(fieldID, "", true));
    repository.saveAndFlush(candidate);
}
Repository
@RepositoryRestResource
public interface CandidateRepository extends JpaRepository<Candidate, Long> {}
I can't seem to figure out why this won't persist. Any help will be very much appreciated. Cheers!
Consider defining the cascade type for your collection.
When you persist your Candidate objects, the operation is not cascaded by default, so you need to define it yourself unless you persist your CandidateField objects directly.
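For example, if CandidateField were mapped as its own entity rather than converted to a JSON column, the cascade could be declared on the association (a sketch, not the original mapping):

// Hypothetical alternative: CandidateField as a separate @Entity instead of a converted column;
// CascadeType.ALL propagates persist/merge operations from Candidate to its fields.
@OneToMany(cascade = CascadeType.ALL, orphanRemoval = true)
@JoinColumn(name = "CandidateID")
private List<CandidateField> fields;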
I had a class like:
public class EmailAddress {
    public String value;
    public String tld() {...}
    public String host() {...}
    public String mailbox() {...}
}
Now I use this class in an Object / Entity:
@Entity
public class Customer {
    public String name;
    public EmailAddress mail;
}
Now, when I expose a REST service for Customer, I get this format:
{
    "id": 1,
    "name": "Test",
    "email": {
        "value": "test@test.de"
    }
}
But I only want "email": "test@test.de":
{
    "id": 1,
    "name": "Test",
    "email": "test@test.de"
}
What must I do? I'm using Spring Boot and Hibernate entities.
Thank you for any support.
You should use a DTO class in request handling and map from the DTO to the entity and back, e.g.:
public class CustomerDTO {
    private Integer id;
    private String name;
    private String email;
}
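A rough sketch of one direction of that mapping, assuming getters on the DTO (Customer and EmailAddress are the classes from the question, which expose public fields; the toEntity helper name is illustrative):

// Hypothetical helper converting the DTO into the entity and its embedded value object.
public static Customer toEntity(CustomerDTO dto) {
    Customer customer = new Customer();
    customer.name = dto.getName();
    EmailAddress address = new EmailAddress();
    address.value = dto.getEmail();
    customer.mail = address;
    return customer;
}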
You should use Data Transfer Objects (DTOs) for your (REST) APIs.
The DTOs only contain the fields the interface should provide (or receive).
When receiving objects from the client, and before returning objects from your controller, you can convert between the DTOs and your domain model (which could be your JPA entity classes).
Example of a controller method. We assume you get an object from a user editor which contains all the data you want to update in your database objects, and you return the updated customer DTO:
@PutMapping
public CustomerDto updateCustomer(CustomerEditorDto updatedCustomerDto) {
    Customer updatedCustomer = CustomerConverter.convert(updatedCustomerDto);
    updatedCustomer = customerService.updateCustomer(updatedCustomer);
    return CustomerConverter.convert(updatedCustomer);
}
and your Converter class:
@NoArgsConstructor(access = AccessLevel.PRIVATE)
public class CustomerConverter {

    public static CustomerDto convert(Customer customer) {
        CustomerDto result = null;
        if (customer != null) {
            // TODO: set fields in result-dto
        }
        return result;
    }

    public static Customer convert(CustomerEditorDto customer) {
        Customer result = null;
        if (customer != null) {
            // TODO: set fields in result
        }
        return result;
    }
}
and here are the DTOs
@Getter
@Setter
public class CustomerDto {
    private Integer id;
    private String name;
    private String email;
}

@Getter
@Setter
public class CustomerEditorDto {
    private Integer id;
    private String firstName;
    private String lastName;
    private String email;
    private String otherPropertyOrStuff;
}
This way you can separate the API model from your JPA entities. You can use the same models for input and output, and you can even use a different model to work with inside your services and then finally convert it into your JPA entities before persisting the data (or after reading it).
There are tools which can take care of the conversion, like MapStruct.
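A minimal MapStruct sketch, assuming the DTO classes above (the CustomerMapper name is illustrative; MapStruct generates the implementation at compile time, and fields whose names differ, such as firstName/lastName vs. name, would still need explicit @Mapping annotations):

// Hypothetical mapper interface; MapStruct generates the implementing class.
@Mapper(componentModel = "spring")
public interface CustomerMapper {
    CustomerDto toDto(Customer customer);
    Customer toEntity(CustomerEditorDto dto);
}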
* The annotations @Getter, @Setter, ... above are from Project Lombok and are very handy for generating boilerplate code automatically.
I found another, easier solution: use a JsonSerializer on the entity property:
@JsonSerialize(using = EmailAddressSerializer.class)
private EmailAddress email;
The serializer class:
public class EmailAddressSerializer extends StdSerializer<EmailAddress> {

    public EmailAddressSerializer() {
        super(EmailAddress.class);
    }

    protected EmailAddressSerializer(Class<EmailAddress> t) {
        super(t);
    }

    @Override
    public void serialize(EmailAddress email,
                          JsonGenerator jsonGenerator,
                          SerializerProvider serializerProvider) throws IOException {
        jsonGenerator.writeString(email.value);
    }
}
I have the following Entities
public class Manufacturer
{
    int id;
    String name;
    Country country;
    List<Model> models;
}

public class Model
{
    int id;
    String name;
}
And the following DTOs
public class ManufacturerLastModelDto
{
    Integer id;
    String name;
    ModelDto model;
}

public class ModelDto
{
    int id;
    String name;
}
Now I want to map the Manufacturer to the ManufacturerLastModelDto, like this:
modelMapper.map(manufacturer, ManufacturerLastModelDto.class)
so that only the first entry of the models list is assigned from the manufacturer.
My previous solution was to have a List of ModelDtos in the DTO as well and remove all entries after index 0. That was OK, because ModelMapper mapped the child from Model to ModelDto automatically.
But the response wasn't so nice:
models: [
{...}
]
because it was sent as an array.
Do I need a custom ModelMapper here? If so, how do I build it? The tutorial is really complex. Do I need a Converter or a TypeMap (or both)?
I am not too familiar with ModelMapper but have used it occasionally.
Yes, you would need to create a converter for the property, and you could use it either with the ModelMapper directly or with a TypeMap. The only caveat is that you will need to map the property yourself, for instance:
The converter
Converter<List<Model>, ModelDto> modelConverter = new AbstractConverter<List<Model>, ModelDto>() {
    @Override
    protected ModelDto convert(List<Model> models) {
        if (models == null || models.isEmpty()) {
            return null;
        }
        Model model = models.get(0);
        ModelDto dto = new ModelDto();
        dto.setId(model.getId());
        dto.setName(model.getName());
        return dto;
    }
};
Now, using the default ModelMapper
ModelMapper modelMapper = new ModelMapper();
modelMapper.addConverter(modelConverter);
ManufacturerLastModelDto result = modelMapper.map(manufacturer, ManufacturerLastModelDto.class);
Or using the TypeMap
TypeMap<Manufacturer, ManufacturerLastModelDto> typeMap = modelMapper.typeMap(Manufacturer.class, ManufacturerLastModelDto.class)
        .addMappings(mapper ->
                mapper.using(modelConverter).map(Manufacturer::getModels, ManufacturerLastModelDto::setModel)
        );
ManufacturerLastModelDto result = typeMap.map(manufacturer);
I have in my controller:
@RestController
public class OneTwoController {
    private OneTwoService _service;
    // ... more code

    @PostMapping("/api/one-two")
    @CrossOrigin
    public ResponseEntity<ServiceResponse> save(@RequestBody OneTwo model) {
        return ResponseEntity.ok().body(_service.Save(model));
    }
}
In my entity:
@Entity(name = "OneTwo")
@Where(clause = "deleted='false'")
public class OneTwo {
    @EmbeddedId
    private OneTwoKey _id;

    public OneTwo(OneTwoKey id) {
        this._id = id;
    }

    @JsonProperty("oneTwo")
    public void setId(OneTwoKey value) {
        this._id = value;
    }
}
The OneTwoKey class:
public class OneTwoKey implements Serializable {
    @Column(name = "OneID")
    private int _oneID;

    @Column(name = "TwoID")
    private int _twoID;

    public OneTwoKey(int oneID, int twoID) {
        this._oneID = oneID;
        this._twoID = twoID;
    }
}
The JSON that I send to the REST API:
{
    "oneTwo": {
        "oneID": 83,
        "twoID": 69
    },
    "deleted": true
}
The issue is that both ids arrive null, so the service can't do the insert in the DB.
How can I deal with cases like this, where there is more than one id?
Try adding setters in the OneTwoKey class to make it easier for the JSON deserializer:
#JsonProperty("oneID")
public void setOneID(int oneID) {
this._oneID = oneID;
}
#JsonProperty("twoID")
public void setTwoID(int twoID) {
this._twoID = twoID;
}
Another solution is to create a DTO, use it to receive the data in the controller and then convert it to your entity:
public class OneTwoDTO {
    private Map<String, Integer> oneTwo;
    private boolean deleted;
    // setters & getters
}
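A rough sketch of how the controller could use that DTO (assumes getters on OneTwoDTO; the map keys follow the JSON from the question, and _service.Save is the existing service call from the controller above):

// Hypothetical controller method converting the DTO into the entity before saving.
@PostMapping("/api/one-two")
public ResponseEntity<ServiceResponse> save(@RequestBody OneTwoDTO dto) {
    OneTwoKey key = new OneTwoKey(dto.getOneTwo().get("oneID"), dto.getOneTwo().get("twoID"));
    OneTwo model = new OneTwo(key);
    return ResponseEntity.ok().body(_service.Save(model));
}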
Simply, what you can do is: instead of using
public ResponseEntity<ServiceResponse> save(@RequestBody OneTwo model) {
you can use
public ResponseEntity<ServiceResponse> save(@RequestBody String model) {
Now convert the String to JSON and get all the key/value pairs. This is easier if you have a dynamic number of variables and you want to capture them all.
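For example, a sketch using Jackson's ObjectMapper to read the raw body (the field names follow the JSON shown in the question):

// model is the raw @RequestBody String; readTree throws a checked exception
// that the method must declare or handle.
ObjectMapper objectMapper = new ObjectMapper();
JsonNode root = objectMapper.readTree(model);
int oneID = root.path("oneTwo").path("oneID").asInt();
int twoID = root.path("oneTwo").path("twoID").asInt();
boolean deleted = root.path("deleted").asBoolean();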
Alternatively, you can use tools like jsonschema2pojo, which take a JSON schema and generate a POJO. In the JSON schema, if you set
"additionalProperties": true
you can capture all the values.
Could you make sure the problem is not because of case sensitivity?
Lower-case the column names. Could you also use public access on those variables? These are my initial guesses as to why the payload is not being bound correctly.
public class OneTwoKey implements Serializable {
    @Column(name = "oneID")
    public int _oneID;

    @Column(name = "twoID")
    public int _twoID;
I'm using a Converter class to store a complex class as JSON text in MySQL. When I add a new entity, the Converter class works as intended. However, when I update the entity, the data in the complex class is not updated in the database, although it is updated in memory. Other attributes, such as latitude and longitude, are updated. A breakpoint placed in the convertToDatabaseColumn method did not trigger on update.
Object Class
public class Project {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private long id;
    private String city;
    private String state;
    private String country;
    @Enumerated(EnumType.STRING)
    private StatusType status;
    private String street;
    private double latitude;
    private double longitude;
    @Convert(converter = ProjectPropertyConverter.class)
    private ProjectProperty property;
}

public class ProjectProperty {
    private String description;
    private List<String> projectImgs;
    private Boolean hasImages;
}
Property Converter Class
@Converter(autoApply = true)
public class ProjectPropertyConverter implements AttributeConverter<ProjectProperty, String> {

    @Override
    public String convertToDatabaseColumn(ProjectProperty prop) {
        try {
            ObjectMapper mapper = new ObjectMapper();
            String jsonString = mapper.writeValueAsString(prop);
            return jsonString;
        } catch (Exception e) {
            System.out.print(e.toString());
            return null;
        }
    }

    @Override
    public ProjectProperty convertToEntityAttribute(String jsonValue) {
        try {
            ObjectMapper mapper = new ObjectMapper();
            ProjectProperty p = mapper.readValue(jsonValue, ProjectProperty.class);
            if (p.getProjectImgs().isEmpty()) {
                p.setHasImages(Boolean.FALSE);
            } else {
                p.setHasImages(Boolean.TRUE);
            }
            return p;
        } catch (Exception e) {
            System.out.print(e.toString());
            return null;
        }
    }
}
Method to Update Database
public void modifyEntity(Object entity, String query, HashMap params) {
    try {
        tx.begin();
        em.flush();
        tx.commit();
    } catch (Exception e) {
        e.toString();
    }
}
I came here looking for the same answers. It turns out the problem is that JPA doesn't know your object is dirty. This was solved by implementing the equals()/hashCode() methods on the complex object; in your example, implement equals and hashCode for ProjectProperty.
Once that is done, JPA is able to detect via these methods that the underlying object is dirty, and it converts and persists it.
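A rough sketch of what that could look like for ProjectProperty (field names taken from the class above; uses java.util.Objects):

// Value-based equality so the provider's dirty check can compare the attribute by value.
@Override
public boolean equals(Object o) {
    if (this == o) return true;
    if (!(o instanceof ProjectProperty)) return false;
    ProjectProperty that = (ProjectProperty) o;
    return Objects.equals(description, that.description)
            && Objects.equals(projectImgs, that.projectImgs)
            && Objects.equals(hasImages, that.hasImages);
}

@Override
public int hashCode() {
    return Objects.hash(description, projectImgs, hasImages);
}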
I have these two classes:
@JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "id", scope = Rol.class)
public class Rol extends MyEntity implements Serializable {
    private Integer id;
    private String rolName;

    public Rol(Integer id, String rolName) {
        this.id = id;
        this.rolName = rolName;
    }
    ...
}

@JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "id", scope = User.class)
public class User extends MyEntity implements Serializable {
    private Integer id;
    private String name;
    private List<Rol> rolList;

    public User(Integer id, String name, List<Rol> rolList) {
        this.id = id;
        this.name = name;
        this.rolList = rolList;
    }
    ...
}
and I try to serialize and deserialize the User object as follows:
Rol rol1 = new Rol(1, "MyRol");
Rol rol2 = new Rol(1, "MyRol");
List<Rol> rolList = new ArrayList<>();
rolList.add(rol1);
rolList.add(rol2);
User user = new User(1, "MyUser", rolList);
ObjectMapper mapper = new ObjectMapper();
String jsonString = mapper.writeValueAsString(user);
User userJson = mapper.readValue(jsonString, User.class);
and a JsonMappingException: Already had POJO for id is produced. Why?
When I review the JSON result of the serialization, I see that the result is
{"id": 1, "name": "MyName", "rolList": [{"id": 1, "rolName": "MyRol"}, {"id": 1, "rolName": "MyRol"}]}
when the result should be
{"id": 1, "name": "MyName", "rolList": [{"id": 1, "rolName": "MyRol"}, 1]}
because rol1 and rol2 are different instances of the same POJO, identified by id 1.
How can I avoid the JsonMappingException? In my project I have several different instances of the same POJO. I can guarantee that if the ids are equal, the objects are equal.
Excuse me for my bad English.
For anyone returning to this question, it looks like there's an option to do this with a custom ObjectIdResolver in Jackson. You can specify this on the @JsonIdentityInfo annotation, e.g.:
@JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property = "name",
        resolver = CustomObjectIdResolver.class)
Then perhaps wrap the normal SimpleObjectIdResolver class to get going and customise bindItem().
In my case I wanted to avoid overlapping object ids, so I cleared down the references whenever I started a new Something:
public class CustomObjectIdResolver implements ObjectIdResolver {

    private ObjectIdResolver objectIdResolver;

    public CustomObjectIdResolver() {
        clearReferences();
    }

    @Override
    public void bindItem(IdKey id, Object pojo) {
        // Time to drop the references?
        if (pojo instanceof Something)
            clearReferences();
        objectIdResolver.bindItem(id, pojo);
    }

    @Override
    public Object resolveId(IdKey id) {
        return objectIdResolver.resolveId(id);
    }

    @Override
    public boolean canUseFor(ObjectIdResolver resolverType) {
        return resolverType.getClass() == getClass();
    }

    @Override
    public ObjectIdResolver newForDeserialization(Object context) {
        return new CustomObjectIdResolver();
    }

    private void clearReferences() {
        objectIdResolver = new SimpleObjectIdResolver();
    }
}
In this case Jackson expects a different id for each distinct instance. There has been a previous discussion about this on GitHub. Overriding hashCode and equals will not help; object references must match for equal ids.
Options
Reuse Rol instances instead of making new ones with equal fields (see the sketch below); as a bonus you will also save memory.
Modify the application logic so that it doesn't depend on @JsonIdentityInfo.
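A sketch of the first option, reusing a single instance:

// One shared Rol instance, added twice, so Jackson sees the same object reference for the repeated id.
Rol rol = new Rol(1, "MyRol");
List<Rol> rolList = new ArrayList<>();
rolList.add(rol);
rolList.add(rol);
User user = new User(1, "MyUser", rolList);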