JPA + @OneToMany + DELETE: Item is not deleted if I access the parent later - java

I have a Deal which can have multiple DealItems.
The DealItems are linked in the Deal with the following JPA annotation:
public class DealEntity extends BasicEntity {
@OneToMany(mappedBy = "deal", cascade = CascadeType.ALL, fetch = FetchType.LAZY)
private List<DealItemEntity> items;
...
This is the relation inside a DealItem:
public class DealItemEntity extends BasicEntity {
@ManyToOne
@JoinColumn(name = "deal_id", nullable = false)
private DealEntity deal;
...
When I delete a DealItem, it is deleted and then persisted again when I access the Deal after the deletion, see here:
public FullDealResponse deleteDealItem(final String dealCode, final long dealItemId) {
DealEntity dealEntity = dealControl.findDealByDealCode(dealCode);
if (dealEntity == null) {
throw new WorkbenchGenericErrorException("Deal not found");
}
DealItemEntity dealItemEntity = dealItemControl.findDealItemByIdAndDealId(dealItemId, dealEntity.getId());
if (dealItemEntity == null) {
throw new WorkbenchGenericErrorException("Deal item not found");
}
// this makes a database DELETE call that is executed after the session is done
dealItemControl.deleteDealItem(dealItemEntity);
// When I remove this and I do not return anything, the deletion works
return this.getFullDealResponse(dealEntity);
}
EDIT:
This is getFullDealResponse() and getFullDealItemResponse():
private FullDealResponse getFullDealResponse(final DealEntity dealEntity) {
FullDealResponse response = new FullDealResponse();
response.setDescription(dealEntity.getDescription());
response.setTitle(dealEntity.getTitle());
response.setDealCode(dealEntity.getDealCode());
response.setCreatedAt(dealEntity.getCreatedAt());
response.setUpdatedAt(dealEntity.getUpdatedAt());
// get related items
List<FullDealItemResponse> itemsResponse = new ArrayList<FullDealItemResponse>();
for (DealItemEntity dealItemEntity : dealEntity.getItems()) {
itemsResponse.add(this.getFullDealItemResponse(dealItemEntity));
}
response.setItems(itemsResponse);
return response;
}
private FullDealItemResponse getFullDealItemResponse(final DealItemEntity dealItemEntity) {
FullDealItemResponse response = new FullDealItemResponse();
response.setId(dealItemEntity.getId());
response.setDescription(dealItemEntity.getDescription());
response.setTitle(dealItemEntity.getTitle());
response.setCreatedAt(dealItemEntity.getCreatedAt());
response.setUpdatedAt(dealItemEntity.getUpdatedAt());
return response;
}
These are the deleteDealItem() and delete() functions:
public void deleteDealItem(final DealItemEntity dealItemEntity) {
super.delete(DealItemEntity.class, dealItemEntity.getId());
}
protected void delete(final Class<?> type, final Object id) {
Object ref = this.em.getReference(type, id);
this.em.remove(ref);
}
Can this be solved by switching the CascadeType, and if so, which would be the correct type? Or would I have to iterate over Deal.getItems(), remove the unwanted item, set the new list with Deal.setItems() and update only the Deal so that it propagates the deletion?
What is the preferred way to do this?

I have replicated this code locally and verified my explanation.
Summary:
Cascade has no impact here. Even if you remove the cascade option and save each item separately, this method still will not delete your item.
To get the same behaviour regardless of whether deal.getItems() has been initialised, you have to remove the dealItem from deal.getItems() in addition to deleting the dealItem directly (see the sketch below).
On a bidirectional relationship you have to manage both sides explicitly, exactly the same way you add the dealItem to the deal and set the deal field of the dealItem before you save.
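A minimal sketch of that fix, reusing the service and entity names from the question (one possible shape, not the only one):
// remove the item from the managed parent collection first, so the cascade cannot re-persist it
dealEntity.getItems().remove(dealItemEntity);
// then delete the item itself
dealItemControl.deleteDealItem(dealItemEntity);
// building the response is now safe: deal.getItems() no longer references the deleted item
return this.getFullDealResponse(dealEntity);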
Overall Explanation
JPA can have only one representation of a particular entity associated with its session.
This is the foundation for providing repeatable reads, dirty checking, etc.
JPA also tracks every object associated with its session, and if any of the tracked objects have changes, they will be flushed when the transaction is committed.
When the deal object (with its lazy items collection) and the directly fetched dealItem are the only two entities associated with the session, JPA has one representation of each and there is no conflict, so when you delete the item via dealItemControl.deleteDealItem, the DELETE is issued.
However, once you call deal.getItems(), JPA manages not only the deal but also every dealItem associated with the deal object. So when you then call dealItemControl.deleteDealItem, JPA has an issue because deal.getItems() says the item is not marked for deletion, so the DELETE is not issued.
Reference: the SQL that Hibernate generates also confirms this explanation
1. With deal.getItems() and the queries generated
@OneToMany(mappedBy = "deal", cascade = CascadeType.ALL, fetch = FetchType.LAZY)
private List<DealItemEntity> items;
DealEntity dealEntity = dealControl.findDealByDealCode(dealCode);
....
dealItemControl.deleteDealItem(dealItemEntity);
....
dealEntity.getItems()
select deal0_.* from deal deal0_ where deal0_.id=?
select dealitem0_.*, deal1_.*
from deal_item dealitem0_ inner join deal deal1_ on dealitem0_.deal_id=deal1_.id
where dealitem0_.id=?
select items0_.* from deal_item items0_ where items0_.deal_id=?
2. Without deal.getItems() and the queries generated
@OneToMany(mappedBy = "deal", cascade = CascadeType.ALL, fetch = FetchType.LAZY)
private List<DealItemEntity> items;
DealEntity dealEntity = dealControl.findDealByDealCode(dealCode);
....
dealItemControl.deleteDealItem(dealItemEntity);
select deal0_.* from deal deal0_ where deal0_.id=?
select dealitem0_.*, deal1_.*
from deal_item dealitem0_ inner join deal deal1_ on dealitem0_.deal_id=deal1_.id
where dealitem0_.id=?
delete from deal_item where id=?

Related

Hibernate: Replacing object (OneToOne)

I have a parent entity Stock which has a child entity StockDetails in a OneToOne relation.
I can't figure out how to properly set and replace values for the Stock.details field.
Here are my entity classes (@Getter/@Setter from Lombok):
@Getter
@Setter
@NoArgsConstructor
@Entity
@Table(name = "stocks")
public class Stock
{
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private long id;
private String isin;
private String symbol;
private String name;
@OneToMany(mappedBy = "stock", cascade = CascadeType.ALL, orphanRemoval = true, fetch = FetchType.LAZY)
private List<StockChartPoint> chart;
@OneToOne(mappedBy = "stock", cascade = CascadeType.ALL, orphanRemoval = true, fetch = FetchType.LAZY)
private StockDetails details;
public void setDetails(StockDetails d)
{
details = d;
details.setStock(this);
}
}
and
@Getter
@Setter
@NoArgsConstructor
@Entity
@Table(name = "stock_details")
public class StockDetails
{
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
private long id;
@OneToOne
@JoinColumn(name = "stock")
private Stock stock;
}
Inserting a new value (and corresponding row) into the DB works fine when the details table is empty, and I can see that Hibernate logs exactly one insert statement. My code looks like this:
dbService.transactional(session ->
{
var stocks = getAllStocks();
var s = stocks.get(0);
StockDetails n = new StockDetails();
s.setDetails(n);
session.save(s);
});
The DbService.transactional method (TransactionExecutor is just a functional interface):
public void transactional(TransactionExecutor executor)
{
Transaction t = null;
try
{
t = session.beginTransaction();
executor.execute(session);
session.flush();
t.commit();
}
catch (HibernateException ex)
{
if (t != null)
t.rollback();
}
}
But when there is already an existing row in the details table, the old row is deleted and a new row is inserted. The result is that the PK of the details table increases every time I update the values. Is there a pattern, which I couldn't find, to address this? I could also just update the existing Stock.details field, but this would mean copying all the fields from object A to object B, and I guess there is a smarter way of doing this. I tried using an EntityManager, merge()/saveOrUpdate()/persist() instead of save(), as well as manipulating the ID in the new object, but this resulted in either changing nothing or throwing exceptions.
Then there is another problem I encountered and I don't know if it's related to the first one:
When executing the dbService.transactional(...) block twice it behaves differently: Now two rows are added to the DB. The SQL log looks like this:
...
Hibernate: insert into stock_details (stock) values (?)
Hibernate: delete from stock_details where id=?
-> insert/delete produced by the first run
Hibernate: select stock0_.id as id1_3_, stock0_.isin as isin2_3_, stock0_.name as name3_3_,
stock0_.symbol as symbol4_3_ from stocks stock0_
Hibernate: insert into stock_details (stock) values (?)
-> Just insert, no delete
Please let me know if more information is needed.
mysql.mysql-connector-java > 8.0.22
org.hibernate.hibernate-core > 5.4.25.Final
org.hibernate.hibernate-validator > 5.4.3.Final
But when there is already an existing row in the details table the old row is deleted...
...which is to be expected with orphanRemoval = true
...and a new row is inserted
...which is obviously to be expected as well, since you're overwriting the existing StockDetails associated with s with a brand new instance of StockDetails.
If you wish to update the existing StockDetails, rather than create a new StockDetails entity, you need to, well, do just that in Java code.
I could also just update the existing Stock.details field but this would lead in just copying all the fields from object A to object B...
that would be the least error-prone approach
...but I guess there is a smarter way doing this
You could just do:
StockDetails n = new StockDetails();
n.setId(s.getStockDetails().getId());
... //configure the remaining properties
n.setStock(s);
entityManager.merge(n);
As you might have guessed, you are actually creating a new StockDetails and replacing the old one. If you want to update the existing StockDetails, you really need to fetch it and update it field by field (as you said you could do). So, like:
StockDetails sd = s.getStockDetails();
sd.setSomeField(updateValue);
// ... copying field by field
To avoid copying field by field, you can fetch the StockDetails from the database and make the edits directly on it, so there is no need to copy each field.
However, if it is the case that, for example, you need to write values from some DTO to your entity, there are libraries you can use to ease the pain of copying.
Just to mention one, ModelMapper. With it the copying goes like:
ModelMapper mm = new ModelMapper();
StockDetails sd = s.getStockDetails();
mm.map(stockDetailsDto, sd);
where stockDetailsDto is some DTO object that contains the fields to update on the entity.
If your StockDetails contains all the other fields as well, even those not changed, then em.merge is easiest, as told in the answer from crizzis.
But if you get only the updated fields, then the other fields would be set to null. Sometimes the id is not settable, and then merge is impossible.

Get how many new records are added and how many are updated with JPA

I have the following entities:
public class Parent implements Serializable {
@Id
private Long propertyId;
@OneToMany(mappedBy = "parent", orphanRemoval = true, cascade = CascadeType.ALL)
private List<Child> objects = new ArrayList<>();
And the child class:
public class Child implements Serializable {
@Id
@NotNull
private Long childId;
@Id
@JoinColumn(name = "parent_id")
@ManyToOne(fetch = FetchType.LAZY)
private Parent parent;
I am getting the data from another API as a CSV file, parsing it and persisting it.
The way I persist the data is:
First, creating the Parent objects from the CSV file
Creating the Child objects from the CSV file
Setting the list of children for every Parent, and the parent for every Child
In the end I have the following code that persists the data:
parentRepository.saveAll(parents);
where parents is the complete list where I have the complete data.
I am calling this API from time to time, which means that every time it is called there might be new data or the same data, but in the end I must persist it.
My question is how to track how many new records are added and how many records are updated. I know that I could do some filtering, query every entity and check whether it exists, but since I have around 80,000 entities, adding so many queries is really time consuming.
Any proposal on how to do this without it being so time consuming? Is there any kind of interceptor that could give me this data at the end of the query?
Some background:
When you use manually assigned ids, if the object you are sending already exists, Hibernate does not issue an insert or update.
The reason for this is that whenever you call save() on entities with manually assigned ids, Hibernate first does a select to decide whether it is an insert or an update.
And since it has already done a select, it can compare the result to decide whether the objects are exactly equal; in that case it will not issue an insert or an update.
You can verify this behaviour by adding spring.jpa.show-sql=true, spring.jpa.properties.hibernate.format_sql=true and
logging.level.org.hibernate.SQL=DEBUG to application.properties
Solution
Add the following fields and methods to your entity
@Transient
private boolean updated;
@Transient
private boolean created;
@PrePersist
public void setCreated() {
this.created = true;
}
@PreUpdate
public void setUpdated() {
this.updated = true;
}
public boolean isCreated() {
return created;
}
public boolean isUpdated() {
return updated;
}
After saveAll():
if parent.isCreated() is true, you know it was a new insert
if parent.isUpdated() is true, you know it was an update
if both are false, then no insert or update happened
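For example, a small sketch for tallying the counts after saveAll(), assuming the transient flags above and that parentRepository is a Spring Data JpaRepository:
// saveAll returns the managed instances, whose flags were set by the lifecycle callbacks
List<Parent> saved = parentRepository.saveAll(parents);
long created = saved.stream().filter(Parent::isCreated).count();
long updated = saved.stream().filter(Parent::isUpdated).count();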

Lazy load doesn't work - I only need the parent object, not all associations

How do I get the object I want, without all of the child associations?
I have my class Site:
@Entity
@Table(name = "Sites")
public class Site {
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
@Column(name = "Id_Site", unique = true, nullable = false)
private long Id_Site;
private String ...;
private boolean ...;
private long ...;
private Date ...;
private Date ...;
private String ...;
@OneToMany(cascade = CascadeType.ALL, fetch = FetchType.LAZY)
private Set<Sequence> sequences = new HashSet<>();
@ManyToOne
private ... ...;
@OneToMany(cascade = CascadeType.ALL, fetch = FetchType.LAZY)
private Set<...> ... = new HashSet<>();
@ManyToOne
private ... ...;
public constructor...
public set..
public get..
}
I only need a Site object, without the Sequence Associations.
In my Sequence Table, I have:
@Entity
@Table(name = "Sequences")
public class Sequence {
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
@Column(name = "Id_Sequence", unique = true, nullable = false)
private long Id_Sequence;
private Date ....;
private Date ....;
private String ....;
private String ....;
private String ....;
private int ....;
private int ....;
private double ....;
private double ....;
@OneToMany(cascade = CascadeType.ALL, fetch = FetchType.EAGER)
private Set<TraceSequence> traceSequences = new HashSet<>();
@ManyToOne(cascade = CascadeType.ALL)
private Site site;
public constructor...
public set..
public get..
}
When I use FetchType.Lazy, and call my method:
@Override
public Site findSiteByName(String Name_Site) {
List<Site> sites = entityManager.createQuery("SELECT s FROM Site s").getResultList();
for (Site item : sites) {
if (item.getNom_Site().equals(Name_Site)) {
return item;
}
}
return null;
}
I get this error:
failed to lazily initialize a collection of role: xxx.xxx.xxx.xxx.xxx.site.Site.sequences, could not initialize proxy - no Session
When I use FetchType.EAGER, I get not only a Site object, but I also get all sequence objects, and all objects of other sequence associations. (I know it is the normal response.)
Could someone who knows why this attempt at lazy initialization doesn't work please tell me how to resolve this problem?
These lazy errors happen when JPA tries to get the data after the session is closed.
But using EAGER will influence all the queries that include that entity.
Try to use a join fetch in the query instead of eager fetching.
Somewhere in your code you are calling Site.getSequences(), maybe iterating in the view or in another part of your code. It doesn't look like the piece of code you gave is generating the exception.
If you try to use a collection that has not been loaded into your entity, the code throws the exception you mentioned.
To solve this, identify where you are using the sequences and load them before you use them, either by changing the fetch to EAGER or by using a JOIN FETCH in your query (see the sketch below).
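A minimal sketch of the JOIN FETCH variant, assuming the Site entity's name property is called nomSite (adjust it to your actual field name):
public Site findSiteByName(String nameSite) {
    List<Site> sites = entityManager
        .createQuery("SELECT DISTINCT s FROM Site s LEFT JOIN FETCH s.sequences WHERE s.nomSite = :name", Site.class)
        .setParameter("name", nameSite)
        .getResultList();
    // the sequences collection is already populated, so it can be used after the session closes
    return sites.isEmpty() ? null : sites.get(0);
}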
Returning a hibernate managed entity (or a collection of hibernate managed entities) will most likely cause these sort of problems unless you are super cautious on what is being returned and what was populated by hibernate when session was available.
I would say create a DTO (or a collection of DTO) and populate its fields the way you like. There are many Entity to DTO conversion framework; my fav is ModelMapper.
I also tend to agree with other suggestions to play with FetchType but since DTOs are populated by us we know what we populated as opposed to entity-relationships which are populated by hibernate based on annotations.
If you need something in the DTO you simply ask the entity and since session would be available at that point of time you could populate any field that you think you would need on the UI.
I don't want to hijack this topic towards DTO and Entity but that's how I would do it.
This may be helpful too Avoid Jackson serialization on non fetched lazy objects
The error happens because you try to execute getSequences(), but because it is lazy and the session is already closed, Hibernate raises the error.
To avoid this error, read the sequences inside the query method, "inside" the session, like this:
public Site findSiteByName(String Name_Site) {
List<Site> sites = entityManager.createQuery("SELECT s FROM Site s").getResultList();
for (Site item : sites) {
if (item.getNom_Site().equals(Name_Site)) {
// touch the lazy collection while the session is still open
item.getSequences().size();
return item;
}
}
return null;
}
This is lazy loading: you read the collection just when you need it!
As stated by other SE members above, you are getting this error because the session is already closed.
If you want to load a particular association, you can use the Hibernate.initialize method. It will execute one additional query to fetch the data of the related entity.
Therefore it runs on an as-needed basis and is not executed every time, as opposed to eager loading.
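A minimal sketch of the Hibernate.initialize approach (the siteId variable is illustrative, and this must run while the session is still open):
Site site = entityManager.find(Site.class, siteId);
// one extra select that loads the lazy collection before the session closes
org.hibernate.Hibernate.initialize(site.getSequences());
return site;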
I'm working on a project that aims to solve common JPA problems when mapping entities to DTOs using ModelMapper. This issue has already been solved on the project. Project link: JPA Model Mapper
In this scenario I believe we'd want to simply get null for all lazily loaded entities. For this question specifically, this could be done by using the JPA Model Mapper to map an entity to a DTO.
I've already answered the same issue on this question: How to solve the LazyInitializationException when using JPA and Hibernate

Hibernate does not refresh entity children completely

I use Hibernate 5.1.0.Final. My GenericDAO class main methods:
public T save(T entity) {
entityManager.getTransaction().begin();
entityManager.persist(entity);
entityManager.getTransaction().commit();
return entity;
}
public T find(Long entityId) {
T e = (T) entityManager.find(type, entityId);
entityManager.refresh(e);
return e;
}
I have Item entity which contains List<Image> images.
@Entity
@Table(name="item")
public class Item extends GenericEntity {
@Column(name="title")
String title;
@OneToMany(fetch = FetchType.LAZY, mappedBy = "item", cascade = {CascadeType.ALL}, orphanRemoval = true)
@NotFound(action = NotFoundAction.IGNORE)
List<Image> images;
@Version
@Temporal(TemporalType.TIMESTAMP)
@Column(name="version")
Date version;
}
@Entity
@Table(name="image")
public class Image extends GenericEntity {
@ManyToOne
@JoinColumn(name="item")
Item item;
@Column(name="image_name")
String imageName;
}
First, I load the Item with the genericDAO.find(id) method - it contains an up-to-date list of images. Then I load the Item with the same ID in another REST method which removes the old images, adds new ones and changes the title. Later, if I try to reload the Item with genericDAO.find(id) in the first REST method, I get outdated images - they aren't selected again, while the Item title is retrieved correctly.
How do I get a completely refreshed entity with its children from the database?
Your way of caching your entityManager is correct since it is in the request scope, but it has side-effects, as you can see.
You are using the level-one cache of Hibernate.
Why not use a distinct EntityManager instance per transaction?
It should not be expensive to instantiate an EntityManager per user request.
If you want to keep caching the entityManager, entityManager.refresh(e) seems not to be enough, since it doesn't clear the level-one cache. So, if you want to clear the first-level cache, you should try clearing the persistence context. No guarantee, but you can try: entityManager.clear()
Personally, I prefer the solution using a distinct entityManager instance per transaction. It's cheap and a clean design.
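A minimal sketch of the per-transaction idea, assuming an injected EntityManagerFactory and reusing the type field from the GenericDAO above:
public T find(Long entityId) {
    // a new EntityManager means a fresh first-level cache
    EntityManager em = entityManagerFactory.createEntityManager();
    try {
        // the children are re-read from the database instead of the stale cache
        return (T) em.find(type, entityId);
    } finally {
        em.close();
    }
}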

Hibernate - A collection with cascade=”all-delete-orphan” was no longer referenced by the owning entity instance

I'm having the following issue when trying to update my entity:
"A collection with cascade=”all-delete-orphan” was no longer referenced by the owning entity instance".
I have a parent entity and it has a Set<...> of child entities. When I try to update it, I gather all the references to be put into this collection and set it.
The following code represents my mapping:
@OneToMany(mappedBy = "parentEntity", fetch = FetchType.EAGER)
@Cascade({ CascadeType.ALL, CascadeType.DELETE_ORPHAN })
public Set<ChildEntity> getChildren() {
return this.children;
}
I've tried cleaning only the Set<..>, according to this: How to "possibly" solve the problem, but it didn't work.
If you have any ideas, please let me know.
Thanks!
Check all of the places where you are assigning something to sonEntities. The link you referenced distinctly points out creating a new HashSet but you can have this error anytime you reassign the set. For example:
public void setChildren(Set<SonEntity> aSet)
{
this.sonEntities = aSet; //This will override the set that Hibernate is tracking.
}
Usually you want to only "new" the set once in a constructor. Any time you want to add or delete something to the list you have to modify the contents of the list instead of assigning a new list.
To add children:
public void addChild(SonEntity aSon)
{
this.sonEntities.add(aSon);
}
To remove children:
public void removeChild(SonEntity aSon)
{
this.sonEntities.remove(aSon);
}
The method:
public void setChildren(Set<SonEntity> aSet) {
this.sonEntities = aSet;
}
works if the parentEntity is detached and we then update it.
But if the entity is not detached from the persistence context (i.e. the find and update operations are in the same transaction), the method below works.
public void setChildren(Set<SonEntity> aSet) {
//this.sonEntities = aSet; //This will override the set that Hibernate is tracking.
this.sonEntities.clear();
if (aSet != null) {
this.sonEntities.addAll(aSet);
}
}
When I read in various places that hibernate didn't like you to assign to a collection, I assumed that the safest thing to do would obviously be to make it final like this:
class User {
private final Set<Role> roles = new HashSet<>();
public void setRoles(Set<Role> roles) {
this.roles.retainAll(roles);
this.roles.addAll(roles);
}
}
However, this doesn't work, and you get the dreaded "no longer referenced" error, which is actually quite misleading in this case.
It turns out that hibernate calls your setRoles method AND it wants its special collection class installed here, and won't accept your collection class. This had me stumped for a LONG time, despite reading all the warnings about not assigning to your collection in your set method.
So I changed to this:
public class User {
private Set<Role> roles = null;
public void setRoles(Set<Role> roles) {
if (this.roles == null) {
this.roles = roles;
} else {
this.roles.retainAll(roles);
this.roles.addAll(roles);
}
}
}
So that on the first call, hibernate installs its special class, and on subsequent calls you can use the method yourself without wrecking everything. If you want to use your class as a bean, you probably need a working setter, and this at least seems to work.
Actually, my problem was with the equals and hashCode of my entities. Legacy code can bring a lot of problems, never forget to check it. All I did was keep the delete-orphan strategy and correct equals and hashCode.
I fixed it by doing this:
1. Clear the existing children list, so that they are removed from the database
parent.getChildren().clear();
2. Add the new children list created above to the existing list
parent.getChildren().addAll(children);
Hope this post helps you resolve the error.
I had the same error. The problem for me was that after saving the entity the mapped collection was still null, and when trying to update the entity the exception was thrown. What helped for me: saving the entity, then doing a refresh (the collection is no longer null) and then performing the update. Maybe initializing the collection with new ArrayList() or something similar might help as well.
I ran into this when updating an entity with a JSON post request.
The error occurred when I updated the entity without data about the children, even when there were none.
Adding
"children": [],
to the request body solved the problem.
I used user2709454's approach with a small improvement.
public class User {
private Set<Role> roles;
public void setRoles(Set<Role> roles) {
if (this.roles == null) {
this.roles = roles;
} else if(this.roles != roles) { // not the same instance, in other case we can get ConcurrentModificationException from hibernate AbstractPersistentCollection
this.roles.clear();
if(roles != null){
this.roles.addAll(roles);
}
}
}
}
It might be caused by the hibernate-enhance-maven-plugin. When I enabled the enableLazyInitialization property, this exception started happening on my lazy collection. I'm using Hibernate 5.2.17.Final.
Note these two Hibernate issues:
https://hibernate.atlassian.net/browse/HHH-10708
https://hibernate.atlassian.net/browse/HHH-11459
All those answers didn't help me, but I found another solution.
I had an Entity A containing a List of Entity B. Entity B contained a List of Entity C.
I was trying to update Entity A and B. It worked. But when updating Entity C, I got the mentioned error. In entity B I had an annotation like this:
@OneToMany(mappedBy = "entity_b", cascade = [CascadeType.ALL], orphanRemoval = true)
var c: List<EntityC>?,
I simply removed orphanRemoval and the update worked.
HAS RELATION TYPE:
Don't try to instantiate the collection when it's declared in hasMany, just add and remove objects.
class Parent {
static hasMany = [childs:Child]
}
USE RELATION TYPE:
But the collection could be null only when is declared as a property (use relation) and is not initialized in declaration.
class Parent {
List<Child> childs = []
}
The only time I get this error is when I try to pass NULL into the setter for the collection. To prevent this, my setters look like this:
public void setSubmittedForms(Set<SubmittedFormEntity> submittedForms) {
if(submittedForms == null) {
this.submittedForms.clear();
}
else {
this.submittedForms = submittedForms;
}
}
I had this problem when trying to use a TreeSet. I initialized the oneToMany with a TreeSet, which works:
@OneToMany(mappedBy = "question", fetch = FetchType.EAGER, cascade = { CascadeType.ALL }, orphanRemoval=true)
@OrderBy("id")
private Set<WizardAnswer> answers = new TreeSet<WizardAnswer>();
But this will bring the error described in the question above. It seems that Hibernate supports SortedSet, and if one just changes the lines above to
@OneToMany(mappedBy = "question", fetch = FetchType.EAGER, cascade = { CascadeType.ALL }, orphanRemoval=true)
@OrderBy("id")
private SortedSet<WizardAnswer> answers;
it works like magic :)
More info on Hibernate's SortedSet support can be found here.
There is this bug which looks suspiciously similar: https://hibernate.atlassian.net/browse/HHH-9940.
And the code to reproduce it: https://github.com/abenneke/sandbox/tree/master/hibernate-null-collection/src/test
There are 2 possible fixes to this:
the collection is initialized with an empty collection (instead of null)
orphanRemoval is set to false
Example - was:
@OneToMany(cascade = CascadeType.REMOVE,
mappedBy = "jobEntity", orphanRemoval = true)
private List<JobExecutionEntity> jobExecutionEntities;
became:
@OneToMany(cascade = CascadeType.REMOVE,
mappedBy = "jobEntity")
private List<JobExecutionEntity> jobExecutionEntities;
I had the same issue, but only when the set was null; a Set collection triggered it while a List worked fine. You can try the Hibernate annotation @LazyCollection(LazyCollectionOption.FALSE) instead of the JPA annotation fetch = FetchType.EAGER.
My solution:
This is my configuration and it works fine
@OneToMany(mappedBy = "format", cascade = CascadeType.ALL, orphanRemoval = true)
@LazyCollection(LazyCollectionOption.FALSE)
private Set<Barcode> barcodes;
@OneToMany(mappedBy = "format", cascade = CascadeType.ALL, orphanRemoval = true)
@LazyCollection(LazyCollectionOption.FALSE)
private List<FormatAdditional> additionals;
One other cause may be using Lombok.
@Builder - causes Collections.emptyList() to be saved even if you say .myCollection(new ArrayList());
@Singular - ignores the class-level defaults and leaves the field null even if the class field was declared as myCollection = new ArrayList()
My 2 cents, just spent 2 hours with the same :)
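If the @Builder case is what bit you, a commonly used remedy (a hedged sketch, not taken from the answer above) is Lombok's @Builder.Default, which makes the builder keep the field initializer instead of leaving the collection empty or null:
@Builder.Default
@OneToMany(mappedBy = "parent", cascade = CascadeType.ALL, orphanRemoval = true)
private List<Child> children = new ArrayList<>(); // the builder now preserves this initializer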
I was getting A collection with cascade=”all-delete-orphan” was no longer referenced by the owning entity instance when I was setting parent.setChildren(new ArrayList<>()). When I changed to parent.getChildren().clear(), it solved the problem.
Check for more details: HibernateException - A collection with cascade="all-delete-orphan" was no longer referenced by the owning entity instance.
I am using Spring Boot and had this issue with a collection, in spite of not directly overwriting it, because I am declaring an extra field for the same collection with a custom serializer and deserializer in order to provide a more frontend-friendly representation of the data:
public List<Attribute> getAttributes() {
return attributes;
}
public void setAttributes(List<Attribute> attributes) {
this.attributes = attributes;
}
#JsonSerialize(using = AttributeSerializer.class)
public List<Attribute> getAttributesList() {
return attributes;
}
#JsonDeserialize(using = AttributeDeserializer.class)
public void setAttributesList(List<Attribute> attributes) {
this.attributes = attributes;
}
It seems that even though I am not overwriting the collection myself, the deserialization does it under the hood, triggering this issue all the same. The solution was to change the setter associated with the deserializer so that it would clear the list and add everything, rather than overwrite it:
#JsonDeserialize(using = AttributeDeserializer.class)
public void setAttributesList(List<Attribute> attributes) {
this.attributes.clear();
this.attributes.addAll(attributes);
}
Mine was completely different with Spring Boot!
For me it was not due to setting collection property.
In my tests I was trying to create an entity and was getting this error for another collection that was unused!
After so much trying I just added @Transactional on the test method and it solved it. I don't know the reason though.
@OneToMany(mappedBy = 'parent', cascade = CascadeType.ALL, orphanRemoval = true)
List<Child> children = new ArrayList<>();
I experienced the same error when I was adding a child object to an existing list of child objects.
childService.saveOrUpdate(child);
parent.addToChildren(child);
parentService.saveOrUpdate(parent);
What resolved my problem was changing it to:
child = childService.saveOrUpdate(child);
Now the child comes back with its other details as well, and it worked fine.
Had this issue with Spring Boot 2.4.1 when running the tests in bulk from IntelliJ IDEA version 2020.3. The issue doesn't appear when running only one test at a time from IntelliJ or when running the tests from the command line.
Maybe Intellij caching problem?
Follow up:
The problem appears when running tests using the maven-surefire-plugin with reuseForks true. Using reuseForks false would provide a quick fix, but the tests' running time will increase dramatically. Because we are reusing forks, the database context might become dirty due to other tests that run without cleaning the database context afterwards. The obvious solution would be to clean the database context before running a test, but the best one is to clean up the database context after each test (solving the root cause of the original problem). Using the @Transactional annotation on your test methods guarantees that your database changes are rolled back at the end of the test methods. See the Spring documentation on transactions: https://docs.spring.io/spring-framework/docs/current/reference/html/testing.html#testcontext-tx.
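A minimal sketch of that suggestion (the class and repository names are illustrative): a Spring Boot test whose database changes are rolled back when the method ends.
@SpringBootTest
class ParentRepositoryTest {
    @Autowired
    private ParentRepository parentRepository; // hypothetical repository

    @Test
    @Transactional // changes are rolled back automatically at the end of the test
    void savesParentWithChildren() {
        Parent parent = new Parent();
        parentRepository.save(parent);
    }
}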
I faced a similar issue where I was using these annotations in my parent entity:
@Cascade({ CascadeType.ALL, CascadeType.DELETE_ORPHAN })
Mistakenly, I was trying to save a null parent object in the database, and properly setting values on my entity object resolved the error. So do check whether you are setting wrong values or trying to save a null object in the database.
Adding my dumb answer. We're using Spring Data Rest. This was our pretty standard relationship. The pattern was used elsewhere.
//Parent class
@OneToMany(mappedBy = 'parent',
cascade= CascadeType.ALL, orphanRemoval = true)
@LazyCollection(LazyCollectionOption.FALSE)
List<Child> children = new LinkedList<>()
//Child class
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = 'ParentID', updatable = false)
@JsonBackReference
Parent parent
With the relationship we created, it was always intended that the children would be added through their own repo. I had not yet added the repo. The integration test we had was going through a complete lifecycle of the entity via REST calls so the transactions would close between requests. No repo for the child meant the json had the children as part of the main structure instead of in _embedded. Updates to the parent would then cause problems.
Following solution worked for me
//Parent class
@OneToMany(mappedBy = 'parent',
cascade= CascadeType.ALL, orphanRemoval = true)
@OrderBy(value="ordinal ASC")
List<Child> children = new ArrayList<>()
//Updated setter of children
public void setChildren(List<Children> children) {
this.children.addAll(children);
for (Children child: children)
child.setParent(this);
}
//Child class
@ManyToOne
@JoinColumn(name="Parent_ID")
private Parent parent;
Instead of assigning a new collection:
public void setChildren(Set<ChildEntity> children) {
this.children = children;
}
replace all of the elements in place:
public void setChildren(Set<ChildEntity> children) {
this.children.clear();
this.children.addAll(children);
}
Be careful with
BeanUtils.copyProperties(newInsum, insumOld, "code");
This method also breaks the Hibernate-managed collection.
This is in contrast to the previous answers: I had exactly the same error, "A collection with cascade=”all-delete-orphan” was no longer referenced....", when my setter function looked like this:
public void setTaxCalculationRules(Set<TaxCalculationRule> taxCalculationRules_) {
if( this.taxCalculationRules == null ) {
this.taxCalculationRules = taxCalculationRules_;
} else {
this.taxCalculationRules.retainAll(taxCalculationRules_);
this.taxCalculationRules.addAll(taxCalculationRules_);
}
}
And then it disappeared when I changed it to the simple version:
public void setTaxCalculationRules(Set<TaxCalculationRule> taxCalculationRules_) {
this.taxCalculationRules = taxCalculationRules_;
}
(Hibernate versions - tried both 5.4.10 and 4.3.11. Spent several days trying all sorts of solutions before coming back to the simple assignment in the setter. Confused now as to why this is so.)
In my case it was concurrent access to one Hibernate Session from several threads.
I had a Spring Boot Batch RepositoryItemReader implementation where I fetched entities with a page request of size 10.
For example my entities are:
@Entity
class JobEntity {
@ManyToOne(fetch = FetchType.LAZY)
private GroupEntity group;
}
@Entity
class GroupEntity {
@OneToMany(mappedBy = "group", cascade = CascadeType.ALL, fetch = FetchType.LAZY, orphanRemoval = true)
private Set<Config> configs;
}
Batch process: reader -> processor -> writer in one transaction.
With that entity configuration, a GroupEntity can escape to other threads:
The first thread that enters the read section fetches a page of JobEntity with size 10 (RepositoryItemReader#doRead); these items contain one shared GroupEntity object (because all of them point to the same group id). Then it takes the first entity. The next threads that come to the read section take a JobEntity from this page one by one, until the page is exhausted.
So now the threads have access to the same GroupEntity instance through the JobEntity instances, which is unsafe multi-threaded access to one Hibernate Session.
As of 2021 and Spring Boot 2.5, it helped me to initialize the field right away when declaring it:
@OneToMany(mappedBy = "target", fetch = FetchType.EAGER, cascade = CascadeType.ALL, orphanRemoval = true)
private List<TargetEntity> targets = new ArrayList<>();
The issue is solved when we make the child collection final.
We should not change the reference of the child collection in the constructor or in the setter.
