FetchType.LAZY with the same entity - Java

I built a tree structure which is stored in the database. The relation is built on the id and parent_id columns of the database table. I'm using Spring Data and Hibernate.
For accessing the tree structure I built an entity class "Node" and a "NodeRepository". The entity class has an attribute "children" which has a @OneToMany relation to itself.
Fetching the nodes is no problem. But fetching the children is a problem, because lazy fetching outside of a transactional environment fails ("failed to lazily initialize a collection of role").
I changed the fetch mode to eager. But that is also a problem: because of the relation to itself, it ends up fetching the whole tree structure.
So, what is the best practice in this case to keep the retrieval of the children easy on one side and avoid fetching the whole structure on the other?

I think I've got a solution which fits my needs.
First I defined a custom interface NodeRepositoryCustom (which my NodeRepository extends), gave it an additional method findOneWithChildrenInit(), and implemented it:
public interface NodeRepositoryCustom {
    Node findOneWithChildrenInit(Long id);
}

public class NodeRepositoryImpl implements NodeRepositoryCustom {

    @Autowired
    NodeRepository repo;

    @Override
    @Transactional
    public Node findOneWithChildrenInit(Long id) {
        Node node = repo.findOne(id);
        // Touching the collection inside the transaction forces Hibernate
        // to initialize it before the session closes.
        node.getChildren().size();
        return node;
    }
}
So I can decide: when I don't need the children, I simply call findOne(); when I need them, I call findOneWithChildrenInit().
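For completeness, here is a minimal sketch (my addition, not part of the original solution) of how the pieces are wired together; the JpaRepository base type and the Long id type are assumptions:
public interface NodeRepository extends JpaRepository<Node, Long>, NodeRepositoryCustom {
    // Spring Data picks up NodeRepositoryImpl by naming convention and
    // merges it into this repository, so callers see both findOne()
    // and findOneWithChildrenInit() on the same interface.
}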

It probably depends on the provider implementation. But usually the children collection is initialized as soon as you use it in any way. So a common trick is to call children.size() before leaving the transaction and throw away the result.

You can still use the lazy fetch type and just make sure the session has not been closed before all the children are fetched. You can simply do:
Session session = sessionFactory.openSession();
// fetch all required Nodes and all their children using this same
// session object, e.g.:
Node node = (Node) session.get(Node.class, id);
Hibernate.initialize(node.getChildren());
session.close();

When you have more than one eager relationship in your entity, you need to add the annotation @Fetch(FetchMode.SELECT) to the other eager associations.
Check the official documentation for each of the fetch modes available.
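As a rough illustration of that advice (the second Tag collection here is hypothetical, purely to show the annotation; @Fetch and FetchMode come from org.hibernate.annotations):
@Entity
public class Node {

    @OneToMany(mappedBy = "parent", fetch = FetchType.EAGER)
    private List<Node> children;

    // A second eager List would otherwise provoke Hibernate's
    // MultipleBagFetchException; FetchMode.SELECT tells Hibernate to
    // load it with a separate SELECT instead of joining everything.
    @OneToMany(mappedBy = "node", fetch = FetchType.EAGER)
    @Fetch(FetchMode.SELECT)
    private List<Tag> tags;
}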

Related

DDD implementation with Spring Data and JPA + Hibernate problem with identities

So I'm trying, for the first time and in a not-so-complex project, to implement Domain-Driven Design by separating all my code into application, domain, infrastructure and interfaces packages.
I also went with the full separation of the JPA entities from the domain models that hold my business logic as rich models, using the Builder pattern to instantiate them. This approach gave me a headache, and I can't figure out whether I'm doing it all wrong when using JPA + ORM and Spring Data with DDD.
Process explanation
The application is a REST API consumer (without any user interaction) that processes a fairly big amount of data resources daily through scheduled tasks and stores or updates them in MySQL. I'm using RestTemplate to fetch and convert the JSON responses into domain objects, and from there I'm applying any business logic within the domain itself, e.g. validation, events, etc.
From what I have read, the aggregate root object should have an identity for its whole lifecycle and should be unique. I have used the id of the REST API object because it is already something I use to identify and track it in my business domain. I have also created a property for the technical id, so that when I convert entities to domain objects it holds a reference for the update process.
When I need to persist the domain objects to the data source (MySQL) for the first time, I convert them into entity objects and persist them using the save() method. So far so good.
Now when I need to update those records in the data source, I first fetch them from the data source as a List of Employees and convert the entity objects to domain objects; then I fetch the list of Employees from the REST API as domain models. Up until now I have two lists of the same domain object type, List<Employee>. I iterate over them using streams, checking whether objects are not equal() between the two lists; if so, a third list is created with the Employee objects that need to be updated. At this point I have already passed the technical id to the domain objects in the third list, so Hibernate can identify the records that already exist and update them.
Up to here it is all fairly simple stuff, until I use the saveAll() method to update the records.
Questions
I always see Hibernate using INSERT instead of updating the list of records. So, if I'm correct, the Hibernate session is not recognising the objects I'm passing to it, because I detached them when I converted them to domain objects?
Does anyone have a better idea how I can implement this differently, or fix this problem?
Or should I stop using this approach of two different objects and continue using them as rich entity models?
Simple classes to explain it with code
EmployeeDO.java
@Entity
@Table(name = "employees")
public class EmployeeDO implements Serializable {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    private String name;

    public EmployeeDO() {}

    ...omitted getters/setters
}
Employee.java
public class Employee {

    private Long persistId;
    private Long employeeId;
    private String name;

    private Employee() {}

    ...omitted getters and Builder
}
EmployeeConverter.java
public class EmployeeConverter {

    public static EmployeeDO serialize(Employee employee) {
        EmployeeDO target = new EmployeeDO();
        if (employee.getPersistId() != null) {
            target.setId(employee.getPersistId());
        }
        target.setName(employee.getName());
        return target;
    }

    public static Employee deserialize(EmployeeDO employee) {
        return new Employee.Builder(employee.getEmployeeId())
                .withPersistId(employee.getId()) // <-- technical id setter
                .withName(employee.getName())
                .build();
    }
}
EmployeeRepository.java
@Component
public class EmployeeRepositoryImpl implements EmployeeRepository {

    @Autowired
    EmployeeJpaRepository db;

    @Override
    public List<Employee> findAll() {
        return db.findAll().stream()
                .map(EmployeeConverter::deserialize)
                .collect(Collectors.toList());
    }

    @Override
    public void saveAll(List<Employee> employees) {
        db.saveAll(employees.stream()
                .map(EmployeeConverter::serialize)
                .collect(Collectors.toList()));
    }
}
EmployeeJpaRepository.java
@Repository
public interface EmployeeJpaRepository extends JpaRepository<EmployeeDO, Long> {
}
I use the same approach in my project: two different models, one for the domain and one for persistence.
First, I would suggest that you don't use the converter approach, but the Memento pattern instead. Your domain entity exports a memento object and can be restored from one. Yes, the domain gains two functions that aren't related to the domain itself (they exist just to serve a non-functional requirement), but, on the other side, you avoid exposing functions, getters and constructors that the domain business logic never uses.
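A minimal sketch of the memento idea applied to the Employee above (the names and the Java 16 record are my assumptions, not code from the post):
public class Employee {

    private Long persistId;
    private Long employeeId;
    private String name;

    // ...rich domain behaviour lives here...

    // The only two "technical" methods: export a snapshot, and rebuild
    // the entity from one, without exposing setters to the rest of the domain.
    public EmployeeMemento toMemento() {
        return new EmployeeMemento(persistId, employeeId, name);
    }

    public static Employee fromMemento(EmployeeMemento m) {
        Employee e = new Employee();
        e.persistId = m.persistId();
        e.employeeId = m.employeeId();
        e.name = m.name();
        return e;
    }

    // Plain immutable snapshot; the persistence layer reads and writes
    // only this, never the domain entity itself.
    public record EmployeeMemento(Long persistId, Long employeeId, String name) {}
}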
For the persistence part, I don't use JPA, exactly for this reason: you have to write a lot of code to reload, update and persist the entities correctly. I write the SQL directly: I can write and test it quickly, and once it works I'm sure it does what I want. With the memento object I have directly what I will use in the insert/update query, and I spare myself a lot of the JPA headaches of handling complex table structures.
Anyway, if you want to use JPA, the only solution is to:
load the persistence entities and transform them into domain entities
update the domain entities according to the changes that you have to make in your domain
save the domain entities, which means:
reload the persistence entities
change them, or create new ones if needed, applying the changes that you get from the updated domain entities
save the persistence entities
I've tried a mixed solution, where the domain entities are extended by the persistence ones (a bit complex to do). A lot of care has to be taken to prevent the domain model from adapting to the restrictions that JPA imposes through the persistence model.
Here is an interesting read about splitting the two models.
Finally, my suggestion is to consider how complex the domain is and to use the simplest solution for the problem:
is it big, with a lot of complex behaviour? Is it expected to grow into a big one? Use two models, domain and persistence, and manage the persistence directly with SQL. This avoids a lot of chaos in the read/update/save phase.
is it simple? Then, first of all: should you use the DDD approach at all? If the answer is really yes, I would let the JPA annotations slip inside the domain. Yes, it's not pure DDD, but we live in the real world, and the time needed to do something simple the pure way should not be orders of magnitude bigger than the time needed to do it with some compromises. And, on the other hand, you can put all this mapping in an XML file in the infrastructure layer, avoiding cluttering the domain with it, as is done in the Spring DDD sample here.
When you want to update an existing object, you first have to load it through entityManager.find() and apply the changes to that object, or use entityManager.merge(), since you are working with detached entities.
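As a rough sketch of the find-and-apply variant for the saveAll() case above (a Spring Data 2.x-style findById is assumed, and which fields to copy is an assumption; only name exists in the example):
@Override
@Transactional
public void saveAll(List<Employee> employees) {
    for (Employee employee : employees) {
        if (employee.getPersistId() != null) {
            // Load the managed entity and apply the changes to it, so
            // Hibernate issues an UPDATE instead of an INSERT.
            EmployeeDO managed = db.findById(employee.getPersistId())
                    .orElseThrow(IllegalStateException::new);
            managed.setName(employee.getName());
        } else {
            db.save(EmployeeConverter.serialize(employee));
        }
    }
}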
Anyway, modelling rich domain models based on JPA is the perfect use case for Blaze-Persistence Entity Views.
Blaze-Persistence is a query builder on top of JPA which supports many of the advanced DBMS features on top of the JPA model. I created Entity Views on top of it to allow easy mapping between JPA models and custom interface-defined models, something like Spring Data Projections on steroids. The idea is that you define your target structure the way you like and map attributes (getters) via JPQL expressions to the entity model. Since the attribute name is used as the default mapping, you mostly don't need explicit mappings, as 80% of the use cases are DTOs that are a subset of the entity model.
The interesting point here is that entity views can also be updatable and support automatic translation back to the entity/DB model.
A mapping for your model could look as simple as the following
@EntityView(EmployeeDO.class)
@UpdatableEntityView
interface Employee {

    @IdMapping("persistId")
    Long getId();

    Long getEmployeeId();

    String getName();
    void setName(String name);
}
Querying is a matter of applying the entity view to a query, the simplest being a query by id:
Employee dto = entityViewManager.find(entityManager, Employee.class, id);
The Spring Data integration allows you to use it almost like Spring Data Projections (https://persistence.blazebit.com/documentation/entity-view/manual/en_US/index.html#spring-data-features), and it can also be saved back. Here is a sample repository:
@Repository
interface EmployeeRepository {

    Employee findOne(Long id);

    void save(Employee e);
}
It will only fetch the mappings that you tell it to fetch, and also only update the state that you make updatable through setters.
With the Jackson integration you can deserialize your payload onto a loaded entity view, or you can avoid loading altogether and use the Spring MVC integration to capture just the state that was transferred and flush that. It could look like the following:
@RequestMapping(path = "/employee/{id}", method = RequestMethod.PUT, consumes = MediaType.APPLICATION_JSON_VALUE)
public ResponseEntity<String> updateEmp(@EntityViewId("id") @RequestBody Employee emp) {
    employeeRepository.save(emp);
    return ResponseEntity.ok(emp.getId().toString());
}
Here you can see an example project: https://github.com/Blazebit/blaze-persistence/tree/master/examples/spring-data-webmvc

JPA EntityManager.merge() attempts to cascade the update to deleted entities

I'm facing a problem with EntityManager.merge() where the merge is cascaded to other entities that have already been deleted from the database. Say I have the following entities:
@Entity
public class Parent {

    @OneToMany(cascade = CascadeType.ALL, orphanRemoval = true, mappedBy = "parent")
    private List<Child> children;

    public void clearChildren() { children.clear(); }

    public void createChildren(Template template) { ... }
}

@Entity
public class Child {

    @ManyToOne
    @JoinColumn(name = "parentId")
    private Parent parent;
}
The situation where the problem occurs is the following:
The user creates a new Parent instance and creates new Child instances based on a template of their choosing by calling the createChildren() method. The template defines the number and properties of the created children.
The user saves the parent, which cascades the persist to the children.
The user notices that the chosen template was wrong. They change the template and save, which results in the deletion of the old children and the creation of new ones.
Normally the deletion of the old children would be handled automatically by the orphanRemoval property, but the Child entity has a multi-column unique index, and some of the new children created from the new template can have values identical, in all columns of the index, to those of some of the original children. When the changes are flushed to the database, JPA performs inserts and updates before deletions (or at least Hibernate does), and a constraint violation occurs. Oracle's deferred constraints would solve this, but we also support MS SQL, which AFAIK doesn't support deferred constraints (correct me if I'm wrong).
So in order to solve this, I manually delete the old children, flush the changes, create the new children, and save my changes. The artificial code snippet below shows the essential parts of what happens now. Due to the way our framework works, the entities passed to this method are always in a detached state (which I'm afraid is part of the problem).
public void createNewChildren(Parent parent, Template template) {
    for (Child child : parent.getChildren()) {
        // Have to run a find since the entities are detached
        entityManager.remove(entityManager.find(Child.class, child.getId()));
    }
    entityManager.flush();
    parent.clearChildren();
    parent.createChildren(template);
    entityManager.merge(parent); // EntityNotFoundException is thrown
}
The last line throws an exception, as the EntityManager attempts to load the old children and merge them as well, but fails since they're already deleted. The question is: why does it try to load them in the first place? And more importantly, how can I prevent it? The only thing that comes to mind that could cause this is a stale cache issue. I can't refresh the parent, as it can contain other unsaved changes which would be lost (plus it's detached). I tried setting the parent reference explicitly to null on each child before deleting them, and I tried evicting the old children from the second-level cache after deleting them. Neither helped. We haven't modified the JPA cache settings in any way.
We're using Hibernate 4.3.5.
UPDATE:
We are in fact clearing the children from the parent as well; this was perhaps a bit ambiguous originally, so I updated the code snippets to make it clear.
Try removing the children from the parent before deleting them; that way MERGE can't be cascaded to them, because they are no longer in the parent's collection.
// iterate over a copy so that removing from the real collection is safe
for (Child child : new ArrayList<>(parent.getChildren())) {
    // Have to run a find since the entities are detached
    Child c = entityManager.find(Child.class, child.getId());
    parent.getChildren().remove(c); // ensure that the child is actually removed
    entityManager.remove(c);
}
UPDATE
I still think the order of operations is the cause of the problem here; try whether this works:
public void createNewChildren(Parent parent, Template template) {
    // iterate over a copy so that removing from the real collection is safe
    for (Child child : new ArrayList<>(parent.getChildren())) {
        // Have to run a find since the entities are detached
        Child c = entityManager.find(Child.class, child.getId());
        parent.getChildren().remove(c); // ensure that the child is actually removed
        c.setParent(null);
        entityManager.remove(c);
    }
    parent.createChildren(template);
    entityManager.merge(parent);
}

Should JPA Repositories be created by Table or Object?

What is the best practice for creating JPA repositories?
Right now I have two tables: Media and Ratings.
To find media that are similar, a query has to be made against the Rating table to find the interconnections between the different media. This query then returns a list of Media objects.
Should this query be placed in the Rating repository (as it queries the Rating table), or in the Media repository (as it returns a collection of Media objects with the IDs set)?
I have tried searching for best practices for this particular use case but haven't found anything relevant.
Update:
The SQL query is defined like this:
@Query(value = [SQL query with several joins], nativeQuery = true)
List<Media> mySQLQuery();
It returns a list of mediaIds, which can be returned from the function as Media objects.
Well, probably neither.
You see, when you implement a repository and call, for example, findAll(), you get a List of all objects of the entity used in the repository declaration:
interface MyRepo extends JpaRepository<Media, Long> { ... }
myRepo.findAll() will return a List of Media objects.
What you are trying to do is out of scope for a repository, as a repository acts only on that particular entity, with a finite set of operations on it.
Also, it seems to me that Media and Ratings are connected with a OneToMany or ManyToMany relation; this should definitely go into a separate DAO method.
I solved it by creating a custom implementation of the repository, implementing a custom interface.
First I had to declare the interface:
public interface CustomQuery {
    List<Integer> myCustomQuery(int id);
}
Then I implemented it in my custom repository. The name is important: it has to begin with the name of the repository interface it extends. My repo is named MediaRepository, so I named the custom implementation MediaRepositoryImpl:
@Component
public class MediaRepositoryImpl implements CustomQuery {

    @PersistenceContext
    EntityManager manager;

    public List<Integer> myCustomQuery(int id) {
        Query q = manager.createNativeQuery(SQL_QUERY_GOES_HERE);
        List<Integer> ids = new ArrayList<Integer>();
        @SuppressWarnings("unchecked")
        List<Integer> result = q.getResultList();
        for (Integer o : result) {
            // process the results and add them to the list
            ids.add(o);
        }
        return ids;
    }
}
This way, you can do custom native queries while still keeping the regular Repositories clean. This approach is also easier to test.
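One piece left implicit above: the main repository interface has to extend the custom interface so that both sets of methods are available on the same bean. A sketch, assuming the repository is Spring Data JPA based:
public interface MediaRepository extends JpaRepository<Media, Long>, CustomQuery {
    // Spring Data merges MediaRepositoryImpl into the proxy it generates,
    // so mediaRepository.myCustomQuery(id) works alongside findAll() etc.
}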

Explicit delete on JPA relationships

I am a bit confused about managing relationships in JPA.
Basically I have two entities with a one-to-many relationship: a configuration can have one or many emails associated with it.
@Entity
public class Config {

    @OneToMany(mappedBy = "owner", cascade = CascadeType.ALL, fetch = FetchType.EAGER)
    private List<Email> emailReceivers;
}

@Entity
public class Email {

    @ManyToOne
    private Config owner;
}
In an EJB, during an update/merge operation in which I edit the list of emails associated with a configuration, I thought that I don't need to explicitly call the delete operation on my Email entity, and that I could manage the relationship just by deleting the email from my configuration's email list.
@Stateless
public class ConfigFacadeImpl implements ConfigFacade {

    @EJB
    private ConfigDao configDao;

    @EJB
    private EmailDao emailDao;

    @Override
    public void update(Config config, List<Email> emailsForDelete) {
        if (emailsForDelete != null && emailsForDelete.size() > 0) {
            for (Email emailTemp : emailsForDelete) {
                Email email = emailDao.find(emailTemp.getId());
                emailDao.delete(email); // Do I need to explicitly call the remove??
                config.getEmailReceivers().remove(email);
            }
        }
        configDao.update(config);
    }
}
If I don't execute the delete and only remove it from the list, it won't erase my table row.
The UI and the database are then out of sync: the UI no longer shows the email(s) that I have deleted, but when you check the database, the row(s) are still there.
Is the explicit delete required? I thought JPA would handle this for me if I just removed the entity from the collection.
UPDATE
I have tweaked my code to get the entity from the database first before making any changes, but it still doesn't delete my child Email entities. I wonder if this is an Apache Derby issue. (This is the correct way, right? As I am passing my entities from my JSF managed bean into my EJB, I need to get them in sync from the DB first.)
@Override
public void update(Config config, List<Email> emailsForDelete) {
    Config configTemp = configDao.find(config.getId());
    if (emailsForDelete != null && emailsForDelete.size() > 0) {
        for (Email emailTemp : emailsForDelete) {
            configTemp.getEmailReceivers().remove(emailTemp);
        }
    }
    configDao.update(config);
}
Since you have already defined cascade = CascadeType.ALL, JPA should take care of the deletion. An explicit delete statement is not required.
These two statements are not required:
Email email = emailDao.find(emailTemp.getId());
emailDao.delete(email); // Do I need to explicitly call the remove??
Instead, you may want to just find the matching receiver in config.getEmailReceivers() and remove it, as you are already doing. There is no need to load the Email entity from the database.
EDIT: To delete orphan objects, you may want to include the Hibernate-specific CascadeType.DELETE_ORPHAN cascade attribute along with CascadeType.ALL.
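On the mapping above that would look roughly like this (a sketch using the Hibernate-specific @Cascade annotation; DELETE_ORPHAN was later deprecated in favour of JPA 2.0's orphanRemoval):
@OneToMany(mappedBy = "owner", cascade = CascadeType.ALL, fetch = FetchType.EAGER)
@org.hibernate.annotations.Cascade(org.hibernate.annotations.CascadeType.DELETE_ORPHAN)
private List<Email> emailReceivers;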
This is the same issue as in Why merging is not cascaded on a one to many relationship.
Basically, JPA can only cascade over entities that are in your collection. So changes to child objects removed from the collection are never put into the context, and so can't be pushed to the database. In this case the OneToMany side is controlled by the ManyToOne back pointer, so even collection changes won't show up unless the child is also merged. Once a child is pruned from the tree, it needs to be managed and merged individually for changes to it to be picked up.
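A minimal sketch of that last point, applied to the facade from the question (whether the DAO's delete() wraps EntityManager.remove(), and the presence of a setOwner() setter, are assumptions on my part):
@Override
public void update(Config config, List<Email> emailsForDelete) {
    if (emailsForDelete != null) {
        for (Email emailTemp : emailsForDelete) {
            // Prune the child from the collection...
            config.getEmailReceivers().remove(emailTemp);
            // ...then handle it individually: once it is out of the tree,
            // no cascade from Config will ever reach it again.
            Email managed = emailDao.find(emailTemp.getId());
            managed.setOwner(null);
            emailDao.delete(managed);
        }
    }
    configDao.update(config);
}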
With JPA 2.0, you can use the orphanRemoval=true option on the parent entity.
Example:
@Entity
public class Parent {
    ...
    @OneToMany(mappedBy = "parentId", cascade = CascadeType.ALL, orphanRemoval = true)
    private List<Child> childList;
    ...
}

How can I access lazy-loaded fields after the session has closed, using hibernate?

Consider this scenario:
I have loaded a Parent entity through Hibernate
Parent contains a collection of Children which is large and lazily loaded
The Hibernate session is closed after this initial load, while the user views the Parent data
The user may choose to view the contents of the lazy Children collection
I now wish to load that collection
What are the ways of loading this collection, and which is best?
Assume session-in-view is not an option, as the fetching of the Children collection would only happen after the user has viewed the Parent and decided to view the Children.
This is a service which will be accessed remotely by web- and desktop-based clients.
Thanks.
The lazy collection can be loaded by using Hibernate.initialize(parent.getCollection()), except that the parent object needs to be attached to an active session.
This solution takes the parent entity and the name of the lazy-loaded field, and returns the entity with the collection fully loaded.
Unfortunately, as the parent needs to be reattached to the newly opened session, I can't take a reference to the lazy collection beforehand, as that would reference the detached version of the entity; hence the fieldName parameter and the reflection. For the same reason, this has to return the attached parent entity.
So in the OP's scenario, this call can be made when the user chooses to view the lazy collection:
Parent parentWithChildren = dao.initialize(parent, "lazyCollectionName");
The Method:
public Entity initialize(Entity detachedParent, String fieldName) throws ReflectiveOperationException {
    // ...open a hibernate session...

    // reattach the parent to the session
    Entity reattachedParent = (Entity) session.merge(detachedParent);

    // get the lazy field from the entity and initialize it
    Field fieldToInitialize = detachedParent.getClass().getDeclaredField(fieldName);
    fieldToInitialize.setAccessible(true);
    Object objectToInitialize = fieldToInitialize.get(reattachedParent);
    Hibernate.initialize(objectToInitialize);

    return reattachedParent;
}
I'm making some assumptions about what the user is looking at, but it seems you only want to retrieve the children once the user has already viewed the parent and really wants to see the children.
Why not try opening a new session and fetching the children by their parent? Something along the lines of:
Criteria criteria = session.createCriteria(Child.class);
criteria.add(Restrictions.eq("parent", parent));
List<Child> children = criteria.list();
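The same query in HQL, if you'd rather avoid the legacy Criteria API (a sketch assuming Hibernate 5.2+, where Session exposes the JPA-style typed createQuery):
List<Child> children = session
        .createQuery("from Child c where c.parent = :parent", Child.class)
        .setParameter("parent", parent)
        .getResultList();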
Hibernate handles collections in a different way than normal fields.
At my work we get around this by initialising, in the initial load, just the fields we need, on a case-by-case basis. For example, in a facade load method that is wrapped in a transaction you might have:
public Parent loadParentWithIntent1(Long parentId)
{
    Parent parent = loadParentFromDAO(parentId);
    for (Child c : parent.getChildren())
    {
        c.getField1();
    }
    return parent;
}
and we have a different facade call for each intent. This essentially achieves what you need, because you'd be loading these specific fields when you need them anyway, and this just puts them in the session at load time.
