One of the model objects in our application has many fields configured to be eagerly fetched, like so:
@ManyToOne(fetch = FetchType.EAGER)
@JoinColumn(name = "field")
public Field getField() {
    return this.field;
}
However, I sometimes do not need this information, which slows down my queries for nothing. I cannot change the behaviour and use FetchType.LAZY instead, as I have no idea what the impact on the whole application would be (legacy...). Is there a way to simply tell Hibernate to fetch nothing unless it is specified in the query?
Last time I checked there was no proper solution provided by Hibernate, so I ended up with this approach:
Configured the problematic references as LAZY.
All affected service methods (that used these models) got an overloaded version with a boolean forceEager parameter.
By default, all existing callers were refactored to call the new versions with forceEager=true.
And here comes the trick: as a means of "forcing the eager fetching" I found nothing better than actually accessing the proxied (lazy-fetched) objects. For example, calling list.size() on a lazily referenced list forces Hibernate to load the full list, so the service returns a fully fetched object (see the sketch below).
If more than one layer of your object structure is affected, you need to traverse the whole hierarchy and access every lazily loaded object from top to bottom.
This is a somewhat error-prone solution, so handle it with care.
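A rough sketch of what this looks like in a service class (the entity, DAO, and collection names here are made up for illustration, not taken from the original code):

public Order getOrder(long id) {
    // existing callers keep the old eager behaviour
    return getOrder(id, true);
}

public Order getOrder(long id, boolean forceEager) {
    Order order = orderDao.findById(id);
    if (forceEager && order != null) {
        // Touching the lazy collections while the session is still open forces
        // Hibernate to load them, so callers get a fully initialized graph.
        order.getLines().size();
        for (OrderLine line : order.getLines()) {
            line.getDiscounts().size(); // repeat for every nested lazy association
        }
    }
    return order;
}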
If it's possible to switch to Criteria for this query, you could use FetchMode.SELECT for the field property:
crit.setFetchMode("field", FetchMode.SELECT);
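For context, a minimal Criteria sketch around that call might look like this (the Order entity and orderId value are assumptions, not taken from the question):

Criteria crit = session.createCriteria(Order.class);
crit.setFetchMode("field", FetchMode.SELECT); // load "field" via a separate select instead of a join
crit.add(Restrictions.eq("id", orderId));
Order order = (Order) crit.uniqueResult();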
Related
I have a lazily fetched field in my entity
@ElementCollection(fetch = LAZY)
private List<String> emails;
And my transaction boundary stops at the service class; I don't want to keep the session open while rendering the view. I want my service classes to return detached entities.
In my service class I tried calling the getters, but that seems to be erased by the compiler -- maybe it's an optimization for a statement that appears to do nothing.
/* User Service class */
@Transactional
public List<User> getAllUsers() {
    List<User> users = new ArrayList<>();
    for (User u : userRepo.findAll()) {
        u.getEmails(); // <-- this seems to be erased by the compiler optimization.
        users.add(u);
    }
    return users;
}
Hence I'm forced to print the lazy field to the TRACE log so that it won't clutter the production logs. Doing this ensures the lazy field is pre-populated before the entities are detached:
LOG.trace(u.getEmails().toString());
However, this solution isn't pretty at all. Is there any better way to do this?
I don't want to mark the field as EAGER because I have another service method that purposely skips the relationship for efficiency.
Since you are using Hibernate, this is probably going to have to be Hibernate-specific. I'm not aware of any JPA functionality that does this. According to the Hibernate documentation:
The static methods Hibernate.initialize() and Hibernate.isInitialized(), provide the application with a convenient way of working with lazily initialized collections or proxies. Hibernate.initialize(cat) will force the initialization of a proxy, cat, as long as its Session is still open. Hibernate.initialize( cat.getKittens() ) has a similar effect for the collection of kittens.
This should prevent the compiler from erasing that call, and remove the need for you to do some kind of work with the return value to trick the compiler. So Hibernate.initialize(u.getEmails()) should work for you.
Hibernate.initialize(u.getEmails())
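Applied to the service method above, the fix might look roughly like this (same names as in the question; Hibernate is org.hibernate.Hibernate):

@Transactional
public List<User> getAllUsers() {
    List<User> users = new ArrayList<>();
    for (User u : userRepo.findAll()) {
        Hibernate.initialize(u.getEmails()); // explicitly load the lazy collection before detaching
        users.add(u);
    }
    return users;
}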
I am currently working on a product that works with Hibernate (HQL) and another one that works with JPQL. As much as I like the concept of the mapping from a relational structure (database) to an object (Java class), I am not convinced of the performance.
EXAMPLE:
Java:
public class Person {
    private String name;
    private int age;
    private char sex;
    private List<Person> children;
    //...
}
I want to get the age attribute of a certain Person, say a person with 10 children (he has been very busy). With Hibernate or JPQL you would retrieve the person as an object.
HQL:
SELECT p
FROM my.package.Person as p
WHERE p.name = 'Hazaart'
Not only will I be retrieving the other attributes of the person that I don't need, it will also retrieve all the children of that person and their attributes. And they might have children as well and so on... This would mean more tables would be accessed on database level than needed.
Conclusion:
I understand the advantages of Object Relational Mapping. However it would seem that in a lot of cases you will not need every attribute of a certain object. Especially in a complex system. It would seem like the advantages do not nearly justify the performance loss. I've always learned performance should be the main concern.
Can anyone please share their opinion? Maybe I am looking at it the wrong way, maybe I am using it the wrong way...
I'm not familiar with JPQL, but if you set up Hibernate correctly, it will not automatically fetch the children. Instead it will return a proxy list, which fetches the missing data transparently when it is accessed.
This will also work with simple references to other persistent objects. Hibernate will create a proxy object, containing only the ID, and load the actual data only if it is accessed. ("lazy loading")
This of course has some limitations (like persistent class hierarchies), but overall it works pretty well.
BTW, you should use the List<Person> interface to reference the children rather than a concrete implementation; I'm not sure that Hibernate can use a proxy List if you specify a specific implementation.
Update:
In the example above, Hibernate will load the attributes name, age and sex, and will create a List<Person> proxy object that initially contains no data.
Once the application calls any method of the List that requires knowledge of the data, like children.size(), or iterates over the list, the proxy will call Hibernate to read the children objects and populate the List. The children objects, being instances of Person, will also contain a proxy List<Person> of their children.
There are some optimizations Hibernate might perform in the background, like loading the children of other Person objects in the same session at the same time, since it is querying the database anyway. But whether this is done, and to what extent, is configurable per attribute.
You can also tell Hibernate to never use lazy loading for certain references or classes, if you are sure you'll need them later, or if you continue to use the persistent object once the session is closed.
Be aware that lazy loading will of course fail if the session is no longer active. If, for example, you load a Person object, don't access the children List, and close the session, a later call to children.size() will fail.
IIRC the Hibernate Session class has a method to populate all not-yet-loaded references in a persistent object, if needed.
Best read the Hibernate documentation on how to configure all this.
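As a minimal sketch of that behaviour (the mapping annotations and session handling below are assumptions, getters/setters omitted):

@Entity
public class Person {
    @Id @GeneratedValue
    private Long id;

    private String name;
    private int age;
    private char sex;

    @OneToMany(fetch = FetchType.LAZY) // the default for collections; children stays a proxy
    private List<Person> children;
}

// Inside an open Session, the query below loads only the scalar columns;
// the children list is populated on first access.
Person p = (Person) session.createQuery("from Person p where p.name = :name")
        .setParameter("name", "Hazaart")
        .uniqueResult();
int numberOfChildren = p.getChildren().size(); // triggers the lazy load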
I have an HQL query, something like:
SELECT myclass
FROM MyClass myclass
JOIN FETCH myclass.anotherset sub
JOIN FETCH sub.yetanotherset
...
So, class MyClass has a property "anotherset", which is a set containing instances of another class, let's call it MyClassTwo. And class MyClassTwo has a property "yetanotherset", which is a set of a third type of class (with no further associations on it).
In this scenario, I'm having trouble with the hashCode implementation. Basically, the hashCode implementation of MyClassTwo uses the "yetanotherset" property, and on the exact line where it accesses the yetanotherset property, it fails with a LazyInitializationException.
org.hibernate.LazyInitializationException: illegal access to loading collection
I'm guessing this is because the data from "yetanotherset" hasn't been fetched yet, but how do I fix this? I don't particularly like the idea of dumbing down the hashCode to ignore the property.
Additional question: does HQL ignore fetch=FetchType.EAGER as defined in XML or annotations? It seems like it does, but I can't verify this anywhere.
Implementing hashCode() using a mutable field is a bad idea: it makes storing the entity in a HashSet and modifying the mutable property impossible.
Implementing it in terms of a collection of other entities is an even worse idea: it forces the loading of the collection to compute the hashCode.
Choose a unique, immutable property (or set of properties) in your entity, and implement hashCode based on that. As a last resort, you have the option of using the ID, but if it's autogenerated, you must not put the entity in a Set before the ID is generated.
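For example, if MyClassTwo had a unique, immutable business key such as a code field (hypothetical here), a sketch could be (using java.util.Objects):

@Override
public int hashCode() {
    return Objects.hash(code); // immutable business key only, no collections
}

@Override
public boolean equals(Object other) {
    if (this == other) return true;
    if (!(other instanceof MyClassTwo)) return false;
    return Objects.equals(code, ((MyClassTwo) other).code);
}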
This is Hibernate's most famous exception and it is exactly as you described it: the session has been disconnected, the transaction closed, and you are attempting to access this collection. JOIN FETCH in your HQL should force eager loading to occur regardless of whether that annotation is present.
I suspect that your annotations are malformed, you have missing or out of date jars, or some other problem of that type.
Bump your Hibernate logging level up to generate the SQL (hibernate.SQL=debug) and investigate exactly what SQL is being executed up to the point where you see this exception. This should indicate whether your Hibernate configuration is behaving the way you think it's configured.
Post more of your code and the logs and someone might be able to help you spot the error.
I'm using JPA 1, Hibernate and Oracle 10.2.0 and my entities are defined like this:
@Entity
@Table(name="TERMS")
public class Term implements Serializable {

    @Id
    @GenericGenerator(name = "generator", strategy = "guid", parameters = {})
    @GeneratedValue(generator = "generator")
    @Column(name="TERM_ID")
    private String termId;
}
I have a situation where an XML representation of the Entity (and child entities) will be coming in through a web service to update/replace existing ones. My thought was to just delete the old ones and re-create it from the incoming XML.
However, doing a persist when my entities have existing IDs seems to make Hibernate very angry. So is this actually possible, or is it better to avoid deleting them and just try to do it with merge?
Angriness from hibernate:
org.hibernate.PersistentObjectException: detached entity passed to persist: com.idbs.omics.catalog.entity.Term
Thanks
My thought was to just delete the old ones and re-create them from the incoming XML. However, doing a persist when my entities have existing IDs seems to make Hibernate very angry..
Indeed, you cannot assign an id when it is supposed to be generated, at least not with Hibernate, which won't consider the entity as new but as detached (the JPA specification is a bit blurry on the exact rules in this case, but that's how Hibernate behaves; see 5.1.4.5. Assigned identifiers for more hints).
So is this actually possible or is it better to avoid deleting them and just trying to do it with merge?
To make the delete/insert possible for the web service use case, you'd have to either:
not assign the id ~or~
use a special version of the entity without a generated identifier ~or~
use bulk operations(?)
The alternative, if you're actually updating detached entities, would indeed be to use merge (but have a look at these previous questions just in case).
Which approach is better? I don't know, I think it depends on your needs. The latter seems more natural if you're updating existing entities. With the former, you'd really get "new" entities (including a new value for the optimistic locking column). Depending on the exact implementation of the process, performance might also vary. And, by the way, what about concurrency (just to mention it, I'm not really expecting an answer)?
You can use EntityManager.merge to save an updated version of the entity. Be aware that this returns a different object from the one you pass to it, because it basically fetches the entity from the database, updates the persistent properties from the object you pass, and saves the persistent object.
See http://blog.xebia.com/2009/03/23/jpa-implementation-patterns-saving-detached-entities/ for more information on this problem.
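A minimal usage sketch for the merge approach (the surrounding service method is hypothetical):

@Transactional
public Term updateTerm(Term incoming) {
    // merge copies the state of the detached instance onto a managed instance
    // and returns that managed instance; keep working with the return value.
    Term managed = entityManager.merge(incoming);
    return managed;
}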
Correct me if anything is wrong.
Now, when we use Spring DAO for ORM templates and the @Transactional attribute, we do not have control over the transaction and/or session; they are managed when the method is called externally, not within the method.
Lazy loading saves resources: fewer queries to the DB, and less memory needed to keep all the fetched collections in the application's memory.
So, if lazy=false, then everything is fetched, including all associated collections; that is not efficient if there are 10,000 records in a linked set.
Now, I have a method in a DAO class that is supposed to return me a User object.
It has collections that represent linked tables of the database.
I need to get a object by id and then query its collections.
Hibernate "failed to lazily initialize a collection" exception occurs when I try to access the linked collection that this DAO method returns.
Explain please, what is a workaround here?
Update: All right, let me ask you this. The DAO is an abstract layer, so a method getUserById(Integer id) is supposed to return an object.
What if in some cases I need these linked collections of the User object and in other situations I need other collections?
Are there only two ways:
1) lazy loading = false
2) create different methods: getUserByIdWithTheseCollections(), getUserByIdWithOtherCollections() and inside those methods use your approach?
I mean are there only 2 ways and nothing better?
Update 2: Explain please, what would the explicit use of SessionFactory give me?
How does it look in practice? We create an instance of the DAO object,
then inject it with a session factory, and this would mean that two consecutive
method calls to the DAO will run within the same transaction?
It seems to me that anyway, DAO is detached from the classes that make use of it!
The logic and transactions are encapsulated within DAO, right?
You can touch the linked collection while you're still within the transaction, to force it to load:
User user = (User) sessionFactory.getCurrentSession().get(User.class, userId);
user.getLinkedCollection().size();
return user;
As BalusC has pointed out, you can use Hibernate.initialize() instead of size(). That's a lot cleaner.
Then when you return such an entity, the lazy field is already initialized.
Replying to your PS - is using transactions at the service level (rather than the DAO level) feasible? It seems to be, as doing each DAO call in a separate transaction seems to be a waste (and may be incorrect).
I find that it's best to put @Transactional at the service layer, rather than the DAO layer. Otherwise, all your DAO calls run in separate Hibernate sessions, and all that object equality stuff won't work.
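A rough sketch of that arrangement (the class, method, and collection names are assumptions):

@Service
public class UserService {

    @Autowired
    private UserDao userDao;

    @Transactional // one session/transaction spans both calls below
    public User loadUserWithRoles(Integer id) {
        User user = userDao.getUserById(id);
        Hibernate.initialize(user.getRoles()); // still inside the transaction, so this works
        return user;
    }
}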
In my opinion the best way to solve this problem is to design the application around a session-per-request model (Open Session in View). Then, as long as the OSIV pattern is in effect, you can use an object taken from the DAO safely anywhere in the application, even in views, without bothering with this stuff. This is probably a better solution than those proposed because:
Hibernate.initialize() or size() is a very artificial workaround - what if you want a User with a different collection initialized, would you write another method for getting the user?
The service-layer transactional model is OK, but the same problem arises when you want to take an object out of the service layer and use it in a controller or view.
You could do something like the following:
public User getByUserId(Long id, String... fetch) {
    Criteria criteria = createCriteria(); // helper that creates a Criteria for the User entity
    if (fetch != null) {
        for (String fieldName : fetch) {
            criteria.setFetchMode(fieldName, FetchMode.JOIN); // fetch these associations eagerly
        }
    }
    return (User) criteria.add(Restrictions.eq("id", id)).uniqueResult();
}
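Callers can then decide per call which associations to fetch eagerly, for example (the collection names here are hypothetical):

User bare = userDao.getByUserId(42L);                        // everything stays lazy
User withEmails = userDao.getByUserId(42L, "emails");        // emails joined in eagerly
User withBoth = userDao.getByUserId(42L, "emails", "roles"); // both collections joined in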