Lazy Loading using MyBatis 3 with Java

I am using MyBatis (version 3.2.7) as the ORM framework for my Java project.
Coming from a JPA background, I was keen to explore the lazy loading supported by MyBatis, but I couldn't find anything substantial.
(I am configuring MyBatis through the Java API and using annotations solely for querying.)
As per the MyBatis documentation:
1. lazyLoadingEnabled: default value = TRUE
Globally enables or disables lazy loading. When enabled, all relations will be lazily loaded. This value can be superseded for a specific relation by using the fetchType attribute on it.
2. aggressiveLazyLoading: default value = TRUE
When enabled, an object with lazy loaded properties will be loaded entirely upon a call to any of the lazy properties. Otherwise, each property is loaded on demand.
Using these settings, I tried the following code:
a. Java classes:
Feedback.java
public class Feedback implements Serializable {
private static final long serialVersionUID = 1L;
private int id;
private String message;
/**
* while loading Feedback, I want sender object to be lazily loaded
*/
private User sender;
private boolean seen;
// getters and setters
}
User.java
public class User implements Serializable {
private static final long serialVersionUID = 1L;
private int id;
private String email;
// getters and setters
}
b. DB schema:
Feedback table
Table "public.feedback"
Column | Type | Modifiers
-------------+-----------+-------------------------------------------------------
id | integer | PRIMARY KEY
seen | boolean | not null
sender_id | integer | FOREIGN KEY (sender_id) REFERENCES users(id)
message | text |
User Table:
Table "public.users"
Column | Type | Modifiers
-------------+----------+----------------------------------------------------
id | integer | PRIMARY KEY
email | text |
c. Configuring MyBatis via the Java API:
PGSimpleDataSource dataSource = new PGSimpleDataSource();
dataSource.setServerName("localhost");
dataSource.setDatabaseName(dbName);
dataSource.setPortNumber(5432);
dataSource.setUser(new UnixSystem().getUsername());
dataSource.setPassword("");
TransactionFactory transactionFactory = new JdbcTransactionFactory();
Environment environment = new Environment(dbName, transactionFactory, dataSource);
Configuration configuration = new Configuration(environment);
configuration.addMapper(FeedbackMapper.class);
//
configuration.setAggressiveLazyLoading(false);
sqlSessionFactory = new SqlSessionFactoryBuilder().build(configuration);
d. Querying the DB, and the queries in FeedbackMapper:
d.1 Code in FeedbackMapper:
#Select("SELECT f.id, f.message, f.seen, f.sender_id FROM feedback f WHERE f.id= #{feedbackId}")
#Results(value = {
#Result(property = "id", column = "id"),
#Result(property = "sender", column = "sender_id", javaType = User.class, one = #One(select = "getUser", fetchType=FetchType.DEFAULT))
})
public Feedback getFeedback(#Param("feedbackId") int feedbackId);
#Select("SELECT id, email FROM users WHERE id=#{id}")
public User getUser(int id);
d.2: Code to invoke the queries in feedbackMapper
// setup Mybatis session factory and config
Feedback feedback =feedbackMapper.getFeedback(70000);
System.out.println(feedback);
But still the "sender" object is populated upon querying the getFeedback(id). I expect the sender object shouldn't be populated immediately but only when I call getSender() on the fetched feedback object . Please help.
My recent observations:
The MyBatis team has indeed got it wrong in their documentation, i.e. the documentation says:
lazyLoadingEnabled: default value = TRUE
aggressiveLazyLoading: default value = TRUE
but looking at their source code:
protected boolean lazyLoadingEnabled = false;
protected boolean aggressiveLazyLoading = true;
However, even with that corrected, the results are not affected and lazy loading isn't working :(

I think I found a way to enable lazy loading (though I'm not 100% sure):
The MyBatis documentation lists the following setting under configuration:
Setting : lazyLoadTriggerMethods
Description : Specifies which Object's methods trigger a lazy load
Valid Values : A method name list separated by commas
Default: equals,clone,hashCode,toString
The source code matches the documentation here:
public class Configuration {
// other attributes + methods
protected Set<String> lazyLoadTriggerMethods = new HashSet<String>(
        Arrays.asList(new String[] { "equals", "clone", "hashCode", "toString" }));
}
I have changed the MyBatis configuration as follows:
TransactionFactory transactionFactory = new JdbcTransactionFactory();
Environment environment = new Environment(dbName, transactionFactory, dataSource);
Configuration configuration = new Configuration(environment);
/*
 * This is why LAZY LOADING is working now
 */
configuration.getLazyLoadTriggerMethods().clear();
configuration.setLazyLoadingEnabled(true);
configuration.setAggressiveLazyLoading(false);
The queries in the mapper (mostly unchanged):
#Select("SELECT id, message, seen, sender_id
FROM feedback WHERE f.id= #{feedbackId}")
#Results(value = {
#Result(property = "id", column = "id"),
#Result(property = "sender", column = "sender_id", javaType = User.class, one = #One(select = "getUser"))
// Set fetchType as DEFAULT or LAZY or don't set at all-- lazy loading takes place
// Set fetchType as EAGER --sender Object is loaded immediately
})
public Feedback getFeedback(#Param("feedbackId") int feedbackId);
#Select("SELECT id, email FROM users WHERE id=#{id}")
public User getUser(int id);
Java code to invoke the mapper:
FeedbackMapper mapper = sqlSession.getMapper(FeedbackMapper.class);
Feedback feedback =mapper.getFeedback(69999);
System.out.println("1. Feedback object before sender lazily load: \n"+ feedback);
System.out.println("2. Sender loaded explicitly \n" +feedback.getSender());
System.out.println("3. Feedback object after sender loading \n" + feedback);
Output of the code:
1. Feedback object before sender lazily load:
{id : 69999, message : message123, sender : null, seen : false}
2. Sender loaded explicitly
{id : 65538, email : hemant@gmail.com}
3. Feedback object after sender loading:
{id : 69999, message : message123, sender : {id : 65538, email : hemant@gmail.com}, seen : false}
Though this works satisfactorily once configuration.getLazyLoadTriggerMethods().clear() is called, due to the lack of documentation in MyBatis I'm not sure whether this approach comes with any drawbacks.
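If clearing the whole set feels too drastic, a narrower variant (just a sketch using the same Configuration API as above) is to remove only the method you actually call on the entity, for example toString:
// keep equals/hashCode/clone as lazy-load triggers, but stop toString() from forcing a load
configuration.getLazyLoadTriggerMethods().remove("toString");
With that, printing the Feedback no longer fetches the sender, while code relying on equals/hashCode still behaves as before.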

UPDATE
I looked into the source code, and the problem is that the Configuration class does not reflect the doc.
In configuration class, lazy load is disabled by default. This changed in commit f8ddba364092d819f100e0e8f7dec677c777d588, but the doc was not updated to reflect the change.
protected boolean lazyLoadingEnabled = false;
I filed a bug report: https://github.com/mybatis/mybatis-3/issues/214.
For now, add configuration.setLazyLoadingEnabled(true) to enable lazy loading.
Old answer:
The documentation is incorrect. When aggressiveLazyLoading is true, all lazy properties are loaded after any method call on the object.
So calling feedback.toString() will fetch the Feedback's sender property.
You should set aggressiveLazyLoading to false to achieve what you want.

I think it's not easy to verify lazy loading in MyBatis by printing.
We can use configuration.getLazyLoadTriggerMethods().clear(); to remove the default trigger methods, as in the previous answer. But when we print the object, or call toString(), the getters still get invoked, so lazy loading is still triggered and more selects are issued. That means we can't really observe the lazy-loading behaviour just by debugging or printing.
I found a way to verify this function:
Set the log level to DEBUG, write the following code, and watch the console:
log.info("user :{}", userLazyDepartment);
log.info("user :{}", userLazyDepartment.getDepartment());

Related

Unnecessary update by Hibernate

I have an entity, for example Manager, with a JSON field that contains a list of clients.
@Entity
@TypeDefs({@TypeDef(name = "json", typeClass = JsonBinaryType.class)})
public class Manager {
    @Id
    private String managerId;
    @Type(type = "json")
    @Column(columnDefinition = "jsonb", name = "json")
    private SomeJsonDto json;
    ...
}
I also have a custom class Context, where I load some entities from the DB via a JpaRepository and put them into a Map.
public class Context{
private Map<String, Manager> managersFromDBMap;
}
After that, I need to create a new Manager if it doesn't exist and add some clients to its list of clients.
First step: I create the new Manager, call em.persist (em is annotated with @PersistenceContext) and put that object into managersFromDBMap in Context.
Second step: I get that manager from managersFromDBMap and add some clients to the list.
I do all of these steps in a @Transactional method and expect that at the end of the transaction the changes are flushed with only one INSERT into the DB (the new manager with its clients). But I see one INSERT (creating the manager) and one UPDATE (updating that manager with the clients):
Hibernate: insert into manager ...
Hibernate: update manager ...
I don't understand why Hibernate makes two statements against the DB when all my changes were made to one entity in one transaction. Am I doing something wrong, or will Hibernate always issue two statements?
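For reference, a minimal sketch of the flow being described (method and helper names here are assumed, not taken from the original post):
@Transactional
public void createManagerWithClients(String managerId, List<Client> clients) {
    // step 1: create the new Manager and persist it
    Manager manager = new Manager();
    manager.setManagerId(managerId);
    manager.setJson(new SomeJsonDto());
    em.persist(manager);                                   // em injected via @PersistenceContext
    context.getManagersFromDBMap().put(managerId, manager);

    // step 2: later in the same transaction, fetch it from the map and add the clients
    Manager sameManager = context.getManagersFromDBMap().get(managerId);
    sameManager.getJson().getClients().addAll(clients);

    // at flush time the post reports one INSERT followed by one UPDATE
}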

How to set a request field automatically in a Spring Boot project

My controller accepts an object and saves it. The object has a field named addTime, and currently I have to set this field myself. How can I get Spring Boot to do this automatically?
@PostMapping("/HotelVersionDistribute/apply")
@Override
public Result<Boolean> apply(@RequestBody HotelVersionDistribute entity) {
    // I don't want to set this manually, but I have no other idea
    entity.setAddTime(LocalDateTime.now());
    HotelClientVersion hotelClientVersion = hotelClientVersionMapper.selectById(entity.getVersionId());
    if (hotelClientVersion == null) {
        log.warn("version_id={} does not exist", entity.getVersionId());
        return Result.error(Result.CODE_REASOURCE_NOT_EXIST, "版本未找到");
    }
    saveApply(entity, hotelClientVersion);
    return Result.success(true);
}
In general, such columns (createdBy, createdDate, updatedBy, updatedDate), like your addTime, are called audit columns.
Spring provides auditing support for Spring Data / Spring Data JPA. The question doesn't mention which you use, but assuming it is one of those, please read through the documentation and implement your requirement.
From the documentation: Auditing
Spring Data provides sophisticated support to transparently keep track of who created or changed an entity and when the change happened. To benefit from that functionality, you have to equip your entity classes with auditing metadata that can be defined either using annotations or by implementing an interface.
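As a rough sketch of what that looks like with Spring Data JPA (assuming the entity is a JPA entity saved through a Spring Data repository, and auditing is not already enabled elsewhere):
import java.time.LocalDateTime;
import javax.persistence.Entity;
import javax.persistence.EntityListeners;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.annotation.CreatedDate;
import org.springframework.data.jpa.domain.support.AuditingEntityListener;
import org.springframework.data.jpa.repository.config.EnableJpaAuditing;

// 1) switch auditing on once, in any configuration class
@Configuration
@EnableJpaAuditing
class JpaAuditingConfig { }

// 2) equip the entity with auditing metadata
@Entity
@EntityListeners(AuditingEntityListener.class)
class HotelVersionDistribute {

    @Id
    @GeneratedValue
    private Long id;

    @CreatedDate
    private LocalDateTime addTime;   // populated automatically when the entity is first saved

    // other fields, getters and setters
}
With that in place the controller no longer needs to call entity.setAddTime(LocalDateTime.now()).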
Two ways:
Set the value directly in the class:
public class HotelVersionDistribute {
private LocalDateTime addTime = LocalDateTime.now();
...
}
Use MySQL's default value mechanism, and mark the column as neither insertable nor updatable:
public class HotelVersionDistribute {
@Column(insertable = false, updatable = false)
private LocalDateTime addTime;
...
}

Dynamically Add Fields to Hibernate Entity at Runtime

I have an application with custom fields - users are basically able to define a custom field by selecting a type for the field and giving it a name. The custom fields are then presented as being part of an entity, and data given to these fields is saved to my database. In most circumstances I've been able to handle these programmatically and through the normal Hibernate mappings (i.e., an @OneToMany annotated collection) without a problem. I'm currently faced with a problem, however. We'd like to have these custom fields and their values used for real-time reporting on the "parent" entities. Custom field values are mapped as collections inside the parent entities, but I need them flat for reporting purposes. I have created a view that provides exactly what I need from the SQL side of things - I followed this example to add dynamic pivoting, and the resulting query is precisely how I'd like to display my information (the example's screenshots aren't reproduced here, but that's essentially the output I have).
The view returns a completely dynamic number of columns, each named for a custom field and populated with the relevant data for that row.
The problem is that I now have no idea how to retrieve this information with Hibernate.
I found documentation for updating the PersistentClass by getting the ClassMappings from the Hibernate Configuration:
Manipulating metadata at runtime
//Get the existing mapping for AgreementsGrid from Configuration
PersistentClass gridMapping = configuration.getClassMapping(AgreementsGrid.class.getName());
//Define new Column
Column column = new Column();
column.setName("ESTIMATED_COST_OVERRUNS");
column.setNullable(true);
column.setUnique(false);
gridMapping.getTable().addColumn(column);
//Wrap the column in a value
SimpleValue value = new SimpleValue();
value.setTable(gridMapping.getTable());
value.setTypeName("string");
value.addColumn(column);
//Define new property for the AgreementsGrid class
Property prop = new Property();
prop.setValue(value);
prop.setName("customField1");
prop.setNodeName(prop.getName());
gridMapping.addProperty(prop);
//Build a new session factory for the new mapping
SessionFactory sessionFactory = configuration.buildSessionFactory();
I've only just realized that this is for Hibernate 3 & 4, and isn't even possible in Hibernate 5 (I'm using 5.2.18).
So, I'm trying to figure out how to handle this in Hibernate 5. I have a base entity mapped to a view, and at runtime I need to be able to dynamically add "fields" to it, so that my DAOs can dynamically filter the information and handle sorts/grouping.
Here is the entity I have for my view:
@Entity
@Table(name = "AGREEMENTS_GRID")
public class AgreementsGrid implements Serializable {
    private static final long serialVersionUID = 1L;

    private Integer entityId;

    @Column(name = "ENTITY_ID")
    @Id
    public Integer getEntityId() {
        return this.entityId;
    }
    public void setEntityId(Integer entityId) {
        this.entityId = entityId;
    }

    private Agreements agreement;

    @ManyToOne
    @JoinColumn(name = "AGREEMENT_ID", referencedColumnName = "ID", nullable = false)
    public Agreements getAgreement() {
        return this.agreement;
    }
    public void setAgreement(Agreements agreement) {
        this.agreement = agreement;
    }

    private BigDecimal expenditure;

    @Column(name = "EXPENDITURE", nullable = true, precision = 22, scale = 2)
    public BigDecimal getExpenditure() {
        return this.expenditure;
    }
    public void setExpenditure(BigDecimal expenditure) {
        this.expenditure = expenditure;
    }

    /*
     * Dynamic fields would theoretically go here and look like this,
     * for a custom field of type CURRENCY named 'Estimated Cost Overruns'
     */
    /*
    private BigDecimal customField1;

    @Column(name = "ESTIMATED_COST_OVERRUNS", nullable = true, precision = 22, scale = 2)
    public BigDecimal getCustomField1() {
        return this.customField1;
    }
    public void setCustomField1(BigDecimal customField1) {
        this.customField1 = customField1;
    }
    */
}
Just to be clear, I cannot map these fields at compile time. They are purely custom and are defined entirely by users. At runtime I will know which custom fields exist, so I could loop through them and add them (as I hoped to do with the add-column approach seen above), but I cannot know them before deployment. The custom fields are also subject to change at any moment.
For Hibernate 5 you should build the Metadata via the StandardServiceRegistry, add the property, and then build the SessionFactory from that Metadata (native bootstrapping). Something like this:
public SessionFactory buildSessionFactory(LocalSessionFactoryBuilder sessionFactoryBuilder) {
StandardServiceRegistryBuilder registryBuilder = new StandardServiceRegistryBuilder();
registryBuilder.applySettings(sessionFactoryBuilder.getProperties());
Metadata metaData = getMetadataSources().buildMetadata(registryBuilder.build());
PersistentClass gridMapping = metaData.getEntityBinding(AgreementsGrid.class.getName());
Column column = new Column();
...
Property prop = new Property();
...
gridMapping.addProperty(prop);
SessionFactory sessionFactory = metaData.buildSessionFactory();
return sessionFactory;
}

Updating with Morphia Optimistic locking

Hi, consider the following example:
Resource:
@PUT
@Path("{id}")
public Response update(@PathParam(value = "id") final String id, final Person person) {
    // load the current state under a different name so it doesn't clash with the 'person' parameter
    final Person existingPerson = service.getPerson(id);
    final EntityTag etag = new EntityTag(Integer.toString(existingPerson.hashCode()));
    // If-Match is required
    ResponseBuilder builder = request.evaluatePreconditions(etag);
    if (builder != null) {
        throw new DataHasChangedException("Person data has changed: " + id);
    }
    service.updatePerson(id, person.getName());
    ....
}
Service:
public void updatePerson(final String id, final String name) {
final Query<Person> findQuery = morphiaDataStore.createQuery(Person.class).filter("id ==", id);
UpdateOperations<Person> operation = morphiaDataStore.createUpdateOperations(Person.class).set("name", name);
morphiaDataStore.findAndModify(findQuery, operation );
}
Person:
#Entity("person")
public class Person {
#Id
private ObjectId id;
#Version
private Long version;
private String name;
...
}
I do check whether the provided ETag matches the person in the database. However, this check is done in the resource itself. I don't think this is safe, since the update happens after the check and another thread could have gone through the check in the meantime. How can this be solved correctly? Any example or advice is appreciated.
Morphia already implements optimistic locking via the @Version annotation.
http://mongodb.github.io/morphia/1.3/guides/annotations/#version
@Version marks a field in an entity to control optimistic locking. If the versions change in the database while modifying an entity (including deletes) a ConcurrentModificationException will be thrown. This field will be automatically managed for you – there is no need to set a value and you should not do so. If another name beside the Java field name is desired, a name can be passed to this annotation to change the document’s field name.
I see you already use the annotation in your example. Make sure the clients include the version of the document as part of the request, so you can also pass it on to Morphia.
I'm not sure whether findAndModify handles it (I would think it does), but I'm sure save does.
Assuming the person object contains the new name and the version the client was looking at, you can directly do something like this to update the record:
morphiaDataStore.save(person);
If there was another save before this client's update, the versions will no longer match and a ConcurrentModificationException will be thrown with this message:
Entity of class %s (id='%s',version='%d') was concurrently updated
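A hedged sketch of what the service could then look like (it assumes the incoming Person carries the client's version, and reuses the DataHasChangedException from the resource above):
public void updatePerson(final Person person) {
    try {
        // save() compares the @Version field; a stale version makes Morphia throw
        // instead of silently overwriting a newer document
        morphiaDataStore.save(person);
    } catch (java.util.ConcurrentModificationException e) {
        // surface it to the resource layer, which can map it to an HTTP 409/412
        throw new DataHasChangedException("Person data has changed: " + person.getName());
    }
}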

hibernate auto_increment getter/setter

I have a class which is mapped to a table using Hibernate's auto-increment annotations. This class works fine when I set values and save it to the database, and I get a correctly updated value in the table.
But the issue is that when I create a new object of this class and try to get the id, it returns 0 instead of the auto-incremented id.
The code of the class is
#Entity(name="babies")
public class Baby implements DBHelper{
private int babyID;
#Id
#Column(name="babyID", unique=true, nullable= false)
#GeneratedValue(strategy = GenerationType.AUTO)
public int getBabyID() {
return babyID;
}
public void setBabyID(int babyID) {
this.babyID = babyID;
}
}
The code I use to get the persistent value is
Baby baby = new Baby();
System.out.println("BABY ID = "+baby.getBabyID());
This returns me a
BABY ID = 0
Any pointers would be appreciated.
Thanks,
Sana.
Hibernate only generates the id after an entity becomes persistent, i.e. after you have saved it to the database. Before that, the object is in the transient state. Here is an article about the Hibernate object states and lifecycle.
The ID is set by Hibernate when the object is saved and becomes persistent.
The annotations only tell Hibernate how it should treat the class, property or method they refer to.
Besides, if you set the id value yourself, how would Hibernate know whether it should insert a new row or update an existing one?
So this is normal, expected behavior.
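To illustrate (a hedged sketch with simplified session handling; sessionFactory is assumed to be configured elsewhere): the generated id only appears on the object after it has actually been saved.
Baby baby = new Baby();
System.out.println("BABY ID = " + baby.getBabyID());   // 0 -- the object is still transient

Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();
session.save(baby);                                     // Hibernate assigns the generated id here
tx.commit();
session.close();

System.out.println("BABY ID = " + baby.getBabyID());   // now the auto-incremented value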
