I have an entity, for example Manager, with a JSON field that contains a list of clients.
@Entity
@TypeDefs({@TypeDef(name = "json", typeClass = JsonBinaryType.class)})
public class Manager {

    @Id
    private String managerId;

    @Type(type = "json")
    @Column(columnDefinition = "jsonb", name = "json")
    private SomeJsonDto json;

    ...
}
Also, I have a custom class Context, where I load some entities from the DB via a JpaRepository and put them into a Map.
public class Context {
    private Map<String, Manager> managersFromDBMap;
}
After that, I need to create a new Manager if it does not exist and add a client to its list of clients.
First step: I create a new Manager, call em.persist() on it (em is injected via @PersistenceContext), and put the object into managersFromDBMap in the Context.
Second step: I get that manager back from managersFromDBMap and add some clients to its list.
All of this happens in one @Transactional method, so I expect the changes to be flushed at the end of the transaction as a single INSERT (the new manager together with its clients). Instead I see one insert (creating the manager) and one update (adding the clients):
Hibernate: insert into manager ...
Hibernate: update manager ...
I don't understand why Hibernate issues two statements against the DB when all my changes were made to one entity in one transaction. Am I doing something wrong, or will Hibernate always issue two statements here?
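For reference, a minimal sketch of the flow described above (the Context accessor and the SomeJsonDto/ClientDto methods are illustrative, not the exact code):

@Transactional
public void addClientToManager(Context context, String managerId, ClientDto client) {
    // step 1: create and persist the new manager, then cache it in the context map
    Manager manager = new Manager();
    manager.setManagerId(managerId);
    manager.setJson(new SomeJsonDto());                 // starts with an empty client list
    em.persist(manager);                                // em injected via @PersistenceContext
    context.getManagersFromDBMap().put(managerId, manager);

    // step 2: later in the same transaction, take the manager from the map and mutate the JSON field
    Manager cached = context.getManagersFromDBMap().get(managerId);
    cached.getJson().getClients().add(client);          // picked up by dirty checking at flush
}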
Related
I am new to Spring Data JPA. I have a scenario where I have to create an entity if it does not exist, or update it based on the non-primary-key column name. Below is the code I wrote to create new entities. It works fine, but if a record already exists it creates a duplicate. How do I write a method that updates the record if it exists? I usually get a list of records from the client.
@Override
@Transactional
public String createNewEntity(List<Transaction> transactions) {
    // saveAll inserts every element; it does not check whether a row with the same name already exists
    transactionRepository.saveAll(transactions);
    return "success";
}
In your Transaction entity, mark the name column as unique:
@Entity
public class Transaction {
    ...
    @Column(name = "name", unique = true)
    private String name;
    ...
}
Then you won't be able to insert duplicate values into the name column.
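With that constraint in place, a duplicate insert fails when the statement is flushed; with Spring Data this usually surfaces as a DataIntegrityViolationException, which you can handle if needed (a sketch, not part of the original answer):

try {
    transactionRepository.saveAll(transactions);
} catch (DataIntegrityViolationException e) {
    // a row with the same name already exists -- decide whether to skip it or update it instead
}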
First, here is what a composite key means (from Google):
A composite key is a combination of two or more columns in a table that can be used to uniquely identify each row. When the columns are combined, uniqueness is guaranteed, but taken individually they do not guarantee uniqueness.
A composite key combined with a unique key is a waste.
If you want to update an entity via JPA, you need a key so it can tell whether the entity already exists. This is what Spring Data JPA's save() does internally:
@Transactional
public <S extends T> S save(S entity) {
    if (this.entityInformation.isNew(entity)) {
        this.em.persist(entity);
        return entity;
    } else {
        return this.em.merge(entity);
    }
}
There are two ways to handle your problem.
If you cannot get the id from the client when updating, the id has lost its original function. In that case, remove the @Id annotation from your id field, put @Id on name instead, and do not use auto-generation for it.
Alternatively, I think what you want is @Column(unique = true, nullable = false) on your name field.
And this is how you would update an existing record:
Transaction t = transactionRepository.findByName(name);
t.set.... //your update
transactionRepository.save(t);
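Putting it together, a minimal sketch of the "update if it exists, otherwise create" flow for a list of incoming records (assuming the repository declares a findByName(String name) query method; the copied fields are illustrative):

@Override
@Transactional
public void createOrUpdate(List<Transaction> incoming) {
    for (Transaction t : incoming) {
        Transaction existing = transactionRepository.findByName(t.getName());
        if (existing == null) {
            transactionRepository.save(t);          // no record with this name yet -> insert
        } else {
            existing.setAmount(t.getAmount());      // copy over the updatable fields (illustrative)
            transactionRepository.save(existing);   // existing id -> update
        }
    }
}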
I have the following entity:
class User {
    ..
    ...
    @OneToMany(cascade = { CascadeType.ALL }, fetch = FetchType.LAZY, mappedBy = "id.uKey")
    @MapKey(name = "id.achieveId")
    private Map<Integer, Achievement> achievements;
    ..
    ..
}
At some point I call:
Hibernate.initialize(user.getAchievements()); // user is the owning entity
and the map is filled with entries from the DB.
As the app continues, I save new entries into the DB table,
and then I try to access the map, but it doesn't contain the new entries.
Is there a way to make Hibernate aware that it needs to re-select from the DB table?
Thanks
EDIT:
I add new entries like this:
public void save() {
..
tx = dbs.beginTransaction();
Achievement ua = new Achievement(key, id);
dbs.save(ua);
tx.commit();
}
Once you have initialized your object it resides in the Hibernate session, and the session is not designed to pick up changes made in the underlying database behind its back. Just reload the object from the database, but remove it from the session before doing so.
But maybe what you really want is to modify the map contained in your object; that would be the OOP way. After that you could persist the entire object with Hibernate.
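A minimal sketch of both options, assuming dbs is the Hibernate Session from the question and user is the already loaded entity (the getter names are illustrative):

// option 1: drop the stale instance and reload it from the database
dbs.evict(user);                                      // or dbs.refresh(user) to re-read it in place
user = (User) dbs.get(User.class, user.getId());      // the achievements map is selected again on access

// option 2 (the OOP way): keep the in-memory map in sync yourself
Achievement ua = new Achievement(key, id);
user.getAchievements().put(ua.getAchieveId(), ua);    // mutate the mapped collection
dbs.save(ua);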
I am using MyBatis (version 3.2.7) as the ORM framework for my Java project.
Coming from a JPA background, I was keen to explore the lazy loading supported by MyBatis.
But I couldn't find anything substantial.
(I am configuring MyBatis using the Java API and annotations, solely for querying.)
As per the MyBatis documentation:
1. lazyLoadingEnabled (default value = TRUE): Globally enables or disables lazy loading. When enabled, all relations will be lazily loaded. This value can be superseded for a specific relation by using the fetchType attribute on it.
2. aggressiveLazyLoading (default value = TRUE): When enabled, an object with lazy loaded properties will be loaded entirely upon a call to any of the lazy properties. Otherwise, each property is loaded on demand.
Using these settings, I tried the following code:
a. Java classes:
Feedback.java
public class Feedback implements Serializable {

    private static final long serialVersionUID = 1L;

    private int id;
    private String message;

    /**
     * While loading Feedback, I want the sender object to be lazily loaded.
     */
    private User sender;
    private boolean seen;

    // getters and setters
}
User.java
public class User implements Serializable {

    private static final long serialVersionUID = 1L;

    private int id;
    private String email;

    // getters and setters
}
b. DB schema:
Feedback table
Table "public.feedback"
Column | Type | Modifiers
-------------+-----------+-------------------------------------------------------
id | integer | PRIMARY KEY
seen | boolean | not null
sender_id | integer | FOREIGN KEY (sender_id) REFERENCES users(id)
message | text |
User Table:
Table "public.users"
Column | Type | Modifiers
-------------+----------+----------------------------------------------------
id | integer | PRIMARY KEY
email | text |
c. Configuring MyBatis via the Java API:
DataSource dataSource = new PGSimpleDataSource();
((PGSimpleDataSource) dataSource).setServerName("localhost");
((PGSimpleDataSource) dataSource).setDatabaseName(dbName);
((PGSimpleDataSource) dataSource).setPortNumber(5432);
((PGSimpleDataSource) dataSource).setUser(new UnixSystem().getUsername());
((PGSimpleDataSource) dataSource).setPassword("");
TransactionFactory transactionFactory = new JdbcTransactionFactory();
Environment environment = new Environment(dbName, transactionFactory, dataSource);
Configuration configuration = new Configuration(environment);
configuration.addMapper(FeedbackMapper.class);
//
configuration.setAggressiveLazyLoading(false);
sqlSessionFactory = new SqlSessionFactoryBuilder().build(configuration);
d. Querying the DB and the queries in FeedbackMapper:
d.1 Code in FeedbackMapper:
#Select("SELECT f.id, f.message, f.seen, f.sender_id FROM feedback f WHERE f.id= #{feedbackId}")
#Results(value = {
#Result(property = "id", column = "id"),
#Result(property = "sender", column = "sender_id", javaType = User.class, one = #One(select = "getUser", fetchType=FetchType.DEFAULT))
})
public Feedback getFeedback(#Param("feedbackId") int feedbackId);
#Select("SELECT id, email FROM users WHERE id=#{id}")
public User getUser(int id);
d.2: Code to invoke the queries in feedbackMapper
// setup Mybatis session factory and config
Feedback feedback =feedbackMapper.getFeedback(70000);
System.out.println(feedback);
But the sender object is still populated as soon as getFeedback(id) is executed. I expect the sender object not to be populated immediately, but only when I call getSender() on the fetched Feedback object. Please help.
My recent observations:
The MyBatis team has indeed got it wrong in their documentation, i.e. the documentation says:
lazyLoadingEnabled: default value=TRUE
aggressiveLazyLoading : default value=TRUE
But looking at their source code:
protected boolean lazyLoadingEnabled = false;
protected boolean aggressiveLazyLoading = true;
However, even with that corrected, the results are unaffected and lazy loading still isn't working.
I think I found a way to enable lazy loading (though I'm not a hundred percent sure):
The MyBatis documentation has the following setting in its configuration section:
Setting : lazyLoadTriggerMethods
Description : Specifies which Object's methods trigger a lazy load
Valid Values : A method name list separated by commas
Default: equals,clone,hashCode,toString
As per the source code, this maps properly to what is given in the documentation:
public class Configuration {
    // other attributes + methods
    protected Set<String> lazyLoadTriggerMethods = new HashSet<String>(
            Arrays.asList(new String[] { "equals", "clone", "hashCode", "toString" }));
}
I have changed the MyBatis configuration as follows:
TransactionFactory transactionFactory = new JdbcTransactionFactory();
Environment environment = new Environment(dbName, transactionFactory, dataSource);
Configuration configuration = new Configuration(environment);
/**
 * This is why LAZY LOADING is working now
 */
configuration.getLazyLoadTriggerMethods().clear();
///////////////////////////////////////////////////
configuration.setLazyLoadingEnabled(true);
configuration.setAggressiveLazyLoading(false);
The queries in the mapper (mostly unchanged):
#Select("SELECT id, message, seen, sender_id
FROM feedback WHERE f.id= #{feedbackId}")
#Results(value = {
#Result(property = "id", column = "id"),
#Result(property = "sender", column = "sender_id", javaType = User.class, one = #One(select = "getUser"))
// Set fetchType as DEFAULT or LAZY or don't set at all-- lazy loading takes place
// Set fetchType as EAGER --sender Object is loaded immediately
})
public Feedback getFeedback(#Param("feedbackId") int feedbackId);
#Select("SELECT id, email FROM users WHERE id=#{id}")
public User getUser(int id);
Java code to invoke the mapper:
FeedbackMapper mapper = sqlSession.getMapper(FeedbackMapper.class);
Feedback feedback = mapper.getFeedback(69999);
System.out.println("1. Feedback object before sender lazily load: \n" + feedback);
System.out.println("2. Sender loaded explicitly \n" + feedback.getSender());
System.out.println("3. Feedback object after sender loading \n" + feedback);
Output of the CODE
1. Feedback object before sender lazily load:
{id : 69999, message : message123, sender : null, seen : false}
2. Sender loaded explicitly
{id : 65538, email : hemant@gmail.com}
3. Feedback object after sender loading:
{id : 69999, message : message123, sender : {id : 65538, email : hemant@gmail.com},
seen : false}
Though this works satisfactorily after calling
configuration.getLazyLoadTriggerMethods().clear();
given the lack of documentation in MyBatis I'm not sure whether this approach has any drawbacks.
UPDATE
I looked into the source code, and the problem is that the Configuration class does not reflect the doc.
In the Configuration class, lazy loading is disabled by default. This changed in commit f8ddba364092d819f100e0e8f7dec677c777d588, but the doc was not updated to reflect the change.
protected boolean lazyLoadingEnabled = false;
I filed a bug report: https://github.com/mybatis/mybatis-3/issues/214.
For now, add configuration.setLazyLoadingEnabled(true) to enable lazy loading.
Old answer:
The documentation is incorrect. When aggressiveLazyLoading is true, all lazy properties are loaded after any method call on the object.
So calling feedback.toString() will fetch the Feedback's sender property.
You should set aggressiveLazyLoading to false to achieve what you want.
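With the Java API used above, that combination looks like this (a short sketch):

configuration.setLazyLoadingEnabled(true);      // off by default in the source, despite the docs
configuration.setAggressiveLazyLoading(false);  // load each lazy property only when it is actually accessed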
I think it is not easy to use printing to verify lazy loading in MyBatis.
We can use configuration.getLazyLoadTriggerMethods().clear(); to remove the default trigger methods, as in the previous answer. But when we print the object, or call toString, it still calls getXXX, so it still triggers lazy loading and issues extra selects. That means we can't rely on debugging or printing to observe how lazy loading proceeds.
I found a way to verify this behaviour:
Set the log level to debug.
Write the following code and watch the console:
log.info("user :{}", userLazyDepartment);
log.info("user :{}", userLazyDepartment.getDepartment());
I have a bidirectional one-to-many relationship.
0 or 1 client <-> List of 0 or more product orders.
That relationship should be set or unset on both entities:
On the client side, I want to set the list of product orders assigned to the client; the client should then be set/unset on the chosen orders automatically.
On the product order side, I want to set the client to which the order is assigned; that product order should then be removed from its previously assigned client's list and added to the newly assigned client's list.
I want to use pure JPA 2.0 annotations and a single "merge" call to the entity manager (with cascade options). I've tried the following code, but it doesn't work (I use EclipseLink 2.2.0 as the persistence provider):
@Entity
public class Client implements Serializable {

    @OneToMany(mappedBy = "client", cascade = CascadeType.ALL)
    private List<ProductOrder> orders = new ArrayList<>();

    public void setOrders(List<ProductOrder> orders) {
        for (ProductOrder order : this.orders) {
            order.unsetClient();
            // don't use order.setClient(null);
            // (ConcurrentModificationEx on array)
            // TODO doesn't work!
        }
        for (ProductOrder order : orders) {
            order.setClient(this);
        }
        this.orders = orders;
    }

    // other fields / getters / setters
}
@Entity
public class ProductOrder implements Serializable {

    @ManyToOne(cascade = CascadeType.ALL)
    private Client client;

    public void setClient(Client client) {
        // remove from previous client
        if (this.client != null) {
            this.client.getOrders().remove(this);
        }
        this.client = client;
        // add to new client
        if (client != null && !client.getOrders().contains(this)) {
            client.getOrders().add(this);
        }
    }

    public void unsetClient() {
        client = null;
    }

    // other fields / getters / setters
}
Facade code for persisting client:
// call setters on entity by JSF frontend...
getEntityManager().merge(client);
Facade code for persisting product order:
// call setters on entity by JSF frontend...
getEntityManager().merge(productOrder);
When changing the client assignment on the order side, it works well: On the client side, the order gets removed from the previous client's list and is added to the new client's list (if re-assigned).
BUT when changing on the client side, I can only add orders (on the order side, the assignment to the new client is performed); it simply ignores it when I remove orders from the client's list (after saving and refreshing, they are still in the list on the client side, and on the order side they are also still assigned to the previous client).
Just to clarify, I DO NOT want to use a "delete orphan" option: when removing an order from the list, it should not be deleted from the database, but its client assignment should be updated (that is, set to null), as defined in the Client#setOrders method. How can this be achieved?
EDIT: Thanks to the help I received here, I was able to fix this problem. See my solution below:
The client ("One" / "owned" side) stores the orders that have been modified in a temporary field.
@Entity
public class Client implements Serializable, EntityContainer {

    @OneToMany(mappedBy = "client", cascade = CascadeType.ALL)
    private List<ProductOrder> orders = new ArrayList<>();

    @Transient
    private List<ProductOrder> modifiedOrders = new ArrayList<>();

    public void setOrders(List<ProductOrder> orders) {
        if (orders == null) {
            orders = new ArrayList<>();
        }
        modifiedOrders = new ArrayList<>();
        for (ProductOrder order : this.orders) {
            order.unsetClient();
            modifiedOrders.add(order);
            // don't use order.setClient(null);
            // (ConcurrentModificationEx on array)
        }
        for (ProductOrder order : orders) {
            order.setClient(this);
            modifiedOrders.add(order);
        }
        this.orders = orders;
    }

    @Override // defined by my EntityContainer interface
    public List getContainedEntities() {
        return modifiedOrders;
    }
}
On the facade, when persisting, it checks whether there are any contained entities that must be merged as well. Note that I used an interface to encapsulate this logic because my facade is generic.
// call setters on entity by JSF frontend...
getEntityManager().merge(entity);
if (entity instanceof EntityContainer) {
EntityContainer entityContainer = (EntityContainer) entity;
for (Object childEntity : entityContainer.getContainedEntities()) {
getEntityManager().merge(childEntity);
}
}
JPA does not do this, and as far as I know no JPA implementation does it either. JPA requires you to manage both sides of the relationship. When only one side of the relationship is updated, this is sometimes referred to as "object corruption".
JPA does define an "owning" side in a two-way relationship (for a OneToMany this is the side that does NOT have the mappedBy attribute), which it uses to resolve the conflict when persisting to the database (there is only one representation of this relationship in the database compared to the two in memory, so a resolution must be made). This is why changes to the ProductOrder class are realized but not changes to the Client class.
Even with the "owning" relationship you should always update both sides. Relying on updating only one side often gets people in trouble when they turn on the second-level cache. In JPA the conflicts mentioned above are only resolved when an object is persisted and reloaded from the database. Once the second-level cache is turned on, that may be several transactions down the road, and in the meantime you'll be dealing with a corrupted object.
You also have to merge the Orders that you removed; merging only the Client is not enough.
The issue is that although you are changing the Orders that were removed, you are never sending these orders to the server and never calling merge on them, so there is no way for your changes to be reflected.
You need to call merge on each Order that you remove. Or process your changes locally, so you don't need to serialize or merge any objects.
EclipseLink does have a bidirectional relationship maintenance feature which may work for you in this case, but it is not part of JPA.
Another possible solution is to add a new property to your ProductOrder; I named it detached in the following example.
When you want to detach the order from the client you can use a callback on the order itself:
@Entity
public class ProductOrder implements Serializable {
    /*...*/

    // in your case this could probably be @Transient
    private boolean detached;

    @PreUpdate
    public void detachFromClient() {
        if (this.detached) {
            client.getOrders().remove(this);
            client = null;
        }
    }
}
Instead of removing the orders you want to detach, you set detached to true on them. When you then merge and flush the client, the entity manager will detect the modified order and execute the @PreUpdate callback, effectively detaching the order from the client.
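Usage would then look roughly like this (a sketch that simply follows the description above; the setter for the detached flag is assumed):

// instead of removing the order from client.getOrders(), flag it
orderToDetach.setDetached(true);
getEntityManager().merge(client);   // cascades to the orders; @PreUpdate then clears the client reference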
I have a class that is mapped to a table using the Hibernate auto-increment annotations. This class works fine when I set values and save it to the database, and I get a correctly updated row in the table.
But when I create a new object of this class and try to get the id, it returns 0 instead of the auto-incremented id.
The code of the class is
#Entity(name="babies")
public class Baby implements DBHelper{
private int babyID;
#Id
#Column(name="babyID", unique=true, nullable= false)
#GeneratedValue(strategy = GenerationType.AUTO)
public int getBabyID() {
return babyID;
}
public void setBabyID(int babyID) {
this.babyID = babyID;
}
}
The code I use to get the persistent value is
Baby baby = new Baby();
System.out.println("BABY ID = "+baby.getBabyID());
This returns me a
BABY ID = 0
Any pointers would be appreciated.
Thanks,
Sana.
Hibernate only generates the id after an entity becomes persistent, i.e. after you have saved it to the database. Before that, the object is in the transient state. Here is an article about the Hibernate object states and lifecycle.
The ID is set by Hibernate when the object is saved and becomes persistent.
The annotations only tell Hibernate how it should treat the class, property, or method they refer to.
Another thing: if a brand-new object already carried an id value, how would Hibernate be able to tell whether it should insert a new row or just update an existing one?
So this is normal expected behavior.
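A short sketch of the difference (assuming a Hibernate Session named session and an open transaction):

Baby baby = new Baby();
System.out.println("BABY ID = " + baby.getBabyID());   // 0 -- transient, nothing generated yet

session.save(baby);                                    // the id is generated as part of saving
System.out.println("BABY ID = " + baby.getBabyID());   // now returns the generated value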