Hibernate session update issue - Java

I am using Hibernate in one of my Java applications. The database runs in a master-slave setup. In one of the classes I update an existing entity by first loading the full entity from the master db, modifying it, and saving it back to the master db. Since this call increases the load on the master database, I want all reads to go to the slave db and only writes to go to the master db. I cannot simply read from the slave and write to the master, because Hibernate keeps the entity associated with the session it was loaded from and does not let me update it through a different (slave) session.
I came up with a solution:
First I serialize the entity to JSON and then deserialize the JSON into a new entity, which should hopefully avoid the session issue. Below is the code:
private UserEntity getUserEntityForUpdate(UserData userData) {
    // Load the managed entity from the current session
    UserEntity userEntity = sessionFactory.getCurrentSession()
            .byId(UserEntity.class).load(userData.getId());

    // Serialize to JSON and back to get a copy that is not attached to the session
    ObjectMapper mapper = new ObjectMapper();
    String user1 = null;
    try {
        user1 = mapper.writeValueAsString(userEntity);
    } catch (JsonProcessingException e) {
        e.printStackTrace();
    }
    UserEntity obj = new UserEntity();
    try {
        obj = mapper.readValue(user1, UserEntity.class);
    } catch (IOException e) {
        e.printStackTrace();
    }
    if (obj.getId() != null) {
        obj.setId(userData.getId());
    }
    obj.setEnabledBy(userData.getEnabledBy() != null ? userData.getEnabledBy() : "");
    obj.setEnabledAt(userData.getEnabledAt());
    return obj;
}
I just wanted to know: is this a good approach, or is there a better approach I can go for?

You can use session.evict(entity) to detach the entity from the session you loaded it from, and then you should be able to update it on the master.
The problem with this approach is that you lose transactional guarantees: you cannot be sure the entity has not changed in the meantime.
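If you do end up reading on the slave and writing on the master, a minimal sketch of the evict approach could look like this (it assumes two separate SessionFactory instances, here called slaveSessionFactory and masterSessionFactory, which are not in your code):
// Sketch only: read from the slave, detach, then reattach on the master.
UserEntity entity = slaveSessionFactory.getCurrentSession()
        .byId(UserEntity.class).load(userData.getId());   // read from the slave
slaveSessionFactory.getCurrentSession().evict(entity);    // detach from the slave session

entity.setEnabledBy(userData.getEnabledBy());
entity.setEnabledAt(userData.getEnabledAt());

masterSessionFactory.getCurrentSession().update(entity);  // reattach and write on the master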

Related

WildFly/JBoss persistence-unit/EntityManager - How to create a new connection every time a user calls a GET API endpoint

I'm trying to modify an existing Java app (WildFly, JBoss, Oracle) that currently works fine using a persistence-unit and EntityManager to connect to an Oracle database (configured via standalone.xml and persistence.xml). However, I need to create a new database connection every time a user calls the new GET API endpoint, using credentials from the HttpHeaders. Currently I create a new EntityManager object whose session I commit, roll back and close. Unfortunately the response time of every call keeps growing, there is a warning about "PersistenceUnitUser" already being registered, and memory usage grows constantly. So that is a bad solution.
Is there a proper way to do this that works without any harm?
P.S.
Currently the app uses standalone.xml and persistence.xml, and that works fine. I call the Java API endpoint with an EntityManager connected with the admin user/password, but I need to create a new connection using the user/password from the HttpHeaders and run one SQL statement to see the proper results, because Oracle uses reserved words such as 'user'. For instance: select * from table where create_usr = user. When that is done, the main EntityManager will use the data from it to continue some process.
Please see the code example below:
@GET
@Path("/todo-list-enriched")
@Produces(MediaType.APPLICATION_JSON)
public Response getToDoListEnriched(@Context HttpHeaders httpHeaders, @QueryParam("skip") int elementNumber, @QueryParam("take") int pageSize, @QueryParam("orderby") String orderBy)
{
    String userName = httpHeaders.getHeaderString(X_USER_NAME);
    String password = httpHeaders.getHeaderString(X_PASSWORD);
    EntityManager entityManager = null;
    try {
        Map<String, String> persistenceMap = new HashMap<String, String>();
        persistenceMap.put("hibernate.dialect", "org.hibernate.dialect.Oracle8iDialect");
        persistenceMap.put("hibernate.connection.username", userName);
        persistenceMap.put("hibernate.connection.password", password);
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("PersistenceUnitUser", persistenceMap);
        entityManager = emf.createEntityManager();
        if (!entityManager.getTransaction().isActive()) {
            entityManager.getTransaction().begin();
        }
        // Do some work here: select, update, select
        // and after that
        if (entityManager.getTransaction().isActive()) {
            entityManager.getTransaction().commit();
        }
    }
    catch (Exception ex)
    {
        if (entityManager != null && entityManager.getTransaction().isActive()) {
            entityManager.getTransaction().rollback();
        }
    }
    finally {
        if (entityManager != null && entityManager.isOpen()) {
            entityManager.close();
        }
    }
    return Response.ok().build(); // (return value omitted in the original snippet)
}
Best Regards
Marcin
You should define a connection pool and a datasource in standalone.xml (cf. https://docs.wildfly.org/26.1/Admin_Guide.html#DataSource), then reference it in your persistence.xml and inject the EntityManager into your REST service class (cf. https://docs.wildfly.org/26.1/Developer_Guide.html#entity-manager).
You may look at this example application: https://github.com/wildfly/quickstart/tree/main/todo-backend
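For illustration, a minimal sketch of the injected, container-managed approach (the persistence-unit name "myPU" and the ToDoEntry entity are assumptions, not taken from your code):
import java.util.List;
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

@Stateless
@Path("/todo-list-enriched")
public class ToDoResource {

    // Container-managed EntityManager injected by WildFly; no per-request
    // EntityManagerFactory creation, pooling is handled by the datasource.
    @PersistenceContext(unitName = "myPU")
    private EntityManager entityManager;

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public Response getToDoListEnriched(@QueryParam("skip") int skip,
                                        @QueryParam("take") int take) {
        List<ToDoEntry> result = entityManager
                .createQuery("select t from ToDoEntry t", ToDoEntry.class)
                .setFirstResult(skip)
                .setMaxResults(take)
                .getResultList();
        return Response.ok(result).build();
    }
}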

Hibernate: HTTP Status 500 - Internal Server Error

Below is my Hibernate code:
SessionFactory sessionFactory;
Session session = null;
LoginEntity user = null;
try {
    sessionFactory = HibernateUtility.getSessionFactory();
    session = sessionFactory.openSession();
    session.beginTransaction();
    user = session.get(LoginEntity.class, user_id);
    System.out.println(user.getUserCountryMapping()); // if I remove this line I get error..
    session.getTransaction().commit();
    return user;
} catch (Exception e) {
    Logger.getLogger(BulkActionMethods.class.getName()).log(Level.SEVERE, null, e);
} finally {
    if (session != null) {
        session.close();
    }
}
I am facing a weird issue with my code: when I remove the System.out.println(user.getUserCountryMapping()); line I get an HTTP Status 500 - Internal Server Error in the browser, but when I keep that line I get the expected JSON response.
Can someone please help me understand this issue?
Without seeing the error or your entity mappings, it's hard to give a firm answer.
However, cases like this are almost always due to uninitialized lazy collections. The line:
System.out.println(user.getUserCountryMapping());
forces Hibernate to fetch the data for that relationship while the session is still open. If you don't initialize the relationship within a Hibernate session and then try to rely on it later (for example during JSON serialization), you will get a LazyInitializationException, which, if unhandled, shows up as an HTTP status 500.
Replace
System.out.println(user.getUserCountryMapping());
with
Hibernate.initialize(user.getUserCountryMapping());
Read more about lazy vs eager fetch type.
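For context, a sketch of where that initialization has to sit, i.e. inside the open session from your snippet (this just re-uses the names from your code):
// requires: import org.hibernate.Hibernate;
session = sessionFactory.openSession();
try {
    session.beginTransaction();
    user = session.get(LoginEntity.class, user_id);
    Hibernate.initialize(user.getUserCountryMapping()); // initialize while the session is open
    session.getTransaction().commit();
    return user; // the collection is now safe to serialize after the session closes
} finally {
    session.close();
}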

How to avoid "a different object with the same id.."?

I am using:
Web App (a filter opens session. DAO uses getCurrentSession())
Hibernate
Spring (AOP configuration over Service)
xml configuration for all
DTO between Mbean and Service
Well, I have two methods (business service):
service.findUser(..DTO..)
service.updateUser(..DTO..)
The update throws an org.hibernate.NonUniqueObjectException.
How can I avoid that?
I need to use update, not merge.
Thanks in advance.
//MBean.java method
public void testUpdateUser(ActionEvent e) {
    System.out.println(name);
    ServiceResponse<UserDto> serviceResponse = super.getPrincipalService().findUser(name);
    UserDto userDto = serviceResponse.getResponseList().get(0);
    //update some properties here
    serviceResponse = super.getPrincipalService().updateUser(userDto);
    LOG.info("" + serviceResponse);
}

//Service.java: update method
public ServiceResponse<UserDto> updateUser(UserDto userDto) {
    LOG.info("");
    ServiceResponse<UserDto> serviceResponse = new ServiceResponse<UserDto>();
    try {
        User user = this.getGlobalMapper().map(userDto, User.class);
        //
        this.getUserDao().update(user);
        userDto = this.getGlobalMapper().map(user, UserDto.class);
        serviceResponse.getResponseList().add(userDto);
        serviceResponse.setOperationCodeResponse(ServiceResponseCode.OK);
        serviceResponse.getMessages().add("Operacion OK");
    } catch (Exception e) {
        serviceResponse.getMessages().add(e.getMessage());
        serviceResponse.setOperationCodeResponse(ServiceResponseCode.MODEL_ERROR);
        LOG.error("", e);
    }
    return serviceResponse;
}
//Exception result
org.hibernate.NonUniqueObjectException: a different object with the same identifier value was already associated with the session: [com.softlogia.copi.model.domain.User#155]
at org.hibernate.engine.internal.StatefulPersistenceContext.checkUniqueness(StatefulPersistenceContext.java:696)
at org.hibernate.event.internal.DefaultSaveOrUpdateEventListener.performUpdate(DefaultSaveOrUpdateEventListener.java:296)
at org.hibernate.event.internal.DefaultSaveOrUpdateEventListener.entityIsDetached(DefaultSaveOrUpdateEventListener.java:241)
at org.hibernate.event.internal.DefaultUpdateEventListener.performSaveOrUpdate(DefaultUpdateEventListener.java:55)
at org.hibernate.event.internal.DefaultSaveOrUpdateEventListener.onSaveOrUpdate(DefaultSaveOrUpdateEventListener.java:90)
at org.hibernate.internal.SessionImpl.fireUpdate(SessionImpl.java:705)
at org.hibernate.internal.SessionImpl.update(SessionImpl.java:697)
at org.hibernate.internal.SessionImpl.update(SessionImpl.java:693)
I am assuming you are using plain Hibernate as the ORM. Simply put, regardless of the state of your database, you have two different copies of the same row in your current Hibernate session. To resolve this you can:
1) flush() the Hibernate session after every write operation on the db (insert or update)
OR
2) call merge() instead of saveOrUpdate() in your update method
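A sketch of option 2 as it could look inside the DAO's update method (the DAO internals are not shown in the question, so the session access here is an assumption):
// Sketch only: merge() copies the detached state onto the instance that is
// already associated with the session (loading it if needed), so no second
// instance with the same identifier gets attached.
public void update(User user) {
    Session session = sessionFactory.getCurrentSession();
    User managed = (User) session.merge(user); // work with the returned managed instance
}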

Spring Jpa / Hibernate - Deadlock found when trying to get lock; try restarting transaction

I have a deadlock problem with MySQL in an application that I am developing. The application, based on Spring Boot, Spring Integration and JPA, has several threads, all of which can access this service:
@Override
@Transactional()
public List<TwitterUser> storeTwitterUsers(List<TwitterUser> twitterUsers)
{
    logger.debug("Store list of users, total: " + twitterUsers.size());
    List<TwitterUser> savedUsers = new ArrayList<>();
    for ( TwitterUser twitterUser : twitterUsers ) {
        TwitterUser user = getTwitterUserById(twitterUser.getTwitterId());
        if ( user != null ) {
            twitterUser.setId(user.getId());
            user = entityManager.merge(twitterUser);
        } else {
            //HERE THE EXCEPTION IS THROWN
            entityManager.persist(twitterUser);
            user = twitterUser;
        }
        entityManager.flush();
        savedUsers.add(user);
    }
    return savedUsers;
}

@Override
@Transactional(readOnly = true)
public TwitterUser getTwitterUserById(Long id)
{
    Query query = entityManager.createQuery("from TwitterUser u where u.twitterId=:id");
    query.setParameter("id", id);
    TwitterUser twitterUser = null;
    //Throw Exception NoResultException
    try {
        twitterUser = (TwitterUser)query.getSingleResult();
    } catch (NoResultException e) {
        //no result found
    }
    return twitterUser;
}
When more than one thread is inside the storeTwitterUsers method, MySQL throws this error:
Deadlock found when trying to get lock; try restarting transaction
This is the full stack track of the error:
http://pastebin.com/nZEvykux
I already read those two questions:
How to avoid mysql 'Deadlock found when trying to get lock; try restarting transaction'
Getting "Deadlock found when trying to get lock; try restarting transaction"
but my problem seems slightly different, because I get the exception when almost any thread tries to persist the object.
Is there a clean and easy way to resolve the problem without implementing a low level code check? Can Spring JPA automatically manage the deadlock situation?
Any help is really appreciated, I am struggling with that error!

Java Datasource, how to dispose it

I'm working on a webapp where I manually create my DataSource (see my other question for why: How to use Spring to manage connection to multiple databases), because I need to connect to different databases (dev, prod, qa, test).
I have solved choosing and switching between databases. But if a user logs out of my app and then wants to connect to another database, he is still connected to the same datasource, because at runtime myDs is not null. How can I properly dispose of this DataSource when the user logs out? I don't want to create the datasource every time the user queries the database.
private DataSource createDataSource(Environment e) {
    OracleDataSource ds = null;
    String url = null;
    try {
        if (myDs != null) {
            logger.info("myDs connection: " + etmetaDs.getConnection().getMetaData().getURL());
            url = myDs.getConnection().getMetaData().getURL();
        }
    } catch (SQLException exc) {
        // TODO Auto-generated catch block
        exc.printStackTrace();
    }
    if (myDs == null) {
        try {
            ds = new OracleDataSource();
        } catch (SQLException ex) {
            ex.printStackTrace();
        }
        ds.setDriverType("oracle.jdbc.OracleDriver");
        ds.setURL(e.getUrl());
        try {
            Cryptographer c = new Cryptographer();
            ds.setUser(c.decrypt(e.getUsername()));
            ds.setPassword(c.decrypt(e.getPassword()));
        } catch (CryptographyException ex) {
            logger.error("Failed to connect to my environment [" + e.getName() + "]");
            ex.printStackTrace();
            return null;
        }
        logger.info("Connecting to my environment [" + e.getName() + "]");
        myDs = ds;
    } else if (url.equals(e.getUrl())) {
    } else {
    }
    return myDs;
}
If you read Reza's answer to your other question, you can see how to create multiple DataSources.
I think the problem here is not the DataSource but the way you store information in your code. I suppose that your etmetaDs is shared by all your users, so disposing of it when a user logs out (i.e. setting it to null) is not a good option.
What you have to do is maintain the connection status for each user. When a user logs off, you reset his status so that he obtains a new connection the next time he connects.
Update: There are many ways to achieve this. Here is an example of what I have in mind, but you have to adapt it to your needs. Suppose you have a UserData object that holds this information:
public class UserData
{
    String id;
    String name;
    String database;
}
You may have a dropdown in your application with the database names (dev, test, ...) and an empty first item. When the user selects a database, you get the connection with createDataSource(): if it already exists you return the DataSource, otherwise you create a new one. When your user disconnects (or when the user logs on), you set the database to "" to force him to select the database from the dropdown again. There is no need to reset the datasource; see the sketch below.
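A hedged sketch of keeping one DataSource per database instead of a single shared myDs (the dataSources map and getDataSource() helper are illustrative names, not from the question):
// requires: java.util.Map, java.util.concurrent.ConcurrentHashMap, javax.sql.DataSource
// Sketch only: cache one DataSource per environment name, so switching
// databases is just a different lookup and nothing has to be disposed on logout.
private final Map<String, DataSource> dataSources = new ConcurrentHashMap<>();

private DataSource getDataSource(Environment e) {
    // createDataSource(e) is the method from the question, adapted to always
    // build a new OracleDataSource instead of storing it in the shared myDs field
    return dataSources.computeIfAbsent(e.getName(), name -> createDataSource(e));
}

// On logout, only the user's selected database is cleared (userData.setDatabase("")),
// while the cached DataSources stay available for the next selection.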
