Why is this giving me a lock timeout:
for (int i = 0; i < playersCount; i++) {
    StatUser stats = (StatUser) selectedUsers.get(i).getStatUsers().iterator().next();
    int gc = stats.getGamesPlayed();
    int gm = stats.getMakeCount();
    Session session = HibernateUtil.getSessionFactory().getCurrentSession();
    Transaction tx = session.beginTransaction();
    if (i == winnerIndex) {
        stats.setGamesPlayed(gc++);
        stats.setMakeCount(gm++);
        session.update(stats);
        session.flush();
        tx.commit();
        customTextBean.sendMail(selectedUsers.get(i).getEmail(), "Tillykke du har vundet");
    } else {
        stats.setGamesPlayed(gc++);
        session.update(stats);
        session.flush();
        tx.commit();
        customTextBean.sendMail(selectedUsers.get(i).getEmail(), "Tillykke " + winnersName + " skal lave kaffe");
    }
}
If you create a new transaction (session.beginTransaction();), a new DB connection is created. So you have a transaction that holds a read lock on stats (acquired in the for loop), and inside of it you try to write to the same row -> deadlock.
To fix that, first fetch all StatUsers in a separate loop, close the first transaction, and then iterate over the result with the code above. If you can't do that because you would run out of memory, then Hibernate is no longer your friend and you must use custom SQL.
Other solutions: use optimistic locking, or read the data to be changed with custom SQL and instantiate the objects in the loop.
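For illustration, here is a rough sketch of one way to avoid the competing transactions, reusing the names from the snippet above (playersCount, winnerIndex, winnersName, selectedUsers and customTextBean are assumed to exist as in the question): all StatUser rows are read and updated inside a single transaction, so no second transaction touches the same rows, and the mails are sent only after the commit.
// Sketch only: one transaction for reads and writes, mails sent afterwards.
Session session = HibernateUtil.getSessionFactory().getCurrentSession();
Transaction tx = session.beginTransaction();

// 1) Fetch all StatUser rows up front.
List<StatUser> statsList = new ArrayList<StatUser>();
for (int i = 0; i < playersCount; i++) {
    statsList.add((StatUser) selectedUsers.get(i).getStatUsers().iterator().next());
}

// 2) Update them in the same transaction; no second transaction, no competing lock.
for (int i = 0; i < playersCount; i++) {
    StatUser stats = statsList.get(i);
    stats.setGamesPlayed(stats.getGamesPlayed() + 1);
    if (i == winnerIndex) {
        stats.setMakeCount(stats.getMakeCount() + 1);
    }
}
tx.commit(); // dirty entities are flushed on commit

// 3) Send the mails after the transaction has finished.
for (int i = 0; i < playersCount; i++) {
    String message = (i == winnerIndex)
            ? "Tillykke du har vundet"
            : "Tillykke " + winnersName + " skal lave kaffe";
    customTextBean.sendMail(selectedUsers.get(i).getEmail(), message);
}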
I am dealing with an issue when I attempt to retrieve a large number of records from a database. It seems that when the number of records exceeds 90,000, the elements cannot be retrieved.
When that happens I get the following exception:
com.sun.jdi.ObjectCollectedException occurred while retrieving value.
The code I am using is the following:
Session objSession;
List<GroupEntity> colResults;

objSession = this.objSessionFactory.openSession();
try
{
    colResults = objSession.createQuery("FROM GroupEntity WHERE (strDomain = :Domain)")
            .setParameter("Domain", strDomain)
            .list();
}
catch (Exception objException)
{
    throw new GroupException("Could not retrieve the list of WebFiltering groups to scan");
}
objSession.close();
return colResults;
I attempted to page the retrieved results in sets of 1,000 using this method. When I insert up to 89,999 records the list is fine; however, when I exceed 90,000 I get the same exception.
Any idea how to deal with this issue?
If you process such a large amount of data, I'd recommend using batch processing with ScrollableResults: https://grokonez.com/hibernate/resolve-hibernate-outofmemoryerror-problem-hibernate-batch-processing
Session session = factory.openSession();
Transaction tx = null;
try {
    tx = session.beginTransaction();
    ScrollableResults dataCursor = session.createQuery("FROM Data").scroll();

    int count = 1;
    while (dataCursor.next()) {
        Data data = (Data) dataCursor.get(0);
        String newText = Utilities.generatedRandomString();
        data.setText(newText);
        session.update(data);
        if (count % 50 == 0) {
            System.out.println("============================log: count = " + count);
            session.flush();
            session.clear();
        }
        count++;
    }
    tx.commit();
} catch (Exception e) {
    if (null != tx) {
        tx.rollback();
    }
} finally {
    session.close();
}
In this case the session will not keep all 90,000 records in memory.
com.sun.jdi.ObjectCollectedException happens when the object you are referring to has already been garbage collected (it is raised by the debugger interface, com.sun.jdi, not by Hibernate itself).
There is no such size limit on a Java ArrayList.
I'm trying to save bulk data into a table. While saving, I'm getting this issue:
illegally attempted to associate a proxy with two open Sessions
Session session = sessionFactory.openSession();
Transaction transaction = session.beginTransaction();
for (Object obj : ObjectList) {
    session.save(obj);
}
transaction.commit();
session.close();
I had seen this issue before while updating records, but now it occurs for save as well. Is there any solution?
Here you can flush a batch of inserts and release memory; for more information, refer to the Hibernate documentation on batch processing:
int count = 0;
for (Object obj : ObjectList) {
    session.save(obj);
    // 20, same as the default JDBC batch size
    if (++count % 20 == 0) {
        session.flush();
        session.clear();
    }
}
transaction.commit();
session.close();
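As a side note (an assumption about the surrounding setup, not something shown in the question): the flush()/clear() interval only keeps the first-level cache small; for the inserts to actually go out as JDBC batches, the batch size is normally also set in the Hibernate configuration, for example:
// Sketch: enabling JDBC batching, matching the flush interval of 20 used above.
Configuration configuration = new Configuration().configure("hibernate.cfg.xml");
configuration.setProperty("hibernate.jdbc.batch_size", "20");
SessionFactory sessionFactory = configuration.buildSessionFactory();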
I have a class Calls:
public class Calls {
    private String Field1;
    // ... and many more fields, for example 15
    private List<MyModel> models;
}
Each minute I get a List of Calls
List<Calls> list = someService.getCallsList();
and try to insert it into the DB:
Session session = getSession();
Transaction tx = session.beginTransaction();
for (int i = 0; i < list.size(); i++) {
    Calls calls = list.get(i);
    session.createSQLQuery(
            "INSERT /*+ ignore_row_on_dupkey_index(CALLS,UNIQUE_CALLS_CONSTRAINT) */ " +
            "INTO CALLS(Field1,....,FieldEnd) VALUES(:field1,...,:fieldEnd)")
            .setParameter("field1", calls.getField1())
            // set all params
            .setParameter("fieldEnd", calls.getFieldEnd())
            .executeUpdate();
    if (i % 20 == 0) { // 20, same as the JDBC batch size
        // flush a batch of inserts and release memory:
        session.flush();
        session.clear();
    }
}
tx.commit();
session.close();
I need to INSERT rows and skip all duplicates; for that I use the hint /*+ ignore_row_on_dupkey_index(CALLS,UNIQUE_CALLS_CONSTRAINT) */ with a SQLQuery.
My question is: how can I do this with Hibernate without SQLQuery, in a more generic way? I have 15 parameters and I don't want to register them all by hand for the SQLQuery.
You can apply an Oracle hint through the Criteria API, see:
https://docs.jboss.org/hibernate/orm/4.3/javadocs/org/hibernate/Criteria.html#addQueryHint(java.lang.String)
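For illustration, a minimal sketch of how a hint string is attached with that method (the Calls mapping is reused from the question; note that this is the Criteria query API, so the hint goes into the SQL Hibernate generates for the criteria, with its placement decided by the Oracle dialect):
// Sketch: Criteria.addQueryHint is available from Hibernate 4.3 onwards.
Criteria criteria = session.createCriteria(Calls.class)
        .addQueryHint("ignore_row_on_dupkey_index(CALLS,UNIQUE_CALLS_CONSTRAINT)");
List<Calls> results = criteria.list();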
I can't save all the rows from my Excel file into the database, because I get this error:
Exception in thread "main" org.hibernate.SessionException: Session is closed!
My code:
AnnotationConfiguration conf = new AnnotationConfiguration();
conf.addAnnotatedClass(Etudiant.class);
conf.configure("hibernate.cfg.xml");
new SchemaExport(conf).create(true, true);
SessionFactory factory = conf.buildSessionFactory();
Session session = factory.getCurrentSession();

for (int i = 3; i < tab.length; i++) {
    session.beginTransaction();
    etudiant.setNom(tab[i]);
    i++;
    etudiant.setPrenom(tab[i]);
    i++;
    etudiant.setAge(tab[i]);
    session.save(etudiant);
    session.getTransaction().commit();
}
Does anyone have an idea?
You are filling up your first-level cache. Clearing and flushing the cache periodically should be considered when you are doing a bulk insert. Also, committing in each iteration slows down the insertion.
You have to do something like this:
13.1. Batch inserts
When making new objects persistent, flush() and then clear() the session regularly in order to control the size of the first-level cache.
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();

for (int i = 0; i < 100000; i++) {
    Customer customer = new Customer(.....);
    session.save(customer);
    if (i % 20 == 0) { // 20, same as the JDBC batch size
        // flush a batch of inserts and release memory:
        session.flush();
        session.clear();
    }
}

tx.commit();
session.close();
You need to start a Session with factory.openSession() before you can use the current Session.
Here is the code example from Hibernate about batch processing:
Session session = sessionFactory.openSession();
Transaction tx = session.beginTransaction();

for (int i = 0; i < 100000; i++) {
    Customer customer = new Customer(.....);
    session.save(customer);
    if (i % 20 == 0) { // 20, same as the JDBC batch size
        // flush a batch of inserts and release memory:
        session.flush();
        session.clear();
    }
}

tx.commit();
session.close();
At the beginning of the code it uses openSession(). However, when I wrote my code I used getCurrentSession(), and it seems to generate an org.hibernate.TransactionException: nested transactions not supported error.
Could anybody explain why this happens?
SessionFactory.openSession() always opens a new session that you have to close once you are done with the operations. SessionFactory.getCurrentSession() returns a session bound to a context; you don't need to close this. The nested transactions not supported error shows up when beginTransaction() is called on a session whose transaction is already active, which is easy to run into with a context-bound session, for example when surrounding code has already begun a transaction on it.
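For illustration, a small sketch contrasting the two (sessionFactory and Customer are reused from the example above; the context-bound variant assumes something like hibernate.current_session_context_class=thread in the configuration):
// openSession(): you own the lifecycle and must close the session yourself.
Session opened = sessionFactory.openSession();
Transaction tx = opened.beginTransaction();
opened.save(new Customer(/* ... */));
tx.commit();
opened.close();

// getCurrentSession(): the session is bound to the context; committing the
// transaction flushes and closes it for you, so do not close it manually.
Session current = sessionFactory.getCurrentSession();
current.beginTransaction();
current.save(new Customer(/* ... */));
current.getTransaction().commit(); // the bound session is closed with the commit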