I am trying to save large lists of objects using Hibernate.
The problem is that before saving I need to check whether a record with the same field data already exists. If it does, I need to fetch its id and create an association in another table; otherwise I make a new entry and a new insert in the association table for it.
Please guide me on how I can improve the save time.
The following is how the save is done:
Session session = SchemaManager.getDatabaseSession("com.server.domin.PublicaccountBehavior");
try {
    List<Post> posts = this.getAllPosts();
    Transaction transaction = session.beginTransaction();
    for (Post post : posts) {
        Behavior behavior = new Behavior();
        behavior.setElementValue(val);
        behavior.setIsDeleted(false);
        Date now = new Date();
        behavior.setCreatedOn(now);
        behavior.setModifiedOn(now);
        PublicaccountType type = new PublicaccountType();
        type.setId(3L);
        behavior.setPublicaccountType(type);
        PublicaccountBehavior publicaccountBehavior = new PublicaccountBehavior();
        publicaccountBehavior.setBehavior(behavior);
        publicaccountBehavior.setPublicAccount(account);
        publicaccountBehavior.setTimeOfBookmark(post.getTimeAsDate());
        publicaccountBehavior.setCreatedOn(now);
        publicaccountBehavior.setModifiedOn(now);
        try {
            // Look for an existing Behavior with the same element value.
            List list2 = session.createQuery("from Behavior where elementValue = :elementVal")
                    .setString("elementVal", behavior.getElementValue())
                    .list();
            if (list2.size() > 0) {
                // Reuse the existing Behavior for the association.
                Behavior behav = (Behavior) list2.get(0);
                publicaccountBehavior.setBehavior(behav);
            } else {
                // No match: insert the new Behavior and use its generated id.
                Long id = (Long) session.save(behavior);
                behavior.setId(id);
                publicaccountBehavior.setBehavior(behavior);
            }
            session.saveOrUpdate(publicaccountBehavior);
        } catch (HibernateException e) {
            e.printStackTrace();
        }
    }
    transaction.commit();
} finally {
    session.close();
}
When you are saving a large number of new objects, flush() and then clear() the session at regular intervals to keep the size of the first-level cache under control; this will improve performance.
An example is explained in the Hibernate docs.
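As a rough illustration, here is a minimal sketch of that batching pattern applied to the loop above (the batch size of 50 is an assumption; it is usually aligned with the hibernate.jdbc.batch_size setting):
int count = 0;
for (Post post : posts) {
    // ... build behavior and publicaccountBehavior exactly as in the loop above ...
    session.saveOrUpdate(publicaccountBehavior);
    if (++count % 50 == 0) {
        session.flush();  // push the pending inserts to the database
        session.clear();  // detach the managed entities so the first-level cache stays small
    }
}
transaction.commit();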
Related
I have different types of objects to update. All of the objects are added to a list, which is passed to a method.
List<Object> list = new ArrayList<Object>();
list.add(mediaInfo);     // class MediaInfo
list.add(mediaMode);     // class MediaMode
list.add(paidCustomer);  // class PaidCustomer
updateList(list);
All of the above objects were loaded earlier and I have changed one field on each (a String called "position"). None of these objects is attached to any Hibernate session; they were loaded somewhere else. I just want to update them with the changed data.
public boolean updateList(java.util.List<Object> dataList) {
    Session session = null;
    Hbutility myHbutil = null;
    boolean updateStatus = false;
    try {
        myHbutil = new Hbutility();
        session = myHbutil.getSession();
        Transaction tx = session.beginTransaction();
        for (Object entity : dataList) {
            logger.info("Updating object: " + entity);
            session.update(entity);
        }
        tx.commit();
        updateStatus = true;
    } catch (Exception e) {
        e.printStackTrace();
    } finally {
        if (session != null) {
            session.close();
        }
    }
    return updateStatus;
}
All of the objects have their ids set, but they are not updated. Does anyone see a problem here?
There are many samples of Hibernate updates on Google, but all of them load an object inside the session, set new values and simply update it. In my scenario the objects are loaded outside the session and they are of different types. Any help, please.
To update the content, you can also use the merge() method. Maybe it can help you?
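A minimal sketch of that idea, assuming the same updateList() shape as in the question; merge() copies the state of each detached object onto a managed instance, so the changed "position" value is written when the transaction commits:
Transaction tx = session.beginTransaction();
for (Object entity : dataList) {
    session.merge(entity);  // reattaches the detached object and returns the managed copy
}
tx.commit();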
Try to get the object with the entity manager, then modify its properties and save the change. Example:
MediaInfo tmp = em.find(MediaInfo.class, mediaInfo.getId());
// modify some properties
tmp.setMachin(....);
list.add(tmp);
updateList(list);
You also need to ensure that the mapping for 'position' is what you expect, i.e. it shouldn't be transient and updatable shouldn't be false.
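For example, with annotation-based mapping (an assumption; the names here are only illustrative), the property would need to look something like this for update() to write it:
// Mapped and updatable: included in UPDATE statements.
@Column(name = "position")
private String position;

// Either of these would cause the change to be silently skipped:
// @Column(name = "position", updatable = false)
// @Transient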
This is the DAO I have created:
public Poll updatePoll(int id) {
    Session s = factory.getCurrentSession();
    Transaction t = s.beginTransaction();
    Poll poll = (Poll) s.get(Poll.class, id);
    Citizen citizen = (Citizen) s.get(Citizen.class, 1);
    List<Poll> list = citizen.getPolledList();
    boolean check = list.contains(poll);
    if (!check) {
        // bind the id as a parameter; the literal word "id" in the SQL would never match the argument
        Query q = s.createSQLQuery("update Poll set poll_count = poll_count + 1 where poll_id = :id");
        q.setParameter("id", id);
        q.executeUpdate();
        s.refresh(poll); // reload the incremented poll_count into the managed object
    }
    t.commit(); // committing also closes the session obtained from getCurrentSession()
    return poll;
}
This is the Action created:
public String submitVote() {
    ServletContext ctx = ServletActionContext.getServletContext();
    ProjectDAO dao = (ProjectDAO) ctx.getAttribute("DAO");
    Poll poll = dao.updatePoll(poll_id);
    String flag = "error";
    if (poll != null) {
        ServletActionContext.getRequest().getSession(true).setAttribute("POLL", poll);
        flag = "voted";
    }
    return flag;
}
I know I have been going horribly wrong and the code I'm posting might be utter rubbish, but I hope the intent is clear, so please lend me a helping hand if possible. My project is mainly JSP (Struts 2), jQuery and MySQL 5.1, so please do not suggest PHP code as I have found elsewhere.
The framework is there to hide the servlet plumbing from you, so you should use its features rather than doing something like
ServletActionContext.getRequest().getSession(true)
Instead, use
Map m = ActionContext.getContext().getSession();
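A minimal sketch of the action with that change (only the session handling differs; the DAO lookup and poll_id are taken from the question as-is):
public String submitVote() {
    ServletContext ctx = ServletActionContext.getServletContext();
    ProjectDAO dao = (ProjectDAO) ctx.getAttribute("DAO");
    Poll poll = dao.updatePoll(poll_id);
    String flag = "error";
    if (poll != null) {
        // the framework's session map replaces the direct HttpSession access
        Map<String, Object> session = ActionContext.getContext().getSession();
        session.put("POLL", poll);
        flag = "voted";
    }
    return flag;
}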
I have this piece of code:
public String getEventsForCalendar(@RequestParam("userId") Long userId) {
    Session session = NewHibernateUtil.getSessionFactory().getCurrentSession();
    try {
        session.beginTransaction();
        JSONArray eventsArray = new JSONArray((List<Event>) session.createCriteria(Event.class)
                .add(Restrictions.eq("ownerid", userId))
                .list());
        User user = (User) session.createCriteria(User.class)
                .add(Restrictions.eq("id", userId))
                .uniqueResult();
        for (Event event : (List<Event>) session.createCriteria(Event.class)
                .add(Restrictions.ne("ownerid", userId))
                .list()) {
            if (event.getInvited().contains(user)) {
                eventsArray.put(new JSONObject(event));
            }
        }
    } catch (Exception e) {
        System.out.println(e.getMessage());
    }
    return null;
}
At first it should find the Events created by the User. Next, in the loop, it should get the Events the User is invited to. Each Event has a List called invited, and I think I have to iterate over the whole list and check whether the User is in the invited list, but can I do something like this instead?
List<Event> list = session(...)
        .add(Restrictions.eq("invited", user));
In other words, can I fetch from the database the objects whose list contains a given object? I will be very happy if anybody answers my question. Thank you in advance.
You would use the Restrictions.in("colName", collection) method.
Edit:
If you want to do the inverse, i.e. the column value is not in the collection, then you can do:
Restrictions.not(Restrictions.in("colName", collection))
See http://docs.jboss.org/hibernate/orm/3.3/api/org/hibernate/criterion/Restrictions.html#in(java.lang.String, java.util.Collection)
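A minimal usage sketch (the property name and candidate values here are hypothetical, not taken from the question):
List<Long> ownerIds = Arrays.asList(1L, 2L, 3L);  // hypothetical candidate values
List<Event> events = session.createCriteria(Event.class)
        .add(Restrictions.in("ownerid", ownerIds))  // ownerid IN (1, 2, 3)
        .list();

// and the inverse:
// .add(Restrictions.not(Restrictions.in("ownerid", ownerIds)))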
I need to insert a list of objects with a predefined _id (Long) into a collection.
The insert(object) method for a single object from AdvancedDatastore works great. The trouble begins when I try to use the insert() overload that accepts an Iterable. Here is a sample piece of code:
try {
    advancedDatastore.insert("collection_name", feeds, WriteConcern.ERRORS_IGNORED);
} catch (Exception e) {
    e.printStackTrace();
}
I assumed this code would ignore errors (such as an object with a duplicate id already existing in the collection) and just continue with the next item, but it does not, and no exception is raised.
Thanks!
Update:
This code inserts all the elements, but "1" is not printed out.
try {
    System.err.println(0);
    advancedDatastore.insert("collection_name", feeds, WriteConcern.ERRORS_IGNORED.continueOnErrorForInsert(true));
    System.err.println(1);
} catch (Exception e) {
    e.printStackTrace();
}
Update2:
Sorry, the code completes properly and "1" is printed out, but it takes tremendously more time than single inserts. In my case, 35,000 inserts done one by one take 3 seconds; in a batch, 100+ seconds.
Update3:
So far the best way for me to deal with the issue is to use the native Java driver for MongoDB.
First I convert my object list to a DBObject list:
final List<DBObject> dbObjects = new ArrayList<DBObject>();
for (MyObject object : objectList) {
    dbObjects.add(morphia.toDBObject(object));
}
Then I insert through the Mongo DB instance:
db.getCollection("collection_name").insert(dbObjects, WriteConcern.UNACKNOWLEDGED.continueOnErrorForInsert(true));
Performance for inserting 150,000 objects:
Native DB insert: 2-3 seconds
via Morphia's insert(object): 15+ seconds
via Morphia's insert(Iterable): 400+ seconds
A better way would be appreciated.
It works for me this way:
final List<DBObject> dbObjects = new ArrayList<DBObject>();
try {
    TypedQuery<RegistroCivil> consulta = em.createQuery("select p from RegistroCivil p", RegistroCivil.class);
    List<RegistroCivil> lista = consulta.getResultList();
    for (RegistroCivil object : lista) {
        dbObjects.add(morphia.toDBObject(object));
    }
    long start = System.currentTimeMillis();
    ds.getCollection(RegistroCivil.class).insert(dbObjects);
    // ds.save(lista);
    long end = System.currentTimeMillis();
    tmongo = end - start;
} catch (Exception e) {
    e.printStackTrace();
}
I have verified this multiple times using Appstats. When the code below is NOT wrapped in a transaction, JDO performs two datastore reads and one write (3 RPCs) at a cost of 240. Not just the first time, but every time, even though it is accessing the same record every time and hence should be pulling it from the cache. However, when I wrap the code in a transaction as shown below, it makes 4 RPCs: begin transaction, get, put, and commit. Of these, only the get is billed as a datastore read, so the overall cost is 70.
If it's pulling from the cache, why would it bill only for a read? It would seem that it should bill for a write, not a read. Could App Engine be billing me the same amount for non-transactional cache reads as it does for datastore reads? Why?
This is the code WITH transaction:
PersistenceManager pm = PMF.getManager();
Transaction tx = pm.currentTransaction();
String responsetext = "";
try {
    tx.begin();
    Key userkey = obtainUserKeyFromCookie();
    User u = pm.getObjectById(User.class, userkey);
    Key mapkey = obtainMapKeyFromQueryString();
    // this is NOT a java.util.Map, just FYI
    Map currentmap = pm.getObjectById(Map.class, mapkey);
    Text mapData = currentmap.getMapData(); // mapData is JSON stored in the entity
    Text newMapData = parseModifyAndReturn(mapData); // transform the map
    currentmap.setMapData(newMapData); // mutate the Map object
    tx.commit();
    responsetext = "OK";
} catch (JDOCanRetryException jdoe) {
    // log jdoe
    responsetext = "RETRY";
} catch (Exception e) {
    // log e
    responsetext = "ERROR";
} finally {
    if (tx.isActive()) {
        tx.rollback();
    }
    pm.close();
}
resp.getWriter().println(responsetext);
This is the code WITHOUT the transaction:
PersistenceManager pm = PMF.getManager();
String responsetext = "";
try {
    Key userkey = obtainUserKeyFromCookie();
    User u = pm.getObjectById(User.class, userkey);
    Key mapkey = obtainMapKeyFromQueryString();
    // this is NOT a java.util.Map, just FYI
    Map currentmap = pm.getObjectById(Map.class, mapkey);
    Text mapData = currentmap.getMapData(); // mapData is JSON stored in the entity
    Text newMapData = parseModifyAndReturn(mapData); // transform the map
    currentmap.setMapData(newMapData); // mutate the Map object
    responsetext = "OK";
} catch (Exception e) {
    // log e
    responsetext = "ERROR";
} finally {
    pm.close();
}
resp.getWriter().println(responsetext);
With the transaction, the PersistenceManager can know that the caches are valid throughout the processing of that code. Without the transaction, it cannot (it doesn't know whether some other action has come in behind its back and changed things) and so must validate the cache's contents against the DB tables. Each time it checks, it needs to create a transaction to do so; that's a feature of the DB interface itself, where any action that's not in a transaction (with a few DB-specific exceptions) will have a transaction automatically added.
In your case, you should have a transaction anyway, because you want to have a consistent view of the database while you do your processing. Without that, the mapData could be modified by another operation while you're in the middle of working on it and those modifications would be silently lost. That Would Be Bad. (Well, probably.) Transactions are the cure.
(You should also look into using AOP for managing the transaction wrapping; that's enormously easier than writing all that transaction management code yourself each time. OTOH, it can add a lot of complexity to deployment until you get things right, so I could understand not following this piece of advice…)
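As a rough illustration of the kind of wrapping that advice points at, here is a minimal sketch written as a plain helper method rather than an aspect (the helper name and the use of java.util.function.Function are assumptions, not anything from the post):
// Hypothetical helper: runs a unit of work inside a JDO transaction and always cleans up.
public static <T> T inTransaction(PersistenceManager pm,
                                  java.util.function.Function<PersistenceManager, T> work) {
    Transaction tx = pm.currentTransaction();
    try {
        tx.begin();
        T result = work.apply(pm);
        tx.commit();
        return result;
    } finally {
        if (tx.isActive()) {
            tx.rollback();  // commit never happened; undo any partial changes
        }
        pm.close();
    }
}
The servlet code above would then shrink to a single call that passes the get/modify/put steps in as the work function.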