I'm trying to delete an entity in Google Datastore:
String keyValue = "someValue";
Key tweetKey = KeyFactory.createKey("tweetKey", keyValue);
Entity someEntity = new Entity(tweetKey);
Entity getEntity = datastore.get(tweetKey);
datastore.delete(tweetKey);
getEntity = datastore.get(tweetKey);
if(getEntity != null)
{
//This happens
System.out.println("Something wrong");
}
The entity is not deleted, and I get "Something wrong" as the output.
Edit 1: I didn't copy and paste from the original code, because there is a lot of other logic in between getting and using values from the entity.
You are deleting a different key than the one you are checking.
You are deleting the tweetKey and checking the tweetkey (notice the capital K in the first one). If this snippet is a copy-paste from your original, then that's the mistake.
Also, make sure you don't have an active transaction; if you do, commit it.
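For reference, here is a minimal sketch (not your original code) of deleting the entity inside an explicit transaction and then verifying the delete with the low-level API. Note that get() signals a missing entity by throwing EntityNotFoundException rather than returning null:
import com.google.appengine.api.datastore.*;

DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();
Key tweetKey = KeyFactory.createKey("tweetKey", "someValue");

// delete inside an explicit transaction so the write is applied at commit time
Transaction txn = datastore.beginTransaction();
try {
    datastore.delete(txn, tweetKey);
    txn.commit();
} finally {
    if (txn.isActive()) {
        txn.rollback();
    }
}

// verify: get() throws EntityNotFoundException once the entity is gone
try {
    datastore.get(tweetKey);
    System.out.println("Something wrong: entity still exists");
} catch (EntityNotFoundException e) {
    System.out.println("Entity deleted");
}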
I want to use DFC in order to change the value of r_accessor_permit for a group assigned to an ACL:
1. First I chose an ACL called "fme".
2. Then I wrote the following DQL in order to get the groups assigned to it:
select r_accessor_name from dm_acl where object_name = 'fme'
3. I received a list of groups and copy-pasted one of them, which was called grp_corp_lgl_sateri_hk.
Then I wrote the following:
public void changeGroupPermission (){
try{
String myAcl = "fme";
IDfACL acl = (IDfACL)_session.newObject("dm_acl");
acl.setString("object_name", myAcl);
acl.revoke("grp_corp_lgl_sateri_hk","execute_proc");
acl.save();
}catch(Exception E){
System.out.println(E.getLocalizedMessage());
}
}
When I ran it, the result was the following error:
[DM_ACL_E_NOMATCH]error: "No ACEs matched for the name
'grp_corp_lgl_sateri_hk' in the ACL 'fme'."
I am quite sure that I have not made any typing mistake, but I can't find the reason why I am getting this error. Any idea where I am making my mistake?
===> Updating question
After I realised what my mistake was, based on the comments, I gave it another attempt as follows:
try{
String aclName = "fme";
IDfACL acl = _session.getACL(aclName, "LEXOPEDIA");
acl.destroy();
//or
acl.revoke("grp_sateri_prc_lgl_acc2", "change_location");
acl.save();
}catch(Exception E){
E.printStackTrace();
}
But I still keep getting an error and I really don't know why. Any idea?
Your mistake was thinking that retrieving an ACL from the repository is done using the IDfSession.newObject(<type_name>) method. That method serves to create new objects in the repository. Instead you should do:
IDfACL acl = (IDfACL) sess.getObjectByQualification("dm_acl where object_name='" + myAcl + "'");
acl.revoke("grp_corp_lgl_sateri_hk","execute_proc");
acl.save();
or
IDfACL acl = sess.getACL(myACL, <your ACL's domain name>);
// domain name could be "DM_DBO" if your ACL is available for everyone in repository
acl.revoke("grp_corp_lgl_sateri_hk","execute_proc");
acl.save();
Since you created a brand-new ACL object, it is only logical that at that point it has no accessor named grp_corp_lgl_sateri_hk, which is exactly what the error is telling you.
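If it helps, here is a minimal sketch (assuming a valid IDfSession named session) that fetches the existing ACL and lists its accessors, so you can confirm the exact accessor name before calling revoke():
import com.documentum.fc.client.IDfACL;
import com.documentum.fc.client.IDfSession;

// fetch the existing ACL by qualification and print every accessor it contains
IDfACL acl = (IDfACL) session.getObjectByQualification(
        "dm_acl where object_name = 'fme'");
for (int i = 0; i < acl.getAccessorCount(); i++) {
    System.out.println(acl.getAccessorName(i) + " -> " + acl.getAccessorPermit(i));
}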
I have some method in my DAO class:
public void insertAVAYAcmCDRs(List<AvayaCmCdr> cdrList) {
AvayaCmCdr aCdrList1 = null;
try {
em.getTransaction().begin();
for (AvayaCmCdr aCdrList : cdrList) {
aCdrList1 = aCdrList;
em.persist(aCdrList);
}
em.getTransaction().commit();
em.clear();
} catch (Exception e) {
logger.log(Level.INFO, "Exception in task time={0}. Exception message = {1}.", new Object[]{aCdrList1.getDate(), e.getMessage()});
}
}
I am trying to save all the entities in the list to the DB. The DB has a unique index that does not allow duplicate rows to be inserted. That works normally on the DB side, but I get an error in Java:
a different object with the same identifier value was already associated with the session:
I get this error on the second iteration of the loop. I printed the object and found a duplicate in the DB.
I want to ignore this error and continue inserting data, or somehow handle the error: if a row is already in the database, I want to skip it and continue inserting.
Why are you assigning aCdrList1 = aCdrList? Is there any specific reason?
You can save the aCdrList object with one of the following instead of persist:
em.merge(aCdrList);
or, if you have access to the underlying Hibernate Session rather than just the JPA EntityManager:
session.saveOrUpdate(aCdrList);
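If the goal is simply to skip rows that already exist rather than abort the whole batch, a sketch of one option follows. It assumes the duplicate is on the primary key and that AvayaCmCdr exposes it via a getId() getter (a hypothetical name); if the unique index is on a non-key column, a count query on that column would be needed instead.
public void insertAVAYAcmCDRs(List<AvayaCmCdr> cdrList) {
    em.getTransaction().begin();
    try {
        for (AvayaCmCdr cdr : cdrList) {
            // skip entities whose primary key is already present in the DB
            if (em.find(AvayaCmCdr.class, cdr.getId()) == null) {
                em.persist(cdr);
            }
        }
        em.getTransaction().commit();
    } catch (Exception e) {
        if (em.getTransaction().isActive()) {
            em.getTransaction().rollback();
        }
        logger.log(Level.WARNING, "Batch insert failed: {0}", e.getMessage());
    }
}
Persisting each row in its own transaction is another way to let a single duplicate fail without losing the rest of the batch.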
I have verified this multiple times using Appstats. When the code below is NOT wrapped in a transaction, JDO performs two datastore reads and one write, 3 RPCs, at a cost of 240. Not just the first time, every time, even though it is accessing the same record every time and hence should be pulling it from the cache. However, when I wrap the code in a transaction as below, the code makes 4 RPCs: begin transaction, get, put, and commit -- of these, only the get is billed as a datastore read, so the overall cost is 70.
If it's pulling it from the cache, why would it bill only for a read? It would seem that it would bill for a write, not a read. Could App Engine be billing me the same amount for non-transactional cache reads as it does for datastore reads? Why?
This is the code WITH transaction:
PersistenceManager pm = PMF.getManager();
Transaction tx = pm.currentTransaction();
String responsetext = "";
try {
tx.begin();
Key userkey = obtainUserKeyFromCookie();
User u = pm.getObjectById(User.class, userkey);
Key mapkey = obtainMapKeyFromQueryString();
// this is NOT a java.util.Map, just FYI
Map currentmap = pm.getObjectById(Map.class, mapkey);
Text mapData = currentmap.getMapData(); // mapData is JSON stored in the entity
Text newMapData = parseModifyAndReturn(mapData); // transform the map
currentmap.setMapData(newMapData); // mutate the Map object
tx.commit();
responsetext = "OK";
} catch (JDOCanRetryException jdoe) {
// log jdoe
responsetext = "RETRY";
} catch (Exception e) {
// log e
responsetext = "ERROR";
} finally {
if (tx.isActive()) {
tx.rollback();
}
pm.close();
}
resp.getWriter().println(responsetext);
This is the code WITHOUT the transaction:
PersistenceManager pm = PMF.getManager();
String responsetext = "";
try {
Key userkey = obtainUserKeyFromCookie();
User u = pm.getObjectById(User.class, userkey);
Key mapkey = obtainMapKeyFromQueryString();
// this is NOT a java.util.Map, just FYI
Map currentmap = pm.getObjectById(Map.class, mapkey);
Text mapData = currentmap.getMapData(); // mapData is JSON stored in the entity
Text newMapData = parseModifyAndReturn(mapData); // transform the map
currentmap.setMapData(newMapData); // mutate the Map object
responsetext = "OK";
} catch (Exception e) {
// log e
responsetext = "ERROR";
} finally {
pm.close();
}
resp.getWriter().println(responsetext);
With the transaction, the PersistenceManager can know that the caches are valid throughout the processing of that code. Without the transaction, it cannot (it doesn't know whether some other action has come in behind its back and changed things) and so must validate the cache's contents against the DB tables. Each time it checks, it needs to create a transaction to do so; that's a feature of the DB interface itself, where any action that's not in a transaction (with a few DB-specific exceptions) will have a transaction automatically added.
In your case, you should have a transaction anyway, because you want to have a consistent view of the database while you do your processing. Without that, the mapData could be modified by another operation while you're in the middle of working on it and those modifications would be silently lost. That Would Be Bad. (Well, probably.) Transactions are the cure.
(You should also look into using AOP for managing the transaction wrapping; that's enormously easier than writing all that transaction management code yourself each time. OTOH, it can add a lot of complexity to deployment until you get things right, so I could understand not following this piece of advice…)
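Even without full AOP, the boilerplate can be factored into one helper so each handler only supplies the work to run inside the transaction. This is just a sketch (not App Engine or JDO library code) that would live in some utility class; PMF is the factory assumed from the snippets above:
import javax.jdo.PersistenceManager;
import javax.jdo.Transaction;

interface DatastoreWork<T> {
    T run(PersistenceManager pm) throws Exception;
}

static <T> T inTransaction(DatastoreWork<T> work) throws Exception {
    PersistenceManager pm = PMF.getManager();
    Transaction tx = pm.currentTransaction();
    try {
        tx.begin();
        T result = work.run(pm);
        tx.commit();
        return result;
    } finally {
        // roll back if the commit never happened, then always release the manager
        if (tx.isActive()) {
            tx.rollback();
        }
        pm.close();
    }
}
Each handler then calls inTransaction with its unit of work (an anonymous inner class, or a lambda on newer runtimes) and keeps only the response handling.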
Can someone please tell me how to get a Text value out of a Google App Engine datastore using Java? I have some entities in the datastore with a Text property named longDescription. When I try this:
DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
Query q = new Query("Items");
PreparedQuery pq = ds.prepare(q);
for (Entity result : pq.asIterable()) {
Text longDescription = (Text)result.getProperty("longDescription");
}
I'm getting this warning on the longDescription assignment line:
WARNING: /pstest
java.lang.ClassCastException: java.lang.String cannot be cast to
com.google.appengine.api.datastore.Text
I'm absolutely bumfuzzled here. The only string in my code is the literal "longDescription" that is used to fetch the correct property. If I put this just above the assignment line:
log.warning("Type is " + (result.getProperty("longDescription")).getClass());
I see the following output:
WARNING: Type is class com.google.appengine.api.datastore.Text
Okay, so result.getProperty("longDescription") really, really is a Text object that is being passed back as an object. I've even tried using the fully qualified name (com.google.appengine.api.datastore.Text) instead of just Text with the same results. Where is the String cast coming in? And more importantly, how do I get that Text out of the datastore? I'm at my wit's end here, and any help would be appreciated!
Oh, one other possibly relevant note: This is the assignment I used when inserting the property into the datastore:
Entity eItem = new Entity("Items");
eItem.setProperty("longDescription", new Text(req.getParameter("ldes")));
ds.put(eItem);
When I look at the description in my management console, it seems to be over 500 characters, and it's displayed like this:
<Text: This is a long form description of an item in the store that is access...>
Did I screw something up when inserting it? If so, how do you insert Text items into the datastore?
I figured out the problem; Wei Hao was right in the comments above. It seems that at some point, I inserted a test String as a longDescription instead of a Text. I'm going to chalk this up to being a lesson learned from the school of hard knocks due to being a bit of a noob with the datastore.
For anyone else who runs across this question, the answer is: If you're iterating over a list of query results, make sure that you're getting back what you expect on every result that comes back! Remember, this isn't an RDBMS and every entity can have a different datatype for the same property. So yes, you can have 1,572,394 entities in which longDescription is a Text and one entity in which longDescription is a String, and this will hose you up.
Here's a little code snippet that probably would help to diagnose this issue:
DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
Query q = new Query("Items");
PreparedQuery pq = ds.prepare(q);
for (Entity result : pq.asIterable()) {
    if (result.getProperty("longDescription") instanceof Text) {
        Text longDescription = (Text) result.getProperty("longDescription");
        // use longDescription as usual
    } else {
        log.severe("Unexpected datatype: longDescription is a "
                + result.getProperty("longDescription").getClass().toString());
    }
}
Here's my code:
DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
Query q = new Query(entityKind);
PreparedQuery pq = ds.prepare(q);
for (Entity e : pq.asIterable()) {
    String longtext = ((Text) e.getProperty("somelongdescription")).getValue();
}
When I create a new H2 database via ORMLite, the database file gets created, but after I close my application, all the data stored in the database is lost:
JdbcConnectionSource connection =
new JdbcConnectionSource("jdbc:h2:file:" + path.getAbsolutePath() + ".h2.db");
TableUtils.createTable(connection, SomeClass.class);
Dao<SomeClass, Integer> dao = DaoManager.createDao(connection, SomeClass.class);
SomeClass sc = new SomeClass(id, ...);
dao.create(sc);
SomeClass retrieved = dao.queryForId(id);
System.out.println("" + retrieved);
This code produces good results: it prints the object that I stored.
But when I start the application again, this time without creating the table and without storing a new object, I get an exception telling me that the required table does not exist:
JdbcConnectionSource connection =
new JdbcConnectionSource("jdbc:h2:file:" + path.getAbsolutePath() + ".h2.db");
Dao<SomeClass, Integer> dao = DaoManager.createDao(connection, SomeClass.class);
SomeClass retrieved = dao.queryForId(id); // will produce an exception..
System.out.println("" + retrieved);
The following worked fine for me if I ran it once and then a second time with the createTable turned off. The second insert gave me a primary key violation of course, but that was expected. It created the file with (as @Thomas mentioned) a ".h2.db.h2.db" suffix.
Some questions:
After you run your application the first time, can you see the database file being created at that path?
Is it on permanent storage and not in some temporary location cleared by the OS?
Any chance some other part of your application is clearing it before the database code begins?
Hope this helps.
#Test
public void testStuff() throws Exception {
File path = new File("/tmp/x");
JdbcConnectionSource connection = new JdbcConnectionSource("jdbc:h2:file:"
+ path.getAbsolutePath() + ".h2.db");
// TableUtils.createTable(connection, SomeClass.class);
Dao<SomeClass, Integer> dao = DaoManager.createDao(connection,
SomeClass.class);
int id = 131233;
SomeClass sc = new SomeClass(id, "fopewjfew");
dao.create(sc);
SomeClass retrieved = dao.queryForId(id);
System.out.println("" + retrieved);
connection.close();
}
I can see Russia from my house:
> ls -l /tmp/
...
-rw-r--r-- 1 graywatson wheel 14336 Aug 31 08:47 x.h2.db.h2.db
Did you close the database? It is closed automatically but it's better to close it manually (so recovery is faster).
In many cases the database URL is the problem. Are you sure the same path is used in both cases? Otherwise you end up with two databases. By the way, the ".h2.db" suffix is added automatically; you don't need to add it manually.
To better analyze the problem, you could append ;TRACE_LEVEL_FILE=2 to the database URL, and then check in the *.trace.db file what SQL statements were executed against the database.
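Putting those suggestions together, here is a minimal sketch (assuming path points at permanent storage and the two-argument SomeClass constructor used above) that lets H2 add its own ".h2.db" suffix and closes the connection source when done:
import com.j256.ormlite.dao.Dao;
import com.j256.ormlite.dao.DaoManager;
import com.j256.ormlite.jdbc.JdbcConnectionSource;
import com.j256.ormlite.table.TableUtils;

int id = 131233;
String url = "jdbc:h2:file:" + path.getAbsolutePath(); // no ".h2.db" here, H2 appends it
JdbcConnectionSource connection = new JdbcConnectionSource(url);
try {
    // create the table only if it is missing, so a second run reuses the existing data
    TableUtils.createTableIfNotExists(connection, SomeClass.class);
    Dao<SomeClass, Integer> dao = DaoManager.createDao(connection, SomeClass.class);
    dao.createIfNotExists(new SomeClass(id, "example"));
    System.out.println("" + dao.queryForId(id));
} finally {
    connection.close(); // close so H2 shuts down cleanly and recovery is fast on the next open
}
createIfNotExists is used so that a row with an existing ID is skipped on the second run instead of raising the primary key violation seen in the test above.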