I'm using Play framework 1.2.5 and Play-Morphia module.
I want to know if there's a way to update many objects in one Morphia query. I've found this example at https://github.com/greenlaw110/play-morphia/blob/master/documentation/manual/crud.textile, but it seems that I can't use the "in" operation in order to find all the objects whose IDs I hold in a list.
I'm trying to update the paidInvoiceDocNum field in each of the objects whose IDs are in the list "itemsIds". This is what I've tried so far:
String q = TransactionItem.find().field("id").in(itemsIds).toString();
TransactionItem.o().set("paidInvoiceDocNum", String.valueOf(docNumber)).update(q);
Without the .toString() it doesn't work either.
Any suggestions?
After a long time of experimenting with Play-Morphia, I've found the way to do this update, and here it is:
Datastore ds = TransactionItem.ds();
UpdateOperations<TransactionItem> op = ds.createUpdateOperations(TransactionItem.class).set("paidInvoiceDocNum", String.valueOf(docNumber));
Query<TransactionItem> q = (Query<TransactionItem>)TransactionItem.q().filter("id in", itemsIds).getMorphiaQuery();
ds.update(q, op);
Hope it helps...
Can you try this?
TransactionItem.o().set("paidInvoiceDocNum", docNumber).update("id in", itemsIds);
BTW, what's your Morphia version? Keep in mind that Play has closed updates to modules. Use this to get the latest Morphia plugin version: https://gist.github.com/greenlaw110/2868365
There are some samples at http://www.mybatis.org/mybatis-dynamic-sql/docs/select.html.
I want to implement limit/offset for MySQL, but I failed to find any documentation describing how to extend this library to support additional where conditions.
Here is what I'd like to achieve:
SelectStatementProvider selectStatement = select(id, animalName, bodyWeight, brainWeight)
.from(animalData)
.where(id, isIn(1, 5, 7))
.and(bodyWeight, isBetween(1.0).and(3.0))
.orderBy(id.descending(), bodyWeight)
.limit(1).offset(10)
.build()
.render(RenderingStrategy.MYBATIS3);
There are a couple of resources you can use.
This page - http://www.mybatis.org/mybatis-dynamic-sql/docs/whereClauses.html - shows an example of using standalone where clauses to build a paging query. This is not exactly what you are looking for, but it shows one way to do it.
There is a unit test showing something that is closer to what you are looking for here - https://github.com/mybatis/mybatis-dynamic-sql/tree/master/src/test/java/examples/paging. This code works for MySQL and you could use it as is.
I hope to make this a little easier in a future release.
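Until built-in support lands, one workaround the linked examples hint at is to render the statement normally and append a MySQL LIMIT/OFFSET clause to the generated SQL yourself. A minimal sketch of that idea, with the rendered SQL represented as a plain string (standing in for selectStatement.getSelectStatement()) since the library types aren't shown here; the method and table names are illustrative assumptions:

```java
public class PagingSketch {

    // MySQL accepts "LIMIT n OFFSET m" after the ORDER BY clause, so simple
    // string concatenation onto the rendered statement is enough for paging.
    static String withPaging(String renderedSql, int limit, int offset) {
        return renderedSql + " limit " + limit + " offset " + offset;
    }

    public static void main(String[] args) {
        // Stand-in for selectStatement.getSelectStatement()
        String renderedSql =
            "select id, animal_name from AnimalData order by id DESC";
        System.out.println(withPaging(renderedSql, 1, 10));
        // -> select id, animal_name from AnimalData order by id DESC limit 1 offset 10
    }
}
```

In a real mapper you would bind limit and offset as parameters rather than concatenating literals, but the clause placement is the same.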
I am using the Java API for Elasticsearch.
I need the autocomplete feature, and for that I am using the Completion Suggester:
CompletionSuggestionBuilder compBuilder = new CompletionSuggestionBuilder("suggestapi");
compBuilder.field(field_where_search);
compBuilder.text(text_to_search);
SuggestRequestBuilder suggestRequestBuilder = client.prepareSuggest(index);
suggestRequestBuilder.addSuggestion(compBuilder);
SuggestResponse suggestResponse = suggestRequestBuilder.execute().actionGet();
I am getting the correct response. Now I want to apply a filter/query along with this suggestion, so that it autocompletes only for records where {"genre": "action"}.
I thought to use "BoolFilterBuilder" but did not find how to apply it to CompletionSuggestionBuilder.
Any solution will be highly appreciated.
Thanks.
No, you can't do that with the Completion Suggester. You would need to post-filter the results yourself.
EDIT
You could change the way you index your completions suggestions, each genre could have its own field. The mapping would look something like:
{
  "movie_drama_suggest": {
    "type": "completion"
  },
  "movie_comedy_suggest": {
    "type": "completion"
  },
  "movie_horror_suggest": {
    "type": "completion"
  }
}
You'll need logic in your application that determines which suggestion field to use during indexing, based on the genre of the movie. Then in your query you can reference the suggestions for a specific genre.
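That genre-to-field rule has to agree between indexing and query time. A small hypothetical helper, assuming the per-genre field names follow the mapping above (movie_&lt;genre&gt;_suggest); the genre values themselves are illustrative:

```java
import java.util.Locale;

public class GenreSuggestField {

    // Derive the completion field name for a genre. Using the same rule at
    // indexing time and at query time keeps suggestions scoped to one genre.
    static String suggestFieldFor(String genre) {
        return "movie_" + genre.toLowerCase(Locale.ROOT) + "_suggest";
    }

    public static void main(String[] args) {
        System.out.println(suggestFieldFor("Horror"));
        // -> movie_horror_suggest
    }
}
```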
CompletionSuggestionBuilder compBuilder = new CompletionSuggestionBuilder("suggestapi");
compBuilder.field("movie_horror_suggest");
compBuilder.text("Ham");
I am using db4o 8.0.
I have a class
class PostedMessage {
    @Indexed
    long receivedTime;
    @Indexed
    long sentTime;
    ...
    // getter and setter methods for all the fields
}
I persist the PostedMessage objects to db4o database. I have already saved 15000+ objects to db4o database. And now when I run following query, it results in OutOfMemoryError.
//Query to get PostedMessages between "start" and "end" dates.
Query q = db.query();
q.constrain(PostedMessage.class);
Constraint from = q.descend("receivedTime").constrain(new Long(start.getTimeInMillis())).greater().equal();
q.descend("receivedTime").constrain(new Long(end.getTimeInMillis())).smaller().equal().and(from);
q.execute();//results in OutOfMemoryError
To avoid the OutOfMemoryError, I need to add indexes to the fields of PostedMessage class. Read This.
I have a server/client configuration. I don't have control over pre-configuring the ObjectContainer before opening it.
I will have to apply/append the indexing CommonConfiguration after the ObjectContainer is just opened and provided to me.
I know how to create the config.
EmbeddedConfiguration appendConfig = Db4oEmbedded.newConfiguration();
appendConfig.common().objectClass(PostedMessage.class).objectField("receivedTime").indexed(true);
appendConfig.common().objectClass(PostedMessage.class).objectField("sentTime").indexed(true);
I am not able to figure out how to apply this config to already opened ObjectContainer.
How can I add indexes to the just opened ObjectContainer?
Is EmbeddedConfigurationItem's apply() method the answer? If it is, can I get a sample code that shows how to use it?
Edit: added the @Indexed annotation to the question later.
Look in the Reference doc at @Indexed.
cl-r's suggestion of using TA/TP worked like a charm in my case. See his comment above.
You also have to install Transparent Activation/Transparent Persistence to avoid loading unnecessary objects into memory. Look at chapters 10 & 11 in the tutorial (in the doc/tutorial directory of the downloaded db4o[Version].zip) - cl-r
In my particular case, I need to iterate over the ObjectSet returned by the query.
It was found that using the IMMEDIATE and SNAPSHOT query modes also solved the OutOfMemoryError, and the timings were equally good. LAZY mode is not the solution for me.
It took about 8000 to 9000 ms to retrieve any 100 PostedMessages out of 100000 saved PostedMessages. e.g. 1 to 100, 1001 to 1100, 99899 to 99999.
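For reference, the query evaluation mode is also set through the common configuration before the container is opened. A sketch, assuming the db4o 8 embedded API; the database file name is an illustrative assumption:

```java
import com.db4o.Db4oEmbedded;
import com.db4o.ObjectContainer;
import com.db4o.config.EmbeddedConfiguration;
import com.db4o.config.QueryEvaluationMode;

public class OpenWithSnapshotMode {
    public static void main(String[] args) {
        // Like field indexes, the evaluation mode must be configured before
        // the container is opened. SNAPSHOT (or IMMEDIATE) avoids the LAZY
        // default that caused the problem described above.
        EmbeddedConfiguration config = Db4oEmbedded.newConfiguration();
        config.common().queries().evaluationMode(QueryEvaluationMode.SNAPSHOT);
        ObjectContainer db = Db4oEmbedded.openFile(config, "messages.db4o");
        try {
            // run queries here
        } finally {
            db.close();
        }
    }
}
```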
You should add indexes for your queries. Otherwise db4o has to scan over all objects.
You can do it with an annotation, like this:
import com.db4o.config.annotations.Indexed;
class PostedMessage {
    @Indexed
    long receivedTime;
    long sentTime;
}
Or as you do, with the configuration:
EmbeddedConfiguration config = Db4oEmbedded.newConfiguration();
config.common().objectClass(PostedMessage.class).objectField("receivedTime").indexed(true);
config.common().objectClass(PostedMessage.class).objectField("sentTime").indexed(true);
ObjectContainer container = Db4oEmbedded.openFile(config,"your-data.db4o");
You cannot add this configuration when the container is already running, only when opening it. If the indexes are not there yet, they will be added while the database is opened. You need to get control at open time, or use the annotation above.
Is there any ORM tool/framework for MongoDB with Java that also supports Maven, so that it will be helpful for applying constraints and using cursors in database operations?
There are some. Start reading:
http://www.mongodb.org/display/DOCS/Java+Language+Center
As for Maven support, just look up the libraries on mvnrepository.com (most of them will be there).
This is what you need:
http://www.infoq.com/articles/mongodb-java-orm-bcd
It is maven-based.
See this presentation on SlideShare: http://www.slideshare.net/mongodb/java-persistence-frameworks-for-mongodb
To work with MongoDB at the grass-roots level, I found this link very helpful: http://howtodoinjava.com/2014/05/29/mongodb-selectqueryfind-documents-examples/
You can use morphia.
It is a wrapper over mongo-java-driver and works well in the production environment. It is well documented and supports raw queries as well.
It also has good SO community support.
Try MongoDBExecutor. It will definitely increase development productivity. Here is a simple sample of CRUD:
@Test
public void test_crud_by_id() {
    Account account = createAccount();
    account.setId(ObjectId.get().toString());

    // create
    collExecutor.insert(account);

    // read
    Account dbAccount = collExecutor.get(Account.class, account.getId());

    // update
    dbAccount.setFirstName("newFirstName");
    collExecutor.update(dbAccount.getId(), N.asMap(FIRST_NAME, dbAccount.getFirstName()));

    // delete
    collExecutor.delete(dbAccount.getId());

    // check
    assertFalse(collExecutor.exists(dbAccount.getId()));
}
Declaration: I'm the developer of AbacusUtil.
I'm learning to use neo4j, but am a bit confused on its usage. When I'm adding nodes and relationships, I can do it like this:
GraphDatabaseService graphDb = new EmbeddedGraphDatabase("C:/temp/graphdb");
Transaction tx = graphDb.beginTx();
try {
org.neo4j.graphdb.Node node = graphDb.createNode();
...
I could also do it like this:
NeoService neoService = new EmbeddedNeo("C:/temp/graphdb");
Transaction tx = neoService.beginTx();
try {
org.neo4j.api.core.Node node = neoService.createNode();
...
What is the difference here really? Which one should I use? Why are they 2 different mechanisms? Is this just API evolution here? :) I want to use the MetaModel API and it needs a NeoService, so the choice there is clear I guess.
Sorry,
you should use the first one, since in the latest 1.0-RC1 the namespace was moved. This is just naming; the semantics are the same. The second example is outdated and should be removed from the official documentation. Where did you find it?
Cheers,
/peter neubauer
You're spot on with the API evolution comment. The old API is NeoService, so you shouldn't use that. Go with your first snippet. For more information on the API change see e.g. the release mail for the latest rc:
http://www.mail-archive.com/user@lists.neo4j.org/msg02378.html
If you use the latest snapshot (0.7-SNAPSHOT) of the meta-model component, you'll find that it uses the latest API. For our 1.0 release (should be out Real Soon Now :), we're going to make non-SNAPSHOT releases of all components that will use the new API.
-EE
And regarding the meta model, please use the meta-model component (now with the maven artifactId: neo4j-meta-model).
I also notice that the component overview http://components.neo4j.org/neo4j-meta-model/ has some invalid example code and descriptions. I'll try to fix that.