MongoDB: cursor NOTIMEOUT setting isn't working in Java client - java

I set the 'notimeout' option on a DBCursor in Java:
BasicDBObject nearbyQueries = new BasicDBObject("$gt", 0)
.append("$lte", 2);
DBCursor trueClassInstances = locationsCollection
        .find(new BasicDBObject("distanceFromHotel", nearbyQueries))
        .addOption(Bytes.QUERYOPTION_NOTIMEOUT)
        .limit(100000);
double counter = 0;
int currentPresent = 0;
for (DBObject instance : trueClassInstances) {
...
}
Even with this option set, this exception is still thrown:
Exception in thread "main" com.mongodb.MongoCursorNotFoundException: Query failed with error code -5 and error message 'Cursor 1876954464377 not found on server XXXXXX:27017' on server XXXXXXXX:27017
at com.mongodb.connection.GetMoreProtocol.receiveMessage(GetMoreProtocol.java:115)
at com.mongodb.connection.GetMoreProtocol.execute(GetMoreProtocol.java:68)
at com.mongodb.connection.GetMoreProtocol.execute(GetMoreProtocol.java:37)
at com.mongodb.connection.DefaultServer$DefaultServerProtocolExecutor.execute(DefaultServer.java:155)
at com.mongodb.connection.DefaultServerConnection.executeProtocol(DefaultServerConnection.java:219)
at com.mongodb.connection.DefaultServerConnection.getMore(DefaultServerConnection.java:194)
at com.mongodb.operation.QueryBatchCursor.getMore(QueryBatchCursor.java:197)
at com.mongodb.operation.QueryBatchCursor.hasNext(QueryBatchCursor.java:93)
at com.mongodb.MongoBatchCursorAdapter.hasNext(MongoBatchCursorAdapter.java:46)
at com.mongodb.DBCursor.hasNext(DBCursor.java:152)
at locationExtraction.DistanceClassification.FeatureAnalyzer.main(FeatureAnalyzer.java:27)
FeatureAnalyzer.java:27 is the for loop line.
This problem appears in other projects with a similar setup as well.
What am I doing wrong? Could my choice of a for-each loop instead of the following kind of iteration cause this strange behavior?
while(cursor.hasNext())
{
DBObject next = cursor.next();
}
Thanks

Looks like you are not able to process each batch within the time limit. Try reducing the batch size so that each batch can be consumed before the time runs out. This should help:
cursor.addOption(com.mongodb.Bytes.QUERYOPTION_NOTIMEOUT).batchSize(100)
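For illustration, here is a minimal sketch of how the cursor from the question could be combined with a smaller batch size (the collection, query, and field names are taken from the question; the batch size of 100 is just an example value):
DBCursor cursor = locationsCollection
        .find(new BasicDBObject("distanceFromHotel",
                new BasicDBObject("$gt", 0).append("$lte", 2)))
        .addOption(Bytes.QUERYOPTION_NOTIMEOUT)
        .batchSize(100)   // smaller batches mean more frequent getMore calls, so each batch is consumed sooner
        .limit(100000);
try {
    while (cursor.hasNext()) {
        DBObject instance = cursor.next();
        // process instance ...
    }
} finally {
    cursor.close(); // no-timeout cursors stay open on the server until closed or exhausted
}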

So the problem is solved.
This is very strange, but there was a problem with using a for-each loop to iterate over the cursor. So don't do it the way I did; use a while loop:
while(cursor.hasNext())
{
DBObject next = cursor.next();
}

Before using cursor.hasNext() and cursor.next() to do the business logic, i.e. just before you get the Mongo cursor, invoke the FindIterable object's noCursorTimeout(true) method. For example:
FindIterable<Document> findIterable = sentenceColl.find(condition);
// set no timeout
findIterable.noCursorTimeout(true);
MongoCursor<Document> mongoCursor = findIterable.iterator();
while (mongoCursor.hasNext()) {
mongoCursor.next();
}
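Since a cursor opened with noCursorTimeout(true) is only cleaned up once it is exhausted or explicitly closed, it is also worth closing it deterministically; MongoCursor is Closeable, so a try-with-resources block works. A minimal sketch, reusing sentenceColl and condition from the example above:
FindIterable<Document> findIterable = sentenceColl.find(condition).noCursorTimeout(true);
try (MongoCursor<Document> mongoCursor = findIterable.iterator()) {
    while (mongoCursor.hasNext()) {
        Document doc = mongoCursor.next();
        // process doc ...
    }
} // the cursor is closed here even if processing throws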

Try this (this appears to use the Jongo API):
Iterator<BasicDBObject> it = coll.find("{\"field\": {$in: #}}", fieldList).with(
        new QueryModifier() {
            public void modify(DBCursor cursor) {
                cursor.setOptions(Bytes.QUERYOPTION_NOTIMEOUT);
            }
        }
).as(BasicDBObject.class);

Related

mongo BulkWriteOperation with upsert option doesn't return upserted entries for updates

I'm using the mongo-java-driver-3.0.4 jar in my application. My MongoDB version is 3.2.10.
Basically, what I'm trying to do is a bulk write operation that creates or updates documents. I'm trying this with the upsert option. What I notice is the following:
Whenever new documents are created, BulkWriteResult#getUpserts() returns the List<BulkWriteUpsert> with the created documents.
However, when I'm trying to update existing documents, BulkWriteResult#getUpserts() returns an empty list.
I use the following snippet:
DBCollection coll = getDBCollection();
BulkWriteOperation bulkWriteOperation = coll.initializeUnorderedBulkOperation();
for (...) { // in a loop to populate the bulkWriteOperation
    DBObject obj = getDbObject();
    bulkWriteOperation.find(getQueryObject()).upsert().replaceOne(obj);
}
BulkWriteResult result = bulkWriteOperation.execute(writeConcern);
This looks like a bug in the driver, but I'm not sure, since the API description says:
Gets an unmodifiable list of upserted items, or the empty list if there were none.
which I understood as: get the list of items which were either updated or inserted.
The upserts will only be populated when the upsert resulted in an insert. Here's an MCVE that demonstrates this:
import com.mongodb.*;

import java.net.UnknownHostException;
import java.util.List;

public class BulkUpsertTest {
    public static void main(String args[]) throws UnknownHostException {
        MongoClient m = new MongoClient("localhost");
        DBCollection coll = m.getDB("test").getCollection("bulkUpsertTest");
        coll.drop(); // drop the collection so that the first iteration is an insert
        test(coll);  // first iteration: insert
        test(coll);  // second iteration: update
    }

    public static void test(DBCollection coll) {
        BulkWriteOperation bulkWriteOperation = coll.initializeUnorderedBulkOperation();
        for (int i = 0; i < 3; i++) {
            DBObject query = new BasicDBObject("_id", i);
            DBObject obj = new BasicDBObject("x", i);
            bulkWriteOperation.find(query).upsert().replaceOne(obj);
        }
        BulkWriteResult result = bulkWriteOperation.execute();
        List<BulkWriteUpsert> upserts = result.getUpserts();
        System.out.println("result: " + result);
        if (upserts != null) {
            System.out.println("Upserts size: " + upserts.size());
        } else {
            System.out.println("Upserts is null");
        }
    }
}
Output (note the differences in matchedCount, modifiedCount, and upserts between the two cases):
result: AcknowledgedBulkWriteResult{insertedCount=0, matchedCount=0, removedCount=0, modifiedCount=0, upserts=[BulkWriteUpsert{index=0, id=0}, BulkWriteUpsert{index=1, id=1}, BulkWriteUpsert{index=2, id=2}]}
Upserts size: 3
result: AcknowledgedBulkWriteResult{insertedCount=0, matchedCount=3, removedCount=0, modifiedCount=3, upserts=[]}
Upserts size: 0
From the Java docs for BulkWriteUpsert:
Represents an upsert request in a bulk write operation that resulted
in an insert. It contains the index of the upsert request in the
operation and the value of the _id field of the inserted document.
So it contains the write operations that resulted in an insert, but not updates.
In the case of updates, the property that may be relevant to you is BulkWriteResult.nModified.
More information is available in the BulkWriteResult documentation.
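As a hedged illustration of how the two cases can be told apart on the result object (using only methods from the legacy com.mongodb.BulkWriteResult and BulkWriteUpsert classes):
BulkWriteResult result = bulkWriteOperation.execute();
// Upserts that resulted in an insert are reported here, with the _id of each inserted document.
for (BulkWriteUpsert upsert : result.getUpserts()) {
    System.out.println("inserted via upsert: index=" + upsert.getIndex() + ", _id=" + upsert.getId());
}
// Upserts that matched an existing document show up in the counters instead.
System.out.println("matched: " + result.getMatchedCount());
if (result.isModifiedCountAvailable()) { // the modified count is not reported by very old servers
    System.out.println("modified: " + result.getModifiedCount());
}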

Creating a unique index in MongoDB

I am using a Java program for MongoDB insertion and trying to create a unique index for a field. product_src is a field in my collection and I want to set it as a unique index to avoid duplicate insertion. I am trying the following code, but it shows a syntax error. What is the problem with it?
DB db;
try {
sample = new MongoClient("myIP",PORT);
db = sample.getDB("client_mahout");
t = db.getCollection("data_flipkart_in_avoid_duplicate_checking");
System.out.println("enter the system ip");
db.t.ensureIndex({"product_src":1});
} catch (Exception e) {}
t is the collection. There is a problem with the line db.t.ensureIndex({"product_src":1});.
Please give me sample code showing how to create a unique index in MongoDB.
For future reference, the way to handle this in the Java Mongo driver v3.0+ is:
public void createUniqueIndex() {
Document index = new Document("field", 1);
MongoCollection<Document> collection = client.getDatabase("db").getCollection("Collection");
collection.createIndex(index, new IndexOptions().unique(true));
}
You need to pass a DBObject to the ensureIndex() method (and call it on the collection object, t, not on db):
t.ensureIndex(new BasicDBObject("product_src", 1));
But the ensureIndex() method has been deprecated since version 2.12; you need to use createIndex() instead:
t.createIndex(new BasicDBObject("product_src", 1));
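Note that the legacy-driver snippet above creates the index but does not mark it as unique; with the DBCollection API the uniqueness flag is passed as a second options DBObject. A small sketch, assuming t is the DBCollection from the question:
// unique:true makes MongoDB reject inserts with a duplicate product_src
t.createIndex(new BasicDBObject("product_src", 1),
        new BasicDBObject("unique", true));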

Java - MongoDB Minimum Operator

I have tried the following to calculate the minimum in Java using the Java driver for MongoDB:
for (int i = 0; i < uniqueRegions.size(); i++) {
BasicDBObject query = new BasicDBObject("Region", uniqueRegions.get(i))
.append("Year", new BasicDBObject("$min", "Year"));
cursor = coll.find(query);
try {
while (cursor.hasNext()) {
System.out.println(uniqueRegions.get(i));
System.out.println(cursor.next());
}
} finally {
cursor.close();
}
}
However I get the error:
Exception in thread "main" com.mongodb.MongoException: Can't canonicalize query: BadValue unknown operator: $min
I've created a query with the $min operator isolated:
BasicDBObject query = new BasicDBObject("$min", "Year");
and it still generates the error, so the problem is with how I'm using it. Can someone tell me the correct syntax to get this to work, please?
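For what it's worth, $min is an aggregation accumulator rather than a query operator, so it normally goes inside a $group stage of an aggregation pipeline. A rough sketch against the legacy driver, reusing coll and uniqueRegions from the question (the output field name minYear is made up for illustration):
// $match restricts to the regions of interest, $group computes the minimum Year per region
DBObject match = new BasicDBObject("$match",
        new BasicDBObject("Region", new BasicDBObject("$in", uniqueRegions)));
DBObject group = new BasicDBObject("$group",
        new BasicDBObject("_id", "$Region")
                .append("minYear", new BasicDBObject("$min", "$Year")));
AggregationOutput output = coll.aggregate(Arrays.asList(match, group));
for (DBObject row : output.results()) {
    System.out.println(row.get("_id") + " -> " + row.get("minYear"));
}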

Not able to retrieve list of documents in MongoDB

{
"question":"what is your color?",
"choices":[{"option":"yello"},{"option":"blue"}],
"creation-date":"2014-04-13",
"expiry date":"2014-04-14"
}
In order to retrieve the list of polls:
public List<Poll> getPolls()
{
    try {
        mongoClient = new MongoClient("NavDeep", 27017);
        db = mongoClient.getDB("sms-voting");
        collection = db.getCollection("pollsCollection");
    } catch (Exception e) {
        e.printStackTrace();
    }
    List<Poll> polls = new ArrayList<Poll>();
    DBCursor cursor = collection.find();
    while (cursor.hasNext())
    {
        DBObject object = cursor.next();
        // what should I write here in order to retrieve List<Poll>?
    }
    mongoClient.close();
    return polls;
}
But I am getting a NullPointerException near BasicDBList pollList = (BasicDBList) object.get("pollsCollection");.
Can anybody please help me out? What should I actually write inside get()?
Thanks,
deepthi
I think the problem could be that you're doing cursor.next() twice in the lines above, pulling two records at a time.
What if you try:
polls.add(object)
instead of
polls.add(cursor.next())
Rather than iterating the cursor you appear to want the .toArray() method:
DBCursor cursor = collection.find();
List<DBObject> list = cursor.toArray();
Generally cursors are a good idea, and you probably should be building any array-type results from within that loop. But this is a way to pull the cursor's results into a list in one call.
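To get from those DBObjects to the question's List<Poll>, the loop body can map the fields by hand. A sketch under the assumption that Poll exposes setQuestion and setChoices setters (the real Poll class is not shown in the question); the field names come from the sample document at the top:
List<Poll> polls = new ArrayList<Poll>();
DBCursor cursor = collection.find();
try {
    while (cursor.hasNext()) {
        DBObject object = cursor.next();
        Poll poll = new Poll();                              // hypothetical Poll bean
        poll.setQuestion((String) object.get("question"));
        List<String> options = new ArrayList<String>();
        BasicDBList choices = (BasicDBList) object.get("choices");
        if (choices != null) {
            for (Object choice : choices) {
                options.add((String) ((DBObject) choice).get("option"));
            }
        }
        poll.setChoices(options);
        polls.add(poll);
    }
} finally {
    cursor.close();
}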

Morphia stops when inserting objects with predefined ids

I need to insert a list of objects with a predefined _id (Long) into a collection.
The insert(object) method for a single object from AdvancedDatastore works great. The trouble begins when I try to use the insert() method which accepts an Iterable. Here is a sample piece of code:
try {
advancedDatastore.insert("collection_name", feeds, WriteConcern.ERRORS_IGNORED);
} catch (Exception e) {
e.printStackTrace();
}
I would guess that this code is supposed to ignore errors (an object with a duplicate id already exists in the collection) and just continue with the next item, but it does not. And no exception is raised.
Thanks!
Update:
This code inserts all the elements, but "1" is not printed out.
try {
System.err.println(0);
advancedDatastore.insert("collection_name", feeds, WriteConcern.ERRORS_IGNORED.continueOnErrorForInsert(true));
System.err.println(1);
} catch (Exception e) {
e.printStackTrace();
}
Update2:
Sorry, the code completes properly and "1" is printed out, but it takes tremendously more time than single inserts. In my case, 35,000 inserts done one by one take 3 seconds; in a batch, 100+ seconds.
Update3:
So far the best way I have found to deal with the issue is to use the native Java driver for MongoDB.
First I convert my object list to a DBObject list:
final List<DBObject> dbObjects = new ArrayList<DBObject>();
for (MyObject object: objectList) {
dbObjects.add(morphia.toDBObject(object));
}
Then I insert through the Mongo DB instance:
db.getCollection("collection_name").insert(dbObjects, WriteConcern.UNACKNOWLEDGED.continueOnErrorForInsert(true));
Performance for inserting 150,000 objects:
Native DB insert: 2-3 seconds
via Morphia's insert(object): 15+ seconds
via Morphia's insert(Iterable): 400+ seconds
A better way would be appreciated.
It works for me this way:
final List<DBObject> dbObjects = new ArrayList<DBObject>();
try {
    TypedQuery<RegistroCivil> consulta = em.createQuery("select p from RegistroCivil p", RegistroCivil.class);
    List<RegistroCivil> lista = consulta.getResultList();
    for (RegistroCivil object : lista) {
        dbObjects.add(morphia.toDBObject(object));
    }
    long start = System.currentTimeMillis();
    ds.getCollection(RegistroCivil.class).insert(dbObjects);
    //ds.save(lista);
    long end = System.currentTimeMillis();
    tmongo = end - start;
} catch (Exception e) {
    e.printStackTrace(); // handle as appropriate
}
