Spring WebFlux Reactive Mongo Bulk Operations (Java)

https://github.com/spring-projects/spring-data-mongodb/issues/2821
https://jira.spring.io/browse/DATAMONGO-1922?redirect=false
I have been looking for reactive bulk operations to update documents as a batch in Spring WebFlux, like the following with the blocking MongoTemplate:
var bulkOps = mongoTemplate.bulkOps(BulkOperations.BulkMode.UNORDERED, DTO.class);
for (DTO dto : dtos) {
    Query query = new Query();
    query.addCriteria(Criteria.where(ID).is(dto.getId()));
    Update update = new Update().set(STATUS, dto.getStatus());
    bulkOps.updateOne(query, update);
}
bulkOps.execute();
Is there a workaround to implement that operation in a reactive way, since ReactiveMongoTemplate does not seem to support it currently?
Similar topic on SO: Bulk Update with ReactiveMongoTemplate

A quick reminder that Bulk is different from UpdateMulti.
Bulk is meant to send multiple write queries and therefore update various objects independently. UpdateMulti, on the other hand, is intended to apply a single update to all documents where the expression matches.
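To make the contrast concrete, here is a minimal sketch with the blocking MongoTemplate (the Dto entity, its status field, and the ids are hypothetical):
// updateMulti: one query, one update, applied to every matching document
mongoTemplate.updateMulti(
        Query.query(Criteria.where("status").is("PENDING")),
        Update.update("status", "PROCESSED"),
        Dto.class);

// bulk: many independent (query, update) pairs sent to the server in a single batch
BulkOperations bulkOps = mongoTemplate.bulkOps(BulkOperations.BulkMode.UNORDERED, Dto.class);
bulkOps.updateOne(Query.query(Criteria.where("_id").is(1)), Update.update("status", "A"));
bulkOps.updateOne(Query.query(Criteria.where("_id").is(2)), Update.update("status", "B"));
bulkOps.execute();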
As for reactive bulk writes, you should be able to use ReactiveMongoTemplate and implement something like this:
reactiveMongoTemplate.getCollection("collection_name")
    .flatMap(mongoCollection -> {
        List<UpdateOneModel<Document>> operations = dtos.stream()
            .map(dto -> {
                // map the DTO onto a Document using the template's converter
                Document doc = new Document("status", dto.getStatus());
                reactiveMongoTemplate.getConverter().write(dto, doc);
                // filter matching a single document per DTO
                Document filter = new Document("id", dto.getId());
                return new UpdateOneModel<Document>(filter, new Document("$set", doc));
            })
            .collect(Collectors.toList());
        return Mono.from(mongoCollection.bulkWrite(operations));
    });
You can also add custom options to bulkWrite() if you desire.
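For example, a minimal sketch passing BulkWriteOptions (from com.mongodb.client.model) to make the batch unordered, so one failing update does not abort the remaining ones:
// inside the flatMap above, instead of the plain bulkWrite(operations) call
BulkWriteOptions options = new BulkWriteOptions().ordered(false);
return Mono.from(mongoCollection.bulkWrite(operations, options));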
If more filters are needed, you can append them to the filter document:
Document filter = new Document("id", dto.getId())
    .append("country", dto.getCountry());

Related

Is there any way to update/replace a whole MongoDB document from Java using Morphia?

I need to replace a whole existing document in MongoDB from Java instead of setting every field. Is there any way? I am using Morphia.
Right now I am setting the fields one by one; the following is my code:
final Query<Timesheet> findQuery = ds.createQuery(Timesheet.class)
    .field("procId").equal(procId);
final UpdateOperations<Timesheet> updateOperations = ds.createUpdateOperations(Timesheet.class)
    .set("wheelInTime", timesheet.getWheelInTime())
    .set("wheelOutTime", timesheet.getWheelOutTime())
    .set("tableOnTime", timesheet.getTableOnTime())
    .set("tableOffTime", timesheet.getTableOffTime());
final UpdateResults results = ds.updateFirst(findQuery, updateOperations);
You can 'overwrite' any entry in a MongoDB collection by simply creating a new object with the same _id field and saving it to the database. So just set the fields in your object as you would on any Java object and use myCollection.save(obj).
Just save the object and it will overwrite the document with the same _id. This can be done with one line of code:
dao.save(timesheet);
A more complete example of the usage of the Morphia DAO:
class Dao extends BasicDAO<Timesheet, String> {
    Dao(Datastore ds) {
        super(Timesheet.class, ds);
    }
}

Datastore ds = morphia.createDatastore(mongoClient, DB_NAME);
Dao dao = new Dao(ds);
dao.save(timesheet);

Neo4j OGM FilteredQueryBuilder

Has anyone used the class FilteredQueryBuilder to create a Cypher query in Java?
I'm trying to create this query using neo4j-ogm:
MATCH (n:Message) WHERE n.messageContext = 'RECEBER_BOLETO_EM_ABERTO'
MATCH (n)-[r0:NEXT]->(m) WHERE r0.response = 'SIM'
return m
Map<String, Object> parameters = new HashedMap<>();
parameters.put("messageContext", "RECEBER_BOLETO_EM_ABERTO");
parameters.put("response", "SIM");
Filters filtersNode = new Filters();
Filter filterStartNode = new Filter("messageContext", ComparisonOperator.EQUALS, "RECEBER_BOLETO_EM_ABERTO");
filterStartNode.setNestedEntityTypeLabel("Message");
filterStartNode.setNestedPropertyName("messageContext");
filterStartNode.setRelationshipDirection(Relationship.OUTGOING);
filterStartNode.setBooleanOperator(BooleanOperator.AND);
filtersNode.add(filterStartNode);
Filter filterEndNode = new Filter("response", ComparisonOperator.EQUALS, "SIM");
filterEndNode.setNestedPropertyName("response");
filterEndNode.setRelationshipDirection(Relationship.TYPE);
filterEndNode.setBooleanOperator(BooleanOperator.AND);
filtersNode.add(filterEndNode);
FilteredQuery fq = FilteredQueryBuilder.buildRelationshipQuery("NEXT", filtersNode);
fq.setReturnClause("return m");
The builder class doesn't bind the parameters into the Cypher query and throws the following exception:
org.neo4j.ogm.exception.CypherException: Error executing Cypher; Code: Neo.ClientError.Statement.ParameterMissing; Description: Expected a parameter named messageContext_messageContext_0
Thanks in advance.
The query builders are internal classes of the OGM. Don't rely on them; they could change in the future. The use of Filters with the OGM Session is fine, though.
To build custom Cypher queries, you might be interested in the Cypher-DSL.
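For reference, a minimal sketch of running the equivalent Cypher directly through the OGM Session instead of the internal builders; it assumes a Session obtained from your SessionFactory, that the m nodes map to the Message entity, and $-style parameters (use {param} on older Neo4j servers):
Map<String, Object> parameters = new HashMap<>();
parameters.put("messageContext", "RECEBER_BOLETO_EM_ABERTO");
parameters.put("response", "SIM");

// session.query maps the returned nodes onto the given entity class
Iterable<Message> result = session.query(Message.class,
        "MATCH (n:Message)-[r0:NEXT]->(m) "
        + "WHERE n.messageContext = $messageContext AND r0.response = $response "
        + "RETURN m",
        parameters);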

Streaming the result of an aggregate operation using spring-data-mongodb

I am using spring-data-mongodb and I want to use a cursor for an aggregate operation.
MongoTemplate.stream() takes a Query, so I tried creating the Aggregation instance, converting it to a DbObject using Aggregation.toDbObject(), creating a BasicQuery from that DbObject, and then invoking the stream() method.
This returns an empty cursor.
Debugging the spring-data-mongodb code shows that MongoTemplate.stream() uses the FindOperation, which makes me think spring-data-mongodb does not support streaming an aggregation operation.
Has anyone been able to stream the results of an aggregate query using spring-data-mongodb?
For the record, I can do it using the Java mongodb driver, but I prefer using spring-data.
EDIT Nov 10th - adding sample code:
MatchOperation match = Aggregation.match(Criteria.where("type").ne("AType"));
GroupOperation group = Aggregation.group("name", "type");
group = group.push("color").as("colors");
group = group.push("size").as("sizes");
TypedAggregation<MyClass> agg = Aggregation.newAggregation(MyClass.class, Arrays.asList(match, group));
MongoConverter converter = mongoTemplate.getConverter();
MappingContext<? extends MongoPersistentEntity<?>, MongoPersistentProperty> mappingContext = converter.getMappingContext();
QueryMapper queryMapper = new QueryMapper(converter);
AggregationOperationContext context = new TypeBasedAggregationOperationContext(MyClass.class, mappingContext, queryMapper);
// create a BasicQuery to be used in the stream() method by converting the Aggregation to a DbObject
BasicQuery query = new BasicQuery(agg.toDbObject("myClass", context));
// spring-mongo wires the stream() method to find() operations, not to aggregate() operations, so the stream returns an empty cursor
CloseableIterator<MyClass> iter = mongoTemplate.stream(query, MyClass.class);
// this is an empty cursor
while (iter.hasNext()) {
    System.out.println(iter.next().getName());
}
The following code, not using the stream() method, returns the expected non-empty result of the aggregation:
AggregationResults<HashMap> result = mongoTemplate.aggregate(agg, "myClass", HashMap.class);
For those who are still trying to find the answer to this:
From spring-data-mongo version 2.0.0.M4 onwards (AFAIK) MongoTemplate got an aggregateStream method.
So you can do the following:
AggregationOptions aggregationOptions = Aggregation.newAggregationOptions()
        // very important: if you do not set the batch size, you'll get all the objects at once,
        // and you might run out of memory if the returned data set is too large
        .cursorBatchSize(mongoCursorBatchSize)
        .build();
data = mongoTemplate.aggregateStream(
        Aggregation.newAggregation(Aggregation.group("person_id").count().as("count"))
                .withOptions(aggregationOptions),
        collectionName, YourClazz.class);
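The result is a cursor-backed iterator (CloseableIterator in the 2.x line, as far as I know), so close it when you are done; a minimal consumption sketch, reusing YourClazz from above:
// iterate lazily and close the underlying cursor when done
try (CloseableIterator<YourClazz> it = data) {
    while (it.hasNext()) {
        YourClazz row = it.next();
        // process one document at a time instead of materializing the whole result set
    }
}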

MongoDB: Query using $gte and $lte in java

I want to perform a query on a field that is greater than or equal to AND less than or equal to (I'm using Java, by the way); in other words, >= and <=. As I understand it, MongoDB has $gte and $lte operators, but I can't find the proper syntax for using them. The field I'm accessing is a top-level field.
I have managed to get this to work:
FindIterable<Document> iterable = db.getCollection("1dag").find(new Document("timestamp", new Document("$gt", 1412204098)));
as well as...
FindIterable<Document> iterable = db.getCollection("1dag").find(new Document("timestamp", new Document("$lt", 1412204098)));
But how do you combine these with each other?
Currently I'm playing around with a statement like this, but it does not work:
FindIterable<Document> iterable5 = db.getCollection("1dag").find(new Document( "timestamp", new Document("$gte", 1412204098).append("timestamp", new Document("$lte",1412204099))));
Any help?
Basically you require a range query like this:
db.getCollection("1dag").find({
    "timestamp": {
        "$gte": 1412204098,
        "$lte": 1412204099
    }
})
Since you need multiple query conditions for this range query, you can specify a logical conjunction (AND) by appending the conditions to the query document using the append() method:
FindIterable<Document> iterable = db.getCollection("1dag").find(
        new Document("timestamp", new Document("$gte", 1412204098).append("$lte", 1412204099)));
The constructor new Document(key, value) only gets you a document with one key-value pair. But in this case you need to create a document with more than one. To do this, create an empty document, and then add pairs to it with .append(key, value).
Document timespan = new Document();
timespan.append("$gt", 1412204098);
timespan.append("$lt", 1412204998);
// timespan in JSON:
// { $gt: 1412204098, $lt: 1412204998}
Document condition = new Document("timestamp", timespan);
// condition in JSON:
// { timestamp: { $gt: 1412204098, $lt: 1412204998} }
FindIterable<Document> iterable = db.getCollection("1dag").find(condition);
Or if you really want to do it with a one-liner without temporary variables:
FindIterable<Document> iterable = db.getCollection("1dag").find(
        new Document()
                .append("timestamp", new Document()
                        .append("$gt", 1412204098)
                        .append("$lt", 1412204998)));
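As a side note (assuming the 3.x+ Java driver), the Filters helper in com.mongodb.client.model expresses the same range a bit more readably:
import static com.mongodb.client.model.Filters.and;
import static com.mongodb.client.model.Filters.gte;
import static com.mongodb.client.model.Filters.lte;

// same range query built with the driver's filter builders
FindIterable<Document> iterable = db.getCollection("1dag")
        .find(and(gte("timestamp", 1412204098), lte("timestamp", 1412204099)));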

MongoDB: Simple query issue

Currently I'm trying to learn how to work with MongoDB in Java. I created the collection "plots" and inserted a document:
final Document plotObj = new Document();
plotObj.put(DataKey.PLOT_UUID.getKey(), plot.getUniqueId());
plotObj.put(DataKey.REGION_ID.getKey(), plot.getRegionId());
plotObj.put(DataKey.REGION_WORLD.getKey(), plot.getRegionWorld());
plotObj.put(DataKey.REGION_OWNER.getKey(), plot.getPlotOwner().isPresent() ? plot.getPlotOwner() : null);
plotObj.put(DataKey.PLOT_TRUSTED.getKey(), new BasicDBList().addAll(plot.getTrusted()));
this.collection.insertOne(plotObj);
"DataKey.PLOT_UUID.getKey()" represents a String. "plot.getUniqueId()" represents a java.util.UUID. After inserting this Document, I want to query it:
public boolean hasPlot(UUID plotId) {
    final BasicDBObject query = new BasicDBObject(DataKey.PLOT_UUID.getKey(), new BasicDBObject("$eq", plotId));
    return this.collection.find(query).iterator().hasNext();
}
However, this method always returns false even though the document was successfully inserted.
Maybe this problem can be fixed with ease, but nevertheless: thanks in advance! :)
According to the documentation you don't need the $eq; just write:
new BasicDBObject(DataKey.PLOT_UUID.getKey(), plotId)
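Put together, the lookup could look like this (a sketch of the suggested fix, reusing the field names from the question; the driver matches the java.util.UUID value directly):
public boolean hasPlot(UUID plotId) {
    // match the UUID value directly, without wrapping it in $eq
    final BasicDBObject query = new BasicDBObject(DataKey.PLOT_UUID.getKey(), plotId);
    return this.collection.find(query).iterator().hasNext();
}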
