MongoTemplate upsert property snapshot - java

I'm using MongoDB 4.2.15. Here is an entry:
{
"keys": {
"country": "US",
"channel": "c999"
},
"counters": {
"sale": 0
},
"increments": null
}
I want to be able to initialize the counter set, increment the counters.sale value, and save a snapshot of the increment result to the increments property. Something like this:
db.getCollection('counterSets').update(
{ "$and" : [
{ "keys.country" : "US"},
{ "keys.channel" : "c999"}
]
},
{ "$inc" :
{ "counters.sale" : 10
},
"$set" :
{ "keys" :
{ "country" : "US", "channel" : "c999"},
"increments":
{ "3000c058-b8a7-4cff-915b-4979ef9a6ed9": {"counters" : "$counters"} }
}
},
{upsert: true})
The result is:
{
"_id" : ObjectId("61965aba1501d6eb40588ba0"),
"keys" : {
"country" : "US",
"channel" : "c999"
},
"counters" : {
"sale" : 10.0
},
"increments" : {
"3000c058-b8a7-4cff-915b-4979ef9a6ed9" : {
"counters" : "$counters"
}
}
}
Is it possible to do an update that somehow copies the increment result from the root counters object to the child increments.3000c058-b8a7-4cff-915b-4979ef9a6ed9.counters in a single upsert? I want to implement a safe increment. Or can you suggest another design?

In order to use expressions, your $set has to be part of an aggregation pipeline update. So your query should look like this:
NOTE: I've wrapped the update in square brackets to make it a pipeline.
db.getCollection('counterSets').update(
{ "$and" : [
{ "keys.country" : "US"},
{ "keys.channel" : "c999"}
]
},
[ {"$set": {"counters.sale": {"$sum":["$counters.sale", 10]}}}, {"$set": {"increments.x": "$counters"}}],
{upsert: true})
I haven't found any information about the atomicity of aggregation pipelines, so use this carefully.
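Since the question mentions MongoTemplate, here is a minimal sketch of the same pipeline-style upsert issued through the driver collection that MongoTemplate exposes (Spring Data MongoDB 3.x also offers AggregationUpdate for this). The increments.x key and the increment value 10 are placeholders taken from the example above, and mongoTemplate is assumed to be injected:
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Filters;
import com.mongodb.client.model.UpdateOptions;
import org.bson.Document;
import java.util.Arrays;

// requires MongoDB 4.2+ and a driver that supports pipeline updates (3.11+)
MongoCollection<Document> counterSets = mongoTemplate.getCollection("counterSets");

counterSets.updateOne(
        Filters.and(Filters.eq("keys.country", "US"), Filters.eq("keys.channel", "c999")),
        Arrays.asList(
                // stage 1: increment the counter ($inc is not available inside a pipeline update)
                new Document("$set", new Document("counters.sale",
                        new Document("$sum", Arrays.asList("$counters.sale", 10)))),
                // stage 2: snapshot the freshly incremented counters into the increments map
                new Document("$set", new Document("increments.x", "$counters"))),
        new UpdateOptions().upsert(true));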

Related

Elastic search term query not working on a specific field

I'm new to Elasticsearch. This is how the index looks:
{
"scresults-000001" : {
"aliases" : {
"scresults" : { }
},
"mappings" : {
"properties" : {
"callType" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
}
}
},
"code" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
}
}
},
"data" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
}
}
},
"esdtValues" : {
"type" : "text",
"fields" : {
"keyword" : {
"type" : "keyword",
"ignore_above" : 256
}
}
},
"gasLimit" : {
"type" : "long"
},
AND MORE OTHER Fields.......
I'm trying to create a search query in Java that looks like this:
{
"bool" : {
"filter" : [
{
"term" : {
"sender" : {
"value" : "sendervalue",
"boost" : 1.0
}
}
},
{
"term" : {
"data" : {
"value" : "YWRkTGlxdWlkaXR5UHJveHlAMDAwMDAwMDAwMDAwMDAwMDA1MDBlYmQzMDRjMmYzNGE2YjNmNmE1N2MxMzNhYjdiOGM2ZjgxZGM0MDE1NTQ4M0A3ZjE1YjEwODdmMjUwNzQ4QDBjMDU0YjcwNDhlMmY5NTE1ZWE3YWU=",
"boost" : 1.0
}
}
}
],
"adjust_pure_negative" : true,
"boost" : 1.0
}
}
If I run this query I get 0 hits. If I replace the field "data" with another field, it works. I don't understand what's different.
This is how I actually create the query in Java + Spring Boot:
QueryBuilder boolQuery = QueryBuilders.boolQuery()
.filter(QueryBuilders.termQuery("sender", "sendervalue"))
.filter(QueryBuilders.termQuery("data",
"YWRkTGlxdWlkaXR5UHJveHlAMDAwMDAwMDAwMDAwMDAwMDA1MDBlYmQzMDRjMmYzNGE2YjNmNmE1N2MxMzNhYjdiOGM2ZjgxZGM0MDE1NTQ4M0A3ZjE1YjEwODdmMjUwNzQ4QDBjMDU0YjcwNDhlMmY5NTE1ZWE3YWU="));
Query searchQuery = new NativeSearchQueryBuilder()
.withFilter(boolQuery)
.build();
SearchHits<ScResults> articles = elasticsearchTemplate.search(searchQuery, ScResults.class);
Since you're trying to do an exact match on a string with a term query, you need to do it on the data.keyword field, which is not analyzed. The data field is a text field and is therefore analyzed by the standard analyzer: not only are all letters lowercased, but the = sign at the end also gets stripped off, so there's no way this can match (unless you use a match query on the data field, but then you'd no longer be doing an exact match).
POST _analyze
{
"analyzer": "standard",
"text": "YWRkTGlxdWlkaXR5UHJveHlAMDAwMDAwMDAwMDAwMDAwMDA1MDBlYmQzMDRjMmYzNGE2YjNmNmE1N2MxMzNhYjdiOGM2ZjgxZGM0MDE1NTQ4M0A3ZjE1YjEwODdmMjUwNzQ4QDBjMDU0YjcwNDhlMmY5NTE1ZWE3YWU="
}
Results:
{
"tokens" : [
{
"token" : "ywrktglxdwlkaxr5uhjvehlamdawmdawmdawmdawmdawmda1mdblymqzmdrjmmyznge2yjnmnme1n2mxmznhyjdiogm2zjgxzgm0mde1ntq4m0a3zje1yjewoddmmjuwnzq4qdbjmdu0yjcwndhlmmy5nte1zwe3ywu",
"start_offset" : 0,
"end_offset" : 163,
"type" : "<ALPHANUM>",
"position" : 0
}
]
}
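Applied to the Java code from the question, a minimal sketch would simply point the term query at the keyword sub-field (everything else unchanged):
QueryBuilder boolQuery = QueryBuilders.boolQuery()
        .filter(QueryBuilders.termQuery("sender", "sendervalue"))
        // exact match against the non-analyzed keyword sub-field
        .filter(QueryBuilders.termQuery("data.keyword",
                "YWRkTGlxdWlkaXR5UHJveHlAMDAwMDAwMDAwMDAwMDAwMDA1MDBlYmQzMDRjMmYzNGE2YjNmNmE1N2MxMzNhYjdiOGM2ZjgxZGM0MDE1NTQ4M0A3ZjE1YjEwODdmMjUwNzQ4QDBjMDU0YjcwNDhlMmY5NTE1ZWE3YWU="));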

How to get the count of element with non-empty-array-field when group in mongodb aggregate using Spring Data Mongo?

I have the following documents in a collection named mail_test. Some of them have a tags field, which is an array:
/* 1 */
{
"_id" : ObjectId("601a7c3a57c6eb4c1efb84ff"),
"email" : "aaaa#bbb.com",
"content" : "11111"
}
/* 2 */
{
"_id" : ObjectId("601a7c5057c6eb4c1efb8590"),
"email" : "aaaa#bbb.com",
"content" : "22222"
}
/* 3 */
{
"_id" : ObjectId("601a7c6d57c6eb4c1efb8675"),
"email" : "aaaa#bbb.com",
"content" : "33333",
"tags" : [
"x"
]
}
/* 4 */
{
"_id" : ObjectId("601a7c8157c6eb4c1efb86f4"),
"email" : "aaaa#bbb.com",
"content" : "4444",
"tags" : [
"yyy",
"zzz"
]
}
There are two documents with non-empty tags, so I want the result to be 2.
I use the following statement to aggregate and get the correct tag_count:
db.getCollection('mail_test').aggregate([{$group:{
"_id":null,
"all_count":{$sum:1},
"tag_count":{"$sum":{$cond: [ { $ne: ["$tags", undefined] }, 1, 0]}}
//if replace `undefined` with `null`, I got the tag_count as 4, that is not what I want
//I also have tried `$exists`, but it cannot be used here.
}}])
and the result is:
{
"_id" : null,
"all_count" : 4.0,
"tag_count" : 2.0
}
and I use spring data mongo in java to do this:
private void test(){
Aggregation agg = Aggregation.newAggregation(
Aggregation.match(new Criteria()),//some condition here
Aggregation.group(Fields.fields()).sum(ConditionalOperators.when(Criteria.where("tags").ne(null)).then(1).otherwise(0)).as("tag_count")
//I need an `undefined` instead of `null`,or is there are any other solution?
);
AggregationResults<MailTestGroupResult> results = mongoTemplate.aggregate(agg, MailTest.class, MailTestGroupResult.class);
List<MailTestGroupResult> mappedResults = results.getMappedResults();
int tag_count = mappedResults.get(0).getTag_count();
System.out.println(tag_count);//get 4,wrong
}
I need an undefined instead of null, but I don't know how to do this. Or is there any other solution?
You can use aggregation operators to check whether the tags field exists, with one of the following constructs in the $group stage of your query (to calculate the tag_count value):
"tag_count":{ "$sum": { $cond: [ { $gt: [ { $size: { $ifNull: ["$tags", [] ] }}, 0 ] }, 1, 0] }}
// - OR -
"tag_count":{ "$sum": { $cond: [ $eq: [ { $type: "$tags" }, "array" ] }, 1, 0] }
Both, return the same result (as you had posted).
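In Spring Data, one way to express the first variant is to drop to a raw AggregationExpression for the $cond, since the Criteria API has no equivalent of undefined. A rough sketch (assuming Spring Data MongoDB 2.x+; hasTags is just an illustrative name):
import java.util.Arrays;
import java.util.Collections;
import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationExpression;
import org.springframework.data.mongodb.core.query.Criteria;

// 1 if "tags" exists and is a non-empty array, 0 otherwise
AggregationExpression hasTags = context -> new Document("$cond", Arrays.asList(
        new Document("$gt", Arrays.asList(
                new Document("$size",
                        new Document("$ifNull", Arrays.asList("$tags", Collections.emptyList()))),
                0)),
        1, 0));

Aggregation agg = Aggregation.newAggregation(
        Aggregation.match(new Criteria()), // your conditions here
        Aggregation.group()
                .count().as("all_count")
                .sum(hasTags).as("tag_count"));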

Spring MongoDB Aggregation Group - Can't get the group query right

I have a document in MongoDB which looks as follows.
{
"_id" : ObjectId("5ceb812b3ec6d22cb94c82ca"),
"key" : "KEYCODE001",
"values" : [
{
"classId" : "CLASS_01",
"objects" : [
{
"code" : "DD0001"
},
{
"code" : "DD0010"
}
]
},
{
"classId" : "CLASS_02",
"objects" : [
{
"code" : "AD0001"
}
]
}
]
}
I am interested in getting a result like the following.
{
"classId" : "CLASS_01",
"objects" : [
{
"code" : "DD0001"
},
{
"code" : "DD0010"
}
]
}
To get this, I came up with an aggregation pipeline in Robo 3T, which looks as follows and works as expected.
[
{
$match:{
'key':'KEYCODE001'
}
},
{
"$unwind":{
"path": "$values",
"preserveNullAndEmptyArrays": true
}
},
{
"$unwind":{
"path": "$values.objects",
"preserveNullAndEmptyArrays": true
}
},
{
$match:{
'values.classId':'CLASS_01'
}
},
{
$project:{
'object':'$values.objects',
'classId':'$values.classId'
}
},
{
$group:{
'_id':'$classId',
'objects':{
$push:'$object'
}
}
},
{
$project:{
'_id':0,
'classId':'$_id',
'objects':'$$objects'
}
}
]
Now, when I try to do the same in a Spring Boot application, I can't get it running. I end up with the error java.lang.IllegalArgumentException: Invalid reference '$complication'!. The following is what I have done in Java so far.
final Aggregation aggregation = newAggregation(
match(Criteria.where("key").is("KEYCODE001")),
unwind("$values", true),
unwind("$values.objects", true),
match(Criteria.where("classId").is("CLASS_01")),
project().and("$values.classId").as("classId").and("$values.objects").as("object"),
group("classId", "objects").push("$object").as("objects").first("$classId").as("_id"),
project().and("$_id").as("classId").and("$objects").as("objects")
);
What am I doing wrong? Upon research, I found that grouping on multiple fields does not work, or something like that (please refer to this question). So is what I am currently doing even possible in Spring Boot?
After hours of debugging and trial and error, I found the following solution to be working.
final Aggregation aggregation = newAggregation(
match(Criteria.where("key").is("KEYCODE001")),
unwind("values", true),
unwind("values.objects", true),
match(Criteria.where("values.classId").is("CLASS_01")),
project().and("values.classId").as("classId").and("values.objects").as("object"),
group(Fields.from(Fields.field("_id", "classId"))).push("object").as("objects"),
project().and("_id").as("classId").and("objects").as("objects")
);
It all boils down to group(Fields.from(Fields.field("_id", "classId"))).push("object").as("objects"), which introduces an org.springframework.data.mongodb.core.aggregation.Fields object wrapping a list of org.springframework.data.mongodb.core.aggregation.Field objects. A Field encapsulates the name of the field and its target. This results in the following pipeline, which matches the expected one.
[
{
"$match" :{
"key" : "KEYCODE001"
}
},
{
"$unwind" :{
"path" : "$values", "preserveNullAndEmptyArrays" : true
}
},
{
"$unwind" :{
"path" : "$values.objects", "preserveNullAndEmptyArrays" : true
}
},
{
"$match" :{
"values.classId" : "CLASS_01"
}
},
{
"$project" :{
"classId" : "$values.classId", "object" : "$values.objects"
}
},
{
"$group" :{
"_id" : "$classId",
"objects" :{
"$push" : "$object"
}
}
},
{
"$project" :{
"classId" : "$_id", "objects" : 1
}
}
]
Additionally, I figured out that there is no need to use the $ sign anywhere.

how to update nested field values in elasticsearch using java api

I have the following document in Elasticsearch:
{
"uuid":"123",
"Email":"mail#example.com",
"FirstName":"personFirstNmae",
"LastName":"personLastName",
"Inbox":{
"uuid":"1234",
"messageList":[
{
"uuid":"321",
"Subject":"subject1",
"Body":"bodyText1",
"ArtworkUuid":"101",
"DateAndTime":"2015-10-15T10:59:12.096+05:00",
"ReadStatusInt":0,
"Delete":{
"deleteStatus":0,
"deleteReason":0
}
},
{
"uuid":"123",
"Subject":"subject",
"Body":"bodyText",
"ArtworkUuid":"100",
"DateAndTime":"2015-10-15T10:59:11.982+05:00",
"ReadStatusInt":1,
"Delete":{
"deleteStatus":0,
"deleteReason":0
}
}
]
}
}
and here is the mapping of the doc:
{
"testdb" : {
"mappings" : {
"directUser" : {
"properties" : {
"Email" : {
"type" : "string",
"store" : true
},
"FirstName" : {
"type" : "string",
"store" : true
},
"Inbox" : {
"type" : "nested",
"include_in_parent" : true,
"properties" : {
"messageList" : {
"type" : "nested",
"include_in_parent" : true,
"properties" : {
"ArtworkUuid" : {
"type" : "string",
"store" : true
},
"Body" : {
"type" : "string",
"store" : true
},
"DateAndTime" : {
"type" : "date",
"store" : true,
"format" : "dateOptionalTime"
},
"Delete" : {
"type" : "nested",
"include_in_parent" : true,
"properties" : {
"deleteReason" : {
"type" : "integer",
"store" : true
},
"deleteStatus" : {
"type" : "integer",
"store" : true
}
}
},
"ReadStatusInt" : {
"type" : "integer",
"store" : true
},
"Subject" : {
"type" : "string",
"store" : true
},
"uuid" : {
"type" : "string",
"store" : true
}
}
},
"uuid" : {
"type" : "string",
"store" : true
}
}
},
"LastName" : {
"type" : "string",
"store" : true
},
"uuid" : {
"type" : "string",
"store" : true
}
}
}
}
}
}
Now I want to update the values of Inbox.messageList.Delete.deleteStatus and Inbox.messageList.Delete.deleteReason from 0 to 1 for the message with uuid 321 (Inbox.messageList.uuid).
I want to achieve something like this:
{
"uuid":"123",
"Email":"mail#example.com",
"FirstName":"personFirstNmae",
"LastName":"personLastName",
"Inbox":{
"uuid":"1234",
"messageList":[
{
"uuid":"321",
"Subject":"subject1",
"Body":"bodyText1",
"ArtworkUuid":"101",
"DateAndTime":"2015-10-15T10:59:12.096+05:00",
"ReadStatusInt":0,
"Delete":{
"deleteStatus":1,
"deleteReason":1
}
},
{
"uuid":"123",
"Subject":"subject",
"Body":"bodyText",
"ArtworkUuid":"100",
"DateAndTime":"2015-10-15T10:59:11.982+05:00",
"ReadStatusInt":1,
"Delete":{
"deleteStatus":0,
"deleteReason":0
}
}
]
}
}
I am trying the following code to achieve my desired updated document:
var xb:XContentBuilder=XContentFactory.jsonBuilder().startObject()
.startObject("Inbox")
xb.startArray("messageList")
xb.startObject();
xb.startObject("Delete")
xb.field("deleteStatus",1)
xb.field("deleteReason",1)
xb.endObject()
xb.endObject();
xb.endArray()
.endObject()
xb.endObject()
val responseUpdate=client.prepareUpdate("testdb", "directUser", directUserObj.getUuid.toString())
.setDoc(xb).execute().actionGet()
but with this code my doc becomes:
{"uuid":"123",
"Email":"mail#example.com",
"FirstName":"personFirstNmae",
"LastName":"personLastName",
,"Inbox":{
"uuid":"1234",
"messageList":[
{
"Delete":{
"deleteStatus":1,
"deleteReason":1
}
}
]
}
}
This is not what I want. Please help me: how can I achieve my desired document? I am using Elasticsearch version 1.6.
The best way I've found to update a single nested field is to use the Elasticsearch update API with a (parameterised) script, à la this answer. Last time I checked, this kind of thing is only supported in Groovy scripts, not Lucene expression scripts (unfortunately). The reason your update produces the result it does is that you are replacing the whole nested object rather than updating a specific nested item. A Groovy script update will allow you to select and update only the nested object with the specified ID.
You can also have a look at a nested object that I have updated, and at how I used the UpdateRequest class in Java, here.
Specifically for the Java API, it is also possible to update a nested document as shown in this answer by PeteyPabPro.
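For illustration, a rough sketch of such a parameterised script update against the 1.x Java API (the script and the msgUuid parameter are mine, not from the original post; dynamic Groovy scripting must be enabled, and method names may differ slightly between 1.x releases, so double-check against your client version):
// iterate the nested messages and flip the delete flags on the matching uuid
String script =
        "for (msg in ctx._source.Inbox.messageList) {" +
        "  if (msg.uuid == msgUuid) {" +
        "    msg.Delete.deleteStatus = 1;" +
        "    msg.Delete.deleteReason = 1;" +
        "  }" +
        "}";

client.prepareUpdate("testdb", "directUser", directUserObj.getUuid().toString())
        .setScript(script, ScriptService.ScriptType.INLINE)
        .addScriptParam("msgUuid", "321")
        .execute()
        .actionGet();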

retrieve values from nested json array in mongodb

My mongo collection has entries in the following format
{
"myobj" : {
"objList" : [
{ "location" : "Texas" },
{ "location" : "Houston"},
{ "name":"Sam" }
]
},
"category" : "cat1"
}
{
"myobj" :
{
"objList" : [
{ "location" : "Tennesy" },
{ "location" : "NY"},
{ "location" : "SF" }
]
},
"category" : "cat2"
}
I want to extract the category where the location is "Houston". In the case of a simple JSON object, I can just pass it as a query like this:
BasicDBObject place = new BasicDBObject();
place.put("location", "Houston");
But in the case of nested JSON, I don't know how to pass it as a query and get the appropriate category. I.e., if I pass the location "Houston", it should return its category "cat1". I hope my question is clear now.
Ok, you have your documents:
db.coll1.insert({
"myobj" : {
"objList" : [
{ "location" : "Texas" },
{ "location" : "Houston"},
{ "name":"Sam" }
]
},
"category" : "cat1"
})
and
db.coll1.insert({
"myobj" : {
"objList" : [
{ "location" : "Tennesy" },
{ "location" : "Houston"},
{ "location" : "SF" }
]
},
"category" : "cat1"
})
Now you can find what you want using the dot operator:
db.coll1.find({"myobj.objList.location": "Texas"}).pretty() will return one object which has Texas
db.coll1.find({"myobj.objList.location": "SF"}).pretty() will return one object which has SF
db.coll1.find({"myobj.objList.location": "Houston"}).pretty() will return both objects
And now I hope you will be able to write it in Java. I have never used Java, but based on this question you can do something like the following. If it does not work, just look up how to use dot notation in the Java driver for Mongo:
DBCursor cursor = coll1.find(new BasicDBObject("myobj.objList.location", "Texas"));
P.S. You said that you wanted to retrieve the category. For that, you will need to use a projection: db.coll1.find({<the query I provided>}, {category: 1, _id: 0})
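Putting the query and the projection together in Java, a small sketch using the legacy driver types already shown in the question (coll1 is assumed to be the DBCollection):
// match on the nested array field via dot notation and project only the category
DBObject query = new BasicDBObject("myobj.objList.location", "Houston");
DBObject projection = new BasicDBObject("category", 1).append("_id", 0);

DBCursor cursor = coll1.find(query, projection);
while (cursor.hasNext()) {
    System.out.println(cursor.next().get("category")); // prints "cat1"
}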
