I saw this snippet showing that update_by_query can update the source directly:
POST twitter/_update_by_query
{
"script": {
"inline": "ctx._source.likes++",
"lang": "painless"
},
"query": {
"term": {
"user": "kimchy"
}
}
}
Instead of using Painless, I wrote a native script plugin in Java because of my complex business logic. My document looks like this:
{
"subtotal": 1000,
"markup": 2,
"total": 2000,
"items": [
{
"subtotal": 100,
"markup": 2,
"total": 200
},
{
"subtotal": 500,
"markup": 2,
"total": 1000
}
]
}
The user can set the markup value in the application. If the user changes the markup to 3, I want to update the markup and total fields, including the ones in the nested objects. (Note: I can't use Painless because in my case the logic is more complicated than just multiplying those fields; that's why I use Java.)
// my plugin code
public Object run() {
// change field value of "markup"
// change field value of "total"
return true;
}
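One possible cause of the NullPointerException: in Elasticsearch 5.x, source() belongs to the search-script base class and is only populated during search, while update-by-query scripts receive the document through the ctx variable instead. Independent of the plugin wiring, the mutation itself can be sketched as a pure function over the _source map. This is a sketch that assumes, for illustration only, that total is simply subtotal * markup; the question says the real logic is more complex, and it would go inside recompute().

```java
import java.util.*;

// Sketch of the update logic as a pure function over the _source map.
// The assumption that total = subtotal * markup is illustrative only;
// substitute the real business logic inside recompute().
public class MarkupUpdater {

    @SuppressWarnings("unchecked")
    public static void applyMarkup(Map<String, Object> source, int newMarkup) {
        // Update the top-level fields...
        recompute(source, newMarkup);
        // ...and every element of the nested "items" array.
        List<Map<String, Object>> items = (List<Map<String, Object>>) source.get("items");
        if (items != null) {
            for (Map<String, Object> item : items) {
                recompute(item, newMarkup);
            }
        }
    }

    private static void recompute(Map<String, Object> obj, int markup) {
        int subtotal = ((Number) obj.get("subtotal")).intValue();
        obj.put("markup", markup);
        obj.put("total", subtotal * markup); // placeholder for the real pricing logic
    }
}
```

In a native script, the same map would come from the ctx variable passed in via setNextVar (as ctx.get("_source")) rather than from source().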
My code is very similar to https://github.com/imotov/elasticsearch-native-script-example/blob/master/src/main/java/org/elasticsearch/examples/nativescript/script/TFIDFScoreScript.java
I tried source().put("markup", 3), but I kept getting a NullPointerException.
Elasticsearch version: 5.0.0
Thank you
I'm working on an event collection, where each document is modeled like so:
{
"event_name": "John Doe Concert",
"event_date": "2022-01-01"
"ticket_types": [
{
"name": "Front seats",
"total": 50 (we'll call this `oldTotal1`),
"available": 25 (we'll call this `oldAvailable1`)
},
{
"name": "Back seats",
"total": 100 (we'll call this `oldTotal2`),
"available": 50 (we'll call this `oldAvailable2`)
}
]
}
Suppose I have a REST API supporting a PUT endpoint which accepts a payload like this:
{
"event_name": "Jane Doe Concert",
"event_date": "2022-02-02"
"ticket_types": [
{
"name": "Front seats",
"total": 100 (we'll call this `newTotal1`),
},
{
"name": "Back seats",
"total": 150 (we'll call this `newTotal2`),
}
]
}
As you can see, I'm looking to update the document (it doesn't matter whether it's an update or a replace, as long as the operation is atomic at the document level). For each element i in the ticket_types array in the payload, we can assume newTotal[i] >= oldTotal[i].
The way I'd like my document to look like after the update:
{
"event_name": "Jane Doe Concert",
"event_date": "2022-02-02"
"ticket_types": [
{
"name": "Front seats",
"total": 100 (the value of `newTotal1`),
"available": 75 (the value of `oldAvailable1` + `newTotal1` - `oldTotal1`)
},
{
"name": "Back seats",
"total": 150 (the value of `newTotal2`),
"available": 100 (the value of `oldAvailable2` + `newTotal2` - `oldTotal2`)
}
]
}
The problem I'm having is that I would like to perform the calculations of the resulting total and available values exclusively through Mongo's findOneAndUpdate operation (without fetching the document first and making the changes in Java code afterwards, so to speak). This is because the code runs in AWS Lambda, so I don't think I can rely on any locking mechanism other than the DB's own, nor would I want to touch MongoDB's transactions API.
My attempt so far:
final Document result = collection.findOneAndUpdate(query, Updates.combine(
        Updates.set("event_name", request.getEventName()),
        Updates.set("event_date", request.getDate().toString()),
        // please help me!
        Updates.set("ticket_types.$[elem].total", ...),
        Updates.set("ticket_types.$[elem].available", ...)));
The JavaScript equivalent would be something like:
db.getCollection("events").updateOne({name: 'John Doe Concert'},
{
$set: {
"ticket_types.$[first].total": 1000,
"ticket_types.$[first].available": "ticket_types.$[first].available" + 1000 - "ticket_types.$[first].total",
"ticket_types.$[second].total": 2000,
"ticket_types.$[second].available": "ticket_types.$[second].available" + 2000 - "ticket_types.$[second].total"
}
},
{
arrayFilters: [
{ "first.name": "Front seats" },
{ "second.name": "Back seats" }
]
}
)
Edit: the goal is to update the data, performing the calculation from existing attribute values. I suspect this can be achieved with an update aggregation pipeline.
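That hunch seems right: an update with an aggregation pipeline can compute the new values from existing attributes in one atomic document-level operation (assuming your DocumentDB version supports pipeline-style updates; note that arrayFilters cannot be combined with a pipeline update, so the element is matched inside the pipeline instead). Below is a sketch of the pipeline's shape, built with plain Java maps so the structure is easy to inspect; the names and new totals are the example values from the question, not a tested DocumentDB command.

```java
import java.util.*;

// Builds, as plain maps, an update pipeline of the rough shape
//   [{ $set: { ticket_types: { $map: { input, as, in: { $mergeObjects: [...] } } } } }]
// so each array element keeps its existing fields while total and available
// are recomputed server-side as available + newTotal - oldTotal.
public class PipelineSketch {

    static Map<String, Object> doc(Object... kv) {
        Map<String, Object> m = new LinkedHashMap<>();
        for (int i = 0; i < kv.length; i += 2) m.put((String) kv[i], kv[i + 1]);
        return m;
    }

    // One $switch branch: when the element's name matches, set the new total
    // and recompute available from the element's current values.
    static Map<String, Object> branch(String name, int newTotal) {
        return doc(
            "case", doc("$eq", List.of("$$t.name", name)),
            "then", doc(
                "total", newTotal,
                "available", doc("$add", List.of(
                    "$$t.available",
                    doc("$subtract", List.of(newTotal, "$$t.total"))))));
    }

    public static List<Map<String, Object>> pipeline() {
        Map<String, Object> perElement = doc("$mergeObjects", List.of(
            "$$t",
            doc("$switch", doc(
                "branches", List.of(branch("Front seats", 100), branch("Back seats", 150)),
                "default", doc()))));
        return List.of(doc("$set", doc(
            "ticket_types", doc("$map", doc(
                "input", "$ticket_types", "as", "t", "in", perElement)))));
    }
}
```

With the Java driver, the same structure would be expressed with org.bson.Document objects and passed as a List to findOneAndUpdate (or updateOne), which accepts a list of stages as a pipeline update.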
I am using DocumentDB 4.0 with the application layer written in Java.
Thanks
With a collection of
[
{
"user": "abc",
"seconds": 1111,
"time": ISODate("2020-05-05T00:00:00Z")
},
{
"user": "abc",
"seconds": 2222,
"time": ISODate("2020-05-05T00:00:00Z")
}
]
I need another field that adds the seconds to the time of each individual record.
This seems to be possible with $dateAdd, which was added in MongoDB 5.0 (the database is version 5).
However, I am not able to achieve that with the MongoDB Java driver 4.6.0.
var alerts = db.getCollection("alerts");
alerts.updateMany(eq("user", user),
set(new Field<>("time2",
new Document("$dateAdd",
new Document("startDate", "$time")
.append("unit", "second")
.append("amount", "$seconds"))
)));
Since MongoDB version 4.0 (before $dateAdd arrived in 5.0), you can use $toDate:
db.collection.update({},
[
{
$set: {
time: {
$toDate: {
$add: [
{$toLong: "$time"},
{$multiply: ["$seconds", 1000]}
]
}
}
}
}
],
{multi: true})
See how it works in the playground example.
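The arithmetic in that pipeline is just epoch-millisecond addition, which is easy to sanity-check in plain Java (and note that the Java driver runs such a pipeline when you pass it as a list of stages to updateMany, rather than as an ordinary update document, which is likely why the set(...) attempt above fails):

```java
import java.time.Instant;

// Mirrors the pipeline expression
//   {$toDate: {$add: [{$toLong: "$time"}, {$multiply: ["$seconds", 1000]}]}}
// i.e. convert the date to epoch millis, add seconds * 1000, convert back.
public class DateAddCheck {
    public static Instant addSeconds(Instant time, long seconds) {
        return Instant.ofEpochMilli(time.toEpochMilli() + seconds * 1000L);
    }
}
```

For the first sample document, 2020-05-05T00:00:00Z plus 1111 seconds gives 2020-05-05T00:18:31Z.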
I need to store a Java object (maybe JSON formatted) in Redis. I searched the internet and found the ReJSON module.
{
"site": "sddd",
"pConfig" : {
"floatpoint" : "http://10.32.3.36:8003",
"user" : "root",
"password" : "xxx"
},
"Config": {
"initInSecs": 0,
"checkInSecs": 29
},
"refC": {
"initSecs": 0,
"InSecs": 59,
"InSecsOnDown": 15,
"InMillis" : 5000,
"endPoints": [
{
"ip": "10.32.17.66",
"port": "22"
},
{
"ip": "10.32.17.66",
"port": "21"
}
]
},
"syncWConfig": {
"initDelayInSecs": 0
}
}
Can you please explain how to store this JSON using ReJSON? I also want to retrieve elements and their values. Can you help with a small code snippet?
You should check out the JRedisJSON Java client:
https://github.com/RedisJSON/JRedisJSON
As for search and secondary-index support, it should be available soon for RedisJSON; see https://github.com/RedisJSON/RedisJSON2
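Under the hood, RedisJSON is driven by the JSON.SET and JSON.GET commands, and JRedisJSON is a thin wrapper over them. As a sketch of what gets sent to the server (the key name site:config is made up for illustration; any client that can issue raw commands can send these arguments):

```java
import java.util.List;

// RedisJSON command arguments: JSON.SET stores a document at a path,
// JSON.GET retrieves the whole document or a single element by path.
public class ReJsonCommands {
    public static List<String> setCommand(String key, String path, String json) {
        return List.of("JSON.SET", key, path, json);
    }

    public static List<String> getCommand(String key, String path) {
        return List.of("JSON.GET", key, path);
    }
}
```

For example, getCommand("site:config", ".pConfig.user") would fetch just the nested user value, while path "." addresses the whole document.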
I'm using Elasticsearch for the first time. I'm trying to use the completion suggester on a multi-field, and although I don't see any error, I don't get any response.
Mapping creation:
PUT /products5/
{
"mappings":{
"products" : {
"properties" : {
"name" : {
"type":"text",
"fields":{
"text":{
"type":"keyword"
},
"suggest":{
"type" : "completion"
}
}
}
}
}
}
}
Indexing:
PUT /products5/product/1
{
"name": "Apple iphone 5"
}
PUT /products5/product/2
{
"name": "iphone 4 16GB"
}
PUT /products5/product/3
{
"name": "iphone 3 SS 16GB black"
}
PUT /products5/product/4
{
"name": "Apple iphone 4 S 16 GB white"
}
PUT /products5/product/5
{
"name": "Apple iphone case"
}
Query:
POST /products5/product/_search
{
"suggest":{
"my-suggestion":{
"prefix":"i",
"completion":{
"field":"name.suggest"
}
}
}
}
Output:
{
"took": 0,
"timed_out": false,
"_shards": {
"total": 5,
"successful": 5,
"failed": 0
},
"hits": {
"total": 0,
"max_score": 0,
"hits": []
},
"suggest": {
"my-suggestion": [
{
"text": "i",
"offset": 0,
"length": 1,
"options": []
}
]
}
}
Please guide me on what the mistake is; I have tried every possible option.
At first glance this looks accurate. The reason you don't get a correct response is probably that you added documents to the index before you created the mapping, so the documents were not indexed according to the mapping you specified.
I have found an issue with your mapping name. There is an inconsistency between the name of the mapping and the value you specify in the URL when creating new documents: you create the mapping in the index with the name products, but when you add new documents you specify product (without the s) as the mapping type. You have a typo.
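If the typo is indeed the cause, re-indexing the documents under the type name that matches the mapping should make the suggester return options; for example, for the first document from the question:

```
PUT /products5/products/1
{
  "name": "Apple iphone 5"
}
```

After all five documents are indexed under products (so that name.suggest actually exists for them), the same _search suggest request with prefix "i" should return the iphone completions.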
Edit - Resolved:
The issue turned out to be a combination of our logging code and how json.org handles the distinction between an absent field and a null value. Our logging code didn't print null values in objects, so although the JSON object I saw had no "invoice" field, the actual input had a null "invoice" field. JSONObject's .has() method reported that there was an "invoice" field, but since its value was null it wasn't possible to access it. Replacing the .has() check with a .isNull() check (which covers both absent fields and null values) resolved the problem.
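The behavior can be modeled in a few lines of plain Java. This is a model of the sentinel pattern described above, not json.org's actual implementation: a JSON null is stored under the key as a distinct NULL sentinel object, so a has()-style check reports the key as present even though its value is unusable.

```java
import java.util.*;

// Model of json.org's absent-vs-null distinction: a JSON null literal is
// stored as a NULL sentinel, so has() reports true while getString() would
// fail. An isNull()-style check covers both absent keys and the sentinel.
public class NullSentinelDemo {
    public static final Object NULL = new Object(); // stands in for JSONObject.NULL

    public static boolean has(Map<String, Object> obj, String key) {
        return obj.containsKey(key); // true even when the value is the sentinel
    }

    public static boolean isNull(Map<String, Object> obj, String key) {
        Object v = obj.get(key);
        return v == null || v == NULL; // absent OR explicit JSON null
    }
}
```

With {"invoice": null} as input, has() returns true while isNull() also returns true, which is exactly the combination that made the has() guard insufficient.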
In my Google App Engine application, we parse webhook data in the form of json and go down different code paths depending on the presence of a certain key in the json object - if $.data.object.invoice exists we do some work, otherwise we don't. However, when running in production, this detection seems to be broken. I do a simple
jsonObject.getJSONObject("data").getJSONObject("object").has("invoice")
to check for the invoice key, and sometimes this returns true even when the invoice key doesn't exist. I came across this error because we do make use of the invoice value when it exists, and I was getting JSONExceptions trying to access it, even though it was protected by a has() check. I added the following logging to ensure that I was doing everything correctly:
boolean isAutomaticCharge = jsonObject.getJSONObject("data").getJSONObject("object").has("invoice");
boolean doesGettingStringWork;
try {
jsonObject.getJSONObject("data").getJSONObject("object").getString("invoice");
doesGettingStringWork = true;
}
catch (JSONException e) {
doesGettingStringWork = false;
}
Logger.log("The value of automatic charge is: " + isAutomaticCharge);
Logger.log("Geting the invoice worked? " + doesGettingStringWork);
Logger.log(jsonObject.getJSONObject("data").getJSONObject("object").toString());
if (isAutomaticCharge) {
handleFailedCharge(jsonObject);
}
else {
//Manual Charge
}
And in production the following data resulted in isAutomaticCharge being true and doesGettingStringWork being false:
{
"id": "evt_16vC0s1WJGsEk3Qvnstmq6fa",
"object": "event",
"api_version": "2014-03-28",
"created": 1444678274,
"data": {
"object": {
"id": "ch_16vC0r1WJGsEk3Qv5rxezNQj",
"object": "charge",
"amount": 1900,
"amount_refunded": 0,
"captured": false,
"card": {
"id": "card_16vC0k1WJGsEk3Qvl9LLEt5K",
"object": "card",
"brand": "MasterCard",
"country": "CA",
"customer": "cus_79V0ZH6qFFA4K0",
"cvc_check": "fail",
"exp_month": 9,
"exp_year": 2016,
"fingerprint": "ifxs23sixYzpKant",
"funding": "credit",
"last4": "6912",
"metadata": {},
"name": "derek#footbole.com",
"type": "MasterCard"
},
"created": 1444678273,
"currency": "usd",
"customer": "cus_79V0ZH6qFFA4K0",
"failure_code": "card_declined",
"failure_message": "Your card was declined.",
"fraud_details": {},
"livemode": true,
"metadata": {},
"paid": false,
"refunded": false,
"refunds": [],
"source": {
"id": "card_16vC0k1WJGsEk3Qvl9LLEt5K",
"object": "card",
"brand": "MasterCard",
"country": "CA",
"customer": "cus_79V0ZH6qFFA4K0",
"cvc_check": "fail",
"exp_month": 9,
"exp_year": 2016,
"fingerprint": "ifxs23sixYzpKant",
"funding": "credit",
"last4": "6912",
"metadata": {},
"name": "derek#footbole.com",
"type": "MasterCard"
},
"status": "failed"
}
},
"livemode": true,
"pending_webhooks": 2,
"request": "req_79V0v7433eb9hZ",
"type": "charge.failed"
}
When I run the code locally and feed it that JSON, it works as expected, with isAutomaticCharge and doesGettingStringWork both being false.
I'm running version 20140107 of org.json. I declare a new JSONObject for every request, so threading shouldn't be an issue. Has anyone else had issues running org.json on Google App Engine?