I have an Elasticsearch index which has _timestamp populated on every record. Using Marvel or curl I can get the _timestamp in the "fields" part of the result for example:
GET index/type/_search?fields=_timestamp,_source
{
"took": 11,
"timed_out": false,
"_shards": {
"total": 3,
"successful": 3,
"failed": 0
},
"hits": {
"total": 116888,
"max_score": 1,
"hits": [
{
"_index": "index",
"_type": "type",
"_id": "mXJdWqSLSfykbMtChiCRjA",
"_score": 1,
"_source": {
"results": "example",
},
"fields": {
"_timestamp": 1443618319514
}
},...
However, when doing a search using the Java API, I can't get it to return the _timestamp.
SearchRequestBuilder builder = client.prepareSearch(index)
        .addFacet(facet)
        .setFrom(start)
        .setSize(limit);
SearchResponse response = builder.execute().actionGet();
Can anyone tell me how to ask for the _timestamp too?
You simply need to use the setFields() method, like this:
SearchRequestBuilder builder = client.prepareSearch(index)
        .setTypes(type)
        .addFacet(facet)
        .setFields("_timestamp")   // <--- add this line
        .setFrom(start)
        .setSize(limit);
SearchResponse response = builder.execute().actionGet();
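Once the field is requested, it comes back on each hit under fields rather than in _source. A minimal sketch of reading it from the response, assuming the same transport-client API as above:
for (SearchHit hit : response.getHits().getHits()) {
    SearchHitField timestampField = hit.field("_timestamp");
    if (timestampField != null) {
        // _timestamp is returned as epoch milliseconds, e.g. 1443618319514
        Long timestamp = timestampField.getValue();
        // use the timestamp...
    }
}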
I'm trying to validate the response schema with Karate, but I'm facing an issue with the array.
Attaching the response and feature file, as well as my responseSchema.json.
Response -
{
"page": 1,
"per_page": 6,
"total": 12,
"total_pages": 2,
"data": [
{
"id": 3,
"email": "emma.wong#reqres.in",
"first_name": "Emma",
"last_name": "Wong",
"avatar": "https://reqres.in/img/faces/3-image.jpg"
},
{
"id": 4,
"email": "eve.holt#reqres.in",
"first_name": "Eve",
"last_name": "Holt",
"avatar": "https://reqres.in/img/faces/4-image.jpg"
},
{
"id": 5,
"email": "charles.morris#reqres.in",
"first_name": "Charles",
"last_name": "Morris",
"avatar": "https://reqres.in/img/faces/5-image.jpg"
},
{
"id": 6,
"email": "tracey.ramos#reqres.in",
"first_name": "Tracey",
"last_name": "Ramos",
"avatar": "https://reqres.in/img/faces/6-image.jpg"
}
],
"support": {
"url": "https://reqres.in/#support-heading",
"text": "To keep ReqRes free, contributions towards server costs are appreciated!"
}
}
Scenario: Get all Users and validate schema
Given url getUrl
When method Get
Then status 200
And print response
Then match response == '#object'
* string jsonSchemaExpected = read('file:src/test/resources/features/sample/responseSchema.json')
And print response.data.length
And match response == jsonSchemaExpected
responseSchema.json
{
"page": "#number",
"per_page": "#number",
"total": "#number",
"total_pages": "#number",
"data": "#[] #object",
"support": "#object"
}
The only observation I have is that if you cast the schema to a string, you won't be able to do any matching.
Instead of * string jsonSchemaExpected, use * def jsonSchemaExpected.
I am searching an Elasticsearch index from Java using Elastic's high-level REST client.
My response looks like this...
{
"took": 25,
"timed_out": false,
"_shards": {
"total": 1,
"successful": 1,
"skipped": 0,
"failed": 0
},
"hits": {
"total": {
"value": 10000,
"relation": "gte"
},
"max_score": 2,
"hits": [
{
"_index": "contacts_1_rvmmtqnnlh",
"_type": "_doc",
"_id": "1",
"_score": 2,
"_source": {
"location": {
"lon": -71.34,
"lat": 41.12
}
}
},
{
"_index": "contacts_1_rvmmtqnnlh",
"_type": "_doc",
"_id": "5291485",
"_score": 2,
"_source": {
"home_address1": "208 E Main ST Ste 230",
"firstname": "Geri",
"home_city": "Belleville",
"location": "39.919499456869055,-89.08605153191894",
"lastname": "Boyer"
}
},
...
{
"_index": "contacts_1_rvmmtqnnlh",
"_type": "_doc",
"_id": "5291492",
"_score": 2,
"_source": {
"home_address1": "620 W High ST",
"firstname": "Edna",
"home_city": "Nashville",
"location": "40.55917440131824,-89.24254785283054",
"lastname": "Willis"
}
}
]
}
}
How can I parse out the latitude and longitude of each document hit? The latitude and longitude are stored in a field named "location", which is of type GeoPoint.
Here is what I have tried...
SearchHit[] hits = searchResponse.getHits().getHits();
for (SearchHit hit : hits) {
Map<String, Object> contactMap = hit.getSourceAsMap();
LinkedHashMap<String, Object> contactLHM = new LinkedHashMap<>(contactMap);
Object coordinate = contactLHM.get("location");
location.latitude = ??????
location.longitude = ?????
}
How can I parse out the latitude and longitude, given that the value of the coordinate variable is:
{lon=-71.34, lat=41.12}
By the way, this is the location class definition:
public static class Location{
public Double latitude;
public Double longitude;
}
The _source here indicates that you have saved documents with different _source formats.
You can do that with the geo_point type and, of course, query them with the same query. Elasticsearch understands both formats and normalizes them to the same internal structure (lat, lon), but that doesn't mean it will change your _source, which is exactly the data you saved.
First of all, if it's an option, save the data in only one format so the _source always comes back the same. If that's not an option, then you need to handle both formats in your code (location as a string, and location as an object with lat and lon). Alternatively, you can update your existing _source with an update-by-query script:
https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-update-by-query.html
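For the client-side route, here is a minimal sketch of handling both shapes, reusing the hit loop and Location class from the question (the comma-separated string is assumed to be "lat,lon", as in your sample documents):
for (SearchHit hit : searchResponse.getHits().getHits()) {
    Map<String, Object> source = hit.getSourceAsMap();
    Object coordinate = source.get("location");
    Location location = new Location();
    if (coordinate instanceof Map) {
        // object form: {"lat": 41.12, "lon": -71.34}
        Map<?, ?> latLon = (Map<?, ?>) coordinate;
        location.latitude = ((Number) latLon.get("lat")).doubleValue();
        location.longitude = ((Number) latLon.get("lon")).doubleValue();
    } else if (coordinate instanceof String) {
        // string form: "39.919499456869055,-89.08605153191894" -> "lat,lon"
        String[] parts = ((String) coordinate).split(",");
        location.latitude = Double.parseDouble(parts[0].trim());
        location.longitude = Double.parseDouble(parts[1].trim());
    }
    // use location.latitude / location.longitude ...
}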
I know that we can search multiple indices in Elasticsearch, but how would I know which index a particular search result belongs to?
As per my requirement, I want to provide a global search across different types/indices, but the user should know which index/context each result is coming from, as that will help them correctly associate the result with its context.
Elasticsearch adds some metadata fields to every hit in the search response. Some of them are _index and _type. You can use them for your purpose.
So the sample Elasticsearch response looks like below:
{
"took": 2,
"timed_out": false,
"_shards": {
"total": 5,
"successful": 5,
"failed": 0
},
"hits": {
"total": 19,
"max_score": 1.1,
"hits": Array[10][
{
"_index": "first_index_name",
"_type": "first_type_of_first_index",
"_id": "doc-id-125125422",
"_score": 1.1,
"_source": { /*here is your indexed document*/ }
},
{
"_index": "second_index_name",
"_type": "first_type_of_second_index",
"_id": "doc-id-212452314",
"_score": 0.9,
"_source": {...}
},
...
]
}
}
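If you're reading the response through the Java client rather than raw JSON, each hit exposes the same metadata. A small sketch, assuming the high-level REST client:
for (SearchHit hit : searchResponse.getHits().getHits()) {
    String index = hit.getIndex();   // e.g. "first_index_name" or "second_index_name"
    String type = hit.getType();
    Map<String, Object> source = hit.getSourceAsMap();
    // tag the result with its index/context before returning it to the user...
}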
I'm trying to make a bulk update.
Method: Post
Url: /customer/external/_bulk
Json Body:
{"index":{"_id":"1"}}
{"name": "John Doe" }
{"index":{"_id":"2"}}
{"name": "Jane Doe" }
Id 1 is updated but id 2 wasn't, and I don't know why.
Here is the response:
{
"took": 138,
"errors": false,
"items": [
{
"index": {
"_index": "customer",
"_type": "external",
"_id": "1",
"_version": 15,
"result": "updated",
"_shards": {
"total": 2,
"successful": 1,
"failed": 0
},
"created": false,
"status": 200
}
}
]
}
As @Val mentioned, you need to have a newline character \n at the end of the last line of your JSON body:
{"index":{"_id":"1"}}
{"name": "John Doe" }
{"index":{"_id":"2"}}
{"name": "Jane Doe" }\n
as mentioned in the bulk API documentation. Hope it helps!
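As a side note, if you ever do this from the Java high-level REST client instead of raw HTTP, the client assembles the newline-delimited body for you. A sketch under that assumption, reusing the index, type, and ids from your request:
BulkRequest bulkRequest = new BulkRequest();
bulkRequest.add(new IndexRequest("customer", "external", "1")
        .source(XContentType.JSON, "name", "John Doe"));
bulkRequest.add(new IndexRequest("customer", "external", "2")
        .source(XContentType.JSON, "name", "Jane Doe"));
BulkResponse bulkResponse = client.bulk(bulkRequest, RequestOptions.DEFAULT);
// bulkResponse.hasFailures() tells you whether any item failed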
I am currently attempting to update an Elasticsearch document via the Java API. I have a Groovy script with the following code:
static updateRequestById(String agencyIndex, String type, String id, def policy) {
UpdateRequest updateRequest = new UpdateRequest()
updateRequest.docAsUpsert(true);
updateRequest.parent("agentNumber");
updateRequest.index(agencyIndex)
updateRequest.type(type)
updateRequest.id(id)
updateRequest.doc("policies", policy)
elasticsearchClient.update(updateRequest).get()
}
The problem I am having is that I want to update an array within the following document:
{
"took": 4,
"timed_out": false,
"_shards": {
"total": 10,
"successful": 10,
"failed": 0
},
"hits": {
"total": 1,
"max_score": 1,
"hits": [
{
"_index": "int-b-agency",
"_type": "jacket",
"_id": "99808.1.27.09_4644",
"_score": 1,
"_source": {
"agentNumber": "99808.1.27.09",
"fileNumber": "4644",
"policies": [
{
"agentNumber": "99808.1.27.09",
"fileNumber": "4644",
"policyNumber": "2730609-91029084",
"checkNumber": "0",
"checkAmount": 0,
"createdOn": null,
"createdBy": "traxuser621",
"propertyTypeCode": "",
"propertyTypeDesc": "1-4 FAMILY RESIDENTIAL",
"ppaddress": "110 Allan Ct ",
"ppcity": "Jacksonville",
"ppstate": "FL",
"ppzip": "32226",
"ppcounty": "Duval",
"policytype": "",
"status": "Active",
"effectiveDate": "2015-04-01T00:00:00-05:00",
"formType": "BASIC OWNERS - ALTA Owners Policy 06_306_FL - FL Original Rate",
"rateCode": "FLOR",
"rateCodeDesc": "FL Original Rate",
"policyTypeCode": "1",
"policyTypeCodeDesc": "BASIC OWNERS",
"amount": 200000,
"hoiAgentNumber": "",
"proForma": false,
"pdfLocation": "\\\\10.212.61.206\\FNFCenter\\legacy_jacket_pdfs\\2015_4_FL6465\\Policy_2730609-91029084.pdf",
"legacyPolicy": "true",
"associatedPolNbr": null
}
]
}
}
]
}
}
The document above has an array called "policies" containing a single object. I want to be able to update the "policies" array with additional objects. The end result should look something like the following:
{
"took": 4,
"timed_out": false,
"_shards": {
"total": 10,
"successful": 10,
"failed": 0
},
"hits": {
"total": 1,
"max_score": 1,
"hits": [
{
"_index": "int-b-agency",
"_type": "jacket",
"_id": "41341.1.81.38_41340103",
"_score": 1,
"_source": {
"agentNumber": "41341.1.81.38",
"fileNumber": "41340103",
"policies": [
{
"agentNumber": "41341.1.81.38",
"fileNumber": "41340103",
"policyNumber": "8122638-91036874",
"checkNumber": "0",
"checkAmount": 0,
"createdOn": null,
"createdBy": "traxuser621",
"propertyTypeCode": "",
"propertyTypeDesc": "1-4 FAMILY RESIDENTIAL",
"ppaddress": "1800 Smith St ",
"ppcity": "sicklerville",
"ppstate": "PA",
"ppzip": "08105",
"ppcounty": "Dauphin",
"policytype": "",
"status": "Active",
"effectiveDate": "2016-02-01T00:00:00-06:00",
"formType": "TestData",
"rateCode": "PASALERATE",
"rateCodeDesc": "Sale Rate - Agent",
"policyTypeCode": "26",
"policyTypeCodeDesc": "SALE OWNERS",
"amount": 180000,
"hoiAgentNumber": "",
"proForma": false,
"pdfLocation": "SomeLocation1",
"legacyPolicy": "true",
"associatedPolNbr": null
},
{
"agentNumber": "41341.1.81.38",
"fileNumber": "41340103",
"policyNumber": "8122638-91036875",
"checkNumber": "0",
"checkAmount": 0,
"createdOn": null,
"createdBy": "traxuser621",
"propertyTypeCode": "",
"propertyTypeDesc": "1-4 FAMILY RESIDENTIAL",
"ppaddress": "1800 Smith St ",
"ppcity": "sicklerville",
"ppstate": "PA",
"ppzip": "08105",
"ppcounty": "Dauphin",
"policytype": "",
"status": "Active",
"effectiveDate": "2016-02-01T00:00:00-06:00",
"formType": "Test Data",
"rateCode": "PASALERATE",
"rateCodeDesc": "Sale Rate - Agent",
"policyTypeCode": "26",
"policyTypeCodeDesc": "SALE OWNERS",
"amount": 180000,
"hoiAgentNumber": "",
"proForma": false,
"pdfLocation": "SomeLocation2",
"legacyPolicy": "true",
"associatedPolNbr": null
}
]
}
}
]
}
}
What am I doing wrong?
You can use a scripted update:
Put your new policy in a parameter, for example policy
Use a script like the following:
if (!ctx._source.policies) { ctx._source.policies = [] }
ctx._source.policies += policy
See this documentation: https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-update.html
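A minimal sketch of wiring that into an UpdateRequest from Java. It assumes an Elasticsearch 5.x+ client where Painless is the default script language and the Script(ScriptType, lang, code, params) constructor is available; the parameter name "policy" and the Painless variant of the script are illustrative, not from the original answer:
Map<String, Object> params = new HashMap<>();
params.put("policy", policy);   // the new policy object to append

Script script = new Script(
        ScriptType.INLINE,
        "painless",
        "if (ctx._source.policies == null) { ctx._source.policies = [] } " +
        "ctx._source.policies.add(params.policy)",
        params);

UpdateRequest updateRequest = new UpdateRequest(agencyIndex, type, id).script(script);
elasticsearchClient.update(updateRequest).get();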
Updates in inverted indexes are deletes and re-insertions of documents. There is no in-place update like you'd find in a database. ES uses Lucene under the hood, which in turn implements a kick-ass inverted index.