Convert Elasticsearch Query into Java

I have written an Elasticsearch query and it works completely fine (verified in Kibana). But I have to convert it so that I can call it from Java. I am trying to do it using the repository @Query method, but it is giving me an error during compilation. Please suggest the correct way to do this.
Error: Reason: No property searchLocationOnLevel found for type LocationSearch!; nested exception is org.springframework.data.mapping.PropertyReferenceException: No property searchLocationOnLevel found for type LocationSearch!
Elasticsearch query (working):
GET dev_skp_location/_search
{
  "query": {
    "bool": {
      "must": [
        {
          "regexp": { "name": ".*pur*" }
        },
        {
          "nested": {
            "path": "locationType",
            "query": {
              "bool": {
                "must": [
                  { "match": { "locationType.level": "1" } }
                ]
              }
            },
            "score_mode": "avg"
          }
        }
      ]
    }
  }
}
The way I implemented it with the repository @Query method:
#Query("{\n" +
" \"bool\":{\n" +
" \"must\":[\n" +
" {\n" +
" \"regexp\": { \"name\": \".*pur*\"}\n" +
" },\n" +
" {\n" +
" \"nested\": {\n" +
" \"path\": \"locationType\",\n" +
" \"query\": {\n" +
" \"bool\": {\n" +
" \"must\": [\n" +
" { \n" +
" \"match\": { \"locationType.level\": \"1\" } \n" +
" \n" +
" }]\n" +
" }\n" +
" },\n" +
" \"score_mode\": \"avg\"\n" +
" }\n" +
" }\n" +
" ]\n" +
" }\n" +
" }")
Page<LocationSearch> searchLocationOnLevel(String loc, String level, Pageable pageable);

I was later able to figure out how to do it, but I still feel the JPA-style @Query should have worked too. Anyone who has a better explanation for this is most welcome.
I wrote a query builder method and called it using the normal Elasticsearch query methods.
public Query AutoCompleteLocationQueryBuilder(String locationTerm, String level, Long tenantId) {
    // match on the tenant
    QueryBuilder tenantQuery = QueryBuilders.matchQuery("tenantId", tenantId);

    // regexp on the location name, e.g. ".*pur*"
    String regexExpression = ".*" + locationTerm + "*";
    QueryBuilder regexQuery = QueryBuilders.regexpQuery("name", regexExpression);

    // nested query on locationType.level
    String nestedPath = "locationType";
    BoolQueryBuilder boolQueryBuilder = QueryBuilders.boolQuery();
    MatchQueryBuilder matchQuery = QueryBuilders.matchQuery("locationType.level", level);
    NestedQueryBuilder nestedQuery = QueryBuilders
            .nestedQuery(nestedPath, boolQueryBuilder.must(matchQuery), ScoreMode.Avg);

    // combine the three clauses in a bool/must
    QueryBuilder finalQuery = QueryBuilders.boolQuery()
            .must(tenantQuery)
            .must(regexQuery)
            .must(nestedQuery);

    return new NativeSearchQueryBuilder()
            .withQuery(finalQuery)
            .build()
            .setPageable(PageRequest.of(0, 10));
}
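For completeness, here is a rough sketch of how the returned query might be executed, assuming Spring Data Elasticsearch 4.x with an injected ElasticsearchRestTemplate (or ElasticsearchOperations) and LocationSearch as the mapped document class; the argument values below are only examples:
// Sketch only: elasticsearchTemplate is assumed to be an injected ElasticsearchRestTemplate
Query query = AutoCompleteLocationQueryBuilder("pur", "1", 42L);
SearchHits<LocationSearch> hits = elasticsearchTemplate.search(query, LocationSearch.class);
hits.forEach(hit -> System.out.println(hit.getContent()));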

Related

Deeply nested JSON response from third party

I'm getting this deeply nested JSON response from an API that I have no control over.
What is the best way to get to "generalDetails" and then find the first true value under security, address, account and mobile?
{
  "info_code": "201",
  "info_description": "info description",
  "data": {
    "status": "here goes the status",
    "failure_data": {
      "source": "anySource",
      "details": {
        "data": {
          "server_response": {
            "generalDetails": {
              "security": {
                "isAccountLocked": "false"
              },
              "address": {
                "isAddresExists": "true"
              },
              "account": {
                "accountExists": "true",
                "isValidAccount": "true"
              },
              "mobile": {
                "mobileExists": "true"
              }
            }
          }
        }
      }
    }
  }
}
My request looks like:
@Autowired
private WebClient.Builder webClientBuilder;
String resp = webClientBuilder.build().get().uri(URL)
.accept(MediaType.APPLICATION_JSON)
.retrieve()
.bodyToMono(String.class).block();
First, build the model; you can generate it automatically, e.g. with https://codebeautify.org/json-to-java-converter.
Then read the data with the model:
.bodyToMono(MyData.class)
Then decide how you want to evaluate the requirement "find the first true value under security, address, account and mobile".
What does "first" mean? JSON has no natural order unless you indicate one explicitly (e.g. a field "order": 2).
N.B. The "true" and "false" values in the response are Strings, not booleans.
Once you have the model with data, you may do:
Object firstTrue(GeneralDetails gd) {
// No null checks here
if ("true".equals(gd.getSecurtity().isLockedAccount())) return gd.getSecurtity();
if ("true".equals(gd.getAddress().isAddressExists())) return gd.getAddress();
if ("true".equals(gd.getAccount().isAccountExists()) || "true".equals(gd.getAccount().isAccountValid())) return gd.getAccount();
if ("true".equals(gd.getMobile().isMobileExists())) return gd.getMobile();
return null;
}
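If you would rather not generate the full POJO model, here is a minimal alternative sketch using Jackson's tree model (the path and field names are taken from the response above; ObjectMapper ships with Spring Boot's Jackson dependency, and readTree can throw JsonProcessingException):
ObjectMapper mapper = new ObjectMapper();
// resp is the body string returned by the WebClient call above
JsonNode generalDetails = mapper.readTree(resp)
        .path("data").path("failure_data").path("details")
        .path("data").path("server_response").path("generalDetails");
// all leaf values are Strings ("true"/"false"), so compare as text
boolean addressExists = "true".equals(generalDetails.path("address").path("isAddresExists").asText());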
https://github.com/octomix/josson
Deserialization
Josson josson = Josson.fromJsonString(
"{" +
" \"info_code\": \"201\"," +
" \"info_description\": \"info description\"," +
" \"data\": {" +
" \"status\": \"here goes the status\"," +
" \"failure_data\": {" +
" \"source\": \"anySource\"," +
" \"details\": {" +
" \"data\": {" +
" \"server_response\": {" +
" \"generalDetails\": {" +
" \"security\": {" +
" \"isAccountLocked\": \"false\"" +
" }," +
" \"address\": {" +
" \"isAddresExists\": \"true\"" +
" }," +
" \"account\": {" +
" \"accountExists\": \"true\"," +
" \"isValidAccount\": \"true\"" +
" }," +
" \"mobile\": {" +
" \"mobileExists\": \"true\"" +
" }" +
" }" +
" }" +
" }" +
" }" +
" }" +
" }" +
"}");
Query
JsonNode node = josson.getNode(
"data.failure_data.details.data.server_response" +
".generalDetails.**.mergeObjects().assort().[*]");
System.out.println(node.toPrettyString());
Output
{
"isAddresExists" : "true"
}
If isAddresExists and accountExists are changed to false:
" \"generalDetails\": {" +
" \"security\": {" +
" \"isAccountLocked\": \"false\"" +
" }," +
" \"address\": {" +
" \"isAddresExists\": \"false\"" +
" }," +
" \"account\": {" +
" \"accountExists\": \"false\"," +
" \"isValidAccount\": \"true\"" +
" }," +
" \"mobile\": {" +
" \"mobileExists\": \"true\"" +
" }" +
" }" +
Output
{
"isValidAccount" : "true"
}
If you only want the key name:
String firstTrueKey = josson.getString(
    "data.failure_data.details.data.server_response" +
    ".generalDetails.**.mergeObjects().assort().[*].keys().[0]");
System.out.println(firstTrueKey);
Output
isValidAccount

Spring query convert to a nested JSON structure

I'm new to Spring and Java and trying to figure out how to format the JSON response into the desired structure.
I have a Spring query that returns 2 columns from a table, as shown below; these are really the keys and values I need for the JSON structure:
Names     Values
Car       Toyota
Bike      Schwinn
Scooter   Razor
A0        11
A1        12
A2        13
B0        2000
B1        4000
B2        22000
The current JSON output from the controller is this:
[{
"names": "Car",
"values": "Toyota"
},
{
"names": "Bike",
"values": "Schwinn"
},
{
"names": "Scooter",
"values": "Razor"
},
{
"names": "A0",
"values": "11"
},
{
"names": "A1",
"values": "12"
},
{
"names": "A2",
"values": "13"
},
{
"names": "B0",
"values": "2000"
},
{
"names": "B1",
"values": "4000"
},
{
"names": "B2",
"values": "22000"
}
]
And the desired JSON format is this, where the table column names are removed and the structure is instead built using the names column for the keys:
{
"Car": "Toyota",
"Bike": "Schwinn",
"Scooter": "Razor",
"Data": [{
"A0": "11",
"B0": "2000"
}, {
"A1": "12",
"B1": "4000"
}, {
"A2": "13",
"B2": "22000"
}]
}
Repository
@Query(value = "Select names, values ... :id")
List<Data> findData(@Param("id") Long id);

interface Data {
    String getnames();
    String getvalues();
}
Service
public List<Data> getData(Long id) {return repo.findData(id);}
Controller
@GetMapping("/getdata/{id}")
public ResponseEntity<List<Data>> getData(@PathVariable Long id) {
    List<Data> c = service.getData(id);
    return new ResponseEntity<>(c, HttpStatus.OK);
}
It seems that I need to process the result set and loop through it to create the desired structure, but I'm not sure how to proceed with that, or perhaps there is an easier way to get to the desired structure. Any guidance would be appreciated.
Return a ResponseEntity<Map<String, Object>> instead of a List to represent a JSON object.
List<Data> c = service.getData(id);
Map<String, Object> map = new HashMap<>();
map.put("Key", "Value");
map.put("Car", c.get(0).getvalues());
map.put("Entire List", c);
return new ResponseEntity<>(map, HttpStatus.OK);
Obviously you'll have to write your own logic, but it should be pretty straightforward. Or, even better, consider making a class for the returned object if you're going to be using it a lot, and just return ResponseEntity<YourCustomObject>.
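As a rough sketch of that logic (assuming the key pattern from the question: plain words such as Car stay at the top level, letter-plus-digit keys such as A0/B0 are grouped by their digit suffix into the Data list, and the controller returns ResponseEntity<Map<String, Object>>):
// Sketch only; java.util imports assumed, Data is the projection interface from the question
List<Data> c = service.getData(id);
Map<String, Object> result = new LinkedHashMap<>();
Map<String, Map<String, String>> grouped = new TreeMap<>(); // keyed by the digit suffix "0", "1", "2", ...
for (Data d : c) {
    String name = d.getnames();
    if (name.matches("[A-Za-z]\\d+")) {
        // A0/B0 -> group "0", A1/B1 -> group "1", ...
        grouped.computeIfAbsent(name.substring(1), k -> new LinkedHashMap<>())
               .put(name, d.getvalues());
    } else {
        result.put(name, d.getvalues());
    }
}
result.put("Data", new ArrayList<>(grouped.values()));
return new ResponseEntity<>(result, HttpStatus.OK);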
This looks a bit complicated; I think you should set up a primary-key association for values like A0 and B0.
import com.black_dragon.utils.JacksonUtils;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
import static java.util.stream.Collectors.groupingBy;
/**
 * @author black_dragon
 * @version V1.0
 * @Package com.black_dragon.swing
 * @date 2022/9/6 10:35
 * @Copyright
 */
public class ConvertToMap {
String names;
String values;
public String getNames() {
return names;
}
public void setNames(String names) {
this.names = names;
}
public String getValues() {
return values;
}
public void setValues(String values) {
this.values = values;
}
private static String DIGIT_REGEX = "[^0-9]";
private static String LETTER_DIGIT_REGEX = "[a-zA-Z]+";
public static Integer getDigit(String str){
Pattern pattern = Pattern.compile(DIGIT_REGEX);
if(!isLetterDigit(str)){
String[] keySet = pattern.split(str);
if(keySet.length > 0){
return Integer.valueOf(keySet[1]);
}
}
return -1;
}
public static boolean isLetterDigit(String str){
return str.matches(LETTER_DIGIT_REGEX);
}
private static String fetchGroupKey(ConvertToMap convertToMap){
return String.valueOf(getDigit(convertToMap.names));
}
public static void main(String[] args) {
String jsonString = "[{\n" +
" \"names\": \"Car\",\n" +
" \"values\": \"Toyota\"\n" +
" },\n" +
" {\n" +
" \"names\": \"Bike\",\n" +
" \"values\": \"Schwinn\"\n" +
" },\n" +
" {\n" +
" \"names\": \"Scooter\",\n" +
" \"values\": \"Razor\"\n" +
" },\n" +
" {\n" +
" \"names\": \"A0\",\n" +
" \"values\": \"11\"\n" +
" },\n" +
" {\n" +
" \"names\": \"A1\",\n" +
" \"values\": \"12\"\n" +
" },\n" +
" {\n" +
" \"names\": \"A2\",\n" +
" \"values\": \"13\"\n" +
" },\n" +
" {\n" +
" \"names\": \"B0\",\n" +
" \"values\": \"2000\"\n" +
" },\n" +
" {\n" +
" \"names\": \"B1\",\n" +
" \"values\": \"4000\"\n" +
" },\n" +
" {\n" +
" \"names\": \"B2\",\n" +
" \"values\": \"22000\"\n" +
" }\n" +
"]";
List<ConvertToMap> convertToMaps = JacksonUtils.toJavaList(jsonString, ConvertToMap.class);
// Extract a string that does not contain numbers and convert it to a map
Map<String, Object> result = convertToMaps.stream()
.filter(x -> isLetterDigit(x.names))
.collect(Collectors.toMap(ConvertToMap::getNames, ConvertToMap::getValues));
List<Map<String, String>> mapList = new ArrayList<>();
// Group the entries whose names contain digits by their numeric part
Map<String, List<ConvertToMap>> stringListMap = convertToMaps.stream().collect(groupingBy(convertToMap -> fetchGroupKey(convertToMap)));
for (String key : stringListMap.keySet()) {
if(Integer.valueOf(key) >= 0){
mapList.add(stringListMap.get(key)
.stream()
.collect(Collectors.toMap(ConvertToMap::getNames, ConvertToMap::getValues)));
}
}
result.put("Data", mapList);
System.out.println(JacksonUtils.toJSONString(result));
}
}
Assume that your data key name pattern is one non-digit followed by digits.
https://github.com/octomix/josson
Deserialization
Josson josson = Josson.fromJsonString(
"[" +
" {" +
" \"names\": \"Car\"," +
" \"values\": \"Toyota\"" +
" }," +
" {" +
" \"names\": \"Bike\"," +
" \"values\": \"Schwinn\"" +
" }," +
" {" +
" \"names\": \"Scooter\"," +
" \"values\": \"Razor\"" +
" }," +
" {" +
" \"names\": \"A0\"," +
" \"values\": \"11\"" +
" }," +
" {" +
" \"names\": \"A1\"," +
" \"values\": \"12\"" +
" }," +
" {" +
" \"names\": \"A2\"," +
" \"values\": \"13\"" +
" }," +
" {" +
" \"names\": \"B0\"," +
" \"values\": \"2000\"" +
" }," +
" {" +
" \"names\": \"B1\"," +
" \"values\": \"4000\"" +
" }," +
" {" +
" \"names\": \"B2\"," +
" \"values\": \"22000\"" +
" }" +
"]");
Transformation
JsonNode node = josson.getNode(
"#collect([names !=~ '\\D\\d+']*" +
" .map(names::values)" +
" ,[names =~ '\\D\\d+']*" +
" .group(names.substr(1), map(names::values))#" +
" .elements" +
" .mergeObjects()" +
" .#toObject('Data')" +
")" +
".flatten(1)" +
".mergeObjects()");
System.out.println(node.toPrettyString());
Output
{
"Car" : "Toyota",
"Bike" : "Schwinn",
"Scooter" : "Razor",
"Data" : [ {
"A0" : "11",
"B0" : "2000"
}, {
"A1" : "12",
"B1" : "4000"
}, {
"A2" : "13",
"B2" : "22000"
} ]
}

Response of an HTTP request does not contain the whole data

I send an HTTP POST request to Azure Time Series Insights using the standard Spring Boot WebClient.
Values are missing from the response body.
Environment:
Windows 10
Java 11 (ibm-semeru_jdk-11.0.13+8_openj9 and amazon-corretto_jdk11.0.13_8)
Spring Boot 2.5.6
OkHttpClient 4.9.2
IntelliJ IDEA 2021.2.3
Here are my steps:
I try this with the Spring Boot WebClient:
final ResponseEntity<String> responseEntity = webClient.post()
.uri(tsiUrl)
.header(HttpHeaders.AUTHORIZATION, "Bearer " + token)
.contentType(MediaType.APPLICATION_JSON)
.bodyValue(requestBody)
.retrieve()
.toEntity(String.class)
.block();
return responseEntity != null ? responseEntity.getBody() : null;
and this with OkHttpClient (to verify the response, but I get the same response content):
public class App
{
public static final okhttp3.MediaType JSON = okhttp3.MediaType.get("application/json; charset=utf-8");
public static void main( String[] args )
{
String requestBody = "{\n"
+ " \"aggregateSeries\": {\n"
+ " \"searchSpan\": {\n"
+ " \"from\": \"2021-01-01T00:00Z\",\n"
+ " \"to\": \"2021-12-31T00:00:01Z\"\n"
+ " },\n"
+ " \"timeSeriesId\": [\n"
+ " \"edge-goldwind-qa-002-astraios\",\n"
+ " \"GcmProcessed\"\n"
+ " ],\n"
+ " \"interval\": \"P1D\",\n"
+ " \"inlineVariables\": {\n"
+ " \"gcm01DeteriorationMax\": {\n"
+ " \"kind\": \"numeric\",\n"
+ " \"value\": {\n"
+ " \"tsx\": \"$event.GCM01Deterioration.Double\"\n"
+ " },\n"
+ " \"filter\": null,\n"
+ " \"aggregation\": {\n"
+ " \"tsx\": \"max($value)\"\n"
+ " }\n"
+ " },\"gcm01TemperatureOpticsMax\": {\n"
+ " \"kind\": \"numeric\",\n"
+ " \"value\": {\n"
+ " \"tsx\": \"$event.GCM01TemperatureOptics.Long\"\n"
+ " },\n"
+ " \"filter\": null,\n"
+ " \"aggregation\": {\n"
+ " \"tsx\": \"max($value)\"\n"
+ " }\n"
+ " }\n"
+ " },\n"
+ " \"projectedVariables\": [\n"
+ " \"gcm01DeteriorationMax\",\n"
+ " \"gcm01TemperatureOpticsMax\"\n"
+ " ]\n"
+ " }\n"
+ "}";
String token = "bearertoken"; //removed original bearer token
try {
OkHttpClient client = new OkHttpClient();
Request request = new Request.Builder()
.header("Authorization", "Bearer " + token)
.url("https://1ff924d7-55b5-48c7-8c29-7fcbc18b8776.env.timeseries.azure.cn/timeseries/query?api-version=2020-07-31&storeType=WarmStore")
.post(RequestBody.create(requestBody, JSON))
.build();
Response response = client.newCall(request).execute();
final ResponseBody body = response.body();
final String string = body.string();
} catch (Exception e) {
e.fillInStackTrace();
}
}
}
and I send this POST body:
{
"aggregateSeries": {
"searchSpan": {
"from": "2021-01-01T00:00Z",
"to": "2021-12-31T00:00:01Z"
},
"timeSeriesId": [
"edge-goldwind-qa-002-astraios",
"GcmProcessed"
],
"interval": "P1D",
"inlineVariables": {
"gcm01DeteriorationMax": {
"kind": "numeric",
"value": {
"tsx": "$event.GCM01Deterioration.Double"
},
"filter": null,
"aggregation": {
"tsx": "max($value)"
}
},"gcm01TemperatureOpticsMax": {
"kind": "numeric",
"value": {
"tsx": "$event.GCM01TemperatureOptics.Long"
},
"filter": null,
"aggregation": {
"tsx": "max($value)"
}
}
},
"projectedVariables": [
"gcm01DeteriorationMax",
"gcm01TemperatureOpticsMax"
]
}
}
The result from the Spring Boot WebClient and OkHttpClient (not expected):
{"values":[null,..,null,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,null,...,null],"name":"gcm01DeteriorationMax","type":"Double"}
(I removed runs of null values to make the difference easier to see.)
But if I send the same POST with Postman I get this result (expected):
{"values":[null,..,null,69.209999084472656,95.569999694824219,87.209999084472656,90.419998168945313,89.419998168945313,65.120002746582031,73.19000244140625,75.6500015258789,77.44000244140625,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,100.0,null,null,null,null,null,null,null,null,null,100.0,100.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,0.0,null,..,null],"name":"gcm01DeteriorationMax","type":"Double"}
As you can see, the Postman result contains more values and fewer null values.
I have tried the same POST with the .NET Core 5 HttpClient and I get the same result as with Postman.
My question is: does anyone have an idea what is going wrong here?

Java Spring Elasticsearch "Failed to derive xcontent" with @Query

I have a custom @Query in one of my Elasticsearch repositories because the auto-generated method didn't use match (it used query_string with analyze_wildcard instead) and so didn't work, for example, with spaces. This query looks pretty simple to me, so I thought it wouldn't be a problem to write it myself.
#Query("\"bool\": { " +
" \"filter\": [ " +
" { " +
" \"term\": { " +
" \"userId.keyword\": \"?0\" " +
" } " +
" }, " +
" {" +
" \"match\": { " +
" \"content\": \"?1\" " +
" }" +
" } " +
" ] " +
" }")
Page<SearchablePageHistory> findAllByUserIdAndContentLike(String userId, String content, Pageable pageable);
But when I try to execute that function I get the following error:
org.elasticsearch.ElasticsearchStatusException: Elasticsearch exception [type=x_content_parse_exception, reason=Failed to derive xcontent]
at org.elasticsearch.rest.BytesRestResponse.errorFromXContent(BytesRestResponse.java:177) ~[elasticsearch-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestHighLevelClient.parseEntity(RestHighLevelClient.java:1793) ~[elasticsearch-rest-high-level-client-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestHighLevelClient.parseResponseException(RestHighLevelClient.java:1770) ~[elasticsearch-rest-high-level-client-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1527) ~[elasticsearch-rest-high-level-client-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1484) ~[elasticsearch-rest-high-level-client-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:1454) ~[elasticsearch-rest-high-level-client-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestHighLevelClient.search(RestHighLevelClient.java:970) ~[elasticsearch-rest-high-level-client-7.6.2.jar:7.6.2]
at org.springframework.data.elasticsearch.core.ElasticsearchRestTemplate.lambda$search$10(ElasticsearchRestTemplate.java:265) ~[spring-data-elasticsearch-4.0.0.RELEASE.jar:4.0.0.RELEASE]
at org.springframework.data.elasticsearch.core.ElasticsearchRestTemplate.execute(ElasticsearchRestTemplate.java:351) ~[spring-data-elasticsearch-4.0.0.RELEASE.jar:4.0.0.RELEASE]
at org.springframework.data.elasticsearch.core.ElasticsearchRestTemplate.search(ElasticsearchRestTemplate.java:265) ~[spring-data-elasticsearch-4.0.0.RELEASE.jar:4.0.0.RELEASE]
at org.springframework.data.elasticsearch.repository.query.ElasticsearchStringQuery.execute(ElasticsearchStringQuery.java:89) ~[spring-data-elasticsearch-4.0.0.RELEASE.jar:4.0.0.RELEASE]
at org.springframework.data.repository.core.support.QueryExecutorMethodInterceptor$QueryMethodInvoker.invoke(QueryExecutorMethodInterceptor.java:195) ~[spring-data-commons-2.3.0.RELEASE.jar:2.3.0.RELEASE]
at org.springframework.data.repository.core.support.QueryExecutorMethodInterceptor.doInvoke(QueryExecutorMethodInterceptor.java:152) ~[spring-data-commons-2.3.0.RELEASE.jar:2.3.0.RELEASE]
at org.springframework.data.repository.core.support.QueryExecutorMethodInterceptor.invoke(QueryExecutorMethodInterceptor.java:130) ~[spring-data-commons-2.3.0.RELEASE.jar:2.3.0.RELEASE]
Suppressed: org.elasticsearch.client.ResponseException: method [POST], host [http://localhost:9200], URI [/history/_search?pre_filter_shard_size=128&typed_keys=true&max_concurrent_shard_requests=5&ignore_unavailable=false&expand_wildcards=open&allow_no_indices=true&ignore_throttled=true&search_type=dfs_query_then_fetch&batched_reduce_size=512&ccs_minimize_roundtrips=true], status line [HTTP/1.1 400 Bad Request]
{"error":{"root_cause":[{"type":"x_content_parse_exception","reason":"Failed to derive xcontent"}],"type":"x_content_parse_exception","reason":"Failed to derive xcontent"},"status":400}
at org.elasticsearch.client.RestClient.convertResponse(RestClient.java:283) ~[elasticsearch-rest-client-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestClient.performRequest(RestClient.java:261) ~[elasticsearch-rest-client-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestClient.performRequest(RestClient.java:235) ~[elasticsearch-rest-client-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1514) ~[elasticsearch-rest-high-level-client-7.6.2.jar:7.6.2]
... 124 common frames omitted
With debugging I tracked down the raw REST request that is sent to Elasticsearch in org.elasticsearch.client.RestClient.java:244 and found that this is the payload sent to the server:
{"from":0,"size":10,"query":{"wrapper":{"query":"ImJvb2wiOiB7ICAgICJmaWx0ZXIiOiBbICAgICB7ICAgICAgICJ0ZXJtIjogeyAgICAgICAidXNlcklkLmtleXdvcmQiOiAiMzFjMjA5NTktNjg5Zi00YjI4LWExNzctNmQ3ZTQ2YTBhYzMwIiAgICAgIH0gICAgIH0sICAgeyJtYXRjaCI6IHsgICAgImNvbnRlbnQiOiAidGVzdHNzIiAgIH19ICAgIF0gICB9"}},"version":true,"sort":[{"id":{"order":"desc"}}]}
With that payload an error is not surprising; however, I have no idea why there is this weird jumble of characters. I suspect that it is supposed to be my custom query, which is not being used correctly. I got this payload by debugging into this line:
httpResponse = client.execute(context.requestProducer, context.asyncResponseConsumer, context.context, null).get();
and then executing:
StandardCharsets.UTF_8.decode(((NByteArrayEntity) ((HttpPost) ((HttpAsyncMethods.RequestProducerImpl) context.requestProducer).request).entity).buf).toString()
These are my imports and the class name that I use in the repository class:
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import org.springframework.data.elasticsearch.annotations.Query;
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;
import java.util.stream.Stream;
public interface SearchablePageHistoryRepository extends ElasticsearchRepository<SearchablePageHistory, Integer> {
Page<SearchablePageHistory> findAllByUserId(String userId, Pageable pageable);
#Query("\"bool\": { " +
" \"filter\": [ " +
" { " +
" \"term\": { " +
" \"userId.keyword\": \"?0\" " +
" } " +
" }, " +
" {" +
" \"match\": { " +
" \"content\": \"?1\" " +
" }" +
" } " +
" ] " +
" }")
Page<SearchablePageHistory> findAllByUserIdAndContentLike(String userId, String content, Pageable pageable);
}
All other queries where I don't use @Query work fine without a problem. I have no idea what I am doing wrong, since my example seems very similar to the one given in the documentation: https://docs.spring.io/spring-data/elasticsearch/docs/current/reference/html/#elasticsearch.query-methods.at-query
Hard facepalm, I found my error. I'm still going to leave this post up in case someone else stumbles across the same problem, since the error message is not very helpful in my opinion.
I simply forgot the surrounding curly braces around the outside of the query:
changing this:
#Query("\"bool\": { " +
" \"filter\": [ " +
" { " +
" \"term\": { " +
" \"userId.keyword\": \"?0\" " +
" } " +
" }, " +
" {" +
" \"match\": { " +
" \"content\": \"?1\" " +
" }" +
" } " +
" ] " +
" }")
Page<SearchablePageHistory> findAllByUserIdAndContentLike(String userId, String content, Pageable pageable);
to this:
#Query("{\"bool\": { " +
" \"filter\": [ " +
" { " +
" \"term\": { " +
" \"userId.keyword\": \"?0\" " +
" } " +
" }, " +
" {" +
" \"match\": { " +
" \"content\": \"?1\" " +
" }" +
" } " +
" ] " +
" }}")
Page<SearchablePageHistory> findAllByUserIdAndContentLike(String userId, String content, Pageable pageable);
solved the problem.
Addition:
"ImJvb2wiOiB7ICAgICJmaWx0ZXIiOiBbICAgICB7ICAgICAgICJ0ZXJtIjogeyAgICAgICAidXNlcklkLmtleXdvcmQiOiAiMzFjMjA5NTktNjg5Zi00YjI4LWExNzctNmQ3ZTQ2YTBhYzMwIiAgICAgIH0gICAgIH0sICAgeyJtYXRjaCI6IHsgICAgImNvbnRlbnQiOiAidGVzdHNzIiAgIH19ICAgIF0gICB9"
is a wrapper query; that is a base64-encoded string containing
""bool": { "filter": [ { "term": { "userId.keyword": "31c20959-689f-4b28-a177-6d7e46a0ac30" } }, {"match": { "content": "testss" }} ] }"
I had a similar issue and it was also caused by a missing bracket.

Elasticsearch: rewrite a query using the Java native API

I have this query in Elasticsearch that is working perfectly if I run it from the command line:
POST http://localhost:9200/YOUR_INDEX_NAME/_search/
{
  "size": 0,
  "aggs": {
    "autocomplete": {
      "terms": {
        "field": "autocomplete",
        "order": {
          "_count": "desc"
        },
        "include": {
          "pattern": "c.*"
        }
      }
    }
  },
  "query": {
    "prefix": {
      "autocomplete": {
        "value": "c"
      }
    }
  }
}
I have tried to rewrite it in Java using the native client:
SearchResponse searchResponse2 = newClient.prepareSearch(INDEX_NAME)
.setSearchType(SearchType.DFS_QUERY_THEN_FETCH)
.setQuery("{\n" +
" \"size\": 0,\n" +
" \"aggs\": {\n" +
" \"autocomplete\": {\n" +
" \"terms\": {\n" +
" \"field\": \"autocomplete\",\n" +
" \"order\": {\n" +
" \"_count\": \"desc\"\n" +
" },\n" +
" \"include\": {\n" +
" \"pattern\": \"c.*\"\n" +
" }\n" +
" }\n" +
" }\n" +
" },\n" +
" \"query\": {\n" +
" \"prefix\": {\n" +
" \"autocomplete\": {\n" +
" \"value\": \"c\"\n" +
" }\n" +
" }\n" +
" }\n" +
"}").get();
for (SearchHit res : searchResponse2.getHits()){
System.out.println(res.getSourceAsString());
}
It seems that I'm missing something in this translation process. Thanks in advance.
The Java client's setQuery() method doesn't take a String with the JSON query; you need to build the query using the QueryBuilders helper methods and build the aggregation with the AggregationBuilders helper methods.
In your case that would go like this:
// build the aggregation
TermsBuilder agg = AggregationBuilders.terms("autocomplete")
.field("autocomplete")
.include("c.*")
.order(Terms.Order.count(false));
// build the query
SearchResponse searchResponse2 = newClient.prepareSearch(INDEX_NAME)
.setSearchType(SearchType.DFS_QUERY_THEN_FETCH)
.setSize(0)
.setQuery(QueryBuilders.prefixQuery("autocomplete", "c"))
.addAggregation(agg)
.get();
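The question iterates over the hits, but with size set to 0 there are none to print; assuming the same pre-5.x transport client API as above, the buckets of the terms aggregation could be read roughly like this:
// Sketch: read the buckets of the "autocomplete" terms aggregation
Terms autocomplete = searchResponse2.getAggregations().get("autocomplete");
for (Terms.Bucket bucket : autocomplete.getBuckets()) {
    System.out.println(bucket.getKeyAsString() + " -> " + bucket.getDocCount());
}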
