Deeply nested JSON response from third party - Java

I'm getting this deeply nested JSON response from an API that I have no control over.
What is the best way to get to "generalDetails" and then find the first true value under security, address, account and mobile?
{
"info_code": "201",
"info_description": "info description",
"data": {
"status": "here goes the status",
"failure_data": {
"source": "anySource",
"details": {
"data": {
"server_response": {
"generalDetails": {
"security": {
"isAccountLocked": "false"
},
"address": {
"isAddresExists": "true"
},
"account": {
"accountExists": "true",
"isValidAccount": "true"
},
"mobile": {
"mobileExists": "true"
}
}
}
}
}
}
}
}
My request looks like:
@Autowired
private WebClient.Builder webClientBuilder;
String resp = webClientBuilder.build().get().uri(URL)
.accept(MediaType.APPLICATION_JSON)
.retrieve()
.bodyToMono(String.class).block();

First, build the model; you can generate it automatically from the JSON at https://codebeautify.org/json-to-java-converter.
Then read the data with the model:
.bodyToMono(MyData.class)
Then decide how you want to evaluate the requirement "find the first true value under security, address, account and mobile".
What does "first" mean? JSON has no natural order unless it is indicated explicitly (e.g. a field "order": 2).
N.B. the "true"/"false" values in the response are Strings, not booleans.
Once you have the model with data, you may do:
Object firstTrue(GeneralDetails gd) {
// No null checks here; getter names assume a model generated from the JSON field names (including the isAddresExists typo)
if ("true".equals(gd.getSecurity().getIsAccountLocked())) return gd.getSecurity();
if ("true".equals(gd.getAddress().getIsAddresExists())) return gd.getAddress();
if ("true".equals(gd.getAccount().getAccountExists()) || "true".equals(gd.getAccount().getIsValidAccount())) return gd.getAccount();
if ("true".equals(gd.getMobile().getMobileExists())) return gd.getMobile();
return null;
}
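If you'd rather not generate model classes for a payload this deep, a map-based sketch works too. This is only an illustrative alternative, not part of the answer above; it assumes the JSON objects have been bound to insertion-ordered maps (Jackson binds JSON objects to LinkedHashMap by default, so field order is preserved), which makes "first" mean document order:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class FirstTrueDemo {
    // Returns the key of the first "true"-valued flag across the nested
    // sections, in document order. Sketch only: assumes the payload was
    // parsed into insertion-ordered maps of String values.
    static String firstTrueKey(Map<String, Map<String, String>> generalDetails) {
        for (Map<String, String> section : generalDetails.values()) {
            for (Map.Entry<String, String> flag : section.entrySet()) {
                if ("true".equals(flag.getValue())) { // values are Strings, not booleans
                    return flag.getKey();
                }
            }
        }
        return null;
    }

    public static void main(String[] args) {
        Map<String, Map<String, String>> gd = new LinkedHashMap<>();
        gd.put("security", new LinkedHashMap<>(Map.of("isAccountLocked", "false")));
        gd.put("address", new LinkedHashMap<>(Map.of("isAddresExists", "true")));
        System.out.println(firstTrueKey(gd)); // prints isAddresExists
    }
}
```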

https://github.com/octomix/josson
Deserialization
Josson josson = Josson.fromJsonString(
"{" +
" \"info_code\": \"201\"," +
" \"info_description\": \"info description\"," +
" \"data\": {" +
" \"status\": \"here goes the status\"," +
" \"failure_data\": {" +
" \"source\": \"anySource\"," +
" \"details\": {" +
" \"data\": {" +
" \"server_response\": {" +
" \"generalDetails\": {" +
" \"security\": {" +
" \"isAccountLocked\": \"false\"" +
" }," +
" \"address\": {" +
" \"isAddresExists\": \"true\"" +
" }," +
" \"account\": {" +
" \"accountExists\": \"true\"," +
" \"isValidAccount\": \"true\"" +
" }," +
" \"mobile\": {" +
" \"mobileExists\": \"true\"" +
" }" +
" }" +
" }" +
" }" +
" }" +
" }" +
" }" +
"}");
Query
JsonNode node = josson.getNode(
"data.failure_data.details.data.server_response" +
".generalDetails.**.mergeObjects().assort().[*]");
System.out.println(node.toPrettyString());
Output
{
"isAddresExists" : "true"
}
If isAddresExists and accountExists are changed to false:
" \"generalDetails\": {" +
" \"security\": {" +
" \"isAccountLocked\": \"false\"" +
" }," +
" \"address\": {" +
" \"isAddresExists\": \"false\"" +
" }," +
" \"account\": {" +
" \"accountExists\": \"false\"," +
" \"isValidAccount\": \"true\"" +
" }," +
" \"mobile\": {" +
" \"mobileExists\": \"true\"" +
" }" +
" }" +
Output
{
"isValidAccount" : "true"
}
If you only want the key name:
String firstTrueKey = josson.getString(
"data.failure_data.details.data.server_response" +
".generalDetails.**.mergeObjects().assort().[*].keys().[0]");
System.out.println(firstTrueKey);
Output
isValidAccount

Related

Spring query convert to a nested JSON structure

I'm new to Spring and Java and am trying to figure out how to format the JSON response into the desired structure.
I have a Spring query that returns 2 columns from a table, shown below; these are really the keys and values I need for the JSON structure:
Names   | Values
Car     | Toyota
Bike    | Schwinn
Scooter | Razor
A0      | 11
A1      | 12
A2      | 13
B0      | 2000
B1      | 4000
B2      | 22000
The current json output from the controller is this:
[{
"names": "Car",
"values": "Toyota"
},
{
"names": "Bike",
"values": "Schwinn"
},
{
"names": "Scooter",
"values": "Razor"
},
{
"names": "A0",
"values": "11"
},
{
"names": "A1",
"values": "12"
},
{
"names": "A2",
"values": "13"
},
{
"names": "B0",
"values": "2000"
},
{
"names": "B1",
"values": "4000"
},
{
"names": "B2",
"values": "22000"
}
]
And the desired JSON format is this, where the table column names are removed and the structure is instead keyed by the names column:
{
"Car": "Toyota",
"Bike": "Schwinn",
"Scooter": "Razor",
"Data": [{
"A0": "11",
"B0": "2000"
}, {
"A1": "12",
"B1": "4000"
}, {
"A2": "13",
"B2": "22000"
}]
}
Repository
@Query(value = "Select names, values ... :id")
List<Data> findData(@Param("id") Long id);
interface Data {
String getnames();
String getvalues();
}
Service
public List<Data> getData(Long id) {return repo.findData(id);}
Controller
@GetMapping("/getdata/{id}")
public ResponseEntity<List<Data>> getData(@PathVariable Long id) {
List<Data> c = service.getData(id);
return new ResponseEntity<>(c, HttpStatus.OK);
}
It seems that I need to process the result set and need to loop through them to create the desired structure but not sure how to proceed with that, or perhaps there is an easier way to get to the desired structure. Any guidance would be appreciated.
So return a ResponseEntity<Map<String, Object>> instead of a List to emulate a JSON object.
List<Data> c = service.getData(id);
Map<String, Object> map = new HashMap<>();
map.put("Key", "Value");
map.put("Car", c.get(0).getvalues());
map.put("Entire List", c);
return new ResponseEntity<>(map, HttpStatus.OK);
Obviously you'll have to write your own logic, but it should be pretty straightforward. Or, even better, consider making a class for the returned object if you're going to use it a lot, and just return ResponseEntity<YourCustomObject>.
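As a sketch of that logic (the Row record is a hypothetical stand-in for the projection interface, and it assumes the A0/B0 naming pattern from the question):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class ReshapeDemo {
    // Hypothetical stand-in for the Data projection interface in the question.
    record Row(String names, String values) {}

    // Reshapes the flat rows into the desired structure, assuming data keys
    // match one letter followed by digits (A0, B1, ...) as in the question.
    static Map<String, Object> reshape(List<Row> rows) {
        Map<String, Object> out = new LinkedHashMap<>();
        Map<String, Map<String, String>> groups = new LinkedHashMap<>();
        for (Row r : rows) {
            if (r.names().matches("[A-Za-z]\\d+")) {
                String suffix = r.names().substring(1); // group A0/B0 by "0", A1/B1 by "1", ...
                groups.computeIfAbsent(suffix, k -> new LinkedHashMap<>())
                      .put(r.names(), r.values());
            } else {
                out.put(r.names(), r.values()); // top-level entries like Car/Bike/Scooter
            }
        }
        out.put("Data", new ArrayList<>(groups.values()));
        return out;
    }

    public static void main(String[] args) {
        List<Row> rows = List.of(
                new Row("Car", "Toyota"), new Row("A0", "11"),
                new Row("B0", "2000"), new Row("A1", "12"), new Row("B1", "4000"));
        System.out.println(reshape(rows));
        // {Car=Toyota, Data=[{A0=11, B0=2000}, {A1=12, B1=4000}]}
    }
}
```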
This looks a bit complicated; I think you should first establish the key association for values like A0 and B0.
import com.black_dragon.utils.JacksonUtils;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.regex.Pattern;
import java.util.stream.Collectors;
import static java.util.stream.Collectors.groupingBy;
/**
* @author black_dragon
* @version V1.0
* @Package com.black_dragon.swing
* @date 2022/9/6 10:35
* @Copyright
*/
public class ConvertToMap {
String names;
String values;
public String getNames() {
return names;
}
public void setNames(String names) {
this.names = names;
}
public String getValues() {
return values;
}
public void setValues(String values) {
this.values = values;
}
private static String DIGIT_REGEX = "[^0-9]";
private static String LETTER_DIGIT_REGEX = "[a-zA-Z]+";
public static Integer getDigit(String str){
Pattern pattern = Pattern.compile(DIGIT_REGEX);
if(!isLetterDigit(str)){
String[] keySet = pattern.split(str);
if(keySet.length > 1){
return Integer.valueOf(keySet[1]);
}
}
return -1;
}
public static boolean isLetterDigit(String str){
return str.matches(LETTER_DIGIT_REGEX);
}
private static String fetchGroupKey(ConvertToMap convertToMap){
return String.valueOf(getDigit(convertToMap.names));
}
public static void main(String[] args) {
String jsonString = "[{\n" +
" \"names\": \"Car\",\n" +
" \"values\": \"Toyota\"\n" +
" },\n" +
" {\n" +
" \"names\": \"Bike\",\n" +
" \"values\": \"Schwinn\"\n" +
" },\n" +
" {\n" +
" \"names\": \"Scooter\",\n" +
" \"values\": \"Razor\"\n" +
" },\n" +
" {\n" +
" \"names\": \"A0\",\n" +
" \"values\": \"11\"\n" +
" },\n" +
" {\n" +
" \"names\": \"A1\",\n" +
" \"values\": \"12\"\n" +
" },\n" +
" {\n" +
" \"names\": \"A2\",\n" +
" \"values\": \"13\"\n" +
" },\n" +
" {\n" +
" \"names\": \"B0\",\n" +
" \"values\": \"2000\"\n" +
" },\n" +
" {\n" +
" \"names\": \"B1\",\n" +
" \"values\": \"4000\"\n" +
" },\n" +
" {\n" +
" \"names\": \"B2\",\n" +
" \"values\": \"22000\"\n" +
" }\n" +
"]";
List<ConvertToMap> convertToMaps = JacksonUtils.toJavaList(jsonString, ConvertToMap.class);
// Extract a string that does not contain numbers and convert it to a map
Map<String, Object> result = convertToMaps.stream()
.filter(x -> isLetterDigit(x.names))
.collect(Collectors.toMap(ConvertToMap::getNames, ConvertToMap::getValues));
List<Map<String, String>> mapList = new ArrayList<>();
// Group entries whose names contain digits by their digit part
Map<String, List<ConvertToMap>> stringListMap = convertToMaps.stream().collect(groupingBy(convertToMap -> fetchGroupKey(convertToMap)));
for (String key : stringListMap.keySet()) {
if(Integer.valueOf(key) >= 0){
mapList.add(stringListMap.get(key)
.stream()
.collect(Collectors.toMap(ConvertToMap::getNames, ConvertToMap::getValues)));
}
}
result.put("Data", mapList);
System.out.println(JacksonUtils.toJSONString(result));
}
}
This assumes that your data key names follow the pattern of one non-digit followed by digits.
https://github.com/octomix/josson
Deserialization
Josson josson = Josson.fromJsonString(
"[" +
" {" +
" \"names\": \"Car\"," +
" \"values\": \"Toyota\"" +
" }," +
" {" +
" \"names\": \"Bike\"," +
" \"values\": \"Schwinn\"" +
" }," +
" {" +
" \"names\": \"Scooter\"," +
" \"values\": \"Razor\"" +
" }," +
" {" +
" \"names\": \"A0\"," +
" \"values\": \"11\"" +
" }," +
" {" +
" \"names\": \"A1\"," +
" \"values\": \"12\"" +
" }," +
" {" +
" \"names\": \"A2\"," +
" \"values\": \"13\"" +
" }," +
" {" +
" \"names\": \"B0\"," +
" \"values\": \"2000\"" +
" }," +
" {" +
" \"names\": \"B1\"," +
" \"values\": \"4000\"" +
" }," +
" {" +
" \"names\": \"B2\"," +
" \"values\": \"22000\"" +
" }" +
"]");
Transformation
JsonNode node = josson.getNode(
"#collect([names !=~ '\\D\\d+']*" +
" .map(names::values)" +
" ,[names =~ '\\D\\d+']*" +
" .group(names.substr(1), map(names::values))#" +
" .elements" +
" .mergeObjects()" +
" .#toObject('Data')" +
")" +
".flatten(1)" +
".mergeObjects()");
System.out.println(node.toPrettyString());
Output
{
"Car" : "Toyota",
"Bike" : "Schwinn",
"Scooter" : "Razor",
"Data" : [ {
"A0" : "11",
"B0" : "2000"
}, {
"A1" : "12",
"B1" : "4000"
}, {
"A2" : "13",
"B2" : "22000"
} ]
}

Java Spring elasticsearch "Failed to derive xcontent" with #Query

I have a custom @Query in one of my Elasticsearch repositories because the auto-generated method didn't use match (it used query_string with analyze_wildcard instead) and so didn't work, for example, with spaces. The query looks pretty simple to me, so I thought it wouldn't be a problem to write it myself.
@Query("\"bool\": { " +
" \"filter\": [ " +
" { " +
" \"term\": { " +
" \"userId.keyword\": \"?0\" " +
" } " +
" }, " +
" {" +
" \"match\": { " +
" \"content\": \"?1\" " +
" }" +
" } " +
" ] " +
" }")
Page<SearchablePageHistory> findAllByUserIdAndContentLike(String userId, String content, Pageable pageable);
But when I try to execute that function I get the following error:
org.elasticsearch.ElasticsearchStatusException: Elasticsearch exception [type=x_content_parse_exception, reason=Failed to derive xcontent]
at org.elasticsearch.rest.BytesRestResponse.errorFromXContent(BytesRestResponse.java:177) ~[elasticsearch-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestHighLevelClient.parseEntity(RestHighLevelClient.java:1793) ~[elasticsearch-rest-high-level-client-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestHighLevelClient.parseResponseException(RestHighLevelClient.java:1770) ~[elasticsearch-rest-high-level-client-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1527) ~[elasticsearch-rest-high-level-client-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1484) ~[elasticsearch-rest-high-level-client-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:1454) ~[elasticsearch-rest-high-level-client-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestHighLevelClient.search(RestHighLevelClient.java:970) ~[elasticsearch-rest-high-level-client-7.6.2.jar:7.6.2]
at org.springframework.data.elasticsearch.core.ElasticsearchRestTemplate.lambda$search$10(ElasticsearchRestTemplate.java:265) ~[spring-data-elasticsearch-4.0.0.RELEASE.jar:4.0.0.RELEASE]
at org.springframework.data.elasticsearch.core.ElasticsearchRestTemplate.execute(ElasticsearchRestTemplate.java:351) ~[spring-data-elasticsearch-4.0.0.RELEASE.jar:4.0.0.RELEASE]
at org.springframework.data.elasticsearch.core.ElasticsearchRestTemplate.search(ElasticsearchRestTemplate.java:265) ~[spring-data-elasticsearch-4.0.0.RELEASE.jar:4.0.0.RELEASE]
at org.springframework.data.elasticsearch.repository.query.ElasticsearchStringQuery.execute(ElasticsearchStringQuery.java:89) ~[spring-data-elasticsearch-4.0.0.RELEASE.jar:4.0.0.RELEASE]
at org.springframework.data.repository.core.support.QueryExecutorMethodInterceptor$QueryMethodInvoker.invoke(QueryExecutorMethodInterceptor.java:195) ~[spring-data-commons-2.3.0.RELEASE.jar:2.3.0.RELEASE]
at org.springframework.data.repository.core.support.QueryExecutorMethodInterceptor.doInvoke(QueryExecutorMethodInterceptor.java:152) ~[spring-data-commons-2.3.0.RELEASE.jar:2.3.0.RELEASE]
at org.springframework.data.repository.core.support.QueryExecutorMethodInterceptor.invoke(QueryExecutorMethodInterceptor.java:130) ~[spring-data-commons-2.3.0.RELEASE.jar:2.3.0.RELEASE]
Suppressed: org.elasticsearch.client.ResponseException: method [POST], host [http://localhost:9200], URI [/history/_search?pre_filter_shard_size=128&typed_keys=true&max_concurrent_shard_requests=5&ignore_unavailable=false&expand_wildcards=open&allow_no_indices=true&ignore_throttled=true&search_type=dfs_query_then_fetch&batched_reduce_size=512&ccs_minimize_roundtrips=true], status line [HTTP/1.1 400 Bad Request]
{"error":{"root_cause":[{"type":"x_content_parse_exception","reason":"Failed to derive xcontent"}],"type":"x_content_parse_exception","reason":"Failed to derive xcontent"},"status":400}
at org.elasticsearch.client.RestClient.convertResponse(RestClient.java:283) ~[elasticsearch-rest-client-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestClient.performRequest(RestClient.java:261) ~[elasticsearch-rest-client-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestClient.performRequest(RestClient.java:235) ~[elasticsearch-rest-client-7.6.2.jar:7.6.2]
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1514) ~[elasticsearch-rest-high-level-client-7.6.2.jar:7.6.2]
... 124 common frames omitted
With debugging I tracked down the raw REST request that is sent to Elasticsearch in org.elasticsearch.client.RestClient.java:244 and found that this is the payload sent to the server:
{"from":0,"size":10,"query":{"wrapper":{"query":"ImJvb2wiOiB7ICAgICJmaWx0ZXIiOiBbICAgICB7ICAgICAgICJ0ZXJtIjogeyAgICAgICAidXNlcklkLmtleXdvcmQiOiAiMzFjMjA5NTktNjg5Zi00YjI4LWExNzctNmQ3ZTQ2YTBhYzMwIiAgICAgIH0gICAgIH0sICAgeyJtYXRjaCI6IHsgICAgImNvbnRlbnQiOiAidGVzdHNzIiAgIH19ICAgIF0gICB9"}},"version":true,"sort":[{"id":{"order":"desc"}}]}
With that payload an error is not surprising; however, I have no idea where this weird jumble of characters comes from. I suspect it is supposed to be my custom query, which is not being applied correctly. I got this payload by debugging into this line:
httpResponse = client.execute(context.requestProducer, context.asyncResponseConsumer, context.context, null).get();
and then executing:
StandardCharsets.UTF_8.decode(((NByteArrayEntity) ((HttpPost) ((HttpAsyncMethods.RequestProducerImpl) context.requestProducer).request).entity).buf).toString()
These are my imports and the classname that I use in the Repository-Class:
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import org.springframework.data.elasticsearch.annotations.Query;
import org.springframework.data.elasticsearch.repository.ElasticsearchRepository;
import java.util.stream.Stream;
public interface SearchablePageHistoryRepository extends ElasticsearchRepository<SearchablePageHistory, Integer> {
Page<SearchablePageHistory> findAllByUserId(String userId, Pageable pageable);
@Query("\"bool\": { " +
" \"filter\": [ " +
" { " +
" \"term\": { " +
" \"userId.keyword\": \"?0\" " +
" } " +
" }, " +
" {" +
" \"match\": { " +
" \"content\": \"?1\" " +
" }" +
" } " +
" ] " +
" }")
Page<SearchablePageHistory> findAllByUserIdAndContentLike(String userId, String content, Pageable pageable);
}
All other queries where I don't use #Query work fine without a problem. I have no idea what I am doing wrong since my example seems very similar to the one given in the documentation: https://docs.spring.io/spring-data/elasticsearch/docs/current/reference/html/#elasticsearch.query-methods.at-query
Hard facepalm: I found my error. I'm still going to leave this post up in case someone else stumbles across the same problem, since the error message is not very helpful in my opinion.
I simply forgot the surrounding braces around the outside of the query:
changing this:
@Query("\"bool\": { " +
" \"filter\": [ " +
" { " +
" \"term\": { " +
" \"userId.keyword\": \"?0\" " +
" } " +
" }, " +
" {" +
" \"match\": { " +
" \"content\": \"?1\" " +
" }" +
" } " +
" ] " +
" }")
Page<SearchablePageHistory> findAllByUserIdAndContentLike(String userId, String content, Pageable pageable);
to this:
@Query("{\"bool\": { " +
" \"filter\": [ " +
" { " +
" \"term\": { " +
" \"userId.keyword\": \"?0\" " +
" } " +
" }, " +
" {" +
" \"match\": { " +
" \"content\": \"?1\" " +
" }" +
" } " +
" ] " +
" }}")
Page<SearchablePageHistory> findAllByUserIdAndContentLike(String userId, String content, Pageable pageable);
solved the problem.
Addition:
"ImJvb2wiOiB7ICAgICJmaWx0ZXIiOiBbICAgICB7ICAgICAgICJ0ZXJtIjogeyAgICAgICAidXNlcklkLmtleXdvcmQiOiAiMzFjMjA5NTktNjg5Zi00YjI4LWExNzctNmQ3ZTQ2YTBhYzMwIiAgICAgIH0gICAgIH0sICAgeyJtYXRjaCI6IHsgICAgImNvbnRlbnQiOiAidGVzdHNzIiAgIH19ICAgIF0gICB9"
is a wrapper query; that's a base64-encoded string containing
""bool": { "filter": [ { "term": { "userId.keyword": "31c20959-689f-4b28-a177-6d7e46a0ac30" } }, {"match": { "content": "testss" }} ] }"
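You can confirm the decoding yourself with the JDK's Base64 class; the snippet below decodes just the first chunk of the payload for brevity:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class DecodeWrapperQuery {
    public static void main(String[] args) {
        // Decode a chunk of the wrapper query to see the JSON that was sent.
        String decoded = new String(
                Base64.getDecoder().decode("ImJvb2wiOiB7"),
                StandardCharsets.UTF_8);
        System.out.println(decoded); // prints "bool": {
    }
}
```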
I had a similar issue and it was indeed the absence of a bracket.

how to run mongodb native query with mongodb date function in spring-data-mongodb?

I want to execute the below native query in spring data mongodb :
db.runCommand({aggregate:"mycollection", pipeline :[{$match : {$and :
[{"orderDate" : {$gte : ISODate("2016-07-25T10:33:04.196Z")}},
{"orderDate" : {$lte :ISODate("2018-07-25T10:33:04.196Z")
}}
]}},
{ "$project" : { "orderType" : 1 ,"count" : 1 ,
"month" : { "$month" : [ "$orderDate"]}}},
{ "$group" : { "_id" : { "month" : "$month" , "orderType" : "$orderType"} ,
"count" : { "$sum" : 1} }}],
cursor:{batchSize:1000}})
I tried mongoTemplate.executeCommand, which executes a JSON string. Please help.
Regards,
Kris
You can use the mongoTemplate.executeCommand(DBObject dbObject) variant.
Just change the date to extended JSON, which is supported by the JSON parser, and build the command.
Something like
long date1 = Instant.parse("2016-07-25T10:33:04.196Z").toEpochMilli();
long date2 = Instant.parse("2018-07-25T10:33:04.196Z").toEpochMilli();
DBObject dbObject = new BasicDBObject(
"aggregate", "mycollection").append(
"pipeline", JSON.parse("[\n" +
" {\n" +
" \"$match\": {\n" +
" \"$and\": [\n" +
" {\n" +
" \"orderDate\": {\n" +
" \"$gte\": \""+ date1 +"\"\n" +
" }\n" +
" },\n" +
" {\n" +
" \"orderDate\": {\n" +
" \"$lte\": \""+ date2 +"\"\n" +
" }\n" +
" }\n" +
" ]\n" +
" }\n" +
" },\n" +
" {\n" +
" \"$project\": {\n" +
" \"orderType\": 1,\n" +
" \"count\": 1,\n" +
" \"month\": {\n" +
" \"$month\": [\n" +
" \"$orderDate\"\n" +
" ]\n" +
" }\n" +
" }\n" +
" },\n" +
" {\n" +
" \"$group\": {\n" +
" \"_id\": {\n" +
" \"month\": \"$month\",\n" +
" \"orderType\": \"$orderType\"\n" +
" },\n" +
" \"count\": {\n" +
" \"$sum\": 1\n" +
" }\n" +
" }\n" +
" }\n" +
"]")).append(
"cursor", new BasicDBObject("batchSize", 1000)
);
mongoTemplate.executeCommand(dbObject)
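If you keep the JSON-string approach, a small helper can splice the epoch millis into an extended-JSON date clause. This is a sketch only: whether your parser accepts the {"$date": &lt;millis&gt;} extended-JSON form is an assumption to verify against its documentation.

```java
import java.time.Instant;

public class ExtendedJsonDate {
    // Builds a {"field": {"$op": {"$date": <millis>}}} clause. The method
    // name and the accepted extended-JSON form are assumptions, not a
    // documented MongoDB driver API.
    static String dateFilter(String field, String op, String iso) {
        long millis = Instant.parse(iso).toEpochMilli();
        return "{\"" + field + "\": {\"" + op + "\": {\"$date\": " + millis + "}}}";
    }

    public static void main(String[] args) {
        System.out.println(dateFilter("orderDate", "$gte", "2016-07-25T10:33:04.196Z"));
    }
}
```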

string line breaks causing error

We can, of course, define the string in strings.xml and reference it; in that case the line break is not an issue.
But suppose I want to put it in my Java file. If I write it as follows, it produces an error:
String strJson="
{
\"Employee\" :[
{
\"id\":\"01\",
\"name\":\"Gopal Varma\",
\"salary\":\"500000\"
},
{
\"id\":\"02\",
\"name\":\"Sairamkrishna\",
\"salary\":\"500000\"
},
{
\"id\":\"03\",
\"name\":\"Sathish kallakuri\",
\"salary\":\"600000\"
}
]
}";
I can fix the error by putting it all on a single line:
String strJson=" { \"Employee\" :[ { \"id\":\"01\",\"name\":\"Gopal Varma\",\"salary\":\"500000\"},{\"id\":\"02\",\"name\":\"Sairamkrishna\",\"salary\":\"500000\"}, { \"id\":\"03\", \"name\":\"Sathish kallakuri\", \"salary\":\"600000\" } ] }";
But I want to know whether, instead, there is an escape character or something similar to fix the error.
Java String literals cannot span multiple lines, but you can concatenate them like this:
String strJson = "{\n" +
" \"Employee\" :[\n" +
" {\n" +
" \"id\":\"01\",\n" +
" \"name\":\"Gopal Varma\",\n" +
" \"salary\":\"500000\"\n" +
" }\n" +
" ]\n" +
"}";
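Since Java 15 there is also a third option: a text block, which spans multiple lines without any concatenation or escaping of the inner quotes (this requires a newer JDK than was available when the question was asked):

```java
public class TextBlockDemo {
    // A Java 15+ text block: spans lines, keeps formatting, and the inner
    // double quotes need no escaping.
    static final String STR_JSON = """
            {
              "Employee": [
                { "id": "01", "name": "Gopal Varma", "salary": "500000" }
              ]
            }
            """;

    public static void main(String[] args) {
        System.out.println(STR_JSON);
    }
}
```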
You need to escape the strings properly.
Try this:
String strJson = "{" +
"\"Employee\": [{" +
"\"id\": \"01\"," +
"\"name\": \"Gopal Varma\"," +
"\"salary\": \"500000\"" +
"}, {" +
"\"id\": \"02\"," +
"\"name\": \"Sairamkrishna\"," +
"\"salary\": \"500000\"" +
"}, {" +
"\"id\": \"03\"," +
"\"name\": \"Sathish kallakuri\"," +
"\"salary\": \"600000\"" +
"}]" +
"}";

Elasticsearch: rewrite a query using java native api

I have this query in Elasticsearch that is working perfectly if I run it from the command line:
POST http://localhost:9200/YOUR_INDEX_NAME/_search/
{
"size": 0,
"aggs": {
"autocomplete": {
"terms": {
"field": "autocomplete",
"order": {
"_count": "desc"
},
"include": {
"pattern": "c.*"
}
}
}
},
"query": {
"prefix": {
"autocomplete": {
"value": "c"
}
}
}
}
I have tried to rewrite it in java using the native client:
SearchResponse searchResponse2 = newClient.prepareSearch(INDEX_NAME)
.setSearchType(SearchType.DFS_QUERY_THEN_FETCH)
.setQuery("{\n" +
" \"size\": 0,\n" +
" \"aggs\": {\n" +
" \"autocomplete\": {\n" +
" \"terms\": {\n" +
" \"field\": \"autocomplete\",\n" +
" \"order\": {\n" +
" \"_count\": \"desc\"\n" +
" },\n" +
" \"include\": {\n" +
" \"pattern\": \"c.*\"\n" +
" }\n" +
" }\n" +
" }\n" +
" },\n" +
" \"query\": {\n" +
" \"prefix\": {\n" +
" \"autocomplete\": {\n" +
" \"value\": \"c\"\n" +
" }\n" +
" }\n" +
" }\n" +
"}").get();
for (SearchHit res : searchResponse2.getHits()){
System.out.println(res.getSourceAsString());
}
It seems that I'm missing something in the translation process. Thanks in advance.
The Java client's setQuery() method doesn't take a String containing the JSON query; you need to build the query using the QueryBuilders helper methods and the aggregation using the AggregationBuilders helper methods.
In your case that would go like this:
// build the aggregation
TermsBuilder agg = AggregationBuilders.terms("autocomplete")
.field("autocomplete")
.include("c.*")
.order(Terms.Order.count(false));
// build the query
SearchResponse searchResponse2 = newClient.prepareSearch(INDEX_NAME)
.setSearchType(SearchType.DFS_QUERY_THEN_FETCH)
.setSize(0)
.setQuery(QueryBuilders.prefixQuery("autocomplete", "c"))
.addAggregation(agg)
.get();
