How to perform a request with the MultiSearch API in Jest? - java

I need to make some requests with the MultiSearch API from Jest.
I tried to build a Search request like this:
Search search = new Search.Builder(query).addIndex(index).addType(type).build();
Then I add all these requests into a collection to build the MultiSearch and get the result, like this:
List<Search> ms = new ArrayList<Search>();
for (/* iterate over the queries */) {
    ms.add(search); // add each Search to the list
}
MultiSearch multi = new MultiSearch.Builder(ms).build();
MultiSearchResult multir = client.execute(multi);
But this returns the following error from Elasticsearch:
{
"error": {
"caused_by": {
"reason": "Unexpected end-of-input: expected close marker for Object (start marker at [Source: org.elasticsearch.transport.netty4.ByteBufStreamInput#2ccf4bb6; line: 1, column: 1])\n at [Source: org.elasticsearch.transport.netty4.ByteBufStreamInput#2ccf4bb6; line: 2, column: 3]",
"type": "json_e_o_f_exception"
},
"reason": "Exception when parsing search request",
"root_cause": [
{
"reason": "Exception when parsing search request",
"type": "parse_exception"
}
],
"type": "parse_exception"
},
"status": 400
}
So my question is: how do I perform a MultiSearch request with Jest?

Well, after some tests, I found a solution: the query string must not contain line breaks, so strip them before building the Search:
Search search = new Search.Builder(query.toString().replaceAll("\\n|\\r", ""))
        .addIndex(es_index_data)
        .addType(es_type_data)
        .build();
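The root cause is that Elasticsearch's _msearch endpoint consumes NDJSON, where each search body must occupy exactly one line; embedded line breaks produce the "Unexpected end-of-input" parse error shown above. A minimal sketch of the fix (the index name and Jest client calls are placeholders, shown as comments so the helper stays self-contained):

```java
import java.util.ArrayList;
import java.util.List;

public class MultiSearchSketch {
    // _msearch uses NDJSON: each search body must be a single line, so any
    // embedded newline breaks the server-side parser.
    static String toSingleLine(String queryJson) {
        return queryJson.replaceAll("\\r|\\n", "");
    }

    public static void main(String[] args) {
        List<String> queries = new ArrayList<>();
        queries.add("{\n  \"query\": { \"match_all\": {} }\n}");
        for (String q : queries) {
            String safe = toSingleLine(q);
            System.out.println(safe);
            // With Jest you would then build (hypothetical index name):
            //   Search search = new Search.Builder(safe).addIndex("my_index").build();
            // collect the Search objects into a List<Search>, and execute:
            //   MultiSearchResult r = client.execute(new MultiSearch.Builder(searches).build());
        }
    }
}
```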

Related

Translation of empty array element in XML to JSON became null in Oracle Service Bus Pipeline

I'm facing an issue when working with Oracle Service Bus. I want to translate XML to JSON as the response, but when using nXSD in the OSB pipeline to do that translation, an empty array (<in:<ARRAYNAME>/>) in the XML is translated as "<ARRAYNAME>": [ null ]. The expected result is "<ARRAYNAME>": [], without null.
Here is the current response:
{
"data": [ null ],
"description": "Success",
"transactionId": null,
"status": 0
}
Here is the expected response:
{
"data": [ ] ,
"description": "Success",
"transactionId": null ,
"status": 0
}

Insomnia MultiPart Mutation

Here is a screenshot of my Insomnia workspace (screenshots omitted). I'm trying to define my "operations" field with type Text (Multi-Line) with this content below:
{
"query": "mutation ($number: String! $countr: String! $image: Upload!){
create(number: $number, countr: $countr, image: $image){
id,
timestamp
}
}",
"variables": {
"number": "99999",
"countr": "Abc",
"image": null
}
}
but Insomnia notifies me of this:
Parse error on line 2: {"query": "mutation ($number:---------^ Expecting 'STRING', 'NUMBER', '{', '[', 'UNDEFINED', ...
but once I send the query, I'm receiving this:
{
"timestamp": 1602665134654,
"status": 422,
"error": "Unprocessable Entity",
"message": "",
"path": "/graphql"
}
Can someone help me to fix this issue?
Thanks!
Define "operations" with the plain Text type, not Text (Multi-Line).
I'm not sure of your business logic, but if you use DefaultGraphQLServletContext.getFileParts(), that method will treat "operations" as a file, not as text.
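The parse error itself comes from the literal line breaks inside the "query" string: JSON string values cannot contain raw newlines. A sketch of a valid "operations" value with the query collapsed to one line (field names copied from the question):

```json
{
  "query": "mutation ($number: String! $countr: String! $image: Upload!) { create(number: $number, countr: $countr, image: $image) { id, timestamp } }",
  "variables": {
    "number": "99999",
    "countr": "Abc",
    "image": null
  }
}
```

Alternatively, keep the original formatting by escaping each line break as \n inside the string.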

Handling of "JSON does not allow non-finite numbers" in Elasticsearch

Hi, I am trying to do an aggregation on a field using the MAX function on my index. The issue is that when I do the aggregation with a GROUP BY, it fails whenever that value is absent within a time frame:
POST /_opendistro/_sql
{
"query": "SELECT date(timeStamp) time_unit, MAX(testField) test_field_alias FROM my_index where orgId = 'xyz' and timeStamp <= '2020-07-04T23:59:59' and timeStamp >= '2020-07-01T00:00' group by date(timeStamp) order by time_unit desc"
}
My index mapping is as given below:
"mappings" : {
  "properties" : {
    "orgId" : {
      "type" : "text"
    },
    "testField" : {
      "type" : "long",
      "null_value" : 0
    },
    "timeStamp" : {
      "type" : "date"
    }
  }
}
When I try the above query, I get:
{
"error": {
"root_cause": [
{
"type": "j_s_o_n_exception",
"reason": "JSON does not allow non-finite numbers."
}
],
"type": "j_s_o_n_exception",
"reason": "JSON does not allow non-finite numbers."
},
"status": 500
}
I understand this happens for time frames where the documents in my index don't contain the above-mentioned field, so I added "null_value": 0 to my mapping, but it made no difference.
The thing is, I want to do an aggregation using the MAX function grouped by time scales; if there is another approach that works, that's enough for me, it doesn't have to be in SQL format.
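Two notes that may help. First, null_value in a mapping only applies to documents that contain the field with an explicit null; it has no effect on documents that omit the field entirely. Second, since the result doesn't have to come from the SQL plugin, the plain aggregation DSL is an alternative: a date_histogram bucketing by day with a max sub-aggregation, where empty buckets report a null max instead of failing. A hedged sketch (index, field, and range values copied from the question; note that a term query on orgId may need a keyword subfield since it is mapped as text, and older Elasticsearch versions use "interval" instead of "calendar_interval"):

```json
POST /my_index/_search
{
  "size": 0,
  "query": {
    "bool": {
      "filter": [
        { "term": { "orgId": "xyz" } },
        { "range": { "timeStamp": { "gte": "2020-07-01T00:00", "lte": "2020-07-04T23:59:59" } } }
      ]
    }
  },
  "aggs": {
    "per_day": {
      "date_histogram": { "field": "timeStamp", "calendar_interval": "day" },
      "aggs": {
        "test_field_max": { "max": { "field": "testField" } }
      }
    }
  }
}
```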

ElasticSearch Java API nested query with inner_hits error

I have a problem with the Elasticsearch Java API, version 5.1.2.
I'll describe the code pasted below. I need to optimize the search mechanism by limiting inner_hits to only the object id, so I used InnerHitBuilder with .setFetchSourceContext(FetchSourceContext.DO_NOT_FETCH_SOURCE) and .addDocValueField("item.id"). The generated query has an error: there is an "ignore_unmapped" attribute inside the "inner_hits" node.
..."inner_hits": {
"name": "itemTerms",
"ignore_unmapped": false,
"from": 0,
"size": 2147483647,
"version": false,
"explain": false,
"track_scores": false,
"_source": false,
"docvalue_fields": ["item.id"]
}...
Executing such a query results in this error:
{
"error": {
"root_cause": [
{
"type": "illegal_argument_exception",
"reason": "[inner_hits] unknown field [ignore_unmapped], parser not found"
}
],
"type": "illegal_argument_exception",
"reason": "[inner_hits] unknown field [ignore_unmapped], parser not found"
},
"status": 400
}
When I manually remove that attribute from the query, everything runs smoothly.
protected BoolQueryBuilder itemTermQuery(FileTerms terms, boolean withInners) {
    BoolQueryBuilder termsQuery = QueryBuilders.boolQuery();
    for (String term : FileTerms.terms()) {
        if (terms.term(term).isEmpty())
            continue;
        Set<String> fns = terms.term(term).stream()
                .map(x -> x.getTerm())
                .filter(y -> !y.isEmpty())
                .collect(Collectors.toSet());
        if (!fns.isEmpty())
            termsQuery = termsQuery.must(
                    QueryBuilders.termsQuery("item.terms." + term + ".term", fns));
    }
    QueryBuilder query = terms.notEmpty() ? termsQuery : QueryBuilders.matchAllQuery();
    TermsQueryBuilder discontinuedQuery = QueryBuilders.termsQuery(
            "item.terms." + FileTerms.Terms.USAGE_IS + ".term",
            new FileTerm("Discontinued", "", "", "", "").getTerm());
    FunctionScoreQueryBuilder.FilterFunctionBuilder[] functionBuilders = {
            new FunctionScoreQueryBuilder.FilterFunctionBuilder(query,
                    ScoreFunctionBuilders.weightFactorFunction(1)),
            new FunctionScoreQueryBuilder.FilterFunctionBuilder(discontinuedQuery,
                    ScoreFunctionBuilders.weightFactorFunction(-1000))
    };
    FunctionScoreQueryBuilder functionScoreQuery = functionScoreQuery(functionBuilders);
    NestedQueryBuilder nested = QueryBuilders.nestedQuery("item", functionScoreQuery.query(), ScoreMode.None);
    if (withInners)
        nested = nested.innerHit(new InnerHitBuilder()
                .setFetchSourceContext(FetchSourceContext.DO_NOT_FETCH_SOURCE)
                .addDocValueField("item.id")
                .setSize(Integer.MAX_VALUE)
                .setName("itemTerms"));
    return QueryBuilders.boolQuery().must(nested);
}
How can I build the query without that unnecessary attribute inside the "inner_hits" node?
EDIT:
I use the 5.1.2 client library against a 5.1.2 Elasticsearch server.
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>transport</artifactId>
<version>5.1.2</version>
</dependency>
"version": {
"number": "5.1.2",
"build_hash": "c8c4c16",
"build_date": "2017-01-11T20:18:39.146Z",
"build_snapshot": false,
"lucene_version": "6.3.0"
},
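Since deleting the attribute by hand already makes the query work, one pragmatic workaround is to serialize the built query and strip the attribute before sending it. This is only a sketch that automates the manual removal, not an official API; the regex assumes "ignore_unmapped" is not the last entry of its object (it isn't in the query shown above), and how you then submit the raw JSON depends on your client setup:

```java
public class InnerHitsFix {
    // Removes the "ignore_unmapped" attribute that the 5.1.2 client emits inside
    // "inner_hits" but the 5.1.2 server rejects. Tolerates arbitrary whitespace
    // around the colon; assumes a trailing comma follows the value.
    static String stripIgnoreUnmapped(String queryJson) {
        return queryJson.replaceAll("\"ignore_unmapped\"\\s*:\\s*(true|false)\\s*,", "");
    }

    public static void main(String[] args) {
        String q = "{\"inner_hits\": {\"name\": \"itemTerms\", \"ignore_unmapped\": false, \"from\": 0}}";
        System.out.println(stripIgnoreUnmapped(q));
    }
}
```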

REST Assured: check whether the name exists in the JSON response

I'm new to REST Assured. Using REST Assured, I'm trying to verify whether the data details are found or not. Here, two data details are present; sometimes there will be 2, 3, or 5.
I'm getting the following response, and I'm using Java:
{
"queryPath": "/api/",
"nId": "f084f5ad24fcfaa9e9faea0",
"statusCode": 707
"statusMessage": "Success",
"results": {
"data": [
{
"id": "10248522500798",
"capabilities": [
"record",
"HDt"
],
"name": "errt2"
},
{
"id": "418143778",
"capabilities": [
"1record",
"HDy"
],
"name": "Livin"
}
]
}
}
The code I'm using:
JsonPath jsonResponse = new JsonPath(response.asString());
ArrayList<String> list = new ArrayList<String>();
list = jsonResponse.get("results.data");
if (list.size() < 1 ) {
SpiceCheck.fail("data not found! " + list.size());
}
Rather than this, I also want to check whether the data name is null or not. How can I do that with REST Assured?
Just so you know, you are missing a comma after 707.
To verify that none of the names is null, I would parse the names out as a list, then iterate over them one by one and check that they aren't null.
List<String> names = from(response.asString()).getList("results.data.name");
for(String name : names){
if(name == null){
// error handling code here
}
}
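The null check itself can be isolated from the HTTP call. A self-contained sketch of that loop (the REST Assured extraction and a Hamcrest one-liner alternative are shown as comments, since they need a live response):

```java
import java.util.Arrays;
import java.util.List;

public class NameCheck {
    // Returns true only when every extracted name is non-null. With REST Assured
    // you would obtain the list first:
    //   List<String> names = from(response.asString()).getList("results.data.name");
    // or assert it in one step with Hamcrest matchers:
    //   then().body("results.data.name", everyItem(notNullValue()));
    static boolean allNamesPresent(List<String> names) {
        for (String name : names) {
            if (name == null) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        // Sample names copied from the response in the question.
        System.out.println(allNamesPresent(Arrays.asList("errt2", "Livin"))); // true
        System.out.println(allNamesPresent(Arrays.asList("errt2", null)));    // false
    }
}
```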
