Elasticsearch autocompletion/suggestion in Java

I want to implement search-as-you-type functionality in Elasticsearch using the Java API. The queries I want to translate to Java are listed below.
Do you have any idea how I can execute these queries in Java?
The queries are very similar, but I want to solve at least one of them.
This is my initial approach:
SearchResponse response = client.prepareSearch("kal")
        .setTypes("products")
        .setQuery(multiMatchQuery("description_en", "name", "description_en")) // Query
        .setFrom(0).setSize(60).setExplain(true)
        .get();
SearchHit[] results = response.getHits().getHits();
for (SearchHit hit : results) {
    String sourceAsString = hit.getSourceAsString();
    Map<String, SearchHitField> responseFields = hit.getFields();
    SearchHitField field = responseFields.get("product_id");
    Map map = hit.getSource();
    System.out.println(map.toString());
}
Queries:
POST /kal/products/_search?pretty
{
  "suggest": {
    "name-suggest": {
      "prefix": "wine",
      "completion": {
        "field": "suggest_name"
      }
    }
  }
}
GET /kal/products/_search
{
  "query": {
    "prefix": {
      "name": "wine",
      "description": "wine"
    }
  }
}
GET /kal/products/_search
{
  "query": {
    "multi_match": {
      "fields": ["name", "description_en"],
      "query": "description_",
      "type": "phrase_prefix"
    }
  }
}
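As a starting point, here is a hedged sketch of how the completion suggester and the phrase_prefix multi_match could be expressed with the transport client used in the snippet above. Index, type, and field names are taken from the question; the builder classes assume an Elasticsearch 5.x/6.x transport client, so adjust to your version:

// Completion suggester: the equivalent of the "name-suggest" request
CompletionSuggestionBuilder nameSuggest = SuggestBuilders
        .completionSuggestion("suggest_name")   // field mapped as type "completion"
        .prefix("wine");
SearchResponse suggestResponse = client.prepareSearch("kal")
        .suggest(new SuggestBuilder().addSuggestion("name-suggest", nameSuggest))
        .get();
CompletionSuggestion suggestion = suggestResponse.getSuggest().getSuggestion("name-suggest");
for (CompletionSuggestion.Entry.Option option : suggestion.getEntries().get(0).getOptions()) {
    System.out.println(option.getText().string());
}

// multi_match with type phrase_prefix: search-as-you-type over several fields
SearchResponse searchResponse = client.prepareSearch("kal")
        .setTypes("products")
        .setQuery(QueryBuilders.multiMatchQuery("wine", "name", "description_en")
                .type(MultiMatchQueryBuilder.Type.PHRASE_PREFIX))
        .setFrom(0).setSize(60)
        .get();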

Related

Azure Functions CosmosDB binding API keeps loading

First of all, the API works as intended locally. When deployed to an Azure Functions app, the API endpoint keeps loading and eventually shows HTTP 504 (Gateway Timeout); the page keeps loading with no response from Azure Functions.
I'm looking to fetch all data from the collection when I call the HttpTrigger.
Function.java
@FunctionName("get")
public HttpResponseMessage get(
        @HttpTrigger(name = "req",
                methods = {HttpMethod.GET, HttpMethod.POST},
                authLevel = AuthorizationLevel.ANONYMOUS)
        HttpRequestMessage<Optional<String>> request,
        @CosmosDBInput(name = "database",
                databaseName = "progMobile",
                collectionName = "news",
                partitionKey = "{Query.id}",
                connectionStringSetting = "CosmosDBConnectionString")
        Optional<String> item,
        final ExecutionContext context) {
    // Item list
    context.getLogger().info("Parameters are: " + request.getQueryParameters());
    context.getLogger().info("String from the database is " + (item.isPresent() ? item.get() : null));
    // Convert and display
    if (!item.isPresent()) {
        return request.createResponseBuilder(HttpStatus.BAD_REQUEST)
                .body("Document not found.")
                .build();
    } else {
        // return JSON from Cosmos. Alternatively, we can parse the JSON string
        // and return an enriched JSON object.
        return request.createResponseBuilder(HttpStatus.OK)
                .header("Content-Type", "application/json")
                .body(item.get())
                .build();
    }
}
Function.json
{
  "scriptFile" : "../ProgMobileBackend-1.0-SNAPSHOT.jar",
  "entryPoint" : "com.function.Function.get",
  "bindings" : [ {
    "type" : "httpTrigger",
    "direction" : "in",
    "name" : "req",
    "methods" : [ "GET", "POST" ],
    "authLevel" : "ANONYMOUS"
  }, {
    "type" : "cosmosDB",
    "direction" : "in",
    "name" : "database",
    "databaseName" : "progMobile",
    "partitionKey" : "{Query.id}",
    "connectionStringSetting" : "CosmosDBConnectionString",
    "collectionName" : "news"
  }, {
    "type" : "http",
    "direction" : "out",
    "name" : "$return"
  } ]
}
The Azure Functions monitor log does not show any error.
Running the function in the portal (Code + Test menu) does not show any error either.
The httpTrigger URL I'm using: https://johnmiguel.azurewebsites.net/api/get?id=id
I added the CosmosDBConnectionString value to the Function App configuration (did not check the "Deployment slot" option).
I'm using an instance of Cosmos DB for NoSQL.
The Function App runtime is set to Java and the version is set to Java 8.
Figured it out: the Java function was built for Java 17 while the Function App runtime was set to Java 8.

How to "Explicitly order highlighted fields" using ElasticSearch Java API client v7.16?

I want to explicitly order highlighted fields using the Elasticsearch Java API Client 7.16.
In other words, I want to build the following request:
GET /_search
{
  "highlight": {
    "fields": [
      { "a": {} },
      { "b": {} },
      { "c": {} }
    ]
  }
}
Unfortunately the following code ignores the insertion order:
new Highlight.Builder()
    .fields("a", new HighlightField.Builder().build())
    .fields("b", new HighlightField.Builder().build())
    .fields("c", new HighlightField.Builder().build());
All of the available fields() overloads eventually put the data into an unordered map, so the request that is actually sent is the following:
GET /_search
{
  "highlight": {
    "fields": {
      "b": {},
      "c": {},
      "a": {}
    }
  }
}
Is there any other Java API that allows controlling the order of the highlighted fields?
As far as I know this is not possible, and it is not an issue with Elasticsearch but with how JSON works. The JSON documentation states:
An object is an unordered set of name/value pairs.
I am not sure why you want to rely on order; you should not rely on the ordering of elements within a JSON object.
You can pass a Map of fields as shown below (use an insertion-ordered map such as LinkedHashMap if you want the entries kept in the order you add them); just check the Javadoc:
Map<String, HighlightField> test = new LinkedHashMap<>();   // insertion-ordered map
test.put("a", new HighlightField.Builder().build());
test.put("b", new HighlightField.Builder().build());
test.put("c", new HighlightField.Builder().build());
Highlight.Builder highlight = new Highlight.Builder().fields(test);
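For context, a minimal sketch of wiring that map into a search request with the 7.16 Java API client; the esClient instance, the index name, and the match_all query are placeholders, and search(...) throws IOException:

SearchResponse<Void> response = esClient.search(s -> s
        .index("my-index")                   // placeholder index name
        .query(q -> q.matchAll(m -> m))      // placeholder query
        .highlight(h -> h.fields(test)),     // the LinkedHashMap built above
    Void.class);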
There is a class HighlightBuilder in the package org.elasticsearch.search.fetch.subphase.highlight which has the following member variable:
private boolean useExplicitFieldOrder = false;
Then you can build:
List<HighlightBuilder.Field> fields1 = new ArrayList<>();
fields1.add(new HighlightBuilder.Field("a"));
fields1.add(new HighlightBuilder.Field("b"));
fields1.add(new HighlightBuilder.Field("c"));
HighlightBuilder highlightBuilder = new HighlightBuilder(null, null, fields1).useExplicitFieldOrder(true);
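For completeness, a hedged sketch of attaching that HighlightBuilder to a request via the older high-level REST client (which is where this class ships); the client instance, index name, and match_all query are placeholders:

SearchSourceBuilder source = new SearchSourceBuilder()
        .query(QueryBuilders.matchAllQuery())   // placeholder query
        .highlighter(highlightBuilder);         // fields serialized as an ordered array
SearchRequest searchRequest = new SearchRequest("my-index").source(source);
SearchResponse searchResponse = restHighLevelClient.search(searchRequest, RequestOptions.DEFAULT);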
Another way is to generate the explicitly ordered JSON yourself with an XContentBuilder (here fields is assumed to be a collection of highlight fields defined elsewhere):
boolean useExplicitFieldOrder = true;
XContentBuilder builder = XContentFactory.jsonBuilder();
builder.prettyPrint();
builder.startObject();
builder.startObject("highlight");
if (fields != null) {
    if (useExplicitFieldOrder) {
        builder.startArray("fields");
    } else {
        builder.startObject("fields");
    }
    for (HighlightField field : fields) {
        if (useExplicitFieldOrder) {
            builder.startObject();
        }
        builder.startObject(field.field());
        builder.endObject();
        if (useExplicitFieldOrder) {
            builder.endObject();
        }
    }
    if (useExplicitFieldOrder) {
        builder.endArray();
    } else {
        builder.endObject();
    }
}
builder.endObject();
builder.endObject();
String json = Strings.toString(builder);
System.out.println(json);
It will output the following:
{
  "highlight" : {
    "fields" : [
      {
        "a" : { }
      },
      {
        "b" : { }
      },
      {
        "c" : { }
      }
    ]
  }
}

Convert a MongoDB aggregation query for Spring Boot

I have a MongoDB query which works fine:
db.user.aggregate([
  {
    "$project": {
      "data": {
        "$objectToArray": "$$ROOT"
      }
    }
  },
  {
    $unwind: "$data"
  },
  {
    "$match": {
      "data.v": {
        $regex: "Mohit Chandani"
      }
    }
  }
])
It basically gets all the documents containing the value Mohit Chandani, and here is the output:
{ "_id" : "b387d728-1feb-45b6-bdec-dafdf22685e2", "data" : { "k" : "fullName", "v" : "Mohit Chandani" } }
{ "_id" : "8e35c497-4296-4ad9-8af6-9187dc0344f7", "data" : { "k" : "fullName", "v" : "Mohit Chandani" } }
{ "_id" : "c38b6767-6665-46b8-bd29-645c41d03850", "data" : { "k" : "fullName", "v" : "Mohit Chandani" } }
I need this query converted for my Spring Boot application, and I am writing the following:
Aggregation aggregation = Aggregation.newAggregation(Aggregation.project(Aggregation.ROOT), Aggregation.match(Criteria.where(connectionRequest.getWord())));
It would also be helpful to know which approach to take for long aggregations in Spring Data.
This might help you. I hope you are using MongoTemplate for the aggregation:
@Autowired
private MongoTemplate mongoTemplate;
And the code for the above script is:
public List<Object> test() {
    Aggregation aggregation = Aggregation.newAggregation(
            project().and(ObjectOperators.valueOf(ROOT).toArray()).as("data"),
            unwind("data"),
            match(Criteria.where("data.v").regex("Mohit Chandani"))
    ).withOptions(AggregationOptions.builder().allowDiskUse(Boolean.TRUE).build());
    return mongoTemplate.aggregate(aggregation, mongoTemplate.getCollectionName(YOUR_COLLECTION.class), Object.class).getMappedResults();
}
I'm not sure whether the project() in the code above will work, because I haven't tried it; I referred to the Spring Data MongoDB documentation.
If it doesn't work, the following definitely will. A few operations, such as $addFields and $filter, are not supported in Spring Data, so the trick is to pass the stage as a raw Document:
public List<Object> test() {
    Aggregation aggregation = Aggregation.newAggregation(
            p -> new Document("$project",
                    new Document("data",
                            new Document("$objectToArray", "$$ROOT")
                    )
            ),
            unwind("data"),
            match(Criteria.where("data.v").regex("Mohit Chandani"))
    ).withOptions(AggregationOptions.builder().allowDiskUse(Boolean.TRUE).build());
    return mongoTemplate.aggregate(aggregation, mongoTemplate.getCollectionName(YOUR_COLLECTION.class), Object.class).getMappedResults();
}
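For reference, the static helpers and types used in the two snippets above come from Spring Data MongoDB and the MongoDB driver; a sketch of the imports this assumes:

import static org.springframework.data.mongodb.core.aggregation.Aggregation.*;   // project, unwind, match, ROOT, newAggregation

import org.bson.Document;
import org.springframework.data.mongodb.core.aggregation.Aggregation;
import org.springframework.data.mongodb.core.aggregation.AggregationOptions;
import org.springframework.data.mongodb.core.aggregation.ObjectOperators;
import org.springframework.data.mongodb.core.query.Criteria;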

AggregationResults Type Cast MongoDB + Spring data + Aggregation

I'm querying MongoDB from my Spring Data application using an aggregation, and when I get the result I need to cast (or map) it to my own object or type, and I don't know how.
This is my code:
AggregationResults aggregationResults = DBManager.getInstance().getMongoOperation().aggregate(aggregation, LevelEntity.class,Object.class);
I would like to have something like
AggregationResults<LevelList> aggregationResults = DBManager.getInstance().getMongoOperation().aggregate(aggregation, LevelEntity.class,LevelList.class);
and the aggregation gives me back this information:
{
  "_id" : {
    "date" : "24/03/2015"
  },
  "levels" : [
    {
      "_id" : ObjectId("54f8627071fdac0ec132b4e5"),
      "_class" : "org.xxxxxxxxxxx.persistence.model.impl.LevelEntity",
      "user" : {
        "_id" : ObjectId("54da19ce71fd56a173a3451a"),
        ...
      },
      "level" : 4,
      "date" : ISODate("2015-03-24T14:04:32.830Z"),
      "dateFormatted" : "24/03/2015"
    },
    {
      "_id" : ObjectId("54f8627071fdac0ec132b4f4"),
      "_class" : "org.xxxxxxxxxxx.persistence.model.impl.LevelEntity",
      "user" : {
        "_id" : ObjectId("54e34bd671fde9071569650c"),
        ...
      },
      "level" : 3,
      "date" : ISODate("2015-03-24T14:04:32.866Z"),
      "dateFormatted" : "24/03/2015"
    }
  ]
}
Can you please help me?
Thanks a lot.
Use it as follows:
TypedAggregation<LevelList> aggregation = Aggregation.newAggregation(LevelList.class,
        match, sortOp, skipOp, limitOp);
AggregationResults<LevelList> data = mongoTemplate.aggregate(aggregation,
        COLLECTION_NAME, LevelList.class);
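The match, sortOp, skipOp, and limitOp operations above are placeholders; a hedged sketch of how they might be built with Spring Data's static factory methods (the criteria and field names are hypothetical, Sort is org.springframework.data.domain.Sort):

MatchOperation match = Aggregation.match(Criteria.where("level").gte(3));   // hypothetical filter
SortOperation sortOp = Aggregation.sort(Sort.Direction.DESC, "date");
SkipOperation skipOp = Aggregation.skip(0L);
LimitOperation limitOp = Aggregation.limit(20L);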
Well Cataclysm, your response is not totally correct, but it gave me the key to solve my problem. This is what I had to do:
TypedAggregation<LevelEntity> aggregation = Aggregation.newAggregation(LevelEntity.class,
        userMatchOperation, dateMatchOperation,
        group(LevelEntity.DATE_FORMATTED_FIELD).push(ROOT).as(LevelByDate.ENTITY_NAME));
AggregationResults<LevelByDate> data = DBManager.getInstance().getMongoOperation()
        .aggregate(aggregation, LevelByDate.class);
And this is the LevelByDate object
@Document
public class LevelByDate {

    public static final String ENTITY_NAME = "levelEntity";

    @Id
    private String id;

    List<LevelEntity> levelEntity;

    public LevelByDate() {
    }

    public String getId() {
        return id;
    }

    public void setId(String id) {
        this.id = id;
    }

    public List<LevelEntity> getLevelEntity() {
        return levelEntity;
    }

    public void setLevelEntity(List<LevelEntity> levelEntity) {
        this.levelEntity = levelEntity;
    }
}
Thanks a lot,
Kike.

Spring Mongo Upsert nested document

I am a newbie to the Spring framework. I have a Mongo document like:
{
  "_id" : ObjectId("527242d584ae917d8bd75c7b"),
  "postTitle" : "Car",
  "postDesc" : "rent",
  "owner" : ObjectId("526a588f84aed6f41cca10bd"),
  "intrest" : []
}
What I want is to search for the document having
"_id" : ObjectId("527242d584ae917d8bd75c7b")
and update it to:
{
  "_id" : ObjectId("527242d584ae917d8bd75c7b"),
  "postTitle" : "Car",
  "postDesc" : "rent",
  "owner" : ObjectId("526a588f84aed6f41cca10bd"),
  "intrest" : [
    {
      "userId" : ObjectId("526a587d84aed6f41cca10bc"),
      "timestamp" : ISODate("2013-10-31T11:45:25.256Z")
    },
    {
      "userId" : ObjectId("526a587d84aed6f41cca10bc"),
      "timestamp" : ISODate("2013-11-31T11:55:25.256a")
    }
  ]
}
My domain classes are:
@Document
public class Post {
    @Id
    private ObjectId _id;
    private String postTitle;
    private String postDesc;
    private ObjectId owner = Global.getCurruser();
    private List<Intrest> intrest = new ArrayList<Intrest>();
    // Getters and setters
}

@Document
public class Intrest {
    private ObjectId userId;
    private Date timestamp;
    // Getters and setters
}
What upsert should I write to add or modify entries in the intrest array?
Please help.
I am using Spring Data MongoDB. Here is what I do:
Intrest insertObj = new Intrest();
// initialize insertObj here ...
Update args = new Update();
args.addToSet("intrest", insertObj);
Query query = new Query(Criteria.where("id").is("527242d584ae917d8bd75c7b"));
// if you want to do an upsert
mongoOperation.findAndModify(query, args, FindAndModifyOptions.options().upsert(true), Post.class);
// if you want to just update
mongoOperation.findAndModify(query, args, Post.class);
I think what you intend to do is an update. An upsert modifies the document matching the given query, and creates a new document if none matches, whereas an update only modifies the document if it is found. Here is the reference.
I do not know Java, but all you need is the $pushAll operator (I really hope you can find out how to do this with the Java driver; a Spring Data sketch follows the shell example below).
db.collection.update(
    { "_id" : ObjectId("527242d584ae917d8bd75c7b") },
    { $pushAll: { intrest: [
        {
            "userId" : ObjectId("526a587d84aed6f41cca10bc"),
            "timestamp" : ISODate("2013-10-31T11:45:25.256Z")
        },
        {
            "userId" : ObjectId("526a587d84aed6f41cca10bc"),
            "timestamp" : ISODate("2013-11-31T11:55:25.256a")
        }
    ] } }
);
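A hedged sketch of the same idea in Spring Data MongoDB, using push(...).each(...) rather than the deprecated $pushAll; the ObjectIds are copied from the question, while the timestamps and the Intrest setter names are assumptions:

Intrest first = new Intrest();
first.setUserId(new ObjectId("526a587d84aed6f41cca10bc"));
first.setTimestamp(new Date());                      // use the real timestamp here

Intrest second = new Intrest();
second.setUserId(new ObjectId("526a587d84aed6f41cca10bc"));
second.setTimestamp(new Date());

Update update = new Update().push("intrest").each(first, second);
Query query = new Query(Criteria.where("_id").is(new ObjectId("527242d584ae917d8bd75c7b")));
mongoOperation.upsert(query, update, Post.class);    // creates the document if it does not exist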
