Spring Elasticsearch: how to save a Map as a string in the index?

I'm facing an indexing issue.
In my Spring document I have a map. This map can contain thousands of entries because I save history in it.
private NavigableMap<String, Integer> installHistory = new TreeMap<>();
In Elasticsearch every entry of the map gets indexed as its own field, and I get a limit-exceeded error.
How can I avoid indexing all the data inside the map?
I use Spring 2.2 and Spring Data Elasticsearch 3.2.4.
Thanks in advance.
Edit:
I upgraded to Spring Data Elasticsearch 4.0.1 to use FieldType.Flattened, but Spring Data Elasticsearch 4.0.1 requires Elasticsearch 7.6.x as a minimum. My version is 7.4 and I can't change it because it is the latest version provided by AWS.
I made the field transient and created a String property for this map. Before saving my object I convert the map into a list and store it in the String variable.

A map is converted to a JSON object that has the map keys as properties and the map values as values. So you end up storing objects with thousands of properties; see the Elasticsearch documentation about this.
You could declare the type of installHistory as FieldType.Flattened.
Edit:
I missed that you are using Spring Data Elasticsearch 3.2.x. Support for the flattened field type was added in 4.0.
The best option then would probably be to convert the Map property to a List of pairs or tuples, where each element holds one key-value pair from the map.
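That conversion might look like this (a minimal, runnable sketch; the InstallEntry pair class and its field names are my own assumption, not part of the original code):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.NavigableMap;
import java.util.TreeMap;

public class InstallHistoryConversion {

    // Simple key-value pair standing in for one map entry.
    public static class InstallEntry {
        public final String date;
        public final int count;

        public InstallEntry(String date, int count) {
            this.date = date;
            this.count = count;
        }
    }

    // Converts the NavigableMap into a list of pairs. Elasticsearch then
    // maps an array of objects with two fixed fields (date, count) instead
    // of one distinct field per map key, avoiding the field-limit error.
    public static List<InstallEntry> toList(NavigableMap<String, Integer> history) {
        List<InstallEntry> entries = new ArrayList<>();
        history.forEach((key, value) -> entries.add(new InstallEntry(key, value)));
        return entries;
    }

    public static void main(String[] args) {
        NavigableMap<String, Integer> history = new TreeMap<>();
        history.put("2020-01", 10);
        history.put("2020-02", 25);

        List<InstallEntry> list = toList(history);
        System.out.println(list.size());      // 2
        System.out.println(list.get(0).date); // 2020-01
    }
}
```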

Did you try the custom converters described in the documentation?
https://docs.spring.io/spring-data/elasticsearch/docs/current/reference/html/#elasticsearch.mapping.meta-model.conversions
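If you go the converter route, the core logic could be sketched like this (plain Java so it runs standalone; the `key=value|key=value` delimiter format is my own assumption, and the Spring wiring is only described in the comments):

```java
import java.util.NavigableMap;
import java.util.TreeMap;

// Flattens the install-history map to a single string so Elasticsearch
// indexes one field instead of one field per map key. In Spring Data
// Elasticsearch you would wrap these two methods in @WritingConverter /
// @ReadingConverter implementations and register them as custom conversions.
public class InstallHistoryCodec {

    public static String write(NavigableMap<String, Integer> history) {
        StringBuilder sb = new StringBuilder();
        history.forEach((key, value) -> {
            if (sb.length() > 0) {
                sb.append('|');
            }
            sb.append(key).append('=').append(value);
        });
        return sb.toString();
    }

    public static NavigableMap<String, Integer> read(String flat) {
        NavigableMap<String, Integer> history = new TreeMap<>();
        if (flat == null || flat.isEmpty()) {
            return history;
        }
        for (String pair : flat.split("\\|")) {
            int eq = pair.indexOf('=');
            history.put(pair.substring(0, eq), Integer.parseInt(pair.substring(eq + 1)));
        }
        return history;
    }

    public static void main(String[] args) {
        NavigableMap<String, Integer> h = new TreeMap<>();
        h.put("2020-01", 10);
        h.put("2020-02", 25);

        String flat = write(h);
        System.out.println(flat);                 // 2020-01=10|2020-02=25
        System.out.println(read(flat).equals(h)); // true
    }
}
```

Note that this trades searchability for index size: the individual keys are no longer queryable as fields.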

Related

How to set collection name attribute for org.springframework.cloud.gcp.data.firestore.Document annotation value using env variable or property value?

I'm using Spring Data Cloud Firestore, and I want to map an object to a collection using the @Document annotation:
@Document(collectionName = "usersCollection")
public class User {
}
But I need a different collection name depending on the environment (we use one GCP project for the dev, stage and prod environments, and I need "usersCollectionDev", "usersCollectionStage" and "usersCollectionProd" collections, for example).
Is it possible to do?

Spring Data Elasticsearch - Create keyword field with normalizer

We are using the spring-data-elasticsearch project to interface with our elasticsearch clusters, and have been using it now for around a year. Recently, we moved to elasticsearch 5.x (from 2.x) where we now have the "keyword" datatype.
I would like to index these keywords as lowercase values, which I know can be done with field normalizers. I can't find anywhere in the documentation or online where I can add a normalizer to a field through the annotation based mapping.
E.g.
@Field(type = FieldType.Keyword, <some_other_param = some_normalizer>)
Is this something that can be done? I know that we can use JSON based mapping definitions as well, so I will fall back to that option if needed, but would like to be able to do it this way if possible.
Any help would be very appreciated!
Since the pull request of @xhaggi has been merged (spring-data-elasticsearch 3.1.3+ or Spring Boot 2.1.1+), the @Field annotation has a normalizer attribute.
To use it, we need to:
- declare a @Field or an @InnerField with params type = FieldType.Keyword, normalizer = "%NORMALIZER_NAME%"
- add @Setting(settingPath = "%PATH_TO_NORMALIZER_JSON_FILE%") at the class level
- put the normalizer mapping into a JSON file at %PATH_TO_NORMALIZER_JSON_FILE%
Example of usage:
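Putting the three steps together might look like this (a hedged sketch: the normalizer name, the settings file path, the index name, and the Article class are placeholders I chose, not from the original answer):

```java
// src/main/resources/elasticsearch-settings.json (placeholder path):
// {
//   "analysis": {
//     "normalizer": {
//       "lowercase_normalizer": {
//         "type": "custom",
//         "filter": ["lowercase"]
//       }
//     }
//   }
// }

@Document(indexName = "articles")
@Setting(settingPath = "elasticsearch-settings.json")
public class Article {

    @Id
    private String id;

    // Indexed as a keyword, lowercased by the normalizer declared above.
    @Field(type = FieldType.Keyword, normalizer = "lowercase_normalizer")
    private String title;
}
```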
FYI, for anyone looking at this, the answer is that there is no way to do this at this time.
You can do this, however, by creating your mappings file as JSON in the Elasticsearch format. See:
https://www.elastic.co/guide/en/elasticsearch/reference/current/indices-put-mapping.html
You can then create that JSON file and link it to your domain model with:
@Mapping(mappingPath = "some/path/mapping.json")
Note that this is not, in my experience, compatible with the annotation-based mapping for fields.
There is a pending issue https://jira.spring.io/browse/DATAES-492 waiting for review.

What is the JacksonDB equivalent of db.collection.find({},{"fieldName":1})?

I am new to Jackson DB. I know that to get the entire document list of a collection using Jackson we need to do:
CollectionClass.coll.find().toArray();
which is the Jackson DB equivalent of the MongoDB command:
db.collection.find()
So what is the Jackson DB equivalent of, say:
db.collection.find({},{"fieldName":1, "_id":0})
As given here, this might be helpful to you (not tested):
coll.find(DBQuery.empty(), // add your criteria for the search here
        DBProjection.include("fieldName")).toArray();

Filter custom fields not present in database using Apache Cayenne

In my API it is currently possible to filter on all fields that exist in the database.
Filtering is implemented in FilterUtils.java in the API project. This translates the URL parameters to an Apache Cayenne search and returns the result to the resource.
In the "Main" project there are generated classes for each database table in com.foo.bar.auto these are extended by classes in com.foo.bar.
The classes in com.foo.bar can have custom functions. An example is Document.getAccount.
Document.getAccount is exposed in the API, but it is not able to filter it because it's not a database field. I need to be able to filter fields like Document.getAccount.
Is it possible to register these functions in Cayenne somehow?
The syntax for searching on custom fields needs to be identical to today's filtering syntax. So when searching for account it should look like this: Document?filter=account(EQ)1234.
Any ideas? All help is appreciated.
Your best bet is to split your filter keys into persistent and non-persistent properties. Then you'd build two expressions, one for each subset of keys. Use the first expression to build a query that fetches from the DB, and the second one to filter the returned result in memory:
Expression p = ...  // qualifier over persistent properties
Expression np = ... // qualifier over non-persistent properties
SelectQuery query = new SelectQuery(Document.class, p);
List<Document> docs = context.performQuery(query);
List<Document> filteredDocs = np.filterObjects(docs);
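The key-splitting step could be sketched like this (plain Java so it runs standalone; PERSISTENT_KEYS is a hypothetical set that in practice you would derive from Cayenne's entity metadata rather than hard-code):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Set;

public class FilterSplitter {

    // Properties that exist as columns in the database. In a real
    // application this set would be built from Cayenne's ObjEntity
    // metadata instead of being hard-coded here.
    private static final Set<String> PERSISTENT_KEYS = Set.of("title", "createdDate");

    // Splits the URL filter parameters into the part Cayenne can translate
    // to SQL and the part that must be applied in memory (e.g. "account").
    public static Map<String, Map<String, String>> split(Map<String, String> filters) {
        Map<String, String> persistent = new HashMap<>();
        Map<String, String> inMemory = new HashMap<>();
        filters.forEach((key, value) ->
                (PERSISTENT_KEYS.contains(key) ? persistent : inMemory).put(key, value));

        Map<String, Map<String, String>> result = new HashMap<>();
        result.put("persistent", persistent);
        result.put("inMemory", inMemory);
        return result;
    }

    public static void main(String[] args) {
        Map<String, String> filters = Map.of("title", "report", "account", "1234");
        Map<String, Map<String, String>> parts = split(filters);
        System.out.println(parts.get("persistent")); // {title=report}
        System.out.println(parts.get("inMemory"));   // {account=1234}
    }
}
```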

Retrieve GAE Datastore's data types in Java

How to retrieve data types of properties of entities stored in Google App Engine Datastore using Java? I didn't find property.getType() or similar method in Java Datastore API.
There is no direct method provided, but you can retrieve it by comparing the Java type of the property with the table given at this link:
Map<String, Object> properties = entity.getProperties();
String[] propertyNames = properties.keySet().toArray(new String[properties.size()]);
for (final String propertyName : propertyNames) {
    // if the property's value type is
    // "com.google.appengine.api.users.User" then it's a Google Accounts user
    // "java.lang.Integer" then it's an Integer
    // "int" then it's a primitive integer
}
Hope this helps.
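The comparison in the loop above can be made concrete. This sketch uses a plain Map in place of a real Datastore Entity, so the entity.getProperties() call is only simulated:

```java
import java.util.HashMap;
import java.util.Map;

public class DatastoreTypeInspector {

    // Returns the runtime class name of each property value, mirroring
    // the class-name comparison shown in the loop above.
    public static Map<String, String> propertyTypes(Map<String, Object> properties) {
        Map<String, String> types = new HashMap<>();
        properties.forEach((name, value) ->
                types.put(name, value == null ? "null" : value.getClass().getName()));
        return types;
    }

    public static void main(String[] args) {
        // Stand-in for entity.getProperties() from the Datastore API.
        Map<String, Object> properties = new HashMap<>();
        properties.put("age", 42); // autoboxed to java.lang.Integer
        properties.put("name", "Alice");

        Map<String, String> types = propertyTypes(properties);
        System.out.println(types.get("age"));  // java.lang.Integer
        System.out.println(types.get("name")); // java.lang.String
    }
}
```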
You need to use Property Metadata Queries.
Be aware that, as stated in the documentation, the property representation returned by these queries is the App Engine representation and does not have a one-to-one mapping with Java classes. But you can at least get the general data type.
