I'm trying to persist a method value in a MongoDB collection as a property, but MongoTemplate is not storing its value:
@Data
@Document(collection = "collection")
@JsonIgnoreProperties(ignoreUnknown = true)
public class CollectionObject {
    @Id
    private ObjectId id;
    private List<Object> elements;

    public int getElementCount() {
        return size(elements);
    }
}
This method appears as a property in the JSON after serialization.
When I add
@AccessType(AccessType.Type.PROPERTY)
and try to persist the object, I receive this error:
java.lang.IllegalStateException: Cannot set property elementCount because no setter, no wither and it's not part of the persistence constructor private com.package.CollectionObject()!
The object is stored using
mongoTemplate.insert(OBJECT, COLLECTION)
@JsonProperty with the READ_ONLY option does not work either.
Is there any way to achieve this other than creating a separate field?
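The error complains about a missing setter. One sketch of a workaround, mirroring the no-op-setter pattern used elsewhere in this thread (not guaranteed to suit every mapping setup; the null-guard in the getter is an addition):

```java
@Data
@Document(collection = "collection")
public class CollectionObject {
    @Id
    private ObjectId id;
    private List<Object> elements;

    // Computed value exposed as a property so it is written on insert.
    @AccessType(AccessType.Type.PROPERTY)
    public int getElementCount() {
        return elements == null ? 0 : elements.size();
    }

    // No-op setter: satisfies the mapping layer's requirement that the
    // property be writable, while keeping the value derived.
    public void setElementCount(int ignored) {
    }
}
```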
Related
I have a question about Elasticsearch with Spring Data.
@Data
@NoArgsConstructor
@AllArgsConstructor
@Document(indexName = "my_es_index")
public class MyEsIndex {
    private String id;
    private Long counter;
    private Long timestamp;
}
and the repository:
public interface MyEsIndexRepository extends ElasticsearchRepository<MyEsIndex, String> {
    Optional<MyEsIndex> findFirstByIdOrderByTimestampDesc(String id);
}
So I have a service where I first have to search for the previously saved record to retrieve its value, always searching ordered by timestamp:
@Service
@RequiredArgsConstructor
public class MyEsService {
    private final MyEsIndexRepository repository;

    public MyEsIndex insert(String previousId) {
        Long previousCounter = repository.findFirstByIdOrderByTimestampDesc(previousId)
                .map(MyEsIndex::getCounter)
                .orElse(0L);
        var index = new MyEsIndex(UUID.randomUUID().toString(), ++previousCounter,
                Instant.now().toEpochMilli());
        return repository.save(index);
    }
}
and when trying to do the operation I receive
{"error":{"root_cause":[{"type":"query_shard_exception","reason":"No mapping found for [timestamp] in order to sort on","index":"my_es_index"}
Is it possible to initialize the field mappings in Elasticsearch on an empty index?
I ask because the init-config solution is not that clean: it will run only once, when starting to work with an empty index to which no record has ever been saved.
@Configuration
@RequiredArgsConstructor
public class InitElasticsearchConfig {
    private final MyEsIndexRepository repository;

    @EventListener(ApplicationReadyEvent.class)
    public void initIndex() {
        if (repository.findAll(PageRequest.of(0, 1)).isEmpty()) {
            var initIndex = new MyEsIndex("initId", 0L, 0L);
            repository.save(initIndex);
            repository.delete(initIndex);
        }
    }
}
Is it possible to delegate this to Spring? I didn't find any way.
When using Spring Data Elasticsearch repositories - as you do - the normal behaviour is that the mapping is written to Elasticsearch after index creation on application startup, when the index does not yet exist.
The problem in your code is that you do not define to what types the properties of your entity should be mapped; you need to add @Field annotations to do that:
@Document(indexName = "my_es_index")
public class MyEsIndex {
    private String id;

    @Field(type = FieldType.Long)
    private Long counter;

    @Field(type = FieldType.Long)
    private Long timestamp;
}
Properties that are not annotated with @Field are not written to the mapping but are left for automatic mapping by Elasticsearch; that is why the sort does not work. As no document has been written to the index yet, Elasticsearch does not know the property's type and how to sort on it.
In your code there is another thing that might not match your desired application logic. In Spring Data Elasticsearch an entity needs to have an id property; that's the property that will be used as the document's id in Elasticsearch. This is normally defined by annotating the property with @Id; if that is missing - as in your case - a property with the name id or document is used. So in your case the property id is used.
A document's id is unique in Elasticsearch; if you store a new document under an existing id, the previous content is overwritten. If that's what you want, then you should add the @Id annotation to your property to make clear that this is the unique id. But in that case your findFirstByIdOrderByTimestampDesc does not make sense: a find by id will always return at most one document, so the order-by is irrelevant and you could just use findById(). I assume that the id should be unique, as you initialize it with a UUID.
If your id is not unique and you have multiple documents with the same id and different timestamps, then you'll need to add a new unique property to your entity and annotate that with @Id to prevent id from being used as the unique identifier.
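A sketch of that last variant (the esId property name is made up for illustration; it becomes the document id, while the business id stays a searchable field):

```java
@Document(indexName = "my_es_index")
public class MyEsIndex {
    @Id
    private String esId;              // unique Elasticsearch document id

    @Field(type = FieldType.Keyword)
    private String id;                // business id, may repeat across documents

    @Field(type = FieldType.Long)
    private Long counter;

    @Field(type = FieldType.Long)
    private Long timestamp;
}
```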
I have an entity 'Product' and I want the primary key in ES to be a combination of the 'id' and 'name' attributes. How can we do that using Spring Data Elasticsearch?
public class Product {
    @Id
    private String id;

    @Id
    private String name;

    @Field(type = FieldType.Keyword)
    private Category category;

    @Field(type = FieldType.Long)
    private double price;

    @Field(type = FieldType.Object)
    private List<ValidAge> age;

    public enum Category {
        CLOTHES,
        ELECTRONICS,
        GAMES;
    }
}
One way to achieve this would be the following:
First, rename your id property; I changed it to documentId here. This is necessary because in Spring Data Elasticsearch an id-property can either be annotated with @Id or be named id. As there can only be one id-property, we need to get this out of the way. It can still be named id in Elasticsearch, set by the @Field annotation, but the Java property must be changed.
Second, add a method annotated with @Id and @AccessType(AccessType.Type.PROPERTY) which returns the value you want to use in Elasticsearch.
Third, provide a no-op setter for this property. This is necessary because Spring Data Elasticsearch does not check whether the id property is read-only when populating an entity after save or when reading from the index. This is a bug in Spring Data Elasticsearch; I'll create an issue for that.
So that comes up with an entity like this:
@Document(indexName = "composite-entity")
public class CompositeEntity {
    @Field(name = "id", type = FieldType.Keyword)
    private String documentId;

    @Field(type = FieldType.Keyword)
    private String name;

    @Field(type = FieldType.Text)
    private String text;

    @Id
    @AccessType(AccessType.Type.PROPERTY)
    public String getElasticsearchId() {
        return documentId + '-' + name;
    }

    public void setElasticsearchId(String ignored) {
    }

    // other getters and setters
}
The repository definition is straightforward:
public interface CompositeRepository extends ElasticsearchRepository<CompositeEntity, String> {
}
Remember that for every call that needs an Elasticsearch id, you'll need to build it the same way it is done in the entity class.
I am not sure about Spring Data Elasticsearch, but Spring Data JPA provides a facility for defining a composite primary key with @IdClass: we define a separate class (say, class A) containing all the fields which we want to be part of the composite key. Then we use @IdClass(A.class) on the entity class and put the @Id annotation on all the fields which should be part of the composite key.
You can refer to this article, although I am not sure whether the same concept applies to Spring Data ES: https://www.baeldung.com/jpa-composite-primary-keys
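A minimal sketch of the JPA @IdClass approach described above (the ProductId class name is made up for illustration; note that @IdClass is a JPA annotation and applies to JPA entities, not Elasticsearch documents):

```java
// Composite key class: a plain serializable POJO whose fields match
// the @Id fields of the entity by name and type.
public class ProductId implements Serializable {
    private String id;
    private String name;

    // equals() and hashCode() are required for a key class.
    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof ProductId)) return false;
        ProductId that = (ProductId) o;
        return Objects.equals(id, that.id) && Objects.equals(name, that.name);
    }

    @Override
    public int hashCode() {
        return Objects.hash(id, name);
    }
}

@Entity
@IdClass(ProductId.class)
public class Product {
    @Id
    private String id;

    @Id
    private String name;

    private double price;
}
```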
Is it possible in Spring JPA to map a @Transient property of an object to an alias like so?
Native Query
SELECT *, 1 AS liked FROM User WHERE user_id = 123 // + logic to determine if liked
Class
@Entity
public class User {
    @Id
    private Long userId;

    @Column(name = "displayName")
    private String displayName;

    @Transient
    private int liked; // not tied to any column
}
I've tried to implement this, but liked always returns 0 where it should be 1 (and null if I define the field as an object type).
Any help is appreciated!
You should use the @Formula annotation for the field (see the example).
The @Formula annotation provides an SQL snippet which Hibernate will execute when it fetches the entity from the database. The return value of the SQL snippet gets mapped to a read-only entity attribute.
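A sketch of what that could look like for the User entity above (the likes table and the SQL snippet are made up for illustration; the real "logic to determine if liked" would go there):

```java
@Entity
public class User {
    @Id
    private Long userId;

    @Column(name = "displayName")
    private String displayName;

    // Hibernate evaluates this SQL snippet on every fetch and maps the
    // result into a read-only attribute; no "liked" column is required.
    @Formula("(SELECT CASE WHEN COUNT(*) > 0 THEN 1 ELSE 0 END "
            + "FROM likes l WHERE l.user_id = user_id)")
    private int liked;
}
```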
I have a lazily fetched collection in an entity, and I am using Spring Data (JpaRepository) to access the entities.
@Entity
public class Parent {
    @Id
    private Long id;

    @OneToMany(mappedBy = "parentId", fetch = FetchType.LAZY)
    private Set<Child> children;
}
I want two functions in the service class; the current implementation is as follows.
"children" should be null when fetching a parent:
public Parent getParent(Long parentId) {
    return repo.findOne(parentId);
}
"children" should be filled when fetching a parent:
public Parent getParentWithChildren(Long parentId) {
    Parent p = repo.findOne(parentId);
    Hibernate.initialize(p.children);
    return p;
}
When returning the "Parent" entity from a RestController, the following exception is thrown:
@RequestMapping("/parent/{parentId}")
public Parent getParent(@PathVariable("parentId") Long id) {
    Parent p = parentService.getParent(id); // ok till here
    return p; // error thrown when converting to JSON
}
org.springframework.http.converter.HttpMessageNotWritableException:
Could not write content: failed to lazily initialize a collection of
role: com.entity.Parent.children, could not initialize proxy - no
Session (through reference chain: com.entity.Parent["children"]);
nested exception is
com.fasterxml.jackson.databind.JsonMappingException: failed to lazily
initialize a collection of role: com.entity.Parent.children, could not
initialize proxy - no Session (through reference chain:
com.entity.Parent["children"])
If you are looking to allow for different JSON representations of the same domain model depending on use case, then you can look at the following which will allow you to do so without requiring DTOs:
https://spring.io/blog/2014/12/02/latest-jackson-integration-improvements-in-spring
Alternatively, see also the 'Projections in Spring Data REST' section in the following
https://spring.io/blog/2014/05/21/what-s-new-in-spring-data-dijkstra#projections-in-spring-data-rest
The RestController should return a ParentDTO instead of the Parent entity. The ParentDTO can be populated in a transactional service method.
The exception is thrown because the JSON serializer requires all properties to be initialized already. So every REST controller that needs to return a Parent has to initialize the properties first:
@RequestMapping("/parent/{parentId}")
public Parent getParent(@PathVariable("parentId") Long id) {
    return parentService.getParentWithChildren(id);
}
The getParentWithChildren service method runs inside a transaction, and the associated Hibernate Session is closed when the transaction is committed. This means you have to initialize all required properties while the Hibernate Session is still open (inside the service method).
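A sketch of such a service method, assuming the repository and entity from the question (a getChildren() accessor on Parent is assumed here; @Transactional keeps the Session open while the lazy collection is initialized):

```java
@Service
public class ParentService {

    private final ParentRepository repo;

    public ParentService(ParentRepository repo) {
        this.repo = repo;
    }

    // The Session stays open for the duration of this method, so the
    // lazy collection can be initialized before the entity is returned.
    @Transactional(readOnly = true)
    public Parent getParentWithChildren(Long parentId) {
        Parent p = repo.findOne(parentId);
        Hibernate.initialize(p.getChildren());
        return p;
    }
}
```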
You can also use the Spring Data entity graph support:
@Entity
@NamedEntityGraphs(@NamedEntityGraph(name = "Parent.children", attributeNodes = @NamedAttributeNode("children")))
public class Parent {
    @Id
    private Long id;

    @OneToMany(mappedBy = "parentId", fetch = FetchType.LAZY)
    private Set<Child> children;
}
And the getParentWithChildren method becomes:
@Repository
public interface ParentRepository extends CrudRepository<Parent, Long> {
    @EntityGraph(value = "Parent.children", type = EntityGraphType.LOAD)
    Parent getParentWithChildren(Long parentId);
}
So you don't even need to implement getParent and getParentWithChildren yourself; these methods can be supplied by Spring Data.
First of all, you did not show us the Child class; I hope the property is called parentId and not parent:
public class Child {
    @ManyToOne
    private Parent parentId;
}
Solution 1: your code is actually correct, except that you MUST use a second layer of DTOs (simple POJO classes) to transport your domain layer to the client/browser. If you do not, then after you solve your lazy-initialization exceptions you will run into the circular dependency between Parent and Child: the JSON marshaller (Jackson) will try to encode a Child, then its Parent, then again its children, then again their Parent, and so on. An example of a DTO would be:
public class ParentDto {
    private Long id;
    private String prop1;

    public ParentDto(Parent parent) {
        this.id = parent.id;
        this.prop1 = parent.prop1;
        // ...other properties
    }

    // here come all getters for the properties defined above.
}
Solution 2: use @JsonIgnore on your public property Parent.getChildren() so that Jackson does not try to encode the children when marshalling the Parent instance.
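A sketch of Solution 2, assuming a getter is added to the Parent entity from the question:

```java
@Entity
public class Parent {

    @Id
    private Long id;

    @OneToMany(mappedBy = "parentId", fetch = FetchType.LAZY)
    private Set<Child> children;

    // Jackson skips this property entirely, so the lazy collection
    // is never touched during JSON serialization.
    @JsonIgnore
    public Set<Child> getChildren() {
        return children;
    }
}
```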
My problem:
@Entity
public class Container {
    @OneToMany(cascade = CascadeType.ALL)
    private List<Element> containedElements;

    public final List<Element> getContainedElements() {
        return containedElements;
    }
}
@Entity
@Inheritance(strategy = InheritanceType.TABLE_PER_CLASS)
public abstract class Element {
    @ManyToOne(cascade = CascadeType.ALL)
    private Container myContainer;

    public final Container getMyContainer() {
        return myContainer;
    }

    public abstract Object getValue();

    public abstract void setValue(final Object newValue);
}
@Entity
public class StringElement extends Element {
    private String someValue;

    public final Object getValue() {
        return someValue;
    }

    public final void setValue(final Object newValue) {
        someValue = (String) newValue;
    }
}
I have a container class that contains possibly many objects of the abstract class Element.
I have more than one implementation of this Element class, one of them being StringElement.
Using the JPA API (provided by Hibernate), a local H2 database, and a small test class, I can persist entities of these classes, query the database for them, and output them to the console.
Using Wildfly 8.0 (JBoss), the JPA API (provided by Hibernate), and a Wildfly-managed H2 database, I can persist entities, but when I query the database for a Container object, I cannot access the contained elements. Trying to do so results in the following error:
Caused by: java.lang.NumberFormatException: empty String
at sun.misc.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:1011)
at java.lang.Double.parseDouble(Double.java:540)
at org.h2.value.Value.convertTo(Value.java:846)
I can query the database for a list of all StringElements and parse the results. I can access the Container via getMyContainer(). But when I then try to access an Element via getContainedElements().get(0), I get the above error again.
Did I use the correct JPA annotations? How can I have a list of abstract objects in my container?
The @OneToMany annotation has FetchType.LAZY as its default.
It looks like you are trying to access containedElements when the Container entity is no longer managed by the persistence context.
You must read containedElements while the Container is still managed by the persistence context (if you want to keep FetchType.LAZY), or just change the fetch type to FetchType.EAGER:
@OneToMany(cascade = CascadeType.ALL, fetch = FetchType.EAGER)
private List<Element> containedElements;
I can finally answer this on my own:
The problem lies with multiple classes that inherit from Element. Or rather, the cause of the error is two or more classes inheriting from Element which each have an attribute named someValue, where each attribute has a different primitive type.
A select over all Elements (and subclasses) joins the tables of the Element subclasses and then tries to merge the someValue columns. As each column has a different type, the engine gives the merged column some type (e.g. Long) and tries to cast all values from the other columns to the chosen type.
One DB engine merged the columns into a Long column; this resulted in the above error. Another DB engine merged the columns into a Varchar column; this resulted in no error, because all my values could easily be converted to Varchar.
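A sketch of the conflicting mapping described above (LongElement is a made-up second subclass for illustration; both subclasses declare a someValue field, but with different types, so the union over the per-class tables cannot agree on a column type):

```java
@Entity
public class StringElement extends Element {
    private String someValue;   // maps to a VARCHAR column in its own table
    // getValue()/setValue() omitted
}

@Entity
public class LongElement extends Element {
    private long someValue;     // maps to a BIGINT column in its own table
    // getValue()/setValue() omitted
}

// With InheritanceType.TABLE_PER_CLASS, "select e from Element e" becomes
// a UNION over both tables; the two someValue columns are merged into one,
// and values from the other table must be cast to the type the engine picks.
```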