I am new to this forum and to Hibernate. I am having a problem with a Hibernate many-to-one mapping.
@ManyToOne(cascade = CascadeType.ALL, fetch = FetchType.EAGER)
@JoinColumn(name = "DTE_ID")
@NotNull
private Dte raisedByDte;
This is the code I am using in the main object; the foreign key is DTE_ID. But when I try to save, it updates all fields in the referenced table. My referenced object is as follows:
@Entity
@Table(name = "DTE_MASTERS", uniqueConstraints = @UniqueConstraint(columnNames = "DTE_NAME"))
public class Dte {
@Id
@Column(name = "DTE_ID", updatable = false, nullable = false)
private int dte_id;
@Column(name = "DTE_NAME")
private String dte_name;
public Dte() {
super();
// TODO Auto-generated constructor stub
}
public Dte(int dte_id, String dte_name) {
super();
this.dte_id = dte_id;
this.dte_name = dte_name;
}
public int getDte_id() {
return dte_id;
}
public void setDte_id(int dte_id) {
this.dte_id = dte_id;
}
public String getDte_name() {
return dte_name;
}
public void setDte_name(String dte_name) {
this.dte_name = dte_name;
}
}
I want to prevent DTE_MASTERS from being updated when I am inserting. Can somebody please guide me through this?
You have to remove the cascade option from the mapping. It applies the same operation that was performed on the parent object to the association.
I am guessing that you are calling merge() on the main object, and with that option the merge() (which results in an update if the entity is not new) will also be cascaded to the Dte dependency:
@ManyToOne(fetch = FetchType.EAGER)
Also, keep in mind that if you change any non-transient field within the Dte entity once it is loaded with the main entity, all of those changes will be committed implicitly when the transaction ends.
To prevent that, you would need to call session.evict(dte); so that any changes will not get persisted to the database, even if they were made within the transactional method.
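As a rough illustration, a minimal sketch of that approach (the session handling and the owning entity's name are assumptions, not taken from the question):
// Sketch only: load the existing Dte, attach it to the new main entity,
// then evict it so accidental changes to Dte are not flushed.
Session session = sessionFactory.getCurrentSession();
MainEntity main = new MainEntity(); // hypothetical owning entity
Dte dte = session.get(Dte.class, dteId); // existing row from DTE_MASTERS
main.setRaisedByDte(dte);
session.evict(dte); // Dte becomes detached; changes to it will not be persisted
session.persist(main); // only the main row (with its DTE_ID foreign key) is inserted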
In our Spring Boot application, I am trying to save an aggregate that consists of a root entity (ParentEntity) and a Set of child entities (ChildEntity).
The intention is that all operations are done through the aggregate, so there is no need for a repository for ChildEntity; the ParentEntity is supposed to manage all save and update operations.
This is what the entities look like:
@Entity
@Table(name = "tab_parent", schema = "test")
public class ParentEntity implements Serializable {
@Id
@Column(name = "parent_id")
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Integer parentId;
@Column(name = "description")
private String description;
@Column(name = "created_datetime", updatable = false, nullable = false)
@ColumnTransformer(write = "COALESCE(?,CURRENT_TIMESTAMP)")
private OffsetDateTime created;
@Column(name = "last_modified_datetime", nullable = false)
@ColumnTransformer(write = "COALESCE(CURRENT_TIMESTAMP,?)")
private OffsetDateTime modified;
@OneToMany(fetch = FetchType.EAGER, cascade = CascadeType.ALL, orphanRemoval = true, mappedBy = "parent")
private Set<ChildEntity> children;
// constructor and other getters and setters
public void setChildren(final Set<ChildEntity> children) {
this.children = new HashSet<>(children.size());
for (final ChildEntity child : children) {
this.addChild(child);
}
}
public ParentEntity addChild(final ChildEntity child) {
this.children.add(child);
child.setParent(this);
return this;
}
public ParentEntity removeChild(final ChildEntity child) {
this.children.remove(child);
child.setParent(null);
return this;
}
}
@Entity
@DynamicUpdate
@Table(name = "tab_child", schema = "test")
public class ChildEntity implements Serializable {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "child_id")
private Integer childId;
@Column(name = "language_id")
private String languageId;
@Column(name = "text")
private String text;
@Column(name = "created_datetime", updatable = false, nullable = false)
@ColumnTransformer(write = "COALESCE(?,CURRENT_TIMESTAMP)")
public OffsetDateTime created;
@Column(name = "last_modified_datetime", nullable = false)
@ColumnTransformer(write = "COALESCE(CURRENT_TIMESTAMP,?)")
public OffsetDateTime modified;
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "parent_id", updatable = false)
private ParentEntity parent;
// constructor and other getters and setters
public ParentEntity getParent() {
return this.parent;
}
public void setParent(final ParentEntity parent) {
this.parent = parent;
}
}
This is the store method to save or update the entities:
public Integer merge(final ParentDomainObject parentDomainObject) {
final ParentEntity parentEntity =
this.mapper.toParentEntity(parentDomainObject);
final ParentEntity result = this.entityManager.merge(parentEntity);
this.entityManager.flush();
return result.getParentId();
}
And this is the store method to retrieve the aggregate by id:
public Optional<ParentDomainObject> findById(final Integer id) {
return this.repo.findById(id).map(this.mapper::toParentDomainObject);
}
As you can see, our architecture strictly separates the store from the service layer, so the service only knows about domain objects and does not depend on Hibernate entities at all.
When updating either the child or the parent, the parent is loaded first. In the service layer, the domain object is updated (fields are set, or a child is added/removed).
Then the merge method (see code snippet) of the store is called with the updated domain object.
This works, but not completely the way we want. Currently every update leads to the parent and EVERY child entity being saved, even if all fields remained the same. We added the @DynamicUpdate annotation, and then saw that the "modified" field is the problem.
We use a @ColumnTransformer to have the database set the date. Now even if you call the service's update method without changing anything, Hibernate generates an update query for EVERY object, which updates only the modified field.
The worst thing about it is that since every object is saved, every modified date is changed to the current date as well. But we need information about exactly which object really changed and when.
Is there any way to tell Hibernate that this column should not be taken into account when deciding what to update? Of course, if a field did change, the update operation should still update the modified field.
UPDATE:
My second approach, after @Christian Beikov mentioned the use of @org.hibernate.annotations.Generated( GenerationTime.ALWAYS ), is the following:
Instead of @Generated (which uses @ValueGenerationType( generatedBy = GeneratedValueGeneration.class )),
I created my own annotations, which use custom AnnotationValueGeneration implementations:
@ValueGenerationType(generatedBy = CreatedTimestampGeneration.class)
@Retention(RetentionPolicy.RUNTIME)
public @interface InDbCreatedTimestamp {
}
public class CreatedTimestampGeneration
implements AnnotationValueGeneration<InDbCreatedTimestamp> {
@Override
public void initialize(final InDbCreatedTimestamp annotation, final Class<?> propertyType) {
}
@Override
public GenerationTiming getGenerationTiming() {
return GenerationTiming.INSERT;
}
@Override
public ValueGenerator<?> getValueGenerator() {
return null;
}
@Override
public boolean referenceColumnInSql() {
return true;
}
@Override
public String getDatabaseGeneratedReferencedColumnValue() {
return "current_timestamp";
}
}
@ValueGenerationType(generatedBy = ModifiedTimestampGeneration.class)
@Retention(RetentionPolicy.RUNTIME)
public @interface InDbModifiedTimestamp {
}
public class ModifiedTimestampGeneration
implements AnnotationValueGeneration<InDbModifiedTimestamp> {
@Override
public void initialize(final InDbModifiedTimestamp annotation, final Class<?> propertyType) {
}
@Override
public GenerationTiming getGenerationTiming() {
return GenerationTiming.ALWAYS;
}
@Override
public ValueGenerator<?> getValueGenerator() {
return null;
}
@Override
public boolean referenceColumnInSql() {
return true;
}
@Override
public String getDatabaseGeneratedReferencedColumnValue() {
return "current_timestamp";
}
}
I use these annotations in my entities instead of the @ColumnTransformer annotations now.
This works flawlessly when I insert a new ChildEntity via addChild(), as now not all timestamps of all entities of the aggregate are updated anymore. Only the timestamps of the new child are set now.
In other words, the InDbCreatedTimestamp works as it should.
Sadly, the InDbModifiedTimestamp does not. Because of GenerationTiming.ALWAYS, I expected the timestamp to be generated at the database level every time an INSERT or UPDATE is issued. If I change a field of a ChildEntity and then save the aggregate, an update statement is generated only for this one database row, as expected. However, the last_modified_datetime column is not updated, which is surprising.
It seems that this is unfortunately still an open bug. This issue describes my problem precisely: Link
Can someone suggest how to get this database function executed on update as well (without using database triggers)?
You could try to use @org.hibernate.annotations.Generated( GenerationTime.ALWAYS ) on these fields and use a database trigger or default expression to create the value. This way, Hibernate will never write the field, but will read it back after insert/update.
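For reference, a minimal sketch of how that mapping could look on the entity (the database-side default/trigger for the column is assumed to exist and is not shown):
// Sketch only: Hibernate never writes this column; it re-reads it after insert/update.
// GenerationTime.ALWAYS requires the column to be non-insertable and non-updatable.
@org.hibernate.annotations.Generated(org.hibernate.annotations.GenerationTime.ALWAYS)
@Column(name = "last_modified_datetime", nullable = false, insertable = false, updatable = false)
private OffsetDateTime modified;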
Overall this has a few downsides though (need the trigger, need a select after insert/update), so I think this is a perfect use case for Blaze-Persistence Entity Views.
I created the library to allow easy mapping between JPA models and custom interface or abstract class defined models, something like Spring Data Projections on steroids. The idea is that you define your target structure (domain model) the way you like and map attributes (getters) via JPQL expressions to the entity model.
A DTO/domain model for your use case could look like the following with Blaze-Persistence Entity-Views:
@EntityView(ParentEntity.class)
@UpdatableEntityView
public interface ParentDomainObject {
@IdMapping
Integer getParentId();
OffsetDateTime getModified();
void setModified(OffsetDateTime modified);
String getDescription();
void setDescription(String description);
Set<ChildDomainObject> getChildren();
@PreUpdate
default void preUpdate() {
setModified(OffsetDateTime.now());
}
@EntityView(ChildEntity.class)
@UpdatableEntityView
interface ChildDomainObject {
@IdMapping
Integer getChildId();
String getName();
}
}
Querying is a matter of applying the entity view to a query, the simplest being just a query by id.
ParentDomainObject a = entityViewManager.find(entityManager, ParentDomainObject.class, id);
The Spring Data integration allows you to use it almost like Spring Data Projections: https://persistence.blazebit.com/documentation/entity-view/manual/en_US/index.html#spring-data-features
Page<ParentDomainObject> findAll(Pageable pageable);
The best part is, it will only fetch the state that is actually necessary! It also supports writing/mapping back to the persistence model in an efficient manner. Since it does dirty tracking for you, it will only flush changes if the object is actually dirty.
public Integer merge(final ParentDomainObject parentDomainObject) {
this.entityViewManager.save(this.entityManager, parentDomainObject);
this.entityManager.flush();
return parentDomainObject.getParentId();
}
I'm trying to use JPA (with Hibernate) to save two entities. Spring Data is providing the interface, but I don't think it matters here.
I have a main entity called 'Model'. This model has many 'Parameter' entities linked. I'm writing a method to save a model and its parameters.
This is the method:
private void cascadeSave(Model model) {
modelRepository.save(model);
for (ParameterValue value : model.getParameterValues()) {
parameterValueRepository.save(value);
}
}
This is the problem:
When I load a Model that already existed before, add some new parameters to it, and then call this method to save both of them, something strange happens:
Before the first save (modelRepository.save) this is what the model's data looks like when debugging:
The model has 2 parameters, with filled in values (name and model are filled).
Now, after the first save in my method (modelRepository.save), this happens. Note that the object reference is a different one, so Hibernate must have done something magical and recreated the values instead of leaving them alone:
For some reason hibernate cleared all the attributes of the parameters in the set.
Now, when the new parameters are saved in the following code, it fails because of not-null constraints etc.
My question: Why does hibernate clear all of the fields?
Here are the relevant mappings:
ParameterValue
@Entity
@Table(name = "tbl_parameter_value")
@Inheritance(strategy = InheritanceType.SINGLE_TABLE)
@DiscriminatorColumn(name = "PARAMETER_TYPE")
public abstract class ParameterValue extends AbstractBaseObject {
@Column(nullable = false)
@NotBlank
private String name;
private String stringValue;
private Double doubleValue;
private Integer intValue;
private Boolean booleanValue;
@Enumerated(EnumType.STRING)
private ModelType modelParameterType;
@Column(precision = 7, scale = 6)
private BigDecimal bigDecimalValue;
@Lob
private byte[] blobValue;
ParameterValue() {
}
ParameterValue(String name) {
this.name = name;
}
ModelParameterValue
@Entity
@DiscriminatorValue(value = "MODEL")
public class ModelParameterValue extends ParameterValue {
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "model_id", foreignKey = @ForeignKey(name = "FK_VALUE_MODEL"))
private Model model;
ModelParameterValue() {
super();
}
ModelParameterValue(String name) {
super(name);
}
Model
@Entity
@Table(name = "tbl_model")
public class Model extends AbstractBaseObject implements Auditable {
@OneToMany(fetch = FetchType.LAZY, mappedBy = "model")
private Set<ModelParameterValue> parameterValues = new HashSet<>();
EDIT
I was able to reproduce this with a minimal example.
If you strip away everything Spring Data does, this is what happens under the hood (em is a JPA EntityManager):
public Model simpleTest() {
Model model = new Model("My Test Model");
em.persist(model);
model.addParameter(new Parameter("Param 1"));
em.merge(model);
for (Parameter child : model.getParameters()) {
em.persist(child);
}
return model;
}
When the merge is executed, all of the attributes of the parameters are set to null. They are actually just replaced with completely new parameters.
I guess you are using Spring Data JPA as your modelRepository. That has the following consequence.
Spring Repository Save
S save(S entity)
Saves a given entity. Use the returned instance for further operations as the save operation might have changed the entity instance completely.
So the behaviour you encountered is normal.
Code should be changed to :
model = modelRepository.save(model);
for (ParameterValue value : model.getParameterValues()) {
parameterValueRepository.save(value);
}
EDIT:
I think your saving function is broken in the sense that you do it backwards. Either use CascadeType on your relation, or save the children first.
Cascade
Cascade works like this: "If you save the Parent, save the Children; if you update the Parent, update the Children ..."
So we can put cascade on your relation like that :
@Entity
@Table(name = "tbl_model")
public class Model extends AbstractBaseObject implements Auditable {
@OneToMany(fetch = FetchType.LAZY, mappedBy = "model", cascade = CascadeType.ALL)
private Set<ModelParameterValue> parameterValues = new HashSet<>();
and then only save like this
private void cascadeSave(Model model) {
modelRepository.save(model);
// ParamValues will be saved/updated automatically if your model has changed
}
Bottom-Up save
The second option is just to save the params first and then the model with them.
private void cascadeSave(Model model) {
model.setParameterValues(
model.getParameterValues().stream()
.map(param -> parameterValueRepository.save(param))
.collect(Collectors.toSet())
);
modelRepository.save(model);
}
I haven't checked the second snippet in a compiler, but the idea is to first save the children (ParamValues), put them into the Model, and then save the Model :)
I have issues with my model classes. For example:
@Entity
@Table(name = "kreis", catalog = "quanto_portal")
@JsonIdentityInfo(generator = ObjectIdGenerators.PropertyGenerator.class, property="idKreis")
public class Kreis implements java.io.Serializable {
private Integer idKreis;
private String kreisname;
private Set<Ort> orts = new HashSet<Ort>(0);
public Kreis() {
}
public Kreis(String kreisname) {
this.kreisname = kreisname;
}
public Kreis(String kreisname, Set<Ort> orts) {
this.kreisname = kreisname;
this.orts = orts;
}
@Id
@GeneratedValue(strategy = IDENTITY)
@Column(name = "idKreis", unique = true, nullable = false)
public Integer getIdKreis() {
return this.idKreis;
}
public void setIdKreis(Integer idKreis) {
this.idKreis = idKreis;
}
@Column(name = "kreisname", nullable = false, length = 50)
public String getKreisname() {
return this.kreisname;
}
public void setKreisname(String kreisname) {
this.kreisname = kreisname;
}
//@JsonManagedReference(value="kreis-ort")
@OneToMany(fetch = FetchType.LAZY, mappedBy = "kreis")
public Set<Ort> getOrts() {
return this.orts;
}
public void setOrts(Set<Ort> orts) {
this.orts = orts;
}
}
When I query for a "Kreis" object, it also internally queries for the dependent "Orts", although lazy loading is set. Next, in the "Ort" class, a query for the dependent "Kreis" objects is issued (because it is an attribute of Ort; lazy loading is set there as well). If "Ort" has more dependent classes/attributes, for example "Persons", even the whole "Person" class is loaded. Can anyone tell me why? Do I need to set a property in Spring or initialize a specific bean?
So far I have to ignore (with @JsonIgnoreProperties) every attribute that references another class. I think that is wrong, because lazy loading should ensure that dependent objects are only loaded if I ask for them.
LAZY means lazily loaded from the database when the collection is accessed. As soon as Jackson starts serializing the object, it reads all the fields, including the orts field, which triggers the lazy loading.
If you only want to serialize certain fields, then you probably want to return a projection of some sort from your controller; the just-released Spring Data Hopper M1 supports returning projections from Spring Data repositories, and you can also use Jackson projections if you need to deal with the full entity object in your controller.
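As an illustration, a closed projection for the Kreis entity could look roughly like this (the interface and repository names are made up for the example):
// Sketch only: a closed Spring Data projection that never touches the lazy 'orts' collection.
public interface KreisSummary {
Integer getIdKreis();
String getKreisname();
}
public interface KreisRepository extends JpaRepository<Kreis, Integer> {
// Returning the projection instead of the entity avoids serializing lazy associations.
KreisSummary findByIdKreis(Integer idKreis);
}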
I have created some Hibernate mappings with Hibernate 4.3.8.
@Entity
@Table(name = ErrorEntity.TABLE_ID)
@XmlRootElement(name = ErrorEntity.XML_ROOT_TAG)
public class ErrorEntity {
/**
*
*/
private static final long serialVersionUID = 8083918635458543738L;
public static final String TABLE_ID = "Error";
public static final String ERRORCODE = "error_code";
public static final String ENV_ID = "envid";
private Integer error_code;
private Integer envId;
private EnvironmentEntity environment;
public ErrorEntity() {
}
@Id
@Column(name = ErrorEntity.ERRORCODE)
public Integer getError_code() {
return error_code;
}
public void setError_code(Integer errorcode) {
this.error_code = errorcode;
}
@Column(name = ErrorEntity.ENV_ID)
public Integer getEnvId() {
return envId;
}
public void setEnvId(Integer envId) {
this.envId = envId;
}
@ManyToOne
@JoinColumn(name = ErrorEntity.ENV_ID, referencedColumnName = EnvironmentEntity.ENV_ID, insertable = false, updatable = false)
public EnvironmentEntity getEnvironment() {
return environment;
}
public void setEnvironment(EnvironmentEntity environment) {
this.environment = environment;
}
}
As you can see the mapping property ENV_ID is mapped twice.
This way I thought I would be able to set the JoinColumn value without querying the database to get the mapped object, because I have the JoinColumn value at that point.
The value of ENV_ID is written correctly to the database, but if I query this ErrorEntity later and try to get the EnvironmentEntity, the reference is null.
ErrorEntity error = (ErrorEntity) criteria.uniqueResult();
System.out.println(error.getEnvironment().getName());
getEnvironment() returns null.
Any ideas how to achieve this?
Edit
Creating a new object with the PK set was working as expected.
Now I have a special situation where it does not work.
I need to reference another object where the join column is not the PK. I know that the value I will join on is unique, but there are also some duplicate values I will not join on.
However, Hibernate seems to be unable to map this relationship automatically.
ErrorEntity error = new ErrorEntity();
SignalEntity signal = new SignalEntity();
signal.setName(signalName);
error.setSignal(signal);
The problem is that I do not have the signalID (PK) in that situation. The other idea would be to query the database, but that's too slow.
I tried to create a composite PK with 3 columns, but this breaks the logic in another place.
Is it possible to create two independent PKs?
The ErrorEntity has two mappings for ErrorEntity.ENV_ID; unless you use @MapsId, that is a configuration issue.
You should have an env_id column in EnvironmentEntity table and just the:
@ManyToOne
@JoinColumn(name = ErrorEntity.ENV_ID, referencedColumnName = EnvironmentEntity.ENV_ID, insertable = false, updatable = false)
public EnvironmentEntity getEnvironment() {
return environment;
}
mapping in ErrorEntity.
My suggestion is to remove this:
@Column(name = ErrorEntity.ENV_ID)
public Integer getEnvId() {
return envId;
}
To set the envId directly without querying the database and requesting the whole EnvironmentEntity, you can do something like this:
errorEntity.setEnvironment(new EnvironmentEntity());
errorEntity.getEnvironment().setEnvId(envId);
This is not a JPA standard requirement but Hibernate supports it.
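As an aside (my own suggestion, not part of the original answer), if an EntityManager is available you can achieve the same thing with a lazy reference instead of a new detached instance:
// Sketch only: getReference() returns a proxy without hitting the database,
// so the association can be set from the known envId alone.
EnvironmentEntity envRef = entityManager.getReference(EnvironmentEntity.class, envId);
errorEntity.setEnvironment(envRef);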
I work with Hibernate and am trying to optimize the loading of foreign entities annotated with
@ManyToOne(fetch = FetchType.LAZY)
I don't want to retrieve the foreign entity during the Hibernate query, hence the LAZY fetch type.
Later (after the session is already closed), I want to get that foreign entity, but using a tool different from Hibernate (another cached DAO (GuavaCache) that already stores the foreign entity).
Of course, I immediately got a LazyInitializationException.
I can't replace the @ManyToOne annotation with @Transient because there is too much legacy HQL code, which no longer works after removing @ManyToOne.
Somewhere, somebody advised making the getter method final and not accessing the entity field directly, but only through the getter. Here is an example:
private int foreignId;
@Basic
@Column(name = "foregn_id")
public int getForeignId() { return foreignId; }
public void setForeignId(int id) { this.foreignId = id; }
// private DBForeignEntity foreignEntity; no more sense to have this field
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "foregn_id", referencedColumnName = "id", nullable = false, insertable = false, updatable = false)
public final DBForeignEntity getForeign() {
// return foreignEntity; deprecated usage! Hibernate would create a proxy object and throw a LazyInitializationException after the session was closed
return getFromCache(getForeignId());
}
public void setForeign(DBForeignEntity foreignEntity) {
// this.foreignEntity = foreignEntity; no more sense in having a setter at all
}
This ugly solution excludes any ability to persist nested entities, because there is no setter for the foreign entity anymore!
Is there another solution that stops Hibernate from creating a proxy object for my entity?
How can the LazyInitializationException be avoided if the session was closed?
Are there any bad consequences of not proxying in this case?
In the current situation Hibernate proxies only the child (foreignEntity) entity and does not proxy the current (parent) entity. Thus, it is straightforward to check whether the foreign entity is a proxy instance and replace it with a custom-loaded one:
private int foreignId;
@Basic
@Column(name = "foregn_id")
public int getForeignId() { return foreignId; }
public void setForeignId(int id) { this.foreignId = id; }
private DBForeignEntity foreignEntity;
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "foregn_id", referencedColumnName = "id", nullable = false, insertable = false, updatable = false)
public final DBForeignEntity getForeignEntity() {
if (foreignEntity instanceof HibernateProxy) {
foreignEntity = getFromCache(foreignId);
}
return foreignEntity;
}
public void setForeign(DBForeignEntity foreignEntity) {
this.foreignEntity = foreignEntity;
}
After this change, there is no LazyInitializationException when invoking
DBParentEntity.getForeignEntity()