I'm having some trouble with a Java program. In my case, I'm using Hibernate criteria queries through Spring's getHibernateTemplate, without direct access to the session object.
I have a parent-child relationship mapped with JPA annotations. Here is my DAO method:
public List<Child> findByParent(final Parent parent)
{
return getHibernateTemplate().executeFind(new HibernateCallback<List<Child>>()
{
@Override
public List<Child> doInHibernate(Session session) throws HibernateException, SQLException
{
return session.createCriteria(Child.class)
.add(Restrictions.eq("parent", parent))
.list();
}
});
}
I have marked parent as final because I'm passing it to the anonymous class. When I run the code, Hibernate does perform the query (... where parentId = ?) but I get no results. Running the same query in MySQL with the correct parentId does return results.
When debugging, I see that Eclipse can't inspect the value of parent from within the anonymous class. What's wrong? How do I fix this?
[edit] Here are the POJOs. I have no control over Parent (not even the source; I just pressed F3 to auto-decompile it).
@Entity
@SequenceGenerator(name = "SEQ_STORE", sequenceName = "SEQ_CHILD")
@Table(name = "TA_CHILD", uniqueConstraints = { @UniqueConstraint(columnNames = { "A_FIELD", "PARENT_ID" }) })
public class Immobile
{
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
@Column(name = "CHILD_ID")
protected int id;
@JoinColumn(name = "PARENT_ID")
@ManyToOne(fetch = FetchType.LAZY, optional = false, targetEntity = Organization.class)
protected Parent parent;
@GeneratedValue
@Column(name = "PARENT_ID", insertable = false, updatable = false)
protected String parentId;
}
// Compiled from Parent.java (version 1.6 : 50.0, super bit)
@javax.persistence.Entity
@javax.persistence.Table(name="TA_PARENT")
public class org.Partent extends org.ExtensibleBase implements java.io.Serializable {
// Field descriptor #161 J
private static final long serialVersionUID = 1L;
// Field descriptor #166 Ljava/lang/String;
@javax.persistence.Id
@javax.persistence.Column(name="PARENT_ID",
nullable=false,
length=(int) 20)
private java.lang.String id;
}
In the restriction, you have to add a criterion on the parent entity's id.
Try the code below:
public List<Child> findByParent(final Parent parent)
{
return getHibernateTemplate().executeFind(new HibernateCallback<List<Child>>()
{
@Override
public List<Child> doInHibernate(Session session) throws HibernateException, SQLException
{
return session.createCriteria(Child.class)
.add(Restrictions.eq("parent.id", parent.getId()))
.list();
}
});
}
There was no problem in the code. After populating the tables from MySQL Workbench, I had simply forgotten to commit the transaction (or to leave auto-commit enabled).
Both fragments (mine and Pritesh's) are correct and return all results.
Problem solved, but that doesn't explain Eclipse being unable to evaluate the parent variable, which appears to be out of scope.
In our Spring Boot application, I am trying to save an aggregate that consists of a root entity (ParentEntity) and a Set of child entities (ChildEntity).
The intention is that all operations are done through the aggregate, so there is no need for a repository for ChildEntity; the ParentEntity is supposed to manage all save or update operations.
This is what the entities look like:
@Entity
@Table(name = "tab_parent", schema = "test")
public class ParentEntity implements Serializable {
@Id
@Column(name = "parent_id")
@GeneratedValue(strategy = GenerationType.IDENTITY)
private Integer parentId;
@Column(name = "description")
private String description;
@Column(name = "created_datetime", updatable = false, nullable = false)
@ColumnTransformer(write = "COALESCE(?,CURRENT_TIMESTAMP)")
private OffsetDateTime created;
@Column(name = "last_modified_datetime", nullable = false)
@ColumnTransformer(write = "COALESCE(CURRENT_TIMESTAMP,?)")
private OffsetDateTime modified;
@OneToMany(fetch = FetchType.EAGER, cascade = CascadeType.ALL, orphanRemoval = true, mappedBy = "parent")
private Set<ChildEntity> children;
// constructor and other getters and setters
public void setChildren(final Set<ChildEntity> children) {
this.children = new HashSet<>(children.size());
for (final ChildEntity child : children) {
this.addChild(child);
}
}
public ParentEntity addChild(final ChildEntity child) {
this.children.add(child);
child.setParent(this);
return this;
}
public ParentEntity removeChild(final ChildEntity child) {
this.children.remove(child);
child.setParent(null);
return this;
}
}
@Entity
@DynamicUpdate
@Table(name = "tab_child", schema = "test")
public class ChildEntity implements Serializable {
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "child_id")
private Integer childId;
@Column(name = "language_id")
private String languageId;
@Column(name = "text")
private String text;
@Column(name = "created_datetime", updatable = false, nullable = false)
@ColumnTransformer(write = "COALESCE(?,CURRENT_TIMESTAMP)")
public OffsetDateTime created;
@Column(name = "last_modified_datetime", nullable = false)
@ColumnTransformer(write = "COALESCE(CURRENT_TIMESTAMP,?)")
public OffsetDateTime modified;
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "parent_id", updatable = false)
private ParentEntity parent;
// constructor and other getters and setters
public ParentEntity getParent() {
return this.parent;
}
public void setParent(final ParentEntity parent) {
this.parent = parent;
}
}
This is the store method to save or update the entities:
public Integer merge(final ParentDomainObject parentDomainObject) {
final ParentEntity parentEntity =
this.mapper.toParentEntity(parentDomainObject);
final ParentEntity result = this.entityManager.merge(parentEntity);
this.entityManager.flush();
return result.getParentId();
}
And this is the store method to retrieve the aggregate by id:
public Optional<ParentDomainObject> findById(final Integer id) {
return this.repo.findById(id).map(this.mapper::toParentDomainObject);
}
As you can see, our architecture strictly separates the store from the service layer, so the service only knows about domain objects and does not depend on Hibernate entities at all.
When updating either the child or the parent, the parent is loaded first. In the service layer, the domain object is updated (fields are set, or a child is added/removed).
Then the merge method (see code snippet) of the store is called with the updated domain object.
This works, but not completely as we want. Currently every update leads to the parent and EVERY child entity being saved, even if all fields remained the same. We added the @DynamicUpdate annotation, and then we saw that the "modified" field is the problem.
We use a @ColumnTransformer to have the database set the date. Now, even if you call the service's update method without changing anything, Hibernate generates an update query for EVERY object, which updates only the modified field.
The worst thing about it is that, since every object is saved, every modified date is changed to the current date as well. But we need information about exactly which object really changed and when.
Is there any way to tell Hibernate that this column should not be taken into account when deciding what to update? Of course, if a field did change, the update should still set the modified field.
UPDATE:
My second approach, after @Christian Beikov mentioned the use of @org.hibernate.annotations.Generated( GenerationTime.ALWAYS ),
is the following:
Instead of @Generated (which uses @ValueGenerationType( generatedBy = GeneratedValueGeneration.class )),
I created my own annotations, which use custom AnnotationValueGeneration implementations:
@ValueGenerationType(generatedBy = CreatedTimestampGeneration.class)
@Retention(RetentionPolicy.RUNTIME)
public @interface InDbCreatedTimestamp {
}
public class CreatedTimestampGeneration
implements AnnotationValueGeneration<InDbCreatedTimestamp> {
@Override
public void initialize(final InDbCreatedTimestamp annotation, final Class<?> propertyType) {
}
@Override
public GenerationTiming getGenerationTiming() {
return GenerationTiming.INSERT;
}
@Override
public ValueGenerator<?> getValueGenerator() {
return null;
}
@Override
public boolean referenceColumnInSql() {
return true;
}
@Override
public String getDatabaseGeneratedReferencedColumnValue() {
return "current_timestamp";
}
}
@ValueGenerationType(generatedBy = ModifiedTimestampGeneration.class)
@Retention(RetentionPolicy.RUNTIME)
public @interface InDbModifiedTimestamp {
}
public class ModifiedTimestampGeneration
implements AnnotationValueGeneration<InDbModifiedTimestamp> {
@Override
public void initialize(final InDbModifiedTimestamp annotation, final Class<?> propertyType) {
}
@Override
public GenerationTiming getGenerationTiming() {
return GenerationTiming.ALWAYS;
}
@Override
public ValueGenerator<?> getValueGenerator() {
return null;
}
@Override
public boolean referenceColumnInSql() {
return true;
}
@Override
public String getDatabaseGeneratedReferencedColumnValue() {
return "current_timestamp";
}
}
I now use these annotations in my entities instead of the @ColumnTransformer annotations.
This works flawlessly when I insert a new ChildEntity via addChild(): the timestamps of the other entities of the aggregate are no longer all updated; only the timestamps of the new child are set.
In other words, the InDbCreatedTimestamp works as it should.
Sadly, the InDbModifiedTimestamp does not. Because of GenerationTiming.ALWAYS, I expected the timestamp to be generated at database level every time an INSERT or UPDATE is issued. If I change a field of a ChildEntity and then save the aggregate, an update statement is generated only for this one database row, as expected. However, the last_modified_datetime column is not updated, which is surprising.
It seems that this is unfortunately still an open bug. This issue describes my problem precisely: Link
Can someone provide a solution for how to get this database function executed on update as well (without using database triggers)?
You could try to use @org.hibernate.annotations.Generated( GenerationTime.ALWAYS ) on these fields and use a database trigger or default expression to create the value. This way, Hibernate will never write the field, but read it after insert/update.
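For reference, a minimal sketch of that suggested mapping on the modified field, assuming the database keeps the column current via a trigger or DEFAULT expression (the column name is taken from the question; this is an illustration, not the answer's exact code):
import org.hibernate.annotations.Generated;
import org.hibernate.annotations.GenerationTime;

// Hibernate never writes this column; it re-reads it after insert/update.
// The database (trigger or DEFAULT expression) is assumed to maintain the value.
@Generated(GenerationTime.ALWAYS)
@Column(name = "last_modified_datetime", insertable = false, updatable = false)
private OffsetDateTime modified;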
Overall this has a few downsides though (need the trigger, need a select after insert/update), so I think this is a perfect use case for Blaze-Persistence Entity Views.
I created the library to allow easy mapping between JPA models and custom interface or abstract class defined models, something like Spring Data Projections on steroids. The idea is that you define your target structure(domain model) the way you like and map attributes(getters) via JPQL expressions to the entity model.
A DTO/domain model for your use case could look like the following with Blaze-Persistence Entity-Views:
@EntityView(ParentEntity.class)
@UpdatableEntityView
public interface ParentDomainObject {
@IdMapping
Integer getParentId();
OffsetDateTime getModified();
void setModified(OffsetDateTime modified);
String getDescription();
void setDescription(String description);
Set<ChildDomainObject> getChildren();
@PreUpdate
default void preUpdate() {
setModified(OffsetDateTime.now());
}
@EntityView(ChildEntity.class)
@UpdatableEntityView
interface ChildDomainObject {
@IdMapping
Integer getChildId();
String getName();
}
}
Querying is a matter of applying the entity view to a query, the simplest being just a query by id.
ParentDomainObject a = entityViewManager.find(entityManager, ParentDomainObject.class, id);
The Spring Data integration allows you to use it almost like Spring Data Projections: https://persistence.blazebit.com/documentation/entity-view/manual/en_US/index.html#spring-data-features
Page<ParentDomainObject> findAll(Pageable pageable);
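Such a method would typically sit on a Spring Data repository for the entity, with the entity view as the return type. A hypothetical interface could look like this (it assumes the Blaze-Persistence Spring Data integration is configured; the repository and method names are illustrative):
import java.util.Optional;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import org.springframework.data.repository.Repository;

// ParentEntity is the JPA entity, ParentDomainObject the updatable entity view.
public interface ParentDomainObjectRepository extends Repository<ParentEntity, Integer> {

    // Returns entity views, so only the mapped attributes are selected.
    Page<ParentDomainObject> findAll(Pageable pageable);

    Optional<ParentDomainObject> findByParentId(Integer parentId);
}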
The best part is, it will only fetch the state that is actually necessary! It also supports writing/mapping back to the persistence model in an efficient manner. Since it does dirty tracking for you, it will only flush changes if the object is actually dirty.
public Integer merge(final ParentDomainObject parentDomainObject) {
this.entityViewManager.save(this.entityManager, parentDomainObject);
this.entityManager.flush();
return parentDomainObject.getParentId();
}
I work with Hibernate and am trying to optimize the loading of foreign entities annotated with
@ManyToOne(fetch = FetchType.LAZY)
I don't want to retrieve the foreign entity during the Hibernate query, so I use the LAZY fetch type.
Later (after the session is already closed), I want to get that foreign entity, but using a tool other than Hibernate (another cached DAO (GuavaCache) that already stores the foreign entity).
Of course, I immediately got a LazyInitializationException.
I can't replace the @ManyToOne annotation with @Transient, because there is too much legacy HQL code which stops working after removing @ManyToOne.
Somewhere, somebody advised making the getter method final and not accessing the entity field directly, only through the getter. Here is an example:
private int foreignId;
@Basic
@Column(name = "foregn_id")
public int getForeignId() { return foreignId;}
public void setForeignId(int id) { this.foreignId = id; }
// private DBForeignEntity foreignEntity; no more sense to have this field
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "foregn_id", referencedColumnName = "id", nullable = false, insertable = false, updatable = false)
public final DBForeignEntity getForeign() {
// return foreignEntity; deprecated usage! Hibernate will create proxy object and throw LazyInitException after session was closed
return getFromCache(getForeignId());
}
public void setForeign(DBForeignEntity foreignEntity) {
// this.foreignEntity = foreignEntity; no more sense in having a setter at all
}
This ugly solution excludes any ability to persist nested entities, because there is no setter for the foreign entity anymore!
Is there another solution that prevents Hibernate from creating a proxy object for my entity?
How can I avoid the LazyInitializationException if the session was closed?
Are there any bad consequences of not proxying in this case?
In the current situation Hibernate proxies only the child (foreignEntity) entity and does not proxy the current (parent) entity. Thus, there is no problem with checking the instance of the foreign entity and replacing it with a custom-loaded one:
private int foreignId;
@Basic
@Column(name = "foregn_id")
public int getForeignId() { return foreignId;}
public void setForeignId(int id) { this.foreignId = id; }
private DBForeignEntity foreignEntity;
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "foregn_id", referencedColumnName = "id", nullable = false, insertable = false, updatable = false)
public final DBForeignEntity getForeignEntity() {
if (foreignEntity instanceof HibernateProxy) {
foreignEntity = getFromCache(foreignId);
}
return foreignEntity;
}
public void setForeign(DBForeignEntity foreignEntity) {
this.foreignEntity = foreignEntity;
}
After this change, there is no LazyInitializationException when invoking
DBParentEntity.getForeignEntity()
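For completeness, the getFromCache(...) helper used above could be backed by a Guava LoadingCache along these lines (a hypothetical sketch; ForeignEntityCache, ForeignEntityDao and its findById method are assumed names, not an existing API):
import com.google.common.cache.CacheBuilder;
import com.google.common.cache.CacheLoader;
import com.google.common.cache.LoadingCache;

public final class ForeignEntityCache {
    private final LoadingCache<Integer, DBForeignEntity> cache;

    public ForeignEntityCache(final ForeignEntityDao dao) {
        this.cache = CacheBuilder.newBuilder()
            .maximumSize(1_000)
            .build(new CacheLoader<Integer, DBForeignEntity>() {
                @Override
                public DBForeignEntity load(Integer id) {
                    // loaded outside any Hibernate session, so no proxy is involved
                    return dao.findById(id);
                }
            });
    }

    public DBForeignEntity getFromCache(int foreignId) {
        return cache.getUnchecked(foreignId);
    }
}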
I have a hibernate entity with one-to-many association:
@Entity
public class Parent {
@OneToMany(mappedBy = "parent", fetch = FetchType.LAZY)
@Cascade(CascadeType.ALL)
private Set<Child> children = new HashSet<Child>();
@Version
private Date version;
}
@Entity
public class Child {
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "PARENT_ID")
private Parent parent;
@Basic
private String key;
}
*some annotations removed for clarity
The Child entity maps to a table with a composite primary key (KEY and PARENT_ID). The problem is that when two users add the same Child (with the same key) to the same Parent, the cascade save (session.saveOrUpdate(parent)) fails with a violation of the Child's primary key instead of an optimistic lock failure.
If users change some other property of the Parent entity in addition to the collection, the optimistic lock works fine.
I could add some fictive property to the Parent class and change it every time the collection changes, and it would do the trick, but it looks like a hack.
Or I could replace the composite primary key with a surrogate one (by adding @Id).
The question is: What is the recommended approach of implementing optimistic locking in such a case?
Could be related to Hibernate @Version causing database foreign key constraint failure.
Only unidirectional collection changes are going to be propagated to the parent entity version. Because you are using a bidirectional association, it's the @ManyToOne side that will control this association, so adding/removing an entity in the parent-side collection is not going to affect the parent entity version.
However, you can still propagate changes from child entities to parent entities. This requires you to propagate the OPTIMISTIC_FORCE_INCREMENT lock whenever the child entity is modified.
In short, you need to have all your entities implementing a RootAware interface:
public interface RootAware<T> {
T root();
}
@Entity(name = "Post")
@Table(name = "post")
public class Post {
@Id
private Long id;
private String title;
@Version
private int version;
//Getters and setters omitted for brevity
}
@Entity(name = "PostComment")
@Table(name = "post_comment")
public class PostComment
implements RootAware<Post> {
@Id
private Long id;
@ManyToOne(fetch = FetchType.LAZY)
private Post post;
private String review;
//Getters and setters omitted for brevity
@Override
public Post root() {
return post;
}
}
@Entity(name = "PostCommentDetails")
@Table(name = "post_comment_details")
public class PostCommentDetails
implements RootAware<Post> {
@Id
private Long id;
@ManyToOne(fetch = FetchType.LAZY)
@MapsId
private PostComment comment;
private int votes;
//Getters and setters omitted for brevity
@Override
public Post root() {
return comment.getPost();
}
}
Then, you need two event listeners:
public static class RootAwareInsertEventListener
implements PersistEventListener {
private static final Logger LOGGER =
LoggerFactory.getLogger(RootAwareInsertEventListener.class);
public static final RootAwareInsertEventListener INSTANCE =
new RootAwareInsertEventListener();
@Override
public void onPersist(PersistEvent event) throws HibernateException {
final Object entity = event.getObject();
if(entity instanceof RootAware) {
RootAware rootAware = (RootAware) entity;
Object root = rootAware.root();
event.getSession().lock(root, LockMode.OPTIMISTIC_FORCE_INCREMENT);
LOGGER.info("Incrementing {} entity version because a {} child entity has been inserted", root, entity);
}
}
@Override
public void onPersist(PersistEvent event, Map createdAlready)
throws HibernateException {
onPersist(event);
}
}
and a RootAwareUpdateAndDeleteEventListener, which does the same whenever a RootAware child entity is updated or deleted.
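A minimal sketch of what that second listener could look like (an assumption rather than the answer's exact code; it uses Hibernate's FlushEntityEventListener SPI and assumes it is appended after the default flush-entity listener, so the dirty properties have already been resolved):
// imports assumed: org.hibernate.event.spi.*, org.hibernate.engine.spi.EntityEntry,
// org.hibernate.engine.spi.Status, org.hibernate.LockMode, org.hibernate.HibernateException
public static class RootAwareUpdateAndDeleteEventListener
        implements FlushEntityEventListener {

    public static final RootAwareUpdateAndDeleteEventListener INSTANCE =
        new RootAwareUpdateAndDeleteEventListener();

    @Override
    public void onFlushEntity(FlushEntityEvent event) throws HibernateException {
        final Object entity = event.getEntity();
        final EntityEntry entry = event.getEntityEntry();
        if (entity instanceof RootAware) {
            // Consider the child changed if it is scheduled for deletion
            // or the flush found dirty properties on it.
            boolean deleted = entry.getStatus() == Status.DELETED;
            boolean updated = event.getDirtyProperties() != null
                && event.getDirtyProperties().length > 0;
            if (deleted || updated) {
                Object root = ((RootAware<?>) entity).root();
                // Force a version increment on the aggregate root.
                event.getSession().lock(root, LockMode.OPTIMISTIC_FORCE_INCREMENT);
            }
        }
    }
}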
which you can register as follows:
public class RootAwareEventListenerIntegrator
implements org.hibernate.integrator.spi.Integrator {
public static final RootAwareEventListenerIntegrator INSTANCE =
new RootAwareEventListenerIntegrator();
@Override
public void integrate(
Metadata metadata,
SessionFactoryImplementor sessionFactory,
SessionFactoryServiceRegistry serviceRegistry) {
final EventListenerRegistry eventListenerRegistry =
serviceRegistry.getService( EventListenerRegistry.class );
eventListenerRegistry.appendListeners(EventType.PERSIST, RootAwareInsertEventListener.INSTANCE);
eventListenerRegistry.appendListeners(EventType.FLUSH_ENTITY, RootAwareUpdateAndDeleteEventListener.INSTANCE);
}
@Override
public void disintegrate(
SessionFactoryImplementor sessionFactory,
SessionFactoryServiceRegistry serviceRegistry) {
//Do nothing
}
}
and then supply the RootAwareEventListenerIntegrator via a Hibernate configuration property:
configuration.put(
"hibernate.integrator_provider",
(IntegratorProvider) () -> Collections.singletonList(
RootAwareEventListenerIntegrator.INSTANCE
)
);
Now, when you modify a PostCommentDetails entity:
PostCommentDetails postCommentDetails = entityManager.createQuery(
"select pcd " +
"from PostCommentDetails pcd " +
"join fetch pcd.comment pc " +
"join fetch pc.post p " +
"where pcd.id = :id", PostCommentDetails.class)
.setParameter("id", 2L)
.getSingleResult();
postCommentDetails.setVotes(15);
The parent Post entity version is modified as well:
SELECT pcd.comment_id AS comment_2_2_0_ ,
pc.id AS id1_1_1_ ,
p.id AS id1_0_2_ ,
pcd.votes AS votes1_2_0_ ,
pc.post_id AS post_id3_1_1_ ,
pc.review AS review2_1_1_ ,
p.title AS title2_0_2_ ,
p.version AS version3_0_2_
FROM post_comment_details pcd
INNER JOIN post_comment pc ON pcd.comment_id = pc.id
INNER JOIN post p ON pc.post_id = p.id
WHERE pcd.comment_id = 2
UPDATE post_comment_details
SET votes = 15
WHERE comment_id = 2
UPDATE post
SET version = 1
where id = 1 AND version = 0
For me it was enough to set the OptimisticLock annotation:
@OneToMany(mappedBy = "parent", cascade = CascadeType.ALL, orphanRemoval = true)
@OptimisticLock(excluded = false)
private Set<Child> children = new HashSet<Child>();
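If you prefer to trigger the version bump explicitly per use case instead of relying on mapping annotations or event listeners, a manual lock request on the parent is another option (a sketch only; it assumes an open Session and getChildren()/setParent() accessors, which are not shown in the question):
void addChildWithVersionBump(Session session, Parent parent, Child child) {
    // Force the parent's @Version to increment in this transaction,
    // even though only the children collection changes.
    session.lock(parent, LockMode.OPTIMISTIC_FORCE_INCREMENT);
    child.setParent(parent);          // assumed setter on Child
    parent.getChildren().add(child);  // assumed getter on Parent
    session.saveOrUpdate(parent);
}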
First I think you need to declare your primary key and define how the PK is generated.
Example:
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
@Column(name = "id")
private Long id;
Then, the best way to add a new child to your parent should be something like this (on the parent side):
public Child addChild() {
Child child = new Child();
if (childList == null) {
childList = new ArrayList<Child>();
}
child.setParent(this);
childList.add(child);
return child;
}
When the child already exists, simply do the same but without creating a new Child.
I think it should resolve some of your problems.
I am trying to configure this @OneToMany and @ManyToOne relationship but it's simply not working, and I'm not sure why. I have done this before on other projects, but somehow it's not working with my current configuration. Here's the code:
public class Parent {
@OneToMany(mappedBy = "ex", fetch = FetchType.LAZY, cascade = CascadeType.ALL)
private List<Child> myChilds;
public List<Child> getMyChilds() {
return myChilds;
}
}
public class Child {
@Id
@ManyToOne(fetch = FetchType.LAZY)
private Parent ex;
@Id
private String a;
@Id
private String b;
public Parent getParent(){
return ex;
}
}
At first, I thought it could be the triple @Id annotation that was causing the malfunction, but after removing the annotations it still doesn't work. So, if anyone has any idea: I am using EclipseLink 2.0.
I just tried to execute the following code against some existing records, and it always returns s == 0:
Parent p = new Parent();
Integer s = p.getMyChilds().size();
Why?
The problem is most probably in your saving code: you must not be setting the parent object reference on the child you want to save. It is not your retrieval or entity mappings per se.
That can be confirmed from the database rows, which most likely have null in the foreign key column of your child table. E.g., to save it properly:
Parent p = new Parent();
Child child = new Child();
p.setChild(child);
child.setParent(p);
save(p);
PS: It is good practice to use @JoinColumn(name = "fk_parent_id", nullable = false) with the @ManyToOne annotation. The nullable = false constraint would have raised an error at save time instead of silently storing a null foreign key, which is what makes the children go missing when you try to retrieve them.
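Applied to the question's Child entity, that suggestion would look roughly like this (a sketch; the column name is just the example from the note above):
// nullable = false makes a missing parent fail fast at insert time
// instead of silently persisting a NULL foreign key.
@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "fk_parent_id", nullable = false)
private Parent ex;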
All entities need to have an @Id field and an empty constructor.
If you use custom SQL scripts to initialize your database, you need to add the @JoinColumn annotation on each field that maps to a foreign key:
example :
class Parent {
@Id
@GeneratedValue(strategy=GenerationType.IDENTITY)
private int id;
public Parent() {}
/* Getters & Setters */
}
class Child {
@Id
@GeneratedValue(strategy=GenerationType.IDENTITY)
private int id;
/* name="<tablename>_<column>" */
@JoinColumn(name="Parent_id", referencedColumnName="id")
private int foreignParentKey;
public Child () {}
}
fetch = FetchType.LAZY
Your collection is not loaded, and the transaction has already ended.
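If that is the cause, one way around it is to fetch the children in the same query so the collection is initialized before the persistence context closes (a sketch in JPQL, using the question's entity and field names and an assumed EntityManager em):
List<Parent> parents = em.createQuery(
        "SELECT DISTINCT p FROM Parent p LEFT JOIN FETCH p.myChilds",
        Parent.class)
    .getResultList();
// the children collections are initialized here, so reading them after the
// EntityManager closes no longer triggers lazy loading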
Can anyone tell me whether Hibernate supports associations as the primary key of an entity? I thought this would be supported, but I am having a lot of trouble getting any kind of mapping that represents this to work. In particular, with the straightforward mapping below:
@Entity
public class EntityBar
{
@Id
@OneToOne(optional = false, mappedBy = "bar")
EntityFoo foo;
// other stuff
}
I get an org.hibernate.MappingException: "Could not determine type for: EntityFoo, at table: ENTITY_BAR, for columns: [org.hibernate.mapping.Column(foo)]"
Diving into the code it seems the ID is always considered a Value type; i.e. "anything that is persisted by value, instead of by reference. It is essentially a Hibernate Type, together with zero or more columns." I could make my EntityFoo a value type by declaring it serializable, but I wouldn't expect this would lead to the right outcome either.
I would have thought that Hibernate would consider the type of the column to be integer (or whatever the actual type of the parent's ID is), just like it would with a normal one-to-one link, but this doesn't appear to kick in when I also declare it an ID. Am I going beyond what is possible by trying to combine @OneToOne with @Id? And if so, how could one model this relationship sensibly?
If the goal is to have a shared primary key, what about this (inspired by the sample of Java Persistence With Hibernate and tested on a pet database):
@Entity
public class User {
@Id
@GeneratedValue
private Long id;
@OneToOne(cascade = CascadeType.ALL)
@PrimaryKeyJoinColumn
private Address shippingAddress;
//...
}
This is the "parent" class that get inserted first and gets a generated id. The Address looks like this:
@Entity
public class Address implements Serializable {
@Id @GeneratedValue(generator = "myForeignGenerator")
@org.hibernate.annotations.GenericGenerator(
name = "myForeignGenerator",
strategy = "foreign",
parameters = @Parameter(name = "property", value = "user")
)
@Column(name = "ADDRESS_ID")
private Long id;
@OneToOne(mappedBy="shippingAddress")
@PrimaryKeyJoinColumn
User user;
//...
}
With the above entities, the following seems to behave as expected:
User newUser = new User();
Address shippingAddress = new Address();
newUser.setShippingAddress(shippingAddress);
shippingAddress.setUser(newUser); // Bidirectional
session.save(newUser);
When an Address is saved, the primary key value that gets inserted is the same as the primary key value of the User instance referenced by the user property.
Loading a User or an Address also just works.
Let me know if I missed something.
PS: To strictly answer the question, according to Primary Keys through OneToOne Relationships:
JPA 1.0 does not allow @Id on a OneToOne or ManyToOne, but JPA 2.0 does.
But, the JPA 1.0 compliant version of Hibernate
allows the @Id annotation to be used on a OneToOne or ManyToOne mapping*.
I couldn't get this to work with Hibernate EM 3.4 though (it worked with Hibernate EM 3.5.1, i.e. the JPA 2.0 implementation). Maybe I did something wrong.
Anyway, using a shared primary key seems to provide a valid solution.
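For reference, the JPA 2.0 way of expressing this without the Hibernate-specific foreign generator is a derived identity with @MapsId, roughly like this (a sketch; the column name is an assumption):
@Entity
public class EntityBar {
    @Id
    private Integer id; // takes its value from the associated EntityFoo's id

    @MapsId
    @OneToOne(optional = false)
    @JoinColumn(name = "foo_id")
    private EntityFoo foo;

    // other stuff
}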
Yes, that is possible.
Look at the following example, which uses a Drivers entity and a DriversId class as the id for Drivers.
@Entity
public class Drivers {
private DriversId id; //The ID which is located in another class
public Drivers() {
}
@EmbeddedId
@AttributeOverrides({
@AttributeOverride(name = "personId", column = @Column(name = "person_id", nullable = false))})
@NotNull
public DriversId getId() {
return this.id;
}
//rest of class
}
Here we are using personId as the id for Drivers.
And the DriversId class:
//composite-id class must implement Serializable
@Embeddable
public class DriversId implements java.io.Serializable {
private static final long serialVersionUID = 462977040679573718L;
private int personId;
public DriversId() {
}
public DriversId(int personId) {
this.personId = personId;
}
@Column(name = "person_id", nullable = false)
public int getPersonId() {
return this.personId;
}
public void setPersonId(int personId) {
this.personId = personId;
}
public boolean equals(Object other) {
if ((this == other))
return true;
if ((other == null))
return false;
if (!(other instanceof DriversId))
return false;
DriversId castOther = (DriversId) other;
return (this.getPersonId() == castOther.getPersonId());
}
public int hashCode() {
int result = 17;
result = 37 * result + this.getPersonId();
return result;
}
}
You can do this by sharing a primary key between EntityFoo and EntityBar:
@Entity
public class EntityBar
{
@Id @OneToOne
@JoinColumn(name = "foo_id")
EntityFoo foo;
// other stuff
}
@Entity
public class EntityFoo
{
@Id @GeneratedValue
Integer id;
// other stuff
}
You have to use @EmbeddedId instead of @Id here.
And EntityFoo should be @Embeddable.
Another way is to map a plain integer id, plus a OneToOne over the same column with updatable and insertable set to false.
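That last alternative would look roughly like this (a sketch; the column name "foo_id" is an assumption):
@Entity
public class EntityBar {
    @Id
    @Column(name = "foo_id")
    private Integer fooId; // plain id column holding the foreign key value

    // read-only view of the association, mapped over the same column
    @OneToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "foo_id", insertable = false, updatable = false)
    private EntityFoo foo;

    // other stuff
}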