I have a table defined as this:
@Entity
@Table
@Inheritance(strategy = InheritanceType.JOINED)
public class Table implements Serializable {
    @Id
    @GeneratedValue
    private Long id;
    ...
}
Then I have an inherited table:
@Entity
@Table
public class SubTable extends Table {
    ...
}
Hibernate correctly creates the two tables in my Postgres database, but it defines the on-delete action between them as NO ACTION.
How can I tell Hibernate that I want a CASCADE action on delete? For example, when I manually delete a row from table Table, I want the corresponding row in table SubTable to be deleted automatically. When I try to delete a row from Table now, I get a foreign key constraint violation error.
Hibernate can take care of the cascade deletes if you always use it to perform the deletes.
So removing a SubTable entity will succeed and it will remove both the subclass table record and the associated base class row as well.
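For example, a minimal sketch of deleting through JPA so Hibernate itself removes the rows from both tables (the persistence unit name and the id value are assumptions made for illustration):
// Hedged sketch: deleting via the EntityManager lets Hibernate delete
// from both SubTable and Table for the JOINED hierarchy.
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

public class DeleteExample {
    public static void main(String[] args) {
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("my-unit"); // placeholder unit name
        EntityManager em = emf.createEntityManager();

        em.getTransaction().begin();
        SubTable sub = em.find(SubTable.class, 1L); // assumed existing id
        if (sub != null) {
            em.remove(sub); // issues DELETEs against both SubTable and Table
        }
        em.getTransaction().commit();

        em.close();
        emf.close();
    }
}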
If you want to use SQL-level deletes instead, and assuming you are using hbm2ddl to generate the schema (which you shouldn't; use a tool such as Flyway instead), then you need to annotate SubTable with the @OnDelete annotation:
@Entity
@Table
@OnDelete(action = OnDeleteAction.CASCADE)
public class SubTable extends Table {
    ...
}
The @OneToMany annotation, by default, creates a join table unless the mappedBy element is specified.
What is the reason for this behaviour? For example, with the following entities:
@Entity
public class User {
    // ...
    @OneToMany
    private List<UserDocument> documents;
    // ...
}

@Entity
public class UserDocument {
    // ...
    @ManyToOne
    private User user;
    // ...
}
For the User entity, why doesn't Hibernate simply:
Find the field of type User in UserDocument by doing reflection on the UserDocument entity.
Infer the value of mappedBy for the @OneToMany annotation by itself?
What is the reason for not doing this and instead generating a join table as the default behaviour? Why is Hibernate (or JPA) designed this way?
A simple reason is that Hibernate cannot know for sure that a field of type User inside UserDocument corresponds to this specific User-UserDocument relation. Without a mappedBy attribute, Hibernate can only create a join table or insert a generated column into the UserDocument table. The latter, however, alters the data model and introduces more problems than it solves (distinguishing generated from declared columns, the table schema no longer matching the model class, etc.). Thus Hibernate uses a join table to store the mapping.
For example, if you want to track the last user who modified a document, you may need another many-to-one relation to User in UserDocument. This cannot be inferred and resolved just by using reflection.
@Entity
public class UserDocument {
    // ...
    @ManyToOne
    private User user;

    @ManyToOne
    private User lastModifiedBy;
    // ...
}
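For contrast, a minimal sketch of the mapping once mappedBy names the owning field, which stores the relation through UserDocument's foreign key and avoids the join table (the id field is added only to make the snippet self-contained):
import java.util.List;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.OneToMany;

@Entity
public class User {
    @Id
    @GeneratedValue
    private Long id;

    // mappedBy points at the User-typed field in UserDocument that owns the relation
    @OneToMany(mappedBy = "user")
    private List<UserDocument> documents;
}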
I've got the following entity:
@Entity
@Table(name = "ONE")
@SecondaryTable(name = "VIEW_TWO", pkJoinColumns = @PrimaryKeyJoinColumn(name = "ONE_ID"))
public class CpBracket {
    @Id
    private Long id;

    @Column(name = "progress", table = "VIEW_TWO", updatable = false, insertable = false)
    private int progress = 0;

    (...)
}
As you can see, this entity uses the table ONE and the (read-only) view VIEW_TWO. When I persist the entity, Hibernate performs an insert into the view:
insert into VIEW_TWO (ONE_ID) values (?)
It ignores the non-updatable and non-insertable column progress (that's good), but it still tries to insert a value for the ONE_ID column. As far as I know, the annotation @PrimaryKeyJoinColumn marks the selected column as insertable = false and updatable = false.
How can I prevent Hibernate from inserting rows into the secondary table (view)?
As far as I know, the annotation @PrimaryKeyJoinColumn marks selected
column as insertable=false and updatable=false.
I do not believe this can be the case: how else would records get inserted into the @SecondaryTable when it is an actual table rather than a view?
As neither @SecondaryTable nor @PrimaryKeyJoinColumn has a means to prevent the insert, it would appear that your original solution is not going to work and an alternative is required.
One option is to map VIEW_TWO as an @Entity and link it to your class CpBracket as a @OneToOne relationship with no cascade options set.
@Entity
@Table(name = "VIEW_TWO")
public class CpBracketSummaryData {
    // ...
}

@Entity
@Table(name = "ONE")
public class CpBracket {
    @OneToOne
    @PrimaryKeyJoinColumn
    private CpBracketSummaryData summaryData;

    public int getSomeValue() {
        return summaryData.getSomeValue();
    }
}
The second option would be to use the non-JPA-compliant, Hibernate-specific @Formula annotation.
@Entity
@Table(name = "ONE")
public class CpBracket {
    @Formula("native sql query")
    private int someValue;
}
Update October 2016
I have revisited this in both Hibernate 4.3.10.Final and 5.1.0.Final, and it is possible to have the view as a @SecondaryTable without the insert, if you have the correct mappings.
Scenario 1
Load an entity for editing and do not touch any fields mapped to the secondary table: no update is issued against the secondary table.
Scenario 2
Create and save a new entity and do not set any fields mapped to the secondary table: no insert is issued for the secondary table.
Scenario 3
Create or update an entity, including a field mapped to a secondary table, where this field is marked as insertable = false and updatable = false: an insert is made to the secondary table for the ID field only, which is the behaviour reported in the original question.
The issue with the mapping in the original question is that the secondary table field is a primitive type, so when saving a new entity Hibernate thinks a record has to be written to the secondary table with a value of zero.
@Column(name = "progress", table = "VIEW_TWO", updatable = false, insertable = false)
private int progress = 0;
The solution then is to replace primitives with the corresponding wrapper types and leave them as null. Then when saving a new record there is nothing to write to the secondary table and no insert will be made:
@Column(name = "progress", table = "VIEW_TWO")
private Integer progress;
I solved a similar problem with a @SecondaryTable that was a database view, so maybe this will help someone else.
The problem was the cascade delete to the @SecondaryTable when a record from the primary table was deleted.
As a solution, I implemented a RULE on the view for DELETE:
CREATE RULE on_delete AS ON DELETE TO my_view DO INSTEAD(
select 1;
)
A similar solution can be used for the INSERT and UPDATE operations on the view.
I have the following error when I'm trying to map an entity:
ORA-00942: table or view does not exist
I figured out that the problem is that Hibernate looks for the table name in lowercase letters, but Oracle stores the table name in capital letters (even though the table was created with a lowercase name).
I can fix the problem by adding the @Table and @Column annotations with the names in capital letters, but I don't want to add those annotations.
I would like to know if there is any different way to do it.
Sql:
create table foo(
id integer not null
);
alter table foo
add constraint foo_pk
primary key (id);
Entity that is not working:
@Entity
public class Foo {
    @Id
    private Long id;
    //getter and setter
}
Entity that is working:
@Entity
@Table(name = "FOO")
public class Foo {
    @Id
    private Long id;
    //getter and setter
}
Thanks!
You can define a custom NamingStrategy to perform the specific table and column name translation from your entities to the database. Here is an example of how to do it.
So, all you need is to create an implementation of the NamingStrategy interface, or extend one of the existing strategies and modify its behaviour as needed, and then register the new strategy via the Hibernate configuration parameter hibernate.ejb.naming_strategy or via the Configuration class.
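A rough sketch of such a strategy, assuming the Hibernate 4.x org.hibernate.cfg.NamingStrategy API that the hibernate.ejb.naming_strategy property refers to (newer Hibernate versions use a PhysicalNamingStrategy instead, so the method names may differ); the class name UpperCaseNamingStrategy is made up for illustration:
import org.hibernate.cfg.EJB3NamingStrategy;

// Sketch: uppercase every generated table and column name.
public class UpperCaseNamingStrategy extends EJB3NamingStrategy {

    @Override
    public String classToTableName(String className) {
        return super.classToTableName(className).toUpperCase();
    }

    @Override
    public String tableName(String tableName) {
        return super.tableName(tableName).toUpperCase();
    }

    @Override
    public String propertyToColumnName(String propertyName) {
        return super.propertyToColumnName(propertyName).toUpperCase();
    }

    @Override
    public String columnName(String columnName) {
        return super.columnName(columnName).toUpperCase();
    }
}
It could then be registered, for example, as a persistence.xml property: hibernate.ejb.naming_strategy=com.example.UpperCaseNamingStrategy.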
In a Spring MVC application using Hibernate and MySQL, I have an abstract superclass BaseEntity that manages the values of the IDs for all the other entities in the model. The id field uses #GeneratedValue. I am encountering a problem whenever my code tries to save any of the subclasses that extend BaseEntity. The problem comes with the choice of GenerationType for the #GeneratedValue.
At every place in my code where a subclass of BaseEntity tries to save to the underlying MySQL database, I get the following error:
ERROR SqlExceptionHelper - Table 'docbd.hibernate_sequences' doesn't exist
I have read many postings about this on SO and on google, but they either deal with other databases (not MySQL) or they do not deal with abstract superclasses. I cannot solve the problem by using GenerationType.IDENTITY because I am using an abstract superclass to manage id fields for all entities in the model. Similarly, I cannot use GenerationType.SEQUENCE because MySQL does not support sequences.
So how do I solve this problem?
Here is the code for BaseEntity.java:
@Entity
@Inheritance(strategy = InheritanceType.TABLE_PER_CLASS)
public abstract class BaseEntity {
    @Id
    @GeneratedValue(strategy = GenerationType.TABLE)
    protected Integer id;

    public void setId(Integer id) { this.id = id; }
    public Integer getId() { return id; }
    public boolean isNew() { return (this.id == null); }
}
Here is an example of the code for one of the entities that extends BaseEntity:
@Entity
@Table(name = "ccd")
public class CCD extends BaseEntity {
    //other stuff
}
Here is the DDL:
CREATE TABLE IF NOT EXISTS ccd(
id int(11) UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
#other stuff
)engine=InnoDB;SHOW WARNINGS;
Here is the save code in the DAO:
@Override
@Transactional
public void saveCCD(CCD ccd) {
    if (ccd.getId() == null) {
        System.out.println("[[[[[[[[[[[[ about to persist CCD ]]]]]]]]]]]]]]]]]]]]");
        this.em.persist(ccd);
        this.em.flush();
    }
    else {
        System.out.println("]]]]]]]]]]]]]]]]]] about to merge CCD [[[[[[[[[[[[[[[[[[[[[");
        this.em.merge(ccd);
        this.em.flush();
    }
}
EDIT:
The reason I cannot use @MappedSuperclass in this situation is that I need ManyToOne relationships that allow multiple subtypes to be used interchangeably. Look at the AccessLog class below as an example. It has an actor_entity and a target_entity. There can be many types of actor entities and many types of target entities, but they all inherit from BaseEntity. This inheritance enables the underlying accesslogs table in MySQL to have just one actor_entity_id field and just one target_entity_id field instead of several fields for each. When I change @Entity above BaseEntity to @MappedSuperclass, a different error gets thrown indicating that AccessLog cannot find BaseEntity. BaseEntity needs the @Entity annotation in order for AccessLog to have polymorphic properties.
@Entity
@Table(name = "accesslogs")
public class AccessLog extends BaseEntity {
    @ManyToOne
    @JoinColumn(name = "actorentity_id")
    private BaseEntity actor_entity;

    @ManyToOne
    @JoinColumn(name = "targetentity_id")
    private BaseEntity target_entity;

    @Column(name = "action_code")
    private String action;

    //getters, setters, & other stuff
}
SECOND EDIT:
As per JBNizet's suggestion, I created a hibernate_sequences table as follows:
CREATE TABLE IF NOT EXISTS hibernate_sequences(
sequence_next_hi_value int(11) UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY
)engine=InnoDB;SHOW WARNINGS;
But now I am getting the following error:
Caused by: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'sequence_name' in 'where clause'
Here is the hibernate sql causing the error, followed by the next 2 lines of the stack trace:
Hibernate: select sequence_next_hi_value from hibernate_sequences where sequence_name = 'BaseEntity' for update
ERROR MultipleHiLoPerTableGenerator - HHH000351: Could not read or init a hi value
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Unknown column 'sequence_name' in 'where clause'
How do I resolve this?
What a mess... AUTO_INCREMENT is MySQL's hidden sequence. The underlying problem is that MySQL cannot insert a row and return the generated PK in one statement, but Hibernate needs the id while INSERTing a new entity.
The problems you run into:
When Hibernate saves a new entity, it tries to set the id on the new entity bean immediately. Therefore Hibernate must know which ID the database will use before it saves the new tuple to the table.
If you have multiple servers accessing the database, you either let the database's built-in sequence (AUTO_INCREMENT) assign the id via GenerationType.AUTO/GenerationType.IDENTITY, or you decide how large the reserved range of PKs is (a job for a DB architect). (We have about 20 servers against one database, so on a heavily used table we use a PK distance of +100.) If only one server has access to the database, GenerationType.TABLE should be correct.
Hibernate must calculate the next id itself, e.g. using max(id)+1, but:
What if two requests ask for max(id)+1 at the same time and get the same result? Right: the last attempt to insert will fail.
So you need a table LAST_IDS in the database that stores the last PK per table. If you want to add a row, you must do the following steps (a rough JDBC sketch follows below):
Start a read-optimistic transaction.
SELECT MAX(address_id) FROM LAST_IDS
Store the maximum in a Java variable, e.g. $oldId.
$newId = $oldId + 1 (+100 with a pessimistic lock).
UPDATE LAST_IDS SET address_id = $newId WHERE address_id = $oldId
Commit the read-optimistic transaction.
If the commit was successful, pass $newId to setId() on the Hibernate bean you want to save.
Finally let Hibernate perform the insert.
This is the only way I know.
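A rough JDBC sketch of the steps above; the LAST_IDS layout (one row per entity table with columns table_name and last_id) is assumed purely for illustration and is not how Hibernate's own table generator works:
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// Sketch of the manual id-allocation steps described above.
public class ManualIdAllocator {

    public long allocateId(Connection con, String tableName) throws SQLException {
        con.setAutoCommit(false);
        try {
            // Steps 1-3: read the current last id inside the transaction
            long oldId;
            try (PreparedStatement select = con.prepareStatement(
                    "SELECT last_id FROM LAST_IDS WHERE table_name = ?")) {
                select.setString(1, tableName);
                try (ResultSet rs = select.executeQuery()) {
                    rs.next();
                    oldId = rs.getLong(1);
                }
            }
            // Step 4: compute the next id (+1, or +100 when reserving a block)
            long newId = oldId + 1;
            // Step 5: optimistic update; it fails if another request advanced the id first
            try (PreparedStatement update = con.prepareStatement(
                    "UPDATE LAST_IDS SET last_id = ? WHERE table_name = ? AND last_id = ?")) {
                update.setLong(1, newId);
                update.setString(2, tableName);
                update.setLong(3, oldId);
                if (update.executeUpdate() != 1) {
                    throw new SQLException("Concurrent id allocation, retry");
                }
            }
            // Step 6: commit; steps 7-8: the caller sets the id on the entity and lets Hibernate insert
            con.commit();
            return newId;
        } catch (SQLException e) {
            con.rollback();
            throw e;
        }
    }
}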
BTW: Hibernate entities should only use inheritance if the database supports inheritance between tables, as PostgreSQL and Oracle do.
Because you use the TABLE identifier generator, you need to have that table created. If you are not using the enhanced identifier generators, chances are you are using the MultipleHiLoPerTableGenerator.
The MultipleHiLoPerTableGenerator can use a single table for all table-based identifier generators.
My suggestion is to grab the table DDL from your integration tests, in case you use hbm2ddl to build the test schema. If you use Flyway or Liquibase for testing, you can add a Maven plugin to generate the DDL schema.
Once you have the schema, take the exact create table statement and add it to your MySQL database.
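Alternatively, you can make the generator's table and column names explicit with JPA's @TableGenerator, so the DDL you create only has to match what is declared. A hedged sketch (the table and column names below mirror the ones in the error message, which are the usual Hibernate defaults, but verify them against your version):
import javax.persistence.*;

@Entity
@Inheritance(strategy = InheritanceType.TABLE_PER_CLASS)
public abstract class BaseEntity {

    // Explicit table generator: the names must match the table you create by hand.
    @Id
    @GeneratedValue(strategy = GenerationType.TABLE, generator = "base_id_gen")
    @TableGenerator(
            name = "base_id_gen",
            table = "hibernate_sequences",
            pkColumnName = "sequence_name",
            valueColumnName = "sequence_next_hi_value",
            pkColumnValue = "BaseEntity",
            allocationSize = 1)
    protected Integer id;
}
With this mapping the hibernate_sequences table needs a sequence_name (VARCHAR) column and a sequence_next_hi_value (integer) column, rather than the single AUTO_INCREMENT column from the second edit.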
I have mapped my inheritance hierarchy in Hibernate using InheritanceType.SINGLE_TABLE and a discriminator column to distinguish between the different entities. All subclasses of the superclass store their fields in secondary tables. As an example:
@MappedSuperclass
public abstract class Base
{
    @Id
    private String id;

    @Version
    private long version;
}

@Entity
@Inheritance(strategy = InheritanceType.SINGLE_TABLE)
@DiscriminatorColumn(name = "type", discriminatorType = DiscriminatorType.STRING)
public class Parent extends Base
{
    @Column(nullable = false)
    private BigDecimal value;
}

@Entity
@DiscriminatorValue("child1")
@SecondaryTable(name = "Child1")
public class Child1 extends Parent
{
    @Column(table = "Child1")
    private String name;
}

@Entity
@DiscriminatorValue("child2")
@SecondaryTable(name = "Child2")
public class Child2 extends Parent
{
    @Column(table = "Child2")
    private String name2;
}
I now have an entity that has a @OneToOne relationship with the Parent class. This entity only needs to work with the value field from the Parent class; it will never need to work with any fields from any subclass of Parent.
@Entity
public class AnotherEntity extends Base
{
    @JoinColumn(name = "parentId")
    @OneToOne(fetch = FetchType.LAZY, optional = true, targetEntity = Parent.class)
    private Parent parent;
}
What I want to happen is that only the fields of Parent.class are selected when the relationship to parent is loaded from the database. What I'm seeing is that Hibernate attempts to load all properties of the entities that extend Parent, and it also left joins all of the secondary tables. This is problematic because I have roughly 30 entities that extend Parent, which makes fetching the Parent entity non-viable as the query performs 30 joins.
As an example, this is the type of query I am seeing:
Hibernate:
    select
        parent0_.id as id3_0_,
        parent0_.version as version3_0_,
        parent0_.name1 as name110_3_0_,
        parent0_.name2 as name24_3_0_,
        parent0_.type as type3_0_
    from
        Parent parent0_
    left outer join
        Child1 parent0_2_
            on parent0_.id=parent0_2_.id
    left outer join
        Child2 parent0_3_
            on parent0_.id=parent0_3_.id
I don't understand why Hibernate decides to select a superset of all properties defined in the subclasses of Parent and to join all of the secondary tables. I could understand it joining the secondary table for the entity identified by the discriminator value of the parent being referenced, but otherwise I am confused.
My question is: how do I go about achieving my requirement of only having the fields from the Parent class loaded when I retrieve the Parent relationship in the AnotherEntity class?
Thanks.
A secondary table is normally used to map the content of a single entity to two tables. It doesn't allow for lazy/select fetching using standard JPA annotations. You may use a proprietary Hibernate annotation to load it using a separate select, and only if necessary, though. See http://docs.jboss.org/hibernate/core/3.6/reference/en-US/html_single/#mapping-declaration-join:
fetch: If set to JOIN, the default, Hibernate will use an inner join
to retrieve a secondary table defined by a class or its superclasses
and an outer join for a secondary table defined by a subclass. If set
to SELECT then Hibernate will use a sequential select for a secondary
table defined on a subclass, which will be issued only if a row turns
out to represent an instance of the subclass. Inner joins will still
be used to retrieve a secondary defined by the class and its
superclasses.
So setting the fetch attribute of the Hibernate @Table annotation to SELECT will do what you want: an additional select will be issued to load the values from just the appropriate secondary table.
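A hedged sketch of what that could look like on one of the subclasses, using Hibernate's @org.hibernate.annotations.Table with its appliesTo and fetch attributes (check the exact attribute names against your Hibernate version):
import javax.persistence.Column;
import javax.persistence.DiscriminatorValue;
import javax.persistence.Entity;
import javax.persistence.SecondaryTable;
import org.hibernate.annotations.FetchMode;
import org.hibernate.annotations.Table;

@Entity
@DiscriminatorValue("child1")
@SecondaryTable(name = "Child1")
// Hibernate-specific: fetch the Child1 secondary table with a separate select,
// issued only when the row actually represents a Child1 instance.
@Table(appliesTo = "Child1", fetch = FetchMode.SELECT)
public class Child1 extends Parent
{
    @Column(table = "Child1")
    private String name;
}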
If you want lazy fetching, then a secondary table is not what you want. You'll have to do it using associations.
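For example, a rough sketch of that alternative, with the subclass-specific column moved into its own lazily loaded entity (Child1Detail and the detail_id column are invented names, and this changes the table layout compared to the secondary-table mapping):
import javax.persistence.*;

// Invented helper entity holding what used to live in the Child1 secondary table.
@Entity
@Table(name = "Child1")
public class Child1Detail {
    @Id
    @GeneratedValue
    private Long id;

    private String name;
}

@Entity
@DiscriminatorValue("child1")
public class Child1 extends Parent
{
    // Owning-side lazy association: loaded with its own query only when accessed.
    @OneToOne(fetch = FetchType.LAZY, optional = false)
    @JoinColumn(name = "detail_id")
    private Child1Detail detail;
}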