Hibernate will not make indexes when creating tables from annotations - java

I'm using Hibernate with hbm2ddl.auto=update so that it will automatically generate my Oracle tables for me (I have no intention of learning Oracle's SQL).
So far, so good, until now: I'm trying to get it to create an index. As far as I can tell, I've written my annotations correctly:
package data;

import javax.persistence.*;
import org.hibernate.annotations.Index;

@Entity
@Table(name = "log_entries")
@org.hibernate.annotations.Table(appliesTo = "log_entries",
    indexes = { @Index(name = "idx", columnNames = { "job", "version", "schedule", "dttmRun", "pid" }) })
public class LogEntry {

    @Id @GeneratedValue
    Long id;

    String job;
    String version;
    String schedule;
    String dttmRun;
    int pid;
    String command;
    int duration;

    // getters and setters...
}
When I drop the table and restart my app, it creates the table but not the index. Any ideas?

I tested your code on Derby, and here is what I get when running SchemaExport (I originally wrote SchemaUpdate; see the update below):
drop table log_entries
create table log_entries (
id bigint not null,
command varchar(255),
dttmRun varchar(255),
duration integer not null,
job varchar(255),
pid integer not null,
schedule varchar(255),
version varchar(255),
primary key (id)
)
create index idx on log_entries (job, version, schedule, dttmRun, pid)
Works as expected. Can't try on Oracle right now though.
Tested with Hibernate EM 3.4.0.GA, Hibernate Annotations 3.4.0.GA, Hibernate Core 3.3.0.SP1.
Update: I realized I had run SchemaExport, not SchemaUpdate, and I confirm that the index does not get created when running SchemaUpdate with the library versions mentioned above. So, while ANN-108 has been rejected as a duplicate of HHH-1012, HHH-1012 is marked as fixed, and HB-1458 is still open (!?), my quick test didn't work as expected. I'll take a deeper look later, but I'm very confused right now (especially by HHH-1012).
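In case it helps to reproduce the difference, a minimal sketch along these lines can print what the two tools emit; it assumes a hibernate.cfg.xml with the dialect and connection settings on the classpath, and only prints the generated DDL without executing it:
import org.hibernate.cfg.AnnotationConfiguration;
import org.hibernate.tool.hbm2ddl.SchemaExport;
import org.hibernate.tool.hbm2ddl.SchemaUpdate;

public class SchemaTool {
    public static void main(String[] args) {
        // Assumed: hibernate.cfg.xml on the classpath with dialect and connection settings.
        AnnotationConfiguration cfg = new AnnotationConfiguration();
        cfg.configure();
        cfg.addAnnotatedClass(data.LogEntry.class);

        // Full drop/create: this is the run that prints "create index idx on log_entries (...)".
        new SchemaExport(cfg).create(true, false);   // script to stdout, do not execute

        // Incremental diff against the live schema: this is where the index goes missing.
        new SchemaUpdate(cfg).execute(true, false);  // script to stdout, do not execute
    }
}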

Related

pgloader - How to import a longblob as oid?

In a nutshell
How do you migrate a longblob from MySQL to Postgres using pgloader such that Hibernate is happy when the column is annotated @Lob and @Basic(fetch = FetchType.LAZY)?
Full story
So I'm migrating (or trying to, at least) a MySQL DB to Postgres, and I'm now trying to move the File table, with its longblob content column, correctly.
My current pgloader script is fairly simple:
LOAD DATABASE
FROM mysql://foo:bar@localhost:3306/foobar
INTO postgresql://foo:bar@localhost:5432/foobar
CAST
type int to integer drop typemod,
type bigint with extra auto_increment to bigserial drop typemod,
type bigint to bigint drop typemod
ALTER TABLE NAMES MATCHING 'User' RENAME TO 'users'
ALTER TABLE NAMES MATCHING ~/./ SET SCHEMA 'public'
;
This is sufficient to load the data and have the foreign keys working.
After the migration, the content column in the Postgres file table comes out as bytea.
File, however, is a Java entity, and its content is annotated @Lob:
@Entity
@Inheritance(strategy = InheritanceType.JOINED)
public class File extends BaseEntity {

    @NotNull
    private String name;

    @Column
    @Size(max = 4096)
    private String description;

    @NotNull
    private String mimeType;

    @Lob
    @Basic(fetch = FetchType.LAZY)
    private transient byte[] content;

    ...
}
which is why the application fails to connect to the migrated database with error:
Schema-validation: wrong column type encountered in column [content] in table [File];
found [bytea (Types#BINARY)], but expecting [oid (Types#BLOB)]
How do I get this migration to work?
I did try setting
spring.jpa.properties.hibernate.jdbc.use_streams_for_binary=false
as suggested in "proper hibernate annotation for byte[]", but that didn't do anything.
Hm ... I guess I can just create the blobs after the fact, as suggested in "Migrate PostgreSQL text/bytea column to large object?"
This means the migration script gets an extension:
LOAD DATABASE
FROM mysql://foo:bar@localhost:3306/foobar
INTO postgresql://foo:bar@localhost:5432/foobar
CAST
type int to integer drop typemod,
type bigint with extra auto_increment to bigserial drop typemod,
type bigint to bigint drop typemod
ALTER TABLE NAMES MATCHING 'User' RENAME TO 'users'
ALTER TABLE NAMES MATCHING ~/./ SET SCHEMA 'public'
AFTER LOAD DO
$$
ALTER TABLE file RENAME COLUMN content TO content_bytes;
$$,
$$
ALTER TABLE file ADD COLUMN content OID;
$$,
$$
UPDATE file SET
content = lo_from_bytea(0, content_bytes::bytea),
content_bytes = NULL
;
$$,
$$
ALTER TABLE file DROP COLUMN content_bytes
$$
;
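As a quick sanity check after running the extended script, a minimal sketch like the following can force the lazy LOB to load (the HibernateUtil helper, the getters, and Hibernate 5.2+ for try-with-resources are assumptions; the important detail is that Postgres large objects have to be read inside a transaction):
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

public class LobMigrationCheck {
    public static void main(String[] args) {
        // Assumed: a HibernateUtil helper exposing the SessionFactory, and the File entity
        // from the question on the classpath.
        SessionFactory sessionFactory = HibernateUtil.getSessionFactory();
        try (Session session = sessionFactory.openSession()) {
            // Postgres large objects (oid) must be accessed inside a transaction.
            Transaction tx = session.beginTransaction();
            File file = (File) session.createQuery("from File").setMaxResults(1).uniqueResult();
            byte[] content = file.getContent();   // forces the lazy @Lob column to load
            System.out.println("Read " + content.length + " bytes for " + file.getName());
            tx.commit();
        }
    }
}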

Force hibernate to leave id empty

So I am using Postgres and Hibernate 4.2.2, with an entity like this:
@Entity(name = "Users")
@Check(constraints = "email ~* '^[A-Za-z0-9._%-]+@[A-Za-z0-9.-]+[.][A-Za-z]+$'")
@DynamicInsert
public class Users {

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    @Column(name = "id_user", unique = true)
    @Index(name = "user_pk")
    private Integer idUser;
Hibernate still inserts an id that is already in the table, instead of leaving it empty for the database to fill in. Hibernate also assigns ids based on its cache, without even checking the database for the latest id.
How can I force it to leave the id blank and let the database insert it?
At first I thought it was because I was using int, which defaults to 0, but even when using the wrapper object Hibernate just forces an id from its cache.
So my goal is to let the database fill in the ids instead of Hibernate, or at least to have Hibernate check the database for the id before filling it in.
So the error I was getting was: Caused by: org.postgresql.util.PSQLException: ERROR: duplicate key value violates unique constraint "users_pkey" Detail: Key (id_user)=(1) already exists.
It wasn't caused by Hibernate and its caching, but by the data import at database creation, where I inserted rows with explicit ids, e.g.: INSERT INTO users(id_user,email,password,tag) VALUES (1,'a@b.c','***','Adpleydu');
The sequence used for id generation was never advanced, so inserting with plain SQL via the console produced the same error.
Seeding the data is the problem. However, you can still seed with plain SQL and have the sequence "keep up":
1) Ensure your primary key is of type SERIAL.
CREATE TABLE table_name(
id SERIAL
);
2) Add this setval line to ensure the sequence is updated:
select setval('table_name_id_seq',COALESCE((select max(id) + 1 from table_name), 1));
Reference:
https://www.postgresqltutorial.com/postgresql-serial/
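If the seeding runs from the application rather than the console, the same fix can be applied right after the import; this is only a sketch, and the sequence name users_id_user_seq (Postgres's default naming for a SERIAL column), as well as the open Hibernate Session, are assumptions:
// Advance the sequence past the explicitly inserted ids so generated ids don't collide.
// Assumed: a Hibernate 4.x Session and the default SERIAL sequence name users_id_user_seq.
Number next = (Number) session.createSQLQuery(
        "select setval('users_id_user_seq', coalesce((select max(id_user) + 1 from users), 1))")
    .uniqueResult();
System.out.println("users id sequence advanced to " + next);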

Hibernate / SQLException: field doesn't have default value

MySQL table generated using:
CREATE TABLE `actors` (
`actorID` INT(11) NOT NULL,
`actorName` VARCHAR(255) NOT NULL,
PRIMARY KEY AUTO_INCREMENT (actorID)
);
Mapped class:
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "actorID", length = 11, unique = true, nullable = false)
private int actorID;
....
Error:
ERROR: Field 'actorID' doesn't have a default value
This error is hit when trying to create a new Actor object and save it to the database.
I've tried other generation strategies and dropping/rebuilding the tables and the entire database, as per the answers to similar questions. No luck so far.
Thank you for your time,
Samuel Smith
EDIT: Dropping the table and recreating it with the SQL syntax shown by shippi solved the problem for me.
Just try to use
@GeneratedValue(strategy = GenerationType.AUTO)
@Id
@Column(name = "actorID")
private int actorID;
This should fix your issue. Leave actorID unset when saving.
Also ensure that the DB itself works: try writing an insert statement that omits the ID and see whether the plain insert succeeds. If a direct insert into the database works, you can start troubleshooting Hibernate.
And a small point: I usually define the PK and AUTO_INCREMENT on the same line as the column definition:
actorID INT(11) NOT NULL AUTO_INCREMENT PRIMARY KEY,
but it should work either way.
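For what it's worth, a minimal save sketch under that mapping looks like this (the Actor entity around the field, its actorName property, and an open Session with an active transaction are assumptions); with a generated identifier, the unset field is simply overwritten with the database-assigned key:
// Assumed: an Actor entity containing the mapped actorID above plus an actorName property,
// and an open Hibernate Session inside an active transaction.
Actor actor = new Actor();
actor.setActorName("Samuel Smith");
session.save(actor);                               // INSERT without actorID; MySQL assigns it
System.out.println("Generated id: " + actor.getActorID());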
Your error is in your table definition. It must be:
CREATE TABLE `actors` (
`actorID` INT(11) NOT NULL AUTO_INCREMENT,
`actorName` VARCHAR(255) NOT NULL,
PRIMARY KEY (`actorID`)
);
You missed AUTO_INCREMENT on actorID.
I had the same issue. I solved it by emptying the entire database and re-running the application. Everything is now working fine.

Hibernate: @GeneratedValue(strategy = GenerationType

I am using DB2 for my application. I run an insertion script after creating the database. That script creates records in the tables with the ids given in the script.
Suppose the insertion script creates a record in the abc table with id = 3. Since the ids are set to be auto-generated in Hibernate, I got an exception when saving the third record from the application.
Caused by: com.ibm.websphere.ce.cm.DuplicateKeyException: One or more values in the INSERT statement, UPDATE statement, or foreign key update caused by a DELETE statement are not valid because the primary key, unique constraint or unique index identified by "1" constrains table
I am using @GeneratedValue(strategy = GenerationType.AUTO).
Which GenerationType strategy should I use to overcome this problem?
There are issues with certain databases and Hibernate when you use GenerationType.IDENTITY. Try using a sequence and explicitly configure everything for it:
@Id
@SequenceGenerator(name = "DEPARTMENT_ID_GENERATOR", sequenceName = "department_sequence", allocationSize = 100)
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "DEPARTMENT_ID_GENERATOR")
@Column(unique = true, nullable = false)
protected Long id;
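A hedged usage sketch for that mapping follows (the Department entity around the id, its name property, and the EntityManager are assumptions); note that with Hibernate's pooled optimizer, allocationSize is normally expected to match the INCREMENT BY of the DB2 sequence, otherwise id blocks can overlap.
// Assumed: a Department entity carrying the id mapping above, a name property,
// an EntityManager with an active transaction, and a DB2 sequence department_sequence
// whose INCREMENT BY matches the allocationSize.
Department department = new Department();
department.setName("Research");
entityManager.persist(department);   // id comes from department_sequence, not from the seed script
System.out.println("Assigned id: " + department.getId());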
For DB2, @GeneratedValue(strategy = GenerationType.IDENTITY) should work correctly.
If the ids are provided in the file, then you don't need @GeneratedValue at all, as there is no id to generate. And make sure to clean the database as @SjB suggested.
Also, without knowing much about DB2, the error message suggests there may be a violation other than just a duplicate id on insert. Are there any foreign keys involved?
Nothing worked except this query:
alter table TABLE_NAME alter column ID set GENERATED BY DEFAULT RESTART WITH 10000;
DB2 should pick an available ID itself, but it does not.

Hibernate: fetching a set of objects in an entity mapping

I've got an entity Case that has an id CaseId (unfortunately a String, due to compatibility with a legacy system). This id is a foreign key in the Document table, and each Case can have many documents (one-to-many). I've put the following in my Case entity:
@Id
@Column(name = "CaseId", length = 20, nullable = false)
private String caseId;

@OneToMany(fetch = FetchType.EAGER)
@JoinColumns({
    @JoinColumn(name = "caseId", referencedColumnName = "CaseId")
})
private Set<Document> documents;
The table for Document contains "CaseId varchar(20) not null". Right now, in the database, every case has six documents. Yet when I call myCase.getDocuments().size(), I only ever get a single document. What should I do to get all the documents?
Cheers
Nik
The mapping looks correct. But it would be interesting to see:
- the Document entity (and its equals/hashCode)
- the SQL performed (see this previous answer to activate SQL logging)
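One common cause of exactly this symptom is an equals/hashCode pair on Document based on a field shared by all six rows (for example caseId), which makes a HashSet collapse them into one element. A hedged sketch of an identifier-based implementation follows; every field name besides CaseId is an assumption:
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;

@Entity
public class Document {

    @Id
    @Column(name = "DocumentId", length = 20)   // hypothetical primary key column
    private String documentId;

    @Column(name = "CaseId", length = 20, nullable = false)
    private String caseId;

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Document)) return false;
        Document other = (Document) o;
        // Equality on the identifier, not on caseId, so distinct documents stay distinct in a Set.
        return documentId != null && documentId.equals(other.documentId);
    }

    @Override
    public int hashCode() {
        return documentId != null ? documentId.hashCode() : 31;
    }
}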
