MySQL table generated using:
CREATE TABLE `actors` (
`actorID` INT(11) NOT NULL,
`actorName` VARCHAR(255) NOT NULL,
PRIMARY KEY AUTO_INCREMENT (actorID)
);
Mapped class:
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)
@Column(name = "actorID", length = 11, unique = true, nullable = false)
private int actorID;
....
Error:
ERROR: Field 'actorID' doesn't have a default value
This error is hit when trying to create a new Actor object and save it to the database.
I've tried using other generation strategies and dropping/rebuilding the table (and the entire database), as per the answers to similar questions. No luck so far.
Thank you for your time,
Samuel Smith
EDIT: Dropping the table and recreating it using the SQL syntax shown by shippi seemed to solve the problem for me.
Just try using:
@GeneratedValue(strategy = GenerationType.AUTO)
@Id
@Column(name = "actorID")
private int actorID;
This should fix your issue. Leave actorID unset when saving; don't assign it yourself.
Also make sure the database itself is fine: try writing an INSERT statement that omits the ID and see whether the plain insert works. If a direct insert into the database succeeds, you can start troubleshooting Hibernate.
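As a sketch of that direct-insert check (table and column names taken from the question; the actor name is made up):

```sql
-- Insert a row without supplying the auto-increment ID.
INSERT INTO actors (actorName) VALUES ('Test Actor');

-- If the insert worked, this returns the generated ID.
SELECT LAST_INSERT_ID();
```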
And a small point: I usually define the primary key and auto-increment on the same line as the column definition:
actorID INT(11) NOT NULL AUTO_INCREMENT PRIMARY KEY,
but it should work either way.
Your error is in your table definition.
Must be:
CREATE TABLE `actors` (
`actorID` INT(11) NOT NULL AUTO_INCREMENT,
`actorName` VARCHAR(255) NOT NULL,
PRIMARY KEY (`actorID`)
);
You missed AUTO_INCREMENT on actorID. Note that AUTO_INCREMENT belongs on the column definition, not inside the PRIMARY KEY clause.
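To verify the fix took effect, one way (assuming a MySQL client session) is to inspect the table definition and confirm the column now carries AUTO_INCREMENT:

```sql
SHOW CREATE TABLE actors;
-- The actorID line should read:
--   `actorID` int(11) NOT NULL AUTO_INCREMENT
```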
I had the same issue. I solved it by emptying the entire database and re-running the application. Everything is now working fine.
Related
In a nutshell
How do you migrate a longblob from MySQL to Postgres using pgloader such that Hibernate is happy if the column is annotated @Lob and @Basic(fetch = FetchType.LAZY)?
Full story
So I'm migrating (or at least trying to) a MySQL DB to Postgres, and I'm now trying to move this table across correctly:
My current pgloader script is fairly simple:
LOAD DATABASE
FROM mysql://foo:bar@localhost:3306/foobar
INTO postgresql://foo:bar@localhost:5432/foobar
CAST
type int to integer drop typemod,
type bigint with extra auto_increment to bigserial drop typemod,
type bigint to bigint drop typemod
ALTER TABLE NAMES MATCHING 'User' RENAME TO 'users'
ALTER TABLE NAMES MATCHING ~/./ SET SCHEMA 'public'
;
This is sufficient to load the data and have the foreign keys working.
The postgres table looks like this:
The File, however, is a Java entity and its content is annotated @Lob:
@Entity
@Inheritance(strategy = InheritanceType.JOINED)
public class File extends BaseEntity {
@NotNull
private String name;
@Column
@Size(max = 4096)
private String description;
@NotNull
private String mimeType;
@Lob
@Basic(fetch = FetchType.LAZY)
private transient byte[] content;
...
}
which is why the application fails to connect to the migrated database with error:
Schema-validation: wrong column type encountered in column [content] in table [File];
found [bytea (Types#BINARY)], but expecting [oid (Types#BLOB)]
How do I get this migration to work?
I did try setting
spring.jpa.properties.hibernate.jdbc.use_streams_for_binary=false
as suggested in "proper hibernate annotation for byte[]", but that didn't change anything.
Hm ... I guess I can just create blobs after the fact, as suggested by Migrate PostgreSQL text/bytea column to large object?
Meaning the migration script will get an extension:
LOAD DATABASE
FROM mysql://foo:bar@localhost:3306/foobar
INTO postgresql://foo:bar@localhost:5432/foobar
CAST
type int to integer drop typemod,
type bigint with extra auto_increment to bigserial drop typemod,
type bigint to bigint drop typemod
ALTER TABLE NAMES MATCHING 'User' RENAME TO 'users'
ALTER TABLE NAMES MATCHING ~/./ SET SCHEMA 'public'
AFTER LOAD DO
$$
ALTER TABLE file RENAME COLUMN content TO content_bytes;
$$,
$$
ALTER TABLE file ADD COLUMN content OID;
$$,
$$
UPDATE file SET
content = lo_from_bytea(0, content_bytes::bytea),
content_bytes = NULL
;
$$,
$$
ALTER TABLE file DROP COLUMN content_bytes
$$
;
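After the AFTER LOAD block runs, a quick sanity check (a sketch; assumes PostgreSQL 9.4+ for lo_get) is to confirm the content column now holds large-object OIDs that resolve to real data:

```sql
-- content should now be of type oid, and lo_get() should return the bytes.
SELECT id, length(lo_get(content)) AS content_bytes
FROM file
LIMIT 5;
```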
CREATE TABLE `user_info` (
`user_info_id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`nickname` varchar(40) NOT NULL,
`email` varchar(100),
PRIMARY KEY (`user_info_id`)
)
This is my database table. The database has a record with id=1, nickname=test, email=null.
UserInfo info = new UserInfo();
info.setUserInfoId(1);
info.setEmail("hello@hotmail.com");
When I update with the above code, I get the error Column 'nickname' cannot be null!
I know this is because JPA has to perform a lookup first, but I don't want to have to supply every non-nullable column just to update one field.
Additional explanation:
Assuming my front end only sends me the ID and the email, how can I perform the update? Plain SQL would work, but JPA insists that nickname must not be null.
How do I solve this problem?
Thanks
The way to update an existing record is to load it, change the columns that need updating, and then save it, not to create a new object:
UserInfo info = userInfoRepository.findById(1).orElseThrow();
info.setEmail("hello@hotmail.com");
userInfoRepository.save(info);
(Note that in Spring Data JPA 2.x, findById returns an Optional.)
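If loading the entity first is undesirable, another common approach (a sketch; the repository and method names here are assumptions, not from the question) is a targeted JPQL update via Spring Data's @Modifying, which writes only the email column:

```java
public interface UserInfoRepository extends JpaRepository<UserInfo, Integer> {

    // Issues a single UPDATE statement; nickname is never read or written.
    @Modifying
    @Transactional
    @Query("UPDATE UserInfo u SET u.email = :email WHERE u.userInfoId = :id")
    int updateEmail(@Param("id") Integer id, @Param("email") String email);
}
```

Usage: userInfoRepository.updateEmail(1, "hello@hotmail.com"); — the return value is the number of rows updated.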
I'm having some trouble creating tables using Hibernate with the following exception:
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Incorrect column specifier for column 'id'
Unable to execute schema management to JDBC target [create table services (id varchar(255) not null auto_increment, adult_cost double precision not null, child_cost double precision not null, partner_id varchar(255) not null, primary key (id)) ENGINE=InnoDB]
My code for the ID column is as follows:
@Id
@GeneratedValue(strategy = GenerationType.AUTO)
private int id;
From what I understand, this exception normally occurs when you try to put auto_increment on a non-numeric column, but as you can see, I've defined id as an int.
Any help appreciated.
Check the table definition itself: make sure the primary key (or whichever key you selected) is actually defined as AUTO_INCREMENT with a NOT NULL specification; if it is, it definitely works.
Use GenerationType.IDENTITY instead of GenerationType.AUTO. MySQL generates id values with AUTO_INCREMENT rather than a sequence or generator table, and IDENTITY maps directly onto that.
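With that change, the mapping would look like this (a sketch based on the code in the question):

```java
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)  // maps to MySQL AUTO_INCREMENT
private int id;
```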
I am new to the Derby library. Why do I get this error when I use auto_increment in my query?
Here is my Java code:
this.conn.createStatement().execute("CREATE TABLE user (user_id INT AUTO_INCREMENT, PRIMARY KEY (user_id))");
I tried this against a MySQL server and it works, but in Derby I get this error:
java.sql.SQLSyntaxErrorException: Syntax error: Encountered "auto_increment" at line 1
Why do I get this error?
Derby does not have auto_increment as a keyword. In Derby you need to use identity columns to implement auto-increment behaviour.
For example
CREATE TABLE students
(
id INTEGER NOT NULL GENERATED ALWAYS AS IDENTITY (START WITH 1, INCREMENT BY 1),
name VARCHAR(24) NOT NULL,
address VARCHAR(1024),
CONSTRAINT primary_key PRIMARY KEY (id)
) ;
The statement above creates the students table with id as an auto-increment column that is also the primary key.
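Inserting then works much like MySQL's AUTO_INCREMENT: omit the id column and let Derby generate it. For example (a sketch; IDENTITY_VAL_LOCAL() is Derby's built-in for reading back the last generated identity value on the connection):

```sql
INSERT INTO students (name, address) VALUES ('Alice', '1 Main Street');

-- Returns the identity value generated by the most recent
-- single-row insert on this connection.
VALUES IDENTITY_VAL_LOCAL();
```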
Hope this helps
I'm using Hibernate with hbm2ddl.auto=update so that it'll automatically generate my Oracle tables for me (I've no intention of learning Oracle's SQL).
So far, so good, until now: I'm trying to get it to create an index. As far as I can tell, I've made my annotations correctly:
package data;
import javax.persistence.*;
import org.hibernate.annotations.Index;
@Entity
@Table(name="log_entries")
@org.hibernate.annotations.Table(appliesTo="log_entries",
indexes = { @Index(name="idx", columnNames = {"job", "version", "schedule", "dttmRun", "pid" } ) } )
public class LogEntry {
@Id @GeneratedValue
Long id;
String job;
String version;
String schedule;
String dttmRun;
int pid;
String command;
int duration;
// getters and setters...
}
When I drop the table and restart my app, it creates the table but not the index. Any ideas?
I tested your code on Derby and here is what I get when running SchemaExport:
drop table log_entries
create table log_entries (
id bigint not null,
command varchar(255),
dttmRun varchar(255),
duration integer not null,
job varchar(255),
pid integer not null,
schedule varchar(255),
version varchar(255),
primary key (id)
)
create index idx on log_entries (job, version, schedule, dttmRun, pid)
Works as expected. Can't try on Oracle right now though.
Tested with Hibernate EM 3.4.0.GA, Hibernate Annotations 3.4.0.GA, Hibernate Core 3.3.0.SP1.
Update: I realized I ran SchemaExport, not SchemaUpdate, and I confirm the index does not get created when running SchemaUpdate with the library versions mentioned above. So, while ANN-108 has been rejected as a duplicate of HHH-1012, while HHH-1012 is marked as fixed, and while HB-1458 is still open (!?), my quick test didn't work as expected. I'll take a deeper look later, but I'm very confused right now (especially by HHH-1012).
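Until SchemaUpdate handles index creation, a pragmatic workaround (a sketch; it is the same DDL SchemaExport emits above) is to create the index manually, or from a startup script, after the schema update runs:

```sql
CREATE INDEX idx ON log_entries (job, version, schedule, dttmRun, pid);
```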