I'm creating a database using an initial.sql script, with this Hibernate configuration:
<property name="hibernate.dialect">org.hibernate.dialect.MySQLDialect</property>
<property name="hibernate.connection.driver_class">com.mysql.cj.jdbc.Driver</property>
<property name="hibernate.connection.url">jdbc:mysql://localhost:3310/mydb?createDatabaseIfNotExist=true&autoReconnect=true&useSSL=false</property>
<property name="hibernate.connection.username">root</property>
<property name="hibernate.connection.password">qet</property>
<property name="format_sql">true</property>
<property name="hibernate.hbm2ddl.import_files_sql_extractor">org.hibernate.tool.hbm2ddl.MultipleLinesSqlCommandExtractor</property>
<property name="hibernate.hbm2ddl.auto">create</property>
<property name="hibernate.hbm2ddl.import_files">/database/initial.sql</property>
It works perfectly, but the problem is that the database now exists, and every time I restart the app the data is inserted again.
How can I make it do nothing when the schema already exists? I just want to create the db if it doesn't exist or doesn't have any tables...
Try switching the property to update
<property name="hibernate.hbm2ddl.auto">update</property>
create recreates the schema every time, while update only creates it if it doesn't exist (and otherwise applies changes). The standard values can be seen here:
https://docs.jboss.org/hibernate/orm/5.0/manual/en-US/html/ch03.html
validate : validates the schema
update : update if there are changes, create if it doesn't exist
create : recreate it from scratch
create-drop : creates it and then drops it when the session factory is closed
Also, from the same reference we can see that import files are executed only if hibernate.hbm2ddl.auto is set to create or create-drop:
File order matters: the statements of a given file are executed before the statements of the following files. These statements are only executed if the schema is created, i.e. if hibernate.hbm2ddl.auto is set to create or create-drop.
If you want that initial.sql to be executed, you can leave the property as create and make the SQL use commands that don't change the state when they are run again, like DROP TABLE IF EXISTS, CREATE TABLE IF NOT EXISTS, etc. For inserts it is a bit more complicated, but you can still do it with an insert that selects against the same table and only fires when the value is not found, as in the sketch below.
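A rough sketch of what such an idempotent initial.sql could look like in MySQL (the app_config table and its row are made up for illustration; only the pattern matters):

CREATE TABLE IF NOT EXISTS app_config (
    id INT PRIMARY KEY,
    name VARCHAR(50) NOT NULL
);

-- Insert the row only if it is not already there.
INSERT INTO app_config (id, name)
SELECT 1, 'default' FROM DUAL
WHERE NOT EXISTS (SELECT 1 FROM app_config WHERE id = 1);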
That's not the perfect solution though, because tables created from annotated classes will still be recreated. I think you should consider another design, or use hibernate.hbm2ddl.auto=create when deploying for the first time and then change it to update manually.
With Wildfly 16, using the EntityManager I persist a new entity and then invoke another routine that performs an HQL query to retrieve that entity, but the query fails to return it.
This is running server side and is normally triggered by a client command. The command should cause the new object to be persisted, then call a routine used in several places that selects objects (including the newly persisted one) and formats and pushes them to subscribers.
I've tried a few things to get this working:
Flush the EntityManager prior to the HQL query
Reuse the EntityManager via the invoked routine (with and without flushing post persist())
The only way I've managed to get the desired results is to cause the initial client command to persist the entity, then perform a second client command to retrieve the entity persisted via the first command. This is not an issue with the HQL retrieving the data as it does work - it just doesn't work immediately after persisting the entity. It seems like either the data isn't persisted prior to the HQL query or the HQL query is looking at something cached (although I haven't specifically set anything like that, so it would have to be a default I'm unaware of).
An example of the three routines:
//Routine A - calls B & C
routineB();
routineC();
//Routine B
EntityManager em = emProvider.getEntityManager(); // pulls em from a stateless bean annotated with @PersistenceContext
Blah blah = new Blah();
em.persist(blah);
em.flush();
//Routine C
EntityManager em = emProvider.getEntityManager();
List<Blah> blahs = em.createNamedQuery("retrieveBlah").getResultList();
//do some stuff with blahs... except it's missing blah from Routine A
My persistence.xml settings, in case that's relevant:
<property name="hibernate.dialect" value="org.hibernate.dialect.PostgreSQL94Dialect" />
<property name="hibernate.connection.useUnicode" value="true" />
<property name="hibernate.connection.characterEncoding" value="UTF-8" />
<property name="hibernate.connection.charSet" value="UTF-8" />
<property name="hibernate.id.new_generator_mappings" value="true" />
I need to be able to persist the entity and retrieve it via the HQL query as the result of a single client command.
It turns out the post above was missing the key piece of information needed to resolve the issue. After posting this I continued testing and discovered what I had overlooked: routineB() was marked @Asynchronous, so it was executing in parallel with routineC().
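For reference, a minimal sketch of the fix (the bean and method names here are illustrative; emProvider, Blah and routineC() come from the question): either remove @Asynchronous, or have the asynchronous method return a Future so the caller can wait for the persist before querying.

import java.util.concurrent.Future;
import javax.ejb.AsyncResult;
import javax.ejb.Asynchronous;
import javax.ejb.EJB;
import javax.ejb.Stateless;
import javax.persistence.EntityManager;

@Stateless
public class BlahWriter {

    @EJB
    private EmProvider emProvider; // the stateless bean from the question that exposes the EntityManager

    // The question's routineB(), changed so it reports completion to its caller.
    @Asynchronous
    public Future<Void> persistBlah() {
        EntityManager em = emProvider.getEntityManager();
        Blah blah = new Blah();
        em.persist(blah);
        em.flush();
        return new AsyncResult<>(null);
    }
}

The caller (the question's routineA()) then invokes persistBlah().get() through an injected EJB reference before calling routineC(), so the named query only runs once the persist has completed.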
I have an issue with a Hibernate query: my IDEA inspection reports a syntax error:
This inspection controls whether the Persistence QL Queries are
error-checked
But I have created a mapping for the Task objects in my hibernate.cfg.xml:
<session-factory>
<property name="connection.url">jdbc:postgresql://localhost:5432/todo_list</property>
<property name="connection.driver_class">org.postgresql.Driver</property>
<property name="connection.username">postgres</property>
<property name="connection.password">1</property>
<property name="dialect">org.hibernate.dialect.PostgreSQL95Dialect</property>
<mapping resource="ru/pravvich/model/Task.hbm.xml" />
</session-factory>
Facets:
If I cheat the IDE and, instead of createQuery("select t from Task t"), create a variable and pass it into createQuery:
String hql = format("select t from Task t where t.id > %s", 0);
session.createQuery(hql);
it works, but that's not normal code. How do I fix this issue?
Here is what resolved the same issue for me:
Open the IDEA Preferences (Settings) / Editor / Language Injections and, in the list of languages, find Session (org.hibernate).
In the Language column, Hibernate QL should be selected.
Double click on it and a list of operations will be displayed.
Select the operations that you need.
IDEA doesn't recognise which descriptor you are using. Check Project Structure -> Facets -> Hibernate. You should see a cfg.xml file under Descriptors. If you are using package scanning through a Spring session factory definition, you should see a session factory bean instead. If neither of them exists, you can add one.
I was looking over some code that I created a while ago and noticed something odd.
I am creating a Persistence Unit programmatically because the location of the database to read has to come from user input.
My code is as follows:
Map properties = new HashMap();
db = plan.db;
// Configure the internal EclipseLink connection pool
properties.put(TRANSACTION_TYPE, PersistenceUnitTransactionType.RESOURCE_LOCAL.name());
properties.put(JDBC_DRIVER, "net.ucanaccess.jdbc.UcanaccessDriver");
properties.put(JDBC_URL, "jdbc:ucanaccess://" + db + ";singleconnection=true;memory=true");
properties.put(JDBC_USER, "");
properties.put(JDBC_PASSWORD, "");
// properties.put( "provider" , "org.eclipse.persistence.jpa.PersistenceProvider");
EntityManagerFactory emf2;
EntityManager em2;
emf2 = Persistence.createEntityManagerFactory("PU", properties);
em2 = emf2.createEntityManager();
With this I was able to create my connections multiple times.
The problem I noticed is that I also had this in my persistence.xml:
<persistence-unit name="PU" transaction-type="RESOURCE_LOCAL">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<class>db.Items</class>
<exclude-unlisted-classes>false</exclude-unlisted-classes>
<properties>
<property name="javax.persistence.jdbc.url" value=""/>
<property name="javax.persistence.jdbc.user" value=""/>
<property name="javax.persistence.jdbc.driver" value="net.ucanaccess.jdbc.UcanaccessDriver"/>
<property name="javax.persistence.jdbc.password" value=""/>
</properties>
Now I noticed that I cannot find any way to add an "Entity Class" to this "Persistence Unit"; however, I was able to run my code fine just like this.
I'm curious: does it just overwrite the old properties and such from the Persistence Unit of the same name? It still uses the persistence class db.Items.
I just want to make sure that this is the correct way to do it.
I'm making changes to my code, so I cannot currently run it to see what happens if I delete everything in my persistence.xml, but I'm curious about this.
I also noticed that the "provider" property is commented out in my code. Do I need it there? (It's included in the xml file.)
There is also an example I saw that mentioned a "Server target" being set to "no" or something like that; any comments on that?
Thanks all
It overwrites the properties you have specified in persistence.xml. You can, for example, only set the user name and password this way and the other properties will be used as defined in the file, as in the sketch below. Whether it's "right" to do it this way I don't know, but I have done the same.
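A minimal sketch of that approach (the prompted-for variables are hypothetical; the unit name "PU" and the property keys are the ones from the posts above):

Map<String, String> overrides = new HashMap<>();
overrides.put("javax.persistence.jdbc.user", userFromPrompt);         // hypothetical user input
overrides.put("javax.persistence.jdbc.password", passwordFromPrompt); // hypothetical user input
// Driver, URL, provider and entity classes are still taken from persistence.xml.
EntityManagerFactory emf = Persistence.createEntityManagerFactory("PU", overrides);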
The call to Persistence.createEntityManagerFactory(unit, props) starts by searching for the named unit in any persistence.xml found on the classpath. The properties from props are then added to, or overwrite, the properties read from the file for that unit.
I have no comment about your other questions.
I'm getting the following error message from hibernate when attempting to insert a row into a table:
org.hibernate.exception.ConstraintViolationException: Column
'priority' cannot be null
I know that I could put a line into the code to set the value but there are many other instances where the program relies on the default value in the database (db is mysql).
I read somewhere that you can provide a default value in the hbm.xml file, but Hibernate is not recognizing it. Here's the corresponding section from JobQueue.hbm.xml:
<property name="priority" type="integer">
<column name="priority" default="0" />
</property>
I suppose another option would be to modify the JobQueue.java file that gets generated (I'm using eclipse hibernate tools to auto generate the hibernate classes) but for now I'd like to try to get the hbm.xml configuration to work.
I'm using version 4.1.3 of the hibernate libraries and eclipse hibernate tools 3.4.0.x.
default="0" is only relevant for SchemaExport which generates the database schema. other than that hibernate completely ignores this setting. you could try to set not-null="true" for the column.
Even if you are not able to recreate the whole database schema, you can set the default value at variable initialization.
In your model, set the priority to 0 in the initialization.
In your class:
private Integer priority = 0;
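A sketch of how that looks in the generated POJO (the class name follows the question; the rest of the class is assumed):

public class JobQueue implements java.io.Serializable {

    // New instances start with priority 0, so Hibernate never sends NULL for this column on insert.
    private Integer priority = 0;

    public Integer getPriority() {
        return priority;
    }

    public void setPriority(Integer priority) {
        this.priority = priority;
    }
}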
I ended up modifying the JobQueue.java POJO to set the default value. To make sure that the Code Generation of hibernate tools wouldn't overwrite this change, I set it up so that the code generation generates the files in a temp folder and then the necessary files are copied over to the permanent source location.
Is there a way to execute an SQL script after EclipseLink has generated the DDL?
In other words, is it possible to use the EclipseLink property "eclipselink.ddl-generation" with "drop-and-create-tables" and have EclipseLink execute another SQL file (to insert some data into the tables just created) after creating the table definitions?
I'm using EclipseLink 2.x and JPA 2.0 with GlassFish v3.
Or can I initialize the tables from a Java method that is called when the project (a war with EJB3) is deployed?
I came across this question for the same reasons, trying to find an approach to run an initialization script after DDL generation. I offer this answer to an old question in hopes of shortening the amount of "literary research" for those looking for the same solution.
I'm using GlassFish 4 with its default EclipseLink 2.5 JPA implementation. The new Schema Generation feature under JPA 2.1 makes it fairly straightforward to specify an "initialization" script after DDL generation is completed.
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.1"
xmlns="http://xmlns.jcp.org/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence http://xmlns.jcp.org/xml/ns/persistence/persistence_2_1.xsd">
<persistence-unit name="cbesDatabase" transaction-type="JTA">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<jta-data-source>java:app/jdbc/cbesPool</jta-data-source>
<properties>
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
<property name="javax.persistence.schema-generation.create-source" value="metadata"/>
<property name="javax.persistence.schema-generation.drop-source" value="metadata"/>
<property name="javax.persistence.sql-load-script-source" value="META-INF/sql/load_script.sql"/>
<property name="eclipselink.logging.level" value="FINE"/>
</properties>
</persistence-unit>
</persistence>
The above configuration generates DDL scripts from metadata (i.e. annotations), after which the META-INF/sql/load_script.sql script is run to populate the database. In my case, I seed a few tables with test data and generate additional views; a sketch of such a script follows.
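As a rough sketch, a load_script.sql of that kind might contain statements like the following (the table, column and view names are made up for illustration):

INSERT INTO country (code, name) VALUES ('US', 'United States');
INSERT INTO country (code, name) VALUES ('DE', 'Germany');
CREATE VIEW active_accounts AS SELECT a.id, a.name FROM account a WHERE a.active = TRUE;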
Additional information on EclipseLink's use of JPA's properties can be found in the DDL Generation section of EclipseLink/Release/2.5/JPA21. Likewise, Section 37.5 Database Schema Creation in Oracle's Java EE 7 Tutorial and TOTD #187 offer a quick introduction also.
Have a look at Running a SQL Script on startup in EclipseLink, which describes a solution presented as a kind of equivalent to Hibernate's import.sql feature [1]. Credits to Shaun Smith:
Running a SQL Script on startup in EclipseLink
Sometimes, when working with DDL generation it's useful to run a script to clean up the database first. In Hibernate, if you put a file called "import.sql" on your classpath its contents will be sent to the database. Personally I'm not a fan of magic filenames, but this can be a useful feature.

There's no built-in support for this in EclipseLink, but it's easy to do thanks to EclipseLink's high extensibility. Here's a quick solution I came up with: I simply register an event listener for the session postLogin event, and in the handler I read a file and send each SQL statement to the database--nice and clean. I went a little further and supported setting the name of the file as a persistence unit property. You can specify this all in code or in the persistence.xml.

The ImportSQL class is configured as a SessionCustomizer through a persistence unit property which, on the postLogin event, reads the file identified by the "import.sql.file" property. This property is also specified as a persistence unit property which is passed to createEntityManagerFactory. This example also shows how you can define and use your own persistence unit properties.
import java.util.HashMap;
import java.util.Map;

import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

import org.eclipse.persistence.config.PersistenceUnitProperties;
import org.eclipse.persistence.config.SessionCustomizer;
import org.eclipse.persistence.sessions.Session;
import org.eclipse.persistence.sessions.SessionEvent;
import org.eclipse.persistence.sessions.SessionEventAdapter;
import org.eclipse.persistence.sessions.UnitOfWork;

public class ImportSQL implements SessionCustomizer {

    private void importSql(UnitOfWork unitOfWork, String fileName) {
        // Open file
        // Execute each line, e.g.,
        // unitOfWork.executeNonSelectingSQL("select 1 from dual");
    }

    @Override
    public void customize(Session session) throws Exception {
        session.getEventManager().addListener(new SessionEventAdapter() {
            @Override
            public void postLogin(SessionEvent event) {
                String fileName = (String) event.getSession().getProperty("import.sql.file");
                UnitOfWork unitOfWork = event.getSession().acquireUnitOfWork();
                importSql(unitOfWork, fileName);
                unitOfWork.commit();
            }
        });
    }

    public static void main(String[] args) {
        Map<String, Object> properties = new HashMap<String, Object>();
        // Enable DDL Generation
        properties.put(PersistenceUnitProperties.DDL_GENERATION, PersistenceUnitProperties.DROP_AND_CREATE);
        properties.put(PersistenceUnitProperties.DDL_GENERATION_MODE, PersistenceUnitProperties.DDL_DATABASE_GENERATION);
        // Configure Session Customizer which will pipe sql file to db before DDL Generation runs
        properties.put(PersistenceUnitProperties.SESSION_CUSTOMIZER, "model.ImportSQL");
        properties.put("import.sql.file", "/tmp/someddl.sql");
        EntityManagerFactory emf = Persistence
                .createEntityManagerFactory("employee", properties);
    }
}
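The importSql body is only stubbed out in the original post. A rough sketch of one way to fill it in, assuming the script keeps one statement per line and contains no comments, could be:

// Needs java.io.BufferedReader, java.io.FileReader and java.io.IOException imports.
private void importSql(UnitOfWork unitOfWork, String fileName) {
    try (BufferedReader reader = new BufferedReader(new FileReader(fileName))) {
        String line;
        while ((line = reader.readLine()) != null) {
            line = line.trim();
            if (line.isEmpty()) {
                continue; // skip blank lines
            }
            if (line.endsWith(";")) {
                line = line.substring(0, line.length() - 1); // executeNonSelectingSQL takes the bare statement
            }
            unitOfWork.executeNonSelectingSQL(line);
        }
    } catch (IOException e) {
        throw new RuntimeException("Could not execute import script " + fileName, e);
    }
}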
I'm not sure it's a strict equivalent though; I'm not sure the script will run after the database generation. Testing is required. If it doesn't, maybe it can be adapted.
[1] Hibernate has a neat little feature that is heavily under-documented and unknown. You can execute an SQL script during SessionFactory creation, right after the database schema generation, to import data into a fresh database. You just need to add a file named import.sql to your classpath root and set either create or create-drop as your hibernate.hbm2ddl.auto property.
This might help, as there is some confusion here:
Use exactly the same set of properties (except the logger) for data seeding.
DO NOT USE:
<property name="eclipselink.ddl-generation" value="create-tables"/>
<property name="eclipselink.ddl-generation.output-mode" value="database"/>
DO USE:
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
<property name="javax.persistence.schema-generation.create-source" value="metadata"/>
<property name="javax.persistence.schema-generation.drop-source" value="metadata"/>
I confirm this worked for me.
:) Just substitute with your own data:
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
<property name="javax.persistence.schema-generation.create-source" value="metadata-then-script"/>
<property name="javax.persistence.sql-load-script-source" value="META-INF/seed.sql"/>
It is called BEFORE DDL execution, and there seems to be no nice way to adapt it, as there is no suitable event one could use.
This process executes SQL before the DDL statements, whereas what would be nice (for example, to insert seed data) is something that executes after the DDL statements. I don't know if I am missing something here. Can somebody please tell me how to execute SQL AFTER EclipseLink has created the tables (when the create-tables property is set to true)?