I'm looking for an easy way to check for inconsistencies between entity and table definitions in my JPA application.
After changing a table definition (e.g. renaming a column, changing its type, adding a new column, deleting a column), I sometimes forget to change the entity definition.
So I'd like to be notified when the entity and table definitions are inconsistent.
Is such a tool available? An Eclipse plugin is preferable, but others are also worth considering.
I know about Dali, but that tool does not suit me because I would have to modify its output.
(I'm using class inheritance as in this question, and so on.)
Your JPA implementation should provide a property in persistence.xml to do this for you. For example, Hibernate provides the hibernate.hbm2ddl.auto property, which allows you to create, update, or just validate the schema.
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<persistence ...>
    <persistence-unit ...>
        <provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>
        <properties>
            <!-- ... -->
            <property name="hibernate.hbm2ddl.auto" value="validate"/>
        </properties>
    </persistence-unit>
</persistence>
This triggers schema validation when the EntityManager is initialized.
Check your JPA implementation's documentation to find the equivalent property.
Good luck!
I am using Camel and OpenJPA as the persistence provider, but I don't want ALTER statements to be run in production.
A snapshot of my persistence.xml:
<persistence-unit name="camel-openjpa-oracle-alert" transaction-type="RESOURCE_LOCAL">
.
.
<provider>
org.apache.openjpa.persistence.PersistenceProviderImpl
</provider>
<properties>
<property name="openjpa.jdbc.SynchronizeMappings" value="buildSchema(ForeignKeys=false)" />
</properties>
.
.
</persistence-unit>
What value do we have to put for openjpa.jdbc.SynchronizeMappings so that ALTER commands are not executed?
I searched but was unable to find any such value.
It would be nice to know a little more about what you are doing and why you need to use SynchronizeMappings. The fact that you use ForeignKeys=true tells me you want OpenJPA to read your schema and determine if you have any database FKs defined (i.e. so OpenJPA knows about these FKs and can order SQL properly to honor parent/child FK constraints). This is a perfectly valid use of SynchronizeMappings. However, by using 'buildSchema', you are specifically telling OpenJPA to make "the database schema match your existing mappings"... this comment is lifted from this OpenJPA doc:
http://openjpa.apache.org/builds/1.2.3/apache-openjpa/docs/ref_guide_mapping.html#ref_guide_mapping_synch
Therefore, you are specifically telling OpenJPA to update your database schema. You can remove the 'buildSchema' action if you don't want OpenJPA to update your schema to match your domain model.
Or you could use 'validate' in place of 'buildSchema'; however, as the above doc states, OpenJPA will throw an exception if it finds a schema/domain mismatch, which may not be what you want. I suggest you read the above doc and look at the options available to you; a sketch of the 'validate' variant follows.
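For illustration, a sketch of that 'validate' alternative, keeping the ForeignKeys option from the question's snippet (the exact set of supported actions may vary by OpenJPA version, so check the mapping guide linked above):

<properties>
    <!-- 'validate' throws an exception on a schema/mapping mismatch
         instead of issuing ALTER statements -->
    <property name="openjpa.jdbc.SynchronizeMappings" value="validate(ForeignKeys=false)" />
</properties>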
Thanks,
Heath Thomann
First of all, I'm using EclipseLink 2.5.2, ojdbc6, spring-orm 4.1.1, and QueryDSL 3.7.1.
I don't understand why my objects are not in my persistence context (or is this how it should be?).
I'm using QueryDSL to query my objects; however, when I try to persist such an object using entitymanager.persist(), it always creates an INSERT statement, resulting in a duplicate primary key exception.
Calling refresh() on the object crashes with java.lang.IllegalArgumentException: Can not refresh not managed object. Using merge() works fine; however, that's not what I want, because I need to keep my original reference to the saved object.
persistence.xml
<persistence version="2.0"
xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
<persistence-unit name="XXXXXX"
transaction-type="RESOURCE_LOCAL">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<jta-data-source>jdbc/XXXXX</jta-data-source>
<exclude-unlisted-classes>false</exclude-unlisted-classes>
<properties>
<property name="eclipselink.weaving" value="static" />
<property name="eclipselink.target-database" value="Oracle11" />
</properties>
</persistence-unit>
</persistence>
The EntityManager used to create the JPAQuery and to refresh/merge/persist is the same.
If you need more information/configuration/etc., please leave a comment. I'm really stuck and can't wrap my head around what the reason could be or what other information could be useful to you guys.
EntityManager.persist() is used to make a transient instance persistent. In this case, transient (a term used by Hibernate, but valid for other persistence providers as well) means an entity which doesn't have a representation in the persistence context or the underlying datastore. It's not meant to be used on entities already present in the database. Use merge() to update persistent entities, as in the sketch below.
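A minimal sketch of the distinction (the Customer entity and its assigned ids are hypothetical; note that merge() returns the managed copy instead of managing the reference passed to it):

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

public class PersistVsMerge {
    public static void main(String[] args) {
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("XXXXXX");
        EntityManager em = emf.createEntityManager();
        em.getTransaction().begin();

        Customer fresh = new Customer(1L, "Alice");  // transient: no row with this id yet
        em.persist(fresh);                           // schedules an INSERT

        Customer detached = new Customer(2L, "Bob"); // assumes a row with id 2 already exists
        Customer managed = em.merge(detached);       // copies state, issues an UPDATE on flush
        // keep working with 'managed'; 'detached' itself stays unmanaged

        em.getTransaction().commit();
        em.close();
        emf.close();
    }
}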
There is an article about the subject with a nice state diagram representing the states an entity can be in and the transitions between those states.
How can I set up the OpenJPA cache so that it works only for chosen entities? Maybe I need to use some annotation on them?
My persistence.xml contains:
<property name="openjpa.DataCache" value="true"/>
<property name="openjpa.RemoteCommitProvider" value="sjvm"/>
but those settings work for all my entities (tables), so I want to cache, for example, only this table:
@Entity(name = "IsoCountryCodes")
@Table(name = "ISO_COUNTRY_CODES", schema = "ANALYSIS")
@DataCache(timeout=120000)
public class IsoCountryCodes implements Serializable {
    ....
}
But @DataCache doesn't fix it; it only sets the timeout of the cached entries.
UPDATE:
I cannot use OpenJPA 2.0 because my project is deployed on WebLogic 10.3.6, which provides Kodo OpenJPA 1.3.
I also tried to include only chosen entities by adding the property:
<property name="openjpa.DataCache" value="true(Types=foo.bar.FullTimeEmployee)"/>
but got this error:
org.apache.openjpa.lib.util.ParseException: There was an error while setting up the configuration plugin option "DataCache". The plugin was of type "class kodo.datacache.KodoConcurrentDataCache". The plugin property "Type" had no corresponding setter method or accessible field. All possible plugin properties are: [CacheSize, EvictionSchedule, FailFast, NAME_DEFAULT, Name, SoftReferenceSize].
Can you help me? Maybe you know other ways to exclude or include entities from caching, maybe with Ehcache?
<property name="openjpa.DataCache" value="true"/>
That enables the L2 cache for all entities. If you are using JPA 2.0, try adding <shared-cache-mode>ENABLE_SELECTIVE</shared-cache-mode> to your persistence.xml to turn the cache on selectively. Also, replace the @DataCache annotation with a @javax.persistence.Cacheable annotation, as sketched below.
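A sketch of that combination, assuming a JPA 2.0-capable OpenJPA runtime (the persistence unit name is hypothetical, and the entity body is elided):

<persistence-unit name="my-unit">
    <shared-cache-mode>ENABLE_SELECTIVE</shared-cache-mode>
    <properties>
        <property name="openjpa.DataCache" value="true"/>
        <property name="openjpa.RemoteCommitProvider" value="sjvm"/>
    </properties>
</persistence-unit>

@javax.persistence.Cacheable // only @Cacheable entities are cached under ENABLE_SELECTIVE
@Entity(name = "IsoCountryCodes")
@Table(name = "ISO_COUNTRY_CODES", schema = "ANALYSIS")
public class IsoCountryCodes implements Serializable {
    ....
}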
I have two Java applications, manager and server, that use a shared library, domain. The Hibernate mapping is defined through annotations in the source of domain.
What I'm trying to accomplish is overriding the caching strategy of server to 'read-only' for some performance tests. Following the annotations documentation, I came to the point where I was trying:
<?xml version="1.0" encoding="UTF-8"?>
<entity-mappings
xmlns="http://java.sun.com/xml/ns/persistence/orm"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://java.sun.com/xml/ns/persistence/orm orm_2_0.xsd"
version="2.0">
<entity class="com.example.data.Entity" metadata-complete="false">
<cache usage="read-only"/>
</entity>
...
</entity-mappings>
I was trying to mirror the annotation markup, mixing it with those instructions. The problem is, there is no cache element allowed inside entity, and I get:
Error parsing XML (line66 : column 108): cvc-complex-type.2.4.a: Invalid content was found starting with element 'cache'. One of '{"http://java.sun.com/xml/ns/persistence/orm":description, "http://java.sun.com/xml/ns/persistence/orm":table, "http://java.sun.com/xml/ns/persistence/orm":secondary-table, "http://java.sun.com/xml/ns/persistence/orm":primary-key-join-column, "http://java.sun.com/xml/ns/persistence/orm":id-class, "http://java.sun.com/xml/ns/persistence/orm":inheritance, "http://java.sun.com/xml/ns/persistence/orm":discriminator-value, "http://java.sun.com/xml/ns/persistence/orm":discriminator-column, "http://java.sun.com/xml/ns/persistence/orm":sequence-generator, "http://java.sun.com/xml/ns/persistence/orm":table-generator, "http://java.sun.com/xml/ns/persistence/orm":named-query, "http://java.sun.com/xml/ns/persistence/orm":named-native-query, "http://java.sun.com/xml/ns/persistence/orm":sql-result-set-mapping, "http://java.sun.com/xml/ns/persistence/orm":exclude-default-listeners, "http://java.sun.com/xml/ns/persistence/orm":exclude-superclass-listeners, "http://java.sun.com/xml/ns/persistence/orm":entity-listeners, "http://java.sun.com/xml/ns/persistence/orm":pre-persist, "http://java.sun.com/xml/ns/persistence/orm":post-persist, "http://java.sun.com/xml/ns/persistence/orm":pre-remove, "http://java.sun.com/xml/ns/persistence/orm":post-remove, "http://java.sun.com/xml/ns/persistence/orm":pre-update, "http://java.sun.com/xml/ns/persistence/orm":post-update, "http://java.sun.com/xml/ns/persistence/orm":post-load, "http://java.sun.com/xml/ns/persistence/orm":attribute-override, "http://java.sun.com/xml/ns/persistence/orm":association-override, "http://java.sun.com/xml/ns/persistence/orm":attributes}' is expected.
When I try to return to the hibernate-core XML markup, I get errors that the <class> definition is incomplete, and I can't seem to find a metadata-complete analog. Does anyone know if what I'm trying to do is possible? And if so, what steps am I going to have to take? Thanks!
Is there a possibility to execute an SQL script after EclipseLink has generated the DDL?
In other words, is it possible to use the EclipseLink property "eclipselink.ddl-generation" with "drop-and-create-tables" and have EclipseLink execute another SQL file (to insert some data into the tables just created) after creating the table definitions?
I'm using EclipseLink 2.x and JPA 2.0 with GlassFish v3.
Or can I initialize the tables within a Java method that is called on deployment of the project (a WAR with EJB3)?
I came across this question for the same reasons, trying to find an approach to run an initialization script after DDL generation. I offer this answer to an old question in hopes of shortening the amount of "literary research" for those looking for the same solution.
I'm using GlassFish 4 with its default EclipseLink 2.5 JPA implementation. The new Schema Generation feature under JPA 2.1 makes it fairly straightforward to specify an "initialization" script after DDL generation is completed.
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.1"
xmlns="http://xmlns.jcp.org/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence http://xmlns.jcp.org/xml/ns/persistence/persistence_2_1.xsd">
<persistence-unit name="cbesDatabase" transaction-type="JTA">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<jta-data-source>java:app/jdbc/cbesPool</jta-data-source>
<properties>
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
<property name="javax.persistence.schema-generation.create-source" value="metadata"/>
<property name="javax.persistence.schema-generation.drop-source" value="metadata"/>
<property name="javax.persistence.sql-load-script-source" value="META-INF/sql/load_script.sql"/>
<property name="eclipselink.logging.level" value="FINE"/>
</properties>
</persistence-unit>
</persistence>
The above configuration generates DDL from metadata (i.e. annotations), after which the META-INF/sql/load_script.sql script is run to populate the database. In my case, I seed a few tables with test data and generate additional views; a sketch of such a script is shown below.
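For reference, a sketch of what such a load script might contain (the table and view names are hypothetical, and statement-delimiting rules can vary by provider, so check the EclipseLink documentation):

-- META-INF/sql/load_script.sql
INSERT INTO COUNTRY (CODE, NAME) VALUES ('US', 'United States');
INSERT INTO COUNTRY (CODE, NAME) VALUES ('DE', 'Germany');
CREATE VIEW EU_COUNTRY AS SELECT * FROM COUNTRY WHERE CODE IN ('DE');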
Additional information on EclipseLink's use of JPA's properties can be found in the DDL Generation section of EclipseLink/Release/2.5/JPA21. Likewise, Section 37.5 "Database Schema Creation" in Oracle's Java EE 7 Tutorial and TOTD #187 also offer a quick introduction.
Have a look at Running a SQL Script on startup in EclipseLink, which describes a solution presented as a kind of equivalent to Hibernate's import.sql feature [1]. Credits to Shaun Smith:
Running a SQL Script on startup in EclipseLink
Sometimes, when working with DDL generation, it's useful to run a script to clean up the database first. In Hibernate, if you put a file called "import.sql" on your classpath, its contents will be sent to the database. Personally I'm not a fan of magic filenames, but this can be a useful feature.

There's no built-in support for this in EclipseLink, but it's easy to do thanks to EclipseLink's high extensibility. Here's a quick solution I came up with: I simply register an event listener for the session postLogin event, and in the handler I read a file and send each SQL statement to the database--nice and clean. I went a little further and supported setting the name of the file as a persistence unit property. You can specify this all in code or in the persistence.xml.

The ImportSQL class is configured as a SessionCustomizer through a persistence unit property which, on the postLogin event, reads the file identified by the "import.sql.file" property. This property is also specified as a persistence unit property which is passed to createEntityManagerFactory. This example also shows how you can define and use your own persistence unit properties.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

import org.eclipse.persistence.config.PersistenceUnitProperties;
import org.eclipse.persistence.config.SessionCustomizer;
import org.eclipse.persistence.sessions.Session;
import org.eclipse.persistence.sessions.SessionEvent;
import org.eclipse.persistence.sessions.SessionEventAdapter;
import org.eclipse.persistence.sessions.UnitOfWork;

public class ImportSQL implements SessionCustomizer {

    private void importSql(UnitOfWork unitOfWork, String fileName) {
        // Read the file and send each ';'-terminated statement to the database,
        // e.g. unitOfWork.executeNonSelectingSQL("select 1 from dual");
        try (BufferedReader reader = new BufferedReader(new FileReader(fileName))) {
            StringBuilder statement = new StringBuilder();
            String line;
            while ((line = reader.readLine()) != null) {
                statement.append(line).append(' ');
                if (line.trim().endsWith(";")) {
                    String sql = statement.toString().trim();
                    // Drop the trailing ';' before executing
                    unitOfWork.executeNonSelectingSQL(sql.substring(0, sql.length() - 1));
                    statement.setLength(0);
                }
            }
        } catch (IOException e) {
            throw new RuntimeException("Could not import SQL from " + fileName, e);
        }
    }

    @Override
    public void customize(Session session) throws Exception {
        session.getEventManager().addListener(new SessionEventAdapter() {
            @Override
            public void postLogin(SessionEvent event) {
                String fileName = (String) event.getSession().getProperty("import.sql.file");
                UnitOfWork unitOfWork = event.getSession().acquireUnitOfWork();
                importSql(unitOfWork, fileName);
                unitOfWork.commit();
            }
        });
    }

    public static void main(String[] args) {
        Map<String, Object> properties = new HashMap<String, Object>();
        // Enable DDL generation
        properties.put(PersistenceUnitProperties.DDL_GENERATION, PersistenceUnitProperties.DROP_AND_CREATE);
        properties.put(PersistenceUnitProperties.DDL_GENERATION_MODE, PersistenceUnitProperties.DDL_DATABASE_GENERATION);
        // Configure the session customizer, which will pipe the SQL file to the database
        // before DDL generation runs
        properties.put(PersistenceUnitProperties.SESSION_CUSTOMIZER, "model.ImportSQL");
        properties.put("import.sql.file", "/tmp/someddl.sql");
        EntityManagerFactory emf = Persistence
                .createEntityManagerFactory("employee", properties);
    }
}
I'm not sure it's a strict equivalent, though; I'm not sure the script will run after the database generation. Testing is required. If it doesn't, maybe it can be adapted.
[1] Hibernate has a neat little feature that is heavily under-documented and unknown. You can execute an SQL script during SessionFactory creation, right after the database schema generation, to import data into a fresh database. You just need to add a file named import.sql to your classpath root and set either create or create-drop as your hibernate.hbm2ddl.auto property.
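A minimal sketch of that feature (the statement is hypothetical; by default, Hibernate executes each line of import.sql as a single statement):

-- src/main/resources/import.sql, which ends up at the classpath root
INSERT INTO country (code, name) VALUES ('US', 'United States');

together with, in persistence.xml:

<property name="hibernate.hbm2ddl.auto" value="create"/>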
This might help, as there is some confusion here:
Use exactly the same set of properties (except the logger) for data seeding.
DO NOT USE:
<property name="eclipselink.ddl-generation" value="create-tables"/>
<property name="eclipselink.ddl-generation.output-mode" value="database"/>
DO USE:
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
<property name="javax.persistence.schema-generation.create-source" value="metadata"/>
<property name="javax.persistence.schema-generation.drop-source" value="metadata"/>
I confirm this worked for me.
Just substitute your data. :)
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
<property name="javax.persistence.schema-generation.create-source" value="metadata-then-script"/>
<property name="javax.persistence.sql-load-script-source" value="META-INF/seed.sql"/>
It is called BEFORE DDL execution, and there seems to be no nice way to adapt it, as there is no suitable event one could use.
This process offers executing SQL before the DDL statements, whereas what would be nice (for example, to insert seed data) is something that executes after the DDL statements. I don't know if I am missing something here. Can somebody please tell me how to execute SQL AFTER EclipseLink has created the tables (when the create-tables property is set to true)?