Persist always generates an insert query - java

First of all, I'm using EclipseLink 2.5.2, ojdbc6, spring-orm 4.1.1, and QueryDSL 3.7.1.
I don't understand why my objects are not in my persistence context (or is this how it should be?).
I'm using QueryDSL to query my objects, but when I try to persist such an object using EntityManager.persist(), it always creates an INSERT statement, resulting in a duplicate primary key exception.
Calling refresh() on the object crashes with java.lang.IllegalArgumentException: Can not refresh not managed object. Using merge() works fine, but that's not what I want: I need to keep my original reference to the saved object.
persistence.xml
<persistence version="2.0"
    xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
    <persistence-unit name="XXXXXX" transaction-type="RESOURCE_LOCAL">
        <provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
        <jta-data-source>jdbc/XXXXX</jta-data-source>
        <exclude-unlisted-classes>false</exclude-unlisted-classes>
        <properties>
            <property name="eclipselink.weaving" value="static" />
            <property name="eclipselink.target-database" value="Oracle11" />
        </properties>
    </persistence-unit>
</persistence>
The EntityManager used to create the JPAQuery and the one used to refresh/merge/persist are the same.
If you need more information/configuration/etc., please leave a comment. I'm really stuck and can't wrap my head around what the reason could be or what other information might be useful to you.

EntityManager.persist() is used to make a transient instance persistent. Here, transient (a term used by Hibernate, but valid for other persistence providers as well) means an entity which doesn't have a representation in the persistence context or the underlying datastore. It's not meant to be used on entities already present in the database; use merge() to update persistent entities.
There is an article on the subject with a nice state diagram representing the states an entity can be in and the transitions between those states.
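For instance, a minimal sketch (the User entity and method are illustrative): merge() copies the detached state onto a managed instance and returns it, and that returned reference is the one to keep working with.
public User save(EntityManager em, User detached) {
    // persist() would schedule an INSERT for a row that already exists,
    // hence the duplicate primary key exception; merge() updates instead
    User managed = em.merge(detached);
    return managed; // keep using this reference from now on
}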

Related

How to stop persistence from altering database, JPA

I am using Camel and OpenJPA as the persistence provider, but I don't want ALTER statements to be run on production.
Snapshot of persistence.xml
<persistence-unit name="camel-openjpa-oracle-alert" transaction-type="RESOURCE_LOCAL">
.
.
<provider>
org.apache.openjpa.persistence.PersistenceProviderImpl
</provider>
<properties>
<property name="openjpa.jdbc.SynchronizeMappings" value="buildSchema(ForeignKeys=false)" />
</properties>
.
.
</persistence-unit>
What value do we have to put for openjpa.jdbc.SynchronizeMappings so that ALTER commands are not executed?
I searched but was unable to find any such value.
It would be nice to know a little more about what you are doing and why you need to use SynchronizeMappings. The fact that you use the ForeignKeys argument tells me you want OpenJPA to read your schema and determine whether you have any database FKs defined (i.e. so OpenJPA knows about these FKs and can order SQL properly to honor parent/child FK constraints). This is a perfectly valid use of SynchronizeMappings. However, by using 'buildSchema', you are specifically telling OpenJPA to make "the database schema match your existing mappings"; that comment is lifted from this OpenJPA doc:
http://openjpa.apache.org/builds/1.2.3/apache-openjpa/docs/ref_guide_mapping.html#ref_guide_mapping_synch
Therefore, you are specifically telling OpenJPA to update your database schema. You can remove 'buildSchema' if you don't want OpenJPA to update your schema to match your domain model. That is, try removing the property altogether:
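For instance, a sketch of the unit with the property simply dropped (illustrative, not your exact configuration):
<persistence-unit name="camel-openjpa-oracle-alert" transaction-type="RESOURCE_LOCAL">
    <provider>org.apache.openjpa.persistence.PersistenceProviderImpl</provider>
    <properties>
        <!-- openjpa.jdbc.SynchronizeMappings removed: OpenJPA will no longer
             issue CREATE/ALTER statements against the schema -->
    </properties>
</persistence-unit>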
Or you could use 'validate' in place of 'buildSchema'. However, as the above doc states, OpenJPA will throw an exception if it finds a schema/domain mismatch, which may not be what you want. I suggest you read the above doc and look at the options available to you.
Thanks,
Heath Thomann

Including an Entity Class programmatically with a persistence unit?

I was looking over some code that I created a while ago and noticed something odd.
I am creating a persistence unit programmatically because I need user input as to the location of the database to read.
My code is as follows
// assuming EclipseLink's property-name constants:
import static org.eclipse.persistence.config.PersistenceUnitProperties.*;

Map<String, Object> properties = new HashMap<>();
db = plan.db;
// Configure the internal EclipseLink connection pool
properties.put(TRANSACTION_TYPE, PersistenceUnitTransactionType.RESOURCE_LOCAL.name());
properties.put(JDBC_DRIVER, "net.ucanaccess.jdbc.UcanaccessDriver");
properties.put(JDBC_URL, "jdbc:ucanaccess://" + db + ";singleconnection=true;memory=true");
properties.put(JDBC_USER, "");
properties.put(JDBC_PASSWORD, "");
// properties.put("provider", "org.eclipse.persistence.jpa.PersistenceProvider");
EntityManagerFactory emf2;
EntityManager em2;
emf2 = Persistence.createEntityManagerFactory("PU", properties);
em2 = emf2.createEntityManager();
With this I was able to create my connections multiple times.
The problem I noticed is that I also had code in my "Persistence.xml"
<persistence-unit name="PU" transaction-type="RESOURCE_LOCAL">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<class>db.Items</class>
<exclude-unlisted-classes>false</exclude-unlisted-classes>
<properties>
<property name="javax.persistence.jdbc.url" value=""/>
<property name="javax.persistence.jdbc.user" value=""/>
<property name="javax.persistence.jdbc.driver" value="net.ucanaccess.jdbc.UcanaccessDriver"/>
<property name="javax.persistence.jdbc.password" value=""/>
</properties>
Now, I cannot find any way to add an entity class to this persistence unit programmatically, yet my code ran fine just like this.
I'm curious whether it just overwrites the old properties and such from the persistence unit of the same name? It still uses the entity class db.Items.
I just want to make sure that this is the correct way to do it.
I'm making changes to my code, so I cannot currently run it to see what happens if I delete everything in my persistence.xml, but I'm curious about this.
I also noticed that the "provider" property was commented out. Do I need to set it? (It's included in the XML file.)
There is also an example I saw that mentioned setting "Server target" to "no" or something like that. Any comments on that?
Thanks all
It overwrites the properties you have specified in persistence.xml. You can, for example, set only the user name and password this way, and the other properties will be used as defined in the file. Whether it's "right" to do it this way I don't know, but I have done the same.
The call to Persistence.createEntityManagerFactory(unit, props) starts by searching for the named unit in any persistence.xml found on the classpath. The properties from props are then added to, or override, the properties read from the file for that unit.
I have no comment about your other questions.
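For example, a minimal sketch of overriding just the credentials while persistence.xml supplies everything else (the keys are the standard JPA 2.0 property names; the values are illustrative):
import java.util.HashMap;
import java.util.Map;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

Map<String, String> overrides = new HashMap<>();
overrides.put("javax.persistence.jdbc.user", "someUser");
overrides.put("javax.persistence.jdbc.password", "secret");
// driver, URL, provider, and entity classes still come from persistence.xml
EntityManagerFactory emf = Persistence.createEntityManagerFactory("PU", overrides);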

Inconsistency checking between entity and table

I'm looking for an easy way to check for inconsistency between an entity and its table in my JPA application.
After changing a table definition (e.g. column name, type, adding a new column, deleting a column), I sometimes forget to change the entity definition.
So I'd like to be notified if the entity and table definitions are inconsistent.
Is such a tool available? An Eclipse plugin is preferable, but other options are also worth considering.
I know about Dali, but this tool does not suit me because I would have to modify Dali's output.
(I'm using class inheritance, as in this question, and so on.)
Your JPA implementation should provide a property in persistence.xml to do it for you. For example, Hibernate provides the hibernate.hbm2ddl.auto property, which allows you to create the schema, update it, or just validate it.
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<persistence ...>
    <persistence-unit ...>
        <provider>org.hibernate.jpa.HibernatePersistenceProvider</provider>
        <properties>
            <!-- ... -->
            <property name="hibernate.hbm2ddl.auto" value="validate"/>
        </properties>
    </persistence-unit>
</persistence>
This runs the schema validation process on EntityManager initialization.
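The same property can also be passed programmatically at bootstrap; a minimal sketch (the unit name is a placeholder):
import java.util.HashMap;
import java.util.Map;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

Map<String, String> props = new HashMap<>();
props.put("hibernate.hbm2ddl.auto", "validate");
// bootstrapping fails fast if the mapped entities and the tables disagree
EntityManagerFactory emf = Persistence.createEntityManagerFactory("myPU", props);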
Check on your current JPA implementation documentation to find the equivalent property.
Good luck!

Getting old data with JPA

I'm getting old data with JPA, even though I disabled the cache. I guess it's because the resource is configured to be RESOURCE_LOCAL, but I'm not sure.
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.0" xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
<persistence-unit name="AppPU" transaction-type="RESOURCE_LOCAL">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<class>com.myentities.User</class>
<properties>
<property name="javax.persistence.jdbc.url" value="jdbc:mysql://127.0.0.1:3306/mydatabase"/>
<property name="javax.persistence.jdbc.password" value="*****"/>
<property name="javax.persistence.jdbc.driver" value="com.mysql.jdbc.Driver"/>
<property name="javax.persistence.jdbc.user" value="user1"/>
<property name="eclipselink.logging.level" value="FINEST"/>
</properties>
</persistence-unit>
</persistence>
My code that is getting old info about the user:
public List<User> findAll(App app) {
    getEntityManager().getTransaction().begin();
    Query q = getEntityManager().createQuery("SELECT t1 FROM User t1 WHERE t1.app.idApp=:idApp");
    q.setParameter("idApp", app.getIdApp());
    getEntityManager().flush();
    getEntityManager().getTransaction().commit();
    List resultList = q.getResultList();
    return resultList;
}
My entity:
@Entity
@Table(name = "user")
@Cache(type = CacheType.NONE)
public class User implements Serializable {
    // some attributes
}
Does anybody have an idea of what is going on?
UPDATE 1
The begin, flush, and commit calls were just acts of desperation! I know they're not needed.
I forgot to say something important: the test I run is to add a user record directly in the database console and then try to see it through my webapp, which does not show the new user. That is the "old data" I mentioned: it only displays the "old" users.
I already tried putting this in persistence.xml and didn't see any difference in the results:
<property name="eclipselink.cache.shared.default" value="false"/>
<property name="eclipselink.cache.size.default" value="0"/>
<property name="eclipselink.cache.type.default" value="None"/>
So it is something else…
There are a few suggestions posted already, such as ensuring the shared cache is off and managing back references so that the cache is consistent. These address specific situations that could be occurring, but you have not provided enough detail to say what is really happening.
Another specific cause, which seems possible based on your getEntityManager() usage, is reusing the EntityManager instance without clearing it. The EntityManager holds references to all managed entities, since the EM is required to return the same instance on subsequent query and find calls to maintain identity.
If this is not done already, you will want to clear the EntityManager or obtain a new one at certain points to release the memory and the managed entities.
First off, don't use
@Cache(type=CacheType.NONE)
or
<property name="eclipselink.cache.size.default" value="0"/>
or
<property name="eclipselink.cache.type.default" value="None"/>
Just set
@Cache(shared=false)
or
<property name="eclipselink.cache.shared.default" value="false"/>
Second, where is your EntityManager coming from? Do you create a new one per request/transaction? If you don't, then everything read in the EntityManager will be in its (L1) cache. You need to call clear() or create a new one.
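For example, a minimal sketch of the findAll() above rewritten with one EntityManager per call (assuming an application-managed EntityManagerFactory):
public List<User> findAll(EntityManagerFactory emf, App app) {
    EntityManager em = emf.createEntityManager();
    try {
        return em.createQuery(
                "SELECT t1 FROM User t1 WHERE t1.app.idApp = :idApp", User.class)
                .setParameter("idApp", app.getIdApp())
                .getResultList();
    } finally {
        em.close(); // nothing lingers in the L1 cache between calls
    }
}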
Use
<property name="eclipselink.cache.shared.default" value="false"/>
<property name="eclipselink.cache.size.default" value="0"/>
<property name="eclipselink.cache.type.default" value="None"/>
or
@Cache(shared=false)
As opposed to the caching answer (which I will have to try), you're likely running into a situation where your referenced entity isn't updated.
@Entity
class Parent {
    @OneToOne(cascade = CascadeType.ALL) // or only MERGE, whatever your needs are
    Child child;
}

@Entity
class Child {
    Parent parent;
    // ... values
}
Upon saving the Child you need to update your reference to Parent so that the Memory Model (cache) matches the database. It is fairly frustrating, but the way I've dealt with this is to cascade only from the parent.
public void saveChild(Child child) {
    child.getParent().setChild(child); // or DTO code, whatever
    entityManager.merge(child.getParent()); // cascades to the child
    // If you're manually cascading (why?):
    // entityManager.merge(child);
}
This will cascade if you set it up. What I've seen is that the reverse cascade (the child merge causing a cascade to the parent) has not been reliable, which may stem from my lack of knowledge on the subject.
In short: if you handle the merge in your data layer explicitly, the problem goes away. I'm reluctant to disable caching, as that could have a significant impact in large applications; thus, I went this route. Good luck, and please let us know your approach.
1) Refine the code:
public List<User> findAll(App app) {
    Query q = getEntityManager().createQuery("SELECT t1 FROM User t1 WHERE t1.app.idApp=:idApp");
    q.setParameter("idApp", app.getIdApp());
    List resultList = q.getResultList();
    return resultList;
}
2) Remove @Cache(type=CacheType.NONE) from your entity class.
3) No need to change persistence.xml
The usage of the EntityManager is the key. I reached the perfect solution after months:
Use a common DEFAULT entity manager for all READS of an entity. This means creating a separate entity manager per entity type.
Create a new entity manager for each write/update/delete operation on each entity. Use begin/commit for the transaction of that new entity manager, and close it after the operation finishes.
The key point: clear the DEFAULT (reader) entity manager after you commit and close the writer entity manager. That means clearing only after writes, not before each read.
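A minimal sketch of that pattern (names are illustrative):
// long-lived EntityManager used only for reads of this entity
EntityManager readerEm = emf.createEntityManager();

public void updateUser(User user) {
    EntityManager writerEm = emf.createEntityManager(); // fresh EM per write
    writerEm.getTransaction().begin();
    writerEm.merge(user);
    writerEm.getTransaction().commit();
    writerEm.close();
    readerEm.clear(); // clear AFTER the write so subsequent reads hit the database
}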

Execute sql script after jpa/EclipseLink created tables?

Is there a possibility to execute an SQL script after EclipseLink has generated the DDL?
In other words, is it possible to use the EclipseLink property "eclipselink.ddl-generation" with "drop-and-create-tables" and have EclipseLink execute another SQL file (to insert some data into the tables just created) after creating the table definitions?
I'm using EclipseLink 2.x and JPA 2.0 with GlassFish v3.
Or can I initialize the tables from a Java method which is called on project (WAR with EJB3) deployment?
I came across this question for the same reason, trying to find an approach to run an initialization script after DDL generation. I offer this answer to an old question in hopes of shortening the amount of "literary research" for those looking for the same solution.
I'm using GlassFish 4 with its default EclipseLink 2.5 JPA implementation. The new schema generation feature in JPA 2.1 makes it fairly straightforward to specify an "initialization" script to run after DDL generation is completed.
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.1"
    xmlns="http://xmlns.jcp.org/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence http://xmlns.jcp.org/xml/ns/persistence/persistence_2_1.xsd">
    <persistence-unit name="cbesDatabase" transaction-type="JTA">
        <provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
        <jta-data-source>java:app/jdbc/cbesPool</jta-data-source>
        <properties>
            <property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
            <property name="javax.persistence.schema-generation.create-source" value="metadata"/>
            <property name="javax.persistence.schema-generation.drop-source" value="metadata"/>
            <property name="javax.persistence.sql-load-script-source" value="META-INF/sql/load_script.sql"/>
            <property name="eclipselink.logging.level" value="FINE"/>
        </properties>
    </persistence-unit>
</persistence>
The above configuration generates DDL from metadata (i.e. annotations), after which the META-INF/sql/load_script.sql script is run to populate the database. In my case, I seed a few tables with test data and generate additional views.
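The load script itself is plain SQL; hypothetical contents of META-INF/sql/load_script.sql might look like this (table and view names are made up):
INSERT INTO USERS (ID, NAME) VALUES (1, 'admin');
INSERT INTO USERS (ID, NAME) VALUES (2, 'tester');
CREATE VIEW ACTIVE_USERS AS SELECT * FROM USERS WHERE ID > 0;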
Additional information on EclipseLink's use of JPA's properties can be found in the DDL Generation section of EclipseLink/Release/2.5/JPA21. Likewise, Section 37.5 Database Schema Creation in Oracle's Java EE 7 Tutorial and TOTD #187 also offer a quick introduction.
Have a look at Running a SQL Script on startup in EclipseLink, which describes a solution presented as a kind of equivalent to Hibernate's import.sql feature [1]. Credits to Shaun Smith:
Running a SQL Script on startup in EclipseLink

Sometimes, when working with DDL generation, it's useful to run a script to clean up the database first. In Hibernate, if you put a file called "import.sql" on your classpath, its contents will be sent to the database. Personally I'm not a fan of magic filenames, but this can be a useful feature.

There's no built-in support for this in EclipseLink, but it's easy to do thanks to EclipseLink's high extensibility. Here's a quick solution I came up with: I simply register an event listener for the session postLogin event, and in the handler I read a file and send each SQL statement to the database, nice and clean. I went a little further and supported setting the name of the file as a persistence unit property. You can specify this all in code or in the persistence.xml.

The ImportSQL class is configured as a SessionCustomizer through a persistence unit property which, on the postLogin event, reads the file identified by the "import.sql.file" property. This property is also specified as a persistence unit property which is passed to createEntityManagerFactory. This example also shows how you can define and use your own persistence unit properties.
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

import org.eclipse.persistence.config.PersistenceUnitProperties;
import org.eclipse.persistence.config.SessionCustomizer;
import org.eclipse.persistence.sessions.Session;
import org.eclipse.persistence.sessions.SessionEvent;
import org.eclipse.persistence.sessions.SessionEventAdapter;
import org.eclipse.persistence.sessions.UnitOfWork;

public class ImportSQL implements SessionCustomizer {

    private void importSql(UnitOfWork unitOfWork, String fileName) {
        // minimal sketch: execute the file line by line
        // (assumes one SQL statement per line), e.g.,
        // unitOfWork.executeNonSelectingSQL("select 1 from dual");
        try (BufferedReader reader = new BufferedReader(new FileReader(fileName))) {
            String line;
            while ((line = reader.readLine()) != null) {
                if (!line.trim().isEmpty()) {
                    unitOfWork.executeNonSelectingSQL(line);
                }
            }
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }

    @Override
    public void customize(Session session) throws Exception {
        session.getEventManager().addListener(new SessionEventAdapter() {
            @Override
            public void postLogin(SessionEvent event) {
                String fileName = (String) event.getSession().getProperty("import.sql.file");
                UnitOfWork unitOfWork = event.getSession().acquireUnitOfWork();
                importSql(unitOfWork, fileName);
                unitOfWork.commit();
            }
        });
    }

    public static void main(String[] args) {
        Map<String, Object> properties = new HashMap<String, Object>();
        // Enable DDL generation
        properties.put(PersistenceUnitProperties.DDL_GENERATION, PersistenceUnitProperties.DROP_AND_CREATE);
        properties.put(PersistenceUnitProperties.DDL_GENERATION_MODE, PersistenceUnitProperties.DDL_DATABASE_GENERATION);
        // Configure the session customizer which will pipe the SQL file to the db before DDL generation runs
        properties.put(PersistenceUnitProperties.SESSION_CUSTOMIZER, "model.ImportSQL");
        properties.put("import.sql.file", "/tmp/someddl.sql");
        EntityManagerFactory emf = Persistence
                .createEntityManagerFactory("employee", properties);
    }
}
I'm not sure it's a strict equivalent, though; I'm not sure the script will run after the database generation. Testing is required. If it doesn't, maybe it can be adapted.
[1] Hibernate has a neat little feature that is heavily under-documented and unknown. You can execute an SQL script during SessionFactory creation, right after the database schema generation, to import data into a fresh database. You just need to add a file named import.sql to your classpath root and set either create or create-drop as your hibernate.hbm2ddl.auto property.
This might help, as there is some confusion here:
Use exactly the same set of properties (except the logger) for data seeding.
DO NOT USE:
<property name="eclipselink.ddl-generation" value="create-tables"/>
<property name="eclipselink.ddl-generation.output-mode" value="database"/>
DO USE:
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
<property name="javax.persistence.schema-generation.create-source" value="metadata"/>
<property name="javax.persistence.schema-generation.drop-source" value="metadata"/>
I confirm this worked for me.
:) Just substitute with your data:
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
<property name="javax.persistence.schema-generation.create-source" value="metadata-then-script"/>
<property name="javax.persistence.sql-load-script-source" value="META-INF/seed.sql"/>
It is called BEFORE DDL execution. And there seems to be no nice way to adapt it, as there is no suitable event one could use.
This process offers executing SQL before DDL statements, whereas what would be nice (for example, to insert seed data) is to have something which executes after DDL statements. I don't know if I am missing something here. Can somebody please tell me how to execute SQL AFTER EclipseLink has created the tables (when the create-tables property is set to true)?
