Including an Entity Class programmatically with a persistence unit? - java

I was looking over some code that I created a while ago and noticed something odd.
I am creating a persistence unit programmatically because the location of the database to read comes from user input.
My code is as follows:
Map<String, Object> properties = new HashMap<>();
db = plan.db;
// Configure the internal EclipseLink connection pool
properties.put(TRANSACTION_TYPE, PersistenceUnitTransactionType.RESOURCE_LOCAL.name());
properties.put(JDBC_DRIVER, "net.ucanaccess.jdbc.UcanaccessDriver");
properties.put(JDBC_URL, "jdbc:ucanaccess://" + db + ";singleconnection=true;memory=true");
properties.put(JDBC_USER, "");
properties.put(JDBC_PASSWORD, "");
// properties.put("provider", "org.eclipse.persistence.jpa.PersistenceProvider");
EntityManagerFactory emf2;
EntityManager em2;
emf2 = Persistence.createEntityManagerFactory("PU", properties);
em2 = emf2.createEntityManager();
With this I was able to create my connections multiple times.
The problem I noticed is that I also had this code in my persistence.xml:
<persistence-unit name="PU" transaction-type="RESOURCE_LOCAL">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<class>db.Items</class>
<exclude-unlisted-classes>false</exclude-unlisted-classes>
<properties>
<property name="javax.persistence.jdbc.url" value=""/>
<property name="javax.persistence.jdbc.user" value=""/>
<property name="javax.persistence.jdbc.driver" value="net.ucanaccess.jdbc.UcanaccessDriver"/>
<property name="javax.persistence.jdbc.password" value=""/>
</properties>
Now I noticed that I cannot find any way to add an entity class to this persistence unit through the properties map; however, I was able to run my code fine just like this.
I'm curious: does it just overwrite the old properties and such from the persistence unit of the same name? It still uses the entity class db.Items.
I just want to make sure that this is the correct way to do it.
I'm in the middle of changing my code, so I cannot currently run it to see what happens if I delete everything in my Persistence.xml, but I'm curious about this.
I also noticed that the "provider" property is commented out. Do I need it there? (It's included in the XML file.)
There is also an example I saw that mentioned setting a "server target" to "no" or something; any comments on that?
Thanks all

It overwrites the properties you have specified in persistence.xml. For example, you can set only the user name and password this way, and the other properties will be used as defined in the file. Whether it's "right" to do it this way I don't know, but I have done the same.
The call to Persistence.createEntityManagerFactory(unit, props) starts by searching for the named unit in any persistence.xml found on the classpath. The entries from props are then added to, or overwrite, the properties read from the file for that unit.
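The merge described above can be pictured as a plain map overlay. The sketch below is only an illustration of the semantics, not the provider's actual code, and the property values are made up:

```java
import java.util.HashMap;
import java.util.Map;

public class PropertyOverlayDemo {
    // Overlay the runtime properties on top of the file-defined ones;
    // runtime entries win, untouched file entries survive.
    static Map<String, String> overlay(Map<String, String> fromFile, Map<String, String> runtime) {
        Map<String, String> effective = new HashMap<>(fromFile);
        effective.putAll(runtime);
        return effective;
    }

    public static void main(String[] args) {
        // As if read from persistence.xml for unit "PU"
        Map<String, String> fromFile = new HashMap<>();
        fromFile.put("javax.persistence.jdbc.driver", "net.ucanaccess.jdbc.UcanaccessDriver");
        fromFile.put("javax.persistence.jdbc.url", "");

        // As passed to createEntityManagerFactory at runtime (path is made up)
        Map<String, String> runtime = new HashMap<>();
        runtime.put("javax.persistence.jdbc.url", "jdbc:ucanaccess://C:/data/plan.accdb;memory=true");

        Map<String, String> effective = overlay(fromFile, runtime);
        System.out.println(effective.get("javax.persistence.jdbc.url"));    // runtime URL wins
        System.out.println(effective.get("javax.persistence.jdbc.driver")); // file driver survives
    }
}
```

This is why the entity classes listed in the file (like db.Items) keep working: only individual properties are replaced, not the whole unit definition.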
I have no comment about your other questions.

Related

java.lang.IllegalArgumentException: Object: Transaction is not a known entity type

When I try to persist an entity to the database, I get the error in the subject line. I have it in the persistence.xml alongside my other classes, which happen to work and persist.
I know this question has been asked many times but each time it seems to have a different cause. I tried the suggestions elsewhere. If nothing else you can shed light on a mystery for me. I am pretty new to Java as well as anything Inversion of Control.
My Transaction is a valid class; trust me, it works. It works in every sense: rendering the result of its computation to the JSP page, even sending it to pub/sub and unpacking it when it returns. It simply doesn't work when I try to persist it through a DAO (entity manager).
@Entity
public class Transaction {
    @Id
    String uuid;
    @Transient
    Wallet senderWallet; // should this be transient? Can it be?
    @Column(name = "address")
    String recipientAddress;
    double amount;
    @Transient
    HashMap<String, Object> output; // like basic receipt
    @Transient
    HashMap<String, Object> input; // like wire transfer document
I made a lot of this transient just to exclude what could be wrong.
I strongly suspect it has something to do with higher-level configuration and structure. I am new to Java and these types of frameworks in general, but I would guess that we should be seeing a Transaction.class file in the bin folder, correct? You can see we do not.
Yet it's obviously being compiled and run, because I am building transactions all the time in my experimentation code. Something is odd about this, so if we can solve the persistence issue, great, but learning what's going on in bin would also be great. I am willing to refactor or overhaul my structure and configuration if it requires it; in fact, I am looking forward to it. I was also told I should be using beans like @Service and @Repository; I am using @Entity and @Controller. We were taught JDBC, then JPA, then Spring and making beans, but by that time I had already started the project and was still digesting the earlier material, so I have not made those types of beans. I might have some tight coupling in different places that I can fix, but my first priority is getting Transaction to persist. Other entities persist with no problem, the class itself works, and I see nothing special about this one. I have tried various permutations, but in particular: should I not see Transaction.class files?
I see the class files for every single thing used in my target folder. That's a Maven folder that is a sibling of src and bin, and it seems to be used for building artifacts. Might I have been running my code through that, with certain entities invisible to Spring but visible to Eclipse/Tomcat? If so, how do I correct it? I am only guessing wildly.
Edit:
Here is a class that is persisting fine:
@Entity
public class User {
    @Id
    String username;
    String password;
    String hint;
    String answer;
    String email;
    @Embedded
    WalletForDB wallet;
Here's my persistence.xml
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.2"
xmlns="http://xmlns.jcp.org/xml/ns/persistence"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence http://xmlns.jcp.org/xml/ns/persistence/persistence_2_2.xsd">
<persistence-unit name="Case Study">
<class>privblock.gerald.ryan.entity.Transaction</class>
<class>privblock.gerald.ryan.entity.Account</class>
<class>privblock.gerald.ryan.entity.Block</class>
<class>privblock.gerald.ryan.entity.Blockchain</class>
<class>privblock.gerald.ryan.entity.User</class>
<class>privblock.gerald.ryan.entity.WalletForDB</class>
<properties>
<!-- DB configuration -->
<property name="javax.persistence.jdbc.url"
value="jdbc:mysql://localhost:3306/blockchain" />
<property name="javax.persistence.jdbc.user" value="root" />
<property name="javax.persistence.jdbc.driver"
value="com.mysql.cj.jdbc.Driver" />
<property name="javax.persistence.jdbc.password"
value="root" />
<!-- EclipseLink configuration -->
<property name="eclipselink.logging.level" value="FINEST" />
<property name="eclipselink.ddl-generation"
value="create-or-extend-tables" />
</properties>
</persistence-unit>
</persistence>
BTW, my configuration is now broken, as I tried to make my folder structure Maven/Spring conforming and then cleaned and updated the project. For some reason the combination of Tomcat, Spring, Maven and Eclipse is not working; I don't know which is at fault, or whether I'm close, but that's a separate problem.

Hibernate Check if DB exists

I'm creating a DB using an initial.sql:
<property name="hibernate.dialect">org.hibernate.dialect.MySQLDialect</property>
<property name="hibernate.connection.driver_class">com.mysql.cj.jdbc.Driver</property>
<property name="hibernate.connection.url">jdbc:mysql://localhost:3310/mydb?createDatabaseIfNotExist=true&amp;autoReconnect=true&amp;useSSL=false</property>
<property name="hibernate.connection.username">root</property>
<property name="hibernate.connection.password">qet</property>
<property name="format_sql">true</property>
<property name="hibernate.hbm2ddl.import_files_sql_extractor">org.hibernate.tool.hbm2ddl.MultipleLinesSqlCommandExtractor</property>
<property name="hibernate.hbm2ddl.auto">create</property>
<property name="hibernate.hbm2ddl.import_files">/database/initial.sql</property>
It works perfectly, but the problem now is that I already have the DB, and when I restart the app it inserts the data again.
How do I check whether the schema already exists and, if so, do nothing? I just want to create the DB if it doesn't have any tables / doesn't exist...
Try switching the property to update
<property name="hibernate.hbm2ddl.auto">update</property>
create recreates the schema every time, while update only applies changes (and creates the schema if it doesn't exist). The standard values can be seen here:
https://docs.jboss.org/hibernate/orm/5.0/manual/en-US/html/ch03.html
validate : validates the schema
update : update if there are changes, create if it doesn't exist
create : recreate it from scratch
create-drop : creates it and then drops it when the session factory is closed
Also, from the same reference, we can see that import files are executed only if hbm2ddl is create or create-drop:
File order matters: the statements of a given file are executed before the statements of the following files. These statements are only executed if the schema is created, i.e. if hibernate.hbm2ddl.auto is set to create or create-drop.
If you want that initial.sql to be executed, you can leave the property as create and make the SQL use commands that don't change the state if they have already been executed, like DROP TABLE IF EXISTS, CREATE TABLE IF NOT EXISTS, etc. For inserts it is a bit more complicated, but you can still do it with an INSERT ... SELECT joined against the same table, filtering out values that are already present.
That's not the perfect solution though, because tables created from annotated classes will still be recreated. I think you should consider another design, or use hibernate.hbm2ddl.auto=create when deploying for the first time and then change it to update manually.
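For the idempotent-insert trick mentioned above, a sketch of what such statements can look like (the table and column names here are made up for illustration):

```sql
-- Safe to run on every startup: creates only what is missing
CREATE TABLE IF NOT EXISTS settings (
    name  VARCHAR(64) PRIMARY KEY,
    value VARCHAR(255)
);

-- Insert the seed row only if it is not already present
INSERT INTO settings (name, value)
SELECT 'default_locale', 'en'
FROM DUAL
WHERE NOT EXISTS (SELECT 1 FROM settings WHERE name = 'default_locale');
```

Written this way, the script leaves the database unchanged when it is replayed on a restart.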

Persist always generates an insert query

First of all, I'm using EclipseLink 2.5.2, ojdbc6, spring-orm 4.1.1 and QueryDSL 3.7.1.
I don't understand why my objects are not in my persistence context (or is this how it should be?).
I'm using QueryDSL to query my objects; however, when I try to persist such an object using entityManager.persist(), it always creates an INSERT statement, resulting in a duplicate primary key exception.
Calling refresh() on the object crashes with java.lang.IllegalArgumentException: Can not refresh not managed object. Using merge() works fine; however, that's not what I want, since I need to keep my original reference to the saved object.
persistence.xml
<persistence version="2.0"
xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
<persistence-unit name="XXXXXX"
transaction-type="RESOURCE_LOCAL">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<jta-data-source>jdbc/XXXXX</jta-data-source>
<exclude-unlisted-classes>false</exclude-unlisted-classes>
<properties>
<property name="eclipselink.weaving" value="static" />
<property name="eclipselink.target-database" value="Oracle11" />
</properties>
</persistence-unit>
</persistence>
The entity manager used to create the JPAQuery and to refresh/merge/persist is the same one.
If you need more information/configurations/etc., please leave a comment. I'm really stuck and can't wrap my head around what the reason could be, or what other information could be useful to you guys.
EntityManager.persist() is used to make a transient instance persistent. In this case transient (a term used by Hibernate, but valid for other persistence providers as well) means an entity which doesn't have a representation in the persistence context or the underlying datastore. It's not meant to be used on entities already present in the database. Use merge() to update persistent entities.
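The distinction can be sketched as a small save helper. This is illustrative only: the EntityManager is replaced by a minimal stand-in interface so the sketch is self-contained, and the Account entity with its nullable id is made up:

```java
public class PersistVsMerge {
    // Minimal stand-in for the two JPA operations discussed; real code
    // would use javax.persistence.EntityManager directly.
    interface Em {
        void persist(Object entity); // transient -> managed: schedules an INSERT
        <T> T merge(T entity);       // detached -> managed copy: schedules an UPDATE
    }

    // Hypothetical entity: a null id means the row has never been saved.
    static class Account {
        Long id;
    }

    // Persist brand-new instances, merge detached ones; note that merge()
    // returns the managed copy, so callers must keep working with that copy.
    static Account save(Em em, Account a) {
        if (a.id == null) {
            em.persist(a);
            return a;
        }
        return em.merge(a);
    }
}
```

With a real EntityManager, the same branching avoids the duplicate-primary-key exception: an entity that came back from a query already exists in the database, so it must go through merge(), never persist().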
There is an article about the subject with a nice state diagram representing the states an entity can be in and the transitions between those states:

Getting old data with JPA

I'm getting old data with JPA, even if I disable the cache. I guess it's because the resource is configured to be RESOURCE_LOCAL, but I'm not sure.
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.0" xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
<persistence-unit name="AppPU" transaction-type="RESOURCE_LOCAL">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<class>com.myentities.User</class>
<properties>
<property name="javax.persistence.jdbc.url" value="jdbc:mysql://127.0.0.1:3306/mydatabase"/>
<property name="javax.persistence.jdbc.password" value="*****"/>
<property name="javax.persistence.jdbc.driver" value="com.mysql.jdbc.Driver"/>
<property name="javax.persistence.jdbc.user" value="user1"/>
<property name="eclipselink.logging.level" value="FINEST"/>
</properties>
</persistence-unit>
</persistence>
My code that is getting old info about the user:
public List<User> findAll(App app) {
    getEntityManager().getTransaction().begin();
    Query q = getEntityManager().createQuery("SELECT t1 FROM User t1 WHERE t1.app.idApp = :idApp");
    q.setParameter("idApp", app.getIdApp());
    getEntityManager().flush();
    getEntityManager().getTransaction().commit();
    List resultList = q.getResultList();
    return resultList;
}
My entity:
@Entity
@Table(name = "user")
@Cache(type = CacheType.NONE)
public class User implements Serializable {
    // some attributes
}
Does anybody have an idea of what is going on?
UPDATE 1
The begin, flush and commit calls were just acts of desperation! I know they're not needed.
I forgot to say something important: the test I run is to add a user record directly in the database console and then try to see it through my webapp, which does not show the new user. That is the "old data" I mentioned; it only displays the "old" users.
I already tried to put this in my persistence.xml and I didn't see any difference in the results:
<property name="eclipselink.cache.shared.default" value="false"/>
<property name="eclipselink.cache.size.default" value="0"/>
<property name="eclipselink.cache.type.default" value="None"/>
So it is something else…
There are a few suggestions posted already, such as ensuring the shared cache is off and managing back references so that the cache is consistent. These address specific situations that could be occurring, but you have not provided enough information to say what is really happening.
Another cause that is specific but seems possible, based on your getEntityManager() usage, is reusing the EntityManager instance without clearing it. The EntityManager holds references to all managed entities, since the EM is required to return the same instance on subsequent query and find calls to maintain identity.
If this is not done already, you will want to clear the EntityManager, or obtain a new one, at certain points to release the memory and the managed entities.
First off, don't use,
@Cache(type = CacheType.NONE)
or,
<property name="eclipselink.cache.size.default" value="0"/>
or,
<property name="eclipselink.cache.type.default" value="None"/>
just set,
@Cache(shared = false)
or,
<property name="eclipselink.cache.shared.default" value="false"/>
Second, where is your EntityManager coming from? Do you create a new one per request/transaction? If you don't then everything read in the EntityManager will be in its (L1) cache. You need to call clear() or create a new one.
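The effect of a long-lived EntityManager can be shown with a toy model of the first-level cache. This is an illustration only; the ToyEm class below is made up and merely mimics the identity guarantee JPA mandates:

```java
import java.util.HashMap;
import java.util.Map;

public class L1CacheDemo {
    // Toy model of an EntityManager's first-level cache: find() returns the
    // already-managed instance instead of re-reading, until clear() is called.
    static class ToyEm {
        private final Map<Object, Object> managed = new HashMap<>();
        private final Map<Object, Object> database;

        ToyEm(Map<Object, Object> database) {
            this.database = database;
        }

        Object find(Object id) {
            // Same behavior JPA mandates: identity within a persistence context
            return managed.computeIfAbsent(id, database::get);
        }

        void clear() {
            managed.clear(); // detach everything
        }
    }

    public static void main(String[] args) {
        Map<Object, Object> db = new HashMap<>();
        db.put(1, "old row");
        ToyEm em = new ToyEm(db);

        System.out.println(em.find(1)); // "old row" is now managed
        db.put(1, "new row");           // someone updates the DB directly
        System.out.println(em.find(1)); // still "old row": served from the L1 cache
        em.clear();
        System.out.println(em.find(1)); // "new row" after clear()
    }
}
```

This is exactly the "old data" symptom described in the question: rows changed directly in the database console stay invisible until the persistence context is cleared or replaced.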
Use
<property name="eclipselink.cache.shared.default" value="false"/>
<property name="eclipselink.cache.size.default" value="0"/>
<property name="eclipselink.cache.type.default" value="None"/>
or
@Cache(shared = false)
As opposed to the caching answer (which I will have to try), you're likely running into a situation where your referenced entity isn't updated.
@Entity
class Parent {
    @OneToOne(cascade = CascadeType.ALL) // or only MERGE, whatever your needs are
    Child child;
}

@Entity
class Child {
    Parent parent;
    // ... values
}
Upon saving the Child, you need to update your reference to the Parent so that the memory model (cache) matches the database. It is fairly frustrating, but the way I've dealt with this is to cascade only from the parent.
public void saveChild(Child child) {
    child.getParent().setChild(child); // or DTO code, whatever
    entityManager.merge(child.getParent()); // cascades to the child
    // If you're manually cascading (why?):
    // entityManager.merge(child);
}
This will cascade if you set it up. What I've seen is that the reverse cascade (the child merge causing a cascade to the parent) has not been reliable, which may stem from my lack of knowledge on the subject.
In short: if you handle the merge in your data layer explicitly, the problem goes away. I'm reluctant to disable caching, as it could have a significant impact in large applications; thus, I went this route. Good luck, and please let us know your approach.
1) Refine the code:
public List<User> findAll(App app) {
    Query q = getEntityManager().createQuery("SELECT t1 FROM User t1 WHERE t1.app.idApp = :idApp");
    q.setParameter("idApp", app.getIdApp());
    List resultList = q.getResultList();
    return resultList;
}
2) Remove @Cache(type = CacheType.NONE) from your entity class.
3) No need to change persistence.xml.
The usage of the EntityManager is the key. I reached this solution after months:
Use a common DEFAULT entity manager for all reads of an entity. This means creating a separate entity manager for each entity.
Create a new entity manager for each write/update/delete operation on each entity. Use begin/commit for that new entity manager's transaction, and close it after the operation finishes.
The key point: clear the DEFAULT (reader) entity manager after you commit and close the writer entity manager. That means you only clear after a write, not before each read.

Execute sql script after jpa/EclipseLink created tables?

Is there a possibility to execute an SQL script after EclipseLink has generated the DDL?
In other words, is it possible to use the EclipseLink property "eclipselink.ddl-generation" with "drop-and-create-tables" and have EclipseLink execute another SQL file (to insert some data into the tables just created) after creating the table definitions?
I'm using EclipseLink 2.x and JPA 2.0 with GlassFish v3.
Or can I initialize the tables from a Java method which is called on deployment of the project (WAR with EJB3)?
I came across this question for the same reasons, trying to find an approach to run an initialization script after DDL generation. I offer this answer to an old question in hopes of shortening the amount of "literary research" for those looking for the same solution.
I'm using GlassFish 4 with its default EclipseLink 2.5 JPA implementation. The new Schema Generation feature under JPA 2.1 makes it fairly straightforward to specify an "initialization" script after DDL generation is completed.
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.1"
xmlns="http://xmlns.jcp.org/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/persistence http://xmlns.jcp.org/xml/ns/persistence/persistence_2_1.xsd">
<persistence-unit name="cbesDatabase" transaction-type="JTA">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<jta-data-source>java:app/jdbc/cbesPool</jta-data-source>
<properties>
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
<property name="javax.persistence.schema-generation.create-source" value="metadata"/>
<property name="javax.persistence.schema-generation.drop-source" value="metadata"/>
<property name="javax.persistence.sql-load-script-source" value="META-INF/sql/load_script.sql"/>
<property name="eclipselink.logging.level" value="FINE"/>
</properties>
</persistence-unit>
</persistence>
The above configuration generates DDL scripts from metadata (i.e. annotations), after which the META-INF/sql/load_script.sql script is run to populate the database. In my case, I seed a few tables with test data and generate additional views.
Additional information on EclipseLink's use of JPA's properties can be found in the DDL Generation section of EclipseLink/Release/2.5/JPA21. Likewise, Section 37.5 Database Schema Creation in Oracle's Java EE 7 Tutorial and TOTD #187 also offer a quick introduction.
Have a look at Running a SQL Script on startup in EclipseLink that describes a solution presented as a kind of equivalent to Hibernate's import.sql feature1. Credits to Shaun Smith:
Running a SQL Script on startup in EclipseLink
Sometimes, when working with DDL generation, it's useful to run a script to clean up the database first. In Hibernate, if you put a file called "import.sql" on your classpath, its contents will be sent to the database. Personally I'm not a fan of magic filenames, but this can be a useful feature.
There's no built-in support for this in EclipseLink, but it's easy to do thanks to EclipseLink's high extensibility. Here's a quick solution I came up with: I simply register an event listener for the session postLogin event, and in the handler I read a file and send each SQL statement to the database. Nice and clean. I went a little further and supported setting the name of the file as a persistence unit property. You can specify this all in code or in the persistence.xml.
The ImportSQL class is configured as a SessionCustomizer through a persistence unit property which, on the postLogin event, reads the file identified by the "import.sql.file" property. This property is also specified as a persistence unit property which is passed to createEntityManagerFactory. This example also shows how you can define and use your own persistence unit properties.
import java.util.HashMap;
import java.util.Map;

import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

import org.eclipse.persistence.config.PersistenceUnitProperties;
import org.eclipse.persistence.config.SessionCustomizer;
import org.eclipse.persistence.sessions.Session;
import org.eclipse.persistence.sessions.SessionEvent;
import org.eclipse.persistence.sessions.SessionEventAdapter;
import org.eclipse.persistence.sessions.UnitOfWork;

public class ImportSQL implements SessionCustomizer {

    private void importSql(UnitOfWork unitOfWork, String fileName) {
        // Open the file
        // Execute each line, e.g.,
        // unitOfWork.executeNonSelectingSQL("select 1 from dual");
    }

    @Override
    public void customize(Session session) throws Exception {
        session.getEventManager().addListener(new SessionEventAdapter() {
            @Override
            public void postLogin(SessionEvent event) {
                String fileName = (String) event.getSession().getProperty("import.sql.file");
                UnitOfWork unitOfWork = event.getSession().acquireUnitOfWork();
                importSql(unitOfWork, fileName);
                unitOfWork.commit();
            }
        });
    }

    public static void main(String[] args) {
        Map<String, Object> properties = new HashMap<String, Object>();
        // Enable DDL generation
        properties.put(PersistenceUnitProperties.DDL_GENERATION, PersistenceUnitProperties.DROP_AND_CREATE);
        properties.put(PersistenceUnitProperties.DDL_GENERATION_MODE, PersistenceUnitProperties.DDL_DATABASE_GENERATION);
        // Configure the session customizer, which will pipe the SQL file to the DB on login
        properties.put(PersistenceUnitProperties.SESSION_CUSTOMIZER, "model.ImportSQL");
        properties.put("import.sql.file", "/tmp/someddl.sql");
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("employee", properties);
    }
}
I'm not sure it's a strict equivalent though; I'm not sure the script will run after the database generation. Testing is required, and if it doesn't, maybe the approach can be adapted.
1 Hibernate has a neat little feature that is heavily under-documented and unknown: you can execute an SQL script during SessionFactory creation, right after the database schema generation, to import data into a fresh database. You just need to add a file named import.sql to your classpath root and set either create or create-drop as your hibernate.hbm2ddl.auto property.
This might help, as there is some confusion here:
Use exactly the same set of properties (except the logger) for data seeding.
DO NOT USE:
<property name="eclipselink.ddl-generation" value="create-tables"/>
<property name="eclipselink.ddl-generation.output-mode" value="database"/>
DO USE:
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
<property name="javax.persistence.schema-generation.create-source" value="metadata"/>
<property name="javax.persistence.schema-generation.drop-source" value="metadata"/>
I confirm this worked for me. :) Just substitute your own data:
<property name="javax.persistence.schema-generation.database.action" value="drop-and-create"/>
<property name="javax.persistence.schema-generation.create-source" value="metadata-then-script"/>
<property name="javax.persistence.sql-load-script-source" value="META-INF/seed.sql"/>
It is called BEFORE DDL execution, and there seems to be no nice way to adapt it, as there is no suitable event one could use.
This process executes SQL before the DDL statements, whereas what would be nice (for example, to insert seed data) is something that executes after the DDL statements. I don't know if I am missing something here. Can somebody please tell me how to execute SQL AFTER EclipseLink has created the tables (when the create-tables property is set to true)?
