I have a JPA 2.1 application that connects to two different databases with two different sets of tables. Within IntelliJ, there is a Persistence View you can use to assign each JPA Entity to the appropriate Data Source and this works fine. IntelliJ is able to validate the Entity's Table and Column against the corresponding table in the Data Source.
Every now and then, IntelliJ (version 2018.3) loses my choice and attaches the Entity to the other Data Source. I find this out when I open the class and find the Entity's table and columns don't match. I stumble across the change some indefinite time after the swap has occurred.
My work-around is to manually remove the incorrect assignment and make the correct assignment. IntelliJ's inability to remember this assignment is getting old.
I suspect IntelliJ might be auto-generating files to represent the classes annotated with @Entity, and maybe this is where the problem lies.
I understand I could add the Entity to the persistence.xml using the <class> element so that the data source assignment is made in this config file, but this only appears to be a problem with IntelliJ. Deployments (Maven-based) to the server compile and run as expected.
Is there something I can do with IntelliJ to avoid losing the data source assignments?
Other notes:
These mappings are not recorded in the <module>.iml file. I remain unsuccessful at finding where this is recorded.
Using the <class> element in the persistence.xml is not considered by IntelliJ for validation.
Using the @PersistenceUnit(unitName = "unitName") annotation on the Entity is not considered by IntelliJ for validation.
Trying to get IntelliJ to remember the association between an Entity and its DataSource is the wrong approach. Instead, IntelliJ needs to be told the association between the Persistence Unit and its DataSource.
Both options are available from the Persistence View that is enabled when the JPA facet is enabled, but IntelliJ lists all annotated Entities under both persistence units, which can lead you to think you need to open each Entity to perform the mapping.
The only action that is required within the Persistence View is to map each Persistence Unit to its DataSource.
Related
I have been working on an EER diagram model for my MySQL database for the last 7 days. It is a rather complicated model with lots of connections and attributes. Now I know Spring Boot automatically creates tables based on your entity classes (including foreign keys and other settings) if you use spring.jpa.hibernate.ddl-auto=update in your application.properties, but is it possible to do the reverse: generate entity classes from the existing tables in a schema, once the DataSource object has been given valid credentials?
The point is, I would probably need another 3-4 days of back-end coding to create all the classes with all the attributes, relationships, etc.
Given that it can only be done in one correct way, based on the schema tables, and it is not really rocket science, why not do the thing just once?
There is this question Automatically create Entities from database
but 1. I am using Spring Boot, not a plain JPA project, and 2. the blog linked there is no longer active.
Any hints?
Just figured it out. I used the Hibernate perspective in Eclipse to connect to the database, created a cfg.xml, ran the project with Hibernate configurations, and created a new reveng.xml config; that will work. A more detailed answer can be found in the article below.
http://o7planning.org/en/10125/using-hibernate-tools-generate-entity-classes-from-tables
Hope it will help someone!!
There is this question Automatically create Entities from database but
1. I am using Spring Boot, not a plain JPA project, and 2. the blog linked there is no longer active.
Under the hood, Spring Boot uses JPA and more specifically Hibernate, since it has poor compatibility with other JPA implementations such as EclipseLink.
Why don't you use the Dali Eclipse plugin?
https://www.eclipse.org/webtools/dali/docs/3.2/user_guide/tasks006.htm
It has a wizard with many options and addresses this kind of need well. The real drawback when I use it is that you cannot store the reverse-engineering (ORM) configuration. So, since you have many tables, I advise you to generate your entities incrementally.
I have a JPA project in Eclipse and something keeps synchronizing the DB tables with the entities. The problem is that the entities that are created end with an underscore ("_.java"). I already manually created the entities from the schema, so this causes duplicate entities; then version control picks them up and wants to add them to the repo, etc.
How do I tell Eclipse not to generate the entities, or to generate them but without the underscore?
Those xxx_.java files are metamodel classes. They are necessary, automatically generated by the framework, and should not interfere with your work. Why do you want to delete them?
As this site says:
This metamodel is important in 2 ways.
First, it allows providers and frameworks a generic way to deal with an application's domain model.
Second, from an application writer's perspective, it allows very fluent expression of completely type-safe criteria queries, especially the Static Metamodel approach.
And:
For each managed class X in package p, a metamodel class X_ in package p is created.
The name of the metamodel class is derived from the name of the managed class by appending "_" to the name of the managed class.
From what you have written, everything appears to be normal. The duplicated class and the _ are necessary for the framework and you can simply ignore them. You should read this site too for more information.
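For illustration, here is a rough sketch of the pair of files you end up with; the Person entity and its fields are invented for this example and are not taken from your project:

    // Person.java - your hand-written entity
    import javax.persistence.Entity;
    import javax.persistence.Id;

    @Entity
    public class Person {
        @Id
        private Long id;
        private String lastName;
        // getters and setters omitted
    }

    // Person_.java - generated by the JPA static metamodel annotation processor:
    // one attribute constant per persistent field, used for type-safe criteria queries
    import javax.persistence.metamodel.SingularAttribute;
    import javax.persistence.metamodel.StaticMetamodel;

    @StaticMetamodel(Person.class)
    public abstract class Person_ {
        public static volatile SingularAttribute<Person, Long> id;
        public static volatile SingularAttribute<Person, String> lastName;
    }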
If you don't want to add those files to your repo, there are ways to exclude certain files from commits, depending on the version control system (like in Git, using the .gitignore file).
I am in the process of troubleshooting a recent translation of EJB code to native Hibernate code (painful process, since EJB spoiled me so much with its convenience).
One thing I find troublesome is that Hibernate keeps its entity declarations in hbm.xml files and its configuration in separate files. While this isn't necessarily a big issue, the NetBeans wizard doesn't really let the developer just click a button, detect all the entities on the fly, and update the configuration file.
With persistence.xml, however, I can do that easily by just adding the classes and forgetting about them. Another good thing is that persistence.xml stores pretty much everything needed for the ORM aside from the class-specific annotations (which I am keeping).
With that said, is there any way for me to have Hibernate (1) stay off EE and (2) use persistence.xml to get the connection, mapping, etc.?
Also, a related question - CriteriaQuery is apparently a Java EE thing. One thing I really like about using EJB is that there are strong compile-time constraints. For instance, I can pass ClassName_.myAttribute directly as a parameter in a CriteriaQuery, whereas if I use the Hibernate native "Criteria" API, I have to use "my_attribute" instead, which is not subject to compile-time integrity checks (note: ClassName_.myAttribute maps to "my_attribute" on the table).
So is there any way to keep that compile-time integrity?
Thanks.
Hibernate EntityManager can be used outside of a Java EE container. See http://docs.jboss.org/hibernate/core/4.0/hem/en-US/html_single/#architecture-javase.
Moreover, even with the Hibernate native API, since you're using annotations, you don't need any hbm.xml file. Just a central Hibernate config file listing the entities and some Hibernate properties.
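To make that concrete, here is a minimal sketch of a plain Java SE bootstrap, assuming a META-INF/persistence.xml on the classpath with a persistence unit named "myUnit" and an Employee entity with a generated Employee_ metamodel class (all of these names are placeholders, not from your project):

    import java.util.List;
    import javax.persistence.EntityManager;
    import javax.persistence.EntityManagerFactory;
    import javax.persistence.Persistence;
    import javax.persistence.criteria.CriteriaBuilder;
    import javax.persistence.criteria.CriteriaQuery;
    import javax.persistence.criteria.Root;

    public class Main {
        public static void main(String[] args) {
            // No container involved: persistence.xml supplies the connection and mapping info.
            EntityManagerFactory emf = Persistence.createEntityManagerFactory("myUnit");
            EntityManager em = emf.createEntityManager();

            // The static metamodel still gives compile-time safety outside of EE,
            // because Employee_ is produced by an annotation processor at build time.
            CriteriaBuilder cb = em.getCriteriaBuilder();
            CriteriaQuery<Employee> q = cb.createQuery(Employee.class);
            Root<Employee> root = q.from(Employee.class);
            q.where(cb.equal(root.get(Employee_.lastName), "Smith"));

            List<Employee> result = em.createQuery(q).getResultList();
            System.out.println(result.size());

            em.close();
            emf.close();
        }
    }

The metamodel classes come from the hibernate-jpamodelgen annotation processor, so adding that dependency to the (non-EE) build is enough to keep the ClassName_ style of criteria queries.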
I have a much-used project that I am currently updating. There are several places where this project can be installed, and it is not certain what version is used where or to what version it might be updated in the future. Right now they are all the same, though.
My problem stems from the fact that there might be many changes to the Hibernate entity classes, and it must be easy to update to a newer version without any hassle and without loss of database content. Just replace the WAR, start it, and it should migrate itself.
To my knowledge, Hibernate does not alter existing tables unless hibernate.hbm2ddl.auto=create is set, but doesn't that throw away all the data?
So right now, when the Spring context has fully loaded, it executes a bean that migrates the database to the current version by going through all the changes from versionX to versionY (the previous version is saved in the database) and manually altering the tables.
It's not much hassle doing a few hard-coded ALTER TABLE statements to add some columns, but when it comes to adding complete new tables, it feels silly to have to write all that...
So my question(s) is this:
Is there any way to send an entity class and a dialect to Hibernate code somewhere, and get back a valid SQL query for creating the table?
And even better, somehow create a dialect-safe SQL string for adding a column to a table?
I hope this is not a silly question, and I have not missed something obvious when it comes to Hibernate...
Have you tried
hibernate.hbm2ddl.auto=update
It retains the existing database with its data and appends only the columns and tables you have added or changed in your entities.
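For reference, a minimal sketch of one place this property can go when bootstrapping JPA yourself (the persistence unit name "myUnit" is a placeholder); it can equally be set in persistence.xml or in Spring's configuration:

    import java.util.HashMap;
    import java.util.Map;
    import javax.persistence.EntityManagerFactory;
    import javax.persistence.Persistence;

    public class Startup {
        public static void main(String[] args) {
            Map<String, String> props = new HashMap<String, String>();
            // update = add missing tables/columns, never drop existing data
            props.put("hibernate.hbm2ddl.auto", "update");
            EntityManagerFactory emf = Persistence.createEntityManagerFactory("myUnit", props);
            // ... use emf ...
            emf.close();
        }
    }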
I don't think you'll be able to fully automate this. Hibernate has the hbm2ddl tool (available as an ant task or a maven plugin) to generate the required DDL statements from your hibernate configuration to create an empty database but I'm not aware of any tools that can do an automatic "diff" between two versions. In any case you're probably better off doing the diff carefully by hand, as only you know your object model well enough to be able to pick the right defaults for new properties of existing entities etc.
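For the first half of the question (turning an entity plus a dialect into CREATE statements), here is a hedged sketch using the SchemaExport class from Hibernate's hbm2ddl tooling, written against the Hibernate 3/4-era API; MyEntity and the dialect are placeholders:

    import org.hibernate.cfg.Configuration;
    import org.hibernate.tool.hbm2ddl.SchemaExport;

    public class DdlGenerator {
        public static void main(String[] args) {
            Configuration cfg = new Configuration();
            cfg.addAnnotatedClass(MyEntity.class); // placeholder entity class
            cfg.setProperty("hibernate.dialect", "org.hibernate.dialect.MySQL5Dialect");

            SchemaExport export = new SchemaExport(cfg);
            export.setOutputFile("create-schema.sql");
            export.setDelimiter(";");
            // script=true writes out the DDL, export=false means nothing is run against a database
            export.create(true, false);
        }
    }

For the "add a column" case there is a sibling class, org.hibernate.tool.hbm2ddl.SchemaUpdate, but it needs a live connection to diff against, which is exactly the part that deserves a careful manual review.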
Once you have worked out your diffs you can use a tool like liquibase to manage them and handle actually applying the updates to a database at application start time.
Maybe you should try a different approach. Instead of generating a schema during a runtime update, make one 'by hand' (it could be based on a Hibernate-generated script, though).
Store a version number in the database and create an update script for each new version. The only thing you have to do now is determine which version the database is currently at and sequentially run the necessary update scripts to bring it to the current version.
To make it extra robust you can make a unit/integration test which runs every possible database update and checks the integrity of the resulting database.
I used this method for an application I built and it works flawlessly. Another example of an implementation of this pattern is Android, which has an upgrade method in its API:
http://developer.android.com/reference/android/database/sqlite/SQLiteOpenHelper.html#onUpgrade(android.database.sqlite.SQLiteDatabase, int, int)
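A rough sketch of that pattern with plain JDBC; the schema_version table and the example ALTER/CREATE statements are invented for illustration:

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.Arrays;
    import java.util.List;

    public class SchemaMigrator {

        // One script per version step: index 0 upgrades version 1 -> 2, index 1 upgrades 2 -> 3, ...
        private static final List<String> UPGRADES = Arrays.asList(
                "ALTER TABLE customer ADD COLUMN email VARCHAR(255)",
                "CREATE TABLE invoice (id BIGINT PRIMARY KEY, total DECIMAL(10,2))"
        );

        public void migrate(Connection connection) throws Exception {
            int currentVersion = readVersion(connection);
            int targetVersion = UPGRADES.size() + 1;
            try (Statement stmt = connection.createStatement()) {
                for (int v = currentVersion; v < targetVersion; v++) {
                    stmt.executeUpdate(UPGRADES.get(v - 1)); // apply the step v -> v + 1
                }
                stmt.executeUpdate("UPDATE schema_version SET version = " + targetVersion);
            }
        }

        private int readVersion(Connection connection) throws Exception {
            try (Statement stmt = connection.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT version FROM schema_version")) {
                return rs.next() ? rs.getInt(1) : 1; // no row yet: treat as the first version
            }
        }
    }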
Don't use Hibernate's DDL generation for this. It throws away your data if you want to migrate. I suggest you take a look at Liquibase. Liquibase is database version control. It works using changesets. Each changeset can be created manually, or you can let Liquibase read your Hibernate config and generate one.
Liquibase can be started via Spring so it should fit right in with your project ;-)
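A minimal sketch of that Spring wiring using Java config; the changelog path is a placeholder:

    import javax.sql.DataSource;
    import liquibase.integration.spring.SpringLiquibase;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    public class LiquibaseConfig {

        @Bean
        public SpringLiquibase liquibase(DataSource dataSource) {
            SpringLiquibase liquibase = new SpringLiquibase();
            liquibase.setDataSource(dataSource);
            // Master changelog that includes one changeset file per release
            liquibase.setChangeLog("classpath:db/changelog/db.changelog-master.xml");
            return liquibase;
        }
    }

The SpringLiquibase bean applies any pending changesets when the context starts, which matches the "replace the WAR and it migrates itself" requirement.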
I'm currently working on a desktop application using JPA/Hibernate to persist data in a H2 database. I'm curious what my options are if I need to make changes to the database schema in the future for some reason. Maybe I'll have to introduce new entities, remove them or just change the types of properties in an entity.
Is there support in JPA/Hibernate to do this?
Would I have to manually script a solution?
I usually let Hibernate generate the DDL during development and then create a manual SQL migration script when deploying to the test server (which I later use for UAT and live servers as well).
The DDL generation in Hibernate does not offer support for data migration at all; if you do something as small as adding a non-null field, DDL generation cannot help you, because the existing rows need a value for the new column.
I have yet to find any truly useful migration abstraction to help with this.
There are a number of libraries (have a look at this SO question for examples), but when you're doing something like splitting an existing entity into a hierarchy using joined inheritance, you're always back to plain SQL.
Maybe I'll have to introduce new entities, remove them or just change the types of properties in an entity.
I don't have any experience with it but Liquibase provides some Hibernate Integration and can compare your mappings against a database and generate the appropriate change log:
The LiquiBase-Hibernate integration records the database changes required by your current Hibernate mapping to a change log file which you can then inspect and modify as needed before executing.
Still looking for an opportunity to play with it and find some answers to my pending questions:
does it work when using annotations?
does it require an hibernate.cfg.xml file (although this wouldn't be a big impediment)?
Update: Ok, both questions are covered by Nathan Voxland in this response and the answers are:
yes it works when using annotations
yes it requires an hibernate.cfg.xml (for now)
There are two options:
db-to-hibernate - mirror DB changes to your entities manually. This means your DB is "leading"
hibernate-to-db - either use hibernate.hbm2ddl.auto=update, or manually change the DB after changing your entity - here your object model is "leading"