I was learning some JPA to teach to some Java friends and I was wondering: how do you handle updates that come after the creation of the database in JPA? Let's say I have a production environment where there's data that I cannot lose.
Some changes come in; how do I apply them to my production environment? Is there a way for JPA to apply only the changes to the database?
Or do I need to manually create a SQL script to update my database?
Are there any other options?
Regards,
Rodrigo Dellacqua
Some changes come in; how do I apply them to my production environment? Is there a way for JPA to apply only the changes to the database?
Nothing standardized. In other words, that would be a provider-specific feature. For example, Hibernate has a SchemaUpdate tool that can (in theory) safely update a database schema. In practice, many people don't use it on a production database (myself included).
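For illustration, here is a minimal sketch of invoking that tool programmatically, assuming Hibernate 5.x (the API changed between versions) and a hypothetical Customer entity:

    import java.util.EnumSet;

    import org.hibernate.boot.Metadata;
    import org.hibernate.boot.MetadataSources;
    import org.hibernate.boot.registry.StandardServiceRegistry;
    import org.hibernate.boot.registry.StandardServiceRegistryBuilder;
    import org.hibernate.tool.hbm2ddl.SchemaUpdate;
    import org.hibernate.tool.schema.TargetType;

    public class SchemaUpdateRunner {
        public static void main(String[] args) {
            // Reads hibernate.cfg.xml / hibernate.properties from the classpath
            StandardServiceRegistry registry = new StandardServiceRegistryBuilder()
                    .configure()
                    .build();
            try {
                Metadata metadata = new MetadataSources(registry)
                        .addAnnotatedClass(Customer.class) // hypothetical entity
                        .buildMetadata();

                SchemaUpdate schemaUpdate = new SchemaUpdate();
                schemaUpdate.setDelimiter(";");
                // Also dump the generated ALTER statements to a file so they can be reviewed
                schemaUpdate.setOutputFile("schema-update.sql");
                schemaUpdate.execute(EnumSet.of(TargetType.SCRIPT, TargetType.DATABASE), metadata);
            } finally {
                StandardServiceRegistryBuilder.destroy(registry);
            }
        }
    }

Even then, review the generated script before letting anything touch a production schema.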
Or do I need to manually create a SQL script to update my database?
Using migration scripts (and maybe a database migration tool) is IMO the safe way to handle this and the way to go on real-life projects.
And again, some migration tools may provide support for a given JPA provider. For example, Liquibase offers Hibernate support and can diff your entities against a database to generate a change script.
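Once you have such a change script (change log), applying it is the easy part. A rough sketch using the Liquibase Java API (3.x style; the connection details and change log path are made up):

    import java.sql.Connection;
    import java.sql.DriverManager;

    import liquibase.Contexts;
    import liquibase.Liquibase;
    import liquibase.database.Database;
    import liquibase.database.DatabaseFactory;
    import liquibase.database.jvm.JdbcConnection;
    import liquibase.resource.ClassLoaderResourceAccessor;

    public class ApplyChangeLog {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection details
            Connection connection = DriverManager.getConnection(
                    "jdbc:mysql://localhost:3306/mydb", "user", "password");

            Database database = DatabaseFactory.getInstance()
                    .findCorrectDatabaseImplementation(new JdbcConnection(connection));

            // The change log holds the generated/hand-written change sets
            Liquibase liquibase = new Liquibase(
                    "db/changelog/db.changelog-master.xml",
                    new ClassLoaderResourceAccessor(),
                    database);

            // Applies only the change sets that have not yet been run against this database
            liquibase.update(new Contexts());
            database.close();
        }
    }

Most people run this through the CLI, a Maven/Gradle plugin, or the Spring Boot integration rather than hand-rolled code, but the mechanics are the same.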
Related
We have a Java application which uses Spring Boot and Hibernate.
There are many changes to entities and fields. That's why I want to track changes and have a rollback mechanism, so I need a version control system over the database. I checked Flyway and Liquibase, but I don't think they solve my problem, because my table creation and updates are handled by Hibernate.
Is there any way to see which queries Hibernate executes to change the database, and which changes have occurred since the latest database change (I mean a new table, a new column, or refactoring)?
One way to do it (how we do it):
Use two databases: a reference database and a development database.
On the development database, let Hibernate create the structure.
Once a development cycle is done, run liquibase diffChangelog against the reference database (an example invocation is shown below). It will create a changelog.xml with all the changes Hibernate made on the development DB. Correct it manually (names, etc.).
When you're happy with the changelog file and the development cycle is done, apply the changelog to the reference database.
Start your next development cycle and repeat.
That way you combine the advantages of letting Hibernate generate the schema with using Liquibase to keep a versioned, re-creatable DB schema.
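For reference, the diff step typically needs the liquibase-hibernate extension on the classpath and looks roughly like this; the exact flag syntax varies between Liquibase versions, and the URLs, credentials and package name here are placeholders:

    liquibase --changeLogFile=changelog.xml \
      --url=jdbc:h2:./reference-db --username=sa --password= \
      --referenceUrl=hibernate:spring:com.example.domain?dialect=org.hibernate.dialect.H2Dialect \
      diffChangeLog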
Use those tools, as they are dedicated to this purpose. Personally, I like Liquibase more, but it's your choice.
Hibernate's schema creation mechanism should not be used in production, because your Java description of the entity would then drive the creation of the tables, which can result in an inefficient structure.
That feature is only there for testing purposes.
I'm building a Spring Boot web app with MySQL, and until now I have used
spring.jpa.hibernate.ddl-auto=create-drop
in my properties file. Now I want to move to production and I can't use this line anymore, because, as you all know, it destroys all the data and recreates the tables on every deploy.
For dev purposes it's wonderful, but what do I need to do next? I want it to behave exactly as it did with ddl-auto, but to persistently save the data and, most importantly, never drop it.
P.S. Does hibernate.ddl-auto have anything to do with the JPA repositories?
I use CrudRepository a lot and I need it to continue working. Will it?
The best thing to do, in my opinion, is:
- use the option spring.jpa.hibernate.ddl-auto=create-drop to create the DB schema and the default data (if any) in the development environment
- export the created DB schema as a normal DDL script (a properties sketch for this step follows the list)
- give the DDL to the DBAs to check whether any improvement must be made (e.g. add some indexes, review some FKs, etc.)
- adapt the JPA models after the DBAs' review
- give the final DDL to the "production DBAs" in order to create the final, correct schema in the production environment too
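One way to do the export step without extra tooling, assuming JPA 2.1+ and Spring Boot (the lines below use Boot's spring.jpa.properties pass-through; the target file name is arbitrary), is to let the provider write the generated DDL to a script:

    spring.jpa.properties.javax.persistence.schema-generation.scripts.action=create
    spring.jpa.properties.javax.persistence.schema-generation.scripts.create-target=build/create-schema.sql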
Regarding your question:
Does hibernate.ddl-auto have anything to do with the JPA repositories? I use CrudRepository a lot and I need it to continue working. Will it?
You can of course keep using CrudRepository; this option will not influence your business logic.
I hope it's useful
Angelo
You don't want to be sending DDLs around. You will end up trying to invent a version control system for your scripts by naming them in a specific way or putting them in folders, you will struggle communicating with the DBAs, scripts will break...
You want your database definition code to be part of your code base so you can put it under version control (yes, git).
Try Liquibase for this. It will help you do automatic updates of the schema, data, everything DB-related, and it knows how to migrate your app's DB, let's say, from 1.1 to 1.2 but also from 1.1 to 1.6. There are also other DB migration tools like Flyway; you can look them up and play around.
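As a rough sketch of what that looks like in a Spring Boot project (property names as in Spring Boot 2.x; the changelog path is just a common convention):

    # Let Liquibase own the schema; Hibernate only validates it
    spring.jpa.hibernate.ddl-auto=validate
    spring.liquibase.change-log=classpath:db/changelog/db.changelog-master.xml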
We are doing a project in which we have planned to use JPA Persistence. We think that once the project goes live, there is a small chance that changes in the data model might be required.
My question is: what strategies are available to handle such a change? In particular, I have the following questions:
With updated JPA classes, what are the best practices for incorporating them into the existing database schema?
With JPA, are there any best practices to archive old data, update the database schema, and migrate the database to the new schema?
What are the various kinds of changes (broadly speaking) that will make such a migration impossible?
In RHQ (http://rhq-project.org/) we have some dbutils with a schema description in XML that serves to populate the initial schema on an empty database, and another XML file that registers changes to this base schema as individual "diffs" of DDL and DML statements.
Whenever a JPA class is changed (in a schema-relevant way), both XML files are updated. On the next run of the installer, it looks at the existing database, determines its version, and then plays all the update steps from the version in the DB up to the most current one.
This dbutils code is available in git.
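I can't speak for the RHQ code itself, but the mechanism described, reading the current schema version from the database and replaying only the newer steps, boils down to something like this hypothetical sketch:

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.Statement;
    import java.util.Map;
    import java.util.SortedMap;

    public class SchemaUpgrader {

        // 'steps' maps a schema version number to the DDL/DML for that upgrade step (hypothetical).
        public void upgrade(Connection connection, SortedMap<Integer, String> steps) throws Exception {
            int currentVersion = readCurrentVersion(connection);
            try (Statement statement = connection.createStatement()) {
                // Replay only the steps that are newer than what the database already has
                for (Map.Entry<Integer, String> step : steps.tailMap(currentVersion + 1).entrySet()) {
                    statement.execute(step.getValue());
                    // Record the new version so repeated installer runs are harmless
                    statement.execute("UPDATE schema_version SET version = " + step.getKey());
                }
            }
        }

        private int readCurrentVersion(Connection connection) throws Exception {
            try (Statement statement = connection.createStatement();
                 ResultSet rs = statement.executeQuery("SELECT version FROM schema_version")) {
                return rs.next() ? rs.getInt(1) : 0;
            }
        }
    }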
There are other frameworks around like liquibase that can help you here.
You can also take a look at this framework:
http://flywaydb.org
Advertised as: "The agile database migration framework for Java"
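As a taste of it, a minimal programmatic invocation with the Flyway fluent API (Flyway 5+; the connection details are placeholders) looks like this:

    import org.flywaydb.core.Flyway;

    public class Migrate {
        public static void main(String[] args) {
            Flyway flyway = Flyway.configure()
                    .dataSource("jdbc:mysql://localhost:3306/mydb", "user", "password") // placeholders
                    .locations("classpath:db/migration") // where the versioned SQL scripts live
                    .load();
            // Applies any migrations not yet recorded in Flyway's history table
            flyway.migrate();
        }
    }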
In my experience, migrations are not the problem (Hibernate can do them automatically), but rollbacks are, if you are dealing with destructive changes. For example, if you remove a column, there's no way to roll back that change unless you have the data from that column backed up somewhere. The best way to do such backups probably depends on your DB vendor.
I'm currently working on a desktop application using JPA/Hibernate to persist data in an H2 database. I'm curious what my options are if I need to make changes to the database schema in the future for some reason. Maybe I'll have to introduce new entities, remove some, or just change the types of properties in an entity.
Is there support in JPA/Hibernate to do this?
Would I have to manually script a solution?
I usually let Hibernate generate the DDL during development and then create a manual SQL migration script when deploying to the test server (which I later use for UAT and live servers as well).
The DDL generation in Hibernate offers no support for data migration at all; if you do as little as adding a non-null field, DDL generation cannot help you.
I have yet to find any truly useful migration abstraction to help with this.
There are a number of libraries (have a look at this SO question for examples), but when you're doing something like splitting an existing entity into a hierarchy using joined inheritance, you're always back to plain SQL.
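To make the non-null example concrete: the hand-written migration usually has to add the column as nullable, backfill it, and only then add the constraint. Very roughly, in H2-flavoured SQL with made-up table and column names:

    ALTER TABLE customer ADD COLUMN status VARCHAR(20);
    UPDATE customer SET status = 'ACTIVE';
    ALTER TABLE customer ALTER COLUMN status SET NOT NULL;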
Maybe I'll have to introduce new entities, remove some, or just change the types of properties in an entity.
I don't have any experience with it but Liquibase provides some Hibernate Integration and can compare your mappings against a database and generate the appropriate change log:
The LiquiBase-Hibernate integration records the database changes required by your current Hibernate mapping to a change log file which you can then inspect and modify as needed before executing.
Still looking for an opportunity to play with it and find some answers to my pending questions:
does it work when using annotations?
does it require a hibernate.cfg.xml file (although this wouldn't be a big impediment)?
Update: Ok, both questions are covered by Nathan Voxland in this response and the answers are:
yes, it works when using annotations
yes, it requires a hibernate.cfg.xml (for now)
There are two options:
db-to-hibernate - mirror DB changes to your entities manually. This means your DB is "leading"
hibernate-to-db - either use hibernate.hbm2ddl.auto=update, or manually change the DB after changing your entity - here your object model is "leading"
I just wanted to hear the opinion of Hibernate experts about DB schema generation best practices for Hibernate/JPA based projects. Especially:
What strategy to use when the project has just started? Is it recommended to let Hibernate automatically generate the schema in this phase or is it better to create the database tables manually from earliest phases of the project?
Assuming that throughout the project the schema was generated using Hibernate, is it better to disable automatic schema generation and manually create the database schema just before the system is released into production?
And after the system has been released into production, what is the best practice for maintaining the entity classes and the DB schema (e.g. adding/renaming/updating columns, renaming tables, etc.)?
It's always recommended to generate the schema manually, preferably with a tool that supports database schema revisions, such as the great Liquibase. Generating the schema from the entities is great in theory, but it is fragile in practice and causes lots of problems in the long run (trust me on this).
In production it's always best to have a manually generated and reviewed schema.
You make an update to an entity and create a matching update script (revision) to bring your database schema in line with the entity change. You can create a custom solution (I've written a few) or use something more popular like Liquibase (it even supports rolling back schema changes). If you're using a build tool such as Maven or Ant, it's recommended to plug the DB schema update utility into the build process so that fresh builds stay in sync with the schema.
Although it's disputable, I'd say the answer to all three questions is: let Hibernate automatically generate the tables in the schema.
I haven't had any problems with that so far. You might need to clean some fields up manually from time to time, but this is no headache compared to separately keeping track of DDL scripts - i.e. managing their revisions and synchronizing them with entity changes (and vice versa).
For deploying to production - an obvious tip - first make sure everything is generated OK in the test environment, and then deploy to production.
Manually, because:
- The same database may be used by different applications, and not all of them will be using Hibernate or even Java. The database schema should not be dictated by the ORM; it should be designed around the data and business requirements.
- The data types chosen by Hibernate might not be best suited for the application.
- As mentioned in an earlier comment, changes to the entities will require manual intervention if data loss is not acceptable.
- Things such as additional properties (the generic term, not Java properties) on join tables work wonderfully in an RDBMS but are somewhat complex and inefficient to use in an ORM. Mapping them from the ORM to the RDBMS might create tables that are not efficient. In theory, it is possible to build the exact same join table using Hibernate-generated code, but it would require some special care while writing the entities.
I would use automatic generation for standalone applications, or for databases that are accessed via the same ORM layer, and also if the app needs to be portable to different databases. It would save a lot of time by not requiring one to write and maintain DB-vendor-specific DDL scripts.
Like Bozhidar said, don't let Hibernate create and update the database schema.
Let your application create and update the database schema instead.
For Java, the best tool to do this is Flyway. You create one or more SQL files with DDL statements describing your database schema, and these SQL files are then executed by Flyway. For more information, look at the Flyway site.
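For what it's worth, with Spring Boot the setup is mostly convention-based: versioned SQL files named like V1__init.sql, V2__add_customer_table.sql (the names here are made up) go under db/migration, and a couple of properties wire it up (names as in Spring Boot 2.x; the location shown is the default anyway):

    spring.flyway.locations=classpath:db/migration
    # Keep Hibernate from touching the schema; Flyway owns it
    spring.jpa.hibernate.ddl-auto=validate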
I believe that a lot of what is being discussed or argued here also comes down to whether you are more comfortable with the code-first or the database-first approach.
Personally, I am more inclined to go for the latter and, with a nod to the Single Responsibility Principle (SRP), I prefer having a DB specialist handle the DB and an application specialist handle the application, rather than having the application handle the DB. Additionally, I am of the opinion that taking too many shortcuts will work fine at the beginning but create unmanageable problems as things grow and evolve.