How to exclude specific columns when generating entities from existing DB? - java

I'm relatively new to Hibernate, so I have the following question:
Is there any way to exclude specific columns when I generate an entity from an existing DB? For this purpose I'm using Hibernate Tools with Eclipse Luna.

You can delete the column from the generated files if you like.

I'm also new to Hibernate and need to generate lots of entities from a DB. After struggling all day long over how to do that, I finally got a decent answer: use Hibernate Tools, either as a Maven plugin or with Ant.
I found this interesting discussion on how to initially configure Hibernate tools as a Maven plugin: https://developer.jboss.org/message/801478#801478
Then, for your specific question (which is also mine) about how to exclude unwanted columns from entity generation, here is the answer:
You need to write a hibernate-reverse-engineering.xml file something like this:
<hibernate-reverse-engineering>
  <table name="myTable">
    <!-- ...magic tricks and configurations... -->
    <column name="myColumnName" exclude="true"/>
  </table>
</hibernate-reverse-engineering>
But that only covers the columns of myTable, not all the columns in your DB/schema (which is what I needed =[ ). I don't see any global column-filter tag or anything similar to tell Hibernate to ignore that column in every table you are mapping to JPA entities.
Here is the documentation on the hibernate-reverse-engineering.xml with examples and all the magic tricks you can do:
http://docs.jboss.org/tools/latest/en/hibernatetools/html/reverseengineering.html

Related

How to store Flyway H2 migrations' history in another schema than PUBLIC?

Using Flyway with an H2 database, I get all SQL schema migration history stored out of the box in PUBLIC.schema_version.
I would like to store this table in a dedicated SQL schema FLYWAY, like this: FLYWAY.history.
The reasons for doing so are to avoid visually cluttering the H2 console when browsing PUBLIC tables, and to avoid any namespace collision.
By modifying the property flyway.table, the name for the history table can be changed.
But using flyway.table=FLYWAY.history does not work. The schema FLYWAY is not created and the table PUBLIC.'FLYWAY.history' gets created instead.
How should one tweak Flyway configuration in order to achieve the expected result?
Providing this property solves the problem partially: flyway.schemas=FLYWAY,PUBLIC.
By doing so, the history table will be stored in the schema FLYWAY but all the migrations will be run by default on this schema.
Please refer to http://flywaydb.org/documentation/commandline/migrate.html and look for schemas for more details.
I found two issues with this approach, which can be fixed with minor tweaks.
1st issue:
The schema FLYWAY must exist before any Flyway migration attempt. This can be done in Java by executing stmt.execute("CREATE SCHEMA IF NOT EXISTS FLYWAY"); and closing the connection before the migration.
2nd issue:
All the migrations will run by default on the schema FLYWAY.
This can be fixed by modifying each SQL migration file to specifically point to the PUBLIC schema. Each file would then contain statements like: create table PUBLIC.PERSON (...);
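Putting both tweaks together, here is a rough Java sketch of the whole setup. It assumes the classic Flyway Java API with setter methods (pre-5.x style) and uses a placeholder H2 URL and credentials; adjust for your Flyway version and connection details.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;
import org.flywaydb.core.Flyway;

public class MigrateWithSeparateHistorySchema {
    public static void main(String[] args) throws Exception {
        String url = "jdbc:h2:~/testdb"; // placeholder URL and credentials

        // 1st issue: make sure the FLYWAY schema exists before Flyway runs
        try (Connection conn = DriverManager.getConnection(url, "sa", "");
             Statement stmt = conn.createStatement()) {
            stmt.execute("CREATE SCHEMA IF NOT EXISTS FLYWAY");
        }

        Flyway flyway = new Flyway();
        flyway.setDataSource(url, "sa", "");
        // the first schema in the list is where the history table is created
        flyway.setSchemas("FLYWAY", "PUBLIC");
        flyway.setTable("history"); // results in FLYWAY.history
        flyway.migrate();

        // 2nd issue: migrations now default to the FLYWAY schema, so each SQL
        // file must qualify its objects, e.g. CREATE TABLE PUBLIC.PERSON (...)
    }
}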
I've solved my problem at hand, but I'm not fully happy with the fix and the extra manual work. Hopefully someone can come with a better answer (more native way) and less tweaks.

Hibernate Domain Object Generation

I'm trying to understand how to best generate and synchronize domain model POJO's from my database using Hibernate. Right now the process I managed to build is the following:
Build the ER schema on the database
Have a hibernate.reveng.xml file containing the table elements (one for each table)
Use JBoss Tools in Eclipse to run a code generation configuration where I set the target package and location and the aforementioned reveng.xml file, and get the generated POJOs, mapping files and hibernate.cfg.xml file
But this has a lot of problems:
I cannot map common fields (ID, created by, modified by, etc.) to a particular base entity.
I have to manage a lot of mapping files (it doesn't seem to generate a single one).
I cannot generate a basePojo and extend it with my own class, so that my modifications to the POJOs aren't overridden by the next code generation.
I cannot fine-tune the output location of the generated artifacts (mappings, .cfg and POJOs); they all go into the same base folder (POJOs are placed according to the package name I set).
Is it possible to "tell" the generator to map the common table fields (createdBy, modifiedBy, ID, etc.) to the same base class?
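To make the intent concrete, this is the kind of shared base class I have in mind; a hand-written JPA @MappedSuperclass sketch with illustrative field names, not something the generator produces for me today:

import javax.persistence.Column;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.MappedSuperclass;

// Common audit/identity fields that every generated entity would ideally extend.
@MappedSuperclass
public abstract class BaseEntity {

    @Id
    @GeneratedValue
    private Long id;

    @Column(name = "created_by")
    private String createdBy;

    @Column(name = "modified_by")
    private String modifiedBy;

    // getters and setters omitted for brevity
}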
I'm questioning whether this approach makes sense at all. Should I be managing my POJOs by hand? I don't mind that, but some help managing the mapping files (.hbm.xml) would help a lot.
Or should I find some way to go "code first", i.e. write the POJOs and then generate the schema? I'm used to .NET's Entity Framework and I feel quite lost about the "proper" way to build the persistence layer in Java/Hibernate.
Thank you
The Telosys Tools code generator is probably the solution for you.
It uses an existing database to generate any kind of source file for each entity (database table): typically POJOs, DTOs, DAOs, web pages, etc.
When the database schema changes, you just have to regenerate.
For more information see the web site: https://sites.google.com/site/telosystools/
and the tutorials: https://sites.google.com/site/telosystutorial/
All the templates are free and customizable.
For Hibernate POJOs you can use the JPA templates (https://github.com/telosys-tools/persistence-jpa-TT210-R2) and adapt them if necessary.

update hibernate configuration file in Netbeans

I want to update my MySQL database and alter some tables (add new tables and columns). Is there any way to update the Hibernate configuration file and mapping classes, and add entity classes for the newly added tables, in the NetBeans IDE without manual coding? This is for a Swing application.
There are some tools out there to generate Hibernate mappings, like this. For NetBeans, I think this link might help you.

Migrating Data accross different DB Schema

I want to migrate my data from one DB to another using Java. Both DBs have different schema structures. I might also need to define some mapping/validation rules. Can anyone please guide me towards a strategy, framework, or open source project?
Thanks
In that case, don't I have to create all the POJOs to match both schemas (even if auto-generated)? Is there any way to avoid this, i.e. supply a schema mapping and generate the POJOs on the fly in memory?
Any idea?
Thanks
Yes, you need an Extract-Transform-Load (ETL) tool.
Here are some open source choices:
http://www.google.com/search?gcx=w&sourceid=chrome&ie=UTF-8&q=open+source+etl
ETL is generally used for this, as in duffymo's answer. You could also try ORM tools for this:
There is the Torque project: http://db.apache.org/torque/
Read the data from your existing schema into Java objects, then set them into the objects for the other schema and save those to the database. I am pretty sure Hibernate can also be used, although I haven't used Hibernate per se; it works in the same way as Torque.
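A rough sketch of that read-map-write loop with plain JPA and two persistence units, one per schema. The persistence unit names (sourcePU, targetPU) and the OldCustomer/NewCustomer entity classes are made-up examples; substitute your own mapped entities and mapping/validation rules.

import java.util.List;
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

public class SimpleSchemaMigration {
    public static void main(String[] args) {
        // One persistence unit mapped to each schema (names are placeholders).
        EntityManagerFactory sourceEmf = Persistence.createEntityManagerFactory("sourcePU");
        EntityManagerFactory targetEmf = Persistence.createEntityManagerFactory("targetPU");

        EntityManager source = sourceEmf.createEntityManager();
        EntityManager target = targetEmf.createEntityManager();
        target.getTransaction().begin();

        // Read from the old schema (OldCustomer is a hypothetical mapped entity).
        List<OldCustomer> customers =
                source.createQuery("select c from OldCustomer c", OldCustomer.class).getResultList();

        // Apply the mapping/validation rules, then write to the new schema.
        for (OldCustomer old : customers) {
            NewCustomer fresh = new NewCustomer();
            fresh.setFullName(old.getFirstName() + " " + old.getLastName());
            target.persist(fresh);
        }

        target.getTransaction().commit();
        source.close();
        target.close();
        sourceEmf.close();
        targetEmf.close();
    }
}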

How to generate orm mapping classes from sql schema in Java

I have an existing SQL schema file for a DB. Is it possible to generate, and re-generate when needed, DAOs, entities and all the other required helper/client classes to access it? I don't mind what it is: Hibernate, another JPA implementation, or something else.
Assuming you or others are still looking for a solution:
I just had the same problem and got it working in Eclipse (slightly differently) as follows:
Created a JPA project and downloaded and added the user library in the wizard.
I also wanted to give a schema SQL file as input, but instead found a way to use an actual DB as input (that was surely much easier for the developers of the tool to process than parsing proprietary SQL script files).
To do that, right-click your JPA project and choose New / Other / JPA / Entities from Tables.
In the following wizard you have to create a DB connection to the database whose schema you want to get as JPA-annotated POJOs (IMHO it's very intuitive, but ask if there is a problem).
After finishing, all the JPA classes are generated from the DB... saved me a lot of dummy work :)
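For reference, the classes that the "Entities from Tables" wizard produces look roughly like the sketch below; the table and column names here are invented, and the exact output depends on the wizard options you pick.

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;

// Approximate shape of a generated entity class.
@Entity
@Table(name = "EMPLOYEE")
public class Employee {

    @Id
    @Column(name = "EMP_ID")
    private long empId;

    @Column(name = "FIRST_NAME")
    private String firstName;

    public long getEmpId() { return empId; }
    public void setEmpId(long empId) { this.empId = empId; }

    public String getFirstName() { return firstName; }
    public void setFirstName(String firstName) { this.firstName = firstName; }
}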
