How do you stop Eclipse from automatically creating JPA entities from tables? - java

I have a JPA project in Eclipse and something keeps synchronizing the DB tables with the entities. The problem is that the generated entities end with an underscore ("_.java"). I already created the entities from the schema manually, so this creates duplicate entities; version control then picks them up and wants to add them to the repo, etc.
How do I tell eclipse not to generate the entities, or to generate them but without the underscore?

Those xxx_.java files are metamodel classes. They are necessary, are generated automatically by the framework, and should not interfere with your work. Why do you want to delete them?
As this site says:
This metamodel is important in 2 ways.
First, it allows providers and frameworks a generic way to deal with an application's domain model.
Second, from an application writer's perspective, it allows very fluent expression of completely type-safe criteria queries, especially the Static Metamodel approach.
And:
For each managed class X in package p, a metamodel class X_ in package p is created.
The name of the metamodel class is derived from the name of the managed class by appending "_" to the name of the managed class.
From what you have written, everything appears to be normal. The duplicated classes and the trailing _ are needed by the framework and you can simply ignore them. You should read this site too for more information.
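To see what these generated classes are for, here is a minimal sketch (the Customer entity and its field are invented for illustration, not taken from the question) of an entity, the metamodel class the provider generates next to it, and a type-safe criteria query that uses it:

import javax.persistence.Entity;
import javax.persistence.Id;

@Entity
public class Customer {
    @Id
    private Long id;
    private String name;
    // getters and setters omitted
}

// Generated by the provider as Customer_.java in the same package
import javax.persistence.metamodel.SingularAttribute;
import javax.persistence.metamodel.StaticMetamodel;

@StaticMetamodel(Customer.class)
public class Customer_ {
    public static volatile SingularAttribute<Customer, Long> id;
    public static volatile SingularAttribute<Customer, String> name;
}

// Type-safe criteria query using the metamodel (em is an EntityManager)
CriteriaBuilder cb = em.getCriteriaBuilder();
CriteriaQuery<Customer> query = cb.createQuery(Customer.class);
Root<Customer> root = query.from(Customer.class);
query.where(cb.equal(root.get(Customer_.name), "Alice")); // checked at compile time
List<Customer> customers = em.createQuery(query).getResultList();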
If you simply don't want those files added to your repo, there are ways to exclude files from commits depending on the version control system (for example, a .gitignore file in Git).

Related

IntelliJ randomly re-assigns Entity's Data Source

I have a JPA 2.1 application that connects to two different databases with two different sets of tables. Within IntelliJ, there is a Persistence View you can use to assign each JPA Entity to the appropriate Data Source and this works fine. IntelliJ is able to validate the Entity's Table and Column against the corresponding table in the Data Source.
Every now and then, IntelliJ (version 2018.3) loses my choice and attaches the Entity to the other Data Source. I find this out when I open the class and find the Entity's table and columns don't match. I stumble across the change some indefinite time after the swap has occurred.
My work-around is to manually remove the incorrect assignment and make the correct assignment. IntelliJ's inability to remember this assignment is getting old.
I suspect IntelliJ might be auto-generating files to represent the classes which are annotated @Entity, and maybe this is where the problem lies.
I understand I could add the Entity to the persistence.xml using the <class> element so that the data source assignment is made in this config file, but it only appears to be a problem within IntelliJ. Deployments (Maven-based) to the server compile and run as expected.
Is there something I can do with IntelliJ to avoid losing the data source assignments?
Other notes:
These mappings are not recorded in the <module>.iml file. I remain unsuccessful at finding where they are recorded.
Using the <class> attribute in the persistence.xml is not considered by IntelliJ for validation.
Using the @PersistenceUnit(unitName = "unitName") annotation on the Entity is not considered by IntelliJ for validation.
Getting IntelliJ to remember the association between an Entity and its DataSource is the wrong approach. Instead, IntelliJ needs to be told the association between the Persistence Unit and its DataSource.
Both options are available from the Persistence View that is enabled when the JPA facet is enabled, but IntelliJ lists all annotated Entities under both persistence units, which can lead you to think you need to open each Entity to perform the mapping.
The only action that is required within the Persistence View is to map each Persistence Unit to its DataSource.
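For reference, the two-database setup described in the question corresponds roughly to a persistence.xml like the following sketch, with one persistence unit per database (the unit names, class names, and data source names are invented for illustration):

<persistence xmlns="http://xmlns.jcp.org/xml/ns/persistence" version="2.1">
    <persistence-unit name="ordersUnit" transaction-type="JTA">
        <jta-data-source>java:/jdbc/OrdersDS</jta-data-source>
        <class>com.example.orders.Order</class>
        <exclude-unlisted-classes>true</exclude-unlisted-classes>
    </persistence-unit>
    <persistence-unit name="reportingUnit" transaction-type="JTA">
        <jta-data-source>java:/jdbc/ReportingDS</jta-data-source>
        <class>com.example.reporting.Report</class>
        <exclude-unlisted-classes>true</exclude-unlisted-classes>
    </persistence-unit>
</persistence>

In the Persistence View, each of these two units is then mapped once to its corresponding IntelliJ Data Source; the individual Entities do not need to be mapped one by one.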

How do I instantiate an EntityManagerFactory that augments a persistence.xml with extra classes?

I have a module with a persistence.xml for several classes. I have an application which uses that module, but wants to augment that EntityManagerFactory with a couple of other classes that are specific to this application and don't belong in the module.
If I create a persistence.xml in the application that overrides the persistence unit, it does not work reliably (it works when run from IntelliJ's debugger, but not when invoked from a Maven appassembler package), because the rules governing which persistence.xml among the various jars takes effect seem to be beyond my understanding, and are probably difficult to control.
If I create a second persistence unit containing only the new tables, then I will need multiple EntityManagerFactory instances to retrieve the various object types in JPA. I do not currently need to execute queries that join objects from the library module with objects specific to the application module, but I am reasonably certain that would be impossible if the objects were in different persistence units.
Even worse, using multiple persistence units appears to make Derby angry, because the second persistence unit fails when it finds that the database is already opened (by the first persistence unit; why Derby can't share the database within the same JVM I don't know, and there may be workarounds I am not aware of).
What are the dangers if you have persistence units that overlap? (Both units have objects mapped to the same table in the same database.)
What are the proper guidelines for dealing with persistence units from multiple .jars?
Using standard JPA functionality, there is no way to supply additional entity classes at runtime.
The approach I recommend is to remove persistence.xml from your modules and create ORM mapping files (orm.xml) that contain the same entities as the original persistence.xml files.
Then create a single persistence unit in the application and include the ORM files from all required modules.
This is what has always worked for me, and it seems the only reasonable approach with the current JPA version.
This way you end up with a single persistence unit while still having modular and extensible sets of entities.
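As a minimal sketch of what the single application-level persistence.xml could look like under this approach (the unit name, mapping-file paths, and JDBC URL are invented for illustration):

<persistence xmlns="http://xmlns.jcp.org/xml/ns/persistence" version="2.1">
    <persistence-unit name="appUnit" transaction-type="RESOURCE_LOCAL">
        <!-- ORM files contributed by the library module and by the application itself -->
        <mapping-file>META-INF/library-orm.xml</mapping-file>
        <mapping-file>META-INF/app-orm.xml</mapping-file>
        <properties>
            <property name="javax.persistence.jdbc.url" value="jdbc:derby:appdb;create=true"/>
        </properties>
    </persistence-unit>
</persistence>

Because everything ends up in one unit, a single EntityManagerFactory covers both the library's entities and the application-specific ones, and Derby is only opened once.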

Separating JPA information from POJO

I am working on a project that will have entities persisted to a database using JPA. We will be using Maven as the project management framework. I am wondering if it would be possible to create one project for the POJOs and another for the persistence definitions and then "combine" the two into a single output that contains the POJOs and their persistence information.
Basically I am trying to separate the code that defines the POJOs from the persistence definition, because the POJOs may be reused by several different projects that may or may not need to persist them and may or may not want to change the persistence information. (Similar, but not quite the same, as Is it possible to build a JPA entity by extending a POJO?)
I have two ideas on how I might be able to do it. If I were to use the POJOs in a web application, I could provide a persistence.xml that maps the classes in that project and just add a dependency on the project containing the POJOs. But if I wanted to create a single jar file containing the persistence information and the POJOs, I think I could use the shade plugin?
Is there any other way to essentially merge two maven projects into a single output and is this a reasonable thing to want to do?
If I remember correctly, annotations do not have to be on the classpath if you're not using them. The annotated classes can still be loaded.
So my recommendation would be:
Stick with the JPA annotations, as this is the easiest way to define the mappings and tooling support is usually better.
Declare the JPA dependencies as optional and probably also as provided (see the sketch after this list).
If you need to override the mappings defined by the annotations, it should be possible to do this using the persistence.xml, AFAIK (never tried).
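As a rough sketch of the second point (the artifact coordinates below are just one common choice for the JPA 2 API, not something prescribed by this answer), the POJO module's pom.xml could declare:

<dependency>
    <!-- JPA API only, kept out of the transitive classpath of consumers -->
    <groupId>javax.persistence</groupId>
    <artifactId>javax.persistence-api</artifactId>
    <version>2.2</version>
    <scope>provided</scope>
    <optional>true</optional>
</dependency>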
I do appreciate the input. In the end, the solution for me was to create two projects. The first provided the definition of the POJOs without any JPA information. Yes, there are some JPA-related members, such as id, but I will address those. The second project contained the JPA metadata (orm and persistence XML files).
As for the members related to persistence (e.g. id), I could probably have lived with those in the model classes, but following the suggestion in this post (Is it possible to build a JPA entity by extending a POJO?) I extended my POJO classes and declared id in the "entity" subclasses. This does require some consideration of member access when defining the POJO.
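As a rough sketch of that layout (the class and field names are invented for illustration), the POJO stays free of JPA imports and the entity subclass adds the id:

// "Pure" model project: no JPA dependency at all
public class Person {
    private String name;
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}

// Persistence project: the entity subclass adds the identifier
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

@Entity
public class PersonEntity extends Person {
    @Id
    @GeneratedValue
    private Long id;
    public Long getId() { return id; }
}

Since the POJO itself carries no annotations, its inherited state (name here) would be mapped through the orm.xml in the persistence project, for example by declaring Person as a mapped superclass there.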
One thing to note: this solution runs into trouble when you have a class hierarchy (inheritance in your model). The classes in your "pure" model inherit from some common base class, and that base class is extended in the "persistence" model to provide the id and other persistence-related members. But because the persistent subclasses extend the classes in the "pure" model, they do not inherit the id and the other persistent members.
There may be workarounds using different inheritance mapping strategies, such as table per concrete class.

Where should we place HBM files?

We have a team of 5 to 8 people and our project uses Hibernate (ORM), but we are facing some problems with HBM files and their respective VOs (Value Objects). We are all working on different modules, and each of us creates HBM files and their respective VOs for our own module (so each module has its own HBM files and VOs). If a common table is used in more than one module, we end up with multiple HBM files and VOs for that single table. So should we place all the HBM files and VOs in one specific location, or keep them module-specific even if that means multiple HBMs and VOs? Please point out the good and bad practices as well.
Thanks
From the question it seems each module has its own data access. If it's not very complex, you can put all the data access in a separate module. A project can have multiple modules, but it should have one place for data access.
As suggested, you can have a DAO module that only handles data-related operations.
Packages can be used to identify different DAO types.
The common DAO should be kept simple. Business logic should not go into it; logic should be handled at a higher level.
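As an illustration of keeping the common DAO thin (the class and method names here are invented, not taken from this answer), the shared data-access module might expose something like:

import java.io.Serializable;
import org.hibernate.SessionFactory;

// Shared DAO: persistence plumbing only, no business logic
public class GenericHibernateDao<T> {
    private final SessionFactory sessionFactory;
    private final Class<T> type;

    public GenericHibernateDao(SessionFactory sessionFactory, Class<T> type) {
        this.sessionFactory = sessionFactory;
        this.type = type;
    }

    @SuppressWarnings("unchecked")
    public T findById(Serializable id) {
        return (T) sessionFactory.getCurrentSession().get(type, id);
    }

    public void save(T entity) {
        sessionFactory.getCurrentSession().saveOrUpdate(entity);
    }
}

Each module then builds its business logic on top of DAOs like this instead of talking to Hibernate sessions directly.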
Other than that:
Your project should be properly structured, i.e. packages should be clearly defined.
module1/src/../com/../../bl
module2/src/../com/../../bo
dataaccess/src/../com/../../bl
dataaccess/src/../com/../../bo
Dependencies should be made explicit. If you have one DAO module, then the DAO should be independent; other modules should depend on the DAO. If it's Java, you can use Maven to do this.
Finally, it's the choice we make. There are a lot of best practices; you should choose what suits your scenario best. In the end it should be simple and manageable in the future.
There should be a common project that contains all the DAO-related stuff. Each module/project then includes that common DAO project in its classpath to perform Hibernate and database-related operations. This avoids duplicating HBM files and makes the code easier to maintain.

Using Hibernate with Dynamic Eclipse Plug-ins

I have classes that are named exactly the same across different plug-ins that I use for my application, and I'd like to be able to configure them properly with Hibernate. The problem is that Hibernate appears to resolve a class's package name dynamically when looking a class up during mapping. With one plug-in this scheme works, but across multiple plug-ins it does not. It looks like Hibernate gets confused when dealing with Hibernate configuration files across multiple plug-ins.
Is this because each plug-in has its own class-loader? What is the best way to proceed to make this work with the existing plug-ins and Hibernate?
The problem is that every plug-in has its own classloader and Hibernate uses reflection to find the right classes.
I have a very nice article at home about exactly this problem, but this one is in German. I will try to explain what you need to do.
In order to share the data structure across several plug-ins, you have to put it in a plug-in and enable a feature called buddy policy.
Let's say you have a main application plug-in which initializes Hibernate on startup; this plug-in needs to "see" the classes from the datastructure plug-in. To do this, the main plug-in sets its buddy policy to "registered" and the datastructure plug-in registers itself as a "buddy". Unfortunately you have to do all of this directly in the manifest file; at least in 3.3 there was no way to do it in the editor.
Once this buddy policy is in place, Hibernate will work as well.
I looked up my old application and here is how I did it.
The main-application (toolseye.rcp) is dependent on the hibernate plugin (de.eye4eye.hibernate) and the datastructure-plugin (toolseye.datastructures)
The hibernate-plugin specifies its buddy-policy as "registered"
The datastructure-plugin registers itself as a buddy of the hibernate-plugin
Here are the important lines:
Hibernate-plugin de.eye4eye.hibernate
Eclipse-BuddyPolicy: registered
Datastructure-plugin toolseye.datastructures
Eclipse-RegisterBuddy: de.eye4eye.hibernate
Put those lines directly in the MANIFEST.MF.
Both plug-ins need to re-export their packages so that the main application, or whatever layer you have in between, can use them.
Hope that helped.
Just to make this complete.
Instead of using Hibernate, EclipseLink can be used as the JPA provider in an Eclipse RCP application. EclipseLink is the former TopLink from Oracle and was chosen as the reference implementation for JPA 2.
The point for an RCP application is that EclipseLink is available as OSGi bundles (org.eclipse.persistence.jpa), and because of that it can load classes from another plug-in without an additional buddy policy.
Currently I am playing around with the following project structure (Model-View-Presenter pattern). The names in brackets specify the dependency plug-ins (not all are included, only the ones related to this question):
rcp.mvp.view (rcp.mvp.presenter / rcp.mvp.model)
rcp.mvp.presenter (rcp.mvp.data - data reexports the model, so this is not needed here) *
rcp.mvp.data (rcp.mvp.data.mysql / rcp.mvp.model / javax.persistence / org.eclipse.persistence.jpa)
rcp.mvp.data.mysql - provides only the mysql-jdbc-driver, has to be inside the classpath
rcp.mvp.model
In this scenario, the JPA provider in the data-plugin is able to load the classes from the model-plugin without a buddy-policy.
*Note: the presenter does not depend on any JPA packages, since this is encapsulated by the DAOs (still the main reason to use them).
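For completeness, a minimal sketch of how the data plug-in might bootstrap the factory with EclipseLink (the persistence-unit name is invented; instantiating org.eclipse.persistence.jpa.PersistenceProvider directly is a common workaround for OSGi classloading, not something mandated by the setup above):

import java.util.HashMap;
import java.util.Map;
import javax.persistence.EntityManagerFactory;
import org.eclipse.persistence.jpa.PersistenceProvider;

public class PersistenceBootstrap {
    public static EntityManagerFactory createFactory() {
        Map<String, Object> props = new HashMap<>();
        // Avoid the generic javax.persistence.Persistence bootstrap, which may not
        // see the provider across OSGi classloaders; call EclipseLink directly.
        return new PersistenceProvider().createEntityManagerFactory("rcp.mvp.data", props);
    }
}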
Links
User Guide
RCP example (unfortunately not using DAOs)
EclipseLink conceptual Webinar from live.eclipse.org
