I want to let my client create his own fields and beans in the CMS dynamically.
Also, once he creates a form, I need to create a Hibernate entity that can be saved to the database.
Is there a way to do it?
I am using JSF2 and Hibernate 3
Do you mean with recompiling, or without?
Creating tables and entities dynamically is IMO not a good idea. Hibernate is not really made for that, and generating the entities is only a small part of the problem: you would have to add them to the configuration, rebuild the session factory, update the model, and so on. And what about subsequent restarts of the application? Not recommended; just forget this approach...
Another option would be to use an Entity-Attribute-Value (EAV) model. This is something many CMSes do. I've never implemented this with Hibernate, but it's doable (and has already been done). Here are some resources (a rough mapping sketch follows the links):
Adding new persistable classes at runtime
[hibernate-dev] dynamic entities (has some sources attached)
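To make the EAV idea concrete, a minimal mapping sketch with JPA annotations could look roughly like the following. All entity, field, and column names are made up for illustration; a real CMS would also need typed values, indexing, and so on.

import java.util.ArrayList;
import java.util.List;
import javax.persistence.*;

// ContentNode.java -- one row per dynamically defined "thing"
@Entity
public class ContentNode {
    @Id @GeneratedValue
    private Long id;

    private String typeName; // e.g. "article", "product"

    @OneToMany(mappedBy = "node", cascade = CascadeType.ALL)
    private List<ContentAttribute> attributes = new ArrayList<ContentAttribute>();

    // getters/setters omitted
}

// ContentAttribute.java -- one row per (node, field) pair
@Entity
public class ContentAttribute {
    @Id @GeneratedValue
    private Long id;

    @ManyToOne(optional = false)
    private ContentNode node;

    private String name; // the dynamic field name

    @Column(name = "attr_value")
    private String value; // the dynamic field value, stored as text

    // getters/setters omitted
}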
But to be honest, I wouldn't implement my own CMS but rather reuse an existing one. Hippo seems to be a candidate.
See also
The CCK buzz (Content Creation Kit) and the EAV problem
Related questions
Entity Attribute Value Database vs. strict Relational Model Ecommerce question
Approach to generic database design
How do you build extensible data model
I am looking for something similar to the drupal CCK, but in Java (in a Java CMS)?
Java Frameworks that support Entity-Attribute-Value Models
Implementing EAV pattern with Hibernate for User -> Settings relationship
The easiest way would be to use a List<String> for the field names and a Map<String, Object> for the field values. Maps can be accessed in EL using dynamic keys, like so:
<ui:repeat value="#{bean.fieldnames}" var="fieldname">
    <h:inputText value="#{bean.fieldvalues[fieldname]}" /><br />
</ui:repeat>
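A minimal backing bean to go with that snippet could look like this; the two field names are just placeholders, and in practice they would come from whatever the client defined in the CMS:

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import javax.faces.bean.ManagedBean;
import javax.faces.bean.ViewScoped;

@ManagedBean
@ViewScoped
public class Bean {
    private List<String> fieldnames = new ArrayList<String>();
    private Map<String, Object> fieldvalues = new HashMap<String, Object>();

    public Bean() {
        // hypothetical dynamic fields; load these from the CMS configuration instead
        fieldnames.add("title");
        fieldnames.add("author");
    }

    public List<String> getFieldnames() { return fieldnames; }
    public Map<String, Object> getFieldvalues() { return fieldvalues; }
}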
A completely different alternative is to autogenerate classes using tools like ASM or Javassist and to create database tables on the fly. But that's a lot more work.
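For what it's worth, generating a class at runtime with Javassist looks roughly like this (the class and property names are invented; registering the result with Hibernate and creating the matching table is the much larger part of the work):

import javassist.ClassPool;
import javassist.CtClass;
import javassist.CtField;
import javassist.CtNewMethod;
import javassist.Modifier;

public class DynamicClassDemo {
    public static void main(String[] args) throws Exception {
        ClassPool pool = ClassPool.getDefault();

        // Generate a new class with a single String property "title".
        CtClass cc = pool.makeClass("com.example.DynamicEntity");
        CtField title = new CtField(pool.get("java.lang.String"), "title", cc);
        title.setModifiers(Modifier.PRIVATE);
        cc.addField(title);
        cc.addMethod(CtNewMethod.getter("getTitle", title));
        cc.addMethod(CtNewMethod.setter("setTitle", title));

        Class<?> generated = cc.toClass();
        System.out.println(generated.getName());
    }
}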
Related
We have a web application which uses JSP, servlets and Hibernate. The design pattern we have is MVC-like, which means:
JSP -> Servlet -> Beans (POJO)
Now the question. One of our developers has put the Hibernate queries, Hibernate session creation etc. inside the POJOs. Inside these POJOs there are now methods like getAllEmployees(), getAllAgents() etc. This makes it extremely hard for us to change the database tables (we manage the database manually, using MySQL Workbench) and to use automatic tools to re-generate the POJOs, because these methods would be lost.
Now there are two arguments. One is that maintaining the Hibernate queries and sessions inside the POJOs is good practice, because it seems like pure MVC. The other is to move the Hibernate code into the servlets and use the POJOs just like beans, only to set and get values.
We have not worked with Hibernate before. Of the two options above, which is the preferred place to write the Hibernate code?
Finally, please note we are not interested in Spring or other frameworks; we use pure JSP and servlets with Hibernate.
Probably what you need is another layer of abstraction. Since your POJOs are recreated after migrations, you shouldn't put code in them (I don't agree with that approach, but that's just my opinion :-) )
JSP -> Servlet -> NewLayer -> POJO
I don't know where you put your business rules, but in this scenario they would live in the "NewLayer", which would be a hybrid of a service and a DAO layer.
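A rough sketch of that idea, with invented names: the POJO stays a plain data holder that can be regenerated, and the Hibernate code that used to live inside it moves into the new layer, which the servlet then calls.

import java.util.List;
import org.hibernate.Session;
import org.hibernate.SessionFactory;

// Employee.java -- plain POJO, safe to regenerate from the tables
public class Employee {
    private Long id;
    private String name;
    // getters/setters omitted
}

// EmployeeService.java -- hypothetical "NewLayer" class holding the Hibernate code
public class EmployeeService {
    private final SessionFactory sessionFactory;

    public EmployeeService(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    @SuppressWarnings("unchecked")
    public List<Employee> getAllEmployees() {
        Session session = sessionFactory.openSession();
        try {
            return session.createQuery("from Employee").list();
        } finally {
            session.close();
        }
    }
}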
I would recommend these readings to rethink your current architecture:
https://softwareengineering.stackexchange.com/questions/220909/service-layer-vs-dao-why-both
Responsibilities and use of Service and DAO Layers
I am developing an application in Flex, using BlazeDS to communicate with a Java back-end, which provides persistence via JPA (EclipseLink).
I am encountering issues when passing JPA entities to Flex via BlazeDS. BlazeDS uses reflection to convert the JPA entity into an ObjectProxy (effectively a HashMap) by calling all getter methods on the entity. This includes any lazily initialised one-to-many/many-to-many relationships.
You can probably see where I am going. If I pass back a single entity, BlazeDS will call all of its one-to-many/many-to-many getters; for each returned object that itself has such relationships, those getters will be called too. As a result, by passing back a single JPA entity I actually end up making multiple database calls and sending the whole graph of related entities back inside a single ObjectProxy instance!
My solution to date is to create a translator to convert each entity to an ObjectProxy and vice-versa. This is clearly cumbersome and there must be a better way.
Thoughts please?
As an alternative, you could consider using GraniteDS instead of BlazeDS: GraniteDS has a much more powerful data management stack than BlazeDS (it competes more with LCDS) and fully supports lazy loading for all major JPA engines: Hibernate, EclipseLink, OpenJPA, etc.
Moreover, GraniteDS has a great client-side transparent lazy loading feature and even a so-called reverse lazy-loading mechanism.
And you don't need any kind of intermediate DTOs: it serializes JPA entities as is and uses code-generated ActionScript beans on the client-side to keep their initialization states.
Unfortunately, lazy-loading is not easy to accomplish with Flash clients. There are some working solutions, like dpHibernate, but so far all the different solutions I have tested fall short of what you would expect in terms of performance and ease of use.
So in my experience the best and most reliable solution is to always use DTOs, which has the added benefit of cleanly separating the database and view layers. This does require, though, that you implement either eager loading or a second server round trip to resolve your many-to-many relations, as well as a good deal of boilerplate code to copy field values between the entities and DTOs.
Which one to choose depends on your use case: sometimes getting only the main object's fields is enough; then you can simply omit the list of related objects from your DTO (transfer only the values you need). Sometimes you actually need the entire list of related entities; then you can get it via eager loading, or by setting up a second remote object that fetches only the list.
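To illustrate the boilerplate involved, the translation step typically ends up looking something like this (Order, OrderLine and OrderDto are hypothetical names):

import java.util.List;

// JPA entity with a lazy one-to-many (mapping annotations omitted here)
class Order {
    private Long id;
    private String customer;
    private List<OrderLine> lines; // lazy relation we do NOT want serialized

    public Long getId() { return id; }
    public String getCustomer() { return customer; }
    public List<OrderLine> getLines() { return lines; }
}

class OrderLine { }

// Flat DTO exposing only what the Flex client needs
class OrderDto {
    private Long id;
    private String customer;

    public static OrderDto fromEntity(Order order) {
        OrderDto dto = new OrderDto();
        dto.id = order.getId();             // plain fields are copied
        dto.customer = order.getCustomer();
        return dto;                         // the lazy 'lines' relation is never touched
    }

    public Long getId() { return id; }
    public String getCustomer() { return customer; }
}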
EclipseLink also provides a copyObject() API that allows you to pass a copy group specifying exactly which attributes you want. You could then use this copy to avoid including the relationships that you do not want.
If you have a detached object, you could just null out the fields that you do not want as well, or use a DTO.
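Roughly along these lines, assuming a reasonably recent EclipseLink; the CopyGroup/copy names are from memory and the attribute names are hypothetical, so treat this strictly as a sketch:

import javax.persistence.EntityManager;
import org.eclipse.persistence.jpa.JpaEntityManager;
import org.eclipse.persistence.sessions.CopyGroup;

public class EntityCopier {

    // Returns a copy of the entity containing only the listed attributes,
    // so the unwanted relationships never reach BlazeDS.
    public Object copyForClient(EntityManager em, Object entity) {
        CopyGroup group = new CopyGroup();
        group.addAttribute("id");
        group.addAttribute("firstName");
        group.addAttribute("lastName"); // relationships not listed are left out
        JpaEntityManager jpaEm = em.unwrap(JpaEntityManager.class);
        return jpaEm.copy(entity, group);
    }
}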
I am developing a new web application with Struts2, Spring and Hibernate as its core building blocks.
We have created POJO classes corresponding to the Hibernate mapping files. There will be some input from users which needs to be written to the underlying database,
e.g. registration or updates.
We have a few options. We could create new POJOs/DTOs for the action classes, which Struts2 fills and which we then pass to the service layer, where we convert the DTOs to the corresponding Hibernate POJOs. Or we could expose the same Hibernate POJOs to Struts2 so that the framework fills them with the user input, and we avoid the conversion work and the extra set of classes.
The application will not be big in size and would fall into the medium-sized application category.
My question is: what is the best way to transfer this user input to the underlying Hibernate layer to perform the database-specific work?
Thanks in advance
I'd prefer the "DTO" approach in this case, since you can then validate the input first and trigger updates only when wanted.
However, you could use detached entities as DTOs and reattach them when you want to create or update them. If you don't want the web part of your application to depend on Hibernate and/or JPA, you might need to create another set of classes (unless your entities don't use a single annotation).
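For example, a service method could reattach the detached entity that the web layer filled from the form roughly like this; Registration is a made-up name for an already mapped entity:

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

public class RegistrationService {
    private final SessionFactory sessionFactory;

    public RegistrationService(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    // 'registration' is the detached entity the web framework populated from the form.
    public Registration save(Registration registration) {
        Session session = sessionFactory.openSession();
        Transaction tx = session.beginTransaction();
        try {
            // merge() reattaches: it inserts new instances, updates existing ones,
            // and returns the managed copy.
            Registration managed = (Registration) session.merge(registration);
            tx.commit();
            return managed;
        } catch (RuntimeException e) {
            tx.rollback();
            throw e;
        } finally {
            session.close();
        }
    }
}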
You'll get both answers on this.
With Struts 2 I tend to use normal S2 action properties to gather the form values etc. and use BeanUtils to copy them to the Hibernate objects. The problem with exposing the Hibernate objects to the form, as with ModelDriven etc., is that you need to define whitelists/blacklists if you have columns that should not be set directly by the user. (Or handle the problem in a different way.)
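For illustration, the copy step with Commons BeanUtils might look like this; the action, entity and property names are invented, and note that copyProperties takes the destination first:

import org.apache.commons.beanutils.BeanUtils;

// Hypothetical Hibernate entity.
class User {
    private String username;
    private String email;

    public String getUsername() { return username; }
    public void setUsername(String username) { this.username = username; }
    public String getEmail() { return email; }
    public void setEmail(String email) { this.email = email; }
}

// Hypothetical Struts2 action: plain action properties gather the form input.
public class RegisterAction {
    private String username;
    private String email;

    public String execute() throws Exception {
        User user = new User();
        BeanUtils.copyProperties(user, this); // destination first, then source
        // hand 'user' to the service/DAO layer for persistence here
        return "success";
    }

    public String getUsername() { return username; }
    public void setUsername(String username) { this.username = username; }
    public String getEmail() { return email; }
    public void setEmail(String email) { this.email = email; }
}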
That said, I'm not fundamentally opposed to the idea like a lot of people are, and they're arguably correct.
In our project we have the constraint of not having the luxury to alter the table structure already in place. The tables are highly denormalized.
We have come up with good POJOs for the application, and we have entity beans generated from the existing tables. Now we have to map the POJOs to the entities so that we can persist.
Ultimately, we are combining good POJOs with bad tables. Any thoughts on options/alternatives/suggestions for this approach?
Hibernate/JPA 2 has a rich set of functionality to manipulate the mapping (so that your objects can differ from the tables), so many (not all) old tables can be mapped to normal objects. Maybe you should have a look at this first, and use your POJO/table "solution" only if this mapping is not powerful enough.
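A small illustration of that flexibility, with all table, column, and class names invented: one clean object fed from awkwardly named legacy columns, with a couple of flat columns grouped into an embedded value object.

import javax.persistence.*;

// One clean object mapped onto a legacy, denormalized table.
@Entity
@Table(name = "TBL_CUST_MASTER")
public class Customer {

    @Id
    @Column(name = "CUST_NO")
    private Long id;

    @Column(name = "CUST_NM_TXT")
    private String name;

    // Several flat address columns grouped into one value object.
    @Embedded
    @AttributeOverrides({
        @AttributeOverride(name = "street", column = @Column(name = "ADDR_LN_1")),
        @AttributeOverride(name = "city",   column = @Column(name = "ADDR_CITY_CD"))
    })
    private Address address;

    // getters/setters omitted
}

@Embeddable
class Address {
    private String street;
    private String city;
}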
If you have a read-only application, you can consider using database views to make your tables/views look more like your objects. This may reduce the amount of strange mapping.
I don't know your mapping, the size of the application, or the use case, but have you considered not using Hibernate? I ask because I can imagine (as I said, I don't know your application) that in an architecture like this no Hibernate feature is really used, so Hibernate would only add unneeded complexity.
If you are using Hibernate you should be able to map your POJOs to the table structure using only XML mapping files, without creating new Java beans. This would allow you to easily change the mapping if you suddenly become able to change the table structures, and it saves you the intermediary beans. That's the best you can do.
I'm hesitating between two designs of a database project using Hibernate.
Design #1.
(1) Create a general data provider interface, including a set of DAO interfaces and general data container classes. It hides the underlying implementation. A data provider implementation could access data in a database, an XML file, a service, or something else. The user of a data provider does not need to know about it.
(2) Create a database library with Hibernate. This library implements the data provider interface in (1).
The bad thing about Design #1 is that in order to hide the implementation details, I need to create two sets of data container classes. One set lives in the general data provider interface - let's call them DPI-Objects; the other set is used in the database library, exclusively for entity/attribute mapping in Hibernate - let's call them H-Objects. In the DAO implementation, I need to read data from the database to create H-Objects (via Hibernate) and then convert the H-Objects into DPI-Objects.
Design #2.
Do not create a general data provider interface. Expose H-Objects directly to components that use the database lib. So the user of the database library needs to be aware of Hibernate.
I like design #1 more, but I don't want to create two sets of data container classes. Is that the right way to hide H-Objects and other Hibernate implementation details from the user who uses the database-based data provider?
Are there any drawbacks to Design #2? I will not implement another data provider in the near future, so should I just forget about the data provider interface and use Design #2?
What do you think about this? Thanks for your time!
Hibernate domain objects are simple POJOs, so you won't have to create separate DPI-Objects; the H-Objects themselves can be used directly. In the DAO you can control whether they come from Hibernate or anything else.
I highly recommend reading Chapter 4 "Hitting the database" of Spring in Action, 3rd edition, even if you aren't using Spring in your application. Although my second recommendation would be to use Spring :-)
The DAO pattern is a great way to keep database and ORM logic isolated in the DAO implementation, and you only need one set of entity objects. You can make that happen without Spring, it just takes more work managing your sessions and transactions.
If I understand your post, this is sort of a middle-ground between Design 1 and Design 2. The H-Objects (the entities that Hibernates loads and persists) don't need any Hibernate specific code in them at all. That makes them perfectly acceptable to be used as your DPI-Objects.
I've had arguments with folks in the past who complain that the use of JPA or Hibernate Annotations exposes Hibernate specifics through the DAO interface. I personally take a more pragmatic view, since annotations are just metadata, and don't directly affect the operation of your entity classes.
If you do feel that the annotations expose too much, then you can go old school and use Hibernate Mappings instead. Then your H-Objects are 100% Hibernate free :-)
I recommend design #2. Simply construct domain objects, and let hibernate look after them. Don't write separate classes that are persisted.
Hibernate tries to hide most of the persistence business from you. You may need to add a few small annotations to your entities to help it along. But certainly don't make separate classes.
You may need some very small DAO classes. For example, if you have a Person entity, it would be fairly common practice to have a PersonDAO object that saves a person. Having said that, the code inside the DAO will be very simple, so for a really small project, it may not be worth it. For a large project, it's probably worth keeping your persistence code separate from your business logic, in case you want to use a different persistence technology later.
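A sketch of such a minimal DAO, assuming Person is an already mapped entity, a current-session context is configured, and transactions are handled by the caller or a surrounding framework:

import org.hibernate.SessionFactory;

public class PersonDAO {
    private final SessionFactory sessionFactory;

    public PersonDAO(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    // Assumes a transaction is already open on the current session.
    public void save(Person person) {
        sessionFactory.getCurrentSession().saveOrUpdate(person);
    }

    public Person findById(Long id) {
        return (Person) sessionFactory.getCurrentSession().get(Person.class, id);
    }
}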