I'm working on a small multi-tier application that uses JPA/EclipseLink as its persistence layer. In my current design I have two sets of objects: POJOs and entity objects. I use the POJOs for general programming tasks, and the entity classes are used for DB reads/writes and table mapping.
Now, is it necessary to have a POJO=>Entity mapping (programmatically) and then a second Entity=>DB tables mapping (JPA annotations)? I find it easier to just use the entity classes as my main Java objects and persist them whenever necessary; after all, entity classes are essentially POJOs with a couple of JPA annotations.
Also, in a situation where it's indeed necessary to keep things separated, what is the best place to do the POJO=>Entity mapping? Currently I do this in a CRUD method, e.g.
public void addCustomerPOJO(Customer customerPOJO){
    //Create an EntityManager and start a transaction
    //Create the entity class and populate it with values
    //from the passed-in regular (non-entity) Customer class
    //Persist and close
}
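For illustration, here is a minimal sketch of what that pseudocode could look like when fleshed out; the CustomerEntity class, its name/email fields, and the injected EntityManagerFactory are assumptions, not part of the original question:

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;

public class CustomerDao {

    private final EntityManagerFactory emf;

    public CustomerDao(EntityManagerFactory emf) {
        this.emf = emf;
    }

    public void addCustomerPOJO(Customer customerPOJO) {
        EntityManager em = emf.createEntityManager();
        try {
            em.getTransaction().begin();
            // Copy values from the plain Customer into the (hypothetical) JPA entity
            CustomerEntity entity = new CustomerEntity();
            entity.setName(customerPOJO.getName());
            entity.setEmail(customerPOJO.getEmail());
            em.persist(entity);
            em.getTransaction().commit();
        } finally {
            em.close();
        }
    }
}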
Is there a better or more common way to do this?
There is nothing wrong with using your entities as your domain objects. You have to be aware of using entities that have been detached and whatnot, but that can be managed.
I would not artificially create work for yourself by forcing each entity to be mapped to another bean (or POJO). Sometimes it is necessary to wrap many entities (or values from entities) into a bean, but only do it if there is a good reason.
Maybe the confusion is due to the fact that the entity is just a POJO with the mapping info (in the code as annotations, or in a separate configuration file). It works as a POJO as long as you want: you can create and modify objects, and as long as you don't save them with a Session they won't be written to the DB.
Sometimes you might need to have the data in a bean that is not an entity (mainly because that bean is managed by another framework and you do not want to mix things *1); then you only have to copy that data from your bean to your Entity/POJO (via a specific constructor, by calling lots of set...() methods, whatever).
*1 I am thinking of JSF here.
I see no reason for two parallel object hierarchies like this. I'd have entities and ditch what you're calling POJOs. No need for mapping. It's a waste of CPU cycles for no benefit that I can see.
I am currently working on a three-tiered Java EE app with JPA serving the back end. I use a single Java class to represent each table in the database (entity classes), and I use the same classes to do all the operations, both in the business layer and in the database layer. And it makes sense too.
Because in all three layers, you can create an instance of the same entity class independently.
PS - @Hay: Even when I started learning JPA, I was doing manipulations with two different sets of the same classes, as you do :) I guess this practice emerged because of EJB 2.1, which didn't have any annotations, so two different sets of classes were required, where one had to be entirely dedicated as ENTITY CLASSES for DAO operations.
As JPA evolved, annotations were brought into the picture, which made our lives easier. OLD HABITS DIE HARD indeed ;)
Annotations do have their downside, especially in multi-tiered Java EE applications.
In the example below, you have a simple POJO (domain object) which you want:
the Java REST clients to use,
the REST server to accept as a parameter, and
to persist to a database.
I would think this is a common use-case.
With so many annotations, the clients using this object need all the jar dependencies. I suppose the annotations could be moved to an XML file, but then the advantages of annotations are lost.
Are there any other creative solutions?
@Data
@Entity
@XmlRootElement(name = "sport")
@Table(name = "db_sport")
@NamedQueries({
    @NamedQuery(name = "Sport.findAll", query = "SELECT d FROM Sport d")})
public class Sport implements Serializable {

    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    @Basic(optional = false)
    @Column(name = "sportId")
    int sportId;
}
You may need to use another set of classes to prevent a ripple effect. This is often the case with web services that have several dependencies. Data mapping in general adds to the complexity of a program and should be avoided without a valid reason.
My $0.20
Unless you can remember how relationships are marked in your code, when they are populated by Hibernate, and when/where they are accessed in the code, I would suggest you go with the DTO approach.
However, if you are learning Hibernate or going to use it in a small project, it may be easy for you to return the entity (or a collection of them) from your controller layer.
But I'm sure the more you do it, the more you will find the need to move to DTOs or even JsonView. If you are not the one building the UI, you will realize it even sooner.
Speaking of DTOs, my favourite is ModelMapper. You can do the conversion (simple or complex, whatever you like) at the controller layer. This way you know exactly what you are returning inside the DTO.
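As a rough sketch of that kind of controller-level conversion (the UserDto type and its single field are assumptions; User stands for whatever entity the service layer returns):

import org.modelmapper.ModelMapper;

public class UserController {

    private final ModelMapper modelMapper = new ModelMapper();

    // Hypothetical DTO exposing only what the client should see
    public static class UserDto {
        private String email;
        public String getEmail() { return email; }
        public void setEmail(String email) { this.email = email; }
    }

    // Convert the entity at the controller boundary, so only the DTO leaves this layer
    public UserDto toDto(User entity) {
        return modelMapper.map(entity, UserDto.class);
    }
}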
See the JPA Best Practices slides by Lee Chuk Munn. You can find them in the JPA Best Practices - Indo Java Podcast.
Related: REST API - DTOs or not?
I would like to re-ask this question in a microservices context. Here is the quote from the original question.
I am currently creating a REST-API for a project and have been reading
article upon article about best practices. Many seem to be against
DTOs and simply just expose the domain model, while others seem to
think DTOs (or User Models or whatever you want to call it) are bad
practice. Personally, I thought that this article made a lot of sense.
However, I also understand the drawbacks of DTOs with all the extra
mapping code, domain models that might be 100% identical to their
DTO-counterpart and so on.
Now, my question:
I am more aligned towards using one object through all the layers of my application (in other words, just exposing the domain object rather than creating a DTO and manually copying over each field). The differences between my REST contract and the domain object can be addressed using Jackson annotations like @JsonIgnore, @JsonProperty(access = Access.WRITE_ONLY), or @JsonView. And if there are one or two fields that need a transformation which cannot be done with a Jackson annotation, I will write custom logic to handle just that (trust me, I haven't come across this scenario even once in my 5+ year journey with REST services).
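As an illustration of that approach, here is a minimal sketch of a domain object whose REST representation is shaped purely by Jackson annotations; the Account class and its fields are made up for the example:

import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonProperty.Access;

// Hypothetical domain object exposed directly over REST
public class Account {

    private long id;
    private String email;

    @JsonProperty(access = Access.WRITE_ONLY) // accepted on input, never serialized back
    private String password;

    @JsonIgnore // internal field, hidden from the REST contract entirely
    private String internalAuditNote;

    // getters and setters omitted for brevity
}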
I would like to know if I am missing any real downsides of not copying the domain objects into DTOs.
I would vote for using DTOs and here is why:
Different requests (events) and your DB entities. It often happens that your requests/responses are different from what you have in the domain model. This makes especially good sense in a microservice architecture, where you have a lot of events coming from other microservices. For instance, you have an Order entity, but the event you get from another microservice is OrderItemAdded. Even if half of the events (or requests) are the same as the entities, it still makes sense to have DTOs for all of them in order to avoid a mess.
Coupling between the DB schema and the API you expose. When using entities you basically expose how you model your DB in a particular microservice. In MySQL you would probably want your entities to have relations, so they will be pretty massive in terms of composition. In other types of DBs, you would have flat entities without lots of inner objects. This means that if you use entities to expose your API and want to change your DB from, let's say, MySQL to Cassandra, you'll need to change your API as well, which is obviously a bad thing to have.
Consumer-driven contracts. This is probably related to the previous bullet, but DTOs make it easier to make sure that communication between microservices is not broken as they evolve. Because contracts and the DB are not coupled, this is just easier to test.
Aggregation. Sometimes you need to return more than you have in one single DB entity. In this case, your DTO will be just an aggregator.
Performance. Microservices imply a lot of data transferring over the network, which may cause performance issues. If the clients of your microservice need less data than you store in the DB, you should provide them less data. Again, just make a DTO and your network load will be decreased.
Forget about LazyInitializationException. DTOs don't have any lazy loading or proxying, as opposed to domain entities managed by your ORM.
The DTO layer is not that hard to support with the right tools. Usually the problem with mapping entities to DTOs and back is that you need to set the right fields manually each time you want to make a conversion. It's easy to forget to update the mapping when adding new fields to the entity and the DTO, but fortunately there are a lot of tools that can do this task for you. For instance, we used to have MapStruct on our project; it can generate the conversion for you automatically, at compile time.
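For reference, a MapStruct mapper is typically just an annotated interface; a minimal sketch, assuming hypothetical Order and OrderDto classes:

import org.mapstruct.Mapper;
import org.mapstruct.factory.Mappers;

// MapStruct generates the implementation of this interface at compile time
@Mapper
public interface OrderMapper {

    OrderMapper INSTANCE = Mappers.getMapper(OrderMapper.class);

    // Fields with matching names are mapped automatically
    OrderDto toDto(Order order);

    Order toEntity(OrderDto dto);
}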
The Pros of Just exposing Domain Objects
The less code you write, the fewer bugs you produce.
Despite having extensive (arguably) test cases in our code base, I have come across bugs due to missed or wrong copying of fields from the domain object to the DTO or vice versa.
Maintainability - less boilerplate code.
If I have to add a new attribute, I don't have to add it in the domain object, the DTO, the mapper, and of course the test cases. And don't tell me this can be achieved using reflection-based beanCopy utils; that defeats the whole purpose.
Lombok, Groovy, Kotlin, I know, but they will only save me the getter/setter headache.
DRY
Performance
I know this falls under the category of "premature optimization is the root of all evil", but still, this will save some CPU cycles by not having to create (and later garbage collect) at least one more object per request.
Cons
DTOs will give you more flexibility in the long run
If only I ever needed that flexibility. At least, whatever I have come across so far are CRUD operations over HTTP, which I can manage using a couple of @JsonIgnores. Or if there are one or two fields that need a transformation which cannot be done with a Jackson annotation, as I said earlier, I can write custom logic to handle just that.
Domain Objects getting bloated with Annotations.
This is a valid concern. If I use JPA or MyBatis as my persistence framework, the domain object might have those annotations, and then there will be Jackson annotations too. In my case this is not really applicable though; I am using Spring Boot and I can get away with application-wide properties like mybatis.configuration.map-underscore-to-camel-case: true and spring.jackson.property-naming-strategy: SNAKE_CASE.
Short story: at least in my case, the cons don't outweigh the pros, so it doesn't make any sense to repeat myself by having a new POJO as a DTO. Less code, fewer chances of bugs. So I am going ahead with exposing the domain object and not having a separate "view" object.
Disclaimer: This may or may not be applicable in your use case. This observation is based on my use case (basically a CRUD API with 15-ish endpoints).
The decision is a much simpler one in case you use CQRS because:
for the write side, you use commands that are already DTOs; aggregates - the rich behavior objects in your domain layer - are not exposed/queried, so there is no problem there.
for the read side, because you use a thin layer, the objects fetched from persistence should already be DTOs. There should be no mapping problem because you can have a read model for every use case. In the worst case you can use something like GraphQL to select only the fields you need.
If you do not split the read from write then the decision is harder because there are tradeoffs in both solutions.
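As a minimal sketch of that split (the AddOrderItem command and OrderSummary read model below are assumptions for illustration):

// Write side: the command is already a DTO; only the aggregate ever sees it
public final class AddOrderItem {
    public final String orderId;
    public final String productId;
    public final int quantity;

    public AddOrderItem(String orderId, String productId, int quantity) {
        this.orderId = orderId;
        this.productId = productId;
        this.quantity = quantity;
    }
}

// Read side: a flat read model shaped for one use case, fetched straight from persistence
public final class OrderSummary {
    public final String orderId;
    public final int itemCount;
    public final long totalCents;

    public OrderSummary(String orderId, int itemCount, long totalCents) {
        this.orderId = orderId;
        this.itemCount = itemCount;
        this.totalCents = totalCents;
    }
}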
I am developing a Java web application and am trying to follow some patterns like DAO/DTO. At the moment I am thinking about the following base architecture layers:
I ran into some questions regarding the layers. The scheme would go as such: the DAO takes in a DTO and returns objects (entities) from the database, and the service layer also takes in a DTO, uses the DAO, and does all the required logic with the returned objects. The UI bean, service, DAO, and DTO classes are entity specific - each entity has its own layers.
Now, would I need the UI bean to use in views, or would that be overkill and UI views can directly use the service classes as UI beans? If not, why would I need the UI bean?
Another question is regarding the DTO. I have created entities with all the required properties, and as I understand it, DTO classes are like reflections of the entity classes. So why would I need these DTO classes? And if I use them, I reckon it would require some converting from entity to DTO and vice versa. Do I do the converting in the service layer? Would views (e.g. HTML pages) also display DTO object properties rather than actual entities (as in calling #{UIBean.entityProperty})?
First of all, I would use the DTO beans on the front-end part only, but since you already mention UI beans, I suppose these will do the trick just fine; the facade uses these to pass them to the controller for displaying your web components.
In between the service and the facade, you map the back-end entities to DTO beans.
This way your front end will be completely loosely coupled from your back end.
Regarding your second question, I would like to point out a valid reason why your UI should always use DTO or view beans.
You can combine several backend entity-beans into one dto bean for easier processing on the front-end.
In general I always keep DTOs in mind for public access, e.g. a web service exposing them, a web front end, a Swing app, or...
Entity classes are only used in the DAO and service layers, never further up.
As a rule of thumb, try to divide logical layers according to your context. Take inspiration from the theory, but use it with care. Here is my humble understanding of what each layer is for, with a few examples. This vision is of course not complete, but I hope it will help you answer your questions.
So is it overkill to use a UIBean instead of a service DTO? I would say it depends on your context.
Maybe there is user input data inside your UI beans? Then you have to validate it, with JSR 303 annotations for example. While those annotations have a meaning in this layer, they are useless for the layers underneath. That's why you would have a UIBean with JSR 303 annotations and a DTOBean without JSR 303 annotations.
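A minimal sketch of that separation, assuming a hypothetical CustomerUIBean and CustomerDTO:

import javax.validation.constraints.NotNull;
import javax.validation.constraints.Size;

// UI-layer bean: carries the JSR 303 constraints used to validate user input
public class CustomerUIBean {

    @NotNull
    @Size(min = 1, max = 50)
    private String name;

    // getters and setters omitted for brevity
}

// DTO passed to the layers underneath: same data, no validation annotations
public class CustomerDTO {

    private String name;

    // getters and setters omitted for brevity
}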
But if they are exactly the same, why duplicate? Maybe at the UIBean layer a date is represented as a String, and you want to manipulate a Date instead of a String at the DTO layer. That's why you need to adapt your data between layers, so that you work with objects that make sense to a particular layer. For example, you could add a BOAdapter (between UIView and Service) and a DTOAdapter (between Service and DAO). Those adapters are useful for transforming your data into each POJO's format. For example, you could have in your BO (=UIBean) a date expressed as three strings, and you want a Date object in the DTO, so you transform it inside the BOAdapter:
public class BOAdapter {

    private BOAdapter() {}

    public static DTO toDTO(BO objectBO) throws ParseException {
        DTO objectDTO = new DTO();
        // Rebuild a Date from the three string fields of the BO
        SimpleDateFormat df = new SimpleDateFormat("yyyy-MM-dd");
        objectDTO.setDate(df.parse(objectBO.getYear() + "-" + objectBO.getMonth() + "-" + objectBO.getDay()));
        [...]
        return objectDTO;
    }
}
Why do I need a DTOAdapter? Maybe you have a database that contains at least two tables, Customers and Addresses, with an integrity constraint between them. JPA will automatically generate the right code. But do you really need all this code up in the UIView? I mean, if the functionality you are coding needs only the name, surname, and date of birth of your customer, their address is useless. Again, that's why you need to adapt your data between layers to work with objects that make sense to a particular layer. In this case you could create a DTO object with only the name, surname, and date of birth information, and create a method inside your DTOAdapter to transform your custom DTO into the heavy JPA object so it works properly with the database.
But what if I need the whole entity for coding my functionality? Maybe you need to add validation constraints inside this layer beyond JSR 303. So it could be interesting to have DTO classes alongside your entity, for the same reason as the BO objects.
But my entity is big; how do I duplicate it easily? Try to use a tool that maps data automatically (like Dozer). If it is not too big, do it manually.
Since you have the Spring tag:
I would replace your [DAO] with a Spring Data repository. So most of the time you just write an interface method or an @Query annotation, and Spring Data writes the implementation.
Replace the DTO with a JPA entity, so I could use some reverse engineering.
[UI Bean] will mostly be a composite of JPA entities, with some validation.
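A minimal sketch of such a repository interface; the Customer entity and the example query are assumptions:

import java.util.List;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.Query;
import org.springframework.data.repository.query.Param;

// Spring Data generates the implementation; you only declare the contract
public interface CustomerRepository extends JpaRepository<Customer, Long> {

    // Derived query: implemented from the method name alone
    List<Customer> findByLastName(String lastName);

    // Explicit JPQL for when the method name is not expressive enough
    @Query("select c from Customer c where c.email like concat('%', :domain)")
    List<Customer> findByEmailDomain(@Param("domain") String domain);
}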
I'm trying to figure out the best way to use JPA in the context of a RESTful web service. The input comes in as JSON and I can use Jackson/JAX-RS to convert that to a POJO. This gets passed to a service where I need to somehow merge it into a JPA entity.
These are the options I've found so far with pros and cons.
1. JPA merge()
The first thing I tried was probably the simplest. The GET action returns the JPA entity, which is easily serialized into JSON. On update, the object is passed back as JSON, which can be used to populate a detached entity. This can be saved to the DB using the JPA merge() method.
Pros
Simple architecture with less code duplication (i.e. no DTOs)
Cons
As far as I can tell this only works if you pass the whole model around. If you try to hide certain fields, like maybe the password on a User entity, then the merge thinks you're trying to set those fields to null in the DB. Not good!
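A minimal sketch of that pitfall, assuming a hypothetical User entity whose password field was hidden from the JSON and therefore deserialized as null:

// The JSON from the client became a detached User with password == null
public User update(EntityManager em, User detachedFromJson) {
    em.getTransaction().begin();
    // merge() copies ALL fields of the detached instance onto the managed one,
    // so the hidden password ends up overwritten with null in the database
    User managed = em.merge(detachedFromJson);
    em.getTransaction().commit();
    return managed;
}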
2. DTOs using JPA find() and Dozer
Next I thought I'd look at using data transfer objects. Apparently an anti-pattern, but worth a look. The service now creates a DTO instance based on the entity, and it is this DTO that is serialized to JSON. The update then gets the entity from the DB using a find() method, and the values need to be copied across from the DTO to the entity. I tried automating this mapping using the Dozer framework.
Pros
You don't have to return the entire model. If there are certain fields you don't want to be updated, you can leave them off the DTO and they can't be copied to the entity by mistake. Using Dozer means you don't have to manually copy attributes from the DTO to the entity and vice versa.
Cons
It feels like repeating yourself when writing the DTOs. Somehow you have to map between entities and DTOs. I tried to automate this with Dozer but it was a bit disappointing: it was nulling out things it shouldn't have been, and to get full control you have to write XML.
3. DTOs using manual merge
A third way would be to abandon Dozer and just copy the properties across from the DTO to the entity in the service. Everybody seems to call it an anti-pattern, but it's pretty much how every non-trivial application that I've seen in the past has worked.
Summary
It seems to be a decision between keeping things simple for the developer but not having control over the input/output, or making a more robust web service but having to use an anti-pattern in the process...
Have I missed anything? Perhaps there's an elusive alternative?
Using JPA merge looks the simplest and cleanest and takes very little effort, but as correctly discovered it creates problems with detached entity attributes being set to null.
Another problem, which turned out to be big in one of my experiences, is that if you rely on the JPA merge operation you must be using the Cascade feature as well.
For simple, less-nested relations this works reasonably well, but for deeply nested domain objects with lots of relations it becomes a big performance hit. The reason is that the ORM tool (Hibernate in my experience) caches up front the SQL to load the merged entity (the 'merge path' in Hibernate parlance), and if the nesting is too deep with Cascade mappings, the joins in that SQL become too big. Marking relations Lazy does not help here, as the merge path is determined by the Cascades on the relations. This problem becomes apparent slowly as your model evolves. Plus the prospect of an angry DBA waving a huge join query in our faces prompted us to do something different :-)
There is an interesting issue related to Hibernate regarding merging of Lazy relations, still unresolved (actually rejected, but the discussion is very enjoyable to read), in the Hibernate JIRA.
We then moved towards the DTO approach, where we refrained from using merge and relied on doing it manually. Yes, it was tedious and required knowing what state is actually coming from the detached entity, but to us it was worth it. This way we do not touch the Lazy relations and the attributes not meant to change, and set only what is required. Hibernate's automatic dirty-state detection does the rest on transaction commit.
This is the approach I am using:
suppress serialization of certain fields with the XmlTransient annotation
when updating the record from the client, get the entity from the database and use ModelMapper with a custom property mapping to copy the updated values without changing the fields that are not in the JSON representation.
For example:
public class User {

    @Id
    private long id;

    private String email;

    @XmlTransient
    private String password;

    ...
}
public class UserService {
    ...
    public User updateUser(User dto) {
        User entity = em.find(User.class, dto.getId());
        ModelMapper modelMapper = new ModelMapper();
        modelMapper.addMappings(new UserMap());
        // Copy the incoming values onto the managed entity, skipping the fields configured in UserMap
        modelMapper.map(dto, entity);
        return entity;
    }
}

public class UserMap extends PropertyMap<User, User> {
    protected void configure() {
        // Never overwrite the password from the incoming JSON
        skip().setPassword(null);
    }
}
BeanUtils is an alternative to ModelMapper.
It would be nice if these libraries could recognize the XmlTransient annotation so the programmer could avoid creating the custom property map.
My application has about 50 entities that are displayed in grid format in the UI. All 50 entities have CRUD operations. Most of the operations have the standard flow
i.e. for get: read entities from the repository, convert them to DTOs and return a list of DTOs;
for create/update/delete: take DTOs, convert them to entities, use the repository to create/update/delete in the DB, and return the updated DTOs.
Mind you that for SOME entities, there are also some entity specific operations that have to be done.
Currently, we have a get/create/update/delete method for all our entities like
getProducts
createProducts
updateProducts
getCustomers
createCustomers
updateCustomers
in each of these methods, we use the Product/Customer repository to perform the CRUD operation AFTER conversion from entity -> dto and vice versa.
I feel there is a lot of code repetition and there must be a way by which we can remove so many of these methods.
Can I use some pattern (COMMAND PATTERN) to do away with the code repetition?
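One possible way to cut the repetition (just a sketch, not from the original answers) is a generic base service parameterized on the entity and DTO types; the DtoMapper interface and all class names here are assumptions:

import java.util.List;
import java.util.stream.Collectors;
import org.springframework.data.jpa.repository.JpaRepository;

// Hypothetical per-entity mapper; each entity/DTO pair supplies one implementation
interface DtoMapper<E, D> {
    D toDto(E entity);
    E toEntity(D dto);
}

// The generic CRUD flow is written once; Product, Customer, etc. only provide a repository and a mapper
abstract class GenericCrudService<E, D, ID> {

    protected abstract JpaRepository<E, ID> repository();

    protected abstract DtoMapper<E, D> mapper();

    public List<D> getAll() {
        return repository().findAll().stream()
                .map(mapper()::toDto)
                .collect(Collectors.toList());
    }

    public D create(D dto) {
        E saved = repository().save(mapper().toEntity(dto));
        return mapper().toDto(saved);
    }

    public void delete(ID id) {
        repository().deleteById(id);
    }
}

Entity-specific operations can then live in the concrete subclasses (ProductService, CustomerService, and so on) alongside the inherited CRUD methods.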
Have a look at the Spring Data JPA project. It does away with boilerplate code for DAOs.
I believe it basically uses AOP to interpret calls like
findByNameAndPassword(String name, String passwd)
to do a query based on the parameters passed in, selecting by the fields named in the method name (you only write an interface).
Being a spring project it has very minimal requirements for spring libraries.
Basically, you have 2 ways to do this.
First way: Code generation
Write a class that can generate the code given a database schema.
Note that this way you will create basic classes for each entity.
If you have custom code (code specific to certain entities) you can put that in subclasses so that it doesn't get overwritten when you regenerate the basic classes.
Object instantiation should be via factory methods so that the correct subclass is used.
Make sure you add comments in the generated code that clearly states that the code is generated automatically (so that people don't start editing them directly).
Second way: Reflection
This solution, while being more elegant, is also more complex.
Instead of generating one basic class for each entity, you have one basic class that can handle any entity. The class would use reflection to access the DTOs.
If you have custom code (code specific to certain entities) you can put that in other classes. These other classes would be injected into the generic class.
Using reflection would require a strict naming policy for your DTOs.
Conclusion
I have been in a project using the first method (in a migration project, to generate DTO classes for the service interface between the new application server, running Java, and the fat clients) and it worked quite well. We had more than 100 generated DTO classes. I am aware that what you are attempting is slightly different. Editing database records is a generic problem (all projects need it), but there aren't (m)any frameworks for it.
I have been thinking about creating a generic tool or framework for it but I have never gotten around to it.
I have had similar questions and concerns about how to convert between Hibernate entities and the data transfer objects to be returned by a web service, as discussed in this question:
Is using data transfer objects in ejb3 considered best practice
One of the factors mentioned there is that a set of DTOs will protect the consumers of a web service in case the domain model changes.
Even though it seems like it will add a substantial amount of code to my project, this reasoning seems sound.
Is there a good design pattern that I can use to convert a Hibernate entity (which implements an interface) to a DTO that implements the same interface?
So assuming both of the following implement 'Book', I would need to convert a BookEntity.class to a BookDTO.class so that I can let JAXB serialize and return.
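For illustration only, here is a minimal sketch of one such conversion done with a copy constructor driven by the shared interface; the Book methods are assumptions:

// Hypothetical shared interface implemented by both the entity and the DTO
public interface Book {
    String getTitle();
    String getIsbn();
}

// DTO used for JAXB serialization; it copies its state from any Book implementation
public class BookDTO implements Book {

    private String title;
    private String isbn;

    public BookDTO() {}

    public BookDTO(Book source) {
        this.title = source.getTitle();
        this.isbn = source.getIsbn();
    }

    public String getTitle() { return title; }
    public String getIsbn() { return isbn; }

    public void setTitle(String title) { this.title = title; }
    public void setIsbn(String isbn) { this.isbn = isbn; }
}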
Again, this whole prospect seems dubious to me, but if there are good patterns out there for helping to deal with this conversion, I would love to get some insight.
Is there perhaps some interesting way to convert via reflection? Or a 'builder' pattern that I'm not thinking of?
Should I just ignore the DTO pattern and pass entities around?
Should I just ignore the DTO pattern
and pass entities around?
My preference is usually "yes". I don't like the idea of parallel hierarchies created just for the sake of architectural or layer purity.
The original reason for the DTO pattern was excessive chattiness in EJB 1.0 and 2.0 apps when passing entity EJBs to the view tier. The solution was to put the entity bean state into a DTO.
Another reason that's usually given for creating DTOs is to prohibit modification by the view layer. DTOs are immutable objects in that case, with no behavior. They do nothing but ferry data to the view layer.
I would argue that DTO is a Core J2EE pattern that's become an anti-pattern.
I realize that some people would disagree. I'm simply offering my opinion. It's not the only way to do it, nor necessarily the "right" way. It's my preference.
There needs to be a contrarian view amongst all the jolly kicking of the DTO.
tl;dr - It is sometimes still useful.
The advantage of the DTO is that you don't have to add a zillion annotations to your domain classes.
You start with @Entity. Not so bad. But then you need JAXB, so you add @XmlElement etc. And then you need JSON, so you add things like @JsonManagedReference for Jackson to do the right thing with relationships, then you add etc. etc. etc. ad infinitum.
Pretty soon your POJO ain't so plain any more. Read about "domain driven design" sometime.
In addition you can "filter" some properties that you don't want the view to know about.
We should not forget that entity objects are not easy to handle when they are in the managed state. This makes passing them to GUI forms problematic. To be more precise, child objects have to be handled eagerly; this cannot be done out of session, causing exceptions. So they either have to be evicted (detached) from the entity manager, or they have to be converted to appropriate DTOs. Unless of course there is a pattern, which I am not aware of, that I would be very glad to know about.
To quickly create a "look-alike" DTO without a bunch of duplicate get/set code, you can use BeanUtils.copyProperties. That function helps you quickly copy the data from the DAO to the DTO class. Just remember that more than one common library provides BeanUtils.copyProperties, and their signatures are not the same.
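For example, the two most common libraries take their arguments in opposite orders, which is easy to trip over; a quick sketch assuming hypothetical Customer and CustomerDTO beans with matching property names:

public class CopyExample {

    public static void copy(Customer customerEntity, CustomerDTO customerDto) throws Exception {

        // Spring: copyProperties(source, target)
        org.springframework.beans.BeanUtils.copyProperties(customerEntity, customerDto);

        // Apache Commons BeanUtils: copyProperties(dest, orig) - the argument order is reversed
        org.apache.commons.beanutils.BeanUtils.copyProperties(customerDto, customerEntity);
    }
}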
I know this is an old question, but thought I would add an answer offering a framework to help in case someone else is tackling this problem.
Our project has JAXB annotated POJOs that are separate from the JPA annotated POJOs. Our team was debating how best to move data between the two objects (actually data structures).
Here is an option for people to consider:
We found and are experimenting with Dozer, which handles (1) same-name fields, (2) XML mappings, and (3) custom conversions as ways to copy data between two POJOs.
It has been very easy to use so far.
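As a minimal sketch of that kind of usage (the SportEntity/SportDTO names are assumptions, and this uses the classic org.dozer API):

import org.dozer.DozerBeanMapper;
import org.dozer.Mapper;

public class DozerExample {

    public static void main(String[] args) {
        Mapper mapper = new DozerBeanMapper();

        // Hypothetical JPA-side bean
        SportEntity entity = new SportEntity();
        entity.setSportId(42);

        // Same-name fields are copied automatically; custom rules would go in XML mappings
        SportDTO dto = mapper.map(entity, SportDTO.class);
        System.out.println(dto.getSportId());
    }

    // Minimal hypothetical beans so the sketch is self-contained
    public static class SportEntity {
        private int sportId;
        public int getSportId() { return sportId; }
        public void setSportId(int sportId) { this.sportId = sportId; }
    }

    public static class SportDTO {
        private int sportId;
        public int getSportId() { return sportId; }
        public void setSportId(int sportId) { this.sportId = sportId; }
    }
}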