I need to convert a List of records (Java objects) into XML output. The XML schema can change based on the request type: the output needs a different format with different elements depending on a request parameter (e.g. request A produces format A, <A><aa>name</aa></A>; request B produces format B, <B><bab>name</bab></B>). The existing framework uses JAXB, but the data objects are tightly coupled to one XML schema.
We are trying to refactor to make this flexible, so that based on the request type we can generate different XML outputs, i.e. the same data can produce multiple output formats. We are also expecting more XML schemas, so the design needs to stay flexible.
I would appreciate it if you could let me know which framework would be most suitable for this kind of scenario. What would be a good solution for huge schemas and data files?
Thank you!
So you have your Domain Model and one or more Integration Models (output formats). I assume that when you create a response (Integration Model), you use some kind of adapter from the Domain Model (if not, you should consider this). I would keep it this way for multiple Integration Models (response types / output formats).
For each response type, have a dedicated Integration Model (e.g. generate it from an XSD, for instance with JAXB) and have an adapter from your Domain Model to that Integration Model.
Then, based on the request type, select one of the adapters and produce the response.
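A minimal sketch of that selection, assuming a hypothetical Person domain class, FormatA/FormatB classes standing in for JAXB-generated Integration Model classes, and an illustrative adapter registry (none of these names come from the original post):

```java
import java.util.HashMap;
import java.util.Map;

// Domain Model object (illustrative).
class Person {
    final String name;
    Person(String name) { this.name = name; }
}

// One adapter per Integration Model; each turns the Domain Model into the
// schema-specific object graph that JAXB (or XSLT, etc.) will serialize.
interface ResponseAdapter {
    Object adapt(Person person);
}

// Stand-ins for classes generated from schema A and schema B.
class FormatA { String aa; }
class FormatB { String bab; }

class ResponseFactory {
    private final Map<String, ResponseAdapter> adapters = new HashMap<>();

    ResponseFactory() {
        adapters.put("A", person -> { FormatA a = new FormatA(); a.aa = person.name; return a; });
        adapters.put("B", person -> { FormatB b = new FormatB(); b.bab = person.name; return b; });
    }

    // Select the adapter based on the request type, then hand the result to JAXB.
    Object createResponse(String requestType, Person person) {
        return adapters.get(requestType).adapt(person);
    }
}
```

Adding a new output format then means adding one Integration Model and one adapter, without touching the Domain Model or the existing adapters.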
If you want it to be really future-proof, you should consider having your own common Integration Model for the app. This saves you from changing all the adapters whenever your Domain Model changes, like this:
Domain Model -> [transformation] -> Common Integration Model -> [transformations] -> 3rd-party Integration Models
By the way, the converters don't need to be implemented in one particular technology; use whatever best fits each case. One adapter may use JAXB-generated Java classes, another may be an XML transformation (e.g. XSLT).
You may consider implementing your integration layer with frameworks like Apache Camel.
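If you do go with Camel, the adapter selection can be expressed as a content-based router. A minimal sketch, assuming hypothetical endpoint URIs, header name and adapter bean names:

```java
import org.apache.camel.builder.RouteBuilder;

// Illustrative route: pick the transformation based on a "requestType" header.
public class ResponseRoute extends RouteBuilder {
    @Override
    public void configure() {
        from("direct:buildResponse")
            .choice()
                .when(header("requestType").isEqualTo("A"))
                    .bean("formatAAdapter")     // Domain Model -> schema A objects (hypothetical bean)
                    .marshal().jaxb()           // serialize with JAXB
                    .endChoice()
                .when(header("requestType").isEqualTo("B"))
                    .bean("formatBAdapter")     // Domain Model -> schema B objects (hypothetical bean)
                    .to("xslt:schemaB.xsl")     // or run an XSLT transformation instead
                    .endChoice()
            .end();
    }
}
```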
Related
I have a use case where I have one DTO class that holds all the data retrieved from a database call. I need to create different JSON formats using information from that class.
What is the best way to do this? Also, is there a way to do this without making a code change every time a new JSON format is needed, for example by storing the different JSON schemas in a persistence layer and then doing the mapping dynamically?
Here are my thoughts. Suppose you have a DTO class, say EmpDto, whose data mirrors your database table model. If you want to create the JSON in a different shape, create a separate object model such as EmpJsonBean and annotate it with the JSON annotations from the Jackson framework. You then populate the EmpJsonBean with the required data from the EmpDto class.
If you are looking for a design pattern that keeps the impact minimal, I would suggest the Adapter pattern, so that you can produce a different JSON structure based on the business need.
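A minimal sketch of that approach, with hypothetical EmpDto/EmpJsonBean fields (the field names are illustrative, not from a real schema):

```java
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.ObjectMapper;

// DTO holding the data from the database call (hypothetical fields).
class EmpDto {
    public String firstName;
    public String lastName;
    public String department;
}

// Adapter-style JSON view: a separate bean carrying the Jackson configuration.
class EmpJsonBean {
    @JsonProperty("name")
    public String fullName;

    @JsonProperty("dept")
    public String department;

    EmpJsonBean(EmpDto dto) {
        this.fullName = dto.firstName + " " + dto.lastName;
        this.department = dto.department;
    }
}

class EmpJsonExample {
    public static void main(String[] args) throws Exception {
        EmpDto dto = new EmpDto();
        dto.firstName = "Jane";
        dto.lastName = "Doe";
        dto.department = "Engineering";
        // Prints {"name":"Jane Doe","dept":"Engineering"}
        System.out.println(new ObjectMapper().writeValueAsString(new EmpJsonBean(dto)));
    }
}
```

A second JSON format would simply be a second bean built from the same EmpDto.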
I have a JSON file which I am using as fabricated data to create a POJO with GSON. This is the expected data for my tests. Next, I have data from the Cucumber feature file which should override some of the fields in the already created POJO, i.e. update the expected data object. I was wondering if anybody has done this before or is aware of a pattern I can use to achieve this. In terms of approach, I was wondering whether it makes more sense to create an updated JSON first from both data sources, or to create the POJO from the first JSON and then apply the mutation.
I have found a way around this: using Apache Velocity templates instead of vanilla JSON files simplifies the implementation and provides flexibility.
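A minimal sketch of the idea, assuming a hypothetical Employee POJO and hard-coding the values that would normally come from the Cucumber feature file:

```java
import java.io.StringWriter;

import org.apache.velocity.VelocityContext;
import org.apache.velocity.app.VelocityEngine;

import com.google.gson.Gson;

public class ExpectedDataExample {

    // Hypothetical POJO matching the expected JSON.
    static class Employee {
        String name;
        String role;
    }

    public static void main(String[] args) {
        // JSON template with Velocity placeholders instead of hard-coded values.
        String template = "{ \"name\": \"$name\", \"role\": \"$role\" }";

        VelocityEngine engine = new VelocityEngine();
        engine.init();

        // Values that would be overridden from the feature file (hard-coded here).
        VelocityContext context = new VelocityContext();
        context.put("name", "Jane Doe");
        context.put("role", "Tester");

        StringWriter rendered = new StringWriter();
        engine.evaluate(context, rendered, "expected-data", template);

        // Deserialize the rendered JSON into the expected-data POJO with GSON.
        Employee expected = new Gson().fromJson(rendered.toString(), Employee.class);
        System.out.println(expected.name + " / " + expected.role);
    }
}
```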
In simple terms, why do we need a bean-to-bean mapping service (like Dozer) in a web application?
Suppose I'm working on a web-service.
I receive an XML payload in the request.
I fetch the values from the XML elements.
Perform the required operation on the fetched values.
Prepare the response XML.
Send the response XML back as the response.
Why should I add one more step of mapping the XML elements to my own custom objects?
I'm not able to convince myself, probably because I'm not able to think of a better situation/reason.
Please suggest, with example if possible.
It helps to reduce coupling between the presentation (i.e. the XML schema) and the business logic. For example in case of schema changes you don't have to touch the business logic, just the mapping between the objects.
In simple cases it might not be worth the additional complexity. But if the objects are used extensively in the business logic component you should consider it.
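A minimal sketch of what that decoupling looks like, with hypothetical OrderXml (schema-facing) and Order (business) classes; Dozer maps same-named properties by convention, so a schema change only touches this boundary:

```java
import org.dozer.DozerBeanMapper;
import org.dozer.Mapper;

// Schema-facing class, e.g. JAXB-generated (illustrative).
class OrderXml {
    private String customerName;
    public String getCustomerName() { return customerName; }
    public void setCustomerName(String customerName) { this.customerName = customerName; }
}

// Business object used by the service logic (illustrative).
class Order {
    private String customerName;
    public String getCustomerName() { return customerName; }
    public void setCustomerName(String customerName) { this.customerName = customerName; }
}

class MappingExample {
    public static void main(String[] args) {
        OrderXml request = new OrderXml();
        request.setCustomerName("ACME");

        // If the XML schema changes, only this mapping (or its configuration)
        // changes, not the business logic that works with Order.
        Mapper mapper = new DozerBeanMapper();
        Order order = mapper.map(request, Order.class);
        System.out.println(order.getCustomerName());
    }
}
```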
Just as a quick answer, the case you described is not the only one :).
Suppose you are working with an internal library that provides some POJOs, entities, or other beans. You want to abstract away from the internal representation (for one reason or another), so you map those beans to your own. This works:
for an EJB client, or something like that,
when you don't want to expose internal entities (Business Object vs. Presentation Object) (see @Henry's reply),
when you have beans that don't inherit from the same parent (and can't, for whatever reason, even legacy ones) and you want to transfer values from one to the other.
There are plenty of (other) reasons :)
As advice, see also Orika and this post:
any tool for java object to object mapping?
Short answer for me: as Henry said, it helps reduce coupling between what you expose or consume and your core data model.
It is one way to build a Hexagonal Architecture. You can freely modify your core model without impacting the exposed model. In a hexagonal architecture, it is used to expose only a small, relevant part of the core model.
It is also a very good way to handle service and model versioning, since multiple versions can be mapped to the core model.
When working with XML services I tend to build contract-first applications: I first write the XML Schema, then generate the JAXB beans, and I really don't want my business code to be polluted by JAXB annotations.
If you know that your exposed model will always stay the same and that your application does not fall into the previously mentioned cases, then you really don't need to use DTOs.
Last, I would recommend using a framework with strong compile-time checking, like Selma, instead of Dozer or Orika, because the latter evaluate the mapping only at runtime, which is weakly typed and fragile under refactoring.
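A minimal sketch of keeping the generated classes out of the core model, with a hypothetical OrderType (imagine it was generated by JAXB from the contract-first XSD) and a clean Order domain class; the hand-written mapper below just stands in for whatever mapping framework you pick:

```java
import java.math.BigDecimal;

// Imagine this was generated by JAXB from the XSD: public fields, no behaviour.
class OrderType {
    public String customerId;
    public BigDecimal total;
}

// Core domain class: no JAXB annotations, free to evolve independently.
class Order {
    private final String customerId;
    private final BigDecimal total;

    Order(String customerId, BigDecimal total) {
        this.customerId = customerId;
        this.total = total;
    }

    String customerId() { return customerId; }
    BigDecimal total() { return total; }
}

// The only place that knows about both models.
class OrderMapper {
    Order toDomain(OrderType xml) {
        return new Order(xml.customerId, xml.total);
    }

    OrderType toXml(Order order) {
        OrderType xml = new OrderType();
        xml.customerId = order.customerId();
        xml.total = order.total();
        return xml;
    }
}
```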
RESTful resources do not always have a one-to-one mapping with your JPA entities. As I see it there are a few problems that I am trying to figure out how to handle:
When a resource has information that is populated and saved by more than one entity.
When an entity has more information in it than you want to send down as a resource. I could just use Jackson's @JsonIgnore, but I would still have issues 1, 3 and 4.
When an entity (like an aggregate root) has nested entities and you want to include some of its nested entities in the resource, but only to a certain level of nesting.
When you want to exclude one piece of an entity when it is part of one parent entity, but exclude a different piece when it is part of another parent entity.
Blasted circular references (I got this mostly working with JSOG using Jackson's @JsonIdentityInfo).
Possible solutions:
The only way I could think of that would handle all of these issues would be to create a whole bunch of "resource" classes that would have constructors that took the needed entities to construct the resource and put necessary getters and setters for that resource on it. Is that overkill?
To solve 2, 3, 4, and 5 I could just do some pre- and post-processing on the actual entity before sending it to Jackson to serialize or deserialize my POJO into JSON, but that doesn't address issue 1.
These are all problems I would think others have come across, and I am curious what solutions other people have come up with. (I am currently using JPA 2, Spring MVC, Jackson, and Spring Data, but am open to other technologies.)
With a combination of JAX-RS 1.1 and Jackson/GSON you can expose JPA entities directly as REST resources, but you will run into a myriad of problems.
DTOs i.e. projections onto the JPA entities are the way to go. It would allow you to separate the resource representation concerns of REST from the transactional concerns of JPA. You get to explicitly define the nature of the representations. You can control the amount of data that appears in the representation, including the depth of the object graph to be traversed, if you design your DTOs/projections carefully. You may need to create multiple DTOs/projections for the same JPA entity for the different resources in which the entity may need to be represented differently.
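A minimal sketch of such a projection, with hypothetical Order/Customer/OrderItem entities: the DTO is populated from two entities, cuts the object graph off after one level, and serves a computed id instead of the entities' keys:

```java
import java.util.List;
import java.util.stream.Collectors;

// Hypothetical JPA entities, reduced to the fields used here.
class Customer { Long id; String name; }
class OrderItem { String name; }
class Order { Long id; List<OrderItem> items; }

// The resource representation ("projection") served to clients.
class OrderResource {
    private final String id;              // computed id, e.g. from constituent keys
    private final String customerName;    // populated from a second entity
    private final List<String> itemNames; // object graph cut off after one level

    OrderResource(Order order, Customer customer) {
        this.id = customer.id + "-" + order.id;
        this.customerName = customer.name;
        this.itemNames = order.items.stream()
                .map(item -> item.name)   // only the name, not the whole nested entity
                .collect(Collectors.toList());
    }

    public String getId() { return id; }
    public String getCustomerName() { return customerName; }
    public List<String> getItemNames() { return itemNames; }
}
```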
Besides, in my experience using annotations like @JsonIgnore and @JsonIdentityInfo on JPA entities doesn't exactly lead to more usable resource representations. You may eventually run into trouble when merging the objects back into the persistence context (because of ignored properties), or your clients may be unable to consume the resource representations, since object references as a scheme may not be understood. Most JavaScript clients will usually have trouble consuming object references produced by the @JsonIdentityInfo annotation, due to the lack of standardization here.
There are other additional aspects that become possible through DTOs/projections. JPA @EmbeddedIds do not fit naturally into REST resource representations. Some advocate using the JAX-RS @MatrixParam annotation to identify the resource uniquely in the resource URIs, but this does not work out of the box for most clients. Matrix parameters are, after all, only a design note and not a standard (yet). With a DTO/projection, you can serve out the resource representation against a computed id (which could be a combination of the constituent keys).
Note: I currently work on the JBoss Forge plugin for REST where some or all of these issues exist and would be fixed in some future release via the generation of DTOs.
I agree with the other answers that DTOs are the way to go. They solve many problems:
Separation of layers and clean code. One day you may need to expose the data model using a different format (e.g. XML) or interface (e.g. one that is not web-service based). Keeping all the configuration (such as @JsonIgnore, @JsonIdentityInfo) for each interface/format in the domain model would make it really messy. DTOs separate the concerns. They can contain all the configuration required by your external interface (web service) without requiring changes in the domain model, which can stay web-service and format agnostic.
Security - you easily control what is exposed to the client and what the client is allowed to modify.
Performance - you easily control what is sent to the client.
Issues such as (circular) entity references, lazily-loaded collections are also resolved explicitly and knowingly by you on converting to DTO.
Given your constraints, there looks to be no other solution than Data Transfer Objects - yes, this occurs frequently enough that people have named the pattern...
If your application is completely CRUDish then the way to go is definitely Spring Data REST, in which case you absolutely do not need DTOs. If it's more complicated than that, you will be safer with DTOs securing the application layer. But do not attempt to encapsulate DTOs inside the controller layer. They belong to the service layer, because the mapping is also part of the logic (what you let into the application and what you let out of it). This way the application layer stays hermetic. Of course, in most cases it can be a mix of those two.
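A minimal sketch of keeping the mapping in the service layer, with hypothetical Person/PersonDto classes; the controller only ever sees DTOs, so the entity never leaves the application layer:

```java
import org.springframework.stereotype.Service;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

class Person {            // JPA entity (annotations omitted for brevity)
    Long id;
    String name;
    String ssn;           // must never leave the application
}

class PersonDto {         // what the API exposes
    public Long id;
    public String name;
}

@Service
class PersonService {
    PersonDto findPerson(Long id) {
        Person entity = loadFromRepository(id);
        // Mapping is part of the service logic: decide here what goes out.
        PersonDto dto = new PersonDto();
        dto.id = entity.id;
        dto.name = entity.name;
        return dto;
    }

    private Person loadFromRepository(Long id) {
        Person p = new Person();   // stand-in for a Spring Data repository call
        p.id = id;
        p.name = "Jane Doe";
        p.ssn = "secret";
        return p;
    }
}

@RestController
class PersonController {
    private final PersonService service;

    PersonController(PersonService service) { this.service = service; }

    @GetMapping("/people/{id}")
    PersonDto get(@PathVariable Long id) {
        return service.findPerson(id);   // the entity never reaches the web layer
    }
}
```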
What are commonly used ways to build web forms in Java and Groovy?
Spring and Grails provide the corresponding taglibs, but I am wondering whether there are form frameworks which allow you to create forms as sets of objects, dynamically manipulate the elements, embed sub-forms, populate and validate them, and render them.
For example, if I have a group of fields common for a number of forms, I would like to reuse the code. Furthermore, such a group of elements, in turn, may be a part of another group. I also would like to dynamically reorder the elements, change field names and other input field attributes, and so on without altering any HTML code.
You can try the open-source Formio library. It can be used in many environments/frameworks and with various template frameworks.
With Formio, you can manipulate forms using objects: create form definitions - mappings, definitions of form fields and nested mappings (for nested objects like a person's address or a list of addresses). Both mappings and form fields can be defined as reusable (immutable) objects and composed together. You can use nested mappings to model reusable "groups" of form fields (and to nest them into other groups).
Formio supports binding of data from Java object to form definition that can be then passed to a template and rendered. Data from the request can be validated (using bean validation API) and bound back to the Java object (to newly created instance or to provided one). Binding of basic Java types, nested object types, collections or arrays (of primitives or complex objects) is supported. Immutable classes can be used.
Form definition can be prepopulated with data (before passing to the template) and filled back with (validated) data from the request. Automatic bidirectional data binding is implemented in "fill" and "bind" methods on the form definition object.
Formio is a server-centric library, but it can be combined with existing client-side libraries. Rendering is left to the template system. For example, using JSP you can prepare your own reusable tags to render the form definition populated with data and its parts (nested mappings, various types of form fields), which already contain all the necessary data to render, including flags like visible, enabled, readonly, required. You can define your own custom field attributes and use them in reusable tags/template snippets, so the HTML code does not need to be altered in most cases.
Note: I am the author of the library; you can find the sources on GitHub and make your own fork. Check the documentation for further details.
What you need is a templating/layout framework which will allow you to reuse snippets of code.
In the java web dev space, Tiles is the de facto framework for layouts. Both Grails and Spring MVC support Tiles.
I haven't come across an MVC framework that would expose a complete interface to handling form data as objects, in both ways. There is always some manual coding that is required to translate form data into objects, and back.
There are alternative frameworks, like Echo (http://echo.nextapp.com/site/), that completely isolate the application from dealing with requests and responses; they may be better suited to the kind of abstraction you are looking for.