I've got simple JavaBeans objects which represent data in my database. I want to transform them into several formats (XML, JSON) to share with my clients, and I also need the reverse transformation. What tools would you advise for this? I need a fast, simple, uncomplicated tool. I know GSON or JSONObject will be fine for producing JSON, but what about XML? I found JAXB too "fat" for my needs. Or maybe I'm wrong? Thanks.
Note: I'm the EclipseLink JAXB (MOXy) lead and a member of the JAXB 2 (JSR-222) expert group.
MOXy offers both XML and JSON binding by leveraging the JAXB metadata plus its own extensions. In the example below, the same object with the same metadata is mapped to both the XML and JSON representations of the Google Maps Geocoding API V2:
http://blog.bdoughan.com/2011/08/binding-to-json-xml-geocode-example.html
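To give a feel for it, here is a minimal hedged sketch of marshalling one object to both representations; the Result class is invented, and it assumes MOXy is configured as the JAXB provider (via a jaxb.properties file) so that MarshallerProperties.MEDIA_TYPE is honored:

// Hedged sketch, not taken from the linked post: marshal the same JAXB-annotated
// object to XML and then to JSON by switching MOXy's media-type property.
// The Result class is invented; assumes MOXy is the configured JAXB provider.
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlRootElement;
import org.eclipse.persistence.jaxb.MarshallerProperties;

@XmlRootElement
class Result {
    public String address;
}

public class XmlAndJsonDemo {
    public static void main(String[] args) throws Exception {
        Result result = new Result();
        result.address = "1600 Amphitheatre Parkway";

        JAXBContext context = JAXBContext.newInstance(Result.class);

        Marshaller xml = context.createMarshaller();
        xml.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);
        xml.marshal(result, System.out);                                  // XML output

        Marshaller json = context.createMarshaller();
        json.setProperty(MarshallerProperties.MEDIA_TYPE, "application/json");
        json.marshal(result, System.out);                                 // JSON output
    }
}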
MOXy also supports an external mapping document that allows you to map a single object model to multiple XML or JSON representations. In the next example, one object model is mapped to the results of both the Google and Yahoo weather APIs:
http://blog.bdoughan.com/2011/09/mapping-objects-to-multiple-xml-schemas.html
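As a rough sketch of how such an external mapping file is plugged in (the Weather class and the metadata file names are invented; it assumes MOXy's JAXBContextFactory and JAXBContextProperties are on the classpath):

// Hedged sketch: create a MOXy JAXBContext backed by an external mapping document,
// so the same Weather class can be bound differently per API. All names invented.
import java.util.HashMap;
import java.util.Map;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.annotation.XmlRootElement;
import org.eclipse.persistence.jaxb.JAXBContextFactory;
import org.eclipse.persistence.jaxb.JAXBContextProperties;

@XmlRootElement
class Weather {
    public String condition;   // invented field; the mapping files decide how it appears
}

public class ExternalMetadataDemo {
    public static void main(String[] args) throws Exception {
        // Point the context at an external mapping document for the Google representation.
        Map<String, Object> properties = new HashMap<>();
        properties.put(JAXBContextProperties.OXM_METADATA_SOURCE, "example/google-metadata.xml");
        JAXBContext googleContext =
                JAXBContextFactory.createContext(new Class[] { Weather.class }, properties);

        // A second context built with "example/yahoo-metadata.xml" would bind the same
        // Weather class to the other API's representation.
    }
}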
One of the things that makes MOXy so flexible is its path-based mapping, which breaks the one-to-one relationship between objects and nodes in XML and JSON messages:
http://blog.bdoughan.com/2011/08/binding-to-json-xml-geocode-example.html
http://blog.bdoughan.com/2011/03/map-to-element-based-on-attribute-value.html
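A small hedged example of the path-based mapping idea, using MOXy's @XmlPath annotation (the Customer class and element names are invented):

// Hedged sketch: map a flat Customer object into nested XML nodes with MOXy's
// @XmlPath, breaking the one-to-one object/node relationship. Names are invented.
import javax.xml.bind.annotation.XmlRootElement;
import org.eclipse.persistence.oxm.annotations.XmlPath;

@XmlRootElement
public class Customer {

    // A flat Java field mapped to a nested XML location.
    @XmlPath("personal-info/name/text()")
    public String name;

    @XmlPath("contact-info/address/city/text()")
    public String city;

    // Marshalling produces nested elements such as
    // <customer><personal-info><name>...</name></personal-info>...</customer>
    // even though the object itself has no intermediate classes.
}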
Also, since EclipseLink offers a JPA implementation, MOXy contains extensions for handling objects that are also mapped to the database:
http://blog.bdoughan.com/2010/07/jpa-entities-to-xml-bidirectional.html
http://wiki.eclipse.org/EclipseLink/Examples/MOXy/JPA
Related
I have a JSON file which I am using as fabricated data to create a POJO using GSON. This is the expected data for my tests. Next, I have data from the Cucumber feature file which would override some of the fields in the already created POJO, i.e. update the expected-data object. I was wondering if anybody has done this before or is aware of a pattern I can use to achieve this. In terms of approaches, I was wondering whether it makes more sense to create an updated JSON first from both data sources, or to create the POJO from the first JSON and then apply the mutation.
I have found a way around this - using an Apache Velocity template instead of vanilla JSON files simplifies the implementation and provides flexibility.
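To make that concrete, here is a hedged sketch of the approach; the template text, the context keys, and the Booking class are all invented for illustration:

// Hedged sketch: fill a JSON Velocity template with values taken from the feature
// file, then let GSON build the expected POJO. All names here are made up.
import java.io.StringWriter;
import org.apache.velocity.VelocityContext;
import org.apache.velocity.app.VelocityEngine;
import com.google.gson.Gson;

public class ExpectedDataBuilder {

    static class Booking { String id; String status; }   // invented expected-data POJO

    public static void main(String[] args) {
        // The template plays the role of the fabricated JSON file.
        String template = "{ \"id\": \"$id\", \"status\": \"$status\" }";

        VelocityEngine engine = new VelocityEngine();
        engine.init();

        VelocityContext context = new VelocityContext();
        context.put("id", "B-42");            // default from the fabricated data
        context.put("status", "CANCELLED");   // override taken from the feature file

        StringWriter out = new StringWriter();
        engine.evaluate(context, out, "expected-booking", template);

        // GSON builds the expected POJO from the filled-in template.
        Booking expected = new Gson().fromJson(out.toString(), Booking.class);
    }
}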
I need to convert a List of records (Java objects) into XML output. The XML schema can change based on the request type, and the XML output needs to be in a different format with different elements depending on the request parameter (e.g. Request A produces XML in format A: <A><aa>name</aa></A>; Request B produces format B: <B><bab>name</bab></B>). The existing framework uses JAXB, but the data objects are tightly coupled to one XML schema.
We are trying to refactor to make this flexible so that, based on the request type, we can generate different XML outputs, i.e. the same data -> multiple output formats. We are also expecting more XML schemas, so we need to keep it flexible.
I'd appreciate it if you could let me know which framework would be most suitable for this type of scenario. What would be a good solution for huge schema and data files?
Thank you!
So you have your Domain Model and one or more Integration Models (output formats). I assume that when you create a response (Integration Model) you use some kind of adapter from the Domain Model (if not, you should consider this). I would keep it this way for multiple Integration Models (response types / output formats).
For each response type, have a dedicated Integration Model (e.g. generate it from an XSD, e.g. with JAXB) and have an adapter from your Domain Model to this Integration Model.
Then, based on the request type, select one of the adapters and produce the response.
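A hedged sketch of that adapter selection; every name here (Record, RequestType, ResponseAdapter, the Format*Adapter classes) is an invented placeholder for your own types and JAXB-generated Integration Models:

// Hedged sketch: one adapter per Integration Model, chosen by request type.
import java.util.List;
import java.util.Map;

enum RequestType { A, B }

class Record { String name; }   // stand-in for your domain records

interface ResponseAdapter {
    // Returns the Integration Model instance that will be marshalled (e.g. with JAXB).
    Object toIntegrationModel(List<Record> records);
}

class FormatAAdapter implements ResponseAdapter {
    public Object toIntegrationModel(List<Record> records) {
        // build the <A><aa>...</aa></A> model here
        return null;
    }
}

class FormatBAdapter implements ResponseAdapter {
    public Object toIntegrationModel(List<Record> records) {
        // build the <B><bab>...</bab></B> model here
        return null;
    }
}

class ResponseFactory {
    private static final Map<RequestType, ResponseAdapter> ADAPTERS = Map.of(
            RequestType.A, new FormatAAdapter(),
            RequestType.B, new FormatBAdapter());

    static Object respond(RequestType type, List<Record> records) {
        return ADAPTERS.get(type).toIntegrationModel(records);
    }
}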
If you want it to be really future-proof, you should consider having your own Integration Model for the app. This will prevent you from having to change the adapters whenever your Domain Model changes, like this:
Domain Model -> [trafo] -> Common Integration Model -> [trafos] -> 3rd Party Integration Models
By the way, the converters don't all need to be implemented with one particular technology; use whatever best fits each case. One adapter may use JAXB-generated Java classes, another may be an XML transformation.
You may consider implementing your integration layer with frameworks like Apache Camel.
I use JAXB to load an XML configuration file into a Java object (ConfigurationDTO). Is it good practice to add some logic code to this Java object (ConfigurationDTO), or should I create a different Java object with this logic code (i.e. Configuration)? When I say logic code, I mean some checks/constraints that the configuration file should satisfy. Should the Java class ConfigurationDTO contain only getters?
The question is why you need those constraints. Are you going to use your object for anything other than marshalling/unmarshalling? If so, that is a bad idea. The rule of thumb is not to spread DTO objects across all levels of an application. If you follow this rule, you won't need additional constraints in your DTO.
The JAXB standard provides you with the ability to validate an object at marshal and unmarshal time. It means that if your XML schema requires a non-empty field but the corresponding Java object has a null value, the marshal will fail, and vice versa.
Here is a quote from the JAXB documentation:
Validation is the process of verifying that an XML document meets all the constraints expressed in the schema. JAXB 1.0 provided validation at unmarshal time and also enabled on-demand validation on a JAXB content tree. JAXB 2.0 only allows validation at unmarshal and marshal time. A web service processing model is to be lax in reading in data and strict on writing it out. To meet that model, validation was added to marshal time so users could confirm that they did not invalidate an XML document when modifying the document in JAXB form.
Such an approach has its own drawbacks (if you spread the DTO across the application you lose control over it), but the advantages are more valuable.
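To illustrate the validation point above, a minimal sketch of enabling schema validation at unmarshal time; configuration.xsd, configuration.xml, and the Configuration class are placeholder names:

// Hedged sketch: attach an XML schema so JAXB rejects invalid documents at
// unmarshal time (the same Schema can be set on a Marshaller for marshal time).
import java.io.File;
import javax.xml.XMLConstants;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Unmarshaller;
import javax.xml.bind.annotation.XmlRootElement;
import javax.xml.validation.Schema;
import javax.xml.validation.SchemaFactory;

@XmlRootElement
class Configuration {
    public String endpoint;   // placeholder field
}

public class ValidatingLoad {
    public static void main(String[] args) throws Exception {
        Schema schema = SchemaFactory
                .newInstance(XMLConstants.W3C_XML_SCHEMA_NS_URI)
                .newSchema(new File("configuration.xsd"));

        Unmarshaller unmarshaller = JAXBContext
                .newInstance(Configuration.class)
                .createUnmarshaller();
        unmarshaller.setSchema(schema);   // schema violations now fail the unmarshal

        Configuration config =
                (Configuration) unmarshaller.unmarshal(new File("configuration.xml"));
    }
}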
I'm trying to find some tutorial examples on how to exchange data between databases and XML files using Java, from getting and setting specific data in a database to (if possible) changing how the database is structured.
I have done some research into this, but I'm unsure whether it's JDBC I should look into, XML:DB, JAXB, or whether any of them are even relevant to what I'm trying to do.
I plan to create a database example, and then see if I can exchange data to and from an XML file using Java, just to see how it works; what should I look into in order to accomplish this?
Many thanks.
You can do this in many other ways, but here is how I do it (a sketch of these steps follows below):
Get the data from the database
Convert it to a HashMap
Create a JAXB class matching your schema
Create a constructor in the JAXB class which accepts the HashMap and assigns the data to the class's fields
Convert the JAXB object to XML/JSON by marshalling
Write it to a file if you want
If you're new to JAXB, view this tutorial here!
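A hedged sketch of the steps above; the person table, its columns, the H2 connection URL, and the Person class are all invented for illustration:

// Hedged sketch: read one row via JDBC into a Map, populate a JAXB-annotated
// class from it, and marshal the result to XML. All names here are made up,
// and it assumes a JDBC driver (H2 in this example) is on the classpath.
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlRootElement;
import java.sql.*;
import java.util.HashMap;
import java.util.Map;

@XmlRootElement
class Person {
    public String name;
    public int age;

    Person() {}                        // JAXB requires a no-arg constructor
    Person(Map<String, Object> row) {  // step 4: assign the map's data to the fields
        this.name = (String) row.get("name");
        this.age = (Integer) row.get("age");
    }
}

public class DbToXml {
    public static void main(String[] args) throws Exception {
        Map<String, Object> row = new HashMap<>();
        try (Connection c = DriverManager.getConnection("jdbc:h2:mem:demo");
             Statement s = c.createStatement();
             ResultSet rs = s.executeQuery("SELECT name, age FROM person")) {
            if (rs.next()) {                       // step 2: copy the row into a HashMap
                row.put("name", rs.getString("name"));
                row.put("age", rs.getInt("age"));
            }
        }
        Marshaller m = JAXBContext.newInstance(Person.class).createMarshaller();
        m.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);
        m.marshal(new Person(row), System.out);    // step 5: marshal to XML (or to a File)
    }
}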
You could do the following:
Use a JPA implementation (EclipseLink, Hibernate, OpenJPA, etc.) to convert the database data to/from Java objects.
Use a JAXB implementation (EclipseLink MOXy, the reference implementation, etc.) to convert the Java objects to/from XML.
After further research into my query, I found JDBC (for database manipulation) and XStream (for XML conversion) to be my preferred solutions.
For JDBC, I referred to this link
For XStream, I referred to this link
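For reference, a minimal hedged XStream round trip (the Person class and the "person" alias are illustrative names):

// Hedged sketch: XStream converting a plain Java object to XML and back.
import com.thoughtworks.xstream.XStream;

public class XStreamDemo {
    static class Person {
        String name;
        int age;
    }

    public static void main(String[] args) {
        XStream xstream = new XStream();
        xstream.alias("person", Person.class);
        xstream.allowTypes(new Class[] { Person.class });   // newer XStream versions require explicit permissions

        Person p = new Person();
        p.name = "Alice";
        p.age = 30;

        String xml = xstream.toXML(p);                  // object -> XML
        Person back = (Person) xstream.fromXML(xml);    // XML -> object
    }
}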
Thank you for your replies.
I'm working on a project that uses JAX-RS, Jackson, and JPA. The JAX-RS resources map incoming JSON directly to the POJO (JPA) that is to ultimately be persisted.
@POST
@Consumes(MediaType.APPLICATION_JSON)
public Response createEntity(Entity entity) {
    ...
}
However, I occasionally find that the information a resource needs from the client doesn't map cleanly 1:1 to a POJO. There are extra fields that provide metadata on how to handle the request, for example a callback URL, or a plaintext password that doesn't get persisted.
Is there an elegant way to preserve this information while still mapping directly to the JPA entity?
I have some ideas, but I'm not thrilled with any of them:
First map to a Map<String, Object>: then use the ObjectMapper to map to an entity that is configured to ignore certain properties (sketched below, after this list). This results in some extra boilerplate code for certain resources (possibly all resources that consume JSON, for consistency's sake).
Use @Transient fields for extra values: this allows Jackson to map cleanly to POJOs, but tends to clutter the data model with business logic instead of letting it concern itself only with the state and behavior of entities.
Use @QueryParam for extra values: this seems to complicate the resource's interface, and seems rather arbitrary from a client's perspective.
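A rough sketch of the first idea, just to show the kind of boilerplate I mean; the Entity class and the callbackUrl field are simplified placeholders:

// Hedged sketch of the two-step mapping: read the raw JSON into a Map, pull out
// the metadata, then convert the remainder into the JPA entity. Names invented.
import java.util.Map;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class TwoStepMapping {
    static class Entity { public String name; }   // stand-in for the JPA entity

    public static void main(String[] args) throws Exception {
        String json = "{\"name\":\"widget\",\"callbackUrl\":\"http://example.org/cb\"}";

        ObjectMapper mapper = new ObjectMapper()
                .disable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES);

        Map<String, Object> raw = mapper.readValue(
                json, new TypeReference<Map<String, Object>>() {});
        String callbackUrl = (String) raw.remove("callbackUrl");   // keep the metadata aside

        Entity entity = mapper.convertValue(raw, Entity.class);    // remainder -> JPA entity
    }
}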
Any ideas? It would be nice if it were possible to rig up a JAX-RS MessageBodyReader or some kind of context provider to pass a Map of the extra parameters as an additional argument to the method, but I don't know how much work that would be.
This use case is often handled by using dedicated data transfer objects at the resource level, which are mapped to the JPA entities by frameworks like Dozer. Besides the obvious boilerplate code, there are advantages to this approach:
If the resources follow the HATEOAS principle, the entities must be enriched with further REST-specific information such as their own link, links to other resources, and pagination information.
REST clients often have the option to specify entity-expansion properties (which properties of the entity or of referenced entities shall be included in the response, for bandwidth reasons), in which case you would at least have to apply filters to the entities.
But coming back to your question: if you want to re-use the JPA entities for JSON mapping, I think your ideas are all valid. Another variant of your second idea may be to store all this extra information in a map that is part of the entity (a single property of business-logic clutter instead of many); if your JPA entities have a common base class, this mapping can be done there. You can use the @JsonAnySetter annotation to achieve this.
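To illustrate that last variant, a hedged sketch of a common base class that collects the unknown JSON properties with @JsonAnySetter; all names are illustrative, and @Transient keeps JPA from trying to persist the map:

// Hedged sketch: a mapped superclass collecting unknown JSON properties in a
// transient map via Jackson's @JsonAnySetter. All names are placeholders.
import java.util.HashMap;
import java.util.Map;
import javax.persistence.MappedSuperclass;
import javax.persistence.Transient;
import com.fasterxml.jackson.annotation.JsonAnyGetter;
import com.fasterxml.jackson.annotation.JsonAnySetter;

@MappedSuperclass
public abstract class BaseEntity {

    @Transient
    private final Map<String, Object> extraProperties = new HashMap<>();

    @JsonAnySetter
    public void putExtraProperty(String key, Object value) {
        extraProperties.put(key, value);   // e.g. callbackUrl, plaintext password
    }

    @JsonAnyGetter
    public Map<String, Object> getExtraProperties() {
        return extraProperties;
    }
}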