Differentiating the Jersey, Jackson, and JAXB APIs - java

Hi: I've been using Jackson for JSON processing internally, and I now want to serve these objects as JSON through an external REST API (currently they are stored internally as Java objects).
The obvious implementation would be to write some kind of query engine that reads requests, retrieves objects from the underlying data store, and then serializes them into JSON using Jackson.
However, I'm starting to realize that there are APIs that can already be used to assemble such web services, taking care of a lot of the mundane details (security, query parsing, REST coordination). For example, it appears that Jersey annotations can be used to define REST services.
So my question is: what is the state of the art in Java EE JSON-based web services, and what do these services use as data stores (i.e. plain text? RDBMS? object data stores?)
Most importantly, what is the functional difference between the different APIs for XML and JSON data mapping, i.e. Jersey/Jackson/JAXB?

Aside from Jersey (and other JAX-RS implementations like RESTEasy), which use Jackson, you might also benefit from using something like jDBI for binding relational data to POJOs first.
It does many of the things bigger ORMs (like Hibernate) do, but is simpler to use for the most common tasks.
Or if you prefer Hibernate, use the Jackson Hibernate module to handle the edge cases that may come up when reading/writing POJOs as JSON.
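If it helps, here is a minimal sketch of wiring that module up (assuming the jackson-datatype-hibernate5 dependency is on the classpath; older setups would use Hibernate4Module instead):

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.datatype.hibernate5.Hibernate5Module;

public class MapperFactory {
    // Builds an ObjectMapper that understands Hibernate lazy proxies,
    // so uninitialized associations don't blow up during JSON serialization.
    public static ObjectMapper createMapper() {
        ObjectMapper mapper = new ObjectMapper();
        mapper.registerModule(new Hibernate5Module());
        return mapper;
    }
}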

There is a plugin for Jersey that will take your JAXB-annotated objects and serialize them as JSON automatically. Jersey (JAX-RS) is a really good offering.
You can also use JPA annotations on the same objects and a JPA provider like EclipseLink for a lot of your database needs. A basic relational database can handle most websites' needs.
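As a rough sketch (class, field, and path names are made up), a JAXB-annotated POJO exposed through a JAX-RS resource could look like the following; assuming a JSON provider such as the Jersey JSON/Jackson module is registered, the same annotations drive both XML and JSON output:

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.xml.bind.annotation.XmlRootElement;

// Normally a separate public class; kept here to make the sketch self-contained.
@XmlRootElement
class Book {
    public String title;   // public fields keep the example short
    public String author;
}

@Path("/books")
public class BookResource {

    // GET /books/42 -> serialized as JSON or XML depending on the Accept header
    @GET
    @Path("{id}")
    @Produces({MediaType.APPLICATION_JSON, MediaType.APPLICATION_XML})
    public Book get(@PathParam("id") long id) {
        Book book = new Book();          // in a real service this would hit the data store
        book.title = "Example title";
        book.author = "Example author";
        return book;
    }
}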

Java released a set of specifications called JAX-RS to standardize the development of RESTful web services in Java EE. These specifications are just definitions, not a concrete implementation.
There are various implementation providers of these APIs (specifications). Jersey, RESTEasy, Restlet and Apache CXF are a few such implementations that can be used to build RESTful services in Java.
Specific to Jersey: it is not limited to implementing the JAX-RS APIs. It is a framework with its own set of APIs, built by extending the JAX-RS capabilities, that provides additional features to further ease the development of REST APIs in Java.
JAXB stands for Java Architecture for XML Binding, another specification provided by Java for marshalling and unmarshalling Java objects to XML and vice versa. Again, it is just the specification, not a concrete implementation.
Coming to Jackson: it is a JSON processor (not a JAXB implementation, although it can read JAXB annotations) used to marshal and unmarshal objects between Java and JSON. Jersey can use Jackson internally to convert Java objects to JSON and vice versa.
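A tiny sketch of the practical difference (the Person class is hypothetical): JAXB produces XML through a JAXBContext and Marshaller, while Jackson's ObjectMapper produces JSON from the same object:

import java.io.StringWriter;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.annotation.XmlRootElement;
import com.fasterxml.jackson.databind.ObjectMapper;

public class MappingDemo {

    @XmlRootElement
    public static class Person {
        public String name = "Alice";
    }

    public static void main(String[] args) throws Exception {
        Person person = new Person();

        // JAXB (XML binding): Java object -> XML
        StringWriter xml = new StringWriter();
        Marshaller marshaller = JAXBContext.newInstance(Person.class).createMarshaller();
        marshaller.marshal(person, xml);
        System.out.println(xml);   // <person><name>Alice</name></person> (plus XML declaration)

        // Jackson (JSON processing): Java object -> JSON
        String json = new ObjectMapper().writeValueAsString(person);
        System.out.println(json);  // {"name":"Alice"}
    }
}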

Jersey is an implementation of JAX-RS. You can think of JAX-RS as a common interface built for RESTful web services. The implementation of this interface is provided by vendors; there are many implementations available, such as Jersey and RESTEasy. Jackson, on the other hand, is a JSON processor. It helps you convert your objects to JSON and vice versa.

Related

What is the best way to parse data using the same API between a JEE RESTful web service and Android?

We have a big project which is composed of JEE modules, Java client applications, Android applications, and other self-made Java projects.
Because of the variety of projects, we decided to create shared Java library projects and Java entity projects common to the JEE, Java client and Android applications, with the goal of limiting code redundancy between projects.
At the beginning, we only had Java clients and RESTful web services on the JEE server side, which exchanged data using the JAXB XML binding API. So it was easy to use JAXB annotations on the classes in the entity project (which is set as a dependency of the Java client and JEE projects). Both sides could easily encode and decode XML data with the same annotations.
Now that we have an Android app, I wanted to exchange data the same way. The thing is that JAXB is 'deprecated' on Android. I found that the Jackson library has a JAXB annotation module for binding data annotated with JAXB, but I'm not convinced of the full compatibility of that solution.
I also tried using JSON binding, since I saw that JSON-B will be the standard in Java EE 8, but it seems to need the Java EE API classes and I don't think it's good to add those to an Android project.
So my question is: what is the best practice for exchanging data between JEE RESTful web services and an Android application using the same entity dependency (and the same parsing API), while limiting the XML or JSON binding annotations on the entity objects?
I hope the problem is clear.
Thank you.
Let's name your entities project entities-module. This project contains POJO classes annotated with JAXB annotations such as @XmlElement, @XmlType, etc. Since, like you wrote, JAXB is 'deprecated' on Android, you need to choose between reading the JAXB annotations with another tool or creating a new, customized POJO structure.
Read JAXB annotations with another tool
The Jackson library has good support for JAXB annotations. All you need to do is use the jackson-module-jaxb-annotations module. It is easy to register, and if you already use Jackson for serializing/deserializing JSON this is the obvious choice. You can find out how to do that in the answers to this question on SO: Using JAXB with Google Android.
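Registration is a one-liner; a minimal sketch (assuming jackson-module-jaxb-annotations is on the classpath):

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule;

public class JaxbAwareMapper {
    // Returns a Jackson mapper that honors the existing JAXB annotations
    // (@XmlElement, @XmlType, ...) on the entities-module classes.
    public static ObjectMapper create() {
        ObjectMapper mapper = new ObjectMapper();
        mapper.registerModule(new JaxbAnnotationModule());
        return mapper;
    }
}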
Pros:
You do not need to create a new POJO structure.
In case of changes to the RESTful API, you just use the next version of the entities-module project.
Cons:
Problems with enabling JAXB on Android.
Performance issues linked to the heavyweight JAXB and Jackson module layers.
New module
The second choice is to create a new module with POJOs and use a faster and smaller library like Simple XML (see the sketch after the pros and cons below).
Pros:
Better performance
No problems building the app with deprecated JAXB classes.
You can optimize the structure: ignore unused fields, choose better types, etc.
Cons:
You need to create a new module with a new class structure.
You need to learn and maintain a new library.
In case of changes to the API, you need to duplicate the changes in several modules.
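For completeness, a rough sketch of the Simple XML variant (the Person class is hypothetical; Simple uses its own annotations and a Persister for reading and writing):

import org.simpleframework.xml.Element;
import org.simpleframework.xml.Root;
import org.simpleframework.xml.Serializer;
import org.simpleframework.xml.core.Persister;

public class SimpleXmlDemo {

    // A slimmed-down Android-side copy of the entity, annotated for Simple XML.
    @Root(name = "person")
    public static class Person {
        @Element
        public String name;
    }

    public static void main(String[] args) throws Exception {
        Serializer serializer = new Persister();
        Person p = serializer.read(Person.class, "<person><name>Alice</name></person>");
        System.out.println(p.name); // Alice
    }
}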
Take a look at the pros and cons above and decide what is best for you.
I would also like to add two more options:
Replace the JAXB annotations with Jackson annotations. Jackson has really good XML support besides being great for JSON, which gives you easy support for both JSON and XML formats in your API. You can also use the Jackson library on Android (see the sketch after this list).
Create a new JSON API for the Android app. The UI flows in an Android app are usually different from those in a web app, and you will probably end up with that idea anyway.
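To illustrate the first option (a sketch with a hypothetical Person class): the same Jackson-annotated POJO can be written as JSON with ObjectMapper and as XML with XmlMapper from jackson-dataformat-xml, both of which run on Android:

import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.xml.XmlMapper;

public class JacksonBothFormats {

    // One set of Jackson annotations shared by the JSON and XML representations.
    public static class Person {
        @JsonProperty("name")
        public String name = "Alice";
    }

    public static void main(String[] args) throws Exception {
        Person p = new Person();
        String json = new ObjectMapper().writeValueAsString(p);  // {"name":"Alice"}
        String xml = new XmlMapper().writeValueAsString(p);      // <Person><name>Alice</name></Person>
        System.out.println(json);
        System.out.println(xml);
    }
}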

JAX-WS & JAX-RS with EclipseLink MOXy and external mapping documents

This is about using JAX-WS with EclipseLink MOXy and my problem of not being able to use MOXy's external mapping documents in this combination. (The same actually seems to apply to JAX-RS as well, but I'm limiting the details to JAX-WS to keep this from becoming even longer than it already is).
I first describe (in some detail) the context of my requirements and the test code I've written to try and solve them. The actual question follows at the end of the post.
The context:
In some legacy parts of our project, we use JAX-WS directly on top of a simple server-side POJO model. The model is part of an API that we use for direct Java calls, but we also provide a SOAP layer implementing the same Java interface as our direct implementation, i.e. we have an IManager interface using our model, and the caller doesn't care whether his IManager instance is the local implementation or a SOAP client that calls the JAX-WS server wrapper for the local implementation.
Back when we started implementing the model, we didn't know much about JAXB, so the model has no JAXB annotations and everything is auto-deduced by JAX-WS. From the JAX-WS annotated server classes, we create a client using wsgen and wsimport. But since the model is part of our API, and the model generated by wsimport is a separate set of classes (albeit with identical signatures), we have to wrap all client calls with methods that copy between these two models. Copying from one model to the other is implemented manually and has to be updated manually for every tiny change in the API model.
Since both server and client are under our control, we'd like to use the same set of (hand-written) model classes in the server and the client implementation. I've spent the past few weeks playing around with a test project that works, but doesn't completely satisfy me yet.
I built my test project in three steps:
Step 1: Implement a simple test API consisting of a model and a manager interface. Then create a simple implementation.
In my case, the model consists mainly of an IPerson and an IBook interface, and the IModelManager has methods to get, store/update and delete persons and books. The interface also provides a factory for creating new persons and books, so client projects only need the API at compile time (and the implementation at runtime).
Step 2: Provide modules for serialization to and from XML and JSON.
Because of its support of external mapping documents, I played around with EclipseLink MOXy. This allowed me to take my model from step 1 and add XML binding declarations in my separate serialization project. I could also write my serialization utility classes in such a way that the client code calling them only knows about the API, i.e. all method signatures rely only on the model interfaces and not the implementation classes. All this without touching the classes from step 1 - all the serialization stuff was cleanly layered on top of the Java implementation project.
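For reference, the step-2 serialization utilities create the MOXy context roughly like this (the mapping file path is an assumption):

import java.util.HashMap;
import java.util.Map;
import javax.xml.bind.JAXBContext;
import org.eclipse.persistence.jaxb.JAXBContextFactory;
import org.eclipse.persistence.jaxb.JAXBContextProperties;

public class MoxyContextFactory {

    // Builds a JAXBContext whose mappings come from an external OXM document
    // instead of annotations on the model classes.
    public static JAXBContext create(Class<?>... modelClasses) throws Exception {
        Map<String, Object> properties = new HashMap<>();
        properties.put(JAXBContextProperties.OXM_METADATA_SOURCE,
                "my/model/bindings/model-bindings.xml"); // hypothetical mapping file on the classpath
        return JAXBContextFactory.createContext(modelClasses, properties);
    }
}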
Step 3: Provide SOAP and REST layers via JAX-WS and JAX-RS.
Now I wrote server classes for SOAP and REST, using JAX-WS and JAX-RS annotations. From the JAX-WS classes I generated a client with wsgen and wsimport and wrapped that in a client-side IModelManager implementation which simply redirects the calls to the client class generated by wsimport. For the JAX-RS classes I used Jersey with annotations in my server class and wrote an IModelManager implementation which uses the Jersey client classes to call the REST service.
This is where I lost some of the nice functionality from step 2.
Using the JAX-WS EclipseLink plugin, all the JAX-WS code uses MOXy for serializing and deserializing the model (or at least I think it does). But I couldn't find a way to specify my external binding files. I had to add annotations to my actual model implementation from step 1 (and even to the API, because the model also includes an enum referenced from the interfaces and the enum needs JAXB annotations as well). I also had to change my JAX-WS server class to use the model implementation classes instead of the interfaces (for now I simply use casts where necessary, assuming there will never be another model implementation).
Using the WSDL and schema generated from the updated model by wsgen, I can call wsimport with a custom JAXB bindings XML to generate a client that maps all the types from the generated XSD to my existing model classes instead of generating new ones. The result is a client interface that accepts and returns the model interfaces from my API and always uses the standard implementation. I just have to write simple wrappers and cast my model implementation classes to API interfaces in several places, which is a massive improvement over our legacy code with its duplicate (and triplicate) models and loads of copying logic.
The question:
I'm not happy with having to annotate all my model classes directly (and even the enums in the API), especially considering that I had a working serialization/deserialization pipeline without annotations in step 2. Presumably, if I could provide wsimport and the code that starts up the server-side JAX-WS implementation with my MOXy mapping documents, I could do without any annotations just like in step 2. It even seems to me that if I could provide wsgen with the mapping files, I could use my API interfaces (instead of implementation classes) in the server-side JAX-WS methods and do without all the casts, and perhaps even without the JAXB bindings file that manually maps the wsgen created XSD types to my existing classes.
But I haven't been able to find a way to provide JAX-WS with the MOXy mappings, neither the wsgen and wsimport tools nor the client or server side code. Is there something I'm missing, or if not, is this indeed a gap in JAX-WS's MOXy bridge, and is there a chance to fill it to get the full MOXy functionality in a future version? Or did I make a fundamental error in my line of thinking and there either cannot be a solution or there already is one? (Hence my detailed description of my test project)
The target platform is plain Java 8, either as a standalone application (with Jetty, using javax.xml.ws.Endpoint to publish the SOAP service and Jersey for the REST service), or Tomcat with Jersey. My tests run directly in JUnit using Endpoints and JerseyTest, plus manual tests in Tomcat.
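For context, the standalone SOAP endpoint in those tests is published roughly like this (the service class here is a trivial, made-up stand-in for the real JAX-WS server wrapper):

import javax.jws.WebMethod;
import javax.jws.WebService;
import javax.xml.ws.Endpoint;

@WebService
public class ModelManagerWebService {

    @WebMethod
    public String ping() {
        return "ok";
    }

    // Publishes the endpoint on a local port, the same way the standalone/JUnit setup does.
    public static void main(String[] args) {
        Endpoint.publish("http://localhost:8080/test/modelManager",  // hypothetical test URL
                new ModelManagerWebService());
    }
}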

Avoid intermediate data conversion between RPC (SOAP) APIs?

Let's assume I develop two services called ProductAPI and OrderAPI. Both of them use a common domain model (a common entity hierarchy).
Both of the services are ultimately exposed via RPC (SOAP or REST).
The OrderAPI internally invokes ProductAPI.
In the case of the REST:
We can develop ProductAPI using JAX-RS and we can implement a "ProductAPI REST Client" to be used within OrderAPI to access ProductAPI.
This client can use the same class hierarchy to deserialize the JSON into the same classes used in ProductAPI.
So there is no intermediate format conversion.
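A rough sketch of that REST client (class, path and field names are made up), using the JAX-RS 2.0 client API so the response is deserialized straight into the shared domain class:

import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.core.MediaType;

public class ProductApiRestClient {

    // Minimal stand-in for the shared common-domain-model class.
    public static class Product {
        public long id;
        public String name;
    }

    private final Client client = ClientBuilder.newClient();
    private final String baseUri;

    public ProductApiRestClient(String baseUri) {
        this.baseUri = baseUri;   // e.g. "http://localhost:8080/product-api"
    }

    // Deserializes the JSON response directly into the shared domain class,
    // so no intermediate format conversion is needed.
    public Product getProduct(long id) {
        return client.target(baseUri)
                .path("products/{id}")
                .resolveTemplate("id", id)
                .request(MediaType.APPLICATION_JSON)
                .get(Product.class);
    }
}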
In the case of SOAP:
We develop the ProductAPI using JAX-WS (or Axis2, etc.) and expose the service via a WSDL.
In this case, we have to implement a "ProductAPI SOAP Client" from the exposed WSDL (perhaps using a stub generation tool against that WSDL).
Here the client classes are generated from the XSD definitions in the WSDL, and we have to do an additional format conversion if we want to use the same common domain model classes.
My questions:
1) In the case of SOAP, is there a way to skip this format conversion?
2) In an enterprise application (like e-commerce), is it good practice to avoid this kind of intermediate data conversion for performance reasons?
To move data between processes, the data needs to be serialized. You can certainly use a different and better format than SOAP for the serialized data, but if you already have SOAP, it will be more work.
Is it worth trading programmer effort for performance in this kind of project? My guess is no, since the performance will probably be good enough anyway. And if it isn't, the usual advice is to measure and identify bottlenecks before optimizing anything.

Using Thrift-generated models with Hibernate-annotated POJOs

Basically, I am in the process of evaluating Thrift for an upcoming project. What I am trying to achieve is a data layer written in Java which then serves (via Thrift) a RoR-powered website as well as an iPhone application.
I have familiarized myself with Thrift's IDL, and it seems a strong contender due to its efficiency compared with a RESTful service.
I would like to send the POJOs via Thrift; however, to do so I currently have to convert each POJO to the Thrift-generated object before it can be used by the Thrift service. I can't stop feeling there is a better way of doing this that doesn't involve the conversion.
Are there any best practices to overcome this problem?
If you need any more specific information please let me know.
Swift can also do this - you can annotate your POJOs with both JPA and Swift annotations, then use Swift+Thrift to serialize them. Swift can generate Thrift IDL from the annotated classes for you to use elsewhere.
This is Swift: https://github.com/facebook/swift/
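A minimal sketch of what such a dual-annotated POJO could look like (class and field names are made up; annotations come from javax.persistence and com.facebook.swift.codec):

import javax.persistence.Entity;
import javax.persistence.Id;
import com.facebook.swift.codec.ThriftField;
import com.facebook.swift.codec.ThriftStruct;

// One class serves as both the JPA entity and the Thrift struct,
// so no copying between a persistence model and a Thrift-generated model is needed.
@Entity
@ThriftStruct
public class Product {

    private long id;
    private String name;

    @Id
    @ThriftField(1)
    public long getId() { return id; }

    @ThriftField
    public void setId(long id) { this.id = id; }

    @ThriftField(2)
    public String getName() { return name; }

    @ThriftField
    public void setName(String name) { this.name = name; }
}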
I think the best way is to implement the Thrift IDL properly and map your structs against an hbm.xml. This way you can generate your POJOs with the Thrift compiler and persist them using Hibernate.

Java Framework for integrating WSDL, REST, etc

At work, we currently have a WSDL interface as well as a semi-RESTful interface that we're looking to expand upon and take to the next level.
The main application runs using Servlets + JSPs as well as Spring.
The idea is that the REST and WSDL interfaces front an API that will be designed. These (and potentially other things in the future) are simply methods through which clients will be able to integrate with the API.
I'm wondering if there are any suggestions or recommendations on frameworks/methodologies, etc., for implementing that underlying API, or does it make sense simply to create some Spring beans which are called by either the WSDL or the REST layer?
Hope that makes sense.
Have a look at Enunciate, it is great. You are using Spring; Spring has had SOAP support for a while, and Spring 3 has support for REST (creating and consuming).
Your approach makes sense. Probably the most important advice is to make the external API layer as thin as possible. You can use Axis, Apache CXF, Jersey, etc. to handle the implementation of the REST or SOAP protocols, but the implementation of those services should just load the passed-in data into a common request object and pass that into a separate service that handles the request and returns a response object, which the external API layer will marshal into the correct format for you.
This approach works especially well when you have a competitor providing similar services and you want to make it easy for their customers to switch. You just build a new external API that mirrors the competitors, and simply translates their format to your internal api model and provided your services are functionally equivalent, you're done.
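A rough sketch of that shape (all names are hypothetical): the JAX-RS layer only translates between the wire format and a common request/response pair, and delegates to a protocol-agnostic service that could equally well sit behind the SOAP layer:

import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;

@Path("/orders")
public class OrderEndpoint {

    // Protocol-neutral request/response objects shared by the REST and SOAP layers.
    public static class OrderRequest  { public String productId; public int quantity; }
    public static class OrderResponse { public String orderId; }

    // The real logic lives behind this interface (e.g. a Spring bean).
    public interface OrderService {
        OrderResponse placeOrder(OrderRequest request);
    }

    private final OrderService service;

    public OrderEndpoint(OrderService service) {
        this.service = service;
    }

    // The REST layer is just a thin adapter: deserialize, delegate, serialize.
    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    @Produces(MediaType.APPLICATION_JSON)
    public OrderResponse create(OrderRequest request) {
        return service.placeOrder(request);
    }
}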
This is a really late response, but I have a different view on this topic. The traditional way, as we know it, is to unmarshal XML to Java and marshal Java to XML. However, if the WSDL changes, that is effectively a structural change in the code, which again requires a deployment.
Instead of the above approach, if we list the fields mentioned in the WSDL in a persistent store, load the mappings into memory, and prepare our structures based on those mappings, we would need far fewer changes. Thus, IMO, instead of using existing libraries, a configurable approach to unmarshalling and marshalling should be taken.
