Can I get Jersey to use natural JSON notation globally/as default? - java

I'm using Jersey to build a REST API to a service. I'd like to be able to accept and return both JSON and XML, and have this mostly working but I don't like the default "mapped" flavor of JSON that Jersey likes to spit out.
I know about the newer "natural" notation (from http://jersey.java.net/nonav/documentation/latest/json.html, which I'll quote at length because it makes obvious the problems with the default "mapped" notation):
After using mapped JSON notation for a while, it was apparent, that a
need to configure all the various things manually could be a bit
problematic. To avoid the manual work, a new, natural, JSON notation
was introduced in Jersey version 1.0.2. With natural notation, Jersey
will automatically figure out how individual items need to be
processed, so that you do not need to do any kind of manual
configuration. Java arrays and lists are mapped into JSON arrays, even
for single-element cases. Java numbers and booleans are correctly
mapped into JSON numbers and booleans, and you do not need to bother
with XML attributes, as in JSON, they keep the original names
That natural notation is what I'd like to use everywhere, but I haven't been able to figure out how. I'm instantiating/configuring Jersey via Tomcat's XML config files -- using what I believe is the normal dance with servlet/servlet-class/init-param tags, sketched below -- but I haven't been able to find documentation on whether or how it's possible to specify JSONConfiguration options from there.
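For reference, the kind of registration I mean looks roughly like this (the package name, servlet name and URL pattern are placeholders, not my real values):
<servlet>
  <servlet-name>jersey</servlet-name>
  <servlet-class>com.sun.jersey.spi.container.servlet.ServletContainer</servlet-class>
  <init-param>
    <param-name>com.sun.jersey.config.property.packages</param-name>
    <param-value>com.example.api</param-value>
  </init-param>
  <load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
  <servlet-name>jersey</servlet-name>
  <url-pattern>/rest/*</url-pattern>
</servlet-mapping>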
I've also tried implementing my own ContextResolver which applies a JSONJAXBContext instantiated from Java code, where I can apply JSONConfiguration.natural() (an example of this looks like this answer; my version is sketched below). This works, but only for the types I explicitly list in that code and pass to the JSONJAXBContext constructor. Not only is that extra code to write, maintain, and change whenever I add more data classes, but it also doesn't work for things like List.
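The resolver I tried looks roughly like this (Foo and Bar stand in for my real data classes):
import javax.ws.rs.ext.ContextResolver;
import javax.ws.rs.ext.Provider;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.JAXBException;
import com.sun.jersey.api.json.JSONConfiguration;
import com.sun.jersey.api.json.JSONJAXBContext;

@Provider
public class NaturalJsonContextResolver implements ContextResolver<JAXBContext> {

    // Every bindable type has to be listed here by hand -- the maintenance burden described above.
    private final Class<?>[] types = { Foo.class, Bar.class };
    private final JAXBContext context;

    public NaturalJsonContextResolver() throws JAXBException {
        this.context = new JSONJAXBContext(JSONConfiguration.natural().build(), types);
    }

    public JAXBContext getContext(Class<?> objectType) {
        for (Class<?> type : types) {
            if (type.equals(objectType)) {
                return context;
            }
        }
        return null; // fall back to Jersey's default context for anything not listed
    }
}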
Is there a way to tell Jersey to just use the natural notation instead of mapped notation, always and for all types?

I never did find an answer to the actual question I was asking here, but instead I found a simple three-step process that accomplishes the same end result I wanted:
add Jackson to my project
configure Jersey to enable FEATURE_POJO_MAPPING (web.xml sketch below)
slap myself on the head a few times because it turned out to be so easy.
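For the record, step 2 amounts to one extra init-param on the Jersey servlet in web.xml (the servlet it goes into is whatever yours is called):
<init-param>
  <param-name>com.sun.jersey.api.json.POJOMappingFeature</param-name>
  <param-value>true</param-value>
</init-param>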
The Jersey documentation mentions this POJOMappingFeature/FEATURE_POJO_MAPPING prominently (it's the first example in the doc page I linked in the question), but doesn't describe exactly what it means, and from the way that document presents its information I thought this option (5.1, "POJO support") was at odds with option 5.2 ("JAXB based JSON support") which sounded more like what I wanted. So I tried a lot of other things before I tried enabling FEATURE_POJO_MAPPING.
But once I tried it, it worked exactly as I wanted, and I haven't had to look back.
A side benefit of this is that Jackson generates much better error messages in the case that the client passes it bogus JSON content, compared to Jersey's native JSON handling implementation.

Related

Utilize Spring Restdocs DSL for validation

Our REST API is documented by a set of tests using Spring Restdocs in the standard way (via mockMvc.perform(...)...andDo(document(..., responseFields(fieldWithPath(...)))) statements). Since the fields have a type and a mandatory/optional flag, I'm playing with the idea of reusing this info for response body validation in production code as well.
I moved Spring Restdocs to compile Maven scope and moved the snippet creation to production code, where it is visible both to the documenting test and to a response body interceptor in the src/main code (the latter just calls the ResponseFieldsSnippet.createModel method). Everything works fine except for the following pitfall: an empty collection of objects looks invalid, because the framework tries to match the fieldWithPath rule for an object field against nonexistent data.
For example, assuming the JSON of a cat is described by fieldWithPath("kittens[]") and fieldWithPath("kittens[].name"), the actual JSON {"kittens":[]} appears invalid since the latter descriptor is not satisfied. This doesn't happen for test samples, where data is fabricated to maximize the documentation benefit, but it is an issue for real cases.
Based on this observation, I tend to consider reusing the Restdocs DSL for validation a bad idea. Before switching to a heavyweight solution à la JSON Schema, I would like to ask: does Restdocs offer some way to express field descriptors as a tree rather than a flat list of rules? For the example above, something like fieldWithPath("kittens[]", subfieldWithPath("name")). (I think it could be useful regardless of how much of an abuse my use case is.)
I browsed and tried the examples from the documentation that seemed promising, but AFAIK they don't actually cover this case, namely: subsectionWithPath (which skips a subtree), beneathPath (which focuses only on a subtree) and ResponseFieldsSnippet.andWithPrefix (only a shortcut for list creation, but still a list, not a tree).
Thanks for your opinion!
I finally found that the problem is solved in newer library versions, namely 1.2.5 and 2.0.2 (I had 1.2.2). The example above must be expressed as:
fieldWithPath("kittens"),
fieldWithPath("kittens[]").optional(),
fieldWithPath("kittens[].name").type(STRING)
This setup says the kittens field itself is mandatory, but the array is allowed to be empty, in which case no name field is expected (the type of name must then be stated explicitly, since the library can't infer it from the data).
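For context, here is a sketch of how those descriptors might sit in a documenting test; the endpoint, snippet name and descriptions are made up, and the usual static imports from MockMvcRequestBuilders, MockMvcResultMatchers, MockMvcRestDocumentation and PayloadDocumentation are assumed:
mockMvc.perform(get("/cats/1").accept(MediaType.APPLICATION_JSON))
    .andExpect(status().isOk())
    .andDo(document("cat-get",
        responseFields(
            fieldWithPath("kittens").description("The cat's kittens (may be an empty array)"),
            fieldWithPath("kittens[]").optional().description("One entry per kitten"),
            fieldWithPath("kittens[].name").type(JsonFieldType.STRING)
                .description("Name of the kitten"))));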
More info: the original issue, the example above as a project test case; more examples can be found in the commits linked from the issue.
(Note: upgrading to 2.0.2 didn't work for me, since it also required upgrading Spring, which is not currently possible.)
The answer to the original question is no, since Spring Restdocs still keeps field descriptors as a flat list. However, after this fix that doesn't bother me very much.

How to separate business logic in RestAssured

We have a REST web service that operates on a JSON representation of its data. I would like to provide functional testing for it and plan to use the RestAssured framework, which provides understandable methods for testing the correctness of the output JSON.
Example: get("/method").then().assertThat().body("obj.field", equalTo(5));
But one problem arises: if the JSON structure changes, all the tests become invalid. For example, if field is renamed to field2, we have to fix every test containing occurrences of field. The problem is very similar to the web page testing problem, where we check for the presence of certain web elements, etc.; that was solved by introducing the Page Object pattern. Does a similar solution exist for testing a REST API, or can you suggest an elegant one?
In the example given in your question you validate the entire body of a response object, in which case it is probable you will create brittle tests.
However it looks like REST-Assured already provides all the functionality you need to test specific parts of a JSON response:
JSON example
JSON Advanced Examples
Using JSON Path
You can even map objects and then do whatever you wish with the objects constructed, for example validation and manipulation.
See here for more examples.
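Deserializing straight into your own type, for instance, keeps the JSON path strings out of most tests (Cat here is a hypothetical POJO matching the response):
// Map the response onto a POJO and assert on its properties instead of JSON paths.
Cat cat = get("/cats/1").as(Cat.class);        // needs a JSON mapper (e.g. Jackson) on the classpath
assertThat(cat.getKittens(), is(empty()));     // plain Hamcrest matchers from here on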
Just like with an HTML page, one way to write tests less exposed to changes is to use a strategy to locate the target you want to evaluate.
With a web page you would use an XPath query, a CSS selector or directly the id to avoid dependencies on the ancestors.
So how do you do it with JSON?
Well, you could use a regular expression, but that can get really messy, or you could use an XPath-like query language for JSON:
http://goessner.net/articles/JsonPath/
http://defiantjs.com/
So in your case, writing reliable tests is more about what you evaluate than about the framework you use to do it.
Changes in a REST API (especially a public one) are less frequent than in a GUI. When changes to the API are introduced, they should be marked with a new version and not break the old one (in most cases). So keep your tests as simple as possible, without introducing additional patterns; that has the benefit that you can easily throw them away and write new ones. Higher test framework complexity means higher maintenance costs. Anyway, in REST-Assured you can create a ResponseSpecification and reuse it in assertions.
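A minimal sketch of that ResponseSpecification reuse; the field name follows the example in the question, and the builder classes come from REST-Assured's ResponseSpecBuilder/ContentType packages (io.restassured in recent versions, com.jayway.restassured in older ones):
// A response spec keeps shared assertions in one place and can be mixed into individual tests.
ResponseSpecification okJson = new ResponseSpecBuilder()
        .expectStatusCode(200)
        .expectContentType(ContentType.JSON)
        .expectBody("obj.field", equalTo(5))
        .build();

get("/method").then().spec(okJson);            // reuse the same spec across many tests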

Suggestion on approach to compare XMLs (instead of using regex with Java's String replaceAll)

I'm writing a test utility that I intend to use to create artificial traffic to the main application (a Spring Integration based application that has entry points in both JMS-based and SOAP services, downstream and upstream).
In addition to creating the traffic, I want to be able to tell whether the application is responding properly (i.e. taking an XML response and comparing it against a predetermined expected value). I have base XMLs for the different types of responses, but there are dynamic values that, depending on the situation, I need to manipulate in order to compare against the base source.
One way to solve it: use the replaceAll method of Java's String class, manipulating both the source and target XMLs as needed to a point where I can determine whether the response is valid or not.
I'm interested to know if there is any XML utility framework that provides a more advanced set of capabilities to achieve this?
Many thanks
Do the "replace" normalization using an XSLT transformation, then use the XPath 2.0 deep-equal() function (perhaps within the same transformation) to do the comparison with reference results. This enables proper XML comparison semantics, e.g. ignoring insignificant whitespace, or arbitrary distinctions such as single-versus-double quotes.
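A rough sketch of the comparison step with Saxon's s9api; the Saxon dependency and the file names are assumptions, and in practice you would run both documents through your normalizing XSLT first:
// s9api classes live in net.sf.saxon.s9api; file names here are placeholders.
Processor processor = new Processor(false);                       // Saxon-HE, not schema-aware
XPathCompiler xpath = processor.newXPathCompiler();
String expression = String.format("deep-equal(doc('%s'), doc('%s'))",
        new File("expected.xml").toURI(), new File("actual.xml").toURI());
XdmItem result = xpath.evaluateSingle(expression, null);          // no context item needed
boolean sameXml = ((XdmAtomicValue) result).getBooleanValue();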
You could generate JAXB classes for your XML, unmarshal it, and then check the properties of interest. Chances are you already have those classes as your domain objects for your JMS and WS logic, so maybe they only need an @XmlRootElement annotation on top.
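A small sketch of that approach; OrderResponse and its getter are hypothetical, any JAXB-annotated class works the same way:
// Unmarshal the response and assert only on the fields that matter, ignoring the dynamic ones.
JAXBContext ctx = JAXBContext.newInstance(OrderResponse.class);
Unmarshaller unmarshaller = ctx.createUnmarshaller();
OrderResponse actual = (OrderResponse) unmarshaller.unmarshal(new StringReader(actualXml));
assertEquals("ACCEPTED", actual.getStatus());  // timestamps, ids etc. simply aren't asserted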
To stay closer to the XML, you could use XPath to extract the dynamic bits that you expect.

How to render special XML/JSON Flavours with Playframework

According to James Ward's Play tutorial, it's very easy to get JSON out of your model. With XML this should be quite simple as well.
But most of the time the requirement is to build not just a plain XML or JSON endpoint, but to deliver special flavours of those. In my case this is GeoJSON or TopoJSON. With XML, it could be a simple RSS or Atom feed you have to deliver out of your model. Building XML that conforms to a very nasty XSD schema still comes up sometimes, too.
What options do you have in Play to achieve this, and which of the following would you recommend?
In case of GeoJSON/TopoJSON: Activate JSON as a template format, and create JSON Templates
In case of ATOM/RSS: Just use an XML Template
Some way to modify the JSON response coming from toJson(tasks)?
Use of a fancy library which does all that out of the box, and everyone knows about it, except me?
If you're doing GeoJSON, just annotate your objects with Jackson annotations according to the GeoJSON spec; it's not hard. If it is hard, there are a few libraries out there that come with Java objects carrying the necessary annotations already, e.g.: https://github.com/opendatalab-de/geojson-jackson
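As a rough illustration of the idea (this hand-rolled class is only a sketch; the library linked above ships finished equivalents):
// A minimal Jackson-annotated GeoJSON Point; Jackson picks up the public fields as-is.
public class Point {

    @JsonProperty("type")
    public final String type = "Point";

    @JsonProperty("coordinates")
    public double[] coordinates;                // [longitude, latitude], per the GeoJSON spec

    public Point() { }

    public Point(double lon, double lat) {
        this.coordinates = new double[] { lon, lat };
    }
}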
An XML template is probably the simplest from Java.
What's your use case? toJson returns a Jackson JsonNode, which you can modify as much as you want. But the better thing to do is to use Jackson annotations on your objects to get the format right in the first place.
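If you do need to post-process, something along these lines works; the wrapper field names here are made up purely for illustration:
// Post-process the JsonNode from toJson, e.g. to wrap it in a GeoJSON-style envelope.
JsonNode features = Json.toJson(tasks);
ObjectNode wrapper = Json.newObject();          // empty ObjectNode via Play's Json helper
wrapper.put("type", "FeatureCollection");
wrapper.set("features", features);              // on older Jackson versions use put(name, node) instead
return ok(wrapper);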
I think you're referring to Jackson; it can do everything you want. It can even do XML if you want it to.

Generate object model out of RelaxNG schema with RNGOM - how to start?

I want to generate an object model out of a RelaxNG schema.
For this I want to use the RNGOM object model/parser (mainly because I could not find any alternative, although I don't even care which language the parser is written in or generates). Now that I have checked out the RNGOM source from SVN, I have no idea how to use it, since there is hardly any information out there about its usage.
A useful hint on how to start with RNGOM - a link, an example, or any description that saves me from having to read and understand the whole RNGOM source code - will be accepted as an answer.
Even better would be a simple example of how to use the parser to generate an object model out of an RNG file.
More info:
I want to generate Java classes out of the following RelaxNG Schema:
http://libvirt.org/git/?p=libvirt.git;a=tree;f=docs/schemas;hb=HEAD
I found out that the Glassfish guys are using rngom to generate the same object model I need, but I could not yet find out how they are using rngom.
A way to proceed could be to:
use jing to convert from RELAX NG to XML Schema (see here)
use more common tools to generate classes (e.g. JAXB).
I ran into mostly the same requirement, except that I am concentrating on the compact syntax. Here is one way of doing what you want, but YMMV.
To give some context, my goal has two phases: (a) slurp the RelaxNG compact syntax and traverse an object/tree to create Spring 4 POJOs usable in a Spring 4 REST controller; (b) from there, develop a request validator that uses the RNG compact schema and automatically validates the request before Spring deserializes it. Basically, scaffolding JSON REST API development using RelaxNG compact syntax as both design/documentation and JSON schema definition/validation.
For the first objective I thought about annotating CompactSyntax with JJTree, but I am obviously not fluent in JavaCC, so I decided to go with a more programmatic approach...
I analyzed and tested the code in several ways to determine whether there was a tree implementation in the binary, digested and/or nc packages, but I don't think there is one (an OM/tree) as such.
So my latest, actually successful, approach has been to build upon the binary package: extend SchemaBuilderImpl, implement the visitor interface, and pass my custom SchemaBuilderImpl to CompactSyntax using the long constructor: CompactSyntax(CompactParseable parseable, Reader r, String sourceUri, SchemaBuilder sb, ErrorHandler eh, String inheritedNs)
When you call CompactParseable.parse you will get structured events in the visitor interface, and I think this is good enough to traverse the RNG schema; from here you could easily create an OM or tree.
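Very roughly, the wiring looks something like this; I'm quoting signatures from memory rather than a verified API reference, and MySchemaBuilder stands for the custom SchemaBuilderImpl subclass described above:
// Types come from org.kohsuke.rngom.parse.compact and the ast.builder packages; DefaultHandler is
// just a convenient no-op SAX ErrorHandler. Nothing here is verified against a specific release.
ErrorHandler eh = new DefaultHandler();
CompactParseable parseable = new CompactParseable(new InputSource("domain.rnc"), eh);
SchemaBuilder sb = new MySchemaBuilder();       // extends SchemaBuilderImpl and implements the visitor
parseable.parse(sb);                            // structured events arrive in the visitor callbacks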
But I am not sure this is the best approach. Maybe I missed something and there is in fact an OM/Tree built by the rngom implementation (in my case CompactSyntax) that you can traverse to determine parent/child relationships more easily. Or maybe there are other approaches to this.
Anyway, this is one approach that seems to be working for what I want. It is mostly visitor-pattern based, and since the interfaces were there I decided to use them. Maybe it will work for you. Bottom line: I could not find a traversable OM/AST implemented anywhere in the implementation packages (nc, binary, digested).