Utilize Spring Restdocs DSL for validation

Our REST API is documented by a set of tests using Spring Restdocs in the standard way (via mockMvc.perform(...)...andDo(document(...)) statements with fieldWithPath(...) descriptors). Since fields have a type and a mandatory/optional flag, I am playing with the idea of reusing this information for response body validation in production code as well.
I moved Spring Restdocs to the compile Maven scope and moved snippet creation into production code, where it is visible to both the documenting test and a response body interceptor in the src/main code (the latter just calls the ResponseFieldsSnippet.createModel method). Everything works fine except for the following pitfall: an empty collection of objects looks invalid, because the framework tries to match the fieldWithPath rule for an object's field against nonexistent data.
For example, assuming the JSON of a cat is described by fieldWithPath("kittens[]"), fieldWithPath("kittens[].name"), the actual JSON {"kittens":[]} appears invalid since the latter descriptor is not satisfied. This doesn't happen for test samples, where data is fabricated to maximize documentation benefit, but it is an issue for real cases.
Based on this observation, I tend to consider reusing the Restdocs DSL for validation a bad idea. Before switching to a heavy-weight solution à la JSON Schema, I would like to ask: does Restdocs offer some way to express field descriptors as a tree rather than a list of rules? For the example above, something like fieldWithPath("kittens[]", subfieldWithPath("name")). (I think it could be useful regardless of how much I am abusing the tool in my case.)
I browsed and experimented with examples from the documentation that seemed promising but, AFAIK, don't actually cover this case, namely: subsectionWithPath (which skips a subtree), beneathPath (which focuses only on a subtree) and ResponseFieldsSnippet.andWithPrefix (only a shortcut for list creation, but still a list, not a tree).
Thanks for your opinion!

I finally found that the problem is solved in newer library versions, namely 1.2.5 and 2.0.2 (I had 1.2.2). The example above must be expressed as:
fieldWithPath("kittens"),
fieldWithPath("kittens[]").optional(),
fieldWithPath("kittens[].name").type(STRING)
This setup says that the kittens field itself is mandatory, but the array is allowed to be empty, in which case no name field is expected (the type of name must then be stated explicitly, since the library can't infer it from the data).
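For reference, a minimal sketch of how these three descriptors could be plugged into an ordinary documenting test; the endpoint, the snippet name "get-cat" and the static imports are assumptions, not taken from my project:

mockMvc.perform(get("/cats/1"))
        .andExpect(status().isOk())
        .andDo(document("get-cat", responseFields(
                fieldWithPath("kittens"),                     // mandatory, type inferred from the payload
                fieldWithPath("kittens[]").optional(),        // the array itself may be empty
                fieldWithPath("kittens[].name").type(STRING)  // type stated explicitly
        )));
// Static imports assumed: MockMvcRequestBuilders.get, MockMvcResultMatchers.status,
// MockMvcRestDocumentation.document, PayloadDocumentation.responseFields / fieldWithPath,
// and JsonFieldType.STRING.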
More info: the original issue; the example above is available as a project test case; other examples can be found in the commits linked from the issue.
(Note: upgrading to 2.0.2 didn't work for me, since it also required upgrading Spring, which is not currently possible.)
The answer to the original question is no, since Spring Restdocs still keeps the list format for field descriptors. However, after this fix that doesn't bother me very much.

How can I test for the presence of an object in a List managed by Protocol Buffers?

I have generated a Java API from some Protocol Buffers source code. You can see it here: https://microbean.github.io/microbean-helm/apidocs/ Things in the hapi.* package hierarchy are generated. The semantics of this API don't matter for this question.
As part of working with this API, I need to assemble a Chart.
With Protocol Buffers, when you are creating or mutating things, you work with "builders". These builders have a build() method that returns the (immutable) object that they are responsible for building. That's fine. Consequently, in this generated API a Chart.Builder builds Charts.
In this generated API, a Chart.Builder has "sub-" Chart.Builders that it contains. The list of such sub-builders is known as its dependencies and is represented via a smattering of generated methods. I'll point out two interesting ones:
addDependencies that takes a Chart.Builder
addDependenciesBuilder that takes no arguments and returns a new Chart.Builder for the element it appends
(I'll start by saying I have no idea what the second method is good for. It appears to stuff a new builder into an internal list and then...that's it.)
In the first method's case, if I hand it a Chart.Builder, this should add it. Just what I needed!
OK, but now I don't want to add a Chart.Builder if it's already present. How can I reliably check for the presence of a sub-builder in a top-level builder? More generally, in a Protocol Buffers Java API, how can I check for a builder inside a List of builders?
(I understand I can roll my own "equality" checker, but I'm wary that perhaps I'm off in the wrong patch of weeds altogether: maybe this isn't how you construct an object graph using Protocol Buffers-generated code?)
You would think that I could call the getDependenciesBuilderList() method, and see if it contains my candidate builder. But I've found this doesn't work: java.util.List#contains(Object) will return false for any builder passed in.
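(To illustrate why that happens: the generated builders do not override equals(), so List.contains() falls back to identity comparison, while built messages do implement value equality. That suggests one possible check, sketched below; the helper method is hypothetical and the Chart class is the hapi.chart one from the link above.)

import hapi.chart.ChartOuterClass.Chart;

// Hypothetical helper: compares built messages instead of builders, since
// Chart messages implement value-based equals() while Chart.Builder does not.
static boolean containsDependency(Chart.Builder parent, Chart.Builder candidate) {
    Chart built = candidate.build();
    return parent.getDependenciesList().contains(built); // list of built Chart messages
}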
Surely this very simple treat-a-list-as-a-set pattern has a solution when working with Java APIs generated from .proto files? If so, I'm failing to see it. How can I do what I want?
(For further reading, I've gone through https://developers.google.com/protocol-buffers/docs/javatutorial which treats the subject in a fairly cursory fashion.)

How to separate business logic in RestAssured

We have a REST web service. It operates over a JSON data representation. I would like to provide functional testing and plan to use the RestAssured framework. It provides understandable methods for testing the correctness of the output JSON.
Example: get("/method").then().assertThat().body("obj.field", equalTo(5));
But one problem arises: if the JSON structure changes, all the tests become invalid. For example, if field should be renamed to field2, we have to fix every test containing occurrences of field. The problem is very similar to the web page testing problem, where we have to check the presence of certain web elements, etc. That was solved by introducing the Page Object pattern. Does a similar solution exist for testing a REST API, or could you advise an elegant one?
In the example given in your question you validate the entire body of a response object, in which case it is probable you will create brittle tests.
However, it looks like REST-Assured already provides all the functionality you need to test specific parts of a JSON response:
JSON example
JSON Advanced Examples
Using JSON Path
You can even map objects and then do whatever you wish with the objects constructed, for example validation and manipulation.
See here for more examples.
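For instance, a minimal sketch of the object-mapping approach; the Widget class and its getField() accessor are hypothetical, and a JSON mapper such as Jackson needs to be on the classpath:

// Deserialize the JSON response into a POJO and assert on it with plain Java.
Widget widget = get("/method").as(Widget.class);
assertThat(widget.getField(), equalTo(5));
// Static imports assumed: RestAssured.get, MatcherAssert.assertThat, Matchers.equalTo.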
Just like with an HTML page, one way to write tests less exposed to changes is to use a strategy to locate the target you want to evaluate.
With a web page you would use an XPath query, a CSS selector or directly the id, to avoid dependencies on the ancestors.
So how do you do it with JSON?
Well, you could use a regular expression, but it can get really messy, or you could use an XPath-like query for JSON:
http://goessner.net/articles/JsonPath/
http://defiantjs.com/
So in your case, writing reliable tests is more about what you evaluate than about the framework you use to do it.
Changes in a REST API (especially a public one) are less frequent than in a GUI. When changes to the API are introduced, they should be marked with a new version and not break the old one (in most cases). So keep your tests as simple as possible, without introducing additional patterns; that has some benefits: you can easily throw them away and write new ones. Higher test framework complexity means higher maintenance costs. In any case, in REST-Assured you may create a ResponseSpecification and reuse it in assertions.
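For example, a minimal sketch of a reusable ResponseSpecification, based on the assertion from the question (the variable name is made up):

// Build the expectation once...
ResponseSpecification fieldIsFive = new ResponseSpecBuilder()
        .expectStatusCode(200)
        .expectBody("obj.field", equalTo(5))
        .build();

// ...and reuse it across tests.
get("/method").then().spec(fieldIsFive);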

Java annotation-based code injection without altering the annotated code

There is a Java application and I have no permission to alter the Java code apart from annotating classes or methods (with custom or existing annotations).
Using annotations and annotations only, I have to invoke code, which means that every time an instance of an annotated class is created or an annotated method is called, some extra Java code must be executed (e.g. a call to a REST web service). So my question is: how can I do this?
In order to pre-empt answers that I have already checked, I will give you some solutions that seem to work but are not satisfying enough.
Aspect Oriented Programming (e.g. AspectJ) can do this (execute code before and after the call of an annotated method; see the sketch after this list), but I don't really want the runtime overhead.
Use the solution provided here which actually uses reflection. This is exactly what I need only that it alters the initial code further than just annotating and so I cannot use it.
Use annotation processor for source code generation as suggested here by the last answer. However, still this means that I will alter the source code which I don't want.
What I would really like is a way to simply include a Java file that somehow executes some Java lines every time the annotated element is triggered.
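(For reference, this is roughly what the AspectJ option from the list above looks like: a minimal annotation-style aspect, where the com.example.Monitored annotation and the notification call are hypothetical.)

import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;

@Aspect
public class MonitoredCallAspect {

    // Runs before every execution of a method annotated with @Monitored.
    @Before("execution(@com.example.Monitored * *(..))")
    public void beforeAnnotatedMethod(JoinPoint jp) {
        // e.g. call out to a REST web service here
        System.out.println("Intercepted " + jp.getSignature());
    }
}

(A constructor pointcut along the lines of execution((@com.example.Monitored *).new(..)) would cover instantiation of annotated classes.)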
Why not skip annotations completely and use Byteman to inject code at runtime into the entry points of your code?
I have to agree with the comment above, though, that this sort of restriction is ridiculous and should be challenged.

MongoDB indexing has to be redefined every time?

I am really new to NoSQL and MongoDB and have a lot of questions on my hands.
After searching I found Morphia, an ODM framework for Java. In the Morphia documents we can see annotations like @Indexed that cause an index to be created for that specific field.
But the confusing issue for me is datastore.ensureIndexes(); the document says
If you are using the @Indexed annotation you should call datastore.ensureIndexes() after registering your entities, after application start.
So I can see the question forming in my mind after reading that sentence: do we have to redefine all indexes every time?
I expected we could define indexes once, somewhere like mongeez (mongeez is similar to Liquibase), to run once and for all.
Calling ensureIndexes() is virtually free if the indexes are already in place. You can (and arguably should) put this call directly after your mapping calls with no tangible impact. Any new indexes would get created but the existing indexes would essentially be no-ops.
For what it's worth, the morphia documentation can be found here.
What you are referring to is the documentation here, so possibly a little clarification of what that means is in order, along with the opinions offered in that document.
As the document says, per your quote, the index definitions you provide for your "Entity" classes are picked up by the .ensureIndexes() method on the datastore when it is called, in order to go and (re-)create all of those indexes.
The second point in the documentation defines this "as an example" along with the class mapping definitions like so:
Morphia m = ...
Datastore ds = ...
m.map(Product.class);
ds.ensureIndexes(); // creates all indexes defined with @Indexed
And indeed that will be invoked every time your application is started, and some consider this best practice to ensure that all the definitions are "up to date", as it were. But also note that this is only an opinion.
As you seem to be pointing out, it would probably be better practice to simply have some "post deploy" hook that can be called when you actually deploy your application, or indeed "as needed" when you determine that re-defining your indexes is actually required.
One technique that I generally agree with is to expose such a method as a "callable" API for your application, so that upon deployment you can script calls to that API and actually re-define all your indexes (or even a sub-set of them) as you decide to do so.
So the actual interpretation is that using Morphia does not mean your indexes are re-defined "automatically" every time the application is started; but if you put the call to the .ensureIndexes() method somewhere it will be executed on every application start, then that is what will happen.
Don't call this in the same place as the class mappings. Put it somewhere else where you can control it, and the problem is solved.
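As a minimal sketch of that "callable API" idea (the JAX-RS resource, its path and the way the Datastore is passed in are hypothetical; Datastore is Morphia's):

import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.Response;
import org.mongodb.morphia.Datastore;

@Path("/admin/indexes")
public class IndexAdminResource {

    private final Datastore datastore;

    public IndexAdminResource(Datastore datastore) {
        this.datastore = datastore;
    }

    // Re-applies all @Indexed / @Indexes definitions; indexes that already
    // exist are effectively no-ops on the MongoDB side.
    @POST
    public Response rebuildIndexes() {
        datastore.ensureIndexes();
        return Response.ok("indexes ensured").build();
    }
}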

Can I get Jersey to use natural JSON notation globally/as default?

I'm using Jersey to build a REST API for a service. I'd like to be able to accept and return both JSON and XML, and I have this mostly working, but I don't like the default "mapped" flavor of JSON that Jersey likes to spit out.
I know about the newer "natural" notation (from http://jersey.java.net/nonav/documentation/latest/json.html, which I'll quote at length because it makes obvious the problems with the default "mapped" notation):
After using mapped JSON notation for a while, it was apparent, that a need to configure all the various things manually could be a bit problematic. To avoid the manual work, a new, natural, JSON notation was introduced in Jersey version 1.0.2. With natural notation, Jersey will automatically figure out how individual items need to be processed, so that you do not need to do any kind of manual configuration. Java arrays and lists are mapped into JSON arrays, even for single-element cases. Java numbers and booleans are correctly mapped into JSON numbers and booleans, and you do not need to bother with XML attributes, as in JSON, they keep the original names.
I would like to use it everywhere, but I haven't been able to figure out how. I'm instantiating/configuring Jersey via Tomcat's XML config files (using what I believe is the normal dance with servlet/servlet-class/init-param tags), but I haven't been able to find documentation on whether or how it's possible to specify JSONConfiguration options from there.
I've also tried implementing my own ContextResolver which supplies a JSONJAXBContext I instantiate from Java code, where I can apply JSONConfiguration.natural() (an example of this looks like this answer). This works, but only for types I explicitly list in that code and pass to the JSONJAXBContext constructor. Not only is this extra code to write and maintain, and change if I add more data classes, but it doesn't work for things like List.
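For reference, a minimal sketch of that ContextResolver approach (Jersey 1.x API; the Cat and Kitten data classes are hypothetical placeholders for my actual types):

import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import javax.ws.rs.ext.ContextResolver;
import javax.ws.rs.ext.Provider;
import javax.xml.bind.JAXBContext;
import javax.xml.bind.JAXBException;
import com.sun.jersey.api.json.JSONConfiguration;
import com.sun.jersey.api.json.JSONJAXBContext;

@Provider
public class NaturalJsonContextResolver implements ContextResolver<JAXBContext> {

    private final Set<Class<?>> types;
    private final JAXBContext context;

    public NaturalJsonContextResolver() throws JAXBException {
        Class<?>[] classes = { Cat.class, Kitten.class }; // must be listed explicitly
        this.types = new HashSet<Class<?>>(Arrays.asList(classes));
        this.context = new JSONJAXBContext(JSONConfiguration.natural().build(), classes);
    }

    @Override
    public JAXBContext getContext(Class<?> objectType) {
        return types.contains(objectType) ? context : null;
    }
}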
Is there a way to tell Jersey to just use the natural notation instead of mapped notation, always and for all types?
I never did find an answer to the actual question I was asking here, but instead I found a simple three-step process that accomplishes the end result I wanted:
add Jackson to my project
configure Jersey to enable FEATURE_POJO_MAPPING
slap myself on the head a few times because it turned out to be so easy.
The Jersey documentation mentions this POJOMappingFeature/FEATURE_POJO_MAPPING prominently (it's the first example on the doc page I linked in the question), but doesn't describe exactly what it means, and from the way that document presents its information I thought this option (5.1, "POJO support") was at odds with option 5.2 ("JAXB based JSON support"), which sounded more like what I wanted. So I tried a lot of other things before I tried enabling FEATURE_POJO_MAPPING.
But once I tried it, it worked exactly as I wanted, and I haven't had to look back.
A side benefit of this is that Jackson generates much better error messages in the case that the client passes it bogus JSON content, compared to Jersey's native JSON handling implementation.
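For completeness: with web.xml deployment this is the init-param named com.sun.jersey.api.json.POJOMappingFeature set to true. A minimal programmatic sketch, assuming you construct the ResourceConfig yourself (the resource package name is made up), looks like this:

import com.sun.jersey.api.core.PackagesResourceConfig;
import com.sun.jersey.api.core.ResourceConfig;
import com.sun.jersey.api.json.JSONConfiguration;

// Enable Jackson-based POJO <-> JSON mapping for all resources.
ResourceConfig rc = new PackagesResourceConfig("com.example.api");
rc.getFeatures().put(JSONConfiguration.FEATURE_POJO_MAPPING, Boolean.TRUE);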
