Consuming Json in Play Framework - java

I am trying to run this code, but I get a null exception.
Java code:
public static void updateData(List<Users> users) {
    for (Users u : users) { // Error
        System.out.println(u.name); // Error
    }
}
ExtJS code:
proxy: {
    type: 'ajax',
    api: {
        update: '/Application/updateData'
    },
    reader: {
        type: 'json',
        root: 'users',
        successProperty: 'success'
    }
}
JSON array:
[{"name":"Ed","email":"a...#aa.com"},{"name":"Ez","email":"b...#bb.com"}]
So please tell me how to bind a JSON array to an entity list in Play Framework
1.2.2.
Thanks.

You've specified root: 'users' in your reader's config. This means the JSON should look like this:
{users: [{"name":"Ed","email":"a...#aa.com"},{"name":"Ez","email":"b...#bb.com"}]}

You have to use Gson:
List<Users> userList = new Gson().fromJson(yourString, new TypeToken<List<Users>>() {}.getType());
And have a Users class suitable for your JSON:
public class Users {
    private String name;
    private String email;
    ...
    // [Add your getter and setter]
    ...
}
For more information, you can read the Gson documentation.
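To make the binding concrete, here is a minimal self-contained sketch; the wrapper class UsersPayload and the placeholder e-mail values are mine, not from the question, and the bare-array variant uses Gson's TypeToken:

import java.util.List;
import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;

public class UsersJsonExample {

    // Wrapper matching the ExtJS reader's root: 'users'
    static class UsersPayload {
        List<Users> users;
    }

    static class Users {
        String name;
        String email;
    }

    public static void main(String[] args) {
        // Wrapped form, as produced when the reader/writer root is 'users'
        String wrapped = "{\"users\":[{\"name\":\"Ed\",\"email\":\"ed@example.com\"},"
                + "{\"name\":\"Ez\",\"email\":\"ez@example.com\"}]}";
        UsersPayload payload = new Gson().fromJson(wrapped, UsersPayload.class);
        System.out.println(payload.users.get(0).name); // Ed

        // Bare array form, bound to a List via a TypeToken
        String bareArray = "[{\"name\":\"Ed\",\"email\":\"ed@example.com\"}]";
        List<Users> users = new Gson().fromJson(bareArray,
                new TypeToken<List<Users>>() {}.getType());
        System.out.println(users.size()); // 1
    }
}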

Nested json type won't serialize in jersey

I'm running a Dropwizard/Jersey Java RESTful web app.
I have an endpoint that is defined like this in api.yaml:
swagger: '2.0'
info:
  version: 0.0.1
basePath: /
schemes:
  - https
  - http
consumes:
  - application/json
  - application/x-protobuf
produces:
  - application/json
  - application/x-protobuf
paths:
  /v1/event:
    post:
      summary: receive a event
      operationId: receiveEvent ## this value names the generated Java method
      parameters:
        - name: event
          in: body
          schema:
            $ref: "#/definitions/Event"
      responses:
        200:
          description: success
          schema:
            type: object
            $ref: '#/definitions/EventResponse'
definitions:
  Stream:
    properties:
      vendor:
        type: "string"
  Event:
    properties:
      eventCity:
        type: "string"
      streams:
        type: "array"
        items:
          $ref: "#/definitions/Stream"
  EventResponse:
    required:
      - statusCode
    properties:
      statusCode:
        type: "integer"
The endpoint is defined like so:
@POST
@Consumes({ "application/json", "application/x-protobuf" })
@Produces({ "application/json", "application/x-protobuf" })
@Path("/event")
void receiveEvent(
    @Suspended AsyncResponse response,
    @Valid Event.EventModel event
);
When issuing a JSON POST request, I cannot get the streams field to be deserialized properly.
This is the payload:
{
  "eventCity": "San Diego",
  "streams": [
    {
      "vendor": "CBS"
    }
  ]
}
I test like this with curl
curl -X POST -H "Content-Type: application/json" -d '{"eventCity": "San Diego", "streams": [{"vendor": "CBS"}]}' https://localhost:8990/v1/event
In the server request handler:
@Override
public void receiveEvent(AsyncResponse response, Event.EventModel event) {
    System.out.println(event.getEventCity());
    System.out.println(event.getStreamCount()); // <-- this returns 0? why is the inner 'streams' list not getting deserialized? it should have one element
}
And the output:
San Diego
0
Another observation: when I issue the same POST, but with a protobuf payload, it works. The streams list is populated.
The protobuf was generated like so
// create proto
Stream.StreamModel stream = Stream.StreamModel.newBuilder()
        .setVendor("CBS")
        .build();
Event.EventModel event = Event.EventModel.newBuilder()
        .setEventCity("San Diego")
        .addStream(stream)
        .build();

// write to file
FileOutputStream fos = new FileOutputStream("/Users/jameswierzba/temp/proto.out");
event.writeTo(fos);
The output in the endpoint is as expected:
San Diego
1
This is the full generated code for the Event class: https://gist.github.com/wierzba3/84face6c21c4fb6ce554f90707ba6ef9
This is the full generated code for the Stream class: https://gist.github.com/wierzba3/32664312df87c64049b281daab928f94
You can't use the same class as the payload for both protobuf and JSON.
If you inspect the Event class generated for protobuf processing, you will notice that the stream is stored as stream_ and has a setStream that returns a Builder, as shown below. A JSON deserialiser can't work with it:
/**
 * <code>repeated .com.apple.amp.social.linearmasterplaylist.refresh.model.StreamModel stream = 2;</code>
 */
public Builder setStream(
        int index, com.apple.amp.social.linearmasterplaylist.refresh.model.Stream.StreamModel value) {
    if (streamBuilder_ == null) {
        if (value == null) {
            throw new NullPointerException();
        }
        ensureStreamIsMutable();
        stream_.set(index, value);
        onChanged();
    } else {
        streamBuilder_.setMessage(index, value);
    }
    return this;
}
A JSON deserialiser (probably jackson-databind) needs a definition of Event like the one generated with openapi-generator-cli. In this case the streams field and its setter look like this (note that the @JsonProperty("streams") annotation is redundant, as the property is already named streams):
#JsonProperty("streams")
#Valid
private List<Stream> streams = null;
and
public void setStreams(List<Stream> streams) {
    this.streams = streams;
}
I have included the other model definitions here to allow you to try them with your controller to show that JSON is correctly consumed.
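For illustration, here is a minimal sketch of what such Jackson-friendly POJOs look like end to end; the Event and Stream classes below are plain hand-written stand-ins (not the protobuf-generated ones), and jackson-databind is assumed to be on the classpath:

import java.util.List;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JacksonEventExample {

    // Plain POJO stand-in for the Stream definition
    public static class Stream {
        private String vendor;
        public String getVendor() { return vendor; }
        public void setVendor(String vendor) { this.vendor = vendor; }
    }

    // Plain POJO stand-in for the Event definition, with a 'streams' list
    public static class Event {
        private String eventCity;
        private List<Stream> streams;
        public String getEventCity() { return eventCity; }
        public void setEventCity(String eventCity) { this.eventCity = eventCity; }
        public List<Stream> getStreams() { return streams; }
        public void setStreams(List<Stream> streams) { this.streams = streams; }
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"eventCity\": \"San Diego\", \"streams\": [{\"vendor\": \"CBS\"}]}";
        Event event = new ObjectMapper().readValue(json, Event.class);
        System.out.println(event.getEventCity());      // San Diego
        System.out.println(event.getStreams().size()); // 1
    }
}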
It is possible to write code that inspects the incoming media type and forks the processing, e.g. the following, generated by openapi-generator. I would recommend a separate controller method for each media type and some sort of mapping so that a single service-layer call will suffice.
@RequestMapping(
    method = RequestMethod.POST,
    value = "/v1/event",
    produces = { "application/json", "application/x-protobuf" },
    consumes = { "application/json", "application/x-protobuf" }
)
default ResponseEntity<EventResponse> receiveEvent(
    @Parameter(name = "event", description = "") @Valid @RequestBody(required = false) Event event
) {
    getRequest().ifPresent(request -> {
        for (MediaType mediaType : MediaType.parseMediaTypes(request.getHeader("Accept"))) {
            if (mediaType.isCompatibleWith(MediaType.valueOf("application/json"))) {
                String exampleString = "{ \"statusCode\" : 0 }";
                ApiUtil.setExampleResponse(request, "application/json", exampleString);
                break;
            }
            if (mediaType.isCompatibleWith(MediaType.valueOf("application/x-protobuf"))) {
                String exampleString = "Custom MIME type example not yet supported: application/x-protobuf";
                ApiUtil.setExampleResponse(request, "application/x-protobuf", exampleString);
                break;
            }
        }
    });
    return new ResponseEntity<>(HttpStatus.NOT_IMPLEMENTED);
}
On a separate note, though I doubt it will help: it looks like the protobuf uses stream (singular, e.g. getStreamList()) while the swagger definition uses streams (plural).
So, the issue turned out to be a bug in our company's internal implementation of the code generation module.
The workflow looks like this: api.yaml specified -> .proto file generated -> .java code generated.
To be explicit, using the schema in question (note I changed the list to have the name myStreamArrayInput to illustrate the issue at hand):
Event:
  properties:
    eventCity:
      type: "string"
    myStreamArrayInput:
      type: "array"
      items:
        $ref: "#/definitions/Stream"
This would generate a proto like this:
message Event {
  optional string eventCity = 1;
  /*
    this name should be `myStreamArrayInput` or something
    close to it. but it is using the name of the TYPE
    instead of the name specified by the dev in api.yaml
  */
  repeated StreamModel stream;
}
And then this would generate a Java class like this:
public final class Event {
    public static final class EventModel extends
            com.google.protobuf.GeneratedMessageV3 implements EventModelOrBuilder {
        private EventModel() {
            eventCity_ = "";
            stream_ = "";
        }
    }
}
YMMV, since this is a bug in our own internal code generation engine.
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.ObjectMapper;
import javax.ws.rs.*;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import java.util.ArrayList;
import java.util.List;

@Path("/myresource")
public class MyResource {

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public Response getJson() {
        ObjectMapper mapper = new ObjectMapper();
        List<MyObject> myObjects = new ArrayList<>();
        MyObject myObject = new MyObject("foo", new NestedObject("bar"));
        myObjects.add(myObject);
        String json;
        try {
            json = mapper.writeValueAsString(myObjects);
        } catch (Exception e) {
            return Response.status(Response.Status.INTERNAL_SERVER_ERROR).build();
        }
        return Response.ok(json, MediaType.APPLICATION_JSON).build();
    }

    private static class MyObject {
        private String name;
        private NestedObject nestedObject;

        public MyObject(String name, NestedObject nestedObject) {
            this.name = name;
            this.nestedObject = nestedObject;
        }

        @JsonProperty("name")
        public String getName() {
            return name;
        }

        @JsonProperty("nested_object")
        public NestedObject getNestedObject() {
            return nestedObject;
        }
    }

    private static class NestedObject {
        private String value;

        public NestedObject(String value) {
            this.value = value;
        }

        @JsonProperty("value")
        public String getValue() {
            return value;
        }
    }
}
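For what it's worth, serializing that list with the @JsonProperty annotations above should produce nested JSON along the lines of [{"name":"foo","nested_object":{"value":"bar"}}], i.e. the inner object is emitted as a real JSON object in the response body.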

Write custom document to Cosmos DB with Java API

I have a Cosmos DB and want to write different kinds of documents to it. The structure of the documents is dynamic and can change.
I tried the following. Let's say I have the following class:
class CosmosDbItem implements Serializable {
    private final String _id;
    private final String _payload;

    public CosmosDbItem(String id, String payload) {
        _id = id;
        _payload = payload;
    }

    public String getId() {
        return _id;
    }

    public String getPayload() {
        return _payload;
    }
}
I can then create the document with some JSON as follows:
CosmosContainer _cosmosContainer = ...
CosmosDbItem dataToWrite = new CosmosDbItem("what-ever-id-18357", "{\"name\":\"Jane Doe\", \"age\":42}");
_cosmosContainer.createItem(dataToWrite, partitionKey, cosmosItemRequestOptions);
This results in a document like this:
{
  "id": "what-ever-id-18357",
  "payload": "{\"name\":\"Jane Doe\", \"age\":42}",
  "_rid": "aaaaaaDaaAAAAAAAAAA==",
  "_self": "dbs/aaaaAA==/colls/aaaaAaaaDI=/docs/aaaaapaaaaaAAAAAAAAAA==/",
  "_etag": "\"6e00c443-0000-0700-0000-5f8499a70000\"",
  "_attachments": "attachments/",
  "_ts": 1602525607
}
Is there a way to generate the payload as a real JSON object in that document, like below? What do I need to change in my CosmosDbItem class?
{
  "id": "what-ever-id-18357",
  "payload": {
    "name": "Jane Doe",
    "age": 42
  },
  "_rid": "aaaaaaDaaAAAAAAAAAA==",
  "_self": "dbs/aaaaAA==/colls/aaaaAaaaDI=/docs/aaaaapaaaaaAAAAAAAAAA==/",
  "_etag": "\"6e00c443-0000-0700-0000-5f8499a70000\"",
  "_attachments": "attachments/",
  "_ts": 1602525607
}
Here is the solution I ended up with. Actually it is pretty simple once I figured it out. Instead of using CosmosDbItem, I use a simple HashMap<String, Object>.
public void writeData() {
    ...
    Map<String, Object> stringObjectMap = buildDocumentMap("the-id-", "{\"key\":\"value\"}");
    _cosmosContainer.createItem(stringObjectMap, partitionKey, cosmosItemRequestOptions);
    ...
}

public Map<String, Object> buildDocumentMap(String id, String jsonToUse) {
    JSONObject jsonObject = new JSONObject(jsonToUse);
    jsonObject.put("id", id);
    return jsonObject.toMap();
}
This can produce the following document:
{
  "key": "value",
  "id": "the-id-",
  "_rid": "eaaaaaaaaaaaAAAAAAAAAA==",
  "_self": "dbs/eaaaAA==/colls/eaaaaaaaaaM=/docs/eaaaaaaaaaaaaaAAAAAAAA==/",
  "_etag": "\"3b0063ea-0000-0700-0000-5f804b3d0000\"",
  "_attachments": "attachments/",
  "_ts": 1602243389
}
One remark: it is important to set the id key in the HashMap. Otherwise one will get the error
"The input content is invalid because the required properties - 'id; ' - are missing"

Put mapping in ElasticSearch by JAVA API

I want to put a mapping for a field via the Java API but failed. Following is the detailed information.
My data structure is:
{
  "mje-test-execution-id": "464b66ea6c914ddda217659c84a3cb9d",
  "jvm-free-memory": 315245608,
  "jvm-total-memory": 361758720,
  "system-free-memory": 0,
  "jvm-max-memory": 7600078848,
  "system-total-memory": 34199306240,
  "memory-time-stamp": "2020-03-12T05:12:16.835Z",
  "mje-host-name": "CN-00015345",
  "mje-test-suite-name": "SCF Test no mje version",
  "mje-version": "1.8.7771-SNAPSHOT",
  "mje-test-artifact-id": "msran-regression-tests",
  "mje-test-version": "1.8.7771-SNAPSHOT",
  "stp-id": "vran-stp",
  "mje-test-location": {
    "lat": 58.41,
    "lon": 15.62
  }
}
What I want to do is set the "mje-test-location" type to "geo_point".
My code snippet:
public void postMapping(String indexName, String field, String type) throws IOException {
    GetIndexRequest request = new GetIndexRequest(indexName);
    boolean exists = client.indices().exists(request, RequestOptions.DEFAULT);
    if (!exists) {
        LOGGER.info("index {} does not exist. Now to post mapping.", indexName);
        PutMappingRequest putMappingRequest = new PutMappingRequest(indexName);
        XContentBuilder builder = XContentFactory.jsonBuilder();
        builder.startObject();
        {
            builder.startObject("properties");
            {
                builder.startObject(field);
                {
                    builder.field("type", type);
                }
                builder.endObject();
            }
            builder.endObject();
        }
        builder.endObject();
        putMappingRequest.source(builder);
        //
        AcknowledgedResponse putMappingResponse = client.indices().putMapping(putMappingRequest,
                RequestOptions.DEFAULT);
        boolean acknowledged = putMappingResponse.isAcknowledged();
        if (acknowledged) {
            LOGGER.info("Succeed to put mapping: field:{}, type: {}", field, type);
        }
    }
    LOGGER.info("Fail to put mapping due to index {} already exist, ", indexName);
}
Error info:
15:59:54.397 [main] DEBUG org.elasticsearch.client.RestClient - request [PUT http://seliiuapp00269.lmera.ericsson.se:9208/mje-scf-v2-20200313-post/_mapping?master_timeout=30s&timeout=30s] returned [HTTP/1.1 400 Bad Request]
org.elasticsearch.ElasticsearchStatusException: Elasticsearch exception [type=action_request_validation_exception, reason=Validation Failed: 1: mapping type is missing;]
Elasticsearch Java API version: <elasticsearch.rest.high.level.client>7.0.0</elasticsearch.rest.high.level.client>
You need to specify the document type like this:
putMappingRequest.type("_doc");
And you also need to specify the types of the fields here:
builder.field("type", type);
e.g. ("type", "text"), ("type", "long"), ("type", "date"), and so on.
You can see the datatypes here.
I'll post @Mincong's comment as an answer, because I was using the wrong import that I had found from GitHub searches:
What is the full name of your class "PutMappingRequest": org.elasticsearch.action.admin.indices.mapping.put.PutMappingRequest or org.elasticsearch.client.indices.PutMappingRequest? You should use the second one.
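For reference, a rough sketch of the whole call using the client-side request class, assuming the 7.x high-level REST client (the class and method names below are illustrative, not from the question):

import org.elasticsearch.client.RequestOptions;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.client.indices.PutMappingRequest; // note: client.indices, not action.admin
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;

public class GeoPointMappingExample {

    // Puts a geo_point mapping for the given field on an existing index.
    public static boolean putGeoPointMapping(RestHighLevelClient client, String indexName, String field)
            throws Exception {
        XContentBuilder builder = XContentFactory.jsonBuilder()
                .startObject()
                    .startObject("properties")
                        .startObject(field)
                            .field("type", "geo_point")
                        .endObject()
                    .endObject()
                .endObject();

        PutMappingRequest request = new PutMappingRequest(indexName).source(builder);
        return client.indices()
                .putMapping(request, RequestOptions.DEFAULT)
                .isAcknowledged();
    }
}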

Response of REST API with the logs generated during execution of the Request

I am writing some REST APIs with swagger-ui. During the execution of these APIs I perform some operations, which I need to send back as the response of the API. Consider the following response as an example:
{
  "Database": [
    "Table 1 created",
    "data 1 inserted",
    "Data 3 insertion failed"
  ],
  "Kafka": [
    "Topic 1 created",
    "Topic 3 deleted",
    "Topic 4 rebalanced"
  ]
}
So is there any framework for this, or do I need to manually create the JSON object and send it as the response?
I suppose you are using Spring MVC?
First: create a class for the API response.
public class Data {
    private List<String> database = new ArrayList<>();
    private List<String> kafka = new ArrayList<>();

    public List<String> getDatabase() {
        return database;
    }

    public void setDatabase(List<String> database) {
        this.database = database;
    }

    public List<String> getKafka() {
        return kafka;
    }

    public void setKafka(List<String> kafka) {
        this.kafka = kafka;
    }
}
Second: use the @ResponseBody annotation on the controller's method. This will make Spring understand that the method's return value should be bound to the web response body.
@ResponseBody
public Data apiMethod() {
    return new Data();
}

Pass json object to a play-framework action

I am currently using Play v1.2.3. I have an endpoint to which I want to send a JSON object which will be deserialized into a Java object. So, I have something that looks like this:
public class UserController extends Controller {
    public static class FullName {
        public String first;
        public String last;
    }

    public static void putName(FullName name) { ... }
}
##### routes
PUT /user/name UserController.putName
With that in place, I would hope to call the endpoint with the following JavaScript:
$.ajax({
    type: "PUT",
    data: { first: "Michael", last: "Bailey" },
    url: "/user/name"
});
Unfortunately, with the above setup, it seems that Play does not bind the entire data object, but instead attempts to populate two parameters (first and last). Is there a way to define the endpoint to consume the complete body directly, or does it have to be done by hand?
To cast the entire input body into a Model class:
public static void create(JsonObject body) {
    CaseFolder caseFolder = new Gson().fromJson(body, CaseFolder.class);
    caseFolder.user = getConnectedUser();

    if (caseFolder.validateAndSave()) {
        renderJSON(
            new JSONSerializer()
                .exclude("*.class")
                .exclude("user")
                .serialize(caseFolder));
    } else
        error();
}
Also, the above code takes the resulting Model object and serializes it back out to JSON as the response body.
If you want to just access certain fields within the JSON request, you can use:
public static void update(JsonObject body) {
    try {
        Long id = (long) body.get("id").getAsInt();
        CaseFolder cf = CaseFolder.loadAndVerifyOwner(getConnectedUser(), id);
        cf.number = body.get("number").getAsString();
        cf.description = body.get("description").getAsString();
        if (cf.validateAndSave())
            ok();
        else
            error();
    }
    catch (NullIdException e) { error(); }
    catch (NotFoundException e) { notFound(); }
    catch (NotOwnerException e) { forbidden(); }
    catch (Exception e) { e.printStackTrace(); error(); }
}
Play's action method parameter binding mechanism does not accept JSON. You need to bind it manually. In your example, the code could be something like:
public static void putName(String data) {
    FullName fname = new Gson().fromJson(data, FullName.class);
    ...
}
Note: Gson is provided with the play! framework distribution, so you are free to use it.
With your settings, Play expects params with the names "name.first" and "name.last", but you are sending "first" and "last". Try this ajax POST:
$.ajax({
    type: "PUT",
    data: {
        name: {
            first: "Michael",
            last: "Bailey"
        }
    },
    url: "/user/name"
});
