I am new to Java and to REST APIs. I would like to know how to receive dynamic data from a client through a REST API in Java and process it.
For example,
Sometimes the client will send data like below:
{
  "User" : "XXXX",
  "Role" : "ZZZZ",
  "Product" :
  {
    "Name" : "yyyy",
    "Valid to" : "04/4/2025",
    "Licensed version" : "jjjjj"
  }
}
In the next contract, the client may send something like below:
{
  "User" : "XXXX",
  "Role" : "ZZZZ",
  "Product" :
  {
    "Name" : "yyyy",
    "Expiry Date" : "04/4/2025",
    "Activation Date" : "jjjjj"
  }
}
Comparing the two examples, the "Product" section contains different data, and the client may also send additional data in this "Product" section. Would it be possible to make my REST API receive this type of dynamic data?
If possible, please let me know how my REST API can receive and process this type of dynamic data.
Thanks
I do the same in PHP with the Slim framework and Doctrine 2. In Doctrine 2 I defined the entities with all the possible fields. In my JavaScript frontend, I collect the dynamic data and then deserialize it into the entity classes. The classes can then be saved to the database, or you can do whatever else you need to do to process the data.
http://www.restapitutorial.com/ is a very good tutorial on how to design your REST API; I suggest reading it.
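In Java you can achieve something similar with Jackson. Below is a minimal sketch, assuming a JAX-RS resource with the Jackson provider; the class, path, and field names are only illustrative. The stable fields are bound to a POJO, and whatever keys the client puts inside "Product" are collected into a Map via @JsonAnySetter, so fields added in future contracts still deserialize without code changes.

import java.util.LinkedHashMap;
import java.util.Map;

import javax.ws.rs.Consumes;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

import com.fasterxml.jackson.annotation.JsonAnyGetter;
import com.fasterxml.jackson.annotation.JsonAnySetter;
import com.fasterxml.jackson.annotation.JsonProperty;

public class ContractRequest {

    @JsonProperty("User")    public String user;
    @JsonProperty("Role")    public String role;
    @JsonProperty("Product") public Product product;

    public static class Product {
        // Every key inside "Product" ends up in this map, whatever the client decides to send.
        private final Map<String, Object> fields = new LinkedHashMap<>();

        @JsonAnySetter
        public void set(String name, Object value) {
            fields.put(name, value);
        }

        @JsonAnyGetter
        public Map<String, Object> getFields() {
            return fields;
        }
    }
}

@Path("/contracts")
class ContractResource {

    @POST
    @Consumes(MediaType.APPLICATION_JSON)
    public Response create(ContractRequest request) {
        // e.g. request.product.getFields().get("Valid to") or get("Expiry Date"),
        // depending on what this particular client sent
        return Response.ok().build();
    }
}

An even simpler option is to declare the "Product" field as a plain Map<String, Object> or a Jackson JsonNode and inspect whatever arrives at runtime.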
Related
I am using a RESTful API to retrieve some data over HTTP using Retrofit 2.
I have the following type of data, and I would like to store replies as an object.
The problem is that replies is an empty string when no values (or fields) are present, i.e.:
replies = ""
When there are fields, the replies object is given as:
replies = {
"kind" : "Listing",
"data" : {
"key" : "data",
"value" : {
"modhash" : "",
.
.
.
}
}
}
What is bothering me is the inconsistency in the data type of replies: it is given as an object when non-empty and as a String ("") when empty.
My dilemma is this: since Java is a statically typed language, I need to define what replies is beforehand, but I cannot define it as either Replies or String because of this inconsistency. How can I resolve this issue?
You need to change the answer from your API: when replies is empty, it should send replies : {}
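If the API cannot be changed, a common client-side workaround is a custom Gson deserializer that treats the empty string as "no replies". Below is a minimal sketch, assuming Retrofit is configured with the Gson converter and that the Replies class from the question already exists; register the deserializer on the Gson instance you pass to GsonConverterFactory.create(gson).

import java.lang.reflect.Type;

import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.JsonDeserializationContext;
import com.google.gson.JsonDeserializer;
import com.google.gson.JsonElement;
import com.google.gson.JsonParseException;

class RepliesDeserializer implements JsonDeserializer<Replies> {

    @Override
    public Replies deserialize(JsonElement json, Type typeOfT, JsonDeserializationContext context)
            throws JsonParseException {
        // The API sends "" instead of an object when there are no replies.
        if (json.isJsonPrimitive()) {
            return null;
        }
        // Use a plain Gson instance here to avoid recursing into this same deserializer.
        return new Gson().fromJson(json, Replies.class);
    }
}

class GsonFactory {
    // Hand the result to Retrofit via GsonConverterFactory.create(gson()).
    static Gson gson() {
        return new GsonBuilder()
                .registerTypeAdapter(Replies.class, new RepliesDeserializer())
                .create();
    }
}

Whether null is the right representation for "no replies" is up to the surrounding code; returning an empty Replies object works just as well.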
I have an Apache Beam streaming job which reads data from Kafka and writes to Elasticsearch using ElasticsearchIO.
The issue I'm having is that messages in Kafka already have a key field, and using ElasticsearchIO.Write.withIdFn() I'm mapping this field to the document _id field in Elasticsearch.
Since we have a big volume of data, I don't want the key field to also be written to Elasticsearch as part of _source.
Is there an option/workaround that would allow doing that?
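For reference, the write is configured roughly like this (a sketch only; the host, index, and type names are placeholders, and the upstream Kafka read is not shown):

import org.apache.beam.sdk.io.elasticsearch.ElasticsearchIO;
import org.apache.beam.sdk.values.PCollection;

PCollection<String> jsonDocs = messagesFromKafka; // JSON strings coming out of the KafkaIO step

jsonDocs.apply(
    "WriteToES",
    ElasticsearchIO.write()
        .withConnectionConfiguration(
            ElasticsearchIO.ConnectionConfiguration.create(
                new String[] {"http://localhost:9200"}, "my-index", "my-type"))
        // The key field becomes the document _id, but it still stays inside _source.
        .withIdFn(node -> node.get("key").asText()));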
Using the Ingest API and the remove processor, you'll be able to solve this quite easily using only your Elasticsearch cluster. You can also simulate the ingest pipeline and inspect the results.
I've prepared an example which will probably cover your case:
POST _ingest/pipeline/_simulate
{
"pipeline": {
"description": "remove id form incoming docs",
"processors": [
{"remove": {
"field": "id",
"ignore_failure": true
}}
]
},
"docs": [
{"_source":{"id":"123546", "other_field":"other value"}}
]
}
You see, there is one test document containing a field "id". This field is not present in the response/result anymore:
{
"docs" : [
{
"doc" : {
"_index" : "_index",
"_type" : "_type",
"_id" : "_id",
"_source" : {
"other_field" : "other value"
},
"_ingest" : {
"timestamp" : "2018-12-03T16:33:33.885909Z"
}
}
}
]
}
I've created a ticket in Apache Beam JIRA describing this issue.
For now, the original issue cannot be resolved as part of the indexing process using the Apache Beam API.
The workaround that Etienne Chauchot, one of the maintainers, proposed is to have a separate task which clears the indexed data afterwards.
See Remove a field from an Elasticsearch document for an example.
If someone would like to leverage such a feature in the future, you might want to follow the linked ticket.
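Such a cleanup task can be as small as one _update_by_query call that strips the field again. A minimal sketch using the Elasticsearch low-level REST client (the index name "my-index" and the field name "key" are assumptions):

import org.apache.http.HttpHost;
import org.elasticsearch.client.Request;
import org.elasticsearch.client.RestClient;

public class RemoveKeyField {
    public static void main(String[] args) throws Exception {
        try (RestClient client = RestClient.builder(new HttpHost("localhost", 9200, "http")).build()) {
            // Drop "key" from _source of every document that still carries it.
            Request request = new Request("POST", "/my-index/_update_by_query");
            request.setJsonEntity(
                "{"
                + "  \"query\": { \"exists\": { \"field\": \"key\" } },"
                + "  \"script\": { \"source\": \"ctx._source.remove('key')\" }"
                + "}");
            client.performRequest(request);
        }
    }
}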
Currently we are using a schema file that contains oneOf with 2 schemas: one for PATCH requests and one for POST requests. In Java code we check whether id is available in the request, and then we check whether there is any error message for the corresponding schema in the oneOf section.
Something like this:
processingReport.iterator().forEachRemaining(processingMessage -> {
JsonNode json = processingMessage.asJson();
JSONObject reports = new JSONObject(json.get("reports").toString());
logger.debug("Schema validation: {}", reports.toString());
//Seems always has 2 reports.
String reportIdentifier = isCreate ? "/properties/data/oneOf/0" : "/properties/data/oneOf/1";
JSONArray errorsArray = new JSONArray(reports.get(reportIdentifier).toString());
//Do something with the error here
});
But this doesn't seem right to me. Is there any way to manage this in the schema itself, so that when id is available it picks the right schema from the oneOf, or is there perhaps a better way to do it?
I know one option would be to have different JSON files, but our technical managers would rather keep them in one place.
oneOf and anyOf clauses can be used to model conditional constraints. The following schema would validate against either the POST or the PATCH schema depending on the existence of the id property:
{
"oneOf" : [{
"$ref" : "/post_request_schema#"
}, {
"allOf" : [{
"$ref" : "/patch_request_schema#"
}, {
"required" : ["id"]
}
]
}
]
}
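With a combined schema like this, the Java side no longer has to pick a report entry by oneOf index; a single validate() call is enough. A minimal sketch, assuming the same json-schema-validator library that produces the ProcessingReport above and a hypothetical classpath resource name for the combined schema:

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.github.fge.jackson.JsonLoader;
import com.github.fge.jsonschema.core.report.ProcessingReport;
import com.github.fge.jsonschema.main.JsonSchema;
import com.github.fge.jsonschema.main.JsonSchemaFactory;

public class RequestValidator {

    private final JsonSchema schema;
    private final ObjectMapper mapper = new ObjectMapper();

    public RequestValidator() throws Exception {
        // "combined_request_schema.json" is a hypothetical name for the schema shown above.
        JsonNode schemaNode = JsonLoader.fromResource("/combined_request_schema.json");
        schema = JsonSchemaFactory.byDefault().getJsonSchema(schemaNode);
    }

    public boolean isValid(String requestBody) throws Exception {
        JsonNode request = mapper.readTree(requestBody);
        // The oneOf picks the POST or the PATCH branch based on the presence of "id",
        // so the caller only needs the overall result (and the messages, for logging).
        ProcessingReport report = schema.validate(request);
        report.forEach(message -> System.out.println(message));
        return report.isSuccess();
    }
}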
I want to create a JSONObject with some values to call a web service, where the web service expects the fields in an order like:
{
  "id" : 1,
  "email" : "test#test.com",
  "pin" : 1234,
  "age" : 25,
  "firstName" : "Test First Name",
  "lastName" : "Test Last Name",
  "location" : "India",
  "phone" : "1234567890"
}
but when I create a JSON object from Android code, it does not maintain that order:
requestJOB=new JSONObject();
requestJOB.put("userid",Pref.getValue(this, Const.USER_ID, requestJOB.optString("userid")));
requestJOB.put("email", Pref.getValue(this, Const.PREF_EMAIL, requestJOB.optString("email")));
requestJOB.put("pin", Pref.getValue(this, Const.PREF_PIN, requestJOB.optString("pin")));
requestJOB.put("age", Pref.getValue(this, Const.PREF_AGE, requestJOB.optString("age")));
requestJOB.put("firstname", etFirstName.getText().toString().trim());
requestJOB.put("lastname", etLastName.getText().toString().trim());
requestJOB.put("phone", etPhone.getText().toString().trim());
requestJOB.put("location", etLocation.getText().toString().trim());
I wrote the code in my desired order, but JSONObject changes the order at run time. I also tried with a Map and a LinkedList, but an exception occurred when I tried to convert the List to a JSONObject.
I searched on Stack Overflow but found no satisfactory answer.
In this situation I don't understand exactly what I have to do.
On the Android platform, a better way to serialize an object to JSON is the Google Gson API, which provides all the functionality needed to convert a class to its corresponding JSON. You can prepare nested JSON objects: a JSON object within a JSON object, a JSON array embedded within a JSON object, multiple JSON arrays within the same JSON object, and other variations. Just explore this library; it's very easy to use.
I used this library in my own Android project.
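A minimal sketch of that idea (the field names and values are illustrative; in the activity they would come from Pref.getValue(...) and the EditTexts as in the question). In practice Gson writes a POJO's fields in declaration order, so the produced JSON keeps a stable field order, unlike org.json.JSONObject:

import com.google.gson.Gson;

public class SignUpRequest {

    // Gson serializes these fields in the order they are declared.
    int id;
    String email;
    int pin;
    int age;
    String firstName;
    String lastName;
    String location;
    String phone;

    public static void main(String[] args) {
        SignUpRequest request = new SignUpRequest();
        request.id = 1;
        request.email = "test@test.com";
        request.pin = 1234;
        request.age = 25;
        request.firstName = "Test First Name";
        request.lastName = "Test Last Name";
        request.location = "India";
        request.phone = "1234567890";

        String json = new Gson().toJson(request);
        System.out.println(json);
        // {"id":1,"email":"test@test.com","pin":1234,"age":25,"firstName":"Test First Name",...}
    }
}

That said, a JSON object is by specification an unordered collection of members, so a well-behaved web service should not depend on key order in the first place.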
I have the following document in my collection:
{
"_id":NumberLong(106379),
"_class":"x.y.z.SomeObject",
"name":"Some Name",
"information":{
"hotelId":NumberLong(106379),
"names":[
{
"localeStr":"en_US",
"name":"some Other Name"
}
],
"address":{
"address1":"5405 Google Avenue",
"city":"Mountain View",
"cityIdInCitiesCodes":"123456",
"stateId":"CA",
"countryId":"US",
"zipCode":"12345"
},
"descriptions":[
{
"localeStr":"en_US",
"description": "Some Description"
}
]
},
"providers":[
],
"some other set":{
"a":"bla bla bla",
"b":"bla,bla bla",
}
"another Property":"fdfdfdfdfdf"
}
I need to run through all documents in the collection, and if "providers": [] is empty I need to create a new set based on values from the information section.
I'm far from being a MongoDB expert, so I have a few questions:
Can I do it as an atomic operation?
Can I do this using the MongoDB console? As far as I understand, I can do it using the $addToSet and $each commands?
If not, is there any Java-based driver that can provide such functionality?
Can I do it as an atomic operation?
Every document will be updated in an atomic fashion. There is no "atomic" in MongoDB in the sense of an RDBMS, where all operations succeed or fail together, but you can prevent other writes from interleaving using the $isolated operator.
Can I do this using the MongoDB console?
Sure you can. To find all documents with an empty providers array, you can issue a command like:
db.zz.find({providers : { $size : 0}})
To update all documents where the array is of zero length by adding a fixed string, you can issue a query such as:
db.zz.update({providers : { $size : 0}}, {$addToSet : {providers : "zz"}})
If you want to add a portion to your document based on the document's data, you can use the notorious $where query (do mind the warnings appearing in that link), or - as you mentioned - query for an empty providers array and use cursor.forEach().
If not, is there any Java-based driver that can provide such functionality?
Sure, there is a Java driver, as there is for every other major programming language. It can do practically everything described, basically everything you can do from the shell. I suggest you get started with the Java Language Center.
There are also several frameworks which facilitate working with MongoDB and bridge the object-document gap. I will not give a list here as I'm pretty biased, but I'm sure a quick Google search will do.
db.so.find({ providers: { $size: 0} }).forEach(function(doc) {
doc.providers.push( doc.information.hotelId );
db.so.save(doc);
});
This will push the information.hotelId of the corresponding document into an empty providers array. Replace that with whatever field you would rather insert into the providers array.
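For completeness, the same operation with the MongoDB Java driver looks roughly like this (a sketch; the connection string, database, and collection names are assumptions, and it mirrors the shell forEach above):

import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import org.bson.Document;

import static com.mongodb.client.model.Filters.eq;
import static com.mongodb.client.model.Filters.size;
import static com.mongodb.client.model.Updates.addToSet;

public class FillEmptyProviders {
    public static void main(String[] args) {
        MongoCollection<Document> coll = MongoClients.create("mongodb://localhost:27017")
                .getDatabase("test")
                .getCollection("so");

        // For every document with an empty providers array, push information.hotelId into it.
        for (Document doc : coll.find(size("providers", 0))) {
            Object hotelId = doc.get("information", Document.class).get("hotelId");
            coll.updateOne(eq("_id", doc.get("_id")), addToSet("providers", hotelId));
        }
    }
}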