In a social app (similar to Facebook), a user's profile could look like this:
{"name": "Peter",
"gender": "Male",
"age": "22"}
Now, when people navigate to Peter's page, they can see his name, gender and age.
Let's say Peter is a very private person and does not want anyone to know any of his personal information. I could structure the data to look like this instead:
{"name": "Peter",
"alias":"GoofyDuck",
"gender": "Male",
"age": "22",
"showGender": false,
"showName":false,
"showAge": false}
When other people navigate to his page, the booleans in the JSON would prevent the page from displaying his details. However, if there are many fields of personal detail (besides name, gender and age, Peter could add his address, phone number, etc.), this makes the JSON unnecessarily long.
I was thinking maybe some type of mask on a binary string might be more appropriate:
{"name": "Peter",
"gender": "Male",
"age": "22",
"privacy":"110"}
In this case, the binary string in the "privacy" field says that only his age is hidden, since its bit is 0.
I do think a binary string is probably the most efficient way to store preferences in the cloud, but I have not seen any examples of using a mask to extract the meaning of the binary in Java.
I could also parse the binary string to get each value separately, but I think that won't be efficient. Is there a way I can mask out each value separately?
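For the mask question: in Java you can parse the binary string into an int once and test individual bits with bitwise AND, without splitting the string. A minimal sketch; the bit-to-field mapping is an assumption, since the post doesn't fix an order:

```java
public class PrivacyMask {
    public static void main(String[] args) {
        // "110" from the post: name=1, gender=1, age=0,
        // assuming the leftmost character is the name bit (not fixed by the post)
        String privacy = "110";
        int mask = Integer.parseInt(privacy, 2); // binary "110" -> 6

        // Integer.parseInt makes the leftmost character the highest bit,
        // so here the name bit is 0b100 and the age bit is 0b001
        boolean showName   = (mask & 0b100) != 0;
        boolean showGender = (mask & 0b010) != 0;
        boolean showAge    = (mask & 0b001) != 0;

        System.out.println(showName + " " + showGender + " " + showAge);
        // prints: true true false
    }
}
```

Adding a field means appending one character to the string, but note that every reader must agree on the position-to-field mapping, which is exactly the fragility the answer below points out.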
I think a better design is to couple every property with its own privacy settings, like this:
[ {"name": "Peter", "show": false},
{"alias": "GoofyDuck", "show": false},
{"gender": "Male", "show": false},
{"age": "22", "show": false} ]
This design carries several advantages:
it is clearer than a mask, where a setting's meaning depends on its position in the mask
you can add new properties without worrying about affecting the privacy settings of existing ones
with this design you can have finer-grained privacy settings, beyond just a boolean value:
[ {"name": "Peter", "show": "all"},
{"alias": "GoofyDuck", "show": "all"},
{"gender": "Male", "show": "friends"},
{"age": "22", "show": "none"} ]
once you have a structure per property, you can enhance it with further metadata when it is required, for example:
[ {"name": "Peter", "show": "all", "decorate": "bold"},
{"alias": "GoofyDuck", "show": "all"},
{"gender": "Male", "show": "friends"},
{"age": "22", "show": "none"} ]
Alternatively, you could just keep a whitelist of visible attributes:
{"public_attributes": ["name", "gender", "age"]}
Note that you shouldn't use a blacklist, because it may accidentally reveal data when you add new attributes.
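The whitelist approach can be sketched in Java with a few lines of filtering before the profile is serialized. The names here are hypothetical; the point is that unlisted fields, including newly added ones, stay hidden by default:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;

public class WhitelistFilter {
    // Return only the attributes the whitelist allows; everything else is
    // hidden by default, so newly added fields stay private automatically.
    static Map<String, String> visibleView(Map<String, String> profile,
                                           Set<String> publicAttributes) {
        Map<String, String> view = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : profile.entrySet()) {
            if (publicAttributes.contains(e.getKey())) {
                view.put(e.getKey(), e.getValue());
            }
        }
        return view;
    }

    public static void main(String[] args) {
        Map<String, String> profile = new LinkedHashMap<>();
        profile.put("name", "Peter");
        profile.put("gender", "Male");
        profile.put("age", "22");
        profile.put("phone", "555-0100"); // added later, hidden by default

        System.out.println(visibleView(profile, Set.of("name", "gender")));
        // prints: {name=Peter, gender=Male}
    }
}
```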
I'm new to JSON Schema, so bear with me. My goal is to have a JSON property that is an object. Its keys relate to each other, meaning multiple keys always have the same values together. This will probably help make it clear; here is my attempt to do this with an enum:
{
"$schema": "https://json-schema.org/draft/2019-09/schema",
"title": "Part",
"type": "object",
"properties": {
"relationship": {
"type": "object",
"enum": [
{
"code": "1",
"value": "MEMBER"
},
{
"code": "2",
"value": "SPOUSE"
},
{
"code": "3",
"value": "CHILD"
},
{
"code": "4",
"value": "STUDENT"
},
{
"code": "5",
"value": "DISABILITY_DEPENDENT"
},
{
"code": "6",
"value": "ADULT_DEPENDENT"
},
{
"code": "8",
"value": "DOMESTIC_PARTNER"
}
]
}
}
}
So using an enum like this works, even though I can't find it anywhere in the JSON Schema spec. However, the error message sucks. Normally I get extremely detailed error messages from schema validation, but in this case I do not:
$.part.relationship: does not have a value in the enumeration [, , , , , , ]
I'm not sure what I'm doing wrong. I'm using a Java parser for JSON Schema:
<dependency>
<groupId>com.networknt</groupId>
<artifactId>json-schema-validator</artifactId>
<version>1.0.53</version>
</dependency>
Not sure if the error message is the fault of the parser or something I'm doing wrong in the schema. Help would be appreciated.
It was news to me, but according to the spec it does seem that objects are valid enum values. That said, your usage is quite unusual. I've not seen it used before.
the six primitive types ("null", "boolean", "object", "array", "number", or "string")
...
6.1.2. enum
...
Elements in the array might be of any type, including null.
Your problem is fundamentally that the library that you're using doesn't know how to convert those objects to printable strings. Even if it did give it a reasonable go, you might end up with
does not have a value in the enumeration [{"code": "1", "value":"MEMBER"}, {"code": "2" ...
which might be okay, but it's hardly amazing. If the code and value were both valid but didn't match, you might have to look quite closely at the list before you ever saw the problem.
JSON Schema in general is not very good at enforcing constraints between what it considers to be two unrelated fields. That's beyond the scope of what it aims to do. It's trying to validate the structure. Dependencies between fields are business constraints, not structural ones.
I think the best thing you could do to achieve readable error messages would be to have two sub-properties, each with an enumeration of the seven allowed values; one for the codes, one for the values.
Then you'll get
$.part.relationship.code does not have a value in the enumeration [1,2,3,4 ...
or
$.part.relationship.value does not have a value in the enumeration ["MEMBER", "SPOUSE", ...
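The split might look like this (a sketch reusing the codes and values from the original schema; only the relevant fragment of the "properties" section is shown):

```json
{
  "relationship": {
    "type": "object",
    "properties": {
      "code": { "enum": ["1", "2", "3", "4", "5", "6", "8"] },
      "value": { "enum": ["MEMBER", "SPOUSE", "CHILD", "STUDENT",
                          "DISABILITY_DEPENDENT", "ADULT_DEPENDENT",
                          "DOMESTIC_PARTNER"] }
    }
  }
}
```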
You can do some additional business validation on top of the schema validation if enforcing that constraint is important to you. Then generate your own error such as
code "1" does not match value "SPOUSE"
If code and value always have the same values relative to each other, why encode both in the JSON? Just encode a single value in the JSON and infer the other in the application.
This will be much easier to validate.
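One way to infer the other value in application code is a Java enum that carries the code, so only one of the two needs to appear in the JSON. A sketch; the Relationship type and method names here are hypothetical:

```java
public enum Relationship {
    MEMBER("1"), SPOUSE("2"), CHILD("3"), STUDENT("4"),
    DISABILITY_DEPENDENT("5"), ADULT_DEPENDENT("6"), DOMESTIC_PARTNER("8");

    private final String code;

    Relationship(String code) { this.code = code; }

    public String getCode() { return code; }

    // Resolve a code from the JSON back to the matching enum constant
    public static Relationship fromCode(String code) {
        for (Relationship r : values()) {
            if (r.code.equals(code)) return r;
        }
        throw new IllegalArgumentException("Unknown code: " + code);
    }

    public static void main(String[] args) {
        System.out.println(Relationship.fromCode("2"));        // prints: SPOUSE
        System.out.println(Relationship.SPOUSE.getCode());     // prints: 2
    }
}
```

With this, a mismatched code/value pair simply cannot be represented in the application, and the schema only has to validate one field.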
I am using this Java code to upload a resource to a FHIR store.
The resource is as follows
{
"resourceType": "Bundle",
"id": "bundle-transaction",
"meta": {
"lastUpdated": "2018-03-11T11:22:16Z"
},
"type": "transaction",
"entry": [
{
"resource": {
"resourceType": "Patient",
"id": "123456",
"name": [
{
"family": "Smith",
"given": [
"Darcy"
]
}
],
"gender": "female",
"address": [
{
"line": [
"123 Main St."
],
"city": "Anycity",
"state": "CA",
"postalCode": "12345"
}
]
},
"request": {
"method": "POST",
"url": "Patient"
}
}
]
}
But the id I am using (123456) is getting replaced by a hexadecimal number.
This does not happen when using the fhirstores.import method.
Is there any way to stop the executeBundle method from replacing my id, as I want to use a custom id in my resource?
Any help would be appreciated.
Thank you
When you're performing a transaction, the effect is going to be the same as if you were POSTing the resources individually. On a POST, the server determines the resource id. On a regular POST, the id is just ignored or raises an error. Within a transaction, the id is used to manage resolution of references across the transaction, but the server still chooses what the id will be of the persisted resources (and updates all references accordingly). If you want to control the resource id values within a transaction, use PUT rather than POST. (Note that not all servers will allow an 'upsert' - i.e. a PUT that performs a create at a specific resource location.) For details, see http://hl7.org/fhir/http.html#upsert.
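Concretely, the bundle entry could be rewritten like this (a sketch; note that FHIR resource ids are strings, and the other Patient fields from the original bundle are elided here):

```json
{
  "resource": {
    "resourceType": "Patient",
    "id": "123456"
  },
  "request": {
    "method": "PUT",
    "url": "Patient/123456"
  }
}
```

With PUT the request URL names the target location, so the server persists the resource at Patient/123456 instead of assigning its own id, provided the server allows update-as-create.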
So in my project there is a part where there are two one-to-many relationships to the same entity.
What happens is that the response to the GET request in Postman comes back like this.
The one-to-many relationship should be written the same way for both elements:
[
  {
    "elemnt1withonetomany": {
      "id": 2,
      "name": "something",
      "last_name": "something",
      "email": "something"
    },
    "elemnt2withonetomany": {
      "#id": 4,
      "id": 4,
      "code": "details",
      "email": "details",
      "name": "details",
      "lastname": "details"
    }
  },
  {
    "elemnt1withonetomany": {
      "id": 2,
      "name": "something",
      "last_name": "something",
      "email": "something"
    },
    "element2withonetomany": 4
  }
]
So is there any way to make the GET request return the same form of information for elemnt2withonetomany?
I sort of found where it came from, but fixing it that way would need a lot of @JsonBackReference and similar annotations.
And yes, it came from the @JsonIdentityInfo annotation on top of the entities.
It took me a while to find the source, so I'm posting what I found in case someone else needs it.
So my new question is: is there a way to bypass this behavior without deleting that annotation?
I need to process a big JSON payload(~1MB) coming from an API, a portion of the JSON is something like this:
{
"id": "013dd2a7-fec4-4cc5-b819-f3cf16a1f820",
//more attributes
"entry_mode": "LDE",
"periods": [
{
"type": "quarter",
"id": "fe96dc03-660c-423c-84cc-e6ae535edd2d",
"number": 1,
"sequence": 1,
"scoring": {
//more attribtues
},
"events": [
{
"id": "e4426708-fadc-4cae-9adc-b7f170f5d607",
"clock": "12:00",
"updated": "2013-12-22T03:41:40+00:00",
"description": "J.J. Hickson vs. DeAndre Jordan (Blake Griffin gains possession)",
"event_type": "opentip",
"attribution": {
"name": "Clippers",
"market": "Los Angeles",
"id": "583ecdfb-fb46-11e1-82cb-f4ce4684ea4c",
"team_basket": "left"
},
"location": {
"coord_x": 572,
"coord_y": 296
},
"possession": {
"name": "Clippers",
"market": "Los Angeles",
"id": "583ecdfb-fb46-11e1-82cb-f4ce4684ea4c"
}
},
//more events
]
}
]
}
This is a nearly-realtime API; I need to process only the events, identify a set of event UUIDs, look for duplicates in the database, and save new events.
I could use a JSONObject/JSONArray, or use regex and string parsing to fetch the events portion. Processing time is critical since this should be nearly realtime, and memory efficiency is important since there can be multiple payloads coming in at once.
Which one is more efficient for this use case?
Use a proper streaming JSON parser. You know what you want to pull out of the stream, you know when you can quit parsing it, so read the stream in small, manageable chunks, and quit as soon as you know you are done.
Circa 2017, I'm not aware of any browser/native JSON streaming APIs, so you'll need to find a JavaScript-based streaming library. Fortunately, streaming is not a new concept, so there are a number of options already in existence:
http://oboejs.com/
https://github.com/dominictarr/JSONStream
https://github.com/creationix/jsonparse
https://github.com/dscape/clarinet
http://danieltao.com/lazy.js/demos/json/
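If you're on the JVM rather than in a browser (the question mentions JSONObject/JSONArray), the same idea applies with Jackson's streaming JsonParser. A minimal sketch, assuming jackson-core is on the classpath; it pulls out only the top-level "id" of each object in an "events" array and skips everything else, including nested ids like attribution.id:

```java
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import java.util.ArrayList;
import java.util.List;

public class EventIdExtractor {
    // Stream through the payload token by token; never builds a full DOM.
    static List<String> extractEventIds(String json) throws Exception {
        List<String> ids = new ArrayList<>();
        JsonFactory factory = new JsonFactory();
        try (JsonParser p = factory.createParser(json)) {
            while (p.nextToken() != null) {
                if (p.getCurrentToken() == JsonToken.FIELD_NAME
                        && "events".equals(p.getCurrentName())
                        && p.nextToken() == JsonToken.START_ARRAY) {
                    // Walk each event object, tracking nesting depth so we
                    // only record the event's own "id", not nested ones
                    while (p.nextToken() == JsonToken.START_OBJECT) {
                        int depth = 1;
                        while (depth > 0) {
                            JsonToken t = p.nextToken();
                            if (t == JsonToken.START_OBJECT || t == JsonToken.START_ARRAY) {
                                depth++;
                            } else if (t == JsonToken.END_OBJECT || t == JsonToken.END_ARRAY) {
                                depth--;
                            } else if (t == JsonToken.FIELD_NAME && depth == 1
                                    && "id".equals(p.getCurrentName())) {
                                p.nextToken();
                                ids.add(p.getText());
                            }
                        }
                    }
                }
            }
        }
        return ids;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"periods\":[{\"events\":["
                + "{\"id\":\"e1\",\"attribution\":{\"id\":\"nested\"}},"
                + "{\"id\":\"e2\"}]}]}";
        System.out.println(extractEventIds(json)); // prints: [e1, e2]
    }
}
```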
I've implemented an application using Moqui Framework. I have a field named "age" in an entity, and I am trying to get all records from the entity where the age is between 20 and 25. How do I create a REST service for this requirement, and how do I specify the URL?
This is the URL: rest/s1/UserMargen/DetailsOfUser, and I am getting:
[
{
"street": "Bridege",
"age": 22,
"city": "kol",
"username": "Debendu",
"lastUpdatedStamp": "2016-04-26T12:43:45+0000",
"userid": "2000"
},
{
"lastUpdatedStamp": "2016-04-26T12:42:42+0000",
"userid": "2001",
"street": "White",
"username": "rolla",
"city": "Ban",
"age": 20
},
{
"username": "Venkatesh",
"street": "T-nager",
"age": 28,
"userid": "2005",
"city": "chennai",
"lastUpdatedStamp": "2016-04-26T12:48:33+0000"
}
]
In rest.xml I have defined:
<resource name="DetailsOfUser">
    <method type="get"><entity name="UserInDetails" operation="list"/></method>
    <method type="post"><service name="UserMargen.UserMargenServices.create#userDetails"/></method>
    <id name="age">
        <method type="get"><entity name="UserInDetails" operation="list"/></method>
    </id>
</resource>
If I request /rest/s1/UserMargen/DetailsOfUser/22, it displays the records where age equals 22. How do I get the records with age between 20 and 25?
In XML REST API definitions in Moqui Framework the method.entity element behaves the same as for the entity (/rest/e1) and entity master (/rest/m1) interfaces, which behave the same as the search form inputs (either in an XML Screen file or through the EntityFind interface). For search form inputs you can use the field name plus "_from" and "_thru" suffixes to do a ranged find on numeric or date/time fields.
If you want the age range values to be in the URL as path parameters you'll need to define an id element with the name age_from and another id element under it with the name age_thru. A more flexible approach would just be to pass them as URL parameters instead of path elements, i.e. something like:
/rest/s1/UserMargen/DetailsOfUser?age_from=20&age_thru=25
Using this pattern you can pass field values or any of the suffixes supported by EntityFind.searchFormInputs()/searchFormMap(): _op, _not, _ic, _period/_poffset, _from, and _thru. You can also pass an orderByField parameter which can be a comma-separated list of field names. You can also pass pagination parameters like pageIndex, pageSize, or even pageNoLimit to not paginate.
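For reference, the nested path-parameter variant might look like this in rest.xml (an untested sketch based on the description above, extending the existing resource definition):

```xml
<resource name="DetailsOfUser">
    <method type="get"><entity name="UserInDetails" operation="list"/></method>
    <id name="age_from">
        <id name="age_thru">
            <method type="get"><entity name="UserInDetails" operation="list"/></method>
        </id>
    </id>
</resource>
```

That would make the range request look like /rest/s1/UserMargen/DetailsOfUser/20/25, though the URL-parameter form is more flexible since either bound can be omitted.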