Finding JSON diff fails using JSONAssert - Java

I was hoping to use Jackson to find a JSON diff, but it does not give detailed error messages.
So I tried JSONAssert to find the diff between two JSON strings:
JSONAssert.assertEquals(expectedJsonResponse, actualJsonResponse, false);
Sadly, it does not appear to match correctly or give the detailed error messages shown in the examples. If you have used it, can you please clarify?
java.lang.AssertionError: data[0] Could not find match for element {"errors":[{"httpStatus":"BAD_REQUEST","personId":null,"details":"User ID [UNKNOWN]. Invalid ID: NONSENSE"}],"successfulIds":["A0","B1","C3"]}
at org.skyscreamer.jsonassert.JSONAssert.assertEquals(JSONAssert.java:222)
Actual JSON:
{"_links":{"self":{"href":"https://myserver.com:1000/api/person/upload? myCsvFile={myCsvFile}","templated":true}},"data":[{"successfulIds":["A0","XYZ","C3"],"errors":[{"personId":null,"httpStatus":"BAD_REQUEST","details":"User ID [UNKNOWN]. Invalid ID: NONSENSE"}]}]}
Expected JSON:
{
  "_links": {
    "self": {
      "href": "https://myserver.com:1000/api/person/upload?myCsvFile={myCsvFile}",
      "templated": true
    }
  },
  "data": [
    {
      "successfulIds": [
        "A0",
        "B1",
        "C3"
      ],
      "errors": [
        {
          "personId": null,
          "httpStatus": "BAD_REQUEST",
          "details": "User ID [UNKNOWN]. Invalid ID: NONSENSE"
        }
      ]
    }
  ]
}

I tried to email the address listed at http://jsonassert.skyscreamer.org/ but got a bounce:
The following message to jsonassert-dev#skyscreamer.org was
undeliverable. The reason for the problem:
5.1.0 - Unknown address error 550-"5.1.1 The email account that you tried to reach does not exist
So I tried ZJsonPatch. What I like is that, because it works on Jackson JsonNodes, the ordering of the members does not matter. In other words, I first check for equality using Jackson, which is order-independent; if that fails, I use ZJsonPatch to tell me what the diff is, for example:
{"op":"replace","path":"/data/0/successfulIds/1","value":"B9"}
It also handles nested JSON well.
import static org.junit.Assert.assertEquals;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.flipkart.zjsonpatch.JsonDiff;

ObjectMapper mapper = new ObjectMapper();
JsonNode expected = mapper.readTree(expectedJsonResponse);
JsonNode actual = mapper.readTree(actualJsonResponse);
try {
    // JsonNode equality ignores member ordering
    assertEquals(expected, actual);
} catch (AssertionError ae) {
    // express the difference as a JSON Patch (RFC 6902)
    JsonNode patch = JsonDiff.asJson(actual, expected);
    throw new Exception(patch.toString(), ae);
}
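If you want to stay within JSONAssert, the comparison can also be run without throwing, and the accumulated failure message inspected directly, via JSONCompare. A minimal sketch (JSONCompareMode.LENIENT corresponds to the false strict flag used above):
import org.json.JSONException;
import org.skyscreamer.jsonassert.JSONCompare;
import org.skyscreamer.jsonassert.JSONCompareMode;
import org.skyscreamer.jsonassert.JSONCompareResult;

public class JsonDiffPrinter {
    public static void printJsonDiff(String expectedJsonResponse, String actualJsonResponse) throws JSONException {
        // LENIENT corresponds to the "false" strict flag passed to assertEquals above.
        JSONCompareResult result =
                JSONCompare.compareJSON(expectedJsonResponse, actualJsonResponse, JSONCompareMode.LENIENT);
        if (result.failed()) {
            // The same message JSONAssert would have placed in the AssertionError.
            System.out.println(result.getMessage());
        }
    }
}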

Related

Google DLP - Can I use a delimiter to instruct DLP infoType detectors to search for sensitive text only inside it?

I have an issue while trying to de-identify some data with DLP. I use an object mapper to serialize the object into a string, send it to DLP for de-identification, get back the de-identified string, and use the object mapper to parse it back into the initial object. Sometimes DLP returns a string that cannot be parsed back into the initial object (it breaks the JSON format the object mapper expects).
I use an ObjectMapper to serialize an Address object to a string. The class looks like this:
class Address(
    val postal_code: String,
    val street: String,
    val city: String,
    val provence: String
)
My ObjectMapper transforms this object into a string, e.g. "{\"postal_code\":\"123ABC\",\"street\":\"Street Name\",\"city\":\"My City\",\"provence\":\"My Provence\"}", which is sent to DLP and de-identified (using the LOCATION or STREET_ADDRESS detectors).
The object mapper then expects to take the de-identified string back and parse it into my Address object using the same JSON format, e.g.:
"{\"postal_code\":\"LOCATION_TOKEN(10):asdf\",\"street\":\"LOCATION_TOKEN(10):asdf\",\"city\":\"LOCATION_TOKEN(10):asdf\",\"provence\":\"LOCATION_TOKEN(10):asdf\"}"
But quite often DLP returns something like
"{"LOCATION_TOKEN(25):asdfasdfasdf)\",\"provence\":\"LOCATION_TOKEN(10):asdf\"}", which breaks the JSON format, and I am unable to parse the string from DLP back into my initial object.
Is there a way to instruct the DLP infoType detectors to keep the JSON format, or to look for sensitive text only inside \" * \"?
Thanks
There are some options here, using a custom regex and a detection rule set, to define a boundary on matches.
The general idea is to require that a finding match both a built-in infoType (e.g. STREET_ADDRESS, LOCATION, PERSON_NAME) and your custom infoType before it is reported or redacted. By requiring both to match, you can set bounds on where the built-in infoType can detect.
Here is an example.
{
  "item": {
    "value": "{\"postal_code\":\"123ABC\",\"street\":\"Street Name\",\"city\":\"My City\",\"provence\":\"My Provence\"}"
  },
  "inspectConfig": {
    "customInfoTypes": [
      {
        "infoType": {
          "name": "CUSTOM_BLOCK"
        },
        "regex": {
          "pattern": "(:\")([^,]*)(\")",
          "groupIndexes": [
            2
          ]
        },
        "exclusionType": "EXCLUSION_TYPE_EXCLUDE"
      }
    ],
    "infoTypes": [
      {
        "name": "EMAIL_ADDRESS"
      },
      {
        "name": "LOCATION"
      },
      {
        "name": "PERSON_NAME"
      }
    ],
    "ruleSet": [
      {
        "infoTypes": [
          {
            "name": "LOCATION"
          }
        ],
        "rules": [
          {
            "exclusionRule": {
              "excludeInfoTypes": {
                "infoTypes": [
                  {
                    "name": "CUSTOM_BLOCK"
                  }
                ]
              },
              "matchingType": "MATCHING_TYPE_INVERSE_MATCH"
            }
          }
        ]
      }
    ]
  },
  "deidentifyConfig": {
    "infoTypeTransformations": {
      "transformations": [
        {
          "primitiveTransformation": {
            "replaceWithInfoTypeConfig": {}
          }
        }
      ]
    }
  }
}
Example output:
"item": {
"value": "{\"postal_code\":\"123ABC\",\"street\":\"Street Name\",\"city\":\"My City\",\"provence\":\"My [LOCATION]\"}"
},
By setting "groupIndexes" to 2 we are indicating that we only want the custom infoType to match the middle (or second) regex group and not allow the :" or " to be part of the match. Also, in this example we mark the custom infoType as EXCLUSION_TYPE_EXCLUDE so that it does not report itself:
"exclusionType": "EXCLUSION_TYPE_EXCLUDE"
If you remove this line, anything matching your infoType could also get redacted. This can be useful for testing though - example output:
"item": {
"value": "{\"postal_code\":\"[CUSTOM_BLOCK]\",\"street\":\"[CUSTOM_BLOCK]\",\"city\":\"[CUSTOM_BLOCK]\",\"provence\":\"[CUSTOM_BLOCK][LOCATION]\"}"
},
...
Hope this helps.
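To see concretely what group 2 of that pattern captures (and therefore what CUSTOM_BLOCK is allowed to cover), here is a small standalone check with plain java.util.regex - this only illustrates the regex, it is not part of the DLP API:
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class GroupTwoDemo {
    public static void main(String[] args) {
        String json = "{\"postal_code\":\"123ABC\",\"street\":\"Street Name\","
                + "\"city\":\"My City\",\"provence\":\"My Provence\"}";
        // Same pattern as the custom infoType above; group 2 is the text between :" and "
        Matcher m = Pattern.compile("(:\")([^,]*)(\")").matcher(json);
        while (m.find()) {
            // Prints 123ABC, Street Name, My City, My Provence - the value text only,
            // without the surrounding :" and " delimiters.
            System.out.println(m.group(2));
        }
    }
}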

Not sure why it says the JSON is not valid after replacing the values of paymentKey and sessionKey

I am reading a JSON file, traversing to the attributes "paymentKey" and "sessionKey", and changing their values through JSONObject; after that the POST operation fails.
When I checked the output JSON after making these changes, the structure looked reordered and changed, and it turned out the JSON was no longer valid.
This is a bit annoying, and I am not sure how to keep the JSON format intact after replacing the attribute values.
Below is the JSON used:
{
  "idempotentId": "133215472229",
  "customerId": "12345",
  "brandId": "ANCHOR",
  "sellingChannel": "WEBOA",
  "items": [
    {
      "lineItemId": 123,
      "productId": "ANCHOR-WEBOA-640213214",
      "price": 1.19,
      "quantity": 1,
      "modifierGroups": [],
      "childItems": [],
      "note": " Drink without snacks"
    }
  ],
  "fulfillment": {
    "email": "12#gmail.com",
    "phoneNumber": "+912222621",
    "fulfillmentType": "PickUp",
    "asap": true,
    "pickupFirstName": "Kiran",
    "pickupLastName": "Kumar",
    "locationId": "33211111"
  },
  "payment": {
"paymentKey": "12222-444-555-2222-44444121e",
"sessionKey": "02f3waAjHJnVCTstOIu0jcSZfm_1HnGum1lZdsu6iDlLxxjO1FYsG9DHz9130ZzMMkjYY9j5w.7V8CijbmiPSo5ESDsq5hsQ.RpYSS5wkgoSSOMjktEyDTHZh1IPq0wNayp--DE3HE53uUgTEehCvHjSsUP5q8U2ZN1kZXbsufwm_mRCV8hLCrmWVTchhVUTJtmEpyYy142DtSp1ikXOVzGN5i9z_oP5e79QvgmU7_n1C5DeARFRagQClT87vUFBUfleSbLaRyH5v3wkU7ji9URUetcq1iAfS5-cNt6-uJaulFJc2y6uNdn0OtjIe74Hp5G7Gx54VYggduoqx5X1rsCssobfUSJUDLt_vVpz5BvhQM88EaysMAB6EcQHoOnZd_YWrz4IDAAZSwSBUFQAkypVmHo5pbvp64cTDrZE73EYkEwJLGf0dRmedMFe2HiU3DiCr97K3I3KuufxYM_eMRIcn739dntxTq4QePtFdqYGWBzXWQutvvqxWQPbNi7PG_-aauEOzlwJiXG94C8t7NGu0SjB8xHf11Z3orf5Ni4-fRKugY8VJNBl39hnb4-d-g47ut7iuiFDkDHJzlSgt9LFq__CxShG_.YkL2w7QEU85VHjpOj5urieCr4-G"
  },
  "subTotal": 100.19,
  "tax": 4.19
}
Below is a snippet of the code:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

import org.json.JSONArray;
import org.json.JSONObject;

public JSONObject constructCreateOrderPayload(String freedomPayPaymentKey, String orderInit_SessionKey, String payloadFile) {
    String filepath = System.getProperty("user.dir") + "/src/test/resources/JsonFiles/" + payloadFile;
    JSONObject payload = null;
    try {
        String jsonContents = new String(Files.readAllBytes(Paths.get(filepath)));
        JSONObject jsonObject = new JSONObject(jsonContents);
        JSONObject payment_obj = (JSONObject) jsonObject.get("payment");
        payment_obj.put("paymentKey", freedomPayPaymentKey);
        payment_obj.put("sessionKey", orderInit_SessionKey);
        System.out.println("----------------------------------------------------------------");
        System.out.println(" After Changes in JSON OBJECT : ");
        System.out.println(jsonObject.toString());
        System.out.println("");
        System.out.println("----------------------------------------------------------------");
        payload = jsonObject; // when I print jsonObject the member order is displaced, hence validation reports invalid JSON
    } catch (IOException e) {
        System.out.println("No file found in the path ");
        e.printStackTrace();
    }
    return payload;
}
When I validated the JSON after the changes it showed as invalid, with errors as shown in the snapshot below.
I have tried a lot without success. Can somebody please look into the issue and advise me where I am going wrong, or suggest a solution?
JSON is unordered; if you print jsonObject before making any changes you will see that the order of the JSON has already changed. I have used the Jackson Databind library instead, and below is working code - change it as needed:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;

String filepath = "C:\\Users\\wilfred\\Desktop\\Input.json";
try {
    String jsonContents = new String(Files.readAllBytes(Paths.get(filepath)));
    ObjectMapper mapper = new ObjectMapper();
    JsonNode expected = mapper.readTree(jsonContents);
    System.out.println("Before converting : " + expected.toString());
    // "payment" is an object node, so it can be modified in place
    JsonNode payment_obj = expected.get("payment");
    ((ObjectNode) payment_obj).put("paymentKey", "Trial1");
    ((ObjectNode) payment_obj).put("sessionKey", "Trial2");
    System.out.println("After converting : " + expected.toString());
} catch (IOException e) {
    System.out.println("No file found in the path ");
    e.printStackTrace();
}
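If the modified payload then has to go out as a string in the POST body, one way (a sketch, using the same Jackson mapper as above) is:
// Serialize the modified tree back into a JSON string for the request body.
// writeValueAsString throws JsonProcessingException, so declare or handle it.
String payload = mapper.writeValueAsString(expected);
// Or, for readable logging:
String pretty = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(expected);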
My approach was correct. The only mistake was that I had not supplied the correct values for a few of the JSON attributes, and that is what caused the error response.
I corrected the values as required and got the expected results, hence closing this.

Hamcrest matcher to check if any element in a response JSON array has a property equal to a specific value in Rest Assured

I am working on REST API test automation with Rest Assured. For one API I get an array like the one below in the response. From that data array I need to check whether any item has the property "requestRefNo" with the value "Sss/12345637/58".
{
  "data": [
    {
      "requestRefNo": "Sss/12345637/88",
      "requestRefType": "AST",
      "requestedByCode": "OWR",
      "requestedByDesc": "Asset Owner",
      "requestedDate": "12/06/2016",
      "requestTypeRefNo": "Sss/12345637/SWT/73"
    },
    {
      "requestRefNo": "Sss/12345637/58",
      "requestRefType": "AST",
      "requestedByCode": "OWR",
      "requestedByDesc": "Asset Owner",
      "requestedDate": "10/06/2016",
      "requestTypeRefNo": "Sss/12345637/SWT/43"
    },
    ....
  ],
  "links": {
    "linkDetails": [],
    "empty": true
  },
  "errors": {
    "empty": true,
    "errorDetails": []
  }
}
I have tried this:
.assertThat().statusCode(200).body("data.requestRefNo", IsArrayContaining.hasItemInArray("Sss/12345637/58"))
But it is giving the below error:
java.lang.AssertionError: 1 expectation failed.
JSON path data.requestRefNo doesn't match.
Expected: an array containing "Sss/12345637/58"
Actual: [Sss/12345637/58, Sss/12345637/88]
Can anyone give me any idea?
Thanks,
Surodip
Got a very simple answer, missed earlier:
...
.body("data.requestRefNo", Matchers.hasItem("Sss/12345637/58"))
.extract().response();
"data.requestRefNo" will return the array of all requestRefNo in the response array like [Sss/12345637/58, Sss/12345637/88] and Matchers.hasItem will check if the value "Sss/12345637/58" exists in that.
Thanks.
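For reference, in a complete Rest Assured (3.x+, io.restassured packages) call the matcher fits in like this - a sketch where the base URI and path are placeholders for the real endpoint:
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.hasItem;

import io.restassured.response.Response;

// Hypothetical endpoint; replace baseUri/path with the real ones.
Response response = given()
        .baseUri("https://example.com")
    .when()
        .get("/requests")
    .then()
        .assertThat()
        .statusCode(200)
        // GPath "data.requestRefNo" collects every requestRefNo in the data array
        .body("data.requestRefNo", hasItem("Sss/12345637/58"))
        .extract().response();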

Update nested field in an index of ElasticSearch with Java API

I am using the Java API for CRUD operations on Elasticsearch.
I have a type with a nested field and I want to update this field.
Here is my mapping for the type:
"enduser": {
"properties": {
"location": {
"type": "nested",
"properties":{
"point":{"type":"geo_point"}
}
}
}
}
Of course my enduser type has other parameters as well.
Now I want to add this document to my nested field:
"location": {
  "name": "London",
  "point": "44.5, 5.2"
}
I searched the documentation for how to update a nested document but couldn't find anything. I have the previous JSON object in a string (let's call this string json). I tried the following code, but it does not seem to work:
params.put("location", json);
client.prepareUpdate(index, ElasticSearchConstants.TYPE_END_USER,id).setScript("ctx._source.location = location").setScriptParams(params).execute().actionGet();
I get a parsing error from Elasticsearch. Does anyone know what I am doing wrong?
You don't need the script, just update it.
UpdateRequestBuilder br = client.prepareUpdate("index", "enduser", "1");
br.setDoc("{\"location\":{ \"name\": \"london\", \"point\": \"44.5,5.2\" }}".getBytes());
br.execute();
I tried to recreate your situation and solved it by using the .setScript method in a different way.
Your update request would then look like:
client.prepareUpdate(index, ElasticSearchConstants.TYPE_END_USER,id).setScript("ctx._source.location =" + json).execute().actionGet()
Hope it will help you.
I am not sure which ES version you were using, but the solution below worked perfectly for me on 2.2.0. I had to store information about named entities for news articles. If you want to store multiple locations in your case, I guess it would suit you as well.
This is the nested object I wanted to update:
"entities" : [
{
"disambiguated" : {
"entitySubTypes" : [],
"disambiguatedName" : "NameX"
},
"frequency" : 1,
"entityType" : "Organization",
"quotations" : ["...", "..."],
"name" : "entityX"
},
{
"disambiguated" : {
"entitySubType" : ["a", "b" ],
"disambiguatedName" : "NameQ"
},
"frequency" : 5,
"entityType" : "secondTypeTest",
"quotations" : [ "...", "..."],
"name" : "entityY"
}
],
and this is the code:
import org.elasticsearch.action.update.UpdateRequest;
import org.elasticsearch.common.xcontent.XContentBuilder;
import org.elasticsearch.common.xcontent.XContentFactory;

UpdateRequest updateRequest = new UpdateRequest();
updateRequest.index(indexName);
updateRequest.type(mappingName);
updateRequest.id(url); // docID is a url
XContentBuilder jb = XContentFactory.jsonBuilder();
jb.startObject();              // article
jb.startArray("entities");     // multiple entities
for ( /* each namedEntity */ ) {
    jb.startObject()                      // entity
        .field("name", name)
        .field("frequency", n)
        .field("entityType", entityType)
        .startObject("disambiguated")     // disambiguation
            .field("disambiguatedName", disambiguatedNameStr)
            .field("entitySubTypes", entitySubTypeArray)  // multi-value field
        .endObject()                      // disambiguation
        .field("quotations", quotationsArray)             // multi-value field
    .endObject();                         // entity
}
jb.endArray();   // close the array of nested objects
jb.endObject();  // close the article
updateRequest.doc(jb);
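The request above still has to be executed; with the transport client of that era this would be roughly (a sketch, assuming client is the same Client instance used elsewhere in this thread):
// Send the partial-document update and block until the response arrives.
client.update(updateRequest).actionGet();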
Blblblblblblbl's answer couldn't work for me at the moment, because scripts are not enabled on our server. I haven't tried Bask's answer yet. Alcanzar's gave me a hard time, because I apparently couldn't formulate the JSON string that setDoc receives correctly; I kept getting errors that I was either using objects instead of fields or vice versa. I also tried wrapping the JSON string with doc{} as indicated here, but I didn't manage to make it work. As you mentioned, it is difficult to work out how to express a curl statement through ES's Java API.
A simple way to update an array list and an object value using the Java API:
UpdateResponse update = client.prepareUpdate("indexname", "type", "" + id)
        .addScriptParam("param1", arrayvalue)
        .addScriptParam("param2", objectvalue)
        .setScript("ctx._source.field1=param1;ctx._source.field2=param2")
        .execute()
        .actionGet();
arrayvalue:
[
  {
    "text": "stackoverflow",
    "datetime": "2010-07-27T05:41:52.763Z",
    "obj1": {
      "id": 1,
      "email": "sa#gmail.com",
      "name": "bass"
    },
    "id": 1
  }
]
objectvalue:
"obj1": {
  "id": 1,
  "email": "sa#gmail.com",
  "name": "bass"
}

How to create JSON Schema for Name/Value structure?

My problem is that I am serializing the content of a map to JSON.
In the output JSON I get an object whose properties follow a name/value rule: the property name comes from the map key, and the property value from the map value.
Model Example:
class Storage {
    Map<String, String> values = new HashMap<>();
    {
        values.put("key1", "value1");
        values.put("key2", "value2");
        values.put("key3", "value3");
    }
}
JSON Example object:
{
  "key1": "value1",
  "key2": "value2",
  "key3": "value3"
}
JSON Schema:
{
  "name": "storage",
  "description": "Store of key values",
  "properties": {
    // How can we describe the properties if we do not know the names?
  }
}
The issue is that I do not know what the keys and values will be; I only know that there will be some.
Can you help me with the full definition of the schema?
Disclaimer:
I know that this could also be serialized as
{
  "values": [
    { "key": "key1", "value": "value1" },
    { "key": "key2", "value": "value2" },
    { "key": "key3", "value": "value3" }
  ]
}
but I do not want to have an array in the JSON.
Assuming your validator supports it, you can use patternProperties.
For the schema...
{
  "title": "Map<String,String>",
  "type": "object",
  "patternProperties": {
    ".{1,}": { "type": "string" }
  }
}
...and the document...
{
  "foo": "bar",
  "baz": 1
}
...the value of property foo is valid because it is a string but baz fails validation because it is a number.
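If you want to check this from Java, one validator that supports patternProperties is the everit-org json-schema library. A minimal sketch, assuming that dependency (and org.json) is on the classpath:
import org.everit.json.schema.Schema;
import org.everit.json.schema.ValidationException;
import org.everit.json.schema.loader.SchemaLoader;
import org.json.JSONObject;

public class MapSchemaCheck {
    public static void main(String[] args) {
        JSONObject rawSchema = new JSONObject(
                "{\"title\":\"Map<String,String>\",\"type\":\"object\","
                + "\"patternProperties\":{\".{1,}\":{\"type\":\"string\"}}}");
        Schema schema = SchemaLoader.load(rawSchema);
        try {
            // "baz" is a number, so this document should be rejected.
            schema.validate(new JSONObject("{\"foo\":\"bar\",\"baz\":1}"));
        } catch (ValidationException e) {
            // Reports which property failed and why.
            System.out.println(e.getMessage());
        }
    }
}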
I used the solution suggested by #augurar:
"additionalProperties": { "type": "string" }
for an AWS API Gateway model, and the SDK was able to generate the Map variable as required in the Java / Android SDK.
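For reference, that hint expands to a complete schema along these lines (a sketch: property names are left unconstrained and every value must be a string):
{
  "title": "Map<String,String>",
  "type": "object",
  "additionalProperties": { "type": "string" }
}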
#Arne Burmeister - in my case, Solution 1 did not work as needed, although it did not give any error in the model (the schema was created).
