I want to know whether there is a way to generate sample JSON output based on a JSON Schema input.
For example:
input =>
{
"title": "Example Schema",
"type": "object",
"properties": {
"firstName": {
"type": "string"
},
"lastName": {
"type": "string"
},
"age": {
"description": "Age in years",
"type": "integer",
"minimum": 0
}
},
"required": ["firstName", "lastName"]
}
output =>
{
"firstName" : "RandomFirstName",
"lastName" : "RandomLastName"
}
I have a large JSON Schema with plenty of validations, so to generate a sample valid JSON document I could either create one manually using Java or just type it into a file. Is there a better way available?
You can try the JSON Schema Faker. It will take a schema and generate/output a JSON object that will validate against the schema.
JSONBuddy can do this for you. It is a Windows desktop JSON editor and generates live JSON sample data while you are editing your schema.
fake-schema-cli is another option you can use.
Example: fake-schema file-input-schema.json > output.json.
My team and I have created an online tool that lets you parse a JSON schema and generate an array of JSON data that complies with the schema. You can save it as a .json file and parse it in your app with a Java parser.
The tool is called Mock turtle: https://mockturtle.net
You can also use the ModelObject in Adobe Ride (full disclosure: self-plug here). Point the ModelObject (or a subclass thereof) to a schema in your Java project resources: https://github.com/adobe/ride/blob/develop/sample/sample-service-extension/src/test/java/com/adobe/ride/sample/tests/ObjectCreation.java#L38
You can also use the Ride Fuzzer Lib to easily test sending negative data into the schema nodes (based on an array of OWASP and Google injection test strings, and various other types of data): https://github.com/adobe/ride/tree/develop/libraries/ride-fuzzer-lib
All Ride modules are open source and free: https://github.com/adobe/ride/
Goal: save the response I am getting from an API request to JSON files.
Filename needed: name_of_original_file_request_was_sent_with + "_response.json"
So far the JMeter script reads 6 files, takes data from each file and puts it in the request body while making the API request, so I get 6 responses. Now how do I save those responses to files?
I used this code, but it replaces the original files that I sent the request with:
new File(vars.get('file')).bytes = prev.getResponseData()
Below is the response file:
{
"type": "transaction-response",
"link": [
{
"relation": "self",
"url": "https://someurl.com"
}
],
"entry": [
{
"response": {
"status": "201 Created",
"location": "Player/aerear",
"etag": "1",
"lastModified": "2020"
}
}
]
}
So you basically need to amend your code in order to:
1. Remove the extension from the original file name
2. Append _response.json to it
For point 1 you can use the FilenameUtils.getBaseName() function
For point 2 you can use simple string concatenation
Something like:
new File(org.apache.commons.io.FilenameUtils.getBaseName(vars.get('file')) + '_response.json').bytes = prev.getResponseData()
should do the trick for you.
More information on Groovy scripting in JMeter: Apache Groovy: What Is Groovy Used For?
Currently, I have a set of JSON files in the automation framework. In each JSON file there is an array containing a set of similar JSON objects. I need to use these files for validations against the web application using Cucumber and Selenium in Java. Below is an example of the JSON array present in each JSON file:
[
{
"Status": "Active",
"Company": "XYZ",
"PackageId": "551819",
"ProductCode": "BACC"
},
{
"Status": "Active",
"Company": "ABC",
"PackageId": "551829",
"ProductCode": "IRE7"
},
{
"Status": "Active",
"Company": "MAU",
"PackageId": "551009",
"ProductCode": "BACC"
},
{
"Status": "Active",
"Company": "XYZ",
"PackageId": "551073",
"ProductCode": "AXM"
}
]
All the files contain a similar JSON array. My framework needs to take all these files, take the data from each JSON object of each array of each file, and validate it against the data shown in the web application using a single scenario of a Cucumber step definition file. Can anyone please suggest how this can be done, as I have never used a set of files as a data provider in a Cucumber framework?
Maybe this can help you:
Data Driven Testing using JSON with Cucumber
We have to make a lot of amendments in our project in this chapter to implement the data-driven technique using JSON files:
1. Decide what data needs to be passed through JSON files
2. Create the JSON data set
3. Write a Java POJO class to represent the JSON data
4. Add the JSON data file location to the properties file and write a method to read it
5. Create a JSON Data Reader class
6. Modify FileReaderManager to accommodate the JSON Data Reader
7. Modify the Checkout Steps file to pass Test Data to the Checkout Page Objects
8. Modify the Checkout Page object to use the Test Data object
https://www.toolsqa.com/selenium-cucumber-framework/data-driven-testing-using-json-with-cucumber/
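If it helps, here is a minimal sketch of a simple JSON data reader using Jackson (the tutorial itself uses its own reader classes); the PackageData class, folder handling, and method names below are illustrative, not part of the tutorial:
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.File;
import java.util.ArrayList;
import java.util.List;

public class JsonDataReader {

    // POJO mirroring one object of the arrays shown in the question
    public static class PackageData {
        @JsonProperty("Status") public String status;
        @JsonProperty("Company") public String company;
        @JsonProperty("PackageId") public String packageId;
        @JsonProperty("ProductCode") public String productCode;
    }

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Reads every *.json file in the given folder and flattens all arrays into one list
    public static List<PackageData> readAll(File folder) throws Exception {
        List<PackageData> all = new ArrayList<>();
        for (File file : folder.listFiles((dir, name) -> name.endsWith(".json"))) {
            all.addAll(MAPPER.readValue(file, new TypeReference<List<PackageData>>() {}));
        }
        return all;
    }
}
A Cucumber step definition could then call readAll(...) once and iterate over the returned list, comparing each record against what the web application shows.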
I have a problem with a very large JSON file: it is too large to read into a JsonNode with ObjectMapper.readValue(). I would like to use the solution from Out of memory error while parsing a large JSON using Jackson library on Android, except that the JSON file is a single object with field names that are not known ahead of time, so I can't create a model POJO to deserialize to.
Each property inside the object has the same format, and I can ignore many of the properties of those inner objects (I already have a POJO class to model them). It would be easier for me to solve this problem if the JSON file were an array instead of an object. (I'm not the one creating the file, just reading from it.)
(I'm posting my solution below, but I hope there's a better one!)
Without being able to load the original file in an ObjectMapper, I decided to parse the JSON file and rewrite it as an array. Reading and writing line-by-line, I converted a file that looks like this (but much larger):
{
"Unexpected Monkey" : {
"name" : "UnexpectedMonkey",
"age" : 7
},
"Another Unexpected Name" : {
"name" : "Another Unexpected Name",
"age" : 2
}
}
into:
[
{
"name" : "UnexpectedMonkey",
"age" : 7
},
{
"name" : "Another Unexpected Name",
"age" : 2
}
]
Then I could parse the file as described in How to parse a JSON string to an array using Jackson.
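In case a better option is useful: a minimal sketch, assuming Jackson 2.x, of walking the original top-level object with the streaming JsonParser so only one inner object is in memory at a time and no rewrite is needed (the Entry class and file name are placeholders):
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.core.JsonToken;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.File;

public class LargeObjectReader {

    // Placeholder POJO for the inner objects
    public static class Entry {
        public String name;
        public int age;
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper()
                .disable(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES);
        try (JsonParser parser = mapper.getFactory().createParser(new File("large.json"))) {
            if (parser.nextToken() != JsonToken.START_OBJECT) {
                throw new IllegalStateException("Expected a top-level JSON object");
            }
            // Advance field by field; only one inner object is deserialized at a time
            while (parser.nextToken() == JsonToken.FIELD_NAME) {
                String unknownKey = parser.getCurrentName();
                parser.nextToken(); // move to the START_OBJECT of the value
                Entry entry = mapper.readValue(parser, Entry.class);
                System.out.println(unknownKey + " -> " + entry.name + ", " + entry.age);
            }
        }
    }
}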
Does Google Data Loss Prevention API support .pdf or .docx?
I am trying to do redaction on a *.pdf file in Java to hide sensitive data.
many thanks!
Emi
Currently, the Google Data Loss Prevention API only supports a string of text.
Sample Input:
{
"items":
[
{
"value": "My phone number is (123) 456-7890",
"type": "text/plain"
}
],
"replaceConfigs":
[
{
"replaceWith": "[REDACTED PHONE NUMBER]",
"infoType":
{
"name": "PHONE_NUMBER"
}
}
]
}
URL:
POST https://dlp.googleapis.com/v2beta1/content:redact
Sample Output:
{
"items": [
{
"type": "text/plain",
"value": "My phone number is [REDACTED PHONE NUMBER]"
}
]
}
The methods for streamed-in content support images, text, and binary data. You can stream your PDF through ByteContentItem (https://cloud.google.com/dlp/docs/reference/rpc/google.privacy.dlp.v2#contentitem), or you can convert your PDF to images and scan them as images.
If scanning content in GCS, some PII is detectable from PDFs, but you should test your use cases out.
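A minimal sketch of streaming bytes through ByteContentItem with the DLP v2 Java client, assuming the google-cloud-dlp dependency is available; the project ID, file name, info type, and location below are placeholders:
import com.google.cloud.dlp.v2.DlpServiceClient;
import com.google.privacy.dlp.v2.ByteContentItem;
import com.google.privacy.dlp.v2.ContentItem;
import com.google.privacy.dlp.v2.InfoType;
import com.google.privacy.dlp.v2.InspectConfig;
import com.google.privacy.dlp.v2.InspectContentRequest;
import com.google.privacy.dlp.v2.InspectContentResponse;
import com.google.privacy.dlp.v2.LocationName;
import com.google.protobuf.ByteString;
import java.nio.file.Files;
import java.nio.file.Paths;

public class DlpInspectBytes {
    public static void main(String[] args) throws Exception {
        String projectId = "my-project-id"; // placeholder
        byte[] fileBytes = Files.readAllBytes(Paths.get("page-1.png")); // e.g. a PDF page converted to an image

        try (DlpServiceClient dlp = DlpServiceClient.create()) {
            ByteContentItem byteItem = ByteContentItem.newBuilder()
                    .setType(ByteContentItem.BytesType.IMAGE)
                    .setData(ByteString.copyFrom(fileBytes))
                    .build();
            ContentItem item = ContentItem.newBuilder().setByteItem(byteItem).build();
            InspectConfig config = InspectConfig.newBuilder()
                    .addInfoTypes(InfoType.newBuilder().setName("PHONE_NUMBER").build())
                    .build();
            InspectContentRequest request = InspectContentRequest.newBuilder()
                    .setParent(LocationName.of(projectId, "global").toString())
                    .setItem(item)
                    .setInspectConfig(config)
                    .build();
            InspectContentResponse response = dlp.inspectContent(request);
            response.getResult().getFindingsList()
                    .forEach(f -> System.out.println(f.getInfoType().getName() + " @ " + f.getLikelihood()));
        }
    }
}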
So I'm currently building a game, and I'm trying to parse this JSON in the game. This is what a part of it looks like:
{
"CircuitList": [
{
"name": "GP SILVERSTONE",
"location": "ENGLAND",
"laps": 57,
"Parts":
[
{
"1":{
"type": "straight",
"length": 800
},
"2": {
"type": "sharpturn",
"length": 200
},
This is followed by more parts. Right now, I've parsed the JSON file and used
JSONArray Silverstoneparts = (JSONArray) jsonObject.get("Parts");
to create an array with all the parts. But I don't know how to read out the types and lengths, so if anyone is willing to help, or gently push me in the right direction, it would be highly appreciated :)
JSONArray circuitList = (JSONArray) jsonObject.get("CircuitList");
JSONObject circuit = (JSONObject) circuitList.get(0);
JSONArray parts = (JSONArray) circuit.get("Parts");
It's recommended to use JSONObject when building the JSON string.
You can try parsing the JSON using the Gson library. You should define a data structure with nested arrays/properties matching your JSON data and deserialize the JSON data into native objects.
Native Java objects will let you read the parsed arrays and other properties in your code rather than relying on keys and running loops within loops to extract the values.
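For example, a minimal sketch with Gson, assuming classes that mirror the structure above; the file name is a placeholder, and the numbered part keys ("1", "2", ...) are captured with a Map since their names vary:
import com.google.gson.Gson;
import java.io.FileReader;
import java.io.Reader;
import java.util.List;
import java.util.Map;

public class CircuitParser {

    // Classes mirroring the JSON structure; field names match the JSON keys
    static class Root {
        List<Circuit> CircuitList;
    }

    static class Circuit {
        String name;
        String location;
        int laps;
        // "Parts" is an array holding an object whose keys are "1", "2", ... so a Map captures them
        List<Map<String, Part>> Parts;
    }

    static class Part {
        String type;
        int length;
    }

    public static void main(String[] args) throws Exception {
        try (Reader reader = new FileReader("circuits.json")) { // placeholder file name
            Root root = new Gson().fromJson(reader, Root.class);
            for (Map.Entry<String, Part> e : root.CircuitList.get(0).Parts.get(0).entrySet()) {
                System.out.println(e.getKey() + ": " + e.getValue().type + ", " + e.getValue().length);
            }
        }
    }
}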