Single file JSON parsing to multiple Java objects - java

I have a JSON file that I want to parse and store in different objects based on a particular value in the JSON, using Java.
I have a file of this type:
{
  "TEST_ID": "INV_CRE_184",
  "TEST_DESCRIPTION": "This validate Invoice Creation by mocking dependency",
  "TEST_PLAN": "LINK",
  "TEST_VARIABLE": [{
    "retailcountryCode": "DE"
  }],
  "TEST_CASE": [{
    "OUTPUT_PLACEHOLDER": "V1",
    "ACTION": "Generate",
    "PARAMETERS": [
      "GenerateVendor",
      "VENDOR",
      "${retailcountryCode}"
    ]
  }, {
    "OUTPUT_PLACEHOLDER": "P1",
    "ACTION": "Generate",
    "PARAMETERS": [
      "GeneratePayee",
      "${V1}",
      "${ofaOrg}"
    ]
  },
I have two objects, V1 and P1, and each TEST_CASE entry must be parsed into the matching object according to its OUTPUT_PLACEHOLDER value.
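One possible sketch, assuming Jackson is available: read the file into a tree, walk the TEST_CASE array, and key each entry by its OUTPUT_PLACEHOLDER so the "V1" and "P1" entries end up in separate objects. The TestCase holder class and the file name test.json below are placeholders, not part of the original question.

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.io.File;
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class TestCaseDispatcher {

    // Hypothetical holder type; replace with the real V1/P1 classes.
    public static class TestCase {
        public String outputPlaceholder;
        public String action;
        public List<String> parameters = new ArrayList<>();
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        JsonNode root = mapper.readTree(new File("test.json")); // assumed file name

        Map<String, TestCase> byPlaceholder = new HashMap<>();
        for (JsonNode node : root.path("TEST_CASE")) {
            TestCase tc = new TestCase();
            tc.outputPlaceholder = node.path("OUTPUT_PLACEHOLDER").asText();
            tc.action = node.path("ACTION").asText();
            for (JsonNode param : node.path("PARAMETERS")) {
                tc.parameters.add(param.asText());
            }
            // The placeholder value ("V1", "P1", ...) decides which object the entry feeds.
            byPlaceholder.put(tc.outputPlaceholder, tc);
        }

        TestCase v1 = byPlaceholder.get("V1");
        TestCase p1 = byPlaceholder.get("P1");
        System.out.println(v1.action + " / " + p1.action);
    }
}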

Related

Flattening a heavily nested JSON in Java - Time Complexity

{
  "id": "12345678",
  "data": {
    "address": {
      "street": "Address 1",
      "locality": "test loc",
      "region": "USA"
    },
    "country_of_residence": "USA",
    "date_of_birth": {
      "month": 2,
      "year": 1988
    },
    "links": {
      "self": "https://testurl"
    },
    "name": "John Doe",
    "nationality": "XY",
    "other": [
      {
        "key1": "value1",
        "key2": "value2"
      },
      {
        "key1": "value1",
        "key2": "value2"
      }
    ],
    "notified_on": "2016-04-06"
  }
}
I am trying to read data from a GraphQL API that returns a paginated JSON response, and I need to write it into a CSV. I have been exploring Spring Batch for the implementation, where I would read the JSON data in the ItemReader, flatten each JSON entry in the ItemProcessor, and then write the flattened data to a CSV in the ItemWriter. While I could use something like Jackson for flattening the JSON, I am concerned about possible performance implications if the JSON data is heavily nested.
expected output:
id, data.address.street, data.address.locality, data.address.region, data.country_of_residence, data.date_of_birth.month, data.date_of_birth.year, data.links.self, data.name, data.nationality, data.other (using jsonPath), data.notified_on
I need to process more than a million records. While I believe flattening the JSON would be a linear operation, O(n), I was still wondering whether there could be other caveats if the JSON structure gets severely nested.
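For reference, a single-pass recursive flatten over a Jackson tree visits each node once, so it stays linear in the number of nodes regardless of nesting depth; the practical caveats are stack depth on pathologically deep documents and memory if whole pages are held in RAM. A minimal sketch (class and method names are mine, not from the question; arrays are kept as raw JSON rather than expanded via JsonPath):

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

public class JsonFlattener {

    // Walks the tree once and builds dot-separated column names.
    public static Map<String, String> flatten(JsonNode node) {
        Map<String, String> out = new LinkedHashMap<>();
        flatten("", node, out);
        return out;
    }

    private static void flatten(String prefix, JsonNode node, Map<String, String> out) {
        if (node.isObject()) {
            Iterator<Map.Entry<String, JsonNode>> fields = node.fields();
            while (fields.hasNext()) {
                Map.Entry<String, JsonNode> e = fields.next();
                String key = prefix.isEmpty() ? e.getKey() : prefix + "." + e.getKey();
                flatten(key, e.getValue(), out);
            }
        } else if (node.isArray()) {
            // Arrays are stored as their raw JSON text, matching the "data.other (using jsonPath)" column.
            out.put(prefix, node.toString());
        } else {
            out.put(prefix, node.asText());
        }
    }

    public static void main(String[] args) throws Exception {
        JsonNode root = new ObjectMapper().readTree(
                "{\"id\":\"12345678\",\"data\":{\"name\":\"John Doe\",\"address\":{\"region\":\"USA\"}}}");
        flatten(root).forEach((k, v) -> System.out.println(k + " = " + v));
    }
}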

How to create a Sql Statement from a mutable Json File using Java

I'm trying to create SQL tables from a JSON file written following the OpenAPI Specification. Here is an example of an input file I must convert:
"definitions": {
"Order": {
"type": "object",
"properties": {
"id": {
"type": "integer",
"format": "int64"
},
"petId": {
"type": "integer",
"format": "int64"
},
"quantity": {
"type": "integer",
"format": "int32"
},
"shipDate": {
"type": "string",
"format": "date-time"
},
"status": {
"type": "string",
"description": "Order Status",
"enum": [
"placed",
"approved",
"delivered"
]
},
"complete": {
"type": "boolean",
"default": false
}
},
"xml": {
"name": "Order"
}
},
"Category": {
"type": "object",
"properties": {
"id": {
"type": "integer",
"format": "int64"
},
"name": {
"type": "string"
}
},
"xml": {
"name": "Category"
}
},
My aim is to create two tables named "Order" and "Category" whose columns must be the ones listed in the "properties" field. I'm using Java.
The input file is mutable, so I used Gson to read it. I managed to get an output like this:
CREATE TABLE ORDER
COLUMNS:
id->
type: integer
format: int64
petId->
type: integer
format: int64
quantity->
type: integer
format: int32
shipDate->
type: string
format: date-time
status->
type: string
description: Order Status
Possibilities:
-placed
-approved
-delivered
complete->
type: boolean
default: false
CREATE TABLE CATEGORY
COLUMNS:
id->
type: integer
format: int64
name->
type: string
I'm stuck here, trying to convert the "type" and "format" fields into a type that can be read by PostgreSQL or MySQL. Furthermore, it is hard to work directly on the code to get a readable SQL string due to the presence of nesting, so I thought it might be a good idea to work on the output and "translate" it to SQL. Is there any class/package that could help me read a file like this? I'm trying to avoid using thousands of IF/ELSE conditions. Thank you.
Your assignment involves two phases: first, parsing the given JSON object and understanding its content; second, translating the parsed content into a working SQL query. Your Java program should work as a kind of translation engine. Many Java libraries are available for parsing the JSON objects. To translate the parsed JSON into an SQL query, you can simply use basic string manipulation methods.
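To avoid long IF/ELSE chains, one option is a small lookup table keyed by the OpenAPI type/format pair. A minimal sketch, with SQL type names chosen for PostgreSQL (the exact mapping is my assumption; adjust it for MySQL or your own conventions):

import java.util.HashMap;
import java.util.Map;

public class OpenApiTypeMapper {

    // Keyed by "type:format" ("" when no format is given); extend as needed.
    private static final Map<String, String> SQL_TYPES = new HashMap<>();
    static {
        SQL_TYPES.put("integer:int32", "INTEGER");
        SQL_TYPES.put("integer:int64", "BIGINT");
        SQL_TYPES.put("string:", "VARCHAR(255)");
        SQL_TYPES.put("string:date-time", "TIMESTAMP");
        SQL_TYPES.put("boolean:", "BOOLEAN");
        SQL_TYPES.put("number:float", "REAL");
        SQL_TYPES.put("number:double", "DOUBLE PRECISION");
    }

    public static String toSqlType(String type, String format) {
        String key = type + ":" + (format == null ? "" : format);
        String sqlType = SQL_TYPES.get(key);
        if (sqlType == null) {
            // Fall back to the bare type when the format is unknown.
            sqlType = SQL_TYPES.getOrDefault(type + ":", "TEXT");
        }
        return sqlType;
    }

    public static void main(String[] args) {
        System.out.println(toSqlType("integer", "int64"));    // BIGINT
        System.out.println(toSqlType("string", "date-time")); // TIMESTAMP
        System.out.println(toSqlType("string", null));        // VARCHAR(255)
    }
}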

Java - Optimal way to overwrite values from a subset JSON document to a superset JSON document

I have the superset JSON and subset JSON below. In my Java code, I need to parse the superset JSON, and for every element in the propList that has the overwrite flag set to true, I need to read the value from the subset JSON (which has a similar structure to the superset JSON) and overwrite it.
What is the optimal way to achieve this in Java? The JSON document can be quite big.
Superset JSON:
{
  "configList": [
    {
      "configElement": "elem1",
      "propList": [
        {
          "property": "prop1",
          "value": "val1",
          "overwriteValueFromSubset": false
        },
        {
          "property": "prop2",
          "value": "",
          "overwriteValueFromSubset": true
        },
        {
          "property": "prop3",
          "value": "val3",
          "overwriteValueFromSubset": false
        }
      ]
    },
    {
      "configElement": "elem2",
      "propList": [
        {
          "property": "prop1",
          "value": "val1",
          "overwriteValueFromSubset": false
        },
        {
          "property": "prop2",
          "value": "val2",
          "overwriteValueFromSubset": false
        },
        {
          "property": "prop3",
          "value": "",
          "overwriteValueFromSubset": true
        }
      ]
    }
  ]
}
Subset JSON:
{
  "configList": [
    {
      "configElement": "elem1",
      "propList": [
        {
          "property": "prop2",
          "value": "new_value"
        }
      ]
    },
    {
      "configElement": "elem2",
      "propList": [
        {
          "property": "prop3",
          "value": "new_value"
        }
      ]
    }
  ]
}
Assuming you can fit all the data in memory: Google "how to convert Java POJO to JSON object" (and vice versa). So the answer (without actually coding it for you) is to convert it into a big Java object graph, process the POJOs, and then just call whatever method saves it back out as JSON. The Jackson libraries are what you want here; the object is called a 'mapper'. That should be enough tips to get you going.
https://www.mkyong.com/java/how-to-convert-java-object-to-from-json-jackson/
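A minimal sketch of that POJO approach with Jackson, assuming the structure shown above and hypothetical file names superset.json / subset.json. For very large documents you would index the subset by configElement and property up front instead of scanning it per lookup, but the shape of the merge is the same.

import com.fasterxml.jackson.databind.ObjectMapper;

import java.io.File;
import java.util.List;

public class ConfigMerger {

    // POJOs mirroring the JSON above; field names must match the JSON keys.
    public static class Config { public List<ConfigElement> configList; }
    public static class ConfigElement { public String configElement; public List<Prop> propList; }
    public static class Prop { public String property; public String value; public Boolean overwriteValueFromSubset; }

    public static void merge(Config superset, Config subset) {
        for (ConfigElement supElem : superset.configList) {
            for (Prop supProp : supElem.propList) {
                if (Boolean.TRUE.equals(supProp.overwriteValueFromSubset)) {
                    // Find the matching element/property in the subset and copy its value.
                    subset.configList.stream()
                            .filter(e -> e.configElement.equals(supElem.configElement))
                            .flatMap(e -> e.propList.stream())
                            .filter(p -> p.property.equals(supProp.property))
                            .findFirst()
                            .ifPresent(p -> supProp.value = p.value);
                }
            }
        }
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        Config superset = mapper.readValue(new File("superset.json"), Config.class); // assumed paths
        Config subset = mapper.readValue(new File("subset.json"), Config.class);
        merge(superset, subset);
        mapper.writerWithDefaultPrettyPrinter().writeValue(new File("merged.json"), superset);
    }
}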

Jackson mapping same JSON nodes with different names as key

I'm working with a RESTful web service in Android, and I'm using Spring for Android with Jackson for the first time.
I'm using this generator to generate the Java classes, but I run into trouble when objects of the same kind inside the JSON have different key names:
"a2e4ea4a-0a29-4385-b510-2ca6df65db1c": {
"url": "//url1.jpg",
"ext": "jpg",
"name": "adobe xm0 ",
"children": {},
"tree_key": []
},
"d3ff3921-e084-4812-bc49-6a7431b6ce52": {
"url": "https://www.youtube.com/watch?v=myvideo",
"ext": "video",
"name": "youtube example",
"children": {},
"tree_key": []
},
"151b5d60-8f41-4f38-8b67-fe875c3f0381": {
"url": "https://vimeo.com/channels/staffpicks/something",
"ext": "video",
"name": "vimeo example",
"children": {},
"tree_key": []
}
All three nodes are of the same kind and can be mapped to the same object, but the generator creates a separate class for each node, each with a different name.
Thanks for the help.
With Jackson, you can use Map map = new ObjectMapper().readValue(<insert object here>, Map.class);
as mentioned by Programmer Bruce here.
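A slightly more typed variant of the same idea, assuming a single hypothetical Node class and Jackson's TypeReference, so the UUID keys become map keys instead of generated class names:

import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.List;
import java.util.Map;

public class TreeNodeExample {

    // One class for every UUID-keyed entry; the field names match the JSON keys.
    public static class Node {
        public String url;
        public String ext;
        public String name;
        public Map<String, Node> children;
        public List<Object> tree_key;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"a2e4ea4a-0a29-4385-b510-2ca6df65db1c\":"
                + "{\"url\":\"//url1.jpg\",\"ext\":\"jpg\",\"name\":\"adobe xm0 \","
                + "\"children\":{},\"tree_key\":[]}}";

        // The UUIDs are read as map keys rather than driving per-node classes.
        Map<String, Node> nodes = new ObjectMapper()
                .readValue(json, new TypeReference<Map<String, Node>>() {});

        nodes.forEach((id, node) -> System.out.println(id + " -> " + node.name));
    }
}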

Ignore Null Value Fields From Rest API Response Java

In my project, when I send a REST response to Advanced REST Client, it only shows fields which have some value and ignores (does not show) fields which have null or empty values.
Part Of Code:
Gson gson=new Gson();
// firstResponse is the object that contains the values
String jsonString = gson.toJson(firstResponse);
test.saveJson(jsonString); //OR System.out.println(jsonString);
return Response.ok(firstResponse).build(); // Response to Rest Client
Sample response for return Response.ok(firstResponse).build(); as shown in Advanced REST Client from the web project:
{
  "Name": "smith",
  "Properties": {
    "propertyList": [
      {
        "ID": "072",
        "Number": "415151",
        "Address": "Somewhere"
      },
      {
        "ID": "151",
        "Number": "a800cc79-99d1-42f1-aeb4-808087b12c9b",
        "Address": "ninink"
      },
      {
        "ID": "269"
      }
    ]
  }
}
Now, when I save this as a JSON string in the DB, or when I want to print it to the console, it also prints the fields with null or empty values:
{
  "Name": "smith",
  "Properties": {
    "propertyList": [
      {
        "ID": "072",
        "Number": "415151",
        "Address": "Somewhere"
      },
      {
        "ID": "151",
        "Number": "a800cc79-99d1-42f1-aeb4-808087b12c9b",
        "Address": "ninink"
      },
      {
        "ID": "269",
        "Number": "",
        "Address": ""
      }
    ]
  },
  "resultList": []
}
How can I print or save this JSON string the same as the response in the REST client? I.e., I don't want to print null- or empty-value fields; I just want to ignore them.
On top of the entity class, try the annotation
@JsonInclude(Include.NON_EMPTY)
This annotation keeps any empty fields out of your JSON.
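Note that the annotation above is Jackson's, so it only takes effect if the response is serialized with Jackson rather than the Gson call shown in the question. A minimal sketch of the effect, with illustrative class and field names:

import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JsonIncludeExample {

    // NON_EMPTY drops both null values and empty strings/collections on serialization.
    @JsonInclude(JsonInclude.Include.NON_EMPTY)
    public static class Property {
        public String ID;
        public String Number;
        public String Address;
    }

    public static void main(String[] args) throws Exception {
        Property p = new Property();
        p.ID = "269";
        p.Number = "";    // empty -> omitted
        p.Address = null; // null  -> omitted

        System.out.println(new ObjectMapper().writeValueAsString(p));
        // prints: {"ID":"269"}
    }
}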
Not giving you code, but here are some pointers for you:
- Read the manual: How to handle NULL values
- You may need to use a custom exclusion strategy
- Also read this Q&A: Gson: How to exclude specific fields from Serialization without annotations
