I am trying to sort the array node below within my JSON payload. It should be sorted by the name element.
Before:
"empData": [
{
"name": "Jhon",
"age": 33
},
{
"name": "Sam",
"age": 24
},
{
"name": "Mike",
"age": 65
},
{
"name": "Jenny",
"age": 33
}
]
Expected:
"empData": [
{
"name": "Jenny",
"age": 33
},
{
"name": "Jhon",
"age": 33
},
{
"name": "Mike",
"age": 65
},
{
"name": "Sam",
"age": 24
}
]
I was trying the option below:
private static final ObjectMapper SORTED_MAPPER = new ObjectMapper();
static {
    SORTED_MAPPER.configure(SerializationFeature.ORDER_MAP_ENTRIES_BY_KEYS, true);
}

public static JsonNode sortJsonArrayList(final JsonNode node) throws IOException {
    final Object obj = SORTED_MAPPER.treeToValue(node, Object.class);
    final String json = SORTED_MAPPER.writeValueAsString(obj);
    return SORTED_MAPPER.readTree(json);
}
But I am not sure how to select the name key for sorting.
Try something like this:
public JsonNode some(JsonNode node) {
    // find the list of objects that contain the field "name"
    List<JsonNode> dataNodes = node.findParents("name");
    // sort it by the "name" field
    List<JsonNode> sortedDataNodes = dataNodes
            .stream()
            .sorted(Comparator.comparing(o -> o.get("name").asText()))
            .collect(Collectors.toList());
    // return the same JSON structure as in the method parameter
    // (objectMapper is an ObjectMapper instance)
    ArrayNode arrayNode = objectMapper.createArrayNode().addAll(sortedDataNodes);
    return objectMapper.createObjectNode().set("empData", arrayNode);
}
You can debug it step by step and see how it works.
ORDER_MAP_ENTRIES_BY_KEYS is the feature that enables sorting a Map by its keys. As your result type is not a Map, this sorting will not be applied. To sort an array you can create a custom deserializer or sort the deserialized array in place. This answer describes how that can be achieved.
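For the "sort the deserialised array in place" route, here is a minimal sketch (my own illustration, not from the linked answer), assuming node is the parsed root JsonNode and the array sits under an empData field as in the question:
// needs com.fasterxml.jackson.databind.node.ArrayNode and java.util.*
ArrayNode empData = (ArrayNode) node.get("empData");
List<JsonNode> elements = new ArrayList<>();
empData.forEach(elements::add);                                   // copy the children out
elements.sort(Comparator.comparing(e -> e.get("name").asText())); // sort by the "name" field
empData.removeAll();                                              // empty the array node
elements.forEach(empData::add);                                   // add them back in sorted order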
I have two different JSON files.
File A:
{
"label": "A",
"links": [
{
"url": "urla"
}
]
}
File B:
{
"links": [
{
"url": "urlb"
}
]
}
Now I want to update A with the contents of B to get the following result JSON:
{
"label": "A",
"links": [
{
"url": "urlb"
}
]
}
That is, the links array should be fully replaced with the contents of B.
But instead it merges the two arrays:
{
"label": "A",
"links": [
{
"url": "urla",
"url": "urlb"
}
]
}
This is not desired.
The code for the merged file:
JsonNode A = ... // resolved from a service call
JsonNode B = ... // resolved from a service call
ObjectMapper mapper = new ObjectMapper();
JsonNode result = mapper.readerForUpdating(A).readValue(B);
I also tried to set mapper.setDefaultMergeable(false); but it didn't help.
I use com.fasterxml.jackson.core:jackson-core:jar:2.9.8
Can someone help me?
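One possible workaround, offered only as a sketch of my own (not from this thread) and assuming both documents are JSON objects: instead of readerForUpdating, copy B's top-level fields into A with ObjectNode.set, which replaces the links array wholesale rather than merging it element by element.
// Hedged sketch: overwrite A's top-level fields with B's instead of deep-merging them.
ObjectNode target = (ObjectNode) A;
B.fields().forEachRemaining(entry -> target.set(entry.getKey(), entry.getValue()));
// target still has "label": "A", and "links" is now exactly B's array.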
My JSON document is structured like this and is being saved in MongoDB with null values.
{
"userId": "123456",
"session": "string",
"timestamp": 0,
"product": "tracker",
"version": "13",
"flight": "A",
"activity": "search",
"action": "opportunity",
"dataDictionary": {
"datadictionary": {
"query": "roofing",
"docid": 12,
"rank": 1,
"search": {
"id": null
}
},
"id": 40
}
}
I have also tried to add @JsonInclude(JsonInclude.Include.NON_NULL).
My hash map is declared like this:
Map<String, Object >dataDict = new LinkedHashMap<>();
dataDict.values().removeIf(Objects::isNull);
As far as I can tell this should be removing all null values regardless of level/layer in the Map.
The JSON is currently stored like this:
{
"userId": "123456",
"session": "string",
"timestamp": 0,
"product": "tracker",
"version": "13",
"flight": "A",
"activity": "search",
"action": "opportunity",
"dataDictionary": {
"datadictionary": {
"query": "roofing",
"docid": 12,
"rank": 1,
"search": {
"id": null,
"name":"test"
}
},
"id": 40
}
}
It should be stored like this:
{
"userId": "123456",
"session": "string",
"timestamp": 0,
"product": "tracker",
"version": "13",
"flight": "A",
"activity": "search",
"action": "opportunity",
"dataDictionary": {
"datadictionary": {
"query": "roofing",
"docid": 12,
"rank": 1,
"search": {
"name":"test"
}
},
"id": 40
}
}
The problem is that you are removing null values only from the top-level Map.
This map contains values that are themselves other maps, and you don't remove null values from those inner maps.
Use a recursive function to remove all null elements from the inner maps.
The JSON:
{
"topField": null,
"innerMap": {
"innerField": null
}
}
is equivalent to the following maps in Java:
Map map = new LinkedHashMap();
map.put("topField", null);
Map innerMap = new LinkedHashMap();
innerMap.put("innerField", null);
map.put("innerMap", innerMap);
If you apply the null-removal code to map:
map.values().removeIf(Objects::isNull);
the result is a map that is equivalent to the following manually built map:
Map map = new LinkedHashMap();
// map.put("topField", null);
Map innerMap = new LinkedHashMap();
innerMap.put("innerField", null);
map.put("innerMap", innerMap);
because it removes null values from the map, not from innerMap.
You can remove all null elements at any level as follows:
public void removeNull(Map map) {
map.values().removeIf(Objects::isNull);
for (Object value: map.values()) {
if (value instanceof Map) {
// Apply a recursion on inner maps
removeNull((Map) value);
}
}
}
And you can remove all null items as follows:
Map map = ...
removeNull(map);
As mentioned by others, you would need to do it in a recursive manner. For example, if you have a function like removeNulls:
private static boolean removeNulls(final Object o) {
    if (Objects.isNull(o)) {
        return true;
    } else if (o instanceof Map) {
        // recurse into nested maps before deciding whether to keep this entry
        ((Map<?, ?>) o).values().removeIf(MyClass::removeNulls);
    }
    return false;
}
Then you could call it like this:
dataDict.values().removeIf(MyClass::removeNulls);
This is just to serve as an example.
I'm making a spreadsheet using SpreadJS, and I should be able to add, delete and change the value of a key nested inside many objects. Here is how my JSON is formatted:
{
"version": "10.0.0",
"sheets": {
"Sheet1": {
"name": "Sheet1",
"data": {
"dataTable": {
"0": {
"0": {
"value": 129
}
}
}
},
"selections": {
"0": {
"row": 0,
"rowCount": 1,
"col": 0,
"colCount": 1
},
"length": 1
},
"theme": "Office",
"index": 0
}
}
}
The data represents the value of each cell in the spreadsheet ([0,0], [0,1], [1,1], etc.). I want to parse this data into a generic model; for the dataTable field I would like to represent it as Map<Integer, Map<Integer, ValueObj>>, for example in this case <0, <0, 129>>, but I didn't find how to do that or what my model should look like.
I am new to JSON, so any help is appreciated! Thanks.
To handle the data, you can have a generic class like:
class CellData<T> {
    // public field named to match the "value" property in the JSON
    public T value;
}
Then read it as below:
String jsonInput = "{ \"0\": { \"0\": { \"value\": 129 } } }";
ObjectMapper mapper = new ObjectMapper();
TypeReference<Map<Integer, Map<Integer, CellData<Integer>>>> typeRef =
        new TypeReference<Map<Integer, Map<Integer, CellData<Integer>>>>() {};
Map<Integer, Map<Integer, CellData<Integer>>> map = mapper.readValue(jsonInput, typeRef);
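A brief usage sketch (illustrative only; the 129 comes from the sample input above):
CellData<Integer> cell = map.get(0).get(0); // row 0, column 0
System.out.println(cell.value);             // prints 129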
Consider the following Json structure:
{ "ubds": [
{
"id": "33",
"metaData": {
"lineInfo": {
"poNumber": "PO_123",
"poLineNumber": 1
}
},
"confirmedDeliveryDate": "2016-05-26T16:15:51",
"quantity": 99
},
{
"id": "34",
"metaData": {
"lineInfo": {
"poNumber": "PO_123",
"poLineNumber": 2
}
},
"confirmedDeliveryDate": "2016-05-26T16:15:51",
"quantity": 99
},
{
"id": "35",
"metaData": {
"lineInfo": {
"poNumber": "PO_123",
"poLineNumber": 3
}
},
"confirmedDeliveryDate": "2016-05-26T16:15:51",
"quantity": 99
}]}
Using JsonNode, is there a way to get the entire child node {id through quantity} with the poLineNumber attribute value of 3 without having to iterate through all the nodes and returning on a match? Do I need to use JsonPath for this?
You can have a look at JsonPath.
You can first use ObjectMapper to create a Map<String, Object> from the given JSON string, and then evaluate a JsonPath expression against it. For example:
ObjectMapper mapper = new ObjectMapper();
Map<String, Object> mappedObject = mapper.readValue(jsonString, Map.class);
// Evaluate that expression
Object result = JsonPath.read(mappedObject, "$.ubds[?(@.metaData.lineInfo.poLineNumber==3)]");
or read the JSON string directly with JsonPath:
Object result = JsonPath.parse(jsonString).read("$.ubds[?(@.metaData.lineInfo.poLineNumber==3)]");
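Note that a filter expression can match more than one element, so the result is a list. A hedged usage sketch, assuming the Jayway JsonPath library:
List<Map<String, Object>> matches = JsonPath.parse(jsonString)
        .read("$.ubds[?(@.metaData.lineInfo.poLineNumber == 3)]");
System.out.println(matches.size());           // 1 for the sample document
System.out.println(matches.get(0).get("id")); // "35"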
I have got the JSON below:
{
"brands": [
{
"name": "ACC",
"quantity": "0",
"listedbrandID": 1,
"status": "0"
}
],
"others": [
{
"name": "dd",
"quantity": "55"
},
{
"name": "dd",
"quantity": "55"
}
]
}
How can I remove the duplicates from the others JSON array?
I have tried the following:
import java.util.HashMap;

import org.json.JSONArray;
import org.json.JSONException;
import org.json.JSONObject;

String json = "{\"brands\":[{\"name\":\"ACC\",\"quantity\":\"0\",\"listedbrandID\":1,\"status\":\"0\"}],"
        + "\"others\":[{\"name\":\"dd\",\"quantity\":\"55\"},{\"name\":\"dd\",\"quantity\":\"55\"}]}";
JSONObject json_obj = new JSONObject(json);
JSONArray array = json_obj.getJSONArray("others");
HashMap<String, String> map = new HashMap<String, String>();
System.out.println(array.length());
for(int i=0;i<array.length();i++)
{
String name = array.getJSONObject(i).getString("name");
String quantity = array.getJSONObject(i).getString("quantity");
if(name!=null && !name.trim().equals(""))
{
map.put(name, quantity);
}
}
But I have no idea how to remove the duplicates from the JSON array itself; so far they are only de-duplicated inside the Map.
Create an object representing your Others. It will have name and quantity properties and it will also override the equals and hashCode methods, so that two Other objects are considered equal if they have the same name and quantity.
Once you have that, iterate over your JSON, create a new Other object for each entry and place it in a HashSet<Other>. The equals/hashCode contract ensures that the HashSet will contain only unique items, as per your definition of unique.
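A minimal sketch of that idea (class, field and variable names are mine, purely for illustration; it reuses the array variable from your code):
import java.util.LinkedHashSet;
import java.util.Objects;
import java.util.Set;

class Other {
    private final String name;
    private final String quantity;

    Other(String name, String quantity) {
        this.name = name;
        this.quantity = quantity;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Other)) return false;
        Other other = (Other) o;
        return Objects.equals(name, other.name)
                && Objects.equals(quantity, other.quantity);
    }

    @Override
    public int hashCode() {
        return Objects.hash(name, quantity);
    }
}

// Collect unique entries from the "others" array, preserving insertion order.
Set<Other> unique = new LinkedHashSet<>();
for (int i = 0; i < array.length(); i++) {
    JSONObject obj = array.getJSONObject(i);
    unique.add(new Other(obj.getString("name"), obj.getString("quantity")));
}
// Rebuild the "others" JSONArray from the unique set if you need the JSON itself de-duplicated.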