Utilize the JSON file in a MuleSoft Java class - java

I have a JSON file and imported it into my resources folder in MuleSoft. I am trying to pass it to a Mule Java class to do some calculations with the values in it. Do I need a JSON to Object transformer, or can I pass the JSON data directly to the Java class? A flow explanation would be really helpful. Thanks

Of course you could pass the JSON as a string to your custom Java component, but it is more convenient to work with Java objects.
You can use json-to-object-transformer to convert your JSON to a generic Java object (java.util.Map) like this:
<flow name="flow">
    <json:json-to-object-transformer returnClass="java.util.Map" doc:name="JSON to Object"/>
    <!-- ... -->
</flow>
Now the payload is an instance of java.util.HashMap containing the values from your JSON.
If you have a class representing the data in your JSON, then replace java.util.Map with the fully qualified name of that class, and json-to-object-transformer will return an instance of it.
Take a look at "Using the Transformers Explicitly" here: https://docs.mulesoft.com/mule-user-guide/v/3.7/native-support-for-json
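To show what the Java side might look like, here is a minimal sketch of a custom Java component (a Mule 3 Callable) that receives that Map payload. The class name and the price/quantity fields are made up for illustration; replace them with whatever your JSON actually contains:

package com.example;

import java.util.Map;

import org.mule.api.MuleEventContext;
import org.mule.api.lifecycle.Callable;

public class PriceCalculator implements Callable {

    @Override
    public Object onCall(MuleEventContext eventContext) throws Exception {
        // The payload is the Map produced by json-to-object-transformer
        @SuppressWarnings("unchecked")
        Map<String, Object> json = (Map<String, Object>) eventContext.getMessage().getPayload();

        // Example calculation with values from the JSON (assumed numeric fields)
        double price = ((Number) json.get("price")).doubleValue();
        int quantity = ((Number) json.get("quantity")).intValue();
        json.put("total", price * quantity);

        return json; // becomes the new message payload
    }
}

In the flow, the class is wired in after the transformer with a component element whose class attribute points at com.example.PriceCalculator.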

In order to pass the JSON file from your resources folder to the json-to-object-transformer, you can also use the parse-template component, which reads the file and sets its contents as the message payload.

Related

How to create a JSON String from a list of objects and a JSON schema?

I have a list of objects which need to be converted to JSON. I also have a JSON schema that describes how the objects should be placed in the JSON to be created. How can I achieve this? I cannot seem to find any references on the internet. Is this possible? Any help would be much appreciated.
If it's JavaScript, you can do it using Ajv and custom keywords that would generate the object you need as a side effect of the validation process.
Most likely you would have to define a template that will be validated, and the data that needs to be embedded in this template will be passed as a context into the validation function:
var validate = ajv.compile(schema);
var context = { data: { /* ... */ } };
validate.call(context, template);
console.log(template); // template with inserted data
There is no Java technology that uses a JSON schema to influence serialization that I am aware of. If you use a library such as Jackson to serialize, it is up to you to use the available customization mechanisms to make any changes to the defaults needed to conform to the schema.
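As a hedged illustration of that kind of customization, Jackson's property annotations can make the serialized field names line up with what a schema expects. The Person class and the field names below are invented for the example:

import java.util.Arrays;
import java.util.List;

import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.ObjectMapper;

public class SchemaAlignedSerialization {

    // Hypothetical POJO; @JsonProperty renames the fields to match the schema
    public static class Person {
        @JsonProperty("full_name")
        public String name;
        @JsonProperty("age_years")
        public int age;

        public Person(String name, int age) {
            this.name = name;
            this.age = age;
        }
    }

    public static void main(String[] args) throws Exception {
        List<Person> people = Arrays.asList(new Person("Ada", 36), new Person("Alan", 41));
        String json = new ObjectMapper().writeValueAsString(people);
        System.out.println(json);
        // [{"full_name":"Ada","age_years":36},{"full_name":"Alan","age_years":41}]
    }
}

Validating the output against the schema afterwards would still be a separate step.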

using JsonObject as entity for Jersey 2 response

I have this simple code:
package com.example;

import javax.json.Json;
import javax.json.JsonObject;
...

@Path("/")
public class Resource {

    @GET
    @Produces(MediaType.APPLICATION_JSON)
    public Response defaultEntry() {
        JsonObject result = Json.createObjectBuilder()
                .add("hello", "world")
                .build();
        return Response.status(200).entity(result.toString()).build();
    }
}
I am new to Java. Could someone please explain why, if I omit the call to result.toString() and simply pass result to .entity() (like so: return Response.status(200).entity(result).build()), I get JSON on the client that includes type information etc., but not what I expected:
{"hello":{"chars":"world","string":"world","valueType":"STRING"}}
What is the intention of this? How is passing a JsonObject to it different from passing a String?
Also, I could not find the Response.entity method in the documentation (https://jersey.java.net/apidocs/2.11/jersey/javax/ws/rs/core/Response.html). I copied this code from a tutorial that did not properly explain what is going on...
I wish I had a better answer for you; this is more of a hint until a better answer arrives. There are a few moving parts here. JsonObject is an interface, and its implementation is not specified. Furthermore, there is a JSON serializer that is turning your returned objects into JSON text. It is both of these things together that lead to this output. When you used the .toString() variation, the serializer just returned the String as is. But when you return the JsonObject, you have two dynamics at play: the implementation of the JsonObject and the implementation of the serializer. Since you are using Jersey 2, you could be using the Jackson, MOXy, or Jettison serializers. These might all produce different output when serializing the JsonObject, but we would have to test to be sure. Furthermore, the JsonObject implementation might be configured in a way that, when serialized by your chosen serializer, leads to its output being a schema-like description rather than regular JSON. This can be done using annotations specific to the chosen JSON serializer.
In my career I have used multiple JSON serializers. Jackson is probably the most popular one out there, but I have also used Gson extensively. In one project we configured Gson in a way where its serialized output came out as a schema-like description when serializing POJOs. So it's not far-fetched for a JSON serializer to produce this kind of output under certain conditions.
When serializing POJOs (aka Java Beans) with your serializer's default settings, you expect regular JSON output. But when sending back objects that can have complex interworkings with specific JSON serializers, you may get varying output.
In this situation you would have to run tests to dive deeper into what is going on. For example, I would first test the serializer against a POJO that matches the JsonObject you created. Then I would also test other JSON serializers on the same JsonObject and see if you can pick up on a pattern.
The datatype module jackson-datatype-jsr353 provides support for the javax.json types. The page includes instructions on how to add the dependency and register the module with Jackson's ObjectMapper.
Starting with Jackson 2.11.0, the module moved under the umbrella of jackson-datatypes-misc.
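For illustration, a minimal sketch of registering that module directly on an ObjectMapper (in a Jersey application you would typically expose the configured mapper through a ContextResolver<ObjectMapper>); with it in place, a JsonObject serializes as plain JSON instead of the type-describing structure above:

import javax.json.Json;
import javax.json.JsonObject;

import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.datatype.jsr353.JSR353Module;

public class Jsr353Demo {
    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper().registerModule(new JSR353Module());

        JsonObject result = Json.createObjectBuilder()
                .add("hello", "world")
                .build();

        // Serializes the javax.json structure as regular JSON
        System.out.println(mapper.writeValueAsString(result)); // {"hello":"world"}
    }
}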

Mule Anypoint Studio Passing in JSON and working in Java

I'm not too familiar with Anypoint and we will probably only use this program once. I've looked at tutorials on the website, but I can't find one that demonstrates the task we have. Basically we are trying to read a JSON file that will be sent to the software; from there we want to use Java to read/alter the contents into the desired XML format. Finally, we send back or redirect the XML result.
1: I am trying to accept a JSON file from an HTTP POST. I believe I have accomplished this part by using the tutorials and an HTTP element with metadata attached representing the JSON format that may come, using Postman to send the JSON data.
2: From here is where I start to get completely confused. I am wondering what I would need to do in order to pass the data into a Java class object, read the JSON, and begin using Java code (getters/setters) instead of the Anypoint interface in order to start designing the XML layout.
Thanks,
I think the next step is for you to define a model class (with getters/setters) that represents your JSON file's contents. Since you already have the POST part, the next thing to do is use a JSON to Object transformer with that model as the return class:
<json:json-to-object-transformer returnClass="com.alexfrndz.Person" doc:name="JSON to Object"/>
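For completeness, a sketch of what the referenced com.alexfrndz.Person model might look like; the field names here are assumptions and should match the keys in your actual JSON:

package com.alexfrndz;

public class Person {

    private String firstName;
    private String lastName;
    private int age;

    public String getFirstName() { return firstName; }
    public void setFirstName(String firstName) { this.firstName = firstName; }

    public String getLastName() { return lastName; }
    public void setLastName(String lastName) { this.lastName = lastName; }

    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
}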
After adding the transformer, you can use a Mule custom transformer to turn the Person into your XML.
Here is the custom transformer:
package com.alexfrndz;

import org.mule.api.MuleMessage;
import org.mule.api.transformer.TransformerException;
import org.mule.transformer.AbstractMessageTransformer;

public class PersonTransformer extends AbstractMessageTransformer {

    @Override
    public Object transformMessage(MuleMessage message, String outputEncoding) throws TransformerException {
        Person person = (Person) message.getPayload();
        // Do your transformation here and return the result
        return person;
    }
}
Here is how you register it in the flow:
<custom-transformer class="com.alexfrndz.PersonTransformer" doc:name="PersonTransformer"/>
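As a hedged sketch of what the "do your transformation here" step could look like, the transformMessage body might build the XML by hand from the (assumed) Person fields; for richer payloads, JAXB or another XML binding library may be preferable:

@Override
public Object transformMessage(MuleMessage message, String outputEncoding) throws TransformerException {
    Person person = (Person) message.getPayload();
    // Hand-built XML from the assumed Person fields; the flow's payload becomes this string
    return String.format(
            "<person><firstName>%s</firstName><lastName>%s</lastName><age>%d</age></person>",
            person.getFirstName(), person.getLastName(), person.getAge());
}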
Hope this will help you.

Jackson Deserialize To Concrete Class Based On Type

I have what I believe should be a simple use case.
I would like to serialize a POJO with type metadata (preferably a simple name I come up with, not the fully qualified class/package name), and later have Jackson deserialize the JSON back into the concrete class it came from by using this metadata. There is no inheritance hierarchy among classes being serialized and deserialized.
My scenario is I have a service which accepts multiple file types. For each file uploaded, the client can retrieve JSON data whose structure and type depends on the file it came from. Thus when I retrieve JSON from the service, it's not known what the concrete class is to deserialize to. I would like Jackson to figure this out based on metadata which it supplies.
For example, I'd like to be able to do this:
String json = ... // get JSON from the service
Object obj = mapper.readValue(json, Object.class); // concrete class is not known
System.out.println(obj.getClass()); // I want this to be MyConcreteClass.class
There is no inheritance hierarchy among JSON types returned.
I don't want to reveal package names or other internal service details/structure.
I have control over Jackson's serialization process.
Relevant question: Can jackson determine root object type to deserialize to when json includes type property?
Thank you so much for your help!
This can be achieved using Jackson's JavaType:
String className = "class.name.from.json.service";
JavaType dtoType = TypeFactory.defaultInstance().constructFromCanonical(className);
Object dto = new ObjectMapper().readValue(json, dtoType); // json is the String returned by the service
assert dto.getClass().equals(dtoType.getRawClass());
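If exposing canonical class names is a concern (as the question mentions), one hedged alternative is to keep a small registry that maps your own short type names to classes on both sides. The type names and DTO classes below are invented for illustration:

import java.util.HashMap;
import java.util.Map;

import com.fasterxml.jackson.databind.ObjectMapper;

public class TypeRegistryExample {

    // Hypothetical DTOs standing in for the real per-file-type classes
    public static class InvoiceDto { public String number; }
    public static class ReportDto { public String title; }

    // Short logical names chosen by the service; the concrete classes stay internal
    private static final Map<String, Class<?>> TYPES = new HashMap<String, Class<?>>();
    static {
        TYPES.put("invoice", InvoiceDto.class);
        TYPES.put("report", ReportDto.class);
    }

    public static Object read(String typeName, String json) throws Exception {
        Class<?> target = TYPES.get(typeName);
        if (target == null) {
            throw new IllegalArgumentException("Unknown type: " + typeName);
        }
        return new ObjectMapper().readValue(json, target);
    }

    public static void main(String[] args) throws Exception {
        Object dto = read("invoice", "{\"number\":\"INV-1\"}");
        System.out.println(dto.getClass().getSimpleName()); // InvoiceDto
    }
}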

Can the XPage JSON library automatically convert a Java Bean to a JSON representation?

I want to use the com.ibm.commons.util.io.json.* library which comes with the XPages runtime to serialise a Java Bean into JSON.
The question is: can it do it automatically by just passing it the object, like you can with the Google Gson library (http://code.google.com/p/google-gson/), or do you need to construct the JSON manually, by which I mean passing the individual properties to build it up?
Having trouble locating the documentation for this library, though I have seen some examples:
http://www.openntf.org/internal/home.nsf/project.xsp?action=openDocument&name=JSON%20and%20REST%20Samples
http://www-10.lotus.com/ldd/ddwiki.nsf/dx/Sending_requests_in_Java_dds10
Ideally we don't want to use a third-party library, even though it works great, because we would need to modify the Java security properties file, which in turn gets wiped if the server is upgraded.
The com.ibm.commons.util.io.json library is a generic library for converting JSON representations to Java objects and back. By generic, I mean that it uses a factory to both browse and update the Java objects (see: JsonFactory). By implementing such a factory, and implementing the getters/setters for all the properties, one can serialize/deserialize any kind of object.
The JSON library is equipped with a set of predefined factories:
JsonJavaFactory, which maps JSON objects to Java Maps (with an extended version that uses a JsonJavaObject wrapper, which is more convenient)
JsonJavaScriptFactory, which maps JSON objects to actual JavaScript objects (see: ObjectObject) and Java values (String, Integer...) to JavaScript values (FBSString, FBSNumber...). These objects can be used directly by the server-side JS engine.
We don't have a factory for JavaBeans per se, but implementing such a factory should not be a big deal.
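As a small, hedged illustration of the Map-based factory (assuming the usual com.ibm.commons.util.io.json API), a java.util.Map can be turned into JSON text like this; the keys and values are arbitrary:

import java.util.HashMap;
import java.util.Map;

import com.ibm.commons.util.io.json.JsonGenerator;
import com.ibm.commons.util.io.json.JsonJavaFactory;

public class MapToJsonDemo {
    public static void main(String[] args) throws Exception {
        // Values are copied into a Map by hand; the factory does no bean introspection
        Map<String, Object> person = new HashMap<String, Object>();
        person.put("firstName", "Ada");
        person.put("age", 36);

        String json = JsonGenerator.toJson(JsonJavaFactory.instanceEx, person);
        System.out.println(json); // e.g. {"firstName":"Ada","age":36} (key order may vary)
    }
}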
The IBM Commons library for JSON works by constructing an object and then adding JSON properties to it. It cannot auto-serialize an object, and it really only works with primitive data types.
I've attached some SSJS code to illustrate how to work with the class. It assumes recordMap is a Java Map instance with some beans in it, and each bean has 5 fields named fieldName1 through fieldName5. The code iterates through each bean in the map, retrieves the 5 fields, converts the values to JSON, then pushes them into an array. Finally the array is put inside another JSON object that includes the count and the array itself.
var jsonObjArr = [];
var itr:java.util.Iterator = recordMap.keySet().iterator();
while (itr.hasNext()) {
    var record = recordMap.get(itr.next());
    var jsonObj:com.ibm.commons.util.io.json.JsonJavaObject =
        new com.ibm.commons.util.io.json.JsonJavaObject();
    jsonObj.putJsonProperty("fieldName1", record.getFieldName1());
    jsonObj.putJsonProperty("fieldName2", record.getFieldName2());
    jsonObj.putJsonProperty("fieldName3", record.getFieldName3());
    jsonObj.putJsonProperty("fieldName4", record.getFieldName4());
    jsonObj.putJsonProperty("fieldName5", record.getFieldName5());
    jsonObjArr.push(com.ibm.commons.util.io.json.JsonGenerator
        .toJson(com.ibm.commons.util.io.json.JsonJavaFactory.instanceEx, jsonObj));
}
var jsonString = "{" +
    "\"count\":" + @Text(jsonObjArr.length) + "," +
    "\"employees\":" + "[" + jsonObjArr.join(",") + "]" +
    "}";
return jsonString;
Hope this helps..
