Read @JsonProperty dynamically from config - java

I am developing a Spring Boot application which uses Jackson annotations.
I want to read the value of @JsonProperty from a config instead of using a constant string.
Example JSON input
{"s":12}
Code
I want to read property from my config:
@JsonProperty("${myconfig.fieldAlias.stream}")
private Integer stream;
instead of
@JsonProperty("s")
private Integer stream;
Issue
While executing the code above using the config,
the JSON field "s" is not mapped to stream
unless I use the constant @JsonProperty("s"), which is not desired.
Is it possible to use dynamic JsonProperty values? If so, what is the proper way to do so?

The name given to @JsonProperty must be a compile-time constant. What you can do is overwrite the given name dynamically, by implementing a custom serializer for the property:
public static class StreamSerializer extends JsonSerializer<Integer> {
    @Override
    public void serialize(Integer value, JsonGenerator jsonGenerator, SerializerProvider provider)
            throws IOException {
        jsonGenerator.writeStartObject();
        jsonGenerator.writeNumberField(your_dynamic_name_here, value); // dynamic field name
        jsonGenerator.writeEndObject();
    }
}
and use it like this:
@JsonProperty("s")
@JsonSerialize(using = StreamSerializer.class)
private Integer stream;
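Note that a serializer attached to a field runs after Jackson has already written the field name, so wrapping the value in writeStartObject/writeEndObject produces a nested object rather than a renamed field. One way to truly rename the field from a runtime value is to register the serializer at class level. A minimal sketch, assuming the alias lives in a mutable holder (the holder and class names are illustrative, not from the question; in a Spring Boot app the holder could be filled once at startup from myconfig.fieldAlias.stream):

```java
import java.io.IOException;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.annotation.JsonSerialize;

public class DynamicNameDemo {
    // Hypothetical alias holder; a Spring Boot app could populate this once at
    // startup from the myconfig.fieldAlias.stream property.
    public static volatile String streamAlias = "s";

    // Registered at class level, so the serializer controls the field name itself.
    public static class RecordSerializer extends JsonSerializer<StreamRecord> {
        @Override
        public void serialize(StreamRecord value, JsonGenerator gen, SerializerProvider provider)
                throws IOException {
            gen.writeStartObject();
            gen.writeNumberField(streamAlias, value.stream); // dynamic field name
            gen.writeEndObject();
        }
    }

    @JsonSerialize(using = RecordSerializer.class)
    public static class StreamRecord {
        public final Integer stream;
        public StreamRecord(Integer stream) { this.stream = stream; }
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        System.out.println(mapper.writeValueAsString(new StreamRecord(12))); // {"s":12}
        streamAlias = "streamId";
        System.out.println(mapper.writeValueAsString(new StreamRecord(12))); // {"streamId":12}
    }
}
```

Because the alias is read on every serialize call, a change to the holder takes effect immediately even though Jackson caches the serializer instance.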

Related

Can Jackson's ObjectMapper serialize a Java null as an empty String, in version 2.11.2?

I'm using the com.fasterxml.jackson.databind.ObjectMapper in jackson-databind 2.11.2 and trying to serialize Java properties with a null value to something like this:
{ %Key% : "" }
I've tried:
ObjectMapper MAPPER = new JodaMapper();
DefaultSerializerProvider defaultSerializerProvider = new DefaultSerializerProvider.Impl();
defaultSerializerProvider.setNullValueSerializer(new JsonSerializer<Object>() {
    @Override
    public void serialize(Object value, JsonGenerator gen, SerializerProvider serializers) throws IOException {
        gen.writeString("bla");
    }
});
MAPPER.setSerializerProvider(defaultSerializerProvider);
But the NullValueSerializer's serialize method does not get triggered for any fields.
Has anybody some ideas?
I found the solution: I had
@JsonInclude(JsonInclude.Include.NON_NULL)
at class level on the class that I wanted to serialize. When I remove the annotation, the code above works.
There are a couple of ways to achieve custom null-value serialising:
If you want nulls to be replaced with empty values, try using this annotation on a property or setter (note that @JsonSetter controls how nulls are handled during deserialisation, not serialisation):
@JsonSetter(nulls = Nulls.AS_EMPTY)
or the same for a specific mapper:
MAPPER.configOverride(String.class).setSetterInfo(JsonSetter.Value.forValueNulls(Nulls.AS_EMPTY));
You can initialise properties with default values in the declaration or in the getter.
As you've already mentioned, by providing a custom serialiser.
I did try your code, and it serialised null values as expected when using an ObjectMapper instead of a JodaMapper. Is there any particular reason for using a JodaMapper?
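For reference, the questioner's approach does work with a plain ObjectMapper once the class-level NON_NULL inclusion is removed. A self-contained sketch (the bean and its field name are mine, and the null serializer writes "" instead of "bla"):

```java
import java.io.IOException;
import com.fasterxml.jackson.core.JsonGenerator;
import com.fasterxml.jackson.databind.JsonSerializer;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializerProvider;
import com.fasterxml.jackson.databind.ser.DefaultSerializerProvider;

public class NullAsEmptyStringDemo {
    public static class Bean {
        public String key; // stays null
    }

    // Build a mapper that writes every null value as an empty string.
    public static ObjectMapper buildMapper() {
        ObjectMapper mapper = new ObjectMapper();
        DefaultSerializerProvider provider = new DefaultSerializerProvider.Impl();
        provider.setNullValueSerializer(new JsonSerializer<Object>() {
            @Override
            public void serialize(Object value, JsonGenerator gen, SerializerProvider serializers)
                    throws IOException {
                gen.writeString(""); // null -> ""
            }
        });
        mapper.setSerializerProvider(provider);
        return mapper;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(buildMapper().writeValueAsString(new Bean())); // {"key":""}
    }
}
```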

Jackson JsonInclude.Include.NON_NULL is not working with custom serialiser

I have a custom serialiser which extends JsonSerializer<T>,
and in the ObjectMapper I have included setSerializationInclusion(JsonInclude.Include.NON_NULL).
I still see null fields in the response.
Currently, I ignore them by checking each property for null. I have almost 15 objects and it's very difficult to add null checking to each property. The object I am using is shared by my applications, which is the reason I am using a custom serialiser to name the properties:
@Override
public void serialize(Person personBean, JsonGenerator jgen, SerializerProvider provider) throws IOException {
    if (personBean.getFirstName() != null) {
        jgen.writeStringField("firstname", personBean.getFirstName());
    }
    // etc...
}
How can I avoid the null check for each property and implement some generic code to avoid null values in my serialised response?
Unfortunately, when we write a custom serialiser we need to take care of null values ourselves. To make it at least a little bit better, we can add a new writeStringField method and use it. For example:
class PersonJsonSerializer extends JsonSerializer<Person> {
    @Override
    public void serialize(Person value, JsonGenerator gen, SerializerProvider provider) throws IOException {
        gen.writeStartObject();
        writeStringField(gen, "firstname", value.getFirstName());
        gen.writeEndObject();
    }

    private void writeStringField(JsonGenerator gen, String fieldName, String value) throws IOException {
        if (value != null) {
            gen.writeStringField(fieldName, value);
        }
    }
}
If you only need to change property names, you can use the PropertyNamingStrategy option. There are a few possibilities, like:
LOWER_CASE - Naming convention in which all words of the logical name are in lower case, and no separator is used between words.
KEBAB_CASE - Naming convention used in languages like Lisp, where words are in lower-case letters, separated by hyphens.
For more, check the documentation.
An example ObjectMapper customisation could look like:
ObjectMapper mapper = new ObjectMapper();
mapper.setSerializationInclusion(JsonInclude.Include.NON_NULL);
mapper.setPropertyNamingStrategy(PropertyNamingStrategy.LOWER_CASE);
If there is no predefined strategy which satisfies your needs, you can implement your own PropertyNamingStrategy.
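To illustrate the naming-strategy route (the bean and its values are mine, not from the question): with KEBAB_CASE the firstName property is written as first-name, and NON_NULL drops the null field without any per-property checks.

```java
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.PropertyNamingStrategy;

public class NamingStrategyDemo {
    public static class Person {
        public String firstName = "Ada";
        public String lastName; // null, dropped by NON_NULL
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        mapper.setSerializationInclusion(JsonInclude.Include.NON_NULL);
        mapper.setPropertyNamingStrategy(PropertyNamingStrategy.KEBAB_CASE);
        System.out.println(mapper.writeValueAsString(new Person())); // {"first-name":"Ada"}
    }
}
```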

How to save Java 8 Instant to MongoDB as Date type using Spring MongoTemplate?

I have a Java class having an Instant type of member variable:
public class SomeRecord {
    private String someId;
    private Instant someInstant;
    // getters and setters
}
I am using MongoTemplate to update the someInstant field in database:
public SomeRecord updateSomeRecordBySomeId(final String someId, Object someInstant) {
    Query query = new Query();
    query.addCriteria(Criteria.where("someId").is(someId));
    Update update = new Update();
    update.set("someInstant", someInstant);
    return operations.findAndModify(query, update, new FindAndModifyOptions().returnNew(true), SomeRecord.class);
}
This works great if I am calling the method as:
updateSomeRecordBySomeId("SOME-ID", Instant.now());
persisting the field in DB as a Date type:
"someInstant" : ISODate("2017-07-11T07:26:44.269Z")
Now the method may also be called as:
updateSomeRecordBySomeId("SOME-ID", "2017-07-11T07:26:44.269Z");
In this case I get an exception as:
org.springframework.core.convert.ConverterNotFoundException: No converter found capable of converting from type [java.lang.String] to type [java.time.Instant]
which makes complete sense. (It updates the field in the DB as a String though: "someInstant" : "2017-07-11T07:26:44.269Z")
So I added a converter as follows:
MongoConfig.java:
@Configuration
@ComponentScan(basePackages = {"dao package path here"})
public class MongoConfig {
    @Autowired
    private MongoDbFactory mongoDbFactory;

    @Bean
    public MongoTemplate mongoTemplate() {
        MappingMongoConverter converter = new MappingMongoConverter(new DefaultDbRefResolver(mongoDbFactory),
                new MongoMappingContext());
        converter.setCustomConversions(new CustomConversions(Collections.singletonList(new StringToInstantConverter())));
        return new MongoTemplate(mongoDbFactory, converter);
    }
}
StringToInstantConverter.java:
public class StringToInstantConverter implements Converter<String, Instant> {
    @Override
    public Instant convert(String utcString) {
        // TODO: Make it generic for any time-zone
        return Instant.parse(utcString);
    }
}
After adding the above converter I am no longer getting ConverterNotFoundException, but the someInstant field is still being persisted as a plain string: "someInstant" : "2017-07-11T07:26:44.269Z"
And that is my question. I know the converter is being picked up, which is why I no longer get the exception. But why is the converter not converting the String to an Instant? Why is the field being persisted as a plain String? Is the supplied converter incorrect? How do I write a converter for this case?
Note:
I have simplified the code to focus on the actual problem. In reality the method does not receive the someInstant field as a parameter, so writing an overloaded method is not applicable here. Also, any kind of instanceof check inside the method won't work for the actual scenario. So the focus is on the question: why is the conversion not happening?
The actual data store for us is DocumentDB, but we use DocumentDB with the MongoDB API (as Spring Data does not support DocumentDB) for our database operations.
Your update logic is written in a type-agnostic way: you can pass any object type (Integer, Long, Boolean, String, Date, etc.) and it will be persisted in the DB, overriding the existing value/type with the new value and new type. Note: document-oriented databases like MongoDB have no fixed schema, so stored data can change data types arbitrarily.
The ConverterNotFoundException you had before introducing the converter occurred not during the update action, but while retrieving the updated object and mapping it onto your Java bean model: the Java class declares the someInstant property as an Instant, but the database supplied a String value.
After you introduced the converter, the reading issue was solved, but only for String and Date values. If you update the someInstant property with, say, a boolean value, you will run into the same read/mapping issue again.
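The converter body itself is not the problem: Instant.parse accepts exactly the strict ISO-8601 UTC form shown in the question, which can be verified in isolation with a JDK-only sketch (no Spring Data involved):

```java
import java.time.Instant;

public class InstantParseDemo {
    public static void main(String[] args) {
        // Instant.parse handles the exact ISO-8601 UTC string stored in the document,
        // so the read-side converter's logic is sound; the issue is only *when*
        // Spring Data applies converters (on read, not on Update.set).
        Instant parsed = Instant.parse("2017-07-11T07:26:44.269Z");
        System.out.println(parsed); // 2017-07-11T07:26:44.269Z
    }
}
```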

Custom object deserializes fine in Jax-RS but if it is used in another object it doesn't work

Deserializing works fine if I just pass my custom object through
@POST
public Response saveCustomObject(CustomObject data)
{
    // Prints correct value
    System.out.println(data);
    return Response.ok().build();
}
However, if it is a property on another object, it just gets the default value of my custom object
@POST
public Response saveCustomObjectWrapper(CustomObjectWrapper data)
{
    // Prints incorrect value
    System.out.println(data.getCustomObject());
    return Response.ok().build();
}
My provider is registered and looks like this:
public CustomObject readFrom(Class<CustomObject> type, Type type1, Annotation[] antns, MediaType mt,
        MultivaluedMap<String, String> mm, InputStream in) throws IOException, WebApplicationException
{
    try {
        return new CustomObject(IOUtils.toString(in));
    } catch (Exception ex) {
        throw new ProcessingException("Error deserializing a CustomObject.", ex);
    }
}
The problem is that the readers for all other objects don't do lookup/delegation while unmarshalling. What I mean by that can be seen in this answer, where one reader looks up another reader based on the type. Assuming the format is JSON, whether you're using MOXy (the default with Glassfish) or Jackson, the result is the same: the reader is smart enough to handle the JSON by itself, so it doesn't need to look up any other readers.
One solution would be to create another reader for the wrapper class and do the lookup/delegation, as seen in the link above. If you have a lot of these situations, you may be able to extend the default reader and override its unmarshalling method, but I would strongly advise against this unless you really know what you're doing.
Another solution, depending on the serialiser you're using, is to write a JsonDeserializer (for Jackson) or an XmlAdapter (for MOXy or Jackson). For Jackson, an example would be something like this (you can see a better example here):
public class CustomObjectDeserializer extends JsonDeserializer<CustomObject> {
    @Override
    public CustomObject deserialize(JsonParser jp, DeserializationContext dc)
            throws IOException, JsonProcessingException {
        JsonNode node = jp.getCodec().readTree(jp);
        return new CustomObject("Hello World");
    }
}
@JsonDeserialize(using = CustomObjectDeserializer.class)
public class CustomObject {
    public String message;
    public String getMessage() { return message; }
    public void setMessage(String message) { this.message = message; }
    public CustomObject(String message) { this.message = message; }
    public CustomObject() {}
}
In that case, there is no need for a custom reader at all. This will handle CustomObject instances as well as objects that have a CustomObject as a member. One problem is that I'm not sure how, or if, you can get the InputStream; you just need to use the Jackson APIs to parse the JSON.
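To make the class-level approach concrete, here is a self-contained sketch (the wrapper class and field names are illustrative; unlike the stub above, this deserializer actually reads the parsed tree). The same deserializer is picked up both for a top-level CustomObject and for one nested inside a wrapper, with no custom MessageBodyReader involved:

```java
import java.io.IOException;
import com.fasterxml.jackson.core.JsonParser;
import com.fasterxml.jackson.databind.DeserializationContext;
import com.fasterxml.jackson.databind.JsonDeserializer;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.annotation.JsonDeserialize;

public class WrappedDeserDemo {
    @JsonDeserialize(using = CustomObjectDeserializer.class)
    public static class CustomObject {
        public String message;
        public CustomObject() {}
        public CustomObject(String message) { this.message = message; }
    }

    public static class CustomObjectDeserializer extends JsonDeserializer<CustomObject> {
        @Override
        public CustomObject deserialize(JsonParser jp, DeserializationContext dc) throws IOException {
            JsonNode node = jp.getCodec().readTree(jp); // consume the JSON subtree
            return new CustomObject(node.path("message").asText("Hello World"));
        }
    }

    public static class CustomObjectWrapper {
        public CustomObject customObject;
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        // Direct deserialization uses the class-level deserializer...
        CustomObject direct = mapper.readValue("{\"message\":\"hi\"}", CustomObject.class);
        System.out.println(direct.message); // hi
        // ...and so does deserialization of the nested member.
        CustomObjectWrapper wrapped =
                mapper.readValue("{\"customObject\":{\"message\":\"hi\"}}", CustomObjectWrapper.class);
        System.out.println(wrapped.customObject.message); // hi
    }
}
```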
If you want to use Jackson instead of the default MOXy for glassfish, you can just add the Jackson dependency
<dependency>
<groupId>org.glassfish.jersey.media</groupId>
<artifactId>jersey-media-json-jackson</artifactId>
<version>2.13</version>
</dependency>
Then register the JacksonFeature, or simply disable MOXy, as mentioned here. If you want to continue using MOXy, I don't know whether there is such a thing as a class-level adapter, so you will still need the reader, as well as an XmlAdapter for class members. It's a bit of a hassle, but that's why I recommend Jackson, for many other reasons besides this particular use case. You can see an example of an adapter here.
Now, a lot of this answer is based on the assumption that you are using the JSON format, as you haven't specified the media type. If it is some other format, then maybe your only solution is to create another custom reader for the wrapper.

Jackson not naming fields how I want [duplicate]

This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
Different names of JSON property during serialization and deserialization
I am using Jackson on my site to create an options string to be used with a charting tool that expects JSON. So for example, I have a
public class Chart {
    Integer zIndex = 3;
    public Integer getZIndex() {
        return zIndex;
    }
}
so then I use Jackson's ObjectMapper on my chart, and the output is {"zindex":3}. My issue is that the charting tool will not accept "zindex" but insists on the camel-cased "zIndex".
What can I do to get this named properly in the output?
I've tried @JsonProperty("zIndex"), but this generates two copies in the output, zindex and zIndex, which is confusing and ugly. Also, I am using Lombok to generate my getters, if that makes a difference.
I tried:
public class FieldNamingStrategy extends PropertyNamingStrategy {
    @Override
    public String nameForField(MapperConfig<?> config, AnnotatedField field, String defaultName) {
        return field.getName();
    }
}
and then
objectMapper.setPropertyNamingStrategy()
but this didn't work.
My configuration looks like
String json = null;
StringWriter stringWriter = new StringWriter();
ObjectMapper objectMapper = new ObjectMapper();
objectMapper.setSerializationInclusion(JsonSerialize.Inclusion.NON_NULL);
//TODO: figure this out
objectMapper.setPropertyNamingStrategy(new FieldNamingStrategy());
try {
    final JsonGenerator jsonGenerator = objectMapper.getJsonFactory().createJsonGenerator(stringWriter);
    jsonGenerator.useDefaultPrettyPrinter();
    objectMapper.writeValue(jsonGenerator, object);
    json = stringWriter.toString();
} catch (IOException e) {
    e.printStackTrace();
}
Make sure you use a modern version of Jackson: 1.9 improved the handling of properties, so the annotation works even when added to just one of the pieces (field, getter or setter).
If you cannot do that, add the @JsonProperty annotation to BOTH the getter and the field.
Your main problem is really that the name itself is non-compliant with the bean naming convention, which means the field and getter might not be matched up as one property.
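A minimal sketch of the both-places fix (Lombok omitted for self-containedness; checked with a plain ObjectMapper):

```java
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.databind.ObjectMapper;

public class ZIndexDemo {
    public static class Chart {
        @JsonProperty("zIndex")
        private Integer zIndex = 3;

        @JsonProperty("zIndex")
        public Integer getZIndex() {
            return zIndex;
        }
    }

    public static void main(String[] args) throws Exception {
        // Annotating both the field and the getter with the same name makes
        // Jackson treat them as one property, so the output has a single,
        // correctly cased key instead of the duplicated zindex/zIndex pair.
        System.out.println(new ObjectMapper().writeValueAsString(new Chart())); // {"zIndex":3}
    }
}
```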
