Custom serialization and deserialization in Scala using Jackson (Java)

I have a JSON string which I am deserializing into an instance of the Scala case class MyPOJO. My data is in yyyy-MM-dd format, but the actual attribute in the POJO, createdBy, is a LocalDateTime.
How do I assign a default time value while instantiating the POJO, so that 2020-03-02 becomes 2020-03-02 00:00:00?
Serialization should return the yyyy-MM-dd format, so my serialization and deserialization formats are different.
import java.time.LocalDateTime
import com.fasterxml.jackson.annotation.JsonFormat
import com.fasterxml.jackson.databind.{DeserializationFeature, ObjectMapper, SerializationFeature}
import com.fasterxml.jackson.datatype.jsr310.JavaTimeModule
import com.fasterxml.jackson.module.scala.{DefaultScalaModule, ScalaObjectMapper}

case class MyPOJO(@JsonFormat(pattern = "yyyy-MM-dd") createdBy: LocalDateTime)

object MyJaxsonP {
  def main(args: Array[String]): Unit = {
    val objectMapper = new ObjectMapper() with ScalaObjectMapper
    objectMapper.findAndRegisterModules()
    objectMapper.registerModule(DefaultScalaModule)
    objectMapper.registerModule(new JavaTimeModule)
    objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
    objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS)
    val adminDatasP = objectMapper.readValue[MyPOJO]("{\"createdBy\":\"2020-03-02\"}")
    println(adminDatasP.toString)
  }
}
I have tried custom serialization and deserialization, like below, but it is not working, saying the default constructor is missing:
case class MyPOJO(@JsonDeserialize(using = classOf[CustomDateDeserializer]) createdBy: LocalDateTime)

object CustomDateDeserializer {
  private val formatter = new SimpleDateFormat("dd-MM-yyyy")
}

class CustomDateDeserializer(val t: Class[String]) extends StdDeserializer[String](t) {
  override def deserialize(p: JsonParser, ctxt: DeserializationContext): String = {
    val date = p.getText
    CustomDateDeserializer.formatter.format(date)
  }
}
I need expert input on how to solve this problem.

but is not working saying default constructor is missing
You are getting the error because there is no default (or empty, if you will) constructor for case classes. In this particular case, when you declare case class MyPOJO(createdBy: LocalDateTime), the Scala compiler will generate something like this (the example might not be fully accurate; I just want to show the idea):
class MyPOJO(createdBy: LocalDateTime) extends Product with Serializable {
  override def hashCode(): Int = ...
  override def equals(other: Any): Boolean = ...
  override def toString(): String = ...
  // and other methods, like copy, unapply, tupled, etc.
}
object MyPOJO {
  def apply(createdBy: LocalDateTime) = new MyPOJO(createdBy)
}
so Jackson won't be able to create an empty class instance with empty fields (or, more precisely, null values) and then inject values from the source JSON.
What you can do is use a plain class instead of a case class. Or, preferably, take a look at Scala-friendly JSON libraries like Circe, which is not reflection-based like Jackson: instead, it generates JSON codecs at compile time for your classes, based on implicits and Scala macros (more precisely, it relies on the Shapeless library, which uses the Scala macro mechanism under the hood).
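Incidentally, whichever library you choose, the date-defaulting part of the question is plain java.time: parse the date-only string as a LocalDate and widen it to midnight with atStartOfDay. A minimal stdlib-only sketch (the helper name parseWithDefaultTime is mine):

```scala
import java.time.{LocalDate, LocalDateTime}
import java.time.format.DateTimeFormatter

val format = DateTimeFormatter.ofPattern("yyyy-MM-dd")

// Parse "yyyy-MM-dd" and default the missing time component to 00:00:00
def parseWithDefaultTime(raw: String): LocalDateTime =
  LocalDate.parse(raw, format).atStartOfDay()

println(parseWithDefaultTime("2020-03-02")) // 2020-03-02T00:00
```

This is the logic a working custom deserializer would wrap, regardless of the JSON library around it.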
In your particular case, the code would look like:
import java.time.{LocalDate, LocalDateTime}
import java.time.format.DateTimeFormatter
import scala.util.Try
import io.circe._
import io.circe.generic.auto._
import io.circe.syntax._

case class MyPOJO(createdBy: LocalDateTime)

val format = DateTimeFormatter.ofPattern("yyyy-MM-dd")
// Implement our own `Encoder` to render `LocalDateTime` as a JSON string in the given pattern
implicit val encoder: Encoder[LocalDateTime] = Encoder[String].contramap(_.format(format))
// Implement our own `Decoder` to parse a JSON string as `LocalDateTime`, defaulting the time to start of day
implicit val decoder: Decoder[LocalDateTime] =
  Decoder[String].emapTry(value => Try(LocalDate.parse(value, format).atStartOfDay()))

val foo = MyPOJO(LocalDateTime.now())
val json = foo.asJson
println(json.noSpaces)
println(json.as[MyPOJO])
which will produce the following result:
{"createdBy":"2020-03-04"}
Right(MyPOJO(2020-03-04T00:00))
Hope this helps!


Copy most attributes from one class object to another class

Between two separate data classes, Person and PersonRecord, which share the same attribute names, I want an elegant way to copy the values from one class's attributes to the other's.
I have a data class, say for example Person, that defines the business logic data of a person in the application.
import kotlinx.serialization.Serializable

data class Person(
    val id: String,
    val name: String,
    val age: Int,
    val currentEmployment: Employment,
    val workPermit: WorkPermit
)

@Serializable
data class Employment(
    val employer: String,
    val job: String,
    val yearsWithEmployer: Double
)

@Serializable
data class WorkPermit(
    val nationality: String,
    val visa: String
)
I need to use these with an AWS DynamoDB client, but this question doesn't really concern DynamoDB specifically. I'll explain my usage below.
For several reasons, I've decided to implement a DAO class that is essentially a copy of the class Person, called PersonRecord, except that the fields containing complex types, i.e., Employment and WorkPermit, are stored as Strings instead. Also, all the fields are mutable and nullable. I had to make it this way because it's supposed to be a mapper class for the DynamoDB Enhanced Client (doc).
Annotating this class with @DynamoDbBean defines how the client writes items into a specified table.
package util

import kotlinx.serialization.decodeFromString
import kotlinx.serialization.encodeToString
import kotlinx.serialization.json.Json
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbEnhancedClient
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbTable
import software.amazon.awssdk.enhanced.dynamodb.Key
import software.amazon.awssdk.enhanced.dynamodb.TableSchema
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbBean
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbPartitionKey
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbSortKey

@DynamoDbBean
internal data class PersonRecord(
    @get:DynamoDbPartitionKey
    @get:DynamoDbSortKey
    var id: String? = null,
    var name: String? = null,
    var age: Int? = null,
    var currentEmployment: String? = null,
    var workPermit: String? = null,
)

class PersonDao(
    ddb: DynamoDbEnhancedClient,
    personTableName: String
) {
    private val personTable: DynamoDbTable<PersonRecord> = ddb.table(
        personTableName,
        TableSchema.fromBean(PersonRecord::class.java)
    )

    private fun toPersonRecord(person: Person): PersonRecord =
        PersonRecord(
            id = person.id,
            name = person.name,
            age = person.age,
            currentEmployment = Json.encodeToString(person.currentEmployment),
            workPermit = Json.encodeToString(person.workPermit)
        )

    private fun toPerson(personRecord: PersonRecord): Person =
        Person(
            id = personRecord.id!!,
            name = personRecord.name!!,
            age = personRecord.age!!,
            currentEmployment = Json.decodeFromString(personRecord.currentEmployment!!),
            workPermit = Json.decodeFromString(personRecord.workPermit!!)
        )

    fun writePerson(person: Person) =
        personTable.putItem(toPersonRecord(person))

    fun readPerson(id: String): Person? {
        val personRecord = personTable.getItem(
            Key.builder()
                .partitionValue(id)
                .build()
        )
        return if (personRecord != null) toPerson(personRecord) else null
    }
}
I am using the public functions readPerson and writePerson to read and write the pretty Person class, while these functions internally convert to and from PersonRecord.
Is there a way to copy between the different classes Person and PersonRecord more elegantly? If, in the future, we change the shape of Person slightly, there's a lot to change in the PersonRecord and PersonDao classes too. In particular, I need a way to handle decoding String to Employment and WorkPermit, and vice-versa.
In the example above, it'd be trivial to add a field or two, but in my actual application I'm dealing with over a dozen fields, and a bunch of unit tests intricately involved with the fields themselves.
Someone suggested using class reflection, but I don't understand how I'd use it based on what the Kotlin docs describe.
You can try to read the Person properties into a map via reflection (there is no other way) and use the delegated-properties feature to construct PersonRecord from that map.
https://kotlinlang.org/docs/delegated-properties.html#storing-properties-in-a-map
Here is a sample of reading via reflection https://stackoverflow.com/a/38688203/8642957
Yes, MapStruct is great, and it's available in Kotlin via kapt.

Issue converting decimal (Double) value from JSON to case class

I am using Scala 2.12 with circe version 0.14.1. I am converting a JSON document into the corresponding case class as below:
case class Document(curr: String, value: Double)
val json = s"""{"Document":{"curr":"USD","value":40000000.01}}"""
import io.circe.generic.auto._
io.circe.parser.decode[Document](json)
The converted case class is below:
Document(USD,4.000000001E7)
I do not want the Double value to change into an exponential representation. I need it to remain unchanged as 40000000.01.
You can override the toString implementation to see plain notation instead of engineering notation:
case class Document(curr: String, value: Double) {
  override def toString: String =
    s"Document($curr,${BigDecimal(value).underlying().toPlainString})"
}
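To see why toPlainString helps: Scala's Double.toString (inherited from Java) switches to scientific notation for magnitudes of 10^7 or more, while BigDecimal's toPlainString never does. A quick stdlib-only check:

```scala
val d = 40000000.01

// Default rendering of a Double with magnitude >= 1e7 uses scientific notation
println(d.toString) // 4.000000001E7

// Routing through BigDecimal keeps the plain decimal form
println(BigDecimal(d).underlying().toPlainString) // 40000000.01
```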
If you want JSON serialization in plain notation, then use the following encoder:
import io.circe.{Encoder, Json, JsonNumber}

implicit val doubleE5r: Encoder[Double] = new Encoder[Double] {
  override def apply(a: Double): Json =
    Json.fromJsonNumber(JsonNumber.fromDecimalStringUnsafe(BigDecimal(a).underlying().toPlainString))
}

Kryo: Difference between readClassAndObject/ReadObject and WriteClassAndObject/WriteObject

I am trying to understand the following statement from the documentation:
If the concrete class of the object is not known and the object could be null:
kryo.writeClassAndObject(output, object);
Object object = kryo.readClassAndObject(input);
What does "if the concrete class is not known" mean, exactly?
I am having the following code:
import java.io.{ByteArrayInputStream, ByteArrayOutputStream}
import com.esotericsoftware.kryo.Kryo
import com.esotericsoftware.kryo.io.{Input, Output}
import com.romix.scala.serialization.kryo.{ScalaImmutableAbstractMapSerializer, ScalaProductSerializer}

case class RawData(modelName: String,
                   sourceType: String,
                   deNormalizedVal: String,
                   normalVal: Map[String, String])

object KryoSpike extends App {
  val kryo = new Kryo()
  kryo.setRegistrationRequired(false)
  kryo.addDefaultSerializer(classOf[scala.collection.Map[_, _]], classOf[ScalaImmutableAbstractMapSerializer])
  kryo.addDefaultSerializer(classOf[scala.collection.generic.MapFactory[scala.collection.Map]], classOf[ScalaImmutableAbstractMapSerializer])
  kryo.addDefaultSerializer(classOf[RawData], classOf[ScalaProductSerializer])

  //val testin = Map("id" -> "objID", "field1" -> "field1Value")
  val testin = RawData("model1", "Json", "", Map("field1" -> "value1", "field2" -> "value2"))

  val outStream = new ByteArrayOutputStream()
  val output = new Output(outStream, 20480)
  kryo.writeClassAndObject(output, testin)
  output.close()

  val input = new Input(new ByteArrayInputStream(outStream.toByteArray), 4096)
  val testout = kryo.readClassAndObject(input)
  input.close()

  println(testout.toString)
}
When I use readClassAndObject and writeClassAndObject, it works. However, if I use writeObject and readObject, it does not:
Exception in thread "main" com.esotericsoftware.kryo.KryoException:
Class cannot be created (missing no-arg constructor):
com.romix.scala.serialization.kryo.ScalaProductSerializer
I just don't understand why.
Earlier, using the same code but with a Map instead of my class RawData, it worked like a charm with writeObject and readObject. Hence I am confused.
Can someone help me understand it?
The difference is as follows:
you use writeClassAndObject and readClassAndObject when you're using a serializer that:
serializes a base type: an interface, a class that has subclasses, or - in case of Scala - a trait like Product,
and needs the type (i.e. the Class object) of the deserialized object to construct this object (without this type, it doesn't know what to construct),
example: ScalaProductSerializer
you use writeObject and readObject when you're using a serializer that:
serializes exactly one type (i.e. a class that can be instantiated; example: EnumSetSerializer),
or serializes more than one type but the specific type can be somehow deduced from the serialized data (example: ScalaImmutableAbstractMapSerializer)
To sum this up for your specific case:
when you deserialize your RawData:
ScalaProductSerializer needs to find out the exact type of Product to create an instance,
so it uses the typ: Class[Product] parameter to do it,
as a result, only readClassAndObject works.
when you deserialize a Scala immutable map (scala.collection.immutable.Map, imported as IMap):
ScalaImmutableAbstractMapSerializer doesn't need to find out the exact type - it uses IMap.empty to create an instance,
as a result, it doesn't use the typ: Class[IMap[_, _]] parameter,
as a result, both readObject and readClassAndObject work.
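The contrast also shows up directly in the call signatures: readObject makes the caller supply the concrete class, while readClassAndObject recovers it from the stream. A sketch against the setup from the question (not a complete program; kryo, output, input, and testin are as above, and the usual write/close/reopen steps are elided):

```scala
// For the map, writeObject/readObject works: the caller names the class on read,
// and ScalaImmutableAbstractMapSerializer can build a map without further type info.
val aMap = Map("id" -> "objID", "field1" -> "field1Value")
kryo.writeObject(output, aMap)
// ...
val mapBack = kryo.readObject(input, classOf[scala.collection.immutable.Map[String, String]])

// For RawData, the class identity must travel inside the stream, because
// ScalaProductSerializer needs the concrete Product type to construct the instance:
kryo.writeClassAndObject(output, testin)
// ...
val rawBack = kryo.readClassAndObject(input).asInstanceOf[RawData]
```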

Jackson deserializer fallback based on value

Is it possible to create a Jackson deserializer that is "chained" with others? For instance, we have a generalized, custom deserializer for enumerations that handles a special case, but we'd like to fall back to the default Jackson implementation if the custom deserializer fails to handle the incoming enumeration.
It seems as if there's no simple way to do this on a value basis - I've seen examples of something similar at the type level, however.
The issue seems to stem from the fact that registering a deserializer for a specific class gives that deserializer precedence over any others that are registered and thus the previously registered deserializer is lost. There doesn't seem to be a concept of chaining built-in, so returning null from the custom deserializer doesn't pass the value down the chain of registered deserializers for that type.
Any insight on how to get deserializers to chain would be greatly appreciated!
Update
Here's an example of the intended behavior vs. the (simplified) current setup.
The current setup:
import com.fasterxml.jackson.core.JsonParser
import com.fasterxml.jackson.databind.{BeanDescription, DeserializationConfig, DeserializationContext, JsonDeserializer}
import com.fasterxml.jackson.databind.deser.Deserializers
import com.fasterxml.jackson.databind.deser.std.StdScalarDeserializer
import com.fasterxml.jackson.module.scala.JacksonModule

object EnumSerializerModule extends JacksonModule {
  override def getModuleName: String = "permissive-enums"
  this += { _.addDeserializers(PermissiveEnumDeserializerLocator) }
}

object PermissiveEnumDeserializerLocator extends Deserializers.Base {
  override def findEnumDeserializer(javaType: Class[_], config: DeserializationConfig, desc: BeanDescription): JsonDeserializer[_] = {
    new PermissiveEnumDeserializer(javaType.asInstanceOf[Class[Enum[_ <: Enum[_]]]])
  }
}

// Super-simplified version of a custom enum deserializer
class PermissiveEnumDeserializer(javaType: Class[Enum[_ <: Enum[_]]]) extends StdScalarDeserializer[Enum[_ <: Enum[_]]](javaType) {
  override def deserialize(jp: JsonParser, ctxt: DeserializationContext): Enum[_ <: Enum[_]] = {
    val enumConstants = javaType.getEnumConstants
    val text = jp.getText
    // Attempt to match on the string representation
    val constant: Enum[_ <: Enum[_]] = enumConstants.find(_.toString.equalsIgnoreCase(text)).orNull
    if (constant != null) {
      return constant
    }
    throw ctxt.mappingException(text + " was not one of " + enumConstants.map(_.name).mkString("[", ", ", "]"))
  }
}
In this example, the deserializer would fail on an enum that is serialized using its "name" value rather than its "toString" (which might be overridden). Ideally, I'd like this custom setup to delegate to any previously bound deserializers on the object mapper to handle the value.
So... in pseudocode:
val originalJacksonDeser = new EnumDeserializer
val customDeser = new PermissiveEnumDeserializer
val enumValue = MySpecialEnum.ONE
var res = customDeser(enumValue)
if (res == null) {
  res = originalJacksonDeser(enumValue)
}
return res
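One concrete way to get that delegation is a BeanDeserializerModifier: its modifyEnumDeserializer hook hands you the deserializer Jackson would otherwise use, so you can wrap it instead of replacing it. A sketch under that assumption (the class name PermissiveWithFallback is mine, and this is untested against your exact setup):

```scala
import com.fasterxml.jackson.core.JsonParser
import com.fasterxml.jackson.databind._
import com.fasterxml.jackson.databind.deser.BeanDeserializerModifier
import com.fasterxml.jackson.databind.module.SimpleModule

// Wraps the deserializer Jackson would have used, trying the permissive
// toString match first and delegating to the original on a miss.
class PermissiveWithFallback(
    enumType: Class[Enum[_ <: Enum[_]]],
    fallback: JsonDeserializer[_]
) extends JsonDeserializer[AnyRef] {
  override def deserialize(jp: JsonParser, ctxt: DeserializationContext): AnyRef = {
    val text = jp.getText
    enumType.getEnumConstants
      .find(_.toString.equalsIgnoreCase(text))
      .getOrElse(fallback.deserialize(jp, ctxt))
      .asInstanceOf[AnyRef]
  }
}

val permissiveEnumModule = new SimpleModule("permissive-enums") {
  override def setupModule(context: Module.SetupContext): Unit = {
    super.setupModule(context)
    context.addBeanDeserializerModifier(new BeanDeserializerModifier {
      override def modifyEnumDeserializer(
          config: DeserializationConfig,
          tpe: JavaType,
          beanDesc: BeanDescription,
          deserializer: JsonDeserializer[_]): JsonDeserializer[_] =
        new PermissiveWithFallback(
          tpe.getRawClass.asInstanceOf[Class[Enum[_ <: Enum[_]]]], deserializer)
    })
  }
}
```

Note that the wrapped deserializer may itself implement ResolvableDeserializer; a production version would resolve it before delegating.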

Gson-like library for scala

I'm learning Scala. I'm trying to find an easy way of turning a JSON string into a Scala case class instance. Java has a wonderful library called Google Gson. It can turn a Java bean into JSON and back without any special coding; basically you can do it in a single line of code.
public class Example {
    private String firstField;
    private Integer secondIntField;
    // constructor
    // getters/setters here
}

// Bean instance to JSON string
String exampleAsJson = new Gson().toJson(new Example("hehe", 42));
// JSON string to bean instance
Example exampleFromJson = new Gson().fromJson(exampleAsJson, Example.class);
I'm reading about https://www.playframework.com/documentation/2.5.x/ScalaJson and can't get the idea: why is it so complex in Scala? Why should I write readers/writers to serialize/deserialize plain, simple case class instances? Is there an easy way to convert case class instance -> JSON -> case class instance using the Play JSON API?
Let's say you have
case class Foo(a: String, b: String)
You can easily write a formatter for this in Play by doing
implicit val fooFormat = Json.format[Foo]
This will allow you to both serialize and deserialize to JSON.
val foo = Foo("1","2")
val js = Json.toJson(foo)(fooFormat) // Only include the specific format if it's not in scope.
val fooBack = js.as[Foo] // Now you have foo back!
Check out uPickle
Here's a small example:
import upickle.default._

case class Example(firstField: String, secondIntField: Int)
object Example { implicit val rw: ReadWriter[Example] = macroRW } // needed so uPickle can derive the codec

val ex = Example("Hello", 3)
write(ex) // {"firstField":"Hello","secondIntField":3}
