I'm learning Scala. I'm trying to find an easy way of turning a JSON string into a Scala case class instance. Java has a wonderful library called Google Gson. It can turn a Java bean into JSON and back without any special coding; basically you can do it in a single line of code.
public class Example {
private String firstField;
private Integer secondIntField;
// constructor
// getters/setters here
}
// Bean instance to JSON string
String exampleAsJson = new Gson().toJson(new Example("hehe", 42));
// String to Bean instance
Example exampleFromJson = new Gson().fromJson(exampleAsJson, Example.class);
I'm reading about https://www.playframework.com/documentation/2.5.x/ScalaJson and can't get the idea: why is it so complex in Scala? Why should I write readers/writers to serialize/deserialize plain simple case class instances? Is there an easy way to convert a case class instance -> JSON -> case class instance using the Play JSON API?
Let's say you have
case class Foo(a: String, b: String)
You can easily write a formatter for this in Play by doing
implicit val fooFormat = Json.format[Foo]
This will allow you to both serialize and deserialize to JSON.
val foo = Foo("1","2")
val js = Json.toJson(foo)(fooFormat) // Only include the specific format if it's not in scope.
val fooBack = js.as[Foo] // Now you have foo back!
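For completeness, here is a self-contained sketch of the same round trip with the imports spelled out; it also uses validate, which returns a JsSuccess/JsError instead of throwing on malformed input:
import play.api.libs.json._
case class Foo(a: String, b: String)
implicit val fooFormat: Format[Foo] = Json.format[Foo]
val js: JsValue = Json.toJson(Foo("1", "2"))
println(Json.stringify(js)) // {"a":"1","b":"2"}
js.validate[Foo] match { // safer alternative to js.as[Foo]
case JsSuccess(foo, _) => println(foo)
case JsError(errors) => println(errors)
}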
Check out uPickle
Here's a small example:
case class Example(firstField: String, secondIntField: Int)
val ex = Example("Hello", 3)
write(ex) // { "firstField": "Hello", "secondIntField" : 3 }
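For a fuller round trip, here is a sketch assuming a recent uPickle version, where the ReadWriter has to be declared explicitly (typically in the companion object):
import upickle.default._
case class Example(firstField: String, secondIntField: Int)
object Example {
implicit val rw: ReadWriter[Example] = macroRW // required by recent uPickle versions
}
val json = write(Example("Hello", 3)) // {"firstField":"Hello","secondIntField":3}
val back = read[Example](json) // Example("Hello", 3)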
I have JSON as a string which I am deserializing into a MyPOJO case class in Scala. My data is in yyyy-MM-dd format, but the corresponding attribute in the POJO, createdBy, is a LocalDateTime.
How can I assign a default time value of 2020-03-02 00:00:00 while instantiating the POJO?
Serialization should return the yyyy-MM-dd format. My serialization and deserialization formats are different.
case class MyPOJO( @JsonFormat(pattern = "yyyy-mm-dd") createdBy: LocalDateTime )
object MyJaxsonP {
def main(args: Array[String]): Unit = {
val objectMapper = new ObjectMapper() with ScalaObjectMapper
objectMapper.findAndRegisterModules()
objectMapper.registerModule(DefaultScalaModule)
objectMapper.registerModule(new JavaTimeModule)
objectMapper.configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
objectMapper.disable(SerializationFeature.WRITE_DATES_AS_TIMESTAMPS)
val adminDatasP = objectMapper.readValue[MyPOJO]("{\"createdBy\":\"2020-03-02\"}")
print(adminDatasP.toString)
}
}
I have tried custom serialization and deserialization, as below, but it is not working; it fails saying the default constructor is missing.
case class MyPOJO( @JsonDeserialize(using = classOf[CustomDateDeserializer]) createdBy: LocalDateTime )
object CustomDateDeserializer {
private val formatter = new SimpleDateFormat("dd-MM-yyyy")
}
class CustomDateDeserializer(val t: Class[String]) extends StdDeserializer[String](t) {
override def deserialize(p: JsonParser, ctxt: DeserializationContext): String = {
val date = p.getText
return CustomDateDeserializer.formatter.format(date);
}
}
I need expert input on how to solve this problem.
but is not working saying default constructor is missing
You are getting the error because there is no default (or empty, if you will) constructor for case classes. In this particular case, when you declare case class MyPOJO(createdBy: LocalDateTime), the Scala compiler will generate something like this (the example might not be very accurate; I just want to show the idea):
class MyPOJO(createdBy: LocalDateTime) extends Product with Serializable {
override def hashCode(): Int = ...
override def equals(other: Any): Boolean = ...
override def toString(): String = ...
// and other methods, like copy, unapply, tupled etc.
}
object MyPOJO {
def apply(createdBy : LocalDateTime) = new MyPOJO(createdBy)
}
so Jackson won't be able to create an empty class instance with empty fields (or, more precisely, null values) and then inject the values from the source JSON.
What you can do is use a plain class instead of a case class. Or, preferably, take a look at Scala-friendly JSON libraries like Circe, which is not reflection based, unlike Jackson, and instead generates JSON codecs at compile time for your classes, based on implicits and Scala macros (more precisely, it relies on the Shapeless library, which uses the Scala macro mechanism under the hood).
In your particular case, the code would look like:
import java.time.{LocalDate, LocalDateTime}
import java.time.format.DateTimeFormatter
import scala.util.Try
import io.circe._
import io.circe.generic.auto._
import io.circe.syntax._
case class MyPOJO(createdBy: LocalDateTime)
val format = DateTimeFormatter.ofPattern("yyyy-MM-dd")
// Implement our own `Encoder` to render `LocalDateTime` as a JSON string in the given format
implicit val encoder: Encoder[LocalDateTime] = Encoder[String].contramap(_.format(format))
// Implement our own `Decoder` to parse a JSON string as `LocalDateTime`
implicit val decoder: Decoder[LocalDateTime] = Decoder[String].emapTry(value => Try(LocalDate.parse(value, format).atStartOfDay()))
val foo = MyPOJO(LocalDateTime.now())
val json = foo.asJson
println(json.noSpaces)
println(json.as[MyPOJO])
which will produce the following result:
{"createdBy":"2020-03-04"}
Right(MyPOJO(2020-03-04T00:00))
Hope this helps!
I am trying to understand the following statement from the documentation:
If the concrete class of the object is not known and the object could be null:
kryo.writeClassAndObject(output, object);
Object object = kryo.readClassAndObject(input);
What does "if the concrete class is not known" mean exactly?
I am having the following code:
case class RawData(modelName: String,
sourceType: String,
deNormalizedVal: String,
normalVal: Map[String, String])
object KryoSpike extends App {
val kryo = new Kryo()
kryo.setRegistrationRequired(false)
kryo.addDefaultSerializer(classOf[scala.collection.Map[_,_]], classOf[ScalaImmutableAbstractMapSerializer])
kryo.addDefaultSerializer(classOf[scala.collection.generic.MapFactory[scala.collection.Map]], classOf[ScalaImmutableAbstractMapSerializer])
kryo.addDefaultSerializer(classOf[RawData], classOf[ScalaProductSerializer])
//val testin = Map("id" -> "objID", "field1" -> "field1Value")
val testin = RawData("model1", "Json", "", Map("field1" -> "value1", "field2" -> "value2") )
val outStream = new ByteArrayOutputStream()
val output = new Output(outStream, 20480)
kryo.writeClassAndObject(output, testin)
output.close()
val input = new Input(new ByteArrayInputStream(outStream.toByteArray), 4096)
val testout = kryo.readClassAndObject(input)
input.close()
println(testout.toString)
}
When I use readClassAndObject and writeClassAndObject it works. However, if I use writeObject and readObject it does not:
Exception in thread "main" com.esotericsoftware.kryo.KryoException:
Class cannot be created (missing no-arg constructor):
com.romix.scala.serialization.kryo.ScalaProductSerializer
I just don't understand why.
Earlier, using the same code, instead of my class RawData I used a Map and it worked like a charm with writeObject and readObject. Hence I am confused.
Can someone help me understand it?
The difference is as follows:
you use writeClassAndObject and readClassAndObject when you're using a serializer that:
serializes a base type: an interface, a class that has subclasses, or - in case of Scala - a trait like Product,
and needs the type (i.e. the Class object) of the deserialized object to construct this object (without this type, it doesn't know what to construct),
example: ScalaProductSerializer
you use writeObject and readObject when you're using a serializer that:
serializes exactly one type (i.e. a class that can be instantiated; example: EnumSetSerializer),
or serializes more than one type but the specific type can be somehow deduced from the serialized data (example: ScalaImmutableAbstractMapSerializer)
To sum this up for your specific case:
when you deserialize your RawData:
ScalaProductSerializer needs to find out the exact type of Product to create an instance,
so it uses the typ: Class[Product] parameter to do it,
as a result, only readClassAndObject works.
when you deserialize a Scala immutable map (scala.collection.immutable.Map imported as IMap):
ScalaImmutableAbstractMapSerializer doesn't need to find out the exact type - it uses IMap.empty to create an instance,
as a result, it doesn't use the typ: Class[IMap[_, _]] parameter,
as a result, both readObject and readClassAndObject work.
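To make the difference concrete, here is a small sketch reusing the kryo instance configured in the question (same romix serializer registrations; treat it as illustrative rather than authoritative):
import java.io.{ByteArrayInputStream, ByteArrayOutputStream}
import com.esotericsoftware.kryo.io.{Input, Output}
// Immutable Map: the serializer can rebuild the value without the concrete class,
// so the plain writeObject/readObject pair is enough.
val mapIn = Map("id" -> "objID", "field1" -> "field1Value")
val mapBytes = new ByteArrayOutputStream()
val mapOutput = new Output(mapBytes, 4096)
kryo.writeObject(mapOutput, mapIn) // the class is NOT written to the stream
mapOutput.close()
val mapInput = new Input(new ByteArrayInputStream(mapBytes.toByteArray), 4096)
// readObject must be told which class (and therefore which serializer) to use
val mapOut = kryo.readObject(mapInput, classOf[scala.collection.immutable.Map[_, _]])
mapInput.close()
// RawData: ScalaProductSerializer needs the concrete Product class from the stream
// to know what to instantiate, so only the *ClassAndObject pair works:
// kryo.writeClassAndObject(output, testin)
// val testout = kryo.readClassAndObject(input)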
I have a bunch of Java classes. I need to create simple POJOs containing just the fields from the Java classes. There is a way to create POJOs from JSON, but I need to do it directly from the Java classes.
A Java class may have logic methods and may be constructed based upon different things. My goal is just to hold the state in POJOs, send it over the network, and deserialize it into the same set of POJOs.
You can serialize Java classes just fine, no need to strip them down to their fields (which is what it sounds like you want).
class MyClass implements Serializable {
private int myInt;
private String myString;
public MyClass(int mi, String ms) {
myInt = mi; myString = ms;
}
public String doStuff() { return String.format("%s %d", myString, myInt); }
}
Code for serialization:
MyClass toSerialize = new MyClass(5, "Test");
try (ObjectOutputStream out = new ObjectOutputStream(getNetworkOutstream())) {
out.writeObject(toSerialize);
}
Code to deserialize:
try (ObjectInputStream in = new ObjectInputStream(getNetworkInStream())) {
MyClass received = (MyClass) in.readObject();
}
The doStuff method is not in the way if that's what you're thinking.
Caveat is that all fields need to also be Serializable (or primitives).
If you are looking for a way to programmatically parse all those classes and generate POJOs for them, then you can use libraries like Antlr, JavaCC or JavaParser to analyse sources and then generate and save the new POJOs.
Use some JSON library, for example Gson.
You can choose which fields to serialize or not by using the transient modifier.
Apart from that, these libraries offer much more and certainly cover all the requirements you specified.
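As a minimal sketch of the transient idea (shown in Scala to match the rest of this thread; Session and its fields are made up for illustration): Scala's @transient annotation marks the generated field as transient on the JVM, and Gson's default configuration skips transient fields.
import com.google.gson.Gson
class Session(val user: String) {
@transient val secretToken: String = "do-not-send" // excluded from the JSON
}
// Gson reflects over the compiled fields, including the private one backing `user`
println(new Gson().toJson(new Session("ann"))) // expected: {"user":"ann"}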
My company has a webserver API that provides search results in JSON format. I'm responsible for developing an Android app that consumes that API, and I have made some classes that model the objects in the JSON responses.
For the sake of habit and my own preference, I usually write my code in English only. However, most of the JSON keys are not in English. Because of this, I cannot readily use Gson to convert the JSON strings into Java objects -- at least that is what I think.
I was wondering if there is any way to declare, just once per class, the mapping between each JSON key and its corresponding instance variable, so that after declaring it I could simply instantiate objects from JSON and create JSON strings from objects.
Is that possible?
Example:
// Java code
class Model {
String name;
Integer age;
}
// JSON with keys in Portuguese
{
"nome" : "Mark M.", # Key "nome" matches variable "name"
"idade" : 30 # Key "idade" matches variable "age"
}
Use the @SerializedName annotation.
Here is an example of how this annotation is meant to be used:
public class SomeClassWithFields {
#SerializedName("name") private final String someField;
private final String someOtherField;
public SomeClassWithFields(String a, String b) {
this.someField = a;
this.someOtherField = b;
}
}
The following shows the output that is generated when serializing an instance of the above example class:
SomeClassWithFields objectToSerialize = new SomeClassWithFields("a", "b");
Gson gson = new Gson();
String jsonRepresentation = gson.toJson(objectToSerialize);
System.out.println(jsonRepresentation);
===== OUTPUT =====
{"name":"a","someOtherField":"b"}
Source: SerializedName Javadocs
In the work that I do on a day to day in Java, I use builders quite a lot for fluent interfaces, e.g.: new PizzaBuilder(Size.Large).onTopOf(Base.Cheesy).with(Ingredient.Ham).build();
With a quick-and-dirty Java approach, each method call mutates the builder instance and returns this. Done immutably, it involves more typing, since you clone the builder before modifying it. The build method eventually does the heavy lifting over the builder state.
What's a nice way of achieving the same in Scala?
If I wanted to ensure that onTopOf(base:Base) was called only once, and then subsequently only with(ingredient:Ingredient) and build():Pizza could be called, a-la a directed builder, how would I go about approaching this?
Another alternative to the Builder pattern in Scala 2.8 is to use immutable case classes with default arguments and named parameters. It's a little different, but the effect is smart defaults, all values specified, and everything specified only once, with syntax checking...
The following uses Strings for the values for brevity/speed...
scala> case class Pizza(ingredients: Traversable[String], base: String = "Normal", topping: String = "Mozzarella")
defined class Pizza
scala> val p1 = Pizza(Seq("Ham", "Mushroom"))
p1: Pizza = Pizza(List(Ham, Mushroom),Normal,Mozzarella)
scala> val p2 = Pizza(Seq("Mushroom"), topping = "Edam")
p2: Pizza = Pizza(List(Mushroom),Normal,Edam)
scala> val p3 = Pizza(Seq("Ham", "Pineapple"), topping = "Edam", base = "Small")
p3: Pizza = Pizza(List(Ham, Pineapple),Small,Edam)
You can then also use existing immutable instances as kinda builders too...
scala> val lp2 = p3.copy(base = "Large")
lp2: Pizza = Pizza(List(Ham, Pineapple),Large,Edam)
You have three main alternatives here.
Use the same pattern as in Java, classes and all.
Use named and default arguments and a copy method. Case classes already provide this for you, but here's an example that is not a case class, just so you can understand it better.
object Size {
sealed abstract class Type
object Large extends Type
}
object Base {
sealed abstract class Type
object Cheesy extends Type
}
object Ingredient {
sealed abstract class Type
object Ham extends Type
}
class Pizza(size: Size.Type,
base: Base.Type,
ingredients: List[Ingredient.Type])
class PizzaBuilder(size: Size.Type,
base: Base.Type = null,
ingredients: List[Ingredient.Type] = Nil) {
// A generic copy method
def copy(size: Size.Type = this.size,
base: Base.Type = this.base,
ingredients: List[Ingredient.Type] = this.ingredients) =
new PizzaBuilder(size, base, ingredients)
// An onTopOf method based on copy
def onTopOf(base: Base.Type) = copy(base = base)
// A `with` method based on copy, written with backticks because with is a keyword in Scala
def `with`(ingredient: Ingredient.Type) = copy(ingredients = ingredient :: ingredients)
// A build method to create the Pizza
def build() = {
if (size == null || base == null || ingredients == Nil) error("Missing stuff")
else new Pizza(size, base, ingredients)
}
}
// Possible ways of using it:
new PizzaBuilder(Size.Large).onTopOf(Base.Cheesy).`with`(Ingredient.Ham).build();
// or
new PizzaBuilder(Size.Large).copy(base = Base.Cheesy).copy(ingredients = List(Ingredient.Ham)).build()
// or
new PizzaBuilder(size = Size.Large,
base = Base.Cheesy,
ingredients = Ingredient.Ham :: Nil).build()
// or even forgo the Builder altogether and just
// use named and default parameters on Pizza itself
Use a type safe builder pattern. The best introduction I know of is this blog, which also contains references to many other articles on the subject.
Basically, a type safe builder pattern guarantees at compile time that all required components are provided. One can even guarantee mutual exclusion of options or arity. The cost is the complexity of the builder code, but...
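As a rough illustration of the idea (a sketch only; the names are made up and the real implementations in the linked articles are more elaborate), phantom types can encode which builder steps have already happened, so onTopOf can be called only once and with/build only afterwards:
sealed trait BuildStep
sealed trait NoBase extends BuildStep
sealed trait HasBase extends BuildStep
case class Pizza(size: String, base: String, ingredients: List[String])
class PizzaBuilder[Step <: BuildStep] private (size: String, base: Option[String], ingredients: List[String]) {
// Only callable while no base has been chosen yet; afterwards Step becomes HasBase
def onTopOf(b: String)(implicit ev: Step =:= NoBase): PizzaBuilder[HasBase] =
new PizzaBuilder[HasBase](size, Some(b), ingredients)
// Only callable once the base is set
def `with`(i: String)(implicit ev: Step =:= HasBase): PizzaBuilder[HasBase] =
new PizzaBuilder[HasBase](size, base, i :: ingredients)
// Compiles only when the base has been provided
def build()(implicit ev: Step =:= HasBase): Pizza = Pizza(size, base.get, ingredients)
}
object PizzaBuilder {
def apply(size: String): PizzaBuilder[NoBase] = new PizzaBuilder[NoBase](size, None, Nil)
}
// PizzaBuilder("Large").onTopOf("Cheesy").`with`("Ham").build() // compiles
// PizzaBuilder("Large").`with`("Ham").build() // rejected by the compiler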
Case classes solve the problem as shown in the previous answers, but the resulting API is difficult to use from Java when you have Scala collections in your objects. To provide a fluent API to Java users, try this:
case class SEEConfiguration(parameters : Set[Parameter],
plugins : Set[PlugIn])
case class Parameter(name: String, value:String)
case class PlugIn(id: String)
trait SEEConfigurationGrammar {
def withParameter(name: String, value:String) : SEEConfigurationGrammar
def withParameter(toAdd : Parameter) : SEEConfigurationGrammar
def withPlugin(toAdd : PlugIn) : SEEConfigurationGrammar
def build : SEEConfiguration
}
object SEEConfigurationBuilder {
def empty : SEEConfigurationGrammar = SEEConfigurationBuilder(Set.empty,Set.empty)
}
case class SEEConfigurationBuilder(
parameters : Set[Parameter],
plugins : Set[PlugIn]
) extends SEEConfigurationGrammar {
val config : SEEConfiguration = SEEConfiguration(parameters,plugins)
def withParameter(name: String, value:String) = withParameter(Parameter(name,value))
def withParameter(toAdd : Parameter) = new SEEConfigurationBuilder(parameters + toAdd, plugins)
def withPlugin(toAdd : PlugIn) = new SEEConfigurationBuilder(parameters , plugins + toAdd)
def build = config
}
Then from Java code the API is really easy to use:
SEEConfigurationGrammar builder = SEEConfigurationBuilder.empty();
SEEConfiguration configuration = builder
.withParameter(new Parameter("name","value"))
.withParameter("directGivenName","Value")
.withPlugin(new PlugIn("pluginid"))
.build();
It's the exact same pattern. Scala allows mutation and side effects. That said, if you'd like to be more of a purist, have each method return a new instance of the object that you're constructing, with the element(s) changed. You could even put the functions in the class's companion object so that there's a higher level of separation within your code.
class Pizza(size: SizeType, layers: List[Layer], toppings: List[Topping]) {
def this(size: SizeType) = this(size, List[Layer](), List[Topping]())
def onTopOf(layer: Layer) = new Pizza(size, layers :+ layer, toppings)
def withTopping(topping: Topping) = new Pizza(size, layers, toppings :+ topping)
}
so that your code might look like
val myPizza = new Pizza(Large) onTopOf(MarinaraSauce) onTopOf(Cheese) withTopping(Ham) withTopping(Pineapple)
(Note: I've probably screwed up some syntax here.)
Using Scala, partial application is feasible if you are building a smallish object that you don't need to pass across method signatures. If any of those assumptions don't apply, I recommend using a mutable builder to build an immutable object. This being Scala, you could implement the builder pattern with a case class for the object to build and a companion object as the builder.
Given that the end result is a constructed immutable object, I don't see that it defeats any Scala principles.