I have a 3-level nested Java POJO that looks like this in the schema file:
struct FPathSegment {
  originIata:ushort;
  destinationIata:ushort;
}

table FPathConnection {
  segments:[FPathSegment];
}

table FPath {
  connections:[FPathConnection];
}
When I try to serialize the Java POJO into its FlatBuffers equivalent, I get a "nested serialization is not allowed" error pretty much every time I try to use a single FlatBufferBuilder to build the entire object graph.
The docs give no clue whether I should use a single builder for the entire graph or a separate one for every table/struct. If separate, how do you import the child objects into the parent?
There are all these methods to create/start/add various vectors, but no explanation of which builders go where. Painfully complicated.
Here is the Java code where I attempt to serialize my Java POJO into its FlatBuffers equivalent:
private FPath convert(Path path) {
    FlatBufferBuilder bld = new FlatBufferBuilder(1024);
    // build the Flatbuffer object
    FPath.startFPath(bld);
    FPath.startConnectionsVector(bld, path.getConnections().size());
    for (Path.PathConnection connection : path.getConnections()) {
        FPathConnection.startFPathConnection(bld);
        for (Path.PathSegment segment : connection.getSegments()) {
            FPathSegment.createFPathSegment(bld,
                stringCache.getPointer(segment.getOriginIata()),
                stringCache.getPointer(segment.getDestinationIata()));
        }
        FPathConnection.endFPathConnection(bld);
    }
    FPath.endFPath(bld);
    return FPath.getRootAsFPath(bld.dataBuffer());
}
Every start() method throws a "FlatBuffers: object serialization must not be nested" exception; I can't figure out the right way to do this.
You use a single FlatBufferBuilder, but you must finish serializing children before starting the parents.
In your case, that requires you to move FPath.startFPath to the end, and FPath.startConnectionsVector to just before that. This means you need to store the offsets for each FPathConnection in a temp array.
This will make the nesting error go away.
The reason for this inconvenience is to allow the serialization process to proceed without any temporary data structures.
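Concretely, here is a minimal sketch of the reordered convert(). The helper names (startSegmentsVector, createConnectionsVector, addSegments, addConnections, finish) assume the usual code flatc generates for this schema, and it assumes getSegments() returns a List and that stringCache.getPointer(...) yields the ushort codes the struct expects; adjust to whatever your generated classes actually expose.

private FPath convert(Path path) {
    FlatBufferBuilder bld = new FlatBufferBuilder(1024);

    // 1) Serialize the children first and remember their offsets.
    int[] connectionOffsets = new int[path.getConnections().size()];
    int c = 0;
    for (Path.PathConnection connection : path.getConnections()) {
        // Struct vectors are built in place: start the vector, create each
        // struct inline (FlatBuffers prepends, so iterate in reverse to keep
        // the original order), then end the vector.
        List<Path.PathSegment> segments = connection.getSegments();
        FPathConnection.startSegmentsVector(bld, segments.size());
        for (int i = segments.size() - 1; i >= 0; i--) {
            Path.PathSegment segment = segments.get(i);
            FPathSegment.createFPathSegment(bld,
                stringCache.getPointer(segment.getOriginIata()),
                stringCache.getPointer(segment.getDestinationIata()));
        }
        int segmentsVector = bld.endVector();

        FPathConnection.startFPathConnection(bld);
        FPathConnection.addSegments(bld, segmentsVector);
        connectionOffsets[c++] = FPathConnection.endFPathConnection(bld);
    }

    // 2) Only now build the parent from the stored offsets.
    int connectionsVector = FPath.createConnectionsVector(bld, connectionOffsets);
    FPath.startFPath(bld);
    FPath.addConnections(bld, connectionsVector);
    int root = FPath.endFPath(bld);
    bld.finish(root);

    return FPath.getRootAsFPath(bld.dataBuffer());
}

The shape is always the same: build the leaves, collect their offsets, then pass those offsets to the parent via the add/create methods before ending it.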
I'm new to this and attempting to work with the setlist.fm REST API from Android Studio, but I'm having some issues fitting my GET request results into my Java data model.
In particular, I have modeled "sets" ("set" refers to a set played at a concert) as a Java class. But I commonly get results back from my HTTP requests that have "set" as an empty string or even an array.
I'll use this following GET request for all Radiohead setlists as an example:
http://api.setlist.fm/rest/0.1/artist/a74b1b7f-71a5-4011-9441-d0b5e4122711/setlists.json
Notice how "sets" is an object for the most part, but in some instances it is a String, and in others an array.
Android Studio gives me the following error when I try to parse the JSON into my data model with Gson, using this line of code:
gson.fromJson(result.toString(), Response.class);
It appears to be failing on an instance where "sets" is an empty string rather than an object:
Expected BEGIN_OBJECT but was STRING at line 1 column 942 path $.setlists.setlist[0].sets
Does anyone have advice on how to handle this type of thing? I've noticed it with all artists I've looked up so far.
Thanks!
Assuming Response is a class you wrote containing the main fields of the JSON, and that at some point in it you have:
#SerializedName("setlist")
private List<MyItem> setlist;
I also assume your MyItem class contains the field:
#SerializedName("sets")
private List<MySet> sets;
If you let Gson parse it as-is, it will fail when it finds a string instead of a list (i.e. an array) of MySet objects.
But you can write a custom TypeAdapter (or JsonDeserializer) for your MyItem.
There's plenty of documentation about how to write a Gson TypeAdapter; look for it. Something along the lines of the sketch below should work.
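For instance, here is a minimal sketch using a JsonDeserializer registered for MyItem. The MyItem/MySet classes, the setSets setter and the exact shape of the setlist.fm payload are assumptions based on the description above, so adapt the field handling to your model:

import com.google.gson.*;
import java.lang.reflect.Type;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch: tolerate "sets" being an object, an array, or an empty string.
class MyItemDeserializer implements JsonDeserializer<MyItem> {
    @Override
    public MyItem deserialize(JsonElement json, Type typeOfT, JsonDeserializationContext context)
            throws JsonParseException {
        JsonObject obj = json.getAsJsonObject();
        List<MySet> sets = new ArrayList<MySet>();
        JsonElement setsElement = obj.get("sets");
        if (setsElement != null && setsElement.isJsonObject()) {
            // Assumed normal case: { "sets": { "set": [ ... ] } }
            JsonElement inner = setsElement.getAsJsonObject().get("set");
            if (inner != null && inner.isJsonArray()) {
                for (JsonElement e : inner.getAsJsonArray()) {
                    sets.add((MySet) context.deserialize(e, MySet.class));
                }
            } else if (inner != null && inner.isJsonObject()) {
                sets.add((MySet) context.deserialize(inner, MySet.class));
            }
        }
        // When "sets" is "" (empty string) or something else unexpected, leave the list empty.
        MyItem item = new MyItem();
        item.setSets(sets); // assumes MyItem has such a setter
        // ... copy the other MyItem fields from obj here ...
        return item;
    }
}

// Registration, wherever you build your Gson instance:
Gson gson = new GsonBuilder()
        .registerTypeAdapter(MyItem.class, new MyItemDeserializer())
        .create();

The point is simply to inspect what Gson actually received for "sets" (object, array, or empty string) before deciding how to map it, instead of letting the default reflection-based mapping throw.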
Use the instanceof operator to determine the type and cast accordingly.
JSONObject response = new JSONObject(res);
if (response.get("key") instanceof JSONObject)
{
    // code for JSONObject
}
else if (response.get("key") instanceof JSONArray)
{
    // code for JSONArray
}
And so on
First, I know my title is bad; I couldn't come up with a better one, so I'm open to suggestions.
I'm using Retrofit to get data from an API of this kind: @GET("users/{userid}")
It works fine and I'm happy with it. The problem is when I call the same API with @POST("users/widget") and a list of ids. I get the following response:
{
    "long_hash_id": {
        "_id": "long_hash_id"
        .......
    },
    "long_hash_id": {
        "_id": "long_hash_id",
        .....
    },
    ........
}
the "long_hash_id" is typicaly "525558cf8ecd651095af7954"
it correspond to the id of the user attached to it.
When I didn't use retrofit, I used Gson in stream mode to get each user one by one. But I don't know how to tell retrofit.
Hope I'm clear and
Thank you in advance.
----------- Solution:
I made my interface this way:
@FormUrlEncoded
@POST(AppConstants.ROUTE_USER_GROUP)
Call<Map<String, User>> getUsers(@Field("ids") List<String> param, @QueryMap Map<String, String> options);
and I simply passed it my ArrayList of ids. Thank you very much.
Gson is able to deal with JSON objects with variable keys like the one you posted. What you have to do in this case is declare a Map<String, ModelClass>, where ModelClass represents the content of the JSON object value you want to map.
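As a minimal illustration with plain Gson (User being your model class for the inner objects, jsonString the response body):

import com.google.gson.Gson;
import com.google.gson.reflect.TypeToken;
import java.lang.reflect.Type;
import java.util.Map;

// The variable top-level keys ("525558cf...") become the map keys;
// each value is deserialized into a User.
Type mapType = new TypeToken<Map<String, User>>() {}.getType();
Map<String, User> users = new Gson().fromJson(jsonString, mapType);

Retrofit's Gson converter does the same thing when the declared return type of the call is Map<String, User>, which is exactly what the interface in the solution above declares.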
On Play Framework's homepage they claim that "JSON is a first class citizen". I have yet to see the proof of that.
In my project I'm dealing with some pretty complex JSON structures. This is just a very simple example:
{
    "key1": {
        "subkey1": {
            "k1": "value1",
            "k2": [
                "val1",
                "val2",
                "val3"
            ]
        }
    },
    "key2": [
        {
            "j1": "v1",
            "j2": "v2"
        },
        {
            "j1": "x1",
            "j2": "x2"
        }
    ]
}
Now I understand that Play is using Jackson for parsing JSON. I use Jackson in my Java projects and I would do something simple like this:
ObjectMapper mapper = new ObjectMapper();
Map<String, Object> obj = mapper.readValue(jsonString, Map.class);
This nicely parses my JSON into a Map object, which is what I want: a map of string/object pairs that lets me easily cast an array to an ArrayList.
The same example in Scala/Play would look like this:
val obj: JsValue = Json.parse(jsonString)
This instead gives me a proprietary JsObject type which is not really what I'm after.
My question is: can I parse JSON string in Scala/Play to Map instead of JsObject just as easily as I would do it in Java?
Side question: is there a reason why JsObject is used instead of Map in Scala/Play?
My stack: Play Framework 2.2.1 / Scala 2.10.3 / Java 8 64bit / Ubuntu 13.10 64bit
UPDATE: I can see that Travis' answer is upvoted, so I guess it makes sense to everybody, but I still fail to see how that can be applied to solve my problem. Say we have this example (jsonString):
[
    {
        "key1": "v1",
        "key2": "v2"
    },
    {
        "key1": "x1",
        "key2": "x2"
    }
]
Well, according to all the directions, I should now put in all this boilerplate whose purpose I otherwise don't understand:
case class MyJson(key1: String, key2: String)
implicit val MyJsonReads = Json.reads[MyJson]
val result = Json.parse(jsonString).as[List[MyJson]]
Looks good to go, huh? But wait a minute: another element comes into the array and totally ruins this approach:
[
    {
        "key1": "v1",
        "key2": "v2"
    },
    {
        "key1": "x1",
        "key2": "x2"
    },
    {
        "key1": "y1",
        "key2": {
            "subkey1": "subval1",
            "subkey2": "subval2"
        }
    }
]
The third element no longer matches my defined case class, so I'm back at square one. I am able to use these and much more complicated JSON structures in Java every day; is Scala suggesting that I should simplify my JSON in order to fit its "type safe" policy? Correct me if I'm wrong, but I thought the language should serve the data, not the other way around?
UPDATE2: Solution is to use Jackson module for scala (example in my answer).
Scala in general discourages the use of downcasting, and Play Json is idiomatic in this respect. Downcasting is a problem because it makes it impossible for the compiler to help you track the possibility of invalid input or other errors. Once you've got a value of type Map[String, Any], you're on your own—the compiler is unable to help you keep track of what those Any values might be.
You have a couple of alternatives. The first is to use the path operators to navigate to a particular point in the tree where you know the type:
scala> val json = Json.parse(jsonString)
json: play.api.libs.json.JsValue = {"key1": ...
scala> val k1Value = (json \ "key1" \ "subkey1" \ "k1").validate[String]
k1Value: play.api.libs.json.JsResult[String] = JsSuccess(value1,)
This is similar to something like the following:
val json: Map[String, Any] = ???
val k1Value = json("key1")
.asInstanceOf[Map[String, Any]]("subkey1")
.asInstanceOf[Map[String, String]]("k1")
But the former approach has the advantage of failing in ways that are easier to reason about. Instead of a potentially difficult-to-interpret ClassCastException, we'd just get a nice JsError value.
Note that we can validate at a point higher in the tree if we know what kind of structure we expect:
scala> println((json \ "key2").validate[List[Map[String, String]]])
JsSuccess(List(Map(j1 -> v1, j2 -> v2), Map(j1 -> x1, j2 -> x2)),)
Both of these Play examples are built on the concept of type classes, and in particular on instances of the Reads type class provided by Play. You can also provide your own type class instances for types that you've defined yourself. This would allow you to do something like the following:
val myObj = json.validate[MyObj].getOrElse(someDefaultValue)
val something = myObj.key1.subkey1.k2(2)
Or whatever. The Play documentation (linked above) provides a good introduction to how to go about this, and you can always ask follow-up questions here if you run into problems.
To address the update in your question, it's possible to change your model to accommodate the different possibilities for key2, and then define your own Reads instance:
case class MyJson(key1: String, key2: Either[String, Map[String, String]])
implicit val MyJsonReads: Reads[MyJson] = {
  val key2Reads: Reads[Either[String, Map[String, String]]] =
    (__ \ "key2").read[String].map(Left(_)) or
    (__ \ "key2").read[Map[String, String]].map(Right(_))

  ((__ \ "key1").read[String] and key2Reads)(MyJson(_, _))
}
Which works like this:
scala> Json.parse(jsonString).as[List[MyJson]].foreach(println)
MyJson(v1,Left(v2))
MyJson(x1,Left(x2))
MyJson(y1,Right(Map(subkey1 -> subval1, subkey2 -> subval2)))
Yes, this is a little more verbose, but it's up-front verbosity that you pay for once (and that provides you with some nice guarantees), instead of a bunch of casts that can result in confusing runtime errors.
It's not for everyone, and it may not be to your taste—that's perfectly fine. You can use the path operators to handle cases like this, or even plain old Jackson. I'd encourage you to give the type class approach a chance, though—there's a steep-ish learning curve, but lots of people (including myself) very strongly prefer it.
I've chosen to use Jackson module for scala.
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
import com.fasterxml.jackson.module.scala.experimental.ScalaObjectMapper
val mapper = new ObjectMapper() with ScalaObjectMapper
mapper.registerModule(DefaultScalaModule)
val obj = mapper.readValue[Map[String, Object]](jsonString)
For further reference and in the spirit of simplicity, you can always go for:
Json.parse(jsonString).as[Map[String, JsValue]]
However, this will throw an exception for JSON strings not corresponding to the format (but I assume that goes for the Jackson approach as well). The JsValue can now be processed further like:
jsValueWhichBetterBeAList.as[List[JsValue]]
I hope the difference between handling Objects and JsValues is not an issue for you (only because you were complaining about JsValues being proprietary). Obviously, this is a bit like dynamic programming in a typed language, which usually isn't the way to go (Travis' answer is usually the way to go), but sometimes that's nice to have I guess.
You can simply extract the value of a JSON object, and Scala gives you the corresponding map.
Example:
var myJson = Json.obj(
  "customerId" -> "xyz",
  "addressId" -> "xyz",
  "firstName" -> "xyz",
  "lastName" -> "xyz",
  "address" -> "xyz"
)
Suppose you have a JSON object like the one above.
To convert it into a map, simply do:
var mapFromJson = myJson.value
This gives you a map of type scala.collection.immutable.HashMap$HashTrieMap.
I would recommend reading up on pattern matching and recursive ADTs in general to better understand why Play Json treats JSON as a "first-class citizen".
That being said, many Java-first APIs (like Google Java libraries) expect JSON deserialized as Map[String, Object]. While you can very simply create your own function that recursively generates this object with pattern matching, the simplest solution would probably be to use the following existing pattern:
import com.google.gson.Gson
import java.util.{Map => JMap, LinkedHashMap}
val gson = new Gson()
def decode(encoded: String): JMap[String, Object] =
  gson.fromJson(encoded, (new LinkedHashMap[String, Object]()).getClass)
The LinkedHashMap is used if you would like to maintain key ordering at the time of deserialization (a HashMap can be used if ordering doesn't matter). Full example here.
I am writing an Android app for a co-worker that keeps track of kids signed up for a soccer league. I am currently having trouble saving/serializing my roster and then deserializing it later. The object I am serializing is an array of Player objects. The custom Player class implements Serializable, so an array of them should be fine to serialize (as far as I know).
My serialization/saving method:
String ser = SerializeObject.objectToString(currentRoster.getRosterArray());
if (ser != null && !ser.equalsIgnoreCase("")) {
    SerializeObject.WriteSettings(this, ser, "playerRoster"); // .dat extension
} else {
    System.out.println("Object not saved");
    SerializeObject.WriteSettings(this, "", "playerRoster");
}
My deserialization method:
String ser = SerializeObject.ReadSettings(this, "playerRoster");
if (ser != null && !ser.equalsIgnoreCase("")) {
    Object obj = SerializeObject.stringToObject(ser);
    // Then cast it to your object
    if (obj instanceof Player[]) {
        // Do something
        loadedRoster = (Player[]) obj;
        System.out.println(loadedRoster[0]);
    }
}
The result I am getting in my app is gibberish for every player in the array when it is deserialized.
My question is whether I am correctly saving and loading the data, or whether I am forgetting something. (I left out some of the filler code and exception handling to keep it cleaner.)
Thanks for any help!
When facing the problem of (de)serialization or (un)marshalling, I turned to JAXB to (de)serialize JSON or XML. I tried Jackson, but didn't get the results I was looking for, particularly with my XML: Jackson likes to set namespaces for XML (and defaults to "") and I needed mine without. Other than that it was great: no dependencies, and it handles well-formed XML and JSON, unlike Gson and JSON-java.
When reading about using any of the above approaches you can't go wrong reading anything by Blaise Doughan or StaxMan. You can find a tutorial about JAXB in general right here. For using MOXy as your JAXB provider, this shows all the necessary code and links to anything else you need to know to (de)serialize/(un)marshal your objects.
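To give a flavour of what that looks like, here is a minimal, self-contained JAXB round trip with a made-up Player bean (plain javax.xml.bind, as bundled with the JDK up to Java 8; MOXy would be configured on top of this as the provider if you also want JSON):

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Marshaller;
import javax.xml.bind.Unmarshaller;
import javax.xml.bind.annotation.XmlRootElement;
import java.io.StringReader;
import java.io.StringWriter;

public class JaxbDemo {

    // Hypothetical bean: JAXB needs a no-arg constructor and accessible fields/properties.
    @XmlRootElement
    public static class Player {
        public String name;
        public int number;
    }

    public static void main(String[] args) throws Exception {
        JAXBContext ctx = JAXBContext.newInstance(Player.class);

        Player p = new Player();
        p.name = "Alice";
        p.number = 7;

        // Marshal (serialize) to XML...
        StringWriter out = new StringWriter();
        Marshaller marshaller = ctx.createMarshaller();
        marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);
        marshaller.marshal(p, out);
        System.out.println(out);

        // ...and unmarshal (deserialize) it back.
        Unmarshaller unmarshaller = ctx.createUnmarshaller();
        Player back = (Player) unmarshaller.unmarshal(new StringReader(out.toString()));
        System.out.println(back.name + " #" + back.number);
    }
}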