How can I save and load a level file with abstract objects? - java

I'm trying to make a level editor that provides save functionality like Mario Maker - the user can create a level and save the level data. How is this usually done?

Specifically, what I'm struggling with is that my level contains a list of Enemies (an abstract class). When I write to a file I can write the JSON representation of the concrete enemy classes, but when I read from the file I'd need to know which concrete class each entry is in order to reconstruct it as that specific class. That seems like a pain - I'd have to manually add code to write out the class type when each enemy gets saved, and code to read the class type and create an instance of that class when it's loaded. I'm afraid of having to maintain that for every new Enemy I create.

So my main question is: how can I most easily read a list of concrete Enemies into a list of abstract Enemies? It seems like some knowledge about the class is required on save/load.
Also, is saving as JSON the way to go here or is serialization better? Does it matter?

Since you're going to be creating concrete classes when the program starts up, you will need to know the actual class of each one. There are a bunch of ways you could do this. For something easy, you could add a getLabel() method to each concrete class and switch on that label to figure out the correct concrete class.
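For example, the label side might look something like this - a minimal sketch, where the class and constant names are illustrative rather than taken from your code:
// Sketch: each concrete class reports its own label.
abstract class Enemy {
    public abstract String getLabel();
}

class Goomba extends Enemy {
    // The loading code below compares against this constant
    // (assumed to be visible there, e.g. via a static import).
    public static final String GOOMBA_LABEL = "goomba";

    @Override
    public String getLabel() {
        return GOOMBA_LABEL;
    }
}
The load side can then peek at the label before binding to the concrete class: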
// Using jackson-databind
ObjectMapper mapper = new ObjectMapper();
// First pass: read into a generic tree just to inspect the label
JsonNode node = mapper.readTree(json);
Enemy enemy = null;
if (GOOMBA_LABEL.equals(node.get("label").asText())) {
    // Second pass: bind the same JSON to the concrete class
    enemy = mapper.readValue(json, Goomba.class);
}
I really like using a JSON library's functionality to parse my JSON into a POJO. However, doing the above requires double parsing: 1) parse into some generic structure (like a Map or JsonNode) and check the label, then 2) parse into the POJO.
Another thing you could do is prepend a "magic number" to each JSON string to indicate its type. Then you don't have to double-parse the JSON.
DataInput input = new DataInputStream(fileInputStream);
int magic = input.readInt();        // type tag written before the JSON payload
Enemy enemy = null;
if (GOOMBA_MAGIC == magic) {
    String json = input.readUTF();  // the JSON string written with writeUTF on save
    enemy = mapper.readValue(json, Goomba.class);
}
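For completeness, the save side would be the mirror image. A rough sketch, assuming each concrete class exposes its magic number through a hypothetical getMagic() method and that the level keeps its enemies in a list (getEnemies() is also illustrative):
// Sketch of the save side: write the type tag, then the JSON payload.
DataOutput output = new DataOutputStream(fileOutputStream);
for (Enemy e : level.getEnemies()) {               // getEnemies() is a stand-in for your own accessor
    output.writeInt(e.getMagic());                 // type tag, read back with readInt()
    output.writeUTF(mapper.writeValueAsString(e)); // JSON, read back with readUTF()
}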
As far as whether JSON is the right serialization format to use, that's up to you. What's nice about it is that it's human-readable and editable. There are other serialization technologies if performance or disk usage are more important to you. For something like this, JSON seems fine.

Related

Is JSON parsing or String parsing more expensive in Java?

I'm trying to dynamically parse a lot of data in Java. In other words I have tons of different types of data so I am putting them in a String or a JSON String (via GSON) instead of doing:
if (data instanceof Class1) {
    Class1 class1 = gson.fromJson(jsonString, Class1.class);
    // do stuff for the Class1 layout
} else if (data instanceof Class2) {
    Class2 class2 = gson.fromJson(jsonString, Class2.class);
    // ...
}
// etc.
So instead of doing this for 80 or so classes (and more may be added/removed/changed at any given time), I am planning to put the data in either a String or a JSON String and parse out the values (depth first).
I am on a low-end PC (single-core Atom processor) and am trying to reduce the load I put on the CPU, so I'm trying to determine which would be faster: regex/String manipulation with splits, or a recursive JSON parser.
Let us discuss both of the cases you have mentioned here:
CASE 1: Creating instances for each and every JSON input using gson (mapped to a class).
CASE 2: Creating a Document (or similar generic object) and accessing the data from there.
For Case 2, you don't need to write a parser yourself: there are parsers already available.
I'll just write down a jackson example here (There must be similar stuff available with gson as well):
String myJson = "{ \"type\" : \"foo\", \"class\" : 3 }";
ObjectMapper objectMapper = new ObjectMapper();
JsonNode node = objectMapper.readValue(myJson, JsonNode.class);
JsonNode type = node.get("type");
System.out.println(type.asText());
In my experience the performance difference between these two cases hasn't been significant, as the libraries handle the parsing quite efficiently; but if you have a lot of different types of JSON, it doesn't make sense to create a POJO for each and every one (so much mapping!).
I have personally not worked with gson, for performance reasons like this, but with jackson you can create what's called an ObjectMapper and use it quite efficiently. I assume there is a similar thing in gson as well.
The only downside is that every field will now be accessed through a string key, which might make your code a bit less readable.
EDIT:
If you really wish to iterate through all the fields of the JSON, you'll ultimately have to do a DFS, but re-parsing can still be avoided by using the fields() method of JsonNode.
root.fields() returns an iterator over the entries, which can then be used as described in the answer here.
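A minimal sketch of that, assuming root is a JsonNode obtained as in the example above:
import java.util.Iterator;
import java.util.Map;

// Walk the top-level fields without binding to a POJO.
Iterator<Map.Entry<String, JsonNode>> fields = root.fields();
while (fields.hasNext()) {
    Map.Entry<String, JsonNode> field = fields.next();
    System.out.println(field.getKey() + " -> " + field.getValue());
}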
Remember, similar stuff with different names will be available in gson too. :)

How can I lazy load specific elements of a JSON file efficiently with GSON?

I have a JSON file that is marshalled into a custom object using GSON.
All works fine.
An example of the structure I have:
public class Domain {
    private String id;
    private String name;
    // other fields
    private ArrayList<Structures> structs;
    private ArrayList<Buildings> buildings;
    private ArrayList<CustomObject> objects;
    // some more lists and fields
}
So basically I create a builder and parse the json
Gson gson = new GsonBuilder().create();
gson.fromJson(jsonString, Domain.class);
This works absolutely fine. Parsing is done and I get my data.
Problem:
I don't really need to have all the fields of the Domain class populated from the start, because, for example, the Domain may contain a lot of elements in the structs list that I might never need to access.
What I need to do is some kind of pattern for lazy loading.
I.e. I would like to not load some parts of the class during the JSON parsing, and only load them when needed.
Question:
If I understand correctly, the way to keep fields from being parsed is to make them transient.
But then, if at some later time I need to access e.g. the structs, how would I actually load them at that point? Reloading/reparsing all of the JSON again seems suboptimal.
What is a standard/good approach for this?
This is a really lengthy topic. There are many approaches, and most of them are a lot more complicated. The easiest one, if you really value simplicity, has for me so far been not to use gson, but something like JSONObject, and to populate the object myself. Using that you can easily split the work into multiple steps. The problem that then arises is that you never know what exactly is already loaded - or rather, what is loaded but just not yet filled in as a field.
Lazy loading through automatic conversion like gson's is unfortunately always going to involve unnecessary object creation too, so the question is whether it isn't less pain to just do it yourself from the beginning.
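A minimal sketch of that manual approach, using org.json's JSONObject and the field names from the question (the setters on Domain are assumed to exist and are purely illustrative):
import org.json.JSONArray;
import org.json.JSONObject;

// Parse once into a generic tree and pull out only what is needed right now.
JSONObject root = new JSONObject(jsonString);
Domain domain = new Domain();
domain.setId(root.getString("id"));     // hypothetical setters on Domain
domain.setName(root.getString("name"));

// Later, when the structs are actually needed:
JSONArray structs = root.getJSONArray("structs");
for (int i = 0; i < structs.length(); i++) {
    JSONObject struct = structs.getJSONObject(i);
    // build a Structures instance from "struct" here
}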
If it has to be gson, you could have different objects for different stages: read them in from the JSON and then apply them to your main object.
A favourable version of that is probably to split the object up into different components (or aspects, or whatever you want to call them) that match the different loading stages. There are different possibilities, but let's just pick one of them:
class Domain {
    private String id;
    private DomainStructs domainStructs;
}

class DomainStructs {
    private ArrayList<Structures> structs;
}
You now need an extra object in this version, so the overall model gets slightly (but not much) bigger, and you should group together things that are needed together anyway - so don't load every field separately. But this way you can leave parts out and easily add them later by populating them from Gson, in two steps here:
Gson gson = new GsonBuilder().create();
Domain domain = gson.fromJson(jsonString, Domain.class); // first time
domain.domainStructs = gson.fromJson(jsonString, DomainStructs.class); // now adding the structs
I am not saying this is the best idea ever, but it fulfills your idea while still using gson.
I would, though, consider splitting up the data itself - so not storing one big string, but holding the data in different components, if that is possible. Basically you want a domainJsonString and a domainStructsJsonString, if you get what I mean, stored in a way that lets you retrieve them separately.
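One way to get those two strings without keeping two files is to split the JSON once up front using Gson's own tree API - a sketch, assuming a Gson version that has JsonParser.parseString (2.8.6+):
import com.google.gson.JsonObject;
import com.google.gson.JsonParser;

// Split the original JSON once, so each piece can be stored and parsed on demand.
JsonObject root = JsonParser.parseString(jsonString).getAsJsonObject();

JsonObject structsOnly = new JsonObject();
structsOnly.add("structs", root.remove("structs")); // keep the field name so DomainStructs still binds

String domainStructsJsonString = structsOnly.toString();
String domainJsonString = root.toString();          // everything except the structs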
I hope this helps you move a bit forward.

transform Scala case class to org.apache.solr.common.SolrInputDocument

Are there any generic implementations out there which can transform a Scala case class to a SolrDocument?
Since I could not find any such mapper utility that I could reuse, I took the approach below:
1. Create the case class object.
2. Get the non-empty fields by transforming the case class object to a Map.
3. Add the fields to the mutable document one by one.
This approach works for me, but I would have to create the intermediate Map object. I want to avoid this for verbosity and complexity reasons. Is there a better way of doing it?
Not a complete answer (I would write this in a comment, but I need a few more rep), but just to point you in a direction: macros are the way to do this in a type-safe way without writing boilerplate mapping functions for every case class. JSON libraries deal with the same problem (just replace SolrDocument with JSON object). As an example, you can take a look at the JSON serializer/deserializer macro implementation from Play Framework:
https://github.com/playframework/playframework/blob/master/framework/src/play-json/src/main/scala/play/api/libs/json/JsMacroImpl.scala
I suspect this solution is a little heavier than you were looking for. The way I would approach it is to write the stupid boilerplate mapping functions for each case class, and only go down the macro path if this becomes a significant burden.
Seems fairly trivial to modify one of these answers:
def getSolrDoc(cc: Product): SolrDocument = {
  val solrDoc = new SolrDocument
  cc.getClass.getDeclaredFields.foreach { f =>
    f.setAccessible(true)
    solrDoc.addField(f.getName, f.get(cc))
  }
  solrDoc
}
And usage:
case class X(a:Int, b:String)
val x = X(1, "Test")
val doc = getSolrDoc(x)
println(doc) // prints SolrDocument{a=1, b=Test}

Serialize a hierarchical structure of objects in java

I have an abstract class Screen and child classes: GameScreen, SpellScreen, StatsScreen, etc.
The game works in this way: a Renderer class creates a root
Screen screenRoot = new GameScreen()
which is then free to add screens to itself; those may in turn add screens to themselves, and so on. A tree-like structure is therefore formed, with every Screen containing a list of its child screens.
Now I am wondering if it's possible to perform serialization and deserialization on that - I'd like to recreate all the screens in the same hierarchy.
Is it enough to serialize only the screenRoot object, and then deserialize it (provided I want to preserve the whole screens tree), or do I need to traverse the tree myself somehow?
How would you go about serializing this?
P.S. the game is for Android and uses OpenGL ES 2.0.
A hierarchy of objects is no impediment to using Java Serialization, which can cope with arbitrary object graphs - and yes, serializing an object using Java Serialization will serialize all objects it refers to (unless a reference is marked transient), provided every class in the graph implements Serializable. Assuming that's what you want, serializing the hierarchy is as simple as:
try (ObjectOutputStream oos = new ObjectOutputStream(new BufferedOutputStream(new FileOutputStream(filename)))) {
    oos.writeObject(screenRoot);
}
and reading as simple as:
try (ObjectInputStream ois = new ObjectInputStream(new BufferedInputStream(new FileInputStream(filename)))) {
    return (GameScreen) ois.readObject();
}
There are two issues here.
First, screens should be just that - screens. They shouldn't contain the "model", i.e. the data that represents your game state; only the view/rendering of that state. So serializing and deserializing them doesn't really make sense. I would suggest looking at your architecture again to see if this is really what you want to do.
If you decide to do it, or if you have another game-state object root that you can serialize (I usually use the Player since it has all the essential data in it), you can easily do this with Gson:
// Save
RootObject o = ...; // The root of the hierarchy to serialize
Gson gson = new Gson();
String serialized = gson.toJson(o); // JSON string, e.g. { "Player": { ... } }

// Load
RootObject deserialized = gson.fromJson(serialized, RootObject.class);
You can read more in their user guide.
Second, on the issue of JSON and Gson: I prefer this over standard serialization because it's robust in the face of changes. If your class definitions change, you can still deserialize objects (albeit with null/empty fields) instead of getting a runtime exception, and you don't need to worry about versioning your classes, either.
Edit: questions like this are better suited to the Game Dev SE site.

Transfer of a Java Serialized Object

Is it possible to declare an instance of a serializable object in one Java program/class, then repeat the definitions of the internal objects in a different program/class entirely, and load in a big complex object from a data file? The goal is to be able to write an item editor that's kept locally on my build machine, then write the game itself and distribute it to people who would like to play it.
I'm writing a game in Java as a hobbyist project. Within my game, there's a family of classes that extend a parent class, GameItem. Items might be in various families like HealingPotion, Bomb, KeyItem, and so on.
class GameItem implements Serializable {
    String ItemName;
    String ImageResourceLocation;
    // ...
}
What I want to do is include definitions of how to create each item in a particular family of items, but then have a big class called GameItemList, which contains all possible items that can occur as you play the game.
class GameItemList implements Serializable {
    LinkedList<GameItem> gameItemList;
    // methods here like LookUpByName, LookUpByIndex that return references to an item
}
Maybe at some point - as the player starts a new game, or as the game launches - I'd do something like:
//create itemList
FileInputStream fileIn = new FileInputStream("items.dat");
ObjectInputStream in = new ObjectInputStream(fileIn);
GameItemList allItems = (GameItemList)in.readObject();
in.close();
//Now I have an object called allItems that can be used for lookups.
Thanks guys, any comments or help would be greatly appreciated.
When you serialize an object, every field of the object is serialized, unless marked with transient. And this behavior is of course recursive. So yes, you can serialize an object, then deserialize it, and the deserialized object will have the same state as the serialized one. A different behavior would make serialization useless.
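A sketch of the editor-side write that would produce the items.dat your game then reads - buildAllItems() is just a stand-in for however the editor assembles its list:
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;

// On the build machine, the editor writes the complete GameItemList to items.dat.
GameItemList allItems = buildAllItems(); // hypothetical: however the editor builds its items
try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream("items.dat"))) {
    out.writeObject(allItems);           // every non-transient field is written, recursively
} catch (IOException e) {
    e.printStackTrace();
}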
I wouldn't use native serialization for long-term storage of data, though. Serialized objects are hard to inspect, impossible to modify using a text editor, and maintaining backward compatibility with older versions of the classes is hard. I would use a more open format like XML or JSON.
Yes, that is possible. If an object is correctly serialized, it can be deserialized on any other machine as long as the application running there knows the definition of the class being deserialized.
This will work, but Java serialization is notorious for making it hard to "evolve" classes -- the internal representation is explicitly tied to the on-disk format. You can work around this with custom reader / writer methods, but you might consider a more portable format like JSON or XML instead of object serialization.
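For reference, a sketch of what those custom reader/writer hooks look like on the GameItem from the question - whether you want to go this route at all is, as noted above, debatable:
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

class GameItem implements Serializable {
    private static final long serialVersionUID = 1L;

    String ItemName;
    String ImageResourceLocation;

    // Custom hooks decouple the on-disk layout from the in-memory fields,
    // which makes it easier to evolve the class later.
    private void writeObject(ObjectOutputStream out) throws IOException {
        out.writeUTF(ItemName);
        out.writeUTF(ImageResourceLocation);
    }

    private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
        ItemName = in.readUTF();
        ImageResourceLocation = in.readUTF();
    }
}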
