JSON deep serialize - Java

In a case where Person is a POJO having a List of "hobbies".
Just trying to understand this statement to implement a deep serialize mechanism:
new JSONSerializer().include("hobbies").serialize( person );
Does the syntax seem intuitive? From a Java user's POV, it seems the syntax should be:
new JSONSerializer().serialize( person ).include("hobbies");
I say this because it seems intuitive to first serialize the primary object and then include any Lists or references thereof.
Also, is the source code of flexjson available for public use? It is not present on sourceforge.net

You cannot do the latter so easily - the implementation would not know when you are done. You need some kind of terminating call that performs the action, such as .run() or .done().
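For illustration, a minimal self-contained sketch of how that call chain reads in practice - the Person fields and sample values here are made up; only the include()/serialize() chain is Flexjson's own API:

import flexjson.JSONSerializer;
import java.util.Arrays;
import java.util.List;

public class Person {
    private String name = "Joe";
    private List<String> hobbies = Arrays.asList("chess", "running");

    public String getName() { return name; }
    public List<String> getHobbies() { return hobbies; }

    public static void main(String[] args) {
        // Flexjson's shallow serialization skips collections by default;
        // include("hobbies") pulls the list in, and serialize(...) is the
        // terminating call that actually produces the JSON string.
        String json = new JSONSerializer().include("hobbies").serialize(new Person());
        System.out.println(json);
    }
}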

Related

Can I create a DeepCopy of an Java Object/Entity with Mapstruct?

I have a JPA entity (but this question is interesting in general) that consists of multiple child classes (aggregation).
I need to create a new entry in the DB that is 90% identical to the existing one (a few business values and of course the IDs need to be different).
As we need mapstruct for mapping between entity and TO I was thinking "Can mapstruct do this for me?" After Creating a deep copy I could simply update the remaining fields and persist the object.
Writing a copy constructor by hand is error prone (as newly added fields could be forgotten), so a generator approach would be much appreciated.
Yes, you can use DeepClone:
This Javadoc contains an example:
https://mapstruct.org/documentation/dev/api/org/mapstruct/control/MappingControl.html
import org.mapstruct.Mapper;
import org.mapstruct.control.DeepClone;
import org.mapstruct.factory.Mappers;

@Mapper(mappingControl = DeepClone.class)
public interface CloningMapper {
    CloningMapper INSTANCE = Mappers.getMapper( CloningMapper.class );
    MyDTO clone(MyDTO in);
}
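In use, that mapper would be called along these lines (original here being any existing MyDTO instance):

MyDTO copy = CloningMapper.INSTANCE.clone( original );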
Yes, but be careful: if MapStruct discovers the same type on source and target, it will simply take (not clone) the source instance, unless you define a method signature for it.
In other words: check the generated code carefully.

schema.org deserialize in Java

I am trying to deserialize schema.org's objects, but every time I hit a wall of complexity. I'm not sure if it's my fault or if no one has ever done this. I tried several schema.org items and all of them sooner or later run into the same issue (for obvious reasons, actually). The problem lies in properties like "author". For example, a cooking recipe has an author, and schema.org/Recipe says the author can be either a Person or an Organisation. Both are schema.org objects.
Until now it's easy. I get a schema for a Recipe and pass it to jsonschema2pojo.org and obtain my classes.
Then with Gson
Gson gson = new Gson();
Recipe recipe = gson.fromJson(myString,Recipe.class);
myString is the JSON-LD I used to generate the Recipe classes. Once I try to download some more recipes from the web, I immediately encounter schemas where the author is not a schema.org item but a simple String. From this point on I am blocked. The parser is stuck, exactly like Google's schemaorg-java parser.
I did read that some people modify the class to have the author as an Object and then modify the getters and setters. A deserializer would have to be made for the whole Recipe class, but it would only need to behave differently for Author (and other similar parameters).
Isn't there an easier way to deserialize schema.org in java? Am I googling wrong?
If you're using Gson you'll need to create custom deserialisers. Your best bet is to read the type of the author value and, if it is a string, create a custom Author POJO with the name set to that string.
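As an illustrative sketch of that approach - Author and Recipe stand in for whatever jsonschema2pojo generated for you, and Author is assumed to expose a name property via setName():

import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import com.google.gson.JsonDeserializationContext;
import com.google.gson.JsonDeserializer;
import com.google.gson.JsonElement;
import com.google.gson.JsonParseException;
import java.lang.reflect.Type;

class AuthorDeserializer implements JsonDeserializer<Author> {
    @Override
    public Author deserialize(JsonElement json, Type typeOfT, JsonDeserializationContext context)
            throws JsonParseException {
        if (json.isJsonPrimitive()) {
            // The markup used a plain string: wrap it in an Author with only the name set.
            Author author = new Author();
            author.setName(json.getAsString());
            return author;
        }
        // Otherwise let a plain Gson instance (without this adapter) map the object normally,
        // which also avoids recursing back into this deserializer.
        return new Gson().fromJson(json, Author.class);
    }
}

Registered when building the Gson instance:

Gson gson = new GsonBuilder()
        .registerTypeAdapter(Author.class, new AuthorDeserializer())
        .create();
Recipe recipe = gson.fromJson(myString, Recipe.class);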
You've chosen a difficult language to implement this parser. Strongly typed languages are going to have a tough time deserialising data from a loosely typed language.
On top of that, schema.org isn't well defined, plus people will screw up their schema.org markup. It's up to you to decide how you'll handle it: do you reject all data that doesn't conform exactly to the schema, or do you try to coerce it?
I'm curious what you're working on. Is it a web service?

Java Object Clone (additional class member) using Prototype, Builder Pattern

It is not easy to explain my issue.
JPA creates some complex objects for calculations, which are stored in a database.
We decided to set the results in a working copy of these objects.
This means that for each object model we created a separate working-copy model file with the same fields, but different LocalDate values and new result fields.
When the calculation starts, the working copies are instantiated.
I don't think this approach is the best.
I am thinking of the prototype pattern to clone the objects.
But then I run into the problem of how to add the new fields. How?
Instantiation has a cost, and it creates lots of additional model class files.
The only alternative I can think of is to put the result fields into the calculation data models as transient fields.
Maybe an inner class or a local class?
I also tried to use an interface as a data bucket.
But that is not really the purpose of interfaces, and it only works with a number of curious tricks.
For unit tests and user input, I think it is best to use the builder pattern and then tell JPA to store the parent object - or not?
Sorry, but my answer was too long for a comment :(
There is a big, complex object relationship with Lists and Sets, One-To-Many etc. relationships. When I set the result in a new class I can't determine the right object, e.g. in a list. So we built the same structure for these results and separated those classes into their own package. Maybe it is possible to avoid building the structure a second time, with references back to the "basic classes". It should be sufficient to reference one result class from each basic class. It would only mean a little more navigation to get values from deeper classes. For a similar use case there must be a best practice, no? Interfaces or something. I really dislike the many classes for the result. Is it not possible to clone a class and add class members to it for the result, or to group things logically in an easier way?
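A minimal sketch of the "one result class per basic class" idea from the comment above, with each class in its own file - all names here are hypothetical, the point being that a result class only references the JPA entity instead of duplicating its fields:

import java.math.BigDecimal;
import java.time.LocalDate;

// Existing JPA calculation entity, unchanged (persistence annotations omitted here).
public class CalculationInput {
    private Long id;
    private BigDecimal amount;
    // getters/setters ...
}

// One lightweight result holder per basic class; it references the entity
// instead of repeating all of its fields in a separate working-copy model.
public class CalculationResult {
    private final CalculationInput input;  // navigation back to the basic class
    private LocalDate valueDate;           // the differing LocalDate values
    private BigDecimal result;             // the new result fields

    public CalculationResult(CalculationInput input) {
        this.input = input;
    }
    // getters/setters ...
}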
It could be a solution for somebody:
http://help.eclipse.org/luna/index.jsp?topic=%2Forg.eclipse.jdt.doc.isv%2Freference%2Fapi%2Forg%2Feclipse%2Fjdt%2Fcore%2FIWorkingCopy.html
Here you will work with the Eclipse API and create IWorkingCopies.
For the task described here, though, it is too much.

Best practice: Java/XML serialization: how to determine to what class to deserialize?

I have an application that saves its context to XML. In this application, there is a hierarchy of classes that all implement a common interface and that represent different settings. For instance, one settings class may consist of 4 public float fields, while another may consist of a single HashMap.
I am trying to determine what is the best way to handle writing and reading to XML in a generic way. I read on this site a lot about JAXB and XStream for instance, which are able to make a specific class instance from XML.
However my question is related to the fact that the actual class can be anything that implement a given interface. When you read the XML file, how would you guess the actual class to instantiate from the XML data? How do you do that in your applications?
I thought that I could write the .class name in an XML attribute, read it, and compare it to all possible classes' .class names until I find a match. Is there a more sensible way?
Thanks
XStream should already take care of this and create an object of the correct type.
The tutorial seems to confirm that:
To reconstruct an object, purely from the XML:
Person newJoe = (Person)xstream.fromXML(xml);
If you don't know the type, you will have to first assign it to the common interface type:
CommonInterface newObject = (CommonInterface)xstream.fromXML(xml);
// now you can either check its type or call virtual methods
In my case I just have a kind of header that stores the class name that is serialized and when de-serializing it I just use the header value to figure out to which class shall I de-serialize the values.
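As a sketch of that header idea - the "impl" attribute name and the SettingLoader class are made up; CommonInterface is the shared interface from the question, and any mapper (JAXB here, but XStream would work too) can do the actual field mapping:

import java.io.File;
import javax.xml.bind.JAXB;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class SettingLoader {
    public static CommonInterface load(File xmlFile) throws Exception {
        // Read the class name stored as an attribute on the root element.
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(xmlFile);
        String className = doc.getDocumentElement().getAttribute("impl");

        // Resolve it and let the mapper populate the concrete settings class.
        Class<?> target = Class.forName(className);
        return (CommonInterface) JAXB.unmarshal(xmlFile, target);
    }
}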
A best practice would be to use an established, well-documented XML parser/mapper. All of the serialization/deserialization work has already been done, so you can worry about your business logic instead. Castor and Apache Axiom are two APIs that I have used to marshal/unmarshal (serialize/deserialize) Java classes and XML.
http://www.castor.org
Apache Axiom

Flattening an object graph to a map

I'm trying to flatten an object graph completely to a map.
Complex objects should also be flattened to the top level using "namespaces". So if the object A contains an int i, a string pid and another object B that contains a string id, the resulting Map would look like {i=1, pid="test", B.id="test1"}.
I also want to be able to reconstruct the original object from a given map.
I've searched around for libraries that do this. But I'm not quite getting what I'm looking for. I see stuff that maintains the hierarchy but nothing that completely flattens the structure.
I do see something in Spring Integration that looks like what I want to do:
http://static.springsource.org/spring-integration/api/org/springframework/integration/transformer/ObjectToMapTransformer.html#ObjectToMapTransformer%28%29
But I can't get it to work.
Any help would be appreciated.
Thanks.
The Apache Commons BeanUtils library has a describe() method that does something similar to what I was looking for.
Another possible solution would be via the Jackson JSON library, since JSON objects are essentially key-value pairs.
Related discussions: How to convert a Java object (bean) to key-value pairs (and vice versa)?
Have you considered using Protobufs?
The json-flattener library solves exactly your problem
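Putting the Jackson and json-flattener suggestions together, a minimal sketch could look like this (A and B mirror the example from the question; the flatten/unflatten calls come from the wnameless json-flattener library):

import com.fasterxml.jackson.databind.ObjectMapper;
import com.github.wnameless.json.flattener.JsonFlattener;
import com.github.wnameless.json.unflattener.JsonUnflattener;
import java.util.Map;

public class FlattenDemo {
    public static class B { public String id = "test1"; }
    public static class A {
        public int i = 1;
        public String pid = "test";
        public B b = new B();
    }

    public static void main(String[] args) throws Exception {
        ObjectMapper mapper = new ObjectMapper();

        // Object graph -> JSON -> flat map with dotted "namespace" keys: {i=1, pid=test, b.id=test1}
        String json = mapper.writeValueAsString(new A());
        Map<String, Object> flat = JsonFlattener.flattenAsMap(json);
        System.out.println(flat);

        // Flat map -> nested JSON -> the original object again
        String nested = JsonUnflattener.unflatten(mapper.writeValueAsString(flat));
        A restored = mapper.readValue(nested, A.class);
        System.out.println(restored.b.id);
    }
}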
