I want to use AutoValue with Firebase 9.2.0+. I have the following code:
@AutoValue
public abstract class Office {

    public static Builder builder() {
        return new AutoValue_Office.Builder();
    }

    public abstract double latitud();

    public abstract double longitud();

    @AutoValue.Builder
    public static abstract class Builder {
        public abstract Builder latitud(double latitud);
        public abstract Builder longitud(double longitud);
        public abstract Office build();
    }
}
But when I call Office office = childDataSnapshot.getValue(Office.class); I get this error:
com.google.firebase.database.DatabaseException: No properties to serialize found on class com.example.app.model.Office
Does somebody have an idea why I am getting this error and how to solve it? I read that Firebase no longer uses Jackson for JSON serialization, so I am not sure how to specify something like @JsonProperty("latitud"). I have tried @PropertyName without success.
I also tried renaming the abstract methods, e.g. public abstract double getLatitud();, and after that the error is:
java.lang.InstantiationException: Can't instantiate abstract class com.example.app.model.Office
So I am not sure how to solve this.
SOLUTION
Thanks to hatboysam and Frank van Puffelen I was finally able to solve this problem with the following approach.
I created a FirebaseUtil enum with two methods to serialize and deserialize objects for Firebase, based on hatboysam's answer and Frank van Puffelen's comment.
I created a pair of User and Phone classes for testing.
Dependencies:
compile 'com.fasterxml.jackson.core:jackson-annotations:2.8.0'
compile 'com.fasterxml.jackson.core:jackson-databind:2.8.0'
Usage example:
User user = FirebaseUtil.deserialize(dataSnapshot, User.class);
Map<String, Object> map = FirebaseUtil.serialize(user);
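For reference, a minimal sketch of what such a FirebaseUtil could look like, inferred from the usage example above. The exact class wasn't posted, so the method bodies and internals here are my assumptions, based on the Map-based workaround described in the accepted answer below:
import java.util.Map;

import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.google.firebase.database.DataSnapshot;
import com.google.firebase.database.GenericTypeIndicator;

public enum FirebaseUtil {
    ; // no constants: the enum is only used as a non-instantiable holder for the two static methods

    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Firebase reads the snapshot into a Map, then Jackson binds the Map to the target class
    public static <T> T deserialize(DataSnapshot snapshot, Class<T> clazz) {
        GenericTypeIndicator<Map<String, Object>> indicator =
                new GenericTypeIndicator<Map<String, Object>>() {};
        Map<String, Object> map = snapshot.getValue(indicator);
        return MAPPER.convertValue(map, clazz);
    }

    // Jackson turns the object into a Map that Firebase can store via setValue()
    public static Map<String, Object> serialize(Object value) {
        return MAPPER.convertValue(value, new TypeReference<Map<String, Object>>() {});
    }
}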
I'm not sure this is possible with the default Firebase data mapper, but there is a possible workaround. First let's explain the errors you're seeing:
com.google.firebase.database.DatabaseException: No properties to serialize found on class com.example.app.model.Office
The Firebase mapper looks for either public fields or getter/setter methods following the getFoo()/setFoo() naming pattern. Your class exposes neither, so the mapper does not find any properties to serialize.
java.lang.InstantiationException: Can't instantiate abstract class com.example.app.model.Office
This is the one I think you'll have trouble getting around. In order for deserialization to work, your class needs a public, no-argument constructor that the mapper can call via reflection (newInstance()). As far as I know, that is not how AutoValue works.
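For comparison, a class the stock mapper can handle would look roughly like this (a hypothetical, non-AutoValue version of your Office, not part of the question):
// Plain POJO the Firebase mapper can work with: a public no-argument constructor
// for reflective instantiation, plus getters/setters named after the properties.
public class OfficePojo {

    private double latitud;
    private double longitud;

    public OfficePojo() {
        // required by the mapper
    }

    public double getLatitud() { return latitud; }
    public void setLatitud(double latitud) { this.latitud = latitud; }

    public double getLongitud() { return longitud; }
    public void setLongitud(double longitud) { this.longitud = longitud; }
}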
But don't lose hope! According to this GitHub issue there is a way to make Jackson and AutoValue compatible using the @JsonCreator annotation. So you'll need to use both Jackson and Firebase to get the job done here.
Serializing:
// Convert to a Map<String,Object> using Jackson and then pass that to Firebase
ObjectMapper mapper = new ObjectMapper();
Map<String, Object> map = mapper.convertValue(office, Map.class);
databaseReference.setValue(map);
Deserializing:
// Use Firebase to convert to a Map<String,Object>
GenericTypeIndicator<Map<String,Object>> t = new GenericTypeIndicator<Map<String,Object>>() {};
Map<String,Object> map = dataSnap.getValue(t);
// Use Jackson to convert from a Map to an Office object
ObjectMapper mapper = new ObjectMapper();
Office pojo = mapper.convertValue(map, Office.class);
I wrote an AutoValue extension for this:
https://github.com/mattlogan/auto-value-firebase
The extension generates a Firebase-compatible class, called FirebaseValue, as a static inner class of your generated AutoValue class. You can convert between your AutoValue class and its FirebaseValue class via the generated constructors.
Here's an example, copied from the readme, of what that looks like:
@AutoValue @FirebaseValue
public abstract class Taco {

    public static Taco create(String name, List<Ingredient> ingredients, Review review) {
        return new AutoValue_Taco(name, ingredients, review);
    }

    // Create AutoValue_Taco instance from AutoValue_Taco.FirebaseValue instance
    public static Taco create(DataSnapshot dataSnapshot) {
        AutoValue_Taco.FirebaseValue taco = dataSnapshot.getValue(AutoValue_Taco.FirebaseValue.class);
        return new AutoValue_Taco(taco);
    }

    // Create AutoValue_Taco.FirebaseValue instance from AutoValue_Taco instance
    public Object toFirebaseValue() {
        return new AutoValue_Taco.FirebaseValue(this);
    }

    public abstract String name();

    public abstract List<Ingredient> ingredients();

    public abstract Review review();
}
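Reading and writing could then look roughly like this (an untested sketch; tacosRef and "someTacoKey" are placeholder names for a DatabaseReference and a child key):
// Writing: convert to the generated FirebaseValue before calling setValue()
tacosRef.push().setValue(taco.toFirebaseValue());

// Reading: convert the DataSnapshot back into the AutoValue type
tacosRef.child("someTacoKey").addListenerForSingleValueEvent(new ValueEventListener() {
    @Override
    public void onDataChange(DataSnapshot dataSnapshot) {
        Taco taco = Taco.create(dataSnapshot);
    }

    @Override
    public void onCancelled(DatabaseError databaseError) {
        // handle the error
    }
});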
Related
I have some enum types that look like this:
public static enum Thingie {
    ABC("abc"), DEF("def");

    private String messageValue;

    @JsonValue
    public String getMessageValue() { return messageValue; }

    private Thingie(String messageValue) { this.messageValue = messageValue; }
}
This will allow Jackson to properly marshal and unmarshal between string values and the enum type.
There may be times when I'd like to directly convert a string value to the enum value. This would be like the internal "fromValue()" method, but not quite the same:
public static Thingie messageValueOf(String messageValue) {
    ObjectMapper mapper = new ObjectMapper();
    return mapper.convertValue(messageValue, Thingie.class);
}
I would like to convert this into a generic method AND put it into a base class, along with the "messageValue" property and accessor. The constructor would change to just call "super(messageValue)". Obviously, if I could do that, I would move the "mapper" to class level.
At this point, I've only attempted to write this as a generic method in the single enum type, and I can't even get that working: I can't figure out how to extract the class from the type parameter. I've seen this particular question asked before, and there have been some answers, but I couldn't quite get them to work, and I imagine trying to do this in a base class would add additional complexity.
Let's assume I understood your problem (correct me if I am wrong).
The constructor would change to just call "super(messageValue)"
An enum cannot extend a class, so you can't do that. But you can create an interface/class to which you can delegate such queries (very simplistic code):
interface Test {

    ObjectMapper MAPPER = new ObjectMapper();

    static <T extends Enum<T>> T getIt(String s, Class<T> clazz) {
        return MAPPER.convertValue(s, clazz);
    }
}

public static void main(String[] args) {
    Thingie abc = Test.getIt("abc", Thingie.class);
    System.out.println(abc.ordinal());
}
I have an implementation of a MapStruct Mapper as follows:
@Mapper
public interface MyMapper extends Serializable {

    MyMapper INSTANCE = Mappers.getMapper(MyMapper.class);

    //@Mapping(target = "status", source = "p1.status")
    MergedPojosClass from(Pojo1 p1, Pojo2 p2);
}
In the target class I have a field status, but this field is available in both POJO classes.
For my POJOs I use Lombok to generate getters, setters and all kinds of constructors.
Without the commented line I receive the following error:
Error:(20, 14) java: Several possible source properties for target property "status".
Can I avoid the boilerplate above (the explicit mapping) by adding some annotation saying that Pojo1 has higher priority?
I was looking into the Javadoc and the source code of MapStruct, but without finding any example or clue that could help in my case. I was trying to find something with InheritanceStrategy, but that looks more like an internal concept of MapStruct.
You could try to define a @MapperConfig. I'm not sure if it works, though.
So like this:
@MapperConfig(mappingInheritanceStrategy = MappingInheritanceStrategy.AUTO_INHERIT_ALL_FROM_CONFIG)
public interface MyConfig {

    @Mapping(target = "status", source = "p1.status")
    MergedPojosClass from(Pojo1 p1);
}

@Mapper(config = MyConfig.class)
public interface MyMapper extends Serializable {

    MyMapper INSTANCE = Mappers.getMapper(MyMapper.class);

    // here's the doubt: I'm not sure the config is applied to a two-argument mapping method
    MergedPojosClass from(Pojo1 p1, Pojo2 p2);
}
If you want to merge several objects of the same type into one, you can use @MappingTarget. However, this approach modifies the target parameter. If you want to produce a new object, you'd need something like this:
@Mapper(nullValuePropertyMappingStrategy = NullValuePropertyMappingStrategy.IGNORE)
public interface PojoMerger {

    void copyNonNullProperties(@MappingTarget Pojo target, Pojo source);

    default Pojo merge(Pojo... sources) {
        Pojo merged = new Pojo();
        for (Pojo source : sources) {
            copyNonNullProperties(merged, source);
        }
        return merged;
    }
}
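Usage would then be roughly as follows (assuming the default component model, where the generated implementation is obtained via Mappers.getMapper):
PojoMerger merger = Mappers.getMapper(PojoMerger.class);
// Non-null properties from later sources overwrite earlier ones; null properties are ignored
Pojo merged = merger.merge(firstPojo, secondPojo);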
In one of our projects we have a Java webapp talking to a MongoDB instance. In the database we use DBRefs to keep track of some object relations. We (de)serialize to and from POJOs using Jackson (via mongodb-jackson-mapper).
However, we then use the same POJOs to (de)serialize to the outside world, where our front end deals with presenting the JSON.
Now, we need a way for the serialization for the outside world to contain the referenced object from a DBRef (so that the UI can present the full object), while we obviously want to have the DBRef written to the database, and not the whole object.
Right now I wrote some untested static nested class code:
public static class FooReference {

    public DBRef<Foo> foo;

    // FIXME how to ensure that this doesn't go into the database?
    public Foo getFoo() {
        return foo.fetch();
    }
}
Ideally I would like a way to annotate this so that I could (de)serialize it either with or without the getFoo() result, probably depending on some configuration object. Is this possible? Do you see a better way of going about doing this?
From looking at options, it seems you can annotate properties to only be shown if a given View is passed to the ObjectMapper used for serialization. You could thus edit the class:
public static class FooReference {

    public DBRef<Foo> foo;

    @JsonView(Views.WebView.class)
    public Foo getFoo() {
        return foo.fetch();
    }
}
and provide:
class Views {
    static class WebView { }
}
and then serialize after creating a configuration with the correct view:
SerializationConfig conf = objectMapper.getSerializationConfig().withView(Views.WebView.class);
objectMapper.setSerializationConfig(conf);
Which would then serialize it. Not specifying the view when serializing with the MongoDB wrapper would mean the method would be ignored. Properties without a JsonView annotation are serialized by default, a behaviour you can change by specifying:
objectMapper.configure(SerializationConfig.Feature.DEFAULT_VIEW_INCLUSION, false);
More info is available on the Jackson Wiki.
There are other alternatives, too, it turns out: Jackson MixIns let you override the (de)serialization behaviour of parts of a class without modifying the class itself, and as of Jackson 2.0 (a very recent release) there are filters as well.
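For example, a mix-in registered only on the database-side mapper could hide getFoo() from persistence without touching FooReference itself. This is only a sketch: FooReferenceDbMixIn and dbObjectMapper are names I made up, and the registration call differs between Jackson versions.
// Mix-in applied only to the ObjectMapper used for MongoDB persistence
abstract class FooReferenceDbMixIn {

    @JsonIgnore   // keep the fetched Foo out of the stored document
    abstract Foo getFoo();
}

// Registration on the database-side mapper
// (Jackson 1.x: config.addMixInAnnotations(...); Jackson 2.x: mapper.addMixIn(...))
dbObjectMapper.getSerializationConfig()
        .addMixInAnnotations(FooReference.class, FooReferenceDbMixIn.class);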
Use a custom JsonSerializer and apply your logic in its serialize method:
public static class FooReference {

    public DBRef<Foo> foo;

    @JsonSerialize(using = CustomSerializer.class)
    public Foo getFoo() {
        return foo.fetch();
    }
}

public class CustomSerializer extends JsonSerializer<Object> {

    @Override
    public void serialize(Object value, JsonGenerator jgen, SerializerProvider provider)
            throws IOException, JsonProcessingException {
        // jgen.writeObjectField ...
    }
}
I'm calling a REST service that returns a JSON object. I'm trying to deserialize the responses to my Java beans using Jackson data binding.
The example Json is something like this:
{
  "detail1": { "property1": "value1", "property2": "value2" },
  "detail2": { "property1": "value1", "property2": "value2" },
  "otherObject": { "prop3": "value1", "prop4": ["val1", "val2", "val3"] }
}
Essentially, detail1 and detail2 are of the same structure, and thus can be represented by a single class type, whereas OtherObject is of another type.
Currently, I've set up my classes as follows (this is the structure I would prefer):
class ServiceResponse {
    private Map<String, Detail> detailMap;
    private OtherObject otherObject;
    // getters and setters
}

class Detail {
    private String property1;
    private String property2;
    // getters and setters
}

class OtherObject {
    private String prop3;
    private List<String> prop4;
    // getters and setters
}
Then, just do:
String response = <call service and get json response>
ObjectMapper mapper = new ObjectMapper();
mapper.readValue(response, ServiceResponse.class)
The problem is that I'm getting lost reading through the documentation about how to configure the mappings and annotations correctly to get the structure I want. I'd like detail1 and detail2 to be deserialized into Detail instances, and otherObject into an OtherObject instance.
However, I also want the Detail objects to be stored in a map so that they can be easily distinguished and retrieved, and also because the service will in the future return detail3, detail4, etc. (i.e. the map in ServiceResponse would look like {detail1: Detail object, detail2: Detail object, ...}).
How should these classes be annotated? Or, perhaps there's a better way to structure my classes to fit this JSON model? Appreciate any help.
Simply use @JsonAnySetter on a two-argument method in ServiceResponse, like so:
@JsonAnySetter
public void anySet(String key, Detail value) {
    detailMap.put(key, value);
}
Mind you that you can only have one "property" with @JsonAnySetter, as it's a fallback for unknown properties. Note that the Javadoc of JsonAnySetter is incorrect, as it states that it should be applied to one-argument methods; you can always open a minor bug in Jackson ;)
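Put together, ServiceResponse could look roughly like this (a sketch; the anySet method name, the map initialization and the getters are my additions):
import java.util.HashMap;
import java.util.Map;

import com.fasterxml.jackson.annotation.JsonAnySetter;

class ServiceResponse {

    private final Map<String, Detail> detailMap = new HashMap<>();
    private OtherObject otherObject;

    // Known property, bound by name as usual
    public OtherObject getOtherObject() { return otherObject; }
    public void setOtherObject(OtherObject otherObject) { this.otherObject = otherObject; }

    // Every property without a matching setter (detail1, detail2, detail3, ...) lands here,
    // with its value bound to Detail
    @JsonAnySetter
    public void anySet(String key, Detail value) {
        detailMap.put(key, value);
    }

    public Map<String, Detail> getDetailMap() { return detailMap; }
}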
I would like to know whether it is possible to have SnakeYAML load a YAML document into a JavaBean and, when it is unable to match an entry in the document to a bean property, place that entry into a generic map within the bean.
Ex.
public class Person {
    private String firstName;
    private String lastName;
    private Map<String, Object> anythingElse;
    // Getters and setters...
}
If I load a document that looks like:
firstName: joe
lastName: smith
age: 30
Since age is not a property in the bean I would like {age, 30} to be added to the anythingElse map.
Possible?
Thanks.
No, it isn't possible.
From my experience and attempts it doesn't work: if you want to load a file into an object, every attribute in that object's class has to have a getter and setter (i.e. the class has to be a JavaBean, see Wikipedia).
I used your Person class (see the wiki page for a proper JavaBean class) and this code: http://codepaste.net/dbtzqb
My error message was: "Line 3, column 4: Unable to find property 'age' on class: Person", which shows that this simple program cannot handle "unexpected" attributes. That is my quick conclusion; I haven't tried extensively, so it may be possible, but I don't know of a way (you would have to bypass the reading methods and the JavaBean contract). I've used YamlBeans (https://code.google.com/p/yamlbeans/), which is a little different, but I find it easier and it works for me ;]
Hope this helps!
Edit
Sorry for bumping this; better late than never! I didn't see the post date until after I wrote my answer, but hopefully it will help others seeking help as well :3
I haven't tried the following (semi-kludgy hack) using SnakeYaml, but I have it working using YamlBeans:
Basically the idea is to define a class that extends one of the concrete implementations of java.util.Map. Then define getters that pick out distinct values and a general getter that returns everything else:
public class Person extends HashMap<String, Object>
{
    public String getFirstName()
    {
        return (String) this.get("firstName");
    }

    public String getLastName()
    {
        return (String) this.get("lastName");
    }

    public Map<String, Object> getExtensions()
    {
        Map<String, Object> retVal = (Map<String, Object>) this.clone();
        retVal.remove("firstName");
        retVal.remove("lastName");
        return retVal;
    }
}
I'm not sure how either SnakeYaml or YamlBeans prioritizes the different type information it sees when introspecting this class, but YamlBeans (at least) is content to deserialize into it as if it were any other Map and doesn't seem to get confused by the additional getters (i.e. it doesn't trip up on getExtensions).
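For completeness, reading the document from the question with YamlBeans would look roughly like this (an untested sketch; "person.yml" is a placeholder path):
import java.io.FileReader;

import com.esotericsoftware.yamlbeans.YamlReader;

public class PersonLoader {

    public static void main(String[] args) throws Exception {
        YamlReader reader = new YamlReader(new FileReader("person.yml"));
        Person person = reader.read(Person.class);
        // firstName/lastName come from the typed getters; anything else (e.g. age)
        // stays in the underlying map and is returned by getExtensions()
        System.out.println(person.getFirstName() + " " + person.getExtensions());
    }
}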
It is possible:
import java.io.InputStream;

import org.yaml.snakeyaml.Yaml;
import org.yaml.snakeyaml.representer.Representer;

public class YamlReader {

    public static <T> T readYaml(InputStream is, Class<T> clazz) {
        Representer representer = new Representer();
        // Skip entries in the YAML that have no matching property on the target class
        representer.getPropertyUtils().setSkipMissingProperties(true);
        Yaml yaml = new Yaml(representer);
        T data = yaml.loadAs(is, clazz);
        return data;
    }
}
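Note that with setSkipMissingProperties(true) the unmatched entries (such as age in the question's document) are silently dropped rather than collected into anythingElse. A quick usage sketch (the demo class and inlined YAML string are mine):
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class YamlReaderDemo {

    public static void main(String[] args) {
        String doc = "firstName: joe\nlastName: smith\nage: 30\n";
        InputStream is = new ByteArrayInputStream(doc.getBytes(StandardCharsets.UTF_8));
        // 'age' has no matching property on Person, so it is skipped instead of failing
        Person person = YamlReader.readYaml(is, Person.class);
        System.out.println(person.getFirstName() + " " + person.getLastName());
    }
}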