SimpleXML framework - embedded collections - Java

I'm trying to serialize an embedded collection using Simple.
For example:
Map<String, List<MyClass>>
I have already added the necessary annotations to MyClass. I tried @ElementMap, but it doesn't work:
Exception in thread "main" org.simpleframework.xml.transform.TransformException: Transform of class java.util.ArrayList not supported
If it's just
@ElementMap Map<String, MyClass>
it works fine. I don't know how to deal with the embedded collection. I know about the @ElementList annotation but don't know how to use it in this case. Any hints?

I'm coming across the same issue. The only way I have managed to get it working has been a really cheesy hack - wrapping the List in another class.
public class MyWrapper {

    @ElementList(name = "data")
    private List<MyClass> data = new ArrayList<MyClass>();

    public MyWrapper(List<MyClass> data) {
        this.data = data;
    }

    public List<MyClass> getData() {
        return this.data;
    }

    public void setData(List<MyClass> data) {
        this.data = data;
    }
}
And then, instead of
@ElementMap Map<String, List<MyClass>>
...you'd have:
@ElementMap Map<String, MyWrapper>
In my case, the Map is entirely private to my class (i.e. other classes never get to talk directly to the Map), so the fact that I have this extra layer in here doesn't make much of a difference. The XML that is produced is, of course, gross, but again, in my case it's bearable because nothing outside of my class is consuming it. I wish I had a better solution than this, but at the moment I'm stumped.
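For reference, here is a minimal sketch of how the wrapped map could be declared and serialized with Simple. The Container class name, the entries field, and the use of Persister are assumptions for illustration; MyWrapper and MyClass are the classes from above.
import java.util.HashMap;
import java.util.Map;

import org.simpleframework.xml.ElementMap;
import org.simpleframework.xml.Root;
import org.simpleframework.xml.core.Persister;

@Root(name = "container")
public class Container {

    // each value is a MyWrapper, which in turn carries the List<MyClass>
    @ElementMap(name = "entries")
    private Map<String, MyWrapper> entries = new HashMap<String, MyWrapper>();

    public void put(String key, MyWrapper wrapper) {
        entries.put(key, wrapper);
    }

    public static void main(String[] args) throws Exception {
        Container container = new Container();
        // container.put("first", new MyWrapper(someListOfMyClass));
        new Persister().write(container, System.out);
    }
}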

Java bean mapper expected capture but is provided object

Please note: even though I mention Dozer in this question, I believe it's really just a pure Java generics question at heart. There may be a Dozer-specific solution out there, but I think anyone with a strong working knowledge of Java (11) generics/captures/erasure should be able to help me out!
Java 11 and Dozer here. Dozer is great for applying default bean mapping rules to field names, but any time you have specialized, custom mapping logic you need to implement a Dozer CustomConverter and register it. That would be great, except the Dozer API for CustomConverter isn't genericized, is monolithic, and leads to nasty code like this:
public class MyMonolithicConverter implements CustomConverter {

    @Override
    public Object convert(Object destination, Object source, Class<?> destinationClass, Class<?> sourceClass) {
        if (sourceClass.isAssignableFrom(Widget.class)) {
            Widget widget = (Widget) source;
            if (destinationClass.isAssignableFrom(Fizz.class)) {
                Fizz fizz = (Fizz) destination;
                // write code for mapping widget -> fizz here
            } else if (destinationClass.isAssignableFrom(Foo.class)) {
                // write code for mapping widget -> foo here
            }
            // ... etc.
        } else if (sourceClass.isAssignableFrom(Baz.class)) {
            // write all the if-else-ifs and mappings for baz -> ??? here
        }
        return destination;
    }
}
So again: monolithic, not genericized and leads to large, complex nested if-else-if blocks. Eek.
I'm trying to make this a wee bit more palatable:
public abstract class BeanMapper<SOURCE, TARGET> {

    // assumed to be initialized somehow (e.g. via a protected constructor), otherwise matches() has nothing to compare
    private Class<SOURCE> sourceClass;
    private Class<TARGET> targetClass;

    public abstract TARGET map(SOURCE source);

    public boolean matches(Class<?> otherSourceClass, Class<?> otherTargetClass) {
        return sourceClass.equals(otherSourceClass) && targetClass.equals(otherTargetClass);
    }
}
Then, an example of it in action:
public class SignUpRequestToAccountMapper extends BeanMapper<SignUpRequest, Account> {

    private PlaintextEncrypter encrypter;

    public SignUpRequestToAccountMapper(PlaintextEncrypter encrypter) {
        this.encrypter = encrypter;
    }

    @Override
    public Account map(SignUpRequest signUpRequest) {
        return Account.builder()
            .username(signUpRequest.getRequestedName())
            .email(signUpRequest.getEmailAddr())
            .givenName(signUpRequest.getFirstName())
            .surname(signUpRequest.getLastName())
            .dob(DateUtils.toDate(signUpRequest.getBirthDate()))
            .passwordEnc(encrypter.saltPepperAndEncrypt(signUpRequest.getPasswordPlaintext()))
            .build();
    }
}
And now a way to invoke the correct source -> target mapper from inside my Dozer converter:
public class DozerConverter implements CustomConverter {

    private Set<BeanMapper> beanMappers;

    @Override
    public Object convert(Object destination, Object source, Class<?> destinationClass, Class<?> sourceClass) {
        BeanMapper<?, ?> mapper = beanMappers.stream()
            .filter(beanMapper -> beanMapper.matches(sourceClass, destinationClass))
            .findFirst()
            .orElseThrow();
        // compiler error here:
        return mapper.map(source);
    }
}
I really like this design/API approach; however, I get a compiler error on that mapper.map(source) line at the very end:
"Required type: capture of ?; Provided: Object"
What can I do to fix this compiler error? I'm not married to this API/approach, but I do like the simplicity it adds over the MyMonolithicConverter example above, which is the approach Dozer sort of forces on you. It is important to note that I am using Dozer elsewhere for simple bean mappings, so I would prefer to use a CustomConverter impl and leverage Dozer for this instead of bringing in a whole other dependency/library for these custom/complex mappings. If Dozer offers a different solution I might be happy with that as well. Otherwise I just need to fix this capture issue. Thanks for any help here!
The issue seems to come from the beanMappers. You have a set of mappers of various types, and the compiler cannot infer what types the found mapper will have.
You can make the compiler believe you by casting the result and suppressing the warning it gives you.
Casting to a <?,?> isn't going to happen, so I've added type parameters to the convert method. At least it can then be assumed that when you get a BeanMapper<S,T>, map will indeed return a T for an S source.
class DozerConverter {

    private Set<BeanMapper<Object, Object>> beanMappers;

    public <S, T> T convert(S source,
                            Class<?> destinationClass,
                            Class<?> sourceClass) {
        @SuppressWarnings("unchecked")
        BeanMapper<S, T> mapper = (BeanMapper<S, T>) beanMappers.stream()
            .filter(beanMapper -> beanMapper.matches(sourceClass, destinationClass))
            .findFirst()
            .orElseThrow();
        return mapper.map(source);
    }
}
I'm afraid you're going to have to call it like so:
TARGET-TYPE target = dozerConverter.<SOURCE-TYPE,TARGET-TYPE>convert(...);
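For illustration, with the SignUpRequest/Account mapper from the question the call could look like the sketch below. It assumes the mapper set contains a SignUpRequestToAccountMapper and that DozerConverter gets the set handed in through a constructor; neither detail appears in the original answer.
Set<BeanMapper<Object, Object>> mappers = new HashSet<>();
// mappers.add(...); // register SignUpRequestToAccountMapper and friends here

DozerConverter dozerConverter = new DozerConverter(mappers); // assumed constructor

// the explicit type witness tells the compiler what S and T are
Account account = dozerConverter.<SignUpRequest, Account>convert(
        signUpRequest, Account.class, SignUpRequest.class);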

How to implement a Gson equivalent of @JsonUnwrap

I know Gson doesn't come with a similar feature, but is there a way to add support for unwrapping JSON fields the way @JsonUnwrap does?
The goal is to allow a structure like:
public class Person {
    public int age;
    public Name name;
}
public class Name {
    public String first;
    public String last;
}
to be (de)serialized as:
{
  "age" : 18,
  "first" : "Joey",
  "last" : "Sixpack"
}
instead of:
{
  "age" : 18,
  "name" : {
    "first" : "Joey",
    "last" : "Sixpack"
  }
}
I understand it could get fairly complex, so I'm not looking for a full solution, just some high-level guidelines if this is even doable.
I've made a crude implementation of a deserializer that supports this. It is fully generic (type-independent), but also expensive and fragile, and I will not be using it for anything serious. I am posting it only to show others what I've got, in case they end up needing to do something similar.
public class UnwrappingDeserializer implements JsonDeserializer<Object> {

    // This Gson needs to be identical to the global one, sans this deserializer, to prevent infinite recursion
    private Gson delegate;

    public UnwrappingDeserializer(Gson delegate) {
        this.delegate = delegate;
    }

    @Override
    public Object deserialize(JsonElement json, Type type, JsonDeserializationContext context) throws JsonParseException {
        Object def = delegate.fromJson(json, type); // Gson doesn't care about unknown fields
        Class<?> raw = GenericTypeReflector.erase(type);
        Set<Field> unwrappedFields = ClassUtils.getAnnotatedFields(raw, GsonUnwrap.class);
        for (Field field : unwrappedFields) {
            AnnotatedType fieldType = GenericTypeReflector.getExactFieldType(field, type);
            field.setAccessible(true);
            try {
                Object fieldValue = deserialize(json, fieldType.getType(), context);
                field.set(def, fieldValue);
            } catch (IllegalAccessException e) {
                throw new RuntimeException(e);
            }
        }
        return def;
    }
}
It can then be registered globally via new GsonBuilder().registerTypeHierarchyAdapter(Object.class, new UnwrappingDeserializer(new Gson())).create() or for a specific type via registerTypeAdapter.
Notes:
A real implementation should recursively check the entire class structure for the presence of GsonUnwrap, cache the result in a concurrent map, and only go through this procedure if it needs to. Otherwise it should just return def immediately
It should also cache discovered annotated fields to avoid scanning the hierarchy each time
GenericTypeReflector is coming from GeAnTyRef
ClassUtils#getAnnotatedFields is my own implementation, but it doesn't do anything special - it just gathers declared fields (via Class#getDeclaredFields) recursively for the class hierarchy
GsonUnwrap is just a simple custom annotation (a minimal version is sketched below)
I presume a similar thing can be done for serialization as well. Examples linked from Derlin's answer can be a starting point.
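For completeness, a minimal version of such a marker annotation could look like the sketch below; the retention and target choices are assumptions rather than something stated above.
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// marker annotation read reflectively by UnwrappingDeserializer
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
public @interface GsonUnwrap {
}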
Currently, there is no easy way to do that. Here are some pointers/alternative ways to make it work anyway.
GsonFire: GsonFire implements some useful features missing from Gson. While it does not yet offer automatic wrapping/unwrapping, it may be a good starting point for creating your custom logic.
If you only need serialization, you can add getters for first and last in Person and use @ExposeMethodResult to serialize them. Unfortunately, setters are not supported (cf. Is possible to use setters when Gson deserializes a JSON?).
Another way to support serialization is to follow the advice from How to move fields to parent object.
Custom TypeAdapters: one of the only ways to support both serialization and deserialization is to create custom TypeAdapters. This won't be generic, but it will suit your use case.
The thread Serialize Nested Object as Attributes already gives you examples, so I won't repeat them here.

unmarshal synchronized map

I'm using JAXB to save objects to XML files.
@XmlRootElement(name="jaxbobjt")
@XmlAccessorType(XmlAccessType.FIELD)
public class SomeJAXBObject
{
    @XmlElementWrapper(name="myEntry")
    private Map<Integer, AnotherJAXBObject> map = Collections.synchronizedMap(new LinkedHashMap<Integer, AnotherJAXBObject>());
}
Note the fact that I'm using a synchronizedMap(...) wrapper.
The above results in the following xml:
<jaxbobjt>
  <map>
    <myEntry>
      <key>key</key>
      <value>value</value>
    </myEntry>
  </map>
</jaxbobjt>
Actually, I thought that I would need an XmlAdapter to get this working.
But to my surprise this marshals and unmarshals fine. Tests revealed that it correctly uses a java.util.Collections$SynchronizedMap containing LinkedHashMap$Entry objects.
So, if I understand correctly, JAXB's unmarshaller just instantiates my object using the constructor. Since there's already an instance for the map after instantiation of the object, it does not instantiate the map itself. It uses putAll, I assume?
I'm just trying to get a deeper understanding of what is going on. It would be nice if somebody could give me some more background information about this. Are my assumptions correct?
If I am correct, I assume the following implementation would have failed:
@XmlRootElement(name="jaxbobjt")
@XmlAccessorType(XmlAccessType.FIELD)
public class SomeJAXBObject
{
    // no instance yet.
    @XmlElementWrapper(name="myEntry")
    private Map<Integer, AnotherJAXBObject> map = null;

    public synchronized void addObject(Integer i, AnotherJAXBObject obj)
    {
        // instantiates map on-the-fly.
        if (map == null) map = Collections.synchronizedMap(new LinkedHashMap<Integer, AnotherJAXBObject>());
        map.put(i, obj);
    }
}
The strategy used by JAXB is to create container classes only when it is necessary. For anything that is bound to a List, JAXB's xjc creates
protected List<Foo> foos;

public List<Foo> getFoos() {
    if (foos == null) foos = new ArrayList<>();
    return foos;
}
and thus, unmarshalling another Foo to be added to this list essentially does
parent.getFoos().add( foo );
As for maps: presumably the working version of your class SomeJAXBObject contains a getMap method, and that'll work the same way. Setters for lists and maps aren't necessary, and they'll not be used if present. A put method in the parent class isn't expected either; if present it'll not be used because JAXB wouldn't have a way of knowing what it does.
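As a sketch of the lazy-getter pattern described here, applied to the class from the question: this uses property access so that JAXB goes through the getter, and it is an assumption about what the "working version" might look like rather than code from the answer.
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

import javax.xml.bind.annotation.XmlAccessType;
import javax.xml.bind.annotation.XmlAccessorType;
import javax.xml.bind.annotation.XmlElementWrapper;
import javax.xml.bind.annotation.XmlRootElement;

@XmlRootElement(name = "jaxbobjt")
@XmlAccessorType(XmlAccessType.PROPERTY)
public class SomeJAXBObject
{
    private Map<Integer, AnotherJAXBObject> map;

    // JAXB fetches the map through this getter and adds unmarshalled entries to it,
    // so the synchronized wrapper created here is preserved.
    @XmlElementWrapper(name = "myEntry")
    public Map<Integer, AnotherJAXBObject> getMap()
    {
        if (map == null) map = Collections.synchronizedMap(new LinkedHashMap<Integer, AnotherJAXBObject>());
        return map;
    }
}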

Jackson Ser/Deser: Proxying an object to/from an id w/ a different key

Apologies in advance. This seems like a simple task, but hours later on Google and with guess/check, I still can't figure it out.
I'm writing a Java convenience wrapper library for an API my company provides. One of the classes looks something like this:
class View extends Model<View>
{
    List<Column> columns;
    Column primaryColumn;
}
However, our API actually wants a primaryColumnId integer, not an actual Column object. I want to maintain the strongly-typed getPrimaryColumn() and setPrimaryColumn(Column) in the library to reduce developer error, but I'm having significant difficulty writing some sort of translation between the getter/setter that we need to ser/deser to/from JSON.
I'm using the standard Bean serialization strategy. I'd like to avoid the wholly-custom approach because in reality View has dozens of fields. Here's what I've figured out so far.
I think (haven't tested yet) that I can handle the serialization case simply by creating a custom JsonSerializer that looks something like:
public static class ColumnIdSerializer extends JsonSerializer<Column>
{
    @Override
    public void serialize(Column column, JsonGenerator jsonGenerator,
            SerializerProvider serializerProvider) throws IOException {
        jsonGenerator.writeFieldName("primaryColumnId");
        jsonGenerator.writeNumber(column.id);
    }
}
And then assigning the annotation to the appropriate place:
@JsonSerialize(using = Column.ColumnIdSerializer.class)
public Column getPrimaryColumn() { /* ... */ }
This allows me to serialize the id rather than the whole class, and rename the key from primaryColumn to primaryColumnId.
Now, we get to deserialization. Here I run into three problems.
The first is that in order to successfully deserialize the column from the id, we have to first have the list of columns. This is solvable using @JsonPropertyOrder on the class. Great, that's done.
The second is that I need to tell Jackson to look under primaryColumnId rather than primaryColumn for the value. I don't know how to do this; the JsonDeserializer appears to kick in after the key has already been found, so it's too late to modify it. JsonSchema looks like it might be relevant but I can't find any documentation or internet chatter on how to use it.
The third is that from the custom JsonDeserializer class I'll have to be able to reference the View that's being deserialized in order to ask it for a Column in return for my id int. There doesn't appear to be a way to do that.
Should I just cave and add a public getPrimaryColumnId() and setPrimaryColumnId(Integer), or is there a way to overcome these obstacles?
So I'd propose something like this:
class CustomView
{
    private final View parent;

    public CustomView(View view) {
        parent = view;
    }

    // Jackson needs a no-arg constructor
    public CustomView() {
        parent = new View();
    }

    // ...
    public List<Column> getColumns() { ... }
    public void setColumns(List<Column> columns) { ... }

    public int getPrimaryColumn() {
        return parent.getPrimaryColumn().getColumnId();
    }

    public void setPrimaryColumn(int column) {
        parent.getPrimaryColumn().setColumnId(column);
    }

    // ...
    // don't use `get` in the method name here to avoid serialization
    public View rawView() {
        return parent;
    }
}
If needed this can be written to extend View, but be careful to mask methods where appropriate.
Turns out that since Jackson does nasty reflection, it can see through private methods. So, the trick ended up simply being along the lines of:
private void setPrimaryColumnId(Integer id) { ... }
private Integer getPrimaryColumnId() { ... }

public void setPrimaryColumn(Column column) { ... }

@JsonIgnore
public Column getPrimaryColumn() { ... }
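A fuller sketch of how that can look on the View class from the question; the @JsonProperty naming and the id lookup in the setter are illustrative assumptions, not part of the original answer.
import java.util.List;

import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.annotation.JsonProperty;

class View extends Model<View>
{
    List<Column> columns;
    Column primaryColumn;

    // private accessors translate between the Column object and its id
    @JsonProperty("primaryColumnId")
    private Integer getPrimaryColumnId() {
        return primaryColumn == null ? null : primaryColumn.id;
    }

    @JsonProperty("primaryColumnId")
    private void setPrimaryColumnId(Integer id) {
        // relies on columns having been deserialized first (see @JsonPropertyOrder above)
        this.primaryColumn = columns.stream()
                .filter(c -> id != null && id.equals(c.id))
                .findFirst()
                .orElse(null);
    }

    @JsonIgnore
    public Column getPrimaryColumn() {
        return primaryColumn;
    }

    public void setPrimaryColumn(Column column) {
        this.primaryColumn = column;
    }
}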

Java: Trouble with Generics & Collection type detection

I have a class called DataSet with various constructors, each specifying a different type of variable. It might look a bit like this:
public class DataSet
{
private HashSet Data;
public DataSet( DataObject obj )
{
Data = new <DataObject>HashSet();
Data.add( obj );
}
public DataSet( ObjectRelationship rel )
{
Data = new <ObjectRelationship>HashSet();
Data.add( rel );
}
// etc.
Note: I haven't yet gotten to test that code due to incomplete parts (which I'm building right now).
In a function that I'm currently building, getDataObjects(), I need to return all DataObject objects that this set represents. For constructors that initialize the class's HashSet, Data, with types other than DataObject (such as ObjectRelationship above), there obviously won't be any DataObjects stored within. In this case, I need to be able to detect the type that the HashSet Data was initialized with (i.e., to tell whether it holds ObjectRelationship or not). How do I do this?
tl;dr: How do I tell the type that a Collection (in this case, a HashSet) was initiated with in my code (like with an 'if' or 'switch' statement or something)?
Sounds like you want to make the entire class generic- add a template parameter to the declaration for the class and define your HashSet and retrieval functions using that template parameter for the types.
I'm a .Net guy at the moment, though, so I couldn't give you the Java syntax, but using C# syntax it would look something like this:
public class DataSet<T>
{
    private Set<T> Data;

    public DataSet( T obj )
    {
        Data = new HashSet<T>();
        Data.add( obj );
    }

    public Iterator<T> getDataObjects()
    {
        return Data.iterator();
    }
}
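In Java, the call sites would then pin down the element type, so no runtime detection is needed. A short usage sketch, assuming the generic DataSet above plus the DataObject and ObjectRelationship classes from the question:
// each DataSet is bound to a single element type
DataSet<DataObject> objects = new DataSet<DataObject>(someDataObject);
DataSet<ObjectRelationship> relationships = new DataSet<ObjectRelationship>(someRelationship);

Iterator<DataObject> it = objects.getDataObjects();
while (it.hasNext()) {
    DataObject obj = it.next();
    // ... work with obj
}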
You could fetch an object from the set and verify its type.
Or you could have multiple sets to contain different types.
Or you could have an instance variable of type Class to act as a discriminator.
Or you could create a proxy object for HashSet using the last technique.
You could use a map from class to set:
HashMap<Class<?>, HashSet<Object>> data;

HashSet<Object> temp = data.get(DataObject.class);
if (temp == null)
{
    temp = new HashSet<Object>();
    data.put(DataObject.class, temp);
}
temp.add(obj);
Then you will get the best of both worlds.
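Building on that idea, a small helper can hand the sets back out in a typed way; this is a sketch, and the unchecked cast is only safe because every element is added through the same type-checked add method.
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class TypedSets
{
    private final Map<Class<?>, Set<Object>> data = new HashMap<Class<?>, Set<Object>>();

    public <T> void add(Class<T> type, T value) {
        Set<Object> set = data.get(type);
        if (set == null) {
            set = new HashSet<Object>();
            data.put(type, set);
        }
        set.add(value);
    }

    @SuppressWarnings("unchecked")
    public <T> Set<T> get(Class<T> type) {
        Set<Object> set = data.get(type);
        // every element keyed under 'type' was added as a T, so the cast is safe
        return set == null ? new HashSet<T>() : (Set<T>) set;
    }
}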
Sounds like your design needs to be re-thought.
Also, to be clear on generics: you cannot access the type parameter at runtime. The type parameter is only for compile-time checking and is completely gone (type erasure) at runtime.
What does this class offer that CachedRowSet does not?
Sorry, I don't consider this to be a very good abstraction. If I were a member of your team, I wouldn't use it.
Your syntax doesn't look correct to me, either. IntelliJ agrees with me: it won't compile.
This does:
import java.util.HashSet;
import java.util.Set;
import java.util.Arrays;

public class DataSet
{
    private Set<DataObject> data;

    public DataSet(DataObject obj)
    {
        this.data = new HashSet<DataObject>();
        data.add(obj);
    }

    public DataSet(DataObject[] objs)
    {
        data = new HashSet<DataObject>();
        data.addAll(Arrays.asList(objs));
    }

    // etc.
}
Still a poor abstraction. Rethink it.
You could add a property to your dataset class (an enumerated value, boolean, or type) that specifies which type was used to initialize the hashset.
Set the property in the appropriate constructor. This allows you to bypass getting an element out of the collection to check its type.
pseudo-code:
public class DataSet
{
    private HashSet Data;
    private Type _iw = null;

    public Type InitializedWith { return _iw; }

    public DataSet(DataObject)
    {
        ...
        _iw = typeof(DataObject);
    }

    public DataSet(ObjectRelationship)
    {
        ...
        _iw = typeof(ObjectRelationship);
    }
}
I'm going to follow duffymo's advice and just use better abstraction. I'm going to make multiple classes for each specific type I plan to use (each implementing a common interface) so that I can just bypass this dumb problem.
It'll add a minuscule bit of overhead during the process of creating each DataSet object of correct type, but I suppose that's just how it goes.
I don't know what DataObject gives you over and above an Object.
I think an object-oriented approach to your problem would use classes that reflected your domain of interest (e.g., Invoice, Customer, etc.). The persistence layer would hide the persistence details.
A common way to accomplish this is to use the Data Access Object, which might look like this in Java:
public interface GenericDao<T>
{
    T find(Serializable id);
    List<T> find();
    void save(T obj);
    void update(T obj);
    void delete(T obj);
}
Now you're dealing with objects instead of things that smack of relational databases. All the CRUD details are hidden behind the DAO interface.
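For illustration, a concrete DAO for one of those domain classes might be declared and used like the sketch below; Invoice and the findByCustomer method are hypothetical examples, not part of the answer.
import java.io.Serializable;
import java.util.List;

// hypothetical domain-specific DAO built on the generic interface above
public interface InvoiceDao extends GenericDao<Invoice>
{
    List<Invoice> findByCustomer(Serializable customerId);
}

// Callers then work purely in terms of domain objects:
// InvoiceDao invoiceDao = ...;        // e.g. a JDBC- or JPA-backed implementation
// Invoice invoice = invoiceDao.find(42L);
// invoiceDao.update(invoice);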
