MongoDB spring repository - abstract class as field "Class is abstract" - java

I'm getting an error when the Spring Mongo template reads an object from the DB: "Class is abstract". This is because an internal field in the document is of an abstract type.
In my case the classes look like this:
public abstract class Context {
    private String name;
}
public class AContext extends Context {
    private String aData;
}
public class BContext extends Context {
    private String bData;
}
@Document
@TypeAlias("Task")
public class Task {
    @Id
    private String id;
    private String name;
    private List<Context> contexts;
}
How can I fix this issue?
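One way this kind of error is usually resolved (a sketch on my part, not from the original post, assuming Spring Data MongoDB's default type mapping): make sure the concrete subclasses carry their own type aliases, so the mapper writes a type hint (the _class field or the alias) alongside each element and can pick the right concrete class on read. If the documents were written without any type information (for example inserted manually), the mapper has nothing to resolve Context to and falls back to the abstract class.
// Sketch: aliased concrete subclasses; Spring Data stores the alias (or _class)
// per list element and uses it to instantiate AContext/BContext when reading.
@TypeAlias("AContext")
public class AContext extends Context {
    private String aData;
}
@TypeAlias("BContext")
public class BContext extends Context {
    private String bData;
}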

Related

How to map extended classes in MapStruct

I have a question regarding MapStruct. I have a case where I extend a class from a base entity and I'm not sure how to map it. Here is my case.
BaseEntity:
public class BaseEntity {
    @Id
    @GeneratedValue(strategy = GenerationType.SEQUENCE)
    @Column(name = "id")
    private Long id;
}
BaseDto:
public class BaseDto {
    private Long id;
}
UserEntity:
public class User extends BaseEntity {
    private String name;
    private String lastName;
    private String username;
    private String password;
    private String profilePicturePath;
}
UserDto:
public class UserDto extends BaseDto {
    private String name;
    private String lastName;
    private String username;
    private String password;
    private String profilePicturePath;
}
And the mapper is like this:
@Mapper(uses = {BaseMapper.class})
public interface UserMapper {
    User userDtoToUser(UserDto userDto);
    UserDto userToUserDto(User user);
}
BaseMapper:
@Mapper
public interface BaseMapper {
    BaseEntity dtoToEntity(BaseDto baseDto);
    BaseDto entityToDto(BaseEntity baseEntity);
}
The problem is that the ID property does not get mapped.
Thank you for your time.
EDIT:
There is no error shown; in the mapper implementation (generated code) there is no mapping for the ID:
@Override
public User userDtoToUser(UserDto userDto) {
    if ( userDto == null ) {
        return null;
    }
    UserBuilder user = User.builder();
    user.name( userDto.getName() );
    user.lastName( userDto.getLastName() );
    user.username( userDto.getUsername() );
    user.password( userDto.getPassword() );
    user.profilePicturePath( userDto.getProfilePicturePath() );
    return user.build();
}
I'm guessing (as you have not posted the builder code) that the problem is that your builder class does not include the parent class fields. MapStruct makes some assumptions while generating the mapper code. From the documentation:
The default implementation of the BuilderProvider assumes the following:
- The type has a parameterless public static builder creation method that returns a builder. So for example Person has a public static method that returns PersonBuilder.
- The builder type has a parameterless public method (build method) that returns the type being built. In our example PersonBuilder has a method returning Person.
- In case there are multiple build methods, MapStruct will look for a method called build; if such a method exists then this one would be used, otherwise a compilation error would be created.
If you are using Lombok, you can solve this by using @SuperBuilder, as follows:
@SuperBuilder
@Getter
@ToString
public class UserDto extends BaseDto {
    private String name;
    private String lastName;
    private String username;
    private String password;
    private String profilePicturePath;
}
@Getter
@SuperBuilder
class BaseDto {
    private Long id;
}
@SuperBuilder
@Getter
@ToString
public class User extends BaseEntity {
    private String name;
    private String lastName;
    private String username;
    private String password;
    private String profilePicturePath;
}
@Setter
@Getter
@SuperBuilder
class BaseEntity {
    private Long id;
}
And the generated code looks like this:
@Override
public User userDtoToUser(UserDto userDto) {
    if ( userDto == null ) {
        return null;
    }
    UserBuilder<?, ?> user = User.builder();
    user.id( userDto.getId() );
    user.name( userDto.getName() );
    user.lastName( userDto.getLastName() );
    user.username( userDto.getUsername() );
    user.password( userDto.getPassword() );
    user.profilePicturePath( userDto.getProfilePicturePath() );
    return user.build();
}
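For completeness, a minimal usage sketch of my own (not from the original answer, assuming MapStruct's default component model and the org.mapstruct.factory.Mappers factory):
// Obtain the generated mapper and map a DTO; with @SuperBuilder in place the id
// declared on BaseDto is now carried over to the entity.
UserMapper mapper = Mappers.getMapper(UserMapper.class);
UserDto dto = UserDto.builder()
        .id(42L)
        .name("Jane")
        .username("jane")
        .build();
User entity = mapper.userDtoToUser(dto);
// entity.getId() now returns 42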

hibernate - Persisting a composition interface of strategy pattern

I have the following class structure:
public abstract class Creature {
    private String name;
    // strategy pattern composition
    private SkillInterface skill;
}
public interface SkillInterface {
    void attack();
}
public class NoSkill implements SkillInterface {
    @Override
    public void attack() {
        // statements
    }
}
My goal is to persist Creature objects in a single database table. The implementations of SkillInterface have no fields; since they only determine behaviour, I want to persist just the class name of the creature's current skill strategy as a String, e.g. skill.getClass().getSimpleName(). I tried to implement this with the @Converter annotation, using an AttributeConverter class to convert SkillInterface to a String on save, but I always got mapping exceptions. I want to be able to save it as a String and retrieve it as a SkillInterface object.
But how can I implement this with Hibernate? Or do I have a design mistake?
OK, it looks like I have found a basic solution that can be used to persist strategy pattern interface implementations. I used the @Converter annotation and an AttributeConverter class to convert the strategy class name to a column when saving to the database, and to turn the retrieved String back into a strategy object, as follows:
@Entity
public class Creature {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private int id;
    @Convert(converter = SkillConverter.class)
    private SkillInterface skill;
}
@Converter
public class SkillConverter implements AttributeConverter<SkillInterface, String> {
    @Override
    public String convertToDatabaseColumn(SkillInterface skill) {
        return skill.getClass().getSimpleName().toLowerCase();
    }
    @Override
    public SkillInterface convertToEntityAttribute(String dbData) {
        // works as a factory
        if (dbData.equals("noskill")) {
            return new NoSkill();
        } else if (dbData.equals("axe")) {
            return new Axe();
        }
        return null;
    }
}
public interface SkillInterface {
    public String getSkill();
    void attack();
}
public class NoSkill implements SkillInterface {
    public String getSkill() {
        return getClass().getSimpleName();
    }
    @Override
    public void attack() {
        // strategy statements
    }
}
You can use a proxy field to do this for you, like below:
abstract class Creature {
    @Column
    private String name;
    // strategy pattern composition; likely needs @Transient so Hibernate
    // does not try to map the interface field itself
    @Transient
    private SkillInterface skill;
    @Column
    private String skillName;
    public String getSkillName() {
        return skill.getClass().getSimpleName();
    }
    public void setSkillName(String skillName) {
        // ignore
    }
}
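To get a usable skill object back after loading, one option (a sketch of mine, not part of the original answer, assuming JPA lifecycle callbacks are available on the entity) is to rebuild it from the stored name inside Creature:
// Rebuild the strategy object from the persisted class name once the entity has
// been loaded; this mirrors the factory logic of the converter approach above.
@PostLoad
private void restoreSkill() {
    if ("NoSkill".equals(skillName)) {
        skill = new NoSkill();
    } else if ("Axe".equals(skillName)) {
        skill = new Axe();
    }
}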

Unable to find generated Parcelable class with Realm

I am trying to pass a Realm object in a bundle and I used the Parceler library.
This is my Realm model class.
Album.java
@Parcel
public class Album extends RealmObject implements Serializable {
    @PrimaryKey
    public String id;
    public String upc;
    public String albumName;
    public String albumArtUrl;
    public String artistName;
    public String genre_id;
    public String genreName;
    public String price;
    public String releaseYear;
    public int explicit;
    public RealmList<Song> songs = new RealmList<>();
}
And this is Song.java.
@Parcel
public class Song extends RealmObject implements Serializable {
    @PrimaryKey
    public String id;
    public String isrc;
    public String songName;
    public String artistName;
    public String album_id;
    public String albumArtUrl;
    public String genre_id;
    public String genreName;
    public String releaseYear;
    public String price;
    public String lyrics;
    public String demo;
    public int explicit;
}
When I try to pass the album object in a bundle like this:
b.putParcelable("album", Parcels.wrap(album));
I get this error:
Unable to find generated Parcelable class for com.devhousemyanmar.juketrill.models.Album, verify that your class is configured properly and that the Parcelable class com.devhousemyanmar.juketrill.models.Album$$Parcelable is generated by Parceler.
Please help me to solve this.
If you check the documentation, it has a section dedicated to using Parceler.
// All classes that extend RealmObject will have a matching RealmProxy class created
// by the annotation processor. Parceler must be made aware of this class. Note that
// the class is not available until the project has been compiled at least once.
@Parcel(implementations = { PersonRealmProxy.class },
        value = Parcel.Serialization.BEAN, // <-- requires getters/setters if set
        analyze = { Person.class })
public class Person extends RealmObject {
    // ...
}
But it's worth noting that you don't need to specify implementations = {PersonRealmProxy.class} if you use realm.copyFromRealm(song) before passing it to Parcels.wrap(). You'll need to do that anyway if you want to use field values instead of the bean serialization strategy.
Also, you might need a RealmList parceler configuration.
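A minimal sketch of that approach (my addition, assuming the Album class from the question and an already-opened Realm instance):
// Detach the managed object from Realm first, then wrap it with Parceler.
Album detached = realm.copyFromRealm(album);
Bundle b = new Bundle();
b.putParcelable("album", Parcels.wrap(detached));
// On the receiving side:
Album received = Parcels.unwrap(b.getParcelable("album"));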

Mapping abstract class in dozer

I have the following class structure (it actually is a VO layer with Hibernate mappings):
public abstract class abstractClassVO {
    private int id;
    private String name;
}
public class concreteClassAVO extends abstractClassVO {
    private String aAttribute;
}
public class concreteClassBVO extends abstractClassVO {
    private Long bAttribute;
}
And the equivalent DTO objects:
public abstract class abstractClassDTO {
    private int id;
    private String name;
}
public class concreteClassADTO extends abstractClassDTO {
    private String aAttribute;
}
public class concreteClassBDTO extends abstractClassDTO {
    private Long bAttribute;
}
Then I have another object like this:
public class compositeObject {
    private int anAttribute;
    private abstractClassVO myInstance;
}
and its equivalent:
public class compositeObjectDTO {
    private int anAttribute;
    private abstractClassDTO myInstance;
}
How can I tell Dozer to automatically map myInstance to the specific DTO that corresponds to the concrete class implementation in the VO layer?
Currently, out of the box, Dozer isn't putting anything in the myInstance field of the compositeObjectDTO class. My guess is that this is because abstractClassDTO is an abstract class, and since Dozer cannot determine the implementation, it does nothing. I am not getting any exceptions.
Dozer can't do this out of the box, but you could write a helper that determines the destination class from the source class. You can get this information from the DozerBeanMapper.getMappingMetadata().getClassMappings* methods. These methods return a list of ClassMappingMetadata that contains the destination class. You then only need to check whether the destination class inherits from abstractClassDTO. This check can be omitted if you only have one mapping per VO.
For bi-directional mappings you should additionally check the ClassMappingMetadata mapping direction.
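A rough helper along those lines might look like this (my sketch, assuming the Dozer 5.x metadata API, i.e. org.dozer.DozerBeanMapper and org.dozer.metadata.ClassMappingMetadata, and the classes from the question):
// Look up the configured mappings whose source matches the concrete VO class,
// pick the one whose destination is a subtype of abstractClassDTO, and map to it.
public abstractClassDTO mapToDto(DozerBeanMapper mapper, abstractClassVO source) {
    for (ClassMappingMetadata meta :
            mapper.getMappingMetadata().getClassMappingsBySource(source.getClass())) {
        Class<?> destination = meta.getDestinationClass();
        if (abstractClassDTO.class.isAssignableFrom(destination)) {
            return (abstractClassDTO) mapper.map(source, destination);
        }
    }
    throw new IllegalStateException("No DTO mapping found for " + source.getClass());
}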

How to deserialize a JSON array containing an abstract class without modifying the parent class?

I'm trying to deserialize a JSON array, which is persisted in my MongoDB, to a Java object using Jackson. I found many tutorials that handle this polymorphism by adding:
@JsonTypeInfo(use=Id.CLASS, property="_class")
to the superclass. However, in my case I can't modify the superclass. So, is there a solution that works without modifying the superclass? Here is my code:
public class User {
    @JsonProperty("_id")
    private String id;
    private List<Identity> identities; // <-- My list contains objects of an abstract class, Identity
    public User() {
        identities = new ArrayList<Identity>();
    }
    public static Iterable<User> findAllUsers() {
        return users().find().as(User.class); // Always gives me the error
    }
    /* More code */
}
It always gives me the error: Can not construct instance of securesocial.core.Identity, problem: abstract types either need to be mapped to concrete types, have custom deserializer, or be instantiated with additional type information.
You can use the @JsonDeserialize annotation to bind a concrete implementation class to an abstract class. If you cannot modify your abstract class, you can use Jackson mix-in annotations to tell Jackson where to find the implementation class.
Here is an example:
public class JacksonAbstract {
    public static class User {
        private final String id;
        private final List<Identity> identities;
        @JsonCreator
        public User(@JsonProperty("_id") String id, @JsonProperty("identities") List<Identity> identities) {
            this.id = id;
            this.identities = identities;
        }
        @JsonProperty("_id")
        public String getId() {
            return id;
        }
        public List<Identity> getIdentities() {
            return identities;
        }
    }
    public static abstract class Identity {
        public abstract String getField();
    }
    @JsonDeserialize(as = IdentityImpl.class)
    public static abstract class IdentityMixIn {
    }
    public static class IdentityImpl extends Identity {
        private final String field;
        public IdentityImpl(@JsonProperty("field") String field) {
            this.field = field;
        }
        @Override
        public String getField() {
            return field;
        }
    }
    public static void main(String[] args) throws IOException {
        User u = new User("myId", Collections.<Identity>singletonList(new IdentityImpl("myField")));
        ObjectMapper mapper = new ObjectMapper();
        mapper.addMixInAnnotations(Identity.class, IdentityMixIn.class);
        String json = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(u);
        System.out.println(json);
        System.out.println(mapper.readValue(json, User.class));
    }
}
