Patterns: Populate an instance from request parameters and export it to XML (Java)

I'm building a simple RESTful service, and to achieve that I need two things:
Get an instance of my resource (i.e. Book) from the request parameters, so that instance can be persisted
Build an XML document from that instance to send the representation to clients
Right now, I'm doing both things in my POJO class:
public class Book implements Serializable {
    private Long id;

    public Book(Form form) {
        // Initialize attributes from the request form
        id = Long.parseLong(form.getFirstValue(Book.CODE_ELEMENT));
    }

    public Element toXml(Document document) {
        // Build an XML representation of the Book
        Element bookElement = document.createElement(BOOK_ELEMENT);
        return bookElement;
    }
}
I remembered an OO principle that says behavior should live where the data is, but now my POJO depends on the Request and XML APIs, and that doesn't feel right (the class also carries persistence annotations).
Is there any standard approach/pattern for solving this?
EDIT:
The libraries I'm using are Restlet and Objectify.

I agree with you when you say that behavior should live where the data is. But at the same time, I just don't feel comfortable polluting a POJO's interface with methods used purely for serialization (which can grow considerably depending on the formats you support - JSON, XML, etc.).
1) Build an XML document from that instance to send the representation to the clients
In order to decouple the object from serialization logic, I would adopt the Strategy Pattern:
interface BookSerializerStrategy {
    String serialize(Book book);
}

public class XmlBookSerializerStrategy implements BookSerializerStrategy {
    public String serialize(Book book) {
        // Do something to serialize your book to XML.
    }
}

public class JsonBookSerializerStrategy implements BookSerializerStrategy {
    public String serialize(Book book) {
        // Do something to serialize your book to JSON.
    }
}
Your POJO would become:
public class Book implements Serializable {
    private Long id;
    private BookSerializerStrategy serializer;

    public String serialize() {
        return serializer.serialize(this);
    }

    public void setSerializer(BookSerializerStrategy serializer) {
        this.serializer = serializer;
    }
}
Using this approach you isolate the serialization logic in one place and don't pollute your POJO with it. Additionally, by returning a String you avoid coupling your POJO to the Document and Element classes.
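Usage would then look something like this (a quick sketch):
Book book = new Book();
book.setSerializer(new XmlBookSerializerStrategy());
String xml = book.serialize();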
2) Get an instance of my resource (i.e Book) from request parameters, so I can get that instance to be persisted
Finding a pattern to handle deserialization is more complex, in my opinion. I really don't see a better way than to create a factory with static methods in order to remove this logic from your POJO.
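For instance, a minimal sketch of such a factory (it assumes Book gains a no-arg constructor and an id setter; BookFactory is an illustrative name):
public final class BookFactory {

    private BookFactory() {
    }

    // Keeps the Restlet Form dependency out of the Book POJO.
    public static Book fromForm(Form form) {
        Book book = new Book();
        book.setId(Long.parseLong(form.getFirstValue(Book.CODE_ELEMENT)));
        return book;
    }
}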
Another approach to both of your questions would be something like JAXB uses: two different objects, an Unmarshaller in charge of deserialization and a Marshaller for serialization. Since Java 1.6, JAXB ships with the JDK.
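A minimal sketch of the JAXB approach (it assumes Book carries an @XmlRootElement annotation, a no-arg constructor, and getters/setters, and that book is an existing instance):
JAXBContext context = JAXBContext.newInstance(Book.class);

// Serialization: Book -> XML
Marshaller marshaller = context.createMarshaller();
StringWriter writer = new StringWriter();
marshaller.marshal(book, writer);
String xml = writer.toString();

// Deserialization: XML -> Book
Unmarshaller unmarshaller = context.createUnmarshaller();
Book restored = (Book) unmarshaller.unmarshal(new StringReader(xml));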
Finally, these are just suggestions. I've actually become really interested in your question and am curious about other possible solutions.

Are you using Spring, or any other framework, in your project? If you used Spring, it would take care of serialization for you, as well as binding request params to method params (parsing as needed).
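For illustration, a hypothetical sketch of the Spring equivalent (BookController, BookService, and the mapping are made up for the example):
@RestController
public class BookController {

    private final BookService bookService; // assumed collaborator

    public BookController(BookService bookService) {
        this.bookService = bookService;
    }

    // Spring binds the path variable and serializes the returned Book
    // to JSON or XML via content negotiation; no toXml()-style methods needed.
    @GetMapping("/books/{id}")
    public Book get(@PathVariable Long id) {
        return bookService.findById(id);
    }
}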

Related

How to properly convert domain entities to DTOs while considering scalability & testability

I have read several articles and Stack Overflow posts about converting domain objects to DTOs and have tried them out in my code. When it comes to testing and scalability I keep running into issues. I know the following three possible solutions for converting domain objects to DTOs. Most of the time I am using Spring.
Solution 1: Private method in the service layer for converting
The first possible solution is to create a small "helper" method in the service layer code which converts the retrieved database object into my DTO object.
@Service
public class MyEntityService {

    public SomeDto getEntityById(Long id) {
        SomeEntity dbResult = someDao.findById(id);
        SomeDto dtoResult = convert(dbResult);
        // ... more logic happens
        return dtoResult;
    }

    public SomeDto convert(SomeEntity entity) {
        // ... object creation, using getters/setters for the conversion
    }
}
Pros:
easy to implement
no additional class for the conversion needed -> the project doesn't blow up with extra classes
Cons:
problems when testing, as new SomeEntity() is used inside the private method, and if the object is deeply nested I have to provide an adequate result for my when(someDao.findById(id)).thenReturn(alsoDeeplyNestedObject) to avoid NullPointerExceptions when the conversion also dissolves the nested structure
Solution 2: Additional constructor in the DTO for converting domain entity to DTO
My second solution would be to add an additional constructor to my DTO entity that performs the conversion.
public class SomeDto {
    // ... some attributes

    public SomeDto(SomeEntity entity) {
        this.attribute = entity.getAttribute();
        // ... conversion of nested objects, lists, and arrays
    }
}
Pros:
no additional class for the conversion needed
conversion hidden inside the DTO entity -> the service code stays smaller
Cons:
usage of new SomeDto() in the service code, and therefore I have to provide the correct nested object structure as the result of my someDao mocking.
Solution 3: Using Spring's Converter or any other externalized Bean for this converting
I recently saw that Spring offers a class for conversion purposes: Converter<S, T>. But this solution stands for any externalized class that does the conversion. With this solution I inject the converter into my service code and call it whenever I want to convert the domain entity to my DTO (see the sketch after the pros and cons below).
Pros:
easy to test as I can mock the result during my test case
separation of tasks -> a dedicated class is doing the job
Cons:
doesn't "scale" that much as my domain model grows. With a lot of entities I have to create two converters for every new entity (-> converting DTO entitiy and entitiy to DTO)
Do you have more solutions for my problem, and how do you handle it? Do you create a new converter for every new domain object, and can you "live" with the number of classes in the project?
Thanks in advance!
Solution 1: Private method in the service layer for converting
I guess Solution 1 will not work well, because your DTOs are domain-oriented rather than service-oriented. Thus it is likely that they will be used in different services. So a mapping method does not belong to any one service and therefore should not be implemented in any one service. How would you re-use the mapping method in another service?
The first solution would work well if you used dedicated DTOs per service method. But more about this at the end.
Solution 2: Additional constructor in the DTO for converting domain entity to DTO
In general a good option, because you can see the DTO as an adapter to the entity. In other words: the DTO is another representation of an entity. Such designs often wrap the source object and provide methods that give you another view on the wrapped object.
But a DTO is a data transfer object, so it might be serialized sooner or later and sent over a network, e.g. using Spring's remoting capabilities. In this case the client that receives the DTO must deserialize it and thus needs the entity classes on its classpath, even if it only uses the DTO's interface.
Solution 3: Using Spring's Converter or any other externalized Bean for this converting
Solution 3 is the one I would prefer as well. But I would create a Mapper<S,T> interface that is responsible for mapping from source to target and vice versa, e.g.:
public interface Mapper<S, T> {
    public T map(S source);
    public S map(T target);
}
The implementation can be done using a mapping framework like ModelMapper.
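For example, a sketch of a ModelMapper-backed implementation (SomeEntity and SomeDto are placeholder types):
public class SomeEntityMapper implements Mapper<SomeEntity, SomeDto> {

    private final ModelMapper modelMapper = new ModelMapper();

    @Override
    public SomeDto map(SomeEntity source) {
        return modelMapper.map(source, SomeDto.class);
    }

    @Override
    public SomeEntity map(SomeDto target) {
        return modelMapper.map(target, SomeEntity.class);
    }
}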
You also said that a converter for each entity
doesn't "scale" that much as my domain model grows. With a lot of entities I have to create two converters for every new entity (-> converting DTO entitiy and entitiy to DTO)
I doubt that you will only have to create two converters, or one mapper, per DTO, because your DTO is domain-oriented.
As soon as you start to use it in another service, you will recognize that the other service usually cannot or should not return all the values that the first one does.
You will start to implement another mapper or converter for every additional service.
This answer would get too long if I started on the pros and cons of dedicated versus shared DTOs, so I can only ask you to read my blog post about the pros and cons of service layer designs.
EDIT
About the third solution: where do you prefer to put the call for the mapper?
In the layer above the use cases. DTOs are data transfer objects, because they pack data in data structures that are best for the transfer protocol. Thus I call that layer the transport layer.
This layer is responsible for mapping use case's request and result objects from and to the transport representation, e.g. json data structures.
EDIT
I see you're ok with passing an entity as a DTO constructor parameter. Would you also be ok with the opposite? I mean, passing a DTO as an Entity constructor parameter?
A good question. The opposite would not be OK for me, because it would introduce a dependency from the entity to the transport layer. This would mean that a change in the transport layer could impact the entities, and I don't want changes in more detailed layers to impact more abstract layers.
If you need to pass data from the transport layer to the entity layer you should apply the dependency inversion principle.
Introduce an interface that will return the data through a set of getters, let the DTO implement it and use this interface in the entities constructor. Keep in mind that this interface belongs to the entity's layer and thus should not have any dependencies to the transport layer.
                          interface
+-----+  implements  ||  +------------+  uses  +--------+
| DTO | -------------||->| EntityData |<------ | Entity |
+-----+              ||  +------------+        +--------+
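In code, the diagram might translate to something like this (a sketch; the getName property is illustrative):
// Entity layer: the interface and the entity know nothing about DTOs.
public interface EntityData {
    String getName();
}

public class SomeEntity {
    private final String name;

    public SomeEntity(EntityData data) {
        this.name = data.getName();
    }
}

// Transport layer: the DTO implements the entity-layer interface.
public class SomeDto implements EntityData {
    private String name;

    @Override
    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}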
I like the third solution from the accepted answer.
Solution 3: Using Spring's Converter or any other externalized Bean for this converting
And I create the DtoConverter in this way:
BaseEntity marker class:
public abstract class BaseEntity implements Serializable {
}
AbstractDto marker class:
public class AbstractDto {
}
GenericConverter interface:
public interface GenericConverter<D extends AbstractDto, E extends BaseEntity> {

    E createFrom(D dto);

    D createFrom(E entity);

    E updateEntity(E entity, D dto);

    default List<D> createFromEntities(final Collection<E> entities) {
        return entities.stream()
                .map(this::createFrom)
                .collect(Collectors.toList());
    }

    default List<E> createFromDtos(final Collection<D> dtos) {
        return dtos.stream()
                .map(this::createFrom)
                .collect(Collectors.toList());
    }
}
CommentConverter interface:
public interface CommentConverter extends GenericConverter<CommentDto, CommentEntity> {
}
CommentConverter class implementation:
@Component
public class CommentConverterImpl implements CommentConverter {

    @Override
    public CommentEntity createFrom(CommentDto dto) {
        CommentEntity entity = new CommentEntity();
        updateEntity(entity, dto);
        return entity;
    }

    @Override
    public CommentDto createFrom(CommentEntity entity) {
        CommentDto dto = new CommentDto();
        if (entity != null) {
            dto.setAuthor(entity.getAuthor());
            dto.setCommentId(entity.getCommentId());
            dto.setCommentData(entity.getCommentData());
            dto.setCommentDate(entity.getCommentDate());
            dto.setNew(entity.getNew());
        }
        return dto;
    }

    @Override
    public CommentEntity updateEntity(CommentEntity entity, CommentDto dto) {
        if (entity != null && dto != null) {
            entity.setCommentData(dto.getCommentData());
            entity.setAuthor(dto.getAuthor());
        }
        return entity;
    }
}
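A service could then use the converter like this (CommentService and commentDao are assumed for the example):
@Service
public class CommentService {

    private final CommentConverter converter;
    private final CommentDao commentDao;

    public CommentService(CommentConverter converter, CommentDao commentDao) {
        this.converter = converter;
        this.commentDao = commentDao;
    }

    public List<CommentDto> findAll() {
        // The default method from GenericConverter maps the whole collection.
        return converter.createFromEntities(commentDao.findAll());
    }
}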
I ended up NOT using some magical mapping library or external converter class, but just added a small bean of my own with convert methods from each entity to each DTO I need. The reason is that the mapping was:
either stupidly simple, and I would just copy some values from one field to another, perhaps with a small utility method,
or quite complex, and more complicated to write down as custom parameters to some generic mapping library than to just write out that code. This is, for example, the case where the client can send JSON, but under the hood it is transformed into entities, and when the client retrieves the parent object of these entities again, it's converted back into JSON.
This means I can just call .map(converter::convert) on any collection of entities to get back a stream of my DTOs.
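For example (BookEntity, BookDto, and the converter bean are placeholders):
List<BookDto> dtos = bookEntities.stream()
        .map(converter::convert) // one hand-written convert method per entity/DTO pair
        .collect(Collectors.toList());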
Is it scalable to have it all in one class? Well, the custom configuration for this mapping would have to be stored somewhere even if I used a generic mapper. The code is generally extremely simple, except for a handful of cases, so I'm not too worried about this class exploding in complexity. I'm also not expecting to have dozens more entities, but if I did, I might group these converters into a class per subdomain.
Adding a base class to my entities and DTOs so I can write a generic converter interface and implement it per class just isn't needed (yet?) for me either.
In my opinion the third solution is the best one. Yes, for each entity you'll have to create two new converter classes, but when it comes time for testing you won't have a lot of headaches. You should never choose the solution that has you write less code at the beginning and then much more when it comes to testing and maintaining that code.
Another point: if you use the second approach and your entity has lazy dependencies, your DTO can't tell whether a dependency is loaded unless you inject an EntityManager into the DTO and use it to check, and a DTO shouldn't know anything about the EntityManager. As a solution I personally prefer converters, but at the same time I prefer to have multiple DTO classes for the same entity. For example, if I am 100% sure that the User entity will be loaded without its corresponding Company, then there should be a UserDto that doesn't have a CompanyDto field. If, on the other hand, I know that the User entity will be loaded together with its Company, I use an aggregate, something like a UserCompanyDto class that contains a UserDto and a CompanyDto as fields.
For my part, I prefer option 3 with a third-party library such as ModelMapper or MapStruct. I also use it through an interface in a util package, because I don't want any external tool or library to interact directly with my code.
Definition:
public interface MapperWrapper {
    <T> T performMapping(Object source, Class<T> destination);
}

@Component
public class ModelMapperWrapper implements MapperWrapper {

    private ModelMapper mapper;

    public ModelMapperWrapper() {
        this.mapper = new ModelMapper();
    }

    @Override
    public <T> T performMapping(Object source, Class<T> destination) {
        mapper.getConfiguration()
                .setMatchingStrategy(MatchingStrategies.STRICT);
        return mapper.map(source, destination);
    }
}
Then I can test it easily:
@SpringJUnitWebConfig(TestApplicationConfig.class)
class ModelMapperWrapperTest implements WithAssertions {

    private final MapperWrapper mapperWrapper;

    @Autowired
    public ModelMapperWrapperTest(MapperWrapper mapperWrapper) {
        this.mapperWrapper = mapperWrapper;
    }

    @BeforeEach
    void setUp() {
    }

    @Test
    void givenModel_whenMapModelToDto_thenReturnsDto() {
        var model = new DummyModel();
        model.setId(1);
        model.setName("DUMMY_NAME");
        model.setAge(25);

        var modelDto = mapperWrapper.performMapping(model, DummyModelDto.class);

        assertAll(
                () -> assertThat(modelDto.getId()).isEqualTo(String.valueOf(model.getId())),
                () -> assertThat(modelDto.getName()).isEqualTo(model.getName()),
                () -> assertThat(modelDto.getAge()).isEqualTo(String.valueOf(model.getAge()))
        );
    }

    @Test
    void givenDto_whenMapDtoToModel_thenReturnsModel() {
        var modelDto = new DummyModelDto();
        modelDto.setId("1");
        modelDto.setName("DUMMY_NAME");
        modelDto.setAge("25");

        var model = mapperWrapper.performMapping(modelDto, DummyModel.class);

        assertAll(
                () -> assertThat(model.getId()).isEqualTo(Integer.valueOf(modelDto.getId())),
                () -> assertThat(model.getName()).isEqualTo(modelDto.getName()),
                () -> assertThat(model.getAge()).isEqualTo(Integer.valueOf(modelDto.getAge()))
        );
    }
}
After that it would be very easy to swap in another mapper library. I could also have used an abstract factory or the strategy pattern here.

Java serialization of non serializable third party class

I am currently developing a web application, and I would like to make Java objects persistent on the server so that they can be retrieved at any time. Since a database is overkill for my application, I chose the easiest way of persisting Java objects: serialization to XML or to bytes. Unfortunately a big part of the code I use consists of Java classes which I cannot modify, and these classes do not implement the Serializable interface. What are my options for serializing objects of these classes, as well as other interacting objects of my own classes?
As I said in my comments, I'd go for a SerializationService which would find the proper Serializer<T> for every object you want to save.
Something like:
public interface Serializer<T> {
    Serializable toSerializable(T objectToSerialize);

    // to build a factory/service around it
    boolean canDeserialize(Serializable serializedObject);
    T fromSerializable(Serializable serializedObject);
}
And if you want a basic, concrete example, here is one with the quite common Path:
public class PathSerializer implements Serializer<Path> {

    @Override
    public Serializable toSerializable(Path objectToSerialize) {
        return objectToSerialize.toString();
    }

    @Override
    public Path fromSerializable(Serializable serializedObject) {
        if (!canDeserialize(serializedObject)) {
            throw new IllegalArgumentException("Cannot deserialize this");
        }
        return Paths.get((String) serializedObject);
    }

    @Override
    public boolean canDeserialize(Serializable serializedObject) {
        return serializedObject instanceof String; // instanceof is already null-safe
    }
}
You could also very well store a POJO containing the name of your original object's class and the list of parameters needed by its constructor and/or a map of its fields, so that you can regenerate your objects by reflection.
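A sketch of what such a POJO could look like (it assumes the target class has an accessible no-arg constructor; all names are illustrative):
public class SerializedSpec implements Serializable {

    private final String className;
    private final Map<String, Serializable> fields;

    public SerializedSpec(String className, Map<String, Serializable> fields) {
        this.className = className;
        this.fields = fields;
    }

    // Recreate the original object and restore its fields by reflection.
    public Object rebuild() throws ReflectiveOperationException {
        Object instance = Class.forName(className).getDeclaredConstructor().newInstance();
        for (Map.Entry<String, Serializable> entry : fields.entrySet()) {
            Field field = instance.getClass().getDeclaredField(entry.getKey());
            field.setAccessible(true);
            field.set(instance, entry.getValue());
        }
        return instance;
    }
}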
It's all up to you and the complexity of your application.
I think JSON would be the go-to solution here. Take Google's Gson library, for example. You don't need to annotate your classes; simply write:
Gson gson = new Gson();
MyObj obj = gson.fromJson(jsonString, MyObj.class); // fromJson needs the target type
String json = gson.toJson(obj);
For more general information about the JSON format see the official JSON documentation.
One option would be to extend the classes that you don't have access to in order to save their internal state, and implement Serializable on those subclasses.
More info on this SO question:
Serializing a class variable which does not implement serializable
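A sketch of that idea, assuming the third-party class (here called ThirdPartyClass) has an accessible no-arg constructor and exposes its state through hypothetical getState/setState accessors:
public class SerializableThirdParty extends ThirdPartyClass implements Serializable {

    private void writeObject(ObjectOutputStream out) throws IOException {
        out.defaultWriteObject();
        out.writeObject(getState()); // manually persist the superclass state
    }

    private void readObject(ObjectInputStream in) throws IOException, ClassNotFoundException {
        in.defaultReadObject();
        setState((String) in.readObject()); // restore it on deserialization
    }
}
Note that on deserialization Java re-runs ThirdPartyClass's no-arg constructor, which is why the superclass state has to be written and restored by hand.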
Besides this, I don't think there is any other option except building some wrappers and serializing the classes manually to XML or JSON.

How to JSON parse immutable object in Java + Jersey

So I am just trying out Jersey for REST services and it seems to be working out fine. I only expose GET services, and all of the object types that I expose with these services have an immutable representation in Java. By default Jersey seems to use a parser (JAXB?) that requires an @XmlRootElement annotation on the class to be parsed, a zero-arg constructor, and setters.
I have been using Gson with no zero-arg constructor, no setters, and final on all fields, with no problems at all. Is there any way to accomplish this with Jersey (i.e. the parser it is using)? I have seen solutions with adapter classes that map data from an immutable object to a mutable representation, but this seems like a lot of boilerplate (new classes, more annotations, etc.) if the same can be achieved with Gson without anything added.
Note: 1) I have heard people promote using a zero-arg constructor and claim that Gson should not work without it. This is not what I am interested in. 2) I really have tried googling this, but my keywords might be off. In other words, humiliate me in moderation.
EDIT 1:
My web service works if I do it like this:
@XmlRootElement
public class Code {
    private String code; // Silly object, just used for the example.

    public Code() {}

    // (G || S)etters
}
With this class exposing the object:
@GET
@Produces(MediaType.APPLICATION_JSON)
public Set<Code> get(@QueryParam("name") String name) { // Here I want to use a class of my own instead of String name; haven't figured out how yet.
    return this.codeService.get(name);
}
If I replace Code with the following, the web service stops working:
public class Code {
    private final String code;

    @JsonCreator
    public Code(@JsonProperty("code") String code) {
        this.code = code;
    }

    // Getters omitted
}
What I want is to 1) have immutable objects that can be parsed to/from JSON and 2) be able to define something like @RequestBody in Spring MVC for my incoming objects.
Actually this is pretty easy with Genson. You just need the jar, and then you configure the Genson feature to use constructors with arguments (if you don't want to put annotations on them).
Genson genson = new GensonBuilder().useConstructorWithArguments(true).create();
// and then register it with jersey
new ResourceConfig().register(new GensonJaxRSFeature().use(genson));
Or you can use JsonProperty on the arguments. See the User Guide for more details.

Order matters with class metadata in Genson - Is there a work-around?

I'm using Genson to serialize + deserialize JSON in my Android app into polymorphic objects. The JSON is coming from a variety of sources, though, and I can't guarantee that the #class metadata will be the first entry in the JSON. Walking through the Genson code and writing test cases, it looks like the #class metadata has to be the first entry in the dictionary.
Has anyone had luck working around this constraint? Is it time to switch to something else, and if so, what?
public class Message {
    Payload payload;
    // getters & setters
}

public abstract class Payload {
    //
}

public class Notification1 extends Payload {
    String text;
    // getters & setters
}

public class Notification2 extends Payload {
    String otherText;
    // getters & setters
}
String correctOrder = "{\"#class\":\"Message\",\"payload\":{\"#class\":\"Notification1\",\"text\":\"Text\"}}";
String modifiedOrder = "{\"#class\":\"Message\",\"payload\":{\"text\":\"Text\",\"#class\":\"Notification1\"}}";
Genson g = new GensonBuilder()
        .addAlias("Notification1", Notification1.class)
        .addAlias("Notification2", Notification2.class)
        .useRuntimeType(true)
        .useClassMetadata(true)
        .useMetadata(true)
        .useFields(false)
        .useIndentation(false)
        .create();

g.deserialize(correctOrder, Message.class);  // This works
g.deserialize(modifiedOrder, Message.class); // This barfs with: com.owlike.genson.JsonBindingException: Could not deserialize to type class com.ol.communication.messages.Message
Indeed, the order matters. This was chosen on purpose; see the remarks in the user guide.
If we allowed the #class property anywhere in the JSON object, then we would have to first deserialize the whole JSON object (and its sub-properties, objects/arrays, etc.) into an intermediary data structure and only then map it to the correct type.
This would incur additional memory overhead and less speed, but greater flexibility, true.
A solution would be to mark classes that are polymorphic (via an annotation or config in the builder), for which Genson would search for/produce the #class property in the stream. This would confine the overhead to the polymorphic objects in the stream.
At the moment it is not implemented, but I have opened an issue. It will come in a future release.
Outside of the technical aspects, I don't think you should rely on polymorphic logic (or any other fancy features) when you are dealing with multiple external APIs. Features like this are library-specific, so if you don't use the same tool on both sides, you can run into trouble. Usually people have a layer that communicates with the APIs and maps the data to YOUR model. If you don't own the code on both ends, I think that is a good solution in the long term.

Parsing nested tags with XStream

I'm using a database API called Socrata to parse information for recycling services. An example of the XML output from this service can be found here:
http://www.datakc.org/api/views/zqwi-c5q3/rows.xml?search=%22computer%22
As you can probably tell, this seems just like bad XML. The problem I'm having is that the tag containing the list of rows is called row (with no attributes) and each row is also called row (with attributes). This seems like it will confuse the hell out of XStream, and I cannot find a way at all to handle this data with XStream.
Any of you XStream/XML gurus out there know what I can do?
The solution wasn't very intuitive, but I managed to figure it out. First I had to declare a class that represents the response tag. This class also contains a container for a list of recycle services. Here's how the class looks:
#XStreamAlias("response")
public class QueryResponse {
#XStreamAlias("row")
private RecycleServices services;
public RecycleServices getServices() {
return services;
}
public void setServices(RecycleServices services) {
this.services = services;
}
}
The RecycleServices class is the real trick here: it wraps an implicit list of RecycleService objects.
#XStreamAlias("row")
public class RecycleServices {
#XStreamImplicit(itemFieldName = "row")
private List<RecycleService> services = new ArrayList<RecycleService>();
public List<RecycleService> getServices() {
return services;
}
public void setServices(List<RecycleService> services) {
this.services = services;
}
}
The RecycleService class is then just a straight representation of each recycle service row and is not really relevant to the answer to this question. I had some frustration figuring this out, and I hope this helps someone out there.
This is more a workaround than a solution, but the row/row naming would be surmountable by using XSLT to rewrite the DOM into something XStream is better able to consume.
