Same class mapped to multiple collections? (Java)

I want to store template objects for a class in a database, together with owned/altered copies of those objects that belong to clients. I can think of three options:
Two classes, where the owned class takes a template object to construct itself. This creates two nearly identical classes, and constructing from the template object does not feel right to me.
The same class stored in two collections to differentiate template objects from owned objects. The template objects would have one redundant field that references a client ID for an owned object. I am not sure whether I can turn off writing indexed fields to the DB if they are not initialized. Also, I cannot seem to find a way to store the same class in a different collection. I am using Datastore.save(..) to write to the DB, and the collection name seems to be derived from the class name. I can change that at the class level, but that still means I cannot create two different collections for this class.
One class, one collection. I could use the ownerId reference field to mark an object as a template. However, the collection of owned objects will grow rapidly, and templates need to be accessed often, so this creates overhead.
There also seems to be an AdvancedDatastore. I have not used it, but it has AdvancedDatastore.save(String collection, T entity). However, it lacks equivalent signatures for update, delete, and many other methods, so I am not sure how to use it properly in my scenario.

I migrated my Datastore implementation to a custom BasicDAO implementation where I can use AdvancedDatastore to supply a collection name. Here is what it looks like:
public class UserDAO extends BasicDAO<User, String> {

    private final String collectionName;

    public UserDAO(MongoClient mongoClient, Morphia morphia, String dbName, String collectionName) {
        super(mongoClient, morphia, dbName);
        this.collectionName = collectionName;
    }

    @Override
    public Key<User> save(User entity) {
        return ((AdvancedDatastore) getDatastore()).save(collectionName, entity);
    }

    @Override
    public WriteResult delete(User entity) {
        return ((AdvancedDatastore) getDatastore()).delete(collectionName, User.class, entity.getId());
    }

    public User getByName(String userName) {
        return ((AdvancedDatastore) getDatastore())
                .createQuery(collectionName, User.class)
                .field("name").equal(userName)
                .get();
    }
}
Using this class I am able to create multiple collections for the same class.
edit
The problem is that if you use a method from the base class that is not overridden, it will operate on the default collection (named after the class by default). Since there does not seem to be a DAO that supports collection names, I think it's better to just roll your own DAO from scratch on top of an AdvancedDatastore. You can implement the DAO interface and write all its methods yourself if you need its contract or full functionality; a minimal sketch follows.
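For illustration, a sketch of such a standalone DAO, assuming the old Morphia AdvancedDatastore API (its save/get/delete/createQuery overloads take a collection name); the class name and method selection here are my own, and the imports depend on your Morphia version:
public class CollectionAwareDAO<T, K> {

    private final AdvancedDatastore datastore;
    private final Class<T> entityClass;
    private final String collectionName;

    public CollectionAwareDAO(AdvancedDatastore datastore, Class<T> entityClass, String collectionName) {
        this.datastore = datastore;
        this.entityClass = entityClass;
        this.collectionName = collectionName;
    }

    // Every operation passes the collection name explicitly,
    // so nothing can silently fall back to the default collection.
    public Key<T> save(T entity) {
        return datastore.save(collectionName, entity);
    }

    public T get(K id) {
        return datastore.get(collectionName, entityClass, id);
    }

    public WriteResult delete(K id) {
        return datastore.delete(collectionName, entityClass, id);
    }

    public Query<T> createQuery() {
        return datastore.createQuery(collectionName, entityClass);
    }
}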

Related

Get a different class based on a config value in Java

I am working on a program that supports 3 different platforms. These platforms have identical logic, but the program would have to work with a different database for each one.
I have three different Database.java files, one for each platform. For example:
com.myproject.dao.bmw.Database.java
com.myproject.dao.ford.Database.java
com.myproject.dao.chevy.Database.java
The Database classes all have the same method signatures. But their database connection or queries may be different.
I set the platform name, which in this case is the car make, using a config.properties file. Throughout the program I call the methods inside the Database class many times, depending on which platform is set in config.properties.
I want to get the Database object based on what is set in the config.properties file when the program starts, while keeping the same object name for the database. That way, each time I call a method on the Database class I would not need if statements or switches.
What is the best way to achieve my goal?
This sounds like a job for the Factory pattern.
Create an interface CarDB (or ICarDb, if you like that naming convention, so you know it is an interface) that contains all the common methods.
Create 3 classes that implement CarDB: Ford, Bmw and Chevy.
Create a CarDbFactory that has a method like CarDB getDb(Params params) which, given your parameters, returns a CarDB; the actual one (Ford, Bmw...) depends on the parameters. See the sketch after this list.
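A minimal sketch of that factory; the property key "platform" and the Properties parameter are my own assumptions, so substitute whatever your config.properties actually uses:
public class CarDbFactory {

    // Reads the car make from the loaded config.properties and
    // returns the matching CarDB implementation.
    public CarDB getDb(java.util.Properties config) {
        String make = config.getProperty("platform");
        switch (make) {
            case "bmw":   return new Bmw();
            case "ford":  return new Ford();
            case "chevy": return new Chevy();
            default:
                throw new IllegalArgumentException("Unknown platform: " + make);
        }
    }
}
The rest of the program only ever sees the CarDB interface, so no if statements or switches are needed at the call sites.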
First of all, you did not mention any reason why you are not considering one of the existing ORM frameworks, such as Hibernate, which are meant specifically for this job. In a nutshell, an ORM allows you to switch between different databases easily. But if you have a strong reason not to use an ORM framework, you can consider the approach below.
Basically, you need to define and use a DataBaseConfigFactory and set the appropriate DBConfiguration during the startup of your application, as shown below:
DataBaseConfigFactory interface:
public interface DataBaseConfigFactory {
    Connection getConnection();
    void executeQuery();
}
MyProjectDataBaseConfigFactory class:
public class MyProjectDataBaseConfigFactory implements DataBaseConfigFactory {

    private static final DBConfiguration dbConfiguration;

    static {
        // Get the active db name from the properties file and set
        // dbConfiguration to BmwDBConfiguration, FordDBConfiguration, etc.
    }

    public Connection getConnection() {
        return dbConfiguration.getConnection();
    }

    public void executeQuery() {
        dbConfiguration.executeQuery(); // void method: no return value
    }
}
Now define a DBConfiguration interface and specific implementations for the operations that your bmw, ford, etc. databases support.
DBConfiguration interface:
public interface DBConfiguration {
    // Add all methods that DBConfiguration should support
}

public class BmwDBConfiguration implements DBConfiguration {
    // BMW-specific implementations for DBConfiguration
}

public class FordDBConfiguration implements DBConfiguration {
    // Ford-specific implementations for DBConfiguration
}
In short, you will use the DataBaseConfigFactory interface throughout your application to connect to databases, and if a new database is added you just need to set the DBConfiguration appropriately.

How to properly convert domain entities to DTOs while considering scalability & testability

I have read several articles and Stack Overflow posts about converting domain objects to DTOs and have tried them out in my code. When it comes to testing and scalability, I always face some issues. I know the following three possible solutions for converting domain objects to DTOs. Most of the time I am using Spring.
Solution 1: Private method in the service layer for converting
The first possible solution is to create a small "helper" method in the service layer code which converts the retrieved database object into my DTO object.
@Service
public class MyEntityService {

    private SomeDao someDao; // injected

    public SomeDto getEntityById(Long id) {
        SomeEntity dbResult = someDao.findById(id);
        SomeDto dtoResult = convert(dbResult);
        // ... more logic happens
        return dtoResult;
    }

    private SomeDto convert(SomeEntity entity) {
        // ... object creation, using getters/setters for the conversion
    }
}
Pros:
easy to implement
no additional class for the conversion needed -> the project doesn't blow up with classes
Cons:
problems when testing: new SomeEntity() is used in the private method, and if the object is deeply nested I have to provide an adequate result for my when(someDao.findById(id)).thenReturn(alsoDeeplyNestedObject) mock to avoid NullPointerExceptions when the conversion also dissolves the nested structure
Solution 2: Additional constructor in the DTO for converting domain entity to DTO
My second solution would be to add an additional constructor to my DTO that converts the entity passed to it.
public class SomeDto {

    // ... some attributes

    public SomeDto(SomeEntity entity) {
        this.attribute = entity.getAttribute();
        // ... conversion of nested objects, lists, and arrays
    }
}
Pros:
no additional class for converting needed
conversion hidden inside the DTO -> service code is smaller
Cons:
usage of new SomeDto() in the service code, and therefore I have to provide the correct nested object structure as a result of my someDao mocking.
Solution 3: Using Spring's Converter or any other externalized Bean for this converting
I recently saw that Spring offers a class for conversion purposes: Converter<S, T>. But this solution stands in for any externalized class that does the conversion. With this solution I inject the converter into my service code and call it when I want to convert the domain entity to my DTO.
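For instance, a sketch of such a converter using Spring's Converter interface (SomeEntity/SomeDto are the example types from above; the copied field is illustrative):
import org.springframework.core.convert.converter.Converter;
import org.springframework.stereotype.Component;

@Component
public class SomeEntityToDtoConverter implements Converter<SomeEntity, SomeDto> {

    @Override
    public SomeDto convert(SomeEntity entity) {
        SomeDto dto = new SomeDto();
        dto.setAttribute(entity.getAttribute()); // illustrative mapping
        return dto;
    }
}
The service receives this converter via injection and calls converter.convert(dbResult), which is trivial to mock in a test.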
Pros:
easy to test as I can mock the result during my test case
separation of tasks -> a dedicated class is doing the job
Cons:
doesn't "scale" that much as my domain model grows. With a lot of entities I have to create two converters for every new entity (-> converting DTO entitiy and entitiy to DTO)
Do you have more solutions for my problem and how do you handle it? Do you create a new Converter for every new domain object and can "live" with the amount of classes in the project?
Thanks in advance!
Solution 1: Private method in the service layer for converting
I guess Solution 1 will not work well, because your DTOs are domain-oriented and not service-oriented. Thus it is likely that they will be used in different services. So a mapping method does not belong to one service and therefore should not be implemented in one service. How would you re-use the mapping method in another service?
The first solution would work well if you used dedicated DTOs per service method. But more about this at the end.
Solution 2: Additional constructor in the DTO for converting domain entity to DTO
In general a good option, because you can see the DTO as an adapter to the entity. In other words: the DTO is another representation of an entity. Such designs often wrap the source object and provide methods that give you another view on the wrapped object.
But a DTO is a data transfer object, so it might be serialized sooner or later and sent over a network, e.g. using Spring's remoting capabilities. In this case the client that receives this DTO must deserialize it and thus needs the entity classes on its classpath, even if it only uses the DTO's interface.
Solution 3: Using Spring's Converter or any other externalized Bean for this converting
Solution 3 is the solution that I also would prefer. But I would create a Mapper<S,T> interface that is responsible for mapping from source to target and vice versa. E.g.
public interface Mapper<S, T> {
    T map(S source);
    S map(T target);
}
The implementation can be done using a mapping framework like modelmapper.
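A sketch of such a mapper backed by ModelMapper (SomeEntity/SomeDto are the example types from the question; ModelMapper's map(Object, Class) does the field matching):
import org.modelmapper.ModelMapper;

public class SomeEntityMapper implements Mapper<SomeEntity, SomeDto> {

    private final ModelMapper modelMapper = new ModelMapper();

    @Override
    public SomeDto map(SomeEntity source) {
        return modelMapper.map(source, SomeDto.class);
    }

    @Override
    public SomeEntity map(SomeDto target) {
        return modelMapper.map(target, SomeEntity.class);
    }
}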
You also said that a converter for each entity
doesn't "scale" that much as my domain model grows. With a lot of entities I have to create two converters for every new entity (-> converting DTO entitiy and entitiy to DTO)
I doupt that you only have to create 2 converter or one mapper for one DTO, because your DTO is domain-oriented.
As soon as you start to use it in another service, you will recognize that the other service usually should not, or cannot, return all the values that the first one does.
You will start to implement another mapper or converter for each additional service.
This answer would get too long if I started on the pros and cons of dedicated versus shared DTOs, so I can only ask you to read my blog post on the pros and cons of service layer designs.
EDIT
About the third solution: where do you prefer to put the call for the mapper?
In the layer above the use cases. DTOs are data transfer objects, because they pack data in data structures that are best for the transfer protocol. Thus I call that layer the transport layer.
This layer is responsible for mapping use case's request and result objects from and to the transport representation, e.g. json data structures.
EDIT
I see you're ok with passing an entity as a DTO constructor parameter. Would you also be ok with the opposite? I mean, passing a DTO as an Entity constructor parameter?
A good question. The opposite would not be ok for me, because I would then introduce a dependency in the entity to the transport layer. This would mean that a change in the transport layer can impact the entities and I don't want changes in more detailed layers to impact more abstract layers.
If you need to pass data from the transport layer to the entity layer you should apply the dependency inversion principle.
Introduce an interface that will return the data through a set of getters, let the DTO implement it, and use this interface in the entity's constructor. Keep in mind that this interface belongs to the entity's layer and thus should not have any dependencies on the transport layer.
                             interface
+-----+    implements   ||  +------------+   uses   +--------+
| DTO | ----------------||->| EntityData | <------- | Entity |
+-----+                 ||  +------------+          +--------+
     transport layer    ||          entity layer
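A minimal sketch of that inversion (the getter set is illustrative; the key point is that EntityData belongs to the entity's layer):
// Entity layer: the abstraction the entity depends on.
public interface EntityData {
    String getName();
}

// Entity layer: constructed from the abstraction, never from a DTO.
public class Entity {

    private final String name;

    public Entity(EntityData data) {
        this.name = data.getName();
    }
}

// Transport layer: the DTO implements the entity-layer interface.
public class SomeDto implements EntityData {

    private String name;

    @Override
    public String getName() {
        return name;
    }
}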
I like the third solution from the accepted answer.
Solution 3: Using Spring's Converter or any other externalized Bean for this converting
And I create DtoConverter in this way:
BaseEntity class marker:
public abstract class BaseEntity implements Serializable {
}
AbstractDto class marker:
public class AbstractDto {
}
GenericConverter interface:
public interface GenericConverter<D extends AbstractDto, E extends BaseEntity> {

    E createFrom(D dto);

    D createFrom(E entity);

    E updateEntity(E entity, D dto);

    default List<D> createFromEntities(final Collection<E> entities) {
        return entities.stream()
                .map(this::createFrom)
                .collect(Collectors.toList());
    }

    default List<E> createFromDtos(final Collection<D> dtos) {
        return dtos.stream()
                .map(this::createFrom)
                .collect(Collectors.toList());
    }
}
CommentConverter interface:
public interface CommentConverter extends GenericConverter<CommentDto, CommentEntity> {
}
CommentConverter implementation:
@Component
public class CommentConverterImpl implements CommentConverter {

    @Override
    public CommentEntity createFrom(CommentDto dto) {
        CommentEntity entity = new CommentEntity();
        updateEntity(entity, dto);
        return entity;
    }

    @Override
    public CommentDto createFrom(CommentEntity entity) {
        CommentDto dto = new CommentDto();
        if (entity != null) {
            dto.setAuthor(entity.getAuthor());
            dto.setCommentId(entity.getCommentId());
            dto.setCommentData(entity.getCommentData());
            dto.setCommentDate(entity.getCommentDate());
            dto.setNew(entity.getNew());
        }
        return dto;
    }

    @Override
    public CommentEntity updateEntity(CommentEntity entity, CommentDto dto) {
        if (entity != null && dto != null) {
            entity.setCommentData(dto.getCommentData());
            entity.setAuthor(dto.getAuthor());
        }
        return entity;
    }
}
I ended up NOT using some magical mapping library or external converter class, but just adding a small bean of my own which has convert methods from each entity to each DTO I need. The reason is that the mapping was:
either stupidly simple, where I would just copy some values from one field to another, perhaps with a small utility method,
or quite complex, and more complicated to express through the custom parameters of some generic mapping library than by just writing out the code. This is, for example, the case where the client can send JSON, but under the hood it is transformed into entities, and when the client retrieves the parent object of these entities again, it is converted back into JSON.
This means I can just call .map(converter::convert) on any collection of entities to get back a stream of my DTOs; a sketch follows.
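For illustration, such a bean might look like this (BookEntity/BookDto and the copied fields are made-up names):
@Component
public class EntityDtoConverter {

    public BookDto convert(BookEntity entity) {
        BookDto dto = new BookDto();
        dto.setId(entity.getId());
        dto.setTitle(entity.getTitle());
        return dto;
    }

    // ... one small convert method per entity/DTO pair lives here,
    // usable as entities.stream().map(converter::convert)
}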
Is it scalable to have it all in one class? Well the custom configuration for this mapping would have to be stored somewhere even if using a generic mapper. The code is generally extremely simple, except for a handful of cases, so I'm not too worried about this class exploding in complexity. I'm also not expecting to have dozens more entities, but if I did I might group these converters in a class per subdomain.
Adding a base class to my entities and DTOs so I can write a generic converter interface and implement it per class just isn't needed (yet?) for me either.
In my opinion the third solution is the best one. Yes, for each entity you'll have to create two new converter classes, but when it comes time for testing you won't have a lot of headaches. You should never choose the solution that makes you write less code at the beginning and much more later when testing and maintaining that code.
Another point: if you use the second approach and your entity has lazy dependencies, your DTO can't tell whether a dependency is loaded unless you inject an EntityManager into the DTO and use it to check, and I don't like that approach because a DTO shouldn't know anything about the EntityManager. As a solution I personally prefer converters, but at the same time I prefer to have multiple DTO classes for the same entity. For example, if I am 100% sure that the User entity will be loaded without its corresponding Company, then there should be a UserDto that doesn't have a CompanyDto field. At the same time, if I know that the User entity will be loaded with its correlated Company, then I use an aggregate, something like a UserCompanyDto class that contains UserDto and CompanyDto as fields; see the sketch below.
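A sketch of those two DTO shapes (class contents are illustrative):
// Used when the User is loaded without its Company.
public class UserDto {
    private Long id;
    private String name;
    // getters/setters ...
}

// Aggregate used when User and Company are loaded together.
public class UserCompanyDto {
    private UserDto user;
    private CompanyDto company;
    // getters/setters ...
}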
On my side I prefer option 3 with a third-party library such as ModelMapper or MapStruct. I also use it through an interface in a util package, because I don't want any external tool or library to interact directly with my code.
Definition:
public interface MapperWrapper {
    <T> T performMapping(Object source, Class<T> destination);
}

@Component
public class ModelMapperWrapper implements MapperWrapper {

    private final ModelMapper mapper;

    public ModelMapperWrapper() {
        this.mapper = new ModelMapper();
    }

    @Override
    public <T> T performMapping(Object source, Class<T> destination) {
        mapper.getConfiguration()
                .setMatchingStrategy(MatchingStrategies.STRICT);
        return mapper.map(source, destination);
    }
}
Then I can test it easily:
Testing:
@SpringJUnitWebConfig(TestApplicationConfig.class)
class ModelMapperWrapperTest implements WithAssertions {

    private final MapperWrapper mapperWrapper;

    @Autowired
    public ModelMapperWrapperTest(MapperWrapper mapperWrapper) {
        this.mapperWrapper = mapperWrapper;
    }

    @BeforeEach
    void setUp() {
    }

    @Test
    void givenModel_whenMapModelToDto_thenReturnsDto() {
        var model = new DummyModel();
        model.setId(1);
        model.setName("DUMMY_NAME");
        model.setAge(25);

        var modelDto = mapperWrapper.performMapping(model, DummyModelDto.class);

        assertAll(
                () -> assertThat(modelDto.getId()).isEqualTo(String.valueOf(model.getId())),
                () -> assertThat(modelDto.getName()).isEqualTo(model.getName()),
                () -> assertThat(modelDto.getAge()).isEqualTo(String.valueOf(model.getAge()))
        );
    }

    @Test
    void givenDto_whenMapDtoToModel_thenReturnsModel() {
        var modelDto = new DummyModelDto();
        modelDto.setId("1");
        modelDto.setName("DUMMY_NAME");
        modelDto.setAge("25");

        var model = mapperWrapper.performMapping(modelDto, DummyModel.class);

        assertAll(
                () -> assertThat(model.getId()).isEqualTo(Integer.valueOf(modelDto.getId())),
                () -> assertThat(model.getName()).isEqualTo(modelDto.getName()),
                () -> assertThat(model.getAge()).isEqualTo(Integer.valueOf(modelDto.getAge()))
        );
    }
}
After that, it would be very easy to switch to another mapper library. I should also have created an abstract factory, or used the strategy pattern.

How to select JDBC / JPA implementations at run time?

I'm writing an application meant to manage a database using both JDBC and JPA, for an exam. I would like the user to select the API to use once at the beginning, so that the whole application uses the selected API (whether JPA or JDBC).
For the moment I decided to use this approach:
I created an interface for each DAO class (e.g. interface UserDAO) with all needed method declarations.
I created two classes for each DAO distinguished by the API used (e.g UserDAOImplJDBC and UserDAOImplJPA). Both of them implement the interface (in our case, UserDAO).
I created a third class (e.g. UserDAOImpl) that extends the JDBC implementation class, and I use this class everywhere in my code. When I want to switch to JPA, I just have to change extends ***ImplDAOJDBC to extends ***ImplDAOJPA in all the DAO classes.
Now that I'm starting to have many DAO classes, it's becoming complicated to modify the code each time.
Is there a way to change all the extends declarations faster?
I was considering adding an option on the first screen (for example a radioGroup) to select JDBC or JPA, but I have no idea how to make it work without restructuring all the code. Any ideas?
Use a factory to get the appropriate DAO, every time you need one:
public class UserDaoFactory {
    public UserDao create() {
        if (SomeSharedSingleton.getInstance().getPersistenceOption() == JDBC) {
            return new UserDAOImplJDBC();
        } else {
            return new UserDAOImplJPA();
        }
    }
}
That's a classic OO pattern.
That said, I hope you realize that what you're doing there should really never be done in a real application:
there's no reason to do the exact same thing in two different ways
the persistence model of JPA and JDBC is extremely different: JPA entities are managed by the JPA engine, so every change to a JPA entity is transparently made persistent. That's not the case with JDBC, where the data you get from the database is detached. So the way to implement business logic is very different between JPA and JDBC: you typically never need to save any change explicitly when using JPA, as the sketch below illustrates.
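To make the difference concrete, a hedged sketch (entity, table, and column names are made up):
// JPA: entities are managed; changes are flushed automatically at commit.
void renameWithJpa(EntityManager em) {
    em.getTransaction().begin();
    User user = em.find(User.class, 1L);
    user.setName("Alice");              // no explicit save/update call needed
    em.getTransaction().commit();
}

// JDBC: data is detached; every change needs an explicit statement.
void renameWithJdbc(Connection con) throws SQLException {
    try (PreparedStatement ps =
             con.prepareStatement("UPDATE users SET name = ? WHERE id = ?")) {
        ps.setString(1, "Alice");
        ps.setLong(2, 1L);
        ps.executeUpdate();
    }
}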
You got 1 and 2 right, but 3 completely wrong.
Instead of having Impl extend one of the other implementations, choose which implementation to initialize using a utility method, for example. That's assuming you don't use a Dependency Injection framework such as Spring.
UserDAO dao = DBUtils.getUserDAO();

public class DBUtils {

    public static boolean shouldUseJdbc() {
        // Decide from some configuration what you should use
    }

    public static UserDAO getUserDAO() {
        if (shouldUseJdbc()) {
            return new UserDAOImplJDBC();
        } else {
            return new UserDAOImplJPA();
        }
    }
}
This is still just an example, as your DAOs don't need to be instantiated each time; they should actually be singletons, along the lines of the sketch below.
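For example, a minimal way to hold them as singletons (a plain static holder; names follow the example above, with shouldUseJdbc() as defined there):
public class DBUtils {

    // Created once, on class initialization, and reused afterwards.
    private static final UserDAO USER_DAO =
            shouldUseJdbc() ? new UserDAOImplJDBC() : new UserDAOImplJPA();

    public static UserDAO getUserDAO() {
        return USER_DAO;
    }
}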

How to make sure that class sent as parameter has a certain annotation

I created a DAL a few weeks ago which connects to a Mongo database.
When I want to query the database with a certain class, I need to know the collection it belongs to.
I thought about creating an annotation to put above each class, containing the name of the related collection; when I need to query the database, I'll get the annotation value by reflection.
My question is: how can I declare that the class that is sent to me has the annotation?
Pretty much like:
public List<T> query(Class<T extends Interface>)
only:
public List<T> query(Class<T has Annotation>)
Thanks.
You should use either an interface or an enumeration to do this; it is much simpler and more explicit.
But if you want to experiment, that's fine. The following should work:
public <T> List<T> query(Class<T> klass) {
    for (Annotation a : klass.getAnnotations()) {
        // Iterate and do stuff
    }
    // do other stuff
}
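For the collection-name use case from the question, a minimal end-to-end sketch might look like this (the annotation name @MongoCollection and the fail-fast check are my own; Java's type system cannot express "has this annotation" as a bound, so it has to be verified at runtime):
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.List;

@Retention(RetentionPolicy.RUNTIME)   // must be RUNTIME to be visible via reflection
@Target(ElementType.TYPE)
@interface MongoCollection {
    String value();                   // name of the related collection
}

@MongoCollection("users")
class User {
}

class Dal {
    public <T> List<T> query(Class<T> klass) {
        MongoCollection annotation = klass.getAnnotation(MongoCollection.class);
        if (annotation == null) {
            throw new IllegalArgumentException(
                    klass.getName() + " is not annotated with @MongoCollection");
        }
        String collectionName = annotation.value();
        // ... query that collection and map the results to T
        return List.of();
    }
}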

Patterns: Populate instance from Parameters and export it to XML

I'm building a simple RESTful service, and to achieve that I need two things:
Get an instance of my resource (i.e. Book) from the request parameters, so I can have that instance persisted
Build an XML document from that instance to send the representation to the clients
Right now, I'm doing both things in my POJO class:
public class Book implements Serializable {

    private Long id;

    public Book(Form form) {
        // Initializing attributes
        id = Long.parseLong(form.getFirstValue(Book.CODE_ELEMENT));
    }

    public Element toXml(Document document) {
        // Getting an XML representation of the Book
        Element bookElement = document.createElement(BOOK_ELEMENT);
        // ... populate child elements
        return bookElement;
    }
}
I've remembered an OO principle that says behavior should live where the data is, but now my POJO depends on the Request and XML APIs, and that doesn't feel right (also, the class has persistence annotations).
Is there any standard approach/pattern to solve that issue?
EDIT:
The libraries I'm using are Restlets and Objectify.
I agree with you when you say that behavior should be where the data is. But at the same time, as you say, I just don't feel comfortable polluting a POJO interface with specific methods used for serialization (which can grow considerably depending on the formats you want to support: JSON, XML, etc.).
1) Build an XML document from that instance to send the representation to the clients
In order to decouple the object from serialization logic, I would adopt the Strategy Pattern:
interface BookSerializerStrategy {
    String serialize(Book book);
}

public class XmlBookSerializerStrategy implements BookSerializerStrategy {
    public String serialize(Book book) {
        // Do something to serialize your book.
    }
}

public class JsonBookSerializerStrategy implements BookSerializerStrategy {
    public String serialize(Book book) {
        // Do something to serialize your book.
    }
}
Your POJO would become:
public class Book implements Serializable {

    private Long id;
    private BookSerializerStrategy serializer;

    public String serialize() {
        return serializer.serialize(this);
    }

    public void setSerializer(BookSerializerStrategy serializer) {
        this.serializer = serializer;
    }
}
Using this approach you will be able to isolate the serialization logic in just one place and won't pollute your POJO with it. Additionally, by returning a String you won't need to couple your POJO to the Document and Element classes. For example:
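A usage sketch:
Book book = new Book(form);
book.setSerializer(new XmlBookSerializerStrategy());
String xml = book.serialize();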
2) Get an instance of my resource (i.e Book) from request parameters, so I can get that instance to be persisted
To find a pattern to handle the deserialization is more complex in my opinion. I really don't see a better way than to create a Factory with static methods in order to remove this logic from your POJO.
Another approach to both of your questions would be something like JAXB uses: two different objects, an Unmarshaller in charge of deserialization and a Marshaller for serialization. Since Java 1.6, JAXB ships with the JDK by default.
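A quick sketch of that JAXB flavor (it assumes Book is annotated with @XmlRootElement; the helper class is my own):
import javax.xml.bind.JAXBContext;
import javax.xml.bind.JAXBException;
import javax.xml.bind.Marshaller;
import javax.xml.bind.Unmarshaller;
import java.io.StringReader;
import java.io.StringWriter;

public class BookXml {

    // Serialization: Book -> XML
    public static String toXml(Book book) throws JAXBException {
        Marshaller marshaller = JAXBContext.newInstance(Book.class).createMarshaller();
        StringWriter writer = new StringWriter();
        marshaller.marshal(book, writer);
        return writer.toString();
    }

    // Deserialization: XML -> Book
    public static Book fromXml(String xml) throws JAXBException {
        Unmarshaller unmarshaller = JAXBContext.newInstance(Book.class).createUnmarshaller();
        return (Book) unmarshaller.unmarshal(new StringReader(xml));
    }
}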
Finally, those are just suggestions. I've become really interested in your question actually and curious about other possible solutions.
Are you using Spring or any other framework in your project? If you used Spring, it would take care of serialization for you, as well as assigning request params to method params (parsing as needed).
