Context:
I have a REST service, let's say CustomerService, which for now has one method getCustomer(id, country). The requirement is that, depending upon the country, I have to perform different business logic, like accessing a different database or applying some custom rules, and then report that I received such a request.
First, to support different implementations depending upon country, I used the Factory pattern as shown below:
Common interface for all country-based implementations
public interface CustomerServiceHandler{
Customer getCustomer(String id, String country);
}
Then the factory:
public class CustomerServiceHandlerFactory{
public CustomerServiceHandler getHandler(String country){...};
}
Implementation Detail using Facade
Note: this facade is called from the REST class, i.e. CustomerService
public class CustomerServiceFacade{
public Customer getCustomer(String id, String country){
//use factory to get handler and then handler.getCustomer
Customer customer = factory.getHandler(country).getCustomer(id,country);
//report the request
reportingService.report("fetch-customer", ....);
return customer;
}
}
Going by the SRP (Single Responsibility Principle), this facade is not achieving a single objective. It is fetching the customer as well as reporting that such a request was received. So I thought of the Decorator pattern, as follows.
Implementation using Decorator Pattern:
//this is called from the REST layer
public class ReportingCustomerHandler implements CustomerServiceHandler{
//this delegate is basically the default implementation and has factory too
private CustomerServiceHandler delegate;
private ReportingService reporting;
public Customer getCustomer(String id, String country){
Customer customer = delegate.getCustomer(id, country);
reporting.report(....);
return customer;
}
}
//this is called from ReportingCustomerHandler
public class DefaultCustomerServiceHandler implements CustomerServiceHandler{
private CustomerServiceHandlerFactory factory;
public Customer getCustomer(String id, String country){
//get factory object else use itself, even default implementation is provided by factory
CustomerServiceHandler handler = factory.getHandler(country);
return handler.getCustomer(id,country);
}
}
Note: In the second approach I am reusing the interface CustomerServiceHandler (shown in the factory code) for the Reporting and Default implementations as well.
So what is the correct way, or what is the alternative to this if something more suitable exists?
Second part of the question
What if I have to maintain two different interfaces, i.e. one CustomerServiceHandler for the different country implementations and one to serve the REST layer? Then what can the design or alternative be? In this case I think a facade would fit.
So what is the correct way, or what is the alternative to this?
You have a solid design here and great use of the Factory pattern. What I offer are suggestions on this good work, but I think there are many ways to enhance what you have.
I can see where the CustomerServiceFacade method getCustomer is breaking the SRP. It combines retrieving the Customer with the reporting aspect. I agree that it would be cleaner to move reporting out of that method.
Then your object would look like this:
public class CustomerServiceFacade{
public Customer getCustomer(String id, String country){
return factory.getHandler(country).getCustomer(id,country);
}
}
So where do we put the reporting?
You could move/manage the reporting through a separate interface. This would allow flexibility in implementing different reporting approaches and make testing easier (i.e. mock the reporting piece).
public interface ReportService {
void report(Customer c, String id, String country);
}
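For example, a unit test could swap in a trivial implementation of this interface; the class below is only an illustration, not part of the original design:
// Hypothetical no-op implementation, useful when testing the facade
// without exercising any real reporting infrastructure.
public class NoOpReportService implements ReportService {
    @Override
    public void report(Customer c, String id, String country) {
        // intentionally empty: the test only verifies customer retrieval
    }
}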
How does the REST layer access reporting?
Option 1: REST accesses various Customer functions through multiple objects
The implementation of ReportService can be injected into the REST Controller with the CustomerServiceFacade.
Not sure what framework you are using for REST, but here is what that might look like:
@GET
@Path("/customer/{Id}/{country}")
public Response getCustomer(@PathParam("Id") String id, @PathParam("country") String country){
Response r = null;
// injected implementation of CustomerServiceFacade
Customer c = customerServiceFacade.getCustomer(id, country);
if (c != null){
// injected implementation of ReportService
reportService.report(c, id, country);
r = Response.ok(c).build(); // build the success response from the customer
}
else {
// handle errors ...
}
return r;
}
Option 2: REST accesses various Customer functions through one Facade/Service
You could allow your service facade layer to provide a simplified interface to a larger set of objects that provide capabilities. This could be done by having multiple customer-servicing methods, so that the REST layer can access various capabilities through one object while each method still adheres more closely to the SRP.
Here we inject CustomerServiceFacade into the REST Controller and it calls the two methods 1) to get the customer and 2) to handle reporting. The facade uses the implementation of the ReportService interface from above.
public class CustomerServiceFacade{
public Customer getCustomer(String id, String country){
// call the correct CustomerServiceHandler (based on country)
return factory.getHandler(country).getCustomer(id,country);
}
public void report(Customer c, String id, String country){
// call the reporting service
reportService.report(c, id, country);
}
}
I think this is a reasonable use of the Facade pattern while still having SRP within the actual methods.
If the reporting implementation differs by country in the same way that the Customer does you could use another factory.
public void report(Customer c, String id, String country){
// call the correct reporting service (based on country)
rptFactory.getInstance(country).report(c,id,country);
}
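The reporting factory itself isn't shown above, so here is a minimal sketch of what rptFactory could look like; the class name, the register method and the DefaultReportService fallback are all assumptions:
import java.util.HashMap;
import java.util.Map;

// Assumed factory behind the rptFactory call above. DefaultReportService is a
// hypothetical fallback used when no country-specific implementation is registered.
public class ReportServiceFactory {
    private final Map<String, ReportService> byCountry = new HashMap<>();
    private final ReportService defaultService = new DefaultReportService();

    public void register(String country, ReportService service) {
        byCountry.put(country, service);
    }

    public ReportService getInstance(String country) {
        return byCountry.getOrDefault(country, defaultService);
    }
}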
Related
I am building a Spring Boot project for work.
In this project I have a service which is tasked with getting certain Documents from another backend. There are quite a lot of different scenarios where the documents have to meet certain criteria, e.g. be from a certain date, which can be combined freely. Currently this is accomplished with normal methods like so:
@Service
public class DocumentService {
private OtherService otherService;
@Autowired
public DocumentService(OtherService otherService){
this.otherService = otherService;
}
public List<Document> getDocuments() {
...
}
public List<Document> getDocuments(LocalDate date) {
...
}
public List<Document> getDocuments(String name){
...
}
public List<Document> getDocuments(String name, LocalDate date){
...
}
}
I find this to be a rather bad solution because for every new combination there would need to be a new method.
For that reason I'd like to use a fluent style interface for that, something like this:
//Some other class that uses DocumentService
documentService.getDocuments().withDate(date).withName(name).get();
I'm familiar with the Builder pattern and method chaining, but I don't see how I can adapt either of those, seeing as, per my understanding, @Service classes are singletons in Spring Boot.
Is this at all possible with Spring Boot?
This doesn't have to be a Spring Boot solution; why not just introduce a POJO builder-like inner class:
@Service
public class DocumentService {
private OtherService otherService; // autowired as in the question; the Builder below uses it
public Builder documents() {
return new Builder();
}
public class Builder {
private LocalDate date;
private String name;
public Builder withDate(LocalDate date) {
this.date = date;
return this;
}
// etc
public List<String> get() {
final List<SomeDTO> results = otherService.doQuery(name, date, ...);
// TODO - transform the DTOs in results into the List<String> below
final List<String> list = new ArrayList<>();
return list;
}
}
}
Obviously make it static if it doesn't need access to the parent component.
You could make the Spring component and the builder be the same object, but that does feel contrived; also, I expect you would like to be able to support multiple builders.
Also, I'm assuming the parent component is genuinely a service, i.e. it doesn't contain any state or mutators; otherwise you are introducing potential synchronization problems.
EDIT: Just for illustration the builder maintains the arguments to be passed to the otherService and performs any service-like transforms.
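Usage from another component would then read fluently, something like this (withName is assumed to be one of the "// etc" methods alongside withDate):
// Hypothetical caller of the builder exposed by DocumentService.
List<String> invoices = documentService.documents()
        .withDate(LocalDate.of(2020, 1, 1))
        .withName("invoice")
        .get();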
If you want to use a fluent interface here, the object returned by your getDocuments() method would have to be the starting point for the method chain. Perhaps create something like a DocumentFilter class that you can return from there, then you'll end up with something like this:
documentService.getDocuments().withDate(date).withName(name).getDocuments()
In this example, your DocumentFilter will have withDate(...) and withName(...) methods, and each subsequent call includes all of the criteria from the preceding DocumentFilter.
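A rough sketch of such a DocumentFilter, assuming documentService.getDocuments() returns new DocumentFilter(this, null, null) and that DocumentService exposes a findByCriteria method for the collected criteria (both are assumptions for illustration):
import java.time.LocalDate;
import java.util.List;

// Illustrative immutable filter: each with* call returns a new filter carrying
// the accumulated criteria, and the final getDocuments() runs the query.
public class DocumentFilter {
    private final DocumentService service;
    private final LocalDate date;
    private final String name;

    DocumentFilter(DocumentService service, LocalDate date, String name) {
        this.service = service;
        this.date = date;
        this.name = name;
    }

    public DocumentFilter withDate(LocalDate date) {
        return new DocumentFilter(service, date, this.name);
    }

    public DocumentFilter withName(String name) {
        return new DocumentFilter(service, this.date, name);
    }

    public List<Document> getDocuments() {
        // assumed method on DocumentService; null criteria mean "not filtered"
        return service.findByCriteria(name, date);
    }
}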
I am working with Spring Boot on a project. JdbcNamedTemplates are used in my DAOs to access data. I write queries in my DAOs and then map some parameters at runtime to get the correct data.
Now I have to handle retrieving data from multiple identical tables, depending on the request. The retrieval logic is the same except that I need to use different table names. JdbcTemplate does not allow using table names as parameters. I do not want to use string formatting, as I wish my queries to be final.
I could create an abstract class with most of the functionality and concrete classes that extend it to handle the differences in table names (basically, have a method getTableName()). This works. However, it seems like I am creating a lot of classes and I would like to create fewer of them.
Are there better ways to do it?
I was thinking that using interfaces for specific table names would be nice, but I can't wrap my head around how that could work with Spring and autowiring.
UPDATE:
Just giving a sample of what I would like to improve.
So far I have couple of abstract DAOs like this. They do the database talk.
public abstract class Dao1 {
private static final String PARAM = "p";
private final String QUERY1 = " SELECT * FROM " + getTableName() + " WHERE something";
//here we would also have autowired jdbcNamedTemplate and maybe some other stuff.
public List<Map<String, Object>> getAll() {
//map parameters, do query return results
}
protected abstract String getTableName();
}
Next, I have a couple of data access objects that implement the abstract method getTableName(). So if the table was "Autumn", I would have:
@Component
public class AutumnDao1 extends Dao1 {
@Override
protected String getTableName() {
return "AUTUMN";
}
}
So from the example above you can see that for each abstract DAO I would have to make a couple of concrete DAOs (Autumn, Winter, Spring, Summer). This is acceptable for now, but at some point this might grow into quite a sizeable collection of DAOs.
I would like to know if there is a way to avoid that, for instance by creating just one class / interface for each "season" / name and somehow attaching it to Dao1, Dao2, etc., as needed. I only know which name is relevant when the user request arrives.
With @Qualifier("nameOfBean") you can inject the instance you are looking for.
If you have, for instance:
@Component
public class AutumnDao1 extends Dao1 {
@Override
protected String getTableName() {
return "AUTUMN";
}
}
@Component
public class SummerDao1 extends Dao1 {
@Override
protected String getTableName() {
return "SUMMER";
}
}
In this case you are creating two beans that can be injected as the parent type Dao1. To inject the right one you should do:
@Autowired
@Qualifier("autumnDao1")
private Dao1 autumnDao;
@Autowired
@Qualifier("summerDao1")
private Dao1 summerDao;
Try this!
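Since the right DAO is only known when the request arrives, one further option is to let Spring inject every Dao1 bean into a Map keyed by bean name and look the DAO up at runtime; the resolver class and the naming convention below are only a sketch:
import java.util.Map;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

// Spring injects all Dao1 beans into the map, keyed by bean name
// ("autumnDao1", "summerDao1", ...), so the DAO can be chosen per request.
@Service
public class SeasonDaoResolver {
    private final Map<String, Dao1> daosByName;

    @Autowired
    public SeasonDaoResolver(Map<String, Dao1> daosByName) {
        this.daosByName = daosByName;
    }

    public Dao1 resolve(String season) {
        // e.g. "autumn" -> "autumnDao1" (assumed naming convention)
        return daosByName.get(season.toLowerCase() + "Dao1");
    }
}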
I am going to implement the Data Mapper pattern to store data in different storages/databases.
What is the best OOP pattern to implement this concept?
For example, I have a User model class
public class User
{
private int id;
private String name;
private String surname;
/* getters, setters and model-level business logic */
}
and the appropriate Data Mapper class
public class UserMapper
{
public User findById(int id)
{
// perform query to MySQL, Redis or another DB
}
/* other methods */
}
Is it a good idea to use the Strategy pattern by creating multiple storage strategy classes and then injecting them into the DataMapper class?
public class UserMySQLStorageStrategy extends UserStorageStrategy
{
public User findById(int id)
{
// perform query to MySQL
}
}
public class UserRedisStorageStrategy extends UserStorageStrategy
{
public User findById(int id)
{
// perform query to Redis
}
}
public class UserMapper
{
protected UserStorageStrategy _storageStrategy;
public UserMapper(UserStorageStrategy storageStrategy)
{
this._storageStrategy = storageStrategy;
}
public User findById(int id)
{
return this._storageStrategy.findById(id);
}
/* other methods */
}
Your strategy looks awfully like the mapper class itself. It may make sense to make your mapper and user objects into interfaces instead, and then let your specific implementations choose how/where to store them. The strategy approach makes sense if your UserMapper class does many operations that are unrelated to storage and that do not need to change despite the difference in storage; but if all your UserMapper class does is the storage, then an interface and multiple implementations would be simpler.
You do not need any particular OOP design pattern. What you require is an interface that provides the functionality.
Then your different data storages should implement it. And then you just need a strategy that provides the expected instance for the workflow of your program.
I'd make UserMapper an interface with multiple concrete implementation classes first, and call the interface UserDao.
I'd call the implementation classes User{Mysql|Redis|etc}DAO. If you find any common piece of code between them, it could be extracted into a common abstract base class.
At that point the logic of the UserMapper class could be moved into a UserDaoResolver, which chooses and returns the concrete implementation based on some input; or, if you use a dependency injection framework (like Spring), you can delegate that function to it.
The current UserMapper caller would use the DAO implementation through its interface and obtain it by one of the methods mentioned above.
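Put together, a sketch of that arrangement could look like this; the class names and the storage-key lookup are only illustrative:
import java.util.Map;

// The interface the rest of the code depends on.
public interface UserDao {
    User findById(int id);
}

// One implementation per storage; a Redis variant would look the same.
class UserMysqlDao implements UserDao {
    @Override
    public User findById(int id) {
        // perform the MySQL query here
        return null; // placeholder
    }
}

// Chooses the concrete implementation, e.g. based on configuration or input.
class UserDaoResolver {
    private final Map<String, UserDao> daosByStorage;

    UserDaoResolver(Map<String, UserDao> daosByStorage) {
        this.daosByStorage = daosByStorage;
    }

    UserDao resolve(String storage) {
        return daosByStorage.get(storage); // e.g. "mysql" or "redis"
    }
}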
I have DTOs (Data Transfer Objects) sent to the DAO (Data Access Object).
The DTO has an identifier string.
Based on this string (or rather the DTO), I want to invoke specific methods in the DAO.
These methods make database calls.
I have found two options to do this:
1. Constant specific method implementation using Enum
2. Invoke the method based on reflection (in which case the DTO will carry the name of the method that needs to be invoked).
I want to know which is the better option. Are there any other alternatives? Is it okay to have database calls within the Enum?
The programming language used is Java.
I would not put database calls within your Enum. Instead, provide a method on your DAO that accepts the DTO, and then let that method call other methods within the DAO based on the string on the DTO. You could use a switch statement on the Enum, and make this very efficient. (Alternatively, put this implementation in a separate "adapter" class, since it could be argued that this code doesn't strictly belong in the DAO, either.)
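As a sketch of that suggestion, where ActionType, UserDto and the two DAO methods are made-up names for illustration:
// Assumed identifier enum carried by the DTO.
enum ActionType { FETCH_USER, UPDATE_USER }

public class UserDao {
    // Single entry point: the DTO's identifier decides which DAO method runs.
    public void handle(UserDto dto) {
        switch (dto.getActionType()) {
            case FETCH_USER:
                fetchUser(dto);
                break;
            case UPDATE_USER:
                updateUser(dto);
                break;
            default:
                throw new IllegalArgumentException("Unknown action: " + dto.getActionType());
        }
    }

    private void fetchUser(UserDto dto) { /* database call */ }

    private void updateUser(UserDto dto) { /* database call */ }
}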
I would also avoid reflection, mainly due to additional complexities - including in debugging and troubleshooting, as well as potential security concerns. (What if the String contained a method name that you didn't want called?)
You can create a map that maps the strings to method calls:
import java.util.HashMap;
import java.util.Map;

class YourDAO {
private interface Action {
public void perform();
}
private final Map<String, Action> actions = new HashMap<>();
public YourDAO() {
actions.put("String1", new Action() {
public void perform() {
daoMethod1();
}
});
actions.put("String2", new Action() {
public void perform() {
daoMethod2();
}
});
}
public void daoMethod1() {
...
}
public void daoMethod2() {
...
}
public void doSomethingWithDTO(YourDTO dto) {
actions.get(dto.getIdentifier()).perform();
}
}
You can even adapt this idea to perform specific actions on different DTO types if you
change the key type of the map to Class<?> and instead of dto.getIdentifier() use dto.getClass().
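For example, the relevant pieces of YourDAO above would change roughly like this; CreateUserDto and DeleteUserDto are hypothetical DTO types standing in for your real ones:
// Dispatch on the concrete DTO class instead of a string identifier.
private final Map<Class<?>, Action> actionsByType = new HashMap<>();

public YourDAO() {
    actionsByType.put(CreateUserDto.class, new Action() {
        public void perform() {
            daoMethod1();
        }
    });
    actionsByType.put(DeleteUserDto.class, new Action() {
        public void perform() {
            daoMethod2();
        }
    });
}

public void doSomethingWithDTO(Object dto) {
    actionsByType.get(dto.getClass()).perform();
}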
I've noticed BlazeDS has certain things it does not support, and it is often difficult to find this out. Ex: polymorphism is not supported; one must create methods with different names, as methods with the same name but different parameters create a conflict.
I'm trying to find out if BlazeDS does not support Java static and non-static inner classes.
Details of an example pointing out the issue:
public class UserDTO {
private String name;
private AddressDTO address;
private PhoneDTO phone;
....
public static class PhoneDTO {
private String phoneNumber;
.....
}
public class AddressDTO {
private String address;
.....
}
}
This code appears to work fine for passing data to Flex via BlazeDS but results in errors when passing the data from Flex via BlazeDS back to Java.
@Service
@RemotingDestination(channels = { "my-amf" }, value = "UserService")
public class UserService {
....
public UserDTO getUser(Long userID) {
.....
return userDTO;
}
public void updateUser(UserDTO userDTO) {
....
}
public void updatePhone(PhoneDTO phoneDTO) {
.....
}
The example code above will compile and the getUser method will work. A call to the updateUser or updatePhone methods on the other hand results in a BlazeDS error. Is there a special way to use inner classes in Flex or are inner classes not supported?
Here is an example of the error messages produced:
[BlazeDS]Cannot create class of type 'com.test.dto.UserDTO.PhoneDTO'.
flex.messaging.MessageException: Cannot create class of type 'com.test.dto.UserDTO.PhoneDTO'. Type 'com.test.dto.UserDTO.PhoneDTO' not found.
Example Flex code:
var thisPhone:PhoneDTO = new PhoneDTO();
thisPhone.phoneNumber = "8885551212";
updateTagsResult.token = userService.updatePhone(thisPhone);
As for static classes, I'm also very skeptical that they can be used. Static classes are possible in Actionscript, but only in the same file (private static), and I don't believe AMF3 supports them.
The purpose of AMF3 is just to have simple property-to-property serialization between classes. Anything more complex than that is hard to transfer over and, frankly, shouldn't be done in the first place, because the complexity will in all probability affect your development. This is why Java has DTOs: abstract data objects that can be transferred to any language using your choice of data protocol.
Inner Classes
No, sending an Actionscript object aliased to a Java inner class (static or otherwise) is not supported out-of-the-box.
As you've seen, when the AMF packet is deserialized, the class name is interpreted as an outer class, rather than as an inner class.
However, you could implement this yourself by having your classes implement IExternalizable. (See here for further information)
An alternative to IExternalizable is to use an approach similar to this one, which provides support for Java Enums to be sent across to Flex. They use a custom deserializer endpoint.
In the interests of completeness, I should point out that serializing Actionscript inner classes is supported; however, the [RemoteClass] metatag is not. Instead, inner classes must be explicitly registered using registerClassAlias, normally within a static method of the outer class.
Polymorphism
To correct a point in the original post:
.... Ex: polymorphism is not. One must create methods with different names as methods with the same name with different parameters create a conflict.
Given that BlazeDS is a server-side product, I'm assuming that you're referring to the way BlazeDS handles Polymorphism & overloading in Java. In which case, your statement is incorrect.
For example, the following code is valid:
@RemotingDestination
public class EchoService {
public String echo(String source)
{
return "Received String";
}
public Object echo(Object source)
{
return "Recieved object of type " + source.getClass().getName();
}
}
Executed as follows:
remoteObject.echo("Hello") // result from service is "Received String"
remoteObject.echo(new Date()) // result from service is "Received object of type java.util.Date"
However, this is not an example of polymorphism, as your question states. This is method overloading, which is different.
Polymorphism is supported, as shown here:
// Java
// This method on EchoService
public String echo(Employee employee)
{
return employee.sayHello();
}
public class Employee {
public String sayHello() {
return "Hello, I'm an employee";
}
}
public class Manager extends Employee {
@Override
public String sayHello() {
return "Hello, I'm a Manager";
}
}
Executed as follows:
// In flex...
remoteObject.echo(new Employee()) // Receives "Hello, I'm an employee"
remoteObject.echo(new Manager()) // Receives "Hello, I'm a Manager"
If we remove the echo(Employee employee) method, then the result is:
// In flex...
remoteObject.echo(new Employee()) // Receives "Received object of type Employee"
remoteObject.echo(new Manager()) // Receives "Received object of type Manager"