Which design pattern to apply - java

Let's say we have to retrieve data of class Trade. This Trade class has many parameters like A, B, C...
class A { retrieveTradeDataWithA(); /* ...and many more methods which do something */ }
class B { retrieveTradeDataWithB(); /* ...and many more methods which do something */ }
class LetsSaySomeResource {
@Inject
private A classAInstance;
@Inject
private B classBInstance;
public void getTradeDataBasedOnA(){
classAInstance.retrieveTradeDataWithA();
}
public void getTradeDataBasedOnB(){
classBInstance.retrieveTradeDataWithB();
}
}
Now the requirement is that we want to fetch some trade data based on both A and B, and later on more classes like A and B may be added to fetch data with. How shall I make the design more flexible?
Like,
public void getDataBasedOnAandB(){
}
Or, later on a class C can come along, so I don't want to keep injecting filters like A, B, ...
Can someone help with this?

First, create an interface that will define the contract for doing something:
public interface IData {
void doSomething();
}
Then create the concrete implementations to do something:
public class DataA implements IData {
@Override
public void doSomething() {
// TODO Do something for A
}
}
public class DataB implements IData {
@Override
public void doSomething() {
// TODO Do something for B
}
}
And finally, a class that will actually do something:
public class DataDAO {
private List<IData> dataList;
public DataDAO(List<IData> dataList) {
this.dataList = dataList;
}
public void doSomething() {
for(IData data : dataList) {
data.doSomething();
}
}
}
Now let's take your use cases:
Do something for A:
List<IData> dataAList = new ArrayList<IData>();
dataAList.add(new DataA());
DataDAO dataADAO = new DataDAO(dataAList);
dataADAO.doSomething();
Do something for A and B:
List<IData> dataABList = new ArrayList<IData>();
dataABList.add(new DataA());
dataABList.add(new DataB());
DataDAO dataABDAO = new DataDAO(dataABList);
dataABDAO.doSomething();
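The point of this structure is that adding a new source later does not require touching DataDAO at all. A minimal sketch, assuming a hypothetical DataC comes along later:
public class DataC implements IData {
    @Override
    public void doSomething() {
        // TODO Do something for C
    }
}
// do something for A, B and C - DataDAO itself is unchanged:
List<IData> dataList = new ArrayList<IData>();
dataList.add(new DataA());
dataList.add(new DataB());
dataList.add(new DataC());
new DataDAO(dataList).doSomething();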

It may look something like this:
class LetsSaySomeResource {
@Resource
private Map<String, DataAccessInterface> instanceToDataAccessMapping;
public DataAggregationResult getDataFor(String... instanceNames) {
DataAggregationResult result = new DataAggregationResult(); // some list or whatever
for (String instanceName : instanceNames) {
Data data = instanceToDataAccessMapping.get(instanceName).getData();
/**
* Add this data to aggregation result here
*/
}
return result;
}
}
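For this sketch to work, a common data-access contract is assumed behind the map. Something like the following, where DataAccessInterface, Data and the concrete implementations are placeholder names; depending on the DI container such a map can sometimes be injected directly, otherwise it can be wired by hand:
// placeholder contract assumed by the resource above
public interface DataAccessInterface {
    Data getData();
}
// manual wiring, in case the container does not populate the map for you
Map<String, DataAccessInterface> instanceToDataAccessMapping = new HashMap<>();
instanceToDataAccessMapping.put("A", new ADataAccess()); // hypothetical implementation backed by class A
instanceToDataAccessMapping.put("B", new BDataAccess()); // hypothetical implementation backed by class B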

Try something like the decorator pattern. If your classes will grow over time, always building on previous requirements, you could add the additional computations by decorating them.
Your example is still very abstract, so it is hard to tell whether your additional classes are interdependent or simply "extended" (in which case simple inheritance would do).
Or they may be completely decoupled, so something like what SimY4 suggested could help (e.g. a variation of the visitor pattern).
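A minimal decorator sketch along those lines, assuming a hypothetical TradeDataSource contract (all names below are made up for illustration):
import java.util.List;

// hypothetical contract for anything that can fetch trade data
interface TradeDataSource {
    List<Trade> fetch();
}

// decorator: wraps another source and adds a computation on top of its result
class EnrichingTradeDataSource implements TradeDataSource {
    private final TradeDataSource delegate;

    EnrichingTradeDataSource(TradeDataSource delegate) {
        this.delegate = delegate;
    }

    @Override
    public List<Trade> fetch() {
        List<Trade> trades = delegate.fetch();
        // additional computation on the wrapped result goes here
        return trades;
    }
}
New behaviour is then added by wrapping an existing source, e.g. new EnrichingTradeDataSource(existingSource), without changing the wrapped class.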

I think the template method design pattern would be useful here: http://en.wikipedia.org/wiki/Template_method_pattern
You can define your template method in the base class and let the subclasses that extend it define their own algorithm for the varying steps. Do read the wiki link and you will find your way out.
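A small sketch of how that could look for the trade example, assuming a hypothetical base class (class and method names are illustrative only):
import java.util.ArrayList;
import java.util.List;

// the base class fixes the overall algorithm; subclasses fill in the varying step
abstract class TradeDataRetriever {

    // the template method: the same skeleton for every data source
    public final List<Trade> retrieve() {
        List<Trade> data = fetchRaw(); // step supplied by the subclass
        // common post-processing, validation, logging, ...
        return data;
    }

    protected abstract List<Trade> fetchRaw();
}

class TradeDataRetrieverForA extends TradeDataRetriever {
    @Override
    protected List<Trade> fetchRaw() {
        // fetch trade data based on A
        return new ArrayList<>();
    }
}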

Related

How to "map" two different object inheritance trees without instanceof

I have a design question I can't get a good solution for. This is my problem:
There are two different object "trees" which need to be processed together:
Object tree one: AbstractObjectTreeOne with Sub1ObjectTreeOne and Sub2ObjectTreeOne
Object tree two: AbstractObjectTreeTwo with Sub1ObjectTreeTwo and Sub2ObjectTreeTwo
I now have a method where I get a list of AbstractObjectTreeOne and a list of AbstractObjectTreeTwo. They are exactly the same size and "match" each other by name, so I can loop through the objects in the list of AbstractObjectTreeOne and get the corresponding AbstractObjectTreeTwo by name.
Now it should be validated whether the "matching" objects (by name) really match each other, so the current code contains a lot of instanceof stuff. Example:
if (!(objectOfAbstractObjectTreeOne instanceof Sub1ObjectTreeOne)) {
throw exception;
}
and then also in the same method
if (!(objectOfAbstractObjectTreeTwo instanceof Sub1ObjectTreeTwo)) {
throw exception;
}
After that, both parameters are cast to their "real" subtype to be further processed. This also does not feel very good.
None of this feels very object-oriented, but I currently do not have a good idea of how to solve it. I tried the visitor pattern, but it only solves the instanceof issue in either AbstractObjectTreeOne or AbstractObjectTreeTwo and still contains a lot of instanceof.
Maybe some of you have a good idea about this kind of problem. Maybe it's easy to solve, but I do not have the right idea yet.
This is the OO principle of polymorphism.
There is no need to use instanceof. Create an interface and use it in the declaration of the tree. All subtypes should implement this interface, and then you can call the required methods without typecasting.
This is an example.
public interface ObjectTreeOne { void payloadOne(); }
public class Sub1ObjectTreeOne implements ObjectTreeOne { public void payloadOne() {} }
public class Sub2ObjectTreeOne implements ObjectTreeOne { public void payloadOne() {} }
List<ObjectTreeOne> treeOneList = new ArrayList<>();
treeOneList.add(new Sub1ObjectTreeOne());
treeOneList.add(new Sub2ObjectTreeOne());
public interface ObjectTreeTwo { void payloadTwo(); }
public class Sub1ObjectTreeTwo implements ObjectTreeTwo { public void payloadTwo() {} }
public class Sub2ObjectTreeTwo implements ObjectTreeTwo { public void payloadTwo() {} }
List<ObjectTreeTwo> treeTwoList = new ArrayList<>();
treeTwoList.add(new Sub1ObjectTreeTwo());
treeTwoList.add(new Sub2ObjectTreeTwo());
for (int i = 0; i < treeOneList.size(); i++) {
    ObjectTreeOne itemOne = treeOneList.get(i);
    ObjectTreeTwo itemTwo = treeTwoList.get(i);
    itemOne.payloadOne();
    itemTwo.payloadTwo();
}

Strategy pattern, pass function into parent method

I would like to implement something like the Strategy pattern. I have generalized logic in a parent method, and I need to pass specific logic (with casting etc.) into the parent.
I have following classes:
class A{
public Object generateData(Function fetchData, AbstractForm form)
{
List<DataBean> dataBeans = (List<DataBean>) fetchData.apply(form);
//...
}
}
class B extends A {
    void someMethod(AbstractForm form) {
        Function<AbstractForm, List<DataBean>> fetchFunction = new Function<AbstractForm, List<DataBean>>() {
            @Override
            public List<DataBean> apply(AbstractForm input) {
                // here goes form-specific casting and other data-fetch-specific logic
                List<DataBean> dataBeans = null; // placeholder
                return dataBeans;
            }
        };
        super.generateData(fetchFunction, form);
    }
}
Did I get the idea of Function correctly here?
Correct use of the Strategy pattern implies aggregation between a Context (in your case class A) and a Strategy (in your case an implementation of Function).
This relationship is shown in the Strategy class diagram of the Gang of Four book, Design Patterns: Elements of Reusable Object-Oriented Software.
Below I've applied a traditional Strategy pattern approach to your problem. In this case I've made it so that Function.apply(AbstractForm) returns List<DataBean> to remove the need for casting. You could of course use generics to make Function more flexible.
Strategy
public interface Function {
List<DataBean> apply(AbstractForm form);
}
Context
public class A {
private Function fetchData; // strategy
public void setStrategy(Function fetchData) { // method for setting the strategy
this.fetchData = fetchData;
}
// precondition: fetchData != null
public Object generateData(AbstractForm form) {
List<DataBean> dataBeans = fetchData.apply(form); // using the strategy
return null; // whatever you want to return
}
}
In this case, extending class A is not necessary, as we can inject our Strategy (Function) using setStrategy(Function). However, we could always extend A to create an object with a predefined Strategy.
For example:
public class B extends A {
public B() {
setStrategy((form) -> null); // implement your concrete strategy here
}
}
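Client code could then look roughly like this; someForm is whatever AbstractForm instance you already have, and fetchDataBeansSomehow is a hypothetical stand-in for your real fetch logic:
A context = new A();
context.setStrategy(form -> fetchDataBeansSomehow(form)); // inject the strategy
context.generateData(someForm);

// or use the subclass that configures its own strategy
A preconfigured = new B();
preconfigured.generateData(someForm);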
Using a Factory Method
Since a Strategy for fetching the data is likely required, there may be no 'default' to fall back on, and it may never need to change, the Factory Method pattern could be used instead to enforce the creation of a Product (Function). Note that class A is now abstract and includes a factory method createFunction(), which is then implemented in the subclasses (e.g. B) to create the Function.
In the standard Factory Method structure, our Product is what was previously our Strategy (Function), the Creator is class A, and the ConcreteCreator is class B.
Creator
public abstract class A {
private Function fetchData; // product to be used
public A() {
fetchData = createFunction(); // call factory method
}
protected abstract Function createFunction(); // factory method
// precondition: fetchData != null
public Object generateData(AbstractForm form) {
List<DataBean> dataBeans = fetchData.apply(form); // using the product
return null; // whatever you want to return
}
}
ConcreteCreator
public class B extends A {
@Override
protected Function createFunction() {
return (form) -> null; // return product
}
}
In this case the Product is fixed and not changeable, but this could be overcome by mixing the two patterns together and including setStrategy(Function) from class A in the first example again.

Best Practice/Pattern for Transforming Java Objects

Let's say I have an application that is responsible for taking a vendor message and converting it into a canonical message. For example:
public class MessageA extends VendorMessage { ... }
public class MessageB extends VendorMessage { ... }
public class MessageX extends CanonicalMessage { ... }
public class MessageY extends CanonicalMessage { ... }
Where MessageA maps to MessageX and MessageB maps to MessageY.
My approach is that I have one transformer class per message type to handle this conversion. In this example, I would have the following transformers:
public class MessageXTransformer
{
public MessageX transform(MessageA message) {...}
}
public class MessageYTransformer
{
public MessageY transform(MessageB message) {...}
}
My question is really about the way I would ultimately invoke the transformers.
Since my process takes some VendorMessage as an input, I need to interrogate the type so I know which specific transformer to direct it to. For example, one approach might look like this:
public class TransformerService
{
MessageXTransformer messageXTransformer = new MessageXTransformer();
MessageYTransformer messageYTransformer = new MessageYTransformer();
public CanonicalMessage transform(VendorMessage message)
{
    if (message instanceof MessageA)
    {
        return messageXTransformer.transform((MessageA) message);
    }
    else if (message instanceof MessageB)
    {
        return messageYTransformer.transform((MessageB) message);
    }
    // without a fall-through return/throw the method does not compile
    throw new IllegalArgumentException("Unsupported message type: " + message.getClass());
}
}
I'm not sure why, but this approach just feels strange - as if I'm doing something wrong. Is there a best practice for this kind of problem that I should be using?
Note: I'm looking for the best approach without using any transformation frameworks, etc. Ideally, the pattern would be achievable using just basic Java.
I like the answer of @javaguy, however it is not complete. Of course it would be nice if you could use the specific transformer like in his later example, but if you can't, you have to stick with a TransformerFacade and kind of a Strategy pattern:
public class TransformerFacade {
private Map<Class, VendorMessageToCanonicalMessageTransformer> transformers = new HashMap<>();
{
// this is like strategies, the key may be class, class name, enum value, whatever
transformers.put(MessageA.class, new MessageXTransformer());
transformers.put(MessageB.class, new MessageYTransformer());
}
public CanonicalMessage transform(VendorMessage message) {
return transformers.get(message.getClass()).transform(message);
}
}
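This assumes a common transformer contract behind the map. One way to sketch it is shown below; the interface body and the cast inside the concrete transformer are assumptions for illustration, not part of the original answer:
// common contract used as the map's value type above
public interface VendorMessageToCanonicalMessageTransformer {
    CanonicalMessage transform(VendorMessage message);
}

// a concrete transformer narrows the type internally; the cast is safe
// because the facade only maps MessageA.class to this transformer
public class MessageXTransformer implements VendorMessageToCanonicalMessageTransformer {
    @Override
    public CanonicalMessage transform(VendorMessage message) {
        MessageA messageA = (MessageA) message;
        // ... build a MessageX from messageA
        return new MessageX();
    }
}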
I would simply let every concrete VendorMessage return its corresponding CanonicalMessage by implementing an interface:
public interface Mapper<T> {
T map();
}
Then, MessageA should implement this interface:
public class MessageA implements Mapper<MessageX> {
@Override
public MessageX map() {
MessageX message = ...;
// fill message
return message;
}
}
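Dispatch then becomes a plain virtual call, with no instanceof and no cast at the call site:
// someMessageA is whatever MessageA instance you received from the vendor
MessageX canonical = someMessageA.map();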
If you don't want to do the mapping in the VendorMessage class, then a strategy as suggested by Vadim Kirilchuk in his answer would do the trick.

Factory pattern: Accessing child methods

I have two classes, CashStore and DrinkStore, both of which extend Store. I have a StoreFactory class (returning Store objects) to instantiate objects for clients. I want to access methods specific to the child classes from these clients. How do I do it without casting? If I used casting, would it break the pattern, since now the clients know about the child classes?
class Store {
    void A() {}
    void B() {}
}
class CashStore extends Store {
    void A() {}
    void B() {}
    void C() {}
    void D() {}
}
//impl for drink store and other stores
class StoreFactory{
public Store getStore(String type){
//return a Store obj based on type DrinkStore or CashStore
}
}
class Client{
StoreFactory fac;
public Client(){
fac = new StoreFactory();
Store s = fac.getStore("cash");
s.C(); //requires a cast
}
}
Does casting break my pattern?
The Factory pattern is used to decouple clients from the runtime type, for example when it's platform- or layout-specific and you don't want your client code to mess with it. In your case you do need the exact type, so it seems the Factory pattern isn't a good choice. Consider using simple static methods instead, like:
class Stores {
static CashStore createCashStore() {
return new CashStore();
}
static DrinkStore createDrinkStore() {
return new DrinkStore();
}
}
So basically you need to access child-specific methods without casting. That's the whole purpose of the Visitor pattern.
You can switch between different children by using method overloading. I have given an example below; you would need to adapt it to fit your code. Also, you should take the business logic out of the constructor (of Client) and implement it inside methods.
public class Client{
public void doSomething(CashStore cs){
cs.c();
//you can call methods specific to CashStore.
}
public void doSomething(DrinkStore ds){
ds.e();
//you can call methods specific to DrinkStore.
}
}
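For comparison, a bare-bones Visitor variant for the same Store classes could look like this (a sketch only; the small methods c() and e() mirror the ones used above):
interface StoreVisitor {
    void visit(CashStore store);
    void visit(DrinkStore store);
}

abstract class Store {
    abstract void accept(StoreVisitor visitor);
}

class CashStore extends Store {
    void c() { /* cash-specific behaviour */ }

    @Override
    void accept(StoreVisitor visitor) {
        visitor.visit(this); // compiles against the CashStore overload
    }
}

class DrinkStore extends Store {
    void e() { /* drink-specific behaviour */ }

    @Override
    void accept(StoreVisitor visitor) {
        visitor.visit(this);
    }
}

// the client implements the visitor and receives the concrete type without casting
class Client implements StoreVisitor {
    public void visit(CashStore store) { store.c(); }
    public void visit(DrinkStore store) { store.e(); }
}
A call site then does store.accept(new Client()); and double dispatch picks the right overload without any instanceof.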
I want to access methods specific to child classes from these clients.
How do I do it without casting?
If you know the expected type, then you can use generics to avoid casting:
interface Store {
}
class WhiskeyStore implements Store {
}
class VodkaStore implements Store {
}
class StoreFactory {
<T extends Store> T getStore(Class<T> clazz) {
try {
// I use reflection just as an example, you can use whatever you want
return clazz.getConstructor().newInstance();
} catch (Exception e) {
throw new RuntimeException("Cannot create store of type: " + clazz, e);
}
}
}
public final class Example {
public static void main(String[] args) {
WhiskeyStore whiskeyStore = new StoreFactory().getStore(WhiskeyStore.class);
VodkaStore vodkaStore = new StoreFactory().getStore(VodkaStore.class);
}
}

Why would an Enum implement an Interface?

I just found out that Java allows enums to implement an interface. What would be a good use case for that?
Here's one example (a similar/better one is found in Effective Java 2nd Edition):
public interface Operator {
int apply (int a, int b);
}
public enum SimpleOperators implements Operator {
    PLUS {
        public int apply(int a, int b) { return a + b; }
    },
    MINUS {
        public int apply(int a, int b) { return a - b; }
    };
}
public enum ComplexOperators implements Operator {
// can't think of an example right now :-/
}
Now to get a list of both the Simple + Complex Operators:
List<Operator> operators = new ArrayList<Operator>();
operators.addAll(Arrays.asList(SimpleOperators.values()));
operators.addAll(Arrays.asList(ComplexOperators.values()));
So here you use an interface to simulate extensible enums (which wouldn't be possible without using an interface).
Enums don't just have to represent passive sets (e.g. colours). They can represent more complex objects with functionality, and so you're then likely to want to add further functionality to these - e.g. you may have interfaces such as Printable, Reportable etc. and components that support these.
The Comparable example given by several people here is wrong, since Enum already implements that. You can't even override it.
A better example is having an interface that defines, let's say, a data type. You can have an enum to implement the simple types, and have normal classes to implement complicated types:
interface DataType {
// methods here
}
enum SimpleDataType implements DataType {
INTEGER, STRING;
// implement methods
}
class IdentifierDataType implements DataType {
// implement interface and maybe add more specific methods
}
There is a case I often use. I have an IdUtil class with static methods to work with objects implementing a very simple Identifiable interface:
public interface Identifiable<K> {
K getId();
}
public abstract class IdUtil {
public static <T extends Enum<T> & Identifiable<S>, S> T get(Class<T> type, S id) {
for (T t : type.getEnumConstants()) {
if (Util.equals(t.getId(), id)) {
return t;
}
}
return null;
}
public static <T extends Enum<T> & Identifiable<S>, S extends Comparable<? super S>> List<T> getLower(T en) {
List<T> list = new ArrayList<>();
for (T t : en.getDeclaringClass().getEnumConstants()) {
if (t.getId().compareTo(en.getId()) < 0) {
list.add(t);
}
}
return list;
}
}
If I create an Identifiable enum:
public enum MyEnum implements Identifiable<Integer> {
FIRST(1), SECOND(2);
private int id;
private MyEnum(int id) {
this.id = id;
}
public Integer getId() {
return id;
}
}
Then I can get it by its id this way:
MyEnum e = IdUtil.get(MyEnum.class, 1);
Since enums can implement interfaces, they can be used to strictly enforce the singleton pattern. Trying to make a standard class a singleton allows:
for the possibility of using reflection techniques to expose private methods as public
for inheriting from your singleton and overriding your singleton's methods with something else
Enums as singletons help to prevent these security issues. This might have been one of the contributing reasons to let Enums act as classes and implement interfaces. Just a guess.
See https://stackoverflow.com/questions/427902/java-enum-singleton and Singleton class in java for more discussion.
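A minimal sketch of that idea (the interface and all names are illustrative):
public interface PaymentProcessor {
    void process(int amountInCents);
}

// enum-based singleton: the JVM guarantees exactly one instance,
// and it survives serialization and resists reflective instantiation
public enum CardPaymentProcessor implements PaymentProcessor {
    INSTANCE;

    @Override
    public void process(int amountInCents) {
        // the single, shared implementation
    }
}

// callers depend only on the interface
PaymentProcessor processor = CardPaymentProcessor.INSTANCE;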
It's required for extensibility -- if someone uses an API you've developed, the enums you define are static; they can't be added to or modified. However, if you let it implement an interface, the person using the API can develop their own enum using the same interface. You can then register this enum with an enum manager which conglomerates the enums together with the standard interface.
Edit: @Helper Method has the perfect example of this. Think about having other libraries defining new operators and then telling a manager class that 'hey, this enum exists -- register it'. Otherwise, you'd only be able to define Operators in your own code - there'd be no extensibility.
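A sketch of such a manager, reusing the Operator interface from the answer above (the registry class itself is hypothetical):
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public final class OperatorRegistry {
    private static final List<Operator> OPERATORS = new ArrayList<>();

    // API users register their own Operator enums here
    public static <E extends Enum<E> & Operator> void register(Class<E> operatorEnum) {
        OPERATORS.addAll(Arrays.asList(operatorEnum.getEnumConstants()));
    }

    public static List<Operator> all() {
        return Collections.unmodifiableList(OPERATORS);
    }
}

// library code registers its own operators:
// OperatorRegistry.register(SimpleOperators.class);
// user code registers an enum the library has never heard of:
// OperatorRegistry.register(UserDefinedOperators.class);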
The post above that mentioned strategies didn't stress enough what a nice lightweight implementation of the strategy pattern using enums gets you:
public enum Strategy {
A {
@Override
void execute() {
System.out.print("Executing strategy A");
}
},
B {
@Override
void execute() {
System.out.print("Executing strategy B");
}
};
abstract void execute();
}
You can have all your strategies in one place without needing a separate compilation unit for each. You get a nice dynamic dispatch with just:
Strategy.valueOf("A").execute();
Makes Java read almost like a tasty loosely typed language!
Enums are just classes in disguise, so for the most part, anything you can do with a class you can do with an enum.
I cannot think of a reason that an enum should not be able to implement an interface; at the same time, I cannot think of a good reason for them to do so either.
I would say once you start adding things like interfaces or methods to an enum, you should really consider making it a class instead. Of course I am sure there are valid cases for doing non-traditional enum things, and since the limit would be an artificial one, I am in favour of letting people do what they want there.
The most common usage for this would be to merge the values of two enums into one group and treat them similarly. For example, see how to join Fruits and Vegetables.
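For example, something along these lines (the Food interface and the enum constants are made up for illustration):
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

interface Food {
    String label();
}

enum Fruit implements Food {
    APPLE, BANANA;
    public String label() { return name().toLowerCase(); }
}

enum Vegetable implements Food {
    CARROT, POTATO;
    public String label() { return name().toLowerCase(); }
}

// both groups can now be treated uniformly
List<Food> menu = new ArrayList<>();
menu.addAll(Arrays.asList(Fruit.values()));
menu.addAll(Arrays.asList(Vegetable.values()));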
For example, if you have a Logger enum, then you should have the logger methods such as debug, info, warning and error in the interface. It makes your code loosely coupled.
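A rough sketch of that idea (all names are illustrative):
interface Logger {
    void debug(String message);
    void info(String message);
    void warning(String message);
    void error(String message);
}

enum ConsoleLogger implements Logger {
    INSTANCE;

    public void debug(String message)   { System.out.println("DEBUG " + message); }
    public void info(String message)    { System.out.println("INFO  " + message); }
    public void warning(String message) { System.out.println("WARN  " + message); }
    public void error(String message)   { System.err.println("ERROR " + message); }
}

// calling code depends only on the Logger interface, so the enum can be swapped out later
Logger log = ConsoleLogger.INSTANCE;
log.info("loosely coupled");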
One of the best use cases for me for enums with an interface is Predicate filters. It's a very elegant way to remedy the lack of type safety in Apache Collections (if other libraries may not be used).
import java.util.ArrayList;
import java.util.Collection;
import org.apache.commons.collections.CollectionUtils;
import org.apache.commons.collections.Predicate;
public class Test {
public final static String DEFAULT_COMPONENT = "Default";
enum FilterTest implements Predicate {
Active(false) {
@Override
boolean eval(Test test) {
return test.active;
}
},
DefaultComponent(true) {
@Override
boolean eval(Test test) {
return DEFAULT_COMPONENT.equals(test.component);
}
}
;
private boolean defaultValue;
private FilterTest(boolean defaultValue) {
this.defaultValue = defaultValue;
}
abstract boolean eval(Test test);
public boolean evaluate(Object o) {
if (o instanceof Test) {
return eval((Test)o);
}
return defaultValue;
}
}
private boolean active = true;
private String component = DEFAULT_COMPONENT;
public static void main(String[] args) {
Collection<Test> tests = new ArrayList<Test>();
tests.add(new Test());
CollectionUtils.filter(tests, FilterTest.Active);
}
}
When creating constants in a jar file, it is often helpful to let users extend the enum values. We used enums for PropertyFile keys and got stuck because nobody could add any new ones! The approach below would have worked much better.
Given:
public interface Color {
String fetchName();
}
and:
public class MarkTest {
public static void main(String[] args) {
MarkTest.showColor(Colors.BLUE);
MarkTest.showColor(MyColors.BROWN);
}
private static void showColor(Color c) {
System.out.println(c.fetchName());
}
}
one could have one enum in the jar:
public enum Colors implements Color {
BLUE, RED, GREEN;
@Override
public String fetchName() {
return this.name();
}
}
and a user could extend it to add his own colors:
public enum MyColors implements Color {
BROWN, GREEN, YELLOW;
@Override
public String fetchName() {
return this.name();
}
}
Another possibility:
public enum ConditionsToBeSatisfied implements Predicate<Number> {
IS_NOT_NULL(Objects::nonNull, "Item is null"),
IS_NOT_AN_INTEGER(item -> item instanceof Integer, "Item is not an integer"),
IS_POSITIVE(item -> item instanceof Integer && (Integer) item > 0, "Item is negative");
private final Predicate<Number> predicate;
private final String notSatisfiedLogMessage;
ConditionsToBeSatisfied(final Predicate<Number> predicate, final String notSatisfiedLogMessage) {
this.predicate = predicate;
this.notSatisfiedLogMessage = notSatisfiedLogMessage;
}
@Override
public boolean test(final Number item) {
final boolean isNotValid = predicate.negate().test(item);
if (isNotValid) {
log.warn("Invalid {}. Cause: {}", item, notSatisfiedLogMessage);
}
return predicate.test(item);
}
}
and using:
Predicate<Number> p = IS_NOT_NULL.and(IS_NOT_AN_INTEGER).and(IS_POSITIVE);
Enums are like Java classes: they can have constructors, methods, etc. The only thing that you can't do with them is new EnumName(); the instances are predefined in your enum declaration.
Here's my reason why ...
I have populated a JavaFX ComboBox with the values of an Enum. I have an interface, Identifiable (specifying one method: identify), that allows me to specify how any object identifies itself to my application for searching purposes. This interface enables me to scan lists of any type of objects (whichever field the object may use for identity) for an identity match.
I'd like to find a match for an identity value in my ComboBox list. In order to use this capability on my ComboBox containing the Enum values, I must be able to implement the Identifiable interface in my Enum (which, as it happens, is trivial to implement in the case of an Enum).
I used an inner enum in an interface describing a strategy, to keep instance control from there (each strategy is a singleton).
public interface VectorizeStrategy {
/**
* Keep instance control from here.
*
* Concrete classes constructors should be package private.
*/
enum ConcreteStrategy implements VectorizeStrategy {
DEFAULT (new VectorizeImpl());
private final VectorizeStrategy INSTANCE;
ConcreteStrategy(VectorizeStrategy concreteStrategy) {
INSTANCE = concreteStrategy;
}
@Override
public VectorImageGridIntersections processImage(MarvinImage img) {
return INSTANCE.processImage(img);
}
}
/**
* Should perform edge detection in order to have lines that can be vectorized.
*
* @param img An Image suitable for edge detection.
*
* @return the VectorImageGridIntersections representing img's vectors
* intersections with the grids.
*/
VectorImageGridIntersections processImage(MarvinImage img);
}
The fact that the enum implements the strategy is convenient: it allows the enum class to act as a proxy for its enclosed instance, which also implements the interface.
It's a sort of strategy-enum-proxy :P The client code looks like this:
VectorizeStrategy.ConcreteStrategy.DEFAULT.processImage(img);
If it didn't implement the interface, it would have been:
VectorizeStrategy.ConcreteStrategy.DEFAULT.getInstance().processImage(img);
