Suppose there is a third-party library containing a base class Transformer and concrete implementations TransformerA and TransformerB.
I need to write converter classes for TransformerA and TransformerB that produce a new class, say TransformerNew:
public class TransformerAConverter {
public TransformerNew convert(TransformerA transformerA) {
// conversion logic
}
}
public class TransformerBConverter {
public TransformerNew convert(TransformerB transformerB) {
// conversion logic
}
}
I need to write the following function:
public TransformerNew[] process(Transformer[] transformers) {
}
How can I achieve this without instanceof checks or explicit type casting? I have tried using the Visitor pattern but was unable to express it.
I would suggest using the Strategy pattern here. The two transformers would be the strategies. You may organize the code like this. Visitor won't fit here, since it is used to decouple the traversal logic from the underlying data structure or representation.
public class TransformerConverter {
private final Transformer transformerStrategy;
public TransformerConverter(Transformer strategy) {
this.transformerStrategy = strategy;
}
public TransformerNew convert() {
// use the strategy to achieve the conversion.
}
}
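A brief usage sketch of this suggestion (someTransformer is hypothetical; it stands for whatever concrete Transformer instance the library hands you):
// hypothetical usage: 'someTransformer' is a TransformerA or TransformerB
// instance obtained from the third-party library
TransformerNew converted = new TransformerConverter(someTransformer).convert();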
From the comments, it sounds like the Facade pattern might be useful. For example, given this interface:
public interface TransformerNew {
public int getInterestingValue();
}
Then have a few implementations:
public class TransformerNewA implements TransformerNew {
private final TransformerA a;
public TransformerNewA(TransformerA a) {
this.a = a;
}
public int getInterestingValue() {
return a.getSomeValue() + a.getSomeOtherValue();
}
}
and
public class TransformerNewB implements TransformerNew {
private final TransformerB b;
public TransformerNewB(TransformerB b) {
this.b = b;
}
public int getInterestingValue() {
return b.getFirstPart() + b.getSecondPart();
}
}
So there's really no conversion here - just wrapping the 3rd party type, and providing a common (hopefully simpler) interface for downstream use.
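A brief usage sketch (rawA and rawB are hypothetical stand-ins for objects obtained from the third-party library):
// downstream code only depends on the TransformerNew interface
TransformerNew first = new TransformerNewA(rawA);
TransformerNew second = new TransformerNewB(rawB);
int total = first.getInterestingValue() + second.getInterestingValue();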
The Problem
I'm trying to create an application where an object class can implement some operations from the total pool of available operations. The end goal is to avoid any code duplication and to follow the principles of OOP as much as possible.
In more detail, I'm trying to make a search engine using Lucene. Lucene uses many indices. I've already implemented a simple structure where different index objects inherit the methods of a parent class. The problem is that whatever method is implemented in that parent class automatically becomes available to all subclasses. I want to give the user the option to decide whether to do a phrase search, a term search, or whatever else is available for that specific index. The catch is that some indices shouldn't offer phrase search, for example.
First Thoughts
I've thought of implementing something close to the Composite pattern, as described by the GoF. I would implement the search operations (e.g. term search, phrase search) as primitive operations implementing some Component class and later add these primitive objects to a Composite object. The Composite object would implement the same Component class as the primitives.
public abstract class Index {
public Index(String indexPath) {
// Constructor using the information provided by the subclass
}
public void phraseSearch(...) {
// Do the operation
}
public void termSearch(...) {
// Do the operation
}
public void categorySearch(...) {
// Do the operation
}
}
public class ReviewIndex extends Index {
public ReviewIndex() {
super("./review_index/");
}
}
public class TipIndex extends Index {
public TipIndex() {
super("./tip_index/");
}
}
Expected Outcome
The ReviewIndex class shouldn't be able to perform a categorySearch but should be able to execute phraseSearch and termSearch. Respectively, the TipIndex class should be able to execute some of the parent class methods.
Final Thoughts
I know that in my solution there is no code duplication, but useless methods are generated each time a new index object is created.
Thank you all in advance!
P.S. If you think the Composite pattern is the way to go, how would you actually add the primitive objects to the composite class, and how would you invoke them when needed?
All methods defined in a superclass are available to derived classes, but with Java 8 you might be able to get something like this by using default methods in interfaces. So instead of one abstract class containing all possible methods, you might implement four interfaces:
public interface Searchable {
public String getIndexPath();
}
public interface PhraseSearchable extends Searchable {
public default void phraseSearch() {
String indexPath = getIndexPath();
// do the search
}
}
public interface TermSearchable extends Searchable {
public default void termSearch() {
String indexPath = getIndexPath();
// do the search
}
}
public interface CategorySearchable extends Searchable {
public default void categorySearch() {
String indexPath = getIndexPath();
// do the search
}
}
To avoid duplicate code you can create an abstract class
public abstract class AbstractSearchable implements Searchable {
private String indexPath;
public AbstractSearchable(String indexPath) {
this.indexPath = indexPath;
}
// other methods that might be useful
}
Your actual classes can then implement the corresponding interfaces
public class ReviewIndex extends AbstractSearchable implements CategorySearchable {
public ReviewIndex() {
super("./review_index/");
}
}
public class TipIndex extends AbstractSearchable implements PhraseSearchable, TermSearchable {
public TipIndex() {
super("./tip_index/");
}
}
Whether this is possible depends heavily on the actual implementation of the search methods. Interfaces can't contain any state, so these default methods must be able to run on their own (like a static method that doesn't use any static members of the class). You might be able to overcome this problem by adding more methods to the Searchable interface that provide the data and doing the implementation in the abstract class, but that might expose internal details, because all the methods declared in an interface are public.
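A short usage sketch (assuming the classes above) showing how the availability of each search is enforced at compile time:
TipIndex tip = new TipIndex();
tip.phraseSearch();        // OK: TipIndex implements PhraseSearchable
tip.termSearch();          // OK: TipIndex implements TermSearchable
// tip.categorySearch();   // would not compile: TipIndex is not CategorySearchable
ReviewIndex review = new ReviewIndex();
review.categorySearch();   // OK: ReviewIndex implements CategorySearchable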
If you don't want categorySearch(...) to be available on the ReviewIndex class, then create one more level in the hierarchy and keep the categorySearch(...) method there.
Example:
public abstract class Index {
public Index(String indexPath) {
// Constructor using the information provided by the subclass
}
public void phraseSearch(...) {
// Do the operation
}
}
// Give a meaningful Name
public abstract class IndexChild1 extends Index {
public IndexChild1(String indexPath) {
super(indexPath);
}
public void categorySearch(...) {
// Do the operation
}
}
// Give a meaningful Name
public abstract class IndexChild2 extends Index {
public IndexChild2(String indexPath) {
super(indexPath);
}
public void termSearch(...) {
// Do the operation
}
}
public class ReviewIndex extends IndexChild2 {
public ReviewIndex() {
super("./review_index/");
}
}
public class TipIndex extends IndexChild1 {
public TipIndex() {
super("./tip_index/");
}
}
You can use the Composite pattern if you need to have the same search objects and use them as you wish in your ReviewIndex and TipIndex classes. You can use a list, which implies aggregation, and you can reuse a single instance of each search object (PhraseSearch, TermSearch, CategorySearch) in any order you want.
Here is the code:
import java.util.ArrayList;
import java.util.List;
public class Main{
public static void main(String[] args) {
Main m = new Main();
m.run();
}
public void run() {
ReviewIndex ri = new ReviewIndex();
}
public interface ISearch {
public void search();
}
public class SearchComposite implements ISearch{
private List<ISearch> l = new ArrayList<ISearch>();
public SearchComposite(String index) {
System.out.println(index);
}
public int addSearch(ISearch search) {
l.add(search);
return this.l.size() - 1;
}
public List<ISearch> getSearch(){
return this.l;
}
public void search() {
System.out.println("search");
}
}
public class CategorySearch implements ISearch{
@Override
public void search() {
System.out.println("category search");
}
}
public class PhraseSearch implements ISearch{
@Override
public void search() {
System.out.println("phrase search");
}
}
public class TermSearch implements ISearch{
@Override
public void search() {
System.out.println("term search");
}
}
CategorySearch cs = new CategorySearch();
TermSearch ts = new TermSearch();
PhraseSearch ps = new PhraseSearch();
public class ReviewIndex {
SearchComposite sc = new SearchComposite("./review_index/");
public ReviewIndex() {
int p = sc.addSearch(ps);
int t = sc.addSearch(ts);
sc.search();
List<ISearch> s = sc.getSearch();
s.get(p).search();
s.get(t).search();
}
}
public class TipIndex {
SearchComposite sc = new SearchComposite("./tip_index/");
public TipIndex() {
int p = sc.addSearch(ps);
int t = sc.addSearch(ts);
int c = sc.addSearch(cs);
sc.search();
List<ISearch> s = sc.getSearch();
s.get(p).search();
s.get(t).search();
s.get(c).search();
}
}
}
the output of the code above is:
./review_index/
search
phrase search
term search
and we have used the same CategorySearch, TermSearch and PhraseSearch for ReviewIndex and TipIndex classes.
I know how to implement the basic Adapter design pattern, and I also know how C# uses delegation to implement the Pluggable Adapter pattern. But I could not find anything similar implemented in Java. Would you mind pointing out some example code?
Thanks in advance.
The pluggable adapter pattern is a technique for creating adapters that doesn't require making a new class for each adaptee interface you need to support.
In Java, this sort of thing is super easy, but there isn't any object involved that would actually correspond to the pluggable adapter object you might use in C#.
Many adapter target interfaces are Functional Interfaces -- interfaces that contain just one method.
When you need to pass an instance of such an interface to a client, you can easily specify an adapter using a lambda function or method reference. For example:
interface IRequired
{
String doWhatClientNeeds(int x);
}
class Client
{
public void doTheThing(IRequired target) { /* calls target.doWhatClientNeeds(...) */ }
}
class Adaptee
{
public String adapteeMethod(int x) { /* existing, incompatible API */ return null; }
}
class ClassThatNeedsAdapter
{
private final Adaptee m_whatIHave;
public void doThingWithClient(Client client)
{
// super easy lambda adapter implements IRequired.doWhatClientNeeds
client.doTheThing(x -> m_whatIHave.adapteeMethod(x));
}
public void doOtherThingWithClient(Client client)
{
// method reference implements IRequired.doWhatClientNeeds
client.doTheThing(this::_complexAdapterMethod);
}
private String _complexAdapterMethod(int x)
{
...
}
}
When the target interface has more than one method, we use an anonymous inner class:
interface IRequired
{
String clientNeed1(int x);
int clientNeed2(String x);
}
class Client
{
public void doTheThing(IRequired target) { /* calls target.clientNeed1(...) and clientNeed2(...) */ }
}
class ClassThatNeedsAdapter
{
private final Adaptee m_whatIHave;
public void doThingWithClient(Client client)
{
IRequired adapter = new IRequired() {
public String clientNeed1(int x) {
return m_whatIHave.whatever(x);
}
public int clientNeed2(String x) {
return m_whatIHave.whateverElse(x);
}
};
client.doTheThing(adapter);
}
}
One of the reasons to consider the Visitor pattern:
A practical result of this separation is the ability to add new operations to existing object structures without modifying those structures.
Assume that you don't have the source code of the third-party library and you have to add an operation to the related objects.
Since you don't have the source, your elements (the third-party classes) can't be modified to add an accept method for the Visitor.
In this case, double dispatch is not possible.
So which option is generally preferred?
Option 1: Extend one more inheritance hierarchy on top of the third-party classes and implement the pattern, as shown in the picture, with double dispatch.
For a given hierarchy where class B extends class A, I will add
ElementA extends A
ElementB extends B
Now the ConcreteElements derive from ElementA instead of class A.
Cons: The number of classes will grow.
Option 2: Use a Visitor class as a central helper class and get the work done with single dispatch.
Cons: We are not really following the Visitor pattern as per the UML diagram.
Correct me if I am wrong.
You could combine a Wrapper and Visitor to solve your problems.
Using the wrapper to add an accept method allows you to increase the usability of these objects. Of course you get the full advantages (less dependency on the legacy classes) and disadvantages (additional objects) of a wrapper.
Here's a worked example in Java (because it is pretty strict, does not do double dispatch by itself, and I'm quite familiar with it):
1) Your legacy Objects
Assuming you have legacy objects Legacy1 and Legacy2 which you cannot change, and which have specific business methods:
public final class Legacy1 {
public void someBusinessMethod1(){
...
}
}
and
public final class Legacy2 {
public void anotherBusinessMethod(){
...
}
}
2) Prepare the Wrapper
You just wrap them in a VisitableWrapper, which has an accept method that takes your visitor, like:
public interface VisitableWrapper {
public void accept(Visitor visitor);
}
With the following implementations:
public class Legacy1Wrapper implements VisitableWrapper {
private final Legacy1 legacyObj;
public Legacy1Wrapper(Legacy1 original){
this.legacyObj = original;
}
public void accept(Visitor visitor){
visitor.visit(legacyObj);
}
}
and
public class Legacy2Wrapper implements VisitableWrapper {
private final Legacy2 legacyObj;
public Legacy2Wrapper(Legacy2 original){
this.legacyObj = original;
}
public void accept(Visitor visitor){
visitor.visit(legacyObj);
}
}
3) Visitor, at the ready!
Then your own Visitors can be set to visit the wrapper like so:
public interface Visitor {
public void visit(Legacy1 leg);
public void visit(Legacy2 leg);
}
With an implementation like so:
public class SomeLegacyVisitor implements Visitor {
public void visit(Legacy1 leg){
System.out.println("This is a Legacy1! let's do something with it!");
leg.someBusinessMethod1();
}
public void visit(Legacy2 leg){
System.out.println("Hum, this is a Legacy 2 object. Well, let's do something else.");
leg.anotherBusinessMethod();
}
}
4) Unleash the power
Finally in your code, this framework would work like this:
public class TestClass {
public static void main(String[] args) {
// Start off with some legacy objects
Legacy1 leg1 = ...
Legacy2 leg2 = ...
// Wrap all your legacy objects into a List:
List<VisitableWrapper> visitableLegacys = new ArrayList<>();
visitableLegacys.add(new Legacy1Wrapper(leg1));
visitableLegacys.add(new Legacy2Wrapper(leg2));
// Use any of your visitor implementations!
Visitor visitor = new SomeLegacyVisitor();
for (VisitableWrapper wrappedLegacy : visitableLegacys) {
wrappedLegacy.accept(visitor);
}
}
}
The expected output:
This is a Legacy1! let's do something with it!
Hum, this is a Legacy 2 object. Well, let's do something else.
Drawbacks:
Quite a lot of boilerplate. Use Lombok if you develop in Java.
Quite a lot of wrapper object instances. May or may not be a problem for you.
You need to know the specific type of the objects beforehand; that is, you know their subtype and they aren't bundled in a List of the base type. If that's not the case, you have no other option but to use a runtime type check or reflection to pick the right wrapper (see the sketch below).
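A possible sketch of that fallback (my own addition, not part of the original answer), centralizing the runtime type check in one factory method using instanceof:
// hypothetical helper: picks the right wrapper for a legacy object whose
// concrete type is only known at runtime
public final class WrapperFactory {
    public static VisitableWrapper wrap(Object legacy) {
        if (legacy instanceof Legacy1) {
            return new Legacy1Wrapper((Legacy1) legacy);
        } else if (legacy instanceof Legacy2) {
            return new Legacy2Wrapper((Legacy2) legacy);
        }
        throw new IllegalArgumentException("Unsupported legacy type: " + legacy.getClass());
    }
}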
There should be a possibility to add new functionality to the classes of a hierarchy without changing the base class interface. The kinds of possible behavior should stay constant, while the operations should execute differently for different classes.
The Visitor pattern allows you to concentrate all those operations in one class. There might be a lot of ConcreteElement classes (from the diagram), but for each of them a visit() method will be implemented in the ConcreteVisitor class, defining its own algorithm.
Definition and implementation of the visit() method for the Element hierarchy:
public interface Visitor {
void visit(Element element);
}
public class ConcreteVisitor implements Visitor {
public void visit(Element element) {
// implementation
}
}
The Visitor pattern is easily extended with new operations: a new class implements this interface with its own method implementation.
The following structure encapsulates the Element class:
public class ObjectStructure {
private Element element;
// some methods
}
This ObjectStructure class could aggregate one or several instances of Element (see the sketch after the ConcreteElement example below). Element is the representation that the Visitor acts on:
public interface Element {
void accept(Visitor visitor);
}
And implementation of accept() method in the concrete entity:
public class ConcreteElement implements Element {
public void accept(Visitor visitor) {
visitor.visit(this);
}
}
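As a possible elaboration (my sketch, not part of the original text), here is an ObjectStructure that aggregates several elements and lets a visitor traverse all of them:
import java.util.ArrayList;
import java.util.List;

public class ObjectStructure {
    private final List<Element> elements = new ArrayList<>();

    public void add(Element element) {
        elements.add(element);
    }

    // hands the visitor to every aggregated element
    public void accept(Visitor visitor) {
        for (Element element : elements) {
            element.accept(visitor);
        }
    }
}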
Using the Visitor pattern keeps the Element hierarchy free of bulky logic and complicated configuration.
It is desirable to add the functionality to all the classes of the hierarchy when defining new Visitor subclasses. But there can be a problem: visit() has to be overridden for every type in the hierarchy. To avoid this, it's better to define an AbstractVisitor class and leave all its visit() method bodies empty.
Conclusion: using this pattern is good when the Element class hierarchy stays constant. If new classes are added, it usually leads to considerable changes in the Visitor classes.
My answer is very similar to Michael von Wenckstern's, with the improvements that we have a named accept method (more like the standard pattern) and that we handle unknown concrete classes -- there's no guarantee that at some point a concrete implementation we haven't seen before won't appear on the classpath.
My visitor also allows a return value.
I've also used a more verbose name for the visit methods -- including the type in the method name, but this isn't necessary, you can call them all visit.
// these classes cannot be modified and do not have source available
class Legacy {
}
class Legacy1 extends Legacy {
}
class Legacy2 extends Legacy {
}
// this is the implementation of your visitor
abstract class LegacyVisitor<T> {
abstract T visitLegacy1(Legacy1 l);
abstract T visitLegacy2(Legacy2 l);
T accept(Legacy l) {
if (l instanceof Legacy1) {
return visitLegacy1((Legacy1)l);
} else if (l instanceof Legacy2) {
return visitLegacy2((Legacy2)l);
} else {
throw new RuntimeException("Unknown concrete Legacy subclass:" + l.getClass());
}
}
}
public class Test {
public static void main(String[] args) {
String s = new LegacyVisitor<String>() {
@Override
String visitLegacy1(Legacy1 l) {
return "It's a 1";
}
@Override
String visitLegacy2(Legacy2 l) {
return "It's a 2";
}
}.accept(new Legacy1());
System.out.println(s);
}
}
First, I had to make a few assumptions about the legacy code, since you didn't provide many details about it. Let's say I need to add a new method to the legacy class without reimplementing everything. This is how I'd do it:
public interface LegacyInterface {
void A();
}
public final class LegacyClass implements LegacyInterface {
@Override
public void A() {
System.out.println("Hello from A");
}
}
First, extend the "contract":
public interface MyInterface extends LegacyInterface {
void B();
}
And implement it in a "decorated" way
public final class MyClass implements MyInterface {
private final LegacyInterface origin;
public MyClass(LegacyInterface origin) {
this.origin = origin;
}
@Override
public void A() {
origin.A();
}
@Override
public void B() {
System.out.println("Hello from B");
}
}
The key point is that MyInterface extends LegacyInterface: this guarantees that implementations benefit both from the services of the legacy code and from your own additions.
Usage
MyInterface b = new MyClass(new LegacyClass());
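Both the legacy operation and the new one are then available on the same reference:
b.A(); // delegates to the wrapped LegacyClass
b.B(); // new behaviour added by MyClass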
I think the best approach is Option 1: extend one more inheritance hierarchy on top of the third-party classes and implement the Visitor pattern with double dispatch.
The problem is the number of additional classes you need, but this can be resolved with a dynamic wrapper decorator.
The Wrapper Decorator is a way to add interface implementations, methods and properties to already existing objects: How to implement a wrapper decorator in Java?
This way you only need your Visitor interface with the visit(L legacy) method:
public interface Visitor<L> {
public void visit(L legacy);
}
In the AcceptInterceptor you can put the code for the accept method
public class AcceptInterceptor {
@RuntimeType
public static Object intercept(@This WrappedAcceptor proxy, @Argument(0) Visitor visitor) throws Exception {
visitor.visit(proxy.getWrapped());
return null;
}
}
The WrappedAcceptor interface defines the method to accept a visitor and to set and retrieve the wrapped object
interface WrappedAcceptor<V> {
Object getWrapped();
void setWrapped(Object wrapped);
void accept(V visitor);
}
And finally the utility code to create the wrapper around any object:
Class<? extends Object> proxyType = new ByteBuddy()
.subclass(legacyObject.getClass(), ConstructorStrategy.Default.IMITATE_SUPER_TYPE_PUBLIC)
.method(anyOf(WrappedAcceptor.class.getMethods())).intercept(MethodDelegation.to(AcceptInterceptor.class))
.defineField("wrapped", Object.class, Visibility.PRIVATE)
.implement(WrappedAcceptor.class).intercept(FieldAccessor.ofBeanProperty())
.make()
.load(getClass().getClassLoader(), ClassLoadingStrategy.Default.WRAPPER)
.getLoaded();
WrappedAcceptor wrapper = (WrappedAcceptor) proxyType.newInstance();
wrapper.setWrapped(legacyObject);
If your library does not have accept methods, you need to do it with instanceof. (Normally you do single dispatch twice in Java to emulate double dispatch; here we use instanceof instead to emulate double dispatch.)
Here is the example:
interface Library {
public void get1();
public void get2();
}
public class Library1 implements Library {
public void get1() { ... }
public void get2() { ... }
}
public class Library2 implements Library {
public void get1() { ... }
public void get2() { ... }
}
interface Visitor {
default void visit(Library1 l1) {}
default void visit(Library2 l2) {}
default void visit(Library l) {
// add here instanceof for double dispatching
if (l instanceof Library1) {
visit((Library1) l);
}
else if (l instanceof Library2) {
visit((Library2) l);
}
}
}
// add extra print methods to the library
public class PrinterVisitor implements Visitor {
public void visit(Library1 l1) {
System.out.println("I am library1");
}
public void visit(Library2 l2) {
System.out.println("I am library2");
}
}
and now in any method you can write:
Library l = new Library1();
PrinterVisitor pv = new PrinterVisitor();
pv.visit(l);
and it will print "I am library1".
I have two separate entities:
public enum Rule implements Validatable, StringRepresentable{
//...
}
and
public interface Filter extends Validatable, StringRepresentable{
//...
}
Where
public interface Validatable{
public GenericValidator getValidator();
}
and
public interface StringRepresentable{
public String getStringRepresentation();
}
GenericValidator is an abstract class with a number of subclasses that I would not like users to access directly. How should I handle these things better?
I don't understand when it's better to create a class like
public class ValidatorFactory{
public Validator getRuleValidator(Rule r){ ... }
public Validator getFilterValidator(Filter f){ ... }
}
instead of implementing the Validatable interface as I showed earlier.
Could someone explain how I can make the right decision? Under what circumstances would implementing a factory method be a bad decision, and when would it be a really good one?
UPD:
public interface Validator{
public ErrorCode validate();
}
public abstract class GenericValidator implements Validator{
//...
}
The ErrorCode class encapsulates the result of the validation (null if the validation completed successfully).
The Single Responsibility Principle
Construction of a Validator is one responsibility; a Filter or Rule probably carries another one. This means we should split them, and usually we do so by encapsulating the instantiation logic in a Factory.
Also note that implementing Validatable means being a ValidatorFactory. My answer would be: combine both solutions:
public class FilterImpl implements Filter {
private final Validator validator;
public FilterImpl(Validator validator) {
this.validator = validator;
}
@Override
public Validator getValidator() {
return this.validator;
}
//...
}
public class FilterFactory {
private final ValidatorFactory validatorFactory = new ValidatorFactory();
public Filter createFilter() {
return new FilterImpl(validatorFactory.createFilterValidator());
}
}
This is called Dependency Injection.
I use this pattern in two major cases:
A) Construction of the object isn't trivial - I don't trust the users of the API to do it correctly
B) There are more implementations and I want to choose the right one myself.
In both these cases I want to hide implementations simply because the user won't know which one to use and/or doesn't know how to construct it properly.
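A minimal sketch of that idea (FilterValidatorImpl is a hypothetical name; Validator, ErrorCode and Filter are the types from the question): keep the concrete implementation package-private and expose only the factory.
// hypothetical sketch: the concrete validator stays package-private,
// so API users can only obtain it through the factory
final class FilterValidatorImpl implements Validator { // not visible outside the package
    @Override
    public ErrorCode validate() {
        // real validation logic would go here
        return null; // null meaning "no error", as in the question
    }
}

public final class ValidatorFactory {
    public Validator getFilterValidator(Filter f) {
        return new FilterValidatorImpl(); // the caller never sees the concrete type
    }
}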
Always aim for simplicity and ease-of-use for your user. Ask yourself these questions:
Is the API easy to understand?
Is the API easy/fun to use?
Is it foolproof? (I have to try quite hard to misuse it)
Validator interface can look like this:
public interface Validator {
public int validate();
}
Filter interface can look like this:
public interface Filter {
public String getParameters(); // some related methods..
public int currentLength();
....
}
Rule interface:
public interface Rule {
public String getRule();
}
FilterValidator can look like this:
public class FilterValidator implements Validator{
private Filter f;
public FilterValidator(Filter f){
this.f = f;
}
@Override
public int validate() {
// validate f and return errorcode
String params = f.getParameters();
int strLength = f.currentLength();
.....
return 0;
}
}
Creating a factory is better to hide the internal logic of validators.
public class ValidatorFactory {
public Validator getRuleValidator(Rule r){
return null;
}
public Validator getFilterValidator(Filter f){
FilterValidator fv = new FilterValidator(f);
return fv;
}
}
Now the client will invoke this factory like this:
public class ClientDemo {
private class MyFilter implements Filter{
private String filterInput;
public MyFilter(String input){
this.filterInput = input;
}
@Override
public String getParameters() {
return null;
}
@Override
public int currentLength() {
return this.filterInput.length();
}
}
public void testValidators(){
ValidatorFactory factory = new ValidatorFactory();
Validator v = factory.getFilterValidator(new MyFilter("filter string goes here..."));
v.validate();
}
}
Through the Rule and Filter interfaces you can enforce the behavior you desire from the client. The client can then get instances from the factory and pass the rule/filter instances to it for validation.
I can't seem to figure out the best approach to tackle the following problem. Let's say there is an abstract base class with several concrete subclasses:
public abstract class AbstractType { /* common properties */ }
public class TypeA extends AbstractType { /* properties of type A */ }
public class TypeB extends AbstractType { /* properties of type B */ }
These are domain classes (JPA entities). The properties of the types are (amongst other things) used to validate user data. I'm under the assumption that adding logic to the domain model itself is considered bad practice. Therefore, I want to avoid adding a validate method to the concrete subclasses. Like so:
UserInput userInput = ...;
AbstractType data = ...;
data.validate(userInput);
I don't see an option that avoids casting the domain model if I want to move the logic to a logic layer. With the limited knowledge I have, I can only come up with the following two similar "solutions", using some kind of handler interface.
Keep some explicit reference to the handler in the type
public interface TypeHandler {
public void validate(AbstractType data, UserInput userInput);
}
/* TypeAHandler & TypeBHandler implementations */
public enum Type {
TYPE_A(new TypeAHandler()),
TYPE_B(new TypeBHandler());
private TypeHandler handler;
Type(TypeHandler handler){
this.handler = handler;
}
public TypeHandler getHandler(){ return handler; }
}
public class TypeA extends AbstractType {
private Type type = Type.TYPE_A;
/* ... */
}
The handler would then be called in the following manner:
UserInput userInput = ...;
AbstractType data = ...;
data.getType().getHandler().validate(data, userInput);
The reference to the handler could also be added directly (without the enum in between) as a property of the AbstractType class, but that would mean the domain model holds a reference to a class inside the logic layer (which kind of defeats the purpose of moving the logic to a logic layer?).
The problem here, too, is that the validate method inside the TypeXHandler needs to cast the data argument to its subclass before it can start validating.
Or I could implement some method which has a large if-then structure to get the right subclass, cast it and call the appropriate handler which implements an interface similar to the following.
public interface TypeHandler<T extends AbstractType> {
public void validate(T data, UserInput userInput);
}
So in both cases there is casting. In the first case there is no huge if-then structure, but the logic and domain are not separated. In the second case there is a very inflexible if-then structure.
To conclude, here is my question. Should I really avoid implementing the logic directly inside the domain? If so, is there any way to avoid the casting, the if-else structure and/or adding additional properties to the domain model (like the enum in the first "solution").
At the end of the day, you're branching based on the subtype (concrete classes) since the logic to validate user input is based on those specific details contained in the subclasses.
Generics don't really help you much here since generics are based primarily on applying logic that is uniform across different types, operating on universal logic applied to a common interface that all applicable types share. Here your logic and interface varies for each subtype.
So your main choices are an inextensible solution where you're modifying central source code (like a big bunch of ifs/elses, a map, etc) and manually branching based on subtype, or using abstraction/dynamic polymorphism as an extensible solution which doesn't require modifying any central source code and automatically branches based on subtype.
Reflection might also be a possible route if you can afford it (it's a bit expensive at runtime) and provided it can fit to give you that universal logic you can implement centrally.
If you don't want to add this validate method to AbstractType and all of its subtypes, then you can always add another level of abstraction on top which does contain a validate method: for example, a ValidatorB which implements an IValidator interface, stores an object of TypeB as a member, and applies the user-input validation logic using TypeB's properties, as sketched below.
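A minimal sketch of that approach (the validate signature is an assumption, not from the original post):
// hypothetical sketch: wraps a concrete domain object and owns its validation logic
public interface IValidator {
    void validate(UserInput userInput);
}

public class ValidatorB implements IValidator {
    private final TypeB data; // the concrete domain entity held as a member

    public ValidatorB(TypeB data) {
        this.data = data;
    }

    @Override
    public void validate(UserInput userInput) {
        // validation rules based on TypeB's properties go here
    }
}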
I studied design patterns last week and I would like to propose my solution (it works, but I'm not sure it is the smartest way to solve your problem).
The idea of my solution is to use a factory: you give a model (in your case a JPA entity) to the factory and it gives you the correct validator for that model.
At the beginning of the program, you have to tell the factory which validator class goes with each model class of your program, through a register method.
Let's start with the implementation...
AbstractModel.java
public abstract class AbstractModel
{
private final int commonProperty;
protected AbstractModel(int commonProperty)
{
this.commonProperty = commonProperty;
}
public int getCommonProperty() { return commonProperty; };
}
In the AbstractModel we put all the common properties of the models.
ModelA.java
public class ModelA extends AbstractModel
{
private final int specificProperty1;
private final int specificProperty2;
public ModelA(int commonProperty, int specificProperty1, int specificProperty2)
{
super(commonProperty);
this.specificProperty1 = specificProperty1;
this.specificProperty2 = specificProperty2;
}
public int getSpecificProperty1() { return specificProperty1; }
public int getSpecificProperty2() { return specificProperty2; }
}
ModelA has got two specific properties.
ModelB.java
public class ModelB extends AbstractModel
{
private final int specificProperty1;
private final int specificProperty2;
public ModelB(int commonProperty, int specificProperty1, int specificProperty2)
{
super(commonProperty);
this.specificProperty1 = specificProperty1;
this.specificProperty2 = specificProperty2;
}
public int getSpecificProperty1() { return specificProperty1; }
public int getSpecificProperty2() { return specificProperty2; }
}
ModelB has got two specific properties too.
Let's say that an instance a of ModelA is valid iff
a.commonProperty == a.specificProperty1 + a.specificProperty2
and an instance b of ModelB is valid iff
b.commonProperty == b.specificProperty1 * b.specificProperty2
Validator.java
public interface Validator
{
public boolean validate();
}
A really simple interface for the validators.
AbstractValidator.java
public abstract class AbstractValidator implements Validator
{
private final AbstractModel toBeValidated;
protected AbstractValidator(AbstractModel toBeValidated)
{
this.toBeValidated = toBeValidated;
}
protected AbstractModel getModel()
{
return toBeValidated;
}
}
This is the superclass of the concrete validators that wraps the model to be validated.
ValidatorA.java
public class ValidatorA extends AbstractValidator
{
protected ValidatorA(AbstractModel toBeValidated)
{
super(toBeValidated);
}
public boolean validate()
{
ModelA modelA = (ModelA) getModel();
return modelA.getCommonProperty() == modelA.getSpecificProperty1() + modelA.getSpecificProperty2();
}
}
The validator for the instances of ModelA.
ValidatorB.java
public class ValidatorB extends AbstractValidator
{
protected ValidatorB(AbstractModel toBeValidated)
{
super(toBeValidated);
}
public boolean validate()
{
ModelB modelB = (ModelB) getModel();
return modelB.getCommonProperty() == modelB.getSpecificProperty1() * modelB.getSpecificProperty2();
}
}
And this is the validator for the instances of ModelB.
And finally, here comes the factory!
ValidatorsFactory.java
public class ValidatorsFactory
{
private static ValidatorsFactory instance;
private final HashMap<Class<? extends AbstractModel>, Class<? extends Validator>> registeredValidators;
private ValidatorsFactory()
{
registeredValidators =
new HashMap<Class<? extends AbstractModel>, Class<? extends Validator>>();
}
public static ValidatorsFactory getInstance()
{
if (instance == null)
instance = new ValidatorsFactory();
return instance;
}
public void registerValidator(
Class<? extends AbstractModel> model,
Class<? extends Validator> modelValidator)
{
registeredValidators.put(model, modelValidator);
}
public Validator createValidator(AbstractModel model)
{
Class<? extends Validator> validatorClass = registeredValidators.get(model.getClass());
Constructor<? extends Validator> validatorConstructor = null;
Validator validator = null;
try
{
validatorConstructor = validatorClass.getDeclaredConstructor(new Class<?>[] { AbstractModel.class });
validator = (Validator) validatorConstructor.newInstance(new Object[] { model });
}
catch (NoSuchMethodException | SecurityException | InstantiationException | IllegalAccessException | IllegalArgumentException | InvocationTargetException e)
{
System.err.println(e.getMessage());
// handle exception
}
return validator;
}
}
The factory is a singleton with two significant methods:
registerValidator to add a new pair (modelClass, validatorClass) in the HashMap.
createValidator to obtain the correct validator for the specified model.
This is how to use this pattern:
public class Main
{
public static void main(String args[])
{
ValidatorsFactory factory = ValidatorsFactory.getInstance();
factory.registerValidator(ModelA.class, ValidatorA.class);
factory.registerValidator(ModelB.class, ValidatorB.class);
ModelA modelA = new ModelA(10, 4, 6);
if (factory.createValidator(modelA).validate())
System.out.println("modelA is valid");
else
System.out.println("modelA is not valid");
ModelB modelB = new ModelB(10, 8, 2);
if (factory.createValidator(modelB).validate())
System.out.println("modelB is valid");
else
System.out.println("modelB is not valid");
}
}
output:
modelA is valid [because 10 = 4 + 6]
modelB is not valid [because 10 != 8 * 2]
Note that the model is completely separated from the controller and it uses only one cast, from AbstractModel to a concrete model.
Hope it helps!