How to split the implementation of an interface between two classes - java

I have an object model made up of interfaces with getter/setter methods. Implementations of these objects are created using dynamic proxies, where the values of the implied fields (derived from the JavaBean naming conventions) are stored in a Map.
I'd like to add methods to these interfaces to provide business logic (you know, like a real object model and not just a collection of POJOs).
My first thought was to create abstract classes that implement each interface but only provide implementations of the business methods. Then I would use these implementations in concert with the Map in the InvocationHandler to provide a full implementation of the interface.
Something like:
interface ModelObject extends BaseModel {
    void setFoo(String foo);
    String getFoo();
    void doSomething();
}
public abstract class ModelObjectImpl implements ModelObject {
    @Override
    public void doSomething()
    {
        // Do something
    }
}
public class ModelObjectInvocationHandler implements InvocationHandler {
    Map<String, Object> fieldValues; // holds values for implied fields for getter/setter
    ModelObject modelObject;         // holds reference to object implementing business methods

    @Override
    public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
        // Get implied field name for method name by removing "get"/"set" and lower-casing the next letter
        String fieldName = getBeanNameForMethod(method.getName());
        if (fieldValues.containsKey(fieldName)) {
            return fieldValues.get(fieldName);
        }
        // Not a getter/setter, so it must be a business method. Delegate to the implementation class
        return method.invoke(modelObject, args);
    }
}
Something like this (but obviously more complicated) would work, except that I cannot create an instance of the abstract class. I could make ModelObjectImpl non-abstract and add do-nothing implementations of the getter/setter methods that would never be called, but that just uglies up the code and causes maintenance issues. I could also have ModelObjectImpl not actually implement the ModelObject interface, but that breaks the nice binding between implementation and interface, leading to errors when the interface and "implementation" get out of sync.
Are there any sneaky Java Reflection tricks I can use to invoke these business methods?
UPDATE:
I went with a combination of the Java dynamic proxy framework that's already in place and Javassist to create proxies for the abstract implementation classes. This requires no changes at all to the existing model interfaces; business methods are added on an as-needed basis. The capability is now in place to add behavior to the objects. It's up to the developers to start writing truly object-oriented code now.
public class ModelObjectInvocationHandler implements InvocationHandler
{
    public ModelObjectInvocationHandler(Class<ModelImplementation<? extends BaseModel>> implementationClass)
    {
        if (implementationClass != null)
        {
            ProxyFactory factory = new ProxyFactory();
            factory.setSuperclass(implementationClass);
            try
            {
                modelObject = (ModelObject) factory.create(new Class<?>[0], new Object[0]);
            }
            catch (Exception e)
            {
                // Exception handling
            }
        }
    }

    Map<String, Object> fieldValues; // holds values for implied fields for getter/setter
    ModelObject modelObject;         // holds reference to object implementing business methods

    @Override
    public Object invoke(Object proxy, Method method, Object[] args) throws Throwable
    {
        // Get implied field name for method name by removing "get"/"set" and lower-casing the next letter
        String fieldName = getBeanNameForMethod(method.getName());
        if (fieldValues.containsKey(fieldName))
        {
            return fieldValues.get(fieldName);
        }
        // Not a getter/setter, so it must be a business method. Delegate to the implementation class
        if (modelObject != null)
        {
            return method.invoke(modelObject, args);
        }
        return null;
    }
}
At runtime, I scan for implementation classes and create a Map<Class<? extends BaseModel>, Class<ModelImplementation>>. When creating the dynamic proxy for the interface, I find its implementation class in the map and pass it to the InvocationHandler. Any method that is not matched as a bean name is delegated to the proxy for the implementation class. Of course, it's a little more complicated than that since I have to account for class hierarchies and multiple inheritance within the model interfaces, but the theory is sound.
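For illustration, here is a minimal sketch of that wiring. The ModelProxyFactory class and its REGISTRY map are hypothetical names of mine, not part of the actual code; the real implementation also walks the interface hierarchies mentioned above.
import java.lang.reflect.Proxy;
import java.util.HashMap;
import java.util.Map;

public final class ModelProxyFactory {
    // Populated at startup by the classpath scan for implementation classes.
    private static final Map<Class<? extends BaseModel>,
                             Class<ModelImplementation<? extends BaseModel>>> REGISTRY = new HashMap<>();

    @SuppressWarnings("unchecked")
    public static <T extends BaseModel> T create(Class<T> modelInterface) {
        // May be null when no business methods have been written for this interface yet.
        Class<ModelImplementation<? extends BaseModel>> impl = REGISTRY.get(modelInterface);
        return (T) Proxy.newProxyInstance(
                modelInterface.getClassLoader(),
                new Class<?>[] { modelInterface },
                new ModelObjectInvocationHandler(impl));
    }
}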

I'm not aware of any standard Java reflection trick that would do that. You could dynamically extend the abstract classes using CGLIB or Javassist at class load time. This would improve performance a little bit, because no Proxy object is necessary anymore; instead, you can implement the getter/setter methods directly when generating the new class.
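As a rough illustration only (my sketch, not code from this answer), subclassing the question's abstract ModelObjectImpl with CGLIB's Enhancer could look like this: the abstract getters/setters are backed by a Map, while the concrete business methods run unchanged.
import java.util.HashMap;
import java.util.Map;
import net.sf.cglib.proxy.Enhancer;
import net.sf.cglib.proxy.MethodInterceptor;

public final class ModelObjectGenerator {

    public static ModelObject newInstance() {
        Map<String, Object> fieldValues = new HashMap<>();
        Enhancer enhancer = new Enhancer();
        enhancer.setSuperclass(ModelObjectImpl.class); // the abstract class holding the business methods
        enhancer.setCallback((MethodInterceptor) (obj, method, args, proxy) -> {
            String name = method.getName();
            if (name.startsWith("set") && args.length == 1) {
                fieldValues.put(propertyName(name), args[0]); // store the implied field
                return null;
            }
            if (name.startsWith("get") && args.length == 0) {
                return fieldValues.get(propertyName(name));   // read the implied field
            }
            // Concrete business method such as doSomething(): run the real code.
            return proxy.invokeSuper(obj, args);
        });
        return (ModelObject) enhancer.create();
    }

    private static String propertyName(String methodName) {
        return Character.toLowerCase(methodName.charAt(3)) + methodName.substring(4);
    }
}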
A third way, without those tricks, would be with the delegation pattern:
public class ModelObjectImpl implements ModelObject {
    private final ModelObject delegate;

    public ModelObjectImpl(ModelObject delegate) {
        this.delegate = delegate;
    }

    @Override
    public void doSomething() { /* Do something */ }

    @Override
    public String getFoo() { return delegate.getFoo(); }

    @Override
    public void setFoo(String foo) { delegate.setFoo(foo); }
}
Feed your proxy, which implements the getter/setter methods of the interface, to the delegate constructor argument. However, while this looks better than stub methods (at least to me), it's still duplicated code. So if you really want such dynamic classes, go with dynamic bytecode generation.
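Usage might then look roughly like this; a sketch only, assuming the question's ModelObjectInvocationHandler initializes its field map, with just the wiring shown.
// Dynamic proxy backing the getters/setters with the Map (see the question).
ModelObject rawBean = (ModelObject) Proxy.newProxyInstance(
        ModelObject.class.getClassLoader(),
        new Class<?>[] { ModelObject.class },
        new ModelObjectInvocationHandler());

// Hand-written business logic wrapping the proxy via delegation.
ModelObject model = new ModelObjectImpl(rawBean);
model.setFoo("bar");   // forwarded to the proxy
model.doSomething();   // runs the business method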
References:
Delegation pattern
CGLib
Javassist

One way to do that is to define the contract for all your "additional" or "business" methods in a new interface, like:
interface ModelObjectExtension {
    void doSomething();
}
interface ModelObject extends ModelObjectExtension {
    void setFoo(String foo);
    String getFoo();
}
public abstract class ModelObjectExtensionImpl implements ModelObjectExtension {
    @Override
    public void doSomething()
    {
        // Do something
    }
}
public class ModelObjectImpl extends ModelObjectExtensionImpl implements ModelObject {
    // whatever your current implementation is...
}
and finally you can use the following in your handler to call extension methods:
((ModelObjectExtension) modelObject).doSomething();

Related

create a wrapper class with additional features in java

I want to create a wrapper class over another class so that it hides the functionality of the wrapped class while also providing certain methods of its own.
For example, let's say we have class A:
public class A{
void method1(){ ... do something ... }
void method2(){ ... do something ... }
void method3(){ ... do something ... }
}
Now I want another class B that wraps class A, so that it has its own methods, and when someone calls a method of class A, it delegates the call to A.
public class B{
// if someone asks method1() or method2() or method3() ... it should delegate it to A
// and also it has own methods
void method4(){ ... do something ... }
void method5(){ ... do something ... }
}
I can't use inheritance (i.e. B extends A) because it's not easy with my use case (A has a concrete constructor with some parameters which we can't get ... but we can get an object of A).
I can't simply delegate each function in A using an object of A (because there are several functions in A).
Is there any other way to obtain class B under these restrictions?
Important Note: Class A is handled by someone else. We can't change any part of it.
What you have described is the Decorator pattern coined by the GoF. There are plenty of sources on the Internet about it. It is similar to the Proxy pattern (as in Pavel Polivka's answer), but the intent is different. You need the Decorator pattern:
Attach additional responsibilities to an object dynamically. Decorators provide a flexible alternative to subclassing for extending functionality. sourcemaking.com
As you have written in a comment
class A inherits from single interface containing several methods
I assume A implements AIntf and contains all the methods you want.
public class BDecorator implements AIntf {
    private final A delegate;

    public BDecorator(A delegate) {
        this.delegate = delegate;
    }

    @Override
    public void method1() { delegate.method1(); }
    // ...

    public void method4() { /* something new */ }
}
There are several functions in A, and I don't want to do the tedious work of writing each method explicitly in B.
Java is a verbose language. However, you don't need to do this by hand; every decent IDE provides automatic generation of delegate methods, so it will take you five seconds for any number of methods.
Class A is not in my control; someone might update its method signatures, in which case I need to watch class A and make changes to my class B.
If you create B you are responsible for it; you will at least notice if anything changes. And once again, you can regenerate the changed methods with the help of an IDE in an instant.
This can be done easily with CGLIB, but it will require a few modifications. Consider whether those modifications might not be harder than the actual delegation of the methods.
You need to extend the class; this can be done by adding a no-arg constructor to class A. We will still delegate all the methods, so do not worry about unreachable constructor parameters; we are not worried about missing data, we just want the methods.
You need to have CGLIB on your classpath (cglib in Maven); maybe you already have it.
Then A would look like:
public class A {
    private String arg = "test";

    public A() {
        // noop, just for extension
    }

    public A(String arg) {
        this.arg = arg;
    }

    public void method1() {
        System.out.println(arg);
    }
}
B would look like:
public class B extends A implements MethodInterceptor {
    private A delegate;

    public B(A delegate) {
        this.delegate = delegate;
    }

    public static B createProxy(A obj) {
        Enhancer e = new Enhancer();
        // The proxy must be a B (which extends A) so that B's own methods are available on it
        e.setSuperclass(B.class);
        e.setCallback(new B(obj));
        B proxifiedObj = (B) e.create(new Class<?>[] { A.class }, new Object[] { obj });
        return proxifiedObj;
    }

    public void method2() {
        System.out.println("a");
    }

    @Override
    public Object intercept(Object o, Method method, Object[] objects, MethodProxy methodProxy) throws Throwable {
        // If B declares the method itself, run it on this callback instance
        Method m = findMethod(this.getClass(), method);
        if (m != null) { return m.invoke(this, objects); }
        // Otherwise delegate to the wrapped A instance
        return method.invoke(delegate, objects);
    }

    private Method findMethod(Class<?> clazz, Method method) {
        try {
            return clazz.getDeclaredMethod(method.getName(), method.getParameterTypes());
        } catch (NoSuchMethodException e) {
            return null;
        }
    }
}
Then you can do:
B b = B.createProxy(new A("delegated"));
b.method1(); // will print delegated
This is not a very nice solution and you probably do not need it; please consider refactoring your code before doing this. It should be used only in very specific cases.

Creator in Factory Method Pattern

[UML class diagram of the Factory Method pattern (source: https://en.wikipedia.org/wiki/Factory_method_pattern)]
Does this diagram really depict the Factory Method pattern?
Why do we need the Creator? Look at a code example:
interface Product {
    public String getName();
}
class ConcreteProduct1 implements Product {
    @Override
    public String getName() {
        return "I'm product 1";
    }
}
class ConcreteProduct2 implements Product {
    @Override
    public String getName() {
        return "I'm product 2!";
    }
}
// CREATOR HERE
interface Creator {
    public Product createProduct(String productClass);
}
class ConcreteCreator implements Creator {
    @Override
    public Product createProduct(String productClass) {
        if (productClass.equals("1"))
            return new ConcreteProduct1();
        else if (productClass.equals("2"))
            return new ConcreteProduct2();
        else
            return null;
    }
}
public class Test {
    public static void main(String[] args) {
        Creator c = new ConcreteCreator();
        Product product = c.createProduct("1");
        System.out.print(product.getName());
    }
}
Code without the Creator interface:
class ConcreteCreator {
    public Product createProduct(String productClass) {
        if (productClass.equals("1"))
            return new ConcreteProduct1();
        else if (productClass.equals("2"))
            return new ConcreteProduct2();
        else
            return null;
    }
}
public class Test {
    public static void main(String[] args) {
        ConcreteCreator c = new ConcreteCreator();
        Product product = c.createProduct("1");
        System.out.print(product.getName());
    }
}
So why do we need the Creator interface? Is it in case I add another factory method in the future? If so, is it still the Factory Method pattern or the Abstract Factory pattern? Could you give me some code examples with extensions to my Creator interface and an implementation of ConcreteCreator that uses two methods?
Also, how about a generic Creator? It looks much simpler than many type-specific Creators:
interface Product {
    public String getName();
}
class ConcreteProduct implements Product {
    @Override
    public String getName() {
        return "I'm product 1";
    }
}
interface Moveable {
    public String move();
}
class Car implements Moveable {
    @Override
    public String move() {
        return "moving...";
    }
}
interface Creator<T> {
    public T create();
}
class ConcreteCreatorProducts implements Creator<Product> {
    @Override
    public Product create() {
        return new ConcreteProduct();
    }
}
class ConcreteCreatorCar implements Creator<Car> {
    @Override
    public Car create() {
        return new Car();
    }
}
public class Test {
    public static void main(String[] args) {
        Creator<Product> productCreator = new ConcreteCreatorProducts();
        Product product = productCreator.create();
        Creator<Car> carCreator = new ConcreteCreatorCar();
        Car car = carCreator.create();
    }
}
In your example, you don't need a Creator interface, unless you want to have multiple implementations and swap between them. But the diagram is actually describing a slightly different pattern than you've implemented.
The way the factory method pattern is described there is based on the original design patterns book. It's a bit odd today, as it uses subclassing to configure a class, when we would encourage the use of composition instead. So, the diagram does show the factory method pattern, but different from the way it's described in many other places.
The factory method pattern is:
Define an interface for creating an object, but let subclasses decide
which class to instantiate. The Factory method lets a class defer
instantiation it uses to subclasses.
In the original pattern, Creator isn't an interface. By 'interface', they mean the factory method that Creator defines, not interfaces like Java has.
The factory method doesn't need a parameter. Instead of different types being returned based on the parameter, there are different types returned based on the subclass created.
Also, you wouldn't call createProduct from main, but from methods within Creator. Creator is the user of the factory method: it defines a factory method, which may be abstract, and some other methods that use it.
See the Java examples on the Wikipedia page. The MazeGame class is the Creator. The constructor is used as the anOperation method, and there are multiple subclasses for creating different kinds of rooms.
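For instance, here is a compact sketch of that subclass-based form, reusing the Product classes from the question; the Game names are illustrative only.
// Creator: declares the factory method and the operations that use it.
abstract class Game {
    // The factory method; subclasses decide which Product to instantiate.
    protected abstract Product createProduct();

    // "anOperation()": calls the factory method instead of using new directly.
    public String describe() {
        return "Playing with: " + createProduct().getName();
    }
}

// Concrete creators override only the factory method.
class Game1 extends Game {
    @Override
    protected Product createProduct() { return new ConcreteProduct1(); }
}

class Game2 extends Game {
    @Override
    protected Product createProduct() { return new ConcreteProduct2(); }
}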
Code is written so that human readers understand it.
This means that you as a programmer sometimes use the means of the language not because it is absolutely mandatory, but because it is the best way to communicate your intention.
As soon as you declare that something is an interface, you make it clear that there is no "base class", only an interface, and that any specific implementation is a detail not really important to people dealing with the corresponding objects.
In other words: yes, it is perfectly possible to implement a factory pattern where the part responsible for creating the actual objects is not an interface but a fixed class. Especially for "internal" factories (those not exposed to a public API and a wide range of "different" end users), that is probably even the more common approach. (The code I write contains many factories; few of them follow the above approach of "interfacing" almost everything.)
Beyond that, keep in mind that programming is often about balancing different requirements. For example, you might (again, to communicate intent) decide to declare a class that provides certain functionality as final, so that nobody gets the idea of extending that specific class. But doing so means that users of that API are suddenly constrained in their choice of mocking frameworks, as mocking final classes is not something you can do easily. When you are then consuming this API and want to write unit tests, you are very happy that the public API relies on interfaces, not classes: you can always mock interfaces, but, as said, final classes can cause headaches.

Thinking in Java 4th Edition - is it necessary to make a factory for isolating the code from implementation?

I am currently reading "Thinking in Java 4th edition". In the Chapter "Interface" and the sub-chapter "Interfaces and factories", it states the following
An interface is intended to be a gateway to multiple implementations,
and a typical way to produce objects that fit the interface is the
Factory Method design pattern. Instead of calling a constructor
directly, you call a creation method on a factory object which
produces an implementation of the interface—this way, in theory, your
code is completely isolated from the implementation of the interface,
thus making it possible to transparently swap one implementation for
another. Here’s a demonstration showing the structure of the Factory
Method:
(for easy reference, the example code is quoted after my question)
My question is: why don't we just make the "serviceConsumer" method like this?
public static void serviceConsumer(Service s) {
s.method1();
s.method2();
}
In this case, the code depends on the interface "Service" but not the implementation. (It can also be "swapped" transparently, can't it?) So I don't really get the point of using a "factory" here, or what the quoted passage states at the start.
-----------------------------below quoted from "Thinking in Java"------------------------------
//: interfaces/Factories.java
import static net.mindview.util.Print.*;
interface Service {
void method1();
void method2();
}
interface ServiceFactory {
Service getService();
}
class Implementation1 implements Service {
Implementation1() {} // Package access
public void method1() {
print("Implementation1 method1");
}
public void method2() {
print("Implementation1 method2");
}
}
class Implementation1Factory implements ServiceFactory {
public Service getService() {
return new Implementation1();
}
}
class Implementation2 implements Service {
Implementation2() {} // Package access
public void method1() {
print("Implementation2 method1");
}
public void method2() {
print("Implementation2 method2");
}
}
class Implementation2Factory implements ServiceFactory {
public Service getService() {
return new Implementation2();
}
}
public class Factories {
public static void serviceConsumer(ServiceFactory fact) {
Service s = fact.getService();
s.method1();
s.method2();
}
public static void main(String[] args) {
serviceConsumer(new Implementation1Factory());
// Implementations are completely interchangeable:
serviceConsumer(new Implementation2Factory());
}
}
/* Output:
Implementation1 method1
Implementation1 method2
Implementation2 method1
Implementation2 method2
*/ //:~
Well, nothing prevents you from writing such a method; the quoted statement is about the creation of the object itself.
In this case, the code depends on the interface "Service" but not the implementation
In both cases the code depends on the interface; the difference is that in your implementation the Service is created outside the serviceConsumer method.
Maybe it will be clearer if you see a real use of the Factory Method; the TIJ example is without any context.
My favorite example is Collection.iterator(), where Collection is the ServiceFactory and Iterator is the Service. You can see the calls in serviceConsumer(), but think of the following:
Collection c = new ArrayList(); // ArrayList is a Factory for its iterator
Iterator i = c.iterator(); // getService()
if (i.hasNext()) { ...}
If serviceConsumer were a method to print the collection (instead of something without context), you could see how passing a ServiceFactory (the ArrayList) is better than passing the Service (the Iterator). There is more encapsulation this way (the details of the Service are hidden inside the method).
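To make that concrete, here is a small sketch (my own, the class name is illustrative) of what such a context-bearing serviceConsumer could look like, taking the factory (the Collection) rather than the service (the Iterator):
import java.util.Collection;
import java.util.Iterator;

class CollectionPrinter {
    // The caller passes the factory; the Iterator stays a hidden detail.
    static void printAll(Collection<?> items) {
        Iterator<?> it = items.iterator(); // getService()
        while (it.hasNext()) {
            System.out.println(it.next());
        }
    }
}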
Here are some UML diagrams to help understand the similarities (not reproduced here): the Factory method pattern, the TIJ example, and Collection.iterator().
Note: The pink classes are actually anonymous classes that implement the Iterator interface type that corresponds to the Collection. They're not normally classes a client will instantiate any other way (hidden).

Using methods from a subclass on an object that is an instance of the superclass

Let's say there's a class that I use extensively and is returned by a method.
CommonClass obj = getCommonObject();
Now I want to extend this class to create some utility method to avoid repeating myself.
public class CommonClassPlus extends CommonClass {
    public String dontRepeatYourself() {
        // the reason I'm creating a subclass
    }
}
Of course, I would like to use my improved class with the method above; however, downcasting isn't allowed.
CommonClassPlus obj = getCommonObject();
//Cannot cast to CommonClassPlus
How can I use the method dontRepeatYourself() if I can only work with the object that is an instance of the superclass?
CommonClass and getCommonObject() are from an external library and I cannot change them.
You cannot add behavior to an existing instance in Java (like you could in JavaScript, for example).
The closest you can get in Java is the Decorator pattern:
CommonClassPlus obj = decorate(getCommonObject());
where decorate() is
public CommonClassPlus decorate(CommonClass x) {
return new CommonClassPlus(x);
}
This approach creates a potentially huge amount of boilerplate because it must delegate each method call to the wrapped instance. If a method in CommonClass is final and there is no interface you can reimplement, then this approach fails altogether.
In most cases you will be able to get along with a simple static helper method:
public static String dontRepeatYourself(CommonClass x) {
...
}
If CommonClass is from an external library, you probably want to wrap it in an Adapter Pattern anyway, using the principle of Composition over Inheritance.
This gives you complete control if you want to, say, change the library you're using, and allows you to add functionality like dontRepeatYourself().
public class CommonClassAdapter implements MyAdapter {
    private final CommonClass common;
    private final String cachedResult;

    // Note that I'm doing dependency injection here
    public CommonClassAdapter(CommonClass common) {
        this.common = common;
        // Don't expose these because they shouldn't be called more than once
        common.methodIOnlyCallOnce();
        cachedResult = common.anotherMethodIOnlyCallOnce();
    }

    @Override
    public void someMethod() {
        common.someMethodWithDifferentName();
    }

    @Override
    public String dontRepeatYourself() {
        return cachedResult;
    }
}
Note also that most modern IDEs have things like Eclipse's Source -> Generate Delegate Methods to make this process faster.

Default method in interface in Java 8 and Bean Info Introspector

I have a little problem with default methods in an interface and the BeanInfo Introspector.
In this example, there is an interface, Interface:
public static interface Interface {
default public String getLetter() {
return "A";
}
}
and two classes ClassA and ClassB:
public static class ClassA implements Interface {
}
public static class ClassB implements Interface {
public String getLetter() {
return "B";
}
}
In the main method, the app prints the PropertyDescriptors from BeanInfo:
public static String formatData(PropertyDescriptor[] pds) {
return Arrays.asList(pds).stream()
.map((pd) -> pd.getName()).collect(Collectors.joining(", "));
}
public static void main(String[] args) {
try {
System.out.println(
formatData(Introspector.getBeanInfo(ClassA.class)
.getPropertyDescriptors()));
System.out.println(
formatData(Introspector.getBeanInfo(ClassB.class)
.getPropertyDescriptors()));
} catch (IntrospectionException e) {
e.printStackTrace();
}
}
And the result is:
class
class, letter
Why is the default method "letter" not visible as a property in ClassA? Is it a bug or a feature?
I guess the Introspector does not process interface hierarchy chains, even though with Java 8 virtual extension methods (aka defenders, or default methods) interfaces can have something that kinda sorta looks like property methods. Here's a rather simplistic introspector that claims it does: BeanIntrospector
Whether this can be considered a bug is somewhat of a gray area; here's why I think so.
Obviously, a class can now "inherit" from an interface a method that has all the qualities of what's officially considered a getter/setter/mutator. But at the same time, this whole thing is against the interface's purpose: an interface can not possibly provide anything that can be considered a property, since it's stateless; it's only meant to describe behavior. Even defender methods are basically static unless they access real properties of a concrete implementation.
On the other hand, if we assume defenders are officially inherited (as opposed to merely providing a default implementation, which is a rather ambiguous definition), they should result in synthetic methods being created in the implementing class, and those would belong to the class and be traversed as part of the PropertyDescriptor lookup. Obviously this is not the way it works, though, otherwise the whole thing would be working. :) It seems that defender methods are getting some kind of special treatment here.
Debugging reveals that this method is filtered out at Introspector#getPublicDeclaredMethods():
if (!method.getDeclaringClass().equals(clz)) {
result[i] = null; // ignore methods declared elsewhere
}
where clz is the class being introspected.
Since ClassB has a custom implementation of this method, it passes the check successfully, while ClassA doesn't.
I also think that it is a bug.
You can work around it by using a dedicated BeanInfo for your class that provides something like this:
/* (non-Javadoc)
 * @see java.beans.SimpleBeanInfo#getAdditionalBeanInfo()
 */
@Override
public BeanInfo[] getAdditionalBeanInfo()
{
    Class<?> superclass = Interface.class;
    BeanInfo info = null;
    try
    {
        info = Introspector.getBeanInfo(superclass);
    }
    catch (IntrospectionException e)
    {
        // nothing to do
    }
    if (info != null)
        return new BeanInfo[] { info };
    return null;
}
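That override would live in a dedicated BeanInfo class that the Introspector discovers by its naming convention. A sketch, assuming ClassA is a top-level class in the same package (the name ClassABeanInfo is dictated by that convention, not by this answer):
import java.beans.BeanInfo;
import java.beans.IntrospectionException;
import java.beans.Introspector;
import java.beans.SimpleBeanInfo;

// Found by the Introspector via the <BeanClass>BeanInfo naming convention.
public class ClassABeanInfo extends SimpleBeanInfo {

    @Override
    public BeanInfo[] getAdditionalBeanInfo() {
        try {
            // Merge in the properties contributed by the interface's default methods.
            return new BeanInfo[] { Introspector.getBeanInfo(Interface.class) };
        } catch (IntrospectionException e) {
            return null;
        }
    }
}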
This is because you only have the method on Interface and ClassB, not on ClassA directly. However, it sounds to me like a bug, since I'd expect that property to show up in the list. I suspect the Introspector has not caught up with Java 8 features yet.
