So I want to achieve something like this:
@Component
public class ComponentA {
public void doThis() {}
}
@Component
public class ComponentB {
public void doThat() {}
}
public interface MyInterface {
void doSomething();
}
public class MyInterfaceImplA implements MyInterface {
private final ComponentA componentA;
@Inject
public MyInterfaceImplA(ComponentA componentA){
this.componentA = componentA;
}
public void doSomething(){
componentA.doThis();
}
}
public class MyInterfaceImplB implements MyInterface {
private final ComponentB componentB;
@Inject
public MyInterfaceImplB(ComponentB componentB) {
this.componentB = componentB;
}
public void doSomething() {
componentB.doThat();
}
}
What I basically want is to inject different components into different classes that implement the same interface.
My question is whether there is a good way to set up this architecture in this or a similar way, or whether there is a pattern that achieves this more cleanly.
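For illustration, if MyInterfaceImplA and MyInterfaceImplB were registered as named beans (e.g. @Component("implA") and @Component("implB")), a client could pick one by qualifier. This is only a sketch, assuming Spring's @Component/@Qualifier together with javax.inject's @Inject; the client class name is made up:
@Component
public class SomeClient {
private final MyInterface myInterface;
// Explicitly ask for the "implA" variant of MyInterface.
@Inject
public SomeClient(@Qualifier("implA") MyInterface myInterface) {
this.myInterface = myInterface;
}
public void run() {
myInterface.doSomething(); // ends up in ComponentA.doThis()
}
}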
Related
Let's say I have an object with one boolean field.
public class AnyPojo {
private boolean b;
}
An interface DoAnything
public interface DoAnything {
void doAnything();
}
And two @Service-annotated implementations of DoAnything:
public class DoAnythingOneImpl implements DoAnything {
@Override
public void doAnything(){
//..
}
}
public class DoAnythingTwoImpl implements DoAnything {
@Override
public void doAnything(){
//..
}
}
In another @Service class, the boolean field of AnyPojo determines which implementation of DoAnything should be called. How can I achieve that? I can use ApplicationContext here and make the decision as below, but I am not sure whether there are better ways.
@Service
public class AnotherServiceImpl implements AnotherService {
@Autowired
private ApplicationContext context;
@Override
public void anotherDoing(AnyPojo anyPojo) {
if (anyPojo.getB()) {
context.getBean(DoAnythingOneImpl.class).doAnything();
} else {
context.getBean(DoAnythingTwoImpl.class).doAnything();
}
}
}
First things first: if your class requires particular implementations, why don't you simply inject those classes?
If you have several implementations of an interface, you have to tell the Spring framework which one you would like to inject into a class. You can distinguish the implementations by their unique bean names:
#Service("oneImpl")
public class DoAnythingOneImpl implements DoAnything {
#Override
public void doAnything(){
//..
}
}
#Service("twoImpl")
public class DoAnythingTwoImpl implements DoAnything {
#Override
public void doAnything(){
//..
}
}
Then inject both instances into the client service, marking which implementation should be assigned to which field:
@Service
public class AnotherServiceImpl implements AnotherService {
@Autowired
@Qualifier("oneImpl")
private DoAnything doAnythingOneImpl;
@Autowired
@Qualifier("twoImpl")
private DoAnything doAnythingTwoImpl;
@Override
public void anotherDoing(AnyPojo anyPojo) {
if (anyPojo.getB()) {
doAnythingOneImpl.doAnything();
} else {
doAnythingTwoImpl.doAnything();
}
}
}
Note that I would avoid calling a service component from another service component, to prevent potential dependency cycles in the future.
I would keep the following flow:
Controller ---canCall---> Services ---canCall---> Repositories
And if you need services with more complex logic, then introduce the concept of a Facade:
Controller ---canCall---> Facades ---canCall---> Services ---canCall---> Repositories
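As a rough sketch of what such a facade could look like (all class and method names here are hypothetical):
@Component
public class OrderFacade {
private final OrderService orderService;
private final NotificationService notificationService;
@Autowired
public OrderFacade(OrderService orderService, NotificationService notificationService) {
this.orderService = orderService;
this.notificationService = notificationService;
}
// The controller talks only to the facade; the facade coordinates the services.
public void placeOrder(Order order) {
orderService.save(order);
notificationService.notifyCustomer(order);
}
}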
However, here is a solution:
#Service("myServiceOne")
public class DoAnythingOneImpl implements DoAnything {
#Override
public void doAnything(){
//..
}
}
#Service("myServiceTwo")
public class DoAnythingTwoImpl implements DoAnything {
#Override
public void doAnything(){
//..
}
}
You can autowire both services and choose the right one based on your boolean:
@Service
public class AnotherServiceImpl implements AnotherService {
@Autowired
@Qualifier("myServiceOne")
private DoAnything serviceOne;
@Autowired
@Qualifier("myServiceTwo")
private DoAnything serviceTwo;
@Override
public void anotherDoing(AnyPojo anyPojo) {
if (anyPojo.getB()) {
serviceOne.doAnything();
} else {
serviceTwo.doAnything();
}
}
}
I have an interface
public interface Abstraction {
void execute();
}
I have built a composite implementation and want to register this object as the bean, annotated with @Named:
@Named
public class Composite implements Abstraction {
private List<Abstraction> list;
@Inject
public Composite(List<Abstraction> list) {
this.list = list;
}
public void execute() {
list.forEach(Abstraction::execute);
}
}
How do I set it up so that the set of implementations of the abstraction gets injected properly into the Composite above? I will have another object that takes the abstraction as a dependency, and I want it to receive the @Named Composite above with the two implementations below injected into its constructor.
public class Implementation1 implements Abstraction {
public void execute() { }
}
public class Implementation2 implements Abstraction {
public void execute() { }
}
If you create a bean for each of your implementations, your example will work out of the box. For example, annotate your implementations with @Named or @Component and mark them for scanning (component-scan their package):
@Configuration
@ComponentScan
public class StackOverflow {
public static void main(String[] args) throws Exception {
AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(StackOverflow.class);
System.out.println(ctx.getBean(Composite.class).list);
}
}
interface Abstraction {
void execute();
}
@Named
class Composite implements Abstraction {
List<Abstraction> list;
@Inject
public Composite(List<Abstraction> list) {
this.list = list;
}
public void execute() {
list.forEach(Abstraction::execute);
}
}
@Named
class Implementation1 implements Abstraction {
public void execute() {
}
}
@Named
class Implementation2 implements Abstraction {
public void execute() {
}
}
The Composite's list will contain both implementations.
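If the iteration order of that injected list matters, you can make it explicit with Spring's @Order annotation on the implementations (the order values below are arbitrary):
@Named
@Order(1)
class Implementation1 implements Abstraction {
public void execute() {
}
}
@Named
@Order(2)
class Implementation2 implements Abstraction {
public void execute() {
}
}
The Composite's injected List<Abstraction> will then follow the declared order.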
Alternatively, since you only have two implementations, you could name their beans and inject them separately. For example
#Component("one")
class Implementation1 implements Abstraction {
public void execute() {
}
}
#Component("two")
class Implementation2 implements Abstraction {
public void execute() {
}
}
and inject them into the Composite:
List<Abstraction> list = new ArrayList<>(2);
@Inject
public Composite(@Qualifier("one") Abstraction one, @Qualifier("two") Abstraction two) {
list.add(one);
list.add(two);
}
I suggest this solution only because the initialization order of the Abstraction beans might mess up your context initialization. For example, if Implementation1 somehow depended on the initialization of Composite, the context would complain. This is rare and you can control it in other ways. Still, being explicit about the beans might be clearer in this case.
When the class hierarchy is not linear, the aspect is not triggered when it is defined on the base interface.
The most interesting part: when the delegating implementation (see the last code block) is added to the parent class of the implementation, the test becomes green (the aspect is triggered as expected).
Question: Why doesn't it work as described in the example, and why does it work with the delegating implementation?
Example (sorry, no shorter example found):
Test:
@Autowired
private TheInterface underTest;
private static boolean aspectCalled;
private static boolean implementationCalled;
@Test
public void aspectTest() throws Exception {
aspectCalled = false;
implementationCalled = false;
underTest.doSomething();
assertTrue("Implementation not called!", implementationCalled);
assertTrue("Aspect not called!", aspectCalled);
}
Aspect:
@Aspect
@Component
public static class MyAspect {
@Before("execution(* *..SpecializedInterface+.doSomething())")
public void applyAspect() {
aspectCalled = true;
}
}
Interfaces:
public static interface TheInterface {
void doSomething();
}
public static interface SpecializedInterface extends TheInterface {
// inherits doSomething
// defines some other methods
}
Abstract implementations (Template pattern):
public static abstract class BaseTemplate implements TheInterface {
abstract void doOneStep();
@Override
public void doSomething() {
// do some stuff and
doOneStep();
}
}
public static abstract class SpecializedTemplate extends BaseTemplate implements SpecializedInterface {
// some other methods
}
Implementing bean:
@Component
public static class TemplateImplementation extends SpecializedTemplate {
@Override
void doOneStep() {
implementationCalled = true;
}
}
(If you are interested, the test setup:)
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = MyConfig.class)
public class AopTest {
@Configuration
@EnableAspectJAutoProxy
@ComponentScan(basePackageClasses = AopTest.class)
public static class MyConfig {
}
...
Ugly workaround: add this snippet to SpecializedTemplate
@Override
public void doSomething() {
super.doSomething();
}
So, why is this workaround necessary?
Thomas Stets has already explained the bytecode and JVM details, so I will just provide a solution to your problem; see also my answer to a very similar question.
@Aspect
public static class MyAspect {
@Before("execution(* *..TheInterface+.doSomething()) && target(specializedInterface)")
public void applyAspect(SpecializedInterface specializedInterface) {
aspectCalled = true;
}
}
That is, your pointcut targets the base interface that actually declares the method, and then you limit the target to the specialised sub-interface of your choice. This should make your test green.
I'm developing an application which builds on a class written by another developer (for which I do not have the source).
I wish to use all of the functionality of said class but also to extend it with additional functionality. Ordinarily, to achieve this, I would have defined an interface (MyInterface) and extended the external class (TheirClass) in my own class (MyClass) while implementing MyInterface.
public interface TheirClassInterface {
public void theirMethod1();
public void theirMethod2();
}
public class TheirClass implements TheirClassInterface {
public void theirMethod1() { ... }
public void theirMethod2() { ... }
}
public class TheirOtherClass {
public void theirOtherMethod1(TheirClassInterface o) { ... }
}
public interface MyInterface {
public void myMethod1();
}
public class MyClass extends TheirClass implements MyInterface {
public void myMethod1() { ... }
}
public class MyNewClass extends MyClass {
public void MyNewClassMethod() { ... }
}
The problem is complicated by the fact that:
I now wish to create a new class (MyNewClass) which adds additional functionality to MyClass but I don't want my code to be dependent on TheirClass.
I wish to be able to use my class as a parameter to the method of TheirOtherClass.
To combat this, I refactored my code to use composition instead of inheritance, implementing TheirClassInterface. This works, but it requires me to implement many methods and delegate them to theirClassObject (in reality, TheirClassInterface contains a very large number of methods).
public interface TheirClassInterface {
public void theirMethod1();
public void theirMethod2();
}
public class TheirClass implements TheirClassInterface {
public void theirMethod1() { ... }
public void theirMethod2() { ... }
}
public class TheirOtherClass {
public void theirOtherMethod1(TheirClassInterface o) { ... }
}
public interface MyInterface {
public void myMethod1();
}
public class MyClass implements TheirClassInterface, MyInterface {
private TheirClass theirClassObject;
public void myMethod1() { ... }
public void theirMethod1() { theirClassObject.theirMethod1(); }
public void theirMethod2() { theirClassObject.theirMethod2(); }
}
public class MyNewClass extends MyClass {
public void MyNewClassMethod() { ... }
}
My question is whether my approach is appropriate in this case, and whether it could be improved upon, as it seems to me that my code uses an excessive amount of delegation to get the job done.
Many thanks for any guidance anyone can give on this.
Danny
First, as Java is a strongly-typed, single-inheritance language, you cannot escape the delegation.
But you can avoid having to write a lot of delegation code by using a dirty little trick with proxies and reflection.
The code follows:
public interface Interface1 {
void m1();
}
public interface Interface2 {
void m2();
}
public class Class1 implements Interface1 {
public void m1() {
System.out.println(1);
}
}
public class Class2 implements Interface2 {
public void m2() {
System.out.println(2);
}
}
public interface MixinInterface extends Interface1, Interface2 {
}
And this is how the magic happens
package j.with.pseudo.multiple.inheritance;
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Method;
import java.lang.reflect.Proxy;
public class MixinBuilder {
public static Object buildMixed(Class _interface, Object... impls){
InvocationHandler h = new MixinHandler(_interface.getInterfaces(), impls);
return Proxy.newProxyInstance(MixinBuilder.class.getClassLoader(),
new Class[]{_interface}, h);
}
public static void main(String[] args) {
Class1 o1 = new Class1();
Class2 o2 = new Class2();
MixinInterface almost_like_multiple_inheritance_guy =
(MixinInterface) buildMixed(MixinInterface.class, o1, o2);
almost_like_multiple_inheritance_guy.m1();
almost_like_multiple_inheritance_guy.m2();
}
private static class MixinHandler implements InvocationHandler{
private Class[] interfaces;
private Object[] impls;
public MixinHandler(Class[] interfaces, Object[] impls) {
this.interfaces = interfaces;
this.impls = impls;
}
public Object invoke(Object proxy, Method method, Object[] args)
throws Throwable {
int i=0;
for(Class _interface : interfaces){
if(method.getDeclaringClass().isAssignableFrom(_interface)){
return method.invoke(impls[i], args);
}
i++;
}
throw new RuntimeException("Method not found: "+method);
}
}
}
Pretty cool huh? :-)
You can't not-depend on a class if you're extending it; it's like having a definition of Human that does not depend on the definition of Mammal. Your options are to rewrite everything in the parent or to depend on it.
Many thanks for the answers so far. I've come up with a solution which I think seems reasonable and allows me to fully encapsulate the foreign class.
At the moment I've returned to the method discussed in the first block of code (repeated and extended below) and am now implementing my MyInterface interface in MyNewClass, delegating all interface operations to a composed object. The object to delegate to is decided at runtime by calling a static method on a factory.
public interface TheirClassInterface {
public void theirMethod1();
public void theirMethod2();
}
public class TheirClass implements TheirClassInterface {
public void theirMethod1() { ... }
public void theirMethod2() { ... }
}
public class TheirOtherClass {
public void theirOtherMethod1(TheirClassInterface o) { ... }
}
public interface MyInterface {
public void myMethod1();
}
public class MyClass extends TheirClass implements MyInterface {
public void myMethod1() { ... }
}
public class MyNewClass implements MyInterface {
private MyInterface myObject;
public MyNewClass() {
myObject = MyClassFactory.createMyClass();
}
public void myMethod1() {
myObject.myMethod1();
}
public void MyNewClassMethod() { ... }
}
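For completeness, the factory referenced above could be as simple as the following (just a sketch; the real decision logic is up to the use case):
public class MyClassFactory {
// Decides at runtime which MyInterface implementation to hand out.
public static MyInterface createMyClass() {
// e.g. based on configuration, a system property, etc.
return new MyClass();
}
}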
Once again, thanks for the ideas. I'm now going to look into them all and see if I can use them to improve my code.
Cheers,
Danny
I am trying to implement a generic abstract class in my service layer. I am already using a similar pattern in my DAO layer and it works fine. I found a working example in the Spring in Practice v8 ebook. I am wondering if there is a way to autowire the following working code. (The code works, but I have to call my helper method setDao() before I use any other method in the class.)
Test class:
public class App {
public static void main(String[] args) {
ApplicationContext appContext = new ClassPathXmlApplicationContext("classpath:/applicationContext.xml");
MyService service = (MyService)appContext.getBean("myService");
service.setDao();
Heading detail = new Heading();
detail.setName("hello");
service.save(detail);
Heading dos = service.findById(Long.valueOf(1));
System.out.println(dos);
}
}
MyServiceImpl class
#Service("myService")
public class MyServiceImpl extends AbstractServiceImpl<Heading> implements HeadingService {
#Autowired
private HeadingDao headingDao;
public void setHeadingDao(HeadingDao headingDao) {
this.headingDao = headingDao;
}
public void setDao() {
super.setDao(this.headingDao);
}
}
MyService interface
public interface HeadingService extends AbstractService<Heading> {
public void setDao();
}
AbstractServiceImpl class
@Service
public abstract class AbstractServiceImpl<T extends Object> implements AbstractService<T> {
private AbstractDao<T> dao;
public void setDao(AbstractDao<T> dao) {
this.dao = dao;
}
public void save(T t) {
dao.save(t);
}
public T findById(Long id) {
return (T)dao.findById(id);
}
public List<T> findAll() {
return dao.findAll();
}
public void update(T t) {
dao.update(t);
}
public void delete(T t) {
dao.delete(t);
}
public long count() {
return dao.count();
}
}
AbstractService interface
public interface AbstractService<T extends Object> {
public void save(T t);
public T findById(Long id);
public List<T> findAll();
public void update(T t);
public void delete(T t);
public long count();
}
Instead of having to call a method (setDao()) to allow your subclass to pass the DAO reference to your superclass, why not reverse the direction and force the subclass to supply the DAO to the superclass?
For example:
public abstract class AbstractServiceImpl<T extends Object> implements AbstractService<T> {
private AbstractDao<T> dao;
abstract AbstractDao<T> getDao();
public void save(T t) {
getDao().save(t);
}
}
public class FooServiceImpl extends AbstractServiceImpl<Foo> {
@Autowired
private FooDao fooDao;
@Override
public AbstractDao<Foo> getDao() {
return fooDao;
}
}
There is no need to call a method externally to kick the reference-passing-chain into action.
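With that wiring in place, usage is straightforward; a small sketch, assuming a Foo entity and a FooDao bean exist (FooClient is a hypothetical caller):
@Component
public class FooClient {
@Autowired
private FooServiceImpl fooService;
public void demo() {
fooService.save(new Foo()); // AbstractServiceImpl.save() reaches the DAO through getDao()
}
}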
Try making your MyServiceImpl implement InitializingBean, and change your setDao() method to be afterPropertiesSet(). It will automatically get called after the framework is done calling setters.
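A minimal sketch of that suggestion, using Spring's InitializingBean callback (the setDao() method on HeadingService would then no longer be needed):
@Service("myService")
public class MyServiceImpl extends AbstractServiceImpl<Heading> implements InitializingBean {
@Autowired
private HeadingDao headingDao;
@Override
public void afterPropertiesSet() {
// Called by Spring once injection is complete, so no external setDao() call is needed.
super.setDao(this.headingDao);
}
}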
Or (even simpler), just call setDao() in your setHeadingDao(...) method.
Upgrade the Spring Framework version to 4, and the problem will be solved.
Check this page.