I am looking for a simple way to control the instantiation of a declarative service.
What I want is something like a factory which is asked to return an instance of a component, but where I can control the instantiation.
So far I have only found ComponentFactory, where I can call newInstance with a given dictionary. But what I really want is something like this, assuming IComponent is the declarative service interface and MyComponent is the implementation class:
public class MyComponentFactory implements ? {
    public IComponent newInstance() {
        return new MyComponent("firstParameter", "secondParameter");
    }
}
Is something like this possible with Declarative Services, or do I need to fall back to my own service registration in bundle activator code?
Thanks in advance
DS does not provide the level of instance-creation indirection you are looking for. DS always uses the public no-args constructor and then calls the specified activate method to complete the instance initialization.
One alternative for such service instantiation control is to use a combination of DS and Configuration Admin (CM).
You must set configuration-policy="require" in the DS XML and use CM to create a configuration that is used to pass a Dictionary containing all the properties you need (only types supported by DS, of course) to the created service instance. You can even use a properties file plus Felix File Install for that configuration.
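For illustration, here is a rough sketch of that combination in Java. The PID com.example.mycomponent, the activate signature and the configurer class are assumptions for the example; only the property names come from the question.

import java.io.IOException;
import java.util.Hashtable;
import java.util.Map;

import org.osgi.service.cm.Configuration;
import org.osgi.service.cm.ConfigurationAdmin;

// Component side: DS still creates MyComponent with its no-args constructor, but
// because the component declares configuration-policy="require" it is only
// activated once a configuration for its PID exists, and the properties arrive here.
class MyComponent implements IComponent {

    private String first;
    private String second;

    protected void activate(Map<String, Object> properties) {
        this.first = (String) properties.get("firstParameter");
        this.second = (String) properties.get("secondParameter");
    }
}

// Configuring side: push the properties through Configuration Admin.
class MyComponentConfigurer {

    void configure(ConfigurationAdmin cm) throws IOException {
        Configuration config = cm.getConfiguration("com.example.mycomponent", null);
        Hashtable<String, Object> props = new Hashtable<>();
        props.put("firstParameter", "firstValue");
        props.put("secondParameter", "secondValue");
        config.update(props); // DS activates MyComponent with these properties
    }
}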
If that is not enough, you have another alternative: track newly created services and call a setup(args) method just after you have added the configuration using CM.
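A minimal sketch of that tracking approach; the setup method and its arguments are illustrative, not part of DS.

import org.osgi.framework.BundleContext;
import org.osgi.framework.ServiceReference;
import org.osgi.util.tracker.ServiceTracker;

// Tracks IComponent services as DS registers them and finishes their initialization.
public class ComponentSetupTracker extends ServiceTracker<IComponent, IComponent> {

    public ComponentSetupTracker(BundleContext context) {
        super(context, IComponent.class, null);
    }

    @Override
    public IComponent addingService(ServiceReference<IComponent> reference) {
        IComponent component = super.addingService(reference);
        // Runs once DS has created and activated the instance; complete any
        // initialization that cannot be expressed as configuration properties.
        ((MyComponent) component).setup("firstParameter", "secondParameter");
        return component;
    }
}

The tracker would be created and opened in a bundle activator, e.g. new ComponentSetupTracker(context).open().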
I am trying to understand what the correct usage of a Spring prototype bean would be.
Maybe the following code sample will help you understand my dilemma:
List<ClassA> caList = new ArrayList<ClassA>();
for (String name : nameList) {
    ClassA ca = new ClassA();
    // or shall I use a prototype bean? Using method lookup I can inject the dependencies of ClassA:
    // ClassA ca = getPrototypeClassA();
    ca.setName(name);
    caList.add(ca);
}
So my exact question is: in this scenario, should I go with method injection or with the new() operator?
Please provide your view with justification.
You can make use of either way, because ultimately the client code, rather than the Spring container, is responsible for handling the life-cycle of a prototype bean.
According to Spring-docs,
In some respects, you can think of the Spring container's role when
talking about a prototype-scoped bean as somewhat of a replacement for
the Java 'new' operator. All lifecycle aspects past that point have to
be handled by the client.
Spring does not manage the complete lifecycle of a prototype bean: the
container instantiates, configures, decorates and otherwise assembles
a prototype object, hands it to the client and then has no further
knowledge of that prototype instance. It is the responsibility of the
client code to clean up prototype scoped objects and release any
expensive resources that the prototype bean(s) are holding onto.
If ClassA needs to have @Autowired references, then go for a prototype bean.
Otherwise a simple POJO (that the Spring container is unaware of) will do.
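To make the prototype route concrete, here is a rough sketch using @Lookup method injection (available since Spring 4.1); the FooBean collaborator is only illustrative.

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Lookup;
import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Component;

// ClassA is prototype-scoped, so each lookup yields a fresh, fully autowired instance.
@Component
@Scope("prototype")
class ClassA {

    @Autowired
    private FooBean someDependency; // illustrative collaborator

    private String name;

    public void setName(String name) {
        this.name = name;
    }
}

// The caller declares an abstract lookup method; Spring overrides it at runtime
// so that every call returns a new prototype instance.
@Component
abstract class PrototypeClassAProvider {

    @Lookup
    public abstract ClassA getPrototypeClassA();
}

The loop from the question would then call getPrototypeClassA() instead of new ClassA().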
It seems that your instances need some runtime values in order to initialise properly. In that case, it depends on whether you need Spring features such as AOP for the ClassA instances. If yes, go with method injection. If not, you can consider using the factory pattern, which is much more OO and cleaner to me.
Something like the following; you should get the idea:
@Component
public class FactoryForClassA {

    @Autowired
    private FooBean someDependencyForClassA;

    public ClassA create(String name) {
        ClassA a = new ClassA(someDependencyForClassA);
        a.setName(name);
        return a;
    }
}
And the client code:
@Autowired
private FactoryForClassA factoryForClassA;

List<ClassA> caList = new ArrayList<ClassA>();
for (String name : nameList) {
    ClassA a = factoryForClassA.create(name);
    caList.add(a);
}
I want to define an annotation like @PlatformRelated. Once an interface is marked with it, a proxy bean for that interface should appear in the Spring context, and this proxy bean should be marked with @Priority. I want the proxy to invoke a different implementation according to the @KeyPrameter key parameter. And I still want to use Spring features like @Async, @Transactional, etc. in my Implement1 and Implement2.
@PlatformRelated
interface MyInterface {
    void method(@KeyPrameter String parameter);
}

@Component
class Implement1 implements MyInterface {
    public void method(String parameter) {
        //do something 111
    }
}

@Component
class Implement2 implements MyInterface {
    public void method(String parameter) {
        //do something 222
    }
}

@Service
class BusinessService {

    @Autowired
    private MyInterface myInterface;

    public void doSomething() {
        myInterface.method("key1");
        //Implement1 work
        myInterface.method("key2");
        //Implement2 work
    }
}
Do you have any good ideas on how to accomplish this?
I must admit I haven't totally understood the meaning of @Priority here; however, I can say that if you want to implement this feature in Spring, you should probably take a look at bean post processors.
BeanPostProcessors are essentially a hook into the bean creation process in Spring, intended for altering bean behavior.
Among other things, they allow wrapping the underlying bean into a proxy (CGLIB, java.lang.Proxy if you're working with interfaces, or even Spring AOP used programmatically). These proxies can hook into method execution, read your annotations (like the mentioned @KeyPrameter) and execute code in a way similar to the aspect code you already make use of.
Not all bean post processors wrap the bean into a proxy. For example, if you want to implement a BPP that handles @Autowired, you return the same bean and just "inject" (read: set by reflection) its dependencies. On the other hand, if you want to implement @Transactional behavior with a BPP, then yes, you should wrap the bean into a proxy that takes care of transaction management before and after the method execution.
It's totally OK to have a Spring bean that gets "altered" by many post processors: some of them wrap it into a proxy, others just modify and return the same bean. If there are many BPPs that wrap the bean into a proxy, we get "proxy inside proxy inside proxy" (you get the idea). Each layer of proxy handles one specific behavior.
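A rough sketch of such a wrapping post processor, reusing the annotation names from the question; the key-based lookup of the target implementation is deliberately left out.

import java.lang.reflect.Method;
import java.lang.reflect.Proxy;

import org.springframework.beans.BeansException;
import org.springframework.beans.factory.config.BeanPostProcessor;

// Wraps every bean whose interface carries @PlatformRelated into a JDK dynamic proxy.
public class PlatformRelatedPostProcessor implements BeanPostProcessor {

    @Override
    public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
        return bean;
    }

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) throws BeansException {
        for (Class<?> iface : bean.getClass().getInterfaces()) {
            if (iface.isAnnotationPresent(PlatformRelated.class)) {
                return Proxy.newProxyInstance(
                        iface.getClassLoader(),
                        new Class<?>[] { iface },
                        (proxy, method, args) -> {
                            // Inspect the @KeyPrameter argument here and pick the matching
                            // implementation (Implement1, Implement2, ...).
                            Object target = selectImplementationFor(method, args);
                            return method.invoke(target, args);
                        });
            }
        }
        return bean;
    }

    // Placeholder for the key-to-implementation resolution, e.g. a map of candidate
    // beans keyed by the values each implementation understands.
    private Object selectImplementationFor(Method method, Object[] args) {
        throw new UnsupportedOperationException("key-based lookup not shown");
    }
}

Each such proxy adds one layer of behavior and composes with whatever the other post processors do, giving the layering described above.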
As an example, I suggest you take a look at existing Spring post processors or, for instance, the source code of the following library: Spring Boot metering integration library.
This library contains some implementations of post processors that allow metrics infrastructure integration by defining annotations on methods of Spring Beans.
I'm wondering if there is a possibility to recreate a bean which was already created from Java configuration on web app startup.
What I want to do is to reconfigure bean settings.
For example, I create a new bean with a path to a database:
@Bean
public TestBean getTestBean() {
    TestBean tb = new TestBean("some_path_taken_from_external_point");
    return tb;
}
At runtime I want to change the path. Let's assume that this bean doesn't have a setter method for the database path.
I will have some kind of event and a listener for it. The listener should reinitialize the TestBean with the new path.
Is this possible?
I was thinking of some kind of wrapper. In that case I would have a class TestBeanWrapper with a method get() that returns the TestBean instance, and a method recreate(String path) that creates a new object with the given path.
I'm not sure whether such a wrapper would work for me, as TestBean is a class from an external library, and I'm not sure whether it is injected somewhere (but probably it is not).
It is more likely that other beans rely on TestBean, so they would also have to be reinitialized (in case they don't have setters for my TestBean).
Is this even possible in Spring (4.1)? What is the best approach for such cases?
So I'm still unsure why you would want to change the path but I have 2 suggestions:
1. Look at setting the scope on the Bean.
By setting the scope on the bean, you can regenerate the bean based on context (see the sketch after these suggestions). Look at Bean Scopes for more information.
2. Look at maybe using a controller or a service.
Controllers and services allow getters and setters which may give you more control.
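To illustrate the first suggestion, here is a hedged sketch of a prototype-scoped TestBean whose path is read at creation time; the path holder and updatePath method are assumptions, since the question does not say where the new path comes from.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Scope;

@Configuration
public class TestBeanConfig {

    // Updated by whatever listener reacts to your event (illustrative).
    private volatile String currentPath = "some_path_taken_from_external_point";

    public void updatePath(String newPath) {
        this.currentPath = newPath;
    }

    // Because the bean is prototype-scoped, every request for a TestBean creates a
    // new instance using whatever path is current at that moment.
    @Bean
    @Scope("prototype")
    public TestBean getTestBean() {
        return new TestBean(currentPath);
    }
}

Note that beans which were already constructed with the old TestBean keep holding the old instance, which is exactly the reinitialization problem the question mentions.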
Writing a small web app in Grails, I encountered a problem with global objects. I have a class which runs threads - an ExecutorService with queuing.
The problem is where to create an object of this class so that it is available in a controller.
I've tried creating it at init (BootStrap), but then there's no way of getting its instance anywhere else.
In general, what I need is a single instance of an object for the whole application, with access from the model and/or controller.
The standard way to achieve this is to declare the object as a Spring bean in grails-app/conf/spring/resources.groovy
threadPool(java.util.concurrent.Executors) { bean ->
bean.factoryMethod = "newCachedThreadPool"
}
Then in controllers/services/etc. you can inject this bean the same as you would with Grails services, i.e.
def threadPool
But in this case you may find it easier simply to use the executor plugin, which defines such a bean for you and handles the intricacies of ensuring there is a valid GORM session available to the background tasks.
Why not wrap your ExecutorService inside a Spring bean, or use something like:
grailsApplication.controllerClasses.each { controller ->
    controller.metaClass.executorService = { ->
        executorService
    }
}
Actually, I came up with something before checking the answers.
For a given domain class (Example) and controller (ExampleController), create an ExampleService and simply include all the needed things there. Set its behaviour to singleton (the default).
Then in the controller, inject the instance as follows:
def exampleService
I'm trying to change some legacy code to use DI with the Spring framework. I have a concrete case for which I'm wondering what the most proper way to implement it would be.
It is a Java desktop application. There is a DataManager interface used to query / change data from the data store. Currently there is only one implementation, which uses an XML file as the store, but in the future it is possible that a SQL implementation will be added. Also, for unit testing I may need to mock it.
Currently every piece of code that needs the data manager retrieves it by using a factory. Here is the source code of the factory:
public class DataManagerFactory
{
    private static DataManagerIfc dataManager;

    public static DataManagerIfc getInstance()
    {
        // Let's assume synchronization is not needed
        if (dataManager == null)
            dataManager = new XMLFileDataManager();
        return dataManager;
    }
}
Now I see 3 ways to change the application to use DI and Spring.
I. Inject the dependency only in the factory and do not change any other code.
Here is the new code:
public class DataManagerFactory
{
    private DataManagerIfc dataManager;

    public DataManagerFactory(DataManagerIfc dataManager)
    {
        this.dataManager = dataManager;
    }

    public DataManagerIfc getDataManager()
    {
        return dataManager;
    }

    public static DataManagerIfc getInstance()
    {
        return getFactoryInstance().getDataManager();
    }

    public static DataManagerFactory getFactoryInstance()
    {
        ApplicationContext context =
            new ClassPathXmlApplicationContext(new String[] {"com/mypackage/SpringConfig.xml"});
        return context.getBean(DataManagerFactory.class);
    }
}
And the XML with the bean description:
<bean id="dataManagerFactory"
class="com.mypackage.DataManagerFactory">
<constructor-arg ref="xmlFileDataManager"/>
</bean>
<bean id="xmlFileDataManager"
class="com.mypackage.datamanagers.xmlfiledatamanager.XMLFileDataManager">
</bean>
II. Change every class that uses the data manager so that it takes it through the constructor and stores it as a field. Make Spring bean definitions only for the "root" classes where the chain of creation starts.
III. Same as II, but create a Spring bean definition for every class that uses the data manager and instantiate every such class through the Spring IoC container.
As I'm new to the DI concept, I will appreciate any advice on what the correct, "best practice" solution would be.
Many thanks in advance.
Use option 3.
The first option keeps your code untestable. You won't be able to easily mock the static factory method so that it returns a mock DataManager.
The second option will force you to have the root classes know all the dependencies of all the non-root classes in order to make the code testable.
The third option really uses dependency injection, where each bean only knows about its direct dependencies and is injected by the DI container.
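For illustration, a minimal sketch of what option III means for one consumer of the data manager; ReportGenerator is an invented name, and under option III it would get its own bean definition next to xmlFileDataManager.

// The consumer knows only its direct dependency; it never touches the factory
// or the Spring API itself.
public class ReportGenerator {

    private final DataManagerIfc dataManager;

    public ReportGenerator(DataManagerIfc dataManager) {
        this.dataManager = dataManager;
    }

    public void generate() {
        // use dataManager to query / change data
    }
}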
Well... why did you write the factory in the first place? Spring is not intended to make you change how you write code (not just to suit Spring, that is), so keeping the factory is correct, as it uses a well-known pattern. Injecting the dependency into the factory will retain that behaviour.
Option 3 is the correct route to take. By using such a configuration you can usefully take components of your configuration and use them in new configurations, and everything will work as expected.
As a rule of thumb, I would expect one call to Spring to instantiate the application context and get the top-level bean. I wouldn't expect to make repeated calls to the Spring framework to get multiple beans. Everything should be injected at the correct level to reflect responsibilities etc.
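A hedged sketch of that single bootstrap call; Application stands in for whatever your real top-level bean is.

import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class Main {

    public static void main(String[] args) {
        // Build the context once at startup...
        ApplicationContext context =
                new ClassPathXmlApplicationContext("com/mypackage/SpringConfig.xml");
        // ...and fetch only the top-level bean; everything below it is wired by Spring.
        Application app = context.getBean(Application.class);
        app.run();
    }
}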
Beware (since you're new to this) of plumbing your data manager into every class available! This is quite a common mistake to make, and if you've not abstracted out and centralised responsibilities sufficiently, you'll find you're configuring classes with lots of managers. When you see yourself doing this, it's a good time to step back and look at your abstractions and componentisation.