I've got a singleton Spring bean that creates a couple of tasks (java.util.concurrent.Callables) at runtime to do its work in parallel. Right now, the Callables are defined as inner classes in the singleton bean, and the singleton bean creates them simply by instantiating them with new Task(in), where in is a parameter known only at runtime.
Now I want to extract the inner Task class to a regular top-level class because I want to make the Task's call() method transactional, so I need it to be a Spring bean.
I guess I need to give my singleton some kind of factory of Tasks, but the tasks have to be prototype Spring beans that take a runtime value as a constructor parameter. How can I accomplish this?
Spring's bean factory and new are mutually exclusive. You can't call new and expect that object to be under Spring's control.
My suggestion is to inject those Tasks into the Singleton. Make them Spring beans, too.
You should recognize that the Task itself isn't going to be transactional, but its dependencies can be. Inject those into the Tasks and let Spring manage the transactions.
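A minimal sketch of that idea; the class and method names (TransactionalWorker, process) are illustrative, not from the question. The singleton still creates the Task with new, but the transaction lives in the injected Spring bean:

import java.util.concurrent.Callable;

import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// Plain object created with `new` by the singleton; not a Spring bean itself.
public class Task implements Callable<String> {

    private final TransactionalWorker worker; // Spring bean, injected into the singleton and passed in here
    private final String in;                  // the runtime value

    public Task(TransactionalWorker worker, String in) {
        this.worker = worker;
        this.in = in;
    }

    @Override
    public String call() {
        return worker.process(in); // the transaction starts and ends inside the Spring bean
    }
}

// Spring-managed dependency; @Transactional works because Spring proxies this bean.
@Service
class TransactionalWorker {

    @Transactional
    public String process(String in) {
        // the real (transactional) work would go here
        return in.toUpperCase();
    }
}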
Another approach might be to use Spring's @Configurable annotation with load-time weaving; this way you can use new (instead of a bean factory) to create wired Callables at runtime:
@Configurable
public class WiredTask implements Callable<Result> {

    @Autowired
    private TaskExecutor executor;

    private String in; // the runtime value

    public WiredTask(String in) {
        this.in = in;
    }

    @Override
    public Result call() {
        return executor.run(in);
    }
}
@Component
@Scope("prototype")
public class TaskExecutor {

    @Transactional
    public Result run(String in) {
        ...
    }
}
// Example of how you might then use it in your singleton...
ExecutorService pool = Executors.newFixedThreadPool(3);
WiredTask task = new WiredTask("payload");
Future<Result> result = pool.submit(task);
See this article for more info. Note, you cannot use @Configurable and @Transactional in the same bean, hence the need for two classes. For that reason, this might not be the ideal solution if you have lots of different implementations of Callable (since you will need two classes for each one).
Your singleton bean can implement BeanFactoryAware and look up beans from the containing Spring factory.
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.BeanFactory;
import org.springframework.beans.factory.BeanFactoryAware;

public class MyBeanFactory implements BeanFactoryAware {

    private BeanFactory beanFactory;

    @Override
    public void setBeanFactory(BeanFactory beanFactory) throws BeansException {
        this.beanFactory = beanFactory;
    }

    // Passes the runtime value through to the prototype bean's constructor.
    public Task createTask(Object in) {
        return (Task) beanFactory.getBean("task", in);
    }
}
///////////////
import java.util.concurrent.Callable;

import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Component;
import org.springframework.transaction.annotation.Transactional;

@Component // registers the bean under the name "task" (unless using xml based config)
@Scope(value = "prototype") // tell the bean factory to create a new instance each time
public class Task implements Callable<Object> {

    private final Object in;

    public Task(Object in) {
        this.in = in;
    }

    @Override
    @Transactional
    public Object call() throws Exception {
        // do real work
        return in;
    }
}
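A sketch of how the singleton might then use this factory (the surrounding ParallelWorker class, the thread pool setup, and the result handling are illustrative, not from the question; MyBeanFactory itself must of course be registered as a bean):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class ParallelWorker {

    @Autowired
    private MyBeanFactory taskFactory;

    public Object runInParallel(Object in) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(3);
        try {
            // each createTask() call yields a fresh, Spring-managed, prototype-scoped Task
            Future<Object> result = pool.submit(taskFactory.createTask(in));
            return result.get(); // call() runs through the Spring transaction proxy
        } finally {
            pool.shutdown();
        }
    }
}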
///
Related
For my design I need a controllable scheduler. Spring Boot offers an @Scheduled annotation, but that is too simplified and does not give me granular control.
So I wanted to implement my own scheduler manually.
This is the class I created:
@Slf4j
@Service
public class JobExecutor {

    private static ScheduledExecutorService jobExecutor;
    private static Environment env;

    @Autowired
    private JobExecutor(Environment env) {
        JobExecutor.env = env;
    }

    public static ScheduledExecutorService INSTANCE() {
        if (null == jobExecutor) {
            synchronized (JobExecutor.class) {
                if (null == jobExecutor) {
                    jobExecutor = Executors.newScheduledThreadPool(
                            Integer.parseInt(Objects.requireNonNull(env.getProperty("scheduler.jobs"))));
                }
            }
        }
        return jobExecutor;
    }
}
With this approach I could simply call the static method to get a single instance.
Is this the correct approach for a scheduler? I need to start, stop, and shut down the jobs. I tried Guava's AbstractScheduledService, but that does not seem to work.
This is not the correct approach for creating a singleton, because double-checked locking on a non-volatile field is broken. You're using Spring, so a) your JobExecutor will be a singleton anyway, and b) will only be created if it is needed. You might as well, therefore, create your executor instance in the constructor and get rid of those static methods.
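A sketch of that first suggestion: keep your class, drop the static state, and build the executor once in the constructor (the property name scheduler.jobs is taken from your code):

import java.util.Objects;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.env.Environment;
import org.springframework.stereotype.Service;

@Service
public class JobExecutor {

    private final ScheduledExecutorService scheduler;

    @Autowired
    public JobExecutor(Environment env) {
        // runs exactly once, because the Spring bean itself is a singleton
        this.scheduler = Executors.newScheduledThreadPool(
                Integer.parseInt(Objects.requireNonNull(env.getProperty("scheduler.jobs"))));
    }

    public ScheduledExecutorService getScheduler() {
        return scheduler;
    }
}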
Even better, you could create schedulers as named beans, and then inject them into classes where you want them:
@Configuration
public class ExecutorConfiguration {

    @Bean
    public ScheduledExecutorService jobExecutor(@Value("${scheduler.jobs}") int jobs) {
        return Executors.newScheduledThreadPool(jobs);
    }
}
This says that whenever another component needs a ScheduledExecutorService, Spring should call this jobExecutor() method; Spring will automatically populate the jobs parameter from the scheduler.jobs property because of the @Value.
You can then inject your executor wherever you need it, for example with constructor injection (handily you're already using Lombok, so the amount of boilerplate is minimised):
@Service
@AllArgsConstructor
public class MyThingThatNeedsAScheduler {

    private final ScheduledExecutorService jobExecutor;

    // methods here...
}
You can also use setter or member injection, if you want.
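For example, setter injection would look something like this (the class name here is hypothetical):

import java.util.concurrent.ScheduledExecutorService;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class MyOtherThing {

    private ScheduledExecutorService jobExecutor;

    @Autowired
    public void setJobExecutor(ScheduledExecutorService jobExecutor) {
        this.jobExecutor = jobExecutor;
    }
}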
I am new to Spring Boot and to using @Autowired for dependency injection. We can use @Autowired directly to instantiate a class that has no constructor or a default parameterless constructor. But what if a class has parameters in its constructor and I would like to instantiate it at runtime, conditionally and automatically? Is it possible to do that?
For example
@Component
public class SomeContext {

    private String name;

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}

@Component
public class SomeBuilder {

    private final SomeContext ctx;

    @Autowired
    public SomeBuilder(SomeContext ctx) {
        this.ctx = ctx;
    }

    public void test() {
        System.out.println("ctx name: " + ctx.getName());
    }
}
@Service
public class SomeService {

    @Autowired
    SomeBuilder someBuilder;

    public void run(SomeContext ctx) {
        if (ctx != null) {
            // I want someBuilder to be initiated here in some way with my input ctx,
            // but NOT by calling the constructor with new, like below:
            // someBuilder = new SomeBuilder(ctx);
            someBuilder.test(); // prints "ctx name: null"; I would expect "ctx name: someUser" if ctx were injected into someBuilder in any possible way
        }
    }
}
@RestController
public class HelloWorldController {

    @Autowired
    SomeService someService;

    @RequestMapping("/")
    public String hello() {
        SomeContext someContext = new SomeContext();
        someContext.setName("someUser");
        someService.run(someContext);
        return "Hello springboot";
    }
}
I'm not sure I've got your question right, but from the code it looks like you really want to create a new instance of SomeBuilder every time you call the run method of SomeService.
If so, I think the first thing to understand is that, in general, the injection magic happens only if the object is managed by Spring itself. That is, if Spring creates the object of the class, it will inject stuff into it; otherwise you're on your own here.
The next thing to understand is that if you have an object of class SomeBuilder managed by Spring and you want to inject SomeContext into it, this SomeContext instance has to be managed by Spring as well.
Bottom line: Spring can deal only with objects that it manages. These objects are stored in a 'global registry' of all the objects, called the ApplicationContext in Spring.
Now Spring has a concept of prototype scope vs. singleton scope. By default all the beans are singletons; however, you can easily alter this behavior. This has two interesting consequences:
You can create prototype objects that are injected into the singleton upon each invocation (of the run method in your case), so the SomeBuilder can, and I believe should, be a prototype
Prototype objects are not stored in the application context, so the capabilities for injecting stuff into them at runtime are rather limited
With all this in mind:
If you create SomeContext like you do in the controller, it's not managed by Spring, so you can't use Spring's injection as is to get it into the builder.
The builder is a singleton, so if you inject it with a regular @Autowired into another singleton (SomeService in your case), you'll have to deal with the same instance of the builder object - think about concurrent access to the run method of SomeService and you'll understand that this solution is not really a good one.
So these are the "inaccuracies" in the presented solution.
Now, in terms of solution, you can:
Option 1
Don't manage builders in Spring; not everything has to be managed by Spring. In this case you'll keep your code as it is now.
Option 2
And this is the real solution, although a pretty advanced one:
Use Java configuration to create prototype beans at runtime, with the ability to pass parameters into the bean.
Here is an example:
// Note, I've removed all the annotations; I'll use Java configuration for that, read below...
public class SomeBuilder {

    private final SomeContext ctx;

    public SomeBuilder(SomeContext ctx) {
        this.ctx = ctx;
    }

    public void test() {
        System.out.println("ctx name: " + ctx.getName());
    }
}
Now the class SomeService will also slightly change:
public class SomeService {

    private final Function<SomeContext, SomeBuilder> builderFactory;

    public SomeService(Function<SomeContext, SomeBuilder> builderFactory) {
        this.builderFactory = builderFactory;
    }

    public void run(SomeContext ctx) {
        SomeBuilder someBuilder = builderFactory.apply(ctx);
        someBuilder.test();
    }
}
And now you should "glue" it to Spring in an advanced way with Java configuration:
@Configuration
public class MyConfiguration {

    @Bean
    public Function<SomeContext, SomeBuilder> builderFactory() {
        return ctx -> someBuilder(ctx);
    }

    @Bean
    @Scope(value = "prototype")
    public SomeBuilder someBuilder(SomeContext ctx) {
        return new SomeBuilder(ctx);
    }

    @Bean
    public SomeService someService() {
        return new SomeService(builderFactory());
    }
}
For more details with a really similar example, see this tutorial
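To see the wiring in action, here is a minimal usage sketch (the context bootstrap and the "someUser" value are just examples; every run call gets a fresh, fully wired SomeBuilder):

AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(MyConfiguration.class);
SomeService someService = ctx.getBean(SomeService.class);

SomeContext someContext = new SomeContext();
someContext.setName("someUser");
someService.run(someContext); // prints "ctx name: someUser"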
From the Spring 5 docs
When @Bean methods are declared within classes that are not annotated with @Configuration they are referred to as being processed in a 'lite' mode. Bean methods declared in a @Component or even in a plain old class will be considered 'lite', with a different primary purpose of the containing class and an @Bean method just being a sort of bonus there. For example, service components may expose management views to the container through an additional @Bean method on each applicable component class. In such scenarios, @Bean methods are a simple general-purpose factory method mechanism.

Unlike full @Configuration, lite @Bean methods cannot declare inter-bean dependencies. Instead, they operate on their containing component’s internal state and optionally on arguments that they may declare. Such an @Bean method should therefore not invoke other @Bean methods; each such method is literally just a factory method for a particular bean reference, without any special runtime semantics. The positive side-effect here is that no CGLIB subclassing has to be applied at runtime, so there are no limitations in terms of class design (i.e. the containing class may nevertheless be final etc).

The @Bean methods in a regular Spring component are processed differently than their counterparts inside a Spring @Configuration class. The difference is that @Component classes are not enhanced with CGLIB to intercept the invocation of methods and fields. CGLIB proxying is the means by which invoking methods or fields within @Bean methods in @Configuration classes creates bean metadata references to collaborating objects; such methods are not invoked with normal Java semantics but rather go through the container in order to provide the usual lifecycle management and proxying of Spring beans even when referring to other beans via programmatic calls to @Bean methods. In contrast, invoking a method or field in an @Bean method within a plain @Component class has standard Java semantics, with no special CGLIB processing or other constraints applying.
I would have expected that the following code throws an exception (or that bean1.bean2 is null) and that the init method would not be executed. However, the code below runs fine and prints:
Should never be invoked
Expected null but is ch.litebeans.Bean2@402bba4f
So to me it looks like lite beans behave the same as beans constructed from a @Configuration-annotated class. Can someone point out in which scenario this is not the case?
package ch.litebeans;

import org.springframework.context.annotation.AnnotationConfigApplicationContext;

public class Main {
    public static void main(String[] args) {
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(ApplicationConfig.class);
        Bean1 bean1 = ctx.getBean(Bean1.class);
        System.out.println("Expected null but is " + bean1.getBean2());
    }
}

package ch.litebeans;

import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;

@Configuration
@ComponentScan(basePackages = {"ch.litebeans"})
public class ApplicationConfig {}
package ch.litebeans;

import org.springframework.context.annotation.Bean;
import org.springframework.stereotype.Component;

@Component
public class Factory1 {
    @Bean
    public Bean1 getBean1(Bean2 bean2) {
        return new Bean1(bean2);
    }
}

package ch.litebeans;

import org.springframework.context.annotation.Bean;
import org.springframework.stereotype.Component;

@Component
public class Factory2 {
    @Bean(initMethod = "init")
    public Bean2 getBean2() {
        return new Bean2();
    }
}
package ch.litebeans;

public class Bean1 {
    private Bean2 bean2;

    public Bean1(Bean2 bean2) {
        this.bean2 = bean2;
    }

    public Bean2 getBean2() {
        return bean2;
    }
}

package ch.litebeans;

public class Bean2 {
    public void init() {
        System.out.println("Should never be invoked");
    }
}
EDIT:
Based on the explanation of Mike Hill I added an example demonstrating the difference:
public class BeanLiteRunner {
    public static void main(String[] args) {
        AnnotationConfigApplicationContext acac = new AnnotationConfigApplicationContext(MyComponent.class,
                MyConfiguration.class);
        MyComponent.MyComponentBean1 componentBean1 = acac.getBean(MyComponent.MyComponentBean1.class);
        MyComponent.MyComponentBean1 componentBean2 = acac.getBean(MyComponent.MyComponentBean1.class);
        MyConfiguration.MyConfigurationBean1 configurationBean1 = acac.getBean(MyConfiguration.MyConfigurationBean1.class);
        MyConfiguration.MyConfigurationBean1 configurationBean2 = acac.getBean(MyConfiguration.MyConfigurationBean1.class);
    }
}
@Component
public class MyComponent {

    @Bean
    public MyComponent.MyComponentBean1 getMyComponentBean1() {
        return new MyComponent.MyComponentBean1(getMyComponentBean2());
    }

    @Bean
    public MyComponent.MyComponentBean2 getMyComponentBean2() {
        return new MyComponent.MyComponentBean2();
    }

    public static class MyComponentBean1 {
        public MyComponentBean1(MyComponent.MyComponentBean2 myComponentBean2) {
        }
    }

    public static class MyComponentBean2 {
        public MyComponentBean2() {
            System.out.println("Creating MyComponentBean2");
        }
    }
}
@Configuration
public class MyConfiguration {

    @Bean
    public MyConfigurationBean1 getMyConfigurationBean1() {
        return new MyConfigurationBean1(getMyConfigrationBean2());
    }

    @Bean
    public MyConfigurationBean2 getMyConfigrationBean2() {
        return new MyConfigurationBean2();
    }

    public static class MyConfigurationBean1 {
        public MyConfigurationBean1(MyConfigurationBean2 myConfigurationBean2) {}
    }

    public static class MyConfigurationBean2 {
        public MyConfigurationBean2() {
            System.out.println("Creating MyConfigrationBean2");
        }
    }
}
The output is as expected
> Creating MyComponentBean2
> Creating MyComponentBean2
> Creating MyConfigrationBean2
This is not a bug. Spring's lite bean definitions are automatically added to the context during the component scan. Bean definitions using a factory method (i.e., all @Bean-defined beans) will automatically attempt to autowire parameters using the current context. See ConstructorResolver#instantiateUsingFactoryMethod for more details.
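That parameter autowiring is why your Factory1 receives the container-managed Bean2 (and why its init method runs): in lite mode the supported way to reference another bean from an @Bean method is to declare it as a method parameter, whereas calling the other @Bean method directly just creates an untracked instance. A standalone contrast sketch (LiteFactory and its method names are made up for illustration; Bean1 and Bean2 are the question's classes):

import org.springframework.context.annotation.Bean;
import org.springframework.stereotype.Component;

@Component // lite mode: no CGLIB proxy around these @Bean methods
public class LiteFactory {

    @Bean
    public Bean2 bean2() {
        return new Bean2();
    }

    @Bean
    public Bean1 bean1ViaDirectCall() {
        // plain Java call: creates an extra Bean2 behind the container's back
        // (not the singleton "bean2", and it gets no lifecycle callbacks such as initMethod)
        return new Bean1(bean2());
    }

    @Bean
    public Bean1 bean1ViaParameter(Bean2 bean2) {
        // the parameter is autowired with the container-managed Bean2
        return new Bean1(bean2);
    }
}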
The referenced documentation is perhaps not entirely clear. The primary differentiation between "lite" and "full" bean configurations is strictly the proxying that is done in "full" (@Configuration) mode. Take for example what would happen if we instead defined our beans in a @Configuration class:
@Configuration
public class MyConfiguration {

    @Bean
    public Bean1 bean1() {
        return new Bean1(bean2());
    }

    @Bean
    public Bean2 bean2() {
        return new Bean2();
    }
}
The call to bean2() from within bean1() will actually reference the singleton bean2 instance from the context rather than directly calling the bean2() method as implemented. In fact, it doesn't matter how many bean2() method calls are made from within that configuration class -- the real bean2 method will only be executed one time.
"Lite"-mode does not proxy these methods, meaning that each bean2() method call will actually directly reference the method implementation and will not return the bean that is in the context. Rather, it will create a new separate instance that is not tracked by the context (as is the default Java behavior).
This is an exotic use-case, so it requires some patience to understand and may require exotic solutions.
The Context
I'm making a library to be used with Spring that performs automatic actions upon specific bean instances present in the context, created by @Bean methods and not by @ComponentScan. If at all possible, the beans should be distinguishable not by type, but by other means, preferably annotations on the factory method.
Here's the ideal case. E.g. let's say there are 2 bean-producing methods:
@Bean
public SomeType makeSome() {...}

@Bean
@Special
public SomeOtherType makeOther() {...}
Here, the second bean is special because of the @Special annotation on the method that created it. But any mechanism of making it distinguishable is an option.
Then, I want to somehow get only the special beans.
The Caveat
I'm aware that if all the beans implemented the same interface, I could inject them by type. But this should work as transparently as possible, requiring as little change to an existing app as possible.
The potential approaches
Here are two broad approaches I have in mind:
1) Jack into the process of registering beans, and transparently wrap the bean instances into some sort of a container (I'm quite certain this part is doable). E.g.
public void registerBean(Object bean, ApplicationContext ctx) {
    ctx.register(bean); // do the usual
    ctx.register(new Wrapper(bean)); // register it wrapped as well
}
Then, inject all beans of type Wrapper. The problem here is obviously the duplication... Alternatively I could maybe generate a proxy instance on the fly that would implement a Wrapper interface, so it could at the same time act as the original bean and as a wrapper. I did say I'm OK with exotic solutions too, didn't I?
2) Spring already distinguishes bean candidates from actual registered beans (e.g. @ComponentScan can filter the candidates by package, annotations, etc.). I'm hoping to maybe jack into this process and get hold of candidate descriptors that still contain some useful metadata (like their factory method) that would allow me to later distinguish those bean instances.
It seems like you need to use @Qualifier; it provides functionality to distinguish beans:
@Bean
@Qualifier("special")
public MyBean myBean() {
    return new MyBean();
}

// ...and at the injection point in another bean:
@Autowired
@Qualifier("special")
private MyBean bean;
You can read about it more there: https://spring.io/blog/2014/11/04/a-quality-qualifier
UPD (understood what you are talking about :) )
You may want to take a look at BeanDefinitionRegistryPostProcessor
Here is a usage sample:
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.annotation.AnnotatedGenericBeanDefinition;
import org.springframework.beans.factory.config.BeanPostProcessor;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.BeanDefinitionRegistryPostProcessor;

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

import static java.util.Collections.unmodifiableMap;

/**
 * This is a hack to collect all beans of some type in a single map without eager initialization of those beans.
 *
 * Usage:
 * 1. Register a new bean of type {@link ServiceTrackingBeanPostProcessor} parametrized with the class of
 *    the beans you want to collect
 * 2. Now you can inject {@link ServiceTracker} parametrized with your type anywhere
 *
 * @param <T> Located type
 */
public class ServiceTrackingBeanPostProcessor<T> implements BeanPostProcessor, BeanDefinitionRegistryPostProcessor {

    private final ConcurrentMap<String, T> registeredBeans = new ConcurrentHashMap<>();
    private final Class<T> clazz;
    private final String beanName;

    public ServiceTrackingBeanPostProcessor(Class<T> clazz) {
        this.clazz = clazz;
        beanName = "locatorFor" + clazz.getCanonicalName().replace('.', '_');
    }

    @Override
    public Object postProcessBeforeInitialization(Object o, String s) throws BeansException {
        return o;
    }

    @Override
    public Object postProcessAfterInitialization(Object o, String s) throws BeansException {
        if (!clazz.isInstance(o)) {
            return o;
        }
        registeredBeans.putIfAbsent(s, (T) o);
        return o;
    }

    @Override
    public void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry registry) throws BeansException {
        AnnotatedGenericBeanDefinition def = new AnnotatedGenericBeanDefinition(Wrapper.class);
        registry.registerBeanDefinition(beanName, def);
    }

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
        beanFactory.registerSingleton(beanName, new Wrapper(unmodifiableMap(registeredBeans)));
    }

    private class Wrapper extends AbstractServiceTracker<T> {
        public Wrapper(Map<String, T> services) {
            super(services);
        }
    }
}
You just need to change the check from clazz.isInstance to obtaining the BeanDefinition from the application context by bean name; from the definition you can get almost any information about instantiation and annotations.
Here is one way to get all beans with the @Special annotation:
Map<String,Object> beans = applicationContext.getBeansWithAnnotation(Special.class);
Reference: https://stackoverflow.com/a/14236573/1490322
EDIT: The above answer seems to work only when the class itself is annotated with @Special, so it will not work for your scenario. But this other answer to the same question might work. It uses metadata from the ConfigurableListableBeanFactory to identify any beans whose factory methods were annotated with a specific annotation.
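A sketch of that metadata-based check, assuming you have access to the ConfigurableListableBeanFactory (for example inside a BeanFactoryPostProcessor) and that your @Special annotation is retained in the class files:

import java.util.ArrayList;
import java.util.List;

import org.springframework.beans.factory.annotation.AnnotatedBeanDefinition;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
import org.springframework.core.type.MethodMetadata;

public class SpecialBeanLocator {

    // Returns the names of beans whose @Bean factory method carries the @Special annotation.
    public static List<String> specialBeanNames(ConfigurableListableBeanFactory beanFactory) {
        List<String> names = new ArrayList<>();
        for (String name : beanFactory.getBeanDefinitionNames()) {
            BeanDefinition definition = beanFactory.getBeanDefinition(name);
            if (definition instanceof AnnotatedBeanDefinition) {
                MethodMetadata factoryMethod = ((AnnotatedBeanDefinition) definition).getFactoryMethodMetadata();
                if (factoryMethod != null && factoryMethod.isAnnotated(Special.class.getName())) {
                    names.add(name);
                }
            }
        }
        return names;
    }
}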
I think, as mentioned, the @Qualifier annotation is the way to go here. But I will show a different example:
@Bean
public SomeType makeSome() {...}

@Bean
public SomeType makeSomeOther() {...}
In your component (@Service) where you want this bean, you can write:
@Autowired
@Qualifier("makeSome")
SomeType makeSomeBean;
As you see, two beans of the same type can be distinguished by bean name (a bean gets the same name as its @Bean-annotated method).
I had an issue with Spring's @Order annotation; it seems I can't get it to work in my application. Hence I created a test class that reproduces the same behaviour, in which @Order does not have any effect on my components. The following test fails to run because of the lack of a bean of type javax.sql.DataSource:
package com.so;

import org.springframework.beans.factory.BeanFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.config.ConfigurableBeanFactory;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.Ordered;
import org.springframework.core.annotation.Order;
import org.springframework.jdbc.datasource.AbstractDataSource;
import org.springframework.stereotype.Repository;
import org.springframework.stereotype.Service;

import javax.annotation.PostConstruct;
import javax.sql.DataSource;
import java.sql.Connection;
import java.sql.SQLException;

public class TestSpring {

    public static void main(String[] args) {
        Class<?>[] classes = new Class[]{AConf.class, ADAO.class, AService.class, RepoConf.class};
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(classes);
    }

    @Configuration
    @Order(Ordered.HIGHEST_PRECEDENCE + 100)
    public static class AConf {
        @Autowired
        AService aService;
    }

    @Repository
    @Order(Ordered.LOWEST_PRECEDENCE)
    public static class ADAO {
        @Autowired
        @Qualifier("myds")
        DataSource dataSource;
    }

    @Service
    @Order(Ordered.LOWEST_PRECEDENCE)
    public static class AService {
        @Autowired
        ADAO adao;

        @PostConstruct
        public void init() {
            System.out.println("service init");
        }
    }

    // @Component does not have any effect
    @Configuration
    @Order(Ordered.HIGHEST_PRECEDENCE)
    public static class RepoConf {
        @Autowired
        BeanFactory beanFactory;

        @PostConstruct
        public void init() {
            ConfigurableBeanFactory configurableBeanFactory = (ConfigurableBeanFactory) beanFactory;
            configurableBeanFactory.registerSingleton("myds", new AbstractDataSource() {
                @Override
                public Connection getConnection() throws SQLException {
                    return null;
                }

                @Override
                public Connection getConnection(String username, String password) throws SQLException {
                    return null;
                }
            });
        }
    }
}
Manual bean registration has risks, as stated here: https://stackoverflow.com/a/11751503/1941560, but I cannot find out in which circumstances the @Order annotation works. For the above application configuration I expect the initialization order to be: RepoConf, AConf, ADAO, AService.
A weird thing to notice is that when I change the declared order of the component classes to the following (starting the array with RepoConf):
Class<?>[] classes = new Class[]{RepoConf.class, AConf.class, ADAO.class, AService.class};
or changed my AConf class to:
@Configuration
@Order(Ordered.HIGHEST_PRECEDENCE + 100)
public static class AConf {

    @Autowired
    RepoConf repoConf; // must be declared before aService

    @Autowired
    AService aService;
}
the application works as expected. Could someone explain that behaviour of the Spring container, and how can I utilize @Order annotations?
The Spring Framework version I use is 4.2.1.RELEASE.
Judging by the JavaDoc documentation for the @Order annotation, I don't think it is used to order bean creation:
NOTE: Annotation-based ordering is supported for specific kinds of components only — for example, for annotation-based AspectJ aspects. Ordering strategies within the Spring container, on the other hand, are typically based on the Ordered interface in order to allow for programmatically configurable ordering of each instance.
Consulting the Spring Framework documentation, the @Order annotation seems to be used for:
Ordering of instances when injected into a collection (see the sketch after this list)
Ordering the execution of event listeners
Ordering of @Configuration class processing, for example if you want to override a bean by name.
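A minimal sketch of that first, supported use (the Handler classes are hypothetical):

import java.util.List;

import org.springframework.core.annotation.Order;
import org.springframework.stereotype.Component;
import org.springframework.stereotype.Service;

public interface Handler {
}

@Component
@Order(1)
class FirstHandler implements Handler {
}

@Component
@Order(2)
class SecondHandler implements Handler {
}

@Service
class Pipeline {

    private final List<Handler> handlers;

    // The injected list is sorted by @Order: FirstHandler first, then SecondHandler.
    public Pipeline(List<Handler> handlers) {
        this.handlers = handlers;
    }
}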
From the documentation, it does not look like @Order is intended for your use case.
However, from the Spring documentation, we can see there is depends-on, which enforces that certain beans are created before the bean being defined. This has a corresponding annotation, @DependsOn.
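For example (ADAO and the "myds" qualifier are from your code; this assumes the RepoConf configuration bean is registered under the name repoConf, which you can force with @Configuration("repoConf") as shown in the other answer below):

@Repository
@DependsOn("repoConf") // make sure RepoConf, which registers "myds", is created first
public static class ADAO {

    @Autowired
    @Qualifier("myds")
    DataSource dataSource;
}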
It looks like your version of the Spring Framework simply ignores the @Order annotation on configuration classes. There's no surprise here, because that annotation should only be used on classes that implement the Ordered interface. Moreover, I could never find any reference to it being applied to configuration classes in the Spring Framework reference documentation.
Anyway, you are walking in terra incognita here. This is not described in the official documentation, so it will work or not depending on implementation details. What happens here is (judging by the results):
the AnnotationConfigApplicationContext first instantiates the configuration classes and their beans in their declaration order
it then builds all the remaining beans in that instantiation order
As you register the datasource myds at construction time in its configuration class, it happens to be registered in time to be autowired into other beans whenever repoConf is built before any other beans using it. But this is never guaranteed by the Spring Framework documentation, and future versions could use a different approach without breaking their contract. Moreover, Spring does change the initialization order to allow dependencies to be constructed before the beans depending on them.
So the correct way here is to tell Spring that the ADAO bean depends on the configuration RepoConf. Just throw away all the @Order annotations, which are of no use here, and put a @DependsOn one. Your code could be:
package com.so;

...

public class TestSpring {

    public static void main(String[] args) {
        Class<?>[] classes = new Class[]{AConf.class, ADAO.class, AService.class, RepoConf.class};
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(classes);
    }

    @Configuration
    public static class AConf {
        @Autowired
        AService aService;
    }

    @DependsOn("repoConf")
    @Repository
    public static class ADAO {
        @Autowired
        @Qualifier("myds")
        DataSource dataSource;
    }

    @Service
    public static class AService {
        @Autowired
        ADAO adao;

        @PostConstruct
        public void init() {
            System.out.println("service init");
        }
    }

    @Configuration("repoConf")
    public static class RepoConf {
        @Autowired
        BeanFactory beanFactory;

        @PostConstruct
        public void init() {
            ConfigurableBeanFactory configurableBeanFactory = (ConfigurableBeanFactory) beanFactory;
            configurableBeanFactory.registerSingleton("myds", new AbstractDataSource() {
                @Override
                public Connection getConnection() throws SQLException {
                    return null;
                }

                @Override
                public Connection getConnection(String username, String password) throws SQLException {
                    return null;
                }
            });
        }
    }
}
@DependsOn ensures that repoConf will be created before ADAO, making the datasource available for dependency injection.