@ConditionalOnMissingBean vs @ConditionalOnSingleCandidate - java

We're working on a multi-module project where each module is a custom Spring Boot starter that holds several retryable tasks (using spring-retry).
To ensure that the retry annotations are activated only once across starters, a configuration bean is added to each starter's auto-configuration submodule:
@EnableRetry
@Configuration
@ConditionalOnMissingBean(RetryConfiguration.class)
public class StarterRetryAutoConfiguration {}
The solution works as expected.
Question: What's the difference between @ConditionalOnSingleCandidate and @ConditionalOnMissingBean?
I've read the Spring documentation more than once. However, I still don't get when and where we should use each of them.
ConditionalOnSingleCandidate:
@Conditional that only matches when a bean of the specified class is already contained in the BeanFactory and a single candidate can be determined. The condition will also match if multiple matching bean instances are already contained in the BeanFactory but a primary candidate has been defined; essentially, the condition matches if auto-wiring a bean with the defined type will succeed.
The condition can only match the bean definitions that have been processed by the application context so far and, as such, it is strongly recommended to use this condition on auto-configuration classes only. If a candidate bean may be created by another auto-configuration, make sure that the one using this condition runs after.
ConditionalOnMissingBean:
@Conditional that only matches when no beans meeting the specified requirements are already contained in the BeanFactory. None of the requirements must be met for the condition to match and the requirements do not have to be met by the same bean.

We can use @ConditionalOnMissingBean if we want to load a bean only if a certain other bean is not in the application context:
@Configuration
class OnMissingBeanModule {

    @Bean
    @ConditionalOnMissingBean
    DataSource dataSource() {
        return new InMemoryDataSource();
    }
}
In this example, we’re only injecting an in-memory datasource into the application context if there is not already a datasource available. This is very similar to what Spring Boot does internally to provide an in-memory database in a test context.
We can use @ConditionalOnSingleCandidate if we want to load a bean only if a single candidate for the given bean class can be determined.
@Configuration
@ConditionalOnSingleCandidate(DataSource.class)
class OnSingleCandidateModule {
    ...
}
In this example, the condition matches if there is exactly one DataSource bean in the application context, or if there are several DataSource beans and one of them is marked as primary.
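To make the primary-candidate case concrete, here is a minimal sketch (the class and bean names are made up for illustration, and DriverManagerDataSource is just a placeholder): two DataSource beans are registered, one marked @Primary, so @ConditionalOnSingleCandidate(DataSource.class) still matches because auto-wiring a single DataSource would succeed.
import javax.sql.DataSource;

import org.springframework.boot.autoconfigure.condition.ConditionalOnSingleCandidate;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

@Configuration
class DataSourceDefinitions {

    @Bean
    @Primary
    DataSource mainDataSource() {
        // the primary candidate; auto-wiring a single DataSource picks this one
        return new DriverManagerDataSource(); // URL/credentials omitted in this sketch
    }

    @Bean
    DataSource reportingDataSource() {
        return new DriverManagerDataSource();
    }
}

@Configuration
@ConditionalOnSingleCandidate(DataSource.class)
class RequiresSingleDataSourceModule {
    // matches: two DataSource beans exist, but one is @Primary,
    // so a single candidate can be determined
}
As the quoted documentation notes, in a real starter this condition belongs on an auto-configuration class so that the DataSource definitions are guaranteed to have been processed before the condition is evaluated.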

Related

@ConditionalOnBean(KafkaTemplate.class) crashes entire application

I have a Spring Boot application that consumes data from a Kafka topic and sends email notifications with the data received from Kafka:
@Bean
public EmailService emailService() {
    return new EmailServiceImpl(getJavaMailSender());
}
It works perfectly, but after I added @ConditionalOnBean:
@Bean
@ConditionalOnBean(KafkaTemplate.class)
public EmailService emailService() {
    return new EmailServiceImpl(getJavaMailSender());
}
application failed to start:
required a bean of type 'com.acme.EmailService' that could not be
found.
And I can't find any explanation of how this is possible, because the KafkaTemplate bean is automatically created by Spring in the KafkaAutoConfiguration class.
Could you please give me an explanation?
From the documentation:
The condition can only match the bean definitions that have been processed by the application context so far and, as such, it is strongly recommended to use this condition on auto-configuration classes only. If a candidate bean may be created by another auto-configuration, make sure that the one using this condition runs after.
This documentation clearly says what might be wrong here. KafkaAutoConfiguration creates the KafkaTemplate bean, but its definition may not yet have been added to the bean registry when the condition was checked. Either rely on auto-configuration ordering for the class that declares the conditional bean, or order your configuration classes so that the KafkaTemplate definition is guaranteed to be in the bean registry before that conditional check; one possible arrangement is sketched below.
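A sketch of that advice, not the exact fix from the original code: the auto-configuration class name is made up, EmailService and EmailServiceImpl are the classes from the question, and injecting JavaMailSender stands in for the original getJavaMailSender() call. The class must also be registered as an auto-configuration (META-INF/spring.factories on Spring Boot 2.6 and earlier, META-INF/spring/org.springframework.boot.autoconfigure.AutoConfiguration.imports on 2.7+) for @AutoConfigureAfter to take effect.
import org.springframework.boot.autoconfigure.AutoConfigureAfter;
import org.springframework.boot.autoconfigure.condition.ConditionalOnBean;
import org.springframework.boot.autoconfigure.kafka.KafkaAutoConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.mail.javamail.JavaMailSender;

@Configuration
@AutoConfigureAfter(KafkaAutoConfiguration.class)
public class EmailNotificationAutoConfiguration {

    // evaluated only after KafkaAutoConfiguration has had a chance to
    // register the KafkaTemplate bean definition, so the condition can match
    @Bean
    @ConditionalOnBean(KafkaTemplate.class)
    public EmailService emailService(JavaMailSender mailSender) {
        return new EmailServiceImpl(mailSender);
    }
}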

Bean parameter could not be found when migrating spring boot

I'm migrating services from Spring Boot 1.5 to Spring Boot 2.1 and I'm getting an error during this process. I have the following class for configuring my Spring beans:
@Configuration
public class CompanyTransactionConfiguration {

    public CompanyTransactionConfiguration() {
    }

    @Bean
    public TransactionTaskRunner transactionTaskRunner(PlatformTransactionManager transactionManager) {
        return new TransactionTaskRunnerImpl(
                this.readWriteTransactionTemplate(transactionManager),
                this.readOnlyTransactionTemplate(transactionManager),
                this.newReadWriteTransactionTemplate(transactionManager));
    }
}
And, of course, a test class to check that everything works as expected:
@RunWith(SpringRunner.class)
public class ReferrerActivityRepositoryIT extends AbstractDomainIT {

    @Autowired
    private ReferrerActivityRepository referrerActivityRepository;

    @Autowired
    private TransactionTaskRunner transactionTaskRunner;

    ...
}
The issue is that this test was working fine before I changed my dependencies to the newer Spring Boot version (2.1), but now I'm getting the following error:
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 0 of method transactionTaskRunner in com.company.core.server.config.CompanyTransactionConfiguration required a bean of type 'org.springframework.transaction.PlatformTransactionManager' that could not be found.
The following candidates were found but could not be injected:
- Bean method 'transactionManager' in 'DataSourceTransactionManagerAutoConfiguration.DataSourceTransactionManagerConfiguration' not loaded because @ConditionalOnSingleCandidate (types: javax.sql.DataSource; SearchStrategy: all) did not find any beans
- Bean method 'kafkaTransactionManager' in 'KafkaAutoConfiguration' not loaded because @ConditionalOnProperty (spring.kafka.producer.transaction-id-prefix) did not find property 'spring.kafka.producer.transaction-id-prefix'
...
Action:
Consider revisiting the entries above or defining a bean of type 'org.springframework.transaction.PlatformTransactionManager' in your configuration.
I don't know what is going on; maybe I need to add another dependency because of changes in Spring Boot, or change my application.properties file. The question is: why is this happening? What should I change to get this working?
Thanks!
You didn't define a PlatformTransactionManager bean. I assume you don't want to create it yourself. In that case you have to add the spring.kafka.producer.transaction-id-prefix property to your properties file so that KafkaAutoConfiguration provides a PlatformTransactionManager:
Bean method 'kafkaTransactionManager' in 'KafkaAutoConfiguration' not loaded because @ConditionalOnProperty (spring.kafka.producer.transaction-id-prefix) did not find property 'spring.kafka.producer.transaction-id-prefix'
By the way, your CompanyTransactionConfiguration constructor is redundant as long as it has no parameters. If there is no constructor in a class, the compiler creates a default one without parameters.
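If you would rather define the transaction manager yourself, which is the other route the error's "Action" section suggests, here is a minimal sketch. It assumes your application has (or you add) a configured DataSource bean; the configuration class name is illustrative, and with no DataSource a different PlatformTransactionManager implementation would be needed.
import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class TransactionManagerConfiguration {

    // provides the PlatformTransactionManager that transactionTaskRunner() needs;
    // requires a DataSource bean to be present in the context
    @Bean
    public PlatformTransactionManager transactionManager(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }
}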

Exclude DataSource inside starter library from auto configuration

I have multiple Spring Boot Starters, each of which define a DataSource like this:
@Bean
@ConfigurationProperties(prefix = "some.unique.namespace.datasource")
public DataSource someUniqueNamespaceDataSource() {
    return DataSourceBuilder.create().build();
}

@Bean
public SomeOtherBean someOtherBean() {
    return new SomeOtherBean(someUniqueNamespaceDataSource());
}
As you can see, the bean method someUniqueNamespaceDataSource() is being called directly in another bean method, within the same configuration class. However, Spring Boot is intercepting the method, and then performing its own internal injection. This time, it injects with a type of DataSource.
When an application uses one of these starters, it works without issue. However, when it uses multiple starters, I get errors like this:
org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type [javax.sql.DataSource] is defined: expected single matching bean but found 2: someUniqueNamespaceDataSource,someOtherUniqueNamespaceDataSource
I believe this is because Spring Boot is internally injecting by type, even though my code injects a qualified bean.
Is there some way that the starter libraries can indicate that the DataSources should not be considered candidates for auto-configuration?
Is there some way that an application depending on more than one of these starter libraries can exclude them from auto-configuration?
Disabling auto-configuration entirely is not really viable. Additionally, manually excluding every current auto-configuration that triggers on the existence of a DataSource bean is far too brittle, because dependencies added later, especially transitive ones that trigger on a DataSource bean, will reintroduce the error.
In your @SpringBootApplication or @EnableAutoConfiguration annotation, set the exclude property:
@SpringBootApplication(exclude = { DataSourceAutoConfiguration.class,
        HibernateJpaAutoConfiguration.class,
        DataSourceTransactionManagerAutoConfiguration.class })
That should do the trick.

Configuring Spring bean if not @Profiles active

The documentation for the @Profile annotation is not very clear on whether multiple active profile names are AND'd or OR'd together.
I would like a bean to be configured if neither the "test" nor the "integration-test" profile is active. I have tried the following, but it does not work (it appears to OR the conditions).
@Bean
@Profile({"!test", "!integration-test"})
SomeLogicNotForTests someLogicNotForTests() {
    return new SomeLogicNotForTests();
}
The only workaround I can think of is to remove this annotation, @Autowire the Environment into the bean itself, and check environment.getActiveProfiles().
Is there any way to make it AND the conditions? If not, could the annotation be extended to do that?
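One option, not from the original post and assuming Spring Framework 5.1 or later where @Profile supports profile expressions, is to combine the negations with & inside a single expression, which ANDs them. A minimal sketch using the class from the question:
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

@Configuration
class NotForTestsConfiguration {

    // registered only when neither "test" nor "integration-test" is active;
    // the & inside one expression ANDs the two negated profiles
    @Bean
    @Profile("!test & !integration-test")
    SomeLogicNotForTests someLogicNotForTests() {
        return new SomeLogicNotForTests();
    }
}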

Spring: Two beans implementing one interface with one as #Primary - autowiring creates both beans

One interface: BeanMapperUtil
Two implementing beans:
OrikaBeanMapper - singleton bean, marked @Primary
DirectBeanMapper - prototype bean
In the Manager class:
@Autowired
BeanMapperUtil mapper;
Observation: Spring creates both OrikaBeanMapper and DirectBeanMapper and then autowires OrikaBeanMapper.
Expected: Since OrikaBeanMapper is already marked as @Primary, Spring should create only this bean and autowire it. Spring need not create an instance of DirectBeanMapper. There is no impact on performance/functionality, but this looks like a wasteful creation of an instance only to be discarded.
When your application starts, the Spring container creates instances of all the beans registered with it (except prototype beans) and stores them in the BeanFactory.
Hence all singleton beans are created up front, and only the matching BeanMapperUtil is injected where it is autowired.
@Primary works as a filter after all matching beans have been created. It's not designed to prevent the lookup and creation of other, non-primary, matching beans.
When Spring tries to autowire BeanMapperUtil, it will find two matches, OrikaBeanMapper and DirectBeanMapper, and both will be created. At this point the @Primary comes into play. Spring will choose the bean with the @Primary annotation for injection.
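For reference, a minimal sketch of the setup being described. The names are taken from the question, but the BeanMapperUtil method signature and the class bodies are placeholders I've assumed:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Primary;
import org.springframework.context.annotation.Scope;
import org.springframework.stereotype.Component;

interface BeanMapperUtil {
    <T> T map(Object source, Class<T> targetType);
}

@Component
@Primary
class OrikaBeanMapper implements BeanMapperUtil {
    // singleton by default; chosen at injection time because of @Primary
    public <T> T map(Object source, Class<T> targetType) { /* placeholder */ return null; }
}

@Component
@Scope("prototype")
class DirectBeanMapper implements BeanMapperUtil {
    // still registered as a candidate; @Primary only decides which one wins
    public <T> T map(Object source, Class<T> targetType) { /* placeholder */ return null; }
}

@Component
class Manager {
    @Autowired
    BeanMapperUtil mapper; // resolves to OrikaBeanMapper
}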
