I am stuck with null values in an autowired property and I am hoping I could get some help.
We are using Spring Boot version 0.5.0.M6 for the project.
The four configuration files with beans are in one package and are sorted by "area":
Data source configuration
Global method security configuration (as we use Spring-ACL)
MVC configuration
Spring Security configuration
The main method that bootstraps everything is in the following file:
@EnableAspectJAutoProxy
@EnableSpringConfigured
@EnableAutoConfiguration(exclude = {
        DataSourceTransactionManagerAutoConfiguration.class,
        HibernateJpaAutoConfiguration.class,
        JpaRepositoriesAutoConfiguration.class,
        SecurityAutoConfiguration.class,
        ThymeleafAutoConfiguration.class,
        ErrorMvcAutoConfiguration.class,
        MessageSourceAutoConfiguration.class,
        WebSocketAutoConfiguration.class
})
@Configuration
@ComponentScan
public class IntegrationsImcApplication {

    public static void main(String[] args) throws Exception {
        ApplicationContext ctx = SpringApplication.run(
                IntegrationsImcApplication.class, args);
    }
}
The first file that holds the data source configuration beans is as follows (I have omitted some method body parts to make it more readable):
@EnableTransactionManagement(mode = AdviceMode.ASPECTJ)
@Configuration
public class RootDataSourceConfig
        extends TomcatDataSourceConfiguration
        implements TransactionManagementConfigurer {

    @Override
    public DataSource dataSource() {
        return jpaDataSource();
    }

    public PlatformTransactionManager annotationDrivenTransactionManager() {
        return jpaTransactionManager();
    }

    @Bean
    public HibernateExceptionTranslator hibernateExceptionTranslator() {
        return new HibernateExceptionTranslator();
    }

    @Bean(name = "jpaDataSource")
    public DataSource jpaDataSource() {......}

    @Bean(name = {"transactionManager", "txMgr"})
    public JpaTransactionManager jpaTransactionManager() {......}

    @Bean(name = "entityManagerFactory")
    public EntityManagerFactory jpaEmf() {......}
}
And here is the next configuration file, which depends on the data source from above. It has about 20 beans related to ACL configuration, but it fails on the first bean that uses the data source:
@EnableGlobalMethodSecurity(prePostEnabled = true)
@Configuration
public class RootGlobalMethodSecurityConfig
        extends GlobalMethodSecurityConfiguration
        implements Ordered {

    @Autowired
    public DataSource dataSource;

    @Override
    public int getOrder() {
        return IntegrationsImcApplication.ROOT_METHOD_SECURITY_CONFIG_ORDER;
    }

    @Bean
    public MutableAclService aclService()
            throws CacheException, IOException {
        MutableJdbcAclService aclService = new MutableJdbcAclService(
                dataSource, aclLookupStrategy(), aclCache());
        aclService.setClassIdentityQuery("SELECT @@IDENTITY");
        aclService.setSidIdentityQuery("SELECT @@IDENTITY");
        return aclService;
    }

    ...................................
}
Basically, invoking aclService() throws an error because dataSource is null. We have tried ordering the configuration files by implementing the Ordered interface. We also tried using @AutoConfigureAfter(RootDataSourceConfig.class), but this did not help either. Instead of doing @Autowired on the DataSource, we also tried injecting the RootDataSourceConfig class itself, but it was still null. We tried using @DependsOn and @Ordered on those beans, but again no success. It seems like nothing can be injected into this configuration.
The console output at startup lists the beans in the order we want them, with the data source being the first. We are pretty much blocked by this.
Is there anything weird or unique we are doing here that is not working? If this is as designed, then how could we inject the data source differently?
Repo: github
Eager initialization of a bean that depends on a DataSource is definitely the problem. The root cause has nothing to do with Spring Boot or autoconfiguration, but rather plain old-fashioned chicken and egg: method security is applied via an aspect which is wrapped around your business beans by a BeanPostProcessor. A bean can only be post-processed by something that is initialized very early. In this case it is too early to have the DataSource injected (actually the @Configuration class that needs the DataSource is instantiated too early to be wrapped properly in the @Configuration processing machinery, so it cannot be autowired). My proposal (which only gets you to the same point with the missing AuthenticationManager) is to declare the GlobalMethodSecurityConfiguration as a nested class instead of the one that the DataSource is needed in:
@EnableGlobalMethodSecurity(prePostEnabled = true)
@Configuration
protected static class ActualMethodSecurityConfiguration extends GlobalMethodSecurityConfiguration {

    @Autowired
    @Qualifier("aclDaoAuthenticationProvider")
    private AuthenticationProvider aclDaoAuthenticationProvider;

    @Autowired
    @Qualifier("aclAnonymousAuthenticationProvider")
    private AnonymousAuthenticationProvider aclAnonymousAuthenticationProvider;

    @Autowired
    @Qualifier("aclExpressionHandler")
    private MethodSecurityExpressionHandler aclExpressionHandler;

    @Override
    protected void configure(AuthenticationManagerBuilder auth)
            throws Exception {
        auth.authenticationProvider(aclDaoAuthenticationProvider);
        auth.authenticationProvider(aclAnonymousAuthenticationProvider);
    }

    @Override
    public MethodSecurityExpressionHandler createExpressionHandler() {
        return aclExpressionHandler;
    }
}
i.e. stick that inside the RootMethodSecurityConfiguration and remove the @EnableGlobalMethodSecurity annotation from that class.
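To illustrate the resulting layout (a sketch only; the roughly 20 ACL bean definitions are omitted):

@Configuration
public class RootMethodSecurityConfiguration {

    // ... the existing ACL-related @Bean definitions stay here ...

    @EnableGlobalMethodSecurity(prePostEnabled = true)
    @Configuration
    protected static class ActualMethodSecurityConfiguration extends GlobalMethodSecurityConfiguration {
        // ... exactly as shown above ...
    }
}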
I might have resolved the problem.
GlobalMethodSecurityConfiguration.class has the following setter that tries to autowire permission evaluators:
@Autowired(required = false)
public void setPermissionEvaluator(List<PermissionEvaluator> permissionEvaluators) {
    ....
}
And in my case the aclPermissionEvaluator() bean needs the aclService() bean, which in turn depends on another autowired property, dataSource, which does not seem to be autowired yet.
To fix this I implemented BeanFactoryAware and get dataSource from the beanFactory instead:
public class RootMethodSecurityConfiguration extends GlobalMethodSecurityConfiguration implements BeanFactoryAware {

    private DataSource dataSource;

    @Override
    public void setBeanFactory(BeanFactory beanFactory) throws BeansException {
        this.dataSource = beanFactory.getBean("dataSource", DataSource.class);
    }
    ....
}
After this, another exception showed up, where SecurityAutoConfiguration.class complains about a missing AuthenticationManager, so I just excluded it from @EnableAutoConfiguration. I am not sure if it is ideal, but I have a custom security configuration, so this way everything works OK.
Related
I am using Spring Boot for my application, and I am defining the JNDI name in the application.properties file.
When I try to get the JdbcTemplate in the class below, it is null:
@Configuration
public class DemoClass
{
    @Autowired
    private JdbcTemplate template;

    @Bean
    private DataSource getDS(){
        return template.getDataSource(); //NPE
    }
}
Another Class
@Component
public class SecondClass {
    @Autowired
    private JdbcTemplate template;

    public void show(){
        template.getDataSource(); // Working Fine
    }
}
I am not sure whether this is configured by default. In case it is not, then maybe you can try configuring it yourself:
@Autowired
DataSource dataSource;

@Bean
public JdbcTemplate getJdbcTemplate() {
    return new JdbcTemplate(dataSource);
}
In any case, if you only need the DataSource, I think it is auto-configured by Spring Boot, so you can autowire it directly when you need it.
@Repository
public class DataRepository {

    private JdbcTemplate jdbcTemplate;

    @Autowired
    public void setJdbcTemplate(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    public int updateCandidate() {
        return this.jdbcTemplate.update("update .... from table where ....");
    }
}
application.properties
# database connection details
spring.datasource.url=jdbc:oracle:thin:***
spring.datasource.username=Scott
spring.datasource.password=Tiger
spring.datasource.driver-class-name=oracle.jdbc.driver.OracleDriver
spring.datasource.tomcat.initial-size=1
spring.datasource.tomcat.max-active=1
spring.datasource.tomcat.min-idle=1
spring.datasource.tomcat.max-idle=1
If you're getting an NPE at getDS, it means the JdbcTemplate hasn't been injected yet, or maybe it couldn't be injected.
Give Spring a hint about the bean dependencies by declaring:
@Bean
public JdbcTemplate getJdbcTemplate(DataSource dataSource){
    return new JdbcTemplate(dataSource);
}
Or
@Bean
@DependsOn("template")
public DataSource getDS(){
    return template.getDataSource();
}
By default @Autowired sets required=true, so the DemoClass should not even be constructed by Spring.
Most likely you are creating new DemoClass() yourself, or you have disabled the annotation config altogether and the DemoClass class is registered manually, e.g. using XML.
Instead, ensure that the DemoClass class is discovered using Spring's component scan, e.g. using @SpringBootApplication or @ComponentScan, as per this example.
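For example (a minimal sketch with illustrative package and class names, not the OP's code), the application class just needs to sit at or above the package containing DemoClass:

package com.example.demo;  // hypothetical package; DemoClass would live in this package tree

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

// @SpringBootApplication implies @ComponentScan on this package, so a
// @Configuration class like DemoClass in the same package tree is picked up
// and its @Autowired fields are populated by Spring.
@SpringBootApplication
public class DemoApplication {
    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}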
I have a Spring service which checks database entries. To minimize my repository calls, both find methods are @Cacheable. But when I try to init my service bean while my configuration class has a CacheManager bean definition, I get the following NoSuchBeanDefinitionException:
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'foo.mediacode.directory.MediaCodeDirectoryService' available
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBean(DefaultListableBeanFactory.java:353)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBean(DefaultListableBeanFactory.java:340)
at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1093)
at foo.mediacode.directory.MediaCodeDirectoryService.implementation(MediaCodeDirectoryService.java:63)
at foo.campaigntree.directory.CampaignTreeDirectoryService.<init>(CampaignTreeDirectoryService.java:18)
... 15 more
If I take out the CacheManager bean definition, I can init my service bean and it runs without any problems and caching!
Here is my code:
Configuration
...
@Configuration
@EnableCaching
@EnableJpaRepositories(...)
@PropertySource({...})
public class MediaCodeDirectoryServiceConfig {

    private static Logger configLogger = Logger.getLogger(MediaCodeDirectoryServiceConfig.class.getName());

    @Value("${jpa.loggingLevel:FINE}")
    private String loggingLevel;
    @Value("${mysql.databaseDriver}")
    private String dataBaseDriver;
    @Value("${mysql.username}")
    private String username;
    @Value("${mysql.password}")
    private String password;
    @Value("${mysql.databaseUrl}")
    private String databaseUrl;

    @Bean
    public static PropertySourcesPlaceholderConfigurer propertyConfigInDev() {
        ...
    }

    @Bean
    public MediaCodeDirectoryService mediaCodeDirectoryService() {
        return new MediaCodeDirectoryService();
    }

    @Bean
    public CacheManager mediaCodeCacheManager() {
        SimpleCacheManager cacheManager = new SimpleCacheManager();
        cacheManager.setCaches(Arrays.asList(new ConcurrentMapCache("mediaCodeMappingRegexCache"),
                new ConcurrentMapCache("mediaCodeMappingsCache")));
        return cacheManager;
    }

    @Bean
    public JpaTransactionManager transactionManager() {
        ...
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        ...
    }

    public DataSource getDataSource() {
        ...
    }

    public JpaDialect getJpaDialect() {
        ...
    }

    public Properties getEclipseLinkProperty() {
        ...
    }

    public JpaVendorAdapter getJpaVendorAdapter() {
        ...
    }
}
Service
....
public class MediaCodeDirectoryService implements MediaCodeDirectoryServiceApi {
    ...
    @Autowired
    private MediaCodeDirectoryRepository repo;

    @SuppressWarnings("resource")
    public static MediaCodeDirectoryServiceApi implementation() {
        if (INSTANCE == null) {
            ApplicationContext ctx = new AnnotationConfigApplicationContext(MediaCodeDirectoryServiceConfig.class);
            INSTANCE = ctx.getBean(MediaCodeDirectoryService.class);
        }
        return INSTANCE;
    }
    ...
Repository
...
@Repository
public interface MediaCodeDirectoryRepository extends CrudRepository<MediaCodeDao, Integer> {

    @Cacheable("mediaCodeMappingRegexes")
    @Query("SELECT m FROM #{#entityName} m WHERE (m.fooId = :fooId) AND (m.isRegex = :isRegex) ORDER BY (m.orderId DESC, m.id ASC)")
    List<MediaCodeDao> findByfooIdAndIsRegexOrderByOrderIdDescAndIdAsc(@Param("fooId") int fooId, @Param("isRegex") boolean isRegex);

    @Cacheable("mediaCodeMappings")
    List<MediaCodeDao> findByMediaCode(String MediaCode, Pageable pageable);
}
When I debug into DefaultListableBeanFactory, I can find mediaCodeDirectoryService within beanDefinitionMap, and mediaCodeDirectoryService also appears within beanDefinitionNames. But DefaultListableBeanFactory.getBean(...) cannot resolve the name, and namedBean in line 364 is null.
When I try to get the bean from the context by name, like:
INSTANCE = (MediaCodeDirectoryService) ctx.getBean("mediaCodeDirecotryService")
I avoid the NoSuchBeanDefinitionException, but I run into another one.
Does anybody here have an idea of what might be the cause of this? Did I miss something in my configuration? Thanks!
Caching is applied through AOP. For AOP Spring uses a proxy based approach and the default is to create interface based proxies.
public class MediaCodeDirectoryService implements MediaCodeDirectoryServiceApi {... }
With this class definition at runtime you will get a dynamically created class (Proxy$51 or something along those lines) which implements all interfaces but it isn't a MediaCodeDirectoryService. It is however a MediaCodeDirectoryServiceApi.
You have two ways of fixing this: either program to interfaces (which you should have been doing anyway, because you have defined interfaces) instead of concrete classes, or use class-based proxies.
The first option involves changing your code in the places that directly @Autowire or get an instance of MediaCodeDirectoryService to use MediaCodeDirectoryServiceApi instead (which, imho, you should already be doing; why else define an interface). Now you will get the proxy injected and everything will work.
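For instance, the static factory from the question could ask the context for the interface instead (a sketch based on the question's code):

public static MediaCodeDirectoryServiceApi implementation() {
    if (INSTANCE == null) {
        ApplicationContext ctx = new AnnotationConfigApplicationContext(MediaCodeDirectoryServiceConfig.class);
        // Ask for the interface type: the caching proxy implements it, so this lookup resolves.
        INSTANCE = ctx.getBean(MediaCodeDirectoryServiceApi.class);
    }
    return INSTANCE;
}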
The second option involves setting proxyTargetClass=true on your @EnableCaching annotation. Then, instead of an interface-based proxy, you will get a class-based proxy.
@EnableCaching(proxyTargetClass = true)
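Applied to the configuration from the question, that would look roughly like this (a sketch; only the annotation attribute changes, everything else stays as it is):

@Configuration
@EnableCaching(proxyTargetClass = true)
public class MediaCodeDirectoryServiceConfig {
    // ... @EnableJpaRepositories, @PropertySource and all bean definitions unchanged ...
}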
I have spent a day and a half looking for an answer, and this thing is driving me crazy!
My teammates and I are working on a project based on Spring Boot. I work specifically on the administration part, which is a web administration interface.
There are mainly three layers in my project: controllers, which use services, which use repositories.
I want my project to work with @Transactional at the service layer (we have made some successful efforts so far to use only annotations for configuration).
But it seems that it doesn't work: one of my services throws a RuntimeException and no rollback is done. I have already read all the suggestions in the other sibling questions. The only thing related to my problem that I'm not sure I am doing neatly is the context configuration. Even so, I'm not sure that it's really my problem.
Here is the current configuration:
@SpringBootApplication
@EnableScheduling
@EnableTransactionManagement
public class Application extends SpringBootServletInitializer {

    @Value("${ajp.port}")
    private int ajpPort;

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }

    @Override
    protected SpringApplicationBuilder configure(SpringApplicationBuilder builder) {
        return builder.sources(Application.class);
    }

    @Bean
    public EmbeddedServletContainerFactory servletContainer() {
        TomcatEmbeddedServletContainerFactory tomcat = new TomcatEmbeddedServletContainerFactory() {};
        tomcat.addAdditionalTomcatConnectors(createConnector(ajpPort));
        return tomcat;
    }

    @Bean
    public EmbeddedServletContainerCustomizer containerCustomizer() {
        return container -> {
            ErrorPage error401Page = new ErrorPage(HttpStatus.UNAUTHORIZED, "/static/401.html");
            ErrorPage error404Page = new ErrorPage(HttpStatus.NOT_FOUND, "/static/404.html");
            container.addErrorPages(error401Page, error404Page);
        };
    }

    @Bean
    public EmailValidator emailValidator() {
        return EmailValidator.getInstance();
    }

    private static Connector createConnector(int ajpPort) {
        Connector connector = new Connector("AJP/1.3");
        connector.setPort(ajpPort);
        return connector;
    }
}
The web config:
@Configuration
public class MvcConfig extends WebMvcConfigurerAdapter {

    @Autowired
    private RequestProcessingTimeInterceptor requestProcessingTimeInterceptor;
    @Autowired
    private CertificateInterceptor certificateInterceptor;
    @Autowired
    private ProfilesAuthorizationInterceptor profilesAuthorizationInterceptor;

    @Override
    public void addInterceptors(InterceptorRegistry registry) {
        registry.addInterceptor(requestProcessingTimeInterceptor);
        registry.addInterceptor(certificateInterceptor);
        registry.addInterceptor(profilesAuthorizationInterceptor);
    }

    @Override
    public void configureDefaultServletHandling(DefaultServletHandlerConfigurer configurer) {
        configurer.enable();
    }

    @Bean
    public InternalResourceViewResolver viewResolver() {
        InternalResourceViewResolver resolver = new InternalResourceViewResolver();
        resolver.setExposeContextBeansAsAttributes(true);
        resolver.setPrefix("/WEB-INF/");
        resolver.setSuffix(".jsp");
        return resolver;
    }

    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        registry.addResourceHandler("/admin/css/**").addResourceLocations("/WEB-INF/admin/css/").setCachePeriod(CACHE_PERIOD);
        registry.addResourceHandler("/admin/img/**").addResourceLocations("/WEB-INF/admin/img/").setCachePeriod(CACHE_PERIOD);
        registry.addResourceHandler("/admin/js/**").addResourceLocations("/WEB-INF/admin/js/").setCachePeriod(CACHE_PERIOD);
        registry.addResourceHandler("/admin/plugins/**").addResourceLocations("/WEB-INF/admin/plugins/").setCachePeriod(CACHE_PERIOD);
    }
}
A typical controller:
@RestController
@RequestMapping("/pathA")
public class ControlerA {

    @Autowired
    public ServiceA serviceA;

    @RequestMapping(value = "{id}", method = RequestMethod.GET)
    @ResponseStatus(HttpStatus.OK)
    public A getA(@PathVariable long id) {
        return serviceA.getA(id);
    }
}
A service (interface + implementation):
public interface ServiceA {
    A getA(long id);
}

@Service
@Transactional
public class ServiceAImpl implements ServiceA {

    @Autowired
    public RepositoryA repositoryA;

    public A getA(long id) {
        (...)
        A a = repositoryA.findOne(id);
        a.updatesomething(something);
        repositoryA.update(a);
        doOtherThing(a); //throw RuntimeException
        (...)
        return a;
    }
}
And the Repository:
@Repository
public interface RepositoryA extends JpaRepository<A, Long> {
    (...)
}
Here is the configuration of the MySQL database:
# Database configuration
spring.datasource.url=jdbc:mysql://localhost/name_innodb
spring.datasource.username=username
spring.datasource.password=password
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
spring.datasource.testOnBorrow=true
spring.datasource.validationQuery=SELECT 1
I know that the repository transactions work by default (I saw it when an SQLException happened). But at the service layer nothing happens (cf. the line that throws the exception); when the exception is thrown, the update is done and not rolled back. That means my @Transactional is ignored.
Edit:
I managed to get a transaction like I want by adding @Transactional on the getA(...) method of the controller. It works, but that is not the place to manage transactions.
So my question is: how can I make it work?
OK, after some days of brainstorming, I found it!
The only reasonable answer is to take care of your configuration classes. My problem was a crossed-over configuration which led to a DispatcherServlet configuration that caused the mess.
Related Subject: For web MVC Spring app should @Transactional go on controller or service?
Edit:
I am adding some details, because it is hard to find information on how to separate contexts, and I'm still calibrating the configuration because there is no complete and exhaustive documentation about all of Spring's annotations.
You could create a parent and child contexts like this:
@SpringBootApplication
@ComponentScan({"com.mycompany.service", "com.mycompany.interceptors", "com.mycompany.manager"})
@PropertySource("file:config/application.properties")
public class ParentConfig {

    public static void main(String[] args) {
        new SpringApplicationBuilder()
                .parent(ParentConfig.class)
                .child(ChildConfig1.class, ChildConfig2.class, ChildConfig3.class, ..., ChildConfigN.class)
                .run(args);
    }
    (...)
}
I'm still wondering why I must add the @PropertySource so that the children are aware of the property values, why "classpath:path" did not work in @PropertySource, and why I have to add a static PropertySourcesPlaceholderConfigurer in order to use @Value in my children (before that, i.e. without these hierarchical contexts, every context was aware of the properties):
@Bean
public static PropertySourcesPlaceholderConfigurer propertyConfigInDev() {
    return new PropertySourcesPlaceholderConfigurer();
}
and I'm still playing with annotations so that every configuration works.
Edit:
I have finally found something that makes the Spring configuration work correctly: the different configurations must respect the packaging hierarchy.
I stopped working with parent and child configurations and let Spring do its work. I arranged my different config classes like this:
MainConfig
|
|__________ my.package.mvc.MVCConfig
|
|__________ my.package.schedulers.SchedulerConfig
|
and so on...
And in my MainConfig I add:
@ComponentScan({"my.package.mvc", "my.package.services", "my.package.interceptors", "my.package.managers", "my.package.schedulers"})
And everything is good now! In particular, MVCConfig can no longer conflict with the services, because of the separate hierarchy.
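For reference, a minimal sketch of that root configuration class (whether MainConfig also carries @SpringBootApplication or only @Configuration is not stated in this answer, so this is an assumption):

@Configuration
@ComponentScan({"my.package.mvc", "my.package.services", "my.package.interceptors",
        "my.package.managers", "my.package.schedulers"})
public class MainConfig {
    // MainConfig sits at the root of the package tree shown above;
    // the sub-configurations live in the scanned sub-packages.
}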
I have seen a lot of examples of Spring Batch projects where either (a) a dataSource is defined, or (b) no dataSource is defined.
However, in my project, I would like my business logic to have access to a dataSource, but I want Spring Batch to NOT use the dataSource. Is this possible?
This guy has a similar problem: Spring boot + spring batch without DataSource
Generally, using Spring Batch without a database is not a good idea, since there could be concurrency issues depending on the kind of job you define. So at least using an in-memory DB is strongly advised, especially if you plan to use the job in production.
Using Spring Batch with Spring Boot will initialize an in-memory datasource if you do not configure your own datasource(s).
Taking this into account, let me redefine your question as follows: can my business logic use another datasource than the one Spring Batch is using to update its BATCH tables?
Yes, it can. As a matter of fact, you can use as many datasources as you want inside your Spring Batch jobs. Just use by-name autowiring.
Here is how I do it:
I always use a configuration class which defines all the datasources I have to use in my jobs:
@Configuration
public class DatasourceConfiguration {

    @Bean
    @ConditionalOnMissingBean(name = "dataSource")
    public DataSource dataSource() {
        // create the datasource that is used by Spring Batch;
        // for instance, create an in-memory datasource using the
        // EmbeddedDatabaseFactory
        return ...;
    }

    @Bean
    @ConditionalOnMissingBean(name = "bl1datasource")
    public DataSource bl1datasource() {
        return ...; // your first datasource that is used in your business logic
    }

    @Bean
    @ConditionalOnMissingBean(name = "bl2datasource")
    public DataSource bl2datasource() {
        return ...; // your second datasource that is used in your business logic
    }
}
Three points to note:
Spring Batch looks for a datasource with the name "dataSource". If you do not provide this EXACT name (uppercase 'S'), Spring Batch will try to autowire by type, and if it finds more than one instance of DataSource, it will throw an exception.
Put your datasource configuration in its own class. Do not put it in the same class as your job definitions. Spring needs to be able to instantiate the datasource Spring bean with the name "dataSource" very early when it loads the context, before it starts to instantiate your Job and Step beans. Spring will not be able to do that correctly if you put your datasource definitions in the same class as your job/step definitions.
Using @ConditionalOnMissingBean is not mandatory, but I find it good practice. It makes it easy to change the datasources for unit/integration tests. Just provide an additional test configuration in the ContextConfiguration of your unit/IT test which, for instance, overwrites the "bl1Datasource" with an in-memory datasource:
@Configuration
public class TestBL1DatasourceConfiguration {

    // overwriting bl1datasource with an in-memory datasource
    @Bean
    public DataSource bl1datasource() {
        return new EmbeddedDatabaseFactory().getDatabase();
    }
}
In order to use the business-logic datasources, use injection by name:
@Component
public class PrepareRe1Re2BezStepCreatorComponent {

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private DataSource bl1datasource;

    @Autowired
    private DataSource bl2datasource;

    public Step createStep() throws Exception {
        SimpleStepBuilder<..., ...> builder =
                stepBuilderFactory.get("astep") //
                        .<..., ...> chunk(100) //
                        .reader(createReader(bl1datasource)) //
                        .writer(createWriter(bl2datasource)); //
        return builder.build();
    }
}
Furthermore, you probably want to consider using XA-Datasources if you'd like to work with several datasources.
Edited:
Since it seems that you really don't want to use a datasource, you have to implement your own BatchConfigurer (http://docs.spring.io/spring-batch/trunk/apidocs/org/springframework/batch/core/configuration/annotation/BatchConfigurer.html) (as Michael Minella, the Spring Batch project lead, pointed out above).
You can use the code of org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer as a starting point for your own implementation. Simply remove all the datasource/transaction-manager code and keep the content of the if (datasource == null) part in the initialize method. This will initialize a map-based JobRepository and JobExplorer. But again, this is NOT a usable solution in a production environment, since it is not thread-safe.
Edited:
How to implement it:
Configuration class that defines the "businessDataSource":
@Configuration
public class DataSourceConfigurationSimple {

    DataSource embeddedDataSource;

    @Bean
    public DataSource myBusinessDataSource() {
        if (embeddedDataSource == null) {
            EmbeddedDatabaseFactory factory = new EmbeddedDatabaseFactory();
            embeddedDataSource = factory.getDatabase();
        }
        return embeddedDataSource;
    }
}
The implementation of a specific BatchConfigurer:
(of course, the methods have to be implemented...)
public class MyBatchConfigurer implements BatchConfigurer {

    @Override
    public JobRepository getJobRepository() throws Exception {
        return null;
    }

    @Override
    public PlatformTransactionManager getTransactionManager() throws Exception {
        return null;
    }

    @Override
    public JobLauncher getJobLauncher() throws Exception {
        return null;
    }

    @Override
    public JobExplorer getJobExplorer() throws Exception {
        return null;
    }
}
And finally the main configuration and launch class:
@SpringBootApplication
@Configuration
@EnableBatchProcessing
// Importing MyBatchConfigurer will install your BatchConfigurer instead of
// the Spring Batch default configurer.
@Import({DataSourceConfigurationSimple.class, MyBatchConfigurer.class})
public class SimpleTestJob {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Bean
    public Job job() throws Exception {
        SimpleJobBuilder standardJob = this.jobs.get(JOB_NAME)
                .start(step1());
        return standardJob.build();
    }

    protected Step step1() throws Exception {
        TaskletStepBuilder standardStep1 = this.steps.get("SimpleTest_step1_Step")
                .tasklet(tasklet());
        return standardStep1.build();
    }

    protected Tasklet tasklet() {
        return (contribution, context) -> {
            System.out.println("tasklet called");
            return RepeatStatus.FINISHED;
        };
    }

    public static void main(String[] args) throws Exception {
        SpringApplication.run(SimpleTestJob.class, args);
    }
}
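For illustration, here is one possible way to fill in the stubbed methods of MyBatchConfigurer above using the map-based, non-persistent support classes mentioned earlier. This is a sketch, not the answer's code; the class names and constructors used (MapJobRepositoryFactoryBean, MapJobExplorerFactoryBean, SimpleJobLauncher, ResourcelessTransactionManager) are the ones I believe Spring Batch 3.x provides, and, as stated above, the result is not thread-safe and not suitable for production:

import org.springframework.batch.core.configuration.annotation.BatchConfigurer;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.explore.support.MapJobExplorerFactoryBean;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean;
import org.springframework.batch.support.transaction.ResourcelessTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;

// Illustrative body for the MyBatchConfigurer stub shown above:
// all job metadata is kept in memory, no DataSource involved.
public class MyBatchConfigurer implements BatchConfigurer {

    private final PlatformTransactionManager transactionManager;
    private final JobRepository jobRepository;
    private final JobLauncher jobLauncher;
    private final JobExplorer jobExplorer;

    public MyBatchConfigurer() throws Exception {
        this.transactionManager = new ResourcelessTransactionManager();

        MapJobRepositoryFactoryBean repositoryFactory =
                new MapJobRepositoryFactoryBean(this.transactionManager);
        repositoryFactory.afterPropertiesSet();
        this.jobRepository = (JobRepository) repositoryFactory.getObject();

        MapJobExplorerFactoryBean explorerFactory =
                new MapJobExplorerFactoryBean(repositoryFactory);
        explorerFactory.afterPropertiesSet();
        this.jobExplorer = (JobExplorer) explorerFactory.getObject();

        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(this.jobRepository);
        launcher.afterPropertiesSet();
        this.jobLauncher = launcher;
    }

    @Override
    public JobRepository getJobRepository() throws Exception {
        return jobRepository;
    }

    @Override
    public PlatformTransactionManager getTransactionManager() throws Exception {
        return transactionManager;
    }

    @Override
    public JobLauncher getJobLauncher() throws Exception {
        return jobLauncher;
    }

    @Override
    public JobExplorer getJobExplorer() throws Exception {
        return jobExplorer;
    }
}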
I am trying to configure a couple of datasources within Spring Batch. On startup, Spring Batch is throwing the following exception:
To use the default BatchConfigurer the context must contain no more than one DataSource, found 2
Snippet from Batch Configuration
@Configuration
@EnableBatchProcessing
public class BatchJobConfiguration {

    @Primary
    @Bean(name = "baseDatasource")
    public DataSource dataSource() {
        // first datasource definition here
    }

    @Bean(name = "secondaryDataSource")
    public DataSource dataSource2() {
        // second datasource definition here
    }
    ...
}
I am not sure why I am seeing this exception, because I have seen some XML-based configurations for Spring Batch that declare multiple datasources. I am using Spring Batch core version 3.0.1.RELEASE with Spring Boot version 1.1.5.RELEASE. Any help would be greatly appreciated.
You must provide your own BatchConfigurer. Spring does not want to make that decision for you.
@Configuration
@EnableBatchProcessing
public class BatchConfig {

    @Bean
    BatchConfigurer configurer(@Qualifier("batchDataSource") DataSource dataSource){
        return new DefaultBatchConfigurer(dataSource);
    }
    ...
AbstractBatchConfiguration tries to look up a BatchConfigurer in the container first; if none is found, it tries to create one itself, and this is where the IllegalStateException is thrown when there is more than one DataSource bean in the container.
The approach to solving the problem is to prevent the creation of the DefaultBatchConfigurer bean inside AbstractBatchConfiguration.
To do that, we hint that the DefaultBatchConfigurer should be created by the Spring container, using the @Component annotation:
The configuration class where @EnableBatchProcessing is placed can be annotated with @ComponentScan, scanning the package that contains an empty class derived from DefaultBatchConfigurer:
package batch_config;
...
@EnableBatchProcessing
@ComponentScan(basePackageClasses = MyBatchConfigurer.class)
public class MyBatchConfig {
    ...
}
the full code of that empty derived class is here:
package batch_config.components;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.stereotype.Component;

@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {
}
In this configuration the @Primary annotation works for the DataSource bean, as in the example below:
@Configuration
public class BatchTestDatabaseConfig {

    @Bean
    @Primary
    public DataSource dataSource()
    {
        return .........;
    }
}
This works for the Spring Batch version 3.0.3.RELEASE
The simplest solution to make the @Primary annotation on the DataSource work might be to just add @ComponentScan(basePackageClasses = DefaultBatchConfigurer.class) along with the @EnableBatchProcessing annotation:
@Configuration
@EnableBatchProcessing
@ComponentScan(basePackageClasses = DefaultBatchConfigurer.class)
public class MyBatchConfig {
I would like to provide a solution here which is very similar to the one answered by @vanarchi, but I managed to put all the necessary configuration into one class.
For the sake of completeness, the solution here assumes that the primary datasource is HSQL.
@Configuration
@EnableBatchProcessing
public class BatchConfiguration extends DefaultBatchConfigurer {

    @Bean
    @Primary
    public DataSource batchDataSource() {
        // no need to shut down, EmbeddedDatabaseFactoryBean will take care of this
        EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
        EmbeddedDatabase embeddedDatabase = builder
                .addScript("classpath:org/springframework/batch/core/schema-drop-hsqldb.sql")
                .addScript("classpath:org/springframework/batch/core/schema-hsqldb.sql")
                .setType(EmbeddedDatabaseType.HSQL) //.H2 or .DERBY
                .build();
        return embeddedDatabase;
    }

    @Override
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(batchDataSource());
        factory.setTransactionManager(transactionManager());
        factory.afterPropertiesSet();
        return (JobRepository) factory.getObject();
    }

    private ResourcelessTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    // NOTE: the code below is just to provide the developer an easy way to access the in-memory
    // HSQL datasource, as we configured it to be the primary datasource for storing batch-job-related data.
    // Default username: sa, password: ''
    @PostConstruct
    public void getDbManager(){
        DatabaseManagerSwing.main(
                new String[] { "--url", "jdbc:hsqldb:mem:testdb", "--user", "sa", "--password", ""});
    }
}
THREE key points in this solution:
This class is annotated with @EnableBatchProcessing and @Configuration, as well as extended from DefaultBatchConfigurer. By doing this, we instruct Spring Batch to use our customized batch configurer when AbstractBatchConfiguration tries to look up a BatchConfigurer;
Annotate the batchDataSource bean as @Primary, which instructs Spring Batch to use this datasource as the datasource for storing the nine job-related tables;
Override the protected JobRepository createJobRepository() throws Exception method, which makes the jobRepository bean use the primary datasource, as well as use a transactionManager instance different from the one of the other datasource(s).
The simplest solution is to extend DefaultBatchConfigurer and autowire your datasource via a qualifier:
@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {

    /**
     * Initialize the BatchConfigurer to use the datasource of your choosing
     * @param firstDataSource
     */
    @Autowired
    public MyBatchConfigurer(@Qualifier("firstDataSource") DataSource firstDataSource) {
        super(firstDataSource);
    }
}
Side Note (as this also deals with the use of multiple data sources): If you use autoconfig to run data initialization scripts, you may notice that it's not initializing on the datasource you'd expect. For that issue, take a look at this: https://github.com/spring-projects/spring-boot/issues/9528
You can define the beans below, and make sure your application.properties file has the entries needed for them:
@Configuration
@PropertySource("classpath:application.properties")
public class DataSourceConfig {

    @Primary
    @Bean(name = "abcDataSource")
    @ConfigurationProperties(prefix = "abc.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }

    @Bean(name = "xyzDataSource")
    @ConfigurationProperties(prefix = "xyz.datasource")
    public DataSource xyzDataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }
}
application.properties
abc.datasource.jdbc-url=XXXXX
abc.datasource.username=XXXXX
abc.datasource.password=xxxxx
abc.datasource.driver-class-name=org.postgresql.Driver
...........
...........
...........
...........
Here you can refer: Spring Boot Configure and Use Two DataSources
First, create a custom BatchConfigurer
@Configuration
@Component
public class TwoDataSourcesBatchConfigurer implements BatchConfigurer {

    @Autowired
    @Qualifier("dataSource1")
    DataSource dataSource;

    @Override
    public JobExplorer getJobExplorer() throws Exception {
        ...
    }

    @Override
    public JobLauncher getJobLauncher() throws Exception {
        ...
    }

    @Override
    public JobRepository getJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        // use the autowired data source
        factory.setDataSource(dataSource);
        factory.setTransactionManager(getTransactionManager());
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Override
    public PlatformTransactionManager getTransactionManager() throws Exception {
        ...
    }
}
Then,
@Configuration
@EnableBatchProcessing
@ComponentScan("package")
public class JobConfig {
    // define job, step, ...
}