I am writing a library which retrieves data from a specific data schema. The library holds a DataSource object which can be anything. Right now I have defined the name of the datasource within the library, which I would like to avoid.
import javax.sql.DataSource;

public class MyLibraryDao {

    private static final String DS_NAME = "MY_DS_NAME";

    @Resource(name = "default", lookup = DS_NAME, type = DataSource.class)
    protected DataSource dataSource;
}
The DAO class is not directly exposed to the client. There is a service layer in between:
import javax.inject.Inject;
import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Model;

@ApplicationScoped
@Model
public class MyLibraryService {

    @Inject
    MyLibraryDao dao;
}
Now, how would I pass the datasource object to the library?
I assume I need to create a constructor in the DAO which takes a DataSource, but what about the service?
The library will be used in a CDI environment.
First things first: your library needs a datasource, so let's declare the dependency:
public class MyLibraryDao {

    @Inject
    protected DataSource dataSource;
}
Now the rest of the application that is using the library is responsible for providing a datasource to CDI; a simple way is:
// Example; your implementation may vary
public class AppDatasourceProducer {

    private static final String DS_NAME = "MY_APP_DS_NAME";

    @Resource(name = "default", lookup = DS_NAME, type = DataSource.class)
    protected DataSource dataSource;

    @Produces
    @ApplicationScoped
    public DataSource getDatasource() {
        return dataSource;
    }
}
What's changed? Now your application is responsible for knowing the datasource name AND providing the datasource itself. The example above can work in JEE environments that honor the @Resource annotation. Using a different implementation for the provider would work in e.g. a desktop environment (standalone application), making your library reusable.
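For instance, in a standalone application the producer might build the DataSource itself. A minimal sketch, assuming HikariCP as the pool; the class name and connection settings are illustrative, not from the original:

import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Produces;
import javax.sql.DataSource;
import com.zaxxer.hikari.HikariDataSource;

// Hypothetical producer for a standalone (non-JEE) environment
public class StandaloneDatasourceProducer {

    @Produces
    @ApplicationScoped
    public DataSource getDatasource() {
        HikariDataSource ds = new HikariDataSource();
        ds.setJdbcUrl("jdbc:postgresql://localhost:5432/mydb"); // placeholder settings
        ds.setUsername("user");
        ds.setPassword("pass");
        return ds;
    }
}

The library's MyLibraryDao stays untouched; only the producer changes per environment.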
The exact datasource name may be fixed, just like in the example, or read from configuration, e.g. from system properties (as mentioned in a comment):
// Example 2, DS name from system properties
@ApplicationScoped
public class AppDatasourceProducer {

    protected DataSource dataSource;

    @PostConstruct
    void init() throws Exception {
        String dsName = System.getProperty("XXXXX");
        InitialContext ic = new InitialContext();
        dataSource = (DataSource) ic.lookup(dsName);
    }

    @Produces
    @ApplicationScoped
    public DataSource getDatasource() {
        return dataSource;
    }
}
Going further:
An application that uses your library may be using several datasources, for whatever reason. You may want to provide a qualifier to specify the datasource to be used by your app.
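A minimal sketch of such a qualifier (the annotation name is illustrative):

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import javax.inject.Qualifier;

// Hypothetical qualifier marking the datasource meant for the library
@Qualifier
@Retention(RetentionPolicy.RUNTIME)
public @interface MyLibraryDataSource {
}

Both the producer method and the injection point in MyLibraryDao would then carry @MyLibraryDataSource, so CDI can pick the right DataSource among several.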
I used field injection in MyLibraryDao for simplicity. If you change to constructor injection, then at least MyLibraryDao will be usable in non-CDI environments as well, i.e. if you have obtained a DataSource somehow, you can now do new MyLibraryDao(dataSource). Even more reusability.
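A sketch of that constructor-injection variant:

import javax.inject.Inject;
import javax.sql.DataSource;

public class MyLibraryDao {

    protected final DataSource dataSource;

    @Inject
    public MyLibraryDao(DataSource dataSource) {
        this.dataSource = dataSource;
    }
}

Inside CDI the container calls the constructor with the produced DataSource; outside CDI you call it yourself with whatever DataSource you have at hand.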
Related
I would like to achieve a non-trivial bean injection implementation.
I have a custom properties class:
@Getter
@Setter
@ConfigurationProperties
public class DatabaseProperties {

    private String url;
    private String username;
    private String password;
}
Here is the configuration class:
@Configuration
@EnableConfigurationProperties(DatabaseProperties.class)
public class DBConfig {

    @Bean
    @ConfigurationProperties(prefix = "datasource.database1")
    public JdbcTemplate jdbcTemplateDatabase1(DatabaseProperties databaseProperties) {
        DataSource dataSource = new DriverManagerDataSource(
                databaseProperties.getUrl(),
                databaseProperties.getUsername(),
                databaseProperties.getPassword());
        return new JdbcTemplate(dataSource);
    }

    @Bean
    @ConfigurationProperties(prefix = "datasource.database2")
    public JdbcTemplate jdbcTemplateDatabase2(DatabaseProperties databaseProperties) {
        DataSource dataSource = new DriverManagerDataSource(
                databaseProperties.getUrl(),
                databaseProperties.getUsername(),
                databaseProperties.getPassword());
        return new JdbcTemplate(dataSource);
    }
}
The goal I want to achieve is to instantiate a new DatabaseProperties instance based on a prefix.
There are two possible solutions:
create two beans of type DatabaseProperties using the corresponding prefixes, and two JdbcTemplate beans whose parameter is the appropriately qualified DatabaseProperties bean.
in each JdbcTemplate bean, provide three parameters (String url, String username, String password) and inject them through @Value.
BUT is it possible to get rid of creating a DatabaseProperties bean for each JdbcTemplate, or of using @Value?
You don't need to create a DatabaseProperties class. Spring already does this for you when binding datasources to their properties:
@Configuration
public class ConfigDataSource {

    @Bean("datasource-1") // this bean name is used with @Qualifier when autowiring
    @ConfigurationProperties(prefix = "spring.datasource.yourname-datasource-1") // prefix of this datasource's settings in .properties
    public DataSource dataSourcePostgres() {
        return DataSourceBuilder.create().build();
    }

    @Bean("datasource-2") // this bean name is used with @Qualifier when autowiring
    @ConfigurationProperties(prefix = "spring.datasource.yourname-datasource-2") // prefix of this datasource's settings in .properties
    public DataSource dataSourceMySql() {
        return DataSourceBuilder.create().build();
    }
}
.properties
# It is required to use the same prefix declared in the bean
spring.datasource.yourname-datasource-1.url=...
spring.datasource.yourname-datasource-1.jdbcUrl=${spring.datasource.yourname-datasource-1.url}
spring.datasource.yourname-datasource-1.username=user
spring.datasource.yourname-datasource-1.password=pass
spring.datasource.yourname-datasource-1.driver-class-name=your.driver

spring.datasource.yourname-datasource-2.url=...
spring.datasource.yourname-datasource-2.jdbcUrl=${spring.datasource.yourname-datasource-2.url}
spring.datasource.yourname-datasource-2.username=user
spring.datasource.yourname-datasource-2.password=pass
spring.datasource.yourname-datasource-2.driver-class-name=your.driver
Using them in a service:
@Autowired
@Qualifier("datasource-1")
private DataSource dataSource1;

@Autowired
@Qualifier("datasource-2")
private DataSource dataSource2;

public void testJdbcTemplate() {
    // You could also define qualified JdbcTemplate beans instead of instantiating them in the service
    JdbcTemplate jdbcTemplateDatasource1 = new JdbcTemplate(dataSource1);
    JdbcTemplate jdbcTemplateDatasource2 = new JdbcTemplate(dataSource2);
}
To my knowledge, there is no way around it: if you want access to multiple databases, Spring will not be able to do the magic for you. You will need to create the two Spring-managed JdbcTemplate beans and then inject them where needed using the @Qualifier annotation (see the sketch after the list below).
This approach has two benefits:
It actually works;
You are explicit about what you are doing. A Spring application already has a good deal of magic happening inside, so we might want to avoid adding more when we need something a little more custom that is not that complex to achieve.
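To make the first option from the question concrete, here is a minimal sketch of that explicit wiring, reusing the asker's prefixes (the bean names are illustrative):

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

@Configuration
public class ExplicitJdbcTemplateConfig {

    // Each DatabaseProperties bean is bound to its own prefix
    @Bean
    @ConfigurationProperties(prefix = "datasource.database1")
    public DatabaseProperties database1Properties() {
        return new DatabaseProperties();
    }

    @Bean
    @ConfigurationProperties(prefix = "datasource.database2")
    public DatabaseProperties database2Properties() {
        return new DatabaseProperties();
    }

    // Each JdbcTemplate picks its properties bean by qualifier
    @Bean
    public JdbcTemplate jdbcTemplateDatabase1(@Qualifier("database1Properties") DatabaseProperties props) {
        return new JdbcTemplate(new DriverManagerDataSource(
                props.getUrl(), props.getUsername(), props.getPassword()));
    }

    @Bean
    public JdbcTemplate jdbcTemplateDatabase2(@Qualifier("database2Properties") DatabaseProperties props) {
        return new JdbcTemplate(new DriverManagerDataSource(
                props.getUrl(), props.getUsername(), props.getPassword()));
    }
}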
This one is really puzzling me. I have a Spring Boot (2.1.2) app where I am managing two data sources via MyBatis and Spring. I have multiple MyBatis mappers and each one is configured to use a particular data source. The code for that configuration is below:
import org.apache.ibatis.session.SqlSessionFactory;
import org.mybatis.spring.SqlSessionFactoryBean;
import org.mybatis.spring.mapper.MapperFactoryBean;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import javax.sql.DataSource;
@Configuration
public class MyBatisConfig {

    private static final String POSTGRES_SESSION_FACTORY = "postgresSessionFactory";
    private static final String MYSQL_SESSION_FACTORY = "mySqlDbSessionFactory";

    @Bean(name = POSTGRES_SESSION_FACTORY, destroyMethod = "")
    @Primary
    public SqlSessionFactoryBean postgresSessionFactory(
            @Qualifier(DataSourceConfig.PRIMARY_DATA_SOURCE) final DataSource oneDataSource,
            ApplicationContext applicationContext) throws Exception {
        final SqlSessionFactoryBean sqlSessionFactoryBean = new SqlSessionFactoryBean();
        sqlSessionFactoryBean.setConfigLocation(applicationContext.getResource("classpath:mybatis-config.xml"));
        sqlSessionFactoryBean.setDataSource(oneDataSource);
        SqlSessionFactory sqlSessionFactory = sqlSessionFactoryBean.getObject();
        sqlSessionFactory.getConfiguration().addMapper(Mapper1.class);
        sqlSessionFactory.getConfiguration().addMapper(Mapper2.class);
        return sqlSessionFactoryBean;
    }

    @Bean(name = MYSQL_SESSION_FACTORY, destroyMethod = "")
    public SqlSessionFactoryBean mySqlSessionFactory(
            @Qualifier(DataSourceConfig.SECONDARY_DATA_SOURCE) final DataSource anotherDataSource,
            ApplicationContext applicationContext) throws Exception {
        final SqlSessionFactoryBean sqlSessionFactoryBean = new SqlSessionFactoryBean();
        sqlSessionFactoryBean.setConfigLocation(applicationContext.getResource("classpath:mybatis-config-sql.xml"));
        sqlSessionFactoryBean.setDataSource(anotherDataSource);
        final SqlSessionFactory sqlSessionFactory = sqlSessionFactoryBean.getObject();
        sqlSessionFactory.getConfiguration().addMapper(Mapper3.class);
        return sqlSessionFactoryBean;
    }

    @Bean
    public MapperFactoryBean<Mapper1> accountMapperFactory(@Qualifier(POSTGRES_SESSION_FACTORY) final SqlSessionFactoryBean sqlSessionFactoryBean) throws Exception {
        MapperFactoryBean<Mapper1> factoryBean = new MapperFactoryBean<>(Mapper1.class);
        factoryBean.setSqlSessionFactory(sqlSessionFactoryBean.getObject());
        return factoryBean;
    }

    @Bean
    public MapperFactoryBean<Mapper2> domainMapperFactory(@Qualifier(POSTGRES_SESSION_FACTORY) final SqlSessionFactoryBean sqlSessionFactoryBean) throws Exception {
        MapperFactoryBean<Mapper2> factoryBean = new MapperFactoryBean<>(Mapper2.class);
        factoryBean.setSqlSessionFactory(sqlSessionFactoryBean.getObject());
        return factoryBean;
    }

    @Bean
    public MapperFactoryBean<Mapper3> usageMapperFactory(@Qualifier(MYSQL_SESSION_FACTORY) final SqlSessionFactoryBean sqlSessionFactoryBean) throws Exception {
        MapperFactoryBean<Mapper3> factoryBean = new MapperFactoryBean<>(Mapper3.class);
        factoryBean.setSqlSessionFactory(sqlSessionFactoryBean.getObject());
        return factoryBean;
    }
}
If I use my debugger, I can verify that at the time these beans are being initialized, all of them point to the correct data source (Mapper1's and Mapper2's SqlSessionFactory instances connect to the postgres datasource; Mapper3's SqlSessionFactory connects to the mysql datasource).
But strangely, when they are injected into a service, all three Mappers are connected to the postgres datasource. I am beyond confused at this point.
The service and injection are quite simple:
@Autowired private Mapper1 mapper1;
@Autowired private Mapper2 mapper2;
@Autowired private Mapper3 mapper3;
However when I call the service and stop it with the debugger, I can see that mapper3 is connected to the wrong datasource (postgres).
Any ideas? Any more information needed?
I had the same effect when configuring multiple datasources with MyBatis: at runtime, all mappers were using the connection credentials of the last defined datasource, although during startup everything looked fine.
I could narrow the problem down to the factoryBean definition and the setting of the configuration. As it seems, the configuration object is shared among all datasources (as I set them here), because I set the same instance from the MybatisProperties. At runtime, the configuration is also the place MyBatis gets the datasource from, via its Environment property.
The "magic" happens in factoryBean.getObject() and factoryBean.buildSqlSessionFactory(). Here, the datasource is set into the configuration. If the configuration was not explicitly set, a new default one is created; otherwise, the configuration object is reused. Thus, when several factory beans are initialized with the same configuration object, every datasource is injected into it in turn and the last one remains.
This might be a bug, because as a MyBatis user I expect MyBatis to use the datasource I provide with factoryBean.setDataSource(aDatasource). Indeed, when I commented out factoryBean.setConfiguration, everything worked as intended, but of course my configuration was not applied.
@Bean("sqlSessionFactoryA")
public SqlSessionFactory sqlSessionFactory(@Qualifier("datasourceA") DataSource aDataSource) throws Exception {
    SqlSessionFactoryBean factoryBean = new SqlSessionFactoryBean();
    factoryBean.setDataSource(aDataSource);
    factoryBean.setConfigurationProperties(properties.getConfigurationProperties());
    // factoryBean.setConfiguration(properties.getConfiguration()); // sharing this configuration object caused the problem
    return factoryBean.getObject();
}
As a solution, I set the MybatisProperties bean to prototype scope, so that a new properties object with a new configuration object is created on each injection. This is necessary because I don't want to write a copy method, and org.apache.ibatis.session.Configuration does not provide a copy constructor.
@Bean
@Primary
@Scope("prototype")
public MybatisProperties myBatisProperties() {
    return new MybatisProperties();
}
Now I could also apply the configuration and everything worked fine.
org.apache.ibatis.session.Configuration tmpConfiguration = properties.getConfiguration();
factoryBean.setConfiguration(tmpConfiguration);
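Putting it together, each session factory can then pull its own fresh MybatisProperties instance. A sketch assuming the prototype-scoped bean above; the ObjectProvider wiring is my illustration, not the answerer's verified code:

@Bean("sqlSessionFactoryA")
public SqlSessionFactory sqlSessionFactoryA(
        @Qualifier("datasourceA") DataSource aDataSource,
        ObjectProvider<MybatisProperties> propertiesProvider) throws Exception {
    // Each getObject() call returns a new MybatisProperties (and thus a new
    // org.apache.ibatis.session.Configuration), because the bean is prototype-scoped
    MybatisProperties properties = propertiesProvider.getObject();
    SqlSessionFactoryBean factoryBean = new SqlSessionFactoryBean();
    factoryBean.setDataSource(aDataSource);
    factoryBean.setConfiguration(properties.getConfiguration());
    return factoryBean.getObject();
}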
Context
The server runs on spring-boot and utilizes spring-data. The database being used is postgresql.
Problem
Some of the components read from information_schema, pg_user, pg_policies, and pg_catalog. These components' PostConstruct methods currently run before JPA schema creation does. This means that the information the components are trying to fetch hasn't been created by JPA yet, so the components crash.
Prior Research
No errors are being thrown by hibernate itself. Running the server twice makes the problematic components run correctly. This implies that these components are running before JPA.
My properties file includes spring.jpa.hibernate.ddl-auto = update. I tried to find the code behind spring.jpa.hibernate.ddl-auto to see how I could get the components to require it by way of @DependsOn, but I have yet to find anything on it.
I can't simply wait for ApplicationReadyEvent with an event listener, as that will break the dependencies between these components.
Code
These are my Hikari data sources:
@RequiredArgsConstructor
@Configuration
@EnableConfigurationProperties
public class DatabaseConfiguration {

    @Bean(name = "server")
    @ConfigurationProperties(prefix = "server.datasource")
    public HikariDataSource server() {
        return (HikariDataSource) DataSourceBuilder.create().build();
    }

    @Bean(name = "client")
    @ConfigurationProperties(prefix = "client.datasource")
    public HikariDataSource client() {
        return (HikariDataSource) DataSourceBuilder.create().build();
    }
}
I have a custom DataSource component.
@Component
public class DatabaseRouterBean {

    private final AwsCognitoConfiguration cognitoConfiguration;
    private final DatabaseService databaseService;
    private final HikariDataSource server;
    private final HikariDataSource client;
    private final ModelSourceInformation modelSourceInformation;

    public DatabaseRouterBean(
            @Qualifier("server") final HikariDataSource server,
            @Qualifier("client") final HikariDataSource client,
            final AwsCognitoConfiguration cognitoConfiguration,
            final DatabaseService databaseService,
            final ModelSourceInformation modelSourceInformation
    ) {
        this.server = server;
        this.client = client;
        this.cognitoConfiguration = cognitoConfiguration;
        this.databaseService = databaseService;
        this.modelSourceInformation = modelSourceInformation;
    }

    @Bean
    @Primary
    public DatabaseRouter dataSource() {
        return new DatabaseRouter(cognitoConfiguration, databaseService, server, client, modelSourceInformation);
    }
}
The following is the implementation of the data source.
// could have a better name
@RequiredArgsConstructor
@Log4j2
public class DatabaseRouter implements DataSource {

    private final AwsCognitoConfiguration config;
    private final DatabaseService databaseService;
    private final HikariDataSource superuser;
    private final HikariDataSource user;
    private final ModelSourceInformation modelSourceInformation;
The custom data source component is used to create connections for entity managers using one of two accounts on the database for the purpose of multi-tenancy. One account is superuser while the other is a limited user account. Multi-tenancy is achieved through the use of policies. The custom data source runs SET_CONFIG on the connection.
DatabaseService is a very low level service class that supports reading from information_schema, pg_user, pg_policies, and pg_catalog.
@Service
@Log4j
public class DatabaseServiceImpl implements DatabaseService {

    private final HikariDataSource server;
    private final HikariDataSource client;
ModelSourceInformation has no dependencies. It is used to convert a class type into a configuration variable name and vice versa. It is used by the custom data source to populate SET_CONFIG based on the type of user. It supports defining configuration variables and tying them to models by way of annotations.
AwsCognitoConfiguration is simply a Configuration class that reads the cognito settings from the properties file.
Defined Execution Order By Dependency
DatabaseConfiguration, ModelSourceInformation, AwsCognitoConfiguration
DatabaseService
DatabaseRouter
JPA
Rest of beans
The following components are initialized before jpa. They need to be initialized after jpa. There are dependencies between them.
ModelDynamismInformation
ModelEntityInformation
ModelInformation
ModelPrimaryKeyInformation
ModelSchemaInformation
ModelSecurityInformation
PolicyInitializer
You can use @DependsOn to control the order in which beans get initialized. A bean depending on the EntityManagerFactory should be initialized after Hibernate has done its schema creation.
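A minimal sketch, assuming Spring Boot's default entityManagerFactory bean name (the component shown is illustrative):

import javax.annotation.PostConstruct;
import org.springframework.context.annotation.DependsOn;
import org.springframework.stereotype.Component;

@Component
@DependsOn("entityManagerFactory") // wait until Hibernate's ddl-auto run has happened
public class PolicyInitializer {

    @PostConstruct
    void init() {
        // Reading information_schema / pg_policies is safe here:
        // the JPA schema has already been created.
    }
}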
I have a data source configuration class that looks as follows, with separate DataSource beans for testing and non-testing environments using jOOQ. In my code, I do not use DSLContext.transaction(ctx -> {...}) but rather mark the method as transactional, so that jOOQ defers to Spring's declarative transactions for transactionality. I am using Spring 4.3.7.RELEASE.
I have the following issue:
During testing (JUnit), @Transactional works as expected. A single method is transactional no matter how many times I use the DSLContext's store() method, and a RuntimeException triggers a rollback of the entire transaction.
During actual production runtime, @Transactional is completely ignored. A method is no longer transactional, and TransactionSynchronizationManager.getResourceMap() holds two separate values: one pointing to my connection pool (which is not transactional), and one pointing to the TransactionAwareDataSourceProxy.
In this case, I would have expected only a single resource of type TransactionAwareDataSourceProxy which wraps my DB CP.
After much trial and error, using the second set of configuration changes I made (noted below with "AFTER"), @Transactional works correctly as expected even during runtime, though TransactionSynchronizationManager.getResourceMap() holds the following value:
In this case, my DataSourceTransactionManager seems to not even know the TransactionAwareDataSourceProxy (most likely due to my passing it the simple DataSource, and not the proxy object), which seems to completely 'skip' the proxy anyway.
My question is: the initial configuration that I had seemed correct, but did not work. The proposed 'fix' works, but IMO should not work at all (since the transaction manager does not seem to be aware of the TransactionAwareDataSourceProxy).
What is going on here? Is there a cleaner way to fix this issue?
BEFORE (not transactional during runtime)
@Configuration
@EnableTransactionManagement
@RefreshScope
@Slf4j
public class DataSourceConfig {

    @Bean
    @Primary
    public DSLContext dslContext(org.jooq.Configuration configuration) throws SQLException {
        return new DefaultDSLContext(configuration);
    }

    @Bean
    @Primary
    public org.jooq.Configuration defaultConfiguration(DataSourceConnectionProvider dataSourceConnectionProvider) {
        org.jooq.Configuration configuration = new DefaultConfiguration()
                .derive(dataSourceConnectionProvider)
                .derive(SQLDialect.POSTGRES_9_5);
        configuration.set(new DeleteOrUpdateWithoutWhereListener());
        return configuration;
    }

    @Bean
    public DataSourceTransactionManager transactionManager(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

    @Bean
    public DataSourceConnectionProvider dataSourceConnectionProvider(DataSource dataSource) {
        return new DataSourceConnectionProvider(dataSource);
    }

    @Configuration
    @ConditionalOnClass(EmbeddedPostgres.class)
    static class EmbeddedDataSourceConfig {

        @Value("${spring.jdbc.port}")
        private int dbPort;

        @Bean(destroyMethod = "close")
        public EmbeddedPostgres embeddedPostgres() throws Exception {
            return EmbeddedPostgresHelper.startDatabase(dbPort);
        }

        @Bean
        @Primary
        public DataSource dataSource(EmbeddedPostgres embeddedPostgres) throws Exception {
            DataSource dataSource = embeddedPostgres.getPostgresDatabase();
            return new TransactionAwareDataSourceProxy(dataSource);
        }
    }

    @Configuration
    @ConditionalOnMissingClass("com.opentable.db.postgres.embedded.EmbeddedPostgres")
    @RefreshScope
    static class DefaultDataSourceConfig {

        @Value("${spring.jdbc.url}")
        private String url;

        @Value("${spring.jdbc.username}")
        private String username;

        @Value("${spring.jdbc.password}")
        private String password;

        @Value("${spring.jdbc.driverClass}")
        private String driverClass;

        @Value("${spring.jdbc.MaximumPoolSize}")
        private Integer maxPoolSize;

        @Bean
        @Primary
        @RefreshScope
        public DataSource dataSource() {
            log.debug("Connecting to datasource: {}", url);
            HikariConfig hikariConfig = buildPool();
            DataSource dataSource = new HikariDataSource(hikariConfig);
            return new TransactionAwareDataSourceProxy(dataSource);
        }

        private HikariConfig buildPool() {
            HikariConfig config = new HikariConfig();
            config.setJdbcUrl(url);
            config.setUsername(username);
            config.setPassword(password);
            config.setDriverClassName(driverClass);
            config.setConnectionTestQuery("SELECT 1");
            config.setMaximumPoolSize(maxPoolSize);
            return config;
        }
    }
}
AFTER (transactional during runtime, as expected; all non-listed beans identical to above)
@Configuration
@EnableTransactionManagement
@RefreshScope
@Slf4j
public class DataSourceConfig {

    @Bean
    public DataSourceConnectionProvider dataSourceConnectionProvider(TransactionAwareDataSourceProxy dataSourceProxy) {
        return new DataSourceConnectionProvider(dataSourceProxy);
    }

    @Bean
    public TransactionAwareDataSourceProxy transactionAwareDataSourceProxy(DataSource dataSource) {
        return new TransactionAwareDataSourceProxy(dataSource);
    }

    @Configuration
    @ConditionalOnMissingClass("com.opentable.db.postgres.embedded.EmbeddedPostgres")
    @RefreshScope
    static class DefaultDataSourceConfig {

        @Value("${spring.jdbc.url}")
        private String url;

        @Value("${spring.jdbc.username}")
        private String username;

        @Value("${spring.jdbc.password}")
        private String password;

        @Value("${spring.jdbc.driverClass}")
        private String driverClass;

        @Value("${spring.jdbc.MaximumPoolSize}")
        private Integer maxPoolSize;

        @Bean
        @Primary
        @RefreshScope
        public DataSource dataSource() {
            log.debug("Connecting to datasource: {}", url);
            HikariConfig hikariConfig = buildPoolConfig();
            return new HikariDataSource(hikariConfig); // not returning the proxy here
        }
    }
}
I'll turn my comments into an answer.
The transaction manager should NOT be aware of the proxy. From the documentation:
Note that the transaction manager, for example DataSourceTransactionManager, still needs to work with the underlying DataSource, not with this proxy.
The class TransactionAwareDataSourceProxy is a special-purpose class that is not needed in most cases. Anything that interfaces with your data source through the Spring framework infrastructure should NOT have the proxy in its chain of access. The proxy is intended for code that cannot interface with the Spring infrastructure, for example a third-party library that was already set up to work with JDBC and does not accept any of Spring's JDBC templates. This is stated in the same docs as above:
This proxy allows data access code to work with the plain JDBC API and still participate in Spring-managed transactions, similar to JDBC code in a J2EE/JTA environment. However, if possible, use Spring's DataSourceUtils, JdbcTemplate or JDBC operation objects to get transaction participation even without a proxy for the target DataSource, avoiding the need to define such a proxy in the first place.
If you do not have any code that needs to bypass the Spring framework, then do not use TransactionAwareDataSourceProxy at all. If you do have legacy code like this, then you will need to do what you already configured in your second setup: create two beans, one of which is the data source and one of which is the proxy. Give the data source to all of the Spring-managed types and the proxy to the legacy types, as sketched below.
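A condensed sketch of that arrangement (the bean names and the pool are illustrative):

import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;
import org.springframework.jdbc.datasource.TransactionAwareDataSourceProxy;
import org.springframework.transaction.annotation.EnableTransactionManagement;
import com.zaxxer.hikari.HikariDataSource;

@Configuration
@EnableTransactionManagement
public class TxConfig {

    // The plain pool: hand this to everything going through Spring infrastructure
    @Bean
    public DataSource dataSource() {
        return new HikariDataSource(); // pool configuration omitted in this sketch
    }

    // The transaction manager must see the underlying DataSource, not the proxy
    @Bean
    public DataSourceTransactionManager transactionManager(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

    // Only legacy code that talks to the plain JDBC API gets the proxy
    @Bean
    public TransactionAwareDataSourceProxy legacyDataSource(DataSource dataSource) {
        return new TransactionAwareDataSourceProxy(dataSource);
    }
}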
I have seen a lot of examples of Spring Batch projects where either (a) a dataSource is defined, or (b) no dataSource is defined.
However, in my project, I would like my business logic to have access to a dataSource, but I want Spring Batch to NOT use the dataSource. Is this possible?
This guy has a similar problem: Spring boot + spring batch without DataSource
Generally, using Spring Batch without a database is not a good idea, since there could be concurrency issues depending on the kind of job you define. So at least using an in-memory DB is strongly advised, especially if you plan to use the job in production.
Using Spring Batch with Spring Boot will initialize an in-memory datasource if you do not configure your own datasource(s).
Taking this into account, let me redefine your question as follows: can my business logic use a different datasource than the one Spring Batch uses to update its BATCH tables?
Yes, it can. As a matter of fact, you can use as many datasources as you want inside your Spring Batch jobs. Just use by-name autowiring.
Here is how I do it:
I always use a configuration class which defines all the datasources I have to use in my jobs:
@Configuration
public class DatasourceConfiguration {

    @Bean
    @ConditionalOnMissingBean(name = "dataSource")
    public DataSource dataSource() {
        // create the datasource used by spring batch;
        // for instance, create an in-memory datasource using the EmbeddedDatabaseFactory
        return ...;
    }

    @Bean
    @ConditionalOnMissingBean(name = "bl1datasource")
    public DataSource bl1datasource() {
        return ...; // your first datasource that is used in your business logic
    }

    @Bean
    @ConditionalOnMissingBean(name = "bl2datasource")
    public DataSource bl2datasource() {
        return ...; // your second datasource that is used in your business logic
    }
}
Three points to note:
Spring Batch is looking for a datasource with the name "dataSource". If you do not provide this EXACT name (uppercase 'S'), Spring Batch will try to autowire by type, and if it finds more than one instance of DataSource, it will throw an exception.
Put your datasource configuration in its own class. Do not put it in the same class as your job definitions. Spring needs to be able to instantiate the "dataSource" bean very early when it loads the context, before it starts to instantiate your job and step beans, and it will not be able to do this correctly if the datasource definitions live in the same class as your job/step definitions.
Using @ConditionalOnMissingBean is not mandatory, but I found it good practice. It makes it easy to change the datasources for unit/integration tests: just provide an additional test configuration in the ContextConfiguration of your unit/IT test which, for instance, overrides "bl1datasource" with an in-memory datasource:
@Configuration
public class TestBL1DatasourceConfiguration {

    // overriding bl1datasource with an in-memory datasource
    @Bean
    public DataSource bl1datasource() {
        return new EmbeddedDatabaseFactory().getDatabase();
    }
}
In order to use the business logic datasources, use injection by name:
@Component
public class PrepareRe1Re2BezStepCreatorComponent {

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private DataSource bl1datasource;

    @Autowired
    private DataSource bl2datasource;

    public Step createStep() throws Exception {
        SimpleStepBuilder<..., ...> builder =
                stepBuilderFactory.get("astep") //
                        .<..., ...> chunk(100) //
                        .reader(createReader(bl1datasource)) //
                        .writer(createWriter(bl2datasource)); //
        return builder.build();
    }
}
Furthermore, you probably want to consider using XA datasources if you'd like to work with several datasources.
Edited:
Since it seems that you really don't want to use a datasource, you have to implement your own BatchConfigurer (http://docs.spring.io/spring-batch/trunk/apidocs/org/springframework/batch/core/configuration/annotation/BatchConfigurer.html) (as Michael Minella - the SpringBatch project lead - pointed out above).
You can use the code of org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer as a starting point for your own implementation. Simply remove all the datasource/transaction-manager code and keep the content of the if (datasource == null) part in the initialize method. This will initialize a map-based JobRepository and JobExplorer. But again, this is NOT a usable solution in a production environment, since it is not thread-safe.
Edited:
How to implement it:
Configuration class that defines the "businessDataSource":
@Configuration
public class DataSourceConfigurationSimple {

    DataSource embeddedDataSource;

    @Bean
    public DataSource myBusinessDataSource() {
        if (embeddedDataSource == null) {
            EmbeddedDatabaseFactory factory = new EmbeddedDatabaseFactory();
            embeddedDataSource = factory.getDatabase();
        }
        return embeddedDataSource;
    }
}
The implementation of a specific BatchConfigurer:
(of course, the methods have to be implemented...)
public class MyBatchConfigurer implements BatchConfigurer {

    @Override
    public JobRepository getJobRepository() throws Exception {
        return null;
    }

    @Override
    public PlatformTransactionManager getTransactionManager() throws Exception {
        return null;
    }

    @Override
    public JobLauncher getJobLauncher() throws Exception {
        return null;
    }

    @Override
    public JobExplorer getJobExplorer() throws Exception {
        return null;
    }
}
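For illustration, here is one way those stubs might be filled in, following the map-based branch of DefaultBatchConfigurer. The factory classes below exist in Spring Batch, but this exact wiring is a sketch, not the answerer's verified code, and it remains unsuitable for production since the map-based repository is not thread-safe:

import javax.annotation.PostConstruct;
import org.springframework.batch.core.configuration.annotation.BatchConfigurer;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.explore.support.MapJobExplorerFactoryBean;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean;
import org.springframework.batch.support.transaction.ResourcelessTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;

public class MyBatchConfigurer implements BatchConfigurer {

    private PlatformTransactionManager transactionManager;
    private JobRepository jobRepository;
    private JobExplorer jobExplorer;
    private JobLauncher jobLauncher;

    @PostConstruct
    void initialize() throws Exception {
        // No datasource: a resourceless transaction manager and map-based repository
        this.transactionManager = new ResourcelessTransactionManager();

        MapJobRepositoryFactoryBean repositoryFactory =
                new MapJobRepositoryFactoryBean(this.transactionManager);
        repositoryFactory.afterPropertiesSet();
        this.jobRepository = repositoryFactory.getObject();

        MapJobExplorerFactoryBean explorerFactory =
                new MapJobExplorerFactoryBean(repositoryFactory);
        explorerFactory.afterPropertiesSet();
        this.jobExplorer = explorerFactory.getObject();

        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(this.jobRepository);
        launcher.afterPropertiesSet();
        this.jobLauncher = launcher;
    }

    @Override
    public JobRepository getJobRepository() { return jobRepository; }

    @Override
    public PlatformTransactionManager getTransactionManager() { return transactionManager; }

    @Override
    public JobLauncher getJobLauncher() { return jobLauncher; }

    @Override
    public JobExplorer getJobExplorer() { return jobExplorer; }
}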
And finally the main configuration and launch class:
@SpringBootApplication
@Configuration
@EnableBatchProcessing
// Importing MyBatchConfigurer will install your BatchConfigurer instead of
// the SpringBatch default configurer.
@Import({DataSourceConfigurationSimple.class, MyBatchConfigurer.class})
public class SimpleTestJob {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Bean
    public Job job() throws Exception {
        SimpleJobBuilder standardJob = this.jobs.get(JOB_NAME)
                .start(step1());
        return standardJob.build();
    }

    protected Step step1() throws Exception {
        TaskletStepBuilder standardStep1 = this.steps.get("SimpleTest_step1_Step")
                .tasklet(tasklet());
        return standardStep1.build();
    }

    protected Tasklet tasklet() {
        return (contribution, context) -> {
            System.out.println("tasklet called");
            return RepeatStatus.FINISHED;
        };
    }

    public static void main(String[] args) throws Exception {
        SpringApplication.run(SimpleTestJob.class, args);
    }
}