I have a Spring Boot app for which I have configured two data sources. So far I've configured the data sources in my Application class (annotated with @EnableAutoConfiguration):
@Bean
@Primary
@ConfigurationProperties(prefix="datasource.db1")
public DataSource dataSource() {
return DataSourceBuilder.create().build();
}
@Bean
@ConfigurationProperties(prefix="datasource.db2")
public DataSource secondaryDataSource() {
return DataSourceBuilder.create().build();
}
I also added the configuration values to application.properties:
datasource.db1.url=...
...
datasource.db2.url=...
...
Since db1 is the @Primary data source, it is chosen by default. How do I tell an interface extending JpaRepository that it should use db2 instead?
UPDATE: to clarify, my repository is an interface.
You can get the bean associated with the secondary data source from the application context.
For example, in Application.java (I'm also using Spring Boot) you define:
@Bean
@ConfigurationProperties(prefix="datasource.secondary")
public DataSource secondaryDataSource() {
return DataSourceBuilder.create().build();
}
and in a service (here for calling a stored procedure) you have:
@Service
public class EngineImpl implements EngineDao {
private SetScartiProcedure setScarti;
@Autowired
public void init(ApplicationContext ctx) {
DataSource dataSource = (DataSource) ctx.getBean("secondaryDataSource");
this.setScarti = new SetScartiProcedure(dataSource);
}
public class SetScartiProcedure extends StoredProcedure {
...
}
}
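For completeness, a minimal sketch of what the elided SetScartiProcedure might contain; the procedure name and parameter are made-up assumptions, not taken from the original post:
import java.sql.Types;
import java.util.HashMap;
import java.util.Map;
import javax.sql.DataSource;
import org.springframework.jdbc.core.SqlParameter;
import org.springframework.jdbc.object.StoredProcedure;
public class SetScartiProcedure extends StoredProcedure {
// "set_scarti" and its parameter are hypothetical; adjust to the real stored procedure
public SetScartiProcedure(DataSource dataSource) {
super(dataSource, "set_scarti");
declareParameter(new SqlParameter("id_scarto", Types.BIGINT));
compile();
}
public Map<String, Object> run(long idScarto) {
Map<String, Object> in = new HashMap<>();
in.put("id_scarto", idScarto);
return execute(in);
}
}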
Based on this you can define several DataSources this way:
@Bean
public LocalContainerEntityManagerFactoryBean customerEntityManagerFactory(
EntityManagerFactoryBuilder builder) {
return builder
.dataSource(customerDataSource())
.packages(Customer.class)
.persistenceUnit("customers")
.build();
}
@Bean
public LocalContainerEntityManagerFactoryBean orderEntityManagerFactory(
EntityManagerFactoryBuilder builder) {
return builder
.dataSource(orderDataSource())
.packages(Order.class)
.persistenceUnit("orders")
.build();
}
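The customerDataSource() and orderDataSource() methods referenced above are not shown here; a minimal sketch of how they might be declared (the property prefixes are assumptions):
@Bean
@Primary
@ConfigurationProperties(prefix = "datasource.customer")
public DataSource customerDataSource() {
return DataSourceBuilder.create().build();
}
@Bean
@ConfigurationProperties(prefix = "datasource.order")
public DataSource orderDataSource() {
return DataSourceBuilder.create().build();
}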
and then bind each of them to the classes it manages:
@Configuration
@EnableJpaRepositories(basePackageClasses = Customer.class,
entityManagerFactoryRef = "customerEntityManagerFactory")
public class CustomerConfiguration {
...
}
@Configuration
@EnableJpaRepositories(basePackageClasses = Order.class,
entityManagerFactoryRef = "orderEntityManagerFactory")
public class OrderConfiguration {
...
}
The repositories should know which database to use through the DataSource that was bound to their configuration class.
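When splitting JPA across two entity manager factories like this, each @EnableJpaRepositories usually also needs its own transaction manager via transactionManagerRef; a hedged sketch (the bean names are assumptions matching the factories above):
@Bean
public PlatformTransactionManager customerTransactionManager(
@Qualifier("customerEntityManagerFactory") EntityManagerFactory entityManagerFactory) {
return new JpaTransactionManager(entityManagerFactory);
}
@Bean
public PlatformTransactionManager orderTransactionManager(
@Qualifier("orderEntityManagerFactory") EntityManagerFactory entityManagerFactory) {
return new JpaTransactionManager(entityManagerFactory);
}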
I am following this link to understand hexagonal architecture with Spring Boot. The infrastructure section contains the configuration for the service bean, and the repository is passed as a parameter, as in the method below.
Configuration
@Configuration
@ComponentScan(basePackageClasses = HexagonalApplication.class)
public class BeanConfiguration {
@Bean
BankAccountService bankAccountService(BankAccountRepository repository) {
return new BankAccountService(repository, repository);
}
}
I am not using JPA; I am using Spring JDBC for interacting with the DB. The linked tutorial uses JPA.
Let's say I have different database implementations, i.e. PostgreSQL (BankAccountRepository) and DB2 (BankAccountDB2Rep). I want to switch between the beans without touching the code, e.g. with YAML configuration or something I can maintain separately.
BankAccountRepository.java
@Component
public class BankAccountRepository implements LoadAccountPort, SaveAccountPort {
private SpringDataBankAccountRepository repository;
// Constructor
@Override
public Optional<BankAccount> load(Long id) {
return repository.findById(id);
}
@Override
public void save(BankAccount bankAccount) {
repository.save(bankAccount);
}
}
How can I achieve the same in Spring Boot? Any help is appreciated.
You can refer to
Spring Boot Configure and Use Two DataSources for creating multiple data sources, and do something like the following.
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
entityManagerFactoryRef = "entityManagerFactory",
transactionManagerRef = "transactionManager",
basePackages = {
"com.example"
}
)
public class JPAConfig {
@Primary
@Bean(name = "postgresDataSource")
@ConfigurationProperties(prefix = "postgres.datasource")
public DataSource postgresDataSource() {
return DataSourceBuilder.create().build();
}
@Bean(name = "db2DataSource")
@ConfigurationProperties(prefix = "db2.datasource")
public DataSource db2DataSource() {
return DataSourceBuilder.create().build();
}
@Primary
@Bean(name = "entityManagerFactory")
public LocalContainerEntityManagerFactoryBean entityManagerFactory(
EntityManagerFactoryBuilder builder,
#Qualifier("postgresDataSource") DataSource postgresdataSource,
#Qualifier("db2DataSource") DataSource db2dataSource,
#Value("${useDb2}") Boolean useDb2
) {
return builder
.dataSource(useDb2 ? db2dataSource : postgresdataSource)
.packages("com.example")
.persistenceUnit("db1")
.build();
}
@Primary
@Bean(name = "transactionManager")
public PlatformTransactionManager transactionManager(
#Qualifier("entityManagerFactory") EntityManagerFactory entityManagerFactory
) {
return new JpaTransactionManager(entityManagerFactory);
}
}
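The configuration above expects entries under the postgres.datasource and db2.datasource prefixes plus the useDb2 flag in application.properties; a sketch with placeholder values:
useDb2=false
postgres.datasource.url=jdbc:postgresql://localhost:5432/mydb
postgres.datasource.username=dbuser
postgres.datasource.password=dbpass
postgres.datasource.driver-class-name=org.postgresql.Driver
db2.datasource.url=jdbc:db2://localhost:50000/mydb
db2.datasource.username=dbuser
db2.datasource.password=dbpass
db2.datasource.driver-class-name=com.ibm.db2.jcc.DB2Driver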
As mentioned by @M.Deinum in the comments, the issue can be resolved by using Spring's conditional beans, as below:
@Configuration
@ConditionalOnProperty(
value="module.enabled",
havingValue = "true",
matchIfMissing = true)
class CrossCuttingConcernModule {
...
}
More information can be found here
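Applied to the question, a hedged sketch of how the two repository implementations could be switched purely by configuration; the property name bankaccount.db is an assumption:
@Component
@ConditionalOnProperty(name = "bankaccount.db", havingValue = "postgres", matchIfMissing = true)
public class BankAccountRepository implements LoadAccountPort, SaveAccountPort {
// PostgreSQL-backed implementation, as in the question
}
@Component
@ConditionalOnProperty(name = "bankaccount.db", havingValue = "db2")
public class BankAccountDB2Rep implements LoadAccountPort, SaveAccountPort {
// DB2-backed implementation
}
Setting bankaccount.db=db2 in application.yml or application.properties then activates the DB2 bean without any code change.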
I want to initialize two DataSources in my app, as follows:
@Configuration
public class DataSourceConfig {
@Bean
@Primary
@ConfigurationProperties(prefix="spring.datasource")
public DataSource primaryDataSource() {
return DataSourceBuilder.create().build();
}
@Bean
@ConfigurationProperties(prefix="spring.datasource2")
public DataSource secondaryDataSource() {
return DataSourceBuilder.create().build();
}
}
Now I want to use the secondary datasource explicitly, as follows:
public class SecondaryDbService {
@Autowired
private EntityManager em;
@Autowired
private SecondaryCrudRepository dao;
}
interface SecondaryCrudRepository extends CrudRepository<SecondaryEntity, Long> {
}
If configured as above, the service would use the primary datasource.
Question: how can I tell the CrudRepository to rely on the "secondaryDataSource"? And likewise, how can I inject the EntityManager from the "secondaryDataSource"?
If you want to use multiple datasources, the key is to have the configurations for each Datasource in different packages. You will need to separate your entities between these packages according to which datasource they should access.
You will also have to implement both entity and transaction managers for each datasource in these packages.
Too much theory? In practice it would look something like this:
com.package1
- com.package1.entities
- EntityClass1.java (annotated with @Entity)
- ConfigForDataSource1.java
com.package2
- com.package2.entities
- EntityClass2.java (annotated with @Entity)
- ConfigForDataSource2.java
Here's what ConfigForDataSource1 would look like:
@Configuration
@EnableJpaRepositories(entityManagerFactoryRef = "entityManagerDataSource1",
basePackages = "com.package1",
transactionManagerRef = "TransactionManagerDataSource1")
public class ConfigForDataSource1 {
@Bean(name="DataSource1")
@ConfigurationProperties(prefix = "datasource1.datasource")
public DataSource dataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name="entityManagerDataSource1")
public LocalContainerEntityManagerFactoryBean entityManagerDataSource1(EntityManagerFactoryBuilder builder,#Qualifier("DataSource1") DataSource dataSource) {
return builder.dataSource(dataSource).packages("com.package1").persistenceUnit("DataSource1").build();
}
#Bean(name = "TransactionManagerDataSource1")
public PlatformTransactionManager TransactionManagerDataSource1(#Qualifier("entityManagerDataSource1") EntityManagerFactory entityManagerFactory) {
return new JpaTransactionManager(entityManagerFactory);
}
}
Then do the same for package 2 and enjoy.
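For reference, the mirror configuration for package 2 might look like this; it simply follows the same pattern, with names chosen to match the layout above:
@Configuration
@EnableJpaRepositories(entityManagerFactoryRef = "entityManagerDataSource2",
basePackages = "com.package2",
transactionManagerRef = "TransactionManagerDataSource2")
public class ConfigForDataSource2 {
@Bean(name = "DataSource2")
@ConfigurationProperties(prefix = "datasource2.datasource")
public DataSource dataSource2() {
return DataSourceBuilder.create().build();
}
@Bean(name = "entityManagerDataSource2")
public LocalContainerEntityManagerFactoryBean entityManagerDataSource2(EntityManagerFactoryBuilder builder, @Qualifier("DataSource2") DataSource dataSource) {
return builder.dataSource(dataSource).packages("com.package2").persistenceUnit("DataSource2").build();
}
@Bean(name = "TransactionManagerDataSource2")
public PlatformTransactionManager transactionManagerDataSource2(@Qualifier("entityManagerDataSource2") EntityManagerFactory entityManagerFactory) {
return new JpaTransactionManager(entityManagerFactory);
}
}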
Good luck !
My goal is to use Spring Batch with different instances of DataSource for my ItemWriter and the JobRepository respectively, which should work like this.
Unfortunately the Spring container injects the primary datasource at a later stage, which I can confirm via the debugger. Here's my configuration:
@RunWith(SpringJUnit4ClassRunner.class)
@DataJpaTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
@SpringBootTest(classes = { BatchTest.DatabaseConfig.class, BatchTest.BatchTestConfig.class })
public class BatchTest {
@Configuration
static class DatabaseConfig {
@Bean
@Primary
@ConfigurationProperties("spring.datasource")
public DataSource primaryDataSource() {
return DataSourceBuilder.create()
.build();
}
@Bean
@ConfigurationProperties("spring.secondaryDatasource")
public DataSource secondaryDataSource() {
return DataSourceBuilder.create()
.build();
}
}
@Configuration
@EnableBatchProcessing
static class BatchTestConfig {
@Bean
BatchConfigurer configurer(@Qualifier("secondaryDataSource") DataSource dataSource) {
return new DefaultBatchConfigurer(dataSource);
}
}
}
I reckon this is due to the setter-injection defined in
package org.springframework.batch.core.configuration.annotation;
@Component
public class DefaultBatchConfigurer implements BatchConfigurer {
@Autowired(required = false)
public void setDataSource(DataSource dataSource) {
this.dataSource = dataSource;
this.transactionManager = new DataSourceTransactionManager(dataSource);
}
}
So now I'm wondering how the above-mentioned SO response works, or rather doesn't work, in my case. Can I somehow disable the additional setter injection on the provided bean?
Try to override DefaultBatchConfigurer#setDataSource and add the qualifier to the setDataSource method:
@Bean
BatchConfigurer configurer(@Qualifier("secondaryDataSource") DataSource dataSource) {
return new DefaultBatchConfigurer(dataSource) {
@Autowired(required = false)
public void setDataSource(@Qualifier("secondaryDataSource") DataSource dataSource) {
super.setDataSource(dataSource);
}
};
}
I agree it's a bit odd, but it's odd too that Spring Batch has such a constraint.
You could even try to override it without any annotation at all; I don't remember whether Spring also looks for the annotation in the class hierarchy.
I am trying to set up multiple data sources (MySQL, Postgres & Oracle) using Spring Boot. I am not using JPA; I am setting up with a JdbcTemplate.
I have tried setting up something like this.
application.properties
spring.datasource.test-oracle.username=test-oracle
spring.datasource.test-oracle.password=test-password
spring.datasource.test-oracle.url=dburl/test
spring.datasource.test-oracle.driver-class-name=oracle.jdbc.OracleDriver
spring.datasource.int-oracle.username=int-oracle
spring.datasource.int-oracle.password=int-password
spring.datasource.int-oracle.url=dburl/int
spring.datasource.int-oracle.driver-class-name=oracle.jdbc.driver.OracleDriver
spring.datasource.d.int-mysql.username=user
spring.datasource.d.int-mysql.password=password
spring.datasource.d.int-mysql.url=dburl/d
spring.datasource.d.int-mysql.driver-class-name=com.mysql.jdbc.Driver
spring.datasource.m.int-mysql.username=user
spring.datasource.m.int-mysql.password=password
spring.datasource.m.int-mysql.url=dburl/m
spring.datasource.m.int-mysql.driver-class-name=com.mysql.jdbc.Driver
spring.datasource.d.test-mysql.username=user
spring.datasource.d.test-mysql.password=password
spring.datasource.d.test-mysql.url=dburl/d
spring.datasource.d.test-mysql.driver-class-name=com.mysql.jdbc.Driver
spring.datasource.m.test-mysql.username=user
spring.datasource.m.test-mysql.password=password
spring.datasource.m.test-mysql.url=dburl/m
spring.datasource.m.test-mysql.driver-class-name=com.mysql.jdbc.Driver
MySqlConfiguration.java
@Configuration
public class MySqlConfiguration {
@Bean(name = "dMySql")
@ConfigurationProperties(prefix = "spring.datasource.d.int-mysql")
public DataSource mysqlDrupalDataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "dJdbc")
public JdbcTemplate drupalJdbcTemplate(DataSource dMySql) {
return new JdbcTemplate(dMySql);
}
#Bean(name = "mMySql")
#ConfigurationProperties(prefix = "spring.datasource.m.int-mysql")
public DataSource mysqlDrupalDataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "mJdbc")
public JdbcTemplate drupalJdbcTemplate(DataSource mMySql) {
return new JdbcTemplate(mMySql);
}
}
OracleConfiguration.java
@Configuration
public class OracleConfiguration {
@Primary
@Bean(name = "tOracle")
@ConfigurationProperties(prefix = "spring.datasource.test-oracle")
public DataSource heOracleDataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "tOracleJdbc")
public JdbcTemplate jdbcTemplate(DataSource tOracle) {
return new JdbcTemplate(tOracle);
}
#Bean(name = "iOracle")
#ConfigurationProperties(prefix = "spring.datasource.int-oracle")
public DataSource heOracleDataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "iOracleJdbc")
public JdbcTemplate jdbcTemplate(DataSource iOracle) {
return new JdbcTemplate(iOracle);
}
}
I am not sure if the above is the correct way to go about this. When I use @Primary as per the Boot docs, the bean that has @Primary is always used. I then use the configurations in my DAO implementations like this.
One of the DAO implementations:
@Repository
public class DAOImpl implements DAOInterface {
@Autowired
@Qualifier("dJdbc")
private JdbcTemplate jdbc;
@Override
public Map<String, Object> getBasicStudentInfo(String MAIL) {
return jdbc.queryForMap(GET_BASIC_STUDENT_INFO, new Object[]{MAIL});
}
How do I go about doing this? I did see many articles about multiple data sources, but unfortunately the examples or solutions don't suit me.
Further to this, I need to be able to query against the DBs based on some user input. So if a user provides an environment, e.g. "test" or "int", how can I trigger the correct properties based on that input?
I understand that Environment is @Autowired into Spring Boot and I can intercept the user input, but I am unsure how to provide the plumbing between the user input and the DAO configurations.
If something is unclear, needs a bit more explanation from my side, or needs more code, I can provide that. Any help to resolve this situation would be appreciated. Thanks.
Here is a complete solution to your problem.
Your configuration classes will look like this:
MySqlConfiguration.java
@Configuration
public class MySqlConfiguration {
#Bean(name = "dMySql")
#ConfigurationProperties(prefix = "spring.datasource.d.int-mysql")
public DataSource mysqlDrupalDataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "dJdbc")
public JdbcTemplate drupalJdbcTemplate(#Qualifier("dMySql") DataSource dMySql) {
return new JdbcTemplate(dMySql);
}
#Bean(name = "mMySql")
#ConfigurationProperties(prefix = "spring.datasource.m.int-mysql")
public DataSource mysqlDrupalDataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "mJdbc")
public JdbcTemplate drupalJdbcTemplate(#Qualifier("mMySql") DataSource mMySql) {
return new JdbcTemplate(mMySql);
}
}
OracleConfiguration.java
@Configuration
public class OracleConfiguration {
@Primary
@Bean(name = "tOracle")
@ConfigurationProperties(prefix = "spring.datasource.test-oracle")
public DataSource heOracleDataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "tOracleJdbc")
public JdbcTemplate jdbcTemplate(#Qualifier("tOracle") DataSource tOracle) {
return new JdbcTemplate(tOracle);
}
#Bean(name = "iOracle")
#ConfigurationProperties(prefix = "spring.datasource.int-oracle")
public DataSource heOracleDataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "iOracleJdbc")
public JdbcTemplate jdbcTemplate(#Qualifier("iOracle") DataSource iOracle) {
return new JdbcTemplate(iOracle);
}
}
and in your DAO class you can autowire the JdbcTemplates like this:
@Repository
public class DAOImpl implements DAOInterface {
@Autowired
@Qualifier("dJdbc")
private JdbcTemplate dJdbc;
@Autowired
@Qualifier("mJdbc")
private JdbcTemplate mJdbc;
@Autowired
@Qualifier("tOracleJdbc")
private JdbcTemplate tOracleJdbc;
@Autowired
@Qualifier("iOracleJdbc")
private JdbcTemplate iOracleJdbc;
@Override
public Map<String, Object> getBasicStudentInfo(String MAIL) {
return dJdbc.queryForMap(GET_BASIC_STUDENT_INFO, new Object[]{MAIL});
}
.
.
.
}
Note: make sure to annotate one of the DataSources with the @Primary annotation.
My setup: spring-boot version 1.2.5.RELEASE
I succeeded in running a setup like this, with the JdbcTemplates being created from the correct DataSources, by adding a @Qualifier in each JdbcTemplate bean method.
So, for every JdbcTemplate method you should match the qualifying DataSource like this:
@Bean(name = "dJdbc")
public JdbcTemplate drupalJdbcTemplate(@Qualifier("dMySql") DataSource dMySql) {
return new JdbcTemplate(dMySql);
}
No matter what you choose for @Primary, using the @Qualifier for every JdbcTemplate should work fine.
Autowiring JdbcTemplates in repositories and using @Qualifier for them is OK as well.
In your DAO you could wire in additional JdbcTemplates; then at runtime you can pick which one to use.
@Repository
public class DAOImpl implements DAOInterface {
@Autowired
@Qualifier("tOracleJdbc")
private JdbcTemplate testJdbc;
@Autowired
@Qualifier("iOracleJdbc")
private JdbcTemplate intJdbc;
@Override
public Map<String, Object> getBasicStudentInfo(String MAIL, String source) {
if ("TEST".equals(source)){
return testJdbc.queryForMap(GET_BASIC_STUDENT_INFO, new Object[]{MAIL});
}else {
return intJdbc.queryForMap(GET_BASIC_STUDENT_INFO, new Object[]{MAIL});
}
}
I am trying to configure a couple of datasources within Spring Batch. On startup, Spring Batch is throwing the following exception:
To use the default BatchConfigurer the context must contain no more than one DataSource, found 2
Snippet from Batch Configuration
@Configuration
@EnableBatchProcessing
public class BatchJobConfiguration {
@Primary
@Bean(name = "baseDatasource")
public DataSource dataSource() {
// first datasource definition here
}
#Bean(name = "secondaryDataSource")
public DataSource dataSource2() {
// second datasource definition here
}
...
}
Not sure why I am seeing this exception, because I have seen some XML-based configurations for Spring Batch that declare multiple datasources. I am using Spring Batch core version 3.0.1.RELEASE with Spring Boot version 1.1.5.RELEASE. Any help would be greatly appreciated.
You must provide your own BatchConfigurer. Spring does not want to make that decision for you:
@Configuration
@EnableBatchProcessing
public class BatchConfig {
@Bean
BatchConfigurer configurer(@Qualifier("batchDataSource") DataSource dataSource){
return new DefaultBatchConfigurer(dataSource);
}
...
AbstractBatchConfiguration tries to look up a BatchConfigurer in the container first; if it is not found, it tries to create one itself - this is where the IllegalStateException is thrown when there is more than one DataSource bean in the container.
The approach to solving the problem is to prevent the creation of the DefaultBatchConfigurer bean in AbstractBatchConfiguration.
To do that, we hint the Spring container to create the DefaultBatchConfigurer itself by using the @Component annotation:
The configuration class where @EnableBatchProcessing is placed can be annotated with @ComponentScan, scanning the package that contains an empty class derived from DefaultBatchConfigurer:
package batch_config;
...
@EnableBatchProcessing
@ComponentScan(basePackageClasses = MyBatchConfigurer.class)
public class MyBatchConfig {
...
}
the full code of that empty derived class is here:
package batch_config.components;
import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.stereotype.Component;
@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {
}
In this configuration the @Primary annotation works for the DataSource bean, as in the example below:
@Configuration
public class BatchTestDatabaseConfig {
@Bean
@Primary
public DataSource dataSource()
{
return .........;
}
}
This works for the Spring Batch version 3.0.3.RELEASE
The simplest solution to make the @Primary annotation on a DataSource work might be just adding @ComponentScan(basePackageClasses = DefaultBatchConfigurer.class) along with the @EnableBatchProcessing annotation:
@Configuration
@EnableBatchProcessing
@ComponentScan(basePackageClasses = DefaultBatchConfigurer.class)
public class MyBatchConfig {
I would like to provide a solution here which is very similar to the one answered by @vanarchi, but I managed to put all the necessary configurations into one class.
For the sake of completeness, the solution here assumes that the primary datasource is HSQL.
@Configuration
@EnableBatchProcessing
public class BatchConfiguration extends DefaultBatchConfigurer {
@Bean
@Primary
public DataSource batchDataSource() {
// no need shutdown, EmbeddedDatabaseFactoryBean will take care of this
EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
EmbeddedDatabase embeddedDatabase = builder
.addScript("classpath:org/springframework/batch/core/schema-drop-hsqldb.sql")
.addScript("classpath:org/springframework/batch/core/schema-hsqldb.sql")
.setType(EmbeddedDatabaseType.HSQL) //.H2 or .DERBY
.build();
return embeddedDatabase;
}
@Override
protected JobRepository createJobRepository() throws Exception {
JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
factory.setDataSource(batchDataSource());
factory.setTransactionManager(transactionManager());
factory.afterPropertiesSet();
return (JobRepository) factory.getObject();
}
private ResourcelessTransactionManager transactionManager() {
return new ResourcelessTransactionManager();
}
// NOTE: the code below is just to give the developer an easy way to access the in-memory HSQL datasource, as we configured it as the primary datasource for storing batch-job-related data. Default username: sa, password: ''
@PostConstruct
public void getDbManager(){
DatabaseManagerSwing.main(
new String[] { "--url", "jdbc:hsqldb:mem:testdb", "--user", "sa", "--password", ""});
}
}
THREE key points in this solution:
This class is annotated with @EnableBatchProcessing and @Configuration, and it extends DefaultBatchConfigurer. By doing this, we instruct spring-batch to use our customized batch configurer when AbstractBatchConfiguration tries to look up a BatchConfigurer;
Annotate the batchDataSource bean as @Primary, which instructs spring-batch to use this datasource as its datasource for storing the nine job-related tables.
Override the protected JobRepository createJobRepository() throws Exception method, which makes the jobRepository bean use the primary datasource and a transactionManager instance different from the one used by the other datasource(s); a sketch of such a second datasource is shown below.
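To complete the picture, the business data would then live in a second, non-primary DataSource with its own transaction manager; a hedged sketch (the bean names and property prefix are assumptions, not from the original answer):
@Bean
@ConfigurationProperties(prefix = "app.datasource")
public DataSource businessDataSource() {
return DataSourceBuilder.create().build();
}
@Bean
public DataSourceTransactionManager businessTransactionManager(
@Qualifier("businessDataSource") DataSource businessDataSource) {
return new DataSourceTransactionManager(businessDataSource);
}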
The simplest solution is to extend the DefaultBatchConfigurer and autowire your datasource via a qualifier:
@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {
/**
* Initialize the BatchConfigurer to use the datasource of your choosing
* @param firstDataSource
*/
@Autowired
public MyBatchConfigurer(@Qualifier("firstDataSource") DataSource firstDataSource) {
super(firstDataSource);
}
}
Side Note (as this also deals with the use of multiple data sources): If you use autoconfig to run data initialization scripts, you may notice that it's not initializing on the datasource you'd expect. For that issue, take a look at this: https://github.com/spring-projects/spring-boot/issues/9528
You can define the beans below and make sure your application.properties file has the entries they need:
@Configuration
@PropertySource("classpath:application.properties")
public class DataSourceConfig {
@Primary
@Bean(name = "abcDataSource")
@ConfigurationProperties(prefix = "abc.datasource")
public DataSource dataSource() {
return DataSourceBuilder.create().type(HikariDataSource.class).build();
}
#Bean(name = "xyzDataSource")
#ConfigurationProperties(prefix = "xyz.datasource")
public DataSource xyzDataSource() {
return DataSourceBuilder.create().type(HikariDataSource.class).build();
}
}
application.properties
abc.datasource.jdbc-url=XXXXX
abc.datasource.username=XXXXX
abc.datasource.password=xxxxx
abc.datasource.driver-class-name=org.postgresql.Driver
...........
...........
...........
...........
Here you can refer: Spring Boot Configure and Use Two DataSources
First, create a custom BatchConfigurer
@Configuration
@Component
public class TwoDataSourcesBatchConfigurer implements BatchConfigurer {
@Autowired
@Qualifier("dataSource1")
DataSource dataSource;
@Override
public JobExplorer getJobExplorer() throws Exception {
...
}
@Override
public JobLauncher getJobLauncher() throws Exception {
...
}
@Override
public JobRepository getJobRepository() throws Exception {
JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
// use the autowired data source
factory.setDataSource(dataSource);
factory.setTransactionManager(getTransactionManager());
factory.afterPropertiesSet();
return factory.getObject();
}
@Override
public PlatformTransactionManager getTransactionManager() throws Exception {
...
}
}
Then,
@Configuration
@EnableBatchProcessing
@ComponentScan("package")
public class JobConfig {
// define job, step, ...
}
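The configurer above assumes that a DataSource bean named "dataSource1" (used for the batch metadata tables) is defined elsewhere in the scanned configuration; a minimal sketch (the property prefix is an assumption):
@Bean(name = "dataSource1")
@ConfigurationProperties(prefix = "batch.datasource")
public DataSource dataSource1() {
return DataSourceBuilder.create().build();
}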