TransactionAttribute not working for simple step - java

(Note this issue might be connected to this question, but it has a much smaller scope.)
I have the simplest of jobs defined like this:
@Configuration
@EnableBatchProcessing
public class FileTransformerConfiguration {

    private JobBuilderFactory jobBuilderFactory;
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    public FileTransformerConfiguration(JobBuilderFactory jobBuilderFactory,
                                        StepBuilderFactory stepBuilderFactory) {
        this.jobBuilderFactory = jobBuilderFactory;
        this.stepBuilderFactory = stepBuilderFactory;
    }

    @Bean
    public Job transformJob() {
        return this.jobBuilderFactory.get("transformJob").incrementer(new RunIdIncrementer())
                .flow(transformStep()).end().build();
    }

    @Bean
    public Step transformStep() {
        return this.stepBuilderFactory.get("transformStep")
                .<String, String>chunk(1).reader(new ItemReader())
                .processor(processor())
                .writer(new ItemWriter()).build();
    }

    @Bean
    public ItemProcessor<String, String> processor() {
        return item -> {
            System.out.println("Converting item (" + item + ")...");
            return item;
        };
    }
}
public class ItemReader implements ItemStreamReader<String> {

    private Iterator<String> it;

    @Override
    public void open(ExecutionContext executionContext) throws ItemStreamException {
        this.it = Arrays.asList("A", "B", "C", "D", "E").iterator();
    }

    @Override
    public String read() throws Exception {
        return this.it.hasNext() ? this.it.next() : null;
    }

    @Override
    public void close() throws ItemStreamException { }

    @Override
    public void update(ExecutionContext executionContext) throws ItemStreamException { }
}
@JobScope
public class ItemWriter implements ItemStreamWriter<String> {

    @Override
    public void open(ExecutionContext executionContext) throws ItemStreamException { }

    @Override
    public void write(List<? extends String> items) throws Exception {
        items.forEach(item -> System.out.println("Writing item: " + item));
    }

    @Override
    public void update(ExecutionContext executionContext) throws ItemStreamException { }

    @Override
    public void close() throws ItemStreamException { }
}
There is no fancy logic, just strings being moved through the pipeline.
The code is called like this:
@SpringBootApplication
public class TestCmpsApplication {
}

@SpringBootTest(classes = {TestCmpsApplication.class})
public class FileTransformerImplIT {

    @Autowired
    private JobLauncher jobLauncher;
    @Autowired
    private Job transformJob;

    @Test
    void test1() throws Exception {
        String id = UUID.randomUUID().toString();
        JobParametersBuilder jobParameters = new JobParametersBuilder();
        jobParameters.addLong("PARAM_START_TIME", System.currentTimeMillis());
        jobParameters.addString("PARAM_MAPPING_RULE_DEFINITION_ID", id, true);
        this.jobLauncher.run(this.transformJob, jobParameters.toJobParameters());
    }

    @Test
    void test2() throws Exception {
        String id = UUID.randomUUID().toString();
        JobParametersBuilder jobParameters = new JobParametersBuilder();
        jobParameters.addLong("PARAM_START_TIME", System.currentTimeMillis());
        jobParameters.addString("PARAM_MAPPING_RULE_DEFINITION_ID", id, true);
        this.jobLauncher.run(this.transformJob, jobParameters.toJobParameters());
    }
}
(Note that there need to be two tests, even though they are identical; the first one always works.)
So this works fine. However, once I add this:
@Bean
public Step transformStep() {
    return this.stepBuilderFactory.get("transformStep")
            .<String, String>chunk(1).reader(new ItemReader())
            .processor(processor())
            .writer(new ItemWriter())
            .transactionAttribute(transactionAttribute()).build();
}

private TransactionAttribute transactionAttribute() {
    DefaultTransactionAttribute attribute = new DefaultTransactionAttribute();
    attribute.setPropagationBehavior(Propagation.NEVER.value());
    return attribute;
}
Now the second test fails. The test itself says:
TransactionSuspensionNotSupportedException: Transaction manager [org.springframework.batch.support.transaction.ResourcelessTransactionManager] does not support transaction suspension
While the log helpfully provides this error:
IllegalTransactionStateException: Existing transaction found for transaction marked with propagation 'never'
Okay. I directly told the step to never use a transaction, but somehow somebody creates one anyway. So let's try MANDATORY. Now the test fails with the same error as above, and the log says:
IllegalTransactionStateException: No existing transaction found for transaction marked with propagation 'mandatory'
Somehow, somebody creates a transaction, but not for both runs? Surely SUPPORTS will work then. No, the test fails with the same exception, and the log shows this:
OptimisticLockingFailureException: Attempt to update step execution id=1 with wrong version (2), where current version is 3
I have no idea what is happening. Clearly someone creates transactions outside the step, but I have no idea how to stop them, because I'd rather have no transactions at all. Or at least working transaction management where transactions behave the same when the job is run twice in a row.
I tried Spring Batch 4.2, 4.2.5, 4.3 and 4.3.1.
What did I do wrong? How can I make this work?

The problem is with the default job repository: its transaction handling appears to be buggy. To fix this, replace it with the JDBC-based job repository backed by an in-memory database. Just add this class to the Spring context:
@Configuration
@EnableBatchProcessing
public class InMemoryBatchContextConfigurer extends DefaultBatchConfigurer {

    @Override
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDatabaseType(DatabaseType.H2.getProductName());
        factory.setDataSource(dataSource());
        factory.setTransactionManager(getTransactionManager());
        return factory.getObject();
    }

    public DataSource dataSource() {
        EmbeddedDatabaseBuilder embeddedDatabaseBuilder = new EmbeddedDatabaseBuilder();
        return embeddedDatabaseBuilder
                .addScript("classpath:org/springframework/batch/core/schema-drop-h2.sql")
                .addScript("classpath:org/springframework/batch/core/schema-h2.sql")
                .setType(EmbeddedDatabaseType.H2).build();
    }
}

Related

How to combine multiple listeners (step, read, process, write and skip) in spring batch

The aim of this operation is to track the lines or items that are being read/processed/written in a Spring Batch job with multiple steps.
I have created a listener that implements these interfaces: StepExecutionListener, SkipPolicy, ItemReadListener, ItemProcessListener, ItemWriteListener
@Component
public class GenericListener implements StepExecutionListener, SkipPolicy, ItemReadListener, ItemProcessListener, ItemWriteListener {

    private Log log = LogFactory.getLog(getClass());

    private JobExecution jobExecution;
    private int numeroProcess = 0;
    private int currentReadIndex = 0;
    private int currentProcessIndex = 0;
    private int currentWriteIndex = 0;

    @Override
    public void beforeRead() throws Exception {
        log.info(String.format("[read][line : %s]", currentReadIndex));
        currentReadIndex++;
    }

    @Override
    public void afterRead(Object o) throws Exception {
        log.info("Line correct");
    }

    @Override
    public void onReadError(Exception e) throws Exception {
        jobExecution.stop();
    }

    @Override
    public boolean shouldSkip(Throwable throwable, int i) throws SkipLimitExceededException {
        String err = String.format("Error at line %s | message %s | cause %s | stacktrace %s", numeroProcess, throwable.getMessage(), throwable.getCause().getMessage(), throwable.getCause().getStackTrace());
        log.error(err);
        return true;
    }

    @Override
    public void beforeProcess(Object o) {
        log.debug(String.format("[process:%s][%s][Object:%s]", numeroProcess++, o.getClass(), o.toString()));
        currentProcessIndex++;
    }

    @Override
    public void afterProcess(Object o, Object o2) { }

    @Override
    public void onProcessError(Object o, Exception e) {
        String err = String.format("[ProcessError at %s][Object %s][Exception %s][Trace %s]", currentProcessIndex, o.toString(), e.getMessage(), e.getStackTrace());
        log.error(err);
        jobExecution.stop();
    }

    @Override
    public void beforeWrite(List list) {
        log.info(String.format("[write][chunk number:%s][current chunk size %s]", currentWriteIndex, list != null ? list.size() : 0));
        currentWriteIndex++;
    }

    @Override
    public void afterWrite(List list) { }

    @Override
    public void onWriteError(Exception e, List list) {
        jobExecution.stop();
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
        jobExecution = stepExecution.getJobExecution();
        currentReadIndex = 0;
        currentProcessIndex = 0;
        currentWriteIndex = 0;
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        return null;
    }
}
The job definition (CustomJobListener is a simple class that extends JobExecutionListenerSupport):
public class BatchConfiguration {

    @Autowired
    public JobBuilderFactory jobs;

    @Bean
    public Job job(CustomJobListener listener,
                   @Qualifier("step1") Step step1,
                   @Qualifier("step2") Step step2,
                   @Qualifier("step3") Step step3) {
        return jobs.get("SimpleJobName")
                .incrementer(new RunIdIncrementer())
                .preventRestart()
                .listener(listener)
                .start(step1)
                .next(step2)
                .next(step3)
                .build();
    }
}
The step definition (all three steps have the same definition; only the reader/processor/writer changes):
@Component
public class StepControleFormat {

    @Autowired
    private StepOneReader reader;

    @Autowired
    private StepOneProcessor processor;

    @Autowired
    private StepOneWriter writer;

    @Autowired
    private ConfigAccess configAccess;

    @Autowired
    private GenericListener listener;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Bean
    @JobScope
    @Qualifier("step1")
    public Step stepOne() throws StepException {
        return stepBuilderFactory.get("step1")
                .<StepOneInput, StepOneOutput>chunk(configAccess.getChunkSize())
                .listener((ItemProcessListener<? super StepOneInput, ? super StepOneOutput>) listener)
                .faultTolerant()
                .skipPolicy(listener)
                .reader(reader.read())
                .processor(processor.compose())
                .writer(writer)
                .build();
    }
}
Now the problem is that the beforeStep(StepExecution stepExecution) and afterStep(StepExecution stepExecution) methods are never fired, while all other methods in GenericListener are correctly fired when their respective events occur.
I tried using listener((StepExecutionListener) listener) instead of listener((ItemProcessListener<? super StepOneInput, ? super StepOneOutput>) listener), but that overload returns an AbstractTaskletStepBuilder, and then I can't use reader, processor or writer.
Update: my Spring Boot version is v1.5.9.RELEASE.
I solved it thanks to Michael Minella's hint:
@Bean
@JobScope
@Qualifier("step1")
public Step stepOne() throws StepException {
    SimpleStepBuilder<StepOneInput, StepOneOutput> builder = stepBuilderFactory.get("step1")
            .<StepOneInput, StepOneOutput>chunk(configAccess.getChunkSize())
            // setting up listener for Read/Process/Write
            .listener((ItemProcessListener<? super StepOneInput, ? super StepOneOutput>) listener)
            .faultTolerant()
            // setting up listener for skipPolicy
            .skipPolicy(listener)
            .reader(reader.read())
            .processor(processor.compose())
            .writer(writer);
    // for the step execution listener
    builder.listener((StepExecutionListener) listener);
    return builder.build();
}
The last listener method called, public B listener(StepExecutionListener listener) from StepBuilderHelper<B extends StepBuilderHelper<B>>, returns a StepBuilderHelper, which doesn't define a build() method. So the solution was to split up the step build definition.
What I don't understand is: although the writer method returns a SimpleStepBuilder<I, O>, which defines public SimpleStepBuilder listener(Object listener), the compiler/IDE (IntelliJ IDEA) calls public B listener(StepExecutionListener listener) from StepBuilderHelper<B extends StepBuilderHelper<B>>. If anyone could explain this behaviour, I'd appreciate it. (The reason is ordinary Java overload resolution: since the argument's static type is StepExecutionListener, both overloads are applicable, and the compiler picks the most specific parameter type, StepExecutionListener, over Object.)
Moreover, finding a way to hook up all listeners in one call using public SimpleStepBuilder listener(Object listener) from SimpleStepBuilder would be very interesting.
Additional Step Listeners can be added as follows.
@Bean(name = STEP1)
public Step rpcbcStep() {
    SimpleStepBuilder<Employee, Employee> builder = stepBuilderFactory.get(STEP1).<Employee, Employee>chunk(100)
            .reader(step1BackgroundReader())
            .processor(processor())
            .writer(writer());
    builder.listener(step1BackgroundStepListener);
    builder.listener(step1BackgroundStepListener2);
    // add any other listeners needed
    return builder.build();
}

Transaction is not rolled back in JOOQ

I have code that is very similar to this one:
dslContext.transaction(new TransactionalRunnable()
{
    @Override
    public void run(Configuration arg0) throws Exception
    {
        dao1.insert(object1);
        // Object 1 is inserted in the database
        // despite the exception that is being thrown
        if (true)
            throw new RuntimeException();
        dao2.insert(object2);
    }
});
This is the code I'm using to create the DSLContext and the DAOs that have been generated with jOOQ:
ComboPooledDataSource comboPooledDataSource = new ComboPooledDataSource();
comboPooledDataSource.setDriverClass(org.postgresql.Driver.class.getName());
comboPooledDataSource.setJdbcUrl("jdbc:postgresql://localhost:5432/database?searchpath=schema");
comboPooledDataSource.setUser("user");
comboPooledDataSource.setPassword("password");
comboPooledDataSource.setMinPoolSize(5);
comboPooledDataSource.setAcquireIncrement(5);
comboPooledDataSource.setMaxPoolSize(25);

Configuration configuration = new DefaultConfiguration().set(comboPooledDataSource).set(SQLDialect.POSTGRES);
DSLContext dslContext = DSL.using(configuration);
Dao1 dao1 = new Dao1(configuration);
Dao2 dao2 = new Dao2(configuration);
Why am I getting this behavior?
Your DAOs are configured with a different configuration than your transaction. This means that each DAO runs its code in a new auto-committed transaction, even if you put that logic inside of a TransactionalRunnable.
This would work:
dslContext.transaction(new TransactionalRunnable()
{
    @Override
    public void run(Configuration arg0) throws Exception
    {
        new Dao1(arg0).insert(object1);
        if (true)
            throw new RuntimeException();
        new Dao2(arg0).insert(object2);
    }
});
Note that DSLContext.transaction(TransactionalRunnable) does not modify the dslContext and its enclosed Configuration. This means that if your data source does not work like, e.g., a Java EE or Spring TransactionAwareDataSourceProxy, then you must use the Configuration argument of your run() method to run further queries, either by wrapping it with DSL.using(configuration) or by passing it to your DAOs.
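For illustration, a minimal sketch of the DSL.using(configuration) variant (the table and values are placeholders, not from the original post):

dslContext.transaction(new TransactionalRunnable()
{
    @Override
    public void run(Configuration arg0) throws Exception
    {
        // derive a DSLContext from the transacted Configuration;
        // anything run through it participates in the surrounding transaction
        DSLContext txContext = DSL.using(arg0);
        txContext.execute("insert into some_table (id) values (1)");
    }
});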
A much simpler option would be to use a data source that is transaction aware (i.e. it binds a transaction to a thread), such that the same thread will always get the same transacted JDBC Connection from the datasource.
I'm letting Spring handle the transactions with jOOQ. Here is how:
This is the Spring configuration class:
@Configuration
public class SpringConfiguration
{
    @Bean
    public DataSource dataSource() throws PropertyVetoException
    {
        ComboPooledDataSource comboPooledDataSource = new ComboPooledDataSource();
        comboPooledDataSource.setDriverClass(org.postgresql.Driver.class.getName());
        comboPooledDataSource.setJdbcUrl("jdbc:postgresql://localhost:5432/database?searchpath=schema");
        comboPooledDataSource.setUser("databaseuser");
        comboPooledDataSource.setPassword("password");
        comboPooledDataSource.setMinPoolSize(5);
        comboPooledDataSource.setAcquireIncrement(5);
        comboPooledDataSource.setMaxPoolSize(25);
        return comboPooledDataSource;
    }

    @Bean
    public DataSourceTransactionManager transactionManager() throws PropertyVetoException
    {
        return new DataSourceTransactionManager(dataSource());
    }

    @Bean
    public TransactionAwareDataSourceProxy transactionAwareDataSource() throws PropertyVetoException
    {
        return new TransactionAwareDataSourceProxy(dataSource());
    }

    @Bean
    public DataSourceConnectionProvider connectionProvider() throws PropertyVetoException
    {
        return new DataSourceConnectionProvider(transactionAwareDataSource());
    }

    @Bean
    public org.jooq.Configuration configuration() throws PropertyVetoException
    {
        return new DefaultConfiguration().set(connectionProvider()).set(transactionProvider()).set(SQLDialect.POSTGRES);
    }

    @Bean
    public TransactionProvider transactionProvider() throws PropertyVetoException
    {
        return new SpringTransactionProvider(transactionManager());
    }

    @Bean
    public DSLContext dslContext() throws PropertyVetoException
    {
        return DSL.using(configuration());
    }
}
And this is the SpringTransactionProvider:
public class SpringTransactionProvider implements TransactionProvider
{
DataSourceTransactionManager transactionManager;
public SpringTransactionProvider(DataSourceTransactionManager transactionManager)
{
this.transactionManager = transactionManager;
}
#Override
public void begin(TransactionContext ctx)
{
TransactionStatus tx = transactionManager.getTransaction(new DefaultTransactionDefinition(
TransactionDefinition.PROPAGATION_REQUIRED));
ctx.transaction(new SpringTransaction(tx));
}
#Override
public void commit(TransactionContext ctx)
{
transactionManager.commit(((SpringTransaction) ctx.transaction()).tx);
}
#Override
public void rollback(TransactionContext ctx)
{
transactionManager.rollback(((SpringTransaction) ctx.transaction()).tx);
}
class SpringTransaction implements Transaction
{
final TransactionStatus tx;
SpringTransaction(TransactionStatus tx)
{
this.tx = tx;
}
}
}
And finally, to get the DSLContext:
ApplicationContext applicationContext = new AnnotationConfigApplicationContext(SpringConfiguration.class);
DSLContext dslContext = applicationContext.getBean(DSLContext.class);
You will need these JARs on your classpath to get this to work:
spring-tx.jar, spring-aop.jar, spring-expression.jar, spring-core.jar, spring-beans.jar, spring-jdbc.jar and spring-context.jar :)
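With this in place, the rollback behaviour from the original question can be sanity-checked along these lines (a sketch, reusing Dao1 and object1 from the question):

try
{
    dslContext.transaction(new TransactionalRunnable()
    {
        @Override
        public void run(Configuration arg0) throws Exception
        {
            // the DAO joins the thread-bound Spring transaction, because the
            // Configuration is backed by the TransactionAwareDataSourceProxy
            new Dao1(arg0).insert(object1);
            throw new RuntimeException("force rollback");
        }
    });
}
catch (RuntimeException expected)
{
    // the insert of object1 was rolled back, so it is not in the database
}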

Spring Batch job Not ending

I have just started to use Spring Batch and got stuck on this problem. My job never ends; it's in an infinite loop. Below is the code:
@SpringBootApplication
@EnableBatchProcessing
public class Main implements CommandLineRunner {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private JobLauncher jobLauncher;

    public static void main(String[] args) {
        SpringApplication.run(Main.class, args);
    }

    @Override
    public void run(String... args) throws Exception {
        jobLauncher.run(flattenPersonJob(), new JobParameters());
        System.out.println("Done");
    }

    @Bean
    public ItemReader itemReader() {
        return new PersonReader();
    }

    @Bean
    public ItemProcessor itemProcessor() {
        return new PersonProcessor();
    }

    @Bean
    public ItemWriter itemWriter() {
        return new PersonWriter();
    }

    @Bean
    public Step flattenPersonStep() {
        return stepBuilderFactory.get("flattenPersonStep").
                chunk(1).
                reader(itemReader()).
                processor(itemProcessor()).
                writer(itemWriter()).
                build();
    }

    @Bean
    public JobListener jobListener() {
        return new JobListener();
    }

    @Bean
    public Job flattenPersonJob() {
        return jobBuilderFactory.get("flattenPersonJob").
                incrementer(new RunIdIncrementer()).
                listener(jobListener()).
                flow(flattenPersonStep()).
                end().
                build();
    }
}
This is my reader class:
public class PersonReader implements ItemReader<List<Person>> {

    @Override
    public List<Person> read() throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        System.out.println("This is the reader");
        List<Person> personList = new ArrayList<>();
        personList.add(new Person("", "", ""));
        Thread.sleep(5000);
        return personList;
    }
}
This is my writer class:
public class PersonWriter implements ItemWriter<List<String>> {

    @Override
    public void write(List<? extends List<String>> items) throws Exception {
        System.out.println("This is the writer");
        //Thread.sleep(5000);
        items.forEach(System.out::println);
    }
}
This is my processor class:
public class PersonProcessor implements ItemProcessor<List<Person>, List<String>> {

    @Override
    public List<String> process(List<Person> item) throws Exception {
        System.out.println("This is the processor");
        //Thread.sleep(5000);
        return item.stream().map(n -> n.getName()).collect(Collectors.toList());
    }
}
Is there any configuration that I am missing here? Or is there something wrong with my code?
I have googled for some time now but could not find anything constructive.
Any help here is much appreciated.
Thanks,
Amar
Your reader never returns null. The contract for the ItemReader within Spring Batch is to read until the reader returns null (indicating that the input has been exhausted). Since you never return null from your ItemReader, your job will read forever.
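A minimal sketch of a fixed reader, assuming the list should be handed out exactly once per step execution (the consumed flag is an addition, not part of the original post):

public class PersonReader implements ItemReader<List<Person>> {

    private boolean consumed = false;

    @Override
    public List<Person> read() throws Exception {
        if (consumed) {
            return null; // null tells Spring Batch the input is exhausted
        }
        consumed = true;
        List<Person> personList = new ArrayList<>();
        personList.add(new Person("", "", ""));
        return personList;
    }
}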

Spring Batch how to process list of data before write in a Step

I am trying to read client data from a database and write processed data to a flat file.
But I need to process the whole result of the ItemReader before writing the data.
For example, I am reading Client rows from the database:
public class Client {
    private String id;
    private String subscriptionCode;
    private Boolean activated;
}
But I want to count and write how many users are activated, grouped by subscriptionCode:
public class Subscription {
    private String subscriptionCode;
    private Integer activatedUserCount;
}
I don't know how to perform that using ItemReader/ItemProcessor/ItemWriter; can you help me?
BatchConfiguration:
@CommonsLog
@Configuration
@EnableBatchProcessing
@EnableAutoConfiguration
public class BatchConfiguration {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1")
                .<Client, Client>chunk(1000)
                .reader(new ListItemReader<Client>(new ArrayList<Client>() { // Just for test
                    {
                        add(Client.builder().id("1").subscriptionCode("AA").activated(true).build());
                        add(Client.builder().id("2").subscriptionCode("BB").activated(true).build());
                        add(Client.builder().id("3").subscriptionCode("AA").activated(false).build());
                        add(Client.builder().id("4").subscriptionCode("AA").activated(true).build());
                    }
                }))
                .processor(new ItemProcessor<Client, Client>() {
                    public Client process(Client item) throws Exception {
                        log.info(item);
                        return item;
                    }
                })
                .writer(new ItemWriter<Client>() {
                    public void write(List<? extends Client> items) throws Exception {
                        // Only here can I use the List of Client
                        // How can I process this list to fill Subscription objects before writing?
                    }
                })
                .build();
    }

    @Bean
    public Job job1(Step step1) throws Exception {
        return jobBuilderFactory.get("job1").incrementer(new RunIdIncrementer()).start(step1).build();
    }
}
Main application:
public class App {
    public static void main(String[] args) throws JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException, JobParametersInvalidException {
        System.exit(SpringApplication.exit(SpringApplication.run(BatchConfiguration.class, args)));
    }
}
If I understand your comments correctly, you need to build a summary of activated accounts, right?
You can create a Subscription for every Client you are processing, and with an ItemWriteListener.afterWrite write the Subscription items created above to the database.
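A rough sketch of that idea (the listener class is hypothetical; it assumes the Lombok-style accessors on Client from the question and the AtomicInteger-based Subscription used in the solution below):

public class SubscriptionSummaryListener implements ItemWriteListener<Client> {

    private final Map<String, Subscription> summary = new HashMap<>();

    public void beforeWrite(List<? extends Client> items) { }

    public void afterWrite(List<? extends Client> items) {
        // after each chunk of clients is written, count activated users per code
        for (Client client : items) {
            if (Boolean.TRUE.equals(client.getActivated())) {
                summary.computeIfAbsent(client.getSubscriptionCode(),
                        code -> Subscription.builder()
                                .subscriptionCode(code)
                                .activatedUserCount(new AtomicInteger(0))
                                .build())
                        .getActivatedUserCount().incrementAndGet();
            }
        }
    }

    public void onWriteError(Exception exception, List<? extends Client> items) { }

    // summary.values() holds one Subscription per code at the end of the step,
    // ready to be written to the database, e.g. from an @AfterStep method
}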
I found a solution based on ItemProcessor:
@Bean
public Step step1() {
    return stepBuilderFactory.get("step1")
            .<Client, Subscription>chunk(1000)
            .reader(new ListItemReader<Client>(new ArrayList<Client>() {
                {
                    add(Client.builder().id("1").subscriptionCode("AA").activated(true).build());
                    add(Client.builder().id("2").subscriptionCode("BB").activated(true).build());
                    add(Client.builder().id("3").subscriptionCode("AA").activated(false).build());
                    add(Client.builder().id("4").subscriptionCode("AA").activated(true).build());
                }
            }))
            .processor(new ItemProcessor<Client, Subscription>() {

                private List<Subscription> subscriptions;

                public Subscription process(Client item) throws Exception {
                    for (Subscription s : subscriptions) { // try to retrieve an existing element
                        if (s.getSubscriptionCode().equals(item.getSubscriptionCode())) { // element found
                            if (item.getActivated()) {
                                s.getActivatedUserCount().incrementAndGet(); // increment user count
                                log.info("Incremented subscription : " + s);
                            }
                            return null; // existing element -> skip
                        }
                    }
                    // Create new Subscription
                    Subscription subscription = Subscription.builder().subscriptionCode(item.getSubscriptionCode()).activatedUserCount(new AtomicInteger(1)).build();
                    subscriptions.add(subscription);
                    log.info("New subscription : " + subscription);
                    return subscription;
                }

                @BeforeStep
                public void initList() {
                    subscriptions = Collections.synchronizedList(new ArrayList<Subscription>());
                }

                @AfterStep
                public void clearList() {
                    subscriptions.clear();
                }
            })
            .writer(new ItemWriter<Subscription>() {
                public void write(List<? extends Subscription> items) throws Exception {
                    log.info(items);
                    // do write stuff
                }
            })
            .build();
}
But I have to maintain a second Subscription list inside the ItemProcessor (I don't know whether that is thread safe and efficient). What do you think about this solution?
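For what it's worth, a map-based variant of the processor (a sketch, not from the original thread) avoids the linear scan and only creates a Subscription once an activated client for that code shows up:

.processor(new ItemProcessor<Client, Subscription>() {

    private final Map<String, Subscription> byCode = new ConcurrentHashMap<>();

    public Subscription process(Client item) throws Exception {
        if (!Boolean.TRUE.equals(item.getActivated())) {
            return null; // deactivated clients don't count
        }
        Subscription existing = byCode.get(item.getSubscriptionCode());
        if (existing != null) {
            existing.getActivatedUserCount().incrementAndGet();
            return null; // this code was already emitted once
        }
        // first activated client for this code: emit a new Subscription
        // (note: a multi-threaded step would need putIfAbsent-style handling here)
        Subscription created = Subscription.builder()
                .subscriptionCode(item.getSubscriptionCode())
                .activatedUserCount(new AtomicInteger(1))
                .build();
        byCode.put(item.getSubscriptionCode(), created);
        return created;
    }
})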

Spring Batch @BeforeStep does not work with @StepScope

I'm using Spring Batch version 2.2.4.RELEASE
I tried to write a simple example with stateful ItemReader, ItemProcessor and ItemWriter beans.
public class StatefulItemReader implements ItemReader<String> {

    private List<String> list;

    @BeforeStep
    public void initializeState(StepExecution stepExecution) {
        this.list = new ArrayList<>();
    }

    @AfterStep
    public ExitStatus exploitState(StepExecution stepExecution) {
        System.out.println("******************************");
        System.out.println(" READING RESULTS : " + list.size());
        return stepExecution.getExitStatus();
    }

    @Override
    public String read() throws Exception {
        this.list.add("some stateful reading information");
        if (list.size() < 10) {
            return "value " + list.size();
        }
        return null;
    }
}
In my integration test, I'm declaring my beans in an inner static Java config class like the one below:
@ContextConfiguration
@RunWith(SpringJUnit4ClassRunner.class)
public class SingletonScopedTest {

    @Configuration
    @EnableBatchProcessing
    static class TestConfig {

        @Autowired
        private JobBuilderFactory jobBuilder;

        @Autowired
        private StepBuilderFactory stepBuilder;

        @Bean
        JobLauncherTestUtils jobLauncherTestUtils() {
            return new JobLauncherTestUtils();
        }

        @Bean
        public DataSource dataSource() {
            EmbeddedDatabaseBuilder embeddedDatabaseBuilder = new EmbeddedDatabaseBuilder();
            return embeddedDatabaseBuilder.addScript("classpath:org/springframework/batch/core/schema-drop-hsqldb.sql")
                    .addScript("classpath:org/springframework/batch/core/schema-hsqldb.sql")
                    .setType(EmbeddedDatabaseType.HSQL)
                    .build();
        }

        @Bean
        public Job jobUnderTest() {
            return jobBuilder.get("job-under-test")
                    .start(stepUnderTest())
                    .build();
        }

        @Bean
        public Step stepUnderTest() {
            return stepBuilder.get("step-under-test")
                    .<String, String>chunk(1)
                    .reader(reader())
                    .processor(processor())
                    .writer(writer())
                    .build();
        }

        @Bean
        public ItemReader<String> reader() {
            return new StatefulItemReader();
        }

        @Bean
        public ItemProcessor<String, String> processor() {
            return new StatefulItemProcessor();
        }

        @Bean
        public ItemWriter<String> writer() {
            return new StatefulItemWriter();
        }
    }

    @Autowired
    JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void testStepExecution() {
        JobExecution jobExecution = jobLauncherTestUtils.launchStep("step-under-test");
        assertEquals(ExitStatus.COMPLETED, jobExecution.getExitStatus());
    }
}
This test passes.
But as soon as I define my StatefulItemReader as a step-scoped bean (which is better for a stateful reader), the "before step" code is no longer executed.
...
@Bean
@StepScope
public ItemReader<String> reader() {
    return new StatefulItemReader();
}
...
And I notice the same issue with my processor and writer beans.
What's wrong with my code? Is it related to this resolved issue: https://jira.springsource.org/browse/BATCH-1230
My whole Maven project with several JUnit tests can be found on GitHub: https://github.com/galak75/spring-batch-step-scope
Thank you in advance for your answers.
When you configure a bean as follows:
@Bean
@StepScope
public MyInterface myBean() {
    return new MyInterfaceImpl();
}
You are telling Spring to use the proxy mode ScopedProxyMode.TARGET_CLASS. However, by returning MyInterface instead of MyInterfaceImpl, the proxy only has visibility into the methods on MyInterface. This prevents Spring Batch from finding the methods on MyInterfaceImpl that have been annotated with listener annotations like @BeforeStep. The correct way to configure this is to return MyInterfaceImpl from your configuration method, as below:
@Bean
@StepScope
public MyInterfaceImpl myBean() {
    return new MyInterfaceImpl();
}
We have added a warning log message on startup that points out that, as we look for the annotated listener methods, if the object is proxied and the target is an interface, we won't be able to find annotated methods on the implementing class.
As suggested by pojo-guy, the solution is to implement StepExecutionListener and override its beforeStep method to set the stepExecution:

@Override
public void beforeStep(StepExecution stepExecution) {
    this.stepExecution = stepExecution;
}
