Spring Batch Transaction issue - java

I am implementing an application using Spring Batch, following the ItemReader, ItemProcessor, ItemWriter approach. I have created a Partitioner component which partitions the data. Through the ItemReader I read the data and then process it.
After processing, I write the data back to the DB. Once the job finishes, I observe that some data is missing in the DB. Sometimes the execution of one partition fails; sometimes the job executes successfully.
Sometimes I get exceptions. It is random:
"java.lang.RuntimeException: java.lang.reflect.UndeclaredThrowableException
is mapped to a primary key column in the database. Updates are not allowed.
org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: java.sql.SQLException: Closed Resultset: getObject
Is there any thread synchronization or transaction handling we need to maintain?
E.g.
Total no of records - 1000
chunk - 100
Partition1 - 500
Partition2 - 500
This scenario works fine when neither partitioning nor a multi-threaded step is used.
Sample code: this code sometimes works and commits all the data, and sometimes fails. Sometimes I observe that some data is not committed to the DB, even though the commit count in the BATCH_STEP_EXECUTION table is correct. It is random.
@Configuration
@EnableBatchProcessing
@EnableTransactionManagement
@EnableAspectJAutoProxy(proxyTargetClass = true)
public class BatchConfig {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    public SimpleJobLauncher jobLauncher(JobRepository jobRepository) {
        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        return launcher;
    }

    @Bean(name = "customerJob")
    public Job prepareBatch1() {
        return jobBuilderFactory.get("customerJob")
                .incrementer(new RunIdIncrementer())
                .start(masterStep())
                .listener(listener())
                .build();
    }

    @Bean
    public Step masterStep() {
        return stepBuilderFactory.get("masterStep")
                .partitioner(slaveStep().getName(), partitioner())
                .partitionHandler(partitionHandler())
                .build();
    }

    @Bean
    public BatchListener listener() {
        return new BatchListener();
    }

    @Bean
    @JobScope
    public BatchPartitioner partitioner() {
        return new BatchPartitioner();
    }

    @Bean
    @StepScope
    public PartitionHandler partitionHandler() {
        TaskExecutorPartitionHandler taskExecutorPartitionHandler = new TaskExecutorPartitionHandler();
        taskExecutorPartitionHandler.setGridSize(2);
        taskExecutorPartitionHandler.setTaskExecutor(taskExecutor());
        taskExecutorPartitionHandler.setStep(slaveStep());
        try {
            taskExecutorPartitionHandler.afterPropertiesSet();
        } catch (Exception e) {
            // exception swallowed here
        }
        return taskExecutorPartitionHandler;
    }

    @Bean
    @StepScope
    public Step slaveStep() {
        return stepBuilderFactory.get("slaveStep").<Customer, CustomerWrapperDTO>chunk(100)
                .reader(getReader())
                .processor(processor())
                .writer(writer())
                .build();
    }

    @Bean
    @StepScope
    public BatchWriter writer() {
        return new BatchWriter();
    }

    @Bean
    @StepScope
    public BatchProcessor processor() {
        return new BatchProcessor();
    }

    @Bean
    @StepScope
    public BatchReader getReader() {
        return new BatchReader();
    }

    @Bean
    public TaskExecutor taskExecutor() {
        ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
        taskExecutor.setMaxPoolSize(Runtime.getRuntime().availableProcessors());
        taskExecutor.setCorePoolSize(Runtime.getRuntime().availableProcessors());
        taskExecutor.afterPropertiesSet();
        return taskExecutor;
    }
}
class CustomerWrapperDTO {
    private Address address;
    private Customer customer;
    // getters and setters for address and customer
}

Entity:

class Customer {
    String processStatus; // "U" : unprocessed, "C" : completed, "F" : failed
}
public class BatchListener implements JobExecutionListener {

    @Autowired
    private CustomerRepo customerRepo;

    public BatchListener() {
    }

    @Override
    public void beforeJob(JobExecution jobExecution) {
        List<Customer> customers;
        try {
            customers = customerRepo.getAllUnprocessedCustomer();
        } catch (Exception e) {
            throw new CustomerException("failed in BatchListener", e);
        }
        jobExecution.getExecutionContext().put("customers", customers);
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
    }
}
public class BatchPartitioner implements Partitioner {

    @Value("#{jobExecutionContext[customers]}")
    private List<Customer> customers;

    @Override
    public Map<String, ExecutionContext> partition(int gridSize) {
        Map<String, ExecutionContext> result = new HashMap<>();
        int size = customers.size() / gridSize;
        List<List<Customer>> lists = IntStream.range(0, customers.size()).boxed()
                .collect(Collectors.groupingBy(i -> i / size,
                        Collectors.mapping(customers::get, Collectors.toList())))
                .values().stream().collect(Collectors.toList());
        for (int i = 0; i < gridSize; i++) {
            ExecutionContext executionContext = new ExecutionContext();
            executionContext.putString("name", "Thread_" + i);
            executionContext.put("customers", lists.get(i));
            result.put("partition" + i, executionContext);
        }
        return result;
    }
}
@Component
@StepScope
class BatchReader implements ItemReader<Customer> {

    private int index;

    @Value("#{stepExecutionContext[customers]}")
    private List<Customer> customers;

    @Override
    public Customer read() {
        Customer customer = null;
        if (index < customers.size()) {
            customer = customers.get(index);
            index++;
        } else {
            index = 0;
        }
        return customer;
    }
}
@Component
@StepScope
public class BatchProcessor implements ItemProcessor<Customer, CustomerWrapperDTO> {

    public BatchProcessor() {
    }

    @Override
    public CustomerWrapperDTO process(Customer item) {
        CustomerWrapperDTO customerWrapper = new CustomerWrapperDTO();
        try {
            // logic to get the address: API call or some business logic (elided)
            Address address = null;
            customerWrapper.setAddress(address);
            item.setProcessStatus("C"); // completed
        } catch (Exception e) {
            item.setProcessStatus("F"); // failed
        }
        customerWrapper.setCustomer(item);
        return customerWrapper;
    }
}
@Component
@StepScope
public class BatchWriter implements ItemWriter<CustomerWrapperDTO> {

    @Autowired
    private CustomerRepo customerRepo;

    @Autowired
    private AddressRepo addressRepo;

    public BatchWriter() {
    }

    @Override
    public void write(List<? extends CustomerWrapperDTO> items) {
        items.forEach(item -> {
            try {
                if (item.getCustomer() != null) {
                    customerRepo.merge(item.getCustomer());
                }
                if (item.getAddress() != null) {
                    addressRepo.save(item.getAddress());
                }
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        });
    }
}

Spring Batch processes items in chunks. If a chunk fails (meaning at least one item fails to process), the transaction for that chunk is rolled back.

The issue is with your item reader:
- The implementation is not thread-safe, yet it is used in a multi-threaded step. You should synchronize the read method or wrap your reader in a SynchronizedItemStreamReader (see the sketch below).
- The execution context is not safe to share between threads, and you seem to be sharing items between threads through the execution context. By the way, storing items in the execution context is not recommended even for single-threaded cases, because the context will be persisted (possibly several times) during the job execution.
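A minimal sketch of the synchronized-reader fix, assuming BatchReader is reworked to implement ItemStreamReader<Customer> (SynchronizedItemStreamReader only accepts an ItemStreamReader delegate):

@Bean
@StepScope
public SynchronizedItemStreamReader<Customer> synchronizedReader() {
    // Wrap the stateful delegate so concurrent threads cannot interleave
    // calls to read() and corrupt its internal index.
    SynchronizedItemStreamReader<Customer> reader = new SynchronizedItemStreamReader<>();
    reader.setDelegate(getReader()); // getReader() must now return an ItemStreamReader<Customer>
    return reader;
}

And instead of putting Customer lists into the execution contexts, each partition can carry only the boundaries of its slice, letting each worker's reader load its own data. The minId/maxId keys below are illustrative, not from the question:

public Map<String, ExecutionContext> partition(int gridSize) {
    Map<String, ExecutionContext> result = new HashMap<>();
    long min = 1, max = 1000; // in practice, query min/max ids from the table
    long targetSize = (max - min + 1) / gridSize;
    for (int i = 0; i < gridSize; i++) {
        ExecutionContext context = new ExecutionContext();
        context.putLong("minId", min + i * targetSize);
        context.putLong("maxId", (i == gridSize - 1) ? max : min + (i + 1) * targetSize - 1);
        result.put("partition" + i, context);
    }
    return result;
}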

Related

How to write Item writer to save DB

I am using Spring Batch and I am able to create a custom ItemReader. Now I need to save to the DB, but I am not getting hold of the data in the ItemWriter.
Here is my code:
public class BatchConfig {

    @Autowired
    MyRepo myrepo;

    @Bean
    public ItemReader<MyDTO> itemReader() {
        return new myItemReader();
    }

    @Bean
    public ItemWriter<MyDTO> itemWriter() {
        return new LoggingItemWriter();
    }

    @Bean
    public Step exampleJobStep(ItemReader<MyDTO> reader,
                               ItemWriter<MyDTO> writer,
                               StepBuilderFactory stepBuilderFactory) {
        return stepBuilderFactory.get("exampleJobStep")
                .<MyDTO, MyDTO>chunk(1)
                .reader(reader)
                .writer(writer)
                .build();
    }

    @Bean
    public Job exampleJob(Step exampleJobStep,
                          JobBuilderFactory jobBuilderFactory) {
        return jobBuilderFactory.get("exampleJob")
                .incrementer(new RunIdIncrementer())
                .flow(exampleJobStep)
                .end()
                .build();
    }
}
Now I will show you the ItemReader and ItemWriter code.
ItemReader:
public class myItemReader implements ItemReader<MyDTO> {

    private int index;
    private List<MyDTO> myData = new ArrayList<>();

    InMemoryStudentReader() {
        initialize();
    }

    private void initialize() {
        for (int i = 0; i < 5; i++) {
            MyDTO newObj = new MyDTO();
            newObj.setId(id);
            newObj.setName(firstName);
            // some random setting values based on index...
            myData.add(newObj);
            index = 0;
        }
    }

    @Override
    public MyDTO read() throws Exception {
        MyDTO nextDTO = null;
        if (index < myData.size()) {
            nextDTO = myData.get(index);
            index++;
        } else {
            index = 0;
        }
        return nextDTO;
    }
}
ItemWriter:
public class LoggingItemWriter implements ItemWriter<AreaBoundaryDTO> {

    private static final Logger LOGGER = LoggerFactory.getLogger(LoggingItemWriter.class);

    public LoggingItemWriter() {
        System.out.println("test");
    }

    @Override
    public void write(List<? extends AreaBoundaryDTO> list) throws Exception {
        for (MyDTO item : list) {
            System.out.println("test" + list);
        }
    }
}
What I want here is to save the data via myrepo.save(), where the data is the list I built up in myData inside the ItemReader.
Can anybody help me save the data I am getting in the ItemReader to the database through a JPA repository?
I was following this example.
The code you shared is inconsistent: for example, the reader class is named myItemReader but its constructor is named InMemoryStudentReader, and the writer is declared as ItemWriter<AreaBoundaryDTO> while its write method iterates the items as MyDTO, so it will not compile.
Spring Batch provides the RepositoryItemWriter, which uses a Spring Data repository to persist items in a database. If needed, you can extend this built-in writer to customize how data is persisted. A sketch follows.
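A minimal sketch of wiring the built-in writer, assuming MyRepo is a Spring Data JPA repository for MyDTO (the bean wiring and method name here are illustrative):

@Bean
public RepositoryItemWriter<MyDTO> repositoryItemWriter(MyRepo myrepo) {
    RepositoryItemWriter<MyDTO> writer = new RepositoryItemWriter<>();
    writer.setRepository(myrepo);  // the JPA repository to delegate to
    writer.setMethodName("save");  // repository method invoked for each item
    return writer;
}

You would then pass this writer to exampleJobStep instead of the LoggingItemWriter.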

TransactionAttribute not working for simple step

(Note this issue might be connected to this question, but it has a much smaller scope.)
I have the simplest of jobs defined like this:
@Configuration
@EnableBatchProcessing
public class FileTransformerConfiguration {

    private JobBuilderFactory jobBuilderFactory;
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    public FileTransformerConfiguration(JobBuilderFactory jobBuilderFactory,
                                        StepBuilderFactory stepBuilderFactory) {
        this.jobBuilderFactory = jobBuilderFactory;
        this.stepBuilderFactory = stepBuilderFactory;
    }

    @Bean
    public Job transformJob() {
        return this.jobBuilderFactory.get("transformJob").incrementer(new RunIdIncrementer())
                .flow(transformStep()).end().build();
    }

    @Bean
    public Step transformStep() {
        return this.stepBuilderFactory.get("transformStep")
                .<String, String>chunk(1).reader(new ItemReader())
                .processor(processor())
                .writer(new ItemWriter()).build();
    }

    @Bean
    public ItemProcessor<String, String> processor() {
        return item -> {
            System.out.println("Converting item (" + item + ")...");
            return item;
        };
    }
}
public class ItemReader implements ItemStreamReader<String> {

    private Iterator<String> it;

    @Override
    public void open(ExecutionContext executionContext) throws ItemStreamException {
        this.it = Arrays.asList("A", "B", "C", "D", "E").iterator();
    }

    @Override
    public String read() throws Exception {
        return this.it.hasNext() ? this.it.next() : null;
    }

    @Override
    public void close() throws ItemStreamException { }

    @Override
    public void update(ExecutionContext executionContext) throws ItemStreamException { }
}
@JobScope
public class ItemWriter implements ItemStreamWriter<String> {

    @Override
    public void open(ExecutionContext executionContext) throws ItemStreamException { }

    @Override
    public void write(List<? extends String> items) throws Exception {
        items.forEach(item -> System.out.println("Writing item: " + item));
    }

    @Override
    public void update(ExecutionContext executionContext) throws ItemStreamException { }

    @Override
    public void close() throws ItemStreamException { }
}
There is no fancy logic, just strings being moved through the pipeline.
The code is called like this:
@SpringBootApplication
public class TestCmpsApplication {
}

@SpringBootTest(classes = {TestCmpsApplication.class})
public class FileTransformerImplIT {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job transformJob;

    @Test
    void test1() throws Exception {
        String id = UUID.randomUUID().toString();
        JobParametersBuilder jobParameters = new JobParametersBuilder();
        jobParameters.addLong("PARAM_START_TIME", System.currentTimeMillis());
        jobParameters.addString("PARAM_MAPPING_RULE_DEFINITION_ID", id, true);
        this.jobLauncher.run(this.transformJob, jobParameters.toJobParameters());
    }

    @Test
    void test2() throws Exception {
        String id = UUID.randomUUID().toString();
        JobParametersBuilder jobParameters = new JobParametersBuilder();
        jobParameters.addLong("PARAM_START_TIME", System.currentTimeMillis());
        jobParameters.addString("PARAM_MAPPING_RULE_DEFINITION_ID", id, true);
        this.jobLauncher.run(this.transformJob, jobParameters.toJobParameters());
    }
}
(Note there need to be two tests, even though they are identical; the first one always works.)
So this works fine. However, once I add this:
@Bean
public Step transformStep() {
    return this.stepBuilderFactory.get("transformStep")
            .<String, String>chunk(1).reader(new ItemReader())
            .processor(processor())
            .writer(new ItemWriter())
            .transactionAttribute(transactionAttribute()).build();
}

private TransactionAttribute transactionAttribute() {
    DefaultTransactionAttribute attribute = new DefaultTransactionAttribute();
    attribute.setPropagationBehavior(Propagation.NEVER.value());
    return attribute;
}
Now the second test fails. The test itself says:
TransactionSuspensionNotSupportedException: Transaction manager [org.springframework.batch.support.transaction.ResourcelessTransactionManager] does not support transaction suspension
While the log helpfully provides this error:
IllegalTransactionStateException: Existing transaction found for transaction marked with propagation 'never'
Okay. I directly told the job to never use a transaction, but somehow somebody creates one anyway. So let's try MANDATORY. Now the test fails with the same error as above, and the log says:
IllegalTransactionStateException: No existing transaction found for transaction marked with propagation 'mandatory'
Somehow, somebody creates a transaction, but not for both jobs? Surely SUPPORTS will work then. No, the test fails with the same exception, and the log has this:
OptimisticLockingFailureException: Attempt to update step execution id=1 with wrong version (2), where current version is 3
I have no idea what is happening. Clearly someone creates transactions outside the step, but I have no idea how to stop them, because I'd rather have no transactions at all, or at least a working transaction management where transactions behave the same when called twice in a row.
I tried Spring Batch 4.2, 4.2.5, 4.3 and 4.3.1.
What did I do wrong? How can I make this work?
The problem is with the default job repository; its transaction handling seems to be buggy. To fix this, replace it with the JDBC job repository backed by an in-memory database. Just add this class to the Spring context:
@Configuration
@EnableBatchProcessing
public class InMemoryBatchContextConfigurer extends DefaultBatchConfigurer {

    @Override
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDatabaseType(DatabaseType.H2.getProductName());
        factory.setDataSource(dataSource());
        factory.setTransactionManager(getTransactionManager());
        factory.afterPropertiesSet(); // required when using the factory bean programmatically
        return factory.getObject();
    }

    public DataSource dataSource() {
        EmbeddedDatabaseBuilder embeddedDatabaseBuilder = new EmbeddedDatabaseBuilder();
        return embeddedDatabaseBuilder
                .addScript("classpath:org/springframework/batch/core/schema-drop-h2.sql")
                .addScript("classpath:org/springframework/batch/core/schema-h2.sql")
                .setType(EmbeddedDatabaseType.H2).build();
    }
}

How to combine multiple listeners (step, read, process, write and skip) in spring batch

The aim is to track the lines or items being read, processed and written in a Spring Batch job with multiple steps.
I have created a listener that implements these interfaces: StepExecutionListener, SkipPolicy, ItemReadListener, ItemProcessListener, ItemWriteListener.
@Component
public class GenericListener implements StepExecutionListener, SkipPolicy, ItemReadListener, ItemProcessListener, ItemWriteListener {

    private Log log = LogFactory.getLog(getClass());

    private JobExecution jobExecution;
    private int numeroProcess = 0;
    private int currentReadIndex = 0;
    private int currentProcessIndex = 0;
    private int currentWriteIndex = 0;

    @Override
    public void beforeRead() throws Exception {
        log.info(String.format("[read][line : %s]", currentReadIndex));
        currentReadIndex++;
    }

    @Override
    public void afterRead(Object o) throws Exception {
        log.info("Ligne correct");
    }

    @Override
    public void onReadError(Exception e) throws Exception {
        jobExecution.stop();
    }

    @Override
    public boolean shouldSkip(Throwable throwable, int i) throws SkipLimitExceededException {
        String err = String.format("Erreur a la ligne %s | message %s | cause %s | stacktrace %s", numeroProcess, throwable.getMessage(), throwable.getCause().getMessage(), throwable.getCause().getStackTrace());
        log.error(err);
        return true;
    }

    @Override
    public void beforeProcess(Object o) {
        log.debug(String.format("[process:%s][%s][Object:%s]", numeroProcess++, o.getClass(), o.toString()));
        currentProcessIndex++;
    }

    @Override
    public void afterProcess(Object o, Object o2) { }

    @Override
    public void onProcessError(Object o, Exception e) {
        String err = String.format("[ProcessError at %s][Object %s][Exception %s][Trace %s]", currentProcessIndex, o.toString(), e.getMessage(), e.getStackTrace());
        log.error(err);
        jobExecution.stop();
    }

    @Override
    public void beforeWrite(List list) {
        log.info(String.format("[write][chunk number:%s][current chunk size %s]", currentWriteIndex, list != null ? list.size() : 0));
        currentWriteIndex++;
    }

    @Override
    public void afterWrite(List list) { }

    @Override
    public void onWriteError(Exception e, List list) {
        jobExecution.stop();
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
        jobExecution = stepExecution.getJobExecution();
        currentReadIndex = 0;
        currentProcessIndex = 0;
        currentWriteIndex = 0;
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        return null;
    }
}
The job definition (CustomJobListener is a simple class that extends JobExecutionListenerSupport):
public class BatchConfiguration {

    @Autowired
    public JobBuilderFactory jobs;

    @Bean
    public Job job(CustomJobListener listener,
                   @Qualifier("step1") Step step1,
                   @Qualifier("step2") Step step2,
                   @Qualifier("step3") Step step3) {
        return jobs.get("SimpleJobName")
                .incrementer(new RunIdIncrementer())
                .preventRestart()
                .listener(listener)
                .start(step1)
                .next(step2)
                .next(step3)
                .build();
    }
}
The step definition (all three steps have the same definition; only the reader/processor/writer changes):
@Component
public class StepControleFormat {

    @Autowired
    private StepOneReader reader;

    @Autowired
    private StepOneProcessor processor;

    @Autowired
    private StepOneWriter writer;

    @Autowired
    private ConfigAccess configAccess;

    @Autowired
    private GenericListener listener;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Bean
    @JobScope
    @Qualifier("step1")
    public Step stepOne() throws StepException {
        return stepBuilderFactory.get("step1")
                .<StepOneInput, StepOneOutput>chunk(configAccess.getChunkSize())
                .listener((ItemProcessListener<? super StepOneInput, ? super StepOneOutput>) listener)
                .faultTolerant()
                .skipPolicy(listener)
                .reader(reader.read())
                .processor(processor.compose())
                .writer(writer)
                .build();
    }
}
Now the problem is that the methods beforeStep(StepExecution stepExecution) and afterStep(StepExecution stepExecution) are not fired, while all the other methods in GenericListener are correctly fired when their respective events occur.
I tried using listener((StepExecutionListener) listener) instead of listener((ItemProcessListener<? super StepOneInput, ? super StepOneOutput>) listener), but that overload returns an AbstractTaskletStepBuilder, and then I can't chain reader, processor or writer.
Update: my Spring Boot version is v1.5.9.RELEASE.
I solved it thanks to Michael Minella's hint:
@Bean
@JobScope
@Qualifier("step1")
public Step stepOne() throws StepException {
    SimpleStepBuilder<StepOneInput, StepOneOutput> builder = stepBuilderFactory.get("step1")
            .<StepOneInput, StepOneOutput>chunk(configAccess.getChunkSize())
            // setting up listener for read/process/write
            .listener((ItemProcessListener<? super StepOneInput, ? super StepOneOutput>) listener)
            .faultTolerant()
            // setting up listener for skipPolicy
            .skipPolicy(listener)
            .reader(reader.read())
            .processor(processor.compose())
            .writer(writer);
    // for the step execution listener
    builder.listener((StepExecutionListener) listener);
    return builder.build();
}
The last listener method called, public B listener(StepExecutionListener listener) from StepBuilderHelper<B extends StepBuilderHelper<B>>, returns a StepBuilderHelper, which does not define a build() method. So the solution was to split up the step build definition.
What I don't understand is this: although the writer method returns a SimpleStepBuilder<I, O>, which defines public SimpleStepBuilder listener(Object listener), the compiler/IDE (IntelliJ IDEA) resolves the call to public B listener(StepExecutionListener listener) from StepBuilderHelper<B extends StepBuilderHelper<B>>. If anyone could explain this behaviour, I would appreciate it.
Moreover, finding a way to hook up all the listeners in one call using public SimpleStepBuilder listener(Object listener) from SimpleStepBuilder would be very interesting.
Additional step listeners can be added as follows:
@Bean(name = STEP1)
public Step rpcbcStep() {
    SimpleStepBuilder<Employee, Employee> builder = stepBuilderFactory.get(STEP1).<Employee, Employee>chunk(100)
            .reader(step1BackgroundReader())
            .processor(processor())
            .writer(writer());
    builder.listener(step1BackgroundStepListener);
    builder.listener(step1BackgroundStepListener2);
    // add any other listeners needed
    return builder.build();
}

Spring Batch how to process list of data before write in a Step

I am trying to read client data from a database and write the processed data to a flat file.
But I need to process the whole result of the ItemReader before writing the data.
For example, I am reading Client objects from database rows:
public class Client {
    private String id;
    private String subscriptionCode;
    private Boolean activated;
}
But I want to count and write how many users are activated, grouped by subscriptionCode:
public class Subscription {
    private String subscriptionCode;
    private Integer activatedUserCount;
}
I don't know how to perform that using ItemReader/ItemProcessor/ItemWriter; can you help me?
BatchConfiguration :
@CommonsLog
@Configuration
@EnableBatchProcessing
@EnableAutoConfiguration
public class BatchConfiguration {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1")
                .<Client, Client>chunk(1000)
                .reader(new ListItemReader<Client>(new ArrayList<Client>() { // just for test
                    {
                        add(Client.builder().id("1").subscriptionCode("AA").activated(true).build());
                        add(Client.builder().id("2").subscriptionCode("BB").activated(true).build());
                        add(Client.builder().id("3").subscriptionCode("AA").activated(false).build());
                        add(Client.builder().id("4").subscriptionCode("AA").activated(true).build());
                    }
                }))
                .processor(new ItemProcessor<Client, Client>() {
                    public Client process(Client item) throws Exception {
                        log.info(item);
                        return item;
                    }
                })
                .writer(new ItemWriter<Client>() {
                    public void write(List<? extends Client> items) throws Exception {
                        // Only here can I use the List of Client.
                        // How can I process this list to fill Subscription objects before writing?
                    }
                })
                .build();
    }

    @Bean
    public Job job1(Step step1) throws Exception {
        return jobBuilderFactory.get("job1").incrementer(new RunIdIncrementer()).start(step1).build();
    }
}
Main application:
public class App {
    public static void main(String[] args) throws JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException, JobParametersInvalidException {
        System.exit(SpringApplication.exit(SpringApplication.run(BatchConfiguration.class, args)));
    }
}
}
If I understand your comments correctly, you need to make a summary of activated accounts, right?
You can create a Subscription for every Client you are processing and, with an ItemWriteListener.afterWrite callback, write the created Subscription items to the database. A sketch follows.
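A minimal sketch of that listener idea; the SubscriptionRepo repository and its saveAll call are assumptions, not code from the question:

public class SubscriptionWriteListener implements ItemWriteListener<Subscription> {

    private final SubscriptionRepo subscriptionRepo; // assumed JPA repository

    public SubscriptionWriteListener(SubscriptionRepo subscriptionRepo) {
        this.subscriptionRepo = subscriptionRepo;
    }

    @Override
    public void beforeWrite(List<? extends Subscription> items) { }

    @Override
    public void afterWrite(List<? extends Subscription> items) {
        // the chunk was written successfully; persist the aggregated counts
        subscriptionRepo.saveAll(items);
    }

    @Override
    public void onWriteError(Exception exception, List<? extends Subscription> items) { }
}

You would register it on the step with .listener(new SubscriptionWriteListener(subscriptionRepo)).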
I found a solution based on ItemProcessor :
@Bean
public Step step1() {
    return stepBuilderFactory.get("step1")
            .<Client, Subscription>chunk(1000)
            .reader(new ListItemReader<Client>(new ArrayList<Client>() {
                {
                    add(Client.builder().id("1").subscriptionCode("AA").activated(true).build());
                    add(Client.builder().id("2").subscriptionCode("BB").activated(true).build());
                    add(Client.builder().id("3").subscriptionCode("AA").activated(false).build());
                    add(Client.builder().id("4").subscriptionCode("AA").activated(true).build());
                }
            }))
            .processor(new ItemProcessor<Client, Subscription>() {

                private List<Subscription> subscriptions;

                public Subscription process(Client item) throws Exception {
                    for (Subscription s : subscriptions) { // try to retrieve an existing element
                        if (s.getSubscriptionCode().equals(item.getSubscriptionCode())) { // element found
                            if (item.getActivated()) {
                                s.getActivatedUserCount().incrementAndGet(); // increment user count
                                log.info("Incremented subscription : " + s);
                            }
                            return null; // existing element -> skip
                        }
                    }
                    // create a new Subscription
                    Subscription subscription = Subscription.builder().subscriptionCode(item.getSubscriptionCode()).activatedUserCount(new AtomicInteger(1)).build();
                    subscriptions.add(subscription);
                    log.info("New subscription : " + subscription);
                    return subscription;
                }

                @BeforeStep
                public void initList() {
                    subscriptions = Collections.synchronizedList(new ArrayList<Subscription>());
                }

                @AfterStep
                public void clearList() {
                    subscriptions.clear();
                }
            })
            .writer(new ItemWriter<Subscription>() {
                public void write(List<? extends Subscription> items) throws Exception {
                    log.info(items);
                    // do write stuff
                }
            })
            .build();
}
But I have to maintain a second Subscription list inside the ItemProcessor (I don't know whether that is thread-safe and efficient). What do you think about this solution?

Spring-batch #BeforeStep does not work with #StepScope

I'm using Spring Batch version 2.2.4.RELEASE.
I tried to write a simple example with stateful ItemReader, ItemProcessor and ItemWriter beans:
public class StatefulItemReader implements ItemReader<String> {

    private List<String> list;

    @BeforeStep
    public void initializeState(StepExecution stepExecution) {
        this.list = new ArrayList<>();
    }

    @AfterStep
    public ExitStatus exploitState(StepExecution stepExecution) {
        System.out.println("******************************");
        System.out.println(" READING RESULTS : " + list.size());
        return stepExecution.getExitStatus();
    }

    @Override
    public String read() throws Exception {
        this.list.add("some stateful reading information");
        if (list.size() < 10) {
            return "value " + list.size();
        }
        return null;
    }
}
In my integration test, I'm declaring my beans in an inner static java config class like the one below:
@ContextConfiguration
@RunWith(SpringJUnit4ClassRunner.class)
public class SingletonScopedTest {

    @Configuration
    @EnableBatchProcessing
    static class TestConfig {

        @Autowired
        private JobBuilderFactory jobBuilder;

        @Autowired
        private StepBuilderFactory stepBuilder;

        @Bean
        JobLauncherTestUtils jobLauncherTestUtils() {
            return new JobLauncherTestUtils();
        }

        @Bean
        public DataSource dataSource() {
            EmbeddedDatabaseBuilder embeddedDatabaseBuilder = new EmbeddedDatabaseBuilder();
            return embeddedDatabaseBuilder.addScript("classpath:org/springframework/batch/core/schema-drop-hsqldb.sql")
                    .addScript("classpath:org/springframework/batch/core/schema-hsqldb.sql")
                    .setType(EmbeddedDatabaseType.HSQL)
                    .build();
        }

        @Bean
        public Job jobUnderTest() {
            return jobBuilder.get("job-under-test")
                    .start(stepUnderTest())
                    .build();
        }

        @Bean
        public Step stepUnderTest() {
            return stepBuilder.get("step-under-test")
                    .<String, String>chunk(1)
                    .reader(reader())
                    .processor(processor())
                    .writer(writer())
                    .build();
        }

        @Bean
        public ItemReader<String> reader() {
            return new StatefulItemReader();
        }

        @Bean
        public ItemProcessor<String, String> processor() {
            return new StatefulItemProcessor();
        }

        @Bean
        public ItemWriter<String> writer() {
            return new StatefulItemWriter();
        }
    }

    @Autowired
    JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void testStepExecution() {
        JobExecution jobExecution = jobLauncherTestUtils.launchStep("step-under-test");
        assertEquals(ExitStatus.COMPLETED, jobExecution.getExitStatus());
    }
}
This test passes.
But as soon as I define my StatefulItemReader as a step scoped bean (which is better for a stateful reader), the "before step" code is no longer executed.
...
@Bean
@StepScope
public ItemReader<String> reader() {
    return new StatefulItemReader();
}
...
And I notice the same issue with my processor and writer beans.
What's wrong with my code? Is it related to this resolved issue: https://jira.springsource.org/browse/BATCH-1230
My whole Maven project with several JUnit tests can be found on GitHub: https://github.com/galak75/spring-batch-step-scope
Thank you in advance for your answers.
When you configure a bean as follows:
@Bean
@StepScope
public MyInterface myBean() {
    return new MyInterfaceImpl();
}
You are telling Spring to use the proxy mode ScopedProxyMode.TARGET_CLASS. However, by returning MyInterface instead of MyInterfaceImpl, the proxy only has visibility into the methods on MyInterface. This prevents Spring Batch from finding the methods on MyInterfaceImpl that have been annotated with listener annotations like @BeforeStep. The correct way to configure this is to return MyInterfaceImpl from your configuration method, like below:
@Bean
@StepScope
public MyInterfaceImpl myBean() {
    return new MyInterfaceImpl();
}
We have added a warning log message on startup that points out that if the object is proxied and the target is an interface, we won't be able to find annotated listener methods on the implementing class.
As suggested by pojo-guy, the solution is to implement StepExecutionListener and override the beforeStep method to set the stepExecution:
@Override
public void beforeStep(StepExecution stepExecution) {
    this.stepExecution = stepExecution;
}
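For context, a minimal sketch combining this with the earlier answer, applied to the question's reader: the class implements StepExecutionListener (interface methods, not annotated ones), and the @Bean method returns the concrete type so the scoped proxy exposes the callbacks:

public class StatefulItemReader implements ItemReader<String>, StepExecutionListener {

    private List<String> list;

    @Override
    public void beforeStep(StepExecution stepExecution) {
        // interface method, so it stays reachable through the scoped proxy
        this.list = new ArrayList<>();
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        System.out.println(" READING RESULTS : " + list.size());
        return stepExecution.getExitStatus();
    }

    @Override
    public String read() {
        this.list.add("some stateful reading information");
        return (list.size() < 10) ? "value " + list.size() : null;
    }
}

@Bean
@StepScope
public StatefulItemReader reader() { // concrete return type, per the answer above
    return new StatefulItemReader();
}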
