I have just started to use Spring Batch and got stuck on this problem. My job never ends; it is stuck in an infinite loop. Below is the code:
@SpringBootApplication
@EnableBatchProcessing
public class Main implements CommandLineRunner {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private JobLauncher jobLauncher;

    public static void main(String[] args) {
        SpringApplication.run(Main.class, args);
    }

    @Override
    public void run(String... args) throws Exception {
        jobLauncher.run(flattenPersonJob(), new JobParameters());
        System.out.println("Done");
    }

    @Bean
    public ItemReader itemReader() {
        return new PersonReader();
    }

    @Bean
    public ItemProcessor itemProcessor() {
        return new PersonProcessor();
    }

    @Bean
    public ItemWriter itemWriter() {
        return new PersonWriter();
    }

    @Bean
    public Step flattenPersonStep() {
        return stepBuilderFactory.get("flattenPersonStep")
                .chunk(1)
                .reader(itemReader())
                .processor(itemProcessor())
                .writer(itemWriter())
                .build();
    }

    @Bean
    public JobListener jobListener() {
        return new JobListener();
    }

    @Bean
    public Job flattenPersonJob() {
        return jobBuilderFactory.get("flattenPersonJob")
                .incrementer(new RunIdIncrementer())
                .listener(jobListener())
                .flow(flattenPersonStep())
                .end()
                .build();
    }
}
This is my reader class
public class PersonReader implements ItemReader<List<Person>> {

    @Override
    public List<Person> read() throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        System.out.println("This is the reader");
        List<Person> personList = new ArrayList<>();
        personList.add(new Person("", "", ""));
        Thread.sleep(5000);
        return personList;
    }
}
This is my writer class
public class PersonWriter implements ItemWriter<List<String>> {

    @Override
    public void write(List<? extends List<String>> items) throws Exception {
        System.out.println("This is the writer");
        //Thread.sleep(5000);
        items.forEach(System.out::println);
    }
}
This is my processor class
public class PersonProcessor implements ItemProcessor<List<Person>, List<String>> {

    @Override
    public List<String> process(List<Person> item) throws Exception {
        System.out.println("This is the processor");
        //Thread.sleep(5000);
        return item.stream().map(n -> n.getName()).collect(Collectors.toList());
    }
}
Is there any configuration that I am missing here?
Or is there something wrong with my code?
I have googled for some time now, but could not find anything constructive.
Any help here is much appreciated.
Thanks,
Amar
Your reader never returns null. The contract for an ItemReader in Spring Batch is to read until the reader returns null (indicating that the input has been exhausted). Since you never return null from your ItemReader, your job will read forever.
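For example, here is a minimal sketch of how the reader could signal that its input is exhausted; the done flag is an illustrative addition (not in the original code), and whether a whole List<Person> should really be a single item is a separate design question:

public class PersonReader implements ItemReader<List<Person>> {

    private boolean done = false; // illustrative flag, not part of the original code

    @Override
    public List<Person> read() throws Exception {
        if (done) {
            return null; // null tells Spring Batch the input is exhausted, so the step can finish
        }
        List<Person> personList = new ArrayList<>();
        personList.add(new Person("", "", ""));
        done = true; // the next call will return null
        return personList;
    }
}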
(Note this issue might be connected to this question, but it has a much smaller scope.)
I have the simplest of jobs defined like this:
@Configuration
@EnableBatchProcessing
public class FileTransformerConfiguration {

    private JobBuilderFactory jobBuilderFactory;
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    public FileTransformerConfiguration(JobBuilderFactory jobBuilderFactory,
                                        StepBuilderFactory stepBuilderFactory) {
        this.jobBuilderFactory = jobBuilderFactory;
        this.stepBuilderFactory = stepBuilderFactory;
    }

    @Bean
    public Job transformJob() {
        return this.jobBuilderFactory.get("transformJob").incrementer(new RunIdIncrementer())
                .flow(transformStep()).end().build();
    }

    @Bean
    public Step transformStep() {
        return this.stepBuilderFactory.get("transformStep")
                .<String, String>chunk(1).reader(new ItemReader())
                .processor(processor())
                .writer(new ItemWriter()).build();
    }

    @Bean
    public ItemProcessor<String, String> processor() {
        return item -> {
            System.out.println("Converting item (" + item + ")...");
            return item;
        };
    }
}
public class ItemReader implements ItemStreamReader<String> {

    private Iterator<String> it;

    @Override
    public void open(ExecutionContext executionContext) throws ItemStreamException {
        this.it = Arrays.asList("A", "B", "C", "D", "E").iterator();
    }

    @Override
    public String read() throws Exception {
        return this.it.hasNext() ? this.it.next() : null;
    }

    @Override
    public void close() throws ItemStreamException { }

    @Override
    public void update(ExecutionContext executionContext) throws ItemStreamException { }
}
@JobScope
public class ItemWriter implements ItemStreamWriter<String> {

    @Override
    public void open(ExecutionContext executionContext) throws ItemStreamException { }

    @Override
    public void write(List<? extends String> items) throws Exception {
        items.forEach(item -> System.out.println("Writing item: " + item));
    }

    @Override
    public void update(ExecutionContext executionContext) throws ItemStreamException { }

    @Override
    public void close() throws ItemStreamException { }
}
There is no fancy logic, just strings being moved through the pipeline.
The code is called like this:
@SpringBootApplication
public class TestCmpsApplication {
}

@SpringBootTest(classes = {TestCmpsApplication.class})
public class FileTransformerImplIT {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job transformJob;

    @Test
    void test1() throws Exception {
        String id = UUID.randomUUID().toString();
        JobParametersBuilder jobParameters = new JobParametersBuilder();
        jobParameters.addLong("PARAM_START_TIME", System.currentTimeMillis());
        jobParameters.addString("PARAM_MAPPING_RULE_DEFINITION_ID", id, true);
        this.jobLauncher.run(this.transformJob, jobParameters.toJobParameters());
    }

    @Test
    void test2() throws Exception {
        String id = UUID.randomUUID().toString();
        JobParametersBuilder jobParameters = new JobParametersBuilder();
        jobParameters.addLong("PARAM_START_TIME", System.currentTimeMillis());
        jobParameters.addString("PARAM_MAPPING_RULE_DEFINITION_ID", id, true);
        this.jobLauncher.run(this.transformJob, jobParameters.toJobParameters());
    }
}
(Note there need to be two tests, even though they are identical. The first one will always work.)
So this works fine. However, once I add this:
@Bean
public Step transformStep() {
    return this.stepBuilderFactory.get("transformStep")
            .<String, String>chunk(1).reader(new ItemReader())
            .processor(processor())
            .writer(new ItemWriter())
            .transactionAttribute(transactionAttribute()).build();
}

private TransactionAttribute transactionAttribute() {
    DefaultTransactionAttribute attribute = new DefaultTransactionAttribute();
    attribute.setPropagationBehavior(Propagation.NEVER.value());
    return attribute;
}
Now the second test fails. The test itself says
TransactionSuspensionNotSupportedException: Transaction manager [org.springframework.batch.support.transaction.ResourcelessTransactionManager] does not support transaction suspension
While the log helpfully provides this error:
IllegalTransactionStateException: Existing transaction found for transaction marked with propagation 'never'
Okay. I directly told the step to never use a transaction, but somehow a transaction gets created anyway. So let's try MANDATORY. Now the test fails with the same error as above, and the log says:
IllegalTransactionStateException: No existing transaction found for transaction marked with propagation 'mandatory'
Somehow a transaction is created, but not for both test runs? Surely SUPPORTS will work then. No, the test fails with the same exception, and the log has this:
OptimisticLockingFailureException: Attempt to update step execution id=1 with wrong version (2), where current version is 3
I have no idea what is happening. Clearly something is creating transactions outside the step, but I have no idea how to stop it. I would rather have no transactions at all, or at least working transaction management where transactions behave the same when the job is run twice in a row.
I tried Spring Batch 4.2, 4.2.5, 4.3 and 4.3.1.
What did I do wrong? How can I make this work?
The problem is with the default job repository; its transaction handling seems to be buggy. To fix this, replace it with the JDBC job repository backed by an in-memory database. Just add this class to the Spring context:
@Configuration
@EnableBatchProcessing
public class InMemoryBatchContextConfigurer extends DefaultBatchConfigurer {

    @Override
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDatabaseType(DatabaseType.H2.getProductName());
        factory.setDataSource(dataSource());
        factory.setTransactionManager(getTransactionManager());
        return factory.getObject();
    }

    public DataSource dataSource() {
        EmbeddedDatabaseBuilder embeddedDatabaseBuilder = new EmbeddedDatabaseBuilder();
        return embeddedDatabaseBuilder
                .addScript("classpath:org/springframework/batch/core/schema-drop-h2.sql")
                .addScript("classpath:org/springframework/batch/core/schema-h2.sql")
                .setType(EmbeddedDatabaseType.H2).build();
    }
}
The aim is to track the lines or items being read/processed/written in a Spring Batch job with multiple steps.
I have created a listener that implements these interfaces: StepExecutionListener, SkipPolicy, ItemReadListener, ItemProcessListener, ItemWriteListener
@Component
public class GenericListener implements StepExecutionListener, SkipPolicy, ItemReadListener, ItemProcessListener, ItemWriteListener {

    private Log log = LogFactory.getLog(getClass());

    private JobExecution jobExecution;

    private int numeroProcess = 0;
    private int currentReadIndex = 0;
    private int currentProcessIndex = 0;
    private int currentWriteIndex = 0;

    @Override
    public void beforeRead() throws Exception {
        log.info(String.format("[read][line : %s]", currentReadIndex));
        currentReadIndex++;
    }

    @Override
    public void afterRead(Object o) throws Exception {
        log.info("Line OK");
    }

    @Override
    public void onReadError(Exception e) throws Exception {
        jobExecution.stop();
    }

    @Override
    public boolean shouldSkip(Throwable throwable, int i) throws SkipLimitExceededException {
        String err = String.format("Error at line %s | message %s | cause %s | stacktrace %s", numeroProcess, throwable.getMessage(), throwable.getCause().getMessage(), throwable.getCause().getStackTrace());
        log.error(err);
        return true;
    }

    @Override
    public void beforeProcess(Object o) {
        log.debug(String.format("[process:%s][%s][Object:%s]", numeroProcess++, o.getClass(), o.toString()));
        currentProcessIndex++;
    }

    @Override
    public void afterProcess(Object o, Object o2) { }

    @Override
    public void onProcessError(Object o, Exception e) {
        String err = String.format("[ProcessError at %s][Object %s][Exception %s][Trace %s]", currentProcessIndex, o.toString(), e.getMessage(), e.getStackTrace());
        log.error(err);
        jobExecution.stop();
    }

    @Override
    public void beforeWrite(List list) {
        log.info(String.format("[write][chunk number:%s][current chunk size %s]", currentWriteIndex, list != null ? list.size() : 0));
        currentWriteIndex++;
    }

    @Override
    public void afterWrite(List list) { }

    @Override
    public void onWriteError(Exception e, List list) {
        jobExecution.stop();
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
        jobExecution = stepExecution.getJobExecution();
        currentReadIndex = 0;
        currentProcessIndex = 0;
        currentWriteIndex = 0;
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        return null;
    }
}
The job definition (CustomJobListener is a simple class that extends JobExecutionListenerSupport)
public class BatchConfiguration {

    @Autowired
    public JobBuilderFactory jobs;

    @Bean
    public Job job(CustomJobListener listener,
                   @Qualifier("step1") Step step1,
                   @Qualifier("step2") Step step2,
                   @Qualifier("step3") Step step3) {
        return jobs.get("SimpleJobName")
                .incrementer(new RunIdIncrementer())
                .preventRestart()
                .listener(listener)
                .start(step1)
                .next(step2)
                .next(step3)
                .build();
    }
}
The step definition (all three steps have the same definition; only the reader/processor/writer changes)
@Component
public class StepControleFormat {

    @Autowired
    private StepOneReader reader;

    @Autowired
    private StepOneProcessor processor;

    @Autowired
    private StepOneWriter writer;

    @Autowired
    private ConfigAccess configAccess;

    @Autowired
    private GenericListener listener;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Bean
    @JobScope
    @Qualifier("step1")
    public Step stepOne() throws StepException {
        return stepBuilderFactory.get("step1")
                .<StepOneInput, StepOneOutput>chunk(configAccess.getChunkSize())
                .listener((ItemProcessListener<? super StepOneInput, ? super StepOneOutput>) listener)
                .faultTolerant()
                .skipPolicy(listener)
                .reader(reader.read())
                .processor(processor.compose())
                .writer(writer)
                .build();
    }
}
Now the problem is that the methods beforeStep(StepExecution stepExecution) and afterStep(StepExecution stepExecution) are not fired, while all the other methods in GenericListener are correctly fired when their respective events occur.
I tried using listener((StepExecutionListener) listener) instead of listener((ItemProcessListener<? super StepOneInput, ? super StepOneOutput>) listener), but that overload returns an AbstractTaskletStepBuilder, and then I can't chain reader, processor or writer.
Update: my Spring Boot version is v1.5.9.RELEASE.
I solved it thanks to Michael Minella's hint:
@Bean
@JobScope
@Qualifier("step1")
public Step stepOne() throws StepException {
    SimpleStepBuilder<StepOneInput, StepOneOutput> builder = stepBuilderFactory.get("step1")
            .<StepOneInput, StepOneOutput>chunk(configAccess.getChunkSize())
            // setting up listener for Read/Process/Write
            .listener((ItemProcessListener<? super StepOneInput, ? super StepOneOutput>) listener)
            .faultTolerant()
            // setting up listener for skipPolicy
            .skipPolicy(listener)
            .reader(reader.read())
            .processor(processor.compose())
            .writer(writer);
    // for the step execution listener
    builder.listener((StepExecutionListener) listener);
    return builder.build();
}
The listener method called last, public B listener(StepExecutionListener listener) from StepBuilderHelper<B extends StepBuilderHelper<B>>, returns a StepBuilderHelper that does not define a build() method. So the solution was to split up the step build definition.
What I don't understand is this: although the writer method returns a SimpleStepBuilder<I, O>, which defines public SimpleStepBuilder listener(Object listener), the compiler/IDE (IntelliJ IDEA) resolves the call to public B listener(StepExecutionListener listener) from StepBuilderHelper<B extends StepBuilderHelper<B>>. I would appreciate it if anyone could explain this behaviour.
Moreover, finding a way to hook up all the listeners in one call using public SimpleStepBuilder listener(Object listener) from SimpleStepBuilder would be very interesting.
Additional Step Listeners can be added as follows.
@Bean(name = STEP1)
public Step rpcbcStep() {
    SimpleStepBuilder<Employee, Employee> builder = stepBuilderFactory.get(STEP1).<Employee, Employee>chunk(100)
            .reader(step1BackgroundReader())
            .processor(processor())
            .writer(writer());
    builder.listener(step1BackgroundStepListener);
    builder.listener(step1BackgroundStepListener2);
    // add any other listeners needed
    return builder.build();
}
I want contract 1 to be updated and contract 7 to be rolled back when method update2 throws an exception, but both contracts are saved successfully.
If I change the propagation of update2 from REQUIRES_NEW to REQUIRED, it throws the exception "Transaction silently rolled back because it has been marked as rollback-only".
How can I achieve that?
@Service
public class UserService {

    @Autowired
    private ContractRepo contractRepo;

    @Autowired
    ContractService contractService;

    @Transactional(value = "transactionManager2")
    public void update1() {
        CONTRACT con1 = contractRepo.findById("1120180001").get();
        con1.setCONTRACT_DATE(new Date());
        contractRepo.save(con1);

        CONTRACT con7 = contractRepo.findById("1120180007").get();
        try {
            contractService.update2(con7);
        } catch (RuntimeException ex) {
            System.out.println("Exception when calling update2 " + ex.getMessage());
        }
    }
}
@Service
public class ContractService {

    @Autowired
    private ContractRepo contractRepo;

    @Transactional(value = "transactionManager2", propagation = Propagation.REQUIRES_NEW)
    public void update2(CONTRACT con) {
        con.setCONTRACT_DATE(new Date());
        contractRepo.save(con);
        throw new RuntimeException("RuntimeException update2: test for rollback");
    }
}
Calling the update method:
@SpringBootApplication
public class DemoApplication implements CommandLineRunner {

    public static void main(String[] args) throws Exception {
        SpringApplication.run(DemoApplication.class, args);
    }

    @Autowired
    private UserService userService;

    @Override
    public void run(String... args) throws Exception {
        userService.update1();
    }
}
I am trying to make my test work with Spring's @Transactional annotation.
@ContextConfiguration(classes = SomeTest.SomeTestSpringConfig.class)
@RunWith(SpringJUnit4ClassRunner.class)
public class SomeTest {

    @Autowired
    MyBean some;

    @Autowired
    PlatformTransactionManager transactionManager;

    @Test
    public void testSpring() throws Exception {
        some.method();
        assertTrue(some.isTransactionalWorks);
    }

    @EnableAspectJAutoProxy(proxyTargetClass = true)
    @EnableLoadTimeWeaving
    @EnableTransactionManagement(mode = AdviceMode.ASPECTJ)
    @TransactionConfiguration
    static class SomeTestSpringConfig {

        @Bean
        PlatformTransactionManager transactionManager() {
            return new MyTransactionManager(dataSource());
        }

        @Bean
        MyBean some() {
            return new MyBean();
        }

        @Bean
        DataSource dataSource() {
            return new SimpleDriverDataSource(Driver.load(), "jdbc:h2:mem:unit-test");
        }
    }
}
class MyBean {

    @Autowired
    DataSource dataSource;

    public boolean isTransactionalWorks;

    @Transactional
    private void someInTransaction() {
        try {
            dataSource.getConnection();
        } catch (SQLException e) {
            e.printStackTrace();
        }
        System.out.println("I should be in transaction");
    }

    public void method() {
        someInTransaction();
    }
}
class MyTransactionManager implements PlatformTransactionManager, InitializingBean {

    private final DataSourceTransactionManager base = new DataSourceTransactionManager();

    @Autowired
    MyBean some;

    public MyTransactionManager(DataSource datasource) {
        base.setDataSource(datasource);
    }

    @Override
    public TransactionStatus getTransaction(TransactionDefinition definition) throws TransactionException {
        some.isTransactionalWorks = true;
        return base.getTransaction(definition);
    }

    @Override
    public void commit(TransactionStatus status) throws TransactionException {
        base.commit(status);
    }

    @Override
    public void rollback(TransactionStatus status) throws TransactionException {
        base.rollback(status);
    }

    @Override
    public void afterPropertiesSet() throws Exception {
        base.afterPropertiesSet();
    }
}
Also I added -javaagent:D:/libs/spring-instrument-4.1.7.RELEASE.jar to VM options for this test.
But it always fails. What did I miss?
Please check this link; I think it is a similar problem to the one you are facing.
How to configure AspectJ with Load Time Weaving without Interface
In that link, the suggestion is to provide both aspectjweaver.jar and spring-instrument.jar as VM arguments.
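For example, assuming the versions and paths below are placeholders for whatever matches your local setup, the VM options would look something like:

-javaagent:D:/libs/spring-instrument-4.1.7.RELEASE.jar
-javaagent:D:/libs/aspectjweaver-1.8.7.jar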
I am trying to read client data from database and write processed data to a flat file.
But I need to process the whole result of the ItemReader before writing the data.
For example, I am reading Client objects from database rows:
public class Client {
    private String id;
    private String subscriptionCode;
    private Boolean activated;
}
But I want to count and write how many users are activated, grouped by subscriptionCode:
public class Subscription {
    private String subscriptionCode;
    private Integer activatedUserCount;
}
I don't know how to do that using ItemReader/ItemProcessor/ItemWriter. Can you help me?
BatchConfiguration:

@CommonsLog
@Configuration
@EnableBatchProcessing
@EnableAutoConfiguration
public class BatchConfiguration {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1")
                .<Client, Client>chunk(1000)
                .reader(new ListItemReader<Client>(new ArrayList<Client>() { // Just for test
                    {
                        add(Client.builder().id("1").subscriptionCode("AA").activated(true).build());
                        add(Client.builder().id("2").subscriptionCode("BB").activated(true).build());
                        add(Client.builder().id("3").subscriptionCode("AA").activated(false).build());
                        add(Client.builder().id("4").subscriptionCode("AA").activated(true).build());
                    }
                }))
                .processor(new ItemProcessor<Client, Client>() {
                    public Client process(Client item) throws Exception {
                        log.info(item);
                        return item;
                    }
                })
                .writer(new ItemWriter<Client>() {
                    public void write(List<? extends Client> items) throws Exception {
                        // Only here can I use the List of Client
                        // How can I process this list before filling the Subscription objects?
                    }
                })
                .build();
    }

    @Bean
    public Job job1(Step step1) throws Exception {
        return jobBuilderFactory.get("job1").incrementer(new RunIdIncrementer()).start(step1).build();
    }
}
Main application:
public class App {

    public static void main(String[] args) throws JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException, JobParametersInvalidException {
        System.exit(SpringApplication.exit(SpringApplication.run(BatchConfiguration.class, args)));
    }
}
If I understand your comments correctly, you need to build a summary of activated accounts, right?
You can create a Subscription for every Client you process and, with ItemWriteListener.afterWrite, write the Subscription items created this way to the database.
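To illustrate that suggestion, here is a minimal sketch of an ItemWriteListener that aggregates each written chunk; SubscriptionSummaryListener and buildSummary are hypothetical names, and the builder/getter calls assume the Lombok-style accessors used in your snippets:

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

import org.springframework.batch.core.ItemWriteListener;

public class SubscriptionSummaryListener implements ItemWriteListener<Client> {

    // running count of activated users per subscription code
    private final Map<String, Integer> activatedBySubscription = new HashMap<>();

    @Override
    public void beforeWrite(List<? extends Client> items) { }

    @Override
    public void afterWrite(List<? extends Client> items) {
        // aggregate the chunk that was just written successfully
        for (Client client : items) {
            if (Boolean.TRUE.equals(client.getActivated())) {
                activatedBySubscription.merge(client.getSubscriptionCode(), 1, Integer::sum);
            }
        }
    }

    @Override
    public void onWriteError(Exception exception, List<? extends Client> items) { }

    // call this once the step is done (e.g. from a StepExecutionListener) to build the summary rows
    public List<Subscription> buildSummary() {
        return activatedBySubscription.entrySet().stream()
                .map(e -> Subscription.builder()
                        .subscriptionCode(e.getKey())
                        .activatedUserCount(e.getValue())
                        .build())
                .collect(Collectors.toList());
    }
}

You would register it on the step with an extra listener(...) call next to the writer, in the same way as the step listeners shown earlier in this thread.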
I found a solution based on an ItemProcessor:
@Bean
public Step step1() {
    return stepBuilderFactory.get("step1")
            .<Client, Subscription>chunk(1000)
            .reader(new ListItemReader<Client>(new ArrayList<Client>() {
                {
                    add(Client.builder().id("1").subscriptionCode("AA").activated(true).build());
                    add(Client.builder().id("2").subscriptionCode("BB").activated(true).build());
                    add(Client.builder().id("3").subscriptionCode("AA").activated(false).build());
                    add(Client.builder().id("4").subscriptionCode("AA").activated(true).build());
                }
            }))
            .processor(new ItemProcessor<Client, Subscription>() {

                private List<Subscription> subscriptions;

                public Subscription process(Client item) throws Exception {
                    for (Subscription s : subscriptions) { // try to retrieve existing element
                        if (s.getSubscriptionCode().equals(item.getSubscriptionCode())) { // element found
                            if (item.getActivated()) {
                                s.getActivatedUserCount().incrementAndGet(); // increment user count
                                log.info("Incremented subscription : " + s);
                            }
                            return null; // existing element -> skip
                        }
                    }
                    // Create new Subscription
                    Subscription subscription = Subscription.builder().subscriptionCode(item.getSubscriptionCode()).activatedUserCount(new AtomicInteger(1)).build();
                    subscriptions.add(subscription);
                    log.info("New subscription : " + subscription);
                    return subscription;
                }

                @BeforeStep
                public void initList() {
                    subscriptions = Collections.synchronizedList(new ArrayList<Subscription>());
                }

                @AfterStep
                public void clearList() {
                    subscriptions.clear();
                }
            })
            .writer(new ItemWriter<Subscription>() {
                public void write(List<? extends Subscription> items) throws Exception {
                    log.info(items);
                    // do write stuff
                }
            })
            .build();
}
But I have to maintain a second Subscription list inside the ItemProcessor (I don't know whether that is thread safe and efficient). What do you think about this solution?