I am trying to test my Spring Batch application with JUnit.
I have set up the project and it works fine when run using a normal Java main method. But when I write a JUnit test and try to execute it using SpringJUnit4ClassRunner, it does not execute the step that belongs to the Job. The following is the JUnit test case I have written.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = CASConfig.class)
public class DailyDataPreparationStepTest {
    private static final Logger LOGGER = LoggerFactory.getLogger(DailyDataPreparationStepTest.class);
    @Autowired
    private ApplicationContext applicationContext;
    @Autowired
    private Environment environment;
    @Autowired
    private JobLauncher jobLauncher;
    @Autowired
    private JobRepository jobRepository;
    @Autowired
    private Job dailyDataPrepJob;
    @Autowired
    private DailyDataPreparationStep dailyDataPreparationStep;
    private JobLauncherTestUtils jobLauncherTestUtils;
    static {
        System.setProperty("env", "test");
    }
    @Before
    public void setUp() throws Exception {
        jobLauncherTestUtils = new JobLauncherTestUtils();
        jobLauncherTestUtils.setJobLauncher(jobLauncher);
        jobLauncherTestUtils.setJobRepository(jobRepository);
        jobLauncherTestUtils.setJob(dailyDataPrepJob);
    }
    @After
    public void tearDown() throws Exception {
    }
    @Test
    public void testJobExecution() {
        try {
            jobLauncherTestUtils.launchJob(this.jobParameters("1", "System", DateUtil.getCurrentDateInYYYYMMDD(), "log.log"));
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    private JobParameters jobParameters(String jobId, String userId, String executionDate, String jobLogFileName) {
        return new JobParametersBuilder()
                .addString(Constants.BatchJobParameter.PARAM_EXECUTION_DATE, executionDate)
                .addString(Constants.BatchJobParameter.PARAM_JOBID, jobId)
                .addString(Constants.BatchJobParameter.PARAM_JOBLOGFILENAME, jobLogFileName)
                .addString(Constants.BatchJobParameter.PARAM_USERID, userId)
                .addDate(Constants.BatchJobParameter.PARAM_TIMESTAMP, Calendar.getInstance().getTime())
                .toJobParameters();
    }
}
The following code snippet shows the Job configuration:
@Bean(name = "dailyDataPrepJob")
public Job dailyDataPreparationJob() throws Exception {
    return jobBuilderFactory.get("dailyDataPrepJob")
            .start(dailyDataPrepStep)
            .incrementer(new RunIdIncrementer())
            .listener(jobExecutionListener)
            .build();
}
Can anyone give me an idea of what is going on when the job is executed via SpringJUnit4ClassRunner?
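For reference, a minimal sketch of exercising just the step with JobLauncherTestUtils.launchStep (the step name "dailyDataPrepStep" is an assumption; it has to match the name the step was registered with inside the job):
@Test
public void testStepExecution() throws Exception {
    // launchStep runs a single step of the configured job in isolation
    JobExecution execution = jobLauncherTestUtils.launchStep(
            "dailyDataPrepStep",
            this.jobParameters("1", "System", DateUtil.getCurrentDateInYYYYMMDD(), "log.log"));
    assertEquals(BatchStatus.COMPLETED, execution.getStatus());
}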
Related
I have 2 test files, but whenever I try to run gradle clean build,
I get java.lang.IllegalStateException: Failed to load ApplicationContext. When I remove the @AutoConfigureMockMvc annotation, I get the error Could not autowire. No beans of 'MockMvc' type found.
1st File JobTest
@RunWith(SpringRunner.class)
@SpringBootTest
@AutoConfigureMockMvc
public class JobTest {
    @Autowired
    private WebApplicationContext applicationContext;
    @Autowired
    private MockMvc mockMvc;
    @MockBean
    private JobService jobService;
    private static final String URI = "/testJob/";
    @BeforeEach
    void setup() {
        this.mockMvc = MockMvcBuilders.webAppContextSetup(applicationContext).build();
    }
    private final UUID jobId = UUID.fromString("d35089c0-8ca8-4a9d-8932-8464e9a0736c");
    @Test
    public void testRequestJob() throws Exception {
        // create a request object
        RequestBuilder requestBuilder = MockMvcRequestBuilders.post(URI)
                .contentType(MediaType.APPLICATION_JSON)
                .accept(MediaType.ALL)
                .content("testRequestString");
        when(jobService.neededJob(anyString())).thenReturn(mockJob);
        ResultActions perform = mockMvc.perform(requestBuilder);
        assertEquals(HttpStatus.OK.value(), perform.andReturn().getResponse().getStatus());
        // perform the request and check the response
        perform.andExpect(status().isOk())
                .andExpect(content().contentType(MediaType.APPLICATION_JSON))
                .andExpect(jsonPath("$.data.jobId").exists());
    }
}
2nd File EmployerTest
@RunWith(SpringRunner.class)
@SpringBootTest
@AutoConfigureMockMvc
public class ShiftControllerTest {
    @Autowired
    private WebApplicationContext applicationContext;
    @Autowired
    private MockMvc mockMvc;
    @MockBean
    private EmployerService employerService;
    @BeforeEach
    void setup() {
        this.mockMvc = MockMvcBuilders.webAppContextSetup(applicationContext).build();
    }
    private final UUID jobId = UUID.fromString("d35089c0-8ca8-4a9d-8932-8464e9a0736c");
    private static final String URI = "/employer/";
    @Test
    public void testEmployer() throws Exception {
        RequestBuilder requestBuilder = MockMvcRequestBuilders.get(URI + jobId)
                .accept(MediaType.ALL);
        when(employerService.getEmployer(jobId)).thenReturn(mockEmployer);
        ResultActions perform = mockMvc.perform(requestBuilder);
        assertEquals(HttpStatus.OK.value(), perform.andReturn().getResponse().getStatus());
    }
}
If I comment out one of the files and then run gradle clean build, it works properly. Any suggestion will be appreciated.
In the code that you have posted, you don't seem to use the WebApplicationContext for anything apart from creating a MockMvc object, which you have already created anyway by virtue of the @Autowired annotation. If you don't need it for anything else, try just removing the WebApplicationContext. For example:
@RunWith(SpringRunner.class)
@SpringBootTest
@AutoConfigureMockMvc
public class JobTest {
    // @Autowired
    // private WebApplicationContext applicationContext;
    @Autowired
    private MockMvc mockMvc;
    @MockBean
    private JobService jobService;
    private static final String URI = "/testJob/";
    // @BeforeEach
    // void setup() {
    //     this.mockMvc = MockMvcBuilders.webAppContextSetup(applicationContext).build();
    // }
    private final UUID jobId = UUID.fromString("d35089c0-8ca8-4a9d-8932-8464e9a0736c");
    @Test
    public void testRequestJob() throws Exception {
        // create a request object
        RequestBuilder requestBuilder = MockMvcRequestBuilders.post(URI)
                .contentType(MediaType.APPLICATION_JSON)
                .accept(MediaType.ALL)
                .content("testRequestString");
        when(jobService.neededJob(anyString())).thenReturn(mockJob);
        ResultActions perform = mockMvc.perform(requestBuilder);
        assertEquals(HttpStatus.OK.value(), perform.andReturn().getResponse().getStatus());
        // perform the request and check the response
        perform.andExpect(status().isOk())
                .andExpect(content().contentType(MediaType.APPLICATION_JSON))
                .andExpect(jsonPath("$.data.jobId").exists());
    }
}
There is a Spring application that uses spring-batch and spring-quartz.
There is a @Service class named MyService that launches the Spring Batch job.
@Service
@EnableBatchProcessing
@EnableScheduling
public class MyService {
    @Autowired
    JobLauncher jobLauncher;
    @Autowired
    Job processExportJob;
    public void helloMethod() throws JobParametersInvalidException, JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException {
        JobParameters jobParameters = new JobParametersBuilder().addLong("time", System.currentTimeMillis())
                .toJobParameters();
        jobLauncher.run(processExportJob, jobParameters);
    }
}
There is a spring-quartz job and configuration that try to inject this service and launch helloMethod().
If the job contains only a logger, there is no problem and it works fine.
Then I tried to inject my service into one of the job's fields. After every application launch the job has this field populated the first time, but on every subsequent execution the field is null.
I tried to just create my service with new:
MyService service = new MyService();
But then all the autowired fields inside the service are null after the first successful launch.
The application is deployed on WebSphere 8.5.5.13 (a cluster with 2 nodes), with Oracle 11g, Spring 4 and Java 8.
When I inject the service into the job using the @Autowired annotation, the job works exactly once; the injected field is null during all subsequent job executions.
Moreover, this is the job in which I create the service outside Spring:
MyService service = new MyService();
@DisallowConcurrentExecution
@PersistJobDataAfterExecution
public class MyJob implements Job {
    @Autowired
    private MyService service;
    private static final String MESSAGE = "===================================QUARTZ TACT===================================";
    private Logger logger = Logger.getLogger(getClass());
    @Override
    public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
        logger.log(Level.INFO, MESSAGE);
        try {
            service.helloMethod();
        } catch (Exception e) {
            e.printStackTrace();
            logger.log(Level.ERROR, " Failed..");
            logger.log(Level.ERROR, Arrays.toString(e.getStackTrace()));
        }
    }
}
Instead of injecting the service, it is helpful to fetch it from the Spring context:
ApplicationContext springContext =
WebApplicationContextUtils.getWebApplicationContext(ContextLoaderListener.getCurrentWebApplicationContext().getServletContext());
Object bean = springContext.getBean("myService");
That solved my issue.
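For completeness, a minimal sketch (an assumption, not code from the question) of embedding that lookup in the Quartz job itself, so the service is resolved from the Spring context on every execution instead of relying on field injection that Quartz does not re-apply:
@DisallowConcurrentExecution
@PersistJobDataAfterExecution
public class MyJob implements Job {
    @Override
    public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
        // Resolve the service from the root web application context on each run;
        // the bean name "myService" is the default for a @Service-annotated MyService.
        ApplicationContext springContext = WebApplicationContextUtils.getWebApplicationContext(
                ContextLoaderListener.getCurrentWebApplicationContext().getServletContext());
        MyService service = (MyService) springContext.getBean("myService");
        try {
            service.helloMethod();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}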
I started to learn Spring Boot Batch, version 2.1.4.
I want to run my job in a scheduler, and this job runs only once: the ItemProcessor and ItemWriter run only once, while the ItemReader runs every time. Does anyone have an idea what I did wrong? In the future I want to replace the scheduler with the Java WatchService and pass the filePath to the job, but for now the filePath parameter is just a string passed as a function parameter. This is my code:
This is my reader:
@Component
public class UserReaderImpl {
    @StepScope
    public ItemReader<UserCsvStructure> read(String filepath) {
        FlatFileItemReader<UserCsvStructure> reader = new FlatFileItemReader<>();
        reader.setLinesToSkip(1);
        reader.setResource(new FileSystemResource(filepath));
        reader.setLineMapper(new DefaultLineMapper<UserCsvStructure>() {
            {
                setLineTokenizer(new DelimitedLineTokenizer() {
                    {
                        setNames(new String[]{"firstName", "lastName", "email"});
                    }
                });
                setFieldSetMapper(new BeanWrapperFieldSetMapper<UserCsvStructure>() {
                    {
                        setTargetType(UserCsvStructure.class);
                    }
                });
            }
        });
        return reader;
    }
}
This is my ItemProcessor:
@StepScope
@Component
public class UserProcessorImpl implements ItemProcessor<UserCsvStructure, User> {
    @Override
    public User process(UserCsvStructure userCsvStructure) throws Exception {
        return User.builder()
                .email(userCsvStructure.getEmail())
                .firstName(userCsvStructure.getFirstName())
                .lastName(userCsvStructure.getLastName())
                .build();
    }
}
This is my ItemWriter:
@Component
@StepScope
public class UserWriterImpl implements ItemWriter<User> {
    @Autowired
    private UserRepository userRepository;
    @Override
    public void write(List<? extends User> list) throws Exception {
        System.out.println(list);
        userRepository.saveAll(list);
    }
}
And this is my configuration:
@Component
public class UserBatchCsvConfig {
    @Autowired
    public JobBuilderFactory jobBuilderFactory;
    @Autowired
    public StepBuilderFactory stepBuilderFactory;
    @Autowired
    private UserReaderImpl userReader;
    @Autowired
    private UserWriterImpl userWriter;
    @Autowired
    private UserProcessorImpl userProcessor;
    public Job csvFileToDatabaseJob(UserJobCompletionNotificationListener listener, String fileName) {
        return jobBuilderFactory.get("userCsvProcess")
                .incrementer(new RunIdIncrementer())
                .listener(listener)
                .flow(csvFileToDatabaseStep(fileName))
                .end()
                .build();
    }
    private Step csvFileToDatabaseStep(String fileName) {
        return stepBuilderFactory.get("userCsvProcess")
                .<UserCsvStructure, User>chunk(1)
                .reader(userReader.read(fileName))
                .processor(userProcessor)
                .writer(userWriter)
                .build();
    }
}
The last class is my scheduler:
@Component
public class UserCsvProcessor {
    @Autowired
    private JobLauncher jobLauncher;
    @Autowired
    private UserBatchCsvConfig job;
    @Autowired
    private UserJobCompletionNotificationListener userJobCompletionNotificationListener;
    @Scheduled(fixedDelay = 10000)
    public void runJob() throws Exception {
        jobLauncher.run(job.csvFileToDatabaseJob(userJobCompletionNotificationListener, "C:\\Users\\Anik\\Desktop\\angular\\test.csv"), new JobParameters());
    }
}
I found out what I should add to my code. Launching the job with empty JobParameters reuses the same JobInstance on every schedule tick, and Spring Batch will not re-run an instance that has already completed; adding a unique parameter creates a fresh instance each time.
In the UserCsvProcessor class I need to change my scheduled function to:
@Scheduled(fixedDelay = 10000)
public void runJob() throws Exception {
    JobParameters params = new JobParametersBuilder()
            .addString("JobID", String.valueOf(System.currentTimeMillis()))
            .toJobParameters();
    jobLauncher.run(job.csvFileToDatabaseJob(userJobCompletionNotificationListener, "C:\\Users\\Anik\\Desktop\\angular\\test.csv"), params);
}
If someone has another or better idea, just add an answer.
With the configuration you have in the @Scheduled annotation, you are indicating that the method should be executed every 10 seconds. So, when your first execution is completed, it will wait 10 seconds and then execute again, and so on.
@Scheduled(fixedDelay = 10000)
If you want to execute it once (I guess once a day), you can use a cron expression in your @Scheduled annotation. Check the example below, where the cron expression indicates that the method should be executed every day at 10:15 a.m.
@Scheduled(cron = "0 15 10 * * *")
If you want to run it once a month/year you can handle the cron expression to do that.
Additionally, you can read that expression from the configuration file using something like the following:
@Scheduled(cron = "${cron.expression}")
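For instance, assuming a property key named cron.expression in application.properties (the key name is arbitrary):
# every day at 10:15 a.m., matching the example above
cron.expression=0 15 10 * * *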
This is my integration test class for the controller. The method gets all teams, and there was a problem with the compilation:
@SpringJUnitWebConfig(classes = CrewApplication.class)
public class Team_Controller_Integration_Test {
    private MockMvc mockMvc;
    @Autowired
    private WebApplicationContext webApplicationContext;
    @Before
    public void setup() throws Exception {
        this.mockMvc = MockMvcBuilders.webAppContextSetup(this.webApplicationContext).build();
        MockitoAnnotations.initMocks(this);
    }
    @Test
    void getAccount() throws Exception {
        this.mockMvc.perform(get("/teams")
                .accept(MediaType.parseMediaType("application/json;charset=UTF-8")))
                .andExpect(status().isOk())
                .andExpect(content().contentType("application/json;charset=UTF-8"))
                .andExpect(jsonPath("$version").value(null))
                .andExpect(jsonPath("$name").value("Apacze"))
                .andExpect(jsonPath("$createOn").value(null))
                .andExpect(jsonPath("modifiedOn").value(null))
                .andExpect(jsonPath("$description").value("grupa programistow"))
                .andExpect(jsonPath("$city").value("Włocławek"))
                .andExpect(jsonPath("$headcount").value(null));
    }
}
This is my error:
java.lang.NullPointerException
On the other hand, I created a test for the DB and I have a problem because after adding mock elements to the DB they return null:
@RunWith(SpringRunner.class)
@DataJpaTest
public class Team_database_integration_test {
    @MockBean
    private TeamRepository teamRepository;
    @Autowired
    private TestEntityManager testEntityManager;
    @Test
    public void testDb() {
        Team team = new Team(1L, "teamName", "teamDescription", "krakow", 7);
        testEntityManager.persist(team);
        testEntityManager.flush();
        System.out.println(team);
    }
}
Add this dependency declaration to the test case:
@Autowired
private WebApplicationContext webApplicationContext;
and correct the initialization:
@Before
public void setup() throws Exception {
    this.mockMvc = MockMvcBuilders.webAppContextSetup(this.webApplicationContext).build();
    MockitoAnnotations.initMocks(this);
}
Try changing to this signature; I think the content type is JSON by default. Try without assertions first, then add assertions for validation:
MvcResult CDTO = this.mockMvc.perform(get("/plan/1"))
        .andExpect(status().isOk())
        .andReturn();
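Once the plain request works, assertions can be layered back in one at a time. A sketch, assuming the endpoint returns a JSON array of teams (adjust the paths to the actual response structure):
this.mockMvc.perform(get("/teams"))
        .andExpect(status().isOk())
        // assuming a JSON array response; "$[0].name" addresses the first element
        .andExpect(jsonPath("$[0].name").value("Apacze"));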
I am using Spring Boot + Spring Batch (annotation based) and have come across a scenario where I have to run 2 jobs.
I have Employee and Salary records which need to be updated using Spring Batch. I have configured BatchConfiguration classes by following this spring-batch getting started tutorial for the Employee and Salary objects, respectively named BatchConfigurationEmployee and BatchConfigurationSalary.
I have defined the ItemReader, ItemProcessor, ItemWriter and Job by following the tutorial mentioned above.
When I start my Spring Boot application, only one of the jobs runs. I want to run both of the BatchConfiguration classes. How can I achieve this?
********* BatchConfigurationEmployee.java *************
@Configuration
@EnableBatchProcessing
public class BatchConfigurationEmployee {
    public ItemReader<Employee> reader() {
        return new EmployeeItemReader();
    }
    @Bean
    public ItemProcessor<Employee, Employee> processor() {
        return new EmployeeItemProcessor();
    }
    @Bean
    public Job Employee(JobBuilderFactory jobs, Step s1) {
        return jobs.get("Employee")
                .incrementer(new RunIdIncrementer())
                .flow(s1)
                .end()
                .build();
    }
    @Bean
    public Step step1(StepBuilderFactory stepBuilderFactory, ItemReader<Employee> reader,
                      ItemProcessor<Employee, Employee> processor) {
        return stepBuilderFactory.get("step1")
                .<Employee, Employee>chunk(1)
                .reader(reader)
                .processor(processor)
                .build();
    }
}
The Salary class is here:
@Configuration
@EnableBatchProcessing
public class BatchConfigurationSalary {
    public ItemReader<Salary> reader() {
        return new SalaryItemReader();
    }
    @Bean
    public ItemProcessor<Salary, Salary> processor() {
        return new SalaryItemProcessor();
    }
    @Bean
    public Job salary(JobBuilderFactory jobs, Step s1) {
        return jobs.get("Salary")
                .incrementer(new RunIdIncrementer())
                .flow(s1)
                .end()
                .build();
    }
    @Bean
    public Step step1(StepBuilderFactory stepBuilderFactory, ItemReader<Salary> reader,
                      ItemProcessor<Salary, Salary> processor) {
        return stepBuilderFactory.get("step1")
                .<Salary, Salary>chunk(1)
                .reader(reader)
                .processor(processor)
                .build();
    }
}
The names of the beans have to be unique in the whole Spring context.
In both jobs, you are defining the reader, writer and processor with the same method name. The method name is the name that is used to identify the bean in the context.
In both job definitions, you have reader(), writer() and processor(). They will overwrite each other. Give them unique names like readerEmployee(), readerSalary() and so on.
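For illustration, a sketch of the renaming (the method names are only examples):
@Bean
public ItemReader<Employee> readerEmployee() {
    return new EmployeeItemReader();
}

@Bean
public ItemReader<Salary> readerSalary() {
    return new SalaryItemReader();
}
The same applies to the two step1(...) bean methods, which also collide.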
That should solve your problem.
Your jobs are not annotated with @Bean, so the Spring context doesn't know about them.
Have a look at the class JobLauncherCommandLineRunner. All beans in the Spring context implementing the Job interface will be injected, and all jobs that are found will be executed (this happens inside the method executeLocalJobs in JobLauncherCommandLineRunner).
If, for some reason, you don't want to have them as beans in the context, then you have to register your jobs with the JobRegistry (the method executeRegisteredJobs of JobLauncherCommandLineRunner will take care of launching the registered jobs).
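A minimal sketch of such a registration, assuming a JobRegistry instance is at hand and myJob was constructed programmatically rather than exposed as a bean:
// ReferenceJobFactory simply wraps an existing Job instance for the registry
jobRegistry.register(new ReferenceJobFactory(myJob));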
By the way, you can control which jobs should be launched with the property
spring.batch.job.names= # Comma-separated list of job names to execute on startup (for instance
`job1,job2`). By default, all jobs found in the context are executed.
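For example, using the job names from the question (the names passed to jobs.get(...)):
spring.batch.job.names=Employee,Salary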
I feel that this is also a pretty good way to run multiple jobs.
I am making use of a JobLauncher to configure and execute the jobs, and independent CommandLineRunner implementations to run them. These are ordered to make sure they are executed sequentially, in the required order.
Apologies for the big post, but I wanted to give a clear picture of what can be achieved using JobLauncher configurations with multiple command line runners.
This is the current BeanConfiguration that I have:
@Configuration
public class BeanConfiguration {
    @Autowired
    DataSource dataSource;
    @Autowired
    PlatformTransactionManager transactionManager;
    @Bean(name="jobOperator")
    public JobOperator jobOperator(JobExplorer jobExplorer,
            JobRegistry jobRegistry) throws Exception {
        SimpleJobOperator jobOperator = new SimpleJobOperator();
        jobOperator.setJobExplorer(jobExplorer);
        jobOperator.setJobRepository(createJobRepository());
        jobOperator.setJobRegistry(jobRegistry);
        jobOperator.setJobLauncher(jobLauncher());
        return jobOperator;
    }
    /**
     * Configure the JobLauncher so that execution is done asynchronously,
     * using the ThreadPoolTaskExecutor.
     */
    @Bean
    public JobLauncher jobLauncher() throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(createJobRepository());
        jobLauncher.setTaskExecutor(taskExecutor());
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }
    // Read the datasource and set it in the job repository
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setTransactionManager(transactionManager);
        factory.setIsolationLevelForCreate("ISOLATION_SERIALIZABLE");
        //factory.setTablePrefix("BATCH_");
        factory.setMaxVarCharLength(10000);
        return factory.getObject();
    }
    @Bean
    public RestTemplateBuilder restTemplateBuilder() {
        return new RestTemplateBuilder().additionalInterceptors(new CustomRestTemplateLoggerInterceptor());
    }
    @Bean(name=AppConstants.JOB_DECIDER_BEAN_NAME_EMAIL_INIT)
    public JobExecutionDecider jobDecider() {
        return new EmailInitJobExecutionDecider();
    }
    @Bean
    public ThreadPoolTaskExecutor taskExecutor() {
        ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
        taskExecutor.setCorePoolSize(15);
        taskExecutor.setMaxPoolSize(20);
        taskExecutor.setQueueCapacity(30);
        return taskExecutor;
    }
}
I have set up the database to hold the job execution details in Postgres, and hence the DatasourceConfiguration looks like this (two different beans for two different profiles):
@Configuration
public class DatasourceConfiguration implements EnvironmentAware {
    private Environment env;
    @Bean
    @Qualifier(AppConstants.DB_BEAN)
    @Profile("dev")
    public DataSource getDataSource() {
        HikariDataSource ds = new HikariDataSource();
        boolean isAutoCommitEnabled = env.getProperty("spring.datasource.hikari.auto-commit") != null ? Boolean.parseBoolean(env.getProperty("spring.datasource.hikari.auto-commit")) : false;
        ds.setAutoCommit(isAutoCommitEnabled);
        // Connection test query is for legacy connections
        //ds.setConnectionInitSql(env.getProperty("spring.datasource.hikari.connection-test-query"));
        ds.setPoolName(env.getProperty("spring.datasource.hikari.pool-name"));
        ds.setDriverClassName(env.getProperty("spring.datasource.driver-class-name"));
        long timeout = env.getProperty("spring.datasource.hikari.idleTimeout") != null ? Long.parseLong(env.getProperty("spring.datasource.hikari.idleTimeout")) : 40000;
        ds.setIdleTimeout(timeout);
        long maxLifeTime = env.getProperty("spring.datasource.hikari.maxLifetime") != null ? Long.parseLong(env.getProperty("spring.datasource.hikari.maxLifetime")) : 1800000;
        ds.setMaxLifetime(maxLifeTime);
        ds.setJdbcUrl(env.getProperty("spring.datasource.url"));
        ds.setPoolName(env.getProperty("spring.datasource.hikari.pool-name"));
        ds.setUsername(env.getProperty("spring.datasource.username"));
        ds.setPassword(env.getProperty("spring.datasource.password"));
        int poolSize = env.getProperty("spring.datasource.hikari.maximum-pool-size") != null ? Integer.parseInt(env.getProperty("spring.datasource.hikari.maximum-pool-size")) : 10;
        ds.setMaximumPoolSize(poolSize);
        return ds;
    }
    @Bean
    @Qualifier(AppConstants.DB_PROD_BEAN)
    @Profile("prod")
    public DataSource getProdDatabase() {
        HikariDataSource ds = new HikariDataSource();
        boolean isAutoCommitEnabled = env.getProperty("spring.datasource.hikari.auto-commit") != null ? Boolean.parseBoolean(env.getProperty("spring.datasource.hikari.auto-commit")) : false;
        ds.setAutoCommit(isAutoCommitEnabled);
        // Connection test query is for legacy connections
        //ds.setConnectionInitSql(env.getProperty("spring.datasource.hikari.connection-test-query"));
        ds.setPoolName(env.getProperty("spring.datasource.hikari.pool-name"));
        ds.setDriverClassName(env.getProperty("spring.datasource.driver-class-name"));
        long timeout = env.getProperty("spring.datasource.hikari.idleTimeout") != null ? Long.parseLong(env.getProperty("spring.datasource.hikari.idleTimeout")) : 40000;
        ds.setIdleTimeout(timeout);
        long maxLifeTime = env.getProperty("spring.datasource.hikari.maxLifetime") != null ? Long.parseLong(env.getProperty("spring.datasource.hikari.maxLifetime")) : 1800000;
        ds.setMaxLifetime(maxLifeTime);
        ds.setJdbcUrl(env.getProperty("spring.datasource.url"));
        ds.setPoolName(env.getProperty("spring.datasource.hikari.pool-name"));
        ds.setUsername(env.getProperty("spring.datasource.username"));
        ds.setPassword(env.getProperty("spring.datasource.password"));
        int poolSize = env.getProperty("spring.datasource.hikari.maximum-pool-size") != null ? Integer.parseInt(env.getProperty("spring.datasource.hikari.maximum-pool-size")) : 10;
        ds.setMaximumPoolSize(poolSize);
        return ds;
    }
    @Override
    public void setEnvironment(Environment environment) {
        this.env = environment;
    }
}
Make sure that the initial app launcher catches the exit code that is returned once the job execution terminates (either failed or completed), so that you can gracefully shut down the JVM. Otherwise, using the JobLauncher keeps the JVM alive even after all jobs are completed.
@SpringBootApplication
@ComponentScan(basePackages="com.XXXX.Feedback_File_Processing.*")
@EnableBatchProcessing
public class FeedbackFileProcessingApp {
    public static void main(String[] args) throws Exception {
        ApplicationContext appContext = SpringApplication.run(FeedbackFileProcessingApp.class, args);
        // The batch job has finished by this point because the
        // ApplicationContext is not 'ready' until the job is finished.
        // Also, use System.exit to force the Java process to finish with
        // the exit code returned from the Spring app.
        System.exit(SpringApplication.exit(appContext));
    }
}
And so on: you can configure your own decider and your own jobs/steps, as described above, for the two different configurations and use them separately in the command line runners. Since the post is getting big, I am giving the details of just the jobs and the command line runners.
These are the two jobs:
@Configuration
public class DefferalJobConfiguration {
    @Autowired
    JobLauncher joblauncher;
    @Autowired
    private JobBuilderFactory jobFactory;
    @Autowired
    private StepBuilderFactory stepFactory;
    @Bean
    @StepScope
    public Tasklet newSampleTasklet() {
        return ((stepExecution, chunkContext) -> {
            System.out.println("execution of step after flow");
            return RepeatStatus.FINISHED;
        });
    }
    @Bean
    public Step sampleStep() {
        return stepFactory.get("sampleStep").listener(new CustomStepExecutionListener())
                .tasklet(newSampleTasklet()).build();
    }
    @Autowired
    @Qualifier(AppConstants.FLOW_BEAN_NAME_EMAIL_INITIATION)
    private Flow emailInitFlow;
    @Autowired
    @Qualifier(AppConstants.JOB_DECIDER_BEAN_NAME_EMAIL_INIT)
    private JobExecutionDecider jobDecider;
    @Autowired
    @Qualifier(AppConstants.STEP_BEAN_NAME_ITEMREADER_FETCH_DEFERRAL_CONFIG)
    private Step deferralConfigStep;
    @Bean(name=AppConstants.JOB_BEAN_NAME_DEFERRAL)
    public Job deferralJob() {
        return jobFactory.get(AppConstants.JOB_NAME_DEFERRAL)
                .start(emailInitFlow)
                .on("COMPLETED").to(sampleStep())
                .next(jobDecider).on("COMPLETED").to(deferralConfigStep)
                .on("FAILED").fail()
                .end().build();
    }
}
@Configuration
public class TestFlowJobConfiguration {
    @Autowired
    private JobBuilderFactory jobFactory;
    @Autowired
    @Qualifier("testFlow")
    private Flow testFlow;
    @Bean(name = "testFlowJob")
    public Job testFlowJob() {
        return jobFactory.get("testFlowJob").start(testFlow).end().build();
    }
}
Here are the command line runners. (I am making sure that the first job is completed before the second job is initialized, but it is totally up to the user to execute them in parallel following a different strategy.)
@Component
@Order(1)
public class DeferralCommandLineRunner implements CommandLineRunner, EnvironmentAware {
    // If no jobLauncher is provided, jobs are launched by default using a
    // SimpleJobLauncher with its default configuration; hence the jobLauncher
    // configured in BeanConfiguration is used here instead.
    private Environment env;
    @Autowired
    JobLauncher jobLauncher;
    @Autowired
    @Qualifier(AppConstants.JOB_BEAN_NAME_DEFERRAL)
    Job deferralJob;
    @Override
    public void run(String... args) throws Exception {
        JobParameters jobparams = new JobParametersBuilder()
                .addString("run.time", LocalDateTime.now()
                        .format(DateTimeFormatter.ofPattern(AppConstants.JOB_DATE_FORMATTER_PATTERN)))
                .addString("instance.name",
                        (deferralJob.getName() != null) ? deferralJob.getName() + '-' + UUID.randomUUID().toString()
                                : UUID.randomUUID().toString())
                .toJobParameters();
        jobLauncher.run(deferralJob, jobparams);
    }
    @Override
    public void setEnvironment(Environment environment) {
        this.env = environment;
    }
}
@Component
@Order(2)
public class TestJobCommandLineRunner implements CommandLineRunner {
    @Autowired
    JobLauncher jobLauncher;
    @Autowired
    @Qualifier("testFlowJob")
    Job testjob;
    @Autowired
    @Qualifier("jobOperator")
    JobOperator operator;
    @Override
    public void run(String... args) throws Exception {
        JobParameters jobParam = new JobParametersBuilder().addString("name", UUID.randomUUID().toString())
                .toJobParameters();
        System.out.println(operator.getJobNames());
        try {
            Set<Long> deferralExecutionIds = operator.getRunningExecutions(AppConstants.JOB_NAME_DEFERRAL);
            System.out.println("deferralExecutionIds: " + deferralExecutionIds);
            operator.stop(deferralExecutionIds.iterator().next());
        } catch (NoSuchJobException | NoSuchJobExecutionException | JobExecutionNotRunningException e) {
            // just add logging here
            System.out.println("exception caught: " + e.getMessage());
        }
        jobLauncher.run(testjob, jobParam);
    }
}
Hope this gives a complete idea of how it can be done. I am using spring-boot-starter-batch:jar:2.0.0.RELEASE