My Spring Batch job is set up like this:
@Bean
Job myJob(JobBuilderFactory jobBuilderFactory,
          @Qualifier("stepA") Step stepA,
          @Qualifier("stepB") Step stepB) {
    return jobBuilderFactory.get("myJob")
            .incrementer(new RunIdIncrementer())
            .start(stepA)
            .next(stepB)
            .build();
}
And here is my launcher:
@Autowired
JobLauncher(@Qualifier("myJob") Job job, JobLauncher jobLauncher) {
    this.job = job;
    this.jobLauncher = jobLauncher;
}

@Scheduled(fixedDelay = 5000)
void launcher() throws JobParametersInvalidException, JobExecutionAlreadyRunningException,
        JobRestartException, JobInstanceAlreadyCompleteException {
    jobLauncher.run(job, newExecution());
}

private JobParameters newExecution() {
    Map<String, JobParameter> parameters = new HashMap<>();
    this.dateTime = new DateTime(DateTimeZone.UTC);
    this.dateTimeString = this.dateTime.toString(ISODateTimeFormat.dateTime());
    JobParameter parameter = new JobParameter(this.dateTimeString);
    parameters.put("currentTime", parameter);
    return new JobParameters(parameters);
}
As you can see, the job is scheduled to launch every 5 seconds.
But after the first launch it never ends; it keeps going straight into the next execution, as if it were stuck in a loop.
I would like it to finish and then restart 5 seconds later.
I missed that readers need to return null when they finish. Problem solved.
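For anyone hitting the same behaviour, here is a minimal illustration (not the original poster's reader) of an ItemReader that ends the step by returning null:

public class FiniteItemReader implements ItemReader<String> {

    private final Iterator<String> items = Arrays.asList("a", "b", "c").iterator();

    @Override
    public String read() {
        // Returning null tells Spring Batch the reader is exhausted,
        // so the step (and therefore the job) can finish instead of looping forever.
        return items.hasNext() ? items.next() : null;
    }
}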
You can also call System.exit(0); at the end of the main class, which terminates the JVM and with it the batch.
I'm new to Spring Batch and still learning. I have a batch configuration with an IteratorItemReader, a custom processor, and a custom writer, as below:
@Autowired
JobBuilderFactory jobBuilderFactory;

@Autowired
StepBuilderFactory stepBuilderFactory;

@Value("${inputFile.location}")
private String inputFile;

@Bean
public Job testJob() throws IOException {
    return jobBuilderFactory.get("testJob")
            .incrementer(new RunIdIncrementer())
            .start(testStep())
            .listener(new JobListener())
            .build();
}

@Bean
public Step testStep() throws IOException {
    return stepBuilderFactory.get("testStep")
            .<File, File>chunk(1)
            .reader(testReader())
            .processor(testProcessor())
            .writer(testWriter())
            .taskExecutor(threadPoolTaskExecutor())
            .build();
}

@Bean
public ItemReader<File> testReader() throws IOException {
    List<File> files = Files.walk(Paths.get(inputFile), 1)
            .filter(Files::isRegularFile)
            .map(Path::toFile)
            .collect(Collectors.toList());
    return new IteratorItemReader<>(files);
}

@Bean
public CustomProcessor testProcessor() {
    return new CustomProcessor();
}

@Bean
public CustomWriter testWriter() {
    return new CustomWriter();
}

@Bean
public ThreadPoolTaskExecutor threadPoolTaskExecutor() {
    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    executor.setCorePoolSize(4);
    executor.setMaxPoolSize(6);
    executor.setQueueCapacity(4);
    executor.initialize();
    return executor;
}
Here testReader() walks the given input path, collects all regular files into a List, and returns an IteratorItemReader over them; the business logic then happens in the processor.
With multithreading, if there are multiple files (more than one) in the input location everything works fine and I don't get any error, but:
Problem statement: say there is only one file in the input location (e.g. C:/User/documents/abc.txt). One thread processes the file completely and everything is OK, but at the end I get this exception:
ERROR - Encountered an error executing step testStep in job testJob
java.util.NoSuchElementException: null
at java.util.ArrayList$Itr.next(ArrayList.java:864)
at org.springframework.batch.item.support.IteratorItemReader.read(IteratorItemReader.java:70)
at org.springframework.batch.core.step.item.SimpleChunkProvider.doRead(SimpleChunkProvider.java:99)
at org.springframework.batch.core.step.item.SimpleChunkProvider.read(SimpleChunkProvider.java:180)
at org.springframework.batch.core.step.item.SimpleChunkProvider$1.doInIteration(SimpleChunkProvider.java:126)
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:375)
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215)
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:145)
at org.springframework.batch.core.step.item.SimpleChunkProvider.provide(SimpleChunkProvider.java:118)
at org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:71)
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:407)
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:331)
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:140)
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:273)
at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:82)
at org.springframework.batch.repeat.support.TaskExecutorRepeatTemplate$ExecutingRunnable.run(TaskExecutorRepeatTemplate.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
This exception happens only with multithreading. When I looked at line 70 of the IteratorItemReader class I found the code below:
if (iterator.hasNext())
return iterator.next();
else
return null; // end of data
What would be the best way to overcome this issue? Please share your inputs.
Thanks in advance; any suggestions would be helpful.
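This happens because IteratorItemReader is not thread-safe: with a task executor, several chunk-processing threads share the same iterator, so one thread can pass the hasNext() check while another consumes the last element, and the following next() call throws NoSuchElementException. One possible fix (a sketch, not the only option) is to serialize access to read(), for example with a small synchronized delegate around the reader returned by testReader():

// Hypothetical wrapper class; wrap the IteratorItemReader<File> in it before
// handing it to the step.
public class SynchronizedItemReader<T> implements ItemReader<T> {

    private final ItemReader<T> delegate;

    public SynchronizedItemReader(ItemReader<T> delegate) {
        this.delegate = delegate;
    }

    @Override
    public synchronized T read() throws Exception {
        // Only one thread at a time can check the iterator and take the next item,
        // so hasNext()/next() can no longer interleave across threads.
        return delegate.read();
    }
}

Spring Batch also ships SynchronizedItemStreamReader, but it wraps an ItemStreamReader, which IteratorItemReader is not; hence the plain delegate above.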
This is the function in the main file that I need to write a test for.
@Override
public void processTask(JobExecutionContext arg0) throws TaskException {
    if (BatchInputChannel.DB.toString().equals(runtimeContext.getProperties().getProperty(BATCH_CHANNEL_TYPE))) {
        return;
    } else if (BatchInputChannel.FILE.toString().equals(runtimeContext.getProperties().getProperty(BATCH_CHANNEL_TYPE))) {
        jobLauncher = (JobLauncher) beanFactory.getBean("jobLauncher");
        Job job = (Job) beanFactory.getBean("micorpFileLoadJob");
        JobParameters jobParameters = new JobParametersBuilder()
                .addLong("time", System.currentTimeMillis())
                .toJobParameters();
        try {
            JobExecution jobExecution = jobLauncher.run(job, jobParameters);
            System.out.println("jobExecution==" + jobExecution);
        } catch (JobExecutionAlreadyRunningException | JobRestartException | JobInstanceAlreadyCompleteException
                | JobParametersInvalidException e) {
            throw new ProcessingException("File Loading Failed" + e.getMessage());
        }
    }
}
And this is the test function I tried to create:
@Test(expected = JobParametersInvalidException.class)
public void processTaskWithFileInputJobFailed5() throws Exception {
    when(mockruntimeContext.getProperties()).thenReturn(mockProperties);
    when(mockProperties.getProperty(BATCH_CHANNEL_TYPE)).thenReturn("FILE");
    when(mockbeanFactory.getBean("jobLauncher")).thenReturn(mockJobLauncher);
    when(mockbeanFactory.getBean("micorpFileLoadJob")).thenReturn(mockjob);
    mockjobParameters = new JobParametersBuilder().addLong("time", System.currentTimeMillis()).toJobParameters();
    when(mockJobLauncher.run(mockjob, mockjobParameters)).thenReturn(mockJobExecution);
    when(mockJobExecution.getStatus()).thenReturn(BatchStatus.FAILED);
    when(mockJobExecution.getJobConfigurationName()).thenReturn(null);
    Mockito.doThrow(new JobParametersInvalidException("Invalid")).when(mockJobLauncher).run(mockjob, mockjobParameters);
    inputFileLoaderTaskProcessor.processTask(mockjobExecutionContext);
}
When I execute this as a JUnit test it expects a ProcessingException to be thrown, but I have mentioned JobParametersInvalidException in expected.
As you can see I have only added one exception in this test; what needs to be done to cover all the exceptions (inside the catch) of the main function?
Mockito.doThrow(new JobParametersInvalidException("Invalid")).when(mockJobLauncher).run(mockjob, mockjobParameters);
This doThrow is never invoked because the parameters will not match.
In the code, the parameters are set by:
JobParameters jobParameters = new JobParametersBuilder()
.addLong("time", System.currentTimeMillis())
.toJobParameters();
In the unit test, the matching mock is set by:
mockjobParameters = new JobParametersBuilder().addLong("time", System.currentTimeMillis()).toJobParameters();
The trouble is, the millisecond clock would almost certainly have ticked over between when the test was being set up, and when the code under test executes. That means these parameters will not be equal, and so the exception is not being thrown.
Change the mock setup to expect any JobParameters, e.g. something like:
Mockito.doThrow(new JobParametersInvalidException("Invalid")).when(mockJobLauncher).run(eq(mockjob), any(JobParameters.class));
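Putting it together, the corrected test might look like the sketch below (mock names follow the question; the expected exception becomes ProcessingException, since that is what processTask rethrows). To cover the other exceptions in the catch block, repeat the same doThrow pattern with each exception type in its own test method:

@Test(expected = ProcessingException.class)
public void processTaskWithFileInputJobFailed5() throws Exception {
    when(mockruntimeContext.getProperties()).thenReturn(mockProperties);
    when(mockProperties.getProperty(BATCH_CHANNEL_TYPE)).thenReturn("FILE");
    when(mockbeanFactory.getBean("jobLauncher")).thenReturn(mockJobLauncher);
    when(mockbeanFactory.getBean("micorpFileLoadJob")).thenReturn(mockjob);
    // Match any JobParameters so the millisecond timestamp cannot cause a mismatch.
    Mockito.doThrow(new JobParametersInvalidException("Invalid"))
            .when(mockJobLauncher).run(eq(mockjob), any(JobParameters.class));
    inputFileLoaderTaskProcessor.processTask(mockjobExecutionContext);
}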
I am trying to execute a series of jobs where one job executes two others, and one of those two executes another:
Job 1 --> Job 3
      --> Job 2 --> Job 4
The jobs send data from a database.
This is what I have done:
@PersistJobDataAfterExecution
@DisallowConcurrentExecution
public class MembersJob implements Job {

    List<Member> unsentMem = new ArrayList<Member>();
    JSONArray customerJson = new JSONArray();
    Depot depot;

    public MembersJob() {
        depot = new UserHandler().getDepot();
    }

    public void execute(JobExecutionContext jec) throws JobExecutionException {
        if (Util.getStatus()) {
            runSucceedingJobs(jec);
        } else {
            System.out.println("No internet connection");
        }
    }

    public void runSucceedingJobs(JobExecutionContext context) {
        JobDataMap jobDataMap = context.getJobDetail().getJobDataMap();
        Object milkCollectionJobObj = jobDataMap.get("milkCollectionJob");
        MilkCollectionsJob milkCollectionsJob = (MilkCollectionsJob) milkCollectionJobObj;
        Object productsJobObj = jobDataMap.get("milkCollectionJob");
        ProductsJob productsJob = (ProductsJob) productsJobObj;
        try {
            milkCollectionsJob.execute(context);
            productsJob.execute(context);
        } catch (JobExecutionException ex) {
            System.out.println("Error...");
            Logger.getLogger(MembersJob.class.getName()).log(Level.SEVERE, null, ex);
        }
    }
}
Call jobs in series
// Members Job
JobKey membersJobKey = new JobKey("salesJob", "group1");
JobDetail membersJob = JobBuilder.newJob(MembersJob.class)
        .withIdentity(membersJobKey).build();
membersJob.getJobDataMap().put("milkCollectionJob", new MilkCollectionsJob());
membersJob.getJobDataMap().put("productsJob", new ProductsJob());

CronTrigger membersTrigger = newTrigger()
        .withIdentity("salesTrigger", "group1")
        .withSchedule(cronSchedule("0/10 * * * * ?"))
        .forJob(membersJobKey)
        .build();

Scheduler scheduler = new StdSchedulerFactory().getScheduler();
scheduler.scheduleJob(membersJob, membersTrigger);
scheduler.start();
The problem is that the members job starts but does not start the other jobs when it is done. What is the easiest and fastest way to achieve this?
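One way to get this kind of chaining without calling execute() by hand (a hedged sketch, not from the original post): Quartz ships a JobChainingJobListener that triggers a second job when the first one finishes. The job keys below are illustrative, and the chained jobs must be stored in the scheduler as durable JobDetails so the listener can trigger them:

// Store the chained jobs as durable jobs so the listener can trigger them later.
JobKey milkCollectionsKey = new JobKey("milkCollectionsJob", "group1");
JobKey productsKey = new JobKey("productsJob", "group1");
scheduler.addJob(JobBuilder.newJob(MilkCollectionsJob.class)
        .withIdentity(milkCollectionsKey).storeDurably().build(), true);
scheduler.addJob(JobBuilder.newJob(ProductsJob.class)
        .withIdentity(productsKey).storeDurably().build(), true);

// Chain: membersJob -> milkCollectionsJob -> productsJob.
JobChainingJobListener chain = new JobChainingJobListener("memberChain");
chain.addJobChainLink(membersJobKey, milkCollectionsKey);
chain.addJobChainLink(milkCollectionsKey, productsKey);
scheduler.getListenerManager().addJobListener(chain, EverythingMatcher.allJobs());

The listener chains jobs one-to-one, so if you need the fan-out from the diagram (one job kicking off two others at once), you could register a second chaining listener or write a small custom JobListener that triggers both jobs in jobWasExecuted().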
I want to use Spring Batch with OSGi to run a job daily.
Here is what I did:
@Component
@EnableBatchProcessing
public class BatchConfiguration {

    private JobBuilderFactory jobs;

    public JobBuilderFactory getJobs() {
        return jobs;
    }

    public void setJobs(JobBuilderFactory jobs) {
        this.jobs = jobs;
    }

    private StepBuilderFactory steps;

    private EmployeeRepository employeeRepository; // spring data repository

    public EmployeeRepository getEmployeeRepository() {
        return employeeRepository;
    }

    @Reference
    public void setEmployeeRepository(EmployeeRepository employeeRepository) {
        this.employeeRepository = employeeRepository;
    }

    public Step syncEmployeesStep() throws Exception {
        RepositoryItemWriter writer = new RepositoryItemWriter();
        writer.setRepository(employeeRepository);
        writer.setMethodName("save");
        return steps.get("syncEmployeesStep")
                .<Employee, Employee>chunk(10)
                .reader(reader())
                .writer(writer)
                .build();
    }

    public Job importEmpJob() throws Exception {
        return jobs.get("importEmpJob")
                .incrementer(new RunIdIncrementer())
                .start(syncEmployeesStep())
                .next(syncEmployeesStep())
                .build();
    }

    public ItemReader<Employee> reader() throws Exception {
        String jpqlQuery = "select a from Employee a";
        ServerEMF entityManager = new ServerEMF();
        JpaPagingItemReader<Employee> reader = new JpaPagingItemReader<Employee>();
        reader.setQueryString(jpqlQuery);
        reader.setEntityManagerFactory(entityManager.getEntityManagerFactory());
        reader.setPageSize(3);
        reader.afterPropertiesSet();
        reader.setSaveState(true);
        return reader;
    }
}
Here I want to run this job to sync between two databases. My problem is how to run this job inside OSGi.
@EnableScheduling
@Component
public class JobRunner {

    private JobLauncher jobLauncher;
    private Job job;
    private BatchConfiguration batchConfig;
    // private JobBuilderFactory jobs;
    // private JobRepository jobrepo;

    final static Logger logger = LoggerFactory.getLogger(BatchConfiguration.class);

    BundleContext ctx;

    @SuppressWarnings("rawtypes")
    ServiceTracker servicetracker;

    @Activate
    public void start(BundleContext context) {
        batchConfig = new BatchConfiguration();
        // jobs = new JobBuilderFactory(jobRepository)
        try {
            job = batchConfig.importEmpJob(); // job is null because I don't know how to use it
        } catch (Exception e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
        ctx = context;
        servicetracker = new ServiceTracker(ctx, BatchConfiguration.class, null);
        servicetracker.open();
        new Thread() {
            public void run() { findAndRunJob(); }
        }.start();
    }

    @Deactivate
    public void stop() {
        servicetracker.close();
    }

    @Scheduled(fixedRate = 5000)
    protected void findAndRunJob() {
        logger.info("job created.");
        try {
            String dateParam = new Date().toString();
            JobParameters param = new JobParametersBuilder().addString("date", dateParam).toJobParameters();
            System.out.println(dateParam);
            JobExecution execution = jobLauncher.run(job, param);
            System.out.println("Exit Status : " + execution.getStatus());
        } catch (Exception e) {
            // e.printStackTrace();
        }
    }
}
Sure enough, I get a java.lang.NullPointerException because the job is null.
Could anyone help me with that?
After the updates:
@Component
@EnableBatchProcessing
public class BatchConfiguration {

    private EmployeeRepository employeeRepository; // spring data repository

    public EmployeeRepository getEmployeeRepository() {
        return employeeRepository;
    }

    @Reference
    public void setEmployeeRepository(EmployeeRepository employeeRepository) {
        this.employeeRepository = employeeRepository;
    }

    public Step syncEmployeesStep() throws Exception {
        RepositoryItemWriter writer = new RepositoryItemWriter();
        writer.setRepository(employeeRepository);
        writer.setMethodName("save");
        return steps.get("syncEmployeesStep")
                .<Employee, Employee>chunk(10)
                .reader(reader())
                .writer(writer)
                .build();
    }

    public Job importEmpJob(JobRepository jobRepository, PlatformTransactionManager transactionManager) throws Exception {
        JobBuilderFactory jobs = new JobBuilderFactory(jobRepository);
        StepBuilderFactory stepBuilderFactory = new StepBuilderFactory(jobRepository, transactionManager);
        return jobs.get("importEmpJob")
                .incrementer(new RunIdIncrementer())
                .start(syncEmployeesStep())
                .next(syncEmployeesStep())
                .build();
    }

    public ItemReader<Employee> reader() throws Exception {
        String jpqlQuery = "select a from Employee a";
        ServerEMF entityManager = new ServerEMF();
        JpaPagingItemReader<Employee> reader = new JpaPagingItemReader<Employee>();
        reader.setQueryString(jpqlQuery);
        reader.setEntityManagerFactory(entityManager.getEntityManagerFactory());
        reader.setPageSize(3);
        reader.afterPropertiesSet();
        reader.setSaveState(true);
        return reader;
    }
}
Job runner class:
private JobLauncher jobLauncher;
private PlatformTransactionManager transactionManager;
private JobRepository jobRepository;
Job importEmpJob;
private BatchConfiguration batchConfig;

@SuppressWarnings("deprecation")
@Activate
public void start(BundleContext context) {
    try {
        batchConfig = new BatchConfiguration();
        this.transactionManager = new ResourcelessTransactionManager();
        MapJobRepositoryFactoryBean repositorybean = new MapJobRepositoryFactoryBean();
        repositorybean.setTransactionManager(transactionManager);
        this.jobRepository = repositorybean.getJobRepository(); // error after executing this statement
        // setup job launcher
        SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
        simpleJobLauncher.setTaskExecutor(new SyncTaskExecutor());
        simpleJobLauncher.setJobRepository(jobRepository);
        this.jobLauncher = simpleJobLauncher;
        // System.out.println(job);
    } catch (Exception e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    ctx = context;
    configAdminTracker = new ServiceTracker(ctx, BatchConfiguration.class.getName(), null);
    configAdminTracker.open();
    new Thread() {
        public void run() { findAndRunJob(); }
    }.start();
}

@Deactivate
public void stop() {
    configAdminTracker.close();
}

protected void findAndRunJob() {
    logger.info("job created.");
    try {
        String dateParam = new Date().toString();
        // creating the job
        Job job = batchConfig.importEmpJob(jobRepository, transactionManager);
        // running the job
        JobExecution execution = this.jobLauncher.run(job, new JobParameters());
        System.out.println("Exit Status : " + execution.getStatus());
    } catch (Exception e) {
        // e.printStackTrace();
    }
}
What I'm getting after running is "java.lang.IllegalArgumentException: interface org.springframework.batch.core.repository.JobRepository is not visible from class loader". Could anyone help me with that error?
In Short
If you're just trying to kick off something simple and don't need all the Spring Batch goodness, I would look into the Apache Sling Commons Scheduler, which has a simple job processor on top of Quartz for scheduling [1].
In General
There are a couple of considerations here depending on what you are trying to do. Are you deploying the Spring Batch jars to the OSGi container with the assumption that the code written for the jobs (steps, tasks, etc.) will live in separate bundles? OSGi's purpose is to develop modular code, so my answer assumes that this is your end goal.
The folks at Pivotal have dropped OSGi support from their artifacts, so to make this work you'll need to determine what you need to export from the Batch jar files. This can be done with bnd; I would recommend checking out the new bnd Maven plugin [2]. I would configure Export-Package to export the interfaces you need to write the jobs, so that you can write the jobs in separate, modular bundles.
Then I would probably embed the Spring Batch jars in a single bundle and write a small wrapper around the JobLauncher. This confines all the actual batch code to a single class loader, so you don't have to worry about OSGi trying to pull in classes dynamically. The downside is that this will prevent you from using many of the Batch annotations outside of the Spring Batch bundle you created, but it provides the modularity you'd be looking for by implementing this type of solution with OSGi.
[1] https://sling.apache.org/documentation/bundles/apache-sling-eventing-and-job-handling.html
[2] http://njbartlett.name/2015/03/27/announcing-bnd-maven-plugin.html
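To make the "small wrapper around the JobLauncher" idea a bit more concrete, here is a rough sketch under those assumptions (Declarative Services, Spring Batch embedded in the same bundle). Class and method names are illustrative, and this is not a verified fix for the class-loader error above:

@Component(service = BatchJobRunner.class)
public class BatchJobRunner {

    private JobLauncher jobLauncher;

    @Activate
    public void activate() throws Exception {
        // Build the Batch infrastructure inside this bundle so all Batch classes
        // are resolved by a single class loader.
        ResourcelessTransactionManager txManager = new ResourcelessTransactionManager();
        MapJobRepositoryFactoryBean repoFactory = new MapJobRepositoryFactoryBean(txManager);
        repoFactory.afterPropertiesSet();
        JobRepository jobRepository = (JobRepository) repoFactory.getObject();

        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        launcher.afterPropertiesSet();
        this.jobLauncher = launcher;
    }

    // Thin entry point for other bundles; in a real design you would probably hide
    // the Batch types behind your own interface so consumers never import them.
    public JobExecution run(Job job, JobParameters params) throws Exception {
        return jobLauncher.run(job, params);
    }
}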
I followed this sample for Spring Batch with Boot.
When you run the main method the job is executed.
This way I can't figure out how one can control the job execution: for example, how to schedule a job, get access to the job execution, or set job parameters.
I tried to register my own JobLauncher
@Bean
public JobLauncher jobLauncher(JobRepository jobRepo) {
    SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
    simpleJobLauncher.setJobRepository(jobRepo);
    return simpleJobLauncher;
}
but when I try to use it in the main method:
public static void main(String[] args) {
    ConfigurableApplicationContext ctx = SpringApplication.run(Application.class, args);
    JobLauncher jobLauncher = ctx.getBean(JobLauncher.class);
    // try/catch removed for readability
    jobLauncher.run(ctx.getBean(Job.class), new JobParameters());
}
the job is again executed when the context is loaded, and I get a JobInstanceAlreadyCompleteException when I try to run it manually.
Is there a way to prevent the automatic job execution?
Automatic job execution can be prevented by setting
spring.batch.job.enabled=false
in application.properties. Alternatively, you can use spring.batch.job.names, which takes a comma-delimited list of the job names that should be run.
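For example, to run only specific jobs at startup (the job names here are illustrative):
spring.batch.job.names=importUserJob,reportJob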
Taken from here: how to stop spring batch scheduled jobs from running at first time when executing the code?
You can trigger the execution of a Job through a REST controller POST endpoint:
@RestController
@RequestMapping(value = "/job/")
public class JobLauncherController {

    private static final Log LOG = LogFactory.getLog(JobLauncherController.class);

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job job;

    @Autowired
    private JobRepository jobRepository;

    @Autowired
    private JobRegistry jobRegistry;

    @RequestMapping("/launchjob/{jobName}")
    public String handle(@PathVariable("jobName") String jobName, @RequestBody Map<String, Object> request) throws Exception {
        try {
            request.put("timeJobStarted", DateUtil.getDateFormatted(new Date(), DateUtil.DATE_UUUUMMDDHHMMSS));
            Map<String, Object> mapMessage = this.enrichJobMessage(request);
            Map<String, JobParameter> jobParameters = new HashMap<>();
            mapMessage.forEach((k, v) -> {
                MapperUtil.castParameter(jobParameters, k, v);
            });
            jobParameters.put(Field.Batch.JOB_INSTANCE_NAME, new JobParameter(jobName));
            jobLauncher.run(job, new JobParameters(jobParameters));
            assertNotNull(jobRegistry.getJob(job.getName()));
        } catch (NoSuchJobException ex) {
            jobRegistry.register(new ReferenceJobFactory(job));
        } catch (Exception e) {
            LOG.error(e.getMessage(), e);
        }
        return "Done";
    }

    public static void castParameter(Map<String, JobParameter> jobParameters, String k, Object v) {
        if (v instanceof String) {
            jobParameters.put(k, new JobParameter((String) v));
        } else if (v instanceof Date) {
            jobParameters.put(k, new JobParameter((Date) v));
        } else if (v instanceof Double) {
            jobParameters.put(k, new JobParameter((Double) v));
        } else if (v instanceof Long) {
            jobParameters.put(k, new JobParameter((Long) v));
        } else {
            DslJson dslJson = new DslJson<>();
            JsonWriter writer = dslJson.newWriter();
            try {
                dslJson.serialize(writer, v);
                jobParameters.put(k, new JobParameter(writer.toString()));
            } catch (IOException e) {
                LOG.warn(e.getMessage(), e);
            }
        }
    }
}