How to handle a Hibernate exception in a Quartz job - Java

I want to catch a Hibernate database exception in my Quartz job class. Please find my Quartz listener and Quartz job classes below. I'm testing my Quartz scheduler against a database-failure scenario: while the job is executing, I stop the MySQL database to verify whether the job class catches the Hibernate exception.
But the code below does not catch anything when a DB failure occurs.
Can someone help me figure out what I'm doing wrong here?
MyListener.java
@WebListener
public class MyListener extends QuartzInitializerListener {

    private final Logger logger = LoggerFactory.getLogger(MyListener.class);
    private Scheduler scheduler;

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        super.contextInitialized(sce);
        ServletContext ctx = sce.getServletContext();
        StdSchedulerFactory factory = (StdSchedulerFactory) ctx.getAttribute(QUARTZ_FACTORY_KEY);
        try {
            scheduler = factory.getScheduler();
            JobDetail job = newJob(TestQuartzJob.class)
                    .withIdentity("job1", "group1")
                    .build();
            CronTrigger trigger = newTrigger()
                    .withIdentity("trigger1", "group1")
                    .withSchedule(cronSchedule("0/20 * * * * ?"))
                    .build();
            scheduler.scheduleJob(job, trigger);
            scheduler.start();
        } catch (Exception e) {
            ctx.log("There was an error scheduling the job.", e);
        }
    }
}
TestQuartzJob.java
public class TestQuartzJob implements Job {

    Logger logger = Logger.getLogger(TestQuartzJob.class);

    // session and transaction are created when the job instance is constructed
    Session mysession = HibernateUtilities.getJobsSessionFactory().openSession();
    Transaction transaction = mysession.beginTransaction();

    @Override
    public void execute(JobExecutionContext jec) throws JobExecutionException {
        try {
            Student getstudName = (Student) mysession
                    .createCriteria(Student.class)
                    .add(Restrictions.eq("student_id", "stud1234"))
                    .uniqueResult();
            String name = getstudName.getName();
        } catch (Exception e) {
            logger.info("--- Error in job!");
            JobExecutionException e2 = new JobExecutionException(e);
            // this job will refire immediately
            e2.setRefireImmediately(true);
            throw e2;
        }
    }
}
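For reference, a minimal sketch of a variant in which the Session is opened inside execute(), so that a connection failure during the job run surfaces inside the try block and reaches the catch. It assumes the same HibernateUtilities and Student classes as in the question; it is only an illustration, not the original code.
public class CatchingQuartzJob implements Job {

    private static final Logger logger = Logger.getLogger(CatchingQuartzJob.class);

    @Override
    public void execute(JobExecutionContext jec) throws JobExecutionException {
        Session session = null;
        Transaction tx = null;
        try {
            // opening the session here means a DB outage is thrown inside the try block
            session = HibernateUtilities.getJobsSessionFactory().openSession();
            tx = session.beginTransaction();
            Student student = (Student) session
                    .createCriteria(Student.class)
                    .add(Restrictions.eq("student_id", "stud1234"))
                    .uniqueResult();
            logger.info("Found student: " + (student != null ? student.getName() : "none"));
            tx.commit();
        } catch (Exception e) {
            logger.error("--- Error in job!", e);
            if (tx != null) {
                tx.rollback();
            }
            JobExecutionException e2 = new JobExecutionException(e);
            e2.setRefireImmediately(true); // ask Quartz to refire the job immediately
            throw e2;
        } finally {
            if (session != null) {
                session.close();
            }
        }
    }
}
Note that setRefireImmediately(true) will retry in a tight loop while the database is down, so in practice you may prefer to let the trigger's next scheduled firing retry instead.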

Related

AWS Lambda: run a Spring Batch job from the request handler without using a scheduler

My application uses Spring Boot with Spring Batch, and I'm testing it in AWS Lambda. I want to run the job from the main method and NOT through a scheduler. Is it possible to do that?
@SpringBootApplication
@EnableAutoConfiguration
@EnableJpaRepositories("com.myrepo.repository")
@ComponentScan("com.myrepo")
@EnableScheduling
public class Main {

    private static final Logger LOG = LoggerFactory.getLogger(Main.class);

    @Autowired
    JobLauncher launcher;

    @Autowired
    Job job;

    public static void main(String[] args) {
        try {
            LOG.info("Start of application - debit card notification JOB");
            SpringApplication.run(Main.class, args);
        } catch (Exception e) {
            LOG.error("Exception caught in batch Main", e);
        }
    }
}
EDIT: I wrote the code below, but it does not work inside the AWS Lambda function.
@Scheduled(cron = "0/1 * * * * *")
public void performBatchOperation() {
    try {
        LOG.info("Scheduling Job and Launcher {}, {}", job, launcher);
        JobParameters params = new JobParametersBuilder()
                .addString(Constants.MYBATCH, String.valueOf(System.currentTimeMillis()))
                .toJobParameters();
        launcher.run(job, params);
    } catch (Exception e) {
        LOG.error("Unable to schedule job", e.getCause());
    }
}

public static void startApp() {
    LOG.info("start batch job");
    SpringApplication.run(Main.class);
    LOG.info("end batch job");
}
Here is my request handler class, which calls startApp() of the Main class:
public class MyHandler implements RequestHandler<Map<String, Object>, String> {

    private static final Logger LOG = LoggerFactory.getLogger(MyHandler.class);

    @Autowired
    Main main;

    @Override
    public String handleRequest(Map<String, Object> input, Context context) {
        LOG.info("Inside the handler request");
        Main.startApp();
        LOG.info("End of handler request");
        return "End Of Application";
    }
}
The handleRequest method should wait until the batch job completes; you can use Thread.join for that. The Lambda execution engine suspends the main thread once handleRequest returns its result; on the next event it may resume the previously suspended tasks.
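A minimal sketch of that idea, launching the job synchronously inside handleRequest instead of relying on the @Scheduled method. The class name SyncBatchHandler is illustrative; it assumes the Main configuration above, spring.batch.job.enabled=false so the context start does not auto-run the job, and the default synchronous JobLauncher, so the handler only returns once the JobExecution has finished.
import java.util.Map;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.boot.SpringApplication;
import org.springframework.context.ConfigurableApplicationContext;

public class SyncBatchHandler implements RequestHandler<Map<String, Object>, String> {

    @Override
    public String handleRequest(Map<String, Object> input, Context context) {
        // Boot the Spring context and run the job in the handler thread,
        // so handleRequest only returns after the job has finished.
        try (ConfigurableApplicationContext ctx = SpringApplication.run(Main.class)) {
            JobLauncher launcher = ctx.getBean(JobLauncher.class);
            Job job = ctx.getBean(Job.class);
            JobParameters params = new JobParametersBuilder()
                    .addLong("run.timestamp", System.currentTimeMillis()) // unique parameters per invocation
                    .toJobParameters();
            JobExecution execution = launcher.run(job, params);
            return "Exit status: " + execution.getStatus();
        } catch (Exception e) {
            context.getLogger().log("Batch job failed: " + e);
            return "Batch job failed";
        }
    }
}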

Quartz produces an unhandled NullPointerException

So, I have 2 schedulers set up on different Spring profiles. When I run the Spring scheduler everything works fine, but they want me to implement Quartz.
Here's a Job class:
@Profile("quartz")
@Component
public class SampleJob implements Job {

    @Autowired
    private GetDataServiceQuartz getDataServiceQuartz;

    public SampleJob() {
    }

    public SampleJob(GetDataServiceQuartz getDataServiceQuartz) {
        this.getDataServiceQuartz = getDataServiceQuartz;
    }

    @Override
    public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
        this.getDataServiceQuartz.storeData();
    }
}
An error being thrown:
org.quartz.SchedulerException: Job threw an unhandled exception.
at org.quartz.core.JobRunShell.run(JobRunShell.java:213) ~[quartz-2.2.1.jar:na]
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573) [quartz-2.2.1.jar:na]
Caused by: java.lang.NullPointerException: null
at com.example.blockchaininfo.services.Quartz.SampleJob.execute(SampleJob.java:27) ~[classes/:na]
at org.quartz.core.JobRunShell.run(JobRunShell.java:202) ~[quartz-2.2.1.jar:na]
... 1 common frames omitted
A NullPointerException is thrown on this particular line:
this.getDataServiceQuartz.storeData();
When I try to print this.getDataServiceQuartz, it prints null.
The class that does all the work behind the scenes:
@Slf4j
@Service
@Profile("quartz")
public class GetDataServiceQuartz {

    // constructor here

    public void storeData() {
        try {
            String hashrateFromApi = this.getNetworkHashrateFromApi("http://public.turtlenode.io:11898/getinfo");
            OffsetDateTime date = OffsetDateTime.now();
            this.saveNetworkHashrateNewEntity(hashrateFromApi, date);
            this.storePoolDataToDB(this.getPoolsListFromJson(), retrieveNetworkIdForPoolDefinition(date));
        } catch (HttpServerErrorException e) {
            log.info("Network Server error e1: " + e);
        } catch (ResourceAccessException e2) {
            log.info("Network resource access exception: " + e2);
        } catch (IOException e3) {
            log.info("" + e3);
        } catch (InterruptedException e4) {
            log.info("" + e4);
        }
    }

    // ...all the other methods that acquire the data
}
And the Quartz configuration:
@Override
public void onApplicationEvent(ContextRefreshedEvent contextRefreshedEvent) {
    this.startQuartzScheduling();
}

public void startQuartzScheduling() {
    JobDetail job = JobBuilder.newJob(SampleJob.class)
            .withIdentity("dummyJobName", "group1")
            .build();
    Trigger trigger = TriggerBuilder
            .newTrigger()
            .withIdentity("dummyTriggerName", "group1")
            .withSchedule(
                    SimpleScheduleBuilder.simpleSchedule()
                            .withIntervalInSeconds(5).repeatForever())
            .build();
    try {
        Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
        scheduler.start();
        scheduler.scheduleJob(job, trigger);
    } catch (SchedulerException e) {
        log.info("" + e);
    }
}
What am I missing? How do I properly inject a class whose methods should be scheduled?
I believe this is happening because the Quartz JobBuilder creates a new instance of your SampleJob instead of using the one you created with autowired fields. Since it uses the default constructor, you end up with the NullPointerException.
One option to fix this is to put your GetDataServiceQuartz into the scheduler context.
Described here
So, to put your data you need to call:
scheduler.getContext().put("getDataServiceQuartz", getDataServiceQuartz);
And when executing your task:
SchedulerContext schedulerContext = jobExecutionContext.getScheduler().getContext();
schedulerContext.get("getDataServiceQuartz");
Another, in my opinion more convenient, way is to put it into the JobDataMap, which will be available from your SampleJob:
job.getJobDataMap().put("getDataServiceQuartz", getDataServiceQuartz);
When executing the task:
context.getJobDetail().getJobDataMap().get("getDataServiceQuartz")
Full example can be found here.
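Putting the JobDataMap snippets together, a minimal sketch could look like the following; the buildJobDetail helper is only illustrative, not part of the Quartz API.
public class SampleJob implements Job {

    @Override
    public void execute(JobExecutionContext context) throws JobExecutionException {
        // read the service back out of the JobDataMap instead of relying on field injection
        GetDataServiceQuartz service = (GetDataServiceQuartz)
                context.getJobDetail().getJobDataMap().get("getDataServiceQuartz");
        service.storeData();
    }

    // scheduling side: pass the Spring-managed bean to the job via the JobDataMap
    public static JobDetail buildJobDetail(GetDataServiceQuartz getDataServiceQuartz) {
        JobDetail job = JobBuilder.newJob(SampleJob.class)
                .withIdentity("dummyJobName", "group1")
                .build();
        job.getJobDataMap().put("getDataServiceQuartz", getDataServiceQuartz);
        return job;
    }
}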

Quartz Scheduler Job Sequence/Workflow

I am trying to execute a series of jobs where one job executes two others, and one of those two executes another:
Job 1 --> Job 3
      --> Job 2 --> Job 4
The jobs send data from the DB.
This is what I have done:
@PersistJobDataAfterExecution
@DisallowConcurrentExecution
public class MembersJob implements Job {

    List<Member> unsentMem = new ArrayList<Member>();
    JSONArray customerJson = new JSONArray();
    Depot depot;

    public MembersJob() {
        depot = new UserHandler().getDepot();
    }

    public void execute(JobExecutionContext jec) throws JobExecutionException {
        if (Util.getStatus()) {
            runSucceedingJobs(jec);
        } else {
            System.out.println("No internet connection");
        }
    }

    public void runSucceedingJobs(JobExecutionContext context) {
        JobDataMap jobDataMap = context.getJobDetail().getJobDataMap();
        Object milkCollectionJobObj = jobDataMap.get("milkCollectionJob");
        MilkCollectionsJob milkCollectionsJob = (MilkCollectionsJob) milkCollectionJobObj;
        Object productsJobObj = jobDataMap.get("productsJob");
        ProductsJob productsJob = (ProductsJob) productsJobObj;
        try {
            milkCollectionsJob.execute(context);
            productsJob.execute(context);
        } catch (JobExecutionException ex) {
            System.out.println("Error...");
            Logger.getLogger(MembersJob.class.getName()).log(Level.SEVERE, null, ex);
        }
    }
}
Calling the jobs in series:
// Members Job
JobKey membersJobKey = new JobKey("salesJob", "group1");
JobDetail membersJob = JobBuilder.newJob(MembersJob.class)
        .withIdentity(membersJobKey)
        .build();
membersJob.getJobDataMap().put("milkCollectionJob", new MilkCollectionsJob());
membersJob.getJobDataMap().put("productsJob", new ProductsJob());

CronTrigger membersTrigger = newTrigger()
        .withIdentity("salesTrigger", "group1")
        .withSchedule(cronSchedule("0/10 * * * * ?"))
        .forJob(membersJobKey)
        .build();

Scheduler scheduler = new StdSchedulerFactory().getScheduler();
scheduler.scheduleJob(membersJob, membersTrigger);
scheduler.start();
The problem is that the members job starts but does not start the other jobs when it is done. What is the easiest and fastest way to achieve this?
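One common way to chain Quartz jobs, sketched here only as a possible approach (the listener name and the chained job key are illustrative), is to schedule the next job from a JobListener once the previous one has finished, rather than calling execute() directly:
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.quartz.listeners.JobListenerSupport;

import static org.quartz.JobBuilder.newJob;
import static org.quartz.TriggerBuilder.newTrigger;

// Schedules the milk-collections job as soon as the members job has finished.
public class ChainJobListener extends JobListenerSupport {

    @Override
    public String getName() {
        return "chain-members-to-milk-collections";
    }

    @Override
    public void jobWasExecuted(JobExecutionContext context, JobExecutionException jobException) {
        if (!"salesJob".equals(context.getJobDetail().getKey().getName())) {
            return; // only chain after the members job
        }
        try {
            JobDetail next = newJob(MilkCollectionsJob.class)
                    .withIdentity("milkCollectionsJob", "group1")
                    .build();
            context.getScheduler().scheduleJob(next, newTrigger().startNow().build());
        } catch (Exception e) {
            getLog().error("Could not chain the next job", e);
        }
    }
}
The listener would be registered before the scheduler starts, e.g. scheduler.getListenerManager().addJobListener(new ChainJobListener()), and the same pattern can be repeated to chain ProductsJob and the fourth job.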

How can I use Spring Batch with OSGi

I want to use Spring Batch with OSGi to run a job daily.
Here is what I did:
@Component
@EnableBatchProcessing
public class BatchConfiguration {

    private JobBuilderFactory jobs;

    public JobBuilderFactory getJobs() {
        return jobs;
    }

    public void setJobs(JobBuilderFactory jobs) {
        this.jobs = jobs;
    }

    private StepBuilderFactory steps;

    private EmployeeRepository employeeRepository; // spring data repository

    public EmployeeRepository getEmployeeRepository() {
        return employeeRepository;
    }

    @Reference
    public void setEmployeeRepository(EmployeeRepository employeeRepository) {
        this.employeeRepository = employeeRepository;
    }

    public Step syncEmployeesStep() throws Exception {
        RepositoryItemWriter writer = new RepositoryItemWriter();
        writer.setRepository(employeeRepository);
        writer.setMethodName("save");
        return steps.get("syncEmployeesStep")
                .<Employee, Employee> chunk(10)
                .reader(reader())
                .writer(writer)
                .build();
    }

    public Job importEmpJob() throws Exception {
        return jobs.get("importEmpJob")
                .incrementer(new RunIdIncrementer())
                .start(syncEmployeesStep())
                .next(syncEmployeesStep())
                .build();
    }

    public ItemReader<Employee> reader() throws Exception {
        String jpqlQuery = "select a from Employee a";
        ServerEMF entityManager = new ServerEMF();
        JpaPagingItemReader<Employee> reader = new JpaPagingItemReader<Employee>();
        reader.setQueryString(jpqlQuery);
        reader.setEntityManagerFactory(entityManager.getEntityManagerFactory());
        reader.setPageSize(3);
        reader.afterPropertiesSet();
        reader.setSaveState(true);
        return reader;
    }
}
Here I want to run this job to sync between two databases. My problem is how to run this job inside OSGi.
@EnableScheduling
@Component
public class JobRunner {

    private JobLauncher jobLauncher;
    private Job job;
    private BatchConfiguration batchConfig;
    //private JobBuilderFactory jobs;
    //private JobRepository jobrepo;

    final static Logger logger = LoggerFactory.getLogger(BatchConfiguration.class);

    BundleContext ctx;

    @SuppressWarnings("rawtypes")
    ServiceTracker servicetracker;

    @Activate
    public void start(BundleContext context) {
        batchConfig = new BatchConfiguration();
        //jobs = new JobBuilderFactory(jobRepository)
        try {
            job = batchConfig.importEmpJob(); // job is null because I don't know how to use it
        } catch (Exception e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
        ctx = context;
        servicetracker = new ServiceTracker(ctx, BatchConfiguration.class, null);
        servicetracker.open();
        new Thread() {
            public void run() { findAndRunJob(); }
        }.start();
    }

    @Deactivate
    public void stop() {
        servicetracker.close();
    }

    @Scheduled(fixedRate = 5000)
    protected void findAndRunJob() {
        logger.info("job created.");
        try {
            String dateParam = new Date().toString();
            JobParameters param = new JobParametersBuilder().addString("date", dateParam).toJobParameters();
            System.out.println(dateParam);
            JobExecution execution = jobLauncher.run(job, param);
            System.out.println("Exit Status : " + execution.getStatus());
        } catch (Exception e) {
            //e.printStackTrace();
        }
    }
}
For sure, I get a java.lang.NullPointerException because the job is null.
Could anyone help me with that?
After the updates:
@Component
@EnableBatchProcessing
public class BatchConfiguration {

    private StepBuilderFactory steps;

    private EmployeeRepository employeeRepository; // spring data repository

    public EmployeeRepository getEmployeeRepository() {
        return employeeRepository;
    }

    @Reference
    public void setEmployeeRepository(EmployeeRepository employeeRepository) {
        this.employeeRepository = employeeRepository;
    }

    public Step syncEmployeesStep() throws Exception {
        RepositoryItemWriter writer = new RepositoryItemWriter();
        writer.setRepository(employeeRepository);
        writer.setMethodName("save");
        return steps.get("syncEmployeesStep")
                .<Employee, Employee> chunk(10)
                .reader(reader())
                .writer(writer)
                .build();
    }

    public Job importEmpJob(JobRepository jobRepository, PlatformTransactionManager transactionManager) throws Exception {
        JobBuilderFactory jobs = new JobBuilderFactory(jobRepository);
        this.steps = new StepBuilderFactory(jobRepository, transactionManager);
        return jobs.get("importEmpJob")
                .incrementer(new RunIdIncrementer())
                .start(syncEmployeesStep())
                .next(syncEmployeesStep())
                .build();
    }

    public ItemReader<Employee> reader() throws Exception {
        String jpqlQuery = "select a from Employee a";
        ServerEMF entityManager = new ServerEMF();
        JpaPagingItemReader<Employee> reader = new JpaPagingItemReader<Employee>();
        reader.setQueryString(jpqlQuery);
        reader.setEntityManagerFactory(entityManager.getEntityManagerFactory());
        reader.setPageSize(3);
        reader.afterPropertiesSet();
        reader.setSaveState(true);
        return reader;
    }
}
The job runner class:
private JobLauncher jobLauncher;
private PlatformTransactionManager transactionManager;
private JobRepository jobRepository;
Job importEmpJob;
private BatchConfiguration batchConfig;

@SuppressWarnings("deprecation")
@Activate
public void start(BundleContext context) {
    try {
        batchConfig = new BatchConfiguration();
        this.transactionManager = new ResourcelessTransactionManager();
        MapJobRepositoryFactoryBean repositorybean = new MapJobRepositoryFactoryBean();
        repositorybean.setTransactionManager(transactionManager);
        this.jobRepository = repositorybean.getJobRepository(); // error after executing this statement
        // set up the job launcher
        SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
        simpleJobLauncher.setTaskExecutor(new SyncTaskExecutor());
        simpleJobLauncher.setJobRepository(jobRepository);
        this.jobLauncher = simpleJobLauncher;
        //System.out.println(job);
    } catch (Exception e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    ctx = context;
    configAdminTracker = new ServiceTracker(ctx, BatchConfiguration.class.getName(), null);
    configAdminTracker.open();
    new Thread() {
        public void run() { findAndRunJob(); }
    }.start();
}

@Deactivate
public void stop() {
    configAdminTracker.close();
}

protected void findAndRunJob() {
    logger.info("job created.");
    try {
        String dateParam = new Date().toString();
        // creating the job
        Job job = batchConfig.importEmpJob(jobRepository, transactionManager);
        // running the job
        JobExecution execution = this.jobLauncher.run(job, new JobParameters());
        System.out.println("Exit Status : " + execution.getStatus());
    } catch (Exception e) {
        //e.printStackTrace();
    }
}
What I get after running is "java.lang.IllegalArgumentException: interface org.springframework.batch.core.repository.JobRepository is not visible from class loader". Could anyone help me with that error?
In Short
If you're just trying to kick off something simple and don't need all the Spring Batch goodness, I would look into the Apache Sling Commons Scheduler, which has a simple job processor on top of Quartz for scheduling [1].
In General
There are a couple of considerations here depending on what you are trying to do. Are you deploying the Spring Batch jars to the OSGi container with the assumption that the code written for the jobs (steps, tasks, etc.) will live in separate bundles? OSGi's purpose is to develop modular code, so my answer assumes that this is your end goal.
The folks at Pivotal have dropped OSGi support on their artifacts, so to make it work you'll need to determine what you need to export from the Batch jar files. This can be done with BND; I would recommend checking out the new BND Maven plugin [2]. I would configure the Export-Package to export the interfaces you need to write the jobs, so that you can write the jobs in separate, modular bundles.
Then I would probably embed the Spring Batch jars in a bundle and write a small wrapper around the JobLauncher. This confines all the actual batch code to a single class loader, so you don't have to worry about OSGi trying to pull in classes dynamically. The downside is that it prevents you from using many of the batch annotations outside of the Spring Batch bundle you created, but it provides the modularity you are looking for by implementing this type of solution with OSGi.
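A rough sketch of that "small wrapper around the JobLauncher" idea, using OSGi Declarative Services annotations. The component name and method are illustrative, and the wiring assumes the in-memory MapJobRepositoryFactoryBean and the BatchConfiguration from the question live in the same bundle as the embedded Spring Batch jars.
import org.osgi.service.component.annotations.Activate;
import org.osgi.service.component.annotations.Component;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean;
import org.springframework.batch.support.transaction.ResourcelessTransactionManager;
import org.springframework.core.task.SyncTaskExecutor;

// Hypothetical wrapper component: all Spring Batch classes stay behind this
// service, so other bundles never need to load them from their own class loaders.
@Component(service = BatchJobRunner.class)
public class BatchJobRunner {

    private JobLauncher jobLauncher;
    private Job importEmpJob;

    @Activate
    void activate() throws Exception {
        ResourcelessTransactionManager txManager = new ResourcelessTransactionManager();

        // in-memory job repository, built inside this bundle's class loader
        MapJobRepositoryFactoryBean repoFactory = new MapJobRepositoryFactoryBean(txManager);
        repoFactory.afterPropertiesSet();
        JobRepository jobRepository = (JobRepository) repoFactory.getObject();

        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        launcher.setTaskExecutor(new SyncTaskExecutor());
        launcher.afterPropertiesSet();
        this.jobLauncher = launcher;

        // the job itself comes from the BatchConfiguration shown in the question
        this.importEmpJob = new BatchConfiguration().importEmpJob(jobRepository, txManager);
    }

    // other bundles call this to run the daily sync without ever seeing Spring Batch types
    public String runDailyImport() throws Exception {
        JobParameters params = new JobParametersBuilder()
                .addLong("run.id", System.currentTimeMillis())
                .toJobParameters();
        JobExecution execution = jobLauncher.run(importEmpJob, params);
        return execution.getStatus().toString();
    }
}
Returning a plain String from runDailyImport keeps Spring Batch types out of the wrapper's public API, which is the point of confining the batch code to one class loader.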
[1] https://sling.apache.org/documentation/bundles/apache-sling-eventing-and-job-handling.html
[2] http://njbartlett.name/2015/03/27/announcing-bnd-maven-plugin.html

How does Spring Boot run batch jobs

I followed this sample for Spring Batch with Boot.
When you run the main method, the job is executed.
This way I can't figure out how to control the job execution, for example how to schedule a job, get access to the job execution, or set job parameters.
I tried to register my own JobLauncher
@Bean
public JobLauncher jobLauncher(JobRepository jobRepo) {
    SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
    simpleJobLauncher.setJobRepository(jobRepo);
    return simpleJobLauncher;
}
but when I try to use it in the main method:
public static void main(String[] args) {
    ConfigurableApplicationContext ctx = SpringApplication.run(Application.class, args);
    JobLauncher jobLauncher = ctx.getBean(JobLauncher.class);
    // try/catch removed for readability
    jobLauncher.run(ctx.getBean(Job.class), new JobParameters());
}
The job is executed again when the context is loaded, and I get a JobInstanceAlreadyCompleteException when I try to run it manually.
Is there a way to prevent the automatic job execution?
The automatic job execution can be prevented by setting
spring.batch.job.enabled=false
in application.properties. Alternatively, you can use spring.batch.job.names, which takes a comma-delimited list of the job names that will be run.
Taken from here: how to stop spring batch scheduled jobs from running at first time when executing the code?
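With the auto-run disabled, a minimal sketch of launching the job manually might look like this; giving each run unique JobParameters also avoids the JobInstanceAlreadyCompleteException mentioned in the question.
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.boot.SpringApplication;
import org.springframework.context.ConfigurableApplicationContext;

public class ManualLaunch {

    public static void main(String[] args) throws Exception {
        // assumes spring.batch.job.enabled=false, so starting the context does not run the job
        ConfigurableApplicationContext ctx = SpringApplication.run(Application.class, args);
        JobLauncher jobLauncher = ctx.getBean(JobLauncher.class);
        Job job = ctx.getBean(Job.class);

        // a changing parameter gives each run a new JobInstance
        JobParameters params = new JobParametersBuilder()
                .addLong("run.id", System.currentTimeMillis())
                .toJobParameters();
        jobLauncher.run(job, params);
    }
}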
You can trigger the execution of a job with a POST to a REST controller:
@RestController
@RequestMapping(value = "/job/")
public class JobLauncherController {

    private static final Log LOG = LogFactory.getLog(JobLauncherController.class);

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job job;

    @Autowired
    private JobRepository jobRepository;

    @Autowired
    private JobRegistry jobRegistry;

    @RequestMapping("/launchjob/{jobName}")
    public String handle(@PathVariable("jobName") String jobName, @RequestBody Map<String, Object> request) throws Exception {
        try {
            request.put("timeJobStarted", DateUtil.getDateFormatted(new Date(), DateUtil.DATE_UUUUMMDDHHMMSS));
            Map<String, Object> mapMessage = this.enrichJobMessage(request);
            Map<String, JobParameter> jobParameters = new HashMap<>();
            mapMessage.forEach((k, v) -> {
                MapperUtil.castParameter(jobParameters, k, v);
            });
            jobParameters.put(Field.Batch.JOB_INSTANCE_NAME, new JobParameter(jobName));
            jobLauncher.run(job, new JobParameters(jobParameters));
            assertNotNull(jobRegistry.getJob(job.getName()));
        } catch (NoSuchJobException ex) {
            jobRegistry.register(new ReferenceJobFactory(job));
        } catch (Exception e) {
            LOG.error(e.getMessage(), e);
        }
        return "Done";
    }

    public static void castParameter(Map<String, JobParameter> jobParameters, String k, Object v) {
        if (v instanceof String) {
            jobParameters.put(k, new JobParameter((String) v));
        } else if (v instanceof Date) {
            jobParameters.put(k, new JobParameter((Date) v));
        } else if (v instanceof Double) {
            jobParameters.put(k, new JobParameter((Double) v));
        } else if (v instanceof Long) {
            jobParameters.put(k, new JobParameter((Long) v));
        } else {
            DslJson dslJson = new DslJson<>();
            JsonWriter writer = dslJson.newWriter();
            try {
                dslJson.serialize(writer, v);
                jobParameters.put(k, new JobParameter(writer.toString()));
            } catch (IOException e) {
                LOG.warn(e.getMessage(), e);
            }
        }
    }
}
