I'm creating an application with Spring Boot + Quartz + Oracle and I would like to save the scheduling in the database (persistent, in case the server crashes). With RAMJobStore it works fine, but when I try to use JobStoreTX it doesn't work; it always uses RAMJobStore. Where could the problem be? I'm surely making a lot of mistakes, but it's my first application with Spring Boot + Quartz. Can you give me an idea?
The events will be created dynamically, receiving the information in the controller.
application.yaml (in addition to Quartz, the application connects to the database to query tables, but the application and Quartz use the same database):
hibernate:
  globally_quoted_identifiers: true
  show_sql: true
logging:
  level:
    org:
      hibernate:
        SQL: ${hibernate.logging}
  pattern:
    console: '%d{yyyy-MM-dd HH:mm:ss} %-5level %logger{36} - %msg%n'
server:
  port: ${port}
spring:
  activemq:
    broker-url: ${activemq-url}
    password: ${activemq-password}
    user: ${activemq-user}
  datasource:
    driver-class-name: ${driverClassName}
    password: ${ddbb-password}
    jdbcUrl: ${ddbb-url}
    username: ${ddbb-user}
  jpa:
    properties:
      hibernate:
        dialect: org.hibernate.dialect.Oracle12cDialect
  quartz:
    job-store-type: jdbc
    jdbc:
      initialize-schema: never
    properties:
      org:
        quartz:
          scheduler:
            instanceId: AUTO
          jobStore:
            useProperties: true
            isClustered: false
            clusterCheckinInterval: 5000
            class: org.quartz.impl.jdbcjobstore.JobStoreTX
            driverDelegateClass: org.quartz.impl.jdbcjobstore.oracle.OracleDelegate
            dataSource: quartzDataSource
          dataSource:
            quartzDataSource:
              driver: oracle.jdbc.driver.OracleDriver
              URL: ${ddbb-url}
              user: ${ddbb-user}
              password: ${ddbb-password}
Class SchedulerConfiguration
@Configuration
public class SchedulerConfiguration {

    @Bean
    public SchedulerFactoryBean schedulerFactory(ApplicationContext applicationContext) {
        SchedulerFactoryBean schedulerFactoryBean = new SchedulerFactoryBean();
        schedulerFactoryBean.setJobFactory(new AutoWiringSpringBeanJobFactory());
        return schedulerFactoryBean;
    }

    @Bean
    public Scheduler scheduler(ApplicationContext applicationContext) throws SchedulerException {
        Scheduler scheduler = schedulerFactory(applicationContext).getScheduler();
        scheduler.start();
        return scheduler;
    }

    @Bean
    @QuartzDataSource
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource quartzDataSource() {
        return DataSourceBuilder.create().build();
    }
}
Class AutoWiringSpringBeanJobFactory
public class AutoWiringSpringBeanJobFactory extends SpringBeanJobFactory implements ApplicationContextAware {

    private transient AutowireCapableBeanFactory beanFactory;

    @Override
    public void setApplicationContext(final ApplicationContext context) {
        beanFactory = context.getAutowireCapableBeanFactory();
    }

    @Override
    protected Object createJobInstance(final TriggerFiredBundle bundle) throws Exception {
        final Object job = super.createJobInstance(bundle);
        beanFactory.autowireBean(job);
        return job;
    }
}
Job class
@Component
public class CampaignJob implements Job {

    @Override
    public void execute(JobExecutionContext jobExecutionContext) throws JobExecutionException {
        System.out.println("Hi, the job works");
    }
}
The class where the scheduler is created
public class ManagerServiceImpl implements ManagerService {

    @Autowired
    private ProducerQueue producer;

    @Autowired
    private ManagementDatabase managementDatabase;

    @Autowired
    private Scheduler scheduler;

    @Override
    public String processCampaign(ScheduleCampaign scheduleCampaign) {
        try {
            ZonedDateTime dateTime = ZonedDateTime.of(scheduleCampaign.getDateTime(), scheduleCampaign.getTimeZone());
            JobDetail jobDetail = buildJobDetail(scheduleCampaign);
            Trigger trigger = buildJobTrigger(jobDetail, dateTime);
            scheduler.scheduleJob(jobDetail, trigger);
        } catch (SchedulerException e) {
            System.out.println("There was an error creating the scheduler: " + e);
        }
        return "Scheduler created";
    }

    private JobDetail buildJobDetail(ScheduleCampaign scheduleCampaign) {
        JobDataMap jobDataMap = new JobDataMap();
        System.out.println("Function: buildJobDetail - campaign value: " + scheduleCampaign.getCampaign());
        jobDataMap.put("campaign", scheduleCampaign.getCampaign());
        return JobBuilder.newJob(CampaignJob.class)
                .withIdentity(UUID.randomUUID().toString(), "campaign-jobs")
                .requestRecovery(true)
                .storeDurably(true)
                .withDescription("campaign job planned")
                .usingJobData(jobDataMap)
                .build();
    }

    private Trigger buildJobTrigger(JobDetail jobDetail, ZonedDateTime startAt) {
        return TriggerBuilder.newTrigger()
                .forJob(jobDetail)
                .withIdentity(jobDetail.getKey().getName(), "campaign-triggers")
                .withDescription("campaign job Trigger")
                .startAt(Date.from(startAt.toInstant()))
                .withSchedule(SimpleScheduleBuilder.simpleSchedule().withMisfireHandlingInstructionFireNow())
                .build();
    }
}
Logs
2020-12-28 16:16:08 INFO com.zaxxer.hikari.HikariDataSource - HikariPool-1 - Starting...
2020-12-28 16:16:09 INFO com.zaxxer.hikari.HikariDataSource - HikariPool-1 - Start completed.
2020-12-28 16:16:09 INFO o.h.jpa.internal.util.LogHelper - HHH000204: Processing PersistenceUnitInfo [name: default]
2020-12-28 16:16:10 INFO org.hibernate.Version - HHH000412: Hibernate ORM core version 5.4.23.Final
2020-12-28 16:16:10 INFO org.quartz.impl.StdSchedulerFactory - Using default implementation for ThreadExecutor
2020-12-28 16:16:10 INFO o.quartz.core.SchedulerSignalerImpl - Initialized Scheduler Signaller of type: class org.quartz.core.SchedulerSignalerImpl
2020-12-28 16:16:10 INFO org.quartz.core.QuartzScheduler - Quartz Scheduler v.2.3.2 created.
2020-12-28 16:16:10 INFO org.quartz.simpl.RAMJobStore - RAMJobStore initialized.
2020-12-28 16:16:10 INFO org.quartz.core.QuartzScheduler - Scheduler meta-data: Quartz Scheduler (v2.3.2) 'schedulerFactory' with instanceId 'NON_CLUSTERED'
Scheduler class: 'org.quartz.core.QuartzScheduler' - running locally.
NOT STARTED.
Currently in standby mode.
Number of jobs executed: 0
Using thread pool 'org.quartz.simpl.SimpleThreadPool' - with 10 threads.
Using job-store 'org.quartz.simpl.RAMJobStore' - which does not support persistence. and is not clustered.
Try autowiring your app's default DataSource if you are pointing Quartz at the same DB as the application. The configuration below worked for me with PostgreSQL:
@Autowired
DataSource dataSource;

@Autowired
JobFactory jobFactory;

@Bean
public JobFactory jobFactory(ApplicationContext applicationContext) {
    AutoWiringSpringBeanJobFactory jobFactory = new AutoWiringSpringBeanJobFactory();
    jobFactory.setApplicationContext(applicationContext);
    return jobFactory;
}

@Bean
public SchedulerFactoryBean schedulerFactoryBean() throws IOException {
    SchedulerFactoryBean factory = new SchedulerFactoryBean();
    factory.setOverwriteExistingJobs(true);
    factory.setAutoStartup(true);
    factory.setDataSource(dataSource);
    factory.setJobFactory(jobFactory);
    factory.setQuartzProperties(quartzProperties());
    return factory;
}

@Bean
public Properties quartzProperties() throws IOException {
    PropertiesFactoryBean propertiesFactoryBean = new PropertiesFactoryBean();
    propertiesFactoryBean.setLocation(new ClassPathResource("/quartz.properties"));
    propertiesFactoryBean.afterPropertiesSet();
    return propertiesFactoryBean.getObject();
}
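The quartz.properties file referenced above isn't shown; a minimal sketch of what it might contain for a JDBC job store (the values below are illustrative assumptions, not the answerer's actual file):

org.quartz.scheduler.instanceName = MyScheduler
org.quartz.scheduler.instanceId = AUTO
org.quartz.threadPool.threadCount = 10
# The delegate must match your database; PostgreSQLDelegate here since this answer targets PostgreSQL.
org.quartz.jobStore.driverDelegateClass = org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
org.quartz.jobStore.tablePrefix = QRTZ_
org.quartz.jobStore.isClustered = false

Note that because the DataSource is handed to SchedulerFactoryBean directly, Spring manages the job store's connections, so no org.quartz.dataSource.* entries are needed here.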
Related
I'd like to use JpaItemWriter with multiple data sources in Spring Batch, using JPA.
Below is my config (there are log, common, and member DB configs).
common db config
@Configuration
@EnableJpaRepositories(
        basePackages = "com.member.batch.dao.common",
        entityManagerFactoryRef = "commonEntityManager",
        transactionManagerRef = "commonTransactionManager"
)
public class CommonConfiguration {

    @ConfigurationProperties(prefix = "spring.datasource-common.hikari")
    @Primary
    @Bean(name = "commonDataSource")
    public DataSource commonDataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }

    @ConfigurationProperties("spring.jpa.common-properties")
    @Bean(name = "commonHibernateProperties")
    public HashMap<String, Object> hibernateProperties() {
        return new HashMap<>();
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean commonEntityManager() {
        LocalContainerEntityManagerFactoryBean em = new LocalContainerEntityManagerFactoryBean();
        em.setDataSource(commonDataSource());
        em.setPackagesToScan("com.member.batch.entities.common");
        HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        em.setJpaVendorAdapter(vendorAdapter);
        final HashMap<String, Object> properties = hibernateProperties();
        em.setJpaPropertyMap(properties);
        return em;
    }

    @Bean(name = "commonTransactionManager")
    public PlatformTransactionManager commonTransactionManager() {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(commonEntityManager().getObject());
        return transactionManager;
    }
}
log db config
@Configuration
@EnableJpaRepositories(
        basePackages = "com.member.batch.dao.log",
        entityManagerFactoryRef = "logEntityManager",
        transactionManagerRef = "logTransactionManager"
)
public class LogConfiguration {
    // same format as common
}
member db config
@Configuration
@EnableJpaRepositories(
        basePackages = "com.member.batch.dao.member",
        entityManagerFactoryRef = "memberEntityManager",
        transactionManagerRef = "memberTransactionManager"
)
public class MemberConfiguration {
    // same format as common
}
And I have to use these configurations in my job:
@Slf4j
@RequiredArgsConstructor
@Configuration
public class WithdrawMembersJobConfiguration {

    public static final String WITHDRAW_MEMBERS_JOB = "withdrawMembersJob";

    private final MessageSource messageSource;
    private final JobBuilderFactory jobBuilderFactory;
    private final StepBuilderFactory stepBuilderFactory;
    private final CustomJobListener customJobListener;
    private final ScheduleJobService scheduleJobService;
    private final EntityManagerFactory entityManagerFactory;

    @Bean
    @Primary
    public Job withdrawMembersJob() {
        return jobBuilderFactory.get(WITHDRAW_MEMBERS_JOB)
                .start(withdrawStep())
                .listener(customJobListener)
                .build();
    }

    @JobScope
    @Bean(name = WITHDRAW_MEMBERS_JOB + "_step")
    public Step withdrawStep() {
        SimpleStepBuilder step = stepBuilderFactory.get("deleteExpiredMembersStep")
                .<WithdrawMember, WithdrawMember>chunk(1000)
                .reader(withdrawItemReader(null, null))
                .processor(withdrawItemProcessor())
                .writer(withdrawItemWriter());
        return step.build();
    }

    @StepScope
    @Bean(name = WITHDRAW_MEMBERS_JOB + "_reader")
    public JpaPagingItemReader<WithdrawMember> withdrawItemReader(
            @org.springframework.beans.factory.annotation.Value("#{jobParameters[requestDate]}") String requestDate,
            @org.springframework.beans.factory.annotation.Value("#{jobParameters[status]}") WithdrawStatus status) {
        LocalDateTime scheduledDate = LocalDateTime.now().with(LocalTime.MAX);
        log.info("scheduled date is " + scheduledDate);
        if (status == null) {
            status = WithdrawStatus.WSC001;
        }
        Map<String, Object> parameters = new HashMap<>();
        parameters.put("scheduledDate", scheduledDate);
        parameters.put("status", status);
        return new JpaPagingItemReaderBuilder<WithdrawMember>()
                .name("withdrawReader")
                .parameterValues(parameters)
                .entityManagerFactory(entityManagerFactory)
                .queryString("select m from WithdrawMember m where m.scheduleDate < :scheduledDate and m.status = :status")
                .pageSize(1000)
                .build();
    }

    @StepScope
    @Bean(name = WITHDRAW_MEMBERS_JOB + "_processor")
    public ItemProcessor<WithdrawMember, WithdrawMember> withdrawItemProcessor() {
        return new ItemProcessor<WithdrawMember, WithdrawMember>() {
            @Override
            public WithdrawMember process(WithdrawMember item) throws Exception {
                log.info("withdraw info , " + item.getReason());
                return item;
            }
        };
    }

    @StepScope
    @Bean(name = WITHDRAW_MEMBERS_JOB + "_writer")
    public JpaItemWriter withdrawItemWriter() {
        return new JpaItemWriterBuilder<WithdrawMember>()
                .entityManagerFactory(entityManagerFactory)
                .build();
    }
}
But now I am getting an error; below is my console output:
2022-09-15 16:18:01.521 DEBUG 1552 --- [eduler_Worker-4] o.s.j.d.DataSourceTransactionManager : Creating new transaction with name [org.springframework.batch.core.repository.support.SimpleJobRepository.update]: PROPAGATION_REQUIRED,ISOLATION_DEFAULT
2022-09-15 16:18:01.521 DEBUG 1552 --- [eduler_Worker-4] o.s.j.d.DataSourceTransactionManager : Acquired Connection [HikariProxyConnection#238016139 wrapping com.mysql.cj.jdbc.ConnectionImpl#3023ef72] for JDBC transaction
2022-09-15 16:18:01.521 DEBUG 1552 --- [eduler_Worker-4] o.s.j.d.DataSourceTransactionManager : Switching JDBC Connection [HikariProxyConnection#238016139 wrapping com.mysql.cj.jdbc.ConnectionImpl#3023ef72] to manual commit
2022-09-15 16:18:01.541 DEBUG 1552 --- [eduler_Worker-4] o.s.jdbc.core.JdbcTemplate : Executing prepared SQL query
2022-09-15 16:18:01.541 DEBUG 1552 --- [eduler_Worker-4] o.s.jdbc.core.JdbcTemplate : Executing prepared SQL statement [SELECT VERSION FROM BATCH_JOB_EXECUTION WHERE JOB_EXECUTION_ID=?]
2022-09-15 16:18:01.577 DEBUG 1552 --- [eduler_Worker-4] o.s.b.c.r.dao.JdbcJobExecutionDao : Truncating long message before update of JobExecution: JobExecution: id=1956, version=1, startTime=Thu Sep 15 16:18:00 KST 2022, endTime=Thu Sep 15 16:18:01 KST 2022, lastUpdated=Thu Sep 15 16:18:01 KST 2022, status=FAILED, exitStatus=exitCode=FAILED;exitDescription=javax.persistence.TransactionRequiredException: no transaction is in progress
at org.hibernate.internal.AbstractSharedSessionContract.checkTransactionNeededForUpdateOperation(AbstractSharedSessionContract.java:422)
at org.hibernate.internal.SessionImpl.checkTransactionNeededForUpdateOperation(SessionImpl.java:3397)
at org.hibernate.internal.SessionImpl.doFlush(SessionImpl.java:1354)
at org.hibernate.internal.SessionImpl.flush(SessionImpl.java:1349)
javax.persistence.TransactionRequiredException: no transaction is in progress
at org.hibernate.internal.AbstractSharedSessionContract.checkTransactionNeededForUpdateOperation(AbstractSharedSessionContract.java:422) ~[hibernate-core-5.4.32.Final.jar:5.4.32.Final]
at org.hibernate.internal.SessionImpl.checkTransactionNeededForUpdateOperation(SessionImpl.java:3397) ~[hibernate-core-5.4.32.Final.jar:5.4.32.Final]
at org.hibernate.internal.SessionImpl.doFlush(SessionImpl.java:1354) ~[hibernate-core-5.4.32.Final.jar:5.4.32.Final]
at org.hibernate.internal.SessionImpl.flush(SessionImpl.java:1349) ~[hibernate-core-5.4.32.Final.jar:5.4.32.Final]
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:na]
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:na]
at java.base/java.lang.reflect.Method.invoke(Method.java:566) ~[na:na]
at org.springframework.orm.jpa.ExtendedEntityManagerCreator$ExtendedEntityManagerInvocationHandler.invoke(ExtendedEntityManagerCreator.java:362) ~[spring-orm-5.3.9.jar:5.3.9]
at com.sun.proxy.$Proxy119.flush(Unknown Source) ~[na:na]
at org.springframework.batch.item.database.JpaItemWriter.write(JpaItemWriter.java:94) ~[spring-batch-infrastructure-4.3.3.jar:4.3.3]
at org.springframework.batch.item.database.JpaItemWriter$$FastClassBySpringCGLIB$$29c4242e.invoke(<generated>) ~[spring-batch-infrastructure-4.3.3.jar:4.3.3]
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218) ~[spring-core-5.3.9.jar:5.3.9]
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:779) ~[spring-aop-5.3.9.jar:5.3.9]
at
I don't know why I get this error.
I could get the result of the ItemReader just fine.
None of the classes you shared is annotated with @EnableBatchProcessing, so it is not clear which transaction manager is set on the StepBuilderFactory you use to create steps. The default transaction manager configured by @EnableBatchProcessing is a DataSourceTransactionManager, which knows nothing about the JPA context. Since you use JPA, you need to configure a JpaTransactionManager in your step:
@JobScope
@Bean(name = WITHDRAW_MEMBERS_JOB + "_step")
public Step withdrawStep(JpaTransactionManager transactionManager) {
    SimpleStepBuilder step = stepBuilderFactory.get("deleteExpiredMembersStep")
            .<WithdrawMember, WithdrawMember>chunk(1000)
            .transactionManager(transactionManager)
            .reader(withdrawItemReader(null, null))
            .processor(withdrawItemProcessor())
            .writer(withdrawItemWriter());
    return step.build();
}
If you use @EnableBatchProcessing, you need to provide a custom BatchConfigurer, something like:
@Bean
public BatchConfigurer batchConfigurer(DataSource dataSource, EntityManagerFactory entityManagerFactory) {
    return new DefaultBatchConfigurer(dataSource) {
        @Override
        public PlatformTransactionManager getTransactionManager() {
            return new JpaTransactionManager(entityManagerFactory);
        }
    };
}
I am using the dependency spring-boot-starter-quartz and I have the following config in application.yaml for Quartz
spring:
  quartz:
    job-store-type: jdbc
    jdbc:
      initialize-schema: always
    properties:
      org:
        quartz:
          scheduler:
            instanceId: AUTO
          threadPool:
            threadCount: 2
          jobStore:
            class: org.quartz.impl.jdbcjobstore.JobStoreTX
            driverDelegateClass: org.quartz.impl.jdbcjobstore.PostgreSQLDelegate
            useProperties: false
            tablePrefix: SCHEDULER_
            isClustered: true
            clusterCheckinInterval: 20000
I have the following beans defined in the Quartz configuration
@Bean
JobDetailFactoryBean notificationJobDetail() {
    JobDetailFactoryBean bean = new JobDetailFactoryBean();
    bean.setJobClass(NotificationJob.class);
    return bean;
}

@Bean
CronTriggerFactoryBean notificationCronTrigger() {
    CronTriggerFactoryBean cronTriggerFactoryBean = new CronTriggerFactoryBean();
    cronTriggerFactoryBean.setCronExpression(schedulerProperty.getCronNotification());
    cronTriggerFactoryBean.setJobDetail(Objects.requireNonNull(notificationJobDetail().getObject()));
    return cronTriggerFactoryBean;
}

@Bean
public SchedulerFactoryBean schedulerFactoryBean() {
    SchedulerFactoryBean schedulerFactoryBean = new SchedulerFactoryBean();
    schedulerFactoryBean.setTriggers(notificationCronTrigger().getObject());
    schedulerFactoryBean.setApplicationContextSchedulerContextKey(QUARTZ_APPLICATION_CONTEXT);
    return schedulerFactoryBean;
}
But somehow the application.yaml properties are, I guess, not read by Quartz, and therefore clustering is not working. The logs show:
Using thread pool 'org.quartz.simpl.SimpleThreadPool' - with 10 threads.
Using job-store 'org.quartz.simpl.RAMJobStore' - which does not support persistence. and is not clustered.
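For context (a sketch, not a confirmed fix): Spring Boot's Quartz auto-configuration, which is what applies the spring.quartz.* properties, backs off as soon as you declare your own SchedulerFactoryBean, so the custom factory above falls back to RAMJobStore. One way to keep the custom beans is to hand the data source and job-store properties to the factory yourself; the snippet below mirrors the YAML values above and assumes a single injectable DataSource:

@Bean
public SchedulerFactoryBean schedulerFactoryBean(DataSource dataSource) {
    SchedulerFactoryBean schedulerFactoryBean = new SchedulerFactoryBean();
    // Without an explicit DataSource, SchedulerFactoryBean defaults to an in-memory job store.
    schedulerFactoryBean.setDataSource(dataSource);
    Properties quartzProperties = new Properties();
    quartzProperties.setProperty("org.quartz.scheduler.instanceId", "AUTO");
    quartzProperties.setProperty("org.quartz.jobStore.driverDelegateClass",
            "org.quartz.impl.jdbcjobstore.PostgreSQLDelegate");
    quartzProperties.setProperty("org.quartz.jobStore.tablePrefix", "SCHEDULER_");
    quartzProperties.setProperty("org.quartz.jobStore.isClustered", "true");
    quartzProperties.setProperty("org.quartz.jobStore.clusterCheckinInterval", "20000");
    schedulerFactoryBean.setQuartzProperties(quartzProperties);
    schedulerFactoryBean.setTriggers(notificationCronTrigger().getObject());
    schedulerFactoryBean.setApplicationContextSchedulerContextKey(QUARTZ_APPLICATION_CONTEXT);
    return schedulerFactoryBean;
}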
My database has a limited number of active connections, which results in a HikariPool initialization exception as shown below. I want to skip the entire stack trace and catch the exception in my main class.
Here's a stack trace of my exception log:
2020-11-18 16:27:16.619 INFO 9124 --- [ restartedMain] com.zaxxer.hikari.HikariDataSource : HikariPool-1 - Starting...
2020-11-18 16:27:18.344 ERROR 9124 --- [ restartedMain] com.zaxxer.hikari.pool.HikariPool : HikariPool-1 - Exception during pool initialization.
java.sql.SQLSyntaxErrorException: User 6eX6BxR3TY already has more than 'max_user_connections' active connections
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:120)
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:97)
at com.mysql.cj.jdbc.exceptions.SQLExceptionsMapping.translateException(SQLExceptionsMapping.java:122)
at com.mysql.cj.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:836)
at com.mysql.cj.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:456)
at com.mysql.cj.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:246)
Here's the main class:
@SpringBootApplication(exclude = { DataSourceAutoConfiguration.class })
public class NseapiApplication {

    private static final Logger LOGGER = LoggerFactory.getLogger(NseapiApplication.class);

    private static String splunkUrl;

    @Value("${splunk.url}")
    public void setSplunkUrl(String splunkUrl) {
        NseapiApplication.splunkUrl = splunkUrl;
    }

    public static void main(String[] args) {
        SpringApplication.run(NseapiApplication.class, args);
        LOGGER.info("Forwarding logs to Splunk Cloud Instance : " + splunkUrl);
    }

    @Bean
    public static BeanFactoryPostProcessor dependsOnPostProcessor() {
        return bf -> {
            String[] jpa = bf.getBeanNamesForType(EntityManagerFactory.class);
            Stream.of(jpa).map(bf::getBeanDefinition).forEach(it -> it.setDependsOn("databaseStartupValidator"));
        };
    }

    @Bean
    public DatabaseStartupValidator databaseStartupValidator(DataSource dataSource) {
        DatabaseStartupValidator dsv = new DatabaseStartupValidator();
        dsv.setDataSource(dataSource);
        dsv.setValidationQuery(DatabaseDriver.MYSQL.getValidationQuery());
        return dsv;
    }
}
Here's the Database Configuration class:
@Configuration
public class DatasourceConfig {

    private static final Logger LOGGER = LoggerFactory.getLogger(DatasourceConfig.class);

    @Bean
    public DataSource datasource() {
        return DataSourceBuilder.create()
                .driverClassName("com.mysql.cj.jdbc.Driver")
                .url("jdbc:mysql://myDbUrl").username("myUserName").password("myPassword")
                .build();
    }
}
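For reference, a hedged sketch of one way to intercept the pool failure in main, assuming the initialization error propagates out of SpringApplication.run as a wrapped exception (note that Spring Boot's failure analyzer may still log its own report before the exception reaches this catch block):

public static void main(String[] args) {
    try {
        SpringApplication.run(NseapiApplication.class, args);
    } catch (Exception e) {
        // Log only the root cause (e.g. the max_user_connections SQLSyntaxErrorException)
        // instead of the full stack trace. NestedExceptionUtils is from org.springframework.core.
        Throwable root = NestedExceptionUtils.getRootCause(e);
        LOGGER.error("Startup aborted: " + (root != null ? root.getMessage() : e.getMessage()));
    }
}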
I am working on a Spring Batch project in which I am reading a list of students, processing it, and writing it.
For now I have kept it simple: processing just returns the student and writing just prints it.
I was expecting to see the output every time the step runs, but I only see it the first time the step runs. Below is the output:
2020-04-03 01:33:16.153 INFO 14710 --- [ main] o.s.batch.core.job.SimpleStepHandler : Executing step: [xxxx]
[Student{id=1, name='ABC'}]
as
[Student{id=2, name='DEF'}]
as
[Student{id=3, name='GHI'}]
as
2020-04-03 01:33:16.187 INFO 14710 --- [ main] o.s.batch.core.step.AbstractStep : Step: [xxxx] executed in 33ms
2020-04-03 01:33:16.190 INFO 14710 --- [ main] o.s.b.c.l.support.SimpleJobLauncher : Job: [SimpleJob: [name=readStudents]] completed with the following parameters: [{}] and the following status: [COMPLETED] in 52ms
job triggered
2020-04-03 01:33:17.011 INFO 14710 --- [ scheduling-1] o.s.b.c.l.support.SimpleJobLauncher : Job: [SimpleJob: [name=readStudents]] launched with the following parameters: [{time=1585857797003}]
2020-04-03 01:33:17.017 INFO 14710 --- [ scheduling-1] o.s.batch.core.job.SimpleStepHandler : Executing step: [xxxx]
2020-04-03 01:33:17.022 INFO 14710 --- [ scheduling-1] o.s.batch.core.step.AbstractStep : Step: [xxxx] executed in 4ms
2020-04-03 01:33:17.024 INFO 14710 --- [ scheduling-1] o.s.b.c.l.support.SimpleJobLauncher : Job: [SimpleJob: [name=readStudents]] completed with the following parameters: [{time=1585857797003}] and the following status: [COMPLETED] in 11ms
I also notice that the first time the job has no parameters and after that it does, even though I supply job parameters every time I run the job.
Config file
@EnableBatchProcessing
public class Config {

    private JobRunner jobRunner;

    public Config(JobRunner jobRunner) {
        this.jobRunner = jobRunner;
    }

    @Scheduled(cron = "* * * * * *")
    public void scheduleJob() throws JobParametersInvalidException, JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException {
        System.out.println("job triggered");
        jobRunner.runJob();
    }
}
@Configuration
public class JobConfig {

    @Bean
    public Job job(JobBuilderFactory jobBuilderFactory,
                   StepBuilderFactory stepBuilderFactory,
                   ItemReader<Student> reader,
                   ItemProcessor<Student, Student> processor,
                   ItemWriter<Student> writer) {
        Step step = stepBuilderFactory.get("xxxx")
                .<Student, Student>chunk(1)
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .build();
        return jobBuilderFactory
                .get("readStudents")
                .start(step)
                .build();
    }

    @Bean
    public ItemReader<Student> reader() {
        return new InMemoryStudentReader();
    }
}
Job runner file
public class JobRunner {

    private Job job;
    private JobLauncher simpleJobLauncher;

    @Autowired
    public JobRunner(Job job, JobLauncher jobLauncher) {
        this.simpleJobLauncher = jobLauncher;
        this.job = job;
    }

    public void runJob() throws JobParametersInvalidException, JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException {
        JobParameters jobParameters =
                new JobParametersBuilder()
                        .addLong("time", System.currentTimeMillis()).toJobParameters();
        simpleJobLauncher.run(job, jobParameters);
    }
}
In memory student reader
public class InMemoryStudentReader implements ItemReader<Student> {

    private int nextStudentIndex;
    private List<Student> studentData;

    public InMemoryStudentReader() {
        initialize();
    }

    private void initialize() {
        Student s1 = new Student(1, "ABC");
        Student s2 = new Student(2, "DEF");
        Student s3 = new Student(3, "GHI");
        studentData = Collections.unmodifiableList(Arrays.asList(s1, s2, s3));
        nextStudentIndex = 0;
    }

    @Override
    public Student read() throws Exception {
        Student nextStudent = null;
        if (nextStudentIndex < studentData.size()) {
            nextStudent = studentData.get(nextStudentIndex);
            nextStudentIndex++;
        }
        return nextStudent;
    }
}
Because you are calling initialize() in the InMemoryStudentReader constructor, and Spring only instantiates InMemoryStudentReader once and wires it into your job. After the first run, nextStudentIndex is not reset to 0, so the next time your job runs, your reader cannot read anymore.
If you want it to work, you should reset nextStudentIndex to 0 whenever your job starts.
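A minimal sketch of that fix, assuming you keep the in-memory reader: implement ItemStreamReader (from org.springframework.batch.item) so the index is reset in open(), which Spring Batch calls at the start of every step execution.

public class InMemoryStudentReader implements ItemStreamReader<Student> {

    private int nextStudentIndex;
    private List<Student> studentData;

    public InMemoryStudentReader() {
        studentData = Collections.unmodifiableList(Arrays.asList(
                new Student(1, "ABC"), new Student(2, "DEF"), new Student(3, "GHI")));
    }

    @Override
    public void open(ExecutionContext executionContext) {
        // Invoked before every step execution, so each run starts reading from the beginning.
        nextStudentIndex = 0;
    }

    @Override
    public void update(ExecutionContext executionContext) {
        // No restart state to persist in this sketch.
    }

    @Override
    public void close() {
    }

    @Override
    public Student read() {
        return nextStudentIndex < studentData.size() ? studentData.get(nextStudentIndex++) : null;
    }
}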
I want to use HikariCP as the JDBC connection pool in my Spring Boot application. I have two datasources (a MySQL database as the primary database, accessed through Hibernate, and an Oracle database for reading some other data through JdbcTemplate).
I set the MySQL datasource as the primary bean:
@Bean
@Primary
@ConfigurationProperties("spring.datasource")
public DataSourceProperties mySQLDataSourceProperties() {
    return new DataSourceProperties();
}

@Bean
@Primary
@ConfigurationProperties("spring.datasource")
public DataSource mySQLDataSource() {
    return mySQLDataSourceProperties().initializeDataSourceBuilder().build();
}

@Bean
@ConfigurationProperties("oracle.datasource")
public DataSourceProperties oracleDataSourceProperties() {
    return new DataSourceProperties();
}

@Bean(name = "oracleDatabase")
@ConfigurationProperties("oracle.datasource")
public DataSource oracleDataSource() {
    return oracleDataSourceProperties().initializeDataSourceBuilder().build();
}

@Bean
public JdbcTemplate oracleJdbcTemplate(@Qualifier("oracleDatabase") DataSource oracleDb) {
    return new JdbcTemplate(oracleDb);
}
and I put the following configuration in my application.properties:
spring.datasource.type=com.zaxxer.hikari.HikariDataSource
spring.datasource.hikari.minimum-idle=7
spring.datasource.hikari.pool-name=Test-1
spring.datasource.hikari.data-source-properties.prepStmtCacheSize=250
spring.datasource.hikari.data-source-properties.prepStmtCacheSqlLimit=2048
spring.datasource.hikari.data-source-properties.cachePrepStmts=true
spring.datasource.hikari.data-source-properties.useServerPrepStmts=true
Unfortunately, these HikariCP configurations are not being read:
HikariConfig - dataSourceJNDI..................none
HikariConfig - dataSourceProperties............{password=<masked>}
HikariConfig - driverClassName................."com.mysql.jdbc.Driver"
HikariConfig - healthCheckProperties...........{}
HikariConfig - healthCheckRegistry.............none
HikariConfig - idleTimeout.....................600000
HikariConfig - initializationFailFast..........true
HikariConfig - initializationFailTimeout.......1
HikariConfig - isolateInternalQueries..........false
HikariConfig - jdbc4ConnectionTest.............false
HikariConfig - jdbcUrl........................."jdbc:mysql://localhost:3306/testDB"
HikariConfig - leakDetectionThreshold..........0
HikariConfig - maxLifetime.....................1800000
HikariConfig - maximumPoolSize.................10
HikariConfig - metricRegistry..................none
HikariConfig - metricsTrackerFactory...........none
HikariConfig - minimumIdle.....................10
HikariConfig - password........................<masked>
HikariConfig - poolName........................"HikariPool-1"
Creating the HikariCP beans myself, deactivating the DataSource auto-configuration, and removing "spring.datasource":
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class})
@SpringBootApplication
@ComponentScan
public class SpringApplication {

    @Bean
    @Primary
    @ConfigurationProperties(prefix = "spring.datasource.hikari")
    public HikariConfig hikariConfig() {
        return new HikariConfig();
    }

    @Bean
    public DataSource dataSource() {
        return new HikariDataSource(hikariConfig());
    }
}
solves my problem:
HikariConfig - dataSourceJNDI..................none
HikariConfig - dataSourceProperties............{password=<masked>, prepStmtCacheSqlLimit=2048, cachePrepStmts=true, useServerPrepStmts=true, prepStmtCacheSize=250}
HikariConfig - driverClassName................."com.mysql.jdbc.Driver"
HikariConfig - healthCheckProperties...........{}
HikariConfig - healthCheckRegistry.............none
HikariConfig - idleTimeout.....................600000
HikariConfig - initializationFailFast..........true
HikariConfig - initializationFailTimeout.......1
HikariConfig - isolateInternalQueries..........false
HikariConfig - jdbc4ConnectionTest.............false
HikariConfig - jdbcUrl........................."jdbc:mysql://localhost:3306/testDB?autoReconnect=true"
HikariConfig - leakDetectionThreshold..........0
HikariConfig - maxLifetime.....................1800000
HikariConfig - poolName........................"Test-1"
But then Flyway shows some weird warnings which were not shown before, and I have to create the database schema manually before running the Spring application; that is, the schema creation does not work anymore.
[WARN ] JdbcTemplate - DB: Can't create database 'test'; database exists (SQL State: HY000 - Error Code: 1007)
[WARN ] JdbcTemplate - DB: Unknown table 'testSchema.tenant' (SQL State: 42S02 - Error Code: 1051)
[WARN ] JdbcTemplate - DB: Unknown table 'testSchema.user' (SQL State: 42S02 - Error Code: 1051)
My Flyway SQL scripts are plain DDL scripts:
CREATE SCHEMA IF NOT EXISTS `testSchema` DEFAULT CHARACTER SET utf8 ;
DROP TABLE IF EXISTS `testSchema`.`tenant`;
CREATE TABLE `testSchema`.`tenant` (
`id` int NOT NULL AUTO_INCREMENT,
I think that disabling the DataSource auto-configuration is not the best solution, since Flyway stops creating the schema and shows warnings. Is there any other way to solve this?
Declaring your own DataSource will already have implicitly disabled Spring Boot's auto-configuration of a data source. In other words, this won't be having any effect:
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class})
I think the problem lies in the fact that you aren't binding Hikari-specific configuration to your MySQL DataSource. You need to do something like this:
@Bean
@Primary
@ConfigurationProperties("spring.datasource.hikari")
public DataSource mySQLDataSource() {
    return mySQLDataSourceProperties().initializeDataSourceBuilder().build();
}
This will mean that your mySQLDataSourceProperties are configured with general-purpose data source configuration. They then create a HikariDataSource which is further configured with the Hikari-specific configuration.
Thank you Andy for your fast and valuable answer! You set me on the right track. After fiddling around, I found this configuration works for me:
@Bean
@Primary
@ConfigurationProperties("spring.datasource")
// @ConfigurationProperties("spring.datasource.hikari") can also be used, no difference
public DataSourceProperties mySQLDataSourceProperties() {
    return new DataSourceProperties();
}

@Bean
@Primary
@ConfigurationProperties("spring.datasource.hikari")
public DataSource mySQLDataSource() {
    return mySQLDataSourceProperties().initializeDataSourceBuilder().build();
}

@Bean
@ConfigurationProperties(prefix = "spring.datasource.hikari")
public HikariConfig hikariConfig() {
    return new HikariConfig();
}

@Bean
public DataSource dataSource() {
    return new HikariDataSource(hikariConfig());
}
and I had to add these settings in the application.properties:
# this is absolutely mandatory otherwise BeanInstantiationException in mySQLDataSource !
spring.datasource.url=${JDBC_CONNECTION_STRING}
spring.datasource.hikari.jdbc-url=${JDBC_CONNECTION_STRING}
spring.datasource.hikari.username=user
spring.datasource.hikari.password=pass
I used the following approach
first.datasource.jdbc-url=jdbc-url
first.datasource.username=username
first.datasource.password=password
.
.
.
.
=================== In Java Configuration File ==================
@Primary
@Bean(name = "firstDataSource")
@ConfigurationProperties(prefix = "first.datasource")
public DataSource dataSource() {
    return DataSourceBuilder.create().build();
}

@Primary
@Bean(name = "firstEntityManagerFactory")
public LocalContainerEntityManagerFactoryBean barEntityManagerFactory(EntityManagerFactoryBuilder builder,
        @Qualifier("firstDataSource") DataSource dataSource) {
    Map<String, String> props = new HashMap<String, String>();
    props.put("spring.jpa.database-platform", "org.hibernate.dialect.Oracle12cDialect");
    .
    .
    .
    return builder.dataSource(dataSource).packages("com.first.entity").persistenceUnit("firstDB")
            .properties(props)
            .build();
}

@Primary
@Bean(name = "firstTransactionManager")
public PlatformTransactionManager firstTransactionManager(
        @Qualifier("firstEntityManagerFactory") EntityManagerFactory firstEntityManagerFactory) {
    return new JpaTransactionManager(firstEntityManagerFactory);
}
second.datasource.jdbc-url=jdbc-url
second.datasource.username=username
second.datasource.password=password
.
.
.
.
=================== In Java Configuration File ==================
#Bean(name = "secondDataSource")
#ConfigurationProperties(prefix = "second.datasource")
public DataSource dataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "secondEntityManagerFactory")
public LocalContainerEntityManagerFactoryBean barEntityManagerFactory(EntityManagerFactoryBuilder builder,
#Qualifier("secondDataSource") DataSource dataSource) {
Map<String, String> props = new HashMap<String, String>();
props.put("spring.jpa.database-platform", "org.hibernate.dialect.Oracle12cDialect");
.
.
.
return builder.dataSource(dataSource).packages("com.second.entity").persistenceUnit("secondDB")
.properties(props)
.build();
}
#Bean(name = "secondTransactionManager")
public PlatformTransactionManager secondTransactionManager(
#Qualifier("secondEntityManagerFactory") EntityManagerFactory secondEntityManagerFactory) {
return new JpaTransactionManager(secondEntityManagerFactory);
}