I am using Spring Boot and Spring Batch in a Maven multi-module project to parse CSV files and store the data in a MySQL database.
When I run the batch module through my BatchLauncher class (shared below), I get a BeanCurrentlyInCreationException caused by getDataBase(), the method I use to configure my MySQL database.
When I remove this method, Spring Boot automatically falls back to an embedded H2 database.
BatchLauncher class :
@Slf4j
public class BatchLauncher {

    public static void main(String[] args) {
        try {
            Launcher.launchWithConfig("My Batch", BatchConfig.class, false);
        } catch (Exception ex) {
            log.error(ex.getMessage());
        }
    }
}
Launcher class :
@Slf4j
public class Launcher {

    private Launcher() {}

    public static void launchWithConfig(String batchName, Class<?> configClass, boolean oncePerDayMax) throws JobExecutionException, BatchException {
        try {
            // Check the Spring profiles used
            log.info("Start batch \"" + batchName + "\" with profiles : " + System.getProperty("spring.profiles.active"));
            // Load configuration
            @SuppressWarnings("resource")
            AnnotationConfigApplicationContext context = new AnnotationConfigApplicationContext(configClass);
            JobLauncher jobLauncher = context.getBean(JobLauncher.class);
            Job job = context.getBean(Job.class);
            // Authorize only one execution of each job per day
            JobParameters jobParameters = new JobParameters();
            JobExecution execution = jobLauncher.run(job, jobParameters);
            if (!BatchStatus.COMPLETED.equals(execution.getStatus())) {
                throw new BatchException("Unknown error while executing batch : " + batchName);
            }
        } catch (Exception ex) {
            log.error("Exception", ex);
            throw new BatchException(ex.getMessage());
        }
    }
}
BatchConfig class :
@Slf4j
@Configuration
@EnableAutoConfiguration(exclude = {DataSourceAutoConfiguration.class, DataSourceTransactionManagerAutoConfiguration.class, HibernateJpaAutoConfiguration.class})
@EnableBatchProcessing
@ComponentScan(basePackages = {
        "fr.payet.flad.batch.tasklet",
        "fr.payet.flad.batch.mapper"
})
@Import({CoreConfig.class})
public class BatchConfig {

    private StepBuilderFactory steps;
    private JobBuilderFactory jobBuilderFactory;
    private ReadInputTasklet readInputTasklet;

    public BatchConfig(StepBuilderFactory steps, JobBuilderFactory jobBuilderFactory, ReadInputTasklet readInputTasklet) {
        this.steps = steps;
        this.jobBuilderFactory = jobBuilderFactory;
        this.readInputTasklet = readInputTasklet;
    }

    @Bean
    public DataSource getDataBase() {
        return DataSourceBuilder
                .create()
                .driverClassName("com.mysql.jdbc.Driver")
                .url("jdbc:mysql://localhost:3306/myDb?useSSL=false")
                .username("myuser")
                .password("mypwd")
                .build();
    }

    @Bean
    public Step readInputStep() {
        return steps.get("readInputStep")
                .tasklet(readInputTasklet)
                .build();
    }

    @Bean
    public Job readCsvJob() {
        return jobBuilderFactory.get("readCsvJob")
                .incrementer(new RunIdIncrementer())
                .flow(readInputStep())
                .end()
                .build();
    }
}
The solution was to create a custom DataSourceConfiguration class annotated with @Configuration, in which I set up my own database like this:
@Bean
public DataSource getDataBase() {
    return DataSourceBuilder
            .create()
            .driverClassName("com.mysql.jdbc.Driver")
            .url("jdbc:mysql://localhost:3306/myDB?useSSL=false")
            .username("myUser")
            .password("myPwd")
            .build();
}
Related
I get this error while testing my reader:
Failed to initialize the reader
org.springframework.batch.item.ItemStreamException: Failed to initialize the reader
at app//org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader.open(AbstractItemCountingItemStreamItemReader.java:153)
at app//org.springframework.batch.item.support.AbstractItemCountingItemStreamItemReader$$FastClassBySpringCGLIB$$ebb633d0.invoke(<generated>)
at app//org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218)
at app//org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:783)
at app//org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163)
at app//org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:753)
at app//org.springframework.aop.support.DelegatingIntroductionInterceptor.doProceed(DelegatingIntroductionInterceptor.java:137)
at app//org.springframework.aop.support.DelegatingIntroductionInterceptor.invoke(DelegatingIntroductionInterceptor.java:124)
at app//org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
at app//org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:753)
at app//org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:698)
at app//org.springframework.batch.item.database.JdbcCursorItemReader$$EnhancerBySpringCGLIB$$403e222f.open(<generated>)
at app//com.europcar.tokenization.step.reader.BankCardReaderTest.lambda$readAll$0(BankCardReaderTest.java:131)
at app//org.springframework.batch.test.StepScopeTestUtils.doInStepScope(StepScopeTestUtils.java:38)
at app//com.europcar.tokenization.step.reader.BankCardReaderTest.readAll(BankCardReaderTest.java:127)
at app//com.europcar.tokenization.step.reader.BankCardReaderTest.readTest(BankCardReaderTest.java:120)
at java.base@17.0.3.1/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base@17.0.3.1/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base@17.0.3.1/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base@17.0.3.1/java.lang.reflect.Method.invoke(Method.java:568)
at app//org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:725)
at app//org.junit.jupiter.engine.execution.MethodInvocation.proceed(MethodInvocation.java:60)
at app//org.junit.jupiter.engine.execution.InvocationInterceptorChain$ValidatingInvocation.proceed(InvocationInterceptorChain.java:131)
at app//org.junit.jupiter.engine.extension.TimeoutExtension.intercept(TimeoutExtension.java:149)
at app//org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestableMethod(TimeoutExtension.java:140)
at app//org.junit.jupiter.engine.extension.TimeoutExtension.interceptTestMethod(TimeoutExtension.java:84)
at app//org.junit.jupiter.engine.execution.ExecutableInvoker$ReflectiveInterceptorCall.lambda$ofVoidMethod$0(ExecutableInvoker.java:115)
at app//org.junit.jupiter.engine.execution.ExecutableInvoker.lambda$invoke$0(ExecutableInvoker.java:105)
at app//org.junit.jupiter.engine.execution.InvocationInterceptorChain$InterceptedInvocation.proceed(InvocationInterceptorChain.java:106)
at app//org.junit.jupiter.engine.execution.InvocationInterceptorChain.proceed(InvocationInterceptorChain.java:64)
at app//org.junit.jupiter.engine.execution.InvocationInterceptorChain.chainAndInvoke(InvocationInterceptorChain.java:45)
at app//org.junit.jupiter.engine.execution.InvocationInterceptorChain.invoke(InvocationInterceptorChain.java:37)
at app//org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:104)
at app//org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:98)
at app//org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$7(TestMethodTestDescriptor.java:214)
at app//org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at app//org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:210)
at app//org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:135)
at app//org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:66)
at app//org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:151)
at app//org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at app//org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
at app//org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
at app//org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
at app//org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at app//org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
at app//org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
at java.base@17.0.3.1/java.util.ArrayList.forEach(ArrayList.java:1511)
at app//org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41)
at app//org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155)
at app//org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at app//org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
at app//org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
at app//org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
at app//org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at app//org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
at app//org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
at java.base@17.0.3.1/java.util.ArrayList.forEach(ArrayList.java:1511)
at app//org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.invokeAll(SameThreadHierarchicalTestExecutorService.java:41)
at app//org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$6(NodeTestTask.java:155)
at app//org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at app//org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$8(NodeTestTask.java:141)
at app//org.junit.platform.engine.support.hierarchical.Node.around(Node.java:137)
at app//org.junit.platform.engine.support.hierarchical.NodeTestTask.lambda$executeRecursively$9(NodeTestTask.java:139)
at app//org.junit.platform.engine.support.hierarchical.ThrowableCollector.execute(ThrowableCollector.java:73)
at app//org.junit.platform.engine.support.hierarchical.NodeTestTask.executeRecursively(NodeTestTask.java:138)
at app//org.junit.platform.engine.support.hierarchical.NodeTestTask.execute(NodeTestTask.java:95)
at app//org.junit.platform.engine.support.hierarchical.SameThreadHierarchicalTestExecutorService.submit(SameThreadHierarchicalTestExecutorService.java:35)
at app//org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:57)
at app//org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:54)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:108)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:88)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.lambda$execute$0(EngineExecutionOrchestrator.java:54)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.withInterceptedStreams(EngineExecutionOrchestrator.java:67)
at org.junit.platform.launcher.core.EngineExecutionOrchestrator.execute(EngineExecutionOrchestrator.java:52)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:96)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:75)
at org.gradle.api.internal.tasks.testing.junitplatform.JUnitPlatformTestClassProcessor$CollectAllTestClassesExecutor.processAllTestClasses(JUnitPlatformTestClassProcessor.java:99)
at org.gradle.api.internal.tasks.testing.junitplatform.JUnitPlatformTestClassProcessor$CollectAllTestClassesExecutor.access$000(JUnitPlatformTestClassProcessor.java:79)
at org.gradle.api.internal.tasks.testing.junitplatform.JUnitPlatformTestClassProcessor.stop(JUnitPlatformTestClassProcessor.java:75)
at org.gradle.api.internal.tasks.testing.SuiteTestClassProcessor.stop(SuiteTestClassProcessor.java:61)
at java.base@17.0.3.1/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base@17.0.3.1/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base@17.0.3.1/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base@17.0.3.1/java.lang.reflect.Method.invoke(Method.java:568)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:36)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at org.gradle.internal.dispatch.ContextClassLoaderDispatch.dispatch(ContextClassLoaderDispatch.java:33)
at org.gradle.internal.dispatch.ProxyDispatchAdapter$DispatchingInvocationHandler.invoke(ProxyDispatchAdapter.java:94)
at jdk.proxy1/jdk.proxy1.$Proxy2.stop(Unknown Source)
at org.gradle.api.internal.tasks.testing.worker.TestWorker$3.run(TestWorker.java:193)
at org.gradle.api.internal.tasks.testing.worker.TestWorker.executeAndMaintainThreadName(TestWorker.java:129)
at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:100)
at org.gradle.api.internal.tasks.testing.worker.TestWorker.execute(TestWorker.java:60)
at org.gradle.process.internal.worker.child.ActionExecutionWorker.execute(ActionExecutionWorker.java:56)
at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:133)
at org.gradle.process.internal.worker.child.SystemApplicationClassLoaderWorker.call(SystemApplicationClassLoaderWorker.java:71)
at app//worker.org.gradle.process.internal.worker.GradleWorkerMain.run(GradleWorkerMain.java:69)
at app//worker.org.gradle.process.internal.worker.GradleWorkerMain.main(GradleWorkerMain.java:74)
Caused by: org.springframework.jdbc.BadSqlGrammarException: Executing query; bad SQL grammar [select field1, field2,
field3,
field4,
field5
from MYDB.MYTABLE wct]; nested exception is org.h2.jdbc.JdbcSQLSyntaxErrorException: Table "MYTABLE" not found;
How can I avoid this error? When I debug, I step through the ReaderConfig class and the reader is initialized when the context is created, but by the time the test runs the reader is no longer initialized. Can someone help me?
Below are my test class, my test configuration for the reader, and my job configuration.
I set up an H2 database with the properties file shown below.
#ActiveProfiles("test") #RunWith(SpringRunner.class) #PropertySource(ignoreResourceNotFound = true, value = "classpath:application-test.properties") #SpringBootTest #EnableAutoConfiguration #ContextConfiguration(classes = { MyFunctionJobConfiguration.class, ReaderConfig.class}) #TestExecutionListeners({ DependencyInjectionTestExecutionListener.class, DirtiesContextTestExecutionListener.class }) #DirtiesContext(classMode
= ClassMode.AFTER_CLASS) public class `MyEntity`ReaderTest {
#Autowired private JdbcCursorItemReader<MyEntity> MyEntityReader;
#Autowired private DataSource dataSource;
StepExecution stepExecution;
private static final Logger LOGGER = LoggerFactory.getLogger("TEST_LOGGER");
private JobParameters defaultJobParameters() {
JobParameters params = new JobParametersBuilder()
.addString("JobID", String.valueOf(System.currentTimeMillis()))
.toJobParameters();
return params; } #BeforeEach public void setUp() throws Exception { } public StepExecution getStepExecution() {
StepExecution stepExecution = MetaDataInstanceFactory.createStepExecution(defaultJobParameters());
return stepExecution; } #Test #Transactional void readTest() throws Exception {
List<MyEntity> MyEntities = readAll();
LOGGER.info("result of read : bank card ={}", MyEntities);
assertNotNull(MyEntities); }
private List<MyEntity> readAll() throws Exception {
// build step execution context
return StepScopeTestUtils.doInStepScope(
getStepExecution(),
() -> {
// init reader
MyEntityReader.open(stepExecution.getExecutionContext());
List<MyEntity> items = new ArrayList<>();
MyEntity MyEntity1;
while ((MyEntity1 = MyEntityReader.read()) != null) {
items.add(MyEntity1);
}
MyEntityReader.close();
return items;
}); } }
#Profile("test")
#Configuration
public class ReaderConfig {
#Autowired
private DataSource dataSource;
private MyEntityRowMapper MyEntityRowMapper = new MyEntityRowMapper();
private static final String SQL_FILE = "extract_entities_test.sql";
Integer maxCardsNumber;
public static final String MODULO_LABEL = ":modulo";
public static final String GRID_SIZE_LABEL = ":gridSize";
#Bean
#Primary
#StepScope
public JdbcCursorItemReader<MyEntity> MyEntityReader() throws IOException {
return initMyEntityReader(1,1);
}
#SqlGroup
({
#Sql(scripts={"classpath:create-tables.sql"}),
#Sql(scripts={"classpath:init_MyEntity_data.sql"})
})
public JdbcCursorItemReader<MyEntity> initMyEntityReader(int modulo, int gridSize) throws IOException {
JdbcCursorItemReader<MyEntity> cursorItemReader = new JdbcCursorItemReader<>();
ClassPathResource resource = new ClassPathResource(SQL_FILE);
BufferedReader reader = new BufferedReader(new InputStreamReader(resource.getInputStream()));
String query = FileCopyUtils.copyToString(reader);
query = query.replace(MODULO_LABEL, String.valueOf(modulo));
query = query.replace(GRID_SIZE_LABEL, String.valueOf(gridSize));
cursorItemReader.setSql(query);
final int partitionSize = 10 / gridSize;
cursorItemReader.setMaxItemCount(partitionSize);
cursorItemReader.setDataSource(dataSource);
cursorItemReader.setRowMapper(MyEntityRowMapper);
return cursorItemReader;
}
}
@Configuration
@EnableBatchProcessing
@RefreshScope
public class MyFunctionJobConfiguration {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    JdbcCursorItemReader<MyEntity> reader;

    @Value("${max-number-card-to-process}")
    private Integer MAX_NUMBER_CARD;

    @Value("${chunck-size:10}")
    private int chunckSize;

    @Value("${grid-size:1}")
    private int gridSize;

    private final static String JOB_DISABLED = "job is disabled, check the configuration file !";

    @Value("${job.enabled}")
    private boolean batchIsEnabled;

    private static final Logger LOGGER = LoggerFactory.getLogger("FUNCTIONAL_LOGGER");

    @Bean
    @StepScope
    @RefreshScope
    public MyEntityWriter writer() {
        return new MyEntityWriter();
    }

    @Bean
    @StepScope
    @RefreshScope
    public MyFunctionProcessor processor() throws IOException {
        return new MyFunctionProcessor();
    }

    @Bean
    public MyPrationner partitioner() {
        return new MyPrationner();
    }

    @Bean
    public Step masterStep() throws SQLException, IOException, ClassNotFoundException {
        return stepBuilderFactory.get("masterStep")
                .partitioner("MyFunctionStep", partitioner())
                .step(myFunctionStep())
                .gridSize(gridSize)
                .taskExecutor(myFunctionTaskExecutor())
                .build();
    }

    @Bean
    public TaskExecutor myFunctionTaskExecutor() {
        ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
        taskExecutor.setThreadNamePrefix("MyFunctionTaskExecutor_");
        int corePoolSize = gridSize + 2;
        int maxPoolSize = corePoolSize * 2;
        taskExecutor.setMaxPoolSize(maxPoolSize);
        taskExecutor.setAllowCoreThreadTimeOut(true);
        taskExecutor.setCorePoolSize(corePoolSize);
        taskExecutor.setQueueCapacity(Integer.MAX_VALUE);
        return taskExecutor;
    }

    @Bean
    public Step myFunctionStep() throws IOException, ClassNotFoundException, SQLException {
        return stepBuilderFactory.get("MyFunctionStep")
                .<MyEntity, MyEntity>chunk(chunckSize)
                .reader(reader)
                .faultTolerant()
                .skipLimit(MAX_NUMBER_CARD)
                .skip(InvalidCardNumberException.class)
                .skip(TokenManagementException.class)
                .processor(processor())
                .listener(new MyEntityProcessListener())
                .writer(writer())
                .listener(new MyEntityWriteListener())
                .build();
    }

    @Bean
    public Job myFunctionJob(@Qualifier("myFunctionStep") Step myFunctionStep)
            throws SQLException, IOException, ClassNotFoundException {
        if (!batchIsEnabled) {
            LOGGER.error(JOB_DISABLED);
            System.exit(0);
        }
        return jobBuilderFactory.get("MyFunctionJob")
                .listener(new MyFunctionJobListener())
                .incrementer(new RunIdIncrementer())
                .flow(masterStep())
                .end()
                .build();
    }
}
logging.level.org.hibernate.SQL=DEBUG
logging.level.org.hibernate.type.descriptor.sql.BasicBinder=TRACE
spring.datasource.type=com.zaxxer.hikari.HikariDataSource
spring.datasource.pool-name=${spring.application.name}-pool
spring.datasource.url=jdbc:h2:mem:MYDB;DB_CLOSE_DELAY=-1;INIT=CREATE SCHEMA IF NOT EXISTS MYDB
spring.datasource.username=sa
spring.datasource.password=sa
spring.datasource.driver-class-name=org.h2.Driver
spring.datasource.database-platform=org.hibernate.dialect.H2Dialect
spring.jpa.defer-datasource-initialization=true
#initialize batch schema
spring.batch.jdbc.initialize-schema=always
spring.datasource.initialization-mode=always
spring.sql.init.mode=always
spring.jpa.hibernate.ddl-auto=none
#spring.jpa.hibernate.ddl-auto=create
spring.h2.console.enabled=true
# Custom H2 Console URL
spring.h2.console.path=/h2
spring.jpa.generate-ddl=true
#jpa.hibernate.ddl-auto=create-drop
spring.datasource.hikari.pool-name=${spring.application.name}-pool
spring.datasource.hikari.connection-timeout=20000
spring.datasource.hikari.maximum-pool-size=30
spring.datasource.hikari.max-lifetime=86400000
spring.datasource.hikari.idle-timeout=600000
spring.datasource.hikari.keepalive-time=30000
spring.datasource.hikari.minimum-idle=10
spring.datasource.jmx-enabled=true
I put the table-creation and data-injection SQL scripts in ReaderConfig with the @Sql annotation, where I define the reader bean for the test.
It is too late to create tables at that point. You need to initialize the data source with the Spring Batch tables and your custom tables before running your job. This can be done in the test setup phase, not in the item reader setup phase.
Please check the documentation on how to initialize the data source with Spring Boot.
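As a minimal sketch of that idea (the script names are the ones from the original post; the class skeleton is illustrative), the @Sql scripts can be declared on the test class instead of on the reader bean, so that they run before each test method and the tables already exist when the step is executed:

import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.ActiveProfiles;
import org.springframework.test.context.jdbc.Sql;
import org.springframework.test.context.jdbc.SqlGroup;

@SpringBootTest
@ActiveProfiles("test")
@SqlGroup({
        @Sql(scripts = {"classpath:create-tables.sql"}),     // create the business tables first
        @Sql(scripts = {"classpath:init_MyEntity_data.sql"})  // then load the test data
})
class MyEntityReaderTest {
    // ... reader test methods as shown above
}

The Spring Batch metadata tables themselves are created by Spring Boot because spring.batch.jdbc.initialize-schema=always is already set in the test properties.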
I tried to implement an example of Spring Boot batch processing from a database to a CSV file.
The issue I cannot solve is that the rows in the CSV file are not sorted by id, and the file does not show the column titles.
Here is the output in the CSV file (the first value is the id):
3,9,2013-04-15,Japan,jheino3#mayoclinic.com,Jard,MALE,Heino,87cdda81-45d0-451a-a62f-f8450eae1b64
2,23,1999-01-25,Panama,nloynes2#woothemes.com,Natala,FEMALE,Loynes,24be24e6-525f-42de-855d-52d4fef21608
6,9,2013-02-16,China,rcossans4#harvard.edu,Roseline,FEMALE,Cossans,06f70b0d-2c98-4f46-b933-528499ab91b3
4,8,2014-05-09,Indonesia,jcarlaw1#t.co,Jilleen,FEMALE,Carlaw,c722e6d5-9024-49c5-80e0-c2555f1eb9cc
1,22,2000-08-15,China,gspearing0#flickr.com,Ginnie,FEMALE,Spearing,fa26fa96-97d3-4e8e-856a-fdf07499e13e
5,22,2000-03-18,Indonesia,rgillino6#china.com.cn,Rainer,MALE,Gillino,5302a199-f313-4a24-9550-d643001d9faf
I want all rows to be sorted by id.
How can I do that?
Here is the batch configuration class:
@Configuration // Informs Spring that this class contains configurations
@EnableBatchProcessing // Enables batch processing for the application
@RequiredArgsConstructor
public class BatchConfiguration {

    private final JobBuilderFactory jobBuilderFactory;
    private final StepBuilderFactory stepBuilderFactory;
    private final UserRepository userRepository;

    Date now = new Date(); // java.util.Date, NOT java.sql.Date or java.sql.Timestamp!
    String format1 = new SimpleDateFormat("yyyy-MM-dd'-'HH-mm-ss-SSS", Locale.forLanguageTag("tr-TR")).format(now);

    private Resource outputResource = new FileSystemResource("output/customers_" + format1 + ".csv");

    @Bean
    public RepositoryItemReader<User> reader() {
        RepositoryItemReader<User> repositoryItemReader = new RepositoryItemReader<>();
        repositoryItemReader.setRepository(userRepository);
        repositoryItemReader.setMethodName("findAll");
        final HashMap<String, Sort.Direction> sorts = new HashMap<>();
        sorts.put("id", Sort.Direction.ASC);
        repositoryItemReader.setSort(sorts);
        return repositoryItemReader;
    }

    @Bean
    public FlatFileItemWriter<User> writer() {
        FlatFileItemWriter<User> writer = new FlatFileItemWriter<>();
        writer.setResource(outputResource);
        writer.setAppendAllowed(true);
        writer.setLineAggregator(new DelimitedLineAggregator<User>() {
            {
                setDelimiter(",");
                setFieldExtractor(new BeanWrapperFieldExtractor<User>() {
                    {
                        setNames(new String[]{"id", "age", "birthday", "country", "email", "firstName", "gender", "lastName", "personId"});
                    }
                });
            }
        });
        return writer;
    }

    @Bean
    public UserProcessor processor() {
        return new UserProcessor();
    }

    @Bean
    public UserJobExecutionNotificationListener stepExecutionListener() {
        return new UserJobExecutionNotificationListener(userRepository);
    }

    @Bean
    public UserStepCompleteNotificationListener jobExecutionListener() {
        return new UserStepCompleteNotificationListener();
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("csv-step").<User, User>chunk(10)
                .reader(reader())
                .processor(processor())
                .writer(writer())
                .listener(stepExecutionListener())
                .taskExecutor(taskExecutor())
                .build();
    }

    @Bean
    public Job runJob() {
        return jobBuilderFactory.get("importuserjob")
                .listener(jobExecutionListener())
                .flow(step1()).end().build();
    }

    @Bean
    public TaskExecutor taskExecutor() {
        SimpleAsyncTaskExecutor asyncTaskExecutor = new SimpleAsyncTaskExecutor();
        asyncTaskExecutor.setConcurrencyLimit(10);
        return asyncTaskExecutor;
    }
}
Here is the link to the example: Link
After I added a FlatFileHeaderCallback to the FlatFileItemWriter, the issue was fixed.
Here is the code snippet:
writer.setHeaderCallback(new FlatFileHeaderCallback() {
    @Override
    public void writeHeader(Writer writer) throws IOException {
        for (int i = 0; i < headers.length; i++) {
            if (i != headers.length - 1)
                writer.append(headers[i] + ",");
            else
                writer.append(headers[i]);
        }
    }
});
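Since FlatFileHeaderCallback has a single writeHeader method, the same header line can also be written more compactly with a lambda; this assumes headers is the same String[] of column names used in the snippet above:

writer.setHeaderCallback(w -> w.write(String.join(",", headers)));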
I'm new to Spring Batch and I'm trying to learn about it.
I implemented the datasource and batch configuration, but it doesn't seem to work, because I get:
Exception in thread "main" org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [SELECT JOB_INSTANCE_ID, JOB_NAME from BATCH_JOB_INSTANCE where JOB_NAME = ? and JOB_KEY = ?]; nested exception is java.sql.SQLSyntaxErrorException: Table '$tableName.batch_job_instance' doesn't exist
DataSourceConfiguration class :
@Configuration
public class DataSourceConfiguration {

    @Bean
    public DataSource mysqlDataSource() throws SQLException {
        final SimpleDriverDataSource dataSource = new SimpleDriverDataSource();
        dataSource.setDriver(new com.mysql.cj.jdbc.Driver());
        dataSource.setUrl("jdbc:mysql://localhost:3306/task4");
        dataSource.setUsername("name");
        dataSource.setPassword("password");
        return dataSource;
    }
}
And I have a batch configuration class, but I don't know whether the @EnableBatchProcessing annotation works without Spring Boot.
@Configuration
@EnableBatchProcessing
@Import(DataSourceConfiguration.class)
public class BatchConfiguration {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Autowired
    private DataSource dataSource;

    @Bean
    public ItemReader<User> itemReader() {
        return new JdbcCursorItemReaderBuilder<User>()
                .dataSource(this.dataSource)
                .name("creditReader")
                .sql("select userLogin, userEmail, userPassword, accountBalance from userInfo")
                .rowMapper(new UserRowMapper())
                .build();
    }

    @Bean
    public ItemWriter<User> simpleItemWriter() {
        return new SimpleUserWriter();
    }

    @Bean
    public Job sampleJob() {
        return this.jobs.get("sampleJob")
                .start(step1())
                .build();
    }

    @Bean
    public Step step1() {
        return this.steps.get("step1")
                .<User, User>chunk(10)
                .reader(itemReader())
                .writer(simpleItemWriter())
                .build();
    }
}
The main class contains these lines of code:
public class Demo {

    public static void main(String[] args) throws SQLException {
        ApplicationContext context = new AnnotationConfigApplicationContext(BatchConfiguration.class);
        context.getBean("jobRepository");
        JobLauncher jobLauncher = context.getBean("jobLauncher", JobLauncher.class);
        Job job = context.getBean("sampleJob", Job.class);
        try {
            JobExecution execution = jobLauncher.run(job, new JobParameters());
            System.out.println("Job Exit Status : " + execution.getStatus());
        } catch (JobExecutionException e) {
            System.out.println("Job ExamResult failed");
            e.printStackTrace();
        }
    }
}
How can I solve this problem? Thanks for helping.
The error seems to say that you are missing the tables needed by Spring Batch.
Check https://docs.spring.io/spring-batch/docs/4.3.x/reference/html/schema-appendix.html
You can also find ready-made SQL files for the different database engines under org/springframework/batch/core/schema-<DB>.sql in the spring-batch-core jar file.
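As an illustration (not part of the original answer), one way to run that bundled script against the MySQL DataSource at startup is a DataSourceInitializer bean. This is a hedged sketch; the script is not idempotent, so it should only run while the BATCH_* tables do not exist yet:

import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;
import org.springframework.jdbc.datasource.init.DataSourceInitializer;
import org.springframework.jdbc.datasource.init.ResourceDatabasePopulator;

@Configuration
public class BatchSchemaInitializer {

    // Creates the Spring Batch metadata tables using the script shipped in spring-batch-core.
    @Bean
    public DataSourceInitializer batchSchemaInitializer(DataSource dataSource) {
        ResourceDatabasePopulator populator = new ResourceDatabasePopulator(
                new ClassPathResource("org/springframework/batch/core/schema-mysql.sql"));
        DataSourceInitializer initializer = new DataSourceInitializer();
        initializer.setDataSource(dataSource);
        initializer.setDatabasePopulator(populator);
        return initializer;
    }
}

With Spring Boot, setting spring.batch.jdbc.initialize-schema=always (or spring.batch.initialize-schema=always on older Boot versions) achieves the same result without extra code.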
I want to use Spring Batch with OSGi to run a job daily.
Here is what I did:
@Component
@EnableBatchProcessing
public class BatchConfiguration {

    private JobBuilderFactory jobs;

    public JobBuilderFactory getJobs() {
        return jobs;
    }

    public void setJobs(JobBuilderFactory jobs) {
        this.jobs = jobs;
    }

    private StepBuilderFactory steps;

    private EmployeeRepository employeeRepository; // Spring Data repository

    public EmployeeRepository getEmployeeRepository() {
        return employeeRepository;
    }

    @Reference
    public void setEmployeeRepository(EmployeeRepository employeeRepository) {
        this.employeeRepository = employeeRepository;
    }

    public Step syncEmployeesStep() throws Exception {
        RepositoryItemWriter writer = new RepositoryItemWriter();
        writer.setRepository(employeeRepository);
        writer.setMethodName("save");
        return steps.get("syncEmployeesStep")
                .<Employee, Employee>chunk(10)
                .reader(reader())
                .writer(writer)
                .build();
    }

    public Job importEmpJob() throws Exception {
        return jobs.get("importEmpJob")
                .incrementer(new RunIdIncrementer())
                .start(syncEmployeesStep())
                .next(syncEmployeesStep())
                .build();
    }

    public ItemReader<Employee> reader() throws Exception {
        String jpqlQuery = "select a from Employee a";
        ServerEMF entityManager = new ServerEMF();
        JpaPagingItemReader<Employee> reader = new JpaPagingItemReader<>();
        reader.setQueryString(jpqlQuery);
        reader.setEntityManagerFactory(entityManager.getEntityManagerFactory());
        reader.setPageSize(3);
        reader.afterPropertiesSet();
        reader.setSaveState(true);
        return reader;
    }
}
Here I want to run this job to sync between two databases. My problem is how to run this job inside OSGi.
@EnableScheduling
@Component
public class JobRunner {

    private JobLauncher jobLauncher;
    private Job job;
    private BatchConfiguration batchConfig;
    //private JobBuilderFactory jobs;
    //private JobRepository jobrepo;

    final static Logger logger = LoggerFactory.getLogger(BatchConfiguration.class);

    BundleContext ctx;

    @SuppressWarnings("rawtypes")
    ServiceTracker servicetracker;

    @Activate
    public void start(BundleContext context) {
        batchConfig = new BatchConfiguration();
        //jobs = new JobBuilderFactory(jobRepository)
        try {
            job = batchConfig.importEmpJob(); // job is null because I don't know how to use it
        } catch (Exception e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
        ctx = context;
        servicetracker = new ServiceTracker(ctx, BatchConfiguration.class, null);
        servicetracker.open();
        new Thread() {
            public void run() { findAndRunJob(); }
        }.start();
    }

    @Deactivate
    public void stop() {
        servicetracker.close();
    }

    @Scheduled(fixedRate = 5000)
    protected void findAndRunJob() {
        logger.info("job created.");
        try {
            String dateParam = new Date().toString();
            JobParameters param = new JobParametersBuilder().addString("date", dateParam).toJobParameters();
            System.out.println(dateParam);
            JobExecution execution = jobLauncher.run(job, param);
            System.out.println("Exit Status : " + execution.getStatus());
        } catch (Exception e) {
            //e.printStackTrace();
        }
    }
}
For sure, I got a java.lang.NullPointerException because the job is null.
Could anyone help me with that?
After updates:
@Component
@EnableBatchProcessing
public class BatchConfiguration {

    private EmployeeRepository employeeRepository; // Spring Data repository

    public EmployeeRepository getEmployeeRepository() {
        return employeeRepository;
    }

    @Reference
    public void setEmployeeRepository(EmployeeRepository employeeRepository) {
        this.employeeRepository = employeeRepository;
    }

    public Step syncEmployeesStep() throws Exception {
        RepositoryItemWriter writer = new RepositoryItemWriter();
        writer.setRepository(employeeRepository);
        writer.setMethodName("save");
        return steps.get("syncEmployeesStep")
                .<Employee, Employee>chunk(10)
                .reader(reader())
                .writer(writer)
                .build();
    }

    public Job importEmpJob(JobRepository jobRepository, PlatformTransactionManager transactionManager) throws Exception {
        JobBuilderFactory jobs = new JobBuilderFactory(jobRepository);
        StepBuilderFactory stepBuilderFactory = new StepBuilderFactory(jobRepository, transactionManager);
        return jobs.get("importEmpJob")
                .incrementer(new RunIdIncrementer())
                .start(syncEmployeesStep())
                .next(syncEmployeesStep())
                .build();
    }

    public ItemReader<Employee> reader() throws Exception {
        String jpqlQuery = "select a from Employee a";
        ServerEMF entityManager = new ServerEMF();
        JpaPagingItemReader<Employee> reader = new JpaPagingItemReader<>();
        reader.setQueryString(jpqlQuery);
        reader.setEntityManagerFactory(entityManager.getEntityManagerFactory());
        reader.setPageSize(3);
        reader.afterPropertiesSet();
        reader.setSaveState(true);
        return reader;
    }
}
JobRunner class (updated):
private JobLauncher jobLauncher;
private PlatformTransactionManager transactionManager;
private JobRepository jobRepository;
Job importEmpJob;
private BatchConfiguration batchConfig;

@SuppressWarnings("deprecation")
@Activate
public void start(BundleContext context) {
    try {
        batchConfig = new BatchConfiguration();
        this.transactionManager = new ResourcelessTransactionManager();
        MapJobRepositoryFactoryBean repositorybean = new MapJobRepositoryFactoryBean();
        repositorybean.setTransactionManager(transactionManager);
        this.jobRepository = repositorybean.getJobRepository(); // error after executing this statement
        // setup job launcher
        SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
        simpleJobLauncher.setTaskExecutor(new SyncTaskExecutor());
        simpleJobLauncher.setJobRepository(jobRepository);
        this.jobLauncher = simpleJobLauncher;
        //System.out.println(job);
    } catch (Exception e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    ctx = context;
    configAdminTracker = new ServiceTracker(ctx, BatchConfiguration.class.getName(), null);
    configAdminTracker.open();
    new Thread() {
        public void run() { findAndRunJob(); }
    }.start();
}

@Deactivate
public void stop() {
    configAdminTracker.close();
}

protected void findAndRunJob() {
    logger.info("job created.");
    try {
        String dateParam = new Date().toString();
        // creating the job
        Job job = batchConfig.importEmpJob(jobRepository, transactionManager);
        // running the job
        JobExecution execution = this.jobLauncher.run(job, new JobParameters());
        System.out.println("Exit Status : " + execution.getStatus());
    } catch (Exception e) {
        //e.printStackTrace();
    }
}
What I get after running is "java.lang.IllegalArgumentException: interface org.springframework.batch.core.repository.JobRepository is not visible from class loader". Could anyone help me with that error?
In Short
If you're just trying to kick off something simple and don't need all the Spring Batch goodness, I would look into the Apache Sling Commons Scheduler, which has a simple job processor on top of Quartz for scheduling [1].
In General
There are a couple of considerations here depending on what you are trying to do. Are you deploying the Spring Batch jars to the OSGi container with the assumption that the code written for the jobs (steps, tasks, etc.) will live in separate bundles? OSGi's purpose is to develop modular code, so my answer assumes that this is your end goal. The folks at Pivotal have dropped OSGi support from their artifacts, so to make it work you'll need to determine what you need to export from the Batch jar files. This can be done with BND; I would recommend checking out the new BND Maven plugin [2]. I would configure the Export-Package to export the interfaces you need to write the jobs, so that you can write the jobs in separate, modular bundles. Then I would probably embed the Spring Batch jars in a bundle and write a small wrapper around the JobLauncher, as sketched below. This should confine all the actual batch code to a single classloader, so that you don't have to worry about OSGi trying to pull in classes dynamically. The downside is that this will prevent you from using many of the batch annotations outside of the Spring Batch bundle you created, but it will provide the modularity that you'd be looking for by implementing this type of solution with OSGi.
[1] https://sling.apache.org/documentation/bundles/apache-sling-eventing-and-job-handling.html
[2] http://njbartlett.name/2015/03/27/announcing-bnd-maven-plugin.html
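As a rough illustration of that wrapper idea (all class, interface and method names here are hypothetical, not taken from the answer), the bundle that embeds the Spring Batch jars could expose a single OSGi service so that consumer bundles never load Batch classes themselves:

import java.util.Map;

import org.osgi.service.component.annotations.Component;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.JobParametersBuilder;
import org.springframework.batch.core.launch.JobLauncher;

// Hypothetical service interface, exported by the wrapper bundle (in its own source file):
// public interface BatchJobService {
//     void runJob(String jobName) throws Exception;
// }

@Component(service = BatchJobService.class)
public class SpringBatchJobService implements BatchJobService {

    private JobLauncher jobLauncher; // built inside this bundle, e.g. a SimpleJobLauncher
    private Map<String, Job> jobs;   // jobs registered by name inside this bundle

    @Override
    public void runJob(String jobName) throws Exception {
        Job job = jobs.get(jobName);
        jobLauncher.run(job, new JobParametersBuilder()
                .addLong("run.id", System.currentTimeMillis())
                .toJobParameters());
    }
}

A consumer bundle would then only depend on the BatchJobService interface and never touch org.springframework.batch packages directly.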
I am writing a Java application that uses Spring, Hibernate and more; it's going to be packaged in a jar and run from the command line.
My main class looks like the following right now:
public class App
{
    private static final Logger logger = LoggerFactory.getLogger(App.class);

    @Autowired
    private static MemberInquiryService memberInquiryService;

    public static void main(String[] args)
    {
        logger.info("Starting Inquiry Batch Process");
        int pendingRecords = memberInquiryService.getPendingRecordCount();
        logger.info("Current Number Of Pendinig Records (" + pendingRecords + ")");
        logger.info("Ending Inquiry Batch Process");
    }
}
and in getPendingRecordCount I just return 10 for testing:
public int getPendingRecordCount()
{
    return 10;
}
Why would I be getting the following error:
Exception in thread "main" java.lang.NullPointerException
at org.XXXX.inquirybatch.app.App.main(App.java:33)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
Also, here is my DatabaseConfig class:
@Configuration
@EnableTransactionManagement
@ComponentScan(basePackages = { "org.xxxx.inquirybatch", "org.xxxx.core" })
@PropertySource("classpath:application.properties")
public class DatabaseConfig {

    private static final Logger logger = LoggerFactory.getLogger(DatabaseConfig.class);

    @Autowired
    Environment env;

    @Bean
    public DataSource dataSource() {
        String serverType = env.getProperty("server.type");
        try {
            if (serverType.equalsIgnoreCase("tomcat"))
            {
                com.mchange.v2.c3p0.ComboPooledDataSource ds = new com.mchange.v2.c3p0.ComboPooledDataSource();
                ds.setDriverClass(env.getProperty("database.driver"));
                ds.setUser(env.getProperty("database.user"));
                ds.setPassword(env.getProperty("database.password"));
                ds.setJdbcUrl(env.getProperty("database.url"));
                return ds;
            }
            else
            {
                Context ctx = new InitialContext();
                return (DataSource) ctx.lookup("java:jboss/datasources/mySQLDB");
            }
        }
        catch (Exception e)
        {
            logger.error(e.getMessage());
        }
        return null;
    }

    @Bean
    public SessionFactory sessionFactory()
    {
        LocalSessionFactoryBean factoryBean = new LocalSessionFactoryBean();
        factoryBean.setDataSource(dataSource());
        factoryBean.setHibernateProperties(getHibernateProperties());
        factoryBean.setPackagesToScan(new String[] { "org.xxxx.inquirybatch.model", "org.xxxx.core.model" });
        try {
            factoryBean.afterPropertiesSet();
        } catch (IOException e) {
            logger.error(e.getMessage());
            e.printStackTrace();
        }
        return factoryBean.getObject();
    }

    @Bean
    public Properties getHibernateProperties()
    {
        Properties hibernateProperties = new Properties();
        hibernateProperties.setProperty("hibernate.dialect", env.getProperty("hibernate.dialect"));
        hibernateProperties.setProperty("hibernate.show_sql", env.getProperty("hibernate.show_sql"));
        hibernateProperties.setProperty("hibernate.use_sql_comments", env.getProperty("hibernate.use_sql_comments"));
        hibernateProperties.setProperty("hibernate.format_sql", env.getProperty("hibernate.format_sql"));
        hibernateProperties.setProperty("hibernate.hbm2ddl.auto", env.getProperty("hibernate.hbm2ddl.auto"));
        hibernateProperties.setProperty("hibernate.generate_statistics", env.getProperty("hibernate.generate_statistics"));
        hibernateProperties.setProperty("javax.persistence.validation.mode", env.getProperty("javax.persistence.validation.mode"));
        // Audit history flags
        hibernateProperties.setProperty("org.hibernate.envers.store_data_at_delete", env.getProperty("org.hibernate.envers.store_data_at_delete"));
        hibernateProperties.setProperty("org.hibernate.envers.global_with_modified_flag", env.getProperty("org.hibernate.envers.global_with_modified_flag"));
        return hibernateProperties;
    }

    @Bean
    public HibernateTransactionManager hibernateTransactionManager()
    {
        HibernateTransactionManager htm = new HibernateTransactionManager();
        htm.setSessionFactory(sessionFactory());
        htm.afterPropertiesSet();
        return htm;
    }
}
Spring never injects static fields. It also only injects objects that are retrieved from the application context, or that are themselves injected into other objects.
You're not even creating an application context, so Spring doesn't play any role in this program.
I suggest reading the documentation.
You need to get the memberInquiryService from a ClassPathXmlApplicationContext.
For example:
ApplicationContext context = new ClassPathXmlApplicationContext("spring-config.xml");
MemberInquiryService memberInquiryService = (MemberInquiryService) context.getBean("memberInquiryService");
Basically, in your code snippet MemberInquiryService is not Spring-managed because you are not getting it from the Spring container. You also need to declare a MemberInquiryService bean entry in spring-config.xml.
I had to change the main class to:
public static void main(String[] args)
{
    logger.info("Starting Inquiry Batch Process");
    ApplicationContext context = new AnnotationConfigApplicationContext(AppConfig.class);
    MemberInquiryService memberInquiryService = (MemberInquiryService) context.getBean("memberInquiryService");
    int pendingRecords = memberInquiryService.getPendingRecordCount();
    logger.info("Current Number Of Pendinig Records (" + pendingRecords + ")");
    logger.info("Ending Inquiry Batch Process");
}