Passing a JobParameter to a Processor class in Spring Batch - java

I am trying to store the data from 6 CSV files in the database. I convert each CSV record into another object and then save it.
To create that object, I need an additional field that arrives in the request body of a REST endpoint, and I pass it along as a JobParameter.
I need to access that parameter in my processor class. I have tried different approaches, but I keep getting the following error. Any solution would be appreciated.
EL1008E: Property or field 'jobParameters' cannot be found on object of type 'org.springframework.beans.factory.config.BeanExpressionContext' - maybe not public or not valid?
This is my REST endpoint:
@PostMapping
public void save(@RequestBody DownloadFileRequestDto downloadFileRequestDto) throws IOException {
    JobParameters jobParameters = new JobParametersBuilder()
            .addString("myParam", downloadFileRequestDto.getMyParam())
            .toJobParameters();
    try {
        JobExecution run = jobLauncher.run(job, jobParameters);
    } catch (JobExecutionAlreadyRunningException e) {
        throw new RuntimeException(e);
    } catch (JobRestartException e) {
        throw new RuntimeException(e);
    } catch (JobInstanceAlreadyCompleteException e) {
        throw new RuntimeException(e);
    } catch (JobParametersInvalidException e) {
        throw new RuntimeException(e);
    }
}
This is my Spring Batch Configuration class:
@Configuration
@EnableBatchProcessing
@AllArgsConstructor
public class SpringBatchConfig {

    private JobBuilderFactory jobBuilderFactory;
    private StepBuilderFactory stepBuilderFactory;
    private UserRepository userRepository;

    @Bean
    public FlatFileItemReader<InputUser> reader() {
        FlatFileItemReader<InputUser> itemReader = new FlatFileItemReader<>();
        itemReader.setResource(new FileSystemResource("src/main/resources/users.csv"));
        itemReader.setName("csvReader");
        itemReader.setLinesToSkip(1);
        itemReader.setLineMapper(lineMapper());
        return itemReader;
    }

    private LineMapper<InputUser> lineMapper() {
        DefaultLineMapper<InputUser> lineMapper = new DefaultLineMapper<>();
        DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();
        lineTokenizer.setDelimiter(",");
        lineTokenizer.setStrict(false);
        lineTokenizer.setNames("name", "salary");
        BeanWrapperFieldSetMapper<InputUser> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
        fieldSetMapper.setTargetType(InputUser.class);
        lineMapper.setLineTokenizer(lineTokenizer);
        lineMapper.setFieldSetMapper(fieldSetMapper);
        return lineMapper;
    }

    @Bean
    public Processor processors() {
        return new Processor();
    }

    public RepositoryItemWriter<User> writer() {
        RepositoryItemWriter<User> writer = new RepositoryItemWriter<User>();
        writer.setRepository(userRepository);
        writer.setMethodName("save");
        return writer;
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("csv-step").<InputUser, User>chunk(500)
                .reader(reader())
                .processor(processors())
                .writer(writer())
                .taskExecutor(taskExecutor())
                .build();
    }

    @Bean
    public Job job() {
        return jobBuilderFactory.get("importUsers")
                .flow(step1())
                .end().build();
    }

    public TaskExecutor taskExecutor() {
        SimpleAsyncTaskExecutor simpleAsyncTaskExecutor = new SimpleAsyncTaskExecutor();
        simpleAsyncTaskExecutor.setConcurrencyLimit(50);
        return simpleAsyncTaskExecutor;
    }
}
This is my Processor class, where I need access to my job parameters.
@Component
@Scope("step")
public class Processor implements ItemProcessor<InputUser, User> {

    @Value("#{jobParameters['myParam']}")
    private String fileName;

    public Processor() {
    }

    public void setFileName(String fileName) {
        this.fileName = fileName;
    }

    @Override
    public User process(InputUser item) throws Exception {
        Random random = new Random();
        return new User(random.nextInt(), item.getName(), item.getSalary());
    }
}

Remove @Component and @Scope from the Processor class. In the configuration, just declare the bean:
@Bean
@StepScope
public Processor processors(@Value("#{jobParameters['myParam']}") String myParam) {
    return new Processor(myParam);
}
Add a constructor to the Processor class that takes a String.
As it stands, you are creating two beans of Processor type: one in the configuration class and one with @Component. But when defining the step you are using the method in the configuration, which is not step-scoped, so I think Spring is not able to inject the value into it.
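For completeness, here is a minimal sketch of the reworked Processor class the answer describes (names are carried over from the question; the constructor-injected field replaces the @Value-annotated one):
```
import java.util.Random;
import org.springframework.batch.item.ItemProcessor;

// No @Component / @Scope here: the only bean is the step-scoped one
// declared in the configuration class above.
public class Processor implements ItemProcessor<InputUser, User> {

    private final String fileName; // receives jobParameters['myParam']

    public Processor(String fileName) {
        this.fileName = fileName;
    }

    @Override
    public User process(InputUser item) throws Exception {
        Random random = new Random();
        // fileName is now available to the mapping logic
        return new User(random.nextInt(), item.getName(), item.getSalary());
    }
}
```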

Related

Spring Boot Batch runs automatically instead of waiting for a request. How can I fix it?

I have an issue with running a batch process through a request instead of having it run automatically when the application starts.
Normally, when I start the application, the batch process is handled automatically and saves all the values into the database. I don't want that.
I want the batch process to start only after I make a request to the endpoint defined in the controller. That is what I really want to do.
How can I do that?
Here is the project link: Link
Here is the BatchConfiguration class shown below.
@Configuration // Informs Spring that this class contains configurations
@EnableBatchProcessing // Enables batch processing for the application
@RequiredArgsConstructor
public class BatchConfiguration {

    private final JobBuilderFactory jobBuilderFactory;
    private final StepBuilderFactory stepBuilderFactory;
    private final UserRepository userRepository;

    @Bean
    public FlatFileItemReader<UserInput> reader() {
        FlatFileItemReader<UserInput> itemReader = new FlatFileItemReader<>();
        itemReader.setResource(new FileSystemResource("src/main/resources/MOCK_DATA.csv"));
        itemReader.setName("csvReader");
        itemReader.setLinesToSkip(1);
        itemReader.setLineMapper(lineMapper());
        return itemReader;
    }

    private LineMapper<UserInput> lineMapper() {
        DefaultLineMapper<UserInput> lineMapper = new DefaultLineMapper<>();
        DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();
        lineTokenizer.setDelimiter(",");
        lineTokenizer.setStrict(false);
        lineTokenizer.setNames(
                "personId", "firstName", "lastName", "email", "gender", "birthday", "country"
        );
        BeanWrapperFieldSetMapper<UserInput> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
        fieldSetMapper.setTargetType(UserInput.class);
        lineMapper.setLineTokenizer(lineTokenizer);
        lineMapper.setFieldSetMapper(fieldSetMapper);
        return lineMapper;
    }

    @Bean
    public UserProcessor processor() {
        return new UserProcessor();
    }

    @Bean
    public RepositoryItemWriter<User> writer() {
        RepositoryItemWriter<User> writer = new RepositoryItemWriter<>();
        writer.setRepository(userRepository);
        writer.setMethodName("save");
        return writer;
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("csv-step").<UserInput, User>chunk(10)
                .reader(reader())
                .processor(processor())
                .writer(writer())
                .listener(stepExecutionListener())
                .taskExecutor(taskExecutor())
                .build();
    }

    @Bean
    public Job runJob() {
        return jobBuilderFactory.get("importuserjob")
                .listener(jobExecutionListener())
                .flow(step1()).end().build();
    }

    @Bean
    public TaskExecutor taskExecutor() {
        SimpleAsyncTaskExecutor asyncTaskExecutor = new SimpleAsyncTaskExecutor();
        asyncTaskExecutor.setConcurrencyLimit(10);
        return asyncTaskExecutor;
    }

    @Bean
    public UserJobExecutionNotificationListener stepExecutionListener() {
        return new UserJobExecutionNotificationListener(userRepository);
    }

    @Bean
    public UserStepCompleteNotificationListener jobExecutionListener() {
        return new UserStepCompleteNotificationListener();
    }
}
Here is the controller class shown below.
public class BatchController {

    private final JobLauncher jobLauncher;
    private final Job job;

    @PostMapping("/importuserjob")
    public ResponseEntity<String> importCsvToDBJob() {
        log.info("BatchController | importCsvToDBJob is called");
        JobParameters jobParameters = new JobParametersBuilder()
                .addLong("startAt", System.currentTimeMillis()).toJobParameters();
        try {
            jobLauncher.run(job, jobParameters);
        } catch (JobExecutionAlreadyRunningException | JobRestartException |
                 JobInstanceAlreadyCompleteException | JobParametersInvalidException e) {
            log.info("BatchController | importCsvToDBJob | error : " + e.getMessage());
            e.printStackTrace();
        }
        return new ResponseEntity<>("Batch Process started!!", HttpStatus.OK);
    }
}
To stop the job running automatically, set the following config flag:
spring:
  batch:
    job:
      enabled: false
You can always set the value at launch if you are running a jar:
java -Dspring.batch.job.enabled=false -jar myapp.jar
Not sure if it's just a cut-and-paste issue, but you are missing a constructor in your controller, which is required to set the final variables jobLauncher and job. I also assume your controller has the @Controller and @Slf4j annotations.
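A minimal sketch of the controller with those pieces filled in, assuming Lombok is available as in the configuration class (@RestController is used here since the handler returns a ResponseEntity directly; @RequiredArgsConstructor generates the constructor for the final fields):
```
import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.web.bind.annotation.RestController;

@Slf4j
@RestController
@RequiredArgsConstructor // generates BatchController(JobLauncher, Job)
public class BatchController {

    private final JobLauncher jobLauncher;
    private final Job job;

    // ... the importCsvToDBJob() handler from the question goes here unchanged
}
```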

Spring batch job runs automatically

I'm using Spring Batch to read a CSV file and write it to the DB, triggered from a controller. On starting the application, before I hit the URL from the browser, I see the print statements from my reader at startup, although it doesn't print them for my processor or writer, which are in separate classes that I have autowired. Is it because the reader is a bean?
I see the print statements from my FlatFileItemReader in the log on application startup, but the print statements from my processor and writer only show up in the console when I hit the controller URL.
I've tried adding spring.batch.job.enabled=false in the application.properties file, but it doesn't stop the execution of the reader bean. How can I prevent auto-execution of the reader bean in the SpringBatchConfig class?
SpringBatchConfig class:
@Configuration
@EnableBatchProcessing
public class SpringBatchConfig {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;
    @Autowired
    private StepBuilderFactory stepBuilderFactory;
    @Autowired
    private DataSource dataSource;
    @Autowired
    private DBWriter writer1;
    @Autowired
    private Processor processor1;

    // Step 1 - CSV to DB
    @Bean
    public FlatFileItemReader<User> itemReader() {
        FlatFileItemReader<User> flatFileItemReader = new FlatFileItemReader<>();
        flatFileItemReader.setResource(new FileSystemResource("src/main/resources/users.csv"));
        flatFileItemReader.setName("CSV-Reader");
        flatFileItemReader.setLinesToSkip(1);
        flatFileItemReader.setLineMapper(lineMapper());
        System.out.println("inside file reader 1 !!!!!");
        return flatFileItemReader;
    }

    @Bean
    public LineMapper<User> lineMapper() {
        DefaultLineMapper<User> defaultLineMapper = new DefaultLineMapper<>();
        DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();
        lineTokenizer.setDelimiter(",");
        lineTokenizer.setStrict(false);
        lineTokenizer.setNames(new String[]{"id", "name", "dept", "salary"});
        BeanWrapperFieldSetMapper<User> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
        fieldSetMapper.setTargetType(User.class);
        defaultLineMapper.setLineTokenizer(lineTokenizer);
        defaultLineMapper.setFieldSetMapper(fieldSetMapper);
        return defaultLineMapper;
    }

    @Bean
    public Step step1() throws Exception { // Step 1 - Read CSV and Write to DB
        return stepBuilderFactory.get("step1")
                .<User, User>chunk(100)
                .reader(itemReader())
                .processor(processor1)
                .writer(writer1)
                .build();
    }

    @Bean
    public Job job() throws Exception {
        return this.jobBuilderFactory.get("BATCH JOB")
                .incrementer(new RunIdIncrementer())
                .start(step1())
                .build();
    }
}
DBWriter class:
@Component
public class DBWriter implements ItemWriter<User> {

    @Autowired
    private UserRepository userRepository;

    @Override
    public void write(List<? extends User> users) throws Exception {
        System.out.println("Inside DB Writer");
        System.out.println("Data Saved for Users: " + users);
        userRepository.save(users);
    }
}
Processor class:
@Component
public class Processor implements ItemProcessor<User, User> {

    private static final Map<String, String> DEPT_NAMES = new HashMap<>();

    public Processor() {
        DEPT_NAMES.put("001", "Technology");
        DEPT_NAMES.put("002", "Operations");
        DEPT_NAMES.put("003", "Accounts");
    }

    @Override
    public User process(User user) throws Exception {
        String deptCode = user.getDept();
        String dept = DEPT_NAMES.get(deptCode);
        user.setDept(dept);
        user.setTime(new Date());
        System.out.println(String.format("Converted from [%s] to [%s]", deptCode, dept));
        return user;
    }
}
Controller class:
@RestController
@RequestMapping("/load")
public class LoadController {

    @Autowired
    JobLauncher jobLauncher;
    @Autowired
    Job job;

    @GetMapping("/users")
    public BatchStatus load() throws JobParametersInvalidException, JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException {
        Map<String, JobParameter> maps = new HashMap<>();
        maps.put("time", new JobParameter(System.currentTimeMillis()));
        JobParameters parameters = new JobParameters(maps);
        JobExecution jobExecution = jobLauncher.run(job, parameters);
        System.out.println("JobExecution: " + jobExecution.getStatus());
        System.out.println("Batch is Running...");
        while (jobExecution.isRunning()) {
            System.out.println("...");
        }
        return jobExecution.getStatus();
    }
}
The spring.batch.job.enabled=false property is used to prevent jobs from running at application startup.
The method that creates the reader will still be called at configuration time, so it's normal that you see the print statement. But that does not mean the reader was called inside a running job.
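If the eager instantiation itself is the concern, one option (a suggestion beyond the original answer) is to make the reader step-scoped, so the bean method, including its print statement, only runs when a step actually executes:
```
@Bean
@StepScope
public FlatFileItemReader<User> itemReader() {
    FlatFileItemReader<User> flatFileItemReader = new FlatFileItemReader<>();
    flatFileItemReader.setResource(new FileSystemResource("src/main/resources/users.csv"));
    flatFileItemReader.setName("CSV-Reader");
    flatFileItemReader.setLinesToSkip(1);
    flatFileItemReader.setLineMapper(lineMapper());
    // With @StepScope this prints when the step runs, not at application startup
    System.out.println("inside file reader 1 !!!!!");
    return flatFileItemReader;
}
```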

Import CSV file into database using Controller to pass CSV file

I am trying to create an endpoint for importing a CSV file into the database with the help of Spring Batch, but I am not able to get it working. Can someone help me with this? It would be greatly appreciated; I am a new learner.
My source code is given below. Please help me sort out this issue.
@RestController
public class MyImportController {

    private static final Logger logger = LoggerFactory.getLogger(MyImportController.class);

    @Autowired
    private JobLauncher jobLauncher;
    @Autowired
    private Job importUserJob;

    @GetMapping(value = "/import/file")
    public String uploadFile() {
        return "/uploadFile";
    }

    @PostMapping(value = "/import/file", consumes = MediaType.MULTIPART_FORM_DATA_VALUE)
    public String create(@RequestParam("file") MultipartFile multipartFile) throws IOException, JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException, JobParametersInvalidException {
        String path = "/Users/dilipverma/Desktop/CSV_FILE_READER/src/main/resources/temp/";
        File fileToImport = new File(path + multipartFile.getOriginalFilename());
        System.out.println(" ::::::::::::::::::: File name is ::---------------------------" + fileToImport);
        OutputStream outputStream = new FileOutputStream(fileToImport);
        IOUtils.copy(multipartFile.getInputStream(), outputStream);
        outputStream.flush();
        outputStream.close();
        JobExecution jobExecution = jobLauncher.run(importUserJob, new JobParametersBuilder()
                .addString("fullPathFileName", fileToImport.getAbsolutePath())
                .toJobParameters());
        logger.info(" :::::::::::::::::::::: Job Status is :::::::::::::::::: " + jobExecution.getStatus());
        return "Done";
    }
}
Above is my controller to accept the CSV file.
// Batch configuration
@Configuration
@EnableBatchProcessing
public class BatchConfig {

    @Bean
    public ResourcelessTransactionManager batchTransactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    protected JobRepository jobRepository(ResourcelessTransactionManager batchTransactionManager) throws Exception {
        MapJobRepositoryFactoryBean jobRepository = new MapJobRepositoryFactoryBean();
        jobRepository.setTransactionManager(batchTransactionManager);
        return jobRepository.getObject();
    }

    @Bean
    public JobLauncher jobLauncher(JobRepository jobRepository) {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository);
        return jobLauncher;
    }
}
@Configuration
public class ImportJobConfig {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;
    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    @Scope(value = "step", proxyMode = ScopedProxyMode.TARGET_CLASS)
    public FlatFileItemReader<Contact> importReader(@Value("#{jobParameters[fullPathFileName]}") String pathToFile) {
        FlatFileItemReader<Contact> reader = new FlatFileItemReader<>();
        reader.setResource(new FileSystemResource(pathToFile));
        reader.setLineMapper(lineMapper());
        return reader;
    }

    @Bean
    public LineMapper<Contact> lineMapper() {
        DefaultLineMapper<Contact> defaultLineMapper = new DefaultLineMapper<>();
        DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();
        lineTokenizer.setDelimiter(",");
        lineTokenizer.setStrict(false);
        lineTokenizer.setNames(new String[]{"email", "contactno"});
        BeanWrapperFieldSetMapper<Contact> fieldSetMapper = new BeanWrapperFieldSetMapper<>();
        fieldSetMapper.setTargetType(Contact.class);
        defaultLineMapper.setFieldSetMapper(fieldSetMapper);
        defaultLineMapper.setLineTokenizer(lineTokenizer);
        return defaultLineMapper;
    }

    @Bean
    public APSUploadFileItemProcessor processor() {
        return new APSUploadFileItemProcessor();
    }

    @Bean
    public Job importUserJob(ItemReader<Contact> importReader,
                             ItemWriter<Contact> itemWriter) {
        return jobBuilderFactory.get("importUserJob")
                .incrementer(new RunIdIncrementer())
                .flow(step1(importReader, itemWriter))
                .end()
                .build();
    }

    @Bean
    public Step step1(ItemReader<Contact> importReader,
                      ItemWriter<Contact> itemWriter) {
        return stepBuilderFactory.get("step1")
                .<Contact, Contact>chunk(10)
                .reader(importReader)
                .processor(processor())
                .writer(itemWriter)
                .build();
    }
}
// Processor for the job
@Component
public class APSUploadFileItemProcessor implements ItemProcessor<Contact, Contact> {

    @Override
    public Contact process(Contact apsUploadFile) {
        return apsUploadFile;
    }
}
// Writer storing into the database
@Component
public class DBWriter implements ItemWriter<Contact> {

    @Autowired
    private ContactRepository contactRepository;

    @Override
    public void write(List<? extends Contact> contacts) throws Exception {
        contactRepository.save(contacts);
    }
}
Below is the error I am getting while passing my CSV file:
2020-01-20 15:18:50.859 ERROR 15841 --- [nio-8090-exec-2] o.s.batch.core.step.AbstractStep : Encountered an error executing step step1 in job importUserJob
org.springframework.beans.NotReadablePropertyException: Invalid property 'Id' of bean class [java.util.Collections$UnmodifiableRandomAccessList]: Could not find field for property during fallback access!
at org.springframework.data.util.DirectFieldAccessFallbackBeanWrapper.getPropertyValue(DirectFieldAccessFallbackBeanWrapper.java:58) ~[spring-data-commons-2.1.10.RELEASE.jar:2.1.10.RELEASE]
at org.springframework.data.jpa.repository.support.JpaMetamodelEntityInformation.getId(JpaMetamodelEntityInformation.java:152) ~[spring-data-jpa-2.1.10.RELEASE.jar:2.1.10.RELEASE]
at org.springframework.data.repository.core.support.AbstractEntityInformation.isNew(AbstractEntityInformation.java:42) ~[spring-data-commons-2.1.10.RELEASE.jar:2.1.10.RELEASE]
at org.springframework.data.jpa.repository.support.JpaMetamodelEntityInformation.isNew(JpaMetamodelEntityInformation.java:231) ~[spring-data-jpa-2.1.10.RELEASE.jar:2.1.10.RELEASE]
at org.springframework.data.jpa.repository.support.SimpleJpaRepository.save(SimpleJpaRepository.java:534) ~[spring-data-jpa-2.1.10.RELEASE.jar:2.1.10.RELEASE]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.8.0_191]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:1.8.0_191]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_191]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_191]
at org.springframework.data.repository.core.support.RepositoryComposition$RepositoryFragments.invoke(RepositoryComposition.java:359) ~[spring-data-commons-2.1.10.RELEASE.jar:2.1.10.RELEASE]
One way to resolve this issue is to save the contacts using contactRepository.saveAll(contacts). CrudRepository.save expects a single entity, so passing the whole chunk makes Spring Data try to persist the List itself as an entity, which is why it fails looking for an 'Id' property on the list; saveAll accepts an Iterable.
@Override
public void write(List<? extends Contact> contacts) throws Exception {
    contactRepository.saveAll(contacts);
}
With this change, I got the expected output.
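For completeness, a sketch of the corrected writer class with that change applied (otherwise identical to the DBWriter from the question):
```
@Component
public class DBWriter implements ItemWriter<Contact> {

    @Autowired
    private ContactRepository contactRepository;

    @Override
    public void write(List<? extends Contact> contacts) throws Exception {
        // saveAll takes an Iterable; save expects a single entity
        contactRepository.saveAll(contacts);
    }
}
```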

AWS SqsListener deserialize Custom Object with Jackson

I can only listen to strings; once I try to receive a custom object, it throws the following error. It seems that I need to teach Spring how to handle my custom object (B2BOrder).
org.springframework.messaging.converter.MessageConversionException: Cannot convert from [java.lang.String] to [br.com.b2breservas.api.model.B2BOrder] for GenericMessage [payload={"comments":"95d29059-8552-42fa-8fd9-a1d776416269"},
My SQSConfig
@Configuration
@EnableSqs
public class SqsConfig {

    private static final String DEFAULT_THREAD_NAME_PREFIX = ClassUtils.getShortName(SimpleMessageListenerContainer.class) + "-";

    @Bean
    public QueueMessagingTemplate myMessagingTemplate(AmazonSQSAsync amazonSqs, ResourceIdResolver resolver) {
        ObjectMapper mapper = new ObjectMapper()
                .registerModule(new ParameterNamesModule())
                .registerModule(new Jdk8Module())
                .registerModule(new JodaModule())
                .registerModule(new JavaTimeModule());
        // configure the Jackson mapper as needed
        // maybe I need to do something here!
        MappingJackson2MessageConverter converter = new MappingJackson2MessageConverter();
        converter.setSerializedPayloadClass(String.class);
        converter.setStrictContentTypeMatch(false);
        converter.setObjectMapper(mapper);
        return new QueueMessagingTemplate(amazonSqs, resolver, converter);
    }

    @Bean
    public ClientConfiguration sqsClientConfiguration() {
        return new ClientConfiguration()
                .withConnectionTimeout(30000)
                .withRequestTimeout(30000)
                .withClientExecutionTimeout(30000);
    }

    @Bean
    public ExecutorFactory sqsExecutorFactory() {
        return new ExecutorFactory() {
            @Override
            public ExecutorService newExecutor() {
                return new ThreadPoolExecutor(2, 2, 0L, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<Runnable>());
            }
        };
    }

    @Value("${b2b.b2b.accesstoken}")
    public String accesstoken;

    @Value("${b2b.b2b.secretkey}")
    public String secretkey;

    @Bean
    public AmazonSQSAsync amazonSqs(ClientConfiguration sqsClientConfiguration, ExecutorFactory sqsExecutorFactory) {
        BasicAWSCredentials credential = new BasicAWSCredentials(accesstoken, secretkey);
        return AmazonSQSAsyncClientBuilder.standard()
                .withClientConfiguration(sqsClientConfiguration)
                .withExecutorFactory(sqsExecutorFactory)
                // .withEndpointConfiguration(sqsEndpointConfiguration)
                // .withCredentials(credentialsProvider)
                .withCredentials(new AWSStaticCredentialsProvider(credential))
                .build();
    }

    @Bean
    public AsyncTaskExecutor queueContainerTaskEecutor() {
        ThreadPoolTaskExecutor threadPoolTaskExecutor = new ThreadPoolTaskExecutor();
        threadPoolTaskExecutor.setThreadNamePrefix(DEFAULT_THREAD_NAME_PREFIX);
        threadPoolTaskExecutor.setCorePoolSize(2);
        threadPoolTaskExecutor.setMaxPoolSize(2);
        // No thread pool executor queue, to avoid retaining messages too long in memory
        threadPoolTaskExecutor.setQueueCapacity(0);
        threadPoolTaskExecutor.afterPropertiesSet();
        return threadPoolTaskExecutor;
    }

    @Bean
    public SimpleMessageListenerContainerFactory simpleMessageListenerContainerFactory(AmazonSQSAsync amazonSqs, AsyncTaskExecutor queueContainerTaskEecutor) {
        SimpleMessageListenerContainerFactory factory = new SimpleMessageListenerContainerFactory();
        factory.setAmazonSqs(amazonSqs);
        factory.setAutoStartup(true);
        // factory.setQueueMessageHandler();
        factory.setMaxNumberOfMessages(1);
        factory.setWaitTimeOut(20);
        factory.setTaskExecutor(queueContainerTaskEecutor);
        return factory;
    }
}
Listener
@Component
public class SqsHub {

    @SqsListener(
            "https://sqs.us-west-2.amazonaws.com/3234/32443-checkout.fifo"
    )
    public void listen(B2BOrder message) {
        // public void listen(String message) { THIS WORKS!!
        System.out.println("!!!! received message {} {}" + message.toString());
    }
}
Sending
....
@Autowired
AmazonSQSAsync amazonSqs;

@GetMapping("/yay")
public String yay() {
    try {
        B2BOrder pendingOrder = new B2BOrder();
        pendingOrder.setComments(UUID.randomUUID().toString());
        String pendingOrderJson = objectMapper.writeValueAsString(pendingOrder);
        QueueMessagingTemplate queueMessagingTemplate = new QueueMessagingTemplate(amazonSqs);
        Map<String, Object> headers = new HashMap<>();
        headers.put(SqsMessageHeaders.SQS_GROUP_ID_HEADER, "my-application");
        headers.put(SqsMessageHeaders.SQS_DEDUPLICATION_ID_HEADER, UUID.randomUUID().toString());
        queueMessagingTemplate.convertAndSend("booking-checkout.fifo", pendingOrderJson, headers);
    } catch (final AmazonClientException | JsonProcessingException ase) {
        System.out.println("Error Message: " + ase.getMessage());
    }
    return "sdkjfn";
}
....
Simple Custom Object
public class B2BOrder implements Serializable {

    @JsonProperty
    private String comments;
}
UPDATE
@Michiel's answer took me here, but I still got the same error.
@Autowired
public ObjectMapper objectMapper;

@Bean
public SimpleMessageListenerContainerFactory simpleMessageListenerContainerFactory(AmazonSQSAsync amazonSqs, AsyncTaskExecutor queueContainerTaskEecutor) {
    SimpleMessageListenerContainerFactory factory = new SimpleMessageListenerContainerFactory();
    factory.setAmazonSqs(amazonSqs);
    factory.setAutoStartup(true);
    QueueMessageHandlerFactory queueMessageHandlerFactory = new QueueMessageHandlerFactory();
    queueMessageHandlerFactory.setAmazonSqs(amazonSqs);
    MappingJackson2MessageConverter jsonMessageConverter = new MappingJackson2MessageConverter();
    jsonMessageConverter.setObjectMapper(objectMapper);
    queueMessageHandlerFactory.setMessageConverters(Collections.singletonList(jsonMessageConverter));
    factory.setQueueMessageHandler(queueMessageHandlerFactory.createQueueMessageHandler());
    // factory.setMaxNumberOfMessages(1);
    factory.setWaitTimeOut(20);
    factory.setTaskExecutor(queueContainerTaskEecutor);
    return factory;
}
Although you have registered a MessageConverter, it is only configured to be used for outgoing requests (via the QueueMessagingTemplate). Your message listener does not have a MessageConverter configured. Therefore, incoming messages can only be retrieved as a 'raw' type such as String.
In your snippet you have commented out the following line of code:
// factory.setQueueMessageHandler();
This is where you could set a QueueMessageHandler that itself has one or more MessageConverters attached.
[edit]
Sure:
QueueMessageHandlerFactory handlerFactory = new QueueMessageHandlerFactory();
handlerFactory.setMessageConverters(yourJacksonConfig);
QueueMessageHandler messageHandler = handlerFactory.createQueueMessageHandler();
factory.setQueueMessageHandler(messageHandler);
This Spring documentation might be of help.
I'm using a SimpleMessageListenerContainer which is defined in a bean as follows:
@Bean
@Primary
fun simpleMessageListenerContainerFactory(amazonSQS: AmazonSQSAsync): SimpleMessageListenerContainerFactory =
    SimpleMessageListenerContainerFactory().apply {
        setAmazonSqs(amazonSQS)
        setMaxNumberOfMessages(10)
        setAutoStartup(false)
    }
(by the way, I'm using Kotlin)
This bean is created outside my project, so in order to inject a new converter into the SimpleMessageListenerContainerFactory I've done as @Michiel suggested:
@Bean
fun Handler(): QueueMessageHandlerFactory {
    val handlerFactory = QueueMessageHandlerFactory()
    handlerFactory.messageConverters = listOf(jsonMessageConverter())
    return handlerFactory
}
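For readers following the rest of the thread in Java, a rough equivalent of that standalone bean might look like this (a sketch modelled on the snippets above; the ObjectMapper wiring is an assumption, so adjust it to your own Jackson configuration):
```
@Bean
public QueueMessageHandlerFactory queueMessageHandlerFactory() {
    MappingJackson2MessageConverter jsonMessageConverter = new MappingJackson2MessageConverter();
    // SQS messages often arrive without a contentType header, so don't match strictly
    jsonMessageConverter.setStrictContentTypeMatch(false);
    jsonMessageConverter.setSerializedPayloadClass(String.class);
    jsonMessageConverter.setObjectMapper(new ObjectMapper()); // assumption: a default mapper
    QueueMessageHandlerFactory handlerFactory = new QueueMessageHandlerFactory();
    handlerFactory.setMessageConverters(Collections.singletonList(jsonMessageConverter));
    return handlerFactory;
}
```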

Spring Boot Batch + Spring Data JPA

I am using Spring Boot + Spring Batch + JPA + MySQL. Below are the steps I am following:
Spring Boot application:
```
@SpringBootApplication(scanBasePackages = "com.xxx.xxxx.config, com.xxx.xxx.rest, com.xxx.xxx.domain, com.xxx.xxx.dataservice, com.xxx.xxx.batchconfiguration, com.xxx.xxx.steps")
@EnableAutoConfiguration
@EnableConfigurationProperties(Properties.class)
@EnableEncryptableProperties
public class SampleApplication {

    public static void main(String[] args) {
        SpringApplication.run(SampleApplication.class, args);
    }
}
```
Created @Entity classes based on the table structure.
Created a repository interface like below:
```
@Component
public interface ExampleRepository extends JpaRepository<tableClass, Long> {

    Page<tableClass> findTop10ByStatus(tableClassStatus status,
                                       Pageable pageable);
}
```
Batch configuration:
```
@Configuration
@EnableScheduling
public class BatchScheduler {

    @Bean
    public ResourcelessTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public JobExplorer jobExplorer() throws Exception {
        MapJobExplorerFactoryBean jobExplorerFactory = new MapJobExplorerFactoryBean(
                mapJobRepositoryFactory());
        jobExplorerFactory.afterPropertiesSet();
        return jobExplorerFactory.getObject();
    }

    @Bean
    public MapJobRepositoryFactoryBean mapJobRepositoryFactory() throws Exception {
        MapJobRepositoryFactoryBean factory = new MapJobRepositoryFactoryBean();
        factory.setTransactionManager(transactionManager());
        return factory;
    }

    @Bean
    public JobRepository jobRepository() throws Exception {
        return mapJobRepositoryFactory().getObject();
    }

    @Bean
    public JobLauncher jobLauncher() throws Exception {
        SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
        simpleJobLauncher.setJobRepository(jobRepository());
        return simpleJobLauncher;
    }
}
```
Batch configuration:
```
@Configuration
@EnableBatchProcessing
@Import({ BatchScheduler.class })
public class BatchConfiguration2 {

    @Autowired
    public StepBuilderFactory stepBuilderFactory;
    @Autowired
    public JobBuilderFactory jobBuilderFactory;
    @Autowired
    private SimpleJobLauncher jobLauncher;
    @Autowired
    public ExampleRepository exampleRepository;

    @Scheduled(cron = "0/5 * * * * ?")
    public void perform() throws Exception {
        System.out.println("Job Started at :" + new Date());
        JobParameters param = new JobParametersBuilder()
                .addString("JobID", String.valueOf(System.currentTimeMillis())).toJobParameters();
        JobExecution execution = jobLauncher.run(job(), param);
        System.out.println("Job finished with status :" + execution.getStatus());
    }

    @Bean
    public RepositoryMetadata repositoryMetadata() {
        return new DefaultRepositoryMetadata(ExampleRepository.class);
    }

    @Bean
    public RepositoryItemReader<tableClass> reader() {
        RepositoryItemReader<tableClass> fullfillment = new RepositoryItemReader<tableClass>();
        fullfillment.setRepository(exampleRepository);
        fullfillment.setMethodName("findTop10ByStatus");
        List<tableClassStatus> list = new ArrayList<tableClassStatus>();
        list.add(tableClassStatus.FULFILLMENT_READY);
        fullfillment.setArguments(list);
        HashMap<String, Sort.Direction> sorts = new HashMap<>();
        sorts.put("id", Direction.DESC);
        fullfillment.setSort(sorts);
        return fullfillment;
    }

    /* @Bean
    public RepositoryItemWriter<tableClass> writer() {
        System.out.println("BatchConfiguration.writer()");
        RepositoryItemWriter<tableClass> itemWriter = new RepositoryItemWriter<tableClass>();
        itemWriter.setRepository(exampleRepository);
        itemWriter.setMethodName("save");
        return itemWriter;
    } */

    @Bean
    public RepositoryItemWriter<tableClass> writer() {
        System.out.println("BatchConfiguration.writer()");
        DefaultCrudMethods defaultCrudMethods = new DefaultCrudMethods(repositoryMetadata());
        RepositoryItemWriter<tableClass> itemWriter = new RepositoryItemWriter<tableClass>();
        itemWriter.setRepository(exampleRepository);
        itemWriter.setMethodName(defaultCrudMethods.getSaveMethod().getName());
        return itemWriter;
    }

    @Bean
    public Step step1() throws Exception {
        return this.stepBuilderFactory.get("step1")
                .<tableClass, tableClass>chunk(1).reader(reader())
                .processor(new QuoteOfferFullfillmentSubmitionProcessor()).writer(writer()).build();
    }

    @Bean
    public Job job() throws Exception {
        return this.jobBuilderFactory.get("job").incrementer(new RunIdIncrementer()).start(step1())
                .listener(new JobCompletionListener()).build();
    }
}
```
QuoteOfferFullfillmentSubmitionProcessor.java
```
public class QuoteOfferFullfillmentSubmitionProcessor implements ItemProcessor<tableClass, tableClass> {

    @Override
    public tableClass process(tableClass item) throws Exception {
        System.out.println("Processor.process() ==> ID " + item.getId());
        System.out.println("Processor.process() ==> " + item.getLenderName());
        System.out.println(
                "QuoteOfferFullfillmentSubmitionProcessor.process() ==> source" + item.getStatus());
        item.setStatus(tableClassStatus.PROCESSING);
        return item;
    }
}
```
JobCompletionListener.java
```
public class JobCompletionListener extends JobExecutionListenerSupport {

    @Override
    public void afterJob(JobExecution jobExecution) {
        if (jobExecution.getStatus() == BatchStatus.COMPLETED) {
            System.out.println("BATCH JOB COMPLETED SUCCESSFULLY");
        }
    }
}
```
After all this configuration, the job runs fine every 5 seconds: it fetches the values and prints them in the processor class, then reaches the writer class and prints all the values without errors.
The problem is: it does not update the values in the database in the write method.
Without BatchScheduler, the scheduler and the job launcher (commenting that code out), and after restarting Spring Boot, it updates all the values in the database.
Can you guys suggest what I am doing wrong?
I have resolved the issue. In the batch configuration, if I use
```
@Bean
public ResourcelessTransactionManager resourcelessTransactionManager() {
    return new ResourcelessTransactionManager();
}

@Bean
public JobRepository jobRepository() throws Exception {
    return new MapJobRepositoryFactoryBean(resourcelessTransactionManager()).getObject();
}
```
it started working. (Presumably the bean previously named transactionManager was shadowing the JPA transaction manager that commits the writer's updates; with the rename, the JPA transaction manager is used for the step again.)
Thanks,
Bala.
