File upload as a parallel process in Spring Batch with Integration - Java

I am trying to upload multiple files to an SFTP server using Spring Batch with Spring Integration. The files are currently uploaded in parallel using Futures submitted to a ThreadPoolTaskExecutor. Instead, I want to execute the tasklet in parallel through the Spring Batch configuration itself [not the way I do it now with future tasks], and if a file upload fails, I want to retry the upload for a certain interval.
@Autowired
UploadGateway gateway;

@Bean
public Job importDataJob() {
    return jobBuilderFactory.get(FILE_UPLOAD_JOB_NAME)
            .listener(jobExecutionListener(threadPoolTaskExecutor()))
            .incrementer(new RunIdIncrementer())
            .flow(uploadFiles())
            .end()
            .build();
}

@Bean
public Step uploadFiles() {
    return stepBuilderFactory.get(UPLOAD_FILE_STEP_NAME).tasklet(new Tasklet() {
        @Override
        public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
            log.info("Upload tasklet start executing..");
            resources = resourcePatternResolver.getResources(quantumRuntimeProperties.getInputFilePaths());
            for (Resource anInputResource : resources) {
                log.info("Incoming file <{}> to upload....", anInputResource.getFilename());
                Future<?> submit = threadPoolTaskExecutor().submit(new Callable<Boolean>() {
                    @Override
                    public Boolean call() throws Exception {
                        try {
                            gateway.upload(anInputResource.getFilename());
                        } catch (Exception e) {
                            // upload failures are currently swallowed; this is where a retry should go
                        }
                        return true;
                    }
                });
                resultList.add(submit);
            }
            return RepeatStatus.FINISHED;
        }
    }).build();
}

@Bean
public JobExecutionListener jobExecutionListener(ThreadPoolTaskExecutor executor) {
    return new JobExecutionListener() {
        private ThreadPoolTaskExecutor taskExecutor = executor;

        @Override
        public void beforeJob(JobExecution jobExecution) {
            // DO-NOTHING
        }

        @Override
        public void afterJob(JobExecution jobExecution) {
            for (Future<?> future : resultList) {
                try {
                    // Wait for all the file uploads to complete
                    future.get();
                } catch (InterruptedException | ExecutionException e) {
                    log.error("Error occurred while waiting for all files to get uploaded...");
                }
            }
            taskExecutor.shutdown();
        }
    };
}

I would suggest you take a look at the ChunkMessageChannelItemWriter and RemoteChunkHandlerFactoryBean - the integration of Spring Batch with Spring Integration. See the Reference Manual for more information.
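For the parallel execution and retry requirements specifically, a chunk-oriented step can provide both through the step builder, without hand-rolled Futures. A minimal sketch, assuming a reader bean that emits one Resource per item and reusing the gateway from the question (the reader bean and the retry settings here are illustrative, not from the original post):

@Bean
public Step uploadFilesStep(ItemReader<Resource> resourceReader) {
    return stepBuilderFactory.get("uploadFilesStep")
            .<Resource, Resource>chunk(1)
            .reader(resourceReader)                  // emits one Resource per item
            .writer(items -> {
                for (Resource resource : items) {
                    gateway.upload(resource.getFilename());
                }
            })
            .faultTolerant()
            .retry(Exception.class)                  // retry a failed upload...
            .retryLimit(3)                           // ...up to 3 times per item
            .backOffPolicy(backOffPolicy())          // wait between attempts
            .taskExecutor(threadPoolTaskExecutor())  // process chunks in parallel
            .build();
}

private BackOffPolicy backOffPolicy() {
    FixedBackOffPolicy policy = new FixedBackOffPolicy();
    policy.setBackOffPeriod(5000); // 5 seconds between retry attempts
    return policy;
}

With this arrangement the step itself waits for all items to finish, so the afterJob bookkeeping with resultList is no longer needed.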

Related

How to execute an HTTP request in a separate thread and return the response

I have a service method that does a considerable amount of looping, executing an HTTP request on each iteration.
So my intention is to run this service on a separate thread. How can I correctly achieve this using ExecutorService?
private ExecutorService executorService;

@PostConstruct
public void init() {
    executorService = Executors.newWorkStealingPool();
}

@Override
public ResponseDto getSummary() throws ExecutionException, InterruptedException {
    Callable<ResponseDto> task = () -> {
        try {
            ResponseDto response = executeRequests();
            log.debug("Fetching summary details completed");
            return response;
        } catch (Exception e) {
            log.error("Exception occurred", e);
            throw e;
        }
    };
    Future<ResponseDto> future = executorService.submit(task);
    return future.get();
}

private ResponseDto executeRequests() {
    // executing HTTP requests inside loops and returning ResponseDto
}
So up to now I have implemented it like above. I'm not sure about how to return the response when the work is done. Any suggestions?
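If the goal is to hand back the result without blocking the calling thread, one option is to return a CompletableFuture instead. A sketch under the same assumptions as the code above, i.e. that executeRequests() produces the ResponseDto:

@Override
public CompletableFuture<ResponseDto> getSummary() {
    // Run the HTTP loop on the work-stealing pool; the future completes
    // with the ResponseDto, or exceptionally if executeRequests() throws.
    return CompletableFuture.supplyAsync(this::executeRequests, executorService);
}

A Spring MVC controller can return this CompletableFuture directly, and the container completes the HTTP response asynchronously when the work is done.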

spring cloud aws multiple sqs listener

There are 2 SQS listeners in my project. I want one of them to keep the shared settings and the other to use a different setting. The only value I want to change is maxNumberOfMessages.
What is the most practical way to do this? I want to set a different maxNumberOfMessages value for one of the listeners.
This is my config:
@Bean
public AWSCredentialsProvider awsCredentialsProvider(@Value("${cloud.aws.profile}") String profile,
        @Value("${cloud.aws.region.static}") String region,
        @Value("${cloud.aws.roleArn}") String role,
        @Value("${cloud.aws.user}") String user) {
    ...
    return new AWSStaticCredentialsProvider(sessionCredentials);
}

@Bean
@Primary
@Qualifier("amazonSQSAsync")
public AmazonSQSAsync amazonSQSAsync(@Value("${cloud.aws.region.static}") String region, AWSCredentialsProvider awsCredentialsProvider) {
    return AmazonSQSAsyncClientBuilder.standard()
            .withCredentials(awsCredentialsProvider)
            .withRegion(region)
            .build();
}

@Bean
@Primary
public SimpleMessageListenerContainerFactory simpleMessageListenerContainerFactory(AmazonSQSAsync amazonSqs) {
    SimpleMessageListenerContainerFactory factory = new SimpleMessageListenerContainerFactory();
    factory.setAmazonSqs(amazonSqs);
    factory.setMaxNumberOfMessages(1);
    factory.setWaitTimeOut(10);
    factory.setQueueMessageHandler(new SqsQueueMessageHandler());
    return factory;
}
This is the listener:
@SqsListener(value = "${messaging.queue.blabla.source}", deletionPolicy = SqsMessageDeletionPolicy.NEVER)
public void listen(Message message, Acknowledgment acknowledgment, @Header("MessageId") String messageId) {
    log.info("Message Received");
    try {
        ....
        acknowledgment.acknowledge().get();
    } catch (InterruptedException | ExecutionException e) {
        e.printStackTrace();
    } catch (Exception ex) {
        throw new RuntimeException(ex.getMessage());
    }
}
The following hack worked for me (if each listener listens to a different queue):
@Bean
public SimpleMessageListenerContainerFactory simpleMessageListenerContainerFactory(AmazonSQSAsync amazonSqs) {
    return new SimpleMessageListenerContainerFactory() {
        @Override
        public SimpleMessageListenerContainer createSimpleMessageListenerContainer() {
            SimpleMessageListenerContainer simpleMessageListenerContainer = new SimpleMessageListenerContainer() {
                @Override
                protected void startQueue(String queueName, QueueAttributes queueAttributes) {
                    // A place to configure queue-based maxNumberOfMessages
                    try {
                        if (queueName.endsWith(".fifo")) {
                            FieldUtils.writeField(queueAttributes, "maxNumberOfMessages", 1, true);
                        }
                    } catch (IllegalAccessException e) {
                        throw new RuntimeException(e);
                    }
                    super.startQueue(queueName, queueAttributes);
                }
            };
            simpleMessageListenerContainer.setAmazonSqs(amazonSqs);
            return simpleMessageListenerContainer;
        }
    };
}
I found the solution and shared it in an example repo on GitHub:
github link
If I add the @EnableAsync annotation on the listener class and the @Async annotation to the handler method, my problem is solved :)
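A minimal sketch of that arrangement (class and property names are illustrative; note that @Async only kicks in when the handler lives in a separate bean, since self-invocation bypasses the Spring proxy):

@EnableAsync
@Component
public class QueueListener {

    private final MessageProcessor processor;

    public QueueListener(MessageProcessor processor) {
        this.processor = processor;
    }

    @SqsListener("${messaging.queue.blabla.source}")
    public void listen(Message message) {
        // returns immediately; processing continues on the async executor
        processor.handle(message);
    }
}

@Component
class MessageProcessor {

    @Async
    public void handle(Message message) {
        // long-running work runs on Spring's async task executor
    }
}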
Unfortunately, the solution from Sushant didn't compile for me in Kotlin (because QueueAttributes is a protected static class), but I used it to write the following:
@Bean
fun simpleMessageListenerContainerFactory(sqs: AmazonSQSAsync): SimpleMessageListenerContainerFactory =
    object : SimpleMessageListenerContainerFactory() {
        override fun createSimpleMessageListenerContainer(): SimpleMessageListenerContainer {
            val container = object : SimpleMessageListenerContainer() {
                override fun afterPropertiesSet() {
                    super.afterPropertiesSet()
                    registeredQueues.forEach { (queue, attributes) ->
                        if (queue.contains(QUEUE_NAME)) {
                            FieldUtils.writeField(
                                attributes,
                                "maxNumberOfMessages",
                                NEW_MAX_NUMBER_OF_MESSAGES,
                                true
                            )
                        }
                    }
                }
            }
            container.setWaitTimeOut(waitTimeOut)
            container.setMaxNumberOfMessages(maxNumberOfMessages)
            container.setAmazonSqs(sqs)
            return container
        }
    }

Spring batch step won't stop by itself after finishing

I am trying to upload multiple files to an SFTP server using Spring Batch and Spring Integration. For that I am using a ThreadPoolTaskExecutor for parallel processing: each file upload is executed as its own task.
But even when all the files have been uploaded to the SFTP server successfully, the process does not stop; the program stays in a running state. This happens even though I override JobExecutionListener:
@Bean
public JobExecutionListener jobExecutionListener(ThreadPoolTaskExecutor executor) {
    return new JobExecutionListener() {
        private ThreadPoolTaskExecutor taskExecutor = executor;

        @Override
        public void beforeJob(JobExecution jobExecution) {
        }

        @Override
        public void afterJob(JobExecution jobExecution) {
            taskExecutor.shutdown();
        }
    };
}

@Bean
public ThreadPoolTaskExecutor threadPoolTaskExecutor() {
    ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
    executor.initialize(); // note: called before the pool sizes below are applied
    executor.setCorePoolSize(10);
    executor.setMaxPoolSize(10);
    executor.setThreadNamePrefix("quantum-runtime-worker-thread");
    executor.setWaitForTasksToCompleteOnShutdown(true);
    return executor;
}
@Bean
public Step uploadFiles() {
    return stepBuilderFactory.get(UPLOAD_FILE_STEP_NAME).tasklet(new Tasklet() {
        @Override
        public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
            log.info("Upload tasklet start executing..");
            resources = resourcePatternResolver.getResources(inputFilesPath);
            for (Resource anInputResource : resources) {
                log.info("Incoming file <{}> to upload....", anInputResource.getFilename());
                threadPoolTaskExecutor().execute(new Runnable() {
                    @Override
                    public void run() {
                        File zippedFile = null; // zipping of anInputResource elided in the original post
                        try {
                            log.info("Uploading file : {}", anInputResource.getFilename());
                            gateway.upload(zippedFile);
                            log.info("{} file uploaded : {}", anInputResource.getFilename(), zippedFile.delete());
                        } catch (Exception e) {
                            log.error("Error occurred while uploading a file : {} and the exception is {}",
                                    anInputResource.getFilename(), e);
                        }
                    }
                });
            }
            System.out.println("=============POINTER NOT COMMING HERE================");
            return RepeatStatus.FINISHED;
        }
    }).build();
}
@Bean
@ServiceActivator(inputChannel = SFTP_CHANNEL_NAME)
public MessageHandler handler() {
    SftpMessageHandler handler = new SftpMessageHandler(sftpSessionFactory());
    handler.setRemoteDirectoryExpression(new LiteralExpression("/"));
    handler.setFileNameGenerator(new FileNameGenerator() {
        @Override
        public String generateFileName(Message<?> message) {
            if (message.getPayload() instanceof File) {
                return ((File) message.getPayload()).getName();
            } else {
                throw new IllegalArgumentException("File expected as payload.");
            }
        }
    });
    return handler;
}

@MessagingGateway
@Component
public interface UploadGateway {
    @Gateway(requestChannel = SFTP_CHANNEL_NAME)
    void upload(File file);
}
Now, if I close the context, the program stops successfully:
ConfigurableApplicationContext context = new SpringApplicationBuilder(QuantumFileUploadApplication.class).web(false).run(args);
context.close(); // it works
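A slightly cleaner variant of the same idea, assuming Spring Boot: SpringApplication.exit closes the context and maps any registered ExitCodeGenerator beans (for example one derived from the batch job's exit status) to the process exit code:

ConfigurableApplicationContext context = new SpringApplicationBuilder(QuantumFileUploadApplication.class)
        .web(false)
        .run(args);
// close the context and turn the result into a JVM exit code
System.exit(SpringApplication.exit(context));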

Spring boot async controller

I have this post method:
@PostMapping("/upload")
public String singleFileUpload(@RequestParam("file") MultipartFile file, RedirectAttributes redirectAttributes)
        throws IOException, InterruptedException, ExecutionException {
    ExecutorService service = Executors.newSingleThreadExecutor();
    Future<String> future = service.submit(new Callable<String>() {
        @Override
        public String call() throws Exception {
            // parse file
            Thread.sleep(5000);
            return "done";
        }
    });
    String result = future.get();
    service.shutdown();
    return "redirect:uploadState";
}
I want to redirect to uploadState while the executor parses my file; uploadState has a long-polling AJAX call to be notified whether parsing is done. Can you help me with some hints?
Here future.get() is the blocker.
You can use Java 8's CompletableFuture:
CompletableFuture.supplyAsync(() -> {
    try {
        Thread.sleep(5000); // parse file
        return "done";
    } catch (Exception ex) {
        return "failed";
    }
}).thenApply(res -> {
    // runs when parsing completes, without blocking the request thread
    return res;
});
It will not block your controller.
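To tie this back to the question, a sketch of how the controller could kick off parsing and expose a status endpoint for the long-polling AJAX (the jobs map, jobId parameter, and URLs are illustrative, not from the original post):

private final Map<String, CompletableFuture<String>> jobs = new ConcurrentHashMap<>();

@PostMapping("/upload")
public String singleFileUpload(@RequestParam("file") MultipartFile file,
                               RedirectAttributes redirectAttributes) {
    String jobId = UUID.randomUUID().toString();
    jobs.put(jobId, CompletableFuture.supplyAsync(() -> {
        // parse file here
        return "done";
    }));
    redirectAttributes.addAttribute("jobId", jobId);
    return "redirect:uploadState"; // returns immediately; parsing continues in the background
}

@GetMapping("/uploadState/status")
@ResponseBody
public String status(@RequestParam String jobId) {
    CompletableFuture<String> job = jobs.get(jobId);
    return (job != null && job.isDone()) ? "done" : "pending";
}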

How to parse the job status in case of any exception

Hi, I am facing a problem with the job status in Spring Batch [I am new to spring-batch]. Even though an exception occurs in my ItemReader, the status shows COMPLETED.
This is my job config:
@Bean
public Job customerJob() {
    return jobBuilderFactory.get("custJob")
            .listener(jobListner)
            .start(custStep())
            .build();
}

private Step custStep() {
    return stepBuilderFactory.get("custStep1")
            .<Customer, Customer>chunk(10)
            .reader(customerItemReader)
            .processor(customerItemProcessor)
            .writer(customerItemWriter)
            .allowStartIfComplete(true)
            .build();
}
The ItemReader class:
@Override
public Customer read() throws Exception, UnexpectedInputException, ParseException,
        NonTransientResourceException {
    if (customerList != null && customerList.size() >= 1) {
        Customer customer = customerList.remove(0);
        System.out.println("## ID=" + customer.getCustNumber());
        return customer;
    }
    return null;
}

// Method to fetch the list [from the ItemStream interface]
@Override
public void open(ExecutionContext executionContext) throws ItemStreamException {
    System.out.println("reading customer list");
    try {
        customerList = customerService.getCustomers();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
Here the getCustomers() method in the line customerList = customerService.getCustomers() sometimes throws a java.net.SocketException (if there is some problem with the server). In this case I expect my job status to be FAILED, but it shows "COMPLETED".
@Override
public void afterJob(JobExecution jobExecution) {
    System.out.println("exit status =" + jobExecution.getExitStatus().getExitCode());
    System.out.println(" status =" + jobExecution.getStatus().name());
}
java.net.SocketException: Connection reset
    at java.net.SocketInputStream.read(SocketInputStream.java:196)
    at java.net.SocketInputStream.read(SocketInputStream.java:122)
    // lines omitted
exit status =COMPLETED
 status =COMPLETED
How can I make the status show "FAILED" in such a case?
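One likely cause, judging from the snippet above: open() catches the IOException and only prints it, so the step never sees a failure, and the reader, left with an empty list, simply ends the job as COMPLETED. Rethrowing from open() as an ItemStreamException should make the step, and hence the job, fail. A sketch:

@Override
public void open(ExecutionContext executionContext) throws ItemStreamException {
    System.out.println("reading customer list");
    try {
        customerList = customerService.getCustomers();
    } catch (IOException e) {
        // Propagate instead of swallowing: the step fails and the job status becomes FAILED
        throw new ItemStreamException("Failed to fetch customers", e);
    }
}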
