RabbitMQ wait after processing message - Java

I have the following Rabbit listener:
@Slf4j
@Service
@RequiredArgsConstructor
@RabbitListener(queues = "${spring.rabbitmq.template.default-receive-queue}")
public class RabbitmqListener {

    private final Processor processor;

    @RabbitHandler(isDefault = true)
    public void receiveMessage(List<String> someData) {
        log.info("Received {} some data", someData.size());
        processor.process(someData);
        // should wait for 15 minutes here
    }
}
I need to configure the listener to wait for 15 minutes after it has processed one message before receiving the next one. It is not necessary to wait inside this method; all I need is NOT to receive any messages after processing one. It could be done with Thread.sleep(15000), but I'm not sure that is the best way to achieve this. Is there any RabbitMQ configuration for this kind of situation?

Thread.sleep(15000) will wait for 15 seconds, not 15 minutes.
It's probably not a good idea to sleep for 15 minutes; if you only need to sleep for 15 seconds, a sleep is probably OK (but you do risk a redelivery if the server crashes while you are sleeping).
You might want to consider using RabbitTemplate.receiveAndConvert() instead for this use case, rather than using a message-driven architecture.
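For reference, a minimal sketch of that polling approach, assuming the same Processor bean as above, that @EnableScheduling is active somewhere in the application, and that each message body converts to a List<String> as in the original listener; the class and field names here are illustrative, not from the question:
import java.util.List;

import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Service;

import lombok.RequiredArgsConstructor;
import lombok.extern.slf4j.Slf4j;

@Slf4j
@Service
@RequiredArgsConstructor
public class RabbitmqPoller {

    private final RabbitTemplate rabbitTemplate;
    private final Processor processor;

    @Value("${spring.rabbitmq.template.default-receive-queue}")
    private String queue;

    // fixedDelay is measured from the end of one run to the start of the next,
    // so there is always at least a 15-minute pause after processing a message.
    @Scheduled(fixedDelay = 15 * 60 * 1000)
    public void poll() {
        @SuppressWarnings("unchecked")
        List<String> someData = (List<String>) rabbitTemplate.receiveAndConvert(queue);
        if (someData == null) {
            return; // queue was empty, try again in 15 minutes
        }
        log.info("Received {} some data", someData.size());
        processor.process(someData);
    }
}
Note that this pulls at most one message per run, so the queue needs to be sized with that throughput in mind.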

Related

Spring @Scheduled with fixed rate running multiple times

I'm using the Spring @Scheduled annotation with a fixed rate in my service.
In each iteration I do some logic and send an HTTP request.
I'm running it every 2 seconds and most of the time it is okay.
I have a log to the console that tells me that a new iteration begins.
Sometimes, that log is printed 4-6 times per second.
I also have a log that tells me how long the iteration took, and it has never exceeded 1 second.
@Service
@EnableScheduling
public class Handler {

    @Scheduled(fixedRate = 2000)
    public void handle() {
        System.out.println("Start iteration");
        // -- logic --
    }
}
I'm using Spring Boot version 2.1.2.RELEASE.
Any ideas?

o.s.w.c.request.async.WebAsyncManager : Could not complete async processing due to timeout or network error

I am getting the timeout error below while calling a Java Spring Boot API service. Code is attached.
o.s.w.c.request.async.WebAsyncManager : Could not complete async processing due to timeout or network error
Also, I want to add concurrency to the service. Please let me know how to do that.
@Configuration
public class WebConfiguration extends WebMvcConfigurerAdapter {

    @Override
    public void configureAsyncSupport(AsyncSupportConfigurer configurer) {
        configurer.setDefaultTimeout(-1);
        configurer.setTaskExecutor(asyncTaskExecutor());
    }

    @Bean
    public AsyncTaskExecutor asyncTaskExecutor() {
        return new SimpleAsyncTaskExecutor("stream-task");
    }
}
In this,
configurer.setDefaultTimeout(-1);
you are actually configuring the timeout incorrectly. The value passed to this method is the amount of time, in milliseconds, before an asynchronous request times out.
You should set it according to your use case, for example 5000 for 5 seconds. Or you can leave it out and Spring will automatically set it to 10 seconds by default.
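Building on that advice, here is a hedged sketch with the 5-second timeout mentioned above and a bounded ThreadPoolTaskExecutor for the concurrency part of the question; the pool sizes and thread-name prefix are placeholders to tune for your load, and WebMvcConfigurer is used in place of the deprecated WebMvcConfigurerAdapter:
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.AsyncTaskExecutor;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;
import org.springframework.web.servlet.config.annotation.AsyncSupportConfigurer;
import org.springframework.web.servlet.config.annotation.WebMvcConfigurer;

@Configuration
public class WebConfiguration implements WebMvcConfigurer {

    @Override
    public void configureAsyncSupport(AsyncSupportConfigurer configurer) {
        configurer.setDefaultTimeout(5000); // 5 seconds, as suggested above; tune to your use case
        configurer.setTaskExecutor(asyncTaskExecutor());
    }

    @Bean
    public AsyncTaskExecutor asyncTaskExecutor() {
        // A bounded pool serves concurrent requests without creating
        // a brand-new thread per task the way SimpleAsyncTaskExecutor does.
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(10);
        executor.setMaxPoolSize(50);
        executor.setQueueCapacity(100);
        executor.setThreadNamePrefix("stream-task-");
        return executor;
    }
}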

Amount of parallel processing with Simple Queue Service (SQS)

I am using Spring Cloud to consume Simple Queue Service (SQS). I have the following configuration for parallel processing:
@Bean
public SimpleAsyncTaskExecutor simpleAsyncTaskExecutor() {
    SimpleAsyncTaskExecutor simpleAsyncTaskExecutor = new SimpleAsyncTaskExecutor();
    simpleAsyncTaskExecutor.setConcurrencyLimit(50);
    return simpleAsyncTaskExecutor;
}

@Bean
public SimpleMessageListenerContainerFactory simpleMessageListenerContainerFactory(
        SimpleAsyncTaskExecutor simpleAsyncTaskExecutor) {
    SimpleMessageListenerContainerFactory factory = new SimpleMessageListenerContainerFactory();
    factory.setAutoStartup(true);
    factory.setTaskExecutor(simpleAsyncTaskExecutor);
    factory.setWaitTimeOut(20);
    factory.setMaxNumberOfMessages(10);
    return factory;
}
I need to process 50 messages in 50 threads (the configuration in the SimpleAsyncTaskExecutor bean), but it is processing only 10 messages in parallel (the maxNumberOfMessages returned from SQS).
How can I process 50 messages instead of 10?
I found the solution.
It's necessary to annotate the method with @Async, change deletionPolicy to NEVER, and delete the message when the execution finishes.
This way, the queue consumer will respect the configured number of threads. For example, if you have 50 threads, it will make 5 requests to the SQS queue (10 messages per request), thus processing a total of 50 messages in parallel.
The code looks like this:
@Async
@SqsListener(value = "sqsName", deletionPolicy = SqsMessageDeletionPolicy.NEVER)
public void consume(String message, Acknowledgment acknowledgment) throws InterruptedException, ExecutionException {
    // your code
    acknowledgment.acknowledge().get(); // to delete the message from the queue
}
I wouldn't get too hung up on specific numbers (like 50 messages for 50 threads). Try performance-testing it instead (build something to push the expected number of messages at peak hours to the queue, and let your service handle them, to see whether it bottlenecks).
As for your actual question, you can't. AWS SQS simply doesn't support fetching more than 10 messages per request. See http://docs.aws.amazon.com/AWSSimpleQueueService/latest/APIReference/API_ReceiveMessage.html for reference (it's in the first paragraph).

Asynchronous event in Spring taking a lot of time to execute (get its turn)

I have an application where I need to trigger an email whenever a REST call is made to an endpoint. The design is that whenever a REST call is invoked, I save the data in the DB, emit an asynchronous event, and return.
My problem is that, due to the huge number of requests that keep coming in, the emitted async events do not get a chance to run for quite a long time. Once the server has been up for a few weeks, the delay keeps increasing.
The scenario:
1. The server endpoint is invoked.
2. The server saves the data to the DB and emits a Spring asynchronous event.
3. The endpoint returns.
The event emitted in step 2 is getting handled late, as the listener is sometimes invoked quite a while afterwards.
public class DataController {

    @Inject
    ApplicationEventPublisher eventPublisher;

    @RequestMapping(value = "data", method = RequestMethod.POST)
    @ResponseStatus(HttpStatus.NO_CONTENT)
    public void addData(@RequestBody DataDTO data) {
        dataService.addData(data);
        eventPublisher.publishEvent(new DataRequest(data));
    }
}
}
public class DataRequest extends ApplicationEvent {

    private DataDTO dataDTO;

    public DataRequest(DataDTO dataDTO) {
        super(dataDTO);
        this.dataDTO = dataDTO;
    }
}
@Component
public class DataListener {

    @EventListener
    @Async
    public void dataListener(DataRequest dataRequest) {
        // Send email
    }
}
Since it is an async event, the JVM gives dataListener a chance to execute very late. Sometimes events triggered earlier get their chance later than ones that were triggered after them.
So there are two fundamental problems:
1. Emails are delayed. The delay can range from 1 minute to 4 hours to 8 days, etc.
2. If an event is triggered at 12 PM to send an email to xyz@gmail.com, and another at 12:15 PM to send an email to abc@gmail.com, there is a chance that abc@gmail.com receives its email before xyz@gmail.com.
Appreciate your help.
Spring asynchronous events are limited by the size of the thread pool, and as soon as incoming requests outnumber the active threads there will be delays.
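To make the thread-pool point concrete, here is a hedged sketch of giving the @Async event listeners their own explicitly sized executor; the bean name, thread-name prefix, and pool sizes are illustrative assumptions, and the listener would reference it as @Async("emailExecutor"). A saturated pool will still delay events, which is why the redesign below is the real fix.
import java.util.concurrent.Executor;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableAsync;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

@Configuration
@EnableAsync
public class AsyncEventConfig {

    @Bean(name = "emailExecutor")
    public Executor emailExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(20);      // threads kept ready for event handling
        executor.setMaxPoolSize(50);       // the pool only grows once the queue below is full
        executor.setQueueCapacity(500);    // events wait here when all threads are busy
        executor.setThreadNamePrefix("email-event-");
        return executor;
    }
}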
You need to use a message queue like RabbitMQ, Kafka, etc. Your architecture should be changed to do the following:
1. Serialize a JSON message in the REST endpoint with all the information (the to-email address, the database entry data, etc.), store that JSON message in the message queue, and return a status code.
2. Have consumers for the message queue (separate Java applications) which poll, or get notified, when there is data in the message queue.
3. These consumers should deserialize the JSON message, save an entry in the database, and send the email.
With this architecture you can add consumers at times of high load and thus scale as required.
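A minimal consumer sketch of that architecture, staying with Spring AMQP/RabbitMQ since the rest of this page uses it; the email-tasks queue name, the toAddress field, and the commented persistence/email steps are illustrative assumptions rather than part of the answer:
import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.stereotype.Component;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

@Component
public class EmailTaskConsumer {

    private final ObjectMapper objectMapper = new ObjectMapper();

    // The REST endpoint publishes the serialized JSON to "email-tasks"
    // instead of firing an in-process ApplicationEvent.
    @RabbitListener(queues = "email-tasks")
    public void consume(String json) throws Exception {
        JsonNode task = objectMapper.readTree(json);
        String toAddress = task.get("toAddress").asText();
        // 1. save the database entry carried in the message
        // 2. send the email to toAddress
    }
}
Because each consumer instance processes messages independently, you can run more instances during peak hours without touching the producer.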

Spring AMQP - Duplicate messages

I am processing a high-volume stream (~500+ msgs per second). The data is consumed off Spring AMQP + RabbitMQ using a SimpleMessageListenerContainer with 10 concurrent consumers. Every 15 minutes I have to do some checks on the DB and reload certain properties for processing; this is done with a Quartz trigger which fires every 15 minutes, stops the SimpleMessageListenerContainer, does the necessary work, and starts the container once again.
Everything works perfectly when the app starts up. When the trigger fires and the container restarts, I see the same message being delivered multiple times, which causes a lot of duplicates. There are no exceptions thrown by the consumers.
The Message Listener
class RoundRobinQueueListener implements MessageListener {

    @Override
    public void onMessage(Message message) {
        // do processing
    }
}
During app startup, set up the parallel consumers and start them:
final SimpleMessageListenerContainer messageListenerContainer = new SimpleMessageListenerContainer(connectionFactory);
RoundRobinQueueListener roundRobinListener = RoundRobinQueueListener.class.newInstance();
messageListenerContainer.setQueueNames(queueName);
messageListenerContainer.setMessageListener(roundRobinListener);
messageListenerContainer.setConcurrentConsumers(10);
messageListenerContainer.setChannelTransacted(true);
The Quartz trigger:
void execute(JobExecutionContext context) throws JobExecutionException {
    messageListenerContainer.stop();
    // Do db task, other processing
    messageListenerContainer.start();
}
Looks like your messages are not acknowledged by the consumer. If you are not using auto-acknowledge mode, you need to acknowledge the message yourself (this can also be configured on the SimpleMessageListenerContainer). Otherwise, the broker presumes that the message was not processed successfully and tries to deliver it again.
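A hedged sketch of the two acknowledgement options, written as a fragment in the same style as the setup code above and assuming Spring AMQP 2.x signatures:
// Option 1: let the container acknowledge automatically after onMessage() returns without throwing.
messageListenerContainer.setAcknowledgeMode(AcknowledgeMode.AUTO);

// Option 2: acknowledge manually, once processing has really finished.
messageListenerContainer.setAcknowledgeMode(AcknowledgeMode.MANUAL);
messageListenerContainer.setMessageListener((ChannelAwareMessageListener) (message, channel) -> {
    // do processing
    channel.basicAck(message.getMessageProperties().getDeliveryTag(), false);
});
With manual acknowledgements, anything still unacknowledged when the Quartz job stops the container is requeued and redelivered after the restart, so processing should be idempotent or acknowledged before stopping.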
