I am currently writing logic to consume messages with RabbitMQ, but consumption takes far longer than expected.
The Queued messages graph in the picture above shows Unacked and Ready messages stacking up.
The message rates below show that the publish rate is fast while the consumer ack rate is far too slow.
I am not sure whether the RabbitMQ configuration I wrote is wrong or whether I forgot a listener setting.
The RabbitMQ messages I receive are callback messages.
Any help would be greatly appreciated.
This is the RabbitMQ configuration and the RabbitListener configuration:
@Configuration
@Profile({ProfileConfig.RABBITMQ})
public class RabbitmqConfig {

    @Value("${rabbitmq.queue.name}")
    private String queueName;

    @Value("${rabbitmq.exchange.name}")
    private String exchangeName;

    @Value("${rabbitmq.routing.key.callback}")
    private String routingKey;

    @Value("${rabbitmq.fetch-count}")
    private Integer fetchCount;

    @Bean
    Queue queue() {
        return new Queue(queueName, true);
    }

    @Bean
    DirectExchange directExchange() {
        return new DirectExchange(exchangeName);
    }

    @Bean
    Binding binding(DirectExchange directExchange, Queue queue) {
        return BindingBuilder.bind(queue).to(directExchange).with(routingKey);
    }

    @Bean
    public RabbitListenerContainerFactory<SimpleMessageListenerContainer> prefetchOneContainerFactory(
            SimpleRabbitListenerContainerFactoryConfigurer configurer, ConnectionFactory factory) {
        SimpleRabbitListenerContainerFactory simpleFactory = new SimpleRabbitListenerContainerFactory();
        configurer.configure(simpleFactory, factory);
        simpleFactory.setPrefetchCount(fetchCount);
        return simpleFactory;
    }
}
@RabbitListener(queues = {"${rabbitmq.queue.name}"}, concurrency = "3", containerFactory = "prefetchOneContainerFactory")
public void receiveMessage(final String message, Channel channel, @Header(AmqpHeaders.DELIVERY_TAG) long tag) {
    try {
        JSONParser parser = new JSONParser();
        JSONObject json = (JSONObject) parser.parse(message);
        String messageType = json.get("messageType").toString();
        log.debug("Receive Queue Key={}, Message = {}", messageType, message);
        AsyncType asyncType = AsyncType.valueOf(messageType);
        executeMessage(asyncType, message);
    } catch (Exception e) {
        traceService.removeTraceId();
        traceService.printErrorLog(log, "Fail to deal receive message.", e, PrintStackPolicy.ALL);
    } finally {
        try {
            channel.basicAck(tag, false);
        } catch (IOException e) {
            traceService.printErrorLog(log, "Fail to send ack to RabbitMQ", e, PrintStackPolicy.ALL);
        }
    }
}
The goal is to consume messages from RabbitMQ faster, but the current consumption rate is far too slow.
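For comparison, a minimal tuning sketch (the values are purely illustrative and assume the bottleneck is the per-message work in executeMessage): raising the consumer concurrency and the prefetch on the container factory lets several messages be handled in parallel instead of one at a time.

@Bean
public RabbitListenerContainerFactory<SimpleMessageListenerContainer> tunedContainerFactory(
        SimpleRabbitListenerContainerFactoryConfigurer configurer, ConnectionFactory factory) {
    SimpleRabbitListenerContainerFactory simpleFactory = new SimpleRabbitListenerContainerFactory();
    configurer.configure(simpleFactory, factory);
    // Illustrative values: more concurrent consumers so slow handlers overlap,
    // and a prefetch above 1 so each consumer is not stalled on broker round trips.
    simpleFactory.setConcurrentConsumers(5);
    simpleFactory.setMaxConcurrentConsumers(10);
    simpleFactory.setPrefetchCount(10);
    return simpleFactory;
}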
I want to create two beans of the PubSubTemplate class to set different message converters.
I have two subscribers: one receives a JSON response and the other receives a String response. To handle these two scenarios, I am creating two PubSubTemplate beans.
Below is my PubSubTemplateConfig.java:
@Configuration
public class PubSubTemplateConfig {

    @Bean
    public PubSubTemplate pubSubTemplateForUserCreation(PubSubPublisherTemplate pubSubPublisherTemplate,
            PubSubSubscriberTemplate pubSubSubscriberTemplate) {
        PubSubTemplate template = new PubSubTemplate(pubSubPublisherTemplate, pubSubSubscriberTemplate);
        template.setMessageConverter(new JacksonPubSubMessageConverter(getObjectMapper()));
        return template;
    }

    private ObjectMapper getObjectMapper() {
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.registerModule(new JavaTimeModule());
        return objectMapper;
    }

    @Bean
    public PubSubTemplate pubSubTemplateForAuditTracker(PubSubPublisherTemplate pubSubPublisherTemplate,
            PubSubSubscriberTemplate pubSubSubscriberTemplate) {
        PubSubTemplate template = new PubSubTemplate(pubSubPublisherTemplate, pubSubSubscriberTemplate);
        template.setMessageConverter(new SimplePubSubMessageConverter());
        return template;
    }
}
Below are the two subscriber configurations:
AuditSubscriptionConfiguration.java

@Configuration
public class AuditSubscriptionConfiguration {

    @Value("${subscription.auditSubscription}")
    private String subscription;

    @Bean("pubsubAuditInputChannel")
    public MessageChannel pubsubAuditInputChannel() {
        return new DirectChannel();
    }

    @Bean
    public PubSubInboundChannelAdapter auditMessageChannelAdapter(
            @Qualifier("pubsubAuditInputChannel") MessageChannel pubsubAuditInputChannel,
            @Qualifier("pubSubTemplateForAuditTracker") PubSubTemplate pubSubTemplateForAuditTracker) {
        PubSubInboundChannelAdapter adapter = new PubSubInboundChannelAdapter(pubSubTemplateForAuditTracker, subscription);
        adapter.setOutputChannel(pubsubAuditInputChannel);
        adapter.setPayloadType(String.class); // need changes
        adapter.setAckMode(AckMode.MANUAL);
        return adapter;
    }
}
And UserSubscriptionConfiguration.java:

@Configuration
public class UserSubscriptionConfiguration {

    @Value("${subscription.userSubscriber}")
    private String subscriber;

    @Bean("pubsubInputChannel")
    public MessageChannel pubsubInputChannel() {
        return new DirectChannel();
    }

    @Bean
    public PubSubInboundChannelAdapter userMessageChannelAdapter(
            @Qualifier("pubsubInputChannel") MessageChannel pubsubInputChannel,
            @Qualifier("pubSubTemplateForUserCreation") PubSubTemplate pubSubTemplateForUserCreation) {
        PubSubInboundChannelAdapter adapter = new PubSubInboundChannelAdapter(pubSubTemplateForUserCreation, subscriber);
        adapter.setOutputChannel(pubsubInputChannel);
        adapter.setPayloadType(UserChangeEvent.class);
        adapter.setAckMode(AckMode.MANUAL);
        return adapter;
    }
}
Steps I observed during container startup:
Step 1. The pubSubTemplateForAuditTracker bean is created with SimplePubSubMessageConverter, and then the auditMessageChannelAdapter bean is configured.
Step 2. The pubSubTemplateForUserCreation bean is created with JacksonPubSubMessageConverter, and the userMessageChannelAdapter is configured.
I should end up with two beans holding two different message converters, but while debugging I found only one instance of PubSubTemplate, and its attached converter is JacksonPubSubMessageConverter. The pubSubTemplateForAuditTracker bean is overridden by the pubSubTemplateForUserCreation bean even though both are defined with the @Bean annotation. This leads to an error when auditMessageChannelAdapter receives a String message.
My expectation is to have two separate PubSubTemplate beans with two different message converters.
Basically, I want to create two beans of type PubSubTemplate with different behaviour.
Can someone please help me here?
I am exploring GCP Pub/Sub for the first time. Thank you.
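For reference, a minimal diagnostic sketch (assuming the ApplicationContext can be injected) to confirm how many PubSubTemplate beans the context actually holds:

@Bean
ApplicationRunner pubSubTemplateReport(ApplicationContext context) {
    // Lists every PubSubTemplate bean by name; two distinct entries would mean
    // no overriding is happening at the definition level.
    return args -> context.getBeansOfType(PubSubTemplate.class)
            .forEach((name, bean) -> System.out.println(name + " -> " + bean));
}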
You can manually create multiple subscriptions like this:
@PostConstruct
private void subscribeWithConcurrencyControl() {
    // create the subscriptions
    TopicName topic = TopicName.ofProjectTopicName(projectId, this.eventTopic);
    Subscription subscription = Subscription.newBuilder()
            .setName("projects/XYZ/subscriptions/" + eventSubscription)
            .setTopic(topic.toString())
            .setPushConfig(PushConfig.getDefaultInstance())
            .setAckDeadlineSeconds(100)
            .build();
    Subscription subscription2 = Subscription.newBuilder()
            .setName("projects/XYZ/subscriptions/" + eventSubscription2)
            .setTopic(topic.toString())
            .setPushConfig(PushConfig.getDefaultInstance())
            .setAckDeadlineSeconds(100)
            .build();
    try {
        client.createSubscription(subscription);
    } catch (AlreadyExistsException e) {
        // nothing to do
    }
    try {
        client.createSubscription(subscription2);
    } catch (AlreadyExistsException e) {
        // nothing to do
    }
    ProjectSubscriptionName subscriptionName1 = ProjectSubscriptionName.of(projectId, eventSubscription);
    ProjectSubscriptionName subscriptionName2 = ProjectSubscriptionName.of(projectId, eventSubscription2);
    // Instantiate an asynchronous message receiver.
    MessageReceiver receiver = (PubsubMessage message, AckReplyConsumer consumer) -> {
        // Handle the incoming message, then ack it.
        try {
            process(message);
            consumer.ack();
        } catch (Exception e) {
            LOG.error("Failed to process message", e);
            consumer.nack();
        }
    };
    // Provides an executor service for processing messages. The default `executorProvider`
    // used by the subscriber has a default thread count of 5.
    ExecutorProvider executorProvider =
            InstantiatingExecutorProvider.newBuilder().setExecutorThreadCount(2).build();
    FlowControlSettings flowControlSettings =
            FlowControlSettings.newBuilder()
                    .setMaxOutstandingElementCount(100L)
                    .build();
    // `setParallelPullCount` determines how many StreamingPull streams the subscriber opens
    // to receive messages (default 1). `setExecutorProvider` configures the executor used to
    // process message callbacks. Here each subscriber opens 20 streams and the two
    // subscribers share one 2-thread executor for their callbacks.
    subscriber1 = Subscriber.newBuilder(subscriptionName1, receiver)
            .setParallelPullCount(20)
            .setExecutorProvider(executorProvider)
            .setCredentialsProvider(credentialsProvider)
            .setFlowControlSettings(flowControlSettings)
            .build();
    subscriber2 = Subscriber.newBuilder(subscriptionName2, receiver)
            .setParallelPullCount(20)
            .setExecutorProvider(executorProvider)
            .setCredentialsProvider(credentialsProvider)
            .setFlowControlSettings(flowControlSettings)
            .build();
    // Start the subscribers.
    subscriber1.startAsync().awaitRunning();
    subscriber2.startAsync().awaitRunning();
}
In the process() method you can then use the usual ObjectMapper and/or instanceof checks to determine the type of the message (or use different receivers for different subscriptions, or even carry the message type in the Pub/Sub message attributes):
private void process(PubsubMessage message) {
    try {
        ModificationEvent modificationEvent =
                objectMapper.readValue(message.getData().toStringUtf8(), ModificationEvent.class);
        // handle the event ...
    } catch (JsonProcessingException e) {
        LOG.error("Failed to deserialize message", e);
    }
}
If a message is sent to the topic in a different format than I expect, for example String instead of JSON, we cannot deserialize it.
Please tell me how to ignore such messages.
@Configuration
public class KafkaConfig {

    private final KafkaProperties kafkaProperties;

    public KafkaConfig(KafkaProperties kafkaProperties) {
        this.kafkaProperties = kafkaProperties;
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, ExternalCardServiceData> kafkaContainerFactory() {
        DefaultKafkaConsumerFactory<String, ExternalCardServiceData> consumerFactory =
                new DefaultKafkaConsumerFactory<>(kafkaProperties.buildConsumerProperties(),
                        new StringDeserializer(), new JsonDeserializer<>(ExternalCardServiceData.class));
        ConcurrentKafkaListenerContainerFactory<String, ExternalCardServiceData> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        return factory;
    }
}
@KafkaListener(topics = "#{'${topic}'}",
        groupId = "#{'${groupid}'}",
        autoStartup = "#{'${enabled}'}",
        containerFactory = "kafkaContainerFactory")
public void updateExternalCardToken(ConsumerRecord<String, ExternalCardServiceData> record) {
    try {
        ExternalCardServiceData externalCardServiceData = record.value();
        String key = record.key();
        log.info("ExternalCardsListener. Received message: {}, offset={}", externalCardServiceData, record.offset());
        externalCardService.updateToken(key, externalCardServiceData);
    } catch (Exception e) {
        log.error("Error processing message received from kafka. [Message={}]", record.value(), e);
        externalCardService.saveTokensErrors(record.key(), record.value().toString(),
                Arrays.toString(e.getStackTrace()));
    }
}
As the error message indicates, you need to configure an ErrorHandlingDeserializer, which catches the exception so the failed record goes straight to the container's error handler.
https://docs.spring.io/spring-kafka/docs/current/reference/html/#error-handling-deserializer
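A minimal sketch of how the factory above might be adapted (one possible wiring; the same can also be configured purely through consumer properties): wrap each delegate in an ErrorHandlingDeserializer so a record that fails deserialization is routed to the error handler instead of breaking the consumer.

@Bean
public ConcurrentKafkaListenerContainerFactory<String, ExternalCardServiceData> kafkaContainerFactory() {
    DefaultKafkaConsumerFactory<String, ExternalCardServiceData> consumerFactory =
            new DefaultKafkaConsumerFactory<>(kafkaProperties.buildConsumerProperties(),
                    new ErrorHandlingDeserializer<>(new StringDeserializer()),
                    new ErrorHandlingDeserializer<>(new JsonDeserializer<>(ExternalCardServiceData.class)));
    ConcurrentKafkaListenerContainerFactory<String, ExternalCardServiceData> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory);
    // Records that fail deserialization never reach the @KafkaListener method;
    // by default they are logged by the container's error handler.
    return factory;
}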
I am new to Spring Integration. While running some tests, I realized the behavior of my app changes depending on whether the Gateway returns void or String. I am trying to process the flow in the background (async) while I return an HTTP message, so I built an async pipeline.
@Bean
MessageChannel asyncChannel() {
    return new QueueChannel(1);
}

@Bean
public MessageChannel asyncChannel2() {
    return new QueueChannel(1);
}

@Bean
public MessageChannel asyncChannel3() {
    return new QueueChannel(1);
}
@Bean(name = PollerMetadata.DEFAULT_POLLER)
PollerMetadata customPoller() {
    PeriodicTrigger periodicTrigger = new PeriodicTrigger(2000, TimeUnit.MICROSECONDS);
    periodicTrigger.setFixedRate(true);
    periodicTrigger.setInitialDelay(1000);
    PollerMetadata poller = new PollerMetadata();
    poller.setMaxMessagesPerPoll(500);
    poller.setTrigger(periodicTrigger);
    return poller;
}
Three activators:
@ServiceActivator(inputChannel = "asyncChannel", outputChannel = "asyncChannel2")
public String async(String message) {
    try {
        Thread.sleep(5000);
        log.info("Activator 1 " + message);
        return message;
    } catch (InterruptedException e) {
        log.error("I don't want to sleep now");
    }
    return "";
}

@ServiceActivator(inputChannel = "asyncChannel2", outputChannel = "asyncChannel3")
public String async2(String message) {
    log.info("Activator 2 " + message);
    try {
        Thread.sleep(2000);
        return message;
    } catch (InterruptedException e) {
        log.error("I don't want to sleep");
    }
    return "";
}

@ServiceActivator(inputChannel = "asyncChannel3")
public String result(String message) throws InterruptedException {
    Thread.sleep(2000);
    log.info("Activator 3 " + message);
    return message;
}
I receive a message from the Controller class:
private final ReturningGateway returningGateway;

@PostMapping("/example")
public ResponseEntity<?> post() {
    returningGateway.processWhileResponse("Message example");
    return ResponseEntity.ok(Map.of("Message", "Http Done. Check the logs"));
}
The gateway:

@Gateway(requestChannel = "asyncChannel")
public void processWhileResponse(String message_example);
The curious thing is that when the gateway returns void, the processing is async: I see the HTTP message "Http Done. Check the logs" first, and only then do the logs show the async execution. But when the gateway returns a String, I see the logs first and then the HTTP message.
So I need the gateway to return a value while keeping the flow async, so that I still get the HTTP message immediately.
Could you give me a hand?
Sorry if I'm not using the right terms. Thanks.
"So I need the gateway to return a value while keeping the flow async, so that I still get the HTTP message immediately."
As long as you return some non-async type, the call blocks on the gateway and waits for that return value to come back. Even if the flow behind the gateway is async, the gateway still waits on the CountDownLatch barrier for the replyChannel. With a void return type there is no reply expectation, and the gateway exits immediately after sending the request message.
You may consider using a Future as the return type, but it is still not clear when you want the value: before returning from your controller method, or is afterwards OK?
See more info in the docs: https://docs.spring.io/spring-integration/docs/current/reference/html/messaging-endpoints.html#async-gateway
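A minimal sketch of the Future-returning variant (names taken from the question; purely illustrative): with an async return type the gateway sends the request and immediately hands back a handle to the eventual reply instead of blocking.

@Gateway(requestChannel = "asyncChannel")
Future<String> processWhileResponse(String message);

The controller can then respond right away and consult the Future only if the value is ever needed:

@PostMapping("/example")
public ResponseEntity<?> post() {
    Future<String> reply = returningGateway.processWhileResponse("Message example");
    // Returns immediately; the flow keeps running in the background.
    // Call reply.get() later only if the reply value is actually needed.
    return ResponseEntity.ok(Map.of("Message", "Http Done. Check the logs"));
}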
I am trying to receive a protobuf message from RabbitMQ using Spring Integration.
My integration flow:
public class FacadeIntegrationFlowAdapter extends IntegrationFlowAdapter {

    @SuppressWarnings("rawtypes")
    private final Facade facade;
    private final FacadeProperties facadeProperties;

    @SuppressWarnings("unchecked")
    @Override
    protected IntegrationFlowDefinition<?> buildFlow() {
        return from(facadeProperties.getQueueName())
                .handle(facade::getNewMessages);
    }
}
The getNewMessages method:

@Override
public ExchangeResponse getNewMessages(Message<ExchangeRequest> message) {
    ExchangeRequest request = message.getPayload();
    log.info("Receiving new message: " + request.toString());
This is how I send the message to the queue; it is kept deliberately simple to make the test easy to follow.
ExchangeRequest request = ExchangeRequest.newBuilder()
        .addAllAuthors(List.of("author1", "author2"))
        .addAllBooks(List.of("book1", "book2"))
        .build();
ConnectionFactory connectionFactory = new ConnectionFactory();
connectionFactory.setUsername("user");
connectionFactory.setPassword("password");
connectionFactory.setHost("localhost");
connectionFactory.setPort(24130);
try {
    Connection connection = connectionFactory.newConnection();
    Channel channel = connection.createChannel();
    var basicProperties = new AMQP.BasicProperties().builder()
            .contentType("application/x-protobuf")
            .type(request.getDescriptorForType().getFullName())
            .build();
    channel.basicPublish(
            "facade-exchange", "facade-routing-key", basicProperties, request.toByteArray());
} catch (IOException e) {
Unfortunately, I keep getting the exception:
com.google.protobuf.InvalidProtocolBufferException: Type of the Any message does not match the given class.
However, when I change the getNewMessages method to the following, all seems fine:

@Override
public ExchangeResponse getNewMessages(Message message) {
    try {
        Any payload = (Any) message.getPayload();
        ByteString value = payload.getValue();
        ExchangeRequest request = ExchangeRequest.parseFrom(value);
        log.info("Receiving new message: " + request.toString());

Where am I making a mistake? Thanks!
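One hedged guess, based on the working variant above: the inbound conversion appears to hand the payload over as a protobuf Any, so packing the request into an Any on the producer side (instead of publishing the raw ExchangeRequest bytes) might make the declared and actual types line up. A sketch of that change, under that assumption:

// Wrap the request in an Any so the type URL travels with the payload.
byte[] body = Any.pack(request).toByteArray();
channel.basicPublish(
        "facade-exchange", "facade-routing-key", basicProperties, body);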
I am trying to receive JSON messages from a Solace JMS queue, but I am not receiving any messages. Below is my code:
@Service
public class QueueConsumer {

    final String QUEUE_NAME = "test.Request.Q.V01";
    // Latch used for synchronizing between threads
    final CountDownLatch latch = new CountDownLatch(1);

    public void run(String... args) throws Exception {
        String host = "test.solace.com";
        String vpnName = "TEST_VPN";
        String username = "testVpn";
        String password = "test123";
        System.out.printf("QueueConsumer is connecting to Solace messaging at %s...%n", host);
        SolConnectionFactory connectionFactory = SolJmsUtility.createConnectionFactory();
        connectionFactory.setHost(host);
        connectionFactory.setVPN(vpnName);
        connectionFactory.setUsername(username);
        connectionFactory.setPassword(password);
        connectionFactory.setDynamicDurables(true);
        Connection connection = connectionFactory.createConnection();
        Session session = connection.createSession(false, SupportedProperty.SOL_CLIENT_ACKNOWLEDGE);
        System.out.printf("Connected to the Solace Message VPN '%s' with client username '%s'.%n", vpnName, username);
        Queue queue = session.createQueue(QUEUE_NAME);
        MessageConsumer messageConsumer = session.createConsumer(queue);
        messageConsumer.setMessageListener(new MessageListener() {
            @Override
            public void onMessage(Message message) {
                try {
                    if (message instanceof SolaceMsg) {
                        System.out.printf("TextMessage received: '%s'%n", ((SolaceMsg) message).getClass());
                    } else {
                        System.out.println("Message received.");
                    }
                    System.out.printf("Message Content:%n%s%n", SolJmsUtility.dumpMessage(message));
                    message.acknowledge();
                    latch.countDown(); // unblock the main thread
                } catch (JMSException ex) {
                    System.out.println("Error processing incoming message.");
                    ex.printStackTrace();
                }
            }
        });
        System.out.println("Start receiving messages....");
        connection.start();
        System.out.println("Awaiting message...");
        latch.await();
        connection.stop();
        messageConsumer.close();
        session.close();
        connection.close();
    }

    public static void main(String... args) throws Exception {
        new QueueConsumer().run(args);
    }
}
My message type is JSON, as below, and I have created a POJO for it.
{
    "customerDetails": {
        "customerID": "0001234",
        "customerName": "John"
    }
}
I am getting one warning, Response - 400 Queue already exists, since it is an existing queue, and I am not receiving any messages. What am I doing wrong here?
Your code snippet looks correct. You can log in to the PubSub+ Manager of your event broker to verify that the client is binding to the correct queue and that messages were successfully published to the queue and are waiting to be consumed. You can also enable Solace JMS API logging to understand more about what the application is doing: https://docs.solace.com/Solace-JMS-API/Code-and-Compile-Guideli.htm
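As an illustration only (this assumes log4j 1.x is on the classpath, and the category name is inferred from the Solace JMS package, so verify it against the linked docs), API logging can be raised programmatically before the connection is created:

import org.apache.log4j.Level;
import org.apache.log4j.Logger;

// Assumed logger category; check the Solace logging guide for the exact names.
Logger.getLogger("com.solacesystems.jms").setLevel(Level.DEBUG);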