How to create multiple beans of PubSubTemplate - java

I want to create two beans of the PubSubTemplate class so that I can set a different message converter on each.
I have two subscribers: one receives a JSON payload and the other receives a String payload. To handle these two scenarios I am creating two PubSubTemplate beans.
Below is my PubSubTemplateConfig.java :
@Configuration
public class PubSubTemplateConfig {

    @Bean
    public PubSubTemplate pubSubTemplateForUserCreation(PubSubPublisherTemplate pubSubPublisherTemplate,
            PubSubSubscriberTemplate pubSubSubscriberTemplate) {
        PubSubTemplate template = new PubSubTemplate(pubSubPublisherTemplate, pubSubSubscriberTemplate);
        template.setMessageConverter(new JacksonPubSubMessageConverter(getObjectMapper()));
        return template;
    }

    private ObjectMapper getObjectMapper() {
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.registerModule(new JavaTimeModule());
        return objectMapper;
    }

    @Bean
    public PubSubTemplate pubSubTemplateForAuditTracker(PubSubPublisherTemplate pubSubPublisherTemplate,
            PubSubSubscriberTemplate pubSubSubscriberTemplate) {
        PubSubTemplate template = new PubSubTemplate(pubSubPublisherTemplate, pubSubSubscriberTemplate);
        template.setMessageConverter(new SimplePubSubMessageConverter());
        return template;
    }
}
Below two are the subscriber configuration :
AuditSubscriptionConfiguration.java
@Configuration
public class AuditSubscriptionConfiguration {

    @Value("${subscription.auditSubscription}")
    private String subscription;

    @Bean("pubsubAuditInputChannel")
    public MessageChannel pubsubAuditInputChannel() {
        return new DirectChannel();
    }

    @Bean
    public PubSubInboundChannelAdapter auditMessageChannelAdapter(
            @Qualifier("pubsubAuditInputChannel") MessageChannel pubsubAuditInputChannel,
            @Qualifier("pubSubTemplateForAuditTracker") PubSubTemplate pubSubTemplateForAuditTracker) {
        PubSubInboundChannelAdapter adapter = new PubSubInboundChannelAdapter(pubSubTemplateForAuditTracker, subscription);
        adapter.setOutputChannel(pubsubAuditInputChannel);
        adapter.setPayloadType(String.class); // need changes
        adapter.setAckMode(AckMode.MANUAL);
        return adapter;
    }
}
And UserSubscriptionConfiguration.java:
@Configuration
public class UserSubscriptionConfiguration {

    @Value("${subscription.userSubscriber}")
    private String subscriber;

    @Bean("pubsubInputChannel")
    public MessageChannel pubsubInputChannel() {
        return new DirectChannel();
    }

    @Bean
    public PubSubInboundChannelAdapter userMessageChannelAdapter(
            @Qualifier("pubsubInputChannel") MessageChannel pubsubInputChannel,
            @Qualifier("pubSubTemplateForUserCreation") PubSubTemplate pubSubTemplateForUserCreation) {
        PubSubInboundChannelAdapter adapter = new PubSubInboundChannelAdapter(pubSubTemplateForUserCreation, subscriber);
        adapter.setOutputChannel(pubsubInputChannel);
        adapter.setPayloadType(UserChangeEvent.class);
        adapter.setAckMode(AckMode.MANUAL);
        return adapter;
    }
}
Steps I observed during container start-up:
Step 1. The pubSubTemplateForAuditTracker bean is created with the SimplePubSubMessageConverter, and then the auditMessageChannelAdapter bean is configured.
Step 2. The pubSubTemplateForUserCreation bean is created with the JacksonPubSubMessageConverter, and the userMessageChannelAdapter is configured.
I should end up with two beans holding two different message converters, but while debugging I found that only one instance of PubSubTemplate is present and its attached message converter is the JacksonPubSubMessageConverter. The pubSubTemplateForAuditTracker bean is being overridden by the pubSubTemplateForUserCreation bean even though I defined them both with the @Bean annotation. This behavior leads to an error when auditMessageChannelAdapter receives a String message.
My expectation is to have two separate PubSubTemplate beans with two different message converters; basically, two beans of type PubSubTemplate with different behaviour.
Can someone please help me here?
I am exploring GCP Pub/Sub for the first time. Thank you.

You can manually create multiple subscriptions like this
@PostConstruct
private void subscribeWithConcurrencyControl() {
    // create the subscriptions
    TopicName topic = TopicName.ofProjectTopicName(projectId, this.eventTopic);
    Subscription subscription = Subscription.newBuilder()
            .setName("projects/XYZ/subscriptions/" + eventSubscription)
            .setTopic(topic.toString())
            .setPushConfig(PushConfig.getDefaultInstance())
            .setAckDeadlineSeconds(100)
            .build();
    Subscription subscription2 = Subscription.newBuilder()
            .setName("projects/XYZ/subscriptions/" + eventSubscription2)
            .setTopic(topic.toString())
            .setPushConfig(PushConfig.getDefaultInstance())
            .setAckDeadlineSeconds(100)
            .build();
    try {
        client.createSubscription(subscription);
    } catch (AlreadyExistsException e) {
        // nothing to do, the subscription already exists
    }
    try {
        client.createSubscription(subscription2);
    } catch (AlreadyExistsException e) {
        // nothing to do, the subscription already exists
    }

    ProjectSubscriptionName subscriptionName1 = ProjectSubscriptionName.of(projectId, eventSubscription);
    ProjectSubscriptionName subscriptionName2 = ProjectSubscriptionName.of(projectId, eventSubscription2);

    // Instantiate an asynchronous message receiver.
    MessageReceiver receiver = (PubsubMessage message, AckReplyConsumer consumer) -> {
        // Handle the incoming message, then ack (or nack) it.
        try {
            process(message);
            consumer.ack();
        } catch (Exception e) {
            LOG.error("Failed to process message", e);
            consumer.nack();
        }
    };

    // Provides an executor service for processing messages. The default executorProvider
    // used by the subscriber has a default thread count of 5.
    ExecutorProvider executorProvider =
            InstantiatingExecutorProvider.newBuilder().setExecutorThreadCount(2).build();
    FlowControlSettings flowControlSettings =
            FlowControlSettings.newBuilder()
                    .setMaxOutstandingElementCount(100L)
                    .build();

    // `setParallelPullCount` determines how many StreamingPull streams the subscriber opens
    // to receive messages (it defaults to 1). `setExecutorProvider` configures the executor
    // used to process the message callbacks. Here each subscriber opens 20 streams and shares
    // the 2-thread executor created above.
    subscriber1 = Subscriber.newBuilder(subscriptionName1, receiver)
            .setParallelPullCount(20)
            .setExecutorProvider(executorProvider)
            .setCredentialsProvider(credentialsProvider)
            .setFlowControlSettings(flowControlSettings)
            .build();
    subscriber2 = Subscriber.newBuilder(subscriptionName2, receiver)
            .setParallelPullCount(20)
            .setExecutorProvider(executorProvider)
            .setCredentialsProvider(credentialsProvider)
            .setFlowControlSettings(flowControlSettings)
            .build();

    // Start the subscribers.
    subscriber1.startAsync().awaitRunning();
    subscriber2.startAsync().awaitRunning();
}
In the process() method you can use the usual ObjectMapper and/or instanceof checks to determine the type of the message (or have different receivers for different subscriptions, or even transport the type of the message in the Pub/Sub message attributes):
private void process(PubsubMessage message) {
    try {
        ModificationEvent modificationEvent = objectMapper.readValue(message.getData().toStringUtf8(), ModificationEvent.class);
        // ... handle the event
    } catch (IOException e) {
        throw new IllegalStateException("Unable to parse message", e);
    }
}
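For the variant that carries the type in the message attributes, a minimal sketch could look like the following (the "type" attribute name, the event classes and the handleUserChange/handleAudit helpers are illustrative assumptions, not part of the original answer):

private void process(PubsubMessage message) {
    // Assumes the publisher sets a "type" attribute on every message it sends.
    String type = message.getAttributesMap().getOrDefault("type", "unknown");
    String json = message.getData().toStringUtf8();
    try {
        switch (type) {
            case "userChange":
                handleUserChange(objectMapper.readValue(json, UserChangeEvent.class)); // JSON payload
                break;
            case "audit":
                handleAudit(json); // plain String payload
                break;
            default:
                LOG.warn("Unknown message type: {}", type);
        }
    } catch (IOException e) {
        throw new IllegalStateException("Unable to parse message of type " + type, e);
    }
}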

Related

Why is RabbitMQ message consumption so slow?

Currently I am implementing logic to consume messages with RabbitMQ. However, contrary to expectations, it takes too long to consume the messages.
Looking at the Queued messages graph (the screenshot is not included here), Unacked and Ready keep stacking up.
Looking at the message rates, the publish rate is fast, but the consumer ack rate is too slow.
I'm not sure whether the RabbitMQ configuration I've written is wrong or whether I forgot to set something in the listener configuration.
The RabbitMQ message I receive is a callback message.
Any help would be greatly appreciated.
This is the RabbitMQ configuration and the RabbitListener configuration:
@Configuration
@Profile({ProfileConfig.RABBITMQ})
public class RabbitmqConfig {

    @Value("${rabbitmq.queue.name}")
    private String queueName;

    @Value("${rabbitmq.exchange.name}")
    private String exchangeName;

    @Value("${rabbitmq.routing.key.callback}")
    private String routingKey;

    @Value("${rabbitmq.fetch-count}")
    private Integer fetchCount;

    @Bean
    Queue queue() {
        return new Queue(queueName, true);
    }

    @Bean
    DirectExchange directExchange() {
        return new DirectExchange(exchangeName);
    }

    @Bean
    Binding binding(DirectExchange directExchange, Queue queue) {
        return BindingBuilder.bind(queue).to(directExchange).with(routingKey);
    }

    @Bean
    public RabbitListenerContainerFactory<SimpleMessageListenerContainer> prefetchOneContainerFactory(
            SimpleRabbitListenerContainerFactoryConfigurer configurer, ConnectionFactory factory) {
        SimpleRabbitListenerContainerFactory simpleFactory = new SimpleRabbitListenerContainerFactory();
        configurer.configure(simpleFactory, factory);
        simpleFactory.setPrefetchCount(fetchCount);
        return simpleFactory;
    }
}
@RabbitListener(queues = {"${rabbitmq.queue.name}"}, concurrency = "3", containerFactory = "prefetchOneContainerFactory")
public void receiveMessage(final String message, Channel channel, @Header(AmqpHeaders.DELIVERY_TAG) long tag) {
    try {
        JSONParser parser = new JSONParser();
        JSONObject json = (JSONObject) parser.parse(message);
        String messageType = json.get("messageType").toString();
        log.debug("Receive Queue Key={}, Message = {}", messageType, message);
        AsyncType asyncType = AsyncType.valueOf(messageType);
        executeMessage(asyncType, message);
    } catch (Exception e) {
        traceService.removeTraceId();
        traceService.printErrorLog(log, "Fail to deal receive message.", e, PrintStackPolicy.ALL);
    } finally {
        try {
            channel.basicAck(tag, false);
        } catch (IOException e) {
            traceService.printErrorLog(log, "Fail to send ack to RabbitMQ", e, PrintStackPolicy.ALL);
        }
    }
}
The goal is to consume messages from RabbitMQ faster, but the current consumption speed is too slow.
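One knob worth checking in this situation is the listener container concurrency. As a rough sketch (the numbers are only placeholders, not a recommendation from the original post), the existing factory could be given more consumers in addition to the prefetch count:

@Bean
public RabbitListenerContainerFactory<SimpleMessageListenerContainer> prefetchOneContainerFactory(
        SimpleRabbitListenerContainerFactoryConfigurer configurer, ConnectionFactory factory) {
    SimpleRabbitListenerContainerFactory simpleFactory = new SimpleRabbitListenerContainerFactory();
    configurer.configure(simpleFactory, factory);
    simpleFactory.setPrefetchCount(fetchCount);      // how many unacked messages each consumer may hold
    simpleFactory.setConcurrentConsumers(5);         // placeholder: start with 5 consumers
    simpleFactory.setMaxConcurrentConsumers(10);     // placeholder: allow scaling up to 10 under load
    return simpleFactory;
}

Note that the concurrency attribute on @RabbitListener, if set, takes precedence over the factory defaults.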

How to delegate Spring Integration Message Payload to Spring Batch Job?

I have an FTP Streaming Inbound Channel Adapter from Spring Integration which produces messages with payloads of type InputStream, letting files be fetched without writing to the local file system.
@Bean
@InboundChannelAdapter(channel = Constants.CHANNEL_INBOUND_FTP_ADAPTER, poller = @Poller(fixedDelay = Constants.FIXED_POLLING_FROM_INBOUND_FTP_ADAPTER_DELAY))
public MessageSource<InputStream> ftpMessageSource() {
    FtpStreamingMessageSource ftpStreamingMessageSource = new FtpStreamingMessageSource(ftpRemoteFileTemplate());
    ftpStreamingMessageSource.setRemoteDirectory(ftpConnectionParameters.getRootDir());
    ftpStreamingMessageSource.setFilter(chainFileListFilter());
    ftpStreamingMessageSource.setMaxFetchSize(Constants.INBOUND_ADAPTER_MAX_FETCH_SIZE);
    return ftpStreamingMessageSource;
}
Then I transform it with:
@Bean
@org.springframework.integration.annotation.Transformer(inputChannel = Constants.CHANNEL_INBOUND_FTP_ADAPTER, outputChannel = Constants.CHANNEL_STREAMED_DATA)
public Transformer transformer() {
    return new StreamTransformer(Charset.defaultCharset().name());
}
Then I handle the data to check that it works (and maybe for custom interceptors in the future):
@ServiceActivator(inputChannel = Constants.CHANNEL_STREAMED_DATA, outputChannel = "BATCH_ALARM_CHANNEL")
public Message<?> alarmHandler(Message<?> message) {
    System.out.println(Constants.CHANNEL_ALARM);
    System.out.println(message.getHeaders());
    return message;
}
After this, following the official Spring Batch Integration documentation, I have one more Transformer which converts the message into a JobLaunchRequest:
@Transformer
public JobLaunchRequest toRequest(Message<File> message) {
    JobParametersBuilder jobParametersBuilder = new JobParametersBuilder();
    jobParametersBuilder.addDate("dummy", new Date());
    return new JobLaunchRequest(job, jobParametersBuilder.toJobParameters());
}
Here we have the Message from the last channel, BATCH_ALARM_CHANNEL, which the Spring Batch job needs, but JobParametersBuilder only accepts simple types, not complex objects. So how can I pass the message payload to the job launch and do the rest of the work, such as reading, parsing and saving to the DB?
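One commonly suggested workaround (just a sketch; it assumes the streaming adapter populates the standard FileHeaders.REMOTE_FILE header, and "remoteFile" is an invented parameter name) is to pass only simple identifying values as job parameters and let the job's reader fetch or receive the actual data:

@Transformer
public JobLaunchRequest toRequest(Message<String> message) {
    // The FTP streaming source puts the remote file name into the FileHeaders.REMOTE_FILE header.
    String remoteFile = (String) message.getHeaders().get(FileHeaders.REMOTE_FILE);

    JobParametersBuilder jobParametersBuilder = new JobParametersBuilder();
    jobParametersBuilder.addString("remoteFile", remoteFile);
    // A date parameter keeps the JobParameters unique so the same file can be re-processed.
    jobParametersBuilder.addDate("launchDate", new Date());

    return new JobLaunchRequest(job, jobParametersBuilder.toJobParameters());
}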

Why does a Gateway with a void return make the flow async, but run synchronously when it returns a value? Spring Integration

I am new to Spring Integration. While doing some tests I realized the behavior of my app changes depending on whether the Gateway returns void or a String. I'm trying to process the flow in the background (async) while I return an HTTP message, so I built an async pipeline:
@Bean
MessageChannel asyncChannel() {
    return new QueueChannel(1);
}

@Bean
public MessageChannel asyncChannel2() {
    return new QueueChannel(1);
}

@Bean
public MessageChannel asyncChannel3() {
    return new QueueChannel(1);
}

@Bean(name = PollerMetadata.DEFAULT_POLLER)
PollerMetadata customPoller() {
    PeriodicTrigger periodicTrigger = new PeriodicTrigger(2000, TimeUnit.MICROSECONDS);
    periodicTrigger.setFixedRate(true);
    periodicTrigger.setInitialDelay(1000);
    PollerMetadata poller = new PollerMetadata();
    poller.setMaxMessagesPerPoll(500);
    poller.setTrigger(periodicTrigger);
    return poller;
}
Three activators:
@ServiceActivator(inputChannel = "asyncChannel", outputChannel = "asyncChannel2")
public String async(String message) {
    try {
        Thread.sleep(5000);
        log.info("Activator 1 " + message);
        return message;
    } catch (InterruptedException e) {
        log.error("I don't want to sleep now");
    }
    return "";
}

@ServiceActivator(inputChannel = "asyncChannel2", outputChannel = "asyncChannel3")
public String async2(String message) {
    log.info("Activator 2 " + message);
    try {
        Thread.sleep(2000);
        return message;
    } catch (InterruptedException e) {
        log.error("I don't want to sleep");
    }
    return "";
}

@ServiceActivator(inputChannel = "asyncChannel3")
public String result(String message) throws InterruptedException {
    Thread.sleep(2000);
    log.info("Activator 3 " + message);
    return message;
}
I receive a message from the controller class:
private final ReturningGateway returningGateway;

@PostMapping("/example")
public ResponseEntity post() {
    returningGateway.processWhileResponse("Message example");
    return ResponseEntity.ok(Map.of("Message", "Http Done. Check the logs"));
}
The gateway:
@Gateway(requestChannel = "asyncChannel")
public void processWhileResponse(String message_example);
The curious thing is that when the gateway returns void, the processing is async: I see the HTTP message "Http Done. Check the logs" first, and only afterwards do I see the async execution in the logs. But when the gateway returns a String, I see the logs first and then the HTTP message.
So I need the gateway to return a value while keeping the async behaviour, so that I still get the HTTP message promptly.
Could you give me a hand?
Sorry if I'm not using the right terms. Thanks.
So I need the gateway to return a value but keep the async way so I can get a http message.
As long as you return some non-async type, the call is going to block on the gateway and wait for that return value to come back. Even if your flow behind that gateway is async, it still waits for a reply on the CountDownLatch barrier for the replyChannel. In the case of a void return type there is no reply expectation and the gateway exits immediately after sending the request message.
You may consider having a Future as the return type, but it is still not clear when you would like to get the value: before returning from your controller method, or whether it is OK afterwards.
See more info in docs: https://docs.spring.io/spring-integration/docs/current/reference/html/messaging-endpoints.html#async-gateway
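As a rough sketch of the Future-based option (the channel and method names simply mirror the question; this is not taken from an official example), the gateway could be declared like this:

import java.util.concurrent.CompletableFuture;

import org.springframework.integration.annotation.Gateway;
import org.springframework.integration.annotation.MessagingGateway;

@MessagingGateway
public interface ReturningGateway {

    // Returning a Future-like type lets the gateway hand back a handle right away
    // instead of blocking until the reply arrives on the replyChannel.
    @Gateway(requestChannel = "asyncChannel")
    CompletableFuture<String> processWhileResponse(String message);
}

The controller can then return its HTTP response immediately and, if it ever needs the reply, attach a callback with thenAccept(...) or block selectively with get().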

Spring Integration Error Channel for async operations configuration

I am trying to configure an HttpRequestExecutingMessageHandler:
@Bean
public HttpRequestExecutingMessageHandler httpEnrollHandlerKairos() {
    HttpRequestExecutingMessageHandler handler = new HttpRequestExecutingMessageHandler(kairosProperties.getEnrollUrl());
    handler.setOutputChannel(channelConfiguration.kairosResponseChannel());
    handler.setExpectReply(true);
    handler.setMessageConverters(httpMessageConverters);
    handler.setHttpMethod(HttpMethod.POST);
    handler.setExpectedResponseType(EnrollResponse.class);
    handler.setHeaderMapper(defaultHeaderMapper);
    return handler;
}
And with the following flow:
public IntegrationFlow kairosEnrollFlow() {
    return IntegrationFlows
            .from(channelConfiguration.kairosEnrollChannel())
            .<Agreement, EnrollRequest>transform(p -> transformAgreementEnrollRequest(p))
            .transform(Transformers.toJson())
            .headerFilter(httpRequestHeaderFilters())
            .enrichHeaders(m -> m.header(APP_ID, kairosProperties.getAppId()))
            .enrichHeaders(m -> m.header(APP_KEY, kairosProperties.getAppKey()))
            .handle(httpEnrollHandlerKairos())
            .get();
}
I added an enrichHeaders to this flow for the errorChannel, with no results...
I configured an autowired PublishSubscribeChannel for the errorChannel as follows:
@Autowired
@Qualifier("errorChannel")
private PublishSubscribeChannel errorChannel;
My error handler flow (it redirects to another channel for convenience; my objective is to configure specific error channels for different kinds of problems in the application):
@Autowired
@Qualifier("errorChannel")
private PublishSubscribeChannel errorChannel;

@Autowired
private ChannelConfiguration channelConfiguration;

@Bean
public IntegrationFlow errorHandlerFlow() {
    return IntegrationFlows.from(errorChannel)
            .<Message<?>, Message<ErrorMessage>>transform(m -> handleExceptionMessage(m))
            .channel(channelConfiguration.errorHandlerOutputChannel())
            .get();
}

public Message<ErrorMessage> handleExceptionMessage(Message<?> message) {
    MessagingException me = (MessagingException) message.getPayload();
    MessageHeaders mh = me.getFailedMessage().getHeaders();
    return MessageBuilder.withPayload(new ErrorMessage()).copyHeaders(mh).build();
}
But after hundreds of tries I couldn't get this errorChannel to work: the application throws an exception in the send method, and the configured errorChannel and "errorFlow" are never called.
I noticed that the handler uses the GenericMessagingTemplate, which in turn contains the following code. I tried to set the headers (via MessageHeaders and IntegrationUtils), and nothing worked:
TemporaryReplyChannel tempReplyChannel = new TemporaryReplyChannel(this.throwExceptionOnLateReply);
requestMessage = MessageBuilder.fromMessage(requestMessage).setReplyChannel(tempReplyChannel)
        .setHeader(this.sendTimeoutHeader, null)
        .setHeader(this.receiveTimeoutHeader, null)
        .setErrorChannel(tempReplyChannel).build();
This never lets the errorChannel be overridden...
Any tip on how to do this? Does anyone know what happens here? All the channels in question are asynchronous.
Any help clarifying the configuration of the error channel would be appreciated.
Kind regards,
José Carlos Canova.
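One pattern that is often suggested for this kind of per-handler error routing (a sketch only; the "kairosErrorChannel" name is invented and this is not taken from the original flow) is to attach an ExpressionEvaluatingRequestHandlerAdvice to the HTTP handler, so its failures are sent to a dedicated channel instead of relying on the errorChannel header that the gateway overrides:

@Bean
public ExpressionEvaluatingRequestHandlerAdvice kairosFailureAdvice() {
    ExpressionEvaluatingRequestHandlerAdvice advice = new ExpressionEvaluatingRequestHandlerAdvice();
    advice.setFailureChannelName("kairosErrorChannel");  // hypothetical channel name
    advice.setOnFailureExpressionString("payload");      // what gets evaluated and sent on failure
    advice.setTrapException(true);                       // don't re-throw to the upstream caller
    return advice;
}

public IntegrationFlow kairosEnrollFlow() {
    return IntegrationFlows
            .from(channelConfiguration.kairosEnrollChannel())
            .<Agreement, EnrollRequest>transform(p -> transformAgreementEnrollRequest(p))
            .transform(Transformers.toJson())
            .handle(httpEnrollHandlerKairos(), e -> e.advice(kairosFailureAdvice()))
            .get();
}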

Spring cloud stream RabbitMQ routing messages dynamically

I have implemented the example as shown here: Spring Dynamic Destination.
In RabbitMQ it creates an exchange dynamically, but there is no option to provide a binding or a routing key. My requirement is to send a message to this dynamically created exchange with a routing key. How would I need to implement this to set up the routing key?
@Component
public class DDProducerBean {

    @Autowired
    private BinderAwareChannelResolver poChannelResolver = null;

    public void publish(DDSocketVO ddSocketVO) throws Exception {
        this.poChannelResolver.resolveDestination(ddSocketVO.getDestination())
                .send(MessageBuilder.withPayload(new ObjectMapper()
                        .setVisibility(PropertyAccessor.FIELD, Visibility.ANY)
                        .writeValueAsString(ddSocketVO)).build());
    }
}
Here is the workaround, as suggested here:
Basically, create a MessageChannel for the dynamic destination using BinderAwareChannelResolver, then connect to RabbitMQ with the RabbitAdmin API and bind the newly created exchange to another queue or exchange with a routing key before sending messages.
@Autowired
private BinderAwareChannelResolver poChannelResolver;

public void publish(WebSocketVO webSocketVO) throws Exception {
    MessageChannel channel = this.poChannelResolver.resolveDestination(webSocketVO.getDestination());

    CachingConnectionFactory connectionFactory = new CachingConnectionFactory();
    connectionFactory.setUsername(System.getProperty("spring.cloud.stream.binders.corerabbit.environment.spring.rabbitmq.username"));
    connectionFactory.setPassword(System.getProperty("spring.cloud.stream.binders.corerabbit.environment.spring.rabbitmq.password"));
    connectionFactory.setAddresses(System.getProperty("spring.cloud.stream.binders.corerabbit.environment.spring.rabbitmq.addresses"));
    connectionFactory.setVirtualHost(System.getProperty("spring.cloud.stream.binders.corerabbit.environment.spring.rabbitmq.virtual-host"));

    AmqpAdmin amqpAdmin = new RabbitAdmin(connectionFactory);
    TopicExchange sourceExchange = new TopicExchange(webSocketVO.getDestination(), false, true);
    TopicExchange destExchange = new TopicExchange("amq.topic");
    amqpAdmin.declareBinding(BindingBuilder.bind(destExchange).to(sourceExchange).with(webSocketVO.getRoutingKeyExpression()));

    channel.send(MessageBuilder.withPayload(new ObjectMapper()
            .setVisibility(PropertyAccessor.FIELD, Visibility.ANY)
            .writeValueAsString(webSocketVO)).build());

    amqpAdmin.deleteExchange(webSocketVO.getDestination());
    connectionFactory.destroy();
}
