Spring Cloud Stream RabbitMQ routing messages dynamically - Java

I have implemented the example as shown here: Spring Dynamic Destination
In RabbitMQ it creates an exchange dynamically, but there is no option to provide a binding or routing key. My requirement is to send a message to this dynamically created exchange with a routing key. How would I need to implement this to set up the routing key?
@Component
public class DDProducerBean {

    @Autowired
    private BinderAwareChannelResolver poChannelResolver;

    public void publish(DDSocketVO ddSocketVO) throws Exception {
        this.poChannelResolver.resolveDestination(ddSocketVO.getDestination())
                .send(MessageBuilder.withPayload(new ObjectMapper()
                        .setVisibility(PropertyAccessor.FIELD, Visibility.ANY)
                        .writeValueAsString(ddSocketVO)).build());
    }
}

Here is the workaround, as suggested here:
Basically, create a MessageChannel for the dynamic destination using BinderAwareChannelResolver, then connect to RabbitMQ with the RabbitAdmin API and bind the newly created exchange to another queue or exchange with a routing key before sending messages.
@Autowired
private BinderAwareChannelResolver poChannelResolver;

public void publish(WebSocketVO webSocketVO) throws Exception {
    MessageChannel channel = this.poChannelResolver.resolveDestination(webSocketVO.getDestination());
    CachingConnectionFactory connectionFactory = new CachingConnectionFactory();
    connectionFactory.setUsername(System.getProperty("spring.cloud.stream.binders.corerabbit.environment.spring.rabbitmq.username"));
    connectionFactory.setPassword(System.getProperty("spring.cloud.stream.binders.corerabbit.environment.spring.rabbitmq.password"));
    connectionFactory.setAddresses(System.getProperty("spring.cloud.stream.binders.corerabbit.environment.spring.rabbitmq.addresses"));
    connectionFactory.setVirtualHost(System.getProperty("spring.cloud.stream.binders.corerabbit.environment.spring.rabbitmq.virtual-host"));
    AmqpAdmin amqpAdmin = new RabbitAdmin(connectionFactory);
    TopicExchange sourceExchange = new TopicExchange(webSocketVO.getDestination(), false, true);
    TopicExchange destExchange = new TopicExchange("amq.topic");
    amqpAdmin.declareBinding(BindingBuilder.bind(destExchange).to(sourceExchange).with(webSocketVO.getRoutingKeyExpression()));
    channel.send(MessageBuilder.withPayload(new ObjectMapper()
            .setVisibility(PropertyAccessor.FIELD, Visibility.ANY)
            .writeValueAsString(webSocketVO)).build());
    amqpAdmin.deleteExchange(webSocketVO.getDestination());
    connectionFactory.destroy();
}
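A leaner variant of the same workaround (a sketch, not from the original post): instead of building a new CachingConnectionFactory on every publish, inject an AmqpAdmin backed by the application's existing RabbitMQ ConnectionFactory and only declare the binding per destination. The injected AmqpAdmin bean is an assumption about the application context.
@Autowired
private BinderAwareChannelResolver poChannelResolver;

@Autowired
private AmqpAdmin amqpAdmin; // assumed to be backed by the binder's ConnectionFactory

public void publish(WebSocketVO webSocketVO) throws Exception {
    MessageChannel channel = this.poChannelResolver.resolveDestination(webSocketVO.getDestination());
    // Bind the dynamically created exchange to amq.topic with the desired routing key
    // before the first message is sent to it.
    TopicExchange sourceExchange = new TopicExchange(webSocketVO.getDestination(), false, true);
    TopicExchange destExchange = new TopicExchange("amq.topic");
    amqpAdmin.declareBinding(BindingBuilder.bind(destExchange).to(sourceExchange).with(webSocketVO.getRoutingKeyExpression()));
    channel.send(MessageBuilder.withPayload(new ObjectMapper()
            .setVisibility(PropertyAccessor.FIELD, Visibility.ANY)
            .writeValueAsString(webSocketVO)).build());
}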

Related

How to create multiple beans of PubSubTemplate

I want to create two beans of the PubSubTemplate class in order to set different message converters.
I have two subscribers: one receives a JSON response and the other receives a String response. To handle these two scenarios I am creating two PubSubTemplate beans.
Below is my PubSubTemplateConfig.java:
@Configuration
public class PubSubTemplateConfig {

    @Bean
    public PubSubTemplate pubSubTemplateForUserCreation(PubSubPublisherTemplate pubSubPublisherTemplate,
            PubSubSubscriberTemplate pubSubSubscriberTemplate) {
        PubSubTemplate template = new PubSubTemplate(pubSubPublisherTemplate, pubSubSubscriberTemplate);
        template.setMessageConverter(new JacksonPubSubMessageConverter(getObjectMapper()));
        return template;
    }

    private ObjectMapper getObjectMapper() {
        ObjectMapper objectMapper = new ObjectMapper();
        objectMapper.registerModule(new JavaTimeModule());
        return objectMapper;
    }

    @Bean
    public PubSubTemplate pubSubTemplateForAuditTracker(PubSubPublisherTemplate pubSubPublisherTemplate,
            PubSubSubscriberTemplate pubSubSubscriberTemplate) {
        PubSubTemplate template = new PubSubTemplate(pubSubPublisherTemplate, pubSubSubscriberTemplate);
        template.setMessageConverter(new SimplePubSubMessageConverter());
        return template;
    }
}
Below are the two subscriber configurations:
AuditSubscriptionConfiguration.java
@Configuration
public class AuditSubscriptionConfiguration {

    @Value("${subscription.auditSubscription}")
    private String subscription;

    @Bean("pubsubAuditInputChannel")
    public MessageChannel pubsubAuditInputChannel() {
        return new DirectChannel();
    }

    @Bean
    public PubSubInboundChannelAdapter auditMessageChannelAdapter(@Qualifier("pubsubAuditInputChannel") MessageChannel pubsubAuditInputChannel,
            @Qualifier("pubSubTemplateForAuditTracker") PubSubTemplate pubSubTemplateForAuditTracker) {
        PubSubInboundChannelAdapter adapter = new PubSubInboundChannelAdapter(pubSubTemplateForAuditTracker, subscription);
        adapter.setOutputChannel(pubsubAuditInputChannel);
        adapter.setPayloadType(String.class); // needs changes
        adapter.setAckMode(AckMode.MANUAL);
        return adapter;
    }
}
And UserSubscriptionConfiguration.java
@Configuration
public class UserSubscriptionConfiguration {

    @Value("${subscription.userSubscriber}")
    private String subscriber;

    @Bean("pubsubInputChannel")
    public MessageChannel pubsubInputChannel() {
        return new DirectChannel();
    }

    @Bean
    public PubSubInboundChannelAdapter userMessageChannelAdapter(@Qualifier("pubsubInputChannel") MessageChannel pubsubInputChannel,
            @Qualifier("pubSubTemplateForUserCreation") PubSubTemplate pubSubTemplateForUserCreation) {
        PubSubInboundChannelAdapter adapter = new PubSubInboundChannelAdapter(pubSubTemplateForUserCreation, subscriber);
        adapter.setOutputChannel(pubsubInputChannel);
        adapter.setPayloadType(UserChangeEvent.class);
        adapter.setAckMode(AckMode.MANUAL);
        return adapter;
    }
}
Steps I observed during container startup:
Step 1: the pubSubTemplateForAuditTracker bean is created with SimplePubSubMessageConverter and then the auditMessageChannelAdapter bean is configured.
Step 2: the pubSubTemplateForUserCreation bean is created with JacksonPubSubMessageConverter and the userMessageChannelAdapter is configured.
Here I should have two beans with two different message converters, but while debugging I found that only one instance of PubSubTemplate is present, and the attached message converter is JacksonPubSubMessageConverter. The pubSubTemplateForAuditTracker bean is overridden by the pubSubTemplateForUserCreation bean, even though I defined them twice with the @Bean annotation. This behavior leads to an error when auditMessageChannelAdapter receives a String message.
My expectation is to have two separate PubSubTemplate beans with two different message converters.
Basically, I want to create two beans of type PubSubTemplate with different behaviour.
Can someone please help me here?
I am exploring GCP Pub/Sub for the first time. Thank you.
You can manually create multiple subscriptions like this:
@PostConstruct
private void subscribeWithConcurrencyControl() {
    // create the subscriptions
    TopicName topic = TopicName.ofProjectTopicName(projectId, this.eventTopic);
    Subscription subscription = Subscription.newBuilder()
            .setName("projects/XYZ/subscriptions/" + eventSubscription)
            .setTopic(topic.toString())
            .setPushConfig(PushConfig.getDefaultInstance())
            .setAckDeadlineSeconds(100)
            .build();
    Subscription subscription2 = Subscription.newBuilder()
            .setName("projects/XYZ/subscriptions/" + eventSubscription2)
            .setTopic(topic.toString())
            .setPushConfig(PushConfig.getDefaultInstance())
            .setAckDeadlineSeconds(100)
            .build();
    try {
        client.createSubscription(subscription);
    } catch (AlreadyExistsException e) {
        // nothing to do
    }
    try {
        client.createSubscription(subscription2);
    } catch (AlreadyExistsException e) {
        // nothing to do
    }
    ProjectSubscriptionName subscriptionName1 = ProjectSubscriptionName.of(projectId, eventSubscription);
    ProjectSubscriptionName subscriptionName2 = ProjectSubscriptionName.of(projectId, eventSubscription2);
    // Instantiate an asynchronous message receiver.
    MessageReceiver receiver = (PubsubMessage message, AckReplyConsumer consumer) -> {
        // Handle the incoming message, then ack it.
        try {
            process(message);
            consumer.ack();
        } catch (Exception e) {
            LOG.error("Failed to process message", e);
            consumer.nack();
        }
    };
    // Provides an executor service for processing messages. The default executorProvider
    // used by the subscriber has a thread count of 5; here it is limited to 2 threads.
    ExecutorProvider executorProvider =
            InstantiatingExecutorProvider.newBuilder().setExecutorThreadCount(2).build();
    FlowControlSettings flowControlSettings =
            FlowControlSettings.newBuilder()
                    .setMaxOutstandingElementCount(100L)
                    .build();
    // setParallelPullCount determines how many StreamingPull streams each subscriber opens
    // to receive messages (default 1; here 20). setExecutorProvider configures the executor
    // the subscriber uses to process message callbacks.
    subscriber1 = Subscriber.newBuilder(subscriptionName1, receiver)
            .setParallelPullCount(20)
            .setExecutorProvider(executorProvider)
            .setCredentialsProvider(credentialsProvider)
            .setFlowControlSettings(flowControlSettings)
            .build();
    subscriber2 = Subscriber.newBuilder(subscriptionName2, receiver)
            .setParallelPullCount(20)
            .setExecutorProvider(executorProvider)
            .setCredentialsProvider(credentialsProvider)
            .setFlowControlSettings(flowControlSettings)
            .build();
    // Start the subscribers.
    subscriber1.startAsync().awaitRunning();
    subscriber2.startAsync().awaitRunning();
}
In the process() method you can use the usual ObjectMapper and/or instanceof checks to determine the type of the message (or have different receivers for different subscriptions, or even transport the type of the message in the Pub/Sub message attributes):
private void process(PubsubMessage message) {
    try {
        ModificationEvent modificationEvent = objectMapper.readValue(message.getData().toStringUtf8(), ModificationEvent.class);
        // ... handle the deserialized event
    } catch (Exception e) {
        // ... handle malformed or unexpected payloads
    }
}
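If you would rather carry the type in the Pub/Sub message attributes, here is a minimal sketch (the "type" attribute name, the event values and the handleAudit helper are assumptions, not from the original answer):
private void process(PubsubMessage message) throws Exception {
    // The publisher is assumed to set a "type" attribute on every message.
    String type = message.getAttributesOrDefault("type", "");
    String json = message.getData().toStringUtf8();
    if ("USER_CHANGE".equals(type)) {
        UserChangeEvent event = objectMapper.readValue(json, UserChangeEvent.class);
        // handle the user change event
    } else if ("AUDIT".equals(type)) {
        handleAudit(json); // hypothetical helper for the plain String audit payload
    } else {
        LOG.warn("Unknown message type: {}", type);
    }
}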

How to delegate Spring Integration Message Payload to Spring Batch Job?

I have an FTP Streaming Inbound Channel Adapter from Spring Integration which produces messages with payloads of type InputStream, letting files be fetched without writing to the local file system.
@Bean
@InboundChannelAdapter(channel = Constants.CHANNEL_INBOUND_FTP_ADAPTER, poller = @Poller(fixedDelay = Constants.FIXED_POLLING_FROM_INBOUND_FTP_ADAPTER_DELAY))
public MessageSource<InputStream> ftpMessageSource() {
    FtpStreamingMessageSource ftpStreamingMessageSource = new FtpStreamingMessageSource(ftpRemoteFileTemplate());
    ftpStreamingMessageSource.setRemoteDirectory(ftpConnectionParameters.getRootDir());
    ftpStreamingMessageSource.setFilter(chainFileListFilter());
    ftpStreamingMessageSource.setMaxFetchSize(Constants.INBOUND_ADAPTER_MAX_FETCH_SIZE);
    return ftpStreamingMessageSource;
}
After that I transform it with:
@Bean
@org.springframework.integration.annotation.Transformer(inputChannel = Constants.CHANNEL_INBOUND_FTP_ADAPTER, outputChannel = Constants.CHANNEL_STREAMED_DATA)
public Transformer transformer() {
    return new StreamTransformer(Charset.defaultCharset().name());
}
Then I handle the data to check that it works, and maybe for custom interceptors in the future:
@ServiceActivator(inputChannel = Constants.CHANNEL_STREAMED_DATA, outputChannel = "BATCH_ALARM_CHANNEL")
public Message<?> alarmHandler(Message<?> message) {
    System.out.println(Constants.CHANNEL_ALARM);
    System.out.println(message.getHeaders());
    return message;
}
After this, according to the official Spring Batch Integration documentation, I have one more Transformer which converts the message into a JobLaunchRequest:
@Transformer
public JobLaunchRequest toRequest(Message<File> message) {
    JobParametersBuilder jobParametersBuilder = new JobParametersBuilder();
    jobParametersBuilder.addDate("dummy", new Date());
    return new JobLaunchRequest(job, jobParametersBuilder.toJobParameters());
}
Here we have the Message from the last BATCH_ALARM_CHANNEL, which is needed by the Spring Batch job, but JobParametersBuilder does not allow putting complex objects, only simple types. So how can I pass the message payload when launching the job and do the rest of the work, such as reading, parsing and saving to the DB?
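One possible direction (a sketch, not from the original post): since job parameters must be simple types, pass an identifier such as the remote file name from the message headers and let the batch step re-open the remote resource itself. The "remoteFile" parameter name is an assumption; FileHeaders.REMOTE_FILE is populated by FtpStreamingMessageSource.
@Transformer
public JobLaunchRequest toRequest(Message<String> message) {
    // Keep the payload out of the job parameters; pass only simple identifiers.
    String remoteFile = (String) message.getHeaders().get(FileHeaders.REMOTE_FILE);
    JobParametersBuilder jobParametersBuilder = new JobParametersBuilder();
    jobParametersBuilder.addString("remoteFile", remoteFile);
    jobParametersBuilder.addDate("uniqueness", new Date()); // keeps each launch unique
    return new JobLaunchRequest(job, jobParametersBuilder.toJobParameters());
}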

Spring Integration Error Channel for async operations configuration

I am trying to configure an HttpRequestExecutingMessageHandler:
@Bean
public HttpRequestExecutingMessageHandler httpEnrollHandlerKairos() {
    HttpRequestExecutingMessageHandler handler = new HttpRequestExecutingMessageHandler(kairosProperties.getEnrollUrl());
    handler.setOutputChannel(channelConfiguration.kairosResponseChannel());
    handler.setExpectReply(true);
    handler.setMessageConverters(httpMessageConverters);
    handler.setHttpMethod(HttpMethod.POST);
    handler.setExpectedResponseType(EnrollResponse.class);
    handler.setHeaderMapper(defaultHeaderMapper);
    return handler;
}
And with the following flow.
public IntegrationFlow kairosEnrollFlow() {
    return IntegrationFlows
            .from(channelConfiguration.kairosEnrollChannel())
            .<Agreement, EnrollRequest>transform(p -> transformAgreementEnrollRequest(p))
            .transform(Transformers.toJson())
            .headerFilter(httpRequestHeaderFilters())
            .enrichHeaders(m -> m.header(APP_ID, kairosProperties.getAppId()))
            .enrichHeaders(m -> m.header(APP_KEY, kairosProperties.getAppKey()))
            .handle(httpEnrollHandlerKairos())
            .get();
}
I put an enrichHeaders on this flow for the errorChannel, with no results.
I configured an autowired PublishSubscribeChannel for the errorChannel as follows:
@Autowired
@Qualifier("errorChannel")
private PublishSubscribeChannel errorChannel;
My error handler flow (it redirects to another error channel for convenience); my objective is to configure specific error channels for different kinds of problems in the application:
@Autowired
@Qualifier("errorChannel")
private PublishSubscribeChannel errorChannel;

@Autowired
private ChannelConfiguration channelConfiguration;

@Bean
public IntegrationFlow errorHandlerFlow() {
    return IntegrationFlows.from(errorChannel)
            .<Message<?>, Message<ErrorMessage>>transform(m -> handleExceptionMessage(m))
            .channel(channelConfiguration.errorHandlerOutputChannel())
            .get();
}

public Message<ErrorMessage> handleExceptionMessage(Message<?> message) {
    MessagingException me = (MessagingException) message.getPayload();
    MessageHeaders mh = me.getFailedMessage().getHeaders();
    return MessageBuilder.withPayload(new ErrorMessage()).copyHeaders(mh).build();
}
But after hundreds of tries I couldn't get this errorChannel to work: the application throws an exception on the send method, and the configured errorChannel and "errorFlow" are never called.
I noticed that the handler uses the GenericMessagingTemplate, which in turn contains the following code:
TemporaryReplyChannel tempReplyChannel = new TemporaryReplyChannel(this.throwExceptionOnLateReply);
requestMessage = MessageBuilder.fromMessage(requestMessage).setReplyChannel(tempReplyChannel)
        .setHeader(this.sendTimeoutHeader, null)
        .setHeader(this.receiveTimeoutHeader, null)
        .setErrorChannel(tempReplyChannel).build();
which never lets you override the errorChannel...
I tried to use the headers (MessageHeaders and IntegrationUtils), and nothing.
Any tip on how to do this? Does anyone know what happens? All channels in question are asynchronous.
Any help clarifying the configuration of the error channel would be appreciated.
Kind Regards,
José Carlos Canova.
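A sketch of one possible direction (not from the original post; the channel name and wiring are assumptions): attach an ExpressionEvaluatingRequestHandlerAdvice to the HTTP handler endpoint, so failures from that specific handler go to a dedicated channel instead of relying on the errorChannel header.
@Bean
public ExpressionEvaluatingRequestHandlerAdvice kairosFailureAdvice() {
    ExpressionEvaluatingRequestHandlerAdvice advice = new ExpressionEvaluatingRequestHandlerAdvice();
    advice.setOnFailureExpressionString("payload");      // what to send on failure
    advice.setFailureChannelName("kairosErrorChannel");  // assumed dedicated error channel
    advice.setTrapException(true);                       // do not propagate the exception
    return advice;
}
The advice would be attached in the flow with .handle(httpEnrollHandlerKairos(), e -> e.advice(kairosFailureAdvice())) instead of the plain .handle(...) call.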

Removing old messages from ActiveMQ topics with WebSocket and Spring

I have an embedded ActiveMQ broker configured in Spring with WebSocket support (using STOMP).
@Configuration
@EnableWebSocketMessageBroker
public class WebSocketMqConfig extends AbstractWebSocketMessageBrokerConfigurer {

    @Override
    public void configureMessageBroker(MessageBrokerRegistry registry) {
        registry.enableStompBrokerRelay("/topic");
    }

    @Override
    public void registerStompEndpoints(StompEndpointRegistry registry) {
        registry.addEndpoint("/messaging")
                .setAllowedOrigins("*")
                .withSockJS();
    }

    @Bean(initMethod = "start", destroyMethod = "stop")
    public BrokerService brokerService() throws Exception {
        PersistenceAdapter persistenceAdapter = getPersistenceAdapter();
        BrokerService brokerService = new BrokerService();
        brokerService.setPersistenceAdapter(persistenceAdapter);
        brokerService.setPersistent(true);
        brokerService.setDeleteAllMessagesOnStartup(true);
        brokerService.setUseJmx(false);
        brokerService.setBrokerName("broker");
        brokerService.addConnector("stomp://localhost:61613");
        return brokerService;
    }
}
In my JavaScript client I subscribe to topic:
var successHandler = function() {
    stompClient.subscribe('/topic/test', function(not) {
        pushNotification(not);
    }, {'id': clientId, 'activemq.subscriptionName': clientId});
};
var socket = new SockJS('/messaging');
var stompClient = Stomp.over(socket);
stompClient.connect({'client-id': clientId}, successHandler, failureHandler);
And I am using backend service to feed this topic:
@Autowired
private SimpMessagingTemplate messagingTemplate;

messagingTemplate.convertAndSend("/topic/test", event);
And here are my questions:
When I send a message to the topic but the client hasn't subscribed yet, why are the messages not persisted? (I assumed that after the client subscribes, it should be notified about the missed messages.)
If the client disconnects from the topic, every message is persisted. Is there any way to restrict the number of persisted messages, or the time or size of KahaDB's log files?
Messages sent to a Topic are not persisted unless the client has previously created a durable Topic subscription and the message is sent with the persistent flag set. To create a durable subscription, add the headers as specified in the ActiveMQ STOMP documentation.
Once you start using durable Topic subscriptions then yes, messages can accumulate in the KahaDB store, at which point you can configure the store usage limits to control size.
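For the size limit, a minimal sketch against the embedded broker shown above (the limit values are only examples): cap the persistent store via the broker's SystemUsage so KahaDB cannot grow unbounded.
@Bean(initMethod = "start", destroyMethod = "stop")
public BrokerService brokerService() throws Exception {
    BrokerService brokerService = new BrokerService();
    brokerService.setPersistent(true);
    brokerService.setBrokerName("broker");
    brokerService.addConnector("stomp://localhost:61613");
    // Example limits: 512 MB for the KahaDB store, 128 MB for temporary storage.
    brokerService.getSystemUsage().getStoreUsage().setLimit(512L * 1024 * 1024);
    brokerService.getSystemUsage().getTempUsage().setLimit(128L * 1024 * 1024);
    return brokerService;
}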

Not able to connect LocalStack with Spring Cloud

I have set up LocalStack on my local PC. I am able to create, send and receive on the queue using the command line.
Now I am trying to connect a Spring Boot application with the LocalStack queue.
I am not finding any tutorial which guides me on how to read data from a LocalStack queue using Spring Cloud.
I have a class which looks like this:
@Component
@Profile("aws")
public class EventListener {

    private static final Logger LOGGER = LoggerFactory.getLogger(VisitsQueue.class);

    @Value("${sqs.queuename}")
    private String queueName;

    private ObjectMapper mapper = new ObjectMapper();

    @RuntimeUse
    @SqsListener("${sqs.queuename}")
    public void receiveMessage(String message, @Header(value = "SenderId", required = false) String senderId,
            @Headers Map<String, Object> allHeaders) {
        LOGGER.info("Received message with content {}", message);
    }
}
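A sketch of one way to point the Spring Cloud AWS SQS client at LocalStack (not from the original post; the edge endpoint http://localhost:4566, the region and the dummy credentials are assumptions based on LocalStack defaults): define an AmazonSQSAsync bean with an overridden endpoint so the @SqsListener connects to the local queue instead of AWS.
@Configuration
@Profile("aws")
public class LocalStackSqsConfig {

    @Bean
    @Primary
    public AmazonSQSAsync amazonSQSAsync() {
        // Point the SQS client at LocalStack's edge endpoint instead of the real AWS endpoint.
        return AmazonSQSAsyncClientBuilder.standard()
                .withEndpointConfiguration(
                        new AwsClientBuilder.EndpointConfiguration("http://localhost:4566", "us-east-1"))
                .withCredentials(new AWSStaticCredentialsProvider(
                        new BasicAWSCredentials("test", "test"))) // LocalStack accepts dummy credentials
                .build();
    }
}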
