Camel consumer route with input from another consumer route - java

I have a kafka consumer route from where I get some data.
from("Kafka:foo?brokers=localhost:9092")
Once I receive data from the consumer, I want to use that data as the topic name for a Paho MQTT consumer.
from("paho:#?brokerUrl=tcp://localhost:1883")
I'm not able to figure out how to set the dynamic header CamelMqttTopic from the first consumer, as the two seem to be independent flows. I'm using Camel with the Spring framework. Excuse me if my basic Camel understanding is flawed.

You can override the MQTT topic using the CamelPahoOverrideTopic message header, setting its value to the Kafka topic read from the kafka.TOPIC message header:
from("kafka:foo?brokers=localhost:9092")
.setHeader(PahoConstants.CAMEL_PAHO_OVERRIDE_TOPIC, simple("${headers[kafka.TOPIC]}"))
.to("paho:#?brokerUrl=tcp://localhost:1883");

Related

How can I return response from RabbitMQ producer to RestController?

I have two Spring Boot applications. The first one is a REST application. The REST one communicates with the second application through a RabbitMQ message queue. I send a request to a method with @GetMapping("/"), and this method produces a message to example-queue. A method with @RabbitListener(queues = {"example-queue"}) takes the message and creates an object in the database. Now, how can I send my response (the saved object) back to the method with @GetMapping("/")? I need a response from the consumer for ResponseEntity.ok(). How can I do that? Thank you.
Just see if you can model the interaction with the RabbitMQ consumer as a request-reply pattern.
The @RabbitListener method can just return your object and be marked with @SendTo. This way the framework will look into the replyTo property of the request message.
On the producer side you can just use AmqpTemplate.convertSendAndReceive().
See more in docs: https://docs.spring.io/spring-amqp/docs/current/reference/html/#request-reply
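A minimal sketch of that pattern, assuming the example-queue from the question and String payloads standing in for your entity; the class names and the "saved:" reply are placeholders:
import org.springframework.amqp.core.AmqpTemplate;
import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.http.ResponseEntity;
import org.springframework.stereotype.Component;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

// Consumer side: whatever the listener method returns is sent to the
// replyTo queue that convertSendAndReceive() set on the request message.
@Component
class ExampleListener {

    @RabbitListener(queues = "example-queue")
    public String handle(String request) {
        // ... save your entity here, then return it (or a DTO / its id) ...
        return "saved:" + request;
    }
}

// Producer side: convertSendAndReceive() blocks until the reply arrives
// (or the template's reply timeout expires) and returns the reply payload.
@RestController
class ExampleController {

    private final AmqpTemplate amqpTemplate;

    ExampleController(AmqpTemplate amqpTemplate) {
        this.amqpTemplate = amqpTemplate;
    }

    @GetMapping("/")
    public ResponseEntity<Object> create() {
        Object reply = amqpTemplate.convertSendAndReceive("example-queue", "some-params");
        return ResponseEntity.ok(reply);
    }
}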

Should I create NewTopics in each service with Spring Kafka?

I'm using Kafka for sending messages between services. I use a NewTopic bean for configuring the number of partitions, for example:
@Bean
fun kafkaTopic(kafkaProperties: KafkaProperties): NewTopic = NewTopic(
    kafkaProperties.topics.schedulerCalculationTopic.name,
    kafkaProperties.topics.schedulerCalculationTopic.partitions,
    1
)
My question is simple: should I add this bean to both the consumer service and the producer service, or only to one of them?
I would put it in the producer service and then consider the producer as the 'owner' of those topics.
But it gets a bit complicated if you have a scenario where several producers write to the same topic(s).
If you are not creating the topic on the fly, the best practice is to create the topic before reading from or writing to it.
The rationale is to prevent brokers from creating a topic whenever they receive a metadata fetch request or a consume request with that topic name. Otherwise, if the consumer starts before the producer, you might end up with the wrong number of partitions (the broker will create your topic with the default number-of-partitions setting).
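For reference, a minimal Java sketch of such a bean, declared once in the service that owns the topic; the topic name and partition count are placeholders, and it assumes recent Spring Kafka (TopicBuilder) plus Spring Boot's auto-configured KafkaAdmin, which creates NewTopic beans at startup if the topic does not already exist:
import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.TopicBuilder;

@Configuration
class KafkaTopicConfig {

    // Declared in one place, so the topic always exists with the intended
    // partition count before any consumer or producer touches it.
    @Bean
    public NewTopic schedulerCalculationTopic() {
        return TopicBuilder.name("scheduler-calculation-topic") // placeholder name
                .partitions(3)                                   // placeholder count
                .replicas(1)
                .build();
    }
}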

Difference between KafkaTemplate and KafkaProducer send method?

My question is: in a Spring Boot microservice using Kafka, which is appropriate to use, KafkaTemplate.send() or KafkaProducer.send()?
I have used KafkaConsumer and not @KafkaListener to poll the records, because @KafkaListener was fetching records as and when they arrived on the topics; I wanted the records to be polled periodically based on business needs.
I have gone through the documentation of KafkaProducer https://kafka.apache.org/10/javadoc/org/apache/kafka/clients/producer/KafkaProducer.html
and Spring KafkaTemplate
https://docs.spring.io/spring-kafka/reference/html/#kafka-template
I am unable to decide which is ideal to use, or at least the reason for using one over the other is unclear.
What I need is for the operation to be synchronous, i.e. I want to know whether the publish happened successfully or not, because if the record is not delivered I need to retry publishing.
Any help will be appreciated.
For your first question (which one should I use, KafkaTemplate or KafkaProducer):
The Kafka Producer is defined in Apache Kafka. The KafkaTemplate is Spring's implementation of it (although it does not implement Producer directly), and so it provides more methods for you to use.
Read this link:
What is the difference between Kafka Template and kafka producer?
For the retry mechanism in case of failure in publishing, I have answered this in another question.
The acks parameter controls how many partition replicas must receive the record before the producer can consider the write successful. There are 3 values for the acks parameter:
acks=0: the producer will not wait for a reply from the broker before assuming the message was sent successfully.
acks=1: the producer will receive a successful response from the broker the moment the leader replica receives the message. If the message can't be written to the leader, the producer will receive an error response and can retry.
acks=all: the producer will receive a successful response from the broker once all in-sync replicas have received the message.
Best way to configure retries in Kafka Producer
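Since the requirement is a synchronous publish with a retry on failure, here is a minimal sketch that blocks on the future returned by KafkaTemplate.send(); the SyncPublisher class, the topic/key/value parameters and the 10-second timeout are placeholders. Depending on the Spring Kafka version the future is a ListenableFuture or a CompletableFuture, but get() works with both, and acks is set on the producer configuration (e.g. spring.kafka.producer.acks=all in Spring Boot):
import java.util.concurrent.TimeUnit;

import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.SendResult;
import org.springframework.stereotype.Service;

@Service
public class SyncPublisher {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public SyncPublisher(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Blocks until the broker acknowledges the record (per the acks setting)
    // or the timeout expires, so the caller knows whether publishing
    // succeeded and can retry if it did not.
    public boolean publish(String topic, String key, String value) {
        try {
            SendResult<String, String> result =
                    kafkaTemplate.send(topic, key, value).get(10, TimeUnit.SECONDS);
            return result.getRecordMetadata() != null;
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        } catch (Exception e) {
            // ExecutionException or TimeoutException: the publish failed, retry here
            return false;
        }
    }
}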

Asynchronous request-reply with Spring Boot and RabbitMQ

We want to implement the following scenario:
A producer service sends some input params to another service asking for the details based on these params.
A producer wants to specify the queue where it will be listening for the reply.
Moreover, a producer wants to provide some metadata so that it can correlate the params it sent with a result it got.
Please advise how to do this properly.
See the AsyncRabbitTemplate.
It uses the correlationId and replyTo properties to convey that information to the service that handles the request.
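A minimal sketch, assuming a request queue named request-queue, a dedicated reply queue named reply-queue and a Requester class (all placeholders); the template fills in the correlationId and replyTo properties and matches each reply to the future it returned:
import java.util.concurrent.TimeUnit;

import org.springframework.amqp.rabbit.AsyncRabbitTemplate;
import org.springframework.amqp.rabbit.connection.ConnectionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
class AsyncRabbitConfig {

    // Requests go to request-queue (via the default exchange); replies are
    // consumed from reply-queue and correlated back by the template.
    @Bean
    public AsyncRabbitTemplate asyncRabbitTemplate(ConnectionFactory connectionFactory) {
        return new AsyncRabbitTemplate(connectionFactory, "", "request-queue", "reply-queue");
    }
}

class Requester {

    private final AsyncRabbitTemplate asyncTemplate;

    Requester(AsyncRabbitTemplate asyncTemplate) {
        this.asyncTemplate = asyncTemplate;
    }

    public String requestDetails(String params) throws Exception {
        // convertSendAndReceive() returns a future that completes when the
        // correlated reply arrives; block here for brevity, or attach a
        // callback for a fully non-blocking flow
        return asyncTemplate.<String>convertSendAndReceive(params)
                .get(30, TimeUnit.SECONDS);
    }
}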

Two-way communications (ask or tell)?

The Akka documentation for Java says
http://doc.akka.io/docs/akka/2.4/java/camel.html#Consumer_timeout
Two-way communications between a Camel endpoint and an actor are initiated by sending the request message to the actor with the ask pattern and the actor replies to the endpoint when the response is ready.
And then
http://doc.akka.io/docs/akka/2.4/java/camel.html#Asynchronous_routing
A consumer endpoint sends request messages to its consumer actor using the tell method and the actor returns responses with getSender().tell once they are ready.
Which of the two statements is true?
Does it depend on the Camel component?
If the tell method is used, how does the endpoint know which client to respond to?
Thanks.
