Subscribing a REST API to Kafka events - Java

I recently started working on Kafka for a project of mine.
I am trying to figure out how I can subscribe a REST API to a Kafka event, rather than running a consumer that keeps listening to the topic.
I came across Kafka Connect, but I am not able to figure out exactly how to achieve this.
Details: I am running a Spring Boot project as the producer, which uses the KafkaTemplate provided by Spring to publish messages. The consumer is also a Spring Boot project, which exposes REST APIs.

There is no other way than to use the Kafka consumer library to have data come from Kafka; Kafka Connect would do the same internally.
A RESTful service is stateless. Having a consumer is more stateful (offset maintenance, for example).
If you want streaming events in general, maybe you could look into WebSockets or gRPC instead.
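
For reference, a minimal always-running consumer with Spring Kafka looks like this (the topic name, group id, and String payload are assumptions for illustration):

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

// Minimal sketch of a long-running Spring Kafka consumer.
// The topic "orders" and String payload are illustrative assumptions.
@Component
public class OrderEventsListener {

    // Spring keeps this listener polling the topic in the background;
    // Kafka cannot "call" a REST endpoint on its own.
    @KafkaListener(topics = "orders", groupId = "rest-api-service")
    public void onEvent(String message) {
        // React to the event here, e.g. update state that your
        // REST endpoints later serve to clients.
        System.out.println("Received: " + message);
    }
}
```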

Related

Implementing Kafka inside a Java Client Library

I have a back-end application which exposes APIs for my clients to consume. Until now, the requirement came only from direct frontend dashboards. Recently, I got a client for my service who wants to consume these APIs from their backend application.
I am planning to build a Java client library for this, which calls my APIs and has a built-in in-memory cache. Up to this point everything is clear, but I want my client to have Kafka as well. One option is that the backend application that wants to consume this API has a Kafka listener inside its own application; the other idea that came to my mind is: what if I build a Kafka listener inside my client library itself? Is that a good idea, assuming that the Kafka config will be present inside the backend application that is going to use my client library?
Kafka is a backend service. If you are providing your own REST APIs for clients, then those are not consumed by a Spring @KafkaListener.
If you add your own @KafkaListener, then you could store that data in your own app and expose it via HTTP endpoints, sure.
But that still wouldn't solve how external services plan on using Kafka on their own. If both services are connected to the same Kafka cluster, then you don't need HTTP endpoints; instead, you would use KafkaTemplate to send data to Kafka, after which the external service would consume it via its own @KafkaListener.
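
The last point can be sketched roughly like this (the topic name, key, and String payload are assumptions for illustration):

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

// Sketch of cross-service messaging over a shared cluster.
// Topic "customer-events" and String payload are illustrative assumptions.
@Service
public class CustomerEventPublisher {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public CustomerEventPublisher(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void publish(String customerId, String payload) {
        // The external service consumes this with its own @KafkaListener;
        // no HTTP endpoint is involved.
        kafkaTemplate.send("customer-events", customerId, payload);
    }
}
```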

Java - Is it Possible to Use Spring Cloud Stream Kafka and RabbitMQ for the Same Application

We have an application that is already using Spring Cloud Stream with RabbitMQ; some endpoints of the application send messages to RabbitMQ. Now we want new endpoints to start sending messages to Kafka, while the existing endpoints continue using RabbitMQ with Spring Cloud Stream. I am not sure if this is even possible, because it means we have to include both the Kafka and Rabbit binder dependencies in pom.xml. What configuration changes do we need to make in the yml file so that the app understands which bindings are for Kafka and which are for Rabbit? Many thanks.
Yes, it is possible. This is what we call a multi-binder scenario, and it is one of the core features, designed specifically to support the use case you are describing.
Here is where you can find more information - https://docs.spring.io/spring-cloud-stream/docs/3.2.1/reference/html/spring-cloud-stream.html#multiple-binders
Also, here is an example that provides configuration using both Kafka and Rabbit. While the example is centered around CloudEvents, you can ignore that and concentrate strictly on the configuration related to the Rabbit and Kafka binders - https://github.com/spring-cloud/spring-cloud-function/tree/main/spring-cloud-function-samples/function-sample-cloudevent-stream
Feel free to ask follow up questions once you get familiar with that.
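
Before diving into the docs, here is a rough sketch of what a multi-binder application.yml can look like; the binder and binding names below are assumptions for illustration, and the reference guide above has the exact properties:

```yaml
spring:
  cloud:
    stream:
      binders:
        rabbitBinder:
          type: rabbit
        kafkaBinder:
          type: kafka
      bindings:
        # existing endpoint keeps using RabbitMQ
        legacyOut-out-0:
          destination: legacy-exchange
          binder: rabbitBinder
        # new endpoint sends to Kafka
        newOut-out-0:
          destination: new-topic
          binder: kafkaBinder
```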

Trying to understand Apache Kafka as a microservice. Still confused

I am learning about Kafka, and I am curious how a Kafka client should exist in a microservices architecture. I want Kafka to keep a log of important information and enable automatic, appropriate reactions to that information.
My question is: how should Kafka exist within the backend as a microservice?
1. Standalone client. The Kafka client (producer/consumer) exists alone. It exposes an API for the frontend to make HTTP requests (POST, PUT) that send data. The data is then converted into events and produced to the Kafka cluster by the Kafka client. The consumer also lives here and makes reactive API calls to a separate backend when necessary.
2. A layer in the backend. When the frontend makes an HTTP request (POST, PUT) to the backend, the REST controller in the backend looks at what data is sent and, if necessary, produces it to the Kafka cluster. The consumer also lives here and reacts to new events internally with other services in the backend.
3. Producer in the frontend, consumer in the backend. The frontend (React.js, Vue.js, etc.) has a Kafka client that produces events for important information that requires logging. A consumer Kafka client also exists in the backend and reacts to events internally with other services. The backend also exposes an API for non-Kafka requests.
Maybe a combination of these? Or are these all wrong?
Confluent has been quite helpful so far, but I think I am missing something important to piece all of it together.
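
Option 2 above can be sketched roughly as follows; the path, topic name, and String payload here are assumptions for illustration:

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

// Sketch of option 2: the REST controller in the backend is the producer.
// The "/events" path, topic "frontend-events", and String payload are
// illustrative assumptions.
@RestController
public class EventController {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public EventController(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    @PostMapping("/events")
    public void publish(@RequestBody String body) {
        // Convert the request into an event and produce it; a
        // @KafkaListener elsewhere in the backend reacts to it.
        kafkaTemplate.send("frontend-events", body);
    }
}
```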

How can I configure a Kafka connector that takes messages from SQS and moves them to a Kafka topic?

I have a use case where I want to move messages from SQS to a Kafka topic. The framework to be used is Spring Boot, so whenever I run my code it should start moving the messages. I searched for articles but found very few. I am looking for some boilerplate code that follows best practices to start with, and guidance on how to proceed further.
Thanks in advance.
You need to make yourself familiar with Enterprise Integration Patterns and its Spring Integration implementation.
To take messages from AWS SQS, you would need an SqsMessageDrivenChannelAdapter from the Spring Integration AWS extension. To post records into an Apache Kafka topic, you need a KafkaProducerMessageHandler from the spring-integration-kafka module.
Then you wire everything together via an IntegrationFlow bean in your Spring Boot configuration.
Of course, you can also use Spring Cloud AWS and Spring for Apache Kafka directly.
The choice is yours, but it is better to follow best practices and develop a real integration solution.
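
Putting the pieces above together, a rough sketch of such an IntegrationFlow could look like this. The queue and topic names are assumptions, and the constructor signatures vary between spring-integration-aws versions (newer versions take an AWS SDK v2 SqsAsyncClient), so treat this as a starting point rather than a definitive implementation:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.expression.common.LiteralExpression;
import org.springframework.integration.aws.inbound.SqsMessageDrivenChannelAdapter;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler;
import org.springframework.kafka.core.KafkaTemplate;
import software.amazon.awssdk.services.sqs.SqsAsyncClient;

// Sketch of the SQS -> Kafka flow described above. "my-sqs-queue" and
// "my-kafka-topic" are illustrative assumptions.
@Configuration
public class SqsToKafkaConfiguration {

    @Bean
    public IntegrationFlow sqsToKafkaFlow(SqsAsyncClient sqsClient,
                                          KafkaTemplate<String, String> kafkaTemplate) {
        // Inbound side: receive messages from the SQS queue and emit
        // each one into the flow.
        SqsMessageDrivenChannelAdapter sqsAdapter =
                new SqsMessageDrivenChannelAdapter(sqsClient, "my-sqs-queue");

        // Outbound side: publish each message to a Kafka topic.
        KafkaProducerMessageHandler<String, String> kafkaHandler =
                new KafkaProducerMessageHandler<>(kafkaTemplate);
        kafkaHandler.setTopicExpression(new LiteralExpression("my-kafka-topic"));

        return IntegrationFlow.from(sqsAdapter)
                .handle(kafkaHandler)
                .get();
    }
}
```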
Apache Kafka offers multiple ways to ingest data from different sources, e.g. Kafka Connect, a Kafka producer, etc. We need to be careful when selecting specific Kafka components, keeping things such as retry mechanisms and scalability in mind.
The best solution in this case would be to use an Amazon SQS source connector to ingest data from AWS SQS into a Kafka topic, and then write your consumer application to do whatever is necessary with the stream of records on that topic.
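
For illustration only, a Kafka Connect source-connector configuration generally takes the following shape. The connector class placeholder and property names below are assumptions modeled on typical source configs, not taken from any specific connector; check the documentation of the SQS connector you choose for the real property names:

```properties
# Illustrative sketch only - property names are assumptions;
# consult your SQS source connector's documentation.
name=sqs-source
connector.class=<your SQS source connector class>
tasks.max=1
kafka.topic=my-kafka-topic
sqs.url=https://sqs.us-east-1.amazonaws.com/123456789012/my-queue
```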

How to use Apache Kafka to send data from one microservice to another using Spring Boot?

How can I use Apache Kafka to send data from one microservice to another microservice using Spring Boot?
Yes, you can use Apache Kafka to communicate between microservices. With Spring Boot you can use Spring Kafka or the plain kafka-clients API. There are various ways to achieve this, and it totally depends on your use case. You can start with the Producer & Consumer API: the producer sends records to a topic, and then a consumer (or group of consumers) consumes the records from that topic.
Please refer to this for a better understanding.
If you need any help let me know. Happy to help.
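
A minimal producer/consumer pair with Spring Kafka might look like this; the topic name, group id, and String payload are assumptions for illustration:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Component;
import org.springframework.stereotype.Service;

// Producer side (microservice A): sends records to a topic.
// Topic "orders" and String payload are illustrative assumptions.
@Service
class OrderProducer {

    private final KafkaTemplate<String, String> kafkaTemplate;

    OrderProducer(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    void send(String order) {
        kafkaTemplate.send("orders", order);
    }
}

// Consumer side (microservice B): consumes records from the same topic.
@Component
class OrderConsumer {

    @KafkaListener(topics = "orders", groupId = "order-service")
    void consume(String order) {
        System.out.println("Consumed: " + order);
    }
}
```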
Links:
microservices-apache-kafka-domain-driven-design
