What is the approach to consume Kafka for a Java web application - java

I made a web application in Java Spring MVC.
In this application, traffic will be around 10,000 file uploads per minute.
So in the controller we wrote code to push the file byte data to a Kafka producer.
Where should I implement the consumer that receives the file data and processes it in Java?
Do I need to make a service which will run separately, like a Windows service? Is this a good way?
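A common approach is to run the consumer as a separate Spring Boot service with a @KafkaListener. A minimal sketch, assuming spring-kafka is on the classpath, the producer writes to a topic named file-uploads (name assumed), and a byte-array deserializer is configured:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;

@SpringBootApplication
public class FileConsumerApplication {

    public static void main(String[] args) {
        SpringApplication.run(FileConsumerApplication.class, args);
    }

    // Listens on the "file-uploads" topic (topic and group names assumed) and processes each payload.
    @KafkaListener(topics = "file-uploads", groupId = "file-processor")
    public void onFile(byte[] fileBytes) {
        // Do the actual file processing here (parse, store results, etc.).
        System.out.println("Received file of " + fileBytes.length + " bytes");
    }
}

A service like this can be deployed and restarted independently of the upload controller, and scaled by running more instances in the same consumer group.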

Related

Implementing Kafka inside a Java Client Library

I have a back-end application which exposes APIs for my clients to consume. Until now the requirement was only from direct frontend dashboards. Recently I got a client for my service that wants to consume these APIs from his backend application.
I am planning to build a client library in Java for this, which calls my APIs and has a built-in in-memory cache. Up to this point everything is clear, but I want my client to have Kafka as well. One way is that the backend application that wants to consume this API has a Kafka listener inside its own application; the other idea that came to my mind is: what if I build a Kafka listener inside my client library itself? Is it a good idea to do that, assuming that the Kafka config will be present inside the backend application that is going to use my client library?
Kafka is a backend service. If you are providing your own REST APIs for clients, then those are not used by a Spring @KafkaListener.
If you add your own @KafkaListener, then you could store that data in your own app and expose it via HTTP endpoints, sure.
But that still wouldn't solve how external services plan on using Kafka on their own. If both services are connected to the same Kafka cluster, then you don't need HTTP endpoints; rather, you would use KafkaTemplate to send data to Kafka, after which the external service would consume it via its own @KafkaListener.
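For illustration, a minimal sketch of that last option, assuming both services share a cluster and a topic named client-events (topic, group, and class names are placeholders):

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

// In the producing service: push data to the shared topic instead of exposing it over HTTP.
@Service
public class EventPublisher {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public EventPublisher(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void publish(String key, String payload) {
        kafkaTemplate.send("client-events", key, payload); // topic name assumed
    }
}

// In the consuming (external) service: a plain listener on the same topic.
@Service
class EventConsumer {

    @KafkaListener(topics = "client-events", groupId = "external-consumer")
    public void onEvent(String payload) {
        // Handle the event, e.g. update the external service's own state or cache.
    }
}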

Broadcast message to all instances of kubernetes java service

The problem: I have a spring boot service running on K8s. Generally API calls can be served by any pod of my service, but for a particular use case we have a requirement to propagate the call to all instances of the service.
A bit of googling led me to https://discuss.kubernetes.io/t/how-to-broadcast-message-to-all-the-pod/10002 where they suggest using
kubectl get endpoints cache -o yaml
and proceeding from there. This is fine for a human or a CLI environment, but how do I accomplish the same from within my Java service, aside from executing the above command via Process and parsing the output?
Essentially I want a way to do what the above command does, but in a more Java-friendly way.
It seems like your Spring Boot service should be listening to a message topic. When one instance receives an HTTP request on the /propagateme endpoint, it publishes an event to the Propagation topic that all other instances are subscribed to; when the instances receive the message from the topic, they perform the specific action.
See the JMS guide: https://spring.io/guides/gs/messaging-jms/
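A minimal sketch of that pattern with Spring JMS, assuming a reachable broker and a topic named propagation (endpoint, topic, and class names here are placeholders, not a definitive implementation):

import org.springframework.jms.annotation.JmsListener;
import org.springframework.jms.core.JmsTemplate;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class PropagationController {

    private final JmsTemplate jmsTemplate;

    public PropagationController(JmsTemplate jmsTemplate) {
        this.jmsTemplate = jmsTemplate;
        // Use publish/subscribe so every pod's listener receives the message.
        this.jmsTemplate.setPubSubDomain(true);
    }

    // Any pod can take the HTTP call and broadcast it to all instances.
    @PostMapping("/propagateme")
    public void propagate() {
        jmsTemplate.convertAndSend("propagation", "do-the-thing"); // topic name assumed
    }

    // Every instance subscribes to the same topic and reacts to the broadcast.
    // (The listener container factory must also be configured for pub-sub.)
    @JmsListener(destination = "propagation")
    public void onPropagation(String message) {
        // Perform the instance-local action here.
    }
}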

Add Web controllers to an Akka Actor System

I am working with Akka and Spring.
I have an actor system that operates on a Kafka stream setup (using akka-stream-kafka_2.12); the actors hold some data in memory and persist their state using akka-persistence.
What I want to know is whether I can create a REST endpoint that interacts with my actor system to provide some data or send messages to my actors.
My question is, how can this be achieved?
As said in the comments, I have created a sample working application on GitHub to demonstrate the usage of Spring with Akka.
Please note that:
I have used Spring Boot for quick setup and configuration.
You can't expect any kind of good/best practices in this demo project, as I had to create it in 30 minutes. It just explains one of the (simple) ways to use Akka within Spring.
This sample cannot be used in a microservice architecture because there is no remoting or clustering involved here; the API controllers talk directly to the actors.
In the controllers, I used GetMapping everywhere instead of PostMapping for simplicity.
I will update the repository with another sample explaining the usage with clustering, where the way of communication between the API controller and the ActorSystem changes.
Here is the link to the repo. Hope this will get you started.
You can either build the application yourself or run the api-akka-integration-0.0.1-SNAPSHOT.jar file from the command prompt. It runs on the default port 8080.
This sample includes two kinds of APIs: /Calc/{Operation}/{operand1}/{operand2} and /Chat/{message}
/chat/hello
/calc/add/1/2
/calc/mul/1/2
/calc/div/1/2
/calc/sub/1/2
Edit 2:
Updated the repo with Akka Cluster usage in the API:
API-Akka-Cluster
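For reference, a minimal sketch of how a Spring controller can talk to an actor using the ask pattern; the actor bean, message type, and timeout here are assumptions for illustration, not the repo's actual code:

import akka.actor.ActorRef;
import akka.pattern.Patterns;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

import java.time.Duration;
import java.util.concurrent.CompletionStage;

@RestController
public class ChatController {

    // ActorRef exposed as a Spring bean, created from the Spring-managed ActorSystem.
    private final ActorRef chatActor;

    public ChatController(ActorRef chatActor) {
        this.chatActor = chatActor;
    }

    // Sends the path variable to the actor and returns its reply asynchronously.
    @GetMapping("/chat/{message}")
    public CompletionStage<String> chat(@PathVariable String message) {
        return Patterns.ask(chatActor, message, Duration.ofSeconds(3))
                .thenApply(Object::toString);
    }
}

Returning a CompletionStage keeps the request thread free while the actor produces its reply.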

Spark Streaming Context blocking REST endpoints

I am working on a small project using Spring Boot, Kafka, and Spark. So far I have been able to create a Kafka producer in one project and a Spark-Kafka direct stream as a consumer.
I am able to see messages pass through, and things seem to be working as intended. However, I have a REST endpoint on the project that is running the consumer. Whenever I disable the direct stream, the endpoint works fine, but when I have the stream running, Postman says there is no response. I see nothing in the server logs indicating that a request was ever received either.
The Spark consumer is started by a bean at project launch. Is this keeping the normal server on localhost:8080 from being started?
Initially I was kicking off the StreamingContext by annotating it as a Bean. Instead, I made the application implement CommandLineRunner, and in the overridden run method I called the method that kicks off the StreamingContext. That allowed the embedded server (Apache Tomcat) to start and fixed the issue.
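A minimal sketch of that fix, assuming the Spark-Kafka wiring lives in a method like startStreamingContext() (class and method names are placeholders):

import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class ConsumerApplication implements CommandLineRunner {

    public static void main(String[] args) {
        SpringApplication.run(ConsumerApplication.class, args);
    }

    // CommandLineRunner.run() executes after the application context (and the
    // embedded web server) is up, so the blocking stream no longer stops startup.
    @Override
    public void run(String... args) throws Exception {
        startStreamingContext();
    }

    private void startStreamingContext() {
        // Placeholder: build the Spark-Kafka direct stream here, then call
        // start() and awaitTermination() on the StreamingContext. Blocking here
        // only ties up the main thread; the HTTP endpoints are already serving.
    }
}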

Notification system in spring web app

I am currently developing a Java/JEE application which has two projects, a backend and a frontend, communicating via microservices. I am using MySQL as the database and I want to create a notification system, so what is recommended to be used?
There are 3 options:
JMS/ActiveMQ with JmsTemplate
AMQP/RabbitMQ or Kafka with RabbitTemplate/KafkaTemplate (preferred for beginners)
Spring Cloud Stream with Kafka (high-throughput / advanced microservice use cases)
More here for AMQP.
More here for Cloud Stream.
If you're starting out, it's fine to use the second option. It's easy to migrate to the third one later, and you should use Spring Cloud (designed specifically for microservices) for that. The third one is the easiest and requires the least code.
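As an illustration of the second option with RabbitMQ, a minimal sketch of publishing and consuming a notification; the queue name and classes are assumptions, not a definitive design:

import org.springframework.amqp.core.Queue;
import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.amqp.rabbit.core.RabbitTemplate;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.stereotype.Service;

@Configuration
class NotificationQueueConfig {

    // Declares the queue (name assumed) on the broker at startup.
    @Bean
    public Queue notificationsQueue() {
        return new Queue("notifications");
    }
}

@Service
public class NotificationService {

    private final RabbitTemplate rabbitTemplate;

    public NotificationService(RabbitTemplate rabbitTemplate) {
        this.rabbitTemplate = rabbitTemplate;
    }

    // Backend publishes a notification; the default exchange routes it by queue name.
    public void notifyUser(String userId, String message) {
        rabbitTemplate.convertAndSend("notifications", userId + ": " + message);
    }

    // The frontend-facing service consumes it and pushes it to the user (e.g. over WebSocket).
    @RabbitListener(queues = "notifications")
    public void onNotification(String message) {
        // Deliver the notification here.
    }
}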
