I am new to Kafka. I have created a Spring Boot application that consumes messages from Kafka topics, processes them, and stores them in a database.
I tried the JMeter Pepper-Box plugin for this, but it did not work properly for me.
Can anyone suggest another tool (or a list of Kafka testing tools) for end-to-end performance testing of my Spring Boot code with Kafka, or a good Pepper-Box example? I followed this link: https://www.blazemeter.com/blog/apache-kafka-how-to-load-test-with-jmeter
We have an application that is already using Spring Cloud Stream with RabbitMQ; some endpoints of the application send messages to RabbitMQ. Now we want new endpoints to start sending messages to Kafka, while the existing endpoints continue using RabbitMQ with Spring Cloud Stream. I am not sure if this is even possible, because it means we have to include both the Kafka and Rabbit binder dependencies in pom.xml. What configuration changes do we need to make in the yml file so that the app understands which bindings are for Kafka and which are for Rabbit? Many thanks.
Yes, it is possible. This is what we call a multi-binder scenario, and it is one of the core features designed specifically to support the use case you are describing.
Here is where you can find more information - https://docs.spring.io/spring-cloud-stream/docs/3.2.1/reference/html/spring-cloud-stream.html#multiple-binders
Also, here is an example that provides configuration using both Kafka and Rabbit. While the example is centered around CloudEvents, you can ignore that aspect and concentrate strictly on the configuration related to the Rabbit and Kafka binders - https://github.com/spring-cloud/spring-cloud-function/tree/main/spring-cloud-function-samples/function-sample-cloudevent-stream
Feel free to ask follow-up questions once you get familiar with that.
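A minimal sketch of what such a multi-binder configuration might look like in application.yml (the binding and destination names here are illustrative, not taken from the linked docs; with both binder jars on the classpath, each binding picks its binder explicitly):

```yaml
spring:
  cloud:
    stream:
      # existing bindings keep using Rabbit unless overridden
      default-binder: rabbit
      bindings:
        orderEvents-out-0:          # existing endpoint, stays on RabbitMQ
          destination: order-events
          binder: rabbit
        auditEvents-out-0:          # new endpoint, sends to Kafka
          destination: audit-events
          binder: kafka
```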
I have been studying Apache Kafka for the past two weeks and have managed to understand how Kafka functions and how producers and consumers work. Now I want to design a small Java program that sends my Apache Tomcat 9 logs and metrics to Kafka, so it can be used for log aggregation. I searched for how to do this and learned about Log4j, through which I can produce custom logs in Apache Tomcat, but I don't know how to send those logs to Kafka. If anyone has done this before, please give me some guidance on how to write this program.
Thank you.
As commented, you would use a KafkaAppender on the application-server side, pointed at your Kafka brokers, to send data to them; Kafka doesn't pull data from your applications.
You can also write logs directly to disk and use any combination of log processors such as Filebeat, Fluent Bit, or rsyslog, all of which have Kafka integrations.
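As a sketch of the first approach, a log4j2.xml using Log4j 2's Kafka appender might look like the following (topic name and broker address are placeholders; note the Kafka client's own loggers should not be routed to the Kafka appender, or logging would recurse):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration>
  <Appenders>
    <!-- ships each log event to the given Kafka topic -->
    <Kafka name="KafkaAppender" topic="tomcat-logs">
      <PatternLayout pattern="%d{ISO8601} %p %c{1} - %m%n"/>
      <Property name="bootstrap.servers">localhost:9092</Property>
    </Kafka>
    <!-- keep a local appender as a fallback -->
    <Console name="Console">
      <PatternLayout pattern="%d %p %m%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <!-- avoid recursive logging from the Kafka client itself -->
    <Logger name="org.apache.kafka" level="info" additivity="false">
      <AppenderRef ref="Console"/>
    </Logger>
    <Root level="info">
      <AppenderRef ref="KafkaAppender"/>
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>
```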
I have a use case where I want to move messages from SQS to a Kafka topic. The framework to be used is Spring Boot, so whenever I run my code it should start moving the messages. I searched for articles but found very few. I am looking for some boilerplate code to start with that follows best practices, and guidance on how to proceed further.
Thanks in advance.
You need to make yourself familiar with Enterprise Integration Patterns and its Spring Integration implementation.
To take messages from AWS SQS you would need an SqsMessageDrivenChannelAdapter from the Spring Integration AWS extension. To post records to an Apache Kafka topic you need a KafkaProducerMessageHandler from the spring-integration-kafka module.
Then you wire everything together via an IntegrationFlow bean in your Spring Boot configuration.
Of course, you can also use Spring Cloud AWS and Spring for Apache Kafka directly.
The choice is yours, but it is better to follow best practices and develop a genuine integration solution.
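A rough sketch of that wiring as a Spring configuration class (queue and topic names are illustrative, and the exact adapter constructors depend on your spring-integration-aws and spring-integration-kafka versions, so treat this as a starting point rather than a drop-in solution):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.aws.inbound.SqsMessageDrivenChannelAdapter;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.kafka.dsl.Kafka;
import org.springframework.kafka.core.KafkaTemplate;
import software.amazon.awssdk.services.sqs.SqsAsyncClient;

@Configuration
public class SqsToKafkaConfig {

    @Bean
    public IntegrationFlow sqsToKafka(SqsAsyncClient sqsClient,
                                      KafkaTemplate<String, String> kafkaTemplate) {
        return IntegrationFlow
                // inbound: listen to the SQS queue
                .from(new SqsMessageDrivenChannelAdapter(sqsClient, "my-source-queue"))
                // outbound: publish each message to the Kafka topic
                .handle(Kafka.outboundChannelAdapter(kafkaTemplate)
                        .topic("my-target-topic"))
                .get();
    }
}
```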
Apache Kafka offers multiple ways to ingest data from different sources, e.g. Kafka Connect, a Kafka producer, etc. We need to be careful when selecting specific Kafka components, keeping things such as retry mechanisms and scalability in mind.
The best solution in this case would be to use an Amazon SQS source connector to ingest data from AWS SQS into a Kafka topic, and then write your consumer application to do whatever is necessary with the stream of records from that topic.
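For illustration, a connector configuration might look roughly like this (property names vary by connector distribution; this sketch assumes Confluent's SQS source connector, and the queue URL and topic are placeholders, so check your connector's documentation for the exact settings):

```properties
name=sqs-source
connector.class=io.confluent.connect.sqs.source.SqsSourceConnector
tasks.max=1
kafka.topic=sqs-messages
sqs.url=https://sqs.us-east-1.amazonaws.com/123456789012/my-queue
```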
I would like to gather and store data on the availability of a service or node. The next day I could summarize the figures, like { day-1: service = 98.5%; day-2 = 99% }.
I could get the data by calling a simple REST (ping) service (e.g. via Actuator). Then I would need to write a custom scheduled application calling the Actuator/ping services.
Is there a simple solution for collecting/storing the availability data? Via Spring Batch?
UPDATE 31-05: I read about Spring Boot Admin. Is this the right solution? See also this introduction.
The data could be extracted and formatted in a CSV, JasperReporting, etc.
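The daily roll-up described above is simple arithmetic once the ping results are collected; a minimal sketch (the sampling itself, e.g. a scheduled Actuator ping, is left out, and the sample counts are made up for illustration):

```java
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class AvailabilitySummary {

    /** Percentage of successful pings among a day's samples. */
    static double summarize(List<Boolean> samples) {
        long up = samples.stream().filter(b -> b).count();
        return up * 100.0 / samples.size();
    }

    /** Builds an illustrative sample list with the given number of successes. */
    static List<Boolean> pings(int total, int up) {
        return IntStream.range(0, total).mapToObj(i -> i < up).collect(Collectors.toList());
    }

    public static void main(String[] args) {
        Map<String, Double> report = new TreeMap<>();
        report.put("day-1", summarize(pings(200, 197))); // 98.5
        report.put("day-2", summarize(pings(200, 198))); // 99.0
        System.out.println(report); // prints {day-1=98.5, day-2=99.0}
    }
}
```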
I hope I can help you. I think what you need is a way of monitoring your applications persistently. You could build your own solution by creating a ping resource and scheduling a client to collect availability information from time to time. But, to avoid reinventing the wheel, I really suggest you use a professional tool.
I recommend using a dashboard tool like Grafana to create these reports, and I suggest you try Prometheus to capture the monitoring information.
I have listed some links below.
Actuator and Prometheus
monitoring-spring-boot-applications-with-prometheus
Prometheus dashboard in Grafana
I am currently developing a Java/JEE application consisting of two projects, a backend and a frontend, which communicate via microservices. I am using MySQL as the database, and I want to create a notification system. What is recommended?
There are 3 options.
JMS/ActiveMQ with JMSTemplate
AMQP/RabbitMQ or Kafka with RabbitTemplate/KafkaTemplate (Preferred for beginners)
Spring Cloud Stream with Kafka (high throughput / advanced microservice use cases)
More here for AMQP
More here for Cloud Stream.
If you're starting out, the second option is fine. It's easy to migrate to the third one later, and for that you should use Spring Cloud (designed specifically for microservices). The third option is the easiest and requires the least code.
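As a sketch of option 2 with Kafka, publishing a notification through KafkaTemplate could look like this (topic name, payload type, and class names are illustrative; requires spring-kafka on the classpath and a configured broker):

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class NotificationPublisher {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public NotificationPublisher(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void notifyUser(String userId, String message) {
        // key by user id so notifications for one user land on the same
        // partition and stay ordered
        kafkaTemplate.send("user-notifications", userId, message);
    }
}
```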