Start Spring Boot app with Spring Integration Kafka consumers paused - java

I am working on a Spring Boot application that uses Spring Integration flows with Kafka topics as their source. Our integration flow starts from an interface containing SubscribableChannels annotated with org.springframework.cloud.stream.annotation.Input and Output. These are configured to read from Kafka via Cloud Config with spring.cloud.stream.kafka.bindings.
When the app first starts up it immediately begins reading from the Kafka topics. This is a problem as the app needs to initialize some local, non-persistable databases before it can start correctly processing incoming Kafka messages.
We are currently using a @PostConstruct method to populate these in-memory databases before Kafka starts, but this is suboptimal because the app can't use Eureka, Feign, etc., to reliably find a healthy service that has the latest data for the in-memory database.
For a variety of reasons the architecture can't be changed such that the in-memory database is shared or prepopulated. Just know that when I call it an in-memory database I'm simplifying things a bit; it's actually another service, of sorts.
What is the best way to start a Spring Boot app such that an Integration Flow that reads from Kafka starts in a paused state and can be unpaused after some other process completes?

I assume you use a KafkaMessageDrivenChannelAdapter and, given your mention of the Spring Integration Java DSL, Kafka.messageDrivenChannelAdapter() to be exact. That one can be configured with an id and autoStartup(false), so it won't start consuming from the Kafka topic immediately. Whenever you are ready to consume, you can start() this component, obtaining it as a Lifecycle from the application context via the mentioned id.
Or you can send an appropriate message to the Control Bus.
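A minimal sketch of that approach, assuming the spring-integration-kafka Java DSL is on the classpath; the topic name, adapter id, and payload types are placeholders:

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.ApplicationContext;
import org.springframework.context.Lifecycle;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;
import org.springframework.integration.kafka.dsl.Kafka;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.stereotype.Component;

@Configuration
class PausedKafkaFlowConfig {

    // The adapter gets an explicit id and autoStartup(false),
    // so the application boots without consuming from Kafka.
    @Bean
    public IntegrationFlow kafkaInboundFlow(ConsumerFactory<String, String> consumerFactory) {
        return IntegrationFlows
                .from(Kafka.messageDrivenChannelAdapter(consumerFactory, "some-topic")
                        .id("kafkaInboundAdapter")
                        .autoStartup(false))
                .handle(m -> System.out.println(m.getPayload()))
                .get();
    }
}

@Component
class KafkaStarter {

    @Autowired
    private ApplicationContext applicationContext;

    // Call this once the in-memory database has been initialized.
    public void startConsuming() {
        applicationContext.getBean("kafkaInboundAdapter", Lifecycle.class).start();
    }
}
```

With a Control Bus wired into the application, the equivalent would be sending a message whose payload is the SpEL expression @kafkaInboundAdapter.start() to the control channel.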
UPDATE
If you deal with Spring Cloud Stream and the Kafka Binder, you should consider injecting a BindingsEndpoint bean and calling its changeState(@Selector String name, State state) with the name of your binding and State.STOPPED. When your in-memory DB is ready, call it again with State.STARTED: https://docs.spring.io/spring-cloud-stream/docs/Elmhurst.RELEASE/reference/htmlsingle/#_binding_visualization_and_control
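A rough sketch of what that might look like; this assumes the actuator-backed BindingsEndpoint is available, "myInput" stands in for your actual binding name, and the exact package of the State enum can vary across Spring Cloud Stream versions:

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.stream.endpoint.BindingsEndpoint;
import org.springframework.cloud.stream.endpoint.BindingsEndpoint.State;
import org.springframework.stereotype.Component;

@Component
public class BindingGate {

    @Autowired
    private BindingsEndpoint bindingsEndpoint;

    // Stop the binding until the in-memory DB is ready.
    public void pause() {
        bindingsEndpoint.changeState("myInput", State.STOPPED);
    }

    // Resume consumption once initialization has completed.
    public void resume() {
        bindingsEndpoint.changeState("myInput", State.STARTED);
    }
}
```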

Related

Java - Is it Possible to Use Spring Cloud Stream Kafka and RabbitMQ for the Same Application

We have an application that is already using Spring Cloud Stream with RabbitMQ; some endpoints of the application are sending messages to RabbitMQ. Now we want new endpoints to start sending messages to Kafka, while the existing endpoints continue using RabbitMQ with Spring Cloud Stream. I am not sure if this is even possible, because it means we have to include both the Kafka and Rabbit binder dependencies in pom.xml. What configuration changes do we need to make in the yml file so that the app understands which bindings are for Kafka and which are for Rabbit? Many thanks.
Yes, it is possible. This is what we call a multi-binder scenario, and it is one of the core features designed specifically to support the use case you are describing.
Here is where you can find more information - https://docs.spring.io/spring-cloud-stream/docs/3.2.1/reference/html/spring-cloud-stream.html#multiple-binders
Also, here is an example that actually provides configuration using both Kafka and Rabbit. While the example is centered around CloudEvents, you can ignore that and concentrate strictly on the configuration related to the Rabbit and Kafka binders - https://github.com/spring-cloud/spring-cloud-function/tree/main/spring-cloud-function-samples/function-sample-cloudevent-stream
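For a rough idea, with both spring-cloud-stream-binder-kafka and spring-cloud-stream-binder-rabbit on the classpath, the configuration might look something like this (binding names and destinations here are placeholders):

```yaml
spring:
  cloud:
    stream:
      bindings:
        orderEvents-out-0:       # binding that should go to Kafka
          destination: order-events
          binder: kafka
        auditEvents-out-0:       # binding that should stay on RabbitMQ
          destination: audit-events
          binder: rabbit
```

You can also set spring.cloud.stream.default-binder, so only the exceptions need an explicit binder attribute.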
Feel free to ask follow up questions once you get familiar with that.

Should I use Spring Data flow server for my new Spring Batch jobs?

I have a requirement to create around 10 Spring Batch jobs, each consisting of a reader and a writer. All readers read data from different Oracle DBs and write into another Oracle DB (the source and destination servers are different). The jobs are implemented with Spring Boot, and all 10+ jobs are packaged into a single jar file. So far, so good.
Now the client also wants a UI to monitor job status and act as a job organizer. I've gone through the Spring Cloud Data Flow Server documentation for the UI requirement, but I'm not sure whether it will serve the purpose, or whether there is any alternative for monitoring job status and stopping and starting jobs from a UI whenever required.
Also, how could I separate the 10+ jobs inside a single jar in Spring Cloud Data Flow Server, if that is the only option for a UI?
Thanks in advance.
I don't have the reputation to add a comment, so I am posting an answer here, although I know this is not the way to share a reference link as an answer.
This might help you:
spring-batch-job-monitoring-with-angular-front-end-real-time-progress-bar
Observability of Spring Batch jobs comes from the metadata the framework persists in a relational database: job instances, executions, timestamps, read counts, write counts, and so on.
You have different ways to exploit these data: a SQL client, JMX, the Spring Batch API (JobExplorer, JobOperator), or Spring Batch Admin (deprecated in favor of Spring Cloud Data Flow Server).
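For instance, a small sketch of querying execution metadata through the Spring Batch API; the job name and the output format are placeholders:

```java
import java.util.List;

import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobInstance;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class JobStatusService {

    @Autowired
    private JobExplorer jobExplorer;

    // Print the status of the most recent executions of a given job.
    public void printRecentExecutions(String jobName) {
        List<JobInstance> instances = jobExplorer.getJobInstances(jobName, 0, 10);
        for (JobInstance instance : instances) {
            for (JobExecution execution : jobExplorer.getJobExecutions(instance)) {
                System.out.printf("%s #%d -> %s (started %s)%n",
                        jobName, execution.getId(), execution.getStatus(),
                        execution.getStartTime());
            }
        }
    }
}
```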
Data Flow is an orchestrator that lets you execute data pipelines made of streams and tasks (finite, short-lived, monitored services). For your jobs, you could wrap each job in a task and create a multi-task pipeline; Data Flow then gives you the status of each execution.
You can also expose your monitoring data by pushing it as metrics into, for instance, an InfluxDB.

Can I use spring.cloud.stream.bindings.<channel>.group when using RabbitMQ to obtain exactly-once delivery?

So I was reading this tutorial on configuring RabbitMQ with Spring Boot.
At a certain point it is said:
Most of the time, we need the message to be processed only once.
Spring Cloud Stream implements this behavior via consumer groups.
So I started looking for more information, and in the Spring docs it is written that:
When doing so, different instances of an application are placed in a competing consumer relationship, where only one of the instances is expected to handle a given message.
Spring Cloud Stream models this behavior through the concept of a consumer group. (Spring Cloud Stream consumer groups are similar to and inspired by Kafka consumer groups.)
So I set up two nodes with Spring Boot, Spring Cloud Stream, and RabbitMQ, using spring.cloud.stream.bindings.<channel>.group.
This still looks like at-least-once behavior to me. Am I wrong in assuming that? Should I still handle the possibility of a message being processed twice, even when using spring.cloud.stream.bindings.<channel>.group?
Thank you
It's at least once. The connection might close before the ack is sent. Rare, but possible.
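So, yes: you should still make processing idempotent. A minimal sketch of deduplicating on a message id, assuming the producer sets a messageId header; the header name, binding name, and in-memory store are illustrative only (a real deduplication store would be persistent and shared):

```java
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

import org.springframework.cloud.stream.annotation.StreamListener;
import org.springframework.messaging.handler.annotation.Header;
import org.springframework.stereotype.Component;

@Component
public class IdempotentHandler {

    // Ids already processed; in-memory here purely for illustration.
    private final Set<String> processedIds = ConcurrentHashMap.newKeySet();

    @StreamListener("input")
    public void handle(String payload, @Header("messageId") String messageId) {
        // add() returns false if the id was already present,
        // i.e. this is a redelivered duplicate.
        if (!processedIds.add(messageId)) {
            return;
        }
        // ... actual processing of the payload ...
    }
}
```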

Are there any plugins for broadcasting events in spring boot

I would like to broadcast a message across microservices whenever any database is updated.
Example:
Stock values keep changing; how do I notify all services to use the latest value every time the stock value is updated?
Can anyone suggest tools or plugins for this that can be embedded in Spring Boot?
Example code would be very helpful.
One easy solution for basic state changes would be spring-cloud-bus. If you need something more complex, you can find tons of examples with AMQP, Spring JMS, and Redis pub-sub in this book.
Spring Cloud Bus links nodes of a distributed system with a lightweight message broker. This can then be used to broadcast state changes (e.g. configuration changes) or other management instructions. The only implementation currently is with an AMQP broker as the transport, but the same basic feature set (and some more depending on the transport) is on the roadmap for other transports.
A common use case is a change in the config server that needs to be propagated to all the microservices. Normally we would change the value in the config server and then hit the refresh endpoint of each service; Spring Cloud Bus updates them all, making the process easier.
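Beyond configuration refreshes, you can broadcast your own events over the bus. A sketch under the assumption that Spring Cloud Bus (with an AMQP broker) is on the classpath; the event name, fields, and service are made up for the stock example:

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cloud.bus.BusProperties;
import org.springframework.cloud.bus.event.RemoteApplicationEvent;
import org.springframework.cloud.bus.jackson.RemoteApplicationEventScan;
import org.springframework.context.ApplicationEventPublisher;
import org.springframework.context.annotation.Configuration;
import org.springframework.stereotype.Service;

// Event carried across services via the bus; fields must be serializable.
class StockUpdatedEvent extends RemoteApplicationEvent {

    private String symbol;
    private double price;

    StockUpdatedEvent() {
        // required for JSON deserialization
    }

    StockUpdatedEvent(Object source, String originService, String symbol, double price) {
        super(source, originService); // no destination -> broadcast to all services
        this.symbol = symbol;
        this.price = price;
    }

    public String getSymbol() { return symbol; }
    public double getPrice() { return price; }
}

// Registers the custom event type with the bus serializer.
@Configuration
@RemoteApplicationEventScan(basePackageClasses = StockUpdatedEvent.class)
class BusConfig { }

@Service
class StockBroadcaster {

    @Autowired private ApplicationEventPublisher publisher;
    @Autowired private BusProperties busProperties;

    public void stockUpdated(String symbol, double price) {
        publisher.publishEvent(
                new StockUpdatedEvent(this, busProperties.getId(), symbol, price));
    }
}
```

Other services can then receive it with a plain @EventListener method taking a StockUpdatedEvent, provided they also scan the event's package.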

Parallel tests consuming ActiveMQ/JMS topic

I have a vm://localhost in-memory ActiveMQ setup in a Spring Boot JMS project.
My controllers send events to a particular topic, and some tests check that the events are properly sent, using JmsMessagingTemplate. The problem is that when I execute multiple tests at the same time, some of them fail because they receive an unexpected event.
How can I fix that? I have tried playing with acknowledge modes, concurrent consumers, exclusive consumers, jms.listener.max-concurrency, ActiveMQ pool configuration...
You should do one of the following:
1. Start an instance of the in-memory ActiveMQ broker for each test (or group of tests); for example, you can use an embedded broker to spawn multiple instances.
2. Dynamically generate a unique destination name per test, so each test gets its own separate topic or queue (see the sketch below).
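A sketch of the second option, assuming JUnit 5 and the default queue semantics of JmsMessagingTemplate (with topics, the subscriber would have to be registered before the send); the destination naming is arbitrary:

```java
import java.util.UUID;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.jms.core.JmsMessagingTemplate;

@SpringBootTest
class OrderEventsTest {

    @Autowired
    private JmsMessagingTemplate jmsMessagingTemplate;

    // Each test instance talks to its own destination, so parallel tests
    // cannot consume each other's messages.
    private final String destination = "order-events-" + UUID.randomUUID();

    @Test
    void receivesOnlyItsOwnEvent() {
        jmsMessagingTemplate.convertAndSend(destination, "order-created");
        String received = jmsMessagingTemplate.receiveAndConvert(destination, String.class);
        // assert on received...
    }
}
```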
