We are using the Spring Batch framework for a project. Can I configure Prometheus (https://prometheus.io/docs/practices/alerting/#alerting) with Spring Batch? In the Prometheus docs I can only see an example for Spring Boot. Any help regarding this is highly appreciated.
Spring Batch does not have Micrometer integration at this time. It's something we're evaluating for a future release. Feel free to open an issue if you are interested!
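In the meantime, you can publish metrics yourself from a JobExecutionListener. A minimal sketch, assuming micrometer-core and micrometer-registry-prometheus are on the classpath; the listener class and metric name here are hypothetical:

```java
import io.micrometer.core.instrument.MeterRegistry;
import java.util.concurrent.TimeUnit;
import org.springframework.batch.core.JobExecution;
import org.springframework.batch.core.JobExecutionListener;

// Hypothetical listener that records each job's duration and exit status
// to whatever MeterRegistry (e.g. a PrometheusMeterRegistry) is injected.
public class MetricsJobListener implements JobExecutionListener {

    private final MeterRegistry registry;

    public MetricsJobListener(MeterRegistry registry) {
        this.registry = registry;
    }

    @Override
    public void beforeJob(JobExecution jobExecution) {
        // nothing to record before the job starts
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        long millis = jobExecution.getEndTime().getTime()
                - jobExecution.getStartTime().getTime();
        registry.timer("batch.job.duration",
                "job", jobExecution.getJobInstance().getJobName(),
                "status", jobExecution.getExitStatus().getExitCode())
            .record(millis, TimeUnit.MILLISECONDS);
    }
}
```

Register the listener on your job via the job builder's listener(...) method, and expose the Prometheus registry's scrape output however your app serves HTTP.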
Currently, we have an SQS client configured using an IAM role that is picked up on the cluster. However, after migrating to Spring Boot 3, the @SqsListener is no longer consuming messages.
NOTE: The same code works on Spring Boot 2 but not on Spring Boot 3. Is there something else that needs to be configured, or am I missing something?
Has anyone else run into this issue?
Spring Cloud AWS 2.x is only compatible with Spring Boot 2.x; for Spring Boot 3.x, Spring Cloud AWS 3.0 is required.
See compatibility here.
The 3.0 RC version has just been released; feel free to try it out and provide feedback!
Spring Boot 3.x requires Spring Cloud AWS 3.x and the AWS Java SDK 2.x.
Visit this link for info
In order to create a consumer in Spring Boot 3 this way, you need to put @EnableScheduling on your main application class and then annotate the receive method with @Scheduled(fixedDelay = 5000). The @Scheduled method replaces @SqsListener here. You can visit this link for more info.
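A minimal sketch of that polling approach, using the AWS Java SDK 2.x directly; the queue URL and delay are placeholders, and SqsClient.create() assumes the default region/credentials chain (e.g. the IAM role on the cluster). @EnableScheduling on the application class is still required, as noted above:

```java
import java.util.List;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.DeleteMessageRequest;
import software.amazon.awssdk.services.sqs.model.Message;
import software.amazon.awssdk.services.sqs.model.ReceiveMessageRequest;

// Hypothetical scheduled poller standing in for @SqsListener
@Component
public class QueuePoller {

    private final SqsClient sqsClient = SqsClient.create();
    private final String queueUrl =
            "https://sqs.eu-west-1.amazonaws.com/123456789012/my-queue";

    @Scheduled(fixedDelay = 5000)
    public void receive() {
        List<Message> messages = sqsClient.receiveMessage(
                ReceiveMessageRequest.builder()
                        .queueUrl(queueUrl)
                        .maxNumberOfMessages(10)
                        .build())
                .messages();
        for (Message message : messages) {
            // process the payload, then delete so it is not redelivered
            System.out.println(message.body());
            sqsClient.deleteMessage(DeleteMessageRequest.builder()
                    .queueUrl(queueUrl)
                    .receiptHandle(message.receiptHandle())
                    .build());
        }
    }
}
```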
In order for the @SqsListener annotation to work, the spring-cloud-aws-autoconfigure dependency is required; it auto-wires the listener containers and other infrastructure needed for the annotation to function. You can see the extra beans it auto-configures here.
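With that dependency in place, the listener itself stays minimal. A sketch, assuming Spring Cloud AWS 3.x; the queue name is a placeholder:

```java
import io.awspring.cloud.sqs.annotation.SqsListener;
import org.springframework.stereotype.Component;

// With spring-cloud-aws-autoconfigure on the classpath, the listener
// container behind this annotation is created automatically.
@Component
public class OrderListener {

    @SqsListener("my-queue")
    public void listen(String payload) {
        System.out.println("Received: " + payload);
    }
}
```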
I am working on a PoC connecting a legacy Spring application to Kafka. It is a WAR application deployed to Tomcat, using Spring 4.3.12. Is there a library that makes communication with Kafka almost as easy as with Spring Boot? I only need the fundamental operations: sending a message, listening for confirmation, and receiving.
I have some experience with the Spring Boot support provided by the org.springframework.kafka:spring-kafka library. I am not sure how to efficiently adopt Kafka in legacy Spring. I'm considering the plain Kafka Java client, which looks promising, but since I'm used to working at the Spring Boot abstraction level, I have no idea how much code I would need to supply myself.
Web search is not much help here, since it tends to surface Spring Boot-related solutions. Migrating the legacy application is also under consideration; I just need some idea of how difficult each approach is.
kafka-clients is all you need (from Maven Central, not Confluent). You could go a step further and look into the Log4j2 Kafka bridge, and then property files for that.
If you want to externalize the config into a regular Java .properties file, you can; or you can pull values from environment variables, if you follow 12-factor principles.
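A minimal sketch of that plain kafka-clients approach, covering sending and the delivery confirmation; the file name and topic are placeholders, and kafka.properties is assumed to contain bootstrap.servers plus key/value serializer settings:

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

// Plain producer with externalized configuration, no Spring involved
public class PlainProducer {

    public static void main(String[] args) throws IOException {
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream("kafka.properties")) {
            props.load(in);
        }
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("my-topic", "key", "value"),
                    (metadata, exception) -> {
                        // confirmation callback: metadata on success,
                        // exception on failure
                        if (exception != null) {
                            exception.printStackTrace();
                        } else {
                            System.out.println("Acked at offset " + metadata.offset());
                        }
                    });
        }
    }
}
```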
But if you don't already have Spring Boot dependencies, I do not think adding them is worth it for Kafka alone.
Also, the Spring Kafka documentation covers how to configure your app without Boot; a sketch of that wiring follows below.
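If you do go the spring-kafka route on plain Spring 4.3, the Boot auto-configuration essentially boils down to beans like these, defined by hand; the broker address is a placeholder:

```java
import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

// Manual spring-kafka wiring, no Boot auto-configuration
@Configuration
public class KafkaConfig {

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        Map<String, Object> configs = new HashMap<>();
        configs.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        configs.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configs.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(configs);
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}
```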
I have a Spring Boot application that uses Spring Batch. I now want to implement an admin panel to see all job statuses. For this, Spring had "spring-batch-admin", but I see that it was deprecated a long time ago:
"The functionality of Spring Batch Admin has been mostly duplicated and expanded upon via Spring Cloud Data Flow and we encourage all users to migrate to that going forward."
But then Spring Cloud Data Flow says:
"Pipelines consist of Spring Boot apps, built using the Spring Cloud Stream or Spring Cloud Task microservice frameworks."
So in order to use this functionality, do I really need to convert my Spring Boot app to a microservice? Isn't this overkill just to see some batch statuses? Also, I cannot install Docker on my production server (for various reasons). Can I still use Spring Cloud Data Flow without Docker?
Yes, the Spring Boot batch app should be wrapped as a Spring Cloud Task, which should not be too complicated (see the sketch below).
If Docker does not suit your needs, see https://docs.spring.io/spring-cloud-dataflow/docs/current/reference/htmlsingle/#getting-started-local-deploying-spring-cloud-dataflow
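The wrapping usually amounts to one extra annotation. A sketch, assuming spring-cloud-starter-task is added as a dependency; the class name is a placeholder:

```java
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;

// @EnableTask records each run in the task repository, which is what
// Spring Cloud Data Flow reads to display execution statuses.
@SpringBootApplication
@EnableTask
@EnableBatchProcessing
public class BatchTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(BatchTaskApplication.class, args);
    }
}
```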
I want to have separate read and write datasources for replication purposes in a Spring Boot app. Are there any examples of how to implement this behavior?
This could help you: Spring Boot Configure and Use Two DataSources. You could also take a look at this blog.
This is what I was looking for: http://fedulov.website/2015/10/14/dynamic-datasource-routing-with-spring/
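The core of that approach is Spring's AbstractRoutingDataSource. A minimal sketch; the holder class and lookup keys are hypothetical, and the routing bean still needs its target DataSources map (read and write) set when it is defined:

```java
import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;

// Routing DataSource: the lookup key decides whether the read or write
// target serves the current call.
public class ReadWriteRoutingDataSource extends AbstractRoutingDataSource {

    // Hypothetical per-thread holder, set before DB access (e.g. in an
    // interceptor or around read-only transactions)
    public static final class DbContextHolder {
        private static final ThreadLocal<String> CTX =
                ThreadLocal.withInitial(() -> "WRITE");
        public static void set(String key) { CTX.set(key); }
        public static String get() { return CTX.get(); }
        public static void clear() { CTX.remove(); }
    }

    @Override
    protected Object determineCurrentLookupKey() {
        // "READ" or "WRITE", matching the keys of the target map
        return DbContextHolder.get();
    }
}
```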
We are trying to implement a message channel between a worker Spring application and a consumer Spring application (there will be replicas of the same consumer on multiple JVMs).
There is limited documentation for Spring Integration with Java config, but I was able to find documentation for Spring Kafka. I am not exactly sure how the dependency works.
Is Spring Integration Kafka based on Spring Kafka? Please give me an idea about this.
Where can I find proper documentation for the new release of Spring Integration Kafka?
Spring Integration Kafka 2.0 is built on top of Spring Kafka (Spring Integration Kafka 1.x used the 0.8.x.x Scala client directly).
The documentation for Spring Integration Kafka is in Chapter 6 of the Spring Kafka Reference Manual.
At some point, it is likely that spring-integration-kafka will be pulled into the main spring integration project/documentation.
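To make the layering concrete, here is a sketch of a Spring Integration Kafka 2.0 outbound adapter, which simply delegates to a Spring Kafka KafkaTemplate; the channel and topic names are placeholders, and @EnableIntegration plus a KafkaTemplate bean (configured as in the Spring Kafka docs) are assumed:

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.expression.common.LiteralExpression;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.kafka.outbound.KafkaProducerMessageHandler;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.messaging.MessageHandler;

// Messages sent to the "toKafka" channel are published via the template
@Configuration
public class OutboundConfig {

    @Bean
    @ServiceActivator(inputChannel = "toKafka")
    public MessageHandler kafkaOutbound(KafkaTemplate<String, String> template) {
        KafkaProducerMessageHandler<String, String> handler =
                new KafkaProducerMessageHandler<>(template);
        handler.setTopicExpression(new LiteralExpression("my-topic"));
        return handler;
    }
}
```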