Is kafka container factory a requirement in Spring Kafka? - java

I have a simple consumer in Spring working. I have a config class defined with a bunch of factories, etc. When I remove the config class, the consumer still works. I'm wondering what the benefit of having the factory is, i.e.:

@Bean
public ConcurrentKafkaListenerContainerFactory<String, GenericRecord> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, GenericRecord> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    factory.setBatchListener(true);
    return factory;
}

public ConsumerFactory<String, GenericRecord> consumerFactory() {
    return new DefaultKafkaConsumerFactory<>(retrieveConsumerConfigs());
}

versus just passing values in via application properties and calling it a day. I have explicit control over the config in the class-based approach, but I was also thinking I could drop the class and have the values be available through the Spring environment properties, such as spring.kafka.bootstrap-servers, for example.

The container factory is required for @KafkaListener methods.
Spring Boot will auto-configure one (from application.properties/yml) if you don't provide your own bean. See KafkaAutoConfiguration.
Boot will also configure the consumer factory (if you don't).
An application typically does not need to declare any infrastructure beans.
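For example, the factory shown in the question can usually be replaced entirely by Boot configuration properties; a minimal sketch (the broker address and group id are assumed values):

```properties
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=my-group
spring.kafka.consumer.auto-offset-reset=earliest
# equivalent of factory.setBatchListener(true)
spring.kafka.listener.type=batch
```

With these in place, a plain @KafkaListener method works with no @Configuration class at all.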
EDIT
I prefer to never declare my own infrastructure beans. If I need some feature that is not exposed as a Boot property, or want to override some property for just one container, I simply add a customizer bean.

@Component
class Customizer {

    public Customizer(ConcurrentKafkaListenerContainerFactory<?, ?> factory) {
        factory.setContainerCustomizer(container -> {
            if (container.getContainerProperties().getGroupId().equals("slowGroup")) {
                container.getContainerProperties().setIdleBetweenPolls(60_000);
            }
        });
    }
}
or
@Component
class Customizer {

    Customizer(AbstractKafkaListenerContainerFactory<?, ?, ?> containerFactory,
               ThreadPoolTaskExecutor exec) {
        containerFactory.getContainerProperties().setConsumerTaskExecutor(exec);
    }
}
etc.

The simple consumer in Spring works because Spring Boot auto-configuration, under the hood, creates a ConcurrentKafkaListenerContainerFactory and registers it with the Spring container.
You can verify this by injecting the KafkaListenerContainerFactory implementation, as done below:

@RestController
public class EmployeeController {

    private final KafkaListenerContainerFactory<?> kafkaListenerContainerFactory;

    @Autowired
    public EmployeeController(KafkaListenerContainerFactory<?> kafkaListenerContainerFactory) {
        System.out.println(kafkaListenerContainerFactory instanceof ConcurrentKafkaListenerContainerFactory);
        this.kafkaListenerContainerFactory = kafkaListenerContainerFactory;
    }
}
But if you are not happy with Spring Boot's auto-configured bean, you can create your own and register it with the Spring container using the @Bean annotation.


Spring instantiate prototype beans by manual call

I have a @Configuration class that I try to use for custom configuration of the JMS components of my application. Here is its simplified code:

@Configuration
class JmsConfiguration(
    val props: JmsProperties
) : JmsListenerConfigurer {

    @Bean
    fun connectionFactoryManager(): ConnectionFactoryManager {
        return ConnectionFactoryManager(props.services.map { serviceProps ->
            connectionFactory(serviceProps)
        }.toList())
    }

    @Bean
    @Scope(BeanDefinition.SCOPE_PROTOTYPE)
    fun connectionFactory(serviceProps: JmsServiceProps): ConnectionFactory {
        val cf = MQConnectionFactory()
        // some configuration
        return cf
    }

    override fun configureJmsListeners(registrar: JmsListenerEndpointRegistrar) {
        // adding custom message listeners to registrar
    }
}

The thing is that Spring complains about the connectionFactory method and its parameter serviceProps, saying the following: "Parameter 0 of method connectionFactory in JmsConfiguration required a bean of type JmsServiceProps that could not be found", which is pretty strange. I thought Spring doesn't look up parameters for prototype-scoped bean factory methods? If this is not the case and I'm wrong, how should I create such instances?
Note: I need these connection factories to be present in the Spring context since they are wrapped by other components thereafter.
Unless that properties class is a @Bean, you need to tell Spring about it. The simplest way is to do:

@Configuration
@EnableConfigurationProperties(JmsServiceProps.class)

Why explicitly define KafkaTemplate beans?

The Spring Kafka reference documentation suggests creating the beans for Kafka templates explicitly. I'm using spring-boot-starter 2.3.3 and spring-kafka 2.5.5, and I noticed that you can just create a producer factory with a wildcard type and the Kafka template beans are created automatically. The downside of this approach is that the IDE can no longer correctly evaluate whether an @Autowired Kafka template bean actually exists. The advantage is less configuration when you use a lot of different value types in Kafka templates.
Are there any other reasons I should define these beans explicitly?
// In a @Configuration class

// Variant: just define a wildcard producer factory
@Bean
public ProducerFactory<String, ?> wildcardProducerFactory() {
    return new DefaultKafkaProducerFactory<>(config, new StringSerializer(), new JsonSerializer<>());
}

// Variant: define a specific producer factory and template
@Bean
public ProducerFactory<String, Foo> fooProducerFactory() {
    return new DefaultKafkaProducerFactory<>(config, new StringSerializer(), new JsonSerializer<>());
}

@Bean
public KafkaTemplate<String, Foo> fooKafkaTemplate() {
    return new KafkaTemplate<>(fooProducerFactory());
}

// Somewhere in a @Component class
// Usage here is the same for both variants
@Autowired
private KafkaTemplate<String, Foo> fooKafkaTemplate;
With Spring Boot you don't even need to create a ProducerFactory bean. The auto-configuration takes care of that for you: https://docs.spring.io/spring-boot/docs/current/reference/html/messaging.html#messaging.kafka
See also KafkaProperties.Producer for more info on how to provide serializers via configuration properties.
Generics are not taken into account when you inject a bean: if its raw type matches, you are good to go. There are no generics at runtime anyway, due to type erasure in Java.
You are probably just confused by the IDE representation. Since those beans are not declared in your project, the IDE doesn't see them on the classpath, where they are provided by Spring Boot at runtime.
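The type-erasure point can be illustrated with a minimal standalone sketch (plain Java, no Spring involved): at runtime two differently parameterized lists share the same class, which is why the container matches beans by raw type.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal illustration of type erasure: the generic parameters of a
// List are erased at runtime, so both variables share one class object.
public class ErasureDemo {
    public static void main(String[] args) {
        List<String> strings = new ArrayList<>();
        List<Integer> ints = new ArrayList<>();
        // Only the raw type ArrayList survives to runtime.
        System.out.println(strings.getClass() == ints.getClass()); // prints "true"
    }
}
```

This is also why injecting `KafkaTemplate<String, Foo>` succeeds even when the declared bean is a differently parameterized template: the match that matters at injection time is resolved by Spring from the declaration metadata, not from runtime type arguments.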

Why Redis Cache is not getting empty in my Spring Boot application?

I am using a Redis cache in my Spring Boot application to store the data of multiple REST APIs.
I am clearing the Redis cache at a regular interval using a Spring cron job. The method is getting called at the required time slots.
I have verified the logs, but the cache is not getting cleared, and hence it is showing stale data.
The code where I'm trying to clear the cache:
public class CustomerDerivation {

    @Autowired
    @Qualifier("redisCacheMngr")
    CacheManager redisCacheMngr;

    @Scheduled(cron = "${redis.api.update.interval}")
    @CacheEvict(value = "redis-cache", allEntries = true, cacheNames = {"redis-cache"})
    protected void cacheEvict() {
        redisCacheMngr.getCache("redis-cache").clear();
        logger.info("Evicting ModelCache");
    }
}
Custom cache configuration code.
@Configuration
@Profile("cloud")
public class CacheConfig extends AbstractCloudConfig {

    @Autowired
    Environment env;

    @Bean
    public RedisConnectionFactory brRedisFactory() {
        return connectionFactory().redisConnectionFactory(env.getProperty("model_cache_name"));
    }

    @Bean
    public RedisTemplate<String, Object> brRedisTemplate() {
        RedisTemplate<String, Object> redisTemplate = new RedisTemplate<>();
        redisTemplate.setConnectionFactory(brRedisFactory());
        return redisTemplate;
    }

    @Bean(name = "redisCacheMngr")
    public CacheManager cacheManager() {
        RedisCacheManager cacheManager = new RedisCacheManager(brRedisTemplate());
        cacheManager.setUsePrefix(true);
        cacheManager.setTransactionAware(true);
        return cacheManager;
    }
}
How can I fix the code to clear the Redis cache?
1) Have you enabled caching using the @EnableCaching annotation?
2) Why are you using @CacheEvict and clear() in the same method? Both serve the same purpose; use one or the other.
Check the following:
The app is started with the appropriate profile (i.e. "cloud").
@EnableCaching is present on your configuration.
Avoid mixing @CacheEvict and the CacheManager bean; you have to choose one way to evict.
Extract the @Scheduled method into another "CronJobs" bean, to avoid multiple annotations and AOP self-invocation issues inside the class.
Regards.
Spring uses Spring AOP (aspect-oriented programming) to implement caching, which means you must use public access on your cacheEvict() method so it can be intercepted by the AOP proxy. Otherwise it is as if you never annotated your method with @CacheEvict.
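The proxy mechanics behind this can be shown with a plain JDK dynamic proxy, no Spring required; this is a simplified sketch (Spring's actual caching infrastructure is more elaborate), but it demonstrates why only calls that arrive through the proxy reference get the "advice":

```java
import java.lang.reflect.Proxy;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of proxy-based interception: the "advice" lives in the
// invocation handler, not in the target, so direct calls on the
// target (or self-invocations inside it) are never intercepted.
interface CacheOps {
    void evict();
}

class Target implements CacheOps {
    public void evict() { }
}

public class ProxyDemo {
    public static void main(String[] args) {
        AtomicInteger intercepted = new AtomicInteger();
        CacheOps target = new Target();
        CacheOps proxy = (CacheOps) Proxy.newProxyInstance(
                CacheOps.class.getClassLoader(),
                new Class<?>[] { CacheOps.class },
                (p, method, methodArgs) -> {
                    intercepted.incrementAndGet(); // the "advice" runs here
                    return method.invoke(target, methodArgs);
                });
        proxy.evict();   // goes through the handler: intercepted
        target.evict();  // direct call: bypasses the proxy entirely
        System.out.println(intercepted.get()); // prints "1"
    }
}
```

The same reasoning applies to @CacheEvict and @Scheduled combined in one class: a @Scheduled invocation that reaches the method internally never passes through the caching proxy.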

Broken Spring @Profile logic

I'm using the new Spring Boot 2.0M7 and I am trying to define some conditional logic to load different beans depending on the active profile.
I have this (working) bean configuration. It defines an SQS-based connection factory for all environments except test, and ActiveMQ for test.
@Configuration
@EnableJms
public class QueueConfig {

    private static Logger LOG = LoggerFactory.getLogger(QueueConfig.class);

    @Profile({"!test"})
    @Bean
    public ConnectionFactory sqsConnectionFactory() {
        LOG.info("using sqs");
        return new SQSConnectionFactory(new ProviderConfiguration(), AmazonSQSClientBuilder.standard()
                .withRegion(Regions.EU_WEST_1)
                .withCredentials(new DefaultAWSCredentialsProviderChain())
        );
    }

    @Profile({"test"})
    @Bean
    public ConnectionFactory activeMqConnectionFactory() {
        LOG.info("using activemq");
        return new ActiveMQConnectionFactory("vm://localhost?broker.persistent=false");
    }

    @Bean
    public JmsTemplate defaultJmsTemplate(ConnectionFactory connectionFactory) {
        return new JmsTemplate(connectionFactory);
    }

    @Bean
    public DefaultJmsListenerContainerFactory jmsListenerContainerFactory(ConnectionFactory connectionFactory) {
        DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
        factory.setConnectionFactory(connectionFactory);
        factory.setDestinationResolver(new DynamicDestinationResolver());
        factory.setConcurrency("3-10");
        return factory;
    }
}
This works with a single profile. I can see in my test (annotated with @ActiveProfiles("test")) that the test profile is active and the correct bean loads (log message).
However, changing @Profile({"!test"}) to @Profile({"!test","!dev}) on sqsConnectionFactory and @Profile({"test"}) to @Profile({"test","dev}) on activeMqConnectionFactory breaks things.
I get an unresolved bean exception because it now has two instances instead of one. I can see in my logs that the test profile is still active, and despite this it happily loads both the SQS and ActiveMQ implementations even though it shouldn't.
Did something change with the logic for @Profile in Spring Boot 2.x? If so,
how can I define that the ActiveMQ implementation is used when the dev or test profile is active, and SQS otherwise?
If not, what am I doing wrong here?
There are many ways you can approach this problem. Here is one:
Create another profile, sqs. Use it to enable or disable the beans.

@Profile({"sqs"})
@Bean
public ConnectionFactory sqsConnectionFactory() { ... }

@Profile({"!sqs"})
@Bean
public ConnectionFactory activeMqConnectionFactory() { ... }
Then, in your configuration files, declare which profiles include it:

---
spring.profiles: dev
...
---
spring.profiles: test
...
---
spring.profiles: prod
spring.profiles.include:
  - sqs
@Profile({"!test","!dev}) - here you are missing one " after !dev; however, if it is just a typo in the post, try the following (that works for me):

@Profile(value={"!test", "!dev"})

And by the way - I personally prefer to have one @Bean configuration per class; in that case you are basically annotating your whole class with @Profile, which for me is much more readable.

Spring autowire list without merging all possible beans

I'm using Spring 4.2.0.
I create two beans:

@Bean
public String sessionAttributeName() {
    return "someString";
}

@Bean
public List<String> urlsRequireAuthentication() {
    return Lists.newArrayList(
        "/auction/*"
    );
}

When I try to autowire a list of beans like this:

@Bean
public FilterRegistrationBean someFilterRegistrationBean(List<String> urlsRequireAuthentication) {
    ...
}

not only will the original list ["/auction/*"] be autowired as expected, but all registered String beans will be merged into one big list, like ["/auction/*", "someString"].
I used this feature back in the day and it was useful, but for this particular place I really want to include only the contents of the urlsRequireAuthentication list. How can I do that?
Just use the method directly instead of injecting the bean as a parameter:

@Bean
public FilterRegistrationBean someFilterRegistrationBean() {
    List<String> urlsRequireAuthentication = urlsRequireAuthentication();
    ...
}
@Bean documentation:

Typically, @Bean methods are declared within @Configuration classes. In this case, bean methods may reference other @Bean methods in the same class by calling them directly. This ensures that references between beans are strongly typed and navigable. Such so-called 'inter-bean references' are guaranteed to respect scoping and AOP semantics, just like getBean() lookups would. These are the semantics known from the original 'Spring JavaConfig' project which require CGLIB subclassing of each such configuration class at runtime. As a consequence, @Configuration classes and their factory methods must not be marked as final or private in this mode.
UPDATE
Another way to do it would be to use the javax @Resource annotation. It does not work with the @Qualifier annotation, precisely because of this feature of @Autowired (it is possible to obtain all beans of a particular type from the ApplicationContext by adding the annotation to a field or method that expects an array of that type):

@Configuration
public class ConfigurationClass {

    @Resource(name = "urlsRequireAuthentication")
    private List<String> urlsRequireAuthentication;

    @Bean
    public FilterRegistrationBean someFilterRegistrationBean() {
        urlsRequireAuthentication.size();
        ...
    }
}
sessionAttributeName and urlsRequireAuthentication should be configuration properties, not beans. Create application.properties in the resources dir and add the following line: authentication-urls = /auction/*, /url2/*. Now you can access your properties using the @Value annotation.

@Configuration
@PropertySource("classpath:application.properties")
public class AppConfig {

    @Bean
    public FilterRegistrationBean someFilterRegistrationBean(@Value("${authentication-urls}") String[] authenticationUrls) {
        ...
    }
}

If you are using Spring Boot, you should check out the docs for externalized configuration.
Use the @Qualifier annotation:

@Bean
public FilterRegistrationBean someFilterRegistrationBean(@Qualifier("urlsRequireAuthentication") List<String> urlsRequireAuthentication) {
    ...
}
