Broken Spring @Profile logic - Java

I'm using the new Spring Boot 2.0 M7 and I am trying to define some conditional logic to load different beans depending on the active profile.
I have this (working) bean configuration. It defines an SQS-based connection factory for all environments except test, and an ActiveMQ one for test.
@Configuration
@EnableJms
public class QueueConfig {

    private static Logger LOG = LoggerFactory.getLogger(QueueConfig.class);

    @Profile({"!test"})
    @Bean
    public ConnectionFactory sqsConnectionFactory() {
        LOG.info("using sqs");
        return new SQSConnectionFactory(new ProviderConfiguration(), AmazonSQSClientBuilder.standard()
                .withRegion(Regions.EU_WEST_1)
                .withCredentials(new DefaultAWSCredentialsProviderChain())
        );
    }

    @Profile({"test"})
    @Bean
    public ConnectionFactory activeMqConnectionFactory() {
        LOG.info("using activemq");
        return new ActiveMQConnectionFactory("vm://localhost?broker.persistent=false");
    }

    @Bean
    public JmsTemplate defaultJmsTemplate(ConnectionFactory connectionFactory) {
        return new JmsTemplate(connectionFactory);
    }

    @Bean
    public DefaultJmsListenerContainerFactory jmsListenerContainerFactory(ConnectionFactory connectionFactory) {
        DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
        factory.setConnectionFactory(connectionFactory);
        factory.setDestinationResolver(new DynamicDestinationResolver());
        factory.setConcurrency("3-10");
        return factory;
    }
}
This works with a single profile. I can see in my test (annotated with @ActiveProfiles("test")) that the test profile is active and the correct bean loads (log message).
However, changing @Profile({"!test"}) to @Profile({"!test","!dev}) on the sqsConnectionFactory and @Profile({"test"}) to @Profile({"test","dev}) on the activeMqConnectionFactory breaks things.
I get an unresolved bean exception because there are now two ConnectionFactory instances instead of one. I can see in my logs that the test profile is still active, yet both the SQS and ActiveMQ implementations happily load, even though they shouldn't.
Did something change with the logic for @Profile in Spring Boot 2.x? If so, how can I define that the ActiveMQ implementation is used when the dev or test profile is active, and SQS otherwise?
If not, what am I doing wrong here?

There are many ways you can approach that problem. Keep in mind that the values passed to @Profile are OR'd together, so @Profile({"!test", "!dev"}) matches whenever at least one of the two profiles is inactive, which is why both connection factories end up being created.
Here is one approach: create another profile, sqs, and use it to enable or disable the beans.
#Profile({"sqs"})
#Bean
public ConnectionFactory sqsConnectionFactory() { ... }
#Profile({"!sqs"})
#Bean
public ConnectionFactory activeMqConnectionFactory() { ... }
Then declare, in your configuration files, which profiles include it and which do not:
---
spring.profiles: dev
...
---
spring.profiles: test
...
---
spring.profiles: prod
spring.profiles.include:
- sqs

#Profile({"!test","!dev}) - here you are missing one " after !dev, however, if it is just a typo in here post, try following (that works for me)
#Profile(value={"!test", "!dev"})
And by the way, I personally prefer to have one configuration @Bean per class; in that case you are basically annotating the whole class with @Profile, which to me is much more readable.
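As a minimal sketch of that per-class approach, reusing the sqs profile idea from the answer above (the class name is illustrative):

import javax.jms.ConnectionFactory;

import org.apache.activemq.ActiveMQConnectionFactory;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;

// The whole configuration class is active only when the sqs profile is NOT active.
@Configuration
@Profile("!sqs")
public class ActiveMqQueueConfig {

    @Bean
    public ConnectionFactory connectionFactory() {
        return new ActiveMQConnectionFactory("vm://localhost?broker.persistent=false");
    }
}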

Related

Is a Kafka container factory a requirement in Spring Kafka?

I have a simple consumer working in Spring. I have a config class defined with a bunch of factories, etc. When I remove the config class, the consumer still works. I'm wondering what the benefit of having the factory is, i.e.:
@Bean
public ConcurrentKafkaListenerContainerFactory<String, GenericRecord> kafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, GenericRecord> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(consumerFactory());
    factory.setBatchListener(true);
    return factory;
}

public ConsumerFactory<String, GenericRecord> consumerFactory() {
    return new DefaultKafkaConsumerFactory<>(retrieveConsumerConfigs());
}
and now just pass values in via application properties and call it a day. I have explicit control over the config with the class-based approach, but was also thinking I could drop the class and have the values be available through the Spring environment properties, like spring.kafka.bootstrap-servers, for example.
The container factory is required for @KafkaListener methods.
Spring Boot will auto-configure one (from application.properties/yml) if you don't provide your own bean. See KafkaAutoConfiguration.
Boot will also configure the consumer factory (if you don't).
An application, typically, does not need to declare any infrastructure beans.
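For example, roughly the same setup as the factory in the question can be expressed with Boot properties alone (the bootstrap server and group id are illustrative; check your Boot version's reference for the exact spring.kafka.* property names):

spring:
  kafka:
    bootstrap-servers: localhost:9092
    consumer:
      group-id: example-group
    listener:
      type: batch   # roughly the equivalent of factory.setBatchListener(true)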
EDIT
I prefer to never declare my own infrastructure beans. If I need some feature that is not exposed as a Boot property, or where I want to override some property for just one container, I simply add a customizer bean.
@Component
class Customizer {

    public Customizer(ConcurrentKafkaListenerContainerFactory<?, ?> factory) {
        factory.setContainerCustomizer(container -> {
            if (container.getContainerProperties().getGroupId().equals("slowGroup")) {
                container.getContainerProperties().setIdleBetweenPolls(60_000);
            }
        });
    }
}
or
@Component
class Customizer {

    Customizer(AbstractKafkaListenerContainerFactory<?, ?, ?> containerFactory,
            ThreadPoolTaskExecutor exec) {
        containerFactory.getContainerProperties().setConsumerTaskExecutor(exec);
    }
}
etc.
The simple consumer in Spring works because Spring Boot auto-configuration, under the hood, creates a ConcurrentKafkaListenerContainerFactory object and registers it with the Spring container.
You can validate this by injecting the KafkaListenerContainerFactory implementation, as done below:
@RestController
public class EmployeeController {

    private final KafkaListenerContainerFactory kafkaListenerContainerFactory;

    @Autowired
    public EmployeeController(KafkaListenerContainerFactory kafkaListenerContainerFactory) {
        System.out.println(kafkaListenerContainerFactory instanceof ConcurrentKafkaListenerContainerFactory);
        this.kafkaListenerContainerFactory = kafkaListenerContainerFactory;
    }
}
But if you are not happy with Spring Boot's auto-configured bean, you can create your own and register it with the Spring container using the @Bean annotation.
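For reference, this is the kind of listener that needs such a container factory behind the scenes, whether it is Boot's auto-configured one or your own @Bean (the topic and group names are illustrative):

import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class ExampleListener {

    // The container for this method is built by the kafkaListenerContainerFactory bean,
    // auto-configured by Boot unless you declare your own.
    @KafkaListener(topics = "example-topic", groupId = "example-group")
    public void listen(String message) {
        System.out.println("received: " + message);
    }
}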

Do we need to create beans for all built-in classes in Spring Boot?

I am trying to configure RabbitMQ with Spring Boot. Below is a snapshot of my config class.
Case 1:
@Bean
public RabbitTemplate rabbitTemplate(ConnectionFactory connectionFactory) {
    RabbitTemplate template = new RabbitTemplate(connectionFactory);
    template.setMessageConverter(new Jackson2JsonMessageConverter());
    return template;
}
This code works fine.
Case 2:
@Bean
public RabbitTemplate rabbitTemplate(ConnectionFactory connectionFactory, MessageConverter messageConverter) {
    RabbitTemplate template = new RabbitTemplate(connectionFactory);
    // template.setMessageConverter(new Jackson2JsonMessageConverter()); // Line 1 - works
    // template.setMessageConverter(messageConverter); // Line 2 - error: asks to inject a bean
    return template;
}
In this case, however, the code works fine if I use Line 1, where I create the Jackson2JsonMessageConverter object myself.
But I am writing this code to understand an already existing codebase where Line 2 is used instead of Line 1. So when I use Line 2 instead of Line 1, I get the error:
Consider defining a bean of type 'org.springframework.amqp.support.converter.MessageConverter' in your configuration.
So I have 2 questions:
Why this error?
If I define a bean for MessageConverter, say
@Bean
public MessageConverter createMessageConverter() {
    return new Jackson2JsonMessageConverter();
}
then it works. Then why is it not asking me to define a bean for the ConnectionFactory argument?
PS: there is no @Autowired used, neither here nor in the code I am trying to understand, and both arguments, ConnectionFactory and MessageConverter, are interfaces, not classes.
The short answer is: you need to understand the concept of auto-configurations in Spring Boot, which create a lot of @Beans for you without you "seeing" them.
A very good article to understand AutoConfigurations is this:
https://www.marcobehler.com/guides/spring-boot
In your case, you also might want to have a look at the "RabbitAutoConfiguration" class from Spring Boot's source code.
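To make that concrete, here is a simplified, hypothetical sketch of the pattern RabbitAutoConfiguration follows (this is not the actual Boot source): beans are contributed only when you have not declared your own, and no MessageConverter bean is contributed at all, which is why the ConnectionFactory parameter resolves but the MessageConverter one does not.

import org.springframework.amqp.rabbit.connection.CachingConnectionFactory;
import org.springframework.amqp.rabbit.connection.ConnectionFactory;
import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
class ExampleRabbitAutoConfigurationSketch {

    // Boot contributes a ConnectionFactory when you haven't declared one yourself,
    // which is why the ConnectionFactory parameter in both cases can be injected.
    @Bean
    @ConditionalOnMissingBean(ConnectionFactory.class)
    CachingConnectionFactory rabbitConnectionFactory() {
        return new CachingConnectionFactory("localhost");
    }

    // No MessageConverter bean is auto-configured, so a MessageConverter parameter
    // cannot be resolved until you declare one yourself (as in the @Bean shown above).
}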

Configuring a Spring Cloud Data Flow Task with its own Database

I have a task application with its own database that I'd like to run in Spring Cloud Data Flow.
My problem is that SCDF overwrites the datasource configuration in the task with the datasource configuration for SCDF. (Both databases are Oracle DBs.)
My task should write to a different database (but I also want its status to be recorded in the SCDF database).
How is it possible to configure my task to connect to its own database as well as to SCDF's database?
I found the solution.
I defined both data sources in a configuration class (one for JPA and one for SCDF) following this as an example: https://www.baeldung.com/spring-data-jpa-multiple-databases
However, this wasn't enough, because the Data Flow Server accepts only one data source by default. To overcome this, one needs to extend DefaultTaskConfigurer and set the Data Flow Server's data source in the constructor.
@Component
public class GeneratorTaskConfigurer extends DefaultTaskConfigurer {

    public GeneratorTaskConfigurer(@Qualifier("dataflowDataSource") DataSource dataSource) {
        super(dataSource);
    }
}
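For context, a minimal sketch of how the two data sources mentioned above could be declared (the property prefixes app.datasource and dataflow.datasource, and the bean names, are assumptions; with the default Hikari pool the URL is bound from the jdbc-url property):

import javax.sql.DataSource;

import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.boot.jdbc.DataSourceBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

@Configuration
public class DataSourcesConfig {

    // The task's own business database; JPA points at this one.
    @Bean
    @Primary
    @ConfigurationProperties("app.datasource")
    public DataSource appDataSource() {
        return DataSourceBuilder.create().build();
    }

    // The SCDF/task-repository database, injected into GeneratorTaskConfigurer above.
    @Bean
    @ConfigurationProperties("dataflow.datasource")
    public DataSource dataflowDataSource() {
        return DataSourceBuilder.create().build();
    }
}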
You can have one config class with the SCDF datasource code like this:
@Configuration
@Profile("cloud")
public class MySqlConfiguration {

    @Bean
    public Cloud cloud() {
        return new CloudFactory().getCloud();
    }

    @Bean
    @Primary
    public DataSource dataSource() {
        return cloud().getSingletonServiceConnector(DataSource.class, null);
    }

    @Bean
    @Primary
    public PlatformTransactionManager getTransactionManager() {
        return new DataSourceTransactionManager(dataSource());
    }

    @Bean
    public JobRepository jobRepositoryFactoryBean() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource());
        factory.setTransactionManager(getTransactionManager());
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Bean
    @Primary
    public DefaultTaskConfigurer defaultTaskConfigurer() {
        return new DefaultTaskConfigurer(dataSource());
    }
}
And then have your other datasource configuration in a separate class for the database you want to write to.
Make sure you mark the SCDF one @Primary, otherwise you get a multiple-datasource error.
Hope this helps.

Spring JMS Use Point-to-point and Topic in the same application

We are currently introducing ActiveMQ into our existing application, which was running on a different queueing system. Spring JMS is used to take advantage of the existing integration within the Spring Framework.
Most of our applications use point-to-point (queue) communication, with the exception of one. It needs to be able to listen to the topic created by another producing application while publishing to multiple queues at the same time.
This means that the application needs to support both topics and queues. However, when setting the global property
jms:
  pub-sub-domain: true
the setting applies globally and all queue listeners immediately subscribe to topics instead, which we can see in the ActiveMQ web interface.
Is there a way to configure the application to support both topics and queues at the same time?
The Boot property is used to configure the default container factory used by @JmsListener methods, as well as to configure the JmsTemplate.
Simply override Boot's default container factory...
@Bean
public DefaultJmsListenerContainerFactory jmsListenerContainerFactory(
        DefaultJmsListenerContainerFactoryConfigurer configurer,
        ConnectionFactory connectionFactory) {
    DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
    configurer.configure(factory, connectionFactory);
    return factory;
}
and then add a second one
@Bean
public DefaultJmsListenerContainerFactory jmsTopicListenerContainerFactory(
        DefaultJmsListenerContainerFactoryConfigurer configurer,
        ConnectionFactory connectionFactory) {
    DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
    configurer.configure(factory, connectionFactory);
    factory.setPubSubDomain(true); // override the boot property
    return factory;
}
Then refer to the alternate factory in the @JmsListener for the topic.
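For example, a topic listener pointed at the second factory could look like this (the destination name is illustrative):

import org.springframework.jms.annotation.JmsListener;
import org.springframework.stereotype.Component;

@Component
public class TopicListener {

    // Uses the pub-sub container factory defined above; queue listeners keep the default factory.
    @JmsListener(destination = "some.topic", containerFactory = "jmsTopicListenerContainerFactory")
    public void onMessage(String payload) {
        System.out.println("received from topic: " + payload);
    }
}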
Alternatively, if you don't have listeners for both types, set the property to true, but override Boot's JmsTemplate configuration.

How to set amqp RabbitMQ consumer tag in Spring Boot?

In the question How to set the consumer-tag value in spring-amqp, it is asked how to change the consumer tag when using Spring AMQP, and the answer suggests providing an implementation of ConsumerTagStrategy.
I'm using Spring Boot 2.0.5 and I'm trying to figure out if I can do the same customization, though I can't find any configuration property for it, nor does providing a bean of type ConsumerTagStrategy seem to work.
How should I go about this?
Override boot's container factory bean declaration and add it there.
@Bean
public SimpleRabbitListenerContainerFactory rabbitListenerContainerFactory(
        SimpleRabbitListenerContainerFactoryConfigurer configurer,
        ConnectionFactory connectionFactory) {
    SimpleRabbitListenerContainerFactory factory = new SimpleRabbitListenerContainerFactory();
    configurer.configure(factory, connectionFactory);
    factory.setConsumerTagStrategy(q -> "myConsumerFor." + q);
    return factory;
}
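With that factory in place, every @RabbitListener container built from it uses the custom tag; a minimal sketch (the queue name is illustrative):

import org.springframework.amqp.rabbit.annotation.RabbitListener;
import org.springframework.stereotype.Component;

@Component
public class ExampleRabbitListener {

    // The consumer for this queue would be tagged "myConsumerFor.someQueue"
    // by the ConsumerTagStrategy configured on the factory above.
    @RabbitListener(queues = "someQueue")
    public void handle(String payload) {
        System.out.println("received: " + payload);
    }
}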
