How can we access the same cache between two war files? - java

I am using Spring's @Cacheable annotation to cache data, with Redis as the cache store.
I created a cache named xyz in one war; now I want to access/update/delete the same cache from another war.
Below is the code I used to create the cache manager:
@Bean
public JedisConnectionFactory redisConnectionFactory() {
    JedisConnectionFactory redisConnectionFactory = new JedisConnectionFactory();
    // Defaults
    redisConnectionFactory.setHostName("127.0.0.1");
    redisConnectionFactory.setPort(6379);
    return redisConnectionFactory;
}

@Bean
public RedisTemplate<String, Object> redisTemplate(RedisConnectionFactory cf) {
    RedisTemplate<String, Object> redisTemplate = new RedisTemplate<String, Object>();
    redisTemplate.setConnectionFactory(cf);
    return redisTemplate;
}

@Primary
@Bean(name = "cManager")
public CacheManager cacheManager(RedisTemplate<String, Object> redisTemplate) {
    RedisCacheManager cm = new RedisCacheManager(redisTemplate);
    return cm;
}
Below is the method that caches the data in war 1:
@Cacheable(value = "xyz", cacheManager = "cManager")
public Map<String, Map<String, List<DTO>>> cachingData()
        throws CustomException {
    // logic
}

As long as both web applications connect to the same Redis instance and use the same cache name and cache key, this should work transparently, as if everything were in the same war.
Example annotation:
@Cacheable(cacheNames = "myCache", key = "'myKey'")
public String myCacheableMethod() {
    return "some value";
}

Related

How to create RedisCacheManager in spring-data 3.0.x

I'm migrating my application from Spring Boot 1.5.x to 3.0.x. I want to keep Jedis, but I have a problem with the instantiation of RedisCacheManager.
The constructor signature is now:
RedisCacheManager(RedisCacheWriter cacheWriter, RedisCacheConfiguration defaultCacheConfiguration)
But before it was:
RedisCacheManager(RedisOperations redisOperations)
I define this bean, having only a RedisTemplate in scope:
@Bean
public CacheManager cacheManager1(RedisTemplate redisTemplate) {
    RedisCacheManager cacheManager = new RedisCacheManager(redisTemplate);
    cacheManager.setDefaultExpiration(300);
    HashMap<String, Long> expires = new HashMap<>();
    expires.put("DELIVERED_DLR_MESSAGE_PART_COUNT_MAP", 1000L);
    expires.put("FAILED_DLR_MESSAGE_PART_COUNT_MAP", 1000L);
    cacheManager.setExpires(expires);
    cacheManager.setUsePrefix(true);
    return cacheManager;
}
How is it supposed to be created now?
How is it supposed to be created now?
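In spring-data-redis 3.x the idiomatic route is the RedisCacheManager.builder API, which takes a RedisConnectionFactory rather than a RedisTemplate. A sketch of the equivalent of the bean above (cache names and TTLs taken from the question; treat the details as an approximation to adapt, not a drop-in answer):

```java
// Sketch: spring-data-redis 3.x replacement for the old
// RedisCacheManager(RedisOperations) + setDefaultExpiration/setExpires style.
@Bean
public CacheManager cacheManager1(RedisConnectionFactory connectionFactory) {
    // replaces setDefaultExpiration(300)
    RedisCacheConfiguration defaults = RedisCacheConfiguration.defaultCacheConfig()
            .entryTtl(Duration.ofSeconds(300));
    return RedisCacheManager.builder(connectionFactory)
            .cacheDefaults(defaults) // key prefixing is on by default, like setUsePrefix(true)
            // replaces the setExpires(...) map with per-cache configurations
            .withCacheConfiguration("DELIVERED_DLR_MESSAGE_PART_COUNT_MAP",
                    RedisCacheConfiguration.defaultCacheConfig().entryTtl(Duration.ofSeconds(1000)))
            .withCacheConfiguration("FAILED_DLR_MESSAGE_PART_COUNT_MAP",
                    RedisCacheConfiguration.defaultCacheConfig().entryTtl(Duration.ofSeconds(1000)))
            .build();
}
```

The JedisConnectionFactory can still back this; only the cache-manager wiring changed, not the driver.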

How to create N Kafka templates dynamically at run time - spring boot

I have a Spring Boot application that needs to connect to N Kafka clusters. Based on some condition, the Kafka template needs to switch and send a message.
I have seen solutions that create separate KafkaTemplate beans, but in my use case the number of clusters changes at deployment time.
ex:
@Bean(name = "cluster1")
public KafkaTemplate<String, String> kafkaTemplatesample1() {
    return new KafkaTemplate<>(devProducerFactory1());
}

@Bean(name = "cluster2")
public KafkaTemplate<String, String> kafkaTemplatesample2() {
    return new KafkaTemplate<>(devProducerFactory2());
}
Is there any other solution for this? If you can share sample code, it's much appreciated.
Let's assume that each cluster can be described with the following attributes:
@Getter
@Setter
public class KafkaCluster {
    private String beanName;
    private List<String> bootstrapServers;
}
For example, two clusters are defined in the application.properties:
kafka.clusters[0].bean-name=cluster1
kafka.clusters[0].bootstrap-servers=CLUSTER_1_URL
kafka.clusters[1].bean-name=cluster2
kafka.clusters[1].bootstrap-servers=CLUSTER_2_URL
Those properties are needed before beans are instantiated, in order to register the KafkaTemplate bean definitions, which makes @ConfigurationProperties unsuitable here. Instead, the Binder API is used to bind them programmatically.
KafkaTemplate bean definitions can be registered in an implementation of the BeanDefinitionRegistryPostProcessor interface.
public class KafkaTemplateDefinitionRegistrar implements BeanDefinitionRegistryPostProcessor {

    private final List<KafkaCluster> clusters;

    public KafkaTemplateDefinitionRegistrar(Environment environment) {
        clusters = Binder.get(environment)
                .bind("kafka.clusters", Bindable.listOf(KafkaCluster.class))
                .orElseThrow(IllegalStateException::new);
    }

    @Override
    public void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry registry) throws BeansException {
        clusters.forEach(cluster -> {
            GenericBeanDefinition beanDefinition = new GenericBeanDefinition();
            beanDefinition.setBeanClass(KafkaTemplate.class);
            beanDefinition.setInstanceSupplier(() -> kafkaTemplate(cluster));
            registry.registerBeanDefinition(cluster.getBeanName(), beanDefinition);
        });
    }

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
    }

    public ProducerFactory<String, String> producerFactory(KafkaCluster kafkaCluster) {
        Map<String, Object> configProps = new HashMap<>();
        configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaCluster.getBootstrapServers());
        configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return new DefaultKafkaProducerFactory<>(configProps);
    }

    public KafkaTemplate<String, String> kafkaTemplate(KafkaCluster kafkaCluster) {
        return new KafkaTemplate<>(producerFactory(kafkaCluster));
    }
}
Configuration class for the KafkaTemplateDefinitionRegistrar bean:
@Configuration
public class KafkaTemplateDefinitionRegistrarConfiguration {

    @Bean
    public static KafkaTemplateDefinitionRegistrar beanDefinitionRegistrar(Environment environment) {
        return new KafkaTemplateDefinitionRegistrar(environment);
    }
}
Additionally, exclude KafkaAutoConfiguration in the main class to prevent creation of the default KafkaTemplate bean. This is probably not ideal, because the other KafkaAutoConfiguration beans are then not created either.
@SpringBootApplication(exclude = {KafkaAutoConfiguration.class})
Finally, below is a simple test that proves the existence of two KafkaTemplate beans.
@SpringBootTest
class SpringBootApplicationTest {

    @Autowired
    List<KafkaTemplate<String, String>> kafkaTemplates;

    @Test
    void kafkaTemplatesSizeTest() {
        Assertions.assertEquals(2, kafkaTemplates.size());
    }
}
For reference: Create N number of beans with BeanDefinitionRegistryPostProcessor, Spring Boot Dynamic Bean Creation From Properties File
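With the bean definitions registered, a template for a given cluster can be looked up by its bean name at runtime. A minimal sketch (the service name and method are hypothetical, added only to show the lookup):

```java
// Hypothetical service that routes a message to the right cluster by bean name.
@Service
public class ClusterRouter {

    private final ApplicationContext context;

    public ClusterRouter(ApplicationContext context) {
        this.context = context;
    }

    public void send(String clusterBeanName, String topic, String message) {
        // Resolves e.g. "cluster1" or "cluster2" to the dynamically registered bean.
        KafkaTemplate<String, String> template =
                context.getBean(clusterBeanName, KafkaTemplate.class);
        template.send(topic, message);
    }
}
```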

The correct way to create a KafkaTemplate in Spring Boot

I'm trying to configure Apache Kafka in a Spring Boot application. I read this documentation and followed the steps:
1) I added these lines to application.yaml:
spring:
  kafka:
    bootstrap-servers: kafka_host:9092
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: org.apache.kafka.common.serialization.ByteArraySerializer
2) I created a new topic:
@Bean
public NewTopic responseTopic() {
    return new NewTopic("new-topic", 5, (short) 1);
}
And now I want to use the KafkaTemplate:
private final KafkaTemplate<String, byte[]> kafkaTemplate;

public KafkaEventBus(KafkaTemplate<String, byte[]> kafkaTemplate) {
    this.kafkaTemplate = kafkaTemplate;
}
But the IntelliJ IDE highlights the constructor parameter, warning that no matching bean is found.
To fix this I need to create a bean:
@Bean
public KafkaTemplate<String, byte[]> myMessageKafkaTemplate() {
    return new KafkaTemplate<>(greetingProducerFactory());
}
and pass greetingProducerFactory() to its constructor:
@Bean
public ProducerFactory<String, byte[]> greetingProducerFactory() {
    Map<String, Object> configProps = new HashMap<>();
    configProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka_hist4:9092");
    configProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    configProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class);
    return new DefaultKafkaProducerFactory<>(configProps);
}
But then what's the point of the settings in application.yaml if I have to create the ProducerFactory manually?
I think you can safely ignore IDEA's warning; I have no problems wiring in Boot's template with different generic types...
@SpringBootApplication
public class So55280173Application {

    public static void main(String[] args) {
        SpringApplication.run(So55280173Application.class, args);
    }

    @Bean
    public ApplicationRunner runner(KafkaTemplate<String, String> template, Foo foo) {
        return args -> {
            template.send("so55280173", "foo");
            if (foo.template == template) {
                System.out.println("they are the same");
            }
        };
    }

    @Bean
    public NewTopic topic() {
        return new NewTopic("so55280173", 1, (short) 1);
    }
}
@Component
class Foo {

    final KafkaTemplate<String, String> template;

    @Autowired
    Foo(KafkaTemplate<String, String> template) {
        this.template = template;
    }
}
and
they are the same
By default, a KafkaTemplate<Object, Object> is created by Spring Boot in the KafkaAutoConfiguration class. Since Spring considers generic type information during dependency injection, that default bean can't be autowired into a KafkaTemplate<String, byte[]> field.
I had the same issue initially, but when I ran it, it gave no errors and worked fine.
Ignore IntelliJ IDEA's warning; it may be an IDEA bug in figuring out autowired components.
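If the warning still bothers you, one option is to define a typed template on top of Boot's KafkaProperties, so the settings from application.yaml are still honored. A hedged sketch (note: buildProducerProperties() is shown in its no-arg form; some newer Boot versions expect an SslBundles argument instead):

```java
// Sketch: a KafkaTemplate<String, byte[]> built from Spring Boot's KafkaProperties,
// so bootstrap-servers etc. from application.yaml still apply.
@Configuration
public class TypedKafkaTemplateConfig {

    @Bean
    public KafkaTemplate<String, byte[]> byteArrayKafkaTemplate(KafkaProperties properties) {
        Map<String, Object> props = properties.buildProducerProperties();
        // Override serializers to match the template's generic types.
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class);
        return new KafkaTemplate<>(new DefaultKafkaProducerFactory<>(props));
    }
}
```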

How to use Spring Cache Redis with a custom RedisTemplate?

I'm migrating my Spring application from Spring Boot 1.5.9 to Spring Boot 2.0.0.
With this new Spring bundle, I have some issues with caching data in Redis.
In my configuration, I have 3 CacheManagers with different TTLs (long, medium and short):
@Bean(name = "longLifeCacheManager")
public CacheManager longLifeCacheManager() {
    RedisCacheConfiguration cacheConfiguration =
            RedisCacheConfiguration.defaultCacheConfig()
                    .entryTtl(Duration.ofSeconds(redisExpirationLong))
                    .disableCachingNullValues();
    return RedisCacheManager.builder(jedisConnectionFactory()).cacheDefaults(cacheConfiguration).build();
}
I also have a custom RedisTemplate:
@Bean
public RedisTemplate<?, ?> redisTemplate(RedisConnectionFactory connectionFactory) {
    RedisTemplate<?, ?> template = new RedisTemplate<>();
    template.setDefaultSerializer(new GenericJackson2JsonRedisSerializer());
    template.setConnectionFactory(connectionFactory);
    return template;
}
With the previous Spring version, everything that was cached used this RedisTemplate and was serialized with the GenericJackson2JsonRedisSerializer.
With the new Spring version, the CacheManager doesn't use the RedisTemplate but its own SerializationPair. As a result, everything is serialized with the default JdkSerializationRedisSerializer.
Is it possible to configure the CacheManager to use the RedisTemplate, and how?
If it is not possible, what can I do to use the Jackson serializer instead of the JDK one?
I finally found a working solution.
I can't configure the CacheManager to use my RedisTemplate, but I can set the serializer like this:
@Bean(name = "longLifeCacheManager")
public CacheManager longLifeCacheManager(JedisConnectionFactory jedisConnectionFactory) {
    RedisCacheConfiguration cacheConfiguration =
            RedisCacheConfiguration.defaultCacheConfig()
                    .entryTtl(Duration.ofSeconds(redisExpirationLong))
                    .disableCachingNullValues()
                    .serializeValuesWith(RedisSerializationContext.SerializationPair.fromSerializer(new GenericJackson2JsonRedisSerializer()));
    return RedisCacheManager.builder(jedisConnectionFactory).cacheDefaults(cacheConfiguration).build();
}
The serializeValuesWith method is the key.
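For completeness, key serialization can be controlled the same way. A small sketch, assuming keys should be stored as plain strings (convenient when inspecting Redis with redis-cli):

```java
// Sketch: keys as plain strings, values as JSON.
RedisCacheConfiguration cacheConfiguration = RedisCacheConfiguration.defaultCacheConfig()
        .serializeKeysWith(RedisSerializationContext.SerializationPair
                .fromSerializer(new StringRedisSerializer()))
        .serializeValuesWith(RedisSerializationContext.SerializationPair
                .fromSerializer(new GenericJackson2JsonRedisSerializer()));
```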

How to configure Redis caching in Spring Boot?

How can I configure Redis caching with Spring Boot? From what I have heard, it's just a few changes in the application.properties file, but I don't know exactly what.
To use Redis caching in your Spring Boot application, all you need to do is set these properties in your application.properties file:
spring.cache.type=redis
# set your Redis host and port here
spring.redis.host=localhost
spring.redis.port=6379
Add this dependency to your pom.xml:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-redis</artifactId>
</dependency>
Additionally, you have to put @EnableCaching on your main application class and use the @Cacheable annotation on the methods whose results should be cached. That is all that is needed to use Redis in a Spring Boot application. You can use it in any class by autowiring the CacheManager, in this case the RedisCacheManager:
@Autowired
RedisCacheManager redisCacheManager;
You can put all the required properties, that is, hostname, port, etc., in the application.properties file and then read them from it:
@Configuration
@PropertySource("application.properties")
public class SpringSessionRedisConfiguration {

    @Value("${redis.hostname}")
    private String redisHostName;

    @Value("${redis.port}")
    private int redisPort;

    @Bean
    public static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }

    @Bean
    JedisConnectionFactory jedisConnectionFactory() {
        JedisConnectionFactory factory = new JedisConnectionFactory();
        factory.setHostName(redisHostName);
        factory.setPort(redisPort);
        factory.setUsePool(true);
        return factory;
    }

    @Bean
    RedisTemplate<Object, Object> redisTemplate() {
        RedisTemplate<Object, Object> redisTemplate = new RedisTemplate<Object, Object>();
        redisTemplate.setConnectionFactory(jedisConnectionFactory());
        return redisTemplate;
    }

    @Bean
    RedisCacheManager cacheManager() {
        RedisCacheManager redisCacheManager = new RedisCacheManager(redisTemplate());
        return redisCacheManager;
    }
}
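Once the configuration above is in place, any Spring bean can cache method results. A minimal hypothetical example (the UserService, User type and lookup method are illustrations, not part of the configuration above):

```java
// Hypothetical service: the first call runs the lookup; later calls with the
// same id are served from the "users" cache in Redis.
@Service
public class UserService {

    @Cacheable(value = "users", key = "#id")
    public User findUser(String id) {
        return expensiveDatabaseLookup(id); // only executed on a cache miss
    }

    @CacheEvict(value = "users", key = "#id")
    public void evictUser(String id) {
        // removes the entry so the next findUser(id) reloads it
    }
}
```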
