I have created an AWS ElastiCache for Redis cluster with cluster mode enabled. There is currently 1 shard with 2 nodes.
I'm trying to connect to the cluster through a bastion host, using the following port forwarding:
LocalForward localhost:6379 my-app.redis.apps.region.company.com:6379
I can reach the ElastiCache service with a Spring Boot standalone configuration, but when fetching data from the cache I get a MOVED exception.
I have tried switching the configuration to cluster mode using only the cluster endpoint, but I am getting:
Caused by: io.lettuce.core.RedisException: Cannot obtain initial Redis Cluster topology
at io.lettuce.core.cluster.RedisClusterClient.lambda$getPartitions$0(RedisClusterClient.java:329)
at io.lettuce.core.cluster.RedisClusterClient.get(RedisClusterClient.java:941)
at io.lettuce.core.cluster.RedisClusterClient.getPartitions(RedisClusterClient.java:329)
at org.springframework.data.redis.connection.lettuce.ClusterConnectionProvider.getConnectionAsync(ClusterConnectionProvider.java:92)
at org.springframework.data.redis.connection.lettuce.ClusterConnectionProvider.getConnectionAsync(ClusterConnectionProvider.java:40)
at org.springframework.data.redis.connection.lettuce.LettuceConnectionProvider.getConnection(LettuceConnectionProvider.java:53)
at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory$ExceptionTranslatingConnectionProvider.getConnection(LettuceConnectionFactory.java:1595)
... 134 common frames omitted
Caused by: io.lettuce.core.cluster.topology.DefaultClusterTopologyRefresh$CannotRetrieveClusterPartitions: Cannot retrieve cluster partitions from [redis://clustercfg.redis-*******.****.****.cache.amazonaws.com]
Here's my configuration:
@Configuration
class RedisConfiguration {

    @Bean
    fun connectionFactory(): RedisConnectionFactory? {
        return LettuceConnectionFactory(
            RedisClusterConfiguration().clusterNode(
                "clustercfg.redis-*******.****.****.cache.amazonaws.com",
                6379
            )
        )
    }

    @Bean
    fun redisTemplate(redisConnectionFactory: RedisConnectionFactory) =
        RedisTemplate<String, String>().apply {
            setConnectionFactory(redisConnectionFactory)
            StringRedisSerializer().let {
                keySerializer = it
                valueSerializer = it
            }
        }

    @Bean
    fun remoteConfigCacheManager(
        connectionFactory: RedisConnectionFactory
    ): RedisCacheManager? {
        return RedisCacheManager.builder(connectionFactory)
            .withCacheConfiguration(
                "CACHE_NAME",
                RedisCacheConfiguration.defaultCacheConfig().entryTtl(Duration.ofHours(1))
            )
            .build()
    }
}
And there's the config in application.yml
redis:
  host: ${REDIS_HOST:clustercfg.redis-*******.****.****.cache.amazonaws.com}
  port: ${REDIS_PORT:6379}
  password: ${REDIS_PASSWORD:<SOME_SECRET_PASSWORD>}
  ssl: ${REDIS_SSL:true}
I've really browsed through every possible SO thread and could not find a solution. Can this be related to my bastion host?
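One thing worth checking: with cluster mode enabled, the Lettuce client discovers the nodes' VPC-internal addresses during topology refresh, and those addresses are not reachable through a single SSH tunnel, which is consistent with both the MOVED errors and the topology failure. A possible workaround for local development (a sketch only, assuming Lettuce 5.1+/Spring Data Redis 2.x and the single LocalForward above) is to remap every address the cluster advertises back to the tunnel endpoint with Lettuce's MappingSocketAddressResolver:

import io.lettuce.core.internal.HostAndPort;
import io.lettuce.core.resource.ClientResources;
import io.lettuce.core.resource.DnsResolvers;
import io.lettuce.core.resource.MappingSocketAddressResolver;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.connection.RedisClusterConfiguration;
import org.springframework.data.redis.connection.RedisConnectionFactory;
import org.springframework.data.redis.connection.lettuce.LettuceClientConfiguration;
import org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory;

@Configuration
class TunneledRedisConfiguration {

    @Bean
    RedisConnectionFactory connectionFactory() {
        // Rewrite every node address the cluster advertises to the local SSH tunnel,
        // so both topology discovery and MOVED redirects go through the forwarded port.
        MappingSocketAddressResolver resolver = MappingSocketAddressResolver.create(
                DnsResolvers.UNRESOLVED,
                hostAndPort -> HostAndPort.of("localhost", 6379));

        ClientResources resources = ClientResources.builder()
                .socketAddressResolver(resolver)
                .build();

        LettuceClientConfiguration clientConfig = LettuceClientConfiguration.builder()
                .clientResources(resources)
                .useSsl()
                // The ElastiCache certificate is issued for the cluster hostname, not localhost;
                // acceptable only for local tunnelling, never for production.
                .disablePeerVerification()
                .build();

        // The seed node is the tunnel itself; authentication is omitted in this sketch.
        RedisClusterConfiguration clusterConfig =
                new RedisClusterConfiguration().clusterNode("localhost", 6379);

        return new LettuceConnectionFactory(clusterConfig, clientConfig);
    }
}

Inside the VPC, the plain cluster configuration against the clustercfg endpoint should work without any address mapping.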
I have a Spring Boot application running on embedded Tomcat with a Rabbit listener, which I configure like this:
@Configuration
public class RabbitConfiguration {

    public static final String REQUEST_QUEUE = "from-beeline-req";
    public static final String REPLY_QUEUE = "from-beeline-reply";

    @Bean
    public Queue beelineRpcReqQueue() {
        return new Queue(REQUEST_QUEUE);
    }

    @Bean
    public Queue beelineRpcReplyQueue() {
        return new Queue(REPLY_QUEUE);
    }

    @Bean
    public RabbitTemplate rabbitTemplate(RabbitTemplateConfigurer configurer, ConnectionFactory connectionFactory) {
        RabbitTemplate template = new RabbitTemplate();
        configurer.configure(template, connectionFactory);
        template.setDefaultReceiveQueue(REQUEST_QUEUE);
        template.setReplyAddress(REPLY_QUEUE);
        template.setUseDirectReplyToContainer(false);
        return template;
    }

    @Bean
    public SimpleMessageListenerContainer replyListenerContainer(ConnectionFactory connectionFactory, RabbitTemplate rabbitTemplate) {
        SimpleMessageListenerContainer container = new SimpleMessageListenerContainer();
        container.setConnectionFactory(connectionFactory);
        container.setQueues(beelineRpcReplyQueue());
        container.setMessageListener(rabbitTemplate);
        return container;
    }
}
And my application.yml file looks like this
spring:
  main:
    banner-mode: LOG
  rabbitmq:
    host: 172.29.14.45
    port: 5672
    username: guest
    password: guest
    template:
      reply-timeout: 15000
server:
  port: 8888
So the main point is that I want to connect to the Rabbit server located at the exact address (172.29.14.45). The created listener container is trying to connect to localhost instead, and it ignores the Rabbit port property as well.
2021-02-23 23:04:59.715 [replyListenerContainer-1] INFO (AbstractConnectionFactory.java:636) - Attempting to connect to: [localhost:5672]
2021-02-23 23:05:01.721 [replyListenerContainer-1] ERROR (AbstractMessageListenerContainer.java:1877) - Failed to check/redeclare auto-delete queue(s).
org.springframework.amqp.AmqpConnectException: java.net.ConnectException: Connection refused: connect
and it continues restarting the consumer after that:
2021-02-23 23:17:49.069 [replyListenerContainer-1] INFO (SimpleMessageListenerContainer.java:1428) - Restarting Consumer#2a140ce5: tags=[[]], channel=null, acknowledgeMode=AUTO local queue size=0
2021-02-23 23:17:49.069 [replyListenerContainer-1] DEBUG (BlockingQueueConsumer.java:758) - Closing Rabbit Channel: null
2021-02-23 23:17:49.071 [replyListenerContainer-2] INFO (AbstractConnectionFactory.java:636) - Attempting to connect to: [localhost:5672]
What should I do to tell Spring to use my host property instead of localhost?
I always use the addresses property in the application.properties file:
spring.rabbitmq.addresses=amqp://username:password@host:port/vhost
The name of the "virtual host" (or vhost) specifies the namespace for entities (such as exchanges and queues) referred to by the protocol. Note that this is not virtual hosting in the HTTP sense.
https://www.rabbitmq.com/uri-spec.html
example:
spring.rabbitmq.addresses=amqp://ihrpsvpp:In4etuiIkgu7FVBr0tr6wYGvGcGyJ9Ja@lion.rmq.cloudamqp.com/ihrpsvpp
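Applied to the question above (host 172.29.14.45 with the guest/guest credentials shown there), that would look roughly like:
spring.rabbitmq.addresses=amqp://guest:guest@172.29.14.45:5672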
OK, it turned out that a bean was refreshing the application context, which caused the auto-configuration to fail.
I'm trying to use Spring Cloud Bus with Kafka in my microservices application, and indeed I could use it, but only the data controlled by the Spring Cloud Config server got refreshed!
I'm using a JDBC back end with my config server. To simulate my need, I change some value in the properties file of one of my services in addition to the PROPERTIES table, and call the /monitor endpoint again (mentioned in section 4.3 of https://www.baeldung.com/spring-cloud-bus); as a result, only the data coming from the PROPERTIES table is changed.
This is the yml file for my Config server
spring:
  cloud:
    config:
      server:
        jdbc:
          sql: SELECT KEY,VALUE from PROPERTIES where APPLICATION=? and PROFILE=? and LABEL=?
          order: 1
    stream:
      kafka:
        binder:
          brokers: localhost:9092
  datasource:
    url: jdbc:mysql://localhost:3306/sweprofile?zeroDateTimeBehavior=convertToNull
    username: 123
    password: 123ertbnm
    hikari:
      maximum-pool-size: 10
      connection-timeout: 5000
  profiles:
    active:
      - jdbc
  application:
    name: configServer
These are the yml and properties files for one of my microservices, respectively:
spring:
  datasource:
    username: 123
    password: 123ertbnm
    url: jdbc:mysql://localhost:3306/sweprofile?zeroDateTimeBehavior=convertToNull
  jpa:
    properties:
      hibernate:
        format_sql: true
        ddl-auto: none
  application:
    name: auth-service
  cloud:
    config:
      discovery:
        enabled: true
        service-id: configServer
    bus:
      refresh:
        enabled: true
  profiles:
    active: jdbc
management:
  endpoints:
    web:
      exposure:
        include: ["health","info","refresh", "bus-refresh"]
# This line is dummy data for testing purpose
ali.man = " Ola 12333"
This is a snapshot of the REST controller:
@RestController
@RequestMapping("/user")
@RefreshScope
public class AuthController {

    private UserAuthService userAuthService;

    @Value("${name}")
    private String name; // changed normally

    // Calling the key value mentioned in the properties file after changing
    @Value("${ali.man}")
    private String k; // -> not changed

    public AuthController(UserAuthService userAuthService) {
        this.userAuthService = userAuthService;
    }

    @GetMapping("authTest")
    public String getAuth() {
        return name + k;
    }
}
What did I miss? Why is the value from the properties file not changed? Hopefully I can use Spring Cloud Bus with Kafka to refresh this external data.
After some hours of investigation, I found that there is a recommended way: Spring Cloud Bus sends a refresh event, and Spring Boot has a listener for that event; this is what I built my solution on.
So when the event is sent by the bus, all instances will apply the same logic (refreshing the data) to the configuration loaded in memory.
I used this snippet to apply this
@Configuration
public class ReloadLookupEvent implements ApplicationListener<RefreshScopeRefreshedEvent> {

    @Autowired
    private CacheService cacheService;

    @Override
    public void onApplicationEvent(RefreshScopeRefreshedEvent event) {
        cacheService.refreshLookUp();
    }
}
I could refresh all the other configuration on demand; maybe it is a workaround, but it is acceptable.
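For reference, a slightly more compact equivalent uses @EventListener and constructor injection instead of implementing ApplicationListener (a sketch, assuming the same CacheService as above):

import org.springframework.cloud.context.scope.refresh.RefreshScopeRefreshedEvent;
import org.springframework.context.event.EventListener;
import org.springframework.stereotype.Component;

@Component
public class LookupRefreshListener {

    private final CacheService cacheService;

    public LookupRefreshListener(CacheService cacheService) {
        this.cacheService = cacheService;
    }

    // Invoked on every instance after a bus-triggered refresh of the refresh scope completes.
    @EventListener
    public void onRefresh(RefreshScopeRefreshedEvent event) {
        cacheService.refreshLookUp();
    }
}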
I have a problem connecting to RabbitMQ via Apache Camel on Spring Boot 2.
I did the following steps:
My dependencies:
implementation "org.apache.camel:camel-spring-boot-starter:${camelVersion}"
implementation "org.apache.camel:camel-jackson-starter:${camelVersion}"
implementation "org.apache.camel:camel-core:${camelVersion}"
implementation "org.apache.camel:camel-rabbitmq-starter:${camelVersion}"
implementation "org.springframework.boot:spring-boot-starter-amqp"
Application.yaml
spring:
  rabbitmq:
    dynamic: true
    host: 192.168.1.1
    port: 5672
    username: X
    password: Y
And I have the following route:
@Component
public class BasicRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        from("direct:loggerQueue")
            .id("loggerQueue")
            .to("rabbitmq://TEST-QUEUE.exchange?queue=TEST-QUEUE.queue&autoDelete=false&connectionFactory=#rabbitConnectionFactory")
            .end();
    }
}
Finally, I still have the following issue:
2019-03-06 12:46:05.766 WARN 19464 --- [ restartedMain] o.a.c.c.rabbitmq.RabbitMQProducer : Failed to create connection. It will attempt to connect again when publishing a message.
java.net.ConnectException: Connection refused: connect
The connection seems OK; I tested it. Something is wrong with rabbitConnectionFactory.
I don't know what is wrong.
The problem appears to be that RabbitMQComponent is expecting to find a connection factory of type com.rabbitmq.client.ConnectionFactory.
However, the springboot auto-configure is creating a connection factory of type org.springframework.amqp.rabbit.connection.CachingConnectionFactory.
So when the RabbitMQComponent attempts to find the appropriate connection factory, it looks for that specific type; because CachingConnectionFactory does not subclass the RabbitMQ ConnectionFactory, the lookup returns null and the component fails to use the host name and configuration parameters specified in your application.yml.
You should also see the following in your log if you have debug level set:
2019-12-15 17:58:53.631 DEBUG 48710 --- [ main] o.a.c.c.rabbitmq.RabbitMQComponent : Creating RabbitMQEndpoint with host null:0 and exchangeName: asterix
2019-12-15 17:58:55.927 DEBUG 48710 --- [ main] o.a.c.c.rabbitmq.RabbitMQComponent : Creating RabbitMQEndpoint with host null:0 and exchangeName: asterix-sink
EDIT:
The CachingConnectionFactory is configured with the required Rabbit connection factory as part of the autoconfiguration. However, you need to provide a link to the correct factory.
Therefore, you need to add a @Bean to disambiguate.
@Configuration
@RequiredArgsConstructor
public class CamelConfig {

    private final CachingConnectionFactory rabbitConnectionFactory;

    @Bean
    com.rabbitmq.client.ConnectionFactory rabbitSourceConnectionFactory() {
        return rabbitConnectionFactory.getRabbitConnectionFactory();
    }
}
and in your endpoint configuration:
rabbitmq:asterix?connectionFactory=#rabbitSourceConnectionFactory
Note that the # is optional, as it gets stripped out within the code when it is trying to find the rabbit connection factory bean.
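Applied back to the BasicRoute from the question, the endpoint would reference the factory roughly like this (a sketch keeping the original exchange and queue names; whether the host can be dropped from the URI depends on the Camel version, as noted next):

from("direct:loggerQueue")
    .id("loggerQueue")
    .to("rabbitmq:TEST-QUEUE.exchange?queue=TEST-QUEUE.queue&autoDelete=false&connectionFactory=#rabbitSourceConnectionFactory")
    .end();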
In your application.yml, configure the connection parameters (the url is no longer included in the endpoint URI).
spring:
  rabbitmq:
    host: localhost
    port: 5672
    username: guest
    password: guest
I use Redis for caching in my Spring Boot application. I have 2 Redis servers and a fairly large database. If I use a Redis server running on the same network, the cache works normally with a total of 619,000 users. But if I use Redis running in Docker, with a scheduled job running in the evening, it always throws an error message:
org.springframework.data.redis.RedisConnectionFailureException: java.net.SocketTimeoutException: Read timed out; nested exception is redis.clients.jedis.exceptions.JedisConnectionException: java.net.SocketTimeoutException: Read timed out
I know this is because Redis has to wait too long for the connection, so it throws an exception when the amount of data is too large. I've tried increasing the timeout via the constructor but still cannot fix it. The information is as follows:
@Bean
public JedisConnectionFactory jedisConnectionFactory() {
    JedisConnectionFactory jedisConnectionFactory = new JedisConnectionFactory();
    jedisConnectionFactory.setHostName(redisHostName);
    jedisConnectionFactory.setPort(redisPort);
    jedisConnectionFactory.setPassword(redisPassword);
    return jedisConnectionFactory;
}

@Bean
public JedisPool getConfig() {
    JedisPoolConfig config = new JedisPoolConfig();
    config.setMaxTotal(100);
    config.setMaxIdle(200);
    config.setMinIdle(50);
    config.setMaxWaitMillis(30000);
    config.setTestOnBorrow(true);
    return new JedisPool(config, redisHostName, redisPort, 300000, redisPassword);
}
So how do I overcome this problem? I have set the timeout to -1, but I think that is not good in terms of performance.
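For what it's worth, with Spring Data Redis 2.x the socket read timeout behind this SocketTimeoutException is usually configured on a JedisClientConfiguration rather than on the pool alone. A minimal sketch (the property names and timeout values are illustrative, reusing the host/port/password fields from the question):

import java.time.Duration;

import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.redis.connection.RedisPassword;
import org.springframework.data.redis.connection.RedisStandaloneConfiguration;
import org.springframework.data.redis.connection.jedis.JedisClientConfiguration;
import org.springframework.data.redis.connection.jedis.JedisConnectionFactory;
import redis.clients.jedis.JedisPoolConfig;

@Configuration
public class RedisTimeoutConfig {

    // Same connection details as in the question; the property names here are assumptions.
    @Value("${redis.host}")
    private String redisHostName;
    @Value("${redis.port}")
    private int redisPort;
    @Value("${redis.password}")
    private String redisPassword;

    @Bean
    public JedisConnectionFactory jedisConnectionFactory() {
        JedisPoolConfig poolConfig = new JedisPoolConfig();
        poolConfig.setMaxTotal(100);
        poolConfig.setMaxIdle(50);
        poolConfig.setMaxWaitMillis(30000);

        JedisClientConfiguration clientConfig = JedisClientConfiguration.builder()
                .connectTimeout(Duration.ofSeconds(5))   // TCP connect timeout
                .readTimeout(Duration.ofSeconds(60))     // socket read timeout behind the exception above
                .usePooling()
                .poolConfig(poolConfig)
                .build();

        RedisStandaloneConfiguration serverConfig =
                new RedisStandaloneConfiguration(redisHostName, redisPort);
        serverConfig.setPassword(RedisPassword.of(redisPassword));

        return new JedisConnectionFactory(serverConfig, clientConfig);
    }
}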
I have a RabbitMQ server like this
When I try to connect to this server via Spring Boot amqp, I see com.rabbitmq.client.AuthenticationFailureException: ACCESS_REFUSED - Login was refused using authentication mechanism PLAIN. For details see the broker logfile.
These are my configs:
# Message
spring.activemq.broker-url=tcp://127.0.0.1:5672
spring.activemq.user=test
spring.activemq.password=test
Yes, the user test can access the virtual host /, and yes, I can log in with test/test on the RabbitMQ GUI.
EDIT
Looking at the rabbitmq logs, I saw this
{handshake_error,starting,0,
{amqp_error,access_refused,
"PLAIN login refused: user 'guest' - invalid credentials",
'connection.start_ok'}}
It seems like Spring is ignoring my configs and trying to connect as guest.
src/main/resources/application.yml
spring:
  rabbitmq:
    username: guest
    password: guest
    host: rabbitmq
    port: 5672
    virtual-host: someVirtualHost
https://docs.spring.io/spring-boot/docs/current/reference/html/application-properties.html
The Spring properties include specific settings for RabbitMQ. Try replacing your ActiveMQ config with the ones below.
Example:
spring.rabbitmq.host = 127.0.0.1
spring.rabbitmq.port = 5672
spring.rabbitmq.username = guest
spring.rabbitmq.password = guest
Try changing your RabbitMQ configuration in the Spring Boot properties:
spring.rabbitmq.host = 127.0.0.1
spring.rabbitmq.port = 5672
spring.rabbitmq.username = guest
spring.rabbitmq.password = guest
Using the default setup with Spring Boot is fine, but if we want to add an external Rabbit instance to the Spring container, then we should do the following.
application.yml
rabbitmq:
  host: 'hostname'
  vhost: 'vhostname'
  user: 'userName'
  password: 'passwd'
  port: 5672
Config class
@Configuration
public class RabbitConfig {

    @Value("${rabbitmq.host}")
    private String host;

    @Value("${rabbitmq.vhost}")
    private String vhost;

    @Value("${rabbitmq.user}")
    private String user;

    @Value("${rabbitmq.password}")
    private String password;

    @Value("${rabbitmq.port}")
    private int port;

    @Bean
    public ConnectionFactory connectionFactory() {
        CachingConnectionFactory factory = new CachingConnectionFactory();
        System.out.println("rmqhost is " + host);
        factory.setHost(host);
        factory.setVirtualHost(vhost);
        factory.setUsername(user);
        factory.setPassword(password);
        factory.setPort(port);
        return factory;
    }

    @Bean
    public RabbitAdmin rabbitAdmin() {
        return new RabbitAdmin(connectionFactory());
    }
}
and we can create a bean for either RabbitTemplate or a Rabbit listener, for example as sketched below.
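A RabbitTemplate bean wired to that connection factory could be added inside the same RabbitConfig class (a minimal sketch):

@Bean
public RabbitTemplate rabbitTemplate() {
    // Uses the externally configured CachingConnectionFactory declared above.
    return new RabbitTemplate(connectionFactory());
}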