How to configure Elasticsearch 6 in Java

I'm using Spring version 2.1.18.RELEASE (I can't change this version).
I need to implement a CRUD repository for Elasticsearch, so I added spring-data-elasticsearch (version 3.1.21.RELEASE is pulled in automatically) together with Elasticsearch 6.4.3.
I have tried a number of guides, but Elasticsearch changes so much from version to version that I can't find a working solution. I need to connect to a remote server over HTTPS with a username and password.
pom:
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-elasticsearch</artifactId>
</dependency>
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch</artifactId>
</dependency>
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>transport</artifactId>
</dependency>
<dependency>
<groupId>org.elasticsearch.client</groupId>
<artifactId>elasticsearch-rest-client</artifactId>
</dependency>
Config:
@Configuration
@EnableElasticsearchRepositories(basePackages = "path.to.elastic")
@ComponentScan(basePackages = {"path.to.elastic"})
public class ElasticConfiguration {

    private final int port = 1234;

    @Bean
    public Client client() throws Exception {
        Settings settings = Settings.builder()
                .put("cluster.name", "test")
                .put("xpack.security.user", "login:pass")
                .put("xpack.security.transport.ssl.enabled", "true")
                .build();
        return new PreBuiltTransportClient(settings)
                .addTransportAddress(new TransportAddress(InetAddress.getByName("bah.elk1.com"), port))
                .addTransportAddress(new TransportAddress(InetAddress.getByName("bah.elk2.com"), port))
                .addTransportAddress(new TransportAddress(InetAddress.getByName("bah.elk3.com"), port));
    }

    @Bean
    public ElasticsearchOperations elasticsearchTemplate() throws Exception {
        return new ElasticsearchTemplate(client());
    }
}
Repository:
@Repository
public interface ElkThingsRepository extends ElasticsearchRepository<Things, Long> {
    List<Things> findByEarNameAndName(String earName, String thingName);
}
Service interface:
public interface ElkThingsService {
    List<Things> simpleGet(String thingName);
}
Service impl:
@Service
public class ElkThingsServiceImpl implements ElkThingsService {

    private final ElkThingsRepository elkThingsRepository;

    @Autowired
    public ElkThingsServiceImpl(ElkThingsRepository elkThingsRepository) {
        this.elkThingsRepository = elkThingsRepository;
    }

    @Override
    public List<Things> simpleGet(String thingName) {
        return elkThingsRepository.findByEarNameAndName(AppInfo.SUBSYSTEM_CODE, thingName);
    }
}
Things:
@Data
@Document(indexName = "things-*", type = "things")
public class Things {

    @Id
    private Long id;
    private String earName;
    private String name;
}
Now I get this exception:
java.lang.IllegalArgumentException: unknown setting [xpack.security.user] please check that any required plugins are installed, or check the breaking changes documentation for removed settings
When I add spring-data-elasticsearch 4.0.0 and try to use RestHighLevelClient, I get a NoClassDefFoundError while creating the RestHighLevelClient bean.
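For reference, here is a minimal sketch of a REST-based configuration that avoids the transport client (and its x-pack settings) entirely. It assumes the elasticsearch-rest-high-level-client dependency matching 6.4.3 is on the classpath and that spring-data-elasticsearch 3.1.x, which ships ElasticsearchRestTemplate, is in use; the host names, port and credentials are the placeholders from the question.

import java.net.InetAddress;

import org.apache.http.HttpHost;
import org.apache.http.auth.AuthScope;
import org.apache.http.auth.UsernamePasswordCredentials;
import org.apache.http.client.CredentialsProvider;
import org.apache.http.impl.client.BasicCredentialsProvider;
import org.elasticsearch.client.RestClient;
import org.elasticsearch.client.RestClientBuilder;
import org.elasticsearch.client.RestHighLevelClient;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.elasticsearch.core.ElasticsearchOperations;
import org.springframework.data.elasticsearch.core.ElasticsearchRestTemplate;

@Configuration
public class ElasticRestConfiguration {

    @Bean
    public RestHighLevelClient restHighLevelClient() {
        // Basic-auth credentials for the cluster (placeholders from the question)
        CredentialsProvider credentialsProvider = new BasicCredentialsProvider();
        credentialsProvider.setCredentials(AuthScope.ANY,
                new UsernamePasswordCredentials("login", "pass"));

        // HTTPS endpoints instead of the transport protocol
        RestClientBuilder builder = RestClient.builder(
                        new HttpHost("bah.elk1.com", 1234, "https"),
                        new HttpHost("bah.elk2.com", 1234, "https"),
                        new HttpHost("bah.elk3.com", 1234, "https"))
                .setHttpClientConfigCallback(httpClientBuilder ->
                        httpClientBuilder.setDefaultCredentialsProvider(credentialsProvider));

        return new RestHighLevelClient(builder);
    }

    @Bean
    public ElasticsearchOperations elasticsearchTemplate(RestHighLevelClient client) {
        // Bean named elasticsearchTemplate so the repositories pick it up
        return new ElasticsearchRestTemplate(client);
    }
}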

Related

Is there any way to set "Retryable writes" to false in Spring Boot 2.2.1?

First time
I am trying to develop a controller to save data in DocumentDB on AWS.
The first time it saves fine, but the second time, when I look up the record already saved in the database, change some data, and save again, I get this error:
Caused by: com.mongodb.MongoCommandException: Command failed with error 301: 'Retryable writes are not supported' on server aws:27017. The full response is {"ok": 0.0, "code": 301, "errmsg": "Retryable writes are not supported", "operationTime": {"$timestamp": {"t": 1641469879, "i": 1}}}
This is my Java code:
@Service
public class SaveStateHandler extends Handler<SaveStateCommand> {

    @Autowired
    private MongoRepository repository;

    @Autowired
    private MongoTemplate mongoTemplate;

    @Override
    public String handle(Command command) {
        SaveStateCommand cmd = (SaveStateCommand) command;
        State state = buildState(cmd);
        repository.save(state);
        return state.getId();
    }

    private State buildState(SaveStateCommand cmd) {
        State state = State
                .builder()
                .activityId(cmd.getActivityId())
                .agent(cmd.getAgent())
                .stateId(cmd.getStateId())
                .data(cmd.getData())
                .dataAlteracao(LocalDateTime.now())
                .build();
        State stateFound = findState(cmd);
        if (stateFound != null) {
            state.setId(stateFound.getId());
        }
        return state;
    }

    private State findState(SaveStateCommand request) {
        Query query = new Query();
        selectField(query);
        where(request, query);
        return mongoTemplate.findOne(query, State.class);
    }

    private void selectField(Query query) {
        query.fields().include("id");
    }

    private void where(SaveStateCommand request, Query query) {
        query.addCriteria(new Criteria().andOperator(
                Criteria.where("activityId").is(request.getActivityId()),
                Criteria.where("agent").is(request.getAgent())));
    }
}
AWS suggests using retryWrites=false, but I don't know how to do that in Spring Boot.
I use Spring Boot 2.2.1.
I tried this:
@Bean
public MongoClientSettings mongoSettings() {
    return MongoClientSettings
            .builder()
            .retryWrites(Boolean.FALSE)
            .build();
}
But it did not work.
=================================================================================
Second Time
I connected to AWS DocumentDB through an SSH tunnel.
I started my application with this database configuration:
@Configuration
@EnableConfigurationProperties({MongoProperties.class})
public class MongoAutoConfiguration {

    private final MongoClientFactory factory;
    private final MongoClientOptions options;
    private MongoClient mongo;

    public MongoAutoConfiguration(MongoProperties properties, ObjectProvider<MongoClientOptions> options, Environment environment) {
        this.options = options.getIfAvailable();
        if (StringUtils.isEmpty(properties.getUsername()) || StringUtils.isEmpty(properties.getPassword())) {
            properties.setUsername(null);
            properties.setPassword(null);
        }
        properties.setUri(createUri(properties));
        this.factory = new MongoClientFactory(properties, environment);
    }

    private String createUri(MongoProperties properties) {
        String uri = "mongodb://";
        if (StringUtils.hasText(properties.getUsername()) && !StringUtils.isEmpty(properties.getPassword())) {
            uri = uri + properties.getUsername() + ":" + new String(properties.getPassword()) + "@";
        }
        return uri + properties.getHost() + ":" + properties.getPort() + "/" + properties.getDatabase() + "?retryWrites=false";
    }

    @PreDestroy
    public void close() {
        if (this.mongo != null) {
            this.mongo.close();
        }
    }

    @Bean
    public MongoClient mongo() {
        this.mongo = this.factory.createMongoClient(this.options);
        return this.mongo;
    }
}
Locally it saves the data without error.
But if I deploy my updated API to AWS ECS and try to save, I get the same error.
=================================================================================
Dependencies
<dependency>
<groupId>org.springframework.data</groupId>
<artifactId>spring-data-mongodb</artifactId>
<version>2.2.1.RELEASE</version>
</dependency>
<dependency>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-mongodb</artifactId>
<version>4.1.4</version>
</dependency>
When you construct your connection string, you can include the parameters for disabling retryable writes by adding this to your connection URI:
?replicaSet=rs0&readPreference=primaryPreferred&retryWrites=false&maxIdleTimeMS=30000
Then use this when creating the database factory and Mongo template (this example uses the reactive database factory, but the principle is the same for the SimpleMongoClientDatabaseFactory):
@Bean
fun reactiveMongoDatabaseFactory(
    @Value("\${spring.data.mongodb.uri}") uri: String,
    @Value("\${mongodb.database-name}") database: String
): ReactiveMongoDatabaseFactory {
    return SimpleReactiveMongoDatabaseFactory(MongoClients.create(uri), database)
}
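For the non-reactive setup the answer refers to, a minimal Java sketch could look like the following. It assumes a Spring Data MongoDB version that provides SimpleMongoClientDatabaseFactory (on the 2.2.x line the equivalent class is SimpleMongoClientDbFactory) and that the configured URI already ends with retryWrites=false; the database name "mydb" is a placeholder.

import com.mongodb.client.MongoClients;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.mongodb.core.MongoTemplate;
import org.springframework.data.mongodb.core.SimpleMongoClientDatabaseFactory;

@Configuration
public class MongoTemplateConfig {

    @Bean
    public MongoTemplate mongoTemplate(@Value("${spring.data.mongodb.uri}") String uri) {
        // The URI is expected to already carry the retryWrites=false parameter, e.g.
        // mongodb://user:pass@host:27017/mydb?retryWrites=false
        return new MongoTemplate(new SimpleMongoClientDatabaseFactory(MongoClients.create(uri), "mydb"));
    }
}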

Spring AMQP - RabbitMQ connection is not created on application startup

I have a Spring Boot application and my goal is to declare queues, exchanges, and bindings on application startup. The application will only produce messages to various queues; there will be no consumers in the application.
I have included these dependencies in my pom.xml:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
<version>2.3.5.RELEASE</version>
</dependency>
<dependency>
<groupId>org.springframework.amqp</groupId>
<artifactId>spring-rabbit</artifactId>
<version>2.2.12.RELEASE</version>
</dependency>
My configuration class:
@Configuration
public class RabbitConfiguration {

    @Bean
    public ConnectionFactory connectionFactory() {
        CachingConnectionFactory connectionFactory = new CachingConnectionFactory("myhost", 5672);
        connectionFactory.setUsername("example_name");
        connectionFactory.setPassword("example_pass");
        return connectionFactory;
    }

    @Bean
    public AmqpAdmin rabbitAdmin(ConnectionFactory connectionFactory) {
        return new RabbitAdmin(connectionFactory);
    }

    @Bean
    public Queue declareQueue() {
        return new Queue("test_queue", true, false, false);
    }

    @Bean
    public DirectExchange declareDirectExchange() {
        return new DirectExchange("test_direct_exchange", true, false);
    }

    @Bean
    public Declarables declareBindings() {
        return new Declarables(
                new Binding("test_queue", DestinationType.QUEUE, "test_direct_exchange", "test_routing_key", null)
        );
    }
}
My problem is that the queues, exchanges, and bindings are not created on application startup. Spring Boot does not even open the connection; the connection, queues, etc. are created only when I produce messages to the queues.
If you want to force declaration during app startup and don't have any consumers, you can either add the actuator starter to the classpath or simply create the shared connection yourself:
@Bean
ApplicationRunner runner(ConnectionFactory cf) {
    return args -> cf.createConnection().close();
}
This won't close the connection; if you want to do that, call cf.resetConnection().
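A small sketch of that variant, assuming the ConnectionFactory bean is the CachingConnectionFactory defined above (resetConnection() is declared on that class, not on the ConnectionFactory interface):

@Bean
ApplicationRunner declarationsRunner(CachingConnectionFactory cf) {
    return args -> {
        cf.createConnection().close(); // opening the connection triggers the RabbitAdmin declarations
        cf.resetConnection();          // then close the shared connection again, if desired
    };
}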
If you want the app to start even when the broker is down, do something like this:
@Bean
ApplicationRunner runner(ConnectionFactory cf) {
    return args -> {
        boolean open = false;
        while (!open) {
            try {
                cf.createConnection().close();
                open = true;
            }
            catch (Exception e) {
                Thread.sleep(5000);
            }
        }
    };
}
After some digging, I found out that I was missing the actuator dependency.
So adding this dependency solved my issue:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-actuator</artifactId>
</dependency>
It is weird that the connection is not opened on application startup unless the actuator dependency is present.
You can use
@Component
public class QueueConfig {

    private AmqpAdmin amqpAdmin;

    public QueueConfig(AmqpAdmin amqpAdmin) {
        this.amqpAdmin = amqpAdmin;
    }

    @PostConstruct
    public void createQueues() {
        amqpAdmin.declareQueue(new Queue("q1", true));
        amqpAdmin.declareQueue(new Queue("q2", true));
    }
}
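If you also need the exchange and binding created this way, AmqpAdmin can declare those too. A sketch of such a method, reusing the amqpAdmin field from the class above and the names from the question (BindingBuilder is org.springframework.amqp.core.BindingBuilder):

@PostConstruct
public void createTopology() {
    Queue queue = new Queue("test_queue", true);
    DirectExchange exchange = new DirectExchange("test_direct_exchange", true, false);
    amqpAdmin.declareQueue(queue);
    amqpAdmin.declareExchange(exchange);
    // bind test_queue to test_direct_exchange with the routing key from the question
    amqpAdmin.declareBinding(BindingBuilder.bind(queue).to(exchange).with("test_routing_key"));
}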

Spring Data always returns an empty array with reactive mongo repository

I'm trying to build an application using the reactive Mongo libraries and Spring Data. However, every time I hit the application and call dao.findAll(), I get an empty array of results.
Please find the code below.
/config/MongoAdminConfig.java
@EnableReactiveMongoRepositories
public class MongoAdminConfig extends AbstractReactiveMongoConfiguration {

    @Override
    @Bean
    public MongoClient reactiveMongoClient() {
        MongoClientSettings settings = MongoClientSettings.builder()
                .applyConnectionString(new ConnectionString("mongodb://localhost:27017/projects"))
                .build();
        return MongoClients.create(settings);
    }

    @Override
    protected String getDatabaseName() {
        return "projects";
    }
}
/controllers/AccountController.java
@RestController
@RequestMapping("accounts")
public class AccountController {

    @Autowired
    private AccountDao accountDao;

    @GetMapping
    public Flux<Account> getAllAccounts() {
        return this.accountDao.findAll();
    }
}
/dao/AccountDao.java
public interface AccountDao extends ReactiveCrudRepository<Account, String> {
}
/models/Account.java
@Document(collection = "accounts")
public class Account {

    @Id
    private String id;
    private String accountNumber;
    private String type;
    private String status;
    private double availableBalance;
    private String currency;

    //getters and setters
}
Database query example
> db.accounts.find({})
{ "_id" : ObjectId("5deb40c43e079db50337b211"), "accountNumber" : "BANK123456", "type" : "savings", "status" : "active", "availableBalance" : 10000.5, "currency" : "USD" }
> db
projects
>
/pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.aditapillai.demos</groupId>
<artifactId>accounts-demo</artifactId>
<version>1.0-SNAPSHOT</version>
<parent>
<groupId>org.springframework.boot</groupId>
<version>2.2.1.RELEASE</version>
<artifactId>spring-boot-starter-parent</artifactId>
</parent>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-mongodb-reactive</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
</dependencies>
</project>
The code can be found on GitHub
Platform Details:
OS: Ubuntu 19.04 LTS
Mongo version: 4.2.1
My mongo server is running on my local machine on port 27017
It is a new mongo installation.
Fixed by doing the following:
Deleted /config/MongoAdminConfig.java.
/Application.java
@SpringBootApplication
@EnableReactiveMongoRepositories
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}
src/main/resources/application.yml
spring:
  data:
    mongodb:
      host: localhost
      port: 27017
      database: projects
I'll keep looking to understand what exactly happened here.

@StreamListener not receiving messages from a Kafka topic

I am able to send and receive messages using this code:
@EnableBinding(Processor.class)
public class KafkaStreamsConfiguration {

    @StreamListener(Processor.INPUT)
    @SendTo(Processor.OUTPUT)
    public String processMessage(String message) {
        System.out.println("message = " + message);
        return message.replaceAll("my", "your");
    }
}
@RunWith(SpringRunner.class)
@SpringBootTest
@DirtiesContext
public class StreamApplicationIT {

    private static String topicToPublish = "eventUpdateFromEventModel";

    @BeforeClass
    public static void setup() {
        System.setProperty("spring.kafka.bootstrap-servers", embeddedKafka.getBrokersAsString());
    }

    @Autowired
    private KafkaMessageSender<String> kafkaMessageSenderToTestErrors;

    @Autowired
    private KafkaMessageSender<EventNotificationDto> kafkaMessageSender;

    @ClassRule
    public static KafkaEmbedded embeddedKafka = new KafkaEmbedded(1, true, topicToPublish);

    @Autowired
    private Processor pipe;

    @Autowired
    private MessageCollector messageCollector;

    @Rule
    public OutputCapture outputCapture = new OutputCapture();

    @Test
    public void working() {
        pipe.input()
            .send(MessageBuilder.withPayload("This is my message")
                .build());
        Object payload = messageCollector.forChannel(pipe.output())
            .poll()
            .getPayload();
        assertEquals("This is your message", payload.toString());
    }

    @Test
    public void non_working() {
        kafkaMessageSenderToTestErrors.send(topicToPublish, "This was my message");
        assertTrue(isMessageReceived("This was your message", 50));
    }

    private boolean isMessageReceived(final String msg, final int maxAttempt) {
        return IntStream.rangeClosed(0, maxAttempt)
            .peek(a -> {
                try {
                    TimeUnit.MILLISECONDS.sleep(100);
                } catch (InterruptedException e) {
                    fail();
                }
            }).anyMatch(i -> outputCapture.toString().contains(msg));
    }
}
@Service
@Slf4j
public class KafkaMessageSender<T> {

    private final KafkaTemplate<String, byte[]> kafkaTemplate;
    private final ObjectWriter objectWriter;

    public KafkaMessageSender(KafkaTemplate<String, byte[]> kafkaTemplate, ObjectMapper objectMapper) {
        this.kafkaTemplate = kafkaTemplate;
        this.objectWriter = objectMapper.writer();
    }

    public void send(String topicName, T payload) {
        try {
            kafkaTemplate.send(topicName, objectWriter.writeValueAsString(payload).getBytes());
        } catch (JsonProcessingException e) {
            log.info("error converting object into byte array {}", payload.toString().substring(0, 50));
        }
        log.info("sent payload to topic='{}'", topicName);
    }
}
But when I send the message using the KafkaTemplate to any topic, the @StreamListener doesn't receive the message.
spring.cloud.stream.bindings.input.group=test
spring.cloud.stream.bindings.input.destination=eventUpdateFromEventModel
My pom.xml:
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-stream</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-stream-kafka</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-stream-test-support</artifactId>
<scope>test</scope>
</dependency>
<!-- Spring boot version -->
<spring.boot.version>1.5.7.RELEASE</spring.boot.version>
<spring-cloud.version>Edgware.SR3</spring-cloud.version>
<dependencyManagement>
<dependencies>
<dependency>
<!-- Import dependency management from Spring Boot -->
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-dependencies</artifactId>
<version>${spring.boot.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-dependencies</artifactId>
<version>${spring-cloud.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
Working:
Object payload = messageCollector.forChannel(pipe.output())
    .poll()
    .getPayload();
...
Not working:
KafkaTemplate
This is because you are using the TestBinder in your test, not a real Kafka broker and the Kafka binder.
The message collector is simply fetching the message from the channel. If you want to test with a real Kafka broker, see the test-embedded-kafka sample app.
EDIT
I just tested the Ditmars (boot 1.5.x) version of the sample and it works fine...
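For completeness, a rough sketch of pointing the test at the embedded broker instead of the TestBinder, assuming spring-cloud-stream-test-support is removed from the test classpath so the real Kafka binder is picked up (the property names are the Kafka binder ones; the zkNodes line only matters for older binder versions):

@ClassRule
public static KafkaEmbedded embeddedKafka = new KafkaEmbedded(1, true, topicToPublish);

@BeforeClass
public static void setup() {
    // point both the raw KafkaTemplate and the Kafka binder at the embedded broker
    System.setProperty("spring.kafka.bootstrap-servers", embeddedKafka.getBrokersAsString());
    System.setProperty("spring.cloud.stream.kafka.binder.brokers", embeddedKafka.getBrokersAsString());
    System.setProperty("spring.cloud.stream.kafka.binder.zkNodes", embeddedKafka.getZookeeperConnectionString());
}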

Sending HTML email with Spring Boot and Thymeleaf

I am checking out how to send an email using Spring Boot: send an e-mail using standard Spring Boot modules and prepare the HTML content for the message with the Thymeleaf template engine.
These are the dependencies I use:
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-mail</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-thymeleaf</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.icegreen</groupId>
<artifactId>greenmail</artifactId>
<version>1.5.0</version>
<scope>test</scope>
</dependency>
</dependencies>
<dependencyManagement>
<dependencies>
<dependency>
<!-- Import dependency management from Spring Boot -->
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-dependencies</artifactId>
<version>${spring-boot.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
Here is my MailClient:
@Service
public class MailClient {

    private JavaMailSender mailSender;
    private MailContentBuilder mailContentBuilder;

    @Autowired
    public MailClient(JavaMailSender mailSender, MailContentBuilder mailContentBuilder) {
        this.mailSender = mailSender;
        this.mailContentBuilder = mailContentBuilder;
    }

    public void prepareAndSend(String recipient, String message) {
        MimeMessagePreparator messagePreparator = mimeMessage -> {
            MimeMessageHelper messageHelper = new MimeMessageHelper(mimeMessage);
            messageHelper.setFrom("amadeu.cabanilles@gmail.com");
            messageHelper.setTo("amadeu.cabanilles@gmail.com");
            messageHelper.setSubject("Sample mail subject");
            String content = mailContentBuilder.build(message);
            messageHelper.setText(content, true);
        };
        try {
            mailSender.send(messagePreparator);
        } catch (MailException e) {
            e.printStackTrace();
        }
    }
}
This is my test class:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(Application.class)
public class MailClientTest {

    @Autowired
    private MailClient mailClient;

    private GreenMail smtpServer;

    @Before
    public void setUp() throws Exception {
        smtpServer = new GreenMail(new ServerSetup(25, null, "smtp"));
        smtpServer.start();
    }

    @Test
    public void shouldSendMail() throws Exception {
        //given
        String recipient = "amadeu.cabanilles@gmail.com";
        String message = "Test message content";
        //when
        mailClient.prepareAndSend(recipient, message);
        //then
        String content = "<span>" + message + "</span>";
        assertReceivedMessageContains(content);
    }

    private void assertReceivedMessageContains(String expected) throws IOException, MessagingException {
        MimeMessage[] receivedMessages = smtpServer.getReceivedMessages();
        assertEquals(1, receivedMessages.length);
        String content = (String) receivedMessages[0].getContent();
        System.out.println(content);
        assertTrue(content.contains(expected));
    }

    @After
    public void tearDown() throws Exception {
        smtpServer.stop();
    }
}
Executing the test on my computer is OK and the test passes, but I don't receive any email.
You don't receive any email because this integration test uses a local SMTP server stub, GreenMail. The test doesn't send real emails; it only verifies that the mail is prepared and sent correctly, assuming a real SMTP server is available in production.
In order to send emails from your local environment you need to set up an SMTP server, but then automated verification of whether the mail was actually sent is a completely different story.
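If you do want the application to send real mail from a local run, a minimal sketch of a manually configured sender could look like this (the host, port and credentials are placeholders; with Spring Boot the same values can normally be supplied through the spring.mail.* properties instead of a bean):

import java.util.Properties;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.mail.javamail.JavaMailSender;
import org.springframework.mail.javamail.JavaMailSenderImpl;

@Configuration
public class MailConfig {

    @Bean
    public JavaMailSender javaMailSender() {
        JavaMailSenderImpl sender = new JavaMailSenderImpl();
        sender.setHost("smtp.example.com");   // placeholder SMTP host
        sender.setPort(587);
        sender.setUsername("user@example.com"); // placeholder credentials
        sender.setPassword("secret");

        Properties props = sender.getJavaMailProperties();
        props.put("mail.transport.protocol", "smtp");
        props.put("mail.smtp.auth", "true");
        props.put("mail.smtp.starttls.enable", "true");
        return sender;
    }
}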
