I am using ReactiveMongoTemplate in a java/spring application to connect to a database. How would I go about testing whether the connection between the application and the database is actually configured properly so that data can be inserted into the database?
Is there also a way to tell which further steps I would need to take if the connection hasn't been fully established?
You can start a MongoDB instance during your application build and let the tests validate the connection logic. To keep the setup portable you can use either embedded MongoDB or Docker via Testcontainers.
Do note that if you have standalone MongoDB servers in different environments (e.g. PROD and TEST) with different connection configurations, you can't really prove the connection works until you try it out: if there is an error in the PROD configuration (e.g. the TEST username used instead of the PROD one), you will only see it when the application actually runs on PROD. One way to mitigate this is to keep the deployment identical across environments, e.g. with Kubernetes and immutable deployments.
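For example, with Testcontainers you can spin up a throwaway MongoDB in a JUnit 5 test, point Spring Data at it, and prove a full insert/read round trip. This is only a sketch, assuming Docker, the Testcontainers `mongodb` and JUnit 5 modules, and the reactive Mongo starter on the classpath (so that @DataMongoTest auto-configures a ReactiveMongoTemplate); the class name and the "smoke" collection are placeholders:

```java
import java.time.Duration;

import org.bson.Document;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.data.mongo.DataMongoTest;
import org.springframework.data.mongodb.core.ReactiveMongoTemplate;
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.testcontainers.containers.MongoDBContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
import org.testcontainers.utility.DockerImageName;

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertNotNull;

@DataMongoTest
@Testcontainers
class MongoConnectivityIT {

    @Container
    static MongoDBContainer mongo = new MongoDBContainer(DockerImageName.parse("mongo:4.4"));

    // Override whatever URI the application configures with the container's one.
    @DynamicPropertySource
    static void mongoProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.data.mongodb.uri", mongo::getReplicaSetUrl);
    }

    @Autowired
    ReactiveMongoTemplate template;

    @Test
    void insertsAndReadsBack() {
        Document saved = template.insert(new Document("probe", "ok"), "smoke")
                .block(Duration.ofSeconds(10));
        assertNotNull(saved);
        Document found = template.findById(saved.get("_id"), Document.class, "smoke")
                .block(Duration.ofSeconds(10));
        assertNotNull(found);
        assertEquals("ok", found.get("probe"));
    }
}
```

As for your second question: when a test like this fails, the exception usually tells you which part of the configuration is wrong, e.g. a connection timeout points to an unreachable host/port, while an authentication error points to bad credentials.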
Related
I am using Spring Data MongoDB 2.2.1.RELEASE for MongoDB access, and flapdoodle embedded MongoDB 2.2.0 for testing. This setup works fine. Recently I added support for Spring transactions, and since MongoDB supports transactions only on replica sets, I created a replica set locally on my machine and tested the transaction scenarios. All good till now. But when I run my unit tests, the @Transactional annotation on service methods breaks the application with the error below, because the embedded MongoDB is not a replica set.
com.mongodb.MongoClientException: Sessions are not supported by the MongoDB cluster to which this client is connected
My question is: how do I configure my application so that the @Transactional feature does not break it when using an embedded or standalone MongoDB?
Suggestions much appreciated. Thanks !!
It is possible to run a 1-node replica set. You may consider this especially in tests so that your test environment more closely resembles your production one.
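For example, after starting a single mongod with `--replSet rs0`, the set can be initiated once from any driver. A sketch with the synchronous Java driver (host, port and set name are placeholders; assumes the mongodb-driver-sync artifact is on the classpath):

```java
import java.util.List;

import org.bson.Document;

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;

public class ReplicaSetInit {
    public static void main(String[] args) {
        try (MongoClient client = MongoClients.create("mongodb://localhost:27017")) {
            // Declare a replica set whose only member is this node. Once the
            // command succeeds, sessions (and therefore transactions) work.
            Document config = new Document("_id", "rs0")
                    .append("members", List.of(
                            new Document("_id", 0).append("host", "localhost:27017")));
            client.getDatabase("admin")
                  .runCommand(new Document("replSetInitiate", config));
        }
    }
}
```

If memory serves, flapdoodle's MongodConfigBuilder also accepts a replication (Storage) setting carrying a replica-set name, so the embedded server can be started as a replica-set member and then initiated the same way; check the flapdoodle docs for your version.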
I have Spring Boot integration tests (IT) that connect to a real DB or to real 3rd parties. I use them during development, but I find them quite useful for checking the real behaviour of the application, so I would like to run them during the CI process. The goal is to run them in the environment where the application is deployed, not on the CI machine where Jenkins is running. Is there a way to achieve this? I know I can, for example, use the SoapUI Maven plugin and execute tests against REST endpoints, but I would prefer to reuse the Spring Boot IT tests already written.
Many thanks
Running tests against your production database is a really bad idea. Please, please reconsider. It is better to update your test database to be more like production than to run your tests on production data.
That being said, you can point your database configuration to your production machine via the application.properties file (mongo example):
spring.data.mongodb.uri=mongodb://user:pass@production.myhost.com:27017/mydb
I'm guessing it defaults to localhost:27017. In your src/test/resources folder you can set up a different application.properties. Check out the Spring Boot externalized configuration documentation for the details.
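For example, a test-only override could live next to the tests (the host and credentials here are placeholders):

```properties
# src/test/resources/application.properties -- picked up by tests
# instead of the production settings
spring.data.mongodb.uri=mongodb://testuser:testpass@test-db.internal:27017/mydb-test
```

Because the test classpath shadows the main one, no code changes are needed to switch databases between environments.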
I have coded a Spring MVC Hibernate application with RabbitMQ as a messaging server and a MySQL DB. I have also used Hazelcast, an in-memory distributed cache, to centralize the state of the application, moving the local Tomcat session to a centralized session and implementing distributed locks.
The app right now is hosted on a single Tomcat server on my local system.
I want to test my application in a multi-JVM, multi-node environment, i.e. the app running on multiple Tomcat servers.
What would be the best approach to testing the app?
A few things that come to my mind:
A. Install and configure a load balancer and set up a Tomcat cluster on my local system. This, I believe, is a tedious task that requires much effort.
B. Host the application on a PaaS like OpenShift or Cloud Foundry, but I am not sure whether I would be able to test my application on several nodes there.
C. Any other way to simulate a clustered environment on my local Windows system?
I would suggest you first understand your application's requirements: for the real production/live environment, are you going to use Infrastructure as a Service (IaaS) or a PaaS?
If Infrastructure as a Service, then:
I would suggest creating a local cluster environment and using the Tomcat and Spring sticky-session concept. Persist the session in a Hazelcast or Redis server installed on a different node, and configure a load balancer in front of the multiple nodes running Tomcat. 2-3 VMs would be suitable for testing purposes.
If the requirement is PaaS, then:
Don't think about a local environment. Test directly on an OpenShift or AWS free account and, trust me, you will be able to test on the PaaS if the whole setup is fine.
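On persisting the session in Hazelcast: Hazelcast ships a Tomcat session manager, and if I remember the wiring correctly it is roughly a two-line change to each node's conf/context.xml once the hazelcast-tomcat-sessionmanager jars are in Tomcat's lib directory (double-check the class names against the docs for your versions):

```xml
<Context>
  <!-- Starts an embedded Hazelcast member inside this Tomcat (peer-to-peer mode) -->
  <Listener className="com.hazelcast.session.P2PLifecycleListener"/>
  <!-- Replaces Tomcat's standard session manager with a Hazelcast-backed one -->
  <Manager className="com.hazelcast.session.HazelcastSessionManager"/>
</Context>
```

With this on each node, any Tomcat in the cluster can serve a request for the same session, which is exactly what you want to exercise in a multi-node test.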
I am trying to figure out an easy way to manage many Spring Boot applications on my production server. Right now I have many fat jars running in different folders, where each one has its own script to start/stop the application, and there is an external folder for the configuration (logback, properties, XML). For the record, those configurations are loaded at Spring Boot startup via the -Dloader.path command-line option.
So how can I avoid conflicts when applications compete for the same HTTP/HTTPS port already in use in production? Is there any kind of application manager that system administrators could use to control this? One solution I found was to containerize the Spring Boot applications with Docker, but my environment is Unix Solaris.
Is there any java solution for this scenario?
You can have a look at Spring Cloud, which will give you better control and management when running multiple Boot applications. Not all components of Spring Cloud may be useful to you, but a few of them will help with port resolution, service rerouting, and property maintenance. Along with the above you can also try Spring Boot Admin (SBA).
You can also put Nginx in front for UI load balancing and reverse proxying.
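On the port-conflict part of the question: Spring Boot can be told to pick a free port at startup with server.port=0, and a registry (e.g. Eureka from Spring Cloud) then publishes whatever port was actually chosen. Under the hood this is just asking the OS for an unused ephemeral port, which plain Java can do too:

```java
import java.io.IOException;
import java.net.ServerSocket;

public class FreePort {

    // Bind to port 0 and let the OS hand back an unused ephemeral port;
    // this is effectively what server.port=0 does at startup.
    static int findFreePort() throws IOException {
        try (ServerSocket socket = new ServerSocket(0)) {
            return socket.getLocalPort();
        }
    }

    public static void main(String[] args) throws IOException {
        System.out.println("free port: " + findFreePort());
    }
}
```

The caveat is the classic race: a port found this way could be taken by another process before your application binds it, which is why registry-based discovery (register the actual port after binding) is the more robust pattern.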
I have an embedded Neo4j database created and used by a java process utilizing TinkerPop. I would like to use the Neo4j web admin and backup service with this database. I have now installed the server, but when I try to set the server database path to the existing embedded database, I get a StoreLockException (Could not create lock file) when starting the server.
How do I make this work so that I can administer and back up my database? Since I'm using TinkerPop, I actually have no direct Neo4j references in my code. The database used comes from a configuration file. I would like to avoid having to make hard dependencies on Neo4j in the code.
You can't access the database directory from two different processes at the same time. This isn't a code-level concern, just an operational concern.
You'd have to:
Shut down your application (thereby releasing the lock)
Run a backup using Neo4j tooling (of your choice)
Start your application back up again
For "live" backups without shutting down your application, you'd need to run a cluster using Neo4j Enterprise.
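The backup in step 2 can be as simple as copying the store directory once the lock is released. A minimal sketch (the paths are placeholders, and this is only safe while the application is stopped):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.stream.Stream;

public class StoreBackup {

    // Recursively copy the Neo4j store directory to a (non-existing) backup
    // location. Files.walk yields parents before children, so directories are
    // created before the files inside them are copied.
    static void copyStore(Path source, Path target) throws IOException {
        try (Stream<Path> paths = Files.walk(source)) {
            for (Path p : (Iterable<Path>) paths::iterator) {
                Files.copy(p, target.resolve(source.relativize(p)),
                        StandardCopyOption.COPY_ATTRIBUTES);
            }
        }
    }

    public static void main(String[] args) throws IOException {
        if (args.length == 2) {
            copyStore(Path.of(args[0]), Path.of(args[1]));
        }
    }
}
```

Because this is a plain file copy, it stays completely outside your application code and keeps it free of Neo4j dependencies, as you wanted.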
Cheers,
Andreas