I have a SpringBoot application which I am trying to test with the help of Testcontainers. I have something like:
@SpringBootTest
public class DummyIT {

    @ClassRule
    public static PostgreSQLContainer postgreSQLContainer = new PostgreSQLContainer();

    static {
        postgreSQLContainer.start();
        String url = postgreSQLContainer.getJdbcUrl();
    }
}
Unfortunately, Testcontainers uses random ports, and if I understood correctly we are not supposed to have it any other way, which means that the result of postgreSQLContainer.getJdbcUrl() is not deterministic.
Moreover, my application retrieves the database URL from its application.properties, and I was aiming at replacing that value with the one provided by postgreSQLContainer.getJdbcUrl() before it is first used at runtime. Is it possible to achieve this?
Thank you for your help.
You can use Testcontainers JDBC URL support.
For example,
@SpringBootTest(
    properties = {
        "spring.datasource.url=jdbc:tc:postgresql:9.6.8://localhost/your-database",
        "spring.datasource.username=dummy-user",
        "spring.datasource.password=dummy-password",
        "spring.jpa.database-platform=org.hibernate.dialect.PostgreSQL95Dialect",
        "spring.datasource.driver-class-name=org.testcontainers.jdbc.ContainerDatabaseDriver"
    })
public class DummyIT {
}
The Testcontainers driver ContainerDatabaseDriver will create and start a PostgreSQL server 9.6.8 with a database named your-database.
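Incidentally, the special tc: infix is what routes the URL to the Testcontainers driver; everything after it names the database image, its tag, and the database name. A quick string-level illustration of how such a URL decomposes (plain Java, no Testcontainers dependency; the URL is the one from the example above):

```java
public class TcUrlDemo {
    public static void main(String[] args) {
        String url = "jdbc:tc:postgresql:9.6.8://localhost/your-database";
        // The "tc" segment tells JDBC to hand the URL to ContainerDatabaseDriver;
        // the remainder names the database image, its tag, and the database name.
        String[] parts = url.split(":", 4);
        System.out.println(parts[1]); // tc
        System.out.println(parts[2]); // postgresql
        System.out.println(parts[3]); // 9.6.8://localhost/your-database
    }
}
```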
Related
I have a side project where I'm using Spring Boot, Liquibase and Postgres.
I have the following sequence of tests:
test1();
test2();
test3();
test4();
In those four tests I'm creating the same entity. As I'm not removing the records from the table after each test case, I'm getting the following exception: org.springframework.dao.DataIntegrityViolationException
I want to solve this problem with the following constraints:
I don't want to use the @Repository beans to clean the database.
I don't want to kill the database and recreate it on each test case, because I'm using Testcontainers and doing that would increase the time it takes to complete the tests.
In short: how can I remove the records from one or more tables after each test case without 1) using the @Repository of each entity and 2) killing and starting the database container on each test case?
The simplest way I found to do this was the following:
Inject a JdbcTemplate instance
@Autowired
private JdbcTemplate jdbcTemplate;
Use the class JdbcTestUtils to delete the records from the tables you need to.
JdbcTestUtils.deleteFromTables(jdbcTemplate, "table1", "table2", "table3");
Call this line in the method annotated with @After (JUnit 4) or @AfterEach (JUnit 5) in your test class:

@AfterEach
void tearDown() {
    JdbcTestUtils.deleteFromTables(jdbcTemplate, "table1", "table2", "table3");
}
I found this approach in this blog post:
Easy Integration Testing With Testcontainers
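For the curious, JdbcTestUtils.deleteFromTables is essentially a loop that issues one DELETE FROM statement per table and sums the affected-row counts. A self-contained sketch of that behaviour (the JdbcTemplate call is stubbed with a lambda purely for illustration):

```java
import java.util.function.ToIntFunction;

public class DeleteFromTablesSketch {

    // Rough approximation of JdbcTestUtils.deleteFromTables: one DELETE per
    // table, returning the total number of rows removed. The real method
    // delegates each statement to JdbcOperations.update.
    static int deleteFromTables(ToIntFunction<String> executeUpdate, String... tableNames) {
        int totalRows = 0;
        for (String table : tableNames) {
            totalRows += executeUpdate.applyAsInt("DELETE FROM " + table);
        }
        return totalRows;
    }

    public static void main(String[] args) {
        // Pretend each table currently holds 2 rows.
        int deleted = deleteFromTables(sql -> 2, "table1", "table2", "table3");
        System.out.println(deleted); // 6
    }
}
```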
Annotate your test class with @DataJpaTest. From the documentation:
By default, tests annotated with @DataJpaTest are transactional and roll back at the end of each test. They also use an embedded in-memory database (replacing any explicit or usually auto-configured DataSource).
For example, using JUnit 4:

@RunWith(SpringRunner.class)
@DataJpaTest
public class MyTest {
    //...
}

Using JUnit 5:

@DataJpaTest
public class MyTest {
    //...
}
You could use @Transactional on your test methods. That way, each test method will run inside its own transaction bracket and will be rolled back before the next test method runs.
Of course, this only works if you are not doing anything weird with manual transaction management, and it is reliant on some Spring Boot autoconfiguration magic, so it may not be possible in every use case, but it is generally a highly performant and very simple approach to isolating test cases.
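To illustrate the idea only (this is not Spring's actual implementation): each test's writes land in an open transaction that is rolled back afterwards, so nothing ever reaches committed state. A toy sketch with an in-memory list standing in for the database table:

```java
import java.util.ArrayList;
import java.util.List;

public class RollbackPerTestSketch {

    // Stand-in for the committed contents of a database table.
    static final List<String> table = new ArrayList<>();

    // Mimics the bracket Spring wraps around a @Transactional test method:
    // run the body, then roll back so its writes never reach the table.
    static void runTestInRollbackTx(List<String> txBuffer, Runnable testBody) {
        testBody.run();
        txBuffer.clear(); // rollback: the pending writes are discarded
    }

    public static void main(String[] args) {
        List<String> tx = new ArrayList<>();
        runTestInRollbackTx(tx, () -> tx.add("entity created by test1"));
        System.out.println(table.size()); // 0 -- nothing leaked into the next test
    }
}
```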
I think this is the most efficient way for PostgreSQL. You can do the same for other databases; just find out how to restart the table's sequence and execute it. (In PostgreSQL, TRUNCATE TABLE users RESTART IDENTITY would even do both in one statement.)

@Autowired
private JdbcTemplate jdbcTemplate;

@AfterEach
public void execute() {
    jdbcTemplate.execute("TRUNCATE TABLE users");
    jdbcTemplate.execute("ALTER SEQUENCE users_id_seq RESTART");
}
My personal preference would be:
private static final String[] CLEAN_TABLES_SQL = {
    "delete from table1",
    "delete from table2",
    "delete from table3"
};

@After
public void tearDown() {
    for (String query : CLEAN_TABLES_SQL) {
        getJdbcTemplate().execute(query);
    }
}
To be able to adopt this approach, you would need to extend the class with NamedParameterJdbcDaoSupport and set the DataSource in the constructor:

public class Test extends NamedParameterJdbcDaoSupport {

    public Test(DataSource dataSource) {
        setDataSource(dataSource);
    }
}
I'm setting up a Spring Boot application where certain configurations are read from my application.yaml file. I've done this a few times before and it works well, but I wondered whether there is a better way to access this configuration during runtime, or whether I'm creating possible issues by not following some best practice.
Right now the class that extracts the configuration is simply defined as a Component like this:
@Component
@EnableConfigurationProperties
@ConfigurationProperties("myPrefix")
public class MyExternalConfiguration {

    private HashMap<String, Boolean> entries = new HashMap<>();

    // A getter is needed so Spring can bind the map entries from the YAML.
    public HashMap<String, Boolean> getEntries() {
        return this.entries;
    }

    public Boolean getConfigurationForKey(String key) {
        return this.entries.get(key);
    }
}
And then autowired to several other classes that need to access this configuration like this:
@Component
public class MyClass {

    @Autowired
    private MyExternalConfiguration myExternalConfiguration;

    public void doSomething() {
        //...
        Boolean someEntry = myExternalConfiguration.getConfigurationForKey(someKey);
    }
}
Now, this does work just fine. It's just that I have seen examples where configurations like this are handled as a singleton, for example (although not in a Spring Boot environment). I would just like to ask whether there is some commonly accepted way to access external configuration, or whether you see an issue with the way I access it in my project.
Thank you in advance!
There is a whole chapter about configuration in the Spring Boot Reference Manual:
https://docs.spring.io/spring-boot/docs/current/reference/htmlsingle/#boot-features-external-config
Simply said there are two options to access configuration:
With the @Value annotation:

@Value("${name}")
private String name;
Or typesafe with a configuration class:
@ConfigurationProperties(prefix = "my")
public class Config {

    private List<String> servers = new ArrayList<String>();

    public List<String> getServers() {
        return this.servers;
    }
}
So there is no need to read the configuration file on your own.
Is it possible with Spring Boot Test to set a conditional execution of sql scripts depending on the active profile?
I mean, I have my integration tests for repositories annotated with some @Sql annotations like:

@Sql(scripts = "/scripts/entity_test_clear.sql", executionPhase = Sql.ExecutionPhase.AFTER_TEST_METHOD)
For a profile h2 I want to execute entity_test_clear.sql
For a profile mysql I want to execute entity_test_clear_mysql.sql
The reason is that I use different syntax for these databases, in particular:

ALTER TABLE organisation ALTER COLUMN org_id RESTART WITH 1;
ALTER TABLE organisation AUTO_INCREMENT = 1;

MySQL doesn't understand the first statement, while H2 doesn't understand the second (despite the MySQL mode being set, like MODE=MYSQL).
By default, I use h2 for IT tests, but also, in some rarer cases, I would like to check everything works smoothly with mysql too.
P.S. I could of course try a straightforward solution with @Profile and hard-code two copies of each test for H2 and MySQL, but that comes with huge code duplication in the tests, which I would like to avoid.
EDITED:
The test case looks like this:
@RunWith(SpringRunner.class)
@DataJpaTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
public class EntityRepositoryTestIT {

    @Autowired
    private EntityRepository entityRepository;

    @Test
    @Sql(scripts = {"/scripts/entity_test_data.sql", "/scripts/entity_test_data_many.sql"}, executionPhase = Sql.ExecutionPhase.BEFORE_TEST_METHOD)
    @Sql(scripts = "/scripts/entity_test_clear.sql", executionPhase = Sql.ExecutionPhase.AFTER_TEST_METHOD)
    public void findTest() {
        Page<Entity> e = entityRepository.findBySomeDetails(1L, PageRequest.of(0, 20));
        Assert.assertEquals(3, e.getContent().size());
        Assert.assertEquals(1, e.getContent().get(0).getResources().size());
        // more asserts
    }
}
Thank you for any suggestions!
After some deeper digging into the issue I ended up with this simple workaround.
@Sql(scripts = "/scripts/entity_test_clear.sql", executionPhase = Sql.ExecutionPhase.AFTER_TEST_METHOD)

The scripts parameter is required to be a compile-time constant, so you cannot simply fetch the current profile value from application.properties and substitute it to select the right script name.
Introducing @After and @Before methods that execute the right scripts with ScriptUtils is rather verbose and, in fact, didn't work for me (some freezing occurred during script execution).
So what I did was just introduce a class with a single constant:

/**
 * Constant holder, exceptionally for database IT testing purposes,
 * for switching between h2 and mysql.
 */
public class ActiveProfile {

    /**
     * Current profile for database IT tests.
     * Make sure the value is equal to the value of
     * <i>spring.profiles.active</i> in the test application.properties.
     */
    public static final String NOW = "h2";
}
Then the @Sql line becomes:

@Sql(scripts = "/scripts/test_data_clear_" + ActiveProfile.NOW + ".sql", executionPhase = Sql.ExecutionPhase.AFTER_TEST_METHOD)
To use another database for testing (MySQL), I just need to 1) change the profile to spring.profiles.active=mysql in application.properties and 2) change this constant to "mysql".
This isn't meant to be an exemplary solution, just a workaround that simply works.
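This works inside the annotation only because ActiveProfile.NOW is a compile-time constant, which makes the concatenated string a constant expression too. A minimal check of the resulting path (names mirror the answer above):

```java
public class ScriptPathDemo {

    // Mirrors the ActiveProfile holder from the answer; "h2" is the default.
    static final String NOW = "h2";

    // Because NOW is a compile-time constant, this concatenation is itself a
    // constant expression and may therefore appear in a @Sql annotation.
    static final String CLEAR_SCRIPT = "/scripts/test_data_clear_" + NOW + ".sql";

    public static void main(String[] args) {
        System.out.println(CLEAR_SCRIPT); // /scripts/test_data_clear_h2.sql
    }
}
```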
You could use the @Profile annotation with separate classes, one for each DBMS, putting the common logic in another class to avoid the code duplication. You are using Spring, so you could achieve it with something like this:
@Profile("mysql")
@Sql(scripts = "... my mysql scripts ...")
public class MySqlTests {

    @Autowired
    private CommonTestsLogic commonLogic;

    @Test
    public void mySqlTest1() {
        commonLogic.test1();
    }
}

@Profile("oracle")
@Sql(scripts = "... my oracle scripts ...")
public class MyOracleTests {

    @Autowired
    private CommonTestsLogic commonLogic;

    @Test
    public void myOracleTest1() {
        commonLogic.test1();
    }
}
I have a BaseTest class which consists of several tests. Each test shall be executed for EVERY profile I list.
I thought about using Parameterized values such as:
@RunWith(Parameterized.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
// @ActiveProfiles("h2-test") // <-- how to iterate over this?
public abstract class BaseTest {

    @Autowired
    private TestRepository test;

    // to be used with Parameterized/Spring
    private TestContextManager testContextManager;

    public BaseTest(String profile) {
        System.setProperty("spring.profiles.active", profile);
        // TODO what now?
    }

    @Parameterized.Parameters
    public static Collection<Object[]> data() {
        Collection<Object[]> params = new ArrayList<>();
        params.add(new Object[] { "h2-test" });
        params.add(new Object[] { "mysql-test" });
        return params;
    }

    @Before
    public void setUp() throws Exception {
        this.testContextManager = new TestContextManager(getClass());
        this.testContextManager.prepareTestInstance(this);
        // maybe I can spin up Spring here with my profile?
    }

    @Test
    public void testRepository() {
        Assert.assertTrue(test.exists("foo"));
    }
}
How would I tell Spring to run each test with these different profiles? In fact, each profile will talk to different datasources (in-memory h2, external mysql, external oracle, ..) so my repository/datasource has to be reinitialized.
I know that I can specify @ActiveProfiles(...) and I can even extend from BaseTest and override the @ActiveProfiles annotation. Although this will work, I am only showing a portion of my test suite. Lots of my test classes extend from BaseTest and I don't want to create several different profile stubs for each class. Currently working, but ugly, solution:
BaseTest (@ActiveProfiles("mysql"))
FooClassMySQL (annotation from BaseTest)
FooClassH2 (@ActiveProfiles("h2"))
BarClassMySQL (annotation from BaseTest)
BarClassH2 (@ActiveProfiles("h2"))
Thanks
For what it's worth:
My use case was to run a specific test class for multiple Spring profiles; this is how I achieved it:

@SpringBootTest
abstract class BaseTest {
    @Test
    void doSomeTest() { /* ... ARRANGE-ACT-ASSERT ... */ }
}

@ActiveProfiles("NextGen")
class NextGenTest extends BaseTest {}

@ActiveProfiles("Legacy")
class LegacyTest extends BaseTest {}
If you use Maven you can actually specify active profile from command line (or env variable if needed):
mvn clean test -Dspring.profiles.active=h2-test
The approach with a parameterized test may not work in this case, because the profile has to be specified before Spring boots up its context. When you run a parameterized integration test, the context will already be booted up before the test runner starts running your test. Also, JUnit's parameterized tests were invented for other reasons (running unit tests with different data series).
EDIT: One more thing: when you decide to use @RunWith(Parameterized.class), you won't be able to use a different runner. In many cases (if not all, when it comes to integration testing) you want to specify a different runner, like SpringRunner.class; with a parameterized test you won't be able to do so.
Spring profiles are not designed to work this way.
In your case, each profile uses a specific datasource, so each one requires a Spring Boot context load to run tests against the expected datasource.
In fact, what you want to do amounts to running as many Maven builds as there are Spring profiles you want to test.
Besides, builds in a local environment should be as fast as possible. Multiplying automated test executions by DBMS implementation, where each requires a Spring Boot reload, will not help.
You should not need to specify @ActiveProfiles.
It looks rather like a task for a Continuous Integration tool, where you could define a job that executes (sequentially or in parallel) each Maven build, specifying a Spring Boot profile:

mvn clean test -Dspring.profiles.active=h2
mvn clean test -Dspring.profiles.active=mysql
etc.

You can also try to do this locally by writing a script that performs the Maven builds. But as said, it will slow down your local build and also complicate it.
I'm trying to use ElasticSearch java API in a Dropwizard application.
I found the dropwizard-elasticsearch package: https://github.com/dropwizard/dropwizard-elasticsearch, that seems to be exactly what I need.
Unfortunately, it provides zero "useful" documentation, and no usage examples.
I still haven't understood how to connect to remote servers using the TransportClient because, given the lack of documentation of the dropwizard-elasticsearch configuration, I would have to try keys at random until I found the correct ones...
Has anyone tried using dropwizard-elasticsearch? Or does someone have a real usage example of it?
Thanks in advance,
Unless you really need to join the Elasticsearch cluster, I would avoid using the Java classes provided by Elasticsearch. If you do connect to Elasticsearch this way, you will need to keep the JVM versions used by Elasticsearch and your application in sync.
Instead, you can connect to Elasticsearch using the Jest client found on GitHub. This will allow you to connect to Elasticsearch over the REST interface, just like all of the other client libraries.
You will need to create a simple configuration block for Elasticsearch, to specify the URL of the REST interface. Also, you will need to create a Manager for starting and stopping the JestClient.
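Dropwizard's io.dropwizard.lifecycle.Managed interface exposes start()/stop() lifecycle hooks, and a manager for the JestClient mostly just needs to shut the client down on stop(). Here is a standalone sketch with the interface and the client stubbed out; the real types would be io.dropwizard.lifecycle.Managed and io.searchbox.client.JestClient, so everything below is illustrative only:

```java
public class JestManagerSketch {

    // Minimal stand-in for io.dropwizard.lifecycle.Managed.
    interface Managed {
        void start() throws Exception;
        void stop() throws Exception;
    }

    // Stand-in for io.searchbox.client.JestClient; the real client would be
    // built by a JestClientFactory configured with the REST URL from YAML.
    static class FakeJestClient {
        boolean closed;
        void shutdownClient() { closed = true; }
    }

    // The Managed wrapper Dropwizard would call on application shutdown.
    static class ManagedJestClient implements Managed {
        private final FakeJestClient client;
        ManagedJestClient(FakeJestClient client) { this.client = client; }
        @Override public void start() { /* client is ready on construction */ }
        @Override public void stop() { client.shutdownClient(); }
    }

    public static void main(String[] args) throws Exception {
        FakeJestClient client = new FakeJestClient();
        Managed managed = new ManagedJestClient(client);
        managed.start();
        managed.stop();
        System.out.println(client.closed); // true
    }
}
```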
Update: You can find the Dropwizard bundle that I use for connecting to Elasticsearch on GitHub. Here are some basic usage instructions for Java 8:
Include the dependency for the bundle in your project's POM.
<dependency>
<groupId>com.meltmedia.dropwizard</groupId>
<artifactId>dropwizard-jest</artifactId>
<version>0.1.0</version>
</dependency>
Define the JestConfiguration class somewhere in your application's configuration.

import com.meltmedia.dropwizard.jest.JestConfiguration;
...

@JsonProperty
protected JestConfiguration elasticsearch;

public JestConfiguration getElasticsearch() {
    return elasticsearch;
}
Then include the bundle in the initialize method of your application.

import com.meltmedia.dropwizard.jest.JestBundle;
...

protected JestBundle jestBundle;

@Override
public void initialize(Bootstrap<ExampleConfiguration> bootstrap) {
    bootstrap.addBundle(jestBundle = JestBundle.<ExampleConfiguration>builder()
        .withConfiguration(ExampleConfiguration::getElasticsearch)
        .build());
}

Finally, use the bundle to access the client supplier.

@Override
public void run(ExampleConfiguration config, Environment env) throws Exception {
    JestClient client = jestBundle.getClientSupplier().get();
}
Too long for a comment.
Please check README.md -> "Usage" and "Configuration". If you want Dropwizard to create a managed TransportClient, your configuration settings should look something like this:
nodeClient: false
clusterName: dropwizard_elasticsearch_test
servers:
- 127.23.42.1:9300
- 127.23.42.2:9300
- 127.23.42.3
How to obtain the Dropwizard-managed TransportClient? There is an example in the test public void transportClientShouldBeCreatedFromConfig().
@Override
public void run(DemoConfiguration config, Environment environment) {
    final ManagedEsClient managedClient = new ManagedEsClient(config.getEsConfiguration());
    Client client = managedClient.getClient();
    ((TransportClient) client).transportAddresses().size();
    // [...]
}
There is also a sample blog application using Dropwizard and ElasticSearch. See "Acknowledgements" section in README.md.
I have used the Java API for Elasticsearch. I also explored the bundle you are using, but its documentation discouraged me from using it. Here is how you can use Elasticsearch without that bundle:
Define your Elasticsearch config in your .yml file:
elasticsearchHost: 127.0.0.1
elasticPort: 9300
clusterName: elasticsearch
Now, in the configuration class (which in my case is mkApiConfiguration), create static functions which will actually be the getter methods for these Elasticsearch configuration values:
@NotNull
private static String elasticsearchHost;

@NotNull
private static Integer elasticPort;

@NotNull
private static String clusterName;

@JsonProperty
public static String getElasticsearchHost() {
    return elasticsearchHost;
}

// This function will be called while reading configuration from the yml file
@JsonProperty
public void setElasticsearchHost(String elasticsearchHost) {
    mkApiConfiguration.elasticsearchHost = elasticsearchHost;
}

@JsonProperty
public void setClusterName(String clusterName) {
    mkApiConfiguration.clusterName = clusterName;
}

public void setElasticPort(Integer elasticPort) {
    mkApiConfiguration.elasticPort = elasticPort;
}

@JsonProperty
public static String getClusterName() {
    return clusterName;
}

@JsonProperty
public static Integer getElasticPort() {
    return elasticPort;
}
Now create an ElasticFactory from which you can get a transport client. It is better to make it a singleton class so that only one instance is created and shared. For the Elasticsearch settings we can use the static getter methods of the configuration class, so we don't need to create an object to access them. The code of this factory goes like this:
public class ElasticFactory {

    // Private constructor
    private ElasticFactory() {}

    public static Client getElasticClient() {
        try {
            /*
             * Creating Transport client instance
             */
            Client client = TransportClient.builder().build()
                .addTransportAddress(new InetSocketTransportAddress(
                    InetAddress.getByName(mkApiConfiguration.getElasticsearchHost()),
                    mkApiConfiguration.getElasticPort()));
            return client;
        } catch (Exception e) {
            e.printStackTrace();
            return null;
        }
    }
}
Now you can call this factory method from any class like below:

/* As getElasticClient is a static method, we can access it directly without creating an object of the ElasticFactory class */
Client elasticInstance = ElasticFactory.getElasticClient();