We have a large system using a PostgreSQL database with a rather complex schema, and many DB-related integration tests for that system.
Because of the complex DB structure and the Postgres-specific SQL in the code, mocking Postgres with H2 (or another in-memory DB) seems highly unreliable.
So we are using JUnit tests of the following structure:
@RunWith(SpringRunner.class)
@JdbcTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
@Sql( ... schema creation, sample data, etc. )
@ContextConfiguration(classes = ... dao and service classes used in the test)
Everything is OK when you have 2-3 test classes. Problems start to arise when you have 10+ test classes. As I understand it, Spring Boot creates a separate connection pool for every distinct context configuration. To keep tests as isolated as possible, we usually include in the context configuration only the components used inside the test. So Spring Boot creates dozens of connection pools, which leads to "too many connections"-type errors from the connection pool or the JDBC driver. You can run your tests one by one, but you cannot run them all at once (so, say farewell to CI).
We are using the following workaround; this snippet is copy-pasted into every test class:
// <editor-fold desc="connection leaking fix">
@Autowired
private DataSource dataSource;

private static HikariDataSource hikariDataSource;

@Before
public void saveDataSource() {
    hikariDataSource = (HikariDataSource) dataSource;
}

@AfterClass
public static void releaseDataSource() {
    if (hikariDataSource != null) {
        hikariDataSource.close();
    }
}
// </editor-fold>
It works, but you have to remember not to paste that snippet into test classes that share the same context configuration.
The question: is there any way to tell Spring Boot to close the connection pool after each test class finishes, or any way to limit the number of connection pools Spring Boot creates?
@M.Deinum is right: the only way to solve the problem without hacking some workaround is to use a limited number of configurations. So you can use something like this to test just the DAO layer:
@RunWith(SpringRunner.class)
@JdbcTest(includeFilters = @ComponentScan.Filter(Repository.class))
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
@Sql(...)
or something like this to test the DAO and service layers:
@RunWith(SpringRunner.class)
@JdbcTest(includeFilters = {
    @ComponentScan.Filter(Repository.class),
    @ComponentScan.Filter(Service.class)
})
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
@Sql(...)
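In addition to merging configurations, you can cap how many connections each test context's pool is allowed to open, so that even several cached contexts stay under the server's connection limit. A sketch of what that could look like in a test properties file (the property names are standard Spring Boot/HikariCP ones; the values are illustrative, tune them for your setup):

```properties
# src/test/resources/application.properties
# Cap each test context's Hikari pool at 2 connections
# and let idle connections be reclaimed quickly.
spring.datasource.hikari.maximum-pool-size=2
spring.datasource.hikari.minimum-idle=0
spring.datasource.hikari.idle-timeout=10000
```

This doesn't close the pools, but it bounds the total connection count at roughly 2 x (number of cached contexts).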
Related
I'm trying to write some tests to see if my transactional methods are working fine. However, I do not fully understand whether or not I should mock the database, and how jOOQ comes into this equation. Below is the service class with the transaction that adds a role into the database.
@Service
public class RoleService implements GenericRepository<Role>
{
    @Autowired
    private DSLContext roleDSLContext;

    @Override
    @Transactional
    public int add(Role roleEntry)
    {
        return roleDSLContext.insertInto(Tables.ROLE,
                Tables.ROLE.NAME,
                Tables.ROLE.DESCRIPTION,
                Tables.ROLE.START_DATE,
                Tables.ROLE.END_DATE,
                Tables.ROLE.ID_RISK,
                Tables.ROLE.ID_TYPE,
                Tables.ROLE.ID_CONTAINER)
            .values(roleEntry.getName(),
                roleEntry.getDescription(),
                roleEntry.getStartDate(),
                roleEntry.getEndDate(),
                roleEntry.getIdRisk(),
                roleEntry.getIdType(),
                roleEntry.getIdContainer())
            .execute();
    }
}
I'm using MySQL, and the connection to the database is made using the Spring config file:
spring.datasource.url=jdbc:mysql://localhost:3306/role_management?verifyServerCertificate=false&useSSL=true
spring.datasource.username=root
spring.datasource.password=123456
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
I'm assuming I don't have to reconnect to the database every time I test the transaction and close the connection after it finishes. I know that there is
MockDataProvider provider = new MockDataProvider()
but I don't understand how it works.
What is the best way to test the before mentioned method?
Disclaimer
Have you read the big disclaimer in the jOOQ manual regarding mocking of your database?
Disclaimer: The general idea of mocking a JDBC connection with this jOOQ API is to provide quick workarounds, injection points, etc. using a very simple JDBC abstraction. It is NOT RECOMMENDED to emulate an entire database (including complex state transitions, transactions, locking, etc.) using this mock API. Once you have this requirement, please consider using an actual database product instead for integration testing, rather than implementing your test database inside of a MockDataProvider.
It is very much recommended that you use something like Testcontainers to integration test your application, instead of implementing your own "database product" via the mock SPI of jOOQ (or any other means of mocking).
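For context, pulling Testcontainers' MySQL module into a Maven build typically looks like this (the version number is illustrative; check for the current release):

```xml
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>mysql</artifactId>
    <version>1.19.7</version>
    <scope>test</scope>
</dependency>
```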
If you must mock
To answer your actual question, you can configure your DSLContext programmatically, e.g. using:
@Bean
public DSLContext getDSLContext() {
    if (testing)
        return // the mocking context
    else
        return // the actual context
}
Now inject some Spring profile value, or whatever, to the above configuration class containing that DSLContext bean configuration, and you're all set.
Alternatively, use constructor injection instead of field injection (there are many benefits to that):
@Service
public class RoleService implements GenericRepository<Role> {
    final DSLContext ctx;

    public RoleService(DSLContext ctx) {
        this.ctx = ctx;
    }

    // ...
}
So you can manually construct your service in the test that mocks the database:
RoleService testService = new RoleService(mockingContext);
testService.add(...);
But as you can see, the mocking is completely useless. Because what you want to test is that there's a side effect in your database (a record has been inserted), and to test that side effect, you'll want to query the database again, but unless you mock that as well, or re-implement an entire RDBMS, you won't see that record in the database. So, again, why not just integration test your code, instead?
I have 2 parent test classes:
@SpringBootTest(properties = {
    "spring.datasource.url=jdbc:tc:mysql:8.0.25:///my_test_db?TC_INITSCRIPT=db/init_mysql.sql",
    "spring.datasource.driver-class-name=org.testcontainers.jdbc.ContainerDatabaseDriver"
})
@TestConstructor(autowireMode = TestConstructor.AutowireMode.ALL)
public abstract class UserApplicationIntegrationTest {
}
and
@SpringBootTest
@TestConstructor(autowireMode = TestConstructor.AutowireMode.ALL)
public abstract class UserApplicationTest {
}
The idea is for various test classes to extend these classes. The ones which require a dockerised MySQL DB will extend UserApplicationIntegrationTest. The ones which don't need a DB connection but do require a Spring context will extend UserApplicationTest.
In the absence of UserApplicationIntegrationTest, all the test classes extending UserApplicationTest work well, including using the Mockito framework. Unfortunately, when I introduce UserApplicationIntegrationTest and its sub-tests (which work perfectly with the dockerised db instance), these tests begin to fail as they suddenly demand a datasource.
Caused by: org.springframework.boot.autoconfigure.jdbc.DataSourceProperties$DataSourceBeanCreationException: Failed to determine a suitable driver class
If I try excluding datasource auto-configuration either in app properties or in annotations of the parent class, the testcontainers tests (those extending UserApplicationIntegrationTest) start failing because of a problem with the Spring context and not being able to autowire beans any longer in those tests.
Before I know it, I'm down a rabbit hole of attempting messy exclusions/additions that I've been down before in previous projects and it only leads to problems further down the line.
Essentially I want 3 types of tests coexisting in my project:
Unit tests with no Spring context
Unit tests with a Spring context (including lots of mocking but still autowiring/constructor injection support)
Integration tests with a Spring context that spin up testcontainers and allow me to test DB interactions (and potentially end to end tests to come)
The original reason that I wanted to avoid launching testcontainers for all Spring context tests (which would 'work' perfectly well and only include 1 docker delay in the build process) was because it was irritating me to have to wait for the mysql connection to the dockerised instance every time I ran individual Spring context tests locally during development.
Is there a tidy way to achieve this or an altogether better way of navigating the requirement?
Thanks in advance.
Hopefully I understand you right; what I did was implement an abstract Testcontainers test class:
package de.dwosch.it;

@ActiveProfiles("test")
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
@ContextConfiguration(initializers = AbstractPostgreSQLTestContainerIT.Initializer.class)
@Testcontainers
public abstract class AbstractPostgreSQLTestContainerIT {

    private static final String POSTGRES_VERSION = "postgres:11.1";

    public static PostgreSQLContainer database;

    static {
        database = new PostgreSQLContainer(POSTGRES_VERSION);
        database.start();
    }

    static class Initializer implements ApplicationContextInitializer<ConfigurableApplicationContext> {
        @Override
        public void initialize(ConfigurableApplicationContext configurableApplicationContext) {
            TestPropertySourceUtils.addInlinedPropertiesToEnvironment(
                configurableApplicationContext,
                "spring.datasource.url=" + database.getJdbcUrl(),
                "spring.datasource.username=" + database.getUsername(),
                "spring.datasource.password=" + database.getPassword()
            );
        }
    }
}
Then I just extend my test classes from this abstract class, which fires up a test container and the whole Spring context, for better separation:
class MyAwesomeControllerIT extends AbstractPostgreSQLTestContainerIT { }
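If the container startup delay on every run bothers you, Testcontainers has an (experimental) reuse feature that keeps the container alive between test runs: you would call .withReuse(true) on the container and opt in on the machine via a Testcontainers properties file (assumption: your Testcontainers version and container type support reuse):

```properties
# ~/.testcontainers.properties
# Opt in to container reuse; containers started with .withReuse(true)
# survive across test runs instead of being stopped by Ryuk.
testcontainers.reuse.enable=true
```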
I have a DAO layer for my database, and now I am writing some integration tests for it. I wonder whether @Transactional or @Rollback should be used in a test class, as they both revert the changes to the database. Which one would be good practice, and in what conditions?
I tried using both of them and they both work in my case. I have a @Before annotated method in my class.
@RunWith(SpringRunner.class)
@AutoConfigureTestDatabase(replace = NONE)
@DataJpaTest
// @Transactional or @Rollback?
public class TestDao {

    @Autowired
    private ConcreteDao concreteDao;

    @Before
    public void cleanUp(){ . . . }

    @Test
    public void testSaveAllEntries(){ . . . }

    // and other tests
}
I agree with @michael: don't make your tests @Transactional at all (but do make your service layer transactional). This means that all your service-layer/persistence-layer methods invoked through the tests start their own transactions (you made them transactional, didn't you?) and flush the changes upon commit. So you are guaranteed to notice if something blows up on flush, and, quite possibly, to end up after a while with a database full of junk test data.
Running tests against databases is usually done with integration tests. To keep it simple, you could set up an H2 database with the dialect you need: prepare the database structures, run the service calls you need, assert the expected results, and mark the test method with @DirtiesContext (as an annotation), or re-set up the database for each test; otherwise the saved results of one test could have an impact on another. This way you can also test the transaction handling of your services.
Adding @Transactional to your tests changes the behaviour of your business logic; keep it out of your tests.
I was not able to find any information regarding configuration of the AppDynamics agent for JUnit tests. I would like to test the performance of the Hibernate queries of a Spring-based web service backed by a PostgreSQL database. The tests must be able to roll back the data on termination.
Should these be unit or integration tests? What is the best way to accomplish this? How do I make AppDynamics collect and display graphs of query execution times?
UPDATE:
I was not able to set up the AppDynamics agent for JUnit tests inside IDEA. The VM arguments point to the agent (-javaagent:"C:\Tools\AppDynamicsAgent\javaagent.jar") and the firewall is off, but for some reason the AppDynamics web-based (SaaS) setup dialog shows that no agent is able to connect.
You need both unit tests and integration tests. Unit tests should not use a database, files, etc. I like to use Spring profiles for my tests. For instance, say I have a profile called integration_test.
@ActiveProfiles("integration_test")
@ContextConfiguration(locations = { "classpath:your-context.xml" })
@RunWith(SpringJUnit4ClassRunner.class)
public abstract class DaoTest
{
    @Autowired
    protected DataSource dataSource;

    // delete all your stuff here
    protected void clearDatabase()
    {
        JdbcTemplate jdbc = new JdbcTemplate(dataSource);
        jdbc.execute("delete table");
    }

    @Before
    public final void init()
    {
        clearDatabase();
    }

    @After
    public final void cleanup()
    {
        clearDatabase();
    }
}
(I'm using XML.) Then in your context do something like <beans profile="integration_test">TODO</beans> and configure your data source in there.
I know there are ways to roll back all your transactions after running a test, but I like this better. Just don't delete everything in your real database, haha. You could even put some safety code in clearDatabase to make sure that doesn't happen.
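That safety code could be as simple as a guard that refuses to run the destructive cleanup unless the JDBC URL clearly points at a test database. A hypothetical sketch (the class, method, and "contains 'test'" convention are all illustrative assumptions; adapt them to your naming scheme):

```java
// Hypothetical safety guard: only allow destructive cleanup against
// databases whose JDBC URL marks them as test databases.
class TestDatabaseGuard {
    static void assertTestDatabase(String jdbcUrl) {
        // Assumption: test databases have "test" somewhere in their JDBC URL.
        if (jdbcUrl == null || !jdbcUrl.toLowerCase().contains("test")) {
            throw new IllegalStateException(
                "Refusing to clear non-test database: " + jdbcUrl);
        }
    }
}
```

You would call it at the top of clearDatabase(), obtaining the URL e.g. from dataSource.getConnection().getMetaData().getURL().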
For performance testing you will really need to figure out what you want to achieve, and what is meaningful to display. If you have a specific question about performance testing you can ask that, otherwise it is too broader topic.
Maybe you can make a mini-webapp which does performance testing for you and has the results exposed as URL requests for displaying in HTML. Really just depends on how much effort you are willing to spend on it and what you want to test.
Once the agent is attached, you can use the AppDynamics Database Queries window.
I am writing unit tests for my services, e.g.:
@Test
@Rollback(value = true)
public void testMethod()
{
    // insert test data
    myService.Method(); // read/write from DB
    // asserts go here
}
While the application is running, a new transaction is created every time method A is entered. But during unit test execution, the transaction is created when testMethod is entered, so method A doesn't create a new one.
For proper testing I need to clear the cache before every call to a service inside the test. I don't want to write session.clear() before every service call in each unit test. What are the best practices here?
The EntityManager has a method clear() that drops the whole persistence context:
Clear the persistence context, causing all managed entities to become detached. Changes made to entities that have not been flushed to the database will not be persisted.
If you run a query right after calling that method, the results will come directly from the database, not from a cache.
If you want to run this before every test, consider using a JUnit @Rule by subclassing ExternalResource and running the method in before() or after(). You can reuse that in all your database tests.
There are several ways:
Evict Caches Manually
@Autowired
private CacheManager cacheManager;

public void evictAllCaches() {
    for (String name : cacheManager.getCacheNames()) {
        cacheManager.getCache(name).clear();
    }
}
Turning Off Cache for Integration Test Profile
For Spring Boot: spring.cache.type=NONE
or
/** Disabling cache for integration tests */
@Bean
public CacheManager cacheManager() {
    return new NoOpCacheManager();
}
Use @DirtiesContext
@DirtiesContext(classMode = DirtiesContext.ClassMode.AFTER_EACH_TEST_METHOD)
class CacheAffectedTest { ...
In this case the Spring context is re-created after every test; in my measurements, the tests' running time tripled.
Also, Spring Boot DevTools turns caching off automatically during development.
See Spring Cache and Integration Testing and A Quick Guide to @DirtiesContext.