I was not able to find any information about configuring the AppDynamics agent for JUnit tests. I would like to test the performance of Hibernate queries in a Spring-based web service backed by a PostgreSQL database. The tests must be able to roll back their data on termination.
Should these be unit or integration tests? What is the best way to accomplish this? How can I make AppDynamics collect and display graphs of query execution times?
UPDATE:
I was not able to set up the AppDynamics agent for JUnit tests inside IDEA. The VM arguments point to the agent (-javaagent:"C:\Tools\AppDynamicsAgent\javaagent.jar") and the firewall is off, but for some reason the AppDynamics web-based (SaaS) setup dialog shows that no agent is able to connect.
You need both unit tests and integration tests. Unit tests should not use a database or the file system, etc. I like to use Spring profiles for my tests. For instance, suppose I have a profile called integration_test:
@ActiveProfiles("integration_test")
@ContextConfiguration(locations = { "classpath:your-context.xml" })
@RunWith(SpringJUnit4ClassRunner.class)
public abstract class DaoTest
{
    @Autowired
    protected DataSource dataSource;

    // delete all your stuff here
    protected void clearDatabase()
    {
        JdbcTemplate jdbc = new JdbcTemplate(dataSource);
        jdbc.execute("delete from my_table");
    }

    @Before
    public final void init()
    {
        clearDatabase();
    }

    @After
    public final void cleanup()
    {
        clearDatabase();
    }
}
(I'm using XML.) Then in your context do something like <beans profile="integration_test">TODO</beans> and configure your data source in there.
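To fill in that TODO, a test-profile datasource might look something like the fragment below. The datasource class, URL, and credentials are placeholders I've made up for illustration; use whatever pool and database your project already depends on, and make sure the profile name matches the one in @ActiveProfiles.

```xml
<!-- Sketch only: a datasource active only under the test profile -->
<beans profile="integration_test">
    <bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource"
          destroy-method="close">
        <property name="driverClassName" value="org.postgresql.Driver"/>
        <property name="url" value="jdbc:postgresql://localhost:5432/app_test"/>
        <property name="username" value="test"/>
        <property name="password" value="test"/>
    </bean>
</beans>
```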
I know there are ways to roll back all your transactions after running a test, but I like this better. Just don't delete everything in your real database; you could even put some safety code in clearDatabase to make sure that doesn't happen.
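That safety code could be as simple as refusing to run against anything that doesn't look like a test database. Here is a minimal sketch; the class and the "_test" naming convention are my own assumptions, not from the original answer:

```java
// Hypothetical guard: only allow clearDatabase() to run against JDBC URLs
// that follow a test-database naming convention. Adjust to your setup.
class TestDbGuard {

    // True only when the URL's database name ends in "_test"
    // (our assumed convention for test databases).
    static boolean isTestDatabase(String jdbcUrl) {
        if (jdbcUrl == null) {
            return false;
        }
        // Strip any query string, then look at the last path segment.
        String withoutQuery = jdbcUrl.split("\\?")[0];
        int lastSlash = withoutQuery.lastIndexOf('/');
        String dbName = lastSlash >= 0 ? withoutQuery.substring(lastSlash + 1) : withoutQuery;
        return dbName.endsWith("_test");
    }

    // Call this at the top of clearDatabase() to fail fast.
    static void assertTestDatabase(String jdbcUrl) {
        if (!isTestDatabase(jdbcUrl)) {
            throw new IllegalStateException("Refusing to clear non-test database: " + jdbcUrl);
        }
    }
}
```

A guard like this costs one line per destructive helper and turns a catastrophic mistake into a failed test.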
For performance testing you will really need to figure out what you want to achieve and what is meaningful to display. If you have a specific question about performance testing you can ask that; otherwise it is too broad a topic.
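As a very rough starting point, you can get crude query timings without any APM tooling by wrapping the DAO call in a micro-harness. This harness is my own sketch, not part of AppDynamics or Spring, and it ignores JVM warm-up and outliers, so treat the numbers as indicative only:

```java
// Crude timing harness: runs an action repeatedly and reports the
// average wall-clock time per run. No warm-up or outlier handling.
class QueryTimer {

    static long averageNanos(Runnable action, int runs) {
        if (runs <= 0) {
            throw new IllegalArgumentException("runs must be positive");
        }
        long total = 0;
        for (int i = 0; i < runs; i++) {
            long start = System.nanoTime();
            action.run();
            total += System.nanoTime() - start;
        }
        return total / runs;
    }
}
```

You would call it as something like QueryTimer.averageNanos(() -> dao.findAll(), 100) and log the result; a real agent such as AppDynamics gives you far more context than this.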
Maybe you can make a mini web app which does the performance testing for you and exposes the results via URL requests for display in HTML. It really just depends on how much effort you are willing to spend on it and what you want to test.
Once the agent is attached, you can use the AppDynamics Database Queries window.
I'm trying to write some tests to see whether my transactional methods are working correctly. However, I do not fully understand whether or not I should mock the database, and how jOOQ comes into the equation. Below is the service class with the transaction that adds a role to the database.
@Service
public class RoleService implements GenericRepository<Role>
{
    @Autowired
    private DSLContext roleDSLContext;

    @Override
    @Transactional
    public int add(Role roleEntry)
    {
        return roleDSLContext.insertInto(Tables.ROLE,
                Tables.ROLE.NAME,
                Tables.ROLE.DESCRIPTION,
                Tables.ROLE.START_DATE,
                Tables.ROLE.END_DATE,
                Tables.ROLE.ID_RISK,
                Tables.ROLE.ID_TYPE,
                Tables.ROLE.ID_CONTAINER)
            .values(roleEntry.getName(),
                roleEntry.getDescription(),
                roleEntry.getStartDate(),
                roleEntry.getEndDate(),
                roleEntry.getIdRisk(),
                roleEntry.getIdType(),
                roleEntry.getIdContainer())
            .execute();
    }
}
I'm using MySQL, and the connection to the database is configured in the Spring config file:
spring.datasource.url=jdbc:mysql://localhost:3306/role_management?verifyServerCertificate=false&useSSL=true
spring.datasource.username=root
spring.datasource.password=123456
spring.datasource.driver-class-name=com.mysql.cj.jdbc.Driver
I'm assuming I don't have to reconnect to the database every time I test the transaction and close the connection after it finishes. I know that there is
MockDataProvider provider = new MockDataProvider()
but I don't understand how it works.
What is the best way to test the method mentioned above?
Disclaimer
Have you read the big disclaimer in the jOOQ manual regarding mocking of your database?
Disclaimer: The general idea of mocking a JDBC connection with this jOOQ API is to provide quick workarounds, injection points, etc. using a very simple JDBC abstraction. It is NOT RECOMMENDED to emulate an entire database (including complex state transitions, transactions, locking, etc.) using this mock API. Once you have this requirement, please consider using an actual database product instead for integration testing, rather than implementing your test database inside of a MockDataProvider.
It is very much recommended that you use something like Testcontainers to integration-test your application, instead of implementing your own "database product" via the mock SPI of jOOQ (or any other means of mocking).
If you must mock
To answer your actual question: you can configure your DSLContext programmatically, e.g. using:
@Bean
public DSLContext getDSLContext() {
    if (testing)
        return // the mocking context
    else
        return // the actual context
}
Now inject some Spring profile value, or whatever, into the above configuration class containing that DSLContext bean configuration, and you're all set.
Alternatively, use constructor injection instead of field injection (there are many benefits to that):
@Service
public class RoleService implements GenericRepository<Role> {

    final DSLContext ctx;

    public RoleService(DSLContext ctx) {
        this.ctx = ctx;
    }

    // ...
}
So you can manually construct your service in the test that mocks the database:
RoleService testService = new RoleService(mockingContext);
testService.add(...);
But as you can see, the mocking is completely useless. What you want to test is that there is a side effect in your database (a record has been inserted), and to verify that side effect you will want to query the database again; unless you mock that as well, or re-implement an entire RDBMS, you won't see the record in the database. So, again, why not just integration-test your code instead?
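To make that argument concrete, here is a hypothetical illustration in plain Java (no jOOQ involved): a scripted mock reports a successful insert but stores nothing, so a follow-up lookup sees no row; the moment you give the mock real state, you are re-implementing the database yourself.

```java
import java.util.HashMap;
import java.util.Map;

// Illustration only: why scripting an insert's return value proves nothing
// about the side effect you actually care about.
class MockingDemo {

    // Scripted mock: always claims "1 row inserted", keeps no state.
    static int scriptedInsert(String id, String name) {
        return 1;
    }

    // Stateful fake: for a later lookup to work, you must store the row
    // yourself, i.e. you start re-implementing the database.
    static final Map<String, String> fakeTable = new HashMap<>();

    static int statefulInsert(String id, String name) {
        fakeTable.put(id, name);
        return 1;
    }

    static String lookup(String id) {
        return fakeTable.get(id);
    }
}
```

An integration test against a real database gets both halves (the insert and the visible row) for free.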
To test the behaviour of a web service during database outages, I would like to simulate connection failures in my unit tests. The database backend is PostgreSQL, and the code uses some non-standard SQL queries that make it hard to use a vanilla in-memory database for testing. The database connection is accessible through a DataSource that defers connection management to a connection pool.
How can I simulate temporary/intermittent database disconnects in a unit test to verify correct error handling and recovery from connection outages?
As mentioned by @duffymo, a mock is the way to go.
If you were really doing unit testing, you would already be using mocks created by a mocking framework, since unit tests require isolating the individual units by replacing their dependencies with test doubles. A mocking framework is the simplest and most stable way to create such test doubles.
But I guess that you are instead doing integration tests executed with a unit-testing framework, calling them "unit tests" for whatever reason.
However...
Since your test relies on the real functionality of the datasource, a spy would be a good choice, like this:
class DatasourceUsageTest {

    @Rule
    public ExpectedException exception = ExpectedException.none();

    @Test
    public void reportDatabaseOutage() throws Exception {
        // arrange
        DataSource myDatasource = acquireDatasourceSomehow();
        DataSource spyOfMyDatasource = Mockito.spy(myDatasource);
        Mockito.doCallRealMethod()                               // first call
               .doThrow(new SQLException("Report this message")) // second call (and all following)
               .when(spyOfMyDatasource).methodExpectedToFail();
        SomeType testedUnit = createUnitAndInject(spyOfMyDatasource);

        // act, call #1
        testedUnit.theMethodUsingDatasource();
        Mockito.verify(spyOfMyDatasource).methodExpectedToFail();

        // act, call #2
        exception.expect(TheExceptionTypeToBeThrown.class);
        exception.expectMessage(EXCEPTION_MESSAGE_PREFIX + "Report this message");
        testedUnit.theMethodUsingDatasource();
        // code below this will not be executed
    }
}
The problem here (as with any integration test) is that your database may have a real problem, in which case this test fails at call #1 (and thus for the wrong reason).
We have a large system using a PostgreSQL DB with a rather complex database structure, and we have many DB-related integration tests for that system.
Because of the complex DB structure and the use of Postgres-specific SQL in the code, mocking Postgres with H2 (or another in-memory DB) seems highly unreliable.
So we are using JUnit tests of the following structure:
@RunWith(SpringRunner.class)
@JdbcTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
@Sql( ... schema creation, sample data, etc. )
@ContextConfiguration(classes = ... DAO and service classes used in the test)
Everything is OK when you have 2-3 test classes. Problems start to arise when you have 10+ test classes. As I understand it, Spring Boot creates a separate connection pool for every distinct context configuration. To keep the tests as isolated as possible, we usually include in the context configuration only the components used inside the test. So Spring Boot creates dozens of connection pools, which leads to "too many connections"-type errors from the connection pool or the JDBC driver. You can run your tests one by one, but you cannot run them all at once (so, say farewell to CI).
We are using the following workaround. This snippet is copy-pasted into every test class:
// <editor-fold desc="connection leak fix">
@Autowired
private DataSource dataSource;

private static HikariDataSource hikariDataSource;

@Before
public void saveDataSource() {
    hikariDataSource = (HikariDataSource) dataSource;
}

@AfterClass
public static void releaseDataSource() {
    if (hikariDataSource != null) {
        hikariDataSource.close();
    }
}
// </editor-fold>
It works, but you have to remember that you shouldn't paste that snippet into test classes that use the same context configuration.
The question: is there any way to tell Spring Boot to close the connection pool after each test class finishes, or any way to limit the number of connection pools Spring Boot creates?
@M.Deinum is right; the only way to solve the problem without hacking up a workaround is to use a limited number of configurations. So you can use something like this to test just the DAO layer:
@RunWith(SpringRunner.class)
@JdbcTest(includeFilters = @ComponentScan.Filter(Repository.class))
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
@Sql(...)
or something like this to test the DAO and service layers:
@RunWith(SpringRunner.class)
@JdbcTest(includeFilters = {
        @ComponentScan.Filter(Repository.class),
        @ComponentScan.Filter(Service.class)
})
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
@Sql(...)
We are using the MockMvc framework to test Spring controllers with JUnit. The controller returns a DeferredResult.
The mockMvc.perform call looks like the one below:
mockMvc.perform(post("/customer")
        .accept(APPLICATION_JSON)
        .header(AUTH_TOKEN_KEY, "xyz")
        .header(FROM_KEY, "email@gmail.com")
        .content(json)
        .contentType(APPLICATION_JSON))
        .andExpect(status().isOk())
        .andExpect(request().asyncStarted());
And it takes a lot of time. We are using embedded Cassandra, and because of that it takes a lot of time.
I tried this as well, but it's the same:
MvcResult mvcResult = mockMvc.perform(post("/customer")
        .accept(APPLICATION_JSON)
        .header(AUTH_TOKEN_KEY, "xyz")
        .header(FROM_KEY, "email@gmail.com")
        .content(json)
        .contentType(APPLICATION_JSON))
        .andReturn();

mockMvc.perform(asyncDispatch(mvcResult))
        .andExpect(status().isOk())
        .andExpect(request().asyncStarted());
I have hundreds of tests, because of which the build process is really slow.
Is there a way, using JUnit, to perform the request and wait for the response in another thread to assert the results, or any other good way of speeding it up?
Thanks
Do you really need the Cassandra/persistence layer of your application for this test?
If the answer is no, or if the answer is no for a wide array of test cases, then you could inject another persistence repository when running tests. To achieve this, you could use Spring's built-in profile functionality and annotate your tests accordingly, for example:
@RunWith(SpringJUnit4ClassRunner.class)
@ActiveProfiles("StubPersistence")
public class WebLayerIntegrationTests {
    ...
}
You could then have a stubbed version of your Cassandra repository for your tests, which is allowed to work with static data:
@Profile("StubPersistence")
@Repository
public class StubCassandraRepository {
    ...
}
This class could be backed by a simple data structure like a HashSet or similar; it depends on your use case. The feasibility of this approach depends heavily on your software architecture, so it might not be possible if you can't stub out your Cassandra dependencies.
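A minimal sketch of such a stub, backed by an in-memory map, might look like this. The entity (a plain String here) and the method names are my own assumptions illustrating the shape only; in the real project the class would carry the @Profile("StubPersistence") and @Repository annotations shown above:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical in-memory stand-in for the real Cassandra repository.
// Plain Java so the sketch is self-contained.
class StubRepository {

    private final Map<Long, String> rowsById = new HashMap<>();
    private final AtomicLong idSequence = new AtomicLong();

    // Stores the value and returns a generated id, mimicking a save.
    long save(String value) {
        long id = idSequence.incrementAndGet();
        rowsById.put(id, value);
        return id;
    }

    // Returns the stored value, or null when the id is unknown.
    String findById(long id) {
        return rowsById.get(id);
    }

    int count() {
        return rowsById.size();
    }
}
```

Because the stub holds everything in memory, the web-layer tests run without any Cassandra startup cost, at the price of not exercising the real persistence code.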
I also wonder whether you really need hundreds of tests that require your complete application, including the web layer. You can significantly speed up your tests by favoring unit tests over integration tests, so that you don't need to initialize the Spring context. That depends on your application and software architecture as well.
There will also be some testing improvements in Spring Boot 1.4 that will allow you to initialize specific slices of your application for testing (like the persistence or web layer):
https://spring.io/blog/2016/04/15/testing-improvements-in-spring-boot-1-4
So my best advice is:
If you want to test your controllers, test only your controllers and not your persistence layer; stub it out instead. If you want to test your persistence layer, start with the interfaces of your persistence layer; don't use your controllers as the test interface.
As I mentioned in my question, we are using embedded Cassandra, and because of that it takes a lot of time.
I tried looking at things in the cassandra.yaml file and changed the line below:
commitlog_sync_batch_window_in_ms: 90
to
commitlog_sync_batch_window_in_ms: 1
That's all and the build time was reduced from 30 minutes to 2 minutes.
From the cassandra.yaml comments:
It will wait up to commitlog_sync_batch_window_in_ms milliseconds for other writes, before performing the sync.
After reducing this time, the wait was shorter and the build time went down.
Are you doing the following?
Running it as a "Spring integration test"? E.g.:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = CassandraConfig.class)
public class BookRepositoryIntegrationTest {
    //
}
Are you starting and stopping the embedded Cassandra server using the @BeforeClass / @AfterClass annotations? E.g.:
@BeforeClass
public static void startCassandraEmbedded() {
    EmbeddedCassandraServerHelper.startEmbeddedCassandra();
    Cluster cluster = Cluster.builder()
            .addContactPoints("127.0.0.1").withPort(9142).build();
    Session session = cluster.connect();
}
... and ...
@AfterClass
public static void stopCassandraEmbedded() {
    EmbeddedCassandraServerHelper.cleanEmbeddedCassandra();
}
See http://www.baeldung.com/spring-data-cassandra-tutorial for more information
I want to test my DAO class using Spring context tests.
In my test class I extended AbstractTransactionalJUnit4SpringContextTests so that it integrates with JUnit 4. I have also set up the configuration, and I do the initialization and database cleanup in the @Before method and the tear-down in the @After method. My test class works perfectly.
My problem is that when I run my test class against a database filled with data, the original data is not rolled back and my database is cleared. In the @Before method I clear the database and populate data, thinking that I would be able to roll it back, but I can't.
Can anyone cite an example that works and rolls back information in the database?
UPDATE:
Every database manipulation in my test methods is rolled back, but the execution of super.deleteFromTables("person") in the @Before method did not roll back the previous data in the database.
Spring rolls back all the CRUD operations, but the database cleanup done before the transaction does not roll back.
Thank you to all those who answered my question. I learned a lot from those answers, but they didn't solve my problem.
I knew my test did its transaction management properly.
The mistake was on my part.
I forgot the lesson about database commands: when you execute a DDL statement after a DML statement, the transaction is automatically committed. I executed DDL after DML by deleting all records and then ALTERing the AUTO_INCREMENT of the table, which caused an auto-commit and deleted all records of the table permanently.
Fixing that scenario solved my problem.
Possible causes:
you're using a database/database engine which does not have proper transactions;
you're using multiple transaction managers and/or data sources and the proper one is not picked up;
you're doing your own, separate, transactions in the test class
As for an example, here's one (off the top of my head, not compiled):
public class DBTest extends AbstractTransactionalJUnit4SpringContextTests {

    @Autowired
    private SomeDAO _aBeanDefinedInMyContextFile;

    @Test
    public void insert_works() {
        assert _aBeanDefinedInMyContextFile.findAll() == 0;
        _aBeanDefinedInMyContextFile.save(new Bean());
        assert _aBeanDefinedInMyContextFile.findAll() == 1;
    }
}
Key points:
the SomeDAO is an interface which corresponds to a bean declared in my context;
the bean does not have any transactional settings (advice/programmatic); it relies on the caller being transactional: either the service in production, or the test in our situation;
the test does not include any transaction management code, as it's all done in the framework.
I'm not sure what is wrong with your class. Here is an extract of a class that does what you want, using DbUnit and Spring 2.5:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "testContext.xml" })
@TransactionConfiguration
@Transactional
public class SampleDAOTest {

    @Autowired
    private DataSource dataSource;

    @Autowired
    private SampleDAO sampleDAO;

    @Before
    public void onSetUpInTransaction() throws Exception {
        // populate test data
        IDatabaseConnection dbUnitCon = new DatabaseConnection(
                DataSourceUtils.getConnection(dataSource), "DATASOURCE");
        // read in from a DbUnit Excel file of test data
        IDataSet dataSet = new XlsDataSet(new File("src/test/resources/TestData.xls"));
        DatabaseOperation.INSERT.execute(dbUnitCon, dataSet);
    }

    @Test
    public void testGetIntermediaryOrganisation() {
        // test getting a user
        User object = sampleDAO.getUser(99L);
        assertTrue(object.getValue());
    }
}
One of the benefits of this method is that you don't need to extend any classes, so you can still have your own hierarchy for tests.
If you really want to stick with your current method instead of using the @Before annotation, I think you need to override the method below and put your setup code in there:
@Override
public void onSetUpInTransaction() throws Exception { ... }
Hope this helps
Sidestepping your question, I suggest that you use a separate database instance to run your tests against. That way, you can safely wipe it clean and have your tests initialize it as required.
As far as I know, the Spring support classes for database testing only roll back what happens in the tests, not what happens in the setup and teardown of tests.
Agree with Confusion: you should be running your tests against their own database schema.
With this, you can set your Hibernate properties to 'create-drop':
With create-drop, the database schema will be dropped when the SessionFactory is closed explicitly.
See: Optional Hibernate config properties
Example snippet:
<bean id="sessionBean" class="org.springframework.orm.hibernate3.LocalSessionFactoryBean">
<property name="dataSource" ref="dataSource"/>
<property name="hibernateProperties">
<props>
<prop key="hibernate.hbm2ddl.auto">create-drop</prop>
...etc
While I'd agree with the guys suggesting a dedicated DB for testing, there isn't any reason why using a populated DB shouldn't work; both @Before and @After methods are executed within the transactional context, so their changes should be rolled back.
Possible causes:
The data setup is doing something that isn't transactional (e.g. DDL statements)
Something in your test is actually committing the transaction
Can you post the @Before method? I'm just wondering whether you are just clearing the tables or actually dropping and recreating them.
As far as I can tell, looking at the Javadocs and source code for AbstractJUnit4SpringContextTests and TransactionalTestExecutionListener, you need to annotate the test methods you want transactionalised with @Transactional.
There are also @BeforeTransaction and @AfterTransaction annotations with which you can better control what runs in a transaction.
I suggest you create methods annotated with all of these annotations, including @Before, and then run the test with breakpoints at those methods. That way you can look at the stack and work out whether Spring has started a transaction for you. If you see something like TransactionInterceptor in the stack, or anything else with "Transaction" in the name, then chances are you're in a transaction.
You're doing super.deleteFromTables in your @Before method, which is within the transaction. So if the transaction is rolled back, don't the deletions get rolled back as well?