We are using the MockMvc framework to test Spring controllers with JUnit. The controller returns a DeferredResult.
The mockMvc.perform call looks like below:
mockMvc.perform(post("/customer")
        .accept(APPLICATION_JSON)
        .header(AUTH_TOKEN_KEY, "xyz")
        .header(FROM_KEY, "email@gmail.com")
        .content(json)
        .contentType(APPLICATION_JSON))
    .andExpect(status().isOk())
    .andExpect(request().asyncStarted());
And it takes a lot of time, because we are using embedded Cassandra. I tried the following as well, but it is just as slow:
MvcResult mvcResult = mockMvc.perform(post("/customer")
        .accept(APPLICATION_JSON)
        .header(AUTH_TOKEN_KEY, "xyz")
        .header(FROM_KEY, "email@gmail.com")
        .content(json)
        .contentType(APPLICATION_JSON))
    .andExpect(request().asyncStarted())
    .andReturn();

mockMvc.perform(asyncDispatch(mvcResult))
    .andExpect(status().isOk());
I have hundreds of tests, because of which the build process is really slow.
Is there a way, using JUnit, to perform the request and wait for the response in another thread to assert the results? Or any other good way of speeding it up?
Thanks
Do you really need the Cassandra/persistence layer of your application for this test?
If the answer is no, or if it is no for a wide array of test cases, then you could inject another persistence repository when running tests. To achieve this, you could use Spring's built-in profile functionality and annotate your tests accordingly, for example:
@RunWith(SpringJUnit4ClassRunner.class)
@ActiveProfiles("StubPersistence")
public class WebLayerIntegrationTests {
...
}
You could then have a stubbed version of your Cassandra repository for your tests, which is allowed to work with static data:
@Profile("StubPersistence")
@Repository
public class StubCassandraRepository {
...
}
This class could be backed by a simple data structure like a HashSet or similar, depending on your use case. Whether this approach is feasible depends heavily on your software architecture, so it might not be possible if you can't stub out your Cassandra dependencies.
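As a sketch of what such a stub could look like, assuming a simple Customer entity with save/find operations (the entity and method names are illustrative, not from your code; in the real project the class would carry the @Profile("StubPersistence") and @Repository annotations shown above):

```java
import java.util.Map;
import java.util.Optional;
import java.util.concurrent.ConcurrentHashMap;

// Minimal illustrative entity (stands in for the real domain class).
class Customer {
    private final String id;
    private final String name;
    Customer(String id, String name) { this.id = id; this.name = name; }
    String getId() { return id; }
    String getName() { return name; }
}

// Hypothetical in-memory stand-in for the Cassandra repository.
public class StubCassandraRepository {
    private final Map<String, Customer> store = new ConcurrentHashMap<>();

    public Customer save(Customer customer) {
        store.put(customer.getId(), customer);
        return customer;
    }

    public Optional<Customer> findById(String id) {
        return Optional.ofNullable(store.get(id));
    }

    public void deleteAll() {
        store.clear(); // reset state between tests
    }

    public static void main(String[] args) {
        StubCassandraRepository repo = new StubCassandraRepository();
        repo.save(new Customer("42", "Alice"));
        System.out.println(repo.findById("42").map(Customer::getName).orElse("missing"));
        repo.deleteAll();
        System.out.println(repo.findById("42").isPresent());
    }
}
```

Because it never touches the network, tests against a stub like this run in milliseconds instead of waiting on embedded Cassandra.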
I also wonder if you really need hundreds of tests that need your complete application, including the web layer. You can significantly speed up your tests by favoring unit tests over integration tests, so that you don't need to initialize the Spring context. This depends on your application and software architecture as well.
There will also be some testing improvements in Spring Boot 1.4 that will allow you to initialize specific slices of your application for testing (like the persistence or web layer):
https://spring.io/blog/2016/04/15/testing-improvements-in-spring-boot-1-4
So, my best advice is:
If you want to test your controllers, test only your controllers and not your persistence layer; stub it out instead. If you want to test your persistence layer, start with the interfaces of your persistence layer; don't use your controllers as the test interface.
As I mentioned in my question, we are using embedded Cassandra and because of that it takes a lot of time.
I looked through the cassandra.yaml file and changed the line below
commitlog_sync_batch_window_in_ms: 90
to
commitlog_sync_batch_window_in_ms: 1
That's all, and the build time was reduced from 30 minutes to 2 minutes.
From the cassandra.yaml comments:
It will wait up to commitlog_sync_batch_window_in_ms milliseconds for other writes, before performing the sync.
Reducing this window cut the per-write wait, which in turn cut the build time.
Are you doing the following?
Running as a "Spring Integration Test"? e.g.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = CassandraConfig.class)
public class BookRepositoryIntegrationTest {
//
}
Are you starting and stopping the embedded Cassandra server using the @BeforeClass / @AfterClass annotations? e.g.
@BeforeClass
public static void startCassandraEmbedded() {
    EmbeddedCassandraServerHelper.startEmbeddedCassandra();
    Cluster cluster = Cluster.builder()
            .addContactPoints("127.0.0.1").withPort(9142).build();
    Session session = cluster.connect();
}
... and ...
@AfterClass
public static void stopCassandraEmbedded() {
    EmbeddedCassandraServerHelper.cleanEmbeddedCassandra();
}
See http://www.baeldung.com/spring-data-cassandra-tutorial for more information
We have some integration tests (written using the Spring framework) that are failing due to a bean initialisation exception, which eventually leads to Failed to load ApplicationContext. As I understand from the Spring testing docs, the loading of the ApplicationContext happens at class level, so my doubt is:
Once the ApplicationContext fails (i.e. Failed to load ApplicationContext) due to a bean initialisation exception during an integration test class run, does the ApplicationContext try to spin up again (eventually failing) for each individual integration test present in that particular test class?
I'm asking because we are seeing a huge spike in the number of connections to Postgres when the bean failure happens. It seems that for every integration test in the class (each eventually failing with Failed to load ApplicationContext), Spring tries to create a new connection to Postgres and doesn't destroy it before the ApplicationContext failure. How can we stop this? Please help with some suggestions.
Also, once we get Failed to load ApplicationContext, is there a way to programmatically terminate the run of all the remaining integration tests automatically? If yes, please help how to achieve it.
Thanks.
Testing framework: JUnit + Spring
Update: mentioned the testing framework used.
There is currently no way to abort integration tests if the ApplicationContext repeatedly fails to load.
To vote for such support, please see this Spring Framework issue.
If you have multiple integration tests in the test class, it will fail multiple times.
You can terminate all the integration test cases using the dependsOnMethods (or dependsOnGroups) property.
For example, you can have a method that checks whether the bean exception occurred and declare all other tests as dependent on that method. Note that this is TestNG functionality:
@Test(groups = { "isApplicationContextFailed" }) // you can use simply @Test here if there is only one independent test
public void isApplicationContextFailed() {}

@Test(dependsOnMethods = { "isApplicationContextFailed" })
public void queryTestOne() {}

@Test(dependsOnMethods = { "isApplicationContextFailed" })
public void queryTestTwo() {}
You can explore more in the TestNG documentation -> https://testng.org/doc/documentation-main.html#dependent-methods
Note: if you are using JUnit, there are other ways to do the same; please update your question in that case.
If you are using JUnit, add the logic below in your test classes (note that JUnit 4 requires @BeforeClass methods to be static):
public class BaseClass {

    @BeforeClass
    public static void isApplicationContextFailed() {
        // logic to check for application failure
    }

    // continue with your test methods.
}
The references below go into more detail:
https://junit.org/junit4/javadoc/4.13/org/junit/BeforeClass.html
https://howtodoinjava.com/testng/testng-before-and-after-annotations/
I have the following test class:
@SpringBootTest
public class ChoreControllerTest
{
    @Autowired
    private ChoreController controller;

    @Test
    public void throwOnMissingChore()
    {
        assertThrows(ChoreNotFoundException.class, () -> this.controller.getChore(0L));
    }
}
It takes about 5 seconds for Spring Boot to start up so the test can run. I want to reduce this time, but if I just remove the @SpringBootTest annotation, I get a NullPointerException.
Is there a way to make this controller test more lightweight, or am I stuck with the startup time? I'm especially worried about what will happen to my test times if I ever want to test more than one controller.
The @SpringBootTest annotation creates a Spring context for you, which is why it takes a while to start up. The annotation is mostly used for integration tests, where a Spring context is required. Here are a few tips for optimizing integration tests.
If you remove the annotation, the ChoreController cannot be autowired (no context available), which results in a NullPointerException.
Depending on your needs, you can use a mocking library like Mockito to inject mocks, e.g. the services your controller class needs, and run the test without @SpringBootTest.
You might want to take a look at this article for setting up those mocks properly.
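As a self-contained sketch of that approach (ChoreService and its findChore method are assumptions about your code, and the hand-rolled stub plays the role that Mockito's mock()/when() would play in a real test), the controller can be constructed and exercised without starting any Spring context:

```java
// Minimal stand-ins for the classes from the question, so the
// controller can be tested without Spring or a real service.
class ChoreNotFoundException extends RuntimeException {
    ChoreNotFoundException(long id) { super("no chore with id " + id); }
}

interface ChoreService {
    String findChore(long id); // hypothetical lookup used by the controller
}

class ChoreController {
    private final ChoreService service;
    ChoreController(ChoreService service) { this.service = service; }

    String getChore(long id) {
        String chore = service.findChore(id);
        if (chore == null) {
            throw new ChoreNotFoundException(id);
        }
        return chore;
    }
}

public class ChoreControllerPlainTest {
    public static void main(String[] args) {
        // Stub service that knows no chores; Mockito's
        // mock(ChoreService.class) would play the same role.
        ChoreController controller = new ChoreController(id -> null);
        try {
            controller.getChore(0L);
            System.out.println("no exception");
        } catch (ChoreNotFoundException e) {
            System.out.println("caught ChoreNotFoundException");
        }
    }
}
```

A test structured like this runs in milliseconds per class, regardless of how many controllers you add.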
We have a large system using a PostgreSQL database, with a rather complex database structure, and we have many DB-related integration tests for it.
Because of the complex DB structure and the use of Postgres-specific SQL in the code, mocking Postgres with H2 (or another in-memory DB) seems highly unreliable.
So we are using JUnit tests with the following structure:
@RunWith(SpringRunner.class)
@JdbcTest
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
@Sql( ... schema creation, sample data, etc. )
@ContextConfiguration(classes = ... DAO and service classes used in the test)
Everything is OK when you have 2-3 test classes. Problems start to arise when you have 10+. As I understand it, Spring Boot creates a separate connection pool for every distinct context configuration. To keep tests as isolated as possible, we usually include in the context configuration only the components used inside the test. So Spring Boot creates dozens of connection pools, which leads to "too many connections"-type errors from the connection pool or the JDBC driver. You can run the tests one by one, but you cannot run them all at once (so, say farewell to CI).
We are using the following workaround; this snippet is copy-pasted into every test class:
// <editor-fold desc="connection leaking fix">
@Autowired
private DataSource dataSource;

private static HikariDataSource hikariDataSource;

@Before
public void saveDataSource() {
    hikariDataSource = (HikariDataSource) dataSource;
}

@AfterClass
public static void releaseDataSource() {
    if (hikariDataSource != null) {
        hikariDataSource.close();
    }
}
// </editor-fold>
It works, but you have to remember not to paste the snippet into test classes that share the same context configuration.
The question: is there any way to tell Spring Boot to close the connection pool after each test class execution, or to limit the number of connection pools Spring Boot creates?
@M.Deinum is right: the only way to solve the problem without hacking in a workaround is to use a limited number of configurations. You can use something like this to test just the DAO layer:
@RunWith(SpringRunner.class)
@JdbcTest(includeFilters = @ComponentScan.Filter(Repository.class))
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
@Sql(...)
or something like this to test the DAO and service layers:
@RunWith(SpringRunner.class)
@JdbcTest(includeFilters = {
    @ComponentScan.Filter(Repository.class),
    @ComponentScan.Filter(Service.class)
})
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.NONE)
@Sql(...)
I was not able to find any information regarding configuration of the AppDynamics agent for JUnit tests. I would like to test the performance of Hibernate queries in a Spring-based web service backed by a PostgreSQL database. Tests must be able to roll back the data on termination.
Should these be unit or integration tests? What is the best way to accomplish this? How can I make AppDynamics collect and display graphs of query execution times?
UPDATE:
I was not able to set up the AppDynamics agent for JUnit tests inside IDEA. The VM arguments point to the agent (-javaagent:"C:\Tools\AppDynamicsAgent\javaagent.jar") and the firewall is off, but for some reason the AppDynamics web-based (SaaS) setup dialog shows that no agent is able to connect:
You need both unit tests and integration tests. Unit tests should not use a database, the file system, etc. I like to use Spring profiles for my tests. For instance, say I have a profile called integration_test:
@ActiveProfiles("integration_test")
@ContextConfiguration(locations = {
        "classpath:your-context.xml" })
@RunWith(SpringJUnit4ClassRunner.class)
public abstract class DaoTest
{
    @Autowired
    protected DataSource dataSource;

    // delete all your stuff here
    protected void clearDatabase()
    {
        JdbcTemplate jdbc = new JdbcTemplate(dataSource);
        jdbc.execute("delete from my_table");
    }

    @Before
    public final void init()
    {
        clearDatabase();
    }

    @After
    public final void cleanup()
    {
        clearDatabase();
    }
}
(I'm using XML.) Then in your context do something like <beans profile="test">TODO</beans> and configure your data source in there.
I know there are ways to roll back all your transactions after running a test, but I like this better. Just don't delete everything in your real database! You could even put some safety code in clearDatabase to make sure that doesn't happen.
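A sketch of such a safety check, assuming test databases can be recognised by their JDBC URL (the "_test" naming convention used here is an assumption; adjust it to however your environments are actually named). In clearDatabase you would fetch the URL via dataSource.getConnection().getMetaData().getURL() and pass it in before issuing any deletes:

```java
// Guard that refuses to wipe anything that doesn't look like a test database.
public class DatabaseGuard {

    public static void assertTestDatabase(String jdbcUrl) {
        // Hypothetical convention: test databases contain "_test" in their name.
        if (jdbcUrl == null || !jdbcUrl.contains("_test")) {
            throw new IllegalStateException(
                "Refusing to clear non-test database: " + jdbcUrl);
        }
    }

    public static void main(String[] args) {
        // A test-looking URL passes silently...
        assertTestDatabase("jdbc:postgresql://localhost:5432/myapp_test");
        // ...while anything else is blocked before a delete can run.
        try {
            assertTestDatabase("jdbc:postgresql://prod-host:5432/myapp");
            System.out.println("no exception");
        } catch (IllegalStateException e) {
            System.out.println("blocked: " + e.getMessage());
        }
    }
}
```

One early throw like this is much cheaper than one accidental run of clearDatabase against the wrong data source.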
For performance testing you will really need to figure out what you want to achieve and what is meaningful to display. If you have a specific question about performance testing you can ask that; otherwise it is too broad a topic.
Maybe you can make a mini-webapp which does performance testing for you and has the results exposed as URL requests for displaying in HTML. Really just depends on how much effort you are willing to spend on it and what you want to test.
Once the agent is attached, you can use the AppDynamics Database Queries window.
I have a very weird situation that has happened in several systems already. I am using Spring Boot and AspectJ CTW to @Autowired dependencies in some entities (instantiated outside the container).
The class that receives the dependencies (an abstract entity) sometimes receives them without the profile (configured by @ActiveProfiles on my test class) being applied. It is not deterministic: changing how the tests are executed produces different outputs. To illustrate the situation with code:
The entity
@Configurable
public class AbstractMongoDocument<T> implements Persistable<T> {

    @Transient
    private transient MongoTemplate mongoTemplate;

    // entity stuff
}
One of the failing tests
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = LOVApplication.class)
@ActiveProfiles("local-test")
public class MyCrazyIntegrationTest {

    @Test
    public void filterByFieldsFullMatchShouldReturnResult() throws Exception {
        // Given
        Location l1 = new Location("name", "code", new GeoJsonPoint(11, 10));
        l1.save(); // hence the need for autowiring there
        // When: whatever
        // Then: some assertions
    }
}
There are some facts that I find very disturbing here:
The dependency is always injected, but sometimes it apparently comes from an ApplicationContext with the default profile.
If it fails for one test in a class, it behaves the same for all the tests in that particular class.
It may or may not happen depending on how you execute the class (at the moment it fails only if I run all the tests, but succeeds if I run that class in isolation; it also behaves differently in Maven).
If I debug it, I can see that the injected dependency didn't get the proper profile. (I discovered this by injecting the ApplicationContext and, surprisingly, finding that it was a different object than the one I received in my tests.)
What worries me the most is that I am now not sure whether this situation could also happen in non-test environments, for example with a production profile, which would imply a catastrophe.
I looked for open bugs in Jira and found nothing, so I don't rule out that I am misconfiguring something. Any help or ideas would be much appreciated.
The behavior you are experiencing typically should not happen in a production deployment; however, it is a known issue with integration test suites that load multiple ApplicationContexts utilizing #Configurable and AspectJ load-time weaving.
For details, see the following issues in Spring's issue tracker:
https://jira.spring.io/browse/SPR-6353
https://jira.spring.io/browse/SPR-6121