How can I get Spring Boot to recreate in-memory test databases from scratch between test classes?
I've got several local integration tests annotated with @SpringApplicationConfiguration and @WebIntegrationTest that alter database state. I have marked each of these with @DirtiesContext. I was expecting the code that creates the in-memory database to be part of the ApplicationContext lifecycle, so a new database should get created for any subsequent tests.
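The test classes are set up roughly like this (a sketch; MyApplication and the test class name are placeholders for the real classes):

    @RunWith(SpringJUnit4ClassRunner.class)
    @SpringApplicationConfiguration(classes = MyApplication.class)
    @WebIntegrationTest
    @DirtiesContext
    public class DatabaseAlteringIntegrationTest {

        @Test
        public void altersDatabaseState() {
            // writes rows that the next test class should not see
        }
    }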
I can see in the logs that Flyway tries to re-apply its migrations but considers them already done, because the database hasn't been purged.
Is Spring Boot creating the in-memory database outside of each ApplicationContext and sharing it between them? Is there any way to control this behaviour?
EDIT
I'm also seeing odd behaviour when running tests from Maven as opposed to Eclipse. One of my database tables is changing state in Maven, but not in Eclipse. Could this be a ClassLoader issue?
Without being able to inspect the configuration and run-time behavior of your project, I can only assume that you are running into the same problem described in SPR-8849.
Is Spring Boot creating the in-memory database outside of each ApplicationContext and sharing it between them?
That's unlikely. What's more likely is that the database is created only once, when the first ApplicationContext is loaded, and that this single database is used across all tests executing within the same JVM. This would explain the fact that "the database hasn't been purged," as you phrased it.
Is there any way to control this behaviour?
If my above assumptions are correct, yes: you can control this by making sure that you use a unique database name for each embedded database. See the comments in SPR-8849 for details.
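For example, with Spring Framework 4.2+ the EmbeddedDatabaseBuilder can generate a unique name for each database (a sketch; the configuration class name is a placeholder):

    import javax.sql.DataSource;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
    import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

    @Configuration
    public class UniqueDbConfig {

        @Bean
        public DataSource dataSource() {
            return new EmbeddedDatabaseBuilder()
                    .setType(EmbeddedDatabaseType.HSQL)
                    // a unique name per ApplicationContext means each test
                    // context gets a fresh database instead of the shared default
                    .generateUniqueName(true)
                    .build();
        }
    }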
Please let me know if that works for you.
Regards,
Sam (author of the Spring TestContext Framework)
Specifying a custom configuration yields the expected behaviour.
    import javax.sql.DataSource;

    import org.springframework.boot.actuate.autoconfigure.ManagementSecurityAutoConfiguration;
    import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
    import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;
    import org.springframework.boot.autoconfigure.security.SecurityAutoConfiguration;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
    import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
    import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

    @Configuration
    @EnableAutoConfiguration(exclude = {
            SecurityAutoConfiguration.class,
            ManagementSecurityAutoConfiguration.class,
            DataSourceAutoConfiguration.class
    })
    @EnableJpaRepositories(basePackages = "com.example.repository")
    public class TestConfig {

        @Bean
        public String sharedSecret() {
            return null;
        }

        @Bean
        public DataSource dataSource() {
            // each test context builds its own embedded HSQL database
            return new EmbeddedDatabaseBuilder()
                    .setType(EmbeddedDatabaseType.HSQL)
                    .build();
        }
    }
If anyone from Pivotal reads this (does Dave Syer have a big red phone?), I can knock up a test project to exhibit the behaviour if you think it's a bug.
Scenario:
I'm supporting an enterprise application that runs in Wildfly 10. The application (.war) uses J2EE technologies (EJBs, JPA, JAX-RS) and Spring Boot features (Spring MVC, Spring REST, Spring Data, Spring Data REST) ... Both stacks co-exist "happily" because they don't interact with each other; however, they do share common classes, like utility or entity classes (both stacks map to the same database model). Why the application uses both stacks is out of the scope of this question.
Currently, I'm trying to improve the performance of a @RestController that pulls some data from the database using a Spring Data JPA repository. I found that we're suffering from the N + 1 queries problem when calling the @RestController. In past projects (where there were only J2EE technologies), I have used the @BatchSize Hibernate annotation to mitigate this problem with total success.
But in this project, Spring seems to be skipping that annotation. How do I know? Because I turned on Hibernate's SQL logging (hibernate.show_sql) and I can see that the N + 1 queries are still happening ...
Key Points:
Here are some insights about the application that you must know before providing (or trying to guess) any answer:
The application has many sub-modules packaged as libraries inside the WAR file (/WEB-INF/lib) ... Some of these libraries are the jars that contain the entity classes; others are the jars that contain the REST services (either JAX-RS services or Spring controllers).
The Spring configuration is done in classes defined in the WAR artifact: there we have a class (extending SpringBootServletInitializer) annotated with @SpringBootApplication, and another class (extending RepositoryRestConfigurerAdapter) annotated with @Configuration. Spring customization is done in that class.
The application works with multiple datasources, which are defined in the Wildfly server. Spring Data JPA must route every query to the right datasource. To meet this requirement, the application (Spring) was configured like this:
    @Bean(destroyMethod = "")
    @ConfigurationProperties(prefix = "app.datasource")
    public DataSource dataSource() {
        // the following class extends AbstractRoutingDataSource
        // and resolves datasources using JNDI names (the Wildfly mode!)
        return new DataSourceRouter();
    }

    @Bean("entityManagerFactory")
    public LocalContainerEntityManagerFactoryBean getEntityManagerFactoryBean() {
        LocalContainerEntityManagerFactoryBean lemfb;
        lemfb = new LocalContainerEntityManagerFactoryBean();
        lemfb.setPersistenceUnitName("abcd-pu");
        lemfb.setDataSource(dataSource());
        return lemfb;
    }
The last @Bean declaration relies on a persistence.xml file, which we do have at the path /WEB-INF/classes/META-INF/ (i.e. Spring does find this file!) ... In that file, we declare our domain classes, so that Spring JPA can see those entities. We can also define special JPA properties like hibernate.show_sql and hibernate.use_sql_comments without issues (this is how I detected the N + 1 queries problem in the first place) ...
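For reference, a router like the DataSourceRouter above might look roughly like this. This is a sketch: AbstractRoutingDataSource and JndiDataSourceLookup are real Spring classes, but the per-thread key handling is an assumption about how the application picks a datasource:

    import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;
    import org.springframework.jdbc.datasource.lookup.JndiDataSourceLookup;

    public class DataSourceRouter extends AbstractRoutingDataSource {

        // hypothetical per-thread key; the real application presumably
        // chooses the target datasource some other way
        private static final ThreadLocal<String> CURRENT_KEY = new ThreadLocal<>();

        public static void setCurrentKey(String key) {
            CURRENT_KEY.set(key);
        }

        public DataSourceRouter() {
            // lets the targetDataSources map hold JNDI names (Strings)
            // that are resolved against the Wildfly JNDI tree
            setDataSourceLookup(new JndiDataSourceLookup());
        }

        @Override
        protected Object determineCurrentLookupKey() {
            return CURRENT_KEY.get();
        }
    }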
What have I done so far?
I tried to add the @BatchSize annotation to the problematic collection. No luck!
I created a new JAX-RS service whose purpose was to mimic the behavior of the @RestController. I confirmed that the @BatchSize annotation does work in the application's deployment, at least in JAX-RS services! (NOTE: the service uses its own persistence.xml) ...
Test details (updated 2020/07/30): What I did here was to create a new JAX-RS service and deploy it inside the WAR application, next to the @RestController that presents the problem (I mean, it is the same WAR and the same physical JVM). Both services pull the same entity from the database (same class, same classloader), and that entity has a lazy collection annotated with @BatchSize! ... If I invoke both services, the JAX-RS one honors the @BatchSize and pulls the collection using the expected strategy; the @RestController does not ... So, what is happening here? The only thing different between the services is that each one has a different persistence.xml: the persistence.xml for the JAX-RS service is picked up by Wildfly directly, the other one is picked up by Spring and delegated to Wildfly (I guess) ...
I tried to add the properties hibernate.batch_fetch_style (=dynamic) and hibernate.default_batch_fetch_size (=10) to the persistence.xml read by Spring ... No luck. I debugged the Spring startup process and saw that those properties are passed to the Spring engine, but Spring does not care about them. The weird thing here is that Spring does honor properties like hibernate.show_sql ... For those who are asking "What do these properties do?": they are the global equivalent of applying @BatchSize to every lazy JPA collection or proxy without declaring the annotation on any entity (see the sketch after this list).
I set up a small Spring Boot project using the same Spring version as the enterprise application (which is 1.5.8.RELEASE, by the way) and both the annotation and the properties approach worked as expected.
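For context, this is roughly what passing those properties programmatically (instead of via persistence.xml) would look like, extending the entityManagerFactory bean shown earlier. This is a sketch that mirrors the attempt above, not a confirmed fix for the Wildfly setup:

    @Bean("entityManagerFactory")
    public LocalContainerEntityManagerFactoryBean getEntityManagerFactoryBean() {
        LocalContainerEntityManagerFactoryBean lemfb = new LocalContainerEntityManagerFactoryBean();
        lemfb.setPersistenceUnitName("abcd-pu");
        lemfb.setDataSource(dataSource());

        // global equivalent of @BatchSize, handed straight to the
        // EntityManagerFactory instead of going through persistence.xml
        java.util.Properties jpaProps = new java.util.Properties();
        jpaProps.setProperty("hibernate.batch_fetch_style", "dynamic");
        jpaProps.setProperty("hibernate.default_batch_fetch_size", "10");
        lemfb.setJpaProperties(jpaProps);

        return lemfb;
    }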
I've been stuck with this issue for two days; any help fixing it will be appreciated ... thanks!
There are two or three possible issues that I can think of.
For some reason, whatever you modify isn't picked up by Wildfly. Wildfly classpath resolution is a separate topic, and some missing configuration can cause you a nightmare. You can identify this if you have access to debug the query: if you put a breakpoint in the constructor of your entity class, you will get a chance to evaluate the entity configuration being used, somewhere in the execution context.
@BatchSize doesn't work on @OneToOne associations; it only works on @OneToMany relationships.
A typical way to define @BatchSize is to use it along with lazy loading, as mentioned in the example here. If you are not using lazy fetching, Hibernate assumes that you want an eager load and makes another select query to fetch all the details. Please confirm you are using the same syntax as given in that example.
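For reference, that combination looks roughly like this (a minimal sketch; Customer and Order are hypothetical entities, and Order is assumed to have a matching customer field):

    import java.util.List;
    import javax.persistence.Entity;
    import javax.persistence.FetchType;
    import javax.persistence.Id;
    import javax.persistence.OneToMany;
    import org.hibernate.annotations.BatchSize;

    @Entity
    public class Customer {

        @Id
        private Long id;

        // lazy collection: with @BatchSize, Hibernate can initialize the
        // orders of up to 10 Customer instances with a single IN query
        @OneToMany(mappedBy = "customer", fetch = FetchType.LAZY)
        @BatchSize(size = 10)
        private List<Order> orders;
    }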
New Addition:
Put conditional breakpoints in the PropertyBinder#setLazy() function, and maybe backtrace it and put relevant breakpoints in CollectionBinder and AnnotationBinder. Then restart/redeploy the server and see what data you are getting for the relevant properties. That will give you a fair idea of where it is failing.
Why a conditional breakpoint? Because you will have thousands of properties, and if you do not add a condition to the breakpoint, you will take an hour to reach your actual breakpoint.
What should the condition be? If it's PropertyBinder, the condition should be something like `this.name == ...`. For the other classes you can use the same approach.
Sorry for the overly detailed description of conditional breakpoints; you might find it redundant.
It looks like the only way to debug your problem is to debug the Hibernate framework from server startup; only then will we be able to find the root cause.
I have a simple Spring Boot application which reads from Kafka topic and persists the messages to some cache.
I would like to add an integration test, which would launch my original application, generate some messages from embedded Kafka, and then assert cache contents.
I'm struggling with the "launch my original application" part. How does one do that from Spring Boot integration test?
I've tried doing something like this:
    @RunWith(SpringRunner.class)
    @SpringBootTest(classes = OriginalApplication.class)
    @EmbeddedKafka
    public class OriginalApplicationIntegrationTest {

        @Test
        public void test() throws Exception {
            ...
        }
    }
But I see no attempts from Spring to launch my original application.
First of all, there are two possible big "areas" that can go wrong:
Spring Boot Test Setup
Kafka Integration
I believe the question is around the first part so I'll concentrate on that part.
For a quick answer:
When you put a @SpringBootTest annotation, try to use it without parameters at all, and make sure the test is placed in the correct package; it matters. This turns on the automatic resolution of your application.
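Applied to the test from the question, that means (a sketch, assuming the test class lives in or under the application's root package):

    @RunWith(SpringRunner.class)
    @SpringBootTest // no 'classes' parameter: let Spring Boot find @SpringBootApplication itself
    @EmbeddedKafka
    public class OriginalApplicationIntegrationTest {

        @Test
        public void test() throws Exception {
            // ...
        }
    }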
Now I'll try to briefly explain why it's important; the topic is really broad and deep.
Spring Boot checks whether a class annotated with @SpringBootConfiguration (it's an annotation contained in @SpringBootApplication, which in turn is on your main class) exists in the same package as the integration test (let's say you put the test in com.abc.myapp.test).
If not found, it goes one package up and checks there (com.abc.myapp). It will do that again and again up to the root package; for this example, let's assume the @SpringBootApplication-annotated class is in this package. Notice: if this recursive "search" doesn't find a @SpringBootApplication-annotated class, the test doesn't start. That's why it's important to follow the package structure recommended for Spring Boot applications.
Now, when it finds that class, it knows which packages should be scanned for beans to start the Spring Boot application. So it tries to find beans according to the Spring Boot conventions (package com.abc.myapp and beneath), recursively again, top-to-bottom this time.
It also runs your starters (autoconfigurations) in this mode.
So, bottom line:
Specifying @SpringBootTest without parameters makes Spring Boot do its best to mimic the startup of the real application.
If you use it with parameters where you pass it a configuration, however, it behaves totally differently. It's like saying: "I know where my configurations are, don't try to start everything; here is my configuration, load only it."
A totally different thing: no recursive searches, no auto-configuration startup, etc.
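To contrast the two modes side by side (a sketch; TestConfig is a placeholder configuration class):

    // Mode 1: mimic the real application - recursive search for
    // @SpringBootConfiguration, component scan, auto-configuration
    @SpringBootTest
    public class FullApplicationTest {
    }

    // Mode 2: "load only it" - no recursive search, no auto-configuration
    // beyond whatever TestConfig itself declares
    @SpringBootTest(classes = TestConfig.class)
    public class NarrowSliceTest {
    }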
There are several Spring Boot Java web applications. I need to prepare several components for integration testing. My task is to mock all external behaviour, such as other projects' components, DB calls, etc. I found a solution for this using the @Profile annotation from the Spring Framework. Here's an example: I can simply create a new profile and declare two bean implementations for each profile, one for real usage in production and another one for integration testing, for stubbing. It would look like this:
#Profile("PROD")
#Configuration
#EnableWebSecurity
public class SecurityConfig extends WebSecurityConfigurerAdapter {
}
#Profile("MOCK")
#Configuration
#EnableWebSecurity
public class SecurityMockConfig extends WebSecurityConfigurerAdapter {
}
But I have doubts about this design; it looks a little bit messy to me. Is this solution considered acceptable for the task I have?
Doing this, your mocks and their configuration will probably be packaged with the app running in production.
This seems very odd to me. Would you package your unit tests in your delivered Spring application? I don't think so. So I would say this is a "bad" design, since testing dependencies should not be embedded with production code.
However, Spring's documentation about the @Profile annotation does use the example of environment segregation.
Now, there is a question which needs to be answered: what do you mean by "integration testing" ?
Is this an automated integration test? Or do you want to run your application in different modes for the testing teams?
If this is an automated integration test, then there is no reason to use the @Profile annotation, as automated tests and production code will not be packaged together.
However, if you want your users to run integration tests, then you could create a standalone fake project which simulates the external dependencies you are calling (database, web services, etc.).
Then @Profile can be used to switch from fake to production mode, but only through a configuration file: the fake profile will make calls to your fake external services, whereas production will call the real external services.
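For the automated-test case, a common alternative is to keep the MOCK @Configuration classes under src/test/java and select the profile per test class with Spring's @ActiveProfiles, so nothing test-related is ever packaged (a sketch; the test class name is a placeholder):

    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.springframework.boot.test.context.SpringBootTest;
    import org.springframework.test.context.ActiveProfiles;
    import org.springframework.test.context.junit4.SpringRunner;

    @RunWith(SpringRunner.class)
    @SpringBootTest
    @ActiveProfiles("MOCK") // activates SecurityMockConfig and the other MOCK beans
    public class MockedIntegrationTest {

        @Test
        public void runsAgainstFakeExternals() {
            // ...
        }
    }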
I am trying to implement integration tests for my Tomcat application, but my issue is that the application is launched separately from the tests, so the tests can access neither the application context nor the database.
My idea is to run the tests "within" the running application, so I can autowire the EntityManager and, for instance, check the state of the database during testing, or even create database entities for testing.
My only idea for doing this is to start the application programmatically from the tests as new ClassPathXmlApplicationContext("applicationContext.xml") and then access the context. This would work, but it would be very hard to debug, as we wouldn't be able to use hot-swapping during testing. Also, I guess the server would be stopped as soon as the tests ended. So this is probably not the best or most correct solution.
EDIT:
My question was probably unclear, so I will try to clarify.
I have a Tomcat application with Spring and Hibernate. The Spring beans and the Hibernate database connection are initialised when the Tomcat application is started. The issue is how to run tests against the live Spring beans from methods annotated with @Test in src/test/java, which are started separately.
Consider this class:
    @Component
    class MyRepository {

        @Autowired
        EntityManager em;

        @Transactional
        public void myMethod(MyEntity entity) {
            // do some job with entity
            ...
            em.flush();
        }
    }
This class will be initialised by Tomcat as the MyRepository bean.
To test it, I cannot just call new MyRepository().myMethod(...); I need to access the bean. The issue is accessing the bean from the @Test method:
    @Test
    void testMyRepository() {
        Item item = ...
        // then use the repository to handle the entity
        context.getBean(MyRepository.class).myMethod(item);
        // then assert the state of the database
        context.getBean(EntityManager.class).find(Item.class, ...) ...
    }
I can probably get the context in the initialisation of the tests with

    ApplicationContext context = new ClassPathXmlApplicationContext("applicationContext.xml");
But it would mean launching the whole application each time the tests are started. The better solution would be if the application could run separately from the tests.
Hope my problem is more clear now.
I would suggest using the SpringRunner to start the Spring application context and then running your tests against that instance. You can customize the context so that it doesn't contain the parts you don't want to test, and you can create mocks for components that require external resources (REST clients and such). Take a look at the Spring docs or Spring Boot docs.
If multiple tests use the same Spring context configuration, the context is started just once and reused. So it's good to have its configuration in a parent class of your tests. You can autowire any Spring bean into your test and test it.
You can use an in-memory database (such as H2) instead of the production one, so your tests are not dependent on external infrastructure. To initialize the database, use tools like Flyway or Liquibase. To clear the database before each test, you can use the @Sql annotation.
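Put together against the classes from the question, such a test might look like this (a sketch; the XML location and SQL script name are assumptions):

    import javax.persistence.EntityManager;
    import javax.persistence.PersistenceContext;
    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.test.context.ContextConfiguration;
    import org.springframework.test.context.jdbc.Sql;
    import org.springframework.test.context.junit4.SpringRunner;

    @RunWith(SpringRunner.class)
    @ContextConfiguration("classpath:applicationContext.xml")
    public class MyRepositoryTest {

        @Autowired
        private MyRepository myRepository;

        @PersistenceContext
        private EntityManager em;

        @Test
        @Sql("/reset-and-seed.sql") // clears and seeds the in-memory database first
        public void myMethodFlushesTheEntity() {
            MyEntity entity = new MyEntity();
            myRepository.myMethod(entity);
            // assert the resulting database state via em.find(...)
        }
    }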
You can find many examples of projects with such tests, for example my own demo.
If you want to test an external system, I would suggest something like JMeter.
Unfortunately, you can't mirror your classes and use them in your tests. That's a big disadvantage of web services: they always depend on user/machine interaction. With a lot of effort you can extract the functionality of the essential classes or methods and construct test scenarios etc. with JUnit.
An overview of your possibilities:
special drivers and placeholders
you can use a logger with a detailed log level and file output, then create scenarios with the expected results and compare them with your log files.
Capture-replay tools: they record your execution and replay it for monitoring.
I can also recommend using Selenium for the frontend tests.
Hope it helped.
I have developed my application and I have a .properties file containing several key-value properties.
In my code I inject said properties like this:
    @Value("${services.host}${services.name}")
    private String hostname;
I am searching for a way to check every @Value inside my code, to make sure that every property will be resolved at runtime. Something like simulating my application's startup.
Is it possible?
Yes, you can create a JUnit test class that loads your application context (just like your production code would) and then execute a test method that verifies that your property values have been injected.
    @RunWith(SpringJUnit4ClassRunner.class)
    @ContextConfiguration(classes = {AppConfig.class})
    public class SpringApplicationTest {

        @Autowired
        private MyServiceBean serviceBean;

        @Test
        public void shouldExecuteServiceBean_andProduceExpectedOutcome() {
            //TODO test setup
            serviceBean.doSomething();
            //TODO assert output
        }
    }
In this example, MyServiceBean.java is a class that would be executed from your Main class, so that you are testing the end-to-end logic of your application, including all of the Spring dependency injections. Think of it as your "happy path" test scenario. I always include at least one test like this in my projects, to ensure that all of the Spring injections are correct and load without error. You want to catch the errors before you build and deploy your code.
In the example above, AppConfig.java is the same Spring configuration class you use when your code is deployed. You probably want to add another configuration class that overrides some properties/beans specifically for testing only:
    @ContextConfiguration(classes = {AppConfig.class, TestConfig.class})
Using a test-only class, you can mock out any dependencies that make testing difficult (e.g. use an in-memory database), and also override properties so you can test against "localhost" rather than another service which may or may not be available (so long as you can create an equivalent localhost service in your test setup).
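A sketch of such a test-only override (the property keys echo the earlier @Value example; the localhost values are placeholders):

    import java.util.Properties;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;

    @Configuration
    public class TestConfig {

        // local properties win over the environment, so ${services.host}
        // and ${services.name} resolve to the test values below
        @Bean
        public static PropertySourcesPlaceholderConfigurer testProperties() {
            PropertySourcesPlaceholderConfigurer configurer = new PropertySourcesPlaceholderConfigurer();
            Properties props = new Properties();
            props.setProperty("services.host", "http://localhost:8089");
            props.setProperty("services.name", "/fake-service");
            configurer.setProperties(props);
            configurer.setLocalOverride(true);
            return configurer;
        }
    }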
Note: if you are finding it difficult to test your application due to too many dependencies, or external dependencies that cannot be swapped out easily, the pain you are feeling is a good guide for rethinking your architecture to support ease of testing. You can also test just portions of your application using the above concepts.