Can I selectively disable Spring Data repositories for testing? - java

I'm writing a module-level integration test for a system using Spring Integration. I need the integration plan up and running but at this level am still using MockMvc and a mocked repository interface to ensure that I have all of my mappings, conversions, and message routing correct.
Right now, my module-level Enable configuration is meta-annotated with @EnableMongoRepositories, and the Spring test runner aborts because it doesn't have a live mongoTemplate to create the repositories from; the mock repository doesn't prevent the attempt to create the real ones.
I know that I can conditionalize the inclusion of @EnableMongoRepositories, but is there a simpler way to tell Spring Data not to create repository proxies if I'm already supplying mocks for them?

Basically, if I understand correctly, you have two kinds of setup for your MongoDB repositories: mock and live. You want to run integration tests and control which repository is used. I would suggest using Spring profiles.
Create a simple interface, say MongodbConfig.
Create two configuration classes, one for mock and one for live. Make sure both classes implement MongodbConfig, and set the profiles (@Profile) on each class.
Then activate the required profile before running the tests.
For a detailed understanding of profiles you can refer here.
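For illustration, a minimal sketch of that layout, assuming Spring Data MongoDB and Mockito are on the classpath (the class names, profile names, and database name are my own, not taken from the question):

import com.mongodb.client.MongoClients;
import org.mockito.Mockito;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Profile;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.MongoTemplate;

public interface MongodbConfig {
    MongoOperations mongoOperations();
}

@Configuration
@Profile("live")
class LiveMongodbConfig implements MongodbConfig {
    @Bean
    public MongoOperations mongoOperations() {
        // real template backed by a running MongoDB instance
        return new MongoTemplate(MongoClients.create("mongodb://localhost:27017"), "testdb");
    }
}

@Configuration
@Profile("mock")
class MockMongodbConfig implements MongodbConfig {
    @Bean
    public MongoOperations mongoOperations() {
        // Mockito mock, so no live mongoTemplate is needed
        return Mockito.mock(MongoOperations.class);
    }
}

The tests would then select the mock setup with @ActiveProfiles("mock") on the test class (or spring.profiles.active=mock), while production activates the live profile.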

Related

Prevent tests from running with specific spring profile

I have some integration tests that create/delete entries in my database. The problem is if I run the tests with
spring.profiles.active=prod
my production database gets deleted (as the tests clear the database). Is there any way to prevent tests from running on this specific spring profile?
I have seen this thread: How to prevent running tests when specific spring profile is active? but there was no useful answer.
Thank you
There could be multiple solutions to your problem.
Use an in-memory database such as H2 for SQL, or Flapdoodle (embedded MongoDB) for NoSQL, when running tests. (preferred way)
Create a separate properties file that is a clone of your Spring properties; just change the database properties, Spring profile, or whatever else differs. Use this properties file with @TestPropertySource on the test class (see the sketch after this list).
Use @DirtiesContext on tests so the application context, and with it any embedded database, is rebuilt and changes do not leak between tests.
Another thing you can do is create stub classes for your database layer to mock its operations.
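A rough sketch of the @TestPropertySource option, assuming Spring Boot and JUnit 5 (the properties file name and its contents are illustrative):

import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.TestPropertySource;

@SpringBootTest
// application-test.properties would point spring.datasource.url at an embedded H2
// database (or Flapdoodle for MongoDB) instead of the production one
@TestPropertySource(locations = "classpath:application-test.properties")
class RepositoryIntegrationTests {

    @Test
    void createsAndDeletesEntriesSafely() {
        // test code only touches the database configured in application-test.properties
    }
}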
I was able to resolve the problem following this comment: https://stackoverflow.com/a/32892291/8679100. The solution is not perfect, as I need to verify whether the prod profile is active in @BeforeAll and @AfterAll, but it does work. Furthermore, System.getProperty("spring.profiles.active", "") didn't actually work, but Arrays.stream(environment.getActiveProfiles()).anyMatch(env -> env.equalsIgnoreCase("prod")) did.
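For reference, a sketch of that guard, assuming JUnit 5; the PER_CLASS lifecycle is one way to let a non-static @BeforeAll see the injected Environment (the class name and message are illustrative):

import java.util.Arrays;

import org.junit.jupiter.api.Assumptions;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.TestInstance;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.core.env.Environment;

@SpringBootTest
@TestInstance(TestInstance.Lifecycle.PER_CLASS) // allows a non-static @BeforeAll with injected fields
class DestructiveDatabaseTests {

    @Autowired
    Environment environment;

    @BeforeAll
    void refuseToRunAgainstProd() {
        boolean prodActive = Arrays.stream(environment.getActiveProfiles())
                .anyMatch(profile -> profile.equalsIgnoreCase("prod"));
        // aborts (marks the tests as skipped) rather than clearing the production database
        Assumptions.assumeFalse(prodActive, "Skipping destructive tests: 'prod' profile is active");
    }
}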
You can use the @IfProfileValue annotation to disable the test. It does not depend directly on the Spring profile, but you can easily use a config file to set the value you want based on the Spring profile.
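A possible sketch, assuming JUnit 4 with SpringRunner (the property name and value are arbitrary); the test only runs when the JVM is started with -Dtest-group=non-prod, which you would set everywhere except production:

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.annotation.IfProfileValue;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
public class CleanupTests {

    @Test
    @IfProfileValue(name = "test-group", value = "non-prod")
    public void deletesEntries() {
        // destructive test body runs only when the system property matches
    }
}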
Having said that, it sounds very risky to run tests that delete entries from the DB (or perform any other DB transaction) against the production DB. I support what @Sachin suggested above: run your tests in a test environment, not production.

How to start Spring Boot app without depending on Pivotal GemFire cache

I have a Spring Boot app with a Pivotal GemFire ClientCache instance configured and its corresponding domain objects. I am also using Spring Boot Test for unit testing. For every test case execution, whether run from the test class or through a Maven build, the Spring ApplicationContext fails to load if the GemFire cache is down.
How to start Spring Boot application without depending on GemFire cache?
I am not sure I follow exactly what you mean by...
"For every test case execution, either through class or Maven build, Spring ApplicationContext fails to load if the GemFire cache is down."
Are you recreating the ClientCache instance for each test case (method) in your test class?
If so, then this can be tricky to do since even after calling ClientCache.close(), GemFire may not have completely "closed" and released all the resources used by the ClientCache instance. However, usually that does not prevent the Spring ApplicationContext from being recreated on subsequent test case executions. It usually just leads to subsequent test failures since the ClientCache instance is dirty, or stale, retaining old state from the previous (or last) test case execution.
Are you also using Spring's @DirtiesContext on your test case method as well?
Usually, it is wise to cycle the ApplicationContext and GemFire cache instance (e.g. ClientCache) per test class, where each test case method in the test class will use the same ApplicationContext and ClientCache instance; this is the most ideal.
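As a rough sketch of that per-class cycling (the class name is illustrative; this assumes JUnit 4 and Spring Boot's test support):

import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.annotation.DirtiesContext;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
// one ApplicationContext (and therefore one ClientCache) is shared by every test method
// in this class, then discarded after the class finishes
@DirtiesContext(classMode = DirtiesContext.ClassMode.AFTER_CLASS)
public class ClientCacheIntegrationTests {
    // @Test methods here reuse the same ClientCache instance
}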
With that, I have 2 things to share with you:
First, have a look at the new Spring Boot for Apache Geode/Pivotal GemFire project. Documentation is here. I announced the availability of this project nearly a month ago now. This project takes a "client-side" perspective to building Spring Boot applications with Pivotal GemFire. That is, it gives you an auto-configured ClientCache instance by default.
Specifically, have a look at Spring Boot for Pivotal GemFire's test suite, beginning here. Nearly all these test classes use a ClientCache instance and test various aspects of Pivotal GemFire, such as CQ's or Security, etc.
In certain test classes, I used a "mock" ClientCache instance (for example, this test class, and this test configuration in particular). However, in many other cases I used a live GemFire ClientCache instance, for example this test class, which is interesting since it even launches a server for the ClientCache instance (the test itself) to connect to.
All the test coordination logic in Spring Boot for Apache Geode/Pivotal GemFire is provided by another new project, Spring Test for Apache Geode/Pivotal GemFire. Unfortunately, Spring Test for Apache Geode/Pivotal GemFire is still largely a WIP, and so does not have documentation yet. However, I have used this new test project extensively to test Spring Boot for Apache Geode/Pivotal GemFire. You will see its presence in the extension classes, like ForkingClientServerIntegrationTestsSupport, and so on.
In summary, use the new Spring Boot for Pivotal GemFire and Spring Test for Pivotal GemFire projects as your guide for writing more effective unit and integration tests.
Finally, if you have an example GitHub repository reproducing your problem, I can help point you in the right direction.
Hope this helps!
Regards,
John
For your unit tests, use a different profile, say application-ut.yaml, and ask Spring not to use any cache implementation library:
application-ut.yaml (add the entry below and remove whatever implementation you configured for GemFire)
spring.cache.type: simple
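A small sketch of wiring that into a test, assuming Spring Boot's test support (the class name is illustrative; "ut" matches the application-ut.yaml above):

import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.ActiveProfiles;

@SpringBootTest
@ActiveProfiles("ut") // picks up application-ut.yaml, so spring.cache.type=simple replaces GemFire
class ReservationServiceTests {

    @Test
    void contextLoadsWithoutGemFire() {
        // the context starts with the simple in-memory cache, no GemFire cluster required
    }
}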

Using #Profile annotation for stubbing external behaviour

There are several Spring Boot Java web applications. I need to prepare several components for integration testing. My task is to mock all external behaviour such as other projects' components, DB calls, etc. I found a solution for this using the @Profile annotation from the Spring Framework. Here's an example. I can simply create a new profile and declare two bean implementations, one for each profile: one for real usage in production and another for integration testing, for stubbing. It would look like this:
@Profile("PROD")
@Configuration
@EnableWebSecurity
public class SecurityConfig extends WebSecurityConfigurerAdapter {
}
@Profile("MOCK")
@Configuration
@EnableWebSecurity
public class SecurityMockConfig extends WebSecurityConfigurerAdapter {
}
But I have doubts about this design. It looks a little bit messy to me. Is this solution considered acceptable for the task I have?
Doing this, your mocks and their configuration will probably be packaged with the app running in production.
This seems very odd to me. Would you package your unit tests in your delivered Spring application? I don't think so. So I would say this is a "bad" design, since testing dependencies should not be embedded with production code.
However, Spring's documentation about the @Profile annotation does use the example of environment segregation.
Now, there is a question which needs to be answered: what do you mean by "integration testing"?
Is this an automated integration test? Or do you want to run your application in different modes for the testing teams?
If this is an automated integration test, then there is no reason to use the @Profile annotation, as automated tests and production code will not be packaged together.
However, if you want your users to perform integration tests, then you could create a standalone fake project which simulates the external dependencies you are calling (database, web services, etc.).
Then, @Profile can be used to switch from fake to production mode, but only through a configuration file: the fake profile will call your fake external services, whereas production will call the real external services.
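For example, a minimal sketch of such a fake/production split (the service and class names here are invented for illustration):

import org.springframework.context.annotation.Profile;
import org.springframework.stereotype.Component;
import org.springframework.web.client.RestTemplate;

public interface RatingService {
    int rateCustomer(String customerId);
}

@Profile("PROD")
@Component
class HttpRatingService implements RatingService {

    private final RestTemplate restTemplate = new RestTemplate();

    public int rateCustomer(String customerId) {
        // real call to the external system
        return restTemplate.getForObject("https://ratings.example.com/customers/{id}", Integer.class, customerId);
    }
}

@Profile("MOCK")
@Component
class FakeRatingService implements RatingService {

    public int rateCustomer(String customerId) {
        // canned answer, no network call
        return 42;
    }
}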

Integration tests of Spring application

I am trying to implement integration tests for my Tomcat application, but the issue is that the application is launched separately from the tests, so the tests can access neither the application context nor the database.
My idea is to run the tests "within" the running application, so I can @Autowire the EntityManager and, for instance, check the state of the database during testing or even create database entities for testing.
My only idea for doing this is to actually start the application programmatically from the tests with new ClassPathXmlApplicationContext("applicationContext.xml") and then access the context. This would work, but it would be very hard to debug, as we wouldn't be able to use hot swapping during testing. Also, I guess the server would be stopped as soon as the tests end. So that is probably not the best or correct solution.
EDIT:
My question was probably unclear, so I will try to clarify.
I have a Tomcat application with Spring and Hibernate. The Spring beans and the Hibernate database connection are initialised when the Tomcat application is started. The issue is how to run tests of the live Spring beans from methods annotated with @Test in src/test/java, which are started separately.
Consider this class:
@Component
class MyRepository {

    @Autowired
    EntityManager em;

    @Transactional
    public void myMethod(MyEntity entity) {
        // do some job with entity
        ...
        em.flush();
    }
}
This class will be initialised with Tomcat as a MyRepository bean.
To test it, I cannot just call new MyRepository().myMethod(...) - I need to access the bean. The issue is accessing the bean from the @Test method:
@Test
void testMyRepository() {
    Item item = ...
    // then use the repository to handle the entity
    context.getBean(MyRepository.class).myMethod(item);
    // then assert the state of the database
    context.getBean(EntityManager.class).find(Item.class, ...) ...
}
I can probably get the context in the initialisation of the tests with
ApplicationContext context = new ClassPathXmlApplicationContext("applicationContext.xml");
But that would mean launching the whole application each time the tests are started. The better solution would be if the application could run separately from the tests.
Hope my problem is more clear now.
I would suggest using SpringRunner to start the Spring application context and perform your tests on that running instance. You can customize the context so it doesn't contain the parts you don't want to test, and you can create mocks for components that require external resources (REST clients and such). Take a look at the Spring docs or Spring Boot docs.
If multiple tests use the same Spring context configuration, the context is started just once and reused, so it's good to keep its configuration in a parent class of your tests. You can autowire any Spring bean into your test and test it.
You can use an in-memory database (such as H2) instead of the production one, so your tests do not depend on external infrastructure. To initialize the database, use tools like Flyway or Liquibase. To clear the database before each test, you can use the @Sql annotation. A sketch of such a test follows below.
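Here is a sketch of what that could look like for the MyRepository above, assuming JUnit 4 and an XML-based context as in the question (the seed script and entity details are illustrative):

import static org.junit.Assert.assertNotNull;

import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.jdbc.Sql;
import org.springframework.test.context.junit4.SpringRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringRunner.class)
@ContextConfiguration("classpath:applicationContext.xml")
@Transactional // test-managed transaction, rolled back after each test
public class MyRepositoryIntegrationTest {

    @Autowired
    MyRepository myRepository;

    @PersistenceContext
    EntityManager em;

    @Test
    @Sql("/insert-test-items.sql") // seed the in-memory database before the test
    public void myMethodPersistsEntity() {
        MyEntity entity = new MyEntity();
        myRepository.myMethod(entity);
        // assert the state of the database through the same EntityManager
        assertNotNull(em.find(MyEntity.class, entity.getId()));
    }
}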
You can find many examples of projects with such tests, for example my own demo.
If you want to test an external system, I would suggest something like JMeter.
Unfortunately you can't mirror your classes and use them in your tests. That's a big disadvantage of web services: they always depend on user/machine interaction. With a lot of effort you can extract the functionality of the essential classes or methods and construct test scenarios etc. with JUnit.
An overview of your possibilities:
Special drivers and placeholders.
You can use a logger with a detailed log level and file output, then create scenarios with the expected results and compare them with your log files.
Capture/replay tools, which record your execution and replay it for monitoring.
I can also recommend using Selenium for the frontend tests.
Hope it helped.

How to Configure Dependency Injection in a Library Project?

How to Configure Dependency Injection in a Library Project?
Let me illustrate this question with the following example.
Maven Library Project
ReservationAPI
com.example.reservation-api
This project contains a convenience class called ReservationApiClient which uses a RestTemplate (from the Spring Framework) for making HTTP calls.
Is it possible to make the RestTemplate field @Autowired in this library project instead of instantiating it myself?
Maven Executable Project
org.company.application
This project is a Spring Boot application and uses the above ReservationAPI as a dependency. This app will create a @Bean for the convenience class ReservationApiClient contained in that library and will then execute its public methods, which in turn make HTTP requests.
What is a good strategy and/or best practices for the scenario described above?
You can do this if you include autowiring in your library project, although that means the library would always need to be used within a Spring application context to get the value, unless you also provide getter/setter methods. However, I don't think autowiring the RestTemplate makes much sense here: there is nothing specific about a RestTemplate, unless you name the beans there is only one bean definition per class, and all of the RestTemplate methods take the URI anyway. So in this case I would just define a bean for your ReservationApiClient in your application.
Another way to do it, if you want to include Spring dependencies in your library (which I guess you already do by using RestTemplate), is to declare your ReservationApiClient as a @Service or @Component and then use the @ComponentScan annotation in your main Spring Boot project to scan that library for components to include in the bean registry, as sketched below.
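A rough sketch of that second approach (package names adjusted to be valid Java; the client method shown is invented for illustration):

// In the library (package com.example.reservationapi):
import org.springframework.stereotype.Component;
import org.springframework.web.client.RestTemplate;

@Component
public class ReservationApiClient {

    private final RestTemplate restTemplate;

    public ReservationApiClient(RestTemplate restTemplate) {
        this.restTemplate = restTemplate; // injected rather than instantiated inside the class
    }

    public String getReservation(String id) {
        return restTemplate.getForObject("https://reservations.example.com/{id}", String.class, id);
    }
}

// In the Spring Boot application (package org.company.application):
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.web.client.RestTemplate;

@SpringBootApplication
@ComponentScan(basePackages = {"org.company.application", "com.example.reservationapi"})
public class Application {

    @Bean
    public RestTemplate restTemplate() {
        return new RestTemplate(); // the bean the library's constructor receives
    }

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}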
Another option is to use a feature like Spring Boot's auto-configuration to create factories that use third-party libraries and configure them according to properties in your application settings. The auto-configuration documentation would be a good place to start. You can look at the starter projects they have on GitHub and the auto-configuration classes associated with them.
Let me know if any of this does not make sense.
