A set of tests should be run on every microservice. The current solution is to have an abstract class and extend it in every service, providing the necessary properties via abstract getters.
public abstract class AbstractTest {

    @LocalServerPort
    protected int serverPort;

    protected abstract String getPath();

    @Test
    void someTest() {}

    @Test
    void conditionalTest() {}
}
@SpringBootTest(
        webEnvironment = SpringBootTest.WebEnvironment.DEFINED_PORT,
        classes = {...})
@ActiveProfiles(...) // etc.
public class MyTest extends AbstractTest {
    // ... implement getPath()
    // tests from the parent will be executed
}
The goal:
Ditch inheritance and have AbstractTest's logic executed automatically, with conditional @Test execution based on available beans/properties etc.
The possible solution:
A concrete class with all the tests or some sort of Configuration/TestFactory to create the necessary tests. It should take into account available properties and beans to determine which tests to run.
The problem:
How can those tests (created at runtime) be discovered and registered for execution?
How can all the properties that are part of the current @SpringBootTest context be injected?
Failed attempts:
The TestInstanceFactory extension doesn't seem to be the solution, as it requires an instance of the class it annotates.
Using the Launcher API seems like overkill, and also doesn't seem to work, since the library class won't be created with the Spring context configs.
Using cglib and a base class, Spring Contract-style, is not a desirable solution.
Ideally I don't want the client of this lib to implement/create anything, so abstract String getPath(); would become a test.lib.path property, and if it's present, a test from the library which uses it will run.
Any thoughts on this would be great, because right now this just seems impossible to me.
What is the reason to have inheritance for tests?
In case you need to share some common logic within the tests, you may try JUnit features (custom rules/extensions), for example:
For JUnit < 5.x, the @Rule functionality: https://junit.org/junit4/javadoc/4.12/org/junit/rules/TemporaryFolder.html https://stackoverflow.com/a/34608174/6916890
For JUnit >= 5.x (Jupiter), there is an extension API: https://junit.org/junit5/docs/current/user-guide/#writing-tests-built-in-extensions-TempDirectory
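As a sketch of the conditional idea from the question: with JUnit 5 you could also generate the library's tests dynamically from a @TestFactory and contribute them only when the relevant property is present. This is a minimal, untested sketch assuming JUnit 5 and Spring Boot; test.lib.path is taken from the question, the class and test names are made up, and the class must still be discovered as a normal test class on the classpath.

import java.util.stream.Stream;

import org.junit.jupiter.api.DynamicTest;
import org.junit.jupiter.api.TestFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.core.env.Environment;

@SpringBootTest
class LibraryContractTest {

    @Autowired
    private Environment env; // sees the properties of the current @SpringBootTest context

    @TestFactory
    Stream<DynamicTest> pathTests() {
        String path = env.getProperty("test.lib.path");
        if (path == null) {
            return Stream.empty(); // property absent: contribute no tests instead of failing
        }
        return Stream.of(DynamicTest.dynamicTest("GET " + path + " responds", () -> {
            // exercise the endpoint at 'path' here
        }));
    }
}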
I need to create a version 2 of an existing service endpoint. In creating the unit tests, I realized the most efficient way to test the new code was to extend from the existing unit test class.
The setup, teardown, and stubbing methods were already completed and are pretty complex. I don't want to copy them into the new unit tests for redundancy reasons and I cannot move them into a utilities class since there's some tightly coupled logic in some of the stubbing classes.
When I run my new unit tests, I see the inherited unit tests run as well, which is not what I wanted. Has anyone been successful in inheriting from a base unit test class without running any of its @Test methods?
Here's an example of my base class:
@SpringBootTest
@ContextConfiguration
@ExtendWith(MockitoExtension.class)
@AutoConfigureMockMvc
public class MyBaseClass extends TestBase {

    @Value("${test.foo}")
    private String foo;

    // setup, teardown, stubbing to pull in data to run the tests AND then the tests themselves
}
@SpringBootTest
@ContextConfiguration
@ExtendWith(MockitoExtension.class)
@AutoConfigureMockMvc
public class MyVersion2BaseClass extends MyBaseClass {
    /* JUST THE TESTS, NO SETUP/TEARDOWN/STUBBING NEEDED,
       AS THIS IS CONTAINED IN THE PARENT CLASS */
}
I ended up moving all the setup/teardown/stubbing code into an abstract class.
Now my two versions of tests just extend from that abstraction, and all the administrative code is available to them.
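A minimal sketch of that restructuring (class names are hypothetical, JUnit 5 as in the question): the abstract class carries only the shared plumbing and deliberately no @Test methods, so neither version's test class inherits the other's tests.

import org.junit.jupiter.api.AfterEach;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;

public abstract class EndpointTestSupport {
    // shared setup/teardown/stubbing only -- no @Test methods here
    @BeforeEach
    void stubData() { /* ... */ }

    @AfterEach
    void cleanUp() { /* ... */ }
}

// in its own file
public class MyVersion1Test extends EndpointTestSupport {
    @Test
    void version1Behaviour() { /* ... */ }
}

// in its own file
public class MyVersion2Test extends EndpointTestSupport {
    @Test
    void version2Behaviour() { /* ... */ }
}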
I have a BaseTest class which consists of several tests. Each test shall be executed for EVERY profile I list.
I thought about using Parameterized values such as:
@RunWith(Parameterized.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
// @ActiveProfiles("h2-test") // <-- how to iterate over this?
public abstract class BaseTest {

    @Autowired
    private TestRepository test;

    // to be used with Parameterized/Spring
    private TestContextManager testContextManager;

    public BaseTest(String profile) {
        System.setProperty("spring.profiles.active", profile);
        // TODO what now?
    }

    @Parameterized.Parameters
    public static Collection<Object[]> data() {
        Collection<Object[]> params = new ArrayList<>();
        params.add(new Object[] { "h2-test" });
        params.add(new Object[] { "mysql-test" });
        return params;
    }

    @Before
    public void setUp() throws Exception {
        this.testContextManager = new TestContextManager(getClass());
        this.testContextManager.prepareTestInstance(this);
        // maybe I can spin up Spring here with my profile?
    }

    @Test
    public void testRepository() {
        Assert.assertTrue(test.exists("foo"));
    }
}
How would I tell Spring to run each test with these different profiles? In fact, each profile will talk to a different datasource (in-memory H2, external MySQL, external Oracle, ..), so my repository/datasource has to be reinitialized.
I know that I can specify @ActiveProfiles(...) and I can even extend from BaseTest and override the ActiveProfiles annotation. Although this will work, I'm only showing a portion of my test suite; lots of my test classes extend from BaseTest, and I don't want to create several different profile stubs for each class. Currently working, but ugly, solution:
BaseTest (@ActiveProfiles("mysql"))
  FooClassMySQL (annotation from BaseTest)
  FooClassH2 (@ActiveProfiles("h2"))
  BarClassMySQL (annotation from BaseTest)
  BarClassH2 (@ActiveProfiles("h2"))
Thanks
For what it's worth:
My use case was to run a specific test class for multiple Spring profiles; this is how I achieved it:
@SpringBootTest
abstract class BaseTest {
    @Test
    void doSomeTest() { /* ... ARRANGE-ACT-ASSERT ... */ }
}
#ActiveProfiles("NextGen")
class NextGenTest extends BaseTest {}
#ActiveProfiles("Legacy")
class LegacyTest extends BaseTest {}
If you use Maven, you can specify the active profile from the command line (or an environment variable if needed):
mvn clean test -Dspring.profiles.active=h2-test
The approach with parameterized tests may not work in this case, because the profile has to be specified before Spring boots up its context; when you run a parameterized integration test, the context will already be booted up before the test runner starts running your tests. Also, JUnit's parameterized tests were invented for other reasons (running unit tests with different data series).
EDIT: One more thing - when you decide to use @RunWith(Parameterized.class), you won't be able to use a different runner. In many cases (if not all, when it comes to integration testing) you want to specify a different runner, like SpringRunner.class - with a parameterized test you won't be able to do that.
Spring profiles are not designed to work in this way.
In your case, each profile uses a specific datasource.
So each one requires a Spring Boot reload to run the tests against the expected datasource.
In fact, what you want to do amounts to running as many Maven builds as there are Spring profiles to test.
Besides, builds in the local environment should be as fast as possible.
Multiplying automated test executions by the number of DBMS implementations, each requiring a Spring Boot reload, will not help.
You should not need to specify @ActiveProfiles.
It looks rather like a task for a Continuous Integration tool, where you could define a job that executes (sequentially or in parallel) one Maven build per Spring Boot profile:
mvn clean test -Dspring.profiles.active=h2
mvn clean test -Dspring.profiles.active=mysql
etc...
You can also try to do this locally by writing a script that runs the Maven builds.
But as said, it will slow down your local build and also complicate it.
I need to create tests for some class. This class in the main project (src/main/java/..) is injected easily into other classes, since I have a custom ResourceConfig class which declares which packages have to be scanned for service classes.
Now I created test directories (in src/test/java/..) and created a class, something like:
public class TheMentionedClassIntegrationTest {

    @Inject
    private TheMentionedClass theMentionedClass;

    @Test
    public void testProcessMethod() {
        assertNotNull(theMentionedClass);
    }
}
But the problem is that whatever I do, the class is always null. In other tests in the project I was using the JerseyTest class, so I tried to do the same here: extend TheMentionedClassIntegrationTest with JerseyTest, override the configure method, create my private ResourceConfig class which registers the Binder (the default for the whole project) and register TheMentionedClassIntegrationTest as well.
It didn't work. I made many different attempts, but none of them were successful. I think working with HK2 is extremely difficult; there is no good documentation for it.
Do you guys have an idea how to inject TheMentionedClass into the test class? Maybe my approach is wrong?
Thanks!
The easiest thing to do is to just create the ServiceLocator and use it to inject the test class, as seen here. For example:
public class TheMentionedClassIntegrationTest {

    @Inject
    private TheMentionedClass theMentionedClass;

    @Before
    public void setUp() {
        ServiceLocator locator = ServiceLocatorUtilities.bind(new YourBinder());
        locator.inject(this);
    }

    @Test
    public void testProcessMethod() {
        assertNotNull(theMentionedClass);
    }
}
You could alternatively use (make) a JUnit runner, as seen here.
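A hedged sketch of what such a runner could look like (reusing the YourBinder from above; untested, the runner name is made up):

import org.glassfish.hk2.api.ServiceLocator;
import org.glassfish.hk2.utilities.ServiceLocatorUtilities;
import org.junit.runners.BlockJUnit4ClassRunner;
import org.junit.runners.model.InitializationError;

public class Hk2Runner extends BlockJUnit4ClassRunner {

    private final ServiceLocator locator = ServiceLocatorUtilities.bind(new YourBinder());

    public Hk2Runner(Class<?> klass) throws InitializationError {
        super(klass);
    }

    @Override
    protected Object createTest() throws Exception {
        Object test = super.createTest();
        locator.inject(test); // satisfy @Inject fields on every new test instance
        return test;
    }
}

Tests would then declare @RunWith(Hk2Runner.class) and could drop the manual setUp() injection.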
For some other ideas, you might want to check out the tests for hk2-testing and all of the projects it contains, for some use-case examples.
I have a requirement to write many tests. I have extended JUnit's Suite runner in order to add new annotations in which I can mention several prerequisite classes that will be executed before any of the tests or setups get executed. My typical test looks like this:
@RunWith(CustomSuiteRunner.class)
@BeforeSuite(Prerequisite.class)
@AfterSuite(CleanupOperations.class)
@Suite.SuiteClasses({
        SimpleTests.class,
        WeatherTests.class
})
public class SimpleSuite {
}
I have overridden public void run(final RunNotifier notifier) to add the code required to trigger the prerequisites and cleanup operations mentioned in the BeforeSuite and AfterSuite annotations.
Now I'm trying to find out how I can achieve the same by extending BlockJUnit4ClassRunner. I can't find a method equivalent to run, which starts the execution, to override; there is runChild, which gets triggered before each child gets executed.
The reason I'm looking into this is that I'm trying to create several rules in an interface and make my tests implement it so that the rules will be available to them; however, as interface fields are static and final, JUnit ignores these. In another question I asked today, I got the answer that I can make JUnit consider rules declared in an interface by extending BlockJUnit4ClassRunner and overriding getTestRules().
So, here is what I'm trying to find out:
Is it possible to extend BlockJUnit4ClassRunner to make it take a list of tests and run them as a suite, and also run some code before any test executes and after all tests have executed?
How can I extend the suite runner to consider TestRules defined in an implemented interface?
It is quite possible to extend BlockJUnit4ClassRunner and make it take a list of tests and run them as a suite, with any required test dependencies handled within the overridden runChild() method:
public class CustomRunner extends BlockJUnit4ClassRunner {

    private List<String> testsToRun = Arrays.asList("sample1");

    public CustomRunner(Class<?> klass) throws InitializationError {
        super(klass);
    }

    @Override
    public void runChild(FrameworkMethod method, RunNotifier notifier) {
        // handle any dependency logic by creating a custom listener and registering it with the notifier
        super.runChild(method, notifier);
    }
}
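For the second question, here is a hedged sketch of the getTestRules() override mentioned in the question: it additionally collects TestRule constants declared in the test class's directly implemented interfaces (interface fields are implicitly public static final, so reflection can read them with a null receiver). Untested; the runner name is made up.

import java.lang.reflect.Field;
import java.util.List;

import org.junit.rules.TestRule;
import org.junit.runners.BlockJUnit4ClassRunner;
import org.junit.runners.model.InitializationError;

public class InterfaceRulesRunner extends BlockJUnit4ClassRunner {

    public InterfaceRulesRunner(Class<?> klass) throws InitializationError {
        super(klass);
    }

    @Override
    protected List<TestRule> getTestRules(Object target) {
        List<TestRule> rules = super.getTestRules(target);
        for (Class<?> iface : getTestClass().getJavaClass().getInterfaces()) {
            for (Field field : iface.getFields()) {
                if (TestRule.class.isAssignableFrom(field.getType())) {
                    try {
                        rules.add((TestRule) field.get(null)); // static, so no instance needed
                    } catch (IllegalAccessException e) {
                        throw new IllegalStateException(e);
                    }
                }
            }
        }
        return rules;
    }
}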
I gave Google Guice the responsibility of wiring my objects. But how can I test that the bindings are working well?
For example, suppose we have a class A which has a dependency B. How can I test that B is injected correctly?
class A {

    private B b;

    public A() {}

    @Inject
    public void setB(B b) {
        this.b = b;
    }
}
Notice that A hasn't got a getB() method and I want to assert that A.b isn't null.
For any complex Guice project, you should add tests to make sure that the modules can be used to create your classes. In your example, if B were a type that Guice couldn't figure out how to create, then Guice won't be able to create A. If A wasn't needed to start the server but was needed when your server was handling a request, that would cause problems.
In my projects, I write tests for non-trivial modules. For each module, I use requireBinding() to declare what bindings the module requires but doesn't define. In my tests, I create a Guice injector using the module under test and another module that provides the required bindings. Here's an example using JUnit4 and JMock:
/** Module that provides LoginService */
public class LoginServiceModule extends AbstractModule {

    @Override
    protected void configure() {
        requireBinding(UserDao.class);
    }

    @Provides
    LoginService provideLoginService(UserDao dao) {
        ...
    }
}

@RunWith(JMock.class)
public class LoginServiceModuleTest {

    private final Mockery context = new Mockery();

    @Test
    public void testModule() {
        Injector injector = Guice.createInjector(
            new LoginServiceModule(), new ModuleDeps());
        // next line will throw an exception if dependencies are missing
        injector.getProvider(LoginService.class);
    }

    private class ModuleDeps extends AbstractModule {

        private final UserDao fakeUserDao;

        public ModuleDeps() {
            fakeUserDao = context.mock(UserDao.class);
        }

        @Override
        protected void configure() {}

        @Provides
        UserDao provideUserDao() {
            return fakeUserDao;
        }
    }
}
Notice how the test only asks for a provider. That's sufficient to determine that Guice could resolve the bindings. If LoginService was created by a provider method, this test wouldn't test the code in the provider method.
This test also doesn't verify that you bound the right thing to UserDao, or that UserDao is scoped correctly. Some would argue that those types of things are rarely worth checking; if there's a problem, it happens once. You should "test until fear turns to boredom."
I find Module tests useful because I often add new injection points, and it's easy to forget to add a binding.
The requireBinding() calls can help Guice catch missing bindings before it returns your injector! In the above example, the test would still work if the requireBinding() calls were not there, but I like having them because they serve as documentation.
For more complicated modules (like my root module) I might use Modules.override() to override bindings that I don't want at test time (for instance, if I want to verify that my root object can be created, I probably don't want the test to create an object that connects to the database). For simple projects, you might only test the top-level module.
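A minimal sketch of that Modules.override() idea (RootModule, FakeDatabaseModule and Server are hypothetical names):

import com.google.inject.Guice;
import com.google.inject.Injector;
import com.google.inject.util.Modules;
import org.junit.Test;

public class RootModuleTest {

    @Test
    public void rootModuleWiresTheServer() {
        // bindings in FakeDatabaseModule win over the real ones in RootModule
        Injector injector = Guice.createInjector(
                Modules.override(new RootModule()).with(new FakeDatabaseModule()));
        // fails fast if any binding in the overridden graph is missing
        injector.getProvider(Server.class);
    }
}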
Note that Guice will not inject nulls unless the field is annotated with @Nullable, so you very rarely need to verify that injected objects are non-null in your tests. In fact, when I annotate constructors with @Inject, I don't bother to check whether the parameters are null (my tests often inject null into the constructor to keep the tests simple).
Another way to test your configuration is by having a test suite that tests your app end-to-end. Although end-to-end tests nominally test use cases, they indirectly check that your app is configured correctly (that all the dependencies are wired, etc.). Unit tests, on the other hand, should focus exclusively on the domain, and not on the context in which your code is deployed.
I also agree with NamshubWriter's answer. I'm not against tests that check configuration, as long as they are grouped in a test suite separate from your unit tests.
IMHO, you should not be testing that. The Google Guice guys have unit tests to assert that injection works as expected; after all, that's what Guice is designed to do. You should only be writing tests for your own code (A and B).
I don't think you should test that private members are set. Better to test against the public interface of your class. If member b weren't injected, you'd probably get a NullPointerException when executing your tests, which should be plenty of warning.
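To make that concrete, a hedged sketch of such a behaviour-level test (AppModule and doSomethingThatUsesB are hypothetical names, not from the question):

import com.google.inject.Guice;
import org.junit.Test;

public class ATest {

    @Test
    public void aIsUsableWhenCreatedByGuice() {
        // AppModule is assumed to bind B
        A a = Guice.createInjector(new AppModule()).getInstance(A.class);
        // any public method that relies on b would throw a NullPointerException
        // here if injection had not happened
        a.doSomethingThatUsesB();
    }
}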