Unit tests should only test the logic of one function and should "mock" the data that is used in that function. I am wondering how we would unit test the following function while "mocking" the data, or even whether that is the correct approach. The function signature is:
public String doSomething(int firstId, int secondId, int count){
// this function looks up a row in table C, which has foreign keys from tables A and B
// if firstId and secondId already exist in db table C, return "already-exists"
// if count < a_column_value_in_table_C, return "not-allow"
// else return "success"
}
The firstId and secondId are foreign keys from two different tables. Now, how do we go about unit testing this function in terms of:
1. How should the unit test be designed so it is able to test the 3 scenarios in the function?
2. How do we prepare data for this unit test, given that it will need foreign keys from two different tables?
You should be using dependency injection, as per the SOLID principles.
The class that owns the doSomething method should have some Repository or DAO etc. injected into it.
In your unit tests, you should mock the repository methods.
For example, let's assume your doSomething method calls the repository's findById(...) method. You should mock findById to return the desired output and just test the logical part of your flow, as in the sketch below.
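A minimal sketch with Mockito and JUnit 5 could look like this (SomeService, CRepository, existsByIds and findThreshold are made-up names used only to illustrate the idea; they are not part of the question):
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;

@ExtendWith(MockitoExtension.class)
class SomeServiceTest {

    @Mock
    private CRepository cRepository;   // hypothetical repository for table C

    @InjectMocks
    private SomeService service;       // hypothetical class that owns doSomething

    @Test
    void returnsAlreadyExistsWhenBothIdsArePresentInTableC() {
        when(cRepository.existsByIds(1, 2)).thenReturn(true);
        assertEquals("already-exists", service.doSomething(1, 2, 10));
    }

    @Test
    void returnsNotAllowWhenCountIsBelowTheColumnValue() {
        when(cRepository.existsByIds(1, 2)).thenReturn(false);
        when(cRepository.findThreshold(1, 2)).thenReturn(10);  // hypothetical lookup of the column value
        assertEquals("not-allow", service.doSomething(1, 2, 5));
    }

    @Test
    void returnsSuccessOtherwise() {
        when(cRepository.existsByIds(1, 2)).thenReturn(false);
        when(cRepository.findThreshold(1, 2)).thenReturn(10);
        assertEquals("success", service.doSomething(1, 2, 20));
    }
}
Each of the three scenarios gets its own test method, and no real database (and therefore no foreign key data) is needed at all.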
You could use a test database or an in-memory one (like HSQLDB). Populate it with test data before the tests (in a method annotated with @BeforeClass, or during test datasource initialization if you use Spring). Then execute the tests for all your scenarios against the prepared data. Clean up the data in the test database in a method annotated with @AfterClass.
If you use Spring with XML configuration, the config for the test dataSource could look like this:
<jdbc:embedded-database id="dataSource" type="HSQL" >
<jdbc:script location="scripts/ddl/*"/> <!-- create tables -->
<jdbc:script location="scripts/dml/*"/> <!-- populate test data -->
</jdbc:embedded-database>
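The data preparation and cleanup could then look roughly like this (a sketch using per-test @Before/@After instead of @BeforeClass/@AfterClass; the context file, table and column names are made up to match the question):
import java.sql.Connection;
import java.sql.Statement;

import javax.sql.DataSource;

import org.junit.After;
import org.junit.Before;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("/test-context.xml")   // hypothetical context defining the embedded dataSource
public class DoSomethingDbTest {

    @Autowired
    private DataSource dataSource;

    @Before
    public void insertTestData() throws Exception {
        try (Connection c = dataSource.getConnection();
             Statement s = c.createStatement()) {
            // parent rows first so the foreign keys in C can be satisfied
            s.execute("INSERT INTO A (ID) VALUES (1)");
            s.execute("INSERT INTO B (ID) VALUES (2)");
            s.execute("INSERT INTO C (A_ID, B_ID, SOME_COUNT) VALUES (1, 2, 10)");
        }
    }

    @After
    public void cleanUpTestData() throws Exception {
        try (Connection c = dataSource.getConnection();
             Statement s = c.createStatement()) {
            // child table first because of the foreign keys
            s.execute("DELETE FROM C");
            s.execute("DELETE FROM A");
            s.execute("DELETE FROM B");
        }
    }

    // tests for the three scenarios go here
}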
I usually create a repository interface with a getById and a getAll function. For testing purposes I create an in-memory repository, and for production I use the database repository.
Here is an example:
import java.util.ArrayList;
import java.util.List;

public interface Repository<T> {
    T getById(int id);
    List<T> getAll();
}

public class InmemoryRepository implements Repository<User> {

    private final List<User> database = new ArrayList<>(); // with some data

    @Override
    public List<User> getAll() {
        return database;
    }

    @Override
    public User getById(int id) {
        return database.stream()
                .filter(x -> x.getId() == id)
                .findFirst()
                .orElse(null);
    }
}
You then pass this repository into your function so that you can access the data through it:
public String doSomething(int firstId, int secondId, int count, Repository<User> repo) { ... }
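A test then simply passes the in-memory repository instead of the database-backed one (a sketch; SomeService is a made-up name for the class that owns doSomething):
import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class DoSomethingTest {

    @Test
    public void returnsSuccessWhenIdsAreNotInTheRepository() {
        Repository<User> repo = new InmemoryRepository();  // seed it with whatever the scenario needs
        SomeService service = new SomeService();           // hypothetical owner of doSomething

        String result = service.doSomething(1, 2, 10, repo);

        assertEquals("success", result);
    }
}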
You are not supposed to design one unit test that covers all the cases there are. Create a separate unit test for each case.
You need to mock the repository or class that accesses the database, so that it does not actually hit the database and instead returns a result set predetermined by you in the unit test. There are mocking libraries for that, which make your work very easy. If the code is only executing a query or procedure and returning the result back, there is not much merit in writing a unit test for it; writing an integration test should be enough in that case imo.
P.S.: Emre Savcı, dependency injection is not a SOLID principle; dependency inversion is. They are not the same thing.
I was reading this article on Unit Testing. It seems pretty straightforward, but there was a section that interested me and I wanted to see if someone could please provide an explanation or example of what this means. I think I understand it but maybe not well enough.
Write test cases that are independent of each other. For example, if a class depends on a database, do not write a case that interacts with the database to test the class. Instead, create an abstract interface around that database connection and implement that interface with a mock object.
What does it mean to:
create an abstract interface around the db connection?
then implement it with a mock object?
I am mostly asking about the first part (1), but if someone can explain both parts, that would be helpful. Thanks.
ONE: The "abstract data interface" simply means that you provide an interface with methods for storing and finding data in a business-oriented way, rather than using the database connection directly. You can then implement this interface to store data in different ways: an SQL database, a file-based approach, etc. In some patterns such a class is also referred to as a "data access object" (DAO).
public interface PersonDao {
    void store(Person personToStore);
    Person findById(String id);
}

public class SqlPersonDao implements PersonDao {

    @Override
    public void store(Person personToStore) {
        // use the database connection here ...
    }

    @Override
    public Person findById(String id) {
        // use the database connection here ...
        return null;
    }
}
Basically, in a unit test you always want to mock anything that is not your system under test and has complex behaviour you cannot control. That is especially true for things like system time: if a class uses the system time, for tests you want a way to inject a predefined time that overrides the system clock (see the sketch below).
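For the system-time case, java.time.Clock is a convenient thing to inject (a sketch; ExpiryChecker is a made-up class used only to illustrate the seam):
import static org.junit.jupiter.api.Assertions.assertTrue;

import java.time.Clock;
import java.time.Instant;
import java.time.ZoneOffset;

import org.junit.jupiter.api.Test;

// Production code asks the injected Clock for the time instead of calling Instant.now() directly.
class ExpiryChecker {
    private final Clock clock;

    ExpiryChecker(Clock clock) {
        this.clock = clock;
    }

    boolean isExpired(Instant deadline) {
        return Instant.now(clock).isAfter(deadline);
    }
}

class ExpiryCheckerTest {

    @Test
    void reportsExpiredWhenDeadlineIsInThePast() {
        // a fixed clock makes "now" fully predictable in the test
        Clock fixed = Clock.fixed(Instant.parse("2020-01-01T00:00:00Z"), ZoneOffset.UTC);
        ExpiryChecker checker = new ExpiryChecker(fixed);

        assertTrue(checker.isExpired(Instant.parse("2019-12-31T00:00:00Z")));
    }
}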
TWO:
In unit tests you don't want to be affected by bugs in any dependency. For a unit using the PersonDao, the PersonDao would be such a dependency. Rather than relying on the real implementation's behaviour, you want to define exactly the results you expect (using the notation of the Mockito mocking framework and the AssertJ assertion framework here):
import static org.assertj.core.api.Assertions.assertThat;
import static org.mockito.Mockito.doReturn;

import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.mockito.Mock;
import org.mockito.MockitoAnnotations;

class MyUnitTest {

    // system under test
    MyUnit sut;

    @Mock
    PersonDao personDaoMock;

    @BeforeEach
    public void setup() {
        MockitoAnnotations.initMocks(this);
        // pass the mock into the unit under test
        sut = new MyUnit(personDaoMock, "some", "parameters");
    }

    @Test
    void myTest() {
        // set up the test environment using the mock
        var somePerson = new Person("101", "John", "Doe");
        doReturn(somePerson).when(personDaoMock).findById("101");

        // run test
        var actualValue = sut.doSomething();

        // check results
        assertThat(actualValue).isNotNull();
    }
}
I'd like to ask whether it is alright to use the app's repositories (Spring Data based) to fill in testing data. I know I can use an SQL file with data, but sometimes I need something more dynamic. I find writing SQL or dataset definitions cumbersome (and hard to maintain in case of schema changes). Is there anything wrong with using the app's repositories? All the basic CRUD operations are already there. Note we are talking especially about integration testing.
It feels kind of weird to use part of the app to test itself. Maybe I could create another set of repositories to be used in test contexts.
No, there is absolutely nothing wrong with using Spring Data repositories to create test data.
I even prefer that since it often allows for simpler refactoring.
As with any use of JPA in tests, you need to keep in mind that JPA implementations use a write-behind cache. You probably want to flush and clear the EntityManager after setting up the test data, so that you don't get anything from the 1st level cache that really should come from the database. This also ensures the data is actually written to the database, so any problems with that will surface.
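With Spring Data JPA and Spring Boot's test support that could look roughly like this (a sketch; OrderRepository, Order and findByStatus are made-up names, and @DataJpaTest assumes Spring Boot is in use):
import static org.assertj.core.api.Assertions.assertThat;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
import org.springframework.boot.test.autoconfigure.orm.jpa.TestEntityManager;

@DataJpaTest
class OrderRepositoryTest {

    @Autowired
    private OrderRepository orderRepository;   // hypothetical Spring Data repository

    @Autowired
    private TestEntityManager entityManager;

    @Test
    void findsPendingOrders() {
        // use the application's own repository to create the test data
        orderRepository.save(new Order("pending"));

        // flush and clear so the query below really hits the database
        // instead of the 1st level cache
        entityManager.flush();
        entityManager.clear();

        assertThat(orderRepository.findByStatus("pending")).hasSize(1);
    }
}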
You might be interested in a couple of articles about testing with Hibernate. They don't use Spring Data, but it would work with Spring Data JPA just the same.
I would recommend using Flyway to set up your databases and the Flyway test extension for integration testing.
That way you can do something like this:
@ContextConfiguration(locations = {"/context/simple_applicationContext.xml"})
@TestExecutionListeners({DependencyInjectionTestExecutionListener.class,
        FlywayTestExecutionListener.class})
@Test
@FlywayTest(locationsForMigrate = {"loadmsql"}) // execution once per class
public class MethodTest extends AbstractTestNGSpringContextTests {

    @BeforeClass
    @FlywayTest(locationsForMigrate = {"loadmsql"}) // execution once per class
    public static void beforeClass() {
        // maybe some additional things
    }

    @BeforeMethod
    @FlywayTest(locationsForMigrate = {"loadmsql"}) // execution before each test method
    public void beforeMethod() {
        // maybe before every test method
    }

    @Test
    @FlywayTest(locationsForMigrate = {"loadmsql"}) // as method annotation
    public void simpleCountWithoutAny() {
        // or just with an annotation above the test method where you need it
    }
}
I have a lot of DAO classes I need to test in a Spring project.
I am already using DBUnit to mock my database; however, I use the @Before annotation to create objects and compare them after tests of create/update/delete operations.
@DatabaseSetup(value = { "/db_data/dao/common.xml", "/db_data/dao/myDAOCommonTest.xml" })
@DbUnitConfiguration(dataSetLoader = ReplacementDataSetLoader.class)
public class MyDAOImplTest extends AbstractDaoTU {

    @Autowired
    private MyDAO myDAO;

    private Set<ClassNeeded> objectsNeeded = new HashSet<>();
    private ClassOne classOne;
    private ClassTwo classTwo;
    private ClassThree classThree;

    @Override
    public void setUp() throws Exception {
        super.setUp();
        this.objectsNeeded.add(somethingComingFromTheMotherClass);

        this.classOne = new ClassOne();
        this.classOne.setIdClassOne(1L);
        this.classOne.setObjectsNeeded(this.objectsNeeded);
        // ... Many other sets

        this.classTwo = new ClassTwo();
        this.classTwo.setIdClassTwo(1L);
        this.classTwo.setClassOne(this.classOne);
        // ... Many other sets

        // ... Other sets follow for a lot of other objects
    }

    @Test
    public void testOne() {
        // ...
    }

    // ... Other tests follow
}
I am using an ORM (Hibernate in this case), and most objects are interdependent. My DAO methods mostly need complete objects to be called, so I must create the objects before testing.
My questions are the following:
Is there a better approach to unit testing DAOs?
What tools do you know of to make this easier/faster to write? (I am using Maven for packaging.)
Thanks for your help!
DBUnit complicates the maintenance of the tests, as it increases the number of places that need updating when something changes. Additionally, it separates the data preparation from the tests too much, so it's hard to tell which data relates to which tests.
Ideally each test prepares data for itself. This removes global state and keeps related things together.
To prepare the data just create entities and save them in the very same test. You can use randomization and transaction rollbacks to isolate the tests. Here is an example from one of my projects:
@Test public void returnsExperimentAsItWasSaved() {
Experiment original = Experiment.random();
experimentRepository.save(original);
flushToDbAndClearCache();
Experiment fromDb = experimentRepository.findOne(original.getExperimentId());
assertReflectionEquals(original, fromDb);
}
Note that the very same DAO class is used to prepare the data.
The best way is to develop your tests as you would develop your code: refactor to minimize duplication, extract reusable services, etc.
So, you'll probably create some TestCaseFactory that chains up a whole set of objects and saves them using your actual DAOs (a sketch follows below). Then you can call it from an @Before as you did. If you need a lot of different sets of objects, you can create different methods or a parameter object, etc.
And do a cleanup of all test data in an @After.
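Such a factory might look roughly like this (a sketch reusing the class names from the question; the DAO types and their save methods are assumptions):
// Reusable builder for the object graph the DAO tests need.
public class TestCaseFactory {

    private final ClassOneDao classOneDao;   // hypothetical DAOs used to persist the graph
    private final ClassTwoDao classTwoDao;

    public TestCaseFactory(ClassOneDao classOneDao, ClassTwoDao classTwoDao) {
        this.classOneDao = classOneDao;
        this.classTwoDao = classTwoDao;
    }

    /** Creates and persists a ClassTwo together with the ClassOne it depends on. */
    public ClassTwo createPersistedClassTwo() {
        ClassOne one = new ClassOne();
        one.setIdClassOne(1L);
        classOneDao.save(one);

        ClassTwo two = new ClassTwo();
        two.setIdClassTwo(1L);
        two.setClassOne(one);
        classTwoDao.save(two);

        return two;
    }
}
The @Before method then only calls createPersistedClassTwo(), and the @After deletes whatever the factory created.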
I am looking for information on how to build unit tests for typical DAO methods (find user by username, etc.), and I found several examples using mocks, like this one: http://www.christophbrill.de/de_DE/unit-testing-with-junit-and-mockito/
@Test
public void testComeGetSome() {
// Mock the EntityManager to return our dummy element
Some dummy = new Some();
EntityManager em = Mockito.mock(EntityManager.class);
Mockito.when(em.find(Some.class, 1234)).thenReturn(dummy);
// Mock the SomeDao to use our EntityManager
SomeDao someDao = Mockito.mock(SomeDao.class);
Mockito.when(someDao.comeGetSome(1234)).thenCallRealMethod();
Mockito.when(someDao.getEntityManager()).thenReturn(em);
// Perform the actual test
Assert.assertSame(dummy, someDao.comeGetSome(1234));
Assert.assertNull(someDao.comeGetSome(4321));
}
There is also a similar one in Lasse Koskela's book using EasyMock instead of Mockito.
The thing is: what are we really testing in these examples? We are basically telling through mocks what object the query should return, and then asserting that in fact it returned the object we told it to return.
We are not testing if the query is correct or if it returns a different object or no objects at all (or even more than one object). We cannot test if it returns null when the object does not exist in the database. This line
Assert.assertNull(someDao.comeGetSome(4321));
works because there is no scripted interaction for that argument, not because the object does not exist.
It looks like we are just testing if the method calls the proper methods and objects (em.find).
What is the point of unit testing this? Are there any good frameworks in Java to quickly set up an in-memory database and perform tests against it?
Your doubts really make sense. Actually, in most cases there is no need to test DAOs with unit tests, because unit tests deal with a single layer while DAOs cooperate with the database layer.
This article explains this idea:
http://www.petrikainulainen.net/programming/testing/writing-tests-for-data-access-code-unit-tests-are-waste/
Hence we should test the DAO and database layers with integration tests.
Integration tests take both the DAO and the database layer into account.
This article will provide you with a Spring + Hibernate example:
https://dzone.com/articles/easy-integration-testing
That looks more like service tests than real DAO tests.
For example, I'm using DbUnit to test my DAO layer.
Say I have an Author table with 2 fields: id and name.
I create a dataset XML file like this:
<?xml version="1.0" encoding="UTF-8"?>
<dataset>
<AUTHOR AUTHOR_ID="1" NAME="FirstAuthor"/>
...
<AUTHOR AUTHOR_ID="10" NAME="TenthAuthor"/>
</dataset>
And then in my test class I test my DAO methods like this:
@Test
@DatabaseSetup(value = "/dao/author/author-data.xml")
public void testFindAll() {
List<Author> authorList = this.authorDAO.findAll();
assertEquals(10, authorList.size());
}
I have to test some Thrift services using JUnit. When I run my tests as a Thrift client, the services modify the server database. I am unable to find a good solution that can clean up the database after each test is run.
Cleanup is important especially because the IDs need to be unique, and they are currently read from an XML file. Right now I have to manually change the IDs after running the tests so that the next set of tests can run without throwing primary key violations in the database. If I can clean up the database after each test run, the problem is completely resolved; otherwise I will have to think about other solutions, like generating random IDs and using them wherever IDs are required.
Edit: I would like to emphasize that I am testing a service which writes to the database; I don't have direct access to the database. But since the service is ours, I can modify it to provide a cleanup method if required.
If you are using Spring, all you need is the @DirtiesContext annotation on your test class.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("/test-context.xml")
@DirtiesContext(classMode = ClassMode.AFTER_EACH_TEST_METHOD)
public class MyServiceTest {
....
}
Unless you are testing specific database actions (verifying you can query or update the database, for example), your JUnit tests shouldn't be writing to a real database. Instead you should mock the database classes. This way you don't actually have to connect to and modify the database, and therefore no cleanup is needed.
You can mock your classes a couple of different ways. You can use a library such as JMock which will do all the execution and validation work for you. My personal favorite way to do this is with Dependency Injection. This way I can create mock classes that implement my repository interfaces (you are using interfaces for your data access layer, right? ;-)) and implement only the needed methods with known actions/return values.
//Example repository interface.
public interface StudentRepository
{
public List<Student> getAllStudents();
}
//Example mock database class.
public class MockStudentRepository implements StudentRepository
{
//This method creates fake but known data.
public List<Student> getAllStudents()
{
List<Student> studentList = new ArrayList<Student>();
studentList.add(new Student(...));
studentList.add(new Student(...));
studentList.add(new Student(...));
return studentList;
}
}
//Example method to test.
public int computeAverageAge(StudentRepository aRepository)
{
List<Student> students = aRepository.getAllStudents();
int totalAge = 0;
for(Student student : students)
{
totalAge += student.getAge();
}
return totalAge/students.size();
}
//Example test method.
public void testComputeAverageAge()
{
int expectedAverage = 25; //What the expected answer of your result set is
int actualAverage = computeAverageAge(new MockStudentRepository());
assertEquals(expectedAverage, actualAverage);
}
How about using something like DBUnit?
Spring's unit testing framework has extensive capabilities for dealing with JDBC. The general approach is that each unit test runs in a transaction, and (outside of your test) the transaction is rolled back once the test is complete.
This has the advantage of being able to use your database and its schema, but without making any direct changes to the data. Of course, if you actually perform a commit inside your test, then all bets are off!
For more reading, look at Spring's documentation on integration testing with JDBC.
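A minimal sketch of that approach, reusing the PersonDao example from above (the context file name and the Person constructor are assumptions):
import static org.junit.Assert.assertNotNull;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("/test-context.xml")
@Transactional   // each test runs in a transaction that Spring rolls back afterwards
public class PersonDaoIntegrationTest {

    @Autowired
    private PersonDao personDao;

    @Test
    public void storesAndFindsPerson() {
        personDao.store(new Person("101", "John", "Doe"));

        assertNotNull(personDao.findById("101"));
        // no explicit cleanup needed: the transaction is rolled back after the test
    }
}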
When writing JUnit tests, you can override two specific methods: setUp() and tearDown(). In setUp() you can set up everything that's necessary to test your code, so you don't have to set things up in each specific test case. tearDown() is called after each test case runs.
If possible, you could set it up so you open your database in the setUp() method, and then clear everything created by the tests and close the database in the tearDown() method. This is how we have done all testing when we have a database.
Here's an example:
@Override
protected void setUp() throws Exception {
super.setUp();
db = new WolfToursDbAdapter(mContext);
db.open();
//Set up other required state and data
}
@Override
protected void tearDown() throws Exception {
super.tearDown();
db.dropTables();
db.close();
db = null;
}
//Methods to run all the tests
Assuming you have access to the database: Another option is to create a backup of the database just before the tests and restore from that backup after the tests. This can be automated.
If you are using Spring + JUnit 4.x then you don't need to insert anything into the DB.
Look at the AbstractTransactionalJUnit4SpringContextTests class.
Also check out the Spring documentation for its JUnit support.
It's a bit draconian, but I usually aim to wipe out the database (or just the tables I'm interested in) before every test method execution. This doesn't tend to work as I move into more integration-type tests of course.
In cases where I have no control over the database, say I want to verify the correct number of rows were created after a given call, the test will count the number of rows before and after the tested call and make sure the difference is correct. In other words, take the existing data into account, then see how the tested code changed things, without assuming anything about the existing data. It can be a bit of work to set up, but it lets me test against a more "live" system.
In your case, are the specific IDs important? Could you generate the IDs on the fly, perhaps randomly, verify they're not already in use, then proceed?
I agree with Brainimus if you're trying to test against data you have pulled from a database. If you're looking to test modifications made to the database, another solution would be to mock the database itself. There are multiple in-memory database implementations that you can use to create a temporary database (for instance during JUnit's setUp()) and then remove the entire database from memory (during tearDown()). As long as you're not using any vendor-specific SQL, this is a good way to test modifying a database without touching your real production one.
Some good Java databases that offer in-memory support are Apache Derby, Java DB (which is really Oracle's flavor of Apache Derby), HyperSQL (better known as HSQLDB) and the H2 Database Engine. I have personally used HSQLDB to create in-memory mock databases for testing and it worked great, but I'm sure the others would offer similar results.
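For example, an in-memory HSQLDB can be created and discarded per test with plain JDBC (a sketch; the table layout is made up and the HSQLDB driver is assumed to be on the classpath):
import static org.junit.Assert.assertTrue;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class InMemoryDatabaseTest {

    private Connection connection;

    @Before
    public void createSchema() throws Exception {
        // "mem:" keeps the whole database in memory; it disappears on shutdown
        connection = DriverManager.getConnection("jdbc:hsqldb:mem:testdb", "sa", "");
        try (Statement s = connection.createStatement()) {
            s.execute("CREATE TABLE STUDENT (ID INT PRIMARY KEY, AGE INT)");
            s.execute("INSERT INTO STUDENT VALUES (1, 25)");
        }
    }

    @After
    public void dropDatabase() throws Exception {
        try (Statement s = connection.createStatement()) {
            s.execute("SHUTDOWN");   // discards the in-memory database completely
        }
        connection.close();
    }

    @Test
    public void readsSeededRow() throws Exception {
        try (Statement s = connection.createStatement();
             ResultSet rs = s.executeQuery("SELECT AGE FROM STUDENT WHERE ID = 1")) {
            assertTrue(rs.next());
        }
    }
}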