When many integration tests perform database operations, how can you make sure, automatically (with some framework), that the state of the database is the same before and after every test?
I am not interested in the manual approach (@Before / @After).
Here is what we do at my company:
We use transactions to ensure that the DB is in the same state after the test as beforehand.
We use test scripts that ensure valid test data (e.g., inserting some additional rows into some tables for specific test scenarios, and updating some others). You can execute these scripts in setUp methods to reuse them across multiple test cases, or even define them in utility classes that can be reused by multiple test suites.
This works fine in many scenarios; however, it may be problematic if you try to test parts of your application that work with nested transactions.
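For illustration, a minimal sketch of the transactional approach using Spring's TestContext framework (the Spring config file and the UserDao bean are hypothetical): each test method runs inside a transaction that the framework rolls back automatically, so the database ends up exactly as it started.

    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.test.context.ContextConfiguration;
    import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
    import org.springframework.transaction.annotation.Transactional;

    import static org.junit.Assert.assertEquals;

    @RunWith(SpringJUnit4ClassRunner.class)
    @ContextConfiguration("classpath:test-context.xml") // hypothetical Spring config
    @Transactional // each test method runs in a transaction...
    public class UserDaoIT {

        @Autowired
        private UserDao userDao; // hypothetical DAO under test

        @Test
        public void insertedRowIsVisibleInsideTheTestTransaction() {
            userDao.insert(new User("alice"));
            assertEquals(1, userDao.countByName("alice"));
            // ...which is rolled back afterwards, leaving the DB untouched
        }
    }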
Related
I don't know how to apply unit tests to the data access layer. I always wonder whether the data access layer should be tested at all. At my company, we have a stable database to store unit test data, and we test the data access layer by running data access objects and checking the data they get from the stable database.
In order for the unit tests to keep passing, the data in the stable database cannot be modified anymore. I think there is a better solution to this. If I am not mistaken, mock objects cannot test SQL statements and ResultSet mappings.
What is the best way to unit test the DAO? Is there a better way to do this with TDD?
First, by most definitions, "unit" tests do not depend on external systems like a database. You want to create what are called "functional" or "integration" tests. In practice these will be implemented in the same way as unit tests, using something like JUnit, but you should keep them separate from unit tests, which should run very fast and not break when your database is down or its data has changed.
Second, try to keep most of your business logic out of DAOs and instead put it into a service POJO layer, so that you can test business logic without involving the database.
Next, the ideal way to set up testing for DAOs is to start with an empty database, load it with test data (often using the DAOs themselves), and then run your DAO tests against a known, writable test dataset. If you're fortunate enough to have a read-only database, the stable-database approach you outline will work, but most systems read from and write to the database.
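As a sketch of that setup phase (ProductDao, Product and testDataSource() are hypothetical names, used here only for illustration): empty the tables, then load a known dataset through the DAO itself before each test.

    import org.junit.Before;
    import org.junit.Test;

    import static org.junit.Assert.assertEquals;

    public class ProductDaoIT {

        private ProductDao productDao; // hypothetical DAO under test

        @Before
        public void startFromKnownDataset() {
            productDao = new ProductDao(testDataSource()); // assumed helper for the test DB
            productDao.deleteAll();                        // start from an empty database
            productDao.insert(new Product("widget", 42));  // then load known test data
            productDao.insert(new Product("gadget", 7));
        }

        @Test
        public void findByNameReturnsTheSeededRow() {
            assertEquals(42, productDao.findByName("widget").getPrice());
        }
    }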
Finally, it is valuable to test DAOs. Often the database queries are some of the most fragile parts of your system, and you don't want to wait until they are deployed to production to find out they are breaking.
Strictly speaking, you're writing a functional test. To do this you're going to need a test database of one sort or another. Let's talk about your options.
HSQL/in-memory DB. Small and fast. Trivial to set up and get rolling, with great performance on unit-test-sized data. The downside is that unless you deploy to these environments, you risk having your unit tests pass while the actual code fails. It also means you can't use any SQL constructs that are not supported by both HSQL and your production DB. This can be mitigated to an extent by using Hibernate or similar. A good way to go if you only have very simple queries (see the sketch after this list).
Mock out the DB calls entirely. Pointless unless you are doing too much heavy lifting in your DAO.
Use a test instance of your production DB. This will give you the best results in terms of accuracy of results. It will let you make sure that all your calls work as expected and lets you use non-portable SQL. You can use something like DBUnit to load database data, or just use the DAO under test to do it. I would recommend this if you have large and nasty queries: ones with a lot of edge cases, roll-up views and subtle behavior. The downside is that a real DB will incur performance penalties, since it does real work (transactions, index updates, rollback support).
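For the in-memory option above, a minimal sketch of spinning up HSQLDB for a test (assumes the hsqldb jar on the test classpath; the table and data are placeholders):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class InMemoryHsqldbDemo {
        public static void main(String[] args) throws Exception {
            // "mem:" URLs create a database that lives only inside this JVM
            Connection c = DriverManager.getConnection("jdbc:hsqldb:mem:testdb", "sa", "");
            try (Statement st = c.createStatement()) {
                st.execute("CREATE TABLE users (id INT PRIMARY KEY, name VARCHAR(50))");
                st.execute("INSERT INTO users VALUES (1, 'alice')");
                ResultSet rs = st.executeQuery("SELECT name FROM users WHERE id = 1");
                rs.next();
                System.out.println(rs.getString("name")); // prints: alice
            } finally {
                c.createStatement().execute("SHUTDOWN"); // discard the in-memory DB
                c.close();
            }
        }
    }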
Some comments/suggestions:
The DAO test is aimed at verifying whether the queries fired and the data retrieved match expectations. There should hardly be any business logic to test in a DAO.
Since the prime objective is to test the database interaction, mocking is not going to make it foolproof, especially for the edge cases.
In light of this, the approach that you have right now is fair enough. I am not sure why you feel it's not good; a little elaboration would help.
If you are not comfortable using an external database, you can use Java's built-in JavaDB for this. Please note that there will be the overhead of creating the test data first, before you run the test.
For a JDBC-based project, the JDBC connection can be mocked, so that tests can be executed without a live RDBMS and each test case is isolated (no data conflicts).
It allows you to verify that the persistence code passes the proper queries/parameters (e.g. https://github.com/playframework/playframework/blob/master/framework/src/anorm/src/test/scala/anorm/ParameterSpec.scala) and handles JDBC results (parsing/mapping) as expected.
Frameworks like jOOQ or my framework Acolyte can be used for this: https://github.com/cchantep/acolyte .
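Acolyte ships its own connection-handler DSL; as a generic illustration of the same idea, here is a sketch using Mockito to stub the JDBC objects (UserDao and findName are hypothetical):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    import org.junit.Test;

    import static org.junit.Assert.assertEquals;
    import static org.mockito.Mockito.*;

    public class JdbcMockTest {

        @Test
        public void daoSendsExpectedQueryAndMapsResult() throws Exception {
            Connection conn = mock(Connection.class);
            PreparedStatement ps = mock(PreparedStatement.class);
            ResultSet rs = mock(ResultSet.class);

            when(conn.prepareStatement("SELECT name FROM users WHERE id = ?")).thenReturn(ps);
            when(ps.executeQuery()).thenReturn(rs);
            when(rs.next()).thenReturn(true, false);   // one row, then end of results
            when(rs.getString("name")).thenReturn("alice");

            UserDao dao = new UserDao(conn);           // hypothetical DAO taking a Connection
            assertEquals("alice", dao.findName(1));

            verify(ps).setInt(1, 1);                   // proves the right parameter was bound
        }
    }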
We have a Java/Tomcat project using Spring and JPA, with a Maven build, JUnit for unit tests and TestNG for integration tests.
Some integration tests require a database, so a new DB is created each time mvn verify is run. The problem now is to populate it with test data.
Should I look into DbUnit, persist the objects myself using JPA, or take another approach?
How do I load test data into the DB each time integration tests are run, so as to have a stable testing environment?
I'm using DbUnit with an in-memory database. It's helpful for loading specific test datasets, running the tests, verifying the database contents after each test, and cleaning up the database after each test is run.
The "pros" of DbUnit are that it allows you to control the state of the database before and after each test. The "cons" are that you work with test datasets in a custom XML format, not SQL. You can export from SQL to this custom XML format, but you will still occasionally need to edit the XML file by hand.
I take a copy of the live database and make the tests transactional, so they are rolled back each time.
We use DbUnit.
We load test data within JUnit in a @BeforeClass method.
And we delete/clean the data in @BeforeClass and @AfterClass methods.
The problem is now to populate it to have test data
As each integration test might need to have different test data, I think that should be done as part of the set-up phase of each of the integration tests.
There are two patterns to consider: Fresh Fixture and Shared Fixture. The first one provides better test isolation, as it recreates the test data for each test case, assuring a clean state. The latter one introduces a risk of coupling between tests, but is faster, as it reuses the same instances of test data across many tests. Both are described in detail in Meszaros' xUnit Test Patterns.
Regardless of the choice, it may be worth considering the random-data-driven approach designed on top of test-arranger: How to organize tests with Test Arranger. To my knowledge, it's the cheapest approach with regard to maintenance costs and the required amount of code.
I am wondering what people have found to be their best practice for testing Hibernate mappings and queries?
This cannot be done with unit testing, so my experience has been to write integration tests that test only the DAO layer downwards. This way I can fully test each insert/update/delete/read query without testing the full end-to-end solution.
Whenever the integration test suite is run, it will:
Drop and re-create the database.
Run an import SQL script that contains a subset of data (sketched after this list).
Run each test in a transactional context that rolls back the transaction. A test can therefore be run multiple times, independently or as part of a suite, with the same result, because the database is always in a known state.
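One way to wire up the first two steps with plain Hibernate (a sketch; the config file name is a placeholder): with hibernate.hbm2ddl.auto set to create-drop, Hibernate drops and re-creates the schema for the lifetime of the SessionFactory and, on creation, executes an import.sql file from the classpath root, which can hold the subset of test data.

    import org.hibernate.SessionFactory;
    import org.hibernate.cfg.Configuration;

    public class TestSessionFactoryBuilder {

        public static SessionFactory build() {
            return new Configuration()
                    .configure("hibernate.test.cfg.xml") // placeholder test config
                    // drop + re-create the schema, and run classpath:/import.sql
                    .setProperty("hibernate.hbm2ddl.auto", "create-drop")
                    .buildSessionFactory();
        }
    }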
I never test against a different "in memory" database, as there is always an equivalent development database to test against.
I have never had the need to use DBUnit.
Never use DbUnit for this. It's way too much overhead for this level of testing.
Especially if you're using Spring in your app, check out the Spring Test Framework to help manage your data-access tests, particularly the transaction management features.
An "equivalent development database" is fine, but an in-memory H2 database will blow away anything else for speed. That's important because, while the unit/integration status of these tests may be contested, they're tests you want to run a lot, so they need to be as fast as possible.
So my DAO tests look like this:
Spring manages the SessionFactory and TransactionManager.
Spring handles all transactions for test methods.
Hibernate creates the current schema in an in-memory H2 database.
Test all the save, load, delete, and find methods, doing a field-for-field comparison on the before and after objects. (E.g. create object foo1, save it, load it as foo2, verify foo1 and foo2 contain identical values; sketched below.)
Very lightweight and useful for quick feedback.
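A sketch of one such test under the setup above (Foo and its accessors are hypothetical; the SessionFactory is assumed to be wired by the Spring test context to an in-memory H2 database, with the test transaction rolled back afterwards):

    import java.io.Serializable;

    import org.hibernate.Session;
    import org.hibernate.SessionFactory;
    import org.junit.Test;
    import org.springframework.beans.factory.annotation.Autowired;

    import static org.junit.Assert.assertEquals;

    public class FooDaoTest { // assumed to run under Spring's transactional test support

        @Autowired
        private SessionFactory sessionFactory;

        @Test
        public void saveThenLoadReturnsIdenticalFields() {
            Session session = sessionFactory.getCurrentSession();

            Foo foo1 = new Foo("bar", 42);
            Serializable id = session.save(foo1);

            session.flush();
            session.clear(); // evict foo1 so get() really hits the database

            Foo foo2 = (Foo) session.get(Foo.class, id);

            // field-for-field comparison of the before and after objects
            assertEquals(foo1.getName(), foo2.getName());
            assertEquals(foo1.getCount(), foo2.getCount());
        }
    }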
If you don't depend on proprietary RDBMS features (triggers, stored procedures, etc.), then you can easily and fully test your DAOs using JUnit and an in-memory database like HSQLDB. You'll need some rudimentary hibernate.cfg.xml emulation via a class (to initialize Hibernate with HSQLDB and load the hbm.xml files you want) and then pass the resulting datasource to your DAOs.
Works well and provides real value to the development lifecycle.
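A sketch of that "cfg.xml emulation via a class" (the property names are standard Hibernate settings; Foo.hbm.xml is a placeholder mapping file):

    import org.hibernate.SessionFactory;
    import org.hibernate.cfg.Configuration;

    public class HsqldbTestConfig {

        public static SessionFactory build() {
            Configuration cfg = new Configuration()
                    .setProperty("hibernate.connection.driver_class", "org.hsqldb.jdbcDriver")
                    .setProperty("hibernate.connection.url", "jdbc:hsqldb:mem:testdb")
                    .setProperty("hibernate.connection.username", "sa")
                    .setProperty("hibernate.connection.password", "")
                    .setProperty("hibernate.dialect", "org.hibernate.dialect.HSQLDialect")
                    .setProperty("hibernate.hbm2ddl.auto", "create-drop");
            cfg.addResource("Foo.hbm.xml"); // load the hbm.xml files you want
            return cfg.buildSessionFactory();
        }
    }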
The way I do it is pretty similar to your own, except that I use in-memory databases like HSQLDB. It's faster and more portable than having a real database configured (one that runs on a standalone server). It's true that HSQLDB won't work for certain more advanced features, as it simply does not support them, but I've noticed that I hardly run into those when just integration-testing my data access layer. When that is the case, I like to use the "jar" version of MySQL, which allows me to start a fully functional MySQL server from Java and shut it down when I'm done. This is not very practical, as the jar file is quite big:
http://dev.mysql.com/doc/refman/5.0/en/connector-mxj-configuration-java-object.html
but it's still useful in some instances.
It is often said that when unit testing you should not test the database, as that is an integration test (see point 4).
However, SQL/JPQL/HQL encapsulates data-store-specific logic, often as free-form strings describing how to access the data. These free-form data access commands can easily go wrong and hence need to be tested.
How do I efficiently test this sort of logic?
The closest you can get to running a unit test against an SQL (or similar framework) query is to set up a SQLite database in memory and run against it.
While that still is technically an integration test, it runs almost as fast as a unit test should.
If you do so, just take care to note the slight differences between SQLite and your real database, and try to make your queries compatible with both.
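For example, a sketch using the sqlite-jdbc driver (assumed to be on the classpath; the schema and query are placeholders). The ":memory:" URL gives a database that vanishes when the connection closes:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class SqliteQueryDemo {
        public static void main(String[] args) throws Exception {
            try (Connection c = DriverManager.getConnection("jdbc:sqlite::memory:");
                 Statement st = c.createStatement()) {
                st.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)");
                st.execute("INSERT INTO orders (total) VALUES (9.99), (20.01)");
                ResultSet rs = st.executeQuery("SELECT SUM(total) AS s FROM orders");
                rs.next();
                System.out.println(rs.getDouble("s")); // prints the sum (~30.0)
            } // the in-memory DB disappears when the connection closes
        }
    }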
It is not a unit test, but there is nothing wrong with using a unit testing framework like NUnit to test your SQL. But it IS important that you keep it separated from the real unit tests. Real unit tests are fast and do not communicate with the outside world, nor do they attempt to alter it with updates, deletes and inserts.
I am trying to figure out the best way(s) to test service and DAO layers. So, a few sub-questions...
When testing a service layer, is it best to test against a mock DAO layer or a "live" DAO layer pointed at a testing environment?
How should SQL in the DAO layer be tested when the only test database is in a shared environment (Oracle/DB2)?
How do you solve the paradox that DAO writes/updates need to be tested via DAO reads, which themselves also have to be tested?
I am looking for any good documentation, articles, or references in this area, along with any tools to help automate the process. I already know about JUnit for unit testing and Hudson for CI.
Get Growing Object-Oriented Software, Guided by Tests. It has some great tips about how to test database access.
Personally, I usually break the DAO tests into two: a unit test with a mocked database to test functionality on the DAO, and an integration test to test the queries against the DB. If your DAO only has database access code, you won't need a unit test.
One of the suggestions from the book that I took is that the (integration) test has to commit the changes to the DB. I learned to do this after using Hibernate and figuring out that the test was marked for rollback and the DB never got the insert statement. If you use triggers or any kind of validation (even FKs), I think this is a must.
Another thing: stay away from DbUnit. It's a great framework to start working with, but it becomes hellish when a project grows beyond tiny. My preference here is to have a set of Test Data Builder classes to create the data, and to insert it in the setup of the test or in the test itself.
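A minimal sketch of such a builder (Customer and its fields are hypothetical): the builder supplies sensible defaults, and each test overrides only the fields it cares about before inserting the result itself.

    public class CustomerBuilder {
        private String name = "default-name"; // sensible defaults for every field
        private String country = "US";

        public CustomerBuilder withName(String name) { this.name = name; return this; }
        public CustomerBuilder withCountry(String country) { this.country = country; return this; }

        public Customer build() {
            return new Customer(name, country);
        }
    }

    // Usage inside a test (customerDao is a hypothetical DAO):
    //   Customer c = new CustomerBuilder().withCountry("DE").build();
    //   customerDao.insert(c);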
And check out dbmigrate; it's not for testing, but it will help you manage scripts to upgrade and downgrade your DB schema.
In the scenario where the DB server is shared, I've created one schema/user per environment. Since each developer has their own "local" environment, they also own one schema.
Here are my answers:
Use mock DAOs to test your services. Much easier, much faster. Use EasyMock or Mockito or any other mock framework to test the service layer (see the sketch after this list).
Give each developer their own database schema to execute their tests. Such schemas are typically empty: the unit tests populate the database with a small test data set before running a test, and empty it once the test is completed. Use DBUnit for this.
If the reads work against a well-defined, static, test data set (which you should unit-test), then you can rely on them to unit-test the writes. But you can also use ad-hoc queries or even DBUnit to test that the writes work as expected. The fact that the tests are not necessarily run in this order doesn't matter. If everything passes, then everything is OK.
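For point 1, a minimal sketch with Mockito (OrderService, OrderDao and their methods are hypothetical): the DAO is mocked, so the service's logic is tested without touching any database.

    import java.util.Arrays;

    import org.junit.Test;

    import static org.junit.Assert.assertEquals;
    import static org.mockito.Mockito.*;

    public class OrderServiceTest {

        @Test
        public void totalsTheOrdersReturnedByTheDao() {
            OrderDao dao = mock(OrderDao.class);
            when(dao.findTotalsForCustomer(42L)).thenReturn(Arrays.asList(9.99, 20.01));

            OrderService service = new OrderService(dao);

            assertEquals(30.00, service.totalForCustomer(42L), 0.001);
            verify(dao).findTotalsForCustomer(42L); // the service queried the DAO once
        }
    }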