I am developing an application in Java with Spring and Hibernate, and I am looking for a way to quickly reload data between tests. The tests require a lot of data, which is generated and persisted through services. As a database I use an in-memory HSQLDB. The data generation process takes about 30 seconds, which is too long to simply run before each test.
So I was wondering whether it is a good idea, and whether it is possible with HSQLDB, to run the data loader once at the beginning of the test case or suite, then create a dump and restore it before each test. I can't find a way to create a dump in HSQLDB, especially for an in-memory database.
I appreciate all your help.
EDIT: I have to use a database. Let's consider these to be integration tests.
You can use DbUnit to load and clean the database before and after each test, but I don't think that will improve your performance.
Instead, I would ask why you need so much data for a unit test? 30 seconds isn't too bad for an integration test that actually hits a database, but I think you should strive to have unit tests that don't hit the database at all and instead use mock objects to simulate interacting with your services. Then you can have a few integration tests that actually use a database but those tests won't have to cover all scenarios since your faster unit tests should do that already.
You can use an HSQLDB file: database with the default MEMORY tables.
After generating the dataset, add the property files_readonly=true to the database.properties file. You then run the tests with this database. This ensures your tests run and modify the data the same way as a mem: database, but the changes made by the tests are not persisted when the test process ends. The original data is loaded in a few seconds in the fastest possible way.
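For example, assuming the generated data lives in a file: database named testdb, the relevant line in its database.properties would be:

```properties
# database.properties for the file: database holding the generated data.
# With this set, tests see and modify the data as usual, but nothing is
# written back to disk when the process ends.
files_readonly=true
```

The tests would then connect with a URL such as jdbc:hsqldb:file:testdb (the database name here is an assumption).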
Try using this annotation in your test class:
@TransactionConfiguration(transactionManager="nameOfYourTransactionManager", defaultRollback=true)
I found it here
We have a Java/Tomcat project using Spring and JPA, with a Maven build, JUnit for unit tests, and TestNG for integration tests.
Some integration tests require a database, so a new DB is created each time mvn verify is run. The problem now is populating it with test data.
Should I look into DbUnit, persist the objects myself using JPA, or use another approach?
How do I load test data into the DB each time the integration tests are run, so as to have a stable testing environment?
I'm using DbUnit with an in-memory database. It's helpful for loading specific test datasets, running the tests, verifying the database contents after each test, and cleaning up the database once a test has run.
The "pros" of DbUnit are that it lets you control the state of the database before and after each test. The "cons" are that you work with test datasets in a custom XML format rather than SQL. You can export from SQL to this custom XML format, but you will still occasionally need to edit the XML file by hand.
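For reference, a DbUnit flat XML dataset describes one row per element, with columns as attributes; the table and column names below are made up for illustration:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<dataset>
    <!-- each element is one row in the named table -->
    <customer id="1" name="Alice" email="alice@example.com"/>
    <customer id="2" name="Bob"   email="bob@example.com"/>
    <purchase id="10" customer_id="1" total="99.90"/>
</dataset>
```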
I take a copy of the live database and make the tests transactional, so they are rolled back each time.
We use Dbunit.
We load test data within JUnit in a @BeforeClass method,
and delete/clean data in both the @BeforeClass and the @AfterClass methods.
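A minimal sketch of that setup, assuming DbUnit 2.4+, an in-memory HSQLDB, and a dataset file named /test-data.xml (all of which are assumptions):

```java
import java.sql.Connection;
import java.sql.DriverManager;

import org.dbunit.database.DatabaseConnection;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSetBuilder;
import org.dbunit.operation.DatabaseOperation;
import org.junit.AfterClass;
import org.junit.BeforeClass;

public class CustomerDaoIT {

    private static IDatabaseConnection dbUnitConnection;
    private static IDataSet dataSet;

    @BeforeClass
    public static void loadTestData() throws Exception {
        Connection jdbc = DriverManager.getConnection("jdbc:hsqldb:mem:testdb", "sa", "");
        dbUnitConnection = new DatabaseConnection(jdbc);
        dataSet = new FlatXmlDataSetBuilder()
                .build(CustomerDaoIT.class.getResourceAsStream("/test-data.xml"));
        // delete whatever is there, then insert the dataset
        DatabaseOperation.CLEAN_INSERT.execute(dbUnitConnection, dataSet);
    }

    @AfterClass
    public static void cleanUp() throws Exception {
        DatabaseOperation.DELETE_ALL.execute(dbUnitConnection, dataSet);
        dbUnitConnection.close();
    }
}
```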
The problem is now to populate it to have test data
As each integration test might need to have different test data, I think that should be done as part of the set-up phase of each of the integration tests.
There are two patterns to consider: Fresh Fixture and Shared Fixture. The first provides better test isolation, as it recreates the test data for each test case, i.e. it assures a clean state. The latter introduces the risk of coupling between tests but is faster, as it reuses the same instances of test data across many tests. Both are described in detail in Meszaros: xUnit Test Patterns.
Regardless of the choice, it may be worth considering the random-data-driven approach designed on top of the test-arranger: How to organize tests with Test Arranger. To my knowledge, it's the cheapest approach with regard to maintenance costs and the required amount of code.
How should I unit test methods whose intent is to query the database and return some data? In other situations I can just mock the objects, but in this case, where I want to test whether they return the correct data, how do I check that in isolation from the DB? Should I use some kind of special DB? But then how do I configure that new DB to behave like the real one, with all the same columns, etc.?
Thanks.
Update: Thanks to everyone; your responses led me to the correct path. I finally used Derby. I just added a new persistence.xml for it. No other significant changes, and it seems to be working now.
One approach I've used with great success is to use:
Maven to build your project
Liquibase (or Flyway) to manage your database schema, and versioning it
H2 as an in-memory database that is started along with your tests.
There's a fair bit to learn there if you haven't used any of the above, but in my experience it was well worth it. This worked really well with a Spring application; with other setups your mileage may vary.
Maven should start an instance of the H2 database in-memory before doing any tests. In a Spring application, you can just specify your datasource with an H2 JDBC URL and it'll start automagically.
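As a sketch, a Spring XML datasource pointing at in-memory H2 might look like this (the bean class and pool choice are assumptions; DB_CLOSE_DELAY=-1 keeps the database alive for the whole JVM rather than until the last connection closes):

```xml
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource">
    <property name="driverClassName" value="org.h2.Driver"/>
    <property name="url" value="jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1"/>
    <property name="username" value="sa"/>
    <property name="password" value=""/>
</bean>
```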
You can use Liquibase to run a set of XML scripts to set up your database schema, and then a separate file to populate them with test data (either by specifying different files when running Liquibase, or by using the context attribute of each changeSet). This can be done with Maven, or in Spring using a specific Liquibase bean.
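Sketching the context idea (the table and data here are invented for illustration): a schema changeSet runs everywhere, while a test-data changeSet is tagged with context="test" and only applied when Liquibase is run with that context:

```xml
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog">

    <!-- schema: applied in every environment -->
    <changeSet id="1" author="dev">
        <createTable tableName="customer">
            <column name="id" type="bigint"/>
            <column name="name" type="varchar(255)"/>
        </createTable>
    </changeSet>

    <!-- test data: applied only when the "test" context is active -->
    <changeSet id="2" author="dev" context="test">
        <insert tableName="customer">
            <column name="id" valueNumeric="1"/>
            <column name="name" value="Alice"/>
        </insert>
    </changeSet>

</databaseChangeLog>
```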
From there you can test your application exactly as if it was a normal app. No need for mocking, and you get much more useful tests as a result. You may need to change your schema or otherwise work around SQL differences between H2 and your native RDBMS.
As an aside, I'm greatly in favour of these sorts of tests. In my experience mocking everything doesn't really gain you any interesting insights, and should be a last resort for when intra-build integration tests aren't possible. There are many that disagree with me though!
The question is what behavior do you need to unit test? If you mocked out the database then you've tested all the important logic. Your database adapter will either work or not work, which you can verify in integration/acceptance tests against a real database.
You can use DBUnit. It takes your current schema and you can easily mock your data.
I am wondering what people have found their best practice to be for testing Hibernate mappings and queries?
This cannot be done with Unit Testing, so my experience has been to write Integration Tests that solely tests the DAO layer downwards. This way I can fully test each Insert / Update / Delete / ReadQueries without testing the full end-to-end solution.
Whenever the integration test suite is run, it will:
Drop and re-create the database.
Run an import SQL script that contains a subset of data.
Run each test in a transactional context that rolls back the transaction. It can therefore be run multiple times, as an independent test or as part of a suite, and the same result is returned because the database is always in a known state.
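With Spring's test support, the transactional-rollback step might look like this (the context file, DAO, and entity names are assumptions; @Transactional on a Spring test class rolls back by default):

```java
import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml")
@Transactional  // each test method runs in a transaction that is rolled back
public class CustomerDaoIT {

    @Autowired
    private CustomerDao customerDao;

    @Test
    public void savesAndReloadsCustomer() {
        Customer saved = customerDao.save(new Customer("Alice"));
        Customer reloaded = customerDao.findById(saved.getId());
        assertEquals("Alice", reloaded.getName());
        // rollback leaves the database in its known imported state
    }
}
```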
I never test against a different "in memory" database, as there is always an equivalent development database to test against.
I have never had the need to use DBUnit.
Never use DbUnit for this. It's way too much overhead for this level of testing.
Especially if you're using Spring in your app, check out the Spring Test Framework to help manage your data-access tests, particularly the transaction management features.
An "equivalent development database" is fine, but an in-memory H2 database will blow away anything else for speed. That's important because, while the unit/integration status of these tests may be contested, they're tests you want to run a lot, so they need to be as fast as possible.
So my DAO tests look like this:
Spring manages the SessionFactory and TransactionManager.
Spring handles all transactions for test methods.
Hibernate creates the current schema in an in-memory H2 database.
Test all the save, load, delete, and find methods, doing field-for-field comparison on before and after objects. (E.g. create object foo1, save it, load it as foo2, verify foo1 and foo2 contain identical values.)
Very lightweight and useful for quick feedback.
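The H2/schema part of that setup is just a few Hibernate properties; the keys below are Hibernate's standard configuration keys, while the database name is an assumption:

```properties
hibernate.connection.driver_class=org.h2.Driver
hibernate.connection.url=jdbc:h2:mem:daotests;DB_CLOSE_DELAY=-1
hibernate.connection.username=sa
hibernate.dialect=org.hibernate.dialect.H2Dialect
# recreate the schema from the mappings on startup
hibernate.hbm2ddl.auto=create
```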
If you don't depend on proprietary RDBMS features (triggers, stored procedures, etc.), then you can easily and fully test your DAOs using JUnit and an in-memory database like HSQLDB. You'll need some rudimentary hibernate.cfg.xml emulation via a class (to initialize Hibernate with HSQLDB and load the hbm.xml files you want) and then pass the provided datasource to your DAOs.
Works well and provides real value to the development lifecycle.
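Such a bootstrap class might look roughly like this (Hibernate 3 style API; the mapping file name is an assumption):

```java
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public final class TestSessionFactory {

    public static SessionFactory create() {
        Configuration cfg = new Configuration()
            .setProperty("hibernate.connection.driver_class", "org.hsqldb.jdbcDriver")
            .setProperty("hibernate.connection.url", "jdbc:hsqldb:mem:daotests")
            .setProperty("hibernate.connection.username", "sa")
            .setProperty("hibernate.connection.password", "")
            .setProperty("hibernate.dialect", "org.hibernate.dialect.HSQLDialect")
            .setProperty("hibernate.hbm2ddl.auto", "create")
            .addResource("Customer.hbm.xml"); // the mappings under test
        return cfg.buildSessionFactory();
    }
}
```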
The way I do it is pretty similar to yours, except that I actually use in-memory databases like HSQLDB. It's faster and more portable than having a real database configured (one that runs on a standalone server). It's true that certain more advanced features won't work, as HSQLDB simply does not support them, but I've noticed that I hardly run into those when just integration testing my data access layer. When it does happen, I like to use the "jar" version of MySQL, which allows me to start a fully functional MySQL server from Java and shut it down when I'm done. This is not very practical, as the jar file is quite big:
http://dev.mysql.com/doc/refman/5.0/en/connector-mxj-configuration-java-object.html
but it's still useful in some instances.
It is often said that, when unit testing, you should not test the database, as that is an integration test (see point 4).
However, SQL/JPQL/HQL encapsulates data-store-specific logic, often as a free-form string describing how to access the data. Such free-form data access commands can easily be wrong and hence need to be tested.
How do I efficiently test this sort of logic?
The closest you can get to a unit test of an SQL (or similar framework) query is to set up an in-memory SQLite database and run against it.
While that still is technically an integration test, it runs almost as fast as a unit test should.
If you do so, just take care to note the slight differences between SQLite and your real database, and try to make your queries compatible with both.
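A minimal sketch, assuming the sqlite-jdbc driver is on the classpath (the table and query are illustrative):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class SqliteQueryTestSketch {
    public static void main(String[] args) throws Exception {
        // ":memory:" gives a fresh, private database per connection
        try (Connection conn = DriverManager.getConnection("jdbc:sqlite::memory:");
             Statement st = conn.createStatement()) {
            st.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)");
            st.execute("INSERT INTO customer (name) VALUES ('Alice')");
            try (ResultSet rs = st.executeQuery("SELECT name FROM customer WHERE id = 1")) {
                // assert on rs here, exactly as you would for the real query
            }
        }
    }
}
```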
Hope this helps,
Assaf.
It is not a unit test, but there is nothing wrong with using a unit testing framework like NUnit to test your SQL. It IS important, though, to keep it separated from the real unit tests. Real unit tests are fast and do not communicate with the outside world, nor do they attempt to alter it with updates, deletes, and inserts.
I am trying to figure out the best way(s) to test Service and DAO layers. So, a few sub questions...
When testing a service layer, is it best to test against a mock DAO layer or a "live" DAO layer pointed at a testing environment?
How should SQL in the DAO layer be tested when the only test database is in a shared environment (Oracle/DB2)
How do you solve the paradox that DAO writes/updates need to be tested via DAO reads, which themselves also have to be tested?
I am looking for any good documentation, articles, or references in this area, along with any tools to help automate the process. I already know about JUnit for unit testing and Hudson for CI.
Get Growing Object-Oriented Software, Guided by Tests. It has some great tips about how to test database access.
Personally, I usually break the DAO tests in 2, a unit test with a mocked database to test functionality on the DAO, and an integration test, to test the queries against the DB. If your DAO only has database access code, you won't need a unit test.
One of the suggestions from the book that I took is that the (integration) test has to commit its changes to the DB. I learned to do this after using Hibernate and figuring out that the test was marked for rollback and the DB never got the insert statement. If you use triggers or any kind of validation (even FKs), I think this is a must.
Another thing: stay away from DbUnit. It's a great framework to start working with, but it becomes hellish once a project grows beyond tiny. My preference here is to have a set of Test Data Builder classes to create the data, and to insert it in the setup of the test or in the test itself.
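As a sketch of the Test Data Builder idea (Customer and its fields are hypothetical stand-ins for your own entities): each builder carries sensible defaults, so a test only states the values it actually cares about:

```java
class Customer {
    final String name;
    final String email;
    final boolean active;

    Customer(String name, String email, boolean active) {
        this.name = name;
        this.email = email;
        this.active = active;
    }
}

class CustomerBuilder {
    // defaults keep test code focused on what matters for the test
    private String name = "Jane Doe";
    private String email = "jane@example.com";
    private boolean active = true;

    CustomerBuilder named(String name) { this.name = name; return this; }
    CustomerBuilder withEmail(String email) { this.email = email; return this; }
    CustomerBuilder inactive() { this.active = false; return this; }

    Customer build() { return new Customer(name, email, active); }
}
```

In a test's setup you would build the object this way and hand it to whatever persists it into the database.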
And check dbmigrate, it's not for testing, but it will help you to manage scripts to upgrade and downgrade your DB schema.
In the scenario where the DB server is shared, I've created one schema/user per environment. Since each developer has his own "local" environment, he also owns one schema.
Here are my answers :
Use mock DAOs to test your services. Much easier, much faster. Use EasyMock or Mockito or any other mock framework to test the service layer.
Give each developer their own database schema in which to execute the tests. Such schemas are typically empty: the unit tests populate the database with a small test data set before running a test, and empty it once the test is completed. Use DBUnit for this.
If the reads work against a well-defined, static, test data set (which you should unit-test), then you can rely on them to unit-test the writes. But you can also use ad-hoc queries or even DBUnit to test that the writes work as expected. The fact that the tests are not necessarily run in this order doesn't matter. If everything passes, then everything is OK.
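The first point (mocking the DAO under the service) might be sketched like this with Mockito; the service, DAO, and method names are assumptions:

```java
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.*;

import org.junit.Test;

public class CustomerServiceTest {

    @Test
    public void looksUpCustomerThroughDao() {
        // stub the DAO: no database involved
        CustomerDao dao = mock(CustomerDao.class);
        when(dao.findByName("Alice")).thenReturn(new Customer("Alice"));

        CustomerService service = new CustomerService(dao);

        assertEquals("Alice", service.findCustomer("Alice").getName());
        verify(dao).findByName("Alice"); // the service delegated to the DAO
    }
}
```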