How should I write unit tests in a database-driven app? - Java

I have a database-driven financial application. Most of my methods and services depend on the database to do their work, so I am using Spring integration tests. Now I also want to write unit tests for my app. As we know, a unit test exercises a single component, and unit-testable code is better designed and less error prone. But if I use mocks or stubs, that won't be my real scenario. So my confusion is: are unit tests not worthwhile for a database-driven app?

You certainly can (and should) unit test a "database driven" app. Your goal when writing such tests is to verify the logic of the code you're testing, not the database layer. How you handle such code (e.g. with mocks, stubs, fakes, test databases, etc.) depends on exactly what you're testing.
Generally speaking you should structure your code so that it isn't tightly coupled to the database. A handful of classes, tops, are responsible for interacting with the database, while the rest are simply working with Java objects. Then when you test those classes you simply pass in test objects, no database required. For the handful of classes that need to directly interact with the database you can create mocks or use a fake database.
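For example, a pure calculation class can be tested with plain JUnit and plain objects (a minimal sketch; the class and figures are illustrative, not from the question):

import static org.junit.Assert.assertEquals;

import java.math.BigDecimal;
import java.math.RoundingMode;

import org.junit.Test;

public class InterestCalculatorTest {

    // Hypothetical calculator: pure logic, no database dependency.
    static class InterestCalculator {
        BigDecimal monthlyInterest(BigDecimal balance, BigDecimal annualRate) {
            return balance.multiply(annualRate)
                          .divide(BigDecimal.valueOf(12), 2, RoundingMode.HALF_UP);
        }
    }

    @Test
    public void computesMonthlyInterest() {
        InterestCalculator calc = new InterestCalculator();
        // Plain objects in, plain result out: no database required.
        assertEquals(new BigDecimal("8.33"),
                calc.monthlyInterest(new BigDecimal("1000"), new BigDecimal("0.10")));
    }
}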

Related

How to apply unit tests to DAOs in Java

I don't know how to apply unit tests to the data access layer. I always wonder whether the data access layer should be tested at all. In my company, we have a stable database that stores unit test data, and we test the data access layer by running the data access objects and checking the data they get against that stable database.
In order for the unit tests to pass, the data in the stable database can no longer be modified. I think there is a better solution to this. If I am not mistaken, mock objects cannot test the SQL statements and ResultSet mappings.
What is the best way to unit test the DAO? Is there a better way to do this with TDD?
First, by most definitions, "unit" tests do not depend on external systems like a database. You want to create what are called "functional" or "integration" tests. In practice these types of tests will be implemented in the same way as unit tests, using something like JUnit, but you should separate them from unit tests, which should run very fast and not break when your database is down or the data has changed.
Second, try to keep most of your business logic out of the DAOs and instead put it into a service POJO layer, so that you can test business logic without involving the database.
Next, the ideal way to set up testing for DAOs is to start with an empty database and load it with test data (often using the DAOs themselves), and then run your DAO tests against a known, writable test dataset. If you're fortunate enough to have a read-only database, the stable-database approach you outline will work, but most systems read from and write to the database.
Finally, it is valuable to test DAOs. Often the database queries are some of the most fragile parts of your system, and you don't want to wait until they are deployed to production to find out they are breaking.
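A minimal shape for that setup, using JUnit lifecycle methods and an in-memory HSQL database standing in for a dedicated test instance (CustomerDao and the schema are illustrative):

import static org.junit.Assert.assertEquals;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class CustomerDaoIntegrationTest {

    private Connection connection;
    private CustomerDao dao; // hypothetical DAO under test

    @Before
    public void startWithKnownData() throws Exception {
        // A dedicated, writable test database -- never a shared or production one.
        connection = DriverManager.getConnection("jdbc:hsqldb:mem:testdb", "sa", "");
        try (Statement st = connection.createStatement()) {
            st.execute("CREATE TABLE customer (id INT PRIMARY KEY, name VARCHAR(50))");
            st.execute("INSERT INTO customer VALUES (1, 'Alice')");
        }
        dao = new CustomerDao(connection);
    }

    @Test
    public void findsCustomerById() throws Exception {
        assertEquals("Alice", dao.findById(1).getName());
    }

    @After
    public void dropData() throws Exception {
        try (Statement st = connection.createStatement()) {
            st.execute("DROP TABLE customer"); // leave a clean slate for the next test
        }
        connection.close();
    }
}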
Strictly speaking, you're writing a functional test. To do this you're going to need a test database of one sort or another. Let's talk about your options.
HSQL/in-memory DB. Small and fast. Trivial to set up and get rolling, with great performance on unit-test-sized data. The downside is that unless you also deploy on these environments, you risk your unit tests passing while the actual code fails. It also means you can't use any SQL constructs which are not supported by both HSQL and your production DB. This can be mitigated to an extent by using Hibernate or similar. A good way to go if you only have very simple queries.
Mock out the DB calls entirely. Pointless unless you are doing too much heavy lifting in your DAO.
Use a test instance of your production DB. This will give you the best results in terms of accuracy. It lets you verify that all your calls work as expected and lets you use non-portable SQL. You can use something like DBUnit to load database data, or just use the DAO under test to do it. I would recommend this if you have large and nasty queries: ones with a lot of edge cases, roll-up views, and subtle behavior. The downside is that a real DB incurs performance penalties, since it is doing real work (transactions, index updates, rollback support).
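If you go the DBUnit route, the setup can be as small as this (DBUnit 2.x; the driver, URL, and dataset file names are illustrative):

import java.io.File;

import org.dbunit.IDatabaseTester;
import org.dbunit.JdbcDatabaseTester;
import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSetBuilder;
import org.junit.Before;

public class OrderDaoDbUnitTest {

    private IDatabaseTester tester;

    @Before
    public void loadDataSet() throws Exception {
        // Points at a dedicated test instance, never at production.
        tester = new JdbcDatabaseTester("oracle.jdbc.OracleDriver",
                "jdbc:oracle:thin:@testhost:1521:TEST", "testuser", "secret");
        IDataSet dataSet = new FlatXmlDataSetBuilder()
                .build(new File("src/test/resources/orders-dataset.xml"));
        tester.setDataSet(dataSet); // default operation is CLEAN_INSERT: wipe, then reload
        tester.onSetup();
    }

    // ... DAO tests then run against the known dataset loaded above ...
}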
Some comments/suggestions:
A DAO test is aimed at verifying whether the queries fired and the data retrieved are as per expectations; there should hardly be any business logic in a DAO to test.
Since the prime objective is to test the database interaction, mocking is not going to make it foolproof, especially for the edge cases.
In light of this, the approach you have right now is fair enough. I am not sure why you feel it's not good; a little elaboration would help.
If you are not comfortable using an external database, you can use Java's built-in JavaDB (Derby) for this. Note that there will be the overhead of creating the test data before you run the test.
For a JDBC-based project, the JDBC connection can be mocked, so that tests can be executed without a live RDBMS, with each test case isolated (no data conflicts).
This allows you to verify that the persistence code passes the proper queries/parameters (e.g. https://github.com/playframework/playframework/blob/master/framework/src/anorm/src/test/scala/anorm/ParameterSpec.scala) and handles JDBC results (parsing/mapping) as expected.
A framework like jOOQ, or my framework Acolyte, can be used for this: https://github.com/cchantep/acolyte .
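Acolyte and jOOQ ship their own APIs for this; purely as an illustration of the idea, the java.sql interfaces can also be stubbed directly with Mockito (a sketch; CustomerDao and its method are hypothetical):

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import org.junit.Test;

public class JdbcMockingTest {

    @Test
    public void mapsResultSetRowToName() throws Exception {
        Connection connection = mock(Connection.class);
        PreparedStatement statement = mock(PreparedStatement.class);
        ResultSet resultSet = mock(ResultSet.class);

        when(connection.prepareStatement("SELECT name FROM customer WHERE id = ?"))
                .thenReturn(statement);
        when(statement.executeQuery()).thenReturn(resultSet);
        when(resultSet.next()).thenReturn(true, false); // one row, then end of results
        when(resultSet.getString("name")).thenReturn("Alice");

        CustomerDao dao = new CustomerDao(connection); // hypothetical DAO
        assertEquals("Alice", dao.findNameById(1));

        verify(statement).setInt(1, 1); // proves the right parameter was bound
    }
}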

How do I efficiently test SQL/JPQL/HQL?

It is often said when unit testing not to test the database, as that is an integration test (see point 4).
However, SQL/JPQL/HQL encapsulates data-store-specific access logic, often as a free-form string. These free-form data access commands can easily go wrong and hence need to be tested.
How do I efficiently test this sort of logic?
The closest you can get to running a unit test against an SQL (or similar) query is to set up a SQLite database in memory and run against it.
While that still is technically an integration test, it runs almost as fast as a unit test should.
If you do so, just take care to note the slight differences between SQLite and your real database, and try to make your queries compatible with both.
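For example, a JPQL query can be exercised through a test persistence unit pointed at the in-memory database (a sketch; "test-pu" and the Account entity are assumed to be configured in your persistence.xml and mappings):

import static org.junit.Assert.assertEquals;

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

import org.junit.Test;

public class AccountQueryTest {

    @Test
    public void jpqlFindsAccountsByOwner() {
        // "test-pu" is assumed to be a persistence unit that targets
        // an in-memory database (SQLite, HSQL, H2, ...).
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("test-pu");
        EntityManager em = emf.createEntityManager();
        try {
            em.getTransaction().begin();
            em.persist(new Account("alice", 100)); // hypothetical mapped entity
            em.getTransaction().commit();

            // The JPQL string itself is what is really under test here.
            long count = em.createQuery(
                    "SELECT COUNT(a) FROM Account a WHERE a.owner = :owner", Long.class)
                    .setParameter("owner", "alice")
                    .getSingleResult();
            assertEquals(1L, count);
        } finally {
            em.close();
            emf.close();
        }
    }
}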
Hope this helps,
Assaf.
It is not a unit test, but there is nothing wrong with using a unit testing framework like NUnit to test your SQL. It IS important, though, that you keep it separated from the real unit tests. Real unit tests are fast and do not communicate with the outside world, nor do they attempt to alter it with updates, deletes, and inserts.

Tools and Methods for testing Service/DAO layers in Java

I am trying to figure out the best way(s) to test Service and DAO layers. So, a few sub questions...
When testing a service layer, is it best to test against a mock DAO layer or a "live" DAO layer pointed at a testing environment?
How should SQL in the DAO layer be tested when the only test database is in a shared environment (Oracle/DB2)?
How do you solve the paradox that DAO writes/updates need to be tested with DAO reads, which themselves also have to be tested?
I am looking for any good documentation, articles, or references in this area, along with any tools to help automate the process. I already know about JUnit for unit testing and Hudson for CI.
Get Growing Object-Oriented Software, Guided by Tests. It has some great tips about how to test database access.
Personally, I usually break the DAO tests in two: a unit test with a mocked database to test functionality on the DAO, and an integration test to test the queries against the DB. If your DAO only has database access code, you won't need a unit test.
One of the suggestions from the book that I took is that the (integration) test has to commit the changes to the DB. I learned to do this after using Hibernate and figuring out that the test was marked for rollback and the DB never got the insert statement. If you use triggers or any kind of validation (even FKs), I think this is a must.
Another thing: stay away from DbUnit. It's a great framework to start working with, but it becomes hellish once a project grows beyond tiny. My preference here is to have a set of Test Data Builder classes to create the data, and to insert it in the setup of the test or in the test itself.
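A minimal Test Data Builder might look like this (Customer and its fields are illustrative):

// Tests only override the fields they care about; everything else has a sane default.
public class CustomerBuilder {

    private String name = "Default Name";
    private String country = "US";
    private boolean active = true;

    public CustomerBuilder named(String name) {
        this.name = name;
        return this;
    }

    public CustomerBuilder inCountry(String country) {
        this.country = country;
        return this;
    }

    public CustomerBuilder inactive() {
        this.active = false;
        return this;
    }

    public Customer build() {
        return new Customer(name, country, active);
    }
}

// In a test setup:
// Customer customer = new CustomerBuilder().named("Alice").inCountry("DE").build();
// customerDao.insert(customer);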
And check out dbmigrate; it's not for testing, but it will help you manage scripts to upgrade and downgrade your DB schema.
In the scenario where the DB server is shared, I've created one schema/user per environment. Since each developer has his own "local" environment, he also owns one schema.
Here are my answers:
Use mock DAOs to test your services: much easier, much faster. Use EasyMock or Mockito or any other mock framework to test the service layer (see the sketch after this list).
Give each developer his own database schema in which to execute his tests. Such schemas are typically empty: the unit tests populate the database with a small test data set before running a test, and empty it once the test is completed. Use DBUnit for this.
If the reads work against a well-defined, static test data set (and the reads themselves are covered by tests), then you can rely on them to test the writes. You can also use ad-hoc queries or even DBUnit to check that the writes work as expected. The fact that the tests are not necessarily run in this order doesn't matter: if everything passes, then everything is OK.
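A sketch of the first point with Mockito (the DAO interface, service class, and figures are illustrative, not from the question):

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import java.util.Arrays;

import org.junit.Test;

public class AccountServiceTest {

    @Test
    public void sumsBalancesAcrossAccounts() {
        // Mocked DAO: the service logic is tested without touching a database.
        AccountDao dao = mock(AccountDao.class); // hypothetical DAO interface
        when(dao.findAccountsForCustomer(42L))
                .thenReturn(Arrays.asList(new Account(100), new Account(250)));

        AccountService service = new AccountService(dao); // hypothetical service
        assertEquals(350, service.totalBalanceFor(42L));

        // Also proves the service asked the DAO for the right customer.
        verify(dao).findAccountsForCustomer(42L);
    }
}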

Data-driven tests with JUnit

What do you use for writing data-driven tests in JUnit?
(My definition of) a data-driven test is a test that reads data from some external source (file, database, ...), executes one test per line/file/whatever, and displays the results in a test runner as if you had separate tests - the result of each run is displayed separately, not in one huge aggregate.
In JUnit 4 you can use the Parameterized test runner to do data-driven tests.
It's not terribly well documented, but the basic idea is to create a static method (annotated with @Parameters) that returns a Collection of Object arrays. Each of these arrays is used as the arguments for the test class constructor, and then the usual test methods can be run using fields set in the constructor.
You can write code in the @Parameters method to read and parse an external text file (or get data from another external source), and then you'd be able to add new tests by editing this file without recompiling the tests.
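A minimal example of the Parameterized runner (the data here is inline, but the @Parameters method could just as well read it from a file):

import static org.junit.Assert.assertEquals;

import java.util.Arrays;
import java.util.Collection;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import org.junit.runners.Parameterized.Parameters;

@RunWith(Parameterized.class)
public class AdditionTest {

    @Parameters
    public static Collection<Object[]> data() {
        // Each row becomes one constructor call, and thus one reported test run.
        return Arrays.asList(new Object[][] {
                { 1, 1, 2 },
                { 2, 3, 5 },
                { -1, 1, 0 }
        });
    }

    private final int a;
    private final int b;
    private final int expected;

    public AdditionTest(int a, int b, int expected) {
        this.a = a;
        this.b = b;
        this.expected = expected;
    }

    @Test
    public void addsCorrectly() {
        assertEquals(expected, a + b);
    }
}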
This is where TestNG, with its @DataProvider, shines. That's one reason why I prefer it to JUnit. The others are test dependencies and parallel threaded tests.
I use an in-memory database such as HSQLDB so that I can either pre-populate it with a "production-style" set of data, or start with an empty database and populate it with the rows I need for my testing. On top of that I write my tests using JUnit and Mockito.
I use a combination of DbUnit, jMock, and JUnit 4. Then you can either run them as a suite or separately.
You are better off extending TestCase with a DataDrivenTestCase that suits your needs.
Here is working example:
http://mrlalonde.blogspot.ca/2012/08/data-driven-tests-with-junit.html
Unlike parameterized tests, it allows for nicely named test cases.
I'm with @DroidIn.net; that is exactly what I am doing. However, to answer your question literally ("and displays the results in a test runner as if you had separate tests"), you have to look at the JUnit 4 Parameterized runner. DBUnit doesn't do that. If you have to do a lot of this, honestly, TestNG is more flexible, but you can absolutely get it done in JUnit.
You can also look at the JUnit Theories runner, but my recollection is that it isn't great for data driven datasets, which kind of makes sense because JUnit isn't about working with large amounts of external data.
Even though this is quite an old topic, I still thought of contributing my share.
I feel JUnit's support for data-driven testing is too limited and too unfriendly. For example, in order to use Parameterized we need to write our own constructor. With the Theories runner we do not have control over the set of test data that is passed to the test method.
There are more drawbacks, as identified in this blog post series: http://www.kumaranuj.com/2012/08/junits-parameterized-runner-and-data.html
There is now a comprehensive solution coming along pretty nicely in the form of EasyTest, a framework extended out of JUnit that is meant to give a lot of functionality to its users. Its primary focus is to perform data-driven testing using JUnit, although you are no longer required to depend on JUnit. Here is the GitHub project for reference: https://github.com/anujgandharv/easytest
If anyone is interested in contributing their thoughts/code/suggestions, then this is the time. You can simply go to the GitHub repository and create issues.
Typically, data-driven tests use a small testable component to handle the data (a file-reading object, or mock objects). For databases and resources outside of the application, mocks are used to simulate other systems (web services, databases, etc.). Typically what I see is that there are external data files that handle the data and the output. This way the data file can be added to the VCS.
We currently have a props file with our ID numbers in it. This is horribly brittle, but it's easy to get something going. Our plan is to initially make these ID numbers overridable by -D properties in our Ant builds.
Our environment uses a legacy DB with horribly intertwined data that is not loadable before a run (e.g. by DbUnit). Eventually we would like to get to the point where a unit test queries the DB to find an ID with the property under test, then uses that ID in the test. It would be slow, and is more properly called integration testing, not "unit testing", but we would be testing against real data to avoid the situation where our app runs perfectly against test data but fails with real data.
Some tests will lend themselves to being interface-driven.
If the database/file reads are retrieved through an interface call, then simply have your unit test supply its own implementation of the interface, and the test can return whatever data you want.
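For instance (the interface, implementation, and the commented-out Converter are all illustrative):

// Production code depends on this interface, not on a file or database directly.
public interface RateSource {
    double rateFor(String currency);
}

// The unit test supplies whatever data it wants through a trivial implementation.
public class FixedRateSource implements RateSource {
    @Override
    public double rateFor(String currency) {
        return "EUR".equals(currency) ? 1.10 : 1.00;
    }
}

// In the test:
// Converter converter = new Converter(new FixedRateSource()); // hypothetical class
// assertEquals(110.0, converter.convert(100, "EUR"), 0.001);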

End to End testing of Webservices

First-time poster and TDD adopter. :-) I'll be a bit verbose, so please bear with me.
I've recently started developing SOAP-based web services using the Apache CXF framework, Spring, and Commons Chain for implementing the business flow. The problem I'm facing is with testing the web services -- testing as in unit testing and functional testing.
My first attempt at unit testing was a complete failure. To keep the unit tests flexible, I used a Spring XML file to keep my test data in. Also, instead of creating instances of the "components" to be tested, I retrieved them from my Spring application context. The XML files which harbored the data quickly got out of hand; creating object graphs in XML turned out to be a nightmare. And since the components under test were picked from the Spring application context, each test run loaded all the components involved in my application, the DAO objects used, etc. Also, as opposed to the idea of a unit test concentrating on testing only one component, my unit tests started hitting databases, communicating with mail servers, etc. Bad, really bad.
I knew what I had done wrong and started to think of ways to rectify it. Following advice from one of the posts on this board, I looked up Mockito, the Java mocking framework, so that I could do away with using real DAO classes and mail servers and just mock the functionality.
With the unit tests a bit under control, this brings me to my second problem: the dependence on data. The web services I have been developing have very little logic but rely heavily on data. As an example, consider one of my components:
public class PaymentScheduleRetrievalComponent implements Command {

    private final BillingDAO billingDAO; // injected (constructor injection shown)

    public PaymentScheduleRetrievalComponent(BillingDAO billingDAO) {
        this.billingDAO = billingDAO;
    }

    public boolean execute(Context ctx) {
        Policy policy = (Policy) ctx.get("POLICY");
        List<PaymentSchedule> list = billingDAO.getPaymentStatementForPolicy(policy);
        ctx.put("PAYMENT_SCHEDULE_LIST", list);
        return false;
    }
}
A majority of my components follow the same route: pick a domain object from the context, hit the DAO [we are using iBATIS as the SQL mapper here], and retrieve the result.
So, now the questions:
- How are DAO classes tested, especially when a single insertion or update might leave the database in an "unstable" state [in cases where, let's say, 3 insertions into different tables actually form a single transaction]?
- What is the de facto standard for functional testing of web services which move around a lot of data, i.e. mindless insertions/retrievals from the data store?
Your personal experiences/comments would be greatly appreciated. Please let me know in case I've missed out some details on my part in explaining the problem at hand.
-sasuke
I would stay well away from the "Context as global hashmap" 'pattern' if I were you.
Looks like you are testing your persistence mapping...
You might want to take a look at: testing persistent objects without spring
I would recommend an in-memory database, such as HSQL, for running your unit tests against. You can use it to create your schema on the fly (for example, if you are using Hibernate, you can use your XML mapping files), then insert/update/delete as required before destroying the database at the end of your unit test. At no time will your test interfere with your actual database.
For your second problem (end-to-end testing of web services): I have successfully tested CXF-based services in the past. The trick is to publish your web service using a lightweight web server at the beginning of your test (Jetty is ideal), then use CXF to point a client at your web service endpoint, run your calls, and finally shut down the Jetty instance hosting the web service once your test has completed.
To achieve this, you can use the JaxWsServerFactoryBean (server-side) and JaxWsProxyFactoryBean (client-side) classes provided with CXF; see this page for sample code:
http://cwiki.apache.org/CXF20DOC/a-simple-jax-ws-service.html#AsimpleJAX-WSservice-Publishingyourservice
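A rough outline of that approach (the factory beans are real CXF classes; the service interface and implementation names are illustrative):

import org.apache.cxf.endpoint.Server;
import org.apache.cxf.jaxws.JaxWsProxyFactoryBean;
import org.apache.cxf.jaxws.JaxWsServerFactoryBean;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class PaymentServiceEndToEndTest {

    private static final String ADDRESS = "http://localhost:9000/payment";
    private Server server;

    @Before
    public void publishService() {
        // Publish the real implementation on an embedded endpoint.
        JaxWsServerFactoryBean serverFactory = new JaxWsServerFactoryBean();
        serverFactory.setServiceClass(PaymentService.class);    // hypothetical SEI
        serverFactory.setServiceBean(new PaymentServiceImpl()); // hypothetical impl
        serverFactory.setAddress(ADDRESS);
        server = serverFactory.create();
    }

    @Test
    public void invokesServiceOverHttp() {
        // Point a CXF client at the endpoint we just published.
        JaxWsProxyFactoryBean clientFactory = new JaxWsProxyFactoryBean();
        clientFactory.setServiceClass(PaymentService.class);
        clientFactory.setAddress(ADDRESS);
        PaymentService client = (PaymentService) clientFactory.create();
        // ... call client methods here and assert on the responses ...
    }

    @After
    public void shutDownEndpoint() {
        server.stop();
        server.destroy();
    }
}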
I would also give a big thumbs up to SoapUI for doing functional testing of your web service. JMeter is also extremely useful for stress testing web services, which is particularly important for those services doing database lookups.
First of all: is there a reason you have to retrieve the subject under test (SUT) from the Spring application context? For efficient unit testing you should be able to create the SUT without the context. It sounds like you have some hidden dependencies somewhere; that might be the root of some of your headaches.
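For instance, with the DAO injected through the constructor (as sketched in the question's component above), the component can be tested in isolation using Mockito and Commons Chain's ContextBase, no Spring context required:

import static org.junit.Assert.assertSame;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import java.util.Arrays;
import java.util.List;

import org.apache.commons.chain.Context;
import org.apache.commons.chain.impl.ContextBase;
import org.junit.Test;

public class PaymentScheduleRetrievalComponentTest {

    @Test
    public void putsScheduleListIntoContext() throws Exception {
        BillingDAO billingDAO = mock(BillingDAO.class);
        Policy policy = new Policy();
        List<PaymentSchedule> schedules = Arrays.asList(new PaymentSchedule());
        when(billingDAO.getPaymentStatementForPolicy(policy)).thenReturn(schedules);

        PaymentScheduleRetrievalComponent component =
                new PaymentScheduleRetrievalComponent(billingDAO);
        Context ctx = new ContextBase(); // plain Commons Chain context, no Spring
        ctx.put("POLICY", policy);

        component.execute(ctx);

        assertSame(schedules, ctx.get("PAYMENT_SCHEDULE_LIST"));
    }
}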
How are DAO classes tested, especially when a single insertion or update might leave the database in an "unstable" state [in cases where, let's say, 3 insertions into different tables actually form a single transaction]?
It seems you are worried about the database's consistency after you have run the tests. If possible, use your own database for testing, where you don't need to care about this. If you have such a sandbox database you can delete data as you wish. In this case I would do the following:
Flag all your fake data with some common identifier, like putting a special prefix in a field.
Before running the test, issue a delete statement that removes the flagged data. If there is none, nothing bad happens.
Run your single DAO test. After that, repeat step 2 for the next test. A sketch of this cleanup step follows below.
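A sketch of that cleanup step (the URL, table, and prefix are illustrative; the connection points at the sandbox database you fully control):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

import org.junit.Before;

public class PolicyDaoTest {

    private Connection connection;

    @Before
    public void deleteLeftOverFakeData() throws Exception {
        // Illustrative URL for the sandbox database.
        connection = DriverManager.getConnection(
                "jdbc:oracle:thin:@sandbox:1521:DEV", "dev", "dev");
        // All fake rows carry the agreed prefix, so one statement cleans them up.
        try (Statement st = connection.createStatement()) {
            st.executeUpdate("DELETE FROM policy WHERE policy_number LIKE 'TEST%'");
        }
    }

    // ... each test inserts fresh 'TEST'-prefixed rows and asserts on them ...
}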
What is the de facto standard for functional testing of web services which move around a lot of data, i.e. mindless insertions/retrievals from the data store?
I am not aware of any. From the question you are asking I can infer that you have the web service on one side and the database on the other. Split up the responsibilities: have separate test suites for each side, one just testing database access (as described above), the other just testing web service requests and responses. In this case it pays off to stub/fake/mock the layer talking to the network. Or consider https://wsunit.dev.java.net/.
If the program only shoves data in and out, there is not much behavior. If that is the case, then the hardest work is to unit test the database side and the web service side. The point is that you can do unit testing without the need for "realistic" data. For functional testing you will need hand-rolled data which is close to reality. This might be cumbersome, but if you have already unit tested the database and web service parts intensively, it should reduce the need for "realistic" test cases considerably.
First of all, let's make things clear.
In an ideal world, the lifecycle of the software you are building is something like this:
- somebody draws up a report with the customer, so you get a user story with examples of how the application should work
- you generalize the user story, so you get rules, which you call use cases
- you start to write a piece of functional (end-to-end) test, and it fails...
- after that you build the UI and mock out the services, so you get a green functional test and a specification of how your services should work...
- your job is to keep the functional test green and implement the services step by step, writing integration tests and mocking out dependencies with the same approach, until you reach the level of unit tests
- after that you do the next iteration with the use cases: write the next piece of functional test, and so on until the end of the project
- after that you run acceptance tests with the customer, who accepts the product and pays a lot
So what did we learn from this:
There are many types of tests (don't confuse them with each other):
functional tests - for testing the use cases (mock out nothing)
integration tests - for testing application, component, module, and class interactions (mock out the irrelevant components)
unit tests - for testing a single class in isolation from its environment (mock out everything)
user acceptance tests - the customer makes sure that they accept the product (manual functional tests, or a presentation built from automatic functional tests of the working system)
You don't need to test everything with functional tests and integration tests, because that is impossible. Test only the relevant parts with functional and integration tests, and test everything with unit tests! Familiarize yourself with the testing pyramid.
Use TDD, it makes life easier!
How are DAO classes tested, especially when a single insertion or update might leave the database in an "unstable" state [in cases where, let's say, 3 insertions into different tables actually form a single transaction]?
You don't have to test your database transactions. Assume that they work well, because the database developers have already tested them, and I am sure you don't want to write concurrency tests... The DB is an external component, so you don't have to test it yourself. You can write a data access layer to adapt the data storage to your system, and write integration tests only for those adapters. If you migrate databases, these tests will work against the adapters for the new database as well, because you write them against a specific interface... In every other kind of test (except functional tests) you can mock out your data access layer. Do the same with every other external component as well: write adapters and mock them out. Put these integration tests in a different test suite from the other tests, because they are slow due to database access, filesystem access, etc...
What is the de facto standard for functional testing of web services which move around a lot of data, i.e. mindless insertions/retrievals from the data store?
You can mock out your data store with an in-memory DB which implements the same storage adapters, until you have implemented everything else except the database. After that, implement the data access layer for the real database and test it with your functional tests as well. It will be slow, but it has to run only once, for example on every new release... If you need functional tests during development, you can mock the store out with the in-memory solution again... An alternative approach is to run only the affected functional tests during development, or to modify the settings of the test DB to make things faster, and so on... I am sure there are many test optimization solutions...
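The adapter idea might look like this (PolicyStore, Policy, and the implementations are illustrative; shown in one snippet, but separate files in practice):

import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// The rest of the system depends only on this adapter interface.
interface PolicyStore {
    void save(Policy policy);
    List<Policy> findByOwner(String owner);
}

// In-memory implementation that functional tests can use during development.
class InMemoryPolicyStore implements PolicyStore {

    private final Map<String, List<Policy>> byOwner = new HashMap<>();

    @Override
    public void save(Policy policy) {
        byOwner.computeIfAbsent(policy.getOwner(), k -> new ArrayList<>()).add(policy);
    }

    @Override
    public List<Policy> findByOwner(String owner) {
        return byOwner.getOrDefault(owner, new ArrayList<>());
    }
}

// A JdbcPolicyStore implementing the same interface gets its own integration
// tests, kept in a separate (slower) suite.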
I must say I don't really understand your exact problem. Is the problem that your database is left in an altered state after you've run the test?
Yes, there are actually two issues here. The first is the problem of the database being left in an inconsistent state after running the test cases. The second is that I'm looking for an elegant solution for end-to-end testing of web services.
For efficient unit testing you should be able to create the SUT without the context. It sounds like you have some hidden dependencies somewhere; that might be the root of some of your headaches.
That indeed was the root cause of my headaches, which I am now about to do away with using a mocking framework.
It seems you are worried about the database's consistency after you have run the tests. If possible, use your own database for testing, where you don't need to care about this. If you have such a sandbox database you can delete data as you wish.
This is indeed one of the solutions to the problem I mentioned in my previous post, but it might not work in all cases, especially when integrating with a legacy system in which the database/data isn't under your control, and in cases where some DAO methods require certain data to already be present in a given set of tables. Should I look into database unit testing frameworks like DBUnit?
In this case it pays off to stub/fake/mock the layer talking to the network. Or consider https://wsunit.dev.java.net/.
Ah, looks interesting. I've also heard of tools like SoapUI and the like which can be used for functional testing. Has anyone here had any success with such tools?
Thanks for all the answers and apologies for the ambiguous explanation; English isn't my first language.
-sasuke
