I've been trying to implement unit testing and currently have some code that does the following:
- query an external database, loading the results into a feed table
- query a view, which is a delta of my feed and data tables, updating the data table to match the feed table
My unit testing strategy is this: I have a testing database that I am free to manipulate.

1. In setUp(), load some data into my testing db.
2. Run my code, using my testing db as the source.
3. Inspect the data table, checking for counts and the existence/non-existence of certain records.
4. Clear the testing db, loading in a different set of data.
5. Run the code again.
6. Inspect the data table again.
Obviously I have the data sets that I load into the source db set up such that I know certain records should be added, deleted, updated, etc.
It seems like this is a bit cumbersome, though, and there should be an easier way. Any suggestions?
Is it your intent to test the view which generates the deltas, or to test that your code correctly adds, deletes and updates in response to the view?
If you want to test the view, you could use a tool like DBUnit to populate your feed and data tables with various data whose delta you've manually calculated. Then, for each test you would verify that the view returns a matching set.
If you want to test how your code responds to diffs detected by the view, I would try to abstract away database access. I imagine a Java method to which you can pass a result set (or a list of POJOs/DTOs) and which returns a list of parameter Object arrays (again, or POJOs) to be added. Other methods would parse the diff list for items to be removed and updated. You could then create a mock result set or POJOs, pass them to your code, and verify that the correct parameters are returned, all without touching a database.
I think the key is to break your process into parts and test each of those as independently as possible.
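For instance, the delta step above can be broken out into a pure function that classifies feed rows against data rows, testable without any database. This is a hedged sketch; all names here are hypothetical, not from the question:

```java
import java.util.*;

// Hypothetical sketch: classify feed-vs-data differences in memory,
// so the add/update/remove logic can be tested without a database.
public class DeltaCalculator {

    public static class Delta {
        public final List<String> toAdd = new ArrayList<>();
        public final List<String> toRemove = new ArrayList<>();
        public final List<String> toUpdate = new ArrayList<>();
    }

    // feed and data each map a record key to its payload (e.g. a row rendered as a string).
    public static Delta diff(Map<String, String> feed, Map<String, String> data) {
        Delta delta = new Delta();
        for (Map.Entry<String, String> e : feed.entrySet()) {
            String existing = data.get(e.getKey());
            if (existing == null) {
                delta.toAdd.add(e.getKey());        // in feed, not yet in data
            } else if (!existing.equals(e.getValue())) {
                delta.toUpdate.add(e.getKey());     // present in both, payload changed
            }
        }
        for (String key : data.keySet()) {
            if (!feed.containsKey(key)) {
                delta.toRemove.add(key);            // in data, no longer in feed
            }
        }
        return delta;
    }
}
```

A test then feeds in small in-memory maps and asserts on the three lists, with no setUp() database loading at all.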
DbUnit will meet your needs. One thing to watch out for: they have switched to SLF4J as their logging facade instead of JCL. You can configure SLF4J to forward the logging to JCL, but be warned that if you are using Maven, DbUnit pulls in its NOP log provider by default, so you will have to use an exclusion. I blogged about this conflict recently.
I use DbUnit, but I also work very hard not to have to test against the DB.
Tests that go against the database should only exist for the purpose of testing the database interface.
So I have mock DB connections whose data I can set, for use in all the rest of my tests.
Apart from the already suggested DBUnit, you may want to look into Unitils. It uses DBUnit, but provides more than that (quoting from the site):
- Automatic maintenance of databases, with support for incremental, repeatable and post-processing scripts
- Automatically disable constraints and set sequences to a minimum value
- Support for Oracle, Hsqldb, MySql, DB2, Postgresql, MsSql and Derby
- Simplify test database connection setup
- Simple insertion of test data with DBUnit
- Run tests in a transaction
- JPA entity manager creation and injection for hibernate, toplink and …
- Hibernate SessionFactory creation and session …
- Automatically test the mapping of JPA entities / hibernate mapped objects with the database
If you are using Maven, one option is to use the sql-maven-plugin. It allows you to run database initialization/population scripts during the maven build cycle.
private DSLContext context;

public MyClass getClassByName(String name) {
    return context.selectFrom(TABLE)
                  .where(TABLE.NAME.equal(name))
                  .fetchOneInto(MyClass.class);
}
I have this kind of function, and I need to write a unit test for this select query in jOOQ. Does anyone have any ideas?
Testing clients of your methods
If you want to test the logic of that method's callers, you could mock the method with a third-party library like Mockito. This will allow you to produce a set of expected return values for a set of known input String name values.
You can also integration test everything as shown below, that works for your entire application.
You could try to mock jOOQ itself (and jOOQ offers such tooling), but I highly recommend against it. Sooner than later, you'll be implementing an entire RDBMS.
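To make the first option concrete, here is a hedged sketch of putting the query behind an interface so callers can be tested against a canned implementation. A hand-rolled stub is used instead of Mockito so the example stands alone; all names (ClassRepository, MyClass, ClassService) are hypothetical and merely mirror the question:

```java
import java.util.*;

// Hypothetical sketch: the jOOQ query lives behind an interface, so callers
// can be tested against a stub (Mockito would achieve the same with less code).
interface ClassRepository {
    MyClass getClassByName(String name);
}

class MyClass {
    final String name;
    MyClass(String name) { this.name = name; }
}

// The caller under test: it depends only on the interface, not on jOOQ.
class ClassService {
    private final ClassRepository repository;
    ClassService(ClassRepository repository) { this.repository = repository; }

    boolean exists(String name) {
        return repository.getClassByName(name) != null;
    }
}

// A stub returning canned values for known inputs.
class StubClassRepository implements ClassRepository {
    private final Map<String, MyClass> canned = new HashMap<>();
    void returnFor(String name, MyClass value) { canned.put(name, value); }
    public MyClass getClassByName(String name) { return canned.get(name); }
}
```

The production implementation of ClassRepository would contain the actual jOOQ code; the query itself is then covered by the integration tests described below, not by the stub.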
Testing your query's correctness
If you want to be sure that your query itself is correct and doesn't produce, e.g., undesired cartesian products or null values, you should run integration tests.
Ideally, your integration tests are as close as possible to your production environment. For example, if you're using PostgreSQL, then you should run this query on an actual PostgreSQL instance with a known data set. A good utility for running such tests is testcontainers, but there are also other ways to automate tests against an actual database instance.
A less recommended way (but faster and maybe more convenient if your queries are simple) would be to run your integration tests on an in-memory database, such as H2. This is faster, but the price is high:
You can no longer use vendor specific features of your production database product
You will have to tweak your database schema to the least common denominator of what's supported between your production database and the test database, e.g. data types, etc.
However, this is a viable option if your application supports more than one production database product, in case of which the above two caveats are a problem you have in production anyway.
I'd still use testcontainers for most tests, though. Here's a quick example how easy it is to set up code generation with testcontainers, for example: https://github.com/jOOQ/jOOQ/tree/main/jOOQ-examples/jOOQ-testcontainers-example
I am trying to drop-and-create tables multiple times using JPA/EclipseLink (2.5.1) on a JUnit test by calling createEntityManager on an EntityManagerFactory instantiated multiple times. However, the schema is dropped/created only once. How to make EclipseLink drop/create the schema every time?
My ultimate goal is to have the db tables in a known state (i.e. empty) for each test. Is there a better way to do that?
With EclipseLink, I have used
properties.put("eclipselink.ddl-generation", "drop-and-create-tables");
properties.put("eclipselink.ddl-generation.output-mode", "database");
properties.put("eclipselink.deploy-on-startup", "true");
Then
JpaHelper.getEntityManagerFactory(em).refreshMetadata(properties);
This drops and recreates tables for all entities in the persistence unit. As mentioned elsewhere, calling this multiple times is expensive; you are better off clearing data before/after a test using JPA bulk delete queries:
"DELETE from EntityName entity"
Recreating the schema on every test run will slow down your tests considerably. You simply need to ensure your data is in a 'known state', in your case all tables empty. So:
Run your tests in a transaction marked as 'rollback only' so the database returns to a known state after each execution, and/or have a look at either DBUnit (http://dbunit.sourceforge.net/index.html) or DbSetup (http://dbsetup.ninja-squad.com/), both of which you can use to put your database in a known state before each test run.
It might be overkill for your application, but I've used Liquibase to handle versioning of DB schema and table contents. It should be straightforward to tie in pre/post hooks for your JUnit code.
Here's the scenario:
I am working on a DAO object which uses hibernate criteria API to form a number of complicated queries to perform certain tasks on the database (keyword search across multiple fields for example).
We need to unit test this to ensure that the generated query is correct for various scenarios. One way of testing it (which could be preferable) would be to test that the Hibernate criteria is created correctly by checking it at the end and mocking the database interaction. However, this is not desirable: firstly, it's kind of cheating (it merely duplicates what the code is doing), and secondly, it doesn't check whether the criteria itself causes Hibernate to barf or causes issues when it goes to the database.
The remaining option is then to run the query against a test database. However, for historical reasons there is no static test database (one that could be checked in as part of the code, for example), and the remit of my project does not allow me to embark on creating one; we have to be content with testing against a shared development database that's periodically refreshed with production data.
When these refreshes happen, the data behind the tests can change too, which makes our unit tests brittle. We can get around it by not using exact numbers in tests, but it's not really adequate testing that way.
The question is then: what do people do in cases like this to make tests less brittle? One option I have in mind is to run a native SQL query that does the same thing (behaviourally; it doesn't have to be exactly the same as the query generated by Hibernate) to get the expected number, and then run the DAO version to see if it matches. This way, the behaviour of the query can always be implemented in the initial native SQL and you will always have the correct numbers.
Any feedback on this or other ideas on how to manage this situation would be greatly appreciated.
A.
UPDATE:
With regards to the hsqldb/h2/derby suggestions, I am familiar with them, but the company is not ready to go down that route just yet, and doing it piecemeal for just one test case won't be suitable.
With regards to my earlier suggestion I would like to elaborate a bit more - consider this scenario:
I want to ensure that my relatively complicated keyword search returns 2100 matches for "John Smith".
In order to find the expected number, I would have analyzed my database and found the number using a SQL query. What is the downside of having that query as part of the test, so that you always know you are testing the behaviour of the criteria?
So basically the question is: if for some reason you could not have a static data set for testing, how would you perform your integration tests in a non-brittle way?
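The cross-check idea above can be captured in a small helper. This is a hypothetical sketch: the two suppliers stand in for the hand-written native SQL count and the size of the DAO's result, so the expectation travels with the test rather than being hard-coded:

```java
import java.util.function.IntSupplier;

// Hypothetical sketch: compare the row count from a hand-written native SQL
// query against the count produced by the DAO under test. The data may change
// between refreshes, but both sides are evaluated against the same data set.
public class CrossCheck {

    public static void assertSameCount(IntSupplier nativeSqlCount, IntSupplier daoCount) {
        int expected = nativeSqlCount.getAsInt(); // e.g. run "select count(*) ..." over JDBC
        int actual = daoCount.getAsInt();         // e.g. dao.keywordSearch("John Smith").size()
        if (expected != actual) {
            throw new AssertionError("expected " + expected + " rows but DAO returned " + actual);
        }
    }
}
```

The caveat raised in the answers below still applies: the native SQL must be maintained alongside the schema, and a shared bug in both queries would go undetected.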
One approach could be to use an in-memory database like Apache Derby or HSQLDB, and prepopulate it with data before the test starts, using DBUnit.
UPDATE: Here is a nice article about the approach.
I agree with Andrey and Bedwyr that the best approach in the long term is to create an hsqldb database specifically for testing. If you don't have the option of doing that, then your solution seems like an appropriate one. You can't test everything, but you don't want to test nothing either. I've used this approach a few times for testing web services against integration databases etc. But remember that this database has to be maintained as well, if you add new columns etc.
You have to decide what you're trying to test. You don't want to test hibernate, you don't want to test that the database is giving what you've asked for (in terms of SQL). In your tests, you can assume that hibernate works, as does the database.
You say:
We need to unit test this to ensure that the generated query is correct for various scenarios. One way of testing it (which could be preferable) would be to test that the Hibernate criteria is created correctly by checking it at the end and mocking the database interaction. However, this is not desirable: firstly, it's kind of cheating (it merely duplicates what the code is doing), and secondly, it doesn't check whether the criteria itself causes Hibernate to barf or causes issues when it goes to the database.
Why should Hibernate barf on the criteria you give it? Because you're giving it the wrong criteria. This is not a problem with Hibernate, but with the code that is creating the criteria. You can test that without a database.
It has problems when it gets to the database? Hibernate, in general, creates the sql that is appropriate to the criteria and database dialect you give it, so again, any problem is with the criteria.
The database does not match what hibernate is expecting? Now you are testing that the criteria and the database are aligned. For this you need a database. But you're not testing the criteria any more, you're testing that everything is aligned, a different sort of test.
So actually, it seems to me you're doing an integration test, that the whole chain from the criteria to the structure of the database works. This is a perfectly valid test.
So, what I do in my tests is create another connection to the database (JDBC) to get information. I execute SQL to get the number of rows etc., or to check that an insert has happened.
I think your approach is a perfectly valid one.
However, for historical reasons there is no static test database (one that could be checked in as part of the code, for example) and the remit of my project does not allow me to embark on creating one
All you need to do is fire up H2 or similar - put some entities in it and execute your integration tests. Once you've done this for a few tests you should be able to extract a data setup utility that creates a schema with some test data that you can use for all the integration tests if you feel the need.
If I have a method which establishes a database connection, how could this method be tested? Returning a bool in the event of a successful connection is one way, but is that the best way?
From a testability standpoint, is it best to have the connection method as one method and the method to get data back as a separate method?
Also, how would I test methods which get back data from a database? I may do an assert against expected data but the actual data can change and still be the right resultset.
EDIT: For the last point, to check data: if it's supposed to be a list of cars, then I can check that they are real car models. Or if they are a bunch of web servers, I can have a list of existing web servers on the system, return that from the code under test, and compare. If the results are different, the data is the issue but not the query?
Thanks
First, if you have involved a database, you are no longer unit testing. You have entered integration (for connection configuration) or functional testing land. And those are very different beasts.
The connection method should definitely be separate from the data fetch. In fact, your connection should come from a factory so that you can pool it. As far as testing the connection goes, really all you can test is that your configuration is correct, by making a connection to the DB. You shouldn't be trying to test your connection pool, as that should probably be a library someone else wrote (dbcp or c3p0). Furthermore, you probably can't test this, as your unit/integration/functional tests should NEVER connect to a production-level database.
As for testing that your data access code works: that's functional testing, and it involves a lot of framework and support. You need a separate testing DB, the ability to create the schema on the fly during testing, insert any static data into tables, and return the database to a known clean state after each test. Furthermore, this DB should be instantiated and run in such a way that two people can run the tests at once, especially if you have more than one developer plus an automated testing box.
Asserts should be made against data that is either static (a list of states, for example, which doesn't change often) or data that is inserted during the test and removed afterwards so it doesn't interfere with other tests.
EDIT: As noted, there are frameworks to assist with this. DBUnit is fairly common.
You can grab ideas from here. I would go for mock objects when unit testing DB.
Otherwise, if the application is huge and you are running long and complex unit tests, you can also virtualize your DB server and easily revert it to a saved snapshot to run your tests again in a known environment.
Using my Acolyte framework (https://github.com/cchantep/acolyte) you can mimic any JDBC-supported DB, describing cases (how to handle each query/update executed) and which result set/update count to return in each case (fixtures described as row lists for queries, counts for updates).
Such a connection can be used directly by passing the instance wherever JDBC is required, or registered with a unique id in the JDBC URL namespace jdbc:acolyte: so that it is available to code that obtains its connection through JDBC URL resolution.
Whichever way the connection is created, Acolyte keeps each one isolated, which is right for unit tests (no extra cleanup to do on a test DB).
As persistence cases can be dispatched to different isolated connections, you no longer need a big, all-in-one, hard-to-manage test DB (or fixtures file): it can easily be split across various connections, e.g. one per persistence method/module.
My Acolyte framework is usable either in pure Java or in Scala.
If the goal is to test method functionality, not the database SP or SQL statement, then you may want to consider dependency injection in the sense of a data provider interface. In other words, your class uses an interface with methods returning data. The default implementation uses the database. The unit test implementation has several options:

- mocking (NMock, Moq, etc.): a great way, I love mocking
- an in-memory database
- a static database with static data

I don't like anything but the first. As a general rule, programming to interfaces is always much more flexible.
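A minimal sketch of that data provider interface in plain Java (all names are hypothetical): the test supplies an in-memory implementation, while production would supply a database-backed one behind the same interface:

```java
import java.util.*;

// Hypothetical sketch: the class under test depends on a data-provider
// interface, so unit tests never touch a real connection.
interface CarProvider {
    List<String> findModelsByMake(String make);
}

// Test implementation: canned data, no database.
class InMemoryCarProvider implements CarProvider {
    private final Map<String, List<String>> data;
    InMemoryCarProvider(Map<String, List<String>> data) { this.data = data; }
    public List<String> findModelsByMake(String make) {
        return data.getOrDefault(make, List.of());
    }
}

// The class under test depends only on the interface; the production
// implementation (not shown) would run a real query.
class CarCatalog {
    private final CarProvider provider;
    CarCatalog(CarProvider provider) { this.provider = provider; }
    int countModels(String make) { return provider.findModelsByMake(make).size(); }
}
```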
For testing that a database connection can be established: you could let the connection execute a very simple SQL statement as a test. Some application servers have such a configuration; the following snippet is from a JBoss DB configuration:
<!-- sql to call on an existing pooled connection when it is obtained from pool -->
<check-valid-connection-sql>some arbitrary sql</check-valid-connection-sql>
I need to run several integration tests on a Mongo database using Java, and I was looking for a DbUnit-like solution (DbUnit does this for relational databases) that can populate my database with custom data and reset the state after each run.
Any tips?
Thanks
To start off, I don't know of any direct equivalent to DBUnit for Mongo. Mongo is still a new product, so you'll probably have to "roll your own" for some of this stuff.
However, there are several features of Mongo that should make this easy:
It runs with minimal permissions
It can simply "run" on prepared files
It doesn't really have a schema (except for indexes)
It can work off JSON data
Based on your dataset there are lots of ways to do this. But the basic tools are there.
You should be able to start a version specifically for your test, from your test.
You should be able to import "state" data from JSON file.
You should be able to apply any server-side functions from a JS file (from scratch).
So the whole thing should be pretty straightforward. Though you will have to write much of the glue code.
Here's what I do: connect to a known (often shared) mongo instance, but create a new unique database for each test run using a UUID. You don't have to worry about creating collections, as they are created lazily when you store documents in them for the first time. Create any indexes you need in the constructor of the repository or DAO; mongo index creations succeed immediately without doing any work if the index already exists. Obviously, you don't need to worry about schema migrations ;-)
This scheme requires you to start from an empty datastore, but it's a known state, so it's easy enough to populate it in the setup phase of your tests if need be.
When the test is done, delete the entire database in the teardown phase.
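The unique-database-per-run scheme only needs a collision-free name; a minimal sketch (the helper name is hypothetical):

```java
import java.util.UUID;

// Hypothetical sketch of the naming scheme described above: each test run gets
// its own database on the shared Mongo instance, so concurrent runs never collide.
public class TestDatabaseName {

    public static String next(String prefix) {
        // A UUID suffix keeps the name unique and well within Mongo's length limit.
        return prefix + "_" + UUID.randomUUID();
    }

    // In the test setup:    MongoDatabase db = client.getDatabase(TestDatabaseName.next("it"));
    // In the teardown:      db.drop();
}
```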
This question has been answered here, with an approach that starts and stops an instance between each test:
https://stackoverflow.com/a/9830861/82609
But starting/stopping between each test seems to slow down integration tests, so you'd be better off starting/stopping it once for the whole test suite:
https://stackoverflow.com/a/14171993/82609
I know this question is old, but maybe my answer will be useful for someone.
Here is a simple util that I wrote recently: https://github.com/kirilldev/mongomery
It is very simple to populate the db with data from a JSON file:
//db here is a com.mongodb.DB instance
MongoDBTester mongoDBTester = new MongoDBTester(db);
mongoDBTester.setDBState("predefinedTestData.json");
To check the db state:
mongoDBTester.assertDBStateEquals("expectedTestData.json");
It supports placeholders for expected files which can be useful in some situations.
You can use nosql-unit, which has a MongoDB module.