I am trying to drop and create tables multiple times using JPA/EclipseLink (2.5.1) in a JUnit test, by calling createEntityManager on an EntityManagerFactory that is instantiated multiple times. However, the schema is dropped/created only once. How can I make EclipseLink drop and create the schema every time?
My ultimate goal is to have the db tables in a known state (i.e. empty) for each test. Is there a better way to do that?
With EclipseLink, I have used
properties.put("eclipselink.ddl-generation", "drop-and-create-tables");
properties.put("eclipselink.ddl-generation.output-mode", "database");
properties.put("eclipselink.deploy-on-startup", "true");
Then
JpaHelper.getEntityManagerFactory(em).refreshMetadata(properties);
This drops and recreates tables for all entities in the persistence unit. As mentioned elsewhere, calling this multiple times is expensive; you are better off clearing data before/after a test using JPA bulk delete queries:
"DELETE from EntityName entity"
Recreating the schema on every test run will slow down your tests considerably. You simply need to ensure your data is in a 'known state', in your case all tables empty. So:
Run your tests in a transaction marked as 'rollback only' so the database returns to a known state after each execution (see the sketch after this list).
and/or
have a look at DBUnit, which you can use to put your database in a known state before each test run:
http://dbunit.sourceforge.net/index.html
or
http://dbsetup.ninja-squad.com/
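Here is a minimal sketch of the rollback approach using Spring's JUnit 4 test support (the context file and table are assumptions):

import static org.junit.Assert.assertEquals;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml") // assumed test context
@Transactional // Spring rolls back the test transaction by default
public class RollbackExampleTest {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    @Test
    public void insertIsRolledBackAfterTheTest() {
        jdbcTemplate.update("INSERT INTO customer (id, name) VALUES (?, ?)", 1L, "John Smith");
        int count = jdbcTemplate.queryForObject("SELECT COUNT(*) FROM customer", Integer.class);
        assertEquals(1, count);
        // no cleanup needed: the transaction is rolled back when the test finishes
    }
}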
It might be overkill for your application, but I've used Liquibase to handle versioning of DB schema and table contents. It should be straightforward to tie in pre/post hooks for your JUnit code.
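As a rough sketch (the changelog path and JDBC URL are assumptions), a Liquibase update can be wired into a JUnit hook like this:

import java.sql.Connection;
import java.sql.DriverManager;
import liquibase.Liquibase;
import liquibase.database.DatabaseFactory;
import liquibase.database.jvm.JdbcConnection;
import liquibase.resource.ClassLoaderResourceAccessor;
import org.junit.BeforeClass;

public class SchemaSetupTest {

    // Bring the schema (and seed data) to a known version before the tests run.
    @BeforeClass
    public static void applyChangelog() throws Exception {
        Connection conn = DriverManager.getConnection("jdbc:h2:mem:test", "sa", "");
        Liquibase liquibase = new Liquibase(
                "db/changelog.xml", // assumed changelog location on the classpath
                new ClassLoaderResourceAccessor(),
                DatabaseFactory.getInstance()
                        .findCorrectDatabaseImplementation(new JdbcConnection(conn)));
        liquibase.update(""); // apply all pending change sets (no contexts)
    }
}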
Related
I have a DAO using Spring's JdbcTemplate with Create, Read and Update (no Delete) operations.
The create method has an ID parameter which is a unique key in the table.
Apart from mocking the DAO, how can I actually test create without getting a constraint violation?
Using a random ID can still fail sometimes.
Should I override setAutoCommit to avoid adding the record? Is that still considered a valid unit test?
Must I delete the record in the database beforehand via SQL, or is there a Spring option for this type of test?
Or should I consider this an integration test rather than a unit test?
EDIT
I'm using Oracle and I can't use a sequence to generate values for the ID.
We have a few data sources (not for testing) in production.
It really depends on the purpose of such a test; not all tests are "unit tests" in this respect.
If, for example, the goal is to test the "service" that encapsulates the business logic, and this service sometimes calls the DAO, then the best way is probably to just mock the DAO, as you suggest.
In this case, the DAO obviously won't be covered by this test, but the service will.
If the purpose is to test SQL statements (and I assume that DAO contains nothing but SQL statements + maybe transforming them into the domain object), then mocking is not an option.
In this case, the test should include calls to some kind of database, but then it's not called a unit test anymore (a unit test is something that runs really fast and only in memory: no DBs, no I/O, etc.). I'll call this an integration test (as you also suggest), though different people probably have different names for this kind of test.
In practice, we need both kinds of tests, because they test different things.
So, how to test this?
First of all, you should decide which database to use; there are three approaches here:
Run against a real database shared between users; tests assume it is pre-installed
Run with an in-memory database
Run a Docker image of the DB when the test suite runs, and destroy it afterwards
While the discussion of which approach is better is very interesting on its own, it's out of scope for this question IMO; each choice has its implications.
Once you're done with this decision, you should decide how to work with this database from the code.
Usually, Spring tests use the following pattern:
open a transaction before the test
run the test (change the data, even change the schema: add columns or tables if you want) and make assertions
regardless of the test result, roll back the transaction so that the data is just like before the test
So if you follow this approach for all your tests, they'll start from an "empty" data state, and no constraint violations are expected.
This also effectively solves the "deletion of the record" question, because the data will be deleted anyway when the transaction is rolled back.
Now, regarding deletion of the record outside a transaction:
An obvious approach is to execute the deletion SQL right from the test (outside the DAO), so that the DAO (production code) won't be changed.
You can inject a DataSource/JdbcTemplate right into the test (Spring's test support handles this perfectly) and run the required SQL from there, as in the sketch below.
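For example (the table, column, and ID are placeholders):

import org.junit.Before;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml") // assumed test context
public class CreateCustomerTest {

    private static final long TEST_ID = 42L; // fixed ID reserved for this test

    @Autowired
    private JdbcTemplate jdbcTemplate; // injected straight into the test

    // Remove any leftover row so create() cannot hit the unique-key constraint.
    @Before
    public void deleteTestRow() {
        jdbcTemplate.update("DELETE FROM customer WHERE id = ?", TEST_ID);
    }
}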
I'm running an integration test that executes some Hibernate code within a single transaction (managed by Spring). The test is failing with a duplicate key violation and I'd like to hit a breakpoint just before this and inspect the table contents. I can't just go into MySQL Workbench and run a SELECT query as it would be outside the transaction. Is there another way?
After reading your comments, my impression is that you are predominantly interested in how to hit a breakpoint and at the same time be able to examine the database contents. Under normal circumstances I would just suggest logging the SQL, but with the breakpoint in mind, my suggestion is:
Reduce the isolation level to READ_UNCOMMITTED for the integration test.
Reducing the isolation level will allow you to see the uncommitted values in the database during debugging. As long as you don't have parallel activity within the integration test, it should be fine.
The isolation level can be set on a per-connection basis; there is no need for anything to be done on the server.
One side note: if you are using Hibernate, even parallel activities may work fine when you reduce the isolation level, because Hibernate largely behaves as if it were in REPEATABLE_READ thanks to its transactional Level 1 cache.
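A sketch of what this might look like with Spring-managed transactions (the class name is a placeholder):

import org.springframework.transaction.annotation.Isolation;
import org.springframework.transaction.annotation.Transactional;

// READ_UNCOMMITTED applies only to this test's connection; nothing changes
// on the database server itself.
@Transactional(isolation = Isolation.READ_UNCOMMITTED)
public class DuplicateKeyIntegrationTest {
    // While paused at a breakpoint, a client session that has executed
    //   SET SESSION TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
    // (MySQL syntax) can see the rows the test has inserted but not yet committed.
}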
The following can be run from Eclipse's "Display" view:
java.util.Arrays.deepToString(
em.createNativeQuery("SELECT mystuff FROM mytable").getResultList().toArray())
.replace("], ", "]\n");
This displays all the data, albeit not in a very user-friendly way; e.g. you will need to work out which columns the comma-separated fields correspond to.
Here's the scenario:
I am working on a DAO which uses the Hibernate Criteria API to build a number of complicated queries that perform certain tasks on the database (keyword search across multiple fields, for example).
We need to unit test this to ensure that the generated query is correct for various scenarios. One way of testing it (which could be preferable) would be to check that the Hibernate criteria is created correctly, mocking the database interaction. However, this is not desirable: firstly, it's kind of cheating (it merely duplicates what the code is doing), and it also doesn't check whether the criteria itself causes Hibernate to barf, or causes issues when it goes to the database.
The remaining option is then to run the query against a test database. However, for historical reasons there is no static test database (one that could be checked in as part of the code, for example), and the remit of my project does not allow me to embark on creating one; we have to make do with testing against a shared development database that's periodically refreshed with production data.
When these refreshes happen, the data behind the tests can change too, which makes our tests brittle. We can get around this by not using exact numbers in tests, but that's not really adequate testing.
The question is then: what do people do in cases like this to make tests less brittle? One option I have in mind is to run a native SQL query that does the same thing (behaviourally; it doesn't have to be exactly the same as the query generated by Hibernate) to get the expected number, and then run the DAO version to see if it matches. This way, the behaviour of the query is always captured in the initial native SQL and you will always have the correct numbers.
Any feedback on this or other ideas on how to manage this situation would be greatly appreciated.
A.
UPDATE:
With regards to hsqldb/h2/derby suggestions, I am familiar with them but the company is not ready to go down that route just yet and doing it piecemeal on just one test case won't be suitable.
With regards to my earlier suggestion I would like to elaborate a bit more - consider this scenario:
I want to ensure that my relatively complicated keyword search returns 2100 matches for "John Smith".
In order to find the expected number, I would have analyzed my database and found the number using a SQL query. What is the downside of having that query as part of the test, so that you always know you are testing the behaviour of the criteria?
So basically the question is: if for some reason you could not have a static data set for testing, how would you perform your integration tests in a non-brittle way?
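For illustration, here is a sketch of that suggestion (the table, columns, DAO, and entity are hypothetical): the expected number is recomputed from the live data by a native query, then compared with the criteria-based result.

import static org.junit.Assert.assertEquals;
import java.util.List;
import org.junit.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.jdbc.core.JdbcTemplate;

public class KeywordSearchDaoTest {

    @Autowired
    private JdbcTemplate jdbcTemplate; // plain SQL access for the reference count

    @Autowired
    private PersonDao personDao; // hypothetical DAO built on the Criteria API

    @Test
    public void keywordSearchAgreesWithNativeCount() {
        // Reference query, recomputed on every run so it survives data refreshes.
        int expected = jdbcTemplate.queryForObject(
                "SELECT COUNT(*) FROM person"
                        + " WHERE first_name LIKE '%John%' OR last_name LIKE '%Smith%'",
                Integer.class);

        List<Person> matches = personDao.searchByKeyword("John Smith");
        assertEquals(expected, matches.size());
    }
}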
One approach could be to use an in-memory database like Apache Derby or HSQLDB, and prepopulate it with data before the test starts using DBUnit.
UPDATE: Here is a nice article about the approach.
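A minimal sketch of that approach (the JDBC URL and dataset file are assumptions; the schema itself is assumed to be created elsewhere, e.g. by DDL scripts or hbm2ddl):

import java.sql.Connection;
import java.sql.DriverManager;
import org.dbunit.database.DatabaseConnection;
import org.dbunit.database.IDatabaseConnection;
import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.xml.FlatXmlDataSetBuilder;
import org.dbunit.operation.DatabaseOperation;
import org.junit.Before;

public class InMemoryDbTest {

    // Load a known data set into an in-memory HSQLDB before every test.
    @Before
    public void populate() throws Exception {
        Connection jdbc = DriverManager.getConnection("jdbc:hsqldb:mem:testdb", "sa", "");
        IDatabaseConnection connection = new DatabaseConnection(jdbc);
        IDataSet dataSet = new FlatXmlDataSetBuilder()
                .build(getClass().getResourceAsStream("/dataset.xml")); // assumed dataset
        DatabaseOperation.CLEAN_INSERT.execute(connection, dataSet); // wipe, then insert
    }
}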
I agree with Andrey and Bedwyr that the best approach in the long term is to create an hsqldb database specifically for testing. If you don't have the option of doing that, then your solution seems like an appropriate one. You can't test everything, but you don't want to test nothing either. I've used this approach a few times for testing web services against integration databases etc. But remember that this database has to be maintained as well, if you add new columns etc.
You have to decide what you're trying to test. You don't want to test Hibernate, and you don't want to test that the database gives you what you've asked for (in terms of SQL). In your tests, you can assume that Hibernate works, as does the database.
You say:
We need to unit test this to ensure that the generated query is correct for various scenarios. One way of testing it (which could be preferable) would be to check that the Hibernate criteria is created correctly, mocking the database interaction. However, this is not desirable: firstly, it's kind of cheating (it merely duplicates what the code is doing), and it also doesn't check whether the criteria itself causes Hibernate to barf, or causes issues when it goes to the database.
Why should hibernate barf on the criteria you give it? Because you're giving it the wrong criteria. This is not a problem with hibernate, but with the code that is creating the criteria. You can test that without a database.
It has problems when it gets to the database? Hibernate, in general, creates the SQL that is appropriate to the criteria and database dialect you give it, so again, any problem is with the criteria.
The database does not match what hibernate is expecting? Now you are testing that the criteria and the database are aligned. For this you need a database. But you're not testing the criteria any more, you're testing that everything is aligned, a different sort of test.
So actually, it seems to me you're doing an integration test, that the whole chain from the criteria to the structure of the database works. This is a perfectly valid test.
So, what I do in my tests is create another connection to the database (JDBC) to get information. I execute SQL to get the number of rows, etc., or to check that an insert has happened.
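For example, a small verification helper along those lines (the URL and credentials are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// A second, independent JDBC connection used only for assertions.
public final class DbAssert {

    public static int countRows(String table) throws Exception {
        try (Connection conn = DriverManager.getConnection(
                        "jdbc:mysql://localhost/testdb", "test", "test");
                Statement stmt = conn.createStatement();
                ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getInt(1);
        }
    }
}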
I think your approach is a perfectly valid one.
However, for historical reasons there is no static test database (one that could be checked in as part of the code, for example), and the remit of my project does not allow me to embark on creating one
All you need to do is fire up H2 or similar, put some entities in it, and execute your integration tests. Once you've done this for a few tests, you should be able to extract a data setup utility that creates a schema with some test data that you can use for all the integration tests if you feel the need.
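A sketch of pointing an existing persistence unit at an in-memory H2 database for tests (the unit name and the use of Hibernate properties are assumptions):

import java.util.HashMap;
import java.util.Map;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

public class TestPersistence {

    public static EntityManagerFactory createTestFactory() {
        Map<String, String> props = new HashMap<String, String>();
        props.put("javax.persistence.jdbc.driver", "org.h2.Driver");
        props.put("javax.persistence.jdbc.url", "jdbc:h2:mem:test;DB_CLOSE_DELAY=-1");
        props.put("hibernate.dialect", "org.hibernate.dialect.H2Dialect");
        props.put("hibernate.hbm2ddl.auto", "create-drop"); // build schema from mappings
        return Persistence.createEntityManagerFactory("myUnit", props); // assumed unit name
    }
}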
I am using Spring JdbcTemplate with the DAO pattern to access a database. Instead of creating the database tables manually, I am looking for a way to generate the tables in the DAO layer.
I understand that I can use the JdbcTemplate to execute statements, I am only looking for the right place to do it.
Is there a best practice for that?
You can use the execute(String) method:
public void execute(String sql) throws DataAccessException
Issue a single SQL execute, typically a DDL statement.
Specified by: execute in interface JdbcOperations
Parameters: sql - static SQL to execute
Throws: DataAccessException - if there is any problem
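For example, a sketch of creating a table from a DAO initialization method (the table layout is a placeholder, and CREATE TABLE IF NOT EXISTS is not supported by every database):

import javax.annotation.PostConstruct;
import org.springframework.jdbc.core.JdbcTemplate;

public class CustomerDao {

    private final JdbcTemplate jdbcTemplate;

    public CustomerDao(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // Runs once after the bean is wired up.
    @PostConstruct
    public void createTableIfNeeded() {
        jdbcTemplate.execute(
                "CREATE TABLE IF NOT EXISTS customer ("
                        + "id BIGINT PRIMARY KEY, "
                        + "name VARCHAR(255) NOT NULL)");
    }
}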
However, as beny23 mentions, I would be suspicious of an actual need to do this programmatically in a live application.
Slightly off-topic:
Is it absolutely necessary to execute the DDL commands from within your code? In fact, I think it is a good idea to have a separation between DB administration and DB usage. Our Oracle database security setup here is arranged so that the tables are created by a different database user (DB_OWNER) than the one that runs the SELECTs, INSERTs and DELETEs (DB_USER).
This prevents accidentally deleting tables or modifying the schema, and also allows DB_USER to be set up so that only the privileges that are absolutely necessary are granted, which adds a layer of security.
I suppose it depends on the nature of your service/application, but think about the benefit of creating the tables inside the code (and whether a possible bug in the DDL code could accidentally destroy production data).
Use the .update() methods available in (Simple)JdbcOperations; the number they return is the number of affected rows. They are meant to be used for INSERT, UPDATE and DELETE statements.
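For instance, given a configured JdbcTemplate (table and columns are placeholders):

// update() returns the number of affected rows, which you can assert on.
int inserted = jdbcTemplate.update(
        "INSERT INTO customer (id, name) VALUES (?, ?)", 1L, "John Smith");
int changed = jdbcTemplate.update(
        "UPDATE customer SET name = ? WHERE id = ?", "Jane Smith", 1L);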
I've been trying to implement unit testing and currently have some code that does the following:
query an external database, loading into a feed table
query a view, which is a delta of my feed and data tables, updating the data table to match the feed table
My unit testing strategy is this:
I have a testing database that I am free to manipulate.
in setUp(), load some data into my testing db
run my code, using my testing db as the source
inspect the data table, checking for counts and the existence/non existence of certain records
clear testing db, loading in a different set of data
run code again
inspect data table again
Obviously I have the data sets that I load into the source db set up such that I know certain records should be added, deleted, updated, etc.
It seems like this is a bit cumbersome and there should be an easier way. Any suggestions?
Is it your intent to test the view which generates the deltas, or to test that your code correctly adds, deletes and updates in response to the view?
If you want to test the view, you could use a tool like DBUnit to populate your feed and data tables with various data whose delta you've manually calculated. Then, for each test you would verify that the view returns a matching set.
If you want to test how your code responds to diffs detected by the view, I would try to abstract away database access. I imagine a Java method to which you can pass a result set (or a list of POJOs/DTOs) and which returns a list of parameter Object arrays (again, or POJOs) to be added. Other methods would parse the diff list for items to be removed and updated. You could then create a mock result set or POJOs, pass them to your code, and verify that the correct parameters are returned. All without touching a database.
I think the key is to break your process into parts and test each of those as independently as possible.
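A sketch of that separation (all names are hypothetical): the diff logic consumes and produces plain objects, so it can be exercised with hand-built lists and no database at all.

import java.util.ArrayList;
import java.util.List;

// Pure diff logic: feed rows in, actions out. No JDBC anywhere.
public class DeltaProcessor {

    public List<FeedRow> toAdd(List<FeedRow> deltaRows) {
        List<FeedRow> adds = new ArrayList<FeedRow>();
        for (FeedRow row : deltaRows) {
            if ("ADD".equals(row.changeType)) {
                adds.add(row);
            }
        }
        return adds;
    }

    // Similar methods would pick out "DELETE" and "UPDATE" rows.

    public static class FeedRow {
        public final String changeType; // "ADD", "DELETE" or "UPDATE"
        public final long id;

        public FeedRow(String changeType, long id) {
            this.changeType = changeType;
            this.id = id;
        }
    }
}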
DbUnit will meet your needs. One thing to watch out for is that they have switched to using SLF4J as their logging facade instead of JCL. You can configure SLF4J to forward the logging to JCL, but be warned: if you are using Maven, DbUnit pulls in its NOP log provider by default, so you will have to use an exclusion. I blogged about this conflict recently.
I use DbUnit, but I also work very hard not to have to test against the DB.
Tests that go against the database should only exist for the purpose of testing the database interface.
So I have Mock Db Connections that I can set the data for use in all the rest of my tests.
Apart from the already suggested DBUnit, you may want to look into Unitils. It uses DBUnit, but provides more than that (quoting from the site):
Automatic maintenance of databases, with support for incremental, repeatable and post processing scripts
Automatically disable constraints and set sequences to a minimum value
Support for Oracle, Hsqldb, MySql, DB2, Postgresql, MsSql and Derby
Simplify test database connection setup
Simple insertion of test data with DBUnit
Run tests in a transaction
JPA entity manager creation and injection for hibernate, toplink and …
Hibernate SessionFactory creation and session injection
Automatically test the mapping of JPA entities / hibernate mapped objects with the database
If you are using Maven, one option is to use the sql-maven-plugin. It allows you to run database initialization/population scripts during the maven build cycle.