My unit tests use Hibernate to connect to an in-memory HSQLDB database. I was hoping there would be a way to clear and recreate the database (the entire database including the schema and all the data) in JUnit's TestCase.setUp() method.
You can configure your Hibernate configuration file to force Hibernate to recreate your tables and schema every time.
<!-- Drop and re-create the database schema on startup -->
<property name="hbm2ddl.auto">create-drop</property>
hibernate.hbm2ddl.auto: automatically validates or exports schema DDL to the database when the SessionFactory is created. With create-drop, the database schema will be dropped when the SessionFactory is closed explicitly.
e.g. validate | update | create | create-drop
If you don't want this setting in your real Hibernate configuration, you can create a separate Hibernate config just for unit testing.
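For example, the test bootstrap can point Hibernate at a dedicated config file instead of the default hibernate.cfg.xml; a minimal sketch, where the file name hibernate-test.cfg.xml is an assumption:

// Load a test-only configuration (containing the create-drop setting) and build the factory
SessionFactory sessionFactory = new Configuration()
        .configure("hibernate-test.cfg.xml")
        .buildSessionFactory();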
If you are using Spring, you can put the @Transactional annotation on your unit test; by default, at the end of every test all persisted data will be rolled back automatically, so you don't need to worry about dropping the tables every time.
I have walked through an example here: http://automateddeveloper.blogspot.com/2011/05/hibernate-spring-testing-dao-layer-with.html
hibernate.hbm2ddl.auto=create-drop
And bootstrap a new SessionFactory.
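For instance, in a plain JUnit 3-style test this could look roughly like the following sketch; the HSQLDB URL, driver class, and credentials are assumptions to adapt to your setup, and Configuration here is org.hibernate.cfg.Configuration:

import junit.framework.TestCase;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;

public class MyDaoTest extends TestCase {

    private SessionFactory sessionFactory;

    @Override
    protected void setUp() {
        // Rebuilding the SessionFactory with create-drop recreates the whole schema
        Configuration cfg = new Configuration()
                .setProperty("hibernate.connection.url", "jdbc:hsqldb:mem:testdb")
                .setProperty("hibernate.connection.driver_class", "org.hsqldb.jdbcDriver")
                .setProperty("hibernate.connection.username", "sa")
                .setProperty("hibernate.connection.password", "")
                .setProperty("hibernate.dialect", "org.hibernate.dialect.HSQLDialect")
                .setProperty("hibernate.hbm2ddl.auto", "create-drop");
        // addAnnotatedClass(...) / addResource(...) for your mappings
        sessionFactory = cfg.buildSessionFactory();
    }

    @Override
    protected void tearDown() {
        // create-drop drops the schema when the factory is closed
        sessionFactory.close();
    }
}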
From a testing perspective, the best practice is to clear the data after every single test. If you use create-drop, it will also drop the table schema, which adds the overhead of recreating the schema every time.
Since you are using HSQLDB, which provides a direct mechanism to truncate, that would be the best option in this case.
@After
public void clearDataFromDatabase() {
    // Start a transaction, based on your transaction manager
    dao.executeNativeQuery("TRUNCATE SCHEMA PUBLIC AND COMMIT");
    // Commit the transaction
}
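If the test has access to a JdbcTemplate (or a plain JDBC Statement), the same idea can be written directly; note that TRUNCATE SCHEMA ... AND COMMIT is HSQLDB-specific syntax, and the injected jdbcTemplate field is assumed here:

@After
public void clearDataFromDatabase() {
    // HSQLDB-specific: truncates every table in the PUBLIC schema and commits
    jdbcTemplate.execute("TRUNCATE SCHEMA PUBLIC AND COMMIT");
}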
Be careful with wiping the world and starting fresh each time. Soon you will likely want to start with a "default" set of test data loaded in your system, so what you really want is to revert to that base state before each test is run. In that case, you want a transaction that rolls back at the end of each test run.
To accomplish this, you should annotate your JUnit class:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations={"classpath:/path/to/spring-config.xml"})
@TransactionConfiguration(transactionManager="myTransactionManager", defaultRollback=true)
public class MyUnitTestClass {
    ...
}
And then annotate each of your test methods with @Transactional:
@Transactional
@Test
public void myTest() {
    ...
}
Related
I am having trouble finding information about this issue I am running into. I am interested in implementing row-level security on my Postgres DB, and I am looking for a way to set Postgres session variables automatically through some form of interceptor. Now, I know that with Hibernate you are able to do row-level security using @Filter and @FilterDef; however, I would like to additionally set policies on my DB.
A very simple way of doing this would be to execute the SQL statement SET variable=value prior to every query, though I have not been able to find any information on this.
This is being used in a Spring Boot application, and every request is expected to have access to a request-specific value of the variable.
Since your application uses Spring, you could try accomplishing this in one of a few ways:
Spring AOP
In this approach, you write an advice that you ask Spring to apply to specific methods. If your methods use the @Transactional annotation, you could have the advice applied to them immediately after the transaction has started, as in the sketch below.
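A rough sketch of that idea follows. The pointcut, the app.current_user variable name, and the RequestContext helper are assumptions for illustration; Postgres's set_config(..., true) scopes the value to the current transaction. You may need to tune advice ordering (for example via @EnableTransactionManagement(order = ...)) so this advice runs inside the transaction rather than before it.

import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class RowLevelSecurityAspect {

    @PersistenceContext
    private EntityManager entityManager;

    // Runs before any @Transactional method; the transaction advice must have higher
    // precedence so the transaction is already open when this executes.
    @Before("@annotation(org.springframework.transaction.annotation.Transactional)")
    public void applySessionVariable() {
        entityManager.createNativeQuery("SELECT set_config('app.current_user', ?1, true)")
                .setParameter(1, RequestContext.getCurrentUser()) // hypothetical request-scoped holder
                .getSingleResult();
    }
}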
Extended TransactionManager Implementation
Let's assume your transaction is using JpaTransactionManager.
public class SecurityPolicyInjectingJpaTransactionManager extends JpaTransactionManager {

    @Autowired
    private EntityManager entityManager;

    // constructors

    @Override
    protected void prepareSynchronization(DefaultTransactionStatus status, TransactionDefinition definition) {
        super.prepareSynchronization(status, definition);
        if (status.isNewTransaction()) {
            // Use entityManager to execute your database policy param/values
            // I would suggest you also register an after-completion callback synchronization
            // This after-completion would clear all the policy param/values
            // regardless of whether the transaction succeeded or failed,
            // since this happens just before it gets returned to the connection pool
        }
    }
}
Now simply configure your JPA environment to use your custom JpaTransactionManager class.
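A rough configuration sketch, assuming an EntityManagerFactory bean is already defined elsewhere:

@Configuration
@EnableTransactionManagement
public class TransactionConfig {

    @Bean
    public PlatformTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
        // Spring now opens every transaction through the custom subclass,
        // so the policy parameters get applied whenever a new transaction starts
        SecurityPolicyInjectingJpaTransactionManager txManager = new SecurityPolicyInjectingJpaTransactionManager();
        txManager.setEntityManagerFactory(entityManagerFactory);
        return txManager;
    }
}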
There are likely others, but these are the two that come to mind that I've explored.
I have been trying to create a Spring Batch program which has to read certain data from a database and write it into another table. I don't want the Spring Batch metadata tables to be created in my database. When I tried that, I was not able to use transactions.
I avoided the metadata tables by extending DefaultBatchConfigurer and overriding setDataSource like this:
@Override
public void setDataSource(DataSource dataSource) {
    // Override so the datasource is not set even if one exists;
    // initialize() will then use a Map-based JobRepository (instead of the database)
}
By doing this I started getting the exception org.springframework.dao.InvalidDataAccessApiUsageException: no transaction is in progress; nested exception is javax.persistence.TransactionRequiredException: no transaction is in progress.
Is there a way by which I can avoid the metadata tables and still use the transactions?
If you are using Spring Boot, you may add the line below to application.properties (or an environment-specific property file) to ensure the Spring Batch metadata tables do not get created:
spring.batch.initializer.enabled=false
Also, since you do not need the meta tables, do not extend the DefaultBatchConfigurer class.
I would only extend that class if I wanted to set up a persistent JobRepository (i.e. create the Spring Batch meta tables), which needs a lot of other configuration that DefaultBatchConfigurer provides by default.
I come from PHP/Laravel. Whenever I want to seed the database, I only need to run php artisan db:seed. This runs some PHP scripts that insert data into the database.
I want to achieve the same thing using Spring/Hibernate. I know I can add an import.sql file to seed the database after schema creation. However, I want to import these fixtures using Java and the available ORM, so I do not need to maintain SQL.
Is there a way?
If not, there should be some configuration to trigger a script that uses the ORM entity manager to persist entities in the database after schema creation.
The main idea is not to have to maintain a big SQL seeder file across schema revisions.
Thanks!
If you're using Spring Data you can use repository populators.
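For example, the Jackson-based populator from Spring Data reads a JSON file from the classpath and saves the objects through the matching repositories; a sketch, where the file name data.json is an assumption:

@Bean
public Jackson2RepositoryPopulatorFactoryBean repositoryPopulator() {
    // Reads data.json and persists its objects via the corresponding repositories at startup
    Jackson2RepositoryPopulatorFactoryBean factory = new Jackson2RepositoryPopulatorFactoryBean();
    factory.setResources(new Resource[] { new ClassPathResource("data.json") });
    return factory;
}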
Otherwise, you may register an event listener that fires after the Spring context is loaded:
@Component
public class YourListener {

    // Declare your autowired beans here

    @EventListener
    public void handleContextRefresh(ContextRefreshedEvent event) {
        // Your seeder
        // + You can use all the registered beans (repositories, services...)
    }
}
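To make that concrete, the seeder body might look like the following; UserRepository and User are hypothetical, and any registered bean could be used instead:

@Component
public class DatabaseSeeder {

    @Autowired
    private UserRepository userRepository; // hypothetical Spring Data repository

    @EventListener
    public void seed(ContextRefreshedEvent event) {
        // Seed only when the table is still empty, so restarts don't duplicate data
        if (userRepository.count() == 0) {
            userRepository.save(new User("admin", "admin@example.com"));
        }
    }
}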
For more detail check: Better application events in Spring Framework 4.2
I'm working with Spring's JdbcTemplate in some desktop applications.
I'm trying to roll back some database operations, but I don't know how to manage the transaction with this object (JdbcTemplate). I'm doing multiple inserts and updates through a sequence of methods; when any operation fails, I need to roll back all the previous operations.
Any ideas?
Updated: I tried to use @Transactional, but the rollback doesn't happen.
Do I need some prior configuration on my JdbcTemplate?
My Example:
@Transactional(rollbackFor = Exception.class, propagation = Propagation.REQUIRES_NEW)
public void testing() {
    jdbcTemplate.execute("INSERT INTO Acess_Level (IdLevel, Description) VALUES (1, 'Admin')");
    jdbcTemplate.execute("INSERT INTO Acess_Level (IdLevel, Description) VALUES ('STRING', 'Admin')");
}
IdLevel is a NUMERIC column, so the second command will throw an exception. When I look at the table in the database, I can see the first insert... but I think this operation should have been rolled back.
What is wrong?
JdbcTemplate doesn't handle transactions by itself. You should use a TransactionTemplate or @Transactional annotations: with these you can group operations within a transaction and roll back all of them in case of error.
@Transactional
public void someMethod() {
    jdbcTemplate.update(..)
}
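If you prefer the programmatic route mentioned above, TransactionTemplate gives the same rollback behaviour without annotations. A sketch, assuming transactionManager is a DataSourceTransactionManager built on the same DataSource as the JdbcTemplate:

TransactionTemplate transactionTemplate = new TransactionTemplate(transactionManager);
transactionTemplate.execute(status -> {
    jdbcTemplate.update("INSERT INTO Acess_Level (IdLevel, Description) VALUES (?, ?)", 1, "Admin");
    jdbcTemplate.update("INSERT INTO Acess_Level (IdLevel, Description) VALUES (?, ?)", 2, "User");
    return null; // any runtime exception thrown inside the callback rolls everything back
});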
In Spring, private methods don't get proxied, so the annotation will not work. See this question: Does Spring @Transactional attribute work on a private method?
Create a service and put @Transactional on the public methods that you want to be transactional. It is more typical Spring code to have simple DAO objects that each do one job, and to inject several of them, than to have a complicated DAO object that performs multiple SQL calls within its own transaction.
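A sketch of that layout, where AccessLevelService and AccessLevelDao are hypothetical names:

@Service
public class AccessLevelService {

    @Autowired
    private AccessLevelDao accessLevelDao; // hypothetical DAO that performs a single insert

    @Transactional // public method on a Spring bean: the proxy manages the transaction
    public void createDefaults() {
        accessLevelDao.insert(1, "Admin");
        accessLevelDao.insert(2, "User"); // if this throws, the first insert is rolled back too
    }
}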
I want to test my Dao Class using the SpringContextTests.
In my test class I extended AbstractTransactionalJUnit4SpringContextTests in order for the test class to integrate with JUnit 4. I have also set up the configuration and do the initialization and database clean-up in the @Before method and the tear-down in the @After method. My test class works perfectly.
My problem is that when I run my test class against a database that is filled with data, the original data is not rolled back and my database ends up cleared. In the @Before method I clear the database and populate data, thinking that I would be able to roll it back, but it does not roll back.
Can anyone cite an example that works and rolls back the information in the database?
ADDONS:
Every database manipulation in my test methods is rolled back. But the execution of super.deleteFromTables("person") in the @Before method did not roll back the previous data in the database.
Spring rolls back all the CRUD operations, but the database clean-up before the transaction does not roll back.
Thank you to all those who answered my question. I learned a lot from those answers, but they didn't solve my problem.
I knew my test does transaction management and does its job properly.
The mistake is on my part.
I forgot the lesson about database commands: when you execute a DDL statement after a DML statement, it automatically commits the transaction. I executed DDL after DML by deleting all records and then ALTERing the AUTO_INCREMENT of the table, which caused an auto-commit and deleted all records of the table permanently.
FIXING THAT SCENARIO SOLVED MY PROBLEM.
Possible causes:
you're using a database/database engine which does not have proper transactions;
you're using multiple transaction managers and/or data sources and the proper one is not picked up;
you're doing your own, separate, transactions in the test class
As for an example, here's one (off the top of my head, not compiled):
public class DBTest extends AbstractTransactionalJUnit4SpringContextTests {

    @Autowired
    private SomeDAO _aBeanDefinedInMyContextFile;

    @Test
    public void insert_works() {
        assert _aBeanDefinedInMyContextFile.findAll().size() == 0;
        _aBeanDefinedInMyContextFile.save(new Bean());
        assert _aBeanDefinedInMyContextFile.findAll().size() == 1;
    }
}
Key points:
the SomeDAO is an interface which corresponds to a bean declared in my context;
the bean does not have any transactional settings ( advice/programmatic), it relies on the caller being transactional - either the service in production, or the test in our situation;
the test does not include any transactional management code, as it's all done in the framework.
I'm not sure what is wrong with your class. Here is an extract of a class that does what you want with DbUnit and Spring 2.5:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations={"testContext.xml"})
@TransactionConfiguration
@Transactional
public class SampleDAOTest {

    @Autowired
    private DataSource dataSource;

    @Autowired
    private SampleDAO sampleDAO;

    @Before
    public void onSetUpInTransaction() throws Exception {
        // Populate test data
        IDatabaseConnection dbUnitCon = new DatabaseConnection(DataSourceUtils.getConnection(dataSource), "DATASOURCE");

        // Read in test data from a DbUnit Excel file
        IDataSet dataSet = new XlsDataSet(new File("src/test/resources/TestData.xls"));
        DatabaseOperation.INSERT.execute(dbUnitCon, dataSet);
    }

    @Test
    public void testGetIntermediaryOrganisation() {
        // Test getting a user
        User user = sampleDAO.getUser(99L);
        assertNotNull(user);
    }
}
One of the benefits of this method is that you don't need to extend any classes, so you can still have your own hierarchy for tests.
If you really want to stick with your current method instead of using the @Before annotation, I think you need to override the method below and put your setup code in there.
@Override
public void onSetUpInTransaction() throws Exception {...}
Hope this helps
Sidestepping your question, I suggest that you use a separate database instance to run your tests against. That way, you can safely wipe it clean and have your tests initialize it as required.
As far as I know, the Spring support classes for database testing only roll back what happens in the tests, not what happens in the setup and teardown of tests.
Agree with Confusion: you should be running your tests against their own database schema.
With this, you can set your hibernate properties to 'create-drop':
With create-drop, the database schema will be dropped when the SessionFactory is closed explicitly.
See: Optional Hibernate Config properties
Example snippet:
<bean id="sessionBean" class="org.springframework.orm.hibernate3.LocalSessionFactoryBean">
<property name="dataSource" ref="dataSource"/>
<property name="hibernateProperties">
<props>
<prop key="hibernate.hbm2ddl.auto">create-drop</prop>
...etc
While I'd agree with the people suggesting a dedicated DB for testing, there isn't any reason why using a populated DB shouldn't work: both @Before and @After methods are executed within the transactional context, therefore their changes should be rolled back.
Possibles:
The data setup is doing something that isn't transactional (i.e. DDL statements)
Something in your test is actually committing the transaction
Can you post the @Before method? I'm just wondering if you are just clearing the tables or actually dropping and recreating them.
As far as I can tell, by looking at the Javadocs and source code for AbstractJUnit4SpringContextTests and TransactionalTestExecutionListener, you need to annotate the test methods you want transactionalised with @Transactional.
There are also @BeforeTransaction and @AfterTransaction annotations where you can better control what runs in a transaction.
I suggest you create methods annotated with all of these annotations, including @Before, and then run the test with breakpoints in those methods. That way you can look at the stack and work out whether Spring has started a transaction for you or not. If you see something like TransactionInterceptor in the stack, or anything else with "Transaction" in the name, then chances are you're in a transaction.
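For example, a quick sketch of where to put the breakpoints; these hooks are provided by the Spring TestContext framework, and the method names are arbitrary:

@BeforeTransaction
public void beforeTransaction() {
    // breakpoint here: no test-managed transaction is active yet
}

@Before
public void setUpTestData() {
    // breakpoint here: with @Transactional tests this already runs inside the transaction
}

@AfterTransaction
public void afterTransaction() {
    // breakpoint here: the test transaction has already been rolled back or committed
}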
You're doing super.deleteFromTables in your @Before method, which is within the transaction. So if the transaction is rolled back, don't the deletions get rolled back as well?