I'm trying to understand what is going on when I'm using AbstractTransactionalJUnit4SpringContextTests in my integration tests and trying to roll back changes made by legacy code.
The legacy code uses NamedParameterJdbcTemplate to communicate with the database. The transaction manager is of type DataSourceTransactionManager and my datasource is of type DriverManagerDataSource.
This is the overall structure of my test class.
@Before
// Make initial setup of database using the `JdbcTemplate` from `AbstractTransactionalJUnit4SpringContextTests`.

@Test
// Call legacy code that makes inserts to the database.
My question is whether my assumption is wrong: by extending AbstractTransactionalJUnit4SpringContextTests, I make all my tests transactional. I expected this to mean that all changes made directly by my test method AND by legacy code called from the test method are transactional and implicitly rolled back at the end of each test.
Some observations I made are:
The @Before method works as expected with test methods that do not call legacy code that makes changes. In this case the changes made in @Before are rolled back.
If, however, I use @Before with @Test methods that call legacy code that makes changes, neither the changes made by @Before nor those made by @Test get rolled back! The log message printed does state that the transaction is initiated and that the rollback is successful, but I'm not getting the expected behavior.
The Spring TestContext Framework (TCF) manages what I call test-managed transactions. The TCF does not manage application-managed transactions.
If you have code that is managing its own transactions (e.g., via Spring's TransactionTemplate or some other programmatic means), then those interactions with the database will not be rolled back by the TCF.
For further details, consult the Test-managed Transactions section of the reference manual or slide #43 from my Testing with Spring: An Introduction presentation.
Furthermore, you naturally have to ensure that the DataSource used by the TCF is the exact same DataSource used by the Spring DataSourceTransactionManager and your legacy code. If all of your legacy code uses Spring's NamedParameterJdbcTemplate, that code should participate in the correct transaction. Otherwise, you'll need to use Spring's DataSourceUtils.getConnection(DataSource) to ensure that legacy code works with Spring-managed and test-managed transactions.
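As a rough sketch of that last point (the DAO class, table, and column names are my own illustrations, not from the original post), legacy JDBC code can participate in Spring-managed and test-managed transactions by obtaining its connection through DataSourceUtils instead of calling DataSource.getConnection() directly:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import javax.sql.DataSource;
import org.springframework.jdbc.datasource.DataSourceUtils;

// Hypothetical legacy DAO for illustration.
public class LegacyDao {

    private final DataSource dataSource;

    public LegacyDao(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    public void insertRow(String value) throws Exception {
        // Returns the connection bound to the current Spring-managed
        // (or test-managed) transaction rather than opening a new one.
        Connection con = DataSourceUtils.getConnection(dataSource);
        try (PreparedStatement ps = con.prepareStatement(
                "INSERT INTO example_table (value) VALUES (?)")) {
            ps.setString(1, value);
            ps.executeUpdate();
        } finally {
            // Releases the connection only if it is not transactional.
            DataSourceUtils.releaseConnection(con, dataSource);
        }
    }
}
```

Since the connection is the one the test-managed transaction holds, the rollback at the end of the test then covers these inserts as well.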
Related
I have a complex service method, that loads a lot of data from the db. Since it only reads and validates data I want to make sure
JPA (Hibernate) does not waste time for checking the big persistence
context for changes.
No accidental changes to the data are written
to the db.
Therefore towards the end of my service I have this code
TransactionAspectSupport.currentTransactionStatus().setRollbackOnly();
My unit tests have mocks for all db access, so transactions are not an issue. (I have integration tests with @SpringBootTest and full transaction support as well.) Unfortunately the rollback statement fails in pure unit tests.
Is there an easy way to get the rollback statement just do nothing in case of a unit test?
For now I just wrote a small component which is easy to mock:
@Component
public class RollBacker {
    public void setRollBackOnly() {
        TransactionAspectSupport.currentTransactionStatus().setRollbackOnly();
    }
}
After all it's the singleton approach of TransactionAspectSupport which makes unit testing difficult here. IMHO TransactionAspectSupport should be a spring bean.
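With that component in place, a pure unit test can inject a mock in its place so no live transaction is required. A minimal sketch with Mockito (the ReportService class and its methods are hypothetical stand-ins for the service under test):

```java
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;

import org.junit.Test;

public class ReportServiceTest {

    @Test
    public void marksTransactionRollbackOnly() {
        // The mock replaces the real RollBacker; no transaction context needed.
        RollBacker rollBacker = mock(RollBacker.class);
        // Hypothetical service that calls rollBacker.setRollBackOnly() internally.
        ReportService service = new ReportService(rollBacker);

        service.validateData();

        verify(rollBacker).setRollBackOnly();
    }
}
```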
Consider using @Transactional(readOnly = true) instead of TransactionAspectSupport.currentTransactionStatus().setRollbackOnly().
Doc
A boolean flag that can be set to true if the transaction is effectively read-only, allowing for corresponding optimizations at runtime.
In our application we mainly use Spring @Transactional annotations together with JpaTemplate and Hibernate for database interaction. One part of the project, however, communicates with a different database, and hence we cannot use @Transactional here as a different transactionManager is required.
Here, for database updates, we explicitly create transactions in code, using PlatformTransactionManager, and either commit them or roll back after the call to JpaTemplate.execute(JpaCallback).
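The explicit pattern described above, with a PlatformTransactionManager driving commit or rollback around a JpaTemplate callback, might be sketched like this (the method body and the injected txManager/jpaTemplate fields are assumptions; JpaTemplate itself is a deprecated Spring API):

```java
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.support.DefaultTransactionDefinition;

// txManager and jpaTemplate are assumed to be injected Spring beans
// configured against the second database.
public void updateInTransaction() {
    TransactionStatus status =
            txManager.getTransaction(new DefaultTransactionDefinition());
    try {
        jpaTemplate.execute(em -> {
            // ... perform updates through the EntityManager ...
            return null;
        });
        txManager.commit(status);
    } catch (RuntimeException ex) {
        txManager.rollback(status);
        throw ex;
    }
}
```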
I have noticed a few places in this area of our code where we are not doing this for reads, however, and simply call JpaTemplate.execute(JpaCallback) without wrapping in a transaction. I am wondering what the dangers are of this. I appreciate that a transaction must be being created, as no database query can be run without a transaction, but since nothing in our code attempts to commit, will something potentially be holding on to resources?
I'm having a difficult time understanding Spring Transactions. First off, I am trying to use as little of Spring as possible. Basically, I want to use @Transactional and some of its injection capabilities with JPA/Hibernate. My transactional usage is in my service layer. I try to keep them out of test cases. I use CTW and am spring-configured. I have component scan on the root of my packages right now. I also am using Java configuration for my datasource, JpaTransactionManager, and EntityManagerFactory. My configuration for the JpaTransactionManager uses:
AnnotationTransactionAspect.aspectOf().setTransactionManager( txnMgr );
I do not use @EnableTransactionManagement.
Unfortunately, I'm having a hard time understanding the rules for @Transactional and can't find an easy page that describes them simply, especially with regard to Session usage. For example, what if I want to use @Transactional on a class that does not have a default no-arg constructor?
The weirdest problem I'm having is that in some of the POJO service classes @Transactional works great, while in others I can see the transactions being created but operations ultimately fail, saying that there is "no session or the session has been closed". I apologize for not having code to reproduce this; I can't get it down to a small set. This is why I am looking for the best resources so I can figure it out myself.
For example, I can have a method that gets a lazily fetched collection of children, iterates through it, puts it into a Set, and returns that set. In one class it will work fine, while in another class also marked with @Transactional it will fail while trying to iterate through the PersistentSet, saying that there is no session (even though there IS a transaction).
Maybe this is because I create both of these service objects in a test case and the first one is somehow hijacking the session?
By the way, I have read through the spring source transaction documents. I'm looking for a clear set of rules and tips for debugging issues like this. Thanks.
Are you sure you loaded your parent entity in the scope of the very transaction where you try to load the lazy children? If it was passed in as a parameter, for example (that is, loaded from outside your @Transactional method), then it might not be bound to a persistence context anymore...
Note that when no @Transactional context is given, any database-related action may cause a short transaction to be created and then immediately closed, disabling subsequent lazy-loading calls. It depends on your persistence context and auto-commit configurations.
To put it simply: since the behaviour with no transactional context is sometimes unexpected, the ground rule is to always have one. If you access your database, then you give yourself a well-defined transaction, period. With spring-tx, this means all your @Repository and @Service beans are @Transactional. This way you should avoid most tx-related issues.
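A minimal sketch of that ground rule (the entity, repository, and method names are made up for illustration): load the parent and touch its lazy collection inside the same @Transactional method, so the persistence context is still open while iterating.

```java
import java.util.HashSet;
import java.util.Set;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// Hypothetical service; ParentRepository, Parent and Child are illustrative.
@Service
public class ParentService {

    private final ParentRepository parentRepository;

    public ParentService(ParentRepository parentRepository) {
        this.parentRepository = parentRepository;
    }

    @Transactional(readOnly = true)
    public Set<Child> childrenOf(long parentId) {
        // Loaded inside this transaction, so the lazy collection can be
        // initialized before the session closes.
        Parent parent = parentRepository.findById(parentId);
        return new HashSet<>(parent.getChildren());
    }
}
```

Passing an already-detached Parent into such a method instead would reproduce the "no session" failure described above.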
We're using MyBatis (3.0.5) as our or-mapping tool (and I don't have any say-so over that!).
I've created a Response object that, through MyBatis, mapped to our [responses] database table (we use PostgreSQL).
Actually, the OR-mapping structure we use is as follows:
ResponseMapper.xml - this is the XML file where the PSQL queries are defined and mapped to the ResponseMapper.java class and its methods
ResponseMapper.java - An interface used by MyBatis for executing the queries defined in the XML file (above)
ResponseDAO.java - An interface used for DI purposes (we use Spring)
ResponseDAOImpl.java - A concrete implementation of ResponseDAO that actually calls ResponseMapper methods; we use Spring to inject instances of this class into our application
So, to INSERT a new [responses] record into PostgreSQL, the code looks like this from the invoking component:
@Autowired
private ResponseDAO response;

public void doStuff()
{
    int action = getAction();
    response.setAction(action);
    response.insert();
}
This set up works beautifully for us. However I am now trying to write a set of JUnit tests for the ResponseDAOImpl class, and I want to make sure that it is correctly executing queries to our PostgreSQL database.
As far as I can tell, there is no way to "mock" an entire database. So my only option (seemingly) is to have the test method(s) execute a query, check for success, and then roll it back regardless.
MyBatis doesn't seem to support this kind of rollback feature. I found this post off the mybatis-user mailing list on Old Nabble, but the poster was using Guice and his/her question seemed to be more about rolling back transactions through Guice.
If MyBatis doesn't support transactions/rollbacks (does it?!?!), then it seems like my only reprieve would be if the PostgreSQL JDBC driver supports them. I guess I could then try to configure my test methods so that they run the ResponseDAO.insert() method, and then manually try to roll back the transaction directly through the driver (sans MyBatis).
Does SO have any experience with this? Code samples? Tips? Best practices? Thanks in advance!
MyBatis allows rollbacks when working with an SqlSession; the thing is you're using the Spring dependency injection piece, which automatically commits your transaction when the method completes.
You have a few options, among them
Inject a mock of your dependencies. There are some rocking libraries to help with this, like Mockito; here is a good question on Spring Mockito stuff. This will test the business logic in your Java code, but not your actual queries.
Commit your queries, and delete your data after the test runs. This is the approach we took, because it tests our database as well. You would obviously need a test instance of your database, which some people don't have.
You could try to provide your own test bindings for the classes that do the automatic commit in the MyBatis Spring integration and override their behavior so that in the test environment the query is rolled back instead of committed. A similar approach was used in the Guice integration, and it is described here.
Not sure this is what you need, but org.apache.ibatis.session.SqlSession class has rollback() method which can be used to rollback.
Another approach is to use the getConnection() method from the same class, which returns a javax.sql.Connection; that class also has commit() and rollback() methods.
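A hedged sketch of the SqlSession approach, assuming a plain MyBatis SqlSessionFactory configured outside the Spring integration (which would otherwise manage commits itself):

```java
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;

// sqlSessionFactory is assumed to be built elsewhere from the MyBatis config.
public void insertThenRollBack(SqlSessionFactory sqlSessionFactory,
                               Response response) {
    SqlSession session = sqlSessionFactory.openSession(); // autoCommit = false
    try {
        ResponseMapper mapper = session.getMapper(ResponseMapper.class);
        mapper.insert(response); // executes the INSERT against PostgreSQL
        // Assert against the uncommitted state here if desired, then:
        session.rollback(); // discard the change
    } finally {
        // Closing without commit() also rolls back any pending work.
        session.close();
    }
}
```

This keeps the test database unchanged while still exercising the real mapper XML and the real driver.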
Hope it helps.
Remis B
I'm running a test within a subclass of AbstractTransactionalTestNGSpringContextTests, where I execute tests over a partial Spring context. Each test runs within a transaction, which is rolled back at the end to leave the database unchanged.
One test writes to the database through Hibernate, while another reads from the same database using the JdbcTemplate, with both sharing the same datasource.
I'm finding that I can't see the Hibernate updates when querying through the JdbcTemplate. This does make some sense, as each is presumably getting its own connection from the connection pool and so is operating within its own transaction.
I've seen indications that it's possible to get the two to share a connection and transaction, but am not clear on the best way to set this up, especially with the involvement of the connection factory. All these components are declared as Spring beans. Can anyone give me any pointers?
Edit:
Well I've gone to the trouble of actually reading some documentation and the HibernateTransactionManager class states that this is definitely possible: "This transaction manager is appropriate for applications that use a single Hibernate SessionFactory for transactional data access, but it also supports direct DataSource access within a transaction (i.e. plain JDBC code working with the same DataSource)...".
The only requirement appears to be setting the datasource property, which isn't otherwise required. Having done that, though, I'm still not seeing my changes shared before the transaction has been committed. I will update if I get it working.
The Spring bean that writes has to flush its changes in order for the bean that reads to see them.
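A minimal sketch of that flush (assuming a Hibernate SessionFactory bean set up with HibernateTransactionManager and its dataSource property; names are illustrative):

```java
import org.hibernate.SessionFactory;
import org.springframework.transaction.annotation.Transactional;

// Hypothetical writer bean; sessionFactory is assumed to be injected.
public class EntityWriter {

    private final SessionFactory sessionFactory;

    public EntityWriter(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    @Transactional
    public void saveAndFlush(Object entity) {
        sessionFactory.getCurrentSession().save(entity);
        // Push the pending INSERT to the shared JDBC connection so that
        // JdbcTemplate queries in the same transaction can see the row
        // before the transaction is committed (or rolled back by the test).
        sessionFactory.getCurrentSession().flush();
    }
}
```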