Spring @Transactional does not roll back on Exception thrown - java

I've searched around for this question; there are quite a few of them here on Stack Overflow and Google, but I can't seem to get anything working for me.
Here is my code.
Spring config (I don't use any pointcut; I think I don't need to?):
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
...
</bean>
<bean id="hibernateSessionFactory" class="org.springframework.orm.hibernate3.LocalSessionFactoryBean">
<property name="dataSource" ref="dataSource" />
...
</bean>
<bean id="transactionManager" class="org.springframework.orm.hibernate3.HibernateTransactionManager">
<property name="sessionFactory" ref="hibernateSessionFactory"/>
</bean>
<tx:annotation-driven transaction-manager="transactionManager"/>
I have a Service class:
@Service
public class ServiceImpl implements ServiceInterface
{
    /**
     * Injected session factory
     */
    @Autowired(required=true)
    private SessionFactory sessionFactory;

    @Autowired(required=true)
    private Dao myDao;

    /**
     * {@inheritDoc}
     */
    @Transactional(rollbackFor=Exception.class, propagation=Propagation.REQUIRED)
    public void scheduleBlast(BlastParameters blastParameters) throws ServiceException
    {
        // ... do bunch of stuff ...
        myDao.persist(entity);
        if(true)
            throw new ServiceException("random error");
    }

    // ... setter methods and other stuff ...
}
and a Dao class:
public class DaoImpl implements DaoInterface
{
    @Autowired(required=true)
    private SessionFactory sessionFactory;

    /**
     * {@inheritDoc}
     */
    @Transactional(propagation=Propagation.MANDATORY)
    public void persist(Entity e) throws DaoException
    {
        try
        {
            sessionFactory.getCurrentSession().persist(e);
        }
        catch(Exception ex)
        {
            throw new DaoException(ex);
        }
    }

    // ... setter methods and other stuff ...
}
Some unnecessary details have been omitted (e.g. setters); assume the code otherwise works fine.
My problem is that when I add the line that throws the random exception, nothing rolls back: the object persisted through the DAO stays in the database.
I am using Spring 3.1 and Hibernate 3.6 (because there was a bug with Hibernate 4.0 on Spring 3.1).
Thoughts?
Thank you

I found the cause of my problem and why the transaction was (seemingly) not managed properly.
Somewhere in my code:
/**
 * {@inheritDoc}
 */
@Transactional(rollbackFor=Exception.class, propagation=Propagation.REQUIRED)
public void doWork(Parameters param) throws ServiceException
{
    // ... do bunch of stuff ...
    myDao1.persist(entity);

    // -- Some logic here --

    // ... do bunch of stuff ...
    myDao2.persist(entity2);
    if(true)
        throw new ServiceException("random error");
}
In the part that says "-- Some logic here --", there was some logic that uses raw SQL and calls executeUpdate:
Query query = sessionFactory.getCurrentSession().createSQLQuery(queryText);
query.executeUpdate();
Because it is not a Hibernate query but a raw SQL execution, it caused a flush to be called, and thus any work done prior to the call was committed along with it.
I reworked the flow of logic to account for this and make sure the transaction is managed properly. While using raw SQL might be an indication that something is wrong, it was necessary given what the service has to accomplish and in order to improve its performance.

That is the intended behavior of transaction management.
The default behavior for @Transactional is to roll back only for runtime exceptions.
If you want a rollback when a DaoException is thrown, add it to the rollbackFor list. Don't forget to include RuntimeException as well.
Try the following on the Dao class:
@Transactional(propagation=Propagation.MANDATORY, rollbackFor={RuntimeException.class, DaoException.class})

Try removing the @Transactional annotation from the DaoImpl class. I suspect what might be happening is that the transaction is being committed when it crosses back over that transaction boundary (DaoImpl). I've had mixed success with this setup; you can try some different transaction approaches for the "inner" transaction.
The other thing you can do is turn on Spring transaction logging; I think the category is org.springframework.transaction. That way you will see exactly what it is doing with respect to rollback and commit of transactions.

You don't have one of those JDBC drivers that are in AUTOCOMMIT mode by default, do you?
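If that does turn out to be the case, auto-commit can be switched off on the pool itself. A minimal sketch for the commons-dbcp BasicDataSource used in the question's config (the Java-config style and all connection properties here are placeholders, not the asker's actual settings):

import org.apache.commons.dbcp.BasicDataSource;
import org.springframework.context.annotation.Bean;

// Inside a @Configuration class: equivalent of the XML dataSource bean,
// with auto-commit disabled so that only the transaction manager commits.
@Bean(destroyMethod = "close")
public BasicDataSource dataSource() {
    BasicDataSource ds = new BasicDataSource();
    ds.setDriverClassName("com.mysql.jdbc.Driver");   // placeholder driver
    ds.setUrl("jdbc:mysql://localhost:3306/mydb");    // placeholder URL
    ds.setUsername("user");                           // placeholder
    ds.setPassword("secret");                         // placeholder
    ds.setDefaultAutoCommit(false);
    return ds;
}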

I had the same issue when opening connections manually in my Spring repository. After I started using JdbcTemplate, the error disappeared.
I used
@Autowired
private JdbcTemplate jdbc;
in my repository,
and then
jdbc.query(sql, new YourDTORowMapper()); or jdbc.update(sql); or jdbc.queryForObject(sql, new YourDTORowMapper()); depending on how many objects your query returns.
I also defined the row mapper:
public class YourDTORowMapper implements RowMapper<YourDTOObject> {
    @Override
    public YourDTOObject mapRow(ResultSet resultSet, int i) throws SQLException {
        YourDTOObject dto = new YourDTOObject();
        dto.setId(resultSet.getInt("id")); // here you can change to your column names
        dto.setName(resultSet.getString("name"));
        dto.setAmount(resultSet.getBigDecimal("amount"));
        return dto;
    }
}

Related

MyBatis Spring Transactions

I'm trying to properly use the Spring transaction-management functionality provided by MyBatis-Spring.
I'm creating the sqlSessionFactory in the following manner:
<bean id="sqlSessionFactory" class="org.mybatis.spring.SqlSessionFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="mapperLocations" value="classpath:some/package/**/*.xml" />
<property name="transactionFactory">
<bean class="org.mybatis.spring.transaction.SpringManagedTransactionFactory" />
</property>
</bean>
<bean id="transactionManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
<constructor-arg ref="dataSource" />
</bean>
Now there is a section called "Programmatic Transaction Management" here, which gets a reference to the transactionManager and uses it to commit or roll back depending on whether we got an exception or not.
Now my question is: in my DAO layer, should I explicitly do something like
public class UserDao extends SqlSessionDaoSupport {
    PlatformTransactionManager transactionManager; // wired using bean-property

    public void insertUser(Integer userId) throws Exception {
        // assumed: start a transaction explicitly to obtain the txStatus used below
        TransactionStatus txStatus = transactionManager.getTransaction(new DefaultTransactionDefinition());
        try {
            getSqlSession().insert("user-map.insertUser", userId);
        } catch (Exception e) {
            transactionManager.rollback(txStatus);
            throw e;
        }
        transactionManager.commit(txStatus);
    }
}
or whether just using the following (without programmatic transactions) will also perform all the insertions in a transactional way:
public class UserDao extends SqlSessionDaoSupport {
public void insertUser(Integer userId) {
getSqlSession().insert("user-map.insertUser", userId);
}
}
My mapper file looks something like this:
<insert id="insertUser" parameterType="HashMap">
<!-- this contains multiple insert queries -->
</insert>
Note that I have multiple inserts inside <insert>...</insert> and I want either all of them to happen or none of them.
This is another reference that I was using.
So the general question is: will MyBatis provide automatic transaction management around my <insert>...</insert>, or will I have to use the transactionManager explicitly to get transactional behavior?
Here's the quote from the documentation you referenced:
MyBatis SqlSession provides you with specific methods to handle transactions programmatically. But when using MyBatis-Spring your beans will be injected with a Spring managed SqlSession or a Spring managed mapper. That means that Spring will always handle your transactions.
With the setup you provided, the transaction lifespan is completely managed by Spring; that is, if you use declarative transaction management, you don't need to do anything additional. Spring will start the transaction at the point your configuration directs it to.
The simple way to enable declarative transaction management is to add this to spring configuration:
<tx:annotation-driven/>
And then use @Transactional on your service methods:
@Service
public class MyService {

    @Autowired
    private UserDao userDao;

    @Transactional
    public void addUser(User user) {
        userDao.insertUser(user);
    }
}
The section in the documentation you mentioned is about the (rare) cases when you want to use programmatic transaction management.
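For completeness, here is a minimal sketch of that programmatic style using Spring's TransactionTemplate, in case you ever hit one of those rare cases (the class name is made up, and it assumes the plain insertUser DAO from the second snippet plus the transactionManager bean from your config):

import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.TransactionStatus;
import org.springframework.transaction.support.TransactionCallbackWithoutResult;
import org.springframework.transaction.support.TransactionTemplate;

public class UserBatchService {

    private final UserDao userDao;
    private final TransactionTemplate txTemplate;

    public UserBatchService(UserDao userDao, PlatformTransactionManager txManager) {
        this.userDao = userDao;
        this.txTemplate = new TransactionTemplate(txManager);
    }

    public void addUsers(final Integer... userIds) {
        // One transaction around all inserts: a RuntimeException thrown inside the
        // callback rolls everything back, otherwise the template commits at the end.
        txTemplate.execute(new TransactionCallbackWithoutResult() {
            @Override
            protected void doInTransactionWithoutResult(TransactionStatus status) {
                for (Integer id : userIds) {
                    userDao.insertUser(id);
                }
            }
        });
    }
}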

What is the role of Service layer in spring and how do I split my logic in this scenario?

I am trying to write a simple webapp which takes an email id as a parameter and generates a token for the user with that id. I think my code is self-explanatory, so I will just paste it here so that I don't have to explain in detail.
This is my controller/servlet code
User user = userManager.getUserByEmailId("xyz@gmail.com");
if (user == null) {
//TODO handle this
}
if (user.getIssuedTokens() == user.getMaxTokens()) {
// TODO handle this
}
String token = tokenService.createToken();
user.setToken(token);
user.setIssuedTokens(user.getIssuedTokens() + 1);
userManager.updateUser(user);
userManager and tokenService are Service layer implementations.
#Service("tokenService")
public class TokenizationServiceImpl implements TokenizationService {
#Autowired
private TokenDAO tokenDAO;
#Transactional
public String createToken() {
String uuid = UUID.randomUUID().toString();
tokenDAO.createToken(uuid);
return uuid;
}
}
#Service("usermanager")
public class UserInterfaceImpl implements UserInterface {
#Autowired
private UserDAO userDAO;
#Transactional
public void createUser() {
userDAO.createUser();
}
public User getUserByEmailId(String emailID) {
return userDAO.getUserByEmailId(emailID);
}
#Transactional
public void updateUser(User user) {
userDAO.updateUser(user);
}
}
My Spring configuration is like this:
<tx:annotation-driven />
<context:component-scan base-package="com.myapp.dao" />
<context:component-scan base-package="com.myapp.service" />
<bean id="dataSource"
class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="driverClassName" value="com.mysql.jdbc.Driver" />
<property name="url" value="jdbc:mysql://localhost:3306/mydb" />
<property name="username" value="root" />
<property name="password" value="root" />
</bean>
<!-- dataSource TransactionManager -->
<bean id="transactionManager"
class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
<property name="dataSource" ref="dataSource" />
</bean>
Here are my questions:
Does it make sense to inject two services (usermanager and tokenService) into my controller/servlet and then invoke them one after another? Or should I have written one method in TokenServiceImpl directly, which would make use of the usermanager service or UserDAO directly?
The transactional attribute did not seem to work for me. When the updateUser method failed, a token was still created in the database by the createToken() method. What am I doing wrong here? Why did the transaction not roll back?
How do I, in general, decide whether my service should use multiple DAOs or other services?
Well, my thoughts about your second question may also answer your first one. Looking at your snippet, I notice that you are creating two different transactions for tokenService.createToken() and userManager.updateUser(user), since you are calling them from outside a @Transactional method. To fix this behavior, you will need to do something like:
public class UserService {
    ...
    @Transactional
    public void assignToken() {
        User user = userManager.getUserByEmailId("xyz@gmail.com");
        if (user == null) {
            //TODO handle this
        }
        if (user.getIssuedTokens() == user.getMaxTokens()) {
            // TODO handle this
        }
        String token = tokenService.createToken();
        user.setToken(token);
        user.setIssuedTokens(user.getIssuedTokens() + 1);
        userManager.updateUser(user);
    }
}
You can notice that, in order to take this new transactional behavior into account, I created a new business component called UserService. Even though I don't know your application well enough to say what the best approach is, I would definitely not leave this in your controller. In my opinion, you should isolate this behavior in:
a new business component (as I did in this example)
or encapsulate it inside your UserManager for instance
Now, it is up to you to decide whether it's worth coupling UserService to TokenizationService or creating a new business class for that. Reading the code you provided, it seems to me that UserService could have a TokenizationService, since tokens won't be used in a different context.
Please, let me know your opinions.
Yes, all the code in the controller should be in a transactional service. As it is, your service doesn't do anything other than delegate to methods of the DAO. The service is supposed to contain the business logic and to demarcate transactions.
Each of your services is transactional. So, when calling createToken(), a transaction starts and is committed as soon as createToken() returns. And when updateUser() is called, another transaction is started and commits or rolls back as soon as updateUser() returns/fails. That's one of the reasons why all the controller code should be in a single transactional service. If that were the case, both calls would be made in a single transaction, and if the second call failed, the whole transaction would be rolled back, including the token generation.
My rule is: if a service just needs to get data from the database, it should use a DAO. If it needs to reuse business logic already defined in another service, then it should delegate to that service.

What's the best way to share a connection between Hibernate's SessionFactory and a JDBC DAO?

I'm using Spring 3.0.6, with Hibernate 3.2.7.GA in a Java-based webapp. I'm declaring transactions with #Transactional annotations on the controllers (as opposed to in the service layer). Most of the views are read-only.
The problem is, I've got some DAOs which are using JdbcTemplate to query the database directly with SQL, and they're being called outside of a transaction. Which means they're not reusing the Hibernate SessionFactory's connection. The reason they're outside the transaction is that I'm using converters on method parameters in the controller, like so:
@Controller
@Transactional
public class MyController {

    @RequestMapping(value="/foo/{fooId}", method=RequestMethod.GET)
    public ModelAndView get(@PathVariable("fooId") Foo foo) {
        // do something with foo, and return a new ModelAndView
    }
}
public class FooConverter implements Converter<String, Foo> {

    @Override
    public Foo convert(String fooId) {
        // call FooService, which calls FooJdbcDao to look up the Foo for fooId
    }
}
My JDBC DAO relies on SimpleJdbcDaoSupport to have the jdbcTemplate injected:
#Repository("fooDao")
public class FooJdbcDao extends SimpleJdbcDaoSupport implements FooDao {
public Foo findById(String fooId) {
getJdbcTemplate().queryForObject("select * from foo where ...", new FooRowMapper());
// map to a Foo object, and return it
}
}
and my applicationContext.xml wires it all together:
<mvc:annotation-driven conversion-service="conversionService"/>
<bean id="conversionService" class="org.springframework.context.support.ConversionServiceFactoryBean">
<property name="converters">
<set>
<bean class="FooConverter"/>
<!-- other converters -->
</set>
</property>
</bean>
<bean id="jdbcTemplate" class="org.springframework.jdbc.core.JdbcTemplate">
<property name="dataSource" ref="dataSource"/>
</bean>
<bean id="transactionManager" class="org.springframework.orm.hibernate3.HibernateTransactionManager"
p:sessionFactory-ref="sessionFactory" />
FooConverter (which converts a path variable String to a Foo object) gets called before MyController#get() is called, so the transaction hasn't been started yet. Thus when FooJdbcDAO is called to query the database, it has no way of reusing the SessionFactory's connection, and has to check out its own connection from the pool.
So my questions are:
Is there any way to share a database connection between the SessionFactory and my JDBC DAOs? I'm using HibernateTransactionManager, and from looking at Spring's DataSourceUtils it appears that sharing a transaction is the only way to share the connection.
If the answer to #1 is no, then is there a way to configure OpenSessionInViewFilter to just start a transaction for us, at the beginning of the request? I'm using "on_close" for the hibernate.connection.release_mode, so the Hibernate Session and Connection are already staying open for the life of the request.
The reason this is important to me is that I'm experiencing problems under heavy load where each thread is checking out 2 connections from the pool: the first is checked out by hibernate and saved for the whole length of the thread, and the 2nd is checked out every time a JDBC DAO needs one for a query outside of a transaction. This causes deadlocks when the 2nd connection can't be checked out because the pool is empty, but the first connection is still held. My preferred solution is to make all JDBC DAOs participate in Hibernate's transaction, so that TransactionSynchronizationManager will correctly share the one single connection.
Is there any way to share a database connection between the SessionFactory and my JDBC DAOs? I'm using HibernateTransactionManager, and from looking at Spring's DataSourceUtils it appears that sharing a transaction is the only way to share the connection.
--> Well, you can share a database connection between the SessionFactory and JdbcTemplate. What you need to do is share the same DataSource between the two; the connection pool is then shared as well. I am using this in my application.
What you need to do is configure HibernateTransactionManager so that it manages transactions for both.
Add a JdbcDao class (with jdbcTemplate and dataSource properties and getters/setters) to your existing package structure (in the dao package/layer), and extend your JDBC implementation classes from JdbcDao. If you have already configured hibernateTxManager for Hibernate, you will not need to configure it again.
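A minimal sketch of such a base class (the name and structure are just an illustration; the key point is that it is wired with the same DataSource the Hibernate SessionFactory is built on):

import javax.sql.DataSource;
import org.springframework.jdbc.core.JdbcTemplate;

// Shared base class for JDBC DAOs; the DataSource injected here must be the
// same bean the SessionFactory uses, so that JDBC work can join the
// connection bound by HibernateTransactionManager.
public abstract class JdbcDao {

    private DataSource dataSource;
    private JdbcTemplate jdbcTemplate;

    public void setDataSource(DataSource dataSource) {
        this.dataSource = dataSource;
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    public DataSource getDataSource() {
        return dataSource;
    }

    protected JdbcTemplate getJdbcTemplate() {
        return jdbcTemplate;
    }
}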
The problem is, I've got some DAOs which are using JdbcTemplate to query the database directly with SQL, and they're being called outside of a transaction. Which means they're not reusing the Hibernate SessionFactory's connection.
--> You may be wrong here. You may already be using the same connection; I think the only problem may lie in the HibernateTransactionManager configuration.
Check HibernateTransactionManager javadoc : This transaction manager is appropriate for applications that use a single Hibernate SessionFactory for transactional data access, but it also supports direct DataSource access within a transaction (i.e. plain JDBC code working with the same DataSource). This allows for mixing services which access Hibernate and services which use plain JDBC (without being aware of Hibernate)!
Check my question : Using Hibernate and Jdbc both in Spring Framework 3.0
Configuration: add your DAO classes and service classes alongside your current Hibernate classes; do not make separate packages for them if you want to work with the existing configuration. Otherwise, configure HibernateTransactionManager in the XML configuration and use the @Transactional annotation.
Mistake in your code:
@Controller
@Transactional
public class MyController {......
Use the @Transactional annotation in service classes (best practice).
Correction:
@Transactional(readOnly = true)
public class FooServiceImpl implements FooService {

    public Foo getFoo(String fooName) {
        // do something
    }

    // these settings have precedence for this method
    @Transactional(readOnly = false, propagation = Propagation.REQUIRES_NEW)
    public void updateFoo(Foo foo) {
        // do something
    }
}

Spring/Hibernate Truncate/Delete all rows from a table - Transaction issues

I have a Camel project, and after we create a control bean we want to clean up a DB log table. So each time we run the application we TRUNCATE a table called agent_orders. This is set up in an entity object as a named query.
@NamedNativeQuery(name="cleanOrderTable", query="TRUNCATE agent_orders", resultClass=AgentOrderEntity.class)
The code that calls this query looks like:
#Component("mgr")
public class Controller{
#PersistenceContext(unitName="camel")
private EntityManager em;
.......
#Transactional
public void clearHistoricalOrders() throws Exception{
Query query = em.createNamedQuery("cleanOrderTable");
query.executeUpdate();
}
}
Calling the clear history method, we get the error javax.persistence.TransactionRequiredException: Executing an update/delete query.
I have tried everything: UserTransaction, em.getTransaction().begin() - nothing works. Any idea how I can run this query?
We have the following transaction manager set up in our application context XML:
<tx:annotation-driven transaction-manager="txManager" />
<bean id="txManager" class="org.springframework.orm.jpa.JpaTransactionManager"
p:dataSource-ref="dataSource">
<property name="entityManagerFactory" ref="emFactory" />
</bean>
Try debugging and check whether your controller is proxied and whether there's transaction-related code executed before your method. Try enabling database server logs to check what queries really get executed.
Make sure you don't have any ServletFilters that set up a read-only transaction prior to getting to your Controller.
Make sure your entity manager is the one that's passed to the transaction manager.
Also, I've found some info advising against using #PersistenceContext in servlets: http://weblogs.java.net/blog/ss141213/archive/2005/12/dont_use_persis.html
Hope this helps!
I'd try executing the query with a TransactionTemplate just to check whether the @Transactional annotation really isn't having an effect.
Also, what's up with resultClass= AgentOrderEntity.class? Why does a query that truncates a table need to return something?
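For what it's worth, a minimal sketch of that TransactionTemplate check, added to the Controller above and reusing its injected EntityManager (wiring the txManager bean into the controller, and the method name, are assumptions; imports from org.springframework.transaction and org.springframework.transaction.support are omitted):

@Autowired
private PlatformTransactionManager txManager;

public void clearHistoricalOrdersProgrammatically() {
    // If this succeeds while the @Transactional method still throws
    // TransactionRequiredException, the annotation is not being applied
    // (e.g. the bean is not proxied, or a different transaction manager is used).
    new TransactionTemplate(txManager).execute(new TransactionCallbackWithoutResult() {
        @Override
        protected void doInTransactionWithoutResult(TransactionStatus status) {
            em.createNamedQuery("cleanOrderTable").executeUpdate();
        }
    });
}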

Database access from jsr-303 custom validator

I'm using spring based validation in combination with hibernate validator enabled by the following in my application context:
<bean id="entityManagerFactory" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
....
<property name="jpaPropertyMap">
<map>
<entry key="javax.persistence.validation.factory" value-ref="validator" />
</map>
</property>
</bean>
<bean id="validator" class="org.springframework.validation.beanvalidation.LocalValidatorFactoryBean"/>
I've implemented a custom validator that accesses a database to check validity constraints for a particular object using a spring injected DAO. This results in a java.lang.StackOverflowError as it appears that the validation is called every time an object is loaded from the database from within the validator, causing an infinite loop. To get around this, I have tried setting the flush mode on my entity manager to commit from within the validator with the following code:
entityManager.setFlushMode(FlushModeType.COMMIT);
This results in a "collection was not processed by flush()" exception from Hibernate.
Is there an example of best practice in accessing the database from within a custom validator which will allow me to get around both of these issues?
After much experimentation, it would appear that the best way to do this is to use the EntityManagerFactory directly from within the code. In the initialize(...) method of the validator class I have the following code:
EntityManagerFactory emf = Persistence.createEntityManagerFactory("pu_name");
entityManager = emf.createEntityManager();
The downside is that you don't get Spring's DI features, but you can access the database nonetheless.
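In context, such a validator ends up looking roughly like this (a sketch only: the constraint name, persistence-unit name, entity, and query are all placeholders, not taken from the original post):

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;
import javax.validation.ConstraintValidator;
import javax.validation.ConstraintValidatorContext;

public class UniqueEmailValidator implements ConstraintValidator<UniqueEmail, String> {

    private EntityManager entityManager;

    @Override
    public void initialize(UniqueEmail constraintAnnotation) {
        // Bootstrap JPA directly instead of relying on Spring injection.
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("pu_name");
        entityManager = emf.createEntityManager();
    }

    @Override
    public boolean isValid(String email, ConstraintValidatorContext context) {
        if (email == null) {
            return true; // leave null checks to @NotNull
        }
        Long count = (Long) entityManager
                .createQuery("select count(u) from User u where u.email = :email")
                .setParameter("email", email)
                .getSingleResult();
        return count == 0;
    }
}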
I ran into this issue as well, here is how I solved it:
@Autowired bean works with @Valid on controller but fails with CRUD repository
In a few words, I also got a reference to the EntityManagerFactory object. However, I set the flush mode to FlushModeType.COMMIT just before calling my service method, and finally I set it back to FlushModeType.AUTO:
Here is an example:
public class UniqueUsernameValidator implements ConstraintValidator<UniqueUsername, String> {

    @PersistenceContext
    private EntityManager em;

    @Autowired
    UserService userService;

    @Override
    public void initialize(UniqueUsername constraintAnnotation) {
    }

    @Override
    public boolean isValid(String username, ConstraintValidatorContext context) {
        try {
            em.setFlushMode(FlushModeType.COMMIT);
            return userService.findByUsername(username) == null;
        } finally {
            em.setFlushMode(FlushModeType.AUTO);
        }
    }
}
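For reference, the UniqueUsername constraint that this validator is registered for would be declared with the usual JSR-303 boilerplate along these lines (the message text and targets are assumptions):

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import javax.validation.Constraint;
import javax.validation.Payload;

@Target({ ElementType.FIELD, ElementType.METHOD })
@Retention(RetentionPolicy.RUNTIME)
@Constraint(validatedBy = UniqueUsernameValidator.class)
public @interface UniqueUsername {

    String message() default "username is already taken";

    Class<?>[] groups() default {};

    Class<? extends Payload>[] payload() default {};
}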
