I'm working with Spring JdbcTemplate in some desktop applications.
I'm trying to roll back some database operations, but I don't know how to manage the transaction with this object (JdbcTemplate). I'm doing multiple inserts and updates through a sequence of methods. When any operation fails I need to roll back all previous operations.
Any idea?
Update... I tried to use @Transactional, but the rollback doesn't happen.
Do I need some previous configuration on my JdbcTemplate?
My Example:
@Transactional(rollbackFor = Exception.class, propagation = Propagation.REQUIRES_NEW)
public void testing() {
    jdbcTemplate.execute("Insert into Acess_Level(IdLevel,Description) values(1,'Admin')");
    jdbcTemplate.execute("Insert into Acess_Level(IdLevel,Description) values(STRING,'Admin')");
}
IdLevel is a NUMERIC column, so an exception will occur on the second command. When I look at the table in the database, I can see the first insert... but I think this operation should have been rolled back.
What is wrong?
JdbcTemplate doesn't handle transactions by itself. You should use a TransactionTemplate or @Transactional annotations: with these, you can group operations within a transaction and roll back all operations in case of errors.
@Transactional
public void someMethod() {
    jdbcTemplate.update(..);
}
In Spring, private methods don't get proxied, so the annotation will not work. See this question: Does Spring @Transactional attribute work on a private method?
Create a service and put @Transactional on the public methods that you want to be transactional. It is more idiomatic Spring to have simple DAO objects that each do one job, and to inject several of them, than to have one complicated DAO object that performs multiple SQL calls within its own transaction.
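For illustration, here is a minimal sketch of the TransactionTemplate alternative mentioned above; the transactionTemplate field (assumed to wrap the same DataSource as the JdbcTemplate, e.g. via a DataSourceTransactionManager), the method name, and the second insert's values are assumptions for the example:

public void insertAccessLevels() {
    transactionTemplate.execute(new TransactionCallbackWithoutResult() {
        @Override
        protected void doInTransactionWithoutResult(TransactionStatus status) {
            jdbcTemplate.execute("Insert into Acess_Level(IdLevel,Description) values(1,'Admin')");
            jdbcTemplate.execute("Insert into Acess_Level(IdLevel,Description) values(2,'User')");
            // any RuntimeException thrown here rolls back both inserts
        }
    });
}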
Related
When using Hibernate and JPA, I have an existing DAO Abstract Class that sets up an entity manager this way:
@PersistenceContext(unitName = "<name>")
private EntityManager entityManager;
And in certain methods, it's used in the following manner:
public ObjectType findByPrimaryKey(int id) {
    return entityManager.find(ObjectType.class, id);
}
I wanted to set a database configuration parameter in the same transaction as the "find" query. However, I can't seem to find the internal transaction that the entityManager uses. I wrote an Aspect that checks for the Transactional annotation and sets the variable there, and added @Transactional on top of the findByPrimaryKey method, but the variable still didn't get set in the session.
Is there something incorrect here, or is there another way to do it? Ideally, I want to set a special variable before every query.
The final solution was to combine Spring's @Transactional annotation, which automatically opens a transaction before calling my data access layer, with an Aspect that pointcuts on the @Transactional annotation and runs the queries within that transaction. Simply executing any query within an @Transactional method, before calling the Hibernate data access layer, will also work.
Without the @Transactional annotation, I couldn't control Hibernate's transaction management.
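A minimal sketch of the kind of aspect described above; the session parameter and its SQL are made up, and the aspect is assumed to be ordered so that it runs inside the transaction that @Transactional has already opened:

@Aspect
@Component
public class SessionParameterAspect {

    @PersistenceContext
    private EntityManager entityManager;

    // Runs at the start of every @Transactional method, inside the transaction Spring opened for it.
    @Before("@annotation(org.springframework.transaction.annotation.Transactional)")
    public void setSessionParameter() {
        entityManager.createNativeQuery("SET my.custom_parameter = 'value'").executeUpdate(); // hypothetical parameter
    }
}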
My application loads a list of entities that should be processed. This happens in a class that uses a scheduler:
@Component
class TaskScheduler {

    @Autowired
    private TaskRepository taskRepository;
    @Autowired
    private HandlingService handlingService;

    @Scheduled(fixedRate = 15000)
    @Transactional
    public void triggerTransactionStatusChangeHandling() {
        taskRepository.findByStatus(Status.OPEN).stream()
                .forEach(handlingService::handle);
    }
}
My HandlingService processes each task in isolation, using REQUIRES_NEW as the propagation level.
@Component
class HandlingService {

    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void handle(Task task) {
        try {
            processTask(task); // here the actual processing would take place
            task.setStatus(Status.PROCESSED);
        } catch (RuntimeException e) {
            task.setStatus(Status.ERROR);
        }
    }
}
The code works only because I started the parent transaction in the TaskScheduler class. If I remove the @Transactional annotation, the entities are no longer managed and the update to the task entity is not propagated to the DB. I don't find it natural to make the scheduled method transactional.
From what I see, I have two options:
1. Keep the code as it is today.
Maybe it's just me and this is a correct approach.
This variant has the fewest trips to the database.
2. Remove the @Transactional annotation from the scheduler, pass the id of the task, and reload the task entity in the HandlingService.
@Component
class HandlingService {

    @Autowired
    private TaskRepository taskRepository;

    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void handle(Long taskId) {
        Task task = taskRepository.findOne(taskId);
        try {
            processTask(task); // here the actual processing would take place
            task.setStatus(Status.PROCESSED);
        } catch (RuntimeException e) {
            task.setStatus(Status.ERROR);
        }
    }
}
Has more trips to the database (one extra query per element).
Can be executed using @Async.
Can you please offer your opinion on which is the correct way of tackling this kind of problem, or maybe another approach that I didn't know about?
If your intention is to process each task in a separate transaction, then your first approach actually does not work because everything is committed at the end of the scheduler transaction.
The reason for that is that in the nested transactions Task instances are basically detached entities (Sessions started in the nested transactions are not aware of those instances). At the end of the scheduler transaction Hibernate performs dirty check on the managed instances and synchronizes changes with the database.
This approach is also very risky, because there may be troubles if you try to access an uninitialized proxy on a Task instance in the nested transaction. And there may be troubles if you change the Task object graph in the nested transaction by adding to it some other entity instance loaded in the nested transaction (because that instance will now be detached when the control returns to the scheduler transaction).
On the other hand, your second approach is correct and straightforward and helps avoid all of the above pitfalls. The only change I would make is to read just the ids and commit the transaction (there is no need to keep it suspended while the tasks are being processed). The easiest way to achieve that is to remove the Transactional annotation from the scheduler and make the repository method transactional (if it isn't already).
If (and only if) the performance of the second approach is an issue, as you already mentioned you could go with asynchronous processing or even parallelize the processing to some degree. Also, you may want to take a look at extended sessions (conversations), maybe you could find it suitable for your use case.
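A minimal sketch of that suggestion; findIdsByStatus is an assumed repository method that returns only the ids and runs in its own short transaction:

@Component
class TaskScheduler {

    @Autowired
    private TaskRepository taskRepository;
    @Autowired
    private HandlingService handlingService;

    @Scheduled(fixedRate = 15000)
    public void triggerTransactionStatusChangeHandling() {
        // No @Transactional here: the repository call commits on its own, and each
        // task id is then processed in its own REQUIRES_NEW transaction.
        taskRepository.findIdsByStatus(Status.OPEN)
                .forEach(handlingService::handle);
    }
}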
The current code processes the task in the nested transaction, but updates the status of the task in the outer transaction (because the Task object is managed by the outer transaction). Because these are different transactions, it is possible that one succeeds while the other fails, leaving the database in an inconsistent state. In particular, with this code, completed tasks remain in status open if processing another task throws an exception, or the server is restarted before all tasks have been processed.
As your example shows, passing managed entities to another transaction makes it ambiguous which transaction should update these entities, and is therefore best avoided. Instead, you should be passing ids (or detached entities), and avoid unnecessary nesting of transactions.
Assuming that processTask(task) is a method in the HandlingService class (the same class as the handle(task) method), removing @Transactional in HandlingService won't work, because of the natural behavior of Spring's dynamic proxies.
Quoting from the spring.io forum:
When Spring loads your TestService it wraps it with a proxy. If you call a method of the TestService from outside the TestService, the proxy is invoked instead and your transaction will be managed correctly. However, if you call your transactional method from a method in the same object, you will not invoke the proxy but directly the target of the proxy, and you won't execute the code wrapped around your service to manage the transaction.
This is one of the SO threads about this topic, and here are some articles about it:
http://tutorials.jenkov.com/java-reflection/dynamic-proxies.html
http://tutorials.jenkov.com/java-persistence/advanced-connection-and-transaction-demarcation-and-propagation.html
http://blog.jhades.org/how-does-spring-transactional-really-work/
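To illustrate the self-invocation pitfall described in the quote above, here is a minimal, made-up sketch (it reuses the HandlingService name from the question; the trigger method is invented for the example):

@Component
class HandlingService {

    public void trigger(Task task) {
        // Plain "this.handle(task)" call: it bypasses the Spring proxy, so the
        // @Transactional advice on handle() never runs for this invocation.
        handle(task);
    }

    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void handle(Task task) {
        // transactional work
    }
}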
If you really don't like adding the @Transactional annotation to your @Scheduled method, you could get the transaction from the EntityManager and manage it programmatically, for example:
// only valid for a resource-local (application-managed) EntityManager
EntityTransaction tx = entityManager.getTransaction();
tx.begin();
try {
    processTask(task);
    task.setStatus(Status.PROCESSED);
    tx.commit();
} catch (Exception e) {
    tx.rollback();
}
But I doubt that you will take this route (well, I won't). In the end:
Can you please offer your opinion on which is the correct way of tackling this kind of problem?
There's no single correct way in this case. My personal opinion is that an annotation (for example, @Transactional) is just a marker, and you need an annotation processor (Spring, in this case) to make @Transactional work. An annotation has no impact at all without its processor.
I would worry more about, for example, why processTask(task) and task.setStatus(Status.PROCESSED) live outside of processTask(task) when they look like they belong to the same thing, etc.
HTH.
Environment: Spring 3, Custom Transaction Management, JDBC Transactions
I just read the Spring docs on using the transaction template to handle transaction management. It seemed overly complex so I want to ask:
Most of my transactions are JDBC related, meaning I just declare @Transactional on my service. But now I am making a REST service call to another site, which needs to be rolled back if any of the following JDBC operations fail; I'll provide the rollback code in this case.
As I progress in my method, in my transaction, I want to save a reference to the REST service call (needed to roll back that action), and upon an exception I just want a method myCustomRollback() called which can access the previously stored object.
Why not just provide a map in the transactionTemplate for storing stuff and define a custom rollback method on the @Transactional annotation?
This is the way I think about it; I'm not following the way Spring thinks about this. Can someone help me bridge the gap between what I want and how I can accomplish it most efficiently in Spring? I only need to do this for a few special-case operations.
To anyone still reading this:
I solved a similar problem with Spring events, as suggested by Den Roman in option 3.
Here's the basic idea (the scenario is fictional):
Whenever I perform external operations that need to be rolled back together with the transaction, I publish an event inside my @Transactional method using Spring's support (org.springframework.context.ApplicationEventPublisher):
@Transactional
public String placeOrder(Order order) {
    String orderId = orderServiceGateway.createOrder(order);
    applicationEventPublisher.publishEvent(new OrderCreatedEvent(orderId));
    workflowService.startWorkflow(orderId);
    return orderId;
}
The event itself can be any object - I created a POJO with details about the remote entity to be deleted.
Then I registered a special event listener that is bound to a transaction phase - in my case to the rollback:
@TransactionalEventListener(phase = TransactionPhase.AFTER_ROLLBACK)
public void rollBackOrder(OrderCreatedEvent orderCreatedEvent) {
    String orderId = orderCreatedEvent.getOrderId();
    orderServiceGateway.deleteOrder(orderId);
}
Of course, it's recommended to catch and log any exception from the rollback operation so as not to lose the original exception from the placeOrder() method.
By default these events are synchronous, but they can be made async by additional configuration.
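For example, a minimal sketch of the asynchronous variant, assuming an @EnableAsync configuration (and a suitable executor) is already in place:

@Async
@TransactionalEventListener(phase = TransactionPhase.AFTER_ROLLBACK)
public void rollBackOrder(OrderCreatedEvent orderCreatedEvent) {
    // runs on a separate thread after the original transaction has rolled back
    orderServiceGateway.deleteOrder(orderCreatedEvent.getOrderId());
}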
Here's a very good article on this mechanism, including detailed configuration and pitfalls: Transaction Synchronization and Spring Application Events (DZone)
While I don't like the solution 100%, because it clutters the business logic with event-publishing code and binds it to Spring, it definitely does what I expect it to do and makes it possible to pass context from the transactional method to the rollback method - something that is not available with a traditional try/catch block outside the transactional method (unless you put your context in the exception itself, which is not very nice).
I've re-read your question a few times and am not sure I understand it completely. I assume you're executing someCode and, if it fails, you would like to execute myCustomRollback, which has some information about someCode. So I'll try to provide a generic answer.
If you want Spring to roll back some code, it will only roll back what is rollback-able, like JDBC transactions. Assume you have a method which performs two calls:
@Transactional
public void doStuff(SomeEntity entity, File file) {
    persist(entity);
    customFileService.createOnFileSystem(file);
    throw new RuntimeException();
}
So the code above will always roll back. It will undo the persisting of your entity, but not the creation of your file, since that is not managed by Spring transactions, unless you provide a custom implementation to make it so.
Second, Spring provides two ways of working with transactions:
Spring AOP: a proxy is created at runtime which decorates your code with transactional behavior. If your class is named MyClass, Spring creates a class named MyClassProxy, which wraps your code in transactional code.
AspectJ: at compile time your .class file is adjusted and the transactional code is embedded inside your method.
The AspectJ approach seems harder to configure, but it isn't really, and it is easier to use, since anything annotated with @Transactional will have the code woven in. With Spring AOP this is not the case: transactional inner method calls, for instance, will be ignored! So AspectJ provides a more intuitive approach.
Back to what I think your question is (the code is all in one class):
public void doSomeCode() {
    Object restCall = initialize();
    try {
        execute(restCall);
    } catch (CustomException e) {
        myCustomRollback(restCall, e);
    }
}

@Transactional(rollbackFor = CustomException.class)
private void execute(Object restCall) throws CustomException {
    // jdbc calls..
    restCall = callRest(restCall);
    throw new CustomException();
}

void myCustomRollback(Object restCall, CustomException e) {
    ...
}
The code above will only work with AspectJ, since you're making inner method calls to a method which is also private - AOP at runtime cannot handle this.
So what happens is that everything (which is rollback-able) in execute will be rolled back. And in doSomeCode you have information about the objects which were used in execute, which you can now use in myCustomRollback to roll back your REST stuff manually.
Not sure if I answered this question properly, but I hope it helps someone with a similar problem.
Option 1 is to implement your own transaction manager by extending an existing one.
Option 2 is to use the TransactionSynchronizationManager class (see the sketch after this list).
Option 3 is to use @TransactionalEventListener, in case you have Spring 4.
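A minimal sketch of option 2, registering a callback on the current transaction from inside the @Transactional method; restCallReference and myCustomRollback are assumed names taken from the question:

// Must be called while a Spring-managed transaction is active.
TransactionSynchronizationManager.registerSynchronization(new TransactionSynchronizationAdapter() {
    @Override
    public void afterCompletion(int status) {
        if (status == TransactionSynchronization.STATUS_ROLLED_BACK) {
            myCustomRollback(restCallReference); // hypothetical compensation for the REST call
        }
    }
});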
In Spring transaction management, the default behavior for automatic rollback applies only to unchecked exceptions.
So, for a custom exception:
@Transactional(rollbackFor = CustomException.class, noRollbackFor = RuntimeException.class)
public void doSomething(...)
the transaction will be rolled back if there is an exception that matches the one specified. If an exception does not match, it is propagated to the caller of the service, or wrapped in a TransactionRolledBackException.
If you use org.springframework.transaction.PlatformTransactionManager directly, exception handling is more manageable than with the template.
check the documentation http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/transaction.html
You can use AfterThrowing advice (run when an exception is thrown) and call your method (myCustomRollback()) there; you can use the TransactionSynchronizationManager class to get the current transaction and roll it back...
Alternatively, you can use Around advice to begin and commit/rollback your transaction (this way you can still use the Spring-provided transaction manager via the TransactionSynchronizationManager class).
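As an illustration, a minimal sketch of the AfterThrowing idea; the pointcut expression, the service class, and myCustomRollback are made-up names, not taken from the question's actual code:

@Aspect
@Component
public class RestCompensationAspect {

    // Invoked whenever the matched service method throws a RuntimeException.
    @AfterThrowing(pointcut = "execution(* com.example.MyService.doStuff(..))", throwing = "ex")
    public void compensate(RuntimeException ex) {
        myCustomRollback(ex); // hypothetical compensation logic for the REST call
    }

    private void myCustomRollback(RuntimeException ex) {
        // undo the REST side effects here
    }
}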
The following test cases are functionally working properly, but the test method that creates a new article in the database doesn't roll back at the end of the test execution.
I expect it to work that way, since a test case that updates an article does roll back the update at the end of its execution.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(value = "/applicationContext-test.xml")
@TransactionConfiguration(transactionManager = "txManager", defaultRollback = true)
@Transactional
public class PriceRepositoryTest {

    @Resource(name = "repository")
    private PriceRepository repository;

    @Test
    public void testGetAll() throws Exception {
        Assert.assertEquals(8, repository.getAll().size());
    }

    @Test
    @Rollback
    public void shouldSaveNewArticle() {
        Article article = new Article();
        article.setName("Article");
        article.setPrice(33);
        repository.save(article);
        Assert.assertEquals(9, repository.getAll().size());
    }

    @Test
    @Rollback
    public void shouldUpdateArticle() {
        Article article = repository.getArticle(4);
        article.setPrice(33);
        repository.update(article);
        Assert.assertEquals(String.valueOf(33.0), String.valueOf(repository.getArticle(4).getPrice()));
    }
}
Is your DAO perhaps also marked with @Transactional? If it is, that's the problem: the transactionality on a different layer doesn't know about your local transaction configuration.
If repository.update(article) is @Transactional, it may or may not start a new transaction (depending on the value of the propagation attribute), but it will commit the transaction after execution, and there's nothing your test method can do to intercept that.
That's one of the reasons why transactions should be started on the service level, not the DAO level.
(If that's not the case, I humbly apologize)
I also encountered this issue and spent some hours trying to find the root cause. The issue was a variant of the problems described here: in my case, the application calls stored procedures through Spring, and one of those procedures contained a COMMIT statement.
Commits should always be controlled by the app; if there is a stray commit somewhere in a stored procedure, Spring can't control the transaction and tests won't roll back.
I just encountered this with my unit tests: everything was set to roll back, yet my record was still showing up in the database after the test finished. The reason was that the DAO method contained a call to the entity manager's flush(), which forced the transaction to commit.
em.persist(jpaServer);
em.flush(); //commits the record no matter what spring is setup to do
Taking the flush out, I confirmed there was no record. I tested the same code with the test annotated with @Rollback(false) and confirmed the record was there (proving the flush was not needed).
Ben
www.nimbits.com
I want to test my DAO class using the Spring context test support.
In my test class I extended AbstractTransactionalJUnit4SpringContextTests so that the test class integrates with JUnit 4. I have also set up the configuration and do the initialization and database clean-up in the @Before method and the tear-down in the @After method. My test class works perfectly.
My problem is that when I run my test class against a database filled with data, the original data is not rolled back and my database ends up cleared. In the @Before method, I clear the database and populate data, thinking that I will be able to roll it back, but it doesn't roll back.
Can anyone cite an example that works and rolls back the information in the database?
ADDONS:
Every database manipulation in my test methods is rolled back, but the execution of super.deleteFromTables("person") in the @Before method was not rolled back, so all the previous data was lost from the database.
Spring rolls back all the CRUD operations, but the database clean-up done before the transaction does not roll back.
Thank you to all those who answered my question. I learned a lot from those answers but it didn't solve my problem.
I knew my test does transaction management and it does its job properly.
The mistake is on my part.
I forgot the lesson about database commands: when you execute a DDL statement after a DML statement, the transaction is automatically committed. I executed DDL after DML by deleting all records and then ALTERing the AUTO_INCREMENT of the table, which caused an auto-commit and deleted all records of the table permanently.
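For illustration, a minimal sketch of that sequence (the person table is taken from the question; MySQL implicit-commit semantics and a jdbcTemplate field are assumed):

// The ALTER TABLE is DDL and implicitly commits the DELETE before it,
// so Spring can no longer roll the deletion back.
jdbcTemplate.update("DELETE FROM person");
jdbcTemplate.execute("ALTER TABLE person AUTO_INCREMENT = 1"); // implicit commit here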
FIXING THAT SCENARIO SOLVED MY PROBLEM.
Possible causes:
you're using a database/database engine which does not have proper transactions;
you're using multiple transaction managers and/or data sources and the proper one is not picked up;
you're doing your own, separate, transactions in the test class
As for an example, here's one (off the top of my head, not compiled):
public class DBTest extends AbstractTransactionalJUnit4SpringContextTests {

    @Autowired
    private SomeDAO _aBeanDefinedInMyContextFile;

    @Test
    public void insert_works() {
        assert _aBeanDefinedInMyContextFile.findAll().size() == 0;
        _aBeanDefinedInMyContextFile.save(new Bean());
        assert _aBeanDefinedInMyContextFile.findAll().size() == 1;
    }
}
Key points:
the SomeDAO is an interface which corresponds to a bean declared in my context;
the bean does not have any transactional settings (advice/programmatic); it relies on the caller being transactional - either the service in production, or the test in our situation;
the test does not include any transactional management code, as it's all done in the framework.
I'm not sure what is wrong with your class. Here is an extract of a class that does what you want with DbUnit and Spring 2.5:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"testContext.xml"})
@TransactionConfiguration
@Transactional
public class SampleDAOTest {

    @Autowired
    private DataSource dataSource;

    @Autowired
    private SampleDAO sampleDAO;

    @Before
    public void onSetUpInTransaction() throws Exception {
        // Populate test data
        IDatabaseConnection dbUnitCon = new DatabaseConnection(DataSourceUtils.getConnection(dataSource), "DATASOURCE");
        // read in from a dbunit excel file of test data
        IDataSet dataSet = new XlsDataSet(new File("src/test/resources/TestData.xls"));
        DatabaseOperation.INSERT.execute(dbUnitCon, dataSet);
    }

    @Test
    public void testGetIntermediaryOrganisation() {
        // Test getting a user
        User object = sampleDAO.getUser(99L);
        assertTrue(object.getValue());
    }
}
One of the benefits of this method is that you don't need to extend any classes, so you can still have your own hierarchy for tests.
If you really want to stick to your current method instead of using the @Before annotation, I think you need to override the method below and put your setup code in there:
@Override
public void onSetUpInTransaction() throws Exception {...}
Hope this helps
Sidestepping your question, I suggest that you use a separate database instance to run your tests against. That way, you can safely wipe it clean and have your tests initialize it as required.
As far as I know, the Spring support classes for database testing only roll back what happens in the tests, not what happens in the setup and teardown of tests.
I agree with Confusion: you should be running your tests against their own database schema.
With this, you can set your Hibernate hbm2ddl property to 'create-drop':
With create-drop, the database schema will be dropped when the SessionFactory is closed explicitly.
See: Optional Hibernate configuration properties
Example snippet:
<bean id="sessionBean" class="org.springframework.orm.hibernate3.LocalSessionFactoryBean">
<property name="dataSource" ref="dataSource"/>
<property name="hibernateProperties">
<props>
<prop key="hibernate.hbm2ddl.auto">create-drop</prop>
...etc
While I agree with the others suggesting a dedicated DB for testing, there isn't any reason why using a populated DB shouldn't work: both the @Before and @After methods are executed within the transactional context, so their changes should be rolled back.
Possibilities:
The data setup is doing something that isn't transactional (i.e. DDL statements)
Something in your test is actually committing the transaction
Can you post the @Before method? I'm just wondering if you are just clearing the tables or actually dropping and recreating them.
As far as I can tell, by looking at the Javadocs and source code for AbstractJUnit4SpringContextTests and TransactionalTestExecutionListener, you need to annotate the test methods you want transactionalised with @Transactional.
There are also @BeforeTransaction and @AfterTransaction annotations with which you can better control what runs inside the transaction.
I suggest you create methods annotated with all these annotations, including @Before, and then run the test with breakpoints at these methods. That way you can look at the stack and work out whether Spring has started a transaction for you or not. If you see something like "TransactionInterceptor" in the stack, or anything else with "Transaction" in the name, chances are you're in a transaction.
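A minimal sketch of how those annotations could be combined in a test class (method names and bodies are illustrative only):

@BeforeTransaction
public void beforeTransaction() {
    // runs before the transaction for the test method is started
}

@Before
public void setUpTestData() {
    // runs inside the test transaction, so changes made here are rolled back
}

@AfterTransaction
public void afterTransaction() {
    // runs after the transaction has been committed or rolled back
}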
You're doing super.deleteFromTables in your @Before method, which runs within the tx. So if the tx is rolled back, don't the deletions get rolled back as well?