Essence:
How can I auto-rollback my hibernate transaction in a JUnit Test run with JBehave?
The problem seems to be that JBehave wants the SpringAnnotatedEmbedderRunner, but annotating a test as @Transactional requires the SpringJUnit4ClassRunner.
I've tried to find documentation on how to implement rollback with the SpringAnnotatedEmbedderRunner, or how to make JBehave work using the SpringJUnit4ClassRunner, but I couldn't get either to work.
Does anyone have a (preferably simple) setup that runs JBehave stories with Spring and Hibernate and transaction auto-rollback?
Further info about my setup so far:
Working JBehave with Spring - but not with auto-rollback:
@RunWith(SpringAnnotatedEmbedderRunner.class)
@Configure(parameterConverters = ParameterConverters.EnumConverter.class)
@UsingEmbedder(embedder = Embedder.class, generateViewAfterStories = true, ignoreFailureInStories = false, ignoreFailureInView = false)
@UsingSpring(resources = { "file:src/main/webapp/WEB-INF/test-context.xml" })
@UsingSteps
@Transactional // << won't work
@TransactionConfiguration(...) // << won't work
// both require the SpringJUnit4ClassRunner
public class DwStoryTests extends JUnitStories {
protected List<String> storyPaths() {
String searchInDirectory = CodeLocations.codeLocationFromPath("src/test/resources").getFile();
return new StoryFinder().findPaths(searchInDirectory, Arrays.asList("**/*.story"), null);
}
}
In my test steps I can @Inject everything nicely:
@Component
@Transactional // << won't work
public class PersonServiceSteps extends AbstractSmockServerTest {
@Inject
private DatabaseSetupHelper databaseSetupHelper;
@Inject
private PersonProvider personProvider;
@Given("a database in default state")
public void setupDatabase() throws SecurityException {
databaseSetupHelper.createTypes();
databaseSetupHelper.createPermission();
}
@When("the service $service is called with message $message")
public void callServiceWithMessage(String service, String message) {
sendRequestTo("/personService", withMessage("requestPersonSave.xml")).andExpect(noFault());
}
@Then("there should be a new person in the database")
public void assertNewPersonInDatabase() {
Assert.assertEquals("Service did not save person: ", 1, personProvider.count());
}
(yes, the databaseSetupHelper methods are all transactional)
PersonProvider is basically a wrapper around org.springframework.data.jpa.repository.support.SimpleJpaRepository. So there is access to the entityManager, but taking control over the transactions (with begin/rollback) didn't work, I guess because of all the @Transactional handling that is done under the hood inside that helper class.
Also I read that JBehave runs in a different context/session/something, which causes loss of control over the transaction started by the test? Pretty confusing stuff...
edit:
Edited the above, rephrasing the post to reflect my current knowledge and shortening the whole thing so that the question becomes more obvious and the setup less obtrusive.
I think you can skip the SpringAnnotatedEmbedderRunner and provide the necessary configuration to JBehave yourself. For example instead of
@UsingEmbedder(embedder = Embedder.class, generateViewAfterStories = true, ignoreFailureInStories = false, ignoreFailureInView = false)
you can do
configuredEmbedder()
.embedderControls()
.doGenerateViewAfterStories(true)
.doIgnoreFailureInStories(false)
.doIgnoreFailureInView(false);
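Putting that together, a self-configured test class might look roughly like this. This is only a sketch, assuming jbehave-spring's SpringStepsFactory is on the classpath; whether a surrounding @Transactional then actually rolls back what the steps write is exactly the open question here:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = "file:src/main/webapp/WEB-INF/test-context.xml")
public class DwStoryTests extends JUnitStories {

    @Autowired
    private ApplicationContext context; // injected by the Spring runner

    public DwStoryTests() {
        configuredEmbedder()
                .embedderControls()
                .doGenerateViewAfterStories(true)
                .doIgnoreFailureInStories(false)
                .doIgnoreFailureInView(false);
    }

    @Override
    public Configuration configuration() {
        return new MostUsefulConfiguration();
    }

    @Override
    public InjectableStepsFactory stepsFactory() {
        // picks up the @Component step classes from the Spring context
        return new SpringStepsFactory(configuration(), context);
    }

    @Override
    protected List<String> storyPaths() {
        String searchInDirectory = CodeLocations.codeLocationFromPath("src/test/resources").getFile();
        return new StoryFinder().findPaths(searchInDirectory, Arrays.asList("**/*.story"), null);
    }
}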
Besides: why do you want to roll back the transaction? Typically you use JBehave for acceptance tests, which run in a production-like environment. For example, you first set up some data in the database, access it via Browser/Selenium and check the results. For that to work the DB transaction has to be committed. If you need to clean up manually after your tests, you can do so in @AfterStories or @AfterScenario annotated methods.
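For example, a minimal cleanup sketch (the repository and what it deletes are hypothetical, not from the original post):
@AfterScenario
public void cleanUpTestData() {
    // undo whatever the scenario committed, since there is no rollback
    personRepository.deleteAll();
}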
I made it work by controlling the transaction scope manually, rolling the transaction back after each scenario. Just follow the official guide on how to use Spring with JBehave and then do the trick shown below.
@Component
public class MySteps
{
@Autowired
MyDao myDao;
@Autowired
PlatformTransactionManager transactionManager;
TransactionStatus transaction;
@BeforeScenario
public void beforeScenario() {
transaction = transactionManager.getTransaction(new DefaultTransactionDefinition());
}
@AfterScenario
public void afterScenario() {
if (transaction != null)
transactionManager.rollback(transaction);
}
@Given("...")
public void persistSomething() {
myDao.persist(new Foo());
}
}
I'm not familiar with JBehave, but it appears you're searching for this annotation.
@TransactionConfiguration(transactionManager = "transactionManager", defaultRollback = true).
You could also set defaultRollback to true in your testContext.
Related
I have a Spring Boot app with multiple databases, and in the services I want to talk to those databases and combine info from two of them in the same method.
@Service
@Transactional("transactionManager1")
public class Service1 {
@Service
@Transactional("transactionManager2")
public class Service2 {
I have some methods in Service1 that also call methods from Service2 to get some data from the other database.
It worked fine until I added threading. In Service1 I added CompletableFuture.supplyAsync(....) that calls a method from Service1 that in turn eventually calls Service2 methods. And at some point it just throws a TransactionException.
To fix this I thought I would manually create a transaction before CompletableFuture.supplyAsync(....) and commit it afterwards. While searching for how to do that I got the idea from CompletableFuture vs Spring Transactions and started writing the following code (I use PlatformTransactionManager instead):
@TimeLimiter(name = "method1")
public CompletableFuture<Void> method1Async(long arg1) {
TransactionDefinition txDef = new DefaultTransactionDefinition();
TransactionStatus txStatus = platformTransactionManager.getTransaction(txDef);
CompletableFuture<Void> x = null;
try {
x = CompletableFuture.supplyAsync(() -> {
Void y = this.method1(arg1);
platformTransactionManager.commit(txStatus);
return y;
})
.toCompletableFuture();
} catch (Exception e){
platformTransactionManager.rollback(txStatus);
}
return x;
}
But after still getting an exception about the transaction I realized that since I use multiple databases I probably need multiple transaction managers. But while reading up on those transaction managers I didn't find any way to tell which database a given one is for.
How do I create another transaction for transactionManager2?
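For what it's worth, the usual way to target a specific database programmatically is to inject the corresponding manager by its bean name and start the transaction on that instance. A minimal sketch, assuming the bean names match the ones used in the @Transactional annotations above:
@Autowired
@Qualifier("transactionManager2")
private PlatformTransactionManager transactionManager2;

public void workAgainstSecondDatabase() {
    // a transaction bound to the second database's DataSource
    TransactionStatus tx = transactionManager2.getTransaction(new DefaultTransactionDefinition());
    try {
        // ... calls that hit the second database ...
        transactionManager2.commit(tx);
    } catch (RuntimeException e) {
        transactionManager2.rollback(tx);
        throw e;
    }
}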
I just can't seem to win today...
Is there a way to read from a OneToMany relationship in a Spock SpringBootTest integration test, without annotating the test as @Transactional or adding the unrealistic spring.jpa.properties.hibernate.enable_lazy_load_no_trans=true?
OR, is there a way to launch a Spring-Batch Job from within a @Transactional test case?
Let me elaborate...
I'm trying to get a simple Spring Boot integration test working for my Spring Batch reporting process, which reads from a tangled web of DB2 tables and generates a series of change messages for interested systems. I'm using the Groovy Spock testing framework and an H2 in-memory database filled with a representative slice of my DB2 tables' data.
At the beginning of the test, I'm attempting to use every entity from a given Table to generate entries in a change-tracking table that drives my messaging.
setup:
List allExistingTestPeople = peopleRepository.findAll()
Collections.shuffle(allExistingTestPeople)
allExistingTestPeople?.each { Person person ->
List<Nickname> nicknames = person.nicknames
nicknames?.each { Nickname nickname ->
changeTrackingRepository.save(new Change(personId: person.id, nicknameId: nickname.id, status: NEW))
}
}
Given these as my DB2 domain classes:
@Entity
@Table(name = "T_PERSON")
public class Person {
@Id
@Column(name = "P_ID")
private Integer id;
@Column(name = "P_NME")
private String name;
@OneToMany(targetEntity = Nickname.class, mappedBy = "person")
private List<Nickname> nicknames;
}
@Entity
@Table(name = "T_NICKNAME")
public class Nickname {
@EmbeddedId
private PersonNicknamePK id;
@Column(name = "N_NME")
private String nickname;
@ManyToOne(optional = false, targetEntity = Person.class)
@JoinColumn(name = "P_ID", referencedColumnName="P_ID", insertable = false, updatable = false)
private Person person;
}
@Embeddable
public class PersonNicknamePK implements Serializable {
@Column(name="P_ID")
private int personId;
@Column(name="N_ID")
private short nicknameId;
}
But I'm getting this LazyInitializationException, even though I'm reading from that OneToMany relationship within the context of a test case...
org.hibernate.LazyInitializationException: failed to lazily initialize a collection of role: com.my.package.db2.model.Person.nicknames, could not initialize proxy - no Session
at org.hibernate.collection.internal.AbstractPersistentCollection.throwLazyInitializationException(AbstractPersistentCollection.java:602)
at org.hibernate.collection.internal.AbstractPersistentCollection.withTemporarySessionIfNeeded(AbstractPersistentCollection.java:217)
at org.hibernate.collection.internal.AbstractPersistentCollection.readSize(AbstractPersistentCollection.java:161)
at org.hibernate.collection.internal.PersistentBag.size(PersistentBag.java:350)
I came across the advice online to annotate my test case with the @Transactional annotation, which definitely got me a little further, allowing me to read from this OneToMany relationship. However, when I then attempt to launch the Spring Batch Job I'd like to test from my when clause:
@Transactional
def "Happy path test to validate I can generate a report of changes"() {
setup:
//... See above
when:
service.launchBatchJob()
then:
//... Messages are generated
}
I'm getting the exception that a Spring Batch Job can't be launched from the context of a transaction! Even though I'm using an in-memory job repository via ResourcelessTransactionManager and MapJobRepositoryFactoryBean, since this is just a short-lived scheduled script I'm writing...
java.lang.IllegalStateException: Existing transaction detected in JobRepository. Please fix this and try again (e.g. remove @Transactional annotations from client).
at org.springframework.batch.core.repository.support.AbstractJobRepositoryFactoryBean$1.invoke(AbstractJobRepositoryFactoryBean.java:177)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:212)
at com.sun.proxy.$Proxy125.createJobExecution(Unknown Source)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:134)
at com.my.package.service.MyService.launchBatchJob(MyService.java:30)
The only thing that seems to work so far is if I scrap the @Transactional annotation and instead add spring.jpa.properties.hibernate.enable_lazy_load_no_trans=true to my application-test.properties file. BUT, this doesn't seem like a very good idea, because it's not realistic - if I add this, then even if there's a bug in my code due to a lazy initialization exception, I'd never see it in the tests.
Sorry for the novel, hoping someone can point me in the right direction :(
EDIT:
Also here's my in-memory Spring Batch configuration, in which I've tried turning off the transaction validation. Unfortunately, while this gets me a little further, the Spring Batch partitioner's autowired EntityManager suddenly fails to run queries against the H2 database.
@Configuration
@EnableBatchProcessing
public class InMemoryBatchManagementConfig {
@Bean
public ResourcelessTransactionManager resourceslessTransactionManager() {
ResourcelessTransactionManager resourcelessTransactionManager = new ResourcelessTransactionManager();
resourcelessTransactionManager.setNestedTransactionAllowed(true);
resourcelessTransactionManager.setValidateExistingTransaction(false);
return resourcelessTransactionManager;
}
@Bean
public MapJobRepositoryFactoryBean mapJobRepositoryFactory(ResourcelessTransactionManager txManager)
throws Exception {
MapJobRepositoryFactoryBean factory = new MapJobRepositoryFactoryBean(txManager);
factory.setValidateTransactionState(false);
factory.afterPropertiesSet();
return factory;
}
@Bean
public JobRepository jobRepository(MapJobRepositoryFactoryBean factory) throws Exception {
return factory.getObject();
}
@Bean
public SimpleJobLauncher jobLauncher(JobRepository jobRepository) throws Exception {
SimpleJobLauncher launcher = new SimpleJobLauncher();
launcher.setJobRepository(jobRepository);
launcher.afterPropertiesSet();
return launcher;
}
@Bean
public JobExplorer jobExplorer(MapJobRepositoryFactoryBean factory) {
return new SimpleJobExplorer(factory.getJobInstanceDao(), factory.getJobExecutionDao(),
factory.getStepExecutionDao(), factory.getExecutionContextDao());
}
@Bean
public BatchConfigurer batchConfigurer(MapJobRepositoryFactoryBean mapJobRepositoryFactory,
ResourcelessTransactionManager resourceslessTransactionManager,
SimpleJobLauncher jobLauncher,
JobExplorer jobExplorer) {
return new BatchConfigurer() {
@Override
public JobRepository getJobRepository() throws Exception {
return mapJobRepositoryFactory.getObject();
}
@Override
public PlatformTransactionManager getTransactionManager() throws Exception {
return resourceslessTransactionManager;
}
@Override
public JobLauncher getJobLauncher() throws Exception {
return jobLauncher;
}
@Override
public JobExplorer getJobExplorer() throws Exception {
return jobExplorer;
}
};
}
}
This error happens because your code will already be executed in a transaction driven by Spring Batch. So running the job in the scope of a transaction is not correct. However, if you still want to disable the transaction validation done by the job repository, you can set validateTransactionState to false, see AbstractJobRepositoryFactoryBean#setValidateTransactionState.
That said, running the job in a transaction is not the way to fix org.hibernate.LazyInitializationException. The property spring.jpa.properties.hibernate.enable_lazy_load_no_trans=true is there for a reason, and if it works for you, I believe it is a better approach than running the entire job in a transaction (and btw, if I had to use a transaction for that, I would narrow its scope to the minimum (for example the step) and not the entire job).
You can do transactions programmatically using TransactionTemplate to run only the "setup" inside a transaction (instead of having everything in @Transactional). Unfortunately this way the transaction will be committed and you will need to do some manual cleanup.
It can be autowired as any other bean:
@Autowired
private TransactionTemplate transactionTemplate;
...and it's used this way:
transactionTemplate.execute((transactionStatus) -> {
// ...setup...
return null; // alternatively you can return some data out of the callback
});
I am trying to implement a multi-threaded solution so I can parallelize my business logic that includes reading and writing to a database.
Technology stack: Spring 4.0.2, Hibernate 4.3.8
Here is some code to discuss on:
Configuration
@Configuration
public class PartitionersConfig {
@Bean
public ForkJoinPoolFactoryBean forkJoinPoolFactoryBean() {
final ForkJoinPoolFactoryBean poolFactory = new ForkJoinPoolFactoryBean();
return poolFactory;
}
}
Service
@Service
@Transactional
public class MyService {
@Autowired
private OtherService otherService;
@Autowired
private ForkJoinPool forkJoinPool;
@Autowired
private MyDao myDao;
public void performPartitionedActionOnIds() {
final ArrayList<UUID> ids = otherService.getIds();
MyIdsPartitioner task = new MyIdsPartitioner(ids, myDao, 0, ids.size() - 1);
forkJoinPool.invoke(task);
}
}
Repository / DAO
@Repository
@Transactional(propagation = Propagation.MANDATORY)
public class IdsDao {
public MyData getData(List<UUID> list) {
// ...
}
}
RecursiveAction
public class MyIdsPartitioner extends RecursiveAction {
private static final long serialVersionUID = 1L;
private static final int THRESHOLD = 100;
private ArrayList<UUID> ids;
private int fromIndex;
private int toIndex;
private MyDao myDao;
public MyIdsPartitioner(ArrayList<UUID> ids, MyDao myDao, int fromIndex, int toIndex) {
this.ids = ids;
this.fromIndex = fromIndex;
this.toIndex = toIndex;
this.myDao = myDao;
}
@Override
protected void compute() {
if (computationSetIsSmallEnough()) {
computeDirectly();
} else {
int leftToIndex = fromIndex + (toIndex - fromIndex) / 2;
MyIdsPartitioner leftPartitioner = new MyIdsPartitioner(ids, myDao, fromIndex, leftToIndex);
MyIdsPartitioner rightPartitioner = new MyIdsPartitioner(ids, myDao, leftToIndex + 1, toIndex);
invokeAll(leftPartitioner, rightPartitioner);
}
}
private boolean computationSetIsSmallEnough() {
return (toIndex - fromIndex) < THRESHOLD;
}
private void computeDirectly() {
final List<UUID> subList = ids.subList(fromIndex, toIndex);
final MyData myData = myDao.getData(subList);
modifyTheData(myData);
}
private void modifyTheData(MyData myData) {
// ...
// write to DB
}
}
After executing this I get:
No existing transaction found for transaction marked with propagation 'mandatory'
I understood that this is perfectly normal since the transaction doesn't propagate to different threads. So one solution is to create a transaction manually in every thread, as proposed in another similar question (see the sketch below). But this was not satisfying enough for me so I kept searching.
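For illustration only, a minimal sketch of what "a transaction per thread" could look like, assuming a TransactionTemplate (built from the PlatformTransactionManager) is handed to the partitioner; I did not end up taking this route:
public class MyIdsPartitioner extends RecursiveAction {
    // ids, fromIndex, toIndex, myDao, constructor and compute() stay as in the original
    private final TransactionTemplate transactionTemplate; // passed in alongside myDao

    private void computeDirectly() {
        final List<UUID> subList = ids.subList(fromIndex, toIndex);
        // each worker thread opens and commits (or rolls back) its own transaction
        transactionTemplate.execute(status -> {
            MyData myData = myDao.getData(subList);
            modifyTheData(myData);
            return null;
        });
    }
}
With a transaction active on each worker thread, the DAO's Propagation.MANDATORY requirement would be satisfied.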
In Spring's forum I found a discussion on the topic. One paragraph I find very interesting:
"I can imagine one could manually propagate the transaction context to another thread, but I don't think you should really try it. Transactions are bound to single threads with a reason - the basic underlying resource - jdbc connection - is not threadsafe. Using one single connection in multiple threads would break fundamental jdbc request/response contracts and it would be a small wonder if it would work in more then trivial examples."
So the first question arises: is it worth it to parallelize the reading/writing to the database, and can this really hurt DB consistency?
If the quote above is not true, which I doubt, is there a way to achieve the following:
MyIdsPartitioner to be Spring managed - with @Scope("prototype") - passing the needed arguments for the recursive calls to it, and that way leave the transaction management to Spring?
After further reading I managed to solve my problem. Kind of (as I see it now, there wasn't a problem in the first place).
Since the reading I do from the DB is in chunks and I am sure that the results won't get edited during that time, I can do it outside a transaction.
The writing is also safe in my case since all values I write are unique and no constraint violations can occur. So I removed the transaction from there too.
What I mean by "I removed the transaction" is that I just override the method's propagation mode in my DAO, like:
@Repository
@Transactional(propagation = Propagation.MANDATORY)
public class IdsDao {
@Transactional(propagation = Propagation.SUPPORTS)
public MyData getData(List<UUID> list) {
// ...
}
}
Or if you decide you need the transaction for some reason then you can still leave the transaction management to Spring by setting the propagation to REQUIRED.
So the solution turns out to be much much simpler than I thought.
And to answer my other questions:
Is it worth it to parallelize the reading/writing to the database, and can this really hurt DB consistency?
Yes, it's worth it. And as long as you have a transaction per thread you are fine.
Is there a way to achieve the following: MyIdsPartitioner to be Spring managed - with @Scope("prototype") - passing the needed arguments for the recursive calls to it, and that way leave the transaction management to Spring?
Yes, there is a way, by using a pool (see another Stack Overflow question). Or you can define your bean as @Scope(value = "prototype", proxyMode = ScopedProxyMode.TARGET_CLASS), but then it won't work if you need to set parameters on it, since every usage of the instance will give you a new instance. E.g.
@Autowired
MyIdsPartitioner partitioner;
public void someMethod() {
...
partitioner.setIds(someIds);
partitioner.setFromIndex(fromIndex);
partitioner.setToIndex(toIndex);
...
}
This will create three instances and you won't be able to use the object usefully, since the fields won't all be set on the same instance.
So in short - there is a way, but I didn't need to go for it in the first place.
This should be possible with atomikos (http://www.atomikos.com) and optionally with nested transactions.
If you do this, then take care to avoid deadlocks if multiple threads of the same root transaction write to the same tables in the database.
I'm developing a web app based on Spring 2.5 and Hibernate 3. Recently I've introduced JUnit tests and I've done some integration tests using the DBUnit framework. DBUnit is supposed to update the database with an XML dataset between one test and another, and it's working well, as far as I've seen.
However, when I update an element in a test, Hibernate seems to cache this information, and even if I load the element in the following test, the information is the one I've modified. If I look at the DB while execution is paused, the database is properly reset by DBUnit. So I think it must be a Hibernate problem.
Is there a way to make a tearDown between tests saying I want a new Hibernate session for my Spring context? By the way, I'm not using Spring annotations and I get the Spring context by code:
String[] contextLocations = new String[2];
contextLocations[0] = "WebContent/WEB-INF/applicationContext.xml";
contextLocations[1] = "src/System_V3/test/applicationContext.xml";
context = new FileSystemXmlApplicationContext(contextLocations);
DBUnit setUp:
@Before
public void setUpBeforeClass() throws Exception {
handleSetUpOperation();
}
private static void handleSetUpOperation() throws Exception {
conn = getConnection();
conn.getConnection().setAutoCommit(false);
final IDataSet data = getDataSet();
try {
DatabaseOperation.REFRESH.execute(conn, data);
} finally {
conn.close();
}
}
private static IDatabaseConnection getConnection() throws ClassNotFoundException, SQLException,
DatabaseUnitException {
Class.forName("org.gjt.mm.mysql.Driver");
return new DatabaseConnection(DriverManager.getConnection(
"jdbc:mysql://localhost:3306/web_database", "root", "pass"));
}
private static IDataSet getDataSet() throws IOException, DataSetException {
ClassLoader classLoader = TestPrueba.class.getClassLoader();
return new FlatXmlDataSetBuilder().build(classLoader
.getResourceAsStream("System_V3/test/dataset.xml"));
}
Tests are done in JUnit 4 using only @Test annotations, and the test class does not extend any library class.
Any suggestion?
Not sure if this is something that can help you - but just in case...
Try using session.clear() in your teardown method.
Please take a look here http://docs.jboss.org/hibernate/orm/3.5/api/org/hibernate/Session.html#clear()
According to the spec, session.clear():
Completely clear the session. Evict all loaded instances and cancel all pending saves, updates and deletions. Do not close open iterators or instances of ScrollableResults.
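A minimal sketch of what that could look like in your setup (the bean name and the way the Session is obtained are assumptions - adjust them to your applicationContext.xml and session handling):
@After
public void tearDown() {
    SessionFactory sessionFactory = (SessionFactory) context.getBean("sessionFactory");
    // clear the first-level cache so the next test re-reads the DBUnit-refreshed data
    sessionFactory.getCurrentSession().clear();
}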
You need to execute your tests within a transaction. This can be achieved by using the SpringJUnit4ClassRunner for your test. Once this is configured you can use the @Transactional annotation per test.
With this approach you can also @Autowired your beans directly into your test.
For instance:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "classpath:context-file.xml" })
public class MyTest {
@Autowired
private MyService myService;
@Transactional
@Test
public void myFirstTest() {
...
myService.executeSomething();
...
}
}
And of course, you can set the default rollback behaviour on your test class by annotating it with @TransactionConfiguration(defaultRollback = true/false).
When I run my unit tests in isolation they work fine, i.e. (asserts omitted)
@Test
public void testSave()
{
EntityManagerHelper emh = new EntityManagerHelper();
LevelDAO dao = new LevelDAO();
Level l = new Level();
l.setName("aname");
emh.beginTransaction();
dao.save(l);
emh.commit();
}
Then running this individual test below is also no problem:
@Test
public void testUpdate()
{
EntityManagerHelper emh = new EntityManagerHelper();
LevelDAO dao = new LevelDAO();
Level l = new Level();
l.setName("bname");
l.setLevelid(1);
emh.beginTransaction();
dao.update(l);
emh.commit();
}
When they run at the same time in sequence I receive the error - Transaction is currently active. Is there a way to allow each unit test to run only after the transaction from the previous piece of work is no longer active? Should I be looking at Spring instead?
Update
The EntityManagerHelper gains access to the persistence context like so
emf = Persistence.createEntityManagerFactory("bw_beta");
threadLocal = new ThreadLocal<EntityManager>();
which looks like the problem
So a hacky workaround was to define it locally, i.e.
EntityManagerFactory factory = Persistence.createEntityManagerFactory("bw_beta");
EntityManager entityManager = factory.createEntityManager();
entityManager.getTransaction().begin();
dao.save(l);
entityManager.persist(l);
entityManager.getTransaction().commit();
Pretty sure there's a better way - maybe using Spring?
Pretty sure there's a better way - maybe using Spring?
Yes, Spring cleans it up a lot and gives you control on what you'd like to run within a transaction without polluting the actual test.
With Spring, your tests would look something like this:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration({ "classpath:META-INF/conf/spring/application-context.xml",
"classpath:META-INF/conf/spring/test-datasource-spring-config.xml" })
@TransactionConfiguration(transactionManager="txMgr", defaultRollback=false)
public class LevelDaoTest {
@Resource( name="levelDao" )
LevelDao levelDao;
@Test
public void shouldSaveNewLevels() {
Level l = new Level();
l.setName("aname");
levelDao.save(l);
// assert
}
@Test
public void shouldUpdateExistingLevels() {
Level l = new Level(); // or I would assume, you'd read this level back from DB, or set a proper ID, so the DAO will know to update it.. But that is besides the point
l.setName("bname");
levelDao.update(l);
// assert
}
}
Take a look at Spring Documentation under Testing => Transaction Management to get more details.
P.S. From your example:
dao.save(l);
entityManager.persist(l);
Looks really strange, as usually you would encapsulate the entityManager within a DAO, so all you'd need to do is dao.save(l).
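Something along these lines, as a sketch (assuming a Spring-managed persistence context; the class and method names are illustrative):
@Repository
public class LevelDao {

    @PersistenceContext
    private EntityManager entityManager;

    // runs inside whatever transaction Spring has opened around the caller
    public void save(Level level) {
        entityManager.persist(level);
    }

    public void update(Level level) {
        entityManager.merge(level);
    }
}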
For anyone who may be having this issue, here is how I resolved it. I was doing multiple saves and I kept getting this error. You do not want to begin multiple transactions without checking whether one is already active.
if(!entityManager.getTransaction().isActive())
entityManager.getTransaction().begin();
dao.save(l);
entityManager.persist(l);
entityManager.getTransaction().commit();
I implemented a Singleton approach to handle it.
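In case it helps, a rough sketch of such a singleton holder (the class name is mine, not from the original code): the EntityManagerFactory is created once per JVM, while each caller still gets its own EntityManager.
public final class EntityManagerFactoryHolder {

    private static final EntityManagerFactory FACTORY =
            Persistence.createEntityManagerFactory("bw_beta");

    private EntityManagerFactoryHolder() {
    }

    // reuse the single factory; EntityManagers themselves stay short-lived
    public static EntityManager createEntityManager() {
        return FACTORY.createEntityManager();
    }
}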