Is there a best-practice pattern for completely resetting a database to a freshly-paved schema with JPA before a unit test? I have been using a testing persistence unit with hbm2ddl.auto=create-drop and recreating EMFs before each test, but I wonder if there's a cleaner way to do it.
Unit tests should not talk to the database.
Assuming you're writing an integration test for your data access layer, you could use a tool like DBUnit, or you could create a static test helper that programmatically resets your database state by doing all of your deletes and inserts using JPA queries inside of a transaction.
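For the helper approach, a minimal sketch might look like this (DatabaseResetHelper, ChildEntity, and ParentEntity are illustrative names, not from the question):

// A minimal sketch of a static reset helper; ChildEntity and ParentEntity
// are placeholders for your own entities.
public final class DatabaseResetHelper {

    public static void reset(EntityManager em) {
        EntityTransaction tx = em.getTransaction();
        tx.begin();
        // Delete children before parents to satisfy foreign key constraints.
        em.createQuery("DELETE FROM ChildEntity").executeUpdate();
        em.createQuery("DELETE FROM ParentEntity").executeUpdate();
        // Re-insert whatever baseline fixture data your tests expect here.
        tx.commit();
    }
}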
Resetting the database is not a big problem if you use a fast Java database such as H2 or HSQLDB. Compared to using Oracle or MySQL (or whatever you use in production), this will speed up your tests, and it ensures your code is exercised in much the same way as against the 'real' production database.
For maximum performance, you can use H2 in-memory (that way you may not have to reset the database manually, since it is discarded automatically when the last connection is closed), or you can use a regular persistent database. To reset an H2 database after use, run the (native) statement DROP ALL OBJECTS DELETE FILES.
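For example, a minimal sketch of that reset over plain JDBC (the URL and credentials are placeholders for your test setup):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Illustrative sketch: wipe an H2 test database between tests.
public class H2Reset {
    public static void reset() throws Exception {
        try (Connection con = DriverManager.getConnection("jdbc:h2:mem:test;DB_CLOSE_DELAY=-1", "sa", "");
             Statement st = con.createStatement()) {
            // H2-native statement: drops all schema objects (and the backing
            // files, for file-based databases).
            st.execute("DROP ALL OBJECTS DELETE FILES");
        }
    }
}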
DBUnit has much of what you need. I use Spring's testing framework to roll back transactions after each test; see http://static.springsource.org/spring/docs/3.0.x/spring-framework-reference/html/testing.html
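A sketch of that setup with JUnit 4 and Spring's TestContext framework (the context file and DAO names are illustrative):

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

// Each test runs in a transaction that Spring rolls back afterwards,
// so the database is left in its original state.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml")
@Transactional
public class PersonDaoIntegrationTest {

    @Autowired
    private PersonDao personDao;

    @Test
    public void savesAndFindsAPerson() {
        personDao.savePerson(new Person("Hans"));
        // assertions here; the insert is rolled back after the test
    }
}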
Is there a best-practice pattern for completely resetting a database to a freshly-paved schema with JPA before a unit test?
Don't reset the whole DB schema before every unit test; instead, reset the "DB environment" (the data specific to the current unit test) at the END of each unit test.
We have an entity...
@Entity
public class Candidate implements Serializable {

    private String id;
    private String userName;
    private EntityLifeCycle lifeCycle;

    protected Candidate() {
    }

    public Candidate(String userName) {
        this.userName = userName;
    }

    @Id @GeneratedValue(generator="uuid", strategy=GenerationType.AUTO)
    @GenericGenerator(name="uuid", strategy="uuid", parameters = {})
    @Column(name="candidate_id", nullable=false, unique=true)
    public String getId() {
        return id;
    }

    protected void setId(String id) {
        this.id = id;
    }

    @Column(name="user_name", nullable=false, unique=true)
    public String getUserName() {
        return userName;
    }

    public void setUserName(String userName) {
        this.userName = userName;
    }

    @Embedded
    public EntityLifeCycle getLifeCycle() {
        if(lifeCycle == null) {
            lifeCycle = new EntityLifeCycleImpl();
        }
        return lifeCycle;
    }

    // The parameter type matches the getter's declared type, as JPA
    // property access requires.
    public void setLifeCycle(EntityLifeCycle lifeCycle) {
        this.lifeCycle = lifeCycle;
    }

    @PrePersist
    public void prePersist() {
        // Use the getter so the embedded lifeCycle is lazily initialized.
        getLifeCycle().setCreatedDate(new Date());
    }
}
We set the createdDate for each Candidate instance in the prePersist() method. Here is a test case that asserts that createdDate is set properly...
public class EntityLifeCycleTest {

    @Test
    public void testLifeCycle() {
        EntityManager manager = entityManagerFactory.createEntityManager();
        Candidate bond = new Candidate("Miss. Bond");
        EntityTransaction tx = manager.getTransaction();
        tx.begin();
        manager.persist(bond);
        tx.commit();
        Assert.assertNotNull(bond.getLifeCycle().getCreatedDate());
        manager.close();
    }
}
This test case will run properly the first time. But if we run it a second time, it will throw a ConstraintViolationException, because userName is a unique key.
Therefore, I think the right approach is to clean the DB environment that is specific to the current unit test at the end of each test case, like this...
public class EntityLifeCycleTest extends JavaPersistenceTest {

    @Test
    public void testLifeCycle() {
        EntityManager manager = entityManagerFactory.createEntityManager();
        Candidate bond = new Candidate("Miss. Bond");
        EntityTransaction tx = manager.getTransaction();
        tx.begin();
        manager.persist(bond);
        tx.commit();
        Assert.assertNotNull(bond.getLifeCycle().getCreatedDate());
        /* delete Candidate bond, so next time we can run this test case successfully */
        tx = manager.getTransaction();
        tx.begin();
        manager.remove(bond);
        tx.commit();
        manager.close();
    }
}
I have been using a testing persistence unit with hbm2ddl.auto=create-drop and recreating EMFs before each test, but I wonder if there's a cleaner way to do it.
Recreating the EMF before each test is time-consuming, IMO.
Drop and recreate the DB schema only if you have made changes to an @Entity-annotated class that affect the underlying DB (e.g. adding/removing columns or constraints). So first validate the schema: if it is valid, don't recreate it; if it is invalid, recreate it. Like this...
public class JavaPersistenceTest {

    protected static EntityManagerFactory entityManagerFactory;

    @BeforeClass
    public static void setUp() throws Exception {
        if(entityManagerFactory == null) {
            Map<String, String> properties = new HashMap<String, String>(1);
            try {
                properties.put("hibernate.hbm2ddl.auto", "validate");
                entityManagerFactory = Persistence.createEntityManagerFactory("default", properties);
            } catch (PersistenceException e) {
                e.printStackTrace();
                properties.put("hibernate.hbm2ddl.auto", "create");
                entityManagerFactory = Persistence.createEntityManagerFactory("default", properties);
            }
        }
    }
}
Now, if you run all the test cases (that extend JavaPersistenceTest) in one go, the EMF will be created only once (or twice, if the schema was invalid).
I just can't seem to win today...
Is there a way to read from a OneToMany relationship in a Spock SpringBootTest integration test, without annotating the test as @Transactional or adding the unrealistic spring.jpa.properties.hibernate.enable_lazy_load_no_trans=true?
OR, is there a way to launch a Spring Batch Job from within a @Transactional test case?
Let me elaborate...
I'm trying to get a simple Spring Boot Integration test working for my Spring Batch reporting process, which reads from tangled web of DB2 tables and generates a series of change messages for interested systems. I'm using the Groovy Spock testing framework and an H2 in-memory database filled with a representative slice of my DB2 tables' data.
At the beginning of the test, I'm attempting to use every entity from a given table to generate entries in a change-tracking table that drives my messaging.
setup:
List<Person> allExistingTestPeople = peopleRepository.findAll()
Collections.shuffle(allExistingTestPeople)
allExistingTestPeople?.each { Person person ->
    List<Nickname> nicknames = person.nicknames
    nicknames?.each { Nickname nickname ->
        changeTrackingRepository.save(new Change(personId: person.id, nicknameId: nickname.id, status: NEW))
    }
}
Given these as my DB2 domain classes:
@Entity
@Table(name = "T_PERSON")
public class Person {

    @Id
    @Column(name = "P_ID")
    private Integer id;

    @Column(name = "P_NME")
    private String name;

    @OneToMany(targetEntity = Nickname.class, mappedBy = "person")
    private List<Nickname> nicknames;
}

@Entity
@Table(name = "T_NICKNAME")
public class Nickname {

    @EmbeddedId
    private PersonNicknamePK id;

    @Column(name = "N_NME")
    private String nickname;

    @ManyToOne(optional = false, targetEntity = Person.class)
    @JoinColumn(name = "P_ID", referencedColumnName = "P_ID", insertable = false, updatable = false)
    private Person person;
}

@Embeddable
public class PersonNicknamePK implements Serializable {

    @Column(name = "P_ID")
    private int personId;

    @Column(name = "N_ID")
    private short nicknameId;
}
But I'm getting this LazyInitializationException, even though I'm reading from that OneToMany relationship within the context of a test case...
org.hibernate.LazyInitializationException: failed to lazily initialize a collection of role: com.my.package.db2.model.Person.nicknames, could not initialize proxy - no Session
at org.hibernate.collection.internal.AbstractPersistentCollection.throwLazyInitializationException(AbstractPersistentCollection.java:602)
at org.hibernate.collection.internal.AbstractPersistentCollection.withTemporarySessionIfNeeded(AbstractPersistentCollection.java:217)
at org.hibernate.collection.internal.AbstractPersistentCollection.readSize(AbstractPersistentCollection.java:161)
at org.hibernate.collection.internal.PersistentBag.size(PersistentBag.java:350)
I came across the advice online to annotate my test case with the @Transactional annotation, which definitely got me a little further, allowing me to read from this OneToMany relationship. However, when I then attempt to launch the Spring Batch Job I'd like to test from my when clause:
@Transactional
def "Happy path test to validate I can generate a report of changes"() {
    setup:
    //... See above
    when:
    service.launchBatchJob()
    then:
    //... Messages are generated
}
I'm getting an exception saying that a Spring Batch Job can't be launched from the context of a transaction, even though I'm using an in-memory job manager via ResourcelessTransactionManager and MapJobRepositoryFactoryBean, since this is just a short-lived scheduled script I'm writing...
java.lang.IllegalStateException: Existing transaction detected in JobRepository. Please fix this and try again (e.g. remove @Transactional annotations from client).
at org.springframework.batch.core.repository.support.AbstractJobRepositoryFactoryBean$1.invoke(AbstractJobRepositoryFactoryBean.java:177)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:212)
at com.sun.proxy.$Proxy125.createJobExecution(Unknown Source)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:134)
at com.my.package.service.MyService.launchBatchJob(MyService.java:30)
The only thing that seems to work so far is if I scrap the @Transactional annotation and instead add spring.jpa.properties.hibernate.enable_lazy_load_no_trans=true to my application-test.properties file. BUT, this doesn't seem like a very good idea, because it's not realistic: if I add this, then even if there's a bug in my code due to a LazyInitializationException, I'd never see it in the tests.
Sorry for the novel, hoping someone can point me in the right direction :(
EDIT:
Also, here's my in-memory Spring Batch configuration, in which I've tried turning off the transaction validation. Unfortunately, while this gets me a little further, the Spring Batch partitioner's autowired EntityManager is suddenly failing to run queries in the H2 database.
@Configuration
@EnableBatchProcessing
public class InMemoryBatchManagementConfig {

    @Bean
    public ResourcelessTransactionManager resourcelessTransactionManager() {
        ResourcelessTransactionManager resourcelessTransactionManager = new ResourcelessTransactionManager();
        resourcelessTransactionManager.setNestedTransactionAllowed(true);
        resourcelessTransactionManager.setValidateExistingTransaction(false);
        return resourcelessTransactionManager;
    }

    @Bean
    public MapJobRepositoryFactoryBean mapJobRepositoryFactory(ResourcelessTransactionManager txManager)
            throws Exception {
        MapJobRepositoryFactoryBean factory = new MapJobRepositoryFactoryBean(txManager);
        factory.setValidateTransactionState(false);
        factory.afterPropertiesSet();
        return factory;
    }

    @Bean
    public JobRepository jobRepository(MapJobRepositoryFactoryBean factory) throws Exception {
        return factory.getObject();
    }

    @Bean
    public SimpleJobLauncher jobLauncher(JobRepository jobRepository) throws Exception {
        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        launcher.afterPropertiesSet();
        return launcher;
    }

    @Bean
    public JobExplorer jobExplorer(MapJobRepositoryFactoryBean factory) {
        return new SimpleJobExplorer(factory.getJobInstanceDao(), factory.getJobExecutionDao(),
                factory.getStepExecutionDao(), factory.getExecutionContextDao());
    }

    @Bean
    public BatchConfigurer batchConfigurer(MapJobRepositoryFactoryBean mapJobRepositoryFactory,
                                           ResourcelessTransactionManager resourcelessTransactionManager,
                                           SimpleJobLauncher jobLauncher,
                                           JobExplorer jobExplorer) {
        return new BatchConfigurer() {
            @Override
            public JobRepository getJobRepository() throws Exception {
                return mapJobRepositoryFactory.getObject();
            }

            @Override
            public PlatformTransactionManager getTransactionManager() throws Exception {
                return resourcelessTransactionManager;
            }

            @Override
            public JobLauncher getJobLauncher() throws Exception {
                return jobLauncher;
            }

            @Override
            public JobExplorer getJobExplorer() throws Exception {
                return jobExplorer;
            }
        };
    }
}
This error happens because your code is already being executed in a transaction driven by Spring Batch, so running the job in the scope of another transaction is not correct. However, if you still want to disable the transaction validation done by the job repository, you can set validateTransactionState to false; see AbstractJobRepositoryFactoryBean#setValidateTransactionState.
That said, running the job in a transaction is not the way to fix org.hibernate.LazyInitializationException. The property spring.jpa.properties.hibernate.enable_lazy_load_no_trans=true is there for a reason, and if it works for you, I believe it is a better approach than running the entire job in a transaction. (And by the way, if I had to use a transaction for that, I would narrow its scope to the minimum, for example the step, and not the entire job.)
You can manage transactions programmatically using TransactionTemplate, to run only the "setup" inside a transaction (instead of having everything in @Transactional). Unfortunately, this way the transaction will be committed, and you will need to do some manual cleanup.
It can be autowired as any other bean:
@Autowired
private TransactionTemplate transactionTemplate;
...and it's used this way:
transactionTemplate.execute((transactionStatus) -> {
    // ...setup...
    return null; // alternatively you can return some data out of the callback
});
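For example, a hedged sketch of that manual cleanup, assuming the change-tracking repository from the question is a Spring Data repository (so deleteAll() exists):

transactionTemplate.execute((transactionStatus) -> {
    // Undo what the committed setup transaction created; deleteAll() is
    // assumed to be available on the Spring Data repository used above.
    changeTrackingRepository.deleteAll();
    return null;
});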
I am updating my application from Spring Boot 1.4.5 / Hibernate 4.3.5 to Spring Boot 2.0.9 / Hibernate 5.2.18 and code that used to work in the previous configuration is no longer working.
The scenario is as follows:
Start a transaction by entering a method annotated with @Transactional
Hydrate the entity
Change the entity
Make another query
Detect a problem. As a result, determine that the changes should not be persisted.
Evict the entity
Exit the method / transaction
With Hibernate 4.3.5, calling entityManager.detach() would prevent the changes from being persisted. However, with Hibernate 5.2.18, I'm finding that changes are persisted even with this call. I have also tried evict() on the session, and I have tried clear() to remove all entities from the session (just to see what would happen).
So I ask - is it possible to discard entity changes in Hibernate 5.2.18 the way that I was able to do in Hibernate 4.3.5?
The relevant code is below...
@Entity
public class Agreement {

    private Long agreementId;
    private Integer agreementStateId;

    @Id
    @Column(name = "agreement_id")
    public Long getAgreementId() {
        return agreementId;
    }

    public void setAgreementId(Long agreementId) {
        this.agreementId = agreementId;
    }

    @Basic
    @Column(name = "agreement_state_id", nullable = false)
    public Integer getAgreementStateId() {
        return agreementStateId;
    }

    public void setAgreementStateId(Integer agreementStateId) {
        this.agreementStateId = agreementStateId;
    }
}

@Component
public class Repo1 {

    @PersistenceContext(unitName = "rights")
    private EntityManager entityManager;

    public void evict(Object entity) {
        entityManager.detach(entity);
    }

    public Agreement getAgreement(Long agreementId) {
        // Code to get the entity is here.
        // An Agreement with an agreementStateId of 5 is returned.
    }

    public void anotherQuery() {
        // Code to make another query is here.
    }
}

@Component
public class Service1 {

    @Autowired
    Repo1 repo;

    @Transactional
    public void doSomething() {
        Agreement agreement = repo.getAgreement(1L);
        // Change agreementStateId. Very simple for purposes of example.
        agreement.setAgreementStateId(100);
        // Make another query
        repo.anotherQuery();
        // Detect a problem here. Simplified for purposes of example.
        if (agreement.getAgreementStateId() == 100) {
            repo.evict(agreement);
        }
    }
}
I have found the problem and it has nothing to do with evict(). It turns out that an additional query was causing the session to flush prior to the evict() call.
In general, the application uses QueryDSL to make queries, and queries made that way did not cause the session to flush beforehand. In this case, however, the query was created via Session.createSQLQuery(), which uses the FlushMode already assigned to the session, and that was FlushMode.AUTO.
I was able to prevent the flush by calling setHibernateFlushMode(FlushMode.COMMIT) on the query before executing it. This temporarily changes the session's FlushMode until the query has been run. After that, the evict() call worked as expected.
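A minimal sketch of that fix (the SQL string is a placeholder; session is the Hibernate Session the query was created from):

// Sketch: suppress the auto-flush that a native query would otherwise
// trigger under FlushMode.AUTO. The SQL is a placeholder for the real query.
NativeQuery<?> query = session.createSQLQuery("select count(*) from agreement");
query.setHibernateFlushMode(FlushMode.COMMIT); // don't flush dirty entities before running
Object count = query.uniqueResult();
// The modified Agreement has not been flushed, so detaching it now
// discards the unwanted change as intended.
repo.evict(agreement);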
I have a class, Student, and the generated Endpoint class for it. The ListStudents and insertStudents methods work without any problems, but update and remove don't cause any change in the datastore. The methods don't throw any errors and the call returns, but no changes are made.
My endpoint code is mostly what was generated by the Google Plugin for Eclipse:
@ApiMethod(name = "removeStudent", path = "remove_student")
public void removeStudent(@Named("email") String email) {
    EntityManager mgr = getEntityManager();
    try {
        Student student = getStudentByEmailName(email);
        mgr.remove(student);
    } finally {
        mgr.close();
    }
}
Entity manager getter method:
private static EntityManager getEntityManager() {
    return EMF.get().createEntityManager();
}
@ApiMethod(name = "updateStudent")
public Student updateStudent(Student student) {
    EntityManager mgr = getEntityManager();
    try {
        if (!containsStudent(student)) {
            throw new EntityNotFoundException("Object does not exist");
        }
        mgr.persist(student);
    } finally {
        mgr.close();
    }
    return student;
}
And my EMF class:
public final class EMF {

    private static final EntityManagerFactory emfInstance =
            Persistence.createEntityManagerFactory("transactions-optional");

    private EMF() {
    }

    public static EntityManagerFactory get() {
        return emfInstance;
    }
}
The client that uses this endpoint is Android. I have only tried testing on my local server.
Please tell me if I'm doing something wrong. Thank you
Do you have your Student entities indexed by email?
This is a typical issue when you move to NoSQL and expect all queries to work without indexes.
Note that records inserted before defining the index will not be in the index.
The datastore is eventually consistent, and your code should work. What is the return value you get in the Student object from your updateStudent method?
As much as I don't want to suggest it: after you do mgr.persist(...), add mgr.flush() and see if that makes a difference.
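Applied to the updateStudent method from the question, that suggestion would look something like this sketch:

public Student updateStudent(Student student) {
    EntityManager mgr = getEntityManager();
    try {
        if (!containsStudent(student)) {
            throw new EntityNotFoundException("Object does not exist");
        }
        mgr.persist(student);
        mgr.flush(); // push the pending write to the datastore before close()
    } finally {
        mgr.close();
    }
    return student;
}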
I have a unit test that stores some Person data in the database. To do this, I use the EntityManagerFactory.
@Test
public void findPersonById() {
    personDao.savePerson(new Person("Hans"));
    Person person = new Person("Peter");
    personDao.savePerson(person);
    Long id = person.getId();
    flushAndClear();
    person = personDao.findById(id);
    assertThat(person.getName(), is("Peter"));
}
where
private EntityManager entityManager;
public Person savePerson(Person person) {
    entityManager.persist(person);
    return person;
}
and
public Person findById(Long id) {
    Person person = entityManager.find(Person.class, id);
    return person;
}
with this base test class:
import javax.persistence.EntityManager;
import javax.persistence.Persistence;
import org.junit.After;
import org.junit.Before;

public class AbstractDaoTest {

    protected EntityManager entityManager = Persistence.createEntityManagerFactory("DefaultPersistenceUnit")
            .createEntityManager();
    protected PersonDao personDao = new PersonDao();

    @Before
    public void setUp() {
        personDao.setEntityManager(entityManager);
        entityManager.getTransaction().begin();
    }

    @After
    public void tearDown() {
        entityManager.getTransaction().rollback();
    }

    protected void flushAndClear() {
        entityManager.flush();
        entityManager.clear();
    }
}
My unit test runs green, but when I debug it, I can't see any data in the DB (using SQuirreL). It doesn't matter which step the debugger is currently in; the tables are always empty. I am using the following configuration for SQuirreL:
jdbc:h2:file:~/test;DB_CLOSE_DELAY=-1;LOCK_MODE=0;AUTO_SERVER=TRUE
with Driver: H2
Can you help with this issue?
This happens because of the way your test cases are designed.
Firstly, if you want the data to be persisted to the database, you have to commit the changes after any update/insert operation. You should roll back only if there is an exception during the transaction.
Design your tests something like this:
@BeforeTest
public void setUp() {
    personDao.setEntityManager(entityManager);
    entityManager.getTransaction().begin();
}

@AfterTest
public void tearDown() {
    // TODO clear all inserted data, either with SQL statements or via JPA
    // Roll back only if there is an exception
    entityManager.getTransaction().rollback();
}

@Test
public void findPersonById() {
    Long id = System.currentTimeMillis();
    personDao.savePerson(new Person(id, "Hans"));
    entityManager.flush();
    // TODO your assert statements go in here
    /* Data is visible up to here during your debugging.
       Once AfterTest has executed, the data is rolled back. */
}
I used TestNG as my unit-testing tool; please use the appropriate tool based on your requirements.
Also, when you call entityManager.flush(), insert/update statements are sent to the DB, but they won't be committed until a commit instruction is sent by Hibernate. Please refer to this answer for further details.
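A small sketch of the distinction, reusing the Person DAO from the earlier snippets (illustrative, not the exact code):

entityManager.getTransaction().begin();
personDao.savePerson(new Person("Hans"));
entityManager.flush();  // the INSERT is sent to the DB, but only this
                        // transaction can see the row yet
// An external client such as SQuirreL still sees an empty table here.
entityManager.getTransaction().commit(); // now the row is durable and visible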
OK, I'm trying to set up an application in a Java EE container. I use JPA for persistence, and I also use the javax.validation.constraints.* constraints. By default, the container validates the entities during the @PrePersist and @PreUpdate lifecycle events, which is good for me, but how do I handle the ConstraintViolationException?
I can't find any docs on it; any suggestion is welcome.
Well, you could catch it :) Here is an example (from a unit test):
public class CustomerTest {

    private static EntityManagerFactory emf;
    private EntityManager em;

    @BeforeClass
    public static void createEntityManagerFactory() {
        emf = Persistence.createEntityManagerFactory("MyPu");
    }

    @AfterClass
    public static void closeEntityManagerFactory() {
        emf.close();
    }

    @Before
    public void beginTransaction() {
        em = emf.createEntityManager();
        em.getTransaction().begin();
    }

    @After
    public void rollbackTransaction() {
        if (em.getTransaction().isActive()) {
            em.getTransaction().rollback();
        }
        if (em.isOpen()) {
            em.close();
        }
    }

    @Test
    public void nameTooShort() {
        try {
            Customer customer = new Customer("Bo");
            em.persist(customer);
            em.flush();
            fail("Expected ConstraintViolationException wasn't thrown.");
        } catch (ConstraintViolationException e) {
            assertEquals(1, e.getConstraintViolations().size());
            ConstraintViolation<?> violation = e.getConstraintViolations().iterator().next();
            assertEquals("name", violation.getPropertyPath().toString());
            assertEquals(Size.class, violation.getConstraintDescriptor().getAnnotation().annotationType());
        }
    }
}
Where my Customer looks like:
@Entity
public class Customer {

    @Id @GeneratedValue
    @NotNull
    private Long id;

    @NotNull
    @Size(min = 3, max = 80)
    private String name;

    private boolean archived;

    ...
}
But this was just an example to show a tiny part of the API.
In my opinion, you should actually handle the validation at the view level. Many presentation frameworks support Bean Validation: JSF 2.0, Wicket, Spring MVC...
See also
6.3. Presentation layer validation
Spring MVC, Spring Bean Validation Framework and validating confirm password / confirm email fields.
Wicket JSR-303 Validators
TOTD #123: f:ajax, Bean Validation for JSF, CDI for JSF and JPA 2.0 Criteria API
You can catch ConstraintViolationException, as Pascal described, but be aware that this exception can be thrown any time after your create/update statement and before the end of the transaction (see FlushModeType). Typical throwing points are when you read something after your create/update statement, or at an explicit flush() (which you should use with caution).