JSON mapping problem: possible non-threadsafe access to the session - java

I am facing a problem whose cause is unknown to me. Has anyone else faced this?
JSON mapping problem: <package>ApiResponse["data"]; nested exception is com.fasterxml.jackson.databind.JsonMappingException: possible non-threadsafe access to the session (through reference chain: <package>.ApiResponse["data"])
I have a standard API response POJO, which I return every time wrapped in a ResponseEntity. Everything usually works fine, but sometimes I get the error above and I don't know why it occurs.
I got the log below from the console:
an assertion failure occurred (this may indicate a bug in Hibernate, but is more likely due to unsafe use of the session): org.hibernate.AssertionFailure: possible non-threadsafe access to the session
org.hibernate.AssertionFailure: possible non-threadsafe access to the session

I think you are trying to share the same Hibernate Session across multiple threads, which is not allowed.
Hibernate Sessions are not thread-safe, whereas the Hibernate SessionFactory is thread-safe.
So, make a separate DAO layer. Create a single SessionFactory and share it among the DAO classes.
Obtain a session for each single-threaded DB operation and close that session in the same thread.
For example:
@Repository
public class DAO {

    @Autowired
    private SessionFactory sessionFactory;

    public void performDBOperation(Object obj) {
        // open a session for this thread's DB work, use it, and close it in the same thread
        Session session = sessionFactory.openSession();
        Transaction tx = session.beginTransaction();
        try {
            session.save(obj);
            tx.commit();
        } catch (RuntimeException e) {
            tx.rollback();
            throw e;
        } finally {
            session.close();
        }
    }
}
Now, I have looked at your GitHub code and saw Exec.java:
@Service
public interface Exec {
    @Async
    @Transactional
    public void run();
}
This is incorrect.
Update Exec to this:
public interface Exec {
    public void run();
}
Update ExecImpl to this:
@Service
public class ExecImpl implements Exec {

    @Autowired
    private ExecDAO execDAO;

    @Override
    @Async
    @Transactional
    public void run() {
        // example: create an object and save it
        Object object = ...;
        execDAO.saveItem(object);
    }
}
Create a DAO layer.
Suppose an ExecDAO interface with implementation ExecDAOImpl:
public interface ExecDAO {
    public void saveItem(Object obj);
    // declare the abstract methods that perform DB operations here
}

@Repository
public class ExecDAOImpl implements ExecDAO {

    @Autowired
    private SessionFactory sessionFactory;

    @Override
    public void saveItem(Object obj) {
        // called inside the transaction started by ExecImpl.run(), so the
        // current session is opened and closed by Spring; do not close it here
        Session session = sessionFactory.getCurrentSession();
        session.save(obj);
    }
}

Looking at the code at the link you shared in the comment, I think that combining
@Async
@Transactional
on the same method is dangerous.
I would suggest extracting the transactional work into a method on another bean. What I mean is:
class ExecImpl {
    @Async
    void run() {
        someThingElse.doTransaction();
    }
}

interface SomeThingElse {
    @Transactional
    void doTransaction();
}
I am still not convinced this will help you, but it is something you can try.
I would also suggest using read-only transactions for reading data rather than a single transaction for all purposes.
This blog explains why it's not good to use these two annotations together, whether on a class or on an interface.
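For reference, here is a minimal sketch of that split into two Spring beans (the class names and wiring are my own illustration, assuming @EnableAsync is configured):
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.scheduling.annotation.Async;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class ExecImpl implements Exec {

    @Autowired
    private SomeThingElse someThingElse;

    @Async
    @Override
    public void run() {
        // runs on the async executor thread; the transaction only starts
        // when the call crosses into the other bean's proxy below
        someThingElse.doTransaction();
    }
}

@Service
class SomeThingElseImpl implements SomeThingElse {

    @Transactional
    @Override
    public void doTransaction() {
        // do all Session/EntityManager work here, on the thread that owns the transaction
    }
}
This way the @Transactional advice is applied through the SomeThingElseImpl proxy, on the worker thread that @Async hands the call to, instead of competing with @Async on the same method.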

Related

Is it possible to nest transactions with TransactionManager?

I have the following code:
public ResultProcessDTO process() {
    TransactionTemplate transactionTemplate = new TransactionTemplate(this.transactionManager);
    transactionTemplate.setPropagationBehavior(TransactionDefinition.PROPAGATION_REQUIRES_NEW);
    return transactionTemplate.execute(status -> {
        ...
        SomeEntity entity = service.findById(id);
        ...
        otherBean.someMethod();
        ...
    });
}
And, in another bean:
public void someMethod() {
    ...
    service.save(entity);
}
I need someMethod() to be REQUIRES_NEW, to perform the save and commit the transaction regardless of what happens with the rest of the process().
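Concretely, this is roughly the shape I have in mind for someMethod() (just a sketch; transactionManager, service and entity stand for the beans and data already shown above):
public void someMethod(SomeEntity entity) {
    TransactionTemplate innerTemplate = new TransactionTemplate(this.transactionManager);
    // commit this save on its own, independently of whatever the outer process() does
    innerTemplate.setPropagationBehavior(TransactionDefinition.PROPAGATION_REQUIRES_NEW);
    innerTemplate.execute(status -> {
        service.save(entity);
        return null;
    });
}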
I've already tried @Transactional to leave this up to Spring, but process() is triggered via a KafkaListener, which leads to problems with lazily loaded entities that are fetched along the way.
For example, if I call entity.getChildList() I get a LazyInitializationException. This exception is not thrown if I use TransactionTemplate.
Any suggestions on what I'm doing wrong or how to make it work as I hope?

Set pessimistic lock on entity with EntityManager

Consider the following situation:
We receive requests from a web service which update our entity. Sometimes we get two requests at (almost) the same time, and we have had situations in which our entity ended up completely wrong because of concurrent updates. The idea is to lock the entity pessimistically, so that as soon as the first request arrives it locks the entity and the second request can't touch it (optimistic locking is not an option for us). I wrote an integration test to check this behaviour.
My integration test looks like the following:
protected static TestRemoteFacade testFacade;

@BeforeClass
public static void setup() {
    testFacade = BeanLocator.lookupRemote(TestRemoteFacade.class, TestRemoteFacade.REMOTE_JNDI_NAME, TestRemoteFacade.NAMESPACE);
}

@Test
public void testPessimisticLock() throws Exception {
    testFacade.readPessimisticTwice();
}
which calls the bean
@Stateless
@Clustered
@SecurityDomain("myDomain")
@RolesAllowed({ Roles.ACCESS })
public class TestFacadeBean extends FacadeBean implements TestRemoteFacade {

    @EJB
    private FiolaProduktLocalFacade produkt;

    @Override
    public void readPessimisticTwice() {
        produkt.readPessimisticTwice();
    }
}
with produkt being a bean itself
@Stateless
@Clustered
@SecurityDomain("myDomain")
@RolesAllowed({ Roles.ACCESS })
public class ProduktFacadeBean implements ProduktLocalFacade {

    @Override
    public void readPessimisticTwice() {
        EntityManager entityManager = MyService.getCrudService().getEntityManager();
        System.out.println("Before first try.");
        entityManager.find(MyEntity.class, 1, LockModeType.PESSIMISTIC_WRITE);
        System.out.println("Before second try.");
        entityManager.find(MyEntity.class, 1, LockModeType.PESSIMISTIC_WRITE);
        System.out.println("After second try.");
    }
}
with
public class MyService {
    public static CrudServiceLocalFacade getCrudService() {
        return CrudServiceLookup.getCrudService();
    }
}

public final class CrudServiceLookup {

    private static CrudServiceLocalFacade crudService;

    private CrudServiceLookup() {
    }

    public static CrudServiceLocalFacade getCrudService() {
        if (crudService == null)
            crudService = BeanLocator.lookup(CrudServiceLocalFacade.class, CrudServiceLocalFacade.LOCAL_JNDI_NAME);
        return crudService;
    }

    public static void setCrudService(CrudServiceLocalFacade crudService) {
        CrudServiceLookup.crudService = crudService;
    }
}

@Stateless
@Local(CrudServiceLocalFacade.class)
@TransactionAttribute(TransactionAttributeType.MANDATORY)
@Interceptors(OracleDataBaseInterceptor.class)
public class CrudServiceFacadeBean implements CrudServiceLocalFacade {

    private EntityManager em;

    @Override
    @PersistenceContext(unitName = "persistence_unit")
    public void setEntityManager(EntityManager entityManager) {
        em = entityManager;
    }

    @Override
    public EntityManager getEntityManager() {
        return em;
    }
}
The problem is: if I start the integration test once with a breakpoint at System.out.println("Before second try."); and then start the integration test a second time, the second run can still read MyEntity. Remarkably, the two runs saw different instances (I observed the instanceId in debug mode), which suggests that the EntityManagers did not share their Hibernate context.
I made the following observations:
Whenever I call a setter on the entity and save it to the DB, the lock is acquired. But this is not what I need; I need the lock without having modified the entity.
I tried the method entityManager.lock(entity, LockModeType.PESSIMISTIC_WRITE) as well, but the behaviour was the same.
I found the transaction settings in DBVisualizer. At the moment it is set to TRANSACTION_NONE. I tried all the others (TRANSACTION_READ_UNCOMMITTED, TRANSACTION_READ_COMMITTED, TRANSACTION_REPEATABLE_READ, TRANSACTION_SERIALIZABLE) as well, without any success.
Let the first thread read the entity, then the second thread read the same entity. Let the first thread modify the entity and then the second modify it. Then let both save the entity: whoever saves last wins, and no exception is thrown.
How can I read an object pessimistically, i.e. whenever I load an entity from the DB, I want it to be locked immediately (even if there was no modification)?
Both ways you describe, i.e.
em.find(MyEntity.class, 1, LockModeType.PESSIMISTIC_WRITE)
em.lock(entity, LockModeType.PESSIMISTIC_WRITE)
hold a lock on the related row in the database, but only for the EntityManager's lifespan, i.e. for the duration of the enclosing transaction. The lock is automatically released once you reach the end of the transaction:
@Transactional
public void doSomething() {
    em.lock(entity, LockModeType.PESSIMISTIC_WRITE); // entity is locked
    // any other thread trying to update the entity until this method finishes will raise an error
}

...
object.doSomething();
object.doSomethingElse(); // lock is already released here
Have you tried to set the isolation level in your application server?
To get a lock on a row no matter what you are trying to do afterwards (read/write), you need to set the isolation level to TRANSACTION_SERIALIZABLE.
Acquiring the lock fails only if another transaction is already holding it. Within a single transaction you can take two FOR UPDATE locks on the same row in the DB, so this is not a JPA-specific thing.
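As a side note, if the second request should fail fast instead of waiting for the lock, a lock timeout hint can be passed to find. This is only a sketch (LockingHelper is a hypothetical class for illustration), and support for javax.persistence.lock.timeout depends on the JPA provider and database:
import java.util.Collections;
import java.util.Map;
import javax.persistence.EntityManager;
import javax.persistence.LockModeType;

public class LockingHelper { // hypothetical helper, for illustration only

    public MyEntity loadForUpdate(EntityManager em, int id) {
        // a timeout of 0 asks the provider to throw a PessimisticLockException /
        // LockTimeoutException immediately if another transaction already holds the row lock
        Map<String, Object> hints =
                Collections.singletonMap("javax.persistence.lock.timeout", 0);
        return em.find(MyEntity.class, id, LockModeType.PESSIMISTIC_WRITE, hints);
    }
}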

How to tell Hibernate NOT to store data while running JUnit tests?

I want to check my persistence logic and so I am running some test cases.
Repository class:
@Repository
public class MyRepository {

    public void add(Object obj) {
        /* Do some stuff here */
        getSession().persist(obj);
    }
}
Test class:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = { Context.class })
public class MyTests {

    @Inject
    private MyRepository myRepository;

    @Test
    @Rollback(true)
    public void foo() {
        /* Test logic */
        myRepository.add(obj);
        Assert.assert...;
    }
}
The unit test MyTests.java contains test cases that exercise logic related to persistence, but not the actual Hibernate persistence itself, which is why the getSession().persist() call is superfluous.
For performance reasons, I want to prevent Hibernate from storing data in my database, even though the whole data interaction is rolled back. My approach would be to mock getSession().persist(). Is there a better, or more specifically an easier, way to achieve this?
First of all, there are different id generators in Hibernate. With the identity generator (not all databases support it), an insert query is issued as soon as session.persist is called, in order to assign an id to the entity. But if, for example, a sequence or uuid generator is used, the insert is not triggered (at least not right away).
After that, if session.get or session.load is called to load an object persisted in the current session, no select query is issued, because the object comes from the Hibernate session cache. But if HQL is used to select data, a select query is issued, and before it (by default) the insert for the persisted object is executed as well.
This can be changed with FlushMode. By default it is set to AUTO, which means:
The Session is sometimes flushed before query execution in order to ensure that queries never return stale state.
But if getSession().setHibernateFlushMode(FlushMode.MANUAL) is set:
The Session is only ever flushed when Session.flush() is explicitly called by the application.
This means the insert query won't be issued until session.flush is called explicitly. If session.get and session.load are used later, your code will still work (within the current session). But an HQL select query won't find the entity, since it was never flushed to the database. So beware.
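For illustration, a test could switch the flush mode before exercising the repository. This is only a sketch, assuming a Hibernate 5.2+ Session obtained from an injected SessionFactory (on older versions the call is session.setFlushMode(FlushMode.MANUAL)):
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = { Context.class })
@Transactional
public class MyManualFlushTests {

    @Inject
    private SessionFactory sessionFactory; // assumed to be available as a bean

    @Inject
    private MyRepository myRepository;

    @Test
    @Rollback(true)
    public void addDoesNotHitTheDatabase() {
        // nothing is written unless Session.flush() is called explicitly
        sessionFactory.getCurrentSession().setHibernateFlushMode(FlushMode.MANUAL);

        myRepository.add(obj); // obj being the entity under test, as in the question

        // assert against in-session state only; HQL queries would not see the entity
    }
}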
Create an interface, implement it using the Hibernate persist() method, and use it in such a way that:
the normal calls go through the real implementation, and
the test calls go through a mock version of it.
public interface MyRepository {
    public void add(Object obj);
}

public class MyRepositoryImpl implements MyRepository {

    public void add(Object obj) {
        getSession().persist(obj);
    }
}
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = { Context.class })
public class MyTests {

    @Mock // we inject the mock instead of the true implementation
    private MyRepository myRepository;

    @Test
    @Rollback(true)
    public void foo() {
        /* Test logic */
        myRepository.add(obj); // the test uses the mocked version
        Assert.assert...;
    }
}
There are many Java libraries that let you mock objects, e.g.
Mockito
JMock
EasyMock
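For example, with Mockito the repository can be mocked and the interaction verified instead of ever reaching Hibernate (a sketch; the type names mirror the question):
import static org.mockito.Mockito.verify;

import org.junit.Before;
import org.junit.Test;
import org.mockito.Mock;
import org.mockito.MockitoAnnotations;

public class MyRepositoryMockTests {

    @Mock
    private MyRepository myRepository; // the mock never reaches getSession().persist()

    @Before
    public void setUp() {
        MockitoAnnotations.initMocks(this);
    }

    @Test
    public void foo() {
        Object obj = new Object();

        myRepository.add(obj);          // goes to the mock, nothing touches the database
        verify(myRepository).add(obj);  // assert on the interaction instead of on persisted state
    }
}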
You need to be able to mock your repository object so that you can use the mock in tests, and use the real one in the rest of your application.
DAO:
@Repository(value = "MockRepo")
public class MockMyRepositoryImpl implements MyRepository {

    @Override
    public void add(Foo foo) {
        // Do nothing here
    }
}
Test:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = { Context.class })
public class MyTests {

    @Autowired
    @Qualifier("MockRepo")
    private MyRepository repo;

    @Test
    public void testFooSave() {
        repo.add(obj);
    }
}
The alternative is to use a mocking framework as detailed in another answer. Mocking frameworks are more flexible, but if you want something simple that's just going to work then try the above.

How to roll back a transaction of an EntityManager injected by @PersistenceContext?

I am using JPA. I have an abstract class where I inject my EntityManager and a generic method that persists my object to the database, and all my service classes extend that abstract class. The issue is: sometimes I have to persist a client and the client details, but if an exception occurs while persisting the client, my program still persists the client details. That is why I am looking for a rollback I can call when a persist exception occurs.
I know I could do something like entityManager.getTransaction().rollback(), but that only works if I manage the EntityManager myself; in my case it is managed by the container.
Here is my abstract class:
public abstract class AbstractEntityFactory<E> {

    protected static final transient Logger LOGGER = CommonLogger.getLogger(AbstractEntityFactory.class);

    @PersistenceContext(unitName = "Test_PU")
    @Transient
    @XmlTransient
    private transient EntityManager entityManager;

    public E persist(final E arg0) {
        LOGGER.debug("persist : Begin");
        this.getEntityManager().persist(arg0);
        this.getEntityManager().flush();
        LOGGER.debug("persist : End");
        return arg0;
    }
}
NB: I have JBoss EAP 6 as the server.
You can try using UserTransaction.rollback to roll back the current transaction.
Since you're using a container, I assume you mean something like WildFly or JBoss EAP.
In that case I assume you're using @Transactional somewhere. This annotation has an attribute called rollbackFor.
If I remember correctly, the default setup rolls back only for unchecked (runtime) exceptions and not for checked exceptions, so you might want to change this.
If you need to roll back manually, use the UserTransaction. You should be able to get it via:
public class SomeBean {

    @Resource
    private UserTransaction transaction;
}
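A rough sketch of how that could look with bean-managed transactions (the class and entity names here are illustrative; with container-managed transactions you would call EJBContext.setRollbackOnly() instead of driving the UserTransaction yourself):
import javax.annotation.Resource;
import javax.ejb.Stateless;
import javax.ejb.TransactionManagement;
import javax.ejb.TransactionManagementType;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.transaction.UserTransaction;

@Stateless
@TransactionManagement(TransactionManagementType.BEAN)
public class ClientService { // hypothetical bean, for illustration only

    @Resource
    private UserTransaction transaction;

    @PersistenceContext(unitName = "Test_PU")
    private EntityManager entityManager;

    public void saveClientWithDetails(Client client, ClientDetails details) throws Exception {
        transaction.begin();
        try {
            entityManager.persist(client);
            entityManager.persist(details);
            transaction.commit();   // both entities are committed together
        } catch (Exception e) {
            transaction.rollback(); // neither entity is left behind
            throw e;
        }
    }
}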
You can use the rollback support of the @Transactional annotation:
@Transactional(propagation = Propagation.REQUIRED, rollbackFor = Exception.class)
public E persist(final E arg0) {
    LOGGER.debug("persist : Begin");
    this.getEntityManager().persist(arg0);
    this.getEntityManager().flush();
    LOGGER.debug("persist : End");
    return arg0;
}

Java Spring @Transactional method not rolling back as expected

Below is a quick outline of what I'm trying to do. I want to push a record to two different tables in the database from one method call. If anything fails, I want everything to roll back. So if insertIntoB fails, I want anything that would be committed in insertIntoA to be rolled back.
public class Service {

    MyDAO dao;

    public void insertRecords(List<Record> records) {
        for (Record record : records) {
            insertIntoAAndB(record);
        }
    }

    @Transactional(rollbackFor = Exception.class, propagation = Propagation.REQUIRES_NEW)
    public void insertIntoAAndB(Record record) {
        insertIntoA(record);
        insertIntoB(record);
    }

    @Transactional(propagation = Propagation.REQUIRED)
    public void insertIntoA(Record record) {
        dao.insertIntoA(record);
    }

    @Transactional(propagation = Propagation.REQUIRED)
    public void insertIntoB(Record record) {
        dao.insertIntoB(record);
    }

    public void setMyDAO(final MyDAO dao) {
        this.dao = dao;
    }
}
Here MyDAO is an interface that is mapped to the database using MyBatis and is set via Spring injection.
Right now if insertIntoB fails, everything from insertIntoA still gets pushed to the database. How can I correct this behavior?
EDIT:
I modified the class to give a more accurate description of what I'm trying to achieve. If I run insertIntoAAndB directly, the rollback works if there are any issues, but if I call insertIntoAAndB from insertRecords, the rollback doesn't work when issues arise.
I found the solution!
Apparently Spring can't intercept internal method calls to transactional methods. So I took out the method calling the transactional method, and put it into a separate class, and the rollback works just fine. Below is a rough example of the fix.
public class Foo {

    @Autowired
    private Service myService; // must be the Spring-managed proxy, not an instance created with new

    public void insertRecords(List<Record> records) {
        for (Record record : records) {
            myService.insertIntoAAndB(record);
        }
    }
}

public class Service {

    MyDAO dao;

    @Transactional(rollbackFor = Exception.class, propagation = Propagation.REQUIRES_NEW)
    public void insertIntoAAndB(Record record) {
        insertIntoA(record);
        insertIntoB(record);
    }

    @Transactional(propagation = Propagation.REQUIRED)
    public void insertIntoA(Record record) {
        dao.insertIntoA(record);
    }

    @Transactional(propagation = Propagation.REQUIRED)
    public void insertIntoB(Record record) {
        dao.insertIntoB(record);
    }

    public void setMyDAO(final MyDAO dao) {
        this.dao = dao;
    }
}
I think the behavior you encounter depends on which ORM / persistence provider and database you're using. I tested your case using Hibernate & MySQL and all my transactions rolled back fine.
If you do use Hibernate, enable SQL and transaction logging to see what it's doing:
log4j.logger.org.hibernate.SQL=DEBUG
log4j.logger.org.hibernate.transaction=DEBUG
// for hibernate 4.2.2
// log4j.logger.org.hibernate.engine.transaction=DEBUG
If you're on plain JDBC (using Spring's JdbcTemplate), you can also debug SQL & transactions at the Spring level:
log4j.logger.org.springframework.jdbc.core=DEBUG
log4j.logger.org.springframework.transaction=DEBUG
Double-check your autocommit settings and database-specific peculiarities (e.g. most DDL is committed right away; you won't be able to roll it back even though Spring/Hibernate issued a rollback).
This happens because Spring AOP applies advice through a proxy, and the interceptor chain is resolved for the method invoked on the proxy, not for methods that method then calls internally.
For example, if method A is annotated with @Transactional and method B calls method A but is itself not annotated, then when you invoke method B through the proxy, Spring AOP only checks method B (and the target class) for advice.
So if the calling method is not annotated with @Transactional, the transactional advice of the methods it calls internally is never applied.
Finally, here is the relevant source, org.springframework.aop.framework.JdkDynamicAopProxy:
public Object invoke(Object proxy, Method method, Object[] args) throws Throwable {
    ......
    // Get the interception chain for this method.
    List<Object> chain = this.advised.getInterceptorsAndDynamicInterceptionAdvice(method, targetClass);
    // Check whether we have any advice. If we don't, we can fall back on direct
    // reflective invocation of the target, and avoid creating a MethodInvocation.
    if (chain.isEmpty()) {
        // We can skip creating a MethodInvocation: just invoke the target directly.
        // Note that the final invoker must be an InvokerInterceptor so we know it does
        // nothing but a reflective operation on the target, and no hot swapping or fancy proxying.
        retVal = AopUtils.invokeJoinpointUsingReflection(target, method, args);
    }
    else {
        // We need to create a method invocation...
        invocation = new ReflectiveMethodInvocation(proxy, target, method, args, targetClass, chain);
        // Proceed to the joinpoint through the interceptor chain.
        retVal = invocation.proceed();
    }
}
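For completeness, an alternative to moving the caller into a separate class is to route the internal call through the current proxy. A sketch, assuming proxy exposure is enabled (e.g. @EnableAspectJAutoProxy(exposeProxy = true)):
import java.util.List;
import org.springframework.aop.framework.AopContext;

public class Service {

    // ... same transactional methods as above ...

    public void insertRecords(List<Record> records) {
        // fetch the current AOP proxy so the @Transactional advice on
        // insertIntoAAndB(...) is actually applied
        Service proxy = (Service) AopContext.currentProxy();
        for (Record record : records) {
            proxy.insertIntoAAndB(record);
        }
    }
}
Note that a plain this.insertIntoAAndB(record) call would still bypass the transactional advice; the call has to go through the exposed proxy.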
