Changes via JDBC update are not visible in a subsequent JDBC select - java

I have code with this structure (there are a lot of classes, but the general scheme is like this):
void f() {
    MyObj o = db.getById(id);
    o.setField1(value);
    db.update(o);
    o = db.getById(id);
    assertEquals(value, o.getField1());
}
The update and get methods use the same data source, injected with Spring. get works via JdbcTemplate, while update just takes a connection from the dataSource and uses raw JDBC.
update is marked with the @Transactional annotation.
Here is the definition of the transaction manager from the Spring config:
<tx:annotation-driven transaction-manager="TransactionManager"/>
<bean id="TransactionManager"
      class="org.springframework.orm.hibernate3.HibernateTransactionManager">
    <property name="sessionFactory" ref="sessionFactory"/>
</bean>
The issue is that if I call update and then get in separate web service calls, the result is correct and I get the updated values.
But if I call them sequentially in one unit-test method, then after the update I don't see the updated value.
I can't post the whole read/write code here, because it is large and split across many files, but perhaps you have some ideas how to fix it.
Thanks.

You have to flush the update before you can see it in the select.

You can try
entityManager.refresh(yourEntity);
This way you will get the most recent instance of your entity wherever you use it after this line.
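A minimal sketch of what the two suggestions above amount to, assuming the reads and writes actually go through a managed EntityManager (dao and entityManager are placeholder names here, since the original code was not posted; with a truly raw JDBC write path, flushing the persistence context alone will not help):
@Test
public void updateIsVisibleAfterFlushAndRefresh() {
    MyObj o = dao.getById(id);
    o.setField1(value);
    dao.update(o);

    entityManager.flush();      // push the pending UPDATE to the database
    entityManager.refresh(o);   // re-read the row, discarding stale in-memory state

    assertEquals(value, dao.getById(id).getField1());
}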

Related

Can Spring 3 formBackingObject return different command classes?

I have a situation where I need to check a condition in my formBackingObject and, depending on the condition, return one of two classes.
The problem is that, so far as I know, I can only define one commandName and commandClass in the servlet.xml. Anyone know of a way I can handle this? It doesn't seem like a rare use case, but I haven't really found any solution on the net.
Here is the logic block from my controller formBackingObject:
List<FooLoadShed> fooLoadShedList = this.fooLoadShedDao.getActiveSheds();
if (fooLoadShedList.isEmpty()) {
    logger.info("LoadShedActive is: " + this.sessionDetailsManager.getSessionDetails().isLoadShedActive());
    return new NoAction();
} else {
    this.sessionDetailsManager.getSessionDetails().setLoadShedActive(true);
    logger.info("LoadShedActive is: " + this.sessionDetailsManager.getSessionDetails().isLoadShedActive());
    logger.info("Number of load sheds: " + fooLoadShedList.size());
    return new ModelAndView(new RedirectView("custLookup.htm"));
}
and my servlet.xml config:
<bean name="/index.htm" class="springapp.web.indexController" scope="session">
<property name="sessionForm" value="true"/>
<property name="commandName" value="noAction"/>
<property name="commandClass" value="springapp.service.NoAction"/>
<property name="formView" value="index"/>
<property name="sessionDetailsManager" ref="sessionDetailsManager"/>
<property name="mobiConfigDao" ref="mobiConfigDao"/>
<property name="fooLoadShedDao" ref="fooLoadShed" />
</bean>
This is a very old way to configure Spring MVC controllers! I haven't seen something like this in over 10 years. Why not use a more modern Spring MVC configuration, with annotations instead of XML?
In any case, the importance of the command class is in the POST. Spring MVC must be able to construct an instance of the command class, and in order to do that it needs to know the specific class name. It then applies the form values to the properties of the command object it created, and this fully populated command object is handed to you in the handler method.
The configuration in the XML is for the default command object creation process. You can override this. In modern Spring MVC, this is done with a method-level @ModelAttribute annotation. In ancient Spring MVC, you need to override the methods that create the command object and create it yourself. BaseCommandController has a protected createCommand method. If that's the way you created your controller, that's where you would have to start.
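For the old-style controllers, a minimal sketch of that idea (my own illustration, reusing the names from the question; CustLookupCommand is a hypothetical second command class) is to decide inside formBackingObject which object to return, so the commandClass in servlet.xml only acts as the default:
@Override
protected Object formBackingObject(HttpServletRequest request) throws Exception {
    List<FooLoadShed> fooLoadShedList = this.fooLoadShedDao.getActiveSheds();
    if (fooLoadShedList.isEmpty()) {
        // matches the default commandClass configured in servlet.xml
        return new NoAction();
    }
    this.sessionDetailsManager.getSessionDetails().setLoadShedActive(true);
    // hypothetical alternative command class for the load-shed case
    return new CustLookupCommand();
}
Note that formBackingObject should return the command object itself; redirect logic such as the RedirectView in the question would then live in the submit handler rather than here.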

@Cacheable not working

I am using @Cacheable to cache the result of a method at the service layer in Spring 3.2. The following method is used inside the service class:
#Cacheable("questions")
public List<ABClassObject> getSecutityQuestionsList(){
List<ABClassObject> list = new ArrayList<ABClassObject>();
----------------
list = ----[DAO call]
return list;
}
XML configuration:
<cache:annotation-driven />
<!-- Generic cache manager based on the JDK ConcurrentMap -->
<bean id="cacheManager" class="org.springframework.cache.support.SimpleCacheManager">
    <property name="caches">
        <set>
            <bean class="org.springframework.cache.concurrent.ConcurrentMapCacheFactoryBean" p:name="questions" />
        </set>
    </property>
</bean>
I can't use EhCache because I'm on JDK 1.6.
With the above code the List result is not cached: the DAO is called every time I invoke the method.
So please suggest what is wrong with the code.
Thanks in advance.
Some things you should check:
The class containing the getSecutityQuestionsList method is a Spring bean, i.e. you don't create it with the new operator anywhere.
The method getSecutityQuestionsList is called from another bean, not via self-invocation within the same class (see the sketch after this list).
In your XML configuration put a <context:component-scan base-package="xxxxx"/>.
Put a break point inside your method. In the stack trace you should see some Spring proxy frames: when you call this method of your service, you should actually be calling a Spring proxy.
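A minimal sketch of the self-invocation pitfall from the second point (the class names QuestionService, QuestionFacade and QuestionDao are assumed for illustration; only getSecutityQuestionsList and ABClassObject come from the question):
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;

@Service
class QuestionService {

    @Autowired
    private QuestionDao questionDao; // hypothetical DAO

    @Cacheable("questions")
    public List<ABClassObject> getSecutityQuestionsList() {
        return questionDao.loadQuestions(); // hypothetical DAO call
    }

    public void selfInvocation() {
        // Bypasses the Spring proxy, so @Cacheable is NOT applied here.
        getSecutityQuestionsList();
    }
}

@Service
class QuestionFacade {

    @Autowired
    private QuestionService questionService;

    public void properInvocation() {
        // Goes through the proxy, so the "questions" cache is consulted.
        questionService.getSecutityQuestionsList();
    }
}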

How to intercept JDBC queries with Hibernate/Spring/Tomcat?

I'm trying to implement the solution outlined in this answer. The short of it is: I want to set the role for each database connection in order to provide better data separation for different customers. This requires intercepting JDBC queries or transactions, setting the user before the query runs and resetting it afterwards. This is mainly done to comply with some regulatory requirements.
Currently I'm using Tomcat and Tomcat's JDBC pool connecting to a PostgreSQL database. The application is built with Spring and Hibernate. So far I couldn't find any point for intercepting the queries.
I tried JDBC interceptors for Tomcat's built-in pool but they have to be global, and I need to access data from my web application in order to correlate requests to database users. As far as I can see, Hibernate's interceptors work only on entities, which is too high level for this use case.
What I need is something like the following:
class ConnectionPoolCallback {

    void onConnectionRetrieved(Connection conn) {
        conn.execute("SET ROLE " + getRole()); // getRole is some magic
    }

    void onConnectionReturned(Connection conn) {
        conn.execute("RESET ROLE");
    }
}
And now I need a place to register this callback... Does anybody have any idea how to implement something like this?
Hibernate 4 has multi-tenancy support. For plain SQL you will need DataSource routing, which I believe Spring has now or as an add-on.
I would not mess with (i.e. extend) the pool library.
Option 1:
As Adam mentioned, use Hibernate 4's multi-tenant support. Read the docs on Hibernate multi-tenancy and then implement the MultiTenantConnectionProvider and CurrentTenantIdentifierResolver interfaces.
In the getConnection method, call SET ROLE as you've done above. Although it's at the Hibernate level, this hook is pretty close in functionality to what you asked for in your question.
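A rough sketch of what that getConnection hook could look like (the class name and wiring are my own assumptions; only the two pivotal methods are shown, and the remaining MultiTenantConnectionProvider methods such as getAnyConnection, supportsAggressiveRelease and unwrap are omitted, so the implements clause is left out of this fragment):
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;
import javax.sql.DataSource;

public class RoleSettingConnectionProvider {

    private final DataSource dataSource; // the single shared pool

    public RoleSettingConnectionProvider(DataSource dataSource) {
        this.dataSource = dataSource;
    }

    public Connection getConnection(String tenantIdentifier) throws SQLException {
        Connection connection = dataSource.getConnection();
        Statement statement = connection.createStatement();
        try {
            // The tenant identifier comes from your CurrentTenantIdentifierResolver,
            // so it is assumed to be trusted input here.
            statement.execute("SET ROLE " + tenantIdentifier);
        } finally {
            statement.close();
        }
        return connection;
    }

    public void releaseConnection(String tenantIdentifier, Connection connection) throws SQLException {
        Statement statement = connection.createStatement();
        try {
            statement.execute("RESET ROLE");
        } finally {
            statement.close();
        }
        connection.close(); // returns the connection to the pool
    }
}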
Option 2:
I tried JDBC interceptors for Tomcat's built in pool but they have to
be global and I need to access data from my Web appliation in order to
correlate requests to database users.
If you can reconfigure your app to define the connection pool as a Spring bean rather than obtain it from Tomcat, you can probably add your own hook by proxying the data source:
<!-- I like c3p0, but use whatever pool you want -->
<bean id="actualDataSource" class="com.mchange.v2.c3p0.ComboPooledDataSource">
    <property name="jdbcUrl" value="${db.url}"/>
    <property name="user" value="${db.user}" />
    .....
</bean>
<!-- uses the actual data source. name it "dataSource". i believe the Spring tx
     stuff looks for a bean named "dataSource". -->
<bean id="dataSource" class="com.musiKk.RoleSettingDSProxy">
    <property name="actualDataSource"><ref bean="actualDataSource" /></property>
</bean>
<bean id="sessionFactory" class="org.springframework.orm.hibernate4.LocalSessionFactoryBean">
    <property name="dataSource"><ref bean="dataSource" /></property>
    ....
And then build com.musiKk.RoleSettingDSProxy like this:
public class RoleSettingDSProxy implements DataSource {

    private DataSource actualDataSource;

    public Connection getConnection() throws SQLException {
        Connection con = actualDataSource.getConnection();
        // do your thing here. reference a thread local set by
        // a servlet filter to get the current tenant and set the role
        return con;
    }

    public void setActualDataSource(DataSource actualDataSource) {
        this.actualDataSource = actualDataSource;
    }

    // the remaining DataSource methods (getConnection(username, password), getLogWriter,
    // setLoginTimeout, unwrap, isWrapperFor, ...) would simply delegate to actualDataSource
}
Note that I haven't actually tried option 2, it's just an idea. I can't immediately think of any reason why it wouldn't work, but it may unravel on you for some reason if you try to implement it.
One solution that comes to mind is to utilize the Hibernate listeners/callbacks, but be aware that this is very low level and quite error-prone. I use it myself to get a certain degree of automated audit logging going; it was not a pretty development cycle to get it to work reliably. Unfortunately I can't share code since I don't own it, but a minimal sketch of the listener approach follows the link below.
http://docs.jboss.org/hibernate/entitymanager/3.6/reference/en/html/listeners.html
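Here is a minimal sketch of the JPA entity-listener style described in the linked documentation, applied to the audit-logging use case mentioned above (the AuditListener and Customer names are made up for illustration):
import javax.persistence.EntityListeners;
import javax.persistence.PostUpdate;
import javax.persistence.PrePersist;

public class AuditListener {

    @PrePersist
    public void beforeInsert(Object entity) {
        // record who is about to create this entity, e.g. from a ThreadLocal user context
    }

    @PostUpdate
    public void afterUpdate(Object entity) {
        // record the change after Hibernate has issued the UPDATE
    }
}

// In a separate file, attached to an entity:
@javax.persistence.Entity
@EntityListeners(AuditListener.class)
class Customer {
    @javax.persistence.Id
    private Long id;
    // ...
}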

hibernate doesn't issue update after flush

I'm using hibernate 3.2.7 (same problem on 3.2.5) with spring 3.0.1, all deployed on weblogic 10.3 and with an Oracle 10g database. I'm using JTA transaction management and the transaction is distributed (it is actually started and ended in another application, this code is just in between).
The configuration used by hibernate is declared in my persistence.xml and is the following:
<property name="hibernate.dialect" value="org.hibernate.dialect.Oracle10gDialect"/>
<property name="hibernate.transaction.manager_lookup_class" value="org.hibernate.transaction.WeblogicTransactionManagerLookup"/>
<property name="hibernate.query.factory_class" value="org.hibernate.hql.classic.ClassicQueryTranslatorFactory"/>
<property name="hibernate.current_session_context_class" value="jta"/>
<property name="hibernate.connection.release_mode" value="auto"/>
The spring configuration regarding the transaction manager is the following:
<!-- Instructs Spring to perform declarative transaction management on annotated classes -->
<tx:annotation-driven transaction-manager="txManager" proxy-target-class="true"/>
<!-- Data about transaction manager and session factory -->
<bean id="txManager" class="org.springframework.transaction.jta.WebLogicJtaTransactionManager">
    <property name="transactionManagerName" value="javax.transaction.TransactionManager"/>
    <property name="defaultTimeout" value="${app.transaction.timeOut}"/>
</bean>
<bean id="entityManagerFactory" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
    <!-- persistence unit is missing the jta data source so that the application server does not
         create the EntityManagerFactory; spring will create its own LocalContainerEntityManagerFactoryBean overriding the data source -->
    <property name="dataSource" ref="myDataSource"/>
    <!-- specific properties like jpa provider and jpa provider properties are in the persistence unit -->
    <property name="persistenceUnitName" value="my.persistence.unit"/>
</bean>
<!-- define data source in application server -->
<jee:jndi-lookup id="myDataSource" jndi-name="${db.jndiName}"/>
I'm using a generic CrudDao with an update method that looks like this:
public void update(Object entity) {
    // entityManager injected by @PersistenceContext
    entityManager.merge(entity);
    entityManager.flush();
}

public Object getById(Object id, Class entityClass) throws PersistenceException {
    return (Object) entityManager.find(entityClass, id);
}
UPDATED: added the getById method.
The code that does not work as expected looks like this:
MyObject myObj = getMyObjectThroughSomeOneToManyRelation(idOne, idOther);
// till now was null
myObj.setSomeDateAttr(someDate);
genericDao.update(myObj);
MyObject myObjFromDB = genericDao.getById(myObj.getId(), MyObject.class);
The result is that if I print myObj.getSomeDateAttr() it returns the value of someDate, while myObjFromDB.getSomeDateAttr() is still null.
I've tried changing the update method to:
org.hibernate.Session s = (org.hibernate.Session) entityManager.getDelegate();
s.evict(entity);
s.update(entity);
s.flush();
And it still doesn't work.
When I turn on Hibernate's show_sql flag I don't see any UPDATE being issued on flush, nor when I query the entity manager for the object with the same id. The SELECTs are all visible.
UPDATE:
At the end of the transaction the update is actually called and everything is written to the db. So my problem is "just" during the transaction.
I'm afraid the problem may be linked with the configuration of the transaction manager on spring and on hibernate.
Hope that someone can help me as I have already lost a day and a half with no luck.
You need to look closely at Hibernate's merge behaviour. As per the documentation:
- if there is a persistent instance with the same identifier currently associated with the session, copy the state of the given object onto the persistent instance
- if there is no persistent instance currently associated with the session, try to load it from the database, or create a new persistent instance
- the persistent instance is returned
- the given instance does not become associated with the session, it remains detached
From what you describe about the SQL queries in the log, it looks like MyObject myObj = getMyObjectThroughSomeOneToManyRelation(idOne, idOther); returns a persistent object, but when you modify it (it becomes dirty) and call merge, the new state is copied onto the persistent object currently in the session. Per the third point, merge returns that persistent instance, which is the new manageable persistent object you need to use in subsequent operations.
When you call find, Hibernate returns the persistent object in the session and not the manageable persistent object returned by merge; that's why you don't see the changes in the object returned by find.
To fix your problem, change the return type of the update method:
public Object update(Object entity) {
    // entityManager injected by @PersistenceContext
    return entityManager.merge(entity);
}
and in the service use it as below:
MyObject myObj = getMyObjectThroughSomeOneToManyRelation(idOne, idOther);
// till now was null
myObj.setSomeDateAttr(someDate);
// You can use myObj as well instead of myNewObj
MyObject myNewObj = genericDao.update(myObj);
// No need to call get
// MyObject myObjFromDB = genericDao.getById(myObj.getId(), MyObject.class);
System.out.println("Updated value: " + myNewObj.getSomeDateAttr());
Have a look at this article as well.

Spring scoped-proxy transactions are fine via JPA but not committing via JDBC

I have a situation where I have to handle multiple clients in one app and each client has a separate database. To support that I'm using a Spring custom scope, quite similar to the built-in request scope. A user authenticates in each request and the context client ID is set based on the passed credentials. The scoping itself seems to be working properly.
So I used my custom scope to create a scoped proxy for my DataSource to support a different database per client. And I get connections to the proper databases.
Then I created a scoped proxy for EntityManagerFactory to use JPA. And this part also looks OK.
Then I added a scoped proxy for PlatformTransactionManager for declarative transaction management. I use @Transactional on my service layer and it gets propagated nicely to my Spring Data powered repository layer.
All is fine and works correctly as long as I use only JPA. I can even switch the context to a different client within the request (I use ThreadLocals under the hood) and transactions to both databases are handled correctly.
The problems start when I try to use JdbcTemplate in one of my custom repositories. At first glance all looks OK there too, as no exceptions are thrown. But when I check the database for the objects I thought I had inserted with my custom JDBC-based repository, they're not there!
I know for sure I can use JPA and JDBC together by declaring only a JpaTransactionManager and passing both the DataSource and EntityManagerFactory to it - I checked it without the scoped proxies and it works.
So the question is how to make JDBC work together with JPA using the JpaTransactionManager when I have scoped-proxied the DataSource, EntityManagerFactory and PlatformTransactionManager beans? I repeat that using only JPA works perfectly, but adding plain JDBC into the mix does not.
UPDATE 1: And one more thing: all read-only (SELECT) operations work fine with JDBC too - only writes (INSERT, UPDATE, DELETE) end up not committed or rolled back.
UPDATE 2: As @Tomasz suggested I've removed the scoped proxy from EntityManagerFactory and PlatformTransactionManager, as those are indeed not needed and provide more confusion than anything else.
The real problem seems to be switching the scope context within a transaction. The TransactionSynchronizationManager binds transactional resources (i.e. the EMF or DS) to the thread at transaction start. It has the ability to unwrap the scoped proxy, so it binds the actual instance of the resource from the scope that was active at the time the transaction started. Then when I change the context within a transaction, it all gets messed up.
It seems like I need to suspend the active transaction and store aside the current transaction context, so that I can clear it upon entering another scope, making Spring think it's not inside a transaction any more and forcing it to create one for the new scope when needed. Then, when leaving the scope, I'd have to restore the previously suspended transaction. Unfortunately I was unable to come up with a working implementation yet. Any hints appreciated.
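For reference, a tiny diagnostic sketch along these lines (my own, not part of the original code) dumps what TransactionSynchronizationManager has bound to the current thread, so the state before and after a scope switch can be compared:
import java.util.Map;
import org.springframework.transaction.support.TransactionSynchronizationManager;

// Call dump(...) before and after switching the client context to see which concrete
// DataSource / EntityManagerFactory instances are bound to the current thread.
public final class TxDebug {

    private TxDebug() {
    }

    public static void dump(String label) {
        Map<Object, Object> resources = TransactionSynchronizationManager.getResourceMap();
        System.out.println(label
                + " | actual tx active: " + TransactionSynchronizationManager.isActualTransactionActive()
                + " | bound resources: " + resources.keySet());
    }
}
If the DataSource bound before the switch is not the one the JdbcTemplate ends up using after the switch, the INSERTs run on a connection outside the transaction that eventually commits, which would match the silently lost writes.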
And below is some code of mine, but it's pretty standard, except for the scoped-proxies.
The DataSource:
<!-- provides database name based on client context -->
<bean id="clientDatabaseNameProvider"
      class="com.example.common.spring.scope.ClientScopedNameProviderImpl"
      c:clientScopeHolder-ref="clientScopeHolder"
      p:databaseName="${base.db.name}" />

<!-- an extension of org.apache.commons.dbcp.BasicDataSource that
     uses proper database URL based on database name given by above provider -->
<bean id="jpaDataSource" scope="client"
      class="com.example.common.spring.datasource.MysqlDbInitializingDataSource"
      destroy-method="close"
      p:driverClassName="${mysql.driver}"
      p:url="${mysql.url}"
      p:databaseNameProvider-ref="clientDatabaseNameProvider"
      p:username="${mysql.username}"
      p:password="${mysql.password}"
      p:defaultAutoCommit="false"
      p:connectionProperties="sessionVariables=storage_engine=InnoDB">
    <aop:scoped-proxy proxy-target-class="false" />
</bean>
The EntityManagerFactory:
<bean id="jpaVendorAdapter"
class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter"
p:database="MYSQL"
p:generateDdl="true"
p:showSql="true" />
<util:properties id="jpaProperties">
<!-- omitted for readability -->
</util:properties>
<bean id="jpaDialect"
class="org.springframework.orm.jpa.vendor.HibernateJpaDialect" />
<bean id="entityManagerFactory"
class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean"
p:packagesToScan="com.example.model.core"
p:jpaVendorAdapter-ref="jpaVendorAdapter"
p:dataSource-ref="jpaDataSource"
p:jpaDialect-ref="jpaDialect"
p:jpaProperties-ref="jpaProperties" />
The PlatformTransactionManager:
<bean id="transactionManager"
      class="org.springframework.orm.jpa.JpaTransactionManager"
      p:dataSource-ref="jpaDataSource"
      p:entityManagerFactory-ref="entityManagerFactory" />

<tx:annotation-driven proxy-target-class="false" mode="proxy"
                      transaction-manager="transactionManager" />
