How can I get a Spring JdbcTemplate to read_uncommitted?

Firstly, I can't use the declarative @Transactional approach because the application has multiple JDBC data-sources. I don't want to bore you with the details; suffice it to say the DAO method is passed the correct data-source to perform the logic. All JDBC data sources have the same schema; they're separated because I'm exposing REST services for an ERP system.
Due to this legacy system there are a lot of long lived locked records which I do not have control over, so I want dirty reads.
Using JDBC I would perform the following:
private Customer getCustomer(DataSource ds, String id) {
    Customer c = null;
    PreparedStatement stmt = null;
    Connection con = null;
    try {
        con = ds.getConnection();
        con.setTransactionIsolation(Connection.TRANSACTION_READ_UNCOMMITTED);
        stmt = con.prepareStatement(SELECT_CUSTOMER);
        stmt.setString(1, id);
        ResultSet res = stmt.executeQuery();
        c = buildCustomer(res);
    } catch (SQLException ex) {
        // log errors
    } finally {
        // Close resources
    }
    return c;
}
Okay, lots of boiler-plate, I know. So I've tried out JdbcTemplate since I'm using Spring.
Use JdbcTemplate
private Customer getCustomer(JdbcTemplate t, String id) {
    return t.queryForObject(SELECT_CUSTOMER, new CustomerRowMapper(), id);
}
Much nicer, but it's still using default transaction isolation. I need to somehow change this. So I thought about using a TransactionTemplate.
private Customer getCustomer(final TransactionTemplate tt,
                             final JdbcTemplate t,
                             final String id) {
    return tt.execute(new TransactionCallback<Customer>() {
        @Override
        public Customer doInTransaction(TransactionStatus ts) {
            return t.queryForObject(SELECT_CUSTOMER, new CustomerRowMapper(), id);
        }
    });
}
But how do I set the transaction isolation here? I can't find it anywhere on the callback or the TransactionTemplate to do this.
I'm reading Spring in Action, Third Edition, which explains as far as I've got. The chapter on transactions continues on to declarative transactions with annotations, but as mentioned I can't use those, since my DAO needs to determine at runtime which data-source to use based on the provided arguments, in my case a country code.
Any help would be greatly appreciated.

I've currently solved this by using the DataSourceTransactionManager directly, though it seems like I'm not saving as much boiler-plate as I first hoped. Don't get me wrong, it's cleaner, though I still can't help but feel there must be a simpler way. I don't need a transaction for the read, I just want to set the isolation.
private Customer getCustomer(final DataSourceTransactionManager txMan,
                             final JdbcTemplate t,
                             final String id) {
    DefaultTransactionDefinition def = new DefaultTransactionDefinition();
    def.setIsolationLevel(TransactionDefinition.ISOLATION_READ_UNCOMMITTED);
    TransactionStatus status = txMan.getTransaction(def);
    Customer c = null;
    try {
        c = t.queryForObject(SELECT_CUSTOMER, new CustomerRowMapper(), id);
    } catch (Exception ex) {
        txMan.rollback(status);
        throw ex;
    }
    txMan.commit(status);
    return c;
}
I'm still going to keep this one unanswered for a while as I truly believe there must be a better way.
Refer to Spring 3.1.x Documentation - Chapter 11 - Transaction Management

Using the TransactionTemplate helps you here; you just need to configure it appropriately. The transaction template also carries the transaction configuration; in fact, TransactionTemplate extends DefaultTransactionDefinition.
So somewhere in your configuration you should have something like this.
<bean id="txTemplate" class=" org.springframework.transaction.support.TransactionTemplate">
<property name="isolationLevelName" value="ISOLATION_READ_UNCOMMITTED"/>
<property name="readOnly" value="true" />
<property name="transactionManager" ref="transactionManager" />
</bean>
If you then inject this bean into your class you should be able to use the TransactionTemplate based code you posted/tried earlier.
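For illustration, here is roughly what that wiring might look like on the Java side; the CustomerDao class and setter names are my own assumptions, while SELECT_CUSTOMER and CustomerRowMapper are taken from the question:
public class CustomerDao {
    // Both collaborators are injected by Spring; txTemplate is the bean configured above
    private TransactionTemplate txTemplate;
    private JdbcTemplate jdbcTemplate;

    public void setTxTemplate(TransactionTemplate txTemplate) { this.txTemplate = txTemplate; }
    public void setJdbcTemplate(JdbcTemplate jdbcTemplate) { this.jdbcTemplate = jdbcTemplate; }

    public Customer getCustomer(final String id) {
        // The callback runs with ISOLATION_READ_UNCOMMITTED because the template carries that setting
        return txTemplate.execute(new TransactionCallback<Customer>() {
            @Override
            public Customer doInTransaction(TransactionStatus status) {
                return jdbcTemplate.queryForObject(SELECT_CUSTOMER, new CustomerRowMapper(), id);
            }
        });
    }
}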
However, there might be a nicer solution which can clean up your code. For one of the projects I worked on, we had a similar setup to yours (single app, multiple databases). For this we wrote some Spring code which basically switches the datasource when needed. More information can be found here.
If that is too far-fetched or overkill for your application, you can also try Spring's AbstractRoutingDataSource, which selects the proper datasource to use based on a lookup key (the country code in your case); a rough sketch follows below.
By using either of those two solutions you can start using Spring's declarative transaction management approach (which should clean up your code considerably).
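As a minimal sketch of the AbstractRoutingDataSource idea (the class name and the ThreadLocal holder are assumptions for illustration, not part of the original answer):
// Routes each connection request to the data source registered under the current country code.
public class CountryRoutingDataSource extends AbstractRoutingDataSource {

    private static final ThreadLocal<String> CURRENT_COUNTRY = new ThreadLocal<>();

    public static void setCountry(String countryCode) {
        CURRENT_COUNTRY.set(countryCode);
    }

    @Override
    protected Object determineCurrentLookupKey() {
        return CURRENT_COUNTRY.get();
    }
}
The per-country data sources would be registered on this bean via setTargetDataSources(Map), so the DAO layer only ever sees a single DataSource and a single JdbcTemplate.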

Define a proxy data source using the class org.springframework.jdbc.datasource.LazyConnectionDataSourceProxy and set the transaction isolation level on it. Inject the actual data source either through a setter or the constructor.
<bean id="yourDataSource" class="org.springframework.jdbc.datasource.LazyConnectionDataSourceProxy">
<constructor-arg index="0" ref="targetDataSource"/>
<property name="defaultTransactionIsolationName" value="TRANSACTION_READ_UNCOMMITTED"/>
</bean>
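If you prefer Java configuration over XML, an equivalent bean definition would presumably look something like this (targetDataSource stands in for your actual pool):
@Bean
public DataSource yourDataSource(@Qualifier("targetDataSource") DataSource targetDataSource) {
    // The proxy defers fetching a physical connection and applies the isolation level once one is obtained
    LazyConnectionDataSourceProxy proxy = new LazyConnectionDataSourceProxy(targetDataSource);
    proxy.setDefaultTransactionIsolationName("TRANSACTION_READ_UNCOMMITTED");
    return proxy;
}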

I'm not sure you can do it without working at the 'Transactional' abstraction level provided by Spring.
A more 'XML-free' way to build your TransactionTemplate could be something like this.
private TransactionTemplate getTransactionTemplate(String executionTenantCode, boolean readOnlyTransaction) {
    TransactionTemplate tt = new TransactionTemplate(transactionManager);
    tt.setReadOnly(readOnlyTransaction);
    tt.setIsolationLevel(TransactionDefinition.ISOLATION_READ_UNCOMMITTED);
    tt.setPropagationBehavior(TransactionDefinition.PROPAGATION_REQUIRES_NEW);
    return tt;
}
In any case I would "leverage" the @Transactional annotation, specifying the appropriate transaction manager bound to a separate data-source. I've done this for a multi-tenant application.
The usage:
@Transactional(transactionManager = CATALOG_TRANSACTION_MANAGER,
               isolation = Isolation.READ_UNCOMMITTED,
               readOnly = true)
public void myMethod() {
    //....
}
The bean(s) declaration:
public class CatalogDataSourceConfiguration {

    @Bean(name = "catalogDataSource")
    @ConfigurationProperties("catalog.datasource")
    public DataSource catalogDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = ENTITY_MANAGER_FACTORY)
    public EntityManagerFactory entityManagerFactory(
            @Qualifier("catalogEntityManagerFactoryBean") LocalContainerEntityManagerFactoryBean emFactoryBean) {
        return emFactoryBean.getObject();
    }

    @Bean(name = CATALOG_TRANSACTION_MANAGER)
    public PlatformTransactionManager catalogTM(@Qualifier(ENTITY_MANAGER_FACTORY) EntityManagerFactory emf) {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(emf);
        return transactionManager;
    }

    @Bean
    public NamedParameterJdbcTemplate catalogJdbcTemplate() {
        return new NamedParameterJdbcTemplate(catalogDataSource());
    }
}

Related

Can't resolve symbol SimpleNativeJdbcExtractor [duplicate]

My project is on spring-boot-starter-parent "1.5.9.RELEASE" and I'm migrating it to spring-boot-starter-parent "2.3.1.RELEASE".
This is a multi-tenant environment application, where one database has multiple schemas, and based on the tenant id, execution switches between schemas.
I had achieved this schema switching using SimpleNativeJdbcExtractor, but in the latest Spring Boot version NativeJdbcExtractor is no longer available.
Code snippet for the existing implementation:
@Bean
@Scope(
        value = ConfigurableBeanFactory.SCOPE_PROTOTYPE,
        proxyMode = ScopedProxyMode.TARGET_CLASS)
public JdbcTemplate jdbcTemplate() {
    JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
    SimpleNativeJdbcExtractor simpleNativeJdbcExtractor = new SimpleNativeJdbcExtractor() {
        @Override
        public Connection getNativeConnection(Connection con) throws SQLException {
            LOGGER.debug("Set schema for getNativeConnection " + Utilities.getTenantId());
            con.setSchema(Utilities.getTenantId());
            return super.getNativeConnection(con);
        }

        @Override
        public Connection getNativeConnectionFromStatement(Statement stmt) throws SQLException {
            LOGGER.debug("Set schema for getNativeConnectionFromStatement " + Utilities.getTenantId());
            Connection nativeConnectionFromStatement = super.getNativeConnectionFromStatement(stmt);
            nativeConnectionFromStatement.setSchema(Utilities.getTenantId());
            return nativeConnectionFromStatement;
        }
    };
    simpleNativeJdbcExtractor.setNativeConnectionNecessaryForNativeStatements(true);
    simpleNativeJdbcExtractor.setNativeConnectionNecessaryForNativePreparedStatements(true);
    jdbcTemplate.setNativeJdbcExtractor(simpleNativeJdbcExtractor);
    return jdbcTemplate;
}
Here Utilities.getTenantId() (a value stored in a ThreadLocal) gives the schema name based on the REST request.
Questions:
What are the alternatives to NativeJdbcExtractor, so that the schema can be changed dynamically for a JdbcTemplate?
Is there any other way to set the schema based on the request while creating the JdbcTemplate bean?
Any help, code snippet, or guidance to solve this issue is deeply appreciated.
Thanks.
When I was running the application in debug mode I saw that Spring was selecting a Hikari DataSource.
I had to intercept the getConnection call and update the schema.
So I did something like below.
I created a custom class which extends HikariDataSource:
public class CustomHikariDataSource extends HikariDataSource {
    @Override
    public Connection getConnection() throws SQLException {
        Connection connection = super.getConnection();
        connection.setSchema(Utilities.getTenantId());
        return connection;
    }
}
Then in the config class, I created a bean for my CustomHikariDataSource class.
@Bean
public DataSource customDataSource(DataSourceProperties properties) {
    final CustomHikariDataSource dataSource = (CustomHikariDataSource) properties
            .initializeDataSourceBuilder().type(CustomHikariDataSource.class).build();
    if (properties.getName() != null) {
        dataSource.setPoolName(properties.getName());
    }
    return dataSource;
}
Which will be used by the JdbcTemplate bean.
@Bean
@Scope(
        value = ConfigurableBeanFactory.SCOPE_PROTOTYPE,
        proxyMode = ScopedProxyMode.TARGET_CLASS)
public JdbcTemplate jdbcTemplate() throws SQLException {
    JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
    return jdbcTemplate;
}
With this approach, the DataSource bean is created only once, and for every JdbcTemplate access the proper schema is set at runtime.
There's no need to get rid of JdbcTemplate. NativeJdbcExtractor was removed in Spring Framework 5 as it isn't needed with JDBC 4.
You should replace your usage of NativeJdbcExtractor with calls to connection.unwrap(Class). The method is inherited by Connection from JDBC's Wrapper.
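A minimal sketch of what that might look like from a ConnectionCallback; Utilities.getTenantId() is borrowed from the question, and the vendor class mentioned in the comment is only an example:
jdbcTemplate.execute(new ConnectionCallback<Void>() {
    @Override
    public Void doInConnection(Connection con) throws SQLException {
        // With JDBC 4.1+ drivers the schema can be set on the pooled wrapper directly
        con.setSchema(Utilities.getTenantId());
        // Only if a vendor-specific API is required would you unwrap, e.g.
        // OracleConnection oc = con.unwrap(OracleConnection.class);
        return null;
    }
});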
You may also want to consider using AbstractRoutingDataSource which is designed to route connection requests to different underlying data sources based on a lookup key.

Is @Transactional supported for NamedParameterJdbcTemplate.batchUpdate?

Is @Transactional supported for NamedParameterJdbcTemplate.batchUpdate?
If something goes wrong during the batch execution, will it roll back as expected? Personally, I haven't experienced that, which is why I am asking.
Is there any document listing the methods @Transactional supports?
public class JdbcActorDao implements ActorDao {
    private NamedParameterJdbcTemplate namedParameterJdbcTemplate;

    public void setDataSource(DataSource dataSource) {
        this.namedParameterJdbcTemplate = new NamedParameterJdbcTemplate(dataSource);
    }

    @Transactional
    public int[] batchUpdate(List<Actor> actors) {
        return this.namedParameterJdbcTemplate.batchUpdate(
                "update t_actor set first_name = :firstName, last_name = :lastName where id = :id",
                SqlParameterSourceUtils.createBatch(actors));
    }

    // ... additional methods
}
NamedParameterJdbcTemplate is just an abstraction around JDBC. In Spring it is the transaction manager that is responsible for managing transactions (not that you cannot do it via plain JDBC, but this is the Spring way). Spring uses AOP internally to inspect the annotated methods and delegate their transaction management; that role is separate from the NamedParameterJdbcTemplate.
So you can use it freely and annotate your methods with @Transactional, as long as they are within a Spring-managed component/bean.
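For completeness, @Transactional only takes effect if transaction management is enabled and the transaction manager wraps the same DataSource the template uses; here is a minimal configuration sketch (the class and bean names are my own):
@Configuration
@EnableTransactionManagement
public class TxConfig {

    @Bean
    public DataSourceTransactionManager transactionManager(DataSource dataSource) {
        // Must be backed by the same DataSource as the NamedParameterJdbcTemplate,
        // otherwise batchUpdate() will not join the transaction
        return new DataSourceTransactionManager(dataSource);
    }
}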

JOOQ initializing DAO Best Approach

I want to know the best practice for initializing a jOOQ-generated DAO. Currently I am using the following approach, where StudentDao is generated by jOOQ.
public class ExtendedStudentDAO extends StudentDao {
    public ExtendedStudentDAO() {
        super();
    }

    public ExtendedStudentDAO(Connection connection) {
        Configuration configuration = DSL.using(connection,
                JDBCUtils.dialect(connection)).configuration();
        this.setConfiguration(configuration);
    }

    // adding extra methods to DAO using DSL
    public String getStudentName(Long ID)
            throws SQLException {
        try (Connection connection = ServiceConnectionManager.getConnection()) {
            DSLContext dslContext = ServiceConnectionManager
                    .getDSLContext(connection);
            Record1<String> record = dslContext
                    .select(Student.Name)
                    .from(Student.Student)
                    .where(Student.ID
                            .equal(ID)).fetchOne();
            if (record != null) {
                return record.getValue(Student.Name);
            }
            return null;
        }
    }
}
and I have doubts about using the above DAO; my example code is below.
try (Connection connection = ServiceConnectionManager.getConnection()) {
    ExtendedStudentDAO extendedStudentDAO = new ExtendedStudentDAO(connection);
    Student stud = new Student();
    .....
    ....
    // insert method is from the generated DAO
    extendedStudentDAO.insert(stud);
    // this method is added in the extended class
    extendedStudentDAO.getStudentName(12L);
}
There are two ways to look at this kind of initialisation:
Create DAOs every time you need them
Your approach is correct, but might be considered a bit heavy. You're creating a new DAO every time you need it.
As of jOOQ 3.7, a DAO is a pretty lightweight object. The same is true for the Configuration that wraps your Connection.
Once your project evolves (or in future jOOQ versions), that might no longer be true, as your Configuration initialisation (or jOOQ's DAO initialisation) might become heavier.
But this is a small risk, and it would be easy to fix:
Use dependency injection to manage DAO or Configuration references
Most people will set up only a single jOOQ Configuration for their application, and also only a single DAO instance (per DAO type), somewhere in a service. In this case, your Configuration must not share the Connection reference, but provide a Connection to jOOQ via the ConnectionProvider SPI. In your case, that seems trivial enough:
class MyConnectionProvider implements ConnectionProvider {
    @Override
    public Connection acquire() {
        return ServiceConnectionManager.getConnection();
    }

    @Override
    public void release(Connection connection) {
        try {
            connection.close();
        }
        catch (SQLException e) {
            throw new DataAccessException("Error while closing", e);
        }
    }
}
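The wiring of such a provider could then look roughly like this; the dialect and the generated-DAO constructor are assumptions based on a typical jOOQ setup, not code from the answer:
// One Configuration for the whole application, built once
Configuration jooqConfig = new DefaultConfiguration()
        .set(SQLDialect.MYSQL)               // pick your actual dialect
        .set(new MyConnectionProvider());

// Generated DAOs accept a Configuration, so a single instance can be reused
StudentDao studentDao = new StudentDao(jooqConfig);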

Is it possible to use transactions in Spring AOP advice?

I am trying to implement logging to a DB table using Spring AOP. By "logging to a table" I mean writing, to a special log table, information about a record that was CREATED/UPDATED/DELETED in the usual table for the domain object.
I wrote some part of the code and everything works well except one thing: when the transaction is rolled back, the changes in the log table still commit successfully. That's strange to me, because in my AOP advice the same transaction is used as in my business and DAO layers. (From my AOP advice I call methods of a special manager class with transaction propagation MANDATORY, and I also checked the transaction name, TransactionSynchronizationManager.getCurrentTransactionName(), in the business layer, DAO layer and AOP advice, and it is the same.)
Has anyone tried to implement similar things in practice? Is it possible to use in AOP advice the same transaction as in the business layer, and roll back changes made in the AOP advice if an error occurs in the business layer?
Thank you in advance for your answers.
EDIT
I want to clarify that the problem with rollback occurs only for changes that were made from the AOP advice. All changes made in the DAO layer are rolled back successfully. I mean that, for example, if some exception is thrown, then changes made in the DAO layer will be successfully rolled back, but the information in the log table will be saved (committed). I can't understand why, because as I wrote above, the AOP advice uses the same transaction.
EDIT 2
I checked with the debugger the piece of code where I am writing to the log table in the AOP advice, and it seems that JdbcTemplate's update method executes outside the transaction, because the changes had been committed to the DB directly after execution of the statement and before the transactional method finished.
EDIT 3
I solved this problem. Actually, it was my own silly fault. I'm using MySQL. After creating the log table I didn't change the DB engine, and HeidiSQL set MyISAM by default. But MyISAM doesn't support transactions, so I changed the DB engine to InnoDB (as for all other tables) and now everything works perfectly.
Thank you all for the help and sorry for the disturbance.
If someone is interested, here is a simplified example that illustrates my approach.
Consider a DAO class that has a save method:
#Repository(value="jdbcUserDAO")
#Transactional(propagation=Propagation.SUPPORTS, readOnly=true, rollbackFor=Exception.class)
public class JdbcUserDAO implements UserDAO {
#Autowired
private JdbcTemplate jdbcTemplate;
#LoggedOperation(affectedRows = AffectedRows.ONE, loggedEntityClass = User.class, operationName = OperationName.CREATE)
#Transactional(propagation=Propagation.REQUIRED, readOnly=false, rollbackFor=Exception.class)
#Override
public User save(final User user) {
if (user == null || user.getRole() == null) {
throw new IllegalArgumentException("Input User object or nested Role object should not be null");
}
KeyHolder keyHolder = new GeneratedKeyHolder();
jdbcTemplate.update(new PreparedStatementCreator() {
#Override
public PreparedStatement createPreparedStatement(Connection connection)
throws SQLException {
PreparedStatement ps = connection.prepareStatement(SQL_INSERT_USER, new String[]{"ID"});
ps.setString(1, user.getUsername());
ps.setString(2, user.getPassword());
ps.setString(3, user.getFullName());
ps.setLong(4, user.getRole().getId());
ps.setString(5, user.geteMail());
return ps;
}
}, keyHolder);
user.setId((Long) keyHolder.getKey());
VacationDays vacationDays = user.getVacationDays();
vacationDays.setId(user.getId());
// Create related vacation days record.
vacationDaysDAO.save(vacationDays);
user.setVacationDays(vacationDays);
return user;
}
}
Here is how the aspect looks:
@Component
@Aspect
@Order(2)
public class DBLoggingAspect {
    @Autowired
    private DBLogManager dbLogManager;

    @Around(value = "execution(* com.crediteuropebank.vacationsmanager.server.dao..*.*(..)) " +
            "&& @annotation(loggedOperation)", argNames = "loggedOperation")
    public Object doOperation(final ProceedingJoinPoint joinPoint,
                              final LoggedOperation loggedOperation) throws Throwable {
        Object[] arguments = joinPoint.getArgs();
        /*
         * This should be called before logging operation.
         */
        Object retVal = joinPoint.proceed();
        // Execute logging action
        dbLogManager.logOperation(arguments, loggedOperation);
        return retVal;
    }
}
And here is how my DB log manager class looks:
#Component("dbLogManager")
public class DBLogManager {
#Autowired
private JdbcTemplate jdbcTemplate;
#InjectLogger
private Logger logger;
#Transactional(rollbackFor={Exception.class}, propagation=Propagation.MANDATORY, readOnly=false)
public void logOperation(final Object[] inputArguments, final LoggedOperation loggedOperation) {
try {
/*
* Prepare query and array of the arguments
*/
jdbcTemplate.update(insertQuery.toString(),
insertedValues);
} catch (Exception e) {
StringBuilder sb = new StringBuilder();
// Prepare log string
logger.error(sb.toString(), e);
}
}
It could be to do with the order of the advice: you would want your @Transactional-related advice to take effect around (or before and after) your logging-related advice. If you are using Spring AOP you can probably control this using the order attribute of the advice; give your transaction-related advice the highest precedence so that it executes last on the way out.
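If the transactions are annotation-driven, the relative precedence can presumably be pinned down explicitly; a sketch with illustrative values, given that the logging aspect above already declares @Order(2):
@Configuration
@EnableTransactionManagement(order = 1) // transaction advice outranks the @Order(2) logging aspect
public class TransactionOrderConfig {
    // With XML configuration the equivalent would be <tx:annotation-driven order="1"/>
}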
Nothing to do with AOP: if you are using XML configuration, set the datasource's autoCommit property to false, like:
<bean id="datasource" ...>
    <property name="autoCommit" value="false"/>
</bean>

Combining JBehave with SpringJUnit4ClassRunner to enable transaction rollback

Essence:
How can I auto-rollback my Hibernate transaction in a JUnit test run with JBehave?
The problem seems to be that JBehave wants the SpringAnnotatedEmbedderRunner, but annotating a test as @Transactional requires the SpringJUnit4ClassRunner.
I've tried to find some documentation on how to implement either rollback with the SpringAnnotatedEmbedderRunner or to make JBehave work using the SpringJUnit4ClassRunner, but I couldn't get either to work.
Does anyone have a (preferably simple) setup that runs JBehave stories with Spring and Hibernate and transaction auto-rollback?
Further info about my setup so far:
Working JBehave with Spring - but not with auto-rollback:
@RunWith(SpringAnnotatedEmbedderRunner.class)
@Configure(parameterConverters = ParameterConverters.EnumConverter.class)
@UsingEmbedder(embedder = Embedder.class, generateViewAfterStories = true, ignoreFailureInStories = false, ignoreFailureInView = false)
@UsingSpring(resources = { "file:src/main/webapp/WEB-INF/test-context.xml" })
@UsingSteps
@Transactional // << won't work
@TransactionConfiguration(...) // << won't work
// both require the SpringJUnit4ClassRunner
public class DwStoryTests extends JUnitStories {

    protected List<String> storyPaths() {
        String searchInDirectory = CodeLocations.codeLocationFromPath("src/test/resources").getFile();
        return new StoryFinder().findPaths(searchInDirectory, Arrays.asList("**/*.story"), null);
    }
}
In my test steps I can @Inject everything nicely:
@Component
@Transactional // << won't work
public class PersonServiceSteps extends AbstractSmockServerTest {
    @Inject
    private DatabaseSetupHelper databaseSetupHelper;

    @Inject
    private PersonProvider personProvider;

    @Given("a database in default state")
    public void setupDatabase() throws SecurityException {
        databaseSetupHelper.createTypes();
        databaseSetupHelper.createPermission();
    }

    @When("the service $service is called with message $message")
    public void callServiceWithMessage(String service, String message) {
        sendRequestTo("/personService", withMessage("requestPersonSave.xml")).andExpect(noFault());
    }

    @Then("there should be a new person in the database")
    public void assertNewPersonInDatabase() {
        Assert.assertEquals("Service did not save person: ", personProvider.count(), 1);
    }
}
(yes, the databaseSetupHelper methods are all transactional)
PersonProvider is basically a wrapper around org.springframework.data.jpa.repository.support.SimpleJpaRepository. So there is access to the entityManager, but taking control over the transactions (with begin/rollback) didn't work, I guess because of all the @Transactionals that are applied under the hood inside that helper class.
Also I read that JBehave runs in a different context/session/something, which causes loss of control over the transaction started by the test? Pretty confusing stuff...
Edit:
Edited the above, rephrasing the post to reflect my current knowledge and shortening the whole thing so that the question becomes more obvious and the setup less obtrusive.
I think you can skip the SpringAnnotatedEmbedderRunner and provide the necessary configuration to JBehave yourself. For example, instead of
@UsingEmbedder(embedder = Embedder.class, generateViewAfterStories = true, ignoreFailureInStories = false, ignoreFailureInView = false)
you can do
configuredEmbedder()
    .embedderControls()
    .doGenerateViewAfterStories(true)
    .doIgnoreFailureInStories(false)
    .doIgnoreFailureInView(false);
Besides: why do you want to roll back the transaction? Typically you use JBehave for acceptance tests, which run in a production-like environment. For example, you first set up some data in the database, access it via browser/Selenium and check the results. For that to work, the DB transaction has to be committed. You do need to clean up manually after your tests, which you can do in @AfterStories or @AfterScenario annotated methods.
I made it work by controlling the transaction scope manually, rolling it back after each scenario. Just follow the official guide on how to use Spring with JBehave and then do the trick as shown below.
@Component
public class MySteps
{
    @Autowired
    MyDao myDao;

    @Autowired
    PlatformTransactionManager transactionManager;

    TransactionStatus transaction;

    @BeforeScenario
    public void beforeScenario() {
        transaction = transactionManager.getTransaction(new DefaultTransactionDefinition());
    }

    @AfterScenario
    public void afterScenario() {
        if (transaction != null)
            transactionManager.rollback(transaction);
    }

    @Given("...")
    public void persistSomething() {
        myDao.persist(new Foo());
    }
}
I'm not familiar with JBehave, but it appears you're searching for this annotation:
@TransactionConfiguration(transactionManager = "transactionManager", defaultRollback = true)
You could also set defaultRollback to true in your test context.
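For illustration, on a plain SpringJUnit4ClassRunner-based test (which, as the question notes, conflicts with the JBehave runner) that would look roughly like this; the test class name is hypothetical and the context location is taken from the question:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("file:src/main/webapp/WEB-INF/test-context.xml")
@TransactionConfiguration(transactionManager = "transactionManager", defaultRollback = true)
@Transactional
public class PersonServiceTransactionalTest {
    // each @Test method runs inside a transaction that is rolled back when the method finishes
}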
