From what I understand, both DataSource and JdbcTemplate are thread-safe, so you can configure a single instance of JdbcTemplate and then safely inject this shared reference into multiple DAOs (or repositories). DataSource should also be a Spring singleton, since it manages the connection pool.
The official Spring documentation, JdbcTemplate best practices, explains the alternatives (excerpts from the manual are in italics, and my notes are between square brackets):
configure a DataSource in your Spring configuration file, and then dependency-inject that shared DataSource bean into your DAO classes; the JdbcTemplate is created in the setter for the DataSource. [With XML configuration, this leads to multiple JdbcTemplate instances, since the DataSource setter contains new JdbcTemplate(dataSource).]
use component-scanning and annotation support for dependency injection. In this case you annotate the class with @Repository (which makes it a candidate for component-scanning) and annotate the DataSource setter method with @Autowired. [This case also leads to multiple JdbcTemplate instances.]
If you are using Spring's JdbcDaoSupport class, and your various JDBC-backed DAO classes extend from it, then your sub-class inherits a setDataSource(..) method from the JdbcDaoSupport class. You can choose whether to inherit from this class. The JdbcDaoSupport class is provided as a convenience only. [Since you have an instance of JdbcDaoSupport for each class extending it, there is a JdbcTemplate instance for each instance of the derived class as well (see the source code of JdbcDaoSupport).]
However, a later note discourages all the options just presented:
Once configured, a JdbcTemplate instance is threadsafe. You may want multiple JdbcTemplate instances if your application accesses multiple databases, which requires multiple DataSources, and subsequently multiple differently configured JdbcTemplates.
In other words, all the options just presented result in multiple JdbcTemplate instances (one per DAO), while right after that the docs say this is not necessary when working with a single database.
What I would do is inject the JdbcTemplate directly into the various DAOs needing it. So my question is: is it OK to do so? And do you also think that the Spring reference documentation contradicts itself, or is it my misunderstanding?
IMO, there is no problem with injecting a JdbcTemplate into your (multiple) DAOs. The template is used to "wire" your DAO to the physical resource (the DB connection) when you need to run a query. So if the SessionFactory and the TransactionManager are properly configured, you will not run into concurrency problems; Spring manages the lifecycle of the beans you need for working with your persistence layer. The advantages of using a template are:
The JDBC template automatically manages the physical resources required to interact with the DB, e.g. it creates and releases the database connections.
The Spring JDBC template converts the standard JDBC SQLExceptions into RuntimeExceptions. This allows you to react more flexibly to errors. It also converts the vendor-specific error messages into better understandable error messages.
So this should be split into two situations:
We don’t change JdbcTemplate properties in the DAO; then we can define it as below:
<bean id="tlmJDBCTemplate" class="org.springframework.jdbc.core.JdbcTemplate">
    <property name="dataSource" ref="plmTlmDataSource"/>
</bean>
NOTE: Most of the time we don’t change the JdbcTemplate properties, because it is not necessary.
We change JdbcTemplate properties in the DAO; then we should extend JdbcDaoSupport.
State:
• fetchSize: if this variable is set to a non-zero value, it will be used for setting the fetchSize property on statements used for query processing (default: the JDBC driver's default)
• maxRows: if this variable is set to a non-zero value, it will be used for setting the maxRows property on statements used for query processing (default: the JDBC driver's default)
• queryTimeout: if this variable is set to a non-zero value, it will be used for setting the queryTimeout property on statements used for query processing (default: the JDBC driver's default)
• skipResultsProcessing: if this variable is set to true, all results checking will be bypassed for any callable statement processing. This can be used to avoid a bug in some older Oracle JDBC drivers such as 10.1.0.2 (default: false)
• skipUndeclaredResults: if this variable is set to true, all results from a stored procedure call that don't have a corresponding SqlOutParameter declaration will be bypassed. All other results processing will take place unless skipResultsProcessing is set to true (default: false)
• resultsMapCaseInsensitive: if this variable is set to true, execution of a CallableStatement will return the results in a Map that uses case-insensitive names for the parameters, if Commons Collections is available on the classpath (default: false)
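For the second situation, one option that does not require extending JdbcDaoSupport is defining a dedicated, differently configured template bean. A hedged sketch (the bean id is hypothetical; fetchSize and queryTimeout are standard JdbcTemplate setter-backed properties):

```xml
<bean id="reportingTemplate" class="org.springframework.jdbc.core.JdbcTemplate">
    <property name="dataSource" ref="plmTlmDataSource"/>
    <!-- fetch rows in larger batches for read-heavy reporting queries -->
    <property name="fetchSize" value="500"/>
    <!-- fail queries that run longer than 30 seconds -->
    <property name="queryTimeout" value="30"/>
</bean>
```

DAOs that need the custom settings get this bean injected; all others share the default template.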
JdbcDaoSupport
public abstract class JdbcDaoSupport extends DaoSupport {

    private JdbcTemplate jdbcTemplate;

    /**
     * Set the JDBC DataSource to be used by this DAO.
     */
    public final void setDataSource(DataSource dataSource) {
        if (this.jdbcTemplate == null || dataSource != this.jdbcTemplate.getDataSource()) {
            this.jdbcTemplate = createJdbcTemplate(dataSource);
            initTemplateConfig();
        }
    }

    // ...
}
Summary: I don’t think the practice Spring gives in the guide is the best.
Inherently, Spring is very subtle about best practices.
JdbcTemplate is thread-safe, and notably lock-free (as of v4.2.4).
Meaning it should not cause performance degradation when shared between concurrent threads*.
Thus, there are no compelling reasons for more than one instance per data source.
Speculative note: this section of the docs is indeed confusing, probably due to historical (evolutionary) reasons. Maybe Spring had a per-DAO policy in the past due to non-thread-safety or a poor understanding of the domain at the time, similar to the XML-based configuration "disaster". Nowadays Spring renounces opinionated views and strives to be flexible instead, which, unfortunately, has led to bad design choices being acknowledged only covertly.
* measure, don't guess
After so many years, seeing this question again, I think we can create the JdbcTemplate as a singleton first and then inject it into the DAOs, so there is only one instance of the template.
<bean id="template" class="org.springframework.jdbc.core.JdbcTemplate">
<property name="dataSource" ref="dataSource" />
</bean>
Then you can inject the template into the DAO, or have the DAO extend JdbcDaoSupport, which exposes the setter below.
public final void setJdbcTemplate(JdbcTemplate jdbcTemplate) {
    this.jdbcTemplate = jdbcTemplate;
    initTemplateConfig();
}
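To illustrate why sharing is safe, here is a plain-Java sketch of that wiring. TemplateStub is a hypothetical stand-in for JdbcTemplate, used only so the example runs without Spring on the classpath; the point is that one instance is created once and handed to every DAO through its constructor, which is exactly what singleton-scoped injection does.

```java
import java.util.List;

// Hypothetical stand-in for org.springframework.jdbc.core.JdbcTemplate.
class TemplateStub {
    int queries = 0;

    List<String> query(String sql) {
        queries++;
        return List.of(sql);
    }
}

class UserDao {
    private final TemplateStub template;

    UserDao(TemplateStub template) {
        this.template = template; // injected shared singleton, not created per DAO
    }

    List<String> findAll() {
        return template.query("SELECT * FROM users");
    }
}

class OrderDao {
    private final TemplateStub template;

    OrderDao(TemplateStub template) {
        this.template = template;
    }

    List<String> findAll() {
        return template.query("SELECT * FROM orders");
    }
}

public class SharedTemplateDemo {
    public static void main(String[] args) {
        TemplateStub shared = new TemplateStub(); // the singleton "template" bean
        UserDao users = new UserDao(shared);
        OrderDao orders = new OrderDao(shared);
        users.findAll();
        orders.findAll();
        System.out.println(shared.queries); // both DAOs went through the same instance
    }
}
```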
Related
I am having trouble finding information about this issue I am running into. I am interested in implementing row-level security on my Postgres DB, and I am looking for a way to set Postgres session variables automatically through some form of an interceptor. Now, I know that with Hibernate you are able to do row-level security using @Filter and @FilterDef; however, I would additionally like to set policies on my DB.
A very simple way of doing this would be to execute the SQL statement SET variable=value prior to every query, though I have not been able to find any information on this.
This is being used in a spring-boot application, and every request is expected to have access to a request-specific value of the variable.
Since your application uses spring, you could try accomplishing this in one of a few ways:
Spring AOP
In this approach, you write an advice that you ask Spring to apply to specific methods. If your methods use the @Transactional annotation, you could have the advice applied to them immediately after the transaction has started.
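As a hedged sketch of what such an advice could execute once the transaction is open (the GUC name app.current_user is hypothetical; your RLS policies would read it via current_setting('app.current_user')):

```java
import java.sql.Connection;
import java.sql.Statement;

public final class SessionVariableSupport {

    // Builds: SELECT set_config('<name>', '<value>', true)
    // is_local = true scopes the setting to the current transaction,
    // so it is discarded automatically on commit or rollback.
    static String buildSetConfigSql(String name, String value) {
        // naive quote-doubling for the sketch; prefer a PreparedStatement in real code
        return "SELECT set_config('" + name + "', '" + value.replace("'", "''") + "', true)";
    }

    // What the advice would run right after the transaction begins.
    static void applyPolicyVariable(Connection conn, String value) throws Exception {
        try (Statement st = conn.createStatement()) {
            st.execute(buildSetConfigSql("app.current_user", value));
        }
    }
}
```

Using set_config(..., true) rather than a plain SET makes the value transaction-local, so it cannot leak to the next request through the connection pool.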
Extended TransactionManager Implementation
Let's assume your application is using JpaTransactionManager.
public class SecurityPolicyInjectingJpaTransactionManager extends JpaTransactionManager {

    @Autowired
    private EntityManager entityManager;

    // constructors

    @Override
    protected void prepareSynchronization(DefaultTransactionStatus status, TransactionDefinition definition) {
        super.prepareSynchronization(status, definition);
        if (status.isNewTransaction()) {
            // Use entityManager to execute your database policy param/values.
            // I would suggest you also register an after-completion callback synchronization;
            // this after-completion would clear all the policy param/values
            // regardless of whether the transaction succeeded or failed,
            // since this happens just before the connection gets returned to the pool.
        }
    }
}
Now simply configure your JPA environment to use your custom JpaTransactionManager class.
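For XML-based setups, that could look something like this (the package and bean names are assumptions; entityManagerFactory is the standard JpaTransactionManager property):

```xml
<bean id="transactionManager"
      class="com.example.tx.SecurityPolicyInjectingJpaTransactionManager">
    <property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>
```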
There are likely others, but these are the two that come to mind that I've explored.
As far as I know, Hibernate Envers stores a revision when you create, change or delete an object annotated with @Audited.
Envers automatically sets the revision date to the current time. Is it possible to set this time manually?
I'd need this to handle a temporal collection where data has valid time, which I'd need to set manually.
You can, but it may not seem intuitive at first.
When Envers creates its revision-entity instance, several things happen.
The @RevisionTimestamp-annotated property is set with the current time.
The optional RevisionListener is called and supplied the revision-entity instance.
You can specify a RevisionListener in two ways, and this really depends on whether you're currently supplying a custom revision-entity instance or using the instance Envers resolves based on your setup.
Supplying Custom Revision Entity
In this situation, you can specify your RevisionListener by setting it in the @RevisionEntity annotation on the entity class:
@RevisionEntity(YourCustomRevisionListener.class)
public class CustomRevisionEntity {
...
}
Supplying RevisionListener via configuration
In this situation, you'll want to add an additional bootstrap configuration property for Hibernate, either via your hibernate.properties file or in the code where you explicitly set the Hibernate configuration properties:
org.hibernate.envers.revision_listener=com.company.envers.YourCustomRevisionListener
Regardless of which approach you take, you'll then implement the listener's contract and explicitly set the timestamp value based on whatever rules your application needs:
public class YourCustomRevisionListener implements RevisionListener {

    @Override
    public void newRevision(Object revisionEntity) {
        // I am going to assume here you're using a custom revision entity.
        // If you are not, you'll need to cast it to the appropriate class implementation.
        final CustomRevisionEntity revisionEntityImpl = (CustomRevisionEntity) revisionEntity;
        revisionEntityImpl.setTimestamp( resolveValidTimestampValue() );
    }

    private long resolveValidTimestampValue() {
        // implement your logic here and return the valid-time timestamp
        return 0L; // placeholder
    }
}
There are a couple caveats here. If you need to resolve the value from some bean in your application space, you'll need to determine which of the following applies to you:
Using Hibernate Envers version prior to 5.3
In this case you'll have to use the legacy approach of ThreadLocal variables to pass application-scope instances/values, so that you can access them inside the listener.
Using Hibernate Envers 5.3 or later with CDI
In this case you can simply inject the CDI bean using CDI's injection since we added support to automatically resolve CDI beans when we create the listener instance.
Using Hibernate Envers 5.3 or later with Spring 5.1+
You can inject Spring beans directly into the listener using Spring's injection annotations, just as if the listener were a Spring bean.
Using Hibernate Envers 5.3 or later with Spring prior to 5.1
In this case, you'll need to use the legacy approach of ThreadLocal variables since Spring Framework didn't add support for injecting beans into Hibernate beans until 5.1.
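For the ThreadLocal cases above, a minimal sketch of the hand-off could look like this (RevisionTimestampHolder is a hypothetical helper, not an Envers API): the service layer sets the valid time before saving, the listener reads it, and a finally block clears it.

```java
// Hypothetical helper for the pre-5.3 / pre-Spring-5.1 cases: the application
// stores the desired revision timestamp in a ThreadLocal before flushing, and
// the RevisionListener reads it from the same thread inside newRevision(..).
public final class RevisionTimestampHolder {

    private static final ThreadLocal<Long> TIMESTAMP = new ThreadLocal<>();

    private RevisionTimestampHolder() {
    }

    public static void set(long validTimeMillis) {
        TIMESTAMP.set(validTimeMillis);
    }

    // Falls back to "now" when the caller did not provide a valid time.
    public static long getOrNow() {
        Long value = TIMESTAMP.get();
        return value != null ? value : System.currentTimeMillis();
    }

    public static void clear() {
        TIMESTAMP.remove();
    }
}
```

resolveValidTimestampValue() in the listener would then simply return RevisionTimestampHolder.getOrNow(), and callers wrap their save in set(..) / try { ... } finally { clear(); }.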
Which is the best approach to implement several different databases in one project, using Spring JdbcDaoSupport?
I have several DBs with different datasources and syntax: MySQL & Postgres, for example. In pure Java JDBC projects I used the Factory Method and Abstract Factory patterns, and multiple DAO impl classes (one for each DB) with common DAO interfaces to switch between databases. Now I use Spring JDBC and want to implement similar behavior.
I faced the same matter two years ago and finally chose an implementation based on a Spring custom scope (http://docs.spring.io/spring/docs/current/spring-framework-reference/htmlsingle/#beans-factory-scopes-custom).
The Spring framework allows multiple instances of the same bean definition to coexist; they differ only in some contextual settings.
For instance, this bean definition will create a different loginAction bean for each HTTP request being processed:
<bean id="loginAction" class="com.foo.LoginAction" scope="request"/>
If you create a custom scope called "domain", you will be able to instantiate several datasources based on the same bean definition.
A datasource bean definition based on JndiObjectFactoryBean would let the servlet container manage the database connection (through the web.xml file). However, you would have to parameterize your datasource name with a Spring property.
Beans like the database Transaction Manager must also be marked with this scope.
Next you need to activate the scope each time an HTTP request is processed: I suggest defining the datasource name as a prefix of the request URL.
Because most web frameworks allow you to intercept HTTP requests, you can determine the expected datasource before processing the request.
Then create (or reuse) a set of beans specific to the selected datasource and store it inside a ThreadLocal variable (which your custom scope implementation will rely on).
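A minimal sketch of that ThreadLocal bookkeeping, assuming the interceptor has already parsed the datasource name from the URL (all names here are hypothetical; a real implementation would sit behind Spring's org.springframework.beans.factory.config.Scope interface):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

public final class DomainScopeHolder {

    private static final ThreadLocal<String> CURRENT_DOMAIN = new ThreadLocal<>();
    private static final ThreadLocal<Map<String, Object>> BEANS =
            ThreadLocal.withInitial(HashMap::new);

    private DomainScopeHolder() {
    }

    // Called by the request interceptor once the datasource name is known.
    public static void enter(String domain) {
        CURRENT_DOMAIN.set(domain);
    }

    // Mirrors Scope#get(name, objectFactory): reuse the bean if the current
    // thread already created it for this domain, otherwise build it.
    public static Object getOrCreate(String beanName, Supplier<Object> factory) {
        return BEANS.get().computeIfAbsent(CURRENT_DOMAIN.get() + ":" + beanName,
                k -> factory.get());
    }

    // Called when the request completes, so nothing leaks across pooled threads.
    public static void exit() {
        CURRENT_DOMAIN.remove();
        BEANS.remove();
    }
}
```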
This implementation may look a little complex at first glance, but its usage is transparent.
I'm thinking of implementing Objectify DAO with dependency injection, such that I can maintain my code to access the same "Dao" while the implementation may change from Objectify to Hibernate-MySQL or MongoDb in the future without me worrying on changing any code in the UI or client side.
UserDao is based on the example here:
http://turbomanage.wordpress.com/2010/01/28/simplify-with-objectify/
public class UserObjectifyDaoImpl implements Dao<User> {
private UserDao dao = null;
public void put(User entity) {
if (dao == null) {
dao = new UserDao();
}
dao.put(entity);
}
// other put and set methods
}
Such that, I have the context.xml:
<bean id="userDao" class="com.example.server.daoimpl.UserObjectifyDaoImpl">
<property name="dataSource" ref="dataSource"/>
</bean>
And if I need to change the implementation, I just need to change this bean from UserObjectifyDaoImpl to something like UserHibernateDaoImpl or UserMongoDBDaoImpl, or whatever implementation saves to whatever database.
And still have my code in the UI / Client intact, like:
WebApplicationContext ctx = WebApplicationContextUtils.getWebApplicationContext(getServletContext());
Dao dao = (Dao) ctx.getBean("userDao");
dao.put(something);
One reason I need to do this right now is that I need to develop using App Engine (via Objectify); however, in the future I may need to change some data access objects to Hibernate and some to MongoDB (so it's a mix).
I haven't tested this code, will this strategy work?
Yes, this will work. In fact this is one of the major reasons why DI and coding to an interface was invented. Just make sure that all DAO implementations follow the same contract (DAOs very often introduce leaky abstractions).
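A tiny plain-Java illustration of that contract-first swap (InMemoryUserDao is a hypothetical stand-in for UserObjectifyDaoImpl or UserHibernateDaoImpl): the client code only ever sees Dao<T>, so only the wiring line changes.

```java
import java.util.ArrayList;
import java.util.List;

// The shared contract every persistence implementation must follow.
interface Dao<T> {
    void put(T entity);
    List<T> list();
}

// Stand-in implementation; swapping persistence means swapping only this class.
class InMemoryUserDao implements Dao<String> {
    private final List<String> store = new ArrayList<>();

    public void put(String entity) {
        store.add(entity);
    }

    public List<String> list() {
        return store;
    }
}

public class DaoSwapDemo {
    public static void main(String[] args) {
        Dao<String> dao = new InMemoryUserDao(); // wiring point: only this line changes
        dao.put("alice");
        System.out.println(dao.list());
    }
}
```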
Also you have several other options to achieve the same goal:
Several @Service-annotated classes with one marked as @Primary (if you are using autowiring)
Spring profiles and selective activation of beans
BTW if you are considering switching to a different DAO implementation, have a look at CrudRepository from Spring Data. Spring Data project provides several modules that implement this interface for MongoDB, Neo4J, JPA, etc.
For the time being it seems that several Spring Data modules do not play together nicely (see: DATAJPA-146), so if you choose to implement CrudRepository, make sure this issue is fixed or that you can work around it. Thanks to @iddqd for pointing that out.
You can change the context config to select a Dao implementation if you only need one implementation in the application. But if you need more than one implementation at the same time (mixing mode), you need to design a factory layer: a Factory with its own API and implementations that decides which Dao (Hibernate, MongoDB, JPA, etc.) must be selected at any given time.
I want to obtain a JdbcTemplate in my Java code. I've already got a working java.sql.Connection. To create a new JdbcTemplate it would normally need an instance of the javax.sql.DataSource interface.
Is it somehow possible to obtain a new JdbcTemplate from an existing java.sql.Connection?
Technically, you can, using SingleConnectionDataSource:
new JdbcTemplate(new SingleConnectionDataSource(connection, false))
However, this is not really advisable, except for unit tests, for example.
You'd better use a full-featured DataSource and wire things up with Spring.
No. JdbcTemplate is a Spring class; Connection is part of the JDK. Connection knows nothing about JdbcTemplate.
The way to do it is to add a JdbcTemplate bean in your Spring app context; then inject it into the classes that need it declaratively.