I'm thinking of implementing an Objectify DAO with dependency injection, so that my code keeps accessing the same "Dao" interface while the implementation may change from Objectify to Hibernate/MySQL or MongoDB in the future, without my having to change any code on the UI or client side.
UserDao is based on the example here:
http://turbomanage.wordpress.com/2010/01/28/simplify-with-objectify/
public class UserObjectifyDaoImpl implements Dao<User> {
    private UserDao dao = null;

    public void put(User entity) {
        // lazily create the Objectify-backed DAO on first use
        if (dao == null) {
            dao = new UserDao();
        }
        dao.put(entity);
    }

    // other put and get methods
}
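For concreteness, the generic contract the snippet codes against might look like the minimal sketch below. The interface, its methods, and the in-memory implementation are assumptions for illustration (not the actual turbomanage example); an in-memory implementation like this is also handy as a test double before any real backend exists:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical generic DAO contract; method names are assumptions based on the snippet.
interface Dao<T> {
    void put(T entity);   // create or update
    T get(Long id);       // read by key
}

// Simple in-memory implementation, useful as a stand-in while backends change.
class InMemoryDao<T> implements Dao<T> {
    private final Map<Long, T> store = new HashMap<>();
    private final Function<T, Long> idExtractor;

    InMemoryDao(Function<T, Long> idExtractor) {
        this.idExtractor = idExtractor;
    }

    public void put(T entity) {
        store.put(idExtractor.apply(entity), entity);
    }

    public T get(Long id) {
        return store.get(id);
    }
}
```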
So that I have this context.xml:
<bean id="userDao" class="com.example.server.daoimpl.UserObjectifyDaoImpl">
<property name="dataSource" ref="dataSource"/>
</bean>
And if I need to change the implementation, I just need to change this bean from UserObjectifyDaoImpl to something like:
UserHibernateDaoImpl or UserMongoDBDaoImpl or whatever implementation saving to whatever database.
And still have my code in the UI / Client intact, like:
WebApplicationContext ctx = WebApplicationContextUtils.getWebApplicationContext(getServletContext());
Dao dao = (Dao) ctx.getBean("userDao");
dao.put(something);
One reason I need to do this right now: I need to develop on App Engine (via Objectify), but in the future I may need to change some data access objects to Hibernate and some to MongoDB (so it's a mix).
I haven't tested this code, will this strategy work?
Yes, this will work. In fact this is one of the major reasons why DI and coding to an interface was invented. Just make sure that all DAO implementations follow the same contract (DAOs very often introduce leaky abstractions).
Also you have several other options to achieve the same goal:
Several @Service-annotated classes with one marked as @Primary (if you are using autowiring)
Spring profiles and selective activation of beans
BTW if you are considering switching to a different DAO implementation, have a look at CrudRepository from Spring Data. Spring Data project provides several modules that implement this interface for MongoDB, Neo4J, JPA, etc.
For the time being it seems that several Spring Data modules do not play together nicely (see: DATAJPA-146), so if you choose to implement CrudRepository, make sure this issue is fixed or that you can work around it. Thanks to @iddqd for pointing that out.
You can change the context config to select a Dao implementation if you only need one implementation in the application. But if you need more than one implementation at the same time (mixed mode), you need to design a factory layer: a layer with its own API and implementations that decides which Dao (Hibernate, MongoDB, JPA, etc.) must be selected at any given time.
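A minimal sketch of such a factory layer might look like this. The registry keys and the `Dao` shape are assumptions; in a real application each registered implementation would delegate to Objectify, Hibernate, MongoDB, and so on:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical single-method DAO contract (assumption for this sketch).
interface Dao<T> {
    void put(T entity);
}

// Factory layer that picks a Dao implementation per entity type at runtime,
// allowing a mix of backends in one application.
class DaoFactory {
    // Maps an entity class to the backing implementation chosen for it.
    private final Map<Class<?>, Dao<?>> registry = new HashMap<>();

    <T> void register(Class<T> type, Dao<T> dao) {
        registry.put(type, dao);
    }

    @SuppressWarnings("unchecked")
    <T> Dao<T> daoFor(Class<T> type) {
        Dao<T> dao = (Dao<T>) registry.get(type);
        if (dao == null) {
            throw new IllegalArgumentException("No Dao registered for " + type);
        }
        return dao;
    }
}
```

The factory itself can be a Spring bean, with the registrations done in configuration, so client code still only sees `Dao<T>`.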
Related
Which is the best approach to implement several different databases in one project, using Spring JdbcDaoSupport?
I have several DBs with different datasources and syntax: MySQL & Postgres, for example. In pure Java/JDBC projects I used the Factory Method and Abstract Factory patterns, with multiple DAO impl classes (one for each DB) behind common DAO interfaces, to switch between databases. Now I use Spring JDBC and want to implement similar behavior.
I faced the same matter two years ago and finally chose an implementation based on a Spring custom scope (http://docs.spring.io/spring/docs/current/spring-framework-reference/htmlsingle/#beans-factory-scopes-custom).
The Spring framework allows multiple instances of the same bean definition to coexist; they differ only in some contextual settings.
For instance, this bean definition will create a separate loginAction bean for each currently processed HTTP request:
<bean id="loginAction" class="com.foo.LoginAction" scope="request"/>
If you create a custom scope called "domain", you will be able to instantiate several datasources based on the same bean definition.
A datasource bean definition based on JndiObjectFactoryBean would let the servlet container manage the database connection (through the web.xml file). However, you would have to parameterize your datasource name with a Spring property.
Beans like the database Transaction Manager must also be marked with this scope.
Next you need to activate the scope each time an HTTP request is processed; I suggest defining the datasource name as a prefix of the request URL.
Because most web frameworks allow you to intercept HTTP requests, you can determine the expected datasource before processing the request.
Then create (or reuse) a set of beans specific to the selected datasource and store it inside a ThreadLocal variable (which your custom scope implementation will rely on).
This implementation may look a little complex at first glance, but its usage is transparent.
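The ThreadLocal mechanism described above can be sketched roughly as follows. This is a simplified illustration only: a real Spring custom scope would implement `org.springframework.beans.factory.config.Scope` and receive an `ObjectFactory` from the container instead of the `Supplier` assumed here:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Simplified per-"domain" bean cache held behind a ThreadLocal, illustrating the
// mechanism a custom scope relies on. Names here are assumptions for the sketch.
class DomainScope {
    private static final ThreadLocal<String> currentDomain = new ThreadLocal<>();
    private static final Map<String, Map<String, Object>> beansPerDomain = new HashMap<>();

    // Called by a request filter once the datasource name is extracted from the URL prefix.
    static void activate(String domain) { currentDomain.set(domain); }
    static void deactivate() { currentDomain.remove(); }

    // Returns the bean for the active domain, creating it on first access.
    static synchronized Object get(String beanName, Supplier<Object> factory) {
        String domain = currentDomain.get();
        if (domain == null) {
            throw new IllegalStateException("No domain active on this thread");
        }
        return beansPerDomain
                .computeIfAbsent(domain, d -> new HashMap<>())
                .computeIfAbsent(beanName, n -> factory.get());
    }
}
```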
I'm trying to change some legacy code to use DI with Spring framework. I have a concrete case for which I'm wondering which is the most proper way to implement it.
It is a Java desktop application. There is a DataManager interface used to query/change data in the data store. Currently there is only one implementation, using an XML file as the store, but in the future it is possible to add an SQL implementation. Also, for unit testing I may need to mock it.
Currently every piece of code that needs the data manager retrieves it by using a factory. Here is the source code of the factory:
public class DataManagerFactory
{
    private static DataManagerIfc dataManager;

    public static DataManagerIfc getInstance()
    {
        // Let's assume synchronization is not needed
        if (dataManager == null)
            dataManager = new XMLFileDataManager();
        return dataManager;
    }
}
Now I see 3 ways to change the application to use DI and Spring.
I. Inject the dependency only in the factory and do not change any other code.
Here is the new code:
public class DataManagerFactory
{
    private DataManagerIfc dataManager;

    public DataManagerFactory(DataManagerIfc dataManager)
    {
        this.dataManager = dataManager;
    }

    public DataManagerIfc getDataManager()
    {
        return dataManager;
    }

    public static DataManagerIfc getInstance()
    {
        return getFactoryInstance().getDataManager();
    }

    public static DataManagerFactory getFactoryInstance()
    {
        ApplicationContext context =
            new ClassPathXmlApplicationContext(new String[] {"com/mypackage/SpringConfig.xml"});
        return context.getBean(DataManagerFactory.class);
    }
}
And the XML with the bean description:
<bean id="dataManagerFactory"
class="com.mypackage.DataManagerFactory">
<constructor-arg ref="xmlFileDataManager"/>
</bean>
<bean id="xmlFileDataManager"
class="com.mypackage.datamanagers.xmlfiledatamanager.XMLFileDataManager">
</bean>
II. Change every class that is using the data manager so it takes it through the constructor and store it as a class variable. Make Spring bean definitions only for the "root" classes from where the chain of creation starts.
III. Same as II. but for every class that is using the data manager create a Spring bean definition and instantiate every such class by using the Spring Ioc container.
As I'm new to the DI concept, I will appreciate any advice on what would be the correct and "best practice" solution.
Many thanks in advance.
Use option 3.
The first option keeps your code untestable. You won't be able to easily mock the static factory method so that it returns a mock DataManager.
The second option forces the root classes to know all the dependencies of all the non-root classes in order to make the code testable.
The third option really uses dependency injection: each bean only knows about its direct dependencies and is wired by the DI container.
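As a sketch of what option 3 buys you (the consumer class and method names below are assumptions): each consumer declares its dependency in its constructor, so a test can pass a stub without touching any static state:

```java
// Assumed interface from the question, reduced to one method for the sketch.
interface DataManagerIfc {
    String load(String key);
}

// Hypothetical consumer under option 3: the dependency is declared in the
// constructor, so Spring (or a test) supplies it from outside.
class ReportService {
    private final DataManagerIfc dataManager;

    ReportService(DataManagerIfc dataManager) {  // in Spring: constructor injection
        this.dataManager = dataManager;
    }

    String buildReport(String key) {
        return "Report: " + dataManager.load(key);
    }
}
```

In a unit test you can then pass a stub directly, e.g. `new ReportService(key -> "stub")`, with no Spring context and no static factory in sight.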
Well... why did you write the factory in the first place? Spring is not intended to make you change how you write code (not just to suit Spring, that is), so keeping the factory is fine, since it uses a well-known pattern. Injecting the dependency into the factory retains that behaviour.
Option 3 is the correct route to take. By using such a configuration you can usefully take components of your configuration and use them in new configurations, and everything will work as expected.
As a rule of thumb, I would expect one call to Spring to instantiate the application context and get the top-level bean. I wouldn't expect to make repeated calls to the Spring framework to get multiple beans. Everything should be injected at the correct level to reflect responsibilities etc.
Beware (since you're new to this) of plumbing your data manager into every class available! This is quite a common mistake, and if you haven't abstracted out and centralised responsibilities sufficiently, you'll find you're configuring classes with lots of managers. When you see yourself doing this, it's a good time to step back and look at your abstractions and componentisation.
I am using Spring, Hibernate, and the DAO design pattern for my project. My GenericDaoImpl (abstract) class has a "tenantId" field, which I want to set when the user logs in to the system. My other DaoImpl classes extend GenericDaoImpl, so I need to set the tenantId (defined in GenericDaoImpl) at login time and reset it when the user logs out.
What is the best way to do this?
In my test cases I tried to @Autowired the GenericDaoImpl but couldn't: it throws an exception, org.springframework.beans.factory.NoSuchBeanDefinitionException: No unique bean of type. I understand the problem now
(we can't create instances of abstract classes). If I use tenantId as a static variable, is that going to be a problem?
Can any one suggest me any solution?
Thank you,
Udeshika
If you are developing a multi-tenancy application and would like a tenant-aware application context, have a look at spring-tenancy. This will help you have beans injected that are tenant-aware.
If you want multi-tenancy at the Hibernate layer, you can also look at Hibernate's multi-tenancy feature.
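If you stay with your own GenericDaoImpl, note that a static tenantId field would be shared across all concurrent users, which is exactly the problem you suspect. A common alternative is a ThreadLocal holder set at login (or request start) and cleared at logout; a rough sketch, with names that are assumptions:

```java
// Hypothetical tenant holder: one value per request-handling thread, so
// concurrent users do not overwrite each other's tenantId the way a
// static field would.
class TenantContext {
    private static final ThreadLocal<String> tenantId = new ThreadLocal<>();

    static void set(String id) { tenantId.set(id); }   // call at login / request start
    static String get() { return tenantId.get(); }     // call from GenericDaoImpl
    static void clear() { tenantId.remove(); }         // call at logout / request end
}
```

GenericDaoImpl would then read `TenantContext.get()` instead of holding its own field.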
From what I understand, both DataSource and JdbcTemplate are threadsafe, so you can configure a single instance of a JdbcTemplate and then safely inject this shared reference into multiple DAOs (or repositories). Also, the DataSource should be a Spring singleton, since it manages the connection pool.
The official Spring documentation, JdbcTemplate best practices, explains the alternatives (excerpts from the manual are in italics, and my notes are between square brackets):
configure a DataSource in your Spring configuration file, and then dependency-inject that shared DataSource bean into your DAO classes; the JdbcTemplate is created in the setter for the DataSource. [with XML configuration; this leads to multiple JdbcTemplate instances, since the DataSource setter does new JdbcTemplate(dataSource)]
use component-scanning and annotation support for dependency injection. In this case you annotate the class with @Repository (which makes it a candidate for component-scanning) and annotate the DataSource setter method with @Autowired. [this case also leads to multiple JdbcTemplate instances]
If you are using Spring's JdbcDaoSupport class, and your various JDBC-backed DAO classes extend from it, then your sub-class inherits a setDataSource(..) method from the JdbcDaoSupport class. You can choose whether to inherit from this class. The JdbcDaoSupport class is provided as a convenience only. [since you have an instance of JdbcDaoSupport for each class extending it, there is also one JdbcTemplate instance per instance of the derived class (see the source code of JdbcDaoSupport)]
However, a later note, discourages all the options just presented:
Once configured, a JdbcTemplate instance is threadsafe. You may want multiple JdbcTemplate instances if your application accesses multiple databases, which requires multiple DataSources, and subsequently multiple differently configured JdbcTemplates.
In other words, all the options just presented result in multiple JdbcTemplate instances (one per DAO), yet right after that the docs say this is not necessary when working with a single database.
What I would do is inject the JdbcTemplate directly into the various DAOs needing it. So my question is: is it OK to do so? And do you also think the Spring reference documentation contradicts itself, or is it my misunderstanding?
IMO, there is no problem injecting a JdbcTemplate into your (multiple) DAOs. The template "wires" your DAO to the physical resource (the db connection) when you need to run a query. So if the SessionFactory and the TransactionManager are properly configured, you will not run into concurrency problems: Spring manages the lifecycle of the beans you need for working with your persistence layer. The advantages of using a template are:
The JDBC template manages the physical resources required to interact with the DB automatically, e.g. it creates and releases the database connections.
The Spring JDBC template converts the standard JDBC SQLExceptions into RuntimeExceptions, which allows you to react more flexibly to errors. It also converts the vendor-specific error messages into more understandable error messages.
So it should be split into two situations:
If we don't change JdbcTemplate properties in the DAO, we can define it as below:
<bean id="tlmJDBCTemplate" class="org.springframework.jdbc.core.JdbcTemplate">
    <property name="dataSource" ref="plmTlmDataSource"/>
</bean>
NOTE: Most of the time we don't change the JdbcTemplate properties, because it is not necessary.
If we do change JdbcTemplate properties in the DAO, we should extend JdbcDaoSupport.
State:
• fetchSize: If this variable is set to a non-zero value, it will be used for setting the fetchSize property on statements used for query processing(JDBC Driver default)
• maxRows: If this variable is set to a non-zero value, it will be used for setting the maxRows property on statements used for query processing(JDBC Driver default)
• queryTimeout: If this variable is set to a non-zero value, it will be used for setting the queryTimeout property on statements used for query processing.(JDBC Driver default)
• skipResultsProcessing: If this variable is set to true then all results checking will be bypassed for any callable statement processing. This can be used to avoid a bug in some older Oracle JDBC drivers like 10.1.0.2.(false)
• skipUndeclaredResults: If this variable is set to true then all results from a stored procedure call that don't have a corresponding SqlOutParameter declaration will be bypassed. All other results processing will take place unless skipResultsProcessing is set to true (false)
• resultsMapCaseInsensitive: If this variable is set to true then execution of a CallableStatement will return the results in a Map that uses case insensitive names for the parameters if Commons Collections is available on the classpath.(false)
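For the second situation, if only one DAO needs different settings, another option (instead of extending JdbcDaoSupport) is simply to define a second, differently configured template bean and inject it into just that DAO; bean names here are assumptions:

```xml
<!-- hypothetical second template with a larger fetch size for one reporting DAO -->
<bean id="reportingTemplate" class="org.springframework.jdbc.core.JdbcTemplate">
    <property name="dataSource" ref="dataSource"/>
    <property name="fetchSize" value="500"/>
</bean>
```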
JdbcDaoSupport
public abstract class JdbcDaoSupport extends DaoSupport {

    private JdbcTemplate jdbcTemplate;

    /**
     * Set the JDBC DataSource to be used by this DAO.
     */
    public final void setDataSource(DataSource dataSource) {
        if (this.jdbcTemplate == null || dataSource != this.jdbcTemplate.getDataSource()) {
            this.jdbcTemplate = createJdbcTemplate(dataSource);
            initTemplateConfig();
        }
    }

    // ...
}
Summary: I don't think the practice Spring gives in the guide is the best.
Spring is inherently very subtle about best practices.
JdbcTemplate is thread-safe, notably lock-free (v4.2.4).
Meaning it should not cause performance degradation when shared between concurrent threads*.
Thus, there are no compelling reasons for more than one instance per data source.
Speculative note: this section is indeed confusing.
Probably due to historical (evolutionary) reasons.
Maybe Spring had a per-DAO policy in the past, due to non-thread-safety or a poor understanding of the domain at the time.
Similar to the XML-based configuration "disaster".
Nowadays Spring renounces opinionated views and strives to be flexible instead.
Which, unfortunately, led to bad design choices being acknowledged only covertly.
* measure, don't guess
After so many years, seeing this question again: I think we can create the JdbcTemplate as a singleton first, then inject it into the DAOs, so there is only one instance of the template.
<bean id="template" class="org.springframework.jdbc.core.JdbcTemplate">
<property name="dataSource" ref="dataSource" />
</bean>
Then you can inject the template into a DAO, or have the DAO extend JdbcDaoSupport, which exposes this setter:
public final void setJdbcTemplate(JdbcTemplate jdbcTemplate)
{
    this.jdbcTemplate = jdbcTemplate;
    initTemplateConfig();
}
Say that I have a class Controller with a property strategy of type IStrategy. In Spring, I can create different instances of Controller and inject different strategy implementations by defining beans in the XML configuration file as shown below:
<bean id="strategyAController" class="com.test.Controller">
    <property name="strategy" ref="strategyAImpl"/>
</bean>
<bean id="strategyBController" class="com.test.Controller">
    <property name="strategy" ref="strategyBImpl"/>
</bean>
<bean id="strategyCController" class="com.test.Controller">
    <property name="strategy" ref="strategyCImpl"/>
</bean>
I can then reference these beans using @Autowired and @Qualifier("strategyAController"), etc. What is the equivalent way of doing this in Java EE 6?
Funny you should ask! Gavin King, who designed Java EE 6 CDI, got into a nasty fight with someone on exactly the same problem.
http://www.tsolak.com/?p=59
The Spring code, of course, looks awfully like Java. We can do that in Java: create some variables, set some properties, no biggie. I am curious: in your particular application, what's the drawback of doing it in plain Java? What specific benefit do you get from Spring for these beans?
In CDI you can use qualifiers to identify the different instances, and producer methods to provide those instances:
public class ControllerFactory {

    @Produces
    @StrategyA
    public Controller createControllerA() {
        return new Controller(configA);
    }

    @Produces
    @StrategyB
    public Controller createControllerB() {
        return new Controller(configB);
    }
}
@Inject
@StrategyB
Controller howToAccessIt;
If you do not want to create a new StrategyA/B/C annotation for each strategy, you could use one annotation with a member (for example an enum) that identifies the strategy, so that you can write: @Strategy(StratType.A).
Then you can use the InjectionPoint in your producer method, so that you can create the instance depending on the annotation in a generic way, instead of writing a new producer method for each strategy:
@Produces
// must add an annotation to clarify that this producer produces for all strategies!
public Controller createController(InjectionPoint ip) {
    Annotated annotated = ip.getAnnotated();
    if (annotated.isAnnotationPresent(Strategy.class)) {
        Strategy strategyAnnotation = annotated.getAnnotation(Strategy.class);
        switch (strategyAnnotation.value()) {
            case A: return new Controller(configA);
            // ...
        }
    }
    throw new IllegalStateException("No strategy annotation present");
}
So, I'm not really familiar with EE6 annotations, but I solved a similar issue using pure Spring annotations in a question that I asked and then answered myself. It's not quite the same thing, because it only creates one instance, but with all of the different dependencies injected into a Map. Basically, it will allow you to switch implementations based on a config flag, but not instantiate all of them at once.
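The mechanism that answer describes can be sketched in plain Java. The flag values and the strategy registrations are assumptions for illustration; in the real Spring version, the container would inject the `Map<String, IStrategy>` of all implementations itself:

```java
import java.util.HashMap;
import java.util.Map;

// Assumed strategy interface, reduced to one method for the sketch.
interface IStrategy {
    String execute();
}

// Hypothetical controller holding all implementations in a Map and selecting
// the active one by a config flag, mirroring the "Map of implementations" idea.
class Controller {
    private final Map<String, IStrategy> strategies = new HashMap<>();
    private final String activeFlag;

    Controller(String activeFlag) {
        this.activeFlag = activeFlag;
        // in Spring, these entries would be injected rather than registered by hand
        strategies.put("A", () -> "strategy A");
        strategies.put("B", () -> "strategy B");
    }

    String run() {
        IStrategy strategy = strategies.get(activeFlag);
        if (strategy == null) {
            throw new IllegalArgumentException("Unknown strategy flag: " + activeFlag);
        }
        return strategy.execute();
    }
}
```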