Below is how I am using JDBC connectivity in my Spring MVC application.
I have a couple of technical doubts, which follow:
1.
I have injected the dataSource object into every bean that requires DB connectivity. Is this the right way of doing it?
What if I don't want a particular repository object to be instantiated when the application starts up
(because I'm not sure when the user will invoke it, so why instantiate it at the very beginning)? A note on this follows the configuration below.
<bean
class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="location">
<value>classpath:jdbc.properties</value>
</property>
</bean>
<bean id="dataSource" class="com.mchange.v2.c3p0.ComboPooledDataSource"
destroy-method="close">
<property name="driverClass" value="${jdbc.driverClassName}" />
<property name="jdbcUrl" value="${jdbc.url}" />
<property name="user" value="${jdbc.username}" />
<property name="password" value="${jdbc.password}" />
<property name="maxPoolSize" value="${jdbc.maxPoolSize}" />
<property name="minPoolSize" value="${jdbc.minPoolSize}" />
<property name="maxStatements" value="${jdbc.maxStatements}" />
<property name="testConnectionOnCheckout" value="${jdbc.testConnection}" />
</bean>
<bean id="ustestuthenticationRepository"
class="com.test.repository.impl.UstestuthenticationRepositoryImpl">
<property name="dataSource" ref="dataSource" />
</bean>
<bean id="someclass"
class="com.test.repository.impl.someclass">
<property name="dataSource" ref="dataSource" />
</bean>
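On the second part of this doubt (not wanting a repository created at startup): Spring can defer bean creation until the bean is first requested; in XML this is the lazy-init="true" attribute on the bean definition. A rough Java-config sketch of the same idea, reusing the repository class from above, might look like this:
import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Lazy;

import com.test.repository.impl.UstestuthenticationRepositoryImpl;

@Configuration
public class RepositoryConfig {

    // @Lazy defers instantiation until the bean is first requested,
    // the same effect as lazy-init="true" on the XML <bean> element.
    @Bean
    @Lazy
    public UstestuthenticationRepositoryImpl ustestuthenticationRepository(DataSource dataSource) {
        UstestuthenticationRepositoryImpl repository = new UstestuthenticationRepositoryImpl();
        repository.setDataSource(dataSource);
        return repository;
    }
}
With this, the repository is only created the first time something asks the container for it.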
2.
In every repository class, I am injecting the dataSource in the following way:
@Qualifier("dbDataSource")
private static DataSource dataSource;
public void setDataSource(DataSource dataSource) {
this.dataSource = dataSource;
}
and then creating
JdbcTemplate jdbcTemplate = new JdbcTemplate(dataSource);
I'm not sure if this is the right way of injecting the dataSource. If each and every repository object creates a separate JdbcTemplate, is that appropriate?
Modified Code
<bean id="jdbcTemplate" class="org.springframework.jdbc.core.JdbcTemplate">
<constructor-arg ref="dataSource" />
</bean>
<bean id="someclass" class="com.era.repository.impl.someclass">
<property name="jdbcTemplate" ref="jdbcTemplate" />
</bean>
and the implementation in someclass is:
private JdbcTemplate jdbcTemplate;
public void setJdbcTemplate(JdbcTemplate jdbcTemplate) {
this.jdbcTemplate = jdbcTemplate;
}
and then just accessing the jdbcTemplate field wherever it is required.
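For instance, a method in someclass might use the injected template like this (the users table is a made-up example):
// Uses the shared, injected JdbcTemplate; no per-repository template is created.
public int countUsers() {
    return jdbcTemplate.queryForObject("SELECT COUNT(*) FROM users", Integer.class);
}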
Am I doing it correctly now? Please advise.
What you have done is not an error; it is possible to work that way.
However, it means you will be working with JDBC directly, which means you need to handle transactions manually.
That is hard, and it is a lot of code.
You would be better off using Spring Data with Hibernate, or Spring Data with JPA.
Either way, Spring Data will manage the resources for you and simplify data access.
If you are already on Spring MVC, take Spring Data as well. Why do it by hand?
Official Spring Data example:
https://spring.io/guides/gs/accessing-data-jpa/
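To give a feel for what this answer is recommending: a Spring Data JPA repository is usually just an interface. The User entity and the findByEmail method below are made-up examples, not code from the question:
import org.springframework.data.repository.CrudRepository;

// Spring Data generates the implementation at runtime: no JdbcTemplate,
// no DAO boilerplate and no manual resource handling in application code.
public interface UserRepository extends CrudRepository<User, Long> {

    // Derived query: translated automatically into "... where email = ?".
    User findByEmail(String email);
}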
I'm using Spring 4 to set up my stuff that I'll need for using Hibernate 4. I have a SessionFactory autowired into my DAO layer. When I call sessionFactory.getCurrentSession() I get the error:
Exception in thread "MyImporterThread" org.hibernate.service.UnknownServiceException: Unknown service requested [org.hibernate.engine.jdbc.connections.spi.ConnectionProvider]
I've looked at many search results from Google (including a bunch from StackOverflow) on this exception, however none of them has struck me as the solution to my issue.
Here's my configuration:
spring-beans.xml:
<context:property-placeholder location="file:spring.properties" />
<context:component-scan base-package="com.company.scraping" />
<!-- Data Source -->
<bean id="dataSource" class="com.mchange.v2.c3p0.ComboPooledDataSource">
<property name="driverClass" value="${jdbc.driver.class}" />
<property name="jdbcUrl" value="${jdbc.url}" />
<property name="user" value="${jdbc.user}" />
<property name="password" value="${jdbc.password}" />
</bean>
<!-- Session Factory -->
<bean id="sessionFactory" class="org.springframework.orm.hibernate4.LocalSessionFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="packagesToScan" value="com.company.scraping" />
<property name="configLocation">
<value>file:scraping.db.hibernate.cfg.xml</value>
</property>
</bean>
<!-- Transaction Stuff -->
<tx:annotation-driven transaction-manager="transactionManager" />
<bean id="transactionManager" class="org.springframework.orm.hibernate4.HibernateTransactionManager">
<property name="sessionFactory" ref="sessionFactory" />
</bean>
<!-- beans that are transactional or autowired -->
<bean id="scrapingDao" class="com.company.scraping.dao.ReportsScrapingDaoImpl" />
<bean id="scrapingService" class="com.company.scraping.service.ReportsScrapingServiceImpl" />
spring.properties:
jdbc.driver.class=oracle.jdbc.OracleDriver
jdbc.url=jdbc:oracle:thin:@server:1521:testdb01
jdbc.user=user
jdbc.password=password
scraping.db.hibernate.cfg.xml:
<hibernate-configuration>
<session-factory>
<property name="connection.url">jdbc:oracle:thin:#server:1521:testdb01</property>
<property name="connection.username">user</property>
<property name="connection.password">password</property>
<property name="connection.driver_class">oracle.jdbc.OracleDriver</property>
<property name="hibernate.connection.autocommit">true</property>
<property name="hibernate.generate_statistics">false</property>
<property name="dialect">org.hibernate.dialect.Oracle10gDialect</property>
<property name="hibernate.show_sql">false</property>
</session-factory>
</hibernate-configuration>
Since this is not a web application, I use ApplicationContext context = new FileSystemXmlApplicationContext(args[0]) to initialize Spring.
My service class is autowired (not shown in the config because I have to type all this out), and contains an autowired instance of the DAO. This is what the DAO looks like:
@org.springframework.stereotype.Repository
@org.springframework.transaction.annotation.Transactional
public class ReportsScrapingDaoImpl implements ReportsScrapingDao
{
@Autowired
@Qualifier("sessionFactory")
private SessionFactory sessionFactory;
@Override
@Transactional(readOnly = true)
public List<Stuff> getAll()
{
Criteria criteria = sessionFactory.getCurrentSession().createCriteria(Stuff.class);
... (more code)
}
}
The code bombs out when sessionFactory.getCurrentSession() is called. I've tried using sessionFactory.openSession() as well, but it gave the same results. I'm not sure what's going on here.
I did a last-ditch search and this turned up: Eclipse had emitted a compiler warning complaining that there might be a resource leak because the ApplicationContext was never closed, so I had added a line at the very end of my main method (after all the threads had been started) to close the ApplicationContext. Once I got rid of that line, the problem went away.
The moral of the story is that your ApplicationContext should not be closed until you no longer need anything from it, which in a long-running program like this may mean never closing it at all.
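A minimal sketch of the ordering that avoids the exception, assuming the worker threads are started from main() (MyImporterRunnable is a placeholder for whatever code uses the autowired DAO):
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.support.FileSystemXmlApplicationContext;

public class ImporterMain {

    public static void main(String[] args) throws InterruptedException {
        ConfigurableApplicationContext context =
                new FileSystemXmlApplicationContext(args[0]);

        // MyImporterRunnable stands in for the code that uses the autowired service/DAO.
        Thread importer = new Thread(new MyImporterRunnable(context), "MyImporterThread");
        importer.start();

        // Wait for every thread that still needs beans from the context...
        importer.join();

        // ...and only close it afterwards; closing earlier shuts down the
        // SessionFactory's ConnectionProvider, which is the exception seen above.
        context.close();
    }
}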
We have a web application built on Spring MVC 3.2 that uses JPA for ORM. The problem is that the EntityManager is creating a lot of open connections to the database. We want to handle this so that a connection is established for each query and closed once the query completes.
As per the Spring implementation, the EntityManager is created once. The problem is that we want some way to manage the connections that the EntityManager opens when querying the database.
Whenever a query completes in the database, its connection goes into the sleep state; instead, we want it closed once the query returns its result.
DB type: MySQL
My configuration for JPA is :
<tx:annotation-driven transaction-manager="transactionManager" />
<bean id="transactionManager" class="org.springframework.orm.jpa.JpaTransactionManager">
<property name="entityManagerFactory" ref="entityManagerFactory" />
</bean>
<bean class="org.springframework.orm.jpa.support.PersistenceAnnotationBeanPostProcessor" />
<bean id="entityManagerFactory"
class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
<property name="packagesToScan" value="com.reppify" />
<property name="jpaPropertyMap" ref="jpaPropertyMap" />
<property name="dataSource" ref="dataSourceLocal" />
<property name="persistenceUnitName" value="cron-jpa" />
<property name="jpaVendorAdapter">
<bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter" />
</property>
</bean>
<bean id="dataSourceLocal"
class="org.springframework.jdbc.datasource.DriverManagerDataSource">
<property name="driverClassName" value="${database.driver}" />
<property name="url" value="${database.url}" />
<property name="username" value="${database.user}" />
<property name="password" value="${database.password}" />
</bean>
We are using the hibernate-jpa-api-2.0 jar as a project dependency.
My Java base DAO implementation, which injects the EntityManager, looks like:
protected EntityManager entityManager;
#PersistenceContext
public void setEntityManager(EntityManager entityManager) {
this.entityManager = entityManager;
}
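For context, a typical read through this injected EntityManager looks like the sketch below (the Report entity and findAllReports method are made-up examples):
// Runs a JPQL query via the injected, container-managed EntityManager;
// the underlying JDBC connection comes from (and returns to) the DataSource.
public List<Report> findAllReports() {
    return entityManager
            .createQuery("select r from Report r", Report.class)
            .getResultList();
}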
Please suggest an optimal solution for this.
DriverManagerDataSource is not a connection pool; it creates a new connection on every call. It is useful for testing, but you shouldn't use it in production. Choose a connection pool instead; there are many to pick from (a minimal example follows the list):
HikariCP
Apache Commons DBCP
c3p0
...
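As a rough sketch of what swapping in a pool looks like (the connection details are placeholders; the XML equivalent is a bean of class com.zaxxer.hikari.HikariDataSource with the corresponding properties):
import javax.sql.DataSource;
import com.zaxxer.hikari.HikariDataSource;

public class DataSourceConfig {

    // Replaces the DriverManagerDataSource definition: connections are pooled
    // and reused instead of being opened for every query.
    public DataSource dataSource() {
        HikariDataSource ds = new HikariDataSource();
        ds.setDriverClassName("com.mysql.jdbc.Driver");
        ds.setJdbcUrl("jdbc:mysql://localhost:3306/mydb");
        ds.setUsername("user");
        ds.setPassword("password");
        ds.setMaximumPoolSize(10);
        return ds;
    }
}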
I have a problem using mappers with mybatis-spring (in a Spring Batch job).
I need to use a SqlSessionTemplate with ExecutorType BATCH for performance reasons (my program must execute thousands of insert statements into a table).
However, I also need to log errors and update states in another table of the database, and if something goes wrong in the execution of the current step everything is rolled back, including the logs, which is not acceptable behaviour.
I thought I could simply configure two different SqlSessionTemplates with different ExecutorTypes, but if my step uses two mappers with different templates I get an exception saying that I can't change the ExecutorType during a transaction, and I don't know how to get around it.
Any help is appreciated. Here is some of the XML configuration.
<!-- connect to database -->
<bean id="dataSource" class="org.springframework.jdbc.datasource.LazyConnectionDataSourceProxy">
<property name="targetDataSource">
<ref local="mainDataSource" />
</property>
</bean>
<bean id="mainDataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close" >
<property name="driverClassName" value="${db.driver}" />
<property name="url" value="${db.url}" />
<property name="username" value="${db.user}" />
<property name="password" value="${db.pass}" />
</bean>
<bean id="infrastructureSqlSessionFactory" class="org.mybatis.spring.SqlSessionFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="mapperLocations"
value="classpath*:com/generali/danni/sipo/mdv/dao/mybatis/*Mapper*.xml" />
<property name="configLocation" value="classpath:mybatis-config.xml" />
</bean>
<bean id="infrastructureSqlSessionTemplateBatch" class="org.mybatis.spring.SqlSessionTemplate">
<constructor-arg index="0" ref="infrastructureSqlSessionFactory" />
<constructor-arg index="1" value="BATCH" />
</bean>
<bean id="infrastructureSqlSessionTemplate" class="org.mybatis.spring.SqlSessionTemplate">
<constructor-arg index="0" ref="infrastructureSqlSessionFactory" />
</bean>
<bean id="infrastructureAbstractMapper" class="org.mybatis.spring.mapper.MapperFactoryBean"
abstract="true">
<property name="sqlSessionTemplate" ref="infrastructureSqlSessionTemplate" />
</bean>
<bean id="infrastructureAbstractMapperBatch" class="org.mybatis.spring.mapper.MapperFactoryBean"
abstract="true">
<property name="sqlSessionTemplate" ref="infrastructureSqlSessionTemplateBatch" />
</bean>
<bean id="erroriMapper" parent="infrastructureAbstractMapper">
<property name="mapperInterface"
value="com.mdv.dao.ErroriMapper" />
</bean>
<bean id="stagingFileMapper" parent="infrastructureAbstractMapperBatch">
<property name="mapperInterface"
value="com.mdv.dao.StagingFileMapper" />
</bean>
Here I have two mappers: one I'd like to use in BATCH mode, the other in SIMPLE mode.
How can I accomplish this? Every suggestion is appreciated.
Thanks in advance, and sorry for my bad English.
After a lot of tries, I decided to change my approach to solve this problem.
I programmatically defined a new SqlSessionFactory, generated a new SqlSession with the BATCH executor, and used that one.
Since it is an entirely different SqlSessionFactory, it does not seem to cause problems when I use two different ExecutorTypes.
Here a sample working code:
Environment environment = new Environment("TEST", new JdbcTransactionFactory(), dataSource);
Configuration configuration = new Configuration(environment);
configuration.addMappers("com.mdv.dao");
SqlSessionFactory ssf = new SqlSessionFactoryBuilder().build(configuration);
SqlSession sqlSession = ssf.openSession(ExecutorType.BATCH);
try {
StagingFileMapper sfm = sqlSession.getMapper(StagingFileMapper.class);
for(Record r : staging){
StagingFile sf = new StagingFile();
//set your sf fields
sfm.insert(sf);
}
sqlSession.commit();
} catch (Exception e) {
//manage exception
}
finally{
sqlSession.close();
}
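As a small follow-up, on MyBatis versions where SqlSession implements AutoCloseable, the same openSession/commit/close pattern can be written with try-with-resources (the behaviour is unchanged):
try (SqlSession sqlSession = ssf.openSession(ExecutorType.BATCH)) {
    StagingFileMapper sfm = sqlSession.getMapper(StagingFileMapper.class);
    // ... same insert loop as above ...
    sqlSession.commit();
}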
I'm trying to learn the Spring Framework and bean configuration, and so far it seems really cool.
I'm about to create a generic class to hold all my MySQL functions, and it needs to contain the DataSource. My question is: is it possible to set the dataSource already in the bean configuration?
If not, then I'll need to make the class a singleton, create an init function, and do the following in it:
ApplicationContext context = new ClassPathXmlApplicationContext("beans.xml");
DataSource ds = (DataSource) context.getBean("dataSource");
The question is, instead of doing that, can I 'inject' (not sure if that's the right term) it directly in the bean?
This is my bean configuration:
<bean id="dataSource" class="org.apache.tomcat.jdbc.pool.DataSource">
<property name="driverClassName" value="com.mysql.jdbc.Driver" />
<property name="url" value="jdbc:mysql://localhost:3306/foo"/>
<property name="username" value="root"></property>
<property name="password" value="password"></property>
<property name="validationQuery" value="SELECT 1" />
<property name="testOnBorrow" value="true" />
<property name="testWhileIdle" value="true" />
<property name="initialSize" value="5" />
</bean>
<bean id="bar" class="foo.bar">
<property name="dataSource" value="<HERE_SETTING_THE_DATA_SOURCE_ABOVE>" />
</bean>
Is this possible ?
You can reference a bean like your dataSource.
Your class should have a member that can hold the dataSource:
package mypackage;

import javax.sql.DataSource;

public class MyBean {

    private DataSource dataSource;

    public void setDataSource(DataSource dataSource) {
        this.dataSource = dataSource;
    }
}
Then you can inject the dataSource bean into this bean:
<beans>
<bean id="dataSource" class="org.apache.tomcat.jdbc.pool.DataSource">
<!-- set properties -->
</bean>
<bean id="myBean" class="mypackage.MyBean">
<property name="dataSource" ref="dataSource"/>
</bean>
</beans>
That's it.
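Once the container has injected it, the bean can use the DataSource like any other field; a small sketch (the foo table and the countFoo method are made up for illustration):
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

// Inside MyBean: borrows a connection from the injected DataSource,
// runs a query, and returns the connection when the try block ends.
public int countFoo() throws SQLException {
    try (Connection con = dataSource.getConnection();
         Statement st = con.createStatement();
         ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM foo")) {
        rs.next();
        return rs.getInt(1);
    }
}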
I'm using WebLogic 10.3.3 with Oracle 11g and am facing a weird problem with Spring Batch as soon as I switch from Spring's ResourcelessTransactionManager (which is mainly for testing) to the production DataSourceTransactionManager. First I used WebLogic's default driver, oracle.jdbc.xa.client.OracleXADataSource, but it fails because Spring can't set the isolation level; this is also documented here.
I'm fine with that since I don't need global transactions anyway, so I switched to oracle.jdbc.driver.OracleDriver. Now I'm getting the error message
ORA-01453: SET TRANSACTION must be first statement of transaction
I can't find a lot of information on this; there was a bug, but that should have been fixed back in Oracle 7. It looks like a transaction is started before (?) the actual job gets added to the JobRepository and is not closed properly, or something like that.
I was able to solve this by setting the isolation level for all transactions to READ_COMMITTED. By default, Spring sets it to SERIALIZABLE, which is very strict (but perfectly fine); that didn't work on my machine, although Oracle should support it:
http://www.oracle.com/technetwork/issue-archive/2005/05-nov/o65asktom-082389.html
Here's my code - first for the configuration:
<bean id="jobRepository" class="org.springframework.batch.core.repository.support.JobRepositoryFactoryBean">
<property name="transactionManager" ref="transactionManager" />
<property name="dataSource" ref="dataSource" />
<property name="isolationLevelForCreate" value="ISOLATION_READ_COMMITTED" />
</bean>
...and this is for the job itself (simplified):
public class MyFancyBatchJob {
@Transactional(isolation=Isolation.READ_COMMITTED)
public void addJob() {
JobParameters params = new JobParametersBuilder().toJobParameters();
Job job = jobRegistry.getJob("myFancyJob");
JobExecution execution = jobLauncher.run(job, params);
}
}
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource" >
<property name="driverClassName" value="oracle.jdbc.driver.OracleDriver"></property>
<property name="url" value="jdbc:oracle:thin:<username>/<password>#<host>:1521:<sid>" />
</bean>
<jdbc:initialize-database data-source="dataSource">
<jdbc:script location="org/springframework/batch/core/schema-drop-oracle10g.sql" />
<jdbc:script location="org/springframework/batch/core/schema-oracle10g.sql" />
</jdbc:initialize-database>
<bean id="jobRepository"
class="org.springframework.batch.core.repository.support.JobRepositoryFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="transactionManager" ref="transactionManager" />
<property name="databaseType" value="oracle" />
<property name="tablePrefix" value="BATCH_"/>
<property name="isolationLevelForCreate" value="ISOLATION_DEFAULT"/>
</bean>
<bean id="jobLauncher"
class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
<property name="jobRepository" ref="jobRepository" />
</bean>
The configuration above is for Spring Batch with Oracle 10g and 11g.