How to implement DataSourceAutoConfiguration in Spring Boot (Java)

I have 5 microservices, each with a different database name. Apart from the URL, every datasource property is common, so I put them in application.properties:
spring.datasource.driver-class-name=org.mariadb.jdbc.Driver
spring.datasource.username=${local.db.username:}
spring.datasource.password=${local.db.password:}
And I have a class CommonDataSourceConfig.java which reads these properties:
@PropertySource({ "classpath:application-test.properties" })
@Component
public class CommonDataSourceConfig {

    @Autowired
    private Environment env;

    @Primary
    @Bean
    public DataSource dataadmindataSource() {
        final DataSource dataSource = new DataSource();
        dataSource.setDriverClassName(Preconditions.checkNotNull(env.getProperty("spring.datasource.driver-class-name")));
        dataSource.setUrl(Preconditions.checkNotNull(env.getProperty("spring.datasource.url")));
        dataSource.setUsername(Preconditions.checkNotNull(env.getProperty("spring.datasource.username")));
        dataSource.setPassword(Preconditions.checkNotNull(env.getProperty("spring.datasource.password")));
        return dataSource;
    }
}
Now I want to use this common datasource in every microservice's DataSourceConfig.java:
@Configuration
@EnableJpaRepositories(basePackages = { "xxx.repositories" },
        entityManagerFactoryRef = "xxEntityManager",
        transactionManagerRef = "xxTransactionManager")
public class xxSourceConfig {

    @Autowired
    private Environment env;

    @Autowired
    private CommonDataSourceConfig common;

    @Value("${xx.datasource.url}")
    private String url;

    /**
     * Configures the entity manager
     *
     * @return
     */
    @Primary
    @Bean
    public LocalContainerEntityManagerFactoryBean dataAdminEntityManager() {
        LocalContainerEntityManagerFactoryBean entityManager = new LocalContainerEntityManagerFactoryBean();
        entityManager.setDataSource(common.dataadmindataSource());
        entityManager.setPackagesToScan(new String[] { "com.boeing.toolbox.dataadmin.domain" });
        HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        entityManager.setJpaVendorAdapter(vendorAdapter);
        HashMap<String, Object> properties = new HashMap<String, Object>();
        properties.put("hibernate.hbm2ddl.auto", env.getProperty("spring.jpa.hibernate.ddl-auto"));
        properties.put("hibernate.dialect", env.getProperty("spring.jpa.database-platform"));
        entityManager.setJpaPropertyMap(properties);
        return entityManager;
    }
}
but now i want to implement by this class https://github.com/spring-projects/spring-boot/blob/master/spring-boot-autoconfigure/src/main/java/org/springframework/boot/autoconfigure/jdbc/DataSourceAutoConfiguration.java
I am new to this concept; kindly help me understand how to apply that auto-configuration to the classes above in my project.

I came across this question because I wanted to remove an existing custom DataSource configuration and just rely on DataSourceAutoConfiguration instead.
The thing is that this auto-configuration applies only if:
- DataSource (or EmbeddedDatabaseType) is on the classpath; and
- you don't have a DataSource bean configured; and
- either
  - you have a spring.datasource.type property configured (for Spring Boot 1.3+), or
  - there is a supported connection pool available (e.g. HikariCP or the Tomcat connection pool), or
  - there is an embedded (in-memory) database driver available (such as H2, HSQLDB or Derby) – probably not what you want.
In your case, the second condition fails, since CommonDataSourceConfig declares a DataSource bean. The auto-configuration thus backs off.
You should therefore remove that bean, and make sure that the third condition is also satisfied, either by setting spring.datasource.type or, probably better, by putting a compatible connection pool on the classpath.
The DataSourceAutoConfiguration should then do its job (based on your properties) and you should be able to inject your DataSource directly with @Autowired.
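For illustration, a minimal sketch of the end state once the custom bean is removed (the component and package names here are placeholders, and HikariCP is assumed to be on the classpath, as it is by default with spring-boot-starter-jdbc on Boot 2.x):

```java
// application.properties (per-service URL, shared credentials):
// spring.datasource.url=jdbc:mariadb://localhost:3306/service_one_db
// spring.datasource.driver-class-name=org.mariadb.jdbc.Driver
// spring.datasource.username=${local.db.username:}
// spring.datasource.password=${local.db.password:}

import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

@Component
public class SomeDataAccessComponent {

    // DataSourceAutoConfiguration builds the pooled DataSource from the
    // spring.datasource.* properties; no @Bean method is needed anywhere.
    @Autowired
    private DataSource dataSource;
}
```

Each microservice then only overrides spring.datasource.url in its own application.properties, and the shared properties stay in the common file.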

Spring 5 Unable to find a Repository

I have a Spring Framework 5.3.7 application, NOT Spring Boot (I'll move to Spring Boot later). I am using Java configuration and it is working well so far. I have a multi-module Maven project, and the first module is "myapp-entity". There is a config directory with the following file:
@Configuration
@PropertySource(value = "file:/opt/myapp/myapp-ws.properties")
@EnableTransactionManagement
public class AppEntityConfiguration {

    @Value("${hibernate.connection.driver.class}")
    private String driverClassName;

    @Value("${hibernate.connection.url}")
    private String connectionUrl;

    @Value("${hibernate.connection.username}")
    private String username;

    @Value("${hibernate.connection.password}")
    private String password;

    @Bean
    public LocalSessionFactoryBean sessionFactory() {
        LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
        sessionFactory.setDataSource(dataSource());
        sessionFactory.setPackagesToScan("com.app.model");
        sessionFactory.setHibernateProperties(hibernateProperties());
        return sessionFactory;
    }

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(driverClassName);
        dataSource.setUrl(connectionUrl);
        dataSource.setUsername(username);
        dataSource.setPassword(password);
        return dataSource;
    }

    @Bean
    public PlatformTransactionManager hibernateTransactionManager() {
        HibernateTransactionManager transactionManager = new HibernateTransactionManager();
        transactionManager.setSessionFactory(sessionFactory().getObject());
        return transactionManager;
    }

    private final Properties hibernateProperties() {
        Properties hibernateProperties = new Properties();
        hibernateProperties.setProperty("hibernate.hbm2ddl.auto", "none");
        hibernateProperties.setProperty("hibernate.dialect", "org.hibernate.dialect.H2Dialect");
        return hibernateProperties;
    }
}
The myapp-entity.jar compiles fine with Java 11. The next Maven module, myapp-dao, has a config directory and a configuration class:
@Configuration
@Import(AppEntityConfiguration.class)
@ComponentScan(basePackages = "com.app.dao")
public class RepositoryContextConfiguration {
}
I have a TableOneEntity defined with an @Entity annotation in the myapp-entity.jar, and that is fine.
I have a TableOneDao and TableOneDaoImpl defined with a very basic list of functions:
public interface TableOneDao
{ ... list of functions }

@Repository("tableOneDao")
public class TableOneDaoImpl implements TableOneDao
{ ... implementation of functions }
And the test for this works perfectly:
@RunWith(SpringJUnit4ClassRunner.class)
@Transactional
@PropertySource(value = "file:/opt/myapp/myapp-ws.properties")
@ContextConfiguration(classes = RepositoryContextConfiguration.class)
public class OrganizationDaoTest extends BaseDaoTests
{
    @Autowired
    private TableOneDao tableOneDao;
}
This is very much the old way I used to do things, and it all worked well. NOW, I want to get rid of this old way of doing things and go with a new way. Maybe I can't do both in the same project; that could be the issue.
I have a second entity (ClothesDryer) and a second repository (ClothesDryerDao):
@Repository("clothesDryerDao")
public interface ClothesDryerDao extends JpaRepository<ClothesDryer, Long>
{
}
This is now a Spring Data JPA repository, and when I run a simple test against this DAO, Spring cannot find it:
@RunWith(SpringJUnit4ClassRunner.class)
@Transactional
@PropertySource(value = "file:/opt/ekotrope/ekotrope-ws.properties")
@ContextConfiguration(classes = RepositoryContextConfiguration.class)
public class ClothesDryerDaoTest extends TestCase
{
    @Autowired
    private ClothesDryerDao clothesDryerDao;
}
And I get the error as follows:
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'com.ekotrope.dao.ClothesDryerDao' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}
at org.springframework.beans.factory.support.DefaultListableBeanFactory.raiseNoMatchingBeanFound(DefaultListableBeanFactory.java:1790)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1346)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1300)
at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.resolveFieldValue(AutowiredAnnotationBeanPostProcessor.java:657)
... 30 more
I thought I just needed to modify the RepositoryContextConfiguration as follows:
@Configuration
@Import(EkotropeEntityConfiguration.class)
@ComponentScan(basePackages = "com.ekotrope.dao")
@EnableJpaRepositories(basePackages = "com.ekotrope.dao")
public class RepositoryContextConfiguration
{
}
But this did not work. As a matter of fact, I think it broke the other working tests. So the question is: can I use both of these approaches (DAO and DaoImpl, and JPA repository)? Or should I use only one? In that case I would go with the JPA repositories. I just wanted to be able to demonstrate both methods to my co-workers who are not familiar with Spring.
So, if I can get both to work at the same time, that would be great, but if not, then I can create one more Maven Module and then I will have a myapp-dao-old.jar and myapp-dao-new.jar.
Thanks for the help!
I think your JPA repositories config is wrong, but your DAO is good.
And I think these two solutions can work together.
Firstly, you don't need to put @Repository on the ClothesDryerDao interface. (I understand you want to use a qualifier, but it will not be necessary: the type is enough for Spring injection as long as you don't have multiple beans of the same type.)
Secondly, I think you need to change your JPA configuration.
In a project I've done something like this:
@Configuration
@EnableJpaRepositories(
        basePackageClasses = ClothesDryerDao.class,
        entityManagerFactoryRef = "configEntityManager")
public class RepositoryContextConfiguration {

    // be careful: this bean can conflict with your sessionFactory
    @Bean
    public LocalContainerEntityManagerFactoryBean configEntityManager(DataSource dataSource) {
        log.info("Start Parametrage Arcole Entity Manager");
        LocalContainerEntityManagerFactoryBean em = new LocalContainerEntityManagerFactoryBean();
        em.setDataSource(dataSource);
        em.setPackagesToScan("com.app.model");
        HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        em.setJpaVendorAdapter(vendorAdapter);
        return em;
    }
}
Thirdly (this one is just my opinion), you should maybe not mix classical DAOs and Spring Data repositories in the same package, for better visibility :)
Even if the usage is the same.
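Putting those pieces together, here is a hedged sketch of how the single RepositoryContextConfiguration could host both styles at once (package names are taken from the question's first modules; the bean method name must match the entityManagerFactoryRef):

```java
@Configuration
@Import(AppEntityConfiguration.class)
// Classic DAOs (TableOneDaoImpl) are picked up by component scanning...
@ComponentScan(basePackages = "com.app.dao")
// ...while Spring Data JPA generates implementations only for interfaces
// extending Repository (ClothesDryerDao); plain interfaces are ignored.
@EnableJpaRepositories(basePackages = "com.app.dao",
        entityManagerFactoryRef = "configEntityManager")
public class RepositoryContextConfiguration {

    @Bean
    public LocalContainerEntityManagerFactoryBean configEntityManager(DataSource dataSource) {
        LocalContainerEntityManagerFactoryBean em = new LocalContainerEntityManagerFactoryBean();
        em.setDataSource(dataSource);
        em.setPackagesToScan("com.app.model");
        em.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
        return em;
    }
}
```

Because repository scanning only instantiates interfaces that extend a Spring Data Repository type, the two mechanisms can coexist in one configuration without stepping on each other.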

Issue with Declarative Transactions and TransactionAwareDataSourceProxy in combination with JOOQ

I have a data source configuration class that looks as follows, with separate DataSource beans for testing and non-testing environments using jOOQ. In my code, I do not use DSLContext.transaction(ctx -> {...}) but rather mark the method as transactional, so that jOOQ defers to Spring's declarative transactions for transactionality. I am using Spring 4.3.7.RELEASE.
I have the following issue:
During testing (JUnit), @Transactional works as expected. A single method is transactional no matter how many times I use the DSLContext's store() method, and a RuntimeException triggers a rollback of the entire transaction.
During actual production runtime, @Transactional is completely ignored. A method is no longer transactional, and TransactionSynchronizationManager.getResourceMap() holds two separate values: one pointing to my connection pool (which is not transaction-aware) and one to the TransactionAwareDataSourceProxy.
In this case, I would have expected only a single resource of type TransactionAwareDataSourceProxy which wraps my DB CP.
After much trial and error using the second set of configuration changes I made (noted below with "AFTER"), @Transactional works correctly as expected even during runtime, though TransactionSynchronizationManager.getResourceMap() holds the following value:
In this case, my DataSourceTransactionManager seems to not even know the TransactionAwareDataSourceProxy (most likely due to my passing it the simple DataSource, and not the proxy object), which seems to completely 'skip' the proxy anyway.
My question is: the initial configuration that I had seemed correct, but did not work. The proposed 'fix' works, but IMO should not work at all (since the transaction manager does not seem to be aware of the TransactionAwareDataSourceProxy).
What is going on here? Is there a cleaner way to fix this issue?
BEFORE (not transactional during runtime)
@Configuration
@EnableTransactionManagement
@RefreshScope
@Slf4j
public class DataSourceConfig {

    @Bean
    @Primary
    public DSLContext dslContext(org.jooq.Configuration configuration) throws SQLException {
        return new DefaultDSLContext(configuration);
    }

    @Bean
    @Primary
    public org.jooq.Configuration defaultConfiguration(DataSourceConnectionProvider dataSourceConnectionProvider) {
        org.jooq.Configuration configuration = new DefaultConfiguration()
                .derive(dataSourceConnectionProvider)
                .derive(SQLDialect.POSTGRES_9_5);
        configuration.set(new DeleteOrUpdateWithoutWhereListener());
        return configuration;
    }

    @Bean
    public DataSourceTransactionManager transactionManager(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

    @Bean
    public DataSourceConnectionProvider dataSourceConnectionProvider(DataSource dataSource) {
        return new DataSourceConnectionProvider(dataSource);
    }

    @Configuration
    @ConditionalOnClass(EmbeddedPostgres.class)
    static class EmbeddedDataSourceConfig {

        @Value("${spring.jdbc.port}")
        private int dbPort;

        @Bean(destroyMethod = "close")
        public EmbeddedPostgres embeddedPostgres() throws Exception {
            EmbeddedPostgres embeddedPostgres = EmbeddedPostgresHelper.startDatabase(dbPort);
            return embeddedPostgres;
        }

        @Bean
        @Primary
        public DataSource dataSource(EmbeddedPostgres embeddedPostgres) throws Exception {
            DataSource dataSource = embeddedPostgres.getPostgresDatabase();
            return new TransactionAwareDataSourceProxy(dataSource);
        }
    }

    @Configuration
    @ConditionalOnMissingClass("com.opentable.db.postgres.embedded.EmbeddedPostgres")
    @RefreshScope
    static class DefaultDataSourceConfig {

        @Value("${spring.jdbc.url}")
        private String url;

        @Value("${spring.jdbc.username}")
        private String username;

        @Value("${spring.jdbc.password}")
        private String password;

        @Value("${spring.jdbc.driverClass}")
        private String driverClass;

        @Value("${spring.jdbc.MaximumPoolSize}")
        private Integer maxPoolSize;

        @Bean
        @Primary
        @RefreshScope
        public DataSource dataSource() {
            log.debug("Connecting to datasource: {}", url);
            HikariConfig hikariConfig = buildPool();
            DataSource dataSource = new HikariDataSource(hikariConfig);
            return new TransactionAwareDataSourceProxy(dataSource);
        }

        private HikariConfig buildPool() {
            HikariConfig config = new HikariConfig();
            config.setJdbcUrl(url);
            config.setUsername(username);
            config.setPassword(password);
            config.setDriverClassName(driverClass);
            config.setConnectionTestQuery("SELECT 1");
            config.setMaximumPoolSize(maxPoolSize);
            return config;
        }
    }
}
AFTER (transactional during runtime, as expected, all non-listed beans identical to above)
@Configuration
@EnableTransactionManagement
@RefreshScope
@Slf4j
public class DataSourceConfig {

    @Bean
    public DataSourceConnectionProvider dataSourceConnectionProvider(TransactionAwareDataSourceProxy dataSourceProxy) {
        return new DataSourceConnectionProvider(dataSourceProxy);
    }

    @Bean
    public TransactionAwareDataSourceProxy transactionAwareDataSourceProxy(DataSource dataSource) {
        return new TransactionAwareDataSourceProxy(dataSource);
    }

    @Configuration
    @ConditionalOnMissingClass("com.opentable.db.postgres.embedded.EmbeddedPostgres")
    @RefreshScope
    static class DefaultDataSourceConfig {

        @Value("${spring.jdbc.url}")
        private String url;

        @Value("${spring.jdbc.username}")
        private String username;

        @Value("${spring.jdbc.password}")
        private String password;

        @Value("${spring.jdbc.driverClass}")
        private String driverClass;

        @Value("${spring.jdbc.MaximumPoolSize}")
        private Integer maxPoolSize;

        @Bean
        @Primary
        @RefreshScope
        public DataSource dataSource() {
            log.debug("Connecting to datasource: {}", url);
            HikariConfig hikariConfig = buildPoolConfig();
            DataSource dataSource = new HikariDataSource(hikariConfig);
            return dataSource; // not returning the proxy here
        }
    }
}
I'll turn my comments into an answer.
The transaction manager should NOT be aware of the proxy. From the documentation:
Note that the transaction manager, for example
DataSourceTransactionManager, still needs to work with the underlying
DataSource, not with this proxy.
The class TransactionAwareDataSourceProxy is a special-purpose class that is not needed in most cases. Anything that interfaces with your data source through the Spring framework infrastructure should NOT have the proxy in its chain of access. The proxy is intended for code that cannot interface with the Spring infrastructure, for example a third-party library that was already set up to work with JDBC and does not accept any of Spring's JDBC templates. This is stated in the same docs as above:
This proxy allows data access code to work with the plain JDBC API and
still participate in Spring-managed transactions, similar to JDBC code
in a J2EE/JTA environment. However, if possible, use Spring's
DataSourceUtils, JdbcTemplate or JDBC operation objects to get
transaction participation even without a proxy for the target
DataSource, avoiding the need to define such a proxy in the first
place.
If you do not have any code that needs to bypass the Spring framework then do not use the TransactionAwareDataSourceProxy at all. If you do have legacy code like this then you will need to do what you already configured in your second setup. You will need to create two beans, one which is the data source, and one which is the proxy. You should then give the data source to all of the Spring managed types and the proxy to the legacy types.
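As a sketch of the wiring this describes (bean names are illustrative, and the HikariCP pool stands in for whatever pool you use): the transaction manager works against the raw pool, while jOOQ, which speaks plain JDBC, obtains its connections through the proxy so that it participates in Spring-managed transactions.

```java
@Configuration
@EnableTransactionManagement
public class DataSourceConfig {

    // The one real connection pool.
    @Bean
    @Primary
    public DataSource dataSource(HikariConfig hikariConfig) {
        return new HikariDataSource(hikariConfig);
    }

    // Spring's transaction manager must see the underlying DataSource,
    // never the proxy.
    @Bean
    public DataSourceTransactionManager transactionManager(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

    // Plain-JDBC consumers (here, jOOQ's connection provider) go through the
    // transaction-aware proxy so they join Spring-managed transactions.
    @Bean
    public TransactionAwareDataSourceProxy transactionAwareDataSourceProxy(DataSource dataSource) {
        return new TransactionAwareDataSourceProxy(dataSource);
    }

    @Bean
    public DataSourceConnectionProvider dataSourceConnectionProvider(TransactionAwareDataSourceProxy proxy) {
        return new DataSourceConnectionProvider(proxy);
    }
}
```

This is exactly the shape of the "AFTER" configuration above: one pool, one proxy, and each consumer given the variant appropriate to it.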

How to inject spring.datasource.* properties when DataSourceAutoConfiguration.class is excluded from autoconfiguration

I have implemented multi-tenancy in a Spring Boot application by following the link below:
https://dzone.com/articles/spring-boot-hibernate-multitenancy-implementation
For this I have excluded DataSourceAutoConfiguration.class from @SpringBootApplication like this:
@SpringBootApplication(
        exclude = {DataSourceAutoConfiguration.class, HibernateJpaAutoConfiguration.class})
@EnableScheduling
@EnableJpaRepositories
@EnableAspectJAutoProxy(proxyTargetClass = true)
@ComponentScan("com.mps")
public class MpsServiceClientApplication {
The problem is: how do I inject properties like spring.datasource.tomcat.* into my custom datasources? To be more precise, how do I set the two properties below on the custom datasource?
spring.datasource.test-while-idle=true
spring.datasource.test-on-borrow=true
This is how I am setting the JPA properties:
final Map<String, Object> hibernateProps = new LinkedHashMap<>();
hibernateProps.putAll(this.jpaProperties.getProperties());
final LocalContainerEntityManagerFactoryBean result = new LocalContainerEntityManagerFactoryBean();
result.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
result.setJpaPropertyMap(hibernateProps);
You have to inject those properties into the @Configuration bean and set them while creating the Tomcat DataSource manually:
import org.apache.tomcat.jdbc.pool.DataSource;

@Value("${spring.datasource.test-while-idle}")
private boolean testWhileIdle;

@Value("${spring.datasource.test-on-borrow}")
private boolean testOnBorrow;

@Bean
public DataSource dataSource() {
    DataSource dataSource = new DataSource();
    dataSource.setTestOnBorrow(testOnBorrow);
    dataSource.setTestWhileIdle(testWhileIdle);
    ...
    return dataSource;
}
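An alternative worth considering, assuming the Tomcat pool is on the classpath: @ConfigurationProperties can bind the whole property subtree onto the pool's setters in one step (relaxed binding maps test-on-borrow to setTestOnBorrow, and so on), so the individual @Value fields disappear:

```java
@Bean
@ConfigurationProperties(prefix = "spring.datasource")
public org.apache.tomcat.jdbc.pool.DataSource dataSource() {
    // url, username, password, test-while-idle, test-on-borrow etc. are all
    // bound from spring.datasource.* onto the matching setters of the pool.
    return new org.apache.tomcat.jdbc.pool.DataSource();
}
```

This keeps the DataSourceAutoConfiguration excluded while still letting Spring Boot's binder populate the custom datasource from the same properties.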

Changeable Schema for a Specific Entity with JPA and Spring

I'm using Spring and JPA repositories, with multiple database schemas for different entities, and configuring each of them with the proper values and independent beans:
@Configuration
@EnableJpaRepositories(basePackageClasses = { ClassFromSchemaXRepository.class },
        entityManagerFactoryRef = "ClassFromSchemaXEntityManagerFactoryA",
        transactionManagerRef = "ClassFromSchemaXTransactionManager")
public class ClassFromSchemaXConfig {

    @Autowired
    private Environment env;

    @Bean
    public DataSource classFromSchemaXDataSource() {
        HikariDataSource hds = new HikariDataSource();
        hds.setJdbcUrl(env.getProperty("dataSource.jdbcUrl.schema.x"));
        hds.setUsername(env.getProperty("dataSource.user"));
        hds.setPassword(env.getProperty("dataSource.password"));
        //...
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean classFromSchemaXEntityManagerFactory() {
        LocalContainerEntityManagerFactoryBean em = new LocalContainerEntityManagerFactoryBean();
        em.setDataSource(classFromSchemaXDataSource());
        em.setPackagesToScan(new String[] { "com.company.core.domain.x" });
        //...
    }

    @Bean
    public PlatformTransactionManager classFromSchemaXTransactionManager() {
        //...
    }
}
This configuration works as expected, and the entire set of entities which are located under the 'com.company.core.domain.x' package is mapped to this sets of beans with the schema connection string which is defined in env.getProperty("dataSource.jdbcUrl.schema.x")
However, I am now trying to configure a specific entity to be used with a changeable schema, and therefore a dynamic DataSource/EntityManagerFactory/TransactionManager.
The business logic should determine which schema should be used at runtime.
What is the best method of doing that?
I've found what I was looking for, which is extending the AbstractRoutingDataSource class and overriding the determineCurrentLookupKey() method.
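To make the routing idea concrete without pulling in a database, here is a small stand-alone sketch that mimics what AbstractRoutingDataSource does internally: the targets live in a map keyed by schema name, and determineCurrentLookupKey() reads a ThreadLocal that the business logic sets before the data access runs (the class and key names here are made up for illustration; the real targets would be DataSource instances, not URL strings):

```java
import java.util.Map;

// Stand-alone illustration of the AbstractRoutingDataSource idea: the
// "data sources" are plain JDBC URL strings keyed by schema name, and the
// lookup key is carried in a ThreadLocal set by business logic.
public class SchemaRoutingDemo {

    static final ThreadLocal<String> CURRENT_SCHEMA = new ThreadLocal<>();

    private final Map<String, String> targets;
    private final String defaultKey;

    SchemaRoutingDemo(Map<String, String> targets, String defaultKey) {
        this.targets = targets;
        this.defaultKey = defaultKey;
    }

    // The equivalent of overriding determineCurrentLookupKey().
    String determineCurrentLookupKey() {
        String key = CURRENT_SCHEMA.get();
        return key != null ? key : defaultKey;
    }

    // The equivalent of resolving the target data source from the key.
    String resolveTarget() {
        return targets.get(determineCurrentLookupKey());
    }

    public static void main(String[] args) {
        SchemaRoutingDemo router = new SchemaRoutingDemo(
                Map.of("x", "jdbc:mariadb://localhost/schema_x",
                       "y", "jdbc:mariadb://localhost/schema_y"),
                "x");
        System.out.println(router.resolveTarget()); // routes to schema_x (default)
        CURRENT_SCHEMA.set("y");                    // business logic selects schema y
        System.out.println(router.resolveTarget()); // routes to schema_y
        CURRENT_SCHEMA.remove();                    // always clear to avoid leaks
    }
}
```

With the real Spring class, the same shape becomes a bean: extend AbstractRoutingDataSource, pass the per-schema DataSources to setTargetDataSources(...), set a default with setDefaultTargetDataSource(...), and return the ThreadLocal value from determineCurrentLookupKey().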

How to configure database configuration connection pooling with custom prefix in Spring Boot?

Consider Spring Boot with spring-boot-starter-jdbc, and suppose you would like to have one or more data sources whose property names use a custom prefix. From what I see in org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration, it looks to me that the auto-configuration can only be used with the default prefix spring.datasource; as soon as you modify it, you are on your own for setting up the pooling properties.
Can someone please shed light on how to configure the Tomcat JDBC pool DataSource more elegantly (read: Spring-idiomatically)?
Current work around:
@Configuration
public class DatabaseConfiguration {

    @Value("${datasource.api.tomcat.maxWait:5000}")
    private int maxWaitMillis;

    @Value("${datasource.api.tomcat.test-on-borrow:true}")
    private boolean testOnBorrow;

    @Value("${datasource.api.tomcat.validation-query:SELECT 1}")
    private String validationQuery;

    @Bean(name = "apiDataSource")
    @ConfigurationProperties(prefix = "datasource.api")
    public DataSource apiDataSource() {
        DataSource ds = DataSourceBuilder.create().build();
        // Assume we make use of Apache Tomcat connection pooling (default in Spring Boot)
        org.apache.tomcat.jdbc.pool.DataSource tds = (org.apache.tomcat.jdbc.pool.DataSource) ds;
        tds.setTestOnBorrow(testOnBorrow);
        tds.setValidationQuery(validationQuery);
        tds.setMaxWait(maxWaitMillis);
        return ds;
    }
}
Actually, it turned out to be quite straightforward thanks to the binding feature of Spring Boot's @ConfigurationProperties annotation: you can directly populate the JDBC connection pool properties in the following fashion and thereby avoid the cumbersome initialisation of each property on its own:
@Bean
@ConfigurationProperties(prefix = "datasource.api")
public PoolProperties apiPoolProperties() {
    return new org.apache.tomcat.jdbc.pool.PoolProperties();
}

@Bean(name = "apiDataSource")
public DataSource apiDataSource(@Qualifier("apiPoolProperties") PoolProperties poolProperties) {
    DataSource ds = new org.apache.tomcat.jdbc.pool.DataSource(poolProperties);
    logger.info("Initialized API Datasource: {}", ds);
    return ds;
}
