Loading Properties from Database using Java Config with Spring Boot

I created a FactoryBean<Properties> as follows:
public final class SystemPropertiesFactoryBean implements FactoryBean<Properties> {
private static final String QUERY = "select * from tb_system_properties";
private final NamedParameterJdbcTemplate jdbcTemplate;
public SystemPropertiesFactoryBean(DataSource dataSource) {
this.jdbcTemplate = new NamedParameterJdbcTemplate(dataSource);
}
@Override
public Properties getObject() {
Properties result = new Properties();
jdbcTemplate.query(QUERY,
(ResultSet rs) -> result.setProperty(rs.getString(1), rs.getString(2)));
return result;
}
@Override
public Class<?> getObjectType() {
return Properties.class;
}
@Override
public boolean isSingleton() {
return true;
}
}
This class worked fine using Spring with XML config: I obtained the DataSource via a JNDI name, created the Properties from it, and then used a PropertyPlaceholderConfigurer declared via its XML tag.
Now I want to do the same thing in Spring Boot with Java config.
When I define a PropertySourcesPlaceholderConfigurer as a bean (in a static method in a @Configuration class), Spring tries to create this bean before the DataSource.
Is there any way to create the DataSource before the PropertySourcesPlaceholderConfigurer?

Basically you need to take the dependency as a @Bean method parameter, this way:
@Configuration
public static class AppConfig {
@Bean
public SystemPropertiesFactoryBean systemProperties(DataSource dataSource) {
return new SystemPropertiesFactoryBean(dataSource);
}
@Bean
public DataSource dataSource() {
// return new data source
}
}
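If you also need the placeholder configurer itself, here is a minimal sketch (an assumption, not part of the original answer) that declares it as a static @Bean and injects the database-backed Properties as a parameter, so Spring has to build the DataSource and the factory bean first; whether this plays nicely with an auto-configured DataSource depends on your setup:
@Bean
public static PropertySourcesPlaceholderConfigurer placeholderConfigurer(
        @Qualifier("systemProperties") Properties dbProperties) {
    PropertySourcesPlaceholderConfigurer configurer = new PropertySourcesPlaceholderConfigurer();
    // use the Properties loaded from tb_system_properties as the placeholder source
    configurer.setProperties(dbProperties);
    return configurer;
}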

Related

Use @ConfigurationProperties over non-managed @Bean

I would like to benefit from @ConfigurationProperties' fantastic facilities without needing to expose the bean in my context. It is not a problem of @Primary and the like; I simply cannot expose another DataSource in the context. How can I achieve the following?
#ConfigurationProperties("com.non.exposed.datasource.hikari")
public DataSource privateHikariDatasource() {
if (Objects.isNull(this.nonExposedDatasource)) {
this.nonExposedDatasource = this.nonExposedDatasourceProperties.initializeDataSourceBuilder().build();
}
return this.nonExposedDatasource;
}
Thanks to the answer by @LppEdd, the final perfect solution is:
@Autowired
private Environment environment;
public DataSource privateHikariDatasource() {
if (Objects.isNull(this.nonExposedDatasource)) {
this.nonExposedDatasource = bindHikariProperties(this.nonExposedDatasourceProperties.initializeDataSourceBuilder().build());
}
return this.nonExposedDatasource;
}
//This does exactly the same as @ConfigurationProperties("com.non.exposed.hikari") but without requiring the exposure of the Datasource in the ctx as @Bean
private <T extends DataSource> T bindHikariProperties(final T instance) {
return Binder.get(this.environment).bind("com.non.exposed.datasource.hikari", Bindable.ofInstance(instance)).get();
}
Then you can call this.privateHikariDatasource() internally wherever your other beans need the DataSource.
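For example, a hypothetical consumer (not shown in the original post, and assuming the containing class declares @Bean methods) could look like this:
@Bean
public JdbcTemplate reportingJdbcTemplate() {
    // uses the privately built DataSource without ever exposing it as a bean
    return new JdbcTemplate(privateHikariDatasource());
}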
Great thanks to @LppEdd!
Given that this DataSource is private to a class, and that the containing class can be (or is) inside the Spring context, you can have a @ConfigurationProperties class
@ConfigurationProperties("com.foo.bar.datasource.hikari")
public class HikariConfiguration { ... }
Which, by registering it via @EnableConfigurationProperties, is available for autowiring
@EnableConfigurationProperties(HikariConfiguration.class)
@SpringBootApplication
public class Application { ... }
And thus can be autowired in the containing class
@Component
class MyClass {
private final HikariConfiguration hikariConfiguration;
private DataSource springDatasource;
MyClass(final HikariConfiguration hikariConfiguration) {
this.hikariConfiguration = hikariConfiguration;
}
...
private DataSource privateSingletonDataSource() {
if (Objects.isNull(this.springDatasource)) {
this.springDatasource = buildDataSource(this.hikariConfiguration);
}
return this.springDatasource;
}
}
buildDataSource will manually construct the DataSource instance.
Remember that you need to take care of synchronization when building the DataSource.
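As a rough sketch (an assumption, since the original answer leaves buildDataSource unspecified), it could simply map the HikariConfiguration fields onto a HikariConfig; the getter names here are assumed:
private DataSource buildDataSource(final HikariConfiguration configuration) {
    final HikariConfig config = new HikariConfig();
    // copy the bound properties onto Hikari's own configuration object
    config.setJdbcUrl(configuration.getUrl());
    config.setUsername(configuration.getUsername());
    config.setPassword(configuration.getPassword());
    return new HikariDataSource(config);
}
For the synchronization concern, one simple option is to declare privateSingletonDataSource() as synchronized so concurrent callers cannot build the DataSource twice.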
Final response is that you cannot re-use DataSourceProperties. You can't even extend it to change the properties' prefix. Only a single instance of it can exist inside the context.
The best you can do is mimic what Spring does.
Having
com.non.exposed.datasource.hikari.url=testUrl
com.non.exposed.datasource.hikari.username=testUsername
com.non.exposed.datasource.hikari.password=testPassword
...
You can define a new @ConfigurationProperties class
@ConfigurationProperties("com.non.exposed.datasource")
public class NonExposedProperties {
private final Map<String, String> hikari = new HashMap<>(8);
public Map<String, String> getHikari() {
return hikari;
}
}
Then, autowire this properties class in your @Configuration/@Component class.
Follow in-code comments.
@Configuration
public class CustomConfiguration {
private final NonExposedProperties nonExposedProperties;
private DataSource dataSource;
CustomConfiguration(final NonExposedProperties nonExposedProperties) {
this.nonExposedProperties = nonExposedProperties;
}
public DataSource dataSource() {
if (Objects.isNull(dataSource)) {
// Create a standalone instance of DataSourceProperties
final DataSourceProperties dataSourceProperties = new DataSourceProperties();
// Use the NonExposedProperties "hikari" Map as properties' source. It will be
// {
// url -> testUrl
// username -> testUsername
// password -> testPassword
// ... other properties
// }
final ConfigurationPropertySource source = new MapConfigurationPropertySource(nonExposedProperties.getHikari());
// Bind those properties to the DataSourceProperties instance
final BindResult<DataSourceProperties> bound =
new Binder(source).bind(
ConfigurationPropertyName.EMPTY,
Bindable.ofInstance(dataSourceProperties)
);
// Retrieve the bound instance (it's not a new one, it's the same as before)
dataSource = bound.get().initializeDataSourceBuilder().build();
}
// Return the constructed HikariDataSource
return dataSource;
}
}
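As a usage sketch (an assumption, not shown in the original answer), any other bean declared inside CustomConfiguration can consume the non-exposed DataSource by calling dataSource() directly; MyRepository here is a hypothetical consumer:
@Bean
public MyRepository myRepository() {
    // a plain Java method call, so no DataSource bean is ever exposed to the context
    return new MyRepository(dataSource());
}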

Spring Data Cassandra config change on the fly

I am trying to do a similar thing with my application. I am using the following versions of Spring Boot and Spring Data Cassandra:
spring-data-cassandra - 2.0.8.RELEASE
spring-boot-starter-parent - 2.0.4.RELEASE
I need to change some properties (mostly hostnames) of Cassandra on the fly and want the application to make a new connection. For config changes we have an internal Cloud Config Change Management system; it runs fine and listens for changes.
This is my class:
@Configuration
@Order(Ordered.HIGHEST_PRECEDENCE)
@RefreshScope
@EnableCassandraRepositories(basePackages = {"com.*.*.*.dao.repo"})
public class AppConfig {
private static final Logger LOGGER = LoggerFactory.getLogger(AppConfig.class);
#Value("${application['cassandraPort']}")
private String cassandraPort;
#Value("${application['cassandraEndpoint']}")
private String cassandraEndpoint;
#Value("${application['keyspaceName']}")
private String keyspaceName;
#Value("${application['cassandraConsistency']}")
private String cassandraConsistency;
#Value("${application['cassandraUserName']}")
private String cassandraUserName;
#Autowired
private AppConfig appConfig;
public AppConfig() {
System.out.println("AppConfig Constructor");
}
public String getCassandraPort() {
return cassandraPort;
}
public void setCassandraPort(String cassandraPort) {
this.cassandraPort = cassandraPort;
}
public String getCassandraEndpoint() {
return cassandraEndpoint;
}
public void setCassandraEndpoint(String cassandraEndpoint) {
this.cassandraEndpoint = cassandraEndpoint;
}
public String getKeyspaceName() {
return keyspaceName;
}
public void setKeyspaceName(String keyspaceName) {
this.keyspaceName = keyspaceName;
}
public String getCassandraConsistency() {
return cassandraConsistency;
}
public void setCassandraConsistency(String cassandraConsistency) {
this.cassandraConsistency = cassandraConsistency;
}
public String getCassandraUserName() {
return cassandraUserName;
}
public void setCassandraUserName(String cassandraUserName) {
this.cassandraUserName = cassandraUserName;
}
@Bean
// @RefreshScope
public CassandraConverter converter() {
return new MappingCassandraConverter(this.mappingContext());
}
@Bean
// @RefreshScope
public CassandraMappingContext mappingContext() {
return new CassandraMappingContext();
}
@Bean
//@RefreshScope
public CassandraSessionFactoryBean session() {
CassandraSessionFactoryBean session = new CassandraSessionFactoryBean();
session.setCluster(this.cluster().getObject());
session.setKeyspaceName(appConfig.getKeyspaceName());
session.setConverter(this.converter());
session.setSchemaAction(SchemaAction.NONE);
return session;
}
@Bean
//@RefreshScope
public CassandraClusterFactoryBean cluster() {
CassandraClusterFactoryBean cluster = new CassandraClusterFactoryBean();
cluster.setContactPoints(appConfig.getCassandraEndpoint());
cluster.setPort(Integer.valueOf(appConfig.getCassandraPort()));
cluster.setUsername(appConfig.getCassandraUserName());
cluster.setPassword("password");
cluster.setQueryOptions(new QueryOptions().setConsistencyLevel(ConsistencyLevel.LOCAL_QUORUM));
return cluster;
}
}
However, when I try to use @RefreshScope with that Configuration class, the application fails to start. This is what it shows in the console:
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 2 of constructor in org.springframework.boot.autoconfigure.data.cassandra.CassandraDataAutoConfiguration required a bean of type 'com.datastax.driver.core.Cluster' that could not be found.
- Bean method 'cassandraCluster' not loaded because auto-configuration 'CassandraAutoConfiguration' was excluded
Action:
Consider revisiting the entries above or defining a bean of type 'com.datastax.driver.core.Cluster' in your configuration.
Are there any guidelines on using @RefreshScope with Cassandra beans? If anyone has done this before, can you share how?
You're mixing a couple of things here.
The config carries properties and bean definitions.
@RefreshScope on AppConfig causes some interference with Spring Boot's auto-configuration and the declared beans aren't used (that's why you see Parameter 2 of constructor…).
To clean up, we will reuse what Spring Boot provides as much as possible, and only declare what's really needed.
Follow these steps to solve the issue (based on your code above):
Create a @ConfigurationProperties bean that encapsulates your properties, or better, reuse CassandraProperties.
Re-enable CassandraAutoConfiguration and remove your own MappingContext and CassandraConverter beans; keep only the Cluster and Session bean definitions.
Declare the Cluster and Session beans as needed and make them use @RefreshScope. Your @Configuration class should look like this:
@Configuration
public class MyConfig {
@Bean(destroyMethod = "close")
@RefreshScope
public Cluster cassandraCluster(CassandraProperties properties) {
Cluster.Builder builder = Cluster.builder().addContactPoints(properties.getContactPoints().toArray(new String[0]))
.withoutJMXReporting();
return builder.build();
}
@Bean(destroyMethod = "close")
@RefreshScope
public Session cassandraSession(CassandraProperties properties, Cluster cluster) {
return cluster.connect(properties.getKeyspaceName());
}
}
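With this setup, the Cluster and Session beans in @RefreshScope are proxied and recreated lazily after a refresh event is processed (for example, when your config-change mechanism triggers Spring Cloud's refresh endpoint), so the next Cassandra operation connects using the updated contact points.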

Spring Boot auto-generates tables from the wrong datasource

My current project needs to connect to multiple databases. I set
spring.jpa.generate-ddl=true
spring.jpa.hibernate.ddl-auto=update
in application.properties.
and I have some DB config classes as below:
@Configuration
public class DBSourceConfiguration {
public final static String DATA_SOURCE_PRIMARY = "dataSource";
public final static String DATA_SOURCE_PROPERTIES = "propertiesDataSource";
public final static String DATA_SOURCE_REPORT = "reportDataSource";
public final static String DATA_SOURCE_NEW_DRAGON = "newDragonDataSource";
@Primary
@Bean(name = DATA_SOURCE_PRIMARY)
@ConfigurationProperties(prefix = "spring.datasource")
public DataSource dataSource() {
return DataSourceBuilder.create().build();
}
@Bean(name = DATA_SOURCE_REPORT)
@ConfigurationProperties(prefix = "externaldatasource.report")
public DataSource reportDataSource() {
return DataSourceBuilder.create().build();
}
@Bean(name = DATA_SOURCE_NEW_DRAGON)
@ConfigurationProperties(prefix = "externaldatasource.newdragon")
public DataSource newDragonDataSource() {
return DataSourceBuilder.create().build();
}
@Bean(name = DATA_SOURCE_PROPERTIES)
@ConfigurationProperties(prefix = "externaldatasource.properties")
public DataSource propertiesDataSource() {
return DataSourceBuilder.create().build();
}
}
and
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
entityManagerFactoryRef = PrimaryDbConfig.ENTITY_MANAGER_FACTORY,
transactionManagerRef = PrimaryDbConfig.TRANSACTION_MANAGER,
basePackageClasses = { _TbsRepositoryBasePackage.class })
public class PrimaryDbConfig extends AbstractDbConfig {
public final static String ENTITY_MANAGER_FACTORY = "entityManagerFactoryPrimary";
public final static String ENTITY_MANAGER = "entityManagerPrimary";
public final static String TRANSACTION_MANAGER = "transactionManagerPrimary";
@Autowired
@Qualifier(DBSourceConfiguration.DATA_SOURCE_PRIMARY)
private DataSource dataSource;
@Primary
@Bean(name = ENTITY_MANAGER_FACTORY)
public LocalContainerEntityManagerFactoryBean entityManagerFactory(EntityManagerFactoryBuilder builder) {
return builder.dataSource(dataSource).properties(getVendorProperties(dataSource)).packages(_TbsEntityBasePackage.class).persistenceUnit("primaryPersistenceUnit").build();
}
@Primary
@Bean(name = ENTITY_MANAGER)
public EntityManager entityManager(EntityManagerFactoryBuilder builder) {
return entityManagerFactory(builder).getObject().createEntityManager();
}
@Primary
@Bean(name = TRANSACTION_MANAGER)
public PlatformTransactionManager transactionManager(EntityManagerFactoryBuilder builder) {
return new JpaTransactionManager(entityManagerFactory(builder).getObject());
}
}
and
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
entityManagerFactoryRef = PropertiesDbConfig.ENTITY_MANAGER_FACTORY,
transactionManagerRef = PropertiesDbConfig.TRANSACTION_MANAGER,
basePackageClasses = { _PropertiesRepositoryBasePackage.class })
public class PropertiesDbConfig extends AbstractDbConfig {
public final static String ENTITY_MANAGER_FACTORY = "entityManagerFactoryProperties";
public final static String ENTITY_MANAGER = "entityManagerProperties";
public final static String TRANSACTION_MANAGER = "transactionManagerProperties";
@Autowired
@Qualifier(DBSourceConfiguration.DATA_SOURCE_PROPERTIES)
private DataSource dataSource;
@Bean(name = ENTITY_MANAGER_FACTORY)
public LocalContainerEntityManagerFactoryBean entityManagerFactory(EntityManagerFactoryBuilder builder) {
return builder.dataSource(dataSource).properties(getVendorProperties(dataSource)).packages(_PropertiesEntityBasePackage.class).persistenceUnit("propertiesPersistenceUnit").build();
}
@Bean(name = ENTITY_MANAGER)
public EntityManager entityManager(EntityManagerFactoryBuilder builder) {
return entityManagerFactory(builder).getObject().createEntityManager();
}
@Bean(name = TRANSACTION_MANAGER)
public PlatformTransactionManager transactionManager(EntityManagerFactoryBuilder builder) {
return new JpaTransactionManager(entityManagerFactory(builder).getObject());
}
}
and two more DbConfig classes (just like the two DbConfig classes above).
My problem is that every time I run this web application, entities (under different packages) are generated in all databases. In other words, the Tbs (primary) entities generate tables in newDragon and all the other databases.
For instance, entity A belongs to the primary data source and entity B belongs to the properties data source, but the framework generates tables A and B in the primary database, the newDragon database, and the other two databases.
Update 2018/06/01 - 1
Although the framework generates all entities in all databases, I can still access tables from the right database. All my web application's functionality works very well. This is very odd, isn't it?
I guess my configuration is fine, since there aren't any problems when my application accesses the databases (like reading from the wrong database and getting an empty result, or inserting data into the wrong database, etc.). Probably something else causes this "generate all tables in all databases" problem.
Based on the configuration you provided, CRUD operations against the right database shouldn't be a problem. But for generating tables in the right database, you may want to check whether the configuration picks up the entity/package names correctly.
Each LocalContainerEntityManagerFactoryBean is set with a unique package class; the framework will then scan entities under that package name and generate tables accordingly in the target datasource. However, there is a situation where packagesToScan gets changed: if you have an @EntityScan annotation, it
overrides packagesToScan on all defined LocalContainerEntityManagerFactoryBeans. Reference code follows, from EntityScanRegistrar.java:
@Override
public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
if (bean instanceof LocalContainerEntityManagerFactoryBean) {
LocalContainerEntityManagerFactoryBean factoryBean = (LocalContainerEntityManagerFactoryBean) bean;
factoryBean.setPackagesToScan(this.packagesToScan);
this.processed = true;
}
return bean;
}
As a result, even if you have provided each LocalContainerEntityManagerFactoryBean with a unique package class, the final result may still be overridden by the framework if you have @EntityScan somewhere in your application. Your configuration seems OK to me, so first try to find and reconcile the package names between @EntityScan and the LocalContainerEntityManagerFactoryBeans; that should solve the issue.
reference: https://github.com/spring-projects/spring-boot/issues/6830
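To make the check above concrete, here is a hedged illustration (not code from the original posts) of what to look for: a broad @EntityScan on the application class, like the one below, would override the per-factory .packages(...) calls, so it should be removed or narrowed to a single persistence unit's packages.
@SpringBootApplication
@EntityScan(basePackageClasses = _TbsEntityBasePackage.class) // this overrides packagesToScan on every factory
public class Application { ... }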

Access multiple DataSources with JdbcTemplate results in error

I have two projects: one with DAO classes and the model, and another with a REST controller.
Project A: DAO classes + model
Project B: REST controller
Project A
application.properties:
spring.abcDatasource.url=
spring.abcDatasource.username=
spring.abcDatasource.password=
spring.abcDatasource.driver-class-name=oracle.jdbc.driver.OracleDriver
spring.xyzDatasource.url=
spring.xyzDatasource.username=
spring.xyzDatasource.password=
spring.xyzDatasource.driver-class-name=oracle.jdbc.driver.OracleDriver
spring.datasource.initialize=false
DBConfiguration.java
@Configuration
public class DBConfiguration {
@Primary
@Bean(name = "abcDS")
@ConfigurationProperties(prefix = "spring.abcDatasource")
public DataSource abcDS() {
return DataSourceBuilder.create().build();
}
#Bean(name = "abcJdbc")
public JdbcTemplate abcJdbcTemplate(#Qualifier("abcDS") DataSource abcDS) {
return new JdbcTemplate(abcDS);
}
#Bean(name = "xyzDS")
#ConfigurationProperties(prefix = "spring.xyzDatasource")
public DataSource xyzDataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "xyzJdbc")
public JdbcTemplate ebsJdbcTemplate(#Qualifier("xyzDS") DataSource xyzDatasource) {
return new JdbcTemplate(xyzDatasource);
}
}
AlphaDAO.java
@Repository
public class AlphaDAO{
@Autowired
@Qualifier("abcJdbc")
private JdbcTemplate abcJdbc;
@Autowired
@Qualifier("xyzJdbc")
private JdbcTemplate xyzJdbc;
SqlParameterSource namedParameters;
public Collection<Alpha> findAll(String owner){
String sql = "SELECT * from alpha where OWNER in (:owner)" ;
NamedParameterJdbcTemplate namedParameterJdbcTemplate = new NamedParameterJdbcTemplate(abcJdbc.getDataSource());
namedParameters = new MapSqlParameterSource("owner", owner);
List<Alpha> list = namedParameterJdbcTemplate.query(sql,namedParameters,
new BeanPropertyRowMapper(Alpha.class));
return list;
}
Project B (REST controller):
AlphaServiceApplication.java
@SpringBootApplication
@EnableAutoConfiguration(exclude={DataSourceAutoConfiguration.class})
public class AlphaServiceApplication extends SpringBootServletInitializer implements WebApplicationInitializer {
@Override
protected SpringApplicationBuilder configure(SpringApplicationBuilder builder) {
return builder.sources(AlphaServiceApplication.class);
}
public static void main(String[] args) {
SpringApplication.run(AlphaServiceApplication.class, args);
}
}
AlphaServiceController.java
@RestController
public class AlphaServiceController {
private static final Logger logger = LoggerFactory.getLogger(AlphaServiceController.class);
@Autowired
AlphaDAO dao;
@CrossOrigin(origins = "http://localhost:4200")
@RequestMapping("/alpha")
public Collection<Alpha> index(#RequestBody String owner) {
return dao.findAll(owner);
}
If I try to run the REST controller, I get an error saying:
APPLICATION FAILED TO START
Description:
Field dao in com.xyz.web.wip.AlphaService.AlphaServiceController required a bean of type 'com.xyz.comp.wip.alphaComp.dao.AlphaDAO' that could not be found.
Action:
Consider defining a bean of type 'com.xyz.comp.wip.alphaComp.dao.AlphaDAO' in your configuration.
Your AlphaDAO class doesn't make much sense: you are trying to autowire two fields but you still have a constructor.
Spring can't build the object because there is no qualifier on the constructor.
You can either do constructor injection or field injection, but you shouldn't use both.
I would recommend using constructor injection.
@Repository
public class AlphaDAO{
private final JdbcTemplate abcJdbc;
private final JdbcTemplate xyzJdbc;
@Autowired
public AlphaDAO(
@Qualifier("abcJdbc") JdbcTemplate abcJdbc,
@Qualifier("xyzJdbc") JdbcTemplate xyzJdbc){
this.abcJdbc = abcJdbc;
this.xyzJdbc = xyzJdbc;
}
Also remove your @Bean method from the controller.
Since the DAO classes and the REST controller are in different packages, adding scanBasePackages to the @SpringBootApplication annotation, one level up, worked fine.
AlphaServiceApplication.java
@SpringBootApplication(scanBasePackages = { "com.xyz" })

Spring boot - how to configure multiple datasources

I am trying to set up multiple data sources (MySQL, Postgres, and Oracle) using Spring Boot. I am not using JPA; I am setting up with JdbcTemplate.
I have tried setting up something like this.
application.properties
spring.datasource.test-oracle.username=test-oracle
spring.datasource.test-oracle.password=test-password
spring.datasource.test-oracle.url=dburl/test
spring.datasource.test-oracle.driver-class-name=oracle.jdbc.OracleDriver
spring.datasource.int-oracle.username=int-oracle
spring.datasource.int-oracle.password=int-password
spring.datasource.int-oracle.url=dburl/int
spring.datasource.int-oracle.driver-class-name=oracle.jdbc.driver.OracleDriver
spring.datasource.d.int-mysql.username=user
spring.datasource.d.int-mysql.password=password
spring.datasource.d.int-mysql.url=dburl/d
spring.datasource.d.int-mysql.driver-class-name=com.mysql.jdbc.Driver
spring.datasource.m.int-mysql.username=user
spring.datasource.m.int-mysql.password=password
spring.datasource.m.int-mysql.url=dburl/m
spring.datasource.m.int-mysql.driver-class-name=com.mysql.jdbc.Driver
spring.datasource.d.test-mysql.username=user
spring.datasource.d.test-mysql.password=password
spring.datasource.d.test-mysql.url=dburl/d
spring.datasource.d.test-mysql.driver-class-name=com.mysql.jdbc.Driver
spring.datasource.m.test-mysql.username=user
spring.datasource.m.test-mysql.password=password
spring.datasource.m.test-mysql.url=dburl/m
spring.datasource.m.test-mysql.driver-class-name=com.mysql.jdbc.Driver
MySqlConfiguration.java
@Configuration
public class MySqlConfiguration {
#Bean(name = "dMySql")
#ConfigurationProperties(prefix = "spring.datasource.d.int-mysql")
public DataSource mysqlDrupalDataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "dJdbc")
public JdbcTemplate drupalJdbcTemplate(DataSource dMySql) {
return new JdbcTemplate(dMySql);
}
#Bean(name = "mMySql")
#ConfigurationProperties(prefix = "spring.datasource.m.int-mysql")
public DataSource mysqlDrupalDataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "mJdbc")
public JdbcTemplate drupalJdbcTemplate(DataSource mMySql) {
return new JdbcTemplate(mMySql);
}
}
OracleConfiguration.java
@Configuration
public class OracleConfiguration {
@Primary
@Bean(name = "tOracle")
@ConfigurationProperties(prefix = "spring.datasource.test-oracle")
public DataSource heOracleDataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "tOracleJdbc")
public JdbcTemplate jdbcTemplate(DataSource tOracle) {
return new JdbcTemplate(tOracle);
}
#Bean(name = "iOracle")
#ConfigurationProperties(prefix = "spring.datasource.int-oracle")
public DataSource heOracleDataSource() {
return DataSourceBuilder.create().build();
}
#Bean(name = "iOracleJdbc")
public JdbcTemplate jdbcTemplate(DataSource iOracle) {
return new JdbcTemplate(iOracle);
}
}
I am not sure if the above is the correct way to go about this. When I use @Primary as per the Boot docs, the bean that has @Primary is always used. Then I use the configurations in my DAO implementations like this:
One of the DAO implementations:
@Repository
public class DAOImpl implements DAOInterface {
@Autowired
@Qualifier("dJdbc")
private JdbcTemplate jdbc;
@Override
public Map<String, Object> getBasicStudentInfo(String MAIL) {
return jdbc.queryForMap(GET_BASIC_STUDENT_INFO, new Object[]{MAIL});
}
How do I go about doing this? I did see many articles about multiple datasources, but unfortunately the examples or solutions don't suit me.
Further to this, I need to be able to query against the DBs based on some user input. So if a user provides an environment, e.g. "test" or "int", how can I trigger the correct properties based on that input?
I understand that Environment can be @Autowired in Spring Boot and I can intercept the user input, but I am unsure how to provide the plumbing between the user input and the DAO configurations.
If something is unclear or needs a bit more explanation from my side, or if you need more code, I can provide that. Any help to resolve this situation would be appreciated. Thanks.
Here is a complete solution to your problem.
Your configuration classes will look like this:
MySqlConfiguration.java
@Configuration
public class MySqlConfiguration {
@Bean(name = "dMySql")
@ConfigurationProperties(prefix = "spring.datasource.d.int-mysql")
public DataSource dMySqlDataSource() {
return DataSourceBuilder.create().build();
}
@Bean(name = "dJdbc")
public JdbcTemplate dJdbcTemplate(@Qualifier("dMySql") DataSource dMySql) {
return new JdbcTemplate(dMySql);
}
@Bean(name = "mMySql")
@ConfigurationProperties(prefix = "spring.datasource.m.int-mysql")
public DataSource mMySqlDataSource() {
return DataSourceBuilder.create().build();
}
@Bean(name = "mJdbc")
public JdbcTemplate mJdbcTemplate(@Qualifier("mMySql") DataSource mMySql) {
return new JdbcTemplate(mMySql);
}
}
OracleConfiguration.java
@Configuration
public class OracleConfiguration {
@Primary
@Bean(name = "tOracle")
@ConfigurationProperties(prefix = "spring.datasource.test-oracle")
public DataSource tOracleDataSource() {
return DataSourceBuilder.create().build();
}
@Bean(name = "tOracleJdbc")
public JdbcTemplate tOracleJdbcTemplate(@Qualifier("tOracle") DataSource tOracle) {
return new JdbcTemplate(tOracle);
}
@Bean(name = "iOracle")
@ConfigurationProperties(prefix = "spring.datasource.int-oracle")
public DataSource iOracleDataSource() {
return DataSourceBuilder.create().build();
}
@Bean(name = "iOracleJdbc")
public JdbcTemplate iOracleJdbcTemplate(@Qualifier("iOracle") DataSource iOracle) {
return new JdbcTemplate(iOracle);
}
}
and in your DAO class, you can autowire the JdbcTemplate like this:
@Repository
public class DAOImpl implements DAOInterface {
@Autowired
@Qualifier("dJdbc")
private JdbcTemplate dJdbc;
@Autowired
@Qualifier("mJdbc")
private JdbcTemplate mJdbc;
@Autowired
@Qualifier("tOracleJdbc")
private JdbcTemplate tOracleJdbc;
@Autowired
@Qualifier("iOracleJdbc")
private JdbcTemplate iOracleJdbc;
@Override
public Map<String, Object> getBasicStudentInfo(String MAIL) {
return dJdbc.queryForMap(GET_BASIC_STUDENT_INFO, new Object[]{MAIL});
}
.
.
.
}
Note: make sure to annotate one of the DataSource beans with the @Primary annotation.
My setup: spring-boot version 1.2.5.RELEASE
I succeeded in running a setup like this, with each JdbcTemplate being created with the correct DataSource, by adding a @Qualifier to each JdbcTemplate-creating method.
So, for every JdbcTemplate method you should match the qualifying DataSource like this:
#Bean(name = "dJdbc")
public JdbcTemplate drupalJdbcTemplate(#Qualifier("dMySql") DataSource dMySql) {
return new JdbcTemplate(dMySql);
}
No matter what you choose for @Primary, using @Qualifier for every JdbcTemplate should work well.
Autowiring the JdbcTemplates in repositories and using @Qualifier for them is also fine.
In your DAO you could wire in additional JdbcTemplates. Then at runtime you can pick which one to use.
@Repository
public class DAOImpl implements DAOInterface {
@Autowired
@Qualifier("tOracleJdbc")
private JdbcTemplate testJdbc;
@Autowired
@Qualifier("iOracleJdbc")
private JdbcTemplate intJdbc;
@Override
public Map<String, Object> getBasicStudentInfo(String MAIL, String source) {
if ("TEST".equals(source)){
return testJdbc.queryForMap(GET_BASIC_STUDENT_INFO, new Object[]{MAIL});
}else {
return intJdbc.queryForMap(GET_BASIC_STUDENT_INFO, new Object[]{MAIL});
}
}
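An alternative sketch (an assumption, not part of the original answers): keeping the environment-specific templates in a Map makes the lookup easier to extend beyond two environments. The qualifiers reuse the JdbcTemplate bean names defined above, GET_BASIC_STUDENT_INFO is the constant from the question, and the DAO interface is assumed to take the environment as a second argument, as in the answer above.
@Repository
public class DAOImpl implements DAOInterface {
    private final Map<String, JdbcTemplate> templates = new HashMap<>();

    @Autowired
    public DAOImpl(@Qualifier("tOracleJdbc") JdbcTemplate testJdbc,
                   @Qualifier("iOracleJdbc") JdbcTemplate intJdbc) {
        templates.put("TEST", testJdbc);
        templates.put("INT", intJdbc);
    }

    @Override
    public Map<String, Object> getBasicStudentInfo(String mail, String source) {
        // fall back to the test template if the requested environment is unknown
        JdbcTemplate jdbc = templates.getOrDefault(source.toUpperCase(), templates.get("TEST"));
        return jdbc.queryForMap(GET_BASIC_STUDENT_INFO, new Object[]{mail});
    }
}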
