spring boot assign specific DataSource to JpaRepository - java

I have two datasources and I'm trying to assign a specific datasource to each of my JpaRepositories. I'm using the Spring Boot framework. The primary datasource is used about 90% of the time and the secondary about 10%, so it would be nice to default to the primary and only assign the secondary datasource when required. I tried following the docs here https://docs.spring.io/spring-boot/docs/current/reference/html/howto-data-access.html , but I don't think it's exactly what I need. Any tips would be great!
spring.datasource.configuration.url=jdbc:postgresql://localhost:5432/mydatabase
spring.datasource.configuration.username=dockerusername
spring.datasource.configuration.password=dockerpassword
spring.datasource.configuration.driver-class-name=org.postgresql.Driver
spring.datasource.cached.url=jdbc:hsqldb:mem:main
spring.datasource.cached.driver-class-name=org.hsqldb.jdbc.JDBCDriver
spring.datasource.initialize=false
spring.jpa.database=default
spring.jpa.hibernate.ddl-auto=update
Config class:
@Configuration
@ComponentScan({"com.praeses.gov"})
public class Config {

    @Bean
    @ConfigurationProperties("spring.datasource.configuration")
    public DataSourceProperties configDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @ConfigurationProperties("spring.datasource.configuration")
    public DataSource configDataSource() {
        return configDataSourceProperties().initializeDataSourceBuilder().build();
    }

    @Bean
    @Primary
    @ConfigurationProperties("spring.datasource.cached")
    public DataSourceProperties cachedDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @Primary
    @ConfigurationProperties("spring.datasource.cached")
    public DataSource cachedDataSource() {
        return cachedDataSourceProperties().initializeDataSourceBuilder().build();
    }
}
Repository that uses the primary datasource:
@Qualifier("spring.datasource.cached")
@Repository("spring.datasource.cached")
public interface GeoitemRepository extends JpaRepository<Geoitem, String> {
}
Repository that uses the secondary datasource:
@Qualifier("spring.datasource.configuration")
@Repository("spring.datasource.configuration")
public interface GeoitemhistoryRepository extends JpaRepository<Geoitemhistory, String> {
}

You can do something along the lines of the link below. Just create two config classes instead of one (one for each datasource) and register both of them with your main application.
Also, declare both sets of connection settings in the properties file, like this:
datasource.local.url=
datasource.local.driver-class-name=
datasource.local.username=
datasource.local.password=
datasource.primary.url=
datasource.primary.driver-class-name=
datasource.primary.username=
datasource.primary.password=
Also, add the @Primary annotation to the LocalContainerEntityManagerFactoryBean and the PlatformTransactionManager in the config class for the primary datasource.
See this answer for more details:
https://stackoverflow.com/a/44971911/6775742
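To make that concrete, here is a minimal sketch of what the config class for the secondary datasource could look like, reusing the spring.datasource.configuration properties from the question; the package names, bean names and persistence-unit name are placeholders you would adapt to your project:
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
        basePackages = "com.example.secondary.repos",              // placeholder package
        entityManagerFactoryRef = "secondaryEntityManagerFactory",
        transactionManagerRef = "secondaryTransactionManager")
public class SecondaryDataSourceConfig {

    @Bean
    @ConfigurationProperties("spring.datasource.configuration")
    public DataSourceProperties secondaryDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    public DataSource secondaryDataSource() {
        return secondaryDataSourceProperties().initializeDataSourceBuilder().build();
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean secondaryEntityManagerFactory(
            EntityManagerFactoryBuilder builder) {
        return builder
                .dataSource(secondaryDataSource())
                .packages("com.example.secondary.domain")          // placeholder package
                .persistenceUnit("secondary")
                .build();
    }

    @Bean
    public PlatformTransactionManager secondaryTransactionManager(
            @Qualifier("secondaryEntityManagerFactory") EntityManagerFactory entityManagerFactory) {
        return new JpaTransactionManager(entityManagerFactory);
    }
}
The config class for the primary datasource mirrors this one, with @Primary added to its DataSourceProperties, DataSource, LocalContainerEntityManagerFactoryBean and PlatformTransactionManager beans. Repositories are then assigned to a datasource simply by the package they live in.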

Related

Spring boot only uses primary datasource with two datasources

I have two MySQL datasources in Spring Boot, and therefore two config classes. But it looks like it's only using the primary datasource: all entities are created for the primary datasource, so both the crawlerdb and the userdb entities end up in the userdb.
My userdb primary config:
/**
 * Data source for the MySQL User Database schema
 */
@Configuration
@EntityScan(basePackages = "cs.crawler.server.projectcs.domain.userdb")
@EnableJpaRepositories(basePackages = "cs.crawler.server.projectcs.repos.userdb")
@EnableTransactionManagement
public class UserDomainConfig {

    @Bean
    @Primary
    @ConfigurationProperties("spring.datasource.users")
    public DataSourceProperties userDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @Primary
    @ConfigurationProperties("spring.datasource.users.configuration")
    public HikariDataSource firstDataSource(DataSourceProperties firstDataSourceProperties) {
        return firstDataSourceProperties.initializeDataSourceBuilder().type(HikariDataSource.class).build();
    }
}
My secondary crawlerdb config:
/**
 * Data source for the MySQL crawler database schema
 */
@Configuration
@EntityScan(basePackages = "cs.crawler.server.projectcs.domain.crawlerdb")
@EnableJpaRepositories(basePackages = "cs.crawler.server.projectcs.repos.crawlerdb")
@EnableTransactionManagement
public class CrawlerDomainConfig {

    @Bean
    @ConfigurationProperties("spring.datasource.crawler")
    public DataSourceProperties crawlerDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    @ConfigurationProperties("spring.datasource.crawler.configuration")
    public HikariDataSource secondDataSource(DataSourceProperties secondDataSourceProperties) {
        return secondDataSourceProperties.initializeDataSourceBuilder().type(HikariDataSource.class).build();
    }
}
The entity classes for the two schemas are in different packages, as shown in the @EntityScan annotations above the class names. But when I check MySQL Workbench for the created schemas, I see that all entities are created in the userdb.
I fixed my issues by changing the config files as described in this article:
https://springframework.guru/how-to-configure-multiple-data-sources-in-a-spring-boot-application/

Is it possible to autowire two different beans of the same class without @Qualifier and @Resource? If yes, how?

I have a component in a jar file which has an autowired DataSource:
@Service
class ReadService {
    @Autowired
    DataSource readDataSource;
}

@Service
class WriteService {
    @Autowired
    DataSource writeDataSource;
}
I have to configure two different servers, one to read and one to write. How do I inject those DataSources into these beans created from the jar?
You can read more here
In short, you can specify different datasource configurations in the properties file and create the datasource beans separately:
app.datasource.member.url=jdbc:mysql://localhost:3306/memberdb?createDatabaseIfNotExist=true
app.datasource.member.username=root
app.datasource.member.password=P#ssw0rd#
app.datasource.member.driverClassName=com.mysql.cj.jdbc.Driver
#card number (cardholder id, cardnumber)
app.datasource.cardholder.url=jdbc:mysql://localhost:3306/cardholderdb?createDatabaseIfNotExist=true
app.datasource.cardholder.username=root
app.datasource.cardholder.password=P#ssw0rd#
app.datasource.cardholder.driverClassName=com.mysql.cj.jdbc.Driver
#expiration date (card id, expiration month, expiration year)
app.datasource.card.url=jdbc:mysql://localhost:3306/carddb?createDatabaseIfNotExist=true
app.datasource.card.username=root
app.datasource.card.password=P#ssw0rd#
app.datasource.card.driverClassName=com.mysql.cj.jdbc.Driver
For the primary datasource:
@Bean
@Primary
@ConfigurationProperties("app.datasource.member")
public DataSourceProperties memberDataSourceProperties() {
    return new DataSourceProperties();
}

@Bean
@Primary
@ConfigurationProperties("app.datasource.member.configuration")
public DataSource memberDataSource() {
    return memberDataSourceProperties().initializeDataSourceBuilder()
            .type(HikariDataSource.class).build();
}
For the secondary datasources:
/* cardholder data source */
@Bean
@ConfigurationProperties("app.datasource.cardholder")
public DataSourceProperties cardHolderDataSourceProperties() {
    return new DataSourceProperties();
}

@Bean
@ConfigurationProperties("app.datasource.cardholder.configuration")
public DataSource cardholderDataSource() {
    return cardHolderDataSourceProperties().initializeDataSourceBuilder()
            .type(BasicDataSource.class).build();
}

/* card data source */
@Bean
@ConfigurationProperties("app.datasource.card")
public DataSourceProperties cardDataSourceProperties() {
    return new DataSourceProperties();
}

@Bean
@ConfigurationProperties("app.datasource.card.configuration")
public DataSource cardDataSource() {
    return cardDataSourceProperties().initializeDataSourceBuilder()
            .type(BasicDataSource.class).build();
}
And specify the schema in the entity declaration:
@Table(name = "member", schema = "memberdb")
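As for wiring the two DataSources into the jar's services without @Qualifier or @Resource: a minimal sketch, assuming you are free to choose the bean names and property prefixes (both are placeholders here), is to name the DataSource beans exactly like the autowired fields (readDataSource and writeDataSource). When several candidates of the same type exist, Spring falls back to matching the bean name against the field name:
@Configuration
public class ReadWriteDataSourceConfig {

    // Bean name matches the field "DataSource readDataSource" in ReadService,
    // so it is chosen by name even though two DataSource beans exist.
    @Bean
    @ConfigurationProperties("app.datasource.read")    // placeholder prefix
    public DataSource readDataSource() {
        return DataSourceBuilder.create().build();
    }

    // Bean name matches the field "DataSource writeDataSource" in WriteService.
    @Bean
    @ConfigurationProperties("app.datasource.write")   // placeholder prefix
    public DataSource writeDataSource() {
        return DataSourceBuilder.create().build();
    }
}
Be aware that with two DataSource beans and no @Primary, other auto-configuration that expects a single DataSource (JPA, Flyway, etc.) may stop working, so you may still need to mark one of them @Primary or configure those pieces manually.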
If you build 2 separate applications based on the same component, you can simply use different database connection settings in the application.yml (or application.properties) of each application.
For the "Write" app:
spring:
  datasource:
    url: jdbc:postgresql://localhost:5433/writeDB
    username: "username"
    password: "password"
For the "Read" app:
spring:
  datasource:
    url: jdbc:postgresql://localhost:5433/readDB
    username: "username"
    password: "password"

Multiple Datasources

I have wired two datasources following examples for Oracle DB:
@Configuration
public class SpringConfigurationProperties extends DataSourceProperties {

    @Bean
    @Primary
    @ConfigurationProperties(prefix="spring.datasource")
    public DataSource primaryDatasource() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    @Qualifier("primaryDatasource")
    public NamedParameterJdbcTemplate primaryNpJdbcTemplate(DataSource dataSource) {
        return new NamedParameterJdbcTemplate(dataSource);
    }

    @Bean
    @ConfigurationProperties(prefix="gps.bulk.load.database")
    public DataSource bulkLoadDatasource() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    @Qualifier("bulkLoadDatasource")
    public NamedParameterJdbcTemplate bulkLoadNpJdbcTemplate(DataSource dataSource) {
        return new NamedParameterJdbcTemplate(dataSource);
    }
}
But I am getting the following error on startup:
org.springframework.boot.autoconfigure.jdbc.DataSourceInitializerInvoker required a single bean, but 2 were found
I think I could reproduce this error:
Parameter 1 of constructor in org.springframework.boot.autoconfigure.jdbc.DataSourceInitializerInvoker required a single bean, but 2 were found:
- myConfig: defined in file [...MyConfig.class]
- spring.datasource-org.springframework.boot.autoconfigure.jdbc.DataSourceProperties: defined in null
It seems like Spring Boot found 2 DataSourceProperties beans for the constructor of DataSourceInitializerInvoker. I am not sure why, because I only have the one class that you have, but when I mark my configuration class as @Primary it works.
@Configuration
@Primary
public class SpringConfigurationProperties extends DataSourceProperties {
    ...
}
In addition, as asked for in the comments, the .properties file:
spring.datasource.url=jdbc:mysql://localhost/test
spring.datasource.username=test
spring.datasource.password=test
spring.datasource.driver-class-name=test
gps.bulk.load.database.url=test
gps.bulk.load.database.username=test
gps.bulk.load.database.password=test
gps.bulk.load.database.driver-class-name=test
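For what it's worth, a sketch of an alternative that avoids the ambiguity altogether: since the @Configuration class extends DataSourceProperties, it is itself a second DataSourceProperties bean next to Spring Boot's auto-configured one, which is what DataSourceInitializerInvoker complains about. Dropping the extends clause and qualifying the injection points (rather than annotating the @Bean methods with @Qualifier) leaves only one DataSourceProperties bean; this assumes the auto-configured spring.datasource binding is fine for the primary datasource, and the class name below is arbitrary:
@Configuration
public class JdbcTemplatesConfig {

    @Bean
    @Primary
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource primaryDatasource() {
        return DataSourceBuilder.create().build();
    }

    @Bean
    @ConfigurationProperties(prefix = "gps.bulk.load.database")
    public DataSource bulkLoadDatasource() {
        return DataSourceBuilder.create().build();
    }

    // The @Primary datasource is injected here by default.
    @Bean
    public NamedParameterJdbcTemplate primaryNpJdbcTemplate(DataSource dataSource) {
        return new NamedParameterJdbcTemplate(dataSource);
    }

    // @Qualifier on the injection point selects the secondary datasource.
    @Bean
    public NamedParameterJdbcTemplate bulkLoadNpJdbcTemplate(
            @Qualifier("bulkLoadDatasource") DataSource dataSource) {
        return new NamedParameterJdbcTemplate(dataSource);
    }
}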

Two data sources in a Spring Boot application

I have a Spring Boot app for which I have configured two data sources. So far I've configured the data sources in my Application class (annotated with @EnableAutoConfiguration):
@Bean
@Primary
@ConfigurationProperties(prefix="datasource.db1")
public DataSource dataSource() {
    return DataSourceBuilder.create().build();
}

@Bean
@ConfigurationProperties(prefix="datasource.db2")
public DataSource secondaryDataSource() {
    return DataSourceBuilder.create().build();
}
I also added the configuration values to application.properties:
datasource.db1.url=...
...
datasource.db2.url=...
...
Since db1 is the @Primary data source, it is chosen by default. How do I tell an interface extending JpaRepository that it should use db2 instead?
UPDATE: to clarify, my repository is an interface.
You can get the bean associated with the secondary data source from the application context.
For example in Application.java (I'm also using Spring Boot) you define:
@Bean
@ConfigurationProperties(prefix="datasource.secondary")
public DataSource secondaryDataSource() {
    return DataSourceBuilder.create().build();
}
and in a service (here for calling a stored procedure) you have:
@Service
public class EngineImpl implements EngineDao {

    private SetScartiProcedure setScarti;

    @Autowired
    public void init(ApplicationContext ctx) {
        DataSource dataSource = (DataSource) ctx.getBean("secondaryDataSource");
        this.setScarti = new SetScartiProcedure(dataSource);
    }

    public class SetScartiProcedure extends StoredProcedure {
        ...
    }
}
Based on this, you can define several DataSources this way:
@Bean
public LocalContainerEntityManagerFactoryBean customerEntityManagerFactory(
        EntityManagerFactoryBuilder builder) {
    return builder
            .dataSource(customerDataSource())
            .packages(Customer.class)
            .persistenceUnit("customers")
            .build();
}

@Bean
public LocalContainerEntityManagerFactoryBean orderEntityManagerFactory(
        EntityManagerFactoryBuilder builder) {
    return builder
            .dataSource(orderDataSource())
            .packages(Order.class)
            .persistenceUnit("orders")
            .build();
}
and then bind each of them to the classes that it manages:
@Configuration
@EnableJpaRepositories(basePackageClasses = Customer.class,
        entityManagerFactoryRef = "customerEntityManagerFactory")
public class CustomerConfiguration {
    ...
}

@Configuration
@EnableJpaRepositories(basePackageClasses = Order.class,
        entityManagerFactoryRef = "orderEntityManagerFactory")
public class OrderConfiguration {
    ...
}
The repositories then know which database to use through the DataSource that was bound to the classes in their package.
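For example, a minimal sketch of a repository that ends up on the order datasource simply because of where it lives (the Long id type is an assumption here):
// Placed in the same package as Order, so it is picked up by OrderConfiguration
// and backed by orderEntityManagerFactory, i.e. the order datasource.
public interface OrderRepository extends JpaRepository<Order, Long> {
}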

Use of multiple DataSources in Spring Batch

I am trying to configure a couple of datasources within Spring Batch. On startup, Spring Batch is throwing the following exception:
To use the default BatchConfigurer the context must contain no more than one DataSource, found 2
Snippet from Batch Configuration
@Configuration
@EnableBatchProcessing
public class BatchJobConfiguration {

    @Primary
    @Bean(name = "baseDatasource")
    public DataSource dataSource() {
        // first datasource definition here
    }

    @Bean(name = "secondaryDataSource")
    public DataSource dataSource2() {
        // second datasource definition here
    }
    ...
}
Not sure why I am seeing this exception, because I have seen some XML-based configurations for Spring Batch that declare multiple datasources. I am using Spring Batch core version 3.0.1.RELEASE with Spring Boot version 1.1.5.RELEASE. Any help would be greatly appreciated.
You must provide your own BatchConfigurer. Spring does not want to make that decision for you.
@Configuration
@EnableBatchProcessing
public class BatchConfig {

    @Bean
    BatchConfigurer configurer(@Qualifier("batchDataSource") DataSource dataSource) {
        return new DefaultBatchConfigurer(dataSource);
    }
    ...
AbstractBatchConfiguration tries to look up a BatchConfigurer in the container first; if none is found, it tries to create one itself - this is where the IllegalStateException is thrown when there is more than one DataSource bean in the container.
The approach to solving the problem is to prevent the creation of the DefaultBatchConfigurer bean in AbstractBatchConfiguration.
To do that, we hint to the Spring container to create a DefaultBatchConfigurer bean itself by using the @Component annotation:
We annotate the configuration class where @EnableBatchProcessing is placed with @ComponentScan, scanning the package that contains an empty class derived from DefaultBatchConfigurer:
package batch_config;
...
@EnableBatchProcessing
@ComponentScan(basePackageClasses = MyBatchConfigurer.class)
public class MyBatchConfig {
    ...
}
The full code of that empty derived class is here:
package batch_config.components;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.stereotype.Component;

@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {
}
In this configuration the @Primary annotation works for the DataSource bean, as in the example below:
@Configuration
public class BatchTestDatabaseConfig {

    @Bean
    @Primary
    public DataSource dataSource() {
        return .........;
    }
}
This works for the Spring Batch version 3.0.3.RELEASE
The simplest solution to make the @Primary annotation on a DataSource work might be just adding @ComponentScan(basePackageClasses = DefaultBatchConfigurer.class) along with the @EnableBatchProcessing annotation:
@Configuration
@EnableBatchProcessing
@ComponentScan(basePackageClasses = DefaultBatchConfigurer.class)
public class MyBatchConfig {
    ...
}
I would like to provide a solution here, which is very similar to the one answered by @vanarchi, but I managed to put all the necessary configurations into one class.
For the sake of completeness, the solution here assumes that primary datasource is hsql.
@Configuration
@EnableBatchProcessing
public class BatchConfiguration extends DefaultBatchConfigurer {

    @Bean
    @Primary
    public DataSource batchDataSource() {
        // no need to shut down, EmbeddedDatabaseFactoryBean will take care of this
        EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
        EmbeddedDatabase embeddedDatabase = builder
                .addScript("classpath:org/springframework/batch/core/schema-drop-hsqldb.sql")
                .addScript("classpath:org/springframework/batch/core/schema-hsqldb.sql")
                .setType(EmbeddedDatabaseType.HSQL) //.H2 or .DERBY
                .build();
        return embeddedDatabase;
    }

    @Override
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(batchDataSource());
        factory.setTransactionManager(transactionManager());
        factory.afterPropertiesSet();
        return (JobRepository) factory.getObject();
    }

    private ResourcelessTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    // NOTE: the code below is just to give the developer an easy way to access the in-memory
    // HSQL datasource, as we configured it as the primary datasource to store batch-job-related
    // data. Default username: sa, password: ''
    @PostConstruct
    public void getDbManager() {
        DatabaseManagerSwing.main(
                new String[] { "--url", "jdbc:hsqldb:mem:testdb", "--user", "sa", "--password", "" });
    }
}
THREE key points in this solution:
This class is annotated with @EnableBatchProcessing and @Configuration, and extends DefaultBatchConfigurer. By doing this, we instruct Spring Batch to use our customized batch configurer when AbstractBatchConfiguration tries to look up a BatchConfigurer;
Annotate the batchDataSource bean as @Primary, which instructs Spring Batch to use this datasource as the datasource for storing the nine job-related tables.
Override the protected JobRepository createJobRepository() throws Exception method, which makes the jobRepository bean use the primary datasource, as well as a transactionManager instance separate from the one used by the other datasource(s).
The simplest solution is to extend the DefaultBatchConfigurer and autowire your datasource via a qualifier:
@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {

    /**
     * Initialize the BatchConfigurer to use the datasource of your choosing
     * @param firstDataSource
     */
    @Autowired
    public MyBatchConfigurer(@Qualifier("firstDataSource") DataSource firstDataSource) {
        super(firstDataSource);
    }
}
Side Note (as this also deals with the use of multiple data sources): If you use autoconfig to run data initialization scripts, you may notice that it's not initializing on the datasource you'd expect. For that issue, take a look at this: https://github.com/spring-projects/spring-boot/issues/9528
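Until that issue is resolved for your setup, one workaround (a sketch; the script name and the qualifier are placeholders) is to run the initialization script explicitly against the datasource of your choice with a DataSourceInitializer bean:
@Bean
public DataSourceInitializer secondaryDataSourceInitializer(
        @Qualifier("secondaryDataSource") DataSource dataSource) {
    // Runs the given script against the qualified datasource on startup.
    ResourceDatabasePopulator populator =
            new ResourceDatabasePopulator(new ClassPathResource("secondary-schema.sql"));
    DataSourceInitializer initializer = new DataSourceInitializer();
    initializer.setDataSource(dataSource);
    initializer.setDatabasePopulator(populator);
    return initializer;
}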
You can define the beans below and make sure your application.properties file has the entries they need:
@Configuration
@PropertySource("classpath:application.properties")
public class DataSourceConfig {

    @Primary
    @Bean(name = "abcDataSource")
    @ConfigurationProperties(prefix = "abc.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }

    @Bean(name = "xyzDataSource")
    @ConfigurationProperties(prefix = "xyz.datasource")
    public DataSource xyzDataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }
}
application.properties
abc.datasource.jdbc-url=XXXXX
abc.datasource.username=XXXXX
abc.datasource.password=xxxxx
abc.datasource.driver-class-name=org.postgresql.Driver
...........
...........
...........
...........
You can also refer to: Spring Boot Configure and Use Two DataSources
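To consume the two datasources from that configuration, a minimal sketch (the JdbcTemplate bean names are just illustrative) is to inject them by qualifier wherever a single DataSource is expected:
@Configuration
public class JdbcTemplateConfig {

    // Backed by the @Primary "abcDataSource"; no qualifier needed.
    @Bean
    public JdbcTemplate abcJdbcTemplate(DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    // Explicitly wired against the secondary "xyzDataSource".
    @Bean
    public JdbcTemplate xyzJdbcTemplate(@Qualifier("xyzDataSource") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}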
First, create a custom BatchConfigurer:
@Configuration
@Component
public class TwoDataSourcesBatchConfigurer implements BatchConfigurer {

    @Autowired
    @Qualifier("dataSource1")
    DataSource dataSource;

    @Override
    public JobExplorer getJobExplorer() throws Exception {
        ...
    }

    @Override
    public JobLauncher getJobLauncher() throws Exception {
        ...
    }

    @Override
    public JobRepository getJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        // use the autowired data source
        factory.setDataSource(dataSource);
        factory.setTransactionManager(getTransactionManager());
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Override
    public PlatformTransactionManager getTransactionManager() throws Exception {
        ...
    }
}
Then,
@Configuration
@EnableBatchProcessing
@ComponentScan("package")
public class JobConfig {
    // define job, step, ...
}
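As a sketch of what the elided job and step definitions could look like inside JobConfig (the job name, step name and no-op tasklet below are purely illustrative), the JobBuilderFactory and StepBuilderFactory provided by @EnableBatchProcessing can be used:
// inside JobConfig
@Bean
public Step sampleStep(StepBuilderFactory steps) {
    // A single no-op tasklet step, just to illustrate the builder API.
    return steps.get("sampleStep")
            .tasklet((contribution, chunkContext) -> RepeatStatus.FINISHED)
            .build();
}

@Bean
public Job sampleJob(JobBuilderFactory jobs, Step sampleStep) {
    return jobs.get("sampleJob")
            .start(sampleStep)
            .build();
}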
