Please help me. I use multiple data sources in my project.
Data source properties:
spring.datasource.url=jdbc:sqlserver://localhost:1433;databaseName=db
spring.datasource.username=xxxxx
spring.datasource.password=xxxxx
spring.datasource.driverClassName=com.microsoft.sqlserver.jdbc.SQLServerDriver
spring.datasource2.url=jdbc:mysql://localhost:3306/db2
spring.datasource2.username=xxxx
spring.datasource2.password=xxx
spring.datasource2.driver-class-name=com.mysql.cj.jdbc.Driver
config class:
@Configuration
@EnableJdbcRepositories(
        jdbcOperationsRef = "mysqlNamedParameterJdbcOperations",
        basePackages = "com.example.demo.mysqlModels"
)
public class Config extends AbstractJdbcConfiguration {

    @Bean("mysqlDataSource")
    @ConfigurationProperties(prefix = "spring.datasource2")
    public DataSource mysqlDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "mysqlNamedParameterJdbcOperations")
    NamedParameterJdbcOperations mysqlNamedParameterJdbcOperations(@Qualifier("mysqlDataSource") DataSource mysqlDataSource) {
        return new NamedParameterJdbcTemplate(mysqlDataSource);
    }
}
@Configuration
@EnableJdbcRepositories(
        jdbcOperationsRef = "mssqlNamedParameterJdbcOperations",
        basePackages = "com.example.demo.mssqlModels"
)
public class Config2 extends AbstractJdbcConfiguration {

    @Bean("mssqlDataSource")
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource mssqlDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "mssqlNamedParameterJdbcOperations")
    NamedParameterJdbcOperations mssqlNamedParameterJdbcOperations(@Qualifier("mssqlDataSource") DataSource mssqlDataSource) {
        return new NamedParameterJdbcTemplate(mssqlDataSource);
    }
}
repository in com.example.demo.mssqlModels:
public interface MssqlRepository extends PagingAndSortingRepository<MyEntity, Integer> {}
repository in com.example.demo.mysqlModels:
public interface MysqlRepository extends PagingAndSortingRepository<MyEntity, Integer> {}
my service:
@Slf4j
@Service
public class MyService {

    @Autowired
    private MssqlRepository mssqlRepository;

    @Autowired
    private MysqlRepository mysqlRepository;

    @PostConstruct
    public void init() {
        log.info("mssql result {}", mssqlRepository.findAll());
        log.info("mysql result {}", mysqlRepository.findAll());
    }
}
But the result is the same: both repositories read data from the MySQL data source.
Thanks.
You might be interested in looking at a question I raised recently regarding two data sources, each applied to a different repository.
1. In your configuration classes you should also create two TransactionManagers with unique names.
2. Annotate each repository with @Transactional(transactionManager = "transaction manager name"), replacing the name with the appropriate transaction manager.
3. You'll probably need to override the default methods such as saveAll() with the same annotation as in (2).
However, as per my question, I found that an incorrect data source was sometimes used (I've since found that making the Postgres classes the primary ones resolved my problem, but I don't know why this worked).
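A minimal sketch of steps (1) and (2), assuming Spring Data JDBC and the bean names from the question; the transaction manager names (mysqlTransactionManager, mssqlTransactionManager) are illustrative, not taken from the original project:
// imports: org.springframework.jdbc.datasource.DataSourceTransactionManager,
//          org.springframework.transaction.PlatformTransactionManager
// In Config (MySQL side)
@Bean(name = "mysqlTransactionManager")
PlatformTransactionManager mysqlTransactionManager(@Qualifier("mysqlDataSource") DataSource mysqlDataSource) {
    // plain JDBC transaction manager bound to the MySQL DataSource
    return new DataSourceTransactionManager(mysqlDataSource);
}

// In Config2 (SQL Server side)
@Bean(name = "mssqlTransactionManager")
PlatformTransactionManager mssqlTransactionManager(@Qualifier("mssqlDataSource") DataSource mssqlDataSource) {
    return new DataSourceTransactionManager(mssqlDataSource);
}

// On each repository, point @Transactional at the matching manager
@Transactional(transactionManager = "mssqlTransactionManager")
public interface MssqlRepository extends PagingAndSortingRepository<MyEntity, Integer> {}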
Related
My current project needs to connect to multiple databases. I set
spring.jpa.generate-ddl=true
spring.jpa.hibernate.ddl-auto=update
in application.properties.
I also have some DB configuration classes, as below:
@Configuration
public class DBSourceConfiguration {

    public final static String DATA_SOURCE_PRIMARY = "dataSource";
    public final static String DATA_SOURCE_PROPERTIES = "propertiesDataSource";
    public final static String DATA_SOURCE_REPORT = "reportDataSource";
    public final static String DATA_SOURCE_NEW_DRAGON = "newDragonDataSource";

    @Primary
    @Bean(name = DATA_SOURCE_PRIMARY)
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = DATA_SOURCE_REPORT)
    @ConfigurationProperties(prefix = "externaldatasource.report")
    public DataSource reportDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = DATA_SOURCE_NEW_DRAGON)
    @ConfigurationProperties(prefix = "externaldatasource.newdragon")
    public DataSource newDragonDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = DATA_SOURCE_PROPERTIES)
    @ConfigurationProperties(prefix = "externaldatasource.properties")
    public DataSource propertiesDataSource() {
        return DataSourceBuilder.create().build();
    }
}
and
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
        entityManagerFactoryRef = PrimaryDbConfig.ENTITY_MANAGER_FACTORY,
        transactionManagerRef = PrimaryDbConfig.TRANSACTION_MANAGER,
        basePackageClasses = { _TbsRepositoryBasePackage.class })
public class PrimaryDbConfig extends AbstractDbConfig {

    public final static String ENTITY_MANAGER_FACTORY = "entityManagerFactoryPrimary";
    public final static String ENTITY_MANAGER = "entityManagerPrimary";
    public final static String TRANSACTION_MANAGER = "transactionManagerPrimary";

    @Autowired
    @Qualifier(DBSourceConfiguration.DATA_SOURCE_PRIMARY)
    private DataSource dataSource;

    @Primary
    @Bean(name = ENTITY_MANAGER_FACTORY)
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(EntityManagerFactoryBuilder builder) {
        return builder.dataSource(dataSource)
                .properties(getVendorProperties(dataSource))
                .packages(_TbsEntityBasePackage.class)
                .persistenceUnit("primaryPersistenceUnit")
                .build();
    }

    @Primary
    @Bean(name = ENTITY_MANAGER)
    public EntityManager entityManager(EntityManagerFactoryBuilder builder) {
        return entityManagerFactory(builder).getObject().createEntityManager();
    }

    @Primary
    @Bean(name = TRANSACTION_MANAGER)
    public PlatformTransactionManager transactionManager(EntityManagerFactoryBuilder builder) {
        return new JpaTransactionManager(entityManagerFactory(builder).getObject());
    }
}
and
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
        entityManagerFactoryRef = PropertiesDbConfig.ENTITY_MANAGER_FACTORY,
        transactionManagerRef = PropertiesDbConfig.TRANSACTION_MANAGER,
        basePackageClasses = { _PropertiesRepositoryBasePackage.class })
public class PropertiesDbConfig extends AbstractDbConfig {

    public final static String ENTITY_MANAGER_FACTORY = "entityManagerFactoryProperties";
    public final static String ENTITY_MANAGER = "entityManagerProperties";
    public final static String TRANSACTION_MANAGER = "transactionManagerProperties";

    @Autowired
    @Qualifier(DBSourceConfiguration.DATA_SOURCE_PROPERTIES)
    private DataSource dataSource;

    @Bean(name = ENTITY_MANAGER_FACTORY)
    public LocalContainerEntityManagerFactoryBean entityManagerFactory(EntityManagerFactoryBuilder builder) {
        return builder.dataSource(dataSource)
                .properties(getVendorProperties(dataSource))
                .packages(_PropertiesEntityBasePackage.class)
                .persistenceUnit("propertiesPersistenceUnit")
                .build();
    }

    @Bean(name = ENTITY_MANAGER)
    public EntityManager entityManager(EntityManagerFactoryBuilder builder) {
        return entityManagerFactory(builder).getObject().createEntityManager();
    }

    @Bean(name = TRANSACTION_MANAGER)
    public PlatformTransactionManager transactionManager(EntityManagerFactoryBuilder builder) {
        return new JpaTransactionManager(entityManagerFactory(builder).getObject());
    }
}
and two more DB config classes (just like the two above).
My problem is that every time I run this web application, entities (under different packages) are generated into all databases. In other words, Tbs's (primary) entities generate tables in newDragon and all the other databases.
For instance, Entity A belongs to the primary data source and Entity B belongs to the properties data source, but the framework generates tables A and B in the primary database, the newDragon database, and the other two databases.
Update 2018/06/01 - 1
Although the framework generates all entities in all databases, I can still access tables from the right database. All my web application's functionality works very well. This is very odd, isn't it?
I guess my configuration is fine, since there are no problems when my application accesses the databases (such as reading from the wrong database and getting empty results, or inserting data into the wrong database). Probably something else causes this generate-all-tables-in-all-databases problem.
Based on the configuration you provided, CRUD operations against the right database shouldn't be a problem. But for generating tables in the right database, you may want to check whether the configuration picks up the entity/package names correctly.
Each LocalContainerEntityManagerFactoryBean is set with a unique package class, so the framework scans the entities under that package and generates tables in the target data source accordingly; however, there is one situation in which packagesToScan is changed. If you have an @EntityScan annotation, it overrides packagesToScan on all defined LocalContainerEntityManagerFactoryBeans; reference code from EntityScanRegistrar.java:
@Override
public Object postProcessBeforeInitialization(Object bean, String beanName) throws BeansException {
    if (bean instanceof LocalContainerEntityManagerFactoryBean) {
        LocalContainerEntityManagerFactoryBean factoryBean = (LocalContainerEntityManagerFactoryBean) bean;
        factoryBean.setPackagesToScan(this.packagesToScan);
        this.processed = true;
    }
    return bean;
}
As a result, even though you have provided each LocalContainerEntityManagerFactoryBean with a unique package class, the final result may still be overridden by the framework if you have @EntityScan somewhere in your application. Your configuration looks OK to me, so try to find and reconcile the package names between @EntityScan and the LocalContainerEntityManagerFactoryBeans first; that should solve the issue.
reference: https://github.com/spring-projects/spring-boot/issues/6830
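A minimal sketch of what to look for, using the class names from the question; the @EntityScan shown here is hypothetical, since the original application class was not posted:
// If the application class (or any @Configuration class) carries something like this,
// it overrides the packages configured on every LocalContainerEntityManagerFactoryBean:
@SpringBootApplication
@EntityScan(basePackageClasses = { _TbsEntityBasePackage.class, _PropertiesEntityBasePackage.class })
public class Application { }

// Removing (or narrowing) @EntityScan lets each factory keep only its own entities:
return builder.dataSource(dataSource)
        .packages(_TbsEntityBasePackage.class)   // only the primary entities
        .persistenceUnit("primaryPersistenceUnit")
        .build();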
I have two projects, one with DAO classes and models and another with a REST controller.
Project A: DAO classes + models
Project B: REST controller
Project A
application.properties:
spring.abcDatasource.url=
spring.abcDatasource.username=
spring.abcDatasource.password=
spring.abcDatasource.driver-class-name=oracle.jdbc.driver.OracleDriver
spring.xyzDatasource.url=
spring.xyzDatasource.username=
spring.xyzDatasource.password=
spring.xyzDatasource.driver-class-name=oracle.jdbc.driver.OracleDriver
spring.datasource.initialize=false
DBConfiguration.java
@Configuration
public class DBConfiguration {

    @Primary
    @Bean(name = "abcDS")
    @ConfigurationProperties(prefix = "spring.abcDatasource")
    public DataSource abcDS() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "abcJdbc")
    public JdbcTemplate abcJdbcTemplate(@Qualifier("abcDS") DataSource abcDS) {
        return new JdbcTemplate(abcDS);
    }

    @Bean(name = "xyzDS")
    @ConfigurationProperties(prefix = "spring.xyzDatasource")
    public DataSource xyzDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "xyzJdbc")
    public JdbcTemplate ebsJdbcTemplate(@Qualifier("xyzDS") DataSource xyzDatasource) {
        return new JdbcTemplate(xyzDatasource);
    }
}
AlphaDAO.java
@Repository
public class AlphaDAO {

    @Autowired
    @Qualifier("abcJdbc")
    private JdbcTemplate abcJdbc;

    @Autowired
    @Qualifier("xyzJdbc")
    private JdbcTemplate xyzJdbc;

    SqlParameterSource namedParameters;

    public Collection<Alpha> findAll(String owner) {
        String sql = "SELECT * from alpha where OWNER in (:owner)";
        NamedParameterJdbcTemplate namedParameterJdbcTemplate = new NamedParameterJdbcTemplate(abcJdbc.getDataSource());
        namedParameters = new MapSqlParameterSource("owner", owner);
        List<Alpha> list = namedParameterJdbcTemplate.query(sql, namedParameters,
                new BeanPropertyRowMapper(Alpha.class));
        return list;
    }
Project B (REST controller):
AlphaServiceApplication.java
@SpringBootApplication
@EnableAutoConfiguration(exclude = { DataSourceAutoConfiguration.class })
public class AlphaServiceApplication extends SpringBootServletInitializer implements WebApplicationInitializer {

    @Override
    protected SpringApplicationBuilder configure(SpringApplicationBuilder builder) {
        return builder.sources(AlphaServiceApplication.class);
    }

    public static void main(String[] args) {
        SpringApplication.run(AlphaServiceApplication.class, args);
    }
}
AlphaServiceController.java
@RestController
public class AlphaServiceController {

    private static final Logger logger = LoggerFactory.getLogger(AlphaServiceController.class);

    @Autowired
    AlphaDAO dao;

    @CrossOrigin(origins = "http://localhost:4200")
    @RequestMapping("/alpha")
    public Collection<Alpha> index(@RequestBody String owner) {
        return dao.findAll(owner);
    }
If I try to run the REST controller, I get the following error:
APPLICATION FAILED TO START
Description:
Field dao in com.xyz.web.wip.AlphaService.AlphaServiceController required a bean of type 'com.xyz.comp.wip.alphaComp.dao.AlphaDAO' that could not be found.
Action:
Consider defining a bean of type 'com.xyz.comp.wip.alphaComp.dao.AlphaDAO' in your configuration.
Your AlphaDAO class doesn't make much sense: you are trying to autowire two fields, but you still have a constructor.
Spring can't build the object because there is no qualifier on the constructor.
You can do either constructor injection or field injection, but you shouldn't use both.
I would recommend constructor injection.
@Repository
public class AlphaDAO {

    private final JdbcTemplate abcJdbc;
    private final JdbcTemplate xyzJdbc;

    @Autowired
    public AlphaDAO(
            @Qualifier("abcJdbc") JdbcTemplate abcJdbc,
            @Qualifier("xyzJdbc") JdbcTemplate xyzJdbc) {
        this.abcJdbc = abcJdbc;
        this.xyzJdbc = xyzJdbc;
    }
Also remove your @Bean method from the controller.
Since the DAO classes and the REST controller are in different packages, adding scanBasePackages to the @SpringBootApplication annotation, one level up, worked fine.
AlphaServiceApplication.java
@SpringBootApplication(scanBasePackages = { "com.xyz" })
We have a Spring Boot Restful API that needs to get data from 2 different Elasticsearch instances (on different servers), 1 for "shared" data (with about 5 different indexes on it) and 1 for "private" data (with about 3 different indexes). Currently running against just the "private" data instance, everything is good. But we now need to get at the "shared" data now.
In our Spring Boot application, we have enabled Elasticsearch repositories like this
@SpringBootApplication
@EnableElasticsearchRepositories(basePackages = {
        "com.company.core.repositories", // <- private repos here...
        "com.company.api.repositories"   // <- shared repos here...
})
public class Application { //... }
Then we access the "private" data with an ElasticsearchRepository like:
package com.company.core.repositories;
public interface DocRepository extends ElasticsearchRepository<Doc, Integer> { ... }
In our endpoint, we have...
@RestController
@CrossOrigin
@RequestMapping("/v2/statuses/")
public class StatusEndpoint {

    @Resource
    private ElasticsearchTemplate template;

    @Autowired
    private DocRepository docRepository;

    @Autowired
    private Validator validator;

    //...
}
Now we want to add another repository like:
package com.company.api.repositories;
public interface LookupRepository extends ElasticsearchRepository<Lookup, Integer> { ... }
Then in our API layer we would add an auto-wired instance...
@Autowired
private LookupRepository lookupRepo;
We were thinking that we could define multiple beans with different names, but how do we associate each of the "elasticsearchTemplate" beans with the different ElasticsearchRepository instances that need them? Also, how do we associate the "private" bean/configuration with injected instances of
@Resource
private ElasticsearchTemplate template;
where we need to use it natively?
You can solve this with 2 unique Elasticsearch configuration beans and an @Resource(name="XXX") annotation for the template injection in your StatusEndpoint controller.
If you segregate your repositories into different packages depending on which Elasticsearch cluster they should use, you can associate them with different configurations using the @EnableElasticsearchRepositories annotation.
For example:
If you have these packages and classes:
com.company.data.repositories.private.YourPrivateRepository
com.company.data.repositories.shared.YourSharedRepository
And then these Configurations:
@Configuration
@EnableElasticsearchRepositories(
        basePackages = {"com.company.data.repositories.private"},
        elasticsearchTemplateRef = "privateElasticsearchTemplate")
public class PrivateElasticsearchConfiguration {

    @Bean(name = "privateElasticsearchTemplate")
    public ElasticsearchTemplate privateTemplate() {
        //code to create connection to private ES cluster
    }
}

@Configuration
@EnableElasticsearchRepositories(
        basePackages = {"com.company.data.repositories.shared"},
        elasticsearchTemplateRef = "sharedElasticsearchTemplate")
public class SharedElasticsearchConfiguration {

    @Bean(name = "sharedElasticsearchTemplate")
    public ElasticsearchTemplate sharedTemplate() {
        //code to create connection to shared ES cluster
    }
}
Because of the elasticsearchTemplateRef parameter in the @EnableElasticsearchRepositories annotation, the Spring Data code that implements the repositories will use the specified template for the repositories in the basePackages list.
For the StatusEndpoint portion, you would just provide your @Resource annotation with the correct template bean name. Your StatusEndpoint would look like this:
@RestController
@CrossOrigin
@RequestMapping("/v2/statuses/")
public class StatusEndpoint {

    @Resource(name = "privateElasticsearchTemplate")
    private ElasticsearchTemplate template;

    @Autowired
    private DocRepository docRepository;

    @Autowired
    private Validator validator;

    //...
}
There might be multiple ways to do this. Here is one that takes advantage of the @Bean name and the @Resource name.
@Configuration
public class MyElasticConfig {

    @Bean //this is your private template
    public ElasticsearchTemplate template() {
        //construct your template
        return template;
    }

    @Bean //this is your public template
    public ElasticsearchTemplate publicTemplate() {
        //construct your template
        return template;
    }
}
then you can get them like this...
@Resource
private ElasticsearchTemplate template;

@Resource
private ElasticsearchTemplate publicTemplate;
or
@Resource(name = "template")
private ElasticsearchTemplate anyName;

@Resource(name = "publicTemplate")
private ElasticsearchTemplate anyOtherName;
You can also name your @Beans directly instead of relying on the @Bean method's name.
@Bean(name = "template")
public ElasticsearchTemplate myPrivateTemplate() {
    //construct your template
    return template;
}

@Bean(name = "publicTemplate")
public ElasticsearchTemplate myPubTemplate() {
    //construct your template
    return template;
}
Check out these wonderful resources on the topic.
SPRING INJECTION WITH @RESOURCE, @AUTOWIRED AND @INJECT
Bean Annotation Type
Autowired vs Resource
I'm trying to replace my old:
@Component
public interface MyEntityRepository extends JpaRepository<MyEntity, Integer> {
    @QueryHints({@QueryHint(name = CACHEABLE, value = "true")})
    MyEntity findByName(String name);
}
by this:
@Component
public interface MyEntityRepository extends JpaRepository<MyEntity, Integer> {
    @Cacheable(value = "entities")
    MyEntity findByName(String name);
}
Because I want to use advanced caching features like no caching of null values, etc.
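For example, I'd like to be able to write something like this (using the unless attribute, which takes a SpEL expression):
@Cacheable(value = "entities", unless = "#result == null") // do not cache when the query returns null
MyEntity findByName(String name);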
To do so, I followed the Spring caching tutorial: https://spring.io/guides/gs/caching/
If I don't annotate my Application.java, caching simply doesn't work.
But if I add @EnableCaching and a CacheManager bean:
package my.application.config;

@EnableWebMvc
@ComponentScan(basePackages = {"my.application"})
@Configuration
@EnableCaching
public class Application extends WebMvcConfigurerAdapter {

    @Bean
    public CacheManager cacheManager() {
        return new ConcurrentMapCacheManager("entities");
    }

    // ...
}
I get the following error at startup:
java.lang.IllegalStateException: No CacheResolver specified, and no bean of type CacheManager found. Register a CacheManager bean or remove the #EnableCaching annotation from your configuration
I get the same error if I replace my CacheManager bean with a CacheResolver bean like:
@Bean
public CacheResolver cacheResolver() {
    return new SimpleCacheResolver(new ConcurrentMapCacheManager("entities"));
}
Am I missing something?
@herau You were right, I had to name the bean!
The problem was that there was another "cacheManager" bean, so in the end I didn't annotate Application and instead created a configuration like this:
@EnableCaching
@Configuration
public class CacheConf {

    @Bean(name = "springCM")
    public CacheManager cacheManager() {
        return new ConcurrentMapCacheManager("entities");
    }
}
in MyEntityRepository:
@Cacheable(value = "entities", cacheManager = "springCM")
MyEntity findByName(String name);
In my case the Spring Boot library was old and there was no easy way to upgrade it, so I used the EhCache 2 version, and it worked in my application. Here is a project I found useful to refer to: https://github.com/TechPrimers/spring-ehcache-example/blob/master/src/main/resources/ehcache.xml
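A minimal sketch of wiring EhCache 2 into Spring's cache abstraction, assuming spring-context-support and an ehcache.xml on the classpath; the class and bean names here are illustrative:
// imports: org.springframework.cache.CacheManager,
//          org.springframework.cache.ehcache.EhCacheCacheManager,
//          org.springframework.cache.ehcache.EhCacheManagerFactoryBean,
//          org.springframework.core.io.ClassPathResource
@Configuration
@EnableCaching
public class EhCacheConfig {

    @Bean
    public EhCacheManagerFactoryBean ehCacheManagerFactory() {
        EhCacheManagerFactoryBean factory = new EhCacheManagerFactoryBean();
        factory.setConfigLocation(new ClassPathResource("ehcache.xml")); // cache regions defined in ehcache.xml
        factory.setShared(true);
        return factory;
    }

    @Bean
    public CacheManager cacheManager(EhCacheManagerFactoryBean factory) {
        // adapts the native EhCache manager to Spring's CacheManager abstraction
        return new EhCacheCacheManager(factory.getObject());
    }
}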
I have a Spring Boot app for which I have configured two data sources. So far I've configured the data sources in my Application class (annotated with @EnableAutoConfiguration):
@Bean
@Primary
@ConfigurationProperties(prefix = "datasource.db1")
public DataSource dataSource() {
    return DataSourceBuilder.create().build();
}

@Bean
@ConfigurationProperties(prefix = "datasource.db2")
public DataSource secondaryDataSource() {
    return DataSourceBuilder.create().build();
}
I also added the configuration values to application.properties:
datasource.db1.url=...
...
datasource.db2.url=...
...
Since db1 is the @Primary data source, it is chosen by default. How do I tell an interface extending JpaRepository that it should use db2 instead?
UPDATE: to clarify, my repository is an interface.
You can get the bean associated with the secondary data source from the application context.
For example, in Application.java (I'm also using Spring Boot) you define:
@Bean
@ConfigurationProperties(prefix = "datasource.secondary")
public DataSource secondaryDataSource() {
    return DataSourceBuilder.create().build();
}
and in a service (here for calling a stored procedure) you have:
@Service
public class EngineImpl implements EngineDao {

    private SetScartiProcedure setScarti;

    @Autowired
    public void init(ApplicationContext ctx) {
        DataSource dataSource = (DataSource) ctx.getBean("secondaryDataSource");
        this.setScarti = new SetScartiProcedure(dataSource);
    }

    public class SetScartiProcedure extends StoredProcedure {
        ...
    }
Based on this, you can define several DataSources this way:
@Bean
public LocalContainerEntityManagerFactoryBean customerEntityManagerFactory(
        EntityManagerFactoryBuilder builder) {
    return builder
            .dataSource(customerDataSource())
            .packages(Customer.class)
            .persistenceUnit("customers")
            .build();
}

@Bean
public LocalContainerEntityManagerFactoryBean orderEntityManagerFactory(
        EntityManagerFactoryBuilder builder) {
    return builder
            .dataSource(orderDataSource())
            .packages(Order.class)
            .persistenceUnit("orders")
            .build();
}
and then bind each of them to the different classes that each one manages:
@Configuration
@EnableJpaRepositories(basePackageClasses = Customer.class,
        entityManagerFactoryRef = "customerEntityManagerFactory")
public class CustomerConfiguration {
    ...
}

@Configuration
@EnableJpaRepositories(basePackageClasses = Order.class,
        entityManagerFactoryRef = "orderEntityManagerFactory")
public class OrderConfiguration {
    ...
}
The repositories will know which database to use through the DataSource that was bound to their configuration class.
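The snippets above reference customerDataSource() and orderDataSource() without defining them; a minimal sketch of those beans, reusing the datasource.db1/datasource.db2 prefixes from the question. The transaction manager shown is an assumption (each persistence unit normally gets its own, referenced via transactionManagerRef in @EnableJpaRepositories):
@Bean
@Primary
@ConfigurationProperties(prefix = "datasource.db1")
public DataSource customerDataSource() {
    return DataSourceBuilder.create().build();
}

@Bean
@ConfigurationProperties(prefix = "datasource.db2")
public DataSource orderDataSource() {
    return DataSourceBuilder.create().build();
}

// One JpaTransactionManager per persistence unit, bound to the matching EntityManagerFactory
@Bean
public PlatformTransactionManager orderTransactionManager(
        @Qualifier("orderEntityManagerFactory") EntityManagerFactory emf) {
    return new JpaTransactionManager(emf);
}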