I have a Spring Boot web service that serves different types of data, where each type resides in its own database. As new types are added, I don't want to have to configure the datasource for each type in the code. This is what I've got so far:
Data types and their database connections are defined in application.yml:
datatypes:
  someType:
    url: jdbc:postgresql://mydatabase.com:5432/some_type
    username: user1
    password: pass1
  otherType:
    url: jdbc:postgresql://mydatabase.com:5432/other_type
    username: user2
    password: pass2
A configuration class reads the properties and creates DataSources and JdbcTemplates:
@Configuration
@EnableConfigurationProperties
public class JdbcConfig {

    @Bean
    @ConfigurationProperties(prefix = "datatypes")
    public Map<String, DataSourceProperties> databaseConfig() {
        return new HashMap<>();
    }

    @Bean
    public Map<String, NamedParameterJdbcTemplate> jdbcTemplateMap() {
        Map<String, DataSourceProperties> databaseConfig = databaseConfig();
        Map<String, DataSource> dataSourceMap = databaseConfig.entrySet().stream()
                .collect(Collectors.toMap(Map.Entry::getKey, entry ->
                        entry.getValue().initializeDataSourceBuilder().build()));
        return dataSourceMap.entrySet().stream()
                .collect(Collectors.toMap(Map.Entry::getKey, entry ->
                        new NamedParameterJdbcTemplate(entry.getValue())));
    }
}
And finally a repository that fetches data based on its type:
@Repository
public class MyRepository {

    @Autowired
    private Map<String, NamedParameterJdbcTemplate> jdbcTemplateMap;

    public Item getItem(String dataType, String id) {
        String sql = "select * from item where id = :id";
        return jdbcTemplateMap.get(dataType)
                .queryForObject(sql, Map.of("id", id), new ItemRowMapper());
    }
}
When Spring tries to autowire jdbcTemplateMap in the repository it can't find any JDBC template beans, so the auto-configuration kicks in but fails because the datasource properties are not where it expects them in the YAML. This can be fixed by disabling the auto-configuration: @SpringBootApplication(exclude = DataSourceAutoConfiguration.class)
This setup almost works. However, since the DataSource instances are not registered in the application context, I miss out on some other auto-configuration magic, such as the actuator health check. I tried registering them myself by adding this to JdbcConfig and calling registerDataSource() from the jdbcTemplateMap() bean method:
@Autowired
private ApplicationContext applicationContext;

private void registerDataSource(String beanName, DataSource dataSource) {
    ConfigurableApplicationContext context =
            (ConfigurableApplicationContext) applicationContext;
    ConfigurableListableBeanFactory beanFactory = context.getBeanFactory();
    beanFactory.registerSingleton(beanName, dataSource);
}
With this in place I should be able to enable the datasource auto-configuration again, but it runs before jdbcTemplateMap() has a chance to execute, and the actuator won't pick the data sources up either. Can this be fixed?
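One possible way around the ordering problem, sketched here under the assumption of Spring Boot 2.x, is to register each DataSource as a bean definition before the configuration classes and auto-configuration are processed, by binding the datatypes prefix in a BeanDefinitionRegistryPostProcessor that is added to the SpringApplication directly. The class name DataTypeDataSourceRegistrar, the Application main class, and the <name>DataSource bean-naming convention are illustrative, not part of the original setup:

import java.util.Collections;
import java.util.Map;
import javax.sql.DataSource;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
import org.springframework.beans.factory.support.BeanDefinitionBuilder;
import org.springframework.beans.factory.support.BeanDefinitionRegistry;
import org.springframework.beans.factory.support.BeanDefinitionRegistryPostProcessor;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceProperties;
import org.springframework.boot.context.properties.bind.Bindable;
import org.springframework.boot.context.properties.bind.Binder;
import org.springframework.core.env.Environment;

public class DataTypeDataSourceRegistrar implements BeanDefinitionRegistryPostProcessor {

    private final Environment environment;

    public DataTypeDataSourceRegistrar(Environment environment) {
        this.environment = environment;
    }

    @Override
    public void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry registry) {
        // Bind the "datatypes" prefix from application.yml to a map of DataSourceProperties.
        Map<String, DataSourceProperties> configs = Binder.get(environment)
                .bind("datatypes", Bindable.mapOf(String.class, DataSourceProperties.class))
                .orElse(Collections.emptyMap());

        // Register one DataSource bean definition per data type, e.g. "someTypeDataSource".
        configs.forEach((name, props) -> registry.registerBeanDefinition(
                name + "DataSource",
                BeanDefinitionBuilder
                        .genericBeanDefinition(DataSource.class, () -> props.initializeDataSourceBuilder().build())
                        .getBeanDefinition()));
    }

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) {
        // nothing to do; the registration happens above
    }
}

// In the main class (name illustrative), add the post-processor before the context
// refreshes so it runs ahead of the auto-configuration's bean-presence checks:
public class Application {
    public static void main(String[] args) {
        SpringApplication app = new SpringApplication(Application.class);
        app.addInitializers(ctx ->
                ctx.addBeanFactoryPostProcessor(new DataTypeDataSourceRegistrar(ctx.getEnvironment())));
        app.run(args);
    }
}

With the DataSource definitions present before the auto-configuration conditions are evaluated, DataSourceAutoConfiguration should back off on its own, the actuator's datasource health contributors can in principle see the beans, and jdbcTemplateMap() could be built from an injected Map<String, DataSource> instead of calling databaseConfig() directly. This is a sketch of one approach, not a drop-in fix.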
Related
I have two profiles, uat-nyc and uat-ldn.
The uat-nyc datasource is Oracle and the uat-ldn datasource is MySQL.
This configuration is set up in application-uat-nyc.yml and application-uat-ldn.yml.
I have the configuration class below:
@Profile({"uat-nyc", "uat-ldn"})
@Configuration
@EnableConfigurationProperties(DataSourceProperties.class)
public class DataSourceConfig {

    private DataSourceProperties properties; // server, username, password are set here

    DataSource getDataSource() {
        // gets datasource based on profiles
    }
}
If my application is run with spring.profiles.active: uat-nyc,uat-ldn, will it create two datasources,
one with the configuration from uat-nyc and another from uat-ldn?
I have the function below. In it I get data from a third-party service and, depending on whether a product belongs to ldn or nyc, I need to persist it into the ldn or nyc database. How can I make the if/else section dynamic? How can I get the respective datasources, i.e. ldn and nyc, in the if/else section of the getProducts method below?
class Product {
    String name;
    int price;
    String region;
}

@Component
class ProductLoader {

    JdbcTemplate jdbcTemplate;

    public ProductLoader(DataSource ds) {
        jdbcTemplate = new JdbcTemplate(ds);
    }

    public void getProducts() {
        List<Product> products = // rest service to get products
        for (Product product : products) {
            if (product.getRegion().equals("LONDON")) {
                // write to LONDON database
                // How can I get the ldn datasource here?
            } else if (product.getRegion().equals("NewYork")) {
                // write to NewYork database
                // How can I get the NewYork datasource here?
            } else {
                // Unknown location
            }
        }
    }
}
Questions:
1. If my application is run with spring.profiles.active: uat-nyc,uat-ldn, will it create two datasources?
2. How can I inject the datasources dynamically into ProductLoader and use the specific datasource for ldn and nyc?
First you need to tell Spring that the two datasources will be managed by the Spring context: declare them with @Bean in a @Configuration class, then use @Autowired to inject the Spring-managed beans.
You can use @Qualifier to choose and qualify your beans.
@Configuration
public class ConfigDataSource {

    // example for a DataSource configured in code
    @Bean("dataSourceWithPropOnCode") // this name will qualify on @Autowired
    public DataSource dataSourceWithPropOnCode() {
        return DataSourceBuilder.create().url("").password("").username("").driverClassName("").build();
    }

    // example for a DataSource configured from properties
    @Bean("dataSourceWithPropFromProperties") // this name will qualify on @Autowired
    @ConfigurationProperties(prefix = "spring.datasource.yourname-datasource") // the prefix for this datasource in the .properties settings
    public DataSource dataSourcePostgres() {
        return DataSourceBuilder.create().build();
    }

    // examples for JdbcTemplate
    @Bean("jdbcTemplateWithPropFromProperties") // this name will qualify on @Autowired
    public JdbcTemplate jdbcTemplatePostgres(@Qualifier("dataSourceWithPropFromProperties") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    @Bean("jdbcTemplateWithPropOnCode") // this name will qualify on @Autowired
    public JdbcTemplate jdbcTemplateWithPropOnCode(@Qualifier("dataSourceWithPropOnCode") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}
Settings in the .properties file:
spring.datasource.yourname-datasource.url=...
spring.datasource.yourname-datasource.jdbcUrl=${spring.datasource.yourname-datasource.url}
spring.datasource.yourname-datasource.username=user
spring.datasource.yourname-datasource.password=pass
spring.datasource.yourname-datasource.driver-class-name=your.driver
Usage in services:

@Qualifier("jdbcTemplateWithPropFromProperties")
@Autowired
private JdbcTemplate jdbcTemplate1;

@Qualifier("jdbcTemplateWithPropOnCode")
@Autowired
private JdbcTemplate jdbcTemplate2;

@Qualifier("dataSourceWithPropOnCode")
@Autowired
private DataSource dataSource1;

private DataSource dataSource2;

// or a constructor, if you prefer (SomeService stands for the enclosing service class)
public SomeService(@Qualifier("dataSourceWithPropFromProperties") DataSource dataSource2) {
    this.dataSource2 = dataSource2;
}
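For the region-specific part of the question (the if/else inside getProducts), one option is to key the qualified templates by region and look them up per product. This is a sketch that assumes a JdbcTemplate bean exists per region and that Product exposes the usual getters; the bean names ldnJdbcTemplate/nycJdbcTemplate, the class name, and the SQL are made up for illustration:

import java.util.List;
import java.util.Map;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.stereotype.Component;

@Component
class RegionAwareProductLoader {

    // region value carried on the product -> template for that region's database
    private final Map<String, JdbcTemplate> templatesByRegion;

    RegionAwareProductLoader(@Qualifier("ldnJdbcTemplate") JdbcTemplate ldnTemplate,
                             @Qualifier("nycJdbcTemplate") JdbcTemplate nycTemplate) {
        this.templatesByRegion = Map.of(
                "LONDON", ldnTemplate,
                "NewYork", nycTemplate);
    }

    public void persist(List<Product> products) {
        for (Product product : products) {
            JdbcTemplate template = templatesByRegion.get(product.getRegion());
            if (template == null) {
                // unknown location
                continue;
            }
            // illustrative insert; table and columns are assumptions
            template.update("insert into product (name, price, region) values (?, ?, ?)",
                    product.getName(), product.getPrice(), product.getRegion());
        }
    }
}

Adding a new region then only requires wiring another template into the map rather than extending an if/else chain.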
Please help me.
I use multiple data sources in my project.
Data source properties:
spring.datasource.url=jdbc:sqlserver://localhost:1433;databaseName=db
spring.datasource.username=xxxxx
spring.datasource.password=xxxxx
spring.datasource.driverClassName=com.microsoft.sqlserver.jdbc.SQLServerDriver
spring.datasource2.url=jdbc:mysql://localhost:3306/db2
spring.datasource2.username=xxxx
spring.datasource2.password=xxx
spring.datasource2.driver-class-name=com.mysql.cj.jdbc.Driver
config class:
@Configuration
@EnableJdbcRepositories(
        jdbcOperationsRef = "mysqlNamedParameterJdbcOperations",
        basePackages = "com.example.demo.mysqlModels"
)
public class Config extends AbstractJdbcConfiguration {

    @Bean("mysqlDataSource")
    @ConfigurationProperties(prefix = "spring.datasource2")
    public DataSource mysqlDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "mysqlNamedParameterJdbcOperations")
    NamedParameterJdbcOperations mysqlNamedParameterJdbcOperations(@Qualifier("mysqlDataSource") DataSource mysqlDataSource) {
        return new NamedParameterJdbcTemplate(mysqlDataSource);
    }
}
@Configuration
@EnableJdbcRepositories(
        jdbcOperationsRef = "mssqlNamedParameterJdbcOperations",
        basePackages = "com.example.demo.mssqlModels"
)
public class Config2 extends AbstractJdbcConfiguration {

    @Bean("mssqlDataSource")
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource mssqlDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "mssqlNamedParameterJdbcOperations")
    NamedParameterJdbcOperations mssqlNamedParameterJdbcOperations(@Qualifier("mssqlDataSource") DataSource mssqlDataSource) {
        return new NamedParameterJdbcTemplate(mssqlDataSource);
    }
}
repository in com.example.demo.mssqlModels:
public interface MssqlRepository extends PagingAndSortingRepository<MyEntity, Integer> {}
repository in com.example.demo.mysqlModels:
public interface MysqlRepository extends PagingAndSortingRepository<MyEntity, Integer> {}
my service:
@Slf4j
@Service
public class MyService {

    @Autowired
    private MssqlRepository mssqlRepository;

    @Autowired
    private MysqlRepository mysqlRepository;

    @PostConstruct
    public void init() {
        log.info("mssql result {}", mssqlRepository.findAll());
        log.info("mysql result {}", mysqlRepository.findAll());
    }
}
But the result is the same: both repositories read data from the MySQL datasource.
Thanks.
You might be interested in looking at a question I raised recently here regarding two data sources, each applied to a different repository.
1. In your configuration classes you should also create two TransactionManagers with unique names.
2. Annotate each repository with @Transactional(transactionManager = "transaction manager name"), replacing the name with the appropriate one.
3. You'll probably need to override the default methods such as saveAll() with the same annotation as in (2).
However, as per my question, I found that an incorrect data source is sometimes used (I've since found that making the Postgres classes the primary ones resolved my problem, but I don't know why this worked).
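A minimal sketch of points 1 and 2, reusing the mysqlDataSource/mssqlDataSource beans from the question; the transaction manager bean names are just examples (imports: org.springframework.transaction.PlatformTransactionManager, org.springframework.jdbc.datasource.DataSourceTransactionManager, org.springframework.transaction.annotation.Transactional):

// In Config (the MySQL configuration class):
@Bean(name = "mysqlTransactionManager")
PlatformTransactionManager mysqlTransactionManager(@Qualifier("mysqlDataSource") DataSource mysqlDataSource) {
    return new DataSourceTransactionManager(mysqlDataSource);
}

// In Config2 (the SQL Server configuration class):
@Bean(name = "mssqlTransactionManager")
PlatformTransactionManager mssqlTransactionManager(@Qualifier("mssqlDataSource") DataSource mssqlDataSource) {
    return new DataSourceTransactionManager(mssqlDataSource);
}

// On each repository, point @Transactional at the matching manager:
@Transactional(transactionManager = "mysqlTransactionManager")
public interface MysqlRepository extends PagingAndSortingRepository<MyEntity, Integer> {}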
I have a data source configuration class that looks as follows, with separate DataSource beans for testing and non-testing environments using jOOQ. In my code I do not use DSLContext.transaction(ctx -> {...}) but rather mark the method as transactional, so that jOOQ defers to Spring's declarative transactions for transaction handling. I am using Spring 4.3.7.RELEASE.
I have the following issue:
During testing (JUnit), @Transactional works as expected. A single method is transactional no matter how many times I use the DSLContext's store() method, and a RuntimeException triggers a rollback of the entire transaction.
During actual production runtime, @Transactional is completely ignored. A method is no longer transactional, and TransactionSynchronizationManager.getResourceMap() holds two separate values: one pointing to my connection pool (which is not transactional) and one pointing to the TransactionAwareDataSourceProxy.
In this case, I would have expected only a single resource of type TransactionAwareDataSourceProxy which wraps my DB CP.
After much trial and error, using the second set of configuration changes I made (noted below with "AFTER"), @Transactional works correctly as expected even during runtime, though TransactionSynchronizationManager.getResourceMap() holds the following value:
In this case, my DataSourceTransactionManager seems to not even know the TransactionAwareDataSourceProxy (most likely due to my passing it the simple DataSource, and not the proxy object), which seems to completely 'skip' the proxy anyway.
My question is: the initial configuration that I had seemed correct, but did not work. The proposed 'fix' works, but IMO should not work at all (since the transaction manager does not seem to be aware of the TransactionAwareDataSourceProxy).
What is going on here? Is there a cleaner way to fix this issue?
BEFORE (not transactional during runtime)
@Configuration
@EnableTransactionManagement
@RefreshScope
@Slf4j
public class DataSourceConfig {

    @Bean
    @Primary
    public DSLContext dslContext(org.jooq.Configuration configuration) throws SQLException {
        return new DefaultDSLContext(configuration);
    }

    @Bean
    @Primary
    public org.jooq.Configuration defaultConfiguration(DataSourceConnectionProvider dataSourceConnectionProvider) {
        org.jooq.Configuration configuration = new DefaultConfiguration()
                .derive(dataSourceConnectionProvider)
                .derive(SQLDialect.POSTGRES_9_5);
        configuration.set(new DeleteOrUpdateWithoutWhereListener());
        return configuration;
    }

    @Bean
    public DataSourceTransactionManager transactionManager(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

    @Bean
    public DataSourceConnectionProvider dataSourceConnectionProvider(DataSource dataSource) {
        return new DataSourceConnectionProvider(dataSource);
    }
    @Configuration
    @ConditionalOnClass(EmbeddedPostgres.class)
    static class EmbeddedDataSourceConfig {

        @Value("${spring.jdbc.port}")
        private int dbPort;

        @Bean(destroyMethod = "close")
        public EmbeddedPostgres embeddedPostgres() throws Exception {
            EmbeddedPostgres embeddedPostgres = EmbeddedPostgresHelper.startDatabase(dbPort);
            return embeddedPostgres;
        }

        @Bean
        @Primary
        public DataSource dataSource(EmbeddedPostgres embeddedPostgres) throws Exception {
            DataSource dataSource = embeddedPostgres.getPostgresDatabase();
            return new TransactionAwareDataSourceProxy(dataSource);
        }
    }
    @Configuration
    @ConditionalOnMissingClass("com.opentable.db.postgres.embedded.EmbeddedPostgres")
    @RefreshScope
    static class DefaultDataSourceConfig {

        @Value("${spring.jdbc.url}")
        private String url;

        @Value("${spring.jdbc.username}")
        private String username;

        @Value("${spring.jdbc.password}")
        private String password;

        @Value("${spring.jdbc.driverClass}")
        private String driverClass;

        @Value("${spring.jdbc.MaximumPoolSize}")
        private Integer maxPoolSize;

        @Bean
        @Primary
        @RefreshScope
        public DataSource dataSource() {
            log.debug("Connecting to datasource: {}", url);
            HikariConfig hikariConfig = buildPool();
            DataSource dataSource = new HikariDataSource(hikariConfig);
            return new TransactionAwareDataSourceProxy(dataSource);
        }

        private HikariConfig buildPool() {
            HikariConfig config = new HikariConfig();
            config.setJdbcUrl(url);
            config.setUsername(username);
            config.setPassword(password);
            config.setDriverClassName(driverClass);
            config.setConnectionTestQuery("SELECT 1");
            config.setMaximumPoolSize(maxPoolSize);
            return config;
        }
    }
}
AFTER (transactional during runtime, as expected, all non-listed beans identical to above)
@Configuration
@EnableTransactionManagement
@RefreshScope
@Slf4j
public class DataSourceConfig {

    @Bean
    public DataSourceConnectionProvider dataSourceConnectionProvider(TransactionAwareDataSourceProxy dataSourceProxy) {
        return new DataSourceConnectionProvider(dataSourceProxy);
    }

    @Bean
    public TransactionAwareDataSourceProxy transactionAwareDataSourceProxy(DataSource dataSource) {
        return new TransactionAwareDataSourceProxy(dataSource);
    }

    @Configuration
    @ConditionalOnMissingClass("com.opentable.db.postgres.embedded.EmbeddedPostgres")
    @RefreshScope
    static class DefaultDataSourceConfig {

        @Value("${spring.jdbc.url}")
        private String url;

        @Value("${spring.jdbc.username}")
        private String username;

        @Value("${spring.jdbc.password}")
        private String password;

        @Value("${spring.jdbc.driverClass}")
        private String driverClass;

        @Value("${spring.jdbc.MaximumPoolSize}")
        private Integer maxPoolSize;

        @Bean
        @Primary
        @RefreshScope
        public DataSource dataSource() {
            log.debug("Connecting to datasource: {}", url);
            HikariConfig hikariConfig = buildPoolConfig();
            DataSource dataSource = new HikariDataSource(hikariConfig);
            return dataSource; // not returning the proxy here
        }
    }
}
I'll turn my comments into an answer.
The transaction manager should NOT be aware of the proxy. From the documentation:
Note that the transaction manager, for example DataSourceTransactionManager, still needs to work with the underlying DataSource, not with this proxy.
The class TransactionAwareDataSourceProxy is a special-purpose class that is not needed in most cases. Anything that interfaces with your data source through the Spring framework infrastructure should NOT have the proxy in its chain of access. The proxy is intended for code that cannot interface with the Spring infrastructure, for example a third-party library that was already set up to work with JDBC and does not accept any of Spring's JDBC templates. This is stated in the same docs as above:
This proxy allows data access code to work with the plain JDBC API and still participate in Spring-managed transactions, similar to JDBC code in a J2EE/JTA environment. However, if possible, use Spring's DataSourceUtils, JdbcTemplate or JDBC operation objects to get transaction participation even without a proxy for the target DataSource, avoiding the need to define such a proxy in the first place.
If you do not have any code that needs to bypass the Spring framework then do not use the TransactionAwareDataSourceProxy at all. If you do have legacy code like this then you will need to do what you already configured in your second setup. You will need to create two beans, one which is the data source, and one which is the proxy. You should then give the data source to all of the Spring managed types and the proxy to the legacy types.
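Put concretely for the configuration in the question, that split looks roughly like this (a sketch that reuses the HikariConfig builder from the AFTER setup; everything Spring-managed gets the plain pool, only legacy JDBC code gets the proxy):

@Bean
@Primary
public DataSource dataSource() {
    // the real pool: hand this to Spring-managed infrastructure
    // (jOOQ's DataSourceConnectionProvider, DataSourceTransactionManager, JdbcTemplate, ...)
    return new HikariDataSource(buildPoolConfig());
}

@Bean
public TransactionAwareDataSourceProxy transactionAwareDataSourceProxy(DataSource dataSource) {
    // only for third-party/legacy code that talks plain JDBC outside Spring
    return new TransactionAwareDataSourceProxy(dataSource);
}

@Bean
public DataSourceTransactionManager transactionManager(DataSource dataSource) {
    // the transaction manager works with the underlying DataSource, never the proxy
    return new DataSourceTransactionManager(dataSource);
}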
I'm developing a Spring 4.0 REST application and it works fine, but when I try to run some tests it fails to read property values from placeholders (they work fine if I run the app normally).
Some of the properties of the app come from files and others from the database, so I have configured a PropertiesPropertySource to read them from the DB. I was unable to configure it via XML, so I did it in a @Configuration class:
@Configuration
public class AppConfig {

    private static final Logger logger = LoggerFactory.getLogger(AppConfig.class);

    @Inject
    private org.springframework.core.env.Environment env;

    @Autowired
    private DataSource dataSource;

    @PostConstruct
    public void initializeDatabasePropertySourceUsage() {
        MutablePropertySources propertySources = ((ConfigurableEnvironment) env).getPropertySources();
        try {
            DatabaseConfiguration databaseConfiguration = new DatabaseConfiguration(dataSource, "[TABLE]", "property", "value");
            CommonsConfigurationFactoryBean commonsConfigurationFactoryBean = new CommonsConfigurationFactoryBean(databaseConfiguration);
            Properties dbProps = (Properties) commonsConfigurationFactoryBean.getObject();
            PropertiesPropertySource dbPropertySource = new PropertiesPropertySource("dbPropertySource", dbProps);
            propertySources.addFirst(dbPropertySource);
        } catch (Exception e) {
            logger.error("Error during database properties setup: " + e.getMessage(), e);
            throw new RuntimeException(e);
        }
    }

    @Bean
    public static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }
}
In the test class I load the context config with this annotation:
@ContextConfiguration(classes = {TestConfig1.class, AppConfig.class, TestConfig2.class})
As I'm mixing XML and Java configurations, I had to create the TestConfigX classes, which load the XMLs with this annotation:
@ImportResource([PATH TO config xml])
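For context, a TestConfigX class here is just a thin Java wrapper around the corresponding XML file (the resource path is left as a placeholder, as above):

@Configuration
@ImportResource("[PATH TO config xml]")
public class TestConfig2 {
}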
TestConfig1 has the DataSource bean definition and it works fine.
AppConfig configures the PropertiesPropertySource to read the placeholders from the DB.
TestConfig2 configures the rest of the app, which uses the placeholders (@Value("${XXXX}")) but is unable to read the values: Could not resolve placeholder XXXX.
If I omit TestConfig2 I can use the placeholders from the DB in the test without problems, but obviously I'm not testing anything.
What am I doing wrong?
I'm trying to use Spring Boot to talk to a Mongo database.
It works using spring-boot-starter-data-mongodb and auto-configuring a default bean, which does allow my MongoRepository classes to talk to the DB.
However, I want to override the defaults. I could use application.properties, but I need to be able to pass the connection parameters as options on the command line when the application starts up.
I've tried changing the port to break it and I've added debug to the Mongo config, and it seems that whatever I do the default Spring config is used regardless. It's as if the @Configuration annotation is ignored.
I've tried various flavours of configuring the main application class (specifying the config location, adding @Configuration to the main class, with and without @SpringBootApplication ...), but here is where I am at the moment:
package somepackage;

@EnableAutoConfiguration
@ComponentScan
public class MyApplication {

    public static void main(String[] args) {
        ApplicationContext ctx = SpringApplication.run(MyApplication.class, args);
        // ....
    }
}

package somepackage.conf; // should be picked up by ComponentScan, no?

@Configuration
public class MongoConf {

    @Bean
    public MongoClientFactoryBean mongo() throws Exception {
        MongoClientFactoryBean mongo = new MongoClientFactoryBean();
        /* setting to silly values to try to prove it is trying to create connections
           using this bean - expected to see errors because it can't create a connection... */
        mongo.setHost("flibble");
        mongo.setPort(345);
        return mongo;
    }
}
You should actually use the built-in Spring Boot MongoDB starter features and the related auto-configuration through application properties. Custom host, port, password, etc. can and should be set via the dedicated Spring Boot MongoDB properties:
spring.data.mongodb.authentication-database= # Authentication database name.
spring.data.mongodb.database=test # Database name.
spring.data.mongodb.field-naming-strategy= # Fully qualified name of the FieldNamingStrategy to use.
spring.data.mongodb.grid-fs-database= # GridFS database name.
spring.data.mongodb.host=localhost # Mongo server host.
spring.data.mongodb.password= # Login password of the mongo server.
spring.data.mongodb.port=27017 # Mongo server port.
spring.data.mongodb.repositories.enabled=true # Enable Mongo repositories.
spring.data.mongodb.uri=mongodb://localhost/test # Mongo database URI. When set, host and port are ignored.
spring.data.mongodb.username= # Login user of the mongo server.
And a link to the full list of supported properties is here.
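Since the connection parameters need to come in as options on the command line, note that any of these properties can also be passed as command-line arguments at startup, and they override application.properties. For example (the jar name, host and port are just placeholders):

java -jar myapp.jar --spring.data.mongodb.host=somehost --spring.data.mongodb.port=27018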
In addition to RafalG's suggestion about MongoProperties, I combined that with the ApplicationArguments class and now I'm getting somewhere....
@Bean
@Primary
public MongoProperties mongoProperties(ApplicationArguments args) {
    MongoProperties props = new MongoProperties();
    String[] mongoHostAndPort = args.getSourceArgs()[3].split(":");
    props.setHost(mongoHostAndPort[0]);
    props.setPort(Integer.parseInt(mongoHostAndPort[1]));
    return props;
}

@Bean
public MongoClientFactoryBean mongo() {
    return new MongoClientFactoryBean();
}
Of course there's lots of error handling to add (nulls, non-ints etc) but hopefully if may help someone else.
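For completeness, the snippet above expects the host:port pair to arrive as the fourth raw command-line argument (args.getSourceArgs()[3]), i.e. an invocation along the lines of the following, where the argument values are placeholders:

java -jar myapp.jar arg0 arg1 arg2 somehost:27017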
@Configuration
@EnableAutoConfiguration(exclude = { EmbeddedMongoAutoConfiguration.class })
@Profile("!testing")
public class TestMongoConfig extends AbstractMongoConfiguration {

    private static final MongodStarter starter = MongodStarter.getDefaultInstance();

    private MongodExecutable _mongodExe;
    private MongodProcess _mongod;
    private MongoClient _mongo;

    @Value("${spring.data.mongodb.host}")
    private String host;

    @Value("${spring.data.mongodb.port}")
    private Integer port;

    @Override
    protected String getDatabaseName() {
        return "test";
    }

    @Bean
    public Mongo mongo() throws Exception {
        _mongodExe = starter.prepare(new MongodConfigBuilder()
                .version(Version.Main.PRODUCTION)
                .net(new Net(port, Network.localhostIsIPv6()))
                .build());
        _mongod = _mongodExe.start();
        return new MongoClient(host, port);
    }

    @Override
    public String getMappingBasePackage() {
        return "com.test.domain";
    }