Use @ConfigurationProperties over non-managed @Bean - java

I would like to benefit from @ConfigurationProperties' fantastic facilities without needing to expose the bean into my context. It is not a problem of @Primary and the like; I simply cannot expose another DataSource into the context. How can I achieve the following?
@ConfigurationProperties("com.non.exposed.datasource.hikari")
public DataSource privateHikariDatasource() {
    if (Objects.isNull(this.nonExposedDatasource)) {
        this.nonExposedDatasource = this.nonExposedDatasourceProperties.initializeDataSourceBuilder().build();
    }
    return this.nonExposedDatasource;
}
Thanks to the answer by @LppEdd, the final working solution is:
@Autowired
private Environment environment;

public DataSource privateHikariDatasource() {
    if (Objects.isNull(this.nonExposedDatasource)) {
        this.nonExposedDatasource = bindHikariProperties(this.nonExposedDatasourceProperties.initializeDataSourceBuilder().build());
    }
    return this.nonExposedDatasource;
}

// This does exactly the same as @ConfigurationProperties("com.non.exposed.datasource.hikari"),
// but without requiring the exposure of the DataSource in the context as a @Bean
private <T extends DataSource> T bindHikariProperties(final T instance) {
    return Binder.get(this.environment).bind("com.non.exposed.datasource.hikari", Bindable.ofInstance(instance)).get();
}
Then you can call this.privateHikariDatasource() internally wherever your other beans need it.
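For illustration, a consuming bean could use it like this (MyRepository is a hypothetical class, not part of the original code):
@Bean
public MyRepository myRepository() {
    // The DataSource itself never becomes a bean; only this consumer is exposed to the context.
    return new MyRepository(this.privateHikariDatasource());
}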
Great thanks to @LppEdd!

Given that this DataSource is private to a class, and that the containing class is (or can be) inside the Spring context, you can have a @ConfigurationProperties class
#ConfigurationProperties("com.foo.bar.datasource.hikari")
public class HikariConfiguration { ... }
Which, once registered via @EnableConfigurationProperties, is available for autowiring
@EnableConfigurationProperties(HikariConfiguration.class)
@SpringBootApplication
public class Application { ... }
And thus can be autowired in the containing class
@Component
class MyClass {
    private final HikariConfiguration hikariConfiguration;
    private DataSource springDatasource;

    MyClass(final HikariConfiguration hikariConfiguration) {
        this.hikariConfiguration = hikariConfiguration;
    }

    ...

    private DataSource privateSingletonDataSource() {
        if (Objects.isNull(this.springDatasource)) {
            this.springDatasource = buildDataSource(this.hikariConfiguration);
        }
        return this.springDatasource;
    }
}
buildDataSource will manually construct the DataSource instance.
Remember that you need to take care of synchronization when building the DataSource.
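A minimal sketch of what that could look like, assuming double-checked locking and Spring Boot's DataSourceBuilder (the HikariConfiguration getters are assumptions, not taken from the answer):
// Sketch only: lazily build the DataSource from the bound HikariConfiguration.
// The field must be volatile for double-checked locking to be safe.
private volatile DataSource springDatasource;

private DataSource privateSingletonDataSource() {
    DataSource local = this.springDatasource;
    if (local == null) {
        synchronized (this) {
            local = this.springDatasource;
            if (local == null) {
                local = buildDataSource(this.hikariConfiguration);
                this.springDatasource = local;
            }
        }
    }
    return local;
}

private DataSource buildDataSource(final HikariConfiguration config) {
    // Assumes HikariConfiguration exposes url/username/password getters.
    return DataSourceBuilder.create()
            .url(config.getUrl())
            .username(config.getUsername())
            .password(config.getPassword())
            .build();
}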
The bottom line is that you cannot reuse DataSourceProperties. You can't even extend it to change the properties' prefix; only a single instance of it can exist inside the context.
The best you can do is mimic what Spring does.
Having
com.non.exposed.datasource.hikari.url=testUrl
com.non.exposed.datasource.hikari.username=testUsername
com.non.exposed.datasource.hikari.password=testPassword
...
You can define a new @ConfigurationProperties class
@ConfigurationProperties("com.non.exposed.datasource")
public class NonExposedProperties {
    private final Map<String, String> hikari = new HashMap<>(8);

    public Map<String, String> getHikari() {
        return hikari;
    }
}
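For the map to be populated, the class still needs to be registered for binding, e.g. via @EnableConfigurationProperties, mirroring the earlier example (the Application class shown here is just an assumption about where the registration happens):
@EnableConfigurationProperties(NonExposedProperties.class)
@SpringBootApplication
public class Application {
    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}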
Then, autowire this properties class in your @Configuration/@Component class.
Follow in-code comments.
@Configuration
public class CustomConfiguration {
    private final NonExposedProperties nonExposedProperties;
    private DataSource dataSource;

    CustomConfiguration(final NonExposedProperties nonExposedProperties) {
        this.nonExposedProperties = nonExposedProperties;
    }

    public DataSource dataSource() {
        if (Objects.isNull(dataSource)) {
            // Create a standalone instance of DataSourceProperties
            final DataSourceProperties dataSourceProperties = new DataSourceProperties();

            // Use the NonExposedProperties "hikari" Map as the properties' source. It will be
            // {
            //    url -> testUrl
            //    username -> testUsername
            //    password -> testPassword
            //    ... other properties
            // }
            final ConfigurationPropertySource source = new MapConfigurationPropertySource(nonExposedProperties.getHikari());

            // Bind those properties to the DataSourceProperties instance
            final BindResult<DataSourceProperties> bound =
                new Binder(source).bind(
                    ConfigurationPropertyName.EMPTY,
                    Bindable.ofInstance(dataSourceProperties)
                );

            // Retrieve the bound instance (it's not a new one, it's the same as before)
            dataSource = bound.get().initializeDataSourceBuilder().build();
        }

        // Return the constructed HikariDataSource
        return dataSource;
    }
}

Related

Bean injection failing in Library

I created a library in which I define some beans. Below is the configuration class where I create them:
@Configuration
public class StorageBindings {
    @Value("${storageAccountName}")
    private String storageAccountName;

    @Value("${storageAccountKey}")
    private String storageAccountKey;

    @Bean(name = "cloudBlobClient")
    public CloudBlobClient getCloudBlobClientUsingCredentials() throws URISyntaxException {
        return new CloudBlobClient();
    }

    @Bean(name = "storageCredentialsToken")
    public StorageCredentialsToken getStorageCredentialsToken() throws IOException {
        return new StorageCredentialsToken();
    }

    @Bean(name = "msiTokenGenerator")
    public MSITokenGenerator getMSITokenGenerator() {
        return new MSITokenGenerator();
    }
}
Then I created the class that I use as an entry point for further operations:
public class StorageClient {
    @Autowired
    private CloudBlobClient cloudBlobClient;

    @Autowired
    private MSITokenGenerator msiTokenGenerator;

    @Value("${storageAccountName}")
    private String storageAccountName;

    @Value("${storageAccountKey}")
    private String storageAccountKey;
}
I packaged the above files into a jar and included it in our main project, where I create the StorageClient bean as below:
#Bean(name = {"storageClient"})
public StorageClient getStorageClient() {
LOG.debug("I am inside storage class");
StorageClient ac = null;
try {
ac = new StorageClient();
return ac;
}
But after execution I found that nothing gets injected into the StorageClient instance ac: neither the beans nor the environment properties below are populated, and all of them are null:
// beans NOT injected
ac.cloudBlobClient = null;
ac.msiTokenGenerator = null;
// environment variables
ac.storageAccountName = null;
ac.storageAccountKey = null;
Am I missing something, since I am getting null? The sequence of bean instantiation is fine; I checked, and the StorageBindings beans are created first.
When you do this:
ac = new StorageClient();
you step outside the Spring context, because you are creating a new instance yourself. The beans of type CloudBlobClient and MSITokenGenerator and the values for storageAccountName and storageAccountKey therefore don't get injected.
You can annotate StorageClient with @Component.
Since you package it as a jar, you have to make sure that the @ComponentScan in your main project includes the package where StorageClient lives.
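For example, assuming the library classes live in com.example.storagelib (a hypothetical package), the main application could widen the scan like this:
// Package names are assumptions; adjust them to your library's actual packages.
@SpringBootApplication(scanBasePackages = {"com.example.mainapp", "com.example.storagelib"})
public class MainApplication {
    public static void main(String[] args) {
        SpringApplication.run(MainApplication.class, args);
    }
}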
Then you can do:
@Autowired
private StorageClient storageClient;
in your main project.
If you create an object inside a @Bean-annotated method, autowiring doesn't inject anything into it; you are simply creating it yourself.
So you have to @Autowire the dependencies, e.g. on fields of your configuration class, and set them via a setter or constructor.
For example:
@Autowired
private CloudBlobClient cloudBlobClient;

@Autowired
private MSITokenGenerator msiTokenGenerator;

@Bean(name = {"storageClient"})
public StorageClient getStorageClient() {
    LOG.debug("I am inside storage class");
    StorageClient ac = new StorageClient();
    ac.setCloudBlobClient(cloudBlobClient);
    ac.setMsiTokenGenerator(msiTokenGenerator);
    return ac;
}
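Alternatively, you can skip the fields and take the library beans as parameters of the @Bean method; Spring resolves them from the context (the setter names follow the snippet above and are assumptions):
@Bean(name = {"storageClient"})
public StorageClient getStorageClient(CloudBlobClient cloudBlobClient, MSITokenGenerator msiTokenGenerator) {
    StorageClient ac = new StorageClient();
    // 'new' bypasses autowiring, so wire the dependencies by hand.
    ac.setCloudBlobClient(cloudBlobClient);
    ac.setMsiTokenGenerator(msiTokenGenerator);
    return ac;
}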

Spring Data Cassandra config change on the fly

I am trying to do a similar thing with my application. I am using the following versions of Spring Boot and Spring Data Cassandra:
spring-data-cassandra - 2.0.8.RELEASE
spring-boot-starter-parent - 2.0.4.RELEASE
I need to change some Cassandra properties (mostly hostnames) on the fly and want the application to establish a new connection. For configuration changes we have an internal Cloud Config change-management system; it listens for changes and propagates them fine.
This is my class:
@Configuration
@Order(Ordered.HIGHEST_PRECEDENCE)
@RefreshScope
@EnableCassandraRepositories(basePackages = {"com.*.*.*.dao.repo"})
public class AppConfig {

    private static final Logger LOGGER = LoggerFactory.getLogger(AppConfig.class);

    @Value("${application['cassandraPort']}")
    private String cassandraPort;

    @Value("${application['cassandraEndpoint']}")
    private String cassandraEndpoint;

    @Value("${application['keyspaceName']}")
    private String keyspaceName;

    @Value("${application['cassandraConsistency']}")
    private String cassandraConsistency;

    @Value("${application['cassandraUserName']}")
    private String cassandraUserName;

    @Autowired
    private AppConfig appConfig;

    public AppConfig() {
        System.out.println("AppConfig Constructor");
    }

    public String getCassandraPort() {
        return cassandraPort;
    }

    public void setCassandraPort(String cassandraPort) {
        this.cassandraPort = cassandraPort;
    }

    public String getCassandraEndpoint() {
        return cassandraEndpoint;
    }

    public void setCassandraEndpoint(String cassandraEndpoint) {
        this.cassandraEndpoint = cassandraEndpoint;
    }

    public String getKeyspaceName() {
        return keyspaceName;
    }

    public void setKeyspaceName(String keyspaceName) {
        this.keyspaceName = keyspaceName;
    }

    public String getCassandraConsistency() {
        return cassandraConsistency;
    }

    public void setCassandraConsistency(String cassandraConsistency) {
        this.cassandraConsistency = cassandraConsistency;
    }

    public String getCassandraUserName() {
        return cassandraUserName;
    }

    public void setCassandraUserName(String cassandraUserName) {
        this.cassandraUserName = cassandraUserName;
    }

    @Bean
    // @RefreshScope
    public CassandraConverter converter() {
        return new MappingCassandraConverter(this.mappingContext());
    }

    @Bean
    // @RefreshScope
    public CassandraMappingContext mappingContext() {
        return new CassandraMappingContext();
    }

    @Bean
    // @RefreshScope
    public CassandraSessionFactoryBean session() {
        CassandraSessionFactoryBean session = new CassandraSessionFactoryBean();
        session.setCluster(this.cluster().getObject());
        session.setKeyspaceName(appConfig.getKeyspaceName());
        session.setConverter(this.converter());
        session.setSchemaAction(SchemaAction.NONE);
        return session;
    }

    @Bean
    // @RefreshScope
    public CassandraClusterFactoryBean cluster() {
        CassandraClusterFactoryBean cluster = new CassandraClusterFactoryBean();
        cluster.setContactPoints(appConfig.getCassandraEndpoint());
        cluster.setPort(Integer.valueOf(appConfig.getCassandraPort()));
        cluster.setUsername(appConfig.getCassandraUserName());
        cluster.setPassword("password");
        cluster.setQueryOptions(new QueryOptions().setConsistencyLevel(ConsistencyLevel.LOCAL_QUORUM));
        return cluster;
    }
}
However, when I try to use @RefreshScope with that configuration class, the application fails to start. This is what it shows in the console:
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 2 of constructor in org.springframework.boot.autoconfigure.data.cassandra.CassandraDataAutoConfiguration required a bean of type 'com.datastax.driver.core.Cluster' that could not be found.
- Bean method 'cassandraCluster' not loaded because auto-configuration 'CassandraAutoConfiguration' was excluded
Action:
Consider revisiting the entries above or defining a bean of type 'com.datastax.driver.core.Cluster' in your configuration.
Are there any guidelines on using @RefreshScope with Cassandra beans? If anyone has done this before, could you please share your approach?
You're mixing a couple of things here.
The configuration class carries both properties and bean definitions.
@RefreshScope on AppConfig causes some interference with Spring Boot's auto-configuration, and the declared beans aren't used (that's why you see Parameter 2 of constructor…).
To clean up, we will reuse what Spring Boot provides as much as possible, and only declare what's really needed.
Follow these steps to solve the issue (based on your code above):
Create a @ConfigurationProperties bean that encapsulates your properties, or better, reuse CassandraProperties.
Re-enable CassandraAutoConfiguration and remove your own MappingContext and CassandraConverter beans; keep only the Cluster and Session bean definitions.
Declare the Cluster and Session beans as needed and make them use @RefreshScope. Your @Configuration class should look like:
Example Configuration:
@Configuration
public class MyConfig {

    @Bean(destroyMethod = "close")
    @RefreshScope
    public Cluster cassandraCluster(CassandraProperties properties) {
        Cluster.Builder builder = Cluster.builder()
                .addContactPoints(properties.getContactPoints().toArray(new String[0]))
                .withoutJMXReporting();
        return builder.build();
    }

    @Bean(destroyMethod = "close")
    @RefreshScope
    public Session cassandraSession(CassandraProperties properties, Cluster cluster) {
        return cluster.connect(properties.getKeyspaceName());
    }
}
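If you also need the port and credentials from the original configuration, CassandraProperties already exposes them (the values come from the spring.data.cassandra.* properties); a hedged sketch of the extended cluster bean:
@Bean(destroyMethod = "close")
@RefreshScope
public Cluster cassandraCluster(CassandraProperties properties) {
    // Same bean as above, additionally applying port and credentials from CassandraProperties.
    return Cluster.builder()
            .addContactPoints(properties.getContactPoints().toArray(new String[0]))
            .withPort(properties.getPort())
            .withCredentials(properties.getUsername(), properties.getPassword())
            .withoutJMXReporting()
            .build();
}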

Loading Properties from Database using Java Config with Spring Boot

I created a FactoryBean<Properties> as follows:
public final class SystemPropertiesFactoryBean implements FactoryBean<Properties> {

    private static final String QUERY = "select * from tb_system_properties";

    private final NamedParameterJdbcTemplate jdbcTemplate;

    public SystemPropertiesFactoryBean(DataSource datasource) {
        this.jdbcTemplate = new NamedParameterJdbcTemplate(datasource);
    }

    @Override
    public Properties getObject() {
        Properties result = new Properties();
        jdbcTemplate.query(QUERY,
            (ResultSet rs) -> result.setProperty(rs.getString(1), rs.getString(2)));
        return result;
    }

    @Override
    public Class<?> getObjectType() {
        return Properties.class;
    }

    @Override
    public boolean isSingleton() {
        return true;
    }
}
This class worked fine with Spring and XML config: I obtained the DataSource via a JNDI name, created the proper Properties, and then used a PropertySourcesPlaceholderConfigurer via the XML tag.
Now I want to do the same thing with Spring Boot and Java config.
When I define a PropertySourcesPlaceholderConfigurer as a bean (in a static method in a @Configuration class), Spring tries to create this bean before the DataSource.
Is there any way to create the DataSource before the PropertySourcesPlaceholderConfigurer?
Basically you need to take the dependency as a @Bean method parameter, this way:
@Configuration
public static class AppConfig {

    @Bean
    public SystemPropertiesFactoryBean systemProperties(DataSource dataSource) {
        return new SystemPropertiesFactoryBean(dataSource);
    }

    @Bean
    public DataSource dataSource() {
        // return new data source
    }
}
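If the database-backed values should also feed ${...} placeholder resolution, one hedged option (the setLocalOverride flag is an assumption, and note that this forces the DataSource and the query to run very early, before regular bean post-processing) is:
@Bean
public static PropertySourcesPlaceholderConfigurer placeholderConfigurer(Properties systemProperties) {
    PropertySourcesPlaceholderConfigurer configurer = new PropertySourcesPlaceholderConfigurer();
    // Properties loaded from tb_system_properties by the FactoryBean above.
    configurer.setProperties(systemProperties);
    configurer.setLocalOverride(true);
    return configurer;
}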

CacheManager bean definition in Config.class leads to NoSuchBeanDefinitionException

I have a Spring service which checks database entries. To minimize my repository calls, both find methods are @Cacheable. But when I try to initialize my service bean while my configuration class has a CacheManager bean definition, I get the following NoSuchBeanDefinitionException:
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'foo.mediacode.directory.MediaCodeDirectoryService' available
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBean(DefaultListableBeanFactory.java:353)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBean(DefaultListableBeanFactory.java:340)
at org.springframework.context.support.AbstractApplicationContext.getBean(AbstractApplicationContext.java:1093)
at foo.mediacode.directory.MediaCodeDirectoryService.implementation(MediaCodeDirectoryService.java:63)
at foo.campaigntree.directory.CampaignTreeDirectoryService.<init>(CampaignTreeDirectoryService.java:18)
... 15 more
If I take out the CacheManager bean definition, I can init my service bean and it runs without any problems and caching!
Here is my code:
Configuration
...
@Configuration
@EnableCaching
@EnableJpaRepositories(...)
@PropertySource({...})
public class MediaCodeDirectoryServiceConfig {

    private static Logger configLogger = Logger.getLogger(MediaCodeDirectoryServiceConfig.class.getName());

    @Value("${jpa.loggingLevel:FINE}")
    private String loggingLevel;

    @Value("${mysql.databaseDriver}")
    private String dataBaseDriver;

    @Value("${mysql.username}")
    private String username;

    @Value("${mysql.password}")
    private String password;

    @Value("${mysql.databaseUrl}")
    private String databaseUrl;

    @Bean
    public static PropertySourcesPlaceholderConfigurer propertyConfigInDev() {
        ...
    }

    @Bean
    public MediaCodeDirectoryService mediaCodeDirectoryService() {
        return new MediaCodeDirectoryService();
    }

    @Bean
    public CacheManager mediaCodeCacheManager() {
        SimpleCacheManager cacheManager = new SimpleCacheManager();
        cacheManager.setCaches(Arrays.asList(new ConcurrentMapCache("mediaCodeMappingRegexCache"),
                new ConcurrentMapCache("mediaCodeMappingsCache")));
        return cacheManager;
    }

    @Bean
    public JpaTransactionManager transactionManager() {
        ...
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        ...
    }

    public DataSource getDataSource() {
        ...
    }

    public JpaDialect getJpaDialect() {
        ...
    }

    public Properties getEclipseLinkProperty() {
        ...
    }

    public JpaVendorAdapter getJpaVendorAdapter() {
        ...
    }
}
Service
....
public class MediaCodeDirectoryService implements MediaCodeDirectoryServiceApi {
    ...
    @Autowired
    private MediaCodeDirectoryRepository repo;

    @SuppressWarnings("resource")
    public static MediaCodeDirectoryServiceApi implementation() {
        if (INSTANCE == null) {
            ApplicationContext ctx = new AnnotationConfigApplicationContext(MediaCodeDirectoryServiceConfig.class);
            INSTANCE = ctx.getBean(MediaCodeDirectoryService.class);
        }
        return INSTANCE;
    }
    ...
Repository
...
@Repository
public interface MediaCodeDirectoryRepository extends CrudRepository<MediaCodeDao, Integer> {

    @Cacheable("mediaCodeMappingRegexes")
    @Query("SELECT m FROM #{#entityName} m WHERE (m.fooId = :fooId) AND (m.isRegex = :isRegex) ORDER BY (m.orderId DESC, m.id ASC)")
    List<MediaCodeDao> findByfooIdAndIsRegexOrderByOrderIdDescAndIdAsc(@Param("fooId") int fooId, @Param("isRegex") boolean isRegex);

    @Cacheable("mediaCodeMappings")
    List<MediaCodeDao> findByMediaCode(String MediaCode, Pageable pageable);
}
When I debug into DefaultListableBeanFactory, I can see my mediaCodeDirectoryService in beanDefinitionMap, and mediaCodeDirectoryService also appears in beanDefinitionNames. But DefaultListableBeanFactory.getBean(...) cannot resolve the name, and namedBean in line 364 is null.
When I try to get the bean from the context by name, like:
INSTANCE = (MediaCodeDirectoryService) ctx.getBean("mediaCodeDirectoryService")
I avoid the NoSuchBeanDefinitionException, but I run into another one.
Does anybody have an idea what might be the cause of this? Did I miss something in my configuration? Thanks!
Caching is applied through AOP. For AOP, Spring uses a proxy-based approach, and the default is to create interface-based proxies.
public class MediaCodeDirectoryService implements MediaCodeDirectoryServiceApi {... }
With this class definition, at runtime you will get a dynamically created class (Proxy$51 or something along those lines) which implements all the interfaces, but it isn't a MediaCodeDirectoryService. It is, however, a MediaCodeDirectoryServiceApi.
You have two ways of fixing this: either program to interfaces (which you should have been doing anyway, because you have defined interfaces) instead of concrete classes, or use class-based proxies.
The first option involves changing your code in the places that directly @Autowire or fetch an instance of MediaCodeDirectoryService to use MediaCodeDirectoryServiceApi instead (which, imho, you should already be doing; why else define an interface?). Now you will get the proxy injected and everything will work.
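Concretely, the lookup in implementation() would then request the interface type, which the proxy does implement:
// Request the interface: the JDK proxy created for caching implements it.
INSTANCE = ctx.getBean(MediaCodeDirectoryServiceApi.class);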
The second option involves setting proxyTargetClass=true on your @EnableCaching annotation. Then, instead of an interface-based proxy, you will get a class-based proxy.
@EnableCaching(proxyTargetClass = true)

Overriding beans in Java-based spring configuration hierarchy

Let's assume we have an application that can be customized for some customers. The application uses Java-based Spring configuration (a.k.a. Java config) for dependency injection. The application consists of modules and their submodules. Each module and submodule has its own @Configuration class, which is imported by the parent configuration using @Import. This creates the following hierarchy:
                  MainConfig
                      |
         +------------+------------- ....
         |                          |
   ModuleAConfig             ModuleBConfig
         |
    +----+--------------+
    |                   |
SubModuleA1Config  SubModuleA2Config
For example ModuleAConfig looks like this:
@Configuration
@Import({SubModuleA1Config.class, SubModuleA2Config.class})
public class ModuleAConfig {
    // some module level beans
}
Let's say that SubModuleA1Config defines a bean someBean of type SomeBean:
@Configuration
public class SubModuleA1Config {
    @Bean
    public SomeBean someBean() { return new SomeBean(); }
}
Now I want to customize the application for Customer1 (C1): I want to use C1SomeBean (extending SomeBean) instead of SomeBean as someBean.
How can I achieve this with minimal duplication?
One of my ideas was to prepare an alternative hierarchy, with C1Config inheriting from MainConfig, C1ModuleAConfig from ModuleAConfig and C1SubModuleA1Config from SubModuleA1Config. C1SubModuleA1Config would override the someBean() method, returning C1SomeBean. Unfortunately, with Spring 4.0.6 I get something like:
Overriding bean definition for bean 'someBean': replacing [someBean defined in class C1SubmoduleA1Config] with [someBean defined in class SubModuleA1Config]
and indeed an instance of SomeBean is returned from the context instead of C1SomeBean. This is clearly not what I want.
Note that you cannot override an @Import by extending configuration classes.
If you want to select which imports to use at runtime, you could use an ImportSelector instead.
However, @Configuration classes are no more than Spring (scoped) managed factories, so as you already have a factory method for someBean, you don't need to go any further:
@Configuration
public class SubModuleA1Config {

    @Autowired
    private Environment env;

    @Bean
    public SomeBean someBean() {
        String customerProperty = env.getProperty("customer");
        if ("C1".equals(customerProperty))
            return new C1SomeBean();
        return new SomeBean();
    }
}
Update
Using an ImportSelector:
class CustomerImportSelector implements ImportSelector, EnvironmentAware {

    private static final String PACKAGE = "org.example.config";
    private static final String CONFIG_CLASS = "SubModuleConfig";

    private Environment env;

    @Override
    public String[] selectImports(AnnotationMetadata importingClassMetadata) {
        String customer = env.getProperty("customer");
        return new String[] { PACKAGE + "." + customer + "." + CONFIG_CLASS };
    }

    @Override
    public void setEnvironment(Environment environment) {
        this.env = environment;
    }
}
@Configuration
@Import(CustomerImportSelector.class)
public class ModuleAConfig {
    // some module level beans
}
However, as every customer has a separate package, consider also using @ComponentScan. This will pick up whichever configuration class is present and doesn't need an extra configuration property.
@Configuration
@ComponentScan(basePackages = "org.example.customer")
public class SubModuleA1Config {

    @Autowired
    private CustomerFactory customerFactory;

    @Bean
    public SomeBean someBean() {
        return customerFactory.someBean();
    }
}

public interface CustomerFactory {
    SomeBean someBean();
}

@Component
public class C1CustomerFactory implements CustomerFactory {
    @Override
    public SomeBean someBean() {
        return new C1SomeBean();
    }
}
