I need to read timeouts from the application.properties file, but at initialization the configuration fails because the properties are not yet loaded. What is the best practice for getting them loaded?
@Configuration
@AllArgsConstructor
@Slf4j
public class Config {

    @Value("${connectionTimeout}")
    int connectionTimeout;

    @Value("${responseTimeout}")
    int responseTimeout;

    @Bean
    public ClientHttpConnector getConnector() {
        HttpClient client = HttpClient.create()
                .option(ChannelOption.CONNECT_TIMEOUT_MILLIS, connectionTimeout)
                .responseTimeout(Duration.ofMillis(responseTimeout));
        return new ReactorClientHttpConnector(client);
    }

    @Bean
    public WebClient webClient() {
        return WebClient.builder()
                .defaultHeader(HttpHeaders.CONTENT_TYPE, MediaType.APPLICATION_FORM_URLENCODED_VALUE)
                .clientConnector(getConnector())
                .build();
    }
}
application.properties in the resources folder:
connectionTimeout=30000
responseTimeout=30000
As suggested in other similar posts, I tried using @ConfigurationProperties, but that didn't work at all. Is there an easier way to get them loaded that I am not aware of?
Try injecting the values via the constructor:
public Config(@Value("${connectionTimeout}") int connectionTimeout,
              @Value("${responseTimeout}") int responseTimeout) {
    this.connectionTimeout = connectionTimeout;
    this.responseTimeout = responseTimeout;
}
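A fuller sketch of the same idea, using the property keys from the question (note that Lombok's @AllArgsConstructor has to be removed, since the constructor is now written by hand):
@Configuration
@Slf4j
public class Config {

    private final int connectionTimeout;
    private final int responseTimeout;

    public Config(@Value("${connectionTimeout}") int connectionTimeout,
                  @Value("${responseTimeout}") int responseTimeout) {
        this.connectionTimeout = connectionTimeout;
        this.responseTimeout = responseTimeout;
    }

    // the getConnector() and webClient() @Bean methods stay exactly as in the question
}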
Try using the Environment class.
@Configuration
public class Config {

    private final Environment environment;

    @Autowired
    public Config(Environment environment) {
        this.environment = environment;
    }

    @Bean
    public SimpleBean simpleBean() {
        SimpleBean simpleBean = new SimpleBean();
        simpleBean.setConfOne(environment.getProperty("conf.one"));
        simpleBean.setConfTwo(environment.getProperty("conf.two"));
        return simpleBean;
    }
}
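Applied to the timeouts from the original question, the same approach looks roughly like this (a minimal sketch; the property keys and the 30000 fallbacks are taken from that question's application.properties):
@Bean
public ClientHttpConnector getConnector() {
    // Environment can resolve typed values directly, with an optional default
    int connectionTimeout = environment.getProperty("connectionTimeout", Integer.class, 30000);
    int responseTimeout = environment.getProperty("responseTimeout", Integer.class, 30000);

    HttpClient client = HttpClient.create()
            .option(ChannelOption.CONNECT_TIMEOUT_MILLIS, connectionTimeout)
            .responseTimeout(Duration.ofMillis(responseTimeout));
    return new ReactorClientHttpConnector(client);
}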
I would like to benefit from @ConfigurationProperties' fantastic facilities without needing to expose the bean into my context. It is not a problem of @Primary and the like; I simply cannot expose another DataSource into the context. How can I achieve the following?
#ConfigurationProperties("com.non.exposed.datasource.hikari")
public DataSource privateHikariDatasource() {
if (Objects.isNull(this.nonExposedDatasource)) {
this.nonExposedDatasource = this.nonExposedDatasourceProperties.initializeDataSourceBuilder().build();
}
return this.nonExposedDatasource;
}
Thanks to the answer by @LppEdd, the final perfect solution is:
@Autowired
private Environment environment;

public DataSource privateHikariDatasource() {
    if (Objects.isNull(this.nonExposedDatasource)) {
        this.nonExposedDatasource = bindHikariProperties(this.nonExposedDatasourceProperties.initializeDataSourceBuilder().build());
    }
    return this.nonExposedDatasource;
}

// This does exactly the same as @ConfigurationProperties("com.non.exposed.datasource.hikari"),
// but without requiring the DataSource to be exposed in the context as a @Bean.
private <T extends DataSource> T bindHikariProperties(final T instance) {
    return Binder.get(this.environment).bind("com.non.exposed.datasource.hikari", Bindable.ofInstance(instance)).get();
}
Then you can call this.privateHikariDatasource() internally wherever your other beans need it.
Many thanks to @LppEdd!
Given that this DataSource is private to a class, and that the containing class is (or can be) inside the Spring context, you can have a @ConfigurationProperties class
#ConfigurationProperties("com.foo.bar.datasource.hikari")
public class HikariConfiguration { ... }
which, once registered via @EnableConfigurationProperties, is available for autowiring:
@EnableConfigurationProperties(HikariConfiguration.class)
@SpringBootApplication
public class Application { ... }
and thus can be autowired into the containing class:
@Component
class MyClass {

    private final HikariConfiguration hikariConfiguration;
    private DataSource springDatasource;

    MyClass(final HikariConfiguration hikariConfiguration) {
        this.hikariConfiguration = hikariConfiguration;
    }

    ...

    private DataSource privateSingletonDataSource() {
        if (Objects.isNull(this.springDatasource)) {
            this.springDatasource = buildDataSource(this.hikariConfiguration);
        }
        return this.springDatasource;
    }
}
buildDataSource will manually construct the DataSource instance.
Remember that you need to take care of synchronization when building the DataSource.
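A possible shape for buildDataSource, assuming HikariConfiguration exposes plain getters for the bound properties (the getter names below are hypothetical; adjust them to whatever fields the class actually declares):
private static DataSource buildDataSource(final HikariConfiguration configuration) {
    final HikariDataSource dataSource = new HikariDataSource();
    dataSource.setJdbcUrl(configuration.getUrl());        // hypothetical getter
    dataSource.setUsername(configuration.getUsername());  // hypothetical getter
    dataSource.setPassword(configuration.getPassword());  // hypothetical getter
    return dataSource;
}
Making privateSingletonDataSource() synchronized is one simple way to handle the synchronization concern mentioned above.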
The final answer is that you cannot reuse DataSourceProperties. You can't even extend it to change the properties' prefix: only a single instance of it can exist inside the context.
The best you can do is mimic what Spring does.
Having
com.non.exposed.datasource.hikari.url=testUrl
com.non.exposed.datasource.hikari.username=testUsername
com.non.exposed.datasource.hikari.password=testPassword
...
You can define a new @ConfigurationProperties class
@ConfigurationProperties("com.non.exposed.datasource")
public class NonExposedProperties {

    private final Map<String, String> hikari = new HashMap<>(8);

    public Map<String, String> getHikari() {
        return hikari;
    }
}
Then, autowire this properties class in your @Configuration/@Component class.
Follow the in-code comments.
@Configuration
public class CustomConfiguration {

    private final NonExposedProperties nonExposedProperties;
    private DataSource dataSource;

    CustomConfiguration(final NonExposedProperties nonExposedProperties) {
        this.nonExposedProperties = nonExposedProperties;
    }

    public DataSource dataSource() {
        if (Objects.isNull(dataSource)) {
            // Create a standalone instance of DataSourceProperties
            final DataSourceProperties dataSourceProperties = new DataSourceProperties();

            // Use the NonExposedProperties "hikari" Map as the properties' source. It will be
            // {
            //    url -> testUrl
            //    username -> testUsername
            //    password -> testPassword
            //    ... other properties
            // }
            final ConfigurationPropertySource source = new MapConfigurationPropertySource(nonExposedProperties.getHikari());

            // Bind those properties to the DataSourceProperties instance
            final BindResult<DataSourceProperties> bound =
                    new Binder(source).bind(
                            ConfigurationPropertyName.EMPTY,
                            Bindable.ofInstance(dataSourceProperties)
                    );

            // Retrieve the bound instance (it's not a new one, it's the same as before)
            dataSource = bound.get().initializeDataSourceBuilder().build();
        }

        // Return the constructed HikariDataSource
        return dataSource;
    }
}
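One detail the snippet above assumes: NonExposedProperties must itself be registered as a configuration properties bean, for example via @EnableConfigurationProperties, in the same way HikariConfiguration was registered earlier:
@EnableConfigurationProperties(NonExposedProperties.class)
@SpringBootApplication
public class Application { ... }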
I am trying to do a similar thing with my application. I am using the following versions of Spring Boot and Spring Data Cassandra:
spring-data-cassandra - 2.0.8.RELEASE
spring-boot-starter-parent - 2.0.4.RELEASE
I need to change some Cassandra properties (mostly hostnames) on the fly and want the application to establish a new connection. For configuration changes we have an internal Cloud Config Change Management system, which listens for changes and picks them up fine.
This is my class:
@Configuration
@Order(Ordered.HIGHEST_PRECEDENCE)
@RefreshScope
@EnableCassandraRepositories(basePackages = {"com.*.*.*.dao.repo"})
public class AppConfig {

    private static final Logger LOGGER = LoggerFactory.getLogger(AppConfig.class);

    @Value("${application['cassandraPort']}")
    private String cassandraPort;

    @Value("${application['cassandraEndpoint']}")
    private String cassandraEndpoint;

    @Value("${application['keyspaceName']}")
    private String keyspaceName;

    @Value("${application['cassandraConsistency']}")
    private String cassandraConsistency;

    @Value("${application['cassandraUserName']}")
    private String cassandraUserName;

    @Autowired
    private AppConfig appConfig;

    public AppConfig() {
        System.out.println("AppConfig Constructor");
    }

    public String getCassandraPort() {
        return cassandraPort;
    }

    public void setCassandraPort(String cassandraPort) {
        this.cassandraPort = cassandraPort;
    }

    public String getCassandraEndpoint() {
        return cassandraEndpoint;
    }

    public void setCassandraEndpoint(String cassandraEndpoint) {
        this.cassandraEndpoint = cassandraEndpoint;
    }

    public String getKeyspaceName() {
        return keyspaceName;
    }

    public void setKeyspaceName(String keyspaceName) {
        this.keyspaceName = keyspaceName;
    }

    public String getCassandraConsistency() {
        return cassandraConsistency;
    }

    public void setCassandraConsistency(String cassandraConsistency) {
        this.cassandraConsistency = cassandraConsistency;
    }

    public String getCassandraUserName() {
        return cassandraUserName;
    }

    public void setCassandraUserName(String cassandraUserName) {
        this.cassandraUserName = cassandraUserName;
    }

    @Bean
    // @RefreshScope
    public CassandraConverter converter() {
        return new MappingCassandraConverter(this.mappingContext());
    }

    @Bean
    // @RefreshScope
    public CassandraMappingContext mappingContext() {
        return new CassandraMappingContext();
    }

    @Bean
    // @RefreshScope
    public CassandraSessionFactoryBean session() {
        CassandraSessionFactoryBean session = new CassandraSessionFactoryBean();
        session.setCluster(this.cluster().getObject());
        session.setKeyspaceName(appConfig.getKeyspaceName());
        session.setConverter(this.converter());
        session.setSchemaAction(SchemaAction.NONE);
        return session;
    }

    @Bean
    // @RefreshScope
    public CassandraClusterFactoryBean cluster() {
        CassandraClusterFactoryBean cluster = new CassandraClusterFactoryBean();
        cluster.setContactPoints(appConfig.getCassandraEndpoint());
        cluster.setPort(Integer.valueOf(appConfig.getCassandraPort()));
        cluster.setUsername(appConfig.getCassandraUserName());
        cluster.setPassword("password");
        cluster.setQueryOptions(new QueryOptions().setConsistencyLevel(ConsistencyLevel.LOCAL_QUORUM));
        return cluster;
    }
}
However, when I try to use @RefreshScope with that configuration class, the application fails to start. This is what it shows in the console:
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 2 of constructor in org.springframework.boot.autoconfigure.data.cassandra.CassandraDataAutoConfiguration required a bean of type 'com.datastax.driver.core.Cluster' that could not be found.
- Bean method 'cassandraCluster' not loaded because auto-configuration 'CassandraAutoConfiguration' was excluded
Action:
Consider revisiting the entries above or defining a bean of type 'com.datastax.driver.core.Cluster' in your configuration.
Are there any guidelines on using @RefreshScope with Cassandra beans? If anyone has done this before, could you share your approach?
You're mixing a couple of things here.
The configuration class mixes properties and bean definitions.
@RefreshScope on AppConfig interferes with Spring Boot's auto-configuration, and as a result the declared beans aren't used (that's why you see Parameter 2 of constructor…).
To clean up, we will reuse what Spring Boot provides as much as possible, and only declare what's really needed.
Follow these steps to solve the issue (based on your code above):
1. Create a @ConfigurationProperties bean that encapsulates your properties, or better, reuse CassandraProperties.
2. Re-enable CassandraAutoConfiguration and remove your own MappingContext and CassandraConverter beans; keep only the Cluster and Session bean definitions.
3. Declare the Cluster and Session beans as needed and make them use @RefreshScope. Your @Configuration class should look like:
Example Configuration:
@Configuration
public class MyConfig {

    @Bean(destroyMethod = "close")
    @RefreshScope
    public Cluster cassandraCluster(CassandraProperties properties) {
        Cluster.Builder builder = Cluster.builder()
                .addContactPoints(properties.getContactPoints().toArray(new String[0]))
                .withoutJMXReporting();
        return builder.build();
    }

    @Bean(destroyMethod = "close")
    @RefreshScope
    public Session cassandraSession(CassandraProperties properties, Cluster cluster) {
        return cluster.connect(properties.getKeyspaceName());
    }
}
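For reference, CassandraProperties is populated from the standard spring.data.cassandra.* keys, so the application['cassandraEndpoint']-style entries from the question would move to something like the following (property names are the Spring Boot 2.x defaults; the values are placeholders):
spring.data.cassandra.contact-points=host1,host2
spring.data.cassandra.port=9042
spring.data.cassandra.keyspace-name=my_keyspace
spring.data.cassandra.username=cassandra_user
spring.data.cassandra.password=password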
I am trying to create Spring beans using only annotations, but I am not able to load values into my @Bean class from the properties file.
Here is my code:
This is my main class:
public class AnnotationDI {

    public static void main(String[] args) {
        ApplicationContext context = new AnnotationConfigApplicationContext(ConfigurationProvider.class);
        ApplicationProperties properties = (ApplicationProperties) context.getBean(ApplicationProperties.class);
        System.out.println(properties);
    }
}
Configuration class:
@Configuration
public class ConfigurationProvider {

    private ApplicationProperties m_applicationProperties;

    @Bean
    public ApplicationProperties getApplicationProperties() {
        return new ApplicationProperties();
    }
}
Bean class:
@PropertySource(value = { "classpath:application.properties" })
public class ApplicationProperties {

    @Value("${longThreadCount}")
    private String m_longProcessThread;

    @Value("${routeTimeout}")
    private String m_routeTimeout;

    @Value("${updateDirectoryPath}")
    private String m_updateDirectoryPath;

    public String getLongProcessThread() {
        return m_longProcessThread;
    }

    @Override
    public String toString() {
        return "ApplicationProperties [m_longProcessThread=" + m_longProcessThread + "]";
    }
}
When I run this program, I get the following output:
m_longProcessThread=${longThreadCount}
Any idea what I am doing wrong?
To have @Value placeholders resolved, you need to register a PropertySourcesPlaceholderConfigurer. As this is a BeanFactoryPostProcessor, it needs to be registered as a static bean so that it can be detected early in the process.
@Bean
public static PropertySourcesPlaceholderConfigurer placeholderConfigurer() {
    return new PropertySourcesPlaceholderConfigurer();
}
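In the setup from the question, that static bean could simply be added to ConfigurationProvider (a sketch reusing the class from the question):
@Configuration
public class ConfigurationProvider {

    @Bean
    public static PropertySourcesPlaceholderConfigurer placeholderConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }

    @Bean
    public ApplicationProperties getApplicationProperties() {
        return new ApplicationProperties();
    }
}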
The @PropertySource annotation has to be used in conjunction with the @Configuration annotation, like so:
@Configuration
@PropertySource(value = { "classpath:application.properties" })
public class ApplicationProperties {
    ...
}
Adding the @Configuration annotation would solve the issue in this case.
I am currently writing a Spring Boot auto-configuration for Retrofit 2. What I am trying to do is write some sort of interface builder that can instantiate an interface annotated with a particular annotation and make it available for autowiring, just like Spring Data does with repositories. As I cannot find any resources on how to do this (or whether it can even be done with Spring), I would like to ask for your thoughts. Below is an example of an interface that I would like to instantiate.
My replacement for @Repository is @Retrofit; the rest is just "ordinary" code you would write for any Retrofit repository.
The kind of interface I would like to autowire:
@Retrofit
public interface Api {

    @GET("usernames")
    String[] getUsernames();
}
An example for autowiring:
@SpringBootApplication
public class TestApplication {

    @Autowired
    private Api api;

    public static void main(String[] args) {
        SpringApplication.run(TestApplication.class, args);
    }

    @Bean
    CommandLineRunner runner() {
        return args -> {
            System.out.println(api.getUsernames());
        };
    }
}
As I said, I found a solution to my problem.
First we need an auto-configuration class that is loaded by Spring Boot - as stated here - by adding the file META-INF/spring.factories with the content shown below. This auto-configuration imports a registrar, which searches for classes annotated with @Retrofit via a component provider. Finally, the registrar registers a RetrofitFactoryBean definition for each candidate it finds, and that factory bean creates the Retrofit proxies.
The auto configuration
@Configuration
@Import(RetrofitRegistrar.class)
public class RetrofitAutoConfiguration {

    @Bean
    @ConditionalOnMissingBean
    public Retrofit retrofit() {
        return new Retrofit.Builder().build();
    }
}
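One caveat worth flagging (this comes from Retrofit 2's Builder contract, not from the original post): build() fails with an IllegalStateException when no base URL has been set, so the fallback bean would typically need something like the sketch below, where the URL is a placeholder.
@Bean
@ConditionalOnMissingBean
public Retrofit retrofit() {
    return new Retrofit.Builder()
            .baseUrl("http://localhost:8080/") // placeholder; ideally read from a property
            .build();
}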
}
META-INF/spring.factories
org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
spring.retrofit.RetrofitAutoConfiguration
The imported registrar
public class RetrofitRegistrar implements ImportBeanDefinitionRegistrar, BeanFactoryAware {

    @Setter
    private BeanFactory beanFactory;

    @Override
    public void registerBeanDefinitions(AnnotationMetadata importingClassMetadata,
                                        BeanDefinitionRegistry registry) {
        List<String> basePackages = AutoConfigurationPackages.get(this.beanFactory);
        RetrofitComponentProvider provider = new RetrofitComponentProvider(registry);

        basePackages.stream()
                .map(provider::findCandidateComponents)
                .flatMap(Set::stream)
                .forEach(comp -> register(comp, registry));
    }

    private void register(BeanDefinition component, BeanDefinitionRegistry registry) {
        BeanDefinitionBuilder builder = BeanDefinitionBuilder
                .rootBeanDefinition(RetrofitFactoryBean.class);
        builder.addConstructorArgValue(component.getBeanClassName());
        registry.registerBeanDefinition(
                component.getBeanClassName().toLowerCase(), builder.getBeanDefinition());
    }
}
The component provider
class RetrofitComponentProvider extends ClassPathScanningCandidateComponentProvider {

    @Getter
    private BeanDefinitionRegistry registry;

    public RetrofitComponentProvider(BeanDefinitionRegistry registry) {
        super(false);
        Assert.notNull(registry, "BeanDefinitionRegistry must not be null!");
        this.registry = registry;
        addIncludeFilter(new AnnotationTypeFilter(Retrofit.class, true, true));
    }

    @Override
    protected boolean isCandidateComponent(AnnotatedBeanDefinition beanDefinition) {
        return true;
    }
}
The factory bean
@Component
@RequiredArgsConstructor
public class RetrofitFactoryBean extends AbstractFactoryBean<Object> {

    @Getter
    private final Class<?> objectType;
    private final retrofit2.Retrofit retrofit;

    @Override
    protected Object createInstance() throws Exception {
        return retrofit.create(objectType);
    }
}
The @Getter, @Setter and @RequiredArgsConstructor annotations are provided by Project Lombok.
First, let me ask you to avoid reinventing the wheel by creating a new Spring annotation (yours here is @Retrofit). It is absolutely okay to use Retrofit with Spring; there is nothing to prevent it. You can simply use an existing Spring annotation such as @Component, as you can see in this question, and you can autowire your interface without facing problems.
Hope this helps.
I am trying to write an IntegrationFlow test. It goes something like this:
JMS(in) -> (find previous versions in db) -> reduce(in,1...n) -> (to db) -> JMS(out)
So, no surprise: I want to mock the DB calls; they are DAO beans. But I also want it to pick up other beans through component scanning; I will selectively scan all packages except dao.
Create a test config and mock the DAOs. No problem.
Follow the Spring Boot testing instructions to get component-scanned beans. No problem.
I just want to verify the sequence of steps and the resultant output as the outbound JMS queue would see it. Can someone just help me fill in the blanks?
This can't be that tough! The use of mocks seems to be problematic because plenty of essential fields are final. I have been reading everywhere about this and am just not coming up with a clear path. I inherited this code, BTW.
My error:
org.springframework.integration.MessageDispatchingException: Dispatcher has no subscribers
Here is my code
@Configuration
@ImportResource("classpath:retry-context.xml")
public class LifecycleConfig {

    @Autowired
    private MessageProducerSupport inbound;

    @Autowired
    private MessageHandler outbound;

    @Autowired
    @Qualifier("reducer")
    private GenericTransformer<Collection<ExtendedClaim>, ExtendedClaim> reducer;

    @Autowired
    @Qualifier("claimIdToPojo")
    private GenericTransformer<String, ClaimDomain> toPojo;

    @Autowired
    @Qualifier("findPreviousVersion")
    private GenericTransformer<ExtendedClaim, Collection<ExtendedClaim>> previousVersions;

    @Autowired
    @Qualifier("saveToDb")
    private GenericHandler<ExtendedClaim> toDb;

    @Bean
    public DirectChannel getChannel() {
        return new DirectChannel();
    }

    @Bean
    @Autowired
    public StandardIntegrationFlow processClaim() {
        return IntegrationFlows.from(inbound)
                .channel(getChannel())
                .transform(previousVersions)
                .transform(reducer)
                .handle(ExtendedClaim.class, toDb)
                .transform(toPojo)
                .handle(outbound)
                .get();
    }
}
Test Config
@Configuration
public class TestConfig extends AbstractClsTest {

    @Bean(name = "claimIdToPojo")
    public ClaimIdToPojo getClaimIdToPojo() {
        return spy(new ClaimIdToPojo());
    }

    @Bean
    public ClaimToId getClaimToIdPojo() {
        return spy(new ClaimToId());
    }

    @Bean(name = "findPreviousVersion")
    public FindPreviousVersion getFindPreviousVersion() {
        return spy(new FindPreviousVersion());
    }

    @Bean(name = "reducer")
    public Reducer getReducer() {
        return spy(new Reducer());
    }

    @Bean(name = "saveToDb")
    public SaveToDb getSaveToDb() {
        return spy(new SaveToDb());
    }

    @Bean
    public MessageProducerSupport getInbound() {
        MessageProducerSupport mock = mock(MessageProducerSupport.class);
        // when(mock.isRunning()).thenReturn(true);
        return mock;
    }

    @Bean
    public PaymentDAO getPaymentDao() {
        return mock(PaymentDAO.class);
    }

    @Bean
    public ClaimDAO getClaimDao() {
        return mock(ClaimDAO.class);
    }

    @Bean
    public MessageHandler getOutbound() {
        return new CaptureHandler<ExtendedClaim>();
    }
}
The actual test won't load:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = {TestConfig.class, LifecycleConfig.class})
public class ClaimLifecycleApplicationTest extends AbstractClsTest {

    @Autowired
    private MessageHandler outbound;

    @Autowired
    @Qualifier("reducer")
    private GenericTransformer<Collection<ExtendedClaim>, ExtendedClaim> reducer;

    @Autowired
    @Qualifier("claimIdToPojo")
    private GenericTransformer<String, ClaimDomain> toPojo;

    @Autowired
    @Qualifier("findPreviousVersion")
    private GenericTransformer<ExtendedClaim, Collection<ExtendedClaim>> previousVersions;

    @Autowired
    @Qualifier("saveToDb")
    private GenericHandler<ExtendedClaim> toDb;

    @Autowired
    private DirectChannel defaultChannel;

    @Test
    public void testFlow() throws Exception {
        ExtendedClaim claim = getClaim();
        Message<ExtendedClaim> message = MessageBuilder.withPayload(claim).build();
        List<ExtendedClaim> previousClaims = Arrays.asList(claim);

        defaultChannel.send(message);

        verify(previousVersions).transform(claim);
        verify(reducer).transform(previousClaims);
        verify(toDb).handle(claim, anyMap());
        verify(toPojo).transform(claim.getSubmitterClaimId());
        verify(outbound);
    }
}
There are a lot of domain-specific objects, so I can't run your code to reproduce the problem or spot some other issue with it.
But I see that you don't use @EnableIntegration on your @Configuration classes.
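A minimal sketch of that change (only the added annotation differs from the original class):
@Configuration
@EnableIntegration
@ImportResource("classpath:retry-context.xml")
public class LifecycleConfig {
    // ... same autowired fields and processClaim() flow as shown above
}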