I am trying to do a similar thing in my application. I am using the following versions of Spring Boot and Spring Data Cassandra:
spring-data-cassandra - 2.0.8.RELEASE
spring-boot-starter-parent - 2.0.4.RELEASE
I need to change some Cassandra properties (mostly hostnames) on the fly and have the application open a new connection. For configuration changes we have an internal Cloud Config change-management service; it listens for changes and applies them without problems.
This is my class:
@Configuration
@Order(Ordered.HIGHEST_PRECEDENCE)
@RefreshScope
@EnableCassandraRepositories(basePackages = {"com.*.*.*.dao.repo"})
public class AppConfig {

    private static final Logger LOGGER = LoggerFactory.getLogger(AppConfig.class);

    @Value("${application['cassandraPort']}")
    private String cassandraPort;

    @Value("${application['cassandraEndpoint']}")
    private String cassandraEndpoint;

    @Value("${application['keyspaceName']}")
    private String keyspaceName;

    @Value("${application['cassandraConsistency']}")
    private String cassandraConsistency;

    @Value("${application['cassandraUserName']}")
    private String cassandraUserName;

    @Autowired
    private AppConfig appConfig;

    public AppConfig() {
        System.out.println("AppConfig Constructor");
    }

    public String getCassandraPort() {
        return cassandraPort;
    }

    public void setCassandraPort(String cassandraPort) {
        this.cassandraPort = cassandraPort;
    }

    public String getCassandraEndpoint() {
        return cassandraEndpoint;
    }

    public void setCassandraEndpoint(String cassandraEndpoint) {
        this.cassandraEndpoint = cassandraEndpoint;
    }

    public String getKeyspaceName() {
        return keyspaceName;
    }

    public void setKeyspaceName(String keyspaceName) {
        this.keyspaceName = keyspaceName;
    }

    public String getCassandraConsistency() {
        return cassandraConsistency;
    }

    public void setCassandraConsistency(String cassandraConsistency) {
        this.cassandraConsistency = cassandraConsistency;
    }

    public String getCassandraUserName() {
        return cassandraUserName;
    }

    public void setCassandraUserName(String cassandraUserName) {
        this.cassandraUserName = cassandraUserName;
    }

    @Bean
    // @RefreshScope
    public CassandraConverter converter() {
        return new MappingCassandraConverter(this.mappingContext());
    }

    @Bean
    // @RefreshScope
    public CassandraMappingContext mappingContext() {
        return new CassandraMappingContext();
    }

    @Bean
    // @RefreshScope
    public CassandraSessionFactoryBean session() {
        CassandraSessionFactoryBean session = new CassandraSessionFactoryBean();
        session.setCluster(this.cluster().getObject());
        session.setKeyspaceName(appConfig.getKeyspaceName());
        session.setConverter(this.converter());
        session.setSchemaAction(SchemaAction.NONE);
        return session;
    }

    @Bean
    // @RefreshScope
    public CassandraClusterFactoryBean cluster() {
        CassandraClusterFactoryBean cluster = new CassandraClusterFactoryBean();
        cluster.setContactPoints(appConfig.getCassandraEndpoint());
        cluster.setPort(Integer.valueOf(appConfig.getCassandraPort()));
        cluster.setUsername(appConfig.getCassandraUserName());
        cluster.setPassword("password");
        cluster.setQueryOptions(new QueryOptions().setConsistencyLevel(ConsistencyLevel.LOCAL_QUORUM));
        return cluster;
    }
}
However, when I try to use @RefreshScope with that configuration class, the application fails to start. This is what it shows in the console:
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 2 of constructor in org.springframework.boot.autoconfigure.data.cassandra.CassandraDataAutoConfiguration required a bean of type 'com.datastax.driver.core.Cluster' that could not be found.
- Bean method 'cassandraCluster' not loaded because auto-configuration 'CassandraAutoConfiguration' was excluded
Action:
Consider revisiting the entries above or defining a bean of type 'com.datastax.driver.core.Cluster' in your configuration.
Are there any guidelines on using @RefreshScope with Cassandra beans? If anyone has done this before, could you share your approach?
You're mixing a couple of things here.
The config class carries both properties and bean definitions.
@RefreshScope on AppConfig interferes with Spring Boot's auto-configuration, and the beans you declared aren't used (that's why you see Parameter 2 of constructor…).
To clean this up, we will reuse as much as possible of what Spring Boot provides and only declare what's really needed.
Follow these steps to solve the issue (based on your code above):
Create a @ConfigurationProperties bean that encapsulates your properties, or better, reuse CassandraProperties.
Re-enable CassandraAutoConfiguration and remove your own MappingContext and CassandraConverter beans; keep only the Cluster and Session bean definitions.
Declare the Cluster and Session beans as needed and put them in @RefreshScope. Your @Configuration class should look like the following.
Example Configuration:
@Configuration
public class MyConfig {

    @Bean(destroyMethod = "close")
    @RefreshScope
    public Cluster cassandraCluster(CassandraProperties properties) {
        Cluster.Builder builder = Cluster.builder()
                .addContactPoints(properties.getContactPoints().toArray(new String[0]))
                .withoutJMXReporting();
        return builder.build();
    }

    @Bean(destroyMethod = "close")
    @RefreshScope
    public Session cassandraSession(CassandraProperties properties, Cluster cluster) {
        return cluster.connect(properties.getKeyspaceName());
    }
}
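Something still has to trigger the refresh when your change-management service detects new hosts. A minimal sketch, assuming spring-cloud-context is on the classpath and that your callback hook (hypothetical here) can invoke it:

@Component
public class CassandraConfigChangeHandler {

    @Autowired
    private ContextRefresher contextRefresher;

    // Call this from your internal Cloud Config change-management listener.
    public void onCassandraConfigChanged() {
        // Re-binds the Environment and destroys @RefreshScope beans;
        // the Cluster and Session above are re-created on next access.
        contextRefresher.refresh();
    }
}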
Related
I need to provide timeouts from the application.properties file, but initialization fails because the properties are not loaded yet. What is the best practice to get them loaded?
@Configuration
@AllArgsConstructor
@Slf4j
public class Config {

    @Value("${connectionTimeout}")
    int connectionTimeout;

    @Value("${responseTimeout}")
    int responseTimeout;

    @Bean
    public ClientHttpConnector getConnector() {
        // HttpClient is immutable, so the configured instance must be kept.
        HttpClient client = HttpClient.create()
                .option(ChannelOption.CONNECT_TIMEOUT_MILLIS, connectionTimeout)
                .responseTimeout(Duration.ofMillis(responseTimeout));
        return new ReactorClientHttpConnector(client);
    }

    @Bean
    public WebClient webClient() {
        return WebClient.builder()
                .defaultHeader(HttpHeaders.CONTENT_TYPE, MediaType.APPLICATION_FORM_URLENCODED_VALUE)
                .clientConnector(getConnector())
                .build();
    }
}
application.properties from the resources folder:
connectionTimeout=30000
responseTimeout=30000
As suggested in other similar posts I tried using @ConfigurationProperties, but that didn't work at all. Is there an easier way to get them loaded that I'm not aware of?
Try injecting the values via the constructor instead of into fields (and drop Lombok's @AllArgsConstructor if you use it; the constructor it generates carries no @Value annotations, so the container cannot resolve its parameters):

public Config(@Value("${connectionTimeout}") int connectionTimeout,
              @Value("${responseTimeout}") int responseTimeout) {
    this.connectionTimeout = connectionTimeout;
    this.responseTimeout = responseTimeout;
}
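For completeness, since @ConfigurationProperties was mentioned: a minimal sketch of how it is usually wired, assuming a hypothetical client prefix (the keys would then be named client.connection-timeout and client.response-timeout):

@ConfigurationProperties(prefix = "client")
public class ClientTimeoutProperties {

    private int connectionTimeout;
    private int responseTimeout;

    public int getConnectionTimeout() { return connectionTimeout; }
    public void setConnectionTimeout(int connectionTimeout) { this.connectionTimeout = connectionTimeout; }

    public int getResponseTimeout() { return responseTimeout; }
    public void setResponseTimeout(int responseTimeout) { this.responseTimeout = responseTimeout; }
}

Register it with @EnableConfigurationProperties(ClientTimeoutProperties.class) on a configuration class and inject ClientTimeoutProperties into the bean method; it is bound before the @Bean methods that depend on it run.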
Try using the Environment class:
@Configuration
public class Config {

    private final Environment environment;

    @Autowired
    public Config(Environment environment) {
        this.environment = environment;
    }

    @Bean
    public SimpleBean simpleBean() {
        SimpleBean simpleBean = new SimpleBean();
        simpleBean.setConfOne(environment.getProperty("conf.one"));
        simpleBean.setConfTwo(environment.getProperty("conf.two"));
        return simpleBean;
    }
}
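For either suggestion, the keys must exist at startup, e.g. in application.properties (conf.one and conf.two are just the names used in the snippet above):

conf.one=valueOne
conf.two=valueTwo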
I would like to benefit from @ConfigurationProperties' fantastic facilities without exposing the bean in my context. It is not a problem of @Primary and the like; I simply cannot expose another DataSource in the context. How can I achieve the following?
@ConfigurationProperties("com.non.exposed.datasource.hikari")
public DataSource privateHikariDatasource() {
    if (Objects.isNull(this.nonExposedDatasource)) {
        this.nonExposedDatasource = this.nonExposedDatasourceProperties.initializeDataSourceBuilder().build();
    }
    return this.nonExposedDatasource;
}
Thanks to the answer by @LppEdd, the final, working solution is:
@Autowired
private Environment environment;

public DataSource privateHikariDatasource() {
    if (Objects.isNull(this.nonExposedDatasource)) {
        this.nonExposedDatasource = bindHikariProperties(this.nonExposedDatasourceProperties.initializeDataSourceBuilder().build());
    }
    return this.nonExposedDatasource;
}

// This does exactly the same as @ConfigurationProperties("com.non.exposed.datasource.hikari"),
// but without exposing the DataSource in the context as a @Bean.
private <T extends DataSource> T bindHikariProperties(final T instance) {
    return Binder.get(this.environment).bind("com.non.exposed.datasource.hikari", Bindable.ofInstance(instance)).get();
}
Then you can call your bean internally with this.privateHikariDatasource() to be used by your other beans.
Great thanks to @LppEdd!
Given that this DataSource is private to a class, and that the containing class can be/is inside the Spring context, you can have a @ConfigurationProperties class
@ConfigurationProperties("com.foo.bar.datasource.hikari")
public class HikariConfiguration { ... }
Which, by registering it via @EnableConfigurationProperties, is available for autowiring:
@EnableConfigurationProperties(HikariConfiguration.class)
@SpringBootApplication
public class Application { ... }
And thus can be autowired in the containing class:
@Component
class MyClass {

    private final HikariConfiguration hikariConfiguration;
    private DataSource springDatasource;

    MyClass(final HikariConfiguration hikariConfiguration) {
        this.hikariConfiguration = hikariConfiguration;
    }

    ...

    private DataSource privateSingletonDataSource() {
        if (Objects.isNull(this.springDatasource)) {
            this.springDatasource = buildDataSource(this.hikariConfiguration);
        }
        return this.springDatasource;
    }
}
buildDataSource will manually construct the DataSource instance.
Remember that you need to take care of synchronization when building the DataSource.
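For illustration, a minimal sketch of buildDataSource, assuming HikariCP as the pool and assuming HikariConfiguration exposes plain getters for the URL and credentials (hypothetical names, not shown in the original answer). The synchronization mentioned above belongs on the lazy getter, where the null check happens:

private synchronized DataSource privateSingletonDataSource() {
    if (Objects.isNull(this.springDatasource)) {
        this.springDatasource = buildDataSource(this.hikariConfiguration);
    }
    return this.springDatasource;
}

private static DataSource buildDataSource(final HikariConfiguration configuration) {
    final HikariConfig config = new HikariConfig();
    config.setJdbcUrl(configuration.getUrl());       // hypothetical getter
    config.setUsername(configuration.getUsername()); // hypothetical getter
    config.setPassword(configuration.getPassword()); // hypothetical getter
    return new HikariDataSource(config);
}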
The final answer is that you cannot reuse DataSourceProperties. You can't even extend it to change the properties' prefix; only a single instance of it can exist inside the context.
The best you can do is mimic what Spring does.
Having
com.non.exposed.datasource.hikari.url=testUrl
com.non.exposed.datasource.hikari.username=testUsername
com.non.exposed.datasource.hikari.password=testPassword
...
You can define a new @ConfigurationProperties class
@ConfigurationProperties("com.non.exposed.datasource")
public class NonExposedProperties {

    private final Map<String, String> hikari = new HashMap<>(8);

    public Map<String, String> getHikari() {
        return hikari;
    }
}
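As with HikariConfiguration earlier, this class still has to be registered, e.g. via @EnableConfigurationProperties(NonExposedProperties.class); only the DataSource itself stays out of the context.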
Then autowire this properties class in your @Configuration/@Component class.
Follow the in-code comments.
@Configuration
public class CustomConfiguration {

    private final NonExposedProperties nonExposedProperties;
    private DataSource dataSource;

    CustomConfiguration(final NonExposedProperties nonExposedProperties) {
        this.nonExposedProperties = nonExposedProperties;
    }

    public DataSource dataSource() {
        if (Objects.isNull(dataSource)) {
            // Create a standalone instance of DataSourceProperties.
            final DataSourceProperties dataSourceProperties = new DataSourceProperties();

            // Use the NonExposedProperties "hikari" Map as the properties' source. It will be
            // {
            //     url -> testUrl
            //     username -> testUsername
            //     password -> testPassword
            //     ... other properties
            // }
            final ConfigurationPropertySource source = new MapConfigurationPropertySource(nonExposedProperties.getHikari());

            // Bind those properties to the DataSourceProperties instance.
            final BindResult<DataSourceProperties> bound =
                    new Binder(source).bind(
                            ConfigurationPropertyName.EMPTY,
                            Bindable.ofInstance(dataSourceProperties)
                    );

            // Retrieve the bound instance (it's not a new one, it's the same as before)
            // and build the DataSource from it.
            dataSource = bound.get().initializeDataSourceBuilder().build();
        }

        // Return the constructed HikariDataSource.
        return dataSource;
    }
}
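As with the Binder approach above, other beans defined in CustomConfiguration can call this.dataSource() internally; because the method carries no @Bean annotation, the DataSource never enters the context.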
My project depends on another one and imports beans from it (using @ImportResource("foo.xml")).
foo.xml defines two data sources (datasource1 and datasource2), and I would like to make datasource1 the primary one (so all of Spring Boot's auto-configurations work).
Is that possible? I found out that DefaultListableBeanFactory has a determinePrimaryCandidate method.
So the idea is to create my own ListableBeanFactory extending DefaultListableBeanFactory, but how do I force Spring Boot to use my implementation?
Or maybe there is another, easier way to mark a given bean as primary (without changing the configuration where it is defined)?
You can create a configuration in your project which builds a new data source annotated as a @Primary bean. This new data source will be datasource1, which Spring will inject into the new data source factory method. Here is a working example.
The config:
@SpringBootApplication
public class BeanSpringExampleApplication {

    @Bean(name = "dataSource1")
    public FakeDataSource dataSource1() {
        return new FakeDataSource("dataSource1");
    }

    @Bean(name = "dataSource2")
    public FakeDataSource dataSource2() {
        return new FakeDataSource("dataSource2");
    }

    @Bean
    @Primary
    public FakeDataSource primaryDataSource(
            @Qualifier("dataSource1") FakeDataSource dataSource1) {
        return dataSource1;
    }
}
Here you see three beans (using a FakeDataSource class) which simulate your situation. The primaryDataSource bean factory method simply returns dataSource1 (it's merely a data source selector).
The FakeDataSource class is just a placeholder to make the example runnable:
public class FakeDataSource {

    private final String fakeProperty;

    public FakeDataSource(String id) {
        fakeProperty = id;
    }

    /**
     * @return the fakeProperty
     */
    public String getFakeProperty() {
        return fakeProperty;
    }
}
Finally, a test which proves everything is working:
@RunWith(SpringRunner.class)
@SpringBootTest
public class BeanSpringExampleApplicationTests {

    @Autowired
    private FakeDataSource fakeDataSource;

    @Test
    public void should_AutowirePrimaryDataSource() throws Exception {
        assertEquals("dataSource1", fakeDataSource.getFakeProperty());
    }
}
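Applied to the imported foo.xml beans, the same selector pattern might look like the sketch below, assuming datasource1 and datasource2 are javax.sql.DataSource beans with exactly those names (taken from the question, not verified here):

@Configuration
@ImportResource("foo.xml")
public class PrimaryDataSourceConfiguration {

    @Bean
    @Primary
    public DataSource primaryDataSource(@Qualifier("datasource1") DataSource datasource1) {
        // Re-expose the imported bean as the primary DataSource,
        // without touching the configuration that defines it.
        return datasource1;
    }
}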
I have seen a lot of examples of Spring Batch projects where either (a) a dataSource is defined, or (b) no dataSource is defined.
However, in my project, I would like my business logic to have access to a dataSource, but I want Spring Batch to NOT use the dataSource. Is this possible?
This guy has a similar problem: Spring boot + spring batch without DataSource
Generally, using Spring Batch without a database is not a good idea, since there can be concurrency issues depending on the kind of job you define. So at least an in-memory DB is strongly advised, especially if you plan to run the job in production.
Using Spring Batch with Spring Boot will initialize an in-memory datasource if you do not configure your own datasource(s).
Taking this into account, let me redefine your question as follows: can my business logic use a different datasource than the one Spring Batch uses to update its BATCH tables?
Yes, it can. As a matter of fact, you can use as many datasources as you want inside your Spring Batch jobs. Just use by-name autowiring.
Here is how I do it:
I always use a configuration class that defines all the datasources my jobs need:
@Configuration
public class DatasourceConfiguration {

    @Bean
    @ConditionalOnMissingBean(name = "dataSource")
    public DataSource dataSource() {
        // Create the datasource used by Spring Batch; for instance,
        // an in-memory datasource using the EmbeddedDatabaseFactory.
        return ...;
    }

    @Bean
    @ConditionalOnMissingBean(name = "bl1datasource")
    public DataSource bl1datasource() {
        return ...; // your first datasource, used in your business logic
    }

    @Bean
    @ConditionalOnMissingBean(name = "bl2datasource")
    public DataSource bl2datasource() {
        return ...; // your second datasource, used in your business logic
    }
}
Three points to note:
Spring Batch looks for a datasource named "dataSource"; if you do not provide this EXACT name (uppercase 'S'), Spring Batch will try to autowire by type, and if it finds more than one DataSource instance it will throw an exception.
Put your datasource configuration in its own class, not in the same class as your job definitions. Spring needs to instantiate the "dataSource" bean very early when it loads the context, before it starts to instantiate your Job and Step beans, and it cannot do this correctly if the datasource definitions sit in the same class as your job/step definitions.
Using @ConditionalOnMissingBean is not mandatory, but I find it good practice. It makes it easy to swap the datasources for unit/integration tests: just provide an additional test configuration in the ContextConfiguration of your test which, for instance, overwrites "bl1datasource" with an in-memory datasource, as shown below:
@Configuration
public class TestBL1DatasourceConfiguration {

    // Overwrites bl1datasource with an in-memory datasource.
    @Bean
    public DataSource bl1datasource() {
        return new EmbeddedDatabaseFactory().getDatabase();
    }
}
In order to use the business-logic datasources, inject them by name:
@Component
public class PrepareRe1Re2BezStepCreatorComponent {

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private DataSource bl1datasource;

    @Autowired
    private DataSource bl2datasource;

    public Step createStep() throws Exception {
        SimpleStepBuilder<..., ...> builder =
                stepBuilderFactory.get("astep") //
                        .<..., ...> chunk(100) //
                        .reader(createReader(bl1datasource)) //
                        .writer(createWriter(bl2datasource)); //
        return builder.build();
    }
}
Furthermore, you probably want to consider using XA datasources if you'd like to work with several datasources.
Edited:
Since it seems that you really don't want to use a datasource, you have to implement your own BatchConfigurer (http://docs.spring.io/spring-batch/trunk/apidocs/org/springframework/batch/core/configuration/annotation/BatchConfigurer.html), as Michael Minella (the Spring Batch project lead) pointed out above.
You can use the code of org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer as a starting point for your own implementation. Simply remove all the datasource/transaction-manager code and keep the content of the if (dataSource == null) branch in the initialize method. This will initialize a map-based JobRepository and JobExplorer. But again, this is NOT a usable solution in a production environment, since it is not thread-safe.
Edited:
How to implement it:
Configuration class that defines the "businessDataSource":
#Configuration
public class DataSourceConfigurationSimple {
DataSource embeddedDataSource;
#Bean
public DataSource myBusinessDataSource() {
if (embeddedDataSource == null) {
EmbeddedDatabaseFactory factory = new EmbeddedDatabaseFactory();
embeddedDataSource = factory.getDatabase();
}
return embeddedDataSource;
}
}
The implementation of a specific BatchConfigurer:
(of course, the methods have to be implemented; see the sketch after this skeleton)
public class MyBatchConfigurer implements BatchConfigurer {

    @Override
    public JobRepository getJobRepository() throws Exception {
        return null;
    }

    @Override
    public PlatformTransactionManager getTransactionManager() throws Exception {
        return null;
    }

    @Override
    public JobLauncher getJobLauncher() throws Exception {
        return null;
    }

    @Override
    public JobExplorer getJobExplorer() throws Exception {
        return null;
    }
}
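For reference, a minimal map-based sketch that mirrors the dataSource == null branch of DefaultBatchConfigurer. It uses the Map*FactoryBean classes shipped with Spring Batch (deprecated in recent versions and, as stressed above, not thread-safe, so not suitable for production):

public class MyBatchConfigurer implements BatchConfigurer {

    private JobRepository jobRepository;
    private JobExplorer jobExplorer;
    private JobLauncher jobLauncher;
    private PlatformTransactionManager transactionManager;

    @PostConstruct
    public void initialize() throws Exception {
        // No datasource: resourceless transactions and map-backed metadata.
        this.transactionManager = new ResourcelessTransactionManager();

        MapJobRepositoryFactoryBean repositoryFactory =
                new MapJobRepositoryFactoryBean(this.transactionManager);
        repositoryFactory.afterPropertiesSet();
        this.jobRepository = repositoryFactory.getObject();

        MapJobExplorerFactoryBean explorerFactory =
                new MapJobExplorerFactoryBean(repositoryFactory);
        explorerFactory.afterPropertiesSet();
        this.jobExplorer = explorerFactory.getObject();

        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(this.jobRepository);
        launcher.afterPropertiesSet();
        this.jobLauncher = launcher;
    }

    @Override
    public JobRepository getJobRepository() {
        return this.jobRepository;
    }

    @Override
    public PlatformTransactionManager getTransactionManager() {
        return this.transactionManager;
    }

    @Override
    public JobLauncher getJobLauncher() {
        return this.jobLauncher;
    }

    @Override
    public JobExplorer getJobExplorer() {
        return this.jobExplorer;
    }
}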
And finally the main configuration and launch class:
@SpringBootApplication
@Configuration
@EnableBatchProcessing
// Importing MyBatchConfigurer will install your BatchConfigurer instead of
// the Spring Batch default configurer.
@Import({DataSourceConfigurationSimple.class, MyBatchConfigurer.class})
public class SimpleTestJob {

    // Job name (assumed; the original snippet referenced JOB_NAME without defining it).
    private static final String JOB_NAME = "SimpleTestJob";

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    @Bean
    public Job job() throws Exception {
        SimpleJobBuilder standardJob = this.jobs.get(JOB_NAME)
                .start(step1());
        return standardJob.build();
    }

    protected Step step1() throws Exception {
        TaskletStepBuilder standardStep1 = this.steps.get("SimpleTest_step1_Step")
                .tasklet(tasklet());
        return standardStep1.build();
    }

    protected Tasklet tasklet() {
        return (contribution, context) -> {
            System.out.println("tasklet called");
            return RepeatStatus.FINISHED;
        };
    }

    public static void main(String[] args) throws Exception {
        SpringApplication.run(SimpleTestJob.class, args);
    }
}
On a project I'm currently working on, we need multiple profiles, i.e. "default" and "cloud".
Both DefaultContext and CloudContext contain the same bean definitions.
We are using PCF (Pivotal Cloud Foundry).
We have created an interface:
public interface Config {
    DataSource getDataSource();
    SomeService getService();
}
Then we implement the interface for each profile:
@Primary
@Configuration
@Profile("default")
public class DevConfig implements Config {

    public DataSource getDataSource() {
        // create and return production datasource
    }

    public SomeService getService() {
        // Create and return production service
    }
}
And then do the same for cloud.
@Configuration
@Profile("cloud")
public class CloudConfig extends AbstractCloudConfig implements Config {

    public DataSource getDataSource() {
        // create and return dummy datasource
    }

    public SomeService getService() {
        // Create and return dummy service
    }
}
And we are autowiring the Config in the service call, in the processor file:
@Service("processor")
public class Processor {

    @Autowired
    Config dsConfig;

    public Object get(int number) {
        return dao.get(number, dsConfig.getDataSource());
    }
}
If we deploy to PCF it works fine, as the profile is cloud. When running locally it should pick up the default profile, but dsConfig is null.
Could you please help with this?
@Configuration classes aren't available for autowiring.
As @spencergibb pointed out in the comments, you need to tell the container to make these classes available for autowiring.
To do that, annotate them with @Component.
Something like this:
@Component
@Profile("default")
public class DevConfig implements Config {

    public DataSource getDataSource() {
        // create and return production datasource
    }

    public SomeService getService() {
        // Create and return production service
    }
}
In case it still doesn't work, check the following two points:
Are the configs (DevConfig and CloudConfig) in different packages, so that component scanning doesn't find them?
Are you running under another profile locally (like dev)?
You can add this snippet to your code (it's from JHipster) to log the active profiles:
@Autowired
private Environment env;

/**
 * Initializes the application.
 * <p/>
 * Spring profiles can be configured with the program argument --spring.profiles.active=your-active-profile
 * <p/>
 */
@PostConstruct
public void initApplication() throws IOException {
    if (env.getActiveProfiles().length == 0) {
        log.warn("No Spring profile configured, running with default configuration");
    } else {
        log.info("Running with Spring profile(s) : {}", Arrays.toString(env.getActiveProfiles()));
    }
}
I'd rather autowire the datasource and service classes instead of the configuration class.
That way you don't need any instance of the configuration and can directly autowire whatever class you want.
The classes would look like this:
Default Config:
@Primary
@Configuration
@Profile("default")
public class DevConfig implements Config {

    @Bean
    public DataSource getDataSource() {
        // create and return production datasource
    }

    @Bean
    public SomeService getService() {
        // Create and return production service
    }
}
Cloud Config:
@Configuration
@Profile("cloud")
public class CloudConfig extends AbstractCloudConfig implements Config {

    @Bean
    public DataSource getDataSource() {
        // create and return dummy datasource
    }

    @Bean
    public SomeService getService() {
        // Create and return dummy service
    }
}
Processor Class:
@Service("processor")
public class Processor {

    @Autowired
    private DataSource dataSource;

    public Object get(int number) {
        return dao.get(number, dataSource);
    }
}