I'm trying to replace my old:
@Component
public interface MyEntityRepository extends JpaRepository<MyEntity, Integer> {

    @QueryHints({@QueryHint(name = CACHEABLE, value = "true")})
    MyEntity findByName(String name);
}
with this:
@Component
public interface MyEntityRepository extends JpaRepository<MyEntity, Integer> {

    @Cacheable(value = "entities")
    MyEntity findByName(String name);
}
because I want to use advanced caching features like not caching null values, etc.
To do so, I followed the Spring caching tutorial: https://spring.io/guides/gs/caching/
If I don't annotate my Application.java, caching simply doesn't work.
But if I add @EnableCaching and a CacheManager bean:
package my.application.config;

@EnableWebMvc
@ComponentScan(basePackages = {"my.application"})
@Configuration
@EnableCaching
public class Application extends WebMvcConfigurerAdapter {

    @Bean
    public CacheManager cacheManager() {
        return new ConcurrentMapCacheManager("entities");
    }
    // ...
}
I get the following error at startup:
java.lang.IllegalStateException: No CacheResolver specified, and no bean of type CacheManager found. Register a CacheManager bean or remove the @EnableCaching annotation from your configuration
I get the same error if I replace my CacheManager bean with a CacheResolver bean like:
@Bean
public CacheResolver cacheResolver() {
    return new SimpleCacheResolver(new ConcurrentMapCacheManager("entities"));
}
Am I missing something?
@herau You were right, I had to name the bean!
The problem was that there was another bean named "cacheManager", so in the end I didn't annotate Application and instead created a dedicated configuration class:
@EnableCaching
@Configuration
public class CacheConf {

    @Bean(name = "springCM")
    public CacheManager cacheManager() {
        return new ConcurrentMapCacheManager("entities");
    }
}
In MyEntityRepository:
@Cacheable(value = "entities", cacheManager = "springCM")
MyEntity findByName(String name);
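Since the original motivation was to avoid caching null values, the final mapping could be extended with the unless attribute of @Cacheable (a sketch reusing the cache and bean names above):
// Sketch: skip caching when the repository returns null.
// "#result" is the SpEL variable for the method's return value.
@Cacheable(value = "entities", cacheManager = "springCM", unless = "#result == null")
MyEntity findByName(String name);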
In my case the Spring Boot version was old and there was no easy way to upgrade it, so I used the EhCache 2 version, and it worked in my application. Here is a project I found useful to refer to: https://github.com/TechPrimers/spring-ehcache-example/blob/master/src/main/resources/ehcache.xml
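For reference, a minimal Java-side setup for Spring with EhCache 2.x (via spring-context-support) might look roughly like this; it is a sketch that assumes an ehcache.xml like the one linked above, containing a cache named "entities", and the class name EhCacheConfig is made up:
import org.springframework.cache.CacheManager;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.cache.ehcache.EhCacheCacheManager;
import org.springframework.cache.ehcache.EhCacheManagerFactoryBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.io.ClassPathResource;

@Configuration
@EnableCaching
public class EhCacheConfig {

    // Loads the native net.sf.ehcache.CacheManager from ehcache.xml on the classpath
    @Bean
    public EhCacheManagerFactoryBean ehCacheManagerFactory() {
        EhCacheManagerFactoryBean factory = new EhCacheManagerFactoryBean();
        factory.setConfigLocation(new ClassPathResource("ehcache.xml"));
        factory.setShared(true);
        return factory;
    }

    // Wraps the native EhCache manager in Spring's CacheManager abstraction
    @Bean
    public CacheManager cacheManager() {
        return new EhCacheCacheManager(ehCacheManagerFactory().getObject());
    }
}
With this in place, @Cacheable("entities") on the repository method resolves against the EhCache-backed cache instead of the ConcurrentMap one.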
Here is my situation. I have a parent project which has a bean configuration as follows:
@Configuration
public class Configuration {

    @Bean
    public BeanA beanA(@Autowired BeanB beanB) {
        return new BeanA(beanB);
    }
}
I want to override this configuration, because I need to override some of the definitions on BeanB:
@Configuration
public class Configuration {

    @Bean
    public BeanA beanA(@Autowired BeanC beanC) {
        return new BeanA(beanC);
    }
}
where my bean of type C is:
public class BeanC extends BeanB { ... }
But when I run the application, I always get the configuration coming from the parent. I have also enabled bean definition overriding:
spring.main.allow-bean-definition-overriding=true
Does anyone know how I can tell the Spring container to use my bean definition instead of the one coming from the parent project?
Thanks in advance!
You can annotate your bean with the @Primary annotation (see: https://www.baeldung.com/spring-primary):
@Configuration
public class Configuration {

    @Primary
    @Bean
    public BeanA beanA(@Autowired BeanC beanC) {
        return new BeanA(beanC);
    }
}
Please help me.
I am using multiple data sources in my project.
Data source properties:
spring.datasource.url=jdbc:sqlserver://localhost:1433;databaseName=db
spring.datasource.username=xxxxx
spring.datasource.password=xxxxx
spring.datasource.driverClassName=com.microsoft.sqlserver.jdbc.SQLServerDriver
spring.datasource2.url=jdbc:mysql://localhost:3306/db2
spring.datasource2.username=xxxx
spring.datasource2.password=xxx
spring.datasource2.driver-class-name=com.mysql.cj.jdbc.Driver
Config classes:
@Configuration
@EnableJdbcRepositories(
    jdbcOperationsRef = "mysqlNamedParameterJdbcOperations",
    basePackages = "com.example.demo.mysqlModels"
)
public class Config extends AbstractJdbcConfiguration {

    @Bean("mysqlDataSource")
    @ConfigurationProperties(prefix = "spring.datasource2")
    public DataSource mysqlDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "mysqlNamedParameterJdbcOperations")
    NamedParameterJdbcOperations mysqlNamedParameterJdbcOperations(@Qualifier("mysqlDataSource") DataSource mysqlDataSource) {
        return new NamedParameterJdbcTemplate(mysqlDataSource);
    }
}
@Configuration
@EnableJdbcRepositories(
    jdbcOperationsRef = "mssqlNamedParameterJdbcOperations",
    basePackages = "com.example.demo.mssqlModels"
)
public class Config2 extends AbstractJdbcConfiguration {

    @Bean("mssqlDataSource")
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource mssqlDataSource() {
        return DataSourceBuilder.create().build();
    }

    @Bean(name = "mssqlNamedParameterJdbcOperations")
    NamedParameterJdbcOperations mssqlNamedParameterJdbcOperations(@Qualifier("mssqlDataSource") DataSource mssqlDataSource) {
        return new NamedParameterJdbcTemplate(mssqlDataSource);
    }
}
Repository in com.example.demo.mssqlModels:
public interface MssqlRepository extends PagingAndSortingRepository<MyEntity, Integer> {}
Repository in com.example.demo.mysqlModels:
public interface MysqlRepository extends PagingAndSortingRepository<MyEntity, Integer> {}
My service:
@Slf4j
@Service
public class MyService {

    @Autowired
    private MssqlRepository mssqlRepository;

    @Autowired
    private MysqlRepository mysqlRepository;

    @PostConstruct
    public void init() {
        log.info("mssql result {}", mssqlRepository.findAll());
        log.info("mysql result {}", mysqlRepository.findAll());
    }
}
But the result is the same: both repositories read data from the MySQL data source.
Thanks.
You might be interested in looking at a question I raised recently here regarding two data sources, each applied to a different repository.
1. In your configuration classes you should also create two TransactionManagers with unique names (see the sketch after this list).
2. In each repository, annotate it with @Transactional(transactionManager = "transaction manager name"), replacing the transaction manager name with the appropriate one.
3. You'll probably need to override the default methods such as saveAll() with the same annotation as in (2).
However, as per my question, I found that an incorrect data source is sometimes used (I've since found that making the Postgres classes the primary resolved my problem, but I don't know why this worked).
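A rough sketch of point 1, reusing the "mssqlDataSource"/"mysqlDataSource" beans from the question; the config class name and the transaction manager bean names here are just examples:
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class TransactionManagerConfig {

    // Transaction manager bound to the SQL Server data source
    @Bean(name = "mssqlTransactionManager")
    public PlatformTransactionManager mssqlTransactionManager(
            @Qualifier("mssqlDataSource") DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

    // Transaction manager bound to the MySQL data source
    @Bean(name = "mysqlTransactionManager")
    public PlatformTransactionManager mysqlTransactionManager(
            @Qualifier("mysqlDataSource") DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }
}
The repositories can then be pointed at the right manager with, for example, @Transactional(transactionManager = "mysqlTransactionManager").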
I have a Spring Boot application whose main class is annotated with @SpringBootApplication (so it implicitly has @EnableAutoConfiguration, @ComponentScan, and @Configuration).
What happens if I create another class annotated with @Configuration and @ComponentScan? Do I create another container of beans? Would the beans be duplicated? Is it a good idea to create more @Configuration classes? Does @Configuration create a container of beans? If so, do the two containers share the beans?
I need to understand these questions.
What happens if I create another class with annotation @Configuration and @ComponentScan?
That is OK; it is perfectly normal.
Do I create another container of beans? In this way, are the beans duplicated?
If you create two beans of the same type, you will get an error when the app starts; you need to declare one of them as @Primary.
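A minimal sketch of that situation; the GreetingService type, the bean method names, and the config class name are made up for illustration:
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

// Two beans of the same type: without @Primary, injecting a single
// GreetingService by type would be ambiguous and fail at startup.
interface GreetingService {
    String greet();
}

@Configuration
public class GreetingConfig {

    @Primary
    @Bean
    public GreetingService englishGreeting() {
        return () -> "Hello";
    }

    @Bean
    public GreetingService frenchGreeting() {
        return () -> "Bonjour";
    }
}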
You can define multiple @Configuration classes in Spring Boot, as Matias Elorriaga mentioned.
@Configuration is equivalent to the <beans> tag in Spring XML, which holds many <bean> elements (each equivalent to @Bean); in the same way, a class annotated with @Configuration says "you can define multiple beans inside me".
For example:
@Configuration
public class AppConfig {

    @Bean
    public MessageSource messageSource() {
        ResourceBundleMessageSource messageSource = new ResourceBundleMessageSource();
        messageSource.setBasename("message");
        messageSource.setDefaultEncoding("UTF-8");
        return messageSource;
    }

    @Bean
    public Validator validator() {
        final LocalValidatorFactoryBean localValidatorFactoryBean = new LocalValidatorFactoryBean();
        localValidatorFactoryBean.setValidationMessageSource(messageSource());
        return localValidatorFactoryBean;
    }

    @Bean
    public LocaleResolver localeResolver() {
        SessionLocaleResolver sessionLocaleResolver = new SessionLocaleResolver();
        sessionLocaleResolver.setDefaultLocale(Locale.ENGLISH);
        return sessionLocaleResolver;
    }
}
Also, if you have two different classes with the same type of bean,
e.g. class SpringConfigA has a Composite bean:
@Configuration
public class SpringConfigA {

    @Bean
    public Composite composite() {
        Composite c = new Composite();
        return c;
    }
}
and another class SpringConfigB has the same Composite bean:
@Configuration
public class SpringConfigB {

    @Bean
    public Composite composite() {
        Composite c = new Composite();
        return c;
    }
}
then it will throw an exception during bean initialization because of the ambiguity between the Composite beans.
Here you can use @Qualifier to fix this issue:
@Configuration
public class SpringConfigA {

    @Bean
    @Qualifier("compositeA")
    public Composite composite() {
        Composite c = new Composite();
        return c;
    }
}
and another class SpringConfigB has the same Composite bean with the qualifier name "compositeB":
@Configuration
public class SpringConfigB {

    @Bean
    @Qualifier("compositeB")
    public Composite composite() {
        Composite c = new Composite();
        return c;
    }
}
Yes, you can register a bean using @ComponentScan, either by registering the configuration class in the dispatcher servlet context or by importing a separate configuration class that is already registered in the Spring container.
Suppose you have a Config class in which you are scanning components:
@ComponentScan(basePackages = {"xyz"})
@Configuration
public class Config {
    ....
}
To register Config in the container, you can do:
new AnnotationConfigWebApplicationContext().register(Config.class);
or:
@Configuration
@Import({MvcConfig.class})
public class AnotherConfig {
    ....
}
I'm unable to get @Profile to work on a @Bean, but it works fine on the @Configuration class. In my module tests I correctly get the MyServiceStub autowired in the first example, but not in the second.
Does anyone know why?
This works:
MyConfig.java
@Configuration
@Import({StubConfig.class, RealConfig.class})
public class MyConfig {}
StubConfig.java
#Profile({"featureTest", "moduleTest"})
#Configuration
public class StubConfig {
#Bean
public MyService myService() {
return new MyServiceStub();
}
}
RealConfig.java
#Profile({"!featureTest", "!moduleTest"})
#Configuration
public class RealConfig {
#Bean
public MyService myService() {
return new MyService();
}
}
But this doesn't work:
MyConfig.java
@Configuration
public class MyConfig {

    @Bean
    @Profile({"featureTest", "moduleTest"})
    public MyService myServiceStub() {
        return new MyServiceStub();
    }

    @Bean
    @Profile({"!featureTest", "!moduleTest"})
    public MyService myService() {
        return new MyService();
    }
}
Instead of the @Profile approach, you could try using @Conditional to create the beans conditionally.
Or have an explicit if statement that checks the active profile and creates the bean instance accordingly. That approach, I think, would mirror what could be done in Spring XML with its expression language. But @Conditional seems a better fit for you.
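A minimal sketch of the @Conditional idea, reusing the profile names and service types from the question; the condition class names are made up, and each public class would live in its own file:
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Condition;
import org.springframework.context.annotation.ConditionContext;
import org.springframework.context.annotation.Conditional;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.type.AnnotatedTypeMetadata;

// Matches when either test profile is active.
// acceptsProfiles(String...) is deprecated in Spring 5.1+ in favor of Profiles.of(...).
public class StubProfilesCondition implements Condition {
    @Override
    public boolean matches(ConditionContext context, AnnotatedTypeMetadata metadata) {
        return context.getEnvironment().acceptsProfiles("featureTest", "moduleTest");
    }
}

// Matches when neither test profile is active
public class RealProfilesCondition implements Condition {
    @Override
    public boolean matches(ConditionContext context, AnnotatedTypeMetadata metadata) {
        return !context.getEnvironment().acceptsProfiles("featureTest", "moduleTest");
    }
}

@Configuration
public class MyConfig {

    @Bean
    @Conditional(StubProfilesCondition.class)
    public MyService myServiceStub() {
        return new MyServiceStub();
    }

    @Bean
    @Conditional(RealProfilesCondition.class)
    public MyService myService() {
        return new MyService();
    }
}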
I am trying to configure a couple of datasources within Spring Batch. On startup, Spring Batch is throwing the following exception:
To use the default BatchConfigurer the context must contain no more than one DataSource, found 2
Snippet from the batch configuration:
@Configuration
@EnableBatchProcessing
public class BatchJobConfiguration {

    @Primary
    @Bean(name = "baseDatasource")
    public DataSource dataSource() {
        // first datasource definition here
    }

    @Bean(name = "secondaryDataSource")
    public DataSource dataSource2() {
        // second datasource definition here
    }
    ...
}
Not sure why I am seeing this exception, because I have seen some XML-based configurations for Spring Batch that declare multiple data sources. I am using Spring Batch core version 3.0.1.RELEASE with Spring Boot version 1.1.5.RELEASE. Any help would be greatly appreciated.
You must provide your own BatchConfigurer; Spring does not want to make that decision for you:
@Configuration
@EnableBatchProcessing
public class BatchConfig {

    @Bean
    BatchConfigurer configurer(@Qualifier("batchDataSource") DataSource dataSource) {
        return new DefaultBatchConfigurer(dataSource);
    }
    ...
AbstractBatchConfiguration tries to look up a BatchConfigurer in the container first; if none is found, it tries to create one itself. This is where the IllegalStateException is thrown when there is more than one DataSource bean in the container.
The approach to solving the problem is to prevent AbstractBatchConfiguration from creating the DefaultBatchConfigurer bean.
To do this, we hint the Spring container to create DefaultBatchConfigurer itself by using the @Component annotation.
The configuration class holding @EnableBatchProcessing can be annotated with @ComponentScan, scanning the package that contains an empty class derived from DefaultBatchConfigurer:
package batch_config;
...

@EnableBatchProcessing
@ComponentScan(basePackageClasses = MyBatchConfigurer.class)
public class MyBatchConfig {
    ...
}
The full code of that empty derived class is here:
package batch_config.components;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.stereotype.Component;

@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {
}
In this configuration, the @Primary annotation works for the DataSource bean, as in the example below:
@Configuration
public class BatchTestDatabaseConfig {

    @Bean
    @Primary
    public DataSource dataSource() {
        return .........;
    }
}
This works for the Spring Batch version 3.0.3.RELEASE
The simplest solution to make the @Primary annotation on the DataSource work might be to just add @ComponentScan(basePackageClasses = DefaultBatchConfigurer.class) along with the @EnableBatchProcessing annotation:
@Configuration
@EnableBatchProcessing
@ComponentScan(basePackageClasses = DefaultBatchConfigurer.class)
public class MyBatchConfig {
    ...
}
I would like to provide a solution here, which is very similar to the one answered by @vanarchi, but I managed to put all the necessary configuration into one class.
For the sake of completeness, the solution here assumes that primary datasource is hsql.
@Configuration
@EnableBatchProcessing
public class BatchConfiguration extends DefaultBatchConfigurer {

    @Bean
    @Primary
    public DataSource batchDataSource() {
        // no need to shut down; EmbeddedDatabaseFactoryBean will take care of this
        EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
        EmbeddedDatabase embeddedDatabase = builder
                .addScript("classpath:org/springframework/batch/core/schema-drop-hsqldb.sql")
                .addScript("classpath:org/springframework/batch/core/schema-hsqldb.sql")
                .setType(EmbeddedDatabaseType.HSQL) // .H2 or .DERBY
                .build();
        return embeddedDatabase;
    }

    @Override
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(batchDataSource());
        factory.setTransactionManager(transactionManager());
        factory.afterPropertiesSet();
        return (JobRepository) factory.getObject();
    }

    private ResourcelessTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    // NOTE: the code below just gives the developer an easy way to access the in-memory HSQL
    // datasource, as we configured it as the primary datasource to store batch-job-related data.
    // Default username: sa, password: ''
    @PostConstruct
    public void getDbManager() {
        DatabaseManagerSwing.main(
                new String[] { "--url", "jdbc:hsqldb:mem:testdb", "--user", "sa", "--password", "" });
    }
}
Three key points in this solution:
1. This class is annotated with @EnableBatchProcessing and @Configuration, and extends DefaultBatchConfigurer. By doing this, we instruct spring-batch to use our customized batch configurer when AbstractBatchConfiguration tries to look up a BatchConfigurer.
2. The batchDataSource bean is annotated with @Primary, which instructs spring-batch to use this datasource as the datasource for storing the nine job-related tables.
3. We override the protected JobRepository createJobRepository() throws Exception method, which makes the jobRepository bean use the primary datasource and a transactionManager instance different from the one used by the other datasource(s).
The simplest solution is to extend the DefaultBatchConfigurer and autowire your datasource via a qualifier:
@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {

    /**
     * Initialize the BatchConfigurer to use the datasource of your choosing
     * @param firstDataSource
     */
    @Autowired
    public MyBatchConfigurer(@Qualifier("firstDataSource") DataSource firstDataSource) {
        super(firstDataSource);
    }
}
Side Note (as this also deals with the use of multiple data sources): If you use autoconfig to run data initialization scripts, you may notice that it's not initializing on the datasource you'd expect. For that issue, take a look at this: https://github.com/spring-projects/spring-boot/issues/9528
You can define the beans below; make sure your application.properties file has the entries they need.
@Configuration
@PropertySource("classpath:application.properties")
public class DataSourceConfig {

    @Primary
    @Bean(name = "abcDataSource")
    @ConfigurationProperties(prefix = "abc.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }

    @Bean(name = "xyzDataSource")
    @ConfigurationProperties(prefix = "xyz.datasource")
    public DataSource xyzDataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }
}
application.properties
abc.datasource.jdbc-url=XXXXX
abc.datasource.username=XXXXX
abc.datasource.password=xxxxx
abc.datasource.driver-class-name=org.postgresql.Driver
...........
Here you can refer to: Spring Boot Configure and Use Two DataSources
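To actually use the two data sources above, you could wire qualified JdbcTemplates on top of them; this is a sketch, and the template bean names ("abcJdbcTemplate"/"xyzJdbcTemplate") and the config class name are made up:
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.core.JdbcTemplate;

@Configuration
public class JdbcTemplateConfig {

    // Template backed by the primary "abc" data source
    @Bean(name = "abcJdbcTemplate")
    public JdbcTemplate abcJdbcTemplate(@Qualifier("abcDataSource") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    // Template backed by the secondary "xyz" data source
    @Bean(name = "xyzJdbcTemplate")
    public JdbcTemplate xyzJdbcTemplate(@Qualifier("xyzDataSource") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}
Consumers then inject the one they need with @Qualifier("abcJdbcTemplate") or @Qualifier("xyzJdbcTemplate").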
First, create a custom BatchConfigurer:
@Configuration
@Component
public class TwoDataSourcesBatchConfigurer implements BatchConfigurer {

    @Autowired
    @Qualifier("dataSource1")
    DataSource dataSource;

    @Override
    public JobExplorer getJobExplorer() throws Exception {
        ...
    }

    @Override
    public JobLauncher getJobLauncher() throws Exception {
        ...
    }

    @Override
    public JobRepository getJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        // use the autowired data source
        factory.setDataSource(dataSource);
        factory.setTransactionManager(getTransactionManager());
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Override
    public PlatformTransactionManager getTransactionManager() throws Exception {
        ...
    }
}
Then,
@Configuration
@EnableBatchProcessing
@ComponentScan("package")
public class JobConfig {
    // define job, step, ...
}