How to resolve a circular dependency in the Spring context? - java

I have three classes:
open class RedirectProcessor(
    private val adProcessor: AdProcessor
) {
    fun run(depth: Int): String =
        if (depth < 3) adProcessor.run(depth + 1) else "redirect"
}

open class FallbackProcessor(
    private val adProcessor: AdProcessor
) {
    fun run(depth: Int): String =
        if (depth < 3) adProcessor.run(depth + 1) else "fallback"
}

open class AdProcessor(
    private val redirectProcessor: RedirectProcessor,
    private val fallbackProcessor: FallbackProcessor
) {
    fun run(depth: Int): String =
        depth.toString() +
            redirectProcessor.run(depth) +
            fallbackProcessor.run(depth)
}
So, they depend on each other. I tried to configure the Spring context as below:
@Configuration
class Config {

    @Bean
    @Lazy
    fun redirectProcessor(): RedirectProcessor = RedirectProcessor(adProcessor())

    @Bean
    @Lazy
    fun fallbackProcessor(): FallbackProcessor = FallbackProcessor(adProcessor())

    @Bean
    fun adProcessor() = AdProcessor(
        redirectProcessor = redirectProcessor(),
        fallbackProcessor = fallbackProcessor()
    )
}
I know that I have to use the @Lazy annotation. If I mark my services with @Component and use @Lazy in the constructor, it works fine. But I need to define the beans with @Bean, and that causes problems. Is there any way to solve this?

I can't speak for Kotlin (my knowledge of Kotlin is rather limited at this point), but in Java, with the latest available Spring version (5.2.6.RELEASE),
I've got it working with the following "Kotlin to Java" translation of your example:
public class RedirectProcessor {

    private final AdProcessor adProcessor;

    public RedirectProcessor(AdProcessor adProcessor) {
        this.adProcessor = adProcessor;
    }

    public String run(int depth) {
        if (depth < 3) {
            return adProcessor.run(depth + 1);
        } else {
            return "redirect";
        }
    }
}

public class FallbackProcessor {

    private final AdProcessor adProcessor;

    public FallbackProcessor(AdProcessor adProcessor) {
        this.adProcessor = adProcessor;
    }

    public String run(int depth) {
        if (depth < 3) {
            return adProcessor.run(depth + 1);
        } else {
            return "fallback";
        }
    }
}

public class AdProcessor {

    private RedirectProcessor redirectProcessor;
    private FallbackProcessor fallbackProcessor;

    public AdProcessor(RedirectProcessor redirectProcessor, FallbackProcessor fallbackProcessor) {
        this.redirectProcessor = redirectProcessor;
        this.fallbackProcessor = fallbackProcessor;
    }

    public String run(int depth) {
        return depth + redirectProcessor.run(depth) + fallbackProcessor.run(depth);
    }
}
The trick was to use the configuration in a different, yet totally "legitimate" (from the Java configuration rules' standpoint), way:
@SpringBootApplication
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }

    @Bean
    public RedirectProcessor redirectProcessor(@Lazy AdProcessor adProcessor) {
        return new RedirectProcessor(adProcessor);
    }

    @Bean
    public FallbackProcessor fallbackProcessor(@Lazy AdProcessor adProcessor) {
        return new FallbackProcessor(adProcessor);
    }

    @Bean
    public AdProcessor adProcessor(RedirectProcessor redirectProcessor, FallbackProcessor fallbackProcessor) {
        return new AdProcessor(redirectProcessor, fallbackProcessor);
    }

    @EventListener
    public void onApplicationStarted(ApplicationStartedEvent evt) {
        AdProcessor adProcessor = evt.getApplicationContext().getBean(AdProcessor.class);
        String result = adProcessor.run(2);
        System.out.println(result);
    }
}
Note the usage of the @Lazy annotation on a parameter, not on the bean itself.
The listener is there for testing purposes only. Running the application prints 23redirectfallback3redirectfallback.
Now, why does it work?
When Spring sees such a @Lazy-annotated parameter, it creates a runtime-generated proxy (with CGLIB) for the parameter class.
This proxy wraps the bean, and the bean is fully created only when it's "required" for the first time (read: when we call a method on it).
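To see the proxy at startup, here is a minimal sketch (assuming the beans declared above; the lazyProxyDemo bean is my own addition for illustration):
@Bean
public ApplicationRunner lazyProxyDemo(@Lazy AdProcessor adProcessor) {
    return args -> {
        // Prints a runtime-generated subclass (something like
        // AdProcessor$$EnhancerBySpringCGLIB$$...), not AdProcessor itself:
        // the injected reference is the proxy, not the raw bean.
        System.out.println(adProcessor.getClass().getName());
        // Method calls are forwarded to the real AdProcessor, which is
        // resolved from the context the first time it's needed.
        System.out.println(adProcessor.run(2));
    };
}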
If you work with @Component, it's the same as the following declaration:
@Component
public class FallbackProcessor {

    private final AdProcessor adProcessor;

    public FallbackProcessor(@Lazy AdProcessor adProcessor) {
        this.adProcessor = adProcessor;
    }

    public String run(int depth) {
        ...
    }
}
One side note: I haven't put @Autowired on the constructor of the FallbackProcessor class in the last example, because when there is a single constructor Spring will "recognize" it and use it to inject all the dependencies.
The following tutorial and this somewhat old thread of SO can be relevant as well (worth reading).

I ran into the same issue, and the @Autowired annotation didn't work for some reason I don't know.
So I used another workaround:
- inject the ApplicationContext instead of the bean itself
- retrieve the bean instance from the ApplicationContext

Code like:
class ServiceA(
    private val serviceB: ServiceB
) {
    ......
}

class ServiceB(
    private val applicationContext: ApplicationContext
) {
    private val serviceA: ServiceA by lazy {
        // We need this logic only once, so a property
        // delegated by lazy { ... } is perfect for this purpose.
        applicationContext.getBean(ServiceA::class.java)
    }
    ......
}
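A Java translation of the same workaround could look like this (a sketch, assuming ServiceA and ServiceB are registered as beans; Java has no by lazy delegate, so a memoizing accessor stands in for it):
@Service
public class ServiceB {

    private final ApplicationContext applicationContext;

    private ServiceA serviceA; // resolved lazily, mirroring Kotlin's `by lazy`

    public ServiceB(ApplicationContext applicationContext) {
        this.applicationContext = applicationContext;
    }

    private ServiceA serviceA() {
        // Fetch the bean from the context on first use only.
        if (serviceA == null) {
            serviceA = applicationContext.getBean(ServiceA.class);
        }
        return serviceA;
    }
}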

Related

spring data jdbc. Can't add custom converter for enum

I want to have an enum as a field of my entity.
My application looks like this:
Spring Boot version:
plugins {
    id 'org.springframework.boot' version '2.6.2' apply false
}
repository:
@Repository
public interface MyEntityRepository extends PagingAndSortingRepository<MyEntity, UUID> {
    ...
}
entity:
@Table("my_entity")
public class MyEntity {
    ...
    private FileType fileType;
    // get + set
}
enum declaration:
public enum FileType {

    TYPE_1(1),
    TYPE_2(2);

    int databaseId;

    public static FileType byDatabaseId(Integer databaseId) {
        return Arrays.stream(values()).findFirst().orElse(null);
    }

    FileType(int databaseId) {
        this.databaseId = databaseId;
    }

    public int getDatabaseId() {
        return databaseId;
    }
}
My attempt:
I've found the following answer and tried to follow it: https://stackoverflow.com/a/53296199/2674303
So I've added a bean:
@Bean
public JdbcCustomConversions jdbcCustomConversions() {
    return new JdbcCustomConversions(asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter()));
}
converters:
@WritingConverter
public class FileTypeToDatabaseIdConverter implements Converter<FileType, Integer> {

    @Override
    public Integer convert(FileType source) {
        return source.getDatabaseId();
    }
}

@ReadingConverter
public class DatabaseIdToFileTypeConverter implements Converter<Integer, FileType> {

    @Override
    public FileType convert(Integer databaseId) {
        return FileType.byDatabaseId(databaseId);
    }
}
But I see an error:
The bean 'jdbcCustomConversions', defined in class path resource
[org/springframework/boot/autoconfigure/data/jdbc/JdbcRepositoriesAutoConfiguration$SpringBootJdbcConfiguration.class],
could not be registered. A bean with that name has already been
defined in my.pack.Main and overriding is disabled.
I've tried renaming the method jdbcCustomConversions() to myJdbcCustomConversions(). That helped to avoid the error above, but the converter is not invoked during entity persistence, and I see another error: the application tries to save a String but the database type is bigint.
20:39:10.689 DEBUG [main] o.s.jdbc.core.StatementCreatorUtils: JDBC getParameterType call failed - using fallback method instead: org.postgresql.util.PSQLException: ERROR: column "file_type" is of type bigint but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
Position: 174
I also tried to use the (currently) latest version of Spring Boot:
id 'org.springframework.boot' version '2.6.2' apply false
But it didn't help.
What have I missed?
How can I map an enum to an integer column properly?
P.S.
I use the following code for testing:
@SpringBootApplication
@EnableJdbcAuditing
@EnableScheduling
public class Main {

    public static void main(String[] args) {
        ConfigurableApplicationContext applicationContext = SpringApplication.run(Main.class, args);
        MyEntityRepository repository = applicationContext.getBean(MyEntityRepository.class);
        MyEntity entity = new MyEntity();
        ...
        entity.setFileType(FileType.TYPE_2);
        repository.save(entity);
    }

    @Bean
    public ModelMapper modelMapper() {
        ModelMapper mapper = new ModelMapper();
        mapper.getConfiguration()
                .setMatchingStrategy(MatchingStrategies.STRICT)
                .setFieldMatchingEnabled(true)
                .setSkipNullEnabled(true)
                .setFieldAccessLevel(PRIVATE);
        return mapper;
    }

    @Bean
    public AbstractJdbcConfiguration jdbcConfiguration() {
        return new MySpringBootJdbcConfiguration();
    }

    @Configuration
    static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {

        @Override
        protected List<?> userConverters() {
            return asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
        }
    }
}
UPDATE
My code is:
@SpringBootApplication
@EnableJdbcAuditing
@EnableScheduling
public class Main {

    public static void main(String[] args) {
        ConfigurableApplicationContext applicationContext = SpringApplication.run(Main.class, args);
        MyEntityRepository repository = applicationContext.getBean(MyEntityRepository.class);
        MyEntity entity = new MyEntity();
        ...
        entity.setFileType(FileType.TYPE_2);
        repository.save(entity);
    }

    @Bean
    public ModelMapper modelMapper() {
        ModelMapper mapper = new ModelMapper();
        mapper.getConfiguration()
                .setMatchingStrategy(MatchingStrategies.STRICT)
                .setFieldMatchingEnabled(true)
                .setSkipNullEnabled(true)
                .setFieldAccessLevel(PRIVATE);
        return mapper;
    }

    @Bean
    public AbstractJdbcConfiguration jdbcConfiguration() {
        return new MySpringBootJdbcConfiguration();
    }

    @Configuration
    static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {

        @Override
        protected List<?> userConverters() {
            return asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
        }

        @Bean
        public JdbcConverter jdbcConverter(JdbcMappingContext mappingContext,
                                           NamedParameterJdbcOperations operations,
                                           @Lazy RelationResolver relationResolver,
                                           JdbcCustomConversions conversions,
                                           Dialect dialect) {
            JdbcArrayColumns arrayColumns = dialect instanceof JdbcDialect ? ((JdbcDialect) dialect).getArraySupport()
                    : JdbcArrayColumns.DefaultSupport.INSTANCE;
            DefaultJdbcTypeFactory jdbcTypeFactory = new DefaultJdbcTypeFactory(operations.getJdbcOperations(),
                    arrayColumns);
            return new MyJdbcConverter(
                    mappingContext,
                    relationResolver,
                    conversions,
                    jdbcTypeFactory,
                    dialect.getIdentifierProcessing()
            );
        }
    }

    static class MyJdbcConverter extends BasicJdbcConverter {

        MyJdbcConverter(
                MappingContext<? extends RelationalPersistentEntity<?>, ? extends RelationalPersistentProperty> context,
                RelationResolver relationResolver,
                CustomConversions conversions,
                JdbcTypeFactory typeFactory,
                IdentifierProcessing identifierProcessing) {
            super(context, relationResolver, conversions, typeFactory, identifierProcessing);
        }

        @Override
        public int getSqlType(RelationalPersistentProperty property) {
            if (FileType.class.equals(property.getActualType())) {
                return Types.BIGINT;
            } else {
                return super.getSqlType(property);
            }
        }

        @Override
        public Class<?> getColumnType(RelationalPersistentProperty property) {
            if (FileType.class.equals(property.getActualType())) {
                return Long.class;
            } else {
                return super.getColumnType(property);
            }
        }
    }
}
But I experience this error:
Caused by: org.postgresql.util.PSQLException: Cannot convert an instance of java.lang.String to type long
at org.postgresql.jdbc.PgPreparedStatement.cannotCastException(PgPreparedStatement.java:925)
at org.postgresql.jdbc.PgPreparedStatement.castToLong(PgPreparedStatement.java:810)
at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:561)
at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:931)
at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.setObject(HikariProxyPreparedStatement.java)
at org.springframework.jdbc.core.StatementCreatorUtils.setValue(StatementCreatorUtils.java:414)
at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValueInternal(StatementCreatorUtils.java:231)
at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValue(StatementCreatorUtils.java:146)
at org.springframework.jdbc.core.PreparedStatementCreatorFactory$PreparedStatementCreatorImpl.setValues(PreparedStatementCreatorFactory.java:283)
at org.springframework.jdbc.core.PreparedStatementCreatorFactory$PreparedStatementCreatorImpl.createPreparedStatement(PreparedStatementCreatorFactory.java:241)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:649)
... 50 more
Caused by: java.lang.NumberFormatException: For input string: "TYPE_2"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Long.parseLong(Long.java:589)
at java.lang.Long.parseLong(Long.java:631)
at org.postgresql.jdbc.PgPreparedStatement.castToLong(PgPreparedStatement.java:792)
... 59 more
Try the following instead:
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
    return new MySpringBootJdbcConfiguration();
}

@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {

    @Override
    protected List<?> userConverters() {
        return List.of(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
    }
}
Explanation:
Spring complains that the jdbcCustomConversions bean from the auto-configuration class cannot be registered, because a bean with that name is already defined (yours) and bean overriding is disabled.
JdbcRepositoriesAutoConfiguration has changed a few times; in Spring Boot 2.6.2 it has:
@Configuration(proxyBeanMethods = false)
@ConditionalOnMissingBean(AbstractJdbcConfiguration.class)
static class SpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
}
In turn, AbstractJdbcConfiguration has:
@Bean
public JdbcCustomConversions jdbcCustomConversions() {
    try {
        Dialect dialect = applicationContext.getBean(Dialect.class);
        SimpleTypeHolder simpleTypeHolder = dialect.simpleTypes().isEmpty() ? JdbcSimpleTypes.HOLDER
                : new SimpleTypeHolder(dialect.simpleTypes(), JdbcSimpleTypes.HOLDER);
        return new JdbcCustomConversions(
                CustomConversions.StoreConversions.of(simpleTypeHolder, storeConverters(dialect)), userConverters());
    } catch (NoSuchBeanDefinitionException exception) {
        LOG.warn("No dialect found. CustomConversions will be configured without dialect specific conversions.");
        return new JdbcCustomConversions();
    }
}
As you can see, JdbcCustomConversions is not conditional in any way, so defining your own caused a conflict. Fortunately, it provides an extension point, userConverters(), which can be overridden to provide your own converters.
Update
As discussed in the comments:

- FileType.byDatabaseId is broken - it ignores its input param (see the sketch after this list)
- as the column type in the db is BIGINT, your converters must convert from Long, not from Integer; this addresses read queries
- for writes, there is an open bug: https://github.com/spring-projects/spring-data-jdbc/issues/629. There is a hardcoded assumption that enums are converted to Strings, and only Enum -> String converters are checked.
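A sketch of the first two fixes (the null handling here is my own choice):
public static FileType byDatabaseId(Integer databaseId) {
    // Compare against the input instead of ignoring it.
    return Arrays.stream(values())
            .filter(type -> databaseId != null && type.databaseId == databaseId)
            .findFirst()
            .orElse(null);
}

@ReadingConverter
public class DatabaseIdToFileTypeConverter implements Converter<Long, FileType> {

    @Override
    public FileType convert(Long databaseId) {
        // BIGINT columns arrive as Long, so convert from Long, not Integer.
        return FileType.byDatabaseId(databaseId == null ? null : databaseId.intValue());
    }
}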
Since we want to convert to Long, we need to make amendments to BasicJdbcConverter by subclassing it and registering the subclass as a @Bean.
You need to override two methods:
public int getSqlType(RelationalPersistentProperty property)
public Class<?> getColumnType(RelationalPersistentProperty property)
I hardcoded the enum type and the corresponding column types, but you may want to get fancier with that.
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
    return new MySpringBootJdbcConfiguration();
}

@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {

    @Override
    protected List<?> userConverters() {
        return List.of(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
    }

    @Bean
    public JdbcConverter jdbcConverter(JdbcMappingContext mappingContext,
                                       NamedParameterJdbcOperations operations,
                                       @Lazy RelationResolver relationResolver,
                                       JdbcCustomConversions conversions,
                                       Dialect dialect) {
        JdbcArrayColumns arrayColumns = dialect instanceof JdbcDialect ? ((JdbcDialect) dialect).getArraySupport()
                : JdbcArrayColumns.DefaultSupport.INSTANCE;
        DefaultJdbcTypeFactory jdbcTypeFactory = new DefaultJdbcTypeFactory(operations.getJdbcOperations(),
                arrayColumns);
        return new MyJdbcConverter(
                mappingContext,
                relationResolver,
                conversions,
                jdbcTypeFactory,
                dialect.getIdentifierProcessing()
        );
    }
}

static class MyJdbcConverter extends BasicJdbcConverter {

    MyJdbcConverter(
            MappingContext<? extends RelationalPersistentEntity<?>, ? extends RelationalPersistentProperty> context,
            RelationResolver relationResolver,
            CustomConversions conversions,
            JdbcTypeFactory typeFactory,
            IdentifierProcessing identifierProcessing) {
        super(context, relationResolver, conversions, typeFactory, identifierProcessing);
    }

    @Override
    public int getSqlType(RelationalPersistentProperty property) {
        if (FileType.class.equals(property.getActualType())) {
            return Types.BIGINT;
        } else {
            return super.getSqlType(property);
        }
    }

    @Override
    public Class<?> getColumnType(RelationalPersistentProperty property) {
        if (FileType.class.equals(property.getActualType())) {
            return Long.class;
        } else {
            return super.getColumnType(property);
        }
    }
}

Spring ApplicationContext.getBean returns wrong class

I've been beating my head against this and I just can't figure out what's wrong.
I have a Spring app which uses ApplicationContext.getBean() to retrieve two similar classes, and I'm getting the wrong instance class from the bean lookup.
Here's the ApplicationContext class:
public class DomainRegistryCab {

    private static ApplicationContext applicationContext;

    private static ApplicationContext createApplicationContext() {
        return new AnnotationConfigApplicationContext(CaBridgeDomainServiceConfig.class);
    }

    public static CertificateProductApplicationService certificateProductAppService() {
        var service = BeanFactoryAnnotationUtils.qualifiedBeanOfType(
                applicationContext().getAutowireCapableBeanFactory(),
                CertificateProductApplicationService.class,
                CaBridgeDomainServiceConfig.CERTIFICATE_PRODUCT_APP_SERVICE);
        // var service = applicationContext().getBean(
        //         CaBridgeDomainServiceConfig.CERTIFICATE_PRODUCT_APP_SERVICE,
        //         CertificateProductApplicationService.class);
        // var service = applicationContext().getBean(CertificateProductApplicationService.class);
        validateDataSourceIs(DataSource.ProductDataStore, service.dataSource());
        return service;
    }

    public static CertificateProgramApplicationService certificateProgramAppService() {
        var service = BeanFactoryAnnotationUtils.qualifiedBeanOfType(
                applicationContext().getAutowireCapableBeanFactory(),
                CertificateProgramApplicationService.class,
                CaBridgeDomainServiceConfig.CERTIFICATE_PROGRAM_APP_SERVICE);
        // var service = applicationContext().getBean(
        //         CaBridgeDomainServiceConfig.CERTIFICATE_PROGRAM_APP_SERVICE,
        //         CertificateProgramApplicationService.class);
        // service = applicationContext().getBean(CertificateProgramApplicationService.class);
        validateDataSourceIs(DataSource.ProgramDataStore, service.dataSource());
        return service;
    }
}
Here is CaBridgeDomainServiceConfig:
@Configuration
@ComponentScan(basePackageClasses = { HibernateConfigurationMarker.class })
public class CaBridgeDomainServiceConfig {

    public static final String CERTIFICATE_PRODUCT_APP_SERVICE = "certificateProductAppService";
    public static final String CERTIFICATE_PROGRAM_APP_SERVICE = "certificateProgramAppService";

    @Bean(name = CERTIFICATE_PRODUCT_APP_SERVICE)
    public CertificateProductApplicationService certificateProductAppService() {
        return new CertificateProductApplicationServiceCabImpl();
    }

    @Bean(name = CERTIFICATE_PROGRAM_APP_SERVICE)
    public CertificateProgramApplicationService certificateProgramAppService() {
        return new CertificateProgramApplicationServiceCabImpl();
    }
}
public interface CertificateProductApplicationService extends CertificateApplicationService {
}

public interface CertificateProgramApplicationService extends CertificateApplicationService {
}

public interface CertificateApplicationService {
}
Using the above classes, if I call DomainRegistryCab.certificateProductAppService(), I get an instance of CertificateProgramApplicationService, not CertificateProductApplicationService.
I get similar results if I use this method:
public static CertificateProductApplicationService certificateProductAppService() {
    var service = applicationContext().getBean(
            CaBridgeDomainServiceConfig.CERTIFICATE_PRODUCT_APP_SERVICE,
            CertificateProductApplicationService.class);
    validateDataSourceIs(DataSource.ProductDataStore, service.dataSource());
    return service;
}
I've also tried having the @Bean methods return the implementation classes, and having applicationContext().getBean() request the implementation classes instead of the interfaces:
public class DomainRegistryCab {

    private static ApplicationContext applicationContext;

    private static ApplicationContext createApplicationContext() {
        return new AnnotationConfigApplicationContext(CaBridgeDomainServiceConfig.class);
    }

    public static CertificateProductApplicationService certificateProductAppService() {
        var service = applicationContext().getBean(CertificateProductApplicationServiceCabImpl.class);
        validateDataSourceIs(DataSource.ProductDataStore, service.dataSource());
        return service;
    }

    public static CertificateProgramApplicationService certificateProgramAppService() {
        var service = applicationContext().getBean(CertificateProgramApplicationServiceCabImpl.class);
        validateDataSourceIs(DataSource.ProgramDataStore, service.dataSource());
        return service;
    }

    public static ApplicationContext applicationContext() {
        if (applicationContext == null)
            applicationContext = createApplicationContext();
        return applicationContext;
    }
}

@ComponentScan(basePackageClasses = { HibernateConfigurationMarker.class })
public class CaBridgeDomainServiceConfig {

    @Bean(name = CERTIFICATE_PRODUCT_APP_SERVICE)
    public CertificateProductApplicationServiceCabImpl certificateProductAppService() {
        return new CertificateProductApplicationServiceCabImpl();
    }

    @Bean(name = CERTIFICATE_PROGRAM_APP_SERVICE)
    public CertificateProgramApplicationServiceCabImpl certificateProgramAppService() {
        return new CertificateProgramApplicationServiceCabImpl();
    }
}
This code results in Spring not finding the implementation classes at all:
org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'cmb.cabridge.application.cert.CertificateProductApplicationServiceCabImpl' available
I was eventually able to get things working using applicationContext().getBean("beanName", CertificateProductApplicationService.class). The problem was deeper in my code: the CertificateProductApplicationServiceCabImpl class used and returned the wrong data source.

Why does the @PostConstruct method not keep the bean init order in Spring Boot?

I have some config information in the database; when Spring Boot starts, it should load this information only once. So I wrote code like this:
@Configuration
public class CommonConfig {

    private final DbConfigDao dbConfigDao;

    private int redisStoreDays; // omit get/set methods

    @Autowired
    public CommonConfig(DbConfigDao dbConfigDao) {
        this.dbConfigDao = dbConfigDao;
    }

    @PostConstruct
    public void init() {
        redisStoreDays = dbConfigDao.getValueByKey("redisStoreDays");
        // ... continue to load the other config from the db
    }
}
In another bean, I try to get the redisStoreDays value; it returns 0, but the real value is 1.
@Configuration
public class AutoConfiguration {

    @Conditional(RedisClusterConditional.class)
    @Bean(name = "redisDao", initMethod = "init")
    public RedisClusterDao getRedisClusterDao() {
        return new RedisClusterDaoImpl();
    }

    @Conditional(RedisSentryConditional.class)
    @Bean(name = "redisDao", initMethod = "init")
    public RedisDao getRedisSentryDao() {
        return new RedisSentryDaoImpl();
    }
}

public class RedisClusterDaoImpl implements RedisDao {

    @Autowired
    private CommonConfig commonConfig;

    private int storeDays = 7;

    @Override
    public void init() {
        storeDays = 60 * 60 * 24 * commonConfig.getRedisStoreDays();
        // commonConfig.getRedisStoreDays() returns 0 here, but the real value is 1
    }
}
How can I keep the bean init order?
I tried to add @PostConstruct in my redis bean, but it still does not work.
I debugged and found that commonConfig is not null, but commonConfig.getRedisStoreDays() returns 0.
After the init method in RedisClusterDaoImpl has executed, commonConfig.getRedisStoreDays() changes to 1.
I also tried to add @AutoConfigureBefore(RedisClusterDao.class), but storeDays still gets 0 in the RedisClusterDaoImpl class.
Use Spring's @DependsOn, i.e.:
@DependsOn("commonConfig") // the default bean name for the CommonConfig class
@Configuration
public class AutoConfiguration {
    ...
}
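If you only need the ordering for the redis bean, here is a sketch of the same idea applied to the @Bean method itself (assuming the classes from the question):
@Configuration
public class AutoConfiguration {

    @Conditional(RedisClusterConditional.class)
    @DependsOn("commonConfig") // CommonConfig (incl. its @PostConstruct) initializes first
    @Bean(name = "redisDao", initMethod = "init")
    public RedisClusterDao getRedisClusterDao() {
        return new RedisClusterDaoImpl();
    }
}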
Why not
@Configuration
public class CommonConfig {

    private final DbConfigDao dbConfigDao;

    private int redisStoreDays; // omit get/set methods

    @Autowired
    public CommonConfig(DbConfigDao dbConfigDao) {
        this.dbConfigDao = dbConfigDao;
        // Why in @PostConstruct? You have the bean right here; it is initialized.
        redisStoreDays = dbConfigDao.getValueByKey("redisStoreDays");
    }
}
or just
public class RedisClusterDaoImpl implements RedisDao {

    @Autowired
    private DbConfigDao dbConfigDao;

    private int storeDays = 7;

    @Override
    public void init() {
        storeDays = 60 * 60 * 24 * dbConfigDao.getValueByKey("redisStoreDays");
    }
}
After all, this is the redis bean; I don't see why it shouldn't know the name of the config setting it needs.

Why is @AfterReturning never called

I have this method and it does return a list:
public List<ReportReconciliationEntry> getMissingReports(List<ReportReconciliationEntry> expectedReports,
                                                         List<GeneratedReportContent> generatedReports) {
    ...
    return missingReports;
}
but this method is never called:
@AfterReturning(value = "execution(* com.XXX.YYY.ZZZ.service.ReconciliationService.getMissingReports(..)) && args(expectedReports,generatedReports)", argNames = "expectedReports,generatedReports,missingReports", returning = "missingReports")
public void logReportReconciliationException(List<ReportReconciliationEntry> expectedReports, List<GeneratedReportContent> generatedReports, List<ReportReconciliationEntry> missingReports) {
    final String notApplicable = properties.getNotApplicable();
    ReportingAlertMarker marker = ReportingAlertMarker.builder()
            .eventType(E90217)
            .userIdentity(notApplicable)
            .destinationIp(properties.getDestinationIp())
            .destinationPort(properties.getDestinationPort())
            .dataIdentity(notApplicable)
            .resourceIdentity(notApplicable)
            .responseCode(404)
            .build();
    MDC.put(SYSTEM_COMPONENT, properties.getBpsReportGenerationService());
    System.out.println(missingReports);
    logWrapper.logError(marker, "SDGFHDZFHDFR!!");
}
I checked the return of the first method with a breakpoint. It does return a list, but the @AfterReturning advice is never called, although the IDE shows the "Navigate to AOP advices" icon. What am I missing?
This is what my class looks like:
@Component
@Aspect
@Slf4j
public class ReportingAlertAspect {

    private final LogWrapper logWrapper;
    private final ReportingAlertProperties properties;

    public ReportingAlertAspect(final ReportingAlertProperties properties, final LogWrapper logWrapper) {
        this.logWrapper = logWrapper;
        this.properties = properties;
    }

    ....
}
I have another class with a method in it, and this one works fine:
@Component
@Aspect
@Slf4j
public class ReportingInfoAspect {

    private final LogWrapper logWrapper;
    private final ReportingAlertProperties properties;

    @AfterReturning(value = "execution(* com.xxx.yyy.zzz.qqq.ReconciliationService.reconcile(..)) && args(windowId)", argNames = "windowId,check",
            returning = "check")
    public void logSuccessfulReportReconciliation(ReconciliationEvent windowId, boolean check) {
        String notApplicable = properties.getNotApplicable();
        MDC.put(SYSTEM_COMPONENT, properties.getBpsReportGenerationService());
        ReportingAlertMarker marker = ReportingAlertMarker.builder()
                .eventType(E90293)
                .userIdentity(notApplicable)
                .destinationIp(properties.getDestinationIp())
                .destinationPort(properties.getDestinationPort())
                .dataIdentity(notApplicable)
                .resourceIdentity(notApplicable)
                .responseCode(200)
                .build();
        if (check) {
            logWrapper.logInfo(marker, "All reports for windowId {} were generated successfully", windowId.windowId);
        }
    }
}
I found the problem: the getMissingReports method was called from another method inside the same class.
This is a case of self-invocation, so the call never went through the proxy.
This is what the class looks like:
@Service
@RequiredArgsConstructor
public class ReconciliationService {

    private final ReconciliationRepository reconciliationRepository;
    private final ReportSafeStoreClientService reportSafeStoreClientService;

    @Handler
    public whatever whatever() {
        ...
        getMissingReports() // self-invocation: this call never goes through the proxy
    }
}
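One common fix, sketched here under the assumption that the advised method should stay in the same class, is to call it through the bean's own proxy instead of through this (the self field and the reconcileWindow method are illustrative, not the original code):
@Service
public class ReconciliationService {

    // Lazily inject this bean's own proxy; @Lazy avoids the circular reference.
    @Lazy
    @Autowired
    private ReconciliationService self;

    @Handler
    public void reconcileWindow(List<ReportReconciliationEntry> expectedReports,
                                List<GeneratedReportContent> generatedReports) {
        // Calling through `self` crosses the AOP proxy, so @AfterReturning fires;
        // a plain getMissingReports(...) call would bypass it.
        self.getMissingReports(expectedReports, generatedReports);
    }

    public List<ReportReconciliationEntry> getMissingReports(List<ReportReconciliationEntry> expectedReports,
                                                             List<GeneratedReportContent> generatedReports) {
        ...
    }
}
Alternatively, move getMissingReports into a separate bean so every call site already goes through a proxy.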
You can find more info here

How to autowire conditionally in Spring Boot?

I have created a scheduler class:
public class TestSchedulderNew {

    @Scheduled(fixedDelay = 3000)
    public void fixedRateJob1() {
        System.out.println("Job 1 running");
    }

    @Scheduled(fixedDelay = 3000)
    public void fixedRateJob2() {
        System.out.println("Job 2 running");
    }
}
In the configuration I have put the @ConditionalOnProperty annotation to enable this conditionally:
@Bean
@ConditionalOnProperty(value = "jobs.enabled")
public TestSchedulderNew testSchedulderNew() {
    return new TestSchedulderNew();
}
Now in the controller I have created a "stopScheduler" method to stop those schedulers; in this controller I have autowired the TestSchedulderNew class:
@RestController
@RequestMapping("/api")
public class TestCont {

    private static final String SCHEDULED_TASKS = "testSchedulderNew";

    @Autowired
    private ScheduledAnnotationBeanPostProcessor postProcessor;

    @Autowired
    private TestSchedulderNew testSchedulderNew;

    @GetMapping(value = "/stopScheduler")
    public String stopSchedule() {
        postProcessor.postProcessBeforeDestruction(testSchedulderNew,
                SCHEDULED_TASKS);
        return "OK";
    }
}
Now the problem is that if the conditional property is false, I get the exception below:
Field testSchedulderNew in com.sbill.app.web.rest.TestCont required a bean of type 'com.sbill.app.schedulerJob.TestSchedulderNew'
If it is true, everything works fine.
Do we have any option to solve this?
You can use @Autowired(required = false) and a null check in the stopScheduler method:
@Autowired(required = false)
private TestSchedulderNew testSchedulderNew;

@GetMapping(value = "/stopScheduler")
public String stopSchedule() {
    if (testSchedulderNew != null) {
        postProcessor.postProcessBeforeDestruction(testSchedulderNew,
                SCHEDULED_TASKS);
        return "OK";
    }
    return "NOT_OK";
}
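An alternative (my own sketch, not part of the original answer) is Spring's ObjectProvider, which expresses the optional dependency without a nullable field:
@RestController
@RequestMapping("/api")
public class TestCont {

    private static final String SCHEDULED_TASKS = "testSchedulderNew";

    @Autowired
    private ScheduledAnnotationBeanPostProcessor postProcessor;

    // ObjectProvider resolves lazily and tolerates a missing bean.
    @Autowired
    private ObjectProvider<TestSchedulderNew> testSchedulderNewProvider;

    @GetMapping(value = "/stopScheduler")
    public String stopSchedule() {
        TestSchedulderNew scheduler = testSchedulderNewProvider.getIfAvailable();
        if (scheduler == null) {
            return "NOT_OK";
        }
        postProcessor.postProcessBeforeDestruction(scheduler, SCHEDULED_TASKS);
        return "OK";
    }
}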
