Java Spring Bean dependency injection, @ConditionalOnBean

Attempting to create a class that will be used by JavaConfig as a source of bean definitions.
@Configuration
public class MyClass {
    @Bean
    @ConditionalOnProperty(name = "property")
    public A fuction1() {
        doSomething1(); // may throw an exception
    }

    @Bean
    @ConditionalOnMissingBean(name = "fuction1")
    public A fuction2() {
        doSomething2();
    }

    @Bean
    public B fuction3(A a) {
        doSomething3(a);
    }
}
The third bean definition fails with the error "could not autowire. There is more than one bean of A type." How do I tell Spring to try to autowire the first A and, if it is missing, then the second A, i.e. following the conditional process described. Hopefully that makes sense.
Thanks!

You are defining two beans of the same type, hence the autowiring ambiguity. Try using @Primary on the bean you want to give priority.
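For illustration, a minimal sketch of @Primary (assuming Spring on the classpath; class and method names here are made up, not from the question): when two beans of the same type exist, the one marked @Primary wins at injection points that do not use a @Qualifier.

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;

@Configuration
public class PrimaryExample {

    static class A { }

    @Bean
    @Primary // chosen whenever an A is injected without a @Qualifier
    public A preferredA() {
        return new A();
    }

    @Bean
    public A fallbackA() {
        return new A();
    }
}
```

Note that @Primary alone does not express "use the conditional bean if present, otherwise the other one"; it only breaks the tie when both beans exist.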

How about this approach?
@Configuration
public class MyClass {
    @Bean
    @ConditionalOnProperty(name = "property", havingValue = "true")
    public A fuction1() {
        doSomething1(); // may throw an exception
    }

    @Bean
    @ConditionalOnProperty(name = "property", havingValue = "false")
    public A fuction2() {
        doSomething2();
    }

    @Bean
    public B fuction3(A a) {
        doSomething3(a);
    }
}

Maybe this config will be helpful:
@Configuration
public class MyClass {
    @Bean
    @ConditionalOnProperty(name = "property")
    public A fuction1() {
        try {
            return doSomething1(); // may throw an exception
        } catch (Exception e) {
            return null;
        }
    }

    @Bean
    @ConditionalOnMissingBean(name = "fuction1")
    public A fuction2() {
        return doSomething2();
    }

    @Bean
    public B fuction3(@Autowired(required = false) @Qualifier("fuction1") A a1,
                      @Qualifier("fuction2") A a2) {
        if (a1 == null) {
            return doSomething3(a2);
        } else {
            return doSomething3(a1);
        }
    }
}

The sample code you've provided is conceptually correct. There are, of course, compilation errors, but the annotations are correct.
Here is a sample that compiles and works.
@Configuration
public class ConditionalConfiguration {

    public static class A {
        private final String name;

        public A(String name) {
            this.name = name;
        }
    }

    public static class B {
        private final A a;

        public B(A a) {
            this.a = a;
        }
    }

    @Bean
    @ConditionalOnProperty(name = "a1-property")
    public A a1() {
        return new A("a1");
    }

    @Bean
    @ConditionalOnMissingBean(A.class)
    public A a2() {
        return new A("a2");
    }

    @Bean
    public B b(A a) {
        System.out.println("########## " + a.name);
        return new B(a);
    }
}
When the property a1-property is passed to the application
./gradlew clean bootRun -i -Da1-property
########## a1
the bean a1 is created.
When the property is missing
./gradlew clean bootRun -i
########## a2
the bean a2 is created.
Make sure to configure Gradle to pass system properties to the bootRun task by adding this to build.gradle:
bootRun {
    systemProperties System.properties
}
To be decoupled from the a1 bean name, a condition on the bean type is used: @ConditionalOnMissingBean(A.class) instead of @ConditionalOnMissingBean(name = "a1").
If you expect an exception during bean creation, that is a different case.
When a factory method annotated with @Bean throws an exception, the Spring application crashes with a BeanInstantiationException.
So exceptions should be handled in the factory method:
@Bean
@ConditionalOnProperty(name = "a1-property")
public A a1() {
    try {
        //return new A("a1");
        throw new RuntimeException("Sample exception");
    } catch (Exception e) {
        return fallbackA();
    }
}

@Bean
@ConditionalOnMissingBean(A.class)
public A a2() {
    return fallbackA();
}

private A fallbackA() {
    return new A("a2");
}

Related

Spring Data JDBC: can't add custom converter for enum

I want to have an enum as a field of my entity.
My application looks like this:
Spring Boot version:
plugins {
    id 'org.springframework.boot' version '2.6.2' apply false
}
repository:
@Repository
public interface MyEntityRepository extends PagingAndSortingRepository<MyEntity, UUID> {
    ...
}
entity:
@Table("my_entity")
public class MyEntity {
    ...
    private FileType fileType;
    // get + set
}
enum declaration:
public enum FileType {
    TYPE_1(1),
    TYPE_2(2);

    int databaseId;

    public static FileType byDatabaseId(Integer databaseId) {
        return Arrays.stream(values()).findFirst().orElse(null);
    }

    FileType(int databaseId) {
        this.databaseId = databaseId;
    }

    public int getDatabaseId() {
        return databaseId;
    }
}
My attempt:
I found the following answer and tried to follow it: https://stackoverflow.com/a/53296199/2674303
So I added the bean
@Bean
public JdbcCustomConversions jdbcCustomConversions() {
    return new JdbcCustomConversions(asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter()));
}
converters:
@WritingConverter
public class FileTypeToDatabaseIdConverter implements Converter<FileType, Integer> {
    @Override
    public Integer convert(FileType source) {
        return source.getDatabaseId();
    }
}

@ReadingConverter
public class DatabaseIdToFileTypeConverter implements Converter<Integer, FileType> {
    @Override
    public FileType convert(Integer databaseId) {
        return FileType.byDatabaseId(databaseId);
    }
}
But I see error:
The bean 'jdbcCustomConversions', defined in class path resource
[org/springframework/boot/autoconfigure/data/jdbc/JdbcRepositoriesAutoConfiguration$SpringBootJdbcConfiguration.class],
could not be registered. A bean with that name has already been
defined in my.pack.Main and overriding is disabled.
I tried renaming the method jdbcCustomConversions() to myJdbcCustomConversions(). That avoided the error above, but the converter is not invoked during entity persistence, and I see another error: the application tries to save a String while the database column type is bigint.
20:39:10.689 DEBUG [main] o.s.jdbc.core.StatementCreatorUtils: JDBC getParameterType call failed - using fallback method instead: org.postgresql.util.PSQLException: ERROR: column "file_type" is of type bigint but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
Position: 174
I also tried to use the latest (currently) version of Spring Boot:
id 'org.springframework.boot' version '2.6.2' apply false
But it didn't help.
What have I missed?
How can I map an enum to an integer column properly?
P.S.
I use following code for testing:
@SpringBootApplication
@EnableJdbcAuditing
@EnableScheduling
public class Main {
    public static void main(String[] args) {
        ConfigurableApplicationContext applicationContext = SpringApplication.run(Main.class, args);
        MyEntityRepository repository = applicationContext.getBean(MyEntityRepository.class);
        MyEntity entity = new MyEntity();
        ...
        entity.setFileType(FileType.TYPE_2);
        repository.save(entity);
    }

    @Bean
    public ModelMapper modelMapper() {
        ModelMapper mapper = new ModelMapper();
        mapper.getConfiguration()
                .setMatchingStrategy(MatchingStrategies.STRICT)
                .setFieldMatchingEnabled(true)
                .setSkipNullEnabled(true)
                .setFieldAccessLevel(PRIVATE);
        return mapper;
    }

    @Bean
    public AbstractJdbcConfiguration jdbcConfiguration() {
        return new MySpringBootJdbcConfiguration();
    }

    @Configuration
    static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
        @Override
        protected List<?> userConverters() {
            return asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
        }
    }
}
UPDATE
My code is:
@SpringBootApplication
@EnableJdbcAuditing
@EnableScheduling
public class Main {
    public static void main(String[] args) {
        ConfigurableApplicationContext applicationContext = SpringApplication.run(Main.class, args);
        MyEntityRepository repository = applicationContext.getBean(MyEntityRepository.class);
        MyEntity entity = new MyEntity();
        ...
        entity.setFileType(FileType.TYPE_2);
        repository.save(entity);
    }

    @Bean
    public ModelMapper modelMapper() {
        ModelMapper mapper = new ModelMapper();
        mapper.getConfiguration()
                .setMatchingStrategy(MatchingStrategies.STRICT)
                .setFieldMatchingEnabled(true)
                .setSkipNullEnabled(true)
                .setFieldAccessLevel(PRIVATE);
        return mapper;
    }

    @Bean
    public AbstractJdbcConfiguration jdbcConfiguration() {
        return new MySpringBootJdbcConfiguration();
    }

    @Configuration
    static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
        @Override
        protected List<?> userConverters() {
            return asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
        }

        @Bean
        public JdbcConverter jdbcConverter(JdbcMappingContext mappingContext,
                                           NamedParameterJdbcOperations operations,
                                           @Lazy RelationResolver relationResolver,
                                           JdbcCustomConversions conversions,
                                           Dialect dialect) {
            JdbcArrayColumns arrayColumns = dialect instanceof JdbcDialect
                    ? ((JdbcDialect) dialect).getArraySupport()
                    : JdbcArrayColumns.DefaultSupport.INSTANCE;
            DefaultJdbcTypeFactory jdbcTypeFactory = new DefaultJdbcTypeFactory(operations.getJdbcOperations(),
                    arrayColumns);
            return new MyJdbcConverter(
                    mappingContext,
                    relationResolver,
                    conversions,
                    jdbcTypeFactory,
                    dialect.getIdentifierProcessing()
            );
        }
    }

    static class MyJdbcConverter extends BasicJdbcConverter {
        MyJdbcConverter(
                MappingContext<? extends RelationalPersistentEntity<?>, ? extends RelationalPersistentProperty> context,
                RelationResolver relationResolver,
                CustomConversions conversions,
                JdbcTypeFactory typeFactory,
                IdentifierProcessing identifierProcessing) {
            super(context, relationResolver, conversions, typeFactory, identifierProcessing);
        }

        @Override
        public int getSqlType(RelationalPersistentProperty property) {
            if (FileType.class.equals(property.getActualType())) {
                return Types.BIGINT;
            } else {
                return super.getSqlType(property);
            }
        }

        @Override
        public Class<?> getColumnType(RelationalPersistentProperty property) {
            if (FileType.class.equals(property.getActualType())) {
                return Long.class;
            } else {
                return super.getColumnType(property);
            }
        }
    }
}
But I experience error:
Caused by: org.postgresql.util.PSQLException: Cannot convert an instance of java.lang.String to type long
at org.postgresql.jdbc.PgPreparedStatement.cannotCastException(PgPreparedStatement.java:925)
at org.postgresql.jdbc.PgPreparedStatement.castToLong(PgPreparedStatement.java:810)
at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:561)
at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:931)
at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.setObject(HikariProxyPreparedStatement.java)
at org.springframework.jdbc.core.StatementCreatorUtils.setValue(StatementCreatorUtils.java:414)
at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValueInternal(StatementCreatorUtils.java:231)
at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValue(StatementCreatorUtils.java:146)
at org.springframework.jdbc.core.PreparedStatementCreatorFactory$PreparedStatementCreatorImpl.setValues(PreparedStatementCreatorFactory.java:283)
at org.springframework.jdbc.core.PreparedStatementCreatorFactory$PreparedStatementCreatorImpl.createPreparedStatement(PreparedStatementCreatorFactory.java:241)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:649)
... 50 more
Caused by: java.lang.NumberFormatException: For input string: "TYPE_2"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Long.parseLong(Long.java:589)
at java.lang.Long.parseLong(Long.java:631)
at org.postgresql.jdbc.PgPreparedStatement.castToLong(PgPreparedStatement.java:792)
... 59 more
Try the following instead:
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
    return new MySpringBootJdbcConfiguration();
}

@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
    @Override
    protected List<?> userConverters() {
        return List.of(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
    }
}
Explanation:
Spring complains that the auto-configured jdbcCustomConversions bean cannot be registered because a bean with that name is already defined in your Main class, and bean overriding is disabled.
JdbcRepositoriesAutoConfiguration has changed a few times; in Spring Boot 2.6.2 it contains:
@Configuration(proxyBeanMethods = false)
@ConditionalOnMissingBean(AbstractJdbcConfiguration.class)
static class SpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
}
In turn, AbstractJdbcConfiguration has:
@Bean
public JdbcCustomConversions jdbcCustomConversions() {
    try {
        Dialect dialect = applicationContext.getBean(Dialect.class);
        SimpleTypeHolder simpleTypeHolder = dialect.simpleTypes().isEmpty() ? JdbcSimpleTypes.HOLDER
                : new SimpleTypeHolder(dialect.simpleTypes(), JdbcSimpleTypes.HOLDER);
        return new JdbcCustomConversions(
                CustomConversions.StoreConversions.of(simpleTypeHolder, storeConverters(dialect)), userConverters());
    } catch (NoSuchBeanDefinitionException exception) {
        LOG.warn("No dialect found. CustomConversions will be configured without dialect specific conversions.");
        return new JdbcCustomConversions();
    }
}
As you can see, JdbcCustomConversions is not conditional in any way, so defining your own caused a conflict. Fortunately, there is an extension point, userConverters(), which can be overridden to provide your own converters.
Update
As discussed in comments:
FileType.byDatabaseId is broken: it ignores its input parameter.
As the column type in the database is BIGINT, your converters must convert from Long, not from Integer; this addresses read queries.
For writes, there is an open bug: https://github.com/spring-projects/spring-data-jdbc/issues/629. There is a hardcoded assumption that enums are converted to Strings, and only Enum -> String converters are checked.
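The first point, the broken lookup, can be fixed by filtering the stream on the id instead of unconditionally taking the first value. A minimal sketch, using the enum from the question (this fixes only the lookup; switching the converter types from Integer to Long is a separate change):

```java
import java.util.Arrays;

public enum FileType {
    TYPE_1(1),
    TYPE_2(2);

    private final int databaseId;

    FileType(int databaseId) {
        this.databaseId = databaseId;
    }

    // Match on databaseId rather than returning the first value unconditionally.
    public static FileType byDatabaseId(Integer databaseId) {
        return Arrays.stream(values())
                .filter(t -> databaseId != null && t.databaseId == databaseId)
                .findFirst()
                .orElse(null);
    }

    public int getDatabaseId() {
        return databaseId;
    }
}
```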
As we want to convert to Long, we need to amend BasicJdbcConverter by subclassing it and registering the subclass as a @Bean.
You need to override two methods
public int getSqlType(RelationalPersistentProperty property)
public Class<?> getColumnType(RelationalPersistentProperty property)
I hardcoded the Enum type and corresponding column types, but you may want to get more fancy with that.
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
    return new MySpringBootJdbcConfiguration();
}

@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
    @Override
    protected List<?> userConverters() {
        return List.of(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
    }

    @Bean
    public JdbcConverter jdbcConverter(JdbcMappingContext mappingContext,
                                       NamedParameterJdbcOperations operations,
                                       @Lazy RelationResolver relationResolver,
                                       JdbcCustomConversions conversions,
                                       Dialect dialect) {
        JdbcArrayColumns arrayColumns = dialect instanceof JdbcDialect
                ? ((JdbcDialect) dialect).getArraySupport()
                : JdbcArrayColumns.DefaultSupport.INSTANCE;
        DefaultJdbcTypeFactory jdbcTypeFactory = new DefaultJdbcTypeFactory(operations.getJdbcOperations(),
                arrayColumns);
        return new MyJdbcConverter(
                mappingContext,
                relationResolver,
                conversions,
                jdbcTypeFactory,
                dialect.getIdentifierProcessing()
        );
    }
}

static class MyJdbcConverter extends BasicJdbcConverter {
    MyJdbcConverter(
            MappingContext<? extends RelationalPersistentEntity<?>, ? extends RelationalPersistentProperty> context,
            RelationResolver relationResolver,
            CustomConversions conversions,
            JdbcTypeFactory typeFactory,
            IdentifierProcessing identifierProcessing) {
        super(context, relationResolver, conversions, typeFactory, identifierProcessing);
    }

    @Override
    public int getSqlType(RelationalPersistentProperty property) {
        if (FileType.class.equals(property.getActualType())) {
            return Types.BIGINT;
        } else {
            return super.getSqlType(property);
        }
    }

    @Override
    public Class<?> getColumnType(RelationalPersistentProperty property) {
        if (FileType.class.equals(property.getActualType())) {
            return Long.class;
        } else {
            return super.getColumnType(property);
        }
    }
}

How to autowire conditionally in Spring Boot?

I have created one scheduler class:
public class TestSchedulderNew {
    @Scheduled(fixedDelay = 3000)
    public void fixedRateJob1() {
        System.out.println("Job 1 running");
    }

    @Scheduled(fixedDelay = 3000)
    public void fixedRateJob2() {
        System.out.println("Job 2 running");
    }
}
In the configuration I have put the @ConditionalOnProperty annotation to enable it conditionally:
@Bean
@ConditionalOnProperty(value = "jobs.enabled")
public TestSchedulderNew testSchedulderNew() {
    return new TestSchedulderNew();
}
Now in the controller I have created a "stopScheduler" method to stop those schedulers; in this controller I have autowired the TestSchedulderNew class:
@RestController
@RequestMapping("/api")
public class TestCont {
    private static final String SCHEDULED_TASKS = "testSchedulderNew";

    @Autowired
    private ScheduledAnnotationBeanPostProcessor postProcessor;

    @Autowired
    private TestSchedulderNew testSchedulderNew;

    @GetMapping(value = "/stopScheduler")
    public String stopSchedule() {
        postProcessor.postProcessBeforeDestruction(testSchedulderNew, SCHEDULED_TASKS);
        return "OK";
    }
}
Now the problem is that if the conditional property is false, I get the exception below:
Field testSchedulderNew in com.sbill.app.web.rest.TestCont required a bean of type 'com.sbill.app.schedulerJob.TestSchedulderNew'
In the true case everything works fine.
Do we have any option to solve this?
You can use @Autowired(required = false) and a null check in the stopScheduler method:
@Autowired(required = false)
private TestSchedulderNew testSchedulderNew;

@GetMapping(value = "/stopScheduler")
public String stopSchedule() {
    if (testSchedulderNew != null) {
        postProcessor.postProcessBeforeDestruction(testSchedulderNew, SCHEDULED_TASKS);
        return "OK";
    }
    return "NOT_OK";
}
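An alternative sketch, not from the original answer: Spring's ObjectProvider can express an optional dependency without a nullable field, keeping the absence check local to the handler. This assumes the bean and class names from the question and Spring on the classpath.

```java
// Hypothetical variant using ObjectProvider for the optional scheduler bean.
@RestController
@RequestMapping("/api")
public class TestCont {
    private static final String SCHEDULED_TASKS = "testSchedulderNew";

    private final ScheduledAnnotationBeanPostProcessor postProcessor;
    private final ObjectProvider<TestSchedulderNew> schedulerProvider;

    public TestCont(ScheduledAnnotationBeanPostProcessor postProcessor,
                    ObjectProvider<TestSchedulderNew> schedulerProvider) {
        this.postProcessor = postProcessor;
        this.schedulerProvider = schedulerProvider;
    }

    @GetMapping("/stopScheduler")
    public String stopSchedule() {
        // getIfAvailable() returns null when the conditional bean was not created
        TestSchedulderNew scheduler = schedulerProvider.getIfAvailable();
        if (scheduler == null) {
            return "NOT_OK";
        }
        postProcessor.postProcessBeforeDestruction(scheduler, SCHEDULED_TASKS);
        return "OK";
    }
}
```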

FactoryBeanNotInitializedException: Cannot determine target class for proxy

I want to set up an object pool in my Spring application with annotations only.
I started out with this example taken from the Spring docs: http://docs.spring.io/spring/docs/2.5.x/reference/aop-api.html#aop-ts-pool.
Here is how I translated the XML configuration:
@Configuration
@EnableAutoConfiguration
@ComponentScan
public class SpringObjectPoolTest {
    public static void main(String[] args) throws Exception {
        context = new SpringApplicationBuilder(SpringObjectPoolTest.class) //
                .addCommandLineProperties(false) //
                .web(false) //
                .headless(false) //
                .registerShutdownHook(true) //
                .application() //
                .run();
        context.getBean(SpringObjectPoolTest.class).go();
    }

    @Resource(name = "pfb")
    private FactoryBean<MyTask> pool;

    @Resource(name = "pool")
    private TargetSource targetSource;

    private static ConfigurableApplicationContext context;

    @Bean(name = "task")
    @Scope("prototype")
    public MyTask createNewTask() {
        return new MyTask();
    }

    @Bean(name = "pool")
    public CommonsPoolTargetSource setupObjectPool() {
        CommonsPoolTargetSource pc = new CommonsPoolTargetSource();
        pc.setMaxSize(25);
        pc.setTargetBeanName("task");
        return pc;
    }

    @Bean(name = "pfb")
    public ProxyFactoryBean createProxyFactoryBean() {
        ProxyFactoryBean pfb = new ProxyFactoryBean();
        pfb.setTargetSource(targetSource);
        return pfb;
    }

    private void go() {
        try {
            pool.getObject().speak();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
However I get this exception:
org.springframework.beans.factory.BeanCurrentlyInCreationException:
Error creating bean with name 'pfb':
org.springframework.beans.factory.FactoryBeanNotInitializedException:
Cannot determine target class for proxy
You are a bit over-engineering this. Spring already knows how to inject a proxied MyTask; there is no need for a FactoryBean<MyTask> or for calling getObject() on the pool. In "pooledTask" below, Spring knows that by injecting a ProxyFactoryBean ("pfb") it will actually inject the instance that factory bean creates, not the factory bean itself. Here's how I'd do it:
@Configuration
@EnableAutoConfiguration
@ComponentScan
public class SpringObjectPoolTest {
    public static void main(String[] args) throws Exception {
        context = new SpringApplicationBuilder(SpringObjectPoolTest.class) //
                .addCommandLineProperties(false) //
                .web(false) //
                .headless(false) //
                .registerShutdownHook(true) //
                .application() //
                .run();
        context.getBean(SpringObjectPoolTest.class).go();
    }

    private static ConfigurableApplicationContext context;

    @Resource(name = "pfb")
    private MyTask pooledTask;

    @Resource(name = "pool")
    private CommonsPoolTargetSource targetSource;

    @Bean(name = "task")
    @Scope("prototype")
    public MyTask createNewTask() {
        return new MyTask();
    }

    @Bean(name = "pool")
    public CommonsPoolTargetSource setupObjectPool() {
        CommonsPoolTargetSource pc = new CommonsPoolTargetSource();
        pc.setMaxSize(25);
        pc.setTargetBeanName("task");
        return pc;
    }

    @Bean(name = "pfb")
    public ProxyFactoryBean createProxyFactoryBean() {
        ProxyFactoryBean pfb = new ProxyFactoryBean();
        pfb.setTargetSource(setupObjectPool());
        return pfb;
    }

    private void go() {
        try {
            pooledTask.speak();
            // getting another object from the pool
            MyTask someOtherTask = (MyTask) targetSource.getTarget();
            // returning the object to the pool
            targetSource.releaseTarget(someOtherTask);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}

How to use @EnableTransactionManagement in combination with a StaticMethodMatcherPointcutAdvisor

Given the following service:
public interface MyService {
    void method();
}
And its implementation:
@Service
public class MyServiceImpl implements MyService {
    @Transactional
    @CustomAnnotation
    @Override
    public void method() {
        ...
    }
}
I would like to use a StaticMethodMatcherPointcutAdvisor in the following manner:
public class MyPointcutAdvisor extends StaticMethodMatcherPointcutAdvisor {
    ...
    @Override
    public boolean matches(Method method, Class targetClass) {
        if (annotationPresent(method)) {
            return true;
        }
        Class<?> userClass = ClassUtils.getUserClass(targetClass);
        Method specificMethod = ClassUtils.getMostSpecificMethod(method, userClass);
        specificMethod = BridgeMethodResolver.findBridgedMethod(specificMethod);
        if (annotationPresent(specificMethod)) {
            return true;
        }
        return false;
    }
    ...
}
The problem is that Spring uses an InfrastructureAdvisorAutoProxyCreator to create a proxy of that class, whereas the DefaultAdvisorAutoProxyCreator would create the proxy for the MyPointcutAdvisor; but MyPointcutAdvisor is only given the proxy as the targetClass parameter. Thus the pointcut advisor cannot find the annotation and therefore does not match.
For completeness, this is my configuration class:
@Configuration
@EnableTransactionManagement
public class MyConfiguration {
    @Bean
    public DefaultAdvisorAutoProxyCreator defaultAdvisorAutoProxyCreator() {
        return new DefaultAdvisorAutoProxyCreator();
    }

    @Bean
    public MyPointcutAdvisor myPointcutAdvisor() {
        return new MyPointcutAdvisor();
    }
    ...
}
My question is: Is there a way to use #EnableTransactionManagement in combination with a StaticMethodMatcherPointcutAdvisor ?
Workarounds:
Put @CustomAnnotation into the service interface: I want to have clean interfaces.
Add @Role(BeanDefinition.ROLE_INFRASTRUCTURE) to the MyPointcutAdvisor bean configuration, so that the InfrastructureAdvisorAutoProxyCreator will create the proxy. This seems like the wrong way, since this bean is not infrastructure.
Copy the beans from ProxyTransactionManagementConfiguration, remove @EnableTransactionManagement and remove @Role(BeanDefinition.ROLE_INFRASTRUCTURE), so that the DefaultAdvisorAutoProxyCreator will create the proxy. This is my current workaround and results in the following configuration:
@Configuration
public class MyWorkaroundConfiguration {
    @Bean
    public DefaultAdvisorAutoProxyCreator defaultAdvisorAutoProxyCreator() {
        return new DefaultAdvisorAutoProxyCreator();
    }

    @Bean
    public MyPointcutAdvisor myPointcutAdvisor() {
        return new MyPointcutAdvisor();
    }

    @Bean
    public TransactionAttributeSource transactionAttributeSource() {
        return new AnnotationTransactionAttributeSource();
    }

    @Bean(name = TransactionManagementConfigUtils.TRANSACTION_ADVISOR_BEAN_NAME)
    public BeanFactoryTransactionAttributeSourceAdvisor transactionAdvisor(
            TransactionInterceptor transactionInterceptor) {
        BeanFactoryTransactionAttributeSourceAdvisor advisor =
                new BeanFactoryTransactionAttributeSourceAdvisor();
        advisor.setTransactionAttributeSource(transactionAttributeSource());
        advisor.setAdvice(transactionInterceptor);
        return advisor;
    }

    @Bean
    public TransactionInterceptor transactionInterceptor(
            PlatformTransactionManager transactionManager) {
        TransactionInterceptor interceptor = new TransactionInterceptor();
        interceptor.setTransactionAttributeSource(transactionAttributeSource());
        interceptor.setTransactionManager(transactionManager);
        return interceptor;
    }
    ...
}
Using @EnableAspectJAutoProxy instead of the DefaultAdvisorAutoProxyCreator works for me:
@Configuration
@EnableAspectJAutoProxy
@EnableTransactionManagement
public class MyConfiguration {
}
This also allows using @Aspect like M. Deinum suggested.
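For reference, a minimal @Aspect sketch of that suggestion (illustrative only; the package com.example and the advice body are assumptions, and spring-aop with AspectJ annotations is assumed on the classpath):

```java
// Hypothetical aspect matching methods carrying the question's @CustomAnnotation.
@Aspect
@Component
public class CustomAnnotationAspect {

    @Around("@annotation(com.example.CustomAnnotation)")
    public Object aroundCustomAnnotated(ProceedingJoinPoint pjp) throws Throwable {
        // custom behavior before the method call
        Object result = pjp.proceed();
        // custom behavior after the method call
        return result;
    }
}
```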

Is it possible to use the @Valid annotation in a console Spring-based application?

I have an AnnotationConfigApplicationContext created with a @Configuration-annotated class as a param:
@Configuration
class Config {
    @Bean
    public LocalValidatorFactoryBean localValidatorFactoryBean() {
        return new LocalValidatorFactoryBean();
    }

    @Bean
    public A aBean() {
        return new A();
    }

    @Bean
    public B bBean() {
        return new B();
    }
}
Where A and B are:
class A {
    @Min(1)
    public int myInt;
}

class B {
    @Autowired(required = true)
    @Valid
    public A aBean;
}
Q: Is it possible to make Spring process the @Valid annotation in this case?
PS: Currently I have the following working implementation of B:
class B {
    public A aBean;

    public void setABean(A aBean, Validator validator) {
        if (validator.validate(aBean).size() > 0) {
            throw new ValidationException();
        }
        this.aBean = aBean;
    }
}
This implementation seems a bit clumsy to me and I want to replace it. Please help :)
It looks like you want to validate your bean in the process of injection.
Here is an example:
public class BeanValidator implements org.springframework.validation.Validator, InitializingBean {
    private Validator validator;

    public void afterPropertiesSet() throws Exception {
        ValidatorFactory validatorFactory = Validation.buildDefaultValidatorFactory();
        validator = validatorFactory.usingContext().getValidator();
    }

    public boolean supports(Class clazz) {
        return true;
    }

    public void validate(Object target, Errors errors) {
        Set<ConstraintViolation<Object>> constraintViolations = validator.validate(target);
        for (ConstraintViolation<Object> constraintViolation : constraintViolations) {
            String propertyPath = constraintViolation.getPropertyPath().toString();
            String message = constraintViolation.getMessage();
            errors.rejectValue(propertyPath, "", message);
        }
    }
}
You will need to implement InitializingBean to be able to validate the bean after its properties are set.
