I want to have an enum as a field of my entity.
My application looks like:
Spring Boot version:
plugins {
    id 'org.springframework.boot' version '2.6.2' apply false
}
repository:
@Repository
public interface MyEntityRepository extends PagingAndSortingRepository<MyEntity, UUID> {
    ...
}
entity:
#Table("my_entity")
public class MyEntity{
...
private FileType fileType;
// get + set
}
enum declaration:
public enum FileType {
TYPE_1(1),
TYPE_2(2);
int databaseId;
public static FileType byDatabaseId(Integer databaseId){
return Arrays.stream(values()).findFirst().orElse(null);
}
FileType(int databaseId) {
this.databaseId = databaseId;
}
public int getDatabaseId() {
return databaseId;
}
}
My attempt:
I've found the following answer and tried to follow it: https://stackoverflow.com/a/53296199/2674303
So I've added a bean:
@Bean
public JdbcCustomConversions jdbcCustomConversions() {
return new JdbcCustomConversions(asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter()));
}
converters:
@WritingConverter
public class FileTypeToDatabaseIdConverter implements Converter<FileType, Integer> {
@Override
public Integer convert(FileType source) {
return source.getDatabaseId();
}
}
@ReadingConverter
public class DatabaseIdToFileTypeConverter implements Converter<Integer, FileType> {
@Override
public FileType convert(Integer databaseId) {
return FileType.byDatabaseId(databaseId);
}
}
But I see an error:
The bean 'jdbcCustomConversions', defined in class path resource
[org/springframework/boot/autoconfigure/data/jdbc/JdbcRepositoriesAutoConfiguration$SpringBootJdbcConfiguration.class],
could not be registered. A bean with that name has already been
defined in my.pack.Main and overriding is disabled.
I've tried renaming the method jdbcCustomConversions() to myJdbcCustomConversions(). That avoided the error above, but the converter is not invoked during entity persistence, and I see another error: the application tries to save a String while the database column type is bigint.
20:39:10.689 DEBUG [main] o.s.jdbc.core.StatementCreatorUtils: JDBC getParameterType call failed - using fallback method instead: org.postgresql.util.PSQLException: ERROR: column "file_type" is of type bigint but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
Position: 174
I also tried to use the latest (at the time of writing) version of Spring Boot:
id 'org.springframework.boot' version '2.6.2' apply false
But it didn't help.
What have I missed?
How can I map an enum to an integer column properly?
P.S.
I use the following code for testing:
@SpringBootApplication
@EnableJdbcAuditing
@EnableScheduling
public class Main {
public static void main(String[] args) {
ConfigurableApplicationContext applicationContext = SpringApplication.run(Main.class, args);
MyEntityRepository repository = applicationContext.getBean(MyEntityRepository.class);
MyEntity entity = new MyEntity();
...
entity.setFileType(FileType.TYPE_2);
repository.save(entity);
}
@Bean
public ModelMapper modelMapper() {
ModelMapper mapper = new ModelMapper();
mapper.getConfiguration()
.setMatchingStrategy(MatchingStrategies.STRICT)
.setFieldMatchingEnabled(true)
.setSkipNullEnabled(true)
.setFieldAccessLevel(PRIVATE);
return mapper;
}
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
return new MySpringBootJdbcConfiguration();
}
@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
@Override
protected List<?> userConverters() {
return asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
}
}
}
UPDATE
My code is:
@SpringBootApplication
@EnableJdbcAuditing
@EnableScheduling
public class Main {
public static void main(String[] args) {
ConfigurableApplicationContext applicationContext = SpringApplication.run(Main.class, args);
MyEntityRepository repository = applicationContext.getBean(MyEntityRepository.class);
MyEntity entity = new MyEntity();
...
entity.setFileType(FileType.TYPE_2);
repository.save(entity);
}
@Bean
public ModelMapper modelMapper() {
ModelMapper mapper = new ModelMapper();
mapper.getConfiguration()
.setMatchingStrategy(MatchingStrategies.STRICT)
.setFieldMatchingEnabled(true)
.setSkipNullEnabled(true)
.setFieldAccessLevel(PRIVATE);
return mapper;
}
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
return new MySpringBootJdbcConfiguration();
}
@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
@Override
protected List<?> userConverters() {
return asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
}
@Bean
public JdbcConverter jdbcConverter(JdbcMappingContext mappingContext,
NamedParameterJdbcOperations operations,
@Lazy RelationResolver relationResolver,
JdbcCustomConversions conversions,
Dialect dialect) {
JdbcArrayColumns arrayColumns = dialect instanceof JdbcDialect ? ((JdbcDialect) dialect).getArraySupport()
: JdbcArrayColumns.DefaultSupport.INSTANCE;
DefaultJdbcTypeFactory jdbcTypeFactory = new DefaultJdbcTypeFactory(operations.getJdbcOperations(),
arrayColumns);
return new MyJdbcConverter(
mappingContext,
relationResolver,
conversions,
jdbcTypeFactory,
dialect.getIdentifierProcessing()
);
}
}
static class MyJdbcConverter extends BasicJdbcConverter {
MyJdbcConverter(
MappingContext<? extends RelationalPersistentEntity<?>, ? extends RelationalPersistentProperty> context,
RelationResolver relationResolver,
CustomConversions conversions,
JdbcTypeFactory typeFactory,
IdentifierProcessing identifierProcessing) {
super(context, relationResolver, conversions, typeFactory, identifierProcessing);
}
@Override
public int getSqlType(RelationalPersistentProperty property) {
if (FileType.class.equals(property.getActualType())) {
return Types.BIGINT;
} else {
return super.getSqlType(property);
}
}
@Override
public Class<?> getColumnType(RelationalPersistentProperty property) {
if (FileType.class.equals(property.getActualType())) {
return Long.class;
} else {
return super.getColumnType(property);
}
}
}
}
But I get this error:
Caused by: org.postgresql.util.PSQLException: Cannot convert an instance of java.lang.String to type long
at org.postgresql.jdbc.PgPreparedStatement.cannotCastException(PgPreparedStatement.java:925)
at org.postgresql.jdbc.PgPreparedStatement.castToLong(PgPreparedStatement.java:810)
at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:561)
at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:931)
at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.setObject(HikariProxyPreparedStatement.java)
at org.springframework.jdbc.core.StatementCreatorUtils.setValue(StatementCreatorUtils.java:414)
at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValueInternal(StatementCreatorUtils.java:231)
at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValue(StatementCreatorUtils.java:146)
at org.springframework.jdbc.core.PreparedStatementCreatorFactory$PreparedStatementCreatorImpl.setValues(PreparedStatementCreatorFactory.java:283)
at org.springframework.jdbc.core.PreparedStatementCreatorFactory$PreparedStatementCreatorImpl.createPreparedStatement(PreparedStatementCreatorFactory.java:241)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:649)
... 50 more
Caused by: java.lang.NumberFormatException: For input string: "TYPE_2"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Long.parseLong(Long.java:589)
at java.lang.Long.parseLong(Long.java:631)
at org.postgresql.jdbc.PgPreparedStatement.castToLong(PgPreparedStatement.java:792)
... 59 more
Try the following instead:
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
return new MySpringBootJdbcConfiguration();
}
@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
@Override
protected List<?> userConverters() {
return List.of(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
}
}
Explanation:
Spring complains that the jdbcCustomConversions bean defined in the auto-configuration class clashes with the one you have already defined in my.pack.Main, and bean overriding is disabled.
JdbcRepositoriesAutoConfiguration has changed a few times; in Spring Boot 2.6.2 it has:
@Configuration(proxyBeanMethods = false)
@ConditionalOnMissingBean(AbstractJdbcConfiguration.class)
static class SpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
}
In turn, AbstractJdbcConfiguration has:
@Bean
public JdbcCustomConversions jdbcCustomConversions() {
try {
Dialect dialect = applicationContext.getBean(Dialect.class);
SimpleTypeHolder simpleTypeHolder = dialect.simpleTypes().isEmpty() ? JdbcSimpleTypes.HOLDER
: new SimpleTypeHolder(dialect.simpleTypes(), JdbcSimpleTypes.HOLDER);
return new JdbcCustomConversions(
CustomConversions.StoreConversions.of(simpleTypeHolder, storeConverters(dialect)), userConverters());
} catch (NoSuchBeanDefinitionException exception) {
LOG.warn("No dialect found. CustomConversions will be configured without dialect specific conversions.");
return new JdbcCustomConversions();
}
}
As you can see, jdbcCustomConversions() is not conditional in any way, so defining your own bean caused a conflict. Fortunately, AbstractJdbcConfiguration provides an extension point, userConverters(), which can be overridden to supply your own converters.
Update
As discussed in comments:
FileType.byDatabaseId is broken - it ignores its input parameter (see the corrected sketch after this list)
since the column type in the database is BIGINT, your converters must convert from Long, not from Integer; this addresses read queries
for writes, there is an open bug: https://github.com/spring-projects/spring-data-jdbc/issues/629. There is a hardcoded assumption that enums are converted to Strings, and only Enum -> String converters are checked.
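To make the first two points concrete, here is a minimal sketch of a fixed lookup plus Long-based converters. It reuses the class names from the question, but the filter logic and the Long target types are my reading of the fix, not code from the original post:
public static FileType byDatabaseId(Long databaseId) {
    // actually use the parameter instead of always returning the first constant
    return Arrays.stream(values())
            .filter(type -> databaseId != null && type.databaseId == databaseId)
            .findFirst()
            .orElse(null);
}

@ReadingConverter
public class DatabaseIdToFileTypeConverter implements Converter<Long, FileType> {
    @Override
    public FileType convert(Long databaseId) {
        return FileType.byDatabaseId(databaseId);
    }
}

@WritingConverter
public class FileTypeToDatabaseIdConverter implements Converter<FileType, Long> {
    @Override
    public Long convert(FileType source) {
        return (long) source.getDatabaseId();
    }
}
The writing converter targets Long so that it lines up with the Long/BIGINT column type handled by the converter subclass described next; because of the bug above, the write path still needs that subclass.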
As we want to convert to Long, we need to amend BasicJdbcConverter by subclassing it and registering the subclass as a @Bean.
You need to override two methods:
public int getSqlType(RelationalPersistentProperty property)
public Class<?> getColumnType(RelationalPersistentProperty property)
I hardcoded the enum type and the corresponding column types, but you may want to get more fancy with that (a more generic sketch follows the code below).
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
return new MySpringBootJdbcConfiguration();
}
@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
@Override
protected List<?> userConverters() {
return List.of(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
}
@Bean
public JdbcConverter jdbcConverter(JdbcMappingContext mappingContext,
NamedParameterJdbcOperations operations,
@Lazy RelationResolver relationResolver,
JdbcCustomConversions conversions,
Dialect dialect) {
JdbcArrayColumns arrayColumns = dialect instanceof JdbcDialect ? ((JdbcDialect) dialect).getArraySupport()
: JdbcArrayColumns.DefaultSupport.INSTANCE;
DefaultJdbcTypeFactory jdbcTypeFactory = new DefaultJdbcTypeFactory(operations.getJdbcOperations(),
arrayColumns);
return new MyJdbcConverter(
mappingContext,
relationResolver,
conversions,
jdbcTypeFactory,
dialect.getIdentifierProcessing()
);
}
}
static class MyJdbcConverter extends BasicJdbcConverter {
MyJdbcConverter(
MappingContext<? extends RelationalPersistentEntity<?>, ? extends RelationalPersistentProperty> context,
RelationResolver relationResolver,
CustomConversions conversions,
JdbcTypeFactory typeFactory,
IdentifierProcessing identifierProcessing) {
super(context, relationResolver, conversions, typeFactory, identifierProcessing);
}
@Override
public int getSqlType(RelationalPersistentProperty property) {
if (FileType.class.equals(property.getActualType())) {
return Types.BIGINT;
} else {
return super.getSqlType(property);
}
}
@Override
public Class<?> getColumnType(RelationalPersistentProperty property) {
if (FileType.class.equals(property.getActualType())) {
return Long.class;
} else {
return super.getColumnType(property);
}
}
}
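If you want the "more fancy" variant instead of hardcoding FileType, one option is a small marker interface that every enum stored as BIGINT implements; the DatabaseIdAware name below is purely illustrative and not part of the original code:
// hypothetical marker interface for enums persisted as BIGINT
interface DatabaseIdAware {
    int getDatabaseId();
}

// inside MyJdbcConverter the overrides can then key off the interface
@Override
public int getSqlType(RelationalPersistentProperty property) {
    if (DatabaseIdAware.class.isAssignableFrom(property.getActualType())) {
        return Types.BIGINT;
    }
    return super.getSqlType(property);
}

@Override
public Class<?> getColumnType(RelationalPersistentProperty property) {
    if (DatabaseIdAware.class.isAssignableFrom(property.getActualType())) {
        return Long.class;
    }
    return super.getColumnType(property);
}
FileType would then declare that it implements DatabaseIdAware (its existing getDatabaseId() already matches), and additional enums can be persisted the same way without touching the converter again.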
I have created a scheduler class:
public class TestSchedulderNew {
@Scheduled(fixedDelay = 3000)
public void fixedRateJob1() {
System.out.println("Job 1 running");
}
@Scheduled(fixedDelay = 3000)
public void fixedRateJob2() {
System.out.println("Job 2 running");
}
}
In the configuration I have put the @ConditionalOnProperty annotation to enable this bean conditionally.
@Bean
@ConditionalOnProperty(value = "jobs.enabled")
public TestSchedulderNew testSchedulderNew() {
return new TestSchedulderNew();
}
Now in the controller I have created a "stopScheduler" method to stop those schedulers; in this controller I have autowired the TestSchedulderNew class:
@RestController
@RequestMapping("/api")
public class TestCont {
private static final String SCHEDULED_TASKS = "testSchedulderNew";
@Autowired
private ScheduledAnnotationBeanPostProcessor postProcessor;
@Autowired
private TestSchedulderNew testSchedulderNew;
@GetMapping(value = "/stopScheduler")
public String stopSchedule(){
postProcessor.postProcessBeforeDestruction(testSchedulderNew,
SCHEDULED_TASKS);
return "OK";
}
}
Now the problem is that if the conditional property is false, I get the exception below:
Field testSchedulderNew in com.sbill.app.web.rest.TestCont required a bean of type 'com.sbill.app.schedulerJob.TestSchedulderNew
When it is true, everything works fine.
Do we have any option to solve this?
You can use @Autowired(required = false) and a null check in the stopScheduler method.
@Autowired(required = false)
private TestSchedulderNew testSchedulderNew;
@GetMapping(value = "/stopScheduler")
public String stopSchedule() {
if (testSchedulderNew != null) {
postProcessor.postProcessBeforeDestruction(testSchedulderNew,
SCHEDULED_TASKS);
return "OK";
}
return "NOT_OK";
}
I want to set up an object pool in my Spring application using annotations only.
I started out with this example taken from the Spring docs: http://docs.spring.io/spring/docs/2.5.x/reference/aop-api.html#aop-ts-pool.
Here is how I translated the XML configuration:
@Configuration
@EnableAutoConfiguration
@ComponentScan
public class SpringObjectPoolTest {
public static void main(String[] args) throws Exception {
context = new SpringApplicationBuilder(SpringObjectPoolTest.class) //
.addCommandLineProperties(false) //
.web(false) //
.headless(false) //
.registerShutdownHook(true) //
.application() //
.run();
context.getBean(SpringObjectPoolTest.class).go();
}
#Resource(name = "pfb")
private FactoryBean<MyTask> pool;
#Resource(name="pool")
private TargetSource targetSource;
private static ConfigurableApplicationContext context;
#Bean(name = "task")
#Scope("prototype")
public MyTask createNewTask() {
return new MyTask();
}
#Bean(name = "pool")
public CommonsPoolTargetSource setupObjectPool() {
CommonsPoolTargetSource pc = new CommonsPoolTargetSource();
pc.setMaxSize(25);
pc.setTargetBeanName("task");
return pc;
}
#Bean(name = "pfb")
public ProxyFactoryBean createProxyFactoryBean() {
ProxyFactoryBean pfb = new ProxyFactoryBean();
pfb.setTargetSource(targetSource);
return pfb;
}
private void go() {
try {
pool.getObject().speak();
} catch (Exception e) {
e.printStackTrace();
}
}
}
However, I get this exception:
org.springframework.beans.factory.BeanCurrentlyInCreationException:
Error creating bean with name 'pfb':
org.springframework.beans.factory.FactoryBeanNotInitializedException:
Cannot determine target class for proxy
You are over-engineering this a bit. Spring already knows how to inject a proxied MyTask; there is no need to have a FactoryBean<MyTask> or to call getObject() on the pool. For "pooledTask" below, Spring knows that by injecting a ProxyFactoryBean ("pfb") it will actually inject the instance that the factory bean creates, not the factory bean itself. Here's how I'd do it:
@Configuration
@EnableAutoConfiguration
@ComponentScan
public class SpringObjectPoolTest {
public static void main(String[] args) throws Exception {
context = new SpringApplicationBuilder(SpringObjectPoolTest.class) //
.addCommandLineProperties(false) //
.web(false) //
.headless(false) //
.registerShutdownHook(true) //
.application() //
.run();
context.getBean(SpringObjectPoolTest.class).go();
}
private static ConfigurableApplicationContext context;
#Resource(name = "pfb")
private MyTask pooledTask;
#Resource(name="pool")
private CommonsPoolTargetSource targetSource;
#Bean(name = "task")
#Scope("prototype")
public MyTask createNewTask() {
return new MyTask();
}
#Bean(name = "pool")
public CommonsPoolTargetSource setupObjectPool() {
CommonsPoolTargetSource pc = new CommonsPoolTargetSource();
pc.setMaxSize(25);
pc.setTargetBeanName("task");
return pc;
}
#Bean(name = "pfb")
public ProxyFactoryBean createProxyFactoryBean() {
ProxyFactoryBean pfb = new ProxyFactoryBean();
pfb.setTargetSource(setupObjectPool());
return pfb;
}
private void go() {
try {
pooledTask.speak();
// getting another object from pool
MyTask someOtherTask = (MyTask) targetSource.getTarget();
// returning the object to the pool
targetSource.releaseTarget(someOtherTask);
} catch (Exception e) {
e.printStackTrace();
}
}
}
Given the following service:
public interface MyService {
void method();
}
And its implementation:
@Service
public class MyServiceImpl implements MyService {
@Transactional
@CustomAnnotation
@Override
public void method() {
...
}
}
I would like to use a StaticMethodMatcherPointcutAdvisor in the following manner:
public class MyPointcutAdvisor extends StaticMethodMatcherPointcutAdvisor {
...
@Override
public boolean matches(Method method, Class targetClass) {
Method m = method;
if(annotationPresent(method)) {
return true;
}
Class<?> userClass = ClassUtils.getUserClass(targetClass);
Method specificMethod = ClassUtils.getMostSpecificMethod(method, userClass);
specificMethod = BridgeMethodResolver.findBridgedMethod(specificMethod);
if (annotationPresent(specificMethod)) {
return true;
}
return false;
}
...
}
The problem is that Spring uses an InfrastructureAdvisorAutoProxyCreator to create a proxy of that class, whereas the DefaultAdvisorAutoProxyCreator would create the proxy for the MyPointcutAdvisor; as a result, MyPointcutAdvisor is only given the proxy as the targetClass parameter. Thus, the PointcutAdvisor cannot find the annotation and therefore does not match.
For completeness, this is my configuration class:
@Configuration
@EnableTransactionManagement
public class MyConfiguration {
@Bean
public DefaultAdvisorAutoProxyCreator defaultAdvisorAutoProxyCreator() {
return new DefaultAdvisorAutoProxyCreator();
}
@Bean
public MyPointcutAdvisor myPointcutAdvisor() {
return new MyPointcutAdvisor();
}
...
}
My question is: Is there a way to use @EnableTransactionManagement in combination with a StaticMethodMatcherPointcutAdvisor?
Workarounds:
Put @CustomAnnotation into the service interface: I want to keep my interfaces clean.
Add @Role(BeanDefinition.ROLE_INFRASTRUCTURE) to the MyPointcutAdvisor bean configuration, so that the InfrastructureAdvisorAutoProxyCreator will create the proxy. This seems like the wrong way, since this bean is not infrastructure.
Copy the beans from ProxyTransactionManagementConfiguration, remove @EnableTransactionManagement and remove @Role(BeanDefinition.ROLE_INFRASTRUCTURE), so that the DefaultAdvisorAutoProxyCreator will create the proxy. This is my current workaround and results in the following configuration:
@Configuration
public class MyWorkaroundConfiguration {
@Bean
public DefaultAdvisorAutoProxyCreator defaultAdvisorAutoProxyCreator() {
return new DefaultAdvisorAutoProxyCreator();
}
@Bean
public MyPointcutAdvisor myPointcutAdvisor() {
return new MyPointcutAdvisor();
}
@Bean
public TransactionAttributeSource transactionAttributeSource() {
return new AnnotationTransactionAttributeSource();
}
@Bean(name = TransactionManagementConfigUtils.TRANSACTION_ADVISOR_BEAN_NAME)
public BeanFactoryTransactionAttributeSourceAdvisor transactionAdvisor(
TransactionInterceptor transactionInterceptor) {
BeanFactoryTransactionAttributeSourceAdvisor advisor =
new BeanFactoryTransactionAttributeSourceAdvisor();
advisor.setTransactionAttributeSource(transactionAttributeSource());
advisor.setAdvice(transactionInterceptor);
return advisor;
}
@Bean
public TransactionInterceptor transactionInterceptor(
PlatformTransactionManager transactionManager) {
TransactionInterceptor interceptor = new TransactionInterceptor();
interceptor.setTransactionAttributeSource(transactionAttributeSource());
interceptor.setTransactionManager(transactionManager);
return interceptor;
}
...
}
Using @EnableAspectJAutoProxy instead of the DefaultAdvisorAutoProxyCreator works for me.
@Configuration
@EnableAspectJAutoProxy
@EnableTransactionManagement
public class MyConfiguration {
}
This also allows using @Aspect like M. Deinum suggested.
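For reference, here is a minimal sketch of what such an aspect could look like; the advice body is an illustrative placeholder, and CustomAnnotation is assumed to be the runtime-retained annotation from the question:
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.stereotype.Component;

@Aspect
@Component
public class CustomAnnotationAspect {

    // matches any method annotated with @CustomAnnotation, including
    // @Transactional service methods proxied via @EnableTransactionManagement
    @Around("@annotation(customAnnotation)")
    public Object around(ProceedingJoinPoint pjp, CustomAnnotation customAnnotation) throws Throwable {
        // custom behaviour before the method runs goes here
        try {
            return pjp.proceed();
        } finally {
            // custom behaviour after the method runs goes here
        }
    }
}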
I have an AnnotationConfigApplicationContext created with a @Configuration-annotated class as a parameter:
@Configuration
class Config {
@Bean
public LocalValidatorFactoryBean localValidatorFactoryBean() {
return new LocalValidatorFactoryBean();
}
@Bean
public A aBean() {
return new A();
}
@Bean
public B bBean() {
return new B();
}
}
Where A and B are:
class A {
@Min(1)
public int myInt;
}
class B {
@Autowired(required = true)
@Valid
public A aBean;
}
Q: Is it possible to make Spring process the @Valid annotation in this case?
PS: Currently I have the following working implementation of B:
class B {
public A aBean;
public void setABean(A aBean, Validator validator) {
if (validator.validate(aBean).size() > 0) {
throw new ValidationException();
}
this.aBean = aBean;
}
}
This implementation seems a bit clumsy to me and I want to replace it. Please help :)
It looks like you want to validate your bean in the process of injection.
You can read more about it here.
Here is an example:
public class BeanValidator implements org.springframework.validation.Validator, InitializingBean {
private Validator validator;
public void afterPropertiesSet() throws Exception {
ValidatorFactory validatorFactory = Validation.buildDefaultValidatorFactory();
validator = validatorFactory.usingContext().getValidator();
}
public boolean supports(Class clazz) {
return true;
}
public void validate(Object target, Errors errors) {
Set<ConstraintViolation<Object>> constraintViolations = validator.validate(target);
for (ConstraintViolation<Object> constraintViolation : constraintViolations) {
String propertyPath = constraintViolation.getPropertyPath().toString();
String message = constraintViolation.getMessage();
errors.rejectValue(propertyPath, "", message);
}
}
}
You will need to implement InitializingBean so that the underlying Validator is initialized once the bean's properties have been set.
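For completeness, here is one way the validator could be wired in and used while constructing B. The bean method names, the BeanPropertyBindingResult, and the ValidationException are my additions for illustration, not part of the answer above:
import javax.validation.ValidationException;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.validation.BeanPropertyBindingResult;
import org.springframework.validation.Errors;

@Configuration
class Config {

    @Bean
    public BeanValidator beanValidator() {
        // Spring calls afterPropertiesSet(), so the underlying Validator is ready for use
        return new BeanValidator();
    }

    @Bean
    public A aBean() {
        return new A();
    }

    @Bean
    public B bBean(A aBean, BeanValidator beanValidator) {
        // validate the injected A before handing it to B
        Errors errors = new BeanPropertyBindingResult(aBean, "aBean");
        beanValidator.validate(aBean, errors);
        if (errors.hasErrors()) {
            throw new ValidationException("A is not valid: " + errors.getAllErrors());
        }
        B b = new B();
        b.aBean = aBean;
        return b;
    }
}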