I have an AnnotationConfigApplicationContext created with a @Configuration-annotated class as a parameter:
@Configuration
class Config {
@Bean
public LocalValidatorFactoryBean localValidatorFactoryBean() {
return new LocalValidatorFactoryBean();
}
@Bean
public A aBean() {
return new A();
}
@Bean
public B bBean() {
return new B();
}
}
Where A and B are:
class A {
@Min(1)
public int myInt;
}
class B {
@Autowired(required = true)
@Valid
public A aBean;
}
Q: Is it possible to make Spring process the @Valid annotation in this case?
PS: Currently I have the following working implementation of B:
class B {
public A aBean;
public void setABean(A aBean, Validator validator) {
if (validator.validate(aBean).size() > 0) {
throw new ValidationException();
}
this.aBean = aBean;
}
}
This impl seems a bit clumsy to me and I want to replace it. Please help :)
It looks like you want to validate your bean in the process of injection.
You can read more about it here.
Here is an example:
public class BeanValidator implements org.springframework.validation.Validator, InitializingBean {
private Validator validator;
public void afterPropertiesSet() throws Exception {
ValidatorFactory validatorFactory = Validation.buildDefaultValidatorFactory();
validator = validatorFactory.usingContext().getValidator();
}
public boolean supports(Class<?> clazz) {
return true;
}
public void validate(Object target, Errors errors) {
Set<ConstraintViolation<Object>> constraintViolations = validator.validate(target);
for (ConstraintViolation<Object> constraintViolation : constraintViolations) {
String propertyPath = constraintViolation.getPropertyPath().toString();
String message = constraintViolation.getMessage();
errors.rejectValue(propertyPath, "", message);
}
}
}
You need to implement InitializingBean so that the underlying javax.validation Validator is built in afterPropertiesSet(), before any bean is validated.
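To connect this back to the original question, here is a rough sketch (not part of the answer above; the bean wiring and names are my own assumptions) of how B could delegate to such a BeanValidator during setter injection instead of calling javax.validation directly:
// Sketch only: register the BeanValidator from the answer as a bean.
@Configuration
class ValidationConfig {
    @Bean
    public BeanValidator beanValidator() {
        return new BeanValidator();
    }
}

// B validates its dependency through the Spring Validator abstraction.
// Errors and BeanPropertyBindingResult come from org.springframework.validation.
class B {
    private A aBean;

    @Autowired
    public void setABean(A aBean, BeanValidator beanValidator) {
        Errors errors = new BeanPropertyBindingResult(aBean, "aBean");
        beanValidator.validate(aBean, errors);
        if (errors.hasErrors()) {
            throw new ValidationException(errors.toString()); // javax.validation.ValidationException
        }
        this.aBean = aBean;
    }
}
This keeps the constraint handling in one reusable place, but it is still setter-based validation rather than Spring processing @Valid on the field itself.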
I implemented validation using the chain of responsibility pattern. The request payload to validate can have different parameters. The logic is: if the payload has certain parameters, validate them and continue with the next validator; otherwise throw an exception. At one level of the validation chain I need to call other services, and this is where dependency injection comes into play.
The validation structure is like a tree, starting from top to bottom.
So, this is the class where I need to start the validation:
@Service
public class ServiceImpl implements Service {
private final .....;
private final Validator validator;
public ServiceImpl(
@Qualifier("lastLevelValidator") Validator validator, .....) {
this.validator = validator;
this...........=............;
}
/...../
private void validateContext(RequestContext rc) {
Validator validation = new FirstLevelValidator(validator);
validation.validate(rc);
}
}
And the Validator interface:
public interface Validator<T> {
void validate(T object);
}
The validation classes that implement Validator:
@Component
public class FirstLevelValidator implements Validator<RequestContext>{
private final Validator<RequestContext> validator;
@Autowired
public FirstLevelValidator(@Qualifier("lastLevelValidator") Validator<RequestContext> validator) {
this.validator = validator;
}
@Override
public void validate(RequestContext requestContext) {
if ( requestContext.getData() == null ) {
LOGGER.error(REQUEST_ERROR_MSG);
throw new BadRequestException(REQUEST_ERROR_MSG, INVALID_CODE);
}
if ("Some Data".equals(requestContext.getData())) {
Validator validator = new SecondLevelValidator(this.validator);
validator.validate(requestContext);
} else {/* other */ }
}
}
@Component
public class SecondLevelValidator implements Validator<RequestContext>{
private final Validator<RequestContext> validator;
@Autowired
public SecondLevelValidator(@Qualifier("lastLevelValidator") Validator<RequestContext> validator) {
this.validator = validator;
}
@Override
public void validate(RequestContext requestContext) {
if ( requestContext.getOption() == null ) {
LOGGER.error(REQUEST_ERROR_MSG);
throw new BadRequestException(REQUEST_ERROR_MSG, INVALID_CODE);
}
if ( " SOME ".equals(requestContext.getOption()) ) {
validator.validate(requestContext); // HERE IS WHERE I CALL THE Qualifier
}
}
}
@Component
public class LastLevelValidator implements Validator<RequestContext>{
private final ClientService1 client1;
private final ClientService2 client2;
public LastLevelValidator(ClientService1 client1, ClientService2 client2) {
this.client1 = client1;
this.client2 = client2;
}
@Override
public void validate(RequestContext requestContext) {
Integer userId = client2.getId();
List<ClientService1Response> list = client1.call(requestContext.id(), userId);
boolean isIdListValid = list
.stream()
.map(clientService1Response -> clientService1Response.getId())
.collect(Collectors.toSet()).containsAll(requestContext.getListId());
if (!isIdListValid) {
LOGGER.error(NOT_FOUND);
throw new BadRequestException(NOT_FOUND, INVALID_CODE);
} else { LOGGER.info("Context List validated"); }
}
}
In the LastLevelValidator I need to call other services to perform the validation. For that, I inject the @Qualifier("lastLevelValidator") bean into each validator class (First.., Second..), so when I need to run the LastLevelValidator I can simply call validator.validate(requestContext) instead of validator.validate(clientService1, clientService2), which would force me to propagate the ClientService objects through the whole chain from the ServiceImpl class.
Is this a good solution?
Is there any concern I didn't evaluate?
I also tried declaring the services I need for the validation as static fields in LastLevelValidation, so that I could call it like LastLevelValidation.methodvalidar(), but declaring static objects does not look like good practice.
I also tried passing the objects I need down through each validation class, but then if I need another object for the validation I have to pass it through the whole validation chain.
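For what it's worth, one alternative worth sketching here (not part of the original post; the qualifier names are assumptions) is to let Spring wire every level of the chain, so no validator is created with new and no ClientService has to travel down the chain:
// Sketch: each level is a bean that receives the next level via constructor injection.
@Component("firstLevelValidator")
public class FirstLevelValidator implements Validator<RequestContext> {
    private final Validator<RequestContext> next;

    public FirstLevelValidator(@Qualifier("secondLevelValidator") Validator<RequestContext> next) {
        this.next = next;
    }

    @Override
    public void validate(RequestContext requestContext) {
        // first-level checks here ...
        next.validate(requestContext); // delegate to the second level
    }
}
With that wiring, ServiceImpl would only inject @Qualifier("firstLevelValidator"), and LastLevelValidator keeps its ClientService dependencies entirely to itself.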
In Spring, when I inject a list of beans, I want only specific implementations of the interface to be injected, depending on where the list is used. Is this possible? What would be the cleanest way to configure it? For example, I have the following validators:
public interface Validator {
Optional<Error> validate(MultipartFile file);
}
public class Validator1 implements Validator {
public Optional<Error> validate(MultipartFile file) {
// do stuff
}
}
public class Validator2 implements Validator {
public Optional<Error> validate(MultipartFile file) {
// do stuff
}
}
public class Validator3 implements Validator {
public Optional<Error> validate(MultipartFile file) {
// do stuff
}
}
And then I have a validation service which looks similar to this:
public class ValidationService {
@Autowired
private List<Validator> validators;
public List<Error> validate(MultipartFile file) {
List<Error> errors = new ArrayList<>();
validators.forEach(v -> {
Optional<Error> error = v.validate(file);
if (error.isPresent()) {
errors.add(error.get());
}
});
return errors;
}
}
And then I have some services, which use the ValidationService, e.g:
public class Service1 {
@Autowired
private ValidationService validationService;
public void doStuff(MultipartFile file) {
...
validationService.validate(file);
...
}
}
public class Service2 {
@Autowired
private ValidationService validationService;
public void doStuff(MultipartFile file) {
...
validationService.validate(file);
...
}
}
When Service1 calls validate, I want only Validator1 and Validator2 to have been injected into the ValidationService.
When Service2 calls validate, I want only Validator2 and Validator3 to have been injected into the ValidationService.
Hope I have explained this clearly enough. Thanks in advance for any help offered.
Create the beans like this, with @Qualifier annotations:
@Component
@Qualifier("validator1")
public class Validator1 implements Validator {
public Optional<Error> validate(MultipartFile file) {
// do stuff
}
}
@Component
@Qualifier("validator2")
public class Validator2 implements Validator {
public Optional<Error> validate(MultipartFile file) {
// do stuff
}
}
@Component
@Qualifier("validator3")
public class Validator3 implements Validator {
public Optional<Error> validate(MultipartFile file) {
// do stuff
}
}
and inject it like this:
@Autowired
@Qualifier("validator1")
private Validator validator;
Update
You can also create a bean holding a collection of all the validators, like this:
@Bean("validators")
public List<Validator> validatorList(Validator1 validator1, Validator2 validator2, Validator3 validator3) {
return Arrays.asList(validator1, validator2, validator3);
}
and then inject the list bean as:
@Autowired
@Qualifier("validators")
private List<Validator> validators;
Check this page for a detailed example - https://www.baeldung.com/spring-injecting-collections
This is likely not the best way to do this.
Here is how I would do it, based on my current understanding of Spring.
Summary:
Create a bean method for each collection of implementations.
In your case, create a bean method for a List<Validator> that contains Validator1 and Validator2 and create a second List<Validator> that contains Validator2 and Validator3.
Inject the desired List using @Qualifier.
The code should be something like this:
#Configuration
public class ValidatorLists
{
private void getAndAddBean(
final ApplicationContext applicationContext,
final List<Validator> list,
final String beanName)
{
final Validator bean;
bean = (Validator) applicationContext.getBean(beanName);
if (bean != null)
{
list.add(bean);
}
}
@Bean("ValidatorList1")
public List<Validator> validatorList1(final ApplicationContext applicationContext)
{
final List<Validator> returnValue = new LinkedList<>();
getAndAddBean(applicationContext, returnValue, "ValidatorImpl1");
getAndAddBean(applicationContext, returnValue, "ValidatorImpl2");
return returnValue;
}
@Bean("ValidatorList2")
public List<Validator> validatorList2(final ApplicationContext applicationContext)
{
final List<Validator> returnValue = new LinkedList<>();
getAndAddBean(applicationContext, returnValue, "ValidatorImpl2");
getAndAddBean(applicationContext, returnValue, "ValidatorImpl3");
return returnValue;
}
}
Then reference the list by qualifier.
@Autowired
@Qualifier("ValidatorList1")
private List<Validator> validators;
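To tie this back to Service1 and Service2 (this part is my own sketch, with assumed bean names, and it assumes ValidationService takes its validators through the constructor instead of field injection), you could define two ValidationService beans, one per qualified list:
@Configuration
public class ValidationServices {

    // Wired with the list intended for Service1.
    @Bean("validationService1")
    public ValidationService validationService1(@Qualifier("ValidatorList1") List<Validator> validators) {
        return new ValidationService(validators);
    }

    // Wired with the list intended for Service2.
    @Bean("validationService2")
    public ValidationService validationService2(@Qualifier("ValidatorList2") List<Validator> validators) {
        return new ValidationService(validators);
    }
}
Service1 then injects @Qualifier("validationService1") and Service2 injects @Qualifier("validationService2"), so each sees only the validators meant for it.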
I want to have an enum as a field of my entity.
My application looks like this:
Spring boot version
plugins {
id 'org.springframework.boot' version '2.6.2' apply false
}
repository:
@Repository
public interface MyEntityRepository extends PagingAndSortingRepository<MyEntity, UUID> {
...
entity:
@Table("my_entity")
public class MyEntity{
...
private FileType fileType;
// get + set
}
enum declaration:
public enum FileType {
TYPE_1(1),
TYPE_2(2);
int databaseId;
public static FileType byDatabaseId(Integer databaseId){
return Arrays.stream(values()).findFirst().orElse(null);
}
FileType(int databaseId) {
this.databaseId = databaseId;
}
public int getDatabaseId() {
return databaseId;
}
}
My attempt:
I've found the following answer and tried to follow it: https://stackoverflow.com/a/53296199/2674303
So I've added a bean:
@Bean
public JdbcCustomConversions jdbcCustomConversions() {
return new JdbcCustomConversions(asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter()));
}
converters:
@WritingConverter
public class FileTypeToDatabaseIdConverter implements Converter<FileType, Integer> {
@Override
public Integer convert(FileType source) {
return source.getDatabaseId();
}
}
@ReadingConverter
public class DatabaseIdToFileTypeConverter implements Converter<Integer, FileType> {
@Override
public FileType convert(Integer databaseId) {
return FileType.byDatabaseId(databaseId);
}
}
But I see an error:
The bean 'jdbcCustomConversions', defined in class path resource
[org/springframework/boot/autoconfigure/data/jdbc/JdbcRepositoriesAutoConfiguration$SpringBootJdbcConfiguration.class],
could not be registered. A bean with that name has already been
defined in my.pack.Main and overriding is disabled.
I've tried renaming the method jdbcCustomConversions() to myJdbcCustomConversions(). That avoided the error above, but the converter is not invoked during entity persistence, and I see another error: the application tries to save a String while the database column type is bigint.
20:39:10.689 DEBUG [main] o.s.jdbc.core.StatementCreatorUtils: JDBC getParameterType call failed - using fallback method instead: org.postgresql.util.PSQLException: ERROR: column "file_type" is of type bigint but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
Position: 174
I also tried to use the latest (currently) version of Spring Boot:
id 'org.springframework.boot' version '2.6.2' apply false
But it didn't help.
What have I missed?
How can I map an enum to an integer column properly?
P.S.
I use the following code for testing:
@SpringBootApplication
@EnableJdbcAuditing
@EnableScheduling
public class Main {
public static void main(String[] args) {
ConfigurableApplicationContext applicationContext = SpringApplication.run(Main.class, args);
MyEntityRepository repository = applicationContext.getBean(MyEntityRepository.class);
MyEntity entity = new MyEntity();
...
entity.setFileType(FileType.TYPE_2);
repository.save(entity);
}
@Bean
public ModelMapper modelMapper() {
ModelMapper mapper = new ModelMapper();
mapper.getConfiguration()
.setMatchingStrategy(MatchingStrategies.STRICT)
.setFieldMatchingEnabled(true)
.setSkipNullEnabled(true)
.setFieldAccessLevel(PRIVATE);
return mapper;
}
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
return new MySpringBootJdbcConfiguration();
}
@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
@Override
protected List<?> userConverters() {
return asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
}
}
}
UPDATE
My code is:
@SpringBootApplication
@EnableJdbcAuditing
@EnableScheduling
public class Main {
public static void main(String[] args) {
ConfigurableApplicationContext applicationContext = SpringApplication.run(Main.class, args);
MyEntityRepository repository = applicationContext.getBean(MyEntityRepository.class);
MyEntity entity = new MyEntity();
...
entity.setFileType(FileType.TYPE_2);
repository.save(entity);
}
@Bean
public ModelMapper modelMapper() {
ModelMapper mapper = new ModelMapper();
mapper.getConfiguration()
.setMatchingStrategy(MatchingStrategies.STRICT)
.setFieldMatchingEnabled(true)
.setSkipNullEnabled(true)
.setFieldAccessLevel(PRIVATE);
return mapper;
}
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
return new MySpringBootJdbcConfiguration();
}
@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
@Override
protected List<?> userConverters() {
return asList(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
}
@Bean
public JdbcConverter jdbcConverter(JdbcMappingContext mappingContext,
NamedParameterJdbcOperations operations,
@Lazy RelationResolver relationResolver,
JdbcCustomConversions conversions,
Dialect dialect) {
JdbcArrayColumns arrayColumns = dialect instanceof JdbcDialect ? ((JdbcDialect) dialect).getArraySupport()
: JdbcArrayColumns.DefaultSupport.INSTANCE;
DefaultJdbcTypeFactory jdbcTypeFactory = new DefaultJdbcTypeFactory(operations.getJdbcOperations(),
arrayColumns);
return new MyJdbcConverter(
mappingContext,
relationResolver,
conversions,
jdbcTypeFactory,
dialect.getIdentifierProcessing()
);
}
}
static class MyJdbcConverter extends BasicJdbcConverter {
MyJdbcConverter(
MappingContext<? extends RelationalPersistentEntity<?>, ? extends RelationalPersistentProperty> context,
RelationResolver relationResolver,
CustomConversions conversions,
JdbcTypeFactory typeFactory,
IdentifierProcessing identifierProcessing) {
super(context, relationResolver, conversions, typeFactory, identifierProcessing);
}
@Override
public int getSqlType(RelationalPersistentProperty property) {
if (FileType.class.equals(property.getActualType())) {
return Types.BIGINT;
} else {
return super.getSqlType(property);
}
}
@Override
public Class<?> getColumnType(RelationalPersistentProperty property) {
if (FileType.class.equals(property.getActualType())) {
return Long.class;
} else {
return super.getColumnType(property);
}
}
}
}
But I get the following error:
Caused by: org.postgresql.util.PSQLException: Cannot convert an instance of java.lang.String to type long
at org.postgresql.jdbc.PgPreparedStatement.cannotCastException(PgPreparedStatement.java:925)
at org.postgresql.jdbc.PgPreparedStatement.castToLong(PgPreparedStatement.java:810)
at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:561)
at org.postgresql.jdbc.PgPreparedStatement.setObject(PgPreparedStatement.java:931)
at com.zaxxer.hikari.pool.HikariProxyPreparedStatement.setObject(HikariProxyPreparedStatement.java)
at org.springframework.jdbc.core.StatementCreatorUtils.setValue(StatementCreatorUtils.java:414)
at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValueInternal(StatementCreatorUtils.java:231)
at org.springframework.jdbc.core.StatementCreatorUtils.setParameterValue(StatementCreatorUtils.java:146)
at org.springframework.jdbc.core.PreparedStatementCreatorFactory$PreparedStatementCreatorImpl.setValues(PreparedStatementCreatorFactory.java:283)
at org.springframework.jdbc.core.PreparedStatementCreatorFactory$PreparedStatementCreatorImpl.createPreparedStatement(PreparedStatementCreatorFactory.java:241)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:649)
... 50 more
Caused by: java.lang.NumberFormatException: For input string: "TYPE_2"
at java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
at java.lang.Long.parseLong(Long.java:589)
at java.lang.Long.parseLong(Long.java:631)
at org.postgresql.jdbc.PgPreparedStatement.castToLong(PgPreparedStatement.java:792)
... 59 more
Try the following instead:
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
return new MySpringBootJdbcConfiguration();
}
@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
@Override
protected List<?> userConverters() {
return List.of(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
}
}
Explanation:
Spring complains that the jdbcCustomConversions bean the auto-configuration class tries to define is already defined (by your own bean), and bean definition overriding is disabled.
JdbcRepositoriesAutoConfiguration has changed a few times; in Spring Boot 2.6.2 it has:
@Configuration(proxyBeanMethods = false)
@ConditionalOnMissingBean(AbstractJdbcConfiguration.class)
static class SpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
}
In turn, AbstractJdbcConfiguration has:
@Bean
public JdbcCustomConversions jdbcCustomConversions() {
try {
Dialect dialect = applicationContext.getBean(Dialect.class);
SimpleTypeHolder simpleTypeHolder = dialect.simpleTypes().isEmpty() ? JdbcSimpleTypes.HOLDER
: new SimpleTypeHolder(dialect.simpleTypes(), JdbcSimpleTypes.HOLDER);
return new JdbcCustomConversions(
CustomConversions.StoreConversions.of(simpleTypeHolder, storeConverters(dialect)), userConverters());
} catch (NoSuchBeanDefinitionException exception) {
LOG.warn("No dialect found. CustomConversions will be configured without dialect specific conversions.");
return new JdbcCustomConversions();
}
}
As you can see, jdbcCustomConversions() is not conditional in any way, so defining your own bean caused a conflict. Fortunately, AbstractJdbcConfiguration provides an extension point, userConverters(), which can be overridden to supply your own converters.
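(Side note, not part of the original answer: if you really wanted to keep your own jdbcCustomConversions() bean, Spring Boot can also be told to tolerate the duplicate definition; the userConverters() approach above is still the cleaner fix.)
// Sketch: programmatic equivalent of spring.main.allow-bean-definition-overriding=true.
public static void main(String[] args) {
    SpringApplication app = new SpringApplication(Main.class);
    app.setAllowBeanDefinitionOverriding(true); // lets your bean override the auto-configured one
    app.run(args);
}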
Update
As discussed in comments:
FileType.byDatabaseId is broken - it ignores its input param
since the column type in the db is BIGINT, your converters must convert from Long, not from Integer; this addresses read queries
for writes, there is an open bug: https://github.com/spring-projects/spring-data-jdbc/issues/629 - there is a hardcoded assumption that enums are converted to Strings, and only Enum -> String converters are checked
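To illustrate the first two points (this is a sketch of mine, not code from the original discussion), the converters would work on Long and byDatabaseId would actually use its parameter:
@ReadingConverter
public class DatabaseIdToFileTypeConverter implements Converter<Long, FileType> {
    @Override
    public FileType convert(Long databaseId) {
        return FileType.byDatabaseId(databaseId);
    }
}

@WritingConverter
public class FileTypeToDatabaseIdConverter implements Converter<FileType, Long> {
    @Override
    public Long convert(FileType source) {
        return (long) source.getDatabaseId();
    }
}

// In FileType: filter by the id instead of ignoring it.
public static FileType byDatabaseId(Long databaseId) {
    return Arrays.stream(values())
            .filter(type -> databaseId != null && type.getDatabaseId() == databaseId)
            .findFirst()
            .orElse(null);
}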
As we want to convert to Long, we need to amend BasicJdbcConverter by subclassing it and registering the subclass as a @Bean.
You need to override two methods
public int getSqlType(RelationalPersistentProperty property)
public Class<?> getColumnType(RelationalPersistentProperty property)
I hardcoded the Enum type and corresponding column types, but you may want to get more fancy with that.
@Bean
public AbstractJdbcConfiguration jdbcConfiguration() {
return new MySpringBootJdbcConfiguration();
}
@Configuration
static class MySpringBootJdbcConfiguration extends AbstractJdbcConfiguration {
@Override
protected List<?> userConverters() {
return List.of(new DatabaseIdToFileTypeConverter(), new FileTypeToDatabaseIdConverter());
}
@Bean
public JdbcConverter jdbcConverter(JdbcMappingContext mappingContext,
NamedParameterJdbcOperations operations,
@Lazy RelationResolver relationResolver,
JdbcCustomConversions conversions,
Dialect dialect) {
JdbcArrayColumns arrayColumns = dialect instanceof JdbcDialect ? ((JdbcDialect) dialect).getArraySupport()
: JdbcArrayColumns.DefaultSupport.INSTANCE;
DefaultJdbcTypeFactory jdbcTypeFactory = new DefaultJdbcTypeFactory(operations.getJdbcOperations(),
arrayColumns);
return new MyJdbcConverter(
mappingContext,
relationResolver,
conversions,
jdbcTypeFactory,
dialect.getIdentifierProcessing()
);
}
}
static class MyJdbcConverter extends BasicJdbcConverter {
MyJdbcConverter(
MappingContext<? extends RelationalPersistentEntity<?>, ? extends RelationalPersistentProperty> context,
RelationResolver relationResolver,
CustomConversions conversions,
JdbcTypeFactory typeFactory,
IdentifierProcessing identifierProcessing) {
super(context, relationResolver, conversions, typeFactory, identifierProcessing);
}
@Override
public int getSqlType(RelationalPersistentProperty property) {
if (FileType.class.equals(property.getActualType())) {
return Types.BIGINT;
} else {
return super.getSqlType(property);
}
}
@Override
public Class<?> getColumnType(RelationalPersistentProperty property) {
if (FileType.class.equals(property.getActualType())) {
return Long.class;
} else {
return super.getColumnType(property);
}
}
}
I'm attempting to create a class that will be used by JavaConfig as a source of bean definitions.
@Configuration
public class MyClass {
@Bean
@ConditionalOnProperty(name = "property")
public A function1() {
doSomething1(); // may return an exception
}
@Bean
@ConditionalOnMissingBean(name = "function1")
public A function2() {
doSomething2();
}
@Bean
public B function3(A a) {
doSomething3(a);
}
}
The third bean definition gives the error "could not autowire. There is more than one bean of A type." How do I tell Spring to try to autowire the first A, and if it is missing, to fall back to the second A, i.e. following the conditional process described? Hopefully that makes sense.
Thanks!
You are defining two beans of the same type, hence the autowiring issue. Try using @Primary on the bean you want to give priority to.
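A minimal sketch of that suggestion (method names follow the question, the bodies are placeholders, and whether @Primary is appropriate depends on whether both A beans can really be active at the same time):
@Bean
@Primary // preferred candidate wherever a single A is autowired
@ConditionalOnProperty(name = "property")
public A function1() {
    return new A(); // placeholder for doSomething1()
}

@Bean
@ConditionalOnMissingBean(name = "function1")
public A function2() {
    return new A(); // placeholder for doSomething2()
}

@Bean
public B function3(A a) {
    return new B(a); // placeholder for doSomething3(a)
}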
How about this approach?
@Configuration
public class MyClass {
@Bean
@ConditionalOnProperty(name = "property", havingValue="true")
public A function1() {
doSomething1(); // may return an exception
}
@Bean
@ConditionalOnProperty(name = "property", havingValue="false")
public A function2() {
doSomething2();
}
@Bean
public B function3(A a) {
doSomething3(a);
}
}
Maybe this config will be helpful:
@Configuration
public class MyClass {
@Bean
@ConditionalOnProperty(name = "property")
public A function1() {
try {
doSomething1(); // may return an exception
} catch(Exception e) {
return null;
}
}
@Bean
@ConditionalOnMissingBean(name = "function1")
public A function2() {
doSomething2();
}
@Bean
public B function3(@Autowired(required = false) @Qualifier("function1") A a1, @Qualifier("function2") A a2) {
if (a1 == null) {
doSomething3(a2);
} else {
doSomething3(a1);
}
}
}
The sample code you've provided is conceptually correct. Of course there are compilation errors, but the annotations are correct.
Here is a sample that compiles and works.
@Configuration
public class ConditionalConfiguration {
public static class A {
private final String name;
public A(String name) {
this.name = name;
}
}
public static class B {
private final A a;
public B(A a) {
this.a = a;
}
}
@Bean
@ConditionalOnProperty(name = "a1-property")
public A a1() {
return new A("a1");
}
@Bean
@ConditionalOnMissingBean(A.class)
public A a2() {
return new A("a2");
}
@Bean
public B b(A a) {
System.out.println("########## " + a.name);
return new B(a);
}
}
When the property a1-property is passed to the application
./gradlew clean bootRun -i -Da1-property
########## a1
the bean a1 is created.
When the property is missing
./gradlew clean bootRun -i
########## a2
The bean a2 is created.
Make sure to configure Gradle to pass system properties to the bootRun task by adding this to build.gradle:
bootRun {
systemProperties System.properties
}
To be decoupled from the a1 bean name, a condition on the bean type is used: @ConditionalOnMissingBean(A.class) instead of @ConditionalOnMissingBean(name = "a1").
If you expect an exception during bean creation it is another case.
When factory method annotated with #Bean throws an exception, Spring application crashes with BeanInstantiationException.
So, exceptions should be handled in the factory method:
@Bean
@ConditionalOnProperty(name = "a1-property")
public A a1() {
try {
//return new A("a1");
throw new RuntimeException("Sample exception");
} catch (Exception e) {
return fallbackA();
}
}
@Bean
@ConditionalOnMissingBean(A.class)
public A a2() {
return fallbackA();
}
private A fallbackA() {
return new A("a2");
}
I would like to implement the Factory pattern in my project. I have gone through online resources and learned that Spring's ServiceLocatorFactoryBean should be used instead of the plain Java factory pattern.
I have followed this link, but it is explained with XML-based configuration. Can anyone tell me how to do it using an annotation-based factory pattern?
Spring Java Configuration ref guide @Configuration
Interface Parser.java:
public interface Parser {
void parse(String str);
}
Implementations of the above interface.
JsonParser.java
public class JsonParser implements Parser {
@Override
public void parse(String str) {
System.out.println("JsonParser.parse::" + str);
}
}
XMLParser.java
public class XMLParser implements Parser{
@Override
public void parse(String str) {
System.out.println("XMLParser.parse :: " + str);
}
}
ParserFactory.java, the actual factory interface:
public interface ParserFactory {
public Parser getParser(ParserType parserType);
}
ParserType.java, an enum to specify parser types (avoids typos and is type-safe):
public enum ParserType {
JSON("jsonParser"), XML("xmlParser");
private final String value;
ParserType(String input) {
this.value = input;
}
public String getValue() {
return this.value;
}
@Override
public String toString() {
return this.value;
}
}
ParserService.java, where the business logic is implemented:
@Service
public class ParserService {
@Autowired
private ParserFactory parserFactory;
public void doParse(String parseString, ParserType parseType) {
Parser parser = parserFactory.getParser(parseType);
System.out.println("ParserService.doParse.." + parser);
parser.parse(parseString);
}
}
Finally AppConfig.java Spring java configuration class, where all of my beans registered as container managed beans.
@Configuration
@ComponentScan(basePackages = {"<Your Package Name>"})
public class AppConfig {
@Bean
public FactoryBean serviceLocatorFactoryBean() {
ServiceLocatorFactoryBean factoryBean = new ServiceLocatorFactoryBean();
factoryBean.setServiceLocatorInterface(ParserFactory.class);
return factoryBean;
}
@Bean(name = "jsonParser")
@Scope(scopeName = ConfigurableBeanFactory.SCOPE_PROTOTYPE)
public JsonParser jsonParser() {
return new JsonParser();
}
@Bean(name = "xmlParser")
@Scope(scopeName = ConfigurableBeanFactory.SCOPE_PROTOTYPE)
public XMLParser xmlParser() {
return new XMLParser();
}
}
Now autowire the ParserService bean in either a controller or a test class, and invoke the doParse(String, ParserType) method to test.
Here is my test.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = AppConfig.class)
public class ServiceLocatorFactoryExample {
@Autowired
private ParserService parserService;
@Test
public void testParserFactory() {
parserService.doParse("Srilekha", ParserType.JSON);
parserService.doParse("Srilekha", ParserType.XML);
}
}
Look at this complete example: Service Locator factory
It helped me understand how it works using Spring Boot.