Autowiring Spring beans during Jackson deserialization - java

I am using Jackson to convert a JSON configuration into an object that represents a Spring Batch job (a list of steps to execute). Each step is either a Tasklet or a chunk-oriented step (reader - processor - writer).
private ObjectMapper objectMapper;

public JobDefinition convert(String jobConfiguration) {
    try {
        return objectMapper.reader()
                .forType(JobDefinition.class)
                .readValue(jobConfiguration);
    } catch (IOException exception) {
        throw new RuntimeException("Error while building a JobDefinition!", exception);
    }
}
What I need to do is deserialize the JSON configuration, but during deserialization I also need to autowire dependencies from the Spring context, e.g.:
public class CustomProcessor implements ItemProcessor<Row, Row> {

    @JsonProperty("someField")
    private String attributeFromJson;

    private SomeDependency someDependencyFromSpringContext;

    @Override
    public Row process(Row row) throws Exception {
        // business logic
        return row;
    }
}
The String value is taken from the provided JSON configuration, but SomeDependency should come from the Spring context. It also needs to handle Spring scopes properly: for scope=prototype it should always give me a new instance of SomeDependency, while for a singleton it should always be the same object.
Is it possible to achieve that in Jackson in a simple way?
In the past I tried using Jackson's InjectableValues: I simply copied all of the dependencies from the ApplicationContext into the ObjectMapper and used @JacksonInject. However, the InjectableValues were fully initialized as soon as the app started, so they always gave me the same object references and didn't work properly with @JobScope, @StepScope, or simply @Scope(BeanDefinition.SCOPE_PROTOTYPE).
I've noticed that there is also something called SpringHandlerInstantiator that can be passed to the ObjectMapper and gives direct access to the ApplicationContext, but somehow it doesn't work properly: dependencies are not autowired at all.
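For reference, the wiring I mean looks roughly like this (a sketch; bean names are illustrative):

@Configuration
public class JacksonConfig {

    // Sketch of the SpringHandlerInstantiator wiring. As far as I can tell, the
    // instantiator is only consulted when Jackson needs to create handler classes
    // (custom deserializers, converters, ...), not when it builds ordinary
    // deserialized POJOs such as CustomProcessor.
    @Bean
    public ObjectMapper objectMapper(AutowireCapableBeanFactory beanFactory) {
        ObjectMapper mapper = new ObjectMapper();
        mapper.setHandlerInstantiator(new SpringHandlerInstantiator(beanFactory));
        return mapper;
    }
}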
Is there anything else I can do to solve that?
Thanks in advance for all of the tips!

Related

Create different instances of a class chain based on a field in the input request with Spring Boot

Every request that my Java application receives passes through 4 layers:
Handler --> Validator --> Processor --> DAO
Handler is the API Resource. (Handler.java)
Validator validates the input. (Validator.java)
Processor performs some business logic. (Processor.java)
DAO is the DB communication layer. (DAO.java)
The input request has a field called request_type. Based on this request_type, I want to create different objects for all the layer classes, i.e.:
request_type_A should pass through Handler1, Validator1, Processor1, DAO1 (instances)
request_type_B should pass through Handler2, Validator2, Processor2, DAO2 (instances)
request_type_C should pass through Handler3, Validator3, Processor3, DAO3 (instances).. and so on
To clarify, the requirement is to create a different chain of objects for a given request type, so that two requests with different request_type values have entirely different object chain instances. Basically I want to shard my application's objects based on a given request_type.
I am using Spring Boot. Is there a way that Spring's ApplicationContext can provide different object chains for different request types, or should I manage these instances on my own?
Is there a way I can create a library which would give me a new object instance for every layer, based on the request_type, using Spring's ApplicationContext?
Or should I create multiple ApplicationContexts?
Based on the comments & question, I understand that you would be receiving 2 or 3 request_types.
So the main idea I have used here is constructor injection of chained objects, with different configuration beans that are used based on your request type.
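A minimal sketch of that idea (illustrative names only; the linked repository below contains the real implementation):

@Configuration
public class ChainConfiguration {

    // One fully wired chain per request type; each layer receives the next
    // layer through constructor injection.
    @Bean("requestTypeAChain")
    public MyHandler requestTypeAChain() {
        return new MyHandler(new MyValidator(new MyProcessor(new MyDao())));
    }

    @Bean("requestTypeBChain")
    public MyHandler requestTypeBChain() {
        return new MyHandler(new MyValidator(new MyProcessor(new MyDao())));
    }
}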
Feel free to check out this simple demonstration code on GitHub where I have proposed my idea: https://github.com/patilashish312/SpringObjectChaining
Based on this code, I can confirm that:
The application does not create a chain of objects per request, but re-uses it if the same type of request is received by the application.
Objects assigned to one request type are not used by another request type.
The console output below is the proof:
displaying request MyRequest(id=1, name=Ashish, requestType=requestTypeA)
Printing handler bean com.spr.boot3.ConditionalVerification.Handler.MyHandler@31182e0a
Printing validator bean com.spr.boot3.ConditionalVerification.Validator.MyValidator@484e3fe7
Printing processor bean com.spr.boot3.ConditionalVerification.Processor.MyProcessor@70f9b9c7
Printing dao bean com.spr.boot3.ConditionalVerification.Dao.MyDao@2a8175d9
inside dao, doing DAO processing
displaying request MyRequest(id=1, name=Ashish, requestType=requestTypeA)
Printing handler bean com.spr.boot3.ConditionalVerification.Handler.MyHandler@31182e0a
Printing validator bean com.spr.boot3.ConditionalVerification.Validator.MyValidator@484e3fe7
Printing processor bean com.spr.boot3.ConditionalVerification.Processor.MyProcessor@70f9b9c7
Printing dao bean com.spr.boot3.ConditionalVerification.Dao.MyDao@2a8175d9
inside dao, doing DAO processing
displaying request MyRequest(id=1, name=Ashish, requestType=requestTypeB)
Printing handler bean com.spr.boot3.ConditionalVerification.Handler.MyHandler@55ea9008
Printing validator bean com.spr.boot3.ConditionalVerification.Validator.MyValidator@5b2d74c5
Printing processor bean com.spr.boot3.ConditionalVerification.Processor.MyProcessor@5f12fb78
Printing dao bean com.spr.boot3.ConditionalVerification.Dao.MyDao@1a107efe
inside dao, doing DAO processing
displaying request MyRequest(id=1, name=Ashish, requestType=requestTypeB)
Printing handler bean com.spr.boot3.ConditionalVerification.Handler.MyHandler@55ea9008
Printing validator bean com.spr.boot3.ConditionalVerification.Validator.MyValidator@5b2d74c5
Printing processor bean com.spr.boot3.ConditionalVerification.Processor.MyProcessor@5f12fb78
Printing dao bean com.spr.boot3.ConditionalVerification.Dao.MyDao@1a107efe
inside dao, doing DAO processing
I had a similar requirement in my solution. What I built was a general-purpose command handler, and I used a decorator pattern of annotations on each command to specify which handlers, validators, processors, and DAOs to use.
In my implementation, I have API handlers which convert requests into specific commands. The Command class was a subclass of an abstract command class with a generic type parameter.
API -> all API variables are copied into a wrapper data model. (This could encapsulate the entry point of your handler concept or request_type concept.)
Command extends AbstractCommand<T> where T is the wrapper data model.
Then I would have an annotation for each of your concepts: Handler, Validator, Processor, Dao.
The general-purpose command handler would have a method that "process"es commands by reading their annotations and then lining up the annotation helper that corresponds to each annotation. This could use the application context to load the bean of the class referenced in the annotation value. By providing a sequencing property for each of the annotation helpers, you could loop over the sorted helpers to perform actions in the right order.
In my implementation this was further augmented by whether or not the command included asynchronous behavior, so that all the synchronous behavior would occur first and the asynchronous behavior would be wrapped in a background thread.
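A minimal sketch of the annotation-driven dispatch described above (all names here are hypothetical illustrations, not the original implementation):

// Hypothetical marker annotation naming the validator bean a command needs.
@Retention(RetentionPolicy.RUNTIME)
@interface Validator {
    Class<?> value();
}

// The general-purpose handler reads the command's annotations and pulls the
// referenced beans from the ApplicationContext in the required order.
@Component
class GenericCommandHandler {

    private final ApplicationContext context;

    GenericCommandHandler(ApplicationContext context) {
        this.context = context;
    }

    void process(Object command) {
        Validator validator = command.getClass().getAnnotation(Validator.class);
        if (validator != null) {
            Object validatorBean = context.getBean(validator.value());
            // ... invoke the validator bean against the command's wrapper data model
        }
        // repeat for @Processor, @Dao, ... sorted by a sequencing property
    }
}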
The beans that are injected into the REST controller don't vary with the HTTP request content. What you can do is factor your request_type out as a path variable and create the desired chains in separate HTTP mappings, like so:
@PostMapping(value = "/request_type_A")
public Object handle1(args...) {
    // Validator1 --> Processor1 --> DAO1
}

@PostMapping(value = "/request_type_B")
public Object handle2(args...) {
    // Validator2 --> Processor2 --> DAO2
}
If this is not practical for whatever reason and you must specify the type dynamically in the @RequestBody, @PathVariable or @RequestParam, then I would suggest implementing a resolver bean similar to this:
@Component
public class Resolver {

    private final RequestTypeAValidator requestTypeAValidator;
    private final RequestTypeBValidator requestTypeBValidator;
    ...

    public IValidator getValidator(String requestType) {
        switch (requestType) {
            case "request_type_A":
                return requestTypeAValidator;
            case "request_type_B":
                return requestTypeBValidator;
            default:
                throw new IllegalArgumentException("cannot find validator");
        }
    }
}
The drawback of this approach is that it does not comply with the open-closed principle, in the sense that for any new request type you will need to edit the resolver. That can be fixed by using a HashMap in the resolver and letting the beans register themselves in that map in a @PostConstruct method:
@Component
public class Resolver {

    private final Map<String, IValidator> validators = new HashMap<>();

    public IValidator getValidator(String requestType) {
        IValidator result = validators.get(requestType);
        if (Objects.isNull(result)) {
            throw new IllegalArgumentException("cannot find validator");
        }
        return result;
    }

    public void register(String type, IValidator validator) {
        validators.put(type, validator);
    }
}

@Component
public class ValidatorA implements IValidator {

    private final Resolver resolver;

    @PostConstruct
    private void register() {
        resolver.register("request_type_A", this);
    }
    ...
}
However, in this approach there is a direct dependency from all implementations back to the Resolver.
Lastly, you could inject dynamically like so:
@Component
public class Resolver {

    private final ApplicationContext applicationContext;
    ...

    public IValidator getValidator(String requestType) {
        switch (requestType) {
            case "request_type_A":
                try {
                    return applicationContext.getBean(ValidatorA.class);
                } catch (NoSuchBeanDefinitionException e) {
                    // handle exception
                }
            case "request_type_B":
                try {
                    return applicationContext.getBean(ValidatorB.class);
                } catch (NoSuchBeanDefinitionException e) {
                    // handle exception
                }
            default:
                throw new IllegalArgumentException("cannot find validator");
        }
    }
}
Note: avoid taking the client-specified string as the class name or bean name directly in the applicationContext.getBean() call. That is not safe and may present a serious security vulnerability; use a switch or dictionary to resolve the correct bean name or bean type.
If you want to inject multiple instances of the same class, create a configuration class and declare the beans like this:
@Configuration
public class BeanConfiguration {

    @Bean
    public IValidator aValidator() {
        return new ValidatorImpl(...);
    }

    @Bean
    public IValidator bValidator() {
        return new ValidatorImpl(...);
    }
}
And then to inject one, you can either use the dynamic resolution by name as above, or use the @Qualifier annotation:
@Service
public class MyService {

    private final ApplicationContext applicationContext;
    private final IValidator bValidator;

    public MyService(ApplicationContext applicationContext, @Qualifier("bValidator") IValidator bValidator) {
        this.applicationContext = applicationContext;
        this.bValidator = bValidator;
    }

    public void getDynamically() {
        IValidator aValidator = (IValidator) applicationContext.getBean("aValidator");
    }
}

How to use an explicitly specified marshaller in Spring Boot

I'm trying to create a REST service that is able to produce XML output (I have a custom class that is wrapped inside a HATEOAS object). The mapping is like this:
@GetMapping("/customclass")
Resource<CustomClass> custom() {
    return new Resource<CustomClass>(new CustomClass());
}
When I call the endpoint, I get the following error:
Resolved [org.springframework.http.converter.HttpMessageNotWritableException: Could not marshal [Resource { content: CustomClass(a=10, string=abc), links: [] }]: null; nested exception is javax.xml.bind.MarshalException
- with linked exception:
[com.sun.istack.internal.SAXException2: class test.CustomClass nor any of its super class is known to this context.
javax.xml.bind.JAXBException: class test.CustomClass nor any of its super class is known to this context.]]
I'm pretty sure that there is nothing wrong with my CustomClass. If I use the following mapping instead:
@GetMapping("/customclass")
CustomClass custom() {
    return (new CustomClass());
}
then it works fine.
It also works fine if I marshal things manually (by setting things up inside a main method and then running it). It's also fine then if I wrap the instance of CustomClass inside a Resource instance.
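For reference, the manual setup looks roughly like this (a sketch; it assumes CustomClass is a JAXB-annotated bean):

public static void main(String[] args) throws JAXBException {
    // Build the JAXBContext with both classes explicitly, then marshal the wrapped object.
    JAXBContext context = JAXBContext.newInstance(CustomClass.class, Resource.class);
    Marshaller marshaller = context.createMarshaller();
    marshaller.setProperty(Marshaller.JAXB_FORMATTED_OUTPUT, true);
    marshaller.marshal(new Resource<CustomClass>(new CustomClass()), System.out);
}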
As far as I understand, the issue is that the marshaller in SpringApplication is using a JAXBContext that only knows about the HATEOAS Resource, and I need to somehow make it aware of CustomClass.
I tried to use something like this (from https://stackoverflow.com/a/40398632):
@Configuration
public class ResponseResolver {

    @Bean
    public Marshaller marshaller() {
        try {
            System.out.println("getting marshaller");
            JAXBContext context = JAXBContext.newInstance(CustomClass.class, Resource.class);
            return context.createMarshaller();
        } catch (JAXBException e) {
            throw new RuntimeException(e);
        }
    }
}
but that didn't work (there was a lot of guessing on my part here, since I don't know that much about the inner workings of Spring Boot).
A promising reply was also in https://stackoverflow.com/a/14073899, but ContextResolver wasn't on my project's classpath.
I also considered wrapping Resource inside another class and then using the XmlSeeAlso annotation, but that would mess up my XML and would be a somewhat ugly hack.
So, is it possible to define a custom JAXBContext that SpringApplication would be able to pick up?
From the Spring Boot documentation on Spring MVC message converters:
Spring MVC uses the HttpMessageConverter interface to convert HTTP requests and responses. Sensible defaults are included out of the box. For example, objects can be automatically converted to JSON (by using the Jackson library) or XML (by using the Jackson XML extension, if available, or by using JAXB if the Jackson XML extension is not available). By default, Jaxb2RootElementHttpMessageConverter converts Java objects to/from XML (it is added only if JAXB2 is present on the classpath).
Custom Converters Configuration
@Configuration
public class WebConfig implements WebMvcConfigurer {

    @Override
    public void configureMessageConverters(List<HttpMessageConverter<?>> converters) {
        converters.add(createXmlHttpMessageConverter());
        converters.add(new MappingJackson2HttpMessageConverter());
    }

    private HttpMessageConverter<Object> createXmlHttpMessageConverter() {
        MarshallingHttpMessageConverter xmlConverter = new MarshallingHttpMessageConverter();
        XStreamMarshaller xstreamMarshaller = new XStreamMarshaller();
        xmlConverter.setMarshaller(xstreamMarshaller);
        xmlConverter.setUnmarshaller(xstreamMarshaller);
        return xmlConverter;
    }
}
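If you want to stay with JAXB rather than XStream (which is what the question's custom JAXBContext is about), a possible variant, not taken from the original answer, is to back the MarshallingHttpMessageConverter with a Jaxb2Marshaller that is told explicitly which classes to bind:

private HttpMessageConverter<Object> createJaxbHttpMessageConverter() {
    // The marshaller builds a JAXBContext that knows about both classes.
    Jaxb2Marshaller jaxbMarshaller = new Jaxb2Marshaller();
    jaxbMarshaller.setClassesToBeBound(CustomClass.class, Resource.class);
    MarshallingHttpMessageConverter xmlConverter = new MarshallingHttpMessageConverter();
    xmlConverter.setMarshaller(jaxbMarshaller);
    xmlConverter.setUnmarshaller(jaxbMarshaller);
    return xmlConverter;
}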

How do I convert an object using ObjectMapper and inject dependencies into it

I am trying to convert an Object to a concrete class using Jackson's ObjectMapper and to inject dependencies after I convert it. Here is an example:
public class SimpleClass {

    private String parameter;

    @JsonIgnore
    private SomeService service;

    /** getters and setters **/

    public void doSomethingFromService() {
        // call some methods from the service
    }
}
After an attempt to convert the object (value is a map containing the parameters):
ObjectMapper om = new ObjectMapper();
om.convertValue(value, SimpleClass.class).doSomethingFromService();
-> results in a NullPointerException...
How to inject the service?
Can I get the service from the context and inject it in the default constructor?
If you were doing the initialization of the SimpleClass object yourself, then you could make it work by autowiring that particular instance with AutowireCapableBeanFactory; in that case the @Autowired service would be resolved, since Spring would then also manage that particular SimpleClass instance.
That said, it is not the way to go. Data objects should be isolated from the business logic.
Back to your case: you can't autowire even in the way I mentioned, because Jackson is the provider of the instance, and Jackson requires the presence of an empty constructor.
Since you want some kind of auto-wiring in your SimpleClass bean, you need to annotate its property SomeService service with @Autowired.
public class SimpleClass {

    private String parameter;

    @JsonIgnore
    @Autowired
    private SomeService service;

    // getter and setter for parameter (omitted here for brevity)

    public void doSomethingFromService() {
        // call some methods from the service
    }
}
As others already said, ObjectMapper does not do any dependency injection. But you can combine the @Autowired-ignoring ObjectMapper with some manually triggered auto-wiring. For that you need an AutowireCapableBeanFactory, which you get through Spring's normal dependency injection with @Autowired. Then you use its autowireBean(Object) method to inject the bean's @Autowired properties.
@Autowired
private AutowireCapableBeanFactory autowireCapableBeanFactory;

public void doSomething(Map<String, Object> value) throws Exception {
    ObjectMapper om = new ObjectMapper();
    SimpleClass bean = om.convertValue(value, SimpleClass.class);
    autowireCapableBeanFactory.autowireBean(bean);
    bean.doSomethingFromService();
}
It's not possible. Only objects that have been instantiated by Spring can use @Autowired.
It's simple: when you annotate a class with @Service, Spring will try to resolve all of its dependencies using introspection.
With Jackson (or if you instantiate an object yourself) you are totally outside the scope of Spring.
And I should add that what you are trying to do (even if it were possible) is not good practice. You shouldn't mix your data objects with business processing.
Injection of services is only possible with a framework like Spring, as said previously.
If you are using Spring, annotate your service class with @Service and then, as opposed to using @Autowired on a field, declare your service private final and inject it through a constructor (IMO this works better).
If you aren't using Spring, you'll need to new up an instance of the service and then call that instance - unless of course it's static...

Spring Boot Camel - Autowiring issues in Camel components

I am using Spring Boot 1.5.7 and Apache Camel 2.19.3, with the Spring Boot auto-configuration provided by spring-boot-starter-camel.
It is a pretty basic Spring Boot and Camel setup, initialized as in their tutorial, so we have a RouteBuilder component that does exactly that.
@Component
public class CIFRoutes extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        // build routes
    }
}
We have a Configuration class that defines some beans we need in our application:
@Configuration
public class MyConfiguration {

    @Bean
    public Map<String, Object> map() {
        return new HashMap<>();
    }
}
Finally, we have a custom InflightRepository implementation that should be picked up by the auto-configuration and added to the CamelContext. That basically works, but for some reason the component doesn't get initialized properly; that is, its dependencies are not injected, even though the bean itself is instantiated and injected into my application.
@Component
public class MyCustomInflightRepository extends DefaultInflightRepository {

    @Autowired
    private Map<String, Object> map;

    @Override
    public void add(Exchange exchange) {
        super.addExchange(exchange);
        // ....
    }
}
The problem now is that map remains null. I also tried adding a @PostConstruct initializer method, but it doesn't get called.
As far as I was able to reconstruct, it seems to be connected to premature bean creation in CamelAutoConfiguration, where the CamelContext bean gets instantiated (done in the private method afterPropertiesSet):
InflightRepository inflightRepository = getSingleBeanOfType(applicationContext, InflightRepository.class);
if (inflightRepository != null) {
    LOG.info("Using custom InflightRepository: {}", inflightRepository);
    camelContext.setInflightRepository(inflightRepository);
}
If MyCustomInflightRepository doesn't implement InflightRepository, the bean is initialized correctly, but of course not recognized by Camel. When auto-configuration is disabled, the bean's dependencies are injected.
So, either I'm attempting the impossible by Spring standards, or there's something fishy with the Camel component for Spring.
I'm a bit quick on resolving this (I wanted to post this two days ago already^^), but a colleague figured out what the problem could be.
When using CamelAutoConfiguration, the InflightRepository bean (or practically everything for which Camel tries to get a matching bean here) is looked up before the property resolvers are fully initialized, which leads to the bean being initialized (and cached in the context) before any auto-wired properties can be resolved.
I'm not a Spring expert, but this behavior is a bit problematic in my opinion, because uninitialized beans are pulled into the CamelContext when you rely on Spring DI for your custom components.
To be sure, I'll raise this with the maintainers...
By the way, my simple solution was to set the in-flight repository manually in the context configuration (as suggested):
@Bean
public CamelContextConfiguration camelConfig() {
    return new CamelContextConfiguration() {

        @Override
        public void beforeApplicationStart(CamelContext context) {
            context.setInflightRepository(new MyCustomInflightRepository(/* Dependencies go here */));
        }

        @Override
        public void afterApplicationStart(CamelContext camelContext) {
        }
    };
}
There also seems to be an issue when you use camel-http-starter in your project, which isn't recommended; they claim it is deprecated.
So, either don't do DI (regardless of whether it's property or constructor injection) for your Camel-managed beans, or skip that starter.
The problem is that a Map<String, Object> is too vague for Spring to be able to understand what you want; I think the default behavior is that it will give you all beans keyed by name.
Instead, be more specific, or possibly provide the necessary parameters as constructor arguments and configure them explicitly in an @Bean method (it's a good idea to always use constructor injection anyway).
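A small sketch of that suggestion (the constructor signature of MyCustomInflightRepository and the dependency names are assumed for illustration):

@Configuration
public class InflightRepositoryConfiguration {

    // Wire the concrete collaborators explicitly instead of autowiring a generic Map.
    @Bean
    public MyCustomInflightRepository myCustomInflightRepository(SomeDependency someDependency) {
        return new MyCustomInflightRepository(someDependency);
    }
}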

How to register custom converters in spring boot?

I am writing an application using spring-boot-starter-jdbc (v1.3.0).
The problem I have met: an instance of BeanPropertyRowMapper fails because it cannot convert from java.sql.Timestamp to java.time.LocalDateTime.
To address this problem, I implemented org.springframework.core.convert.converter.Converter for these types:
public class TimeStampToLocalDateTimeConverter implements Converter<Timestamp, LocalDateTime> {

    @Override
    public LocalDateTime convert(Timestamp s) {
        return s.toLocalDateTime();
    }
}
My question is: how do I make TimeStampToLocalDateTimeConverter available to BeanPropertyRowMapper?
More generally, how do I register my converters in order to make them available system-wide?
The following code leads to a NullPointerException at the initialization stage:
private Set<Converter> getConverters() {
    Set<Converter> converters = new HashSet<Converter>();
    converters.add(new TimeStampToLocalDateTimeConverter());
    converters.add(new LocalDateTimeToTimestampConverter());
    return converters;
}

@Bean(name = "conversionService")
public ConversionService getConversionService() {
    ConversionServiceFactoryBean bean = new ConversionServiceFactoryBean();
    bean.setConverters(getConverters());
    bean.afterPropertiesSet();
    return bean.getObject();
}
Thank you.
All custom converters have to be registered with the FormatterRegistry. Try creating a new configuration class and registering the converter by implementing WebMvcConfigurer:
@Configuration
public class WebConfig implements WebMvcConfigurer {

    @Override
    public void addFormatters(FormatterRegistry registry) {
        registry.addConverter(new TimeStampToLocalDateTimeConverter());
    }
}
Hope this works.
I'll copy my answer from https://stackoverflow.com/a/72781591/140707 since I think the two questions are similar (so the answer applies to both).
The existing answers didn't work for me:
Customizing via WebMvcConfigurerAdapter.addFormatters (or simply annotating the converter with @Component) only works in the WebMvc context, and I want my custom converter to be available everywhere, including in @Value injections on any bean.
Defining a ConversionService bean (via a ConversionServiceFactoryBean @Bean or @Component) causes Spring Boot to replace the default ApplicationConversionService on the SpringApplication bean factory with the custom bean you've defined, which will probably be based on DefaultConversionService (in AbstractApplicationContext.finishBeanFactoryInitialization). The problem is that Spring Boot adds some handy converters, such as StringToDurationConverter, to the standard set in DefaultConversionService, so by replacing it you lose those conversions. This may not be an issue for you if you don't use them, but it means that solution won't work for everyone.
I created the following @Configuration class, which did the trick for me. It basically adds custom converters to the ConversionService instance used by the Environment (which is then passed on to the BeanFactory). This maintains as much backwards compatibility as possible while still adding your custom converter to the conversion services in use.
@Configuration
public class ConversionServiceConfiguration {

    @Autowired
    private ConfigurableEnvironment environment;

    @PostConstruct
    public void addCustomConverters() {
        ConfigurableConversionService conversionService = environment.getConversionService();
        conversionService.addConverter(new MyCustomConverter());
    }
}
Obviously you can autowire a list of custom converters into this configuration class and loop over them to add them to the conversion service, instead of hard-coding them as above, if you want the process to be more automatic.
To make sure this configuration class runs before any beans are instantiated that might require the converter to have been added to the ConversionService, add it as a primary source in your Spring application's run() call:
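For example, a sketch of that more automatic variant (it assumes your converters are themselves Spring beans, e.g. annotated with @Component):

@Configuration
public class ConversionServiceConfiguration {

    @Autowired
    private ConfigurableEnvironment environment;

    // All Converter beans in the context get collected here.
    @Autowired
    private List<Converter<?, ?>> customConverters;

    @PostConstruct
    public void addCustomConverters() {
        ConfigurableConversionService conversionService = environment.getConversionService();
        customConverters.forEach(conversionService::addConverter);
    }
}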
@SpringBootApplication
public class MySpringBootApplication {

    public static void main(String[] args) {
        SpringApplication.run(new Class<?>[] { MySpringBootApplication.class, ConversionServiceConfiguration.class }, args);
    }
}
If you don't do this, it might or might not work, depending on the order in which your classes end up in the Spring Boot JAR, which determines the order in which they are scanned. (I found this out the hard way: it worked when compiling locally with an Oracle JDK, but not on our CI server, which was using an Azul Zulu JDK.)
Note that for this to work in @WebMvcTests, I also had to combine this configuration class with my Spring Boot application class in a @ContextConfiguration:
@WebMvcTest(controllers = MyController.class)
@ContextConfiguration(classes = { MySpringBootApplication.class, ConversionServiceConfiguration.class })
@TestPropertySource(properties = { /* ... properties to inject into beans, possibly using your custom converter ... */ })
class MyControllerTest {
    // ...
}
I suggest using @Autowired and the related dependency injection mechanism of Spring so that a single ConversionService instance is used throughout your application. The ConversionService is instantiated within a configuration class.
All converters that should be available application-wide receive an annotation (e.g. @AutoRegistered). On application start, a @Component FormatterRegistrar (the type name itself is a bit misleading; yes, it is "...Registrar", as it does the registering, and it is a @Component as it is fully Spring-managed and requires dependency injection) receives the @AutoRegistered list of all annotated converters.
See this thread for the concrete implementation details. We use this mechanism within our project and it works like a charm.
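A minimal sketch of that mechanism (names are illustrative; see the linked thread for the concrete implementation):

// The custom annotation doubles as a Spring qualifier so that only annotated
// converters are collected at the injection point below.
@Qualifier
@Target({ElementType.TYPE, ElementType.PARAMETER})
@Retention(RetentionPolicy.RUNTIME)
public @interface AutoRegistered {
}

@Component
@AutoRegistered
public class TimeStampToLocalDateTimeConverter implements Converter<Timestamp, LocalDateTime> {

    @Override
    public LocalDateTime convert(Timestamp source) {
        return source.toLocalDateTime();
    }
}

// The registrar receives the list of all @AutoRegistered converters and adds them
// to the FormatterRegistry backing the application's single ConversionService.
@Component
public class AutoRegisteredConverterRegistrar implements FormatterRegistrar {

    private final List<Converter<?, ?>> converters;

    public AutoRegisteredConverterRegistrar(@AutoRegistered List<Converter<?, ?>> converters) {
        this.converters = converters;
    }

    @Override
    public void registerFormatters(FormatterRegistry registry) {
        converters.forEach(registry::addConverter);
    }
}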
org.springframework.web.servlet.config.annotation.WebMvcConfigurer, or any of its implementations, is a one-stop place for this kind of customization in a Spring Boot project. It provides various methods, including one for your Converter requirement.
Just create a new Converter by implementing org.springframework.core.convert.converter.Converter<S, T>. Then register it with Spring by overriding org.springframework.web.servlet.config.annotation.WebMvcConfigurer.addFormatters(FormatterRegistry) in your configuration class.
Note that there are other types of converters as well, which basically start from ConditionalConverter.
Try adding:
@Converter(autoApply = true)
It needs to be placed on the converter class. This worked for me in the case of a converter needed for LocalDate when interacting with the DB.
@Converter(autoApply = true)
public class LocalDateAttributeConverter implements AttributeConverter<LocalDate, Date> {

    @Override
    public Date convertToDatabaseColumn(LocalDate locDate) {
        return (locDate == null ? null : Date.valueOf(locDate));
    }

    @Override
    public LocalDate convertToEntityAttribute(Date sqlDate) {
        return (sqlDate == null ? null : sqlDate.toLocalDate());
    }
}
This is then applied automatically when interacting with the DB.
