I would like to understand how the code below works.
@Bean
public StateHandlerDef handler() {
    return () -> StateOne.class;
}

@Named
@Scope("prototype")
public class StateOne extends AbstractStateActorActor<StatObject> {

    @Inject
    public StateOne(final Props prop, final StatRegistry statRegistry) {
        super("test", transformationConfig, GenericStateObject.class, statRegistry);
    }
}

@FunctionalInterface
public interface StateHandlerDef {
    Class<? extends AbstractUntypedActor> getHandlerClass();
}
This is working code. I would like to understand how the bean creation works here. The code below creates a bean:
@Bean
public StateHandlerDef handler() {
    return () -> StateOne.class;
}
The StateOne class has a constructor, but the bean is created without passing any constructor arguments. Also, the return type is a functional interface that the actual state class does not implement, and I am not sure how that works. This is based on the Akka actor model.
AbstractStateActorActor extends AbstractUntypedActor
Here, I would like to set the bean name programmatically instead of setting it through the annotation:
@Bean("test")
If I try a BeanPostProcessor to set the bean name programmatically, it throws an error saying the instance cannot be created using new and must be created with actorOf:
Caused by: akka.actor.ActorInitializationException: You cannot create an instance of [com.test.Test] explicitly using the constructor (new). You have to use one of the 'actorOf' factory methods to create a new actor. See the documentation.
at akka.actor.ActorInitializationException$.apply(Actor.scala:181) ~[akka-actor_2.11-2.4.19.jar:na]
Any help on this?
To understand this, think of it this way. The library you are extending (in this case Akka) needs to know the class that is going to handle a state. To do that, it gets an instance (bean) of type StateHandlerDef. This instance is created by the lambda expression in this code:
@Bean
public StateHandlerDef handler() {
    return () -> StateOne.class;
}
which is equivalent to something like:
@Bean
public StateHandlerDef handler() {
    return new StateHandlerDefImpl();
}
The library will use this to get StateOne.class, for which it will look up a bean and obtain it from the dependency injection framework. That bean is defined here:
@Named
@Scope("prototype")
public class StateOne extends AbstractStateActorActor<StatObject> {

    @Inject
    public StateOne(final Props prop, final StatRegistry statRegistry) {
        super("test", transformationConfig, GenericStateObject.class, statRegistry);
    }
}
The DI framework will create a bean from this class by injecting the dependencies it needs in its constructor.
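How the framework then turns that class into an actor is not shown in the question, but a common wiring is an IndirectActorProducer that asks Spring for the prototype bean, so the actor is still created through actorOf rather than new (which is exactly what the ActorInitializationException above complains about). A minimal sketch, under the assumption that something like the following producer exists (SpringActorProducer and the surrounding glue are not from the original code):

import akka.actor.Actor;
import akka.actor.IndirectActorProducer;
import org.springframework.context.ApplicationContext;

// Hypothetical producer: resolves the class returned by StateHandlerDef.getHandlerClass()
// to a Spring-managed actor instance.
public class SpringActorProducer implements IndirectActorProducer {

    private final ApplicationContext applicationContext;
    private final Class<? extends Actor> actorClass;

    public SpringActorProducer(ApplicationContext applicationContext, Class<? extends Actor> actorClass) {
        this.applicationContext = applicationContext;
        this.actorClass = actorClass;
    }

    @Override
    public Actor produce() {
        // the prototype-scoped StateOne bean is created here, with Props and
        // StatRegistry injected by Spring into its constructor
        return applicationContext.getBean(actorClass);
    }

    @Override
    public Class<? extends Actor> actorClass() {
        return actorClass;
    }
}

The actor would then be created with something like system.actorOf(Props.create(SpringActorProducer.class, applicationContext, handler.getHandlerClass()), "test") instead of new StateOne(...).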
@FunctionalInterface marks a special type of interface that is restricted to exactly one abstract method (SAM, Single Abstract Method). The example below has one such method, which returns some class that extends Object.
@FunctionalInterface
interface ClassHandleDef {
    Class<? extends Object> getHandlerClass();
}
Now we create an anonymous class implementing the interface ClassHandleDef and provide the body of the getHandlerClass method:
new ClassHandleDef() {
    @Override
    public Class<? extends Object> getHandlerClass() {
        return String.class;
    }
};
Now we remove the boilerplate that is not required. Following lambda expression syntax, we keep only the arguments (if there are any), the lambda operator, and the method body:
() -> String.class;
If the method body is a single expression, there is no need to write the return statement explicitly.
If there is a single argument, the parentheses around it can be omitted, for example:
a -> a*2;
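A small self-contained example tying this together (LambdaDemo and Doubler are only illustrative names; ClassHandleDef is the interface defined above):

@FunctionalInterface
interface Doubler {
    int apply(int a);
}

public class LambdaDemo {
    public static void main(String[] args) {
        // both lambdas provide the single abstract method of their target interface
        ClassHandleDef handleDef = () -> String.class;
        Doubler doubler = a -> a * 2;

        System.out.println(handleDef.getHandlerClass()); // class java.lang.String
        System.out.println(doubler.apply(21));           // 42
    }
}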
I hope you now understand how lambda expressions work. Thanks for taking the time to read this post.
The functional interface StateHandlerDef is a way to represent the getter getHandlerClass(), declared in the interface, with a lambda. With the declaration below:
@Bean
public StateHandlerDef handler() {
    return () -> StateOne.class; // a supplier: no input, returns a value (getter)
}
We are essentially implementing the StateHandlerDef interface by defining its getHandlerClass() method. That is why the return value of the lambda matches the getter's return type: StateOne.class is of type Class<? extends AbstractUntypedActor>. So, in a way, the bean we created is like the following:
public interface StateHandlerDef {
    Class<? extends AbstractUntypedActor> getHandlerClass();
}

// explicit way of writing the lambda "() -> StateOne.class"
public class StateHandlerDefImpl implements StateHandlerDef {
    @Override
    public Class<? extends AbstractUntypedActor> getHandlerClass() {
        return StateOne.class;
    }
}

@Bean
public StateHandlerDef handler() {
    return new StateHandlerDefImpl(); // then we use the getter through this bean
}
With @FunctionalInterface we can skip the explicit implementation of the interface given above and simply use the interface itself with the passed lambda (which acts as a supplier). Now you can just do this:
@Autowired
private StateHandlerDef handler;

public void someLogic() {
    ...
    handler.getHandlerClass(); // will trigger the lambda, returning StateOne.class
    ...
}
You can change the name of the bean just by changing the name of its creation method: a @Bean method named handler() generates a bean named handler.
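If the name really has to be chosen at runtime rather than in the annotation, programmatic registration is another option. A minimal sketch, assuming a Spring 5+ GenericApplicationContext is available (older versions would need a BeanDefinitionRegistry and a hand-built BeanDefinition instead):

import org.springframework.context.support.GenericApplicationContext;

public class HandlerRegistrar {

    // beanName is a plain runtime value here, e.g. "test"
    public void registerHandler(GenericApplicationContext context, String beanName) {
        context.registerBean(beanName, StateHandlerDef.class, () -> {
            StateHandlerDef def = () -> StateOne.class;
            return def;
        });
    }
}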
I tried to simplify; I hope it was understandable, otherwise please check this.
Related
In my Java project I use hexagonal architecture.
I have a "use case" interface called RapprochementUseCase, which is implemented by a service called RapprochementService.
In the ItemProcessor of my Spring Batch step I need to call my RapprochementUseCase interface, so I added the interface to my processor's constructor thanks to the @RequiredArgsConstructor annotation. But I get an error when I try to pass the interface as a parameter of my processor.
I don't see how to do this in the documentation. Do you have any ideas?
My processor declaration:
@Bean
public ItemProcessor<PlaqueSousSurveillanceEntity, PlaqueLueEntity> rapprochementProcessor() {
    return new RapprochementProcessor(); // <-- Error here
}
RapprochementProcessor:
@Slf4j
@RequiredArgsConstructor
public class RapprochementProcessor implements ItemProcessor<PlaqueSousSurveillanceEntity, PlaqueLueEntity> {

    private final RapprochementUseCase rapprochementUseCase;

    @Override
    public PlaqueLueEntity process(PlaqueSousSurveillanceEntity item) {
        log.trace("Traitement d'une entrée PlaqueSousSurveillanceEntity: {}", item);
        List<PlaqueLue> plaqueLues = this.rapprochementUseCase.findRapprochementByPlaque(item.getPlaque());
        return new PlaqueLueEntity();
    }
}
When I tried to put the RapprochementUseCase in the constructor of the BatchConfiguration and declared the bean like this:
@Bean
public RapprochementUseCase rapprochementUseCase(RapprochementUseCase rapprochementUseCase) {
    return rapprochementUseCase;
}
I got an error: The dependencies of some of the beans in the application context form a cycle.
Your RapprochementProcessor requires a RapprochementUseCase, and you should already have a constructor for it generated by @RequiredArgsConstructor.
You need to declare a bean of type RapprochementUseCase and then pass it to your item processor, for example as follows:
@Bean
public ItemProcessor<PlaqueSousSurveillanceEntity, PlaqueLueEntity> rapprochementProcessor(RapprochementUseCase rapprochementUseCase) {
    return new RapprochementProcessor(rapprochementUseCase);
}
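That bean usually comes from the implementation mentioned in the question. A sketch of what it could look like (the @Service annotation, the String parameter type, and the body are assumptions, since the real RapprochementService is not shown):

import java.util.Collections;
import java.util.List;
import org.springframework.stereotype.Service;

@Service
public class RapprochementService implements RapprochementUseCase {

    @Override
    public List<PlaqueLue> findRapprochementByPlaque(String plaque) {
        // hypothetical body: the real lookup logic is not part of the question
        return Collections.emptyList();
    }
}

With a regular bean like this in the context, there is a real RapprochementUseCase to inject, and the self-referencing @Bean method from the question (which caused the cycle) is no longer needed.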
Question:
In order to inject all subclasses of a superclass with no common interface, I created an interface tightly coupled to said superclass, which every "properly" written subclass is supposed to implement.
This works, but seems insane. Was there a better way?
A simple cast does not work, as the Instance holds only a proxy that does not resolve to any real subclass of the interface when called. This results in a ClassCastException.
Some context:
I was recently tasked to provide framework code for an application. In this application, several data transfer objects are mapped from and to service-level POJOs, but their mappings are not always trivial. Dozer is used to do most of the work and to avoid boilerplate code.
In the specific cases requiring explicit mapping instructions, the current recommendation with Dozer is to use API-driven mapping. All the BeanMappingBuilder subclasses defining the mappings should be added to the Dozer mapper upon initialisation.
In order to keep all the work needed to add a new BeanMappingBuilder in one place, I came up with a convoluted use of dependency injection that automatically adds it to the Dozer mapper, despite it having no common interface with the others, only a common superclass.
Some code:
The interface:
@Local
public interface DtoBeanMappingBuilder {
    BeanMappingBuilder get();
}
Subclass example:
@Stateless
public class SomeDtoMappingBuilder extends BeanMappingBuilder implements DtoBeanMappingBuilder {

    @Override
    public BeanMappingBuilder get() {
        return this;
    }

    @Override
    protected void configure() {
        mapping(
            // Some mapping...
        );
    }
}
Mapper with injection point:
@Singleton
@Startup
public class DtoBeanMapper {

    private DozerBeanMapper innerMapper;

    @Inject
    @Any
    private Instance<DtoBeanMappingBuilder> mappingBuilders;

    public <D> D map(Object source, Class<D> destinationClass) {
        return innerMapper.map(source, destinationClass);
    }

    @PostConstruct
    private void init() {
        innerMapper = new DozerBeanMapper();
        mappingBuilders.forEach(mb -> innerMapper.addMapping(mb.get()));
    }
}
My current situation:
I want to inject the following class into my application:
public interface IConfigAccessor<T extends IConfig> {
...
}
ConfigAccessors are proxy objects created dynamically at runtime. The creation of these objects works as follows:
public class ConfigFactory implements IConfigFactory {

    private final IConfigUpdater updater;

    @Inject
    public ConfigFactory(IConfigUpdater updater) {
        this.updater = updater;
    }

    @Override
    public <T extends IConfig> IConfigAccessor<T> register(final String configKey, final Class<T> configClass) {
        ConfigCache<T> configCache = new ConfigCache<>(new SomeOtherThings(), configKey, configClass);
        updater.register(configCache);
        return new ConfigAccessor<>(configCache, configKey, configClass);
    }
}
As you can see, to create these objects I need to inject the ConfigUpdater and other dependencies. This means that Guice needs to be fully configured already.
To get the instance out of Guice, I use the following code:
IConfigFactory configClient = injector.getInstance(IConfigFactory.class);
IConfigAccessor<ConcreteConfig> accessor = configClient.register("key", ConcreteConfig.class);
How I want to inject them via Guice:
Currently, I can get the required objects, but I have to manually pass them around in my application.
Instead, what I want to have is the following:
public class SomeClass {

    @Inject
    public SomeClass(@Config(configKey = "key") IConfigAccessor<ConcreteConfig> accessor) {
        // hurray!
    }
}
What's the correct approach/technology to get this working?
After a lot of research, I'm feeling a bit lost on how to approach this topic. There are a lot of different things Guice offers, including simple Providers, custom Listeners which scan classes and identify custom annotations, FactoryModuleBuilders and more.
My problem is quite specific, and I'm not sure which of these things to use and how to get it working. I'm not even sure whether this is possible with Guice.
Edit: What I have so far
I have the following annotation, which I want to use on constructor parameters:
@Target({ ElementType.FIELD, ElementType.PARAMETER })
@Retention(RetentionPolicy.RUNTIME)
public @interface InjectConfig {
    String configKey();
}
Inside the module, I can bind a provider to IConfigAccessor (with the above annotation) as such:
bind(IConfigAccessor.class).annotatedWith(InjectConfig.class)
.toProvider(new ConfigProvider<>());
However, there are two problems with this:
1. The provider cannot provide IConfigAccessor. To create such an instance, the provider would need an IConfigUpdater, but since I use 'new' for the provider, I can't inject it.
2. Inside the provider, there is no way to find out about the configKey used in the annotation.
Second approach:
Let's assume that I already know all configurations and configKeys I want to inject during startup. In this case, I could loop over all possible configKeys and have the following binding:
String configKey = "some key";
final Class<? extends IConfig> configClass =...;
bind(IConfigAccessor.class).annotatedWith(Names.named(configKey))
.toProvider(new ConfigProvider<>(configKey, configClass));
However, problem (1) remains: the provider cannot get an IConfigUpdater instance.
The main problem here is that you cannot use the value of the annotation in the injection. There is another question which covers this part:
Guice inject based on annotation value
Instead of binding a provider instance, you should bind the provider class, and get the class by injecting a TypeLiteral.
That way, your config factory can look like this:
public class ConfigFactory<T extends IConfig> implements IConfigFactory {

    @Inject private IConfigUpdater updater;
    @Inject private TypeLiteral<T> type;

    @Override
    public IConfigAccessor<T> register(final String configKey) {
        Class<T> configClass = (Class<T>) type.getRawType();
        ConfigCache<T> configCache = new ConfigCache<>(new SomeOtherThings(), configKey, configClass);
        updater.register(configCache);
        return new ConfigAccessor<>(configCache, configKey, configClass);
    }
}
And then SomeClass:
public class SomeClass {

    @Inject
    public SomeClass(ConfigFactory<ConcreteConfig> factory) {
        IConfigAccessor<ConcreteConfig> accessor = factory.register("key");
    }
}
Since SomeClass needs to know "key" anyway, this is not too much of a change information-wise. The downside is that the SomeClass API now gets a factory instead of the concrete config.
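For completeness, a minimal sketch of the module side this relies on (DefaultConfigUpdater is a hypothetical implementation name, not from the question): the concrete ConfigFactory<ConcreteConfig> needs no explicit binding, since Guice can construct it just-in-time and supply the reified TypeLiteral<ConcreteConfig> at the injection point.

import com.google.inject.AbstractModule;

public class ConfigModule extends AbstractModule {

    @Override
    protected void configure() {
        // hypothetical binding; the real IConfigUpdater implementation is not shown in the question
        bind(IConfigUpdater.class).to(DefaultConfigUpdater.class);
    }
}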
[EDIT]
And here is someone who actually did inject annotated values using custom injection.
I have a typed service interface:
public interface BaseArticleService<T extends BaseArticle> {
T getById(Long id);
}
And have two interfaces which extends it:
public interface AccArticleService extends BaseArticleService<AccArticle> {
}
public interface ExpArticleService extends BaseArticleService<ExpArticle> {
Long getCount();
}
Then I have a similar architecture for controllers:
public abstract class BaseArticleController<T extends BaseArticle> {
    @Autowired
    BaseArticleService<T> baseArticleService;
}
And:
@Controller
@RequestMapping(value = "/exp/articles")
public class ExpArticleController extends BaseArticleController<ExpArticle> {
}

@Controller
@RequestMapping(value = "/acc/articles")
public class AccArticleController extends BaseArticleController<AccArticle> {
}
Now, if I want to get my ExpArticleService instance from the BaseArticleService that is injected in my BaseArticleController, how can I achieve this?
If I do this:
public BaseArticleService<ExpArticle> getExpArticleService() {
return super.baseArticleService;
}
then I cannot call my getCount() method like getExpArticleService().getCount()
And I cannot do this either:
public ExpArticleService getExpArticleService() {
    return super.baseArticleService;
}
So what's the solution? Maybe inject another ExpArticleService in my ExpArticleController?
When using DI, you are by definition relying on the interface. If you inject BaseArticleService, then the only methods available to you are the ones defined in that interface; in your case, T getById(Long id).
Your way is not a bad one: you have a generic superclass that allows you to initialise the interface with different parameters very easily, but you shouldn't expect to explicitly get any of the specific implementations. (You could check and cast, but that is not clean.)
As already stated, you should extract the specific implementation into another class and directly inject that one into the controller requiring it, for example as sketched below. Keep your generic base, though; it is a nice way to handle all the parameterized methods shared between your controllers.
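A minimal sketch (assuming ExpArticleService is itself registered as a Spring bean, e.g. via a @Service-annotated implementation): the subclass controller injects the more specific interface in addition to the generic one inherited from the base class.

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.ResponseBody;

@Controller
@RequestMapping(value = "/exp/articles")
public class ExpArticleController extends BaseArticleController<ExpArticle> {

    // the specific interface, injected alongside the inherited generic BaseArticleService<ExpArticle>
    @Autowired
    private ExpArticleService expArticleService;

    // hypothetical endpoint, only to show the subtype-specific method being reachable
    @RequestMapping("/count")
    @ResponseBody
    public Long count() {
        return expArticleService.getCount();
    }
}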
I've read a lot about getting a generic type at runtime, and I've understood that to work around type erasure and get the generic type without passing it to the constructor, I can use an anonymous class plus a utility method, i.e.:
interface Generic<T> {
    public Class<T> getGenericType();
}

@Component
class GenericImpl<T> extends AbstractGenericImpl<T> {
}

abstract class AbstractGenericImpl<T> implements Generic<T> {

    protected Class<T> klass;

    @SuppressWarnings("unchecked")
    public Class<T> getGenericType() {
        if (klass == null) {
            // this is a Spring utility method
            klass = (Class<T>) GenericTypeResolver.resolveTypeArgument(getClass(), AbstractGenericImpl.class);
        }
        return klass;
    }
}
Now, using the previous class hierarchy, I have a working getGenericType method if and only if I instantiate a Generic<Anything> using an anonymous class. In fact, in this test only the first two assertions pass:
@Test
public void testGeneric() throws Exception {
    Generic<String> anonymous = new AbstractGenericImpl<String>() {};
    Generic<String> anonymous2 = new GenericImpl<String>() {};
    Generic<String> concrete = new GenericImpl<String>();

    // assertions
    assertThat("Anonymous of abstract class", anonymous.getGenericType(), equalTo(String.class));
    assertThat("Anonymous of concrete subclass", anonymous2.getGenericType(), equalTo(String.class));
    assertThat("With non anonymous class it fails", concrete.getGenericType(), equalTo(String.class));
}
The third one is failing with Expected: <class java.lang.String> but: was <class java.lang.Object>
Now I'd like to use the Generic class with Spring's @Autowired annotation, i.e.:
@Autowired Generic<String> auto;

@Test
public void testAutowiring() {
    assertThat(auto, instanceOf(Generic.class));
    assertThat(auto.getGenericType(), equalTo(String.class));
}
but the second assertion fails with the same error as above (Object instead of String), because the Spring container internally instantiates it with new GenericImpl<String>().
I've already tried making the constructor of GenericImpl<T> protected, and also declaring GenericImpl<String> itself abstract, but in both cases Spring fails with a "Cannot instantiate bean" exception.
Is there any simple way to tell Spring to instantiate classes using anonymous classes?
Additional details
The final class will convert a JSON stream into a POJO with Jackson, and the Jackson library needs the Class<T> field to unmarshal objects.
// here I convert json stream to a POJO and I need the generic type
mapper.readValue(hit.source(), getGenericType());
Since I have multiple POJO classes to convert from and to JSON, I've implemented all the logic in a common generic class called Retriever. In the end I'll have one Retriever for each POJO, and often those retrievers are autowired into other classes.
@Autowired Retriever<Artifact> retriever;
Currently I have a constructor in Retriever that takes a Class<T> parameter and uses it later to perform the conversion. In the Spring context I have this for autowiring:
<!-- Since retriever has a Class<T> constructor this is the only way I found to resolve its dependency -->
<bean id="artifactRetriever" class="a.b.c.RetrieverImpl">
<constructor-arg value="a.b.c.Artifact"/>
</bean>
and I need one of these for each POJO that needs conversion. This approach works, but it's a little verbose and clutters the application context with boilerplate. So I was looking for a way to get rid of all this noise in the application context.
It's not possible to create and instantiate anonymous classes in place with Spring, at least not with XML configuration (since it needs a class name, and you don't have one).
OK, the final solution for my use case will use the approach described in this answer. It is better because it will be possible to track usages, and I'll get rid of every problem I'm having with the current approach.
That way I can do the following:
@Component
public class ArtifactImpl extends AbstractGenericImpl<Artifact> {
}

@Component
public class MaterialImpl extends AbstractGenericImpl<Material> {
}

@Component
class Usage {
    @Autowired ArtifactImpl foo;
    @Autowired MaterialImpl bar;
}
This way everything is checked at compile time and I got rid of the Class<T> constructor; in fact autowiring works (without @Qualifier) and the following test passes:
@RunWith(SpringJUnit4ClassRunner.class)
public class AutowiringTest {

    @Autowired Usage test;

    @Test
    public void testAutowiring() {
        assertThat(test.foo.getGenericType(), equalTo(Artifact.class));
        assertThat(test.bar.getGenericType(), equalTo(Material.class));
    }
}
Original answer
OK, I've found out that what I was asking would be useless: autowiring happens at runtime, so having two autowired fields with different type parameters leads to Spring errors, i.e. this won't work:
@Configuration
class RetrieverProvider {

    @Bean
    Retriever<Artifact> getArtifact() {
        return new RetrieverImpl<Artifact>() {};
    }

    @Bean
    Retriever<Material> getMaterial() {
        return new RetrieverImpl<Material>() {};
    }
}

class InjectedAttempt {
    // at injection time, i.e. at runtime, type erasure prevents Spring from distinguishing
    @Autowired Retriever<Artifact> foo; // this type
    @Autowired Retriever<Material> bar; // from this type
    // so it cannot perform injection by type
}
The only way to get that working is to use qualifiers, as shown below; but I don't like this approach, so I'll stick with XML configuration and constructor arguments.
@Configuration
class RetrieverProvider {

    @Bean @Qualifier("artifact")
    Retriever<Artifact> getArtifact() {
        return new RetrieverImpl<Artifact>() {};
    }

    @Bean @Qualifier("material")
    Retriever<Material> getMaterial() {
        return new RetrieverImpl<Material>() {};
    }
}

class Injected {
    @Autowired @Qualifier("artifact") Retriever<Artifact> foo;
    @Autowired @Qualifier("material") Retriever<Material> bar;
}
As a side note, Guice has support for generic injection; maybe Spring has something similar.
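Spring does have something similar: since Spring Framework 4.0 the container treats the full generic type as an implicit qualifier for injection. Under that assumption (Spring 4 or later; the bean method names below are only illustrative), a configuration like the one above can work without @Qualifier:

@Configuration
class RetrieverProvider {

    @Bean
    Retriever<Artifact> artifactRetriever() {
        return new RetrieverImpl<Artifact>() {};
    }

    @Bean
    Retriever<Material> materialRetriever() {
        return new RetrieverImpl<Material>() {};
    }
}

class Injected {
    // on Spring 4+ these are resolved by their generic type, no @Qualifier needed
    @Autowired Retriever<Artifact> foo;
    @Autowired Retriever<Material> bar;
}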