I have an application with a very simple plugin architecture that registers additional behaviour by means of Spring classpath scanning ("installing" a plugin simply means putting the plugin.jar on the classpath).
This works great for registering additional beans and it also works great for registering hooks like this:
// core.jar:
@Component
class CoreClass {
    public void addListener(Listener listener) { /* ... */ }
}

// plugin.jar
@Component
class Plugin {
    public Plugin(CoreClass coreClass) {
        coreClass.addListener(new PluginListener());
    }
}
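For context, the Listener and PluginListener types referenced above are not shown in the original post; a minimal sketch of what they might look like follows (the callback method name is an assumption):

// Hypothetical sketch only - the original post does not show these types.
public interface Listener {
    void onEvent(String event); // assumed callback method
}

class PluginListener implements Listener {
    @Override
    public void onEvent(String event) {
        // plugin-specific handling goes here
    }
}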
However, sometimes it is more appropriate to replace an entire bean, which might look like this:
// core.jar:
@Component("someBean")
class CoreClass implements CoreInterface {
    @Override
    public void doStuff() { /* ... */ }
}

// plugin.jar
@Component("someBean")
@Order(Ordered.HIGHEST_PRECEDENCE)
class Plugin implements CoreInterface {
    @Override
    public void doStuff() { /* ... */ }
}
In this case I want both Plugin and CoreClass to be discovered by the classpath scan during startup, but then CoreClass should be ignored, because the bean someBean has another definition with higher precedence.
I am aware that I could do this using XML (<import resource="classpath*:plugin-spring.xml" />), because XML allows overriding definitions. But is this also possible using annotations?
Edit: Note that I sometimes have multiple bean instances of the same type (with different properties) which I inject by name. Thus I actually need to override the bean with a certain name.
A possible solution is to use @Primary:
Indicates that a bean should be given preference when multiple candidates are qualified to autowire a single-valued dependency. If exactly one 'primary' bean exists among the candidates, it will be the autowired value.
This annotation is semantically equivalent to the <bean> element's primary attribute in Spring XML.
May be used on any class directly or indirectly annotated with @Component or on methods annotated with @Bean.
// core.jar:
@Component
class CoreClass implements CoreInterface {
    @Override
    public void doStuff() { /* ... */ }
}

// plugin.jar
@Component
@Primary
class Plugin implements CoreInterface {
    @Override
    public void doStuff() { /* ... */ }
}
http://memorynotfound.com/handling-multiple-autowire-candidates-with-spring-primary/
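To illustrate the effect at an injection point, here is a minimal sketch (the Consumer class is hypothetical and not part of the original posts): with both CoreClass and Plugin on the classpath, the single-valued dependency resolves to the @Primary-annotated Plugin.

@Component
class Consumer {
    // Both CoreClass and Plugin are candidates; Spring picks Plugin because it is @Primary.
    @Autowired
    private CoreInterface coreInterface;
}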
Or you could use:
@Configuration
@ImportResource({ "classpath*:core-spring.xml", "classpath*:plugin-spring.xml" })
public class ConfigClass { }
I suspect the problem with annotations is that the override order is not clear. If you define two annotated components with the same name, you will end up with a ConflictingBeanDefinitionException: Spring will not allow you to have two @Component or @Bean definitions with the same name. In XML you can manage the override order, which is why this works via XML; with annotations you can't do that.
Any given Spring context can only have one bean for any given id or name. In the case of the XML id attribute, this is enforced by the schema validation. In the case of the name attribute, this is enforced by Spring's logic.
However, if a context is constructed from two different XML descriptor files, and an id is used by both files, then one will "override" the other. The exact behavior depends on the ordering of the files when they get loaded by the context.
So while it's possible, it's not recommended. It's error-prone and fragile, and you'll get no help from Spring if you change the ID of one but not the other.
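A minimal sketch of the conflict mentioned above (the class names are made up): if two different component-scanned classes claim the same bean name, the context refresh fails instead of silently overriding one with the other.

// Both classes are found by the same component scan and claim the name "someBean",
// so startup fails with a ConflictingBeanDefinitionException.
@Component("someBean")
class CoreImpl implements CoreInterface {
    public void doStuff() { }
}

@Component("someBean")
class PluginImpl implements CoreInterface {
    public void doStuff() { }
}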
Related
I'm looking for a pattern to simplify bean creation for a team of developers.
For now, I've been trying to use abstract classes like this:
public abstract class BaseClass {
    @Bean
    public BeanA generateBeanA() {
        // ...
        return beanA;
    }

    @Bean
    public BeanB generateBeanB() {
        // ...
        return beanB;
    }
}
The idea behind it: I provide the BaseClass and I'd like developers to extend it many times. For each extension, every bean should be generated. But this approach doesn't work: for each extension, developers have to redeclare the beans, for two reasons:
bean declaration is not inherited
to avoid name clashing, they have to name beans manually in each extension
Is there a better pattern that would allow the following?
centralized bean naming (i.e. the developer declares a base name in the extension and every bean of the extension is renamed accordingly: ${baseName}${beanName})
overridden beans would be declared (instead of the parent version)
parent beans would be declared if not overridden
There are three ways to configure beans in the Spring container:
Declare them using XML configuration.
Declare beans using the @Bean annotation in a configuration class.
Mark the class with one of the annotations from the org.springframework.stereotype package, and leave the rest to component scanning.
So the way you declare your Spring beans is wrong.
IMHO, let the developers declare the beans as they want, and simply @Autowired them where needed.
Please continue by reading this
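For reference, a minimal sketch of the second option from the list above, a @Bean method inside a @Configuration class (the class and bean names are hypothetical):

@Configuration
public class AppConfig {
    // The container calls this factory method and manages the returned instance as a bean named "beanA".
    @Bean
    public BeanA beanA() {
        return new BeanA();
    }
}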
I have a collection of classes which I want to be registered in the Spring application context. However, these classes can only be guaranteed to be annotated with one of a group of annotations I have written - i.e. I can assume it will be annotated with @MyAnnotation, but not @Component.
However, @MyAnnotation forms part of an API for my project, and I don't want to state an explicit dependency of this API on Spring. Thus, I can't annotate @MyAnnotation with @Component in order to have it be transitively picked up by Spring.
Is there a way to tell Spring to additionally include @MyAnnotation in its classpath scanning without adding this dependency to my API?
Currently I'm manipulating the bean definition registry to 'manually' add each class annotated with @MyAnnotation, but I'd prefer to rely on Spring's inbuilt support.
Thanks in advance.
It's possible if you create your own BeanDefinitionRegistryPostProcessor to register your own beans. If you implement the postProcessBeanDefinitionRegistry method, you can add beans to the registry by yourself, for example:
@Component
public class FooFactoryBean implements BeanDefinitionRegistryPostProcessor {
    @Override
    public void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry registry) throws BeansException {
        registry.registerBeanDefinition(..);
    }

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
        // required by the interface; nothing to do here
    }
}
To obtain these bean definitions, you can use the ClassPathScanningCandidateComponentProvider class, which will create BeanDefinition objects for all classes found for a specific filter. In this case, an AnnotationTypeFilter will work:
ClassPathScanningCandidateComponentProvider scanner = new ClassPathScanningCandidateComponentProvider(false);
scanner.addIncludeFilter(new AnnotationTypeFilter(Foo.class));
Set<BeanDefinition> definitions = scanner.findCandidateComponents("com.example.my");
In this example, it will find all classes annotated with @Foo in the com.example.my package.
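Putting the two snippets together, here is a hedged sketch of how the scanned definitions could be registered from the post-processor; the registrar class name, base package, and bean-name generation strategy are assumptions, not part of the original answer.

@Component
public class FooRegistrar implements BeanDefinitionRegistryPostProcessor {
    @Override
    public void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry registry) throws BeansException {
        // Scan for @Foo-annotated classes and register each one as a bean definition.
        ClassPathScanningCandidateComponentProvider scanner =
                new ClassPathScanningCandidateComponentProvider(false);
        scanner.addIncludeFilter(new AnnotationTypeFilter(Foo.class));
        for (BeanDefinition definition : scanner.findCandidateComponents("com.example.my")) {
            String beanName = BeanDefinitionReaderUtils.generateBeanName(definition, registry);
            registry.registerBeanDefinition(beanName, definition);
        }
    }

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
        // nothing to do here
    }
}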
@Configuration classes and XML-based configuration should work for you. Have a look at this tutorial: https://www.tutorialspoint.com/spring/spring_java_based_configuration.htm
But getting your @MyAnnotation classes picked up is more difficult (see @g00glen00b's answer), and I'm not sure it makes sense if the above-mentioned solutions are available.
I have a Spring application consisting of multiple modules. One of these modules requires certain Spring beans to be present in the context (it cannot run standalone as it does not have a complete context itself).
This module provides basic functionality that needs to be shared amongst many applications that customize this module by making the correct beans available (singleton or request scoped, depending on needs).
This works perfectly and we're very happy with this setup, as it provides a separation between core functionality and business-specific logic.
My question is: I have a class that can optionally be used to satisfy one of the dependencies. It is not annotated with @Component to prevent it from being scanned; however, I would like the projects to be able to choose to use this class or supply their own implementation.
The core module looks like this:
public interface AProvider { }

@Component
public class AService {
    @Inject private AProvider aProvider;
}
And it provides this implementation that can optionally be used:
public class DatabaseBasedAProvider implements AProvider {
    @Inject private SomeOtherDependency dependency; // <-- this needs to be injected still if used!
}
An example project that uses the core module must then make sure that one bean of type AProvider is present in the context. This can be achieved like this:
@Configuration
public class Configuration {
    @Bean
    AProvider getAProvider() {
        return new OurOwnAProviderImplementation();
    }
}
What I would like though is something like:
@BeanClass // <-- some annotation I made up
Class<AProvider> getAProviderClass() {
    return DatabaseBasedAProvider.class; // <-- have spring inject this!
}
What I don't want is:
@Bean
AProvider getAProvider() {
    return new DatabaseBasedAProvider( ... add dependencies here myself ... );
}
I have solved a case similar to yours (if I understand correctly) using the @Primary annotation. Might be something for you.
public interface AProvider { }
For every module to have some implementation of the interface, create a default implementation that is shared.
@Service
public class DefaultAProvider implements AProvider { }
Then, if some module wishes to use its own implementation, "override" the bean using @Primary.
@Primary
@Service
public class MyVerySpecialAProvider implements AProvider { }
Then, anytime you inject AProvider, Spring will pick the @Primary implementation.
An alternative would be to use @Profile; another would be to annotate your AProvider classes with @Component in combination with @ConditionalOnProperty and document the different choices for your consumers.
Example
@Component
@ConditionalOnProperty(name = "my.aprovider.choice", havingValue = "database")
public class DatabaseBasedAProvider implements AProvider {
    @Inject private SomeOtherDependency dependency; // <-- this needs to be injected still if used!
}
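And a sketch of the @Profile alternative mentioned above (the profile name "database" is just an example): the bean is only registered when that profile is active, e.g. via spring.profiles.active=database.

// Only registered when the "database" profile is active.
@Profile("database")
@Component
public class DatabaseBasedAProvider implements AProvider {
    @Inject private SomeOtherDependency dependency;
}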
I've found a solution that allows me to decide at the client what class I want to use for AProvider.
It is not super nice, but it does mean I don't need to make specific changes to the code in the core module (as this module is supposed to be generic).
In a @Configuration class in the client's config I'm now doing this:
@Component
static class MyDatabaseBasedAProvider extends DatabaseBasedAProvider {
    // No implementation
}
This makes Spring construct the class and handle all the injections. It could be shorter and it does require the class to be non-final but it works.
The client is now alerted if the bean is missing, is free to make their own implementation, and is free to pick one of the existing implementations if one suits their needs, without the core module having to decide beforehand how AProvider might be supplied.
Often our test and production environments differ from each other. Depending on some configuration parameters, certain classes are not required to be registered as beans by the container (while using Spring). Is there a way to dynamically skip such classes from the application context?
If the class that should be used in the test is only available on the classpath while testing, and your problem is about replacing injected classes, then you could annotate the class that is used in tests with the @Primary annotation.
@Primary is a feature of Spring that is much older than conditions and less powerful, but it is really easy to use. It says: when an injection point could be fulfilled by two beans, use the bean that is annotated with @Primary instead of throwing an exception that complains about ambiguous beans.
So when you add a bean in test scope with the @Primary annotation, this bean replaces the original bean at its injection points.
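A minimal sketch of that idea, assuming a test-only configuration class (MyService and StubMyService are made-up names):

// Lives on the test classpath only; its @Primary bean replaces the
// production implementation at every MyService injection point.
@Configuration
public class TestConfig {
    @Bean
    @Primary
    public MyService testMyService() {
        return new StubMyService(); // hypothetical test stub
    }
}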
Using Spring 4's @Conditional annotation, we can.
Check here for details of Spring's Condition interface.
First create a class, say ComponentScanCondition, which implements Spring's Condition interface. Its only method, matches, returns false if the "environment" system property is not null and equals "test".
public class ComponentScanCondition implements Condition {
    @Override
    public boolean matches(ConditionContext context, AnnotatedTypeMetadata metaData) {
        // Register the bean only when the "environment" system property is not "test"
        return !"test".equals(System.getProperty("environment"));
    }
}
Now, with the annotation @Conditional(ComponentScanCondition.class), you can control the component scan for the classes you do not need in the test environment.
In the JUnit test class, set up the system property as below:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"classpath:applicationContext.xml"})
public class TestMyClass {
    @BeforeClass
    public static void setUpBeforeClass() throws Exception {
        System.setProperty("environment", "test");
    }

    @Test
    public void testSomeMethod() {
    }
}
And in the classes not required in test, use the @Conditional annotation. For example, if you do not need the UserProfile class in the test environment, skip it as below:
@Service("userProfile")
@Conditional(ComponentScanCondition.class)
public class UserProfile {
}
As the system property "environment" is set to "test" in the test class, the matches method will return false and Spring will skip scanning the UserProfile class.
In the production environment this property will not be set (it will be null), so matches will return true, and hence UserProfile (and the other classes with @Conditional) will be scanned by Spring and registered as beans in the container.
I have a Dynamic Web Project created in Eclipse (Juno) using Spring/Maven/WebLogic. I also have an abstract class into which I inject a property (on the setter) via a properties file. The annotation used is @Value(value="${some.property}").
For some strange reason, when I deploy this project via Maven to WebLogic, the property gets injected for the concrete class that extends this abstract class. But when I deploy this project directly onto WebLogic via Server -> Add Deployment, it fails to inject the property. In fact it does inject properties for all other annotations on the concrete class but ignores any annotations on the abstract class.
So basically this has nothing to do with coding but seems like some kind of configuration problem. Has anyone encountered something similar? Thanks.
public abstract class MyAbstract {
    @Value(value="${myproperty1}")
    public void setMyValue(String myValue) {
        log.debug("setMyValue({})", myValue);
        this.myValue = myValue;
    }
}

public class MyConcrete extends MyAbstract {
    @Value(value="${myproperty2}")
    public void setMyValue2(String myValue2) {
        log.debug("setMyValue2({})", myValue2);
        this.myValue2 = myValue2;
    }
}
I found the issue. The thing is that this particular bean has the @Scheduled annotation and hence does not qualify for bean post-processing. Since the properties via @Value are set during this phase, they are skipped. Instead I shifted the bean creation into my application context XML file and set the properties from there, e.g.:
<bean id="myConcrete" class="some.package.MyConcrete"
      p:myValue="${myproperty1}" />
Now the properties are injected during the normal bean lifecycle, i.e. after instantiation but before the bean post-processors run.