I have a project which is expanding quickly and now has over a hundred components. For unit testing, I have a @Configuration class to substitute stubbed classes, such as a database service with limited functionality:
@Configuration
public class SpringDevTestConfiguration {

    @Bean
    @Profile("unittest")
    public DatabaseService databaseService() {
        // this one is needed for the test
        return new TestDatabaseServiceImpl();
    }

    @Bean
    @Profile("unittest")
    public RobotControlService robotControlService() {
        // Why should I do this on every service?
        return new RobotControlServiceImpl();
    }

    // ... etc.
}
The vast majority of components don't need test alternatives, so we want the default implementation that already exists to be picked up at runtime. But whenever a new component is added to the system, we have to add it to all of the @Configuration classes, or we get the "NoSuchBeanDefinitionException: No qualifying bean of type..." exception. That's a lot of boilerplate code just to return Impl classes, and it prevents us from making implementation classes package-private (because the configuration class needs to be able to instantiate them).
I have tried adding @ComponentScan to the class, but that results in BeanDefinitionOverrideException. If I allow bean overrides with:
spring.main.allow-bean-definition-overriding=true
... then I get NoUniqueBeanDefinitionException because of the multiple database service implementations.
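To make the clash concrete, a minimal sketch (DatabaseServiceImpl is an assumed name for the default implementation that component scanning picks up at runtime):

// default implementation, discovered by @ComponentScan and used at runtime
@Service
public class DatabaseServiceImpl implements DatabaseService {
    // real database access here
}

With @ComponentScan on the test configuration, this definition and the unittest @Bean above both end up in the context, which is exactly what triggers the NoUniqueBeanDefinitionException.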
I'm sure I must be missing something. How can I have a short, simple @Configuration class that only instantiates the test implementations I need for the test, and then lets the other services be discovered automatically without explicitly creating them?
Edit: Fixed by changing package.
I have this configuration file for the Spring Framework:
@Configuration
public class AppConfig {

    @Bean(initMethod = "populateCache")
    public AccountRepository accountRepository() {
        return new JdbcAccountRepository();
    }
}
JdbcAccountRepository looks like this.
@Repository
public class JdbcAccountRepository implements AccountRepository {

    @Override
    public Account findByAccountId(long accountId) {
        return new SavingAccount();
    }

    public void populateCache() {
        System.out.println("Populating Cache");
    }

    public void clearCache() {
        System.out.println("Clearing Cache");
    }
}
I'm new to the Spring Framework and am trying to use initMethod and destroyMethod. Both of these methods produce the following error:
Caused by: org.springframework.beans.factory.support.BeanDefinitionValidationException: Could not find an init method named 'populateCache' on bean with name 'accountRepository'
Here is my main method.
public class BeanLifeCycleDemo {

    public static void main(String[] args) {
        ConfigurableApplicationContext applicationContext =
                new AnnotationConfigApplicationContext(AppConfig.class);
        AccountRepository bean = applicationContext.getBean(AccountRepository.class);
        applicationContext.close();
    }
}
Edit
I was practicing from a book and had created many packages for different chapters. The error was that I was importing a different JdbcAccountRepository from another package, one that did not have that method. I fixed it and it works now. The answers hinted me towards this.
Like you said, if you are mixing configuration styles, it can be confusing. Besides, even though you declared a bean of type AccountRepository, Spring does a lot of things at runtime, so it can call your initMethod even where the compiler couldn't.
So yes, if you have many beans of the same type, Spring can get confused about which one to call, hence your exception.
Oh, and by the way, since a @Configuration class creates the accountRepository bean, you can remove the @Repository annotation from your JdbcAccountRepository... It is either @Configuration + @Bean, or @Component/@Repository/@Service + @ComponentScan.
TL;DR
Here is more information on how Spring creates your beans: What objects are injected by Spring?
@Bean(initMethod = "populateCache")
public AccountRepository accountRepository() {
    return new JdbcAccountRepository();
}
With this code, Spring will:
Detect that you want to add a bean to the application context.
Retrieve the bean information from the method signature. In your case, it will create a bean of type AccountRepository named accountRepository... That's all Spring knows; it won't look inside your method body.
Once Spring is done analysing your classpath and scanning the bean definitions, it will start instantiating your objects.
It will therefore create your bean accountRepository of type AccountRepository.
But Spring is "clever" and nice to us. Even though you couldn't write this call yourself without the compiler yelling at you, Spring can still call your method.
To see this for yourself, try writing this code:
AccountRepository accountRepository = new JdbcAccountRepository();
accountRepository.populateCache(); // Compiler error => the method is not found.
But it works for Spring... Magic.
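Under the hood there is no magic, only reflection; roughly something like this (a simplified sketch, not Spring's actual code, exception handling omitted):

AccountRepository accountRepository = new JdbcAccountRepository();

// Spring resolves "populateCache" by name on the runtime class, not on the declared interface
java.lang.reflect.Method initMethod = accountRepository.getClass().getMethod("populateCache");
initMethod.invoke(accountRepository); // prints "Populating Cache"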
My recommendation, but you might be thinking the same now: if you have classes spread across many packages answering different business cases, then rely on @Configuration classes. @ComponentScan is great to kickstart your development, but reaches its limits when your application grows...
You are mixing two different ways of declaring Spring beans:
Using @Configuration classes. Spring finds all classes annotated with @Configuration and uses them as a reference for which beans should be created.
So if you follow this path of configuration, don't use @Repository on the beans; Spring will create them anyway.
Using @Repository - the other way around - you don't need @Configuration in this case. If you decide to use @Repository, put the @PostConstruct annotation on the method and Spring will call it; in this case remove the @Configuration class altogether (or at least remove the @Bean method that creates JdbcAccountRepository).
Annotate the populateCache method with @PostConstruct and remove initMethod from @Bean. It will work.
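A minimal sketch of that variant (the @Bean declaration stays, just without the initMethod attribute; @PostConstruct here is javax.annotation.PostConstruct):

public class JdbcAccountRepository implements AccountRepository {

    @Override
    public Account findByAccountId(long accountId) {
        return new SavingAccount();
    }

    @PostConstruct // Spring calls this once the bean has been created and its dependencies injected
    public void populateCache() {
        System.out.println("Populating Cache");
    }
}

@Bean // no initMethod needed any more
public AccountRepository accountRepository() {
    return new JdbcAccountRepository();
}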
(spring boot 1.5, java 8)
Suppose there is a Foundation, some Wall types, and Ceilings. They depend on each other, just like their physical counterparts.
This configuration class has a Foundation injected and creates Wall beans, so that Ceilings can have Walls injected. Don't mind the bad practices, I'm just trying to keep the code as short as possible without changing the essential mechanisms.
@Configuration
class Config {

    @Autowired
    Foundation foundation;

    @Bean
    // assume there is a lot of repetitive logic here
    WoodenWall wood() { return new WoodenWall(foundation); }

    @Bean
    BrickWall brick() { return new BrickWall(foundation); }
}
And a ceiling:
@Component
class KitchenCeiling {

    @Autowired
    WoodenWall wall;
}
I'm going to be making a lot more Wall types and want them all registered as beans. Instead of having to define a @Bean method for each and every one of them, I want to instantiate them all in a loop and register them manually in a single go. Think of an AllTypesOfWallsBeanFactory if you wish.
@Configuration
class Config implements ??? {

    @Autowired
    Foundation foundation;

    @Override
    void addBeans(??? beanRegistry) {
        for (Class beanClass : wallClasses) {
            // instantiate BrickWall, WoodenWall, etc.
            beanRegistry.add(beanClass.getSimpleName(), beanClass, wallInstance);
        }
    }
}
The problem is, I can't find the right interface to implement or the right SpringBeanRegistryPostProcessorFactoryImplementationThingamajig to go to. I've tried the answers given on all the other SO posts I could find but none do the right thing.
One issue here is that I need the context to already be initializing beans, since I need a Foundation, but I also want to add some more beans to the context so that Ceilings yet to be initialized can have my freshly made Walls injected.
@Bean methods do exactly that: they can take dependencies that are already initialized and return new beans to be used elsewhere at the same time. I just need the programmatic equivalent of this exact mechanism.
I know dependency resolution is harder in this situation (since there isn't any reflective information for Spring to figure out the total ordering up front), but it's got to be possible to tell Spring to:
initialize what it has dependencies for
process the newly created beans
goto 1
BeanDefinition doesn't fit the bill because it doesn't take any instances, and to use the factoryMethod setting I would need to make a factory class for each Wall. Back to square one.
Most of the BeanDefinitionRegistryPostProcessor et al. interfaces are called too late, don't inject the beans I give them into dependent beans (there was one that took instances but didn't do anything with them), or demand a default constructor (i.e. no dependencies). They also tend to give you objects that only take a BeanDefinition.
In another project with a similar need, what ended up being used was a factory method, but still with a @Bean method for each instance it needed to produce. I'm starting to think it isn't really possible and that what was done in that project is as good as it gets.
You can try using @PostConstruct with a ConfigurableApplicationContext:
@Configuration
class Config {

    @Autowired
    Foundation foundation;

    @Autowired
    ConfigurableApplicationContext ctx;

    @PostConstruct
    void addBeans() {
        for (Class<?> beanClass : wallClasses) {
            try {
                // build the instance ourselves, then register it under its simple class name
                Object wallInstance = beanClass.getConstructor(Foundation.class).newInstance(foundation);
                ctx.getBeanFactory().registerSingleton(beanClass.getSimpleName(), wallInstance);
            } catch (ReflectiveOperationException e) {
                throw new IllegalStateException("Could not create wall bean " + beanClass, e);
            }
        }
    }
}
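Beans registered this way become available in the context, so components created later in the startup can have them injected, or they can be looked up directly, e.g. (a small usage sketch, assuming the registration above has already run):

BrickWall brick = ctx.getBean("BrickWall", BrickWall.class); // the bean name is the simple class name used above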
I'm going to conclude that it just isn't possible to do this reliably. The best one can do is to make a factory method plus a @Bean method calling the factory for each bean instance that needs to be registered. It doesn't seem possible to cut the @Bean methods out of the picture.
I am developing a REST API with Spring Boot. The problem is that I have one interface and two implementations, and I want to test only with the mock implementation.
Interface: CRMService
@Service
CRMServiceImpl
@Service
CRMServiceMock
Implementations: the first one is the real integration with the backend and the second is a mock for testing purposes. What's the best approach: integration tests, or tests based on the active profile? And if I need to autowire a service based on the profile, what's the best practice?
While I'm sure there are exceptions, generally it shouldn't be integration or unit tests (which often involve mocks), but both; see the testing pyramid concept.
Integration tests: just use the real service. If it calls out to other live services, then consider injecting the URLs as Spring Boot properties which point to mock servers in the test environment (Node.js or something easy and quick).
Unit tests: consider using a mocking framework like Mockito. Using it you can write your tests with mocks approximately like so:
private CRMServiceImpl mockService = mock(CRMServiceImpl.class);

@Test
public void someTest() {
    when(mockService.someMethod(any(String.class), eq(5))).thenReturn("Hello from mock object.");
}
The above example roughly translates to "when some class invokes 'someMethod(String, int)' on your service, return the String specified".
This way allows you to still use mocks where necessary, but avoids having to maintain entire mock implementation profiles and avoids the problem of what to auto-wire.
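To show where such a mock typically ends up, a slightly fuller sketch (OrderService, its constructor, and greet(...) are made-up names for whatever class consumes CRMService; someMethod is the stubbed call from the snippet above):

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.*;

import org.junit.Test;

public class OrderServiceTest {

    // mock the interface rather than a concrete implementation
    private CRMService crmService = mock(CRMService.class);

    // hypothetical class under test that receives the service via its constructor
    private OrderService orderService = new OrderService(crmService);

    @Test
    public void usesCrmGreeting() {
        when(crmService.someMethod(any(String.class), eq(5))).thenReturn("Hello from mock object.");

        // assumes OrderService.greet(name, count) delegates to crmService.someMethod(name, count)
        assertEquals("Hello from mock object.", orderService.greet("bob", 5));
        verify(crmService).someMethod("bob", 5);
    }
}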
Finally, if you need a full separate implementation, consider not auto-wiring services! Instead, use @Bean annotations in your configuration class and inject the beans via constructors into the classes that need them. Something like so:
@Configuration
public class ApplicationConfiguration {

    @Value("${service.crm.inmem}") // Injected property
    private boolean inMem;

    @Bean
    CRMService getCRMService() {
        if (inMem) {
            return new CRMServiceMock();
        }
        return new CRMServiceImpl();
    }

    @Bean
    OtherService getOtherService() {
        // Inject the CRMService interface into the constructor instead of auto-wiring it in OtherService.class
        return new OtherService(getCRMService());
    }
}
An example of when you could use this would be if you wanted to switch between an in-memory store and a real database-connection layer.
Personally, I'd suggest doing dependency injection like the above example even when there aren't multiple implementations, since as a project grows, if an auto-wired property fails it can be difficult to track down exactly why. Additionally, explicitly showing where dependencies come from can help with organizing your application and visualizing your application hierarchy.
When I set up Spring with XML I can override component definitions in XML files that are loaded later.
It's very useful for tests - I create a default config set and then load it together with an additional test configuration that replaces some of the components with special ones (stubs, mocks, and so on).
Now I'm starting to migrate to annotation-based configuration and it causes some problems.
The direct way to use annotations is auto-discovery of packages with @Component.
So I have
@Configuration
@ComponentScan({"some.pack1", "some.pack2"})
public class ProductConfig {}
And then:
@Configuration
@Import({ProductConfig.class})
@ComponentScan({"test.pack"})
public class TestConfig {}
But this causes a conflict if I try to override components in test.pack.
So what can I do?
After some investigation there are 3 approaches, each with some issues:
Worst - I can use @Filter on @ComponentScan - it's the worst way:
I must not import the existing config (which can have some additional beans)
I must rescan all components and explicitly define the set of filters
I can use @Profile and active profiles - it's better, though more sophisticated and implicit, but
it means the product classes must know that they can be disabled in some tests
I can avoid @ComponentScan on the overriding config and use @Bean instead of it -
it may be fine for test configurations, but it means I lose the ability to use the @Component annotation
I can use setParent on contexts - it works well, but
it's an explicit operation on the ApplicationContext implementation, not on the interface
it's hard to set up if overriding services have @Autowired dependencies on some components from the overridden config - it requires manual registration and refresh
What is the best and standard way to override configurations? When I used XML-based configuration it was not a problem...
@Profile plays a crucial role while implementing the testing strategy for your service/code.
For example, in development, you may have:
public interface DataSource {
    public String getHost();
}
The default implementation is:
@Component
@Profile("Prod")
public class DevDataSource implements DataSource {

    public String getHost() {
        // return actual value
    }
}
And the implementation for component tests (a fake impl):
@Component
@Profile("test")
public class StubbyDataSource implements DataSource {

    public String getHost() {
        return "some-host"; // return mocked data
    }
}
Now you can write a test here which can act as an integration test, unit test or component test (https://martinfowler.com/bliki/ComponentTest.html).
In that way, your testing strategy would be much more elegant, concise and easy to maintain. Just by changing the profile, the same test can point to different environments (real or fake).
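For instance, a test can select the profile with a single annotation (a minimal sketch using spring-test; AppConfig stands in for whatever @Configuration class wires up these components):

import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ActiveProfiles;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@ContextConfiguration(classes = AppConfig.class)
@ActiveProfiles("test") // switch the profile and the same test runs against the real implementation
public class DataSourceTest {

    @Autowired
    private DataSource dataSource; // resolves to StubbyDataSource under the "test" profile

    @Test
    public void returnsStubbedHost() {
        assertEquals("some-host", dataSource.getHost());
    }
}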
I have a Spring application consisting of multiple modules. One of these modules requires certain Spring beans to be present in the context (it cannot run standalone as it does not have a complete context itself).
This module provides basic functionality that needs to be shared amongst many applications that customize this module by making the correct beans available (singleton or request scoped, depending on needs).
This works perfectly and we're very happy with this setup, as it provides a separation between core functionality and business-specific logic.
My question is now: I have a class that can optionally be used to satisfy one of the dependencies. It is not annotated with @Component, to prevent it from being scanned; however, I would like the projects to be able to choose to use this class or supply their own implementation.
The core module looks like this:
public interface AProvider {}

@Component
public class AService {

    @Inject
    private AProvider aProvider;
}
And it provides this implementation that can optionally be used:
public class DatabaseBasedAProvider implements AProvider {

    @Inject
    private SomeOtherDependency dependency; // <-- this needs to be injected still if used!
}
An example project that uses the core module must then make sure that one bean of type AProvider is present in the context. This can be achieved like this:
@Configuration
public class Configuration {

    @Bean
    AProvider getAProvider() {
        return new OurOwnAProviderImplementation();
    }
}
What I would like though is something like:
@BeanClass // <-- some annotation I made up
Class<? extends AProvider> getAProviderClass() {
    return DatabaseBasedAProvider.class; // <-- have Spring inject this!
}
What I don't want is:
@Bean
AProvider getAProvider() {
    return new DatabaseBasedAProvider( ... add dependencies here myself ... );
}
I have solved a case similar to yours (if I understand correctly) using the @Primary annotation. Might be something for you.
public interface AProvider { }
For every module to have some implementation of the interface, create a default implementation that is shared.
@Service
public class DefaultAProvider implements AProvider {}
Then, if some module wishes to use its own implementation, "override" the bean using @Primary.
@Primary
@Service
public class MyVerySpecialAProvider implements AProvider {}
Then, anytime you inject AProvider, Spring will pick the @Primary implementation.
One alternative would be to use @Profile; another would be to annotate your AProvider implementations with @Component in combination with @ConditionalOnProperty and document the different choices for your consumers.
Example
@Component
@ConditionalOnProperty(name = "my.aprovider.choice", havingValue = "database")
public class DatabaseBasedAProvider implements AProvider {

    @Inject
    private SomeOtherDependency dependency; // <-- this needs to be injected still if used!
}
I've found a solution that allows me to decide at the client what class I want to use for AProvider.
It is not super nice, but it does mean I don't need to make specific changes to the code in the core module (as this module is supposed to be generic).
In a @Configuration class in the client's config I'm now doing this:
@Component
static class MyDatabaseBasedAProvider extends DatabaseBasedAProvider {
    // No implementation
}
This makes Spring construct the class and handle all the injections. It could be shorter and it does require the class to be non-final but it works.
The client is now alerted if the bean is missing, is free to make their own implementation, and is free to pick one of the existing implementations if one suits their needs, without the core module having to decide beforehand how AProvider might be supplied.