How can I ensure Spring constructs a bean once - java

I have a complex enterprise environment with multiple projects depending on each other. Configuration is handled by Spring and includes a crazy number of imports and whatnot.
It looks like, by default, Spring is OK with constructing a bean with the same name more than once, particularly in the case of multiple Spring contexts. Each context creates its own instance of the same singleton bean. A singleton is not really a singleton in Spring architects' minds...
Unfortunately, in my case there's a bean that must never be created more than once.
Is there a way to make Spring check whether the bean has already been created and not call its constructor again?
In particular, the bean creates ehcache's CacheManager internally and fails, because a CacheManager with the same name can't be created twice.
<bean id="cacheService" class="somepackage.CacheServiceImpl">
<constructor-arg index="0" value="/somepackage/ehcache.xml" />
</bean>
I don't have control over the CacheServiceImpl code. I can only change configuration around it.

Different Spring application contexts don't know about each other. As far as each context is concerned, your object is a singleton. However, it sounds like you don't want there to be multiple contexts.
How are you creating contexts? Your application should create a single instance of ApplicationContext (probably in main or somewhere nearby). Anything else that needs an application context should have it injected or be made ApplicationContextAware.
http://docs.spring.io/spring/docs/current/spring-framework-reference/html/beans.html#beans-factory-instantiation
You mention a "complex enterprise environment". What's worked best for us where I work is having a single project manage Spring. Assuming all your projects are being run in the same application (otherwise Spring can't really help you), there's likely some project that starts everything off. For us, that's the project that is built into a war and deployed to our server.

I finally figured it out. The reason the context was being loaded twice was a hidden error that occurred during the first context load. A long and boring explanation follows.
The context is supposed to be loaded during the first invocation of the Spring method DefaultTestContext.getApplicationContext().
There's a class, DefaultCacheAwareContextLoaderDelegate, which is responsible for actually loading the context once and caching it for future use. The method's code is below:
@Override
public ApplicationContext loadContext(MergedContextConfiguration mergedContextConfiguration) {
    synchronized (this.contextCache) {
        ApplicationContext context = this.contextCache.get(mergedContextConfiguration);
        if (context == null) {
            try {
                context = loadContextInternal(mergedContextConfiguration);
                if (logger.isDebugEnabled()) {
                    logger.debug(String.format("Storing ApplicationContext in cache under key [%s]",
                            mergedContextConfiguration));
                }
                this.contextCache.put(mergedContextConfiguration, context);
            }
            catch (Exception ex) {
                throw new IllegalStateException("Failed to load ApplicationContext", ex);
            }
        }
        else {
            if (logger.isDebugEnabled()) {
                logger.debug(String.format("Retrieved ApplicationContext from cache with key [%s]",
                        mergedContextConfiguration));
            }
        }
        this.contextCache.logStatistics();
        return context;
    }
}
Obviously, when a Throwable other than an Exception is thrown during the loadContextInternal() invocation, the application context doesn't get put into the cache.
Apparently, even when a single JUnit test is run, the getApplicationContext() method is called more than once, with the expectation that the second time the context won't need to be loaded again because it is already cached.
If an Error is thrown during the first load, it gets buried, and JUnit/Spring continues with the test until it calls getApplicationContext() a second time. That's where it catches the exception and crashes, because ehcache doesn't expect a CacheManager to be initialized more than once.

Related

@ConditionalOnBean(KafkaTemplate.class) crashes entire application

I have a Spring Boot application that consumes data from a Kafka topic and sends email notifications with the data received from Kafka:
@Bean
public EmailService emailService() {
    return new EmailServiceImpl(getJavaMailSender());
}
It works perfectly,
but after I added @ConditionalOnBean:
@Bean
@ConditionalOnBean(KafkaTemplate.class)
public EmailService emailService() {
    return new EmailServiceImpl(getJavaMailSender());
}
the application failed to start:
required a bean of type 'com.acme.EmailService' that could not be
found.
And I can't find any explanation of how this is possible, because the KafkaTemplate bean is automatically created by Spring in the KafkaAutoConfiguration class.
Could you please give me an explanation?
From the documentation:
The condition can only match the bean definitions that have been
processed by the application context so far and, as such, it is
strongly recommended to use this condition on auto-configuration
classes only. If a candidate bean may be created by another
auto-configuration, make sure that the one using this condition runs
after.
This documentation says pretty clearly what might be wrong here. The Kafka auto-configuration (KafkaAutoConfiguration) creates the KafkaTemplate bean, but it may not have been added to the bean registry yet at the point where the condition was checked. Either declare the conditional bean in an auto-configuration class of its own, or make sure the ordering of the different configuration classes guarantees that KafkaTemplate is already in the bean registry before that conditional check runs.
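A hedged sketch of the first option, assuming the bean is moved into an auto-configuration class of its own (the com.acme package comes from the error message, injecting JavaMailSender as a parameter replaces the question's getJavaMailSender() helper, and the class still has to be registered as an auto-configuration, e.g. in META-INF/spring.factories):
import com.acme.EmailService;
import com.acme.EmailServiceImpl;

import org.springframework.boot.autoconfigure.AutoConfigureAfter;
import org.springframework.boot.autoconfigure.condition.ConditionalOnBean;
import org.springframework.boot.autoconfigure.kafka.KafkaAutoConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.mail.javamail.JavaMailSender;

// Evaluated after KafkaAutoConfiguration, so KafkaTemplate is already in the
// registry by the time the @ConditionalOnBean condition is checked.
@Configuration
@AutoConfigureAfter(KafkaAutoConfiguration.class)
public class EmailAutoConfiguration {

    @Bean
    @ConditionalOnBean(KafkaTemplate.class)
    public EmailService emailService(JavaMailSender mailSender) {
        return new EmailServiceImpl(mailSender);
    }
}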

Cleanly destroy Spring Application Context

I have been running into problems making sure that a Spring application context that I am destroying has completely gone away, and I can't see the object being garbage collected. When I look at the instance in VisualVM I can see that there are a number of outstanding references to both the context and its bean factory that remain once the context is closed and destroyed. These all appear to relate to the initial setup of the bean factory (during the refresh method of AbstractApplicationContext), which registers the bean factory and the context with various bean post-processors etc.
There do not appear to be any methods on the bean factory or on the application contexts (even the refreshable ones) that do more than remove the lowest-level reference to the bean factory. The result is that it appears to be leaking memory, and in certain circumstances this prevents the clean re-creation of a context.
I am asking because the software I am working on at the moment may dynamically create, destroy and then re-create the context (as modules are dynamically loaded and unloaded), and the leftover elements of the context and bean factory are causing problems with components such as spring-data-jpa (especially the proxy that binds the repository interfaces to the repository implementations).
Does anyone know of a way I can cleanly and completely remove a context and bean factory without having to completely shut down the VM that initially created it?
Having looked into this again recently, I noticed that I was overriding the doClose() method of the context to make sure beans were completely destroyed, but I was not calling super.doClose(), which meant that LiveBeansView.unregisterApplicationContext(), destroyBeans(), getLifecycleProcessor().onClose() and closeBeanFactory() were not being called.
I added this in, and most if not all contexts are now cleanly destroyed and garbage collected. I will assume that any outstanding contexts that are not destroyed are more likely issues in our own code with dangling references.
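A minimal sketch of the fix described above (the subclass name is made up; the important part is the super.doClose() call):
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class ModuleApplicationContext extends ClassPathXmlApplicationContext {

    public ModuleApplicationContext(String... configLocations) {
        super(configLocations);
    }

    @Override
    protected void doClose() {
        // ... module-specific cleanup goes here ...
        // Without this call, destroyBeans(), getLifecycleProcessor().onClose()
        // and closeBeanFactory() never run and references linger.
        super.doClose();
    }
}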
If you are using Spring's IoC container in a non-web application environment, for example in a rich client desktop environment, register a shutdown hook with the JVM. Doing so ensures a graceful shutdown and calls the relevant destroy methods on your singleton beans so that all resources are released. Of course, you must still configure and implement these destroy callbacks correctly.
To register a shutdown hook, call the registerShutdownHook() method that is declared on the AbstractApplicationContext class:
import org.springframework.context.support.AbstractApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public final class Startup {

    public static void main(final String[] args) throws Exception {
        AbstractApplicationContext ctx =
                new ClassPathXmlApplicationContext(new String[] {"beans.xml"});

        // add a shutdown hook for the above context...
        ctx.registerShutdownHook();

        // app runs here...

        // main method exits, hook is called prior to the app shutting down...
    }
}
Call destroy() on the context and set all variables that reference your application context instance to null:
AbstractApplicationContext context =
        new ClassPathXmlApplicationContext(new String[] {"beans.xml"});
// ... do your stuff
context.destroy();
context = null;

How to tell spring to only load the needed beans for the JUnit test?

A simple question that might have an advanced answer.
The Question:
My question is: is there a way to instantiate only those classes in your application context that are needed for a specific JUnit test?
The Reason:
My application context is getting quite big. I also do a lot of integration tests, so I guess you would understand when I say that every time I run a test, all the classes in my application context get instantiated, and this takes time.
The Example:
Say class Foo injects only Bar:
public class Foo {

    @Inject
    Bar bar;

    @Test
    public void testrunSomeMethod() throws RegisterFault {
        bar.runSomeMethod();
    }
}
but the application context has the beans foobar and bar. I know this is not a valid application context, but rest assured all my code works.
<beans>
    <bean id="foobar" class="some.package.FooBar"/>
    <bean id="bar" class="some.package.Bar"/>
</beans>
So how do I tell Spring to instantiate only Bar and ignore FooBar for the test class Foo?
Thank you.
Consider adding default-lazy-init="true" to the beans tag of your Spring context XML (or add lazy-init="true" to the specific beans that take a long time to start up).
This will ensure that only those beans are created that are requested with applicationContext.getBean(class-or-bean-name) or injected via @Autowired / @Inject into your tests. (Some other kinds of beans, such as @Scheduled beans, will be created nevertheless, but you need to check whether that's a problem or not.)
(If you use Spring Java configuration, add @Lazy to the configuration classes.)
Caveat: if a bean is neither initialized explicitly with applicationContext.getBean() nor injected as a dependency of a bean obtained via applicationContext.getBean(), then that bean will NO LONGER be constructed or initialized. Depending on your application, that can cause things to fail or not. Maybe you can selectively mark those beans as lazy-init="false".
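For the Java-configuration case, a minimal sketch using the Bar and FooBar beans from the question (class-level @Lazy here plays the role of default-lazy-init="true"; imports for Bar and FooBar are omitted since they are the question's own classes):
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Lazy;

@Configuration
@Lazy // every @Bean below is created only when something actually asks for it
public class LazyTestConfig {

    @Bean
    public Bar bar() {
        return new Bar(); // instantiated because the test injects Bar
    }

    @Bean
    public FooBar fooBar() {
        return new FooBar(); // never instantiated if no test touches it
    }
}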
Yes, we can do that, using a context per test case. Prepare a test context XML file with only the beans required for your test case.
If you use Maven, place the test-context.xml under the src/test/resources folder.
Annotate your test class with the following annotation:
@ContextConfiguration(locations = "classpath:test-application-context.xml")
This helps in loading only the specific beans needed for the test case.
If you have two kinds of test cases, then
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = "classpath:test-context-case1.xml")
public class TestClassCase1 {}

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = "classpath:test-context-case2.xml")
public class TestClassCase2 {}
It's not a direct answer, so I wouldn't mark it as the solution, but I hope it's helpful.
Generally I see three options.
As VinayVeluri answered nicely: create separate contexts and launch them in each test separately.
Create the context once for all tests, just like here: Reuse spring application context across junit test classes. It's a big optimization when running all tests at once.
Mix the first two points: create one smaller context only for testing purposes, and mock whatever is never actually tested but could throw NPEs etc., like here: Injecting Mockito mocks into a Spring bean, to speed up the context build. Then re-use it as in point 2, one build for all tests. Personally I'd go with this one; a rough sketch follows below.
This one is still waiting for an answer about some kind of smart test runner that creates the minimum needed context per test.
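A rough sketch of that third, mixed option (the configuration class name is made up; Bar and FooBar are the question's own classes, imports omitted; the bean under test stays real, everything else becomes a Mockito mock):
import org.mockito.Mockito;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class SmallTestConfig {

    @Bean
    public Bar bar() {
        return new Bar(); // the bean the tests actually exercise
    }

    @Bean
    public FooBar fooBar() {
        return Mockito.mock(FooBar.class); // not under test, only satisfies the wiring
    }
}
Point every test class at it with @ContextConfiguration(classes = SmallTestConfig.class); Spring's test context caching then builds it only once for the whole suite.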

Detecting unused Spring beans

Given a Spring configuration that exclusively contains eager (non-lazy) singleton beans, i.e. the defaults, is it possible to have Spring throw an exception in the case where any of those beans is not injected anywhere? I'm essentially looking for a way to detect dead code in the form of Spring beans.
My question is somewhat similar to these.
http://forum.spring.io/forum/spring-projects/container/116494-any-tools-or-method-to-identify-unused-spring-beans
Spring Instantiation and 'unused beans'
How to detect unused properties in Spring
However,
I'm not interested in manually inspecting a graph or parsing log data.
I don't have the added complexity of multiple context files, overriding beans, bean post-processing, or xml. It's a simple, straightforward, annotation-driven configuration.
I'm using Spring Boot 1.2.6 which is several years newer than those questions (maybe new functionality exists).
Spring will certainly throw an exception if a necessary bean is missing. Can it also throw an exception in the opposite scenario where a bean is found but unnecessary?
Spring will certainly throw an exception if a necessary bean is
missing. Can it also throw an exception in the opposite scenario where
a bean is found but unnecessary?
TL;DR:
Spring does not support this (and probably never will).
Long version:
Detecting whether a bean is used can be really hard.
First, let's define when Spring throws the "missing bean" exception.
During the initialisation of the Spring context, Spring creates the beans in an order that allows all their dependencies to be satisfied (if possible). If a bean is missing a dependency, Spring throws an exception (as you said).
So, the exception is thrown during the Spring context initialisation process.
Now, you could say that we could monitor this process and look for beans that are not used as a dependency by any other bean.
The problem is that not all bean dependencies are defined during the Spring context initialisation process.
Let's look at the following example:
First, we have a simple interface, DataService
public interface DataService {
String getData();
}
Now we have two Spring beans that implement this interface:
#Service("firstDataService")
public class FirstDataService implements DataService {
#Override
public String getData() {
return "FIRST DATA SERVICE";
}
}
#Service("secondDataService")
public class SecondDataService implements DataService {
#Override
public String getData() {
return "SECOND DATA SERVICE";
}
}
Now, imagine that there is no bean that depends on these two beans directly. By directly, I mean that no bean depends on them via constructor-based, setter-based or field-based dependency injection.
Because of that, Spring will not inject these beans into any other bean during the context initialisation process.
Now, consider the following bean:
@Service
public class DataCollector {

    @Autowired
    ApplicationContext applicationContext;

    String getDataFromService(String beanName) {
        DataService ds = (DataService) applicationContext.getBean(beanName);
        return ds.getData();
    }
}
If I call the getDataFromService method of the DataCollector bean with the value "firstDataService" for the beanName parameter, it returns "FIRST DATA SERVICE".
If I call the method with "secondDataService", it returns "SECOND DATA SERVICE".
Now, when Spring looks at the definition of DataCollector during context initialisation, there is no way for it to determine which beans DataCollector depends on.
It all depends on the application logic and the value we use for the beanName parameter when we call the getDataFromService method.
Because of that, Spring is not capable of determining whether there is a bean that is never used (because bean usage can be dynamic, as in the case above).
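If a rough approximation is still useful despite that, a heuristic sketch is possible (the class name is made up); it lists beans that no other bean declares as a dependency, but it will wrongly flag beans that are only looked up dynamically, exactly like the ones DataCollector fetches above:
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
import org.springframework.context.ConfigurableApplicationContext;

public class UnusedBeanReporter {

    // Prints beans with no declared dependents. Dynamic getBean(name) lookups
    // are invisible here, so treat the output as candidates, not facts.
    public static void reportCandidates(ConfigurableApplicationContext context) {
        ConfigurableListableBeanFactory beanFactory = context.getBeanFactory();
        for (String name : beanFactory.getBeanDefinitionNames()) {
            if (beanFactory.getDependentBeans(name).length == 0) {
                System.out.println("No declared dependents: " + name);
            }
        }
    }
}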

Is closing an ApplicationContext always (after its use) a good idea?

I am working on a Spring Batch application, where I have categorised the Spring XML configs into 4 files. The first 2 files comprise the Spring Batch database and core application bean definitions (namely database.xml and context.xml). The next 2 files are dynamic, in the sense that they depend on which batch script is to be executed.
So executing a batch script comprises loading context.xml and database.xml plus the 2 script-related files.
For example, to execute "batch script 1" I have to load batch1.xml and tasklet1.xml along with the 2 core configs; to execute "batch script 2" I have to load batch2.xml and tasklet2.xml along with the 2 core configs, and so on.
Due to this scenario, I need to create a new ApplicationContext whenever a request to run a batch script comes in. Right now, once I create and use my ApplicationContext, I destroy it by calling the close() method. My question is: is creating and destroying the ApplicationContext for each run a good idea (performance- and memory-wise), or is there a better alternative approach?
EDIT: I am already using hierarchical contexts. That means that for the 2 core configs I create an ApplicationContext and keep it in memory (a static variable), and for each new request I create a new ApplicationContext with the core context as its parent.
public void runBatch(String batchXmlLoc, String taskletXmlLoc) {
    ApplicationContext context = new ClassPathXmlApplicationContext(
            new String[] {batchXmlLoc, taskletXmlLoc}, getParent());
    // ... do the work ...
    ((ClassPathXmlApplicationContext) context).close();
}

private static ApplicationContext parent;

private ApplicationContext getParent() {
    if (parent == null) {
        parent = new ClassPathXmlApplicationContext("database.xml", "context.xml");
    }
    return parent;
}
Unless you need to restart the context after use (which would indicate a coding error anyway), you could simply use hierarchical contexts.
Create a root context with your database and core definitions and then create two child contexts using the following constructor:
public ClassPathXmlApplicationContext(String[] configLocations,
        ApplicationContext parent)
The two child contexts won't see or interfere with each other.
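A minimal sketch of that layout, using the file names from the question:
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class BatchContexts {

    public static void main(String[] args) {
        // Root context with the core definitions, created once.
        ApplicationContext root =
                new ClassPathXmlApplicationContext("database.xml", "context.xml");

        // Sibling child contexts: each sees the root's beans, neither sees the other's.
        ClassPathXmlApplicationContext batch1 = new ClassPathXmlApplicationContext(
                new String[] {"batch1.xml", "tasklet1.xml"}, root);
        ClassPathXmlApplicationContext batch2 = new ClassPathXmlApplicationContext(
                new String[] {"batch2.xml", "tasklet2.xml"}, root);

        // ... run the batches ...

        batch1.close();
        batch2.close();
    }
}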
If you need to restart the context, you should consider refactoring your setup. A Spring context should not contain any stateful information like this. A possible solution might be to create a factory bean that takes the run arguments and builds the necessary classes for the batch run.
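A very rough sketch of that factory idea (every name here is made up; the point is only that per-run state lives in objects built by the factory, not in a context that has to be restarted):
import org.springframework.stereotype.Component;

@Component
public class BatchRunFactory {

    // Builds the objects needed for one run from the run arguments.
    public BatchRun create(String batchName, String taskletName) {
        return new BatchRun(batchName, taskletName);
    }

    public static class BatchRun {
        private final String batchName;
        private final String taskletName;

        BatchRun(String batchName, String taskletName) {
            this.batchName = batchName;
            this.taskletName = taskletName;
        }

        public void execute() {
            // ... run the batch logic for this particular configuration ...
        }
    }
}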
