I recently updated my Spring Boot app from 2.1.9 to 2.2.0 and I'm facing a problem. When I call "configprops" from the actuator endpoint, an exception is thrown:
Scope 'job' is not active for the current thread
I reproduced the bug here: https://github.com/guillaumeyan/bugspringbatch (just launch the test). The original project comes from https://github.com/spring-guides/gs-batch-processing/tree/master/complete
I tried to add :
@Bean
public StepScope stepScope() {
    final StepScope stepScope = new StepScope();
    stepScope.setAutoProxy(true);
    return stepScope;
}
but it does not work (with spring.main.allow-bean-definition-overriding=true)
Here is my Spring Batch configuration:
@Bean
@JobScope
public RepositoryItemReader<DossierEntity> dossierToDiagnosticReader(PagingAndSortingRepository<DossierEntity, Long> dossierJpaRepository, @Value("#{jobParameters[origin]}") String origin) {
    RepositoryItemReader<DossierEntity> diagnosticDossierReader = new RepositoryItemReader<>();
    diagnosticDossierReader.setRepository(dossierJpaRepository);
    diagnosticDossierReader.setMethodName("listForBatch");
    // doing some stuff with origin
    return diagnosticDossierReader;
}
ExceptionHandlerExceptionResolver[199] - Resolved [org.springframework.beans.factory.BeanCreationException:
Error creating bean with name 'scopedTarget.dossierToDiagnosticReader': Scope 'job' is not active for the current thread;
consider defining a scoped proxy for this bean if you intend to refer to it from a singleton; nested exception is java.lang.IllegalStateException: No context holder available for job scope]
I downloaded your project and was able to reproduce the case. There are two issues with your example:
You are defining a job-scoped bean in your app, but the JobScope is not defined in your context (and you are not using the @EnableBatchProcessing annotation that adds it automatically to the context). If you want to use the job scope without @EnableBatchProcessing, you need to add it manually to the context (see the sketch after this list).
Your test fails because there is no job running during your test. Job scoped beans are lazily instantiated when a job is actually run. Since your test does not start a job, the bean is not able to be proxied correctly.
Since your test does not seem to test a batch job, I would exclude the job-scoped bean from the test's context.
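If you really do want the job scope without @EnableBatchProcessing, a minimal sketch of registering it by hand might look like the following (mirroring the StepScope attempt above; whether you need setAutoProxy depends on your setup):
@Bean
public static JobScope jobScope() {
    // JobScope is a BeanFactoryPostProcessor, hence the static @Bean method
    final JobScope jobScope = new JobScope();
    jobScope.setAutoProxy(true);
    return jobScope;
}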
The bug was resolved in Spring Boot 2.2.1: https://github.com/spring-projects/spring-boot/issues/18714
I am using Spring Boot 1.5.7 and Apache Camel 2.19.3, with the Spring Boot auto-configuration provided by spring-boot-starter-camel.
It is a pretty basic Spring Boot and Camel setup, initialized as in their tutorial, so we have a RouteBuilder component that does exactly that.
@Component
public class CIFRoutes extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        // build routes
    }
}
We have a Configuration that defines some beans we need in our application
@Configuration
public class MyConfiguration {

    @Bean
    public Map<String, Object> map() {
        return new HashMap<>();
    }
}
Finally, we have a custom InflightRepository implementation that should be picked up by the auto-configuration and added to the CamelContext, which basically works. But for some reason the component doesn't get initialized properly: its dependencies are not injected, even though the bean itself is instantiated and wired into my application.
@Component
public class MyCustomInflightRepository extends DefaultInflightRepository {

    @Autowired
    private Map<String, Object> map;

    @Override
    public void add(Exchange exchange) {
        super.add(exchange);
        // ....
    }
}
The problem now is that map remains null. I also tried adding a @PostConstruct initializer method, but it doesn't get called.
As far as I was able to reconstruct, it seems to be connected to premature bean lookup in CamelAutoConfiguration, where the CamelContext bean gets instantiated (done in the private method afterPropertiesSet):
InflightRepository inflightRepository = getSingleBeanOfType(applicationContext, InflightRepository.class);
if (inflightRepository != null) {
    LOG.info("Using custom InflightRepository: {}", inflightRepository);
    camelContext.setInflightRepository(inflightRepository);
}
If MyCustomInflightRepository doesn't implement InflightRepository, the bean is initialized correctly, but indeed not recognized by Camel. When disabling auto-configuration, the bean's dependencies are injected.
So, either I'm doing the impossible by Spring standards or there's something fishy with the Camel component for Spring.
I'm a bit quick on resolving this (I wanted to post this two days ago already^^), but a colleague figured out what could be the problem.
When using CamelAutoConfiguration, the InflightRepository bean (or practically anything for which Camel tries to get a matching bean here) is looked up from the context before the property resolvers are fully initialized, which leads to the bean being created (and cached in the context) before any auto-wired properties can be resolved.
I'm not a Spring expert but this behavior is a bit problematic in my opinion because uninitialized beans are pulled into the CamelContext when you rely on Spring DI for your custom components.
To be sure, I'll raise this with the maintainers...
By the way, my simple workaround was to set the in-flight repository manually in the context configuration (as suggested):
@Bean
public CamelContextConfiguration camelConfig() {
    return new CamelContextConfiguration() {

        @Override
        public void beforeApplicationStart(CamelContext context) {
            context.setInflightRepository(new MyCustomInflightRepository(/* Dependencies go here */));
        }

        @Override
        public void afterApplicationStart(CamelContext camelContext) {
        }
    };
}
Also, it seems to be an issue when using camel-http-starter in your project, which isn't recommended anyway; they claim it is deprecated.
So, either don't rely on DI (regardless of whether it's property or constructor injection) for your Camel-managed beans, or skip that starter.
The problem is that a Map<String,Object> is too vague for Spring to be able to understand what you want; I think the default behavior is that it'll give you all beans keyed by name.
Instead, be more specific, or possibly provide the necessary parameters as constructor arguments and configure them explicitly in a @Bean method (it's a good idea to always use constructor injection anyway).
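For example, a minimal sketch of that approach, assuming MyCustomInflightRepository is changed to take its configuration through its constructor (the bean and key names here are made up):
@Configuration
public class InflightRepositoryConfig {

    @Bean
    public MyCustomInflightRepository myCustomInflightRepository() {
        // build the dependencies explicitly instead of autowiring a generic Map<String, Object>
        Map<String, Object> settings = new HashMap<>();
        settings.put("someKey", "someValue");
        return new MyCustomInflightRepository(settings);
    }
}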
Please explain the following about NoSuchBeanDefinitionException exception in Spring:
What does it mean?
Under what conditions will it be thrown?
How can I prevent it?
This post is designed to be a comprehensive Q&A about occurrences of NoSuchBeanDefinitionException in applications using Spring.
The javadoc of NoSuchBeanDefinitionException explains
Exception thrown when a BeanFactory is asked for a bean instance for
which it cannot find a definition. This may point to a non-existing
bean, a non-unique bean, or a manually registered singleton instance
without an associated bean definition.
A BeanFactory is basically the abstraction representing Spring's Inversion of Control container. It exposes beans internally and externally, to your application. When it cannot find or retrieve these beans, it throws a NoSuchBeanDefinitionException.
Below are simple reasons why a BeanFactory (or related classes) would not be able to find a bean and how you can make sure it does.
The bean doesn't exist, it wasn't registered
In the example below
@Configuration
public class Example {
    public static void main(String[] args) throws Exception {
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(Example.class);
        ctx.getBean(Foo.class);
    }
}

class Foo {}
we haven't registered a bean definition for the type Foo either through a @Bean method, @Component scanning, an XML definition, or any other way. The BeanFactory managed by the AnnotationConfigApplicationContext therefore has no indication of where to get the bean requested by getBean(Foo.class). The snippet above throws
Exception in thread "main" org.springframework.beans.factory.NoSuchBeanDefinitionException:
No qualifying bean of type [com.example.Foo] is defined
Similarly, the exception could have been thrown while trying to satisfy an @Autowired dependency. For example,
@Configuration
@ComponentScan
public class Example {
    public static void main(String[] args) throws Exception {
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(Example.class);
    }
}

@Component
class Foo { @Autowired Bar bar; }

class Bar { }
Here, a bean definition is registered for Foo through @ComponentScan. But Spring knows nothing of Bar. It therefore fails to find a corresponding bean while trying to autowire the bar field of the Foo bean instance. It throws (nested inside an UnsatisfiedDependencyException)
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException:
No qualifying bean of type [com.example.Bar] found for dependency [com.example.Bar]:
expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {@org.springframework.beans.factory.annotation.Autowired(required=true)}
There are multiple ways to register bean definitions.
@Bean method in a @Configuration class or <bean> in XML configuration
@Component (and its meta-annotations, eg. @Repository) through @ComponentScan or <context:component-scan ... /> in XML
Manually through GenericApplicationContext#registerBeanDefinition
Manually through BeanDefinitionRegistryPostProcessor
...and more.
Make sure the beans you expect are properly registered.
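As an illustration of the programmatic route, here is a minimal sketch using GenericApplicationContext (reusing the Foo class from above):
GenericApplicationContext ctx = new GenericApplicationContext();
// register a bean definition for Foo by hand
ctx.registerBeanDefinition("foo", new RootBeanDefinition(Foo.class));
ctx.refresh();

Foo foo = ctx.getBean(Foo.class); // now resolvable, no exception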
A common error is to register beans multiple times, ie. mixing the options above for the same type. For example, I might have
@Component
public class Foo {}
and an XML configuration with
<context:component-scan base-package="com.example" />
<bean name="eg-different-name" class="com.example.Foo" />
Such a configuration would register two beans of type Foo, one with name foo and another with name eg-different-name. Make sure you're not accidentally registering more beans than you wanted. Which leads us to...
If you're using both XML and annotation-based configurations, make sure you import one from the other. XML provides
<import resource=""/>
while Java provides the @ImportResource annotation.
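For example, a minimal sketch (the XML file name is purely illustrative):
@Configuration
@ImportResource("classpath:legacy-beans.xml") // pulls the XML bean definitions into this Java config
public class AppConfig {
}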
Expected single matching bean, but found 2 (or more)
There are times when you need multiple beans for the same type (or interface). For example, your application may use two databases, a MySQL instance and an Oracle one. In such a case, you'd have two DataSource beans to manage connections to each one. For (simplified) example, the following
@Configuration
public class Example {
    public static void main(String[] args) throws Exception {
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(Example.class);
        System.out.println(ctx.getBean(DataSource.class));
    }

    @Bean(name = "mysql")
    public DataSource mysql() { return new MySQL(); }

    @Bean(name = "oracle")
    public DataSource oracle() { return new Oracle(); }
}

interface DataSource {}

class MySQL implements DataSource {}

class Oracle implements DataSource {}
throws
Exception in thread "main" org.springframework.beans.factory.NoUniqueBeanDefinitionException:
No qualifying bean of type [com.example.DataSource] is defined:
expected single matching bean but found 2: oracle,mysql
because both beans registered through @Bean methods satisfied the requirement of BeanFactory#getBean(Class), ie. they both implement DataSource. In this example, Spring has no mechanism to differentiate or prioritize between the two. But such mechanisms exist.
You could use @Primary (and its equivalent in XML) as described in the documentation and in this post. With this change
#Bean(name = "mysql")
#Primary
public DataSource mysql() { return new MySQL(); }
the previous snippet would not throw the exception and would instead return the mysql bean.
You can also use @Qualifier (and its equivalent in XML) to have more control over the bean selection process, as described in the documentation. While @Autowired is primarily used to autowire by type, @Qualifier lets you autowire by name. For example,
#Bean(name = "mysql")
#Qualifier(value = "main")
public DataSource mysql() { return new MySQL(); }
could now be injected as
#Qualifier("main") // or #Qualifier("mysql"), to use the bean name
private DataSource dataSource;
without issue. @Resource is also an option.
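For instance, a minimal sketch injecting by bean name with @Resource:
@Resource(name = "mysql")
private DataSource dataSource;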
Using wrong bean name
Just as there are multiple ways to register beans, there are also multiple ways to name them.
@Bean has name
The name of this bean, or if plural, aliases for this bean. If left
unspecified the name of the bean is the name of the annotated method.
If specified, the method name is ignored.
<bean> has the id attribute to represent the unique identifier for a bean and name can be used to create one or more aliases illegal in an (XML) id.
@Component and its meta-annotations have value
The value may indicate a suggestion for a logical component name, to
be turned into a Spring bean in case of an autodetected component.
If that's left unspecified, a bean name is automatically generated for the annotated type, typically the lower camel case version of the type name. For example, MyClassName becomes myClassName as its bean name. Bean names are case sensitive. Also note that wrong names/capitalization typically occur in beans referred to by string, like @DependsOn("my BeanName"), or in XML config files.
@Qualifier, as mentioned earlier, lets you add more aliases to a bean.
Make sure you use the right name when referring to a bean.
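For example, a minimal sketch showing the generated name in action (the class name is made up):
@Component
class MyClassName {}

// somewhere with access to the ApplicationContext
MyClassName bean = (MyClassName) ctx.getBean("myClassName"); // generated, lower camel case name
// ctx.getBean("MyClassName") would throw NoSuchBeanDefinitionException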
More advanced cases
Profiles
Bean definition profiles allow you to register beans conditionally. @Profile, specifically,
Indicates that a component is eligible for registration when one or
more specified profiles are active.
A profile is a named logical grouping that may be activated
programmatically via
ConfigurableEnvironment.setActiveProfiles(java.lang.String...) or
declaratively by setting the spring.profiles.active property as a JVM
system property, as an environment variable, or as a Servlet context
parameter in web.xml for web applications. Profiles may also be
activated declaratively in integration tests via the @ActiveProfiles
annotation.
Consider this example, where the spring.profiles.active property is not set.
@Configuration
@ComponentScan
public class Example {
    public static void main(String[] args) throws Exception {
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(Example.class);
        System.out.println(Arrays.toString(ctx.getEnvironment().getActiveProfiles()));
        System.out.println(ctx.getBean(Foo.class));
    }
}

@Profile(value = "StackOverflow")
@Component
class Foo {
}
This will show no active profiles and throw a NoSuchBeanDefinitionException for a Foo bean. Since the StackOverflow profile wasn't active, the bean wasn't registered.
Instead, if I initialize the ApplicationContext while registering the appropriate profile
AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext();
ctx.getEnvironment().setActiveProfiles("StackOverflow");
ctx.register(Example.class);
ctx.refresh();
the bean is registered and can be returned/injected.
AOP Proxies
Spring uses AOP proxies a lot to implement advanced behavior. Some examples include:
Transaction management with @Transactional
Caching with @Cacheable
Scheduling and asynchronous execution with @Async and @Scheduled
To achieve this, Spring has two options:
Use the JDK's Proxy class to create an instance of a dynamic class at runtime which only implements your bean's interfaces and delegates all method invocations to an actual bean instance.
Use CGLIB proxies to create an instance of a dynamic class at runtime which implements both interfaces and concrete types of your target bean and delegates all method invocations to an actual bean instance.
Take this example of JDK proxies (achieved through @EnableAsync's default proxyTargetClass of false)
@Configuration
@EnableAsync
public class Example {
    public static void main(String[] args) throws Exception {
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(Example.class);
        System.out.println(ctx.getBean(HttpClientImpl.class).getClass());
    }
}

interface HttpClient {
    void doGetAsync();
}

@Component
class HttpClientImpl implements HttpClient {
    @Async
    public void doGetAsync() {
        System.out.println(Thread.currentThread());
    }
}
Here, Spring attempts to find a bean of type HttpClientImpl which we expect to find because the type is clearly annotated with @Component. However, instead, we get an exception
Exception in thread "main" org.springframework.beans.factory.NoSuchBeanDefinitionException:
No qualifying bean of type [com.example.HttpClientImpl] is defined
Spring wrapped the HttpClientImpl bean and exposed it through a Proxy object that only implements HttpClient. So you could retrieve it with
ctx.getBean(HttpClient.class) // returns a dynamic class: com.example.$Proxy33
// or
@Autowired private HttpClient httpClient;
It's always recommended to program to interfaces. When you can't, you can tell Spring to use CGLIB proxies. For example, with @EnableAsync, you can set proxyTargetClass to true. Similar annotations (@EnableTransactionManagement, etc.) have similar attributes. XML will also have equivalent configuration options.
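For example, a minimal sketch of the CGLIB variant:
@Configuration
@EnableAsync(proxyTargetClass = true) // CGLIB proxies subclass the target type
public class Example {
    // ctx.getBean(HttpClientImpl.class) now resolves the proxied bean
}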
ApplicationContext Hierarchies - Spring MVC
Spring lets you build ApplicationContext instances with other ApplicationContext instances as parents, using ConfigurableApplicationContext#setParent(ApplicationContext). A child context will have access to beans in the parent context, but the opposite is not true. This post goes into detail about when this is useful, particularly in Spring MVC.
In a typical Spring MVC application, you define two contexts: one for the entire application (the root) and one specifically for the DispatcherServlet (routing, handler methods, controllers). You can get more details here:
Difference between applicationContext.xml and spring-servlet.xml in Spring Framework
It's also very well explained in the official documentation, here.
A common error in Spring MVC configurations is to declare the WebMVC configuration in the root context with @EnableWebMvc annotated @Configuration classes or <mvc:annotation-driven /> in XML, but the @Controller beans in the servlet context. Since the root context cannot reach into the servlet context to find any beans, no handlers are registered and all requests fail with 404s. You won't see a NoSuchBeanDefinitionException, but the effect is the same.
Make sure your beans are registered in the appropriate context, ie. where they can be found by the beans registered for WebMVC (HandlerMapping, HandlerAdapter, ViewResolver, ExceptionResolver, etc.). The best solution is to properly isolate beans. The DispatcherServlet is responsible for routing and handling requests so all related beans should go into its context. The ContextLoaderListener, which loads the root context, should initialize any beans the rest of your application needs: services, repositories, etc.
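As a sketch of that separation with Java config (the class names RootConfig and WebConfig are placeholders):
public class WebAppInitializer extends AbstractAnnotationConfigDispatcherServletInitializer {

    @Override
    protected Class<?>[] getRootConfigClasses() {
        // services, repositories, and other application-wide beans
        return new Class<?>[] { RootConfig.class };
    }

    @Override
    protected Class<?>[] getServletConfigClasses() {
        // @EnableWebMvc configuration, @Controller beans, view resolvers
        return new Class<?>[] { WebConfig.class };
    }

    @Override
    protected String[] getServletMappings() {
        return new String[] { "/" };
    }
}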
Arrays, collections, and maps
Beans of some known types are handled in special ways by Spring. For example, if you tried to inject an array of MovieCatalog into a field
@Autowired
private MovieCatalog[] movieCatalogs;
Spring will find all beans of type MovieCatalog, wrap them in an array, and inject that array. This is described in the Spring documentation discussing @Autowired. Similar behavior applies to Set, List, and Collection injection targets.
For a Map injection target, Spring will also behave this way if the key type is String. For example, if you have
@Autowired
private Map<String, MovieCatalog> movies;
Spring will find all beans of type MovieCatalog and add them as values to a Map, where the corresponding key will be their bean name.
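For example, a minimal sketch of consuming such a map (the catalog bean names are hypothetical):
@Component
class MovieRecommender {

    private final Map<String, MovieCatalog> catalogs;

    @Autowired
    public MovieRecommender(Map<String, MovieCatalog> catalogs) {
        // e.g. {"actionCatalog" -> ..., "comedyCatalog" -> ...}, keyed by bean name
        this.catalogs = catalogs;
    }
}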
As described previously, if no beans of the requested type are available, Spring will throw a NoSuchBeanDefinitionException. Sometimes, however, you just want to declare a bean of these collection types like
@Bean
public List<Foo> fooList() {
    return Arrays.asList(new Foo());
}
and inject them
@Autowired
private List<Foo> foos;
In this example, Spring would fail with a NoSuchBeanDefinitionException because there are no Foo beans in your context. But you didn't want a Foo bean, you wanted a List<Foo> bean. Before Spring 4.3, you'd have to use @Resource
For beans that are themselves defined as a collection/map or array
type, @Resource is a fine solution, referring to the specific
collection or array bean by unique name. That said, as of 4.3,
collection/map and array types can be matched through Spring’s
@Autowired type matching algorithm as well, as long as the element
type information is preserved in @Bean return type signatures or
collection inheritance hierarchies. In this case, qualifier values can
be used to select among same-typed collections, as outlined in the
previous paragraph.
This works for constructor, setter, and field injection.
@Resource
private List<Foo> foos;

// or since 4.3
public Example(@Autowired List<Foo> foos) {}
However, it will fail for @Bean methods, ie.
@Bean
public Bar other(List<Foo> foos) {
    return new Bar(foos);
}
Here, Spring ignores any @Resource or @Autowired annotating the method, because it's a @Bean method, and therefore can't apply the behavior described in the documentation. However, you can use Spring Expression Language (SpEL) to refer to beans by their name. In the example above, you could use
@Bean
public Bar other(@Value("#{fooList}") List<Foo> foos) {
    return new Bar(foos);
}
to refer to the bean named fooList and inject that.
I'm using Spring's @Scheduled annotation for the first time. I've enabled scheduling in my configuration, and the scheduled method works perfectly.
My problem is that Spring doesn't inject the required beans into my bean with the @Scheduled annotated method. What I actually get is a LockService proxy object, which is useless in this case.
Is there any way to inject Spring beans into a bean with a scheduled task?
I've tried the @Lazy annotation together with @Autowired, but the result was the same. Getting the bean from an injected ApplicationContext didn't work either.
Here is my bean whose dependencies are not injected:
@Component
public class LockScheduler extends AbstractScheduler {

    private final LockService lockService;

    @Autowired
    public LockScheduler(LockService lockService) {
        this.lockService = lockService;
    }

    @Scheduled(fixedDelay = 15000)
    public void removeAbandonedLocks() {
        if (!isSchedulingEnabled()) {
            return;
        }
        LOG.info("Execute abandoned lock remove");
        lockService.removeAbandonedLocks();
    }
}
I guess it has something to do with aspects, and that I can't reach the ApplicationContext from the bean with scheduled tasks.
UPDATE:
I've posted a screenshot from debug mode. Instead of a LockService object I get a proxy with a null reference to LockRepository, so the breakpointed method execution has no effect.
Given a Spring configuration that exclusively contains eager (non-lazy) singleton beans, i.e. the defaults, is it possible to have Spring throw an exception in the case where any of those beans is not injected anywhere? I'm essentially looking for a way to detect dead code in the form of Spring beans.
My question is somewhat similar to these.
http://forum.spring.io/forum/spring-projects/container/116494-any-tools-or-method-to-identify-unused-spring-beans
Spring Instantiation and 'unused beans'
How to detect unused properties in Spring
However,
I'm not interested in manually inspecting a graph or parsing log data.
I don't have the added complexity of multiple context files, overriding beans, bean post-processing, or xml. It's a simple, straightforward, annotation-driven configuration.
I'm using Spring Boot 1.2.6 which is several years newer than those questions (maybe new functionality exists).
Spring will certainly throw an exception if a necessary bean is missing. Can it also throw an exception in the opposite scenario where a bean is found but unnecessary?
Spring will certainly throw an exception if a necessary bean is
missing. Can it also throw an exception in the opposite scenario where
a bean is found but unnecessary?
TL;DR:
Spring does not support this (and probably never will).
Long version:
Detecting if a bean is used can be really hard.
First, let's define when Spring throws the "missing bean" exception.
During the initialisation of the Spring context, Spring creates the beans in an order that allows all dependencies to be satisfied (if possible). If a bean is missing a dependency, Spring will throw an exception (as you said).
So, the exception is thrown during the Spring context initialisation process.
Now, you could say that we could monitor this process and look for a bean that was not used as a dependency in any other bean.
The problem is that not all bean dependencies are defined during the Spring context initialisation process.
Let's look at the following example:
First, we have a simple interface, DataService
public interface DataService {
    String getData();
}
Now we have two Spring beans that implement this interface:
#Service("firstDataService")
public class FirstDataService implements DataService {
#Override
public String getData() {
return "FIRST DATA SERVICE";
}
}
#Service("secondDataService")
public class SecondDataService implements DataService {
#Override
public String getData() {
return "SECOND DATA SERVICE";
}
}
Now, imagine that there is no bean that depends on these two beans directly. When I say directly, I mean there is no bean that depends on these beans via constructor-based, setter-based or field-based dependency injection.
Because of that, Spring will not inject these beans into any other bean during the context initialisation process.
Now, consider the following bean:
@Service
public class DataCollector {

    @Autowired
    ApplicationContext applicationContext;

    String getDataFromService(String beanName) {
        DataService ds = (DataService) applicationContext.getBean(beanName);
        return ds.getData();
    }
}
If I call the getDataFromService method of the DataCollector bean with the value "firstDataService" for the beanName parameter, the method will return "FIRST DATA SERVICE" as a result.
If I call the method with "secondDataService", it will return "SECOND DATA SERVICE" as a result.
Now, when Spring looks at the definition of DataCollector during context initialisation, there is no way for it to determine which beans DataCollector depends on.
It all depends on the application logic, and the value that we use for the beanName parameter when we call the getDataFromService method.
Because of that, Spring is not capable of determining whether there is a bean that is never used (because bean usage can be dynamic, as in the case above).