Spring Step Scoped beans do not respect @Order annotation - java

When I create 2 beans with @StepScope and apply @Order, the order for both beans still gets resolved to Ordered.LOWEST_PRECEDENCE when trying to autowire them as a List.
@Bean
@Order(1)
@StepScope
public MyBean bean1() {
return new MyBean("1");
}
@Bean
@Order(2)
@StepScope
public MyBean bean2() {
return new MyBean("2");
}
Looking in OrderComparator I see the beans are resolved to a source of ScopedProxyFactoryBean, which returns a null order value.
Wondering if I'm doing something wrong here, as I would expect the ordering to work correctly.
The aim is to autowire an ordered list into another bean, e.g.:
@Component
public class OuterBean {
private List<MyBean> beans;
public OuterBean(List<MyBean> beans) {
this.beans = beans;
}
}
And I would expect the list to contain {bean1,bean2}

Step scoped beans need to be used in the scope of a running step. Here is an example:
import java.util.List;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.annotation.Order;
@Configuration
public class MyJob {
@Bean
@Order(1)
@StepScope
public MyBean bean1() {
return new MyBean("1");
}
@Bean
@Order(2)
@StepScope
public MyBean bean2() {
return new MyBean("2");
}
@Bean
public OuterBean outerBean(List<MyBean> beans) {
return new OuterBean(beans);
}
@Bean
public Tasklet tasklet(OuterBean outerBean) {
return (contribution, chunkContext) -> {
outerBean.sayHello();
return RepeatStatus.FINISHED;
};
}
@Bean
public Job job(JobBuilderFactory jobs, StepBuilderFactory steps) {
return jobs.get("job")
.start(steps.get("step")
.tasklet(tasklet(null))
.build())
.build();
}
}
The complete code can be found here. This example prints:
bean = 1
bean = 2
which means step scoped beans have been injected in the right order.
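For reference, here is a minimal sketch of the MyBean and OuterBean classes assumed by this example (each class in its own file; they are not part of the configuration shown above). The sayHello() method simply prints each injected bean, which produces the output shown:

import java.util.List;

public class MyBean {

    private final String name;

    public MyBean(String name) {
        this.name = name;
    }

    public String getName() {
        return name;
    }
}

public class OuterBean {

    private final List<MyBean> beans;

    public OuterBean(List<MyBean> beans) {
        this.beans = beans;
    }

    public void sayHello() {
        // prints "bean = 1" then "bean = 2" when the @Order values are honoured
        beans.forEach(bean -> System.out.println("bean = " + bean.getName()));
    }
}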

Related

Understanding Primary annotation in Profile

I am trying to understand the behaviour of @Primary with @Profile from this video: Dependency Injection using profile.
The active profile in application.properties is english, and running the application gives the error
expected single matching bean but found 2: helloWorldServiceEnglish,helloWorldServiceSpanish
Adding the @Primary annotation in HelloConfig.java resolves the error:
@Bean
@Profile("english")
@Primary
public HelloWorldService helloWorldServiceEnglish(HelloWorldFactory factory) {
return factory.createHelloWorldService("en");
}
When I am autowiring using a profile, and there is only a single active profile named english, why is Spring still considering the other beans which do not have a @Profile annotation? And how does adding @Primary change this behaviour?
Does Spring internally first scan for autowire candidates by type and completely ignore @Profile, which is why it throws the error expected single matching bean but found 2?
HelloConfig.java
package com.spring.config;
import com.spring.services.HelloWorldFactory;
import com.spring.services.HelloWorldService;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.context.annotation.Profile;
@Configuration
public class HelloConfig {
@Bean
public HelloWorldFactory helloWorldFactory() {
return new HelloWorldFactory();
}
@Bean
@Profile("english")
@Primary
public HelloWorldService helloWorldServiceEnglish(HelloWorldFactory factory) {
return factory.createHelloWorldService("en");
}
@Bean
@Qualifier("spanish")
public HelloWorldService helloWorldServiceSpanish(HelloWorldFactory factory) {
return factory.createHelloWorldService("es");
}
@Bean
@Profile("french")
public HelloWorldService helloWorldServiceFrench(HelloWorldFactory factory) {
return factory.createHelloWorldService("fr");
}
@Bean
@Profile("german")
public HelloWorldService helloWorldServiceGerman(HelloWorldFactory factory) {
return factory.createHelloWorldService("de");
}
@Bean
@Profile("polish")
public HelloWorldService helloWorldServicePolish(HelloWorldFactory factory) {
return factory.createHelloWorldService("pl");
}
@Bean
@Profile("russian")
public HelloWorldService helloWorldServiceRussian(HelloWorldFactory factory) {
return factory.createHelloWorldService("ru");
}
}
DependencyInjectionApplication.java
package com.spring.componentScan;
import com.spring.controllers.GreetingController;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.ComponentScan;
@SpringBootApplication
@ComponentScan("com.spring")
public class DependencyInjectionApplication {
public static void main(String[] args) {
ApplicationContext ctx = SpringApplication.run(DependencyInjectionApplication.class, args);
GreetingController controller = (GreetingController) ctx.getBean("greetingController");
controller.sayHello();
}
}
GreetingController.java
package com.spring.controllers;
import com.spring.services.HelloWorldService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Controller;
@Controller
public class GreetingController {
private HelloWorldService helloWorldService;
private HelloWorldService helloWorldServiceSpanish;
@Autowired
public void setHelloWorldService(HelloWorldService helloWorldService) {
this.helloWorldService = helloWorldService;
}
@Autowired
@Qualifier("spanish")
public void setHelloWorldServiceFrench(HelloWorldService helloWorldServiceSpanish) {
this.helloWorldServiceSpanish = helloWorldServiceSpanish;
}
public String sayHello() {
String greeting = helloWorldService.getGreeting();
System.out.println(helloWorldServiceSpanish.getGreeting());
System.out.println(greeting);
return greeting;
}
}
Application.properties
spring.profiles.active=english
Complete Source code:
If you consider this source code
#Bean(name = "french")
public HelloWorldService helloWorldServiceFrench(HelloWorldFactory factory) {
return factory.createHelloWorldService("fr");
}
#Bean
public HelloWorldService helloWorldServiceGerman(HelloWorldFactory factory) {
return factory.createHelloWorldService("de");
}
#Bean
public HelloWorldService helloWorldServicePolish(HelloWorldFactory factory) {
return factory.createHelloWorldService("pl");
}
#Bean
public HelloWorldService helloWorldServiceRussian(HelloWorldFactory factory) {
return factory.createHelloWorldService("ru");
}
There is no @Profile annotation here, and that's why Spring creates multiple beans of the same type. If you want them to be recognized differently, give them well-qualified explicit names with @Bean(name = "polish") (otherwise Spring assigns the name from the @Bean method name anyway) and then autowire using @Qualifier("polish").
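A minimal sketch of that suggestion, reusing the types from the question (the explicit name polish is just an example):

@Bean(name = "polish")
public HelloWorldService helloWorldServicePolish(HelloWorldFactory factory) {
    return factory.createHelloWorldService("pl");
}

// at the injection point, select the bean explicitly by that name
@Autowired
@Qualifier("polish")
private HelloWorldService helloWorldServicePolish;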
Okay, so your example is not updated with the latest code. But I assume that you want to create multiple instances of the same bean type and use them for different languages. That is easy to achieve, and you don't need @Profile or @Primary for it.
All you need is to assign a qualifier to each bean instance (or use the one that Spring assigns by default), and then inject the bean by that qualifier.
@Bean
public HelloWorldService helloWorldServiceFrench(HelloWorldFactory factory) {
return factory.createHelloWorldService("fr");
}
@Bean
public HelloWorldService helloWorldServiceGerman(HelloWorldFactory factory) {
return factory.createHelloWorldService("de");
}
@Bean
public HelloWorldService helloWorldServicePolish(HelloWorldFactory factory) {
return factory.createHelloWorldService("pl");
}
@Bean
public HelloWorldService helloWorldServiceRussian(HelloWorldFactory factory) {
return factory.createHelloWorldService("ru");
}
Controller:
@Controller
public class GreetingController {
@Qualifier("helloWorldServiceGerman")
@Autowired
private HelloWorldService helloWorldServiceGerman;
@Qualifier("helloWorldServiceFrench")
@Autowired
private HelloWorldService helloWorldServiceFrench;
@Qualifier("helloWorldServicePolish")
@Autowired
private HelloWorldService helloWorldServicePolish;
@Qualifier("helloWorldServiceRussian")
@Autowired
private HelloWorldService helloWorldServiceRussian;
. . .
}
Update
You usually mark a bean as @Primary when you want one bean instance to be the priority option when there are multiple injection candidates. The official documentation has a good example.
@Profile just narrows the bean search, but if you still have multiple beans of the same type within the active profile, @Primary comes to the rescue (this applies when you autowire by type; autowiring by qualifier still works fine either way).
Developers usually do this to avoid the NoUniqueBeanDefinitionException that you had initially.
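As a minimal sketch of that behaviour, reusing the types from the question (the method names here are illustrative): autowiring HelloWorldService purely by type fails with NoUniqueBeanDefinitionException when two candidates are present, unless one of them is marked @Primary or the injection point uses a @Qualifier.

@Configuration
public class GreetingConfig {

    @Bean
    @Primary // wins when a HelloWorldService is autowired purely by type
    public HelloWorldService englishService(HelloWorldFactory factory) {
        return factory.createHelloWorldService("en");
    }

    @Bean
    public HelloWorldService spanishService(HelloWorldFactory factory) {
        return factory.createHelloWorldService("es");
    }
}

// resolves to englishService because of @Primary;
// @Qualifier("spanishService") would still select the other bean explicitly
@Autowired
private HelloWorldService helloWorldService;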

Spring Boot - creating Bean as Singleton in IOC container

Bean creation is supposed to be a singleton in the Spring container, correct? I am migrating a Spring configuration file to Spring Boot with annotations. I have the code below, but it seems to call mySingletonBean() every time it is used within another bean creation. It is my understanding that a @Bean is supposed to be a singleton by default. Am I creating my beans correctly?
@Bean
public SomeBean mySingletonBean() {
SomeBean mybean = new SomeBean();
mybean.setName("Name");
return mybean;
}
@Bean
public Bean1 bean1() {
Bean1 bean1 = new Bean1();
bean1.setBean(mySingletonBean());
return bean1;
}
@Bean
public Bean2 bean2() {
Bean2 bean2 = new Bean2();
bean2.setBean(mySingletonBean());
return bean2;
}
Spring proxies your configuration class and takes care of all the context-related concerns, such as instantiating and caching the beans.
Run this small test, feel free to debug to see what class the configuration class became:
package stackoverflow;
import java.util.Arrays;
import java.util.Date;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import static org.junit.Assert.assertTrue;
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = Test.MyContext.class)
public class Test {
@Autowired
private ApplicationContext applicationContext;
@Autowired
@Qualifier("myStringBean")
private String myStringBean;
@Autowired
private SomeBean someBean;
@Configuration
static class MyContext {
@Bean
public String myStringBean() {
return String.valueOf(new Date().getTime());
}
@Bean
public SomeBean mySomeBean() {
return new SomeBean(myStringBean());
}
}
@org.junit.Test
public void test() {
assertTrue(myStringBean == applicationContext.getBean("myStringBean"));
assertTrue(myStringBean == someBean.getValue());
System.out.println(Arrays.asList(applicationContext.getBean("myStringBean"), myStringBean, someBean.getValue()));
}
static class SomeBean {
private String value;
public SomeBean(String value) {
this.value = value;
}
public String getValue() {
return value;
}
}
}
Spring Boot is a smart framework; the mySingletonBean() method will only be executed once, so don't worry about it. You can run it in debug mode and see for yourself.
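If you prefer not to attach a debugger, one simple check (an illustrative addition to the configuration class from the question, not part of the original code) is to count invocations of the @Bean method. With a plain @Configuration class the method body runs once, and both bean1 and bean2 receive the same cached instance:

// requires: import java.util.concurrent.atomic.AtomicInteger;
private final AtomicInteger invocations = new AtomicInteger();

@Bean
public SomeBean mySingletonBean() {
    // printed once at startup, even though bean1() and bean2() both call this method
    System.out.println("mySingletonBean() invocation count = " + invocations.incrementAndGet());
    SomeBean mybean = new SomeBean();
    mybean.setName("Name");
    return mybean;
}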

GemFire entry Time-To-Live is not getting set using spring-cache

Following is the code snippet which is successful in persisting the data in a remote GemFire cluster and successfully keeping local spring-cache updated. However, the entries are not getting DESTROY-ed as expected when I tried using ExpirationAttributes. I've referred to this and related links.
import org.springframework.data.gemfire.ExpirationActionType;
import org.springframework.data.gemfire.ExpirationAttributesFactoryBean;
import org.springframework.data.gemfire.RegionAttributesFactoryBean;
import org.springframework.data.gemfire.client.ClientCacheFactoryBean;
import org.springframework.data.gemfire.client.ClientRegionFactoryBean;
import org.springframework.data.gemfire.support.ConnectionEndpoint;
import org.springframework.data.gemfire.support.GemfireCacheManager;
import com.gemstone.gemfire.cache.ExpirationAttributes;
import com.gemstone.gemfire.cache.RegionAttributes;
import com.gemstone.gemfire.cache.client.ClientCache;
import com.gemstone.gemfire.cache.client.ClientRegionShortcut;
import com.gemstone.gemfire.pdx.ReflectionBasedAutoSerializer;
@Configuration
@Profile("local")
public class GemFireCachingConfig {
@Bean
Properties gemfireProperties(...) {
//Sets gemfire properties and return
return gemfireProperties;
}
@Bean
@Primary
ReflectionBasedAutoSerializer reflectionBasedAutoSerializer() {
return new ReflectionBasedAutoSerializer("pkg.containing.cacheable.object");
}
@Bean
@Primary
ClientCacheFactoryBean clientCacheFactory(String injectedGemFirehost,
int injectedGemfirePort, Properties gemfireProperties,
ReflectionBasedAutoSerializer reflectionBasedAutoSerializer) {
ClientCacheFactoryBean cachefactoryBean = new ClientCacheFactoryBean();
cachefactoryBean.setProperties(gemfireProperties);
cachefactoryBean.setClose(true);
cachefactoryBean.setPdxSerializer(reflectionBasedAutoSerializer);
cachefactoryBean.setPdxReadSerialized(false);
cachefactoryBean.setPdxIgnoreUnreadFields(true);
ConnectionEndpoint[] locators = new ConnectionEndpoint[1];
locators[0] = new ConnectionEndpoint(injectedGemFirehost, injectedGemfirePort);
cachefactoryBean.setLocators(locators);
return cachefactoryBean;
}
@Bean
public ExpirationAttributesFactoryBean entryTtlExpirationAttributes(
int injectedTimeoutInSecs) {
ExpirationAttributesFactoryBean expirationAttributes = new ExpirationAttributesFactoryBean();
expirationAttributes.setAction(ExpirationActionType.DESTROY.getExpirationAction());
expirationAttributes.setTimeout(injectedTimeoutInSecs);
return expirationAttributes;
}
@Bean
@Autowired
public RegionAttributesFactoryBean regionAttributes(
@Qualifier("entryTtlExpirationAttributes") ExpirationAttributes entryTtl) {
RegionAttributesFactoryBean regionAttributes = new RegionAttributesFactoryBean();
regionAttributes.setStatisticsEnabled(true);
regionAttributes.setEntryTimeToLive(entryTtl);
return regionAttributes;
}
@Bean
@Primary
ClientRegionFactoryBean<String, Object> regionFactoryBean(ClientCache gemfireCache,
@Qualifier("regionAttributes") RegionAttributes<String, Object> regionAttributes) {
ClientRegionFactoryBean<String, Object> regionFactoryBean = new ClientRegionFactoryBean<>();
regionFactoryBean.setAttributes(regionAttributes);
regionFactoryBean.setCache(gemfireCache);
regionFactoryBean.setClose(false);
regionFactoryBean.setPersistent(false);
regionFactoryBean.setRegionName(regionName);
regionFactoryBean.setShortcut(ClientRegionShortcut.CACHING_PROXY_HEAP_LRU);
return regionFactoryBean;
}
@Bean
GemfireCacheManager cacheManager(ClientCache gemfireCache) {
GemfireCacheManager cacheManager = new GemfireCacheManager();
cacheManager.setCache(gemfireCache);
return cacheManager;
}
}
Just curious how you think the injectedTimeoutInSecs parameter is "injected" into your entryTtlExpirationAttributes bean definition in your Spring config; this...
@Bean
public ExpirationAttributesFactoryBean entryTtlExpirationAttributes(
int injectedTimeoutInSecs) {
ExpirationAttributesFactoryBean expirationAttributes =
new ExpirationAttributesFactoryBean();
expirationAttributes.setAction(
ExpirationActionType.DESTROY.getExpirationAction());
expirationAttributes.setTimeout(injectedTimeoutInSecs);
return expirationAttributes;
}
You need to annotate your entryTtlExpirationAttributes bean definition method parameter (i.e. injectedTimeoutInSecs) with Spring's @Value annotation, like so...
@Bean
public ExpirationAttributesFactoryBean entryTtlExpirationAttributes(
@Value("${gemfire.cache.expiration.ttl.timeout:600}")
int injectedTimeoutInSecs) {
Then, in your Spring Boot application.properties file, you can set a value for the property (gemfire.cache.expiration.ttl.timeout)...
#application.properties
gemfire.cache.expiration.ttl.timeout = 300
The @Value annotation can supply a default if the property is not explicitly set...
@Value("${property:defaultValue}")
Additionally, you need to supply a PropertySourcesPlaceholderConfigurer bean definition in your Spring Java config to enable Spring to "replace" property placeholder values...
@Bean
static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
return new PropertySourcesPlaceholderConfigurer();
}
You can see a similar configuration to what you have above here.
Finally, you can simplify your entire Spring, GemFire Java configuration class to this...
import java.util.Collections;
import org.apache.geode.cache.ExpirationAttributes;
import org.apache.geode.cache.GemFireCache;
import org.apache.geode.cache.RegionAttributes;
import org.apache.geode.cache.client.ClientRegionShortcut;
import org.apache.geode.pdx.ReflectionBasedAutoSerializer;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Primary;
import org.springframework.context.annotation.Profile;
import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;
import org.springframework.data.gemfire.RegionAttributesFactoryBean;
import org.springframework.data.gemfire.cache.config.EnableGemfireCaching;
import org.springframework.data.gemfire.client.ClientRegionFactoryBean;
import org.springframework.data.gemfire.config.annotation.ClientCacheApplication;
import org.springframework.data.gemfire.config.annotation.ClientCacheConfigurer;
import org.springframework.data.gemfire.config.annotation.EnablePdx;
import org.springframework.data.gemfire.expiration.ExpirationActionType;
import org.springframework.data.gemfire.expiration.ExpirationAttributesFactoryBean;
import org.springframework.data.gemfire.support.ConnectionEndpoint;
@ClientCacheApplication
@EnableGemfireCaching
@EnablePdx(ignoreUnreadFields = true, readSerialized = false,
serializerBeanName = "reflectionBasedAutoSerializer")
@Profile("local")
public class GemFireCachingConfig {
@Bean
static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
return new PropertySourcesPlaceholderConfigurer();
}
// NOTE: you can externalize Pivotal GemFire properties in a gemfire.properties file,
// placed in the root of your application classpath.
//
// Alternatively, you can use Spring Boot's application.properties to set GemFire properties
// using the corresponding Spring Data GemFire (annotation-based) property (e.g. spring.data.gemfire.cache.log-level)
//
// See here...
// https://docs.spring.io/spring-data/gemfire/docs/current/api/org/springframework/data/gemfire/config/annotation/ClientCacheApplication.html#logLevel--
@Bean
@Primary
ReflectionBasedAutoSerializer reflectionBasedAutoSerializer() {
return new ReflectionBasedAutoSerializer("pkg.containing.cacheable.object");
}
@Bean
ClientCacheConfigurer clientCacheHostPortConfigurer(
@Value("${gemfire.locator.host}") String locatorHost,
@Value("${gemfire.locator.port}") int locatorPort) {
return (beanName, clientCacheFactoryBean) ->
clientCacheFactoryBean.setLocators(Collections.singletonList(
new ConnectionEndpoint(locatorHost, locatorPort)));
}
#Bean("RegionNameHere")
ClientRegionFactoryBean<String, Object> regionFactoryBean(GemFireCache gemfireCache,
#Qualifier("regionAttributes") RegionAttributes<String, Object> regionAttributes) {
ClientRegionFactoryBean<String, Object> clientRegionFactory = new ClientRegionFactoryBean<>();
clientRegionFactory.setAttributes(regionAttributes);
clientRegionFactory.setCache(gemfireCache);
clientRegionFactory.setClose(false);
clientRegionFactory.setShortcut(ClientRegionShortcut.CACHING_PROXY_HEAP_LRU);
return clientRegionFactory;
}
@Bean
public RegionAttributesFactoryBean regionAttributes(
@Qualifier("entryTtlExpirationAttributes") ExpirationAttributes expirationAttributes) {
RegionAttributesFactoryBean regionAttributes = new RegionAttributesFactoryBean();
regionAttributes.setStatisticsEnabled(true);
regionAttributes.setEntryTimeToLive(expirationAttributes);
return regionAttributes;
}
@Bean
public ExpirationAttributesFactoryBean entryTtlExpirationAttributes(
@Value("${gemfire.cache.expiration:600}") int timeoutInSeconds) {
ExpirationAttributesFactoryBean expirationAttributes = new ExpirationAttributesFactoryBean();
expirationAttributes.setAction(ExpirationActionType.DESTROY.getExpirationAction());
expirationAttributes.setTimeout(timeoutInSeconds);
return expirationAttributes;
}
}
Of course, this configuration is based on Spring Data GemFire 2.0.1.RELEASE (Kay-SR1).
Notice the @ClientCacheApplication annotation, which replaces the need for your clientCacheFactory bean definition.
I also used the new @EnablePdx annotation to configure GemFire's PDX serialization behavior.
I declared a ClientCacheConfigurer typed bean definition (clientCacheHostPortConfigurer) to dynamically adjust the Locator host and port configuration based on property placeholders.
I defined a PropertySourcesPlaceholderConfigurer to handle the property placeholders used in the @Value annotations throughout the Spring, Java-based configuration meta-data.
I also used the new @EnableGemfireCaching annotation, which replaces the need to explicitly define a gemfireCacheManager bean definition. It also enables Spring's Cache Abstraction (specifying @EnableCaching for you).
Anyway, SDG's new Annotation-based configuration model makes it easier to do everything. But again, you need to be using Spring Data GemFire 2.0+ (SD Kay) with Pivotal GemFire 9.1.x.
Hope this helps!
-John

Spring boot @Qualifier doesn't work with datasources

I'm building JPA configuration with multiple persistence units using different in-memory datasources, but the configuration fails resolving the qualified datasource for entity manager factory bean with the following error:
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 0 of method emfb in datasources.Application$PersistenceConfiguration required a single bean, but 2 were found:
- ds1: defined by method 'ds1' in class path resource [datasources/Application$PersistenceConfiguration.class]
- ds2: defined by method 'ds2' in class path resource [datasources/Application$PersistenceConfiguration.class]
Action:
Consider marking one of the beans as @Primary, updating the consumer to accept multiple beans, or using @Qualifier to identify the bean that should be consumed
Here is the sample application
package datasources;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.PersistenceContextType;
import javax.sql.DataSource;
import javax.ws.rs.ApplicationPath;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import org.apache.log4j.Logger;
import org.glassfish.jersey.server.ResourceConfig;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.boot.autoconfigure.transaction.jta.JtaAutoConfiguration;
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.boot.orm.jpa.EntityManagerFactoryBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.stereotype.Component;
@Configuration
@EnableAutoConfiguration(exclude = {
// HibernateJpaAutoConfiguration.class,
// DataSourceAutoConfiguration.class
JtaAutoConfiguration.class
})
@ComponentScan
public class Application {
public static void main(String[] args) {
new SpringApplicationBuilder(Application.class)
.build()
.run(args);
}
@Component
@Path("/ds")
public static class DsApi {
private final static Logger logger = Logger.getLogger(DsApi.class);
@Autowired(required = false)
@Qualifier("ds1")
private DataSource ds;
@GET
public String ds() {
logger.info("ds");
return ds.toString();
}
}
@Component
@Path("/em")
public static class EmApi {
private final static Logger logger = Logger.getLogger(EmApi.class);
@PersistenceContext(unitName = "ds2", type = PersistenceContextType.TRANSACTION)
private EntityManager em;
@GET
public String em() {
logger.info("em");
return em.toString();
}
}
@Configuration
@ApplicationPath("/jersey")
public static class JerseyConfig extends ResourceConfig {
public JerseyConfig() {
register(DsApi.class);
register(EmApi.class);
}
}
@Configuration
public static class PersistenceConfiguration {
@Bean
@Qualifier("ds1")
public DataSource ds1() {
return new EmbeddedDatabaseBuilder().build();
}
@Bean
@Qualifier("ds2")
public DataSource ds2() {
return new EmbeddedDatabaseBuilder().build();
}
@Bean
@Primary
@Autowired
public LocalContainerEntityManagerFactoryBean emfb(@Qualifier("ds1") DataSource ds, EntityManagerFactoryBuilder emfb) {
return emfb.dataSource(ds)
.packages(Application.class)
.persistenceUnit("ds1")
.build();
}
@Bean
@Autowired
public LocalContainerEntityManagerFactoryBean emfb2(@Qualifier("ds2") DataSource ds, EntityManagerFactoryBuilder emfb) {
return emfb.dataSource(ds)
.packages(Application.class)
.persistenceUnit("ds2")
.build();
}
}
}
The error is indicating that somewhere in the application a bean is being injected by the type DataSource without being qualified by name at that point.
It does not matter that you have added @Qualifier in one location. The injection is failing in some other location that has not been qualified. It's not your fault, though, because that location is in Spring Boot's DataSourceAutoConfiguration, which you should be able to see in your stack trace, below the piece that you have posted.
I would recommend excluding DataSourceAutoConfiguration, i.e. @SpringBootApplication(exclude = DataSourceAutoConfiguration.class). Otherwise, that auto-configuration is only being applied to the bean you have made @Primary. Unless you know exactly what that entails, it is likely to result in subtle and unexpected differences in behaviour between your DataSources.
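A minimal sketch of that exclusion, using the composed @SpringBootApplication annotation in place of the question's @Configuration, @EnableAutoConfiguration and @ComponentScan trio (the JtaAutoConfiguration exclusion from the question is kept alongside it):

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;
import org.springframework.boot.autoconfigure.transaction.jta.JtaAutoConfiguration;

@SpringBootApplication(exclude = { DataSourceAutoConfiguration.class, JtaAutoConfiguration.class })
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}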
Declare one of your DataSource beans as @Primary.
You also have 2 beans of the same type, LocalContainerEntityManagerFactoryBean; declare one of them @Primary as well, as follows:
@Configuration
public static class PersistenceConfiguration {
@Bean
@Primary
public DataSource ds1() {
return new EmbeddedDatabaseBuilder().build();
}
@Bean
public DataSource ds2() {
return new EmbeddedDatabaseBuilder().build();
}
@Bean
@Primary
@Autowired
public LocalContainerEntityManagerFactoryBean emfb(@Qualifier("ds1") DataSource ds, EntityManagerFactoryBuilder emfb) {
return emfb.dataSource(ds)
.packages(DemoApplication.class)
.persistenceUnit("ds1")
.build();
}
@Bean
@Autowired
public LocalContainerEntityManagerFactoryBean emfb2(@Qualifier("ds2") DataSource ds, EntityManagerFactoryBuilder emfb) {
return emfb.dataSource(ds)
.packages(DemoApplication.class)
.persistenceUnit("ds2")
.build();
}
}
Try declaring the DataSource beans outside the static class, i.e. directly in Application.java.
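A rough sketch of that suggestion, keeping the bean names from the question but declaring the DataSources directly on the top-level Application class instead of inside the static PersistenceConfiguration class:

@Configuration
@EnableAutoConfiguration(exclude = JtaAutoConfiguration.class)
@ComponentScan
public class Application {

    @Bean
    @Qualifier("ds1")
    public DataSource ds1() {
        return new EmbeddedDatabaseBuilder().build();
    }

    @Bean
    @Qualifier("ds2")
    public DataSource ds2() {
        return new EmbeddedDatabaseBuilder().build();
    }

    // the LocalContainerEntityManagerFactoryBean definitions stay in PersistenceConfiguration
}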

Spring boot + spring batch without DataSource

I'm trying to configure Spring Batch inside a Spring Boot project and I want to use it without a data source. I've found that ResourcelessTransactionManager is the way to go, but I cannot make it work. The problem is that I already have 3 other DataSources defined, and I don't want to use any of them in Spring Batch.
I've checked the default implementation, DefaultBatchConfigurer: if it is not able to find a DataSource, it does exactly what I want. The problem is that I have 3 of them and don't want to use any.
Please don't suggest using HSQL or another in-memory DB, as I don't want that.
I got around this problem by extending the DefaultBatchConfigurer class so that it ignores any DataSource, as a consequence it will configure a map-based JobRepository.
Example:
@Configuration
@EnableBatchProcessing
public class BatchConfig extends DefaultBatchConfigurer {
@Override
public void setDataSource(DataSource dataSource) {
//This BatchConfigurer ignores any DataSource
}
}
In my case I persist data to Cassandra. If you are using spring-boot-starter-batch, it expects a DataSource to be provided, which is not yet implemented in my case, but you can trick the configuration as in the following steps:
Step 1:
@SpringBootApplication(exclude = { DataSourceAutoConfiguration.class })
public class SampleSpringBatchApplication{
public static void main(String[] args) {
System.setProperty("spring.devtools.restart.enabled", "true");
SpringApplication.run(SampleSpringBatchApplication.class, args);
}
}
Step 2:
@Configuration
@EnableBatchProcessing
public class SampleBatchJob extends DefaultBatchConfigurer {
//..
@Override
public void setDataSource(DataSource dataSource) {
}
//..
}
You can try excluding the DataSourceAutoConfiguration in @SpringBootApplication. See the sample code below.
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;
import org.springframework.context.annotation.Bean;
@SpringBootApplication(exclude = { DataSourceAutoConfiguration.class })
@EnableBatchProcessing
public class SampleBatchApplication {
@Autowired
private JobBuilderFactory jobs;
@Autowired
private StepBuilderFactory steps;
@Bean
protected Tasklet tasklet() {
return new Tasklet() {
@Override
public RepeatStatus execute(StepContribution contribution, ChunkContext context) {
return RepeatStatus.FINISHED;
}
};
}
@Bean
public Job job() throws Exception {
return this.jobs.get("job").start(step1()).build();
}
@Bean
protected Step step1() throws Exception {
return this.steps.get("step1").tasklet(tasklet()).build();
}
public static void main(String[] args) throws Exception {
System.exit(SpringApplication.exit(SpringApplication.run(SampleBatchApplication.class, args)));
}
}
And sample test class
import org.junit.Rule;
import org.junit.Test;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.test.rule.OutputCapture;
import static org.assertj.core.api.Assertions.assertThat;
public class SampleBatchApplicationTests {
@Rule
public OutputCapture outputCapture = new OutputCapture();
@Test
public void testDefaultSettings() throws Exception {
assertThat(SpringApplication.exit(SpringApplication.run(SampleBatchApplication.class))).isEqualTo(0);
String output = this.outputCapture.toString();
assertThat(output).contains("completed with the following parameters");
}
}
If you have more than one DataSource in your configuration (regardless of if you want to use them or not) you need to define your own BatchConfigurer. It's the only way the framework knows what to do in situations like that.
You can read more about the BatchConfigurer in the documentation here: http://docs.spring.io/spring-batch/trunk/apidocs/org/springframework/batch/core/configuration/annotation/BatchConfigurer.html
We had a similar problem: we were using Spring Boot JDBC and did not want to store the Spring Batch tables in the DB, but we still wanted to use Spring's transaction management for our DataSource.
We ended up implementing our own BatchConfigurer.
@Component
public class TablelessBatchConfigurer implements BatchConfigurer {
private final PlatformTransactionManager transactionManager;
private final JobRepository jobRepository;
private final JobLauncher jobLauncher;
private final JobExplorer jobExplorer;
private final DataSource dataSource;
@Autowired
public TablelessBatchConfigurer(DataSource dataSource) {
this.dataSource = dataSource;
this.transactionManager = new DataSourceTransactionManager(this.dataSource);
try {
final MapJobRepositoryFactoryBean jobRepositoryFactory = new MapJobRepositoryFactoryBean(this.transactionManager);
jobRepositoryFactory.afterPropertiesSet();
this.jobRepository = jobRepositoryFactory.getObject();
final MapJobExplorerFactoryBean jobExplorerFactory = new MapJobExplorerFactoryBean(jobRepositoryFactory);
jobExplorerFactory.afterPropertiesSet();
this.jobExplorer = jobExplorerFactory.getObject();
final SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
simpleJobLauncher.setJobRepository(this.jobRepository);
simpleJobLauncher.afterPropertiesSet();
this.jobLauncher = simpleJobLauncher;
} catch (Exception e) {
throw new BatchConfigurationException(e);
}
}
// ... override getters
}
and setting the Spring Batch initializer property to false:
spring.batch.initializer.enabled=false
