The following code snippet successfully persists data in a remote GemFire cluster and keeps the local Spring cache updated. However, entries are not getting DESTROY-ed as expected when I try using ExpirationAttributes. I've referred to this and related links.
import org.springframework.data.gemfire.ExpirationActionType;
import org.springframework.data.gemfire.ExpirationAttributesFactoryBean;
import org.springframework.data.gemfire.RegionAttributesFactoryBean;
import org.springframework.data.gemfire.client.ClientCacheFactoryBean;
import org.springframework.data.gemfire.client.ClientRegionFactoryBean;
import org.springframework.data.gemfire.support.ConnectionEndpoint;
import org.springframework.data.gemfire.support.GemfireCacheManager;
import com.gemstone.gemfire.cache.ExpirationAttributes;
import com.gemstone.gemfire.cache.RegionAttributes;
import com.gemstone.gemfire.cache.client.ClientCache;
import com.gemstone.gemfire.cache.client.ClientRegionShortcut;
import com.gemstone.gemfire.pdx.ReflectionBasedAutoSerializer;
@Configuration
@Profile("local")
public class GemFireCachingConfig {

    @Bean
    Properties gemfireProperties(...) {
        // sets GemFire properties and returns them
        return gemfireProperties;
    }

    @Bean
    @Primary
    ReflectionBasedAutoSerializer reflectionBasedAutoSerializer() {
        return new ReflectionBasedAutoSerializer("pkg.containing.cacheable.object");
    }

    @Bean
    @Primary
    ClientCacheFactoryBean clientCacheFactory(String injectedGemFirehost,
            int injectedGemfirePort, Properties gemfireProperties,
            ReflectionBasedAutoSerializer reflectionBasedAutoSerializer) {
        ClientCacheFactoryBean cachefactoryBean = new ClientCacheFactoryBean();
        cachefactoryBean.setProperties(gemfireProperties);
        cachefactoryBean.setClose(true);
        cachefactoryBean.setPdxSerializer(reflectionBasedAutoSerializer);
        cachefactoryBean.setPdxReadSerialized(false);
        cachefactoryBean.setPdxIgnoreUnreadFields(true);
        ConnectionEndpoint[] locators = new ConnectionEndpoint[1];
        locators[0] = new ConnectionEndpoint(injectedGemFirehost, injectedGemfirePort);
        cachefactoryBean.setLocators(locators);
        return cachefactoryBean;
    }

    @Bean
    public ExpirationAttributesFactoryBean entryTtlExpirationAttributes(
            int injectedTimeoutInSecs) {
        ExpirationAttributesFactoryBean expirationAttributes = new ExpirationAttributesFactoryBean();
        expirationAttributes.setAction(ExpirationActionType.DESTROY.getExpirationAction());
        expirationAttributes.setTimeout(injectedTimeoutInSecs);
        return expirationAttributes;
    }

    @Bean
    @Autowired
    public RegionAttributesFactoryBean regionAttributes(
            @Qualifier("entryTtlExpirationAttributes") ExpirationAttributes entryTtl) {
        RegionAttributesFactoryBean regionAttributes = new RegionAttributesFactoryBean();
        regionAttributes.setStatisticsEnabled(true);
        regionAttributes.setEntryTimeToLive(entryTtl);
        return regionAttributes;
    }

    @Bean
    @Primary
    ClientRegionFactoryBean<String, Object> regionFactoryBean(ClientCache gemfireCache,
            @Qualifier("regionAttributes") RegionAttributes<String, Object> regionAttributes) {
        ClientRegionFactoryBean<String, Object> regionFactoryBean = new ClientRegionFactoryBean<>();
        regionFactoryBean.setAttributes(regionAttributes);
        regionFactoryBean.setCache(gemfireCache);
        regionFactoryBean.setClose(false);
        regionFactoryBean.setPersistent(false);
        regionFactoryBean.setRegionName(regionName);
        regionFactoryBean.setShortcut(ClientRegionShortcut.CACHING_PROXY_HEAP_LRU);
        return regionFactoryBean;
    }

    @Bean
    GemfireCacheManager cacheManager(ClientCache gemfireCache) {
        GemfireCacheManager cacheManager = new GemfireCacheManager();
        cacheManager.setCache(gemfireCache);
        return cacheManager;
    }
}
Just curious: how do you think the injectedTimeoutInSecs parameter is "injected" into your entryTtlExpirationAttributes bean definition in your Spring config? This...
@Bean
public ExpirationAttributesFactoryBean entryTtlExpirationAttributes(
        int injectedTimeoutInSecs) {
    ExpirationAttributesFactoryBean expirationAttributes =
        new ExpirationAttributesFactoryBean();
    expirationAttributes.setAction(
        ExpirationActionType.DESTROY.getExpirationAction());
    expirationAttributes.setTimeout(injectedTimeoutInSecs);
    return expirationAttributes;
}
You need to annotate your entryTtlExpirationAttributes bean definition method parameter (i.e. injectedTimeoutInSecs) with Spring's @Value annotation, like so...
@Bean
public ExpirationAttributesFactoryBean entryTtlExpirationAttributes(
        @Value("${gemfire.cache.expiration.ttl.timeout:600}")
        int injectedTimeoutInSecs) {
Then, in your Spring Boot application.properties file, you can set a value for the property (gemfire.cache.expiration.ttl.timeout)...
#application.properties
gemfire.cache.expiration.ttl.timeout = 300
The #Value annotation can supply a default if the property is not explicitly set...
@Value("${property:defaultValue}")
Additionally, you need to supply a PropertySourcesPlaceholderConfigurer bean definition in your Spring Java config to enable Spring to "replace" property placeholder values...
@Bean
static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
    return new PropertySourcesPlaceholderConfigurer();
}
You can see a similar configuration to what you have above here.
Finally, you can simplify your entire Spring, GemFire Java configuration class to this...
import java.util.Collections;
import org.apache.geode.cache.ExpirationAttributes;
import org.apache.geode.cache.GemFireCache;
import org.apache.geode.cache.RegionAttributes;
import org.apache.geode.cache.client.ClientRegionShortcut;
import org.apache.geode.pdx.ReflectionBasedAutoSerializer;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Primary;
import org.springframework.context.annotation.Profile;
import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;
import org.springframework.data.gemfire.RegionAttributesFactoryBean;
import org.springframework.data.gemfire.cache.config.EnableGemfireCaching;
import org.springframework.data.gemfire.client.ClientRegionFactoryBean;
import org.springframework.data.gemfire.config.annotation.ClientCacheApplication;
import org.springframework.data.gemfire.config.annotation.ClientCacheConfigurer;
import org.springframework.data.gemfire.config.annotation.EnablePdx;
import org.springframework.data.gemfire.expiration.ExpirationActionType;
import org.springframework.data.gemfire.expiration.ExpirationAttributesFactoryBean;
import org.springframework.data.gemfire.support.ConnectionEndpoint;
@ClientCacheApplication
@EnableGemfireCaching
@EnablePdx(ignoreUnreadFields = true, readSerialized = false,
    serializerBeanName = "reflectionBasedAutoSerializer")
@Profile("local")
public class GemFireCachingConfig {

    @Bean
    static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }

    // NOTE: you can externalize Pivotal GemFire properties in a gemfire.properties file,
    // placed in the root of your application classpath.
    //
    // Alternatively, you can use Spring Boot's application.properties to set GemFire properties
    // using the corresponding Spring Data GemFire (annotation-based) property (e.g. spring.data.gemfire.cache.log-level)
    //
    // See here...
    // https://docs.spring.io/spring-data/gemfire/docs/current/api/org/springframework/data/gemfire/config/annotation/ClientCacheApplication.html#logLevel--

    @Bean
    @Primary
    ReflectionBasedAutoSerializer reflectionBasedAutoSerializer() {
        return new ReflectionBasedAutoSerializer("pkg.containing.cacheable.object");
    }

    @Bean
    ClientCacheConfigurer clientCacheHostPortConfigurer(
            @Value("${gemfire.locator.host}") String locatorHost,
            @Value("${gemfire.locator.port}") int locatorPort) {
        return (beanName, clientCacheFactoryBean) ->
            clientCacheFactoryBean.setLocators(Collections.singletonList(
                new ConnectionEndpoint(locatorHost, locatorPort)));
    }

    @Bean("RegionNameHere")
    ClientRegionFactoryBean<String, Object> regionFactoryBean(GemFireCache gemfireCache,
            @Qualifier("regionAttributes") RegionAttributes<String, Object> regionAttributes) {
        ClientRegionFactoryBean<String, Object> clientRegionFactory = new ClientRegionFactoryBean<>();
        clientRegionFactory.setAttributes(regionAttributes);
        clientRegionFactory.setCache(gemfireCache);
        clientRegionFactory.setClose(false);
        clientRegionFactory.setShortcut(ClientRegionShortcut.CACHING_PROXY_HEAP_LRU);
        return clientRegionFactory;
    }

    @Bean
    public RegionAttributesFactoryBean regionAttributes(
            @Qualifier("entryTtlExpirationAttributes") ExpirationAttributes expirationAttributes) {
        RegionAttributesFactoryBean regionAttributes = new RegionAttributesFactoryBean();
        regionAttributes.setStatisticsEnabled(true);
        regionAttributes.setEntryTimeToLive(expirationAttributes);
        return regionAttributes;
    }

    @Bean
    public ExpirationAttributesFactoryBean entryTtlExpirationAttributes(
            @Value("${gemfire.cache.expiration:600}") int timeoutInSeconds) {
        ExpirationAttributesFactoryBean expirationAttributes = new ExpirationAttributesFactoryBean();
        expirationAttributes.setAction(ExpirationActionType.DESTROY.getExpirationAction());
        expirationAttributes.setTimeout(timeoutInSeconds);
        return expirationAttributes;
    }
}
Of course, this configuration is based on Spring Data GemFire 2.0.1.RELEASE (Kay-SR1).
Notice the @ClientCacheApplication annotation, which replaces the need for your clientCacheFactory bean definition.
I also used the new @EnablePdx annotation to configure GemFire's PDX serialization behavior.
I declared a ClientCacheConfigurer typed bean definition (clientCacheHostPortConfigurer) to dynamically adjust the Locator host and port configuration based on property placeholders.
I defined a PropertySourcesPlaceholderConfigurer to handle the property placeholders used in the @Value annotations throughout the Spring, Java-based configuration meta-data.
I also used the new @EnableGemfireCaching annotation, which replaces the need to explicitly define a gemfireCacheManager bean definition. It also enables Spring's Cache Abstraction (specifying @EnableCaching for you).
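With caching enabled, a service method can then be cached against the Region defined above. A minimal sketch (the service class, method, and lookup logic here are hypothetical, not from the original config):
import org.springframework.cache.annotation.Cacheable;
import org.springframework.stereotype.Service;

@Service
public class ExampleCachingService {

    // The cache name must match the Region name (e.g. "RegionNameHere" above);
    // cached entries are then subject to the Region's entry TTL expiration.
    @Cacheable("RegionNameHere")
    public Object lookup(String key) {
        // expensive computation or remote call; only runs on a cache miss
        return doExpensiveLookup(key);
    }

    private Object doExpensiveLookup(String key) {
        return new Object(); // placeholder
    }
}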
Anyway, SDG's new Annotation-based configuration model makes it easier to do everything. But again, you need to be using Spring Data GemFire 2.0+ (SD Kay) with Pivotal GemFire 9.1.x.
Hope this helps!
-John
Related
When I create two beans with @StepScope and apply @Order, the order for both beans still gets resolved to Ordered.LOWEST_PRECEDENCE when I try to autowire them as a List.
@Bean
@Order(1)
@StepScope
public MyBean bean1() {
    return new MyBean("1");
}

@Bean
@Order(2)
@StepScope
public MyBean bean2() {
    return new MyBean("2");
}
Looking in OrderComparator, I see the beans are resolved to a source of ScopedProxyFactoryBean, which returns a null order value.
Wondering if I'm doing something wrong here, as I would expect the ordering to work correctly.
So the aim is to autowire an ordered list into another bean, e.g.
@Component
public class OuterBean {

    private List<MyBean> beans;

    public OuterBean(List<MyBean> beans) {
        this.beans = beans;
    }
}
And I would expect the list to contain {bean1,bean2}
Step scoped beans need to be used in the scope of a running step. Here is an example:
import java.util.List;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.annotation.Order;
@Configuration
public class MyJob {

    @Bean
    @Order(1)
    @StepScope
    public MyBean bean1() {
        return new MyBean("1");
    }

    @Bean
    @Order(2)
    @StepScope
    public MyBean bean2() {
        return new MyBean("2");
    }

    @Bean
    public OuterBean outerBean(List<MyBean> beans) {
        return new OuterBean(beans);
    }

    @Bean
    public Tasklet tasklet(OuterBean outerBean) {
        return (contribution, chunkContext) -> {
            outerBean.sayHello();
            return RepeatStatus.FINISHED;
        };
    }

    @Bean
    public Job job(JobBuilderFactory jobs, StepBuilderFactory steps) {
        return jobs.get("job")
                .start(steps.get("step")
                        .tasklet(tasklet(null))
                        .build())
                .build();
    }
}
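For reference, MyBean and OuterBean in this example could be as simple as the following sketch (the linked sample defines its own versions):
public class MyBean {

    private final String name;

    public MyBean(String name) {
        this.name = name;
    }

    public String getName() {
        return name;
    }
}

public class OuterBean {

    private final List<MyBean> beans;

    public OuterBean(List<MyBean> beans) {
        this.beans = beans;
    }

    public void sayHello() {
        // prints the beans in injection order
        beans.forEach(bean -> System.out.println("bean = " + bean.getName()));
    }
}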
The complete code can be found here. This example prints:
bean = 1
bean = 2
which means step scoped beans have been injected in the right order.
I extracted all my data into properties files, for example:
email.content.charset=utf-8
I created a class holding all the properties, ResourcesProperties, in which I have this field and getter:
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;
@Component
public class ResourcesProperties {

    @Value("${email.content.charset}")
    private String emailCharset;

    public String getEmailCharset() {
        return emailCharset;
    }
}
Spring scans this property file:
#PropertySource({"classpath:/properties/mail.properties"})
I autowire it into the class where I need it:
@Autowired
private ResourcesProperties properties;
I try to print it:
System.out.println(properties.getEmailCharset());
Result:
${email.content.charset}
Doing the same via Environment:
System.out.println(environment.getProperty("email.content.charset"));
Result:
utf-8
Please help me to resolve the problem.
You need to register a PropertySourcesPlaceholderConfigurer bean:
@Bean
public PropertySourcesPlaceholderConfigurer placeHolderConfigurer() {
    return new PropertySourcesPlaceholderConfigurer();
}
I use Spring Data and decided that I want to create a new custom data type that can be used in Hibernate entities. I checked the documentation and chose BasicType, which I implemented according to this official user guide.
I wanted to be able to register the type under its class name and to use the new type in entities without needing the @Type annotation. Unfortunately, I'm unable to get a reference to the MetadataBuilder or the Hibernate configuration to register the new type. Is there a way to get it in Spring Data? It seems that the initialization of Hibernate is hidden from the user and cannot be easily accessed. We use the following class to initialize JPA:
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(
    entityManagerFactoryRef = "entityManagerFactory",
    transactionManagerRef = "transactionManager",
    basePackages = {
        "..." // omitted
    }
)
public class JpaConfiguration implements TransactionManagementConfigurer {

    @Primary
    @Bean(name = "entityManagerFactory")
    public LocalContainerEntityManagerFactoryBean configureEntityManagerFactory(
            DataSource dataSource,
            SchemaPerTenantConnectionProviderImpl provider) {
        LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
        entityManagerFactoryBean.setPersistenceUnitName("defaultPersistenceUnit");
        entityManagerFactoryBean.setDataSource(dataSource);
        entityManagerFactoryBean.setPackagesToScan(
            "..." // omitted
        );
        entityManagerFactoryBean.setJpaProperties(properties(provider));
        entityManagerFactoryBean.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
        return entityManagerFactoryBean;
    }

    @Primary
    @Bean(name = "transactionManager")
    public PlatformTransactionManager annotationDrivenTransactionManager() {
        return new JpaTransactionManager();
    }

    private Properties properties(SchemaPerTenantConnectionProviderImpl provider) {
        Properties properties = new Properties();
        // omitted
        return properties;
    }
}
I found lots of articles about how to do this with Hibernate's Configuration object, but those refer to Hibernate 3 and 4. I also found a way to do it via Hibernate's org.hibernate.integrator.spi.Integrator, but when I use it according to the articles I found, I get an exception with the message "org.hibernate.HibernateException: Can not alter TypeRegistry at this time".
What is the correct way to register custom types in Spring Data?
I finally figured it out. I will post it here for others:
I created a new class that implements the org.hibernate.boot.spi.SessionFactoryBuilderFactory interface. In this class I can get a reference to the TypeResolver from the metadata and register my custom type.
package com.example.configuration;
import org.hibernate.boot.SessionFactoryBuilder;
import org.hibernate.boot.spi.MetadataImplementor;
import org.hibernate.boot.spi.SessionFactoryBuilderFactory;
import org.hibernate.boot.spi.SessionFactoryBuilderImplementor;
import org.slf4j.LoggerFactory;
import com.example.CustomType;
public class CustomDataTypesRegistration implements SessionFactoryBuilderFactory {

    private static final org.slf4j.Logger logger = LoggerFactory.getLogger(CustomDataTypesRegistration.class);

    @Override
    public SessionFactoryBuilder getSessionFactoryBuilder(final MetadataImplementor metadata, final SessionFactoryBuilderImplementor defaultBuilder) {
        logger.info("Registering custom Hibernate data types");
        metadata.getTypeResolver().registerTypeOverride(CustomType.INSTANCE);
        return defaultBuilder;
    }
}
The class must then be registered via the Java ServiceLoader mechanism by adding the fully qualified name of the class to a file named org.hibernate.boot.spi.SessionFactoryBuilderFactory in the Java module's META-INF/services directory:
src/main/resources/META-INF/services/org.hibernate.boot.spi.SessionFactoryBuilderFactory
The file can contain multiple lines, each referencing a different class. In this case it is:
com.example.configuration.CustomDataTypesRegistration
This way Spring Data starts and the custom type is successfully registered during Hibernate initialization.
What helped me quite a lot was this SO answer that deals with schema export in Hibernate 5 under Spring Data.
There's a much easier solution to this -- in fact, it's just 1 line of code. You can just use the @TypeDef annotation and thus avoid having to register the custom type:
#Entity(name = "Product")
#TypeDef(
name = "bitset",
defaultForType = BitSet.class,
typeClass = BitSetType.class
)
public static class Product {
#Id
private Integer id;
private BitSet bitSet;
For an example, see "Example 11. Using #TypeDef to register a custom Type" in http://docs.jboss.org/hibernate/orm/5.3/userguide/html_single/Hibernate_User_Guide.html
I use JPA with Spring 4.3.9 and Hibernate 5.0.5, and I use the custom property EntityManagerFactoryBuilderImpl.TYPE_CONTRIBUTORS with Spring's LocalContainerEntityManagerFactoryBean to override Hibernate BasicTypes.
final Properties jpaProperties = new Properties();
jpaProperties.put(EntityManagerFactoryBuilderImpl.TYPE_CONTRIBUTORS, new TypeContributorList() {
    @Override
    public List<TypeContributor> getTypeContributors() {
        return Lists.newArrayList(new CustomDateTimeTypeContributor());
    }
});

final LocalContainerEntityManagerFactoryBean factoryBean = new LocalContainerEntityManagerFactoryBean();
factoryBean.setJpaProperties(jpaProperties);
factoryBean.setJpaVendorAdapter(jpaVendorAdapter);
return factoryBean;
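The CustomDateTimeTypeContributor itself is not shown above; a TypeContributor implementation can be as small as this (a sketch, assuming a hypothetical CustomDateTimeType with a singleton INSTANCE):
import org.hibernate.boot.model.TypeContributions;
import org.hibernate.boot.model.TypeContributor;
import org.hibernate.service.ServiceRegistry;

public class CustomDateTimeTypeContributor implements TypeContributor {

    @Override
    public void contribute(TypeContributions typeContributions, ServiceRegistry serviceRegistry) {
        // CustomDateTimeType is a hypothetical BasicType implementation
        typeContributions.contributeType(CustomDateTimeType.INSTANCE);
    }
}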
An alternative to what xMort did is registering an org.hibernate.boot.model.TypeContributor via the ServiceLoader mechanism.
Implement TypeContributor
package com.example.configuration;
import org.hibernate.boot.model.TypeContributions;
import org.hibernate.boot.model.TypeContributor;
import org.hibernate.service.ServiceRegistry;
public class CustomTypeContributor implements TypeContributor {

    @Override
    public void contribute(TypeContributions typeContributions, ServiceRegistry serviceRegistry) {
        typeContributions.contributeType(CustomType.INSTANCE);
    }
}
Create a file named org.hibernate.boot.model.TypeContributor in the Java module's META-INF/services directory:
src/main/resources/META-INF/services/org.hibernate.boot.model.TypeContributor
Reference the TypeContributor; the file content is:
com.example.configuration.CustomTypeContributor
As of Hibernate 5.2.17, the code that picks up the TypeContributor service can be found at org.hibernate.boot.model.process.spi.MetadataBuildingProcess#handleTypes.
Inspired by @alex.tran's answer:
@Bean
public HibernatePropertiesCustomizer customHibernateTypeRegistrar() {
    return (Map<String, Object> props) -> {
        props.put(
            EntityManagerFactoryBuilderImpl.TYPE_CONTRIBUTORS,
            (TypeContributorList) () -> Arrays.asList((TypeContributor) (typeContributions, serviceRegistry) -> {
                // Deregister built-in org.hibernate.type.OffsetDateTimeSingleColumnType as it hides our mapping.
                final BasicTypeRegistry basicTypeRegistry = typeContributions.getTypeConfiguration().getBasicTypeRegistry();
                Class<OffsetDateTime> clazz = OffsetDateTime.class;
                basicTypeRegistry.unregister(clazz.getName());
                basicTypeRegistry.unregister(clazz.getSimpleName());
                typeContributions.contributeSqlTypeDescriptor(OffsetDateTimeMsSqlTypeDescriptor.INSTANCE);
                typeContributions.contributeJavaTypeDescriptor(OffsetDateTimeJavaTypeDescriptor.INSTANCE);
                typeContributions.contributeType(OffsetDateTimeSingleColumnType.INSTANCE);
            }));
    };
}
The magical constant EntityManagerFactoryBuilderImpl.TYPE_CONTRIBUTORS and a bean of type HibernatePropertiesCustomizer do the magic in EntityManagerFactoryBuilderImpl:
HibernateJpaAutoConfiguration -> HibernateJpaConfiguration -> JpaBaseConfiguration -> LocalContainerEntityManagerFactoryBean -> HibernatePersistenceProvider -> Bootstrap.getEntityManagerFactoryBuilder() -> EntityManagerFactoryBuilderImpl.
See details: https://discourse.hibernate.org/t/map-ms-sql-server-datetimeoffset-to-java-8-offsetdatetime/5937/7
The solution keeps Boot magic intact as it plugs into Spring Boot autoconfiguration; there is also no need to redefine lots of beans (like the transaction or entity manager).
You can use the HibernatePropertiesCustomizer class; Spring will load all beans of this type (for more details see HibernateJpaConfiguration).
Also see PG Json config Test
This is an example (I'm adding the Json type only in case the Postgres dialect is configured):
@Bean
@ConditionalOnProperty(value = "spring.jpa.properties.hibernate.dialect",
    havingValue = "org.hibernate.dialect.PostgreSQL10Dialect")
public HibernatePropertiesCustomizer hibernatePropertiesCustomizerPG(ObjectMapper objectMapper) {
    return hibernateProperties -> {
        hibernateProperties.put(EntityManagerFactoryBuilderImpl.TYPE_CONTRIBUTORS,
            (TypeContributorList) () -> List.of(
                (TypeContributor) (TypeContributions typeContributions, ServiceRegistry serviceRegistry) ->
                    typeContributions.contributeType(new JsonType(objectMapper))));
    };
}
This is the same as having:
@TypeDefs({
    @TypeDef(name = "json", typeClass = JsonType.class)
})
@MappedSuperclass
public class BaseEntity {
    // Code omitted for brevity
}
I prefer to do it this way so as not to override any bean handled by Spring, such as the SessionFactory.
There is another solution.
We extend Hibernate's MetadataSources, overriding the getMetadataBuilder methods to create the MetadataBuilder ourselves:
import org.hibernate.boot.MetadataBuilder;
import org.hibernate.boot.MetadataSources;
import org.hibernate.boot.internal.MetadataBuilderImpl;
import org.hibernate.boot.registry.StandardServiceRegistry;
public class CustomTypeIncludedMetadataSources extends MetadataSources {

    @Override
    public MetadataBuilder getMetadataBuilder() {
        MetadataBuilder b = new MetadataBuilderImpl(this);
        applyCustomTypes(b);
        return b;
    }

    @Override
    @Deprecated
    public MetadataBuilder getMetadataBuilder(StandardServiceRegistry serviceRegistry) {
        MetadataBuilder b = new MetadataBuilderImpl(this, serviceRegistry);
        applyCustomTypes(b);
        return b;
    }

    private void applyCustomTypes(MetadataBuilder b) {
        b.applyBasicType(new CustomType());
        ...
    }
}
Next, in a Spring @Configuration class we set an instance of our CustomTypeIncludedMetadataSources on Spring's LocalSessionFactoryBean:
@Bean
public LocalSessionFactoryBean sessionFactory(DataSource dataSource) {
    LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
    sessionFactory.setMetadataSources(new CustomTypeIncludedMetadataSources());
    sessionFactory.setDataSource(dataSource);
    ...
    return sessionFactory;
}
I'm building a JPA configuration with multiple persistence units using different in-memory datasources, but the configuration fails to resolve the qualified datasource for the entity manager factory bean, with the following error:
***************************
APPLICATION FAILED TO START
***************************
Description:
Parameter 0 of method emfb in datasources.Application$PersistenceConfiguration required a single bean, but 2 were found:
- ds1: defined by method 'ds1' in class path resource [datasources/Application$PersistenceConfiguration.class]
- ds2: defined by method 'ds2' in class path resource [datasources/Application$PersistenceConfiguration.class]
Action:
Consider marking one of the beans as #Primary, updating the consumer to accept multiple beans, or using #Qualifier to identify the bean that should be consumed
Here is the sample application
package datasources;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.PersistenceContextType;
import javax.sql.DataSource;
import javax.ws.rs.ApplicationPath;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import org.apache.log4j.Logger;
import org.glassfish.jersey.server.ResourceConfig;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.boot.autoconfigure.transaction.jta.JtaAutoConfiguration;
import org.springframework.boot.builder.SpringApplicationBuilder;
import org.springframework.boot.orm.jpa.EntityManagerFactoryBuilder;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Primary;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.stereotype.Component;
@Configuration
@EnableAutoConfiguration(exclude = {
    // HibernateJpaAutoConfiguration.class,
    // DataSourceAutoConfiguration.class
    JtaAutoConfiguration.class
})
@ComponentScan
public class Application {

    public static void main(String[] args) {
        new SpringApplicationBuilder(Application.class)
            .build()
            .run(args);
    }

    @Component
    @Path("/ds")
    public static class DsApi {

        private final static Logger logger = Logger.getLogger(DsApi.class);

        @Autowired(required = false)
        @Qualifier("ds1")
        private DataSource ds;

        @GET
        public String ds() {
            logger.info("ds");
            return ds.toString();
        }
    }

    @Component
    @Path("/em")
    public static class EmApi {

        private final static Logger logger = Logger.getLogger(EmApi.class);

        @PersistenceContext(unitName = "ds2", type = PersistenceContextType.TRANSACTION)
        private EntityManager em;

        @GET
        public String em() {
            logger.info("em");
            return em.toString();
        }
    }

    @Configuration
    @ApplicationPath("/jersey")
    public static class JerseyConfig extends ResourceConfig {

        public JerseyConfig() {
            register(DsApi.class);
            register(EmApi.class);
        }
    }

    @Configuration
    public static class PersistenceConfiguration {

        @Bean
        @Qualifier("ds1")
        public DataSource ds1() {
            return new EmbeddedDatabaseBuilder().build();
        }

        @Bean
        @Qualifier("ds2")
        public DataSource ds2() {
            return new EmbeddedDatabaseBuilder().build();
        }

        @Bean
        @Primary
        @Autowired
        public LocalContainerEntityManagerFactoryBean emfb(@Qualifier("ds1") DataSource ds, EntityManagerFactoryBuilder emfb) {
            return emfb.dataSource(ds)
                .packages(Application.class)
                .persistenceUnit("ds1")
                .build();
        }

        @Bean
        @Autowired
        public LocalContainerEntityManagerFactoryBean emfb2(@Qualifier("ds2") DataSource ds, EntityManagerFactoryBuilder emfb) {
            return emfb.dataSource(ds)
                .packages(Application.class)
                .persistenceUnit("ds2")
                .build();
        }
    }
}
The error is indicating that at some point in the application, a bean is being injected by the type DataSource and not being qualified by name at that point.
It does not matter that you have added @Qualifier in one location. The injection is failing in some other location that has not been qualified. It's not your fault, though, because that location is in Spring Boot's DataSourceAutoConfiguration, which you should be able to see in your stack trace, below the piece that you have posted.
I would recommend excluding DataSourceAutoConfiguration, i.e. @SpringBootApplication(exclude = DataSourceAutoConfiguration.class). Otherwise, this configuration is only being applied to the bean you have made @Primary. Unless you know exactly what that is, it is likely to result in subtle and unexpected differences in behaviour between your DataSources.
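Applied to the sample application, that exclusion would look something like this (a sketch; keep the JtaAutoConfiguration exclusion as well if you still need it):
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.autoconfigure.jdbc.DataSourceAutoConfiguration;

@SpringBootApplication(exclude = DataSourceAutoConfiguration.class)
public class Application {
    // ... rest of the configuration as before
}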
Declare one of your DataSources as @Primary.
Also, you have two beans of the same type, LocalContainerEntityManagerFactoryBean; declare one of them @Primary as well, as follows:
@Configuration
public static class PersistenceConfiguration {

    @Bean
    @Primary
    public DataSource ds1() {
        return new EmbeddedDatabaseBuilder().build();
    }

    @Bean
    public DataSource ds2() {
        return new EmbeddedDatabaseBuilder().build();
    }

    @Bean
    @Primary
    @Autowired
    public LocalContainerEntityManagerFactoryBean emfb(@Qualifier("ds1") DataSource ds, EntityManagerFactoryBuilder emfb) {
        return emfb.dataSource(ds)
            .packages(DemoApplication.class)
            .persistenceUnit("ds1")
            .build();
    }

    @Bean
    @Autowired
    public LocalContainerEntityManagerFactoryBean emfb2(@Qualifier("ds2") DataSource ds, EntityManagerFactoryBuilder emfb) {
        return emfb.dataSource(ds)
            .packages(DemoApplication.class)
            .persistenceUnit("ds2")
            .build();
    }
}
Try declaring the datasource beans outside the static class, i.e. directly in Application.java, as shown below.
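In other words, something like this (a sketch of the structural change only; the remaining beans stay as in the original):
@Configuration
@EnableAutoConfiguration(exclude = JtaAutoConfiguration.class)
@ComponentScan
public class Application {

    // DataSource beans moved out of the nested PersistenceConfiguration class
    @Bean
    @Qualifier("ds1")
    public DataSource ds1() {
        return new EmbeddedDatabaseBuilder().build();
    }

    @Bean
    @Qualifier("ds2")
    public DataSource ds2() {
        return new EmbeddedDatabaseBuilder().build();
    }

    // ... main method, API classes, and the entity manager factory beans as before
}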
I have a spring application that is currently using *.properties files and I want to have it using YAML files instead.
I found the class YamlPropertiesFactoryBean that seems to be capable of doing what I need.
My problem is that I'm not sure how to use this class in my Spring application (which is using annotation based configuration).
It seems I should configure it in the PropertySourcesPlaceholderConfigurer with the setBeanFactory method.
Previously I was loading property files using @PropertySource as follows:
@Configuration
@PropertySource("classpath:/default.properties")
public class PropertiesConfig {

    @Bean
    public static PropertySourcesPlaceholderConfigurer placeholderConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }
}
How can I enable the YamlPropertiesFactoryBean in the PropertySourcesPlaceholderConfigurer so that I can load YAML files directly?
Or is there another way of doing this?
Thanks.
My application is using annotation based config and I'm using Spring Framework 4.1.4.
I found some information but it always pointed me to Spring Boot, like this one.
With XML config I've been using this construct:
<context:annotation-config/>
<bean id="yamlProperties" class="org.springframework.beans.factory.config.YamlPropertiesFactoryBean">
<property name="resources" value="classpath:test.yml"/>
</bean>
<context:property-placeholder properties-ref="yamlProperties"/>
Of course you have to have the snakeyaml dependency on your runtime classpath.
I prefer XML config over Java config, but I reckon it shouldn't be hard to convert it.
Edit: Java config, for completeness' sake:
@Bean
public static PropertySourcesPlaceholderConfigurer properties() {
    PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer = new PropertySourcesPlaceholderConfigurer();
    YamlPropertiesFactoryBean yaml = new YamlPropertiesFactoryBean();
    yaml.setResources(new ClassPathResource("default.yml"));
    propertySourcesPlaceholderConfigurer.setProperties(yaml.getObject());
    return propertySourcesPlaceholderConfigurer;
}
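With that bean in place, a default.yml such as this (the keys are made up for illustration)...
mail:
  host: smtp.example.org
...can be injected with @Value just like an ordinary .properties entry:
@Value("${mail.host}")
private String mailHost; // resolves to "smtp.example.org"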
To read a .yml file in Spring you can use the following approach.
For example you have this .yml file:
section1:
  key1: "value1"
  key2: "value2"
section2:
  key1: "value1"
  key2: "value2"
Then define two Java POJOs:
@Configuration
@EnableConfigurationProperties
@ConfigurationProperties(prefix = "section1")
public class MyCustomSection1 {

    private String key1;
    private String key2;

    // define setters and getters.
}

@Configuration
@EnableConfigurationProperties
@ConfigurationProperties(prefix = "section2")
public class MyCustomSection2 {

    private String key1;
    private String key2;

    // define setters and getters.
}
Now you can autowire these beans in your component. For example:
@Component
public class MyPropertiesAggregator {

    @Autowired
    private MyCustomSection1 section;
}
In case you are using Spring Boot, everything will be auto-scanned and instantiated:
@SpringBootApplication
public class MainBootApplication {

    public static void main(String[] args) {
        new SpringApplicationBuilder()
            .sources(MainBootApplication.class)
            .bannerMode(OFF)
            .run(args);
    }
}
If you're using JUnit, there is a basic test setup for loading the YAML file:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(MainBootApplication.class)
public class MyJUnitTests {
    ...
}
If you're using TestNG, here is a sample test configuration:
@SpringApplicationConfiguration(MainBootApplication.class)
public abstract class BaseITTest extends AbstractTestNGSpringContextTests {
    ....
}
package com.yaml.yamlsample;

import com.yaml.yamlsample.config.factory.YamlPropertySourceFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.PropertySource;

@SpringBootApplication
@PropertySource(value = "classpath:My-Yaml-Example-File.yml", factory = YamlPropertySourceFactory.class)
public class YamlSampleApplication implements CommandLineRunner {

    public static void main(String[] args) {
        SpringApplication.run(YamlSampleApplication.class, args);
    }

    @Value("${person.firstName}")
    private String firstName;

    @Override
    public void run(String... args) throws Exception {
        System.out.println("first Name :" + firstName);
    }
}
package com.yaml.yamlsample.config.factory;

import org.springframework.boot.env.YamlPropertySourceLoader;
import org.springframework.core.env.PropertySource;
import org.springframework.core.io.support.DefaultPropertySourceFactory;
import org.springframework.core.io.support.EncodedResource;
import java.io.IOException;
import java.util.List;

public class YamlPropertySourceFactory extends DefaultPropertySourceFactory {

    @Override
    public PropertySource<?> createPropertySource(String name, EncodedResource resource) throws IOException {
        if (resource == null) {
            return super.createPropertySource(name, resource);
        }
        List<PropertySource<?>> propertySourceList = new YamlPropertySourceLoader().load(resource.getResource().getFilename(), resource.getResource());
        if (!propertySourceList.isEmpty()) {
            return propertySourceList.iterator().next();
        }
        return super.createPropertySource(name, resource);
    }
}
My-Yaml-Example-File.yml
person:
  firstName: Mahmoud
  middleName: Ahmed
See my example on GitHub, spring-boot-yaml-sample, so you can load YAML files and inject values using @Value().
I spent 5 to 6 hours understanding why the external configuration of a yml/yaml file (not application.yml) is so different. I read various articles and Stack Overflow questions but didn't get a correct answer.
I was stuck in between: I was able to read custom yml file values using YamlPropertySourceLoader, but not able to use @Value, because it gave me an error like
Injection of autowired dependencies failed; nested exception is java.lang.IllegalArgumentException: Could not resolve placeholder
'fullname.firstname' in value "${fullname.firstname}"
(fullname is a property in the yml.)
Then I used the solution given above by "turtlesallthewaydown", and at last I was able to use @Value without any issues for yaml files, and I removed YamlPropertySourceLoader.
Now I understand the difference between YamlPropertySourceLoader and PropertySourcesPlaceholderConfigurer. A big thanks! I have added these changes to my git repo.
Git Repo:
https://github.com/Atishay007/spring-boot-with-restful-web-services
Class Name: SpringMicroservicesApplication.java
If you are looking for a way to load a yaml file into java.util.Properties in Spring, here is a solution:
public Properties loadYaml(String fileName) {
    // fileName, e.g., is "my-settings.yaml"
    YamlPropertySourceLoader ypsl = new YamlPropertySourceLoader();
    PropertySource ps = ypsl.load(fileName, new ClassPathResource(fileName)).get(0);
    Properties props = new Properties();
    props.putAll((Map) ps.getSource());
    return props;
}
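Usage could then look like this (a sketch; it assumes my-settings.yaml is on the classpath and contains an email.content.charset key):
Properties props = loadYaml("my-settings.yaml");
// nested YAML keys are flattened to dotted property names
String charset = props.getProperty("email.content.charset");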