How do I configure multiple JPA data sources using a Spring @Configuration class?

I am trying to configure multiple JPA entity/transaction managers within the same application context using Spring's @Configuration class.
When the context loads, Spring has difficulty autowiring the beans because they implement the same interfaces.
Unfortunately, I'm working with legacy code, so I can't autowire the beans directly and use @Qualifier annotations, which is why I'm trying to do it in the configuration class.
Within a @Bean declaration, is there any way to qualify which bean should be injected? I thought that using a direct method call would be enough, but it typically results in errors such as
NoUniqueBeanDefinitionException: No qualifying bean of type
[javax.sql.DataSource] is defined: expected single matching bean but
found 4
Here's an example of what I'm trying to do below:
@Configuration
public class ApplicationConfig {

    @Bean(name = "transactionManager1")
    public PlatformTransactionManager transactionManager1() {
        return new JpaTransactionManager(entityManagerFactory1());
    }

    @Bean(name = "entityManagerFactory1")
    public EntityManagerFactory entityManagerFactory1() {
        ...
        LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
        factory.setDataSource(dataSource1());
        ...
    }

    @Bean(destroyMethod = "")
    @ConfigurationProperties(prefix = "datasource.test1")
    public JndiObjectFactoryBean jndiObjectFactoryBean1() {
        return new JndiObjectFactoryBean();
    }

    @Bean(name = "dataSource1")
    public DataSource dataSource1() {
        JndiDataSourceLookup lookup = new JndiDataSourceLookup();
        return lookup.getDataSource(jndiObjectFactoryBean1().getJndiName());
    }

    @Bean(name = "transactionManager2")
    public PlatformTransactionManager transactionManager2() {
        return new JpaTransactionManager(entityManagerFactory2());
    }

    @Bean(name = "entityManagerFactory2")
    public EntityManagerFactory entityManagerFactory2() {
        ...
        LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
        factory.setDataSource(dataSource2());
        ...
    }

    @Bean(destroyMethod = "")
    @ConfigurationProperties(prefix = "datasource.test2")
    public JndiObjectFactoryBean jndiObjectFactoryBean2() {
        return new JndiObjectFactoryBean();
    }

    @Bean(name = "dataSource2")
    public DataSource dataSource2() {
        JndiDataSourceLookup lookup = new JndiDataSourceLookup();
        return lookup.getDataSource(jndiObjectFactoryBean2().getJndiName());
    }
}
I suppose I could try to fetch the beans directly via the Spring context's getBean() method, but is there a cleaner way of doing this?
I'm not too familiar with the @Primary annotation, but based on what I've read I don't see how Spring would autowire the secondary data source in this case, since it looks like it would always pick the bean marked @Primary first.

If you cannot change the injection sites to add qualifiers, then you're going to have to create a delegating DataSource based on some logic (which you haven't detailed in the question).
Something like this.
@Primary
@Bean
public DelegatingDataSource delegatingDataSource(List<DataSource> sources) {
    return new DelegatingDataSource() {
        @Override
        public DataSource getTargetDataSource() {
            // decide which dataSource to delegate to
            return sources.get(0);
        }
    };
}
I've used DelegatingDataSource here, but that may not be able to provide what you need. You may need to get more advanced with some kind of interceptor/aspect that inspects the caller on which to base the DataSource selection.
However it's implemented, you need to designate a @Primary bean and use it as a proxy.
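For completeness: if you do control the @Configuration class, you can sidestep the ambiguity without any delegating proxy by injecting the qualified bean as a @Bean method parameter instead of calling the sibling factory method directly. A minimal sketch, reusing the bean names from the question (the class name is illustrative; parameter injection with @Qualifier is standard Spring behavior):

import javax.persistence.EntityManagerFactory;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class TransactionManagerConfig {

    // The @Qualifier on the parameter tells the container exactly which
    // EntityManagerFactory bean to inject, so no NoUniqueBeanDefinitionException.
    @Bean(name = "transactionManager2")
    public PlatformTransactionManager transactionManager2(
            @Qualifier("entityManagerFactory2") EntityManagerFactory emf) {
        return new JpaTransactionManager(emf);
    }
}

This only helps inside configuration classes you can edit; it does not solve the legacy injection sites, which is why the @Primary delegating bean above is still needed there.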

Related

Enabling exception translation for JPA in plain Spring

I was setting up a basic CRUD web app with JPA in plain Spring (no Spring Boot or Spring Data JPA) for educational purposes and faced a strange problem: Spring doesn't translate exceptions for my repository. According to the Spring documentation (here and here), it is sufficient to mark the repository with the @Repository annotation and Spring will automatically enable exception translation for it.
However, when I did so and triggered a UNIQUE constraint violation, I still got a JPA PersistenceException (with a Hibernate ConstraintViolationException inside) instead of Spring's DataIntegrityViolationException.
I used pure Java Spring configuration, and it took me quite some time to realize that I should compare it with the XML configuration in the documentation. Compared to the pure Java configuration, the XML configuration adds a PersistenceExceptionTranslationPostProcessor to the context. When I added it manually with @Bean, it worked, but now I have a question.
Have I misconfigured something? The Spring documentation doesn't require registering that post-processor manually for pure Java configuration. Maybe there is another way to register it, say an @EnableXXX annotation?
Here is the summary of my configuration.
@Configuration
@ComponentScan("com.example.secured_crm")
public class SpringConfiguration {

    // the problem is solved if I uncomment this
    //@Bean
    //public PersistenceExceptionTranslationPostProcessor exceptionTranslation() {
    //    return new PersistenceExceptionTranslationPostProcessor();
    //}
}
@Configuration
@PropertySource("classpath:db.properties")
@EnableTransactionManagement
public class DataSourceConfiguration {

    @Value("${jdbc.driver}")
    private String driverClass;
    @Value("${jdbc.url}")
    private String url;
    // ...
    @Value("${hibernate.debug}")
    private String hibernateDebug;

    @Bean
    public DataSource dataSource() {
        var dataSource = new ComboPooledDataSource();
        // ...
        return dataSource;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        var emFactory = new LocalContainerEntityManagerFactoryBean();
        emFactory.setDataSource(dataSource());
        emFactory.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
        emFactory.setPackagesToScan("com.example.secured_crm.entities");
        var properties = new Properties();
        properties.setProperty("hibernate.dialect", hibernateDialect);
        properties.setProperty("hibernate.show_sql", hibernateDebug);
        properties.setProperty("hibernate.format_sql", "true");
        emFactory.setJpaProperties(properties);
        return emFactory;
    }

    @Bean
    public PlatformTransactionManager transactionManager(EntityManagerFactory entityManagerFactory) {
        var txManager = new JpaTransactionManager();
        txManager.setEntityManagerFactory(entityManagerFactory);
        return txManager;
    }
}
public interface UserRepository {
    User findByName(String username);
    List<User> findAll();
    void save(User user);
    boolean deleteById(int id);
    User findById(int id);
}

@Repository
public class UserJpaRepository implements UserRepository {

    @PersistenceContext
    EntityManager em;

    @Override
    public void save(User user) {
        if (user.getId() == null) {
            em.persist(user);
        } else {
            em.merge(user);
        }
    }

    // and so on...
}
By the way, when I tried to add the post-processor in DataSourceConfiguration, it disabled the @PropertySource effect. So far my impression of Spring is that it's one big hack...
You do need to manually register a PersistenceExceptionTranslationPostProcessor in order for the exception translation to take effect.
The documentation you mentioned simply has not been updated yet to show a fully working Java configuration; it should mention registering this post-processor. (So feel free to open a PR to update the docs.)
If you check its Javadoc, it already mentions that PersistenceExceptionTranslationPostProcessor needs to be registered:
As a consequence, all that is usually needed to enable automatic exception translation is marking all affected beans (such as Repositories or DAOs) with the @Repository annotation, along with defining this post-processor as a bean in the application context.
P.S. If you are using Spring Boot, and it detects PersistenceExceptionTranslationPostProcessor on the classpath, it will register the post-processor automatically by default, so you do not need to register it manually.
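For reference, a minimal sketch of the explicit registration that makes translation kick in (the same bean the question shows commented out; the class name is illustrative):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.dao.annotation.PersistenceExceptionTranslationPostProcessor;

@Configuration
public class ExceptionTranslationConfig {

    // Post-processes all @Repository beans, wrapping them in a proxy that
    // translates PersistenceException into Spring's DataAccessException hierarchy.
    @Bean
    public PersistenceExceptionTranslationPostProcessor exceptionTranslation() {
        return new PersistenceExceptionTranslationPostProcessor();
    }
}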

Spring 5 Unable to find a Repository

I have a Spring Framework 5.3.7 application, NOT Spring Boot (I'll move to Spring Boot later). I am using Java Configuration and it is working well so far. I have a multi-maven module project, and the first module is "myapp-entity". There is a config directory with the following file:
@Configuration
@PropertySource(value = "file:/opt/myapp/myapp-ws.properties")
@EnableTransactionManagement
public class AppEntityConfiguration
{
    @Value("${hibernate.connection.driver.class}")
    private String driverClassName;

    @Value("${hibernate.connection.url}")
    private String connectionUrl;

    @Value("${hibernate.connection.username}")
    private String username;

    @Value("${hibernate.connection.password}")
    private String password;

    @Bean
    public LocalSessionFactoryBean sessionFactory()
    {
        LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
        sessionFactory.setDataSource(dataSource());
        sessionFactory.setPackagesToScan("com.app.model");
        sessionFactory.setHibernateProperties(hibernateProperties());
        return sessionFactory;
    }

    @Bean
    public DataSource dataSource()
    {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(driverClassName);
        dataSource.setUrl(connectionUrl);
        dataSource.setUsername(username);
        dataSource.setPassword(password);
        return dataSource;
    }

    @Bean
    public PlatformTransactionManager hibernateTransactionManager()
    {
        HibernateTransactionManager transactionManager = new HibernateTransactionManager();
        transactionManager.setSessionFactory(sessionFactory().getObject());
        return transactionManager;
    }

    private final Properties hibernateProperties()
    {
        Properties hibernateProperties = new Properties();
        hibernateProperties.setProperty("hibernate.hbm2ddl.auto", "none");
        hibernateProperties.setProperty("hibernate.dialect", "org.hibernate.dialect.H2Dialect");
        return hibernateProperties;
    }
}
The myapp-entity.jar compiles fine with Java 11. The next Maven module, myapp-dao, has a config directory and a configuration class.
@Configuration
@Import(AppEntityConfiguration.class)
@ComponentScan(basePackages = "com.app.dao")
public class RepositoryContextConfiguration {
}
I have a TableOneEntity defined with an @Entity annotation in the myapp-entity.jar, and that is fine.
I have a TableOneDao and TableOneDaoImpl defined with a very basic list of functions.

public interface TableOneDao
{ ... list of functions }

@Repository("tableOneDao")
public class TableOneDaoImpl implements TableOneDao
{ ... implementation of functions }
And the test for this works perfectly:
@RunWith(SpringJUnit4ClassRunner.class)
@Transactional
@PropertySource(value = "file:/opt/myapp/myapp-ws.properties")
@ContextConfiguration(classes = RepositoryContextConfiguration.class)
public class OrganizationDaoTest extends BaseDaoTests
{
    @Autowired
    private TableOneDao tableOneDao;
}
This is very much the old way I used to do things, and it all worked well. NOW, I want to get rid of this old way of doing things and go with a new way. Maybe I can't do them in the same project; that could be the issue.
I have a second entity (TableTwoEntity) and a second repository (ClothesDryerDao):

@Repository("clothesDryerDao")
public interface ClothesDryerDao extends JpaRepository<ClothesDryer, Long>
{
}
This is now a JPA repository, and when I run a simple test against this DAO, it cannot find THIS DAO.
@RunWith(SpringJUnit4ClassRunner.class)
@Transactional
@PropertySource(value = "file:/opt/ekotrope/ekotrope-ws.properties")
@ContextConfiguration(classes = RepositoryContextConfiguration.class)
public class ClothesDryerDaoTest extends TestCase
{
    @Autowired
    private ClothesDryerDao clothesDryerDao;
}
And I get the error as follows:
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'com.ekotrope.dao.ClothesDryerDao' available: expected at least 1 bean which qualifies as autowire candidate. Dependency annotations: {#org.springframework.beans.factory.annotation.Autowired(required=true)}
at org.springframework.beans.factory.support.DefaultListableBeanFactory.raiseNoMatchingBeanFound(DefaultListableBeanFactory.java:1790)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1346)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1300)
at org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor$AutowiredFieldElement.resolveFieldValue(AutowiredAnnotationBeanPostProcessor.java:657)
... 30 more
I thought I just need to modify the RepositoryContextConfiguration as follows:
@Configuration
@Import(EkotropeEntityConfiguration.class)
@ComponentScan(basePackages = "com.ekotrope.dao")
@EnableJpaRepositories(basePackages = "com.ekotrope.dao")
public class RepositoryContextConfiguration
{
}
But this did not work. As a matter of fact I think it broke the other working tests. So, the question is ... can I use both these methods (Dao and DaoImpl) and (JPA Dao)? Or, should I use only one, and if that was the case I would go with the JPA Repositories. I just wanted to be able to demonstrate both methods to my co-workers who are not familiar with Spring.
So, if I can get both to work at the same time, that would be great, but if not, then I can create one more Maven Module and then I will have a myapp-dao-old.jar and myapp-dao-new.jar.
Thanks for the help!
I think your JPA repositories config is wrong, but your DAO is good.
And I think these two solutions can work together.
Firstly, you don't need to put @Repository on the ClothesDryerDao interface. (I understand you want to use a qualifier, but it will not be necessary; the type is enough for Spring injection if you don't have multiple beans of the same type.)
Secondly, I think you need to change your JPA configuration.
In a project I've done something like this:
@Configuration
@EnableJpaRepositories(
        basePackageClasses = ClothesDryerDao.class,
        entityManagerFactoryRef = "configEntityManager"
)
public class RepositoryContextConfiguration {

    // be careful: this bean can conflict with your sessionFactory
    @Bean
    public LocalContainerEntityManagerFactoryBean configEntityManager(DataSource dataSource) {
        log.info("Start Parametrage Arcole Entity Manager");
        LocalContainerEntityManagerFactoryBean em = new LocalContainerEntityManagerFactoryBean();
        em.setDataSource(dataSource);
        em.setPackagesToScan("com.app.model");
        HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
        em.setJpaVendorAdapter(vendorAdapter);
        return em;
    }
}
Thirdly, I think (this one is just my opinion) you should not mix classical DAOs and Spring Data repositories in the same package, for better visibility, even if the usage is the same.
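To make that package split concrete, here is a minimal sketch of what the layout could look like, assuming hypothetical packages com.app.dao.classic (hand-written DAOs) and com.app.dao.jpa (Spring Data interfaces), and the configEntityManager bean from the snippet above:

import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.Import;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;

@Configuration
@Import(AppEntityConfiguration.class)
// Hand-written @Repository DAO implementations are component-scanned here...
@ComponentScan(basePackages = "com.app.dao.classic")
// ...while Spring Data JPA scans a separate package for repository interfaces,
// so the two mechanisms never compete for the same types.
@EnableJpaRepositories(basePackages = "com.app.dao.jpa",
        entityManagerFactoryRef = "configEntityManager")
public class RepositoryContextConfiguration {
}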

Issue with Declarative Transactions and TransactionAwareDataSourceProxy in combination with JOOQ

I have a data source configuration class that looks as follows, with separate DataSource beans for testing and non-testing environments using JOOQ. In my code, I do not use DSLContext.transaction(ctx -> {...}) but rather mark the method as @Transactional, so that JOOQ defers to Spring's declarative transactions for transactionality. I am using Spring 4.3.7.RELEASE.
I have the following issue:
During testing (JUnit), @Transactional works as expected. A single method is transactional no matter how many times I use the DSLContext's store() method, and a RuntimeException triggers a rollback of the entire transaction.
During actual production runtime, @Transactional is completely ignored. A method is no longer transactional, and TransactionSynchronizationManager.getResourceMap() holds two separate values: one pointing to my connection pool (which is not transactional), and one pointing to the TransactionAwareDataSourceProxy.
In this case, I would have expected only a single resource of type TransactionAwareDataSourceProxy, which wraps my DB CP.
After much trial and error, using the second set of configuration changes I made (noted below under "AFTER"), @Transactional works correctly as expected even during runtime, and TransactionSynchronizationManager.getResourceMap() holds a single resource.
In this case, my DataSourceTransactionManager seems to not even know about the TransactionAwareDataSourceProxy (most likely because I pass it the simple DataSource, and not the proxy object), which seems to completely 'skip' the proxy anyway.
My question is: the initial configuration I had seemed correct, but did not work. The proposed 'fix' works, but in my opinion should not work at all (since the transaction manager does not seem to be aware of the TransactionAwareDataSourceProxy).
What is going on here? Is there a cleaner way to fix this issue?
BEFORE (not transactional during runtime)
@Configuration
@EnableTransactionManagement
@RefreshScope
@Slf4j
public class DataSourceConfig {

    @Bean
    @Primary
    public DSLContext dslContext(org.jooq.Configuration configuration) throws SQLException {
        return new DefaultDSLContext(configuration);
    }

    @Bean
    @Primary
    public org.jooq.Configuration defaultConfiguration(DataSourceConnectionProvider dataSourceConnectionProvider) {
        org.jooq.Configuration configuration = new DefaultConfiguration()
                .derive(dataSourceConnectionProvider)
                .derive(SQLDialect.POSTGRES_9_5);
        configuration.set(new DeleteOrUpdateWithoutWhereListener());
        return configuration;
    }

    @Bean
    public DataSourceTransactionManager transactionManager(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

    @Bean
    public DataSourceConnectionProvider dataSourceConnectionProvider(DataSource dataSource) {
        return new DataSourceConnectionProvider(dataSource);
    }

    @Configuration
    @ConditionalOnClass(EmbeddedPostgres.class)
    static class EmbeddedDataSourceConfig {

        @Value("${spring.jdbc.port}")
        private int dbPort;

        @Bean(destroyMethod = "close")
        public EmbeddedPostgres embeddedPostgres() throws Exception {
            EmbeddedPostgres embeddedPostgres = EmbeddedPostgresHelper.startDatabase(dbPort);
            return embeddedPostgres;
        }

        @Bean
        @Primary
        public DataSource dataSource(EmbeddedPostgres embeddedPostgres) throws Exception {
            DataSource dataSource = embeddedPostgres.getPostgresDatabase();
            return new TransactionAwareDataSourceProxy(dataSource);
        }
    }

    @Configuration
    @ConditionalOnMissingClass("com.opentable.db.postgres.embedded.EmbeddedPostgres")
    @RefreshScope
    static class DefaultDataSourceConfig {

        @Value("${spring.jdbc.url}")
        private String url;

        @Value("${spring.jdbc.username}")
        private String username;

        @Value("${spring.jdbc.password}")
        private String password;

        @Value("${spring.jdbc.driverClass}")
        private String driverClass;

        @Value("${spring.jdbc.MaximumPoolSize}")
        private Integer maxPoolSize;

        @Bean
        @Primary
        @RefreshScope
        public DataSource dataSource() {
            log.debug("Connecting to datasource: {}", url);
            HikariConfig hikariConfig = buildPool();
            DataSource dataSource = new HikariDataSource(hikariConfig);
            return new TransactionAwareDataSourceProxy(dataSource);
        }

        private HikariConfig buildPool() {
            HikariConfig config = new HikariConfig();
            config.setJdbcUrl(url);
            config.setUsername(username);
            config.setPassword(password);
            config.setDriverClassName(driverClass);
            config.setConnectionTestQuery("SELECT 1");
            config.setMaximumPoolSize(maxPoolSize);
            return config;
        }
    }
}
AFTER (transactional during runtime, as expected, all non-listed beans identical to above)
@Configuration
@EnableTransactionManagement
@RefreshScope
@Slf4j
public class DataSourceConfig {

    @Bean
    public DataSourceConnectionProvider dataSourceConnectionProvider(TransactionAwareDataSourceProxy dataSourceProxy) {
        return new DataSourceConnectionProvider(dataSourceProxy);
    }

    @Bean
    public TransactionAwareDataSourceProxy transactionAwareDataSourceProxy(DataSource dataSource) {
        return new TransactionAwareDataSourceProxy(dataSource);
    }

    @Configuration
    @ConditionalOnMissingClass("com.opentable.db.postgres.embedded.EmbeddedPostgres")
    @RefreshScope
    static class DefaultDataSourceConfig {

        @Value("${spring.jdbc.url}")
        private String url;

        @Value("${spring.jdbc.username}")
        private String username;

        @Value("${spring.jdbc.password}")
        private String password;

        @Value("${spring.jdbc.driverClass}")
        private String driverClass;

        @Value("${spring.jdbc.MaximumPoolSize}")
        private Integer maxPoolSize;

        @Bean
        @Primary
        @RefreshScope
        public DataSource dataSource() {
            log.debug("Connecting to datasource: {}", url);
            HikariConfig hikariConfig = buildPoolConfig();
            DataSource dataSource = new HikariDataSource(hikariConfig);
            return dataSource; // not returning the proxy here
        }
    }
}
I'll turn my comments into an answer.
The transaction manager should NOT be aware of the proxy. From the documentation:
Note that the transaction manager, for example
DataSourceTransactionManager, still needs to work with the underlying
DataSource, not with this proxy.
The class TransactionAwareDataSourceProxy is a special-purpose class that is not needed in most cases. Anything that interfaces with your data source through the Spring framework infrastructure should NOT have the proxy in its chain of access. The proxy is intended for code that cannot interface with the Spring infrastructure, for example a third-party library that was already set up to work with plain JDBC and does not accept any of Spring's JDBC templates. This is stated in the same docs as above:
This proxy allows data access code to work with the plain JDBC API and
still participate in Spring-managed transactions, similar to JDBC code
in a J2EE/JTA environment. However, if possible, use Spring's
DataSourceUtils, JdbcTemplate or JDBC operation objects to get
transaction participation even without a proxy for the target
DataSource, avoiding the need to define such a proxy in the first
place.
If you do not have any code that needs to bypass the Spring framework, then do not use TransactionAwareDataSourceProxy at all. If you do have legacy code like this, then you will need to do what you already configured in your second setup: create two beans, one being the data source and one being the proxy. Give the data source to all of the Spring-managed types and the proxy to the legacy types.
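Condensed, the working arrangement from the "AFTER" configuration looks like this (a sketch of the wiring, not a drop-in replacement):

import javax.sql.DataSource;
import org.jooq.impl.DataSourceConnectionProvider;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;
import org.springframework.jdbc.datasource.TransactionAwareDataSourceProxy;
import org.springframework.transaction.annotation.EnableTransactionManagement;

@Configuration
@EnableTransactionManagement
public class DataSourceConfig {

    // Spring infrastructure (the transaction manager) works with the raw pool.
    @Bean
    public DataSourceTransactionManager transactionManager(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

    // Code that talks plain JDBC outside Spring's templates (here, jOOQ's
    // connection provider) goes through the transaction-aware proxy instead.
    @Bean
    public TransactionAwareDataSourceProxy transactionAwareDataSourceProxy(DataSource dataSource) {
        return new TransactionAwareDataSourceProxy(dataSource);
    }

    @Bean
    public DataSourceConnectionProvider dataSourceConnectionProvider(TransactionAwareDataSourceProxy proxy) {
        return new DataSourceConnectionProvider(proxy);
    }
}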

@Sql Failed SQL scripts: The configured DataSource [*] (named 'fooDS') is not the one associated with transaction manager [*] (named 'fooTM')

Update 1 (scroll down)
The setup is as follows:
Our application database is constructed and used by two separate users:
SCHEMA - user that has authority to create and grant permissions on tables, and
APP - user who is granted permissions (INSERT, UPDATE, DELETE, SELECT) (by SCHEMA) on the above tables.
This enables us to lock any schema changes until needed, so no profound changes happen through the app user.
I am running integration tests with a live Oracle database that contains both these users. On the test class itself, I use @SqlConfig(dataSource = "schemaDataSource", transactionManager = "transactionManagerSchema").
On the test method I place two @Sql annotations that fail because, in the SqlScriptsTestExecutionListener class, the transaction is not managing the same datasource (hence the error message further below).
I've tried setting the datasource on the transaction manager manually, as shown in my config class below; however, some unknown process seems to override it every time. (My best guess is the @DataJpaTest annotation, but I don't know exactly which of the 11 auto-configurations does it; as you can see, I've already disabled a couple with no effect.)
Test Class:
@RunWith(SpringRunner.class)
@DataJpaTest(excludeAutoConfiguration = {TestDatabaseAutoConfiguration.class, DataSourceAutoConfiguration.class})
@FlywayTest
@SqlConfig(dataSource = TestDataSourceConfig.SCHEMA_DATA_SOURCE, transactionManager = "transactionManagerSchema")
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.NONE, classes = {TestDataSourceConfig.class, TestFlywayConfig.class})
@EntityScan(basePackageClasses = BaseEnum.class)
public class NotificationTypeEnumTest {

    @Autowired
    private EntityManager em;

    @Test
    @Sql(statements = {"INSERT INTO MYAPP_ENUM (ENUM_ID, \"TYPE\", \"VALUE\") VALUES (MYAPP_ENUM_ID_SEQ.nextval, '" + NotificationTypeEnum.DTYPE + "', 'foo')"}, executionPhase = Sql.ExecutionPhase.BEFORE_TEST_METHOD)
    @Sql(statements = {"DELETE FROM MYAPP_ENUM"}, executionPhase = Sql.ExecutionPhase.AFTER_TEST_METHOD)
    public void canFetchNotificationTypeEnum() throws Exception {
        TypedQuery<NotificationTypeEnum> query = em.createQuery("select a from NotificationTypeEnum a", NotificationTypeEnum.class);
        NotificationTypeEnum result = query.getSingleResult();
        assertEquals("foo", result.getValue());
        assertEquals(NotificationTypeEnum.DTYPE, result.getConfigType());
    }
}
DataSource and TM config:
@Slf4j
@Configuration
@EnableTransactionManagement
public class TestDataSourceConfig {

    public static final String SCHEMA_DATA_SOURCE = "schemaDataSource";
    public static final String SCHEMA_TRANSACTION_MANAGER = "schemaTransactionManager";

    /* Main datasource and supporting beans */
    @Bean
    @Primary
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource dataSource() {
        return new DriverManagerDataSource();
    }

    @Bean
    @Primary
    @Autowired
    public PlatformTransactionManager transactionManager(EntityManagerFactory emf) {
        return new JpaTransactionManager(emf);
    }

    @Bean(name = SCHEMA_DATA_SOURCE)
    @ConfigurationProperties(prefix = "myapp.datasource.test_schema")
    public DataSource schemaDataSource() {
        return new DriverManagerDataSource();
    }

    @Bean(name = SCHEMA_TRANSACTION_MANAGER)
    @Autowired
    public PlatformTransactionManager transactionManagerSchema(@Qualifier(SCHEMA_DATA_SOURCE) DataSource dataSource) {
        JpaTransactionManager jpaTransactionManager = new JpaTransactionManager();
        jpaTransactionManager.setDataSource(dataSource);
        return jpaTransactionManager;
    }
}
The full error that I couldn't fit in the title is:
java.lang.IllegalStateException: Failed to execute SQL scripts for test context
...
SOME LONG STACK TRACE
...
the configured DataSource [org.springframework.jdbc.datasource.DriverManagerDataSource] (named 'schemaDataSource') is not the one associated with transaction manager [org.springframework.orm.jpa.JpaTransactionManager] (named 'transactionManagerSchema').
When there is a single DataSource, it appears the Spring auto-configuration model works fine, however, as soon as there are 2 or more, the assumptions break down and the programmer needs to manually fill in the sudden (plentiful) gaps in configuration required.
Am I missing some fundamental understanding surrounding DataSources and TransactionManagers?
Update 1
After some debugging, I have discovered that the afterPropertiesSet() method is called on the bean I created when the TransactionManager is retrieved for use with the @Sql script annotation. This causes whatever EntityManagerFactory it owns (i.e. JpaTransactionManager.entityManagerFactory) to set the datasource according to its configured EntityManagerFactoryInfo.getDataSource(). The EntityManagerFactory itself is set as a result of the JpaTransactionManager.setBeanFactory method being called (as it implements BeanFactoryAware).
Here is the Spring code:
// JpaTransactionManager.java
@Override
public void setBeanFactory(BeanFactory beanFactory) throws BeansException {
    if (getEntityManagerFactory() == null) {
        if (!(beanFactory instanceof ListableBeanFactory)) {
            throw new IllegalStateException("Cannot retrieve EntityManagerFactory by persistence unit name " +
                    "in a non-listable BeanFactory: " + beanFactory);
        }
        ListableBeanFactory lbf = (ListableBeanFactory) beanFactory;
        setEntityManagerFactory(EntityManagerFactoryUtils.findEntityManagerFactory(lbf, getPersistenceUnitName()));
    }
}
I then tried creating my own EntityManagerFactory bean to inject into my created transaction manager, but this seemed to require opening up Hibernate-specific classes, and I wish to stay abstracted at the JPA level. It was also difficult to configure at first glance.
Finally, a JPA-only solution!
The solution was to control the creation of the EntityManagerFactory beans using the provided Spring EntityManagerFactoryBuilder component and to inject the EntityManager into the test using the @PersistenceContext annotation.
@SqlConfig(dataSource = TestDataSourceConfig.SCHEMA_DATA_SOURCE, transactionManager = SCHEMA_TRANSACTION_MANAGER, transactionMode = SqlConfig.TransactionMode.ISOLATED)
...
public class MyJUnitTest {

    @PersistenceContext(unitName = "pu")
    private EntityManager em;

    ...

    @Test
    @Sql(statements = {"SOME SQL USING THE PRIVILEGED SCHEMA CONNECTION"}, ...)
    public void myTest() {
        em.createQuery("...").getResultList(); // uses the APP database user.
    }
}
Below is the configuration for both datasources. The application-related DataSource beans all have @Primary in their definition to disambiguate any @Autowired dependencies. There are no Hibernate-specific classes needed other than the automatic Hibernate config done through @DataJpaTest.
@Configuration
@EnableTransactionManagement
@EnableConfigurationProperties(JpaProperties.class)
public class TestDataSourceConfig {

    public static final String SCHEMA_DATA_SOURCE = "schemaDS";
    public static final String SCHEMA_TRANSACTION_MANAGER = "schemaTM";
    public static final String SCHEMA_EMF = "schemaEMF";

    /* Main datasource and supporting beans */
    @Bean
    @Primary
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource dataSource() {
        return new DriverManagerDataSource();
    }

    @Bean
    @Primary
    @Autowired
    public PlatformTransactionManager transactionManager(EntityManagerFactory emf) {
        return new JpaTransactionManager(emf);
    }

    @Bean
    @Primary
    public LocalContainerEntityManagerFactoryBean emfBean(
            EntityManagerFactoryBuilder entityManagerFactoryBuilder,
            DataSource datasource,
            JpaProperties jpaProperties) {
        return entityManagerFactoryBuilder
                .dataSource(datasource)
                .jta(false)
                .packages(CourseOffering.class)
                .persistenceUnit("pu")
                .properties(jpaProperties.getProperties())
                .build();
    }

    @Bean(name = SCHEMA_EMF)
    public LocalContainerEntityManagerFactoryBean emfSchemaBean(
            EntityManagerFactoryBuilder entityManagerFactoryBuilder,
            @Qualifier(SCHEMA_DATA_SOURCE) DataSource schemaDataSource,
            JpaProperties jpaProperties) {
        return entityManagerFactoryBuilder
                .dataSource(schemaDataSource)
                .jta(false)
                .packages(CourseOffering.class)
                .persistenceUnit("spu")
                .properties(jpaProperties.getProperties())
                .build();
    }

    @Bean(name = SCHEMA_DATA_SOURCE)
    @ConfigurationProperties(prefix = "myapp.datasource.test_schema")
    public DataSource schemaDataSource() {
        return new DriverManagerDataSource();
    }

    @Bean(name = SCHEMA_TRANSACTION_MANAGER)
    public PlatformTransactionManager transactionManagerSchema(
            @Qualifier(SCHEMA_EMF) EntityManagerFactory emfSchemaBean) {
        JpaTransactionManager jpaTransactionManager = new JpaTransactionManager();
        jpaTransactionManager.setEntityManagerFactory(emfSchemaBean);
        return jpaTransactionManager;
    }
}
Actual Test Class:
@RunWith(SpringRunner.class) // required for all Spring tests
@DataJpaTest(excludeAutoConfiguration = {TestDatabaseAutoConfiguration.class, DataSourceAutoConfiguration.class}) // stops the default data source and database being configured
@SqlConfig(dataSource = TestDataSourceConfig.SCHEMA_DATA_SOURCE, transactionManager = SCHEMA_TRANSACTION_MANAGER, transactionMode = SqlConfig.TransactionMode.ISOLATED) // make sure the @Sql statements are run using the SCHEMA datasource and txManager, in an isolated way so as not to cause problems for the test methods requiring them
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.NONE, classes = {TestDataSourceConfig.class})
@TestExecutionListeners({
        SqlScriptsTestExecutionListener.class, // enables the @Sql script annotations to work
        SpringBootDependencyInjectionTestExecutionListener.class, // injects Spring components into the test (i.e. the EntityManager)
        TransactionalTestExecutionListener.class}) // included even though the @Transactional annotations don't exist yet, as I plan on using them in further tests
public class NotificationTypeEnumTest {

    @PersistenceContext(unitName = "pu") // required to inject the correct EntityManager
    private EntityManager em;

    // these statements are run with the privileged SCHEMA connection
    @Test
    @Sql(statements = {"INSERT INTO MYAPP_ENUM (ENUM_ID, \"TYPE\", \"VALUE\") VALUES (MYAPP_ENUM_ID_SEQ.nextval, '" + NotificationTypeEnum.DTYPE + "', 'foo')"}, executionPhase = Sql.ExecutionPhase.BEFORE_TEST_METHOD)
    @Sql(statements = {"DELETE FROM MYAPP_ENUM"}, executionPhase = Sql.ExecutionPhase.AFTER_TEST_METHOD)
    public void canFetchNotificationTypeEnum() throws Exception {
        TypedQuery<NotificationTypeEnum> query = em.createQuery("select a from NotificationTypeEnum a", NotificationTypeEnum.class); // NotificationTypeEnum is just a subclass of the BaseEnum type
        NotificationTypeEnum result = query.getSingleResult();
        assertEquals("foo", result.getValue());
        assertEquals(NotificationTypeEnum.DTYPE, result.getConfigType());
    }
}
Noteworthy classes:
EntityManagerFactoryBuilder - I don't like factory factories, but this one served me well in creating the correct implementation of EntityManagerFactory without depending on any Hibernate-specific classes. It may be injected with @Autowired. The builder bean itself is configured through the HibernateJpaAutoConfiguration class (extends JpaBaseConfiguration), imported by @DataJpaTest.
JpaProperties - useful for carrying application.properties config into the resulting EntityManagerFactory instances; enabled through the @EnableConfigurationProperties(JpaProperties.class) annotation above the config class.
@PersistenceContext(unitName = "...") - lets me inject the correct EntityManager into my test class.

Use of multiple DataSources in Spring Batch

I am trying to configure a couple of datasources within Spring Batch. On startup, Spring Batch throws the following exception:
To use the default BatchConfigurer the context must contain no more than one DataSource, found 2
Snippet from Batch Configuration
@Configuration
@EnableBatchProcessing
public class BatchJobConfiguration {

    @Primary
    @Bean(name = "baseDatasource")
    public DataSource dataSource() {
        // first datasource definition here
    }

    @Bean(name = "secondaryDataSource")
    public DataSource dataSource2() {
        // second datasource definition here
    }

    ...
}
Not sure why I am seeing this exception, because I have seen XML-based Spring Batch configurations that declare multiple datasources. I am using Spring Batch core version 3.0.1.RELEASE with Spring Boot version 1.1.5.RELEASE. Any help would be greatly appreciated.
You must provide your own BatchConfigurer; Spring does not want to make that decision for you:
@Configuration
@EnableBatchProcessing
public class BatchConfig {

    @Bean
    BatchConfigurer configurer(@Qualifier("batchDataSource") DataSource dataSource) {
        return new DefaultBatchConfigurer(dataSource);
    }

    ...
}
AbstractBatchConfiguration tries to look up a BatchConfigurer in the container first; if none is found, it tries to create one itself, and this is where the IllegalStateException is thrown when there is more than one DataSource bean in the container.
The approach to solving the problem is to prevent the creation of the DefaultBatchConfigurer bean in AbstractBatchConfiguration.
To do that, we hint the Spring container to create the DefaultBatchConfigurer using the @Component annotation:
we annotate the configuration class where @EnableBatchProcessing is placed with @ComponentScan, scanning the package that contains an empty class derived from DefaultBatchConfigurer:
package batch_config;
...

@EnableBatchProcessing
@ComponentScan(basePackageClasses = MyBatchConfigurer.class)
public class MyBatchConfig {
    ...
}
the full code of that empty derived class is here:
package batch_config.components;

import org.springframework.batch.core.configuration.annotation.DefaultBatchConfigurer;
import org.springframework.stereotype.Component;

@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {
}
In this configuration, the @Primary annotation works for the DataSource bean as in the example below:
@Configuration
public class BatchTestDatabaseConfig {

    @Bean
    @Primary
    public DataSource dataSource()
    {
        return .........;
    }
}
This works for the Spring Batch version 3.0.3.RELEASE
The simplest way to make the @Primary annotation on a DataSource work might be just adding @ComponentScan(basePackageClasses = DefaultBatchConfigurer.class) along with the @EnableBatchProcessing annotation:
@Configuration
@EnableBatchProcessing
@ComponentScan(basePackageClasses = DefaultBatchConfigurer.class)
public class MyBatchConfig {
    ...
}
I would like to provide a solution here which is very similar to the one answered by @vanarchi, but I managed to put all the necessary configuration into one class.
For the sake of completeness, the solution here assumes that the primary datasource is HSQL.
@Configuration
@EnableBatchProcessing
public class BatchConfiguration extends DefaultBatchConfigurer {

    @Bean
    @Primary
    public DataSource batchDataSource() {
        // no need to shut down; EmbeddedDatabaseFactoryBean will take care of this
        EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
        EmbeddedDatabase embeddedDatabase = builder
                .addScript("classpath:org/springframework/batch/core/schema-drop-hsqldb.sql")
                .addScript("classpath:org/springframework/batch/core/schema-hsqldb.sql")
                .setType(EmbeddedDatabaseType.HSQL) // .H2 or .DERBY
                .build();
        return embeddedDatabase;
    }

    @Override
    protected JobRepository createJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(batchDataSource());
        factory.setTransactionManager(transactionManager());
        factory.afterPropertiesSet();
        return (JobRepository) factory.getObject();
    }

    private ResourcelessTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    // NOTE: the code below just gives the developer an easy way to access the in-memory
    // HSQL datasource, since we configured it as the primary datasource for storing batch
    // job related data. Default username: sa, password: ''
    @PostConstruct
    public void getDbManager() {
        DatabaseManagerSwing.main(
                new String[] { "--url", "jdbc:hsqldb:mem:testdb", "--user", "sa", "--password", "" });
    }
}
THREE key points in this solution:
This class is annotated with @EnableBatchProcessing and @Configuration, and extends DefaultBatchConfigurer. By doing this, we instruct Spring Batch to use our customized batch configurer when AbstractBatchConfiguration tries to look up a BatchConfigurer.
Annotate the batchDataSource bean as @Primary, which instructs Spring Batch to use this datasource for storing the nine job-related tables.
Override the protected JobRepository createJobRepository() throws Exception method, which makes the jobRepository bean use the primary datasource, as well as use a transactionManager instance separate from the one used by the other datasource(s).
The simplest solution is to extend DefaultBatchConfigurer and autowire your datasource via a qualifier:
@Component
public class MyBatchConfigurer extends DefaultBatchConfigurer {

    /**
     * Initialize the BatchConfigurer to use the datasource of your choosing.
     * @param firstDataSource
     */
    @Autowired
    public MyBatchConfigurer(@Qualifier("firstDataSource") DataSource firstDataSource) {
        super(firstDataSource);
    }
}
Side Note (as this also deals with the use of multiple data sources): If you use autoconfig to run data initialization scripts, you may notice that it's not initializing on the datasource you'd expect. For that issue, take a look at this: https://github.com/spring-projects/spring-boot/issues/9528
You can define the beans below; make sure your application.properties file has the corresponding entries:
@Configuration
@PropertySource("classpath:application.properties")
public class DataSourceConfig {

    @Primary
    @Bean(name = "abcDataSource")
    @ConfigurationProperties(prefix = "abc.datasource")
    public DataSource dataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }

    @Bean(name = "xyzDataSource")
    @ConfigurationProperties(prefix = "xyz.datasource")
    public DataSource xyzDataSource() {
        return DataSourceBuilder.create().type(HikariDataSource.class).build();
    }
}
application.properties
abc.datasource.jdbc-url=XXXXX
abc.datasource.username=XXXXX
abc.datasource.password=xxxxx
abc.datasource.driver-class-name=org.postgresql.Driver
...........
...........
...........
...........
Here you can refer: Spring Boot Configure and Use Two DataSources
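A usage sketch to go with the config above: any component can then select one of the two pools with a qualifier (ReportingService is a hypothetical consumer, not part of the answer's code):

import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.stereotype.Component;

@Component
public class ReportingService {

    private final DataSource dataSource;

    // Pick the secondary pool explicitly; without the qualifier,
    // the @Primary "abcDataSource" would be injected.
    public ReportingService(@Qualifier("xyzDataSource") DataSource dataSource) {
        this.dataSource = dataSource;
    }
}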
First, create a custom BatchConfigurer
@Configuration
@Component
public class TwoDataSourcesBatchConfigurer implements BatchConfigurer {

    @Autowired
    @Qualifier("dataSource1")
    DataSource dataSource;

    @Override
    public JobExplorer getJobExplorer() throws Exception {
        ...
    }

    @Override
    public JobLauncher getJobLauncher() throws Exception {
        ...
    }

    @Override
    public JobRepository getJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        // use the autowired data source
        factory.setDataSource(dataSource);
        factory.setTransactionManager(getTransactionManager());
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Override
    public PlatformTransactionManager getTransactionManager() throws Exception {
        ...
    }
}
Then,
@Configuration
@EnableBatchProcessing
@ComponentScan("package")
public class JobConfig {
    // define job, step, ...
}
