Spring Boot auto-configuration order

I want to create a Spring Boot auto-configuration class that (conditionally) creates a single bean A. The challenge, however, is that this bean has to be created before another bean B is created in one of Spring Boot's default auto-configuration classes. Bean B does not depend on A.
My first try was to use @AutoConfigureBefore. That didn't work the way I expected, and judging from this comment by Dave Syer, it shouldn't.
Some background: the aforementioned bean A alters a Mongo database, and this has to happen before the MongoTemplate is created.

It turns out that what I want is to dynamically make instances of B depend on A. This can be achieved by using a BeanFactoryPostProcessor to alter the bean definitions of the B beans.
import org.springframework.beans.BeansException;
import org.springframework.beans.factory.BeanFactoryUtils;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.BeanFactoryPostProcessor;
import org.springframework.beans.factory.config.ConfigurableListableBeanFactory;
import org.springframework.util.StringUtils;

public class DependsOnPostProcessor implements BeanFactoryPostProcessor {

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory beanFactory) throws BeansException {
        // Find every bean definition of type B and make it depend on bean A
        String[] beanNames = BeanFactoryUtils.beanNamesForTypeIncludingAncestors(
                beanFactory, B.class, true, false);
        for (String beanName : beanNames) {
            BeanDefinition definition = beanFactory.getBeanDefinition(beanName);
            definition.setDependsOn(StringUtils.addStringToArray(
                    definition.getDependsOn(), "beanNameOfA"));
        }
    }

}
This works with plain Spring; no Spring Boot required. To complete the auto-configuration, I need to add the bean definition for the DependsOnPostProcessor to the configuration class that instantiates bean A.
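A minimal sketch of what that configuration class could look like (the names A, beanNameOfA, and DependsOnPostProcessor follow the example above and are placeholders, not a definitive implementation):

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class AAutoConfiguration {

    // Bean A, whose creation must precede the creation of any B bean;
    // the name matches the "beanNameOfA" string used by the post-processor
    @Bean
    public A beanNameOfA() {
        return new A();
    }

    // Declared static so the BeanFactoryPostProcessor is instantiated early,
    // before the regular beans it post-processes
    @Bean
    public static DependsOnPostProcessor dependsOnPostProcessor() {
        return new DependsOnPostProcessor();
    }

}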

There is a new annotation @AutoConfigureOrder in Boot 1.3.0, though it is unclear, to me at least, whether this behaves the same way as @AutoConfigureBefore.
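For reference, a minimal sketch of how @AutoConfigureOrder is applied (the auto-configuration class is hypothetical); it orders the processing of auto-configuration classes relative to one another and, like @AutoConfigureBefore, does not by itself control bean creation order:

import org.springframework.boot.autoconfigure.AutoConfigureOrder;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.Ordered;

@Configuration
@AutoConfigureOrder(Ordered.HIGHEST_PRECEDENCE)
public class AAutoConfiguration {
    // bean definitions as in the example above
}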

Referring to @hzpz's answer, here is an example that creates a database before the auto-configuration of the Hikari data source.
import com.zaxxer.hikari.HikariDataSource;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;
import javax.annotation.PostConstruct;
import org.springframework.beans.factory.BeanFactoryUtils;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.beans.factory.config.BeanDefinition;
import org.springframework.beans.factory.config.BeanFactoryPostProcessor;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.util.StringUtils;

@Configuration
public class CreateDatabaseConfig {

    @Value("${spring.datasource.url}")
    private String url;

    @Value("${spring.datasource.hikari.username}")
    private String username;

    @Value("${spring.datasource.hikari.password}")
    private String password;

    @Value("${spring.datasource.hikari.catalog}")
    private String catalog;

    // Make every HikariDataSource bean depend on this configuration class,
    // so the @PostConstruct method below runs before the data source is created
    @Bean
    public static BeanFactoryPostProcessor dependsOnPostProcessor() {
        return beanFactory -> {
            String[] beanNames = BeanFactoryUtils.beanNamesForTypeIncludingAncestors(
                    beanFactory, HikariDataSource.class, true, false);
            for (String beanName : beanNames) {
                BeanDefinition definition = beanFactory.getBeanDefinition(beanName);
                definition.setDependsOn(StringUtils.addStringToArray(
                        definition.getDependsOn(), "createDatabaseConfig"));
            }
        };
    }

    @PostConstruct
    public void init() {
        try (
            Connection connection = DriverManager.getConnection(url, username, password);
            Statement statement = connection.createStatement()
        ) {
            statement.executeUpdate(
                "CREATE DATABASE IF NOT EXISTS `" + catalog
                    + "` DEFAULT CHARACTER SET = utf8mb4 COLLATE utf8mb4_unicode_ci;"
            );
        } catch (SQLException e) {
            throw new RuntimeException("Create Database Fail", e);
        }
    }

}
More initialization of schema and data can be done with data.sql and schema.sql; see "85. Database Initialization" in the Spring Boot reference documentation.
PS: I tried to CREATE DATABASE in schema.sql but failed; the example above works. :)

Related

Java aop ComponentScan not working & AnnotationConfigApplicationContext getBean not working

I wrote a simple set of classes to show a friend how to use annotations for AOP (instead of XML config). We couldn't get @ComponentScan to work, and AnnotationConfigApplicationContext's getBean misbehaves too. I wanted to understand two things. See the code below:
PersonOperationsI.java
package samples.chapter3;

import org.springframework.stereotype.Component;

@Component
public interface PersonOperationsI {

    public String getName();

}
PersonOperations.java
/**
 *
 */
package samples.chapter3;

import org.springframework.stereotype.Component;

@Component
public class PersonOperations implements PersonOperationsI {

    public String getName() {
        return "";
    }

}
PersonOperationsConfigClass.java
package samples.chapter3;

import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.EnableAspectJAutoProxy;

@Configuration
// question2 - the component scan below didn't work - test case failing in setUp()
//@ComponentScan(basePackages = {"samples.chapter3"})
@EnableAspectJAutoProxy
public class PersonOperationsConfigClass {
}
PersonOperationsAdvice.java
/**
 *
 */
package samples.chapter3;

import org.aspectj.lang.JoinPoint;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Before;
import org.springframework.stereotype.Component;

@Component
@Aspect
public class PersonOperationsAdvice {

    /**
     * execution( [Modifiers] [ReturnType] [FullClassName].[MethodName]
     * ([Arguments]) throws [ExceptionType])
     * @param joinPoint
     * @return
     */
    @Before("execution(public * samples.chapter3.PersonOperations.getName())")
    public String beforeGetName(JoinPoint joinPoint) {
        System.out.println("method name = " + joinPoint.getSignature().getName());
        return null;
    }

}
PersonOperationsTest.java
package samples.chapter3;

import org.junit.Assert;
import org.junit.Before;
import org.junit.Ignore;
import org.junit.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit.jupiter.SpringExtension;

@ExtendWith(SpringExtension.class)
@ContextConfiguration(classes = { PersonOperationsConfigClass.class })
public class PersonOperationsTest {

    //@Autowired
    private PersonOperationsI obj;

    @Before
    public void setUp() {
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext();
        ctx.scan("samples.chapter3");
        ctx.refresh();
        obj = ctx.getBean(PersonOperationsI.class);
        //obj = ctx.getBean(PersonOperations.class); // getBean of child class not working - why?
        Assert.assertNotNull(obj);
        ctx.close();
    }

    @Test
    public void test() {
        System.out.println(obj.getName());
    }

}
Question 1 - Why doesn't @ComponentScan work? If I don't use AnnotationConfigApplicationContext in the test case and just rely on @ComponentScan and @Autowired, the object in the test case is null.
Question 2 - ctx.getBean(PersonOperations.class); // getBean of the child class is not working - why?
A1: @ComponentScan didn't work because it is commented out in PersonOperationsConfigClass, the component class used for loading the ApplicationContext:
@Configuration
//@ComponentScan(basePackages = {"samples.chapter3"})
@EnableAspectJAutoProxy
public class PersonOperationsConfigClass {}
The test class gets the ApplicationContext created from the component classes specified with the @ContextConfiguration annotation. Since no components were created or auto-detected, @Autowired failed.
When AnnotationConfigApplicationContext was used within a method annotated with @Before, an ApplicationContext was created programmatically. ctx.scan("samples.chapter3"); scanned and auto-detected PersonOperations, which is annotated with @Component. The obj reference got set with the code obj = ctx.getBean(PersonOperationsI.class);. This object was not autowired.
Update based on the comment from the OP:
"The JUnit 4 annotations and the @ExtendWith(SpringExtension.class) combination is not working for me."
The following test class runs successfully with zero errors/failures; obj is autowired and not null. I have used the corresponding annotations from JUnit 5.
package rg.app.aop.so.q1;

import org.junit.jupiter.api.Assertions;
import org.junit.jupiter.api.BeforeEach;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit.jupiter.SpringExtension;

@ExtendWith(SpringExtension.class)
@ContextConfiguration(classes = { PersonOperationsConfigClass.class })
public class PersonOperationsTest {

    @Autowired
    private PersonOperationsI obj;

    @BeforeEach
    public void setUp() {
        System.out.println("init :: " + obj);
        Assertions.assertNotNull(obj);
    }

    @Test
    public void testPersonOps() {
        Assertions.assertNotNull(obj);
    }

}
Configuration class
package rg.app.aop.so.q1;

import org.springframework.context.annotation.ComponentScan;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.EnableAspectJAutoProxy;

@Configuration
@EnableAspectJAutoProxy
@ComponentScan(basePackages = {"rg.app.aop.so.q1"})
public class PersonOperationsConfigClass {
}
A2:
Following is my analysis.
Remember, @EnableAspectJAutoProxy has a default value of "false" for its proxyTargetClass attribute. This attribute determines the proxying mechanism: JDK dynamic proxy (false) or CGLIB proxy (true).
Here the presence of a valid aspect with a valid advice causes the actual proxying to kick in. A component gets proxied only when an advice has an effect on it; in short, a component is proxied only if required.
Case 1
When: @EnableAspectJAutoProxy / @EnableAspectJAutoProxy(proxyTargetClass = false)
ctx.getBean(InterfaceType) returns a bean
ctx.getBean(ImplementationClassType) fails to return a bean
Case 2
When: @EnableAspectJAutoProxy(proxyTargetClass = true)
ctx.getBean(InterfaceType) returns a bean
ctx.getBean(ImplementationClassType) returns a bean
Case 3
When: the @EnableAspectJAutoProxy annotation is absent
ctx.getBean(InterfaceType) returns a bean
ctx.getBean(ImplementationClassType) returns a bean
Case 1: Spring AOP is enabled with proxyTargetClass as false. A JDK proxy creates a proxy bean of the interface type.
The bean created is of type InterfaceType and not ImplementationClassType. This explains why ctx.getBean(ImplementationClassType) fails to return a bean.
Case 2: Spring AOP is enabled with proxyTargetClass as true. CGLIB creates a proxy bean by subclassing the class annotated with @Component.
The bean created is of type ImplementationClassType, and also qualifies as InterfaceType. So both getBean() calls return this bean successfully.
Case 3: Spring only creates "proxy" objects if any special processing is required (e.g. AOP, transaction management).
With this logic, since @EnableAspectJAutoProxy is absent, a bean gets created for the class annotated with @Component without any proxying.
The bean created is of type ImplementationClassType, and also qualifies as InterfaceType. So both getBean() calls return this bean successfully.
Analysis done with the following code.
package rg.app.aop.so.q1;

import org.springframework.context.annotation.AnnotationConfigApplicationContext;

public class AppMain {

    public static void main(String[] args) {
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext();
        ctx.scan("rg.app.aop.so.q1");
        ctx.refresh();
        System.out.println();
        for (String name : ctx.getBeanNamesForType(PersonOperationsI.class)) {
            System.out.println(name);
        }
        for (String name : ctx.getBeanNamesForType(PersonOperations.class)) {
            System.out.println(name);
        }
        PersonOperationsI obj = ctx.getBean(PersonOperationsI.class);
        System.out.println(obj.getClass());
        obj = ctx.getBean(PersonOperations.class);
        System.out.println(obj.getClass());
        ctx.registerShutdownHook();
    }

}
Case 1 prints
personOperations
class com.sun.proxy.$Proxy18
Exception in thread "main" org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type 'rg.app.aop.so.q1.PersonOperations' available
Case 2 prints
personOperations
personOperations
class rg.app.aop.so.q1.PersonOperations$$EnhancerBySpringCGLIB$$c179e7f2
class rg.app.aop.so.q1.PersonOperations$$EnhancerBySpringCGLIB$$c179e7f2
Case 3 prints
personOperations
personOperations
class rg.app.aop.so.q1.PersonOperations
class rg.app.aop.so.q1.PersonOperations
Hope this helps
Usually you should use @ComponentScan along with a @Configuration-annotated class, and keep in mind that @ComponentScan without arguments tells Spring to scan the current package and all of its sub-packages.
The @Component annotation tells Spring to create a bean of that type, so you no longer need XML configuration; a bean must be a class that can be instantiated, so no interfaces / abstract classes. In your case, you should remove @Component from PersonOperationsI and leave it only on PersonOperations. When you annotate a class with @Component, the default name given to the bean is the class name with a lower-cased first letter, so you should call ctx.getBean("personOperations") or ctx.getBean(PersonOperations.class), as sketched below.
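For example, a minimal sketch of both lookups (assuming @Component remains only on PersonOperations, as suggested above):

package samples.chapter3;

import org.springframework.context.annotation.AnnotationConfigApplicationContext;

public class NamingExample {

    public static void main(String[] args) {
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext();
        ctx.scan("samples.chapter3");
        ctx.refresh();
        // default bean name: simple class name with a lower-cased first letter
        PersonOperationsI byName = (PersonOperationsI) ctx.getBean("personOperations");
        PersonOperationsI byType = ctx.getBean(PersonOperationsI.class);
        System.out.println(byName == byType); // true - the same singleton bean
        ctx.close();
    }

}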
And for the future, read these naming conventions for interfaces and implementations. In your case I would rename:
PersonOperationsI to Operations
Question 2
As you said, the bean scanning process was not complete, so there is no bean in the context and you should not expect any bean from it, either the @Autowired way or the context.getBean way (both return null).
The link below has more information on bean scanning (it may help):
Spring Component Scanning

GemFire entry Time-To-Live is not getting set using spring-cache

Following is a code snippet that succeeds in persisting data in a remote GemFire cluster and keeps the local spring-cache updated. However, the entries are not getting DESTROY-ed as expected when I tried using ExpirationAttributes. I've referred to this and related links.
import org.springframework.data.gemfire.ExpirationActionType;
import org.springframework.data.gemfire.ExpirationAttributesFactoryBean;
import org.springframework.data.gemfire.RegionAttributesFactoryBean;
import org.springframework.data.gemfire.client.ClientCacheFactoryBean;
import org.springframework.data.gemfire.client.ClientRegionFactoryBean;
import org.springframework.data.gemfire.support.ConnectionEndpoint;
import org.springframework.data.gemfire.support.GemfireCacheManager;
import com.gemstone.gemfire.cache.ExpirationAttributes;
import com.gemstone.gemfire.cache.RegionAttributes;
import com.gemstone.gemfire.cache.client.ClientCache;
import com.gemstone.gemfire.cache.client.ClientRegionShortcut;
import com.gemstone.gemfire.pdx.ReflectionBasedAutoSerializer;

@Configuration
@Profile("local")
public class GemFireCachingConfig {

    @Bean
    Properties gemfireProperties(...) {
        // Sets gemfire properties and returns them
        return gemfireProperties;
    }

    @Bean
    @Primary
    ReflectionBasedAutoSerializer reflectionBasedAutoSerializer() {
        return new ReflectionBasedAutoSerializer("pkg.containing.cacheable.object");
    }

    @Bean
    @Primary
    ClientCacheFactoryBean clientCacheFactory(String injectedGemFirehost,
            int injectedGemfirePort, Properties gemfireProperties,
            ReflectionBasedAutoSerializer reflectionBasedAutoSerializer) {
        ClientCacheFactoryBean cachefactoryBean = new ClientCacheFactoryBean();
        cachefactoryBean.setProperties(gemfireProperties);
        cachefactoryBean.setClose(true);
        cachefactoryBean.setPdxSerializer(reflectionBasedAutoSerializer);
        cachefactoryBean.setPdxReadSerialized(false);
        cachefactoryBean.setPdxIgnoreUnreadFields(true);
        ConnectionEndpoint[] locators = new ConnectionEndpoint[1];
        locators[0] = new ConnectionEndpoint(injectedGemFirehost, injectedGemfirePort);
        cachefactoryBean.setLocators(locators);
        return cachefactoryBean;
    }

    @Bean
    public ExpirationAttributesFactoryBean entryTtlExpirationAttributes(
            int injectedTimeoutInSecs) {
        ExpirationAttributesFactoryBean expirationAttributes = new ExpirationAttributesFactoryBean();
        expirationAttributes.setAction(ExpirationActionType.DESTROY.getExpirationAction());
        expirationAttributes.setTimeout(injectedTimeoutInSecs);
        return expirationAttributes;
    }

    @Bean
    @Autowired
    public RegionAttributesFactoryBean regionAttributes(
            @Qualifier("entryTtlExpirationAttributes") ExpirationAttributes entryTtl) {
        RegionAttributesFactoryBean regionAttributes = new RegionAttributesFactoryBean();
        regionAttributes.setStatisticsEnabled(true);
        regionAttributes.setEntryTimeToLive(entryTtl);
        return regionAttributes;
    }

    @Bean
    @Primary
    ClientRegionFactoryBean<String, Object> regionFactoryBean(ClientCache gemfireCache,
            @Qualifier("regionAttributes") RegionAttributes<String, Object> regionAttributes) {
        ClientRegionFactoryBean<String, Object> regionFactoryBean = new ClientRegionFactoryBean<>();
        regionFactoryBean.setAttributes(regionAttributes);
        regionFactoryBean.setCache(gemfireCache);
        regionFactoryBean.setClose(false);
        regionFactoryBean.setPersistent(false);
        regionFactoryBean.setRegionName(regionName);
        regionFactoryBean.setShortcut(ClientRegionShortcut.CACHING_PROXY_HEAP_LRU);
        return regionFactoryBean;
    }

    @Bean
    GemfireCacheManager cacheManager(ClientCache gemfireCache) {
        GemfireCacheManager cacheManager = new GemfireCacheManager();
        cacheManager.setCache(gemfireCache);
        return cacheManager;
    }

}
Just curious how you think the injectedTimeoutInSecs is "injected" into your entryTtlExpirationAttributes bean definition in your Spring config; this...
@Bean
public ExpirationAttributesFactoryBean entryTtlExpirationAttributes(
        int injectedTimeoutInSecs) {
    ExpirationAttributesFactoryBean expirationAttributes =
        new ExpirationAttributesFactoryBean();
    expirationAttributes.setAction(
        ExpirationActionType.DESTROY.getExpirationAction());
    expirationAttributes.setTimeout(injectedTimeoutInSecs);
    return expirationAttributes;
}
You need to annotate your entryTtlExpirationAttributes bean definition method parameter (i.e. injectedTimeoutInSecs) with Spring's @Value annotation, like so...
@Bean
public ExpirationAttributesFactoryBean entryTtlExpirationAttributes(
        @Value("${gemfire.cache.expiration.ttl.timeout:600}")
        int injectedTimeoutInSecs) {
Then, in your Spring Boot application.properties file, you can set a value for the property (gemfire.cache.expiration.ttl.timeout)...
#application.properties
gemfire.cache.expiration.ttl.timeout = 300
The @Value annotation can supply a default if the property is not explicitly set...
@Value("${property:defaultValue}")
Additionally, you need to supply a PropertySourcesPlaceholderConfigurer bean definition in your Spring Java config to enable Spring to "replace" property placeholder values...
@Bean
static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
    return new PropertySourcesPlaceholderConfigurer();
}
You can see a similar configuration to what you have above here.
Finally, you can simplify your entire Spring, GemFire Java configuration class to this...
import java.util.Collections;
import org.apache.geode.cache.ExpirationAttributes;
import org.apache.geode.cache.GemFireCache;
import org.apache.geode.cache.RegionAttributes;
import org.apache.geode.cache.client.ClientRegionShortcut;
import org.apache.geode.pdx.ReflectionBasedAutoSerializer;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Primary;
import org.springframework.context.annotation.Profile;
import org.springframework.context.support.PropertySourcesPlaceholderConfigurer;
import org.springframework.data.gemfire.RegionAttributesFactoryBean;
import org.springframework.data.gemfire.cache.config.EnableGemfireCaching;
import org.springframework.data.gemfire.client.ClientRegionFactoryBean;
import org.springframework.data.gemfire.config.annotation.ClientCacheApplication;
import org.springframework.data.gemfire.config.annotation.ClientCacheConfigurer;
import org.springframework.data.gemfire.config.annotation.EnablePdx;
import org.springframework.data.gemfire.expiration.ExpirationActionType;
import org.springframework.data.gemfire.expiration.ExpirationAttributesFactoryBean;
import org.springframework.data.gemfire.support.ConnectionEndpoint;

@ClientCacheApplication
@EnableGemfireCaching
@EnablePdx(ignoreUnreadFields = true, readSerialized = false,
    serializerBeanName = "reflectionBasedAutoSerializer")
@Profile("local")
public class GemFireCachingConfig {

    @Bean
    static PropertySourcesPlaceholderConfigurer propertySourcesPlaceholderConfigurer() {
        return new PropertySourcesPlaceholderConfigurer();
    }

    // NOTE: you can externalize Pivotal GemFire properties in a gemfire.properties file,
    // placed in the root of your application classpath.
    //
    // Alternatively, you can use Spring Boot's application.properties to set GemFire properties
    // using the corresponding Spring Data GemFire (annotation-based) property
    // (e.g. spring.data.gemfire.cache.log-level)
    //
    // See here...
    // https://docs.spring.io/spring-data/gemfire/docs/current/api/org/springframework/data/gemfire/config/annotation/ClientCacheApplication.html#logLevel--

    @Bean
    @Primary
    ReflectionBasedAutoSerializer reflectionBasedAutoSerializer() {
        return new ReflectionBasedAutoSerializer("pkg.containing.cacheable.object");
    }

    @Bean
    ClientCacheConfigurer clientCacheHostPortConfigurer(
            @Value("${gemfire.locator.host}") String locatorHost,
            @Value("${gemfire.locator.port}") int locatorPort) {
        return (beanName, clientCacheFactoryBean) ->
            clientCacheFactoryBean.setLocators(Collections.singletonList(
                new ConnectionEndpoint(locatorHost, locatorPort)));
    }

    @Bean("RegionNameHere")
    ClientRegionFactoryBean<String, Object> regionFactoryBean(GemFireCache gemfireCache,
            @Qualifier("regionAttributes") RegionAttributes<String, Object> regionAttributes) {
        ClientRegionFactoryBean<String, Object> clientRegionFactory = new ClientRegionFactoryBean<>();
        clientRegionFactory.setAttributes(regionAttributes);
        clientRegionFactory.setCache(gemfireCache);
        clientRegionFactory.setClose(false);
        clientRegionFactory.setShortcut(ClientRegionShortcut.CACHING_PROXY_HEAP_LRU);
        return clientRegionFactory;
    }

    @Bean
    public RegionAttributesFactoryBean regionAttributes(
            @Qualifier("entryTtlExpirationAttributes") ExpirationAttributes expirationAttributes) {
        RegionAttributesFactoryBean regionAttributes = new RegionAttributesFactoryBean();
        regionAttributes.setStatisticsEnabled(true);
        regionAttributes.setEntryTimeToLive(expirationAttributes);
        return regionAttributes;
    }

    @Bean
    public ExpirationAttributesFactoryBean entryTtlExpirationAttributes(
            @Value("${gemfire.cache.expiration:600}") int timeoutInSeconds) {
        ExpirationAttributesFactoryBean expirationAttributes = new ExpirationAttributesFactoryBean();
        expirationAttributes.setAction(ExpirationActionType.DESTROY.getExpirationAction());
        expirationAttributes.setTimeout(timeoutInSeconds);
        return expirationAttributes;
    }

}
Of course, this configuration is based on Spring Data GemFire 2.0.1.RELEASE (Kay-SR1).
Notice the @ClientCacheApplication annotation, which replaces the need for your clientCacheFactory bean definition.
I also used the new @EnablePdx annotation to configure GemFire's PDX serialization behavior.
I declared a ClientCacheConfigurer-typed bean definition (clientCacheHostPortConfigurer) to dynamically adjust the Locator host and port configuration based on property placeholders.
I defined a PropertySourcesPlaceholderConfigurer to handle the property placeholders used in the @Value annotations throughout the Spring, Java-based configuration meta-data.
I also used the new @EnableGemfireCaching annotation, which replaces the need to explicitly define a gemfireCacheManager bean definition. It also enables Spring's Cache Abstraction (specifying @EnableCaching for you).
Anyway, SDG's new annotation-based configuration model makes it easier to do everything. But again, you need to be using Spring Data GemFire 2.0+ (SD Kay) with Pivotal GemFire 9.1.x.
Hope this helps!
-John

Test DB with TestNG not rolling back

I am trying to do some basic automated testing with a DB and TestNG, and it isn't working. The first run succeeds as I expect; the second doesn't, because the first never rolled back. I have looked at several places for examples and mine seems to be correct. Anyone know what I am missing?
<dependency>
    <groupId>org.testng</groupId>
    <artifactId>testng</artifactId>
    <version>6.11</version>
    <scope>test</scope>
</dependency>
code:
import com.mchange.v2.c3p0.ComboPooledDataSource;
import java.beans.PropertyVetoException;
import java.sql.Connection;
import java.sql.SQLException;
import javax.sql.DataSource;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;
import org.springframework.dao.DataAccessException;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;
import org.springframework.test.annotation.Rollback;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.testng.AbstractTransactionalTestNGSpringContextTests;
import org.springframework.transaction.annotation.Transactional;
import org.testng.annotations.Test;

@ContextConfiguration(classes = AutomatedTest.Context.class)
public class RollbackTest extends AbstractTransactionalTestNGSpringContextTests {

    @Test
    @Rollback
    @Transactional
    public void testThing() throws Exception {
        Class<? extends RollbackTest> c = this.getClass();
        String path = String.format("/%s.sql", c.getName().replaceAll("\\.", "/"));
        super.executeSqlScript(path, false);
    }

    @Configuration
    @PropertySource("db.properties")
    static class Context {

        @Bean
        public DataSource dataSource(
                @Value("${datasource.url}") String url,
                @Value("${datasource.username}") String user,
                @Value("${datasource.password}") String pass,
                @Value("${datasource.driver-class-name}") String driver) throws PropertyVetoException {
            ComboPooledDataSource ds = new ComboPooledDataSource();
            ds.setUser(user);
            ds.setPassword(pass);
            ds.setJdbcUrl(url);
            ds.setDriverClass(driver);
            return ds;
        }

        @Bean
        public Connection connection(DataSource dataSource) throws SQLException {
            Connection c = dataSource.getConnection();
            System.out.println("Connection is " + c);
            return c;
        }

        @Bean
        public DataSourceTransactionManager txMan(DataSource ds) {
            return new DataSourceTransactionManager(ds);
        }

    }

}
sql:
CREATE SCHEMA FOO;
CREATE TABLE FOO.BAZ (
    ID int PRIMARY KEY AUTO_INCREMENT,
    name VARCHAR(256) NOT NULL
);
INSERT INTO FOO.BAZ (name) values('christian');
error:
CREATE SCHEMA FOO
org.springframework.jdbc.datasource.init.ScriptStatementFailedException: Failed to execute SQL script statement #1 of class path resource [automated/RollbackTest.sql]: CREATE SCHEMA FOO; nested exception is java.sql.SQLException: Can't create database 'FOO'; database exists
The @Bean method that returns a Connection looks very suspect, so I'd recommend you delete that.
For executing an SQL script before or after a test method, you should ideally look into Spring's @Sql annotation (see the sketch below).
Also, you can safely delete the @Rollback and @Transactional declarations on your test method:
@Rollback is the default behavior.
@Transactional is already declared on AbstractTransactionalTestNGSpringContextTests.
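A minimal sketch of the test using @Sql (the script path is illustrative). Note also that CREATE SCHEMA / CREATE TABLE are DDL statements, which MySQL implicitly commits, so they cannot be rolled back regardless of the test transaction; the script should ideally contain only DML:

import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.jdbc.Sql;
import org.springframework.test.context.testng.AbstractTransactionalTestNGSpringContextTests;
import org.testng.annotations.Test;

@ContextConfiguration(classes = AutomatedTest.Context.class)
public class RollbackTest extends AbstractTransactionalTestNGSpringContextTests {

    // the script runs in the test-managed transaction and rolls back with it
    @Sql("/automated/RollbackTest.sql")
    @Test
    public void testThing() {
        // assertions against the data inserted by the script go here
    }

}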
Regards,
Sam (author of the Spring TestContext Framework)

Manual bean registration and configuration order

I had an issue with Spring's @Order annotation; it seems I can't get it to work in my application. Hence I managed to create a test class that imitates the very same behaviour: @Order does not have any effect on my components. The following test fails to run because of the lack of a bean typed javax.sql.DataSource:
package com.so;

import org.springframework.beans.factory.BeanFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.beans.factory.config.ConfigurableBeanFactory;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.Ordered;
import org.springframework.core.annotation.Order;
import org.springframework.jdbc.datasource.AbstractDataSource;
import org.springframework.stereotype.Repository;
import org.springframework.stereotype.Service;
import javax.annotation.PostConstruct;
import javax.sql.DataSource;
import java.sql.Connection;
import java.sql.SQLException;

public class TestSpring {

    public static void main(String[] args) {
        Class<?>[] classes = new Class[]{AConf.class, ADAO.class, AService.class, RepoConf.class};
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(classes);
    }

    @Configuration
    @Order(Ordered.HIGHEST_PRECEDENCE + 100)
    public static class AConf {

        @Autowired
        AService aService;
    }

    @Repository
    @Order(Ordered.LOWEST_PRECEDENCE)
    public static class ADAO {

        @Autowired
        @Qualifier("myds")
        DataSource dataSource;
    }

    @Service
    @Order(Ordered.LOWEST_PRECEDENCE)
    public static class AService {

        @Autowired
        ADAO adao;

        @PostConstruct
        public void init() {
            System.out.println("service init");
        }
    }

    // @Component does not have any effect
    @Configuration
    @Order(Ordered.HIGHEST_PRECEDENCE)
    public static class RepoConf {

        @Autowired
        BeanFactory beanFactory;

        @PostConstruct
        public void init() {
            ConfigurableBeanFactory configurableBeanFactory = (ConfigurableBeanFactory) beanFactory;
            configurableBeanFactory.registerSingleton("myds", new AbstractDataSource() {
                @Override
                public Connection getConnection() throws SQLException {
                    return null;
                }

                @Override
                public Connection getConnection(String username, String password) throws SQLException {
                    return null;
                }
            });
        }
    }
}
Manual bean registration has risks, as stated here: https://stackoverflow.com/a/11751503/1941560, although I cannot find out under which circumstances the @Order annotation works. For the above application configuration I expect the execution order RepoConf, AConf, ADAO, AService.
A weird thing to notice is that when I changed the order of the declared component classes to (commencing the array with RepoConf):
Class<?>[] classes = new Class[]{RepoConf.class, AConf.class, ADAO.class, AService.class};
or changed my AConf class to:
@Configuration
@Order(Ordered.HIGHEST_PRECEDENCE + 100)
public static class AConf {

    @Autowired
    RepoConf repoConf; // must be declared before aService

    @Autowired
    AService aService;
}
the application works as expected. Could someone explain the Spring container's behaviour, and how can I utilize @Order annotations?
The Spring Framework version I use is 4.2.1.RELEASE.
Judging by the JavaDoc documentation for the @Order annotation, I don't think it is used to order bean creation:
NOTE: Annotation-based ordering is supported for specific kinds of
components only — for example, for annotation-based AspectJ aspects.
Ordering strategies within the Spring container, on the other hand, are
typically based on the Ordered interface in order to allow for
programmatically configurable ordering of each instance.
Consulting the Spring Framework documentation, the @Order annotation seems to be used for:
Ordering of instances when injected into a collection
Ordering the execution of event listeners
Ordering of @Configuration class processing, for example if you want to override a bean by name
From the documentation, it does not look like @Order is intended for your use case.
However, from the Spring documentation, we can see there is depends-on, which enforces that certain beans are created before the bean being defined. This has a corresponding annotation, @DependsOn (used in the corrected code below).
It looks like your version of Spring Framework simply ignores the @Order annotation on configuration classes. There's no surprise here, because that annotation should only be used on classes that implement the Ordered interface. Moreover, I could never find any reference to it for configuration classes in the Spring Framework reference documentation.
Anyway, you are walking in terra incognita here. It is not described in the official documentation, so it will work or not depending on implementation details. What happens here (seeing the results) is:
the AnnotationConfigApplicationContext first instantiates the configuration classes and their beans in their declaration order
it then builds all the beans in their instantiation order
As you register the datasource myds at build time in its configuration class, it happens to be registered in time to be autowired into other beans when repoConf is built before any other beans using them. But this is never guaranteed by the Spring Framework documentation, and future versions could use a different approach without breaking their contract. Moreover, Spring does change the initialization order to allow dependencies to be constructed before the beans depending on them.
So the correct way here is to tell Spring that the ADAO bean depends on the configuration RepoConf. Just throw away all the @Order annotations, which are of no use here, and put a @DependsOn one in. Your code could be:
package com.so;
...
public class TestSpring {

    public static void main(String[] args) {
        Class<?>[] classes = new Class[]{AConf.class, ADAO.class, AService.class, RepoConf.class};
        AnnotationConfigApplicationContext ctx = new AnnotationConfigApplicationContext(classes);
    }

    @Configuration
    public static class AConf {

        @Autowired
        AService aService;
    }

    @DependsOn("repoConf")
    @Repository
    public static class ADAO {

        @Autowired
        @Qualifier("myds")
        DataSource dataSource;
    }

    @Service
    public static class AService {

        @Autowired
        ADAO adao;

        @PostConstruct
        public void init() {
            System.out.println("service init");
        }
    }

    @Configuration("repoConf")
    public static class RepoConf {

        @Autowired
        BeanFactory beanFactory;

        @PostConstruct
        public void init() {
            ConfigurableBeanFactory configurableBeanFactory = (ConfigurableBeanFactory) beanFactory;
            configurableBeanFactory.registerSingleton("myds", new AbstractDataSource() {
                @Override
                public Connection getConnection() throws SQLException {
                    return null;
                }

                @Override
                public Connection getConnection(String username, String password) throws SQLException {
                    return null;
                }
            });
        }
    }
}
@DependsOn ensures that repoConf will be created before ADAO, making the datasource available for dependency injection.

Transaction Management Spring framework

While learning the Spring framework I'm currently on the transaction management topic, and while I'm not 100% against using XML, I've been trying to do everything using annotations. Along comes this transaction management thing, and the instructor just plops it right into the XML file and references a DataSource bean which I created with annotations,
... BELOW ...
package com.udemy.learning.main;

import java.sql.SQLException;
import javax.sql.DataSource;
import javax.annotation.*;
import org.apache.commons.dbcp.BasicDataSource;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Component;

@Component("connect")
public class Connections extends BasicDataSource {

    @Value("${jdbc.user}")
    private String username;

    @Value("${jdbc.password}")
    private String password;

    @Value("${jdbc.driver}")
    private String driverClassName;

    @Value("${jdbc.url}")
    private String url;

    public DataSource connect() throws SQLException {
        super.setDriverClassName(this.driverClassName);
        super.setUrl(this.url);
        super.setUsername(this.username);
        super.setPassword(this.password);
        return super.createDataSource();
    }

    @PreDestroy
    public void close() {
        try {
            super.close();
            System.out.println("Connection closed");
        } catch (SQLException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }
    }
}
So, in an attempt to do everything with code and annotations, I created a TransactionManager class and just used the @Component annotation above it. I'm sure you can imagine what that looks like, so I won't drop that here; plus, I'm thinking it's rather elementary looking anyway.
So anyway, long question short...
Is there a way to do this transaction management configuration the way I have attempted, rather than strictly in XML? My attempt ended up with an error like the following...
Bean named 'transactionManager' must be of type [org.springframework.transaction.PlatformTransactionManager], but was actually of type [com.udemy.learning.main.TransactionManager]
You need your bean to be of type PlatformTransactionManager, so implement PlatformTransactionManager in your TransactionManager class. The exception occurs because Spring expects any transaction manager to be a PlatformTransactionManager.
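In practice, rather than implementing the interface yourself, you would usually expose one of Spring's existing implementations as a bean. A minimal sketch, assuming a JDBC DataSource bean such as the "connect" bean above (the configuration class name is illustrative):

import javax.sql.DataSource;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;

@Configuration
@EnableTransactionManagement
public class TxConfig {

    // DataSourceTransactionManager already implements PlatformTransactionManager,
    // so no hand-written TransactionManager class is needed
    @Bean
    public PlatformTransactionManager transactionManager(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

}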
