Given the following service:
public interface MyService {
    void method();
}
And its implementation:
@Service
public class MyServiceImpl implements MyService {

    @Transactional
    @CustomAnnotation
    @Override
    public void method() {
        ...
    }
}
I would like to use a StaticMethodMatcherPointcutAdvisor in the following manner:
public class MyPointcutAdvisor extends StaticMethodMatcherPointcutAdvisor {
    ...
    @Override
    public boolean matches(Method method, Class<?> targetClass) {
        if (annotationPresent(method)) {
            return true;
        }
        Class<?> userClass = ClassUtils.getUserClass(targetClass);
        Method specificMethod = ClassUtils.getMostSpecificMethod(method, userClass);
        specificMethod = BridgeMethodResolver.findBridgedMethod(specificMethod);
        if (annotationPresent(specificMethod)) {
            return true;
        }
        return false;
    }
    ...
}
The problem is that Spring uses an InfrastructureAdvisorAutoProxyCreator to create the proxy of that class, whereas it is the DefaultAdvisorAutoProxyCreator that would apply MyPointcutAdvisor; MyPointcutAdvisor is therefore only given the proxy as the targetClass parameter, so it cannot find the annotation and does not match.
For completeness, this is my configuration class:
@Configuration
@EnableTransactionManagement
public class MyConfiguration {

    @Bean
    public DefaultAdvisorAutoProxyCreator defaultAdvisorAutoProxyCreator() {
        return new DefaultAdvisorAutoProxyCreator();
    }

    @Bean
    public MyPointcutAdvisor myPointcutAdvisor() {
        return new MyPointcutAdvisor();
    }
    ...
}
My question is: Is there a way to use @EnableTransactionManagement in combination with a StaticMethodMatcherPointcutAdvisor?
Workarounds:
Put @CustomAnnotation on the service interface: I want to keep the interfaces clean.
Add @Role(BeanDefinition.ROLE_INFRASTRUCTURE) to the MyPointcutAdvisor bean definition, so that the InfrastructureAdvisorAutoProxyCreator creates the proxy (a minimal sketch follows after the configuration below). This seems like the wrong way, since this bean is not infrastructure.
Copy the beans from ProxyTransactionManagementConfiguration, remove @EnableTransactionManagement and remove @Role(BeanDefinition.ROLE_INFRASTRUCTURE), so that the DefaultAdvisorAutoProxyCreator creates the proxy. This is my current workaround and results in the following configuration:
@Configuration
public class MyWorkaroundConfiguration {

    @Bean
    public DefaultAdvisorAutoProxyCreator defaultAdvisorAutoProxyCreator() {
        return new DefaultAdvisorAutoProxyCreator();
    }

    @Bean
    public MyPointcutAdvisor myPointcutAdvisor() {
        return new MyPointcutAdvisor();
    }

    @Bean
    public TransactionAttributeSource transactionAttributeSource() {
        return new AnnotationTransactionAttributeSource();
    }

    @Bean(name = TransactionManagementConfigUtils.TRANSACTION_ADVISOR_BEAN_NAME)
    public BeanFactoryTransactionAttributeSourceAdvisor transactionAdvisor(
            TransactionInterceptor transactionInterceptor) {
        BeanFactoryTransactionAttributeSourceAdvisor advisor =
                new BeanFactoryTransactionAttributeSourceAdvisor();
        advisor.setTransactionAttributeSource(transactionAttributeSource());
        advisor.setAdvice(transactionInterceptor);
        return advisor;
    }

    @Bean
    public TransactionInterceptor transactionInterceptor(
            PlatformTransactionManager transactionManager) {
        TransactionInterceptor interceptor = new TransactionInterceptor();
        interceptor.setTransactionAttributeSource(transactionAttributeSource());
        interceptor.setTransactionManager(transactionManager);
        return interceptor;
    }
    ...
}
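For reference, the @Role workaround described above only requires annotating the advisor bean; a minimal sketch (the configuration class name is made up, everything else mirrors the beans above):

@Configuration
@EnableTransactionManagement
public class MyRoleWorkaroundConfiguration {

    @Bean
    @Role(BeanDefinition.ROLE_INFRASTRUCTURE)
    // Marked as infrastructure so that the InfrastructureAdvisorAutoProxyCreator
    // registered by @EnableTransactionManagement picks this advisor up.
    public MyPointcutAdvisor myPointcutAdvisor() {
        return new MyPointcutAdvisor();
    }
}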
Using @EnableAspectJAutoProxy instead of the DefaultAdvisorAutoProxyCreator works for me.
@Configuration
@EnableAspectJAutoProxy
@EnableTransactionManagement
public class MyConfiguration {
}
This also allows using @Aspect like M. Deinum suggested.
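A rough sketch of what such an aspect could look like; the fully qualified annotation name in the pointcut (com.example.CustomAnnotation) is an assumption, and the aspect still has to be registered as a Spring bean, e.g. via component scanning or a @Bean method:

@Aspect
@Component
public class CustomAnnotationAspect {

    // Matches the execution of any method annotated with @CustomAnnotation.
    @Around("@annotation(com.example.CustomAnnotation)")
    public Object aroundCustomAnnotatedMethod(ProceedingJoinPoint pjp) throws Throwable {
        // custom behaviour before the method call
        Object result = pjp.proceed();
        // custom behaviour after the method call
        return result;
    }
}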
I have a code snippet that looks like the one below.
Service with @Transactional method:
public class XService {

    private Repo1 repo1;
    private Repo2 repo2;
    private Repo3 repo3;

    XService(Repo1 repo1, Repo2 repo2, Repo3 repo3) {
        this.repo1 = repo1;
        this.repo2 = repo2;
        this.repo3 = repo3;
    }

    @Transactional(rollbackFor = Exception.class)
    public SomeObject method(Arg1 arg1, Arg2 arg2) {
        repo1.method1();
        repo2.method2();
        repo3.method3(); // probability of exception here, in which case rollback is needed
    }
}
Class from which the method is invoked:
public class YService {

    private XService xService;

    public YService(XService xService) {
        this.xService = xService;
    }

    public SomeObject method(Arg1 arg1, Arg2 arg2) {
        return xService.method(arg1, arg2);
    }
}
I have also added @EnableTransactionManagement to my SpringBootApplication class, but the database operations from repo1 and repo2 are not rolled back in case of an exception from repo3.
Every repository uses Spring's JdbcTemplate for querying the database.
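For context, each repository is a thin wrapper around JdbcTemplate; a hypothetical Repo3 (table and SQL invented for illustration) might look like this:

public class Repo3 {

    private final JdbcTemplate template;

    public Repo3(JdbcTemplate template) {
        this.template = template;
    }

    public void method3() {
        // Throws a runtime DataAccessException on failure, which is expected to trigger a rollback.
        template.update("INSERT INTO some_table (some_column) VALUES (?)", "some value");
    }
}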
Configuration class:
@Configuration
public class ConfigurationClass {

    @Bean
    @Inject
    public PlatformTransactionManager transactionManager(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

    @Bean
    @Inject
    public DataSource dataSource() {
        HikariConfig config = new HikariConfig();
        // setting config properties here
        return new HikariDataSource(config);
    }

    @Bean
    @Inject
    public Repo1 repo1(JdbcTemplate template) {
        return new Repo1(template);
    }

    @Bean
    @Inject
    public Repo2 repo2(JdbcTemplate template) {
        return new Repo2(template);
    }

    @Bean
    @Inject
    public Repo3 repo3(JdbcTemplate template) {
        return new Repo3(template);
    }

    @Bean
    @Inject
    public XService xService(Repo1 repo1, Repo2 repo2, Repo3 repo3) {
        return new XService(repo1, repo2, repo3);
    }

    @Bean
    @Inject
    public YService yService(XService xService) {
        return new YService(xService);
    }
}
Using a TransactionTemplate and enclosing my repo calls in the template worked.
transactionTemplate.execute(new TransactionCallbackWithoutResult() {
    @Override
    protected void doInTransactionWithoutResult(TransactionStatus transactionStatus) throws TransactionException {
        repo1.method1();
        repo2.method2();
        repo3.method3();
    }
});
I don't know the reason this one worked. It would be great if anyone could help me understand the reason behind it.
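For completeness, the TransactionTemplate used above can be built from the same transaction manager already declared in the configuration; a minimal wiring sketch (this bean method is an assumption, not part of the original configuration):

@Bean
public TransactionTemplate transactionTemplate(PlatformTransactionManager transactionManager) {
    return new TransactionTemplate(transactionManager);
}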
I want to override the controller methods auto-generated by @RepositoryRestResource using @RepositoryRestController, having set SDR's base path to "/api".
Spring Data Rest 3.0 (and earlier) says:
"This controller [as shown in the snippet] will be served from the same API base path defined in RepositoryRestConfiguration.setBasePath that is used by all other RESTful endpoints (e.g. /api)".
https://docs.spring.io/spring-data/rest/docs/3.0.1.RELEASE/reference/html/#customizing-sdr.overriding-sdr-response-handlers (chapter 15.4)
This code snippet DOES NOT have a @RequestMapping on the class level, though.
My SDR app is configured via a RepositoryRestConfiguration object with
config.setBasePath("/api");
and yet @RepositoryRestController does not override SDR's auto-generated controller methods.
Please consider the accepted answer to this post:
Spring Data Rest controllers: behaviour and usage of @BasePathAwareController, @RepositoryRestController, @Controller and @RestController
Please help me understand this! :)
AppConf.java:
@Configuration
@Import(value = {DataConf.class})
@EnableWebMvc
@ComponentScan(value = "pl.mydomain.controller")
public class AppConf
{
    @Bean
    public RepositoryRestConfigurer repositoryRestConfigurer() {
        return new RepositoryRestConfigurerAdapter() {
            @Override
            public void configureRepositoryRestConfiguration(RepositoryRestConfiguration config) {
                config.setBasePath("/api");
            }
        };
    }
}
TokenController.java:
@RepositoryRestController
public class TokenController
{
    private TokenRepository repository;

    @Autowired
    public TokenController(TokenRepository tokenRepository) {
        this.repository = tokenRepository;
    }

    @RequestMapping(method = GET, path = "/tokens")
    public @ResponseBody ResponseEntity<?> tokens()
    {
        return ResponseEntity.ok("Hello");
    }
}
TokenRepository.java:
@RepositoryRestResource(path = "tokens")
public interface TokenRepository extends CrudRepository<Token, Long> {
}
The key to resolving the above dilemma was configuring the project correctly; that is, putting @ComponentScan in the class passed to the AbstractAnnotationConfigDispatcherServletInitializer::getServletConfigClasses() method (not in AppConf.java, which is passed to getRootConfigClasses()).
DispatcherConf.java:
public class DispatcherConf extends AbstractAnnotationConfigDispatcherServletInitializer {

    @Override
    protected Class<?>[] getRootConfigClasses() {
        return new Class[] {AppConf.class};
    }

    @Override
    protected Class<?>[] getServletConfigClasses() {
        return new Class[] {WebConf.class}; // !!!
    }

    @Override
    protected String[] getServletMappings() {
        return new String[] {"/*"};
    }
}
AppConf.java:
@Configuration
@Import({DataConf.class})
public class AppConf
{
    @Bean
    public RepositoryRestConfigurer repositoryRestConfigurer() {
        return new RepositoryRestConfigurerAdapter() {
            @Override
            public void configureRepositoryRestConfiguration(RepositoryRestConfiguration config) {
                config.setBasePath("/api"); // !!!
            }
        };
    }
}
DataConf.java:
@Configuration
@EnableJpaRepositories(basePackages = {
    "pl.example.data.repository"
})
@EnableTransactionManagement
public class DataConf
{ ... }
WebConf.java:
@Import(RepositoryRestMvcConfiguration.class)
@ComponentScan({"pl.example.api.controller"}) // !!!
public class WebConf {
}
Even though I solved the riddle, I don't understand why it was an issue, all the more so because https://docs.spring.io/spring/docs/current/javadoc-api/org/springframework/context/annotation/ComponentScan.html states:
"Annotation Type ComponentScan: Configures component scanning directives for use with @Configuration classes."
I need to be able to create a new bean instance, not on every method call as with proxyMode=*, but only when needed (e.g. by calling .getObject()).
I know that there are ObjectFactory and ServiceFactory, but the problem with these is that I cannot define the factories in Java config and instead have to use a hardcoded string inside the bean. This is what I want to achieve:
@Configuration
class Config {

    @Bean
    public MessageListenerContainerFactory listenerContainerFactory() {
        MessageListenerContainerFactory listenerContainerFactory = new MessageListenerContainerFactory();
        listenerContainerFactory.setMessageListener(rabbitProcessor());
        return listenerContainerFactory;
    }

    @Bean
    @Scope(SCOPE_PROTOTYPE)
    public MessageListener rabbitProcessor() {
        return new RabbitProcessor();
    }

    @Bean
    @Scope(SCOPE_PROTOTYPE)
    public MessageListener notThisOne() {
        return new NotThisOne();
    }
}

class MessageListenerContainerFactory {

    private MessageListener messageListener;

    public void setMessageListener(MessageListener messageListener) {
        this.messageListener = messageListener;
    }

    public SimpleMessageListenerContainer createListenerContainer() {
        SimpleMessageListenerContainer listenerContainer = new SimpleMessageListenerContainer();
        // THIS HERE IS NEEDED!!!
        Object needed = SPRINGCONTEXT.GETBEANNAMEOF(this.messageListener).getObject();
        listenerContainer.setMessageListener(needed);
        return listenerContainer;
    }
}
You can try the @Qualifier annotation.
Ref - http://zetcode.com/articles/springbootqualifier/
@Bean
@Qualifier("rabbitProcessor")
@Scope(SCOPE_PROTOTYPE)
public MessageListener rabbitProcessor() {
    return new RabbitProcessor();
}

@Bean
@Qualifier("notThisOne")
@Scope(SCOPE_PROTOTYPE)
public MessageListener notThisOne() {
    return new NotThisOne();
}
Then you can create the object by calling getBean() on the ApplicationContext whenever you need it, like this:
applicationContext.getBean("rabbitProcessor");
//or
applicationContext.getBean("notThisOne");
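One way for MessageListenerContainerFactory to get hold of the ApplicationContext is to implement ApplicationContextAware; a rough sketch (not from the referenced article) of how createListenerContainer() could then fetch a fresh prototype instance:

class MessageListenerContainerFactory implements ApplicationContextAware {

    private ApplicationContext applicationContext;

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) {
        this.applicationContext = applicationContext;
    }

    public SimpleMessageListenerContainer createListenerContainer() {
        SimpleMessageListenerContainer listenerContainer = new SimpleMessageListenerContainer();
        // Because rabbitProcessor is prototype scoped, each getBean() call returns a new instance.
        listenerContainer.setMessageListener(applicationContext.getBean("rabbitProcessor", MessageListener.class));
        return listenerContainer;
    }
}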
I'm going to write a library that does some stuff and uses Spring Data.
The idea is that projects which use this library can import the jar and call MyLib.doSomeStuff().
Is it possible to use Spring in this way, and how can I initialize the ApplicationContext within the doSomeStuff() method so that DI and the @Configuration classes with the DataSources are loaded?
public class MyLib {

    @Autowired
    private static SomeJpaRepository someJpaRepository;

    public static void doSomeStuff() {
        // ...init ApplicationContext...
        // ...setup DataSources...
        List<SomeEntity> someEntityList = someJpaRepository.someMethod();
    }

    // or
    public static List<SomeEntity> getSomeEntityList() {
        return someJpaRepository.findAll();
    }
}
//other package
@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(entityManagerFactoryRef = "gxEntityManager", transactionManagerRef = "gxTransactionManager",
        basePackages = "com.gx")
public class GxConfig {

    @Primary
    @Bean(name = "gxDataSource")
    public DataSource gxDataSource() {
        DataSourceBuilder dataSourceBuilderGx = null;
        //..
        return dataSourceBuilderGx.build();
    }

    @Primary
    @Bean(name = "gxEntityManager")
    public LocalContainerEntityManagerFactoryBean gxEntityManagerFactory(EntityManagerFactoryBuilder builder) {
        return builder.dataSource(gxDataSource()).packages("com.gx").build();
    }

    @Primary
    @Bean(name = "gxTransactionManager")
    public PlatformTransactionManager gxTransactionManager(
            @Qualifier("gxEntityManager") EntityManagerFactory entityManagerFactory) {
        return new JpaTransactionManager(entityManagerFactory);
    }
}
//other package
public interface SomeJpaRepository extends JpaRepository<SomeEntity, Long>
{
    SomeEntity findById(Long id);
}
If you have a root configuration class, it can be as simple as:
ApplicationContext context = new AnnotationConfigApplicationContext(GxConfig.class);
Just don't do it every time you call doStuff(), as creating an application context is expensive. If your library is meant to be used as a black box, I guess it's OK to have this isolated application context.
You can do something like this:
public class MyLib {

    private ApplicationContext context;

    public MyLib() {
        context = new AnnotationConfigApplicationContext(GxConfig.class);
    }

    public void doStuff() {
        SomeBean bean = context.getBean(SomeBean.class);
        // do something with the bean
    }
}
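If the static API from the question is kept, the same idea can be combined with a lazily created, reused context; a rough sketch (not from the original answer, with the repository and entity names taken from the question):

public class MyLib {

    private static ApplicationContext context;

    private static synchronized ApplicationContext context() {
        // Create the context once and reuse it for every subsequent call.
        if (context == null) {
            context = new AnnotationConfigApplicationContext(GxConfig.class);
        }
        return context;
    }

    public static List<SomeEntity> getSomeEntityList() {
        return context().getBean(SomeJpaRepository.class).findAll();
    }
}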
I'm using Spring Batch version 2.2.4.RELEASE
I tried to write a simple example with stateful ItemReader, ItemProcessor and ItemWriter beans.
public class StatefulItemReader implements ItemReader<String> {

    private List<String> list;

    @BeforeStep
    public void initializeState(StepExecution stepExecution) {
        this.list = new ArrayList<>();
    }

    @AfterStep
    public ExitStatus exploitState(StepExecution stepExecution) {
        System.out.println("******************************");
        System.out.println(" READING RESULTS : " + list.size());
        return stepExecution.getExitStatus();
    }

    @Override
    public String read() throws Exception {
        this.list.add("some stateful reading information");
        if (list.size() < 10) {
            return "value " + list.size();
        }
        return null;
    }
}
In my integration test, I'm declaring my beans in a static inner Java config class like the one below:
@ContextConfiguration
@RunWith(SpringJUnit4ClassRunner.class)
public class SingletonScopedTest {

    @Configuration
    @EnableBatchProcessing
    static class TestConfig {

        @Autowired
        private JobBuilderFactory jobBuilder;

        @Autowired
        private StepBuilderFactory stepBuilder;

        @Bean
        JobLauncherTestUtils jobLauncherTestUtils() {
            return new JobLauncherTestUtils();
        }

        @Bean
        public DataSource dataSource() {
            EmbeddedDatabaseBuilder embeddedDatabaseBuilder = new EmbeddedDatabaseBuilder();
            return embeddedDatabaseBuilder.addScript("classpath:org/springframework/batch/core/schema-drop-hsqldb.sql")
                    .addScript("classpath:org/springframework/batch/core/schema-hsqldb.sql")
                    .setType(EmbeddedDatabaseType.HSQL)
                    .build();
        }

        @Bean
        public Job jobUnderTest() {
            return jobBuilder.get("job-under-test")
                    .start(stepUnderTest())
                    .build();
        }

        @Bean
        public Step stepUnderTest() {
            return stepBuilder.get("step-under-test")
                    .<String, String>chunk(1)
                    .reader(reader())
                    .processor(processor())
                    .writer(writer())
                    .build();
        }

        @Bean
        public ItemReader<String> reader() {
            return new StatefulItemReader();
        }

        @Bean
        public ItemProcessor<String, String> processor() {
            return new StatefulItemProcessor();
        }

        @Bean
        public ItemWriter<String> writer() {
            return new StatefulItemWriter();
        }
    }

    @Autowired
    JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void testStepExecution() {
        JobExecution jobExecution = jobLauncherTestUtils.launchStep("step-under-test");
        assertEquals(ExitStatus.COMPLETED, jobExecution.getExitStatus());
    }
}
This test passes.
But as soon as I define my StatefulItemReader as a step-scoped bean (which is better for a stateful reader), the "before step" code is no longer executed.
...
@Bean
@StepScope
public ItemReader<String> reader() {
    return new StatefulItemReader();
}
...
And I notice the same issue with my processor and writer beans.
What's wrong with my code? Is it related to this resolved issue: https://jira.springsource.org/browse/BATCH-1230
My whole Maven project with several JUnit tests can be found on GitHub: https://github.com/galak75/spring-batch-step-scope
Thank you in advance for your answers.
When you configure a bean as follows:
@Bean
@StepScope
public MyInterface myBean() {
    return new MyInterfaceImpl();
}
You are telling Spring to use the proxy mode ScopedProxyMode.TARGET_CLASS. However, because MyInterface is returned instead of MyInterfaceImpl, the proxy only has visibility into the methods on MyInterface. This prevents Spring Batch from finding the methods on MyInterfaceImpl that have been annotated with listener annotations such as @BeforeStep. The correct way to configure this is to return MyInterfaceImpl from your configuration method, like below:
@Bean
@StepScope
public MyInterfaceImpl myBean() {
    return new MyInterfaceImpl();
}
We have added a warning log message on startup that points out that, when we look for the annotated listener methods, if the object is proxied and the target is an interface, we won't be able to find annotated methods on the implementing class.
As suggested by pojo-guy, the solution is to implement StepExecutionListener and override its beforeStep method to set the stepExecution:
@Override
public void beforeStep(StepExecution stepExecution) {
    this.stepExecution = stepExecution;
}
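Applied to the StatefulItemReader from the question, this could look roughly as follows (a sketch based on the code above, not taken verbatim from the post):

public class StatefulItemReader implements ItemReader<String>, StepExecutionListener {

    private StepExecution stepExecution;
    private List<String> list;

    @Override
    public void beforeStep(StepExecution stepExecution) {
        this.stepExecution = stepExecution;
        this.list = new ArrayList<>();
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        System.out.println(" READING RESULTS : " + list.size());
        return stepExecution.getExitStatus();
    }

    @Override
    public String read() throws Exception {
        this.list.add("some stateful reading information");
        if (list.size() < 10) {
            return "value " + list.size();
        }
        return null;
    }
}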