ResourcelessTransactionManager could not be found - java

I am trying to schedule a Spring Batch job on top of a Spring Boot application. The following are my configurations. However, application startup fails with the following error:
Parameter 0 of method mapJobRepositoryFactory in ScheduleConfig required a bean of type 'org.springframework.batch.support.transaction.ResourcelessTransactionManager' that could not be found.
Can someone shed light on why this is occurring?
@Configuration
@EnableScheduling
public class ScheduleConfig {

    @Bean
    public ResourcelessTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public MapJobRepositoryFactoryBean mapJobRepositoryFactory(
            ResourcelessTransactionManager transactionManager) throws Exception {
        MapJobRepositoryFactoryBean factory = new MapJobRepositoryFactoryBean(transactionManager);
        factory.afterPropertiesSet();
        return factory;
    }

    @Bean
    public JobRepository jobRepository(MapJobRepositoryFactoryBean factory) throws Exception {
        return factory.getObject();
    }

    @Bean
    public SimpleJobLauncher jobLauncher(JobRepository jobRepository) {
        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        return launcher;
    }
}

Add @EnableBatchProcessing on top of the class :)

Also, don't declare the transaction manager bean with the return type PlatformTransactionManager (i.e. public PlatformTransactionManager transactionManager(...)), because it conflicts with the PlatformTransactionManager provided by DefaultBatchConfigurer. Keep the concrete ResourcelessTransactionManager return type instead.
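Putting both suggestions together, a minimal sketch of the configuration could look like this (relying on the infrastructure that @EnableBatchProcessing registers; when no DataSource is configured, DefaultBatchConfigurer falls back to an in-memory, map-based job repository, so the hand-rolled repository and launcher beans become unnecessary):

import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.annotation.EnableScheduling;

@Configuration
@EnableScheduling
@EnableBatchProcessing
public class ScheduleConfig {
    // With @EnableBatchProcessing, DefaultBatchConfigurer supplies the JobRepository,
    // JobLauncher and transaction manager for you; without a DataSource it uses an
    // in-memory (map-based) repository, so the ResourcelessTransactionManager,
    // MapJobRepositoryFactoryBean and SimpleJobLauncher beans above can be removed.
    // Jobs and steps are then built through the auto-wired JobBuilderFactory/StepBuilderFactory.
}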

Related

MapJobRepositoryFactoryBean in Spring Batch uses JDBC-based DAOs, not the Map-based ones

I am implementing a small REST service which runs a batch job. I am not going to persist any metadata and my job is not parallel, so I've chosen MapJobRepositoryFactoryBean. I also have H2 on the classpath for other purposes. But when I try to run the job, I get the following error:
org.springframework.dao.EmptyResultDataAccessException: Incorrect result size: expected 1, actual 0
I figured out that the JobRepository bean I get from MapJobRepositoryFactoryBean uses JdbcJobExecutionDao instead of MapJobExecutionDao, and because of that it fails on the query
SELECT VERSION FROM BATCH_JOB_EXECUTION WHERE JOB_EXECUTION_ID=?
Why does this happen? How should I configure it properly to use the Map-based DAOs?
Currently my configuration is the following:
public class ServiceConfiguration {

    @Bean
    public PlatformTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public MapJobRepositoryFactoryBean mapJobRepositoryFactoryBean(PlatformTransactionManager transactionManager) throws Exception {
        MapJobRepositoryFactoryBean mapJobRepositoryFactoryBean = new MapJobRepositoryFactoryBean(transactionManager);
        return mapJobRepositoryFactoryBean;
    }

    @Bean(name = "inMemoryJobRepository")
    public JobRepository jobRepository(MapJobRepositoryFactoryBean mapJobRepositoryFactoryBean) throws Exception {
        JobRepository mapJobRepository = mapJobRepositoryFactoryBean.getObject();
        return mapJobRepository;
    }

    @Bean(name = "syncJobLauncher")
    public JobLauncher jobLauncher(@Qualifier("inMemoryJobRepository") JobRepository jobRepository) throws Exception {
        SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
        simpleJobLauncher.setJobRepository(jobRepository);
        return simpleJobLauncher;
    }
}
UPDATE:
As Michael Minella mentioned, a bean for a custom BatchConfigurer needs to be created:
@Bean
public BatchConfigurer batchConfigurer(@Qualifier("inMemoryJobRepository") JobRepository jobRepository,
                                       @Qualifier("syncJobLauncher") JobLauncher jobLauncher,
                                       @Qualifier("resourcelessTransactionManager") PlatformTransactionManager transactionManager,
                                       @Qualifier("inMemoryJobExplorer") JobExplorer jobExplorer) {
    return new InMemoryBatchConfigurer(jobRepository, transactionManager, jobLauncher, jobExplorer);
}
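The InMemoryBatchConfigurer class itself is not shown in the question. A minimal sketch of what such a class could look like, assuming it simply hands the pre-built in-memory beans back to Spring Batch instead of letting DefaultBatchConfigurer wire JDBC-based ones:

import org.springframework.batch.core.configuration.annotation.BatchConfigurer;
import org.springframework.batch.core.explore.JobExplorer;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.transaction.PlatformTransactionManager;

public class InMemoryBatchConfigurer implements BatchConfigurer {

    private final JobRepository jobRepository;
    private final PlatformTransactionManager transactionManager;
    private final JobLauncher jobLauncher;
    private final JobExplorer jobExplorer;

    public InMemoryBatchConfigurer(JobRepository jobRepository,
                                   PlatformTransactionManager transactionManager,
                                   JobLauncher jobLauncher,
                                   JobExplorer jobExplorer) {
        this.jobRepository = jobRepository;
        this.transactionManager = transactionManager;
        this.jobLauncher = jobLauncher;
        this.jobExplorer = jobExplorer;
    }

    // Spring Batch asks the BatchConfigurer for its infrastructure beans,
    // so returning the map-based ones here keeps everything in memory.
    @Override
    public JobRepository getJobRepository() {
        return jobRepository;
    }

    @Override
    public PlatformTransactionManager getTransactionManager() {
        return transactionManager;
    }

    @Override
    public JobLauncher getJobLauncher() {
        return jobLauncher;
    }

    @Override
    public JobExplorer getJobExplorer() {
        return jobExplorer;
    }
}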

Spring Batch WriterNotOpenException

I have created a simple single-step Spring Batch job that reads items from a DB, processes them, and writes the result to a CSV.
At runtime I end up with:
org.springframework.batch.item.WriterNotOpenException: Writer must be open before it can be written to
The relevant code:
@Configuration
@EnableBatchProcessing
@EnableAutoConfiguration
public class CleanEmailJob {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    public DataSource dataSource;

    @Bean
    public ResourcelessTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public MapJobRepositoryFactoryBean mapJobRepositoryFactory(ResourcelessTransactionManager txManager)
            throws Exception {
        MapJobRepositoryFactoryBean factory = new MapJobRepositoryFactoryBean(txManager);
        factory.afterPropertiesSet();
        return factory;
    }

    @Bean
    public JobRepository jobRepository(MapJobRepositoryFactoryBean factory) throws Exception {
        return factory.getObject();
    }

    @Bean
    public JobExplorer jobExplorer(MapJobRepositoryFactoryBean factory) {
        return new SimpleJobExplorer(factory.getJobInstanceDao(), factory.getJobExecutionDao(),
                factory.getStepExecutionDao(), factory.getExecutionContextDao());
    }

    @Bean
    public SimpleJobLauncher jobLauncher(JobRepository jobRepository) {
        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        return launcher;
    }

    @Bean
    public Job cleanEmailAddressesJob() throws Exception {
        return jobBuilderFactory.get("cleanEmailAddresses")
                .incrementer(new RunIdIncrementer())
                .start(processEmailAddresses())
                .build();
    }

    @Bean
    public Step processEmailAddresses() throws UnexpectedInputException, ParseException, Exception {
        return stepBuilderFactory.get("processAffiliates")
                .<AffiliateEmailAddress, VerifiedAffiliateEmailAddress>chunk(10)
                .reader(reader())
                .processor(processor())
                .writer(report())
                .build();
    }

    @Bean
    public ItemWriter<VerifiedAffiliateEmailAddress> report() {
        FlatFileItemWriter<VerifiedAffiliateEmailAddress> reportWriter = new FlatFileItemWriter<VerifiedAffiliateEmailAddress>();
        reportWriter.setResource(new ClassPathResource("report.csv"));
        DelimitedLineAggregator<VerifiedAffiliateEmailAddress> delLineAgg = new DelimitedLineAggregator<VerifiedAffiliateEmailAddress>();
        delLineAgg.setDelimiter(",");
        BeanWrapperFieldExtractor<VerifiedAffiliateEmailAddress> fieldExtractor = new BeanWrapperFieldExtractor<VerifiedAffiliateEmailAddress>();
        fieldExtractor.setNames(new String[] {"uniekNr", "reason"});
        delLineAgg.setFieldExtractor(fieldExtractor);
        reportWriter.setLineAggregator(delLineAgg);
        reportWriter.setShouldDeleteIfExists(true);
        return reportWriter;
    }

    // reader() and processor() bean definitions omitted
}
As described in the documentation, I would expect the lifecycle events (open, close) to be taken care of automatically, since I am in a single-threaded, single-writer job?
To elaborate on the comment left, Spring Batch will automatically register any ItemStream implementations it finds so that they are opened when the step begins. When using Java config, Spring only knows the declared return type of the bean method. Since you are returning the ItemWriter interface, Spring doesn't know that your implementation also implements ItemStream. When using Java config, I usually recommend returning the implementation if it's known (instead of the interface); that allows Spring to introspect it fully. So in this example, returning FlatFileItemWriter instead of ItemWriter will fix the issue.
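Applied to the writer above, that only means narrowing the declared return type of the bean method, for example:

@Bean
public FlatFileItemWriter<VerifiedAffiliateEmailAddress> report() {
    FlatFileItemWriter<VerifiedAffiliateEmailAddress> reportWriter = new FlatFileItemWriter<>();
    reportWriter.setResource(new ClassPathResource("report.csv"));
    DelimitedLineAggregator<VerifiedAffiliateEmailAddress> delLineAgg = new DelimitedLineAggregator<>();
    delLineAgg.setDelimiter(",");
    BeanWrapperFieldExtractor<VerifiedAffiliateEmailAddress> fieldExtractor = new BeanWrapperFieldExtractor<>();
    fieldExtractor.setNames(new String[] {"uniekNr", "reason"});
    delLineAgg.setFieldExtractor(fieldExtractor);
    reportWriter.setLineAggregator(delLineAgg);
    reportWriter.setShouldDeleteIfExists(true);
    // Returning the concrete FlatFileItemWriter lets Spring see that the bean
    // implements ItemStream, so the step can open/close it automatically.
    return reportWriter;
}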
In the end it turned out to be Spring Dev Tools interfering with my batch application. The report.csv is on my classpath; when the job writes to the report file, Spring Dev Tools detects a change on the classpath and triggers a reload of the application, causing the report.csv resource to be closed. Setting
spring.devtools.restart.enabled=false
in my application.properties made it work again.

Multiple transaction managers NoUniqueBeanDefinitionException

I'm using Spring Boot and I defined the spring.datasource.* properties to enable my datasource. If I only use this, it works fine. However, I'm now trying to add JMS to my application as well, using the following config:
@Configuration
@EnableJms
public class TriggerQueueConfig {

    private Logger logger = LoggerFactory.getLogger(getClass());

    @Value("${jms.host:localhost}")
    private String host;

    @Value("${jms.port:1414}")
    private int port;

    @Value("${jms.concurrency.min:3}-${jms.concurrency.max:10}")
    private String concurrency;

    @Value("${jms.manager}")
    private String queueManager;

    @Value("${jms.cache:100}")
    private int cacheSize;

    @Bean
    public JmsListenerContainerFactory<?> jmsListenerContainerFactory() throws JMSException {
        logger.debug("Setting queue concurrency to {} (min-max)", concurrency);
        DefaultJmsListenerContainerFactory factory = new DefaultJmsListenerContainerFactory();
        factory.setConnectionFactory(cachedConnectionFactory());
        factory.setMessageConverter(messageConverter());
        factory.setTransactionManager(transactionManager());
        factory.setSessionTransacted(true);
        factory.setConcurrency(concurrency);
        return factory;
    }

    @Bean(name = "jmsTransactionManager")
    public JmsTransactionManager transactionManager() throws JMSException {
        JmsTransactionManager transactionManager = new JmsTransactionManager();
        transactionManager.setConnectionFactory(cachedConnectionFactory());
        return transactionManager;
    }

    @Bean
    @Primary
    public ConnectionFactory cachedConnectionFactory() throws JMSException {
        CachingConnectionFactory connectionFactory = new CachingConnectionFactory(ibmConnectionFactory());
        connectionFactory.setSessionCacheSize(cacheSize);
        connectionFactory.setCacheConsumers(true);
        return connectionFactory;
    }

    @Bean
    public ConnectionFactory ibmConnectionFactory() throws JMSException {
        logger.debug("Connecting to queue on {}:{}", host, port);
        MQQueueConnectionFactory connectionFactory = new MQQueueConnectionFactory();
        connectionFactory.setHostName(host);
        connectionFactory.setPort(port);
        connectionFactory.setQueueManager(queueManager);
        connectionFactory.setTransportType(WMQConstants.WMQ_CM_CLIENT);
        return connectionFactory;
    }

    @Bean
    public MessageConverter messageConverter() {
        MarshallingMessageConverter converter = new MarshallingMessageConverter();
        converter.setMarshaller(marshaller());
        converter.setUnmarshaller(marshaller());
        return converter;
    }

    @Bean
    public Jaxb2Marshaller marshaller() {
        Jaxb2Marshaller marshaller = new Jaxb2Marshaller();
        marshaller.setPackagesToScan("com.example");
        return marshaller;
    }
}
The JMS listener I created is working fine. However, when I try to persist data using my repository (Spring Data JPA) in a @Transactional method, I get the following exception:
org.springframework.beans.factory.NoUniqueBeanDefinitionException: No qualifying bean of type [org.springframework.transaction.PlatformTransactionManager] is defined: expected single matching bean but found 2: transactionManager,jmsTransactionManager
This makes sense, because both transaction managers are PlatformTransactionManagers. Usually you would put @Primary on the bean that should be the default one, but in this case I'm using Spring Boot's autoconfiguration, so I can't add @Primary to it.
An alternative would be to provide the name of the transaction manager with each @Transactional annotation (for example @Transactional("transactionManager")), but that would be a lot of work, and it makes more sense to have a default transaction manager, because the JMS transaction manager is the exceptional case.
Is there an easy way to make the automatically configured transaction manager the default one?
The Spring Boot 'magic' is really only this:
@Bean
@ConditionalOnMissingBean(PlatformTransactionManager.class)
public PlatformTransactionManager transactionManager() {
    return new JpaTransactionManager();
}
in the org.springframework.boot.autoconfigure.orm.jpa.JpaBaseConfiguration class.
Notice the @ConditionalOnMissingBean annotation: this bean is only configured if no bean of type PlatformTransactionManager exists yet. So you can override it by creating your own bean with the @Primary annotation:
@Bean
@Primary
public PlatformTransactionManager transactionManager() {
    return new JpaTransactionManager();
}
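With that @Primary JPA transaction manager in place, a plain @Transactional no longer needs a qualifier, while the JMS transaction manager can still be named explicitly where needed. A hypothetical service to illustrate (the class and method names below are made up):

import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class PaymentService {

    // Resolved against the @Primary JpaTransactionManager by default.
    @Transactional
    public void recordPayment() {
        // Spring Data JPA repository calls go here.
    }

    // Opting into the JMS transaction manager for a specific method is still possible.
    @Transactional("jmsTransactionManager")
    public void forwardToQueue() {
        // JMS send operations go here.
    }
}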

Convert existing Spring application to Spring-Boot

I have a configured and working Spring-based REST application, but now I'd like to convert it to Spring Boot.
My application uses Spring Data JPA on top of a JPA DataSource with the Hibernate provider:
@Configuration
@EnableJpaRepositories("foo.bar.web.repository")
@EnableTransactionManagement
public class RepositoryConfig {

    // properties omitted

    @Bean
    public DataSource dataSource() {
        BasicDataSource dataSource = new BasicDataSource();
        dataSource.setDriverClassName(className);
        dataSource.setUrl(url);
        dataSource.setUsername(userName);
        dataSource.setPassword(password);
        return dataSource;
    }

    @Bean
    public EntityManagerFactory entityManagerFactory() {
        LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
        factory.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
        factory.setPackagesToScan("foo.bar.web.domain");
        factory.setDataSource(dataSource());
        factory.setJpaPropertyMap(new HashMap<String, Object>() {{
            put("hibernate.dialect", dialect);
            put("hibernate.hbm2ddl.auto", hbm2ddl);
        }});
        factory.afterPropertiesSet();
        return factory.getObject();
    }

    @Bean
    public PlatformTransactionManager transactionManager() {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(entityManagerFactory());
        return transactionManager;
    }
}
My REST endpoints are implemented using Spring MVC with the following configuration:
@Configuration
@EnableWebMvc
@ComponentScan("foo.bar.web.controller")
public class MvcConfig extends WebMvcConfigurerAdapter {

    @Override
    public void configureDefaultServletHandling(DefaultServletHandlerConfigurer configurer) {
        configurer.enable();
    }

    @Bean
    public MultipartResolver multipartResolver() {
        return new CommonsMultipartResolver();
    }
}
Web initializer:
public class WebInitializer extends AbstractAnnotationConfigDispatcherServletInitializer {

    @Override
    protected Class<?>[] getRootConfigClasses() {
        return new Class[]{
                ApplicationConfig.class,
                RepositoryConfig.class
        };
    }

    @Override
    protected Class<?>[] getServletConfigClasses() {
        return new Class[]{MvcConfig.class};
    }

    @Override
    protected String[] getServletMappings() {
        return new String[]{"/"};
    }
}
The problem is that I don't want to use Spring Boot autoconfiguration, because I'd like to reuse my existing configuration classes with minimal changes, but I can't find the correct way to do this. I tried implementing a Spring Boot application class annotated with @SpringBootApplication, but I'm not 100% sure my config classes are used, because in this case I get java.lang.ClassCastException: org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean$$EnhancerBySpringCGLIB$$ba21071f cannot be cast to javax.persistence.EntityManagerFactory.
I also tried dropping the @EnableAutoConfiguration annotation from the application class and adding a TomcatEmbeddedServletContainerFactory bean to my context manually, but in that case the embedded Tomcat is not configured properly.
It would be great if somebody could give me a hint on how to solve my problem. I believe all I need to do is somehow replace my WebInitializer with a Spring Boot config.
After spending a day on research, I finally found a solution to my problem.
First of all, I had to modify my entityManagerFactory() and transactionManager() beans:
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
    LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
    factory.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
    factory.setPackagesToScan("foo.bar.web.domain");
    factory.setDataSource(dataSource());
    factory.setJpaPropertyMap(new HashMap<String, Object>() {{
        put("hibernate.dialect", dialect);
        put("hibernate.hbm2ddl.auto", hbm2ddl);
    }});
    factory.afterPropertiesSet();
    return factory;
}

@Bean
public PlatformTransactionManager transactionManager() {
    JpaTransactionManager transactionManager = new JpaTransactionManager();
    transactionManager.setEntityManagerFactory(entityManagerFactory().getObject());
    return transactionManager;
}
I also removed my WebInitializer class entirely and removed the @EnableWebMvc annotation from MvcConfig. In Spring Boot you should not keep a class extending WebMvcConfigurerAdapter annotated with @EnableWebMvc around, because if Spring Boot finds one, all of its automatic Spring MVC configuration is skipped. Here is the final version of my MvcConfig class:
@Configuration
@ComponentScan("foo.bar.web.controller")
public class MvcConfig {

    @Bean
    public MultipartResolver multipartResolver() {
        return new CommonsMultipartResolver();
    }
}
I used the version of the Spring Boot application class shown in the docs:
@SpringBootApplication(exclude = MultipartAutoConfiguration.class)
public class Application extends SpringBootServletInitializer {

    public static void main(String[] args) throws Exception {
        SpringApplication.run(Application.class, args);
    }

    @Override
    protected SpringApplicationBuilder configure(SpringApplicationBuilder application) {
        return application.sources(Application.class);
    }
}
Note that in my case I had to exclude MultipartAutoConfiguration, because I had already configured that feature in MvcConfig. It is also possible to leave it autoconfigured, but in that case I would have had to tune the allowed file size in the application.properties config file or add a MultipartConfigElement bean to my context.
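If MultipartAutoConfiguration is left enabled instead (and the CommonsMultipartResolver bean dropped), the MultipartConfigElement alternative mentioned above could look roughly like this; the size limits are placeholders:

import javax.servlet.MultipartConfigElement;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class MultipartConfig {

    @Bean
    public MultipartConfigElement multipartConfigElement() {
        // Arguments: location, max file size, max request size, file size threshold.
        return new MultipartConfigElement("", 10 * 1024 * 1024, 12 * 1024 * 1024, 0);
    }
}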

Spring Batch Multiple Threads

I am writing a Spring Batch job with the idea of scaling it when required.
My ApplicationContext looks like this:
@Configuration
@EnableBatchProcessing
@EnableTransactionManagement
@ComponentScan(basePackages = "in.springbatch")
@PropertySource(value = {"classpath:springbatch.properties"})
public class ApplicationConfig {

    @Autowired
    Environment environment;

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    public Job job() throws Exception {
        return jobs.get("spring_batch")
                .flow(step()).end()
                .build();
    }

    @Bean(name = "dataSource", destroyMethod = "close")
    public DataSource dataSource() {
        BasicDataSource basicDataSource = new BasicDataSource();
        return basicDataSource;
    }

    @Bean
    public JobRepository jobRepository() throws Exception {
        JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
        jobRepositoryFactoryBean.setTransactionManager(transactionManager());
        jobRepositoryFactoryBean.setDataSource(dataSource());
        return jobRepositoryFactoryBean.getObject();
    }

    @Bean(name = "batchstep")
    public Step step() throws Exception {
        return stepBuilderFactory.get("batchstep").allowStartIfComplete(true)
                .transactionManager(transactionManager())
                .chunk(2).reader(batchReader()).processor(processor()).writer(writer()).build();
    }

    @Bean
    ItemReader batchReader() throws Exception {
        System.out.println(Thread.currentThread().getName() + "reader");
        HibernateCursorItemReader<Source> hibernateCursorItemReader = new HibernateCursorItemReader<>();
        hibernateCursorItemReader.setQueryString("from Source");
        hibernateCursorItemReader.setFetchSize(2);
        hibernateCursorItemReader.setSessionFactory(sessionFactory().getObject());
        hibernateCursorItemReader.close();
        return hibernateCursorItemReader;
    }

    @Bean
    public ItemProcessor processor() {
        return new BatchProcessor();
    }

    @Bean
    public ItemWriter writer() {
        return new BatchWriter();
    }

    public TaskExecutor taskExecutor() {
        SimpleAsyncTaskExecutor asyncTaskExecutor = new SimpleAsyncTaskExecutor("spring_batch");
        asyncTaskExecutor.setConcurrencyLimit(5);
        return asyncTaskExecutor;
    }

    @Bean
    public LocalSessionFactoryBean sessionFactory() {
        LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
        sessionFactory.setDataSource(dataSource());
        sessionFactory.setPackagesToScan(new String[]{"in.springbatch.entity"});
        sessionFactory.setHibernateProperties(hibernateProperties());
        return sessionFactory;
    }

    @Bean
    public PersistenceExceptionTranslationPostProcessor exceptionTranslation() {
        return new PersistenceExceptionTranslationPostProcessor();
    }

    @Bean
    @Autowired
    public HibernateTransactionManager transactionManager() {
        HibernateTransactionManager txManager = new HibernateTransactionManager();
        txManager.setSessionFactory(sessionFactory().getObject());
        return txManager;
    }

    Properties hibernateProperties() {
        return new Properties() {
            {
                setProperty("hibernate.hbm2ddl.auto", environment.getProperty("hibernate.hbm2ddl.auto"));
                setProperty("hibernate.dialect", environment.getProperty("hibernate.dialect"));
                setProperty("hibernate.globally_quoted_identifiers", "false");
            }
        };
    }
}
With the above configuration I am able to read from the DB, process the data, and write it back to the DB. I am using a chunk size of 2 and reading 2 records from the cursor with HibernateCursorItemReader; my query reads records for the current date.
So far I have achieved the desired behavior, as well as restartability, with the job only picking up records that were not processed due to a failure in the previous run.
Now my requirement is to make the batch use multiple threads to process the data and write to the DB.
My processor and writer look like this:
@Component
public class BatchProcessor implements ItemProcessor<Source, DestinationDto> {

    @Override
    public DestinationDto process(Source source) throws Exception {
        System.out.println(Thread.currentThread().getName() + ":" + source);
        DestinationDto destination = new DestinationDto();
        destination.setName(source.getName());
        destination.setValue(source.getValue());
        destination.setSourceId(source.getSourceId().toString());
        return destination;
    }
}

@Component
public class BatchWriter implements ItemWriter<DestinationDto> {

    @Autowired
    IBatchDao batchDao;

    @Override
    public void write(List<? extends DestinationDto> list) throws Exception {
        System.out.println(Thread.currentThread().getName() + ":" + list);
        batchDao.saveToDestination((List<DestinationDto>) list);
    }
}
I updated my step and added a task executor as follows:
@Bean(name = "batchstep")
public Step step() throws Exception {
    return stepBuilderFactory.get("batchstep").allowStartIfComplete(true)
            .transactionManager(transactionManager()).chunk(1).reader(batchReader())
            .processor(processor()).writer(writer()).taskExecutor(taskExecutor()).build();
}
After this, my processor is called by multiple threads, but with the same source data.
Is there anything extra I need to do?
This is a big question.
Your best bet at getting a good answer is to look through the Scaling and Parallel Processing chapter in the Spring Batch documentation (here).
There might also be some multi-threading samples in the Spring Batch examples (here).
An easy way to thread a Spring Batch job is to create a "future processor": you put all your processing logic in a Future object, your processor class only adds objects to the future, and your writer class then waits for the futures to finish before performing the write, as in the sketch below. Sorry, I don't have a sample to point you to for this, but if you have specific questions I can try to answer them!
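For reference, a minimal sketch of that future-based approach using the AsyncItemProcessor/AsyncItemWriter pair from spring-batch-integration (org.springframework.batch.integration.async), assuming that module is on the classpath and reusing the BatchProcessor, BatchWriter, taskExecutor() and other bean methods from the question:

@Bean
public AsyncItemProcessor<Source, DestinationDto> asyncProcessor(BatchProcessor processor) {
    // Wraps the real processor so each item is processed on the task executor
    // and a Future<DestinationDto> is handed to the writer.
    AsyncItemProcessor<Source, DestinationDto> asyncProcessor = new AsyncItemProcessor<>();
    asyncProcessor.setDelegate(processor);
    asyncProcessor.setTaskExecutor(taskExecutor());
    return asyncProcessor;
}

@Bean
public AsyncItemWriter<DestinationDto> asyncWriter(BatchWriter writer) {
    // Waits for each Future to complete before delegating to BatchWriter.
    AsyncItemWriter<DestinationDto> asyncWriter = new AsyncItemWriter<>();
    asyncWriter.setDelegate(writer);
    return asyncWriter;
}

@Bean(name = "batchstep")
public Step step(AsyncItemProcessor<Source, DestinationDto> asyncProcessor,
                 AsyncItemWriter<DestinationDto> asyncWriter) throws Exception {
    // No taskExecutor() on the step itself: the cursor reader stays single-threaded,
    // so each record is read exactly once; only processing and writing run in parallel.
    return stepBuilderFactory.get("batchstep").allowStartIfComplete(true)
            .transactionManager(transactionManager())
            .<Source, Future<DestinationDto>>chunk(2)
            .reader(batchReader())
            .processor(asyncProcessor)
            .writer(asyncWriter)
            .build();
}

Keeping the read single-threaded also sidesteps the "same source data" symptom, since HibernateCursorItemReader is not thread-safe and should not be shared across threads the way a step-level task executor does.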
