JpaItemWriter: no transaction is in progress

I'd like to use JpaItemWriter to batch persist entities. But when I use the following code to persist, I'm told:
Hibernate:
select
nextval ('hibernate_sequence')
[] 2014-03-19 15:46:02,237 ERROR : TransactionRequiredException: no transaction is in progress
How can I enable transactions on the following:
@Bean
public ItemWriter<T> writer() {
    JpaItemWriter<T> itemWriter = new JpaItemWriter<>();
    itemWriter.setEntityManagerFactory(emf);
    return itemWriter;
}

@Configuration
@EnableTransactionManagement
@EnableBatchProcessing
class Config {

    @Bean
    public LocalContainerEntityManagerFactoryBean emf() {
        LocalContainerEntityManagerFactoryBean emf = new LocalContainerEntityManagerFactoryBean();
        emf.setDataSource(dataSource());
        emf.setPackagesToScan("my.package");
        emf.setJpaVendorAdapter(jpaAdapter());
        emf.setJpaProperties(jpaProperties());
        return emf;
    }
Edit:
@Bean
public Job airlineJob(JobBuilderFactory jobs, Step step) {
    return jobs.get("job")
            .start(step)
            .build();
}

//Reader is a `FlatFileItemReader`, writer is `CustomItemWriter`.
@Bean
public Step step(StepBuilderFactory steps,
                 MultiResourceItemReader<T> rea,
                 ItemProcessor<T, T> pro,
                 ItemWriter<T> wr) {
    return steps.get("step")
            .reader(rea)
            .processor(pro)
            .writer(wr)
            .build();
}
//use same datasource and tx manager as in the full web application
@Bean
public JobLauncher launcher(PlatformTransactionManager tx, DataSource ds) throws Exception {
    SimpleJobLauncher launcher = new SimpleJobLauncher();
    JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
    factory.setDataSource(ds);
    factory.setTransactionManager(tx);
    factory.afterPropertiesSet();
    launcher.setJobRepository(factory.getObject());
    return launcher;
}
Edit 2, in response to @Haim:
@Bean
public JpaItemWriter<T> jpaItemWriter(EntityManagerFactory emf) {
    JpaItemWriter<T> writer = new JpaItemWriter<T>();
    writer.setEntityManagerFactory(emf);
    return writer;
}

I agree with Michael Minella: the Spring Batch job repository does not like to share its transaction manager with others. The logic is simple: if the job repository shares a transaction manager with your step, then a step failure rolls back both the step's data and the metadata written to the job repository, which means the data needed for a job restart is never persisted.
In order to use two transaction managers you need to:
Delete @EnableTransactionManagement if you use it only for the @Transactional above
Define an additional transaction manager
@Bean
@Qualifier("jpaTrx")
public PlatformTransactionManager jpaTransactionManager() {
    return new JpaTransactionManager(emf());
}
Set the transaction manager on your step:
@Autowired
@Qualifier("jpaTrx")
PlatformTransactionManager jpaTransactionManager;

//Reader is a FlatFileItemReader, writer is CustomItemWriter.
@Bean
public Step step(StepBuilderFactory steps,
                 MultiResourceItemReader<T> rea,
                 ItemProcessor<T, T> pro,
                 ItemWriter<T> wr) {
    return steps.get("step")
            //attach the JPA transaction manager
            .transactionManager(jpaTransactionManager)
            .reader(rea)
            .processor(pro)
            .writer(wr)
            .build();
}

Write your own JpaTransactionManager for your data source instead of the default Spring transaction manager:
@Autowired
private DataSource dataSource;

@Bean
@Primary
public JpaTransactionManager jpaTransactionManager() {
    final JpaTransactionManager tm = new JpaTransactionManager();
    tm.setDataSource(dataSource);
    return tm;
}

I solved it by creating my own transactional JpaItemWriter:
@Component
public class CustomItemWriter<T> extends JpaItemWriter<T> {

    @Override
    @Transactional
    public void write(List<? extends T> items) {
        super.write(items);
    }
}

Related

Spring Batch WriterNotOpenException

I have created a simple single-step Spring Batch job that reads items from a DB, processes them, and writes the result to a CSV.
During runtime I end up with a
org.springframework.batch.item.WriterNotOpenException: Writer must be open before it can be written to
The relevant code:
@Configuration
@EnableBatchProcessing
@EnableAutoConfiguration
public class CleanEmailJob {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    public DataSource dataSource;

    @Bean
    public ResourcelessTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public MapJobRepositoryFactoryBean mapJobRepositoryFactory(ResourcelessTransactionManager txManager)
            throws Exception {
        MapJobRepositoryFactoryBean factory = new MapJobRepositoryFactoryBean(txManager);
        factory.afterPropertiesSet();
        return factory;
    }

    @Bean
    public JobRepository jobRepository(MapJobRepositoryFactoryBean factory) throws Exception {
        return factory.getObject();
    }

    @Bean
    public JobExplorer jobExplorer(MapJobRepositoryFactoryBean factory) {
        return new SimpleJobExplorer(factory.getJobInstanceDao(), factory.getJobExecutionDao(),
                factory.getStepExecutionDao(), factory.getExecutionContextDao());
    }

    @Bean
    public SimpleJobLauncher jobLauncher(JobRepository jobRepository) {
        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        return launcher;
    }

    @Bean
    public Job cleanEmailAddressesJob() throws Exception {
        return jobBuilderFactory.get("cleanEmailAddresses")
                .incrementer(new RunIdIncrementer())
                .start(processEmailAddresses())
                .build();
    }

    @Bean
    public Step processEmailAddresses() throws UnexpectedInputException, ParseException, Exception {
        return stepBuilderFactory.get("processAffiliates")
                .<AffiliateEmailAddress, VerifiedAffiliateEmailAddress> chunk(10)
                .reader(reader())
                .processor(processor())
                .writer(report())
                .build();
    }

    @Bean
    public ItemWriter<VerifiedAffiliateEmailAddress> report() {
        FlatFileItemWriter<VerifiedAffiliateEmailAddress> reportWriter = new FlatFileItemWriter<VerifiedAffiliateEmailAddress>();
        reportWriter.setResource(new ClassPathResource("report.csv"));
        DelimitedLineAggregator<VerifiedAffiliateEmailAddress> delLineAgg = new DelimitedLineAggregator<VerifiedAffiliateEmailAddress>();
        delLineAgg.setDelimiter(",");
        BeanWrapperFieldExtractor<VerifiedAffiliateEmailAddress> fieldExtractor = new BeanWrapperFieldExtractor<VerifiedAffiliateEmailAddress>();
        fieldExtractor.setNames(new String[] {"uniekNr", "reason"});
        delLineAgg.setFieldExtractor(fieldExtractor);
        reportWriter.setLineAggregator(delLineAgg);
        reportWriter.setShouldDeleteIfExists(true);
        return reportWriter;
    }
As described in the documentation, I would expect the lifecycle events (open, close) to be taken care of automatically, since this is a single-threaded job with a single writer?
To elaborate on the comment left, Spring Batch will automatically register any ItemStream implementations when it finds them, so that they are opened when the step begins. When using Java config, Spring only knows what the declared return type is. Since you are returning ItemWriter, it cannot tell that your implementation also implements ItemStream. When using Java config, I usually recommend returning the implementation if it's known (instead of the interface); that allows Spring to introspect it fully. So in this example, returning FlatFileItemWriter instead of ItemWriter will fix the issue.
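For reference, a minimal sketch of the corrected bean (the body is the same as in the question; only the declared return type changes):
@Bean
public FlatFileItemWriter<VerifiedAffiliateEmailAddress> report() {
    // Returning the concrete type lets Spring see that this bean also
    // implements ItemStream, so open()/close() are registered automatically.
    FlatFileItemWriter<VerifiedAffiliateEmailAddress> reportWriter = new FlatFileItemWriter<>();
    reportWriter.setResource(new ClassPathResource("report.csv"));
    DelimitedLineAggregator<VerifiedAffiliateEmailAddress> delLineAgg = new DelimitedLineAggregator<>();
    delLineAgg.setDelimiter(",");
    BeanWrapperFieldExtractor<VerifiedAffiliateEmailAddress> fieldExtractor = new BeanWrapperFieldExtractor<>();
    fieldExtractor.setNames(new String[] {"uniekNr", "reason"});
    delLineAgg.setFieldExtractor(fieldExtractor);
    reportWriter.setLineAggregator(delLineAgg);
    reportWriter.setShouldDeleteIfExists(true);
    return reportWriter;
}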
Finally, it turned out to be Spring Dev Tools interfering with my batch application. The report.csv is on my classpath; when writing to the report file, Spring Dev Tools detects a change on the classpath and triggers a reload of the application, which causes the report.csv resource to be closed. Setting
spring.devtools.restart.enabled=false
in my application.properties made it work again.

Camunda Spring Transaction Integration not working

I'm using Camunda 7.3, Spring 4.2.4 and Hibernate 4.3.8, and I'm trying to make them share the same transaction as explained in the Camunda documentation. The transaction works fine for Hibernate operations but not for Camunda operations: if a rollback occurs, only the Hibernate operations are reverted.
@Configuration
public class CamundaConfiguration {

    // Variables with connection data

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        LocalContainerEntityManagerFactoryBean bean = new LocalContainerEntityManagerFactoryBean();
        bean.setPersistenceUnitName("PostgreSQL");
        bean.setDataSource(dataSource());
        bean.getJpaPropertyMap().put("hibernate.dialect", "org.hibernate.dialect.PostgreSQL82Dialect");
        bean.getJpaPropertyMap().put("hibernate.ejb.naming_strategy", NamingStrategyLowerCase.class.getCanonicalName());
        bean.getJpaPropertyMap().put("hibernate.jdbc.batch_size", 0);
        bean.getJpaPropertyMap().put("hibernate.cache.use_second_level_cache", true);
        bean.getJpaPropertyMap().put("hibernate.cache.use_query_cache", true);
        bean.getJpaPropertyMap().put("javax.persistence.sharedCache.mode", SharedCacheMode.ALL);
        bean.getJpaPropertyMap().put("hibernate.cache.default_cache_concurrency_strategy", "read-write");
        bean.getJpaPropertyMap().put("javax.persistence.validation.factory", validator);
        bean.getJpaPropertyMap().put("hibernate.cache.region.factory_class", SingletonEhCacheRegionFactory.class.getCanonicalName());
        bean.setPersistenceProviderClass(org.hibernate.jpa.HibernatePersistenceProvider.class);
        bean.setPackagesToScan("br.com.model");
        return bean;
    }
    @Bean
    public JpaTransactionManager transactionManager() {
        JpaTransactionManager bean = new JpaTransactionManager(entityManagerFactory().getObject());
        bean.getJpaPropertyMap().put("org.hibernate.flushMode", FlushMode.AUTO);
        bean.setDataSource(dataSource());
        bean.setPersistenceUnitName("PostgreSQL");
        return bean;
    }

    @Bean
    public DataSource dataSource() {
        HikariConfig config = new HikariConfig();
        config.setDriverClassName(driverClass);
        config.setJdbcUrl(jdbcUrl);
        config.setUsername(username);
        config.setPassword(password);
        config.setMaximumPoolSize(50);
        config.setConnectionTestQuery("select 1");
        HikariDataSource bean = new HikariDataSource(config);
        return new LazyConnectionDataSourceProxy(bean);
    }

    @Bean
    public ManagedProcessEngineFactoryBean processEngine() {
        ManagedProcessEngineFactoryBean processEngineFactoryBean = new ManagedProcessEngineFactoryBean();
        processEngineFactoryBean.setProcessEngineConfiguration(processEngineConfiguration());
        return processEngineFactoryBean;
    }

    @Bean
    public SpringProcessEngineConfiguration processEngineConfiguration() {
        SpringProcessEngineConfiguration processEngineConfiguration = new SpringProcessEngineConfiguration();
        processEngineConfiguration.setDataSource(dataSource());
        processEngineConfiguration.setTransactionManager(transactionManager());
        processEngineConfiguration.setJobExecutorActivate(true);
        processEngineConfiguration.setDatabaseSchemaUpdate(ProcessEngineConfigurationImpl.DB_SCHEMA_UPDATE_TRUE);
        return processEngineConfiguration;
    }

    @Bean
    public TaskService taskService() throws Exception {
        return processEngine().getObject().getTaskService();
    }
}
The dataSource and transactionManager are the same ones used by Spring and Hibernate.
@Service
public class TaskManager {

    @Inject
    private TaskService taskService;

    @Transactional
    public void completeTask(String taskId, final Map<String, Object> variables) {
        org.camunda.bpm.engine.task.Task camundaTask = taskService.createTaskQuery().taskId(taskId).singleResult();
        taskService.complete(camundaTask.getId(), variables);
        // Hibernate Operations
        throw new RuntimeException("Exception test");
    }
}
When the code above is executed, a rollback occurs and the 'Hibernate Operations' are rolled back, but the operations executed by taskService.complete are not.
I have already debugged the Camunda code and everything seems OK: I found a SpringTransactionInterceptor, the commands are executed inside a TransactionTemplate.execute(), and at that point the transaction is active.
After studying transactions, JPA and Spring, I found out that the problem was that no JpaDialect was configured; it is responsible for synchronizing JPA and JDBC transactions.
The dialect object can be used to retrieve the underlying JDBC connection and thus allows for exposing JPA transactions as JDBC transactions.
I included the following code in the configuration and now it's working:
@Configuration
public class CamundaConfiguration {

    ....

    @Bean
    public JpaDialect jpaDialect() {
        return new org.springframework.orm.jpa.vendor.HibernateJpaDialect();
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        LocalContainerEntityManagerFactoryBean bean = new LocalContainerEntityManagerFactoryBean();
        bean.setJpaDialect(jpaDialect());
        bean.setJpaVendorAdapter(new org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter());
        ...
        return bean;
    }

    @Bean
    public JpaTransactionManager transactionManager() {
        JpaTransactionManager bean = new JpaTransactionManager(entityManagerFactory().getObject());
        bean.setJpaDialect(jpaDialect());
        ...
        return bean;
    }
    ...
}

Spring Batch Multiple Threads

I am writing a Spring Batch job with the idea of scaling it when required.
My ApplicationContext looks like this:
@Configuration
@EnableBatchProcessing
@EnableTransactionManagement
@ComponentScan(basePackages = "in.springbatch")
@PropertySource(value = {"classpath:springbatch.properties"})
public class ApplicationConfig {

    @Autowired
    Environment environment;

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    public Job job() throws Exception {
        return jobs.get("spring_batch")
                .flow(step()).end()
                .build();
    }

    @Bean(name = "dataSource", destroyMethod = "close")
    public DataSource dataSource() {
        BasicDataSource basicDataSource = new BasicDataSource();
        return basicDataSource;
    }

    @Bean
    public JobRepository jobRepository() throws Exception {
        JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
        jobRepositoryFactoryBean.setTransactionManager(transactionManager());
        jobRepositoryFactoryBean.setDataSource(dataSource());
        return jobRepositoryFactoryBean.getObject();
    }

    @Bean(name = "batchstep")
    public Step step() throws Exception {
        return stepBuilderFactory.get("batchstep").allowStartIfComplete(true).
                transactionManager(transactionManager()).
                chunk(2).reader(batchReader()).processor(processor()).writer(writer()).build();
    }

    @Bean
    ItemReader batchReader() throws Exception {
        System.out.println(Thread.currentThread().getName() + "reader");
        HibernateCursorItemReader<Source> hibernateCursorItemReader = new HibernateCursorItemReader<>();
        hibernateCursorItemReader.setQueryString("from Source");
        hibernateCursorItemReader.setFetchSize(2);
        hibernateCursorItemReader.setSessionFactory(sessionFactory().getObject());
        hibernateCursorItemReader.close();
        return hibernateCursorItemReader;
    }

    @Bean
    public ItemProcessor processor() {
        return new BatchProcessor();
    }

    @Bean
    public ItemWriter writer() {
        return new BatchWriter();
    }

    public TaskExecutor taskExecutor() {
        SimpleAsyncTaskExecutor asyncTaskExecutor = new SimpleAsyncTaskExecutor("spring_batch");
        asyncTaskExecutor.setConcurrencyLimit(5);
        return asyncTaskExecutor;
    }

    @Bean
    public LocalSessionFactoryBean sessionFactory() {
        LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
        sessionFactory.setDataSource(dataSource());
        sessionFactory.setPackagesToScan(new String[]{"in.springbatch.entity"});
        sessionFactory.setHibernateProperties(hibernateProperties());
        return sessionFactory;
    }

    @Bean
    public PersistenceExceptionTranslationPostProcessor exceptionTranslation() {
        return new PersistenceExceptionTranslationPostProcessor();
    }

    @Bean
    @Autowired
    public HibernateTransactionManager transactionManager() {
        HibernateTransactionManager txManager = new HibernateTransactionManager();
        txManager.setSessionFactory(sessionFactory().getObject());
        return txManager;
    }

    Properties hibernateProperties() {
        return new Properties() {
            {
                setProperty("hibernate.hbm2ddl.auto", environment.getProperty("hibernate.hbm2ddl.auto"));
                setProperty("hibernate.dialect", environment.getProperty("hibernate.dialect"));
                setProperty("hibernate.globally_quoted_identifiers", "false");
            }
        };
    }
}
With the above configuration I am able to read from the DB, process the data and write to the DB.
I am using a chunk size of 2 and reading 2 records from the cursor using HibernateCursorItemReader, and my query to read from the DB is based on date, so it picks the current date's records.
So far I am able to achieve the desired behavior as well as restartability, with the job only picking records which were not processed due to a failure in the previous run.
Now my requirement is to make the batch use multiple threads to process data and write to the DB.
My processor and writer look like this:
@Component
public class BatchProcessor implements ItemProcessor<Source, DestinationDto> {

    @Override
    public DestinationDto process(Source source) throws Exception {
        System.out.println(Thread.currentThread().getName() + ":" + source);
        DestinationDto destination = new DestinationDto();
        destination.setName(source.getName());
        destination.setValue(source.getValue());
        destination.setSourceId(source.getSourceId().toString());
        return destination;
    }
}

@Component
public class BatchWriter implements ItemWriter<DestinationDto> {

    @Autowired
    IBatchDao batchDao;

    @Override
    public void write(List<? extends DestinationDto> list) throws Exception {
        System.out.println(Thread.currentThread().getName() + ":" + list);
        batchDao.saveToDestination((List<DestinationDto>) list);
    }
}
I updated my step and added a TaskExecutor as follows:
#Bean(name = "batchstep")
public Step step() throws Exception {
return stepBuilderFactory.get("batchstep").allowStartIfComplete(true).
transactionManager(transactionManager()).chunk(1).reader(batchReader()).
processor(processor()).writer(writer()).taskExecutor(taskExecutor()).build();
}
After this, my processor is getting called by multiple threads, but with the same source data.
Is there anything extra I need to do?
This is a big question.
Your best bet at getting a good answer would be to look through the Scaling and Parallel Processing chapter in the Spring Batch documentation (here).
There might be some multi-threading samples in the Spring Batch examples (here).
An easy way to thread a Spring Batch job is to create a "future processor": you put all of your processing logic in a Future object, and your processor class only adds Futures; your writer class then waits on the futures to finish before performing the write. Sorry, I don't have a sample to point you to for this, but if you have specific questions I can try to answer them!
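A minimal sketch of that future-based idea, reusing the BatchProcessor, DestinationDto and IBatchDao names from the question (this is not from the original answer; the pool size is an arbitrary choice, and the two classes still need to be wired in as the step's processor and writer):
public class AsyncBatchProcessor implements ItemProcessor<Source, Future<DestinationDto>> {

    // Pool size chosen arbitrarily for this sketch; tune it to your workload.
    private final ExecutorService executor = Executors.newFixedThreadPool(5);
    private final BatchProcessor delegate = new BatchProcessor();

    @Override
    public Future<DestinationDto> process(Source source) {
        // Only submit the work here; the heavy lifting runs on the pool threads.
        return executor.submit(() -> delegate.process(source));
    }
}

public class AsyncBatchWriter implements ItemWriter<Future<DestinationDto>> {

    @Autowired
    IBatchDao batchDao;

    @Override
    public void write(List<? extends Future<DestinationDto>> futures) throws Exception {
        List<DestinationDto> items = new ArrayList<>();
        for (Future<DestinationDto> future : futures) {
            items.add(future.get()); // wait for each processed item before writing
        }
        batchDao.saveToDestination(items);
    }
}
Spring Batch's spring-batch-integration module also ships AsyncItemProcessor and AsyncItemWriter, which implement essentially this pattern out of the box.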

Spring Data CrudRepository on java based configuration - EntityManager - no transaction is in progress

I've read and, I believe, tried all of the posts on this, but no luck in finding the right answer.
I am using Java-based configuration in my Spring MVC project and wanted to try Spring's CrudRepository to get away from DAOs, and that is when all hell broke loose:
It started with "no transaction is in progress" on flush after persist:
- I tried adding @Transactional to the method - none of the variations found here worked
- I tried changing the configuration, but since mine is Java based and most of the answers are XML based, no luck there either.
So finally I have to ask:
How do I configure my project so that CrudRepository persists, or how do I create a Spring EntityManager using Java configuration?
This is the latest version of my configuration file:
@Configuration
@ComponentScan(basePackages = { "ba.fit.vms" })
@ImportResource(value = "classpath:spring-security-context.xml")
@EnableTransactionManagement
@EnableJpaRepositories
public class AppConfig {

    @Bean
    public static PropertyPlaceholderConfigurer propertyPlaceholderConfigurer() {
        PropertyPlaceholderConfigurer ppc = new PropertyPlaceholderConfigurer();
        ppc.setLocation(new ClassPathResource("/persistence.properties"));
        return ppc;
    }

    // Security Configuration

    @Bean
    public KorisnickiServis korisnickiServis() {
        return new KorisnickiServis();
    }

    @Bean
    public TokenBasedRememberMeServices rememberMeServices() {
        return new TokenBasedRememberMeServices("remember-me-key", korisnickiServis());
    }

    @Bean
    public PasswordEncoder passwordEncoder() {
        return new StandardPasswordEncoder();
    }

    // Jpa Configuration

    @Value("${dataSource.driverClassName}")
    private String driver;

    @Value("${dataSource.url}")
    private String url;

    @Value("${dataSource.username}")
    private String username;

    @Value("${dataSource.password}")
    private String password;

    @Value("${hibernate.dialect}")
    private String dialect;

    @Value("${hibernate.hbm2ddl.auto}")
    private String hbm2ddlAuto;

    @Bean
    public DataSource configureDataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(driver);
        dataSource.setUrl(url);
        dataSource.setUsername(username);
        dataSource.setPassword(password);
        return dataSource;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean configureEntityManagerFactory() {
        LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
        entityManagerFactoryBean.setDataSource(configureDataSource());
        entityManagerFactoryBean.setPackagesToScan("ba.fit.vms");
        entityManagerFactoryBean.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
        Properties jpaProperties = new Properties();
        jpaProperties.put(org.hibernate.cfg.Environment.DIALECT, dialect);
        jpaProperties.put(org.hibernate.cfg.Environment.HBM2DDL_AUTO, hbm2ddlAuto);
        //jpaProperties.put(org.hibernate.cfg.Environment.SHOW_SQL, true);
        entityManagerFactoryBean.setJpaProperties(jpaProperties);
        return entityManagerFactoryBean;
    }

    @Bean
    public PlatformTransactionManager transactionManager() {
        final JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(configureEntityManagerFactory().getObject());
        return transactionManager;
    }
}
I've tried a number of variations, but was always receiving the same "no transaction is in progress" error.
Also, just a glimpse at the repos:
LokacijaRepository:
@Transactional
public interface LokacijaRepository extends CrudRepository<Lokacija, Long> {
}
And LokacijaRepositoryImpl:
@Repository
public class LokacijaRepositoryImpl implements LokacijaRepository {

    protected static Logger logger = Logger.getLogger("repo");

    @PersistenceContext // tried this as well (type = PersistenceContextType.EXTENDED)
    private EntityManager entityManager;

    @Override
    @Transactional // tried a number of variations here as well, like REQUIRED...
    public <S extends Lokacija> S save(S entity) {
        logger.debug("trying to save!");
        try {
            entityManager.persist(entity);
            entityManager.flush();
            return entity;
        } catch (Exception e) {
            logger.debug("error: " + e.toString());
            return null;
        }
    }
If you need anything else to help me figure this one out, let me know.
The problem is that you are attempting to create an implementation of LokacijaRepository (in LokacijaRepositoryImpl) while Spring Data JPA (which you have configured) is trying to do the same.
What you need to do is:
totally remove LokacijaRepositoryImpl
either rename configureEntityManagerFactory to entityManagerFactory, or add entityManagerFactoryRef = "configureEntityManagerFactory" to @EnableJpaRepositories
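For example, the second option would look roughly like this (a sketch based on the AppConfig class above):
@Configuration
@ComponentScan(basePackages = { "ba.fit.vms" })
@EnableTransactionManagement
// Point Spring Data JPA at the non-default entity manager factory bean name.
@EnableJpaRepositories(basePackages = "ba.fit.vms",
        entityManagerFactoryRef = "configureEntityManagerFactory")
public class AppConfig {
    // ... same beans as above ...
}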

Transaction management with Spring Batch

I am currently discovering Spring and I am able to set up some jobs. Now I would like to save my imported data in a database using Hibernate/JPA, and I keep getting this error:
14:46:43.500 [main] ERROR o.s.b.core.step.AbstractStep - Encountered an error executing the step javax.persistence.TransactionRequiredException: no transaction is in progress
I see that the problem is with the transaction. Here is my Spring Java config for the entityManager and the transactionManager:
@Configuration
public class PersistenceSpringConfig implements EnvironmentAware
{
    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() throws Exception
    {
        // Initializes the entity manager
        LocalContainerEntityManagerFactoryBean factoryBean = new LocalContainerEntityManagerFactoryBean();
        factoryBean.setPersistenceUnitName(PERSISTENCE_UNIT_NAME);
        factoryBean.setJpaVendorAdapter(new HibernateJpaVendorAdapter());
        factoryBean.setDataSource(dataSource());

        // Scans the database model
        factoryBean.setPackagesToScan(EntiteJuridiqueJPA.class.getPackage().getName());

        // Defines the Hibernate properties
        Properties jpaProperties = new Properties();
        jpaProperties.setProperty("hibernate.show_sql", "false");
        jpaProperties.setProperty("hibernate.format_sql", "false");
        String connectionURL = "jdbc:h2:file:" + getDatabaseLocation();
        jpaProperties.setProperty("hibernate.connection.url", connectionURL);
        jpaProperties.setProperty("hibernate.connection.username", "sa");
        jpaProperties.setProperty("hibernate.connection.driver_class", "org.h2.Driver");
        jpaProperties.setProperty("hibernate.dialect", H2Dialect.class.getName());
        jpaProperties.setProperty("hibernate.hbm2ddl.auto", "create");
        jpaProperties.setProperty("hibernate.hbm2ddl.import_files_sql_extractor", "org.hibernate.tool.hbm2ddl.MultipleLinesSqlCommandExtractor");
        jpaProperties.setProperty("hibernate.hbm2ddl.import_files",
                "org/springframework/batch/core/schema-drop-h2.sql,org/springframework/batch/core/schema-h2.sql");
        factoryBean.setJpaProperties(jpaProperties);
        return factoryBean;
    }

    @Bean
    public PlatformTransactionManager transactionManager2() throws Exception
    {
        EntityManagerFactory object = entityManagerFactory().getObject();
        JpaTransactionManager jpaTransactionManager = new JpaTransactionManager(object);
        return jpaTransactionManager;
    }
I am using the JpaItemWriter to store the data in the database:
@Bean
public ItemWriter<EntiteJuridiqueJPA> writer()
{
    JpaItemWriter<EntiteJuridiqueJPA> writer = new JpaItemWriter<EntiteJuridiqueJPA>();
    writer.setEntityManagerFactory(entityManagerFactory.getObject());
    return writer;
}
This is the code that causes the exception: javax.persistence.TransactionRequiredException: no transaction is in progress
Any idea how to solve this problem?
[Edit] I am also including the job definition and the step definition. All my Spring configuration is written in Java.
@Configuration
@EnableBatchProcessing
@Import(PersistenceSpringConfig.class)
public class BatchSpringConfig
{
    @Autowired
    private JobBuilderFactory jobBuilders;

    @Autowired
    private StepBuilderFactory stepBuilders;

    @Autowired
    private DataSource dataSource;

    @Autowired
    private LocalContainerEntityManagerFactoryBean entityManagerFactory;

    @Bean
    public Step step()
    {
        return stepBuilders.get("step").<EntiteJuridique, EntiteJuridiqueJPA> chunk(5).reader(cvsReader(null))
                .processor(processor()).writer(writer()).listener(processListener()).build();
    }

    @Bean
    @StepScope
    public FlatFileItemReader<EntiteJuridique> cvsReader(@Value("#{jobParameters[input]}") String input)
    {
        FlatFileItemReader<EntiteJuridique> flatFileReader = new FlatFileItemReader<EntiteJuridique>();
        flatFileReader.setLineMapper(lineMapper());
        flatFileReader.setResource(new ClassPathResource(input));
        return flatFileReader;
    }

    @Bean
    public LineMapper<EntiteJuridique> lineMapper()
    {
        DefaultLineMapper<EntiteJuridique> lineMapper = new DefaultLineMapper<EntiteJuridique>();
        DelimitedLineTokenizer lineTokenizer = new DelimitedLineTokenizer();
        lineTokenizer.setDelimiter(";");
        lineTokenizer.setNames(new String[] { "MEGA_ENTITE", "PORTEFEUILLE", "MEGA_ENTITE", "Libellé" });
        BeanWrapperFieldSetMapper<EntiteJuridique> fieldSetMapper = new BeanWrapperFieldSetMapper<EntiteJuridique>();
        fieldSetMapper.setTargetType(EntiteJuridique.class);
        lineMapper.setLineTokenizer(lineTokenizer);
        lineMapper.setFieldSetMapper(fieldSetMapper);
        return lineMapper;
    }

    @Bean
    public Job dataInitializer()
    {
        return jobBuilders.get("dataInitializer").listener(protocolListener()).start(step()).build();
    }

    @Bean
    public ItemProcessor<EntiteJuridique, EntiteJuridiqueJPA> processor()
    {
        return new EntiteJuridiqueProcessor();
    }

    @Bean
    public ItemWriter<EntiteJuridiqueJPA> writer()
    {
        JpaItemWriter<EntiteJuridiqueJPA> writer = new JpaItemWriter<EntiteJuridiqueJPA>();
        writer.setEntityManagerFactory(entityManagerFactory.getObject());
        return writer;
        // return new EntiteJuridiqueWriter();
    }

    @Bean
    public ProtocolListener protocolListener()
    {
        return new ProtocolListener();
    }

    @Bean
    public CSVProcessListener processListener()
    {
        return new CSVProcessListener();
    }

    @Bean
    public PlatformTransactionManager transactionManager2() throws Exception
    {
        EntityManagerFactory object = entityManagerFactory.getObject();
        JpaTransactionManager jpaTransactionManager = new JpaTransactionManager(object);
        return jpaTransactionManager;
    }
[EDIT] I am still stuck with this problem. I have followed the suggestions of @Sean Patrick Floyd and @bellabax by setting a transaction manager on the stepBuilders, but I still get the same exception. I have tested my entityManager independently of Spring Batch and I am able to store data in the database.
But when using the same entity manager with Spring Batch, I get this exception.
Can anyone give more insight into how transactions are managed within Spring Batch? Thanks for your help.
The problem is that you are creating a second transaction manager (transactionManager2), but Spring Batch is using another transaction manager for starting its transactions. If you use @EnableBatchProcessing, Spring Batch automatically registers a transaction manager to use for its transactions, and your JpaTransactionManager never gets used.
If you want to change the transaction manager that Spring Batch uses, you have to implement the interface BatchConfigurer. Take a look at this example: https://github.com/codecentric/spring-batch-javaconfig/blob/master/src/main/java/de/codecentric/batch/configuration/WebsphereInfrastructureConfiguration.java.
There I am switching the transaction manager to a WebsphereUowTransactionManager, and in the same way you can switch to some other transaction manager.
Here's the link to the blog post explaining it: http://blog.codecentric.de/en/2013/06/spring-batch-2-2-javaconfig-part-3-profiles-and-environments/
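A minimal sketch of that approach for the JPA setup above (this is my own illustration, not code from the linked example: the class name is arbitrary, it takes the common shortcut of extending DefaultBatchConfigurer instead of implementing BatchConfigurer from scratch, and it reuses the entityManagerFactory bean from PersistenceSpringConfig):
@Configuration
public class BatchInfrastructureConfig extends DefaultBatchConfigurer {

    @Autowired
    private LocalContainerEntityManagerFactoryBean entityManagerFactory;

    @Override
    public PlatformTransactionManager getTransactionManager() {
        // Spring Batch now runs its chunk transactions through this JPA-aware
        // transaction manager, so JpaItemWriter sees an active transaction.
        return new JpaTransactionManager(entityManagerFactory.getObject());
    }
}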
You need to explicitly reference your Transaction Manager in your step definition:
<job id="sampleJob" job-repository="jobRepository">
<step id="step1">
<tasklet transaction-manager="transactionManager">
<chunk reader="itemReader" writer="itemWriter" commit-interval="10"/>
</tasklet>
</step>
</job>
See: 5.1.1. Configuring a Step
Ah, seeing that you use Java config: you need to assign the transaction manager to the TaskletStepBuilder using builder.transactionManager(transactionManager) (inherited from StepBuilderHelper).
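With the step builder from the question, that would look something like this (a sketch; it simply adds a transactionManager(...) call to the existing step() bean, reusing the transactionManager2() bean defined above):
@Bean
public Step step() throws Exception
{
    return stepBuilders.get("step")
            .<EntiteJuridique, EntiteJuridiqueJPA> chunk(5)
            .reader(cvsReader(null))
            .processor(processor())
            .writer(writer())
            .listener(processListener())
            // Hand the JPA transaction manager to the step so the chunk
            // runs inside a JPA transaction.
            .transactionManager(transactionManager2())
            .build();
}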
