This question already has answers here:
Why is my Spring @Autowired field null?
(21 answers)
Closed 1 year ago.
I tried implementing a Spring Batch job that reads data from a text file and saves it to a database. I'm getting an NPE while processing.
@Configuration
public class JobConfig {

    @Value("${inputFile}")
    private Resource resource;

    @Autowired
    private ResourceLoader resourceLoader;

    private JobBuilderFactory jobBuilderFactory;
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    public JobConfig(JobBuilderFactory jobBuilderFactory, StepBuilderFactory stepBuilderFactory) {
        this.jobBuilderFactory = jobBuilderFactory;
        this.stepBuilderFactory = stepBuilderFactory;
    }

    @Bean
    public Job job() {
        return this.jobBuilderFactory.get("JOB-Load")
                .start(fileReadingStep())
                .build();
    }

    @Bean
    public Step fileReadingStep() {
        return stepBuilderFactory.get("File-Read-Step1")
                .<Student, Student>chunk(5)
                .reader(itemReader())
                .processor(new Processer())
                .writer(new Writer())
                .build();
    }

    @Bean
    public FlatFileItemReader<Student> itemReader() {
        FlatFileItemReader<Student> flatFileItemReader = new FlatFileItemReader<>();
        flatFileItemReader.setResource(resource);
        flatFileItemReader.setName("File-Reader");
        flatFileItemReader.setStrict(false);
        flatFileItemReader.setLineMapper(lineMapper());
        return flatFileItemReader;
    }

    @Bean
    public LineMapper<Student> lineMapper() {
        DefaultLineMapper<Student> defaultLineMapper = new DefaultLineMapper<>();
        FixedLengthTokenizer fixedLengthTokenizer = new FixedLengthTokenizer();
        fixedLengthTokenizer.setNames(new String[] { "name" });
        fixedLengthTokenizer.setColumns(new Range[] { new Range(1, 5) });
        fixedLengthTokenizer.setStrict(false);
        defaultLineMapper.setLineTokenizer(fixedLengthTokenizer);
        defaultLineMapper.setFieldSetMapper(new DefaultFieldSetMapper());
        return defaultLineMapper;
    }
}
Writer Class
@Component
public class Writer implements ItemWriter<Student> {

    @Autowired
    private StudentRepo repo;

    @Override
    public void write(List<? extends Student> items) throws Exception {
        for (Student s : items) {
            System.out.println(s.getName());
        }
        try {
            repo.saveAll(items);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
Here I used a JpaRepository in my custom writer class to save the text file data into the database, and StudentRepo is null.
Why is it null?
In another method I used the same StudentRepo to store data manually, and there was no issue.
Only in the writer class is it null.
In fileReadingStep, you do not use the Writer bean that gets created through the @Component annotation. You create a new Writer instance that the Spring context is unaware of, which is why the repo field is not set and remains null.
You need to inject the Writer bean the same way you inject the ResourceLoader or the JobBuilderFactory, and then use that bean in fileReadingStep.
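A minimal sketch of that change, reusing the names from the configuration above:

@Autowired
private Writer writer; // the @Component-managed bean, so StudentRepo gets injected

@Bean
public Step fileReadingStep() {
    return stepBuilderFactory.get("File-Read-Step1")
            .<Student, Student>chunk(5)
            .reader(itemReader())
            .processor(new Processer()) // fine as long as the processor has no injected dependencies
            .writer(writer)             // the managed bean, instead of new Writer()
            .build();
}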
I'm trying to inject parameters from an outside context into an ItemReader using Spring Batch.
Below is the code that triggers the job:
Date date = Date.from(advanceSystemDateEvent.getReferenceDate().atStartOfDay(ZoneId.systemDefault()).toInstant());
JobParametersBuilder jobParametersBuilder = new JobParametersBuilder();
jobParametersBuilder.addLong("uniqueness", System.nanoTime());
jobParametersBuilder.addDate("date", date);
jobLauncher.run(remuneradorJob, jobParametersBuilder.toJobParameters());
My job:
@EnableBatchProcessing
@Configuration
public class JobsConfig {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Bean
    public Job remuneradorJob(Step remuneradorStep) {
        return jobBuilderFactory
                .get("remuneradorJob")
                .start(remuneradorStep)
                .incrementer(new RunIdIncrementer())
                .build();
    }
}
My step:
@Configuration
public class StepsConfig {

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    public Step remuneradorStep(ItemReader<Entity> remuneradorReader, ItemWriter<Entity> remuneradorWriter) {
        return stepBuilderFactory
                .get("remuneradorStep")
                .<Entity, Entity>chunk(1)
                .reader(remuneradorReader)
                .writer(remuneradorWriter)
                .build();
    }
}
My ItemReader:
@Configuration
public class RemuneradorReaderConfig {

    @Autowired
    private EntityManagerFactory entityManagerFactory;

    @Bean
    @StepScope
    public JpaPagingItemReader<Entity> remuneradorReader(@Value("#{jobParameters}") Map jobParameters) {
        //LocalDate localDate = date.toInstant().atZone(ZoneId.systemDefault()).toLocalDate();
        LocalDate localDate = LocalDate.of(2011, 5, 16);
        JpaPagingItemReader<Entity> databaseReader = new JpaPagingItemReader<>();
        databaseReader.setEntityManagerFactory(entityManagerFactory);
        databaseReader.setQueryString("...");
        databaseReader.setPageSize(1000);
        databaseReader.setParameterValues(Collections.<String, Object>singletonMap("limit", localDate));
        return databaseReader;
    }
}
I got the error:
Error creating bean with name 'scopedTarget.remuneradorReader': Scope 'step' is not active for the current thread; consider defining a scoped proxy for this bean if you intend to refer to it from a singleton; nested exception is java.lang.IllegalStateException: No context holder available for step scope
I tried replacing @StepScope with @JobScope, but I got the same error.
I saw this issue:
Spring batch scope issue while using spring boot
After following it, the application finally runs.
But now I'm facing another problem with the code below:
@Bean
@StepScope
public JpaPagingItemReader<Entity> remuneradorReader(@Value("#{jobParameters}") Map jobParameters) {
    JpaPagingItemReader<Entity> databaseReader = new JpaPagingItemReader<>();
    databaseReader.setEntityManagerFactory(entityManagerFactory);
    databaseReader.setQueryString("select o from Object o");
    databaseReader.setPageSize(1000);
    return databaseReader;
}
When I execute this reader, it gives me:
Deposit{Deposit_ID='null', Legacy_ID ='null', Valor_Depósito='10000', Saldo='10000'}
The idDeposit and idLegacy come back null.
But when I remove @StepScope and @Value("#{jobParameters}") Map jobParameters from the ItemReader, as in the code below:
@Bean
public JpaPagingItemReader<Entity> remuneradorReader() {
    JpaPagingItemReader<Entity> databaseReader = new JpaPagingItemReader<>();
    databaseReader.setEntityManagerFactory(entityManagerFactory);
    databaseReader.setQueryString("select o from Object o");
    databaseReader.setPageSize(1000);
    return databaseReader;
}
The reader gives me the correct response:
Deposit{Deposit_ID='98', Legacy_ID ='333', Valor_Depósito='10000', Saldo='10000'}
I think it's missing something else.
Can anyone help me?
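As an aside, job parameters are usually injected individually, by key and with a concrete type, rather than as a raw Map. A minimal sketch of that style, assuming the "date" parameter set by the launcher code above:

@Bean
@StepScope
public JpaPagingItemReader<Entity> remuneradorReader(@Value("#{jobParameters['date']}") Date date) {
    LocalDate localDate = date.toInstant().atZone(ZoneId.systemDefault()).toLocalDate();
    JpaPagingItemReader<Entity> databaseReader = new JpaPagingItemReader<>();
    databaseReader.setEntityManagerFactory(entityManagerFactory);
    databaseReader.setQueryString("..."); // same query with a :limit parameter as before
    databaseReader.setPageSize(1000);
    databaseReader.setParameterValues(Collections.<String, Object>singletonMap("limit", localDate));
    return databaseReader;
}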
I have a need to share relatively large amounts of data between job steps in a Spring Batch implementation. I am aware of StepExecutionContext and JobExecutionContext as mechanisms for this. However, I read that these must be limited in size (less than 2500 characters), which is too small for my needs. In my novice one-step Spring Batch implementation, my single-step job is as below:
@Configuration
@EnableBatchProcessing
public class BatchConfig {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    private static final String GET_DATA =
            "SELECT " +
            "stuffA, " +
            "stuffB " +
            "FROM STUFF_TABLE " +
            "ORDER BY stuffA ASC";

    @Bean
    public ItemReader<StuffDto> databaseCursorItemReader(DataSource dataSource) {
        return new JdbcCursorItemReaderBuilder<StuffDto>()
                .name("cursorItemReader")
                .dataSource(dataSource)
                .sql(GET_DATA)
                .rowMapper(new BeanPropertyRowMapper<>(StuffDto.class))
                .build();
    }

    @Bean
    ItemProcessor<StuffDto, StuffDto> databaseXmlItemProcessor() {
        return new QueryLoggingProcessor();
    }

    @Bean
    public ItemWriter<StuffDto> databaseCursorItemWriter() {
        return new LoggingItemWriter();
    }

    @Bean
    public Step databaseCursorStep(@Qualifier("databaseCursorItemReader") ItemReader<StuffDto> reader,
                                   @Qualifier("databaseCursorItemWriter") ItemWriter<StuffDto> writer,
                                   StepBuilderFactory stepBuilderFactory) {
        return stepBuilderFactory.get("databaseCursorStep")
                .<StuffDto, StuffDto>chunk(1)
                .reader(reader)
                .writer(writer)
                .build();
    }

    @Bean
    public Job databaseCursorJob(@Qualifier("databaseCursorStep") Step exampleJobStep,
                                 JobBuilderFactory jobBuilderFactory) {
        return jobBuilderFactory.get("databaseCursorJob")
                .incrementer(new RunIdIncrementer())
                .flow(exampleJobStep)
                .end()
                .build();
    }
}
This works fine in the sense that I can successfully read from the database and, in the writer step, write to a LoggingItemWriter like this:
public class LoggingItemWriter implements ItemWriter<StuffDto> {

    private static final Logger LOGGER = LoggerFactory.getLogger(LoggingItemWriter.class);

    @Override
    public void write(List<? extends StuffDto> list) throws Exception {
        LOGGER.info("Writing stuff: {}", list);
    }
}
However, I need to make that StuffDto (or equivalent) and its data available to a second step that performs some processing against it, rather than just logging it.
I would be grateful for any ideas on how that could be accomplished, assuming the step and job contexts are too limited. Thanks.
If you do not want to write the data to the database or filesystem, one way to achieve this is as follows:
Create your own job context bean in your config class with the required properties and annotate it with @JobScope.
Implement the org.springframework.batch.core.step.tasklet.Tasklet interface in your reader, processor, and writer classes. If you want more control over the steps, you can also implement org.springframework.batch.core.StepExecutionListener.
Autowire your context object and use its getters and setters to store and retrieve the data.
Sample Code:
Config.java
@Autowired
private Processor processor;

@Autowired
private Reader reader;

@Autowired
private Writer writer;

@Autowired
private JobBuilderFactory jobBuilderFactory;

@Autowired
private StepBuilderFactory stepBuilderFactory;

@Autowired
private JobLauncher monitorJobLauncher; // the launcher used below, added so the snippet is self-contained

@Bean
@JobScope
public JobContext getJobContexts() {
    return new JobContext();
}

@Bean
public Step reader() {
    return stepBuilderFactory.get("reader")
            .tasklet(reader)
            .build();
}

@Bean
public Step processor() {
    return stepBuilderFactory.get("processor")
            .tasklet(processor)
            .build();
}

@Bean
public Step writer() {
    return stepBuilderFactory.get("writer")
            .tasklet(writer)
            .build();
}

public Job testJob() {
    return jobBuilderFactory.get("testJob")
            .start(reader())
            .next(processor())
            .next(writer())
            .build();
}

// Below will start the job
@Scheduled(fixedRate = 1000)
public void startJob() throws Exception {
    Map<String, JobParameter> confMap = new HashMap<>();
    confMap.put("time", new JobParameter(new Date()));
    JobParameters jobParameters = new JobParameters(confMap);
    monitorJobLauncher.run(testJob(), jobParameters);
}
JobContext.java
public class JobContext {

    private List<StuffDto> dataToProcess = new ArrayList<>();
    private List<StuffDto> dataToWrite = new ArrayList<>();

    // getters and setters for both lists
}
SampleReader.java
@Component
public class SampleReader implements Tasklet, StepExecutionListener {

    @Autowired
    private JobContext context;

    @Override
    public void beforeStep(StepExecution stepExecution) {
        // logic that you need to perform before the execution of this step.
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        // logic that you need to perform after the execution of this step.
        return null; // null leaves the step's exit status unchanged
    }

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        // Whatever code is here will get executed for the reader.
        // Fetch StuffDto objects from the database and add them to the
        // JobContext dataToProcess list.
        return RepeatStatus.FINISHED;
    }
}
SampleProcessor.java
@Component
public class SampleProcessor implements Tasklet {

    @Autowired
    private JobContext context;

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        // Whatever code is here will get executed for the processor.
        // context.getDataToProcess();
        // Apply business logic and set the data to write.
        return RepeatStatus.FINISHED;
    }
}
The same approach applies to the writer class; see the sketch below.
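For illustration, a writer tasklet of the same shape might look like this (SampleWriter is a hypothetical name; the persistence call is whatever your DAO layer provides):

@Component
public class SampleWriter implements Tasklet {

    @Autowired
    private JobContext context;

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        // Read context.getDataToWrite() and persist it through your DAO layer.
        return RepeatStatus.FINISHED;
    }
}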
Note: you will need to write the database-related boilerplate code on your own, but this way you have more control over your logic and no context size limit to worry about. All the data is held in memory, so once the operation is done it can be garbage collected. I hope this conveys the idea.
To read more about Tasklet vs Chunk, read this.
I did some searching but couldn't find example code: Spring Batch reading from a REST API (which I have done) and writing multiple records per read to a single DB table using JdbcBatchItemWriter.
Below is my BatchConfig code, but it writes only one record. I think I have to make my processor return a List of Registration objects, and the JdbcBatchItemWriter has to write multiple records.
Code:
@Configuration
@EnableBatchProcessing
public class BatchConfiguration {

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Autowired
    private Environment environment;

    @Bean
    RestTemplate restTemplate() {
        return new RestTemplate();
    }

    // my reader
    @Bean
    ItemReader<EmployeeEmploymentDTO> restEmployeeReader(Environment environment, RestTemplate restTemplate) {
        return new RESTEmployeeReader(
                environment.getRequiredProperty("rest.api.to.listemployees.ugs.api.url"),
                restTemplate
        );
    }

    // my processor, which is a separate class
    @Bean
    public RegistrationItemProcessor processor() {
        return new RegistrationItemProcessor();
    }

    // my writer, which now only inserts one record per read, but I want to
    // insert a varying number of records per read
    @Bean
    public JdbcBatchItemWriter<Registration> writer(DataSource dataSource) {
        return new JdbcBatchItemWriterBuilder<Registration>()
                .itemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<>())
                .sql("INSERT INTO registration ...") // insert statement omitted
                .dataSource(dataSource)
                .build();
    }

    @Bean
    public Job importUserJob(JobCompletionNotificationListener listener, Step step1) {
        return jobBuilderFactory.get("importUserJob")
                .incrementer(new RunIdIncrementer())
                .listener(listener)
                .flow(step1)
                .end()
                .build();
    }

    @Bean
    public Step step1(JdbcBatchItemWriter<Registration> writer) {
        return stepBuilderFactory.get("step1")
                .<EmployeeEmploymentDTO, Registration>chunk(10)
                .reader(restEmployeeReader(environment, restTemplate()))
                .processor(processor())
                .writer(writer)
                .build();
    }
}
My processor now returns a List of Registration objects.
My writer is as below:
public class MultiOutputItemWriter implements ItemWriter<List<Registration>> {

    ItemWriter<Registration> itemWriter;

    @Autowired
    NamedParameterJdbcTemplate namedParamJdbcTemplate;

    @Override
    public void write(List<? extends List<Registration>> items) throws Exception {
        final String SQL_INSERT_INTO_REGISTRATION = "INSERT INTO registration (employee_id, ....";
        for (List<Registration> registrations : items) {
            final List<MapSqlParameterSource> params = new ArrayList<>();
            for (Registration registration : registrations) {
                MapSqlParameterSource param = new MapSqlParameterSource();
                param.addValue("employeeId", registration.getEmployeeId());
                param.addValue("startDate", registration.getStartDate());
                param.addValue("user", registration.getUser());
                param.addValue("endTime", registration.getEndTime());
                params.add(param);
            }
            namedParamJdbcTemplate.batchUpdate(SQL_INSERT_INTO_REGISTRATION, params.toArray(new MapSqlParameterSource[params.size()]));
        }
    }
}
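One detail worth noting: once the processor emits List<Registration>, the step's generic types have to change to match, so the writer above actually receives lists. A sketch of what step1 might then look like (assuming the processor is declared as ItemProcessor<EmployeeEmploymentDTO, List<Registration>>):

@Bean
public Step step1(MultiOutputItemWriter writer) {
    return stepBuilderFactory.get("step1")
            .<EmployeeEmploymentDTO, List<Registration>>chunk(10)
            .reader(restEmployeeReader(environment, restTemplate()))
            .processor(processor())
            .writer(writer)
            .build();
}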
I am trying to write an IntegrationFlow test. It goes something like this:
JMS(in) -> (find previous versions in db) -> reduce(in,1...n) -> (to db) -> JMS(out)
So, no surprise: I want to mock the DB calls; they are DAO beans. But I also want it to pick up other beans through component scan; I will selectively scan all packages except dao.
Create a test config and mock the DAOs. No problem.
Follow the Spring Boot testing instructions to get component-scanned beans. No problem.
I just want to verify the sequence of steps and the resultant output as the outbound JMS queue would see it. Can someone just help me fill in the blanks?
This can't be that tough! The use of mocks seems to be problematic because plenty of essential fields are final. I am reading everywhere about this and just not coming up with a clear path. I inherited this code, BTW.
My error:
org.springframework.integration.MessageDispatchingException: Dispatcher has no subscribers
Here is my code
@Configuration
@ImportResource("classpath:retry-context.xml")
public class LifecycleConfig {

    @Autowired
    private MessageProducerSupport inbound;

    @Autowired
    private MessageHandler outbound;

    @Autowired
    @Qualifier("reducer")
    private GenericTransformer<Collection<ExtendedClaim>, ExtendedClaim> reducer;

    @Autowired
    @Qualifier("claimIdToPojo")
    private GenericTransformer<String, ClaimDomain> toPojo;

    @Autowired
    @Qualifier("findPreviousVersion")
    private GenericTransformer<ExtendedClaim, Collection<ExtendedClaim>> previousVersions;

    @Autowired
    @Qualifier("saveToDb")
    private GenericHandler<ExtendedClaim> toDb;

    @Bean
    public DirectChannel getChannel() {
        return new DirectChannel();
    }

    @Bean
    @Autowired
    public StandardIntegrationFlow processClaim() {
        return IntegrationFlows.from(inbound)
                .channel(getChannel())
                .transform(previousVersions)
                .transform(reducer)
                .handle(ExtendedClaim.class, toDb)
                .transform(toPojo)
                .handle(outbound)
                .get();
    }
}
Test Config
@Configuration
public class TestConfig extends AbstractClsTest {

    @Bean(name = "claimIdToPojo")
    public ClaimIdToPojo getClaimIdToPojo() {
        return spy(new ClaimIdToPojo());
    }

    @Bean
    public ClaimToId getClaimToIdPojo() {
        return spy(new ClaimToId());
    }

    @Bean(name = "findPreviousVersion")
    public FindPreviousVersion getFindPreviousVersion() {
        return spy(new FindPreviousVersion());
    }

    @Bean(name = "reducer")
    public Reducer getReducer() {
        return spy(new Reducer());
    }

    @Bean(name = "saveToDb")
    public SaveToDb getSaveToDb() {
        return spy(new SaveToDb());
    }

    @Bean
    public MessageProducerSupport getInbound() {
        MessageProducerSupport mock = mock(MessageProducerSupport.class);
        // when(mock.isRunning()).thenReturn(true);
        return mock;
    }

    @Bean
    public PaymentDAO getPaymentDao() {
        return mock(PaymentDAO.class);
    }

    @Bean
    public ClaimDAO getClaimDao() {
        return mock(ClaimDAO.class);
    }

    @Bean
    public MessageHandler getOutbound() {
        return new CaptureHandler<ExtendedClaim>();
    }
}
The actual test won't load:
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = {TestConfig.class, LifecycleConfig.class})
public class ClaimLifecycleApplicationTest extends AbstractClsTest {

    @Autowired
    private MessageHandler outbound;

    @Autowired
    @Qualifier("reducer")
    private GenericTransformer<Collection<ExtendedClaim>, ExtendedClaim> reducer;

    @Autowired
    @Qualifier("claimIdToPojo")
    private GenericTransformer<String, ClaimDomain> toPojo;

    @Autowired
    @Qualifier("findPreviousVersion")
    private GenericTransformer<ExtendedClaim, Collection<ExtendedClaim>> previousVersions;

    @Autowired
    @Qualifier("saveToDb")
    private GenericHandler<ExtendedClaim> toDb;

    @Autowired
    private DirectChannel defaultChannel;

    @Test
    public void testFlow() throws Exception {
        ExtendedClaim claim = getClaim();
        Message<ExtendedClaim> message = MessageBuilder.withPayload(claim).build();
        List<ExtendedClaim> previousClaims = Arrays.asList(claim);
        defaultChannel.send(message);
        verify(previousVersions).transform(claim);
        verify(reducer).transform(previousClaims);
        verify(toDb).handle(claim, anyMap());
        verify(toPojo).transform(claim.getSubmitterClaimId());
        verify(outbound);
    }
}
There are a lot of domain-specific objects, so I can't run your code to reproduce the problem or spot some other issue with it.
But I see that you don't use @EnableIntegration on your @Configuration classes.
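A minimal sketch of that change, applied to the configuration above:

@Configuration
@EnableIntegration
@ImportResource("classpath:retry-context.xml")
public class LifecycleConfig {
    // ... beans as above ...
}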
I am writing a Spring Batch job with the idea of scaling it when required.
My ApplicationContext looks like this:
@Configuration
@EnableBatchProcessing
@EnableTransactionManagement
@ComponentScan(basePackages = "in.springbatch")
@PropertySource(value = {"classpath:springbatch.properties"})
public class ApplicationConfig {

    @Autowired
    Environment environment;

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Bean
    public Job job() throws Exception {
        return jobs.get("spring_batch")
                .flow(step()).end()
                .build();
    }

    @Bean(name = "dataSource", destroyMethod = "close")
    public DataSource dataSource() {
        BasicDataSource basicDataSource = new BasicDataSource();
        return basicDataSource;
    }

    @Bean
    public JobRepository jobRepository() throws Exception {
        JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
        jobRepositoryFactoryBean.setTransactionManager(transactionManager());
        jobRepositoryFactoryBean.setDataSource(dataSource());
        return jobRepositoryFactoryBean.getObject();
    }

    @Bean(name = "batchstep")
    public Step step() throws Exception {
        return stepBuilderFactory.get("batchstep").allowStartIfComplete(true)
                .transactionManager(transactionManager())
                .chunk(2).reader(batchReader()).processor(processor()).writer(writer()).build();
    }

    @Bean
    ItemReader batchReader() throws Exception {
        System.out.println(Thread.currentThread().getName() + "reader");
        HibernateCursorItemReader<Source> hibernateCursorItemReader = new HibernateCursorItemReader<>();
        hibernateCursorItemReader.setQueryString("from Source");
        hibernateCursorItemReader.setFetchSize(2);
        hibernateCursorItemReader.setSessionFactory(sessionFactory().getObject());
        hibernateCursorItemReader.close();
        return hibernateCursorItemReader;
    }

    @Bean
    public ItemProcessor processor() {
        return new BatchProcessor();
    }

    @Bean
    public ItemWriter writer() {
        return new BatchWriter();
    }

    public TaskExecutor taskExecutor() {
        SimpleAsyncTaskExecutor asyncTaskExecutor = new SimpleAsyncTaskExecutor("spring_batch");
        asyncTaskExecutor.setConcurrencyLimit(5);
        return asyncTaskExecutor;
    }

    @Bean
    public LocalSessionFactoryBean sessionFactory() {
        LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
        sessionFactory.setDataSource(dataSource());
        sessionFactory.setPackagesToScan(new String[]{"in.springbatch.entity"});
        sessionFactory.setHibernateProperties(hibernateProperties());
        return sessionFactory;
    }

    @Bean
    public PersistenceExceptionTranslationPostProcessor exceptionTranslation() {
        return new PersistenceExceptionTranslationPostProcessor();
    }

    @Bean
    @Autowired
    public HibernateTransactionManager transactionManager() {
        HibernateTransactionManager txManager = new HibernateTransactionManager();
        txManager.setSessionFactory(sessionFactory().getObject());
        return txManager;
    }

    Properties hibernateProperties() {
        return new Properties() {
            {
                setProperty("hibernate.hbm2ddl.auto", environment.getProperty("hibernate.hbm2ddl.auto"));
                setProperty("hibernate.dialect", environment.getProperty("hibernate.dialect"));
                setProperty("hibernate.globally_quoted_identifiers", "false");
            }
        };
    }
}
With the above configuration I am able to read from the DB, process the data, and write to the DB. I am using a chunk size of 2 and reading 2 records from the cursor using HibernateCursorItemReader, and my query to read from the DB is based on date, to pick the current date's records.
So far I am able to achieve the desired behavior, as well as restartability, with the job only picking records which were not processed due to a failure in the previous run.
Now my requirement is to make the batch use multiple threads to process the data and write to the DB.
My processor and writer look like this:
@Component
public class BatchProcessor implements ItemProcessor<Source, DestinationDto> {

    @Override
    public DestinationDto process(Source source) throws Exception {
        System.out.println(Thread.currentThread().getName() + ":" + source);
        DestinationDto destination = new DestinationDto();
        destination.setName(source.getName());
        destination.setValue(source.getValue());
        destination.setSourceId(source.getSourceId().toString());
        return destination;
    }
}

@Component
public class BatchWriter implements ItemWriter<DestinationDto> {

    @Autowired
    IBatchDao batchDao;

    @Override
    public void write(List<? extends DestinationDto> list) throws Exception {
        System.out.println(Thread.currentThread().getName() + ":" + list);
        batchDao.saveToDestination((List<DestinationDto>) list);
    }
}
I updated my step to add a task executor, as follows:
@Bean(name = "batchstep")
public Step step() throws Exception {
    return stepBuilderFactory.get("batchstep").allowStartIfComplete(true)
            .transactionManager(transactionManager()).chunk(1).reader(batchReader())
            .processor(processor()).writer(writer()).taskExecutor(taskExecutor()).build();
}
After this, my processor is getting called by multiple threads, but with the same source data.
Is there anything extra I need to do?
This is a big question.
Your best bet at getting a good answer would be to look through the Scaling and Parallel Processing chapter in the Spring Batch documentation (here).
There might be some multi-threading samples in the Spring Batch examples (here).
An easy way to thread a Spring Batch job is to create a "future processor": you put all your processing logic in a Future object, your Spring processor class only adds objects to the future, and your writer class then waits on the futures to finish before performing the write. Sorry, I don't have a sample to point you to for this, but if you have specific questions I can try to answer!
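For what it's worth, spring-batch-integration ships AsyncItemProcessor and AsyncItemWriter, which implement exactly this pattern. A minimal sketch against the beans above (the wiring is an assumption, not the asker's code):

@Bean
public AsyncItemProcessor<Source, DestinationDto> asyncProcessor() {
    AsyncItemProcessor<Source, DestinationDto> asyncProcessor = new AsyncItemProcessor<>();
    asyncProcessor.setDelegate(processor());        // the existing BatchProcessor does the real work
    asyncProcessor.setTaskExecutor(taskExecutor()); // each item is processed in its own Future
    return asyncProcessor;
}

@Bean
public AsyncItemWriter<DestinationDto> asyncWriter() {
    AsyncItemWriter<DestinationDto> asyncWriter = new AsyncItemWriter<>();
    asyncWriter.setDelegate(writer());              // unwraps the Futures, then calls BatchWriter
    return asyncWriter;
}

The step would then declare .<Source, Future<DestinationDto>>chunk(2) with these two beans and drop the step-level taskExecutor, keeping the reader single-threaded, which also sidesteps the fact that HibernateCursorItemReader is not thread-safe.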