Spring Batch - Using an ItemWriter with List of Lists - java

Our processor returns a List<?> (effectively passing a List<List<?>>) to our ItemWriter.
Now, we observed that the JdbcBatchItemWriter is not programmed to handle an item that is itself a List. We also observed that, to process such an item, we need to write a custom ItemSqlParameterSourceProvider.
But the sad part is that it returns an SqlParameterSource, which can handle only one item and again is not capable of handling a List.
So, can someone help us understand how to handle a list of lists in the JdbcBatchItemWriter?

Typically, the design pattern is:
Reader -> reads something, returns ReadItem
Processor -> ingests ReadItem, returns ProcessedItem
Writer -> ingests List<ProcessedItem>
If your processor is returning List<Object>, then you need your Writer to expect List<List<Object>>.
You could do this by wrapping your JdbcBatchItemWriter as a delegate in an ItemWriter that looks something like this:
public class ListUnpackingItemWriter<T> implements ItemWriter<List<T>>, ItemStream, InitializingBean {

    private ItemWriter<T> delegate;

    @Override
    public void write(final List<? extends List<T>> lists) throws Exception {
        final List<T> consolidatedList = new ArrayList<>();
        for (final List<T> list : lists) {
            consolidatedList.addAll(list);
        }
        delegate.write(consolidatedList);
    }

    @Override
    public void afterPropertiesSet() {
        Assert.notNull(delegate, "You must set a delegate!");
    }

    @Override
    public void open(ExecutionContext executionContext) {
        if (delegate instanceof ItemStream) {
            ((ItemStream) delegate).open(executionContext);
        }
    }

    @Override
    public void update(ExecutionContext executionContext) {
        if (delegate instanceof ItemStream) {
            ((ItemStream) delegate).update(executionContext);
        }
    }

    @Override
    public void close() {
        if (delegate instanceof ItemStream) {
            ((ItemStream) delegate).close();
        }
    }

    public void setDelegate(ItemWriter<T> delegate) {
        this.delegate = delegate;
    }
}
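For illustration, a minimal wiring sketch for the wrapper above; the ProcessedItem type and bean names are hypothetical:
@Bean
public ListUnpackingItemWriter<ProcessedItem> listUnpackingItemWriter(JdbcBatchItemWriter<ProcessedItem> jdbcWriter) {
    // Wrap the JDBC writer so the step can hand us List<ProcessedItem> items.
    ListUnpackingItemWriter<ProcessedItem> writer = new ListUnpackingItemWriter<>();
    writer.setDelegate(jdbcWriter);
    return writer;
}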

public class ListUnpackingItemWriter<T> extends FlatFileItemWriter<List<T>> {

    @Override
    public void afterPropertiesSet() throws Exception {
        setLineAggregator(item -> String.join("\n", item.stream().map(Object::toString).collect(Collectors.toList())));
        super.afterPropertiesSet();
    }
}
Just added a custom line aggregator to the above solution; this helps in writing the content to a file by using a FlatFileItemWriter<List<T>>. Note that FlatFileItemWriter is a class, so it must be extended rather than implemented, and the method reference needs a concrete receiver type (Object::toString here), because T::toString does not compile.
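As a usage sketch (the resource path and item type are hypothetical):
@Bean
public ListUnpackingItemWriter<ProcessedItem> listFileWriter() {
    ListUnpackingItemWriter<ProcessedItem> writer = new ListUnpackingItemWriter<>();
    writer.setResource(new FileSystemResource("output/items.txt")); // hypothetical output file
    return writer;
}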

Related

How to use MongoItemWriter to write a List<T>

My processor returns a List of objects, but I can't find a way to override the MongoItemWriter to write a List of objects rather than a single object.
I tried this.
My processor:
@Override
public List<PlaqueLueEntity> process(PlaqueSousSurveillanceEntity item) {
    log.trace("Processing a PlaqueSousSurveillanceEntity entry: {}", item);
    List<PlaqueLue> rapprochements = this.rapprochementUseCase.findRapprochementByPlaque(item.getPlaque());
    if (rapprochements.isEmpty()) {
        return null;
    }
    for (PlaqueLue rapprochement : rapprochements) {
        rapprochement.setRapprochement(true);
        rapprochement.setHorodatageRapprochement(LocalDateTime.now());
    }
    List<PlaqueLueEntity> plaqueLueEntities =
            rapprochements.stream().map(this.plateDetectionMapper::plateDetectionToPlateDetectionEntity).toList();
    return plaqueLueEntities;
}
My writer:
public class RapprochementMongoWriter<T> extends MongoItemWriter<List<T>> {

    private MongoOperations template;
    private MongoItemWriter<T> delegate;

    @Override
    public void write(final List<? extends List<T>> lists) throws Exception {
        for (final List<T> list : lists) {
            delegate.write(list);
        }
    }

    @Bean
    public MongoItemWriter<List<PlaqueLueEntity>> writer(MongoTemplate mongoTemplate,
            BatchConfigurationProperties batchConfigurationProperties) {
        MongoItemWriter<List<PlaqueLueEntity>> writer = new MongoItemWriter<>();
        writer.setTemplate(mongoTemplate);
        writer.setCollection(String.valueOf(CollectionEnum.COLLECTION.PLAQUE_LUE));
        return writer;
    }
}
And I defined in my BatchConfiguration:
@Bean
public MongoItemWriter<List<PlaqueLueEntity>> rapprochementWriter() {
    return new RapprochementMongoWriter().writer(mongoTemplate, batchConfigurationProperties);
}
But before I can even debug my writer, I get this error:
class org.bson.Document cannot be cast to class java.util.Collection (org.bson.Document is in unnamed module of loader 'app'; java.util.Collection is in module java.base of loader 'bootstrap')
And:
Attempt to update step execution id=1 with wrong version (1), where current version is 2
I can't find a way to override the MongoItemWriter to write a List of objects rather than a single object.
The way you defined your custom item writer is correct:
public class RapprochementMongoWriter<T> extends MongoItemWriter<List<T>> {
    private MongoItemWriter<T> delegate;
    //...
}
However, you should not call the delegate for each list like this:
@Override
public void write(final List<? extends List<T>> lists) throws Exception {
    for (final List<T> list : lists) {
        delegate.write(list);
    }
}
What you should do instead is "flatten" the items into a single list and call the delegate once, something like this:
@Override
public void write(final List<? extends List<T>> lists) throws Exception {
    List<T> flatList = new ArrayList<>();
    for (final List<T> list : lists) {
        for (T item : list) {
            flatList.add(item);
        }
    }
    delegate.write(flatList);
}
Please note that I did not compile that snippet, so I will let you adapt it if needed, but you get the idea.
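For completeness, a minimal wiring sketch; it assumes a plain MongoItemWriter<PlaqueLueEntity> bean as the delegate and a setter for the delegate field (names are illustrative):
@Bean
public MongoItemWriter<PlaqueLueEntity> delegateWriter(MongoTemplate mongoTemplate) {
    MongoItemWriter<PlaqueLueEntity> writer = new MongoItemWriter<>();
    writer.setTemplate(mongoTemplate);
    writer.setCollection("plaqueLue"); // hypothetical collection name
    return writer;
}

@Bean
public RapprochementMongoWriter<PlaqueLueEntity> rapprochementWriter(MongoItemWriter<PlaqueLueEntity> delegateWriter) {
    RapprochementMongoWriter<PlaqueLueEntity> writer = new RapprochementMongoWriter<>();
    writer.setDelegate(delegateWriter); // assumes the delegate field gets a setter
    return writer;
}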

Spring Batch: onWriteError in ItemWriteListener won't be reached on org.springframework.dao.DuplicateKeyException when implementing writer in step

I'm trying to capture items that have failed to be inserted into the database due to a unique key violation. After researching 'everywhere', this is discouragingly starting to feel like an unfixed bug or something. I honestly need a helpful clarification.
Edit: I visited here to have skipped exceptions captured using @OnSkipInWrite, and this is either not clear enough or just won't work.
@Bean
public PersonFailureLoggerListener loggerListener() {
    return new PersonFailureLoggerListener();
}

@Bean
public Step step1(JdbcBatchItemWriter<Person> writer) throws MalformedURLException {
    return stepBuilderFactory.get("step1")
            .<Person, Person>chunk(10)
            .reader(reader())
            .faultTolerant().skipPolicy(fileVerificationSkipper())
            .processor(itemProcessor())
            .writer(writer)
            //.listener(new PersonWriteListener())
            .listener(loggerListener().asItemProcessListener())
            .build();
}
Item listener support:
@Component
public class PersonFailureLoggerListener extends ItemListenerSupport<Person, Person> {

    private static final Logger LOG = LoggerFactory.getLogger(Person.class.getSimpleName().concat(".error"));

    @Override
    public void afterProcess(Person i, Person o) {
        LOG.info("before: {}, after: {}", i.toString(), o.toString());
    }

    @Override
    public void onWriteError(Exception excptn, List<? extends Person> list) {
        LOG.debug("list[{}] threw write exception", list.size(), excptn);
    }

    @Override
    public void onReadError(Exception excptn) {
        LOG.debug("encountered error on read", excptn);
    }

    public ItemProcessListener<Person, Person> asItemProcessListener() {
        return this;
    }
}
Update on PersonFailureLoggerListener:
@OnSkipInWrite
public void onSkipInWrite(Person skippedItem, Exception exception) {
    LOG.debug("skipped person: {}", skippedItem.toString());
}
Edit: I visited here to have skipped exceptions captured using @OnSkipInWrite, and this is either not clear enough or just won't work.
The answer there suggests using a SkipListener, but you are registering an ItemProcessListener here:
.listener(loggerListener().asItemProcessListener())
Those are two different listeners which are called at different points in the step lifecycle. To listen to skipped items, you should register a SkipListener:
class MySkipListener extends SkipListenerSupport {
    @Override
    public void onSkipInWrite(Object item, Throwable t) {
        // TODO do something with skipped items
    }
}

@Bean
public Step step() {
    return stepBuilderFactory.get("step")
            // .. step config
            .listener(new MySkipListener())
            .build();
}
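Note also that a SkipListener is only invoked for exceptions the step is actually configured to skip. A minimal sketch, assuming you want DuplicateKeyException to be skippable (the skip limit is illustrative):
@Bean
public Step step1(JdbcBatchItemWriter<Person> writer) {
    return stepBuilderFactory.get("step1")
            .<Person, Person>chunk(10)
            .reader(reader())
            .processor(itemProcessor())
            .writer(writer)
            .faultTolerant()
            .skip(DuplicateKeyException.class) // declare the exception as skippable
            .skipLimit(10)                     // illustrative limit
            .listener(new MySkipListener())
            .build();
}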

Design pattern suggestion to perform pipeline operation

Problem statement:
I have to process requests similar to a pipeline.
For example:
When a request comes in, it has to undergo a sequence of operations (step1, step2, step3, ...).
So, in order to achieve that, I am using the Template Method design pattern.
Please review and suggest whether I am implementing this problem correctly, or whether there is a better solution.
I suspect my approach will introduce code smells, as I am changing the values of objects very frequently.
Also, please suggest if and how I can use Java 8 to accomplish this.
Thanks.
Code:
package com.example.demo.design;

import java.util.List;

public abstract class Template {

    @Autowired
    private Step1 step1;
    @Autowired
    private Step2 step2;
    @Autowired
    private Save save;

    List<String> stepOutput = null;
    List<String> stepOutputTwo = null;
    List<String> stepOutputThree = null;

    public void step1(String action1) {
        stepOutput = step1.method(action1);
    }

    public void step2(String action2) {
        stepOutputTwo = step2.method(stepOutput, action2);
    }

    abstract public void step3();

    public void save() {
        save.persist(stepOutputThree);
    }

    final public void run(String action1, String action2) {
        step1(action1);
        step2(action2);
        step3();
    }
}
In the Java 8 streams model, that could look like the following:
final public void run(String action1, String action2) {
    Stream.of(action1)                        // Stream<String>
          .map(s -> step1.method(s))          // Stream<List<String>>
          .map(l -> step2.method(l, action2)) // Stream<List<String>>
          .map(l -> step3.method(l))          // Stream<List<String>>
          .forEach(l -> save.persist(l));
}
I had the same issue! You can do something like this (the uncheckCall method is for handling exceptions):
final public void run(String action1, String action2) {
    // other stuff
    Stream.of(step1.method(action1))
          .map(stepOutput -> uncheckCall(() -> step2.method(stepOutput, action2)))
          .forEach(stepOutputThree -> uncheckCall(() -> save.persist(stepOutputThree)));
    // .....
}
for the uncheckCall method:
public static <T> T uncheckCall(Callable<T> callable) {
    try {
        return callable.call();
    } catch (Exception e) {
        throw new RuntimeException(e); // or: throw BusinessException.wrap(e);
    }
}
Well, when there are "pipelines", "sequences of operations", etc., the first design pattern that comes to mind is Chain of Responsibility. It looks like the following
and provides you with these benefits:
allows you to add new handlers when necessary (e.g. at runtime) without modifying other handlers and processing logic (Open/Closed Principle of SOLID)
allows a handler to stop processing a request if necessary
allows you to decouple the processing logic of the handlers from each other (Single Responsibility Principle of SOLID)
allows you to define the order of the handlers to process a request outside of the handlers themselves
One example of real-world usage is Servlet filters, where you call doFilter(HttpRequest, HttpResponse, FilterChain) to invoke the next handler:
protected void doFilter(HttpServletRequest req, HttpServletResponse resp, FilterChain chain) {
    if (haveToInvokeNextHandler) {
        chain.doFilter(req, resp);
    }
}
In case of using classical Chain of Responsibility pattern your processing pipeline may look like the following:
API
public class StepContext {

    private Map<String, Object> attributes = new HashMap<>();

    public <T> T getAttribute(String name) {
        return (T) attributes.get(name);
    }

    public void setAttribute(String name, Object value) {
        attributes.put(name, value);
    }
}
public interface Step {
    void handle(StepContext ctx);
}

public abstract class AbstractStep implements Step {

    private Step next;

    public AbstractStep() {
    }

    public AbstractStep(Step next) {
        this.next = next;
    }

    protected void next(StepContext ctx) {
        if (next != null) {
            next.handle(ctx);
        }
    }
}
Implementation
public class Step1 extends AbstractStep {

    public Step1(Step next) {
        super(next);
    }

    public void handle(StepContext ctx) {
        String action1 = ctx.getAttribute("action1");
        List<String> output1 = doSomething(action1);
        ctx.setAttribute("output1", output1);
        next(ctx); // invoke next step
    }
}

public class Step2 extends AbstractStep {

    public Step2(Step next) {
        super(next);
    }

    public void handle(StepContext ctx) {
        String action2 = ctx.getAttribute("action2");
        List<String> output1 = ctx.getAttribute("output1");
        List<String> output2 = doSomething(output1, action2);
        ctx.setAttribute("output2", output2);
        next(ctx); // invoke next step
    }
}

public class Step3 extends AbstractStep {

    public Step3(Step next) {
        super(next);
    }

    public void handle(StepContext ctx) {
        String action2 = ctx.getAttribute("action2");
        List<String> output2 = ctx.getAttribute("output2");
        persist(output2);
        next(ctx); // invoke next step
    }
}
Client code
Step step3 = new Step3(null);
Step step2 = new Step2(step3);
Step step1 = new Step1(step2);
StepContext ctx = new StepContext();
ctx.setAttribute("action1", action1);
ctx.setAttribute("action2", action2);
step1.handle(ctx);
All of this can also be simplified into a chain of handlers decoupled from each other by removing the next references, in case your processing pipeline always has to invoke all the available steps without one step controlling whether the next one runs:
API
public class StepContext {

    private Map<String, Object> attributes = new HashMap<>();

    public <T> T getAttribute(String name) {
        return (T) attributes.get(name);
    }

    public void setAttribute(String name, Object value) {
        attributes.put(name, value);
    }
}
public interface Step {
    void handle(StepContext ctx);
}
Implementation
public class Step1 implements Step {
    public void handle(StepContext ctx) {
        String action1 = ctx.getAttribute("action1");
        List<String> output1 = doSomething(action1);
        ctx.setAttribute("output1", output1);
    }
}

public class Step2 implements Step {
    public void handle(StepContext ctx) {
        String action2 = ctx.getAttribute("action2");
        List<String> output1 = ctx.getAttribute("output1");
        List<String> output2 = doSomething(output1, action2);
        ctx.setAttribute("output2", output2);
    }
}

public class Step3 implements Step {
    public void handle(StepContext ctx) {
        String action2 = ctx.getAttribute("action2");
        List<String> output2 = ctx.getAttribute("output2");
        persist(output2);
    }
}
Client code
Note that in the case of the Spring framework (I just noticed the @Autowired annotation), the client code may be simplified even more, as the @Autowired annotation can be used to inject all the beans of the corresponding type into a collection.
Here is what the documentation states:
Autowiring Arrays, Collections, and Maps
In case of an array, Collection, or Map dependency type, the container autowires all beans matching the declared value type. For such purposes, the map keys must be declared as type String which will be resolved to the corresponding bean names. Such a container-provided collection will be ordered, taking into account Ordered and #Order values of the target components, otherwise following their registration order in the container. Alternatively, a single matching target bean may also be a generally typed Collection or Map itself, getting injected as such.
public class StepsInvoker {

    // Spring will put all the steps into this collection in the order they were declared
    // within the spring context (or by means of the `@Order` annotation)
    @Autowired
    private List<Step> steps;

    public void invoke(String action1, String action2) {
        StepContext ctx = new StepContext();
        ctx.setAttribute("action1", action1);
        ctx.setAttribute("action2", action2);
        steps.forEach(step -> step.handle(ctx));
    }
}
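For illustration, the step beans themselves might then be declared like this (a sketch assuming component scanning; the @Order values are what fix the pipeline order):
@Component
@Order(1)
public class Step1 implements Step {
    public void handle(StepContext ctx) { /* as above */ }
}

@Component
@Order(2)
public class Step2 implements Step {
    public void handle(StepContext ctx) { /* as above */ }
}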

Java Generic type is failing without an intermediate variable

I'm getting an unexpected error while compiling this example code (in the fails() method). IntelliJ used to not report the error in the IDE, but it has since started to report it (some of the classes were in a library, which seemed to confuse it).
public class Main {

    // The command
    public interface ToMessageOperation<MODEL, MESSAGE> {
        void run(MODEL object, MESSAGE message) throws Exception;
    }

    // A command
    public class SelfLink<MODEL> implements ToMessageOperation<MODEL, LinkedMessage> {
        @Override
        public void run(MODEL object, LinkedMessage linkedMessage) throws Exception {
        }
    }

    // A message type
    public interface LinkedMessage {
        void linkme();
    }

    // A message
    public interface BootInfo extends LinkedMessage {
    }

    // The executor
    public interface GetRequest<MODEL, MESSAGE> {
        GetRequest<MODEL, MESSAGE> runAll(ToMessageOperation<? super MODEL, ? super MESSAGE>... operations);
        GetRequest<MODEL, MESSAGE> cleanUp();
        MESSAGE now();
    }

    // The command factory
    public SelfLink selfLink() {
        return null;
    }

    public <MODEL, MESSAGE> GetRequest<MODEL, MESSAGE> get(Class<MESSAGE> message) {
        return null;
    }

    public BootInfo works() {
        return get(BootInfo.class).cleanUp().now();
    }

    public BootInfo alsoWorks() {
        return get(BootInfo.class).runAll(new ToMessageOperation<Object, BootInfo>() {
            @Override
            public void run(Object object, BootInfo bootInfo) throws Exception {
            }
        }).now();
    }

    public BootInfo surprisedItWorks() {
        return get(BootInfo.class).runAll(new ToMessageOperation<Object, LinkedMessage>() {
            @Override
            public void run(Object object, LinkedMessage message) throws Exception {
            }
        }).now();
    }

    public BootInfo fails() {
        return get(BootInfo.class).runAll(new SelfLink()).now();
    }
}
I'm a bit surprised that the error only happens when I add the runAll() call, as all the methods return the same objects.
I'm really surprised that the method works with different types that are assignable from the failing case (where the ToMessageOperation type is inherited instead of being explicit). And even that shouldn't change the return type, right?
Am I doing something wrong?
I quickly fixed it by putting explicit generic parameters on SelfLink
public class SelfLink<MODEL, MESSAGE extends LinkedMessage> implements ToMessageOperation<MODEL, MESSAGE> {
    @Override
    public void run(MODEL object, MESSAGE linkedMessage) throws Exception {
        linkedMessage.linkme();
    }
}

public <MODEL, MESSAGE extends LinkedMessage> SelfLink<MODEL, MESSAGE> selfLink() {
    return null;
}
Using the factory method for SelfLink, the code looks like this:
public BootInfo fails() {
    return get(BootInfo.class).runAll(selfLink()).now();
}
I don't understand why this is necessary in Java 8, but the usage is pretty much equivalent. If I get a better explanation, I'll accept it as an answer.
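One likely explanation: when an argument requires an unchecked conversion, as the raw new SelfLink() does here, the language spec erases the invocation's return type, so runAll() returns a raw GetRequest and now() returns Object, which cannot be returned as BootInfo. Parameterizing the argument avoids the unchecked conversion; a minimal sketch:
public BootInfo noLongerFails() {
    // SelfLink<Object> is a ToMessageOperation<Object, LinkedMessage>,
    // which satisfies runAll's bounds just like surprisedItWorks() above.
    return get(BootInfo.class).runAll(new SelfLink<Object>()).now();
}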

Access generic type parameter at runtime?

Event dispatcher interface
public interface EventDispatcher {
    <T> EventListener<T> addEventListener(EventListener<T> l);
    <T> void removeEventListener(EventListener<T> l);
}
Implementation
public class DefaultEventDispatcher implements EventDispatcher {

    @SuppressWarnings("unchecked")
    private Map<Class, Set<EventListener>> listeners = new HashMap<Class, Set<EventListener>>();

    public void addSupportedEvent(Class eventType) {
        listeners.put(eventType, new HashSet<EventListener>());
    }

    @Override
    public <T> EventListener<T> addEventListener(EventListener<T> l) {
        Set<EventListener> lsts = listeners.get(T); // ****** error: cannot resolve T
        if (lsts == null) throw new RuntimeException("Unsupported event type");
        if (!lsts.add(l)) throw new RuntimeException("Listener already added");
        return l;
    }

    @Override
    public <T> void removeEventListener(EventListener<T> l) {
        Set<EventListener> lsts = listeners.get(T); // ************* same error
        if (lsts == null) throw new RuntimeException("Unsupported event type");
        if (!lsts.remove(l)) throw new RuntimeException("Listener is not here");
    }
}
Usage
EventListener<ShapeAddEvent> l = addEventListener(new EventListener<ShapeAddEvent>() {
    @Override
    public void onEvent(ShapeAddEvent event) {
        // TODO Auto-generated method stub
    }
});
removeEventListener(l);
I've marked two errors with a comment above (in the implementation). Is there any way to get runtime access to this information?
No, you can't refer to T at runtime.
http://java.sun.com/docs/books/tutorial/java/generics/erasure.html
update
But something like this would achieve a similar effect:
abstract class EventListener<T> {

    private Class<T> type;

    EventListener(Class<T> type) {
        this.type = type;
    }

    Class<T> getType() {
        return type;
    }

    abstract void onEvent(T t);
}
And to create a listener:
EventListener<String> e = new EventListener<String>(String.class) {
    public void onEvent(String event) {
    }
};
e.getType();
You can't do it with the approach you are trying, due to erasure.
However, with a little change in the design I believe you can achieve what you need. Consider adding the following method to EventListener interface:
public Class<T> getEventClass();
Every EventListener implementation has to state the class of events it works with (I assume that T stands for an event type). Now you can invoke this method in your addEventListener method, and determine the type at runtime.
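A minimal sketch of how addEventListener could then use it (mirroring the question's map and error handling):
@Override
public <T> EventListener<T> addEventListener(EventListener<T> l) {
    // The listener reports its own event class, so no runtime access to T is needed.
    Set<EventListener> lsts = listeners.get(l.getEventClass());
    if (lsts == null) throw new RuntimeException("Unsupported event type");
    if (!lsts.add(l)) throw new RuntimeException("Listener already added");
    return l;
}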
