The write method of ItemWriter was changed in Spring Batch 5.
@Override
public void write(List<? extends List<DataDTO>> items) throws Exception {
    for (List<DataDTO> sublist : items) {
        writer.write(sublist);
    }
}

The above writer is a FlatFileItemWriter.
I changed it to the following:
@Override
public void write(Chunk<? extends List<DataDTO>> items) throws Exception {
    for (List<DataDTO> sublist : items) {
        writer.write((Chunk<? extends DataDTO>) sublist);
    }
}
Is this the correct way to replace/fix it? I need some help; I'm expecting the correct replacement.
According to the Spring Batch 5.0 migration guide, most references to Lists in the API were replaced with Chunk.
The signature of the method ItemWriter#write(List) was changed to ItemWriter#write(Chunk)
All implementations of ItemWriter were updated to use the Chunk API instead of List
All methods in the ItemWriteListener interface were updated to use the Chunk API instead of List
All implementations of ItemWriteListener were updated to use the Chunk API instead of List
The constructor of ChunkRequest was changed to accept a Chunk instead of a Collection of items
The return type of ChunkRequest#getItems() was changed from List to Chunk
A good way to do the migration in your code is to use one of Chunk's constructors.
Example:
@Override
public void write(Chunk<? extends List<DataDTO>> items) throws Exception {
    for (List<DataDTO> sublist : items) {
        writer.write(new Chunk<>(sublist));
    }
}
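Note that Chunk implements Iterable, so the enhanced for loop over items keeps working; the fix is to wrap each sublist in a new Chunk before handing it to the delegate writer, because a plain List cannot simply be cast to Chunk.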
Related
I'm writing a Spring Batch job, and in one of my steps I have the following code for the processor:
@Component
public class SubscriberProcessor implements ItemProcessor<NewsletterSubscriber, Account>, InitializingBean {

    @Autowired
    private AccountService service;

    @Override
    public Account process(NewsletterSubscriber item) throws Exception {
        if (!Strings.isNullOrEmpty(item.getId())) {
            return service.getAccount(item.getId());
        }
        // search with email address
        List<Account> accounts = service.findByEmail(item.getEmail());
        checkState(accounts.size() <= 1, "Found more than one account with email %s", item.getEmail());
        return accounts.isEmpty() ? null : accounts.get(0);
    }

    @Override
    public void afterPropertiesSet() throws Exception {
        Assert.notNull(service, "account service must be set");
    }
}
The above code works but I've found out that there are some edge cases where having more than one Account per NewsletterSubscriber is allowed. So I need to remove the state check and to pass more than one Account to the item writer.
One solution I found is to change both the ItemProcessor and the ItemWriter to deal with the List<Account> type instead of Account, but this has two drawbacks:
Code and tests are uglier and harder to write and maintain because of the nested lists in the writer.
More importantly, more than one Account object may be written in the same transaction, because a list given to the writer may contain multiple accounts, and I'd like to avoid this.
Is there any way, maybe using a listener, or replacing some internal component used by Spring Batch, to avoid lists in the processor?
Update
I've opened an issue on the Spring Jira for this problem.
I'm looking into isComplete and getAdjustedOutputs methods in FaultTolerantChunkProcessor which are marked as extension points in SimpleChunkProcessor to see if I can use them in some way to achieve my goal.
Any hint is welcome.
The item processor takes one thing in and returns a list:
public class MyItemProcessor implements ItemProcessor<SingleThing, List<ExtractedThingFromSingleThing>> {

    @Override
    public List<ExtractedThingFromSingleThing> process(SingleThing thing) {
        // parse and convert to list
    }
}
Wrap the downstream writer to iron things out. This way stuff downstream from this writer doesn't have to work with lists.
@StepScope
public class ItemListWriter<T> implements ItemWriter<List<T>> {

    private ItemWriter<T> wrapped;

    public ItemListWriter(ItemWriter<T> wrapped) {
        this.wrapped = wrapped;
    }

    @Override
    public void write(List<? extends List<T>> items) throws Exception {
        for (List<T> subList : items) {
            wrapped.write(subList);
        }
    }
}
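For illustration, here is a sketch of how the wrapper might be wired up (the bean name and the Account delegate are hypothetical, and it uses the same pre-Spring-Batch-5 List-based API as this answer):

import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class WriterConfiguration {

    // Wraps a hypothetical single-item writer (e.g. a flat-file or JPA
    // writer for Account) so the step can consume List<Account> items
    // while each Account is still written by the original writer.
    @Bean
    @StepScope
    public ItemListWriter<Account> accountListWriter(ItemWriter<Account> accountWriter) {
        return new ItemListWriter<>(accountWriter);
    }
}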
There isn't a way to return more than one item per call to an ItemProcessor in Spring Batch without getting pretty far into the weeds. If you really want to know where the relationship between an ItemProcessor and ItemWriter exists (not recommended), take a look at the implementations of the ChunkProcessor interface. While the simple case (SimpleChunkProcessor) isn't that bad, if you use any of the fault-tolerant logic (skip/retry via FaultTolerantChunkProcessor), it gets very unwieldy very quickly.
A much simpler option would be to move this logic to an ItemReader that does this enrichment before returning the item. Wrap whatever ItemReader you're using in a custom ItemReader implementation that does the service lookup before returning the item. In this case, instead of returning a NewsletterSubscriber from the reader, you'd be returning an Account based on the previous information.
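A hedged sketch of such a wrapping reader, reusing the AccountService and types from the question (the buffering for the multiple-accounts case is my addition, not part of the original answer):

import java.util.ArrayDeque;
import java.util.Deque;

import org.springframework.batch.item.ItemReader;

public class AccountEnrichingItemReader implements ItemReader<Account> {

    private final ItemReader<NewsletterSubscriber> delegate;
    private final AccountService service;
    // holds the extra accounts when one subscriber maps to several of them
    private final Deque<Account> buffer = new ArrayDeque<>();

    public AccountEnrichingItemReader(ItemReader<NewsletterSubscriber> delegate, AccountService service) {
        this.delegate = delegate;
        this.service = service;
    }

    @Override
    public Account read() throws Exception {
        while (buffer.isEmpty()) {
            NewsletterSubscriber subscriber = delegate.read();
            if (subscriber == null) {
                return null; // end of input
            }
            buffer.addAll(service.findByEmail(subscriber.getEmail()));
        }
        // each Account is returned as its own item, so the writer never
        // sees lists and the one-item-per-write contract is preserved
        return buffer.poll();
    }
}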
Instead of returning an Account, you create and return an AccountWrapper or a Collection. The writer obviously must take this into account :)
You can make a transformer to convert your POJO (the POJO object from the file) into your entity, using the following code:
import java.lang.reflect.Constructor;

public class Intializer {

    public static LGInfo initializeEntity() throws Exception {
        // instantiate LGInfo reflectively via its default constructor
        Constructor<LGInfo> constr1 = LGInfo.class.getConstructor();
        LGInfo info = constr1.newInstance();
        return info;
    }
}
And in your item processor:
public class LgItemProcessor implements ItemProcessor<LgBulkLine, LGInfo> {

    private static final Log log = LogFactory.getLog(LgItemProcessor.class);

    @Override
    public LGInfo process(LgBulkLine item) throws Exception {
        log.info(item);
        return Intializer.initializeEntity();
    }
}
The Jackson docs say that a class that implements their JsonSerializable interface will be serialized by calling the interface's serialize() method.
I tried this in a project that uses Jackson 2.8.4 under Jersey 2.25.
It continues to use Jackson's BeanSerializer to do default serialization based on public getters, instead of using the SerializableSerializer and the serialize() method.
The code looks like this fairly minimal example...
public class Minimal extends JsonSerializable.Base {

    private String title;
    private ZonedDateTime theTime;

    public String getTitle() { return title; }
    void setTitle(String title) { this.title = title; }

    public ZonedDateTime getTheTime() { return theTime; }
    void setTheTime(ZonedDateTime theTime) { this.theTime = theTime; }

    @Override
    public void serialize(JsonGenerator gen, SerializerProvider serializers) throws IOException {
        gen.writeStartObject();
        gen.writeFieldName("title");
        gen.writeString(this.title);
        // Other serialization...
        gen.writeEndObject();
    }

    @Override
    public void serializeWithType(JsonGenerator gen, SerializerProvider serializers, TypeSerializer typeSer) throws IOException {
        throw new UnsupportedOperationException("Not supported.");
    }
}
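To observe which serializer actually runs, a minimal harness like the following can be used (the harness is illustrative, not from the original post; it assumes same-package access, since the setters are package-private):

import com.fasterxml.jackson.databind.ObjectMapper;

public class MinimalCheck {

    public static void main(String[] args) throws Exception {
        Minimal minimal = new Minimal();
        minimal.setTitle("hello");
        // If JsonSerializable is honored, serialize() runs and this prints
        // {"title":"hello"}; the unwanted BeanSerializer output would also
        // include the theTime property.
        System.out.println(new ObjectMapper().writeValueAsString(minimal));
    }
}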
I also tried other ideas to get it using the right serializer...maybe foolish ideas like...
public class Minimal implements JsonSerializable {
and
@JsonSerialize(using = SerializableSerializer.class)
public class Minimal extends JsonSerializable.Base {
and
@JsonSerialize(as = JsonSerializable.Base.class)
public class Minimal extends JsonSerializable.Base {
What am I missing to get this Jackson feature working?
I verified in Jackson's latest source code that the feature still exists. I suppose I can trace it through the 2.8.4 source in the debugger, but maybe someone knows the answer.
Apparently the answer to
"What am I missing?"
is Nothing.
After writing the question, I rebuilt everything again, restarted Tomcat, and redeployed and tested again and got my expected output.
So I will chalk this up to a bad build, a bad deploy, a confused classloader, something like that. I am leaving the question up, since it provides an example that might help someone.
I have a problem with Orika 1.4.5 and PermGen space.
I'm using a ConfigurableMapper this way:
public class SoapSearchPrdvFreeSlotsMapper extends ConfigurableMapper {

    @Override
    public void configure(MapperFactory mapperFactory) {
        mapperFactory.registerClassMap(mapperFactory.classMap(PrdvFreeSlot.class, PrdvWsListerDispoTelV2Filter.class)
                .field("typeRdv", "wsldtTypeRdv")
                .field("motifId", "wsldtMotifId")
                .byDefault().toClassMap());

        mapperFactory.registerClassMap(mapperFactory.classMap(PrdvFreeSlot.class, PrdvWsListerDispoTelV2.class)
                .field("typeRdv", "wsldtTypeRdv")
                .field("motifId", "wsldtMotifId")
                .field("quantum", "wsldtActiviteIdActivQuantum")
                .field("activiteJours", "wsldtActiviteIdActivJours")
                .field("activiteHeureFerme", "wsldtActiviteIdActivHeureFerme")
                .field("activiteHeureOuvert", "wsldtActiviteIdActivHeureOuvert")
                .field("startDate", "disDate")
                .field("disCapacity", "disCapacite")
                .field("disReserve", "disReserve")
                .field("reserveCC", "wsldtReserveCC")
                .byDefault().toClassMap());
    }

    @Override
    public void configureFactoryBuilder(DefaultMapperFactory.Builder builder) {
        builder.build().getConverterFactory().registerConverter(new DateXmlDateConverter());
    }
}
But each time I call this mapper, autogenerated class mappers are stored in PermGen.
I tried to use the existsRegisteredMapper method of the MapperFactory to prevent class-mapper auto-generation, but it doesn't work:
public static <T, U> boolean existsRegisteredMapperInFactory(MapperFactory mapperFactory, Class<T> classSrc, Class<U> classDest) {
    return mapperFactory.existsRegisteredMapper(TypeFactory.valueOf(classSrc), TypeFactory.valueOf(classDest), true);
}
and the modified first code block:
if (!existsRegisteredMapperInFactory(mapperFactory, PrdvWsListerDispoTelV2Filter.class, PrdvFreeSlot.class)) {
    mapperFactory.registerClassMap(mapperFactory.classMap(PrdvFreeSlot.class, PrdvWsListerDispoTelV2Filter.class)
            .field("typeRdv", "wsldtTypeRdv")
            .field("motifId", "wsldtMotifId")
            .byDefault().toClassMap());
}
Is there a way to prevent class-mapper autogeneration without rewriting all the mappers I have?
Thanks for your help.
Please make sure that the mapper is a singleton; you don't need to instantiate it every time.
You also don't need to verify whether the mapper has been registered or not: it will be generated only once (per MapperFactory instance).
So just make sure that SoapSearchPrdvFreeSlotsMapper is a singleton (only one instance; ConfigurableMapper is thread-safe).
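For illustration, a minimal sketch of one way to enforce that in plain Java (the holder class is mine; in a Spring application you would declare the mapper as a singleton-scoped bean instead):

public final class Mappers {

    // created once per JVM; reuse this instance instead of instantiating
    // the mapper on every call, so class maps are only generated once
    public static final SoapSearchPrdvFreeSlotsMapper SOAP_SEARCH_MAPPER = new SoapSearchPrdvFreeSlotsMapper();

    private Mappers() {
    }
}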
Suppose I have a validation annotation on my interface method to validate input arguments and the return value.
Is it possible at the moment (v1.9.5) to tell Mockito to invoke this validator during the invocation process?
The background would be to prevent developers from writing unrealistic tests by mocking the given interface in a way that violates the specified validator.
So what I would want is to register something like
class MyAnswerInterceptor<T> implements AnswerInterceptor<T> {

    @Override
    public Answer<T> intercept(final Answer<T> answer) {
        return new Answer<T>() {
            @Override
            public T answer(InvocationOnMock invocation) throws Throwable {
                validateArguments(invocation);
                T result = answer.answer(invocation);
                validateReturnValue(result);
                return result;
            }
        };
    }
}
to be called on every answer of a given mock.
Is this possible at all? I've looked into the code to check whether I could hook in at some point (even using reflection or the like), but due to the entanglement of instance creation and logic it seems hardly possible to achieve what I want (i.e. code like MockHandler mockHandler = new MockHandlerFactory().create(settings); makes it impossible to hook in and put custom behavior on top without patching and redeploying the whole thing...)
Any insight would be highly appreciated :-)
You could achieve that by creating a custom MockMaker.
MockMaker is an extension point that makes it possible to use custom dynamic proxies and avoid the default cglib/asm/objenesis implementation.
Our custom implementation delegates all the complex stuff to the default MockMaker, CglibMockMaker. It "decorates" only the createMock method, by registering an InvocationListener on the settings parameter. This listener is notified when an invocation has been made, allowing us to call validateArguments and validateReturnValue.
import org.mockito.internal.creation.CglibMockMaker;
import org.mockito.invocation.Invocation;
import org.mockito.invocation.MockHandler;
import org.mockito.listeners.InvocationListener;
import org.mockito.listeners.MethodInvocationReport;
import org.mockito.mock.MockCreationSettings;
import org.mockito.plugins.MockMaker;

public class ValidationMockMaker implements MockMaker {

    private final MockMaker delegate = new CglibMockMaker();

    public ValidationMockMaker() {
    }

    @Override
    public <T> T createMock(MockCreationSettings<T> settings, MockHandler handler) {
        settings.getInvocationListeners().add(new InvocationListener() {
            @Override
            public void reportInvocation(MethodInvocationReport methodInvocationReport) {
                Invocation invocation = (Invocation) methodInvocationReport.getInvocation();
                validateArguments(invocation.getArguments());
                validateReturnValue(methodInvocationReport.getReturnedValue());
            }
        });
        return delegate.createMock(settings, handler);
    }

    @Override
    public MockHandler getHandler(Object mock) {
        return delegate.getHandler(mock);
    }

    @Override
    public void resetMock(Object mock, MockHandler newHandler, MockCreationSettings settings) {
        delegate.resetMock(mock, newHandler, settings);
    }

    protected void validateArguments(Object... arguments) {
        // Arrays.stream(arguments).forEach(Objects::requireNonNull);
    }

    private void validateReturnValue(Object result) {
        // Objects.requireNonNull(result);
    }
}
Last but not least, we need to tell Mockito to use our implementation. This is done by adding a file
mockito-extensions/org.mockito.plugins.MockMaker
containing our MockMaker class name (the fully qualified name, if the class lives in a package):
ValidationMockMaker
See the "Using the extension point" section in the Javadoc.
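Once the plugin file is on the test classpath, mock creation picks the custom maker up automatically. A minimal sketch (MyService is a hypothetical interface, not from the original question):

import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

public class ValidationMockMakerDemo {

    interface MyService {
        String lookup(String key);
    }

    public static void main(String[] args) {
        // created through ValidationMockMaker because of the plugin file
        MyService service = mock(MyService.class);
        when(service.lookup("k")).thenReturn("v");
        // this invocation is reported to the registered InvocationListener,
        // which calls validateArguments and validateReturnValue
        System.out.println(service.lookup("k"));
    }
}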
When I make a call to fetch a list of TripImportSummaryProxy objects, I get back a list of:
com.schedgy.core.dao.filter.proxy.FilterProxyAutoBean_com_google_web_bindery_requestfactory_shared_impl_EntityProxyCategory_com_google_web_bindery_requestfactory_shared_impl_ValueProxyCategory_com_google_web_bindery_requestfactory_shared_impl_BaseProxyCategory.
@ProxyFor(value = TripImportSummary.class, locator = TripImportSummaryLocator.class)
public interface TripImportSummaryProxy extends MyBaseProxy {
    // some setters/getters defined here
}

public interface TripImportSummaryRequestFactory extends RequestFactory, HasPaginationRequest<TripImportSummaryProxy> {
    TripImportSummaryRequest request();
}

@Service(value = TripImportSummaryService.class, locator = MyServiceLocator.class)
public interface TripImportSummaryRequest extends RequestContext, PaginationRequest<TripImportSummaryProxy> {
}

@SkipInterfaceValidation
public interface HasPaginationRequest<T> extends RequestFactory {
    PaginationRequest<T> request();
}

@ExtraTypes(FilterProxy.class)
@SkipInterfaceValidation
public interface PaginationRequest<T> extends RequestContext {

    Request<List<T>> paginate(int offset, int limit, String sortColumn,
            boolean isSortAscending, List<FilterProxy> filters);

    Request<Integer> count(List<FilterProxy> list);
}
This is all executed via:
PaginationRequest<TripImportSummaryProxy> request = requestFactory.request();
request.paginate(offset, limit, sortColumn, isSortAscending, getFilters(request))
        .with(getPaths())
        .fire(new MyReceiver<List<TripImportSummaryProxy>>() {
            @Override
            public void onSuccess(List<TripImportSummaryProxy> response) {
                // Response is a list of a type that seems to extend FilterProxy
            }
        });
FilterProxy is just a marker interface that various filter interfaces extend.
@ProxyFor(Object.class)
public interface FilterProxy extends ValueProxy {
}
I have about two dozen other requests working, and it's only failing on this one. I have verified that the server-side service is correctly fetching and returning the right data. I have found that the TripImportSummaryLocator class is never instantiated, even though it appears to be bound to the proxy type correctly and has a default constructor.
I was using GWT 2.4 RC1, and after upgrading to GWT 2.4 stable I am no longer seeing this problem.