I currently have an IntegrationFlow implementation that utilizes a Service class to implement all desired functionality to be performed by the flow. Something like this...
@Service
public class FlowService {
public Message<String> removeLineFeeds(Message<String> message) {
return MessageBuilder
.withPayload(StringUtils.remove(message.getPayload(), StringUtils.LF))
.copyHeadersIfAbsent(message.getHeaders())
.build();
}
}
@Configuration
@EnableIntegration
public class FlowConfiguration {
@Autowired
private FlowService flowService;
@Bean
public IntegrationFlow flow() {
return IntegrationFlows
.from("inputChannel")
.transform(flowService, "removeLineFeeds")
.get();
}
}
The above implementation works exactly as desired but I was hoping to improve/modify the implementation to utilize the power of Java 8/Lambdas so that it looked something like this...
@Bean
public IntegrationFlow flow() {
return IntegrationFlows
.from("inputChannel")
.transform(flowService::removeLineFeeds)
.get();
}
Unfortunately, when implemented this way, the flow throws a ClassCastException whenever it processes a message. I have tried several of the solutions proposed online, but none of them does the trick, and I run into a similar issue regardless of the IntegrationFlow method used (transform, filter, etc.).
What needs to be changed with the current implementation to allow the use of flowService::removeLineFeeds within the IntegrationFlow methods?
EDIT: PER ARTEM'S RESPONSE
It appears a simple conversion step in the IntegrationFlow did the trick. My original implementation was passing the message as a Message<byte[]> instead of the Message<String> I was expecting. See Artem's full response below for more details.
@Bean
public IntegrationFlow flow() {
return IntegrationFlows
.from("inputChannel")
.convert(String.class)
.transform(flowService::removeLineFeeds)
.get();
}
The point is that a lambda must correspond to some functional interface.
In the case of transform() it is a GenericTransformer<S, T>. Indeed, your Message<String> removeLineFeeds(Message<String> message) satisfies that contract, and it would work well if you dealt only with the payload:
public String removeLineFeeds(String payload) {
return StringUtils.remove(payload, StringUtils.LF);
}
Because all the generic information of the target implementation is erased at runtime, the framework cannot guess that you would like to deal with the whole Message<?>, so it propagates only the payload to your lambda. That is why your String cannot be cast to Message, hence the ClassCastException.
To fix the problem and work around Java's type erasure, we provide an overloaded method with an explicit expected type:
/**
* Populate the {@link MessageTransformingHandler} instance for the provided
* {@link GenericTransformer} for the specific {@code payloadType} to convert at
* runtime.
* @param payloadType the {@link Class} for expected payload type. It can also be
* {@code Message.class} if you wish to access the entire message in the transformer.
* Conversion to this type will be attempted, if necessary.
* @param genericTransformer the {@link GenericTransformer} to populate.
* @param <P> the payload type - 'transform from' or {@code Message.class}.
* @param <T> the target type - 'transform to'.
* @return the current {@link BaseIntegrationFlowDefinition}.
* @see MethodInvokingTransformer
* @see LambdaMessageProcessor
*/
public <P, T> B transform(Class<P> payloadType, GenericTransformer<P, T> genericTransformer) {
So, your configuration should look like this:
.transform(Message.class, flowService::removeLineFeeds)
This way we tell the framework that we would like the whole message to be passed to our function.
Anyway, I'd prefer the first variant with just the payload: the framework then takes care of copying the request headers into the reply message for you.
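For completeness, a minimal sketch of that payload-only variant (using the String-returning removeLineFeeds above together with the inputChannel and convert(String.class) steps already shown in the question's edit, since the incoming payload may arrive as byte[]):

@Bean
public IntegrationFlow flow() {
    return IntegrationFlows
            .from("inputChannel")
            .convert(String.class)                   // ensure the payload really is a String
            .transform(flowService::removeLineFeeds) // String removeLineFeeds(String payload)
            .get();
}

Here the framework copies the request headers into the reply message for you, so the transformer only has to care about the payload.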
See more info in Docs: https://docs.spring.io/spring-integration/reference/html/dsl.html#java-dsl-class-cast
Problem
I have defined a CustomHttpMessageReader (which implements HttpMessageReader<CustomClass>) that reads a multipart response from a server and converts the received parts into an object of a specific class. Internally, the CustomHttpMessageReader uses the DefaultPartHttpMessageReader to actually read/parse the multipart responses.
The CustomHttpMessageReader accumulates the parts read by the DefaultReader and converts them into the desired class CustomClass.
I've created a CustomHttpMessageConverter that does the same thing for a RestTemplate, but I struggle to do the same for a WebClient.
I always get the following Exception:
block()/blockFirst()/blockLast() are blocking, which is not supported in thread reactor-http-nio-2
java.lang.IllegalStateException: block()/blockFirst()/blockLast() are blocking, which is not supported in thread reactor-http-nio-2
at reactor.core.publisher.BlockingSingleSubscriber.blockingGet(BlockingSingleSubscriber.java:83)
at reactor.core.publisher.Flux.blockFirst(Flux.java:2600)
at com.company.project.deserializer.multipart.CustomHttpMessageReader.readMultipartData(CustomHttpMessageReader.java:116)
at com.company.project.deserializer.multipart.CustomHttpMessageReader.readMono(CustomHttpMessageReader.java:101)
at org.springframework.web.reactive.function.BodyExtractors.lambda$readToMono$14(BodyExtractors.java:211)
at java.base/java.util.Optional.orElseGet(Optional.java:369)
...
Mind you, I'm not interested in running WebClient asynchronously. I'm only future-proofing my application, because RestTemplate is apparently in maintenance mode and the folks at Pivotal/Spring suggest using WebClient instead.
What I Tried
As I understand it, there are threads that must not be blocked, namely the reactor-http-nio one named in the exception. I tried removing Netty from my dependencies so that I could rely solely on Tomcat. That doesn't help either: I then get another exception telling me that no suitable HttpConnector exists (thrown by the WebClient.Builder):
No suitable default ClientHttpConnector found
java.lang.IllegalStateException: No suitable default ClientHttpConnector found
at org.springframework.web.reactive.function.client.DefaultWebClientBuilder.initConnector(DefaultWebClientBuilder.java:297)
at org.springframework.web.reactive.function.client.DefaultWebClientBuilder.build(DefaultWebClientBuilder.java:266)
at com.company.project.RestClientUsingWebClient.getWebclient(RestClientUsingWebClient.java:160)
I've tried running my code both in a unit test and with a full Spring context started; the result is unfortunately the same.
Setup
To provide a bit more detail, the following snippets come from the classes mentioned earlier. The classes are not shown in full, just enough to understand what is going on. All necessary methods (e.g. canRead() in the reader) are implemented.
CustomHttpMessageReader
I also included the usage of CustomPart (in addition to CustomClass) in the class, just to show that the content of each Part is also read, i.e. blocked on.
public class CustomHttpMessageReader implements HttpMessageReader<CustomClass> {
private final DefaultPartHttpMessageReader defaultPartHttpMessageReader = new DefaultPartHttpMessageReader();
@Override
public Flux<CustomClass> read(final ResolvableType elementType, final ReactiveHttpInputMessage message,
final Map<String, Object> hints) {
return Flux.merge(readMono(elementType, message, hints));
}
@Override
public Mono<CustomClass> readMono(final ResolvableType elementType, final ReactiveHttpInputMessage message,
final Map<String, Object> hints) {
final List<CustomPart> customParts = readMultipartData(message);
return convertToCustomClass(customParts);
}
private List<CustomPart> readMultipartData(final ReactiveHttpInputMessage message) {
final ResolvableType resolvableType = ResolvableType.forClass(byte[].class);
return Optional.ofNullable(
defaultPartHttpMessageReader.read(resolvableType, message, Map.of())
.buffer()
.blockFirst()) // <- EXCEPTION IS THROWN HERE!
.orElse(new ArrayList<>())
.stream()
.map(part -> {
final byte[] content = Optional.ofNullable(part.content().blockFirst()) //<- HERE IS ANOTHER BLOCK
.map(DataBuffer::asByteBuffer)
.map(ByteBuffer::array)
.orElse(new byte[]{});
// Here we cherry pick some header fields
return new CustomPart(content, someHeaderFields);
}).collect(Collectors.toList());
}
}
Usage of WebClient
class RestClientUsingWebClient {
/**
* The "Main" Method for our purposes
*/
public Optional<CustomClass> getResource(final String baseUrl, final String id) {
final WebClient webclient = getWebclient(baseUrl);
//curl -X GET "http://BASE_URL/id" -H "accept: multipart/form-data"
return webclient.get()
.uri(uriBuilder -> uriBuilder.path(id).build()).retrieve()
.toEntity(CustomClass.class)
.onErrorResume(NotFound.class, e -> Mono.empty())
.blockOptional() // <- HERE IS ANOTHER BLOCK
.map(ResponseEntity::getBody);
}
//This exists also as a Bean definition
private WebClient getWebclient(final String baseUrl) {
final ExchangeStrategies exchangeStrategies = ExchangeStrategies.builder()
.codecs(codecs -> {
codecs.defaultCodecs().maxInMemorySize(16 * 1024 * 1024);
codecs.customCodecs().register(new CustomHttpMessageReader()); // <- Our custom reader
})
.build();
return WebClient.builder()
.baseUrl(baseUrl)
.exchangeStrategies(exchangeStrategies)
.build();
}
}
Usage of build.gradle
For the sake of completeness, here is what I think is the relevant part of my build.gradle:
plugins {
id 'org.springframework.boot' version '2.7.2'
id 'io.spring.dependency-management' version '1.0.13.RELEASE'
...
}
dependencies {
implementation 'org.springframework.boot:spring-boot-starter-actuator'
implementation 'org.springframework.boot:spring-boot-starter-web' // <- This
implementation 'org.springframework.boot:spring-boot-starter-webflux'
// What I tried:
// implementation ('org.springframework.boot:spring-boot-starter-webflux'){
// exclude group: 'org.springframework.boot', module: 'spring-boot-starter-reactor-netty'
//}
...
}
If we look at the stack trace that you provided, we see these three lines:
at reactor.core.publisher.Flux.blockFirst(Flux.java:2600)
at com.company.project.deserializer.multipart.CustomHttpMessageReader.readMultipartData(CustomHttpMessageReader.java:116)
at com.company.project.deserializer.multipart.CustomHttpMessageReader.readMono(CustomHttpMessageReader.java:101)
They should be read from bottom to top. So what do they tell us?
The bottom line tells us that the function readMono on line 101 in the class CustomHttpMessageReader.java was called first.
That function then called the function readMultipartData on line 116 in the same class CustomHttpMessageReader.
Then the function blockFirst was called on line 2600 in the class Flux.
That's your blocking call.
So we can tell that there is a blocking call in the function readMultipartData.
So why can't we block in that function? Well, if we look at the API of the interface that function is overriding, HttpMessageReader, we can see that it returns a Mono<T>, which means it is an async function.
And if it is async and we block, we might get very bad performance.
This interface is used within the Spring WebClient which is a fully async client.
You can use it in a non-async application, which means you can block outside of the WebClient; internally, however, it needs to operate completely asynchronously if you want it to be as efficient as possible.
So the bottom line is that you should not block in any function that returns a Mono or a Flux.
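To illustrate, here is a rough, non-blocking sketch of how readMono could compose the parts reactively instead of calling blockFirst(). It assumes CustomPart and a convertToCustomClass helper returning Mono<CustomClass>, shaped roughly like the ones in the question; treat it as a sketch rather than a drop-in replacement:

@Override
public Mono<CustomClass> readMono(final ResolvableType elementType,
        final ReactiveHttpInputMessage message, final Map<String, Object> hints) {
    final ResolvableType partType = ResolvableType.forClass(byte[].class);
    return defaultPartHttpMessageReader.read(partType, message, Map.of())
            // join each part's body into a single DataBuffer without blocking
            .flatMap(part -> DataBufferUtils.join(part.content())
                    .map(dataBuffer -> {
                        final byte[] content = new byte[dataBuffer.readableByteCount()];
                        dataBuffer.read(content);
                        DataBufferUtils.release(dataBuffer);
                        // header handling omitted; CustomPart is the question's own class
                        return new CustomPart(content);
                    }))
            .collectList()
            // assumed to return Mono<CustomClass>, as implied by the question
            .flatMap(this::convertToCustomClass);
}

The blockOptional() in getResource() can stay, because that call happens outside the reactive pipeline.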
I am using a CompositeFileListFilter to combine two filters. First, I want to run an AbstractPersistentFileListFilter with its default implementation, thereby protecting against duplicates.
Second, I use my own implementation, which checks the database for the existence of a file, protecting me from duplicates in the event of a system restart.
How can this approach be implemented, so that the default implementation of AbstractPersistentFileListFilter with the in-memory MetadataStore is executed first?
The goal is to reduce database calls to check for the existence of a file. Perhaps there is a better approach to the solution. Thanks for any help!
FtpConfiguration.java
@Bean
CompositeFileListFilter<FTPFile> getCompositeFileListFilter() {
CompositeFileListFilter<FTPFile> compositeFileListFilter = new CompositeFileListFilter<>();
compositeFileListFilter.addFilters(new CustomFileFilter(messageRepo));
return compositeFileListFilter;
}
@Bean
public IntegrationFlow ftpIntegrationFlow() {
return IntegrationFlows.from(
Ftp.inboundStreamingAdapter(template())
.filter(getCompositeFileListFilter())
.remoteDirectory("."),
e -> e.poller(Pollers.fixedDelay(500).advice(advice())))
.transform(new StreamTransformer("UTF-8"))
.handle(messageService::unmarshall)
.get();
}
CustomFileFilter.java
@Component
@Log4j2
public class CustomFileFilter implements FileListFilter<FTPFile> {
private final MessageRepo messageRepo;
public CustomFileFilter(MessageRepo messageRepo) {
this.messageRepo = messageRepo;
}
@Override
public List<FTPFile> filterFiles(FTPFile[] files) {
return null;
}
@Override
public boolean accept(FTPFile file) {
String fileName = file.getName();
log.info("file filter get name: {}", fileName);
Integer fileCheck = messageRepo.checkExistsMessage(fileName);
log.info("fileCheck: {}", fileCheck);
return fileCheck != 1;
}
@Override
public boolean supportsSingleFileFiltering() {
return true;
}
}
Use the ChainFileListFilter instead:
/**
* The {@link CompositeFileListFilter} extension which chains the result
* of the previous filter to the next one. If a filter in the chain returns
* an empty list, the remaining filters are not invoked.
*
* @param <F> The type that will be filtered.
*
* @author Artem Bilan
* @author Gary Russell
* @author Cengis Kocaurlu
*
* @since 4.3.7
*
*/
public class ChainFileListFilter<F> extends CompositeFileListFilter<F> {
https://docs.spring.io/spring-integration/docs/current/reference/html/file.html#file-reading
Starting with version 4.3.7, a ChainFileListFilter (an extension of CompositeFileListFilter) has been introduced to allow scenarios when subsequent filters should only see the result of the previous filter. (With the CompositeFileListFilter, all filters see all the files, but it passes only files that have passed all filters). ...
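A minimal sketch of what that could look like for the FTP flow in the question; the FtpPersistentAcceptOnceFileListFilter, the SimpleMetadataStore and the "ftpMessages" prefix are illustrative choices, not taken from the original configuration:

@Bean
public ChainFileListFilter<FTPFile> chainFileListFilter(MessageRepo messageRepo) {
    ChainFileListFilter<FTPFile> chain = new ChainFileListFilter<>();
    // the in-memory accept-once filter runs first, so files it rejects
    // never reach the database check
    chain.addFilter(new FtpPersistentAcceptOnceFileListFilter(new SimpleMetadataStore(), "ftpMessages"));
    // the database-backed duplicate check only sees files the first filter passed through
    chain.addFilter(new CustomFileFilter(messageRepo));
    return chain;
}

The chain bean would then replace getCompositeFileListFilter() in the inbound adapter's .filter(...) call.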
I use Spring AMQP with multi-method listeners, like this:
@RabbitListener(queues = PLATFORM_COMMAND_QUEUE)
@Component
public class PlatformListener {
@RabbitHandler
public Response<GetAllPlatformsResponse> getAllPlatforms(GetAllPlatforms command) {
...
return Response.ok(GetAllPlatformsResponse.create(allPlatforms));
}
@RabbitHandler
public Response<PlatformResponse> getPlatform(GetPlatformCommand command) {
...
return Response.ok(platformService.getPlatform(command));
}
}
I want to add a specific header with the handler name (getAllPlatforms, getPlatform) to all response messages. For that, I tried setAfterReceivePostProcessors and setBeforeSendReplyPostProcessors, but they do not provide any information about the handler methods.
factory.setBeforeSendReplyPostProcessors(message -> {
Method targetMethod = message.getMessageProperties().getTargetMethod();
assert targetMethod == null;
return message;
});
How can I get the method name and add it to a reply message header?
It's not currently possible; as the javadocs state, that property is only populated for method-level @RabbitListener.
/**
* The target method when using a method-level {@code @RabbitListener}.
* @return the method.
* @since 1.6
*/
public Method getTargetMethod() {
return this.targetMethod;
}
Given some changes to the architecture over the years, I think it should now be possible to populate this also for class-level listeners. Please open a new feature suggestion and I'll take a look at adding it.
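For reference, once that property is also populated for class-level listeners, the header could be added roughly like this (a sketch of the intent; with today's versions targetMethod stays null for class-level @RabbitListener):

factory.setBeforeSendReplyPostProcessors(message -> {
    Method targetMethod = message.getMessageProperties().getTargetMethod();
    if (targetMethod != null) { // currently null for class-level listeners
        message.getMessageProperties().setHeader("handlerMethod", targetMethod.getName());
    }
    return message;
});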
Is there a clean way to change a mock's method behavior based on other method's invocation?
Example of code under test, service will be mocked by Mockito in the test:
public Bar foo(String id) {
Bar b = service.retrieveById(id);
boolean flag = service.deleteById(id);
b = service.retrieveById(id); //this should throw an Exception
return b;
}
Here, we would like service.retrieveById to return an object unless service.deleteById has been called.
Chaining behaviours could work in this simple case, but it doesn't take the invocation of the other method deleteById into account (imagine a refactoring).
when(service.retrieveById(any()))
.thenReturn(new Bar())
.thenThrow(new RuntimeException());
I am wondering for example if it's possible to implement an Answer object which can detect whether deleteById has been invoked. Or if there is a totally different approach which would make the test cleaner.
In my eyes, this is a good example of over-engineering mock objects.
Don't try to make your mocks behave like "the real thing".
That is not what mocking should be used for when writing tests.
The test is not about Service itself, it's about some class that makes use of it.
If Service either returns something for a given Id, or raises an exception when there is no result, make 2 individual test cases!
we can't foresee the reason for the refactoring... maybe there will be n calls to retrieve before the delete... So this is really about tying the two methods' behavior together.
Yes, and someone could add another twelve methods that all influence the outcome of deleteById. Will you be keeping track?
Use stubbing only to make it run.
Consider writing a fake if Service is rather simple and doesn't change much. Remember mocking is just one tool. Sometimes there are alternatives.
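To illustrate the fake idea: assuming Service is an interface with just the two methods used in foo(), a hand-rolled fake keeps the retrieve/delete coupling explicit without any stubbing (a sketch; the NoSuchElementException is my choice, the question only says it should throw an exception):

import java.util.HashMap;
import java.util.Map;
import java.util.NoSuchElementException;

// Minimal fake: retrieveById works until deleteById removes the entry.
class FakeService implements Service {

    private final Map<String, Bar> store = new HashMap<>();

    FakeService(String id, Bar bar) {
        store.put(id, bar);
    }

    @Override
    public Bar retrieveById(String id) {
        Bar bar = store.get(id);
        if (bar == null) {
            throw new NoSuchElementException(id);
        }
        return bar;
    }

    @Override
    public boolean deleteById(String id) {
        return store.remove(id) != null;
    }
}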
Considering what I've just said, this might send you mixed messages but since StackOverflow was down for a while and I'm currently working heavily with Mockito myself, I spent some time with your other question:
I am wondering for example if it's possible to implement an Answer object which can detect whether deleteById has been invoked.
import org.mockito.invocation.InvocationOnMock;
import org.mockito.stubbing.Answer;
import java.util.function.Supplier;
import static java.util.Objects.requireNonNull;
/**
* An Answer that resolves differently depending on a specified condition.
*
* <p>This implementation is NOT thread safe!</p>
*
* @param <T> The result type
*/
public class ConditionalAnswer <T> implements Answer<T> {
/**
* Create a new ConditionalAnswer from the specified result suppliers.
*
* <p>On instantiation, condition is false</p>
*
* @param whenConditionIsFalse The result to supply when the underlying
* condition is false
* @param whenConditionIsTrue The result to supply when the underlying
* condition is true
* @param <T> The type of the result to supply
* @return A new ConditionalAnswer
*/
public static <T> ConditionalAnswer<T> create (
final Supplier<T> whenConditionIsFalse,
final Supplier<T> whenConditionIsTrue) {
return new ConditionalAnswer<>(
requireNonNull(whenConditionIsFalse, "whenConditionIsFalse"),
requireNonNull(whenConditionIsTrue, "whenConditionIsTrue")
);
}
/**
* Create a Supplier that on execution throws the specified Throwable.
*
* <p>If the Throwable turns out to be an unchecked exception it will be
* thrown directly, if not it will be wrapped in a RuntimeException</p>
*
* @param throwable The throwable
* @param <T> The type that the Supplier officially provides
* @return A throwing Supplier
*/
public static <T> Supplier<T> doThrow (final Throwable throwable) {
requireNonNull(throwable, "throwable");
return () -> {
if (RuntimeException.class.isAssignableFrom(throwable.getClass())) {
throw (RuntimeException) throwable;
}
throw new RuntimeException(throwable);
};
}
boolean conditionMet;
final Supplier<T> whenConditionIsFalse;
final Supplier<T> whenConditionIsTrue;
// Use static factory method instead!
ConditionalAnswer (
final Supplier<T> whenConditionIsFalse,
final Supplier<T> whenConditionIsTrue) {
this.whenConditionIsFalse = whenConditionIsFalse;
this.whenConditionIsTrue = whenConditionIsTrue;
}
/**
* Set condition to true.
*
* @throws IllegalStateException If condition has been toggled already
*/
public void toggle () throws IllegalStateException {
if (conditionMet) {
throw new IllegalStateException("Condition can only be toggled once!");
}
conditionMet = true;
}
/**
* Wrap the specified answer so that before it executes, this
* ConditionalAnswer is toggled.
*
* @param answer The answer to wrap
* @return The wrapped Answer
*/
public Answer<?> toggle (final Answer<?> answer) {
return invocation -> {
toggle();
return answer.answer(invocation);
};
}
@Override
public T answer (final InvocationOnMock invocation) throws Throwable {
return conditionMet ? whenConditionIsTrue.get() : whenConditionIsFalse.get();
}
/**
* Test whether the underlying condition is met
* #return The state of the underlying condition
*/
public boolean isConditionMet () {
return conditionMet;
}
}
I wrote some tests to make it work. This is how it would look applied to the Service example:
@Test
void conditionalTest (
@Mock final Service serviceMock, @Mock final Bar barMock) {
final var someId = "someId";
// Create shared, stateful answer
// First argument: until the condition changes, return barMock
// Second: after the condition has changed, throw Exception
final var conditional = ConditionalAnswer.create(
() -> barMock,
ConditionalAnswer.doThrow(new NoSuchElementException(someId)));
// Whenever retrieveById is invoked, the call will be delegated to
// the conditional answer
when(serviceMock.retrieveById(any())).thenAnswer(conditional);
// Now we can define what makes the condition change.
// In this example it is Service#deleteById but it could be any other
// method on any other class
// Option 1: Easy but ugly
when(serviceMock.deleteById(any())).thenAnswer(invocation -> {
conditional.toggle();
return Boolean.TRUE;
});
// Option 2: Answer proxy
when(serviceMock.deleteById(any()))
.thenAnswer(conditional.toggle(invocation -> Boolean.TRUE));
// Now you can retrieve by id as many times as you like
assertSame(barMock, serviceMock.retrieveById(someId));
assertSame(barMock, serviceMock.retrieveById(someId));
assertSame(barMock, serviceMock.retrieveById(someId));
assertSame(barMock, serviceMock.retrieveById(someId));
assertSame(barMock, serviceMock.retrieveById(someId));
// Until
assertTrue(serviceMock.deleteById(someId));
// NoSuchElementException
serviceMock.retrieveById(someId);
}
The test above might contain errors (I used some classes from the project that I am currently working on).
Thanks for the challenge.
You can use Mockito.verify() to check whether deleteById was called or not:
Mockito.verify(service).deleteById(any());
You can also use Mockito.InOrder for orderly verification (I have not tested the below code):
InOrder inOrder = Mockito.inOrder(service);
inOrder.verify(service).retrieveById(any());
inOrder.verify(service).deleteById(any());
inOrder.verify(service).retrieveById(any());
I was going through the picasso source code and came across this chunk in lines 80-94:
public interface RequestTransformer {
/**
* Transform a request before it is submitted to be processed.
*
* @return The original request or a new request to replace it. Must not be null.
*/
Request transformRequest(Request request);
/** A {@link RequestTransformer} which returns the original request. */
RequestTransformer IDENTITY = new RequestTransformer() {
@Override public Request transformRequest(Request request) {
return request;
}
};
}
From my understanding, it's somewhat like declaring a variable in the interface with a static constructor. Can someone explain what that code is supposed to be doing? I read through a similar post regarding constructors in interfaces (Constructor in an Interface?), but I still don't see why that case does not apply here.
Thanks
This actually is not a variable. It is a constant with an anonymous implementation. Within an interface it is compiled to:
public interface RequestTransformer {
Request transformRequest(Request request);
public static final RequestTransformer IDENTITY = new RequestTransformer() {
@Override
public Request transformRequest(Request request) {
return request;
}
};
}
And this is a bad practice (to have an implementation within an interface) :)
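Since RequestTransformer has exactly one abstract method, it is a functional interface, so the same constant could be written far more compactly with a lambda (an equivalent Java 8+ sketch, not how Picasso actually declares it):

public interface RequestTransformer {

    Request transformRequest(Request request);

    /** Same identity constant, expressed as a lambda. */
    RequestTransformer IDENTITY = request -> request;
}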