Spring cloud - Input vs Output - java

From this example:
@SpringBootApplication
@EnableBinding(MyProcessor.class)
public class MultipleOutputsServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(MultipleOutputsServiceApplication.class, args);
    }

    @Autowired
    private MyProcessor processor;

    @StreamListener(MyProcessor.INPUT)
    public void routeValues(Integer val) {
        if (val < 10) {
            processor.anOutput().send(message(val));
        } else {
            processor.anotherOutput().send(message(val));
        }
    }

    private static final <T> Message<T> message(T val) {
        return MessageBuilder.withPayload(val).build();
    }
}
MyProcessor interface:
public interface MyProcessor {

    String INPUT = "myInput";

    @Input
    SubscribableChannel myInput();

    @Output("myOutput")
    MessageChannel anOutput();

    @Output
    MessageChannel anotherOutput();
}
My question:
Why is the method routeValues in the MultipleOutputsServiceApplication class annotated with MyProcessor.INPUT instead of MyProcessor.myOutput (after adding such a member to the MyProcessor interface)?
According to the docs, INPUT is for receiving data and OUTPUT is for sending data. Why does the example do the opposite, and why does nothing work if I reverse it?

That method looks correct to me. It doesn't have to be annotated with @Output, because the method has no return type and you are programmatically sending the output to arbitrary destinations (through two different output bindings) inside the method. You do need to make sure your outputs are bound properly, which your program does through @EnableBinding(MyProcessor.class). You need @StreamListener(MyProcessor.INPUT) on the method because MyProcessor.INPUT is the binding the StreamListener listens on. Once you receive data through that input, your code programmatically takes over sending the data downstream. That said, there are multiple ways to address this type of use case. Alternatively, you can do something like this:
@StreamListener
public void routeValues(@Input("input") SubscribableChannel input,
                        @Output("myOutput") MessageChannel myOutput,
                        @Output("output") MessageChannel output) {
    input.subscribe(new MessageHandler() {

        @Override
        public void handleMessage(Message<?> message) throws MessagingException {
            int val = (int) message.getPayload();
            if (val < 10) {
                myOutput.send(message(val));
            } else {
                output.send(message(val));
            }
        }
    });
}

Related

Prevent spring-boot application from starting if enum validation fails

I want to prevent my application from starting if my enum's method for validating uniqueness of a field fails.
public enum MyEnum {
    VALUE_1(1),
    VALUE_2(1), // same code as VALUE_1 is forbidden
    VALUE_3(3),
    ;

    private int code;

    // This block runs only when the enum is first accessed; I want it to be
    // executed upon Spring Boot initialization and to stop the application when it fails.
    static {
        long uniqueCodesCount = Arrays.stream(MyEnum.values()).map(MyEnum::getCode)
                .distinct()
                .count();
        if (MyEnum.values().length != uniqueCodesCount) {
            throw new RuntimeException("Not unique codes");
        }
    }
}
Just keep it simple. For example, convert the verification to a static method:
public enum MyEnum {
    ...

    public static void verifyUniqueness() {
        long uniqueCodesCount = Arrays.stream(MyEnum.values()).map(MyEnum::getCode)
                .distinct()
                .count();
        if (MyEnum.values().length != uniqueCodesCount) {
            throw new RuntimeException("Not unique codes");
        }
    }
}
Then you may implement InitializingBean in a bean and override the method afterPropertiesSet(). For example, if your application is called DemoApplication, it will look like this:
@SpringBootApplication
public class DemoApplication implements InitializingBean {

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }

    @Override
    public void afterPropertiesSet() throws Exception {
        MyEnum.verifyUniqueness();
    }
}
From the documentation of InitializingBean:
Interface to be implemented by beans that need to react once all their properties have been set by a BeanFactory: e.g. to perform custom initialization, or merely to check that all mandatory properties have been set.
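The uniqueness check itself is plain Java and can be tried outside Spring entirely. Below is a minimal, self-contained sketch of the same idea; the class name and the boolean-returning variant of the check are made up for illustration:

```java
import java.util.Arrays;

public class EnumUniquenessDemo {

    // Hypothetical enum with an intentionally duplicated code.
    enum MyEnum {
        VALUE_1(1),
        VALUE_2(1), // duplicate of VALUE_1's code
        VALUE_3(3);

        private final int code;

        MyEnum(int code) {
            this.code = code;
        }

        int getCode() {
            return code;
        }

        // Returns true only if every enum constant has a distinct code.
        static boolean codesAreUnique() {
            long distinctCodes = Arrays.stream(values())
                    .mapToInt(MyEnum::getCode)
                    .distinct()
                    .count();
            return distinctCodes == values().length;
        }
    }

    public static void main(String[] args) {
        // VALUE_1 and VALUE_2 share code 1, so the check reports a duplicate.
        System.out.println(MyEnum.codesAreUnique()); // prints false
    }
}
```

In the Spring setup above, verifyUniqueness() would throw instead of returning a boolean, which is what aborts the application context on startup.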

How to send and receive dedicated types in spring-cloud-stream test?

I have a simple spring-cloud-stream application with this Function implementation:
@Configuration
public class StarZFunction {

    private static final Logger LOGGER = LoggerFactory.getLogger(StarZFunction.class);

    @Bean
    public Function<StarZ, StarZ> processEvents() {
        return starZ -> {
            starZ.setVmNr(starZ.getVmNr() * 1000);
            starZ.setTuCode(starZ.getTuCode().toLowerCase(Locale.ROOT));
            return starZ;
        };
    }
}
And I'm trying to test this code by simply sending and receiving the object:
@SpringBootTest(classes = SampleApp.class)
@Import({TestChannelBinderConfiguration.class})
public class IntegrationTest1 {

    private static final Logger LOGGER = LoggerFactory.getLogger(IntegrationTest1.class);

    @Autowired
    private InputDestination input;

    @Autowired
    private OutputDestination output;

    @Test
    void sendAndReceive() {
        StarZ starZ = new StarZ(10, "SBB");
        input.send(new GenericMessage<StarZ>(starZ));
        GenericMessageConverter genericMessageConverter = new GenericMessageConverter();
        assertThat(output.receive().getPayload()).isEqualTo(starZ);
    }
}
This fails, because output.receive().getPayload() only returns a byte[]. How can I get the StarZ Object?
It looks like the data is converted to JSON by default. Since I did not find a way to get something like Message<StarZ> directly from the API, I did the conversion by hand.
private final ObjectMapper objectMapper = new ObjectMapper();

@Test
void sendAndReceive() {
    StarZ starZ = new StarZ(10, "SBB");
    input.send(new GenericMessage<>(starZ));
    Message<byte[]> messageWithByte = output.receive();
    assertThat(deserializeMessage(messageWithByte)).isEqualTo(starZ);
}

private StarZ deserializeMessage(Message<byte[]> m) {
    LOGGER.info(new String(m.getPayload()));
    try {
        return objectMapper.readValue(m.getPayload(), StarZ.class);
    } catch (IOException e) {
        throw new RuntimeException(e);
    }
}
It looks a little bit awkward, but it does the job.
I'm more than happy to accept a better answer, though.
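The key point is that the test binder hands back the raw serialized bytes, so the payload must be decoded before it can be compared to a StarZ. A stdlib-only sketch of that step follows; the JSON string is a hand-written assumption of what Jackson might produce, not output captured from a real run:

```java
import java.nio.charset.StandardCharsets;

public class PayloadDecodingDemo {

    // Mirrors the first half of deserializeMessage above: decode the raw
    // bytes from output.receive().getPayload() as UTF-8 JSON text, which
    // an ObjectMapper can then bind to a StarZ instance.
    static String decode(byte[] payload) {
        return new String(payload, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Simulated payload: an assumed JSON form of StarZ(10, "SBB") after
        // processEvents() ran (vmNr multiplied by 1000, tuCode lower-cased).
        byte[] payload = "{\"vmNr\":10000,\"tuCode\":\"sbb\"}"
                .getBytes(StandardCharsets.UTF_8);
        System.out.println(decode(payload));
    }
}
```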

@StreamListener define groupId for Kafka - how can I set many consumers for the same topic

I am using Java, Spring Boot, and Kafka in my application.
I want to define many consumers for the same topic inside Kafka.
Now, I am defining the group ID inside my application properties file:
spring.cloud.stream.bindings.myFirstTopic.destination=my-first-topic
spring.cloud.stream.bindings.myFirstTopic.group=my-first-consumer
and on the method I am using the annotation:
@StreamListener(MyFirstTopicBinding.MY_FIRST_TOPIC)
public void firstConsumer(@Payload MessageDto dto) {}

@StreamListener(MyFirstTopicBinding.MY_FIRST_TOPIC)
public void secondConsumer(@Payload MessageDto dto) {}
I want both methods to get the same messages....
How can I do it?
You cannot do that; you need to give each one a different channel name, then
spring.cloud.stream.bindings.myFirstConsumer.destination=my-first-topic
spring.cloud.stream.bindings.myFirstConsumer.group=my-first-consumer
spring.cloud.stream.bindings.mySecondConsumer.destination=my-first-topic
spring.cloud.stream.bindings.mySecondConsumer.group=my-second-consumer
Also, @StreamListener is deprecated and will be removed soon; you should convert to the functional model.
@Bean
Consumer<MessageDto> myFirstConsumer() {
    return dto -> {...};
}

@Bean
Consumer<MessageDto> mySecondConsumer() {
    return dto -> {...};
}
Then
spring.cloud.function.definition=myFirstConsumer;mySecondConsumer
spring.cloud.stream.bindings.myFirstConsumer-in-0.destination=my-first-topic
spring.cloud.stream.bindings.myFirstConsumer-in-0.group=my-first-consumer
spring.cloud.stream.bindings.mySecondConsumer-in-0.destination=my-first-topic
spring.cloud.stream.bindings.mySecondConsumer-in-0.group=my-second-consumer
EDIT
Here is an example that uses a single binding and publishes each message to multiple listeners:
@SpringBootApplication
public class So70447378Application {

    public static void main(String[] args) {
        SpringApplication.run(So70447378Application.class, args);
    }

    @Bean
    public Consumer<Message<MyDto>> input(PublishSubscribeChannel multiplex) {
        return msg -> multiplex.send(msg);
    }

    @Bean
    PublishSubscribeChannel multiplex() {
        return new PublishSubscribeChannel();
    }

    @ServiceActivator(inputChannel = "multiplex")
    void firstConsumer(MyDto dto) {
        System.out.println("1:" + dto);
    }

    @ServiceActivator(inputChannel = "multiplex")
    void secondConsumer(MyDto dto) {
        System.out.println("2:" + dto);
    }

    public static class MyDto {

        private String foo;

        public String getFoo() {
            return this.foo;
        }

        public void setFoo(String foo) {
            this.foo = foo;
        }
    }
}
spring.cloud.function.definition=input
spring.cloud.stream.bindings.input-in-0.destination=my-topic
spring.cloud.stream.bindings.input-in-0.group=my-group
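The fan-out idea behind the PublishSubscribeChannel, one inbound message delivered to every registered handler, can be sketched without Spring at all. The following is an illustrative stand-in, not the actual Spring Integration implementation:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class FanOutDemo {

    // A tiny stand-in for PublishSubscribeChannel: unlike a point-to-point
    // channel, every subscriber receives every message sent to it.
    static class PubSubChannel<T> {
        private final List<Consumer<T>> subscribers = new ArrayList<>();

        void subscribe(Consumer<T> handler) {
            subscribers.add(handler);
        }

        void send(T message) {
            for (Consumer<T> handler : subscribers) {
                handler.accept(message);
            }
        }
    }

    public static void main(String[] args) {
        PubSubChannel<String> multiplex = new PubSubChannel<>();
        List<String> received = new ArrayList<>();
        // Two handlers, playing the role of firstConsumer and secondConsumer.
        multiplex.subscribe(dto -> received.add("1:" + dto));
        multiplex.subscribe(dto -> received.add("2:" + dto));
        multiplex.send("hello");
        System.out.println(received); // prints [1:hello, 2:hello]
    }
}
```

This is why a single Kafka consumer group suffices in the Spring example: the binding consumes each record once, and the channel takes care of duplicating it to every listener in-process.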

Dynamically trigger existing Flux from another API end point - Spring Webflux

I am trying to build a microservice using WebFlux which will send/publish some data, based on an event, to a particular subscriber.
With the below implementation (from another Stack Overflow issue), I am able to create one publisher, and all subscribers receive the data automatically when we trigger the event by calling the "/send" API:
@SpringBootApplication
@RestController
public class DemoApplication {

    final FluxProcessor processor;
    final FluxSink sink;
    final AtomicLong counter;

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }

    public DemoApplication() {
        this.processor = DirectProcessor.create().serialize();
        this.sink = processor.sink();
        this.counter = new AtomicLong();
    }

    @GetMapping("/send/{userId}")
    public void test(@PathVariable("userId") String userId) {
        sink.next("Hello World #" + counter.getAndIncrement());
    }

    @RequestMapping(produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<ServerSentEvent> sse() {
        return processor.map(e -> ServerSentEvent.builder(e).build());
    }
}
Problem statement: my app has user-based access, and for each user there will be some notifications that I want to push only on an event. The events are stored in the DB with user IDs, and when another API hits the "send" endpoint with "userId" as a path variable, it should send the data related to that user only if the user is registered as a subscriber and still listening on the channel.
This is not an ideal solution, but it works:
First the SSE end-point :
@RestController
public class SSEController {

    private String value = "";

    public String getValue() {
        return value;
    }

    public void setValue(String value) {
        this.value = value;
    }

    @GetMapping(path = "/sse", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    Flux<String> getWords() {
        System.out.println("sse ?");
        return Flux.interval(Duration.ofSeconds(1))
                .map(sequence -> getValue());
    }
}
And from any service, or wherever you can autowire, just inject it:
@Service
public class SomeService {

    @Autowired
    private SSEController sseController;
    ...

    void something() {
        ....
        sseController.setValue("some value");
        ....
    }
}
This is the approach I'm using.

How to pass parameters from one lambda to another in java?

I want to invoke Lambda function A from another lambda function B with some parameters.
The following is the invoking lambda function.
@Slf4j
@SpringBootApplication
public class Application extends SpringBootServletInitializer implements CommandLineRunner {

    @Autowired
    private ConfigurableApplicationContext context;

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }

    @Override
    public void run(String... args) {
        DCService dcService = LambdaInvokerFactory.builder().lambdaFunctionNameResolver(
                (method, lambdaFunction, lambdaInvokerFactoryConfig) -> "EventPlanDCFunction-Dev")
                .build(DCService.class);
        log.info("Response from DC service :: {}", dcService.getClass());
        String[] params = new String[]{"Subir has invoked"};
        dcService.run(params);
        SpringApplication.exit(context);
    }
}
The following is the code of the DCService.java file.
public interface DCService {

    @LambdaFunction(functionName = "DeliveryCycleLambdaHandler",
            invocationType = InvocationType.Event)
    void run(String... params);
}
The following is the code of the lambda function which I want to invoke.
@SpringBootApplication
public class Application extends SpringBootServletInitializer implements CommandLineRunner {

    @Autowired
    private ConfigurableApplicationContext context;

    @Autowired
    private DeliveryCycleService deliveryCycleService;

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }

    @Override
    public void run(String... args) {
        deliveryCycleService.printMessage(args[0]);
        SpringApplication.exit(context);
    }
}
As you can see, I tried to pass the parameter by creating an array of Strings in the invoking method, but I am getting an ArrayIndexOutOfBoundsException in the other method, meaning the parameter is not actually reaching the invoked method. If I do not pass a parameter, it works fine, but for my use case I need to pass a parameter and invoke the method asynchronously.
NOTE: The Lambda handler code is the same for both of them. The following belongs to one of them.
@Slf4j
public class DCInvokeHandler implements RequestStreamHandler {

    private static final Logger LOGGER = LoggerFactory.getLogger(DCInvokeHandler.class);

    private volatile SpringBootLambdaContainerHandler<AwsProxyRequest, AwsProxyResponse> handler;

    @Override
    public void handleRequest(InputStream inputStream, OutputStream outputStream, Context context) throws IOException {
        if (handler == null) {
            synchronized (this) {
                if (handler == null) {
                    handler = initHandler();
                }
            }
        }
        handler.proxyStream(inputStream, outputStream, context);
    }

    private static SpringBootLambdaContainerHandler<AwsProxyRequest, AwsProxyResponse> initHandler() {
        try {
            return SpringBootLambdaContainerHandler.getAwsProxyHandler(Application.class, Env.getEnv().name());
        } catch (ContainerInitializationException e) {
            LOGGER.error("Failed to start spring boot lambda handler", e);
            // if we fail here, we re-throw the exception to force another cold start
            throw new IllegalStateException("Could not initialize Spring Boot application", e);
        }
    }
}
This is the basic code to invoke another Lambda from a Lambda function (AWS SDK docs):
try {
    InvokeRequest invokeRequest = new InvokeRequest();
    invokeRequest.setFunctionName(FunctionName);
    invokeRequest.setPayload(ipInput);
    returnDetails = byteBufferToString(
            lambdaClient.invoke(invokeRequest).getPayload(),
            Charset.forName("UTF-8"), logger);
} catch (Exception e) {
    logger.log(e.getMessage());
}
To invoke the other Lambda function asynchronously, set InvocationType to Event (AWS API docs).
The invocation types are RequestResponse, Event, and DryRun:
RequestResponse (default) - Invoke the function synchronously. Keep the connection open until the function returns a response or times out. The API response includes the function response and additional data.
Event - Invoke the function asynchronously. Send events that fail multiple times to the function's dead-letter queue (if it's configured). The API response only includes a status code.
DryRun - Validate parameter values and verify that the user or role has permission to invoke the function.
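On the exception the question mentions: if the serialized parameters never reach the invoked function, its args array is empty and args[0] throws immediately. A minimal stdlib-only illustration, with a hypothetical guarded-read helper the invoked side could use while debugging:

```java
public class EmptyArgsDemo {

    // Hypothetical defensive read: return the first parameter if present,
    // otherwise a fallback, instead of blindly indexing into the array.
    static String firstParamOrDefault(String[] args, String fallback) {
        return args.length > 0 ? args[0] : fallback;
    }

    public static void main(String[] args) {
        String[] empty = new String[0];
        try {
            // This is effectively what deliveryCycleService.printMessage(args[0])
            // does when no payload arrives, and why it fails.
            String first = empty[0];
            System.out.println(first);
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("no parameter reached the invoked method");
        }
        // The guarded read avoids the exception entirely.
        System.out.println(firstParamOrDefault(empty, "default"));
    }
}
```

A guard like this does not fix the root cause (the payload not being delivered), but it makes the failure mode explicit instead of crashing the handler.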
