Invoke route from Processor - java

I'm using Camel to integrate two systems. I have defined different routes, and one of them consumes from a specific RabbitMQ queue and sends the messages to a REST service. Nothing fancy here; the route looks like this:
public class WebSurfingRabbitToRestRoute extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        from("rabbitmq://rabbit_host:port/Rabbit_Exchange").
            setHeader("CamelHttpMethod", constant("POST")).
            setHeader("Content-Type", constant("application/json")).
            bean(TransformResponse.class, "transform").
            to("http4://rest_service_host:port/MyRestService");
    }
}
As you can see, I process every message before sending it to the REST service, since I need to adjust some things. The problem is that sometimes (I don't know how or when) the system that publishes into RabbitMQ sends two messages concatenated at once.
What i expect to get is a simple json like this:
[{field1:value1, field2:value2}]
What i sometimes get is:
[{field1:value1, field2:value2},{field1:value3, field2:value4}]
So when I face this scenario, the REST service I'm routing the message to fails (obviously).
In order to solve this, I would like to know if there is a way to invoke a route from inside a processor. From the previous snippet you can see that I'm calling the transform method, so the idea would be to do something like the following pseudo-code. Since the route has already fired, I can't split the events and send them both within the same route "instance", so I thought about invoking a different route that I can call from here, which would send message2 to the very same REST service.
public class TransformRabbitmqResponse {
    public String transform(String body) throws Exception {
        // In here I do stuff with the message.
        // Check if I got two messages concatenated:
        // if (body.contains("},{")) {
        //     split_messages
        //     invokeDifferentRoute(message2)
        // }
        return body;
    }
}
Do you guys think this is possible?

One option (though I am not sure this is the best option) would be to split this up into two different routes using a direct endpoint.
public class WebSurfingRabbitToRestRoute extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        from("rabbitmq://rabbit_host:port/Rabbit_Exchange")
            .setHeader("CamelHttpMethod", constant("POST"))
            .setHeader("Content-Type", constant("application/json"))
            .bean(TransformResponse.class, "transform");

        from("direct:transformedResponses")
            .to("http4://rest_service_host:port/MyRestService");
    }
}
And then in your transform bean, you can use Camel's ProducerTemplate to publish the transformed payload(s) to your new direct endpoint (assuming you are using JSON?).
producerTemplate.sendBody("direct:transformedResponses", jsonString);
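For illustration, here is a minimal sketch of what such a transform bean could look like with an injected ProducerTemplate. The constructor injection assumes the bean is registered (e.g. as a Spring bean) rather than instantiated by Camel from the class reference, and the split logic is a naive string-based placeholder based on the payloads shown in the question; a real implementation should parse the JSON (e.g. with Jackson) instead.
import java.util.ArrayList;
import java.util.List;

import org.apache.camel.ProducerTemplate;

public class TransformResponse {

    private final ProducerTemplate producerTemplate;

    public TransformResponse(ProducerTemplate producerTemplate) {
        this.producerTemplate = producerTemplate;
    }

    public void transform(String body) {
        for (String single : split(body)) {
            // forward each payload to the route that posts to the REST service
            producerTemplate.sendBody("direct:transformedResponses", single);
        }
    }

    // crude split of "[{...},{...}]" into "[{...}]" and "[{...}]";
    // replace with real JSON parsing for production use
    private List<String> split(String body) {
        List<String> result = new ArrayList<>();
        String inner = body.trim();
        inner = inner.substring(1, inner.length() - 1); // drop the outer [ ]
        for (String obj : inner.split("\\},\\s*\\{")) {
            if (!obj.startsWith("{")) {
                obj = "{" + obj;
            }
            if (!obj.endsWith("}")) {
                obj = obj + "}";
            }
            result.add("[" + obj + "]");
        }
        return result;
    }
}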

Related

Publish / Subscribe MQTT using SmallRye reactive messaging dynamically

We are trying to publish and subscribe over the MQTT protocol using SmallRye Reactive Messaging. We managed to actually publish a message into a specific topic/channel through the following simple code:
import io.smallrye.mutiny.Multi;
import org.eclipse.microprofile.reactive.messaging.Outgoing;
import javax.enterprise.context.ApplicationScoped;
import java.time.Duration;

@ApplicationScoped
public class Publish {

    @Outgoing("pao")
    public Multi<String> generate() {
        return Multi.createFrom().ticks().every(Duration.ofSeconds(1))
                .map(x -> "A Message in here");
    }
}
What we want to do is to call the generate() method whenever we want, with a dynamic topic defined by the user. That was our problem, but then we found some classes in the project's GitHub repository, in the package io.smallrye.reactive.messaging.mqtt.
For example, we found a class that appears to make a publish call to an MQTT broker (a Mosquitto server).
But with the statement SendingMqttMessage<String> message = new SendingMqttMessage<String>("myTopic","A message in here",0,false);
we get a red underline under SendingMqttMessage<String> saying 'SendingMqttMessage(java.lang.String, java.lang.String, io.netty.handler.codec.mqtt.MqttQoS, boolean)' is not public in 'io.smallrye.reactive.messaging.mqtt.SendingMqttMessage'. Cannot be accessed from outside package.
UPDATE (publish done)
We finally made a publish request to the MQTT broker (a Mosquitto server), with a dynamic topic configured by the user. As we found out, the previous class SendingMqttMessage was not supposed to be used at all, and we also needed an Emitter to actually make a publish request with a dynamic topic.
@Inject
@Channel("panatha")
Emitter<String> emitter;

@POST
@Consumes(MediaType.APPLICATION_JSON)
@Produces(MediaType.APPLICATION_JSON)
public Response createUser(Device device) {
    System.out.println("New Publish request: message->" + device.getMessage() + " & topic->" + device.getTopic());
    emitter.send(MqttMessage.of(device.getTopic(), device.getMessage()));
    return Response.ok().status(Response.Status.CREATED).build();
}
Now we need to find out about making a Subscription to a topic dynamically.
First, to get us on the same page:
Reactive Messaging does not work with topics, but with channels.
That is important to note, because you can only either read from or write to a given channel. So if you want to do both, you need to configure two channels pointing at the same topic, one incoming and one outgoing.
To answer your question:
You made a pretty good start with Emitters, but you still lack the dynamic nature you'd like.
In your example, you acquired that Emitter through CDI.
Now that is all we need to make this dynamic, since we can dynamically select beans at runtime using CDI, like this:
Sending Messages
private Emitter<byte[]> dynamicEmitter(String topic) {
    return CDI.current().select(new TypeLiteral<Emitter<byte[]>>() {}, new ChannelAnnotation(topic)).get();
}
Please also note that I am creating an Emitter of type byte[], as this is the only currently supported type of the smallrye-mqtt connector (version 3.4.0) according to its documentation.
Receiving Messages
To read messages from a reactive messaging channel, you can use the counterpart of the Emitter, which is the Publisher.
It can be used analogously:
private Publisher<byte[]> dynamicReceiver(String topic) {
    return CDI.current().select(new TypeLiteral<Publisher<byte[]>>() {}, new ChannelAnnotation(topic)).get();
}
You can then process this data in any way you like.
As a demo, I hung it on a simple REST endpoint:
@GET
@Produces(MediaType.SERVER_SENT_EVENTS)
public Multi<String> stream(@QueryParam("topic") String topic) {
    return Multi.createFrom().publisher(dynamicReceiver(topic)).onItem().transform(String::new);
}

@GET
@Path("/publish")
public boolean publish(@QueryParam("msg") String msg, @QueryParam("topic") String topic) {
    dynamicEmitter(topic).send(msg.getBytes());
    return true;
}
One more thing
When creating this solution I hit a few pitfalls you should know about:
Quarkus removes any CDI beans that are "unused". So if you want to inject them dynamically, you need to exclude those beans from removal, or turn off that feature.
All channels injected that way must be configured (see the configuration sketch after this list). Otherwise the injection will fail.
For some reason (even with removal completely disabled), I was unable to inject Emitters dynamically unless they are also injected somewhere else.
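For reference, a minimal sketch of what such a configuration could look like in application.properties. The channel names, topic, and broker address are assumptions; only the property keys follow the smallrye-mqtt connector and Quarkus ArC documentation.
# outgoing and incoming channels pointing at the same (assumed) topic
mp.messaging.outgoing.commands-out.connector=smallrye-mqtt
mp.messaging.outgoing.commands-out.topic=devices/commands
mp.messaging.outgoing.commands-out.host=localhost
mp.messaging.outgoing.commands-out.port=1883

mp.messaging.incoming.commands-in.connector=smallrye-mqtt
mp.messaging.incoming.commands-in.topic=devices/commands
mp.messaging.incoming.commands-in.host=localhost
mp.messaging.incoming.commands-in.port=1883

# keep "unused" beans so they can still be selected dynamically via CDI
quarkus.arc.remove-unused-beans=none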

Reuse HTTP message in different methods in Citrus Framework

Problem: How to reuse the same HTTP message in two different methods (in the same step) in Citrus Framework
Versions: Citrus 2.8.0-SNAPSHOT; Cucumber 3.0.2; Java 8
Having this Gherkin:
Scenario: Client has permission to access the action
Given that client has access to action
When the client calls the endpoint /some-endpoint/client/1/action/some-action
Then the client receives status code of 200
And the client receives a response with {"action" : "some-action", "permission": "AUTHORIZED"}
and the following piece of Java code:
#Then("the client receives status code of {int}")
public void the_client_receives_status_code_of(Integer statusCode) {
designer.http().client(httpClient).receive().response(HttpStatus.valueOf(statusCode))
.contentType("application/json;charset=UTF-8"):
}
#Then("the client receives a response with {string}")
public void the_client_receives_a_response_with(String payload) {
designer.http().client(httpClient).receive().response().payload(payload);
}
If this code is run, it will give a timeout in the method the_client_receives_a_response_with, since Citrus is expecting to receive a second message (the send method was only called once).
The objective here is to separate the validation of the HTTP status code from the validation of the payload, so the separation was made by creating two methods. How can the message received in the_client_receives_status_code_of be reused?
Already tried the following without success:
Giving a name to the received message:
designer.http().client(httpClient).receive().response(HttpStatus.valueOf(statusCode)).name("currentMessage")
.contentType("application/json;charset=UTF-8");
But trying to access the message like this:
@CitrusResource
private TestContext testContext;
...
testContext.getMessageStore().getMessage("currentMessage");
Returns null.
But using this:
designer.echo("citrus:message(currentMessage)");
Prints the correct message.
So, how can I access the message in Java code, i.e. have access to the message to do something like this:
Assert.assertTrue(testContext.getMessageStore().getMessage("currentMessage").getPayload().equals(payload));
In two different methods.
You can do something like this:
#Then("the client receives a response with {string}")
public void the_client_receives_a_response_with(String payload) {
designer.action(new AbstractTestAction() {
public void doExecute(TestContext context) {
Assert.assertTrue(context.getMessageStore()
.getMessage("currentMessage")
.getPayload(String.class)
.equals(payload));
}
});
}
The abstract test action is always provided with the current TestContext instance of your running test. The @CitrusResource injection is not working here because you get a different instance, where the named message is unknown.
Also, as an alternative, you could follow the default named-messages BDD steps API that is described here: https://citrusframework.org/citrus/reference/html/index.html#named-messages
Maybe the message creator BDD steps API will also help: https://citrusframework.org/citrus/reference/html/index.html#message-creator-steps

How are Camel routes launched manually?

I have a Camel route that is set to run every five minutes
@Component
public class CamelRoute extends RouteBuilder {

    private final String comment = "Cron";

    @Override
    public void configure() throws Exception {
        from("quartz2://myGroup/myTimerName?cron=0+0/5+12-18+?+*+MON-FRI")
            .log("Processing from " + comment)
            .to("activemq:Totally.Rocks");
    }
}
And I want to force it to run manually from a Spring HTTP request, and change the comment field in CamelRoute:
@RequestMapping(value = "/ex/foos", method = RequestMethod.GET)
@ResponseBody
public String getFoosBySimplePath() {
    // TODO: start the Camel route
    // TODO: change the Camel log "comment" from "Cron" to "HTTP request"
    return null;
}
To run a Camel route manually you can use FluentProducerTemplate. You can autowire an instance like a normal bean.
Examples: 1, 2
To be honest, I am not sure whether it will work with Quartz endpoints, but I am sure it works pretty well with "direct:" endpoints. Anyway, it could be a good starting point for your investigation.
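For illustration, a hedged sketch of triggering a route from a controller with an autowired FluentProducerTemplate. The controller class and the "direct:manualRun" endpoint are assumptions, so you would need a matching direct: route for it to hit.
import org.apache.camel.FluentProducerTemplate;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ManualTriggerController {

    @Autowired
    private FluentProducerTemplate producerTemplate;   // provided by camel-spring-boot

    @GetMapping("/ex/foos")
    public String triggerRoute() {
        // send a message into an assumed "direct:manualRun" route instead of waiting for the cron trigger
        producerTemplate.to("direct:manualRun")
                .withBody("HTTP request")
                .send();
        return "route triggered";
    }
}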
The solution for my task was easy, although not straightforward if you go by the Camel documentation:
startRoute(String routeId) Starts the given route if it has been
previously stopped
I added another route
from("timer://manualRestart?repeatCount=1")
.routeId("manualRestart")
.noAutoStartup()
.to("activemq:Totally.Rocks");
and used startRoute() to launch it when needed:
public String getFoosBySimplePath() throws Exception {
    camelContext.startRoute("manualRestart");
    return "manualRestart route started";
}
Why do I find it "not straightforward"? Because the documentation says that startRoute() can start previously stopped routes. My route was never stopped; it was configured not to start by default.
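For completeness, a hedged sketch of how the controller could be wired up with the CamelContext. It assumes the Camel 2.x API, where startRoute() is still on CamelContext (in Camel 3.x it moved to the RouteController); the controller class name is also an assumption.
import org.apache.camel.CamelContext;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class ManualRouteController {

    @Autowired
    private CamelContext camelContext;   // auto-configured by camel-spring-boot

    @RequestMapping(value = "/ex/foos", method = RequestMethod.GET)
    public String getFoosBySimplePath() throws Exception {
        // starts the "manualRestart" route that was defined with noAutoStartup()
        camelContext.startRoute("manualRestart");
        return "manualRestart route started";
    }
}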

Using Spring Cloud Stream Source to send method results to stream

I'm trying to create a Spring Cloud Stream Source Bean inside a Spring Boot Application that simply sends the results of a method to a stream (underlying Kafka topic is bound to the stream).
Most of the Stream samples I've seen use the @InboundChannelAdapter annotation to send data to the stream using a poller. But I don't want to use a poller. I've tried setting the poller to an empty array, but the other problem is that when using @InboundChannelAdapter you are unable to have any method parameters.
The overall concept of what I am trying to do is: read from an inbound stream, do some async processing, then post the result to an outbound stream. So using a processor doesn't seem to be an option either. I am using @StreamListener with a Sink channel to read the inbound stream, and that works.
Here is some code I've been trying, but it doesn't work at all. I was hoping it would be this simple because my Sink was, but maybe it isn't. I'm looking for someone to point me to an example of a source that isn't a Processor (i.e. doesn't require listening on an inbound channel) and doesn't use @InboundChannelAdapter, or to give me some design tips to accomplish what I need to do in a different way. Thanks!
@EnableBinding(Source.class)
public class JobForwarder {

    @ServiceActivator(outputChannel = Source.OUTPUT)
    @SendTo(Source.OUTPUT)
    public String forwardJob(String message) {
        log.info(String.format("Forwarding a job message [%s] to queue [%s]", message, Source.OUTPUT));
        return message;
    }
}
Your original requirement can be achieved through the steps below.
Create your custom bound interface (you can use the default @EnableBinding(Source.class) as well):
public interface CustomSource {

    String OUTPUT = "customoutput";

    @Output(CustomSource.OUTPUT)
    MessageChannel output();
}
Inject your bound channel
@Component
@EnableBinding(CustomSource.class)
public class CustomOutputEventSource {

    @Autowired
    private CustomSource customSource;

    public void sendMessage(String message) {
        customSource.output().send(MessageBuilder.withPayload(message).build());
    }
}
Test it
@RunWith(SpringRunner.class)
@SpringBootTest
public class CustomOutputEventSourceTest {

    @Autowired
    CustomOutputEventSource output;

    @Test
    public void sendMessage() {
        output.sendMessage("Test message from JUnit test");
    }
}
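As a side note, a hedged sketch of how the customoutput channel could be bound to a Kafka topic in application.properties; the destination name and broker address are assumptions:
# route the "customoutput" channel to an assumed Kafka topic
spring.cloud.stream.bindings.customoutput.destination=job-events
spring.cloud.stream.kafka.binder.brokers=localhost:9092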
So if you don't want to use a Poller, what causes the forwardJob() method to be called?
You can't just call the method and expect the result to go to the output channel.
With your current configuration, you need an inputChannel on the service containing your inbound message (and something to send a message to that channel). It doesn't have to be bound to a transport; it can be a simple MessageChannel #Bean.
Or, you could use @Publisher to publish the result of the method invocation (as well as having it returned to the caller); docs here.
@Publisher(channel = Source.OUTPUT)
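For illustration, a hedged sketch of what that could look like; it assumes Spring Integration's @Publisher support is enabled (for example via @EnablePublisher) and reuses the class and method names from the question.
import org.springframework.cloud.stream.messaging.Source;
import org.springframework.integration.annotation.Publisher;
import org.springframework.stereotype.Service;

@Service
public class JobForwarder {

    // the return value is published to the Source.OUTPUT channel
    // in addition to being returned to the caller
    @Publisher(channel = Source.OUTPUT)
    public String forwardJob(String message) {
        return message;
    }
}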
Thanks for the input. It took me a while to get back to the problem. I did try reading the documentation for @Publisher. It looked to be exactly what I needed, but I just couldn't get the proper beans initialized to get it wired properly.
To answer your question, the forwardJob() method is called after some async processing of the input.
Eventually I just implemented it using the spring-kafka library directly, and that was much more explicit and felt easier to get going. I think we are going to stick to Kafka as the only channel binding, so we'll stick with that library.
However, we did eventually get the spring-cloud-stream library working quite simply. Here is the code for a single source without a poller:
@Component
@EnableBinding(Source.class)
public class JobForwarder {

    private Source source;

    @Autowired
    public JobForwarder(Source source) {
        this.source = source;
    }

    public void forwardScheduledJob(String message) {
        log.info(String.format("Forwarding a job message [%s] to queue [%s]", message, Source.OUTPUT));
        source.output().send(MessageBuilder.withPayload(message).build());
    }
}

Testing Camel with MockEndpoints

I've got a series of "pipelined" components that all communicate through ActiveMQ message queues. Each component uses Camel to treat each of these queues as an Endpoint. Each component uses the same basic pattern:
Where each component consumes messages off of an input queue, processes the message(s), and then places 1+ messages on an outbound/output queue. The "output" queue then becomes the "input" queue for the next component in the chain. Pretty basic.
I am now trying to roll up my sleeves and provide unit testing for each component using the MockEndpoints provided by Camel's test API. I have been poring over the javadocs and the few examples on Camel's website, but am having difficulty connecting all the dots.
It seems to me that, for each component, a portion of my unit testing is going to want to accomplish the following three things:
Test to see if there are messages waiting on a particular "input" queue
Pull those messages down and process them
Push new messages to an "output" queue and verify that they made it there
I believe I need to create MockEndpoints for each queue like so:
#EndpointInject(uri = "mock:inputQueue")
protected MockEndpoint intputQueue;
#EndpointInject(uri = "mock:outputQueue")
protected MockEndpoint outputQueue;
So now, in my JUnit test methods, I can set up expectations and interact with these endpoints:
@Test
public final void processMethodShouldSendToOutputQueue()
{
    Component comp = new Component();
    comp.process();
    outputQueue.assertIsSatisfied();
}
I'm just not understanding how to wire everything up correctly:
How do I connect comp to the inputQueue and outputQueue MockEndpoints?
For each MockEndpoint, how do I set up expectations so that assertIsSatisfied() checks that a message is present inside a particular queue, or that a particular queue contains messages?
Adam, there are several ways to do this.
For POJO components, blackbox test them separately from any Camel context/routing to focus on business logic.
If you want to do end-to-end testing of the routes, consider using one of these approaches to validate that each step in the route is satisfied.
use NotifyBuilder to build Exchange validation expressions (somewhat complex to get your head around)
use AdviceWith to dynamically change the route before its run (add Log/Mock endpoints, etc)
I prefer AdviceWith because it's very flexible and leverages the familiar MockEndpoints. For a complete example of this, see this unit test.
In short, you will create a unit test to inject MockEndpoints into your route and then validate against them as usual...
context.getRouteDefinition("myRouteId").adviceWith(context, new AdviceWithRouteBuilder() {
    @Override
    public void configure() throws Exception {
        // mock all endpoints
        mockEndpoints();
    }
});

getMockEndpoint("mock:direct:start").expectedBodiesReceived("Hello World");
template.sendBody("direct:start", "Hello World");
assertMockEndpointsSatisfied(); // verify the expectations (CamelTestSupport)
