How to route data using Camel? - java

I'm new to Camel.
When a request arrives at the endpoint, the Camel flow should start. The request body is the input to the flow (InputA).
Please let me know how to set this up:
InputA -> ProcessA -> OutputA
OutputA -> ProcessB -> OutputB
OutputB -> ProcessC -> OutputC
Just as an example:
public class ProcessA{
public String methodA(String arg){
return arg;
}
}
public class ProcessB{
public String methodB(String arg){
return arg;
}
}
public class ProcessC{
public String methodC(String arg){
return arg;
}
}
How do I pass the input and output through the flow using Camel?
Any help or links will be appreciated.

I advise you to read the book Camel in Action. There are a lot of examples, and the source code is also available. When I started to study Camel, I found this book very useful.
In addition to Claus's answer, I can add an example:
public class Example {
    public static void main(String[] args) throws Exception {
        CamelContext context = new DefaultCamelContext();
        context.addRoutes(new RouteBuilder() {
            @Override
            public void configure() throws Exception {
                // Plain names like "InputA" are not valid endpoint URIs,
                // so direct: endpoints are used to chain the routes.
                from("direct:inputA").process(exchange -> {
                    // This is process A.
                }).to("direct:outputA");
                from("direct:outputA").process(exchange -> {
                    // This is process B.
                }).to("direct:outputB");
                from("direct:outputB").process(exchange -> {
                    // This is process C.
                }).to("log:outputC");
            }
        });
        context.start();
        // send a message through the pipeline
        context.createProducerTemplate().sendBody("direct:inputA", "hello");
        // let Camel complete its job
        Thread.sleep(2000);
        context.stop();
    }
}

You can build three Camel routes and then use an in-memory component to connect them, such as direct (no queue; a direct method invocation) or seda (an asynchronous in-memory queue).
Pseudo routes would look something like this (a fuller sketch follows below):
from("someInput").process(...).to("seda:a")
from("seda:a").process(...).to("seda:b");
from("seda:b").process(...).to("seda:c");

Let's say you read JSON from a file and you want to process it inside a processor, something like this:
from("file:/C:/TEST/")
.process(new MyProcessor())
.to("direct:anotherRouter");
MyProcessor is a class that implements Camel's Processor interface. You need to override its process method.
public class MyProcessor implements Processor {
    @Override
    public void process(Exchange exchange) throws Exception {
    }
}
In a Camel route, the data travels in the message body. To get at the data in a processor, read it from the body. For this example, it looks like this:
public class MyProcessor implements Processor {
    @Override
    public void process(Exchange exchange) throws Exception {
        // read the message body as a String
        String data = exchange.getIn().getBody(String.class);
        // parse the JSON with Jackson
        ObjectMapper mapper = new ObjectMapper();
        JsonNode rootNode = mapper.readTree(data);
        //TODO: All the stuff you want to do with the data.
    }
}


How can I manage (or pass as an argument) my config json to start the testing of my vertx application?

I have a small Vert.x application with an AppLauncher class that extends VertxCommandLauncher, and I set an appConfig.json with the typical config parameters:
public class AppLauncher extends VertxCommandLauncher implements VertxLifecycleHooks {
public static void main(String[] args) {
new AppLauncher().dispatch(args);
}
@Override
public void afterConfigParsed(JsonObject config) {
AppConfig.INSTANCE.setConfig(config);
}
To run my application in my IDE, I set my main class (AppLauncher.java) in the run configuration, with these arguments:
run io.vertx.covid.verticle.MainVerticle -conf ../vertx-application/src/main/resources/appConfig.json
This is my test class:
@BeforeAll
static void deployVerticles(Vertx vertx, VertxTestContext testContext) {
vertx.deployVerticle(BaseVerticle.class.getName(),testContext
.succeeding(id->testContext.completeNow()));
}
This is my BaseVerticle class, which all my verticles extend from:
public abstract class BaseVerticle extends AbstractVerticle {
public static String CONTEXT_PATH = AppConfig.INSTANCE.getConfig().getString(Constants.CONTEXT_PATH);
}
And this is my AppConfig class:
public enum AppConfig {
INSTANCE;
private JsonObject config;
public JsonObject getConfig() {
return config;
}
public void setConfig(JsonObject config) {
this.config = config;
}
}
Everything works, but when I try to test it separately and deploy my verticles, I get a NullPointerException on CONTEXT_PATH (in the BaseVerticle class) because the config (supposed to be loaded from appConfig.json) is null.
I haven't found a way to pass the arguments with my appConfig.json. Or should I call the main method, passing the arguments?
I like to do something similar to profiles in my Vert.x applications.
If you set the vertx-config-path system property before the Vert.x instance is initialized, you can control where Vert.x's config retriever gets the configuration from (you may need to add vertx-config to your Gradle/Maven dependencies).
In your launcher, you can do something like the following, which gives you the ability to add profile-based config files to your resources folder as conf/config-%s.json, where %s is the profile name:
public class CustomLauncher extends Launcher {
public static final String ACTIVE_PROFILE_PROPERTY = "APP_ACTIVE_PROFILE";
private static final CLI cli = CLI.create("main")
.addOption(new Option()
.setShortName("p")
.setLongName("profile")
);
public static void main(String[] args) {
initDefaults(Arrays.asList(args));
new CustomLauncher().dispatch(args);
}
public static void executeCommand(String cmd, String... args) {
initDefaults(Arrays.asList(args));
new CustomLauncher().execute(cmd, args);
}
public static void initDefaults(List<String> args) {
System.setProperty(LoggerFactory.LOGGER_DELEGATE_FACTORY_CLASS_NAME, SLF4JLogDelegateFactory.class.getName());
CommandLine parse = cli.parse(args);
String profile = parse.getOptionValue("p");
if (profile != null && !profile.isEmpty()) {
System.setProperty(ACTIVE_PROFILE_PROPERTY, profile);
System.setProperty("vertx-config-path", String.format("conf/config-%s.json", profile));
}
}
}
Then in your test, instead of relying on the Vert.x test extension to inject vertx for you, you can initialize it yourself and control the profile (i.e., which config file to load), like the following:
private static Vertx vertx;
@BeforeAll
public static void deployVerticles(VertxTestContext testContext) {
CustomLauncher.initDefaults(Arrays.asList("--profile", "test"));
vertx = Vertx.vertx();
ConfigRetriever.create(vertx).getConfig(asyncResult -> {
if (asyncResult.succeeded()) {
JsonObject config = asyncResult.result();
DeploymentOptions deploymentOptions = new DeploymentOptions()
.setConfig(config);
vertx.deployVerticle(BaseVerticle.class.getName(), deploymentOptions);
} else {
// handle failure
}
});
}
Then, when you run your application, instead of providing -conf, you can use -p or --profile.
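For example, the IDE run configuration shown in the question would become something like this (the test profile name here is just an assumption):
run io.vertx.covid.verticle.MainVerticle --profile test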
I also highly recommend getting familiar with vertx-config, as you can also pull configuration from environment variables, Kubernetes config maps, and much more.
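As a small sketch of that idea (the file and env store types come from vertx-config; the file path here is illustrative), you can layer stores so that later ones override earlier ones:
ConfigStoreOptions fileStore = new ConfigStoreOptions()
    .setType("file")
    .setConfig(new JsonObject().put("path", "conf/config.json"));
// environment variables, added last, override values from the file
ConfigStoreOptions envStore = new ConfigStoreOptions().setType("env");
ConfigRetriever retriever = ConfigRetriever.create(vertx,
    new ConfigRetrieverOptions().addStore(fileStore).addStore(envStore));
retriever.getConfig(ar -> {
    if (ar.succeeded()) {
        JsonObject config = ar.result(); // merged configuration
    }
});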
EDIT: I also highly recommend moving to Kotlin if possible; it makes async code much easier to handle in an imperative style (with coroutines). Dealing with libraries like Vert.x is much harder in Java than in languages like Kotlin.
I solved my problem by creating a verticle with the config stuff (see the vertx-config documentation). Here is my config verticle class:
public class ConfigVerticle extends AbstractVerticle {
protected static Logger logger = LoggerFactory.getLogger(ConfigVerticle.class);
public static JsonObject config;
@Override
public void start() throws Exception {
ConfigStoreOptions fileStore = new ConfigStoreOptions()
.setType("file")
.setOptional(true)
.setConfig(new JsonObject().put("path", "conf/appConfig.json"));
ConfigStoreOptions sysPropsStore = new ConfigStoreOptions().setType("sys");
ConfigRetrieverOptions options = new ConfigRetrieverOptions().addStore(fileStore).addStore(sysPropsStore);
ConfigRetriever retriever = ConfigRetriever.create(vertx, options);
retriever.getConfig(ar -> {
if (ar.failed()) {
logger.info("Failed to retrieve config from appConfig.json");
} else {
config = ar.result();
vertx.deployVerticle(MainVerticle.class.getName(), new DeploymentOptions().setConfig(config));
}
});
}
}
And in my MainVerticle class I pass the new configuration like this:
public class MainVerticle extends AbstractVerticle {
@Override
public void start(){
vertx.deployVerticle(BackendVerticle.class.getName(), new DeploymentOptions().setConfig(config()));
}
}
Then, my simple tests:
@ExtendWith(VertxExtension.class)
public class BaseCovidTest {
protected WebClient webClient;
@BeforeEach
void initWebClient(Vertx vertx){
webClient = WebClient.create(vertx);
}
@BeforeAll
static void deployVerticles(Vertx vertx, VertxTestContext vertxTestContext) {
vertx.deployVerticle(ConfigVerticle.class.getName(), vertxTestContext
.succeeding(id-> {
try {
vertxTestContext.awaitCompletion(1, TimeUnit.SECONDS);
} catch (InterruptedException e) {
e.printStackTrace();
}
vertxTestContext.completeNow();
}));
}
}
And everything works. Thanks @Tom, who inspired me to fix it!

Design pattern suggestion for a pipeline operation

Problem statement:
I have to process requests through a pipeline.
For example:
When a request comes in, it has to undergo a sequence of operations (step1, step2, step3, ...).
To achieve that, I am using the Template Method design pattern.
Please review my approach and suggest whether I am implementing this correctly, or whether there is a better solution.
I suspect my approach will introduce code smells, as I am mutating object state very frequently.
Also, can I use Java 8 features to accomplish this, and if so, how?
Thanks.
Code:
package com.example.demo.design;
import java.util.List;
public abstract class Template {
    @Autowired
    private Step1 step1;
    @Autowired
    private Step2 step2;
    @Autowired
    private Save save;
    List<String> stepOutput = null;
    List<String> stepOutputTwo = null;
    List<String> stepOutputThree = null;
    public void step1(String action1) {
        stepOutput = step1.method(action1);
    }
    public void step2(String action2) {
        stepOutputTwo = step2.method(stepOutput, action2);
    }
    // returns its output so that run() can store and persist it
    abstract public List<String> step3();
    public void save() {
        save.persist(stepOutputThree);
    }
    final public void run(String action1, String action2) {
        step1(action1);
        step2(action2);
        stepOutputThree = step3();
        save();
    }
}
In the Java 8 streams model, that could look like the following:
final public void run(String action1, String action2) {
    Stream.of(action1)                      // Stream<String>
        .map(s -> step1.method(s))          // Stream<List<String>>
        .map(l -> step2.method(l, action2)) // Stream<List<String>>
        .map(l -> step3.method(l))          // Stream<List<String>>
        .forEach(l -> save.persist(l));
}
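Since this pipeline pushes a single value through the stages rather than a stream of many elements, plain function composition arguably expresses the same idea more directly. A sketch under the same assumptions as above (the step1/step2/step3/save collaborators, plus java.util.function.Function):
final public void run(String action1, String action2) {
    // compose the steps; each stage's output feeds the next stage
    Function<String, List<String>> pipeline =
        ((Function<String, List<String>>) step1::method)
            .andThen(l -> step2.method(l, action2))
            .andThen(l -> step3.method(l));
    save.persist(pipeline.apply(action1));
}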
I had the same issue! You can do something like this, where the uncheckCall method is for handling exceptions:
final public void run(String action1, String action2) {
    //other stuff
    Stream.of(step1.method(action1))
        .map(stepOutput -> uncheckCall(() -> step2.method(stepOutput, action2)))
        .forEach(stepOutputThree -> uncheckCall(() -> {
            save.persist(stepOutputThree);
            return null; // a Callable must return a value
        }));
    //.....
}
For the uncheckCall method:
public static <T> T uncheckCall(Callable<T> callable) {
    try {
        return callable.call();
    } catch (RuntimeException e) {
        throw e; // e.g. throw BusinessException.wrap(e);
    } catch (Exception e) {
        throw new RuntimeException(e); // e.g. throw BusinessException.wrap(e);
    }
}
Well, when there are "pipelines" and "sequences of operations", the first design pattern that comes to mind is Chain of Responsibility. It provides you with these benefits:
allows you to add new handlers when necessary (e.g. at runtime) without modifying other handlers and processing logic (Open/Closed Principle of SOLID)
allows a handler to stop processing a request if necessary
allows you to decouple processing logic of the handlers from each other (Single Responsibility Principle of SOLID)
allows you to define the order of the handlers to process a request outside of the handlers themselves
One example of real-world usage is Servlet filters, where a filter's doFilter(request, response, chain) method calls chain.doFilter(req, resp) to invoke the next handler:
public void doFilter(ServletRequest req, ServletResponse resp, FilterChain chain)
        throws IOException, ServletException {
    if (haveToInvokeNextHandler) {
        chain.doFilter(req, resp);
    }
}
In case of using classical Chain of Responsibility pattern your processing pipeline may look like the following:
API
public class StepContext {
private Map<String, Object> attributes = new HashMap<>();
public <T> T getAttribute(String name) {
return (T) attributes.get(name);
}
public void setAttribute(String name, Object value) {
attributes.put(name, value);
}
}
public interface Step {
void handle(StepContext ctx);
}
public abstract class AbstractStep implements Step {
private Step next;
public AbstractStep() {
}
public AbstractStep(Step next) {
this.next = next;
}
protected void next(StepContext ctx) {
if (next != null) {
next.handle(ctx);
}
}
}
Implementation
public class Step1 extends AbstractStep {
public Step1(Step next) {
super(next);
}
public void handle(StepContext ctx) {
String action1 = ctx.getAttribute("action1");
List<String> output1 = doSomething(action1);
ctx.setAttribute("output1", output1);
next(ctx); // invoke next step
}
}
public class Step2 extends AbstractStep {
public Step2(Step next) {
super(next);
}
public void handle(StepContext ctx) {
String action2 = ctx.getAttribute("action2");
List<String> output1 = ctx.getAttribute("output1");
List<String> output2 = doSomething(output1, action2);
ctx.setAttribute("output2", output2);
next(ctx); // invoke next step
}
}
public class Step3 extends AbstractStep {
public Step3(Step next) {
super(next);
}
public void handle(StepContext ctx) {
String action2 = ctx.getAttribute("action2");
List<String> output2 = ctx.getAttribute("output2");
persist(output2);
next(ctx); // invoke next step
}
}
Client code
Step step3 = new Step3(null);
Step step2 = new Step2(step3);
Step step1 = new Step1(step2);
StepContext ctx = new StepContext();
ctx.setAttribute("action1", action1);
ctx.setAttribute("action2", action2);
step1.handle(ctx);
All of this can also be simplified into a chain of handlers that are decoupled from each other by removing the next references, in case your pipeline should always invoke every available step without letting the previous step control whether the next one runs:
API
public class StepContext {
private Map<String, Object> attributes = new HashMap<>();
public <T> T getAttribute(String name) {
return (T) attributes.get(name);
}
public void setAttribute(String name, Object value) {
attributes.put(name, value);
}
}
public interface Step {
void handle(StepContext ctx);
}
Implementation
public class Step1 implements Step {
public void handle(StepContext ctx) {
String action1 = ctx.getAttribute("action1");
List<String> output1 = doSomething(action1);
ctx.setAttribute("output1", output1);
}
}
public class Step2 implements Step {
public void handle(StepContext ctx) {
String action2 = ctx.getAttribute("action2");
List<String> output1 = ctx.getAttribute("output1");
List<String> output2 = doSomething(output1, action2);
ctx.setAttribute("output2", output2);
}
}
public class Step3 implements Step {
public void handle(StepContext ctx) {
String action2 = ctx.getAttribute("action2");
List<String> output2 = ctx.getAttribute("output2");
persist(output2);
}
}
Client code
Note that in the case of the Spring framework (I just noticed the @Autowired annotation), the client code can be simplified even further, since the @Autowired annotation can inject all the beans of a given type into a collection.
Here is what the documentation states:
Autowiring Arrays, Collections, and Maps
In case of an array, Collection, or Map dependency type, the container autowires all beans matching the declared value type. For such purposes, the map keys must be declared as type String which will be resolved to the corresponding bean names. Such a container-provided collection will be ordered, taking into account Ordered and @Order values of the target components, otherwise following their registration order in the container. Alternatively, a single matching target bean may also be a generally typed Collection or Map itself, getting injected as such.
public class StepsInvoker {
    // Spring will put all the steps into this collection in the order they were
    // declared within the Spring context (or as specified by `@Order` annotations)
    @Autowired
    private List<Step> steps;
    public void invoke(String action1, String action2) {
        StepContext ctx = new StepContext();
        ctx.setAttribute("action1", action1);
        ctx.setAttribute("action2", action2);
        steps.forEach(step -> step.handle(ctx));
    }
}
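For completeness, a sketch of how the steps might be registered so that the injected List<Step> comes back in pipeline order (the @Component/@Order placement is an illustrative assumption on top of the answer above):
// Each step is a Spring bean; @Order fixes its position in the injected List<Step>.
@Component
@Order(1)
public class Step1 implements Step {
    public void handle(StepContext ctx) { /* ... as shown above ... */ }
}
@Component
@Order(2)
public class Step2 implements Step {
    public void handle(StepContext ctx) { /* ... as shown above ... */ }
}
@Component
@Order(3)
public class Step3 implements Step {
    public void handle(StepContext ctx) { /* ... as shown above ... */ }
}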

Spring cloud - Input vs Output

From this example:
@SpringBootApplication
@EnableBinding(MyProcessor.class)
public class MultipleOutputsServiceApplication {
public static void main(String[] args) {
SpringApplication.run(MultipleOutputsServiceApplication.class, args);
}
@Autowired
private MyProcessor processor;
@StreamListener(MyProcessor.INPUT)
public void routeValues(Integer val) {
if (val < 10) {
processor.anOutput()
.send(message(val));
} else {
processor.anotherOutput()
.send(message(val));
}
}
private static final <T> Message<T> message(T val) {
return MessageBuilder.withPayload(val)
.build();
}
}
MyProcessor interface:
public interface MyProcessor {
String INPUT = "myInput";
@Input
SubscribableChannel myInput();
@Output("myOutput")
MessageChannel anOutput();
@Output
MessageChannel anotherOutput();
}
My question:
Why is the method routeValues in the MultipleOutputsServiceApplication class annotated with MyProcessor.INPUT instead of an output binding (after adding such a member to the MyProcessor interface)?
From the docs, INPUT is for receiving data and OUTPUT is for sending data. Why does the example do the opposite, and why does nothing work if I reverse it?
That method looks correct to me. It doesn't have to be annotated with @Output, because your method has no return type; you are programmatically sending the output to arbitrary destinations (through two different output bindings) inside the method. You do need to make sure that your outputs are bound properly, which your program does through @EnableBinding(MyProcessor.class). You need @StreamListener(MyProcessor.INPUT) on the method because MyProcessor.INPUT is the binding that the StreamListener listens on. Once data arrives through that input, your code programmatically takes over and sends it downstream. That said, there are multiple ways to address this type of use case. You can alternatively do this, too:
@StreamListener
public void routeValues(@Input("myInput") SubscribableChannel input,
                        @Output("myOutput") MessageChannel anOutput,
                        @Output("anotherOutput") MessageChannel anotherOutput) {
    // binding names here match the MyProcessor interface above
    input.subscribe(new MessageHandler() {
        @Override
        public void handleMessage(Message<?> message) throws MessagingException {
            int val = (int) message.getPayload();
            if (val < 10) {
                anOutput.send(message(val));
            } else {
                anotherOutput.send(message(val));
            }
        }
    });
}
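As a practical follow-up (my own addition, not part of the answer above): each binding name declared in MyProcessor maps to a physical destination in configuration, e.g. in application.properties, where the destination names below are hypothetical:
spring.cloud.stream.bindings.myInput.destination=numbers
spring.cloud.stream.bindings.myOutput.destination=small-numbers
spring.cloud.stream.bindings.anotherOutput.destination=large-numbers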

Mock another method in SWF Workflow Client

I am trying to write a unit test for an AWS SWF workflow. Below is the code I would like to test:
@Override
public void execute(String abc) {
new TryCatch() {
@Override
protected void doTry() throws Throwable {
Promise<SomeObject> temp = activityClient.action(abc);
again(temp, abc);
}
@Override
protected void doCatch(Throwable e) throws Throwable {
throw new RuntimeException(e);
}
};
}
@Asynchronous
public void again(Promise<SomeObject> someObject, String abc) {
// Do Something
}
My test class is below:
public class SomeWorkflowTest extends AbstractTestCase {
@Rule
public WorkflowTest workflowTest = new WorkflowTest();
List<String> trace;
private SomeWorkflowClientFactory workflowFactory = new SomeWorkflowClientFactoryImpl();
@Before
public void setUp() throws Exception {
trace = new ArrayList<String>();
// Register activity implementation to be used during test run
SomeActivitiesImpl activitiesImpl = new SomeActivitiesImpl() {
@Override
public SomeObject performHostRecovery(String abc) {
trace.add("ABC: " + abc);
SomeObject testObject = new SomeObject();
return testObject;
}
};
workflowTest.addActivitiesImplementation(activitiesImpl);
workflowTest.addWorkflowImplementationType(SomeWorkflowImpl.class);
}
@Test
public void testWorkflowExecutionCall() throws Throwable {
SomeWorkflowClient workflow = workflowFactory.getClient("XZY");
workflow.execute("XYZ");
List<String> expected = new ArrayList<String>();
expected.add("ABC: abc");
AsyncAssert.assertEqualsWaitFor("Wrong Wrong", expected, trace, null);
}
}
I used the SWF testing docs to write the above test class. However, the method I am testing (execute()) invokes another method in the same class. I am not concerned with the execution of that internal method and would like to mock it out, but given the way the workflow class object is instantiated, I am not clear on how to mock the inner method.
Can someone please advise?
Thanks
You actually can instantiate a workflow object or any other object that workflow uses inside the test method:
@Test
public void testWorkflowExecutionCall() throws Throwable {
SomeWorkflow workflow = new SomeWorkflowImpl(...);
workflow.execute("XYZ");
List<String> expected = new ArrayList<String>();
expected.add("ABC: abc");
AsyncAssert.assertEqualsWaitFor("Wrong Wrong", expected, trace, null);
}
It works because WorkflowTest executes test methods in the context of a dummy test workflow. The code
SomeWorkflowClient workflow = workflowFactory.getClient("XZY");
workflow.execute("XYZ");
actually creates a child workflow in the context of this dummy workflow. But nothing prevents you from executing any async code directly without creating the child workflow.

Camel and ActiveMQ

I'm very new to the Camel world, which is why I'm asking for your help.
Let me tell you what I would like to do:
I have this basic Camel standalone project:
package maventest1;
public class JmsToSql {
private Main main;
public static void main(String[] args) throws Exception {
JmsToSql example = new JmsToSql();
example.boot();
}
public void boot() throws Exception {
main = new Main();
main.enableHangupSupport();
main.bind("foo", new MyBean());
ConnectionFactory connectionFactory = new ActiveMQConnectionFactory("tcp://localhost:61616");
main.bind("test-jms",JmsComponent.jmsComponentAutoAcknowledge(connectionFactory));
main.addRouteBuilder(new MyRouteBuilder());
main.run();
}
private static class MyRouteBuilder extends RouteBuilder {
@Override
public void configure() throws Exception {
from("timer:foo?delay=2000")
.process(new Processor() {
public void process(Exchange exchange) throws Exception {
//NOT SURE THIS IS THE RIGHT WAY
from("test-jms:queue:order1")
.to("test-jms:queue:order2");
}
})
.beanRef("foo");
}
}
public static class MyBean {
public void callMe() {
System.out.println("MyBean.calleMe method has been called");
}
}
}
All I want to do is read all the messages from one ActiveMQ queue and pass them into another queue. Does anybody know how I can do this?
Thanks in advance =D
Just build a route from JMS to JMS:
private static class MyRouteBuilder extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        from("test-jms:queue:order1")
            .to("test-jms:queue:order2");
    }
}
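For context, here is a trimmed sketch of how that route plugs into the boot() wiring from the question (the broker URL and the test-jms binding are copied from the question; the class name is made up):
public class JmsToJms {
    public static void main(String[] args) throws Exception {
        Main main = new Main();
        ConnectionFactory connectionFactory =
            new ActiveMQConnectionFactory("tcp://localhost:61616");
        // bind the JMS component under the "test-jms" scheme, as in the question
        main.bind("test-jms", JmsComponent.jmsComponentAutoAcknowledge(connectionFactory));
        main.addRouteBuilder(new RouteBuilder() {
            @Override
            public void configure() throws Exception {
                from("test-jms:queue:order1").to("test-jms:queue:order2");
            }
        });
        main.run(); // blocks until the JVM is stopped
    }
}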
As you are new to Camel, I recommend reading this article first:
http://java.dzone.com/articles/open-source-integration-apache
And if you want great documentation and tutorials, then pick up one of the Camel books:
http://camel.apache.org/books
