Camel: failure to add routes dynamically - java

I'm using Apache Camel 2.15.2.
I am trying to add routes to a CamelContext dynamically, but I came across a problem that baffles me.
As far as I can tell, I do add the routes to the correct CamelContext, and it seems like their configure() is called without throwing exceptions. However, when I try to execute the main route, I get a runtime exception telling me that the route I added dynamically does not exist.
Here is a simplified version of my code:
public class MainRouteBuilder extends RouteBuilder
{
    public static CamelContext camelContext;
    public static boolean returnLabel = true;
    public static RouteBuilder nestedRouteBuilder;

    @Override
    public void configure() throws Exception
    {
        System.out.println("Building main route!");
        System.out.println("Context: " + getContext());
        camelContext = getContext();
        from("direct:mainRoute")
            //3. I do not actually want to instantiate RouteContainer like this each time I call this route.
            //I would simply want to reuse a reference to an instance I created outside of configure()...
            .to(new RouteContainer().getMyRoute(2))
        ;
        returnLabel = false;
        //configure direct:myRoute.2
        includeRoutes(nestedRouteBuilder);
    }
}
public class RouteContainer extends RouteBuilder
{
    public Route route;

    RouteContainer() {
        super(MainRouteBuilder.camelContext);
    }

    String getMyRoute(final int n) {
        if (MainRouteBuilder.returnLabel && route == null) {
            route = new Route() {
                @Override
                public void configure()
                {
                    System.out.println("Building nested route!");
                    System.out.println("Context: " + getContext());
                    from("direct:myRoute." + n)
                        .transform()
                        .simple("number: " + n)
                        .to("stream:out")
                        .process(new Processor() {
                            @Override
                            public void process(Exchange exchange) throws Exception {
                                Response response = new Response();
                                response.setStatus(Status.SUCCESS);
                                exchange.getOut().setBody(response);
                            }
                        });
                }
            };
        }
        //1. works:
        MainRouteBuilder.nestedRouteBuilder = this;
        //2. does not work:
        // RouteContainer routeContainer = new RouteContainer();
        // routeContainer.route = this.route;
        // MainRouteBuilder.nestedRouteBuilder = routeContainer;
        return "direct:myRoute." + n;
    }

    @Override
    public void configure() throws Exception {
        if (route != null) {
            route.configure();
        }
    }

    public abstract static class Route {
        abstract public void configure();
    }
}
Requests that are sent to direct:mainRoute work.
During Camel startup I see in the console:
Building main route!
Context: SpringCamelContext(camel-1) with spring id org.springframework.web.context.WebApplicationContext:/sample-route
Building nested route!
Context: SpringCamelContext(camel-1) with spring id org.springframework.web.context.WebApplicationContext:/sample-route
and when I send a request to direct:mainRoute, the output is:
{"status":"SUCCESS"}
HOWEVER, if I comment out (1) above, and uncomment (2), Camel starts up with the same output to the console, but when I send a request to direct:mainRoute, the execution of the route fails with the exception:
org.apache.camel.component.direct.DirectConsumerNotAvailableException: No consumers available on endpoint: Endpoint[direct://myRoute.2].
To clarify: my problem is that I would actually like NOT to instantiate RouteContainer each time I call its route, as happens in (3). This is why I instantiate it at point (2) and plug the Route instance into it...
So I would like MainRouteBuilder to look like this:
public class MainRouteBuilder extends RouteBuilder
{
    public static CamelContext camelContext;
    public static boolean returnLabel = true;
    public static RouteBuilder nestedRouteBuilder;
    RouteContainer routeContainer = new RouteContainer();

    @Override
    public void configure() throws Exception
    {
        System.out.println("Building main route!");
        System.out.println("Context: " + getContext());
        camelContext = getContext();
        from("direct:mainRoute")
            .to(routeContainer.getMyRoute(2))
            //I may want to call it again like this:
            //.to(routeContainer.getMyRoute(3))
        ;
        returnLabel = false;
        //configure direct:myRoute.2
        includeRoutes(nestedRouteBuilder);
    }
}
My assumption is that maybe the nested route direct:myRoute.2 is created in the wrong CamelContext, but the console output tells me it is not so.
Any idea what I am doing wrong here?

Route configuration != route execution
It seems that you are confusing route configuration with route execution. We've all been there ;-)
When you configure the RouteBuilder in MainRouteBuilder#configure(), the method is executed only once, when your Camel app bootstraps, in order to set up the routing logic. The DSL creates the plumbing for the route (Processors, Interceptors, etc.), and that is what the route will be at runtime.
Point to bring home: The DSL is not executed over and over again with every Exchange.
In other words, Camel does not do what you point out in (3). It doesn't execute new RouteContainer().getMyRoute(2) for every single Exchange. Think about it: the bytecode for configure() is only executed when configuring Camel, and that bytecode instantiates an object of class RouteContainer and invokes getMyRoute with argument 2. The resulting endpoint URI is fed into the SendProcessor that the to() DSL generates.
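To make that concrete, here is a minimal sketch (computeTargetUri() is a made-up helper, not part of your code) showing that expressions inside configure() run exactly once, at bootstrap:

// Hypothetical illustration: computeTargetUri() runs once, during configure(),
// not once per Exchange; the returned URI is baked into the SendProcessor.
public class OnceOnlyRouteBuilder extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        from("direct:start")
            .to(computeTargetUri()); // evaluated a single time at startup
    }

    private String computeTargetUri() {
        System.out.println("This prints once, at bootstrap");
        return "direct:target";
    }
}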
Analysis of your code
Now, with regards to why your code doesn't yield the result you expect.
You have a problem with the state-keeping of RouteContainer: the single route instance variable can only ever hold one route. So it's impossible for your current code to call getMyRoute several times (with different ns) and then call includeRoutes just once at the end, because only one generated route will be added.
I also don't like masking the Camel Route class with a class of your own, just to act as a placeholder, but that brings up a different discussion you didn't ask for.
Simpler solution
Instead of your RouteContainer class, here's a RouteGenerator class that creates routes and returns the direct: endpoint to the caller. It keeps track of all the routes in an internal Set.
public class RouteGenerator {
    private Set<RouteBuilder> routeBuilders = new HashSet<>();

    public String generateRoute(final int n) {
        routeBuilders.add(new RouteBuilder() {
            @Override
            public void configure() throws Exception {
                System.out.println("Building nested route!");
                System.out.println("Context: " + getContext());
                from("direct:myRoute." + n)
                    .transform()
                    .simple("number: " + n)
                    .to("stream:out")
                    .process(new Processor() {
                        @Override
                        public void process(Exchange exchange) throws Exception {
                            Response response = new Response();
                            response.setStatus(Status.SUCCESS);
                            exchange.getOut().setBody(response);
                        }
                    });
            }
        });
        return "direct:myRoute." + n;
    }

    public Set<RouteBuilder> getRouteBuilders() {
        return routeBuilders;
    }
}
And here is your MainRouteBuilder, which instantiates the RouteGenerator only once, and can generate as many routes as you wish.
Once you finish configuring your routes, you just iterate over the accumulated RouteBuilders and include them:
public class MainRouteBuilder extends RouteBuilder {
    public static CamelContext camelContext;
    public static RouteGenerator routeGenerator = new RouteGenerator();

    @Override
    public void configure() throws Exception {
        System.out.println("Building main route!");
        System.out.println("Context: " + getContext());
        camelContext = getContext();
        from("direct:mainRoute")
            .to(routeGenerator.generateRoute(2));
        for (RouteBuilder routeBuilder : routeGenerator.getRouteBuilders()) {
            includeRoutes(routeBuilder);
        }
    }
}
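If it helps to see the routes exercised, here is a quick sanity check (a sketch, assuming the context has started; Response and Status are the classes from your question):

// Sketch: call the main route after startup via a ProducerTemplate.
ProducerTemplate template = camelContext.createProducerTemplate();
Object reply = template.requestBody("direct:mainRoute", "trigger");
System.out.println(reply); // expected: a Response with status SUCCESS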
EDIT: Why doesn't your option (2) work?
After debugging for some time, I realised why you're seeing this effect.
Extracted from the Java Tutorial:
As with instance methods and variables, an inner class is associated with an instance of its enclosing class and has direct access to that object's methods and fields.
In (2), you create the Route instance within the scope of the initial RouteContainer object, which acts as its outer object. The Route instance retains that initial RouteContainer as its enclosing instance, so from() and the subsequent methods are invoked on the initial RouteContainer (a RouteBuilder), not on the new RouteContainer you create later, which is the one you hand over to the upper RouteBuilder (the one associated with the CamelContext).
That's why your direct:myRoute.2 is not being added to the Camel Context: it's being created in a different route builder.
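Outside of Camel, you can reproduce the effect with a single plain class (a minimal sketch with made-up names):

// Minimal sketch (hypothetical names): an anonymous inner class stays bound
// to the instance that created it, no matter where the reference is stored.
public class OuterDemo {
    private final String id;

    OuterDemo(String id) { this.id = id; }

    Runnable makeTask() {
        return new Runnable() {            // captures THIS OuterDemo instance
            @Override
            public void run() {
                System.out.println("bound to outer: " + id);
            }
        };
    }

    public static void main(String[] args) {
        OuterDemo first = new OuterDemo("first");
        Runnable task = first.makeTask();
        OuterDemo second = new OuterDemo("second");
        // handing "task" over to code owned by "second" changes nothing:
        task.run(); // prints: bound to outer: first
    }
}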
Also note the console output of (2):
Building main route!
Context: CamelContext(camel-1)
Building nested route!
Context: CamelContext(camel-2)
Added!
The second route is being added to a different context, camel-2. This new context is created lazily by Camel when it adds the route to the old RouteBuilder, which hasn't been associated with any Camel Context yet.
Note that the Camel Context of the initial RouteContainer (created in the instance-variable initialization) is null, because you assign the MainRouteBuilder.camelContext property later on.
You can see how two different route builders are being used by adding the following println statements:
Inside Route#configure:
System.out.println("RouteContainer to which route is added: " + RouteContainer.this.hashCode());
Inside MainRouteBuilder#configure, just before includeRoutes:
System.out.println("RouteContainer loaded into Camel: " + nestedRouteBuilder.hashCode());
With (1), the hashcode is the same. With (2), the hashcode is different, clearly showing that there are two different RouteBuilders in play (one which contains the route, and the one that's loaded into the Context, which does not include the route).
Source: I'm an Apache Camel PMC member and committer.

Related

Mockito - verify a method with Callable<> as parameter [duplicate]

I have a simple scenario in which I am trying to verify some behavior when a method is called (i.e. that a certain method was called with a given parameter, a function pointer in this scenario). Below are my classes:
@SpringBootApplication
public class Application {
    public static void main(String[] args) {
        ConfigurableApplicationContext context = SpringApplication.run(Application.class, args);
        AppBootStrapper bootStrapper = context.getBean(AppBootStrapper.class);
        bootStrapper.start();
    }
}

@Component
public class AppBootStrapper {
    private NetworkScanner networkScanner;
    private PacketConsumer packetConsumer;

    public AppBootStrapper(NetworkScanner networkScanner, PacketConsumer packetConsumer) {
        this.networkScanner = networkScanner;
        this.packetConsumer = packetConsumer;
    }

    public void start() {
        networkScanner.addConsumer(packetConsumer::consumePacket);
        networkScanner.startScan();
    }
}

@Component
public class NetworkScanner {
    private List<Consumer<String>> consumers = new ArrayList<>();

    public void startScan() {
        Executors.newSingleThreadExecutor().submit(() -> {
            while (true) {
                // do some scanning and get/parse packets
                consumers.forEach(consumer -> consumer.accept("Package Data"));
            }
        });
    }

    public void addConsumer(Consumer<String> consumer) {
        this.consumers.add(consumer);
    }
}

@Component
public class PacketConsumer {
    public void consumePacket(String packet) {
        System.out.println("Packet received: " + packet);
    }
}

@RunWith(JUnit4.class)
public class AppBootStrapperTest {
    @Test
    public void start() throws Exception {
        NetworkScanner networkScanner = mock(NetworkScanner.class);
        PacketConsumer packetConsumer = mock(PacketConsumer.class);
        AppBootStrapper appBootStrapper = new AppBootStrapper(networkScanner, packetConsumer);
        appBootStrapper.start();
        verify(networkScanner).addConsumer(packetConsumer::consumePacket);
        verify(networkScanner, times(1)).startScan();
    }
}
I want to verify that bootStrapper did in fact do the proper setup by registering the packet consumer (there might be other consumers registered later on, but this one is mandatory) and then called startScan. I get the following error message when I execute the test case:
Argument(s) are different! Wanted:
networkScanner bean.addConsumer(
com.spring.starter.AppBootStrapperTest$$Lambda$8/438123546#282308c3
);
-> at com.spring.starter.AppBootStrapperTest.start(AppBootStrapperTest.java:24)
Actual invocation has different arguments:
networkScanner bean.addConsumer(
com.spring.starter.AppBootStrapper$$Lambda$7/920446957#5dda14d0
);
-> at com.spring.starter.AppBootStrapper.start(AppBootStrapper.java:12)
From the exception, clearly the function pointers aren't the same.
Am I approaching this the right way? Is there something basic I am missing? I played around and had a consumer injected into PacketConsumer just to see if it made a difference, and that was OK, but I know that's certainly not the right way to go.
Any help, perspectives on this would be greatly appreciated.
Java doesn't have any concept of "function pointers"; when you see:
networkScanner.addConsumer(packetConsumer::consumePacket);
What Java actually compiles is (the equivalent of):
networkScanner.addConsumer(new Consumer<String>() {
    @Override
    public void accept(String packet) {
        packetConsumer.consumePacket(packet);
    }
});
This anonymous inner class happens to be called AppBootStrapper$$Lambda$7. Because it doesn't (and shouldn't) define an equals method, it will never be equal to the anonymous inner class that the compiler generates in your test, which happens to be called AppBootStrapperTest$$Lambda$8. This is regardless of the fact that the method bodies are the same, and are built in the same way from the same method reference.
If you generate the Consumer explicitly in your test and save it as a static final Consumer<String> field, then you can pass that reference in the test and compare it; at that point, reference equality should hold. This should work with a lambda expression or method reference just fine.
A more apt test would probably verify(packetConsumer, atLeastOnce()).consumePacket(...), as the contents of the lambda are an implementation detail and you're really more concerned about how your component collaborates with other components. The abstraction here should be at the consumePacket level, not at the addConsumer level.
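One way to test at that level (a sketch, assuming Mockito's ArgumentCaptor and the usual static imports from org.mockito.Mockito) is to capture whatever start() registered, invoke it by hand, and verify the delegation:

@Test
public void startRegistersADelegatingConsumer() {
    NetworkScanner networkScanner = mock(NetworkScanner.class);
    PacketConsumer packetConsumer = mock(PacketConsumer.class);
    new AppBootStrapper(networkScanner, packetConsumer).start();

    // The double cast keeps this compiling across Mockito versions.
    @SuppressWarnings("unchecked")
    ArgumentCaptor<Consumer<String>> captor =
            (ArgumentCaptor<Consumer<String>>) (ArgumentCaptor<?>) ArgumentCaptor.forClass(Consumer.class);
    verify(networkScanner).addConsumer(captor.capture());

    captor.getValue().accept("Package Data");      // run the captured lambda
    verify(packetConsumer).consumePacket("Package Data");
}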
See the comments and answer on this SO question.

Kafka Consumer Custom MetricReporter not receiving metrics

I've created a class that implements org.apache.kafka.common.metrics.MetricsReporter like so:
public class DatadogMetricTracker implements MetricsReporter {
    @Override
    public void configure(Map<String, ?> configs) {
        System.out.println(configs);
    }

    @Override
    public void init(List<KafkaMetric> metrics) {
        System.out.println(metrics);
    }

    @Override
    public void metricChange(KafkaMetric metric) {
        System.out.println(metric.metricName().name() + ": " + metric.value() + " tags: " + metric.metricName().tags());
    }

    @Override
    public void metricRemoval(KafkaMetric metric) {
    }

    @Override
    public void close() {
    }
}
Then I register the class as a metric reporter when I set up the Kafka props:
properties.put(ConsumerConfig.METRIC_REPORTER_CLASSES_CONFIG, "com.myco.utils.DatadogMetricTracker");
When I start my consumer, configure and init get called; then metricChange is called once with a batch of metrics whose values are all 0 or -Infinity, and it never gets called again. How do I get my metric reporter to fire again?
Thanks!
I checked the code, and the metricChange method doesn't get called apart from when the metric is initially registered, so it's better to use an alternative approach. For now, I will implement a periodic read of the metrics and publish them, since in our case Prometheus scrapes metrics from our application. One more important point: since the values are already calculated internally, I will only use a gauge to populate the values for Prometheus.
It works in the case of the JMX reporter because it creates a dynamic MBean, and when the MBean's attribute values are read (through JConsole or the like), the actual method that computes the metric's value is invoked. Here is the code:
public Object getAttribute(String name) throws AttributeNotFoundException, MBeanException, ReflectionException {
    if (this.metrics.containsKey(name))
        return this.metrics.get(name).metricValue();
    else
        throw new AttributeNotFoundException("Could not find attribute " + name);
}

JPA.withTransaction executing other controllers method error: Global.java:39: error: 'void' type not allowed here

I am trying to execute some DB insert/update queries at certain time intervals.
To achieve this I decided to use the Play Framework's built-in Akka actor system.
I have my class with method:
public class Global extends GlobalSettings {
    @Override
    public void onStart(Application application) {
        Akka.system().scheduler().schedule(
            Duration.create(1, TimeUnit.SECONDS),  // start task delay
            Duration.create(24, TimeUnit.HOURS),   // between task instance delay
            new Runnable() {
                @Override
                public void run() {
                    JPA.withTransaction(
                        ImportCrmData.start()
                    );
                }
            },
            Akka.system().dispatcher()
        );
    }
}
And the method that is called by the actor system:
public class ImportCrmData extends Controller {
    @Transactional
    public static void start() {
        List<Customer> customersToUpdate = CustomerCRM.importCrmData();
        for (Customer c : customersToUpdate) {
            c.toDataBase();
        }
    }
}
I am getting an error on compile:
[error] app/modules/common/app/Global.java:39: error: 'void' type not allowed here ImportCrmData.start()
I understand that the problem occurs because JPA.withTransaction() expects ImportCrmData.start() to return a Callback0 or Function0<>, but I don't know how to do that.
My method just persists data. Why should I even return something from it?
ImportCrmData is a controller, and thus it must return a valid HTTP response (a result). A typical use case:
public class CustomerController extends Controller {
    public static Result getCustomers() {
        List<Customer> customers = CustomerService.getCustomers();
        return ok(Json.toJson(customers));
    }
}
The above example consists of a controller, which is an entry point to your application and reacts to client requests. CustomerService encapsulates the logic related to getting customers. ok(...) returns an implementation of Result: a valid HTTP response with code 200 and, in the above scenario, a JSON body; it is implemented in the Controller base class. Next, your controller can be mapped in the routes file to a URL like so:
GET /customers controller.CustomerController.getCustomers
Applying the above pattern, you should have:
CrmController - entry point
CrmService - actual business logic
This separation allows you to use your CrmService in the Global class as well as in the controller layer without duplicating logic. Mind you, this is just a suggestion.
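For completeness, here is how the scheduler body could then compile (a sketch; CrmService.importCrmData() is the hypothetical service method, and JPA.withTransaction has an overload taking a play.libs.F.Callback0):

new Runnable() {
    @Override
    public void run() {
        JPA.withTransaction(new play.libs.F.Callback0() {
            @Override
            public void invoke() throws Throwable {
                CrmService.importCrmData(); // persists; nothing to return
            }
        });
    }
}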

Java template method pattern

I am trying to implement something along the lines of the template method pattern within some JavaEE beans that do some processing work on a dataset.
Each processing bean takes the Job object, does some work, then returns the updated job for the next bean (about 10 in total); each bean has a single method with the same name (doProcessing) and a single argument (job).
I would like to perform some logging at the start and end of each bean's doProcessing method, so that at the end of processing the Job contains logging info from each bean (stored in a hashmap or suchlike).
My current implementation looks something like this...
@Stateless
public class processingTaskOne {
    public void doProcessing(Job job) {
        //always called at beginning of method
        String beanReport = "Info from Task 1: ";
        for (int i = 0; i < job.getDataArray().size(); i++) {
            beanReport += "\n some more info";
            //Do some processing work here
        }
        //always called at end of method
        job.addNewReportSection(beanReport);
    }
}
But I know that I can do better than this, using inheritance I should be able to create a superclass along the lines of...
public abstract class Reportable {
    private String sectionReport;

    public void preProcessing(Job job) {
        //Setup bean report, use reflection to get subclass name
    }

    public void postProcessing(Job job) {
        //Finish bean report and append to job
        job.addNewReportSection(sectionReport);
    }

    //not sure how this should work
    public abstract void doProcessing(Job job);
}
And any class that extends the superclass will automatically perform the pre/postprocessing actions...
@Stateless
public class processingTaskOne extends Reportable {
    public void doProcessing(Job job) {
        for (int i = 0; i < job.getDataArray().size(); i++) {
            super.sectionReport += "log some info";
            //Do some processing work here
        }
    }
}
But I have been unable to work out how to implement this exactly, since all the examples refer to POJOs, and as my beans are @Stateless there is no constructor.
Can someone provide some guidance on this? Am I barking up the wrong tree?
If I understood your question correctly, you can try the following:
public abstract class Reporting {
    public void setUp(Job job) {
        // set things up
    }

    public void tearDown(Job job) {
        // post-processing stuff
    }

    public void process(Job job) {
        setUp(job);
        doProcessing(job);
        tearDown(job);
    }

    public abstract void doProcessing(Job job);
}

public class Processor1 extends Reporting {
    @Override
    public void doProcessing(Job job) {
        // business logic
    }
}
and later, somewhere in your code, you should call not the doProcessing(), but rather the process() method of your base class.
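For example (a sketch reusing the names above):

// The caller invokes process(), never doProcessing() directly, so the
// pre/post hooks always run around the business logic.
Reporting task = new Processor1(); // or an injected @Stateless bean
task.process(job);                 // setUp -> doProcessing -> tearDown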
Also, in case my interpretation was correct, you might be interested in using some aspect-oriented programming framework like AspectJ or Spring AOP.

How to properly test with mocks Akka actors in Java?

I'm very new to Akka and I'm trying to write some unit tests in Java. Consider the following actor:
public class Worker extends UntypedActor {
    @Override
    public void onReceive(Object message) throws Exception {
        if (message instanceof Work) {
            Work work = (Work) message;
            Result result = new Helper().processWork(work);
            getSender().tell(result, getSelf());
        } else {
            unhandled(message);
        }
    }
}
What is the proper way to intercept the call new Helper().processWork(work)? On a side note, is there any recommended way to achieve dependency injection within Akka actors with Java?
Thanks in advance.
Your code is already properly testable:
you can test your business logic separately, since you can just instantiate your Helper outside of the actor
once you are sure that the Helper does what it is supposed to do, just send some inputs to the actor and observe that the right replies come back
Now if you need to have a “mocked” Worker to test some other component, just don’t use a Worker at all, use a TestProbe instead. Where you would normally get the ActorRef of the Worker, just inject probe.getRef().
So, how to inject that?
I’ll assume that your other component is an Actor (because otherwise you won’t have trouble applying whatever injection technique you normally use). Then there are three basic choices:
pass it in as constructor argument
send it within a message
if the actor creates the ref as its child, pass in the Props, possibly in an alternative constructor
The third case is probably what you are looking at (I’m guessing based on the actor class’ name):
public class MyParent extends UntypedActor {
    final Props workerProps;

    public MyParent() {
        workerProps = new Props(...);
    }

    public MyParent(Props p) {
        workerProps = p;
    }

    ...

    getContext().actorOf(workerProps, "worker");
}
And then you can inject a TestProbe like this:
final TestProbe probe = new TestProbe(system);
final Props workerMock = new Props(new UntypedActorFactory() {
    public UntypedActor create() {
        return new UntypedActor() {
            @Override
            public void onReceive(Object msg) {
                probe.getRef().tell(msg, getSender());
            }
        };
    }
});
final ActorRef parent = system.actorOf(new Props(new UntypedActorFactory() {
    public UntypedActor create() {
        return new MyParent(workerMock);
    }
}), "parent");
