Transactions in Camel routes with Splitter parallel processing - Java

We have an XML file that contains multiple customer elements, and we want to save the customer information to the DB using transactions. From my understanding, a transaction needs to run on a single thread so that the whole transaction can be rolled back in case of errors.
Here's my XML:
<root>
    <customers>
        <customer>...</customer>
        <customer>...</customer>
        <customer>...</customer>
        <customer>...</customer>
    </customers>
</root>
Here is my route:
<route id="routeA">
<from uri="direct-vm:sample" />
<transacted />
<splitter parallelProcessing = "true" stopOnException="true"
strategyRef="combine" />
<xpath>/root/customers/customer</xpath>
<bean ref="customerService" method="saveCustomer" />
<onException>java.lang.Exception</onException>
<handled><constant>true</constant></handled>
<rollback markRollbackOnly="true" />
</route>
The method saveCustomer() runs a lot of business logic before saving customers to the database, and if an exception is thrown for one or two customers, I see multiple rollback messages; it seems this happens for each thread. Do transactions in Camel routes work with parallel processing? Is there any other way to save the customers to the DB in parallel within a single DB transaction?

No, you cannot do parallel work in the same transaction. The work must occur on the same thread.

You can use shareUnitOfWork() in combination with your parallelProcessing.
As the Camel documentation of the Splitter EIP mentions, it will roll back the entire unit of work, not only the sub-units:
When the Splitter is done, it checks the state of the shared unit of work and checks if any errors occurred. If an error occurred, it sets the exception on the Exchange and marks it for rollback. The error handler will yet again kick in, as the Exchange has been marked as rollback and it had an exception as well. No redelivery attempts are performed (as it was marked for rollback) and the Exchange will be moved into the dead letter queue.
See the Sharing unit of work section in the link.
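A minimal Java DSL sketch of that setup, reusing the endpoint, bean, and XPath from the question (the saveCustomer calls still run on the splitter's threads, but an error in any sub-exchange marks the original Exchange for rollback):
import org.apache.camel.builder.RouteBuilder;

public class CustomerSplitRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        // Mark the transaction for rollback when any exception escapes the split
        onException(Exception.class)
                .markRollbackOnly();

        from("direct-vm:sample")
            .transacted()
            .split(xpath("/root/customers/customer"))
                .shareUnitOfWork()       // errors in sub-exchanges propagate to the parent unit of work
                .parallelProcessing()
                .stopOnException()
                .to("bean:customerService?method=saveCustomer")
            .end();
    }
}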

Related

Execute @Handler method in a multi-threaded environment

I need to execute a @Handler (org.apache.camel.Handler) method in a multi-threaded environment. Below are my current code and camelroute.xml file. Any idea or suggestion would be appreciated.
import org.apache.camel.Exchange;
import org.apache.camel.Handler;
import org.springframework.stereotype.Component;

@Component("messagehandler")
public class HandleMessages {

    @Handler
    public void handle(String body, Exchange exchange) throws Exception {
        // do some business operation
    }
}
<?xml version="1.0" encoding="UTF-8"?>
<routes xmlns="http://camel.apache.org/schema/spring"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://camel.apache.org/schema/spring http://camel.apache.org/schema/spring/camel-spring.xsd">
    <route id="IncomingRoute">
        <from uri="someSourceURL" />
        <to uri="bean:messagehandler" />
        <log message="Message Body - ${body}" />
    </route>
</routes>
In general terms, thread safety is all about execution. A given method / routine / piece of code is thread-safe if it guarantees that shared data (data structures, etc.) is manipulated in a way that multiple threads cannot corrupt it.
So it really depends on how you structure the execution of a given workflow; and that's the same for any other piece of code you can think of.
Since you are using Apache Camel, take a look at its threading model. If I remember correctly, you have to define your routes / workflows in such a way that they run concurrently, by using parallelProcessing (CC EIP), a custom thread pool, or staged event-driven architecture (SEDA) endpoints; at that point you need to be careful about what you do in the handlers (or any other processors that touch shared data), otherwise you should be OK.
Another thing to think about is how Camel's routing engine routes messages synchronously or asynchronously; be aware that synchronicity and the MEP affect the threading model.
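As a concrete illustration (a sketch only, reusing the placeholder endpoint and bean name from the question), one way to have the @Handler bean invoked from several threads is a SEDA stage with multiple concurrent consumers:
import org.apache.camel.builder.RouteBuilder;

public class ConcurrentHandlerRoute extends RouteBuilder {

    @Override
    public void configure() throws Exception {
        // Hand incoming messages off to an in-memory SEDA queue
        from("someSourceURL")                  // placeholder endpoint from the question, replace with a real component URI
            .to("seda:process");

        // Ten consumer threads pull from the queue and invoke the handler concurrently,
        // so HandleMessages.handle() must be thread-safe with respect to any shared state
        from("seda:process?concurrentConsumers=10")
            .to("bean:messagehandler")
            .log("Message Body - ${body}");
    }
}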

How can I route messages in Spring Integration for which an exception was thrown during processing?

I have the following Integration Flow:
[image: diagram of the integration flow]
If an exception is thrown during the second split method, inside the parser, I would like to route that message to an error channel. Is that possible somehow?
One way is to make the output channel of the splitter an ExecutorChannel or a QueueChannel. That way, every split item is processed in a separate thread, and you can then apply any of the available error-handling options for those async channels.
See the docs for more info: https://docs.spring.io/spring-integration/docs/5.2.0.RELEASE/reference/html/error-handling.html#error-handling
Another option is to use a .gateway() with its errorChannel option downstream of the second splitter, so every item is processed in isolation again.
Also, an ExpressionEvaluatingRequestHandlerAdvice (possibly together with a RequestHandlerRetryAdvice) can be used downstream on the specific endpoint to deal with its own exceptions: https://docs.spring.io/spring-integration/docs/5.2.0.RELEASE/reference/html/messaging-endpoints.html#message-handler-advice-chain
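For example, the first option (an ExecutorChannel after the splitter) might look like the following Java DSL sketch; the channel names, the "parser" bean, and its "parse" method are assumptions, since the actual flow is only shown as a diagram:
import java.util.concurrent.Executors;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.config.EnableIntegration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;

@Configuration
@EnableIntegration
public class ParserFlowConfig {

    @Bean
    public IntegrationFlow parserFlow() {
        return IntegrationFlows.from("itemsChannel")
                .split()
                // ExecutorChannel: each split item is processed on its own thread, so a
                // failure in the parser becomes an ErrorMessage on the error channel
                // instead of aborting the whole split on the caller's thread
                .channel(c -> c.executor(Executors.newFixedThreadPool(4)))
                .handle("parser", "parse")
                .channel("outputChannel")
                .get();
    }
}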

Apache Camel - Prevent Exchange Propagation on Route Policy on Exception

I'm currently working on an application which uses Camel heavily. I'll briefly explain what I'm trying to achieve:
An onException processor (handled=true) catches an exception, and when that happens I want to stop the Exchange from being processed by the RoutePolicy(s) attached to the route. Example with pseudo-code:
<route id="route1" routePolicyRef="policy1, policy2, policy3">
... an exception *e1* is thrown...
</route>
<onException>
<handled>true</handled>
... handle exception *e1*
<bean ref="customExceptionProcessor"/>
</onException>
So in essence, once an Exception is handled I want the Exchange to stop propagating to all the attached Policy(s).
Wondering if there's a simple way to achieve this which I may have missed when reading from the documentation.
That's it really. Cheers.

Spring Integration and Transaction Management - how difficult need it be?

Using Spring Integration, I am trying to build a simple message-producing component. Basically something like this:
<jdbc:inbound-channel-adapter
        channel="from.database"
        data-source="dataSource"
        query="SELECT * FROM my_table"
        update="DELETE FROM my_table WHERE id IN (:id)"
        row-mapper="someRowMapper">
    <int:poller fixed-rate="5000">
        <int:transactional/>
    </int:poller>
</jdbc:inbound-channel-adapter>

<int:splitter
        id="messageProducer"
        input-channel="from.database"
        output-channel="to.mq" />

<jms:outbound-channel-adapter
        channel="to.mq"
        destination="myMqQueue"
        connection-factory="jmsConnectionFactory"
        extract-payload="true" />

<beans:bean id="myMqQueue" class="com.ibm.mq.jms.MQQueue">
    <!-- properties omitted -->
</beans:bean>
The "messageProducer" may produce several messages per poll but not necessarily one per row.
My concern is that I want to make sure that rows are not deleted from my_table unless the messages produced have been committed to the MQ channel.
On the other hand, I will accept that, in case of a DB or network failure, rows are not deleted, causing duplicate messages to be produced. In other words, I will settle for a non-XA, one-phase commit with possible duplicates.
When trying to figure out what I need to put in my Spring configuration, I quickly get lost in endless discussions about transaction managers, AOP and transaction advice chains, which I find difficult to understand - I know I ought to, though.
But I fear that I will spend a lot of time cooking up a configuration that is not really necessary for my problem at hand.
So, my question is: can it be that simple, or do I need to provide explicit configuration for transaction synchronization? And can I do something similar with a jdbc/jms mix?
I'd say "Yes".
Please, read Dave Syer's article about Best effort 1PC, where the ChainedTransactionManager came from.
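If you do want to spell it out, a best-effort 1PC setup can be as small as a ChainedTransactionManager wired to the two resources from your XML (the dataSource and jmsConnectionFactory bean names are taken from the question; everything else here is a sketch, not a definitive recipe):
import javax.jms.ConnectionFactory;
import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.transaction.ChainedTransactionManager;
import org.springframework.jdbc.datasource.DataSourceTransactionManager;
import org.springframework.jms.connection.JmsTransactionManager;

@Configuration
public class BestEffort1PcConfig {

    @Bean
    public JmsTransactionManager jmsTransactionManager(ConnectionFactory jmsConnectionFactory) {
        return new JmsTransactionManager(jmsConnectionFactory);
    }

    @Bean
    public DataSourceTransactionManager jdbcTransactionManager(DataSource dataSource) {
        return new DataSourceTransactionManager(dataSource);
    }

    @Bean
    public ChainedTransactionManager chainedTransactionManager(DataSourceTransactionManager jdbcTm,
            JmsTransactionManager jmsTm) {
        // ChainedTransactionManager starts transactions in the given order and commits
        // them in reverse order: the JMS send commits first, the JDBC DELETE commits last,
        // so a failed JMS commit leaves the rows in place (duplicates later, nothing lost).
        return new ChainedTransactionManager(jdbcTm, jmsTm);
    }
}
The poller would then reference it via <int:transactional transaction-manager="chainedTransactionManager"/>.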

How to wait for completion of multi-threaded publish-subscribe-channel

I have a Spring Integration project where I want to process a message concurrently through multiple actions. So I have set up a publish-subscribe-channel with a task-executor. However I want to wait for all processing to complete before moving on. How would I do this?
<publish-subscribe-channel id="myPubSub" task-executor="my10ThreadPool"/>
<channel id="myOutputChannel"/>
<service-activator input-channel="myPubSub" output-channel="myOutputChannel"
ref="beanA" method="blah"/>
<service-activator input-channel="myPubSub" output-channel="myOutputChannel"
ref="beanB" method="blah"/>
<service-activator id="afterThreadingProcessor" input-channel="myOutputChannel" .../>
So in the above case, I want my afterThreadingProcessor to be invoked only once, after both beanA and beanB have completed their work. However, in the above configuration, afterThreadingProcessor will be invoked twice.
Add apply-sequence="true" to the pub-sub channel (this adds default correlation data to the messages, including correlationId, sequenceSize, and sequenceNumber and allows default strategies to be used on downstream components).
Add an <aggregator/> before afterThreadingProcessor and route the output from the two <service-activator/>s to it.
Add a <splitter/> after the aggregator - the default splitter will split the collection made by the aggregator into two messages.
afterThreadingProcessor will then be invoked once for each of those messages, on the thread of whichever service finishes its work last.
You can make the configuration easier by using a chain...
<chain input-channel="myOutputChannel">
    <aggregator />
    <splitter />
    <service-activator id="afterThreadingProcessor" ... />
</chain>
To make a single call to the final service, just change your service to take a Collection<?> instead of adding the splitter.
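For instance, the single-call variant could look like this (the myOutputChannel name is from the question; the class and method names here are assumptions):
import java.util.Collection;

import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.stereotype.Component;

@Component
public class AfterThreadingProcessor {

    // With the aggregator's output going straight to myOutputChannel (no splitter),
    // this is called exactly once per original message, after both beanA and beanB
    // have finished; the payload is the collection of their two results.
    @ServiceActivator(inputChannel = "myOutputChannel")
    public void handleResults(Collection<?> results) {
        // ... act on the combined results ...
    }
}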
EDIT:
In order to do what you want in comment #3 (run the final service on the original thread), something like this should work...
<int:channel id="foo" />
<int:service-activator ref="twoServicesGateway" input-channel="foo"
output-channel="myOutputChannel" />
<int:gateway id="twoServicesGateway" default-request-channel="myPubSub"/>
<int:publish-subscribe-channel id="myPubSub" task-executor="my10ThreadPool"
apply-sequence="true"/>
<int:service-activator input-channel="myPubSub" output-channel="aggregatorChannel"
ref="beanA" method="blah"/>
<int:service-activator input-channel="myPubSub" output-channel="aggregatorChannel"
ref="beanB" method="blah"/>
<int:aggregator input-channel="aggregatorChannel" />
<int:service-activator id="afterThreadingProcessor" input-channel="myOutputChannel" .../>
In this case, the gateway encapsulates the two other services and the aggregator; the default service-interface is a simple RequestReplyExchanger. The calling thread will wait for the output. Since the aggregator has no output-channel the framework will send the reply to the gateway, and the waiting thread will receive it, return to the <service-activator/> and the result will then be sent to the final service.
You would probably want to put a reply-timeout on the gateway because, by default, it will wait indefinitely, and if one of the services returns null, no aggregated response will ever be received.
Note that I indented the gateway flow just to show it runs from the gateway, they are NOT child elements of the gateway.
The same kind of behavior can now be achieved using a cleaner approach, introduced in Spring Integration 4.1.0 as an implementation of the Scatter-Gather EIP.
Check out the Scatter-Gather example gist:
https://gist.github.com/noorulhaq/b13a19b9054941985109
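For reference, a minimal scatter-gather sketch with the Java DSL might look like the following; beanA, beanB, blah, my10ThreadPool and afterThreadingProcessor come from the question, while the input channel and the "process" method name are assumptions:
import java.util.concurrent.Executor;
import java.util.concurrent.Executors;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.integration.config.EnableIntegration;
import org.springframework.integration.dsl.IntegrationFlow;
import org.springframework.integration.dsl.IntegrationFlows;

@Configuration
@EnableIntegration
public class ScatterGatherConfig {

    @Bean
    public Executor my10ThreadPool() {
        return Executors.newFixedThreadPool(10);
    }

    @Bean
    public IntegrationFlow scatterGatherFlow() {
        return IntegrationFlows.from("myInputChannel")
                .scatterGather(
                        scatterer -> scatterer
                                .applySequence(true)
                                // each service runs on the thread pool, as with the original task-executor channel
                                .recipientFlow(m -> true, f -> f
                                        .channel(c -> c.executor(my10ThreadPool()))
                                        .handle("beanA", "blah"))
                                .recipientFlow(m -> true, f -> f
                                        .channel(c -> c.executor(my10ThreadPool()))
                                        .handle("beanB", "blah")),
                        // release the group once both replies have arrived
                        gatherer -> gatherer.releaseStrategy(group -> group.size() == 2))
                // invoked once per original message, with the gathered results as the payload
                .handle("afterThreadingProcessor", "process")
                .get();
    }
}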
