spring integration dynamic header - java

In a Spring Integration HTTP request I want to add a dynamic header parameter based on the logged-in user. Say, for example, user "A" is logged in and makes the HTTP request: one additional header should then be added dynamically. For every other user the header should be absent, i.e. not even the key should be sent (its value would be null).
For A user
<int:gateway id="requestGateway" service-interface="net.group.gateway.Gateway" default-request-channel="jsonTransformationChannel">
<int:default-header name="X-MW-LOGGEDID" expression="#requestData.getLoggedID()" />
<int:default-header name="X-Srcvalue" value="56789" />
<int:default-header name="content-type" value="application/json" />
<int:default-header name="Accept" value="application/json" />
</int:gateway>
For Other user
<int:gateway id="requestGateway" service-interface="net.group.gateway.Gateway" default-request-channel="jsonTransformationChannel">
<int:default-header name="X-MW-LOGGEDID" expression="#requestData.getLoggedID()" />
<int:default-header name="content-type" value="application/json" />
<int:default-header name="Accept" value="application/json" />
</int:gateway>

I guess the story is really about that X-Srcvalue header. And since you say a null is OK for anyone else, it would be better to use an expression instead of a value:
<int:default-header name="X-Srcvalue" expression="USER == A ? 56789 : null" />
In the expression you can use any bean from the application context to perform logic of any complexity.
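For instance, a sketch of delegating the decision to a bean; note that `userService` and its `isUserA(...)` method are hypothetical names for illustration, not part of the original configuration:

```xml
<!-- "userService" and isUserA(...) are hypothetical; the expression only has to
     evaluate to the header value, or to null to leave the header out -->
<int:default-header name="X-Srcvalue"
    expression="@userService.isUserA(#requestData.getLoggedID()) ? 56789 : null" />
```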


Append header value in SpringIntegration

Assigning a value via url-expression works (below works):
<bean id="requestData" class="net.model.RequestData"/>
<int-http:outbound-gateway request-channel="requestChannel"
url-expression="#requestData.getCompleteUrl()" http-method-expression="#requestData.getRequestMethod()"
expected-response-type="java.lang.String" header-mapper="headerMapper"
charset="UTF-8" reply-timeout="5000" reply-channel="responseChannel">
</int-http:outbound-gateway>
But when I try to assign a value to a header, it throws an exception (below does not work):
<int:gateway id="requestGateway" service-interface="net.model.RequestData" default-request-channel="jsonTransformationChannel">
<int:default-header name="X-MW-getId" value="#requestData.getId()" />
<int:default-header name="X-Srcsys" value="ttsim" />
<int:default-header name="content-type" value="application/json" />
<int:default-header name="Accept" value="application/json" />
</int:gateway>
And I tried the lines below, but it didn't work:
<int:default-header name="X-MW-getId" expression="#requestData.id" />
This one must work:
<int:default-header name="X-MW-getId" expression="#requestData.id" />
It must definitely be the expression attribute, and to access the requestData bean from there you need the # prefix. The getter can indeed be used as a property accessor.
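Putting the fix together, the gateway from the question would look like this (a sketch; only the X-MW-getId line changes, from value to expression):

```xml
<int:gateway id="requestGateway" service-interface="net.model.RequestData"
             default-request-channel="jsonTransformationChannel">
    <int:default-header name="X-MW-getId" expression="#requestData.id" />
    <int:default-header name="X-Srcsys" value="ttsim" />
    <int:default-header name="content-type" value="application/json" />
    <int:default-header name="Accept" value="application/json" />
</int:gateway>
```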

camel cxf - how to send soap request response in email

Here is my spring configuration.
Spring.xml
-------------
<!-- Outgoing SOAP client endpoint -->
<cxf:cxfEndpoint id="serviceEndpoint" address="${endpointAddress}"
wsdlURL="${wsdlAddress}" endpointName="${portName}" serviceName="${serviceName}">
<!-- The interceptors - needed to log the SOAP requests and responses -->
<!-- They can be removed, when no logging is needed -->
<cxf:inInterceptors>
<ref bean="loggingInInterceptor" />
</cxf:inInterceptors>
<cxf:outInterceptors>
<ref bean="loggingOutInterceptor" />
</cxf:outInterceptors>
<cxf:outFaultInterceptors>
<ref bean="loggingOutInterceptor" />
</cxf:outFaultInterceptors>
<cxf:inFaultInterceptors>
<ref bean="loggingInInterceptor" />
</cxf:inFaultInterceptors>
<cxf:properties>
<entry key="dataFormat" value="PAYLOAD" />
</cxf:properties>
</cxf:cxfEndpoint>
<http:conduit name="*.http-conduit">
<http:tlsClientParameters disableCNCheck="${disableHostnameCheck}">
<sec:keyManagers keyPassword="${keystorePassword}">
<sec:keyStore type="JKS" password="${keystorePassword}"
file="${keystoreLocation}" />
</sec:keyManagers>
<sec:trustManagers>
<sec:keyStore type="JKS" password="${truststorePassword}"
file="${truststoreLocation}" />
</sec:trustManagers>
<sec:cipherSuitesFilter>
<!-- these filters ensure that a ciphersuite with export-suitable or
null encryption is used, but exclude anonymous Diffie-Hellman key exchange
as this is vulnerable to man-in-the-middle attacks -->
<sec:include>.*_EXPORT_.*</sec:include>
<sec:include>.*_EXPORT1024_.*</sec:include>
<sec:include>.*_WITH_DES_.*</sec:include>
<sec:include>.*_WITH_AES_.*</sec:include>
<sec:include>.*_WITH_NULL_.*</sec:include>
<sec:exclude>.*_DH_anon_.*</sec:exclude>
</sec:cipherSuitesFilter>
</http:tlsClientParameters>
<http:client AutoRedirect="true" Connection="Keep-Alive"
ReceiveTimeout="${connectionTimeout}" ConnectionTimeout="${connectionTimeout}" />
</http:conduit>
Here is the Camel route configuration
-------------
<from ...
<to uri="cxf:bean:serviceEndpoint" />
This works well, and the SOAP request/response is logged to the log file. The SOAP request, including its SOAP header, is generated by CXF.
Is there a way to capture the SOAP request and response in the Camel Exchange? I need to send an email with the SOAP request and response attached when the service call fails.
I have tried using a ThreadLocal, but it doesn't seem to work as expected.
Suggestion:
You already have the messages in the CXF interceptors; take a look at them.
I guess you can send your e-mail from there.
Start from the org.apache.cxf.phase.AbstractPhaseInterceptor class; there is a bunch of different subclasses for the different phases.
P.S. At first glance, org.apache.cxf.binding.soap.saaj.SAAJInInterceptor and
org.apache.cxf.binding.soap.saaj.SAAJOutInterceptor look like good candidates...
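As a sketch, the SAAJ interceptors could be registered next to the existing logging interceptors; the `soapCaptureInterceptor` bean here is hypothetical (its implementation is not shown and it is not part of the original configuration), standing in for a custom interceptor that would read the SAAJ message and trigger the e-mail:

```xml
<cxf:cxfEndpoint id="serviceEndpoint" address="${endpointAddress}"
    wsdlURL="${wsdlAddress}" endpointName="${portName}" serviceName="${serviceName}">
    <cxf:inInterceptors>
        <ref bean="loggingInInterceptor" />
        <bean class="org.apache.cxf.binding.soap.saaj.SAAJInInterceptor" />
        <ref bean="soapCaptureInterceptor" /> <!-- hypothetical custom interceptor -->
    </cxf:inInterceptors>
    <cxf:outInterceptors>
        <ref bean="loggingOutInterceptor" />
        <bean class="org.apache.cxf.binding.soap.saaj.SAAJOutInterceptor" />
        <ref bean="soapCaptureInterceptor" /> <!-- hypothetical custom interceptor -->
    </cxf:outInterceptors>
</cxf:cxfEndpoint>
```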

Can we pass a message directly from seda:queue to direct:endpoint in camel?

I am using the configuration below:
<camel:camelContext>
<camel:template id="camelProcessTemplate" />
<camel:endpoint id="asyncEndpoint" uri="seda:asyncQueue" />
<camel:endpoint id="calcWeightEndpoint" uri="direct:calculateWeightIn" />
<camel:route id="route1">
<camel:from ref="..." />
<camel:to ref="asyncEndpoint" />
</camel:route>
<camel:route id="route2">
<camel:from ref="asyncEndpoint" />
<camel:to ref="calcWeightEndpoint" />
</camel:route>
<camel:route id="route3">
<camel:from ref="calcWeightEndpoint" />
<camel:process ref="..." />
<camel:to ref="..." />
</camel:route>
</camel:camelContext>
The message enters route2 but does not get passed on to route3.
You should use the uri attributes instead of the ref attributes in your froms and tos:
<camel:to uri="direct:calculateWeightIn" />
Also, it's possible that your processor in route3 is failing and not logging properly. Maybe add a <camel:log /> step between the from and the process.
EDIT 1: OP updated the question to correct their code block.
Could you please add a <camel:log /> between the from and the process, to verify whether the message is arriving in route3? The problem may be in your processor, as I said above.
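For reference, a sketch of route2 and route3 rewritten with uri attributes and a log step, based on the endpoints defined in the question (the processor and final to of route3 are elided, as in the original):

```xml
<camel:route id="route2">
    <camel:from uri="seda:asyncQueue" />
    <camel:to uri="direct:calculateWeightIn" />
</camel:route>
<camel:route id="route3">
    <camel:from uri="direct:calculateWeightIn" />
    <camel:log message="route3 received: ${body}" />
    <!-- process and to steps as in the original route3 -->
</camel:route>
```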

Mix tcp-connection-factory + tcp-inbound-gateway + router

Can I send distinct messages (with distinct serialization/deserialization) to the same TCP server (same host and port), and differentiate the tcp-inbound-gateway by some value of the header or payload, using a router?
(...)
I want to add a router that selects the correct tcp-inbound-gateway depending on some field in the header or payload (for example one named typedata). In what order would the request (or message) pass between the router, tcp-inbound-gateway, tcp-connection-factory, and serializer/deserializer? Will I have problems with serialization/deserialization when choosing the tcp-inbound-gateway? What is the correct way to do this?
Thanks in advance.
EDIT
In server:
<!-- Common -->
<int:channel id="channelConnectionFactoryRequest" />
<int:channel id="channelConnectionFactoryResponse" />
<int:router method="getDestinationChannel"
input-channel="channelConnectionFactoryRequest">
<beans:bean class="org.mbracero.integration.CustomRouter" />
</int:router>
<beans:bean id="customSerializerDeserializer"
class="org.mbracero.integration.serialization.CustomSerializerDeserializer" />
<int-ip:tcp-connection-factory id="customConnectionFactory"
type="server" port="${payment.port}" single-use="true"
serializer="customSerializerDeserializer"
deserializer="customSerializerDeserializer" />
<int-ip:tcp-inbound-gateway id="customInboundGateway"
connection-factory="customConnectionFactory"
request-channel="channelConnectionFactoryRequest"
reply-channel="channelConnectionFactoryResponse"
error-channel="errorChannel" />
<!-- Custom -->
<beans:bean id="operations"
class="org.mbracero.integration.applepay.impl.OperationsImpl" />
<!-- Operation One -->
<int:channel id="operationOneRequest" />
<int:service-activator input-channel="operationOneRequest"
output-channel="operationOneResponse" ref="operations" method="getOperationOne" />
<!-- Operation Two -->
<int:channel id="operationTwoRequest" />
<int:service-activator input-channel="operationTwoRequest"
output-channel="operationTwoResponse" ref="operations" method="getOperationTwo" />
OperationsImpl:
ResultOne getOperationOne(RequestOne request);
ResultTwo getOperationTwo(RequestTwo request);
ResultOne and ResultTwo implement ResultBase. In the serialize method of customSerializerDeserializer I have:
@Override
public void serialize(ResultBase arg0, OutputStream arg1) throws IOException {
    byte[] xxx = XXX.getBytes();
    arg1.write(xxx);
    byte[] yyy = YYY.getBytes();
    arg1.write(yyy);
    // Each custom object has a method for serializing its own data
    arg0.transformDataToByte(arg1);
    arg1.flush();
}
In client:
<gateway id="tmGateway"
service-interface="org.mbracero.integration.CustomGateway" />
<beans:bean id="operationOneSerializerDeserializer"
class="org.mbracero.integration.serialization.OperationOneSerializerDeserializer" />
<int-ip:tcp-connection-factory id="operationOneFactory"
type="client" host="127.0.0.1" port="7878" single-use="true"
serializer="operationOneSerializerDeserializer" deserializer="operationOneSerializerDeserializer" />
<int-ip:tcp-outbound-gateway id="operationOneOutGateway"
request-channel="operationOneChannel" connection-factory="operationOneFactory"
request-timeout="5000" reply-timeout="5000" remote-timeout="5000" />
<beans:bean id="operationTwoSerializerDeserializer"
class="org.mbracero.integration.operationTwoRequestSerializerDeserializer"/>
<int-ip:tcp-connection-factory id="operationTwoFactory"
type="client" host="127.0.0.1" port="7878" single-use="true"
serializer="operationTwoSerializerDeserializer"
deserializer="operationTwoSerializerDeserializer" />
<int-ip:tcp-outbound-gateway id="operationTwoOutGateway"
request-channel="operationTwoChannel" connection-factory="operationTwoFactory"
request-timeout="50000" reply-timeout="50000" remote-timeout="50000" />
CustomGateway:
@Gateway(requestChannel="operationOneChannel")
OperationOneResponse sendOperationOne(OperationOneRequest request);
@Gateway(requestChannel="operationTwoChannel")
OperationTwoResponse sendOperationTwo(OperationTwoRequest request);
You cannot have 2 server connection factories listening on the same port. TCP doesn't allow it - the network stack wouldn't know which server socket to route the request to.
There's no problem on the client side but, with a single socket, the server would have to understand how to deserialize both data types.
It's probably easier to combine the serializers/deserializers into one on both sides (add another header to the message so the deserializer knows what type of payload to decode).
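The "add another header" idea can be sketched in plain Java, independent of the Spring Integration serializer interfaces: write a one-byte type tag and a length prefix in front of each payload so a single combined deserializer can tell the two operation types apart. The class and tag names here are illustrative, not from the original code:

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

// Hypothetical combined serializer: a 1-byte type tag plus a length prefix per frame
public class TaggedSerializer {
    public static final byte TYPE_ONE = 1;
    public static final byte TYPE_TWO = 2;

    /** Holds a decoded type tag and the raw payload bytes. */
    public static class Tagged {
        public final byte type;
        public final byte[] payload;
        public Tagged(byte type, byte[] payload) {
            this.type = type;
            this.payload = payload;
        }
    }

    public void serialize(byte type, byte[] payload, OutputStream out) throws IOException {
        DataOutputStream dos = new DataOutputStream(out);
        dos.writeByte(type);          // which operation this frame carries
        dos.writeInt(payload.length); // length prefix so the reader knows where the frame ends
        dos.write(payload);
        dos.flush();
    }

    public Tagged deserialize(InputStream in) throws IOException {
        DataInputStream dis = new DataInputStream(in);
        byte type = dis.readByte();
        byte[] payload = new byte[dis.readInt()];
        dis.readFully(payload);
        return new Tagged(type, payload);
    }
}
```

The server-side deserializer would switch on the tag to build a RequestOne or RequestTwo, and the router could then use the decoded type instead of guessing from the bytes.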

Several data sources in one spring integration pipeline?

I have a configured Spring Integration pipeline where XML files are parsed into various objects. The objects pass through several channel endpoints where they are slightly modified; nothing special, just some properties added.
The last endpoint of the pipeline is the persister, where the objects are persisted to the DB. There might be duplicates, so this endpoint also checks whether the object has already been persisted or is a new one.
I use a message-driven architecture with simple direct channels.
<int:channel id="parsedObjects1" />
<int:channel id="parsedObjects2" />
<int:channel id="processedObjects" />
<int:service-activator input-channel="parsedObjects1" ref="processor1" method="process" />
<int:service-activator input-channel="parsedObjects2" ref="processor2" method="process" />
<int:service-activator input-channel="processedObjects" ref="persister" method="persist" />
At the moment there is only one data source from which I get XML files, and everything goes smoothly. The problems begin when I attach a second data source. The files arrive at the same time, so I want them processed in parallel. I've therefore placed two parser instances, and each parser sends messages through the pipeline.
The configuration with the direct channels that I have creates concurrency problems, so I've tried modifying it. I've tried several configurations from the Spring Integration documentation, but so far with no success.
I've tried a dispatcher configured with a maximum pool size of 1, i.e. one thread per message in every channel endpoint:
<task:executor id="channelTaskExecutor" pool-size="1-1" keep-alive="10" rejection-policy="CALLER_RUNS" queue-capacity="1" />
<int:channel id="parsedObjects1" >
<int:dispatcher task-executor="channelTaskExecutor" />
</int:channel>
<int:channel id="parsedObjects2" >
<int:dispatcher task-executor="channelTaskExecutor" />
</int:channel>
<int:channel id="processedObjects" >
<int:dispatcher task-executor="channelTaskExecutor" />
</int:channel>
I have tried the queue-poller configuration also:
<task:executor id="channelTaskExecutor" pool-size="1-1" keep-alive="10" rejection-policy="CALLER_RUNS" queue-capacity="1" />
<int:channel id="parsedObjects1" >
<int:rendezvous-queue/>
</int:channel>
<int:channel id="parsedObjects2" >
<int:rendezvous-queue/>
</int:channel>
<int:channel id="processedObjects" >
<int:rendezvous-queue/>
</int:channel>
<int:service-activator input-channel="parsedObjects1" ref="processor1" method="process" >
<int:poller task-executor="channelTaskExecutor" max-messages-per-poll="1" fixed-rate="2" />
</int:service-activator>
<int:service-activator input-channel="parsedObjects2" ref="processor2" method="process" >
<int:poller task-executor="channelTaskExecutor" max-messages-per-poll="1" fixed-rate="2" />
</int:service-activator>
<int:service-activator input-channel="processedObjects" ref="persister" method="persist" >
<int:poller task-executor="channelTaskExecutor" max-messages-per-poll="1" fixed-rate="2" />
</int:service-activator>
Basically, I want to get rid of any race conditions in the channel endpoints, in my case in the persister. The persister endpoint should block for every message, because if it runs in parallel I get many duplicates persisted in the DB.
EDIT:
After some debugging, it seems the problems lie in the endpoint logic rather than in the configuration. Some of the objects sent through the pipeline to the persister are also stored in a local cache until parsing of the file is done; they are later sent through the pipeline again to persist some join tables as part of other domain entities. With the above configurations it can happen that some objects have not yet been persisted when they are sent through the pipeline the second time, so I end up with duplicates in the DB.
I'm fairly new to Spring Integration, so at this point I will ask more general questions. In a setup with multiple data sources, meaning multiple instances of parsers etc.:
Is there a common (best) way to configure the pipeline to enable parallelization?
If needed, is there a way to serialize the message handling?
Any suggestions are welcome. Thanks in advance.
First, can you describe what the "concurrency problems" are? Ideally you would not need to serialize the message handling, so that would be a good place to start.
Second, the thread pool as you've configured it will not completely serialize. You have one thread available in the pool, but the rejection policy you've chosen makes the caller thread run the task itself (basically throttling) when the queue is at capacity. That means a caller-run task can execute concurrently with the one on the pool thread.
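That caller-runs behaviour can be demonstrated with a plain ThreadPoolExecutor configured like the XML (one thread, queue capacity 1, caller-runs rejection); the class name is just for illustration:

```java
import java.util.Set;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class CallerRunsDemo {
    /** Returns true if a task ended up running on the submitting (caller) thread. */
    public static boolean callerRanTask() throws InterruptedException {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 1, 10, TimeUnit.SECONDS,
                new ArrayBlockingQueue<>(1),                // queue-capacity="1"
                new ThreadPoolExecutor.CallerRunsPolicy()); // rejection-policy="CALLER_RUNS"
        CountDownLatch release = new CountDownLatch(1);
        Set<String> taskThreads = ConcurrentHashMap.newKeySet();
        Runnable record = () -> taskThreads.add(Thread.currentThread().getName());

        pool.execute(() -> {          // occupies the single pool thread
            try { release.await(); } catch (InterruptedException ignored) { }
        });
        pool.execute(record);         // fills the queue (capacity 1)
        pool.execute(record);         // rejected: runs synchronously on the caller thread

        release.countDown();
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.SECONDS);
        return taskThreads.contains(Thread.currentThread().getName());
    }
}
```

So with this configuration two copies of an endpoint's handler can run at the same time, which is exactly what the persister must avoid.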
The best way I can think of for your scenario would be along these lines:
Make parsedObjects1 and parsedObjects2 normal queue channels; the capacity of the queue can be set appropriately (say 25 at any time):
<int:channel id="parsedObjects1" >
<int:queue />
</int:channel>
At this point your XML processors on the two channels, parsedObjects1 and parsedObjects2, process the XMLs and should output to the processedObjects channel. You can use a configuration similar to what you have, except that I have explicitly specified the processedObjects output channel:
<int:service-activator input-channel="parsedObjects1" ref="processor1" method="process" output-channel="processedObjects">
<int:poller task-executor="channelTaskExecutor"/>
</int:service-activator>
The third step is where I deviate from your configuration. You said you want to serialize the persistence; the best way would be to run it through a DIFFERENT task executor with a pool size of 1, so that only one instance of your persister is running at any point in time:
<task:executor id="persisterpool" pool-size="1"/>
<int:service-activator input-channel="processedObjects" ref="persister" method="persist" >
<int:poller task-executor="persisterpool" fixed-delay="2"/>
</int:service-activator>
I managed to get the pipeline working. I'm not sure if I'll keep the current configuration or experiment some more, but for now this is the configuration I ended up with:
<task:executor id="channelTaskExecutor" pool-size="1-1" keep-alive="10" rejection-policy="CALLER_RUNS" queue-capacity="1" />
<int:channel id="parsedObjects1" >
<int:queue capacity="1000" />
</int:channel>
<int:channel id="parsedObjects2" >
<int:queue capacity="1000" />
</int:channel>
<int:channel id="processedObjects" >
<int:queue capacity="1000" />
</int:channel>
<int:service-activator input-channel="parsedObjects1" ref="processor1" method="process" >
<int:poller task-executor="channelTaskExecutor" max-messages-per-poll="100" fixed-rate="2" />
</int:service-activator>
<int:service-activator input-channel="parsedObjects2" ref="processor2" method="process" >
<int:poller task-executor="channelTaskExecutor" max-messages-per-poll="100" fixed-rate="2" />
</int:service-activator>
<int:service-activator input-channel="processedObjects" ref="persister" method="persist" >
<int:poller task-executor="channelTaskExecutor" max-messages-per-poll="1" fixed-rate="2" />
</int:service-activator>
