Spring Integration - MongoDB inbound channel reading the same data - java

I need to query data from MongoDB using Spring Integration. I am able to query the data, but the same documents are returned more than once.
<bean id="mongoDBFactory"
class="org.springframework.data.mongodb.core.SimpleMongoDbFactory">
<constructor-arg name="mongo">
<bean class="com.mongodb.Mongo">
<constructor-arg name="host" value="localhost" />
<constructor-arg name="port" value="27017" />
</bean>
</constructor-arg>
<constructor-arg name="databaseName" value="test" />
</bean>
<int:channel id="controlChannel"/>
<int:control-bus input-channel="controlChannel"/>
<int-mongodb:inbound-channel-adapter
id="mongoInboundAdapter" channel="splittingChannel" auto-startup= "false"
query="{_id:1}"
collection-name="order"
mongodb-factory="mongoDBFactory">
<int:poller fixed-rate="10000" max-messages-per-poll="1000"/>
</int-mongodb:inbound-channel-adapter>
<int:splitter input-channel="splittingChannel" output-channel="logger"/>
<int:logging-channel-adapter id="logger" level="WARN"/>
I am using the control channel to start and stop the adapter.
Please help me: how can I stop the inbound-channel-adapter once the query is complete?
Thanks in advance.

I suggest you use a transaction-synchronization-factory to modify or remove the documents after they are polled, instead of stopping the adapter. See the Reference Manual for more information.
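A minimal sketch of that approach, following the pattern in the reference manual. The `documentCleaner` bean is hypothetical: you would implement it yourself to delete (or flag) the polled documents so they are not re-read on the next poll; the `PseudoTransactionManager` just enables the synchronization callbacks without a real transaction:

```xml
<int-mongodb:inbound-channel-adapter
    id="mongoInboundAdapter" channel="splittingChannel"
    query="{_id:1}" collection-name="order"
    mongodb-factory="mongoDBFactory">
    <int:poller fixed-rate="10000" max-messages-per-poll="1000">
        <int:transactional synchronization-factory="syncFactory"
            transaction-manager="txManager"/>
    </int:poller>
</int-mongodb:inbound-channel-adapter>

<int:transaction-synchronization-factory id="syncFactory">
    <!-- after a successful poll, hand the payload to your own bean
         so the same documents are not returned again -->
    <int:after-commit expression="@documentCleaner.remove(payload)"/>
</int:transaction-synchronization-factory>

<bean id="txManager"
    class="org.springframework.integration.transaction.PseudoTransactionManager"/>
```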

Related

How to consume AWS SQS message with more than just String in header?

I have a Spring Integration project configured to consume messages from an AWS SQS queue. When a message with only String headers and a simple body is produced, the processor consumes it quickly.
But when I try to consume a message that has, let's suppose, a Binary.xml entry inside its header attributes, the message goes in flight but is never consumed. What's the matter with Spring Integration? Can someone help me?
I've tried using a wire tap to see detailed logs (enabling logging in the Spring Integration utility), but nothing was shown.
<int:channel id="in">
<int:interceptors>
<int:wire-tap channel="logger"/>
</int:interceptors>
</int:channel>
<int:logging-channel-adapter id="logger" level="TRACE"/>
My connectionFactory bean:
<bean id="basicAWSCredential" class="com.amazonaws.auth.BasicAWSCredentials"
primary="true">
<constructor-arg value="${awsAccessKey}"/>
<constructor-arg value="${awsSecretKey}"/>
</bean>
<bean id="AWSCredentialProvider" class="com.amazonaws.auth.AWSStaticCredentialsProvider"
primary="true">
<constructor-arg index="0" ref="basicAWSCredential"></constructor-arg>
</bean>
<bean id="AWSClientBuilder" class="com.amazonaws.services.sqs.AmazonSQSClientBuilder"
factory-method="standard" primary="true">
<property name="region" value="${awsRegion}"/>
<property name="credentials" ref="AWSCredentialProvider"/>
</bean>
<bean id="SQSProviderConfiguration" class="com.amazon.sqs.javamessaging.ProviderConfiguration">
<property name="numberOfMessagesToPrefetch" value="20"/>
</bean>
<bean id="connectionFactory" class="com.amazon.sqs.javamessaging.SQSConnectionFactory">
<constructor-arg ref="SQSProviderConfiguration"/>
<constructor-arg ref="AWSClientBuilder"/>
</bean>
My consumer bean:
<int-jms:message-driven-channel-adapter
channel="RouteByFileCompressionChannel" connection-factory="connectionFactory"
destination-name="MyQueueName" concurrent-consumers="10"
acknowledge="client" max-concurrent-consumers="20" id="SqsInboundAdapter" />
I found out that the problem was a custom type in one of the headers.
The Binary.xml custom type couldn't be parsed by the Spring Integration JMS adapter, so nothing happened when I produced a message with this type. I don't know who produced that header, but I have now fixed all the producers across the application.
Thanks a lot to those who helped me here.
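As a defensive measure on the consumer side, the adapter's `header-mapper` attribute can point at a custom `JmsHeaderMapper` that drops (or stringifies) properties it cannot map, so one bad header does not stall consumption. `com.example.SafeHeaderMapper` below is a hypothetical class you would write yourself, e.g. by delegating to `DefaultJmsHeaderMapper` inside a try/catch:

```xml
<bean id="safeHeaderMapper" class="com.example.SafeHeaderMapper"/>

<int-jms:message-driven-channel-adapter
    channel="RouteByFileCompressionChannel" connection-factory="connectionFactory"
    destination-name="MyQueueName" concurrent-consumers="10"
    acknowledge="client" max-concurrent-consumers="20"
    header-mapper="safeHeaderMapper" id="SqsInboundAdapter"/>
```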

Re Read file when using int-sftp:inbound-channel-adapter

I have an int-sftp:inbound-channel-adapter which uses SftpPersistentAcceptOnceFileListFilter as part of a composite filter. Reading the documentation/source code, it should accept a file to be read again if its modified datetime has changed, but I can't get it to work; it only reads the file once. I'm using Redis as the store.
Any ideas what is wrong with the configuration? I'm using Spring Integration 4.3.5.
<int-sftp:inbound-channel-adapter id="sftpInboundAdapterCensus"
channel="sftpInboundCensus"
session-factory="sftpSessionFactory"
local-directory="${sftp.localdirectory}/census-local"
filter="censusCompositeFilter"
remote-file-separator="/"
remote-directory="${sftp.directory.census}">
<int:poller cron="${sftp.cron}" max-messages-per-poll="1" error-channel="pollerErrorChannel"/>
</int-sftp:inbound-channel-adapter>
<bean id="censusCompositeFilter"
class="org.springframework.integration.file.filters.CompositeFileListFilter">
<constructor-arg>
<list>
<bean class="org.springframework.integration.sftp.filters.SftpSimplePatternFileListFilter">
<constructor-arg value="*.xml" />
</bean>
<bean id="SftpPersistentAcceptOnceFileListFilter" class="org.springframework.integration.sftp.filters.SftpPersistentAcceptOnceFileListFilter">
<constructor-arg ref="metadataStore" />
<constructor-arg value="censusSampleLock_" />
</bean>
</list>
</constructor-arg>
</bean>
The SftpPersistentAcceptOnceFileListFilter only controls what we fetch from the server. You also need a FileSystemPersistentAcceptOnceFileListFilter in the local-filter (which determines which of the fetched files end up being emitted as messages). The local filter is an AcceptOnceFileListFilter by default.
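A sketch of what that could look like against the configuration above, reusing the same `metadataStore` bean (the `censusLocalLock_` key prefix is illustrative):

```xml
<int-sftp:inbound-channel-adapter id="sftpInboundAdapterCensus"
    channel="sftpInboundCensus"
    session-factory="sftpSessionFactory"
    local-directory="${sftp.localdirectory}/census-local"
    filter="censusCompositeFilter"
    local-filter="censusLocalFilter"
    remote-file-separator="/"
    remote-directory="${sftp.directory.census}">
    <int:poller cron="${sftp.cron}" max-messages-per-poll="1" error-channel="pollerErrorChannel"/>
</int-sftp:inbound-channel-adapter>

<!-- re-emits a local file when its last-modified timestamp changes -->
<bean id="censusLocalFilter"
    class="org.springframework.integration.file.filters.FileSystemPersistentAcceptOnceFileListFilter">
    <constructor-arg ref="metadataStore"/>
    <constructor-arg value="censusLocalLock_"/>
</bean>
```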

Spring JMS Template

I'm really struggling to get my Spring JMS template to work and send messages to a queue. Here's what I've attempted:
In my XML:
<bean name="jmsTemplate" class="org.springframework.jms.core.JmsTemplate">
<constructor-arg ref="mqQueueConnectionFactory" />
<property name="defaultDestination" ref="mqQueue" />
</bean>
<bean name="mqQueue" class="com.ibm.mq.jms.MQQueue">
<constructor-arg value="${MQ_QUEUE_MANAGER_NAME}" />
<constructor-arg value="${MQ_QUEUE_NAME}" />
</bean>
<bean name="mqQueueConnectionFactory" class="com.ibm.mq.jms.MQXAQueueConnectionFactory">
<property name="hostName" value="${MQ_HOST_NAME}" />
<property name="channel" value="${MQ_CHANNEL}" />
<property name="port" value="${MQ_PORT}" />
<property name="queueManager" value="${MQ_QUEUE_MANAGER_NAME}" />
<property name="transportType" ref="wmq_cl_binding" />
</bean>
So those are my beans for setting up the template/queue.
Now I setup a listener and jmsContainer:
<bean id="messageListener" class="CloseoutListener" />
<bean id="jmsContainer"
class="org.springframework.jms.listener.DefaultMessageListenerContainer">
<property name="connectionFactory" ref="mqQueueConnectionFactory" />
<property name="destination" ref="mqQueue" />
<property name="messageListener" ref="messageListener" />
</bean>
My implementation of CloseoutListener is the same as the one in the Spring JMS docs: Listener
In addition, I am trying to send a message the same way Spring sends one in the docs: Sender
Full disclosure: this is my first time using queues or any sort of JMS, and only my second time using Spring, so I'm aware this may be sloppy or just plain wrong. That's why I'm asking for assistance.
No message appears in the queue, and in addition I'm getting this message in my logs:
INFO DefaultMessageListenerContainer.handleListenerSetupFailure :825 - JMS message listener invoker needs to establish shared Connection
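That INFO line is often just the listener container (re)establishing its connection rather than an error. Independently of that, one common refinement, sketched below against the beans above, is to wrap the connection factory in a CachingConnectionFactory so the JmsTemplate does not open a fresh connection and session for every send (the `sessionCacheSize` value is illustrative):

```xml
<bean id="cachingConnectionFactory"
    class="org.springframework.jms.connection.CachingConnectionFactory">
    <property name="targetConnectionFactory" ref="mqQueueConnectionFactory"/>
    <property name="sessionCacheSize" value="10"/>
</bean>

<bean name="jmsTemplate" class="org.springframework.jms.core.JmsTemplate">
    <constructor-arg ref="cachingConnectionFactory"/>
    <property name="defaultDestination" ref="mqQueue"/>
</bean>
```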

Spring integration with ftp skip files

I am using Spring Integration to connect to an FTP server and download files.
I have two filters: one by file name, and another that accepts each file only once, backed by Redis.
For the most part it works great; however, I notice two issues:
1. Some files are skipped and not downloaded at all.
2. Some files start being written but stop before they finish, and are left with the .writing temporary file extension. I suspect this occurs when I restart my service or when the connection to the FTP server is lost.
Below is my configuration for an SFTP connection, but I also have two more vendors, one using FTP and the other FTPS, with the same problem.
<bean id="eeSftpClientFactory" class="org.springframework.integration.sftp.session.DefaultSftpSessionFactory">
<property name="host" value="ftp.host.com"/>
<property name="port" value="22"/>
<property name="user" value="myUserName"/>
<property name="password" value="myPassword"/>
</bean>
<bean id="eeFilesFilter" class="org.springframework.integration.file.filters.CompositeFileListFilter">
<constructor-arg>
<list>
<bean class="org.springframework.integration.sftp.filters.SftpPersistentAcceptOnceFileListFilter">
<constructor-arg ref="redisMetadataStore"/>
<constructor-arg value=""/>
</bean>
<bean class="org.springframework.integration.sftp.filters.SftpSimplePatternFileListFilter">
<constructor-arg value="*.nm4"/>
</bean>
</list>
</constructor-arg>
</bean>
<int-sftp:inbound-channel-adapter id="eeChannelAdapter"
channel="eeFtpChannel"
session-factory="eeSftpClientFactory"
auto-startup="${ais.feeds.ee.enabled}"
auto-create-local-directory="true"
delete-remote-files="false"
remote-directory="/SAISData/"
filter="eeFilesFilter"
local-directory="${ais.feeds.base.path}/eeVendor">
<int:poller fixed-delay="500" max-messages-per-poll="-1"/>
</int-sftp:inbound-channel-adapter>
<int:channel id="eeFtpChannel">
<int:queue capacity="500"/>
</int:channel>
<int:service-activator id="eeFeedHandlerActivator"
input-channel="eeFtpChannel"
ref="eeFeedHandler"
method="execute">
<int:poller fixed-delay="500" max-messages-per-poll="-1"/>
</int:service-activator>
Your advice is greatly appreciated!
Found the cause of issue #2:
The SftpPersistentAcceptOnceFileListFilter checks whether a file was already processed and adds it to the metadata store. If the process is stopped partway through (e.g. by a restart), the file is not rolled back from the metadata store, so on the next check after the restart the file already exists in the store and is therefore not re-downloaded.

transactional not working in spring data neo4j

I am using spring-data-neo4j for the Neo4j database in my application. I want to have transactional APIs in my service layer, but it seems that @Transactional is not working.
Service Layer:
@Transactional('neo4jTransactionManager')
def savePerson() {
    Person person = new Person()
    person.setName("prabh")
    person.setDistance(100)
    personRepository.save(person)  // injected PersonRepository instance
    int i = 10 / 0  // deliberate failure; the save above should roll back
}
Configuration :
<context:component-scan base-package="neo4j"></context:component-scan>
<bean id="graphDatabaseService"
class="org.springframework.data.neo4j.rest.SpringRestGraphDatabase">
<constructor-arg value="http://localhost:7474/db/data" />
</bean>
<neo4j:config graphDatabaseService="graphDatabaseService"
base-package="neo4j" />
<neo4j:repositories base-package="neo4j" />
<bean id="neo4jTransactionManager"
class="org.springframework.transaction.jta.JtaTransactionManager">
<property name="transactionManager">
<bean class="org.neo4j.kernel.impl.transaction.SpringTransactionManager">
<constructor-arg ref="graphDatabaseService" />
</bean>
</property>
<property name="userTransaction">
<bean class="org.neo4j.kernel.impl.transaction.UserTransactionImpl">
<constructor-arg ref="graphDatabaseService" />
</bean>
</property>
</bean>
<tx:annotation-driven mode="aspectj"
transaction-manager="neo4jTransactionManager" />
</beans>
I am using the REST server of the Neo4j database.
That's what the documentation says: for remote access there is no transactionality, because Neo4j's REST API did not expose transactions over the wire in the past.
In the next milestone (and the current 3.3.0.BUILD-SNAPSHOT build) a new remoting integration is used, which exposes transactions over the wire and is also much faster than the existing one.
