Re-read file when using int-sftp:inbound-channel-adapter - java

I have an int-sftp:inbound-channel-adapter which uses an SftpPersistentAcceptOnceFileListFilter as part of a composite filter. Reading the documentation/source code, it should accept a file to be read again if the modified datetime has changed, but I can't get it to work; it only reads the file once. I'm using Redis as the store.
Any ideas what is wrong with the configuration? I'm using Spring Integration 4.3.5.
<int-sftp:inbound-channel-adapter id="sftpInboundAdapterCensus"
        channel="sftpInboundCensus"
        session-factory="sftpSessionFactory"
        local-directory="${sftp.localdirectory}/census-local"
        filter="censusCompositeFilter"
        remote-file-separator="/"
        remote-directory="${sftp.directory.census}">
    <int:poller cron="${sftp.cron}" max-messages-per-poll="1" error-channel="pollerErrorChannel"/>
</int-sftp:inbound-channel-adapter>
<bean id="censusCompositeFilter"
class="org.springframework.integration.file.filters.CompositeFileListFilter">
<constructor-arg>
<list>
<bean class="org.springframework.integration.sftp.filters.SftpSimplePatternFileListFilter">
<constructor-arg value="*.xml" />
</bean>
<bean id="SftpPersistentAcceptOnceFileListFilter" class="org.springframework.integration.sftp.filters.SftpPersistentAcceptOnceFileListFilter">
<constructor-arg ref="metadataStore" />
<constructor-arg value="censusSampleLock_" />
</bean>
</list>
</constructor-arg>
</bean>

The SftpPersistentAcceptOnceFileListFilter only controls what we fetch from the server. You also need a FileSystemPersistentAcceptOnceFileListFilter in the local-filter (which determines which of the fetched files end up being emitted as messages). The local filter is an AcceptOnceFileListFilter by default, and that one accepts each local file only once, regardless of a changed modified time. A sketch of the local-filter wiring follows.
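As a minimal sketch, assuming the same Redis-backed metadataStore bean used above (the censusLocalLock_ key prefix is just an illustrative choice):

<int-sftp:inbound-channel-adapter id="sftpInboundAdapterCensus"
        channel="sftpInboundCensus"
        session-factory="sftpSessionFactory"
        local-directory="${sftp.localdirectory}/census-local"
        filter="censusCompositeFilter"
        local-filter="censusLocalFilter"
        remote-file-separator="/"
        remote-directory="${sftp.directory.census}">
    <int:poller cron="${sftp.cron}" max-messages-per-poll="1" error-channel="pollerErrorChannel"/>
</int-sftp:inbound-channel-adapter>

<!-- Tracks file name plus modified time in the metadata store, so a local file
     whose modified time changes is emitted as a message again -->
<bean id="censusLocalFilter"
      class="org.springframework.integration.file.filters.FileSystemPersistentAcceptOnceFileListFilter">
    <constructor-arg ref="metadataStore" />
    <constructor-arg value="censusLocalLock_" />
</bean>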

Related

Spring SAML can't read property from properties file

So this one stumps me. I have been using properties files for years now, but I have never seen this.
I am using Spring MVC with SAML authentication. My context XML has this in it:
<context:property-placeholder location="file:/opt/saml.properties" />
<bean id="keyManager" class="org.springframework.security.saml.key.JKSKeyManager">
    <constructor-arg value="file:/opt/mySamlKeystore.jks"/>
    <constructor-arg type="java.lang.String" value="${keystore.password}"/>
    <constructor-arg>
        <map>
            <entry key="${privateKey.alias}" value="${key.password}"/>
        </map>
    </constructor-arg>
    <constructor-arg type="java.lang.String" value="${privateKey.alias}"/>
</bean>
I am getting this error:
java.security.UnrecoverableKeyException: Cannot recover key
I did some SO research, and the answers all basically say that I have the wrong password, which I'm sure I don't. So, to test that it's reading the right file, I went and replaced all the ${} properties with hard-coded values. Everything then works fine.
I was trying to figure this out when I noticed that some of the other properties from that file are working! In fact, I can even do this:
<bean id="keyManager" class="org.springframework.security.saml.key.JKSKeyManager">
<constructor-arg value="file:/opt/myKeystore.jks"/>
<constructor-arg type="java.lang.String" value="${keystore.password}"/>
<constructor-arg>
<map>
<entry key="${privateKey.alias}" value="password"/>
</map>
</constructor-arg>
<constructor-arg type="java.lang.String" value="${privateKey.alias}"/>
</bean>
So Spring is getting ${keystore.password} and ${privateKey.alias} (along with the other needed properties like entityID, metadataProvider, etc.) from the properties file, but not ${key.password}!
Here is the saml.properties:
#keystore stuff
keystore.password=password
key.password=password
privateKey.alias=mysaml
#SP stuff (aka, my side of things)
entity.id=mycompany:me:me:me1
entity.base.url=https://mycompany.com
#IDP stuff (aka, the SAML server)
metadata.provider=https://saml.mycompany.com/FederationMetadata/2007-06/FederationMetadata.xml
This all works when I hard-code the key password, but not when I use the ${key.password} property. What is going on here?
You need two more slashes after file:, i.e. file:/// instead of file:/. For example:
<bean id="keyManager" class="org.springframework.security.saml.key.JKSKeyManager">
<constructor-arg value="file:///opt/myKeystore.jks"/>
<constructor-arg type="java.lang.String" value="${keystore.password}"/>
<constructor-arg>
<map>
<entry key="${privateKey.alias}" value="password"/>
</map>
</constructor-arg>
<constructor-arg type="java.lang.String" value="${privateKey.alias}"/>
</bean>

Spring Integration - MongoDB inbound channel reading the same data

I need to query data from MongoDB using Spring Integration. I am able to query the data from MongoDB, but the same data is returned more than once.
<bean id="mongoDBFactory"
class="org.springframework.data.mongodb.core.SimpleMongoDbFactory">
<constructor-arg name="mongo">
<bean class="com.mongodb.Mongo">
<constructor-arg name="host" value="localhost" />
<constructor-arg name="port" value="27017" />
</bean>
</constructor-arg>
<constructor-arg name="databaseName" value="test" />
</bean>
<int:channel id="controlChannel"/>
<int:control-bus input-channel="controlChannel"/>
<int-mongodb:inbound-channel-adapter
id="mongoInboundAdapter" channel="splittingChannel" auto-startup= "false"
query="{_id:1}"
collection-name="order"
mongodb-factory="mongoDBFactory">
<int:poller fixed-rate="10000" max-messages-per-poll="1000"/>
</int-mongodb:inbound-channel-adapter>
<int:splitter input-channel="splittingChannel" output-channel="logger"/>
<int:logging-channel-adapter id="logger" level="WARN"/>
I am using the control channel to start and stop the adapter.
Please help me with how I can stop the inbound-channel-adapter once the query is completed.
Thanks in advance.
I suggest you use a transaction-synchronization-factory to modify or remove the documents after each poll, instead of stopping the adapter. See the Reference Manual for more info; a sketch follows.
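Adapted to the configuration above, that could look like the following. The documentCleaner bean and its remove method are hypothetical names; you would implement the removal or update yourself. The PseudoTransactionManager is enough here because no real transaction is involved, it just enables the synchronization callbacks:

<int-mongodb:inbound-channel-adapter
        id="mongoInboundAdapter" channel="splittingChannel" auto-startup="false"
        query="{_id:1}"
        collection-name="order"
        mongodb-factory="mongoDBFactory">
    <int:poller fixed-rate="10000" max-messages-per-poll="1000">
        <int:transactional synchronization-factory="syncFactory"
                transaction-manager="transactionManager"/>
    </int:poller>
</int-mongodb:inbound-channel-adapter>

<!-- After a successful poll, pass the polled documents to a bean that removes
     or flags them, so the next poll does not return them again -->
<int:transaction-synchronization-factory id="syncFactory">
    <int:after-commit expression="@documentCleaner.remove(payload)"/>
</int:transaction-synchronization-factory>

<bean id="documentCleaner" class="com.example.DocumentCleaner"/>

<!-- No real transaction needed; this just drives the after-commit callback -->
<bean id="transactionManager" class="org.springframework.integration.transaction.PseudoTransactionManager"/>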

Spring Integration Kafka message-driven-channel-adapter receive message

For spring-integration-kafka version 2.1.0.RELEASE, the documentation seems to be outdated.
The example in the doc is incorrect, as it doesn't match the constructor arguments of KafkaMessageListenerContainer. Can somebody direct me how to create the bean correctly, and show the corresponding Java code to process the message?
<bean id="container1" class="org.springframework.kafka.listener.KafkaMessageListenerContainer">
<constructor-arg>
<bean class="org.springframework.kafka.core.DefaultKafkaConsumerFactory">
<constructor-arg>
<map>
<entry key="bootstrap.servers" value="localhost:9092" />
</map>
</constructor-arg>
</bean>
</constructor-arg>
<constructor-arg name="topics" value="foo" />
</bean>
Sorry about that; we'll fix the docs. The correct documentation is in the quick start section; a corrected sketch follows.
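With spring-kafka 1.1.x (the version spring-integration-kafka 2.1.0.RELEASE is built against), the container constructor takes a ConsumerFactory and a ContainerProperties object, so a corrected bean definition might look like this. The group id and deserializers are placeholder values you will need to adjust for your environment:

<bean id="consumerFactory" class="org.springframework.kafka.core.DefaultKafkaConsumerFactory">
    <constructor-arg>
        <map>
            <entry key="bootstrap.servers" value="localhost:9092" />
            <entry key="group.id" value="myGroup" />
            <entry key="key.deserializer" value="org.apache.kafka.common.serialization.StringDeserializer" />
            <entry key="value.deserializer" value="org.apache.kafka.common.serialization.StringDeserializer" />
        </map>
    </constructor-arg>
</bean>

<!-- Holds the topic(s) to consume; in spring-kafka 1.1.x this class lives in
     the listener.config package -->
<bean id="containerProperties" class="org.springframework.kafka.listener.config.ContainerProperties">
    <constructor-arg value="foo" />
</bean>

<bean id="container1" class="org.springframework.kafka.listener.KafkaMessageListenerContainer">
    <constructor-arg ref="consumerFactory" />
    <constructor-arg ref="containerProperties" />
</bean>

The container can then back a message-driven adapter, with an ordinary service-activator bean (myHandler here is a placeholder name for your own POJO) receiving each record:

<int-kafka:message-driven-channel-adapter
        id="kafkaAdapter"
        listener-container="container1"
        channel="fromKafka" />

<int:service-activator input-channel="fromKafka" ref="myHandler" method="handle" />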

Loading application configuration properties from database in spring based application using java based configuration

It's better to store configuration properties in a database table so that they can be managed easily for different environments. The approach to store and retrieve configuration properties from a database table in XML-based configuration looks like this:
<bean class="org.springframework.beans.factory.config.PropertyPlaceholderConfigurer">
<property name="systemPropertiesModeName" value="SYSTEM_PROPERTIES_MODE_OVERRIDE" />
<property name="properties">
<bean class="org.apache.commons.configuration.ConfigurationConverter" factory-method="getProperties">
<constructor-arg>
<bean class="org.apache.commons.configuration.DatabaseConfiguration">
<constructor-arg>
<ref bean="dbDataSource" />
</constructor-arg>
<constructor-arg value="DOMAIN_CONFIG" />
<!-- DB Table -->
<constructor-arg value="CONFIG_NAME" />
<!-- DB Key Column -->
<constructor-arg value="CONFIG_VALUE" />
<!-- DB Value Column -->
</bean>
</constructor-arg>
</bean>
</property>
</bean>
But I'm trying to achieve the same thing using Java-based configuration, with no luck.
Can anyone please help me?
I found the answer to my question.
Thanks to this post: https://gist.github.com/jeffsheets/8ab5f3aeb74787bdb051
It exactly suits my problem. Thanks! A sketch of that approach follows.
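A minimal Java-config sketch along the lines of that gist, mirroring the table and column names from the XML above; the dbDataSource parameter is assumed to be a DataSource bean defined elsewhere:

import javax.sql.DataSource;

import org.apache.commons.configuration.ConfigurationConverter;
import org.apache.commons.configuration.DatabaseConfiguration;
import org.springframework.beans.factory.config.PropertyPlaceholderConfigurer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class DatabasePropertiesConfig {

    // static, because a BeanFactoryPostProcessor must be created before the
    // regular beans whose placeholders it resolves
    @Bean
    public static PropertyPlaceholderConfigurer propertyPlaceholderConfigurer(DataSource dbDataSource) {
        // Same table, key column, and value column as the XML version above
        DatabaseConfiguration dbConfig =
                new DatabaseConfiguration(dbDataSource, "DOMAIN_CONFIG", "CONFIG_NAME", "CONFIG_VALUE");

        PropertyPlaceholderConfigurer configurer = new PropertyPlaceholderConfigurer();
        configurer.setSystemPropertiesModeName("SYSTEM_PROPERTIES_MODE_OVERRIDE");
        configurer.setProperties(ConfigurationConverter.getProperties(dbConfig));
        return configurer;
    }
}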

Spring integration with ftp skips files

I am using Spring Integration to connect to an FTP server and download files.
I have two filters: one by file name, and another that accepts each file only once, backed by Redis.
For the most part it works great; however, I notice two issues:
1. Some files are skipped and not downloaded at all.
2. Some files start to be written but stop before they finish, and are left with the .writing temporary file extension. I suspect this occurs when I restart my service or when the connection to the FTP server is lost.
Below is my configuration for an SFTP connection, but I also have two more vendors, one using FTP and the other FTPS, with the same problem.
<bean id="eeSftpClientFactory" class="org.springframework.integration.sftp.session.DefaultSftpSessionFactory">
<property name="host" value="ftp.host.com"/>
<property name="port" value="22"/>
<property name="user" value="myUserName"/>
<property name="password" value="myPassword"/>
</bean>
<bean id="eeFilesFilter" class="org.springframework.integration.file.filters.CompositeFileListFilter">
<constructor-arg>
<list>
<bean class="org.springframework.integration.sftp.filters.SftpPersistentAcceptOnceFileListFilter">
<constructor-arg ref="redisMetadataStore"/>
<constructor-arg value=""/>
</bean>
<bean class="org.springframework.integration.sftp.filters.SftpSimplePatternFileListFilter">
<constructor-arg value="*.nm4"/>
</bean>
</list>
</constructor-arg>
</bean>
<int-sftp:inbound-channel-adapter id="eeChannelAdapter"
channel="eeFtpChannel"
session-factory="eeSftpClientFactory"
auto-startup="${ais.feeds.ee.enabled}"
auto-create-local-directory="true"
delete-remote-files="false"
remote-directory="/SAISData/"
filter="eeFilesFilter"
local-directory="${ais.feeds.base.path}/eeVendor">
<int:poller fixed-delay="500" max-messages-per-poll="-1"/>
</int-sftp:inbound-channel-adapter>
<int:channel id="eeFtpChannel">
<int:queue capacity="500"/>
</int:channel>
<int:service-activator id="eeFeedHandlerActivator"
input-channel="eeFtpChannel"
ref="eeFeedHandler"
method="execute">
<int:poller fixed-delay="500" max-messages-per-poll="-1"/>
</int:service-activator>
Your advice is greatly appreciated!
Found the cause of issue #2:
The SftpPersistentAcceptOnceFileListFilter checks whether a file was already processed and adds it to the metadata store. If the process is stopped in the middle, for example by a restart, the file is not rolled back from the metadata store, so when it is checked again after the restart it already exists in the store and therefore is not re-downloaded. One way to recover from that state is sketched below.
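A rough sketch of a startup cleanup that scans the local directory for leftover .writing files and removes the matching entries from the metadata store so those files are fetched again. It assumes the redisMetadataStore bean and local directory from the configuration above, and an empty key prefix (the filter's second constructor argument is ""); the class and method names here are illustrative, not part of Spring Integration:

import java.io.File;

import javax.annotation.PostConstruct;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.integration.metadata.ConcurrentMetadataStore;
import org.springframework.stereotype.Component;

@Component
public class IncompleteDownloadCleaner {

    @Autowired
    private ConcurrentMetadataStore redisMetadataStore;

    @Value("${ais.feeds.base.path}/eeVendor")
    private String localDirectory;

    @PostConstruct
    public void cleanUp() {
        File[] partials = new File(localDirectory)
                .listFiles((dir, name) -> name.endsWith(".writing"));
        if (partials == null) {
            return;
        }
        for (File partial : partials) {
            // Original remote file name, without the temporary suffix
            String remoteName = partial.getName()
                    .substring(0, partial.getName().length() - ".writing".length());
            // The filter prefix is "" in the config above, so the store key is
            // just the file name
            redisMetadataStore.remove(remoteName);
            // Delete the partial file so the fresh download can complete
            partial.delete();
        }
    }
}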
