I have to poll an FTP location. For testing purposes, I have created an FTP site on my machine using IIS Manager. It listens on port 21 and is started.
The dependencies for my project are correct.
This is the XML configuration for Spring FTP:
<bean id="ftpClientFactory"
class="org.springframework.integration.ftp.session.DefaultFtpSessionFactory">
<property name="host" value="localhost"/>
<property name="port" value="21"/>
<property name="username" value="ICMAS"/>
<property name="password" value="kavita12"/>
<property name="clientMode" value="0"/>
<property name="fileType" value="2"/>
<property name="bufferSize" value="100000"/>
</bean>
<int-ftp:inbound-channel-adapter id="ftpInbound"
channel="ftpChannel"
session-factory="ftpClientFactory"
charset="UTF-8"
auto-create-local-directory="true"
delete-remote-files="true"
local-filter="compositeFilter"
remote-directory="c:\ftproot"
remote-file-separator="\"
preserve-timestamp="true"
local-directory="c:\data"
>
<int:poller fixed-rate="1000"/>
</int-ftp:inbound-channel-adapter>
<int:channel id="ftpChannel"/>
The fileNameGenerator and the compositeFilter are present in my code, but I haven't pasted them here.
My problem is that the local-directory is being polled instead of the remote-directory. I thought the files would be read from the remote-directory location, then go through the filter and, if accepted, through the fileNameGenerator before being placed in the local-directory. What is wrong with this code?
Please correct me if I am doing something wrong.
I need help with this issue; please share your suggestions!
I have resolved this issue.
Firstly, I needed the filter attribute rather than local-filter, because they are different: filter is applied to the remote file listing, while local-filter is applied to files already sitting in the local directory.
Secondly, and more importantly, I had given the remote-directory location as an absolute local path. It needs to be relative to the FTP root directory specified when creating the FTP site.
Thanks. I hope this is useful to someone!
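For reference, a sketch of how the adapter might look after those two changes (same bean and channel names as above; the remote-directory value assumes the FTP site was created with c:\ftproot as its root, so "/" refers to that directory, and remote-file-separator is omitted so it falls back to the default "/"):
<!-- filter (not local-filter) is applied to the remote listing; remote-directory is relative to the FTP root -->
<int-ftp:inbound-channel-adapter id="ftpInbound"
    channel="ftpChannel"
    session-factory="ftpClientFactory"
    charset="UTF-8"
    auto-create-local-directory="true"
    delete-remote-files="true"
    filter="compositeFilter"
    remote-directory="/"
    preserve-timestamp="true"
    local-directory="c:\data">
    <int:poller fixed-rate="1000"/>
</int-ftp:inbound-channel-adapter>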
I am using Spring Integration to connect to and download files from FTP.
I have two filters: one by file name, and another that accepts each file only once, backed by Redis.
For the most part it works great; however, I notice two issues:
Some files are skipped and not downloaded at all
Some files start to be written but stop before finishing and are left with the .writing temporary file extension; I suspect this occurs when I restart my service or when the connection to the FTP server is lost.
Below is my configuration for an SFTP connection, but I also have two more vendors, one using FTP and the other FTPS, that have the same problem.
<bean id="eeSftpClientFactory" class="org.springframework.integration.sftp.session.DefaultSftpSessionFactory">
<property name="host" value="ftp.host.com"/>
<property name="port" value="22"/>
<property name="user" value="myUserName"/>
<property name="password" value="myPassword"/>
</bean>
<bean id="eeFilesFilter" class="org.springframework.integration.file.filters.CompositeFileListFilter">
<constructor-arg>
<list>
<bean class="org.springframework.integration.sftp.filters.SftpPersistentAcceptOnceFileListFilter">
<constructor-arg ref="redisMetadataStore"/>
<constructor-arg value=""/>
</bean>
<bean class="org.springframework.integration.sftp.filters.SftpSimplePatternFileListFilter">
<constructor-arg value="*.nm4"/>
</bean>
</list>
</constructor-arg>
</bean>
<int-sftp:inbound-channel-adapter id="eeChannelAdapter"
channel="eeFtpChannel"
session-factory="eeSftpClientFactory"
auto-startup="${ais.feeds.ee.enabled}"
auto-create-local-directory="true"
delete-remote-files="false"
remote-directory="/SAISData/"
filter="eeFilesFilter"
local-directory="${ais.feeds.base.path}/eeVendor">
<int:poller fixed-delay="500" max-messages-per-poll="-1"/>
</int-sftp:inbound-channel-adapter>
<int:channel id="eeFtpChannel">
<int:queue capacity="500"/>
</int:channel>
<int:service-activator id="eeFeedHandlerActivator"
input-channel="eeFtpChannel"
ref="eeFeedHandler"
method="execute">
<int:poller fixed-delay="500" max-messages-per-poll="-1"/>
</int:service-activator>
Your advice is greatly appreciated!
Found the cause of issue #2:
The SftpPersistentAcceptOnceFileListFilter checks whether a file was already processed and adds it to the metadata store. If the process is stopped in the middle by a restart, the entry isn't rolled back from the metadata store, so when the check runs again after the restart the file already exists in the store and is therefore not re-downloaded.
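As a side note, the redisMetadataStore bean referenced by the filter isn't shown in the question; a minimal sketch of how it might be wired is below (bean names, host, and port are assumptions). Because the filter above is built with an empty prefix, each processed file is recorded under its plain file name, so a stale entry left behind by an interrupted download can be deleted from Redis to let the file be picked up again.
<!-- Assumed wiring for the metadata store used by the accept-once filter -->
<bean id="redisConnectionFactory"
      class="org.springframework.data.redis.connection.jedis.JedisConnectionFactory">
    <property name="hostName" value="localhost"/>
    <property name="port" value="6379"/>
</bean>
<bean id="redisMetadataStore" class="org.springframework.integration.redis.metadata.RedisMetadataStore">
    <constructor-arg ref="redisConnectionFactory"/>
</bean>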
I have a successfully running ActiveMQ 5.9.1, Camel 2.11, and Tomcat 7.0.50 service-layer application, which currently depends on ActiveMQ being started independently.
The reason I'm using ActiveMQ is to have a shared data store between two identical load-balanced instances, for faster processing.
Here is what I want to do:
I want to be able to start ActiveMQ from pom.xml or, worst case, from context.xml. So, let's say the two instances are load balanced and each starts its own ActiveMQ broker, but they point to a single data store (directory) for queue information.
Please advise how I can build such a design to sustain optimal performance in a production environment.
I'm still on the hunt for any pseudocode I can try; I have not succeeded yet.
Code snippet from camelContext.xml
<broker id="broker" brokerName="myBroker" useShutdownHook="false" useJmx="true" persistent="true" dataDirectory="activemq-data"
xmlns="http://activemq.apache.org/schema/core">
<transportConnectors>
<transportConnector name="tcp" uri="tcp://localhost:61616"/>
</transportConnectors>
</broker>
<bean id="jmsConnectionFactory" class="org.apache.activemq.ActiveMQConnectionFactory">
<property name="brokerURL" value="tcp://myBroker?create=false&waitForStart=5000" />
</bean>
<bean id="pooledConnectionFactory" class="org.apache.activemq.pool.PooledConnectionFactory"
init-method="start" destroy-method="stop">
<property name="maxConnections" value="8" />
<property name="connectionFactory" ref="jmsConnectionFactory" />
</bean>
<bean id="activeMQConfig"
class="org.apache.activemq.camel.component.ActiveMQConfiguration">
<property name="connectionFactory" ref="pooledConnectionFactory" />
<property name="concurrentConsumers" value="20" />
</bean>
<bean id="activemq" class="org.apache.activemq.camel.component.ActiveMQComponent">
<property name="configuration" ref="activeMQConfig" />
<property name="transacted" value="true" />
<property name="cacheLevelName" value="CACHE_CONSUMER" />
</bean>
Please help.
I finally resolved the issue. In case somebody else is facing the same problem: I downgraded the ActiveMQ version to 5.8.0 to resolve it.
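On the shared-data-store part of the question: with two embedded brokers, the usual pattern is to point both at the same KahaDB directory on storage that each instance can reach, which gives a shared-file-system master/slave pair (only the broker holding the KahaDB lock is active; the other waits until the lock is released). A minimal sketch, with /shared/activemq-data as an assumed path:
<broker brokerName="myBroker" useShutdownHook="false" useJmx="true" persistent="true"
    xmlns="http://activemq.apache.org/schema/core">
    <persistenceAdapter>
        <!-- both load-balanced instances point at the same (assumed) shared directory -->
        <kahaDB directory="/shared/activemq-data"/>
    </persistenceAdapter>
    <transportConnectors>
        <transportConnector name="tcp" uri="tcp://0.0.0.0:61616"/>
    </transportConnectors>
</broker>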
I'm facing an issue with Spring property placeholder configuration. I've searched the web trying to find a solution, but nothing has worked for me at all.
We used to use the Spring placeholder configurer to load our .properties files, and everything worked fine as long as the config files were located in the META-INF directory.
Now we need to have our config files located in the /etc/sep/properties directory, or in some other filesystem directory.
I tried to use:
<context:property-placeholder location="file:/etc/sep/properties/jdbc.properties" />
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource">
<property name="driverClassName" value="${jdbc.driverClassName}" />
<property name="url" value="${jdbc.databaseurl}" />
<property name="username" value="${jdbc.username}" />
<property name="password" value="${jdbc.password}" />
<property name="initialSize" value="${jdbc.initialPoolSize}" />
</bean>
The content of /etc/sep/properties/jdbc.properties is the following:
cat /etc/sep/properties/jdbc.properties
jdbc.driverClassName= org.postgresql.Driver
jdbc.dialect=org.hibernate.dialect.PostgreSQLDialect
jdbc.databaseurl=jdbc:postgresql://localhost:5432/sep?characterEncoding=utf8&autoReconnect=true
jdbc.username=****
jdbc.password=****
I also tried another approach, as follows, but it didn't work for me either.
<context:property-placeholder properties-ref="prop" />
<util:properties id="prop" location="reso"/>
<bean id="reso" class="org.springframework.core.io.FileSystemResource">
<constructor-arg index="0" value="/etc/sep/properties/jdbc.properties" />
</bean>
I don't know if it matters, but we use a Maven build, so application-context.xml is packaged in core-lib.jar, which our web app uses as a dependency. Other configuration, such as logging, works fine.
I would be grateful for any suggestions.
This is just an idea, but it may work.
Have you checked that the .properties file has the correct access rights? I mean, is it accessible by the user that runs your Spring application?
It would help a lot if you showed the error that is displayed.
OK, I've finally resolved it. There were two things to it.
First: my Tomcat server was not updating the deployed files properly.
Second (and I'm not entirely sure whether this is what helped): we added one more slash after the file: prefix, so the result was:
<context:property-placeholder location="file:///etc/sep/properties/*.properties" />
Now it is loading all config files properly.
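For completeness, the second approach from the question can also be made to work: the location attribute of util:properties expects a resource location string, not a reference to a FileSystemResource bean, so something along these lines (a sketch using the same path) should load the file directly:
<context:property-placeholder properties-ref="prop" />
<util:properties id="prop" location="file:///etc/sep/properties/jdbc.properties" />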
I am using the Tomcat JDBC pool (org.apache.tomcat.jdbc.pool.DataSource) to connect to my PostgreSQL database from a Spring configuration file, as shown below. I have a new requirement to configure two databases as a failover mechanism: when one database is down, the application should automatically switch to the other.
<bean id="dataSource" class="org.apache.tomcat.jdbc.pool.DataSource"
destroy-method="close">
<property name="driverClassName" value="org.postgresql.Driver" />
<property name="url" value="jdbc:postgresql://localhost/dbname?user=postgres" />
<property name="username" value="postgres" />
<property name="password" value="postgres" />
<property name="maxActive" value="5" />
<property name="maxIdle" value="5" />
<property name="minIdle" value="2" />
<property name="initialSize" value="2" />
</bean>
Can anyone suggest how this can be achieved using the Spring configuration file?
The normal way this is done is by using virtual IP addresses (with possible forwarding), checking for activity, a shoot-the-other-node-in-the-head approach, and proper failover. Spring is exactly the wrong place to solve this if you want to avoid things like data loss.
A few recommendations:
repmgr from 2ndQuadrant will manage a lot of the process for you.
Use identical hardware and OS, and streaming replication.
Use virtual IP addresses and the like. Use a heartbeat mechanism to trigger failover via repmgr.
From this perspective, your Spring app then doesn't need reconfiguring.
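To make that concrete: with a virtual IP floating between the primary and the standby (managed by the failover tooling, not by Spring), the existing pool definition only needs its URL pointed at that address, plus connection validation so broken connections are discarded after a failover. A sketch, where 10.0.0.50 is an assumed virtual IP:
<bean id="dataSource" class="org.apache.tomcat.jdbc.pool.DataSource" destroy-method="close">
    <property name="driverClassName" value="org.postgresql.Driver" />
    <!-- the host is the floating virtual IP (assumed address), not an individual server -->
    <property name="url" value="jdbc:postgresql://10.0.0.50/dbname" />
    <property name="username" value="postgres" />
    <property name="password" value="postgres" />
    <!-- validate connections on borrow so stale ones are dropped after a failover -->
    <property name="testOnBorrow" value="true" />
    <property name="validationQuery" value="SELECT 1" />
    <property name="maxActive" value="5" />
    <property name="initialSize" value="2" />
</bean>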
I've been working on an international website in Java/Spring, using #springMessage() tags and messages.properties files. See my recent question: In Java/Spring, how to gracefully handle missing translation values?
I want to be able to edit (overwrite) the messages.properties files and see the new translations immediately in my browser (without restarting Tomcat).
I thought that http://commons.apache.org/proper/commons-configuration/userguide/howto_filebased.html#Automatic_Reloading would be what I need, but I'm not sure how to edit my webmvc-config.xml to use that.
Figured it out. It worked after I edited webmvc-config.xml:
<bean id="messageSource"
class="org.springframework.context.support.ReloadableResourceBundleMessageSource">
<property name="basename">
<value>${content.path.config}/WEB-INF/messages</value>
</property>
<property name="defaultEncoding" value="UTF-8" />
<property name="cacheSeconds" value="2"/>
</bean>
(I just needed to add the cacheSeconds property: with it set to 2, the message source checks the properties files' timestamps at most every two seconds and reloads them when they change.)