Resume server-to-server FTP transfer using Apache Commons Net - Java

I'm refactoring a Java EE application that uses the Apache Commons Net FTP library to run FTP
transfers between two servers. The code is almost identical to the example posted on the web
page http://commons.apache.org/proper/commons-net/examples/ftp/ServerToServerFTP.java.
The files being transferred sometimes exceed 60 GB, and even though the timeout is set quite high
and the largest transfers run over a LAN, I'm still seeing occasional exceptions.
I've been trying to figure out how to implement FTP's REST function, i.e. resuming transfers. The
servers support it, so it only needs to be implemented on the Commons side.
So far I’ve gathered that I need to use getRestartOffset and setRestartOffset.
I have not been able to locate any resources or examples online of how this can be implemented in a
server-to-server transfer and was wondering whether anybody has any pointers or examples?
Edit:
Solution
Using the solution suggested by user270349 I was able to implement the desired functionality, although it was not possible by using the REST command. I got the number of bytes already written on the destination, set the offset on both destination and source, and then used the remoteAppend(String filename) method provided by the library instead of the remoteStore(String filename) used in the example linked above.
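A minimal sketch of that flow with Apache Commons Net. Host names, credentials, and the file name are placeholders; the author used setRestartOffset(), but this sketch also sends REST explicitly, since remoteRetrieve() does not appear to apply the stored offset in the Commons Net versions I've checked (an assumption worth verifying against your version):

```java
import java.net.InetAddress;

import org.apache.commons.net.ftp.FTPClient;
import org.apache.commons.net.ftp.FTPFile;

public class ResumeServerToServer {

    // Size of the partially transferred file on the destination server,
    // or 0 if the file does not exist there yet.
    static long partialSize(FTPFile[] listing, String name) {
        if (listing != null) {
            for (FTPFile f : listing) {
                if (f != null && name.equals(f.getName())) {
                    return f.getSize();
                }
            }
        }
        return 0L;
    }

    public static void main(String[] args) throws Exception {
        FTPClient source = new FTPClient();      // server that has the file
        FTPClient destination = new FTPClient(); // server receiving the file

        source.connect("source.example.com");    // placeholder hosts and
        source.login("user1", "pass1");          // credentials throughout
        destination.connect("dest.example.com");
        destination.login("user2", "pass2");

        String fileName = "huge-file.bin";       // placeholder file name

        // How many bytes have already arrived at the destination?
        long offset = partialSize(destination.listFiles("."), fileName);

        // Wire the two servers together, as in the ServerToServerFTP example.
        destination.enterRemotePassiveMode();
        source.enterRemoteActiveMode(
                InetAddress.getByName(destination.getPassiveHost()),
                destination.getPassivePort());

        if (offset > 0) {
            // REST before RETR makes the source skip the bytes we already
            // have; sent explicitly here (an assumption -- see above).
            source.rest(Long.toString(offset));
        }

        // remoteAppend() instead of remoteStore(), so the destination
        // keeps its existing bytes and appends the rest.
        if (destination.remoteAppend(fileName) && source.remoteRetrieve(fileName)) {
            source.completePendingCommand();
            destination.completePendingCommand();
        }

        source.logout();
        source.disconnect();
        destination.logout();
        destination.disconnect();
    }
}
```

Both completePendingCommand() calls are needed to read the final transfer replies before logging out.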

The only difference between server-to-client download resume and server-to-server transfer resume is how you get the restartOffset. You need to list the files in the destination directory (remote in your case) and use the partial file's size as the offset for the next attempt.

Related

Resumable File Downloading/Uploading in Android

I have been working on an Android project in which I have to download/upload a few files via HTTP. I was wondering if there is a way to make the downloads/uploads resumable. That is, if my file is being downloaded or uploaded and there is a brief network drop (this sometimes corrupts the file, stops the process, and forces it to restart from zero), the transfer should pause, and once the connection is back on my device it should continue from the point where it stopped, so the file does not get corrupted and the process does not start from zero.
Is there any way to achieve this in Android/Java? Please let me know. Thanks in advance.
HTML itself doesn't provide the ability to upload a file in chunks. FileUpload is a simple object that works with the file as a whole, so it always sends it from scratch. To fulfill your requirements you need a more sophisticated client/server arrangement. A Java applet is a good candidate on the client side, and the server side is trivial; however, you need to implement some protocol (handshake, start sending the file, continue from some offset, validation), and this is not an easy task. Plain HTML form uploads give you none of this (FTP, by contrast, does support resuming via its REST command). And even when you create all this, it will be compatible only with itself. Is it really worth all the effort? The common answer is no. That's the reason we don't see this approach in the wild.
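For the download direction, HTTP does support resuming via the Range request header, so no applet is needed there; the server just has to honor range requests. Below is a minimal sketch using plain HttpURLConnection (the class and method names are mine, made up for illustration). Uploads are the hard half and do need a custom scheme:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class ResumableDownload {

    // Builds the Range header value for resuming from `offset` bytes.
    static String rangeHeader(long offset) {
        return "bytes=" + offset + "-";
    }

    // Downloads `url` into `target`, resuming if a partial file exists.
    static void download(String url, File target) throws Exception {
        long offset = target.exists() ? target.length() : 0L;
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        if (offset > 0) {
            conn.setRequestProperty("Range", rangeHeader(offset));
        }
        int code = conn.getResponseCode();
        // 206 = server honored the range, append to the partial file;
        // 200 = server sent the full body, so overwrite from scratch.
        boolean append = (code == HttpURLConnection.HTTP_PARTIAL);
        try (InputStream in = conn.getInputStream();
             FileOutputStream out = new FileOutputStream(target, append)) {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
        }
    }
}
```

Checking for the 206 status matters: a server that ignores Range replies 200 with the whole file, and appending that would corrupt the result.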

ftp file retrieve for unfinished file

What is the best way for a program that retrieves files from an FTP server to check whether the file to be downloaded is an ongoing transfer (someone is still uploading the file when we decide to download it)? Do FTP client APIs handle this (e.g. the Apache Commons FTP client)?
I think it's not really possible. A couple of years ago I had a similar problem and came up with two options (unfortunately it was C#, not Java).
You can check whether the file is still growing (which implies a small delay, because you need to check twice), or, if you're on Windows (I don't know how Linux behaves), you can try to access the file, and you should get an exception saying the file is in use by another process.
Those are just two possibilities and a starting point for thinking about your problem. Maybe someone else will come up with a really good solution, but for now this might be a workable workaround for you.
FTP was not designed to tell you whether a file is in use; the most the FTP daemon can do is deny the transfer, and that is configurable in some servers. There may be a server that renames files temporarily, or offers a script to do so, but you'd have to find one.
I don't know if it is sufficient for you, but if you only need a "dumb" check, I would try System.getSecurityManager().checkDelete(). A file can be deleted only if no streams are open on it.
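The first suggestion (check whether the file is still growing) can be sketched generically. The sampler below is plain Java; the FTP wiring is left to the caller, and the class and method names are made up for illustration:

```java
import java.util.function.LongSupplier;

public class StableFileCheck {

    // Samples the file size twice, `delayMillis` apart. A file that is
    // still being uploaded will normally have grown between the samples;
    // a negative size means "not found" and is never treated as stable.
    static boolean isProbablyComplete(LongSupplier size, long delayMillis)
            throws InterruptedException {
        long first = size.getAsLong();
        Thread.sleep(delayMillis);
        long second = size.getAsLong();
        return first >= 0 && first == second;
    }
}
```

With Commons Net you would pass a supplier that calls FTPClient.listFiles(path) and returns the reported size, or -1 if the file is missing. A short delay risks false positives on stalled uploads, so pick it generously.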

Uploading huge files with PHP or any other language?

I have a problem I've been dealing with lately. My application asks its users to upload videos, to be shared with a private community. They are teaching videos, which are not always optimized for web quality to start with. The problem is, many of the videos are huge, way over the 50 megs I've seen in another question. In one case, a video was over a gig, and the only solution I had was to take the client's video from box.net, upload it to the video server via FTP, then associate it with the client's account by updating the database manually. Obviously, we don't want to deal with videos this way, we need it to all be handled automatically.
I've considered using either the box.net or dropbox API to facilitate large uploads, but would rather not go that way if I don't have to. We're using PHP for the main logic of the site, though I'm comfortable with many other languages, especially Python, but including Java, C++, or Perl. If I have to dedicate a whole server or server instance to handling the uploads, I will.
I'd rather do the client-side using native browser JavaScript, instead of Flash or other proprietary tech.
What is the final answer to uploading huge files through the web, by handling the server response in PHP or any other language?
It is possible to raise the limits in Apache and PHP to handle files of this size. The basic HTTP upload mechanism does not offer progress information, however, so I would usually consider this acceptable only for LAN-type connections.
The normal alternative is to locate a Flash or JavaScript uploader widget. These have the bonus that they can display progress information and integrate well with a PHP-based website.
For PHP, see http://php.net/manual/en/features.file-upload.php and note the ini file changes in the first comment.
Edit: Assuming you are running into timeout issues.
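For reference, these are the php.ini directives usually involved in raising the limits; the values are illustrative, and post_max_size must be at least as large as upload_max_filesize. On the Apache side, check LimitRequestBody as well.

```ini
; php.ini - raise upload limits (illustrative values)
upload_max_filesize = 2G
post_max_size = 2G          ; must be >= upload_max_filesize
max_execution_time = 3600   ; seconds; huge uploads take a while
max_input_time = 3600
memory_limit = 512M
```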

Importing a 30GB flat text file over internet to local file system using multiple connection to that flat file?

Let's say I have a flat text file on a server. I need to download/import/copy it to my local file system over the internet. Is there any way I could import the file in chunks, or open multiple connections to that flat text file from my local system, so that the import/copy becomes faster?
Regards
One way to do it, if the FTP server supports it, is to use a multi-connection FTP product that basically divides the file up and downloads multiple streams into the same file. In the end, though, there is one stark reality: your speed is still capped by the smallest link along the path. So if you download the file at a full bandwidth of, say, 2 MB/s, making 10 connections will only get you about 200 KB/s each; it won't be faster unless something throttles the download on a per-connection basis.
Of course, that's also not using Java, but there is probably a Java multi-connection FTP library about.
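As a sketch of the bookkeeping such a multi-stream downloader does, here is how a file might be split into per-connection byte ranges (plain Java; the names are mine). Each worker would then seek to its start offset in the local file and issue an FTP REST command, or an HTTP Range header, for its slice. As noted, this only helps when throttling is per-connection:

```java
import java.util.ArrayList;
import java.util.List;

public class RangeSplitter {

    // A half-open byte range [start, end) of the remote file.
    static final class Range {
        final long start;
        final long end;
        Range(long start, long end) {
            this.start = start;
            this.end = end;
        }
    }

    // Splits `totalSize` bytes into `parts` contiguous ranges; the last
    // range absorbs any remainder so the whole file is covered.
    static List<Range> split(long totalSize, int parts) {
        List<Range> ranges = new ArrayList<>();
        long base = totalSize / parts;
        long start = 0;
        for (int i = 0; i < parts; i++) {
            long end = (i == parts - 1) ? totalSize : start + base;
            ranges.add(new Range(start, end));
            start = end;
        }
        return ranges;
    }
}
```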
The fastest way to do this is probably to email the site administrator at that server, and request that he express post you a hard drive with the data you want, for a fee.
Failing that, you will need to investigate and buy the fastest link to the internet you can find.
Given that you have a fast link to the internet which will allow you to download data at X bytes per second, then your maximum theoretical limit is X bytes per second.
If you get significantly below X bytes per second, you may be being rate limited, either by the server in question, or any link between you or their server.
If it just happens to be that your ISP is doing the rate limiting, then in that specific case, you might be able to download data faster using a second connection and downloading a different portion of the file.

How to stream and transcode media files using java (on Tomcat)?

This has been discussed before here. Using Java, I have developed my web services on Tomcat for a media library. I want to add functionality for streaming media while dynamically transcoding it as appropriate for mobile clients. There are a few questions I am pondering:
How exactly do I stream the files (both audio and video)? I am coming across many streaming servers, but I want something done in my own code running on Tomcat. Do I need to install one more server, i.e. a streaming server, and then redirect streaming requests to it from Tomcat?
Is it really a good idea to transcode dynamically? Static transcoding means we have to replicate the same file in N formats, which consumes space I don't want to spend. So is there a way out?
Is it possible to stream the data as it is transcoded? That is, I don't want to start streaming only after the transcoding has finished (as that introduces latency); rather, I want to stream the transcoded bytes as they are produced. I apologize if this is an absurd requirement; I have no experience with either transcoding or streaming.
Other alternatives like ffmpeg, Xuggler, and the other technologies mentioned here: are they a better approach for getting the job done?
I don't want to use any proprietary or paid alternative to achieve this goal, and I also want it to work in production environments. Hope to get some help here.
Thanks a lot!
Red5 is another possible solution. It's open source and is essentially Tomcat with some added features. I don't know how far back in time the split from the Tomcat codebase occurred, but the basics are all there (and the source, so you can patch what's missing).
Xuggler is a library front end for ffmpeg and plays nicely with Red5. If you intend to do lots of transcoding you'll probably run into this code along the way.
Between these two projects you can change A/V formats and stream various media.
Unless you really need to roll your own, I'd recommend an OSS project with good community support.
For your questions:
1.) This is the standard space vs. performance tradeoff. You see the same thing in generating hash tables and other computationally expensive operations. If space is a larger issue than processor time, then dynamic transcoding is your only way out.
2.) Yes, you can stream during the transcode process. VLC (http://www.videolan.org/vlc/) does this.
3.) I'd really look into VLC if I were you.
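On point 2 (streaming while transcoding): one common trick, sketched below, is to run ffmpeg as a child process and have it write to stdout instead of a file, copying the bytes to the HTTP response as they are produced. The flags are illustrative (a fragmented MP4 so the container can be piped) and will need tuning per codec; Red5/Xuggler give you the same effect with more control:

```java
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Arrays;
import java.util.List;

public class TranscodeStream {

    // Builds an ffmpeg command that reads `input` and writes the
    // transcoded result to stdout ("pipe:1") instead of a file, so the
    // bytes can be forwarded to the client as they are produced.
    static List<String> ffmpegCommand(String input) {
        return Arrays.asList(
                "ffmpeg",
                "-i", input,
                "-f", "mp4",                             // output container
                "-movflags", "frag_keyframe+empty_moov", // mp4 that can be piped
                "pipe:1");
    }

    // Copies ffmpeg's stdout to `response` (e.g. a servlet output
    // stream) while the transcode is still running.
    static void stream(String input, OutputStream response) throws Exception {
        Process p = new ProcessBuilder(ffmpegCommand(input)).start();
        try (InputStream transcoded = p.getInputStream()) {
            byte[] buf = new byte[64 * 1024];
            int n;
            while ((n = transcoded.read(buf)) != -1) {
                response.write(buf, 0, n); // bytes flow out as they arrive
            }
        }
        p.waitFor();
    }
}
```

From a servlet you would call stream(path, response.getOutputStream()); note that seeking within the stream is not possible with this naive approach.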
