How can I duplicate files on a remote server in Java?

I'm trying to find a way to duplicate a series of files/folders from one location on a server to a new directory on the same server. I looked into doing this with FTP, but it seems to be strictly for transfer, not for manipulating files on the server itself. As such, I've been looking into SSH and wondering if it might be possible that way. Is it possible? If not, is there another way, or perhaps an easier way, to do this? Any help would be much appreciated, thanks!

I'm not sure if this is the best way, but if you have SSH access to the server, you can SSH in and use the system's native copy command. I would recommend you use the JSch library to SSH into the box and then just call the appropriate command (cp or copy).
JSch ships with essentially no documentation, but it includes tons of example code. You can follow the Exec.java example to see how to execute commands on the remote server. Also, FYI, the examples use tons of Swing code; you can easily remove all of that if you don't want Swing username/password prompts.

Yeah, both FTP and SFTP (the file transfer mode of SSH) are mainly for transferring files between client and server. They also support some management tasks (creating directories, setting modes, listing files, removing files/directories, even renaming files), but not copying a file on the server without downloading and uploading it again.
As Jon7 and Mark proposed, you can (via SSH) invoke the remote server's native copy command (copy or xcopy on Windows, cp on Unix-like systems) to do the job, assuming you have shell access (not only SFTP or some forced command).
If using JSch, an exec channel would be the thing to use here.
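For illustration, here is a minimal sketch of that approach; the host, credentials, and paths are placeholders, and a real program should verify host keys and read the channel's output streams:

    import com.jcraft.jsch.ChannelExec;
    import com.jcraft.jsch.JSch;
    import com.jcraft.jsch.Session;

    public class RemoteCopy {
        public static void main(String[] args) throws Exception {
            JSch jsch = new JSch();
            Session session = jsch.getSession("user", "server.example.com", 22); // placeholders
            session.setPassword("secret");
            session.setConfig("StrictHostKeyChecking", "no"); // demo only; check host keys in production
            session.connect();

            // An exec channel runs a single command on the remote side
            ChannelExec channel = (ChannelExec) session.openChannel("exec");
            channel.setCommand("cp -r /var/data/source /var/data/backup"); // the server-side copy
            channel.connect();
            while (!channel.isClosed()) { // wait for the remote command to finish
                Thread.sleep(100);
            }
            int status = channel.getExitStatus(); // 0 means cp succeeded
            channel.disconnect();
            session.disconnect();
            System.out.println("cp exited with status " + status);
        }
    }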

Related

Monitor open file handles in Java - OS-independent

My goal is to monitor open file handles in Java, independently of the type of OS (Linux, Windows, etc.).
My application is a web-based server application running on Tomcat with Java 8.
For a possible solution, only the Java language itself plus Guava and Apache Commons can be considered, as my company does not allow the use of any other third-party libraries.
Any suggestions on how to do that? I already thought of extending an InputStream and injecting some bookkeeping into its open() and close() methods, but this is not really what I consider a "good" solution...
I'd be very thankful for your help!
The solution you don't really like seems to be the best one: you can simply emit a message, for example through an EventBus (you have it in Guava), whenever you open or close a file. Your ManagedFileInputStream just has to call eventBus.post(someFilePointer) alongside super.close(), and again in its constructor, since that is where a FileInputStream actually opens the file.
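As a concrete illustration of the EventBus idea, a minimal sketch (the event classes and bus name are made up for the example):

    import com.google.common.eventbus.EventBus;
    import java.io.File;
    import java.io.FileInputStream;
    import java.io.FileNotFoundException;
    import java.io.IOException;

    public class ManagedFileInputStream extends FileInputStream {
        private static final EventBus EVENT_BUS = new EventBus("file-handles");
        private final File file;

        public ManagedFileInputStream(File file) throws FileNotFoundException {
            super(file); // the underlying file is open once this returns
            this.file = file;
            EVENT_BUS.post(new FileOpened(file));
        }

        @Override
        public void close() throws IOException {
            try {
                super.close();
            } finally {
                EVENT_BUS.post(new FileClosed(file));
            }
        }

        // Simple event payloads; a monitor registers via EVENT_BUS.register(listener)
        public static final class FileOpened {
            public final File file;
            FileOpened(File file) { this.file = file; }
        }
        public static final class FileClosed {
            public final File file;
            FileClosed(File file) { this.file = file; }
        }
    }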
But there are many other files that could be opened by your application (jar files, web resources like CSS, images and scripts, temporary files created by Tomcat, temporary files created by the JVM).
I think there is no sensible way to monitor all files opened by a Java application from within the application itself; maybe that's something for the OS to manage.

Java development on files located on remote machine

I've been developing in Java using the Vim or Notepad++ editors, and my Java code is on a remote Linux machine. For small changes, I just PuTTY/VNC to the remote Linux machine; for big changes, I use Notepad++ as it has FTP integrated into it. In Notepad++, I browse the remote files, download the files I want to edit, and simply saving a file writes it back to the remote Linux machine.
What I'm missing with Notepad++ is IntelliSense, auto code completion, and a couple of other features that would help me code faster.
I've used Eclipse before, where I could code locally and integrate with version control. However, in this case the files are located remotely and I cannot integrate with version control.
Anyone else in a similar situation who has a working solution they can explain?
This is the goal of Eclipse's Target Management (TM) and Remote Systems Explorer (RSE) projects: http://www.eclipse.org/tm/ . Also see their FAQ: http://wiki.eclipse.org/TM_and_RSE_FAQ
You can try something like SSHFS: it lets you mount the remote files and treat them as if they were local. It may not be efficient enough if you have a huge project or a crummy connection, but it's a nice way to bring remote resources local to your machine.
Addenda:
I don't know much about Windows, but I found this link.
Regarding rsync: rsync is a manual, after-the-fact process. With SSHFS you can save or build files, alt-tab over to the terminal window, and the files are already on the server. We've used it for PHP development: edit files locally, save them, tab to the browser on the server, and hit refresh -- shazam.
Okay, from the mention of PuTTY I infer you're running Windows.
Choice number one: get an operating system.
Sorry, I just had to say it.
Okay, you've really got two choices.
Choice one: use some kind of distributed configuration management system. Among the possibilities are darcs, bazaar, git, and mercurial. Subversion can access files remotely, so it can do the same thing in a limited sense. In all of these cases, you can replicate your files to the local machine and push them back using simple commands that transfer the files more or less optimally.
Choice two: use a remote file system. SSHFS and FTP file systems are good. I'd recommend ExpanDrive, which I've used very happily for some years on Macs. It's now available for Windows too.
Perhaps the easiest version of this is Dropbox, which replicates files across all your machines, including Linux. It's not very real-time, but it doesn't sound like you need that. I use Dropbox between home, laptop, and work (on a Linux machine), and by the time I've gotten to the office, all my changes at home have been replicated.

Is there no way to know an FTP shell script failed?

I have experience using a shell script to FTP files, and some files can simply be missed. For example, out of 100,000 files to be transferred, 1 or 2 might not make it, with no obvious error.
The shell script doesn't stop when one of the files fails to transfer. We don't want the script to abort over a single missing file, but we do want some feedback that there was a problem.
Would it be better if we used Java for the FTP transfers?
You should probably not use FTP at all -- rather, consider something like rsync, which can restart failed transfers and only transfers the files that have been updated since the last run.
The fact that you are asking if Java is better makes me think you are more proficient in Java. It's probably best to use whatever language you are more comfortable with.
Unless it's perl ;) j/k
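If you do end up driving the transfers from Java, here is a rough sketch of per-file error checking with Apache Commons Net (host, credentials, and file names are placeholders): storeFile() returns false for a failed upload, so you can collect failures without aborting the whole run.

    import java.io.FileInputStream;
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;
    import org.apache.commons.net.ftp.FTPClient;
    import org.apache.commons.net.ftp.FTPReply;

    public class CheckedFtpUpload {
        public static void main(String[] args) throws IOException {
            FTPClient ftp = new FTPClient();
            ftp.connect("ftp.example.com"); // placeholder host
            if (!FTPReply.isPositiveCompletion(ftp.getReplyCode())) {
                throw new IOException("FTP server refused connection");
            }
            ftp.login("user", "secret");
            ftp.enterLocalPassiveMode();

            List<String> failed = new ArrayList<>();
            for (String name : new String[] {"a.dat", "b.dat"}) { // your file list
                try (FileInputStream in = new FileInputStream(name)) {
                    if (!ftp.storeFile(name, in)) { // false = this upload failed
                        failed.add(name);
                    }
                }
            }
            ftp.logout();
            ftp.disconnect();

            if (!failed.isEmpty()) {
                System.err.println("Failed to upload: " + failed); // feedback without aborting
            }
        }
    }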

Uploading and Downloading Files on linux hosting server Using RSync algorithm

I am writing a Java application that backs up files to a server. It is a Windows application, and it has to perform incremental backups. To implement the incremental backup I am trying to follow the rsync algorithm. I found one Java library, "jarsync 0.3", but I can't figure out how to use rsync to upload and download files on a Linux hosting server (SSH enabled).
I have searched extensively for a solution for uploading and downloading files using rsync, but have not succeeded.
Please give me your valuable suggestions on how to use rsync for uploading and downloading files on a Linux hosting server.
From your question, it is not quite clear if you are trying to:
implement an rsync client
implement an rsync server
or just use rsync's internal algorithms for some other purpose
For the first two options: forget about it :-) See "Any good rsync library for Java?" for details.
If you need the last option - well, good luck. Wikipedia is your friend ;-)
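If it's that last option you're after, the heart of rsync is a weak rolling checksum plus a strong hash per block. A minimal sketch of the weak rolling checksum (this follows the published rsync description and is illustrative only, not the jarsync API):

    public final class RollingChecksum {
        private static final int M = 1 << 16;
        private final int blockLen;
        private int a, b;

        public RollingChecksum(byte[] data, int offset, int blockLen) {
            this.blockLen = blockLen;
            for (int i = 0; i < blockLen; i++) {
                int x = data[offset + i] & 0xFF;
                a = (a + x) % M;
                b = (b + (blockLen - i) * x) % M;
            }
        }

        // Slide the window one byte: drop 'out', take in 'in'
        public void roll(byte out, byte in) {
            int o = out & 0xFF, n = in & 0xFF;
            a = ((a - o + n) % M + M) % M;
            b = ((b - blockLen * o + a) % M + M) % M;
        }

        // The 32-bit weak checksum: low 16 bits are a, high 16 bits are b
        public int value() {
            return (b << 16) | a;
        }
    }

The receiver computes this over every block of its copy; the sender rolls the window over its file one byte at a time and only compares the strong hash when the weak checksum matches, which is what makes the algorithm cheap.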

Large File Download

Internet Explorer has a file download limit of 4 GB (2 GB on IE6). Firefox does not have this problem (I haven't tested Safari yet).
(More info here: http://support.microsoft.com/kb/298618)
I am working on a site that will allow the user to download very large files (up to and exceeding 100GB)
What is the best way to do this without using FTP? The end user must be able to download the file from their browser using HTTP. I don't think Flash or Silverlight can save files to the client, so as far as I know they won't cut it.
I'm guessing we will need an ActiveX or Java applet to pull this off. Something like the download manager that MSDN uses.
Does anyone know of a commercial (or free) component that will do that? We do not want the user to have to install a "browser wide" download manager (like GetRight), we want it to only work with downloading on our site.
Update: Here is some additional info to help clarify what I'm trying to do. Most of the files above the 4 GB limit would be large HD video files (it's for a video editing company). These will be downloaded by users across the internet; this isn't going to be people on a local network. We want the files to be available via HTTP (some users are going to be behind firewalls that won't allow FTP, BitTorrent, etc.). There will be a library of files the end user can download, so we aren't talking about a one-time large download. They will be downloading different large files on a semi-regular basis.
So far Vault, which #Edmund-Tay suggested, is the closest solution. The only problem is that it doesn't work for files larger than 4 GB (it instantly fails before starting the download; they are probably using a 32-bit integer somewhere that is overflowed by the content length of the file).
The best solution would be a Java applet or ActiveX component; since the problem only exists in IE, that would work like the article #spoulson linked to. However, so far I haven't had any luck finding a solution that does anything like that (multipart downloads, resume, etc.).
It looks like we might have to write our own. Another option would be to write a .NET application (maybe ClickOnce) that is associated with an extension or MIME type. Then the user would actually be downloading a small file from the web server that opens in the exe/ClickOnce app, which tells the application what file to download. That is how the MSDN downloader works. The end user would then only have to download/install an EXE once, which would be better than downloading an exe every time they want to download a large file.
#levand:
My actual preference, as a user, in these situations is to download a lightweight .exe file that downloads the file for you.
That's a dealbreaker for many, many sites. Users either are or should be extremely reluctant to download .exe files from websites and run them willy-nilly. Even if they're not always that cautious, incautious behaviour is not something we should encourage as responsible developers.
If you're working on something along the lines of a company intranet, a .exe is potentially an okay solution, but for the public web? No way.
#TonyB:
What is the best way to do this without using FTP?
I'm sorry, but I have to ask why the requirement. Your question reads to me along the lines of "what's the best way to cook a steak without any meat or heat source?" FTP was designed for this sort of thing.
bittorrent?
There have been a few web-based versions already (bitlet, w3btorrent), and Azureus was built using Java, so it's definitely possible.
Edit: #TonyB is it limited to port 80?
Please don't use ActiveX... I am so sick of sites that are only viewable in IE.
My actual preference, as a user, in these situations is to download a lightweight .exe file that downloads the file for you.
Can you split the files into pieces and then rejoin them after the download?
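If splitting is an option, a minimal Java sketch of split-and-rejoin (the chunk size and file naming are made up for the example):

    import java.io.*;

    public class SplitJoin {
        static final long CHUNK = 1L << 30; // 1 GB per piece, safely under IE's limit

        static void split(File src, File dir) throws IOException {
            try (InputStream in = new BufferedInputStream(new FileInputStream(src))) {
                byte[] buf = new byte[64 * 1024];
                long remaining = src.length();
                for (int part = 0; remaining > 0; part++) {
                    long pieceLen = Math.min(CHUNK, remaining);
                    File piece = new File(dir, src.getName() + ".part" + part);
                    try (OutputStream out = new BufferedOutputStream(new FileOutputStream(piece))) {
                        long copied = 0;
                        while (copied < pieceLen) {
                            int n = in.read(buf, 0, (int) Math.min(buf.length, pieceLen - copied));
                            if (n < 0) throw new EOFException("File shrank while splitting");
                            out.write(buf, 0, n);
                            copied += n;
                        }
                    }
                    remaining -= pieceLen;
                }
            }
        }

        static void join(File[] pieces, File dest) throws IOException {
            try (OutputStream out = new BufferedOutputStream(new FileOutputStream(dest))) {
                byte[] buf = new byte[64 * 1024];
                for (File piece : pieces) { // pieces must be in .part0, .part1, ... order
                    try (InputStream in = new BufferedInputStream(new FileInputStream(piece))) {
                        int n;
                        while ((n = in.read(buf)) != -1) {
                            out.write(buf, 0, n);
                        }
                    }
                }
            }
        }
    }

The catch is that the user then has to rejoin the pieces, which pushes a tool onto the client anyway.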
If you don't want to write Java code in-house, there are commercial applet solutions available:
Vault
MyDownloder
Both of them have eval versions that you can download and test.
A few ideas:
Blizzard uses a lightweight .exe BitTorrent wrapper for their patches. I'm not entirely sure how it is done, but it looks like a branded version of the official BitTorrent client.
Upload to Amazon S3 and provide the torrent link of the file (all S3 files are automatically BitTorrent-enabled), plus the full HTTP download link as an alternative. See the S3 documentation.
What about saying "We recommend that you install Free Download Manager to download this file. You will have the added benefit of being able to resume the download and accelerate it."
Personally, I never download anything using the built-in browser download tool unless I have to (e.g. Gmail attachments).
#travis
Unfortunately, it has to be over HTTP inside the user's browser.
I'll update the question to be clearer about that.
#levand
The problem only exists in IE (it works in Firefox), so while ActiveX would only work in IE, IE is the only browser we need the workaround for.
#travis - interesting idea. Not sure if it will work for what I need, but I'll keep it in mind. I'm hoping to find something to integrate with the existing site instead of having to go out to a third party. It would also require me to set up a BitTorrent tracker, which wouldn't be as easy as it sounds for this application, because different users will have different access to different files.
#jjnguy
I'm looking for a Java applet or ActiveX component that will do that for me. These are non-technical users, so we really just want them to click download and have the full file end up in the specified location.
#ceejayoz
I totally agree, but it's part of the requirement for our client. There will be FTP access, but each user will have the option of downloading via HTTP or FTP. There are some users that will be behind corporate firewalls that don't permit FTP.
I have seen other sites do this in the past (MSDN, Adobe), so I was hoping there is something out there already instead of having to make one in-house (and learning Java and/or ActiveX).
I say a ClickOnce-installed download manager, similar to MSDN's.
But becoming a CDN without a protocol better suited to the job is no easy task. I can't imagine a business model where such large file downloads are worthwhile as a core competency, unless you are doing something like MSDN. If you create a thick client, you at least get the chance for some more face time with the users, for advertising or some other revenue model, since you will probably be paying hundreds of thousands of dollars to host such a service.
The problem with the applet approach mentioned is that unless you have the end user modify their Java security properties, your applet will not have permission to save to the hard drive.
It may be possible using Java Web Start (aka JNLP). I think that if the app is signed, it can get the extra permission to write to the hard drive. This is not too different from the download-an-exe approach. The catch is that the user has to have the correct version of Java installed and has to have Java Web Start set up correctly.
I would recommend the exe approach, as it will be the easiest for non-technical users.
There are some users that will be behind corporate firewalls that don't permit FTP...
Are users with restrictive firewalls like that likely to be permitted to install and run a .exe file from your website?
Take a look at cURL. This article describes how to do a multi-part simultaneous download via HTTP. I've used cURL in the past to manage FTP downloads of files over 300GB.
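For what it's worth, the same multi-part idea can be sketched in Java with HTTP Range requests (the URL and offsets are illustrative, and the server must answer 206 Partial Content); run several downloadPart calls in parallel threads over disjoint ranges of the same file:

    import java.io.IOException;
    import java.io.InputStream;
    import java.io.RandomAccessFile;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class RangeDownload {
        static void downloadPart(String url, String dest, long start, long end) throws IOException {
            HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
            conn.setRequestProperty("Range", "bytes=" + start + "-" + end);
            if (conn.getResponseCode() != HttpURLConnection.HTTP_PARTIAL) {
                throw new IOException("Server does not support ranged requests");
            }
            try (InputStream in = conn.getInputStream();
                 RandomAccessFile out = new RandomAccessFile(dest, "rw")) {
                out.seek(start); // write this slice at its own offset
                byte[] buf = new byte[64 * 1024];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                }
            }
        }
    }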
Another tip: you can boost download times even more if you increase the size of the TCP window in the client's NIC configuration. Set it as high as the OS allows and you should see up to a 2x improvement, depending on your physical network. This worked for me on Windows 2000 and 2003 when FTPing over a WAN. The downside is that it may increase overhead for all other network traffic that only needs a few KB per packet but is now forced to send/receive 64 KB packets. Your mileage may vary.
Edit: What exactly are you trying to accomplish? Who is the audience? I assumed for a bit that you're looking to do this over your own network, but you seem to imply the client side is someone on the internet. I think we need clearer requirements.
Create a folder for the files to be downloaded on the server where the document service is running (either using Linux commands or using Java to execute shell commands).
Write the file to be downloaded into this folder (again via a Linux command or a Java shell command); for execution efficiency, the wget command is used here.
Package the folder as a zip file (using a shell command), configure an nginx proxy, return the nginx access path for the file to the front end, and then download it from the front end.
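A rough sketch of those steps driven from Java (the paths, URL, and zip name are all made up for the example):

    import java.io.IOException;

    public class PrepareDownload {
        public static void main(String[] args) throws IOException, InterruptedException {
            run("mkdir", "-p", "/srv/downloads/job42");                              // create the folder
            run("wget", "-P", "/srv/downloads/job42", "http://example.com/big.dat"); // fetch the file
            run("zip", "-r", "/srv/downloads/job42.zip", "/srv/downloads/job42");    // package as zip
            // nginx then serves /srv/downloads/job42.zip; return that path to the front end
        }

        static void run(String... cmd) throws IOException, InterruptedException {
            Process p = new ProcessBuilder(cmd).inheritIO().start();
            if (p.waitFor() != 0) {
                throw new IOException("Command failed: " + String.join(" ", cmd));
            }
        }
    }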
