I'm currently working on a file agent that will be deployed on both Linux and Windows machines to perform unified file transfer. It generally consists of an sshd server and a VFS manager. Usually one agent uses the VFS manager to connect to the SFTP server on another agent and manipulate files on it.
The obstacle I just encountered is that the Windows file system differs from Linux's in that it typically has multiple roots (drives). Although the root path of the SSH server can be configured using the FileSystemFactory, it can't be changed at runtime, which makes it impossible to access other drives after the server bootstraps.
When using VFS to connect to the SFTP subsystem of another agent, as expected it can only resolve files on the drive where its root path resides. However, WinSCP does not seem to be restricted by this and can change both the current directory and the drive when connected.
I wonder whether it is possible to construct a virtual FileObject corresponding to the / of the Linux file system and access the different drives as if they were folders under that root. Or is there another way to acquire a FileObject on other drives?
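For context, the VFS side currently resolves remote files along these lines (a minimal sketch using Apache Commons VFS; the host, credentials and path are placeholders):

    import org.apache.commons.vfs2.FileObject;
    import org.apache.commons.vfs2.FileSystemManager;
    import org.apache.commons.vfs2.VFS;

    public class SftpResolveSketch {
        public static void main(String[] args) throws Exception {
            FileSystemManager fsManager = VFS.getManager();
            // The path after the host is resolved against the server's configured root,
            // so on a Windows agent only the drive containing that root is reachable.
            FileObject remoteFile = fsManager.resolveFile(
                    "sftp://user:password@agent-host/some/dir/file.txt");
            System.out.println("exists: " + remoteFile.exists());
            remoteFile.close();
        }
    }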
I have an NFS server on which I need to host files and read them. The approach I found for reading and writing files on an NFS server is
using an NFS client, like here.
My question is: when we can write content to an NFS server with a normal Java read/write program, why was the NFS client introduced? Is there any NFS-specific service that these clients provide, and why is it different from the normal file creation process?
When you're using the normal Java API to access an NFS folder, all communication is actually handled by your OS. So you can just use the normal File API, and Java doesn't know whether it's accessing a local file or a remote one. But in cases where your OS doesn't support NFS (e.g. your Java app is running in an environment with limited resources, or NFS mounting is disabled at the OS level), or you are developing an application that needs lower-level details about the NFS resource (e.g. when you're developing a framework or a middleware), you may need to communicate directly with the server that exposes files/folders via a library like nfs-client-java.
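For the common case where the OS has already mounted the export, plain Java I/O is all you need (a minimal sketch; /mnt/nfs/share is an assumed mount point):

    import java.io.File;
    import java.nio.file.Files;
    import java.nio.file.Path;

    public class NfsMountSketch {
        public static void main(String[] args) throws Exception {
            // The OS has already mounted the NFS export at /mnt/nfs/share (assumed path);
            // from Java's point of view this is just another local directory.
            Path remote = new File("/mnt/nfs/share/notes.txt").toPath();
            Files.writeString(remote, "written over NFS\n");
            Files.readAllLines(remote).forEach(System.out::println);
        }
    }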
I started with the requirement of reading and writing files from/in a directory on a remote Ubuntu machine.
First, I wrote a Java program that could read and write files from a shared folder on a remote Windows machine, i.e. on a LAN. Here, something like this works on my (local) Windows machine:
File inputFile = new File(
        "\\\\172.17.89.76\\EBook PDF"); // UNC path to the share; the location is just for the idea
Now, when I consider a remote Ubuntu machine, obviously I cannot do something like this, as the machine is not on the LAN (I'm not sure it could be done even if it were on the LAN!). Hence, I tried the following approaches:
Using JSch, establishing trust between the two machines (local - remote Linux, remote Linux - remote Linux) and writing files using SFTP. (done; see the sketch after this list)
Running sockets on the two machines - one sender, one receiver (both Java). (done)
Attempting to achieve I/O like the code snippet above for Windows (LAN) machines. (not achieved)
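For reference, the JSch/SFTP approach looked roughly like this (a minimal sketch; host, credentials and paths are placeholders):

    import com.jcraft.jsch.ChannelSftp;
    import com.jcraft.jsch.JSch;
    import com.jcraft.jsch.Session;

    public class SftpUploadSketch {
        public static void main(String[] args) throws Exception {
            JSch jsch = new JSch();
            Session session = jsch.getSession("user", "remote-ubuntu-host", 22);
            session.setPassword("password");
            session.setConfig("StrictHostKeyChecking", "no"); // for a quick test only
            session.connect();

            ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
            sftp.connect();
            sftp.put("C:/local/input.txt", "/home/user/input.txt"); // upload one file
            sftp.disconnect();
            session.disconnect();
        }
    }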
While doing all this, I had many queries, read many posts, etc., and I felt that I was missing something fundamental:
Some sort of trust-building utility (between the two machines) will be required to achieve I/O. But in the end, I want to write code like the snippet given, irrespective of the machines, the network, etc.
The JSch solution and the others suggested (use of HTTP, FTP, etc. over a URL) ultimately rely on services that are running on the remote machine. In other words, it is NOT Java I/O that is being used to access the remote file system - this doesn't appeal to me, as I'm relying on services rather than using good old I/O.
Samba and SSHFS also popped onto the scene, only to add to my confusion. But I don't see them as solutions to my objective!
To reiterate, I want to write code using Java I/O (either plain I/O or NIO, both are fine) which can simply read and write remote files without using services over protocols like FTP, HTTP, etc., or a socket sender-receiver model. Is my expectation valid?
If not, why not, and what is the best I can do to read/write remote files using Java?
If yes, how do I achieve it?
P.S.: Please comment in case I need to elaborate to pose my question more accurately!
To answer your question - No, your expectation isn't valid.
Retrieving files from a remote server is inherently reliant on the services running on that server. To retrieve a file from a remote server, the remote server needs to be expecting your request for a file.
The cases you listed in your question (using JSch and SFTP, using sender and receiver Java sockets), which you have already achieved, are essentially the same as this:
File inputFile = new File(
        "\\\\172.17.89.76\\EBook PDF");
The only difference is that Java is using the native OS's built-in support for reading from a Windows-style share. The remote Windows machine has a sharing service running on it (just like Samba on Linux, or a Java socket program) waiting for your request.
From the Java API docs on File (http://docs.oracle.com/javase/6/docs/api/java/io/File.html)
The canonical pathname of a file that resides on some other machine and is accessed via a remote-filesystem protocol such as SMB or NFS ...
So essentially "Good old Java I/O" is more or less just a wrapper over some common protocols.
To answer the second part of your question (what is the best I can do to read/write remote files using Java?), that depends on what remote system you are accessing and, more importantly, what services are running on it.
In the case of your target remote machine being an Ubuntu machine, I would say the best alternative would be to use JSch. If your target machine can be either a Windows machine or a Linux machine, I would probably go for running Java sockets on the two machines (obviously dependent on whether you are able to install your app on the remote machine).
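A rough sketch of the socket sender/receiver idea (port 9000 and the file names are assumptions; the receiver must already be running on the remote machine):

    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.ServerSocket;
    import java.net.Socket;
    import java.nio.file.Files;
    import java.nio.file.Paths;

    public class FileSocketSketch {
        // Receiver: run on the remote machine; writes whatever arrives to received.bin.
        static void receive() throws Exception {
            try (ServerSocket server = new ServerSocket(9000);
                 Socket socket = server.accept();
                 InputStream in = socket.getInputStream()) {
                Files.copy(in, Paths.get("received.bin"));
            }
        }

        // Sender: run on the local machine; streams local.bin to the receiver.
        static void send(String remoteHost) throws Exception {
            try (Socket socket = new Socket(remoteHost, 9000);
                 OutputStream out = socket.getOutputStream()) {
                Files.copy(Paths.get("local.bin"), out);
            }
        }
    }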
Generally speaking, go with the lowest common denominator between your target systems (in terms of file sharing protocols).
If you want to access a filesystem on a remote computer, then that computer has to make its filesystem available via a service. Such a service is typically a background job which handles incoming requests and returns responses, e.g. for authentication, authorization, reading and writing. The specification of the request/response pattern is called a protocol. Well-known protocols are SMB (implemented by Samba) on Windows and NFS on UNIX/Linux. To access such a remote service, you mount the remote filesystem at the operating-system level and make it available locally as a drive on Windows or as a mount point on UNIX.
Then you can access the remote file system from your Java program like any local file system.
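For example, once the remote share is mounted (a minimal sketch; /mnt/remote is an assumed mount point):

    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.util.stream.Stream;

    public class MountedShareSketch {
        public static void main(String[] args) throws Exception {
            // /mnt/remote is where the OS mounted the remote filesystem (assumed).
            Path dir = Paths.get("/mnt/remote/reports");
            try (Stream<Path> entries = Files.list(dir)) {
                entries.forEach(System.out::println); // looks exactly like a local listing
            }
        }
    }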
Of course, it is also possible to write your own file service provider (with your own protocol layer) and run it on the remote machine. Sockets (TCP/IP) can be used as the transport layer for such an endeavor. Another good transport would be the HTTP protocol, e.g. with a RESTful service or something based on WebDAV.
We used sshfs. You can add to /etc/fstab the line:
sshfs#user@remoteAddress:remoteDir /mnt/ssh fuse defaults 0 0
and then mount /mnt/ssh
I think RMI might be the solution: you could set up an RMI server on the machine you want to connect to, and use your machine as the client.
I would give the client a path to the file; this would be sent to the server, and the server could then read in the file as bytes and send the file back to the client.
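A minimal sketch of that idea (the FileService interface and its names are hypothetical, and error handling is kept to a minimum):

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.rmi.Remote;
    import java.rmi.RemoteException;
    import java.rmi.registry.LocateRegistry;
    import java.rmi.registry.Registry;
    import java.rmi.server.UnicastRemoteObject;

    // Remote interface shared by client and server (hypothetical).
    interface FileService extends Remote {
        byte[] readFile(String path) throws RemoteException;
    }

    public class FileServiceServer implements FileService {
        @Override
        public byte[] readFile(String path) throws RemoteException {
            try {
                return Files.readAllBytes(Paths.get(path)); // read the requested file as bytes
            } catch (Exception e) {
                throw new RemoteException("Could not read " + path, e);
            }
        }

        public static void main(String[] args) throws Exception {
            FileService stub = (FileService) UnicastRemoteObject.exportObject(new FileServiceServer(), 0);
            Registry registry = LocateRegistry.createRegistry(1099);
            registry.rebind("fileService", stub);
            System.out.println("FileService bound");
        }
    }

On the client side, LocateRegistry.getRegistry("server-host").lookup("fileService") would return the stub, and readFile can then be called on it to pull the bytes over.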
How do I upload a folder from a Windows machine to a Linux machine using Java? The Windows machine is the client machine. I am establishing the connection to the Linux machine from the Windows machine using PuTTY; the Java code will be running on the Linux machine. This is a servlet-based project.
I'd suggest trying FTP, since you're working from Linux to Windows.
Windows is a little cranky about these things, so it might take some setting up to get it to work.
Check out How to retrieve a file from a server via SFTP? for some suggestions.
If you can, I'd reverse the process and copy the folder from Windows to Linux; Linux just seems to be easier to set up to handle this kind of thing... IMHO.
You can use FTP, SSH or Telnet. FTP and SSH are preferable.
If you choose SSH, I suggest you use JSch. For FTP you can use Apache Commons VFS (formerly Jakarta VFS).
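A hedged sketch of the Commons VFS route for the FTP upload (host, credentials and directory paths are placeholders):

    import org.apache.commons.vfs2.FileObject;
    import org.apache.commons.vfs2.FileSystemManager;
    import org.apache.commons.vfs2.Selectors;
    import org.apache.commons.vfs2.VFS;

    public class FtpUploadSketch {
        public static void main(String[] args) throws Exception {
            FileSystemManager manager = VFS.getManager();
            // Local folder to upload and remote FTP target (placeholder locations).
            FileObject localDir = manager.resolveFile("file:///C:/data/upload");
            FileObject remoteDir = manager.resolveFile("ftp://user:password@linux-host/home/user/upload");
            // Copy the whole folder, including subdirectories.
            remoteDir.copyFrom(localDir, Selectors.SELECT_ALL);
            localDir.close();
            remoteDir.close();
        }
    }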
What I'm attempting to do is copy a number of files from one host machine to a remote server using Java, and after the copy is made, I'll execute those files that I transferred. The host machine may have some dependencies, like requiring PuTTY or some other program, but I'm hoping there might be a solution that doesn't require anything to be installed on the remote side. On top of that, this needs to be OS independent, though different methods can be used for different communications. I'll have access to the IP address and admin control (root username and password).
What I have so far: for Windows to Windows, I can mount the remote Windows drive and access the files that way. For Windows to Linux, I can use PuTTY or a similar program to SSH into the remote box. I'll also SSH from Linux to Linux, and obviously I won't need PuTTY there. I can't figure out what to do for a Linux-to-Windows instance that won't require me to set up some SSH method on the remote end. Any ideas? Is there any way (or library) to perform both the copy and execute steps that isn't OS specific?
A simple solution is to use what Windows already offers (Remote Desktop): rdesktop, or the more comfortable Terminal Server Client if you are on a GNOME machine.
To get the files to the Windows box, you can set up the Samba client on your Linux machine, mount the Windows file share, copy your files there, connect via rdesktop, and then execute them.
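From the Java side, once the share is mounted, the copy itself is plain local I/O (a minimal sketch; /mnt/winshare is an assumed mount point and the file names are placeholders):

    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    public class CopyToShareSketch {
        public static void main(String[] args) throws Exception {
            // The Windows share has been mounted via the Samba client at /mnt/winshare (assumed).
            Path source = Paths.get("/home/user/tools/setup.exe");
            Path target = Paths.get("/mnt/winshare/tools/setup.exe");
            Files.createDirectories(target.getParent());
            Files.copy(source, target, StandardCopyOption.REPLACE_EXISTING);
            // Execution still happens on the Windows side, e.g. manually over rdesktop.
        }
    }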