I am trying to upload files to an Amazon EC2 virtual machine running Ubuntu. I use the JSch library for the SSH connection, as shown here.
Connection through SSH succeeds, but when I try to upload the file I get a "Permission Denied" error.
I use keys to log in to the EC2 instance.
The question is: how do I set the permissions on Ubuntu to allow the file upload?
This seems more like a Linux/filesystem question, if I understand you correctly. Copy-paste your exact error, please.
Where on your EC2 instance are you trying to put the file? SSH to your EC2 instance as the same user as the SFTP user you've specified in your Java code, cd to the target directory, and check ownership and permissions with ls -l. Then try to create a file there with touch mytestfile. Do you get any permission errors at this point?
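If the directory turns out to be the problem, uploading to a path the login user owns (for example /home/ubuntu) usually avoids the error. A minimal JSch sketch, where the host name, key file, and file names are placeholders:

    import com.jcraft.jsch.ChannelSftp;
    import com.jcraft.jsch.JSch;
    import com.jcraft.jsch.Session;

    public class Ec2Upload {
        public static void main(String[] args) throws Exception {
            JSch jsch = new JSch();
            jsch.addIdentity("/path/to/my-key.pem"); // placeholder key file

            // Placeholder host; "ubuntu" is the default user on Ubuntu AMIs.
            Session session = jsch.getSession("ubuntu", "ec2-xx.compute.amazonaws.com", 22);
            session.setConfig("StrictHostKeyChecking", "no"); // fine for a quick test
            session.connect();

            ChannelSftp sftp = (ChannelSftp) session.openChannel("sftp");
            sftp.connect();
            // Write into the login user's home directory, which it owns.
            sftp.put("local-file.txt", "/home/ubuntu/local-file.txt");
            sftp.disconnect();
            session.disconnect();
        }
    }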
Alternative:
What kind of file is this? Does it change often? Do the writes/reads have to be faster than HTTP? A better "cloud design pattern" would be to upload your file to an S3 bucket and give your EC2 instance read (and write?) permissions to that bucket/file.
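A minimal sketch of that pattern with the AWS SDK for Java (v1); the bucket and key names are placeholders, and credentials are assumed to come from the default provider chain:

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import java.io.File;

    public class S3Upload {
        public static void main(String[] args) {
            // Credentials are resolved from the default chain (environment,
            // ~/.aws/credentials, or the EC2 instance role).
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
            s3.putObject("my-app-bucket", "uploads/local-file.txt", new File("local-file.txt"));
        }
    }

The EC2 instance can then read the same object through its instance role instead of shipping credentials around.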
I am new to AWS EFS and am trying to understand how EFS file upload works.
Is there a way to upload files to EFS from a local machine programmatically using Java?
EFS is only accessible from within a VPC; you can't access it directly from outside of AWS. So you would have to set up a VPN connection between your home network and your VPC, and then mount the EFS filesystem on your local computer.
AWS EFS is a managed NFS service. Copying files from a local (on-premises) machine requires mounting it through a VPN connection or AWS Direct Connect. There is a guide for this here.
Once that is done, you can access it just like any other mounted file system, either with Java or otherwise.
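For example, assuming the filesystem has been mounted at a hypothetical mount point /mnt/efs, a plain NIO copy is all it takes:

    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    public class EfsCopy {
        public static void main(String[] args) throws Exception {
            Path local = Paths.get("local-file.txt");
            Path efs = Paths.get("/mnt/efs/local-file.txt"); // hypothetical mount point
            // Once the NFS mount is up, EFS behaves like any other filesystem.
            Files.copy(local, efs, StandardCopyOption.REPLACE_EXISTING);
        }
    }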
I saw the answer mentioned by Gord in Unable to connect to a database on a shared drive - UCanAccess.
I am able to access my db from Windows to the Windows Server where my Access database file resides.
But when I deploy the same code on Unix, I am not able to access my database, even though I am using the same URL as proposed by Gord.
My URL is:
datasource.crr.url=jdbc:ucanaccess://////abc.dch.com\\der\\Share\\SongUnflaggedTest\\Songs\ Unflagged.accdb;Skipindexes=true;memory=true
Unlike Windows, most Linux/Unix environments are unable to directly access a file in a shared folder by simply using its UNC path, e.g.,
\\server\share\folder\file.ext
Instead, we normally have to tell the Linux/Unix box to mount the share at a point on the local filesystem (sort of like assigning a drive letter in Windows), and then use that as the starting point.
For example, if we mount the share
\\server\share
to a mount point on the local filesystem named
/mnt/servershare
then we can access the file using the path
/mnt/servershare/folder/file.ext
See this Ask Ubuntu question for an example.
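Applied to the connection string above, the UNC path is replaced with the mounted path. A sketch, assuming the share has been mounted at the hypothetical mount point /mnt/share:

    import java.sql.Connection;
    import java.sql.DriverManager;

    public class UcanaccessOnUnix {
        public static void main(String[] args) throws Exception {
            // /mnt/share is a hypothetical mount point for \\abc.dch.com\der\Share
            String url = "jdbc:ucanaccess:///mnt/share/SongUnflaggedTest/Songs Unflagged.accdb;"
                    + "skipIndexes=true;memory=true";
            try (Connection conn = DriverManager.getConnection(url)) {
                System.out.println("Connected: " + !conn.isClosed());
            }
        }
    }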
I have a program that needs to read/write files to/from a network computer. Sometimes, however, the program cannot access the folder on the network computer. Currently, to fix this issue, I go into the network section of Windows Explorer, click the computer, and enter my credentials, and then my program is able to read and write to this computer without a problem. Is there a way to have Java ask for these credentials, or a way to send them automatically using Java? I am using Win7 locally and Win7 Embedded on the network computer.
You can use http://jcifs.samba.org/, a library for accessing remote CIFS shares. It allows you to set a username and password, works on any platform (not just Windows), and works without needing to mount a drive.
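A minimal jCIFS sketch; the host, share, and credentials below are placeholders:

    import jcifs.smb.NtlmPasswordAuthentication;
    import jcifs.smb.SmbFile;
    import jcifs.smb.SmbFileOutputStream;

    public class CifsWrite {
        public static void main(String[] args) throws Exception {
            // Domain can be null for workgroup machines; all values are placeholders.
            NtlmPasswordAuthentication auth =
                    new NtlmPasswordAuthentication("MYDOMAIN", "myuser", "mypassword");
            SmbFile file = new SmbFile("smb://networkpc/shared/report.txt", auth);
            try (SmbFileOutputStream out = new SmbFileOutputStream(file)) {
                out.write("hello from Java".getBytes());
            }
        }
    }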
After creating an instance in the Amazon cloud using the web service API in Java, I need to transfer an executable file or WAR file from my local machine to the newly created instance, and then execute it. I found that there is something called CreateBucket in the EC2 API; using that we can upload the file, and using PutObjectRequest we can transfer the reference to a remote computer in Amazon. Is this possible? If it is wrong, please suggest the correct way to transfer a file from my local machine to an Amazon EC2 instance.
The basic suggestion is: you shouldn't transfer the file(s) with CreateBucket, which is actually an S3 API. Using scp may be a better solution.
Amazon S3, which you are trying to use with CreateBucket, is a data storage service mainly for flexible, public (with authentication) file sharing. You can use its REST or SOAP APIs to access the data, but you cannot read/write it from EC2 instances as if it were on a local hard disk.
To access the file system of an EC2 instance, it really depends on the operating system (on EC2). If it's running Linux, scp is a mature choice. You can use Java to invoke scp directly if you are working from Linux locally, or pscp if you are using Windows. If the EC2 instance is running Windows, one choice is to host an SSH/SFTP environment with FreeSSHD and then proceed as with Linux. Another option is to use a shared folder and a regular file copy.
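For the Linux case, a minimal sketch that shells out to scp from Java; the key path, host, and file names are placeholders, and scp is assumed to be on the PATH:

    public class ScpUpload {
        public static void main(String[] args) throws Exception {
            Process p = new ProcessBuilder(
                    "scp", "-i", "/path/to/my-key.pem",
                    "myapp.war",
                    "ubuntu@ec2-xx.compute.amazonaws.com:/home/ubuntu/")
                    .inheritIO() // show scp's output in this console
                    .start();
            int exitCode = p.waitFor();
            System.out.println("scp exited with " + exitCode);
        }
    }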
I want to download and upload files from a remote Windows machine (workgroup or domain) to my local Unix machine. I don't want to use an SFTP or FTP server. I also considered the jCIFS (SMB) library, but it only allows access to shared directories; I want to access any directory, given sufficient user permissions. How can I do this? I think Active Directory has this capability.
I would just open up a Samba share on your Unix machine and connect to that share from your Windows machine.