I have an application written with aws-sdk-java that reads content from an S3 bucket. On EC2 I use an IAM role to connect to S3, and on my local environment I use user profile credentials. Both cases work fine. But I do not want to use user credentials (access key ID and secret access key) locally (Eclipse IDE). I have been looking at different options, including an SSH tunnel to S3 through an EC2 instance. I have seen many examples of SSH tunnels to an RDS instance, but none to S3. Is there a better way to access an S3 bucket from my local machine without local credentials, for example by forwarding requests through an EC2 instance? Thanks in advance
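One common approach is to build the client with the default credential provider chain, which automatically picks up the IAM role on EC2 and falls back to environment variables or the shared credentials file locally, so no keys are hard-coded. A minimal sketch with the v1 aws-sdk-java (the region is a placeholder):

```java
import com.amazonaws.auth.DefaultAWSCredentialsProviderChain;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class S3ClientFactory {
    public static AmazonS3 create() {
        // The chain tries environment variables, system properties, the shared
        // credentials file, and finally the EC2 instance profile (IAM role),
        // in that order, so the same code works on EC2 and locally.
        return AmazonS3ClientBuilder.standard()
                .withRegion("us-east-1")  // placeholder region
                .withCredentials(new DefaultAWSCredentialsProviderChain())
                .build();
    }
}
```

If you want no long-lived keys on the local machine at all, temporary credentials issued via AWS STS are another option worth investigating.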
I need to upload a file via a web form to AWS and then trigger a function to import it into a Postgres DB. I have the file import to a DB working locally using Java, but need it to work in the cloud.
The upload needs to pass some settings (such as which table to import into) to a Java function that imports the file into the Postgres DB.
I can upload files to an EC2 instance with PHP, but then need to trigger a Lambda function on that file. My research suggests S3 buckets may be a better solution. I'm looking for pointers on which services are best suited.
There are two main steps in your scenario:
Step 1: Upload a file to Amazon S3
It is simple to create an HTML form that uploads data directly to an Amazon S3 bucket.
However, it is typically unwise to allow anyone on the Internet to use the form, since they might upload any number and type of files. Typically, you will want your back-end to confirm that the user is entitled to upload the file. Your back-end can then generate a presigned URL (see Upload objects using presigned URLs - Amazon Simple Storage Service), which authorizes the user to perform the upload.
For some examples in various coding languages, see:
Direct uploads to AWS S3 from the browser (crazy performance boost)
File Uploads Directly to S3 From the Browser
Amazon S3 direct file upload from client browser - private key disclosure
Uploading to Amazon S3 directly from a web or mobile application | AWS Compute Blog
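As a rough illustration of the presigned-URL step in Java, here is a sketch with the v1 aws-sdk-java (bucket, key, and expiry time are placeholders):

```java
import java.net.URL;
import java.util.Date;
import com.amazonaws.HttpMethod;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.GeneratePresignedUrlRequest;

public class PresignExample {
    public static URL presignUpload(String bucket, String key) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        // URL valid for 10 minutes; the browser then PUTs the file directly to S3
        Date expiry = new Date(System.currentTimeMillis() + 10 * 60 * 1000);
        GeneratePresignedUrlRequest request =
                new GeneratePresignedUrlRequest(bucket, key)
                        .withMethod(HttpMethod.PUT)
                        .withExpiration(expiry);
        return s3.generatePresignedUrl(request);
    }
}
```

The back-end returns this URL to the browser, which uploads the file with a plain HTTP PUT; the user never sees any AWS credentials.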
Step 2: Load the data into the database
When the object is created in the Amazon S3 bucket, you can configure S3 to trigger an AWS Lambda function, which can be written in the programming language of your choice.
The Bucket and Filename (Key) of the object will be passed into the Lambda function via the event parameter. The Lambda function can then:
Read the object from S3
Connect to the database
Insert the data into the desired table
It is your job to code this functionality but you will find many examples on the Internet.
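The three steps above might look roughly like this in a Java Lambda handler. This is a sketch only: the JDBC URL, table name, and line-by-line insert logic are placeholders, and it assumes the aws-lambda-java-events library and a Postgres JDBC driver are on the classpath.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class ImportHandler implements RequestHandler<S3Event, String> {
    @Override
    public String handleRequest(S3Event event, Context context) {
        // Bucket and key of the newly created object arrive in the event
        String bucket = event.getRecords().get(0).getS3().getBucket().getName();
        String key = event.getRecords().get(0).getS3().getObject().getKey();

        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(
                     s3.getObject(bucket, key).getObjectContent()));
             Connection db = DriverManager.getConnection(
                     "jdbc:postgresql://db-host:5432/mydb", "user", "password")) {
            PreparedStatement insert =
                    db.prepareStatement("INSERT INTO my_table (line) VALUES (?)");
            String line;
            while ((line = reader.readLine()) != null) {
                insert.setString(1, line);  // real parsing depends on your file format
                insert.executeUpdate();
            }
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
        return "ok";
    }
}
```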
You can use the AWS SDK in the language of your choice to invoke Lambda.
Please refer to this documentation.
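For reference, invoking a Lambda function from Java with the v1 SDK might look like this (the function name and JSON payload are placeholders):

```java
import com.amazonaws.services.lambda.AWSLambda;
import com.amazonaws.services.lambda.AWSLambdaClientBuilder;
import com.amazonaws.services.lambda.model.InvokeRequest;
import com.amazonaws.services.lambda.model.InvokeResult;

public class InvokeExample {
    public static void main(String[] args) {
        AWSLambda lambda = AWSLambdaClientBuilder.defaultClient();
        InvokeRequest request = new InvokeRequest()
                .withFunctionName("my-import-function")    // placeholder name
                .withPayload("{\"table\": \"my_table\"}"); // placeholder payload
        InvokeResult result = lambda.invoke(request);
        System.out.println("Status: " + result.getStatusCode());
    }
}
```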
I am new to AWS EFS and trying to understand how EFS file upload works.
Is there a way to upload files to EFS from a local machine programmatically using Java?
EFS is only accessible from within a VPC. You can't access it directly from outside of AWS, so you would have to set up a VPN connection between your home network and your VPC, and then mount the EFS filesystem on your local computer.
AWS EFS is a managed NFS service. Copying files from a local (on-premises) machine requires mounting it through a VPN connection or AWS Direct Connect. There is a guide for this here.
Once this is done, you can access it just like any other mounted file system, either with Java or otherwise.
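Once the filesystem is mounted, writing to EFS from Java is just ordinary file I/O. A minimal sketch using java.nio (the mount point `/mnt/efs` is a placeholder; use wherever you mounted the NFS share):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class EfsCopy {
    // Copies a local file into the (assumed) EFS mount directory.
    public static Path copyToMount(Path source, Path mountDir) throws IOException {
        Path target = mountDir.resolve(source.getFileName());
        // REPLACE_EXISTING so a repeated upload overwrites the previous copy
        return Files.copy(source, target, StandardCopyOption.REPLACE_EXISTING);
    }
}
```

Usage would be something like `EfsCopy.copyToMount(Paths.get("/tmp/data.csv"), Paths.get("/mnt/efs"))`; to Java it is indistinguishable from any other directory.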
Hi, I'm trying to fetch files from AWS S3 to EC2 to zip them, and then want to upload the zip back to S3,
all via AWS-internal communication.
To achieve this, I have set up a VPC, and both S3 and EC2 are in the same region.
I'm able to fetch files from S3 to EC2 with the AWS CLI, but don't know how to achieve the same using Java.
I need help with this.
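The zip step in the middle is plain Java. A sketch of that part (the S3 transfers around it would use the SDK's `getObject` to download each file to disk and `putObject` to upload the finished archive; those calls are not shown here):

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class S3Zipper {
    // Zips the given local files (e.g. just downloaded from S3) into one archive.
    public static Path zipFiles(List<Path> files, Path zipPath) throws IOException {
        try (OutputStream os = Files.newOutputStream(zipPath);
             ZipOutputStream zos = new ZipOutputStream(os)) {
            for (Path file : files) {
                zos.putNextEntry(new ZipEntry(file.getFileName().toString()));
                Files.copy(file, zos);  // stream the file into the current entry
                zos.closeEntry();
            }
        }
        return zipPath;
    }
}
```

Because both the EC2 instance and the bucket are in the same region, the downloads and the final upload stay on the AWS network.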
I have created my VM in AWS using an EC2 instance. I have also created a bucket in S3 and some folders in that bucket. I have my Java jar file in FileZilla and now I want to run my application. This scenario works when the files I want to access are also in FileZilla, but now that I have created S3 storage I do not know how my Java application can access these folders. Do I need to create a new Java class in my application in order to access these folders from S3?
I am not very familiar with AWS and will appreciate any help from you guys. Even some useful links that solve my problem would be a great help for me.
Thanks in advance
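You don't need anything special beyond an S3 client: "folders" in S3 are just key prefixes. A minimal sketch with the v1 aws-sdk-java (the bucket name and prefix are placeholders):

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectListing;
import com.amazonaws.services.s3.model.S3ObjectSummary;

public class ListFolder {
    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();
        // "my-bucket" and "my-folder/" are placeholders; a folder is just a key prefix
        ObjectListing listing = s3.listObjects("my-bucket", "my-folder/");
        for (S3ObjectSummary summary : listing.getObjectSummaries()) {
            System.out.println(summary.getKey() + " (" + summary.getSize() + " bytes)");
        }
    }
}
```

Individual files can then be read with `s3.getObject(bucket, key)`; the EC2 instance should have an IAM role granting it access to the bucket.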
After creating an instance in the Amazon cloud using the web service API in Java, I need to transfer an executable or WAR file programmatically from my local machine to the newly created instance, and then execute it. I tried and found that there is something called CreateBucket in the EC2 API; using that I can upload the file, and with PutObjectRequest I can transfer the reference to a remote computer in Amazon. Is this possible? If it is wrong, please suggest the correct way to transfer a file from my local machine to the Amazon EC2 instance.
The basic suggestion is that you shouldn't transfer the file(s) with CreateBucket, which is actually an S3 API. Using scp may be a better solution.
Amazon S3, which you are trying to use with CreateBucket, is a data storage service mainly for flexible, public (with authentication) file sharing. You can use the REST or SOAP APIs to access the data, but you cannot read/write it from EC2 instances as if it were on a local hard disk.
To access the file system on EC2 instances, it really depends on the operating system (on EC2). If it's running Linux, scp is a mature choice. You can use Java to invoke scp directly if you are working from Linux locally, or pscp if you are working from Windows. If the EC2 instance is running Windows, one option is to host an SSH/SFTP environment with FreeSSHD and then proceed as with Linux. Another option is to use a shared folder and a regular file copy.
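Invoking scp from Java can be done with ProcessBuilder. A sketch, assuming an scp binary on the PATH and an SSH key pair for the instance (the key file, host, and remote path below are placeholders):

```java
import java.io.IOException;
import java.util.Arrays;
import java.util.List;

public class ScpUpload {
    // Builds the scp command line; key file, user, host, and paths are placeholders.
    public static List<String> buildCommand(String keyFile, String localPath,
                                            String user, String host, String remotePath) {
        return Arrays.asList("scp", "-i", keyFile, localPath,
                             user + "@" + host + ":" + remotePath);
    }

    // Runs scp, forwarding its output to the console, and returns the exit code.
    public static int run(List<String> command) throws IOException, InterruptedException {
        return new ProcessBuilder(command).inheritIO().start().waitFor();
    }
}
```

For example, `ScpUpload.run(ScpUpload.buildCommand("key.pem", "app.war", "ec2-user", "<instance-dns>", "/home/ec2-user/"))` copies the WAR up; executing it afterwards could use a second ssh invocation in the same style.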