Java to download files from s3 to ec2 via internal communication - java

Hi, I'm trying to fetch files from AWS S3 to EC2 so I can zip them, and then I want to upload the zip back to S3,
all via AWS internal communication.
To achieve this I have set up a VPC, and both S3 and EC2 are in the same region.
I'm able to fetch files from S3 to EC2 with the AWS CLI, but I don't know how to achieve the same in Java.
I need help with this.
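For what it's worth, a minimal sketch of that round trip with the AWS SDK for Java 2.x might look like the following; the bucket name, object keys and region are placeholders. On EC2 the SDK's default credentials chain picks up the instance profile automatically, and if the VPC has an S3 gateway endpoint in its route table, these same calls stay on the AWS network with no code change.

```java
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class S3ZipExample {
    public static void main(String[] args) throws IOException {
        String bucket = "my-bucket";   // placeholder bucket name
        String key = "input/data.txt"; // placeholder object key

        // No credentials in code: on EC2 the default chain uses the instance profile.
        try (S3Client s3 = S3Client.builder().region(Region.US_EAST_1).build()) {
            Path zipPath = Files.createTempFile("bundle", ".zip");

            // Download the object and stream it into a zip archive on the instance.
            try (ZipOutputStream zos = new ZipOutputStream(Files.newOutputStream(zipPath));
                 InputStream in = s3.getObject(GetObjectRequest.builder()
                         .bucket(bucket).key(key).build())) {
                zos.putNextEntry(new ZipEntry(key));
                in.transferTo(zos);
                zos.closeEntry();
            }

            // Upload the zip back to the bucket.
            s3.putObject(PutObjectRequest.builder()
                            .bucket(bucket).key("output/bundle.zip").build(),
                    RequestBody.fromFile(zipPath));
        }
    }
}
```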

Related

Upload file to AWS for ingression to Postgres DB

I need to upload a file using a web form to AWS and then trigger a function to import it into a Postgres DB. I have the file import into a DB working locally in Java, but need it to work in the cloud.
The file upload, along with some settings (such as which table to import into), needs to be passed to a Java function which imports it into the Postgres DB.
I can upload files to an EC2 instance with PHP, but then need to trigger a Lambda function on that file. My research suggests S3 buckets are perhaps a better solution? Looking for some pointers to which services could be best suited.
There are two main steps in your scenario:
Step 1: Upload a file to Amazon S3
It is simple to create an HTML form that uploads data directly to an Amazon S3 bucket.
However, it is typically unwise to allow anyone on the Internet to use the form, since they might upload any number and type of files. Typically, you will want your back-end to confirm that the user is entitled to upload the file. Your back-end can then generate a presigned URL (see Upload objects using presigned URLs - Amazon Simple Storage Service), which authorizes the user to perform the upload; a sketch of this follows the links below.
For some examples in various coding languages, see:
Direct uploads to AWS S3 from the browser (crazy performance boost)
File Uploads Directly to S3 From the Browser
Amazon S3 direct file upload from client browser - private key disclosure
Uploading to Amazon S3 directly from a web or mobile application | AWS Compute Blog
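A minimal sketch of generating such a presigned upload URL, assuming the AWS SDK for Java 2.x; the bucket, key and expiry time are placeholders. The browser then performs a plain HTTP PUT of the file to the returned URL.

```java
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;
import software.amazon.awssdk.services.s3.presigner.S3Presigner;
import software.amazon.awssdk.services.s3.presigner.model.PresignedPutObjectRequest;
import software.amazon.awssdk.services.s3.presigner.model.PutObjectPresignRequest;

import java.time.Duration;

public class PresignUploadExample {
    public static void main(String[] args) {
        try (S3Presigner presigner = S3Presigner.builder().region(Region.US_EAST_1).build()) {
            PutObjectRequest put = PutObjectRequest.builder()
                    .bucket("my-bucket")        // placeholder bucket
                    .key("uploads/report.csv")  // placeholder key
                    .build();

            PresignedPutObjectRequest presigned = presigner.presignPutObject(
                    PutObjectPresignRequest.builder()
                            .signatureDuration(Duration.ofMinutes(10)) // URL validity window
                            .putObjectRequest(put)
                            .build());

            // Hand this URL to the browser; an HTTP PUT to it performs the upload.
            System.out.println(presigned.url());
        }
    }
}
```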
Step 2: Load the data into the database
When the object is created in the Amazon S3 bucket, you can configure S3 to trigger an AWS Lambda function, which can be written in the programming language of your choice.
The Bucket and Filename (Key) of the object will be passed into the Lambda function via the event parameter. The Lambda function can then:
Read the object from S3
Connect to the database
Insert the data into the desired table
It is your job to code this functionality but you will find many examples on the Internet.
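A minimal sketch of such a Lambda function in Java, assuming the aws-lambda-java-events library and a PostgreSQL JDBC driver on the classpath; the table, column and environment variable names are placeholders.

```java
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class S3ToPostgresHandler implements RequestHandler<S3Event, String> {
    private final S3Client s3 = S3Client.create();

    @Override
    public String handleRequest(S3Event event, Context context) {
        // The Bucket and Key arrive in the S3 event notification.
        String bucket = event.getRecords().get(0).getS3().getBucket().getName();
        String key = event.getRecords().get(0).getS3().getObject().getKey();

        // Connection details via environment variables; table/column are placeholders.
        String jdbcUrl = System.getenv("DB_URL"); // e.g. jdbc:postgresql://host:5432/mydb
        String sql = "INSERT INTO my_table (line) VALUES (?)";

        try (BufferedReader reader = new BufferedReader(new InputStreamReader(
                     s3.getObject(GetObjectRequest.builder().bucket(bucket).key(key).build()),
                     StandardCharsets.UTF_8));
             Connection conn = DriverManager.getConnection(
                     jdbcUrl, System.getenv("DB_USER"), System.getenv("DB_PASSWORD"));
             PreparedStatement ps = conn.prepareStatement(sql)) {

            // Read the object line by line and batch-insert into the table.
            String line;
            while ((line = reader.readLine()) != null) {
                ps.setString(1, line);
                ps.addBatch();
            }
            ps.executeBatch();
        } catch (Exception e) {
            throw new RuntimeException("Import failed for s3://" + bucket + "/" + key, e);
        }
        return "ok";
    }
}
```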
You can use the AWS SDK in the language of your choice to invoke Lambda; a sketch follows.
Please refer to this documentation.
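For example, invoking the function from Java with the AWS SDK 2.x might look like this; the function name and payload are placeholders.

```java
import software.amazon.awssdk.core.SdkBytes;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.lambda.LambdaClient;
import software.amazon.awssdk.services.lambda.model.InvokeRequest;
import software.amazon.awssdk.services.lambda.model.InvokeResponse;

public class InvokeLambdaExample {
    public static void main(String[] args) {
        try (LambdaClient lambda = LambdaClient.builder().region(Region.US_EAST_1).build()) {
            InvokeResponse response = lambda.invoke(InvokeRequest.builder()
                    .functionName("import-to-postgres") // placeholder function name
                    .payload(SdkBytes.fromUtf8String("{\"table\":\"my_table\"}")) // placeholder payload
                    .build());
            System.out.println(response.payload().asUtf8String());
        }
    }
}
```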

How to access AWS S3 bucket and object without credentials

I have an application written with aws-sdk-java that reads content from an S3 bucket. I am using an IAM role to connect to S3 from EC2, and user profile credentials from my local environment to connect to S3. Both cases work fine. But I do not want to use user credentials (key ID and secret access key) locally (Eclipse IDE). I am exploring different options, including an SSH tunnel to S3 through an EC2 instance. I have seen many examples of an SSH tunnel to an RDS instance but can't find one for S3. Is there a better way to access the S3 bucket without using credentials locally, forwarding the requests through an EC2 instance so that S3 can be accessed from the local machine? Thanks in advance
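As background to the two setups described above: the SDK's default credentials provider chain is what lets identical code use profile credentials locally and the IAM role on EC2. A minimal sketch with the AWS SDK for Java 2.x, with the region as a placeholder:

```java
import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;

public class CredentialChainExample {
    public static void main(String[] args) {
        // The default chain checks environment variables, system properties, the
        // shared credentials/profile file, and finally the EC2 instance profile
        // (IAM role), so the same code runs unchanged on EC2 with no keys on disk.
        try (S3Client s3 = S3Client.builder()
                .region(Region.US_EAST_1)
                .credentialsProvider(DefaultCredentialsProvider.create())
                .build()) {
            s3.listBuckets().buckets().forEach(b -> System.out.println(b.name()));
        }
    }
}
```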

How to access files from S3 storage using Java

I have created my VM in AWS using an EC2 instance. I have also created a bucket in S3 and some folders in that bucket. I have my Java JAR file in FileZilla and now I want to run my application. This scenario works fine when the files I want to access are also in FileZilla, but now that I have created an S3 bucket I do not know how my Java application can access those folders. Do I need to create a new Java class in my application in order to access these folders in S3?
I am not very familiar with AWS and would appreciate any help from you guys. Even some useful links which solve my problem would be a great help for me.
Thanks in advance
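For what it's worth, a minimal sketch of reading those S3 "folders" (which are really just key prefixes) with the AWS SDK for Java 2.x; the bucket and prefix names are placeholders.

```java
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.ListObjectsV2Request;

public class ListBucketFolder {
    public static void main(String[] args) {
        String bucket = "my-bucket";  // placeholder bucket name
        String prefix = "my-folder/"; // placeholder "folder" (key prefix)

        try (S3Client s3 = S3Client.create()) {
            // List every object under the prefix, paging automatically.
            s3.listObjectsV2Paginator(ListObjectsV2Request.builder()
                            .bucket(bucket).prefix(prefix).build())
              .contents()
              .forEach(obj -> System.out.println(obj.key() + " (" + obj.size() + " bytes)"));
        }
    }
}
```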

How to put object to S3 via CloudFront

I'd like to upload an image to S3 via CloudFront.
If you look at the CloudFront documentation, you will find that CloudFront supports the PUT method for uploads.
Someone might ask why I want to use CloudFront for uploading to S3; if you search around, you will find the reasons.
What I want to ask is whether there is a method in the SDK for uploading via CloudFront.
As you know, there is a putObject method for uploading directly to S3, but I can't find one for uploading via CloudFront.
Please help.
Data can be sent through Amazon CloudFront to the back-end "origin". This is commonly used to POST web-form data back to web servers, and it can also be used to POST data to Amazon S3.
If you would rather use an SDK to upload data to Amazon S3, there is no benefit in sending it "via CloudFront". Instead, use the Amazon S3 APIs to upload the data directly to S3.
So, bottom line:
If you're uploading from a web page that was initially served via CloudFront, send it through CloudFront to S3
If you're calling an API, call S3 directly
If the bucket's region is far from the uploading computer, you can upload faster by enabling S3 Transfer Acceleration, which uploads through the Amazon edge location closest to you and then forwards the file to the bucket's actual region over an optimized route.
Have a look here.
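A minimal sketch of an accelerated upload, assuming the AWS SDK for Java 2.x; the bucket, key, file path and region are placeholders, and Transfer Acceleration must already be enabled on the bucket itself.

```java
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.S3Configuration;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

import java.nio.file.Paths;

public class AcceleratedUpload {
    public static void main(String[] args) {
        try (S3Client s3 = S3Client.builder()
                .region(Region.AP_NORTHEAST_1) // placeholder: the bucket's region
                .serviceConfiguration(S3Configuration.builder()
                        .accelerateModeEnabled(true) // route requests via the nearest edge
                        .build())
                .build()) {
            s3.putObject(PutObjectRequest.builder()
                            .bucket("my-bucket").key("images/photo.jpg").build(),
                    RequestBody.fromFile(Paths.get("photo.jpg")));
        }
    }
}
```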

File transfer to a remote machine in amazon ec2

After creating an instance in the Amazon cloud using the web service API in Java, I need to transfer an executable file (or WAR file) programmatically from my local machine to the newly created instance, and then execute it. I found that there is something called CreateBucket in the EC2 API, that I can upload the file with it, and that I can pass a reference to a remote computer in Amazon using PutObjectRequest. Is this possible? If this approach is wrong, please suggest the correct way to transfer a file from my local machine to the Amazon EC2 instance.
The basic suggestion is: you shouldn't transfer the file(s) with CreateBucket, which is actually an S3 API. Using scp may be a better solution.
Amazon S3, which you are trying to use with CreateBucket, is a data storage service mainly for flexible, public (with authentication) file sharing. You can use REST or SOAP APIs to access the data, but you cannot really read/write it in EC2 instances as if it were on a local hard disk.
Access to the file system of an EC2 instance really depends on its operating system. If it's running Linux, scp is a mature choice; you can invoke scp directly from Java if you are on Linux locally, or pscp if you are on Windows. If the EC2 instance is running Windows, one option is to host an SSH/SFTP environment with FreeSSHD and then proceed as on Linux. Another option is to use a shared folder and a regular file copy.
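A minimal sketch of invoking scp from Java via ProcessBuilder; the key file, local file, user, host and remote path are all placeholders.

```java
import java.io.IOException;

public class ScpUpload {
    public static void main(String[] args) throws IOException, InterruptedException {
        ProcessBuilder pb = new ProcessBuilder(
                "scp",
                "-i", "/path/to/key.pem", // the EC2 key pair's private key
                "target/app.war",          // local file to transfer
                "ec2-user@ec2-host.example.com:/home/ec2-user/"); // placeholder user@host:dir
        pb.inheritIO(); // show scp's own output in this console

        int exit = pb.start().waitFor();
        System.out.println(exit == 0 ? "Upload succeeded" : "scp exited with " + exit);
    }
}
```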
