AWS File Upload to S3 with Java

I am trying to upload files into my S3 bucket using AWS Lambda in Java and I'm having some issues.
I am using APIGatewayProxyRequestEvent in my AWS Lambda function to receive the file upload from Postman.
The request.getBody() method of this event gives me a String representation of the image file, whereas S3.putObject takes an InputStream of the file to be uploaded.
How can I feed request.getBody() into the S3.putObject() method in my Lambda code to make the file upload work?

1) Create a File and write request.getBody() into it using a FileWriter.
2) Build a PutObjectRequest that points at the file created in step 1.
3) Call s3Client.putObject(putObjectRequest) to upload the object to S3 (see the sketch below).
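A minimal sketch of those three steps, assuming the AWS SDK for Java v1 (com.amazonaws); the bucket name, key, and handler wiring are placeholders, and a plain-text body is assumed. If API Gateway delivers the body Base64-encoded (request.getIsBase64Encoded() returns true), decode it to a byte[] and upload that instead of writing it with a FileWriter, since a FileWriter only preserves text content faithfully.

```java
import com.amazonaws.services.lambda.runtime.events.APIGatewayProxyRequestEvent;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.PutObjectRequest;

import java.io.File;
import java.io.FileWriter;
import java.io.IOException;

public class UploadHandler {

    private final AmazonS3 s3Client = AmazonS3ClientBuilder.defaultClient();

    public void handle(APIGatewayProxyRequestEvent request) throws IOException {
        // Step 1: write the request body into a temporary file.
        File tempFile = File.createTempFile("upload-", ".tmp");
        try (FileWriter writer = new FileWriter(tempFile)) {
            writer.write(request.getBody());
        }

        // Step 2: wrap the file in a PutObjectRequest (bucket and key are placeholders).
        PutObjectRequest putRequest =
                new PutObjectRequest("my-bucket", "uploads/my-file", tempFile);

        // Step 3: upload the object to S3.
        s3Client.putObject(putRequest);
    }
}
```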

Related

Row-by-row writing into a file on Amazon S3

I need to write into a file on Amazon S3.
When event A occurs, I need to get the object and write it to the end of the CSV file on S3. When event B occurs, I need to get the object and write it to the end of the same CSV file on S3.
How can I do it?
P.S. Java
I tried to use an OutputStream, but there isn't one in the S3 library "software.amazon.awssdk.services.s3".

Uploading a video file to Amazon S3 from a URL without having to download the file in Java

I'm trying to upload a file from a Zoom recording to Amazon S3. I have the download URL, but I don't want to download the file on the server before uploading it to Amazon S3, to avoid using up its resources. Is there a way to do that?
You can use the AmazonS3 putObject method that takes an input stream to upload an object:
"Uploads the specified input stream and object metadata to Amazon S3 under the specified bucket and key name."
So basically you can:
download from Zoom as a stream
pass this stream as the parameter to AmazonS3
In this case you don't need to store the file locally, because the stream is redirected to AWS, as sketched below.
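A minimal sketch of that approach, assuming the AWS SDK for Java v1; the URL, bucket, and key are placeholders. Setting the content length on the metadata lets the SDK stream the upload instead of buffering the whole body in memory.

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectMetadata;

import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;

public class StreamUpload {

    public static void main(String[] args) throws Exception {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // Open the Zoom download URL as a stream (the URL is a placeholder).
        URLConnection connection =
                new URL("https://example.com/recording.mp4").openConnection();

        ObjectMetadata metadata = new ObjectMetadata();
        // Content length lets the SDK stream directly instead of buffering in memory.
        metadata.setContentLength(connection.getContentLengthLong());
        metadata.setContentType(connection.getContentType());

        try (InputStream in = connection.getInputStream()) {
            // Upload the stream directly to S3 (bucket and key are placeholders).
            s3.putObject("my-bucket", "recordings/recording.mp4", in, metadata);
        }
    }
}
```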

How can I use Java to upload a folder to AWS S3?

I checked the documentation and the code examples. However, I only found a way to upload a single file. If I set the file path to a folder, the program throws an exception:
Exception in thread "main" com.amazonaws.SdkClientException: Unable to calculate MD5 hash: /path/to/folder (Is a directory)
I noticed that C# code example has a way to upload folder, but Java doesn't. Does it mean Java cannot upload folder to AWS S3?
The Amazon S3 API only supports uploading one object per API call. There is no API call to upload a folder.
Your code would need to loop through each file in the folder and upload them individually, as in the sketch below.
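A minimal sketch of that loop, assuming the AWS SDK for Java v1; the folder path, bucket name, and key prefix are placeholders, and subdirectories are skipped.

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

import java.io.File;

public class FolderUpload {

    public static void main(String[] args) {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        File folder = new File("/path/to/folder");   // placeholder path
        File[] files = folder.listFiles();
        if (files == null) {
            throw new IllegalArgumentException(folder + " is not a readable directory");
        }

        for (File file : files) {
            if (file.isFile()) {
                // One putObject call per file; the file name becomes part of the object key.
                s3.putObject("my-bucket", "my-prefix/" + file.getName(), file);
            }
        }
    }
}
```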

Play Framework file upload in memory to S3

I am writing a web server with Play Framework 2.6 in Java. I want to upload a file to the web server through a multipart form, do some validations, and then upload the file to S3. The default implementation in Play saves the upload to a temporary file on the file system, but I do not want to do that; I want to upload the file straight to AWS S3.
I looked into this tutorial, which explains how to save the file permanently in the file system instead of using a temporary file. To my knowledge I have to make a custom Accumulator or a Sink that saves the incoming ByteString(s) to a byte array, but I cannot find how to do so. Can someone point me in the correct direction?
Thanks
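One possible sketch of the Sink the question describes, assuming Akka Streams' Java DSL (which Play 2.6 ships with); it folds the incoming ByteString chunks into a single in-memory ByteString whose bytes can then be validated and handed to the S3 client.

```java
import akka.stream.javadsl.Sink;
import akka.util.ByteString;

import java.util.concurrent.CompletionStage;

public class InMemoryBodyCollector {

    // Concatenates every incoming chunk onto the previous ones, so the
    // materialized CompletionStage completes with the full body held in memory.
    public static Sink<ByteString, CompletionStage<ByteString>> collectAll() {
        return Sink.fold(ByteString.emptyByteString(), ByteString::concat);
    }
}
```

From there, the completed ByteString's toArray() yields the byte[] to validate and upload; a custom Play BodyParser can wrap a sink like this (play.libs.streams.Accumulator.fromSink is the usual bridge), which is what keeps the upload off the file system.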

Can I use one S3 bucket to upload different Java Lambda functions?

Currently I am using a different S3 bucket for every function.
For example, I have 3 Java Lambda functions created in the Eclipse IDE:
RegisterUser
LoginUser
ResetPassword
I am uploading the Lambda functions through the Eclipse IDE, which uploads each function through an Amazon S3 bucket.
I created 3 Amazon S3 buckets to upload all 3 functions.
My question is: can I upload all 3 Lambda functions using one Amazon S3 bucket, or do I have to create a separate Amazon S3 bucket for each function?
You don't need to upload to a bucket. You can upload the function code via the command line as well. They only recommend not using the web interface for large Lambda functions; all other methods are OK, and the command line is a very good option.
However, if you really want to upload to a bucket first, just give each zip file that contains the function code a different filename and you're good.
