Use AWS Java SDK to read text file from S3 - java

I'm trying to read a text file from the AWS S3 object store (and then send it via HTTP to a client). I have an AWS CLI command that copies the file locally, but how can I do that via the SDK? I want to read the contents as a String and avoid saving to a file and then reading it back.
In CLI, I create a profile with keys (one time only):
aws configure --profile cloudian
which then prompts for values such as AWS Access Key ID [None]: and so on. And then I need to run this command to retrieve the file:
aws --profile=cloudian --endpoint-url=https://s3-abc.abcstore.abc.net s3 cp s3://abc-store/STORE1/abc2/ABC/test_08.txt test.txt

For reading an S3 object using the SDK:
AmazonS3 s3Client = AmazonS3ClientBuilder.standard().withRegion(region).build();
String s3BucketName = "your-bucket-name";
String s3Key = "your/object/key";
s3Key = URLDecoder.decode(s3Key, "UTF-8"); // decode the variable holding the key, not the literal string "s3Key"
S3Object object = s3Client.getObject(new GetObjectRequest(s3BucketName, s3Key));
S3ObjectInputStream inputStream = object.getObjectContent();
You can get the content with the above code.
And I didn't get the second part of your question: do you want to send this data somewhere?
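Since the question asked for the contents as a String without a temp file, the returned stream can be drained directly. A minimal sketch of that draining step using only the standard library (the method name readStreamAsString is mine, not an SDK API; with the v1 SDK you could equally use com.amazonaws.util.IOUtils.toString on the S3ObjectInputStream):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class S3StreamReader {

    // Drain any InputStream (e.g. an S3ObjectInputStream) into a String.
    static String readStreamAsString(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buffer = new byte[4096];
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
        }
        return out.toString(StandardCharsets.UTF_8.name());
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for object.getObjectContent() so the sketch runs anywhere.
        InputStream demo = new ByteArrayInputStream(
                "hello from S3".getBytes(StandardCharsets.UTF_8));
        System.out.println(readStreamAsString(demo)); // prints "hello from S3"
    }
}
```

The resulting String can then be written straight into the HTTP response body, which avoids ever touching the local file system.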

Related

Amazon S3 Copy between two buckets with different Authentication

I have two buckets, each with a Private ACL.
I have an authenticated link to the source:
String source = "https://bucket-name.s3.region.amazonaws.com/key?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Date=...&X-Amz-SignedHeaders=host&X-Amz-Expires=86400&X-Amz-Credential=...Signature=..."
and have been trying to use the Java SDK CopyObjectRequest to copy it into another bucket using:
AWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);
AWSCredentialsProvider provider = new AWSStaticCredentialsProvider(credentials);
AmazonS3 s3Client = AmazonS3ClientBuilder
        .standard()
        .withCredentials(provider)
        .build();
AmazonS3URI sourceURI = new AmazonS3URI(source);
CopyObjectRequest request = new CopyObjectRequest(sourceURI.getBucket(), sourceURI.getKey(), destinationBucket, destinationKey);
s3Client.copyObject(request);
However I get AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied), because the AWS credentials I've set the SDK up with do not have access to the source file.
Is there a way I can provide an authenticated source URL instead of just the bucket and key?
This isn't supported. The PUT+Copy service API, which is used by s3Client.copyObject(), uses an internal S3 mechanism to copy the object, and the source object is passed as /bucket/key -- not as a full URL. There is no API functionality that can be used for fetching an object from a URL, S3 or otherwise.
With PUT+Copy, the user making the request to S3...
must have READ access to the source object and WRITE access to the destination bucket
https://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectCOPY.html
The only alternative is download followed by upload.
Doing this from EC2... or a Lambda function running in the source region would be the most cost-effective, but if the object is larger than the Lambda temp space, you'll have to write hooks and handlers to read from the stream and juggle the chunks into a multipart upload... not impossible, but requires some mental gyrations in order to understand what you're actually trying to persuade your code to do.
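The chunk-juggling described above can be sketched without any AWS calls: the core of a streamed multipart upload is slicing the source stream into fixed-size parts (S3 requires every part except the last to be at least 5 MB; the part size is shrunk here for illustration, and splitIntoParts is a name I made up, not an SDK method):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class StreamChunker {

    // Read the stream into parts of at most partSize bytes. In a real
    // multipart upload, each part would become one UploadPartRequest
    // instead of being buffered in a list.
    static List<byte[]> splitIntoParts(InputStream in, int partSize) throws IOException {
        List<byte[]> parts = new ArrayList<>();
        byte[] buffer = new byte[partSize];
        int filled = 0;
        int n;
        while ((n = in.read(buffer, filled, partSize - filled)) != -1) {
            filled += n;
            if (filled == partSize) {
                parts.add(buffer.clone());
                filled = 0;
            }
        }
        if (filled > 0) {
            parts.add(Arrays.copyOf(buffer, filled)); // final, shorter part
        }
        return parts;
    }

    public static void main(String[] args) throws IOException {
        List<byte[]> parts = splitIntoParts(new ByteArrayInputStream(new byte[10]), 4);
        System.out.println(parts.size()); // 10 bytes in 4-byte parts -> 3 parts
    }
}
```

The point of the inner loop is that read() may return fewer bytes than requested, so each part must be topped up until it is full before being shipped off; that is the part of the bookkeeping that usually trips people up.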

How to get an excel file from AWS S3 bucket into a MultipartFile in Java

I've been trying to extract an .xlsx file from an AWS bucket I created and store it in a MultipartFile variable. I've tried many different approaches, but at best I get weird characters. I'm not finding much documentation on how to do this.
Thanks!
// you may need to initialize this differently to get the correct authorization
final AmazonS3Client s3Client = AmazonS3ClientBuilder.defaultClient();
final S3Object object = s3Client.getObject("myBucket", "fileToDownload.xlsx");
// with Java 7 NIO
final Path filePath = Paths.get("localFile.xlsx");
Files.copy(object.getObjectContent(), filePath);
final File localFile = filePath.toFile();
// or Apache Commons IO
final File localFile = new File("localFile.xlsx");
FileUtils.copyToFile(object.getObjectContent(), localFile);
I'm not 100% sure what you mean by "MultipartFile" - that's usually in the context of a file that's been sent to your HTTP web service via a multipart POST or PUT. The file you're getting from S3 is technically part of the response to an HTTP GET request, but the Amazon Java Library abstracts this away for you, and just gives you the results as an InputStream.
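The "weird characters" are almost always the result of decoding the binary .xlsx bytes as text somewhere along the way. The safe pattern is to keep the payload as raw bytes end to end and only then hand it to whatever MultipartFile wrapper you use (Spring's MockMultipartFile, for instance, accepts a byte[]). A stdlib-only illustration of why a text round-trip corrupts binary data, assuming nothing beyond java.io:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.Arrays;

public class BinarySafety {

    // Drain a stream into raw bytes with no charset decoding involved.
    static byte[] toBytes(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // First bytes of a real .xlsx file (it is a ZIP container),
        // plus bytes that are not valid UTF-8.
        byte[] xlsxLike = {0x50, 0x4B, 0x03, 0x04, (byte) 0x85, (byte) 0xFF};

        byte[] viaBytes = toBytes(new ByteArrayInputStream(xlsxLike));
        System.out.println(Arrays.equals(xlsxLike, viaBytes)); // true: lossless

        // Decoding to String and re-encoding replaces the invalid
        // sequences with U+FFFD, mangling the file.
        byte[] viaString = new String(xlsxLike, StandardCharsets.UTF_8)
                .getBytes(StandardCharsets.UTF_8);
        System.out.println(Arrays.equals(xlsxLike, viaString)); // false: corrupted
    }
}
```

So: get the S3 InputStream, copy it as bytes (as in the file-based snippets above), and never pass an .xlsx through a String.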

Reading Encrypted Data from S3

I have to download data from an S3 bucket. The data is encrypted, and I have the KMS key to decrypt it. The code runs on an EC2 instance, and the instance has an IAM role that allows reading from S3.
I have seen sample code in this link, but I am not able to read the contents. I am getting the following exception:
Exception in thread "main" com.amazonaws.SdkClientException: Unable to load credentials into profile [default]: AWS Access Key ID is not specified.
at com.amazonaws.auth.profile.internal.ProfileStaticCredentialsProvider.fromStaticCredentials(ProfileStaticCredentialsProvider.java:55)
at com.amazonaws.auth.profile.internal.ProfileStaticCredentialsProvider.<init>(ProfileStaticCredentialsProvider.java:40)
at com.amazonaws.auth.profile.ProfilesConfigFile.fromProfile(ProfilesConfigFile.java:207)
at com.amazonaws.auth.profile.ProfilesConfigFile.getCredentials(ProfilesConfigFile.java:160)
Can somebody suggest where I am going wrong, or give some guidelines on how to read encrypted data from S3 buckets without embedding credentials?
I was able to find a solution by providing an InstanceProfileCredentialsProvider. Below is the code.
String kms_key = Constants.KMS_key;
String inputString = null;
KMSEncryptionMaterialsProvider materialProvider = new KMSEncryptionMaterialsProvider(kms_key);
AmazonS3EncryptionClient client = new AmazonS3EncryptionClient(
        InstanceProfileCredentialsProvider.getInstance(), materialProvider);
S3Object downloadedObject = client.getObject(bucketName, filePath);
if (downloadedObject != null) {
    inputString = convertToString(downloadedObject.getObjectContent());
}
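convertToString is not shown in the answer; a plausible implementation (my guess at the helper, not the author's actual code) simply reads the decrypted stream line by line:

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class StreamToString {

    // One way convertToString could be written: collect the stream's
    // lines into a single newline-joined String.
    static String convertToString(InputStream in) throws IOException {
        StringBuilder sb = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                if (sb.length() > 0) {
                    sb.append('\n');
                }
                sb.append(line);
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        InputStream demo = new ByteArrayInputStream(
                "line one\nline two".getBytes(StandardCharsets.UTF_8));
        System.out.println(convertToString(demo));
    }
}
```

Note the try-with-resources: closing the reader also closes the underlying S3 stream, which releases the HTTP connection back to the client's pool.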

Is it secure to transfer data between S3 and my data center using the AWS S3 SDK?

I want to use the AWS SDK for Java to upload/download files between a private S3 bucket and our data center.
I am planning to use following code
AmazonS3 s3Client = new AmazonS3Client(credentials);
s3Client.putObject(new PutObjectRequest(bucket, key, fileToUpload));
S3Object s3object = s3Client.getObject(new GetObjectRequest(bucket, key));
Will the data be traveling between S3 and our Datacenter in plain text or will it use some form of secure transport like SSL?
SSL is the default for communication. While plain HTTP is available, you would have to specifically opt in to it. For example, from this page, you would have to do something like:
AmazonS3 s3Client = new AmazonS3Client(credentials);
s3Client.setEndpoint("http://s3-us-west-1.amazonaws.com");
with the endpoints taken from this page.
Since you're using the defaults, your communication is via SSL/HTTPS.

Store Blob in Heroku (or similar cloud services)

I want to deploy an app on Heroku to try their new Play! Framework support. From what I've read on the site (I gotta confess I did not try it yet), they don't provide a persistent file system. This means that (probably) Blob fields used in Play to store files won't work properly.
Could somebody:
Confirm if you can use the Play Blob in Heroku?
Provide the "best" alternative to store files in Heroku? Is it better to store them in the database (they use PostgreSQL) or somewhere else?
I put an example of how to do this with Amazon S3 on github:
https://github.com/jamesward/plays3upload
Basically you just need to send the file to S3 and save the key in the entity:
AWSCredentials awsCredentials = new BasicAWSCredentials(System.getenv("AWS_ACCESS_KEY"), System.getenv("AWS_SECRET_KEY"));
AmazonS3 s3Client = new AmazonS3Client(awsCredentials);
s3Client.createBucket(BUCKET_NAME);
String s3Key = UUID.randomUUID().toString();
s3Client.putObject(BUCKET_NAME, s3Key, attachment);
Document doc = new Document(comment, s3Key, attachment.getName());
doc.save();
listUploads();