Upload file to S3 - Java

I'm trying to develop a file upload function on AWS.
I wrote a servlet to process the POST request; its skeleton is shown below:
protected void doPost(HttpServletRequest request, HttpServletResponse response)
        throws ServletException, IOException {
    boolean isMulti = ServletFileUpload.isMultipartContent(request);
    if (isMulti) {
        ServletFileUpload upload = new ServletFileUpload();
        try {
            FileItemIterator iter = upload.getItemIterator(request);
            while (iter.hasNext()) {
                FileItemStream item = iter.next();
                InputStream inputStream = item.openStream();
                if (item.isFormField()) {
                } else {
                    String fileName = item.getName();
                    if (fileName != null && fileName.length() > 0) {
                        // read stream of file uploaded
                        // store as a temporary file
                        // upload the file to S3
                    }
                }
            }
        } catch (Exception e) {
        }
    }
    response.sendRedirect("location of the result page");
}
I think the classes below should be used to upload the file. I tried, but in S3 the size of the file is always 0 bytes. Are there any other ways to upload a file in multiple parts?
InitiateMultipartUploadRequest
InitiateMultipartUploadResult
UploadPartRequest
CompleteMultipartUploadRequest
I referred to the code at http://docs.aws.amazon.com/AmazonS3/latest/dev/llJavaUploadFile.html

I guess you are conflating two things: one is getting the entire file to your app container (the Tomcat where your servlet is running), and the other is uploading the file to S3.
If you are able to achieve the first of those tasks (you should end up with the file as a java.io.File object), the second task is straightforward:
AmazonS3 s3 = new AmazonS3Client(awsCredentials);
s3.putObject(new PutObjectRequest(bucketName, key, fileObj));
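For the first task, here is a minimal sketch that drains the already-opened inputStream from the else-branch of your servlet into a temporary file. If the stream is never consumed, one way or another, you can end up with a 0-byte object in S3:

// Inside the else-branch of the servlet above.
// Requires java.nio.file.Files and java.nio.file.StandardCopyOption.
File tempFile = File.createTempFile("upload-", ".tmp");
Files.copy(inputStream, tempFile.toPath(), StandardCopyOption.REPLACE_EXISTING);
// tempFile can now be passed to s3.putObject(...) as shown above.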
That putObject call is S3's simple upload. An S3 multipart upload is where you divide the whole file into chunks and upload the chunks in parallel. S3 gives you two ways to do a multipart upload:
Low level - where you take care of chunking and uploading each part yourself. This uses the classes you mentioned, like InitiateMultipartUploadRequest.
High level - where S3 abstracts the chunking/uploading away behind the TransferManager class; see the sketch below.
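A minimal sketch of the high-level route, assuming the v1 AWS SDK for Java and the same awsCredentials/bucketName/key/fileObj placeholders as above:

import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.Upload;

TransferManager tm = new TransferManager(awsCredentials);
try {
    // TransferManager picks single PUT vs. multipart based on file size
    // and uploads the parts in parallel.
    Upload upload = tm.upload(bucketName, key, fileObj);
    upload.waitForCompletion(); // blocks until the transfer finishes
} catch (InterruptedException e) {
    Thread.currentThread().interrupt();
} finally {
    tm.shutdownNow(); // releases the underlying thread pool
}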

Related

Downloading Thymeleaf template from S3

I am currently storing my Thymeleaf templates in S3 and downloading them.
I am using the following function to retrieve a template from S3:
public String getTemplateFile(String name, File localFile) {
    ObjectMetadata object = s3Client.getObject(
            new GetObjectRequest(connectionProperties.getBucket(), name), localFile);
    boolean success = localFile.exists() && localFile.canRead();
    return localFile.getPath();
}
After doing this the file is successfully downloaded to the desired location.
But when I try to access the file from the Flying Saucer PDF generator, the file doesn't exist, even though it has already been downloaded to FILE_LOCATION_PATH. (I can open the file... the file is there, but the function doesn't see it.)
String xHtmlStringDocument =
        convertHtmlToXhtml(templateEngine
                .process(FILE_LOCATION_PATH,
                        initializeLetterHtmlTemplateContext(letter)));
When I run the program again and again I get the same result. But when I STOP the program and RUN it AGAIN, everything works, because the file from the last execution is now recognized by the program.
This sounds to me like an asynchronous function issue.
Does anybody know how I can fix this?
Thanks in advance.
EDITED (following suggestion)
New function, same result:
(And the file was created; the download from S3 was successful.)
java.io.FileNotFoundException: ClassLoader resource "static/templates/template.html" could not be resolved
public String getTemplateFileN(String name, File localFile) throws IOException {
    S3Object fullObject = null;
    InputStream in = null;
    try {
        fullObject = s3Client.getObject(new GetObjectRequest(connectionProperties.getBucket(), name));
        System.out.println("Content-Type: " + fullObject.getObjectMetadata().getContentType());
        System.out.println("Content: ");
        displayTextInputStream(fullObject.getObjectContent());
        in = fullObject.getObjectContent();
        System.out.println(localFile.toPath());
        Files.copy(in, localFile.toPath());
    } // then later
    finally {
        // To ensure that the network connection doesn't remain open, close any open input streams.
        if (fullObject != null) {
            fullObject.close();
        }
        if (in != null) {
            in.close();
        }
    }
    return localFile.getPath();
}
Checking the javadoc
https://docs.aws.amazon.com/AWSJavaSDK/latest/javadoc/com/amazonaws/services/s3/AmazonS3Client.html#getObject-com.amazonaws.services.s3.model.GetObjectRequest-java.io.File-
I see no method signature ObjectMetadata getObject(GetObjectRequest getObjectRequest, String file). There is
ObjectMetadata getObject(GetObjectRequest getObjectRequest,
        File destinationFile)
where you provide a File (not a String) as the second argument. Make sure the file is not open for writing before you try reading it!
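For reference, a minimal sketch of that overload, using the same s3Client, bucket, and name as in the question (the local path is illustrative):

// Downloads the object straight into localFile and returns its metadata;
// with this overload the SDK manages and closes the stream itself.
File localFile = new File("/tmp/template.html"); // illustrative path
ObjectMetadata meta = s3Client.getObject(
        new GetObjectRequest(connectionProperties.getBucket(), name), localFile);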

Uploading objects to a Google Cloud Storage bucket in Java

I want to upload a simple image file to Google Cloud Storage.
When the upload goes to the root of the bucket it happens smoothly, but when I try to upload the image to a folder within the bucket it fails.
The following is my code:
static Storage sStorage;

public static void uploadFileToServer(Context context, String filePath) {
    Storage storage = getStorage(context);
    StorageObject object = new StorageObject();
    object.setBucket(BUCKET_NAME);
    File sdCard = Environment.getExternalStorageDirectory();
    File file = new File(sdCard + filePath);
    try {
        InputStream stream = new FileInputStream(file);
        String contentType = URLConnection.guessContentTypeFromStream(stream);
        InputStreamContent content = new InputStreamContent(contentType, stream);
        Storage.Objects.Insert insert = storage.objects().insert(BUCKET_NAME, null, content);
        insert.setName(file.getName());
        insert.execute();
    } catch (Exception e) {
        e.printStackTrace();
    }
}
I tried putting the bucket name like [PARENT_BUCKET]/[CHILD_FOLDER], but it doesn't work.
GCS has a flat namespace; "folders" are just a client-supported abstraction (basically, treating "/" characters in object names as the folder delimiter).
So, to upload to a folder you should put just the bucket name in the bucket field of the request, and put the folder at the beginning of the object name. For example, to upload to bucket 'my-bucket', folder 'my-folder' and file-within-folder 'file', you'd set the bucket name to 'my-bucket' and the object name to 'my-folder/file'.
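Applied to the code in the question, that is a one-line change (CHILD_FOLDER is a placeholder for your folder name):

// Keep the bucket as-is; prefix the object name with the "folder" path.
Storage.Objects.Insert insert = storage.objects().insert(BUCKET_NAME, null, content);
insert.setName(CHILD_FOLDER + "/" + file.getName()); // e.g. "my-folder/image.jpg"
insert.execute();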

How to wrap a file as a Blob and commit it to the datastore in Google App Engine Java

I tried the File service to store data in Google App Engine and I uploaded it successfully, but later I noticed it is not stored as Blob values. So I googled and found the page "How do I handle multipart form data?" / "How do I handle file uploads to my app?" provided by Google.
In this document the code reads as below:
FileItemIterator iterator = upload.getItemIterator(req);
while (iterator.hasNext()) {
    FileItemStream item = iterator.next();
    InputStream stream = item.openStream();
    if (item.isFormField()) {
        log.warning("Got a form field: " + item.getFieldName());
    } else {
        log.warning("Got an uploaded file: " + item.getFieldName() + ", name = " + item.getName());
        // You now have the filename (item.getName()) and the
        // contents (which you can read from stream). Here we just
        // print them back out to the servlet output stream, but you
        // will probably want to do something more interesting (for
        // example, wrap them in a Blob and commit them to the
        // datastore).
Here I don't understand how to wrap them as a Blob and commit them to the datastore. Can anyone suggest how to solve this? Does the following way store the file as Blob values in Google App Engine?
InputStream is = item.openStream();
try {
    FileService fileService = FileServiceFactory.getFileService();
    AppEngineFile file = fileService.createNewBlobFile(mime, fileName);
    boolean lock = true;
    FileWriteChannel writeChannel = fileService.openWriteChannel(file, lock);
    byte[] buffer = new byte[BUFFER_SIZE];
    int readBytes;
    while ((readBytes = is.read(buffer)) != -1) {
        writeChannel.write(ByteBuffer.wrap(buffer, 0, readBytes));
    }
    writeChannel.closeFinally();
    String blobKey = fileService.getBlobKey(file).getKeyString();
} catch (Exception e) {
    e.printStackTrace(resp.getWriter());
}
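The code above writes into the Blobstore via the File service, which is different from storing a Blob property in the datastore. For what the docs comment literally describes, here is a minimal sketch that reads the part into memory and commits it as a datastore Blob (stream is the InputStream from the docs snippet; the entity kind and property names are illustrative, and datastore entities are capped at roughly 1 MB):

import com.google.appengine.api.datastore.Blob;
import com.google.appengine.api.datastore.DatastoreServiceFactory;
import com.google.appengine.api.datastore.Entity;
import java.io.ByteArrayOutputStream;

// Read the uploaded part fully into memory.
ByteArrayOutputStream bos = new ByteArrayOutputStream();
byte[] buf = new byte[8192];
int n;
while ((n = stream.read(buf)) != -1) {
    bos.write(buf, 0, n);
}

// Wrap the bytes in a datastore Blob and commit the entity.
Entity uploaded = new Entity("UploadedFile"); // illustrative kind name
uploaded.setProperty("fileName", item.getName());
uploaded.setProperty("content", new Blob(bos.toByteArray()));
DatastoreServiceFactory.getDatastoreService().put(uploaded);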

Uploading a file in JSP with multipart/form-data

I am trying to get the number of files to be uploaded, before uploading, using Commons FileUpload in JSP, and I wrote a function to get that count. It is shown here:
public static int getUploadFileCount(HttpServletRequest request) throws FileUploadException {
    int result = 0;
    DiskFileItemFactory factory = new DiskFileItemFactory();
    ServletFileUpload upload = new ServletFileUpload(factory);
    List<FileItem> items = upload.parseRequest(request);
    for (FileItem item : items) {
        if (!item.isFormField())
            result++;
    }
    return result;
}
And I use this function in the business logic, shown here:
public void doChangeProfilePhoto(HttpServletRequest request) throws FileUploadException {
    if (UploadFileUtil.getUploadFileCount(request) != 1)
        throw new FileUploadException("There is Multiple/None File upload for Profile Photo. ");
    ProfileImage newProfileImage = new ProfileImage();
    newProfileImage.setFileName("sam.jpg");
    if (new UploadBPOProfilePhotoImpl().uploadPhoto(newProfileImage, request)) {
    }
}
After calling uploadPhoto, the same kind of code is used again to retrieve the files one by one, like this:
DiskFileItemFactory factory = new DiskFileItemFactory();
ServletFileUpload upload = new ServletFileUpload(factory);
List<FileItem> items = upload.parseRequest(request);
// Does not get any files if this code was already used once to get the file count:
for (FileItem item : items) {
    if (!item.isFormField()) {
        // make the file and write/save it under a name
    }
}
So my problem is that this code does not get the file when I have already used the same code to count the files. If I comment out the two lines in doChangeProfilePhoto that get the file count, like this:
//if(UploadFileUtil.getUploadFileCount(request) != 1)
// throw new FileUploadException("There is Multiple/None File upload for Profile Photo. ");
then it works. Why does using this code once make it unable to retrieve the files afterwards? And is there any way to count the files using Commons FileUpload, and also to get their names?
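A likely explanation, for what it's worth: ServletFileUpload.parseRequest consumes the request's input stream, and a request body can only be read once, so a second parseRequest on the same request sees nothing. A sketch of parsing once and answering both the count and the names from the same parsed list (the placement is illustrative):

// Parse the multipart request exactly once...
List<FileItem> items = new ServletFileUpload(new DiskFileItemFactory()).parseRequest(request);

// ...then derive everything from the parsed list.
int fileCount = 0;
for (FileItem item : items) {
    if (!item.isFormField()) {
        fileCount++;
        System.out.println("Uploaded file name: " + item.getName());
    }
}
if (fileCount != 1)
    throw new FileUploadException("There is Multiple/None File upload for Profile Photo. ");
// Hand 'items' (not 'request') to the code that saves the file.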

Writing Zip Files to GAE Blobstore

I'm using the Java API for reading and writing to the Google App Engine Blobstore.
I need to zip files directly into the Blobstore, meaning I have String objects that I want stored in the Blobstore, zipped.
My problem is that the standard zipping methods write to an OutputStream, while it seems GAE doesn't provide one for writing to the Blobstore.
Is there a way to combine those APIs, or are there different APIs I can use (I haven't found any)?
If I am not wrong, you can try to use the Blobstore low-level API. It offers a Java channel (FileWriteChannel), so you could probably convert it to an OutputStream:
Channels.newOutputStream(channel)
and use that OutputStream with the java.util.zip.* classes you are currently using (here you have a related example that uses Java NIO to zip something to a Channel/OutputStream).
I have not tried it.
Here is one example that writes file content, zips it, and stores it in the Blobstore:
FileService fileService = FileServiceFactory.getFileService();
AppEngineFile file = fileService.createNewBlobFile("application/zip", "fileName.zip");
boolean lock = true;
try {
    FileWriteChannel writeChannel = fileService.openWriteChannel(file, lock);
    // convert to an OutputStream
    OutputStream blobOutputStream = Channels.newOutputStream(writeChannel);
    ZipOutputStream zip = new ZipOutputStream(blobOutputStream);
    zip.putNextEntry(new ZipEntry("fileNameTozip.txt"));
    // read the content from your file or any source you want
    final byte[] data = IOUtils.toByteArray(file1InputStream);
    // write the byte array to the zip entry
    zip.write(data);
    zip.closeEntry();
    zip.close();
    // Now finalize
    writeChannel.closeFinally();
} catch (IOException e) {
    throw new RuntimeException(" Writing file into blobStore", e);
}
The other answer uses the Blobstore API, but currently the recommended way is to use the App Engine GCS client.
Here is what I use to zip multiple files in GCS:
public static void zipFiles(final GcsFilename targetZipFile,
        Collection<GcsFilename> filesToZip) throws IOException {
    final GcsFileOptions options = new GcsFileOptions.Builder()
            .mimeType(MediaType.ZIP.toString()).build();
    try (GcsOutputChannel outputChannel = gcsService.createOrReplace(targetZipFile, options);
            OutputStream out = Channels.newOutputStream(outputChannel);
            ZipOutputStream zip = new ZipOutputStream(out)) {
        for (GcsFilename file : filesToZip) {
            try (GcsInputChannel readChannel = gcsService.openPrefetchingReadChannel(file, 0, MB);
                    InputStream is = Channels.newInputStream(readChannel)) {
                final GcsFileMetadata meta = gcsService.getMetadata(file);
                if (meta == null) {
                    log.warn("{} NOT FOUND. Skipping.", file.toString());
                    continue;
                }
                final ZipEntry entry = new ZipEntry(file.getObjectName());
                zip.putNextEntry(entry);
                ByteStreams.copy(is, zip);
                zip.closeEntry();
            }
            zip.flush();
        }
    }
}
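The snippet assumes a few members that are not shown; plausible definitions plus a call would look like this (the names and the prefetch size are illustrative):

// Assumed members for the method above (illustrative):
private static final GcsService gcsService =
        GcsServiceFactory.createGcsService(RetryParams.getDefaultInstance());
private static final int MB = 1024 * 1024; // prefetch buffer size

// Usage: zip two existing GCS objects into a new archive.
zipFiles(new GcsFilename("my-bucket", "archive.zip"),
        Arrays.asList(new GcsFilename("my-bucket", "a.txt"),
                      new GcsFilename("my-bucket", "b.txt")));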
