I have written a piece of code to upload a file to Amazon S3. It works fine on my local system: I can upload a file and I get the file URL back in the response. But when I run the application on the AWS server, the files are not uploaded, even though I still get a 200 response, and Postman just shows an empty rectangle box in the response body.
Can anyone help me solve this issue? Any help will be appreciated. Thanks!
@Service
public class AmazonClient {

    private AmazonS3 amazonS3;

    @Value("${amazonProperties.accessKey}")
    public String accessKey;
    @Value("${amazonProperties.secretKey}")
    public String secretKey;
    @Value("${amazonProperties.bucketName}")
    public String bucketName;
    @Value("${amazonProperties.endpointUrl}")
    public String endpointUrl;
    @Value("${amazonProperties.region}")
    public String region;

    @PostConstruct
    private void initializeAmazon() {
        AWSCredentials credentials = new BasicAWSCredentials(accessKey, secretKey);
        this.amazonS3 = AmazonS3ClientBuilder
                .standard()
                .withCredentials(new AWSStaticCredentialsProvider(credentials))
                .withRegion(region)
                .build();
    }

    private File convertMultiPartToFile(MultipartFile file) throws IOException {
        File convFile = new File(file.getOriginalFilename());
        FileOutputStream fos = new FileOutputStream(convFile);
        fos.write(file.getBytes());
        fos.close();
        return convFile;
    }

    private String generateFileName(MultipartFile multiPart) {
        return new Date().getTime() + "-" + multiPart.getOriginalFilename().replace(" ", "_");
    }

    private void uploadFileTos3bucket(String fileName, File file) {
        amazonS3.putObject(new PutObjectRequest(bucketName, fileName, file)
                .withCannedAcl(CannedAccessControlList.PublicRead));
    }

    public String uploadFile(MultipartFile multipartFile) {
        String fileUrl = "";
        try {
            File file = convertMultiPartToFile(multipartFile);
            String fileName = generateFileName(multipartFile);
            fileUrl = endpointUrl + "/" + bucketName + "/" + fileName;
            uploadFileTos3bucket(fileName, file);
            //file.delete();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return fileUrl;
    }

    public String deleteFileFromS3Bucket(String fileUrl) {
        String fileName = fileUrl.substring(fileUrl.lastIndexOf("/") + 1);
        amazonS3.deleteObject(new DeleteObjectRequest(bucketName + "/", fileName));
        return "Successfully deleted";
    }
}
Please ensure that the AWS IAM credentials you provide are valid and that the bucket name and region are configured correctly. If they are, then:
i. Make sure that the bucket is accessible programmatically (see the sketch after this list).
ii. Make sure that the IAM role has enough privileges to interact with the S3 bucket (e.g. the AmazonS3FullAccess policy).
These might be probable resolutions.
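For point i., one quick way to confirm programmatic access is to run a small check with the same client the service builds in @PostConstruct. This is only a minimal sketch against the AWS SDK for Java v1 used in the question; the verifyBucketAccess method name and the healthcheck.txt key are illustrative, not part of the original code.

// Minimal connectivity check using the same amazonS3 client and bucketName as above.
public void verifyBucketAccess() {
    try {
        if (!amazonS3.doesBucketExistV2(bucketName)) {
            System.err.println("Bucket not found or not visible to these credentials: " + bucketName);
            return;
        }
        // A small test write surfaces missing s3:PutObject permissions immediately.
        amazonS3.putObject(bucketName, "healthcheck.txt", "ok");
        System.out.println("Bucket is reachable and writable: " + bucketName);
    } catch (AmazonS3Exception e) {
        // A 403 here usually means the IAM policy lacks the required S3 actions.
        System.err.println("S3 returned " + e.getStatusCode() + " (" + e.getErrorCode() + "): " + e.getMessage());
    }
}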
Related
I am converting a MultipartFile to a File and uploading it to an S3 bucket.
However, when I run the test, an error occurs while converting the MultipartFile to a File:
ERROR : java.nio.file.AccessDeniedException: D:\workspace_intellij_forKiri\Kiri\server\kiri\temp\8b28a2f2-7276-4036
The exception is thrown at multipartFile.transferTo(file);
Please advise if there is anything I am missing.
The Spring Boot version is 2.7.7.
Test code
@WithAccount("creamyyyy")
@DisplayName("image save test")
@Test
public void createImageTest() throws Exception {
    //given
    String filename = "files";
    String contentType = "png";
    MockMultipartFile image1 = new MockMultipartFile(
            filename,
            filename + "." + contentType,
            "image/png",
            filename.getBytes());
    //when
    //then
    this.mockMvc.perform( //== ERROR!!!
                    MockMvcRequestBuilders
                            .multipart("/api/posts/image")
                            .file(image1)
                            .contentType(MediaType.MULTIPART_FORM_DATA)
                            .characterEncoding("UTF-8")
            )
            .andDo(print())
            .andExpect(status().isOk());
}
ImageService Code
// FileSave
public List<ImageResDto> addFile(List<MultipartFile> multipartFiles) throws IOException {
    List<ImageResDto> imageResDtoList = new ArrayList<>();
    /**
     * <ImageResDto>
     * private Long image_id;
     * private String imgUrl;
     */
    String absolutePath = new File("").getAbsolutePath() + File.separator + "temp";
    for (MultipartFile multipartFile : multipartFiles) {
        String contentType = multipartFile.getContentType();
        if (ObjectUtils.isEmpty(contentType)) {
            throw new RuntimeException("FILE TYPE NOT FOUND");
        } else if (!verifyContentType(contentType)) {
            throw new RuntimeException("FILE TYPE NOT FOUND");
        }
    }
    for (MultipartFile multipartFile : multipartFiles) {
        String filename = UUID.randomUUID() + multipartFile.getOriginalFilename();
        // save in local
        String fullFilePath = absolutePath + File.separator + filename;
        System.out.println("fullFilePath = " + fullFilePath);
        File file = new File(fullFilePath);
        if (!file.exists()) { file.mkdirs(); }
        multipartFile.transferTo(file); // ERROR ... OTL
        file.createNewFile();
        // S3 upload
        amazonS3.putObject(
                new PutObjectRequest(bucket, filename, file)
                        .withCannedAcl(CannedAccessControlList.PublicRead)
        );
        String imgUrl = amazonS3.getUrl(bucket, filename).toString();
        Image newImage = Image.builder()
                .filename(filename)
                .filepath(filename)
                .imgUrl(imgUrl)
                .build();
        imageRepository.save(newImage);
        ImageResDto imageResDto = ImageResDto.of(newImage);
        imageResDtoList.add(imageResDto);
        file.delete(); // local file delete
    }
    return imageResDtoList;
}
ImageController Code
@PostMapping(value = "/api/posts/image", consumes = {MediaType.MULTIPART_FORM_DATA_VALUE, MediaType.APPLICATION_JSON_VALUE})
public ResponseEntity createImage(@RequestPart(value = "files") List<MultipartFile> multipartFiles) throws IOException {
    System.out.println("ImageController Runnnn");
    // get member
    PrincipalDetails principalDetails = (PrincipalDetails) SecurityContextHolder.getContext().getAuthentication().getPrincipal();
    Member member = principalDetails.getMember();
    List<ImageResDto> imageResDtoList = imageService.addFile(multipartFiles);
    return new ResponseEntity(imageResDtoList, HttpStatus.CREATED);
}
I also tried to specify a separate path using Path, but that failed as well.
// Error ..java.nio.file.AccessDeniedException => Path
// multipartFile -> File
Path path = Paths.get(fullFilePath).toAbsolutePath();
multipartFile.transferTo(path.toFile());
Files.createFile(path);
What I don't understand is that when I test with Postman, the file is uploaded to the S3 bucket normally.
Please tell me if I have applied anything wrong.
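A likely culprit in the snippet above is that file.mkdirs() creates a directory at the full file path itself, so the subsequent transferTo call is denied. The following is only a minimal sketch of the local-save step, reusing the absolutePath and filename variables from the question; it creates the parent temp directory rather than a directory named after the file, then lets transferTo write the file.

// Hypothetical rewrite of the local-save step (not the original code):
// create the parent directory, not a directory at the file's own path.
File dir = new File(absolutePath);
if (!dir.exists()) {
    dir.mkdirs();                        // make .../temp, the directory the files go into
}
File file = new File(dir, filename);     // .../temp/<uuid><originalFilename>
multipartFile.transferTo(file);          // transferTo creates the file itself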
I am uploading to Amazon S3, which is making it confusing how to mock the upload in a unit test.
@Service
public class S3service {

    private AmazonS3 s3client;

    @Value("test")
    private String bucketName;

    private String folder = "test";

    @SuppressWarnings("deprecation")
    @PostConstruct
    private void intitialAmazon() {
        this.s3client = new AmazonS3Client(new DefaultAWSCredentialsProviderChain());
    }
This is the method I want to unit test. It takes in a byte array:
public String uploadFile(byte[] pdf) throws IOException {
    ByteArrayInputStream inputStream = new ByteArrayInputStream(pdf);
    s3client.putObject(new PutObjectRequest(bucketName, folder + "/" + "pdf", inputStream, new ObjectMetadata()));
    return "Uploaded";
}
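One common approach is to replace the real AmazonS3 client with a Mockito mock and inject it into the service before calling uploadFile. This is only a minimal sketch, assuming Mockito, JUnit, and Spring's ReflectionTestUtils are on the test classpath and that the private s3client field can be set by reflection; the test name and values are illustrative, not part of the original code.

// Hypothetical unit test; a Mockito mock stands in for the real S3 client.
@Test
public void uploadFileReturnsUploaded() throws IOException {
    AmazonS3 mockedS3 = Mockito.mock(AmazonS3.class);

    S3service service = new S3service();
    // Bypass the @PostConstruct/@Value wiring for the unit test.
    ReflectionTestUtils.setField(service, "s3client", mockedS3);
    ReflectionTestUtils.setField(service, "bucketName", "test");

    String result = service.uploadFile("hello".getBytes());

    assertEquals("Uploaded", result); // static import from org.junit.Assert assumed
    // Verify that a put request was actually sent to the mocked client.
    Mockito.verify(mockedS3).putObject(Mockito.any(PutObjectRequest.class));
}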
I am trying to upload an image to an AWS S3 bucket from my Spring Boot application, but the image does not get uploaded and an error is shown in the console, even though all my configurations appear to be correct. Here, awsS3AudioBucketCoverPhoto is my S3 bucket name.
This is my AmazonS3Config file.
@Configuration
public class AmazonS3Config {

    @Value("${aws.access.key.id}")
    private String awsKeyId;

    @Value("${aws.access.key.secret}")
    private String awsKeySecret;

    @Value("${aws.region}")
    private String awsRegion;

    @Value("${aws.s3.audio.bucket.cover.photo}")
    private String awsS3AudioBucketCoverPhoto;

    @Value("${aws.s3.audio.bucket.profile.photo}")
    private String awsS3AudioBucketProfilePhoto;

    @Bean(name = "awsKeyId")
    public String getAWSKeyId() {
        return awsKeyId;
    }

    @Bean(name = "awsKeySecret")
    public String getAWSKeySecret() {
        return awsKeySecret;
    }

    @Bean(name = "awsRegion")
    public Region getAWSPollyRegion() {
        return Region.getRegion(Regions.fromName(awsRegion));
    }

    @Bean(name = "awsCredentialsProvider")
    public AWSCredentialsProvider getAWSCredentials() {
        BasicAWSCredentials awsCredentials = new BasicAWSCredentials(this.awsKeyId, this.awsKeySecret);
        return new AWSStaticCredentialsProvider(awsCredentials);
    }

    @Bean(name = "awsS3AudioBucketCoverPhoto")
    public String getAWSS3AudioBucketCoverPhoto() {
        return awsS3AudioBucketCoverPhoto;
    }

    @Bean(name = "awsS3AudioBucketProfilePhoto")
    public String getAWSS3AudioBucketProfilePhoto() {
        return awsS3AudioBucketProfilePhoto;
    }
}
This is my ServiceImpl class code.
@Override
public String uploadCoverImageToS3Bucket(MultipartFile multipartFileCover, boolean enablePublicReadAccess) {
    String fileName = PathCOVER + multipartFileCover.getOriginalFilename();
    try {
        //creating the file in the server (temporarily)
        File file = new File(fileName);
        FileOutputStream fos = new FileOutputStream(file);
        fos.write(multipartFileCover.getBytes());
        fos.close();
        PutObjectRequest putObjectRequest = new PutObjectRequest(this.awsS3AudioBucketCoverPhoto, fileName, file);
        if (enablePublicReadAccess) {
            putObjectRequest.withCannedAcl(CannedAccessControlList.PublicRead);
        }
        this.amazonS3.putObject(putObjectRequest);
        //removing the file created in the server
        file.delete();
    } catch (IOException | AmazonServiceException ex) {
        logger.error("error [" + ex.getMessage() + "] occurred while uploading [" + fileName + "] ");
    }
    return multipartFileCover.getOriginalFilename() + " File uploaded successfully";
}
This is the error shown in the IntelliJ IDEA console.
error [https:\elasticbeanstalk-ap-southeast-1-530228581445.s3-ap-southeast-1.amazonaws.com\CoverPhoto\henna.jpg (The filename, directory name, or volume label syntax is incorrect)] occurred while uploading [https://elasticbeanstalk-ap-southeast-1-530228581445.s3-ap-southeast-1.amazonaws.com/CoverPhoto/henna.jpg]
When you use new PutObjectRequest(String bucketName, String key, File file), the first argument is the name of the bucket and the second is the object key.
It seems like your fileName, which you pass as the object key, has the value
https://elasticbeanstalk-ap-southeast-1-530228581445.s3-ap-southeast-1.amazonaws.com/CoverPhoto/henna.jpg
which is clearly not a valid object key.
I can't figure out from your code which value awsS3AudioBucketCoverPhoto has, but be sure to check it as well - it has to be the name of the target bucket.
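As an illustration only (a sketch, not the poster's exact code): build the object key from the original filename and keep the bucket name as a plain bucket name. It reuses the multipartFileCover, file, amazonS3, and awsS3AudioBucketCoverPhoto names from the question; the "CoverPhoto/" prefix is just an example key prefix.

// Sketch: the key is a plain object key, not a URL; the bucket is just the bucket name.
String key = "CoverPhoto/" + multipartFileCover.getOriginalFilename();     // e.g. CoverPhoto/henna.jpg
PutObjectRequest putObjectRequest =
        new PutObjectRequest(this.awsS3AudioBucketCoverPhoto, key, file);  // bucket name, object key, local file
this.amazonS3.putObject(putObjectRequest);

// The public URL can then be derived from the client instead of being used as the key:
String url = this.amazonS3.getUrl(this.awsS3AudioBucketCoverPhoto, key).toString();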
private static final String bucketName = "imagebucket";
private static final BasicAWSCredentials credentials = new BasicAWSCredentials("secret_id", "secret_pass");
private static final String destinationFolder = "images/";

private static AmazonS3 s3 = AmazonS3Client
        .builder()
        .withRegion("us-east-2")
        .withCredentials(new AWSStaticCredentialsProvider(credentials))
        .build();

@RequestMapping("/image")
public String uploadToBucketAndReturnUrl(byte[] fileByteArray, String extension) {
    //generate a UUID hash
    UUID uuid = UUID.randomUUID();
    String randomUUIDString = uuid.toString();
    String imageHash = randomUUIDString + "." + extension;
    String filePath = destinationFolder + imageHash;
    try {
        File tf = File.createTempFile("image", "." + extension);
        BufferedImage image = ImageIO.read(new ByteArrayInputStream(fileByteArray));
        ImageIO.write(image, extension, tf);
        s3.putObject(new PutObjectRequest(bucketName, filePath, tf));
    } catch (IOException e) {
        e.printStackTrace();
    }
    //return the url
    String imageUrl = "https://s3.us-east-2.amazonaws.com/" + bucketName + "/" + filePath;
    return imageUrl;
}
I have a function that will upload an image to my encrypted bucket and return a URL. Now I need to create a function that will take in the URL and retrieve the image from the bucket, and I'm unsure how to proceed.
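As a starting point only, here is a minimal sketch of the reverse direction, assuming the URL has the same "https://s3.us-east-2.amazonaws.com/<bucket>/<key>" shape built above and reusing the same s3 client and bucketName; the downloadFromUrl method name is illustrative.

// Sketch: recover the object key from the URL shape used above, then fetch the bytes.
public byte[] downloadFromUrl(String imageUrl) throws IOException {
    String prefix = "https://s3.us-east-2.amazonaws.com/" + bucketName + "/";
    String key = imageUrl.substring(prefix.length());           // e.g. images/<uuid>.png

    S3Object object = s3.getObject(new GetObjectRequest(bucketName, key));
    try (S3ObjectInputStream in = object.getObjectContent()) {
        // Server-side encryption is transparent to the client; the bytes come back decrypted.
        return com.amazonaws.util.IOUtils.toByteArray(in);
    }
}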
I'm trying to upload an image to an existing bucket in my Google Cloud Storage.
The image file gets uploaded successfully when I go and check, but the returned download URL is null.
CODE
private String uploadImage(File filePath, String blobName, File uploadCreds) throws FileNotFoundException, IOException {
    Storage storage = StorageOptions.newBuilder().setProjectId("myProjectId")
            .setCredentials(ServiceAccountCredentials.fromStream(new FileInputStream(uploadCreds)))
            .build()
            .getService();

    String bucketName = "myBucketName";
    Bucket bucket = storage.get(bucketName);
    BlobId blobId = BlobId.of(bucket.getName(), blobName);
    InputStream inputStream = new FileInputStream(filePath);
    BlobInfo blobInfo = BlobInfo.newBuilder(blobId).setContentType("image/jpeg").build();

    try (WriteChannel writer = storage.writer(blobInfo)) {
        byte[] buffer = new byte[1024];
        int limit;
        try {
            while ((limit = inputStream.read(buffer)) >= 0) {
                writer.write(ByteBuffer.wrap(buffer, 0, limit));
            }
        } catch (Exception ex) {
            ex.printStackTrace();
        } finally {
            writer.close();
        }
        System.out.println("Image URL : " + blobInfo.getMediaLink());
        System.out.println("Blob URL : " + blobInfo.getSelfLink());
        return blobInfo.getMediaLink();
    }
}
filePath is the image file,
blobName is a random image name,
uploadCreds is my credentials.json file.
Why are blobInfo.getMediaLink() and blobInfo.getSelfLink() returning null? What am I doing wrong?
Here is my code that works perfectly
@RestController
@RequestMapping("/api")
public class CloudStorageHelper {

    Credentials credentials;
    Storage storage;

    public CloudStorageHelper() throws IOException {
        credentials = GoogleCredentials.fromStream(
                new FileInputStream("C:\\Users\\sachinthah\\Downloads\\MCQ project -1f959c1fc3a4.json"));
        storage = StorageOptions.newBuilder().setCredentials(credentials).build().getService();
    }

    @SuppressWarnings("deprecation")
    @RequestMapping(method = RequestMethod.POST, value = "/imageUpload112")
    public String uploadFile(@RequestParam("fileseee") MultipartFile fileStream)
            throws IOException, ServletException {
        BlobstoreService blobstoreService = BlobstoreServiceFactory.getBlobstoreService();
        String bucketName = "mcqimages";
        checkFileExtension(fileStream.getName());
        DateTimeFormatter dtf = DateTimeFormat.forPattern("-YYYY-MM-dd-HHmmssSSS");
        DateTime dt = DateTime.now(DateTimeZone.UTC);
        String fileName = fileStream.getOriginalFilename();
        BlobInfo blobInfo = storage.create(
                BlobInfo.newBuilder(bucketName, fileName)
                        .setAcl(new ArrayList<>(Arrays.asList(Acl.of(User.ofAllUsers(), Role.READER))))
                        .build(),
                fileStream.getInputStream());
        System.out.println(blobInfo.getMediaLink());
        // sachintha added a comma after the link to identify the link that gets generated
        return blobInfo.getMediaLink() + ",";
    }

    private void checkFileExtension(String fileName) throws ServletException {
        if (fileName != null && !fileName.isEmpty() && fileName.contains(".")) {
            String[] allowedExt = {".jpg", ".jpeg", ".png", ".gif"};
            for (String ext : allowedExt) {
                if (fileName.endsWith(ext)) {
                    return;
                }
            }
            throw new ServletException("file must be an image");
        }
    }
}
The answer was quite simple: I just got rid of the manual upload method and used the built-in create method to make the blob.
private String uploadImage(File filePath, String blobName, File uploadCreds) throws FileNotFoundException, IOException {
    Storage storage = StorageOptions.newBuilder().setProjectId("porjectId")
            .setCredentials(ServiceAccountCredentials.fromStream(new FileInputStream(uploadCreds)))
            .build()
            .getService();

    String bucketName = "bucketName";
    Bucket bucket = storage.get(bucketName);
    BlobId blobId = BlobId.of(bucket.getName(), blobName);
    InputStream inputStream = new FileInputStream(filePath);
    BlobInfo blobInfo = BlobInfo.newBuilder(blobId).setContentType("image/jpeg").build();
    Blob blob = storage.create(blobInfo, inputStream);

    System.out.println("Image URL : " + blob.getMediaLink());
    return blob.getMediaLink();
}
In case you want to store it in a special folder, take the blob name and prepend the folder plus a "/" to it. For example, if temp.jpg needs to be stored under date folders, get the current date, format it with a date formatter, and prepend it, roughly:
blobName = date + "/" + blobName;
This will classify all images date-wise, as in the sketch below.
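A minimal sketch of that idea, assuming java.time is available and reusing the storage, bucketName, blobName, and filePath names from the method above; the date pattern and example values are illustrative.

// Sketch: prefix the blob name with today's date so objects are grouped into date "folders".
String datePrefix = java.time.LocalDate.now()
        .format(java.time.format.DateTimeFormatter.ofPattern("yyyy-MM-dd"));   // e.g. 2024-01-31
String datedBlobName = datePrefix + "/" + blobName;                            // e.g. 2024-01-31/temp.jpg

BlobId blobId = BlobId.of(bucketName, datedBlobName);
BlobInfo blobInfo = BlobInfo.newBuilder(blobId).setContentType("image/jpeg").build();
Blob blob = storage.create(blobInfo, new FileInputStream(filePath));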