getOutputStream() has already been called Spring MVC - java

Our requirement is to download files from an FTP server. We currently download the files sequentially, one after another, which takes a huge amount of time.
Sequential code:
// the files the user selected in the UI are kept in the String array str
for (int i = 0; i < str.length; i++) {
    try {
        // downloadMultipleNavDbs downloads the file; a folder named after uniqueID
        // is created and all downloaded files are kept in that folder
        downloadMultipleNavDbs(str[i], uniqueID);
    } catch (Exception ex) {
        logger.error(ex.getMessage());
    }
}
After downloading all files into the uniqueID folder, I zip them and send the zip to the client with downloadZipFile(response, uniqueID):
public void downloadZipFile(HttpServletResponse response, String uniqueID) throws IOException
{
    try
    {
        ZipOutputStream zipOutputStream = new ZipOutputStream(response.getOutputStream());
        response.setContentType("application/octet-stream");
        File loadableFolder = new File(uniqueID);
        String timeStamp = new java.sql.Timestamp(Calendar.getInstance().getTime().getTime()).toString()
                .replaceAll("-", "_").replaceAll(" ", "_").replaceAll(":", "_");
        response.setHeader(
                "Content-Disposition",
                "attachment; filename=" + "File" + timeStamp.substring(0, timeStamp.indexOf(".")) + ".ZIP");
        File[] loadableMedias = loadableFolder.listFiles(); // here we get the files from the uniqueID folder
        ZipUtil.addFilesToZipStream(loadableMedias, loadableFolder, zipOutputStream);
        zipOutputStream.close();
        response.flushBuffer();
        response.getOutputStream().flush();
        response.getOutputStream().close();
    }
    catch (IOException e)
    {
        logger.error(e.getMessage());
    }
}
This works perfectly fine with the sequential code above.
When I use an ExecutorService to download the files in parallel,
executorService.submit(new myThread(str[i]), uniqueID);
I am facing this error:
getOutputStream() has already been closed
Can anyone please explain why I am facing this error and how to resolve it?
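If the cause is that downloadZipFile runs before the worker threads have finished (so the response stream is opened, flushed and closed while downloads are still in flight), one minimal sketch is to collect the Futures returned by submit and wait for them before zipping. This reuses the question's myThread, downloadZipFile and uniqueID and assumes myThread implements Runnable; it is an illustration, not the poster's actual fix:

List<Future<String>> futures = new ArrayList<>();
for (int i = 0; i < str.length; i++) {
    // submit(Runnable, result) returns a Future that completes when the task finishes
    futures.add(executorService.submit(new myThread(str[i]), uniqueID));
}
for (Future<String> future : futures) {
    try {
        future.get(); // block until this download has completed (or surface its failure)
    } catch (InterruptedException | ExecutionException ex) {
        logger.error(ex.getMessage());
    }
}
// only now is it safe to open the response's output stream and write the zip
downloadZipFile(response, uniqueID);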

Related

Invalid zip file getting generated from a byte array

I am trying to compress a set of files and store the result in memory.
Below is the code I am using:
try (ByteArrayOutputStream zipBaos = new ByteArrayOutputStream();
ZipOutputStream zs = new ZipOutputStream(zipBaos)) {
Path pp = Paths.get(sourceDirPath);
Files.walk(pp)
.filter(path -> !Files.isDirectory(path) && pp.relativize(path).toString().contains(instituteId)
&& pp.relativize(path).toString().contains("dumps-" + hostCount))
.forEach(LambdaExceptionUtil.rethrowConsumer(path -> {
ZipEntry zipEntry = new ZipEntry(pp.relativize(path).toString());
try {
downloadedfilename.add(zipEntry.getName().substring(
zipEntry.getName().lastIndexOf(File.separator) + 1, zipEntry.getName().length()));
zs.putNextEntry(zipEntry);
Files.copy(path, zs);
zs.closeEntry();
} catch (IOException e) {
LOGGER.error("Exception in Zipping downloaded files {}", e);
throw e;
}
}));
return zipBaos.toByteArray();
}
}
Later, when I try to store this byte array content in the file system as a zip file:
FileUtils.writeByteArrayToFile(new File(location + File.separator + name), content);
The zip file is created and it shows the proper size as well.
But when I try to open the file, Windows complains that it is invalid.
Note: I can open it with 7-Zip but not with Windows Explorer.
Thanks.
Adding ZipOutputStream finish() and flush() before returning resolved the issue. The zip's central directory is only written when finish() (or close()) is called, and toByteArray() was being called before the try-with-resources block closed the stream, so the returned bytes were missing the end of the archive:
zs.finish();
zs.flush();
return zipBaos.toByteArray();
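For reference, a small self-contained sketch of the same pattern (the class and method names here are illustrative, not from the question):

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipToBytes {
    // Zips every regular file under sourceDirPath into an in-memory byte array.
    public static byte[] zipDirectory(String sourceDirPath) throws IOException {
        Path root = Paths.get(sourceDirPath);
        try (ByteArrayOutputStream baos = new ByteArrayOutputStream();
             ZipOutputStream zos = new ZipOutputStream(baos);
             Stream<Path> paths = Files.walk(root)) {
            for (Path path : (Iterable<Path>) paths.filter(Files::isRegularFile)::iterator) {
                zos.putNextEntry(new ZipEntry(root.relativize(path).toString()));
                Files.copy(path, zos);
                zos.closeEntry();
            }
            zos.finish(); // writes the central directory; without it the archive is incomplete
            return baos.toByteArray();
        }
    }
}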

Spring Boot + AWS S3: Unable to delete files in the bucket

I am new to AWS, and the first module I am trying to learn is S3 for file storage.
Uploading works fine; the problem is with deleting. When I upload a file, I store the string name of the file in the AWS bucket mybucket and the whole URL in a MySQL database, like this:
-> https://mybucket.s3.eu-west-2.amazonaws.com/what.png
The problem with deleting is that even if I pass the whole URL (in this case https://mybucket.s3.eu-west-2.amazonaws.com/what.png) to the delete method, the method goes through each step successfully, telling me that the file has been successfully deleted, but when I check the bucket the file is still there. I have searched around here for a similar issue, but couldn't find something that could help me understand what the problem is. Here is the code:
@Service
public class AmazonS3ClientServiceImpl {
private String awsS3AudioBucket; //bucket name
private AmazonS3 amazonS3; // s3 object which uploads file
private static final Logger logger = LoggerFactory.getLogger(AmazonS3ClientServiceImpl.class);
@Autowired
public AmazonS3ClientServiceImpl(Region awsRegion, AWSCredentialsProvider awsCredentialsProvider, String awsS3AudioBucket) {
this.amazonS3 = AmazonS3ClientBuilder.standard()
.withCredentials(awsCredentialsProvider)
.withRegion(awsRegion.getName()).build();
this.awsS3AudioBucket = awsS3AudioBucket;
}
public String uploadFileToS3Bucket(MultipartFile multipartFile, boolean enablePublicReadAccess) {
String uploadedfile = ""; // the file path which is on s3
String fileName = multipartFile.getOriginalFilename();
try {
//creating the file in the server (temporarily)
File file = new File(fileName);
FileOutputStream fos = new FileOutputStream(file);
fos.write(multipartFile.getBytes());
fos.close();
PutObjectRequest putObjectRequest = new PutObjectRequest(this.awsS3AudioBucket, fileName, file);
if (enablePublicReadAccess) {
putObjectRequest.withCannedAcl(CannedAccessControlList.PublicRead);
}
this.amazonS3.putObject(putObjectRequest);
uploadedfile = String.valueOf(this.amazonS3.getUrl(awsS3AudioBucket, fileName));
System.out.println(this.amazonS3.getUrl(awsS3AudioBucket, fileName));
System.out.println(uploadedfile);
//removing the file created in the server
file.delete();
} catch (IOException | AmazonServiceException ex) {
logger.error("error [" + ex.getMessage() + "] occurred while uploading [" + fileName + "] ");
}
return uploadedfile;
}
public void deleteFileFromS3Bucket(String fileName) {
LOGGER.info("Deleting file with name= " + fileName);
final DeleteObjectRequest deleteObjectRequest = new DeleteObjectRequest(this.awsS3AudioBucket, fileName);
amazonS3.deleteObject(deleteObjectRequest);
LOGGER.info("File deleted successfully");
}
And when I call the delete method, I use this:
@GetMapping("/dashboard/showposts/delete/{id}")
public String deletePost(@PathVariable("id") Long id, Model model) {
System.out.println("GOT HERE");
//Retrieving Post image name
Post post = postService.findBydId(id);
String imageName = post.getImage();
System.out.println(imageName);
//Deleting image from S3 bucket
amazonClient.deleteFileFromS3Bucket(imageName);
//Deleting post from db
postService.detelePost(id);
String success = "Successfully deleted post with Id" + id;
model.addAttribute("success", success);
return "redirect:/admin/dashboard/showposts";
}
Any help would be greatly appreciated.
L.E. For anyone having the same issue and searching for a quick answer: you have to pass only the string image name (the object key) to the delete method, not the whole URL.
You aren't checking the response returned from amazonS3.deleteObject() to see if it was actually successful or not. It is probably returning a failure status.
I'm guessing the root issue is that you are passing the full URL to the delete method, instead of just the path to the file within S3. For example with this URL: https://mybucket.s3.eu-west-2.amazonaws.com/what.png the S3 object path is simply what.png.
The simplest answer is to use the URL class. Something like:
URL url = null;
try {
url = new URL("https://mybucket.s3.eu-west-2.amazonaws.com/some/path/what.png");
} catch (MalformedURLException e) {
e.printStackTrace();
}
System.out.println( "file is \""+ url.getFile() + "\"" );
output would be "/some/path/what.png". You can remove the first "/" character to use for the key.
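Applied to the question's controller, a hedged sketch of the resulting delete call (exception handling omitted; post, amazonClient and the example URL come from the question):

URL url = new URL(post.getImage()); // e.g. https://mybucket.s3.eu-west-2.amazonaws.com/what.png
String key = url.getPath().substring(1); // strip the leading "/" to get the S3 object key
amazonClient.deleteFileFromS3Bucket(key); // pass only the key, not the full URL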
AWS S3 is eventually consistent. You might delete an object and S3 may still list that object in the browser, so it can take a few seconds or less for the deletion to show up.
Please refer to this link.

Creation of single zip containing multiple zips fails for ServletOutputStream

I am using Commons Compress to zip multiple files and send them to the client from a Servlet.
The files could be a combination of any types (text, video, audio, archives, images, etc.). I take the input stream of each file and write it to the ServletOutputStream using IOUtils.copy(is, os).
The code usually works fine for any document combination, but when there is a request to download files that contain more than one zip, I get java.io.IOException: Closed.
As a result, the zip file created is corrupted, even though its size is the sum of the individual file sizes (I am not using compression).
I tried creating the zip locally, using a FileOutputStream instead of response.getOutputStream() in the constructor of ZipArchiveOutputStream, and it succeeds.
So it looks like the problem exists only for ServletOutputStream.
Can anyone suggest a workaround?
Here is my code:
try (ZipArchiveOutputStream zos = new ZipArchiveOutputStream(response.getOutputStream())) {
//get fileList
for(File file : files) {
addFileToZip(zos, file.getName(), new BufferedInputStream(new FileInputStream(file)));
}
zos.close();
}
public static void addFileToZip(ZipArchiveOutputStream zipOutputStream, String filename, InputStream inputStream) throws FileNotFoundException {
if(zipOutputStream != null && inputStream != null) {
try {
zipOutputStream.putArchiveEntry(new ZipArchiveEntry(filename));
IOUtils.copy(inputStream, zipOutputStream);
logger.debug("fileAddedToZip :" + filename);
} catch (IOException e) {
logger.error("Error in adding file :" + filename, e);
} finally {
try {
inputStream.close();
zipOutputStream.closeArchiveEntry(); // starts to fail here after the 1st zip is added
} catch (IOException e) {
logger.error("Error in closing zip entry :" + filename, e);
}
}
}
Here is the exception trace :
java.io.IOException: Closed
at org.mortbay.jetty.AbstractGenerator$Output.write(AbstractGenerator.java:627)
at org.mortbay.jetty.AbstractGenerator$Output.write(AbstractGenerator.java:577)
at org.apache.commons.compress.archivers.zip.ZipArchiveOutputStream.writeOut(ZipArchiveOutputStream.java:1287)
at org.apache.commons.compress.archivers.zip.ZipArchiveOutputStream.writeOut(ZipArchiveOutputStream.java:1272)
at org.apache.commons.compress.archivers.zip.ZipArchiveOutputStream.writeDataDescriptor(ZipArchiveOutputStream.java:997)
at org.apache.commons.compress.archivers.zip.ZipArchiveOutputStream.closeArchiveEntry(ZipArchiveOutputStream.java:461)
at xxx.yyy.zzz.util.ZipUtils.addFileToZip(ZipUtils.java:110)
Line 110 is zipOutputStream.closeArchiveEntry(); // starts to fail here after the 1st zip is added
Thanks in advance.
The problem is that you use try-with-resources, which automatically closes the stream you create in it, yet you also close it manually. When the JVM then tries to auto-close the stream, you get the java.io.IOException: Closed exception because it is already closed.
If you use try-with-resources, you don't need to close the streams you create in it. Remove your manual zos.close() statement:
try (ZipArchiveOutputStream zos =
new ZipArchiveOutputStream(response.getOutputStream())) {
//get fileList
for(File file : files) {
addFileToZip(zos, file.getName(), new BufferedInputStream(new FileInputStream(file)));
}
} // Here zos will be closed automatically!
Also note that once zos is closed, it will also close the servlet's underlying OutputStream, so you will not be able to add further entries. You have to add all entries before it is closed.

FileUploader - Save data in the project

I am uploading a file with the PrimeFaces 3.5 file uploader.
My upload method looks like this:
public void handleFileUpload(FileUploadEvent event) {
log.info("Method handleFileUpload invoked");
FacesMessage msg = new FacesMessage("Succesful", event.getFile().getFileName() + " is uploaded.");
FacesContext.getCurrentInstance().addMessage(null, msg);
InputStream inputStream = null;
OutputStream out = null;
try {
File targetFolder = new File("\\resources\\uploads");
if(!targetFolder.exists()) {
targetFolder.mkdirs();
}
inputStream = event.getFile().getInputstream();
File outFile = new File(targetFolder, event.getFile().getFileName());
log.info("copy file stream to " + outFile.getAbsolutePath());
out = new FileOutputStream(outFile);
int read = 0;
byte[] bytes = new byte[1024]; // copy buffer
log.info("read file stream");
while ((read = inputStream.read(bytes)) != -1) {
out.write(bytes, 0, read);
}
out.flush();
} catch (IOException e) {
log.error(e);
} finally {
...
}
At the moment my files get uploaded to \\resources\\uploads. That's the path to a folder on the C: drive.
However, I want to upload my files to a path in my Eclipse project. How do I change the path? I really appreciate your answer!
However, I want to upload my uploads to a path in my eclipse project.
That's absolutely not recommended for the reasons mentioned in this answer: Uploaded image only available after refreshing the page. The point is: the IDE's workspace and the server's deploy folder are absolutely not intended as permanent file storage. The uploaded files would become unreachable and/or disappear as if by magic.
Just keep them in a path external to the IDE's workspace and the server's deploy folder. You're doing it fine. I'd only make the path configurable by a system property, environment variable or properties file setting, so that you don't need to edit, recompile and rebuild the code every time you change the upload location.
If your concrete problem is more the serving of the uploaded file, then just add the upload folder as another context in server's configuration, or create a simple servlet for the serving job, or as you're using PrimeFaces, just use <p:fileDownload> or <p:graphicImage> with StreamedContent pointing to the desired FileInputStream.
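As a minimal sketch of making the path configurable (the property name upload.location and the fallback directory are illustrative, not from the answer), the body of handleFileUpload could resolve its target folder like this:

// read the upload directory from a system property, with an illustrative fallback
String uploadLocation = System.getProperty("upload.location", "/var/webapp/uploads");
File targetFolder = new File(uploadLocation);
if (!targetFolder.exists()) {
    targetFolder.mkdirs();
}
File outFile = new File(targetFolder, event.getFile().getFileName());
try (InputStream in = event.getFile().getInputstream();
     OutputStream out = new FileOutputStream(outFile)) {
    byte[] buffer = new byte[1024];
    int read;
    while ((read = in.read(buffer)) != -1) {
        out.write(buffer, 0, read);
    }
}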
See also:
How to save uploaded file in JSF

uploaded file not saved to the webcontent directory

I am developing an application using the Eclipse IDE. My application has file upload functionality.
I am able to upload the file and save it. But the problem is that the uploaded file does not get stored in my dynamic web project directory.
The uploaded file gets stored in my server directory under the .metadata folder, with the path
file:///E:/workspace/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/wtpwebapps/
I want to store my uploads in my WebContent folder, in an upload folder containing an images folder, like WebContent/upload/images.
I am able to view the image file, but the path I want is the one above.
Below is the code I am using to store the uploaded file:
@RequestMapping(value = "/company/UploadFile.action", method = RequestMethod.POST)
public @ResponseBody String uploadFile(FileUploadBean uploadItem, BindingResult result, HttpServletRequest request, HttpServletResponse response) {
System.out.println("FILE UPLOAD ITEM SI SSLSL ::"+uploadItem);
ExtJSFormResult extjsFormResult = new ExtJSFormResult();
if (result.hasErrors()){
for(ObjectError error : result.getAllErrors()){
System.err.println("Error: " + error.getCode() + " - " + error.getDefaultMessage());
}
//set extjs return - error
extjsFormResult.setSuccess(false);
return extjsFormResult.toString();
}
// Some type of file processing...
System.err.println("-------------------------------------------");
System.err.println("Test upload: " + uploadItem.getFile().getOriginalFilename());
System.err.println("-------------------------------------------");
try{
MultipartFile file = uploadItem.getFile();
String fileName = null;
InputStream inputStream = null;
OutputStream outputStream = null;
if (file.getSize() > 0) {
inputStream = file.getInputStream();
/*if (file.getSize() > 10000) {
System.out.println("File Size:::" + file.getSize());
extjsFormResult.setSuccess(false);
return extjsFormResult.toString();
}*/
System.out.println("also path ::"+request.getRealPath("") + "/upload/images/");
System.out.println("PATHI SIS SIS"+this.getClass().getProtectionDomain().getCodeSource().getLocation().getPath());
System.out.println("size::" + file.getSize());
InetAddress addr = InetAddress.getLocalHost();
byte[] ipAddr = addr.getAddress();
System.out.println("HOST NAME"+request.getRealPath("ResourceMgt"));
System.out.println("HOST ADDR"+addr.getHostAddress());
System.out.println("HOST "+request.getRequestURI());
System.out.println("HOST "+request.getRequestURL());
fileName = request.getRealPath("") + "/upload/images/"
+ file.getOriginalFilename();
outputStream = new FileOutputStream(fileName);
System.out.println("FILEN ANEM AND PATH IS ::"+fileName);
System.out.println("fileName:" + file.getOriginalFilename());
int readBytes = 0;
byte[] buffer = new byte[40000];
while ((readBytes = inputStream.read(buffer, 0, 40000)) != -1) {
outputStream.write(buffer, 0, readBytes);
}
companyservice.saveImages(file.getOriginalFilename(),fileName);
outputStream.close();
inputStream.close();
}
}catch (Exception e) {
// TODO: handle exception
e.printStackTrace();
}
//set extjs return - sucsess
extjsFormResult.setSuccess(true);
return extjsFormResult.toString();
}
Please suggest how I can store the uploaded file in my WebContent folder, in an upload folder with an images folder. My code above is working perfectly; there is just some issue with specifying the path.
Have you tried to change the destination of the outputStream?
fileName = request.getRealPath("") + "/upload/images/"
+ file.getOriginalFilename();
Instead of request.getRealPath(""), put an absolute destination or play with the classpath. For example:
fileName = "/opt/tomcat/webapps/upload/images/"
+ file.getOriginalFilename();
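Alternatively, a hedged sketch using Spring's MultipartFile.transferTo to write directly to a directory outside the deployed webapp (the path /opt/uploads/images is illustrative), inside the existing try block:

// write the upload to a directory outside the deployed webapp
File uploadDir = new File("/opt/uploads/images");
if (!uploadDir.exists()) {
    uploadDir.mkdirs();
}
File destination = new File(uploadDir, file.getOriginalFilename());
file.transferTo(destination); // MultipartFile copies its content to the target file
companyservice.saveImages(file.getOriginalFilename(), destination.getAbsolutePath());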
Now I am able to upload the file successfully, but the file gets stored in the deployed directory on the server.
As soon as I remove the project and redeploy it to my Tomcat 6.0 server, all the files I had uploaded are deleted.
I am using Java as my server-side technology with Tomcat 6.0.
I am able to upload the file successfully, but the file gets stored in the deployed directory on the server.
As soon as I remove the project and redeploy it to my Tomcat 7.0 server, all the files I had uploaded are deleted.
I am using Java and JSF as my server-side technologies with Tomcat 7.0 in the Eclipse IDE.
