I'm exporting an xls file from a Java backend but when the client receives it, it is recognised as an xlsx file despite it having the .xls extension in the name. I am using JasperXlsExporterBuilder to build the xls file.
When sending the file, I set the response type: response.type("application/vnd.ms-excel");
and the response headers: response.header("Content-Disposition", "attachment;filename=" + file.getName()); where the file name is filename.xls.
But still, when the client receives it, the file is called filename.xls, yet the download dialog says "You have chosen to open: filename.xls, which is: Excel 2007 spreadsheet (5 kB), from: blob:". This causes an issue because some Excel versions can't handle the mismatch between the recognised type and the file extension.
Nothing you're doing is causing that; the computer that is downloading your file has a broken installation. For example, if the registry maps the .xls extension to the Excel 2007 (.xlsx) file type, you'd get exactly that. You cannot detect this and you cannot fix it: you're a web server, and if you could detect or fix such things on the client machine, you could also do malicious things; hence you can't, and you never will be able to do such things from a web server.
One last-ditch effort you can try, which probably won't work (at which point you've exhausted all options available to you server-side), is to ensure that the URL itself ends in .xls. Make sure the user is following a link like https://www.user3274server.com/foo/bar/filename.xls.
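For completeness, here is a minimal plain-servlet sketch of that last-ditch setup (URL path ending in .xls, explicit content type and disposition). The servlet path and file location are placeholders, not taken from the question:

import java.io.File;
import java.io.IOException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Sketch: serve the exported report from a URL whose path ends in .xls,
// with the legacy .xls MIME type and an attachment disposition.
@WebServlet("/foo/bar/filename.xls")
public class XlsDownloadServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        File file = new File("/data/exports/filename.xls"); // placeholder path to the exported report
        resp.setContentType("application/vnd.ms-excel");
        resp.setHeader("Content-Disposition", "attachment; filename=\"" + file.getName() + "\"");
        resp.setContentLengthLong(file.length());
        java.nio.file.Files.copy(file.toPath(), resp.getOutputStream());
    }
}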
My application uses wicket 6.3 and my file upload is working as expected, except for some cases of corrupted or distorted files when viewed or downloaded.
In most cases, file upload is working, but there are cases where the uploaded files are saved without error yet cannot be viewed or downloaded correctly because they are corrupted.
These files are just under 100 KB in size.
By corrupted I mean that when you try to open the uploaded file, it comes out garbled.
Below is the part of my code that saves the uploaded file:
BE code
FileUploadField fileUploadField = new FileUploadField("fileUploadField");
...
// File types can be images (jpg/png/bmp), documents (docx/pdf)
// File types with random distorted/corrupted (png and docx)
// Reuploading the file will fix the issue
for (FileUpload fu : m_fileUpload.getFileUploads()) {
    byte[] fileBytes = fu.getBytes();
    String fileId = myService.persistFile(new MyFile().setContent(fileBytes));
    supportingDoc.addAttachment(fileId, fu.getClientFileName(), fileBytes.length);
}
...
HTML
<input wicket:id="fileUploadField" type="file" class="form-control">
When I try to replicate it again, it usually works, and I can download or view the file.
I am unsure why it gets corrupted/distorted and how I can avoid such an error, even though it rarely happens.
Update:
I thought the initial case was a PNG upload, but it was not; it was a docx. When I tried to replicate the issue using a PNG file, it worked.
I have 2 Tomcat servers (Test and Live), both on Ubuntu. I uploaded the same docx file to both. The Test server was able to view/download the uploaded docx file, while the Live server was not.
I am converting the file into a byte array and saving it to the DB. When I compare both file contents in the DB, they are exactly the same, so the problem is not really in the upload.
I think the problem is in the download decoding: the two servers do not use the same decoder. On my local environment (Tomcat + Windows) it works, same with my Test environment (Tomcat + Ubuntu). My Live environment, also running Tomcat + Ubuntu, seems to use a different default decoder, which is why it cannot view/download the docx properly.
My problem now is where and how to check that default decoder. Should I check it on the Ubuntu side, or on the Tomcat side? When I compared the Tomcat server config on Test and Live, both seem to have the same configuration; they only differ in SSL certs.
Solution that works in my case:
The problem was really in how my Live server handles unknown MIME types: it treats them like text files, which is why it shows a file with an unknown MIME type as garbled text.
I checked my Live Tomcat server configuration and compared it with my Test server's config; both are almost the same, and I cannot find any configuration relating to MIME types or encoders/decoders.
In my file download code, when the MIME type is unknown it was being set to force-download only, so I changed it to application/force-download.
For unknown MIME types, I also changed the Content-Disposition from inline; filename="<filename>" to attachment; filename="<filename>".
I think Content-Disposition: attachment alone would work, though I haven't tested it; application/force-download seems to be a hack, while setting the Content-Disposition to attachment is what makes the browser download the unknown MIME type in my case.
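This isn't my actual download code, just a minimal servlet-style sketch of the change. MyFile and myService are the names from the upload code above; loadFile, getName and getContent are hypothetical helpers, and I use application/octet-stream here instead of the application/force-download hack:

import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class FileDownloadServlet extends HttpServlet {
    private MyService myService; // hypothetical service, injected as in the upload code

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        MyFile file = myService.loadFile(req.getParameter("id")); // hypothetical lookup
        String mimeType = getServletContext().getMimeType(file.getName());
        if (mimeType == null) {
            // Unknown MIME type: serve as a generic binary attachment instead of inline text.
            mimeType = "application/octet-stream";
            resp.setHeader("Content-Disposition", "attachment; filename=\"" + file.getName() + "\"");
        } else {
            resp.setHeader("Content-Disposition", "inline; filename=\"" + file.getName() + "\"");
        }
        resp.setContentType(mimeType);
        resp.getOutputStream().write(file.getContent());
    }
}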
I will not delete this question though, since I might hit the same issue in the future and forget how I solved it :)
Situation:
The system fetches emails via standard methods (POP3) from a mail server and sends them to the archiving component as multi-part messages (*.eml files).
If the mail was sent from Outlook, it may contain an OLE object, for example an MS Word or MS Excel document. There are several ways to include such an object, for example via the menu "Insert -> Object".
Problem:
Our requirement is now to extract those OLE objects and archive them as separate attachments. It would be best to do it in Java or another JVM language. Other languages and frameworks would be possible, but they must work on different platforms (Windows, Linux, Unix).
The problem is that we haven't found any library, or functions in existing libraries, to do this.
The first issue is that the message the receiver gets depends on how Outlook is configured:
It may send RTF messages: then the receiver gets a message with an attachment "Untitled Attachment.bin".
It may send HTML messages: then the receiver gets a message including an attachment "oledata.mso".
What we've tried so far:
We tried Apache POI, especially POIFS, to load the file "oledata.mso", but it complained that a header value is wrong:
Invalid header signature; read 0xD7EC9C7800013000, expected 0xE11AB1A1E011CFD0 - Your file appears not to be a valid OLE2 document
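For reference, a minimal sketch of the kind of call that produces that error; this is not our full code, and the file path is just a placeholder:

import java.io.FileInputStream;
import java.io.InputStream;
import org.apache.poi.poifs.filesystem.POIFSFileSystem;

public class OledataProbe {
    public static void main(String[] args) throws Exception {
        try (InputStream in = new FileInputStream("oledata.mso")) { // placeholder path
            // Throws IOException ("Invalid header signature ... not a valid OLE2 document")
            // because oledata.mso does not start with the CFB magic bytes.
            POIFSFileSystem fs = new POIFSFileSystem(in);
            System.out.println(fs.getRoot().getEntryCount());
        }
    }
}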
We found a website talking about the same issue. As far as we understood, oledata.mso is a collection of Compound File Binary files, which should also be parsable with POI individually, because OpenMCDF does the same things as POI.
On that website they somehow separate the parts and parse them separately, but we haven't found a similar function or any specification of how this is done.
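We have no documented function for this, so the following is only a guess at what "separating the parts" could mean: scan oledata.mso for the standard CFB magic bytes D0 CF 11 E0 A1 B1 1A E1 (the signature POI's error message refers to) and hand each segment to POIFS. This is an unverified sketch, not a known-good solution:

import java.io.ByteArrayInputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.apache.poi.poifs.filesystem.POIFSFileSystem;

public class OledataSplitter {
    // Magic bytes at the start of every Compound File Binary (OLE2) document.
    private static final byte[] CFB_MAGIC = {
        (byte) 0xD0, (byte) 0xCF, 0x11, (byte) 0xE0,
        (byte) 0xA1, (byte) 0xB1, 0x1A, (byte) 0xE1
    };

    public static void main(String[] args) throws Exception {
        byte[] data = Files.readAllBytes(Paths.get("oledata.mso")); // placeholder path

        // Collect the offsets where a CFB header starts.
        List<Integer> offsets = new ArrayList<>();
        for (int i = 0; i + CFB_MAGIC.length <= data.length; i++) {
            if (Arrays.equals(CFB_MAGIC, Arrays.copyOfRange(data, i, i + CFB_MAGIC.length))) {
                offsets.add(i);
            }
        }

        // Try to parse each segment (from one header to the next) with POIFS.
        for (int k = 0; k < offsets.size(); k++) {
            int start = offsets.get(k);
            int end = (k + 1 < offsets.size()) ? offsets.get(k + 1) : data.length;
            byte[] segment = Arrays.copyOfRange(data, start, end);
            try (POIFSFileSystem fs = new POIFSFileSystem(new ByteArrayInputStream(segment))) {
                System.out.println("Segment at offset " + start + ": "
                        + fs.getRoot().getEntryCount() + " entries");
            }
        }
    }
}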
Can anybody please shed some light on this?
I am working in Java, downloading files from an HTTP server. We are working with symlinks here, so we do not need to change the HTTP link - it always points to "last-uploaded.zip", which is linked to the last uploaded zip file, for example "package43.zip".
Is there a way to get the original filename in Java? The link points to "last-uploaded.zip", but when it is downloaded I want to rename it to "package$version.zip".
Regards,
Marco
No. The whole symlink concept doesn't transfer over HTTP, so when you make an HTTP GET for last-uploaded.zip you don't know if it's a file, a symlink, or just an endpoint that returns bytes.
The simplest solution is probably opening the zip and searching for the version number from inside there somewhere.
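A minimal sketch of that idea, assuming (purely as an illustration) that the archive carries its version in a version.txt entry; the URL and entry name are placeholders, so adapt them to wherever the version actually lives in your packages:

import java.net.URL;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class VersionFromZip {
    public static void main(String[] args) throws Exception {
        URL url = new URL("https://example.com/last-uploaded.zip"); // placeholder URL
        try (ZipInputStream zip = new ZipInputStream(url.openStream())) {
            ZipEntry entry;
            while ((entry = zip.getNextEntry()) != null) {
                // Hypothetical convention: a version.txt entry holds the package version.
                if (entry.getName().equals("version.txt")) {
                    String version = new String(zip.readAllBytes()).trim();
                    System.out.println("Rename the download to package" + version + ".zip");
                }
            }
        }
    }
}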
While uploading an image/doc/xlsx file from my AngularJS client to my server-side Java using JAX-RS (Jersey), I am getting the following exception:
org.jvnet.mimepull.MIMEParsingException: Reached EOF, but there is no closing MIME boundary.
What is this? Why am I getting this exception? How can I get rid of it?
Note: it works for files with the extensions .txt, .html, .yml, .java, .properties,
but not for files with the extensions .doc, .xlsx, .png, .PNG, .jpeg, etc.
My server-side code:
@POST
@Path("/{name}")
@Consumes(MediaType.MULTIPART_FORM_DATA)
public String uploadedFiles(@Nonnull @PathParam("name") final String name,
                            @FormDataParam("file") final InputStream inputStream,
                            @FormDataParam("file") final FormDataContentDisposition content) {
}
I encountered the same issue. Based on my research, the problem has no relation to the file type; it is somewhat related to the size of the uploaded file.
My guess at the root cause: when the uploaded file is very big, the client disconnects from the server (for example due to a timeout) before the file has been uploaded completely. I also verified this guess. My test steps were:
1. In the client, upload a very big file.
2. Before the response comes back from the server (i.e. while the file is still uploading), close the test client.
3. Check the server side; you will see the issue.
So to fix it, my solution was to increase the timeout on the client side.
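The original question's client is AngularJS, so this is not the poster's code; but as an illustration of where the same knob sits on the Java side, here is a minimal Jersey client sketch with explicit connect/read timeouts (the values are arbitrary):

import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import org.glassfish.jersey.client.ClientProperties;

public class UploadClientTimeouts {
    public static void main(String[] args) {
        Client client = ClientBuilder.newClient();
        // Give large uploads enough time before the client gives up and leaves
        // the server with a truncated multipart body (no closing MIME boundary).
        client.property(ClientProperties.CONNECT_TIMEOUT, 10_000);  // milliseconds
        client.property(ClientProperties.READ_TIMEOUT, 300_000);    // milliseconds
        // ... build the multipart request and POST it with this client ...
    }
}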
OK, I'm only guessing, but I think I can see a pattern here.
The file types that are working are text based
The file types that are not working are binary
This suggests to me that the problem is in how non-text data is being handled by the upload process; maybe it is being transcoded when it shouldn't be.
Anyway, I suggest that you use some tool like Wireshark to capture the TCP/IP traffic in an upload to see if the upload request body has valid MIME encapsulation.
On the network where I have my web server there is a machine that has many zipped pdf files (zipped using java.util.zip) and I can access these files through HTTP. When a user wants to download a pdf file, I know how to unzip the file locally on the server first and then deliver the unzipped pdf to the user through a servlet. Is it possible to deliver the unzipped file to the user without unzipping it locally first?
Regards
In principle, if the client has said in its request that it accepts gzip-compressed data, you could send the PDF file in compressed form, and the client will decompress it. There is a gotcha, though: while the compression algorithm of zip files and of HTTP Content-Encoding: gzip is the same, the zip file format has some more things around it (since it can contain multiple files and a directory structure), so it would be necessary to strip these things off first. I'm not sure this would be much easier than decompressing in your servlet and then letting your servlet engine take care of compressing again, but try it.
You can send the response to a request, encoded in a compressed format. If the client does the request with the header
Accept-Encoding: gzip, deflate
you can, for instance, serve him the content compressed using gzip (as long as you declare this through a header):
Content-Encoding: gzip
Source: Wikipedia: HTTP Compression
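A minimal servlet-style sketch of that negotiation; it is not tied to the zipped files from the question, it just gzips whatever it writes when the client advertises gzip support:

import java.io.IOException;
import java.io.OutputStream;
import java.util.zip.GZIPOutputStream;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class GzipPdfServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        resp.setContentType("application/pdf");
        String acceptEncoding = req.getHeader("Accept-Encoding");
        OutputStream out = resp.getOutputStream();
        if (acceptEncoding != null && acceptEncoding.contains("gzip")) {
            // Declare the encoding and wrap the stream; the browser decompresses transparently.
            resp.setHeader("Content-Encoding", "gzip");
            out = new GZIPOutputStream(out);
        }
        // ... write the PDF bytes to 'out' ...
        out.close(); // closing flushes the gzip trailer
    }
}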
Is it possible to deliver the unzipped file to the user without unzipping it locally first?
That depends a little bit on what exactly you mean by "locally"; the general answer is "no". To deliver unzipped content, you have to unzip the zip first.
If you actually mean that the zip file is located on some non-local machine and that you currently need to save and unzip it locally before streaming the unzipped content, then the answer would be "yes": it is possible to unzip and stream it without saving the file locally. Just pass/decorate the streams without using FileInputStream/FileOutputStream.
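A minimal sketch of that, assuming the remote zip contains a single PDF entry and is reachable over HTTP; the URL is a placeholder:

import java.io.IOException;
import java.io.OutputStream;
import java.net.URL;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class UnzippedPdfServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        URL zipUrl = new URL("http://fileserver.internal/docs/report.zip"); // placeholder
        try (ZipInputStream zip = new ZipInputStream(zipUrl.openStream())) {
            ZipEntry entry;
            while ((entry = zip.getNextEntry()) != null) {
                if (entry.getName().endsWith(".pdf")) {
                    resp.setContentType("application/pdf");
                    resp.setHeader("Content-Disposition",
                            "inline; filename=\"" + entry.getName() + "\"");
                    OutputStream out = resp.getOutputStream();
                    zip.transferTo(out); // decompress straight into the response, no temp file
                    return;
                }
            }
        }
        resp.sendError(HttpServletResponse.SC_NOT_FOUND);
    }
}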