I have a JSF application running in JBoss 6.1, which internally uses the
Tomcat servlet container. I've implemented the upload with Apache Commons
FileUpload.
I want to prevent overly large file uploads and have set the fileSizeMax
property to 10 MB on the FileUploadBase class. It works: the upload throws a
FileSizeLimitExceededException for any file larger than 10 MB, and the
exception is thrown in less than a second.
The main problem, however, is that the whole file is still transferred over the network; I verified this by inspecting the network traffic. Only afterwards is the redirect to the error page performed.
How can I interrupt the file transfer when the maximum size is exceeded,
without transferring the whole file? I assume the file is transferred in
multiple packets because the form uses the attribute enctype="multipart/form-data".
You cannot abort an HTTP request halfway. If you did, you would not be able to return an HTTP response, and the client would end up with no feedback at all, except perhaps a browser-specific "Connection reset by peer" error page.
Your best bet is to validate it in JavaScript beforehand. Note that this only works in browsers supporting the HTML5 File API. You didn't mention which JSF file upload component you're using, so I have the impression that you homebrewed one; I'll therefore give a generic answer applicable to the rendered HTML <input type="file"> (note that it works just as well on e.g. Tomahawk's <t:inputFileUpload>):
<input type="file" ... onchange="checkFileSize(this)" />
with something like this
function checkFileSize(inputFile) {
    var max = 10 * 1024 * 1024; // 10MB
    if (inputFile.files && inputFile.files[0].size > max) {
        alert("File too large."); // Do your thing to handle the error.
        inputFile.value = null; // Clears the field.
    }
}
For older browsers that don't support this, well, you're out of luck. Your best alternative is Flash or a Java applet.
Currently, I am using XMLHttpRequest to upload files to the server using HTML5 capabilities.
There's a progress bar:
xhr.upload.addEventListener('progress', function(e) {
    var done = e.position || e.loaded, total = e.totalSize || e.total;
    console.log(done);
});
Everything works fine, but it doesn't take into account the server-side processing of the file, so it shows 100% uploaded even when the file hasn't been created yet.
The file receiver is a Java servlet, which can only respond after it returns, so there's no way to derive the remaining percentage from its response.
Is there a way around this?
If the processing the server does takes a long time and you want to give feedback while it happens, here's a rough outline of a solution. This will take a while to implement so it's only really worth doing if the processing is slow.
1. Modify your servlet to do its work asynchronously, and return a 201 Accepted response to the client.
2. As the server processes the file, set the progress on an object, typically a ConcurrentHashMap (injected with Spring, if that's what you're using).
3. Expose an API that queries the current progress of the task without blocking until it completes.
4. In your JavaScript, poll this API until the task completes, and show the progress.
You can return a tracking ID in the response at step 1 if you need to.
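The steps above can be sketched with plain JDK primitives. The class and method names here (ProgressTracker, startTask, getProgress) are illustrative assumptions, not from any framework; in a real servlet, startTask would be called from the upload handler and getProgress from the polling endpoint:

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Rough sketch of steps 1-3: accept the work, process it asynchronously,
// and expose the progress of each task under a tracking ID.
public class ProgressTracker {
    private final Map<String, Integer> progress = new ConcurrentHashMap<>();
    private final ExecutorService executor = Executors.newFixedThreadPool(2);

    // Step 1: kick off the work and immediately return a tracking ID
    // (in the servlet you would also send 201 Accepted here).
    public String startTask() {
        String id = UUID.randomUUID().toString();
        progress.put(id, 0);
        executor.submit(() -> {
            for (int pct = 10; pct <= 100; pct += 10) {
                // Step 2: the worker updates its progress as it goes
                // (here simulated; really this would track bytes processed).
                progress.put(id, pct);
            }
        });
        return id;
    }

    // Step 3: non-blocking query that the polling API would call.
    public int getProgress(String id) {
        return progress.getOrDefault(id, -1);
    }

    public void shutdown() throws InterruptedException {
        executor.shutdown();
        executor.awaitTermination(5, TimeUnit.SECONDS);
    }
}
```

The JavaScript at step 4 would then poll the progress endpoint with the returned ID until it reads 100.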
I am working on software that is run through an application security scanner, which flags lines of code that are "probably insecure".
Given the following code, the scanner flags the outputStream.write() line, reporting "Improper Neutralization of Script-Related HTML Tags in a Web Page":
response.addHeader("Content-Disposition", "attachment; filename=" + Util.NeutralizeFileName(filename));
byte[] bytes = obj_Data.getBytes("File");
ServletOutputStream outputStream = response.getOutputStream();
outputStream.write(bytes, 0, bytes.length);
outputStream.flush();
outputStream.close();
Actually, I am not writing an HTML page but a file download, and all the data is validated and neutralized before being converted to bytes.
So my question is: is this a false positive? If not, what can I do to perform the proper validation?
The user could still choose to open the file in the browser. For instance, in IE the user often gets a dialog with the choices cancel, save and open, where open opens the file in the current tab. You can disable the open option with a response header, though (IE honors X-Download-Options: noopen). Whether this is vulnerable to XSS also depends on the Content-Type: is it HTML?
Some security analysis engines (as in my case) flag any place where the software writes out data that originates outside the trust boundary (from the user, for example).
So it is not necessarily a false positive, but rather the designed behavior of the analysis tool, which is probably unable to understand the context of the output (whether it is HTML or a raw byte stream).
The best advice I can give is to consult the tool's support or documentation, where you can find the standards it uses to flag insecure areas of your software.
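Regardless of the scanner's verdict, a defense-in-depth sanitizer for the filename going into the Content-Disposition header could look like the sketch below. This is a hypothetical helper, not the poster's Util.NeutralizeFileName, and the character whitelist is an assumption you should adapt to your own requirements:

```java
// Hypothetical filename neutralizer for a Content-Disposition header:
// strips any path components and whitelists a safe character set.
public class FileNameSanitizer {

    public static String sanitize(String filename) {
        if (filename == null || filename.isEmpty()) {
            return "download";
        }
        // Drop any path component the client may have sent.
        String name = filename.replace('\\', '/');
        name = name.substring(name.lastIndexOf('/') + 1);
        // Whitelist letters, digits, dot, dash and underscore;
        // everything else (quotes, angle brackets, spaces, ...) becomes '_'.
        name = name.replaceAll("[^A-Za-z0-9._-]", "_");
        return name.isEmpty() ? "download" : name;
    }
}
```

This keeps header-breaking characters like quotes and semicolons, as well as script-related characters, out of the header value entirely.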
I have a URL in my Play! app that routes to either HTML or XLSX depending on the extension passed in the URL, with a routes line like:
# Calls
GET /calls.{format} Call.index
so calls.html renders the page and calls.xlsx downloads an Excel file (using the Play Excel module). All works fine from the browser, a cURL request, etc.
I now want to create an email with the Excel file attached, but I cannot pull the attachment. Here's the basic version of what I tried first:
public static void sendReport(List<Object[]> invoicelines, String emailaddress)
        throws MalformedURLException, URISyntaxException {
    setFrom("Telco Analysis <test#test.com>");
    addRecipient(emailaddress);
    setSubject("Telco Analysis report");
    EmailAttachment emailAttachment = new EmailAttachment();
    URL url = new URL("http://localhost:9001/calls.xlsx");
    emailAttachment.setURL(url);
    emailAttachment.setName(url.getFile());
    emailAttachment.setDescription("Test file");
    addAttachment(emailAttachment);
    send(invoicelines);
}
but it just doesn't pull the URL content. It sits there without any error message, with Chrome's page spinner going, and ties up the web server (to the point that requests from another browser/machine don't appear to get serviced). If I send the email without the attachment, all is fine, so it's just pulling the file that appears to be the problem.
So far I've tried the above method, I've tried Play's WS webservice library, I've tried manually-crafted HttpRequests, etc. If I specify another URL (such as http://www.google.com) it works just fine.
Anyone able to assist?
I am making the assumption that you are running in DEV mode.
In DEV mode you will likely have a single-threaded request execution pool, but your controller that sends the email fires off a second request, which blocks until the first request has completed (which it won't, because it is waiting for the second request to respond)... so: deadlock!
The reason external requests work fine is that they don't contend for your Play request pool.
The simple answer to your problem is to increase the value of play.pool in application.conf. Make sure it is uncommented, and choose a value greater than 1!
# Execution pool
# ~~~~~
# Default to 1 thread in DEV mode or (nb processors + 1) threads in PROD mode.
# Try to keep a low as possible. 1 thread will serialize all requests (very useful for debugging purpose)
play.pool=3
I have a web service that receives an image upload via a multipart POST request. I would like to forward the file to another web service without storing it, as the environment does not have access to a file system; so basically, I want to pass along the data just as it is being received.
How do I achieve this?
If the other web service resides on the same server, use:
String url = "<relative path>";
request.getRequestDispatcher(url).forward(request, response);
return;
otherwise use:
response.sendRedirect(url);
You could always try chaining the input and output streams from one to the other, but I suspect you won't get very far with this when there's a hiccup on either side of the connection.
Another option, depending on how much memory you have access to, is to buffer the file in memory after you receive it and then pass it along to the other web service. This of course won't work with very large images, but it's a starting point.
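A minimal sketch of the chaining approach, assuming the downstream service accepts a raw POST body. HttpURLConnection, the 8 KB buffer size, and the relay method's signature are illustrative choices, not requirements:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class UploadRelay {

    // Copies the request body straight to the target connection,
    // 8 KB at a time, without ever buffering the whole file.
    public static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        long total = 0;
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
            total += read;
        }
        return total;
    }

    // Wiring sketch: stream the incoming request body to another service.
    // 'targetUrl' is a placeholder for the downstream endpoint.
    public static int relay(InputStream requestBody, String targetUrl,
                            String contentType) throws IOException {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(targetUrl).openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", contentType);
        conn.setChunkedStreamingMode(8192); // avoid buffering the body in memory
        try (OutputStream out = conn.getOutputStream()) {
            copy(requestBody, out);
        }
        return conn.getResponseCode();
    }
}
```

In a servlet you would pass request.getInputStream() as the body; setChunkedStreamingMode is what keeps HttpURLConnection from accumulating the whole upload in memory before sending.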
I'm trying to limit the image size using Spring's CommonsMultipartResolver. The problem is that the file size exception is thrown immediately upon upload, but the upload continues to the end (due to the nature of a form POST). So I end up saving a temp file on disk, checking the size and then displaying a validation error to the user, wasting additional bandwidth and resources on top of that.
Is there any way to cancel the form submission once the specified image size is exceeded?
This is only possible by checking it on the client side. You can do it with the new HTML5 File API, which offers a size property returning the size in bytes. There is no way to achieve this in HTML4 (except perhaps with some IE-specific ActiveX functions). You're thus dependent on the target web browser as to whether it works. HTML5 is supported in FF >= 3.6, Chrome >= 2 and Safari >= 4 (no, not in IE, not even IE9!). You should in any case keep your server-side check, not only as a fallback for browsers that don't support it, but also for cases where the end user disables or hacks the JS code.
Here's a kickoff example:
<input type="file" onchange="checkSize(this)" />
with
function checkSize(input) {
    if (input.files && input.files[0].size > (10 * 1024)) {
        alert("File too large. Max 10KB allowed.");
        input.value = null;
    }
}
Yes, input.files is an array-like list. Since HTML5, <input type="file"> allows selecting multiple files via the multiple attribute.