Several uploads to servlet in one request - java

In Servlet 3.1 it is possible to upload multiple files to a servlet in one request (like a bulk upload), but what I want to achieve is to upload a large number of files to the servlet as soon as each one is created, without creating a new request for every file. In other words, 1000 files are created at runtime, and I do not want the first one to wait for the creation of the last before being uploaded. I want each file uploaded immediately after creation, so that the servlet can check the upload timestamp for each file (and send back the result). The hard part for me is that I want neither to wait for all of them to be created, nor to use a separate request for each file.
Is that possible? If yes, please give me some tips and directions/sources.

Yes, it is possible to read multiple files in a single servlet request.
Here is how you could read the files in the servlet:
import java.io.IOException;
import java.io.PrintWriter;

import javax.servlet.ServletException;
import javax.servlet.annotation.MultipartConfig;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.Part;

@WebServlet(urlPatterns = { "/UploadServlet" })
@MultipartConfig(location = "/uploads")
public class UploadServlet extends HttpServlet {

    @Override
    protected void doPost(HttpServletRequest request, HttpServletResponse response)
            throws ServletException, IOException {
        PrintWriter out = response.getWriter();
        for (Part part : request.getParts()) {
            String fileName = part.getSubmittedFileName();
            out.println("... writing " + fileName);
            part.write(fileName);
            out.println("... uploaded to: /uploads/" + fileName);
        }
    }
}
Update:
Based on your updated question, you have ~1000 files that you want to "stream" to the server as they are created. It's not possible to add more parts to an existing POST request, and I think it would be a bit excessive to try and upload all 1000 files in a single POST request anyway.
I recommend setting some arbitrary chunk size (say 20 files at a time, for example), batching the files up, and making a POST request for every 20 files. This way, you are not waiting for all 1000 files to be created, and you are not doing everything over a single POST request. This allows you to "stream" your files to the server in 20-file increments.
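A minimal client-side sketch of that batching idea, assuming Apache HttpClient (with its httpmime multipart support) is available on the uploading side; the servlet URL, the "files" part name and the batch size of 20 are illustrative assumptions, not requirements:

import java.io.File;
import java.util.ArrayList;
import java.util.List;

import org.apache.http.client.methods.HttpPost;
import org.apache.http.entity.ContentType;
import org.apache.http.entity.mime.MultipartEntityBuilder;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;

public class BatchUploader {

    private static final int BATCH_SIZE = 20;           // arbitrary chunk size
    private final List<File> pending = new ArrayList<>();

    // Call this every time a file finishes being created.
    public void onFileCreated(File file) throws Exception {
        pending.add(file);
        if (pending.size() >= BATCH_SIZE) {
            flush();
        }
    }

    // POST the current batch as one multipart request and clear it.
    public void flush() throws Exception {
        if (pending.isEmpty()) {
            return;
        }
        MultipartEntityBuilder builder = MultipartEntityBuilder.create();
        for (File f : pending) {
            builder.addBinaryBody("files", f, ContentType.DEFAULT_BINARY, f.getName());
        }
        HttpPost post = new HttpPost("http://localhost:8080/app/UploadServlet"); // assumed URL
        post.setEntity(builder.build());
        try (CloseableHttpClient client = HttpClients.createDefault()) {
            client.execute(post).close(); // the servlet above reads each part via getParts()
        }
        pending.clear();
    }
}

Calling flush() once more when file creation is finished catches the final, possibly smaller, batch.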

Related

Prevent client timing out while a servlet generates a large download

I have a Java servlet that generates some arbitrary report file and returns it as a download to the user's browser. The file is written directly to the servlet's output stream, so if it is very large then it can successfully download in chunks. However, sometimes the resulting data is not large enough to get split into chunks, and either the client connection times out, or it runs successfully but the download doesn't appear in the browser's UI until it's 100% done.
This is what my servlet code looks like:
@Override
protected void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
    response.setContentType("application/pdf");
    response.setHeader("Content-Disposition", "attachment; filename=\"" + report.getFileName(params) + "\"");

    try (OutputStream outputStream = response.getOutputStream()) {
        // Send the first response ASAP to suppress client timeouts
        response.flushBuffer(); // This doesn't seem to change anything??
        // This calls some arbitrary function that writes data directly into the given stream
        generateFile(outputStream);
    }
}
I ran a particularly bad test where the file generation took 110,826ms. Once it got to the end, the client had downloaded a 0 byte file - I assume this is the result of a timeout. I am expecting this specific result to be somewhere between 10 and 30 KB - smaller than the servlet's buffer. When I ran a different test, it generated a lot of data quickly (up to 80MB total), so the download appeared in my browser after the first chunk was filled up.
Is there a way to force a downloaded file to appear in the browser (and prevent a timeout from occurring) before any actual data has been generated? Am I on the right track with that flushBuffer() call?
Well, it looks like shrinking the size of my output buffer with response.setBufferSize(1000); allowed my stress test file to download successfully. I still don't know why response.flushBuffer() didn't seem to do anything, but at least as long as I generate data quickly enough to fill that buffer size before timing out, the download will complete.
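For reference, a minimal sketch of where those calls sit, based on the doPost() from the question (generateFile() is the question's own placeholder, and the fixed filename plus the 1000-byte buffer are just illustrative values):

@Override
protected void doPost(HttpServletRequest request, HttpServletResponse response)
        throws ServletException, IOException {
    response.setContentType("application/pdf");
    response.setHeader("Content-Disposition", "attachment; filename=\"report.pdf\"");
    // Must be called before any content is written or the response is committed.
    response.setBufferSize(1000);
    try (OutputStream outputStream = response.getOutputStream()) {
        // Commits the status line and headers right away; with the small buffer,
        // the browser then sees data as soon as the first 1000 bytes are written.
        response.flushBuffer();
        generateFile(outputStream);
    }
}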

Java outputstream, trigger file download before all data retrieved from db

I'm trying to wrap my head around Java Output/InputStreams, closing and flushing. I have a situation where I want to create a file using Apache POI with data from a server. I would like the file to start downloading as soon as I retrieve the first record from the DB (i.e. the browser shows at the bottom that the download has started).
public void createExcelFile(final HttpServletResponse response,
                            final Long vendorId) {
    try {
        // set up response types ...
        final XSSFWorkbook xssfWorkbook = new XSSFWorkbook();
        final XSSFSheet sheet = xssfWorkbook.createSheet("sheets1");
        // create file with data
        writeExcelOutputData(sheet, xssfWorkbook);
        xssfWorkbook.write(response.getOutputStream());
        xssfWorkbook.close();
    }
    catch (final Exception e) {
        LOGGER.error("Boom");
    }
}
The above code will perform a file download no problem, but this could be a big file. I go off and fetch the data (around 20-30s) and only after that does the download begin - no good...
Can I achieve what I need, or what's the best approach? Thanks in advance.
Cheers :)
Reasons could be the following:
Maybe there is a read/write timeout on your HTTP server; if the process gets lengthy, or because of low bandwidth, the connection will be closed by the server.
Make sure the process (the Excel work) gets completely done; maybe there is an error/exception during the work.
The solution of Jorge looks very promising: the user makes one request for a file, then the server does the work in the background, and then either the user checks the progress and downloads the file when it is ready, or the server informs the user by email, web notification, etc.
Also, you could keep the generated file on the server in a temp file for a while, so that if the connection gets interrupted, the server can respond with the remaining part of the file (omitting the bytes already sent, like a normal partial download).
Keeping a connection alive to do lengthy work is not very logical.
Again, if the file gets ready fast (really fast) for download/stream, and the download is interrupted, it could be because of a read/write timeout on the server, or a very bad network.
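A minimal sketch of the background-generation idea described above, assuming a shared ExecutorService and a temp file; the ReportJobs class and the token-based polling are illustrative assumptions, not an existing API:

import java.io.File;
import java.nio.file.Files;
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ReportJobs {

    private static final ExecutorService POOL = Executors.newFixedThreadPool(4);
    private static final Map<String, Future<File>> JOBS = new ConcurrentHashMap<>();

    // Start generating the workbook in the background; return a token the
    // client can poll with (or that can be mailed to the user later).
    public static String submit(final Long vendorId) {
        String token = UUID.randomUUID().toString();
        JOBS.put(token, POOL.submit(() -> {
            File tmp = Files.createTempFile("report-" + vendorId + "-", ".xlsx").toFile();
            // ... build the XSSFWorkbook here and write it to tmp ...
            return tmp;
        }));
        return token;
    }

    // Returns the finished file, or null while the job is still running.
    public static File poll(String token) throws Exception {
        Future<File> job = JOBS.get(token);
        return (job != null && job.isDone()) ? job.get() : null;
    }
}

One request would call submit(vendorId) and return the token immediately; a later request streams the temp file to the response once poll(token) returns a non-null file.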

Tomcat 7 Servlet spawning a thread

I am writing a web app that I want to use to perform FTP tasks (downloads)
I have Apache FTPS server installed in Tomcat and a Java Client ready to initiate the transfers.
The client will be invoked by the Servlet.
For example:
http://laptop:8080/MyServlet?action=download&from=desktop&file=C:/home/fred/file.xml
Would tell the instance on my laptop to download file.xml from my desktop.
EDIT:
Apologies, I never made this very clear.
There will be an FTP server at both ends of this process. 1 on my remote laptop and 1 on my local desktop. So in a nutshell I am submitting an FTP 'get' request to the Servlet on the remote side. The Servlet then kicks off an FTP process to pull the file across.
My Servlet is all set to receive the GET parameters and do the work.
If the file is quite big then each request will take a long time to complete. I want the Servlet resources freed up as quickly as possible.
Ideally I'd like the following things to happen:
User to send URL to Servlet
Servlet to digest the URL and work out what file and where from etc...
Servlet to pass info to a Thread
Servlet to come back with an "In progress" message
Request completes
Thread is still working in the background downloading the file
At this time I'm not too concerned with the Servlet having knowledge of the success of the thread, I just need it to kick one off and forget about it. The FTP process will have separate logging elsewhere for any issues.
I am interested in the concept of creating a thread pool in the WebApp and fetching a thread from there, but again, all the examples I've found are old and don't really cater for my level of understanding.
There are a few similar questions on StackOverflow, with the following being the most similar to what I am asking, but it just hints at something called ExecutorService that I have no prior knowledge of. How would I set this up in a WebApp?
What is recommended way for spawning threads from a servlet in Tomcat
For info,
I have researched this and have found a lot of incomplete examples that require a better understanding than I currently have, or hints towards what is required.
Also a lot of the examples I've read are a few years old, nothing recent. I'm hoping there might be a magical one-liner to do everything I need (doubtful) that has come about in the last year or so :)
I'm new to Threading concepts in Java, I understand Threading in general so appreciate any help you can offer me.
Trevor
I'm not sure I have really understood what you want ...
client: send request (via HTTP) and wait for HTTP response
server: analyse request and find file to send
server: ... (processing)
server: send HTTP response (1) with ?
client: opens FTP connection (could not open it before)
server: receive FTP request (command connection)
server: send file (data connection)
client: file is received and saved locally
If the client side is a browser, it should be enough for the response (1) to be a redirect to a URL like ftp://host/path/to/file, because all major browsers know of the FTP protocol and are able to use it to download a file.
The problem is not on the server side: you can easily spawn a thread that acts as an FTP client or (probably harder) as an FTP server, but I cannot imagine anything better than a redirection on the client side. The client has opened an HTTP connection that cannot be used for an FTP transfer, and it must open a new connection for the FTP request. As it is a new connection, how do you want it to be processed by the thread launched at the previous step? There is no notion of session in FTP and there's no easy way to identify the right request.
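For completeness, the redirect suggested above is a one-liner on the server side (the ftp:// path is a placeholder):

@Override
protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
    // The browser follows the redirect and performs the FTP download itself,
    // so no servlet thread is tied up while the file is transferred.
    resp.sendRedirect("ftp://host/path/to/file");
}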
Edit per comment:
OK, it appears that you just want to do deferred processing on the server after the request completes. You have two ways of doing that:
as suggested by your tags, use a worker thread to do the job. Your servlet is plain Java and you can create a thread like you would in any other Java application. If you are interested in getting the result of the deferred processing later, you can give the thread a reference to the session (or simply to a session attribute) where it will be able to put its progress and/or completion status. This requires some more boilerplate code but is guaranteed to work (example below).
you can close the HTTP connection before the servlet returns. This is not explicitly guaranteed by the official servlet specs, but I found it to work at least in Tomcat 7. You will find more details in this other post: Servlet - close connection but not method
Example using simple threads, storing the status in the session:
import java.io.IOException;
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.http.HttpSession;

public class ThreadedServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest hsr, HttpServletResponse hsr1) throws ServletException, IOException {
        String fileName = null;
        // preliminary work ...
        Status status = new Status(fileName);
        final HttpSession session = hsr.getSession();
        synchronized (session) {
            List<Status> statuses = (List<Status>) session.getAttribute("statuses");
            if (statuses == null) {
                statuses = new ArrayList<Status>();
                session.setAttribute("statuses", statuses); // store the new list so it survives the request
            }
            statuses.add(status);
        }
        Worker worker = new Worker(fileName, status); // hand the status object to the worker
        Thread thr = new Thread(worker);
        thr.start();
        // write the response either directly or by forwarding to a JSP
    }

    public static class Status implements Serializable {
        private String fileName;
        private WStatus status = WStatus.STARTED;

        public Status(String fileName) {
            this.fileName = fileName;
        }

        public String getFileName() {
            return fileName;
        }

        public void setFileName(String fileName) {
            this.fileName = fileName;
        }

        public WStatus getStatus() {
            return status;
        }

        public void setStatus(WStatus status) {
            this.status = status;
        }
    }

    public enum WStatus {
        STARTED,
        RUNNING,
        COMPLETED
    }

    private static class Worker implements Runnable {
        private final String fileName;
        private final Status status;

        public Worker(String fileName, Status status) {
            this.fileName = fileName;
            this.status = status;
        }

        @Override
        public void run() {
            status.setStatus(WStatus.RUNNING);
            // do your stuff ...
            status.setStatus(WStatus.COMPLETED);
        }
    }
}
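Since the question also asks how an ExecutorService would be set up in a web app, here is a minimal sketch using a ServletContextListener (Servlet 3.0 style, so it works on Tomcat 7); the pool size and the "ftpPool" attribute name are arbitrary choices:

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import javax.servlet.annotation.WebListener;

@WebListener
public class FtpTaskPool implements ServletContextListener {

    private ExecutorService pool;

    @Override
    public void contextInitialized(ServletContextEvent sce) {
        pool = Executors.newFixedThreadPool(5);
        // Make the pool reachable from any servlet via the ServletContext.
        sce.getServletContext().setAttribute("ftpPool", pool);
    }

    @Override
    public void contextDestroyed(ServletContextEvent sce) {
        pool.shutdownNow(); // stop workers when the webapp is undeployed
    }
}

A servlet can then fetch the pool from the ServletContext, submit its Runnable and return an "In progress" response straight away:

ExecutorService pool = (ExecutorService) getServletContext().getAttribute("ftpPool");
pool.submit(worker); // the Runnable doing the FTP 'get'
response.getWriter().println("In progress");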

Cancel FileUpload when FileSizeMax is exceeded

I have a JSF application which runs in JBoss 6.1, which internally uses the Tomcat servlet container.
I've implemented the upload with Apache Commons FileUpload.
I want to prevent too large file uploads and have set the property fileSizeMax to 10MB within the class FileUploadBase. It works: the file upload throws a FileSizeLimitExceededException for all files larger than 10MB. This exception is thrown within less than a second.
But the main problem is that the whole file is still transferred over the network. I have found this out by checking the network traffic. Afterwards the redirect to the error page is done.
How can I interrupt the file transfer when the max size is exceeded, without transferring the whole file? I assume that the file is transferred in multiple packets because of the web form attribute enctype="multipart/form-data".
You cannot abort an HTTP request halfway. If you did, you would not be able to return an HTTP response, and the client would end up with no form of feedback, except maybe a browser-specific "Connection reset by peer" error page.
Your best bet is to validate it in JavaScript beforehand. This works, by the way, only in browsers supporting the HTML5 File API. You didn't tell anything about which JSF file upload component you're using, so I have the impression that you just homebrewed one; I'll therefore give a generic answer which is applicable to the rendered HTML <input type="file"> (note that it works just as well on e.g. Tomahawk's <t:inputFileUpload>):
<input type="file" ... onchange="checkFileSize(this)" />
with something like this
function checkFileSize(inputFile) {
    var max = 10 * 1024 * 1024; // 10MB
    if (inputFile.files && inputFile.files[0].size > max) {
        alert("File too large."); // Do your thing to handle the error.
        inputFile.value = null; // Clears the field.
    }
}
In case of older browsers not supporting this, well, you're lost. Your best alternative is Flash or Applet.

Why is this URL not opened from Play! Framework 1.2.4?

I have a URL in my Play! app that routes to either HTML or XLSX depending on the extension that is passed in the URL, with a routes line like :-
# Calls
GET /calls.{format} Call.index
so calls.html renders the page, calls.xlsx downloads an Excel file (using Play Excel module). All works fine from the browser, a cURL request, etc.
I now want to be able to create an email and have the Excel attached to it, but I cannot pull the attachment. Here's the basic version of what I tried first :-
public static void sendReport(List<Object[]> invoicelines, String emailaddress) throws MalformedURLException, URISyntaxException
{
    setFrom("Telco Analysis <test@test.com>");
    addRecipient(emailaddress);
    setSubject("Telco Analysis report");

    EmailAttachment emailAttachment = new EmailAttachment();
    URL url = new URL("http://localhost:9001/calls.xlsx");
    emailAttachment.setURL(url);
    emailAttachment.setName(url.getFile());
    emailAttachment.setDescription("Test file");
    addAttachment(emailAttachment);

    send(invoicelines);
}
but it just doesn't pull the URL content, it just sits there without any error messages, with Chrome's page spinner going and ties up the web server (to the point that requests from another browser/machine don't appear to get serviced). If I send the email without the attachment, all is fine, so it's just the pulling of the file that appears to be the problem.
So far I've tried the above method, I've tried Play's WS webservice library, I've tried manually-crafted HttpRequests, etc. If I specify another URL (such as http://www.google.com) it works just fine.
Anyone able to assist?
I am making an assumption that you are running in Dev mode.
In Dev mode, you will likely have a single-request execution pool, but in your controller that sends an email, you are sending off a second request, which will block until your previous request has completed (which it won't, because it is waiting for the second request to respond)... so... deadlock!
The reason why external requests work fine is that you are not causing the deadlock on your Play request pool.
The simple answer to your problem is to increase the value of play.pool in application.conf. Make sure that it is uncommented, and choose a value greater than 1!
# Execution pool
# ~~~~~
# Default to 1 thread in DEV mode or (nb processors + 1) threads in PROD mode.
# Try to keep a low as possible. 1 thread will serialize all requests (very useful for debugging purpose)
play.pool=3
