Understand Working File Upload Solution - java

I've written the method below that is called by my doPost method to parse multipart/form-data given in the request. It all works great, I just don't really understand what is going on. If anyone could break down the three lines in my try I would really appreciate it. I've read through the Apache Commons File Upload documentation but it's just not making sense to me and I hate writing code that I do not fully understand. In particular, I would like to know what is actually happening when the factory and upload objects are created.
public static List<FileItem> parseFormRequest(HttpServletRequest request)
{
    List<FileItem> items = null;
    try
    {
        DiskFileItemFactory factory = new DiskFileItemFactory();
        ServletFileUpload upload = new ServletFileUpload(factory);
        items = upload.parseRequest(request);
    }
    catch (FileUploadException error)
    {
        System.out.println("UploadFileServlet - Error With File Parsing - " + error.getMessage());
    }
    return items;
}
BONUS HELP!
I also get a warning under upload.parseRequest(request) that says Type safety: The expression of type List needs unchecked conversion to conform to List<FileItem>. If anyone could explain this too that would really help me get what I've done. Thanks

The factory is just a helper; I'll explain it later. The main work is done by the ServletFileUpload.
ServletFileUpload scans through all the uploaded files (using an Iterator that parses the MIME structure and knows how to deal with the boundary markers, content length, etc.).
For each uploaded file, the parse method asks the FileItemFactory to create a local representation of the uploaded file, and then copies the contents from memory (i.e. from the HTTP POST request, which is held in memory) to the actual file on disk.
Simplified, the procedure is as follows:
Get next uploaded file
Ask factory for a local file ("FileItem")
Copy content from in-memory (from HttpServletRequest) to local file (a java.io.File in case of a DiskFileItemFactory)
Loop until end of HTTP request is reached
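Once parseRequest() returns, the caller typically walks the resulting list and handles form fields and uploaded files separately. A minimal sketch (assuming commons-fileupload 1.x on the classpath; the target directory and method name are illustrative):

```java
import java.io.File;
import java.util.List;

import org.apache.commons.fileupload.FileItem;

public class UploadHandler {
    // Walks the parsed items; plain form fields and uploaded files
    // are distinguished with isFormField().
    static void handleItems(List<FileItem> items) throws Exception {
        for (FileItem item : items) {
            if (item.isFormField()) {
                // A regular form field: name/value pair
                System.out.println(item.getFieldName() + " = " + item.getString());
            } else {
                // An uploaded file: persist it wherever you like
                item.write(new File("/tmp", item.getName()));
            }
        }
    }
}
```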
See the sources of the following classes for details
org.apache.commons.fileupload.FileUploadBase.parseRequest(RequestContext)
org.apache.commons.fileupload.FileUploadBase.FileItemIteratorImpl.FileItemStreamImpl
This design lets you switch to another storage facility for files. For example, you could replace the DiskFileItemFactory with your own DatabaseFileItemFactory, so that uploaded files get stored in a database instead of a local file on the server. The code change would only affect a single line, and the rest of commons-fileupload can be used as-is (e.g. the parsing of the HTTP request, the iteration over the uploaded files, etc.).
For the second question: commons-fileupload aims to stay Java 1.4 compatible, so the return type of parseRequest()
is actually an untyped java.util.List - it's missing the declaration that the list only contains FileItem objects (i.e. java.util.List<FileItem>).
Since you declared your variable items to be of type List<FileItem>, the Java compiler warns you about this mismatch.
In this case, you did it correctly and you may ignore the warning by adding the following:
@SuppressWarnings("unchecked")
public static List<FileItem> parseFormRequest(HttpServletRequest request)
{
...

You need to spend some time in the documentation.
From what I can gather, the ServletFileUpload instance uses the factory you supplied to actually create the file instances that are in the request. You used a factory that writes the files to disk; there are other options, though (e.g. keeping them in memory). By specifying the factory, you are specifying the type of file items that get created.
When you call
upload.parseRequest(request)
the ServletFileUpload instance goes through the request data and actually creates the files it finds, using the factory, and returns them to you in a list.
If you look at the parseRequest documentation you will notice that the method only returns a raw List. In your code, you are assigning that returned list to a List<FileItem>. That requires an unchecked conversion, which is why you get the compiler warning.
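The same warning can be reproduced without commons-fileupload: any pre-generics method that returns a raw List triggers an unchecked conversion when assigned to a parameterized variable. A minimal stdlib sketch:

```java
import java.util.ArrayList;
import java.util.List;

public class UncheckedDemo {
    // Simulates a Java 1.4-era API that returns a raw List
    static List legacyList() {
        List raw = new ArrayList();
        raw.add("hello");
        return raw;
    }

    // Assigning the raw List to a List<String> compiles, but the compiler
    // warns because it cannot verify the element type; the annotation
    // records that we checked it ourselves.
    @SuppressWarnings("unchecked")
    static List<String> typedList() {
        return legacyList(); // unchecked conversion: List -> List<String>
    }

    public static void main(String[] args) {
        System.out.println(typedList().get(0)); // prints "hello"
    }
}
```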

I need the Java code that is used to invoke this function...
I need the Servlet code that is used to add parameters to the HttpServletRequest.

Related

How can we pass header information to a JsonDataSource?

We are using a REST endpoint as the datasource for Jasper reports, but wherever a REST endpoint is used it is mandatory to create an adapter with the REST URL and header info and use that as the datasource.
We don't want to use an adapter; instead we want to use the constructor directly:
public JsonDataSource(String location, String selectExpression) throws JRException
as a dataset expression so we formed expression as follow.
new net.sf.jasperreports.engine.data.JsonDataSource("http://vagrant.ptcnet.ptc.com:2280/Windchill/trustedAuth/servlet/odata/D...","value")
However, this particular endpoint expects some header information from the requestor ("Accept", "application/json"); otherwise it throws a bad request error.
Is there any way we can pass header info here?
You need to use the constructor where you pass an InputStream:
public JsonDataSource(java.io.InputStream jsonStream, java.lang.String selectExpression)
The easiest way to provide the input stream is probably to create a method within your Java project that executes the request and returns the result in, for example, a ByteArrayInputStream.
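A sketch of such a helper method using only the JDK (java.net.HttpURLConnection), so nothing beyond the standard library is needed; the class and method names here are illustrative, and buffering the body into a ByteArrayInputStream keeps the whole close() question inside the helper:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class JsonFetcher {
    // Copies an InputStream fully into memory, so the connection can be
    // closed before the data is handed to JasperReports.
    static ByteArrayInputStream buffer(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int n;
        while ((n = in.read(chunk)) != -1) {
            out.write(chunk, 0, n);
        }
        return new ByteArrayInputStream(out.toByteArray());
    }

    // Executes a GET with the Accept header and returns the buffered body.
    public static ByteArrayInputStream fetchJson(String url) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestProperty("Accept", "application/json");
        try (InputStream in = conn.getInputStream()) {
            return buffer(in);
        } finally {
            conn.disconnect();
        }
    }
}
```

The dataset expression would then call this method and pass the result to the JsonDataSource(InputStream, String) constructor.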
If you need to do it directly within the report (jrxml) you need to do it in one expression (jrxml does not support multi-line code). In this case you could use the Apache HttpClients class that is already included as a dependency of the JasperReports project.
It could be something like this
new net.sf.jasperreports.engine.data.JsonDataSource(
    org.apache.http.impl.client.HttpClients.createDefault().execute(
        org.apache.http.client.methods.RequestBuilder.get()
            .setUri("http://vagrant.ptcnet.ptc.com:2280/Windchill/trustedAuth/servlet/odata/D...")
            .setHeader("Accept", "application/json")
            .build()
    ).getEntity().getContent(),
    ""
)
getContent() will return the InputStream, and JasperReports will close this stream when it is done. However, both the client and the execute response are theoretically Closeable, which normally means you should call close() on them to free up resources; hence I'm not sure it is enough to close only the InputStream - you may risk leaking resources. This is why I initially suggested creating a method within the/a Java project where this can be handled appropriately.

Can I transform a JSON-LD to a Java object?

EDIT: I changed my mind. I would like to find a way to generate the Java class and load the JSON as an object of that class.
I just discovered that there exists a variant of JSON called JSON-LD.
It seems to me a more structured way of defining JSON, which reminds me of XML with an associated schema, like XSD.
Can I create a Java class from JSON-LD, load it at runtime and use it to convert JSON-LD to an instance of that class?
I read the documentation of both the implementations but I found nothing about it. Maybe I read them badly?
Doing a Google search brought me to a library that will decode the JSON-LD into an "undefined" Object.
import java.io.FileInputStream;
import java.io.InputStream;
import java.util.HashMap;
import java.util.Map;

import com.github.jsonldjava.core.JsonLdOptions;
import com.github.jsonldjava.core.JsonLdProcessor;
import com.github.jsonldjava.utils.JsonUtils;

// Open a valid JSON(-LD) input file
InputStream inputStream = new FileInputStream("input.json");
// Read the file into an Object (the type of this object will be a List, Map, String,
// Boolean, Number or null, depending on the root object in the file)
Object jsonObject = JsonUtils.fromInputStream(inputStream);
// Create a context JSON map containing prefixes and definitions
Map context = new HashMap();
// Customise context...
// Create an instance of JsonLdOptions with the standard JSON-LD options
JsonLdOptions options = new JsonLdOptions();
// Customise options...
// Call whichever JSON-LD function you want! (e.g. compact)
Object compact = JsonLdProcessor.compact(jsonObject, context, options);
// Print out the result (or don't, it's your call!)
System.out.println(JsonUtils.toPrettyString(compact));
https://github.com/jsonld-java/jsonld-java
Apparently, it can take the JSON from a plain string as well, as if reading it from a file or some other source. How you access the contents of the object, I can't tell. The documentation seems to be moderately decent, though.
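Since fromInputStream returns plain Java collections (Map, List, String, Number, Boolean or null), one way to get at the contents is ordinary instanceof checks. A stdlib sketch, with a stand-in map in place of a real parsed document (the key names are illustrative):

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class JsonLdAccess {
    // Inspects an object of the shape jsonld-java returns.
    static String describe(Object jsonObject) {
        if (jsonObject instanceof Map) {
            Map<?, ?> map = (Map<?, ?>) jsonObject;
            return "map with id " + map.get("@id");
        } else if (jsonObject instanceof List) {
            return "list of " + ((List<?>) jsonObject).size() + " nodes";
        }
        return "scalar: " + jsonObject;
    }

    public static void main(String[] args) {
        Map<String, Object> standIn = new HashMap<>();
        standIn.put("@id", "http://example.org/thing");
        System.out.println(describe(standIn));
    }
}
```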
It seems to be an active project: the last commit was only 4 days ago, and it has 30 contributors. The license is BSD 3-Clause, if that makes any difference to you.
I'm not in any way associated with this project - I'm not an author, nor have I made any pull requests. It's just something I found.
Good luck and I hope this helped!
see this page: JSON-LD Module for Jackson

JAX-WS CXF empty XOP multipart attachments with file size > ~210kb

I am using jax-ws cxf to load documents from a SOAP interface. I can get the correct document via SoapUI (xop/multipart). Unfortunately, when I try to load the attachment via code, the CachedOutputStream is empty for files greater than ~210kb.
What I tried:
Activate MTOMFeature for my WebServiceClient
Play with JVM arguments CachedOutputStream.Threshold and CachedOutputStream.MaxSize
Use different versions of apache-cxf (3.2.1 or 3.1.14)
When debugging:
PhaseInterceptorChain#doIntercept uses the AttachmentInInterceptor (at currentInterceptor.handleMessage(message);) which loads the attachments with LazyAttachmentCollection and adds it to the message.
happy case: document is loaded into CachedOutputStream and available after the for-loop.
error case (file too big?): document is available directly after currentInterceptor.handleMessage is called, but disappears when the loop has finished
In both of the above cases however, a correct tmp file is saved to my disk (with exactly my document's content). Furthermore, I can load that file in both cases even when the loop has finished with: ((org.apache.cxf.attachment.LazyAttachmentCollection)(message.getAttachments())).loadAll();
I had a similar problem with apache-cxf 3.1.6. The issue was that files above 102kB were empty. After some digging it turned out to be the "attachment-memory-threshold" property, which you can set in the requestContext; for some reason the file cache doesn't seem to work.
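For a JAX-WS port, the requestContext is reached through BindingProvider; a fragment sketching how that property could be set (the 10 MB value is an arbitrary example, and port stands for your generated service port):

```java
import java.util.Map;
import javax.xml.ws.BindingProvider;

// 'port' is your generated JAX-WS service port
Map<String, Object> ctx = ((BindingProvider) port).getRequestContext();
// Raise the threshold above your largest expected attachment
ctx.put("attachment-memory-threshold", String.valueOf(10 * 1024 * 1024));
```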

Jersey Post request - How to perform a file upload with an unknown number of additional parameters?

I asked something like this previously, but upon re-reading my original post, it was not easy to understand what I was really asking. I have the following situation. We have (or at least I'm trying to get working) a custom file upload procedure that will take in the file, a set number of 'known' metadata values (and they will always be there), as well as potentially an unknown number of additional metadata values. The service that exists currently uses the Jersey framework (1.16)
I currently have both client and server code that handles dealing with the file upload portion and the known metadata values (server code below)
@POST
@Path("asset/{obfuscatedValue0}/")
@Consumes(MediaType.MULTIPART_FORM_DATA)
public UUID uploadBlob(@PathParam("obfuscatedValue0") Integer obfuscatedValue0,
        @FormDataParam("obfuscatedValue1") String obfuscatedValue1,
        @FormDataParam("obfuscatedValue2") String obfuscatedValue2,
        @FormDataParam("obfuscatedValue3") String obfuscatedValue3,
        @FormDataParam("obfuscatedValue4") String obfuscatedValue4,
        @FormDataParam("obfuscatedValue5") String obfuscatedValue5,
        @FormDataParam("file") InputStream uploadedInputStream) {
    .....
}
...and excerpt of client code:
Builder requestBuilder = _storageService
.path("asset")
.path(obfuscatedValue0.toString())
.type(MediaType.MULTIPART_FORM_DATA)
.accept(MediaType.APPLICATION_JSON);
FormDataMultiPart part = new FormDataMultiPart()
.field("file", is, MediaType.TEXT_PLAIN_TYPE) // 'is' is an inputstream from earlier in code.
.field("obfuscatedValue1", obfuscatedValue1)
.field("obfuscatedValue2", obfuscatedValue2)
.field("obfuscatedValue3", obfuscatedValue3)
.field("obfuscatedValue4", obfuscatedValue4)
.field("obfuscatedValue5", obfuscatedValue5);
storedAsset = requestBuilder.post(UUID.class, part);
However, I need to pass a map of additional parameters that will have an unknown number of values/names. From what I've seen, there is no easy way to do this using the FormDataParam annotation like my previous example.
Based upon various internet searches related to Jersey file uploads, I've attempted to convert it to use MultivaluedMap with the content type set to "application/x-www-form-urlencoded" so it resembles this:
@POST
@Path("asset/{value}/")
@Consumes("application/x-www-form-urlencoded")
public UUID uploadBlob(@PathParam("value") Integer value, MultivaluedMap<String, String> formParams) {
    ....
}
It's my understanding that MultivaluedMap is intended to obtain a general map of form parameters (and as such, cannot play nicely in the same method with @FormDataParam annotations). If I can pass all this information from the client inside some sort of map, I think I can figure out how to parse the map to grab and 'doMagic()' on the data; I don't think I'll have a problem there.
What I AM fairly confused about is how to format the request client-side code when using this second method within the jersey framework. Can anyone provide some guidance for the situation, or some suggestions on how to proceed? I'm considering trying the solution proposed here and developing a custom xml adapter to deal with this situation, and sending xml instead of multipart-form-data but I'm still confused how this would interact with the InputStream value that will need to be passed. It appears the examples with MultivaluedMap that I've seen only deal with String data.
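One direction worth exploring (an untested sketch, assuming the Jersey 1.x multipart API; extraMetadata is a hypothetical Map<String, String>): FormDataMultiPart.field() can simply be called in a loop on the client, and the resource method can accept the whole FormDataMultiPart instead of individual @FormDataParam arguments, then pull out the known fields and iterate over the rest:

```java
// Client side: add the known fields, then whatever extra metadata exists
FormDataMultiPart part = new FormDataMultiPart()
        .field("file", is, MediaType.TEXT_PLAIN_TYPE);
for (Map.Entry<String, String> entry : extraMetadata.entrySet()) {
    part.field(entry.getKey(), entry.getValue());
}
storedAsset = requestBuilder.post(UUID.class, part);

// Server side: take the whole multipart body and walk its parts
@POST
@Path("asset/{obfuscatedValue0}/")
@Consumes(MediaType.MULTIPART_FORM_DATA)
public UUID uploadBlob(@PathParam("obfuscatedValue0") Integer obfuscatedValue0,
        FormDataMultiPart multiPart) {
    InputStream uploadedInputStream =
            multiPart.getField("file").getValueAs(InputStream.class);
    // Every other body part is a metadata field
    for (Map.Entry<String, List<FormDataBodyPart>> field
            : multiPart.getFields().entrySet()) {
        ....
    }
}
```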

Writing to a PDF from inside a GAE app

I need to read several megabytes (raw text strings) out of my GAE Datastore and then write them all to a new PDF file, and then make the PDF file available for the user to download.
I am well aware of the sandbox restrictions that prevent you from writing to the file system. I am wondering if there is a crafty way of creating a PDF in-memory (or a combo of memory and the blobstore) and then storing it somehow so that the client-side (browser) can actually pull it down as a file and save it locally.
This is probably a huge stretch, but my only other option is to farm this task out to a non-GAE server, which I would like to avoid at all cost, even if it takes a lot of extra development on my end. Thanks in advance.
You can definitely achieve your use case using GAE itself. Here are the steps that you should follow at a high level:
Download the excellent iText library, a Java library for working with PDFs. First build out your Java code to generate the PDF content. Check out various examples at: http://itextpdf.com/book/toc.php
Since you cannot write to a file directly, you need to generate your PDF content in bytes and then write a Servlet that acts as a download Servlet. The Servlet will use the Response object to open a stream, set the MIME headers (filename, filetype) and write the PDF contents to the stream. A browser will automatically present a download option when you do that.
Your Download Servlet will have high level code that looks like this:
public class DownloadPDF extends HttpServlet {
    public void doGet(HttpServletRequest req, HttpServletResponse res)
            throws ServletException, IOException {
        // Extract some request parameters, fetch your data and generate your document
        String fileName = "<SomeFileName>.pdf";
        res.setContentType("application/pdf");
        res.setHeader("Content-Disposition", "attachment;filename=\"" + fileName + "\"");
        writePDF(<SomeObjectData>, res.getOutputStream());
    }
}
Remember, the writePDF method above is your own method, in which you use the iText library's Document and other classes to generate the data and write it to the OutputStream passed in as the second parameter.
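A sketch of what such a writePDF method might look like (assuming iText 5's Document/PdfWriter API; the class name and the plain-text content are illustrative):

```java
import java.io.OutputStream;

import com.itextpdf.text.Document;
import com.itextpdf.text.DocumentException;
import com.itextpdf.text.Paragraph;
import com.itextpdf.text.pdf.PdfWriter;

public class PdfRenderer {
    // Writes the given text straight to the servlet's OutputStream;
    // no file system access is needed, which suits the GAE sandbox.
    static void writePDF(String text, OutputStream out)
            throws DocumentException {
        Document document = new Document();
        PdfWriter.getInstance(document, out);
        document.open();
        document.add(new Paragraph(text));
        document.close();
    }
}
```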
While I'm not aware of how PDF generation works on Google App Engine, especially in Java, once you have the PDF you can definitely store it and later serve it.
I suppose the generation of the PDF will take more than 30 seconds, so you will have to consider using the Task Queue Java API for this process.
After you have the file in memory you can simply write it to the Blobstore and later serve it as a regular blob. In the Blobstore overview you will find a fully functional example of how to upload, write and serve your binary data (blobs) on Google App Engine.
I found a couple of solutions by googling. Please note that I have not actually tried these libraries, but hopefully they will be of help.
PDFJet (commercial)
Write a Google Drive document and export to PDF
