WS Download operation with MTOM - java

I want to stream BLOB files from an Oracle database directly to a WS client via a web service with MTOM.
I thought I had found a way, described here:
http://www.java.net/forum/topic/glassfish/metro-and-jaxb/mtom-best-practices
but after taking a look at InputStreamDataSource and javax.mail.util.ByteArrayDataSource, I realized that they actually hold a byte[] of the document in memory, so the streaming idea is in vain, because what I am trying to avoid is having multiple documents fully in memory at the same time.
So how can I stream from the DB via WS and MTOM to a WS client?
Any idea?
Thanks
Cris

I tried experimenting and finally got some positive results.
In order to stream from the DB directly to the client's browser, the approach above
is valid, but the InputStreamDataSource should look like this:
public class InputStreamDataSource implements DataSource {
    private InputStream inputStream;

    public InputStreamDataSource(InputStream inputStream) {
        this.inputStream = inputStream;
    }

    public InputStream getInputStream() throws IOException {
        return inputStream;
    }

    public OutputStream getOutputStream() throws IOException {
        throw new UnsupportedOperationException("Not implemented");
    }

    public String getContentType() {
        return "*/*";
    }

    public String getName() {
        return "InputStreamDataSource";
    }
}
What I was afraid of was that once I closed the input stream myself...
the WS client would not receive the binary content...
Then I checked, and it turns out the DataHandler creates a new thread and closes the input stream itself.
I was able to stream 500MB from the DB to the client quickly and with a low memory footprint!
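For reference, here is a minimal sketch of how the service method can look when streaming a BLOB this way. The table and column names and the way the Connection is obtained are illustrative assumptions, not from the original post:

@MTOM
@WebService
public class DocumentService {
    private Connection connection; // obtained elsewhere (pool, injection, ...)

    @WebMethod
    public @XmlMimeType("application/octet-stream") DataHandler download(String id) throws SQLException {
        // Illustrative query; DOCS/CONTENT are assumed names
        PreparedStatement ps = connection.prepareStatement("SELECT content FROM docs WHERE id = ?");
        ps.setString(1, id);
        ResultSet rs = ps.executeQuery();
        rs.next();
        Blob blob = rs.getBlob(1);
        // The DataHandler consumes (and, as noted above, closes) this stream
        // itself when the MTOM attachment is written out, so don't close it here.
        return new DataHandler(new InputStreamDataSource(blob.getBinaryStream()));
    }
}

Note that the statement and connection must stay open until the response has been written, since the BLOB stream is only read after the method returns; their cleanup is omitted here.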

Related

Okio/Okhttp download file using BufferedSink and decode Base64 without having whole file in memory multiple times

Got a bit of a problem at the moment: for my "in-app" update I'm downloading the new Base64-encoded .apk from my webspace. I have the functionality pretty much down; this is the code without decoding.
public void onResponse(Call call, Response response) throws IOException {
    if (response.isSuccessful()) {
        ResponseBody body = response.body();
        BufferedSource source = body.source();
        source.request(Long.MAX_VALUE);
        Buffer buffer = source.buffer();
        String rString = buffer.clone().readString(Charset.forName("UTF-8"));
        Log.i("Test: ", AppUtils.decodeBase64(rString));
        if (rString.equals("xxx")) {
            EventBus.getDefault().post(new KeyNotValid());
            dispatcher.cancelAll();
        } else {
            EventBus.getDefault().post(new SaveKey(apikey));
            BufferedSink sink = Okio.buffer(Okio.sink(myFile));
            sink.writeAll(source);
            sink.flush();
            sink.close();
        }
    }
}
The Buffer/Log is not really necessary; I'm just using it to check the response during testing.
How would I go about decoding the bytes before I write them to the sink?
I tried doing it via ByteString, but I couldn't find a way to write the decoded string back to a BufferedSource.
Most alternatives are pretty slow, like reopening the file afterwards, reading the bytes into memory, decoding them, and writing them back.
Would really appreciate any help on this
cheers
You can already consume the response as an InputStream via ResponseBody.byteStream(). You can decorate this stream with https://commons.apache.org/proper/commons-codec/apidocs/org/apache/commons/codec/binary/Base64InputStream.html and use it to read a stream of decoded bytes and write them to the Sink for the file in chunks.
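In code, that could look something like this (a sketch, assuming commons-codec is on the classpath; myFile is the target file from the question):

// Decorating the raw byte stream decodes on the fly, chunk by chunk
InputStream decoded = new Base64InputStream(response.body().byteStream());
try (BufferedSink sink = Okio.buffer(Okio.sink(myFile))) {
    // Nothing here buffers the whole body in memory
    sink.writeAll(Okio.source(decoded));
}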
I know this answer arrives quite late, and Yuri's answer is technically correct, but I think the most idiomatic way to do this is to take advantage of the composition pattern promoted by Okio to create a Source that decodes from Base64 (or a Sink that encodes to Base64, if you need that).
Here's a little proof of concept (I'm sure it can be improved):
public class Base64Source implements Source {
    private Source delegate;
    private Base64.Decoder decoder; // Using the Java 8 API, but it can be any library

    public Base64Source(Source delegate) {
        this(delegate, Base64.getDecoder());
    }

    public Base64Source(Source delegate, Base64.Decoder decoder) {
        this.delegate = delegate;
        this.decoder = decoder;
    }

    @Override
    public long read(Buffer sink, long byteCount) throws IOException {
        Buffer buffer = new Buffer();
        long actualRead = this.delegate.read(buffer, byteCount);
        if (actualRead == -1) {
            return -1;
        }
        byte[] encoded = buffer.readByteArray(actualRead);
        byte[] decoded = decoder.decode(encoded);
        sink.write(decoded);
        return decoded.length;
    }

    @Override
    public Timeout timeout() {
        return this.delegate.timeout();
    }

    @Override
    public void close() throws IOException {
        this.delegate.close();
    }
}
And here's how it can be used:
BufferedSource source = Okio.buffer(new Base64Source(originalSource));
BufferedSink sink = ... // create sink
sink.writeAll(source);
// Don't forget to close the source/sink to flush and free resources
sink.close();
source.close();
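One caveat with this proof of concept: read() decodes whatever chunk happens to arrive, so if byteCount splits a 4-character Base64 group, the decode can fail or produce wrong bytes. A hardened version would need to carry the undecoded remainder over to the next read() call.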

Mocking/Testing HTTP Get Request

I'm trying to write unit tests for my program and use mock data. I'm a little confused about how to intercept an HTTP GET request to a URL.
My program calls a URL of our API and gets back a simple XML file. I would like the test, instead of fetching the XML file from the API online, to receive a predetermined XML file from me, so that I can compare the output to the expected output and determine whether everything is working correctly.
I was pointed to Mockito and have been looking at many different examples, such as this SO post, How to use mockito for testing a REST service?, but it's not becoming clear to me how to set it all up and how to mock the data (i.e., return my own XML file whenever the call to the URL is made).
The only thing I can think of is having another program running locally on Tomcat, and in my test passing a special URL that calls the locally running program, which then returns the XML file I want to test with. But that just seems like overkill, and I don't think it would be acceptable. Could someone please point me in the right direction?
private static InputStream getContent(String uri) {
    HttpURLConnection connection = null;
    try {
        URL url = new URL(uri);
        connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("GET");
        connection.setRequestProperty("Accept", "application/xml");
        return connection.getInputStream();
    } catch (MalformedURLException e) {
        LOGGER.error("internal error", e);
    } catch (IOException e) {
        LOGGER.error("internal error", e);
    } finally {
        if (connection != null) {
            connection.disconnect();
        }
    }
    return null;
}
I am using Spring Boot and other parts of the Spring Framework if that helps.
Part of the problem is that you're not breaking things down into interfaces. You need to wrap getContent into an interface and provide a concrete class implementing the interface. This concrete class will then
need to be passed into any class that uses the original getContent. (This is essentially dependency inversion.) Your code will end up looking something like this.
public interface IUrlStreamSource {
    InputStream getContent(String uri);
}

public class SimpleUrlStreamSource implements IUrlStreamSource {
    protected final Logger LOGGER;

    public SimpleUrlStreamSource(Logger LOGGER) {
        this.LOGGER = LOGGER;
    }

    // pulled out to allow test classes to provide
    // a version that returns mock objects
    protected URL stringToUrl(String uri) throws MalformedURLException {
        return new URL(uri);
    }

    public InputStream getContent(String uri) {
        HttpURLConnection connection = null;
        try {
            URL url = stringToUrl(uri);
            connection = (HttpURLConnection) url.openConnection();
            connection.setRequestMethod("GET");
            connection.setRequestProperty("Accept", "application/xml");
            return connection.getInputStream();
        } catch (MalformedURLException e) {
            LOGGER.error("internal error", e);
        } catch (IOException e) {
            LOGGER.error("internal error", e);
        } finally {
            if (connection != null) {
                connection.disconnect();
            }
        }
        return null;
    }
}
Now code that was using the static getContent should go through an IUrlStreamSource instance's getContent(). You then provide the object you want to test with a mocked IUrlStreamSource rather than a SimpleUrlStreamSource.
If you want to test SimpleUrlStreamSource (though there's not much to test), you can create a derived class that provides an implementation of stringToUrl that returns a mock (or throws an exception).
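A minimal sketch of such a test, assuming Mockito is on the classpath (the URL, the payload, and the MyXmlProcessor class under test are hypothetical):

// Stub the source so the class under test never touches the network
IUrlStreamSource source = Mockito.mock(IUrlStreamSource.class);
Mockito.when(source.getContent("http://example.com/api"))
       .thenReturn(new ByteArrayInputStream(
               "<result>expected</result>".getBytes(StandardCharsets.UTF_8)));
// Hand the mock to the class under test and assert on what it produces
MyXmlProcessor processor = new MyXmlProcessor(source);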
The other answers here advise you to refactor your code to use a sort of provider that you can replace during your tests, which is the better approach.
If that isn't a possibility for whatever reason, you can install a custom URLStreamHandlerFactory that intercepts the URLs you want to "mock" and falls back to the standard implementation for URLs that shouldn't be intercepted.
Note that this is irreversible: you can't remove the InterceptingUrlStreamHandlerFactory once it's installed, and the only way to get rid of it is to restart the JVM. You could implement a flag in it to disable it and return null for all lookups, which would produce the same result.
URLInterceptionDemo.java:
public class URLInterceptionDemo {
    private static final String INTERCEPT_HOST = "dummy-host.com";

    public static void main(String[] args) throws IOException {
        // Install our own stream handler factory
        URL.setURLStreamHandlerFactory(new InterceptingUrlStreamHandlerFactory());
        // Fetch an intercepted URL
        printUrlContents(new URL("http://dummy-host.com/message.txt"));
        // Fetch another URL that shouldn't be intercepted
        printUrlContents(new URL("http://httpbin.org/user-agent"));
    }

    private static void printUrlContents(URL url) throws IOException {
        try (InputStream stream = url.openStream();
             BufferedReader reader = new BufferedReader(new InputStreamReader(stream))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }

    private static class InterceptingUrlStreamHandlerFactory implements URLStreamHandlerFactory {
        @Override
        public URLStreamHandler createURLStreamHandler(final String protocol) {
            if ("http".equalsIgnoreCase(protocol)) {
                // Intercept HTTP requests
                return new InterceptingHttpUrlStreamHandler();
            }
            return null;
        }
    }

    private static class InterceptingHttpUrlStreamHandler extends URLStreamHandler {
        @Override
        protected URLConnection openConnection(final URL u) throws IOException {
            if (INTERCEPT_HOST.equals(u.getHost())) {
                // This URL should be intercepted; return the file from the classpath
                return URLInterceptionDemo.class.getResource(u.getHost() + "/" + u.getPath()).openConnection();
            }
            // Fall back to the default handler; by passing the default handler here we won't end up
            // in the factory again, which would trigger infinite recursion
            return new URL(null, u.toString(), new sun.net.www.protocol.http.Handler()).openConnection();
        }
    }
}
dummy-host.com/message.txt:
Hello World!
When run, this app will output:
Hello World!
{
"user-agent": "Java/1.8.0_45"
}
It's pretty easy to change the criteria of how you decide which URLs to intercept and what you return instead.
The answer depends on what you are testing.
If you need to test the processing of the InputStream
If getContent() is called by some code that processes the data returned by the InputStream, and you want to test how the processing code handles specific sets of input, then you need to create a seam to enable testing. I would simply move getContent() into a new class, and inject that class into the class that does the processing:
public interface ContentSource {
    InputStream getContent(String uri);
}
You could create an HttpContentSource that uses URL.openConnection() (or, better yet, the Apache HttpClient code).
Then you would inject the ContentSource into the processor:
public class Processor {
    private final ContentSource contentSource;

    @Inject
    public Processor(ContentSource contentSource) {
        this.contentSource = contentSource;
    }
    ...
}
The code in Processor could be tested with a mock ContentSource.
If you need to test the fetching of the content
If you want to make sure that getContent() works, you could create a test that starts a lightweight in-memory HTTP server that serves the expected content, and have getContent() talk to that server. That does seem like overkill, but a sketch follows.
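The JDK's built-in com.sun.net.httpserver makes this reasonably cheap; a sketch (path and payload are illustrative):

// Bind to an ephemeral port so tests don't collide
HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
server.createContext("/api", exchange -> {
    byte[] body = "<response>expected</response>".getBytes(StandardCharsets.UTF_8);
    exchange.getResponseHeaders().set("Content-Type", "application/xml");
    exchange.sendResponseHeaders(200, body.length);
    try (OutputStream os = exchange.getResponseBody()) {
        os.write(body);
    }
});
server.start();
String uri = "http://localhost:" + server.getAddress().getPort() + "/api";
// ... call getContent(uri), assert on the result ...
server.stop(0);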
If you need to test a large subset of the system with fake data
If you want to make sure things work end to end, write an end-to-end system test. Since you indicated you use Spring, you can use Spring to wire together parts of the system (or to wire the entire system, but with different properties). You have two choices:
Have the system test start a local HTTP server, and when you create your system in the test, configure it to talk to that server. See the answers to this question for ways to start the HTTP server.
Configure Spring to use a fake implementation of ContentSource. This gets you slightly less confidence that everything works end to end, but it will be faster and less flaky.
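For the second choice, the fake can be as small as a test-profile bean; a sketch assuming Spring's annotation config (the profile name and payload are illustrative):

@Configuration
@Profile("test")
public class FakeContentSourceConfig {
    @Bean
    public ContentSource contentSource() {
        // Serve canned XML instead of hitting the real API
        return uri -> new ByteArrayInputStream(
                "<response>fake</response>".getBytes(StandardCharsets.UTF_8));
    }
}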

is there an existing FileInputStream delete on close?

Is there an existing way to have a FileInputStream delete the underlying file automatically when closed?
I was planning to write my own utility class extending FileInputStream and do it myself, but I'm kinda surprised that there isn't something already out there.
edit: My use case is a Struts 2 action that returns an InputStream for a file download from a page. As far as I can tell, I don't get notified when the action is finished or the FileInputStream is no longer in use, and I don't want the (potentially large) temporary files generated for download left lying around.
The question wasn't Struts 2 specific, so I didn't include that info originally and complicate the question.
There's no such thing in the standard libraries, nor in any of the apache-commons libs, so you need something like:
public class DeleteOnCloseFileInputStream extends FileInputStream {
    private File file;

    public DeleteOnCloseFileInputStream(String fileName) throws FileNotFoundException {
        this(new File(fileName));
    }

    public DeleteOnCloseFileInputStream(File file) throws FileNotFoundException {
        super(file);
        this.file = file;
    }

    public void close() throws IOException {
        try {
            super.close();
        } finally {
            if (file != null) {
                file.delete();
                file = null;
            }
        }
    }
}
I know this is a fairly old question; however, it's one of the first results in Google, and Java 7+ has this functionality built in:
Path path = Paths.get(filePath);
InputStream fileStream = Files.newInputStream(path, StandardOpenOption.DELETE_ON_CLOSE);
There are a couple of caveats with this approach, though; they're written up here, but the gist is that the implementation makes a best-effort attempt to delete the file when the input stream is closed and, if that fails, makes another best-effort attempt when the JVM terminates. It is intended for use with temp files that are used solely by a single instance of the JVM. If the application is security-sensitive, there are a few other caveats as well.
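A short usage sketch (the file name and payload variable are illustrative):

// Write the payload to a temp file, then hand out a stream that
// deletes the file when it is closed
Path tmp = Files.createTempFile("download", ".bin");
Files.write(tmp, payload);
InputStream in = Files.newInputStream(tmp, StandardOpenOption.DELETE_ON_CLOSE);
// whoever closes "in" (e.g. the framework serving the download) removes tmp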
Can you use File.deleteOnExit() before opening the file?
EDIT: Or you can subclass FileInputStream so that it deletes the file on close():
class MyFileInputStream extends FileInputStream {
    private final File file;
    MyFileInputStream(File file) throws FileNotFoundException { super(file); this.file = file; }
    public void close() throws IOException { super.close(); file.delete(); }
}
I know this is an old question, but I just ran into this issue, and found another answer: javax.ws.rs.core.StreamingOutput.
Here's how I used it:
File downloadFile = ...figure out what file to download...
StreamingOutput so = new StreamingOutput() {
    public void write(OutputStream os) throws IOException {
        FileUtils.copyFile(downloadFile, os);
        downloadFile.delete();
    }
};
ResponseBuilder response = Response.ok(so, mimeType);
response.header("Content-Disposition", "attachment; filename=\"" + downloadFile.getName() + "\"");
result = response.build();

JSP compilation to string or in memory bytearray with Tomcat/Websphere

I am doing conversion to image and PDF output. I need an input HTML document that is generated by our application's JSPs. Essentially, I need to render the final output of a JSP-based application to a String or to memory, and then use that string for other processing.
What are some ways I can invoke the JSP renderer to get the final HTML content that is normally sent to the user?
Ideally, I am looking for something that will work across application servers like WebSphere, but something Tomcat-specific would also work.
There are a couple of other approaches, but I think rendering the JSP (which may include sub-JSPs) is the best one.
Optional paths that I would rather stay away from:
I could perform a network request to the page using the Socket APIs and then read the final output rendered by that particular page. This is probably the next-best option, but we work with multiple servers and JVMs, so targeting the page I need would be complicated.
Use a filter to get the final page output. This is OK, but I have always had problems with filters and IllegalStateExceptions. It never seems to work 100% the way I need it to.
It seems like this should be simple. The JSP compiler is essentially just a library for parsing an input JSP document and its subdocuments and then outputting HTML content. I would like to invoke that process through Java code, on the server and possibly as a standalone console application.
This is a downright irritating problem, one I've had to handle a few times and one I've never found a satisfactory solution to.
The basic problem is that the servlet API is of no help here, so you have to trick it. My solution is to write a subclass of HttpServletResponseWrapper which overrides the getWriter() and getOutputStream() methods and captures the data into a buffer. You then forward() your request to the URI of the JSP you want to capture, substituting your wrapper for the original response. You then extract the data from the buffer, manipulate it, and write the end result back to the original response.
Here's my code that does this:
public class CapturingResponseWrapper extends HttpServletResponseWrapper {
    private final OutputStream buffer;
    private PrintWriter writer;
    private ServletOutputStream outputStream;

    public CapturingResponseWrapper(HttpServletResponse response, OutputStream buffer) {
        super(response);
        this.buffer = buffer;
    }

    @Override
    public ServletOutputStream getOutputStream() {
        if (outputStream == null) {
            outputStream = new DelegatingServletOutputStream(buffer);
        }
        return outputStream;
    }

    @Override
    public PrintWriter getWriter() {
        if (writer == null) {
            writer = new PrintWriter(buffer);
        }
        return writer;
    }

    @Override
    public void flushBuffer() throws IOException {
        if (writer != null) {
            writer.flush();
        }
        if (outputStream != null) {
            outputStream.flush();
        }
    }
}
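The DelegatingServletOutputStream referenced above is not part of the servlet API; Spring's spring-test module ships a similar class, or you can hand-roll a minimal version like the sketch below (on Servlet 3.1+ you would also have to implement isReady() and setWriteListener()):

public class DelegatingServletOutputStream extends ServletOutputStream {
    private final OutputStream target;

    public DelegatingServletOutputStream(OutputStream target) {
        this.target = target;
    }

    @Override
    public void write(int b) throws IOException {
        // Everything written to the response ends up in the capture buffer
        target.write(b);
    }
}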
The code to use it can be something like this:
HttpServletRequest originalRequest = ...
HttpServletResponse originalResponse = ...
ByteArrayOutputStream bufferStream = new ByteArrayOutputStream();
CapturingResponseWrapper responseWrapper = new CapturingResponseWrapper(originalResponse, bufferStream);
originalRequest.getRequestDispatcher("/my.jsp").forward(originalRequest, responseWrapper);
responseWrapper.flushBuffer();
byte[] buffer = bufferStream.toByteArray();
// now use the data
It's very ugly, but it's the best solution I've found. In case you're wondering, the wrapper has to contain the original response because the servlet spec says you cannot substitute a completely different request or response object when you forward; you have to use the originals, or wrapped versions of them.

How can you pipe an OutputStream to a StreamingDataHandler?

I've got a Java web service in JAX-WS that returns an OutputStream from another method. I can't seem to figure out how to stream the OutputStream into the returned DataHandler any other way than to create a temporary file, write to it, then open it back up again as an InputStream. Here's an example:
@MTOM
@WebService
class Example {
    @WebMethod
    public @XmlMimeType("application/octet-stream") DataHandler service() throws IOException {
        // Create a temporary file to write to
        File fTemp = File.createTempFile("my", "tmp");
        OutputStream out = new FileOutputStream(fTemp);
        // Method takes an output stream and writes to it
        writeToOut(out);
        out.close();
        // Create a data source and data handler based on that temporary file
        DataSource ds = new FileDataSource(fTemp);
        DataHandler dh = new DataHandler(ds);
        return dh;
    }
}
The main issue is that the writeToOut() method can return data that are far larger than the computer's memory. That's why the method is using MTOM in the first place - to stream the data. I can't seem to wrap my head around how to stream the data directly from the OutputStream that I need to provide to the returned DataHandler (and ultimately the client, who receives the StreamingDataHandler).
I've tried playing around with PipedInputStream and PipedOutputStream, but those don't seem to be quite what I need, because the DataHandler would need to be returned after the PipedOutputStream is written to.
Any ideas?
I figured out the answer, along the lines that Christian was talking about (creating a new thread to execute writeToOut()):
@MTOM
@WebService
class Example {
    @WebMethod
    public @XmlMimeType("application/octet-stream") DataHandler service() throws IOException {
        // Create piped output stream, wrap it in a final array so that the
        // OutputStream doesn't need to be final before sending it to the new Thread.
        PipedOutputStream out = new PipedOutputStream();
        InputStream in = new PipedInputStream(out);
        final Object[] args = { out };
        // Create a new thread which writes to out.
        new Thread(
            new Runnable() {
                public void run() {
                    writeToOut(args);
                    try {
                        ((OutputStream) args[0]).close();
                    } catch (IOException e) {
                        // Closing the pipe failed; the reader will see a broken pipe
                        throw new RuntimeException(e);
                    }
                }
            }
        ).start();
        // Return the InputStream to the client.
        DataSource ds = new ByteArrayDataSource(in, "application/octet-stream");
        DataHandler dh = new DataHandler(ds);
        return dh;
    }
}
It is a tad more complex due to final variables, but as far as I can tell this is correct. When the thread is started, it blocks when it first tries to call out.write(); at the same time, the input stream is returned to the client, who unblocks the write by reading the data. (The problem with my previous implementations of this solution was that I wasn't properly closing the stream, and thus running into errors.)
Sorry, I only did this for C# and not Java, but I think your method should launch a thread to run writeToOut(out); in parallel. You need to create a special stream and pass it to the new thread, which gives that stream to writeToOut. After starting the thread you return that stream object to your caller.
If you only have a method that writes to a stream and returns afterwards, and another method that consumes a stream and returns afterwards, there is no other way.
Of course the tricky part is to get hold of such a (multithreading-safe) stream: it must block either side if an internal buffer is too full.
I don't know whether a Java pipe stream works for that.
Wrapper pattern ? :-).
Custom javax.activation.DataSource implementation (only 4 methods) to be able to do this ?

return new DataHandler(new DataSource() {
    // implement getOutputStream to return the stream used inside writeToOut()
    ...
});

I don't have the IDE available to test this, so I'm only making a suggestion. I would also need the writeToOut general layout :-).
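For what it's worth, here's a hedged sketch of that idea using a pipe; writeToOut would be handed getOutputStream() on a separate thread, as in the accepted answer above:

public class PipedDataSource implements DataSource {
    private final PipedOutputStream out = new PipedOutputStream();
    private final PipedInputStream in;

    public PipedDataSource() throws IOException {
        // The pipe provides the blocking, bounded buffer between writer and reader
        in = new PipedInputStream(out);
    }

    public InputStream getInputStream() { return in; }
    public OutputStream getOutputStream() { return out; }
    public String getContentType() { return "application/octet-stream"; }
    public String getName() { return "PipedDataSource"; }
}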
In my application I use an InputStreamDataSource implementation that takes an InputStream as constructor argument instead of a File as in FileDataSource. It has worked so far.
public class InputStreamDataSource implements DataSource {
    ByteArrayOutputStream buffer = new ByteArrayOutputStream();
    private final String name;

    public InputStreamDataSource(InputStream inputStream, String name) {
        this.name = name;
        try {
            int nRead;
            byte[] data = new byte[16384];
            while ((nRead = inputStream.read(data, 0, data.length)) != -1) {
                buffer.write(data, 0, nRead);
            }
            buffer.flush();
            inputStream.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

    @Override
    public String getContentType() {
        return new MimetypesFileTypeMap().getContentType(name);
    }

    @Override
    public InputStream getInputStream() throws IOException {
        return new ByteArrayInputStream(buffer.toByteArray());
    }

    @Override
    public String getName() {
        return name;
    }

    @Override
    public OutputStream getOutputStream() throws IOException {
        throw new IOException("Read-only data");
    }
}
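Note that, unlike the streaming InputStreamDataSource at the top of this page, this variant copies the entire stream into a ByteArrayOutputStream up front, so the whole payload sits in memory; it trades the streaming benefit for the ability to read the data more than once.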
