render video stream as partial content rather than full stream to chrome - java

We currently submit a video playback request to a back-end server, which sends the full stream as an InputStream to be played back in the browser window. This works fine, but has an added complication: the seek function does not work in Chrome. The suggested solution is to tell the web server that it needs to accept a byte range and then deliver the stream in partial byte ranges. I am not sure whether this will resolve the situation, but my question is how to return the stream in byte ranges, considering the following is the way it is done now:
InputStream is = new FileInputStream(ndirectoryFile);
....
// (calling class request)
stream = videoWrapper.getVideo(id, address);
If I read the file in byte ranges, do I just loop through the file? And how do I send the response:
InputStream is = new ByteArrayInputStream(new byte[] { 0, 1, 2 });
ByteArrayOutputStream buffer = new ByteArrayOutputStream();
int nRead;
byte[] data = new byte[1024];
while ((nRead = is.read(data, 0, data.length)) != -1) {
buffer.write(data, 0, nRead);
}
buffer.flush();
byte[] byteArray = buffer.toByteArray();
The initial InputStream gets passed through quite a few classes along the way prior to sending the final response. Any ideas, please?
EDIT:
I would like to just understand the HTML5 video issue with Chrome. There are quite a few posts on setting the server response headers to include Accept-Ranges, Content-Length and Content-Range, which would tell Chrome to download the byte range and so allow the seek feature to work. As video playback seeking works in Firefox, I should not have to change how I deliver the stream, or would I? Would I still have to submit partial ranges of the video from the server, and how?
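For reference, actually serving partial ranges would mean parsing the Range request header and writing back only that slice of the file, roughly as in the sketch below. This is only an illustration, assuming a servlet back end with direct access to the video file; the method and variable names are made up, and error handling (malformed ranges, 416 responses) is omitted.
import java.io.*;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical sketch: serve the byte range requested by the browser.
static void serveRange(File videoFile, HttpServletRequest request,
        HttpServletResponse response) throws IOException {
    long length = videoFile.length();
    long start = 0;
    long end = length - 1;
    // A Range header looks like "bytes=1000-2000" or "bytes=1000-"
    String range = request.getHeader("Range");
    if (range != null && range.startsWith("bytes=")) {
        String[] parts = range.substring("bytes=".length()).split("-");
        start = Long.parseLong(parts[0]);
        if (parts.length > 1 && parts[1].length() > 0) {
            end = Long.parseLong(parts[1]);
        }
    }
    long contentLength = end - start + 1;
    response.setStatus(206); // 206 Partial Content
    response.setHeader("Accept-Ranges", "bytes");
    response.setHeader("Content-Range", "bytes " + start + "-" + end + "/" + length);
    response.setHeader("Content-Length", String.valueOf(contentLength));
    RandomAccessFile raf = new RandomAccessFile(videoFile, "r");
    OutputStream out = response.getOutputStream();
    try {
        raf.seek(start);
        byte[] buf = new byte[8192];
        long remaining = contentLength;
        int n;
        while (remaining > 0
                && (n = raf.read(buf, 0, (int) Math.min(buf.length, remaining))) != -1) {
            out.write(buf, 0, n);
            remaining -= n;
        }
    } finally {
        raf.close();
    }
}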

I resolved my situation without having to change any streaming code: I just ensured the correct range headers were applied to the response so that Chrome handled the playback.
Accept-Ranges: bytes
Content-Range: bytes 0-1025/905357
Content-Length: 905357
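In code that amounts to something like the snippet below, applied before the existing stream is written out. This is only a sketch, assuming a servlet back end; videoFile, the MIME type and the 206 status are illustrative, and the actual streaming code is untouched.
// Hypothetical snippet: the existing streaming code stays as it is,
// only the range-related headers are added to the response first.
long length = videoFile.length(); // e.g. 905357 in the example above
response.setStatus(206); // Partial Content
response.setContentType("video/mp4"); // whatever the real MIME type is
response.setHeader("Accept-Ranges", "bytes");
response.setHeader("Content-Range", "bytes 0-" + (length - 1) + "/" + length);
response.setHeader("Content-Length", String.valueOf(length));
// ... then copy the InputStream to response.getOutputStream() as before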

Related

Use socket to get picture from web Java

I am doing a project to get a picture from a website (any one will be OK), and I know that I could use the URL to get it. But I want to understand TCP better, so I use a socket to get it. That all works, but the problem is that the data stream I received contains the HTTP response header, and I don't know how to filter it out.
Here is my code (just a part of it):
Socket socket = new Socket(netAdress, 80);
BufferedWriter bw = new BufferedWriter(new OutputStreamWriter(socket.getOutputStream()));
bw.write("GET HTTP://" + sources + " HTTP/1.0\r\n");
bw.write("\r\n");
bw.flush(); // send the request
InputStream in = socket.getInputStream();
BufferedOutputStream writeImg = new BufferedOutputStream(new FileOutputStream(adj));
byte[] data = new byte[512];
int len = 0;
boolean OK = false;
while ((len = in.read(data)) > 0) {
    writeImg.write(data, 0, len);
    writeImg.flush();
} // receive the data stream
and this is what I received, and the picture couldn't open:
[screenshot of the received data stream]
If you know how to solve the problem, or you have a better idea of how to get the picture by socket, please tell me. Thanks.
... this is what I received, and the picture couldn't open
Yup. The response starts with an HTTP response header.
If you know how to solve the problem ...
Well, this is a hack, and NOT recommended (and it won't work in general!) but the HTTP response header ends with the first <CR> <NL> <CR> <NL> sequence (ASCII control codes). So if you strip off everything up to and including that sequence you should have an image. (Unless it is compressed, or encoded, or a multi-part, or .....)
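A rough sketch of that hack might look like this; the method and file names are just placeholders, and it buffers the whole response in memory, which is another reason not to do it for real.
import java.io.*;

// Sketch of the header-stripping hack: buffer the whole response, find the
// first CR NL CR NL, and write only the bytes after it to the image file.
static void saveImageBody(InputStream in, File imageFile) throws IOException {
    ByteArrayOutputStream raw = new ByteArrayOutputStream();
    byte[] buf = new byte[512];
    int len;
    while ((len = in.read(buf)) > 0) {
        raw.write(buf, 0, len);
    }
    byte[] all = raw.toByteArray();
    int bodyStart = -1;
    for (int i = 0; i + 3 < all.length; i++) {
        if (all[i] == '\r' && all[i + 1] == '\n'
                && all[i + 2] == '\r' && all[i + 3] == '\n') {
            bodyStart = i + 4;
            break;
        }
    }
    if (bodyStart < 0) {
        throw new IOException("No end of HTTP header found");
    }
    OutputStream out = new FileOutputStream(imageFile);
    try {
        out.write(all, bodyStart, all.length - bodyStart);
    } finally {
        out.close();
    }
}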
... or you have a better idea of how to get the picture.
A better idea is to use the URL. Seriously.
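For example, something along these lines (the URL and file name are placeholders) downloads the image without any manual header handling:
import java.io.InputStream;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

// Let URL/URLConnection deal with the HTTP protocol and just save the body.
URL imageUrl = new URL("http://example.com/picture.jpg");
InputStream in = imageUrl.openStream();
try {
    Files.copy(in, Paths.get("picture.jpg"), StandardCopyOption.REPLACE_EXISTING);
} finally {
    in.close();
}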

Call large URL in Java

How should I call a large URL in Java? I'm integrating the Scene7 image server with a Java application. Here I call a URL of around 10,000 characters which should return an image. What ways can I do this?
The way I wrote it is:
URL oURL = new URL("<*LONG LONG URL - approx. 10k characters*>");
HttpURLConnection connection = (HttpURLConnection) oURL.openConnection();
InputStream stream = connection.getInputStream();
// Either decode the image and re-encode it as PNG onto the servlet response:
BufferedImage bi = ImageIO.read(stream);
ImageIO.write(bi, "png", response.getOutputStream());
// ...or copy the raw bytes straight through instead (ImageIO.read consumes the
// stream, so only one of the two approaches can be used):
OutputStream outs = response.getOutputStream();
int len;
byte[] buf = new byte[1024];
while ((len = stream.read(buf)) > 0) {
    outs.write(buf, 0, len);
}
You won't get URLs that long to work reliably. The longest you can rely on working is around 2000 characters. Beyond that, you are liable to run into browser or server-side limits. (Or even limits in intermediate proxy servers)
You are going to need to "think outside the box". For example:
pass some of the information that is currently encoded in your URL in the form of POST data (sketched below),
pass some of the information in the form of custom request headers (ewww!!), or
come up with a clever way to encode (or compress) the informational part of the URL.
All of these are likely to require changes to the server side.
Reference:
What is the maximum length of a URL in different browsers?
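As a sketch of the first option, something like the following could work; the endpoint, parameter content and names are purely illustrative, and it assumes the server (or a wrapper in front of it) accepts POST:
import java.awt.image.BufferedImage;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import javax.imageio.ImageIO;

// Move the huge query string out of the URL and into the POST body.
URL url = new URL("http://my-image-server.example.com/render");
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setRequestMethod("POST");
conn.setDoOutput(true);
conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
String longQuery = "<the ~10k characters currently in the URL>";
OutputStream out = conn.getOutputStream();
out.write(longQuery.getBytes(StandardCharsets.UTF_8));
out.close();
InputStream in = conn.getInputStream();
BufferedImage image = ImageIO.read(in); // decode the returned image
in.close();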
While the HTTP 1.1 specification does not place a limit on the length (size) of a URL, the server that you are working with might. You should check the documentation for the server you are connecting to (in this case, Adobe's Scene7.)
Just getting an image from the URL is as easy as:
URL url = new URL("<extremely long URL>");
Image image = ImageIO.read(url);
So I'm wondering if you weren't asking for something else.

Getting the response of an HttpURLConnection back to the web browser correctly

Problem: Writing the response of an HTTP message back to the client (web browser) does not return the full page if the page has images. This happened when using strings, so I decided to use bytes, but I still have the same issue. I have been able to get the header from the request as a string and flush it to the client, but I am not sure what to do with the message body to ensure that it appears properly in the web browser.
// This portion takes the message from the HttpURLConnection input stream.
// The header has already been extracted.
// uc here represents an HttpURLConnection.
byte[] data = new byte[uc.getContentLength()];
int bytesRead = 0;
int offset = 0;
InputStream in = new BufferedInputStream(uc.getInputStream());
while (offset < uc.getContentLength()) {
    bytesRead = in.read(data, offset, data.length - offset);
    if (bytesRead == -1) break;
    offset += bytesRead;
}
I would advise you to write the bytes to your response as you read them, and use a small buffer; this way your server won't be affected by large memory usage. It is not good practice to load all the bytes into an array in the server's memory.
Here is a quick sample:
response.setContentType("text/html;charset=UTF-8");
OutputStream out = response.getOutputStream();
HttpURLConnection uc;
// Setup the HTTP connection...
InputStream in = uc.getInputStream();
byte[] b = new byte[1024];
int bytesRead;
while ((bytesRead = in.read(b)) != -1) {
    out.write(b, 0, bytesRead); // write only the bytes actually read
}
// Close the streams...
You seem to be proxying an HTML page with images, and you seem to be expecting that images in an HTML page are somehow automagically inlined in the HTML source code. This is absolutely not true. Images in HTML are represented by the <img> element with a src attribute pointing to a URL which the web browser has to invoke and download individually. Exactly the same story applies to other resources like CSS and JS files.
You basically need to parse the obtained HTML, scan for all <img src> (and if necessary also <link href> and <script src>) elements and change their URL to the URL of your proxy so that it can serve the desired image (and CSS/JS) resources individually.
You can find a kickoff example in this answer on a related question: Make HttpURLConnection load web pages with images
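As a rough illustration of that rewriting step, here is one way to do it with the Jsoup HTML parser (a third-party library; this is not necessarily what the linked answer does). html is the page source you fetched, targetPageUrl is the original page URL, and proxyBase is a placeholder for your proxy servlet's URL prefix.
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

// Rewrite every image/CSS/JS reference so it points back at the proxy.
String proxyBase = "/proxy?url=";
Document doc = Jsoup.parse(html, targetPageUrl); // base URL resolves relative links
for (Element img : doc.select("img[src]")) {
    img.attr("src", proxyBase + img.absUrl("src")); // consider URL-encoding the value
}
for (Element css : doc.select("link[href]")) {
    css.attr("href", proxyBase + css.absUrl("href"));
}
for (Element script : doc.select("script[src]")) {
    script.attr("src", proxyBase + script.absUrl("src"));
}
String rewrittenHtml = doc.outerHtml(); // send this to the browser instead of the original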

Java's URLConnection not receiving entire binary file

I am currently working on a school project that encompasses creating a P2P client for a standard we came up with in class, which uses HTTP to request chunks of a binary file from peers. We are allowed to use Java's HTTP libraries to make these requests; however, I am hitting a major problem with them. All chunks of a file will be served up in pieces that are <= 64 KB, but when I use the following code, the maximum number of bytes that I receive is around 15040, even though the Content-Length of the response is 64 KB:
String response = "";
URL url = new URL(uriPath);
URLConnection conn = url.openConnection ();
conn.setConnectTimeout(30 * 1000);
conn.setReadTimeout(30 * 1000);
InputStream stream = conn.getInputStream();
ByteArrayOutputStream byteArrayOut = new ByteArrayOutputStream();
int c;
byte[] buffer = new byte[4096];
while ((c = stream.read(buffer)) != -1)
{
byteArrayOut.write(buffer,0,c);
}
byte[] body = byteArrayOut.toByteArray();
stream.close();
result.put(Constants.HEADER_CONTENT_LENGTH, conn.getHeaderField(Constants.HEADER_CONTENT_LENGTH));
result.put(Constants.HEADER_CONTENT_CHECKSUM, conn.getHeaderField(Constants.HEADER_CONTENT_CHECKSUM));
result.put(Constants.KEY_BODY, new String(body));
We've tested our server component, and it serves the file correctly when a chunk is accessed with wget or in a browser - this Java client is the only problematic client we were able to find.
Is this a problem with Java's URLConnection class, or is there something in my code that is wrong with reading a binary file that is returned in a response?
Note: I am using Java 1.6 in Eclipse and from the command line.
How do you know that the maximum number of bytes is 15040? Did you check byteArrayOut.toByteArray().length, or did you check new String(byteArrayOut.toByteArray()).length()?
Creating a new String from a byte array that has binary content is likely to give unpredictable results. Use a FileOutputStream and open the file.
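For instance, a quick way to check both, reusing byteArrayOut from the code above (the output file name is arbitrary):
byte[] body = byteArrayOut.toByteArray();
System.out.println("bytes actually read = " + body.length); // compare with the Content-Length header
FileOutputStream fos = new FileOutputStream("chunk.bin");
fos.write(body);
fos.close();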

Determine size of HTTP Response?

Is there a way to determine the size of the HttpServletResponse content? I read this get-size-of-http-response-in-java question, but sadly where I work I do not have access to Commons IO :(
The response content consists of a single complex object, so I have considered writing it out to a temp file and then checking that file. This is not something I want to be doing as a diagnostic while the application is running in production, though, so I want to avoid it if at all possible.
P.S. I read erickson's answer, but it mentioned input streams; I want to know the size of the object being written out... It would be really nice if the writeObject() method returned a number representing the bytes written instead of void...
If you have access to the response header, you can read the Content-Length.
Here is an example of a response header:
(Status-Line):HTTP/1.1 200 OK
Connection:Keep-Alive
Date:Fri, 25 Mar 2011 16:26:56 GMT
Content-Length:728
Check this out: Header Field Definitions
This seems to be what you're looking for:
DataOutputStream dos = new DataOutputStream(response.getOutputStream());
...
int len = dos.size();
I eventually found a way to get what I wanted:
URLConnection con = servletURL.openConnection();
BufferedInputStream bif = new BufferedInputStream(con.getInputStream());
ObjectInputStream input = new ObjectInputStream(bif);
int avail = bif.available();
System.out.println("Response content size = " + avail);
This allowed me to see the response size on the client. I still would like to know what it is on the server side before it is sent but this was the next best thing.
Assuming the use of ObjectOutputStream, build it around a java.io.ByteArrayOutputStream:
ByteArrayOutputStream contentBytes = new ByteArrayOutputStream();
ObjectOutputStream objectOut = new ObjectOutputStream(contentBytes);
objectOut.writeObject(content);
int contentLength = contentBytes.size();
And then you can send the content with
contentBytes.writeTo(connection.getOutputStream());
where connection is whatever you're getting your OutputStream from.
Better late than never, right?
