Call large URL in Java

How should I call a large URL in Java? I'm integrating the Scene7 image server with a Java application. I need to call a URL of around 10,000 characters, which should return an image. What ways can I do this?
The way I wrote it is:
URL oURL = new URL("<*LONG LONG URL - approx. 10k characters*>");
HttpURLConnection connection = (HttpURLConnection) oURL.openConnection();
InputStream stream = connection.getInputStream();
// Decode the image from the response...
BufferedImage bi = ImageIO.read(stream);
// ...and write it back out as PNG (ImageIO.read() consumes the stream,
// so no manual copy loop is needed after this)
ImageIO.write(bi, "png", response.getOutputStream());

You won't get URLs that long to work reliably. The longest URL you can rely on working is around 2,000 characters. Beyond that, you are liable to run into browser or server-side limits (or even limits in intermediate proxy servers).
You are going to need to "think outside the box". For example:
pass some of the information that is currently encoded in your URL in the form of POST data (see the sketch after this list),
pass some of the information in the form of custom request headers (ewww!!), or
come up with a clever way to encode (or compress) the informational part of the URL.
All of these are likely to require changes to the server side.
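As a rough illustration of the first option, here is a minimal sketch of moving the parameters into a POST body with HttpURLConnection. The endpoint and parameter string are placeholders, and the server side would have to accept POST for this to work:
// import java.nio.charset.StandardCharsets;
// Sketch: send the ~10k characters of parameters as a POST body instead of a URL.
String longQuery = "wid=...&hei=...&layer=..."; // placeholder for the long parameter string
URL url = new URL("http://server.example.com/is/image/MyCompany/MyImage");
HttpURLConnection connection = (HttpURLConnection) url.openConnection();
connection.setRequestMethod("POST");
connection.setDoOutput(true);
connection.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
OutputStream out = connection.getOutputStream();
out.write(longQuery.getBytes(StandardCharsets.UTF_8));
out.close();
InputStream stream = connection.getInputStream(); // then read the image as before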
Reference:
What is the maximum length of a URL in different browsers?

While the HTTP 1.1 specification does not place a limit on the length of a URL, the server that you are working with might. You should check the documentation for the server you are connecting to (in this case, Adobe's Scene7).
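If the server does enforce a limit, it will usually reject the request outright rather than truncate it. A quick way to spot this from the client (a sketch; 414 is the standard "URI Too Long" status code):
HttpURLConnection connection = (HttpURLConnection) oURL.openConnection();
if (connection.getResponseCode() == 414) {
    // 414 URI Too Long: the URL exceeded the server's limit
    System.err.println("URL exceeded the server's length limit");
}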

Just getting an image from the URL is as easy as:
URL url = new URL("<extremely long URL>");
Image image = ImageIO.read(url);
So I'm wondering if you weren't asking for something else.

Related

Java program is not able to write complete web content; however, in SOAPUI I can see complete content

I wrote a Java program to make a simple HTTP GET call and read content from a REST web service. I include 2 custom HTTP headers and provide values for them.
In SOAPUI, when I make the same REST call with those 2 headers set, I get the proper response; however, when I make the call from my Java program, I get truncated output. Below is my code:
try {
    URL url = new URL(lnk);
    HttpURLConnection connection = (HttpURLConnection) url.openConnection();
    connection.setRequestMethod("GET");
    connection.setRequestProperty("HeaderParameter1", param1);
    connection.setRequestProperty("HeaderParamter2", param2);
    connection.setConnectTimeout(60000);
    InputStream is = connection.getInputStream();
    byte[] b = new byte[is.available()];
    is.read(b);
    String input = new String(b);
    System.out.println("The payload is \n" + input);
    connection.disconnect();
} catch (Exception e) {
}
Calling is.available() returns the number of bytes that are available to be read without blocking, but that may be less than the full content of the response. In the documentation for this method, it says: "It is never correct to use the return value of this method to allocate a buffer intended to hold all data in this stream."
A simple way to read the entire response into a String (using InputStream.readAllBytes(), available since Java 9) could be:
InputStream is = connection.getInputStream();
byte[] b = is.readAllBytes();
String input = new String(b);
There are two caveats with this approach:
The new String(byte[]) constructor uses the platform's default charset to convert the bytes to characters. This is generally not a good idea for network protocols, since the client and server need to agree on the character encoding. You can hardcode the charset encoding if it will always be the same, for example: new String(b, "UTF-8").
If the response is very large, this will allocate a large byte array. If your handling of the response can be written in a streaming manner, it would be better to read iteratively with a fixed buffer size, as Anders mentioned in his comment.
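A sketch of that streaming approach, copying with a fixed 8 KB buffer (the destination stream is a placeholder for wherever the data should go):
InputStream is = connection.getInputStream();
OutputStream out = ...; // placeholder destination (file, servlet response, etc.)
byte[] buffer = new byte[8192];
int len;
while ((len = is.read(buffer)) != -1) {
    out.write(buffer, 0, len); // handle each chunk as it arrives
}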

How can I log in to a website using Java and stay logged in?

So what I am trying to do is log into a web app that is used by our company. I need to download multiple graphs (45, to be exact). I have a program that will do this exactly how I want it to; however, the way my code works, it has to log in each time. I'm not sure if this is a problem (it might look suspicious to the admin of the web app), but it seems a little inefficient. Ideally I would like to log into the site once and then move to whatever URLs I need to download the images. Any help you guys could offer would be great.
for (int i = 1; i <= 45; i++) {
    URL url;
    if (i < 10) {
        url = new URL("http://127.0.0.1:3333/website/image0" + i);
    } else {
        url = new URL("http://127.0.0.1:3333/website/image" + i);
    }
    // Basic auth credentials, Base64-encoded (java.util.Base64; sun.misc is not public API)
    String loginPassword = "usrName" + ":" + "PassWrd";
    String encoded = Base64.getEncoder().encodeToString(loginPassword.getBytes());
    URLConnection conn = url.openConnection();
    conn.setRequestProperty("Authorization", "Basic " + encoded);
    String destName = "C:\\Users\\Name\\Documents\\Report\\p" + i + ".png";
    InputStream in = new BufferedInputStream(conn.getInputStream());
    OutputStream os = new FileOutputStream(destName);
    byte[] b = new byte[2048];
    int length;
    while ((length = in.read(b)) != -1) {
        os.write(b, 0, length);
    }
    in.close();
    os.close();
}
This will largely depend on the website you're trying to get the images off of. Most of the time, the website will use Cookies to keep track of the fact that you're logged in.
Java provides a class specifically for this purpose to be used with HttpUrlConnection: The CookieHandler
Using that class to set the default handler to a CookieManager will probably be enough for you to stay logged in. Something like this:
CookieHandler.setDefault(new CookieManager(null, CookiePolicy.ACCEPT_ALL));
// null for the CookieStore means CookieManager will use an in-memory implementation
// And CookiePolicy.ACCEPT_ALL means that this CookieHandler will accept
// all cookies
In case your website uses something else to identify its users, you will have to reverse engineer the process and implement it in your program. Generally though, everything that can be done with a browser can be done with Java as well so it should always be possible.
Edit: After writing this whole thing, I just realized that your website uses HTTP Authentication. In this case Cookies won't actually help, since you have to reauthorize every request anyway. HTTP is stateless by itself, so the server won't remember at all that you were already logged in. If this is the only way that a user can get access to those images, then there will be nothing suspicious about it, since a browser has to resend the username and the password with every request as well.
I would recommend saving the encoded String somewhere, just so it isn't regenerated every time.
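A minimal sketch of that, hoisting the header value out of the loop (using java.util.Base64; the variable names come from the question):
// import java.util.Base64;
String loginPassword = "usrName" + ":" + "PassWrd";
String authHeader = "Basic " + Base64.getEncoder().encodeToString(loginPassword.getBytes());

for (int i = 1; i <= 45; i++) {
    // ... open the connection as before ...
    // conn.setRequestProperty("Authorization", authHeader);
}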
I'll still leave the Cookie thing up, since there might still be some use for it in case the authentication mechanisms are different.

Getting the response of an HttpURLConnection back to the web browser correctly

Problem: Writing the response of an HTTP message back to the client (web browser) does not return the full page if the page has images. This happened when using Strings, so I decided to use bytes, but I still have the same issue. I have been able to get the header from the request as a String and flush it to the client, but I am not sure what to do with the message to ensure that it appears properly in the web browser.
// This portion takes the message from the HttpURLConnection input stream;
// the header has already been extracted.
// "uc" here represents an HttpURLConnection.
byte[] data = new byte[uc.getContentLength()];
int bytesRead = 0;
int offset = 0;
InputStream in = new BufferedInputStream(uc.getInputStream());
while (offset < uc.getContentLength()) {
    bytesRead = in.read(data, offset, data.length - offset);
    if (bytesRead == -1) break;
    offset += bytesRead;
}
I would advise you to write the bytes to your response as you read them, and use a small buffer; this way your server won't be affected by large memory usage. It is not good practice to load all the bytes into an array in the server's memory.
Here is a quick sample:
response.setContentType("text/html;charset=UTF-8");
OutputStream out = response.getOutputStream();
HttpURLConnection uc;
// Setup the HTTP connection...
InputStream in = uc.getInputStream();
byte[] b = new byte[1024];
int bytesRead;
while ((bytesRead = in.read(b)) != -1) {
    out.write(b, 0, bytesRead); // write only the bytes actually read
}
// Close the streams...
You seem to be proxying an HTML page with images, and you seem to be expecting that images in an HTML page are somehow automagically inlined in the HTML source code. That is not true. Images in HTML are represented by the <img> element with a src attribute pointing to a URL which the web browser has to invoke and download individually. Exactly the same story applies to other resources like CSS and JS files.
You basically need to parse the obtained HTML, scan for all <img src> (and if necessary also <link href> and <script src>) elements, and change their URLs to the URL of your proxy so that it can serve the desired image (and CSS/JS) resources individually.
You can find a kickoff example in this answer on a related question: Make HttpURLConnection load web pages with images
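For illustration, a minimal sketch of that rewriting step using the jsoup HTML parser (an assumed extra dependency; the /proxy endpoint is a placeholder, not part of the original answer):
// import org.jsoup.Jsoup; import org.jsoup.nodes.Document; import org.jsoup.nodes.Element;
Document doc = Jsoup.parse(html, baseUrl); // baseUrl resolves relative links
for (Element img : doc.select("img[src]")) {
    String absolute = img.absUrl("src");
    img.attr("src", "/proxy?url=" + URLEncoder.encode(absolute, "UTF-8"));
}
String rewritten = doc.outerHtml(); // serve this instead of the original HTML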

Java's URLConnection not receiving entire binary file

I am currently working on a school project that encompasses creating a P2P client for a standard we came up with in class that uses HTTP to request chunks of a binary file from peers. We are allowed to use Java's HTTP libraries to make these requests; however, I am hitting a major problem with them. All chunks of a file will be served in chunks that are <= 64 KB, but when I use the following code, the maximum number of bytes that I receive is around 15040, even though the Content-Length of the response is 64 KB:
String response = "";
URL url = new URL(uriPath);
URLConnection conn = url.openConnection();
conn.setConnectTimeout(30 * 1000);
conn.setReadTimeout(30 * 1000);
InputStream stream = conn.getInputStream();
ByteArrayOutputStream byteArrayOut = new ByteArrayOutputStream();
int c;
byte[] buffer = new byte[4096];
while ((c = stream.read(buffer)) != -1) {
    byteArrayOut.write(buffer, 0, c);
}
body = byteArrayOut.toByteArray();
stream.close();
result.put(Constants.HEADER_CONTENT_LENGTH, conn.getHeaderField(Constants.HEADER_CONTENT_LENGTH));
result.put(Constants.HEADER_CONTENT_CHECKSUM, conn.getHeaderField(Constants.HEADER_CONTENT_CHECKSUM));
result.put(Constants.KEY_BODY, new String(body));
We've tested our server component, and it serves the file correctly when accessing a chunk with wget or in a browser; this Java client is the only problematic client we were able to find.
Is this a problem with Java's URLConnection class, or is there something in my code that is wrong with reading a binary file that is returned in a response?
Note: I am using Java 1.6 in Eclipse and from the command line.
How do you know that the max number of bytes is 15040? Did you check byteArrayOut.toByteArray().length, or did you do new String(byteArrayOut.toByteArray()).length()?
Creating a new String from a byte array that has binary content is likely to give unpredictable results. Use a FileOutputStream and open the file.
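A quick sketch of that check: count the raw bytes and write them straight to a file (the file name is a placeholder), instead of round-tripping binary data through a String:
byte[] body = byteArrayOut.toByteArray();
System.out.println("bytes received: " + body.length); // compare against Content-Length
try (FileOutputStream fos = new FileOutputStream("chunk.bin")) {
    fos.write(body);
}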

Possible to send an image to my Servlet as a ByteBuffer?

I am attempting to have my Android phone connect to my servlet and send it a certain image. The way I figured I would do this is to use the copyPixelsToBuffer() function and then attempt to send this to the servlet through some output stream (similar to how I would do it in a normal standalone Java application). Will this work? If so, what kind of stream do I use exactly? Should I just use DataOutputStream and do something like the following:
Bitmap bm = BitmapFactory.decodeResource(getResources(), R.drawable.icon);
// Allocate a buffer big enough for the bitmap's pixels, then copy them in
ByteBuffer imgbuff = ByteBuffer.allocate(bm.getRowBytes() * bm.getHeight());
bm.copyPixelsToBuffer(imgbuff);
...code...
URLConnection sc = server.openConnection();
sc.setDoOutput(true);
DataOutputStream out = new DataOutputStream(sc.getOutputStream());
out.write(imgbuff.array());
out.flush();
out.close();
Note: I understand that this may not be the proper way of connecting to a server using the Android OS, but at the moment I'm working on just how to send the image, not the connection (unless this is relevant to how the image is sent).
If this is not a way you'd recommend sending the image to the servlet (I figured a byte buffer would be best, but I could be wrong), how would you recommend doing it?
Since an HttpServlet normally listens for HTTP requests, you'd like to use multipart/form-data encoding to send binary data over HTTP, instead of raw (unformatted) like that.
From the client side on, you can use URLConnection for this as outlined in this mini tutorial, but it's going to be pretty verbose. You can also use Apache HttpComponents Client for this. This adds however extra dependencies, I am not sure if you'd like to have that on Android.
Then, on the server side, you can use Apache Commons FileUpload to parse the items out of a multipart/form-data encoded request body. You can find a code example in this answer how the doPost() of the servlet should look like.
As to your code example: wrapping in the DataOutputStream is unnecessary. You aren't taking benefit of the DataOutputStream's facilities. You are just using write(byte[]) method which is already provided by the basic OutputStream as returned by URLConnection#getOutputStream(). Further, the Bitmap has a compress() method which you can use to compress it using a more standard and understandable format (PNG, JPG, etc) into an arbitrary OutputStream. E.g.
output = connection.getOutputStream();
// ...
bitmap.compress(CompressFormat.JPEG, 100, output);
Do this instead of output.write(bytes) as in your code.
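Putting it together, a rough sketch of the client side, assuming the servlet simply reads the raw request body (the URL and content type are placeholders; for a real multipart/form-data upload, see the links above):
HttpURLConnection connection =
        (HttpURLConnection) new URL("http://example.com/myservlet").openConnection();
connection.setDoOutput(true); // makes this a POST request
connection.setRequestProperty("Content-Type", "image/jpeg");
OutputStream output = connection.getOutputStream();
bitmap.compress(Bitmap.CompressFormat.JPEG, 100, output);
output.close();
int status = connection.getResponseCode(); // actually sends the request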
