I seem to be running into a peculiar problem on Android 1.5 when a library I'm using (Signpost 1.1-SNAPSHOT) makes two consecutive connections to a remote server. The second connection always fails with an HttpURLConnection.getResponseCode() of -1.
Here's a testcase that exposes the problem:
// BROKEN
public void testDefaultOAuthConsumerAndroidBug() throws Exception {
    for (int i = 0; i < 2; ++i) {
        final HttpURLConnection c = (HttpURLConnection) new URL("https://api.tripit.com/oauth/request_token").openConnection();
        final DefaultOAuthConsumer consumer = new DefaultOAuthConsumer(api_key, api_secret, SignatureMethod.HMAC_SHA1);
        consumer.sign(c); // This line...
        final InputStream is = c.getInputStream();
        while (is.read() >= 0) ; // ... in combination with this line causes responseCode -1 for i==1 when using api.tripit.com but not mail.google.com
        assertTrue(c.getResponseCode() > 0);
    }
}
Basically, if I sign the request and then consume the entire input stream, the next request fails with a response code of -1. The failure doesn't seem to happen if I read just one character from the input stream.
Note that this doesn't happen for every URL -- just specific ones such as the one above.
Also, if I switch to using HttpClient instead of HttpURLConnection, everything works fine:
// WORKS
public void testCommonsHttpOAuthConsumerAndroidBug() throws Exception {
    for (int i = 0; i < 2; ++i) {
        final HttpGet c = new HttpGet("https://api.tripit.com/oauth/request_token");
        final CommonsHttpOAuthConsumer consumer = new CommonsHttpOAuthConsumer(api_key, api_secret, SignatureMethod.HMAC_SHA1);
        consumer.sign(c);
        final HttpResponse response = new DefaultHttpClient().execute(c);
        final InputStream is = response.getEntity().getContent();
        while (is.read() >= 0) ;
        assertTrue(response.getStatusLine().getStatusCode() == 200);
    }
}
I've found references to what seems to be a similar problem elsewhere, but so far no solutions. If they're truly the same problem, then it probably isn't in Signpost, since those other reports don't involve it.
Any ideas?
Try setting this property to see if it helps:
http.keepAlive=false
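For example, from Java code (a one-liner; note it must run before the first connection is opened):
// Disable keep-alive for all subsequent HttpURLConnection instances.
System.setProperty("http.keepAlive", "false");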
I've seen similar problems when the server's response is not understood by URLConnection and the client and server get out of sync.
If this solves your problem, you should capture an HTTP trace to see exactly what's special about the response.
EDIT: This change just confirms my suspicion. It doesn't solve your problem. It just hides the symptom.
If the response from the first request is 200, we need a trace. I normally use Ethereal/Wireshark to get the TCP trace.
If your first response is not 200, I do see a problem in your code. With OAuth, the error response (401) actually returns data, which includes the ProblemAdvice, Signature Base String, etc. to help you debug. You need to read everything from the error stream; otherwise it's going to confuse the next connection, and that's the cause of the -1. The following example shows you how to handle errors correctly:
public static String get(String url) throws IOException {
    ByteArrayOutputStream os = new ByteArrayOutputStream();
    URLConnection conn = null;
    byte[] buf = new byte[4096];
    try {
        URL a = new URL(url);
        conn = a.openConnection();
        InputStream is = conn.getInputStream();
        int ret = 0;
        while ((ret = is.read(buf)) > 0) {
            os.write(buf, 0, ret);
        }
        // close the input stream
        is.close();
        return new String(os.toByteArray());
    } catch (IOException e) {
        int respCode = ((HttpURLConnection) conn).getResponseCode();
        InputStream es = ((HttpURLConnection) conn).getErrorStream();
        int ret = 0;
        // drain the error stream completely so the connection stays usable
        while ((ret = es.read(buf)) > 0) {
            os.write(buf, 0, ret);
        }
        // close the error stream
        es.close();
        return "Error response " + respCode + ": " + new String(os.toByteArray());
    }
}
I've encountered the same problem when I did not read all the data from the InputStream before closing it and opening a second connection. It was also fixed either with System.setProperty("http.keepAlive", "false"); or simply by looping until I had read the rest of the InputStream.
Not completely related to your issue, but I hope this helps anyone else with a similar problem.
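For reference, a minimal sketch of that drain-the-stream fix (the helper name and buffer size are mine, not from the original post; assumes the usual java.io imports):
// Read and discard whatever is left so the keep-alive connection
// can be reused cleanly by the next request.
static void drain(InputStream in) throws IOException {
    byte[] buf = new byte[4096];
    while (in.read(buf) != -1) {
        // discard
    }
    in.close();
}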
Google provided an elegant workaround since it's only happening prior to Froyo:
private void disableConnectionReuseIfNecessary() {
    // HTTP connection reuse was buggy pre-Froyo
    if (Integer.parseInt(Build.VERSION.SDK) < Build.VERSION_CODES.FROYO) {
        System.setProperty("http.keepAlive", "false");
    }
}
Cf. http://android-developers.blogspot.ca/2011/09/androids-http-clients.html
Or, you can set the HTTP header on the connection (HttpURLConnection):
conn.setRequestProperty("Connection", "close");
Can you verify that the connection is not getting closed before you finish reading the response? Maybe HttpClient parses the response code right away and saves it for future queries, whereas HttpURLConnection could be returning -1 once the connection is closed?
Related
I've got a simple Tomcat-based Java app that functions as a sort of firewall - I take requests from the "outside", reroute them to resources on the "inside", and return the result to the "outside."
This works fine for GETs, but I'm trying to add a POST function for a different request and I cannot get it working. The "inside" remote server is password protected, and I cannot get it to accept the authentication credentials (they work for the GET, so the credentials are fine). Instead, the Tomcat server calls the Authenticator over and over and finally fails. Here's the error I'm getting:
java.net.ProtocolException: Server redirected too many times (20)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1848)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1441)
at com.mystuff.house.server.MyServlet.doPost(MyServlet.java:191)
I'm sure I'm doing something stupid, but I can't see where it is. Here's the guts of the servlet doPost() routine:
URL url = new URL("HTTP", "10.10.1.101", -1, "/myresource");
URLConnection con = url.openConnection();
HttpURLConnection http = (HttpURLConnection) con;
http.setRequestMethod("POST");
http.setDoOutput(true);
// encodeToString, not String.valueOf(byte[]) -- the latter would
// produce the array's toString ("[B@..."), not the encoded text
String encoded = Base64.getEncoder().encodeToString(
        (a.getUsername() + ":" + a.getPassword()).getBytes());
http.setRequestProperty("Authorization", "Basic " + encoded);
http.setRequestProperty("Content-Type", "application/x-www-form-urlencoded; charset=UTF-8");

// Read the POST payload from the front end post, write to back end post
InputStream r = request.getInputStream();
OutputStream os = http.getOutputStream();
int j = 0;
while ((j = r.read()) != -1) {
    os.write((byte) j);
}
http.connect();

// Try reading the result from the back end, push it back to the front end
try {
    InputStream i = http.getInputStream();
    OutputStream o = response.getOutputStream();
    // read/write bytes until EOF
    j = 0;
    while ((j = i.read()) != -1) {
        o.write((byte) j);
    }
} catch (Exception ex) {
    System.out.println("AIEEEE! Error receiving page from HTTP call");
    ex.printStackTrace();
}
The problem with this, after some investigation, turned out to be that the authentication was not valid for the specific URL that I was trying to hit on the remote server.
I had expected to get a 403, 401 or 407 back from the remote server, but that never happened; instead, this "redirect" happened. So that's something to be aware of if you are trying to hit password-protected URLs from Java code.
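One way to surface the real status instead of the redirect loop (a sketch, not from the original post; setInstanceFollowRedirects is standard HttpURLConnection API):
HttpURLConnection http = (HttpURLConnection) url.openConnection();
http.setInstanceFollowRedirects(false); // don't chase the redirect chain
http.setRequestMethod("POST");
// ... set the Authorization header and write the body as before ...
int status = http.getResponseCode();
// Now a 301/302 (or a 401/403) shows up directly instead of
// "Server redirected too many times".
System.out.println("Backend answered " + status
        + ", Location: " + http.getHeaderField("Location"));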
I am using simple code to execute a GET request and load the page data. The relevant code is here:
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;

public class HttpTest {
    public static void main(String[] args) throws IOException {
        URL url = new URL("http://" + args[0] + "/send?pts=900000000&place=1");
        URLConnection conn = url.openConnection();
        conn.setReadTimeout(3000);
        conn.setConnectTimeout(3000);
        StringBuilder result = new StringBuilder();
        InputStream in = conn.getInputStream();
        int c;
        while ((c = in.read()) != -1) {
            result.append((char) c);
        }
        String response = result.toString();
        System.out.println(response);
    }
}
When I execute the code with Java 6, everything is OK and my code prints the response. But when I execute it with Java 8, I get an error on opening the stream:
Exception in thread "main" java.io.IOException: Invalid Http response
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1553)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1439)
at HttpTest.main(HttpTest.java:23)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:483)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
I tried different methods and used some libraries, but with Java 8 I always get errors. When curl-ing that URL or testing it with Postman, I can see that the response does not contain any header fields or status line, and I cannot change the service producing that response. So somehow I must tell Java 8 to relax about the missing 200 status and headers and just give me what is in the body. How?
You can't expect error tolerance from an HTTP client implementation unless it's explicitly specified. So if you know that the server isn't actually an HTTP server (strictly speaking), you might simply implement manual socket access, mimicking the HTTP protocol as far as the server understands it. It's as simple as:
String host = args[0];
try (Socket s = new Socket(host, 80)) {
    Writer w = new OutputStreamWriter(s.getOutputStream(), StandardCharsets.US_ASCII);
    w.write("GET http://" + host + "/send?pts=900000000&place=1 HTTP/1.0\r\n\r\n");
    w.flush();
    // rest taken from your original code; what you are basically doing
    // is interpreting the received data as if it were ISO_8859_1 encoded.
    // you might have to strip off the remains of the response header, if there is one
    StringBuilder result = new StringBuilder();
    InputStream in = s.getInputStream();
    int c;
    while ((c = in.read()) != -1) {
        result.append((char) c);
    }
    String response = result.toString();
    System.out.println(response);
}
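If the server ever does prepend a status line and headers, you can strip them right after building response inside the try block; a sketch (assumes the header block, when present, ends with a blank CRLF line):
    // Everything after the first blank line (CRLF CRLF) is the body.
    int headerEnd = response.indexOf("\r\n\r\n");
    String body = headerEnd >= 0 ? response.substring(headerEnd + 4) : response;
    System.out.println(body);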
I'm trying to invoke a web service call and get a response. The first time I tried it, it worked perfectly and printed the response. But after that one run, no matter how many times I run it, it throws me:
Exception in thread "main" java.lang.IllegalStateException: Already connected
at sun.net.www.protocol.http.HttpURLConnection.setRequestProperty(Unknown Source)
at SOAPClient4XG.main(SOAPClient4XG.java:72)
I have tried various solutions suggested for this problem (like connect/disconnect), but nothing seems to make it work. I understand that it's trying to perform an operation on an already existing connection, but I'm not sure how to fix that. I'm fairly new to all this and need help.
Below is my code:
import java.io.*;
import java.net.*;

public class SOAPClient4XG {
    private static HttpURLConnection httpConn;

    public static void main(String[] args) throws Exception {
        // String SOAPUrl = args[0];
        // String xmlFile2Send = args[1];
        String SOAPUrl = "http://10.153.219.88:8011/celg-svcs-soap/business/ApplicantEligibility";
        String xmlFile2Send =
                "C:\\Users\\dkrishnamoorthy\\workspace\\SOAPUI_Automation\\src\\ApplicantElligibilty.xml";
        String SOAPAction = "";
        if (args.length > 2)
            SOAPAction = args[2];

        // Create the connection where we're going to send the file.
        URL url = new URL(SOAPUrl);
        URLConnection connection = url.openConnection();
        httpConn = (HttpURLConnection) connection;

        if (httpConn.getResponseCode() == 500) {
            System.out.println("Error Stream for 500 : " + httpConn.getErrorStream());
        }

        // Open the input file. After we copy it to a byte array, we can see
        // how big it is so that we can set the HTTP Content-Length property.
        FileInputStream fin = new FileInputStream(xmlFile2Send);
        ByteArrayOutputStream bout = new ByteArrayOutputStream();

        // Copy the SOAP file to the open connection.
        copy(fin, bout);
        fin.close();
        byte[] b = bout.toByteArray();

        // Set the appropriate HTTP parameters.
        httpConn.setRequestProperty("Content-Length", String.valueOf(b.length));
        httpConn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        httpConn.setRequestProperty("SOAPAction", SOAPAction);
        httpConn.setRequestMethod("POST");
        httpConn.setDoOutput(true);
        httpConn.setDoInput(true);
        // httpConn.connect();

        // Everything's set up; send the XML that was read into b.
        OutputStream out = httpConn.getOutputStream();
        out.write(b);
        out.close();

        // Read the response and write it to standard out.
        InputStreamReader isr = new InputStreamReader(httpConn.getInputStream());
        BufferedReader in = new BufferedReader(isr);
        String inputLine;
        System.out.println("Printing the Response ");
        while ((inputLine = in.readLine()) != null)
            System.out.println(inputLine);
        in.close();
    }

    public static void copy(InputStream in, OutputStream out) throws IOException {
        synchronized (in) {
            synchronized (out) {
                byte[] buffer = new byte[256];
                while (true) {
                    int bytesRead = in.read(buffer);
                    if (bytesRead == -1) break;
                    out.write(buffer, 0, bytesRead);
                }
            }
        }
    }
}
If you use Eclipse, just restart it. I met the same issue and sorted it out by doing that.
I solved this after discovering a forgotten watch on connection.getResponseCode() in my debugging interface in NetBeans. I hope it helps others making the same mistake.
If you have any watch on the response side of the request, such as getResponseCode(), getResponseMessage(), getInputStream() or even just connect(), you will get this error in debugging mode.
All of these methods implicitly call connect() and fire the request, so by the time execution reaches a setter such as setDoOutput or setRequestProperty, the connection has already been made.
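A minimal sketch of the safe ordering, reusing the question's SOAPUrl and payload array b (everything configuring the request comes before the first call that fires it):
HttpURLConnection conn = (HttpURLConnection) new URL(SOAPUrl).openConnection();
// 1. Configure first -- none of these calls opens the connection.
conn.setRequestMethod("POST");
conn.setDoOutput(true);
conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
// 2. Writing the body implicitly connects and sends the request.
OutputStream out = conn.getOutputStream();
out.write(b);
out.close();
// 3. Only now read the status; asking for it earlier would have
// fired the request before the headers were set.
int status = conn.getResponseCode();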
So I made a little program that can download 4chan pages. I get the raw HTML page and parse it for my needs. The code below was working fine, but it suddenly stopped working: when I run it, the server does not accept my request; it seems to be waiting for something more. However, I know that the HTTP request is as below:
GET /ck HTTP/1.1
Host: boards.4chan.org
(extra new line)
If I change this format in any way, I receive a "400 Bad Request" status code, but if I change HTTP/1.1 to 1.0, the server responds with a "200 OK" status and I get the whole page. This makes me think the error is in the Host line, since that became mandatory in HTTP/1.1, but I still cannot figure out what exactly needs to be changed.
The calling function is simply this, to get one whole board:
downloadHTMLThread("ck", -1);
Or, for a specific thread, you just change -1 to that number. For example, for the link below, the call looks like this:
//http://boards.4chan.org/ck/res/3507158
//url.getDefaultPort() is 80
//url.getHost() is boards.4chan.org
//url.getFile() is /ck/res/3507158
downloadHTMLThread("ck", 3507158);
Any advice would be appreciated. Thanks.
public static final String BOARDS = "boards.4chan.org";
public static final String IMAGES = "images.4chan.org";
public static final String THUMBS = "thumbs.4chan.org";
public static final String RES = "/res/";
public static final String HTTP = "http://";
public static final String SLASH = "/";

public String downloadHTMLThread(String board, int thread) {
    BufferedReader reader = null;
    PrintWriter out = null;
    Socket socket = null;
    String str = null;
    StringBuilder input = new StringBuilder();
    try {
        URL url = new URL(HTTP + BOARDS + SLASH + board + (thread == -1 ? SLASH : RES + thread));
        socket = new Socket(url.getHost(), url.getDefaultPort());
        reader = new BufferedReader(new InputStreamReader(socket.getInputStream()));
        out = new PrintWriter(socket.getOutputStream(), true);
        out.println("GET " + url.getFile() + " HTTP/1.1");
        out.println("HOST: " + url.getHost());
        out.println();
        long start = System.currentTimeMillis();
        while ((str = reader.readLine()) != null) {
            input.append(str).append("\r\n");
        }
        long end = System.currentTimeMillis();
        System.out.println(input);
        System.out.println("\nTime: " + (end - start) + " milliseconds");
    } catch (Exception ex) {
        ex.printStackTrace();
        input = null;
    } finally {
        if (reader != null) {
            try {
                reader.close();
            } catch (IOException ioe) {
                // nothing to see here
            }
        }
        if (socket != null) {
            try {
                socket.close();
            } catch (IOException ioe) {
                // nothing to see here
            }
        }
        if (out != null) {
            out.close();
        }
    }
    return input == null ? null : input.toString();
}
Try using Apache HttpClient instead of rolling your own:
static String getUriContentsAsString(String uri) throws IOException {
    HttpClient client = new DefaultHttpClient();
    HttpResponse response = client.execute(new HttpGet(uri));
    return EntityUtils.toString(response.getEntity());
}
If you are doing this to really learn the internals of HTTP client requests, then you might start by playing with curl from the command line. This will let you get all your headers and request body squared away. Then it will be a simple matter of adjusting your request to match what works in curl.
From the code, it looks like you are sending 'HOST' instead of 'Host'. Since this is a compulsory header in HTTP/1.1 but ignored in HTTP/1.0, that might be the problem.
Anyway, you could use a program to capture the packets sent (e.g. Wireshark), just to make sure.
Using println is quite convenient, but the line separator it appends depends on the system property line.separator. The line separator in the HTTP protocol has to be '\r\n', so if you're capturing packets, it'd be a good idea to check that each line sent ends with '\r\n' (bytes 0x0D 0x0A), just in case your OS line separator is different.
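To rule that out, you can write the CRLFs explicitly instead of relying on println. A sketch replacing the three println calls in the question's code (note the corrected 'Host' capitalization as well):
// print() plus an explicit "\r\n" avoids the platform-dependent
// separator that println() appends.
out.print("GET " + url.getFile() + " HTTP/1.1\r\n");
out.print("Host: " + url.getHost() + "\r\n");
out.print("\r\n");
out.flush();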
Use www.4chan.org as the host instead. Since boards.4chan.org is a 302 redirect to www.4chan.org, you won't be able to scrape anything from boards.4chan.org.
I have the following Java code to fetch the entire contents of an HTML page at a given URL. Can this be done in a more efficient way? Any improvements are welcome.
public static String getHTML(final String url) throws IOException {
    if (url == null || url.length() == 0) {
        throw new IllegalArgumentException("url cannot be null or empty");
    }
    final HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
    final BufferedReader buf = new BufferedReader(new InputStreamReader(conn.getInputStream()));
    final StringBuilder page = new StringBuilder();
    final String lineEnd = System.getProperty("line.separator");
    String line;
    try {
        while ((line = buf.readLine()) != null) {
            page.append(line).append(lineEnd);
        }
    } finally {
        buf.close();
    }
    return page.toString();
}
I can't help but feel that the line reading is less than optimal. I know that I'm possibly masking a MalformedURLException thrown by the URL constructor, and I'm okay with that.
My function also has the side-effect of making the HTML String have the correct line terminators for the current system. This isn't a requirement.
I realize that network IO will probably dwarf the time it takes to read in the HTML, but I'd still like to know this is optimal.
On a side note: It would be awesome if StringBuilder had a constructor for an open InputStream that would simply take all the contents of the InputStream and read it into the StringBuilder.
As seen in the other answers, there are many different edge cases (HTTP peculiarities, encoding, chunking, etc) that should be accounted for in any robust solution. Therefore I propose that in anything other than a toy program you use the de facto Java standard HTTP library: Apache HTTP Components HTTP Client.
They provide many samples, "just" getting the response contents for a request looks like this:
HttpClient httpclient = new DefaultHttpClient();
HttpGet httpget = new HttpGet("http://www.google.com/");
ResponseHandler<String> responseHandler = new BasicResponseHandler();
String responseBody = httpclient.execute(httpget, responseHandler);
// responseBody now contains the contents of the page
System.out.println(responseBody);
httpclient.getConnectionManager().shutdown();
OK, edited once more. Be sure to put your try-finally blocks around it, or catch IOException
...
final static int BUFZ = 4096;

StringBuilder page = new StringBuilder();
HttpURLConnection conn =
        (HttpURLConnection) new URL(url).openConnection();
InputStream is = conn.getInputStream();
// perhaps allocate this one time and reuse it if you
// call this method a lot.
byte[] buf = new byte[BUFZ];
int nRead = 0;
while ((nRead = is.read(buf, 0, BUFZ)) > 0) {
    // only convert the bytes actually read on this pass
    page.append(new String(buf, 0, nRead /* , Charset charset */));
    // uses the local default char encoding for now
}
Here, try this one:
...
final static int MAX_SIZE = 10000000;

HttpURLConnection conn =
        (HttpURLConnection) new URL(url).openConnection();
InputStream is = conn.getInputStream();
// perhaps allocate this one time and reuse it if you
// call this method a lot.
byte[] buf = new byte[MAX_SIZE];
int nRead = 0;
int total = 0;
// you could also use an ArrayList so that you could dynamically
// resize, or there are other ways to resize an array
// note: read into the unfilled tail of buf, otherwise each pass
// would overwrite the previous one
while (total < MAX_SIZE && (nRead = is.read(buf, total, MAX_SIZE - total)) > 0) {
    total += nRead;
}
...
// do something with buf array of length total
OK, the code below was not working for you because the Content-Length header is not sent at the beginning of the response due to HTTP/1.1 "chunking":
...
HttpURLConnection conn =
        (HttpURLConnection) new URL(url).openConnection();
InputStream is = conn.getInputStream();
int cLen = conn.getContentLength();
byte[] buf = new byte[cLen];
int nRead = 0;
int r;
// stop if the stream ends early instead of looping forever on -1
while (nRead < cLen && (r = is.read(buf, nRead, cLen - nRead)) > 0) {
    nRead += r;
}
...
// do something with buf array
You could do your own buffering on top of InputStreamReader by reading bigger chunks into a character array and appending the array contents to the StringBuilder.
But it would make your code slightly harder to understand, and I doubt it would be worth it.
Note that the proposal by Sean A.O. Harney reads raw bytes, so you'd need to do the conversion to text on top of that.
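For completeness, a sketch of that chunked-Reader approach (the UTF-8 charset is an assumption here; the correct one should come from the response's Content-Type header):
// Read characters, not bytes: the Reader does the byte-to-char
// decoding, so multi-byte sequences are never split across chunks.
final Reader reader = new InputStreamReader(conn.getInputStream(), "UTF-8");
final StringBuilder page = new StringBuilder();
final char[] chunk = new char[4096];
int n;
try {
    while ((n = reader.read(chunk)) != -1) {
        page.append(chunk, 0, n);
    }
} finally {
    reader.close();
}
return page.toString();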