In Java, this code throws an exception when the HTTP response code is in the 404 range:
URL url = new URL("http://stackoverflow.com/asdf404notfound");
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.getInputStream(); // throws!
In my case, I happen to know that the result is a 404, but I'd still like to read the body of the response anyway.
(In my actual case the response code is 403, but the body of the response explains the reason for rejection, and I'd like to display that to the user.)
How can I access the response body?
Here is the bug report (closed: will not fix, not a bug).
Their advice there is to code it like this:
HttpURLConnection httpConn = (HttpURLConnection)_urlConnection;
InputStream _is;
if (httpConn.getResponseCode() < HttpURLConnection.HTTP_BAD_REQUEST) {
_is = httpConn.getInputStream();
} else {
/* error from server */
_is = httpConn.getErrorStream();
}
It's the same problem I was having:
HttpURLConnection throws a FileNotFoundException if you try to read from getInputStream() on the connection.
You should instead use getErrorStream() when the status code is 400 or higher.
Beyond that, be careful: 200 is not the only success status code; 201, 204, etc. are also often used as success statuses.
Here is an example of how I handled it:
... connection code code code ...
// Get the response code
int statusCode = connection.getResponseCode();
InputStream is = null;
if (statusCode >= 200 && statusCode < 400) {
// Create an InputStream in order to extract the response object
is = connection.getInputStream();
}
else {
is = connection.getErrorStream();
}
... callback/response to your handler....
In this way, you'll be able to get the needed response in both success and error cases.
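For reference, the whole pattern can be condensed into one small helper. This is just a sketch: the method name readBody and the UTF-8 charset are my own choices, and readAllBytes needs Java 9+.
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.nio.charset.StandardCharsets;

// Reads the response body from whichever stream applies for the status code.
static String readBody(HttpURLConnection connection) throws IOException {
    int statusCode = connection.getResponseCode();
    InputStream stream = (statusCode >= 200 && statusCode < 400)
            ? connection.getInputStream()
            : connection.getErrorStream();
    if (stream == null) {
        return ""; // some error responses carry no body at all
    }
    try (InputStream in = stream) {
        return new String(in.readAllBytes(), StandardCharsets.UTF_8);
    }
}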
Hope this helps!
In .NET you have the Response property of the WebException, which gives access to the stream when an exception occurs. So I guess this is a good approach for Java as well:
private InputStream dispatch(HttpURLConnection http) throws Exception {
try {
return http.getInputStream();
} catch(Exception ex) {
return http.getErrorStream();
}
}
Or an implementation I used (it might need changes for encoding or other things, but it works in my current environment):
private String dispatch(HttpURLConnection http) throws Exception {
try {
return readStream(http.getInputStream());
} catch(Exception ex) {
readAndThrowError(http);
return null; // <- never gets here, previous statement throws an error
}
}
private void readAndThrowError(HttpURLConnection http) throws Exception {
if (http.getContentLengthLong() > 0 && http.getContentType() != null && http.getContentType().contains("application/json")) {
String json = this.readStream(http.getErrorStream());
Object oson = this.mapper.readValue(json, Object.class);
json = this.mapper.writer().withDefaultPrettyPrinter().writeValueAsString(oson);
throw new IllegalStateException(http.getResponseCode() + " " + http.getResponseMessage() + "\n" + json);
} else {
throw new IllegalStateException(http.getResponseCode() + " " + http.getResponseMessage());
}
}
private String readStream(InputStream stream) throws Exception {
StringBuilder builder = new StringBuilder();
try (BufferedReader in = new BufferedReader(new InputStreamReader(stream))) {
String line;
while ((line = in.readLine()) != null) {
builder.append(line); // no need to append "\r\n"; the JSON has no line breaks
}
// no explicit close needed; try-with-resources closes the reader
}
System.out.println("JSON: " + builder.toString());
return builder.toString();
}
I know that this doesn't answer the question directly, but instead of using the HTTP connection library provided by Sun, you might want to take a look at Commons HttpClient, which (in my opinion) has a far easier API to work with.
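For example, with the newer Apache HttpComponents 4.x flavour of that client (a sketch only; the URL is the one from the question), the response entity is handed to you regardless of the status code:
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

try (CloseableHttpClient client = HttpClients.createDefault();
     CloseableHttpResponse response = client.execute(
             new HttpGet("http://stackoverflow.com/asdf404notfound"))) {
    int status = response.getStatusLine().getStatusCode();    // e.g. 404
    String body = EntityUtils.toString(response.getEntity()); // body is available either way
    System.out.println(status + "\n" + body);
}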
First check the response code and then use HttpURLConnection.getErrorStream():
InputStream is = null;
if (httpConn.getResponseCode() != 200) {
/* error from server */
is = httpConn.getErrorStream();
} else {
is = httpConn.getInputStream();
}
My running code.
HttpURLConnection httpConn = (HttpURLConnection) urlConn;
if (httpConn.getResponseCode() < HttpURLConnection.HTTP_BAD_REQUEST) {
in = new InputStreamReader(httpConn.getInputStream());
} else {
/* error from server */
in = new InputStreamReader(httpConn.getErrorStream());
}
BufferedReader bufferedReader = new BufferedReader(in);
int cp;
while ((cp = bufferedReader.read()) != -1) {
sb.append((char) cp);
}
bufferedReader.close();
in.close();
System.out.println("sb="+sb);
How to read a 404 response body in Java:
Use Apache library - https://hc.apache.org/httpcomponents-client-4.5.x/httpclient/apidocs/
or
Java 11 - https://docs.oracle.com/en/java/javase/11/docs/api/java.net.http/java/net/http/HttpClient.html
The snippet given below uses Apache:
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.util.EntityUtils;
CloseableHttpClient client = HttpClients.createDefault();
CloseableHttpResponse resp = client.execute(new HttpGet(domainName + "/blablablabla.html"));
String response = EntityUtils.toString(resp.getEntity());
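For completeness, here is a rough equivalent with the Java 11 java.net.http client linked above; it also gives you the body no matter what the status code is (the URL is just the one from the question):
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

HttpClient client = HttpClient.newHttpClient();
HttpRequest request = HttpRequest.newBuilder(
        URI.create("http://stackoverflow.com/asdf404notfound")).build();
HttpResponse<String> resp = client.send(request, HttpResponse.BodyHandlers.ofString());
System.out.println(resp.statusCode()); // 404
System.out.println(resp.body());       // the 404 page's body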
Related
I'm trying to post data to Google using the following code, but I always get a 405 error. Can anybody tell me why?
package com.tom.labs;
import java.net.*;
import java.io.*;
public class JavaHttp {
public static void main(String[] args) throws Exception {
File data = new File("D:\\in.txt");
File result = new File("D:\\out.txt");
FileOutputStream out = new FileOutputStream(result);
OutputStreamWriter writer = new OutputStreamWriter(out);
Reader reader = new InputStreamReader(new FileInputStream(data));
postData(reader,new URL("http://google.com"),writer);//Not working
//postData(reader,new URL("http://google.com/search"),writer);//Not working
sendGetRequest("http://google.com/search", "q=Hello");//Works properly
}
public static String sendGetRequest(String endpoint,
String requestParameters) {
String result = null;
if (endpoint.startsWith("http://")) {
// Send a GET request to the servlet
try {
// Send data
String urlStr = endpoint;
if (requestParameters != null && requestParameters.length() > 0) {
urlStr += "?" + requestParameters;
}
URL url = new URL(urlStr);
URLConnection conn = url.openConnection();
// Get the response
BufferedReader rd = new BufferedReader(new InputStreamReader(
conn.getInputStream()));
StringBuffer sb = new StringBuffer();
String line;
while ((line = rd.readLine()) != null) {
sb.append(line);
}
rd.close();
result = sb.toString();
} catch (Exception e) {
e.printStackTrace();
}
}
System.out.println(result);
return result;
}
/**
* Reads data from the data reader and posts it to a server via a POST request.
*
* @param data     the data you want to send
* @param endpoint the server's address
* @param output   the writer to which the server's response is written
* @throws Exception
*/
public static void postData(Reader data, URL endpoint, Writer output)
throws Exception {
HttpURLConnection urlc = null;
try {
urlc = (HttpURLConnection) endpoint.openConnection();
try {
urlc.setRequestMethod("POST");
} catch (ProtocolException e) {
throw new Exception(
"Shouldn't happen: HttpURLConnection doesn't support POST??",
e);
}
urlc.setDoOutput(true);
urlc.setDoInput(true);
urlc.setUseCaches(false);
urlc.setAllowUserInteraction(false);
urlc.setRequestProperty("Content-type", "text/xml; charset=UTF-8");
OutputStream out = urlc.getOutputStream();
try {
Writer writer = new OutputStreamWriter(out, "UTF-8");
pipe(data, writer);
writer.close();
} catch (IOException e) {
throw new Exception("IOException while posting data", e);
} finally {
if (out != null)
out.close();
}
InputStream in = urlc.getInputStream();
try {
Reader reader = new InputStreamReader(in);
pipe(reader, output);
reader.close();
} catch (IOException e) {
throw new Exception("IOException while reading response", e);
} finally {
if (in != null)
in.close();
}
} catch (IOException e) {
e.printStackTrace();
throw new Exception("Connection error (is server running at "
+ endpoint + " ?): " + e);
} finally {
if (urlc != null)
urlc.disconnect();
}
}
/**
* Pipes everything from the reader to the writer via a buffer
*/
private static void pipe(Reader reader, Writer writer) throws IOException {
char[] buf = new char[1024];
int read = 0;
while ((read = reader.read(buf)) >= 0) {
writer.write(buf, 0, read);
}
writer.flush();
}
}
405 means "method not allowed". For example, if you try to POST to a URL that doesn't allow POST, then the server will return a 405 status.
What are you trying to do by making a POST request to Google? I suspect that Google's home page only allows GET, HEAD, and maybe OPTIONS.
Here's the body of the response to a POST request to Google, containing Google's explanation:
405. That’s an error.
The request method POST is inappropriate for the URL /. That’s all we know.
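If you want to read that explanation programmatically rather than in a browser, the getErrorStream() approach from the answers above applies here too. A rough sketch (assumes Java 9+ for readAllBytes; sending an empty request body is just for illustration):
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

HttpURLConnection urlc = (HttpURLConnection) new URL("http://google.com").openConnection();
urlc.setRequestMethod("POST");
urlc.setDoOutput(true);
urlc.getOutputStream().close();              // send an empty request body
System.out.println(urlc.getResponseCode());  // 405
try (InputStream err = urlc.getErrorStream()) {
    if (err != null) {
        System.out.println(new String(err.readAllBytes(), StandardCharsets.UTF_8));
    }
}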
I am making a crawler, and need to get the data from the stream regardless of whether it is a 200 or not. cURL does this, as does any standard browser.
The following will not actually get the content of the response, even though there is some; instead, an exception is thrown containing the HTTP error status code. I want the output regardless. Is there a way? I prefer to use this library as it will actually do persistent connections, which is perfect for the type of crawling I am doing.
package test;
import java.net.*;
import java.io.*;
public class Test {
public static void main(String[] args) {
try {
URL url = new URL("http://github.com/XXXXXXXXXXXXXX");
URLConnection connection = url.openConnection();
DataInputStream inStream = new DataInputStream(connection.getInputStream());
String inputLine;
while ((inputLine = inStream.readLine()) != null) {
System.out.println(inputLine);
}
inStream.close();
} catch (MalformedURLException me) {
System.err.println("MalformedURLException: " + me);
} catch (IOException ioe) {
System.err.println("IOException: " + ioe);
}
}
}
Worked, thanks: Here is what I came up with - just as a rough proof of concept:
import java.net.*;
import java.io.*;
public class Test {
public static void main(String[] args) {
//InputStream error = ((HttpURLConnection) connection).getErrorStream();
URL url = null;
URLConnection connection = null;
String inputLine = "";
try {
url = new URL("http://verelo.com/asdfrwdfgdg");
connection = url.openConnection();
DataInputStream inStream = new DataInputStream(connection.getInputStream());
while ((inputLine = inStream.readLine()) != null) {
System.out.println(inputLine);
}
inStream.close();
} catch (MalformedURLException me) {
System.err.println("MalformedURLException: " + me);
} catch (IOException ioe) {
System.err.println("IOException: " + ioe);
InputStream error = ((HttpURLConnection) connection).getErrorStream();
try {
int data = error.read();
while (data != -1) {
//do something with data...
//System.out.println(data);
inputLine = inputLine + (char)data;
data = error.read();
//inputLine = inputLine + (char)data;
}
error.close();
} catch (Exception ex) {
try {
if (error != null) {
error.close();
}
} catch (Exception e) {
}
}
}
System.out.println(inputLine);
}
}
Simple:
URLConnection connection = url.openConnection();
InputStream is = connection.getInputStream();
if (connection instanceof HttpURLConnection) {
HttpURLConnection httpConn = (HttpURLConnection) connection;
int statusCode = httpConn.getResponseCode();
if (statusCode != 200 /* or statusCode >= 200 && statusCode < 300 */) {
is = httpConn.getErrorStream();
}
}
You can refer to the Javadoc for an explanation. The way I would handle this is as follows:
URLConnection connection = url.openConnection();
InputStream is = null;
try {
is = connection.getInputStream();
} catch (IOException ioe) {
if (connection instanceof HttpURLConnection) {
HttpURLConnection httpConn = (HttpURLConnection) connection;
int statusCode = httpConn.getResponseCode();
if (statusCode != 200) {
is = httpConn.getErrorStream();
}
}
}
You need to do the following after calling openConnection (a sketch follows the list).
Cast the URLConnection to HttpURLConnection
Call getResponseCode
If the response is a success, use getInputStream, otherwise use getErrorStream
(The test for success should be 200 <= code < 300, because there are valid HTTP success codes apart from 200.)
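In code, those steps look roughly like this (a sketch only; url is assumed to be the URL you opened):
URLConnection connection = url.openConnection();
HttpURLConnection httpConn = (HttpURLConnection) connection; // 1. cast
int code = httpConn.getResponseCode();                       // 2. get the response code
InputStream body = (code >= 200 && code < 300)               // 3. pick the right stream
        ? httpConn.getInputStream()
        : httpConn.getErrorStream();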
I am making a crawler, and need to get the data from the stream regardless if it is a 200 or not.
Just be aware that if the code is a 4xx or 5xx, then the "data" is likely to be an error page of some kind.
The final point that should be made is that you should always respect the "robots.txt" file ... and read the Terms of Service before crawling / scraping the content of a site whose owners might care. Simply blatting off GET requests is likely to annoy site owners ... unless you've already come to some sort of "arrangement" with them.
I'm trying to find Java's equivalent to Groovy's:
String content = "http://www.google.com".toURL().getText();
I want to read content from a URL into string. I don't want to pollute my code with buffered streams and loops for such a simple task. I looked into apache's HttpClient but I also don't see a one or two line implementation.
Now that some time has passed since the original answer was accepted, there's a better approach:
String out = new Scanner(new URL("http://www.google.com").openStream(), "UTF-8").useDelimiter("\\A").next();
If you want a slightly fuller implementation, which is not a single line, do this:
public static String readStringFromURL(String requestURL) throws IOException
{
try (Scanner scanner = new Scanner(new URL(requestURL).openStream(),
StandardCharsets.UTF_8.toString()))
{
scanner.useDelimiter("\\A");
return scanner.hasNext() ? scanner.next() : "";
}
}
This answer refers to an older version of Java. You may want to look at ccleve's answer.
Here is the traditional way to do this:
import java.net.*;
import java.io.*;
public class URLConnectionReader {
public static String getText(String url) throws Exception {
URL website = new URL(url);
URLConnection connection = website.openConnection();
BufferedReader in = new BufferedReader(
new InputStreamReader(
connection.getInputStream()));
StringBuilder response = new StringBuilder();
String inputLine;
while ((inputLine = in.readLine()) != null)
response.append(inputLine);
in.close();
return response.toString();
}
public static void main(String[] args) throws Exception {
String content = URLConnectionReader.getText(args[0]);
System.out.println(content);
}
}
As @extraneon has suggested, IOUtils allows you to do this in a very eloquent way that's still in the Java spirit:
InputStream in = new URL( "http://jakarta.apache.org" ).openStream();
try {
System.out.println( IOUtils.toString( in ) );
} finally {
IOUtils.closeQuietly(in);
}
Or just use Apache Commons IOUtils.toString(URL url), or the variant that also accepts an encoding parameter.
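For instance (assuming Commons IO 2.3+, where the overload taking a Charset exists):
import java.net.URL;
import java.nio.charset.StandardCharsets;
import org.apache.commons.io.IOUtils;

String content = IOUtils.toString(new URL("http://www.google.com"), StandardCharsets.UTF_8);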
There's an even better way as of Java 9:
URL u = new URL("http://www.example.com/");
try (InputStream in = u.openStream()) {
return new String(in.readAllBytes(), StandardCharsets.UTF_8);
}
Like the original Groovy example, this assumes that the content is UTF-8 encoded. (If you need something cleverer than that, you need to create a URLConnection and use it to figure out the encoding.)
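A rough sketch of that fallback, with naive Content-Type parsing and a UTF-8 default as my own assumptions (readAllBytes needs Java 9+):
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

URLConnection conn = new URL("http://www.example.com/").openConnection();
// Crude way to honour the declared encoding; real parsing of the Content-Type
// header (quoted values, defaults per media type) needs more care.
String contentType = conn.getContentType(); // e.g. "text/html; charset=ISO-8859-1"
Charset charset = StandardCharsets.UTF_8;   // fallback assumption
if (contentType != null) {
    for (String param : contentType.split(";")) {
        param = param.trim();
        if (param.toLowerCase().startsWith("charset=")) {
            charset = Charset.forName(param.substring("charset=".length()));
        }
    }
}
try (InputStream in = conn.getInputStream()) {
    String body = new String(in.readAllBytes(), charset);
    System.out.println(body);
}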
Now that more time has passed, here's a way to do it in Java 8:
URLConnection conn = url.openConnection();
try (BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
pageText = reader.lines().collect(Collectors.joining("\n"));
}
Additional example using Guava:
URL xmlData = ...
String data = Resources.toString(xmlData, Charsets.UTF_8);
Java 11+:
URI uri = URI.create("http://www.google.com");
HttpRequest request = HttpRequest.newBuilder(uri).build();
String content = HttpClient.newHttpClient().send(request, BodyHandlers.ofString()).body();
If you have the input stream (see Joe's answer), also consider IOUtils.toString(inputStream).
http://commons.apache.org/io/api-1.4/org/apache/commons/io/IOUtils.html#toString(java.io.InputStream)
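For example (the Charset overload; in is the InputStream you already have):
import java.nio.charset.StandardCharsets;
import org.apache.commons.io.IOUtils;

String text = IOUtils.toString(in, StandardCharsets.UTF_8); // 'in' is your InputStream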
The following works with Java 7/8 and secure URLs, and shows how to add a cookie to your request as well. Note this is mostly a direct copy of another great answer on this page, but with the cookie example added and clarification that it works with secure URLs as well ;-)
If you need to connect to a server with an invalid or self-signed certificate, this will throw security errors unless you import the certificate. If you need that functionality, you could consider the approach detailed in this answer to a related question on Stack Overflow.
Example
String result = getUrlAsString("https://www.google.com");
System.out.println(result);
outputs
<!doctype html><html itemscope="" .... etc
Code
import java.net.URL;
import java.net.URLConnection;
import java.io.BufferedReader;
import java.io.InputStreamReader;
public static String getUrlAsString(String url)
{
try
{
URL urlObj = new URL(url);
URLConnection con = urlObj.openConnection();
con.setDoOutput(true); // note: only needed if you send a request body; doInput (reading the response) is on by default
con.setRequestProperty("Cookie", "myCookie=test123");
con.connect();
BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream()));
StringBuilder response = new StringBuilder();
String inputLine;
String newLine = System.getProperty("line.separator");
while ((inputLine = in.readLine()) != null)
{
response.append(inputLine + newLine);
}
in.close();
return response.toString();
}
catch (Exception e)
{
throw new RuntimeException(e);
}
}
Here's Jeanne's lovely answer, but wrapped in a tidy function for muppets like me:
private static String getUrl(String aUrl) throws MalformedURLException, IOException
{
String urlData = "";
URL urlObj = new URL(aUrl);
URLConnection conn = urlObj.openConnection();
try (BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8)))
{
urlData = reader.lines().collect(Collectors.joining("\n"));
}
return urlData;
}
URL to String in pure Java
Example call to get the payload from an HTTP GET call:
String str = getStringFromUrl("YourUrl");
Implementation
You can use the method described in this answer on how to read a URL into an InputStream and combine it with this answer on how to read an InputStream into a String.
The outcome will be something like:
public String getStringFromUrl(URL url) throws IOException {
return inputStreamToString(urlToInputStream(url,null));
}
public String inputStreamToString(InputStream inputStream) throws IOException {
try(ByteArrayOutputStream result = new ByteArrayOutputStream()) {
byte[] buffer = new byte[1024];
int length;
while ((length = inputStream.read(buffer)) != -1) {
result.write(buffer, 0, length);
}
return result.toString(StandardCharsets.UTF_8.name());
}
}
private InputStream urlToInputStream(URL url, Map<String, String> args) {
HttpURLConnection con = null;
InputStream inputStream = null;
try {
con = (HttpURLConnection) url.openConnection();
con.setConnectTimeout(15000);
con.setReadTimeout(15000);
if (args != null) {
for (Entry<String, String> e : args.entrySet()) {
con.setRequestProperty(e.getKey(), e.getValue());
}
}
con.connect();
int responseCode = con.getResponseCode();
/* By default the connection will follow redirects. The following
* block is only entered if the implementation of HttpURLConnection
* does not perform the redirect. The exact behavior depends to
* the actual implementation (e.g. sun.net).
* !!! Attention: This block allows the connection to
* switch protocols (e.g. HTTP to HTTPS), which is NOT
* default behavior. See: https://stackoverflow.com/questions/1884230
* for more info!!!
*/
if (responseCode < 400 && responseCode > 299) {
String redirectUrl = con.getHeaderField("Location");
try {
URL newUrl = new URL(redirectUrl);
return urlToInputStream(newUrl, args);
} catch (MalformedURLException e) {
URL newUrl = new URL(url.getProtocol() + "://" + url.getHost() + redirectUrl);
return urlToInputStream(newUrl, args);
}
}
/*!!!!!*/
inputStream = con.getInputStream();
return inputStream;
} catch (Exception e) {
throw new RuntimeException(e);
}
}
Pros
It is pure java
It can be easily enhanced by adding different headers as a map (instead of passing a null object, like the example above does), authentication, etc.; see the usage sketch after this list
Handling of protocol switches is supported
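A hypothetical usage sketch for that enhancement (the header names, cookie value, and URL are placeholders; urlToInputStream is private above, so in practice you would expose an overload of getStringFromUrl that accepts the map):
import java.net.URL;
import java.util.HashMap;
import java.util.Map;

Map<String, String> headers = new HashMap<>();
headers.put("Accept", "application/json");
headers.put("Cookie", "myCookie=test123");

// Reuses the two methods shown above to fetch the body with custom headers.
String payload = inputStreamToString(
        urlToInputStream(new URL("https://www.example.com/api"), headers));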
I'm making calls to Alfresco webscripts which return JSON. I do this using GET requests, which all work perfectly. If I do a file POST, however, the Alfresco server receives the file correctly and sends back a JSON response, but this time the response causes the browser to prompt for a download instead of letting JavaScript process the callback.
Now all these calls are going through a "home-made" reverse proxy (see below) which uses HttpURLConnection. This proxy routes all the calls to an Alfresco running on another host. Everything else works fine (PNGs, text, HTML, GET requests, even authentication). In both GET and POST responses the Content-Type is "application/json;charset=UTF-8".
Many thanks for any responses.
import java.io.*;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.*;
import javax.servlet.*;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import org.apache.commons.codec.binary.Base64;
public class ReverseProxy extends GenericServlet{
public static final String SERVER_URL = "serverURL";
protected String serverURL;
protected boolean debug;
public ReverseProxy(){
}
public void init(ServletConfig config) throws ServletException {
super.init(config);
debug = Boolean.valueOf(config.getInitParameter("debug")).booleanValue();
serverURL = config.getInitParameter("serverURL");
if(serverURL == null){
throw new ServletException("ReverseProxy servlet initialization parameter 'serverURL' not defined");
}
}
public void service(ServletRequest req, ServletResponse resp) throws ServletException, IOException {
InputStream inputStream;
OutputStream outputStream;
if(debug){System.out.println("ReverseProxy.service()");}
HttpServletRequest request;
HttpServletResponse response;
try{
request = (HttpServletRequest)req;
response = (HttpServletResponse)resp;
}
catch(ClassCastException e){
throw new ServletException("non-HTTP request or response");
}
String method = request.getMethod();
StringBuffer urlBuffer = new StringBuffer();
urlBuffer.append(serverURL);
urlBuffer.append(request.getServletPath());
if(request.getPathInfo() != null)
urlBuffer.append(request.getPathInfo());
if(request.getQueryString() != null){
urlBuffer.append('?');
urlBuffer.append(request.getQueryString());
}
URL url = new URL(urlBuffer.toString());
//pass authentication
String user=null, password=null;
Set entrySet = req.getParameterMap().entrySet();
Map headers = new HashMap();
for ( Object anEntrySet : entrySet ) {
Map.Entry header = (Map.Entry) anEntrySet;
String key = (String) header.getKey();
String value = ((String[]) header.getValue())[0];
if ("user".equals(key)) {
user = value;
} else if ("password".equals(key)) {
password = value;
}else {
headers.put(key, value);
}
}
String userpass = null;
if (user != null && password != null) {
userpass = user+":"+password;
}
String auth = request.getHeader("Authorization");
if(auth != null){
if (auth.toUpperCase().startsWith("BASIC ")){
String userpassEncoded = auth.substring(6);
userpass = new String(Base64.decodeBase64(userpassEncoded.getBytes()));
}
}
String digest=null;
if (userpass!=null) {
if(debug){System.out.println("ReverseProxy found userpass:" + userpass);}
digest = "Basic " + new String(Base64.encodeBase64((userpass).getBytes()));
}
else{
if(debug){System.out.println("ReverseProxy found no auth credentials");}
}
//do connection
HttpURLConnection connection = null;
connection = (HttpURLConnection) url.openConnection();
if (digest != null) {connection.setRequestProperty("Authorization", digest);}
connection.setRequestMethod(method);
connection.setDoInput(true);
if(method.equals("POST")){
if(request.getHeader("Content-Type") != null){
if(debug){System.out.println("ReverseProxy Content-Type: " + request.getHeader("Content-Type"));}
if(debug){System.out.println("ReverseProxy Content-Length: " + request.getHeader("Content-Length"));}
if(request.getHeader("Content-Type").indexOf("multipart/form-data") != -1){
connection.setRequestProperty("Content-Type", request.getHeader("Content-Type"));
connection.setRequestProperty("Content-Length", request.getHeader("Content-Length"));
}
}
connection.setDoOutput(true);
}
if(debug){
System.out.println((new StringBuilder()).append("ReverseProxy: URL=").append(url).append(" method=").append(method).toString());
}
//set headers
Set headersSet = headers.entrySet();
for ( Object aHeadersSet : headersSet ) {
Map.Entry header = (Map.Entry) aHeadersSet;
connection.setRequestProperty((String) header.getKey(), (String) header.getValue());
}
connection.connect();
inputStream = null;
outputStream = null;
try{
if(method.equals("POST")){
javax.servlet.ServletInputStream servletInputStream = request.getInputStream();
outputStream = connection.getOutputStream();
copy(servletInputStream, outputStream);
}
response.setContentLength(connection.getContentLength());
response.setContentType(connection.getContentType());
if(debug){System.out.println("ReverseProxy Connection Content-Type: " + connection.getContentType());}
response.setCharacterEncoding(connection.getContentEncoding());
String cacheControl = connection.getHeaderField("Cache-Control");
if(cacheControl != null){
response.setHeader("Cache-Control", cacheControl);
}
int responseCode = connection.getResponseCode();
response.setStatus(responseCode);
if(responseCode == 401){
response.setHeader("WWW-Authenticate", "Basic realm=\"Login Required\"");
}
for( Iterator i = connection.getHeaderFields().entrySet().iterator() ; i.hasNext() ;){
Map.Entry mapEntry = (Map.Entry)i.next();
if(mapEntry.getKey()!=null){
response.setHeader(mapEntry.getKey().toString(), ((List)mapEntry.getValue()).get(0).toString());
}
}
//if(debug){System.out.println("ReverseProxy Connection Content-Disposition: " + connection.getHeaderField("Content-Disposition"));}
if(debug){System.out.println((new StringBuilder()).append("ReverseProxy: response code '").append(responseCode).append("' from ").append(url).toString());}
if (responseCode == 200 || responseCode == 201) {
inputStream = connection.getInputStream();
}
else{
inputStream = connection.getErrorStream();
}
javax.servlet.ServletOutputStream servletOutputStream = response.getOutputStream();
copy(inputStream, servletOutputStream);
}
catch(IOException ex){
if(debug)
ex.printStackTrace();
throw ex;
}
finally{
if(inputStream != null){
inputStream.close();
}
if(outputStream != null){
outputStream.close();
}
}
}
public long copy(InputStream input, OutputStream output) throws IOException{
byte buffer[] = new byte[4096];
long count = 0L;
for(int n = 0; -1 != (n = input.read(buffer));){
output.write(buffer, 0, n);
count += n;
}
output.flush();
if(debug)
System.err.println((new StringBuilder()).append("copy ").append(count).append(" bytes").toString());
return count;
}
}
I guess the problem is more on the client side, or a misconception on your side. It's correct behaviour for the browser to prompt to download the file when it has a content type of application/json, because the browser itself doesn't know how to handle it. The browser can only display content whose type matches text/* or image/*.
Normally, JSON responses are to be handled internally by JavaScript, which can perfectly handle Ajax responses with a content type of application/json. You can test this by changing it to text/plain or text/javascript; you'll see that the browser will display it (because it matches text/*). But for JSON the correct content type is indeed application/json. Just keep it as is and use the right tools to download/open the JSON ;)
Solved (as per my comment)
If the request is an XMLHttpRequest sent from JavaScript, then the "application/json" content type will be understood and a download will not occur. This is true for both GET and POST requests. If one is doing a file upload, libraries such as jQuery, ExtJS etc. create a hidden form with a content type of "application/x-www-form-urlencoded" and post it (all without the user's interaction). This means the response is being interpreted by the browser, not JavaScript. The only way around this is to set the content type of the returned JSON to "text/html" (NOT "text/plain", or else the browser tries to add tags).
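A minimal servlet-style sketch of that workaround (the class name and the JSON payload are illustrative only, not part of the proxy above):
import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class UploadResponseServlet extends HttpServlet {
    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        String json = "{\"status\":\"ok\"}"; // whatever JSON the webscript produced
        // Deliberately text/html (not application/json) so the hidden-form upload
        // callback can read it instead of the browser offering a download.
        resp.setContentType("text/html;charset=UTF-8");
        resp.getWriter().write(json);
    }
}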