I have an InputStream from an HttpURLConnection. The InputStream is passed as a property to an object whose getter is later called by the Struts2 framework to provide the stream directly to the user's browser. Although the code seems to work as expected, I am worried that I cannot properly close the HttpURLConnection, as doing so would invalidate my input stream before it is read by the user's browser. The code is as follows:
private void DownloadOutput(DownloadableObject retVal, URL u, String cookie) {
    try {
        HttpURLConnection conn = (HttpURLConnection) u.openConnection();
        conn.setRequestMethod("GET");
        conn.setRequestProperty("Cookie", cookie);
        Map<String, List<String>> headers = conn.getHeaderFields();
        retVal.setContentLength(conn.getContentLength());
        retVal.setStream(new BufferedInputStream(conn.getInputStream()));
        // in.close();
        // conn.disconnect();
    } catch (MalformedURLException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
Is there any suggestion as to what the optimal approach would be? I assume the GC will eventually clean up the HttpURLConnection object, but it is good to do some housekeeping proactively. I also assume that the new BufferedInputStream passed into the proxy object will be closed by the underlying Struts framework(?).
Convert it to a String and then set that on the object: Read/convert an InputStream to a String
A stream basically represents a handle to an input/output source; when you close it, the reference loses that handle: http://docs.oracle.com/javase/tutorial/essential/io/streams.html
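For reference, a minimal sketch of the read-into-a-String approach suggested above (only practical for small text bodies, since the whole response is buffered in memory; the method name is just illustrative):

static String readStreamToString(InputStream in) throws IOException {
    // Drain the stream into an in-memory buffer, then decode as UTF-8.
    ByteArrayOutputStream out = new ByteArrayOutputStream();
    byte[] buffer = new byte[8192];
    int n;
    while ((n = in.read(buffer)) != -1) {
        out.write(buffer, 0, n);
    }
    return out.toString("UTF-8");
}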
It looks like the cleaner way is to subclass the input stream and override close(). That way, when Struts calls close() after having read the stream, you can also close your connection:
private class mytest extends BufferedInputStream {
    private HttpURLConnection aConn;

    public mytest(HttpURLConnection conn, InputStream in) {
        super(in);
        this.aConn = conn;
    }

    public mytest(HttpURLConnection conn, InputStream in, int size) {
        super(in, size);
        this.aConn = conn;
    }

    @Override
    public void close() throws IOException {
        super.close();
        System.out.println("The stream has been closed, time to close the connection");
        aConn.disconnect();
        System.out.println("Connection has been disconnected");
    }
}
So the above object is the stream that will be set in the action as the inputStream parameter.
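For example, a minimal sketch of how the original DownloadOutput method could use the subclass above (assuming the same mytest class and DownloadableObject from the question):

HttpURLConnection conn = (HttpURLConnection) u.openConnection();
conn.setRequestMethod("GET");
conn.setRequestProperty("Cookie", cookie);
retVal.setContentLength(conn.getContentLength());
// Wrap the connection's stream so that closing the stream also disconnects.
retVal.setStream(new mytest(conn, conn.getInputStream()));
// No explicit close here: Struts closes the stream after streaming it to the
// browser, and mytest.close() then calls conn.disconnect().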
I am trying to terminate a connection, when no data is being received or the server just keeps the connection open for a URL, by setting the connect and read timeouts.
I have created an anonymous subclass of UrlResource and fetch the data from the URL. The code block below is from a Spring project; the Spring Boot version is 2.7.1.
try {
    URL url = new URL("http://httpstat.us/200?sleep=20000");
    UrlResource urlResource = new UrlResource(url) {
        @Override
        protected void customizeConnection(HttpURLConnection connection) throws IOException {
            super.customizeConnection(connection);
            connection.setConnectTimeout(4000);
            connection.setReadTimeout(2000);
        }
    };
    InputStream inputStream = urlResource.getInputStream();
    InputStreamReader isr = new InputStreamReader(inputStream,
            StandardCharsets.UTF_8);
    BufferedReader br = new BufferedReader(isr);
    br.lines().forEach(line -> System.out.println(line));
} catch (MalformedURLException e) {
    e.printStackTrace();
} catch (IOException e) {
    System.out.println("IO exception");
    e.printStackTrace();
}
I am using a service (http://httpstat.us/200?sleep=20000) that holds the connection open for a specified amount of time so I can test the connection termination, but the connection is not terminated after the configured timeout.
Is there any other way to customize the UrlResource so that the timeouts can be set?
It looks like UrlResource.getInputStream() is missing a call to customizeConnection(con) in its logic:
public InputStream getInputStream() throws IOException {
    URLConnection con = this.url.openConnection();
    ResourceUtils.useCachesIfNecessary(con);
    try {
        return con.getInputStream();
    }
    catch (IOException ex) {
        // Close the HTTP connection (if applicable).
        if (con instanceof HttpURLConnection httpConn) {
            httpConn.disconnect();
        }
        throw ex;
    }
}
Please raise a GitHub issue for Spring Framework to address this problem.
As a workaround, I see this:
UrlResource urlResource = new UrlResource(url) {
    @Override
    public InputStream getInputStream() throws IOException {
        URLConnection con = getURL().openConnection();
        customizeConnection(con);
        try {
            return con.getInputStream();
        }
        catch (IOException ex) {
            // Close the HTTP connection (if applicable).
            if (con instanceof HttpURLConnection httpConn) {
                httpConn.disconnect();
            }
            throw ex;
        }
    }

    @Override
    protected void customizeConnection(HttpURLConnection connection) throws IOException {
        super.customizeConnection(connection);
        connection.setReadTimeout(2000);
    }
};
So I override getInputStream() with the same logic, but also apply our customizeConnection() to it. With that fix, your test fails like this:
java.net.SocketTimeoutException: Read timed out
at java.base/sun.nio.ch.NioSocketImpl.timedRead(NioSocketImpl.java:283)
at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:309)
at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:350)
at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:803)
at java.base/java.net.Socket$SocketInputStream.read(Socket.java:966)
at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:244)
at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:284)
at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:343)
at java.base/sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:791)
at java.base/sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:726)
at java.base/sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1688)
at java.base/sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1589)
I have written a simple method to post data to a URL and consume the output. I have tried multiple ways to consume the output, but no success yet:
public void postToUrl(final String theurl, final String query, final Callable<Void> myMethod) {
    String urlData = "";
    new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                String urlData = "";
                String url = baseurl + theurl + "/?" + query;
                StrictMode.ThreadPolicy policy = new StrictMode.ThreadPolicy.Builder().permitAll().build();
                StrictMode.setThreadPolicy(policy);
                URLConnection connection = new URL(url).openConnection();
                connection.setDoOutput(true);
                connection.connect();
                InputStream response = connection.getInputStream();
                BufferedReader reader = new BufferedReader(new InputStreamReader(response));
                String line = "";
                while ((line = reader.readLine()) != null) {
                    urlData += line;
                }
                reader.close();
                // How to consume urlData here?
                // myMethod.call(); does not accept a parameter
                try {
                    myMethod.call(urlData);
                } catch (Exception e) {
                    e.printStackTrace();
                }
            } catch (MalformedURLException e) {
                e.printStackTrace();
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    }).start();
}
I have to wrap the method inside a Runnable, and I cannot expect a Runnable's run() method to return a value.
I tried to consume the output when it is ready inside the Runnable, but I need to call a third-party method to pass the output along. I found Callable, but it does not accept parameters.
I have read the accepted answer here, but it requires defining a new class for every method.
I have also read this Q/A, but it suggests defining an interface instead of a method, which I believe is not the proper use of an interface.
If this is not the proper way to call and consume a URL, how do you manage multiple client-server requests in an application? Do you rewrite the code above for every type of request? Do you really define a new class or a new interface for every client-server interaction?
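For what it's worth, the interface-based callback the linked Q/A describes could look something like the following minimal sketch (the names here are hypothetical, not from any library):

// Hypothetical single-method callback interface instead of Callable<Void>.
public interface UrlDataCallback {
    void onUrlData(String urlData);
}

// Inside run(), once urlData is ready:
//     callback.onUrlData(urlData);
// and the caller supplies the behaviour as a lambda, e.g.:
//     postToUrl("endpoint", "q=1", data -> System.out.println(data));

Because it has a single abstract method, it works as a functional interface, so no new class is needed per request.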
I've seen in several answers here that, in order to reuse a connection, I need to close the input stream and output stream.
Before I closed the streams, I was seeing too many CLOSE_WAIT connections.
But now, in the case of an error response (for instance 404 file not found),
I'm handling the response this way:
public static void handleIOError(URLConnection conn)
{
    InputStream es = null;
    HttpURLConnection urlConn = null;
    if (conn instanceof HttpURLConnection)
    {
        urlConn = (HttpURLConnection) conn;
        es = urlConn.getErrorStream();
    }
    if (es != null)
    {
        try
        {
            while (es.read() > -1) {}
        }
        catch (IOException e)
        {
            logger.error("Unable to close input stream", e);
        }
        finally
        {
            IOUtils.closeQuietly(es);
        }
    }
}
So I'm closing the error stream, but I still have multiple CLOSE_WAIT connections. I'm guessing I didn't close the socket's stream, but which one? As you may know, in case of an error I cannot call urlConn.getInputStream().
My server is Tomcat 8.
Any idea how to solve the socket leak?
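One thing worth trying (a sketch, not a verified fix for this setup): CLOSE_WAIT means the remote side has closed the connection but the JVM is still holding the socket, so after draining and closing the error stream you can force the underlying socket shut with disconnect() when you don't want the connection kept alive, e.g. in the finally block above:

finally
{
    IOUtils.closeQuietly(es);
    // Forcibly close the underlying socket instead of returning it to the
    // keep-alive cache; urlConn is non-null here because es was non-null.
    urlConn.disconnect();
}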
I'm using the new java.net.http classes to handle asynchronous HTTP request+response exchanges, and I'm trying to find a way to have the BodySubscriber handle different encoding types such as gzip.
However, mapping a BodySubscriber<InputStream> so that the underlying stream is wrapped by a GZIPInputStream (when "Content-Encoding: gzip" is found in the response header) leads to a hang. No exceptions, just a total cessation of activity.
The code which maps the BodySubscriber looks like this:
private HttpResponse.BodySubscriber<InputStream> gzippedBodySubscriber(
        HttpResponse.ResponseInfo responseInfo) {
    return HttpResponse.BodySubscribers.mapping(
            HttpResponse.BodySubscribers.ofInputStream(),
            this::decodeGzipStream);
}

private InputStream decodeGzipStream(InputStream gzippedStream) {
    System.out.println("Entered decodeGzipStream method.");
    try {
        InputStream decodedStream = new GZIPInputStream(gzippedStream);
        System.out.println(
                "Created GZIPInputStream to handle response body stream.");
        return decodedStream;
    } catch (IOException ex) {
        System.out.println("IOException occurred while trying to create GZIPInputStream.");
        throw new UncheckedIOException(ex);
    }
}
Receiving an HTTP response which has "gzip" encoding leads to the console showing just this:
Entered EncodedBodyHandler.apply method.
Entered decodeGzipStream method.
Nothing more is seen, so the line after the call to the GZIPInputStream constructor is never executed.
Does anyone know why this attempt to wrap the InputStream from a BodySubscriber<InputStream> in a GZIPInputStream is hanging?
Note: the equivalent method for unencoded (raw text) HTTP response bodies contains simply a call to BodySubscribers.ofInputStream() with no mapping, and this allows the response to be received and displayed without problem.
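For comparison, that working unencoded variant is essentially just the following (the method name is only illustrative):

private HttpResponse.BodySubscriber<InputStream> rawBodySubscriber(
        HttpResponse.ResponseInfo responseInfo) {
    // No mapping: hand the response body stream straight through.
    return HttpResponse.BodySubscribers.ofInputStream();
}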
EDIT: JDK-8217264 has been fixed since JDK 13.
This is indeed a bug. I have logged JDK-8217264. I can suggest two work-arounds:
Workaround one
Do not use BodySubscribers.mapping - but transform the InputStream into a GZIPInputStream after getting the HttpResponse's body:
GZIPInputStream gzin = new GZIPInputStream(resp.body());
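For instance, a sketch assuming an HttpClient (client) and an HttpRequest (request) are already in place:

HttpResponse<InputStream> resp = client.send(
        request, HttpResponse.BodyHandlers.ofInputStream());
// Only wrap in GZIPInputStream when the response is actually gzip-encoded.
InputStream body = resp.body();
if ("gzip".equalsIgnoreCase(
        resp.headers().firstValue("Content-Encoding").orElse(""))) {
    body = new GZIPInputStream(body);
}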
Workaround two
Have the mapping function return a Supplier<InputStream> instead, taking care not to create the GZIPInputStream until Supplier::get is called
static final class ISS implements Supplier<InputStream> {
    final InputStream in;
    GZIPInputStream gz;

    ISS(InputStream in) {
        this.in = in;
    }

    public synchronized InputStream get() {
        if (gz == null) {
            try {
                gz = new GZIPInputStream(in);
            } catch (IOException t) {
                throw new UncheckedIOException(t);
            }
        }
        return gz;
    }
}
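The mapping then produces a Supplier<InputStream> rather than an InputStream, so the GZIPInputStream is only created when the body is actually consumed. A sketch of how that might be wired up:

HttpResponse.BodySubscriber<Supplier<InputStream>> subscriber =
        HttpResponse.BodySubscribers.mapping(
                HttpResponse.BodySubscribers.ofInputStream(),
                ISS::new);
// Later, on the response: resp.body().get() yields the decoded stream.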
Encountered the exact same problem. I tried the example in the Javadoc of the BodySubscribers.mapping method. Same behavior, the application hangs without any errors.
Could be a bug, because this is an official example from the Javadoc.
public static <W> BodySubscriber<W> asJSON(Class<W> targetType) {
    BodySubscriber<InputStream> upstream = BodySubscribers.ofInputStream();
    BodySubscriber<W> downstream = BodySubscribers.mapping(
            upstream,
            (InputStream is) -> {
                try (InputStream stream = is) {
                    ObjectMapper objectMapper = new ObjectMapper();
                    return objectMapper.readValue(stream, targetType);
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            });
    return downstream;
}
So I'm simply trying to fetch the user's profile photo from Facebook, but I'm getting a null response from facebook.request(path) and the IOException "Hostname fbcdn-profile-a.akamaihd.net was not verified".
Anyone know what could be causing this exception? Here's my method to call the facebook.request:
public Bitmap getUserPic(String path) {
    URL picURL = null;
    try {
        responsePic = facebook.request(path);
        picURL = new URL(responsePic);
        HttpURLConnection conn = (HttpURLConnection) picURL.openConnection();
        conn.setDoInput(true);
        conn.connect();
        InputStream is = conn.getInputStream();
        userPic = BitmapFactory.decodeStream(is);
    } catch (IOException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    } catch (FacebookError e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
    return userPic;
}
The string "path" is "me/picture"
Edit:
Also tried setting picURL to "https://fbcdn-profile-a.akamaihd.net/hprofile-ak-snc4/260885_608260639_822979518_q.jpg" which is the url that the request should return. Still no photo :(
Thanks for any help
It sounds like an issue with the HTTPS connection used to fetch the image from the Facebook CDN. What happens if you request the regular HTTP version of the image?
E.g. http://fbcdn-profile-a.akamaihd.net/hprofile-ak-snc4/260885_608260639_822979518_q.jpg
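If that works, a quick way to test it in your code (a hypothetical workaround, just for diagnosis) is to downgrade the scheme before opening the connection:

// Hypothetical test: request the plain-HTTP CDN URL so the Akamai
// hostname-verification failure is bypassed.
String httpUrl = responsePic.replaceFirst("^https://", "http://");
picURL = new URL(httpUrl);
HttpURLConnection conn = (HttpURLConnection) picURL.openConnection();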