Sockets of HttpURLConnection are leaked - java

I'm using OpenJDK 11 on Linux and I need to make sure all my web requests made with HttpURLConnection are properly closed and do not keep any file descriptors open.
Oracle's documentation says to call close() on the InputStream, while Android's documentation says to call disconnect() on the HttpURLConnection object.
I also set the Connection: close request header and http.keepAlive to false to avoid connection pooling.
This seems to work with plain HTTP requests, but not with HTTPS requests whose response is sent with non-chunked encoding. Only a GC seems to clean up the closed connections.
This example code:
import java.io.BufferedReader;
import java.io.File;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.stream.Stream;
public class Test {
    private static int printFds() throws IOException {
        int cnt = 0;
        try (Stream<Path> paths = Files.list(new File("/proc/self/fd").toPath())) {
            for (Path path : (Iterable<Path>) paths::iterator) {
                System.out.println(path);
                ++cnt;
            }
        }
        System.out.println();
        return cnt;
    }

    public static void main(String[] args) throws IOException, InterruptedException {
        System.setProperty("http.keepAlive", "false");
        for (int i = 0; i < 10; i++) {
            // Must be a https endpoint returning non-chunked response
            HttpURLConnection conn = (HttpURLConnection) new URL("https://www.google.com/").openConnection();
            conn.setRequestProperty("Connection", "close");
            BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()));
            while (in.readLine() != null) {
            }
            in.close();
            conn.disconnect();
            conn = null;
            in = null;
        }
        Thread.sleep(1000);
        int numBeforeGc = printFds();
        System.gc();
        Thread.sleep(1000);
        int numAfterGc = printFds();
        System.out.println(numBeforeGc == numAfterGc ? "No socket leaks" : "Sockets were leaked");
    }
}
prints this output:
/proc/self/fd/0
/proc/self/fd/1
/proc/self/fd/2
/proc/self/fd/3
/proc/self/fd/4
/proc/self/fd/5
/proc/self/fd/9
/proc/self/fd/6
/proc/self/fd/7
/proc/self/fd/8
/proc/self/fd/10
/proc/self/fd/11
/proc/self/fd/12
/proc/self/fd/13
/proc/self/fd/14
/proc/self/fd/15
/proc/self/fd/16
/proc/self/fd/17
/proc/self/fd/18
/proc/self/fd/19
/proc/self/fd/0
/proc/self/fd/1
/proc/self/fd/2
/proc/self/fd/3
/proc/self/fd/4
/proc/self/fd/5
/proc/self/fd/9
/proc/self/fd/6
/proc/self/fd/7
/proc/self/fd/8
Sockets were leaked
Changing to an http URL makes the sockets close correctly, as expected, without a GC:
/proc/self/fd/0
/proc/self/fd/1
/proc/self/fd/2
/proc/self/fd/3
/proc/self/fd/4
/proc/self/fd/5
/proc/self/fd/6
/proc/self/fd/0
/proc/self/fd/1
/proc/self/fd/2
/proc/self/fd/3
/proc/self/fd/4
/proc/self/fd/5
/proc/self/fd/6
No socket leaks
Tested with both OpenJDK 11 and 12. Did I miss something or is this a bug?

It turns out to be a bug after all: https://bugs.openjdk.java.net/browse/JDK-8216326
The shutdownInput call has been replaced by close in the latest builds of JDK 11 and 13 (but not 12).
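For completeness, the close pattern itself is not at fault; the sketch below shows the usual cleanup assumed throughout this question (try-with-resources on the stream, drain the body, then disconnect). On JDK builds affected by the bug above, the underlying TLS socket is still only reclaimed at GC time.
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class CleanClose {
    public static void main(String[] args) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL("https://www.google.com/").openConnection();
        try (InputStream in = conn.getInputStream()) {
            byte[] buf = new byte[8192];
            while (in.read(buf) != -1) {
                // drain the body so the connection can be released in an orderly way
            }
        } finally {
            conn.disconnect(); // releases the underlying socket on unaffected JDK builds
        }
    }
}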

Related

SSL exception in Java 6 but not in Java 8

Hi,
I am trying to connect a Java program to a REST API.
With the same piece of code I get a Java exception in Java 6, while it works fine in Java 8.
It's the same environment:
truststore
machine
Unix user
The code:
import java.io.DataInputStream;
import java.io.IOException;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLConnection;
public class MainClass {
    public static void main(String[] args) {
        String serviceUrl = "https://api.domain.com" + "/endpont/path";
        try {
            URL url = new URL(serviceUrl);
            URLConnection connection = null;
            try {
                connection = url.openConnection();
                connection.setRequestProperty("Accept", "application/json");
                connection.setRequestProperty("Content-Type", "application/json");
                String body = "";
                String inputLine;
                DataInputStream input = new DataInputStream(connection.getInputStream());
                while ((inputLine = input.readLine()) != null) {
                    body += inputLine;
                }
                System.out.println(body);
            } catch (IOException e) {
                e.printStackTrace();
            }
        } catch (MalformedURLException e) {
            e.printStackTrace();
        }
    }
}
The error in Java 6: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
Does anybody know why it behaves differently? Is there a trick to get the same result in Java 6?
The CN of the certificate is a wildcard: "*.domain.com". Could that be the cause?
I tried several APIs, but they all use the Sun SSL layer. Do you know another one that could replace it?
The JRE has its own keystore (cacerts), where trusted certificates are stored. Maybe your Java 6 JDK/JRE ships different certificates than your Java 8 one.
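If that is the case, one way to test it (a sketch; the paths, alias, and truststore password are placeholders for your environment) is to import the missing CA certificate into the Java 6 cacerts with keytool, or to point the JVM at a truststore that already contains the full chain:
// Run once outside the program, against the Java 6 installation (placeholder paths):
//   keytool -importcert -alias myca -file ca.crt \
//       -keystore $JAVA_HOME/jre/lib/security/cacerts -storepass changeit
// Or select an explicit truststore at runtime before opening the connection:
public class TrustStoreSetup {
    public static void main(String[] args) throws Exception {
        System.setProperty("javax.net.ssl.trustStore", "/path/to/truststore.jks");  // placeholder path
        System.setProperty("javax.net.ssl.trustStorePassword", "changeit");         // placeholder password
        // ... then open the HTTPS connection exactly as in the question
    }
}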

Connection reset - Java (underlying socket status remains ESTABLISHED) on Azure VM

I am debugging a connection reset problem and need some help.
Here is the background:
Java 8, Apache HttpClient 4.5.2.
I have the following program, which runs successfully on Windows 10 and 7 but ends with a connection reset on an Azure Windows Server 2016 VM.
import java.io.IOException;
import java.util.HashMap;
import java.util.Iterator;
import java.util.Map;
import org.apache.commons.codec.binary.Base64;
import org.apache.http.Header;
import org.apache.http.HttpResponse;
import org.apache.http.client.ClientProtocolException;
import org.apache.http.client.config.RequestConfig;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.entity.StringEntity;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClientBuilder;
import org.apache.http.impl.conn.PoolingHttpClientConnectionManager;
import org.apache.http.util.EntityUtils;
public class TestConnectionReset
{
    static PoolingHttpClientConnectionManager connManager = new PoolingHttpClientConnectionManager();
    static {
        connManager.setMaxTotal(10);
        connManager.setDefaultMaxPerRoute(2);
    }

    public static void main(String[] args) throws ClientProtocolException, IOException, InterruptedException {
        while (true) {
            HttpClientBuilder clientBuilder = HttpClientBuilder.create();
            RequestConfig config = RequestConfig.custom().setConnectTimeout(1800000).setConnectionRequestTimeout(1800000)
                    .setSocketTimeout(1800000).build();
            clientBuilder.setDefaultRequestConfig(config);
            clientBuilder.setConnectionManager(connManager);
            String userName = "xxxxx";
            String password = "xxxxx";
            String userNamePasswordPair = String.valueOf(userName) + ":" + password;
            String authenticationData = "Basic " + new String((new Base64()).encode(userNamePasswordPair.getBytes()));
            HttpPost post = new HttpPost("https://url/rest/oauth/token");
            Map<String, String> requestBodyMap = new HashMap<String, String>();
            requestBodyMap.put("grant_type", "client_credentials");
            String req = getFormUrlEncodedBodyFromMap(requestBodyMap);
            StringEntity stringEntity = new StringEntity(req);
            post.setEntity(stringEntity);
            post.setHeader("Authorization", authenticationData);
            post.setHeader("Content-Type", "application/x-www-form-urlencoded");
            CloseableHttpClient closeableHttpClient = clientBuilder.build();
            HttpResponse response = closeableHttpClient.execute(post);
            Header[] hs = response.getAllHeaders();
            for (Header header : hs) {
                System.out.println(header.toString());
            }
            System.out.println(EntityUtils.toString(response.getEntity()));
            Thread.sleep(10 * 60 * 1000L);
        }
    }

    public static String getFormUrlEncodedBodyFromMap(Map<String, String> formData) {
        StringBuilder requestBody = new StringBuilder();
        Iterator<Map.Entry<String, String>> itrFormData = formData.entrySet().iterator();
        while (itrFormData.hasNext()) {
            Map.Entry<?, ?> entry = (Map.Entry) itrFormData.next();
            requestBody.append(entry.getKey()).append("=").append(entry.getValue());
            if (itrFormData.hasNext()) {
                requestBody.append("&");
            }
        }
        return requestBody.toString();
    }
}
I am using the pooling HttpClient connection manager. The first request in the first loop iteration succeeds, but the request in the subsequent iteration fails.
My findings
Looking at the underlying socket on Windows 10: after the first request the socket goes into the CLOSE_WAIT state, and the next request closes the existing connection and creates a new one.
The server actually closes the connection after about 5 minutes, but Windows 10 detects this and re-establishes the connection when the next request is triggered.
On Windows Server 2016, however, netstat shows the socket in the ESTABLISHED state. The pool therefore treats the connection as ready to use and picks it up again, but the server has already closed it, which results in the connection reset error.
I suspect an environmental issue where Server 2016 keeps the socket ESTABLISHED even after the server has terminated it, while on Windows 10 the socket status changes to CLOSE_WAIT.
Help on this is much appreciated.
Finally got the root cause.
It is an issue with Microsoft Azure: it uses SNAT and closes outbound TCP connections after a 4-minute idle time. This wasted 5 days of my time to figure out.
That means if you are connected to the server with keep-alive and expect to reuse the connection until the server times out and sends a FIN, Azure will kill the connection as soon as the idle period reaches 4 minutes. The worst part is that it does not even notify the server or the client with an RST, which violates TCP and puts its reliability in question.
clientBuilder.setKeepAliveStrategy(new ConnectionKeepAliveStrategy() {
    @Override
    public long getKeepAliveDuration(HttpResponse response, HttpContext context) {
        // Expire pooled connections after 3 minutes, before Azure's 4-minute SNAT timeout
        return 3 * 60 * 1000;
    }
});
Using the code above, I give pooled connections a 3-minute keep-alive expiry so they are closed before Azure kills them.
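Two related knobs in Apache HttpClient 4.5.x can complement the keep-alive strategy (a sketch reusing connManager and clientBuilder from the program above; behaviour should be verified against your exact HttpClient version): re-validating pooled connections that have been idle before reuse, and evicting idle connections in the background.
// Re-check pooled connections that have been idle for more than 10 seconds before reusing them
connManager.setValidateAfterInactivity(10 * 1000);

CloseableHttpClient client = clientBuilder
        .setConnectionManager(connManager)
        .evictIdleConnections(3, TimeUnit.MINUTES) // background eviction, below Azure's 4-minute SNAT timeout
        .build();

// Alternatively, close idle connections yourself from a periodic task:
connManager.closeIdleConnections(3, TimeUnit.MINUTES);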

Copy of InputStream blocks with Jetty HTTP client using an InputStreamResponseListener

I am using the Jetty 9.4.8 HTTP client and want to write a stream of incoming data to a file. Currently I am using an InputStreamResponseListener and IOUtils.copy(..), writing to a FileOutputStream. I have also tried Files.copy().
InputStreamResponseListener streamResponseListener = new InputStreamResponseListener();
request.send(streamResponseListener);
if (streamResponseListener.get(5, TimeUnit.MINUTES).getStatus() == 200) {
    InputStream inputStream = streamResponseListener.getInputStream();
    OutputStream outputStream = null;
    try {
        TMP_FILE.toFile().createNewFile();
        outputStream = new FileOutputStream(TMP_FILE.toFile());
        IOUtils.copy(inputStream, outputStream);
    } catch (IOException e) {
        this.getLogService().log(..)
    } finally {
        IOUtils.closeQuietly(inputStream);
        IOUtils.closeQuietly(outputStream);
    }
    // NOT REACHED IN CASE InputStream is BLOCKED FOR SOME REASON
}
However, the copy methods seem to block after all bytes have been received. Why could this happen and how can I avoid this?
Headers of the HTTP content requested:
Date: Wed, 23 May 2018 16:46:06 GMT
Content-Type: application/octet-stream
Content-Disposition: attachment; filename=".."
Content-Length: 613970044
Server: Jetty(9.4.8.v20171121)
IOUtils from Apache Commons IO Version 2.4
Here's a working example of your codebase, using only Java and Jetty.
This is requesting content from a server that is known to comply with the HTTP spec.
package demo.jettyclient;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URI;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;
import java.util.concurrent.TimeUnit;
import org.eclipse.jetty.client.HttpClient;
import org.eclipse.jetty.client.api.Request;
import org.eclipse.jetty.client.api.Response;
import org.eclipse.jetty.client.util.InputStreamResponseListener;
import org.eclipse.jetty.http.HttpStatus;
import org.eclipse.jetty.util.IO;
import org.eclipse.jetty.util.StringUtil;
import org.eclipse.jetty.util.ssl.SslContextFactory;
public class DownloadUrl
{
    public static void main(String[] args) throws Exception
    {
        String uriString = "https://upload.wikimedia.org/wikipedia/en/a/a1/Jetty_logo.png?download";
        if (args.length >= 1)
            uriString = args[0];
        URI srcUri = URI.create(uriString);
        SslContextFactory ssl = new SslContextFactory(true);
        HttpClient client = new HttpClient(ssl);
        try
        {
            client.start();
            Request request = client.newRequest(srcUri);
            System.out.printf("Using HttpClient v%s%n", getHttpClientVersion());
            System.out.printf("Requesting: %s%n", srcUri);
            InputStreamResponseListener streamResponseListener = new InputStreamResponseListener();
            request.send(streamResponseListener);
            Response response = streamResponseListener.get(5, TimeUnit.SECONDS);
            if (response.getStatus() != HttpStatus.OK_200)
            {
                throw new IOException(
                        String.format("Failed to GET URI [%d %s]: %s",
                                response.getStatus(),
                                response.getReason(),
                                srcUri));
            }
            Path tmpFile = Files.createTempFile("tmp", ".dl");
            try (InputStream inputStream = streamResponseListener.getInputStream();
                 OutputStream outputStream = Files.newOutputStream(tmpFile))
            {
                IO.copy(inputStream, outputStream);
            }
            System.out.printf("Downloaded %s%n", srcUri);
            System.out.printf("Destination: %s (%,d bytes)%n", tmpFile.toString(), Files.size(tmpFile));
        }
        finally
        {
            client.stop();
        }
    }

    private static String getHttpClientVersion()
    {
        ClassLoader cl = HttpClient.class.getClassLoader();
        // Attempt to use maven pom properties first
        String pomResource = "/META-INF/maven/org/eclipse/jetty/jetty-client/pom.properties";
        URL url = cl.getResource(pomResource);
        if (url != null)
        {
            try (InputStream in = url.openStream())
            {
                Properties props = new Properties();
                props.load(in);
                String version = props.getProperty("version");
                if (StringUtil.isNotBlank(version))
                    return version;
            }
            catch (IOException ignore)
            {
            }
        }
        // Attempt to use META-INF/MANIFEST.MF
        String version = HttpClient.class.getPackage().getImplementationVersion();
        if (StringUtil.isNotBlank(version))
            return version;
        return "<unknown>";
    }
}
When run, this results in ...
2018-05-23 10:52:08.401:INFO::main: Logging initialized #325ms to org.eclipse.jetty.util.log.StdErrLog
Using HttpClient v9.4.9.v20180320
Requesting: https://upload.wikimedia.org/wikipedia/en/a/a1/Jetty_logo.png?download
Downloaded https://upload.wikimedia.org/wikipedia/en/a/a1/Jetty_logo.png?download
Destination: C:\Users\joakim\AppData\Local\Temp\tmp2166600286896937563.dl (11,604 bytes)
Process finished with exit code 0
One (or more) of the following is likely causing your issue.
There is something wrong with your server: it is not complying with the HTTP spec.
The HTTP exchange isn't complete yet (from the protocol's point of view). Capture the traffic and verify the behavior.
The IOUtils library you are using has a bug.
The fact that wget (or curl) works is likely because they are not strict about Content-Length (per the recommendations in RFC 7230) and will display/download all content received until the physical connection EOF/disconnect, whereas HTTP/1.1 has connection persistence and strict rules about when the request (and response) content ends.
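If you have to cope with such a server, one possible workaround (a sketch reusing response, streamResponseListener, and tmpFile from the example above, and assuming the Content-Length header is present and accurate) is to copy exactly Content-Length bytes yourself instead of reading until EOF, so the copy cannot block afterwards:
long contentLength = Long.parseLong(response.getHeaders().get("Content-Length"));
try (InputStream inputStream = streamResponseListener.getInputStream();
     OutputStream outputStream = Files.newOutputStream(tmpFile))
{
    byte[] buffer = new byte[8192];
    long remaining = contentLength;
    while (remaining > 0)
    {
        int n = inputStream.read(buffer, 0, (int) Math.min(buffer.length, remaining));
        if (n == -1)
            break; // the server sent less than it promised
        outputStream.write(buffer, 0, n);
        remaining -= n;
    }
}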

Java apps and Eclipse can't connect to the internet

OK, so I wrote a piece of code to test whether my Java installation can connect to the internet. It is supposed to fetch the HTML from www.google.com and display the contents in a JFrame's JTextArea object.
Here's the code, so you can get a clear picture:
import java.awt.Color;
import java.awt.Dimension;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.URL;
import javax.swing.JFrame;
import javax.swing.JTextArea;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
public class JSoupFetchTest extends JFrame {
    String userAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:37.0) Gecko/20100101 Firefox/37.0";
    boolean jsoupcond = true;
    String address = "http://www.google.com";
    JTextArea text;

    public JSoupFetchTest() {
        text = new JTextArea();
        text.setPreferredSize(new Dimension(500, 500));
        text.setBackground(Color.BLACK);
        text.setForeground(Color.WHITE);
        text.setVisible(true);
        text.setLineWrap(true);
        this.add(text);
        this.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        this.setVisible(true);
        this.pack();
        gogo();
    }

    private void gogo() {
        if (jsoupcond) {
            text.setText(text.getText() + "\nstart...");
            try {
                text.setText(text.getText() + "\nConnecting to " + address + "...");
                Document doc = Jsoup.connect(address).userAgent(userAgent).get();
                text.setText(text.getText() + "\nConverting page document into text");
                String s = doc.toString();
                text.setText(text.getText() + "\nText: \n" + s);
                System.out.println();
            } catch (Exception e) {
                text.setText(text.getText() + "\n" + e.toString());
                e.printStackTrace();
            }
            text.setText(text.getText() + "\nEnd.");
        }
        String html = downloadHtml(address);
        text.setText(text.getText() + "\nDownloading HTML...");
        text.setText(text.getText() + "\nHTML:");
        text.setText(text.getText() + "\n" + html);
    }

    private String downloadHtml(String path) {
        text.setText(text.getText() + "\ndownloadHtml entry point...");
        InputStream is = null;
        try {
            text.setText(text.getText() + "\ntry block entered...");
            String result = "";
            String line;
            URL url = new URL(path);
            text.setText(text.getText() + "\nabout to open url stream...");
            is = url.openStream(); // throws an IOException
            text.setText(text.getText() + "\nurl stream opened...");
            BufferedReader br = new BufferedReader(new InputStreamReader(is));
            text.setText(text.getText() + "\nstarting to read lines...");
            while ((line = br.readLine()) != null) {
                result += line;
            }
            text.setText(text.getText() + "\nreading lines finished...");
            return result;
        } catch (IOException ioe) {
            ioe.printStackTrace();
        } finally {
            try {
                if (is != null) is.close();
            } catch (IOException ioe) { }
        }
        return "";
    }

    public static void main(String[] args) {
        new JSoupFetchTest();
    }
}
I should also add that:
1. My Eclipse (because that's what I'm using) can't connect to the Marketplace and can't fetch updates.
2. Eclipse's web browser works fine.
3. My system's browser (Mozilla Firefox) connects fine.
4. I exported JSoupFetchTest into a runnable jar and tried to run it at the system level, with no effect.
5. I am running Windows 7 Professional (MSDN version).
6. I contacted Eclipse support and they concluded it is not Eclipse's fault and suggested that I'm behind a proxy.
7. I contacted my ISP to see whether I indeed am, and they said I am not.
8. I changed my Java network settings so that it now connects "directly". Before, the setting was "use browser settings", and that didn't work either.
9. My Eclipse's Window -> Preferences -> General -> Network Connections active provider is set to "Native"; I also tried "Direct".
10. The downloadHtml(String path) method stops at "is = url.openStream();" and hangs forever...
The exception I get from JSoup is:
java.net.SocketTimeoutException: Read timed out
at java.net.SocketInputStream.socketRead0(Native Method)
at java.net.SocketInputStream.read(SocketInputStream.java:150)
at java.net.SocketInputStream.read(SocketInputStream.java:121)
at java.io.BufferedInputStream.fill(BufferedInputStream.java:246)
at java.io.BufferedInputStream.read1(BufferedInputStream.java:286)
at java.io.BufferedInputStream.read(BufferedInputStream.java:345)
at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:703)
at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:647)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1534)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1439)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
at org.jsoup.helper.HttpConnection$Response.execute(HttpConnection.java:453)
at org.jsoup.helper.HttpConnection$Response.execute(HttpConnection.java:434)
at org.jsoup.helper.HttpConnection.execute(HttpConnection.java:181)
at org.jsoup.helper.HttpConnection.get(HttpConnection.java:170)
at JSoupFetchTest.gogo(JSoupFetchTest.java:42)
at JSoupFetchTest.<init>(JSoupFetchTest.java:32)
at JSoupFetchTest.main(JSoupFetchTest.java:92)
I also tried setting Jsoup.connect's timeout to infinity; then it just runs forever.
Before you say my question is a duplicate or point me to other possible external solutions, believe me: either that question is mine or I have already been there. I have been browsing the internet in search of a solution for weeks now and I feel like pulling my hair out...
Please help if you can, because this prevents me from installing anything in Eclipse and from developing anything other than standalone apps...
You need a port number after the URL -- "http://www.google.com:80" works. JSoup likely uses a default for that, but opening the URL as a stream in Java does not.
The following program works for me. So Java and JSoup are working. It has to be some sort of local configuration problem with your network. Check your firewall, routers, gateway, and Java permissions. Do a clean rebuild of your project. Etc. Comment out lines until it does work and then put the lines back one at a time until you find the problem. Etc.
package stuff;
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import java.net.URL;
import java.net.URLConnection;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
public class SocketTest
{
    public static void main(String[] args) throws Exception
    {
        URL url = new URL("http://www.google.com");
        URLConnection sock = url.openConnection();
        InputStream ins = sock.getInputStream();
        BufferedReader reader = new BufferedReader(new InputStreamReader(ins, "UTF-8"));
        for (String line; (line = reader.readLine()) != null; ) {
            System.out.println(line);
        }
        ins.close();

        Document doc = Jsoup.connect("http://www.google.com").get();
        System.out.println(doc.toString());

        String userAgent = "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:37.0) Gecko/20100101 Firefox/37.0";
        Document doc2 = Jsoup.connect("http://www.google.com").userAgent(userAgent).get();
        System.out.println(doc2.toString());
    }
}
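If it is the plain URLConnection path that hangs at openStream(), it can also help to set explicit timeouts so the call fails fast instead of blocking indefinitely. This is a small sketch using the standard URLConnection timeout setters (the 10-second values are arbitrary); it would replace the url.openStream() call in downloadHtml:
URL url = new URL("http://www.google.com");
URLConnection conn = url.openConnection();
conn.setConnectTimeout(10000); // give up after 10 s if the TCP connection cannot be established
conn.setReadTimeout(10000);    // give up after 10 s if no data arrives (throws SocketTimeoutException)
InputStream is = conn.getInputStream(); // fails fast instead of hanging forever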

Java program does not connect to the internet?

This program compiles successfully, but when I try to run it, it gives me errors.
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.InputStreamReader;
import java.net.URL;
public class Main {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://www.google.com");
        BufferedReader reader = new BufferedReader(new InputStreamReader(url.openStream()));
        BufferedWriter writer = new BufferedWriter(new FileWriter("data.html"));
        String line;
        while ((line = reader.readLine()) != null) {
            System.out.println(line);
            writer.write(line);
            writer.newLine();
        }
        reader.close();
        writer.close();
    }
}
The following error occurs (I have attached the image):
Screenshot of errors
I am behind a proxy server. Could that be the problem with connecting to the internet? If so, please post the solution. Thanks in advance.
You should do something like this:
First of all, put the proxy information into the system properties:
System.setProperty("http.proxyHost", "proxy_hostname");
System.setProperty("http.proxyPort", "8080"); // or another proxy port
Then you need to authenticate against the proxy, using something like this:
URL url = new URL("http://www.google.com");
URLConnection con = url.openConnection();
String pass = "MY_USERNAME:MY_PASS";
String encodedPass = java.util.Base64.getEncoder().encodeToString(pass.getBytes());
con.setRequestProperty("Proxy-Authorization", "Basic " + encodedPass);
Good luck.
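An alternative that avoids setting the header manually (a sketch using the standard java.net.Authenticator API; the host, port, and credentials shown are placeholders) is to register a default authenticator and let the runtime answer the proxy's 407 challenge:
import java.net.Authenticator;
import java.net.PasswordAuthentication;

public class ProxyAuthSetup {
    public static void main(String[] args) throws Exception {
        System.setProperty("http.proxyHost", "proxy_hostname"); // placeholder host
        System.setProperty("http.proxyPort", "8080");           // placeholder port
        Authenticator.setDefault(new Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                // Called by the runtime when the proxy (or a server) asks for credentials
                return new PasswordAuthentication("MY_USERNAME", "MY_PASS".toCharArray());
            }
        });
        // ... then open the URL as in the question
    }
}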
Yes. Proxy settings can prevent a standalone app from connecting to the internet. If you know the proxy, try using
-Dhttp.proxyHost=yourProxy -Dhttp.proxyPort=proxyPort
These are VM arguments. If you are running it from the command line, use:
java -Dhttp.proxyHost=yourProxy -Dhttp.proxyPort=proxyPort Main
