java.net.SocketException: Network is unreachable: connect - java

I am trying to download an XML text file from a web server using this method:
static void download(String url, String fileName) throws IOException {
    FileWriter xmlWriter = new FileWriter(fileName);
    System.out.println("URL to download is : " + url);
    URL addURL = new URL(url); // addURL was used but never declared in the original snippet
    // here the Exception is thrown /////////////////////////////
    BufferedReader inputTxtReader =
            new BufferedReader(new InputStreamReader(addURL.openStream()));
    /////////////////////////////////////////////////////////////
    String str;
    StringBuilder fileInStr = new StringBuilder();
    str = inputTxtReader.readLine();
    while (str != null) { // && !str.equals("</tv>")
        fileInStr.append(str).append("\r\n");
        str = inputTxtReader.readLine();
    }
    inputTxtReader.close();
    xmlWriter.write(fileInStr.toString());
    xmlWriter.flush();
    xmlWriter.close();
    System.out.println("File Downloaded");
}
Sometimes this exception is thrown (at the point I marked in the code):
java.net.SocketException: Network is unreachable: connect
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.PlainSocketImpl.doConnect(PlainSocketImpl.java:333)
at java.net.PlainSocketImpl.connectToAddress(PlainSocketImpl.java:195)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:182)
at java.net.Socket.connect(Socket.java:518)
at java.net.Socket.connect(Socket.java:468)
at sun.net.NetworkClient.doConnect(NetworkClient.java:157)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:389)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:516)
at sun.net.www.http.HttpClient.<init>(HttpClient.java:233)
at sun.net.www.http.HttpClient.New(HttpClient.java:306)
at sun.net.www.http.HttpClient.New(HttpClient.java:318)
at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:788)
at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:729)
at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:654)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:977)
at java.net.URL.openStream(URL.java:1009)
at MessagePanel.download(MessagePanel.java:640)
at WelcomThread.run(MainBody2.java:891)
Please guide me.
Thank you all.

You are facing a connection breakdown. Does this happen on 3G, on WiFi, or on a wired connection on a computer?
Anyway, you must assume when writing your app that the connection may be lost from time to time. For example, with mobiles this happens frequently in the tube, in basements, etc. With PC apps it is less frequent, but it still occurs sometimes.
A retry can be a good solution, together with a clean error message explaining that the network is not available at the moment.
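For illustration, here is a minimal retry sketch around the download method from the question (the attempt count and delay are arbitrary choices, not recommendations):
// A sketch only: wraps the download(url, fileName) method from the question.
static void downloadWithRetry(String url, String fileName) throws IOException {
    final int maxAttempts = 3;
    for (int attempt = 1; attempt <= maxAttempts; attempt++) {
        try {
            download(url, fileName);
            return; // success
        } catch (IOException e) {
            System.err.println("Attempt " + attempt + " failed: " + e.getMessage());
            if (attempt == maxAttempts) {
                throw e; // give up; let the caller show a clean error message
            }
            try {
                Thread.sleep(5000); // wait a bit before retrying
            } catch (InterruptedException ie) {
                Thread.currentThread().interrupt();
                throw new IOException("Interrupted while waiting to retry", ie);
            }
        }
    }
}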

I faced the situation of getting java.net.SocketException not just sometimes but every time. I added -Djava.net.preferIPv4Stack=true to the java command line and my program started to work properly.
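For reference, a sketch of where the flag goes. The property itself is a standard JVM networking property; setting it programmatically only takes effect if it runs before any java.net classes are initialized, so the command-line flag (java -Djava.net.preferIPv4Stack=true ...) is the safer option:
public class Main {
    public static void main(String[] args) {
        // Equivalent of -Djava.net.preferIPv4Stack=true on the command line.
        // Must run before any networking classes are used.
        System.setProperty("java.net.preferIPv4Stack", "true");
        // ... networking code goes here ...
    }
}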

"Network is unreachable" means just that. You're not connected to a network. It's something outside of your program. Could be a bad OS setting, NIC, router, etc.

I haven't tested with your code, so it could be a totally different case, but I'd still like to share my experience. (This must be too late an answer, but I hope it still helps somebody in the future.)
I recently faced a similar situation: sometimes "Network is unreachable", sometimes not. In short, the cause was too small a timeout. Java seems to throw an IOException saying "Network is unreachable" when the connection fails because of it. That was misleading (I would have expected something like "timed out"), and I spent almost a month tracking it down.
Here I found another post about how to set the timeout:
Alternative to java.net.URL for custom timeout setting
Again, this might not be the same case you experienced, but it may help somebody in the future.
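For what it's worth, a plain URLConnection also lets you set both timeouts explicitly instead of relying on url.openStream() and its defaults. A sketch (the URL is a placeholder and the 10-second values are arbitrary):
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;

public class TimeoutExample {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.com/file.xml"); // placeholder URL
        URLConnection conn = url.openConnection();
        conn.setConnectTimeout(10000); // ms allowed to establish the TCP connection
        conn.setReadTimeout(10000);    // ms to wait for data once connected
        InputStream in = conn.getInputStream();
        System.out.println("Connected; content type: " + conn.getContentType());
        in.close();
    }
}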

This just happened to me. None of the answers helped, as the issue was that I had recently changed the target host configuration and put an incorrect host value there. So it could simply be wrong connection details as well.

I faced this error after updating my network adapter configuration (migration to a NIC-coupled network via the PowerShell cmdlet New-NetSwitchTeam). My guess is that something in the Java configuration must be adapted to reflect this change, but it is unclear where that change should take place. I am investigating further.

Related

httpclient Connection reset [duplicate]

I'm creating a (well behaved) web spider and I notice that some servers are causing Apache HttpClient to give me a SocketException -- specifically:
java.net.SocketException: Connection reset
The code that causes this is:
// Execute the request
HttpResponse response;
try {
    response = httpclient.execute(httpget); // httpclient is of type HttpClient
} catch (NullPointerException e) {
    return; // deep down, Apache HTTP sometimes throws a null pointer...
}
For most servers it's just fine. But for others, it immediately throws a SocketException.
Example of site that causes immediate SocketException: http://www.bhphotovideo.com/
Works great (as do most websites): http://www.google.com/
Now, as you can see, www.bhphotovideo.com loads fine in a web browser. It also loads fine when I don't use Apache's HTTP Client. (Code like this:)
HttpURLConnection c = (HttpURLConnection) url.openConnection();
BufferedInputStream in = new BufferedInputStream(c.getInputStream());
Reader r = new InputStreamReader(in);
int i;
while ((i = r.read()) != -1) {
    source.append((char) i); // source is a StringBuilder declared elsewhere
}
So, why don't I just use this code instead? Well there are some key features in Apache's HTTP Client that I need to use.
Does anyone know what causes some servers to cause this exception?
Research so far:
Problem occurs on my local Mac dev machines AND an AWS EC2 Instance, so it's not a local firewall.
It seems the error isn't caused by the remote machine because the exception doesn't say "by peer"
This Stack Overflow question seems relevant: java.net.SocketException: Connection reset, but the answers don't show why this would happen only with Apache HttpClient and not with other approaches.
Bonus question: I'm doing a fair amount of crawling with this system. Is there generally a better Java class for this other than Apache HTTP Client? I've found a number of issues (such as the NullPointerException I have to catch in the code above). It seems that HTTPClient is very picky about server communications -- more picky than I'd like for a crawler that can't just break when a server doesn't behave.
Thanks all!
Solution
Honestly, I don't have a perfect solution, but it works, so that's good enough for me.
As pointed out by oleg below, Bixo has created a crawler that customizes HttpClient to be more forgiving to servers. To "get around" the issue more than fix it, I just used SimpleHttpFetcher provided by Bixo here:
(link removed - SO thinks I'm a spammer, so you'll have to google it yourself)
SimpleHttpFetcher fetch = new SimpleHttpFetcher(
        new UserAgent("botname", "contact@yourcompany.com", "ENTER URL"));
try {
    FetchedResult result = fetch.fetch("ENTER URL");
    System.out.println(new String(result.getContent()));
} catch (BaseFetchException e) {
    e.printStackTrace();
}
The downside to this solution is that Bixo has a lot of dependencies, so this may not be a good workaround for everyone. However, you can always work through their use of DefaultHttpClient to see how they instantiated it to get it to work. I decided to use the whole class because it handles some things for me, like automatic redirect following (and reporting the final destination URL), that are helpful.
Thanks for the help all.
Edit: TinyBixo
Hi all. So, I loved how Bixo worked, but didn't like that it had so many dependencies (including all of Hadoop). So, I created a vastly simplified Bixo, without all the dependencies. If you're running into the problems above, I would recommend using it (and feel free to make pull requests if you'd like to update it!)
It's available here: https://github.com/juliuss/TinyBixo
First, to answer your question:
The connection reset was caused by a problem on the server side. Most likely the server failed to parse the request or was unable to process it and dropped the connection as a result without returning a valid response. There is likely something in the HTTP requests generated by HttpClient that causes server side logic to fail, probably due to a server side bug. Just because the error message does not say 'by peer' does not mean the connection reset took place on the client side.
A few remarks:
(1) Several popular web crawlers such as bixo (http://openbixo.org/) use HttpClient without major issues, but pretty much all of them had to tweak HttpClient's behavior to make it more lenient about common HTTP protocol violations. By default, HttpClient is rather strict about HTTP protocol compliance.
(2) Why did you not report the NPE problem, or any other problem you have been experiencing, to the HttpClient project?
These two settings will sometimes help:
client.getParams().setParameter("http.socket.timeout", new Integer(0));
client.getParams().setParameter("http.connection.stalecheck", new Boolean(true));
The first sets the socket timeout to be infinite; the second makes HttpClient check for stale connections before reusing one.
Try getting a network trace using Wireshark, and augment that with log4j logging of HttpClient. That should show why the connection is being reset.

Unknown host exception thrown once in a while

String urlSD = " some url";
URL urlGetContents = new URL(urlSD);
DataInputStream rd = new DataInputStream(urlGetContents.openStream());
I am getting an UnknownHostException here. What confuses me is that it works well for a while (not throwing the exception 50 times in a row) and then on the 51st time it throws the exception. How is this caused and how can I solve it?
How is this caused and how can I solve it?
Sounds like you've either got a malfunctioning DNS server or cache, or an intermittent network problem. (Like my home broadband at the moment ...)
For a solution to the latter problem, try sacrificing a black rooster on your modem.
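If a stale JVM DNS cache is the suspect, its lifetimes can also be tuned. A sketch using the standard networkaddress security properties (the values are arbitrary):
import java.security.Security;

public class DnsCacheTuning {
    public static void main(String[] args) {
        // Successful lookups are cached for networkaddress.cache.ttl seconds.
        Security.setProperty("networkaddress.cache.ttl", "60");
        // Failed lookups are cached as well; one transient DNS hiccup can then
        // look like repeated UnknownHostExceptions until the entry expires.
        Security.setProperty("networkaddress.cache.negative.ttl", "0");
        // ... perform lookups / open connections after this point ...
    }
}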


Unable to acquire image through ImageIO.read(url) because of connection timed out

The following code always seems to fail:
URL url = new URL("http://userserve-ak.last.fm/serve/126/8636005.jpg");
Image img = ImageIO.read(url);
System.out.println(img);
I've checked the url, and it is a valid jpg image. The error I get is:
Exception in thread "main" javax.imageio.IIOException: Can't get input stream from URL!
at javax.imageio.ImageIO.read(ImageIO.java:1385)
at maestro.Main2.main(Main2.java:25)
Caused by: java.net.ConnectException: Connection timed out
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:310)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:176)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:163)
at java.net.Socket.connect(Socket.java:546)
at java.net.Socket.connect(Socket.java:495)
at sun.net.NetworkClient.doConnect(NetworkClient.java:174)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:409)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:530)
at sun.net.www.http.HttpClient.<init>(HttpClient.java:240)
at sun.net.www.http.HttpClient.New(HttpClient.java:321)
at sun.net.www.http.HttpClient.New(HttpClient.java:338)
at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:814)
at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:755)
at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:680)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1005)
at java.net.URL.openStream(URL.java:1029)
at javax.imageio.ImageIO.read(ImageIO.java:1383)
... 1 more
Java Result: 1
What does this mean?
Funny thing is, if I change my internet-connection to that of the neighbour's wireless, it suddenly works.
This worked for me. :)
URL url = new URL("http://userserve-ak.last.fm/serve/126/8636005.jpg");
Image image = ImageIO.read(url.openStream());
System.out.println(image);
I know I am late, but since I faced the same issue too, I thought of posting this as it might help someone. :)
This is maybe unlikely on a home network, but a lot of companies have HTTP proxy servers that can make your errors a little misleading. Often the URL will appear to work fine manually because your browser is configured to use your proxy server. You can set the proxy settings on the command line or in the code, see: http://java.sun.com/javase/6/docs/technotes/guides/net/proxies.html.
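For example, the standard proxy properties can be set programmatically as well as on the command line. A sketch (the host and port are placeholders for your proxy):
public class ProxySetup {
    public static void main(String[] args) {
        // Standard JVM proxy properties; the values below are placeholders.
        System.setProperty("http.proxyHost", "proxy.example.com");
        System.setProperty("http.proxyPort", "8080");
        // Hosts that should bypass the proxy:
        System.setProperty("http.nonProxyHosts", "localhost|127.0.0.1");
        // ... connections opened after this point will use the proxy ...
    }
}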
This code works perfectly for me.
If you have a very slow internet connection, then that could be the reason. Or you are downloading/uploading stuff (HTTP, torrents, FTP, ...).
I've manually checked the URL; it is valid and contains a valid JPG image.
Edit:
Did you test it in a browser? If so, maybe the browser's timeout is longer.
Did you test it on your own network with the browser?
What does this mean?
A timeout exception means that you couldn't create a socket. This can have a few reasons:
The server is not responding.
The server is very busy.
The packets are lost. This in turn can have a few reasons:
You are downloading and your bandwidth is saturated.
You are far away from the internet provider's exchange (you live in the countryside).

How to detect internet connectivity using a Java program

How do I write a Java program that tells me whether I have internet access? I do not want to ping or connect to some external URL, because if that server is down my program will not work. I want a reliable way to detect, with a 100% guarantee, whether I have an internet connection, irrespective of my operating system. The program is for computers that are directly connected to the internet.
I have tried the program below:
URL url = new URL("http://www.xyz.com/");
URLConnection conn = url.openConnection();
conn.connect();
I want something more appropriate than this program
Thanks
Sunil Kumar Sahoo
It depends on what you mean by "internet" connection. Many computers are not connected directly to the internet, so even if you could check whether they have a network connection, it doesn't always mean they can access the internet.
The only 100% reliable way to test whether the computer can access some other server is to actually try.
Effective connectivity to the internet (i.e. where you can actually do stuff) depends on lots of things being correct, on your machine, your local net, your router, your modem, your ISP and so on. There are lots of places where a failure or misconfiguration will partly or completely block network access.
It is impossible to test all of these potential failure points with any certainty ... or even to enumerate them. (For example, you typically have no way of knowing what is happening inside your ISP's networking infrastructure.)
As @codeka says: "the only 100% reliable way to test whether the computer can access some other server is to actually try".
I think if you were to open up an HTTP session with all of:
www.google.com
www.microsoft.com
www.ibm.com
www.ford.com
and at least one of them came back with a valid response, you would have internet connectivity. Don't keep testing once you get a valid response since that would be a waste.
Feel free to expand on that list with some more mega-corporations in case you fear that all four of them may be down at the same time :-)
Of course, even that method can be tricked if someone has taken control of your DNS servers but it's probably about as reliable as you're going to get.
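A sketch of that idea (the host list, timeouts, and the HEAD method are arbitrary choices):
import java.net.HttpURLConnection;
import java.net.URL;

public class ConnectivityCheck {
    static boolean isInternetReachable() {
        String[] hosts = {"http://www.google.com/", "http://www.microsoft.com/",
                          "http://www.ibm.com/", "http://www.ford.com/"};
        for (String host : hosts) {
            try {
                HttpURLConnection conn =
                        (HttpURLConnection) new URL(host).openConnection();
                conn.setConnectTimeout(3000);
                conn.setReadTimeout(3000);
                conn.setRequestMethod("HEAD"); // reachability only; no body needed
                int code = conn.getResponseCode();
                conn.disconnect();
                if (code >= 200 && code < 400) {
                    return true; // stop at the first valid response
                }
            } catch (Exception e) {
                // this host failed; try the next one
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println("Internet reachable: " + isInternetReachable());
    }
}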
Just put a try/catch block around the code you mentioned. If an exception is thrown/caught then you don't have connectivity.
boolean connectivity;
try {
    URL url = new URL("http://www.xyz.com/");
    URLConnection conn = url.openConnection();
    conn.connect();
    connectivity = true;
} catch (Exception e) {
    connectivity = false;
}
For better results investigate what kind of exceptions can be thrown and handle each individually.
You can check connectivity by asking for an address from the InetAddress class. If you get an exception, or if, for example, you use getLocalHost() (which returns the address of the local host) and it gives you output like
localhost/127.0.0.1 instead of a fully qualified name such as jaf-stephen-lenovoG40-80/152.6.44.13, then you're not connected to the Internet.
public static void main(String[] args) {
    try {
        InetAddress address = InetAddress.getByName("www.facebook.com");
        System.out.println(address);
    } catch (UnknownHostException e) {
        System.err.println("Couldn't find www.facebook.com");
    }
}
If you're connected to the Internet, you'll get the following output:
www.facebook.com/31.13.78.35
// Note: an interface being "up" only means it is enabled locally;
// it does not guarantee internet access.
Enumeration<NetworkInterface> networkInterfaces = NetworkInterface.getNetworkInterfaces();
for (NetworkInterface nif : Collections.list(networkInterfaces)) {
    System.out.println("Internet Available status is: " + nif.isUp());
}
