I am pretty new to Java, and I couldn't find a proper answer to this anywhere, so:
I'm trying to check whether a URL fails to load because of an error, specifically HTTP error 429 (Too Many Requests).
I am getting the data from the URL using this:
String finalLine = null;
URL oracle = new URL(url);
BufferedReader in = new BufferedReader(
        new InputStreamReader(oracle.openStream()));
String inputLine;
while ((inputLine = in.readLine()) != null)
    finalLine = inputLine; // keeps only the last line of the response
in.close();
return finalLine;
It works very well, actually; after that I'm using GSON to parse it.
When it doesn't work, the game I'm working on just crashes. So I would like to write an if statement or something that handles the error and reacts accordingly.
Is there a proper way of doing that? Thanks!
If this is what you mean, you can try checking the response code using HttpURLConnection#getResponseCode().
URL oracle = new URL("http://httpstat.us/429"); // URL whose response code is 429
HttpURLConnection openConnection = (HttpURLConnection) oracle.openConnection();
if (openConnection.getResponseCode() == 429) {
    System.out.println("Too many requests");
} else {
    System.out.println("ok");
}
openConnection.disconnect();
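To take that one step further, here is a minimal sketch (class and method names are my own, and the URL handling assumes the host is reachable) that separates the status-code decision from the network call; note that 429 has no named constant in `HttpURLConnection`, so it is defined locally:

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

class StatusCheck {
    static final int HTTP_TOO_MANY_REQUESTS = 429; // no named constant in HttpURLConnection

    // Pure helper: map a response code to the message we want to act on.
    static String describeStatus(int code) {
        if (code == HTTP_TOO_MANY_REQUESTS) {
            return "Too many requests";
        } else if (code >= 200 && code < 300) {
            return "ok";
        }
        return "HTTP error " + code;
    }

    // Network wrapper: fetch the status code and classify it.
    static String checkUrl(String url) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        try {
            return describeStatus(conn.getResponseCode());
        } finally {
            conn.disconnect();
        }
    }
}
```

Keeping the classification in its own method makes it easy to branch on the result (retry, back off, show an error screen) instead of crashing the game.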
I am having some difficulty receiving the JSON list response from a given URL in my Android application. I am not sure whether I am missing a step in firing the GET call or the problem is on the web-service side. The code crashes right when it gets to the getInputStream line:
if (url != null) {
    try {
        HttpURLConnection httpURLConnection = (HttpURLConnection) url.openConnection();
        httpURLConnection.setRequestMethod("GET");
        BufferedReader in = new BufferedReader(
                new InputStreamReader(httpURLConnection.getInputStream()));
        String inputLine;
        StringBuffer response = new StringBuffer();
        while ((inputLine = in.readLine()) != null) {
            response.append(inputLine);
        }
        in.close();
        System.out.println(response.toString());
    } catch (IOException e) {
        e.printStackTrace();
    }
}
The errors have to do with NetworkOnMainThreadException, along with a few others. Note: this is within a method that is called from onCreate, which could also be a source of the problem.
Alright, it ended up being the last issue; thanks for the clarification, Daniel. I got lazy and did not put it in an AsyncTask. It works great now, thanks!
I have a URL like this and the following method
public static void saveContent(String webURL) throws Exception
{
    URL website = new URL(webURL);
    URLConnection connection = website.openConnection();
    BufferedReader in = new BufferedReader(
            new InputStreamReader(
                    connection.getInputStream()));
    StringBuilder response = new StringBuilder();
    String inputLine;
    while ((inputLine = in.readLine()) != null)
        response.append(inputLine);
    in.close();
    System.out.println(response.toString());
}
However, when I want to print the web content, it always fetches the source code of the main page (www.google.com).
How can I solve this problem? Thanks for your help.
I copied your code into NetBeans and it seems to work correctly. I think the problem could lie in the content of the method argument webURL. Try running your app in debug mode and look at what you're getting back there.
In my Wikipedia reader app for Android, I'm downloading an article's HTML using HttpURLConnection. Some users report that they are unable to see articles; instead they see some CSS, so it seems their carrier is somehow preprocessing the HTML before it's downloaded, while other Wikipedia readers seem to work fine.
Example url: http://en.m.wikipedia.org/wiki/Black_Moon_(album)
My method:
public static String downloadString(String url) throws Exception
{
    StringBuilder downloadedHtml = new StringBuilder();
    HttpURLConnection urlConnection = null;
    String line = null;
    BufferedReader rd = null;
    try
    {
        URL targetUrl = new URL(url);
        urlConnection = (HttpURLConnection) targetUrl.openConnection();
        if (url.toLowerCase().contains("/special"))
            urlConnection.setInstanceFollowRedirects(true);
        else
            urlConnection.setInstanceFollowRedirects(false);

        // read the result from the server
        rd = new BufferedReader(new InputStreamReader(urlConnection.getInputStream()));
        while ((line = rd.readLine()) != null)
            downloadedHtml.append(line + '\n');
    }
    catch (Exception e)
    {
        AppLog.e("An exception occurred while downloading data.\r\n: " + e);
        e.printStackTrace();
    }
    finally
    {
        if (urlConnection != null)
        {
            AppLog.i("Disconnecting the http connection");
            urlConnection.disconnect();
        }
        if (rd != null)
            rd.close();
    }
    return downloadedHtml.toString();
}
I'm unable to reproduce this problem, but there must be a way to get around it. I even disabled redirects by setting setInstanceFollowRedirects to false, but it didn't help.
Am I missing something?
Example of what the users are reporting:
http://pastebin.com/1E3Hn2yX
carrier is somehow preprocessing the html before it's downloaded
a way to get around that?
Use HTTPS to prevent carriers from rewriting pages. (no citation)
Am I missing something?
Not that I can see.
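Following the HTTPS suggestion above, here is a small sketch (the class and helper name are mine) that upgrades a plain-HTTP URL to HTTPS before downloading, so a carrier proxy can no longer rewrite the response in transit:

```java
class HttpsUpgrade {
    // Rewrite a plain http:// URL to https://; other inputs pass through unchanged.
    static String toHttps(String url) {
        if (url.startsWith("http://")) {
            return "https://" + url.substring("http://".length());
        }
        return url;
    }
}
```

For the example article that would be `downloadString(HttpsUpgrade.toHttps("http://en.m.wikipedia.org/wiki/Black_Moon_(album)"))`; Wikipedia serves the mobile site over HTTPS, so the same `downloadString` method works unchanged.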
I have a problem with this code to display HTML content. When I try it on my smartphone, it prints "Error", which means an exception is being caught. Where am I going wrong?
String a2 = "";
try {
    URL url = new URL("www.google.com");
    InputStreamReader isr = new InputStreamReader(url.openStream());
    BufferedReader in = new BufferedReader(isr);
    String inputLine;
    while ((inputLine = in.readLine()) != null) {
        a2 += inputLine;
    }
    in.close();
    tx.setText("OUTPUT \n" + a2);
} catch (Exception e) {
    tx.setText("Error");
}
URL requires a correctly formed URL, including the protocol. You should use:
URL url = new URL("http://www.google.com");
Update:
As you are getting a NetworkOnMainThreadException, it appears that you are attempting to make the connection on the main thread.
The solution is to run the code in an AsyncTask.
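AsyncTask is Android-specific and only runs inside an app, but the pattern it wraps — perform the blocking request on a worker thread and hand the result back — can be sketched with a plain ExecutorService (class and method names here are mine, and the blocking `get()` stands in for what AsyncTask delivers via onPostExecute):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

class BackgroundFetch {
    // Run a blocking fetch off the calling thread and wait for its result.
    // On Android you would not block the main thread like this; AsyncTask
    // (or a Handler) posts the result back to the UI thread instead.
    static String fetchInBackground(Callable<String> fetch) {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        try {
            return pool.submit(fetch).get();
        } catch (InterruptedException | ExecutionException e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }
}
```

The download code from the question would go in the Callable (AsyncTask's doInBackground), and the `tx.setText(...)` calls in the code that receives the result.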
I am attempting to connect to a website so I can extract its HTML contents. My application never connects to the site; it only times out.
Here is my code:
URL url = new URL("www.website.com");
URLConnection connection = url.openConnection();
connection.setConnectTimeout(2000);
connection.setReadTimeout(2000);
BufferedReader reader = new BufferedReader(new InputStreamReader(connection.getInputStream()));
String line;
while ((line = reader.readLine()) != null) {
    // do stuff with line
}
reader.close();
Any ideas would be greatly appreciated. Thanks!
I believe the URL should be (i.e. you need a protocol):
URL url = new URL("http://www.website.com");
If that doesn't help, then post a real SSCCE that demonstrates the problem, so we don't have to guess what you are really doing; as it stands we can't tell whether you are using your try/catch block correctly or just ignoring exceptions.
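Since `new URL(...)` throws a MalformedURLException when the protocol is missing, a small guard (the class and helper name are mine) can normalize input before constructing the URL:

```java
class UrlFix {
    // java.net.URL throws MalformedURLException without a protocol,
    // so prepend a default one when the caller passed a bare host name,
    // e.g. "www.website.com" -> "http://www.website.com".
    static String ensureProtocol(String url) {
        if (url.startsWith("http://") || url.startsWith("https://")) {
            return url;
        }
        return "http://" + url;
    }
}
```

With that in place, `new URL(UrlFix.ensureProtocol("www.website.com"))` constructs without throwing, and the setConnectTimeout/setReadTimeout calls from the question then apply to a connection that can actually be opened.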