I have a URL like this and the following method
public static void saveContent(String webURL) throws Exception {
    URL website = new URL(webURL);
    URLConnection connection = website.openConnection();
    BufferedReader in = new BufferedReader(
            new InputStreamReader(connection.getInputStream()));
    StringBuilder response = new StringBuilder();
    String inputLine;
    while ((inputLine = in.readLine()) != null)
        response.append(inputLine);
    in.close();
    System.out.println(response.toString());
}
However, when I print the web content, it always fetches the source code of the main page (www.google.com).
How can I solve this problem? Thanks for your help.
I copied your code into NetBeans and it seems to work correctly. I think the problem lies in the content of the method argument "webURL". Try running your app in debug mode and look at what is actually being passed in.
I'm trying to redesign my code, which originally took in JSON POSTs, so that it now takes in images from POST requests. The problem is that when debugging it keeps giving me weird Unicode for the image data. I've tried researching this problem from scratch, but all the examples I've found deal with static images already on the hard drive.
I've tried this for my code inside my JSP file and it works fine for JSON data posts. Can someone tell me if I'm on the right path at all for this?
try {
    BufferedReader in = new BufferedReader(
            new InputStreamReader(request.getInputStream()));
    String inputLine;
    StringBuffer resp = new StringBuffer();
    while ((inputLine = in.readLine()) != null) {
        resp.append(inputLine);
    }
    // Guard against bodies shorter than 200 characters.
    int end = Math.min(200, resp.length());
    System.out.println("Images request line: \"" + resp.substring(0, end) + "\"");
    in.close();
} catch (Exception e) {
    e.printStackTrace(); // never swallow exceptions silently
}
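The garbled "Unicode" is a symptom of reading binary data line by line through a Reader, which decodes the image bytes as characters. A minimal sketch of reading the body as raw bytes instead; the class name is illustrative, and an in-memory stream stands in for request.getInputStream() so the sketch runs outside a servlet container:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Arrays;

public class BinaryBody {
    // Copies a raw request body into a byte array without any
    // character decoding, so image bytes survive intact.
    static byte[] readAllBytes(InputStream in) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        byte[] chunk = new byte[8192];
        int n;
        while ((n = in.read(chunk)) != -1) {
            buf.write(chunk, 0, n);
        }
        return buf.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for request.getInputStream(): fake "image" bytes,
        // including values that are not valid text.
        byte[] fakeImage = {(byte) 0x89, 'P', 'N', 'G', 0, (byte) 0xFF, 42};
        byte[] copy = readAllBytes(new ByteArrayInputStream(fakeImage));
        System.out.println(Arrays.equals(fakeImage, copy)); // prints "true"
    }
}
```

In a servlet you would pass request.getInputStream() to readAllBytes and work with the resulting byte[] directly, rather than printing it as text.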
I am pretty new to Java, and I couldn't find a proper answer to this anywhere, so:
I'm trying to check whether a URL fails to load because of an error, specifically error 429 (Too Many Requests).
I am getting the data from the URL using this:
String finalLine = null;
URL oracle = new URL(url);
BufferedReader in = new BufferedReader(
        new InputStreamReader(oracle.openStream()));
String inputLine;
while ((inputLine = in.readLine()) != null)
    finalLine = inputLine; // keeps only the last line read
in.close();
return finalLine;
It actually works very well; afterwards I use GSON to parse the result.
When it doesn't work, the game I'm working on just crashes. So I would like to write an if statement or something similar that handles the error and responds accordingly.
Is there a proper way of doing that? Thanks!
If this is what you mean, you can try checking the response code using HttpURLConnection#getResponseCode().
URL oracle = new URL("http://httpstat.us/429"); // URL whose response code is 429
HttpURLConnection openConnection = (HttpURLConnection) oracle.openConnection();
if (openConnection.getResponseCode() == 429) {
    System.out.println("Too many requests");
} else {
    System.out.println("ok");
}
openConnection.disconnect();
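For completeness, here is a self-contained sketch that folds the response-code check into a fetch method. The fetchOrNull name is illustrative, and the JDK's built-in com.sun.net.httpserver is used to simulate a 429 locally, so no external site like httpstat.us is needed:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;

public class RateLimitCheck {
    // Returns the response body, or null when the server answers 429
    // so the caller can back off instead of crashing.
    static String fetchOrNull(String url) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        try {
            if (conn.getResponseCode() == 429) {
                return null;
            }
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream()))) {
                StringBuilder sb = new StringBuilder();
                String line;
                while ((line = in.readLine()) != null) sb.append(line);
                return sb.toString();
            }
        } finally {
            conn.disconnect();
        }
    }

    public static void main(String[] args) throws Exception {
        // Local stand-in server that always answers 429.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/", exchange -> {
            exchange.sendResponseHeaders(429, -1); // -1: no response body
            exchange.close();
        });
        server.start();
        String url = "http://localhost:" + server.getAddress().getPort() + "/";
        System.out.println(fetchOrNull(url) == null ? "Too many requests" : "ok"); // prints "Too many requests"
        server.stop(0);
    }
}
```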
I am writing a program in Java that parses some text from a web page. But when I use the code below I get weird/incorrect characters.
code:
URL url = new URL(getSearchUrl(crit));
URLConnection connection = url.openConnection();
BufferedReader br = new BufferedReader(
        new InputStreamReader(connection.getInputStream(), "UTF-8"));
String line;
while ((line = br.readLine()) != null) {
    System.out.println(line);
}
br.close();
I get the following output:
?}?v?8????...
So what am I doing wrong? I know that the site I want to gather info from uses UTF-8.
Edit: I am currently in Croatia. I tried another program that I know worked in Serbia (my home country), but it doesn't work here.
It's gzipped. You can see this by checking connection.getContentEncoding().
If you wrap connection.getInputStream() in a GZIPInputStream (from java.util.zip), it should work better:
BufferedReader br = new BufferedReader(
        new InputStreamReader(new GZIPInputStream(connection.getInputStream()), "UTF-8"));
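To make the decompression chain concrete, here is a self-contained sketch that checks the declared encoding before wrapping; in-memory gzipped bytes stand in for connection.getInputStream(), and the readText name is an assumption for illustration:

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipRead {
    // Wraps the stream in GZIPInputStream only when the server actually
    // declared gzip, mirroring connection.getContentEncoding().
    static String readText(InputStream raw, String contentEncoding) throws IOException {
        InputStream in = "gzip".equalsIgnoreCase(contentEncoding)
                ? new GZIPInputStream(raw) : raw;
        BufferedReader br = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8));
        StringBuilder sb = new StringBuilder();
        String line;
        while ((line = br.readLine()) != null) sb.append(line);
        br.close();
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        // In-memory gzipped data stands in for the network stream.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(buf)) {
            gz.write("<html>ćevapčići</html>".getBytes(StandardCharsets.UTF_8));
        }
        String text = readText(new ByteArrayInputStream(buf.toByteArray()), "gzip");
        System.out.println(text); // prints "<html>ćevapčići</html>"
    }
}
```

Checking getContentEncoding() first means the same code also handles servers that send the page uncompressed.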
I have a problem with this code for displaying HTML content. When I try it on my smartphone, it prints "Error", which means an exception is being caught. Where am I going wrong?
String a2 = "";
try {
    URL url = new URL("www.google.com");
    InputStreamReader isr = new InputStreamReader(url.openStream());
    BufferedReader in = new BufferedReader(isr);
    String inputLine;
    while ((inputLine = in.readLine()) != null) {
        a2 += inputLine;
    }
    in.close();
    tx.setText("OUTPUT \n" + a2);
} catch (Exception e) {
    tx.setText("Error");
}
URL requires a correctly formed URL, including the protocol. You should use:
URL url = new URL("http://www.google.com");
Update:
As you are getting a NetworkOnMainThreadException, it appears that you are attempting to make the connection in the main thread.
The solution is to run the code in an AsyncTask.
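AsyncTask is Android-specific, but the underlying pattern is simply moving the network call off the main thread and delivering the result afterwards. A minimal plain-Java sketch of that pattern; the fixed string stands in for the downloaded page so the sketch runs offline, and the class and method names are illustrative:

```java
public class BackgroundFetch {
    // Runs the (slow) download off the calling thread; on Android this role
    // is played by AsyncTask#doInBackground or an ExecutorService.
    static String fetchInBackground() throws InterruptedException {
        final StringBuilder result = new StringBuilder();
        Thread worker = new Thread(() -> {
            // ...open the URL and read it here; a fixed string stands in
            // for the downloaded page so this sketch needs no network.
            result.append("<html>page content</html>");
        });
        worker.start();
        worker.join(); // on Android, deliver via onPostExecute instead of blocking
        return result.toString();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("OUTPUT \n" + fetchInBackground());
    }
}
```

On Android the UI update (tx.setText(...)) must happen back on the main thread, which is exactly what AsyncTask#onPostExecute provides.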
I'm trying to read data from a webpage, and I have to do it using Java.
When I try to do it in Eclipse using Java, I get a timeout error:
java.net.ConnectException: Connection timed out: connect
(using URLConnection):
URL yahoo = new URL("http://www.yahoo.com/");
URLConnection yc = yahoo.openConnection();
BufferedReader in = new BufferedReader(
        new InputStreamReader(yc.getInputStream()));
String inputLine;
while ((inputLine = in.readLine()) != null)
    System.out.println(inputLine);
in.close();
To understand where the problem is, I tried the same task using C# and VS2008, and it worked perfectly fine, with no timeout at all.
I'm doing this from work so there's a firewall but I don't have information about it.
What can be the reason for this?
Thanks!
Daniel
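One likely cause, assuming the firewall requires an HTTP proxy: .NET picks up the system proxy settings automatically, while the JVM does not, so Java has to be told about the proxy explicitly. A sketch, with a placeholder host and port that you would replace with your network's actual values:

```java
public class ProxySetup {
    public static void main(String[] args) {
        // Hypothetical proxy host and port -- replace with your
        // company's real proxy settings (ask your IT department).
        System.setProperty("http.proxyHost", "proxy.example.com");
        System.setProperty("http.proxyPort", "8080");
        // Any URLConnection opened after this point goes through the proxy.
        System.out.println(System.getProperty("http.proxyHost")); // prints "proxy.example.com"
    }
}
```

The same properties can also be passed on the command line as -Dhttp.proxyHost=... -Dhttp.proxyPort=..., which avoids hard-coding them.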
The code is taken from here: http://java.sun.com/docs/books/tutorial/networking/urls/readingWriting.html