I use the following code:
private String resultGET(String addr)
{
    try
    {
        String result = "";
        HttpURLConnection conn = null;
        addr = (isFull) ? addr : Statics.fullURL(addr);
        try
        {
            URL url = new URL(addr);
            conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("GET");
            conn.setRequestProperty("User-Agent", Statics.USER_AGENT);
            InputStream ips = conn.getInputStream();
            int responseCode = conn.getResponseCode();
            if (200 != responseCode)
            {
                Feedback.setError("GET error: " + responseCode + " on " + addr);
                return "";
            }
            BufferedReader bufr = new BufferedReader(new InputStreamReader(ips));
            String line;
            while ((line = bufr.readLine()) != null) result += line;
            bufr.close();
        }
        finally
        {
            if (null != conn) conn.disconnect();
        }
        return result;
    }
    catch (Exception e)
    {
        Feedback.setError("get fault " + Utils.stackTrace(e));
        return "";
    }
}
Feedback is simply a Java class I use internally to handle all messages that I send back to the Android app front end (this is a hybrid app and the code above is part of a plugin I have written for the app).
I find that when any significant amount of data is returned, the resultGET call becomes excruciatingly slow. For instance, a 43 KB JavaScript file - which I later use to run JS code via Duktape - takes the best part of a minute to download and save.
I am still quite a newbie when it comes to Java, so I imagine that I am doing something wrong here which is causing the issue. I'd be most obliged to anyone who might be able to put me on the right track.
A while later...
I have now tested the issue on an Android 6 device instead of my default Android 4.4.2 device. On Android 6 the download plus file save comes in at a decent 5 seconds; on Android 4.4.2 it is over 40 seconds. Are there any known issues with HttpURLConnection on earlier versions of Android?
String result = "";
The += operator on a String is slow: each concatenation copies everything accumulated so far, so the loop is quadratic in the size of the download. If you have a lot of lines, use a StringBuilder sb = new StringBuilder(); and use its append() method, e.g. sb.append(line).append('\n');
At the end you can use result = sb.toString();
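For reference, a minimal, untested sketch of the read portion of resultGET rewritten that way (the 8 KB buffer size and UTF-8 charset are assumptions; everything else comes from the question's code):
// Sketch: a StringBuilder plus a char[] buffer replaces the quadratic
// += loop. The buffer size and charset here are assumptions, not requirements.
StringBuilder sb = new StringBuilder();
BufferedReader bufr = new BufferedReader(new InputStreamReader(ips, "UTF-8"), 8192);
char[] buf = new char[8192];
int n;
while ((n = bufr.read(buf)) != -1) sb.append(buf, 0, n);
bufr.close();
result = sb.toString();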
Related
Getting stuck in an infinite loop while retrieving an API result.
Initially I get the scan percentage 2-3 times, but after that I don't get any response.
My Java code:
int responseCode = 200;
String responseText = "text";
while (!responseText.equalsIgnoreCase("100") && responseCode == 200) {
    URL urlForGetRequest = new URL("http://localhost:8090/burp/scanner/status");
    String readLine = null;
    HttpURLConnection conection = (HttpURLConnection) urlForGetRequest.openConnection();
    conection.setRequestMethod("GET");
    responseCode = conection.getResponseCode();
    System.out.println("response" + responseCode);
    BufferedReader in = new BufferedReader(new InputStreamReader(conection.getInputStream()));
    String response = "";
    while ((readLine = in.readLine()) != null) {
        response += readLine;
    }
    in.close();
    JSONObject jsonObj = new JSONObject(response);
    // print result
    responseText = String.valueOf(jsonObj.get("scanPercentage"));
    System.out.println(responseText);
    TimeUnit.MINUTES.sleep(2);
}
Output I got:
response200
0
response200
4
response200
14
response200
17
and after this the code kept on running without any output.
Note: when I perform the GET from its Swagger UI, there is one error, i.e. TypeError: Failed to fetch.
There may be a chance the issue is in the response text; check the actual text the sender is sending.
Or just strip the surrounding whitespace from the response text using the following one-line fix before processing it. It may work for you if such a problem is there:
responseText = responseText.replaceAll("^\\s+|\\s+$", "");
Put the code in a try/catch block and check what exception it's throwing.
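A minimal sketch of the loop body arranged that way, with connect and read timeouts added so a hung server throws SocketTimeoutException instead of blocking readLine() forever (the 10-second values are assumptions):
HttpURLConnection conection = null;
try {
    conection = (HttpURLConnection) urlForGetRequest.openConnection();
    conection.setConnectTimeout(10000); // ms allowed to establish the connection
    conection.setReadTimeout(10000);    // ms allowed per read before timing out
    conection.setRequestMethod("GET");
    responseCode = conection.getResponseCode();
    // ... read and parse the body as in the question ...
} catch (IOException e) {
    // prints exactly why the request stalled or failed
    e.printStackTrace();
} finally {
    if (conection != null) conection.disconnect(); // don't leak connections between polls
}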
You could try to check whether it gets stuck in the while loop by adding an output and seeing if it keeps writing when it gets stuck.
while ((readLine = in.readLine()) != null) {
    response += readLine;
    System.out.print(".");
}
I don't think this is the reason but it is worth a try.
I'm trying to use the open data sets that data.LACity.org publishes using Socrata software.
They have a Java API for it, but first I tried to just build and send a URL, as
a variant on the 'Sunshine' app several people have learned from on Udacity.
Now I'm actually building a URL and then sending it out, but I get a FileNotFoundException, as follows:
java.io.FileNotFoundException: http://data.lacity.org/resource/yv23-pmwf.json?$select=zip_code, issue_date, address_start, address_end, street_name, street_suffix, work_description, valuation&$where=issue_date >= '2015-02-27T00:00:00' AND zip_code = 90291
Here's the pisser: That whole URL is, as a final attempt, hardwired as a complete string, not built from pieces. The URL works if I plug it into Chrome, but not from my app.
But from my app, when I plug in the old URL string that the Sunshine sample app builds (copied from logcat during a Sunshine run) in place of the lacity URL, that call works and returns the JSON data.
So I'm doing something wrong when I call the LACity URL for Socrata data from my Android app. I've tried this both as https and http, and both failed. But the same code works when I call the weathermap data from the sample app.
Here are the two URLs:
http://api.openweathermap.org/data/2.5/forecast/daily?q=94043&mode=json&units=metric&cnt=7 <<< this works, both in Chrome and from Android
https://data.lacity.org/resource/yv23-pmwf.json?$select=zip_code, issue_date, address_start, address_end, street_name, street_suffix, work_description, valuation&$where=issue_date >= '2015-02-27T00:00:00' AND zip_code = 90291
This works in Chrome but not from Android.
Any suggestions would be appreciated. I'm going to try again to make heads or tails of the Socrata Soda2 Java API (and why, in this case, it might be necessary).
Thanks
-k-
The immediate code fragment (pardon my newness to Android/Java):
final String PERMIT_BASE_URL = "one of the url strings above";
try {
    Uri builtUri = Uri.parse(PERMIT_BASE_URL).buildUpon()
            .build();
    URL url = new URL(builtUri.toString());
    Log.v(LOG_TAG, "Build URL: " + url.toString());
    urlConnection = (HttpURLConnection) url.openConnection();
    urlConnection.setRequestMethod("GET");
    urlConnection.connect();
    InputStream inputStream = urlConnection.getInputStream();
    StringBuffer buffer = new StringBuffer();
    if (inputStream == null) {
        return null;
    }
    reader = new BufferedReader(new InputStreamReader(inputStream));
    String line;
    while ((line = reader.readLine()) != null) {
        // simplify debugging
        buffer.append(line + "\n");
    }
    if (buffer.length() == 0) {
        return null;
    }
    permitJsonStr = buffer.toString();
    Log.v(LOG_TAG, "Permit JSON string: " + permitJsonStr);
} catch (IOException e) {
    Log.e(LOG_TAG, "Error on Xoom", e);
    // Nothing to parse.
    return null;
} finally {
    if (urlConnection != null) {
        urlConnection.disconnect();
    }
    if (reader != null) {
        try {
            reader.close();
        } catch (final IOException e) {
            Log.e(LOG_TAG, "Error closing stream on Xoom", e);
        }
    }
}
Figured this out from the way this page highlighted the URLs in my question.
Spaces.
The call out of Android seems to cough because of the spaces in the URL string.
I closed them all up, but then the 'AND' caused issues.
Replaced it with &, now it works, hardwired.
I'll work on constructing it from input parameters, as intended, but I think this is OK.
As Emily Litella would say...
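For what it's worth, a minimal, untested sketch of building the same request with Android's Uri.Builder, which percent-encodes the query values so the spaces and the AND never appear raw in the URL (the parameter values are the ones from the question):
// Sketch: Uri.Builder percent-encodes each query value automatically.
// The $select/$where values below are copied from the question's URL.
Uri builtUri = Uri.parse("https://data.lacity.org/resource/yv23-pmwf.json").buildUpon()
        .appendQueryParameter("$select", "zip_code, issue_date, address_start, address_end, "
                + "street_name, street_suffix, work_description, valuation")
        .appendQueryParameter("$where", "issue_date >= '2015-02-27T00:00:00' AND zip_code = 90291")
        .build();
URL url = new URL(builtUri.toString());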
In my Wikipedia reader app for Android, I'm downloading an article's HTML using HttpURLConnection. Some users report that they are unable to see articles; instead they see some CSS. So it seems like their carrier is somehow preprocessing the HTML before it's downloaded, while other Wikipedia readers seem to work fine.
Example url: http://en.m.wikipedia.org/wiki/Black_Moon_(album)
My method:
public static String downloadString(String url) throws Exception
{
    StringBuilder downloadedHtml = new StringBuilder();
    HttpURLConnection urlConnection = null;
    String line = null;
    BufferedReader rd = null;
    try
    {
        URL targetUrl = new URL(url);
        urlConnection = (HttpURLConnection) targetUrl.openConnection();
        if (url.toLowerCase().contains("/special"))
            urlConnection.setInstanceFollowRedirects(true);
        else
            urlConnection.setInstanceFollowRedirects(false);
        // read the result from the server
        rd = new BufferedReader(new InputStreamReader(urlConnection.getInputStream()));
        while ((line = rd.readLine()) != null)
            downloadedHtml.append(line).append('\n');
    }
    catch (Exception e)
    {
        AppLog.e("An exception occurred while downloading data.\r\n: " + e);
        e.printStackTrace();
    }
    finally
    {
        if (urlConnection != null)
        {
            AppLog.i("Disconnecting the http connection");
            urlConnection.disconnect();
        }
        if (rd != null)
            rd.close();
    }
    return downloadedHtml.toString();
}
I'm unable to reproduce this problem, but there must be a way to get around it? I even disabled redirects by setting setInstanceFollowRedirects to false, but it didn't help.
Am I missing something?
Example of what the users are reporting:
http://pastebin.com/1E3Hn2yX
carrier is somehow preprocessing the html before it's downloaded
a way to get around that?
Use HTTPS to prevent carriers from rewriting pages. (no citation)
Am I missing something?
not that I can see
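One way to apply that in downloadString(), as an untested sketch (it assumes the article URLs always start with plain http://):
// Sketch: upgrade plain-http article URLs to HTTPS before connecting,
// so intermediaries can't rewrite the page in transit.
if (url.startsWith("http://"))
    url = "https://" + url.substring("http://".length());
URL targetUrl = new URL(url);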
I'm making an Android app for my WordPress blog and I've set up a static class Server.java in my Android app to communicate with a PHP file that I created on my hosting servers.
I use URLConnection, DataOutputStream and DataInputStream to do the communication. One of my functions gets the latest 10 posts from my WordPress blog, and it is invoked the following way:
in any of my Java files, call Server.getPosts(). Below is my Server.java file (the parts you need to see, at least):
public class Server
{
    public static final String SERVER_URL = "http://www.startingtofeelit.com/android-server.php";

    public static String getPosts()
    {
        return executeHttpRequest("command=getPosts");
    }

    @SuppressWarnings("deprecation")
    private static String executeHttpRequest(String data)
    {
        String result = new String();
        try
        {
            URL url = new URL(SERVER_URL);
            URLConnection connection = url.openConnection();
            connection.setDoInput(true);
            connection.setDoOutput(true);
            connection.setUseCaches(false);
            connection.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
            DataOutputStream dataOut = new DataOutputStream(connection.getOutputStream());
            dataOut.writeBytes(data);
            dataOut.flush();
            dataOut.close();
            // get the response from the server and store it in result
            DataInputStream dataIn = new DataInputStream(connection.getInputStream());
            String inputLine;
            while ((inputLine = dataIn.readLine()) != null)
            {
                result += inputLine;
            }
            dataIn.close();
        }
        catch (IOException e)
        {
            result = null;
        }
        System.out.println("Returning from Server.java");
        return result;
    }
}
I got most of this code from a guide online. It returns a string that has the xml representation of my posts. If you'd like, you can view the xml/return string here, but that is not really necessary for you to help me.
My main problem is here: I'm getting a huge number of garbage-collection log messages. Something about the way my code is set up produces massive numbers of messages tagged "dalvikvm", such as GC_CONCURRENT freed 312K, 5% free 9989K/10439K, paused 10ms+15ms, in my Eclipse LogCat, along with other messages similar to that. It slows down the load of my app immensely. It seems to be originating in my while loop, but I don't know how else to do this. Any help?
Use a StringBuilder in the while loop. You're creating a new String object each time through the loop (and leaving the old one eligible for GC), which could be impacting performance and causing the GC to run more often than usual. Remember, Strings are immutable in Java.
StringBuilder response = new StringBuilder();
String line;
while ((line = dataIn.readLine()) != null)
{
    response.append(line);
}
return response.toString();
final StringBuffer buff = new StringBuffer();
String s;
while ((s = in.readLine()) != null) {
    buff.append(s);
}
return buff.toString();
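Either builder removes the quadratic concatenation (StringBuffer is just the synchronized variant of StringBuilder, which single-threaded code doesn't need). Since the method already suppresses a deprecation warning for DataInputStream.readLine(), here is an untested sketch of the read portion that also drops the deprecated call:
// Sketch: BufferedReader replaces the deprecated DataInputStream.readLine(),
// and the StringBuilder replaces the quadratic += concatenation.
BufferedReader dataIn = new BufferedReader(new InputStreamReader(connection.getInputStream()));
StringBuilder response = new StringBuilder();
String inputLine;
while ((inputLine = dataIn.readLine()) != null)
{
    response.append(inputLine);
}
dataIn.close();
result = response.toString();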
I've been asked to write an inverted index, so as a start I would like to write a Java program that Google-searches a word and puts the results into an ArrayList.
Here's my code:
String search = "Dan";
String google = "http://www.google.com/cse/publicurl?cx=012216303403008813404:kcqyeryhhm8&q=" + search;
URL url = new URL(google);
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setRequestMethod("GET");
conn.setRequestProperty("Accept", "application/json");
BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));
// Gather the results to a String array
List<String> resultsList = new ArrayList<String>();
String r;
while ((r = reader.readLine()) != null)
    resultsList.add(r);
conn.disconnect();
System.out.println("Google Search for: " + search + " Is Done!");
The program runs with no crashes, but I only get the source code of a page (which does not contain any links).
What do I need to change in the code? Maybe I need a whole different method?
If you want to use Google search in your app, you should use Google's API for that:
Custom search API
You get the search results back in JSON format.
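A minimal, untested sketch of calling it with the same HttpURLConnection approach (YOUR_API_KEY and YOUR_CX are placeholders you get from the Google developer console; the "items"/"link" fields follow the API's documented JSON layout):
// Sketch: query the Custom Search JSON API and collect the result links.
// YOUR_API_KEY and YOUR_CX are placeholders, not real credentials.
String search = URLEncoder.encode("Dan", "UTF-8");
String api = "https://www.googleapis.com/customsearch/v1?key=YOUR_API_KEY&cx=YOUR_CX&q=" + search;
HttpURLConnection conn = (HttpURLConnection) new URL(api).openConnection();
conn.setRequestMethod("GET");
BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));
StringBuilder body = new StringBuilder();
String r;
while ((r = reader.readLine()) != null)
    body.append(r);
reader.close();
conn.disconnect();
List<String> resultsList = new ArrayList<String>();
JSONArray items = new JSONObject(body.toString()).optJSONArray("items");
if (items != null)
    for (int i = 0; i < items.length(); i++)
        resultsList.add(items.getJSONObject(i).getString("link"));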