android - return ignored in try-statement - java

I have a class called RetreiveHttpStringResponse. It's used to get an InputStream from a URL containing JSON data. The class extends AsyncTask<String, Void, InputStream>. The strange problem is that null is always returned, no matter what, and there is not even an Exception. I checked the program's behaviour with the debugger and could see that at point (1) processing jumps immediately to the finally block and continues with return null;. Again, there are no Errors and no Exceptions. The program runs normally.
I'm using Android 4.4 (SDK version 19), the response code is 200, and the following lines are set in the Manifest file.
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
The problem is happening on the emulator and on a real device with internet connection. Here is the code:
@Override
protected InputStream doInBackground(String... arg0) {
    URL url = null;
    InputStream is = null;
    HttpURLConnection urlConn = null;
    int responseCode = 0;
    try {
        url = new URL(arg0[0]);
        urlConn = (HttpURLConnection) url.openConnection();
        urlConn.setReadTimeout(10000);
        urlConn.setConnectTimeout(15000);
        urlConn.setRequestMethod("GET");
        urlConn.connect();
        responseCode = urlConn.getResponseCode();
        Log.d("DataHandlerInternet:RESPONSE_CODE", "The response is: " + responseCode);
        is = urlConn.getInputStream(); //-->(1)<--
        return is;
    }
    catch (MalformedURLException e) { // new URL() went wrong!
        //TODO error message. URL is not correct!
        e.printStackTrace();
    }
    catch (SocketTimeoutException e) { // Timeout while connecting or holding connection to URL.
        //TODO error message. Timeout happened!
        e.printStackTrace();
    }
    catch (IOException e) { // openConnection() failed!
        //TODO error message. Couldn't connect to URL!
        e.printStackTrace();
    }
    catch (Exception e) { // Any other Exception!
        e.printStackTrace();
    }
    finally {
        try { if (is != null) { is.close(); } } catch (Exception e) { e.printStackTrace(); }
        try { if (urlConn != null) { urlConn.disconnect(); } } catch (Exception e) { e.printStackTrace(); }
    }
    return null;
}
One bad solution is to delete the finally block, but that is hardly the best way to solve this problem.
I have now changed the code: I moved the reading into doInBackground and return just the String.
@Override
protected String doInBackground(String... arg0) {
    URL url = null;
    InputStream is = null;
    HttpURLConnection urlConn = null;
    int responseCode = 0;
    try {
        url = new URL(arg0[0]);
        urlConn = (HttpURLConnection) url.openConnection();
        urlConn.setReadTimeout(10000);
        urlConn.setConnectTimeout(15000);
        urlConn.setRequestMethod("GET");
        urlConn.connect();
        responseCode = urlConn.getResponseCode();
        Log.d("DataHandlerInternet:RESPONSE_CODE", "The response is: " + responseCode);
        is = urlConn.getInputStream();
        StringBuilder sb = new StringBuilder();
        BufferedReader br = new BufferedReader(new InputStreamReader(is));
        String line = null;
        while ((line = br.readLine()) != null) {
            sb.append(line);
        }
        return sb.toString();
    }
    catch (MalformedURLException e) { // new URL() went wrong!
        //TODO error message. URL is not correct!
        e.printStackTrace();
    }
    catch (SocketTimeoutException e) { // Timeout while connecting or holding connection to URL.
        //TODO error message. Timeout happened!
        e.printStackTrace();
    }
    catch (IOException e) { // openConnection() failed!
        //TODO error message. Couldn't connect to URL!
        e.printStackTrace();
    }
    catch (Exception e) { // Any other Exception!
        e.printStackTrace();
    }
    finally {
        try { if (is != null) { is.close(); } } catch (Exception e) { e.printStackTrace(); }
        try { if (urlConn != null) { urlConn.disconnect(); } } catch (Exception e) { e.printStackTrace(); }
    }
    return null;
}
And still, after going through the while loop, the return statement is completely ignored. I checked the data in the String with the debugger and it was correct! No Errors, no Exceptions.

finally runs in either case, also during a normal return without exceptions, and you call .close() in the finally block.
So your code always returns a closed stream. That is probably not what you intend.
Your description ("jumps to the finally block") still looks very much like an exception has been thrown by urlConn.getInputStream(). It is strange that you do not observe it.

I don't see why you get your null result, but one thing you are doing wrong is returning the InputStream itself:
is= urlConn.getInputStream(); //-->(1)<--
return is;
You should read your stream in doInBackground() (on the worker thread); otherwise, reading it in onPostExecute() (on the UI thread) may cause a NetworkOnMainThreadException, or at least an ANR. Reading data from the InputStream is still a network operation, and the data you download can be several MBs.
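For example, a minimal sketch of that approach (the class name FetchJsonTask, the log tag, and the UTF-8 charset are my own choices; try-with-resources needs minSdkVersion 19, which matches the question):

import android.os.AsyncTask;
import android.util.Log;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// Sketch only: same idea as the second attempt in the question, but the stream
// is closed by try-with-resources after the body has been read, not before.
public class FetchJsonTask extends AsyncTask<String, Void, String> {

    @Override
    protected String doInBackground(String... arg0) {
        HttpURLConnection urlConn = null;
        try {
            URL url = new URL(arg0[0]);
            urlConn = (HttpURLConnection) url.openConnection();
            urlConn.setReadTimeout(10000);
            urlConn.setConnectTimeout(15000);
            urlConn.setRequestMethod("GET");
            urlConn.connect();
            Log.d("FetchJsonTask", "Response code: " + urlConn.getResponseCode());

            try (BufferedReader br = new BufferedReader(
                    new InputStreamReader(urlConn.getInputStream(), "UTF-8"))) {
                StringBuilder sb = new StringBuilder();
                String line;
                while ((line = br.readLine()) != null) {
                    sb.append(line);
                }
                return sb.toString();  // the reader is closed after this value is computed
            }
        } catch (IOException e) {      // covers MalformedURLException and SocketTimeoutException
            e.printStackTrace();
            return null;
        } finally {
            if (urlConn != null) {
                urlConn.disconnect();
            }
        }
    }
}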

Related

setting connectionTimeOut and readTimeout not working on UrlResource

I am trying to terminate a connection if no data is being received, or if the server is just keeping the connection open for a URL, by setting connectTimeout and readTimeout.
I have created an anonymous subclass of UrlResource and fetch the data from the URL. The code block below is from a Spring project; the Spring Boot version is 2.7.1.
try {
    URL url = new URL("http://httpstat.us/200?sleep=20000");
    UrlResource urlResource = new UrlResource(url) {
        @Override
        protected void customizeConnection(HttpURLConnection connection) throws IOException {
            super.customizeConnection(connection);
            connection.setConnectTimeout(4000);
            connection.setReadTimeout(2000);
        }
    };
    InputStream inputStream = urlResource.getInputStream();
    InputStreamReader isr = new InputStreamReader(inputStream, StandardCharsets.UTF_8);
    BufferedReader br = new BufferedReader(isr);
    br.lines().forEach(line -> System.out.println(line));
} catch (MalformedURLException e) {
    e.printStackTrace();
} catch (IOException e) {
    System.out.println("IO exception");
    e.printStackTrace();
}
I am using a service (http://httpstat.us/200?sleep=20000) that holds the connection open for a specified amount of time so I can test connection termination, but the connection is not terminated after the specified timeout.
Is there any other way to customize UrlResource so that the timeouts can be set?
It looks like UrlResource.getInputStream() fails to call customizeConnection(con) in its logic:
public InputStream getInputStream() throws IOException {
    URLConnection con = this.url.openConnection();
    ResourceUtils.useCachesIfNecessary(con);
    try {
        return con.getInputStream();
    }
    catch (IOException ex) {
        // Close the HTTP connection (if applicable).
        if (con instanceof HttpURLConnection httpConn) {
            httpConn.disconnect();
        }
        throw ex;
    }
}
Please raise a GitHub issue for Spring Framework to address this problem.
As a workaround I suggest this:
UrlResource urlResource = new UrlResource(url) {

    @Override
    public InputStream getInputStream() throws IOException {
        URLConnection con = getURL().openConnection();
        customizeConnection(con);
        try {
            return con.getInputStream();
        }
        catch (IOException ex) {
            // Close the HTTP connection (if applicable).
            if (con instanceof HttpURLConnection httpConn) {
                httpConn.disconnect();
            }
            throw ex;
        }
    }

    @Override
    protected void customizeConnection(HttpURLConnection connection) throws IOException {
        super.customizeConnection(connection);
        connection.setReadTimeout(2000);
    }
};
So I override that getInputStream() with the same logic, but also apply our customizeConnection() to it. With that fix your test fails like this:
java.net.SocketTimeoutException: Read timed out
at java.base/sun.nio.ch.NioSocketImpl.timedRead(NioSocketImpl.java:283)
at java.base/sun.nio.ch.NioSocketImpl.implRead(NioSocketImpl.java:309)
at java.base/sun.nio.ch.NioSocketImpl.read(NioSocketImpl.java:350)
at java.base/sun.nio.ch.NioSocketImpl$1.read(NioSocketImpl.java:803)
at java.base/java.net.Socket$SocketInputStream.read(Socket.java:966)
at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:244)
at java.base/java.io.BufferedInputStream.read1(BufferedInputStream.java:284)
at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:343)
at java.base/sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:791)
at java.base/sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:726)
at java.base/sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1688)
at java.base/sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1589)

W/System: A resource failed to call end. When using HttpsURLConnection - Android

In the logcat I get a warning:
W/System: A resource failed to call end.
I am 100% positive that this piece of code produces the warning, since when I take it out, the warning stops.
I can't seem to fix it so that the warning no longer appears.
The purpose of the code is to check whether there is an internet connection or not.
It is on a separate thread, declared with:
public class ConnectWifiThread extends Thread {
public static boolean isInternetAvailable(Context context) {
Here is the code:
try {
    URL url = new URL("https://www.google.com/");
    HttpsURLConnection https = (HttpsURLConnection) url.openConnection();
    https.setRequestProperty("User-Agent", "test");
    https.setRequestProperty("Connection", "close");
    https.setConnectTimeout(2000); // timeout is in milliseconds
    https.connect();
    int tempResponse = https.getResponseCode();
    if (tempResponse == 200) {
        https.disconnect();
        Thread.sleep(50);
        https = null;
        url = null;
        Thread.sleep(50);
        Log.d("Has", "internet");
        return true;
    } else {
        https.disconnect();
        Thread.sleep(50);
        https = null;
        url = null;
        Thread.sleep(50);
        Log.d("NO", "internet");
        return false;
    }
} catch (MalformedURLException e) {
    e.printStackTrace();
    return false;
} catch (IOException e) {
    e.printStackTrace();
    return false;
} catch (Exception e) {
    Log.d("Error checking internet", e.getMessage());
    return false;
}
Thank you
When you use a URLConnection, by default it makes an input stream for you to read the response. You'd get that by calling getInputStream() on the connection, and then you'd read the stream to completion and close it.
If you don't need the data, you can alternatively call setDoInput(false) to save you the trouble of doing the above.
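As an illustration of the first option, here is a hedged sketch of the connectivity check that drains and closes the input stream before disconnecting, which is the cleanup the warning complains about (the class and method names are mine, not from the question):

import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import javax.net.ssl.HttpsURLConnection;

public final class ConnectivityCheck {

    // Illustrative helper: returns true when https://www.google.com/ answers with HTTP 200.
    public static boolean hasInternet() {
        HttpsURLConnection https = null;
        try {
            URL url = new URL("https://www.google.com/");
            https = (HttpsURLConnection) url.openConnection();
            https.setRequestProperty("User-Agent", "test");
            https.setRequestProperty("Connection", "close");
            https.setConnectTimeout(2000); // milliseconds
            https.connect();
            int responseCode = https.getResponseCode();

            // Drain and close the stream so the connection's resources are released.
            try (InputStream in = https.getInputStream()) {
                byte[] buffer = new byte[4096];
                while (in.read(buffer) != -1) {
                    // discard the body; we only care about the response code
                }
            }
            return responseCode == 200;
        } catch (IOException e) {
            return false;
        } finally {
            if (https != null) {
                https.disconnect();
            }
        }
    }
}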

JavaME - HTTP Get - Send Values to Website

I'm building a project in which I want a method that makes a simple HTTP GET request in order to send two variables to a website via the URL.
In a normal Java project I would likely use java.net or Apache and solve the issue in a matter of minutes. In Java ME, due to my lack of experience, I'm not really able to accomplish the task.
Basically, given a URL like google.com/index.php?v1=x&v=y,
I want to be able to make a GET request in order to send those variables via the URL.
Any tips?
Here's an example of how you could do something like that.
HttpConnection connection = null;
InputStream inputstream = null;
String url = null;
StringBuffer dataReceived = null;

url = "http://www.google.com/index.php?v1=x&v=y";
dataReceived = new StringBuffer();

try {
    connection = (HttpConnection) Connector.open(url);
    connection.setRequestMethod(HttpConnection.GET);
    connection.setRequestProperty("Content-Type", "text/plain");
    connection.setRequestProperty("Connection", "close");
    if (connection.getResponseCode() == HttpConnection.HTTP_OK) {
        inputstream = connection.openInputStream();
        int ch;
        while ((ch = inputstream.read()) != -1) {
            dataReceived.append((char) ch);
        }
    } else {
        // Connection not ok
    }
} catch (Exception e) {
    // Something went wrong
} finally {
    if (inputstream != null) {
        try {
            inputstream.close();
        } catch (Exception e) {
        }
    }
    if (connection != null) {
        try {
            connection.close();
        } catch (Exception e) {
        }
    }
}
Note: I didn't test this specific code. I just edited some code I had lying around from a previous project of mine, so you may need to fix a few errors.

XmlReader returning null on all occasions

I have code that is supposed to get data from ActiveMQ and display it as an RSS feed, but the code gives me no data on the feed; I get an empty list. The reason seems to be that the XmlReader stays null: I set XmlReader reader = null; and the reader seems to remain null during the whole execution.
public List<RssFeedMessage> readRssFeeds(@PathVariable String sourceName) {
    XmlReader reader = null;
    RssFeedMessage rssFeedMessage = null;
    StringBuffer feedUrl = new StringBuffer("http://").append(ipaddress).append(":")
            .append(port).append("/admin/queueBrowse/").append(sourceName).append("?view=rss&feedType=rss_2.0");
    List<RssFeedMessage> rssFeedMessages = new ArrayList<RssFeedMessage>();
    try {
        URL url = new URL(feedUrl.toString());
        reader = new XmlReader(url);
        SyndFeed feedMsg = new SyndFeedInput().build(reader);
        List<SyndEntry> feedEntries = feedMsg.getEntries();
        for (SyndEntry entry : feedEntries) {
            rssFeedMessage = new RssFeedMessage();
            rssFeedMessage.setTitle(entry.getTitle());
            rssFeedMessage.setDescription(entry.getDescription().getValue());
            rssFeedMessage.setDate(OptimerUtil.simpleDateHourTimeInd.format(entry.getPublishedDate()));
            rssFeedMessages.add(rssFeedMessage);
        }
    } catch (IOException e) {
        e.printStackTrace();
    } catch (IllegalArgumentException e) {
        e.printStackTrace();
    } catch (FeedException e) {
        e.printStackTrace();
    } finally {
        if (reader != null) {
            try {
                reader.close();
            } catch (IOException e) {
            }
        }
    }
    return rssFeedMessages;
}
}
It just exits because the reader remains null the whole time, and I get an IOException on reader = new XmlReader(url);.
Check what feedUrl contains in the line URL url = new URL(feedUrl.toString());
There is probably a problem with the string.
Also, make sure you handle conditions like the string being null or the host being unreachable before parsing it.
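As a quick way to verify this, a small sketch like the following shows whether the ActiveMQ admin URL is even reachable before ROME tries to parse it (the class name, timeouts and log lines are my additions; ipaddress, port and sourceName correspond to the fields used in the question):

import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public final class FeedUrlCheck {

    // Prints the composed feed URL and its HTTP response code so you can see
    // whether the ActiveMQ admin console is reachable before ROME parses it.
    static void check(String ipaddress, int port, String sourceName) throws IOException {
        String feedUrl = "http://" + ipaddress + ":" + port
                + "/admin/queueBrowse/" + sourceName + "?view=rss&feedType=rss_2.0";
        System.out.println("Feed URL: " + feedUrl);

        HttpURLConnection conn = (HttpURLConnection) new URL(feedUrl).openConnection();
        conn.setConnectTimeout(5000);
        conn.setReadTimeout(5000);
        System.out.println("Response code: " + conn.getResponseCode()); // anything but 200 explains the IOException
        conn.disconnect();
    }
}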

java does not seem to download an image correctly

After some research I figured out the easiest way for me to download an image and store it in a file.
This is my code so far:
public boolean descargarArchivo(String url, String outputDirectory) {
    try {
        File img = new File(outputDirectory);
        URLConnection conn = new URL(url).openConnection();
        conn.connect();
        try (InputStream in = conn.getInputStream();
             OutputStream out = new FileOutputStream(img)) {
            int b = 0;
            while (b != -1) {
                b = in.read();
                if (b != -1) {
                    out.write(b);
                }
            }
        }
        return true;
    } catch (IOException ioe) {
        ioe.printStackTrace();
    }
    return false;
}
The issue with this code is that it sometimes downloads the picture incorrectly. Let me clarify with an example:
While the first picture is the original image (and the output file should look like that), the second picture is the actual output file produced by that code, which is clearly wrong (ignore the resolution; I'm just talking about the corrupted pixels). Is there any way to improve this? Should I change the way I download images from the web?
This is how I read images from a URL:
try {
    final InputStream inputStream = createInputStream(new URL(getImgUrl()));
    try {
        return ImageIO.read(inputStream);
    } finally {
        inputStream.close();
    }
} catch (MalformedURLException e) {
    e.printStackTrace();
} catch (SocketTimeoutException e) {
    //e.printStackTrace(); in case of error, will try again
} catch (IOException e) {
    e.printStackTrace();
}
using
protected InputStream createInputStream(URL url) throws IOException {
    URLConnection con = url.openConnection();
    con.setConnectTimeout(500);
    con.setReadTimeout(200);
    return con.getInputStream();
}
