I'm building a project in which I want a method that makes a simple HTTP GET request in order to send two variables to a website via the URL.
In a normal Java project I would likely use java.net or Apache HttpClient and solve the issue in a matter of minutes. In Java ME, due to my lack of experience, I haven't been able to get it working.
Basically, what I want is to take a URL like google.com/index.php?v1=x&v=y
and perform a GET request so those variables are sent via the URL.
Any tips?
Here's an example of how you could do something like that.
import java.io.InputStream;
import javax.microedition.io.Connector;
import javax.microedition.io.HttpConnection;

HttpConnection connection = null;
InputStream inputstream = null;
String url = "http://www.google.com/index.php?v1=x&v=y";
StringBuffer dataReceived = new StringBuffer();

try {
    // Connector.open returns an HttpConnection for http:// URLs
    connection = (HttpConnection) Connector.open(url);
    connection.setRequestMethod(HttpConnection.GET);
    connection.setRequestProperty("Content-Type", "text/plain");
    connection.setRequestProperty("Connection", "close");

    if (connection.getResponseCode() == HttpConnection.HTTP_OK) {
        // Read the response body one byte at a time
        inputstream = connection.openInputStream();
        int ch;
        while ((ch = inputstream.read()) != -1) {
            dataReceived.append((char) ch);
        }
    } else {
        // Connection not OK
    }
} catch (Exception e) {
    // Something went wrong
} finally {
    // Always release the stream and the connection
    if (inputstream != null) {
        try {
            inputstream.close();
        } catch (Exception e) {
        }
    }
    if (connection != null) {
        try {
            connection.close();
        } catch (Exception e) {
        }
    }
}
Note: I didn't test this specific code; I just adapted some code I had lying around from a previous project of mine, so you may need to fix a few errors.
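One caveat worth adding (not part of the original snippet): Connector.open and the stream reads block, so in a MIDlet you'd normally run this off the UI event thread. A minimal sketch:

new Thread(new Runnable() {
    public void run() {
        // run the HttpConnection code above here, then hand the
        // result back to the UI, e.g. via Display.callSerially(...)
    }
}).start();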
After some research, I figured out what seemed like the easiest way to download an image and store it in a file.
This is my code so far:
public boolean descargarArchivo(String url, String outputDirectory) {
    try {
        File img = new File(outputDirectory);
        URLConnection conn = new URL(url).openConnection();
        conn.connect();
        try (InputStream in = conn.getInputStream();
             OutputStream out = new FileOutputStream(img)) {
            // Copy the response body to the file one byte at a time
            int b;
            while ((b = in.read()) != -1) {
                out.write(b);
            }
        }
        return true;
    } catch (IOException ioe) {
        ioe.printStackTrace();
    }
    return false;
}
The issue with this code is that it sometimes downloads the picture incorrectly. Let me clarify with an example:
While the first picture is the original image (and the output file should look like it), the second picture is the actual output file produced by this code, which is clearly wrong (ignore the resolution; I'm talking about the corrupted pixels). Is there any way to improve this? Should I change the way I download images from the web?
This is how I read images from a URL:
try {
    final InputStream inputStream = createInputStream(new URL(getImgUrl()));
    try {
        return ImageIO.read(inputStream);
    } finally {
        inputStream.close();
    }
} catch (MalformedURLException e) {
    e.printStackTrace();
} catch (SocketTimeoutException e) {
    //e.printStackTrace(); in case of error, will try again
} catch (IOException e) {
    e.printStackTrace();
}
using
protected InputStream createInputStream(URL url) throws IOException {
    URLConnection con = url.openConnection();
    con.setConnectTimeout(500);
    con.setReadTimeout(200);
    return con.getInputStream();
}
I have a class called RetreiveHttpStringResponse. It's used to get an InputStream from a URL containing JSON data. The class extends AsyncTask<String, Void, InputStream>. The strange problem here is that null is always returned, no matter what. There isn't even an Exception. I stepped through the program with the debugger and saw that at point (1) execution jumps immediately to the finally block and continues with return null;. Again, no Errors and no Exceptions occur; the program runs normally.
I'm using Android 4.4 (SDK version 19), the response code is 200 and the following lines are set in the Manifest file.
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
The problem happens both on the emulator and on a real device with an internet connection. Here is the code:
@Override
protected InputStream doInBackground(String... arg0) {
    URL url = null;
    InputStream is = null;
    HttpURLConnection urlConn = null;
    int responseCode = 0;
    try {
        url = new URL(arg0[0]);
        urlConn = (HttpURLConnection) url.openConnection();
        urlConn.setReadTimeout(10000);
        urlConn.setConnectTimeout(15000);
        urlConn.setRequestMethod("GET");
        urlConn.connect();
        responseCode = urlConn.getResponseCode();
        Log.d("DataHandlerInternet:RESPONSE_CODE", "The response is: " + responseCode);
        is = urlConn.getInputStream(); // -->(1)<--
        return is;
    }
    catch (MalformedURLException e) { // new URL() went wrong!
        //TODO error message. URL is not correct!
        e.printStackTrace();
    }
    catch (SocketTimeoutException e) { // Timeout while connecting or holding connection to URL.
        //TODO error message. Timeout happened!
        e.printStackTrace();
    }
    catch (IOException e) { // openConnection() failed!
        //TODO error message. Couldn't connect to URL!
        e.printStackTrace();
    }
    catch (Exception e) { // Any other Exception!
        e.printStackTrace();
    }
    finally {
        try { if (is != null) { is.close(); } } catch (Exception e) { e.printStackTrace(); }
        try { if (urlConn != null) { urlConn.disconnect(); } } catch (Exception e) { e.printStackTrace(); }
    }
    return null;
}
One bad solution is to delete the finally block; clearly not the best way to solve this problem.
I've now changed the code: I moved the reading into doInBackground and return just the String.
@Override
protected String doInBackground(String... arg0) {
    URL url = null;
    InputStream is = null;
    HttpURLConnection urlConn = null;
    int responseCode = 0;
    try {
        url = new URL(arg0[0]);
        urlConn = (HttpURLConnection) url.openConnection();
        urlConn.setReadTimeout(10000);
        urlConn.setConnectTimeout(15000);
        urlConn.setRequestMethod("GET");
        urlConn.connect();
        responseCode = urlConn.getResponseCode();
        Log.d("DataHandlerInternet:RESPONSE_CODE", "The response is: " + responseCode);
        is = urlConn.getInputStream();
        StringBuilder sb = new StringBuilder();
        BufferedReader br = new BufferedReader(new InputStreamReader(is));
        String line = null;
        while ((line = br.readLine()) != null) {
            sb.append(line);
        }
        return sb.toString();
    }
    catch (MalformedURLException e) { // new URL() went wrong!
        //TODO error message. URL is not correct!
        e.printStackTrace();
    }
    catch (SocketTimeoutException e) { // Timeout while connecting or holding connection to URL.
        //TODO error message. Timeout happened!
        e.printStackTrace();
    }
    catch (IOException e) { // openConnection() failed!
        //TODO error message. Couldn't connect to URL!
        e.printStackTrace();
    }
    catch (Exception e) { // Any other Exception!
        e.printStackTrace();
    }
    finally {
        try { if (is != null) { is.close(); } } catch (Exception e) { e.printStackTrace(); }
        try { if (urlConn != null) { urlConn.disconnect(); } } catch (Exception e) { e.printStackTrace(); }
    }
    return null;
}
And still, after going through the while loop, the return statement is completely ignored. I've checked the data in the String with the debugger and it was correct! No Errors, no Exceptions.
finally runs in either case, including during a normal return without exceptions, and you call .close in the finally clause.
So your first version always returns the closed stream. That is probably not what you intend.
Your description ("jumps to the finally block") still looks very much like an exception being thrown by urlConn.getInputStream(). It's strange that you don't observe it.
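To make the ordering concrete, here's a minimal sketch (names are illustrative): the return value is fixed before finally runs, so the caller receives a stream that has already been closed.

import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

static InputStream open(String path) throws IOException {
    InputStream is = null;
    try {
        is = new FileInputStream(path);
        return is; // the reference to return is captured here...
    } finally {
        if (is != null) {
            is.close(); // ...but this still runs first, so the caller
        }               // gets back a stream that is already closed
    }
}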
I don't see why you get your null result, but one thing you are doing wrong is returning the InputStream itself:
is = urlConn.getInputStream(); // -->(1)<--
return is;
You should read your stream in doInBackground (on the worker thread); otherwise, reading it in onPostExecute (on the UI thread) may cause a NetworkOnMainThreadException, or at least an ANR. Reading data from an InputStream is still a network operation, and the data you download can be several MBs.
I'm writing a client J2ME app that connects to a PHP-based API using the POST method. For security purposes, the app and the API should interact within a session with a 30-minute timeout. My problem is that the application user has to keep logging in even before the session has timed out. My PHP code is fine; I've tested it in a browser and it works. The application, however, fails and I have to keep logging in. Might I be missing something? These are the headers I'm sending to the server from my Java app.
String phonenumber = this.phonenumberTextbox.getString();
String password = this.passwordTextBox.getString();
HttpConnection httpConn = null;
String url = "http://192.168.56.1/?action=login-user";
InputStream is = null;
OutputStream os = null;
String sReply = "";
boolean loggedIn = false;
try {
    // Open an HTTP Connection object
    httpConn = (HttpConnection) Connector.open(url);
    // Set up the HTTP request to POST
    httpConn.setRequestMethod(HttpConnection.POST);
    httpConn.setRequestProperty("User-Agent", "Profile/MIDP-1.0 Configuration/CLDC-1.0");
    httpConn.setRequestProperty("Accept-Language", "en-US");
    // Content-Type is a must to pass parameters in a POST request
    httpConn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
    os = httpConn.openOutputStream();
    String params = "phonenumber=" + phonenumber + "&password=" + password;
    os.write(params.getBytes());
    // Read the response from the server
    StringBuffer sb = new StringBuffer();
    is = httpConn.openDataInputStream();
    int chr;
    while ((chr = is.read()) != -1) {
        sb.append((char) chr);
    }
    // Web server just returns stuff
    sReply = sb.toString();
} catch (IOException ex) {
    System.out.println("Cannot find server");
} finally {
    if (is != null) {
        try {
            is.close();
        } catch (IOException ex) {
        }
    }
    if (os != null) {
        try {
            os.close();
        } catch (IOException ex) {
        }
    }
    if (httpConn != null) {
        try {
            httpConn.close();
        } catch (IOException ex) {
        }
    }
}
// do something with sReply
Default practice in PHP these days** when dealing with sessions is for PHP to generate a cookie for the client to store. Your Java client must store this cookie (called PHPSESSID by default, though any PHP programmer worth their salt will change the cookie name) and present it in the HTTP headers on subsequent requests.
I'm not very familiar with the HTTP support in Java, but if it's anything like curl, it will include options for setting and retrieving cookies.
** It used to be possible to pass session tokens in a query string (a GET parameter) when making the request, but as this was horribly insecure and leaked potentially sensitive information, it's never used any more and may no longer even be available as an option.
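In J2ME terms (a common pattern, not from the original answer), that means capturing the Set-Cookie header from the login response and echoing it back on every later request. A minimal sketch, assuming a single session cookie and naively stripping its attributes:

// After the login request has been sent:
String setCookie = httpConn.getHeaderField("Set-Cookie");
String sessionCookie = null;
if (setCookie != null) {
    // Keep only "PHPSESSID=..." and drop attributes like Path or Expires
    int semicolon = setCookie.indexOf(';');
    sessionCookie = (semicolon == -1) ? setCookie : setCookie.substring(0, semicolon);
}

// On every subsequent request, before opening any streams:
if (sessionCookie != null) {
    httpConn.setRequestProperty("Cookie", sessionCookie);
}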
I have written a test web crawler class that attempts to search Google, as shown:
public class WebCrawler {
    String query;

    public WebCrawler(String search)
    {
        query = search;
    }

    public void connect()
    {
        HttpURLConnection connection = null;
        try
        {
            String url = "http://www.google.com/search?q=" + query;
            URL search = new URL(url);
            connection = (HttpURLConnection) search.openConnection();
            connection.setRequestMethod("GET");
            connection.setDoOutput(true); // not needed for a plain GET
            connection.setDoInput(true);
            connection.setUseCaches(false);
            connection.setAllowUserInteraction(false);
            connection.connect();
            BufferedReader read = new BufferedReader(new InputStreamReader(connection.getInputStream()));
            String line = null;
            while ((line = read.readLine()) != null)
            {
                System.out.println(line);
            }
            read.close();
        }
        catch (MalformedURLException e)
        {
            e.printStackTrace();
        }
        catch (ProtocolException e)
        {
            e.printStackTrace();
        }
        catch (IOException e)
        {
            e.printStackTrace();
        }
        finally
        {
            if (connection != null)
            {
                connection.disconnect();
            }
        }
    }
}
When I try to run it with the test query "test", though, I get an HTTP 403 response. What am I missing? This is my first time doing any networking with Java.
403 == Forbidden, which makes sense because you're a robot trying to access a part of Google that it doesn't want robots accessing. Google's robots.txt pretty clearly specifies that you shouldn't be scraping /search.
Google provides a search API which allows 100 queries per day for free. It provides libraries and examples of how to interface with it in most languages, including Java. Beyond that, you've got to pay.
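As a hedged sketch of what the API route might look like (this targets Google's Custom Search JSON API; API_KEY and SEARCH_ENGINE_ID are placeholders you'd obtain from Google's consoles, and java.net.URLEncoder is used to escape the query):

String endpoint = "https://www.googleapis.com/customsearch/v1"
        + "?key=" + API_KEY
        + "&cx=" + SEARCH_ENGINE_ID
        + "&q=" + URLEncoder.encode(query, "UTF-8");
HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
// ...then read conn.getInputStream() exactly as in the crawler above;
// the response is JSON rather than HTML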
See, I have to check 50+ URLs for validity, and catching 50+ exceptions seems over the top. Is there a way to check whether a bunch of URLs are valid without wrapping everything in try/catch? Also, just FYI: in Android the class UrlValidator doesn't exist (it does in standard Java), and there's URLUtil.isValidUrl(String url), but that method seems to be pleased with whatever you throw at it as long as it contains http://. Any suggestions?
This solution does catch exceptions, but others may find it useful, and it doesn't require any libraries.
public boolean URLIsReachable(String urlString)
{
    try
    {
        URL url = new URL(urlString);
        HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
        int responseCode = urlConnection.getResponseCode();
        urlConnection.disconnect();
        // HTTP_OK == 200; anything else is treated as unreachable
        return responseCode == HttpURLConnection.HTTP_OK;
    } catch (MalformedURLException e)
    {
        e.printStackTrace();
        return false;
    } catch (IOException e)
    {
        e.printStackTrace();
        return false;
    }
}
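If you only need a syntax check without exceptions (and without any network I/O), Android ships a pre-compiled regex in android.util.Patterns. A minimal sketch (the method name is illustrative):

import android.util.Patterns;

public static boolean looksLikeUrl(String url)
{
    // Validates the shape of the URL only; it says nothing about
    // whether the host actually exists or responds
    return url != null && Patterns.WEB_URL.matcher(url).matches();
}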