I have found quite a lot of other posts on this topic but none seem to have the answer I need.
I have written a Bukkit plugin for Minecraft that can send post data to a PHP page and get a return from the page.
Now there's one thing I can't figure out: I would like to have a button on the page, and when the button is clicked, send data to the Java plugin and have the plugin print the message.
I have seen something about sockets, but after reading about them I can't figure out how to set them up.
Essentially, at any time you should be able to click the button and have it send data to the Java plugin, so I can use that data however I like.
Does anyone know how I can have the Java plugin constantly waiting for data from the page?
My current code:
(This sends the player's name to the website.)
String re = "";
URL url = new URL("address here");
URLConnection con = url.openConnection();
con.setDoOutput(true);
PrintStream ps = new PrintStream(con.getOutputStream());
ps.print("player=" + player.getName());
con.getInputStream();
BufferedReader rd = new BufferedReader(new InputStreamReader(con.getInputStream()));
String line;
while ((line = rd.readLine()) != null) {
re += line + "\n";
}
rd.close();
ps.close();
And my PHP page just echoes back any POST data it gets.
It works fine, but I would like my Java plugin to listen for data from the PHP page.
There are many ways to enable communication between two servers. I'd use one of these:
Sockets (see the sketch below)
JMS - Java Message Service, such as ActiveMQ
Both of them have tutorials available, just google.
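For illustration, here is a minimal sketch of the sockets approach, assuming the plugin dedicates a background thread to listening on an arbitrary port (4444 here); the port and one-line message format are made up, and the usual java.net and java.io imports are assumed:

new Thread(new Runnable() {
    public void run() {
        try {
            ServerSocket server = new ServerSocket(4444); // arbitrary port
            while (true) {
                Socket client = server.accept(); // blocks until the PHP page connects
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(client.getInputStream()));
                String data = in.readLine(); // one line of data sent by the page
                System.out.println("Received from web page: " + data);
                client.close();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}).start();

On the PHP side, the button's handler would open a connection to that same port with fsockopen() and fwrite() the data.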
You could use a database, or set up a JSON/XML API on the PHP end, and then access the database or fetch the JSON/XML from Java with this example code to open the URL.
URL url = new URL("http://site.com/api/foo.json"); // a protocol is required, or new URL() throws MalformedURLException
try (BufferedReader reader = new BufferedReader(new InputStreamReader(url.openStream(), "UTF-8"))) {
    for (String line; (line = reader.readLine()) != null;) {
        System.out.println(line);
    }
}
You can look at this tutorial to parse JSON with Java.
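For example, here is a minimal parsing sketch with the org.json library (the JSON string and field names are made up for illustration):

String json = "{\"player\":\"Notch\",\"online\":true}"; // e.g. the body read above
try {
    JSONObject obj = new JSONObject(json);
    String player = obj.getString("player");   // "Notch"
    boolean online = obj.getBoolean("online"); // true
    System.out.println(player + " online: " + online);
} catch (JSONException e) {
    e.printStackTrace(); // malformed JSON
}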
Related
I've recently picked up NetBeans as what looks like a comprehensive IDE for writing a GUI application. What struck me as unusual was that it didn't come with any built-in support for creating and reading responses to simple GET and POST requests. While this seems to be because it's aimed at a crowd interested in writing web services in NetBeans, I was hoping there would be some basic request and response functionality in the program regardless.
Is there a library included to serve this purpose already? (If there is, I must be using the wrong methods to look for it.)
Presently, I've downloaded and utilised Apache's HttpClient and IOUtils, which work but look like they may be redundant.
Try this out
For GET
String responseString = "";
URL url = new URL(urlString);
HttpURLConnection c = (HttpURLConnection) url.openConnection(); // connecting to the URL
c.setRequestMethod("GET");
BufferedReader in = new BufferedReader(new InputStreamReader(c.getInputStream())); // stream to the resource
String str;
while ((str = in.readLine()) != null) { // reading data
    responseString += str + "\n"; // process the response, e.g. accumulate it in a string
}
in.close(); // closing the stream
For POST
URL obj = new URL(url);
HttpsURLConnection con = (HttpsURLConnection) obj.openConnection(); // requires an https:// URL; use HttpURLConnection for plain http
con.setRequestMethod("POST");
String urlParameters = "key1=value1&key2=value2"; // placeholder form data
con.setDoOutput(true);

// Write the request body
DataOutputStream wr = new DataOutputStream(con.getOutputStream());
wr.writeBytes(urlParameters);
wr.flush();
wr.close();

// Read the response
BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream()));
String inputLine;
StringBuilder res = new StringBuilder();
while ((inputLine = in.readLine()) != null) {
    res.append(inputLine);
}
in.close();
NetBeans doesn't provide you with any Java classes. All the classes come from the Java libraries; NetBeans is just the IDE that lets you do your work conveniently.
So to translate your question: no, the standard Java libraries don't provide a useful HTTP client other than URLConnection.
The Apache HttpClient library is a good approach, although for basic GETs and POSTs it is far too complex. Coding GETs and POSTs yourself is pretty easy, though; take a look at the HTTP protocol.
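For example, here is a rough sketch of a GET done by hand over a plain socket (example.com and the path are placeholders, and real code would also need to parse the status line and headers properly):

try (Socket socket = new Socket("example.com", 80)) {
    PrintWriter out = new PrintWriter(socket.getOutputStream());
    out.print("GET /index.html HTTP/1.1\r\n");
    out.print("Host: example.com\r\n");
    out.print("Connection: close\r\n");
    out.print("\r\n"); // a blank line ends the request headers
    out.flush();

    BufferedReader in = new BufferedReader(
            new InputStreamReader(socket.getInputStream()));
    String line;
    while ((line = in.readLine()) != null) {
        System.out.println(line); // status line, headers, blank line, then the body
    }
}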
I am trying to send a request to a server using GET that will respond with XML. I am told that I need to set the "Accept" property, code follows:
StringBuffer url = new StringBuffer(BASE_URL);
url.append(DRS_SERVICE_RELATIVE_URL);
url.append("?").append(DOC_PARAM_NAME).append("=").append(docId);
url.append("&").append(DOB_PARAM_NAME).append("=").append(dob);
try
{
    this.server = new URL(url.toString());
    HttpURLConnection con = (HttpURLConnection) this.server.openConnection();
    // Ask the server for XML (each media type only needs to be listed once)
    con.addRequestProperty("Accept", "text/xml, application/xml, application/*+xml");
    con.connect();
    input = new BufferedReader(new InputStreamReader(con.getInputStream()));
    String line = null;
    while((line = input.readLine()) != null)
        System.out.println(line);
}
catch (IOException e)
{
    e.printStackTrace();
}
I get response code 500. When I talk to the developers of the URL I am trying to access, they say I am not setting the "Accept" property to XML. What am I doing wrong? How are you supposed to set that property?
EDIT:
OK, this is embarrassing. The problem had to do with my development environment, specifically the way I set up a TCP/IP monitoring tool. When I stopped monitoring the network messages it worked as expected.
I am trying to use core Java to read HTTP request data from an inputstream, using the following code:
BufferedReader in = new BufferedReader(new InputStreamReader(clientSocket.getInputStream()));
String inputLine;
while ((inputLine = in.readLine()) != null)
System.out.println(inputLine);
in.close();
I receive the header fine, but then the client just hangs forever because the server never finds "EOF" of the request. How do I handle this? I've seen this question asked quite a bit, and most solutions involve something like the above; however, it's not working for me. I've tried using both curl and a web browser as the client, just sending a GET request.
Thanks for any ideas.
The header section of an HTTP request ends with a blank line (optionally followed by request data such as form data or a file upload), not an EOF. You want something like this:
BufferedReader in = new BufferedReader(new InputStreamReader(clientSocket.getInputStream()));
String inputLine;
while (!(inputLine = in.readLine()).equals(""))
System.out.println(inputLine);
in.close();
In addition to the answer above (as I am not able to post comments yet), I'd like to add that some browsers like Opera (I guess it was what did it, or it was my SSL setup, I don't know) send an EOF. Even if that is not the case, you will want to guard against it so your server doesn't crash with a NullPointerException.
To avoid that, just add the null test to your condition, like this:
while ((inputLine = in.readLine()) != null && !inputLine.equals(""))
    System.out.println(inputLine);
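Putting both answers together, here is a sketch that reads the headers and then any body advertised by Content-Length (a simplification: Content-Length counts bytes, so reading chars like this is only safe for ASCII bodies, and chunked encoding is ignored):

BufferedReader in = new BufferedReader(
        new InputStreamReader(clientSocket.getInputStream()));
int contentLength = 0;
String line;
// The header section ends at the first blank line (or EOF, for safety)
while ((line = in.readLine()) != null && !line.equals("")) {
    System.out.println(line);
    if (line.toLowerCase().startsWith("content-length:")) {
        contentLength = Integer.parseInt(line.substring(15).trim());
    }
}
// Read exactly the advertised body, e.g. POST form data
char[] body = new char[contentLength];
int read = 0;
while (read < contentLength) {
    int n = in.read(body, read, contentLength - read);
    if (n == -1) break; // client closed early
    read += n;
}
System.out.println(new String(body, 0, read));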
Usually I use this code to download a webpage source:
URL myURL = new URL("http://mysite.com/index.html");
StringBuffer all = new StringBuffer("");
URLConnection ucon = myURL.openConnection();
InputStream is = ucon.getInputStream();
BufferedReader page = new BufferedReader(new InputStreamReader(is, "ISO-8859-15"));
String linea;
while ((linea = page.readLine()) != null) {
    all.append(linea.trim());
}
It works fine over a Wi-Fi connection and downloads strings like <!-- it's a comment -->, but when I tried a mobile connection on my phone it doesn't download the comments. Is there a method to include the comments when downloading the web page source?
Thanks for any reply ;)
It is possible that your service provider is compressing or rewriting the pages on their side to reduce the data sent. I've not heard of this being done for HTML, but it is frequently done for JPG, so it's easy to imagine that's what's happening. Such rewriting would be very likely to remove comments.
HTTP does have a Cache-Control: no-transform directive meant to tell intermediaries "never modify this", although not every carrier proxy honors it, so you may still be out of luck.
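If you want to try it, here is a one-line addition to the URLConnection code above (with no guarantee that the carrier's proxy respects it):

URLConnection ucon = myURL.openConnection();
// Ask intermediaries not to recompress or rewrite the response
ucon.setRequestProperty("Cache-Control", "no-transform");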
I'm downloading a web page then extracting some data out of it, using regex (don't yell at me, I know a proper parser would be better, but this is a very simple machine generated page). This works fine in the emulator, and on my phone when connected by wi-fi, but not on 3G - the string returned is not the same, and I don't get a match. I can imagine it has something to do with packet size or latency, but I can't figure it out.
My code:
public static String getPage(URL url) throws IOException {
final URLConnection connection = url.openConnection(); // note: a second connection, used only for getContentType() below
HttpGet httpRequest = null;
try {
httpRequest = new HttpGet(url.toURI());
} catch (URISyntaxException e) {
e.printStackTrace();
}
HttpClient httpclient = new DefaultHttpClient();
HttpResponse response = (HttpResponse) httpclient.execute(httpRequest);
HttpEntity entity = response.getEntity();
BufferedHttpEntity bufHttpEntity = new BufferedHttpEntity(entity);
InputStream stream = bufHttpEntity.getContent();
String ct = connection.getContentType(); // this fires a separate request via URLConnection, not the HttpClient response
final BufferedReader reader;
if (ct.indexOf("charset=") != -1) {
ct = ct.substring(ct.indexOf("charset=") + 8);
reader = new BufferedReader(new InputStreamReader(stream, ct));
}else {
reader = new BufferedReader(new InputStreamReader(stream));
}
final StringBuilder sb = new StringBuilder();
String line;
while ((line = reader.readLine()) != null) {
sb.append(line);
}
stream.close();
return sb.toString();
}
Is it my poor connection causing this, or is there a bug in there? Either way, how do I solve it?
Update:
The file downloaded over 3G is 201 bytes smaller than the one over wi-fi. While they are obviously both downloading the correct page, the 3G one is missing a whole bunch of whitespace, and also some HTML comments that are present in the original page, which I find a little strange. Does Android fetch pages differently on 3G so as to reduce file size?
The User Agent (UA) shouldn't change depending on whether you access the web page over 3G or Wi-Fi.
As mentioned before, get rid of URLConnection: your code is already complete using the HttpClient approach, and you can set the UA with:
httpclient.getParams().setParameter(CoreProtocolPNames.USER_AGENT, userAgent);
One last thing, and it might be silly, but maybe the web page is dynamic? Is that possible?
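For reference, here is a minimal HttpClient-only sketch of getPage() along those lines, assuming the usual org.apache.http.* imports and a placeholder UA string; it uses the same pre-4.3 DefaultHttpClient API as the question:

public static String getPage(URL url) throws IOException, URISyntaxException {
    DefaultHttpClient httpclient = new DefaultHttpClient();
    // Pin the User-Agent so 3G and Wi-Fi requests look identical to the server
    httpclient.getParams().setParameter(CoreProtocolPNames.USER_AGENT, "my-app/1.0");
    HttpResponse response = httpclient.execute(new HttpGet(url.toURI()));
    // EntityUtils.toString() honors the charset declared by this response,
    // instead of asking a second URLConnection for the content type
    return EntityUtils.toString(response.getEntity());
}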
Here are some hints, some of them silly, but just in case:
Review your mobile connection: try to open the web browser, surf the web, and make sure it actually works.
I don't know which web page you are trying to access, but take into account that depending on your phone's User Agent (UA), the rendered content might be different (web pages specially designed for mobile phones), or even no content rendered at all. Is it a web page of your own?
Try to access that same web page from Firefox, changing the UA (use the User Agent Switcher extension for Firefox), and review the code returned.
That will be a good starting point to figure out what your problem is.
Ger
You may want to check if your provider has a transparent proxy in place with 3G.
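One rough way to check, sketched below, is to compare the response headers you get over Wi-Fi and over 3G; transparent proxies often add or change headers such as Via or Warning (though a fully transparent one may not announce itself):

HttpURLConnection con = (HttpURLConnection) new URL("http://example.com/").openConnection();
con.connect();
// A proxy in the path frequently stamps the response with a Via header
System.out.println("Via: " + con.getHeaderField("Via"));
System.out.println("Warning: " + con.getHeaderField("Warning"));
System.out.println("Content-Length: " + con.getContentLength());
con.disconnect();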