How to parse a txt web service - java

I'm trying to obtain a result from a web service in a Java program. I've done XML services before, but this one is text-based and I can't figure out how to record the response.
Here is the web service: http://ws.geonames.org/countryCode?lat=47.03&lng=10.2
Thanks!

If it's only text and it doesn't use any standard format (like SOAP), you can read the raw response yourself:
URL myURL = new URL("http://ws.geonames.org/countryCode?lat=47.03&lng=10.2");
URLConnection serviceConnection = myURL.openConnection();
BufferedReader in = new BufferedReader(new InputStreamReader(
        serviceConnection.getInputStream()));
List<String> response = new ArrayList<String>();
Use this if the response has many lines:
String inputLine;
while ((inputLine = in.readLine()) != null)
    response.add(inputLine);
Or use this if it has only one line (like the web service in your question):
String countryCode = in.readLine();
And finish with:
in.close();
In my case, countryCode was "AT".
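For reference, a complete, runnable version of the snippet above (the class name is just illustrative):
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;
import java.net.URLConnection;

public class CountryCodeClient {
    public static void main(String[] args) throws IOException {
        URL myURL = new URL("http://ws.geonames.org/countryCode?lat=47.03&lng=10.2");
        URLConnection serviceConnection = myURL.openConnection();
        BufferedReader in = new BufferedReader(new InputStreamReader(
                serviceConnection.getInputStream()));
        try {
            String countryCode = in.readLine(); // the service returns a single line, e.g. "AT"
            System.out.println(countryCode);
        } finally {
            in.close();
        }
    }
}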

Here is an alternative implementation; in addition to plain URL, you could also use an HttpClient (see the sketch after the code below).

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import java.net.URL;
import java.nio.charset.Charset;

public class CountryCodeReader {
    private String readAll(Reader rd) throws IOException {
        StringBuilder sb = new StringBuilder();
        int cp;
        while ((cp = rd.read()) != -1) {
            sb.append((char) cp);
        }
        return sb.toString();
    }

    public String readFromUrl(String url) throws IOException {
        InputStream is = new URL(url).openStream();
        try {
            InputStreamReader isr = new InputStreamReader(is, Charset.forName("UTF-8"));
            BufferedReader rd = new BufferedReader(isr);
            return readAll(rd);
        } finally {
            is.close();
        }
    }

    public static void main(String[] argv) throws IOException {
        CountryCodeReader ccr = new CountryCodeReader();
        String cc = ccr.readFromUrl("http://ws.geonames.org/countryCode?lat=47.03&lng=10.2");
        System.out.println(cc);
    }
}
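As mentioned above, an HttpClient is another option. Here is a minimal sketch using the JDK 11+ java.net.http.HttpClient (the class name is illustrative):
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class CountryCodeHttpClient {
    public static void main(String[] args) throws IOException, InterruptedException {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder(
                URI.create("http://ws.geonames.org/countryCode?lat=47.03&lng=10.2")).build();
        // Send the GET request and read the whole body as a String
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body().trim()); // e.g. "AT"
    }
}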

Related

How do I print request and response from static xml file

I have a text file called "ecareframework.xml". It contains two requests and responses. From this file, I have to print the request and response separately using a String.
public class RequestTime {
    public static void main(String[] args) throws IOException {
        File xmlFile = new File("C:\\Users\\gsanaulla\\Documents\\My Received Files\\ecarewsframework.xml");
        Reader fileReader = new FileReader(xmlFile);
        BufferedReader bufReader = new BufferedReader(fileReader);
        StringBuffer myStringBuffer = new StringBuffer();
        String line;
        while ((line = bufReader.readLine()) != null) {
            if (String.valueOf(line).contains("</S:Envelope>")) {
                myStringBuffer.append(String.valueOf(line));
            }
            //fileReader.close();
            System.out.println(myStringBuffer.toString());
        }
        bufReader.close();
    }
}

Android HttpURLConnection Error

I have an issue in my Android project. I want to call a PHP page with the GET method using HttpURLConnection, but when I run the application it gives the error "app has stopped".
This is my code:
public void someFunction() throws IOException {
    String inputLine = "";
    URL url = new URL("http://www.domain.com/demo.php?test=abc");
    HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
    InputStream in = new BufferedInputStream(urlConnection.getInputStream());
    try {
        BufferedReader reader = new BufferedReader(new InputStreamReader(in));
        StringBuilder result = new StringBuilder();
        String line;
        while ((line = reader.readLine()) != null) {
            result.append(line);
        }
    } finally {
        in.close();
    }
}
What should I do?
Thanks.
I have solved this issue with the following code:
StrictMode.ThreadPolicy policy = new StrictMode.ThreadPolicy.Builder().permitAll().build();
StrictMode.setThreadPolicy(policy);
Android is built to prevent developers from making network requests on the main (UI) thread. Changing the thread policy is not the solution; you should use an AsyncTask to make all network requests.
public void someFunction() {
    AsyncTask<Void, Void, StringBuilder> task = new AsyncTask<Void, Void, StringBuilder>() {
        @Override
        protected StringBuilder doInBackground(Void... params) {
            StringBuilder result = new StringBuilder();
            try {
                URL url = new URL("http://www.domain.com/demo.php?test=abc");
                HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
                InputStream in = new BufferedInputStream(urlConnection.getInputStream());
                try {
                    BufferedReader reader = new BufferedReader(new InputStreamReader(in));
                    String line;
                    while ((line = reader.readLine()) != null) {
                        result.append(line);
                    }
                } finally {
                    in.close();
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
            return result;
        }

        @Override
        protected void onPostExecute(StringBuilder result) {
            super.onPostExecute(result);
            // use the result on the UI thread here
        }
    };
    task.execute();
}
See the Android documentation: http://developer.android.com/reference/android/os/AsyncTask.html
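Note that AsyncTask has since been deprecated on newer Android releases. A rough sketch of the same request on a plain background thread, posting the result back to the UI thread (the method name and error handling are illustrative):
// Sketch only: assumes the usual java.io / java.net imports plus
// android.os.Handler, android.os.Looper and java.util.concurrent.Executors.
private void fetchInBackground() {
    Handler mainHandler = new Handler(Looper.getMainLooper());
    Executors.newSingleThreadExecutor().execute(() -> {
        StringBuilder result = new StringBuilder();
        try {
            URL url = new URL("http://www.domain.com/demo.php?test=abc");
            HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
            try (BufferedReader reader = new BufferedReader(
                    new InputStreamReader(urlConnection.getInputStream()))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    result.append(line);
                }
            } finally {
                urlConnection.disconnect();
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        // hand the result back to the main thread
        mainHandler.post(() -> {
            // use result on the UI thread here
        });
    });
}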

Jersey request.getInputStream()

@POST
@Path("/getphotos")
@Produces(MediaType.TEXT_HTML)
public String getPhotos() throws IOException {
    BufferedReader rd = new BufferedReader(new InputStreamReader(request.getInputStream(), "UTF-8"));
    String line;
    while ((line = rd.readLine()) != null) {
        System.out.println(line);
    }
    return "ok";
}
The code above is for my server, but the String "line" never has a value.
Is there any problem with the code?
Client-side code:
String message = "message";
URL url = new URL(targetURL);
HttpURLConnection urlConnection = (HttpURLConnection) url.openConnection();
urlConnection.setDoInput(true);
urlConnection.setDoOutput(true);
OutputStreamWriter wr = new OutputStreamWriter(urlConnection.getOutputStream());
wr.write(message);
You can manually consume a request's data in Jersey, as long as you have a valid handle to the actual HttpServletRequest. On a slight side note, keep in mind that you can only consume the request body once:
@Context
private HttpServletRequest request;

@POST
@Path("/")
public Response consumeRequest() {
    try {
        final BufferedReader rd = new BufferedReader(new InputStreamReader(
                request.getInputStream(), "UTF-8"));
        String line = null;
        final StringBuffer buffer = new StringBuffer(2048);
        while ((line = rd.readLine()) != null) {
            buffer.append(line);
        }
        final String data = buffer.toString();
        return Response.ok().entity(data).build();
    } catch (final Exception e) {
        return Response.status(Status.BAD_REQUEST)
                .entity("No data supplied").build();
    }
}
Side note: libraries like Apache Commons IO provide robust helpers for reading IO data.
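For example, with Commons IO on the classpath the whole request body can be read in one call; a minimal sketch using the injected request from the answer above:
import java.nio.charset.StandardCharsets;
import org.apache.commons.io.IOUtils;

// inside the resource method, using the injected HttpServletRequest:
String data = IOUtils.toString(request.getInputStream(), StandardCharsets.UTF_8);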

Reading from a URL Connection Java

I'm trying to read HTML code from a URL connection. In one case, the HTML file I'm trying to read includes 5 line breaks before the actual doctype declaration, and the input reader throws an EOF exception.
URL pageUrl =
new URL(
"http://www.nytimes.com/2011/03/15/sports/basketball/15nbaround.html"
);
URLConnection getConn = pageUrl.openConnection();
getConn.connect();
DataInputStream dis = new DataInputStream(getConn.getInputStream());
//some read method here
Has anyone run into a problem like this?
URL pageUrl = new URL("http://www.nytimes.com/2011/03/15/sports/basketball/15nbaround.html");
URLConnection getConn = pageUrl.openConnection();
getConn.connect();
DataInputStream dis = new DataInputStream(getConn.getInputStream());
String urlData = "";
while ((urlData = dis.readUTF()) != null)
System.out.println(urlData);
//exception thrown
java.io.EOFException
at java.io.DataInputStream.readUnsignedShort(DataInputStream.java:323)
at java.io.DataInputStream.readUTF(DataInputStream.java:572)
at java.io.DataInputStream.readUTF(DataInputStream.java:547)
In the case of BufferedReader, it just returns null and doesn't continue:
pageUrl = new URL("http://www.nytimes.com/2011/03/15/sports/basketball/15nbaround.html");
URLConnection getConn = pageUrl.openConnection();
getConn.connect();
BufferedReader br = new BufferedReader(new InputStreamReader(getConn.getInputStream()));
String urlData = "";
while(true)
urlData = br.readLine();
System.out.println(urlData);
outputs null
You're using DataInputStream to read data that wasn't encoded using DataOutputStream. Examine the documented behavior of your call to DataInputStream#readUTF(): it first reads two bytes to form a 16-bit integer, indicating the number of bytes that follow comprising the UTF-encoded string. The data you're reading from the HTTP server is not encoded in this format.
Instead, the HTTP server is sending headers encoded in ASCII, per RFC 2616 sections 6.1 and 2.2. You need to read the headers as text, and then determine how the message body (the "entity") is encoded.
This works fine:
package url;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.Reader;
import java.net.URL;
/**
* UrlReader
* @author Michael
* @since 3/20/11
*/
public class UrlReader
{
public static void main(String[] args)
{
UrlReader urlReader = new UrlReader();
for (String url : args)
{
try
{
String contents = urlReader.readContents(url);
System.out.printf("url: %s contents: %s\n", url, contents);
}
catch (Exception e)
{
e.printStackTrace();
}
}
}
public String readContents(String address) throws IOException
{
StringBuilder contents = new StringBuilder(2048);
BufferedReader br = null;
try
{
URL url = new URL(address);
br = new BufferedReader(new InputStreamReader(url.openStream()));
String line;
while ((line = br.readLine()) != null)
{
contents.append(line);
}
}
finally
{
close(br);
}
return contents.toString();
}
private static void close(Reader br)
{
try
{
if (br != null)
{
br.close();
}
}
catch (Exception e)
{
e.printStackTrace();
}
}
}
This:
public class Main {
public static void main(String[] args)
throws MalformedURLException, IOException
{
URL pageUrl = new URL("http://www.google.com");
URLConnection getConn = pageUrl.openConnection();
getConn.connect();
BufferedReader dis = new BufferedReader(
new InputStreamReader(
getConn.getInputStream()));
String myString;
while ((myString = dis.readLine()) != null)
{
System.out.println(myString);
}
}
}
Works perfectly. The URL you are supplying, however, returns nothing.

Read url to string in few lines of java code

I'm trying to find Java's equivalent to Groovy's:
String content = "http://www.google.com".toURL().getText();
I want to read content from a URL into a String. I don't want to pollute my code with buffered streams and loops for such a simple task. I looked into Apache's HttpClient, but I also don't see a one- or two-line implementation.
Now that some time has passed since the original answer was accepted, there's a better approach:
String out = new Scanner(new URL("http://www.google.com").openStream(), "UTF-8").useDelimiter("\\A").next();
The "\\A" delimiter matches the beginning of the input, so the Scanner returns the entire stream as a single token. If you want a slightly fuller implementation, which is not a single line, do this:
public static String readStringFromURL(String requestURL) throws IOException
{
try (Scanner scanner = new Scanner(new URL(requestURL).openStream(),
StandardCharsets.UTF_8.toString()))
{
scanner.useDelimiter("\\A");
return scanner.hasNext() ? scanner.next() : "";
}
}
This answer refers to an older version of Java. You may want to look at ccleve's answer.
Here is the traditional way to do this:
import java.net.*;
import java.io.*;
public class URLConnectionReader {
public static String getText(String url) throws Exception {
URL website = new URL(url);
URLConnection connection = website.openConnection();
BufferedReader in = new BufferedReader(
new InputStreamReader(
connection.getInputStream()));
StringBuilder response = new StringBuilder();
String inputLine;
while ((inputLine = in.readLine()) != null)
response.append(inputLine);
in.close();
return response.toString();
}
public static void main(String[] args) throws Exception {
String content = URLConnectionReader.getText(args[0]);
System.out.println(content);
}
}
As @extraneon has suggested, IOUtils allows you to do this in a very elegant way that's still in the Java spirit:
InputStream in = new URL( "http://jakarta.apache.org" ).openStream();
try {
System.out.println( IOUtils.toString( in ) );
} finally {
IOUtils.closeQuietly(in);
}
Or just use Apache Commons IOUtils.toString(URL url), or the variant that also accepts an encoding parameter.
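For example, a one-liner assuming Commons IO 2.x is on the classpath:
import java.net.URL;
import java.nio.charset.StandardCharsets;
import org.apache.commons.io.IOUtils;

// reads the whole resource into a String using the given charset
String content = IOUtils.toString(new URL("http://jakarta.apache.org"), StandardCharsets.UTF_8);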
There's an even better way as of Java 9:
URL u = new URL("http://www.example.com/");
try (InputStream in = u.openStream()) {
return new String(in.readAllBytes(), StandardCharsets.UTF_8);
}
Like the original Groovy example, this assumes that the content is UTF-8 encoded. (If you need something more clever than that, you need to create a URLConnection and use it to figure out the encoding.)
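A rough sketch of that approach, pulling the charset out of the Content-Type header and falling back to UTF-8 when none is declared (Java 9+ for readAllBytes):
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;
import java.nio.charset.Charset;
import java.nio.charset.StandardCharsets;

URLConnection conn = new URL("http://www.example.com/").openConnection();
String contentType = conn.getContentType();   // e.g. "text/html; charset=ISO-8859-1"
Charset charset = StandardCharsets.UTF_8;     // fallback if no charset is declared
if (contentType != null) {
    for (String param : contentType.split(";")) {
        param = param.trim();
        if (param.toLowerCase().startsWith("charset=")) {
            charset = Charset.forName(param.substring("charset=".length()));
        }
    }
}
try (InputStream in = conn.getInputStream()) {
    String content = new String(in.readAllBytes(), charset);
    // use content here
}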
Now that more time has passed, here's a way to do it in Java 8:
URLConnection conn = url.openConnection();
try (BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
pageText = reader.lines().collect(Collectors.joining("\n"));
}
Additional example using Guava:
URL xmlData = ...
String data = Resources.toString(xmlData, Charsets.UTF_8);
Java 11+:
URI uri = URI.create("http://www.google.com");
HttpRequest request = HttpRequest.newBuilder(uri).build();
String content = HttpClient.newHttpClient().send(request, BodyHandlers.ofString()).body();
If you have the input stream (see Joe's answer), also consider IOUtils.toString(inputStream).
http://commons.apache.org/io/api-1.4/org/apache/commons/io/IOUtils.html#toString(java.io.InputStream)
The following works with Java 7/8 and secure URLs, and shows how to add a cookie to your request as well. Note that this is mostly a direct copy of the other great answer on this page, but it adds the cookie example and clarifies that it also works with secure URLs ;-)
If you need to connect to a server with an invalid or self-signed certificate, this will throw security errors unless you import the certificate. If you need that functionality, you could consider the approach detailed in this answer to this related question on StackOverflow.
Example
String result = getUrlAsString("https://www.google.com");
System.out.println(result);
outputs
<!doctype html><html itemscope="" .... etc
Code
import java.net.URL;
import java.net.URLConnection;
import java.io.BufferedReader;
import java.io.InputStreamReader;
public static String getUrlAsString(String url)
{
try
{
URL urlObj = new URL(url);
URLConnection con = urlObj.openConnection();
con.setDoInput(true); // we want to read the response
con.setRequestProperty("Cookie", "myCookie=test123");
con.connect();
BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream()));
StringBuilder response = new StringBuilder();
String inputLine;
String newLine = System.getProperty("line.separator");
while ((inputLine = in.readLine()) != null)
{
response.append(inputLine + newLine);
}
in.close();
return response.toString();
}
catch (Exception e)
{
throw new RuntimeException(e);
}
}
Here's Jeanne's lovely answer, but wrapped in a tidy function for muppets like me:
private static String getUrl(String aUrl) throws MalformedURLException, IOException
{
String urlData = "";
URL urlObj = new URL(aUrl);
URLConnection conn = urlObj.openConnection();
try (BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8)))
{
urlData = reader.lines().collect(Collectors.joining("\n"));
}
return urlData;
}
URL to String in pure Java
Example call to get the payload from an HTTP GET call:
String str = getStringFromUrl(new URL("YourUrl"));
Implementation
You can use the method described in this answer, on How to read URL to an InputStream and combine it with this answer on How to read InputStream to String.
The outcome will be something like
public String getStringFromUrl(URL url) throws IOException {
return inputStreamToString(urlToInputStream(url,null));
}
public String inputStreamToString(InputStream inputStream) throws IOException {
try(ByteArrayOutputStream result = new ByteArrayOutputStream()) {
byte[] buffer = new byte[1024];
int length;
while ((length = inputStream.read(buffer)) != -1) {
result.write(buffer, 0, length);
}
return result.toString(StandardCharsets.UTF_8.name());
}
}
private InputStream urlToInputStream(URL url, Map<String, String> args) {
HttpURLConnection con = null;
InputStream inputStream = null;
try {
con = (HttpURLConnection) url.openConnection();
con.setConnectTimeout(15000);
con.setReadTimeout(15000);
if (args != null) {
for (Entry<String, String> e : args.entrySet()) {
con.setRequestProperty(e.getKey(), e.getValue());
}
}
con.connect();
int responseCode = con.getResponseCode();
/* By default the connection will follow redirects. The following
* block is only entered if the implementation of HttpURLConnection
* does not perform the redirect. The exact behavior depends to
* the actual implementation (e.g. sun.net).
* !!! Attention: This block allows the connection to
* switch protocols (e.g. HTTP to HTTPS), which is <b>not</b>
* default behavior. See: https://stackoverflow.com/questions/1884230
* for more info!!!
*/
if (responseCode < 400 && responseCode > 299) {
String redirectUrl = con.getHeaderField("Location");
try {
URL newUrl = new URL(redirectUrl);
return urlToInputStream(newUrl, args);
} catch (MalformedURLException e) {
URL newUrl = new URL(url.getProtocol() + "://" + url.getHost() + redirectUrl);
return urlToInputStream(newUrl, args);
}
}
/*!!!!!*/
inputStream = con.getInputStream();
return inputStream;
} catch (Exception e) {
throw new RuntimeException(e);
}
}
Pros
It is pure Java
It can be easily enhanced by adding different headers as a map (instead of passing a null object, like the example above does), authentication, etc.
Handling of protocol switches is supported
