Using Perl data on a server to display on Android - Java

I am developing an Android application that calls a Perl script on a server. The Perl script emits several print statements, and I want to collect that output into a variable in my Android Java code.
I have tried this:
URL url = new URL("http://myserver.com/cgi-bin/myfile.pl?var=97320");
This sends my request to the server-side script. But how can I read the data the Perl script prints?

In your Perl service:
use strict;
use warnings;
use CGI qw(param header);
use JSON;

# fetch_return_data() stands in for however you build the response data
my $var  = param('var');
my $data = fetch_return_data($var);

print header('application/json');
print to_json($data); # or encode_json($data) for UTF-8 output
This returns the data as a JSON object. Then use one of the many JSON libraries for Java to read it, for instance http://json.org/java/:
Integer var = 97320;
InputStream inputStream = new URL("http://myserver.com/cgi-bin/myfile.pl?var=" + var).openStream();
try {
    BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(inputStream));
    // Or this if you returned UTF-8 from your service:
    // BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(inputStream, Charset.forName("UTF-8")));
    JSONObject json = new JSONObject(readAll(bufferedReader));
} catch (Exception e) {
    // Log the failure instead of swallowing it silently
    e.printStackTrace();
} finally {
    inputStream.close();
}
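The snippet assumes a readAll helper that the json.org examples leave to you; a minimal sketch of it (name and signature inferred from the call above) could be:

// Hypothetical helper assumed by the snippet above: drains the whole
// response body from the reader into a single String.
private static String readAll(Reader reader) throws IOException {
    StringBuilder sb = new StringBuilder();
    int c;
    while ((c = reader.read()) != -1) {
        sb.append((char) c);
    }
    return sb.toString();
}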

Related

Google Dataflow: how to parse big file with valid JSON array from FileIO.ReadableFile

In my pipeline, the FileIO.readMatches() transform reads a big JSON file (around 300-400 MB) containing a valid JSON array and passes a FileIO.ReadableFile object to the next transform. My task is to read each JSON object from that array, add new properties, and output it to the next transform.
At the moment my code to parse the JSON file looks like this:
// file is a FileIO.ReadableFile object
InputStream bis = new ByteArrayInputStream(file.readFullyAsBytes());
// I'm using the Gson library to parse JSON
JsonReader reader = new JsonReader(new InputStreamReader(bis, "UTF-8"));
JsonParser jsonParser = new JsonParser();
reader.beginArray();
while (reader.hasNext()) {
    JsonObject jsonObject = jsonParser.parse(reader).getAsJsonObject();
    jsonObject.addProperty("Somename", "Somedata");
    // processContext is a ProcessContext object
    processContext.output(jsonObject.toString());
}
reader.close();
In this case the whole content of the file ends up in memory, which can trigger java.lang.OutOfMemoryError. I'm searching for a solution that reads the JSON objects one by one without keeping the whole file in memory.
A possible solution is the open() method of FileIO.ReadableFile, which returns a ReadableByteChannel, but I'm not sure how to use that channel to read exactly one JSON object at a time.
Updated solution
This is my updated solution, which reads the file line by line:
ReadableByteChannel readableByteChannel = null;
InputStream inputStream = null;
BufferedReader bufferedReader = null;
try {
    // file is a FileIO.ReadableFile
    readableByteChannel = file.open();
    inputStream = Channels.newInputStream(readableByteChannel);
    bufferedReader = new BufferedReader(new InputStreamReader(inputStream, "UTF-8"));
    String line;
    while ((line = bufferedReader.readLine()) != null) {
        if (line.length() > 1) {
            // my final output should contain both filename and line
            processContext.output(fileName + line);
        }
    }
} catch (IOException ex) {
    logger.error("Exception during reading the file", ex);
} finally {
    IOUtils.closeQuietly(bufferedReader);
    IOUtils.closeQuietly(inputStream);
}
I found that this solution does not work on Dataflow running on an n1-standard-1 machine, where it throws java.lang.OutOfMemoryError: GC overhead limit exceeded, but it works correctly on an n1-standard-2 machine.
ReadableByteChannel is part of the Java NIO API, introduced back in Java 1.4. Java provides a way to wrap it in an InputStream: InputStream bis = Channels.newInputStream(file.open()); - I believe this is the only change you need to make.
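Putting that together with the Gson loop from the question, a streaming sketch might look like this (same variables as above; only one array element is materialized at a time):

// file is a FileIO.ReadableFile and processContext a ProcessContext, as in the question.
// Wrapping the channel in an InputStream lets JsonReader pull bytes on demand
// instead of loading the whole file with readFullyAsBytes().
try (JsonReader reader = new JsonReader(new InputStreamReader(
        Channels.newInputStream(file.open()), StandardCharsets.UTF_8))) {
    JsonParser jsonParser = new JsonParser();
    reader.beginArray();
    while (reader.hasNext()) {
        JsonObject jsonObject = jsonParser.parse(reader).getAsJsonObject();
        jsonObject.addProperty("Somename", "Somedata");
        processContext.output(jsonObject.toString());
    }
    reader.endArray();
}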

How to get whole data from Solr

I have to write some logic in Java that retrieves all the indexed data from Solr.
As of now I am doing it like this:
String confSolrUrl = "http://localhost/solr/master/select?q=*%3A*&wt=json&indent=true";
LOG.info(confSolrUrl);
url = new URL(confSolrUrl);
URLConnection conn = url.openConnection();
BufferedReader br = new BufferedReader(new InputStreamReader(conn.getInputStream()));
String inputLine;
// save to this filename
String fileName = "/qwertyuiop.html";
File file = new File(fileName);
if (!file.exists()) {
    file.createNewFile();
}
FileWriter fw = new FileWriter(file.getAbsoluteFile());
BufferedWriter bw = new BufferedWriter(fw);
while ((inputLine = br.readLine()) != null) {
    bw.write(inputLine);
}
bw.close();
br.close();
System.out.println("Done");
The file ends up holding the whole response, which I then parse to extract my JSON.
Is there any better way to do it than fetching the resource from the URL and parsing it by hand?
I just wrote an application to do exactly this; take a look at it on GitHub: https://github.com/freedev/solr-import-export-json
If you want to read all the data from a Solr collection, the first problem you face is pagination, and in this case we are talking about deep paging.
A direct HTTP request like yours returns a relatively small number of documents, while a Solr collection can hold millions or even billions of them.
So you should use the proper API, namely SolrJ; that is what my project does, and there is a cursor-based sketch below.
I would also suggest this reading:
https://lucidworks.com/blog/2013/12/12/coming-soon-to-solr-efficient-cursor-based-iteration-of-large-result-sets/
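For illustration, a minimal SolrJ sketch of cursor-based deep paging (the Solr URL, page size, and sort field are assumptions; adjust them to your collection):

// Cursor-based iteration never asks Solr for a deep offset, so it stays
// cheap no matter how many documents the collection holds.
HttpSolrClient client = new HttpSolrClient.Builder("http://localhost:8983/solr/master").build();
SolrQuery query = new SolrQuery("*:*");
query.setRows(500);                            // page size
query.setSort(SolrQuery.SortClause.asc("id")); // cursors require a sort on the uniqueKey field
String cursorMark = CursorMarkParams.CURSOR_MARK_START;
boolean done = false;
while (!done) {
    query.set(CursorMarkParams.CURSOR_MARK_PARAM, cursorMark);
    QueryResponse rsp = client.query(query);
    for (SolrDocument doc : rsp.getResults()) {
        // process each document, e.g. write it out as JSON
    }
    String nextCursorMark = rsp.getNextCursorMark();
    done = cursorMark.equals(nextCursorMark);  // no progress means everything has been read
    cursorMark = nextCursorMark;
}
client.close();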

Java: Reading from a URL produces gibberish

So I've been trying to read the HTML source from kickass.to (it works fine on other sites), but all I get is weird gibberish.
My code:
BufferedReader in = new BufferedReader(
        new InputStreamReader(new URL("http://kickass.to/").openStream()));
String s = "";
while ((s = in.readLine()) != null) System.out.println(s);
in.close();
Does anyone know why it does that?
Thanks!
The problem here is a server that is probably not configured correctly: it returns its response gzip-compressed even if the client does not send an Accept-Encoding: gzip header.
So what you're seeing is the compressed version of the page. To decompress it, pass the stream through a GZIPInputStream:
BufferedReader in = new BufferedReader(
        new InputStreamReader(
                new GZIPInputStream(new URL("http://kickass.to/").openStream())));
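If you would rather not hard-code the assumption that the response is always gzipped, a more defensive sketch is to sniff the gzip magic bytes before choosing a stream (this works even when the server omits the Content-Encoding header):

// gzip streams always begin with the two bytes 0x1f 0x8b; peek at them
// and push them back before deciding how to wrap the stream.
PushbackInputStream raw = new PushbackInputStream(
        new URL("http://kickass.to/").openStream(), 2);
byte[] sig = new byte[2];
int n = raw.read(sig);
if (n > 0) {
    raw.unread(sig, 0, n);
}
InputStream in = (n == 2 && sig[0] == (byte) 0x1f && sig[1] == (byte) 0x8b)
        ? new GZIPInputStream(raw)
        : raw;
BufferedReader reader = new BufferedReader(new InputStreamReader(in));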

Java: reading an HTTP request stream to process JSON produces gibberish

JSONObject obj = new JSONObject();
br = new BufferedReader(new InputStreamReader(conn.getInputStream()));
obj.put("auth key", br.readLine());
System.out.println(obj.toString());
br.close();
return "test";
The problem I am having with this code is that the JSON output is gibberish: "{"auth key":""}". At first, after reading through Google, I thought the response was being compressed with gzip, but after checking the headers with Fiddler there is no Content-Encoding.
Any thoughts on the matter would be appreciated, many thanks.
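One thing worth ruling out, a sketch rather than a confirmed diagnosis: the empty value suggests readLine() is returning a blank first line, so try draining the whole body instead of reading a single line:

// Reads the entire response body; a leading blank line or a multi-line
// body would be lost by the single readLine() call above.
StringBuilder body = new StringBuilder();
String line;
while ((line = br.readLine()) != null) {
    body.append(line);
}
obj.put("auth key", body.toString());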

How to read/fetch an XML file from a URL using Java?

I want to read an XML file from a URL and parse it. How can I do this in Java?
Reading from a URL is no different from reading any other input source. There are several different Java tools for XML parsing.
You can use XStream; it supports this.
XStream xStreamObj = new XStream(); // an XStream instance configured for your types
URL url = new URL("yoururl");
BufferedReader in = new BufferedReader(
        new InputStreamReader(url.openStream()));
Object parsed = xStreamObj.fromXML(in); // returns the parsed object
Three steps:
Get the bytes from the server.
Create a suitable XML source for it, perhaps even a Transformer.
Connect the two and get e.g. a DOM for further processing.
I use JDOM:
import org.jdom.Document;
import org.jdom.Element;
import org.jdom.input.*;
StringBuilder responseBuilder = new StringBuilder();
try {
    // Create a URLConnection object for a URL
    URL url = new URL("http://127.0.0.1");
    URLConnection conn = url.openConnection();
    HttpURLConnection httpConn = (HttpURLConnection) conn;
    BufferedReader rd = new BufferedReader(new InputStreamReader(httpConn.getInputStream()));
    String line;
    while ((line = rd.readLine()) != null) {
        responseBuilder.append(line).append('\n');
    }
} catch (Exception e) {
    System.out.println(e);
}

SAXBuilder sb = new SAXBuilder();
Document d = null;
try {
    d = sb.build(new StringReader(responseBuilder.toString()));
} catch (Exception e) {
    System.out.println(e);
}
Of course, you could skip reading the URL into a string and then wrapping the string in a reader, but I've cut and pasted this from two different places, so this was easier.
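As a side note, SAXBuilder can also read straight from the URL, so if you don't need the intermediate string the whole thing collapses to a sketch like:

// SAXBuilder.build(URL) fetches and parses in one step; the manual
// read-into-a-StringBuilder plumbing above becomes unnecessary.
SAXBuilder builder = new SAXBuilder();
Document doc = builder.build(new URL("http://127.0.0.1"));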
This is a good candidate for a streaming parser: StAX.
StAX was designed to process XML streams serially, in contrast to DOM APIs, which need the entire document model in memory at once. StAX also assumes the content is dynamic and the shape of the XML is not known in advance, and its use cases include processing pipelines as well.
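A minimal StAX sketch (the URL and the element handling are placeholders):

// Pull parsing: events are read off the stream one at a time, so the
// document is never fully materialized in memory.
XMLInputFactory factory = XMLInputFactory.newInstance();
XMLStreamReader reader = factory.createXMLStreamReader(
        new URL("http://127.0.0.1").openStream());
while (reader.hasNext()) {
    if (reader.next() == XMLStreamConstants.START_ELEMENT) {
        System.out.println("Element: " + reader.getLocalName());
    }
}
reader.close();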
