use Java client like curl with param - java

I use InfluxDB 0.9. In this version, I can write to the database like this:
curl -XPOST 'http://localhost:8086/write?db=mydb' -d 'cpu,host=server01,region=uswest value=1.0'
Now I want to convert it to Java:
URL url = new URL("http", "localhost", 8086, "/write?db=mydb");
HttpURLConnection con = (HttpURLConnection) url.openConnection();
con.setRequestMethod("POST");
con.setDoOutput(true);
OutputStream wr = con.getOutputStream();
String s = "cpu,host=server01,region=uswest value=51.0";
wr.write(s.getBytes(UTF_8));
wr.flush();
wr.close();
but it doesn't work. Is the "-d" meant to represent post parameters? How can I express that in Java?

In this example the curl flag should really be --data-binary rather than -d, since -d can alter the encoding. As long as your string is unaltered by the Java code it should be fine. Anything like URL encoding will prevent the line protocol insert from working.
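On the Java side, the equivalent of --data-binary is simply writing the raw bytes of the line protocol string yourself. A minimal sketch, reusing the con connection from the question and assuming an explicit UTF-8 charset so nothing re-encodes the string on the way out (StandardCharsets comes from java.nio.charset):
// Equivalent of curl --data-binary: send the line protocol byte-for-byte, no URL encoding.
byte[] body = "cpu,host=server01,region=uswest value=1.0".getBytes(StandardCharsets.UTF_8);
con.setRequestProperty("Content-Type", "text/plain; charset=utf-8");
con.setFixedLengthStreamingMode(body.length); // call this before the connection is opened
try (OutputStream out = con.getOutputStream()) {
    out.write(body);
}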

You need to grab the HTTP response as well. The following worked for me:
import java.net.*;
import java.io.*;
import java.util.*;

public class Client {

    private static HttpURLConnection client;

    public static void main(String[] args) {
        try {
            Random rn = new Random();
            URL input;
            while (true) {
                input = new URL("http", "localhost", 8086, "/write?db=mydb");
                client = (HttpURLConnection) input.openConnection();
                client.setRequestMethod("POST");
                client.setDoOutput(true);
                double thermals = rn.nextDouble();
                String s = "cpu_temperature value=" + thermals;
                try (OutputStreamWriter writer =
                         new OutputStreamWriter(client.getOutputStream())) {
                    writer.write(s);
                }
                BufferedReader in = new BufferedReader(
                        new InputStreamReader(client.getInputStream()));
                String decodedString;
                while ((decodedString = in.readLine()) != null) {
                    System.out.println(decodedString);
                }
                in.close();
                System.out.println(s); // for debugging
                Thread.sleep(1000); // send data every 1 second
            }
        } catch (MalformedURLException error) {
            System.out.println("Malformed URL!");
        } catch (SocketTimeoutException error) {
            System.out.println("Socket Timeout Exception!");
        } catch (IOException error) {
            System.out.println("IOException!");
            System.out.println(error);
        } catch (InterruptedException e) {
            System.out.println("InterruptedException!");
        } finally {
            if (client != null) { // Make sure the connection is not null.
                client.disconnect();
            }
        }
    }
}
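If you want to verify that the write actually succeeded rather than just echoing the (normally empty) response body, you can also check the status code right after the writer is closed; a small sketch against the same client connection (InfluxDB reports a successful write with 204 No Content):
int status = client.getResponseCode(); // 204 No Content means the point was written
if (status < 200 || status >= 300) {
    InputStream err = client.getErrorStream(); // may be null if the server sent no body
    if (err != null) {
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(err))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line); // InfluxDB's error message
            }
        }
    }
}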

Your insert query needs to be corrected. Remove the "," in "cpu,host=" and it will work.

Related

Android Retrieving JSON Object from URL

I am working on an app that makes an API call to a PHP script that echoes a JSON object. Testing the PHP file manually through a browser returns the expected information, but my app is acting as if the returned string is empty (before I even get to the point of decoding the JSON object).
Here's the snippet of my code. I've used this script multiple times in my app successfully for APIs that echo strings.
String urlParameters =
        "request=item_search&item_num=" + barcode + "&ou=" + OU + "&user_tag=" + initials
        + "&version=" + version + "&scan_point=return";
URL url = null;
try {
    if (testMode) {
        url = new URL("http://URL/api.php");
    } else {
        url = new URL("http://URL/api.php");
    }
} catch (MalformedURLException e) {
    e.printStackTrace();
}

StringBuilder output = new StringBuilder();
try {
    assert url != null;
    URLConnection conn = url.openConnection();
    conn.setDoOutput(true);

    OutputStreamWriter writer = new OutputStreamWriter(conn.getOutputStream());
    writer.write(urlParameters);
    writer.flush();
    writer.close();

    String line;
    BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));
    while ((line = reader.readLine()) != null) {
        output.append(line);
    }
    reader.close();
} catch (IOException e) {
    e.printStackTrace();
}
String outputString = output.toString();
Have you tried OkHttp?
HTTP is the way modern applications network. It’s how we exchange data & media. Doing HTTP efficiently makes your stuff load faster and saves bandwidth.
You can try the following code:
package com.squareup.okhttp.guide;

import com.squareup.okhttp.OkHttpClient;
import com.squareup.okhttp.Request;
import com.squareup.okhttp.Response;

import java.io.IOException;

public class GetExample {

    OkHttpClient client = new OkHttpClient();

    String run(String url) throws IOException {
        Request request = new Request.Builder()
                .url(url)
                .build();

        Response response = client.newCall(request).execute();
        return response.body().string();
    }

    public static void main(String[] args) throws IOException {
        GetExample example = new GetExample();
        String response = example.run("https://raw.github.com/square/okhttp/master/README.md");
        System.out.println(response);
    }
}
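Since your call is a POST with form parameters rather than a GET, the same client can also send a form body. A hedged sketch along those lines, using OkHttp 2.x's FormEncodingBuilder and the parameter names from your urlParameters string (adjust them to match your API; RequestBody and FormEncodingBuilder also come from com.squareup.okhttp, and execute() can throw IOException):
OkHttpClient client = new OkHttpClient();

// FormEncodingBuilder URL-encodes each value for you, which plain string
// concatenation does not.
RequestBody formBody = new FormEncodingBuilder()
        .add("request", "item_search")
        .add("item_num", barcode)
        .add("ou", OU)
        .add("user_tag", initials)
        .add("version", version)
        .add("scan_point", "return")
        .build();

Request request = new Request.Builder()
        .url("http://URL/api.php")
        .post(formBody)
        .build();

Response response = client.newCall(request).execute();
String json = response.body().string();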
For more you can visit:
Vogella's article
OkHttp 2.0

Can't send data with the use of POST request from Java to PHP

I am trying to send data from Java code to PHP code with a POST request. For some reason I get nothing at all on the PHP side, but my code "works" (no exception occurs). What could the problem be? Thanks in advance!
My data looks something like this:
"data=" + sb.toString()
And the code can be found here below:
public static void sendRequest(String encryptedString) {
    HttpURLConnection connection = null;
    BufferedReader br = null;
    DataOutputStream dos = null;
    try {
        connection = (HttpURLConnection) new URL("http://localhost/something/function").openConnection();
        connection.setDoOutput(true);
        connection.setRequestMethod("POST");
        connection.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        connection.setFixedLengthStreamingMode(encryptedString.length());

        dos = new DataOutputStream(connection.getOutputStream());
        dos.writeBytes(encryptedString);
        dos.flush();

        br = new BufferedReader(new InputStreamReader(connection.getInputStream()));
        String line;
        while ((line = br.readLine()) != null) {
            System.out.println(line);
        }
    } catch (IOException e) {
        System.out.println("Can't create connection...");
        e.printStackTrace();
    } finally {
        try {
            if (dos != null) dos.close();
            if (connection != null) connection.disconnect();
            if (br != null) br.close();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
The problem was that I encrypted the whole data=randomnumber string with SHA-1. The data= part should not be encrypted; it needs to be added afterwards.
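In code, that means hashing only the value and adding the data= prefix (and any URL encoding) afterwards. A rough sketch of the idea, where randomnumber stands in for whatever value you were hashing; it uses java.security.MessageDigest and java.net.URLEncoder, and the checked exceptions still need to be caught or declared:
// Hash only the payload, never the "data=" prefix.
MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
byte[] digest = sha1.digest(randomnumber.getBytes(StandardCharsets.UTF_8));

// Hex-encode the digest so it is safe to send as a form value.
StringBuilder hex = new StringBuilder();
for (byte b : digest) {
    hex.append(String.format("%02x", b));
}

// Add "data=" afterwards; encode the value because the request uses
// application/x-www-form-urlencoded.
String encryptedString = "data=" + URLEncoder.encode(hex.toString(), "UTF-8");
sendRequest(encryptedString);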

SocketException: Connection reset

I all but copied the following code from here. I get a java.net.SocketException on line 10 saying "Connection Reset".
import java.net.*;
import java.io.*;
import org.apache.commons.io.*;

public class HelloWorld {
    public static void main(String[] x) {
        try {
            URL url = new URL("http://money.cnn.com/2013/06/07/technology/security/page-zuckerberg-spying/index.html");
            URLConnection con = url.openConnection();
            InputStream in = con.getInputStream();
            String encoding = con.getContentEncoding();
            encoding = encoding == null ? "UTF-8" : encoding;
            String body = IOUtils.toString(in, encoding);
            System.out.print(body);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
I'm worried this may not be an issue with the code itself but rather some permission I need to give Java. Is there something wrong with my code, or is this an environment issue?
I used your code with a small modification because I don't have IOUtils at hand, and it works as it should. There is no need to set a user agent, and no special privileges either, as I ran it as a normal user.
try {
    URL url = new URL("http://money.cnn.com/2013/06/07/technology/security/page-zuckerberg-spying/index.html");
    URLConnection con = url.openConnection();
    InputStream in = con.getInputStream();
    BufferedReader br = new BufferedReader(new InputStreamReader(in));
    StringBuilder sb = new StringBuilder();
    String line = br.readLine();
    while (line != null) {
        sb.append(line);
        line = br.readLine();
    }
    System.out.print(sb.toString());
} catch (Exception e) {
    e.printStackTrace();
}

Java FileNotFoundException while reading conn.getInputStream()

Please can someone tell me how to resolve this problem?
Sometimes I get a FileNotFoundException and sometimes this code works fine.
Below is my code:
public String sendSMS(String data, String url1) {
    URL url;
    String status = "Something wrong";
    try {
        url = new URL(url1);
        URLConnection conn = url.openConnection();
        conn.setDoOutput(true);
        conn.setRequestProperty("User-Agent", "Mozilla/5.0 ( compatible ) ");
        conn.setRequestProperty("Accept", "*/*");

        OutputStreamWriter wr = new OutputStreamWriter(conn.getOutputStream());
        wr.write(data);
        wr.flush();

        // Get the response
        try {
            BufferedReader rd = new BufferedReader(new InputStreamReader(conn.getInputStream()));
            String s;
            while ((s = rd.readLine()) != null) {
                status = s;
            }
            rd.close();
        } catch (FileNotFoundException e) {
            e.printStackTrace();
        }
        wr.close();
    } catch (MalformedURLException e) {
        status = "MalformedURLException Exception in sendSMS";
        e.printStackTrace();
    } catch (IOException e) {
        status = "IO Exception in sendSMS";
        e.printStackTrace();
    }
    return status;
}
Rewrite it like this and let me know how you go... (note the closing of the reading and writing streams, and also the cleanup of the streams if an exception is thrown).
public String sendSMS(String data, String url1) {
    URL url;
    OutputStreamWriter wr = null;
    BufferedReader rd = null;
    String status = "Something wrong";
    try {
        url = new URL(url1);
        URLConnection conn = url.openConnection();
        conn.setDoOutput(true);
        conn.setRequestProperty("User-Agent", "Mozilla/5.0 ( compatible ) ");
        conn.setRequestProperty("Accept", "*/*");

        wr = new OutputStreamWriter(conn.getOutputStream());
        wr.write(data);
        wr.flush();
        wr.close();

        rd = new BufferedReader(new InputStreamReader(conn.getInputStream()));
        String s;
        while ((s = rd.readLine()) != null) {
            status = s;
        }
        rd.close();
    } catch (Exception e) {
        if (wr != null) try { wr.close(); } catch (Exception x) { /* cleanup */ }
        if (rd != null) try { rd.close(); } catch (Exception x) { /* cleanup */ }
        e.printStackTrace();
    }
    return status;
}
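If you are on Java 7 or later, try-with-resources does that cleanup for you, so the catch blocks no longer have to close anything by hand. A sketch of the same method under that assumption:
public String sendSMS(String data, String url1) {
    String status = "Something wrong";
    try {
        URLConnection conn = new URL(url1).openConnection();
        conn.setDoOutput(true);
        conn.setRequestProperty("User-Agent", "Mozilla/5.0 ( compatible ) ");
        conn.setRequestProperty("Accept", "*/*");

        // Both streams are closed automatically, even if an exception is thrown.
        try (OutputStreamWriter wr = new OutputStreamWriter(conn.getOutputStream())) {
            wr.write(data);
        }
        try (BufferedReader rd = new BufferedReader(new InputStreamReader(conn.getInputStream()))) {
            String s;
            while ((s = rd.readLine()) != null) {
                status = s;
            }
        }
    } catch (MalformedURLException e) {
        status = "MalformedURLException Exception in sendSMS";
        e.printStackTrace();
    } catch (IOException e) {
        status = "IO Exception in sendSMS";
        e.printStackTrace();
    }
    return status;
}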
This issue seems to be known, but for different reasons, so it's not clear why it happened.
Some threads recommend closing the OutputStreamWriter, as flushing it is not enough; therefore I would try to close it directly after flushing, since you are not using it between the flush and the close.
Other threads show that using a different connection class like HttpURLConnection avoids this problem (take a look here).
Another article suggests using the URLEncoder class's static method encode. This method takes a string and encodes it into a string that is safe to put in a URL.
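For example, if the data you post can contain spaces, '&' or '+' characters, each value should be run through URLEncoder.encode before it is appended; a small illustrative sketch, where phoneNumber and messageText are only placeholders:
// URLEncoder.encode(String, String) declares UnsupportedEncodingException,
// so it has to be caught or declared.
String data = "to=" + URLEncoder.encode(phoneNumber, "UTF-8")
        + "&msg=" + URLEncoder.encode(messageText, "UTF-8");
wr.write(data);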
Some similar questions:
URL is accessable with browser but still FileNotFoundException with URLConnection
URLConnection FileNotFoundException for non-standard HTTP port sources
URLConnection throwing FileNotFoundException
Wish you good luck.
It returns a FileNotFoundException when the server's response to the HTTP request is a 404.
Check your URL.

URLConnection is not allowing me to access data on HTTP errors (404, 500, etc.)

I am making a crawler and need to get the data from the stream regardless of whether the response is a 200 or not. curl does it, as does any standard browser.
The following will not actually get the content of the request, even though there is some; instead an exception is thrown with the HTTP error status code. I want the output regardless. Is there a way? I prefer to use this library as it actually does persistent connections, which is perfect for the type of crawling I am doing.
package test;

import java.net.*;
import java.io.*;

public class Test {

    public static void main(String[] args) {
        try {
            URL url = new URL("http://github.com/XXXXXXXXXXXXXX");
            URLConnection connection = url.openConnection();
            DataInputStream inStream = new DataInputStream(connection.getInputStream());
            String inputLine;
            while ((inputLine = inStream.readLine()) != null) {
                System.out.println(inputLine);
            }
            inStream.close();
        } catch (MalformedURLException me) {
            System.err.println("MalformedURLException: " + me);
        } catch (IOException ioe) {
            System.err.println("IOException: " + ioe);
        }
    }
}
Worked, thanks: Here is what I came up with - just as a rough proof of concept:
import java.net.*;
import java.io.*;

public class Test {

    public static void main(String[] args) {
        //InputStream error = ((HttpURLConnection) connection).getErrorStream();
        URL url = null;
        URLConnection connection = null;
        String inputLine = "";
        try {
            url = new URL("http://verelo.com/asdfrwdfgdg");
            connection = url.openConnection();
            DataInputStream inStream = new DataInputStream(connection.getInputStream());
            while ((inputLine = inStream.readLine()) != null) {
                System.out.println(inputLine);
            }
            inStream.close();
        } catch (MalformedURLException me) {
            System.err.println("MalformedURLException: " + me);
        } catch (IOException ioe) {
            System.err.println("IOException: " + ioe);
            InputStream error = ((HttpURLConnection) connection).getErrorStream();
            try {
                int data = error.read();
                while (data != -1) {
                    //do something with data...
                    //System.out.println(data);
                    inputLine = inputLine + (char) data;
                    data = error.read();
                    //inputLine = inputLine + (char)data;
                }
                error.close();
            } catch (Exception ex) {
                try {
                    if (error != null) {
                        error.close();
                    }
                } catch (Exception e) {
                }
            }
        }
        System.out.println(inputLine);
    }
}
Simple:
URLConnection connection = url.openConnection();
InputStream is = connection.getInputStream();
if (connection instanceof HttpURLConnection) {
    HttpURLConnection httpConn = (HttpURLConnection) connection;
    int statusCode = httpConn.getResponseCode();
    if (statusCode != 200 /* or statusCode >= 200 && statusCode < 300 */) {
        is = httpConn.getErrorStream();
    }
}
You can refer to Javadoc for explanation. The best way I would handle this is as follows:
URLConnection connection = url.openConnection();
InputStream is = null;
try {
    is = connection.getInputStream();
} catch (IOException ioe) {
    if (connection instanceof HttpURLConnection) {
        HttpURLConnection httpConn = (HttpURLConnection) connection;
        int statusCode = httpConn.getResponseCode();
        if (statusCode != 200) {
            is = httpConn.getErrorStream();
        }
    }
}
You need to do the following after calling openConnection:
Cast the URLConnection to HttpURLConnection.
Call getResponseCode.
If the response is a success, use getInputStream; otherwise use getErrorStream.
(The test for success should be 200 <= code < 300, because there are valid HTTP success codes other than 200; see the sketch below.)
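A compact version of those steps, reading whichever stream applies into a string, might look something like this:
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
int code = conn.getResponseCode();

// Success is the whole 2xx range, not just 200.
InputStream body = (code >= 200 && code < 300)
        ? conn.getInputStream()
        : conn.getErrorStream();

StringBuilder sb = new StringBuilder();
if (body != null) { // getErrorStream() can return null when there is no body
    try (BufferedReader reader = new BufferedReader(new InputStreamReader(body))) {
        String line;
        while ((line = reader.readLine()) != null) {
            sb.append(line).append('\n');
        }
    }
}
System.out.println(sb);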
I am making a crawler and need to get the data from the stream regardless of whether the response is a 200 or not.
Just be aware that if the code is a 4xx or 5xx, then the "data" is likely to be an error page of some kind.
The final point that should be made is that you should always respect the "robots.txt" file ... and read the Terms of Service before crawling / scraping the content of a site whose owners might care. Simply blatting off GET requests is likely to annoy site owners ... unless you've already come to some sort of "arrangement" with them.
