I have written a test case for the exception; it passes, but it does not add any code coverage.
Please help me; I have tried many approaches, but none of them resolved it.
public String checkJiraStatus(HttpURLConnection httpURLConnection) throws IOException {
    try {
        BufferedReader in = new BufferedReader(new InputStreamReader(httpURLConnection.getInputStream()));
        String inputLine;
        StringBuilder stringBuilder = new StringBuilder();
        while ((inputLine = in.readLine()) != null) {
            stringBuilder.append(inputLine);
        }
        in.close();
        JSONObject jsonObject = new JSONObject(String.valueOf(stringBuilder));
        JSONObject fields = (JSONObject) jsonObject.get("fields");
        JSONObject status = (JSONObject) fields.get("status");
        return (String) status.get("name");
    } catch (IOException | JSONException ioException) {
        throw new IOException("Problem while fetching the data" + ioException.getMessage());
    }
}
The test case passes, but it does not produce any code coverage.
@Test(expected = Exception.class)
public void testIoException() throws Exception {
    when(mockJiraFunctions.checkJiraStatus(any())).thenThrow(new IOException("Problem while fetching the data"));
    jiraFunctions.checkJiraStatus(any());
}
As I mentioned in the comments, you have to run your real method to get coverage; stubbing checkJiraStatus on a mock means the real code never executes.
You also have to create a situation in which your method actually throws the exception. In your case, you can create a mock of the HttpURLConnection class and make it throw an IOException when getInputStream() is called on it.
So your test would look like this:
@Test(expected = Exception.class)
public void test() throws Exception {
    HttpURLConnection connectionMock = mock(HttpURLConnection.class);
    when(connectionMock.getInputStream()).thenThrow(new IOException());
    jiraFunctions.checkJiraStatus(connectionMock);
}
POST Request Java
I've read this post on how to do a POST request in Java. I don't understand how I can implement this in a JSON parser. This is what I tried so far:
public class JSONParser {
    private String read(BufferedReader bufferedReader) throws IOException {
        // Creates a new StringBuilder to avoid escaping chars
        StringBuilder stringBuilder = new StringBuilder();
        // Holds the current line
        String currentLine;
        while ((currentLine = bufferedReader.readLine()) != null) {
            // Adds the current line to the stringBuilder if it is not null
            stringBuilder.append(currentLine);
        }
        // Returns the StringBuilder in String format
        return stringBuilder.toString();
    }

    public JSONObject readJsonFromUrl(String JSONurl) throws IOException, JSONException {
        InputStream url = new URL(JSONurl).openStream();
        try {
            BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(url));
            String jsonText = read(bufferedReader);
            JSONObject json = new JSONObject(jsonText);
            return json;
        } finally {
            url.close();
        }
    }

    public void printJSON() throws IOException, JSONException {
        JSONObject json = readJsonFromUrl("http://grwn.ddns.net:1337/locations");
        System.out.print(json);
        //for (Integer i = 0; i < json.getJSONArray("damage_or_theft_car").length(); i++) {
        //    System.out.println(json.getJSONArray("damage_or_theft_car")
        //        .getJSONObject(i).get("hood_id"));
        //}
    }
}
When I run this code with a link which doesn't require a POST request, it all works fine, but when I run it on a link which DOES require a POST request, I get the following error:
Exception in thread "main" java.io.FileNotFoundException: http://grwn.ddns.net:1337/locations
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1872)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1474)
at java.net.URL.openStream(URL.java:1045)
at com.company.JSONParser.readJsonFromUrl(JSONParser.java:30)
at com.company.JSONParser.printJSON(JSONParser.java:42)
at com.company.Main.main(Main.java:33)
Could someone help me out or point me in the right direction?
I think you need to specify that you want to make a POST request after you open the connection, maybe try something like this.
public JSONObject readJsonFromUrl(String JSONurl) throws IOException, JSONException {
    URL url = new URL(JSONurl);
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestMethod("POST");
    conn.setRequestProperty("Content-Type", "application/json");
    try {
        BufferedReader bufferedReader = new BufferedReader(new InputStreamReader(conn.getInputStream()));
        String jsonText = read(bufferedReader);
        JSONObject json = new JSONObject(jsonText);
        return json;
    } finally {
        conn.disconnect();
    }
}
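If the endpoint also expects a JSON request body (the payload below is just a placeholder, not taken from your API), a minimal sketch would enable output and write the body before reading the response:
conn.setDoOutput(true);
try (OutputStream os = conn.getOutputStream()) {
    // Placeholder payload; replace with whatever the endpoint actually expects.
    os.write("{\"example\":\"payload\"}".getBytes(StandardCharsets.UTF_8));
}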
Possible causes for this issue might be one of the following:
1) There is nothing to read/fetch from the url
2) Proxy settings: something might be configured on the proxy that blocks access to the above url
3) A hostname mapping is missing on your host
Some of the ways to verify this manually are as follows:
1) Try to access the url from a web browser with the proxy setting and see if you can get the desired raw JSON
2) Try to access the url from a web browser without the proxy setting and see if you can get the desired raw JSON
3) If step 1) or step 2) is successful, try the same from your Java code
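One more way to see what the server actually returned: when the response code is 4xx/5xx, HttpURLConnection throws on getInputStream() (a 404 surfaces as the FileNotFoundException you are seeing), but the error body, if any, is still readable from getErrorStream(). A small diagnostic sketch, reusing the URL from the question:
HttpURLConnection conn = (HttpURLConnection) new URL("http://grwn.ddns.net:1337/locations").openConnection();
int code = conn.getResponseCode();
InputStream body = (code >= 400) ? conn.getErrorStream() : conn.getInputStream();
// read "body" with the same read(...) helper to see the server's actual message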
I have an application in which I am trying to download an ArrayList of URLs using an AsyncTask and then show the content of each website one by one. I am quite confused; please help me. I have also tried to use a for loop around the execute method, but it throws an error.
Please let me know what I have to do, with the required code.
Thanks.
Async task
public class DownloadWeb extends AsyncTask<String, Void, String> {
    @Override
    protected String doInBackground(String... urls) {
        String result = "";
        HttpURLConnection connection;
        URL myUrl;
        try {
            myUrl = new URL(urls[0]);
            connection = (HttpURLConnection) myUrl.openConnection();
            // !!! The page will not redirect !!!
            String redirect = connection.getHeaderField("Location");
            if (redirect != null) {
                connection = (HttpURLConnection) new URL(redirect).openConnection();
            }
            InputStream stream = connection.getInputStream();
            BufferedReader reader = new BufferedReader(new InputStreamReader(stream));
            String line;
            StringBuilder stringBuilder = new StringBuilder();
            while ((line = reader.readLine()) != null) {
                stringBuilder.append(line);
            }
            result = stringBuilder.toString();
            return result;
        } catch (Exception e) {
            e.printStackTrace();
        }
        return null;
    }
}
The onCreate method
DownloadWeb task = new DownloadWeb();
try {
    String res = task.execute(arr.get(0)).get();
} catch (Exception e) {
    e.printStackTrace();
}
Try executing your AsyncTasks on an executor:
asyncTask1.executeOnExecutor(AsyncTask.SERIAL_EXECUTOR);
// asyncTask2 will only run AFTER asyncTask1 is done
asyncTask2.executeOnExecutor(AsyncTask.SERIAL_EXECUTOR);
Or you can create a custom queue and add your AsyncTasks to it.
refer https://developer.android.com/reference/android/os/AsyncTask.html#SERIAL_EXECUTOR
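Applied to the ArrayList from your question, a minimal sketch (assuming arr holds the URL strings and each result is handled in onPostExecute rather than with get(), which would block the UI thread) could look like:
for (String url : arr) {
    // Each task starts only after the previous one has finished.
    new DownloadWeb().executeOnExecutor(AsyncTask.SERIAL_EXECUTOR, url);
}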
I have a web application under test. Using Fiddler/HttpFox, I can see that on logging in to the web app, there are TWO 302 HTTP redirects before a 200 OK response is received. Is it possible to observe the two redirects using Java code?
This is what I coded:
public class HttpReq {
    HttpURLConnection con = null;
    StringBuilder str = new StringBuilder();
    BufferedReader br = null;
    URL address = null;
    String line = null;

    HttpReq() {
        try {
            address = new URL("http://walhs24002v.us.oracle.com/t1mockapp1/");
            con = (HttpURLConnection) address.openConnection();
            con.setRequestMethod("GET");
            con.setReadTimeout(60000);
            con.setConnectTimeout(60000);
            con.setDoOutput(true);
            con.setInstanceFollowRedirects(true);
            con.connect();
            InputStreamReader is = new InputStreamReader(con.getInputStream());
            br = new BufferedReader(is);
            while ((line = br.readLine()) != null) {
                str.append(line + '\n');
            }
            //System.out.println(str);
            System.out.println(con.getResponseCode());
            System.out.println(con.getResponseMessage());
        } catch (MalformedURLException m) {
            m.printStackTrace();
        } catch (IOException i) {
            i.printStackTrace();
        } finally {
            br = null;
            str = null;
            con = null;
        }
    }

    public static void main(String[] args) {
        HttpReq http = new HttpReq();
    }
}
The program gives the output:
200
OK
No surprises there. Is there a way to capture the two 302 redirects before the 200 OK is received?
With DefaultHttpClient you most certainly can:
http://hc.apache.org/httpcomponents-client-ga/tutorial/html/httpagent.html
ClientPNames.HANDLE_REDIRECTS='http.protocol.handle-redirects':
defines whether redirects should be handled automatically. This
parameter expects a value of type java.lang.Boolean. If this parameter
is not set HttpClient will handle redirects automatically.
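A minimal sketch with the (now deprecated) HttpClient 4.x API, reusing the URL from the question; turning automatic handling off makes each 302 and its Location header visible to your code:
DefaultHttpClient client = new DefaultHttpClient();
// Disable automatic redirect handling so each 302 is returned to the caller.
client.getParams().setBooleanParameter(ClientPNames.HANDLE_REDIRECTS, false);
HttpResponse response = client.execute(new HttpGet("http://walhs24002v.us.oracle.com/t1mockapp1/"));
System.out.println(response.getStatusLine().getStatusCode());       // e.g. 302
System.out.println(response.getFirstHeader("Location").getValue()); // the next hop
Plain HttpURLConnection can do something similar with con.setInstanceFollowRedirects(false) and following the Location header manually.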
I'm trying to find Java's equivalent to Groovy's:
String content = "http://www.google.com".toURL().getText();
I want to read content from a URL into a String. I don't want to pollute my code with buffered streams and loops for such a simple task. I looked into Apache's HttpClient, but I don't see a one- or two-line implementation there either.
Now that some time has passed since the original answer was accepted, there's a better approach:
String out = new Scanner(new URL("http://www.google.com").openStream(), "UTF-8").useDelimiter("\\A").next();
If you want a slightly fuller implementation, which is not a single line, do this:
public static String readStringFromURL(String requestURL) throws IOException
{
try (Scanner scanner = new Scanner(new URL(requestURL).openStream(),
StandardCharsets.UTF_8.toString()))
{
scanner.useDelimiter("\\A");
return scanner.hasNext() ? scanner.next() : "";
}
}
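Usage is then a single call (the URL here is only an example):
String content = readStringFromURL("http://www.example.com/");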
This answer refers to an older version of Java. You may want to look at ccleve's answer.
Here is the traditional way to do this:
import java.net.*;
import java.io.*;

public class URLConnectionReader {
    public static String getText(String url) throws Exception {
        URL website = new URL(url);
        URLConnection connection = website.openConnection();
        BufferedReader in = new BufferedReader(
                new InputStreamReader(
                        connection.getInputStream()));

        StringBuilder response = new StringBuilder();
        String inputLine;
        while ((inputLine = in.readLine()) != null)
            response.append(inputLine);

        in.close();
        return response.toString();
    }

    public static void main(String[] args) throws Exception {
        String content = URLConnectionReader.getText(args[0]);
        System.out.println(content);
    }
}
As @extraneon has suggested, IOUtils allows you to do this in a very eloquent way that's still in the Java spirit:
InputStream in = new URL("http://jakarta.apache.org").openStream();
try {
    System.out.println(IOUtils.toString(in));
} finally {
    IOUtils.closeQuietly(in);
}
Or just use Apache Commons IOUtils.toString(URL url), or the variant that also accepts an encoding parameter.
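For example, with Commons IO 2.x, assuming UTF-8 as the encoding and an example URL:
String content = IOUtils.toString(new URL("http://www.example.com/"), StandardCharsets.UTF_8);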
There's an even better way as of Java 9:
URL u = new URL("http://www.example.com/");
try (InputStream in = u.openStream()) {
return new String(in.readAllBytes(), StandardCharsets.UTF_8);
}
Like the original groovy example, this assumes that the content is UTF-8 encoded. (If you need something more clever than that, you need to create a URLConnection and use it to figure out the encoding.)
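A minimal sketch of that, staying with Java 9+ and assuming UTF-8 as the fallback when the Content-Type header declares no charset:
URLConnection conn = new URL("http://www.example.com/").openConnection();
String contentType = conn.getContentType(); // e.g. "text/html; charset=ISO-8859-1"
Charset charset = StandardCharsets.UTF_8;   // assumed fallback
if (contentType != null) {
    for (String param : contentType.split(";")) {
        param = param.trim();
        if (param.toLowerCase().startsWith("charset=")) {
            charset = Charset.forName(param.substring("charset=".length()));
        }
    }
}
try (InputStream in = conn.getInputStream()) {
    System.out.println(new String(in.readAllBytes(), charset));
}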
Now that more time has passed, here's a way to do it in Java 8:
URLConnection conn = url.openConnection();
try (BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
pageText = reader.lines().collect(Collectors.joining("\n"));
}
Additional example using Guava:
URL xmlData = ...
String data = Resources.toString(xmlData, Charsets.UTF_8);
Java 11+:
URI uri = URI.create("http://www.google.com");
HttpRequest request = HttpRequest.newBuilder(uri).build();
String content = HttpClient.newHttpClient().send(request, BodyHandlers.ofString()).body();
If you have the input stream (see Joe's answer), also consider IOUtils.toString(inputStream).
http://commons.apache.org/io/api-1.4/org/apache/commons/io/IOUtils.html#toString(java.io.InputStream)
The following works with Java 7/8 and secure URLs, and shows how to add a cookie to your request as well. Note that this is mostly a direct copy of this other great answer on this page, but it adds the cookie example and clarifies that it works with secure URLs as well ;-)
If you need to connect to a server with an invalid or self-signed certificate, this will throw security errors unless you import the certificate. If you need this functionality, you could consider the approach detailed in this answer to this related question on StackOverflow.
Example
String result = getUrlAsString("https://www.google.com");
System.out.println(result);
outputs
<!doctype html><html itemscope="" .... etc
Code
import java.net.URL;
import java.net.URLConnection;
import java.io.BufferedReader;
import java.io.InputStreamReader;
public static String getUrlAsString(String url) {
    try {
        URL urlObj = new URL(url);
        URLConnection con = urlObj.openConnection();

        con.setDoOutput(true); // allows sending a request body (not needed just to read the response)
        con.setRequestProperty("Cookie", "myCookie=test123");
        con.connect();

        BufferedReader in = new BufferedReader(new InputStreamReader(con.getInputStream()));
        StringBuilder response = new StringBuilder();
        String inputLine;
        String newLine = System.getProperty("line.separator");
        while ((inputLine = in.readLine()) != null) {
            response.append(inputLine + newLine);
        }
        in.close();

        return response.toString();
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
Here's Jeanne's lovely answer, but wrapped in a tidy function for muppets like me:
private static String getUrl(String aUrl) throws MalformedURLException, IOException
{
    String urlData = "";
    URL urlObj = new URL(aUrl);
    URLConnection conn = urlObj.openConnection();
    try (BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8)))
    {
        urlData = reader.lines().collect(Collectors.joining("\n"));
    }
    return urlData;
}
URL to String in pure Java
Example call to get payload from http get call
String str = getStringFromUrl("YourUrl");
Implementation
You can use the method described in this answer on how to read a URL into an InputStream and combine it with this answer on how to read an InputStream into a String.
The outcome will be something like
public String getStringFromUrl(URL url) throws IOException {
    return inputStreamToString(urlToInputStream(url, null));
}

public String inputStreamToString(InputStream inputStream) throws IOException {
    try (ByteArrayOutputStream result = new ByteArrayOutputStream()) {
        byte[] buffer = new byte[1024];
        int length;
        while ((length = inputStream.read(buffer)) != -1) {
            result.write(buffer, 0, length);
        }
        return result.toString(StandardCharsets.UTF_8.name());
    }
}

private InputStream urlToInputStream(URL url, Map<String, String> args) {
    HttpURLConnection con = null;
    InputStream inputStream = null;
    try {
        con = (HttpURLConnection) url.openConnection();
        con.setConnectTimeout(15000);
        con.setReadTimeout(15000);
        if (args != null) {
            for (Entry<String, String> e : args.entrySet()) {
                con.setRequestProperty(e.getKey(), e.getValue());
            }
        }
        con.connect();
        int responseCode = con.getResponseCode();
        /* By default the connection will follow redirects. The following
         * block is only entered if the implementation of HttpURLConnection
         * does not perform the redirect. The exact behavior depends on
         * the actual implementation (e.g. sun.net).
         * !!! Attention: This block allows the connection to
         * switch protocols (e.g. HTTP to HTTPS), which is <b>not</b>
         * default behavior. See: https://stackoverflow.com/questions/1884230
         * for more info!!!
         */
        if (responseCode < 400 && responseCode > 299) {
            String redirectUrl = con.getHeaderField("Location");
            try {
                URL newUrl = new URL(redirectUrl);
                return urlToInputStream(newUrl, args);
            } catch (MalformedURLException e) {
                URL newUrl = new URL(url.getProtocol() + "://" + url.getHost() + redirectUrl);
                return urlToInputStream(newUrl, args);
            }
        }
        /*!!!!!*/
        inputStream = con.getInputStream();
        return inputStream;
    } catch (Exception e) {
        throw new RuntimeException(e);
    }
}
Pros
It is pure Java
It can easily be enhanced by adding different headers as a map (instead of passing a null object, as the example above does), authentication, etc.; see the usage sketch below
Handling of protocol switches is supported
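A hypothetical usage sketch of that enhancement (the header names and values are placeholders, and the call assumes it is made from within the same class so the private urlToInputStream method is accessible):
Map<String, String> headers = new HashMap<>();
headers.put("Accept", "application/json");      // placeholder header
headers.put("Authorization", "Bearer <token>"); // placeholder value
String body = inputStreamToString(urlToInputStream(new URL("https://example.com/api"), headers));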