Java, Selenium: Verify URLs using HTTP GET - how long should it take?

I have a test suite made up of 10 test cases. Each test case navigates to a section of the website and checks all the URLs in that section. Each section has roughly 10 web pages, and each web page has over 100 href elements.
The total execution time is between 50 minutes and 1 hour. How long should this take? 1 hour seems a bit excessive.
Helper method:
public Map<Boolean, List<String>> getHrefResponseBoolean() {
return driver.findElements(By.xpath("//*[@href]"))
.stream()
.filter(s -> !s.getAttribute("href").endsWith("svg") && !s.getAttribute("href").endsWith("webmanifest") && !s.getAttribute("href").endsWith("ico?v=2"))
.map(ele -> ele.getAttribute("href"))
.map(String::trim)
.distinct()
.collect(Collectors.partitioningBy(link ->
HttpUtility.getResponseCode(link) == 200));
}
HttpUtility:
/**
* Hits the given URL and returns the HTTP response code
* @param link the URL to check
* @return the HTTP response code, or 0 if the request failed
*/
public static int getResponseCode(String link) {
URL url;
HttpURLConnection con = null;
Integer responsecode = 0;
try {
url = new URL(link);
con = (HttpURLConnection) url.openConnection();
responsecode = con.getResponseCode();
} catch (Exception e) {
// skip
} finally {
if (null != con)
con.disconnect();
}
return responsecode;
}

First of all, you should set a timeout on the connection. This keeps the connection from getting stuck.
public static int getResponseCode(String link) {
URL url;
HttpURLConnection con = null;
Integer responsecode = 0;
try {
url = new URL(link);
con = (HttpURLConnection) url.openConnection();
con.setConnectTimeout(1000); //set time out
responsecode = con.getResponseCode();
} catch (Exception e) {
// skip
} finally {
if (null != con)
con.disconnect();
}
return responsecode;
}
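A connect timeout alone will not cancel a response that is slow to arrive. As an additional sketch of my own (assuming the servers being checked accept HEAD requests), you could also set a read timeout and ask for the status line and headers only:

import java.net.HttpURLConnection;
import java.net.URL;

public static int getResponseCode(String link) {
    HttpURLConnection con = null;
    try {
        URL url = new URL(link);
        con = (HttpURLConnection) url.openConnection();
        con.setConnectTimeout(1000); // fail fast if the connection cannot be established
        con.setReadTimeout(3000);    // fail fast if the server is slow to answer
        con.setRequestMethod("HEAD"); // ask for the status line and headers only
        return con.getResponseCode();
    } catch (Exception e) {
        return 0; // unreachable links end up in the "false" partition
    } finally {
        if (con != null) {
            con.disconnect();
        }
    }
}

Note that some servers reject HEAD with 405; if the bad-URL partition suddenly grows after this change, fall back to GET.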
Now, I believe you should filter the WebElements through the XPath itself. I know it looks messy, but that's for compatibility with XPath 1.0.
public Map<Boolean, List<String>> getHrefResponseBoolean() {
WebDriver driver = new FirefoxDriver();
return driver.findElements(By.xpath("//*[@href and not('svg' = substring(@href, string-length(@href) - string-length('svg') + 1))" +
" and" +
" not('webmanifest' = substring(@href, string-length(@href) - string-length('webmanifest') + 1))" +
" and" +
" not('ico?v=2' = substring(@href, string-length(@href) - string-length('ico?v=2') + 1))" +
"]"))
.stream()
.map(ele -> ele.getAttribute("href").trim())
.distinct()
.collect(Collectors.partitioningBy(link -> getResponseCode(link) == 200));
}
Then we can simply access the map to list all the good and bad URLs, as shown here:
map.get(true) // will contain all the good urls
map.get(false) // will contain all the bad urls
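For example, a quick way to report the results from a test (a usage sketch based on the helper above):

Map<Boolean, List<String>> links = getHrefResponseBoolean();
links.get(false).forEach(bad -> System.out.println("Broken link: " + bad));
System.out.println(links.get(true).size() + " links returned HTTP 200");

If the sequential HTTP calls themselves are the bottleneck, switching .stream() to .parallelStream() in the helper is another option worth trying, at the cost of opening several connections at once.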

Related

How to stop a string from being cut off in Java

I have Java code that calls a REST API which returns a JWT token as the response. I send a GET request to the API and it returns the JWT token. The token is being returned fine; however, I've noticed that somehow the token is being trimmed.
I tried everything I could find online and nothing seems to be working for me. Below is my code:
try {
URL url = new URL(proxyService.getProperty("proxy.url") + "/" + sessionToken);
log.logText("Connection URL: " + url, logLevel);
String readLine = null;
HttpURLConnection conn = (HttpURLConnection)url.openConnection();
conn.setRequestMethod("GET");
conn.setRequestProperty("Accept", "application/json");
int responseCode = conn.getResponseCode();
if (responseCode == HttpURLConnection.HTTP_OK) {
InputStream in = ((URLConnection)conn).getInputStream();
int length = 0;
StringBuffer response = new StringBuffer();
byte[] data1 = new byte[1024];
while (-1 != (length = in.read(data1))) {
response.append(new String(data1, 0, length));
}
log.logText("JSON String Result: " + response.toString(), logLevel);
}
conn.disconnect();
} catch(MalformedURLException e) {
e.printStackTrace();
} catch(IOException e) {
e.printStackTrace();
}
oauthToken = oauthToken.replaceAll("^\"|\"$", "");
log.logText("OAuth2 Token: " + oauthToken, logLevel);
return oauthToken;
Regards,
Learnmore
As @markspace mentioned, please specify the data type of oauthToken (I believe it is of type String). Print the full String and its length before and after the replaceAll call and compare them; if the only difference is the surrounding quotes being removed, then the string is not being trimmed at that step.
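For example, a debugging sketch reusing the log helper from the question:

log.logText("Token length before replaceAll: " + oauthToken.length(), logLevel);
String stripped = oauthToken.replaceAll("^\"|\"$", "");
log.logText("Token length after replaceAll: " + stripped.length(), logLevel);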
You are not assigning the response value to anything. I assume you should be assigning it to the oauthToken variable.
Also, please close the InputStream instance in a finally clause; otherwise you will cause a resource leak.
I think you have to close the InputStream first, to flush the internal buffer.
public static String getOauthToken() throws IOException {
URL url = new URL(proxyService.getProperty("proxy.url") + "/" + sessionToken);
log.logText("Connection URL: " + url, logLevel);
String oauthToken = readInputString(url);
oauthToken = oauthToken.replaceAll("^\"|\"$", "");
log.logText("OAuth2 Token: " + oauthToken, logLevel);
return oauthToken;
}
private static String readInputString(URL url) throws IOException {
HttpURLConnection conn = (HttpURLConnection)url.openConnection();
conn.setRequestMethod("GET");
conn.setRequestProperty("Accept", "application/json");
if (conn.getResponseCode() != HttpURLConnection.HTTP_OK)
throw new RuntimeException("Not expected response code");
try (InputStream in = conn.getInputStream()) {
StringBuffer buf = new StringBuffer();
byte[] b = new byte[1024];
while (true) {
int readBytes = in.read(b);
if (readBytes == -1)
break;
buf.append(new String(b, 0, readBytes));
}
log.logText("JSON String Result: " + buf, logLevel);
return buf.toString();
}
}
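One more caveat, probably not the cause here since a JWT is plain ASCII: converting each byte chunk with new String(b, 0, readBytes) can split a multi-byte character across two chunks. A variation of the method above that reads characters instead of bytes (assuming a UTF-8 response; it needs java.io.BufferedReader, java.io.InputStreamReader and java.nio.charset.StandardCharsets) avoids that:

private static String readInputString(URL url) throws IOException {
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestMethod("GET");
    conn.setRequestProperty("Accept", "application/json");
    if (conn.getResponseCode() != HttpURLConnection.HTTP_OK)
        throw new RuntimeException("Not expected response code");
    // decode characters as they are read so multi-byte sequences stay intact
    try (BufferedReader reader = new BufferedReader(
            new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
        StringBuilder buf = new StringBuilder();
        char[] chunk = new char[1024];
        int read;
        while ((read = reader.read(chunk)) != -1) {
            buf.append(chunk, 0, read);
        }
        return buf.toString();
    }
}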
It looks like the actual application that I'm calling from is cutting off the response value. I shortened the JWT token and it is no longer cut off. The application must have a limit on the maximum number of characters allowed in a string, possibly for performance reasons.

Yahoo Finance URL not working (again since 05/2018)

Thanks to this valuable site, I have found useful tips since 08/2017 on retrieving cookies and crumbs from the Yahoo Finance site in order to solve my bulk quote download problem.
Nevertheless, my program (written in Java) has not worked since the end of May 2018.
I get the following error message:
CookieHandler retrieved cookie: GUCS="AX62rEgH";$Path="/";$Domain=".yahoo.com"
Added cookie using cookie handler
getContent on quote failed: java.io.IOException: Server returned HTTP response code: 401 for URL: https://query1.finance.yahoo.com/v7/finance/download/AC.PA?period1=1526594400&period2=1527631200&interval=1d&events=history&crumb=null
I think that the crumb search is failing.
FYI: I have been an "amateur" Java programmer since 2003.
Please advise if anybody knows how to solve this problem.
Thanks to Maxzoom and Dave for their prompt answers. I apologize for the lack of details in my question. For that reason I am adding the complete Java method I was using successfully until last month, based in part on the code of Serge dated Aug 27, 2017.
Since the method is too long for a comment, I am pasting it in a new answer. Here is the method:
public static String getQuote3(String quoteString, String stock) throws IOException {
int curByte;
char curChar;
String curQuote,z;
boolean priceFlag;
z="rr";
//////////////Search for cookies
try {
CookieManager manager = new CookieManager();
manager.setCookiePolicy(CookiePolicy.ACCEPT_ALL);
CookieHandler.setDefault(manager);
URL quoteURL = new URL("https://fr.finance.yahoo.com/quote"+stock+"/history?p="+stock);
URLConnection con = quoteURL.openConnection();
con.getContent();
// get cookies from underlying CookieStore
CookieStore cookieJar = manager.getCookieStore();
java.util.List <HttpCookie> cookies = cookieJar.getCookies();
for (HttpCookie cookie: cookies) {
System.out.println("CookieHandler retrieved cookie: " + cookie);
}
//now you can search for the crumb in the yahoo site:
String crumb = null;
InputStream inStream = con.getInputStream();
InputStreamReader irdr = new InputStreamReader(inStream);
BufferedReader rsv = new BufferedReader(irdr);
Pattern crumbPattern = Pattern.compile(".*\"CrumbStore\":\\{\"crumb\":\"([^\"]+)\"\\}.*");
String line = null;
while (crumb == null && (line = rsv.readLine()) != null) {
Matcher matcher = crumbPattern.matcher(line);
if (matcher.matches() && matcher.group(1).length()< 12)
crumb = matcher.group(1);
if(crumb!= null)
{
System.out.println ("crumb= " + crumb) ;
}
}
rsv.close();
String quoteUrls = quoteString + crumb;
// create cookie
HttpCookie cookie = new HttpCookie("UserName", "John Doe");
// add cookie to CookieStore for a particular URL
quoteURL = new URL(quoteUrls);
try {
cookieJar.add(quoteURL.toURI(), cookie);
System.out.println("Added cookie using cookie handler");
} catch(Exception e) {
System.out.println("Unable to set cookie using CookieHandler");
e.printStackTrace();
}
con.connect();
try {
DataInputStream quoteStream = new DataInputStream(quoteURL.openStream());
priceFlag = false;
curQuote = "";
while( (curByte = quoteStream.read()) != -1) {
curChar = (char) curByte;
curQuote += curChar;
}
System.out.println(curQuote);
priceFlag = true;
return curQuote;
} catch (IOException e) {
System.err.println("getContent on quote failed: " + e);
priceFlag = false;
}
} catch (MalformedURLException e) {
System.err.println("Yikes. URL exception");
}
return z;
}

Android - Size in chars of an http response

I am not a pro at developing for Android. I wanted to download a JSON object from my server, but the only code I could find was this:
private String downloadUrl(String myurl) throws IOException {
InputStream is = null;
// Only display the first 500 characters of the retrieved
// web page content.
int len = 500;
try {
URL url = new URL(myurl);
HttpURLConnection conn = (HttpURLConnection) url.openConnection();
conn.setReadTimeout(10000 /* milliseconds */);
conn.setConnectTimeout(15000 /* milliseconds */);
conn.setRequestMethod("GET");
conn.setDoInput(true);
// Starts the query
conn.connect();
int response = conn.getResponseCode();
Log.d("ServerConnection", "The response is: " + response);
is = conn.getInputStream();
// Convert the InputStream into a string
String contentAsString = readIt(is, len);
return contentAsString;
// Makes sure that the InputStream is closed after the app is
// finished using it.
} catch (MalformedURLException e) {
//
return "error";
} catch (IOException e) {
//
return "error";
} finally {
if (is != null) {
is.close();
}
}
}
And it works fine, though I can't say I fully understand it. But it has int len = 500, and my returned JSON is cropped to 500 chars. I tried changing it to a larger number, but then it puts spaces at the end. How can I know the size in chars of the String contained by the InputStream?
Thanks
You can check the Content-Length header value of your response (note that it is the length in bytes, not characters):
Map<String, List<String>> headers = connection.getHeaderFields();
for (Entry<String, List<String>> header : headers.entrySet()) {
if (header.getKey() != null && header.getKey().equals("Content-Length")) {
len = Integer.parseInt(header.getValue().get(0));
}
}
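A shorter equivalent, assuming the server actually sends that header, is the built-in accessor:

int len = connection.getContentLength(); // returns -1 if the Content-Length header is missing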
Or you can read your response character by character with a reader like this:
InputStream is = connection.getInputStream();
InputStreamReader reader = new InputStreamReader(is);
StringBuilder builder = new StringBuilder();
int c = 0;
while ((c = reader.read()) != -1) {
builder.append((char) c);
}
You can use Apache Commons IO's IOUtils.toString to convert the InputStream to a String, or use Gson to read the object from the input stream directly:
return gson.fromJson(new InputStreamReader(inputStream), YourType.class);
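For example, a minimal sketch of the Commons IO route (assuming Commons IO 2.x is on the classpath and the response is UTF-8):

import org.apache.commons.io.IOUtils;
import java.nio.charset.StandardCharsets;

String contentAsString = IOUtils.toString(conn.getInputStream(), StandardCharsets.UTF_8);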

URL set Connection Timeout is not working

I am using RSS feeds to get the latest news, and I get an XML response back.
The issue I am facing is that if it takes longer than 5 seconds, I just want the program to stop.
This is my code (for testing purposes I have set the time to 1 second):
public static void main(String args[]) {
Connection dbConnection = null;
PreparedStatement inserpstmt = null;
try {
final JSONArray latestnews = new JSONArray();
builder = getDocumentBuilderInstance();
final URL url = new URL("http://www.rssmix.com/u/8171434/rss.xml");
url.openConnection().setConnectTimeout(1000);
url.openConnection().setReadTimeout(1000);
final Document doc = builder.parse(url.openStream());
final NodeList items = doc.getElementsByTagName("item");
for (int i = 0; i < items.getLength(); i++) {
final Element item = (Element) items.item(i);
final String title = getValue(item, "title");
System.out.println(title);
}
} catch (Exception e) {
e.printStackTrace();
e.getMessage();
} catch (Throwable e) {
e.getMessage();
e.printStackTrace();
} finally {
}
}
Could you please let me know why this isn't being stopped and instead waits for more than 1 second?
Edited Code
StringBuffer sb = new StringBuffer("http://www.rssmix.com/u/8171434/rss.xml");
URLConnection conn = new URL(sb.toString()).openConnection();
conn.setConnectTimeout(7000);
conn.setReadTimeout(7000);
final Document doc = builder.parse(new InputSource(conn.getInputStream()));
You should probably approach this in the following fashion.
final URL url = new URL("http://www.rssmix.com/u/8171434/rss.xml");
URLConnection urlConn = url.openConnection();
urlConn.setConnectTimeout(1000);
urlConn.setReadTimeout(1000);
final Document doc = builder.parse(urlConn.getInputStream());
In the above code, each time you call openConnection you get a new connection object. Also, at the end you are using openStream, which is equivalent to openConnection().getInputStream(). So the timeouts are set on different connection objects, and no timeout is set on the connection object the input stream is actually taken from. That is why it was not working. The code below works because the timeouts are set on the same object the InputStream is retrieved from.
URLConnection connection = url.openConnection();
connection.setConnectTimeout(1000);
connection.setReadTimeout(1000);
connection.connect();
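As a usage sketch (assuming the same builder and url as in the question), a connect or read timeout then surfaces as a java.net.SocketTimeoutException that you can catch to stop the program:

try {
    URLConnection connection = url.openConnection();
    connection.setConnectTimeout(1000);
    connection.setReadTimeout(1000);
    final Document doc = builder.parse(connection.getInputStream());
    // ... work with doc ...
} catch (java.net.SocketTimeoutException e) {
    // thrown when the connect or the read exceeds the configured timeout
    System.err.println("Feed timed out: " + e.getMessage());
} catch (Exception e) {
    // any other parsing or I/O problem
    e.printStackTrace();
}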

Sugar get_entry_list returns "Invalid Session ID" when querying Meetings for a specific user

I am trying to query Sugar through its REST API using Java for entries in the Meetings module for a specific user, namely the one who is currently logged in.
I have been trying this for a few days already while googling around for a solution.
I make a login() call, which gives me a session ID, then I call get_user_id(). With the returned user ID I query the Meetings module using get_entry_list().
Getting the meetings assigned to the user ID works with the following query string, where mUserId holds the user ID returned by get_user_id():
queryString = "meetings.assigned_user_id='"+mUserId+"'";
But I not only want the meetings a user is assigned to; I want all meetings in which he participates. For that I try a subquery on the meetings_users table in my query.
Here is a query string I tried, which works at the MySQL prompt. But when I send it over REST, it returns "Invalid Session ID":
queryString = "meetings.id IN ( SELECT meetings_users.meeting_id FROM meetings_users WHERE meetings_users.user_id = '"+mUserId+"' )";
Does anyone have a hint on this? Which conditions lead to an "Invalid Session ID" at all?
Appending "and deleted = '0'" to the first query also fails:
queryString = "meetings.assigned_user_id='"+mUserId+"' and deleted = '0'";
As requested, here is the full code example; the platform is Android, API level 8:
private JSONArray getEntryList(String moduleName,
String selectFields[], String queryString, String orderBy, int max_results) throws JSONException, IOException, KeyManagementException, NoSuchAlgorithmException
{
JSONArray jsoSub = new JSONArray();
if (selectFields.length > 0)
{
for (int i = 0; i < selectFields.length; i++)
{
jsoSub.put(selectFields[i]);
}
}
// get_entry_list expects parameters to be ordered, JSONObject does
// not provide this, so I built my JSON String on my own
String sessionIDPrefix = "{\"session\":\""+ mSessionId+ "\"," +
"\"modulename\":\""+ moduleName+ "\"," +
"\"query\":\""+ queryString + "\"," +
"\"order_by\":\""+ orderBy + "\"," +
"\"offset\":\""+ mNextOffset+ "\"," +
"\"select_fields\":["+ jsoSub.toString().substring(
1, jsoSub.toString().length()-2)+ "\"],"+
"\"max_results\":\""+ 20 + "\"}";
String restData = sessionIDPrefix;
Log.d(TAG, restData);
String data = null;
String baseurl = mUrl + REST_URI_APPEND;
data = httpPost(baseurl+"?method=get_entry_list&input_type=json&response_type=json&rest_data="+restData);
Log.d(TAG, data);
JSONObject jsondata = new JSONObject(data);
mResultCount = jsondata.getInt("result_count");
mNextOffset = jsondata.getInt("next_offset");
return jsondata.getJSONArray("entry_list");
}
private String httpPost(String urlStr) throws IOException{
String urlSplitted [] = urlStr.split("/", 4);
String hostPort[] = urlSplitted[2].split(":");
String hostname = hostPort[0];
int port = 80;
if (hostPort.length > 1)
port = new Integer(hostPort[1]);
String file = "/"+urlSplitted[3];
Log.d(TAG, hostname + ", " + port + ", " +file);
URL url = null;
try {
url = new URL("http", hostname, port, file);
} catch (MalformedURLException e) {
throw new IOException(mContext.getText(R.string.error_malformed_url).toString());
}
Log.d(TAG, "URL "+url.toString());
HttpURLConnection conn = null;
try {
conn = (HttpURLConnection) url.openConnection();
} catch (IOException e) {
throw new IOException(mContext.getText(R.string.error_conn_creat).toString());
}
conn.setConnectTimeout(60 * 1000);
conn.setReadTimeout(60 * 1000);
try {
conn.setRequestMethod("POST");
} catch (ProtocolException e) {
throw new IOException(mContext.getText(R.string.error_post).toString());
}
conn.setDoOutput(true);
conn.setDoInput(true);
conn.setUseCaches(false);
conn.setAllowUserInteraction(false);
conn.setRequestProperty("Content-Type",
"application/x-www-form-urlencoded");
try {
conn.connect();
} catch (IOException e) {
throw new IOException(mContext.getText(R.string.error_conn_open).toString()
+ "\n" + e.getMessage());
}
int response = 0;
String responseMessage = null;
try {
response = conn.getResponseCode();
responseMessage = conn.getResponseMessage();
} catch (IOException e) {
conn.disconnect();
throw new IOException(mContext.getText(R.string.error_resp_io).toString());
}
Log.d(TAG, "Exception Response "+ response);
if (response != 200) {
conn.disconnect();
throw new IOException(mContext.getText(R.string.error_http).toString()
+ "\n" + response + " " + responseMessage);
}
StringBuilder sb = null;
try {
BufferedReader rd = new BufferedReader(
new InputStreamReader(conn.getInputStream()));
sb = new StringBuilder();
String line;
while ((line = rd.readLine()) != null) {
Log.d(TAG,"line " + line);
sb.append(line);
}
rd.close();
} catch (IOException e) {
conn.disconnect();
throw new IOException(mContext.getText(R.string.error_resp_read).toString());
}
conn.disconnect();
if (sb.toString() == null)
{
throw new IOException(mContext.getText(R.string.error_resp_empty).toString());
}
return sb.toString();
}
Calling the code above:
if (login() != OK)
return null;
mResultCount = -1;
mNextOffset = 0;
mUserId = getUserId();
String fields[] = new String [] {
"id",
"name",
"description",
"location",
"date_start",
"date_end",
"status",
"type",
"reminder_time",
"parent_type",
"parent_id",
"deleted",
"date_modified"
};
String queryString = null;
if (syncAllUsers)
queryString = "";
else
{
queryString = "meetings.assigned_user_id = 'seed_sarah_id' and meetings.deleted = '0'";
//queryString = "meetings.id IN ( SELECT meeting_id FROM meetings_users WHERE user_id ='"+mUserId+"'";
}
entryList.clear();
while (mResultCount != 0)
{
if (!seamless_login())
return null;
JSONArray serverEntryList = getEntryList(
"Meetings", fields, queryString, "date_start", 0);
//... do sth with data
}
totalContactsResults += mResultCount;
}
logout();
login() returns a valid session ID, and getUserId() returns the right ID. The whole code already works for fetching contacts, and also for a simple query as stated above.
Thanks in advance
Marc
After further testing, I realized that whitespace in the query string is the problem. It leads to a URL containing whitespace. To avoid that, some kind of URL encoding has to be done.
I had no success encoding the whole URL in my httpPost method (it seems not to be necessary), but replacing spaces with '+' in the query string works for me:
queryString = "meetings.id+IN+(SELECT+meetings_users.meeting_id+FROM+meetings_users+WHERE+meetings_users.user_id='"+mUserId+"')";
If anyone has a more elegant method of doing this, please let me know.
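A slightly more general sketch (my own suggestion, not tested against Sugar): URL-encode the rest_data value before appending it to the URL in getEntryList, so every special character in the query string is handled, not just spaces. URLEncoder happens to encode spaces as '+', which matches what worked above:

import java.net.URLEncoder;

// UnsupportedEncodingException is an IOException, which getEntryList already declares
String encodedRestData = URLEncoder.encode(restData, "UTF-8");
data = httpPost(baseurl + "?method=get_entry_list&input_type=json&response_type=json&rest_data=" + encodedRestData);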
You would probably be better off using the get_relationships web service call.
SugarRest.call('get_relationships', [SugarRest.session, 'Users', SugarRest.user_id, 'meetings', '', ['name'], 0, ''])
That should be all you need. In the parameter after 'meetings' you can also pass in an additional filter.
