StaleElementReferenceException in getText() [duplicate] - java

This question already has answers here:
Getting StaleElementReferenceException while trying print the link names
(3 answers)
Selenium Webdriver - Stale element exception when clicking on multiple dropdowns while HTML DOM doesn't change
(1 answer)
StaleElementReference Exception in PageFactory
(3 answers)
Closed 4 years ago.
I have tried all the available fixes, for example:
WebDriverWait wait6 = new WebDriverWait(driver, 500);
wait6.until(ExpectedConditions.presenceOfElementLocated(By.xpath("(//i[@class='material-icons'])[" + j + "]")));
I have an application where I need to click on every item and get the text of the item name, and I am getting a Stale Element Reference Exception.
I have tried different methods to resolve it, but nothing is working.
public void page(WebDriver driver, String Filtername) throws InterruptedException {
    waitForElementPresent(driver, 60, sidenavbutton);
    click(driver, sidenavbutton);
    Thread.sleep(2000);
    click(driver, viewcopyportfolio);
    Thread.sleep(1000);
    click(driver, sidenavbutton);
    waitForElementPresent(driver, 30, porfoliosheader);
    clearText(driver, pagenumtextbox);
    Thread.sleep(1000);
    setText(driver, pagenumtextbox, Filtername);
    Thread.sleep(1000);
    List<WebElement> editicons1 = driver.findElements(By.xpath("//i[@class='material-icons']"));
    for (int j = 1; j <= editicons1.size(); j++) {
        editicons1 = driver.findElements(By.xpath("//i[@class='material-icons']"));
        String porfolioName = driver.findElement(By.xpath("(//mat-table//mat-row)[" + j + "]//mat-cell[2]")).getText();
        // Added to fix the Stale Element Exception
        WebElement editicon = driver.findElement(By.xpath("(//i[@class='material-icons'])[" + j + "]"));
        // The click method (attached below) will retry up to 5 times
        click1(driver, editicon, porfolioName + " portfolio edit icon");
        Thread.sleep(1000);
        waitForElementPresent(driver, 30, buildportfolioheader);
    }
}
This is the code for the click1 method:
public void click1(WebDriver driver, WebElement element, String name) throws InterruptedException {
    int attempts = 0;
    while (attempts < 5) {
        try {
            element.click();
            Add_Log.info("Successfully clicked on " + name);
            Reporter.log("Successfully clicked on " + name);
            return;
        } catch (Exception e) {
            attempts++;
            Thread.sleep(500);
            try {
                JavascriptExecutor executor = (JavascriptExecutor) driver;
                executor.executeScript("arguments[0].click();", element);
                Add_Log.info("Successfully clicked on " + name);
                Reporter.log("Successfully clicked on " + name);
                return;
            } catch (Exception e2) {
                Add_Log.info("Not able to click " + name);
                Reporter.log("Not able to click " + name);
                TestResultStatus.Testfail = true;
                Assert.fail("Not able to click " + name);
            }
        }
    }
}

" editicons1 = driver.findElements(By.xpath("//i[#class='material-icons']"));" This line in the loop doesn't look like it's needed, you just wanted the initial count, I don't see a reason to re load the list of elements.
The problem with this wait logic is that if the element already exists it will just sleep a second, see that the element is there and then continue, and from what I've seen, the next page could then start loading and then your script will be in a world of hurt.
Thread.sleep(1000);
waitForElementPresent(driver, 30, buildportfolioheader);
If the element isn't already on the page, I would swap the explicit wait to come first. The reason is that the presence of an element doesn't really mean a whole lot on its own; the page could still be in motion, so a short sleep after the explicit wait (assuming this is one of the last elements to appear on the page) usually stabilizes flaky scripts.
waitForElementPresent(driver, 30, buildportfolioheader);
Thread.sleep(1000);
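Putting that together, here is a minimal sketch of how the loop could look with those changes (re-locating the elements by index on every pass, explicit wait first, short sleep after). It assumes the same helper methods and locators as in the question, so treat it as a sketch rather than a drop-in replacement:
List<WebElement> editicons1 = driver.findElements(By.xpath("//i[@class='material-icons']"));
int count = editicons1.size(); // only the initial count is needed
for (int j = 1; j <= count; j++) {
    // Re-find both elements for this iteration so the references are never stale
    String porfolioName = driver.findElement(By.xpath("(//mat-table//mat-row)[" + j + "]//mat-cell[2]")).getText();
    WebElement editicon = driver.findElement(By.xpath("(//i[@class='material-icons'])[" + j + "]"));
    click1(driver, editicon, porfolioName + " portfolio edit icon");
    // Explicit wait first, then a short pause to let the page settle
    waitForElementPresent(driver, 30, buildportfolioheader);
    Thread.sleep(1000);
}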

Related

Unable to select/click second and last element on mouse hover in selenium

Below is the screenshot of the mouse hover event:
[Screenshot: Style type]
It is triggered only on mouse hover. Currently, I'm able to read the Style Type text. I can select the first element without any issues; however, I'm unable to click the second and last elements (Categorized and Graduated).
UI code for reference:
[Screenshot: UI code]
Below is the Selenium code for reference.
static void styletype() throws InterruptedException {
    Thread.sleep(1000);
    String sname = null;
    // select style type
    Actions action = new Actions(driver);
    WebElement menu = driver.findElement(By.xpath("//*[@src = 'assets/images/down_arrow.svg']"));
    action.moveToElement(menu).perform();
    Thread.sleep(1500);
    List<WebElement> rowsList = driver.findElements(By.xpath("//*[contains(@class,'dropdown-item')]"));
    for (WebElement element : rowsList) {
        try {
            sname = element.getText();
            // System.out.println("File Type is : " + sname);
            int result = JOptionPane.showConfirmDialog(frame, "Layer Stype Type : '" + sname + "'",
                    "Do you want to continue", JOptionPane.YES_NO_OPTION, JOptionPane.QUESTION_MESSAGE);
            if (result == 0) {
                element.click();
                break;
            }
            action.moveToElement(menu).perform();
            Thread.sleep(1000);
        } catch (Exception e) {
        }
    }
}

How to find web element location for dynamic drop down list element?

I am unable to locate a dynamic web element for a drop-down list.
public void selectClassofService(String value) throws InterruptedException {
    driver.findElement(By.xpath("/html/body/div[1]/div/div/div/div[3]/div[2]/div[2]/div/div/div/div[2]/div[2]/div[2]/form/div[5]/div/div[2]/div/div/div[1]/input")).click();
    List<WebElement> list = driver.findElements(By.xpath("/html/body/div[1]/div/div/div/div[3]/div[2]/div[2]/div/div/div/div[2]/div[2]/div[2]/form/div[5]/div/div[2]/div/div/div[1]/input"));
    System.out.println("Size of the list size =" + list.size());
    for (int i = 0; i < list.size(); i++) {
        System.out.println("names of the divisions " + list.get(i).getText());
        if (list.get(i).getText().contains(value)) {
            list.get(i).click();
            break;
        }
    }
}
This is my sample HTML code:
<ul class="vs_dropdown-menu" role="listbox">
    <li class="vs_dropdown-option" role="option">sample pack</li>
    <li class="vs_dropdown-option vs_dropdown-option--highlight" role="option">sample pack</li>
    <li class="vs_dropdown-option" role="option">sample pack2</li>
</ul>
First of all, stop writing such long XPaths; there are better ways to access an element than doing that.
String dropdownXpath = "//input[@class='vs_dropdown-menu' and @role='listbox']";
new WebDriverWait(driver, 20).until(ExpectedConditions.elementToBeClickable(By.xpath(dropdownXpath))).click();
List<WebElement> myList = new WebDriverWait(driver, 20)
        .until(ExpectedConditions.visibilityOfAllElementsLocatedBy(By.xpath(dropdownXpath + "//li[@class='vs_dropdown-option' and @role='option']")));
for (WebElement element : myList) {
    System.out.println("names of the divisions " + element.getText());
    if (element.getText().contains("Mumbai")) {
        element.click();
        break;
    }
}
If it's not possible to access the dropdown like the above, just get the parent div's id, like this:
String dropdownXpath = "//div[@id='some-id']//input[@class='vs_dropdown-menu' and @role='listbox']";
Don't waste your time crawling HTML tags to find an element.

Stale Object Reference while Navigation using Selenium

I have been trying a simple program that navigates to a new page, fetches data from it, goes back in history, opens another page, fetches data, and so on until all the links have been visited and the data has been fetched.
After getting results on the site below, I am trying to loop through all the links in the first column, open those links one by one, and extract text from each of those pages. But the program below only visits the first link and gives a StaleElementReferenceException. I have tried using Actions but it didn't work, and I am not familiar with JavascriptExecutor. I also tried solutions posted on other SO questions, one of which was mine over here. I would like to have the mistake corrected in the code below, and a working version.
public class Selenium {

    private final static String CHROME_DRIVER = "C:\\Selenium\\chromedriver\\chromedriver.exe";
    private static WebDriver driver = null;
    private static WebDriverWait wait = null;

    private void setConnection() {
        try {
            System.setProperty("webdriver.chrome.driver", CHROME_DRIVER);
            driver = ChromeDriver.class.newInstance();
            wait = new WebDriverWait(driver, 5);
            driver.get("https://sanctionssearch.ofac.treas.gov");
            this.search();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private void search() {
        try {
            driver.findElement(By.id("ctl00_MainContent_txtLastName")).sendKeys("Dawood");
            driver.findElement(By.id("ctl00_MainContent_btnSearch")).click();
            this.extractText();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    private void extractText() {
        try {
            List<WebElement> rows = driver.findElements(By.xpath("//*[@id='gvSearchResults']/tbody/tr"));
            List<WebElement> links = null;
            for (int i = 1; i <= rows.size(); i++) {
                links = driver.findElements(By.xpath("//*[@id='gvSearchResults']/tbody/tr/td[1]/a"));
                for (int j = 0; j < links.size(); j++) {
                    System.out.println(links.get(j).getText() + ", ");
                    links.get(j).click();
                    System.out.println("Afte click");
                    driver.findElement(By.id("ctl00_MainContent_btnBack")).click();
                    this.search();
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] ar) {
        Selenium object = new Selenium();
        object.setConnection();
    }
}
Generally, we get a stale element exception if the element's attributes (or something else) change after the WebElement has been located. For example, in some cases, if the user tries to click the same element on the same page but after a page refresh, they get a StaleElementReferenceException.
To overcome this, we can create a fresh WebElement when the page has changed or been refreshed. The code below can give you some idea.
Example:
WebElement element = driver.findElement(By.xpath("//*[@id='StackOverflow']"));
element.click();
//page is refreshed
element.click();//This will obviously throw a stale element exception
To overcome this, we can store the XPath in a string and use it to create a fresh WebElement as we go.
String xpath = "//*[@id='StackOverflow']";
driver.findElement(By.xpath(xpath)).click();
//page has been refreshed. Now create a new element and work on it
driver.findElement(By.xpath(xpath)).click(); //This works
In this case, we are collecting a group of WebElements and iterating over them to get the text. But it seems something in the page changes after the WebElements are collected, and getText() throws a staleness error. We can use a loop, create the element on the fly, and get the text.
for (int i = 0; i < 5; i++) {
    String value = driver.findElement(By.xpath("//.....[" + i + "]")).getText();
    System.out.println(value);
}
Hope this helps you. Thanks.
The reason you get a StaleElementReferenceException is normally that you stored element(s) in some variable, but after that you performed some action and the page changed (due to some AJAX response), so your stored element has become stale.
The best solution is not to store the element in any variable in such a case.
This should work.
links = driver.findElements(By.xpath("//*[@id='gvSearchResults']/tbody/tr/td[1]/a"));
for (int j = 0; j < links.size(); j++) {
    System.out.println(links.get(j).getText() + ", ");
    driver.findElements(By.xpath("//*[@id='gvSearchResults']/tbody/tr/td[1]/a")).get(j).click();
    System.out.println("Afte click");
    driver.findElement(By.id("ctl00_MainContent_btnBack")).click();
    this.search();
}
Please check this code
private void extractText() {
    try {
        List<WebElement> rows = driver.findElements(By.xpath("//*[@id='gvSearchResults']/tbody/tr"));
        List<WebElement> links = null;
        System.out.println(rows.size());
        for (int i = 0; i < rows.size(); i++) {
            links = driver.findElements(By.xpath("//*[@id='gvSearchResults']/tbody/tr/td[1]/a"));
            WebElement ele = links.get(i);
            System.out.println(ele.getText() + ", ");
            ele.click();
            System.out.println("After click");
            driver.findElement(By.id("ctl00_MainContent_btnBack")).click();
        }
    } catch (Exception e) {
        e.printStackTrace();
    }
}

WebDriver prints wrong conditional statement

I'm learning WebDriver and just trying to check the links on the demoaut website. The code in the loop is supposed to recognize the "Under Construction" page by its title, print out the first line, and then go back to the base URL. But that doesn't happen, for some reason. The very first "under construction" link it gets to (featured vacation destinations) is not recognized as such, the wrong line is printed, and then, instead of going back, it crashes with a NoSuchElementException because it's looking for a link on the wrong page. Why is this happening? Why doesn't it act based on the title of the "Under Construction" page?
import java.util.List;
import java.util.concurrent.TimeUnit;

import org.openqa.selenium.*;
import org.openqa.selenium.firefox.FirefoxDriver;

public class CheckLinks {
    public static void main(String[] args) {
        String baseUrl = "http://newtours.demoaut.com/";
        System.setProperty("webdriver.gecko.driver", "C:\\Workspace_e\\geckodriver.exe");
        WebDriver driver = new FirefoxDriver();
        String underConsTitle = "Under Construction: Mercury Tours";
        driver.manage().timeouts().implicitlyWait(10, TimeUnit.SECONDS);
        driver.get(baseUrl);

        List<WebElement> linkElements = driver.findElements(By.tagName("a"));
        String[] linkTexts = new String[linkElements.size()];
        int i = 0;

        //extract the link texts of each link element
        for (WebElement e : linkElements) {
            linkTexts[i] = e.getText();
            i++;
        }

        //test each link
        for (String t : linkTexts) {
            driver.findElement(By.linkText(t)).click();
            if (driver.getTitle().equals(underConsTitle)) {
                System.out.println("\"" + t + "\"" + " is under construction.");
            } else {
                System.out.println("\"" + t + "\"" + " is working.");
            }
            driver.navigate().back();
        }
        driver.quit();
    }
}
After you click the first link, all the references in linkTexts will become stale... even if you return to the page. What you need to do is to store all the hrefs in a List and then navigate to each one and check the title of the page.
I would write it this way...
import java.awt.datatransfer.UnsupportedFlavorException;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

import org.openqa.selenium.*;
import org.openqa.selenium.firefox.FirefoxDriver;

public class CheckLinks
{
    public static void main(String[] args) throws UnsupportedFlavorException, IOException
    {
        String firefoxDriverPath = "C:\\Users\\Jeff\\Desktop\\branches\\Selenium\\lib\\geckodriver-v0.11.1-win32\\geckodriver.exe";
        System.setProperty("webdriver.gecko.driver", firefoxDriverPath);
        WebDriver driver = new FirefoxDriver();
        driver.manage().window().maximize();

        String baseUrl = "http://newtours.demoaut.com/";
        driver.get(baseUrl);

        List<WebElement> links = driver.findElements(By.tagName("a"));
        List<String> hrefs = new ArrayList<>();
        for (WebElement link : links)
        {
            hrefs.add(link.getAttribute("href"));
        }
        System.out.println(hrefs.size());

        String underConsTitle = "Under Construction: Mercury Tours";
        for (String href : hrefs)
        {
            driver.get(href);
            System.out.print("\"" + href + "\"");
            if (driver.getTitle().equals(underConsTitle))
            {
                System.out.println(" is under construction.");
            }
            else
            {
                System.out.println(" is working.");
            }
        }
        driver.close();
        driver.quit();
    }
}
Your code works fine in my Chrome browser. Your problem could be the speed of the WebDriver. You can use WebDriverWait, which is an explicit wait for a particular element.
Try the modified code below:
for (String t : linkTexts) {
    WebDriverWait wait = new WebDriverWait(driver, 60);
    wait.until(ExpectedConditions.elementToBeClickable(driver.findElement(By.linkText(t))));
    driver.findElement(By.linkText(t)).click();
    if (driver.getTitle().equals(underConsTitle)) {
        System.out.println("\"" + t + "\"" + " is under construction.");
    } else {
        System.out.println("\"" + t + "\"" + " is working.");
    }
    try {
        Thread.sleep(2000);
    } catch (InterruptedException e1) {
        e1.printStackTrace();
    }
    driver.navigate().back();
}
I am able to get the output as below:
"Home" is working.
"Flights" is working.
"Hotels" is under construction.
"Car Rentals" is under construction.
"Cruises" is working.
"Destinations" is under construction.
"Vacations" is under construction.
"SIGN-ON" is working.
"REGISTER" is working.
"SUPPORT" is under construction.
"CONTACT" is under construction.
"your destination" is under construction.
"featured vacation destinations" is under construction.
"Register here" is working.
"Business Travel # About.com" is working.
"Salon Travel" is working.
I do not find anything wrong with your logic. In fact, I copied your code and just replaced the Firefox driver with the IE driver, and it worked as expected. Below is the console output I got when running the code:
> "Home" is working.
> "Flights" is working.
> "Hotels" is under construction.
> "Car Rentals" is under construction.
> "Cruises" is working.
> "Destinations" is under construction.
> "Vacations" is under construction.
> "SIGN-ON" is working.
> "REGISTER" is working.
> "SUPPORT" is under construction.
> "CONTACT" is under construction.
> "your destination" is under construction.
> "featured vacation destinations" is under construction.
> "Register here" is working.
> "Business Travel @ About.com" is working.

How to fetch all links and click those links one by one using Selenium WebDriver

I am using Selenium WebDriver with Java.
I am fetching all the links from a webpage and trying to click each link one by one. I am getting the error below:
error org.openqa.selenium.StaleElementReferenceException: Element not found in the cache - perhaps the page has changed since it was looked up
Command duration or timeout: 30.01 seconds
For documentation on this error, please visit: http://seleniumhq.org/exceptions/stale_element_reference.html
Build info: version: '2.25.0', revision: '17482', time: '2012-07-18 21:09:54'
and here is my code:
public void getLinks() throws Exception {
    try {
        List<WebElement> links = driver.findElements(By.tagName("a"));
        int linkcount = links.size();
        System.out.println(links.size());
        for (WebElement myElement : links) {
            String link = myElement.getText();
            System.out.println(link);
            System.out.println(myElement);
            if (!link.isEmpty()) {
                myElement.click();
                Thread.sleep(2000);
                System.out.println("third");
            }
            //Thread.sleep(5000);
        }
    } catch (Exception e) {
        System.out.println("error " + e);
    }
}
Actually, it's displaying
[[FirefoxDriver: firefox on XP (ce0da229-f77b-4fb8-b017-df517845fa78)] -> tag name: a]
in the output as a link; I want to eliminate these from the result.
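That entry in the output comes from System.out.println(myElement), which prints the WebElement's toString() rather than anything about the link itself. Assuming the intent is to print the link's target (an assumption, not part of the original question), one option is to print an attribute of the element instead:
// print the link's href instead of the WebElement's toString()
System.out.println(myElement.getAttribute("href"));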
It is not a good idea to have the following scenario:
for (WebElement element : webDriver.findElements(locator.getBy())) {
    element.click();
}
Why? Because there is no guarantee that element.click() will have no effect on the other found elements; the DOM may change, hence the StaleElementReferenceException.
It is better to use the following scenario:
int numberOfElementsFound = getNumberOfElementsFound(locator);
for (int pos = 0; pos < numberOfElementsFound; pos++) {
    getElementWithIndex(locator, pos).click();
}
This is better because you will always get a fresh WebElement, even if the previous click had some effect on it.
EDIT : Example added
public int getNumberOfElementsFound(By by) {
    return webDriver.findElements(by).size();
}

public WebElement getElementWithIndex(By by, int pos) {
    return webDriver.findElements(by).get(pos);
}
Hope this is enough.
Credit goes to "loan".
I was also getting a stale element exception, so I used the answer from "loan" and it works perfectly. In case anyone needs to know how to click each link from a results page, try this (Java):
clickAllHyperLinksByTagName("h3"); where the "h3" tag contains the hyperlink.
public static void clickAllHyperLinksByTagName(String tagName) {
    int numberOfElementsFound = getNumberOfElementsFound(By.tagName(tagName));
    System.out.println(numberOfElementsFound);
    for (int pos = 0; pos < numberOfElementsFound; pos++) {
        getElementWithIndex(By.tagName(tagName), pos).click();
        driver.navigate().back();
    }
}

public static int getNumberOfElementsFound(By by) {
    return driver.findElements(by).size();
}

public static WebElement getElementWithIndex(By by, int pos) {
    return driver.findElements(by).get(pos);
}
WebDriver _driver = new InternetExplorerDriver();
_driver.navigate().to("http://www.google.co.in/");
List<WebElement> alllinks = _driver.findElements(By.tagName("a"));
for (int i = 0; i < alllinks.size(); i++)
    System.out.println(alllinks.get(i).getText());
for (int i = 0; i < alllinks.size(); i++) {
    // re-find the links on each pass so the reference is not stale after navigating back
    _driver.findElements(By.tagName("a")).get(i).click();
    _driver.navigate().back();
}
If you're OK using WebDriver.get() instead of WebElement.click() to test the links, an alternate approach is to save the href value of each found WebElement in a separate list. This way you avoid the StaleElementReferenceException because you're not trying to reuse subsequent WebElements after navigating away with the first WebElement.click().
Basic example:
List<String> hrefs = new ArrayList<String>();
List<WebElement> anchors = driver.findElements(By.tagName("a"));
for (WebElement anchor : anchors) {
    hrefs.add(anchor.getAttribute("href"));
}
for (String href : hrefs) {
    driver.get(href);
}
//extract the link texts of each link element
for (WebElement elements : linkElements) {
    linkTexts[i] = elements.getText();
    i++;
}

//test each link
for (String t : linkTexts) {
    driver.findElement(By.linkText(t)).click();
    if (driver.getTitle().equals(notWorkingUrlTitle)) {
        System.out.println("\"" + t + "\"" + " is not working.");
    } else {
        System.out.println("\"" + t + "\"" + " is working.");
    }
    driver.navigate().back();
}
driver.quit();
For a complete explanation, read this post.
List<WebElement> links = driver.findElements(By.tagName("a"));
int linkCount = links.size();
System.out.println("Total number of links on the webpage: " + linkCount);
String[] texts = new String[linkCount];
int t = 0;
for (WebElement text : links) {
    texts[t] = text.getText(); //extract text from link and put it in the array
    //System.out.println(texts[t]);
    t++;
}
for (String clicks : texts) {
    driver.findElement(By.linkText(clicks)).click();
    if (driver.getTitle().equals("notWorkingUrlTitle")) {
        System.out.println("\"" + clicks + "\"" + " is not working.");
    } else {
        System.out.println("\"" + clicks + "\"" + " is working.");
    }
    driver.navigate().back();
}
driver.quit();
