Selenium - iterate over elements - java

My goal is to:
Iterate over the WebElements in a webpage
Click on each element found and open its link in the same session
Parse the new page with some other logic
Return to the previous page and continue the loop for all previously matched ids
I have this code:
List<WebElement> links = driver.findElements(By.cssSelector("div[data-sigil='touchable']"));
// this will display the list of all matching elements on the page
for (WebElement ele : links)
{
    System.out.println("test->" + ele.getAttribute("id"));
    ele.click();
    Thread.sleep(500);
    System.out.println("URI->" + driver.getCurrentUrl());
    js.executeScript("window.history.go(-1)");
}
return "ok";
This works fine: it finds the correct element ids and ele.click() actually navigates, but it always fails when I execute js.executeScript("window.history.go(-1)").
This is my error message:
org.openqa.selenium.StaleElementReferenceException: stale element reference: element is not attached to the page document
(Session info: chrome=73.0.3683.103)
(Driver info: chromedriver=2.40.565498 (ea082db3280dd6843ebfb08a625e3eb905c4f5ab),platform=Windows NT 10.0.17134 x86_64) (WARNING: The server did not provide any stacktrace information)
Command duration or timeout: 0 milliseconds
So basically I'm not able to continue the loop.
Is there any useful technique to "click into a new tab" and manage a different Selenium driver session?
Thanks a lot in advance for any suggestion.

What is happening is that when you navigate to another page, all the elements in the list become stale. Those elements are no longer attached to the document when you come back to the page. You need to find the elements again every time the page loads.
Try this:
List<WebElement> links = driver.findElements(By.cssSelector("div[data-sigil='touchable']"));
String address;
for (int i = 0; i < links.size(); i++) {
    address = driver.getCurrentUrl();
    // re-find the elements on every iteration, since navigation made the old references stale
    links = driver.findElements(By.cssSelector("div[data-sigil='touchable']"));
    System.out.println("size: " + links.size());
    WebElement ele = links.get(i);
    System.out.println("test->" + ele.getAttribute("id"));
    ele.click();
    Thread.sleep(500);
    System.out.println("URI->" + driver.getCurrentUrl());
    //js.executeScript("window.history.go(-1)");
    //driver.navigate().back();
    driver.get(address);
}
Edit:
Try driver.get(), as it waits for the page to load. Alternatively, you can add another sleep after the navigation, as you did after the click.
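As a further refinement (a sketch, not part of the original answer): instead of the fixed Thread.sleep(500) after the click, an explicit wait can confirm that navigation away from the listing page (whose URL is saved in address) has actually happened. This assumes the Selenium 3 WebDriverWait(driver, seconds) constructor and that the click changes the URL:

```java
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

// inside the loop, right after ele.click():
// wait up to 10 s until the current URL is no longer the listing page's URL
new WebDriverWait(driver, 10)
        .until(ExpectedConditions.not(ExpectedConditions.urlToBe(address)));
System.out.println("URI->" + driver.getCurrentUrl());
```

Unlike a fixed sleep, this fails fast with a TimeoutException if the click did not navigate at all, instead of silently reading the wrong URL.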

I think you need to create the js object, like so.
The reason is that you have "lost" the reference to the JavascriptExecutor.
List<WebElement> links = driver.findElements(By.cssSelector("div[data-sigil='touchable']"));
for (WebElement ele : links) {
    System.out.println("test->" + ele.getAttribute("id"));
    ele.click();
    Thread.sleep(500);
    System.out.println("URI->" + driver.getCurrentUrl());
    // Re-initialise the js executor
    JavascriptExecutor js = (JavascriptExecutor) driver;
    js.executeScript("window.history.go(-1)");
}
return "ok";

Related

Selenium chrome driver scrape dynamically add attribute in element

Hello, I am new to the Selenium Chrome driver. I am scraping an e-commerce web site: I scrape all the product details from the home page, but the product images load dynamically (5-7 seconds after the products have loaded).
The source code looks like this:
<img alt="product1" class="image" />
and after 5-7 seconds:
<img alt="product1" class="image" src="product image url" />
So I want to scrape the value of that image's src attribute.
I tried the following:
driver.manage().timeouts().pageLoadTimeout(20, TimeUnit.SECONDS);
or
driver.manage().timeouts().implicitlyWait(20, TimeUnit.SECONDS);
or
Thread.sleep(20000);
but all of them failed.
Can anybody help me with how to get the image's src attribute value?
Selenium's "FluentWait" is your friend
final WebElement imgWithSrc = new FluentWait<>(driver)
        .withTimeout(Duration.of(10_000, ChronoUnit.MILLIS))
        .pollingEvery(Duration.of(250, ChronoUnit.MILLIS))
        .ignoring(NoSuchElementException.class)
        .ignoring(StaleElementReferenceException.class)
        .ignoring(ScriptTimeoutException.class)
        .until(d -> {
            final WebElement imgElement = d.findElement(By.cssSelector("img.image"));
            if (StringUtils.isNotBlank(imgElement.getAttribute("src"))) {
                return imgElement;
            }
            return null;
        });
The second line sets the maximum wait of 10 s, and the third line sets polling every 250 ms.
Try this:
WebElement image = new FluentWait<WebDriver>(driver)
        .withTimeout(Duration.of(10, ChronoUnit.SECONDS))
        .until(ExpectedConditions.presenceOfElementLocated(
                By.xpath("//img[@alt='product1'][@src]")));
The above code means the waiter will poll your DOM for up to 10 seconds until the DOM contains the element described by the xpath. The [@src] part of the xpath means we query for an element having a src attribute, so no positive result is returned until the required attribute is assigned to the element.

Xpath issue in element location

I am getting a very long xpath for an element that I selected. Is there any way to shorten it? This is the xpath I am getting:
//li[@class='menu_men 1-level hasChild']//div[contains(@class,'level-2')]//div[@class='menu-wrapper']//ul[@class='level-2']//li[@class='1-level']//div[@class='level-3']//ul[@class='level-3']//li//a[@class='level-3'][contains(text(),'Socks')]
This is the URL: Calvin Klein Singapore. I hovered over 'MEN', the accessories section appeared, then I hovered over 'Socks' to get the xPath.
I am getting the following exception in my code, and I am wondering if the long xpath could be one of the reasons:
org.openqa.selenium.NoSuchElementException: no such element: Unable to
locate element: {"method":"xpath","selector":"//li[@class='first
menu_men 1-level
hasChild']//div[contains(@class,'level-2')]//div[@class='menu-wrapper']//ul[@class='level-2']//li[@class='1-level']//div[@class='level-3']//ul[@class='level-3']//li//a[@class='level-3'][contains(text(),'Socks')]"}
I am using ChroPath from within the Chrome developer tools to get the xPath.
I am new to automation, I really hope someone can advise. Thank you.
@SameerArora this is the code I have to clear the popup window, as mentioned in the comments below.
//for clearing the popup window
@FindBy(how=How.XPATH, using="//*[starts-with(@id,'popup-subcription-closes-link-')]")
public WebElement newsletterpopup;
public String clickCategory() {
    //.....
    resusableFunctions.buttonClick(driver, newsletterpopup, "popoup");
}
public void buttonClick(WebDriver driver, WebElement element, String elementName) throws InterruptedException
{
    try
    {
        element.click();
        System.out.println("Log: ResuableFunction.buttonClick");
    }
    catch (org.openqa.selenium.ElementNotInteractableException notInteract)
    {}
}
The element you are looking for can be found using xpath:
WebElement element = driver.findElement(By.xpath("(//a[contains(text(),'Socks')])[1]"));
However, as the element is not directly visible when you open the link, you would get a NoSuchElementException; to resolve it, you can use the JavaScript click method, which operates directly on the page's DOM.
In addition, I can see that a subscription popup appears when the page is opened for the first time, so you need to dismiss that popup first (if it is present) and then click on the "Socks" element using the JavaScript click method.
Your code should be like:
List<WebElement> closeSubscriptionPopUp = driver.findElements(By.xpath("//a[contains(@id,'popup-subcription-closes-link')]"));
if (closeSubscriptionPopUp.size() > 0) {
    closeSubscriptionPopUp.get(0).click();
}
WebElement sockElement = driver.findElement(By.xpath("(//a[contains(text(),'Socks')])[1]"));
JavascriptExecutor executor = (JavascriptExecutor)driver;
executor.executeScript("arguments[0].click();", sockElement);
To hover over 'MEN' >> accessories >> 'Socks', you need to use Selenium's Actions class.
As it is not really possible to first click on MEN (that would open another section), to reach Socks you need to chain all of the actions you want to achieve in one go.
The process should be:
move to the MEN element first
move to Accessories
then move to Socks and click on it.
Note: by using the Actions class, we can chain the whole process in one single go, as mentioned below.
1) First way:
Actions action = new Actions(driver);
action.moveToElement(driver.findElement(By.xpath("(//a[contains(text(),'MEN')])[2]")))
      .moveToElement(driver.findElement(By.xpath("(//a[contains(text(),'Socks')])[1]")))
      .click().build().perform();
2) Second way with wait:
WebDriverWait wait = new WebDriverWait(driver, 10);
Actions action = new Actions(driver);
action.moveToElement(driver.findElement(By.xpath("(//a[contains(text(),'MEN')])[2]"))).build().perform();
wait.until(ExpectedConditions.visibilityOfElementLocated(By.xpath("(//a[contains(text(),'Socks')])[1]")));
action.moveToElement(driver.findElement(By.xpath("(//a[contains(text(),'Socks')])[1]")));
action.click().build().perform();
Try this:
//a[normalize-space(text()) = 'Socks']
I would recommend you not to use such long xpaths; try to write the xpath on your own.
Try :
//li[contains(@class,'menu_men')]//a[contains(text(),'Socks')]

wait.until(ExpectedConditions.elementToBeClickable) is not waiting for the defined time

Using Selenium WebDriver with Java, I want to click on an element that is present on the page and is visible, but is grayed out, i.e. the element is present on the page but not interactable.
So, I am using an explicit WebDriverWait to wait until that element is clickable, with the lines of code below. But it is not working: the driver does not wait for the element to become interactable, and throws the exception "is not clickable at point (415, 765). Other element would receive the click:".
Now, if I use a static wait instead of this explicit wait, I am able to click on the element.
Code which I have written:
wait.until(ExpectedConditions.visibilityOfElementLocated(By.xpath("//*[@name='mobile']")));
wait.until(ExpectedConditions.elementToBeClickable(By.xpath("//*[@name='mobile']")));
newNum.click();
Script Log:
Trying to locate: By.xpath: //*[@name='mobile']
Located element:By.xpath: //*[@name='mobile']
Trying to locate: By.xpath: //*[@name='mobile']
Located element:By.xpath: //*[@name='mobile']
Trying to locate: By.xpath: //*[@name='mobile']
Located element:By.xpath: //*[@name='mobile']
Trying to click on:[[ChromeDriver: chrome on XP (7686dd92e2bb577696qa2e1aa13effd6)] -> xpath: //*[@name='mobile']]
Exception occured:org.openqa.selenium.WebDriverException: unknown error: Element <input id="abc-radiobox-2032-inputEl" data-ref="inputEl" type="text" role="combobox" size="1" name="mobile" placeholder="- select option -" readonly="readonly" class="dummyclass" autocomplete="off" componentid="gwc-combobox-2032"> is not clickable at point (415, 765). Other element would receive the click: <div class="anotherclass" role="status" id="loadmask-1985" tabindex="0" componentid="loadmask-1985" style="">...</div>
(Session info: chrome=71.0.3578.98)
(Driver info: chromedriver=2.41.578737 (49da6702b16031c40d63e1234de03a32ff6c197e),platform=Windows NT 10.0.10586 x86_64) (WARNING: The server did not provide any stacktrace information)
Command duration or timeout: 0 milliseconds
According to the error you are getting:
Exception: is not clickable at point (415, 765). Other element would receive the click:
It seems the driver cannot even reach the particular element, so it is not waiting until it becomes clickable. Whenever this type of error occurs, we can usually use a JavaScript click instead of waits or other kinds of clicks.
Try the code below, and make sure the element locator's value is good enough to locate the element uniquely:
WebElement element = driver.findElement(By.xpath("//*[@name='mobile']"));
JavascriptExecutor js = (JavascriptExecutor) driver;
js.executeScript("arguments[0].click();", element);
This code always works for me. Sometimes I need to call scrollIntoView() first to scroll the page to the element before performing an action on it.
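Since the quoted error says a load-mask <div> (id loadmask-1985) "would receive the click", another option is to explicitly wait for that overlay to disappear before clicking. This is only a sketch: the div[id^='loadmask'] selector is inferred from the error message, and such ids are usually generated per page load, hence the prefix match:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

// Selenium 3 style constructor: timeout in seconds
WebDriverWait wait = new WebDriverWait(driver, 15);
// first wait for the blocking load mask to go away...
wait.until(ExpectedConditions.invisibilityOfElementLocated(By.cssSelector("div[id^='loadmask']")));
// ...then the input underneath can receive a normal click
driver.findElement(By.xpath("//*[@name='mobile']")).click();
```

This keeps the normal click (so the element's own event handlers fire exactly as for a user click), instead of bypassing the overlay with a JavaScript click.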
You should wait until the element is visible, then check whether the element is in an enabled (clickable) state. Only after that should you perform the click operation.
Steps
1. Create a Firefox browser session
2. Navigate to the required page
3. Wait until the web element (//*[@name='mobile']) is visible [wait for approximately 15 seconds]
[What is a web element? Ans: the element you are going to perform the action on. It may be a button, link, icon, text field, etc.]
4. Now check whether the element is in a clickable (enabled) state
5. If it is in a clickable (enabled) state, perform the click operation
public void test_01_ButtonClick()
{
    WebDriver driver = new FirefoxDriver();
    driver.navigate().to("www.hlo.com");
    // here we check whether the element is visible
    waitForElementInDOM(driver, "//*[@name='mobile']", 15);
    // here we check whether the element is enabled
    boolean enable = elementIsEnable(driver, "//*[@name='mobile']");
    if (enable)
    {
        driver.findElement(By.xpath("//*[@name='mobile']")).click();
    }
    else
    {
        System.out.println("Element not visible. Please increase your waiting time");
    }
}
----------------------------------------------------------------------------
public void waitForElementInDOM(WebDriver driver, String elementIdentifier, long timeOutInSeconds)
{
    WebDriverWait wait = new WebDriverWait(driver, timeOutInSeconds);
    try
    {
        // this will wait for the element to be visible for up to timeOutInSeconds seconds
        wait.until(ExpectedConditions.visibilityOfElementLocated(By.xpath(elementIdentifier)));
    }
    catch (NoSuchElementException e)
    {
        e.printStackTrace();
    }
}
-------------------------------------------------------------------------------
public boolean elementIsEnable(WebDriver driver, String elementIdentifier)
{
    // note: pass the variable, not the literal string "elementIdentifier"
    WebElement element = driver.findElement(By.xpath(elementIdentifier));
    return element.isEnabled();
}

Looping through with List<WebElement> with Selenium

I get a list of anchors using the code below, and then I want to go to each link. I came up with the code below, but after the first loop iteration I get the following exception:
org.openqa.selenium.StaleElementReferenceException: stale element reference: element is not attached to the page document
(Session info: chrome=55.0.2883.87)
List<WebElement> listingAnchorList = driver.findElements(By.xpath("//div[contains(@class,'cat')]/a"));
for (WebElement listingAnchor : listingAnchorList) {
    driver.get(listingAnchor.getAttribute("href"));
    System.out.println(driver.getTitle());
}
Is there any way to do this without having to go back to the page every time?
You can collect your href attributes in a new List first, and then iterate over it and open each page.
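A minimal sketch of that suggestion, using the locator from the question: reading every href value into a plain string before any navigation means no WebElement reference is held across a page load, so nothing can go stale.

```java
import java.util.ArrayList;
import java.util.List;

import org.openqa.selenium.By;
import org.openqa.selenium.WebElement;

// 1. Collect the hrefs as strings while still on the listing page.
List<String> hrefs = new ArrayList<>();
for (WebElement anchor : driver.findElements(By.xpath("//div[contains(@class,'cat')]/a"))) {
    hrefs.add(anchor.getAttribute("href"));
}

// 2. Navigate using the strings; no WebElement is touched after navigation,
//    so this loop cannot throw StaleElementReferenceException.
for (String href : hrefs) {
    driver.get(href);
    System.out.println(driver.getTitle());
}
```

This only works when the anchors have absolute href values; if the site builds links in JavaScript click handlers instead, you are back to the re-find-on-every-iteration approach from the first question above.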

Selenium WebDriver- Java- It is not willing to click a link

I have tried quite a lot and I am not even getting any errors, but it is not printing anything (I want it to print the title of the page).
WebDriver driver = new HtmlUnitDriver();
WebElement element = driver.findElement(By.cssSelector("a[href*='Alerts.htm']"));
element.click();
System.out.println(driver.getTitle());
Here is the HTML code (the part I wish to click). There is a title for both the page I want to click and the current page.
<li title="Alerts"><span>Alerts</span></li>
I am not getting any errors, but it should print the title, which it is not doing.
I have followed many sorts of instructions found here and on the web.
Things I have tried so far:
By locator = By.xpath("//li[@title='Alerts']/a");
WebElement element = driver.findElement(locator);
element.click();
WebElement element = driver.findElement(By.partialLinkText("Alert"));
element.click();
Where am I going wrong?
The title of an HTML document is defined within a <title> tag, typically in the <head> section.
This is the title that the getTitle method returns.
See http://www.w3schools.com/tags/tag_title.asp.
I am not sure of this, but I think HtmlUnitDriver is a headless browser instance. Kindly try another browser, Firefox perhaps.
WebDriver driver = new FirefoxDriver();
You first need to open a page in a browser!
WebDriver driver = new ...
driver.get(url); // THIS LAUNCHES AN ACTUAL BROWSER
// now you can actually do things
driver.findElement ...
driver.getTitle();
driver.quit(); // TO DISMISS THE BROWSER
