How to use List<WebElement> in WebDriver - Java

I am creating automated tests for some websites using WebDriver, TestNG and code written in Java. The page shows a register of categories, with the number of auctions in parentheses, and I need to get this number as a variable.
I use this code:
By bycss = By.cssSelector(".list.list-categories>li:first-child");
WebElement number1_1_vse = driver.findElement(bycss);
String text_vse1_1 = number1_1_vse.getText();
but I get only the first number, and I need to get all of them. Someone advised me to use a List, but when I use it I only get
[[[[[[[FirefoxDriver: firefox on WINDOWS (7e6e0d0f-5cbb-4e48-992f-26d743a321a5)] -> css selector: .list.list-categories>li:first-child]] -> xpath: ..]] -> xpath: .//*], [[[[[[FirefoxDriver: firefox on WINDOWS (7e6e0d0f-5cbb-4e48-992f-.....
code:
By bycss2 = By.cssSelector(".list.list-categories>li:first-child");
WebElement number1_1_vse2 = driver.findElement(bycss2);
WebElement parent1 = number1_1_vse2.findElement(By.xpath(".."));
List<WebElement> childs1 = parent1.findElements(By.xpath(".//*"));
System.out.println(childs1);
(link to the website and a screenshot showing the number were attached to the original question)
Can anyone advise me, please?

Try the following code:
//...
By mySelector = By.xpath("/html/body/div[1]/div/section/div/div[2]/form[1]/div/ul/li");
List<WebElement> myElements = driver.findElements(mySelector);
for(WebElement e : myElements) {
System.out.println(e.getText());
}
It will return the whole content of the <li> tags, like:
<a class="extra">Vše</a> (950)</li>
But you can easily get the number now from it, for example by using split() and/or substring().
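For example, a minimal sketch of pulling the number out, assuming getText() on such an element yields text like "Vše (950)":
String text = e.getText();                    // e.g. "Vše (950)"
String number = text.substring(text.indexOf('(') + 1, text.indexOf(')'));
int count = Integer.parseInt(number.trim());  // 950 as an int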

Try the below logic:
driver.get("http://www.labmultis.info/jpecka.portal-exdrazby/index.php?c1=2&a=s&aa=&ta=1");
List<WebElement> allElements = driver.findElements(By.cssSelector(".list.list-categories li"));
for (WebElement ele : allElements) {
    System.out.println("Name + Number===>" + ele.getText());
    String s = ele.getText();
    s = s.substring(s.indexOf("(") + 1, s.indexOf(")"));
    System.out.println("Number==>" + s);
}
====Output======
Name + Number===>Vše (950)
Number==>950
Name + Number===>Byty (181)
Number==>181
Name + Number===>Domy (512)
Number==>512
Name + Number===>Pozemky (172)
Number==>172
Name + Number===>Chaty (28)
Number==>28
Name + Number===>Zemědělské objekty (5)
Number==>5
Name + Number===>Komerční objekty (30)
Number==>30
Name + Number===>Ostatní (22)
Number==>22

List<WebElement> myElements = driver.findElements(By.xpath("some/path//a"));
System.out.println("Size of List: "+myElements.size());
for(WebElement e : myElements)
{
System.out.print("Text within the Anchor tab"+e.getText()+"\t");
System.out.println("Anchor: "+e.getAttribute("href"));
}
// NOTE: "//a" will give you all the anchors from the point your XPath has reached onward.
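A related detail worth noting: when you call findElements on an element you have already found (rather than on the driver), the leading dot in the XPath matters. A minimal sketch, where the footer locator is hypothetical:
WebElement footer = driver.findElement(By.cssSelector("footer"));    // hypothetical container element
List<WebElement> inside = footer.findElements(By.xpath(".//a"));     // only anchors inside the footer
List<WebElement> wholePage = footer.findElements(By.xpath("//a"));   // without the dot, "//a" still searches the whole document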

Related

Java Selenium WebDriver: access span element inside each div element from an autosuggest

I am a newbie to Selenium and am struggling with what looks to be a simple task.
Using Java 8 with Selenium WebDriver and Chrome.
My requirement is to go to this website: https://start.duckduckgo.com/
In the search, type: elephant
Now I need to retrieve all the autosuggested values using Selenium.
From the browser console, I think this is the relevant HTML containing the autosuggestions:
<div class="search__autocomplete" style="display: block;">
<div class="acp-wrap js-acp-wrap">
<div class="acp" data-index="0"><span class="t-normal">elephant toothpaste</span></div>
<div class="acp" data-index="1"><span class="t-normal">elephant toothpaste</span> experiment</div>
...
</div>
<div class="acp-footer is-hidden js-acp-footer">
<span class="acp-footer__instructions">Shortcuts to other sites to search off DuckDuckGo</span>
</div>
Here is the code I have tried, with no luck:
WebDriver driver = new ChromeDriver();
driver.manage().timeouts().implicitlyWait(10, TimeUnit.SECONDS);
driver.get("https://start.duckduckgo.com/");
WebElement searchText = driver.findElement(By.name("q"));
searchText.sendKeys("elephant");
List<WebElement> searchList = new ArrayList<WebElement>();
// have tried various options : all of the below selectors return no results
//searchList = driver.findElements(By.xpath("//div[contains(#class, 'acp')]//span"));
//searchList = driver.findElements(By.xpath("//div[contains(#class,
'acp')]//span[contains(#class, 't-normal')]"));
//searchList = driver.findElements(By.cssSelector(".t-normal"));
//searchList = driver.findElements(By.cssSelector("t-normal"));
//searchList = driver.findElements(By.cssSelector(".acp > span"));
// this atleast returns a collection of 10 but not sure of its content
searchList = driver.findElements(By.xpath("//div[contains(#class, 'acp')]"));
if(null != searchList && searchList.size() > 0) {
for(int i = 0;i<searchList.size();i++) {
WebElement e = searchList.get(i);
System.out.println("element details are " + e.toString());
/** NoSuchElementException with below tried Xpath and by css
WebElement spanElement = e.findElement(By.className("t-normal"));
WebElement spanElement = e.findElement(By.className(".t-normal"));
WebElement spanElement = e.findElement(By.cssSelector(".t-normal"));
WebElement spanElement = e.findElement(By.cssSelector("t-normal"));
**/
WebElement spanElement = e.findElement(By.cssSelector("t-normal"));
System.out.println(e.getText());
}
}else {
System.out.println("<<<<< not able to locate >>>>>");
}
If in the browser I do a find using XPath, I can locate these span elements:
//div[contains(@class, 'acp-wrap js-acp-wrap')]//div[contains(@class, 'acp')]//span[contains(@class,'t-normal')]
So I am really confused about how to extract the text from within the span.
searchList = driver.findElements(By.xpath("//div[@class='acp']"));
for (int i = 0; i < searchList.size(); i++) {
    System.out.println(searchList.get(i).getText());
}
Just get all 8 dropdown values and then print each one's getText().
You can generate a list of WebElements directly.
List<WebElement> searchList = driver.findElements(By.xpath("//div[@class='search__autocomplete'][contains(@style,'display: block')]/div/div"));
If you want to iterate through the list and capture the values in a new list, you can try the following:
List<String> listOfValues = new ArrayList<String>();
for(WebElement ele : searchList){
String elementValue = ele.getText().trim();
listOfValues.add(elementValue);
}
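The suggestion list is only rendered after you type, so if findElements() comes back empty it may simply not be on the page yet. A minimal sketch using an explicit wait with the locator above; the 10-second timeout is an assumption:
WebDriverWait wait = new WebDriverWait(driver, 10);
List<WebElement> suggestions = wait.until(
        ExpectedConditions.visibilityOfAllElementsLocatedBy(
                By.xpath("//div[@class='search__autocomplete'][contains(@style,'display: block')]/div/div")));
List<String> listOfValues = new ArrayList<String>();
for (WebElement ele : suggestions) {
    listOfValues.add(ele.getText().trim());
}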

How to solve "org.openqa.selenium.support.ui.UnexpectedTagNameException: Element should have been "select" but was "input"" selenium error

I'm trying to get the default selected value of the dropdown below.
I wrote the below code to do it:
Select selectyear = new Select(driver.findElement(By.id("year")));
WebElement year = selectyear.getFirstSelectedOption();
String selectedoption = year.getText();
But this throws the following error
org.openqa.selenium.support.ui.UnexpectedTagNameException: Element should have been "select" but was "input"
How can I fix this? The same code works perfectly for dropdowns that don't have a "value" attribute.
The only explanation is that there is another element with the id year: the input tag.
Put this code before Select selectyear = new Select(driver.findElement(By.id("year")));:
List<WebElement> elements = driver.findElements(By.id("year"));
for (WebElement element: elements) {
System.out.println("Tag: " + element.getTagName());
System.out.println("Text: " + element.getText());
System.out.println("Location: " + element.getLocation());
}
The solution is a more specific XPath for the select: //select[@id='year']
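A minimal sketch of that fix, assuming the dropdown really is a <select> element with id "year":
WebElement yearDropdown = driver.findElement(By.xpath("//select[@id='year']"));  // matches only the <select>, not the <input>
Select selectyear = new Select(yearDropdown);
String selectedoption = selectyear.getFirstSelectedOption().getText();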

How to exclude printing all the hidden links from a website using Selenium

The following code prints all the links from the website, including some hidden links. These hidden links are displayed as blanks on the console. How do I write my code in such a way that it only prints the visible links from the website and does not print the hidden links as blanks?
driver.get("https://www.duke-energy.com/my-account/sign-in");
List<WebElement> link = driver.findElements(By.tagName("a"));
System.out.println("The total number of links on the page are :"+link.size());
for(int i=0;i<link.size();i++)
{
String url=link.get(i).getText();
System.out.println(url);
}
You can use stream() and filter() to keep only the visible links, using isDisplayed():
driver.get("https://www.duke-energy.com/my-account/sign-in");
List<WebElement> links = new WebDriverWait(driver, 5).until(ExpectedConditions.presenceOfAllElementsLocatedBy(By.tagName("a")));
System.out.println("The total number of links on the page are :" + links.size());
List<WebElement> vlinks = links.stream().filter(WebElement::isDisplayed).collect(Collectors.toList());
System.out.println("The total number of visible links on the page are :" + vlinks.size());
for (WebElement link : vlinks) {
String url = link.getText();
System.out.println(url);
}
Visible links whose text is not empty:
links.stream().filter(e -> e.isDisplayed() && !e.getText().isEmpty())
.collect(Collectors.toList())
.forEach(e -> System.out.println(e.getText()));

How to print the elements having a common CSS value in Selenium

WebDriver driver = new FirefoxDriver();
driver.get("https://www.ignitionone.com/company/careers/");
driver.manage().window().maximize();
Thread.sleep(2000);
driver.findElement(By.cssSelector("button.teal")).click();
Thread.sleep(2000);
String s2 = driver.findElement(By.cssSelector("#board_title")).getText();
List<WebElement> d_details = driver.findElements(By.cssSelector(".level-0"));
for (int i = 0; i < d_details.size(); i++) {
WebElement element = d_details.listIterator();
String innerhtml = element.getAttribute("innerHTML");
System.out.println("Available openings are" + innerhtml);
}
System.out.println("The title is " + s2);
driver.quit();
This is my code. I am trying to print the available job openings in the different areas on the webpage. Can someone please help me understand what's going wrong here?
You have a type problem on this line: listIterator() returns a ListIterator<WebElement>, not a WebElement, so this will not even compile:
WebElement element = d_details.listIterator();
A better way to iterate over the elements would be this:
List<WebElement> results = driver.findElements(By.cssSelector(".level-0"));
for (WebElement result: results) {
String innerhtml = result.getAttribute("innerHTML");
System.out.println("Available openings are" + innerhtml);
}
Note that you may also be experiencing a timing issue. You should replace your Thread.sleep() calls with explicit waits; check out this topic:
WebDriver - wait for element using Java
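For example, a minimal sketch of replacing the first Thread.sleep() with an explicit wait; the locator is the one from the question, and the 10-second timeout is an assumption:
WebDriverWait wait = new WebDriverWait(driver, 10);
WebElement tealButton = wait.until(ExpectedConditions.elementToBeClickable(By.cssSelector("button.teal")));
tealButton.click();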

Open every link in a loop (Selenium)

I just about cracked my head trying to figure out how to make this work. I am trying to force Selenium to open link after link, but it opens the first link again and again, even though the console output shows that the loop is working correctly. I tried a while loop but that doesn't work either. I want to open link after link, changing the number of the li element to open the next link.
for (int footer_links = 1; footer_links < 6; footer_links++) {
WebElement self_service_bi = driver.findElement(By.xpath("//div/div/ul/li['$footer_links']/a"));
self_service_bi.click();
File srcFile1 = ((TakesScreenshot)driver).getScreenshotAs(OutputType.FILE);
File targetFile1 = new File("D:\\DataPineScreenshots\\" + driver.getTitle() + ".png");
FileUtils.copyFile(srcFile1, targetFile1);
driver.navigate().back();
System.out.print(footer_links + "\n");
}
Fix your syntax: replace
By.xpath("//div/div/ul/li['$footer_links']/a")
with
By.xpath("//div/div/ul/li[" + footer_links + "]/a")
driver.findElement will always return the first matching element. Use the driver.findElements function to get a list of all elements matching the given XPath. But don't do that inside the loop, because it will open the same link every time.
Try something like:
List<String> lstUrls = new ArrayList<String>();
List<WebElement> lstEle = driver.findElements(By.xpath("//div/div/ul/li/a"));
for (WebElement element : lstEle)
    lstUrls.add(element.getAttribute("href"));
for (String string : lstUrls) {
    driver.get(string);
    File srcFile1 = ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE);
    File targetFile1 = new File("D:\\DataPineScreenshots\\" + driver.getTitle() + ".png");
    FileUtils.copyFile(srcFile1, targetFile1);
    System.out.print(string + "\n");
}
