I'm having a problem extracting the full style attribute, because part of it hasn't reached the DOM yet. It's an image that sometimes takes 1-2 seconds to load on the screen, so there IS a string in the attribute, but it CHANGES when the image arrives from the server.
I'm doing a WebDriverWait for the element to be visible in the DOM, but when I try to read its "style" attribute (the one that contains "...;url("http://...")"), the URL part sometimes isn't there yet, and then my substring() fails.
Here is the code:
#Step("Print Image src url")
public String printImage(Integer imgNo){
WebElement imgStyle = (new WebDriverWait(driver, 15)).until(ExpectedConditions.visibilityOfElementLocated(By.xpath(Consts.ACTIVE_IMG_XPATH_1000 + "/div[" + (currentImg + 1) +"]/div[1]")));
String styleString = imgStyle.getAttribute("style");
Integer idxUrl = styleString.indexOf("url");
Integer idxJpg = styleString.indexOf("jpg");
String urlImage = styleString.substring(idxUrl+5,idxJpg+3); //styleString.indexOf("http"),20
Log.info("V - Image " + imgNo + " src is: " + urlImage);
return urlImage;
}
I could simply add Thread.sleep(3000), but I don't want to use it. Any ideas on how to solve this more cleanly? Can I wait for the visibility of a string?
Thanks for your time.
What is the HTML like? You could use contains() or starts-with() with the static beginning of the URL, before the part that is dynamic and changes. For example: new WebDriverWait(driver, 15).until(ExpectedConditions.visibilityOfElementLocated(By.xpath("//div[contains(@style, 'url(')]")))
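If you'd rather wait on the attribute value itself (the "can I wait for a string?" part), here is a minimal Java sketch, assuming a Selenium version that ships ExpectedConditions.attributeContains and reusing the locator from the question:
// Wait for the element first, then keep polling until its style attribute contains "url(".
By imgLocator = By.xpath(Consts.ACTIVE_IMG_XPATH_1000 + "/div[" + (currentImg + 1) + "]/div[1]");
WebDriverWait wait = new WebDriverWait(driver, 15);
wait.until(ExpectedConditions.visibilityOfElementLocated(imgLocator));
wait.until(ExpectedConditions.attributeContains(imgLocator, "style", "url("));
String styleString = driver.findElement(imgLocator).getAttribute("style"); // now safe to substring
This keeps the explicit-wait style you already use and avoids a fixed sleep.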
In my application, there are more than 50 links under the "pw_listing_widget_tabs_list_ul']/li" locator. I want to click on a randomly selected link. I have written the code below, but it is failing to click the link.
List<WebElement> links = driver.findElements(By.xpath("//ul[@id='pw_listing_widget_tabs_list_ul']/li"));
int count=links.size();
System.out.println("Total links are: " +count);
Random r = new Random();
int linkNo = r.nextInt(count);
System.out.println("Random Link# " +linkNo);
WebElement link = links.get(linkNo);
String text = link.getText();
System.out.println("Text: " + text);
Thread.sleep(3000);
link.click();
I have even tried to click the link using its text, but the text is too long and it is failing to click.
If you want to click on the link (anchor tag), point the locator specifically at the <a> tag.
Can you please change your XPath to
//ul[@id='pw_listing_widget_tabs_list_ul']/li/a
List<WebElement> links = driver.findElements(By.xpath("//ul[@id='pw_listing_widget_tabs_list_ul']/li/a"));
Q2: I have even tried to click the link using its text, but the text is too long and it is failing to click.
Ans2: Your element points to the <li>, and when you use link.getText() it will return all the text inside the <li>, not just the anchor tag's text.
Try to use the XPath below:
//ul[@id='pw_listing_widget_tabs_list_ul']/li/a
Note: if possible, share the error message and the HTML.
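Putting both points together, a rough sketch of the adjusted code from the question (same locator, just targeting the <a> tag and swapping the fixed sleep for an explicit wait):
// Collect the anchor elements directly and click a random one.
List<WebElement> links = driver.findElements(By.xpath("//ul[@id='pw_listing_widget_tabs_list_ul']/li/a"));
System.out.println("Total links are: " + links.size());
Random r = new Random();
WebElement link = links.get(r.nextInt(links.size()));
System.out.println("Text: " + link.getText());
// Wait until the chosen anchor is clickable instead of using Thread.sleep(3000).
new WebDriverWait(driver, 10).until(ExpectedConditions.elementToBeClickable(link)).click();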
I'm trying to write a program that will automatically run a Google Images search and download the first image for a given string.
I'm doing it all with Selenium WebDriver against Google, but I can change that. I tried to filter the results, but the only thing that appears different to me is the "data-atf" attribute. I want to download the first image, so it should be at zero, but how can I search by that attribute? Besides, the other attributes always change depending on the string that is given.
String = "German Shepherd"
ChromeDriver driver = new ChromeDriver();
driver.get("https:/google.com/search?q=" + String +
"&source=lnms&tbm=isch&sa=X&ved=0ahUKEw
iXlMO0nq_jAhUEzaQKHVVXC50Q_AUIEygE&biw
=834&bih=770");
//and then I've got something like this
//wont work because cssSelector is always different
WebElement img = driver.findElement(By.cssSelector("#selector"));
BufferedImage buffer = ImageIO.read(new URL(img.getAttribute("src")));
ImageIO.write(buffer, "png", new File("image.png"));
} catch (Exception e) {
e.printStackTrace();
} finally {
driver.close();
}
Credits for the second part to: Save/copy to clipboard image from page by chrome console
Most importantly I need help filtering the results; after that, help with downloading would be highly appreciated.
If you want to filter the images down to only those which have a data-atf attribute, the easiest way is an XPath selector:
//img[@data-atf]
Alternatively, if you want only the images under the "Search Results" heading:
//h2[text()='Search Results']/parent::*/descendant::img[@data-atf]
Of course, you can also filter the images in Java code using the Stream.filter() function:
List<WebElement> allImages = driver.findElements(By.tagName("img"));
System.out.println("All images #: " + allImages.size());
List<WebElement> imagesWithDataAtf = allImages
.stream()
.filter(image -> image.getAttribute("data-atf") != null)
.collect(Collectors.toList());
System.out.println("Images with data-atf attribute #: " + imagesWithDataAtf.size());
I'm getting an error in my code where I look for a text element, but it is not displayed based on the string I have in my Excel file. I assume that it is due to the frame setup.
Here is my code:
if (driver.findElement(By.linkText(reports)).isDisplayed())
{
    System.out.println("report = " + reports + " exists");
}
else
{
    System.out.println("report = " + reports + " does not exist");
}
Take note that reports = "Order Qty" (the text is extracted from Excel).
Here is the element that I need to find in the browser:
Instead of finding the element by linkText, try finding the element by its text using XPath.
Since you are using the isDisplayed() method to check whether the element is visible, I'm assuming you expect the element to sometimes not be there. In that case, if the element is not present, findElement() will throw a NoSuchElementException.
To avoid this, either put the check inside a try block and handle the exception in the catch block, or use findElements() and check the list size, which never throws the exception (see the sketch after the code below).
As mentioned by cruisepandey, you should also use explicit waits for element loading delays.
String reports = "Order Qty";
List<WebElement> list = new WebDriverWait(driver, 30)
        .until(ExpectedConditions.visibilityOfAllElementsLocatedBy(
                By.xpath("//div[@class='kpi-report-wrapper']/h2[contains(text(), '" + reports + "')]")));
if (list.isEmpty()) {
    System.out.println("report = " + reports + " does not exist");
} else {
    System.out.println("report = " + reports + " exists");
}
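For completeness, here is a sketch of the findElements-and-size variant mentioned above, against the same (assumed) kpi-report-wrapper markup; findElements never throws, it simply returns an empty list when nothing matches:
List<WebElement> found = driver.findElements(
        By.xpath("//div[@class='kpi-report-wrapper']/h2[contains(text(), '" + reports + "')]"));
if (found.isEmpty()) {
    System.out.println("report = " + reports + " does not exist");
} else {
    System.out.println("report = " + reports + " exists");
}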
You can use this XPath:
//h2[text()='Order Qty']
Just make sure this element isn't inside a frame; if it is, you will need to switch to that frame first (see the sketch below).
However, introducing WebDriverWait would be a great idea for stability.
For WebDriverWait:
new WebDriverWait(driver, 10)
        .until(ExpectedConditions.elementToBeClickable(By.xpath("//h2[text()='Order Qty']")));
This would return you a web element.
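If the heading does turn out to live inside a frame, here is a minimal sketch of switching into it first; the frame name "reportFrame" is only a placeholder for whatever the real frame is called:
driver.switchTo().frame("reportFrame"); // hypothetical frame name or id
WebElement heading = new WebDriverWait(driver, 10)
        .until(ExpectedConditions.elementToBeClickable(By.xpath("//h2[text()='Order Qty']")));
System.out.println("Found heading: " + heading.getText());
driver.switchTo().defaultContent(); // switch back to the main document when done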
Can we catch, in Selenium WebDriver, events generated by the user (or events in general)? I know we can check the state of the page, e.g. with WebDriverWait and ExpectedConditions, but that is not always appropriate.
Let's say I wanted to wait for user input before continuing execution of the test class. It would look something like this:
driver.get("https://www.google.com");
waitForKey(driver, org.openqa.selenium.Keys.RETURN);
/* rest of code */
driver.quit();
Where waitForKey would be implemented as:
public static void waitForKey(WebDriver driver, org.openqa.selenium.Keys key) {
Wait wait = new WebDriverWait(driver, 2147483647);
wait.until((WebDriver dr) -> /* what should be here? */);
}
Is there any way to do this?
I have never heard of Selenium supporting this. However, you can build it yourself by adding an event listener to the document that creates or changes the DOM, and then use Selenium to detect the change. See my example below.
The example uses a JavaScript executor to add a keydown listener to the document. When the Enter key has been pressed, it creates a div with the ID onEnter and adds it to the DOM. Finally, Selenium waits for the element with ID onEnter and then clicks a link on the page.
driver.get("http://buaban.com");
Thread.sleep(5000);
String script = "document.addEventListener('keydown', function keyDownHandler(event) {" +
" const keyName = event.key;" +
" if(keyName===\"Enter\") {" +
" var newdiv = document.createElement('DIV');" +
" newdiv.id = 'onEnter';"+
" newdiv.style.display = 'none';"+
" document.body.appendChild(newdiv);" +
" }" +
"});";
((JavascriptExecutor)driver).executeScript(script);
WebDriverWait wait = new WebDriverWait(driver, 20);
wait.until(ExpectedConditions.presenceOfElementLocated(By.id("onEnter")));
driver.findElement(By.cssSelector("#menu-post a")).click();
Thread.sleep(5000);
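Building on that, here is a rough sketch of how the waitForKey method from the question could be filled in with the same listener trick. For brevity it only reacts to the Enter key rather than mapping every Keys value to a DOM key name, so the key parameter is effectively ignored here:
public static void waitForKey(WebDriver driver, org.openqa.selenium.Keys key) {
    // Register a keydown listener that appends a hidden marker <div> to the DOM on Enter.
    String script =
            "document.addEventListener('keydown', function (event) {" +
            "  if (event.key === 'Enter') {" +
            "    var marker = document.createElement('DIV');" +
            "    marker.id = 'seleniumKeyMarker';" +
            "    marker.style.display = 'none';" +
            "    document.body.appendChild(marker);" +
            "  }" +
            "});";
    ((JavascriptExecutor) driver).executeScript(script);
    // Block until the marker element shows up in the DOM.
    new WebDriverWait(driver, 2147483647)
            .until(ExpectedConditions.presenceOfElementLocated(By.id("seleniumKeyMarker")));
}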
I have been cracking my head trying to make this work. I am trying to force Selenium to open link after link, but it opens the first link again and again, even though the console output shows that the loop is working correctly. I also tried a while loop, but it doesn't work either. I want to open one link after another, changing the number of the li element to reach the next link.
for (int footer_links = 1; footer_links < 6; footer_links++) {
WebElement self_service_bi = driver.findElement(By.xpath("//div/div/ul/li['$footer_links']/a"));
self_service_bi.click();
File srcFile1 = ((TakesScreenshot)driver).getScreenshotAs(OutputType.FILE);
File targetFile1 = new File("D:\\DataPineScreenshots\\" + driver.getTitle() + ".png");
FileUtils.copyFile(srcFile1, targetFile1);
driver.navigate().back();
System.out.print(footer_links + "\n");
}
Fix your syntax: replace
By.xpath("//div/div/ul/li['$footer_links']/a")
with
By.xpath("//div/div/ul/li[" + footer_links + "]/a")
driver.findElement will always return the first matching element. Use driver.findElements to get a list of all elements matching the given XPath, and collect their href values up front rather than re-finding and clicking inside the loop; otherwise you keep hitting the same link, and the elements go stale once you navigate away.
Try something like:
List<String> lstUrls = new ArrayList<String>();
List<WebElement> lstEle = driver.findElements(By.xpath("//div/div/ul/li/a"));
for (WebElement element : lstEle)
    lstUrls.add(element.getAttribute("href"));

for (String url : lstUrls) {
    driver.get(url);
    File srcFile1 = ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE);
    File targetFile1 = new File("D:\\DataPineScreenshots\\" + driver.getTitle() + ".png");
    FileUtils.copyFile(srcFile1, targetFile1);
    // no need to navigate back: each URL is loaded directly with driver.get()
    System.out.println(url);
}