Is there a way in Selenium (Java) to get the "page source" as shown in the Elements panel (F12) in Chrome?
I have to test an application whose page is heavily modified by JavaScript. I already tried:
String html = (String)((JavascriptExecutor)driver).executeScript("return document.getElementsByTagName('html')[0].innerHTML");
and
String html = driver.getPageSource();
but both show the "effective" HTML.
Try to wait until the expected elements are loaded - see https://www.seleniumhq.org/docs/04_webdriver_advanced.jsp
E.g.
driver.get("http://somedomain/url_that_delays_loading");
WebElement myDynamicElement = (new WebDriverWait(driver, 10))
        .until(ExpectedConditions.presenceOfElementLocated(By.id("myDynamicElement")));
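Once that wait has passed, the JavaScript-modified DOM is in place, so re-reading it should match what the Elements panel shows. A minimal sketch, assuming a Selenium 4 Duration-based wait and the usual imports (the URL and element id are the placeholders from the example above):
driver.get("http://somedomain/url_that_delays_loading");

// Wait until the dynamically created element exists in the DOM
new WebDriverWait(driver, Duration.ofSeconds(10))
        .until(ExpectedConditions.presenceOfElementLocated(By.id("myDynamicElement")));

// Both of these now reflect the JavaScript-modified DOM, not just the original response
String renderedHtml = (String) ((JavascriptExecutor) driver)
        .executeScript("return document.documentElement.outerHTML");
String pageSource = driver.getPageSource();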
Related
I'm practicing and trying to parse behance.net to retrieve .jpg files.
First, I tried JSOUP, but I only received JS code without anything useful. Then I tried Selenium:
System.setProperty("webdriver.chrome.driver", "S:\\behance-id\\src\\main\\resources\\chromedriver.exe");
WebDriver driver = new ChromeDriver();
driver.get("https://www.behance.net/gallery/148589707/Hercules-and-Randy");
String str = driver.getPageSource();
And I got the same result. Through Google Chrome's Inspect option I found what I need:
But I cannot access this page source via Selenium, JSOUP, or other tools.
I only receive this with <script> tags:
Is it possible?
That page loads its resources dynamically, after the original HTML, so you should use waits in Selenium. This is the Java example from the documentation for waiting for an element to be loaded on the page:
WebDriver driver = new ChromeDriver();
driver.get("https://google.com/ncr");
driver.findElement(By.name("q")).sendKeys("cheese" + Keys.ENTER);
// Initialize and wait until the element (link) becomes clickable - timeout in 10 seconds
WebElement firstResult = new WebDriverWait(driver, Duration.ofSeconds(10))
.until(ExpectedConditions.elementToBeClickable(By.xpath("//a/h3")));
// Print the first result
System.out.println(firstResult.getText());
The documentation can be found at https://www.selenium.dev/documentation/webdriver/waits/
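Once the page has rendered you can pull the image URLs out of the live DOM. A rough sketch of that step, assuming the gallery pictures end up in plain <img> elements (the locator may need adjusting for Behance's actual markup), with java.util.List and java.util.stream.Collectors imported:
// Wait until images are present in the DOM, then collect the .jpg sources
new WebDriverWait(driver, Duration.ofSeconds(10))
        .until(ExpectedConditions.presenceOfAllElementsLocatedBy(By.tagName("img")));

List<String> jpgUrls = driver.findElements(By.tagName("img")).stream()
        .map(img -> img.getAttribute("src"))
        .filter(src -> src != null && src.contains(".jpg"))
        .collect(Collectors.toList());

jpgUrls.forEach(System.out::println);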
I'm trying to select an element based on its text contents. I am using XPath to achieve this.
I am just puzzled, as this should work:
WebElement link = obj.driver.findElement(By.xpath("//div[contains(text(), 'Notifications')]"));
I'll even copy the HTML code:
<div class="linkWrap noCount">Notifications <span class="count _5wk0 hidden_elem uiSideNavCountText">(<span class="countValue fsm">0</span><span class="maxCountIndicator"></span>)</span></div>
The div element has the word "Notifications" inside it, so why doesn't it work?
Go to this page on Facebook: https://www.facebook.com/settings
Use this Chrome extension to highlight any area via XPath.
You have a space before the word Notifications:
WebElement link = obj.driver.findElement(By.xpath("//div[contains(text(), 'Notifications')]"));
You should also add a wait for the element before trying to find it:
WebDriverWait wait = new WebDriverWait(webDriver, timeoutInSeconds);
wait.until(ExpectedConditions.visibilityOfElementLocated(By.xpath("//div[contains(text(), 'Notifications')]")));
WebElement link = obj.driver.findElement(By.xpath("//div[contains(text(), 'Notifications')]"));
I found the issue with the help of some amazing people in this community.
Ok, so my element was in an iFrame.
In order to access the element, I must first switch to the iFrame:
WebElement iframe = obj.driver.findElement(By.xpath("//iframe[@tabindex='-1']"));
obj.driver.switchTo().frame(iframe);
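For completeness, the full pattern is: switch into the frame, locate the element, then switch back before touching anything outside it. A short sketch using the locators from this question:
// Switch into the iframe that contains the target element
WebElement iframe = obj.driver.findElement(By.xpath("//iframe[@tabindex='-1']"));
obj.driver.switchTo().frame(iframe);

// The element inside the frame can now be located
WebElement link = obj.driver.findElement(By.xpath("//div[contains(text(), 'Notifications')]"));

// Return to the main document before working with elements outside the frame
obj.driver.switchTo().defaultContent();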
I would like to automate the authentication process on https://appleid.apple.com/ using the Java Selenium WebDriver, but the HTML elements of the form are not loaded in the DOM.
To my knowledge, Selenium WebDriver interprets the markup as a browser does: it applies the CSS styles, runs the JavaScript, and the dynamically rendered content is added to the DOM.
Why are the HTML elements not loaded in the DOM?
How can I proceed to fix this and load all elements in the DOM, exactly like a browser?
N.B.: the https://appleid.apple.com/ website uses Mustache.JS (logic-less templates).
public static void main(String[] args) {
System.setProperty("webdriver.chrome.driver", "chromedriver.exe");
ChromeOptions options = new ChromeOptions();
options.addArguments("--headless", "--disable-gpu", "--window-size=1920,1200", "--ignore-certificate-errors");
WebDriver driver = new ChromeDriver(options);
driver.get("https://appleid.apple.com/");
waitForPageLoadComplete(driver, 30);
// can't find the input name element
WebElement inputName = driver.findElement(By.id("account_name_text_field"));
System.out.println(driver.getPageSource());
}
The element you are trying to find is inside an iFrame. You will need to switch to this iFrame first and then proceed with finding the element as you already have.
driver.switchTo().frame("aid-auth-widget-iFrame");
WebElement inputName = driver.findElement(By.id("account_name_text_field"));
You can find some additional information about switching to iFrames here: https://www.guru99.com/handling-iframes-selenium.html
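Because the script runs headless and the sign-in widget loads after the page, it is safer to combine the frame switch with an explicit wait. A sketch of that combination, assuming the Selenium 4 Duration-based wait (the frame name and field id are the ones from the answer above; the sample input is purely illustrative):
// Wait for the authentication iframe to be available, then switch into it
new WebDriverWait(driver, Duration.ofSeconds(30))
        .until(ExpectedConditions.frameToBeAvailableAndSwitchToIt("aid-auth-widget-iFrame"));

// Wait for the account name field inside the frame before using it
WebElement inputName = new WebDriverWait(driver, Duration.ofSeconds(30))
        .until(ExpectedConditions.visibilityOfElementLocated(By.id("account_name_text_field")));
inputName.sendKeys("user@example.com"); // illustrative value only

// Switch back to the top-level document when done inside the frame
driver.switchTo().defaultContent();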
I'm trying to verify page header titles on two different pages.
1. Accessing the site and verifying the page header title.
2. Entering an input in the text field and clicking on the submit button.
3. Verifying the page header title.
The pages from step 1 and step 3 have the same XPath for the header titles; when I try to access and verify them, I get an error message:
expected(expected title) but found(unexpected title)
Code:
/html/body/div[4]/div/div[1]/div/h1/span
/html/body/div[4]/div/div[1]/div/h1/span
Screenshots of the HTML code:
It's probably a timing issue; the driver is still reading the old header. You can use expected conditions to wait for the text to become what you are expecting:
WebDriverWait wait = new WebDriverWait(driver, 10);
wait.until(ExpectedConditions.textToBePresentInElementLocated(By.xpath("/html/body/div[4]/div/div[1]/div/h1/span"), "expected title"));
Or wait for the first header to become stale
WebElement firstHeader = driver.findElement(...);
WebDriverWait wait = new WebDriverWait(driver, 10);
wait.until(ExpectedConditions.stalenessOf(firstHeader));
WebElement secondHeader = driver.findElement(...);
Edit
If it's the same element, you can use textToBePresentInElement:
WebElement header = driver.findElement(...);
WebDriverWait wait = new WebDriverWait(driver, 10);
wait.until(ExpectedConditions.textToBePresentInElement(header, "expected title"));
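Put together for the scenario in the question (verify, submit, verify again), the staleness variant would look roughly like this; the input and submit locators are placeholders:
// Step 1: read and verify the first header
WebElement firstHeader = driver.findElement(By.xpath("/html/body/div[4]/div/div[1]/div/h1/span"));
System.out.println("First title: " + firstHeader.getText());

// Step 2: enter the input and submit (placeholder locators)
driver.findElement(By.id("inputField")).sendKeys("some input");
driver.findElement(By.id("submitButton")).click();

// Step 3: wait for the old header to go stale, then verify the new one
WebDriverWait wait = new WebDriverWait(driver, 10);
wait.until(ExpectedConditions.stalenessOf(firstHeader));
WebElement secondHeader = driver.findElement(By.xpath("/html/body/div[4]/div/div[1]/div/h1/span"));
System.out.println("Second title: " + secondHeader.getText());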
I have tried quite a lot, and I am not even getting any errors, but it is not printing anything (I want it to print the title of the page).
WebDriver driver = new HtmlUnitDriver();
WebElement element = driver.findElement(By.cssSelector("a[href*='Alerts.htm']"));
element.click();
System.out.println(driver.getTitle());
Here is the HTML code (the part I wish to click); there is a title for both the page I want to click and the current page.
<li title="Alerts"><span>Alerts</span></li>
I am not getting any errors, but it should print the title, which it is not doing.
I have followed many sorts of instructions found here and on the web.
Things I have tried so far:
By locator = By.xpath("//li[@title='Alerts']/a");
WebElement element = driver.findElement(locator);
element.click();
WebElement element = driver.findElement(By.partialLinkText("Alert"));
element.click();
Where am I going wrong?
The title of an HTML document is defined within a <title> tag, typically in the <head> section.
This is the title that the getTitle method returns.
See http://www.w3schools.com/tags/tag_title.asp.
I am not sure of this, but I think HtmlUnitDriver is a headless browser instance. Kindly try another browser, Firefox perhaps.
WebDriver driver = new FirefoxDriver();
You first need to open a page in a browser!
WebDriver driver = new ...;   // e.g. new HtmlUnitDriver()
driver.get(url);              // this actually loads the page in the browser
// now you can actually do things
driver.findElement(...);
driver.getTitle();
driver.quit();                // to dismiss the browser
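Applied to the code in the question, the missing piece is a driver.get(...) call before findElement. A sketch with a placeholder start URL, since the question does not show the real one:
// JavaScript enabled so dynamic pages behave more like a real browser
WebDriver driver = new HtmlUnitDriver(true);

// The page has to be loaded before any element can be found on it
driver.get("http://example.com/start-page.htm"); // placeholder - use the real URL

WebElement element = driver.findElement(By.cssSelector("a[href*='Alerts.htm']"));
element.click();

// getTitle() returns the contents of the <title> tag of the page that is now loaded
System.out.println(driver.getTitle());

driver.quit();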