Is it possible to capture user input/actions with Selenium WebDriver, in the same way that you can use the Selenium IDE for recording / creating tests?
i.e. when the user enters a URL, clicks a link, fills in a text box, clicks a button, and so on.
I'd like to be able to capture these actions using the WebDriver rather than just using the Selenium IDE, as I want to integrate with other classes available in my Java application.
I attempted to offer a viable solution in Record Actions using Selenium
Hope this helps.
You can't 'record' a set of actions with Selenium WebDriver, you will need to write those steps manually.
Strictly speaking, you can simulate user input by using the WebDriver API in your chosen language (C#, Java, PHP, Ruby, Python, Perl or JavaScript), and it vaguely resembles using the DOM. If it suits your requirements, you could use configuration files to supply some of your user input (a small sketch of that follows the examples below).
Navigate to a URL:
WebDriver driver = new FirefoxDriver();
driver.get("url");
Click a link/button:
WebElement element = driver.findElement(By.id("coolestWidgetEvah"));
element.click();
Enter Text in a field:
WebElement element = driver.findElement(By.id("coolestWidgetEvah"));
element.sendKeys("userinput");
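And if you do go the configuration-file route mentioned above, plain java.util.Properties is usually enough. This is just a sketch; test.properties and its baseUrl/searchTerm keys are made-up examples:

import java.io.FileInputStream;
import java.util.Properties;

public class TestConfig {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // Hypothetical file with lines like:
        //   baseUrl=http://example.com
        //   searchTerm=cheese
        props.load(new FileInputStream("test.properties"));
        System.out.println(props.getProperty("baseUrl"));
        System.out.println(props.getProperty("searchTerm"));
    }
}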
For more information on the API, Selenium HQ is pretty definitive:
http://seleniumhq.org/docs/03_webdriver.html#introducing-the-selenium-webdriver-api-by-example
If you're going from Selenium IDE to writing tests, it'd be really useful to check out the page object pattern, as I've found it makes your tests more maintainable in the long run. This link is a good starting point because it gives an overview and a visual representation of what you get by following the pattern:
http://blog.josephwilk.net/cucumber/page-object-pattern.html
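As a flavour of the pattern, here is a minimal sketch of a page object; the LoginPage class and its element ids (username, password, login) are made-up examples, not taken from any real app:

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

// Hypothetical page object: it hides the locators so tests read as intent.
public class LoginPage {
    private final WebDriver driver;

    public LoginPage(WebDriver driver) {
        this.driver = driver;
    }

    public void loginAs(String username, String password) {
        driver.findElement(By.id("username")).sendKeys(username);
        driver.findElement(By.id("password")).sendKeys(password);
        driver.findElement(By.id("login")).click();
    }
}

A test then just calls new LoginPage(driver).loginAs("user", "secret") instead of repeating the locators everywhere.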
Hope that helps.
As far as I'm aware, there isn't an easy way to do it - but recording in the IDE and exporting as a Java file has worked well for me (File -> Export test case as...). I usually export to C# but have used it with Java.
I am working on a little app for myself. I am trying to get a list of links from a site, for example: http://kinox.to/Stream/Prison_Break.html
If you hover over the big window in the middle that says kinox.to best online, it shows the link that I want in the bottom left. The problem is that if I look at the HTML file I can't find the link anywhere. I guess it has something to do with the site using JavaScript or Ajax.
Is it possible to somehow get the link using JSoup, or are there any other Java libraries that could help me?
I did not look closely into the page you are trying to load, but here is what I think the problem may be: the link is loaded/generated dynamically via JavaScript. Jsoup does not run JavaScript, so you can't find the link in the HTML.
Two possible solutions:
1) Use something like Selenium WebDriver to access the content. The Java bindings let you remote-control a real browser, which should have no problems loading the page and running all the scripts within it. Solution 1 is simple to program, but runs slowly. It may depend on an external browser program that must be installed on the machine. An alternative to WebDriver is the JavaFX WebKit engine, in case you are on Java 8.
2) Analyse the traffic and the JavaScript on the page and find out where the link comes from. This may take a bit of time to figure out, but once you succeed you can use Jsoup to get all the data you need (a rough sketch follows below). This solution should run much faster than solution 1.
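A minimal Jsoup sketch for solution 2, assuming you have already found the real endpoint the page's JavaScript calls; the URL below is only a placeholder:

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;
import org.jsoup.nodes.Element;

public class LinkScraper {
    public static void main(String[] args) throws Exception {
        // Placeholder: replace with the URL the site's JavaScript/Ajax actually requests.
        Document doc = Jsoup.connect("http://example.com/ajax-endpoint")
                .userAgent("Mozilla/5.0")
                .get();
        // Print the absolute href of every anchor in the response.
        for (Element link : doc.select("a[href]")) {
            System.out.println(link.absUrl("href"));
        }
    }
}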
One solution, and probably the easiest, would be to use Selenium:
WebDriver driver = new FirefoxDriver();
driver.get("http://kinox.to/Stream/Prison_Break.html");
String mylink = driver.findElement(By.cssSelector("#AjaxStream > a")).getText();
// Note: getText() returns the anchor's visible text; use getAttribute("href") if you want the URL itself.
I need to code a bot that does the following:
Go to a JSP page and search for something by:
1. writing something in a search box
2. clicking the search button (a submit button)
3. clicking one of the resulting buttons/links (same JSP page with different output)
4. getting the entire HTML of the new page (same JSP page with different output)
The 4th one can be done with screen scraping and I do not think I need help with it. But I need some guidance on points 1 to 3. Any links, or just some keywords that will help me Google and learn about it, will be appreciated. I plan to do this with Java.
My suggestion is to use Selenium (http://docs.seleniumhq.org/download/).
Install Selenium IDE in your Firefox; it can record what you do on a website, store it as a script, and replay it.
This video (http://www.youtube.com/watch?v=gsHyDIyA3dg) will be helpful if you are a beginner.
And if you want to do it in Java, it's easy: just export the scripts in Selenium IDE to JUnit WebDriver code.
Of course, you can also use the Selenium Java WebDriver API directly to write a program that operates on the website, roughly as sketched below.
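A rough sketch of steps 1-4 with the WebDriver API; the URL and the locators (q, btnSearch, the link text) are placeholders you would swap for the ones on your JSP page:

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class SearchBot {
    public static void main(String[] args) {
        WebDriver driver = new FirefoxDriver();
        driver.get("http://example.com/search.jsp");              // placeholder URL

        driver.findElement(By.name("q")).sendKeys("something");   // 1. type into the search box
        driver.findElement(By.name("btnSearch")).click();         // 2. click the submit button
        driver.findElement(By.linkText("First result")).click();  // 3. click one of the result links

        String html = driver.getPageSource();                     // 4. the HTML of the new page
        System.out.println(html);
        driver.quit();
    }
}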
Selenium automates browsers. That's it. What you do with that power is entirely up to you.
The above steps can be done using Selenium (a browser automation/testing tool with Java bindings).
Even points 1 to 3 are screen scraping - you're figuring out (using either manual or automated means) what's in the page and performing actions on it. You could try exploring Apache HttpClient for an easy way to issue HTTP requests and read the responses.
I hope you're doing this for legitimate means - screenscraping is almost always frowned upon if done without permission.
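If you do try the HttpClient route, a minimal GET looks roughly like this (HttpClient 4.x; the URL and query string are placeholders, and a real search form might need an HttpPost with parameters instead):

import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public class HttpFetch {
    public static void main(String[] args) throws Exception {
        try (CloseableHttpClient client = HttpClients.createDefault()) {
            HttpGet get = new HttpGet("http://example.com/search.jsp?q=something"); // placeholder
            try (CloseableHttpResponse response = client.execute(get)) {
                // The body is the raw HTML you would then scrape.
                String html = EntityUtils.toString(response.getEntity());
                System.out.println(html);
            }
        }
    }
}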
Language - Java
IDE - Eclipse
Tool - Selenium WebDriver
I have a test scenario where clicking on a link opens a new window showing PDF content. The PDF shows a form with Save and Cancel buttons. We don't get any element identifiers using Firebug for the elements shown in the new window. How can I write a script to tell the driver to identify the fields in the PDF, input something there, and then click the Save button?
I'm not sure you can identify fields in a PDF using WebDriver, since in most cases the PDF is embedded in the browser and its fields are not HTML components.
One thing you might want to check out is AutoIt scripting, calling the script from your host program, in your case a Java program. I have not tried using AutoIt for PDFs, but I have used it for other purposes and found that it is not difficult to learn and use.
You can call AutoIt executable from java program as follows:
Runtime.getRuntime().exec("checkPDF.exe");
Since it is an executable, you might have portability issues if you plan on running your webdriver script on other platforms.
Not going to work with Selenium. PDFs are usually displayed using native desktop applications/browser plugins, and that is nothing Selenium can handle in general. BTW: what do you intend to test in a PDF form? Seems pointless to me (at first sight).
What you CAN do is: take a desktop screenshot using AWT. And you should be able to use native desktop methods to simulate mouse clicks and key presses. But that is, well, not so nice [tm] because you may need to record the click coordinates and re-record them every time the PDF changes.
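A bare-bones sketch of that AWT approach using java.awt.Robot; the click coordinates are invented and would have to be recorded for your particular PDF layout:

import java.awt.Rectangle;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.event.InputEvent;
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

public class DesktopDriver {
    public static void main(String[] args) throws Exception {
        Robot robot = new Robot();

        // Grab the whole desktop, embedded PDF viewer included.
        Rectangle screen = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
        BufferedImage capture = robot.createScreenCapture(screen);
        ImageIO.write(capture, "png", new File("screen.png"));

        // Simulate a click at hard-coded coordinates (e.g. where the Save button sits).
        robot.mouseMove(400, 500);
        robot.mousePress(InputEvent.BUTTON1_DOWN_MASK);
        robot.mouseRelease(InputEvent.BUTTON1_DOWN_MASK);
    }
}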
Maybe related: Interacting with a PDF popup in Selenium
I am developing a Java application. I have a scenario where I need to take a screenshot of the page at a URL that comes in to the server.
Is there any Java (or any language) browser library to load web pages and take screenshots of the loaded page? It would be nice if the library allowed DOM traversal.
Update:
Java (or any language): any other language is not a problem, but the library should cooperate with Java.
I have tried to set up Qt Jambi and spent a lot of time on it, but the result is nothing.
If you can provide any concrete material on setting up Jambi, it would be appreciated.
I also gave spynner.py a try. My native language is Java and I thought I could use spynner.py with Jython, but PyQt cannot be used with Jython, so I am not expecting any answers related to Python.
Basically, I need a library to do:
Take a screenshot.
Do some DOM traversal.
Execute some JavaScript.
Get the result of the executed JS code.
Thanks.
I appreciate all the responses. I ended up with PhantomJS. It fits my needs well. It's a command-line tool.
Selenium/Webdriver provides all this functionality.
WebDriver provides a simple API allowing you to "drive" a browser instance. Many browsers are supported.
See here for a simple example:
http://seleniumhq.org/docs/03_webdriver.html#getting-started-with-selenium-webdriver
Traversal of the DOM using the "By" locators:
Good examples here: http://www.qaautomation.net/?p=388
driver.findElement(By.name("q"));
Execution of Javascript:
http://code.google.com/p/selenium/wiki/FrequentlyAskedQuestions#Q:_How_do_I_execute_Javascript_directly?
WebDriver driver; // Assigned elsewhere
JavascriptExecutor js = (JavascriptExecutor) driver;
js.executeScript("return document.title");
Screenshot capture:
http://seleniumhq.org/docs/04_webdriver_advanced.html#taking-a-screenshot
File scrFile = ((TakesScreenshot)driver).getScreenshotAs(OutputType.FILE);
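Putting those pieces together, a rough end-to-end sketch (Google's homepage is just an example target):

import java.io.File;
import java.nio.file.Files;
import java.nio.file.Paths;
import org.openqa.selenium.By;
import org.openqa.selenium.JavascriptExecutor;
import org.openqa.selenium.OutputType;
import org.openqa.selenium.TakesScreenshot;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class PageSnapshot {
    public static void main(String[] args) throws Exception {
        WebDriver driver = new FirefoxDriver();
        driver.get("http://www.google.com");                        // load the page

        driver.findElement(By.name("q")).sendKeys("webdriver");     // locate an element and interact

        JavascriptExecutor js = (JavascriptExecutor) driver;
        Object title = js.executeScript("return document.title");   // run JS and read back the result
        System.out.println("Title via JS: " + title);

        File src = ((TakesScreenshot) driver).getScreenshotAs(OutputType.FILE);
        Files.copy(src.toPath(), Paths.get("screenshot.png"));      // save the screenshot
        driver.quit();
    }
}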
In Java, you should read the following Stack Overflow posts:
Programmatic web browser Java library
Take a screenshot of a webpage with JavaScript?
Embed a web browser within a java application
Because you say "or any lang":
In Python, you have Spynner:
Spynner is a stateful programmatic web browser module for Python with Javascript/AJAX support based upon the QtWebKit framework.
According to the documentation, here's a small snippet :
import spynner
browser = spynner.Browser()
browser.load("http://www.wordreference.com")
browser.runjs("console.log('I can run Javascript!')")
browser.runjs("_jQuery('div').css('border', 'solid red')") # and jQuery!
browser.select("#esen")
browser.fill("input[name=enit]", "hola")
browser.click("input[name=b]")
browser.wait_page_load()
print browser.url, len(browser.html)
browser.close()
This site does the screenshot job:
Tutorial:
http://www.paulhammond.org/webkit2png/
The program:
http://www.paulhammond.org/2009/03/webkit2png-0.5/webkit2png-0.5.txt
Could it be any easier? :)
There are some other tools mentioned at that page:
"
If you use a mac, but don't like the command line then you may want to try Paparazzi or Little Snapper.
If you use linux you may be more interested in khtml2png, Matt Biddulph's Mozilla screenshot script or Roland Tapken's QT Webkit script.
"
You could use Rhino or Gecko for the JavaScript execution.
For DOM traversal there are many options, but if you are using Rhino you could use jQuery to make it even easier!
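If you do go the Rhino route, here is a bare-bones sketch of evaluating JavaScript from Java and reading the result back. Note that Rhino gives you no DOM out of the box, so for jQuery-style traversal you would need something like Env.js loaded into the scope first:

import org.mozilla.javascript.Context;
import org.mozilla.javascript.Scriptable;

public class RhinoDemo {
    public static void main(String[] args) {
        Context cx = Context.enter();
        try {
            Scriptable scope = cx.initStandardObjects();
            // Evaluate a script and capture its result back in Java.
            Object result = cx.evaluateString(scope, "var x = 6 * 7; x;", "demo", 1, null);
            System.out.println(Context.toString(result)); // prints 42
        } finally {
            Context.exit();
        }
    }
}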
Hope that works out for you!
If you need a screenshot, I guess the quality of rendering is important for you.
We had a similar scenario. What we ended up doing was running Firefox in headless mode, actually browsing the web page, and grabbing a screenshot in memory. It is not trivial, but I can give you more details if you want to go for it.
I am working in Selenium RC. Can anyone please let me know how to write an XPath for a button in Selenium (Java)?
You should develop the script in the Selenium IDE (download) before porting it to Selenium RC. In Selenium IDE, when you click anything on the webpage, it should automatically generate some kind of selector for the element you clicked. Then, once you've recorded all the events, you Format it in whatever language you're using, and then you copy and paste it to your Selenium RC code.
But the recorder JavaScript isn't foolproof (e.g. if you click on a div that causes some XMLHttpRequest or setTimeout, it won't be recorded). Or, the click may be recorded but you may not like the selector that Selenium chooses for the element. In either case, you'll have to write your own selector based on the DOM structure. To see the DOM structure, open Firebug if you're in Firefox (F12), or open the Inspector if you're on Chrome (Ctrl-Shift-J). Fortunately, Selenium understands a bunch of selector syntaxes, so you can use CSS selectors if you don't know XPath.
If you do decide to use XPath, you'll have to learn it first. I haven't found any good tutorials (and I'm not a fan of w3schools), but feel free to use a bookmarklet I wrote to test XPaths. You'll probably end up with something like //button[.="text on button"] or //input[@value="text on button"].
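Plugged into the Selenium RC Java client, that might end up looking roughly like this; the host/port, base URL and button text are placeholders:

import com.thoughtworks.selenium.DefaultSelenium;
import com.thoughtworks.selenium.Selenium;

public class ButtonClickTest {
    public static void main(String[] args) {
        // Assumes a Selenium RC server is already running on localhost:4444.
        Selenium selenium = new DefaultSelenium("localhost", 4444, "*firefox", "http://example.com/");
        selenium.start();
        selenium.open("/");
        // XPath locator matching a <button> by its visible text.
        selenium.click("xpath=//button[text()='Search']");
        selenium.waitForPageToLoad("30000");
        selenium.stop();
    }
}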
You can find a button's XPath by using Firebug, which is an add-on for Firefox, and as the above answer says, Selenium IDE is another, easier option for finding it.