I have a Java application that scans a website using Selenium, crawling all of the site's pages. Some of these pages are generated dynamically by selecting a combination of values and clicking buttons.
The purpose of this application is to crawl through all the pages of the website and save the HTML source and a screenshot of every page it comes across. The content and structure of these webpages keep changing over time.
The application runs fine, but the methods that call the WebDriver and fetch the elements to enter a set combination of values and click buttons to reach all the pages need to be updated quite often, because the HTML structure of the website changes frequently.
Now my questions are:
How can I test the functionality of my Java methods that call the WebDriver using unit tests?
How should I approach unit tests that check the stability of my code for finding HTML elements, filling in values, clicking buttons, and getting the HTML source?
Currently I save a sample HTML file and test my code against it. But since I have to update my code along with every HTML structure change, I also have to update my unit tests (the values on the page change as well), which defeats the purpose of unit tests.
Please help me find an efficient or correct approach to testing my code using unit tests.
I have been working on the same feature in this project, Revolance UI Monitoring, to review changes in the UI across several versions.
To answer your question: I solved the testing problem by using PhantomJS with a mocked website. Since you designed the mocked website yourself, you know what the expected outcome should be.
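As a rough illustration of the idea (not the exact setup from Revolance UI Monitoring), a unit test can drive a small fixture page that is bundled with the test resources, so the expected outcome only changes when you change the fixture. The file names and locators below are hypothetical, and HtmlUnitDriver is used here just to keep the test headless; a PhantomJS-backed driver would work the same way.

```java
import static org.junit.Assert.assertEquals;

import java.nio.file.Paths;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.htmlunit.HtmlUnitDriver;

public class CrawlerNavigationTest {

    private WebDriver driver;

    @Before
    public void setUp() {
        // Headless driver with JavaScript enabled, so the test can run on a build server
        driver = new HtmlUnitDriver(true);
    }

    @Test
    public void fillsFormAndReachesResultPage() {
        // mock-search.html is a small page you control, checked into src/test/resources (hypothetical file);
        // its form submits to mock-results.html, whose title is "Results" (also hypothetical)
        driver.get(Paths.get("src/test/resources/mock-search.html").toUri().toString());

        // Exercise the same navigation logic the crawler uses
        driver.findElement(By.id("query")).sendKeys("value");
        driver.findElement(By.id("submit")).click();

        // Because you authored the mocked site, the expected outcome stays stable
        assertEquals("Results", driver.getTitle());
    }

    @After
    public void tearDown() {
        driver.quit();
    }
}
```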
I hope this helps!
Below are my requirements:
(1) I need to do UI performance testing.
(2) At the end of the test I want a JMX file to be generated; I already have a batch file that will convert it to a JTL and then to an HTML report.
What I don't know yet:
(1) I browsed through a lot of links that explained how to execute WebDriver code using JUnit in JMeter. However, I don't want to do that (run WebDriver code in JMeter); I want my code to be standalone, using HtmlUnit (a headless browser) to perform authentication and then the remaining click actions (clicking multiple links). Behind the scenes, JMeter should record the performance of every page and dump the results at the end of the test. It should also work irrespective of TestNG/JUnit. Does anyone know if this is possible and can redirect me to the appropriate link? Thanks!
JMeter has nothing in common with UI performance testing; as per the JMeter main page:
JMeter is not a browser, it works at protocol level. As far as web-services and remote services are concerned, JMeter looks like a browser (or rather, multiple browsers); however JMeter does not perform all the actions supported by browsers. In particular, JMeter does not execute the Javascript found in HTML pages. Nor does it render the HTML pages as a browser does (it's possible to view the response as HTML etc., but the timings are not included in any samples, and only one sample in one thread is ever displayed at a time).
Both JUnit and TestNG automatically record test execution duration, so you can build your tests in such a way that each measurable action is annotated with the @Test annotation and its time is recorded automatically.
In addition, you can consider using the Navigation Timing API from your WebDriver tests in order to get more information about page loading events.
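As an illustration, Navigation Timing values can be read from a WebDriver session through JavascriptExecutor; the browser choice and URL below are placeholders, not anything prescribed by this answer.

```java
import org.openqa.selenium.JavascriptExecutor;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class NavigationTimingExample {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://example.com"); // placeholder page under test

            JavascriptExecutor js = (JavascriptExecutor) driver;
            // Read the W3C Navigation Timing marks exposed by the browser
            long navigationStart = (Long) js.executeScript(
                    "return window.performance.timing.navigationStart;");
            long responseStart = (Long) js.executeScript(
                    "return window.performance.timing.responseStart;");
            long loadEventEnd = (Long) js.executeScript(
                    "return window.performance.timing.loadEventEnd;");

            System.out.println("Backend time (ms):   " + (responseStart - navigationStart));
            System.out.println("Page load time (ms): " + (loadEventEnd - navigationStart));
        } finally {
            driver.quit();
        }
    }
}
```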
If you still want to run JMeter tests from Java code, check out the Five Ways To Launch a JMeter Test without Using the JMeter GUI article; it covers both running an existing .jmx file and creating a new JMeter test purely through the JMeter API from Java. However, that is not really applicable to your use case. The recommended way is to:
Create the main load using JMeter
Measure client-side performance using Selenium
I'm looking for a solution to the following:
Basically I am running automated tests using Selenium & TestNG. I have a report and a monitor set up (an HTML file that I can view to see the progress of my tests); when a test is passed/failed/skipped it gets appended to the HTML file until the suite is finished.
When the project is fully finished and implemented, I want to run the tests outside of work hours for various reasons.
Therefore what I want to achieve is a mobile application where I can log in and view my tests' progress.
To achieve this (well, from the plan I have thought up; someone else might be able to point me in a better direction) I plan on finding a way to host the results on a webserver, where they can be accessed by the mobile application and converted into something viewable on its front end. The timing might not be 100% accurate with this method, given how long the results take to go from TestNG to the server to the mobile application, but as long as it is within reason it should be OK.
So the question is how can I store a live feed of my TestNG results on a webserver? Or even locally at the moment just for testing purposes.
Thanks.
I would log your results to the database of your choice and then create a mobile-friendly webpage that queries the database and summarizes the run.
If that's too much work, can you not just post your HTML file to a webserver that you have access to from your mobile device?
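For the first option, here is a minimal sketch of a TestNG listener that publishes each result as soon as it finishes; the publish step is a placeholder you would replace with a database insert or an HTTP POST to your server, and it assumes a recent TestNG version where the other ITestListener methods have default implementations. Register the listener in testng.xml with a <listener> element.

```java
import org.testng.ITestListener;
import org.testng.ITestResult;

public class LiveResultListener implements ITestListener {

    @Override
    public void onTestSuccess(ITestResult result) {
        publish(result, "PASSED");
    }

    @Override
    public void onTestFailure(ITestResult result) {
        publish(result, "FAILED");
    }

    @Override
    public void onTestSkipped(ITestResult result) {
        publish(result, "SKIPPED");
    }

    private void publish(ITestResult result, String status) {
        // Placeholder: replace with an INSERT into your database or an HTTP POST to your server
        String record = result.getTestClass().getName() + "." + result.getName()
                + " -> " + status
                + " (" + (result.getEndMillis() - result.getStartMillis()) + " ms)";
        System.out.println(record);
    }
}
```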
I need to code a bot that does the following:
Go to a JSP page and search for something by:
1. writing something in a search box
2. clicking the search button (submit button)
3. clicking one of the resulting buttons/links (same JSP page with different output)
4. getting the entire HTML of the new page (same JSP page with different output)
The 4th one can be done with screen scraping, and I do not think I need help with it. But I need some guidance to do items 1 to 3. Any links, or even just some keywords that will help me Google and learn about it, will be appreciated. I plan to do this with Java.
My suggestion is to use Selenium (http://docs.seleniumhq.org/download/).
Install Selenium IDE in your Firefox; it can record what you do on a website, store it as a script, and replay it.
This video (http://www.youtube.com/watch?v=gsHyDIyA3dg) is gonna be helpful if you are a beginner.
And if you want to do it in Java, it's easy: just export the scripts from Selenium IDE to JUnit WebDriver code.
Of course, you can also use the Selenium Java WebDriver directly to write a program that operates on the website.
Selenium automates browsers. That's it. What you do with that power is entirely up to you.
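For reference, a minimal WebDriver sketch covering steps 1 to 4; the URL and locators are placeholders you would replace with the real ones from the JSP page.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class SearchBot {
    public static void main(String[] args) {
        WebDriver driver = new FirefoxDriver();
        try {
            // Open the JSP page and type into the search box (step 1)
            driver.get("http://example.com/search.jsp");                    // placeholder URL
            driver.findElement(By.name("query")).sendKeys("search term");   // placeholder locator

            // Click the search/submit button (step 2)
            driver.findElement(By.name("submit")).click();                  // placeholder locator

            // Click one of the result links on the refreshed page (step 3)
            driver.findElement(By.linkText("First result")).click();        // placeholder locator

            // Grab the full HTML of the resulting page (step 4)
            String html = driver.getPageSource();
            System.out.println(html);
        } finally {
            driver.quit();
        }
    }
}
```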
The above steps can be done using Selenium (a browser-automation tool with Java bindings, commonly used for testing).
Even points 1 to 3 are screen scraping - you're figuring out (using either manual or automated means) what's on the page and performing actions on it. You could try exploring Apache HttpClient for an easy way to run HTTP requests and get the responses.
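As a rough sketch of that route (assuming Apache HttpClient 4.x and a placeholder URL), fetching a page looks like this; note that, unlike a browser, no JavaScript is executed, so you only get the raw HTML the server returns.

```java
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public class FetchPage {
    public static void main(String[] args) throws Exception {
        try (CloseableHttpClient client = HttpClients.createDefault()) {
            // Placeholder URL; the query parameter stands in for the search-box input
            HttpGet request = new HttpGet("http://example.com/search.jsp?query=term");
            try (CloseableHttpResponse response = client.execute(request)) {
                // The raw HTML returned by the server
                String html = EntityUtils.toString(response.getEntity());
                System.out.println(html);
            }
        }
    }
}
```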
I hope you're doing this for legitimate purposes - screen scraping is almost always frowned upon if done without permission.
I have a problem that doesn't seem to be answered clearly on Stack Overflow. I want to download a page using Java and retrieve some data from it in order to feed some values into an application that I am developing. The page belongs to a betting site, so it contains JavaScript methods that change the betting values.
In order to do some tests I downloaded the page manually using Ctrl-S and then wrote a program (with FileReader, BufferedReader, etc.) that retrieves the data. This worked perfectly. So my fallback was to make a bash script that downloads the page, executed every time the user opens my application.
After that I searched for ways to download the page programmatically (I tried Jsoup, URL, ...). What I noticed is that the JavaScript variable values couldn't be printed, because the JavaScript code wasn't executed.
What I want to know is whether there is some way to programmatically download the executed website (i.e. the page with the JavaScript values already computed) without having to run a bash script every time before someone opens my app.
Try HtmlUnit. It is mainly used for automated testing but should fit your purpose well too!
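A minimal sketch of that approach, assuming HtmlUnit 2.x and a placeholder URL: the key points are enabling JavaScript and giving background scripts a chance to finish before reading the DOM.

```java
import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

public class ExecutedPageDownloader {
    public static void main(String[] args) throws Exception {
        try (WebClient webClient = new WebClient()) {
            webClient.getOptions().setJavaScriptEnabled(true);             // run the page's scripts
            webClient.getOptions().setThrowExceptionOnScriptError(false);  // tolerate script errors on real-world sites

            HtmlPage page = webClient.getPage("http://example.com/odds");  // placeholder URL
            webClient.waitForBackgroundJavaScript(5_000);                  // let AJAX calls finish

            // asXml() returns the DOM as it looks after the JavaScript has executed
            System.out.println(page.asXml());
        }
    }
}
```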
I want to extract HTML data from a website using Java. The problem is that the webpage keeps loading more content each time the user scrolls to the bottom of the page; the number of times it scrolls down is fixed. My Java code can extract only the first part. How do I extract the content from the remaining scrolls? Is there a way to load the whole page at once with Java? Any help would be appreciated :)
This might be the type of thing that PhantomJS (http://phantomjs.org/) was designed for. It will crawl entire web pages and even execute JavaScript, using a "real" browser in headless mode. I suggest stopping what you're doing with Java and take a look at PhantomJS instead. It could save you a LOT of time. :)
This type of behavior is implemented in the browser, interpreting the user's scrolling actions to load more content via AJAX and dynamically modifying the in-memory DOM in the browser. Consider that your Java runs in a web container on the server, and that web container (i.e. Tomcat, JBoss, etc) provides a huge amount of underlying code so your app doesn't have to worry about the plumbing.
Conceptually, a similar thing occurs at the client, with the DHTML web page running in its own "container" (the browser), which provides a wealth of functionality, from UI to networking, to DOM, etc. If you remove the browser from the equation and replace it with a Java program, you will need to provide the equivalent of the browser in which the DHTML/Javascript can execute.
I believe that HtmlUnit may fit the bill, but I have not worked with it personally.
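Since the question says the number of scrolls is fixed, another option is to drive a real browser with Selenium, trigger the scrolls yourself, and only then grab the page source. A rough sketch; the URL, scroll count, and wait time are placeholders.

```java
import org.openqa.selenium.JavascriptExecutor;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class InfiniteScrollScraper {
    private static final int SCROLL_COUNT = 5; // placeholder for the fixed number of scrolls

    public static void main(String[] args) throws InterruptedException {
        WebDriver driver = new FirefoxDriver();
        try {
            driver.get("http://example.com/feed"); // placeholder URL
            JavascriptExecutor js = (JavascriptExecutor) driver;

            for (int i = 0; i < SCROLL_COUNT; i++) {
                // Scroll to the bottom so the page fires its "load more" AJAX call
                js.executeScript("window.scrollTo(0, document.body.scrollHeight);");
                Thread.sleep(2_000); // crude wait; an explicit wait on new elements is more robust
            }

            // The DOM now contains the content added by every scroll
            String html = driver.getPageSource();
            System.out.println(html);
        } finally {
            driver.quit();
        }
    }
}
```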