Is there a way to get the request URL in the Chrome browser on page load when I navigate to a certain URL? I tried to do this using Selenium and Java, but it doesn't really do what I want, so I haven't included that code here. I have only included the Chrome developer tools (F12) view showing the Request URL I need to capture.
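One common approach with selenium-java is to enable Chrome's performance log (LoggingPreferences with LogType.PERFORMANCE, passed via the goog:loggingPrefs capability) and read the Network.requestWillBeSent entries, each of which carries a request URL. The driver setup needs a real browser, so the sketch below only shows the log-parsing half in plain Java; the class name and the sample JSON line are illustrative, not taken from the question.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class RequestUrlFromPerfLog {
    // Pull every "url" field out of a performance-log message string.
    static List<String> extractRequestUrls(String logMessage) {
        List<String> urls = new ArrayList<>();
        Matcher m = Pattern.compile("\"url\"\\s*:\\s*\"([^\"]+)\"").matcher(logMessage);
        while (m.find()) {
            urls.add(m.group(1));
        }
        return urls;
    }

    public static void main(String[] args) {
        // Trimmed-down example of what one Network.requestWillBeSent entry looks like.
        String sample = "{\"message\":{\"method\":\"Network.requestWillBeSent\","
                + "\"params\":{\"request\":{\"url\":\"https://example.com/login\"}}}}";
        System.out.println(extractRequestUrls(sample));
    }
}
```

In a real run you would iterate over driver.manage().logs().get(LogType.PERFORMANCE) and feed each entry's message to a parser like this one.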
Related
I am trying to automate a Salesforce application using Selenium, but on a few navigations the error below is shown. Note that this is specific to the Chrome instance opened by Selenium.
"We can't display this page because your browser blocks cross-domain cookies, but you can view this page in Salesforce Classic. Click here to open this page in Salesforce Classic."
Can anyone help me resolve this issue?
PS: This is specific to Chrome and Selenium WebDriver; it works fine in a normal instance of the browser.
I have recorded a scenario in JMeter. I have a webpage containing an iframe, which loads another webpage from the same domain.
"Retrieve All Embedded Resources" is checked, but I don't want the iframe to load. I have tried adding .css,.js.*.png in "URLs must match", but it doesn't work.
You can stop JMeter from downloading all embedded resources inside the iframe; in effect, the iframe won't get loaded.
Please note that the requested page which has the iframe embedded will still show the iframe in the HTML response, but the subsequent calls that the iframe makes to download its embedded resources can be stopped.
Here is a sample iframe example. The editor displayed on the page is inside an iframe, so if you load the page, all of its resources get downloaded.
So let's try this in JMeter:
The results of this call are the same as in the developer console:
Now, block the iframe using the "URLs must match" functionality.
I peeked at the response of the earlier request and blocked the iframe using the regex pattern below:
^(nested_frames)*?
Here is the image:
And here is the response for this request:
I have uploaded the JMX file on Github if you want to play around.
Your requirement seems a little bit weird: a well-behaved JMeter test should have the same network footprint as the real browser (this applies to embedded resources, cookies, cache, headers, etc.), so if the real browser loads the page from the domain you're testing, the JMeter test should do the same.
If you still need to exclude the iframe from your JMeter test you can "blacklist" the "another webpage" from being loaded via "URLs must match" section of the HTTP Request sampler like:
^((?!the-webpage-you-don-want-here).)*$
More information: Excluding Domains from the Load Test
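As a quick sanity check of that pattern, the same negative-lookahead idea can be exercised with plain Java regex. The "iframe-page" fragment below is a placeholder for whatever page you want excluded:

```java
import java.util.regex.Pattern;

public class UrlBlacklistDemo {
    public static void main(String[] args) {
        // Same idea as JMeter's "URLs must match" field: match any URL
        // that does NOT contain the blacklisted fragment anywhere.
        Pattern allow = Pattern.compile("^((?!iframe-page).)*$");

        // A CSS resource passes the filter; the iframe page does not.
        System.out.println(allow.matcher("https://example.com/style.css").matches());
        System.out.println(allow.matcher("https://example.com/iframe-page.html").matches());
    }
}
```

The lookahead (?!iframe-page) must hold before every character the pattern consumes, so any string containing the fragment fails to match and is excluded from download.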
I have to open the Chrome browser using WebDriver with a URL that has a few parameters. Depending upon the parameters, the flow of my website differs. This works correctly when I access my page manually in a browser. Below is an example of the URL:
https://.com/en/account/signinwithticket?ticketId=16795&hash=2ae74c8a8a088a691&checkout=false&redirect=/en/payment
I compared the cookies of the WebDriver browser and the browser I open manually. All the cookies corresponding to the REST parameters are set correctly when I try manually, but when the browser is opened by WebDriver these cookies are not set. Is there a specific ChromeDriver capability or option to solve this issue?
Just as an FYI - I am using the usual way of opening the browser - WebDriver.get("url");
I want to download the source of a webpage to a file (*.htm) (i.e., the entire content with all HTML markup) from this URL:
http://isap.sejm.gov.pl/DetailsServlet?id=WDU20061831353
which works perfectly fine with the FileUtils.copyURLToFile method.
However, the said URL has also some links, for instance one which I'm very interested in:
http://isap.sejm.gov.pl/RelatedServlet?id=WDU20061831353&type=9&isNew=true
This link works perfectly fine if I open it with a regular browser, but when I try to download it in Java by means of FileUtils, I get only a no-content page with the single message "trwa ladowanie danych" (which means: "loading data...") and then nothing happens; the target page is not loaded.
Could anyone help me with this? From the URL I can see that the page uses Servlets -- is there a special way to download pages created with servlets?
Regards --
This isn't a servlet issue - that just happens to be the technology used to implement the server, but generally clients don't need to care about that. I strongly suspect it's just that the server is responding with different data depending on the request headers (e.g. User-Agent). I see a very different response when I fetch it with curl compared to when I load it in Chrome, for example.
I suggest you experiment with curl, making a request which looks as close as possible to a request from a browser, and then fiddling until you can find out exactly which headers are involved. You might want to use Wireshark or Fiddler to make it easy to see the exact requests/responses involved.
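If you'd rather stay in Java while experimenting, the JDK 11+ java.net.http API makes it easy to build a request with browser-like headers. The sketch below only builds the request and prints its headers; the header values are a starting point, not the exact set your browser sends, and the actual client.send call is omitted so nothing touches the network:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class BrowserLikeRequest {
    public static void main(String[] args) {
        // Attach headers copied from (or approximating) a real browser request.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://isap.sejm.gov.pl/RelatedServlet?id=WDU20061831353&type=9&isNew=true"))
                .header("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
                .header("Accept", "text/html,application/xhtml+xml")
                .header("Accept-Language", "pl,en;q=0.9")
                .GET()
                .build();

        // No network call here - just confirm the headers are attached.
        System.out.println(request.headers().map());
    }
}
```

To actually fetch, you would pass this request to HttpClient.newHttpClient().send(...) and compare the body against what the browser shows, adjusting headers one at a time.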
Of course, even if you can fetch the original HTML correctly, there's still all the JavaScript: it would be entirely feasible for the HTML to contain none of the data, but for it to include JavaScript which does the actual data fetching. I don't believe that's the case for this particular page, but you may well find it happens for other pages.
Try using Selenium WebDriver to load the main page first:
import java.util.concurrent.TimeUnit;
import org.openqa.selenium.By;
import org.openqa.selenium.htmlunit.HtmlUnitDriver;

// true enables JavaScript, which this page needs to load its data
HtmlUnitDriver driver = new HtmlUnitDriver(true);
driver.manage().timeouts().implicitlyWait(30, TimeUnit.SECONDS);
driver.get(baseUrl);
and then navigate to the link:
driver.findElement(By.name("name of link")).click();
UPDATE: I checked the following: if I turn off the cookies in Firefox and then try to load my page:
http://isap.sejm.gov.pl/RelatedServlet?id=WDU20061831353&type=9&isNew=true
then I get the incorrect result, just like in my Java app (i.e., the page with the "loading data" message instead of the proper content).
Now, how can I manage the cookies in Java to download this page properly?
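A minimal sketch, assuming the page only needs a session cookie set by an earlier request: install a java.net.CookieManager as the JVM-wide default CookieHandler, which HttpURLConnection (and therefore FileUtils.copyURLToFile, which reads through it) should honor. The JSESSIONID value below is illustrative only:

```java
import java.net.CookieHandler;
import java.net.CookieManager;
import java.net.CookiePolicy;
import java.net.HttpCookie;
import java.net.URI;

public class CookieSetupDemo {
    public static void main(String[] args) {
        // After this call, HTTP connections in this JVM keep cookies
        // between requests automatically.
        CookieManager manager = new CookieManager();
        manager.setCookiePolicy(CookiePolicy.ACCEPT_ALL);
        CookieHandler.setDefault(manager);

        // Cookies from earlier responses land in the store automatically;
        // here one is added by hand just to show the mechanics.
        HttpCookie session = new HttpCookie("JSESSIONID", "abc123"); // illustrative value
        session.setPath("/");
        session.setVersion(0);
        manager.getCookieStore().add(URI.create("http://isap.sejm.gov.pl/"), session);

        System.out.println(manager.getCookieStore().getCookies());
    }
}
```

With the manager installed, fetch the DetailsServlet URL first in the same JVM so the server's real session cookie is captured, then call FileUtils.copyURLToFile on the RelatedServlet URL.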
I need to navigate around a site programmatically in Java, but the site doesn't change the URL when a link is clicked.
Site: http://cliqa.nana10.co.il/
On the right there is a bar with some links; click them and you will see that while the content changes, the URL doesn't. How can I perform this mouse click on one of the links programmatically in Java? I thought about an HTTP POST, but what exactly would I send? An example would be much appreciated.
These links use JavaScript to trigger an AJAX request and refresh only the center of the page. Use Firebug inside Firefox to sniff the network requests and see which requests are executed on each click. Or use a programmatic web browser like HtmlUnit, which will handle JavaScript as your web browser does.
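Once you've sniffed a request, replaying it from Java might look like the sketch below (JDK 11+ java.net.http). The endpoint path and form parameter names are placeholders, not the site's real ones; copy the actual values from the sniffed request. The send call is left out so the example stays offline:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class AjaxReplayDemo {
    public static void main(String[] args) {
        // URL-encoded form body, exactly as seen in the network sniffer
        // (parameter names here are made up for illustration).
        String form = "sectionId=42&page=1";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://cliqa.nana10.co.il/some-ajax-endpoint"))
                .header("Content-Type", "application/x-www-form-urlencoded")
                .header("X-Requested-With", "XMLHttpRequest") // many AJAX frameworks send this
                .POST(HttpRequest.BodyPublishers.ofString(form))
                .build();

        System.out.println(request.method() + " " + request.uri());
    }
}
```

Sending it via HttpClient.newHttpClient().send(request, HttpResponse.BodyHandlers.ofString()) would return the HTML fragment the page splices into its center area.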
You need to look at the actual HTTP request being sent. You can do this in Chrome with the built-in Inspect Element tool, or with Firebug in Firefox (or Live HTTP Headers). I prefer to use Burp Suite's intercepting proxy for this.
You can try Charles (http://www.charlesproxy.com/). Check the section on Java application proxy configuration at http://www.charlesproxy.com/documentation/configuration/browser-and-system-configuration/. You can inspect and change requests sent from your Java application.