Get autocomplete data in Java from a website

I am using Spring Boot and Java. I would like to get data from a website. The website has an autocomplete box that returns data in the browser, and I would like to access this data. What is the easiest way to do this in Java? Is this possible?
I tried using the URL in Postman with a GET request, but I get a 403 error.
e.g. https://www.booking.com/autocomplete_csrf?v=1&lang=en-gb&sid=2939ff8d1a04c7dcecd8206f31a3481e&aid=304142&pid=3ce829a2434a0089&stype=1&src=index&eb=0&e_obj_labels=1&e_tclm=1&e_smmd=2&e_ms=1&e_msm=1&e_themes_msm_1=1&add_themes=1&themes_match_start=1&include_synonyms=1&sort_nr_destinations=1&gpf=1&term=Lond&_=1675662919991
If you look at booking.com, there is an autocomplete box.
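For reference, a minimal sketch of calling such an endpoint with Java 11's built-in HttpClient, assuming the 403 comes from missing browser-like headers; the User-Agent value is an assumption, and the sid/CSRF parameters from the example URL may be session-bound and stop working outside a browser:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AutocompleteProbe {
    public static void main(String[] args) throws Exception {
        // Query parameters copied from the browser's network tab; some may expire.
        String url = "https://www.booking.com/autocomplete_csrf?v=1&lang=en-gb&term=Lond";
        HttpRequest request = HttpRequest.newBuilder(URI.create(url))
                .header("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64)") // mimic a browser
                .header("Accept", "application/json")
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode()); // a 403 here means more than headers is being checked
        System.out.println(response.body());
    }
}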

Related

How to fill and submit a Login Form on a Website with Java in Android Studio?

I am currently working on a Java Android app which should connect to a website and get a JSON stream from there. However, you have to log in to the website first. I've made a form in the app where you enter your username and password. My goal is that both values fill the website's form and submit it. Then you are logged in and can make the next HTTP request to get the JSON stream. The website doesn't offer an API or web service. Doing it with WebView is not an option.
Libraries like HttpUnit don't work with Android Studio.
It seems like there is no real alternative to it.
With WebView it would be possible to use JavaScript to fill the form, but I want to avoid that.
Jsoup doesn't seem to work for filling out HTML forms.
I am looking for a solution that gets the username and password fields of the HTML form and submits them via the submit button.
Is there any good alternative to HttpUnit where I can fill and submit web forms with values from user input, selecting the fields by their ID or name?
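As an aside, Jsoup cannot execute JavaScript, but it can POST form fields directly, which is often enough to replicate a login. A minimal sketch, assuming the form posts to a known action URL; the URLs and field names (username, password) are placeholders you would read from the page's HTML:

import java.io.IOException;
import java.util.Map;
import org.jsoup.Connection;
import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;

public class LoginFetcher {
    // Replicates the form submission as a plain POST, then reuses the
    // session cookies for the JSON request. Pure Java, so it runs on Android.
    public static String fetchJsonAfterLogin(String user, String pass) throws IOException {
        Connection.Response login = Jsoup.connect("https://example.com/login") // placeholder URL
                .data("username", user)   // placeholder field names: check the form's HTML
                .data("password", pass)
                .method(Connection.Method.POST)
                .execute();
        Map<String, String> cookies = login.cookies(); // session established by the login

        Document json = Jsoup.connect("https://example.com/data.json") // placeholder URL
                .cookies(cookies)
                .ignoreContentType(true) // Jsoup otherwise refuses non-HTML responses
                .get();
        return json.body().text();
    }
}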

What must I do to receive JSON data from a server in an Android app

I have a PHP script on the server side which executes an SQL query and produces a JSON string from it. But how do I get that code to execute, and then retrieve and decode the JSON in the Android app?
I am quite new to Android, so any help would be much appreciated.
I can't write you the whole tutorial, but I can give you a to-do list against which you can check your progress.
1. Have your application generate the JSON, either as a static text file or as a real-time response.
2. Set up your backend server (you already have an SQL server and PHP, so I guess it is set up, probably WAMP or similar) with the endpoint/URL path that generates the required response, in this case your JSON.
3. Check whether the API is working by using Advanced REST Client, or simply type your endpoint into a browser; if you get the correct response, the API works as expected.
4. Create your Android app with a simple HTTP request/response: send an HTTP request to the URL/path you set up and see whether you get the JSON response (see the sketch after this list).
Please comment here if you need further details; this is relatively easy and should be doable in a day for a beginner.
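A minimal sketch of step 4 using HttpURLConnection and the org.json classes that ship with Android; the endpoint URL is a placeholder, and on Android this must run off the main thread (e.g. in a background executor):

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import org.json.JSONObject;

public class JsonFetcher {
    // Fetches the endpoint with a GET request and parses the body as a JSON object.
    public static JSONObject fetch(String endpoint) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("GET");
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            StringBuilder body = new StringBuilder();
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line);
            }
            return new JSONObject(body.toString());
        } finally {
            conn.disconnect();
        }
    }
}

Usage would be something like JsonFetcher.fetch("https://your-server/api.php"), where the URL is whatever endpoint your PHP script lives at.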

Flickr API URLs not resulting in images

I have integrated Flickr search into my Java application using Scribe. I am using the
flickr.photos.search
API call to get a JSON object as the response. But after parsing the JSON object to get the URL and testing that URL, I get a placeholder .gif image instead of the photo.
I am getting the JSON response correctly, and I am then using the documented URL format
to build the photo URL.
Please tell me whether this is an API problem or whether I am doing something wrong.
The URLs I generate appear correct, since I have checked them using the Flickr API Explorer on the site itself.
Found my problem: I was not placing .jpg at the end of the URL. So much for that...
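For anyone hitting the same thing, a minimal sketch of building the photo URL from the fields returned by flickr.photos.search, per Flickr's documented static-photo URL format; the trailing .jpg is the piece that was missing:

public class FlickrUrls {
    // Builds a static photo URL from the server, id and secret fields
    // returned by flickr.photos.search.
    // Note the trailing .jpg: without it Flickr serves a placeholder .gif.
    public static String photoUrl(String server, String id, String secret) {
        return "https://live.staticflickr.com/" + server + "/" + id + "_" + secret + ".jpg";
    }
}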

How to get the response stream of browser in Java WebDriver program

I need to validate a PDF report that is embedded in the HTML. If I read that URL using:
File file = new File("url");
or
HttpWebConnection.getResponse();
it requests the URL in a separate session, hence it cannot get the file.
Does IEDriver have something like HtmlUnit's
HttpWebConnection.getResponse()
or can somebody suggest an alternative?
Unfortunately it does not.
If you want to get the response code you will need a proxy. If you are using Java, then BrowserMob Proxy is what you need. You may also try making an XMLHttpRequest from JavaScript and getting the status code that way.
You could also stick to the method you are using right now (a separate request from Java) but pass the session cookie along; you can obtain the cookies from WebDriver (see the sketch below).
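A minimal sketch of that last option, assuming the report URL is known; it copies the browser's cookies into a plain Java 11 HttpClient request so the download happens in the same session:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import org.openqa.selenium.Cookie;
import org.openqa.selenium.WebDriver;

public class PdfDownloader {
    // Downloads a URL using the cookies of the current WebDriver session.
    public static byte[] download(WebDriver driver, String pdfUrl) throws Exception {
        StringBuilder cookieHeader = new StringBuilder();
        for (Cookie c : driver.manage().getCookies()) {
            if (cookieHeader.length() > 0) {
                cookieHeader.append("; ");
            }
            cookieHeader.append(c.getName()).append('=').append(c.getValue());
        }
        HttpRequest request = HttpRequest.newBuilder(URI.create(pdfUrl))
                .header("Cookie", cookieHeader.toString()) // reuse the browser session
                .build();
        HttpResponse<byte[]> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofByteArray());
        return response.body(); // feed these bytes to a PDF library for validation
    }
}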

Advice with crawling web site content

I am trying to crawl some website content using a combination of jsoup and Java, save the relevant details to my database, and repeat the activity daily.
But here is the deal: when I open the website in a browser I get the fully rendered HTML (with all the element tags present). The JavaScript part, when I test it, works just fine (it is the part I'm supposed to use to extract the correct data).
But when I do a parse/get with jsoup (from a Java class), only the initial page is downloaded for parsing. In other words, some parts of the website are dynamic; I want that data too, but since it is rendered after the GET, asynchronously on the website, I am unable to capture it with jsoup.
Does anybody know a way around this? Am I using the right toolset? More experienced people, I'd appreciate your advice.
You need to check first whether the website you're crawling requires any of the following to show all of its content:
Authentication with login/password
Some sort of session validation in the HTTP headers
Cookies
Some sort of time delay to load all the content (sites heavy on JavaScript libraries, CSS and asynchronous data may need this)
A specific User-Agent header
A proxy password if, for example, you're inside a corporate network security configuration
If anything on this list is needed, you can supply that data via the parameters of your Jsoup.connect() call (see the sketch after the link below). Please refer to the official doc:
http://jsoup.org/cookbook/input/load-document-from-url
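A minimal sketch of passing those parameters through Jsoup's Connection API; the URL, header and cookie values are placeholders:

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;

public class Crawler {
    // Fetches a page while supplying the checklist items from the answer above.
    public static Document fetch() throws java.io.IOException {
        return Jsoup.connect("https://example.com/page")                // placeholder URL
                .userAgent("Mozilla/5.0 (Windows NT 10.0; Win64; x64)") // specific User-Agent
                .header("Authorization", "Bearer <token>")              // placeholder session header
                .cookie("JSESSIONID", "<session-id>")                   // placeholder cookie
                .timeout(30_000)                                        // wait longer for slow responses
                .get();
    }
}

Note that jsoup still fetches only the initial HTML: content injected by JavaScript after page load will not appear, no matter which parameters you set.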
