Is it possible to get cookies from a Flash application using Java? - java

I'm trying to get data from a website, but first I need to log in to the site using Java. The script worked until now, but the site has since installed an anti-bot system. Until now the procedure was simple: I created an HttpStreamWriter and submitted my details to the login.php page, then saved the cookies, and later, when I wanted to get data from the site, I resubmitted the cookie from the login.php page. But now there is a problem: the anti-bot system.
I'm not sure, but I think this is the system:
https://github.com/yuri-gushin/Roboo/blob/master/Roboo.pm
The anti-bot system creates a cookie called anti-bot, and I can't access the page without that cookie. The problem is that the cookie is generated by a Flash application only after the page loads, so I can't get the cookie from the page itself.
Any ideas how to "hack" this? Thanks!

What you need is cookie extraction; there are examples of how to do it on the Oracle site.
That is, you need to connect to the site and browse the response headers until you find Set-Cookie. Once you have the correct HTTP header, you'll be able to parse it very easily.
After that, you'll have to set it back on your further requests.
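For illustration, here is a minimal sketch using plain HttpURLConnection; the URLs are placeholders and the code simply keeps the first Set-Cookie value it finds:

import java.net.HttpURLConnection;
import java.net.URL;
import java.util.List;

public class CookieGrabber {
    public static void main(String[] args) throws Exception {
        // Hypothetical login page; replace with the real login.php URL.
        HttpURLConnection login = (HttpURLConnection)
                new URL("https://example.com/login.php").openConnection();

        // Browse the response headers until Set-Cookie.
        String sessionCookie = null;
        List<String> setCookies = login.getHeaderFields().get("Set-Cookie");
        if (setCookies != null && !setCookies.isEmpty()) {
            // Keep only the "name=value" part, dropping attributes like Path or Expires.
            sessionCookie = setCookies.get(0).split(";", 2)[0];
        }

        // Set it back on the next request.
        HttpURLConnection data = (HttpURLConnection)
                new URL("https://example.com/data.php").openConnection();
        if (sessionCookie != null) {
            data.setRequestProperty("Cookie", sessionCookie);
        }
        System.out.println("Response code: " + data.getResponseCode());
    }
}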
Edit
Flash cookies, or Local Shared Objects, are stored in AMF. AMF can be used to store anything; the problem with your use case is that you don't know which values (or maybe class instances) have been serialized...
However, you could try with AMFConnection to retrieve information (it will take time, at least to include all the necessary libs). But I won't bet on that.
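If you do want to try it, AMFConnection ships with BlazeDS and a call looks roughly like the sketch below; the gateway URL and remote method name are pure guesses here, and as said this is unlikely to surface the anti-bot cookie:

import flex.messaging.io.amf.client.AMFConnection;

public class AmfProbe {
    public static void main(String[] args) throws Exception {
        AMFConnection amf = new AMFConnection();
        try {
            // Hypothetical AMF gateway; the real endpoint would have to be sniffed from the site.
            amf.connect("https://example.com/messagebroker/amf");
            // Destination and method names are made up, for illustration only.
            Object result = amf.call("someService.someMethod");
            System.out.println(result);
        } finally {
            amf.close();
        }
    }
}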
Could you contact the webmaster to get some information about that? Or doesn't this website have any login API?

Related

Authenticate to Autodesk

We are developing a Java application that is supposed to show models from the user's store.
Initially, I'm trying to allow users to log in using their Autodesk account and check whether they are entitled to access my app.
I couldn't find any good example showing how it is done; I just want to confirm that what I will be doing is the recommended approach, or whether there are better options.
First, on app start, I will show an embedded webbrowser that will open
"https://developer.api.autodesk.com/authentication/v1/authorize?response_type=code&client_id=XXX&redirect_uri=XXX&scope=XXX"
The app will get the URL from our server (so it is not saved locally), and the callback points to an API on our server. Then, as the user logs in and consents, it will get the code from the URL, close the login dialog, and continue to get the bearer token using plain REST APIs against /authentication/v1/gettoken.
As I said, I'm not 100% sure whether this is the approved way, or even whether it is doable, so I thought I'd check before we implement it.
After that I will just use rest apis to browse and get the model.
Any thoughts or complaints?
Thanks in advance
Rest assured that the workflow proposed here is orthodox and well "approved" by our official tutorials:
https://forge.autodesk.com/en/docs/oauth/v2/tutorials/get-3-legged-token/
http://learnforge.autodesk.io/#/oauth/3legged/
Unfortunately the code sample for that bit is in Node and we are still working on a Java equivalent.
Some of our endpoints require 3-legged OAuth to access personal data - see here for an example, and you can always refer to the authentication context section of each endpoint for the OAuth flow required.
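In the meantime, the code-for-token exchange itself is just a form-encoded POST, so a rough Java sketch along these lines should work (parameter names follow the v1 gettoken documentation; the credentials are placeholders and should live on your server, not in the client):

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.util.Scanner;

public class ForgeTokenExchange {
    // Placeholder credentials - keep the real ones server-side.
    static final String CLIENT_ID = "XXX";
    static final String CLIENT_SECRET = "XXX";
    static final String REDIRECT_URI = "XXX";

    public static String exchangeCodeForToken(String code) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(
                "https://developer.api.autodesk.com/authentication/v1/gettoken").openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

        String body = "client_id=" + URLEncoder.encode(CLIENT_ID, "UTF-8")
                + "&client_secret=" + URLEncoder.encode(CLIENT_SECRET, "UTF-8")
                + "&grant_type=authorization_code"
                + "&code=" + URLEncoder.encode(code, "UTF-8")
                + "&redirect_uri=" + URLEncoder.encode(REDIRECT_URI, "UTF-8");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes("UTF-8"));
        }

        // The JSON response contains access_token, refresh_token and expires_in.
        try (Scanner s = new Scanner(conn.getInputStream(), "UTF-8").useDelimiter("\\A")) {
            return s.hasNext() ? s.next() : "";
        }
    }
}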

Advice on crawling website content

I was trying to crawl some website content using a combination of jsoup and Java, save the relevant details to my database, and repeat the activity daily.
But here is the deal: when I open the website in a browser I get the fully rendered HTML (with all element tags present). The JavaScript part works just fine when I test it (the one I'm supposed to use to extract the correct data).
But when I do a parse/get with jsoup (from a Java class), only the initial page is downloaded for parsing. In other words, there are some dynamic parts of the website whose data I want, but since they are rendered after the GET, asynchronously on the website, I'm unable to capture them with jsoup.
Does anybody know a way around this? Am I using the right toolset? More experienced people, I'd appreciate your advice.
You need to check first whether the website you're crawling requires any of the following to show all its content:
Authentication with login/password
Some sort of session validation in the HTTP headers
Cookies
Some sort of time delay to load all the content (sites heavy on JavaScript libraries, CSS and asynchronous data may need this)
A specific User-Agent browser
A proxy password if, for example, you're behind a corporate network security configuration
If anything on this list is needed, you can provide that data as parameters in your jsoup.connect(), as in the sketch below. Please refer to the official doc:
http://jsoup.org/cookbook/input/load-document-from-url
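For illustration, a jsoup fetch that supplies a session cookie, a specific User-Agent, an extra header and a longer timeout might look like this (all values are placeholders):

import org.jsoup.Jsoup;
import org.jsoup.nodes.Document;

public class CrawlerExample {
    public static void main(String[] args) throws Exception {
        Document doc = Jsoup.connect("https://example.com/members/list") // hypothetical target
                .userAgent("Mozilla/5.0 (Windows NT 10.0; Win64; x64)")  // a specific User-Agent
                .cookie("PHPSESSID", "your-session-id")                  // session cookie, if required
                .header("Accept-Language", "en-US")                      // extra HTTP header
                .timeout(30000)                                          // generous timeout in ms
                .get();
        System.out.println(doc.title());
    }
}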

Java - reading data from website requiring login

I'm basically trying to use Java to read some data from my school's website (homework assignments, which lessons I have and when, etc.) for personal use. However, my school requires one to be logged in to access this information.
Could anyone point me in the right direction for logging in with code and accessing this information?
Thanks,
Mike.
Apache has an API for simulating an HTTP client.
Link: http://hc.apache.org/httpcomponents-client-ga/
You need to find out how the server handles logins: what authentication token it uses (cookies, URL session ID, etc.).
Then you can send the server a login HTTP form submit (as you would have done manually) and save that token.
With the token you can then access the secured pages.
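A rough sketch with Apache HttpClient 4.x, assuming a plain cookie-based form login (the URLs and form field names are made up and need to match your school's login page):

import java.util.Arrays;

import org.apache.http.client.entity.UrlEncodedFormEntity;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.impl.client.BasicCookieStore;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.message.BasicNameValuePair;
import org.apache.http.util.EntityUtils;

public class SchoolSiteClient {
    public static void main(String[] args) throws Exception {
        // The cookie store keeps the session cookie between requests.
        BasicCookieStore cookieStore = new BasicCookieStore();
        try (CloseableHttpClient client = HttpClients.custom()
                .setDefaultCookieStore(cookieStore).build()) {

            // Submit the login form as you would have done manually.
            HttpPost login = new HttpPost("https://school.example.com/login");
            login.setEntity(new UrlEncodedFormEntity(Arrays.asList(
                    new BasicNameValuePair("username", "mike"),
                    new BasicNameValuePair("password", "secret"))));
            try (CloseableHttpResponse response = client.execute(login)) {
                EntityUtils.consume(response.getEntity());
            }

            // The same client (and its cookies) can now fetch the secured pages.
            HttpGet homework = new HttpGet("https://school.example.com/homework");
            try (CloseableHttpResponse response = client.execute(homework)) {
                System.out.println(EntityUtils.toString(response.getEntity()));
            }
        }
    }
}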
I would use HtmlUnit, which is a programmatic web browser.
You tell it to load the authentication page, fill the form with your credentials and click the submit button. Then you click on links, just as you would with a real browser, but programmatically, using Java instructions.
And it even supports JavaScript, if necessary.
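A minimal HtmlUnit sketch of that flow; the form index, field names and link text are guesses that would need to match the real page:

import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlForm;
import com.gargoylesoftware.htmlunit.html.HtmlPage;

public class HtmlUnitLogin {
    public static void main(String[] args) throws Exception {
        try (WebClient webClient = new WebClient()) {
            // Load the authentication page.
            HtmlPage loginPage = webClient.getPage("https://school.example.com/login");

            // Fill the form with your credentials (names are hypothetical).
            HtmlForm form = loginPage.getForms().get(0);
            form.getInputByName("username").type("mike");
            form.getInputByName("password").type("secret");

            // Click the submit button, just like in a real browser.
            HtmlPage homePage = form.getInputByValue("Log in").click();

            // Follow links programmatically from here.
            HtmlPage homework = homePage.getAnchorByText("Homework").click();
            System.out.println(homework.getTitleText());
        }
    }
}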

Is there a way to access cookies stored in Chrome from a Java program

Here is my ideal situation. I log into www.philstockworld.com using Chrome. Once logged in, I start up my Java application, which uses the cookies just stored by Chrome. Then my Java application goes to work. So here is my question.
Here is what my program can do now: I can log in to the website using whatever browser I want, then look up the value of the PHPSESSID cookie and start up my app using that value. Then my app can do what it needs to. I can also supply my app with my username and password and have it log in, then store the returned PHPSESSID cookie and do what it needs to. However, what I would like to have happen is that I log in to the website using a browser, and then my app starts and uses the PHPSESSID cookie from my browser session, without me having to look it up and copy it.
Is there a way for my java application to get the value of that cookie, without me having to manually type it in?
The location of the Cookies file is:
On Linux:
$HOME/.config/google-chrome/Default/Cookies
For other OS's see the user data page on chromium.org.
However, the file is stored in a binary format, so it's going to be hard for you to load the data within.
Joel's answer tells you where the cookie data is stored. This data is a sqlite3 database file. See this question for how to read a sqlite3 database.
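For example, with the sqlite-jdbc driver on the classpath you can query the cookies table directly. One caveat: newer Chrome versions encrypt the cookie value into a separate encrypted_value column, and the database can be locked while Chrome is running, so treat this as a sketch:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ChromeCookieReader {
    public static void main(String[] args) throws Exception {
        // Linux path from the answer above; adjust for your OS and Chrome profile.
        String cookiesDb = System.getProperty("user.home")
                + "/.config/google-chrome/Default/Cookies";

        try (Connection conn = DriverManager.getConnection("jdbc:sqlite:" + cookiesDb);
             PreparedStatement stmt = conn.prepareStatement(
                     "SELECT name, value FROM cookies WHERE host_key LIKE ?")) {
            stmt.setString(1, "%philstockworld.com%");
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    // On recent Chrome builds, value may be empty and encrypted_value used instead.
                    System.out.println(rs.getString("name") + " = " + rs.getString("value"));
                }
            }
        }
    }
}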
I can't find how/where Chrome stores cookies, on Linux at least. Chances are that they won't be cached as simple plain-text files and thus won't be easily readable. You say you don't want to hard-code your username/password in your Java app - but why do you have to do this? You could just pass them as arguments to your app.

Client HTTP Post to external sites

Is there any web language that allows the client itself to make HTTP POSTs to external sites?
I know that JavaScript does this with XMLHttpRequest, but it does not allow cross-domain posting, unless the recipient domain wants to allow the sending domain.
I want to post data to an external site (that I don't control) and have the request be authenticated with what the client's browser already has (cookies, etc).
Is this possible? I tried cURL but it seems to make a server HTTP post, not a client HTTP post.
Edit:
A bit more insight of what I am trying to do:
I am trying to POST JSON to the website using the user's session (I said cookies but I believe they are PHP sessions, which I guess I still consider cookies).
The website does NOT check the Referer header (poor security #1)
I can execute javascript and html on the webpage using my personal homepage (poor security #2)
The JSON code will still work even if the content-type is form (poor security #3)
There is no security checking at all, just PHP session checking.
The form idea is wonderful and it works. The problem again is that it's JSON. So having sent the post data as foo={"test":"123", "test2":"456"}, the whole foo= part messes it up. Plus, forms seem to turn JSON into form encoding, so it's sending:
foo=%7B%22test%22%3A+%22123%22%2C+%22test2%22%3A+%22456%22%7D
when I need it to send:
{"test":"123", "test2":"456"}
So with everything known, is there a better chance of sending JSON or not?
I don't think so: you won't get hold of the user's auth cookies for the third-party site from the server side (because of the same-origin policy), and you can't make Ajax requests to the third-party site.
The best you can do is probably create a <form> (maybe in an <iframe>), point it at the third-party site, populate it with data, and have the user submit it (or auto-submit it). You will not be able to get hold of the request results programmatically (again because of the same-origin policy), but maybe it'll do - you can still show the request results to the user.
I think for obvious reasons this is not allowed. If this were allowed, what would stop a malicious person from posting form data from a person's browser to any number of sites in some hidden iframe or popup window?
If this is a design of your application you need to rethink what you are trying to accomplish.
EDIT: As #Pekka was pointing out I know you can submit a form to a remote site using typical form submits. I was referring to using some client side ajax solution. Sorry for the confusion.
You should follow the way OpenID and other single-sign-on systems work. With OpenID, your website POSTs a token to the OpenID service and in return gets an authentication result. Refer to the "How Does it Work?" section here.
Yes, you can use a special flash library that supports cross-domain calls: YUI connection manager
Added: not sure about the cookie authentication issue though...
The client cannot post to an external site directly; it's a breach of basic cross-domain security models. The exception is loading JavaScript via JSONP. What you describe would require access to a user's cookies for another website, which is impossible as the browser only allows cookie access within the same domain/path.
You would need to use a server-side proxy to make cross-domain requests, but you still cannot access external cookies: http://jquery-howto.blogspot.com/2009/04/cross-domain-ajax-querying-with-jquery.html
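For completeness, a server-side proxy in Java is just an endpoint that replays the body to the third-party site. A rough servlet sketch is below; note that it sends your server's request, not the user's, so the user's cookies for the external site never come along:

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class JsonProxyServlet extends HttpServlet {
    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws IOException {
        // Hypothetical external endpoint.
        HttpURLConnection conn = (HttpURLConnection)
                new URL("https://external.example.com/api").openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/json");

        // Forward the JSON body received from the browser.
        copy(req.getInputStream(), conn.getOutputStream());

        // Relay the external site's response back to the browser.
        resp.setStatus(conn.getResponseCode());
        resp.setContentType("application/json");
        copy(conn.getInputStream(), resp.getOutputStream());
    }

    private static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
        }
        out.flush();
    }
}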
