I am trying to upload a sound to myinstants.com using Java and OkHttp.
I used the Chrome dev tools to look at what requests get made and try to recreate them using OkHttp but I'm failing at the login part.
Chrome dev tools tells me that this is the form post, with a content-type of application/x-www-form-urlencoded.
And I try to replicate the post using the following code:
RequestBody loginBody = new FormBody.Builder()
.add("csrfmiddlewaretoken", token) //this token is comes from inside the <input> tag that is retrieved in the HTML of a normal get request to https://myinstants.com/accounts/login and is diffrent every time you load the page
.add("login", username)
.add("password", password)
.add("remember", "on")
.add("next", "/new/")
.build();
Request login = new Request.Builder()
.url("https://www.myinstants.com/accounts/login/?next=/new/")
.addHeader("cookie", CookieHandler.getCookie()) // cookie that is generated from the "set-cookie" response headers of the get request to https://myinstants.com/accounts/login
.addHeader("content-type", "application/x-www-form-urlencoded")
.post(loginBody)
.build();
Response response = new OkHttpClient().newCall(login).execute();
According to the Chrome dev tools, the response to the above post request should have a couple of set-cookie response headers, but they are not present for me.
I don't think the issue is with the cookie I'm using, because it matches what the Chrome dev tools show exactly (apart from a few values that change on every visit), so I think the issue is with the form post. Any ideas what I am doing wrong?
Some servers block the requests if they see you're making the requests from outside a browser.
What often works (but not always) is to try tricking the server into thinking you're using a browser. You can do this by setting the "User-Agent" header.
To do so, open the dev tools of your browser (F12), access the "Network" tab and make a request to any site. Then, look into the "Request Headers" section for the "User-Agent" value. Just copy it and send it with your request.
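With OkHttp that could look like the sketch below; the User-Agent string is only an example, so copy the exact value your own browser sends.
// Same request as in the question, with a browser-like User-Agent added.
// The UA string below is just an example; paste the one from your own browser.
Request login = new Request.Builder()
        .url("https://www.myinstants.com/accounts/login/?next=/new/")
        .addHeader("cookie", CookieHandler.getCookie())
        .addHeader("content-type", "application/x-www-form-urlencoded")
        .addHeader("User-Agent", "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                + "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36")
        .post(loginBody)
        .build();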
If all this fails, it's possible that the site has bot protection based on JavaScript. On this kind of site, the login page contains a script that runs right before the login and generates a random token that you need to send along with the credentials in order to log in successfully. Since you're not using a browser, you can't run that JavaScript and thus can't generate the token.
If that's the case, the best thing you can do is to use a real browser controlled programmatically. For Java you can use Selenium, but I personally prefer to use Puppeteer from NodeJS. In essence they're the same thing, an API to remote control a modified version of the chromium/chrome browser.
But with Puppeteer you have a little more flexibility than with Java, because you don't need to convert between Java and JavaScript objects and vice versa.
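If you go the Selenium route in Java, a login attempt might look roughly like this (a sketch only: it assumes selenium-java plus a ChromeDriver install, the field names come from the form in the question, and the submit selector is a guess):
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

// Sketch: let a real browser do the login so any JavaScript-based protection runs normally.
WebDriver driver = new ChromeDriver();
driver.get("https://www.myinstants.com/accounts/login/?next=/new/");
driver.findElement(By.name("login")).sendKeys(username);
driver.findElement(By.name("password")).sendKeys(password);
driver.findElement(By.cssSelector("form [type='submit']")).click();
// The browser session now holds the cookies; keep driving it to perform the upload.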
I don't know what may be wrong and I don't have time to test your code now. But as a suggestion, you could try to make the upload using another library and see if it works.
I recommend Apache Fluent API:
https://mvnrepository.com/artifact/org.apache.httpcomponents/fluent-hc
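For instance, the login post from the question could be sketched with fluent-hc roughly like this (reusing the token/username/password variables and the CookieHandler helper from your code):
import org.apache.http.client.fluent.Form;
import org.apache.http.client.fluent.Request;

// Rough equivalent of the OkHttp post using the Fluent API; field names are the ones from the question.
String body = Request.Post("https://www.myinstants.com/accounts/login/?next=/new/")
        .addHeader("Cookie", CookieHandler.getCookie())
        .bodyForm(Form.form()
                .add("csrfmiddlewaretoken", token)
                .add("login", username)
                .add("password", password)
                .add("remember", "on")
                .add("next", "/new/")
                .build())
        .execute()
        .returnContent()
        .asString();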
I am developing a website with Java for the backend and Angular for the frontend. There is a situation where some external websites may send data to my website using a POST form. For instance:
▼ General
Request URL: https://myangularwebsite/
Request Method: POST
...
▼ Request Headers
Content-Type: application/x-www-form-urlencoded
Host: myangularwebsite
Origin: https://externalwebsite
Referer: https://externalwebsite/send.form?id=0
...
▼ Form data
ID: 0000000
TIME: 2017.06.04 11:53:58
SIGNATURE: ...geirgmGKFGJWR...
...
Now, I need to capture the form in Angular somehow, send/redirect it to the backend to validate the signature and return the answer back to Angular to proceed working with this website.
I tried posting to my website to test how it might work using Postman, but I get Cannot POST /.
I know how to work with GET and URL query parameters in Angular, but I think I need to process a POST request based on the headers I see in the Chrome DevTools 'Network' section when coming from externalwebsite to myangularwebsite.
Should I dedicate a route in the backend and expose it, for example, .../api/external in my backend and tell these websites to use this link instead of directly posting to my Angular website's homepage?
I have already read another question ( How to read form post data in Angular 2 typescript? ) which is somewhat similar, but I do not think using PHP is the right way for me, as the website I am developing already has an older version written in PHP.
The answer at the link you provided is correct: you cannot do it in just JavaScript; you have to use some server-side code. They mention PHP as an example, but any server-side component will do, and since you have Java on your backend, let it be Java.
So, when an HTTP request comes from an external site, you have to use a server-side component to handle it. But there are some options.
If this request is made through the user's browser (so it is something like a redirect, but using a POST method), then you can do the following: catch that request at your backend, output some JavaScript with some data to the user's browser, and process that data in your Angular code. Or this could be a redirection to your main Angular entry point; it is up to you.
If this request is made by some other means (for example, a server-to-server request made with curl, like a notification from a credit-card processor), with no browser involved, then you don't need any JavaScript (Angular or whatever it could be), as that is only needed for the browser. In this case you just handle the request on your server side.
In both cases, it seems plausible to dedicate some special endpoint for handling (or landing) such externally-originated requests.
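As a rough sketch of what such an endpoint could look like (the /api/external path, the servlet class name, and the signature check are placeholders; the parameter names come from the form data in the question):
import java.io.IOException;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Hypothetical endpoint dedicated to externally-originated POSTs.
@WebServlet("/api/external")
public class ExternalPostServlet extends HttpServlet {

    @Override
    protected void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        // application/x-www-form-urlencoded fields arrive as ordinary request parameters
        String id = req.getParameter("ID");
        String time = req.getParameter("TIME");
        String signature = req.getParameter("SIGNATURE");

        boolean valid = isSignatureValid(id, time, signature);

        // Either answer the caller directly, or send the browser on into the Angular app with the result
        resp.sendRedirect("/?externalId=" + id + "&valid=" + valid);
    }

    private boolean isSignatureValid(String id, String time, String signature) {
        // TODO: plug in the real signature validation here
        return false;
    }
}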
I need to validate a PDF report. I need to get the report that is embedded in the HTML. If I read that URL using:
File file = new File("url");
or
HttpWebConnection.getResponse();
it requests the URL in a separate session, hence it cannot get the file.
Does IEDriver have something like HtmlUnit's
HttpWebConnection.getResponse()
or can somebody suggest an alternative?
Unfortunately it does not.
If you want to get the response code you will need a proxy. If you are using Java, then BrowserMob Proxy is what you need. You may also try making an XMLHttpRequest from JavaScript and getting the status code that way.
You could also stick to the method you are using right now (a separate request from Java) but pass the session cookie along with it (you can obtain the cookies from WebDriver), as sketched below.
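Something along these lines, assuming driver is your existing WebDriver instance and the report URL is a placeholder:
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.stream.Collectors;

// Copy the browser's cookies into a plain Java request so it runs in the same session.
String cookieHeader = driver.manage().getCookies().stream()
        .map(c -> c.getName() + "=" + c.getValue())
        .collect(Collectors.joining("; "));

HttpURLConnection conn = (HttpURLConnection) new URL("http://yourapp/report.pdf").openConnection();
conn.setRequestProperty("Cookie", cookieHeader);
int status = conn.getResponseCode(); // 200 means the PDF is reachable within that session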
Is there any web language that allows the client itself to make HTTP POSTs to external sites?
I know that JavaScript does this with XMLHttpRequest, but it does not allow cross-domain posting, unless the recipient domain wants to allow the sending domain.
I want to post data to an external site (that I don't control) and have the request be authenticated with what the client's browser already has (cookies, etc).
Is this possible? I tried cURL, but it seems to make a server-side HTTP POST, not a client-side one.
Edit:
A bit more insight of what I am trying to do:
I am trying to POST JSON to the website using the user's session (I said cookies but I believe they are PHP sessions, which I guess I still consider cookies).
The website does NOT check the referral (poor security #1)
I can execute javascript and html on the webpage using my personal homepage (poor security #2)
The JSON code will still work even if the content-type is form (poor security #3)
There is no security checking at all, just PHP session checking.
The form idea is wonderful and it works. The problem again is that it's JSON. So, having sent the post data as foo={"test":"123", "test2":"456"}, the whole foo= part messes it up. Plus, forms seem to URL-encode the JSON, so it's sending:
foo=%7B%22test%22%3A+%22123%22%2C+%22test2%22%3A+%22456%22%7D
when I need it to send:
{"test":"123", "test2":"456"}
So with everything known, is there a better chance of sending JSON or not?
I don't think so: you won't get hold of the user's auth cookies for the third-party site from the server side, and you can't make Ajax requests to the third-party site because of the same-origin policy.
The best you can do is probably create a <form> (maybe in an <iframe>), point it at the third-party site, populate it with data, and have the user submit it (or auto-submit it). You will not be able to get hold of the request results programmatically (again because of the same-origin policy), but maybe it'll do: you can still show the request results to the user.
I think for obvious reasons this is not allowed. If it were allowed, what would stop a malicious person from posting form data from a person's browser to any number of sites in some hidden iframe or popup window?
If this is a design of your application you need to rethink what you are trying to accomplish.
EDIT: As @Pekka pointed out, I know you can submit a form to a remote site using a typical form submit. I was referring to using some client-side Ajax solution. Sorry for the confusion.
You should follow the way OpenID and other single-sign-on systems work. With OpenID, your website POSTs a token to the OpenID service and in return gets the authentication result. Refer to the "How Does it Work?" section here.
Yes, you can use a special Flash library that supports cross-domain calls: YUI Connection Manager
Added: not sure about the cookie authentication issue though...
The client cannot post to an external site directly; it's a breach of basic cross-domain security models. The exception is loading JavaScript via JSONP. What you describe would require access to a user's cookies for another website, which is impossible because the browser only allows cookie access within the same domain/path.
You would need to use a server-side proxy to make cross-domain requests, but you still cannot access external cookies: http://jquery-howto.blogspot.com/2009/04/cross-domain-ajax-querying-with-jquery.html
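If you do go the proxy route, it boils down to a small same-origin endpoint on your own server that fetches the remote resource for you. A minimal sketch follows (the /proxy path and the url parameter name are hypothetical); note that it still carries none of the user's cookies for the external site:
import java.io.IOException;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import javax.servlet.ServletException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

// Minimal same-origin proxy: the browser calls /proxy on your domain, the server fetches the target.
@WebServlet("/proxy")
public class ProxyServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        String target = req.getParameter("url"); // validate/whitelist this in real code
        HttpURLConnection conn = (HttpURLConnection) new URL(target).openConnection();
        try (InputStream in = conn.getInputStream()) {
            in.transferTo(resp.getOutputStream()); // relay the body back to the browser
        }
    }
}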
I have a Java/Wicket page that generates a JNLP file that launches my company's software. This class will optionally take some URL parameters and embed them as arguments in the JNLP. When the user launches this JNLP file, the client application will perform some function based on those parameters. If the client software is already running on the machine, hitting the JNLP page will feed these parameters to the running client via a remote call rather than launching a new instance.
This part is where I'm having issues. On IE, Firefox and Chrome I could open a new client, but trying to hit the same URL again would instead return a JNLP file. I found that clearing the browser cache fixes this issue in all browsers. Also, I cannot seem to hit breakpoints in the JNLP class, which reinforces my hunch that this is more of an issue with the request than something strange with Wicket.
I put the following code in my page class, which extends org.apache.wicket.markup.html.WebPage:
@Override
protected void setHeaders(WebResponse response) {
    getPageMap().remove(this);
    HttpServletResponse httpServletResponse = response.getHttpServletResponse();
    if (httpServletResponse != null) {
        httpServletResponse.setDateHeader("Expires", 0);
        httpServletResponse.addHeader("Cache-Control", "no-cache,no-store,private,must-revalidate,max-stale=0,post-check=0,pre-check=0");
        httpServletResponse.addHeader("Keep-Alive", "timeout=3, max=993");
    }
}
This doesn't seem to work, as Firefox 3.6 still seems to cache the result. IE 7 will work, but only after trying the link I create a few times. I don't know a lot about web development and Wicket, and this is new to me, so it's possible I'm missing something simple.
TL;DR: How do I get a Wicket page to not cache on the client browser?
A hack used in some of the Wicket internals (see for example the source of org.apache.wicket.markup.html.image.NonCachingImage) is to add random noise to the URL.
Basically, if you're generating the URLs that the browser calls, you can add a parameter that the web application ignores and that varies randomly, which fools the browser into ignoring its cache.
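In plain Java terms the trick is just to vary the URL, for example (jnlpUrl and the antiCache parameter name are hypothetical; the extra parameter is simply ignored by the server):
import java.util.UUID;

// Append a throwaway random parameter so the browser treats every request as a new URL.
String noCacheUrl = jnlpUrl + (jnlpUrl.contains("?") ? "&" : "?") + "antiCache=" + UUID.randomUUID();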
Please check the following page:
https://web.archive.org/web/20120104201334/http://palisade.plynt.com:80/issues/2008Jul/cache-control-attributes/
Firefox should honor the "Cache-Control" header.
I don't know Wicket very well but have you tried using WebResponse.setLastModifiedTime(Time time)? I know FF sends an If-Modified-Since header to which your server would reply with 304 Not Modified or the normal response.
It would seem natural to me that your server would check the lastModifiedTime on WebResponse to decide.
If that doesn't help I would suggest you get Firebug for Firefox and take a look at the requests and responses.
response.setHeader( "Expires", "0" );
response.setHeader( "Cache-Control", "no-store, no-cache, must-revalidate, max-age=0, private" );
response.setHeader( "Pragma", "no-cache" );
This works with IE, Firefox and so on; the only browser with which it certainly does not work is Konqueror.
Wicket 6.11.0:
Application.get().getResourceSettings().setDefaultCacheDuration(Duration.NONE);
Have you ever tried to load pages using window.location.replace?
I get the following error from the following piece of code. I am trying to log in to the Google Sites service through a GAE app.
"The page you requested is invalid. "
String authenticationUrl = userService.isUserLoggedIn()
? userService.createLogoutURL(MainServlet.MAIN_URL)
: userService.createLoginURL(MainServlet.MAIN_URL+"?close=1");
googleData.setAuthenticationUrl(authenticationUrl);
The complete URL for the login:
https://www.google.com/a/example.com/ServiceLogin?service=ah&passive=true&continue=http://myapp.appspot.com/_ah/login?continue=http://myapp.appspot.com/main%3Fclose%3D1<mpl=ga&ahname=Myapp+Google+Sites&sig=7cbc9f7c9e6ca443ed49f7ce9465e775
I think that you may have misunderstood the use and purpose of createLoginURL. This method is intended to provide a URL that allows someone to log in to your application and your application alone. It does not provide a means to log in to other Google services such as Sites.
It is possible to have your application log on to and access Sites or any other secured web application, but Google AppEngine does not provide a canned means of doing so. You will need to write the code to do it yourself.
Generally, what will happen is that you will request a URL and the response will have an HTTP status code of 302 with the URL of the login page located in the Location header field. You would then send a request to that page which should come back with a 200 response and somewhere inside the body of the response would be a username and password field that you would need to provide and POST back. If the credentials were valid, the server might then return an authentication cookie which you would pass on each subsequent request.
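Very roughly, the skeleton of that flow looks like this (protectedUrl is a placeholder, and a real Google login involves more redirects and form fields than shown):
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.List;

// Follow the first redirect by hand so you can see where the login form lives.
HttpURLConnection first = (HttpURLConnection) new URL(protectedUrl).openConnection();
first.setInstanceFollowRedirects(false);

if (first.getResponseCode() == 302) {
    String loginPageUrl = first.getHeaderField("Location");
    HttpURLConnection loginPage = (HttpURLConnection) new URL(loginPageUrl).openConnection();

    // Cookies handed out here (and after POSTing the credentials form) must be
    // replayed on every subsequent request.
    List<String> cookies = loginPage.getHeaderFields().get("Set-Cookie");
    // ... parse the form in the response body, POST the username/password fields, keep any new cookies ...
}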
If you are versed at all in Python, you can see an example of how this works in some code from my AppEngine MVC framework project. Look at this file:
http://code.google.com/p/gae-mvc-engine/source/browse/trunk/MVCTests.py and check out the ActiontestCase.run_action method. It handles making a request to an AppEngine application that requires authentication. It is not yet terribly well commented -- and for that I apologize -- but I hope that it will provide a useful example, if, indeed, I have understood the nature of your problem correctly.