Using HTTP OPTIONS to retrieve information about REST resources - java

This problem relates to the Restlet framework and Java
When a client wants to discover the resources available on a server, it must send an HTTP request with OPTIONS as the request method. This is fine, I guess, for non-human clients - i.e. code rather than a browser.
The problem I see here is that browsers (used by humans) send GET requests, so they will NOT be able to quickly discover the resources available to them, find extra help documentation, etc., because they do not use OPTIONS as the request method.
Is there a way to make a browser send an OPTIONS request so the server can fire back formatted XML to the client (this is what happens in Restlet - i.e. the server sends all the information back as XML) and display it in the browser?
Or have I got my thinking all wrong - i.e. is the point of OPTIONS that it is meant to be used inside a client's code, and not meant to be read via a browser?

Use the TunnelService (which is already enabled by default) and simply add the method=OPTIONS query parameter to your URL, e.g. /myResource?method=OPTIONS.
(The Restlet FAQ Q19 is a similar question.)

I think OPTIONS is not designed to be 'user-visible'.
How would you dispatch an OPTIONS request from the browser? (Note that the form element only allows GET and POST.)
You could send it using XMLHttpRequest and then get back XML in your JavaScript callback and render it appropriately. But I'm not convinced this is something your user should really know about!
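As the answers above suggest, OPTIONS is normally sent from code rather than a browser. Here is a minimal sketch in plain Java, using the JDK's built-in HTTP server as a stand-in for a Restlet server (the resource path and method list are invented for illustration):

```java
import com.sun.net.httpserver.HttpServer;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;

public class OptionsDemo {
    // Starts a throwaway server whose /items resource answers OPTIONS with an
    // Allow header, then queries it programmatically - no browser involved.
    static String fetchAllowedMethods() throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/items", ex -> {
            if ("OPTIONS".equals(ex.getRequestMethod())) {
                ex.getResponseHeaders().add("Allow", "GET, POST, OPTIONS");
                ex.sendResponseHeaders(204, -1); // no body needed
            } else {
                ex.sendResponseHeaders(405, -1);
            }
            ex.close();
        });
        server.start();
        try {
            URL url = new URL("http://localhost:"
                    + server.getAddress().getPort() + "/items");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("OPTIONS");
            return conn.getHeaderField("Allow");
        } finally {
            server.stop(0);
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("Allow: " + fetchAllowedMethods());
    }
}
```

The Allow header is the standard place for a server to advertise the methods a resource supports in an OPTIONS response.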


How to put my own proxy between any client and any server (via a web page)

What I want to do is build a web application (a proxy) that a user uses to request the web page he wants; my application forwards the request to the main server, modifies the HTML code, and sends the modified version back to the client.
The question now is: how do I keep my application between the client and the main server (for example, when the user clicks any link inside the modified page, makes an AJAX request, submits a form, and so on)?
In other words: how do I guarantee that any request (after the first URL request) from the client is sent to my proxy, and any response comes to my proxy first?
The question is: why do you need a proxy? Why do you want to build it - why not use an already existing one like HAProxy?
EDIT: sorry, I didn't read your whole post correctly. You can start with:
http://www.jtmelton.com/2007/11/27/a-simple-multi-threaded-java-http-proxy-server/
If the user is willing to, or can be forced [1] to, configure his clients (e.g. web browser) to use a web proxy, then your problem is already solved. Another way to do this (assuming the user is cooperative) is to get them to install a trusted browser plugin that dynamically routes selected URLs through your proxy. But you can't do this with an untrusted webapp: the browser sandbox won't (shouldn't) let you.
Doing it without the user's knowledge and consent requires some kind of interference at the network level. For example, a "smart" switch could recognize TCP/IP packets on port 80 and deliberately route them to your proxy instead of the IP address the client's browser specified. This kind of thing is known as "deep packet inspection". It would be very difficult to implement yourself, and it requires significant compute power in your network switch if you are going to achieve high network rates through the switch.
The second problem is that making meaningful on-the-fly modifications to arbitrary HTML + Javascript responses is a really difficult problem.
The final problem is that this is only going to work with HTTP. HTTPS protects against "man in the middle" attacks ... such as this ... that monitor or interfere with the requests and responses. The best you could hope to do would be to capture the encrypted traffic between the client and the server.
[1] The normal way to force a user to do this is to implement a firewall that blocks all outgoing HTTP connections apart from those made via your proxy.
UPDATE
The problem now is: what should I change in the HTML code to force the client to request everything through my app? For example, a link's href attribute might become www.aaaa.com?url=www.google.com, but what should I do for AJAX and forms?
Like I said, it is a difficult task. You have to deal with the following problems:
Finding and updating absolute URLs in the HTML. (Not hard)
Finding and dealing with the base URL (if any). (Not hard)
Dealing with the URLs that you don't want to change; e.g. links to CSS, javascript (maybe), etc. (Harder ...)
Dealing with HTML that is syntactically invalid ... but not to the extent that the browser can't cope. (Hard)
Dealing with cross-site issues. (Uncertain ...)
Dealing with URLs in requests being made by javascript embedded in / called from the page. This is extremely difficult, given the myriad ways that javascript could assemble the URL.
Dealing with HTTPS. (Impossible to do securely; i.e. the user would have to trust the proxy with private info such as passwords, credit card numbers, etc. that are normally sent securely.)
and so on.
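The first item in the list above (rewriting absolute URLs) can be sketched as follows. The proxy base URL and the url= query-parameter convention are invented for this example, and a regex is a simplification - a real proxy should use an HTML parser and also handle <base>, CSS url(...), srcset, and so on:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class UrlRewriter {
    // Matches href/src attributes holding absolute http(s) URLs.
    private static final Pattern ABS_URL =
            Pattern.compile("(href|src)=\"(https?://[^\"]+)\"");

    // Rewrites every absolute URL so it points back at the proxy, with the
    // original target carried in a query parameter.
    static String rewrite(String html, String proxyBase) {
        Matcher m = ABS_URL.matcher(html);
        StringBuilder out = new StringBuilder();
        while (m.find()) {
            String proxied = proxyBase + "?url="
                    + URLEncoder.encode(m.group(2), StandardCharsets.UTF_8);
            m.appendReplacement(out,
                    m.group(1) + "=\"" + Matcher.quoteReplacement(proxied) + "\"");
        }
        m.appendTail(out);
        return out.toString();
    }
}
```

This only covers static markup; as the list notes, URLs assembled by JavaScript at runtime are a much harder problem.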

How to append html to a website response before it reaches the browser in Java?

Recently I used a Mac application called Spotflux. I think it's written in Java (because if you hover over its icon it literally says "java"...).
It's just a VPN app. However, to support itself, it can show you ads while you browse. You can be browsing in Chrome, and the page will load with a banner at the bottom.
Since it is a VPN app, it obviously can control what goes in and out of your machine, so I guess that it simply appends some html to any website response before passing it to your browser.
I'm not interested in making a VPN or anything like that. The real question is: how, using Java, can you intercept the html response from a website and append more html to it before it reaches your browser? Suppose that I want to make an app that literally puts a picture at the bottom of every site you visit.
This is, of course, a hypothetical answer - I don't really know how Spotflux works.
However, I'm guessing that as part of its VPN, it installs a proxy server. Proxy servers intercept HTTP requests and responses, for a variety of reasons - most corporate networks use proxy servers for caching, monitoring internet usage, and blocking access to NSFW content.
As a proxy server can see all HTTP traffic between your browser and the internet, it can modify that HTTP; for instance, a proxy server will often inject an HTTP header. Injecting an additional HTML tag for an image would be relatively easy.
Here's a sample implementation of a proxy server in Java.
There are many ways to do this. Probably the easiest would be to proxy HTTP requests through a web proxy like RabbIT (written in Java). Then just extend the proxy to modify the response being sent back to the browser.
In the case of RabbIT, this can be done either with custom code or with a special Filter. See their FAQ.
WARNING: this is not as simple as you think. Adding an image to the bottom of every page will be hard to do reliably, depending on what kind of HTML is returned by the server. Depending on the CSS, JavaScript, etc. that the remote site uses, you can't just insert the same HTML markup everywhere and expect it to look and behave the same.
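As a sketch of the kind of rewriting such a proxy filter would do (the method name and snippet are invented for illustration, and this naive approach is exactly what the warning above is about - it assumes a well-formed, lowercase closing tag):

```java
public class AdInjector {
    // Naive injection: insert markup just before the closing </body> tag.
    static String injectBeforeBodyEnd(String html, String snippet) {
        int i = html.lastIndexOf("</body>");
        if (i < 0) {
            return html + snippet; // no </body> found: append and hope
        }
        return html.substring(0, i) + snippet + html.substring(i);
    }

    public static void main(String[] args) {
        String page = "<html><body><p>hi</p></body></html>";
        System.out.println(injectBeforeBodyEnd(page, "<img src=\"/banner.png\">"));
    }
}
```

A real filter would also have to cope with uppercase or missing tags, compressed responses, chunked encoding, and Content-Length headers that no longer match the modified body.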

How to validate a request to a JSP page in Struts 1.2

How do you validate a request/response to a JSP page in Struts 1.2?
Short scenario: the response from the action class is captured on the way to the JSP and is tweaked. How do I validate whether that response has been touched? (This is all part of VAPT, so excuse me if something sounds illogical.)
response from the action class is captured on the way to jsp and is tweaked
Are you saying that the Java and the JSPs are not running in the same container, or even the same JVM, and therefore the response goes over the wire, and you're worried about interception there?
Or do you mean that some kind of filter class in your server stack is deliberately altering the response, and you want to validate your own transformation of it?
Or do you mean you're concerned about interception between server and the client, and that what you mean by the word JSP is actually the rendered HTML sent to the client (browser)?
In general, the way you detect tampering is hashing - this is why, when you download an open source project, you will often see a hash published next to it. I don't know of any off-the-shelf solution for this in a normal browser context. Maybe have some JavaScript in a separate tab/frame that checks a hash delivered in a separate request?
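The hashing idea can be sketched with the JDK's MessageDigest. This only shows computing the hash; how the expected hash is delivered to the client without itself being tampered with is the hard part, and is left out:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

public class TamperCheck {
    // Hash a response body with SHA-256; comparing against a hash delivered
    // out-of-band is the usual way to detect modification in transit.
    static String sha256Hex(String body) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-256")
                .digest(body.getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }
}
```

Any change to the body, however small, yields a completely different digest, which is what makes the comparison meaningful.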
But ultimately, if you're that worried about man-in-the-middle attacks, make the request over https. Maybe even use mutual certificates (server sends a cert to browser, and a browser sends cert to the server, as a mutual authentication.)

Client HTTP Post to external sites

Is there any web language that allows the client itself to make HTTP POSTs to external sites?
I know that JavaScript does this with XMLHttpRequest, but it does not allow cross-domain posting, unless the recipient domain wants to allow the sending domain.
I want to post data to an external site (that I don't control) and have the request be authenticated with what the client's browser already has (cookies, etc).
Is this possible? I tried cURL, but it seems to make a server-side HTTP POST, not a client-side one.
Edit:
A bit more insight of what I am trying to do:
I am trying to POST JSON to the website using the user's session (I said cookies but I believe they are PHP sessions, which I guess I still consider cookies).
The website does NOT check the referral (poor security #1)
I can execute javascript and html on the webpage using my personal homepage (poor security #2)
The JSON code will still work even if the content-type is form (poor security #3)
There is no security checking at all, just PHP session checking.
The form idea is wonderful and it works. The problem again is that it's JSON. So having sent the post data as foo={"test":"123", "test2":"456"}, the whole foo= part messes it up. Plus forms form-encode the JSON, so it's sending:
foo=%7B%22test%22%3A+%22123%22%2C+%22test2%22%3A+%22456%22%7D
when I need it to send:
{"test":"123", "test2":"456"}
So with everything known, is there a better chance of sending JSON or not?
I don't think so: you won't get hold of the user's auth cookies for the third-party site from the server side, and you can't make Ajax requests to the third-party site (because of the Same Origin Policy).
The best you can do is probably create a <form> (maybe in an <iframe>), point it at the third-party site, populate it with data, and have the user submit it (or auto-submit it). You will not be able to get hold of the request results programmatically (again because of the Same Origin Policy), but maybe it'll do - you can still show the request results to the user.
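Incidentally, the mangling the asker complains about is just standard application/x-www-form-urlencoded encoding, which a browser applies to every form field before submitting. It can be reproduced with java.net.URLEncoder (field name and payload are the asker's example):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class FormEncodingDemo {
    // This is what the browser does to a form field's value on submit:
    // percent-encode reserved characters and turn spaces into '+'.
    static String asFormField(String name, String json) {
        return name + "=" + URLEncoder.encode(json, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        System.out.println(asFormField("foo", "{\"test\":\"123\", \"test2\":\"456\"}"));
        // -> foo=%7B%22test%22%3A%22123%22%2C+%22test2%22%3A%22456%22%7D
    }
}
```

So a plain form cannot send a raw JSON body; the receiving site would have to decode the field value, or the request would have to be made some other way.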
I think for obvious reasons this is not allowed. If it were allowed, what would stop a malicious person from posting form data from a person's browser to any number of sites in some hidden iframe or popup window?
If this is a design of your application you need to rethink what you are trying to accomplish.
EDIT: As @Pekka pointed out, I know you can submit a form to a remote site using a normal form submission. I was referring to client-side Ajax solutions. Sorry for the confusion.
You should follow the way OpenID and other single-sign-on systems work. With OpenID, your website POSTs a token to the OpenID service and in return gets an authentication result. Refer to the "How Does it Work?" section here.
Yes, you can use a special flash library that supports cross-domain calls: YUI connection manager
Added: not sure about the cookie authentication issue though...
The client cannot post to an external site directly; it would breach the basic cross-domain security model. The exception is loading JavaScript cross-domain via JSONP. What you describe would also require access to a user's cookies for another website, which is impossible because the browser only allows cookie access within the same domain/path.
You would need to use a server-side proxy to make cross-domain requests, but you still cannot access external cookies: http://jquery-howto.blogspot.com/2009/04/cross-domain-ajax-querying-with-jquery.html

Is it possible to know if browser has cookies enabled on the very first access

On the very first access to a website, is it possible to tell from the GET request whether browser cookies are enabled?
Is it possible, particularly in a J2EE context? Server side only (no AJAX/JS solution).
Short answer: No.
The HTTP request simply does not carry that kind of information. It is only implicitly testable by sending a cookie to the client and seeing whether it comes back. There are probably also various JavaScript options, but you explicitly did not want those.
You can send a cookie with the first page and then redirect to some index page. If a client requests any page other than the first without the cookie, then it does not support cookies.
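The redirect trick above can be sketched with the JDK's built-in HTTP server standing in for the web app, and a CookieManager standing in for a cookie-enabled browser (the paths and cookie name are invented for this sketch):

```java
import com.sun.net.httpserver.HttpServer;
import java.io.InputStream;
import java.net.CookieHandler;
import java.net.CookieManager;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;

public class CookieProbe {
    // First request: set a probe cookie and redirect to /check.
    // Second request: if the probe cookie comes back, cookies are enabled.
    static String probe() throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/", ex -> {
            if (ex.getRequestURI().getPath().equals("/check")) {
                String cookies = ex.getRequestHeaders().getFirst("Cookie");
                byte[] body = (cookies != null && cookies.contains("probe=1")
                        ? "enabled" : "disabled").getBytes();
                ex.sendResponseHeaders(200, body.length);
                ex.getResponseBody().write(body);
            } else {
                ex.getResponseHeaders().add("Set-Cookie", "probe=1");
                ex.getResponseHeaders().add("Location", "/check");
                ex.sendResponseHeaders(302, -1);
            }
            ex.close();
        });
        server.start();
        try {
            // A cookie jar makes HttpURLConnection behave like a browser
            // with cookies enabled; without it, /check would see no cookie.
            CookieHandler.setDefault(new CookieManager());
            URL url = new URL("http://localhost:"
                    + server.getAddress().getPort() + "/");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            try (InputStream in = conn.getInputStream()) {
                return new String(in.readAllBytes());
            }
        } finally {
            server.stop(0);
        }
    }
}
```

In a real servlet you would do the same thing with response.addCookie plus a redirect, and check request.getCookies() on the target page.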
