In my web app, for some pages, I have two scenarios for browser caching.
Scenario 1:
I want to serve from the browser cache if the data has not been modified on the server side. For example:
User issues the request for all employees
Response returns 10 employees
User issues the request for all employees again
Expectation: this time it should be served from the browser cache
User creates one more employee
User issues the request for all employees again
Expectation: this time the latest data should be served from the server instead of the browser cache
I am planning to use the header below:
response.setHeader("Cache-Control", "no-cache");
As
no-cache is not instructing the browser or proxies about whether or
not to cache the content. It just tells the browser and proxies to
validate the cache content with the server before using it
Scenario 2:
But for some sensitive pages I don't want any caching at all, so I am planning to use the header below:
response.setHeader("Cache-Control", "no-store");
But some articles say it is safer to use the header below to make it work across all browsers, so I am going to use:
response.setHeader("Cache-Control", "no-cache, no-store");
Is my proposed implementation correct?
For Scenario #1 you indeed need to set Cache-Control to no-cache (or set a max-age for even better scalability, though in that case you won't necessarily have the latest value), but you also need to use the ETag HTTP header so that the browser can check whether the content has changed and therefore know whether the cache entry can be reused or not.
For Scenario #2 you need to set Cache-Control to no-store to prevent the browser from caching the data, as that is the standard way; but indeed no-cache, no-store will also help on old browsers if you need to support them too.
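To illustrate the Scenario #1 flow, here is a minimal servlet sketch (the /employees endpoint, the JSON stub and the hash-based validator are assumptions for illustration, not a definitive implementation):

import java.io.IOException;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

@WebServlet("/employees")
public class EmployeeServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException {
        String json = loadEmployeesAsJson();
        // Validator derived from the payload; any stable hash will do.
        String etag = "\"" + Integer.toHexString(json.hashCode()) + "\"";

        response.setHeader("Cache-Control", "no-cache"); // cache, but revalidate on every request
        response.setHeader("ETag", etag);

        // The browser echoes the last ETag it saw in If-None-Match;
        // if it still matches, answer 304 and skip the body entirely.
        if (etag.equals(request.getHeader("If-None-Match"))) {
            response.setStatus(HttpServletResponse.SC_NOT_MODIFIED);
            return;
        }
        response.setContentType("application/json");
        response.getWriter().write(json);
    }

    // Hypothetical data access; replace with the real employee lookup.
    private String loadEmployeesAsJson() {
        return "[{\"id\":1,\"name\":\"Alice\"}]";
    }
}

With this in place, an unchanged employee list yields a 304 and is served from the browser cache, while creating an employee changes the payload hash and forces a fresh 200 response.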
Related
Is there a maximum number of cookies which you can set on a single HTTP response? If yes, how many?
I'm trying to create 2 cookies in one response, but for some reason only one is getting created. I'm using the code below.
Cookie cookie = new Cookie("wNote", "1530571761964");
cookie.setMaxAge(2592000);
cookie.setPath("/myWebsite/");
response.addCookie(cookie);
the other cookie is the JSESSIONID which is being created by the server automatically. In the response headers under Set-Cookie I can see only JSESSIONID.
Apache Tomcat 8.0.27
Google Chrome 67.0
If you are using Tomcat, then HttpServletResponse.addCookie(javax.servlet.http.Cookie) sets only one cookie per call, but the method can be called multiple times:
The servlet sends cookies to the browser by using the HttpServletResponse.addCookie(javax.servlet.http.Cookie) method, which adds fields to HTTP response headers to send cookies to the browser, one at a time.
This method can be called multiple times to set more than one cookie.
The browser is expected to support 20 cookies for each Web server, 300 cookies total, and may limit cookie size to 4 KB each.
Cookies Doc
OK, going deeper into cookies, I checked RFC 2109:
Practical user agent implementations have limits on the number and
size of cookies that they can store. In general, user agents' cookie
support should have no fixed limits. They should strive to store as
many frequently-used cookies as possible. Furthermore, general-use
user agents should provide each of the following minimum capabilities
individually, although not necessarily simultaneously:
at least 300 cookies
at least 4096 bytes per cookie (as measured by the size of the
characters that comprise the cookie non-terminal in the syntax
description of the Set-Cookie header)
at least 20 cookies per unique host or domain name
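So the API is not the limiting factor; a quick sketch adding two cookies to one response (the second cookie's name and value are made up), where each addCookie() call appends its own Set-Cookie header:

Cookie note = new Cookie("wNote", "1530571761964");
note.setMaxAge(2592000);      // 30 days, in seconds
note.setPath("/myWebsite/");
response.addCookie(note);     // first Set-Cookie header

Cookie theme = new Cookie("theme", "dark"); // hypothetical second cookie
theme.setMaxAge(2592000);
theme.setPath("/myWebsite/");
response.addCookie(theme);    // second Set-Cookie header in the same response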
I want to get clarity on these headers in my application:
response.setHeader("Content-Security-Policy", "frame-ancestors 'self'");
response.setHeader("X-Content-Type-Options", "nosniff");
response.setHeader("X-XSS-Protection", "1; mode=block");
response.setHeader("Strict-Transport-Security", "max-age=7776000; cludeSubdomains");
String contextPath = ((HttpServletRequest) request).getContextPath();
response.setHeader("SET-COOKIE", "JSESSIONID=" +
((HttpServletRequest)request).getSession().getId() +
";Path="+contextPath+";Secure;HttpOnly");
response.setHeader("Cache-control", "no-cache, no-store,max-age=0, must-revalidate");
response.setHeader("Pragma", "no-cache");
response.setHeader("X-Frame-Options", "SAMEORIGIN");
As of now I know:
Content Security Policy is an added layer of security that helps to
detect and mitigate certain types of attacks, including Cross Site
Scripting (XSS) and data injection attacks.
X-Content-Type-Options response HTTP header is a marker used by the server to indicate that the MIME types advertised in the Content-Type headers should not be changed and be followed.
X-XSS-Protection enables the browser's built-in XSS filter.
Strict-Transport-Security is an opt-in security enhancement that is specified by a web application through the use of a special response header. Once a supported browser receives this header that browser will prevent any communications from being sent over HTTP to the specified domain and will instead send all communications over HTTPS.
Cache-Control general-header field is used to specify directives for caching mechanisms in both requests and responses.
Pragma is meant to prevent the client from caching the response. Pragma and Cache-Control do the same work, except that Pragma is the HTTP/1.0 implementation and Cache-Control is the HTTP/1.1 implementation of the same concept.
X-Frame-Options used to indicate whether or not a browser should be allowed to render a page in a frame, iframe or object.
Now I have code in a CrossSiteScriptingFilter, mapped in web.xml, which does the XSS filtering. But as a result it changes the encoding of .png files and removes the ? characters, which corrupts the PNG encoding and thus produces invalid PNG data.
Please check the screenshot: the ? characters have been replaced with an empty string, and as a result the .png files do not render.
I analysed the code and found that removing the X-Content-Type-Options response header fixes it (the .png files render properly).
I am still not sure why this problem occurs, or why X-Content-Type-Options caused the ? characters to be replaced with an empty string. Can somebody explain?
Thanks in advance :)
It sounds to me like you're pretty close to your answer: XSS filtering of special characters is a bad idea with binary files which may validly use characters that would be out of place in (x)html, js, or similar interpreted files.
Normally, web apps split such resources into their own directory that will have a different process applied to its contents, say, not running an XSS protection filter over it. When you configure the filter, you should exclude paths known to exclusively contain binary data, such as the aforementioned resource directories.
What seems likely is that the headers are causing (or prohibiting) the filter's guessing at the MIME type, so it misinterprets your binary as HTML or similar (probably based on the text in the PNG header) or just falls back to filtering by default, and then sanitises it. It could also be that your MIME-type headers are wrong and the sniffer was fixing them (hence telling it not to sniff prevents it from recovering).
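As a concrete illustration of the exclusion approach, here is a minimal sketch of such a filter (the /images/ path prefix and the XssRequestWrapper are assumptions about your existing CrossSiteScriptingFilter, not known details of it):

// imports: javax.servlet.*, javax.servlet.http.HttpServletRequest, java.io.IOException
public class CrossSiteScriptingFilter implements Filter {

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain) throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        String path = request.getRequestURI();

        // Binary resources must bypass character sanitising: rewriting
        // bytes (e.g. stripping '?') corrupts formats like PNG.
        if (path.contains("/images/") || path.endsWith(".png")) {
            chain.doFilter(req, res);
            return;
        }

        // Hypothetical existing sanitising wrapper, applied to textual requests only.
        chain.doFilter(new XssRequestWrapper(request), res);
    }

    // ...
}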
I have the requirement that the end user should not be able to go back to the restricted page after logout/sign out. But currently the end user is able to do that by the browser back button, visiting browser history or even by re-entering the URL in browser's address bar.
Basically, I want that the end user should not be able to access the restricted page in any way after sign out. How can I achieve this the best? Can I disable the back button with JavaScript?
You cannot and should not disable the browser back button or history. That's bad for the user experience. There are JavaScript hacks, but they are unreliable and will also not work when the client has JS disabled.
Your concrete problem is that the requested page is being loaded from the browser cache instead of straight from the server. This is essentially harmless, but indeed confusing to the end user, because they incorrectly think that it's really coming from the server.
You just need to instruct the browser not to cache any of the restricted JSP pages (and thus not only the logout page/action itself!). This way the browser is forced to request the page from the server instead of from the cache, and hence all login checks on the server will be executed. You can do this using a Filter which sets the necessary response headers in the doFilter() method:
@WebFilter
public class NoCacheFilter implements Filter {

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain) throws IOException, ServletException {
        HttpServletResponse response = (HttpServletResponse) res;
        response.setHeader("Cache-Control", "no-cache, no-store, must-revalidate"); // HTTP 1.1.
        response.setHeader("Pragma", "no-cache"); // HTTP 1.0.
        response.setDateHeader("Expires", 0); // Proxies.
        chain.doFilter(req, res);
    }

    // ...
}
Map this Filter on an url-pattern of interest, for example *.jsp.
#WebFilter("*.jsp")
Or if you want to put this restriction on secured pages only, then you should specify a URL pattern which covers all those secured pages. For example, when they are all in the folder /app, then you need to specify the URL pattern of /app/*.
#WebFilter("/app/*")
Even more, you can do this job in the same Filter as the one where you're checking the presence of the logged-in user.
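A sketch of that combined variant (the "user" session attribute and the /login.jsp path are assumptions; adapt them to your login mechanism):

// imports: as for NoCacheFilter above, plus javax.servlet.http.HttpSession and javax.servlet.annotation.WebFilter
@WebFilter("/app/*")
public class LoginFilter implements Filter {

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain) throws IOException, ServletException {
        HttpServletRequest request = (HttpServletRequest) req;
        HttpServletResponse response = (HttpServletResponse) res;

        // Same no-cache headers as above, so secured pages never come from cache.
        response.setHeader("Cache-Control", "no-cache, no-store, must-revalidate");
        response.setHeader("Pragma", "no-cache");
        response.setDateHeader("Expires", 0);

        // Bounce anonymous users to the login page; let logged-in users through.
        HttpSession session = request.getSession(false);
        if (session == null || session.getAttribute("user") == null) {
            response.sendRedirect(request.getContextPath() + "/login.jsp");
        } else {
            chain.doFilter(req, res);
        }
    }

    // ...
}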
Don't forget to clear browser cache before testing! ;)
See also:
Authentication filter and servlet for login
How to control web page caching, across all browsers?
*.jsp in the URL pattern won't work if you forward to a page. Try to include your servlet too; that will secure your application against this back-button problem.
The simplest way to do it without disabling the browser back button is by adding this code to the Page_Load event for the page that you don't want the user to go back to after logging out:
if (!IsPostBack)
{
    if (Session["userId"] == null)
    {
        Response.Redirect("Login.aspx");
    }
    else
    {
        Response.ClearHeaders();
        Response.ClearContent();
        Response.Clear();
        Session.Clear();   // Session.Remove expects a literal key, not a regex; Clear drops every key
        Session.Abandon(); // then end the session itself
        Response.AddHeader("Cache-Control", "no-cache, no-store, max-age=0, must-revalidate");
        Response.AddHeader("Pragma", "no-cache");
        Response.AddHeader("Expires", "0");
    }
}
The correct way to do this is to add the
Vary: Cookie
header on secured pages. When the user logs out, clear their session cookie. Then, when they navigate back after logging out, the browser cache will miss. This also has the benefit of not completely defeating caching.
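A minimal sketch of that idea (the logout action is whatever servlet handles your sign-out):

// In the filter protecting secured pages:
response.setHeader("Vary", "Cookie");

// In the logout action: invalidating the session discards the session
// cookie, so the next navigation no longer matches the cached variant.
request.getSession().invalidate();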
You can try telling the browser not to cache the homepage (using the appropriate headers: Expires, Cache-Control, Pragma), but that is not guaranteed to work. What you can do is make an AJAX call to the server on page load to check whether the user is logged in, and if not, redirect.
An alternative to implementing a Filter is to set the 'no-cache' headers on all the secured JSPs directly. This may be a good idea if the application is small and you would like to customize this behaviour for specific pages only. Add the following Java snippet to every secured JSP that should not be cached:
<%
response.addHeader("Pragma", "no-cache");
response.setHeader("Cache-Control", "no-cache, no-store, must-revalidate");
response.setDateHeader("Expires", 0);
%>
If you are not using JSP, the same headers can also be set in the controllers where routing is defined, on the HttpServletResponse object.
For me the problem was that I didn't want to set headers on all pages, so I just set this header when logout is clicked, and it clears everything related to the site :)
// Clears the site's cached data
response.setHeader("Clear-Site-Data", "\"cache\"");
Please read more about it here:
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Clear-Site-Data
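If you want to wipe more than the cache on sign-out, the header also accepts "cookies" and "storage" directives; a sketch (check browser support for each directive before relying on it):

// Clears cached pages, cookies and client-side storage for the site's origin.
response.setHeader("Clear-Site-Data", "\"cache\", \"cookies\", \"storage\"");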
I can't find information anywhere regarding this HTTP Header: PAGE_CACHE_FILTER_STATE.
When I try to access my RSS feed from a browser, this header has the value of NoCacheRequest, but when I access it from my Java application (URL.openConnection()), I've noticed that it gets set to FromCacheRequest and my RSS doesn't appear to update.
So I have two questions:
What is this HTTP header?
How can I make PAGE_CACHE_FILTER_STATE: NoCacheRequest for all requests?
I've never heard of nor seen PAGE_CACHE_FILTER_STATE before either, so I can't help you out with its actual specification. It looks like a custom header telling you whether a cached version of the content was used or not.
To avoid caching, you could try programmatically adding something different to the URL each time. For example, you might add a random number:
http://www.example.com/feed.rss?no_cache=564482
http://www.example.com/feed.rss?no_cache=984637
You should also try sending the Pragma: no-cache and Cache-Control: no-cache HTTP headers when you request the RSS feed.
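A minimal sketch of both techniques from the Java side, since you are fetching via URL.openConnection() (the feed URL and the no_cache parameter name are made up):

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class FeedFetcher {

    public static InputStream fetchFresh() throws Exception {
        // Cache-busting query parameter: a new value on each call defeats URL-keyed caches.
        URL url = new URL("http://www.example.com/feed.rss?no_cache=" + System.nanoTime());
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setUseCaches(false); // also bypass any Java-level URLConnection cache
        conn.setRequestProperty("Pragma", "no-cache");
        conn.setRequestProperty("Cache-Control", "no-cache");
        return conn.getInputStream();
    }
}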
Part of my app provides a file to be downloaded using the redirect() method. I have found that Chrome (and not Firefox or IE, weirdly) is caching this file so that the same version gets downloaded even if it has changed server-side. I gather that there is a way to tell a browser not to cache a file, e.g. like this in the HTML, or by adding something to the HTTP header. I could probably figure those out in a lower-level web framework, but I don't know how to get at the header in Play!, and the HTML option won't work because it's not an HTML file.
It seems like there's always a clever and simple way to do common tasks in Play!, so is there a clever and simple way to prevent caching in a controller?
Thanks!
Edit:
Matt points me to the http.cacheControl setting, which controls caching for the entire site. While this would work, I have no problem with most of the site being cached, especially the CSS etc. If possible I'd like to control caching for one URL at a time (the one pointing to the downloading file in this case). It's not exactly going to be a high-traffic site, so this is just academic interest talking.
Ideally, I'd like to do something like:
public static void downloadFile(String url) {
response.setCaching(false); // This is the method I'm looking for
redirect(url); // Send the response
}
The Play framework response object has a setHeader method. You can add the headers you want like this, for example:
response.setHeader("Cache-Control", "no-cache");
I haven't tested it, but it looks like the http.cacheControl configuration setting might work.
http.cacheControl
HTTP Response headers control for static files: sets the default max-age in seconds, telling the user’s browser how long it should cache the page. This is only read in prod mode, in dev mode the cache is disabled. For example, to send no-cache:
http.cacheControl=0
Default: 3600 – set cache expiry to one hour.
It is actually this:
response().setHeader("Cache-Control", "no-cache");
Tommi's answer is ok, but to make sure it works in every browser, use:
response().setHeader("Cache-Control", "no-cache, max-age=0, must-revalidate, no-store");
response.setHeader("Cache-Control", "no-cache, no-store, must-revalidate, pre-check=0, post-check=0, max-age=0, s-maxage=0"); // HTTP 1.1.
On Play (currently 2.5.x to 2.8.x) you can set the cache lifetime of the assets folder, or of individual asset files, in the configuration.
For a folder:
play.assets.cache."/public/stylesheets/"="max-age=100"
play.assets.cache."/public/javascripts/"="max-age=200"
For a specific file:
play.assets.cache."/public/stylesheets/bootstrap.min.css"="max-age=3600"
Documentation:
https://www.playframework.com/documentation/2.8.x/AssetsOverview