Is Chrome ignoring Cache-Control: max-age? - iis

Background:
IIS 7
AspNet 3.5 web app
Chrome dev tools lists 98 requests for the home page of the web app (aspx + js + css + images). On subsequent requests, the status code is 200 for css/image files. With no cache info, the browser asks the server each time whether the file has to be updated. OK.
In IIS 7 I set the HTTP cache-control header to 6 hours for the "ressources" folder. In Chrome, using dev tools, I can see that the header is correctly set in the response:
Cache-Control: max-age=21600
But I still get 98 requests... I thought the browser should not request a resource whose expiration date has not been reached, and I was expecting the number of requests to drop...

I got it. Google Chrome ignores the Cache-Control or Expires header if you make a request immediately after another request to the same URI in the same tab (by clicking the refresh button, pressing the F5 key or pressing Command + R). It probably has an algorithm to guess what the user really wants to do.
A way to test the Cache-Control header is to return an HTML document with a link to itself. When clicking the link, Chrome serves the document from the cache. E.g., name the following document self.html:
<!doctype html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Test Page</title>
</head>
<body>
  <p>
    <a href="self.html">Link to the same page.</a>
    If correctly cached, a request should not be made
    when clicking the link.
  </p>
</body>
</html>
Another option is to copy the URL and paste it in the same tab or another tab.
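Another way to verify (a hedged aside, not from the original answer) is the Resource Timing API: for same-origin resources, a transferSize of 0 is a rough sign that the resource came from the HTTP cache. Run something like this from the DevTools console after the page has loaded:
// Rough cache check; transferSize === 0 usually means the resource was
// served from the HTTP cache (note: cross-origin resources without a
// Timing-Allow-Origin header also report 0, so treat this as a heuristic).
performance.getEntriesByType('resource').forEach(function (entry) {
  console.log(entry.transferSize === 0 ? 'cached ' : 'network', entry.name);
});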
UPDATE: A Chrome blog post published on January 26, 2017 describes the previous behavior and how it is changing: on reload, only the main resource is revalidated, not the sub-resources:
Users typically reload either because a page is broken or the content seems stale. The existing reload behavior usually solves broken pages, but stale content is inefficiently addressed by a regular reload, especially on mobile. This feature was originally designed in times when broken pages were quite common, so it was reasonable to address both use cases at once. However, this original concern has now become far less relevant as the quality of web pages has increased. To improve the stale content use case, Chrome now has a simplified reload behavior to only validate the main resource and continue with a regular page load. This new behavior maximizes the reuse of cached resources and results in lower latency, power consumption, and data usage.
In a Facebook post also published on January 26, 2017, it is mentioned that they found a piece of code where Chrome invalidates all cached resources after a POST request:
we found that Chrome would revalidate all resources on pages that were loaded from making a POST request. The Chrome team told us the rationale for this was that POST requests tend to be pages that make a change — like making a purchase or sending an email — and that the user would want to have the most up-to-date page.
It seems this is not the case anymore.
Finally, it is described that Firefox is introducing Cache-Control: immutable to completely stop revalidation of resources:
Firefox implemented a proposal from one of our engineers to add a new cache-control header for some resources in order to tell the browser that this resource should never be revalidated. The idea behind this header is that it's an extra promise from the developer to the browser that this resource will never change during its max-age lifetime. Firefox chose to implement this directive in the form of a cache-control: immutable header.
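As an aside, here is a minimal sketch (a hypothetical Node.js server, not part of the original answer; the file name and one-year max-age are illustrative) of how such a never-revalidated resource might be served:
// Hypothetical Node.js sketch: serve a versioned asset with an "immutable" policy.
const http = require('http');
const fs = require('fs');

http.createServer(function (req, res) {
  if (req.url === '/app.v1.js') {
    res.writeHead(200, {
      'Content-Type': 'application/javascript',
      // "immutable" tells supporting browsers not to revalidate this
      // resource at all during its max-age lifetime.
      'Cache-Control': 'public, max-age=31536000, immutable'
    });
    fs.createReadStream('./app.v1.js').pipe(res);
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(4567);
This only makes sense for versioned or fingerprinted URLs, since the browser will not check back for changes until max-age expires.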

Chrome appears to be ignoring your Cache-Control settings if you're reloading in the same tab. If you copy the URL to a new tab and load it there, Chrome will respect the Cache-Control headers and reuse the contents from the cache.
As an example I had this Ruby Sinatra app:
#!/usr/bin/env ruby
require 'sinatra'

before do
  content_type :txt
end

get '/' do
  headers "Cache-Control" => "public, must-revalidate, max-age=3600",
          "Expires" => Time.at(Time.now.to_i + (60 * 60)).to_s
  "This page rendered at #{Time.now}."
end
When I continuously reloaded it in the same Chrome tab it would display the new time.
This page rendered at 2014-10-08 13:36:46 -0400.
This page rendered at 2014-10-08 13:36:48 -0400.
The headers looked like this:
< HTTP/1.1 200 OK
< Content-Type: text/plain;charset=utf-8
< Cache-Control: public, must-revalidate, max-age=3600
< Expires: 2014-10-08 13:36:46 -0400
< Content-Length: 48
< X-Content-Type-Options: nosniff
< Connection: keep-alive
* Server thin is not blacklisted
< Server: thin
However, accessing the same URL, http://localhost:4567/, from multiple new tabs would reuse the previous result from the cache.

After doing some tests with Cache-Control:max-age=xxx:
Pressing reload button: header ignored
Entering same url any tab (current or not): honored
Using JS (window.location.reload()): ignored
Using Developer Tools (with "Disable cache" unselected) or incognito mode: no difference
So, the best option while developing is to put the cursor in the omnibox and press Enter instead of using the refresh button (see the JS sketch below).
Note: a right click on the refresh icon will show refresh options (Normal, Hard, Empty Cache). Incredibly, none of these affects how these headers are honored.
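As a rough illustration of the difference (a hedged sketch, not from the original answer), these two lines, run one at a time from the DevTools console, trigger the two behaviors:
// Navigating by assignment behaves like typing the URL in the omnibox:
// unexpired resources are served from cache (Cache-Control honored).
location.href = location.href;

// Reloading behaves like the refresh button / F5:
// Chrome revalidates resources even if max-age has not expired.
location.reload();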

If Chrome Developer Tools are open (F12), Chrome usually disables caching.
It is controllable in the Developer Tools settings - the Gear icon to the right of the dev-tools top bar.

While this question is old, I wanted to add that if you are developing using a self-signed certificate over HTTPS and there is an issue with the certificate, then Chrome will not cache the response no matter what cache headers you use.
This is noted in this bug report:
https://bugs.chromium.org/p/chromium/issues/detail?id=110649

This is an addition to kievic's answer.
To force the browser NOT to send a Cache-Control header in the request, open the Chrome console and type:
location = "https://your.page.com"
To force the browser to add this header, click the "reload" button.

Quite an old question, but I noticed just recently (2020) that Chrome sometimes ignores the Cache-Control headers for my image resources when browsing in an Incognito window.
"Sometimes" because in my case the Cache-Control directive was honored for small images (~60-200KB), but not for larger ones (10MB).
Not using an Incognito window resulted in Chrome using the disk-cached version even for the large images.

Another tip:
Do not forget to verify the "Date" header: if the server has an incorrect date/time (or sends local time from another time zone instead of GMT), Chrome will keep requesting the resource again and again.
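One hedged way to check this from the browser (a sketch that assumes a same-origin page, since the Date response header may not be readable cross-origin):
// Compare the server's Date header with the local clock.
fetch(location.href, { cache: 'no-store' }).then(function (res) {
  const serverDate = new Date(res.headers.get('Date'));
  console.log('Clock skew between browser and server (ms):', Date.now() - serverDate.getTime());
});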

Related

Browser loads response from cache although no-cache header is set

I'm working on a web app and I'm having the following problem:
When I go to some page, my server sends a response with a cache-control: no-cache header.
Then I do some changes (graphql mutations) on that page.
When I go to another page and then click the browser's back button, my browser reads the outdated response from the disk cache instead of sending a request to the server to get the changed data.
I am wondering if there is something missing in my headers telling the browser not to use the disk cache.
Some info:
The browser does not send a request to my server. (So it is not cached somewhere else.)
It is not the back-forward cache. (There is already some logic handling the bfcache.)
I can reproduce it in all my browsers. (e.g. Firefox, Chrome, ...)
When I disable the disk cache in the Firefox settings, it works correctly. (Then the bfcache kicks in.)
I also found the following thread. Is there a better solution?
Chrome is caching even with HTTP no-cache headers
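As an aside (not part of the original question): Cache-Control: no-cache still allows the response to be stored and then revalidated, while no-store forbids writing it to the HTTP cache at all, which matters when even a back navigation must not reuse the disk cache. A minimal sketch with a hypothetical Node.js handler:
// Hypothetical Node.js handler illustrating no-store (not from the question).
const http = require('http');

http.createServer(function (req, res) {
  res.writeHead(200, {
    'Content-Type': 'text/html; charset=utf-8',
    // no-store: never write this response to the HTTP cache,
    // so a back navigation cannot read a stale copy from disk.
    'Cache-Control': 'no-store'
  });
  res.end('<p>Rendered at ' + new Date().toISOString() + '</p>');
}).listen(3000);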

Cache control header not working

I have set Cache-Control in my response header as Cache-Control: public, max-age=86400. But when I refresh the page or open a new tab, it always hits my server. The response status I get is 200, a server log entry appears for this request, and when I check chrome://cache/ this request is not in the list. I have already looked at similar SO questions (cache-control not working without etag and why cache-control:max-age don't work?), but still no luck. Tested on Chrome 56.
Chrome disables the cache when DevTools is open, or at least it does in Chrome 59. Open DevTools, go to Network, and uncheck "Disable cache" at the top. Now you should be able to refresh the page and see it in chrome://cache.
Cache control tells your browser (and proxy servers like Squid) what resources it cannot cache. But it does not force your browser to cache a resource.
I recommend checking the error logs to see whether the request really goes to the backend or stays in the browser.
In my case, the browser gives me 200 OK in the console logs, but according to the error_log the request never reaches the backend...
The Cache-Control response header will not be honored on a page refresh. Try making that request twice without refreshing the page, and you will see it being served from cache (the request won't reach your server).
To achieve what you want, you might have to cache the response yourself via localStorage (see the sketch below), or cache it through a back-end caching library.
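A hedged sketch of that localStorage approach (the URL, key prefix and TTL below are illustrative assumptions, not from the question):
// Cache a JSON response in localStorage with a manual max-age.
function fetchWithLocalCache(url, maxAgeMs) {
  const key = 'cache:' + url;
  const cached = JSON.parse(localStorage.getItem(key) || 'null');
  if (cached && Date.now() - cached.storedAt < maxAgeMs) {
    return Promise.resolve(cached.data); // served from localStorage
  }
  return fetch(url)
    .then(function (res) { return res.json(); })
    .then(function (data) {
      localStorage.setItem(key, JSON.stringify({ storedAt: Date.now(), data: data }));
      return data; // served from the network
    });
}

// Usage: cache /api/config for one day (mirroring max-age=86400).
fetchWithLocalCache('/api/config', 86400 * 1000).then(console.log);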

Tracking down X-Frame-Options header

We've partnered with a company whose website will display our content in an IFRAME. I understand what the header is, what it does, and why; what I need help with is tracking down where it's coming from!
Windows Server 2003/IIS6
Container page: https://testDomain.com/test.asp
IFRAME Content: https://ourDomain.com/index.asp?lots_of_parameters,_wheeeee
Testing in Firefox 24 with Firebug installed. (IE and Chrome do the same thing.) Also running Fiddler so I can watch network traffic while I'm at it.
For simplicity's sake, I created a page with nothing on it but the IFRAME in question - same physical server, different domain/site - and it failed with
Load denied by X-Frame-Options: https://www.google.com/ does not permit cross-origin framing.
(That's in the Firebug console.) I'm confused because:
Google is not referenced anywhere in the containing app, or in the IFRAMEd app. All javascript libraries are kept locally; there is no analytics in the app. No Google, nowhere.
The containing page has NOTHING on it, except the IFRAME. No html tags, no head tag, no body tag. IFRAME. That's it.
The X-FRAME-OPTIONS header does not exist in IIS on the server: not at the "Websites" node, not in the individual sites.
So where the h-e-double-sticks is that coming from? What am I missing?
Interesting point: if I remove the "S" from HTTPS in the IFRAME URL, it works. Given the nature of the data, SSL is required.
You might check global.asax.cs; the app could be adding the header to every response automatically. If you search the app for "x-frame-options" you might also find something.

Reload vs Refresh

I have this script
<?php
header("Expires: Sat, 11 Jun 2011 00:00:00 GMT");
echo "Hello World";
?>
It just writes "Hello World" and sets the cache to expire on the next Saturday.
Now, when I load this page in Firefox and click the reload button, it makes a new request to the server to load the page instead of just serving it from cache (I think to check whether Last-Modified is still valid).
However, if I put my cursor in the address bar and press Enter, Firefox serves the contents from cache.
Why is that so? Why does it make a request to the server in the first case (reload), but serve from cache in the second case (refresh, I guess)?
I think the terms 'refresh' and 'reload' are basically synonymous. I see a line in RFC 2616, which describes HTTP/1.1 caching, that suggests a possible slight difference:
An expiration time cannot be used to force a user agent to refresh its display or reload a resource
In other words, perhaps you could say refreshing is for displays, and reloading is for resources. But since browsers' primary use for resources is display, I don't see a difference.
Here's a short writeup on the terms by a developer who has dealt with browser cache control. The terms he prefers are these:
load: hit Enter in the address bar; click on links
reload: F5; Ctrl+R; toolbar's refresh button; Menu -> Reload
hard reload: Ctrl+F5; Ctrl+Shift+R
(The hard reload forces the browser to bypass its cache. For Firefox, you hold down Shift and press the reload button. Wikipedia has a list of how to do this for common browsers. You can test its effect on this page.)
To answer your question about how Firefox decides when to refresh, here is how the link from above explains it:
load: no request happens until the cached resource expires
reload: the request contains the If-Modified-Since and Cache-Control: max-age=0 headers that allow the server to respond with 304 Not Modified if applicable
hard reload: the request contains the Pragma: no-cache and Cache-Control: no-cache headers and will bypass the cache
When people refresh a page, they generally expect to see new results, so caching of the entire page doesn't make much sense.
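Incidentally (a hedged aside, not from the original answer), the Fetch API's cache modes roughly mirror these three behaviors, which makes them easy to observe from the DevTools console; the resource URL below is a placeholder:
// Rough mapping (an approximation, not an exact equivalence):
//   'default'  ~ load:        use the cache; no request until max-age expires
//   'no-cache' ~ reload:      revalidate with the server before using the cache
//   'reload'   ~ hard reload: skip the cache, but store the fresh response
['default', 'no-cache', 'reload'].forEach(function (mode) {
  fetch('/some-cached-resource.css', { cache: mode }).then(function (res) {
    console.log(mode, '->', res.status);
  });
});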

Why is Chrome reporting a secure / non-secure warning when no other browsers are?

When I go to our web site over HTTPS, Chrome reports an error saying that the page contains secure and non-secure items. However, I used Firebug, Fiddler, and HttpDebuggerPro, all of which tell me that everything is going through HTTPS. Is this a bug in Chrome?
Sorry but I'm unable to give out the actual URL.
A bit late to the party here, but I've been having issues recently: even after I had found an http resource and changed it, I was still getting the red padlock symbol. When I closed the tab and opened a new one, it changed to a green padlock, so I guess Chrome caches this information for the lifetime of the tab.
Current versions of Chrome will show the mixed content's URL in the error console. Hit CTRL+Shift+J and you'll see text like:
"The page at https://www.fiddler2.com/test/securepageinsecureimage.htm contains insecure content from http://www.fiddler2.com/Eric/images/me.jpg."
I was having the same issue: Chromium was flagging static files as non-secure even though everything was being served over https://.
Just closing the current tab and re-opening the page in another new tab worked, so I think this is a Chromium/Chrome bug.
Cheers,
Diogo
Using Chrome, if you open up the Developer Tools (View > Developer > Developer Tools) and bring up the Console and choose to filter to warnings, you'll see a list of offending URLs.
You'll see something like the following if you do have insecure content
The page at https://mysite/ displayed insecure content from http://insecureurl.
For the best experience in finding the culprit, you'll want to start your investigation in a new tab.
It is possible that a non-secure URL is referenced but not accessed (e.g. the codebase for a Flash <object>).
I ran into this problem when jQuery code executing a few seconds after page load added a class containing a non-secure image background. Chrome must continually check for any non-secure resources being loaded.
See the code example below. If you had code like this, the green padlock would be shown in Chrome for about 5 seconds, until the deferred class is applied to the div.
setTimeout(function() {
  $("#some-div").addClass("deferred");
}, 5000);

.deferred {
  background: url("http://not-secure.com/not-secure.jpg");
}
Check the source of the page for any external objects (scripts, stylesheets, images, objects) linked using http://... rather than https://... or a relative path. Change the links to use relative paths, or absolute paths without protocol, i.e. href="/path/to/file".
If all that is fine, it could be something included from JavaScript. For example, the Google Analytics code uses document.write to add a new script to the page, but it has code to check for HTTPS in case the calling page is secure:
<script type="text/javascript">
var gaJsHost = (("https:" == document.location.protocol) ? "https://ssl." : "http://www.");
document.write(unescape("%3Cscript src='" + gaJsHost + "google-analytics.com/ga.js' type='text/javascript'%3E%3C/script%3E"));
</script>
With the release of Chrome version 53 on Windows, Google changed the trust indicator to the circle-i icon. Google has also announced that a new warning message will be issued when a website is not using HTTPS:
Starting in January 2017, the popular web browser Chrome will begin labeling HTTP sites which transmit passwords or ask for credit card details as "Not Secure".
If all your resources are indeed secure, then it is a bug: http://code.google.com/p/chromium/issues/detail?id=72015. Luckily, it was fixed.
