Browser loads response from cache although no-cache header is set

I'm working on a web app and I'm having the following problem:
When I visit a page, my server sends the response with a Cache-Control: no-cache header.
Then I make some changes (GraphQL mutations) on that page.
When I navigate to another page and then click the browser's back button, the browser reads the outdated response from the disk cache instead of sending a request to the server to get the changed data.
I'm wondering if there is something missing in my headers telling the browser not to use the disk cache?
Some info:
The browser does not send a request to my server. (So it is not cached somewhere else.)
It is not the back-forward cache. (There is already some logic handling the bfcache.)
I can reproduce it in all my browsers. (e.g. Firefox, Chrome, ...)
When I disable the disk cache in the Firefox settings, it works correctly. (Then the bfcache kicks in.)
I also found the following thread. Is there a better solution?
Chrome is caching even with HTTP no-cache headers
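One thing worth checking: no-cache only forces revalidation before a stored copy is reused; it does not forbid storing the response. no-store does. A minimal sketch of sending it, assuming an Express server (the route name is made up):

// Hypothetical Express route: no-store tells the browser not to write
// the response to the disk cache at all, so a back navigation has to
// ask the server again instead of reusing an outdated copy.
const express = require('express');
const app = express();

app.get('/my-page', (req, res) => {
  res.set('Cache-Control', 'no-store, must-revalidate');
  res.send('fresh content');
});

app.listen(3000);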

Related

Handling of requests with Header Accept-Encoding in CloudFront vs request without the header Accept-Encoding

I have a CloudFront distribution to cache my images. My origin server is NOT S3; it's a server I run.
I use these images on my website (taking advantage of CF caching). To explain the problem, let's assume my home page uses an image called banner.png.
I visit my home page, say from Chrome, for the first time: for banner.png it's a cache miss, so it gets fetched from the origin and cached in CF.
After this I visit my page from FF, Opera, Chromium, and GET "banner.png" using Postman: all of these get me the file from the CF cache.
Now I GET "banner.png" using Insomnia (another REST client): this time CF doesn't serve me from the cache; it goes back to the origin to get the image and replies with **"x-cache: RefreshHit from cloudfront"**.
The difference between these two sets of clients is that the first set sends an "Accept-Encoding: gzip" header in the request and the second client does not.
In my CF behaviour:
"Cache Based on Selected Request Headers" = None
"Compress Objects Automatically" = No
Any pointers?
CloudFront keeps two different copies of the cache based on Accept-Encoding:
one if the request header contains Accept-Encoding: gzip,
one if Accept-Encoding has any other value or the header is absent.
You can test it using curl: make the first request without Accept-Encoding and the second with Accept-Encoding: gzip, and you'll see a Miss from CloudFront on the second request even though the first one populated the cache. This is expected with CloudFront.
The reason is that CloudFront supports only gzip compression, and it takes this header into consideration to know whether it needs to compress the response or not.
However, your problem seems different. You're seeing a RefreshHit from CloudFront, which happens when CloudFront's TTL/max-age expires and CloudFront makes a conditional GET to the origin to check whether the content has been modified.
Ideally, it should be a Miss from CloudFront if no Accept-Encoding header is present.
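A quick way to observe the two copies, sketched with Node's built-in https module (the URL is a placeholder for your distribution):

// Request the same object with and without Accept-Encoding: gzip
// and compare the x-cache response header. Run each variant twice:
// the first hit of each variant should be a Miss, the second a Hit.
const https = require('https');

const url = 'https://d111111abcdef8.cloudfront.net/banner.png'; // placeholder

function check(headers) {
  https.get(url, { headers }, (res) => {
    console.log(JSON.stringify(headers), '->', res.headers['x-cache']);
    res.resume(); // drain the body so the socket is released
  });
}

check({});                            // "plain" cache copy
check({ 'Accept-Encoding': 'gzip' }); // gzip cache copy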

Cache control header not working

I have set cache control in my response header as Cache-Control: public, max-age=86400. But when I try to refresh the page or open a new tab, it always hits my server. The response status I get is 200, a server log entry appears for this request, and I checked chrome://cache/: this request is not in the list. I already looked at some similar SO questions, "cache-control not working without etag" and "why cache-control:max-age don't work?", but still with no luck. Tested on Chrome 56.
Chrome disables the cache when DevTools is open, or at least it does in Chrome 59. Open DevTools, go to Network, and uncheck "Disable cache" at the top. Now you should be able to refresh the page and see it in chrome://cache.
Cache-Control tells your browser (and proxy servers like Squid) which resources it may not cache. But it does not force your browser to cache a resource.
I recommend checking the error logs to see whether the request really goes to the backend or stays in the browser.
In my case, the browser gives me 200 OK in the console logs, but I don't reach the backend according to the error_log...
The Cache-Control response header will not work for a page refresh. Try making that request twice without refreshing the page; then you will see it being cached (the second request won't even reach your server).
To achieve what you want, you might have to cache the response yourself using localStorage, or cache it through a back-end caching library.
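As an illustration of the localStorage route, a client-side sketch (the function name and max age are made up):

// Reuse a fetched response for up to maxAgeMs across refreshes by
// keeping it in localStorage, independent of the HTTP cache.
async function cachedFetch(url, maxAgeMs = 86400 * 1000) {
  const key = 'cache:' + url;
  const hit = JSON.parse(localStorage.getItem(key) || 'null');
  if (hit && Date.now() - hit.time < maxAgeMs) {
    return hit.body; // still fresh: no request reaches the server
  }
  const body = await (await fetch(url)).text();
  localStorage.setItem(key, JSON.stringify({ time: Date.now(), body }));
  return body;
}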

NodeJS & Express cookie issue with IE

I'm having some issues with the cookie for a session ID returned by Express to the browser. For consecutive requests the received cookie is not passed back, hence a new session is generated for each request. The login state cannot be maintained.
This issue only seems to be occurring on IE11 and below on an OS lower than Win10 or so (eg: IE11 on Win7). Edge, Chrome, Safari, FF... no issues.
Some more context:
We have two applications, say one.example.com and two.example.com. With requests from both applications the logged-in user should be kept track of.
The MEAN stack accessible on two.example.com returns set-cookie headers for one day, HTTP only, with a domain of '.example.com' on path '/'.
When I load a page with, say, 10 resources, each of these requests receives a new cookie for the session ID, even on consecutive page loads. The cookie is never returned to the server.
HTTP trace from Chrome:
GET http://localhost:3000/
HTTP/1.1 200 OK
set-cookie: test=123
set-cookie: webSessionId=s%3Av6R-...
GET http://localhost:3000/lib/bootstrap/dist/css/bootstrap.css
Cookie: test=123; webSessionId=s%3Av6R-...
HTTP/1.1 200 OK
Set-Cookie: test=123
As seen, Chrome returns the session cookie, so it is not re-issued by Express for the second request.
HTTP trace from IE (screenshots): the response for the first requested resource, the request for the second resource, and the response for the second requested resource.
The test=123 is a hardcoded cookie I set on every request (regardless of whether it has been returned), using res.setHeader('Set-Cookie', 'test=123');. At one point I was looking at the difference between 'set-cookie' and 'Set-Cookie' (as can be seen in the traces above), but that does not seem to matter to IE.
So I started to play around with the other cookie properties (expiry date, domain, path, secure & HTTP only). As soon as I provide a domain, the test cookie is not returned by IE.
In our setup '.example.com' really is a requirement. The domain does not contain an underscore (_). In dev & test it does contain a hyphen ("two-dev.example.com"), but the IE(11) problem also exists in production (two.example.com).
Anyone has any idea as to why IE refuses to return cookies with a domain?
This sh*t is driving me bananas
using: express 4.13.1; express-session 1.11.3; cookie-parser 1.3.2
I think I found the cause... the domains we are using are two-letter domains, e.g. one.xx.yy & two.xx.yy. So setting the Domain part of the cookie to .xx.yy is causing IE to ignore the cookie, I reckon.
Anyone can confirm this issue is still valid on IE11?
How to set cookies for two-letter domains in IE8?
And/or how to circumvent this? Besides the obvious of using other domains.
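For anyone trying to reproduce this, a minimal sketch of the session setup described above, assuming express-session as listed (the secret and port are placeholders):

// Session cookie scoped to the parent domain, one-day expiry,
// HTTP only: the configuration IE reportedly rejects when the
// parent domain is a two-letter one such as .xx.yy.
const express = require('express');
const session = require('express-session');
const app = express();

app.use(session({
  secret: 'keyboard cat',        // placeholder
  resave: false,
  saveUninitialized: true,
  name: 'webSessionId',
  cookie: {
    domain: '.example.com',      // the shared parent domain
    path: '/',
    httpOnly: true,
    maxAge: 24 * 60 * 60 * 1000  // one day
  }
}));

app.listen(3000);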

How can I check if Access-Control-Allow-Origin is enabled for my domain?

If I have configured Access-Control-Allow-Origin: http://mydomain correctly, should it be listed in the response headers if I view them using the web developer plugin? I don't see it. Should it be there?
I have tried viewing the response headers after submitting my post request, and just calling the page.
Background
I need to transfer a couple of values from mydomain to receivingdomain via an XMLHttpRequest POST request and am trying to troubleshoot:
XMLHttpRequest Page1.asp cannot load https://receivingdomain. No Access-Control-Allow-Origin header is present on the requested resource
If I turn on the Allow-Control-Allow-Origin plug-in, my POST requests work correctly. However, when this plug-in is disabled, my POST requests are still being received OK (the data is inserted into the database); I'm just not getting any result back from the receiving server.
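To answer the first question: yes, if the server emits the header it should show up in the response. Note that many servers only add it when the request carries an Origin header, which browsers set automatically but a plain page view does not, so you may not see it when simply calling the page. A small sketch for checking it from a script (both URLs are placeholders from the question):

// Send a request with an explicit Origin header and print whatever
// CORS headers the server actually returns.
const https = require('https');

https.get('https://receivingdomain/', {
  headers: { Origin: 'http://mydomain' }
}, (res) => {
  console.log('status:', res.statusCode);
  console.log('Access-Control-Allow-Origin:',
              res.headers['access-control-allow-origin']);
  res.resume();
});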

Is Chrome ignoring Cache-Control: max-age?

Background:
IIS 7
AspNet 3.5 web app
Chrome dev tools lists 98 requests for the home page of the web app (aspx + js + css + images). On subsequent visits, the status code is 200 for css/image files. No cache info; the browser asks the server each time whether the file has to be updated. OK.
In IIS 7 I set the HTTP header for cache control to 6 hours for the "ressources" folder. In Chrome, using dev tools, I can see that the header is correctly set in the response:
Cache-Control: max-age=21600
But I still get 98 requests... I thought the browser should not request a resource if its expiration date is not reached, and I was expecting the number of requests to drop...
I got it. Google Chrome ignores the Cache-Control or Expires header if you make a request immediately after another request to the same URI in the same tab (by clicking the refresh button, pressing the F5 key, or pressing Command + R). It probably has an algorithm to guess what the user really wants to do.
A way to test the Cache-Control header is to return an HTML document with a link to itself. When clicking the link, Chrome serves the document from the cache. E.g., name the following document self.html:
<!doctype html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Test Page</title>
</head>
<body>
  <p>
    <!-- the link the text refers to; without it there is nothing to click -->
    <a href="self.html">Link to the same page.</a>
    If correctly cached, a request should not be made
    when clicking the link.
  </p>
</body>
</html>
Another option is to copy the URL and paste it in the same tab or another tab.
UPDATE: A Chrome post published on January 26, 2017 describes what the previous behavior was and how it changed to revalidate only the main resource, not the sub-resources:
Users typically reload either because a page is broken or the content seems stale. The existing reload behavior usually solves broken pages, but stale content is inefficiently addressed by a regular reload, especially on mobile. This feature was originally designed in times when broken pages were quite common, so it was reasonable to address both use cases at once. However, this original concern has now become far less relevant as the quality of web pages has increased. To improve the stale content use case, Chrome now has a simplified reload behavior to only validate the main resource and continue with a regular page load. This new behavior maximizes the reuse of cached resources and results in lower latency, power consumption, and data usage.
In a Facebook post also published on January 26, 2017, it is mentioned that they found a piece of code where Chrome invalidates all cached resources after a POST request:
we found that Chrome would revalidate all resources on pages that were loaded from making a POST request. The Chrome team told us the rationale for this was that POST requests tend to be pages that make a change — like making a purchase or sending an email — and that the user would want to have the most up-to-date page.
It seems this is not the case anymore.
Finally, it is described that Firefox is introducing Cache-Control: immutable to completely stop revalidation of resources:
Firefox implemented a proposal from one of our engineers to add a new cache-control header for some resources in order to tell the browser that this resource should never be revalidated. The idea behind this header is that it's an extra promise from the developer to the browser that this resource will never change during its max-age lifetime. Firefox chose to implement this directive in the form of a cache-control: immutable header.
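As an illustration, marking a long-lived, fingerprinted asset immutable might look like this in an Express handler (a sketch; the file name and path are made up):

// Supporting browsers will not revalidate this asset during its
// one-year max-age, even on reload.
const express = require('express');
const app = express();

app.get('/assets/app.abc123.js', (req, res) => {
  res.set('Cache-Control', 'public, max-age=31536000, immutable');
  res.sendFile('/var/www/assets/app.abc123.js'); // placeholder path
});

app.listen(3000);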
Chrome appears to be ignoring your Cache-Control settings if you're reloading in the same tab. If you copy the URL to a new tab and load it there, Chrome will respect the cache control tags and reuse the contents from the cache.
As an example I had this Ruby Sinatra app:
#!/usr/bin/env ruby
require 'sinatra'

before do
  content_type :txt
end

get '/' do
  headers "Cache-Control" => "public, must-revalidate, max-age=3600",
          "Expires" => Time.at(Time.now.to_i + (60 * 60)).to_s
  "This page rendered at #{Time.now}."
end
When I continuously reloaded it in the same Chrome tab it would display the new time.
This page rendered at 2014-10-08 13:36:46 -0400.
This page rendered at 2014-10-08 13:36:48 -0400.
The headers looked like this:
< HTTP/1.1 200 OK
< Content-Type: text/plain;charset=utf-8
< Cache-Control: public, must-revalidate, max-age=3600
< Expires: 2014-10-08 13:36:46 -0400
< Content-Length: 48
< X-Content-Type-Options: nosniff
< Connection: keep-alive
* Server thin is not blacklisted
< Server: thin
However, accessing the same URL, http://localhost:4567/, from multiple new tabs would recycle the previous result from the cache.
After doing some tests with Cache-Control: max-age=xxx:
Pressing the reload button: header ignored
Entering the same URL in any tab (current or not): honored
Using JS (window.location.reload()): ignored
Using Developer Tools (with Disable cache unselected) or incognito makes no difference.
So, the best option while developing is to put the cursor in the omnibox and press Enter instead of using the refresh button.
Note: a right click on the refresh icon will show the refresh options (Normal, Hard, Empty Cache). Incredibly, none of these affect these headers.
If Chrome Developer Tools are open (F12), Chrome usually disables caching.
It is controllable in the Developer Tools settings - the Gear icon to the right of the dev-tools top bar.
While this question is old, I wanted to add that if you are developing using a self-signed certificate over HTTPS and there is an issue with the certificate, then Chrome will not cache the response no matter what cache headers you use.
This is noted in this bug report:
https://bugs.chromium.org/p/chromium/issues/detail?id=110649
This is in addition to kievic's answer.
To force the browser NOT to send a Cache-Control header in the request, open the Chrome console and type:
location = "https://your.page.com"
To force the browser to add this header, click the "reload" button.
Quite an old question, but I noticed just recently (2020) that Chrome sometimes ignores the Cache-Control headers for my image resources when browsing in an Incognito window.
"Sometimes", because in my case the Cache-Control directive was honored for small images (~60-200 KB), but not for larger ones (10 MB).
Not using an Incognito window resulted in Chrome using the disk-cached version even for the large images.
Another tip:
Do not forget to verify the "Date" header: if the server has an incorrect date/time (or is located in another time zone), Chrome will keep requesting the resource again and again.
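A quick sketch for spotting that, comparing the server's Date header with the local clock (the URL is a placeholder):

// A large skew between the server's Date header and the local clock
// distorts freshness calculations based on max-age.
const https = require('https');

https.get('https://your.page.com/', (res) => {
  const serverTime = new Date(res.headers['date']);
  console.log('clock skew (ms):', Date.now() - serverTime.getTime());
  res.resume();
});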
