I'm caching some JS resources on my website, such as require.js, but it seems almost worthless since the browser goes back to the server anyway to ask whether the resource has changed:
In the screenshot I grabbed from Chrome's network analyzer you can see the resource didn't change (304), so it wasn't transferred; however, the browser still waited for that HTTP response. Since my server is very far from my workstation, the 220 ms corresponds to my average ping to it.
How can I improve this? Is there a way to tell the browser to always assume the NOT MODIFIED status and not go check the server?
Thank you
Answers to comments
Caching header:
I've got an htaccess some levels above that directory containing the following info:
<IfModule mod_expires.c>
ExpiresActive On
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
ExpiresDefault "access plus 1 week"
</FilesMatch>
</IfModule>
My php page itself contains
<meta HTTP-EQUIV="cache-control" CONTENT="NO-CACHE">
which I sincerely hope affects ONLY the PHP file itself, not the resources it references...
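On the revalidation question above: the browser only skips the server round-trip while its cached copy is still fresh, so the usual approach is a long freshness lifetime. A minimal sketch, assuming mod_headers is available and that file names are versioned (e.g. require-2.1.js, a hypothetical name) so a stale copy can be replaced by changing the URL:

```apache
<IfModule mod_headers.c>
    <FilesMatch "\.(js|css)$">
        # Fresh for one year, so no conditional request is sent;
        # "immutable" additionally hints (in newer browsers) that the
        # resource should not be revalidated even on reload
        Header set Cache-Control "max-age=31536000, public, immutable"
    </FilesMatch>
</IfModule>
```

With headers like these the 304 round-trip disappears entirely until the cached copy expires.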
Related
I am using plain domain name url to display homepage and within htaccess I have
<FilesMatch "\.(html|htm)$">
Header set Cache-Control "max-age=3600, private, must-revalidate"
</FilesMatch>
to send the Cache-Control header for HTML files. Unfortunately, when just the domain is requested, the FilesMatch obviously does not match, but I need to send the cache header anyway. ExpiresByType is not a solution because some text/html responses need to stay uncached (for example, PHP scripts sending text/html data). Sending the header from PHP is also not an option, since the homepage HTML is cached in the filesystem and served by Apache directly (via an htaccess rule) without touching PHP. I am sure there must be some easy solution, but I'm not able to figure it out right now. Thanks for the help.
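One possible approach, sketched here on the assumption that the server runs Apache 2.4+ (whose `<If>` directive accepts expressions) and has mod_headers loaded, is to match the bare-domain request by URI rather than by file name:

```apache
# Apache 2.4+: match the root URL itself, which no FilesMatch can see
<If "%{REQUEST_URI} == '/'">
    Header set Cache-Control "max-age=3600, private, must-revalidate"
</If>
```

This is untested against this exact setup; on Apache 2.2 there is no `<If>`, and a different mechanism (e.g. SetEnvIf plus `Header set ... env=`) would be needed.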
I am trying to improve my PageSpeed score.
The following file is a bit annoying right now:
https://example.com/cdn-cgi/scripts/d07b1474/cloudflare-static/email-decode.min.js
this is cloudflares email-decode, I want to set the expiry for it with either htaccess, the virtual host or mod_pagespeed
This is what I have attempted, with no luck (it's very possible my regex is just wrong):
# speed up cloudflare #
<FilesMatch "email-decode\.min\.js$">
ExpiresByType application/x-javascript A31536000
ExpiresByType application/javascript A31536000
</FilesMatch>
# end speed up cloudflare #
That is in my .htaccess file.
This is the warning pagespeed gives me:
Setting an expiry date or a maximum age in the HTTP headers for static
resources instructs the browser to load previously downloaded
resources from local disk rather than over the network.
That file is being served by Cloudflare, not your origin, so there is no way for you to change its headers. You could disable email obfuscation in the ScrapeShield section of the Cloudflare admin console if you don't have email addresses on your site.
I have enabled the pagespeed module and find that for some resources (images, JS and CSS) rewritten by pagespeed, the cache lifetime is set to the default 5 minutes. A few other resources (images, JS and CSS) rewritten by pagespeed have Cache-Control: max-age=31536000.
I explicitly set ExpiresDefault to 1 year for all my static resources in .htaccess.
The response I get has this:
Cache-Control:max-age=300,private
I am expecting:
Cache-Control:max-age=31536000,private
Suggestions and pointers are appreciated.
mod_pagespeed only serves responses with Cache-Control: max-age=300, private when the hash in the URL doesn't match the content. This can happen normally when A) the contents of the resource changed recently, so there is a mixture of requests for old and new URLs for some time, or B) the rewriting does not finish in time while serving the resource.
This is most likely to happen if the resource request goes to a different server than the HTML request did. You can try flushing the cache to see if this clears things up.
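For reference, mod_pagespeed's file cache location is set with the ModPagespeedFileCachePath directive, and the documented way to flush the cache is to create or touch a file named cache.flush inside that directory. The path below is a common default; yours may differ:

```apache
# httpd.conf / pagespeed.conf: where mod_pagespeed keeps its file cache.
# Touching <this-path>/cache.flush tells mod_pagespeed to flush its cache.
ModPagespeedFileCachePath "/var/cache/mod_pagespeed/"
```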
I've placed a simple cache control in my .htaccess file:
#cache css and javascript files for one week
<FilesMatch "\.(js|css)$">
Header set Cache-Control "max-age=604800"
</FilesMatch>
When I test the desktop site with Google's PageSpeed Insights tester (https://developers.google.com/speed/pagespeed/insights), it shows the JavaScript and images are being cached properly. However, when I test my mobile website, the caching isn't working. My .htaccess file is in the public_html directory alongside all my desktop files (i.e. public_html/index.html, public_html/images/, public_html/css/, public_html/.htaccess, etc.). My mobile site is contained here: public_html/mobile/.
Would I need to add a second .htaccess file to the mobile directory to make it work?
Thanks.
The best option is to use the .htaccess file from HTML5 Boilerplate. It is highly optimised for caching, gzip, cross-domain Ajax, and a lot of other features.
Also check whether mod_deflate is enabled.
You don't need an additional .htaccess file; just use a single file in the root of your directory.
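On the mod_deflate check: one common pattern is to wrap the compression directives in an IfModule block, so the config stays valid whether or not the module is loaded. A sketch (the MIME types listed are just common choices):

```apache
<IfModule mod_deflate.c>
    # Compress text-based responses; already-compressed formats
    # (images, archives, etc.) are deliberately left out
    AddOutputFilterByType DEFLATE text/html text/css application/javascript application/json
</IfModule>
```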
I am trying to create an ideal .htaccess file that will tick all the boxes for taking advantage of server compression, but only where it makes sense, and caching of resources on public proxies, again where it makes sense. I have been feeling my way through the process and I think I am pretty much there, but I suspect there might be a bit of finessing left to do and I thought I'd invite suggestions. I have my suspicions it's not there yet because of a great tool I have discovered and I have to share that with you to begin with.
www.pingdom.com has a great suite of website analysis tools, many of which are free to use, and personally I think the best is http://tools.pingdom.com/fpt/. This shows you the load time of every element of your page, but more importantly, under its 'Performance Grade' tab it offers a breakdown of where things could be better. Now I use a number of jQuery resources that are served by Google (and others), and I understand these should exist on many proxy servers. I'm not sure how to say that in my .htaccess file (although I have tried), and sure enough, Pingdom's analysis includes the following feedback:
The following publicly cacheable, compressible resources should have a
"Vary: Accept-Encoding" header:
•http://jmar777.googlecode.com/svn/trunk/js/jquery.easing.1.3.js
•http://kwicks.googlecode.com/svn/branches/v1.5.1/Kwicks/jquery.kwicks-1.5.1.pack.js
Well, I thought I'd done that, but then again, perhaps it's up to the servers that actually serve those resources to set those headers, and maybe there's nothing I can do about it? Is that so? Anyway, here is my .htaccess file at the moment. Please note I have the caching set insanely low because I am still experimenting with and learning it. I will adjust this up before I go live.
suPHP_ConfigPath /home/mydomain/public_html
<Files php.ini>
order allow,deny
deny from all
</Files>
<ifModule mod_deflate.c>
<filesMatch "\.(js|css|php|htm|html)$">
SetOutputFilter DEFLATE
</filesMatch>
</ifModule>
# 1 HOUR
<filesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf|htm|html)$">
Header set Cache-Control "max-age=3600, public"
</filesMatch>
# PHP - NO CACHING WANTED DUE TO USING SESSION COOKIES
<filesMatch "\.(php)$">
Header set Cache-Control "private"
</filesMatch>
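One caveat on the PHP rule above: Cache-Control: private only keeps shared proxies from caching; the browser itself may still reuse the response. If session-backed pages must never be served from any cache, a stronger variant (a sketch, assuming mod_headers) would be:

```apache
<FilesMatch "\.php$">
    # no-store forbids storing the response at all;
    # no-cache / must-revalidate force revalidation if it is stored anyway
    Header set Cache-Control "private, no-store, no-cache, must-revalidate"
</FilesMatch>
```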
# STORE BOTH COMPRESSED AND UNCOMPRESSED FILES FOR JS & CSS
<IfModule mod_headers.c>
<FilesMatch "\.(js|css|xml|gz)$">
Header append Vary Accept-Encoding
</FilesMatch>
</IfModule>
You can see I am trying to set 'Vary: Accept-Encoding' towards the end of the file, but I'm not sure if this is what's needed. How do I tell clients to fetch jQuery and the like from the proxies those files are undoubtedly stored at, and is there anything else I can do to make my .htaccess file deliver my content faster and more search-engine friendly?
Thank you for your thoughts.
Edit:
It seems my questions here were not clear enough so here goes with some clarification:
1) Is the jQuery library, hosted at Google, something whose proxy availability is somehow under the control of my .htaccess settings, because I make remote reference to it in my PHP? If so, how should I say, in my .htaccess file, 'please cache that library in a proxy for a year or so'?
2) How too should I specify that the Google-hosted files be provided compressed and uncompressed via 'Vary: Accept-Encoding'? At a guess I'd say both issues are under Google's control and not mine, so to make that absolutely explicit...
3) Are the compression choices and proxying of files like the jQuery library under my control or under (in this case) Google's?
4) Generally, is anything in my .htaccess file expressed in a sub-optimal (long winded) way and how could I shorten/compact it?
5) Is anything in the .htaccess file sequenced in a way that might cause problems - for example I refer to CSS under three separate rules - does the order matter?
(End of Edit).
Is the JQuery library, hosted at Google, something whose proxy availability is somehow under the control of my .htaccess settings, because I make remote reference to it in my PHP, and if so, how should I say, in my .htaccess file, 'please cache that library in a proxy for a year or so'?
This assertion is not correct. The browser decides whether to cache and whether to download based on the header exchange for that request only. So if a page involves requests to multiple sites, your .htaccess file(s) only influence how the browser caches your files; how it caches Google's is up to Google. For example, a request to http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js received the response headers:
Age:133810
Cache-Control:public, max-age=31536000
Date:Fri, 17 Feb 2012 21:52:27 GMT
Expires:Sat, 16 Feb 2013 21:52:27 GMT
Last-Modified:Wed, 23 Nov 2011 21:10:59 GMT
Browsers will normally cache for a year but may decide to revalidate on reuse:
If-Modified-Since:Wed, 23 Nov 2011 21:10:59 GMT
And in this case ajax.googleapis.com will reply with a 304 (Not Modified) and the following headers:
Age:133976
Date:Fri, 17 Feb 2012 21:52:27 GMT
Expires:Sat, 16 Feb 2013 21:52:27 GMT
This short request/response dialogue will typically take ~50 ms, since this content is CDN-delivered.
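The If-Modified-Since handshake described above can be sketched as server-side logic. This is a hypothetical, simplified handler for illustration, not Google's actual implementation:

```python
from email.utils import parsedate_to_datetime

def respond(resource_mtime, if_modified_since=None):
    """Hypothetical conditional-GET handler: returns (status, body)."""
    if if_modified_since is not None:
        client_time = parsedate_to_datetime(if_modified_since)
        if resource_mtime <= client_time:
            # Resource unchanged since the client's copy: headers only, no body
            return 304, b""
    # Changed (or unconditional request): send the full response
    return 200, b"...jquery.min.js contents..."

mtime = parsedate_to_datetime("Wed, 23 Nov 2011 21:10:59 GMT")
status, body = respond(mtime, if_modified_since="Wed, 23 Nov 2011 21:10:59 GMT")
# status is 304 and body is empty: only the short header exchange crosses the wire
```

The point is that even a 304 costs one full round-trip, which is exactly the ~50 ms (or 220 ms in the first question) the browser waits for.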
You might wish to rework your other supplemental questions in this light, since some don't apply.