Creating a great .htaccess file that handles shared resources well - .htaccess

I am trying to create an ideal .htaccess file that ticks all the boxes: taking advantage of server compression, but only where it makes sense, and caching resources on public proxies, again where it makes sense. I have been feeling my way through the process and I think I am pretty much there, but I suspect there is a bit of finessing left to do, so I thought I'd invite suggestions. I suspect it isn't quite there yet because of a great tool I have discovered, and I have to share that with you to begin with.
www.pingdom.com has a great suite of website analysis tools, many of which are free to use; personally I think the best is http://tools.pingdom.com/fpt/. This shows you the load time of every element of your page, but more importantly, under its 'Performance Grade' tab it offers a breakdown of where things could be better. Now, I use a number of jQuery resources that are served by Google (and others), and I understand these should exist on many proxy servers. I'm not sure how to say that in my .htaccess file (although I have tried), and sure enough, Pingdom's analysis includes the following feedback:
The following publicly cacheable, compressible resources should have a
"Vary: Accept-Encoding" header:
•http://jmar777.googlecode.com/svn/trunk/js/jquery.easing.1.3.js
•http://kwicks.googlecode.com/svn/branches/v1.5.1/Kwicks/jquery.kwicks-1.5.1.pack.js
Well, I thought I'd done that, but then again, perhaps it's up to the servers that actually serve those resources to set those headers, and maybe there's nothing I can do about it? Is that so? Anyway, here is my .htaccess file at the moment. Please note I have the caching set insanely low because I am still experimenting with and learning about it; I will adjust it up before I go live.
suPHP_ConfigPath /home/mydomain/public_html
<Files php.ini>
order allow,deny
deny from all
</Files>
<ifModule mod_deflate.c>
<filesMatch "\.(js|css|php|htm|html)$">
SetOutputFilter DEFLATE
</filesMatch>
</ifModule>
# 1 HOUR
<filesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf|htm|html|)$">
Header set Cache-Control "max-age=3600, public"
</filesMatch>
# PHP - NO CACHING WANTED DUE TO USING SESSION COOKIES
<filesMatch "\.(php)$">
Header set Cache-Control "private"
</filesMatch>
# STORE BOTH COMPRESSED AND UNCOMPRESSED FILES FOR JS & CSS
<IfModule mod_headers.c>
<FilesMatch "\.(js|css|xml|gz)$">
Header append Vary Accept-Encoding
</FilesMatch>
</IfModule>
You can see I am trying to do a 'Vary: Accept-Encoding' towards the end of the file, but I'm not sure if this is what's needed. How do I tell clients to fetch jQuery and the like from the proxies those files are undoubtedly stored at, and is there anything else I can do to make my .htaccess file deliver my search-engine-friendly content faster?
Thank you for your thoughts.
Edit:
It seems my questions here were not clear enough so here goes with some clarification:
1) Is the jQuery library, hosted at Google, something whose proxy availability is somehow under the control of my .htaccess settings (because I make remote reference to it in my PHP), and if so, how should I say, in my .htaccess file, 'please cache that library in a proxy for a year or so'?
2) How, too, should I specify that the Google-hosted files should be provided compressed and uncompressed via 'Vary: Accept-Encoding'? At a guess I'd say both issues are under Google's control and not mine, so to make that absolutely explicit...
3) Are the compression choices and proxy availability of files like the jQuery library under my control, or under (in this case) Google's?
4) Generally, is anything in my .htaccess file expressed in a sub-optimal (long-winded) way, and how could I shorten/compact it?
5) Is anything in the .htaccess file sequenced in a way that might cause problems? For example, I refer to CSS under three separate rules - does the order matter?
(End of Edit).

Is the jQuery library, hosted at Google, something whose proxy availability is somehow under the control of my .htaccess settings (because I make remote reference to it in my PHP), and if so, how should I say, in my .htaccess file, 'please cache that library in a proxy for a year or so'?
This assertion is not correct. The browser decides whether to cache and whether to download based on the header exchange for that request only. So if rendering a page involves requests to multiple sites, your .htaccess file(s) only influence how it caches your files; how it caches Google's is up to Google to decide. So, for example, a request to http://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js received the response headers:
Age:133810
Cache-Control:public, max-age=31536000
Date:Fri, 17 Feb 2012 21:52:27 GMT
Expires:Sat, 16 Feb 2013 21:52:27 GMT
Last-Modified:Wed, 23 Nov 2011 21:10:59 GMT
Browsers will normally cache for a year but may decide to revalidate on reuse:
If-Modified-Since:Wed, 23 Nov 2011 21:10:59 GMT
And in this case ajax.googleapis.com will reply with a 304 Not Modified and the following headers:
Age:133976
Date:Fri, 17 Feb 2012 21:52:27 GMT
Expires:Sat, 16 Feb 2013 21:52:27 GMT
This short request/response exchange will typically take ~50 ms, since this content is CDN-delivered.
You might wish to rework your other supplemental questions in this light, since some don't apply.
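For files that you host yourself, though, the equivalent headers are entirely under your control. As a rough sketch only - it assumes mod_headers is enabled and that you rename or version the files when they change, so a long lifetime is safe - you could mirror the Google-style headers for your own js/css like this:
<IfModule mod_headers.c>
<FilesMatch "\.(js|css)$">
# Long-lived, proxy-friendly caching, plus a variant per Accept-Encoding value
Header set Cache-Control "public, max-age=31536000"
Header append Vary Accept-Encoding
</FilesMatch>
</IfModule>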

Related

Varnish and WordPress: is real caching possible without an external plugin?

Maybe it sounds like a novice question in the Varnish Cache world, but why does WordPress seem to need an external cache plugin installed in order to be fully cached?
Websites are loaded correctly via Varnish; a curl -I command returns:
HTTP/1.1 200 OK
Server: nginx/1.11.12
Date: Thu, 11 Oct 2018 09:39:07 GMT
Content-Type: text/html; charset=UTF-8
Connection: keep-alive
Vary: Accept-Encoding
Cache-Control: max-age=0, public
Expires: Thu, 11 Oct 2018 09:39:07 GMT
Vary: Accept-Encoding
X-Varnish: 19575855
Age: 0
Via: 1.1 varnish-v4
X-Cache: MISS
Accept-Ranges: bytes
Pragma: public
Cache-Control: public
Vary: Accept-Encoding
With this configuration, by default WordPress installations are not being cached.
After testing multiple cache plugins - some not working, or not working without complex configuration - I found Swift Performance. In its Lite version, simply activating the Cache option really takes full advantage of Varnish, and I can see Varnish working fully, with very good results in stress tests.
This could be OK for a single site in a single environment, but in shared hosting terms, where every customer can have their own WP (or other CMS) installation, it could be a problem.
So is the key point that there is no way to take full caching advantage of Varnish without installing complex third-party caching plugins? Why not cache everything by default?
Any suggestions and help will be highly welcome; thanks in advance.
With this configuration, by default WordPress installations are not being cached
By default, if you don't change anything in either the WordPress or the Varnish configuration, the two work together in such a way that WordPress pages are cached for 120 seconds. So real caching is possible, but it will be a short-lived and highly ineffective cache.
Your specific headers indicate that no caching should happen. They are either sent by Varnish itself (we're all guilty of copy-pasting configuration without thinking about what it does) or by a WordPress plugin (more often a bad one than a good one). Without knowing your specific configuration, it's hard to decipher anything.
Varnish is a transparent HTTP caching proxy, which means that, by default, it just uses the HTTP headers sent by the backend (WordPress), like Cache-Control, to decide whether a resource can be cached and for how long.
WordPress, in fact, does not send cache-related headers other than in a few specific areas (error pages, login POST submissions, etc.).
The standard approach is to configure Varnish with a high TTL. With that:
Varnish has no idea when you update an article's contents or change the theme. The typical solution to this lies in a cache invalidation plugin like Varnish HTTP Purge.
The plugin requirement comes from the necessity to purge the cache when content changes.
Suppose that you update a WordPress page's text. That same page was previously visited, and it went into the Varnish cache for storage. Upon the next visit, Varnish will serve the same, now stale, content to all subsequent visitors.
The WordPress plugins for Varnish, like Varnish HTTP Purge, hook into WordPress in such a way that they instruct Varnish to clear the cache when pages are updated. This is their primary purpose.
That kind of approach (a high TTL plus cache purging) is the de facto standard with Varnish. As Varnish has no information about when you update content, the responsibility for purging the cache lies with the application itself. The cache purging feature is either bundled into the CMS code itself (Magento 2, for example, has it out of the box, without any extra plugins) or provided by a WordPress plugin.
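To make the header-driven side of this concrete: if the WordPress backend runs on Apache, a sketch along the following lines (purely illustrative - it assumes mod_headers is enabled, that a purge plugin such as Varnish HTTP Purge handles invalidation, and that a real setup would exclude wp-admin and logged-in traffic) would emit a Cache-Control header Varnish can use to pick its TTL:
<IfModule mod_headers.c>
# s-maxage is read by shared caches such as Varnish (keep pages for a day),
# while max-age keeps browser copies short-lived (5 minutes)
Header set Cache-Control "public, max-age=300, s-maxage=86400"
</IfModule>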

header `Vary` - need better explanation/tutorial

I've spent time reading about the Vary header, and I think it is not well documented or completely explained anywhere on the internet. I will list below what I've understood; please correct me where I am wrong.
1) Does Vary instruct the server what to do? Or does the server not care about it, and is it just a signal understood by browsers?
2) If I've enabled compression and caching on my site, does my site then always return only the compressed version of a page?
3) If a browser with compression disabled visits my site (and I have compression enabled), what happens? I've read about this but can't understand it well. What happens a) if I've set Vary: Accept-Encoding; b) if I haven't?
4) In the same situation, if my site saves cached copies of pages and a browser with compression disabled visits it, what happens a) if I've set Vary: Accept-Encoding; b) if I haven't?
5) When I have compression enabled on my website and I've set the Vary header, does it tell proxies that they should keep two versions of my page: compressed (for browsers that support compression) and uncompressed (for browsers that don't)? How do proxies get both versions of a page if my site only ever generates the compressed version?
6) What do proxies do if I have compression enabled on my site but haven't set the Vary header? Do they keep only one version (uncompressed) of my page and serve that page to all clients?
7) I have seen that a Vary value of 'Accept-Encoding, Cookie' caused caching not to work, and that after removing the word Cookie it started caching. Why?
8) I had another question too, but I've forgotten it at the moment...
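For reference, the Apache configuration these questions revolve around is quite small. A minimal sketch - assuming mod_deflate and mod_headers are enabled - that produces both the compressed responses and the Vary header looks like this:
<IfModule mod_deflate.c>
# Compress text responses for clients that send Accept-Encoding: gzip
AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
<IfModule mod_headers.c>
# Ask caches/proxies to store a separate variant per Accept-Encoding value
Header append Vary Accept-Encoding
</IfModule>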

Cached resource blocking network stack

I'm caching some JS resources on my website, such as require.js, but it seems almost worthless, since the browser goes back to the server anyway to ask whether the resource has changed:
In the trace I grabbed from Chrome's network analyzer, you can see the resource didn't change (304), so it isn't transferred; however, the browser still waited for that HTTP response. Since my server is very far from my workstation, the 220 ms corresponds to my average ping to it.
How can I improve this? Is there a way to tell the browser to always assume the NOT MODIFIED status and not go check the server?
Thank you
Answers to comments
Caching header:
I've got an htaccess some levels above that directory containing the following info:
<IfModule mod_expires.c>
ExpiresActive On
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
ExpiresDefault "access plus 1 week"
</FilesMatch>
</IfModule>
My php page itself contains
<meta HTTP-EQUIV="cache-control" CONTENT="NO-CACHE">
which I suddenly hope ONLY affects the PHP file, not its children...
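One way to cut out that revalidation round-trip - a sketch only, assuming mod_headers is enabled and that file names change whenever their content does - is to send an explicit max-age (optionally with the immutable extension) so the browser reuses its local copy without asking the server:
<IfModule mod_headers.c>
<FilesMatch "\.(js|css)$">
# Reuse the cached copy for a week without a conditional request;
# 'immutable' is a hint some browsers honour to skip revalidation entirely
Header set Cache-Control "max-age=604800, immutable"
</FilesMatch>
</IfModule>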

How to turn off ETags for a JS file from a CDN using .htaccess?

I tested my site using YSlow and got Grade B for the 'Configure entity tags' rule.
I tried the directives below in my .htaccess, and my site's ETags are removed, but not for JS included from a CDN, such as validate.min.js:
Header unset Pragma
FileETag None
Header unset ETag
How can I configure ETags for the Validate plugin's JS served from the CDN?
It may be a possible duplicate of 'How to off Etag with htaccess?', except that here the problem is with JS included from a CDN.
I believe the answer is: you can't. Settings like ETags can only be controlled by the host, in this case the CDN.
I think it's safe not to worry about this for your site. Loading that JS from a CDN is already a win, and this CDN correctly supports top performance rules like minification, gzip compression, and far-future expiration dates.
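For the files you do host yourself, the directives you already tried are the right ones; commented and wrapped defensively (assuming mod_headers is available), they might look like this:
FileETag None
<IfModule mod_headers.c>
# Strip any ETag or Pragma headers that are still being set
Header unset ETag
Header unset Pragma
</IfModule>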

cache control not working properly

I've placed a simple cache control in my .htaccess file:
#cache css and javascript files for one week
<FilesMatch ".(js|css)$">
Header set Cache-Control "max-age=604800"
</FilesMatch>
When I test the desktop site with Google's PageSpeed Insights tester (https://developers.google.com/speed/pagespeed/insights), it shows the JavaScript and images are being cached properly. However, when I test my mobile website, the caching isn't working. My .htaccess file is contained in the public_html directory alongside all my desktop files (i.e. public_html/index.html, public_html/images/, public_html/css/, public_html/.htaccess, etc.). My mobile site is contained here: public_html/mobile/.
Would I need to add a second .htaccess file to the mobile directory to make it work?
Thanks.
The best option is to use the .htaccess file from HTML5 Boilerplate. It is highly optimised for caching, gzip and cross-domain Ajax, plus a lot of other features.
Also, do check whether mod_deflate is enabled.
You don't need any additional .htaccess file; just use a single file in the root of your site - its rules also apply to subdirectories such as /mobile/.
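This isn't the actual Boilerplate file, but the kind of per-type expiry rules it bundles looks roughly like the sketch below (assuming mod_expires is enabled):
<IfModule mod_expires.c>
ExpiresActive On
# Sensible default, with HTML always revalidated and static assets long-lived
ExpiresDefault "access plus 1 month"
ExpiresByType text/html "access plus 0 seconds"
ExpiresByType text/css "access plus 1 year"
ExpiresByType application/javascript "access plus 1 year"
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
</IfModule>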
