Htaccess Concatenation - .htaccess

I use this code to concatenate CSS and JavaScript:
<FilesMatch "\.combined\.js$">
Options +Includes
AddOutputFilterByType INCLUDES application/javascript application/json
SetOutputFilter INCLUDES
</FilesMatch>
<FilesMatch "\.combined\.css$">
Options +Includes
AddOutputFilterByType INCLUDES text/css
SetOutputFilter INCLUDES
</FilesMatch>
but it doesn't work.

So you're using
<!--#include file="path/to/a/file.js" -->
<!--#include file="path/to/another/file.js" -->
in a master JS or CSS file, right?
This is what is being enabled by the .htaccess code posted in your question.

I also used the same code for concatenating my JS and CSS files. It works perfectly on my local machine, but it doesn't work on the live server. I am using an Ubuntu Linux server.
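A common cause is that mod_include isn't loaded on the live server, or that AllowOverride there doesn't permit the Options directive in .htaccess. A minimal diagnostic sketch, assuming you can edit the same .htaccess on both machines: wrap the directives in an IfModule guard so a missing module fails silently instead of returning a 500 error.
# Sketch: guard the SSI directives behind mod_include. If this version
# serves .combined.js unprocessed on the live server (while the unguarded
# version causes a 500 error there), mod_include is probably not loaded.
<IfModule mod_include.c>
Options +Includes
<FilesMatch "\.combined\.js$">
SetOutputFilter INCLUDES
</FilesMatch>
</IfModule>
If the guarded version works, ask your host to enable mod_include, or check that the directory's AllowOverride setting includes Options.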

Related

Responding to google page speed suggestion regarding compression

Google page speed tool tells me this: "Compressing resources with gzip or deflate can reduce the number of bytes sent over the network"
and of course lists all my .js and .css files.
Researching here eventually led me to this question:
How to Specify "Vary: Accept-Encoding" header in .htaccess
It seems to say that for just .js and .css files, all I would need to do is this:
<IfModule mod_deflate.c>
#The following line is enough for .js and .css
AddOutputFilter DEFLATE js css
</IfModule>
<IfModule mod_headers.c>
<FilesMatch "\.(js|css)$">
Header append Vary: Accept-Encoding
</FilesMatch>
</IfModule>
Can someone confirm that this is the current "best practice" for this objective and that it is failsafe, assuming the user is on a modern browser (e.g. nothing older than IE7)?
Thanks!

Exclude a single file from DEFLATE in .htaccess

I have set my .htaccess file to cache and deflate most of the usual file types to increase speed. One file in particular, though, seems to behave oddly when cached, and I want to exclude it from any deflate and caching commands in .htaccess to see if that is the cause.
Because my site is fairly busy, it does not make sense to turn caching off for all files and slow every user down while I check this over a couple of days. So I was wondering:
Is there a line I can put in my .htaccess that specifically excludes a particular file (engine.js, for example)?
regards
You could try something like this:
<Files (whatever you have this set to deflate)>
SetOutputFilter DEFLATE
</Files>
<Files (filename).(extension)>
SetOutputFilter NONE
</Files>
For example, if you were trying to deflate the PHP files, except for one, your code would look like this:
<Files *.php>
SetOutputFilter DEFLATE
</Files>
<Files myfile.php>
SetOutputFilter NONE
</Files>
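If DEFLATE was enabled with AddOutputFilterByType rather than SetOutputFilter, another common approach is to set mod_deflate's special no-gzip environment variable for just that file. A sketch, assuming mod_setenvif is available:
# Sketch: exempt engine.js from compression via the no-gzip variable;
# dont-vary additionally keeps the Vary header off that response.
<IfModule mod_setenvif.c>
SetEnvIfNoCase Request_URI engine\.js$ no-gzip dont-vary
</IfModule>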

htaccess - How to force the client's browser to clear the cache?

For my site I have the following htaccess rules:
# BEGIN Gzip
<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE text/text text/html text/plain text/xml text/css application/x-javascript application/javascript
</IfModule>
# END Gzip
# BEGIN EXPIRES
<IfModule mod_expires.c>
ExpiresActive On
ExpiresDefault "access plus 10 days"
ExpiresByType text/css "access plus 1 month"
ExpiresByType text/plain "access plus 1 month"
ExpiresByType image/gif "access plus 1 month"
ExpiresByType image/png "access plus 1 month"
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType application/x-javascript "access plus 1 month"
ExpiresByType application/javascript "access plus 1 month"
ExpiresByType application/x-icon "access plus 1 year"
</IfModule>
# END EXPIRES
I've just updated my site and it looked all screwy until I cleared my cache. How can I force the client's browser to clear the cache after an update so that the user can see the changes?
You can force browsers to cache something, but
You can't force browsers to clear their cache.
Thus the only way (AFAIK) is to use a new URL for your resources - something like versioning.
As other answers have said, changing the URL is a good cache-busting technique; however, on a bigger site it is a lot of work to change all the URLs and also move the files.
A similar technique is to just add a version parameter to the URL string - either a random string/number or a version number - and target the changed files only.
For instance, if you change your site's CSS and it looks wonky until you do a force refresh, simply add ?ver=1.1 to the CSS import in the head of the file. To the browser this is a different file, but you only need to change the import, not the actual location or name of the file.
e.g:
<link href="assets/css/style.css" rel="stylesheet" type="text/css" />
becomes
<link href="assets/css/style.css?ver=1.1" rel="stylesheet" type="text/css" />
This works great for JavaScript files too.
I see your problem...
Although we cannot completely clear the client's browser cache, you can add some code to your application so that your recent changes are reflected in the client's browser.
In your <head>:
<meta http-equiv="Cache-Control" content="no-cache" />
<meta http-equiv="Pragma" content="no-cache" />
<meta http-equiv="Expires" content="0" />
You cannot force browsers to clear their cache.
Your .html file seems to be reloaded sooner because it expires after 10 days.
What you have to do is update your .html file and either move all your files to a new folder, such as version-2/, or append a version identifier to each file, such as mypicture-2.jpg. Then you reference these new files in your .html file, and the browser will load them again because the location changed.
You can tell the browser never to cache your site by pasting the following code in the header:
<meta http-equiv="Cache-Control" content="no-cache, no-store, must-revalidate" />
<meta http-equiv="Pragma" content="no-cache" />
<meta http-equiv="Expires" content="0" />
And to prevent JS and CSS caching, you could use a minification/obfuscation tool that generates a random file name every time. That would force the browser to reload those files from the server too.
Hopefully, that helps.
In my case, I change a specific JS file a lot, and I need all browsers that use it to have the latest version.
I do not have a specific version number for this file, so I simply hash the current date and time (hour and minute) and pass it as the version number:
<script src="/js/panel/app.js?v={{ substr(md5(date("Y-m-d_Hi")),10,18) }}"></script>
I need it to be re-fetched every minute, but you can decide how often it should be reloaded.
Adding 'random' numbers to URLs seems inelegant and expensive to me. It also spoils the URLs of the pages, which can end up looking like index.html?t=1614333283241; users will also have dozens of URLs cached for only one use each.
I think this kind of thing is what .htaccess files are meant to solve on the server side, between your functional code and the users.
I copy/pasted this code from here; it filters by file extension to force the browser not to cache those files. If you want to return to normal behavior, just delete it or comment it out.
Create or edit an .htaccess file in every folder where you want to prevent caching, then paste this code, changing the file extensions to your needs, or even matching one individual file.
If the file already exists on your host, be cautious when modifying it.
(kudos to the link)
# DISABLE CACHING
<IfModule mod_headers.c>
Header set Cache-Control "no-cache, no-store, must-revalidate"
Header set Pragma "no-cache"
Header set Expires 0
</IfModule>
<FilesMatch "\.(css|flv|gif|htm|html|ico|jpe|jpeg|jpg|js|mp3|mp4|png|pdf|swf|txt)$">
<IfModule mod_expires.c>
ExpiresActive Off
</IfModule>
<IfModule mod_headers.c>
FileETag None
Header unset ETag
Header unset Pragma
Header unset Cache-Control
Header unset Last-Modified
Header set Pragma "no-cache"
Header set Cache-Control "max-age=0, no-cache, no-store, must-revalidate"
Header set Expires "jue, 1 Jan 1970 00:00:00 GMT"
</IfModule>
</FilesMatch>
You can set "access plus 1 seconds", and that way it will refresh the next time the user enters the site. Keep that setting for one month, until previously cached copies have expired.
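A minimal sketch of that approach, assuming the stylesheet is what changed:
# Sketch: temporarily shorten the CSS lifetime so clients revalidate on
# their next visit; restore the longer value once old caches have expired.
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType text/css "access plus 1 seconds"
</IfModule>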
Now, the following won't help with files that are already cached, but moving forward you can use it to easily force a request to fetch something new, without changing the actual filename.
# Rewrite all requests for JS and CSS files to files of the same name,
# without any numbers in them. This lets the JS and CSS be forced out of
# cache easily by putting a number at the end of the filename,
# e.g. a request for static/js/site-52.js will get the file static/js/site.js instead.
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteRule ^static/(js|css)/([a-z]+)-([0-9]+)\.(js|css)$ /static/$1/$2.$4 [R=302,NC,L]
</IfModule>
Of course, the higher up in your folder structure you apply this approach, the more you kick out of cache with a single change.
So, for example, if you store your site's entire CSS and JavaScript under one main folder:
/assets/js
/assets/css
/assets/...
then you could start referencing it as "assets-XXX" in your HTML, and use a rule like this to kick all asset content out of cache:
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteRule ^assets-([a-z0-9]+)/(.*) /$2 [R=302,NC,L]
</IfModule>
Note that if you do go with this, once you have it working, change the 302 to a 301, and then caching will kick in. While it's a 302, it won't cache at the browser level because it's a temporary redirect. If you do it this way, you can bump the default expiry time up to 30 days for all assets, since you can easily kick things out of cache by simply changing the folder name in the login page.
<IfModule mod_expires.c>
ExpiresActive on
ExpiresDefault A2592000
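# A2592000 = "access plus 2592000 seconds", i.e. 30 days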
</IfModule>
The most straightforward approach is to add the file's modification time to the request,
e.g.
myfile.txt?2014-10-30-13:12:33
versioning by date.
Change the name of the .css file, load the page, and then rename the file back to the original. It works for me.
This worked for me.
look for this:
DirectoryIndex index.php
replace with this:
DirectoryIndex something.php index.php
Upload it and refresh the page. You will get a page error.
Then just change it back to:
DirectoryIndex index.php
Reupload it and refresh the page again. I checked this on all of my devices and it worked.
Use mod_rewrite with R=301, with an incremental version number:
To achieve css/ver/file.css => css/file.css?v=ver:
RewriteRule ^css/([0-9]+)/file.css$ css/file.css?v=$1 [R=301,L,QSA]
so, for example, css/10/file.css => css/file.css?v=10.
The same can be applied to js/ files (see the sketch below). Increment ver to force an update; the 301 forces a re-cache.
I have tested this across Chrome, Firefox, Opera, etc.
PS: the ?v=ver is just for readability; it is not what causes the refresh.
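A sketch of the equivalent rule for a JS file, assuming the same layout:
# Sketch: js/10/file.js => js/file.js?v=10
RewriteRule ^js/([0-9]+)/file.js$ js/file.js?v=$1 [R=301,L,QSA]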

How to Specify "Vary: Accept-Encoding" header in .htaccess

Google PageSpeed says I should "Specify a Vary: Accept-Encoding header" for JS and CSS. How do I do this in .htaccess?
I guess what is meant is that you should enable gzip compression for your CSS and JS files, because that enables the client to receive both gzip-encoded content and plain content.
This is how to do it in Apache 2:
<IfModule mod_deflate.c>
#The following line is enough for .js and .css
AddOutputFilter DEFLATE js css
#The following line also enables compression by file content type, for the following list of Content-Type:s
AddOutputFilterByType DEFLATE text/html text/plain text/xml application/xml
#The following lines are to avoid bugs with some browsers
BrowserMatch ^Mozilla/4 gzip-only-text/html
BrowserMatch ^Mozilla/4\.0[678] no-gzip
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
</IfModule>
And here's how to add the Vary Accept-Encoding header: [src]
<IfModule mod_headers.c>
<FilesMatch "\.(js|css|xml|gz)$">
Header append Vary: Accept-Encoding
</FilesMatch>
</IfModule>
The Vary: header tells caches that the content served for this URL will vary according to the value of a certain request header. Here it says that the server will serve different content to clients who send Accept-Encoding: gzip, deflate (a request header) than to clients that do not send this header. The main advantage of this, AFAIK, is to let intermediate caching proxies know they need to keep two different versions of the same URL because of such a change.
I'm afraid Aularon didn't provide enough steps to complete the process. With a little trial and error, I was able to successfully enable Gzipping on my dedicated WHM server.
Below are the steps:
Run EasyApache within WHM, select Deflate within the Exhaustive Options list, and rebuild the server.
Once done, go to Services Configuration >> Apache Configuration >> Include Editor >> Post VirtualHost Include, select All Versions, and then paste the mod_deflate.c and mod_headers.c code (listed above in Aularon's post) one after the other into the input field.
Once saved, I was seeing a 75.36% data savings on average! You can run a before-and-after test using this HTTP compression tool to see your own results: http://www.whatsmyip.org/http_compression/
Hope this works for you all!
Matt
To gzip your font files as well, add "x-font/otf x-font/ttf x-font/eot",
as in:
AddOutputFilterByType DEFLATE text/html text/plain text/xml application/xml x-font/otf x-font/ttf x-font/eot
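Note that those x-font/* types are non-standard; whether this works depends on the Content-Type your server actually sends for font files. A sketch using the MIME types registered today (an assumption - check your server's actual response headers first):
# Sketch: compress fonts under their registered MIME types.
# WOFF/WOFF2 are already compressed and should not be listed here.
AddOutputFilterByType DEFLATE font/ttf font/otf application/vnd.ms-fontobject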
This was driving me crazy, but it seems that aularon's edit was missing the colon after "Vary". So changing "Vary Accept-Encoding" to "Vary: Accept-Encoding" fixed the issue for me.
I would have commented below the post, but it doesn't seem like it will let me.
Anyhow, I hope this saves someone the same trouble I was having.
I spent many hours figuring this out. Please read this post to get the advanced .htaccess code and learn what it does.
You can use:
Header append Vary "Accept-Encoding"
#or
Header set Vary "Accept-Encoding"
If anyone needs this for an NGINX configuration file, here is the snippet:
location ~* \.(js|css|xml|gz)$ {
add_header Vary "Accept-Encoding";
(... other headers or rules ...)
}
There is no need to specify, or even check, whether the file is compressed; you can send this header for every file, on every request.
It tells downstream proxies how to match future request headers to decide
whether the cached response can be used rather than requesting a fresh
one from the origin server.
<ifModule mod_headers.c>
Header unset Vary
Header set Vary "Accept-Encoding, X-HTTP-Method-Override, X-Forwarded-For, Remote-Address, X-Real-IP, X-Forwarded-Proto, X-Forwarded-Host, X-Forwarded-Port, X-Forwarded-Server"
</ifModule>
The unset is optional; it is there to work around some bugs in older GoDaddy hosting.

Safari seems to ignore cache settings when user hits back with mouse or keyboard

I have a page where once a button is clicked, it is replaced by an Ajax spinner whilst the user waits for the next page to load.
I am controlling (or attempting to control) caching using .htaccess. If the user hits back (browser button, mouse button, Alt+Left, Backspace), the page needs to be reloaded from the cache. IE 6-8 and Chrome were all fine with this. Firefox wasn't working for a while and recently started to work, but the problem seems to remain in Safari. This seems a little odd, because I'd have expected Safari and Chrome to behave in the same way.
This is my .htaccess file:
# Add Proper MIME-Type for Favicon
AddType image/x-icon .ico
<IfModule mod_expires.c>
ExpiresActive On
# ExpiresDefault A2630000
ExpiresByType image/x-icon A2630000
ExpiresByType image/gif A2630000
ExpiresByType image/jpeg A2630000
ExpiresByType image/png A2630000
ExpiresByType application/x-javascript M2630000
ExpiresByType application/javascript M2630000
ExpiresByType text/css M2630000
</IfModule>
<IfModule mod_headers.c>
Header set Cache-Control "public"
<FilesMatch "\.php$">
Header set Cache-Control "no-store, no-cache, must-revalidate, max-age=0, post-check=0, pre-check=0"
</FilesMatch>
</IfModule>
SetOutputFilter DEFLATE
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/javascript
I've tried it with and without the Cache-Control public line.
I've also tried adding:
AddType application/x-httpd-php .php
with:
ExpiresByType application/x-httpd-php A0
To no avail.
Am I missing something obvious?
Edit: I don't think it's anything to do with the cache settings.
I've tried adding this to the PHP itself:
#safari test
if (strstr($_SERVER['HTTP_USER_AGENT'],'Safari')){
header("Cache-Control: no-cache, must-revalidate"); // HTTP/1.1
header("Expires: Sat, 26 Jul 1997 05:00:00 GMT"); // Date in the past
}
And even without those lines, I can see in "Inspect Element" > Resources that the right headers are being received. The problem seems to be what Safari does with them in its bid to be "the fastest browser". It would appear that it explicitly ignores what site developers specify - this sounds like IE's original mindset from back in the day when tables were used for layouts.
The HTTP spec explicitly says that recalling a page from history (back/forward button) does not have to involve cache validation, so this seems to be valid behaviour.
This behavior is caused by the back-forward cache (bfcache). You can tap into the onpageshow event to find out when the user navigates with the back button. Look for the property called persisted: it is set to false on the initial page load, but when the page comes from the bfcache it is set to true.
You can then force a page reload with JavaScript:
window.onpageshow = function(event) {
if (event.persisted) {
window.location.reload(true)
}
};
or if you are using jQuery:
$(window).bind("pageshow", function(event) {
if (event.originalEvent.persisted) {
window.location.reload(true)
}
});
I had a similar problem with Firefox, but modifying the .htaccess with the code...
<IfModule mod_headers.c>
Header set Cache-Control "public"
<FilesMatch "\.php$">
Header set Cache-Control "no-store, no-cache, must-revalidate, max-age=0, post-check=0, pre-check=0"
</FilesMatch>
</IfModule>
...that, ironically, you already have in your code, fixed it for me. I'm not sure about the Safari problem, though.
Try adding this as a meta tag in your HTML:
<meta http-equiv="cache-control" content="no-cache, no store, must-revalidate"/>
Is Safari giving you a 304 while Chrome does not? If so, I suspect you are using Apache 2.0, and upgrading to Apache 2.2 solves this issue.
