I have written this in my .htaccess file to leverage browser caching:
# 480 weeks caching
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
Header set Cache-Control "max-age=290304000, public"
</FilesMatch>
It works well for files served from mydomain.com, but it does not affect the images served from the CDN URLs, which are actually subdomains such as static.mydomain.com.
How can I leverage browser caching for images served through CDN?
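For reference, if static.mydomain.com were itself served by Apache, I assume the same block would need to live in that subdomain's own .htaccess (this is an assumption on my part; the CDN may not be Apache at all):
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
Header set Cache-Control "max-age=290304000, public"
</FilesMatch>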
Are you using S3? You can add this header in the S3 console for resources you want to cache:
Cache-Control: max-age=999999
I am using the plain domain name URL to display the homepage, and in .htaccess I have
<FilesMatch "\.(html|htm)$">
Header set Cache-Control "max-age=3600, private, must-revalidate"
</FilesMatch>
to send the Cache-Control header for HTML files. Unfortunately, when just the domain is requested, the FilesMatch obviously does not match, but I need to send the cache header anyway. ExpiresByType is not a solution, as some files of type text/html need to stay uncached (for example, a PHP script sending text/html data). Sending the header from PHP is also not an option, as the homepage HTML is cached in the filesystem and served by Apache directly (via an .htaccess rule) without touching PHP. I am sure there must be some easy solution, but I am not able to figure it out right now. Thanks for your help.
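Update: one idea I am considering, assuming Apache 2.4 so the <If> expression syntax is available in .htaccess (this is only a sketch I have not verified):
# Send the cache header when the bare domain (the root URL) is requested
<If "%{REQUEST_URI} == '/'">
Header set Cache-Control "max-age=3600, private, must-revalidate"
</If>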
I used Google PageSpeed to check for ways to enhance the performance of my React site (created through CRA). The analysis shows that I need to "Serve static assets with an efficient cache policy", and it lists the files whose caching I can control. How could I do that?
I tried to set the Cache-Control header with max-age, but it only works on the files that are served from the back-end. Since the images, CSS, and JS are served from the build/static folder, I am unable to set a Cache-Control value for them.
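If the build folder ends up being served by Apache (an assumption; CRA itself does not dictate the web server), one sketch would be an .htaccess placed inside build/static. A long max-age is safe there because CRA fingerprints those filenames:
# Sketch: .htaccess inside build/static (assumes mod_headers is enabled)
<IfModule mod_headers.c>
Header set Cache-Control "max-age=31536000, public, immutable"
</IfModule>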
When doing website speed tests, I often see a comment along the lines of:
Serve static content from a cookieless domain
I want to ensure that data served from the static domain (files with the extensions JS, SVG, PNG, JPG, and CSS) does not set cookies.
Here is a portion of my .htaccess:
<FilesMatch "\.(js|svg|png|jpg|css)$">
<IfModule mod_headers.c>
Header unset Cookie
</IfModule>
</FilesMatch>
I set a CNAME record for static.example.com pointing to www.example.com.
How can I ensure that either all files served from the subdomain, or any files with those extensions, don't set cookies?
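For reference, here is a sketch of what I think it would need to look like, assuming mod_headers is available (Set-Cookie is the response header, Cookie the request header; I have not verified this):
<FilesMatch "\.(js|svg|png|jpg|css)$">
<IfModule mod_headers.c>
# Stop the server from setting cookies on these responses
Header unset Set-Cookie
# Strip any cookies the browser sends for these requests
RequestHeader unset Cookie
</IfModule>
</FilesMatch>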
I used Google PageSpeed Insights to test the performance of my Node.js website. For some external files it says to leverage browser caching, but I don't know how to do this.
Leverage browser caching
Setting an expiry date or a maximum age in the HTTP headers for static resources instructs the browser to load previously downloaded resources from local disk rather than over the network.
Leverage browser caching for the following cacheable resources:
http://maps.googleapis.com/…kwPPoBErK_--SlHZI28k6jjYLyU&sensor=false (30 minutes)
http://www.google-analytics.com/analytics.js (2 hours)
Can anyone please help me with this?
One solution is to reverse proxy the Google resources. Then you can add Cache-Control and other caching headers. If you're using Apache you can accomplish it as follows in your httpd.conf file:
# Only needed if outbound requests from your servers must go through an internal forward proxy
ProxyRemote http://www.google-analytics.com http://yourinternalproxy:yourport
<Location /analytics.js>
# Fetch analytics.js from Google and serve it from your own domain
ProxyPass http://www.google-analytics.com/analytics.js
ProxyPassReverse http://www.google-analytics.com/analytics.js
# Since the response now comes from your server, you control its caching headers
Header set Cache-Control "max-age=86400"
</Location>
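For this to work, the proxy and header modules have to be loaded; on a stock Apache build the LoadModule lines look roughly like this (module file paths vary by distribution):
LoadModule proxy_module modules/mod_proxy.so
LoadModule proxy_http_module modules/mod_proxy_http.so
LoadModule headers_module modules/mod_headers.so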
The drawbacks of this are that:
You'll funnel a lot of additional traffic through your servers.
Obviously, updates made by Google will take longer to appear for the users of your site.
If you don't have access to the httpd.conf file, as rudolfv's answer requires, there are several options here:
the easiest one is to copy its content each day to make sure you're up to date
you can employ the power of cron; there is a nice sample script using PHP posted here
use a PHP script to fetch and serve the Google Analytics script on the fly on every request:
// Serve a copy of analytics.js from your own domain with your own cache headers
header('Content-Type: text/javascript');
header('Cache-Control: max-age=86400, public');
echo file_get_contents("http://www.google-analytics.com/analytics.js");
use the power of .htaccess, if your hosting provider allows mod_headers and mod_proxy:
RewriteEngine On
# Note: in .htaccess this Header directive applies to every response from this directory
Header set Cache-Control "max-age=86400"
# The [P] flag proxies the request to Google (requires mod_proxy)
RewriteRule ^js/analytics\.js$ http://www.google-analytics.com/analytics.js [P]
I tested my site using YSlow and I got Grade B in "Configure entity tags (ETags)".
I tried the directives below in my .htaccess, and my site's ETags are removed, but not those of JS included from a CDN, such as validate.min.js:
Header unset Pragma
FileETag None
Header unset ETag
How can I configure ETags for the Validate plugin served from the CDN?
It may be a possible duplicate of "How to off Etag with htaccess?", except that here the problem is with JS included from a CDN.
I believe the answer is: you can't. Configurations like the ETags can only be controlled by the host, in this case the CDN.
I think it's safe to not worry about this for your site. Loading that JS from a CDN is already a win, and this CDN is correctly supporting top performance rules like minification, gzip compression, and future expiration dates.