I am using a plain domain-name URL to display the homepage, and in .htaccess I have
<FilesMatch "\.(html|htm)$">
Header set Cache-Control "max-age=3600, private, must-revalidate"
</FilesMatch>
to send the Cache-Control header for HTML files. Unfortunately, when just the domain is requested, the FilesMatch obviously does not match, but I need to send the cache header anyway. ExpiresByType is not a solution, because some files of type text/html need to stay uncached (for example, a PHP script sending text/html data). Sending the header from PHP is also not an option, since the homepage HTML is cached in the filesystem and served by Apache directly (via an .htaccess rule) without touching PHP. I am sure there must be some easy solution, but I am not able to figure it out right now. Thanks for the help.
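One approach (a sketch, not verified against this exact setup) is to match on the request URI instead of the filename: mod_setenvif can flag the bare-domain request, and mod_headers can attach the header only when that flag is set. The HOMEPAGE_HTML variable name is my own invention:

```apache
# Assumes mod_setenvif and mod_headers are enabled.
# Flag requests for the site root (the bare domain / trailing slash).
SetEnvIf Request_URI "^/$" HOMEPAGE_HTML

# Send the same Cache-Control as for .html files, but only on flagged requests.
Header set Cache-Control "max-age=3600, private, must-revalidate" env=HOMEPAGE_HTML
```

Because the condition is on the request URI rather than the file the subrequest resolves to, it fires even when mod_dir serves the directory index.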
I am trying to improve my PageSpeed score.
The following file is a bit annoying right now:
https://example.com/cdn-cgi/scripts/d07b1474/cloudflare-static/email-decode.min.js
This is Cloudflare's email-decode script; I want to set the expiry for it with either .htaccess, the virtual host, or mod_pagespeed.
This is what I have attempted, with no luck (very possibly my regex is just wrong):
# speed up cloudflare #
<FilesMatch "email\\-decode\\.min\\.js$">
ExpiresByType application/x-javascript A31536000
ExpiresByType application/javascript A31536000
</FilesMatch>
# end speed up cloudflare #
That is in my .htaccess file.
This is the warning pagespeed gives me:
Setting an expiry date or a maximum age in the HTTP headers for static
resources instructs the browser to load previously downloaded
resources from local disk rather than over the network.
That file is being served by Cloudflare, not your origin, so there is no way to change its headers from your server. You could disable Email Address Obfuscation in the Scrape Shield section of the Cloudflare dashboard if you don't have email addresses on your site.
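For static files that your origin does serve, the attempted rule would also need ExpiresActive On, and the hyphen in the pattern should not be double-escaped. A corrected sketch, assuming mod_expires is available:

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  <FilesMatch "email-decode\.min\.js$">
    # One year from the time of access (equivalent to A31536000).
    ExpiresDefault "access plus 1 year"
  </FilesMatch>
</IfModule>
```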
So I am using the following rule in the htaccess:
AddType SCHM wsc
<FilesMatch "\.(wsc)$">
ForceType SCHM
Header set Content-Disposition attachment
</FilesMatch>
But when I go to the file's location it doesn't force the download
Since the question is already answered in the comments, this is just to provide an answer the way Stack Overflow intends.
As noted in the question, this can be solved using Apache 2's mod_headers. Since Content-Disposition is not part of the HTTP standard itself, you may want to add another header to achieve your objective:
<FilesMatch "\.(wsc)$">
Header set Content-Type application/octet-stream
Header set Content-Disposition attachment
</FilesMatch>
Another thing to consider is that your browser may cache the server's response. The browser will still send the request, but it will include an If-Modified-Since header telling the server it already has the file from a given date. If the file hasn't changed since that date, the server answers 304 Not Modified and does not send the new headers to your browser. This means that after changing the .htaccess you may not see any impact until you disable caching in your browser or change the file's timestamp.
You can also add
Header set X-Content-Type-Options "nosniff"
for better compatibility (and arguably security). It prevents the browser from doing MIME-type sniffing, which would otherwise override the declared Content-Type.
RFC 2616 says in section 19.5.1 about Content-Disposition:
If this header is used in a response with the application/octet-stream content-type, the implied suggestion is that the user agent should not display the response, but directly enter a `save response as...' dialog.
http://www.w3.org/Protocols/rfc2616/rfc2616-sec19.html#sec19.5.1
When doing website speed tests, I often see a comment along the lines of:
Serve static content from a cookieless domain
I want to ensure that data served from the static domain - files with extension JS, SVG, PNG, JPG and CSS - do not set cookies.
Here is a portion of my .htaccess:
<FilesMatch "\.(js|svg|png|jpg|css)$">
<IfModule mod_headers.c>
Header unset Cookie
</IfModule>
</FilesMatch>
I set up a CNAME record pointing static.example.com to www.example.com.
How can I ensure that either all files served from the subdomain, or any files with those extensions, don't set cookies?
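Cookies are set by the Set-Cookie response header, so targeting that header (rather than the request's Cookie header) is the relevant direction; a sketch, assuming mod_headers is enabled on the static subdomain's host:

```apache
# Strip any Set-Cookie the application attaches to static responses.
<IfModule mod_headers.c>
  <FilesMatch "\.(js|svg|png|jpg|css)$">
    Header unset Set-Cookie
  </FilesMatch>
</IfModule>
```

Note this only stops the server from setting cookies; browsers will still send any cookies already scoped to the parent domain (e.g. .example.com) with requests to the subdomain, so the domain is only truly cookieless if no such wide-scoped cookies exist.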
I'm caching some JS resources on my website, such as the file require.js, but it seems almost worthless, since the browser goes to the server anyway to ask whether the resource has changed:
In the picture I grabbed from the Chrome network panel you can see the resource didn't change (304), so it isn't transferred; however, the browser still waited for that HTTP response. Since my server is very far from my workstation, the 220 ms corresponds to my average ping to it.
How can I improve this? Is there a way to tell the browser to always assume a 304 Not Modified status and not check with the server?
Thank you
Answers to comments
Caching header:
I've got an htaccess some levels above that directory containing the following info:
<IfModule mod_expires.c>
ExpiresActive On
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
ExpiresDefault "access plus 1 week"
</FilesMatch>
</IfModule>
My php page itself contains
<meta HTTP-EQUIV="cache-control" CONTENT="NO-CACHE">
which I sincerely hope affects ONLY the PHP file, not its children...
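To stop the browser from revalidating at all, the responses need a freshness lifetime the browser will trust without asking. A sketch using mod_headers (the immutable token is an assumption here: it is comparatively recent and not honored by every browser, and it is only safe with versioned/fingerprinted asset filenames):

```apache
# One year of freshness; 'immutable' tells supporting browsers
# never to revalidate within that window.
<FilesMatch "\.(js|css)$">
  Header set Cache-Control "max-age=31536000, public, immutable"
</FilesMatch>
```

Within max-age the browser serves the file straight from its cache with no conditional request, which eliminates the 220 ms round trip entirely.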
I have written this in my .htaccess file to leverage browser caching:
# 480 weeks caching
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
Header set Cache-Control "max-age=290304000, public"
</FilesMatch>
It works well for files served from the mydomain.com URL, but it does not affect images served from the CDN URLs, which are actually subdomains such as static.mydomain.com.
How can I leverage browser caching for images served through the CDN?
Are you using S3? You can add this header in the S3 console for resources you want to cache:
Cache-Control: max-age=999999
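If static.mydomain.com is instead served by the same Apache origin (an assumption about this setup), the rule from the main domain simply has to be repeated where that subdomain is served, e.g. in its virtual host or its docroot's .htaccess:

```apache
# Same caching rule, placed in the static subdomain's configuration.
<IfModule mod_headers.c>
  <FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
    Header set Cache-Control "max-age=290304000, public"
  </FilesMatch>
</IfModule>
```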