I'm unsure whether the following is correct considering I can't find a web example that targets a file inside a directory.
<FilesMatch "/out/index.php$">
Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>
Is it OK that way, or like this:
<FilesMatch "\out\index.php$">
Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>
The first one is right, as Apache, like Unix systems, uses forward slashes in paths. You should also escape the dot so it matches a literal dot. So the correct version is:
<FilesMatch "/out/index\.php$">
Header set X-Robots-Tag "noindex, nofollow"
</FilesMatch>
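Side note (a hedged alternative, not part of the original question): if the rule ever needs to key on the directory part of the URL rather than just the file name, the same header can be attached through an environment variable set from the request URI, using the SetEnvIf technique that appears further down. OUT_NOINDEX is just a placeholder name:
SetEnvIf Request_URI "/out/index\.php$" OUT_NOINDEX
Header set X-Robots-Tag "noindex, nofollow" env=OUT_NOINDEX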
I have three environments:
env.com
env-uat.com
env-pre.com
All three sites run the same code. I want env-uat.com and env-pre.com to both get this in the .htaccess:
Header set X-Robots-Tag "noindex, nofollow"
This will effectively deindex those sites completely, including PDF files etc., but I don't want to affect env.com.
How can I make the Header X-Robots-Tag only be added for env-uat.com and env-pre.com and NOT env.com?
** UPDATE **
From what I could find so far, it would seem you can only do something like this:
SetEnvIf Request_URI "^/privacy-policy" NOINDEXFOLLOW
Header set X-Robots-Tag "noindex, follow" env=REDIRECT_NOINDEXFOLLOW
But this makes it specific to a PAGE. I want it specific to a DOMAIN.
@Starkeen was right up to this point:
SetEnvIf host ^(env-uat\.com|host2\.com)$ NOINDEXFOLLOW
So you could include the domains that you want covered by this env variable, like this:
SetEnvIf host ^(env-uat|env-pre)\.com NOINDEXFOLLOW
Then you should attach the env variable with the same name, like this:
Header set X-Robots-Tag "noindex, follow" env=NOINDEXFOLLOW
Not like this:
Header set X-Robots-Tag "noindex, follow" env=REDIRECT_NOINDEXFOLLOW
The line above looks for an env variable named REDIRECT_NOINDEXFOLLOW, not NOINDEXFOLLOW. That is a different case from the question X Robots Tag noindex specific page, which was about matching against Request_URI in a special case where the variable picks up the REDIRECT_ prefix after an internal redirect.
So the code should look like this:
SetEnvIf host ^(env-uat|env-pre)\.com NOINDEXFOLLOW
Header set X-Robots-Tag "noindex, follow" env=NOINDEXFOLLOW
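As an extra, hedged refinement (an assumption about this setup, not something asked in the question): Host values are case-insensitive and can occasionally arrive with a port number, so you could anchor the pattern and use the case-insensitive form of the directive:
SetEnvIfNoCase Host "^(env-uat|env-pre)\.com(:[0-9]+)?$" NOINDEXFOLLOW
Header set X-Robots-Tag "noindex, follow" env=NOINDEXFOLLOW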
You can match against the Host header using the SetEnvIf directive.
To make the X-Robots-Tag header apply only for a specific host (env-uat.com), you could use something like the following:
SetEnvIf host ^env-uat\.com$ NOINDEXFOLLOW
Header set X-Robots-Tag "noindex, follow" env=REDIRECT_NOINDEXFOLLOW
To make this work for multiple hosts, you could use the following:
SetEnvIf host ^(env-uat\.com|host2\.com)$ NOINDEXFOLLOW
Header set X-Robots-Tag "noindex, follow" env=REDIRECT_NOINDEXFOLLOW
I am having problems with leverage browser caching. It seems that my resources are not fetched from the cache and, as you can see in the image below, some of them are duplicated. I have this meta tag:
<meta http-equiv="Cache-Control" content="private, max-age=216000">
I also have this in my .htaccess:
<IfModule mod_headers.c>
# Set the cache-control max-age
<FilesMatch ".(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
Header set Cache-Control "max-age=172800, public"
</FilesMatch>
# 2 DAYS
<FilesMatch ".(xml|txt)$">
Header set Cache-Control "max-age=172800, public, must-revalidate"
</FilesMatch>
# 4 HOURS
<FilesMatch ".(html|htm)$">
Header set Cache-Control "max-age=14400, must-revalidate"
</FilesMatch>
# Turn off the ETags
Header unset ETag
FileETag None
# Turn off the Last Modified header except for html docs
<FilesMatch ".(ico|pdf|flv|jpg|jpeg|png|gif|js|css)$">
Header unset Last-Modified
</FilesMatch>
Thanks
OK, as I can see from your screenshot, you haven't set any caching headers. Even though you said that you did, I can't see them in the screenshot.
Here is an explanation of how caching headers work if you need it, just in case: Cache-Control headers, max-age defined but back button always deliver web cache data
To make caching more efficient, you can load common libraries from public CDNs. For example, you can load jQuery from its official CDN.
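In case mod_headers is the piece that is not taking effect, a hedged alternative (assuming mod_expires is enabled on your server) is to let mod_expires generate both the Expires and the Cache-Control max-age headers for the static types, matching the two-day lifetime from your snippet:
<IfModule mod_expires.c>
ExpiresActive On
# 2 days, equivalent to max-age=172800
ExpiresByType image/png "access plus 2 days"
ExpiresByType image/jpeg "access plus 2 days"
ExpiresByType image/gif "access plus 2 days"
ExpiresByType text/css "access plus 2 days"
ExpiresByType application/javascript "access plus 2 days"
</IfModule>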
After trying to find the answer, I still cannot figure it out.
I have my domain www.bragdeal.com
I created a subdomain static.bragdeal.com and pointed it to the folder 'static' in the root
I placed all of my images in bragdeal.com/static/ folder
When I run GTmetrix, it tells me to "Use cookie-free domains".
On my website, I point to the images like this:
<img src="static/test.jpg">
In my htaccess I have:
# 1 WEEK
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|js|css|swf)$">
Header set Cache-Control "max-age=604800, public"
</FilesMatch>
# 1 WEEK
<FilesMatch "\.(xml|txt)$">
Header set Cache-Control "max-age=604800, public, must-revalidate"
</FilesMatch>
# 1 WEEK
<FilesMatch "\.(html|htm|php|js)$">
Header set Cache-Control "max-age=604800, must-revalidate"
</FilesMatch>
Is my only solution to buy a completely new domain to store all my static content on?
You need to point to your images using the subdomain you created:
<img src="http://static.bragdeal.com/test.jpg">
Note that this only helps if your cookies are scoped to www.bragdeal.com rather than to .bragdeal.com as a whole; otherwise the browser will keep sending them to the subdomain as well. Another alternative would be to put them on Amazon S3, which also reduces some traffic to your server.
I have a few warnings which I'm trying to solve in the PageSpeed test, such as:
Leverage browser caching
Setting an expiry date or a maximum age in the HTTP headers for static resources instructs the browser to load previously downloaded resources from local disk rather than over the network.
And then it points to local .js and .css files.
But I have this in my .htaccess:
<FilesMatch "\.(js|css|ttf)$">
Header set Cache-Control "max-age=604800, public"
</FilesMatch>
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|swf)$">
Header set Cache-Control "max-age=604800, public"
</FilesMatch>
<FilesMatch "\.(html|htm|php)$">
Header set Cache-Control "max-age=60, private, proxy-revalidate"
</FilesMatch>
<FilesMatch "\.(css|js|gif|jpeg|png|ico)$">
ExpiresActive On
ExpiresDefault "access plus 1 year"
</FilesMatch>
Any idea what I'm doing wrong?
Well, this is a shot in the dark, but I came across circumstances where Apache would not respect my .htaccess headers and I had to "force" them with the always keyword, like this:
<FilesMatch "\.(js|css|ttf)$">
Header always set Cache-Control "max-age=604800, public"
</FilesMatch>
<FilesMatch "\.(ico|pdf|flv|jpg|jpeg|png|gif|swf)$">
Header always set Cache-Control "max-age=604800, public"
</FilesMatch>
<FilesMatch "\.(html|htm|php)$">
Header always set Cache-Control "max-age=60, private, proxy-revalidate"
</FilesMatch>
<FilesMatch "\.(css|js|gif|jpeg|png|ico)$">
ExpiresActive On
ExpiresDefault "access plus 1 year"
</FilesMatch>
When your action is a function of an existing header, you may need to specify a condition of always, depending on which internal table the original header was set in. The table that corresponds to always is used for locally generated error responses as well as successful responses. Note also that repeating this directive with both conditions makes sense in some scenarios because always is not a superset of onsuccess with respect to existing headers:
You're adding a header to a locally generated non-success (non-2xx) response, such as a redirect, in which case only the table corresponding to always is used in the ultimate response.
You're modifying or removing a header generated by a CGI script, in which case the CGI scripts are in the table corresponding to always and not in the default table.
You're modifying or removing a header generated by some piece of the server but that header is not being found by the default onsuccess condition.
From Apache Module mod_headers
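One more hedged guess along the same lines: the ExpiresActive and ExpiresDefault lines come from mod_expires, not mod_headers, and if that module happens to be missing, an .htaccess that uses them bare will fail with a server error. Wrapping them keeps the rest of the file working:
<IfModule mod_expires.c>
<FilesMatch "\.(css|js|gif|jpeg|png|ico)$">
ExpiresActive On
ExpiresDefault "access plus 1 year"
</FilesMatch>
</IfModule>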
First I tried adding multiple IfModule blocks, but it does not work.
<ifModule mod_headers.c>
Header set Access-Control-Allow-Origin: http://domainurl1.com
</ifModule>
<ifModule mod_headers.c>
Header set Access-Control-Allow-Origin: http://domainurl2.com
</ifModule>
When I try to add multiple IfModule blocks, only the last one (http://domainurl2.com) works; the others do not.
Then I tried the following code. It works, but I think it is not secure to allow everyone:
<ifModule mod_headers.c>
Header set Access-Control-Allow-Origin: "*"
</ifModule>
I have 5 domains that I have to allow.
Is there a solution for allowing multiple specific domains?
Try this if you want a quick fix:
<ifModule mod_headers.c>
Header add Access-Control-Allow-Origin "http://domainurl1.com"
Header add Access-Control-Allow-Origin "http://domainurl2.com"
</ifModule>
However, this is not the solution recommended by the W3C; instead, you should make the server read the Origin header from the client, compare it to a list of allowed domains, and finally send the value of the Origin header back to the client as the Access-Control-Allow-Origin header. Check http://www.w3.org/TR/cors/#access-control-allow-origin-response-hea for more details.
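A hedged sketch of that recommended approach in .htaccess terms (the domain names are the placeholders from the question, CORS_ORIGIN is just an arbitrary variable name, and mod_setenvif plus mod_headers are assumed to be enabled): capture the Origin request header only when it matches your allowlist, then echo it back.
<IfModule mod_setenvif.c>
# Extend the alternation to cover all 5 allowed domains
SetEnvIf Origin "^(https?://(domainurl1\.com|domainurl2\.com))$" CORS_ORIGIN=$1
</IfModule>
<IfModule mod_headers.c>
# Echo the captured Origin back; no CORS header is sent for other origins
Header set Access-Control-Allow-Origin "%{CORS_ORIGIN}e" env=CORS_ORIGIN
# Tell caches that the response varies by Origin
Header merge Vary Origin
</IfModule>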