I'm trying to set the HSTS header in .htaccess based on the type of request (HTTPS or HTTP). I tried to modify this piece of code, which sets Cache-Control "max-age=300" on image files by checking an environment variable:
RewriteRule \.(png|gif|jpg) - [E=image:1]
Header set Cache-Control "max-age=300" env=image
This way:
RewriteCond %{HTTPS} on
RewriteRule ^ - [E=STRICT_ENV:1]
Header always set Strict-Transport-Security "max-age=63072000;" env=STRICT_ENV
But it seems that the STRICT_ENV variable is not set at all.
Give this header a try:
Header set Strict-Transport-Security "max-age=63072000" env=HTTPS
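In context, a minimal .htaccess sketch of this approach could look like the following (assuming mod_headers is available and that the server sets the standard HTTPS environment variable, as mod_ssl does on TLS requests):

```apache
<IfModule mod_headers.c>
    # HTTPS is set by mod_ssl on TLS requests, so this header
    # is only added to responses served over HTTPS
    Header set Strict-Transport-Security "max-age=63072000" env=HTTPS
</IfModule>
```

This avoids the RewriteRule-based env variable entirely, which is simpler and sidesteps the ordering issues between mod_rewrite and mod_headers.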
Related
I want to always use https:// and non-www URLs, so I used the following code in my .htaccess file. But I am getting a warning from https://hstspreload.org:
RewriteCond %{HTTPS} off
RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule .* https://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
<IfModule mod_headers.c>
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
</IfModule>
The warning message is given below:
Warning: Unnecessary HSTS header over HTTP
The HTTP page at http://mysiteurl.com sends an HSTS header. This has no effect over HTTP, and should be removed.
Please help me get rid of the above warning. I also tried the following code, but it does not work #ref. topic
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" env=HTTPS
Try removing the always attribute. So do this:
Header set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" env=HTTPS
Instead of this:
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" env=HTTPS
The other option is to set this only in the HTTPS VirtualHosts rather than in the main top-level config:
Do this:
<VirtualHost *:443>
(All other virtual host config)
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
</VirtualHost>
Instead of this:
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
<VirtualHost *:443>
(All other virtual host config)
</VirtualHost>
This has the disadvantage (or advantage depending how you look at it!) of having to be added to each VirtualHost to take effect, whereas the first option would automatically be applied to all HTTPS virtual hosts.
And please, please, please be very careful with preload. It is not easily reversible! I would strongly recommend you run for a few months with good (i.e. error-free) config - which it appears you have not been doing - before submitting to preload list.
To give one example where preload can cause you problems: suppose you run https://www.example.com, and this also responds to http://example.com and redirects you to https://example.com and then to https://www.example.com (as preload submission requires, and as your config is set up to do). Then your website is lovely and secure.

However, for companies that reuse their domain internally (which is quite common) this can cause problems, especially when you preload. For example, if you run a non-secure site which is not publicly available at http://intranet.example.com, or perhaps a non-secure development version of your site at http://dev.example.com, then you may not realise that those sites must now also be served over HTTPS (as they are subdomains of example.com).

This rarely takes effect normally (most people don't visit http://example.com or https://example.com, so they never see the HSTS header on the top-level domain), so you might never notice this potential problem during all your testing. However, as soon as the preload takes effect, your browser will know about the HSTS policy on the top-level domain even without visiting it, you instantly lose access to those HTTP-only sites, and you cannot easily reverse this!

Lots of companies still have lots of internal sites and tools served over HTTP only, and upgrading them all to HTTPS (which should be done anyway, by the way!) at short notice would not be easy.
To get around this, either use a different domain internally, or set this without includeSubDomains on the top-level domain:
<VirtualHost *:443>
ServerName example.com
(All other virtual host config)
#Set HSTS at top level without includeSubDomain
Header always set Strict-Transport-Security "max-age=31536000"
</VirtualHost>
<VirtualHost *:443>
ServerName www.example.com
(All other virtual host config)
#Set HSTS at subdomain level
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
</VirtualHost>
This is not quite as secure (someone can still set up other subdomains over HTTP, like http://wwww.example.com - note the four Ws - or http://fake.example.com), but at least it doesn't break those HTTP-only sites. This setup will not be allowed onto the preload list, as the list demands the more secure includeSubDomains even on the top-level domain.
If you do want to use includeSubDomains even on the top-level domain, then I strongly recommend including a resource from the top-level domain in your HTML (even if it redirects to the www version, as HSTS is still set on 301/302 responses). This way you make sure visitors pick up the HSTS config at the top level even before you preload. For example, you could load your logo from the top-level domain instead:
<img src="https://example.com/logo.png">
Run with that, a small expiry, and no preload token for a while. Then increase the expiry. Then, if that all works, add the preload token back and submit to the preload list.
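A staged rollout along those lines could be sketched like this (the max-age values and timings are illustrative, not prescriptive):

```apache
# Stage 1: short expiry, no preload - easy to back out if something breaks
Header always set Strict-Transport-Security "max-age=300; includeSubDomains" env=HTTPS

# Stage 2 (after running error-free for a while): raise the expiry
#Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains" env=HTTPS

# Stage 3 (only once stage 2 has been stable): add preload and submit to hstspreload.org
#Header always set Strict-Transport-Security "max-age=63072000; includeSubDomains; preload" env=HTTPS
```

Each stage only widens the commitment once the previous one has proven safe, which matters because HSTS mistakes persist in visitors' browsers for the full max-age.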
This might all sound a bit painful, and perhaps you've thought of all this, but preloading can be incredibly dangerous if not thought through, due to the fact it is not easily reversible. In my opinion preloading HSTS is overkill for most sites though I agree it is the most secure option.
I solved the error on my LiteSpeed-based server with this method; it also works for Apache. First, add this code to your .htaccess:
# Force HTTPS
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
Then add this code:
<IfModule mod_headers.c>
Header always set Strict-Transport-Security "max-age=63072000; includeSubDomains; preload" env=HTTPS
</IfModule>
Ubuntu 18.04, Apache2, Let's Encrypt:
nano /etc/apache2/conf-enabled/ssl-params.conf
Header always set Strict-Transport-Security "max-age=63072000; includeSubDomains; preload" env=HTTPS
service apache2 restart
Then remove or comment out (#) the same line in all other vhost configs referenced from apache.conf:
#Header always set Strict-Transport-Security "max-age=63072000; includeSubDomains; preload" env=HTTPS
The issue is that you are sending the header when the user is connected over HTTP.
If you want to force them to use HTTPS, perform a redirect first, like this:
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
I want to add Access-Control-Allow-Credentials: true to a response on an Apache 2.2 server. My .htaccess file looks like this:
SetEnvIfNoCase ORIGIN (.*) ORIGIN=$1
Header always add Access-Control-Allow-Origin "%{ORIGIN}e" env=ORIGIN
Header always add Access-Control-Allow-Methods "GET,POST,OPTIONS,DELETE,PUT"
Header always add Access-Control-Allow-Headers "Link, Location, Accept-Post, Content-Type, Slug, Origin"
Header always set Access-Control-Allow-Credentials "true"
RewriteCond %{HTTP_ACCEPT} text/turtle
RewriteRule ^/?$ https://example/me.ttl [R=303]
On line 5 I set the header Access-Control-Allow-Credentials to true, but the resulting response has '' for this header. When I also add line 5 at the end of the file, resulting in the following:
SetEnvIfNoCase ORIGIN (.*) ORIGIN=$1
Header always add Access-Control-Allow-Origin "%{ORIGIN}e" env=ORIGIN
Header always add Access-Control-Allow-Methods "GET,POST,OPTIONS,DELETE,PUT"
Header always add Access-Control-Allow-Headers "Link, Location, Accept-Post, Content-Type, Slug, Origin"
Header always set Access-Control-Allow-Credentials "true"
RewriteCond %{HTTP_ACCEPT} text/turtle
RewriteRule ^/?$ https://example/me.ttl [R=303]
Header always set Access-Control-Allow-Credentials "true"
the value of Access-Control-Allow-Credentials is 'true, true', while I want it to be only 'true'.
Does anybody know what the reason is for this behaviour?
EDIT: I've tried to put '' in line 5 in order to get 'true' at the end, but then I get just ''.
I contacted my hosting and although they said that nothing is wrong on their side, suddenly it works...
First I tried adding multiple IfModule blocks, but it does not work:
<ifModule mod_headers.c>
Header set Access-Control-Allow-Origin: http://domainurl1.com
</ifModule>
<ifModule mod_headers.c>
Header set Access-Control-Allow-Origin: http://domainurl2.com
</ifModule>
When I try to add multiple IfModule blocks, only the last one (http://domainurl2.com) works; the others do not.
Then I tried the following code. It works, but I think it is not secure to allow every origin:
<ifModule mod_headers.c>
Header set Access-Control-Allow-Origin: "*"
</ifModule>
I have 5 domains that I have to allow.
Are there any solutions for adding multiple domains to the allow list?
Try this if you want a quick fix:
<ifModule mod_headers.c>
Header add Access-Control-Allow-Origin "http://domainurl1.com"
Header add Access-Control-Allow-Origin "http://domainurl2.com"
</ifModule>
However, this is not the solution recommended by W3C. Instead, you should make the server read the Origin header from the client, compare it to a list of allowed domains, and then send the value of the Origin header back to the client as the Access-Control-Allow-Origin header. Check http://www.w3.org/TR/cors/#access-control-allow-origin-response-hea for more details.
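A sketch of that whitelist approach in .htaccess (assuming mod_setenvif and mod_headers are available; the domain names are the placeholders from the question):

```apache
<IfModule mod_headers.c>
    # Capture the client's Origin header only when it matches the whitelist;
    # $0 is the whole matched value
    SetEnvIf Origin "^https?://(domainurl1\.com|domainurl2\.com)$" CORS_ORIGIN=$0
    # Echo the matched origin back; the header is only sent when the env var is set
    Header set Access-Control-Allow-Origin "%{CORS_ORIGIN}e" env=CORS_ORIGIN
    # Tell caches that the response varies by Origin
    Header merge Vary Origin
</IfModule>
```

Unlike "*", this echoes back exactly one allowed origin per response, which is also compatible with credentialed requests.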
Can I 'noindex, follow' a specific page using X-Robots-Tag in .htaccess?
I've found some instructions for noindexing types of files, but I can't find instruction to noindex a single page, and what I have tried so far hasn't worked.
This is the page I'm looking to noindex:
http://www.examplesite.com.au/index.php?route=news/headlines
This is what I have tried so far:
<FilesMatch "/index.php?route=news/headlines$">
Header set X-Robots-Tag "noindex, follow"
</FilesMatch>
Thanks for your time.
It seems to be impossible to match the request parameters from within a .htaccess file. Here is a list of what you can match against: http://httpd.apache.org/docs/2.2/sections.html
It will be much easier to do it in your script. If you are running on PHP try:
header('X-Robots-Tag: noindex, follow');
You can easily build conditions on $_GET, $_SERVER['REQUEST_URI'], and so on.
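As a minimal sketch of that idea in PHP (the route name is taken from the question's URL; this must run before any output is sent):

```php
<?php
// Send the noindex header only for the news/headlines route
if (isset($_GET['route']) && $_GET['route'] === 'news/headlines') {
    header('X-Robots-Tag: noindex, follow');
}
```

Any other condition (a path prefix, a query parameter, a user-agent check) can be substituted into the if statement.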
RewriteEngine on
RewriteBase /
#set env variable if url matches
RewriteCond %{QUERY_STRING} ^route=news/headlines$
RewriteRule ^index\.php$ - [env=NOINDEXFOLLOW:true]
#only sent header if env variable set
Header set X-Robots-Tag "noindex, follow" env=NOINDEXFOLLOW
FilesMatch works on (local) files, not URLs, so it would try to match only the /index.php part of the URL. <Location> would be more appropriate, but as far as I can tell from the documentation, query strings are not allowed there. So I ended up with the above solution (I really liked this challenge). PHP would be the more obvious place to put this, but that is up to you.
The solution requires mod_rewrite, and mod_headers of course.
Note that you'll need the mod_headers module enabled to set the headers.
Though, like others have said, it seems better to use the PHP header() call. Does that not work?
According to Google the syntax would be a little different:
<Files ~ "\.pdf$">
Header set X-Robots-Tag "noindex, nofollow"
</Files>
https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag
Is it possible to apply HTTP header directives based on the URL's query string using an apache .htaccess?
For example, based on this resource http://code.google.com/web/controlcrawlindex/docs/robots_meta_tag.html under the section titled "Practical implementation of X-Robots-Tag with Apache" it says the following .htaccess file directive can be used:
<Files ~ "\.pdf$">
Header set X-Robots-Tag "noindex, nofollow"
</Files>
I'm looking for something along the lines of:
<QueryString ~ "m=_!">
Header set X-Robots-Tag "noindex, nofollow"
</QueryString>
This way the following URL would NOT get indexed by search engines:
http://domain.com/?m=_!ajax_html_snippet
Any hints/tips/clues would be much appreciated. Thanks.
You can try the following in your .htaccess file
#modify query string condition here to suit your needs
RewriteCond %{QUERY_STRING} (^|&)m=_\! [NC]
#set env var MY_SET_HEADER to 1
RewriteRule .* - [E=MY_SET_HEADER:1]
#if MY_SET_HEADER is present then set header
Header set X-Robots-Tag "noindex, nofollow" env=MY_SET_HEADER