I've got an .htaccess file that blocks all user agents and allows only the ones I need. I need to allow Cloudflare access; how can I allow it without matching on "Mozilla"? This is the user agent I see:
Mozilla/5.0 (compatible; CloudFlare-AlwaysOnline/1.0; +http://www.cloudflare.com/always-online)
RewriteEngine on
AuthType Basic
AuthName "private"
AuthUserFile "/home/example/.htpasswds/public_html/exemple/passwd"
require valid-user
#-only allow-#
SetEnvIf User-Agent .0011 0011
Order deny,allow
Deny from all
Allow from env=0011
#-index only open for 0011-#
Options +Indexes
RewriteCond %{REQUEST_FILENAME} -d
RewriteCond %{HTTP_USER_AGENT} !0011 [NC]
RewriteRule . - [F]
You can use:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} !CloudFlare-AlwaysOnline [NC]
RewriteRule ^ - [F]
But do not do that: for all normal requests, Cloudflare passes through the visitor's own browser user agent, so this rule would block all regular visitors. The CloudFlare-AlwaysOnline string only appears on requests made by the Always Online crawler itself.
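If the goal is just to let the Always Online crawler through the SetEnvIf gate from the question, one option might be to mark its distinctive token as allowed alongside the existing one. A minimal sketch (0011 is the custom token from the question, and the allow_ua variable name is arbitrary; adjust to your own setup):
# Mark both the custom agent and Cloudflare's Always Online crawler as allowed
SetEnvIf User-Agent "0011" allow_ua
SetEnvIf User-Agent "CloudFlare-AlwaysOnline" allow_ua
Order deny,allow
Deny from all
Allow from env=allow_ua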
The following rule allows only the www.google.com host to access the file list.txt.
My question: how can I change the rule to block www.google.com and allow every other host?
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{HTTP_HOST} !^(?:www\.)?google\.com$ [NC]
RewriteRule ^list\.txt$ - [NC,F]
</IfModule>
I want the browser to get a 404 response rather than a 403 Forbidden response.
Please test and see if this does what you want:
RewriteEngine on
RewriteCond %{HTTP_HOST} ^(?:www\.)?google\.com$ [NC]
RewriteRule ^list\.txt$ - [NC,R=404]
but, if you are on Apache 2.4, an <If> section might be clearer (mod_rewrite directives cannot be used inside <Files> sections, so the host check has to happen in the <If> expression):
<If "%{HTTP_HOST} =~ /^(www\.)?google\.com$/i">
RedirectMatch 404 ^/list\.txt$
</If>
I left out <IfModule mod_rewrite.c> for clarity. If the module is not enabled, you might, however, prefer the server to refuse to start rather than silently allow access.
I have a SilverStripe website that has duplicate URLs across www, non-www, http and https.
There seem to be multiple solutions but no definitive answer. Does anyone know the correct code for the .htaccess file for SilverStripe?
I want to get all pages pointing to https://www.
This is the current code in the .htaccess file:
ErrorDocument 401 /base/401.txt
### SILVERSTRIPE START ###
# Deny access to templates (but allow from localhost)
<Files *.ss>
Order deny,allow
Deny from all
Allow from 127.0.0.1
</Files>
# Deny access to IIS configuration
<Files web.config>
Order deny,allow
Deny from all
</Files>
# Deny access to YAML configuration files which might include sensitive information
<Files ~ "\.ya?ml$">
Order allow,deny
Deny from all
</Files>
# Route errors to static pages automatically generated by SilverStripe
ErrorDocument 404 /assets/error-404.html
ErrorDocument 500 /assets/error-500.html
<IfModule mod_env.c>
# Ensure that X-Forwarded-Host is only allowed to determine the request
# hostname for servers with IPs defined by SS_TRUSTED_PROXY_IPS in your _ss_environment.php
# Note that in a future release this setting will be always on.
SetEnv BlockUntrustedIPs true
</IfModule>
<IfModule mod_rewrite.c>
# Turn off index.php handling requests to the homepage (fixes an issue in Apache >= 2.4)
<IfModule mod_dir.c>
DirectoryIndex disabled
</IfModule>
RewriteEngine On
# non-www to www redirect
#RewriteCond %{HTTP_HOST} ^bolstered.com.au$ [NC]
#RewriteRule (.*) https://www.bolstered.com.au/$1 [R=301,L]
# http to https redirect
#RewriteCond %{HTTPS} !=on
#RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
# Enable HTTP Basic authentication workaround for PHP running in CGI mode
RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
# Deny access to potentially sensitive files and folders
RewriteRule ^vendor(/|$) - [F,L,NC]
RewriteRule silverstripe-cache(/|$) - [F,L,NC]
RewriteRule composer\.(json|lock) - [F,L,NC]
RewriteCond %{REQUEST_URI} ^(.*)$
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !\.php$
RewriteRule .* index.php?url=%1&%{QUERY_STRING} [L]
</IfModule>
### SILVERSTRIPE END ###
I typically keep the SilverStripe part between ### SILVERSTRIPE START ### and ### SILVERSTRIPE END ### untouched and put my own rules only before or after that block.
There is no issue with having RewriteEngine On twice.
I also did not bother to check whether mod_rewrite exists, because all my servers have it enabled and I wouldn't let a client put the site on a server without it.
Here is a full example of a .htaccess that I would typically use in a SilverStripe project:
RewriteEngine On
RewriteCond %{HTTP_HOST} !^localhost [NC]
RewriteCond %{HTTP_HOST} !^127\.0\.0\.1 [NC]
RewriteCond %{HTTPS} !on [OR]
RewriteCond %{HTTP_HOST} !^www\.example\.org [NC]
RewriteRule ^ https://www.example.org%{REQUEST_URI} [R=301,L]
### SILVERSTRIPE START ###
# Deny access to templates (but allow from localhost)
<Files *.ss>
Require ip 127.0.0.1
</Files>
# Deny access to IIS configuration
<Files web.config>
Require all denied
</Files>
# Deny access to YAML configuration files which might include sensitive information
<Files ~ "\.ya?ml$">
Require all denied
</Files>
# Route errors to static pages automatically generated by SilverStripe
ErrorDocument 404 /assets/error-404.html
ErrorDocument 500 /assets/error-500.html
<IfModule mod_rewrite.c>
# Turn off index.php handling requests to the homepage (fixes an issue in Apache >= 2.4)
<IfModule mod_dir.c>
DirectoryIndex disabled
DirectorySlash On
</IfModule>
SetEnv HTTP_MOD_REWRITE On
RewriteEngine On
# Enable HTTP Basic authentication workaround for PHP running in CGI mode
RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
# Deny access to potentially sensitive files and folders
RewriteRule ^vendor(/|$) - [F,L,NC]
RewriteRule ^\.env - [F,L,NC]
RewriteRule silverstripe-cache(/|$) - [F,L,NC]
RewriteRule composer\.(json|lock) - [F,L,NC]
RewriteRule (error|silverstripe|debug)\.log - [F,L,NC]
RewriteRule ^Security - [F,L,NC]
# Process through SilverStripe if no file with the requested name exists.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule .* index.php
</IfModule>
### SILVERSTRIPE END ###
Step-by-step explanation:
Turns on the rewrite engine
RewriteEngine On
The four RewriteCond lines are all conditions attached to the RewriteRule below them.
RewriteCond %{HTTP_HOST} !^localhost [NC]
RewriteCond %{HTTP_HOST} !^127\.0\.0\.1 [NC]
RewriteCond %{HTTPS} !on [OR]
RewriteCond %{HTTP_HOST} !^www\.example\.org [NC]
Multiple conditions are combined with a logical AND, unless you add [OR].
[NC] stands for no-case, i.e. the match is not case-sensitive.
The first two are an exception for localhost/127.0.0.1, to ensure no redirect happens while I am developing on my workstation.
Condition 3 checks whether HTTPS is off.
Condition 4 checks whether the host is already the desired domain.
So, speaking in pseudocode, it's like this:
if ($HTTP_HOST != "localhost" && $HTTP_HOST != "127.0.0.1" && ($HTTPS != "on" || $HTTP_HOST != "www.example.org")) {
do_redirect();
}
The actual redirect
RewriteRule ^ https://www.example.org%{REQUEST_URI} [R=301,L]
It redirects to the desired domain and attaches the path (/foo/bar); the query parameters (?foo=bar) are carried over automatically by mod_rewrite.
R=301 is the HTTP response code. If you want a temporary redirect, use 302 instead.
L means Last: it stops processing at this point and does not continue with the rules below.
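As noted above, the query string is carried over automatically. If you ever want the redirect to discard it instead, mod_rewrite's QSD (query string discard) flag, available since Apache 2.4, might be what you want. A sketch:
# Redirect and drop any query string (?foo=bar is not appended)
RewriteRule ^ https://www.example.org%{REQUEST_URI} [R=301,L,QSD]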
Alternatives:
.htaccess is the best way to do it, but it's worth pointing out that this is not the only option.
You could do it in plain PHP, or in the configuration of any/most webservers (see the virtual host sketch after this list), ...
And SilverStripe has built-in methods for doing the check & redirect:
Director::forceSSL();
Director::forceWWW();
but, as said, .htaccess is much better (much faster, and only a single redirect).
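As an illustration of the webserver-config option mentioned above: the same redirect can live in an Apache virtual host instead of .htaccess, which avoids per-request .htaccess parsing. A sketch, assuming you control the server config and already have an HTTPS vhost for www.example.org:
<VirtualHost *:80>
ServerName example.org
ServerAlias www.example.org
# Send everything to the canonical HTTPS host; the request path is preserved
Redirect 301 / https://www.example.org/
</VirtualHost>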
I need to protect a specific route/URL with a password on my CodeIgniter site.
The base URL looks like this:
https://staging.mysite.com/site-name
I want to protect this URL with a password using .htaccess:
https://staging.mysite.com/site-name/export
The routes look like this:
$route['export']['get'] = 'ExportController/index';
$route['export']['post'] = 'ExportController/export';
I've tried many different answers to similar problems, but I just can't get it to work properly:
either the password is asked for everywhere, or it isn't asked for at all.
Here is my .htaccess:
RewriteEngine on
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php/$1 [L,QSA]
</IfModule>
SetEnvIfNoCase Request_URI ^/export SECURED
AuthName "Restricted Area"
AuthType Basic
AuthUserFile "/home/something/path/to/.htpasswd"
AuthGroupFile /
Require valid-user
Satisfy any
Order Allow,Deny
Allow from all
Deny from env=SECURED
I think that the problem might be in this part:
SetEnvIfNoCase Request_URI ^/export SECURED
Because I just can't seem to target the URL I want. Here are some other things I've tried:
SetEnvIfNoCase Request_URI ^/site-name/export SECURED
SetEnvIfNoCase Request_URI "^/site-name/export" SECURED
SetEnvIfNoCase Request_URI "^/export" SECURED
SetEnvIfNoCase Request_URI ^(.*)/export SECURED
SetEnvIfNoCase Request_URI .*/export$ SECURED
SetEnvIfNoCase Request_URI .*/export SECURED
Edit:
I've also tried to protect the entire ExportController like this, but the password prompt does not appear anywhere (presumably because ExportController is a route, not a file on disk, so <Files> never matches):
<Files ExportController>
AuthName "ExportController"
AuthType Basic
AuthUserFile "/home/something/path/to/.htpasswd"
require valid-user
</Files>
I ended up doing this:
<If "%{THE_REQUEST} =~ m#^GET /site-name/export#">
AuthType Basic
AuthName "Password Required"
AuthUserFile /home/path/to/.htpasswd
Require valid-user
</If>
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php/$1 [L,QSA]
</IfModule>
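One caveat with the THE_REQUEST match above: the ^GET anchor means only GET requests are challenged, while the routes in the question also map POST to ExportController/export. A variant that drops the method from the pattern (an untested sketch) should cover both:
# Challenge any request method whose request line targets /site-name/export
<If "%{THE_REQUEST} =~ m# /site-name/export#">
AuthType Basic
AuthName "Password Required"
AuthUserFile /home/path/to/.htpasswd
Require valid-user
</If>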
I have implemented the following code in .htaccess but am still seeing referrers from semalt, such as:
74.semalt.com
89.semalt.com
The code:
# Block visits from semalt.com
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://([^.]+\.)*semalt\.com [NC]
RewriteRule .* - [F]
Any idea how these referrers are bypassing this rule (which I found online) and how I can fully prevent them?
Your code looks good, and the syntax checks out OK. I have used these mod_rewrite patterns (three alternatives):
RewriteCond %{HTTP_REFERER} ^https?://(www\.)?semalt\.com [NC]
RewriteCond %{HTTP_REFERER} ^https?://(.*\.)?semalt\. [NC]
RewriteCond %{HTTP_REFERER} ^https?://([^.]+\.)*semalt\.com [NC]
or, with the mod_setenvif module:
SetEnvIfNoCase Referer semalt.com spambot=yes
SetEnvIfNoCase REMOTE_ADDR "217\.23\.11\.15" spambot=yes
SetEnvIfNoCase REMOTE_ADDR "217\.23\.7\.144" spambot=yes
Order allow,deny
Allow from all
Deny from env=spambot
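The Order/Allow/Deny directives above are the Apache 2.2 style. On Apache 2.4 without mod_access_compat, the same env-var check might be expressed with Require instead; a sketch:
<RequireAll>
# Allow everyone except requests flagged as spambot by SetEnvIfNoCase
Require all granted
Require not env spambot
</RequireAll>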
I even created an Apache, Nginx & Varnish blacklist plus a Google Analytics segment to prevent referrer spam traffic; you can find it here:
https://github.com/Stevie-Ray/referrer-spam-blocker/
Here is updated code that covers many spam-referral sites, using regular expressions:
<IfModule mod_rewrite.c>
RewriteEngine On
Options +FollowSymLinks
RewriteCond %{HTTP_REFERER} (?:o-o-6-o-o|bestwebsitesawards|s.click.aliexpress|simple-share-buttons|see-your-website-here|forum.topic55198628.darodar|hulfingtonpost|ilovevitaly|priceg|blackhatworth|semalt.semalt|kambasoft|buttons-for-website|BlackHatWorth|7makemoneyonline)\.com [NC,OR]
RewriteCond %{HTTP_REFERER} (?:lomb|lombia|econom|lumb)\.co [NC,OR]
RewriteCond %{HTTP_REFERER} (?:cenoval|Iskalko)\.ru [NC,OR]
RewriteCond %{HTTP_REFERER} (?:smailik|humanorightswatch)\.org [NC,OR]
RewriteCond %{HTTP_REFERER} (?:ranksonic|savetubevideo)\.info [NC]
RewriteRule ^ - [F]
</IfModule>
Hope someone finds this useful.
Ok, this is how I got it to work:
# Block visits from semalt.com
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://([^.]+\.)*semalt\.com [NC]
RewriteRule (.*) http://www.semalt.com [R=301,L]
I tried all manner of these sample snippets from all over the web. None of them worked, and Semalt kept changing its domains and paths.
I suggest the following, which works great for me and has sane syntax. It applies to any referrer that contains the string semalt.com. Note that you need Apache 2.4 for this to work. It can go in your .htaccess no problem, or in theory in your main Apache config.
<If "%{HTTP_REFERER} =~ /semalt.com/">
Deny from all
</If>
Good luck!
Update: if this causes a 500 error, you need to allow .htaccess overrides in your main Apache config. In this example I have my .htaccess in my web server root of /var/www/wordpress, so my .conf contains:
<Directory /var/www/wordpress>
Options +FollowSymLinks
AllowOverride all
Require all granted
</Directory>
Here's another approach for blocking the ever-growing list of botnet hosts:
# Block Common Botnets
SetEnvIfNoCase Referer fbdownloader.com spambot=yes
SetEnvIfNoCase Referer descargar-musicas-gratis.com spambot=yes
SetEnvIfNoCase Referer baixar-musicas-gratis.com spambot=yes
SetEnvIfNoCase Referer savetubevideo.com spambot=yes
SetEnvIfNoCase Referer srecorder.com spambot=yes
SetEnvIfNoCase Referer kambasoft.com spambot=yes
SetEnvIfNoCase Referer semalt.com spambot=yes
SetEnvIfNoCase Referer ilovevitaly.com spambot=yes
SetEnvIfNoCase Referer 7makemoneyonline.com spambot=yes
SetEnvIfNoCase Referer buttons-for-website.com spambot=yes
SetEnvIfNoCase Referer econom.co spambot=yes
SetEnvIfNoCase Referer acunetix-referrer.com spambot=yes
SetEnvIfNoCase Referer yougetsignal.com spambot=yes
SetEnvIfNoCase Referer darodar.com spambot=yes
Order allow,deny
Allow from all
Deny from env=spambot
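If the list keeps growing, the same idea can be collapsed into a couple of alternation regexes; a sketch, equivalent to the list above:
# One regex for the .com hosts, one for the .co host
SetEnvIfNoCase Referer (fbdownloader|descargar-musicas-gratis|baixar-musicas-gratis|savetubevideo|srecorder|kambasoft|semalt|ilovevitaly|7makemoneyonline|buttons-for-website|acunetix-referrer|yougetsignal|darodar)\.com spambot=yes
SetEnvIfNoCase Referer econom\.co spambot=yes
Order allow,deny
Allow from all
Deny from env=spambot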
A strange bot (GbPlugin) is URL-encoding the image URLs and causing 404 errors.
I tried to block the bot with the following at the bottom of my .htaccess, but it didn't work.
Options +FollowSymlinks
RewriteEngine On
RewriteBase /
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_USER_AGENT} ^$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^GbPlugin [NC]
RewriteRule .* - [F,L]
The log entry is below.
201.26.16.9 - - [10/Sep/2011:00:06:05 -0300] "GET /wp%2Dcontent/themes/my_theme%2Dpremium/scripts/timthumb.php%3Fsrc%3Dhttp%3A%2F%2Fwww.example.com%2Fwp%2Dcontent%2Fuploads%2F2011%2F08%2Fmy_image_name.jpg%26w%3D100%26h%3D65%26zc%3D1%26q%3D100 HTTP/1.1" 404 1047 "-" "GbPlugin"
Sorry for my language mistakes.
Note that in your log the referer is "-" (empty), so your first condition, RewriteCond %{HTTP_REFERER} !^$, prevents the rule from ever matching this bot. Here's what you can put in your .htaccess file instead; it flags empty referers rather than requiring them to be non-empty:
Options +FollowSymlinks
RewriteEngine On
RewriteBase /
SetEnvIfNoCase Referer "^$" bad_user
SetEnvIfNoCase User-Agent "^GbPlugin" bad_user
SetEnvIfNoCase User-Agent "^Wget" bad_user
SetEnvIfNoCase User-Agent "^EmailSiphon" bad_user
SetEnvIfNoCase User-Agent "^EmailWolf" bad_user
SetEnvIfNoCase User-Agent "^libwww-perl" bad_user
Deny from env=bad_user
This will return:
HTTP request sent, awaiting response... 403 Forbidden
2011-09-10 11:15:48 ERROR 403: Forbidden.
May I recommend this method: put the following in the .htaccess file in the root of your site.
ErrorDocument 503 "Your connection was refused"
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^(Mozilla.*537.36|Mozilla.*UCBrowser\/9.3.1.344)$ [NC]
RewriteRule .* - [R=503,L]
Where
^(Mozilla.*537.36|Mozilla.*UCBrowser\/9.3.1.344)$
are the two user agents I wanted to block in this example case.
You can use a regex, so a user agent like
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:40.0) Gecko/20100101 Firefox/40.0
could be matched with
Mozilla.*Firefox\/40.0
^ means match from the beginning and $ to the end, so you could block just one user agent with:
ErrorDocument 503 "Your connection was refused"
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*Firefox\/40.0$ [NC]
RewriteRule .* - [R=503,L]
Or add several, using the | character to separate them inside ( and ), as in the first example.
RewriteCond %{HTTP_USER_AGENT} ^(Mozilla.*537.36|Mozilla.*UCBrowser\/9.3.1.344)$ [NC]
You can test it by putting your own user agent in the code and then trying to access the site. You can find your user agent at http://whatsmyuseragent.com/
To block empty referers, you can use the following rule:
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^$
RewriteRule ^ - [F,L]
This will forbid all requests to your site if the HTTP_REFERER value is empty (^$).
To block user agents, you can use
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} opera|firebox|foo|bar [NC]
RewriteRule ^ - [F,L]
This will forbid all requests to your site if HTTP_USER_AGENT matches the condition pattern.
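The two rules can also be combined into one that fires when either condition matches, using the [OR] flag; a sketch:
RewriteEngine on
# Forbid the request if the referer is empty OR the user agent matches the blacklist
RewriteCond %{HTTP_REFERER} ^$ [OR]
RewriteCond %{HTTP_USER_AGENT} opera|firebox|foo|bar [NC]
RewriteRule ^ - [F,L]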