I am trying to exclude a specific route from HTTP basic authentication.
My .htaccess looks like this:
# Set an environment variable if requesting /dev
SetEnvIfNoCase Request_URI ^/dev/? DONT_NEED_AUTH=true
# Require authentication
AuthUserFile /etc/users
AuthName "This is a protected area"
AuthGroupFile /dev/null
AuthType Basic
# Set the allow/deny order
Order Deny,Allow
# Indicate that any of the following will satisfy the Deny/Allow
Satisfy any
# First off, deny from all
Deny from all
# Allow outright if this environment variable is set
Allow from env=DONT_NEED_AUTH
# or require a valid user
Require valid-user
# Rewrite url (make it pretty)
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([^?]*)$ index.php?path=$1 [NC,L,QSA]
If I use that exact same .htaccess, HTTP authentication is removed for the route "/dev", so this works as expected. The problem, however, is that I want password protection for route "/dev" but want to remove it only for route "/dev/guest".
I have tried changing to the following:
SetEnvIfNoCase Request_URI ^/dev/guest/? DONT_NEED_AUTH=true
and with escaping the slash in the middle:
SetEnvIfNoCase Request_URI ^/dev\/guest/? DONT_NEED_AUTH=true
but neither of those two options works; all routes are password protected again.
Also, since the route is rewritten, the actual URL I want to allow is "dev/index.php?path=guest", but I am not sure whether I should care about that, since part of it is the query string and an end user will never use that route directly.
Any help is highly appreciated.
I finally found a working solution. I used this:
SetEnvIf Request_URI /dev/guest noauth=1
<RequireAny>
Require env noauth
Require env REDIRECT_noauth
Require valid-user
</RequireAny>
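For reference, here is a minimal sketch of the complete Apache 2.4-style block, assuming the same paths and realm name as the question; the REDIRECT_noauth line is needed because mod_rewrite's internal redirect re-creates environment variables with a REDIRECT_ prefix:
AuthType Basic
AuthName "This is a protected area"
AuthUserFile /etc/users
SetEnvIf Request_URI /dev/guest noauth=1
<RequireAny>
# set directly on the original request
Require env noauth
# set on the request after mod_rewrite's internal redirect
Require env REDIRECT_noauth
# everyone else must supply valid credentials
Require valid-user
</RequireAny>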
I have a live website, and due to maintenance I want to redirect all IPs except mine and one other. I also want every PC on the two allowed IPs to have to log in to see the website. How can I have both things working at the same time?
To redirect all IPs I'll add this to .htaccess:
RewriteEngine On
RewriteBase /
RewriteCond %{REMOTE_HOST} !^1\.2\.3\.4
RewriteRule .* http://www.anothersite.com [R=302,L]
Source: http://kb.siteground.com/how_to_redirect_all_visitors_except_your_ip_to_another_site/
But how can I also protect everything with a password, in such a way that other users' IPs are redirected to anothersite.com? Also, how can I allow multiple IPs? Do I add them with commas?
You can use a workaround like this:
SetEnvIfNoCase REMOTE_ADDR "^(?:x\.x\.x\.x|x\.x\.x\.x)$" GET_AUTH=1
RewriteEngine On
RewriteCond %{ENV:GET_AUTH} !1
RewriteRule ^ - [R=503,L]
AuthType Basic
AuthName "Forbidden"
AuthUserFile /path/to/.htpasswd
Require valid-user
Satisfy any
Order allow,deny
Allow from all
Deny from env=GET_AUTH
This code will stop any IP but yours and the other one with an HTTP 503 error (the maintenance-specific and Google-friendly status code), rather than a redirect.
Your two allowed IPs, on the other hand, get the authentication process.
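If you also want visitors to see a friendly maintenance page instead of Apache's default 503 page, here is a sketch along the same lines (maintenance.html is a hypothetical file you would create yourself; it must be excluded from the 503 rule or the error page itself gets a 503):
ErrorDocument 503 /maintenance.html
RewriteEngine On
# only your two IPs set GET_AUTH; everyone else falls through to the 503
RewriteCond %{ENV:GET_AUTH} !1
# let the error page itself through
RewriteCond %{REQUEST_URI} !^/maintenance\.html$
RewriteRule ^ - [R=503,L]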
I've got a password-protected site, and I'm trying to allow a specific URL through so that it works for a payment callback. The site is built using CakePHP.
The below works great; however, the "Allow from env=allow" line is just not being taken into account (I've tried with my own IP address too). The setenvif module is enabled in Apache, and the other "Allow from" lines work fine. FYI, it's running on Ubuntu on EC2. I've also searched the site for similar issues and solutions, but to no avail.
I've checked the $_SERVER global array in PHP for the "allow" environment variable and it exists, so I'm running out of ideas. Any help would be much appreciated!
SetEnvIf Request_URI ^/secure_trading/callback allow=1
SetEnvIf Request_URI ^/secure_trading/callback$ allow=1
SetEnvIf Request_URI "/secure_trading/callback" allow=1
SetEnvIf Request_URI "/app/weboot/secure_trading/callback" allow=1
AuthName "Protected"
AuthGroupFile /dev/null
AuthType Basic
AuthUserFile /var/www/domain.co.uk/.htpasswd
Order deny,allow
Satisfy Any
Deny from all
Allow from 127.0.0.1
Allow from env=allow
require valid-user
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteRule ^$ app/webroot/ [L]
RewriteRule (.*) app/webroot/$1 [L]
</IfModule>
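A hedged guess, given the first question above: because the RewriteRule sends everything through app/webroot via an internal redirect, the variable may only be visible to the Allow check under the REDIRECT_ prefix. The extra line that would cover that case looks like this (untested against the asker's setup):
SetEnvIf Request_URI "^/secure_trading/callback" allow=1
Allow from env=allow
# the internal redirect re-creates the variable as REDIRECT_allow
Allow from env=REDIRECT_allow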
I've got a problem because of 360Spider: this bot makes too many requests per second to my VPS and slows it down (CPU usage jumps to 10-70%, whereas usually it is 1-2%). I looked into the httpd logs and saw lines like these:
182.118.25.209 - - [06/Sep/2012:19:39:08 +0300] "GET /slovar/znachenie-slova/42957-polovity.html HTTP/1.1" 200 96809 "http://www.hrinchenko.com/slovar/znachenie-slova/42957-polovity.html" "Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN; rv:1.8.0.11) Gecko/20070312 Firefox/1.5.0.11; 360Spider
182.118.25.208 - - [06/Sep/2012:19:39:08 +0300] "GET /slovar/znachenie-slova/52614-rospryskaty.html HTTP/1.1" 200 100239 "http://www.hrinchenko.com/slovar/znachenie-slova/52614-rospryskaty.html" "Mozilla/5.0 (Windows; U; Windows NT 5.1; zh-CN; rv:1.8.0.11) Gecko/20070312 Firefox/1.5.0.11; 360Spider
etc.
How can I block this spider completely via robots.txt? My robots.txt currently looks like this:
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
User-agent: YoudaoBot
Disallow: /
User-agent: sogou spider
Disallow: /
I've added lines:
User-agent: 360Spider
Disallow: /
but that does not seem to work. How can I block this angry bot?
If you suggest blocking it via .htaccess, note that mine currently looks like this:
# Turn on URL rewriting
RewriteEngine On
# Installation directory
RewriteBase /
SetEnvIfNoCase Referer ^360Spider$ block_them
Deny from env=block_them
# Protect hidden files from being viewed
<Files .*>
Order Deny,Allow
Deny From All
</Files>
# Protect application and system files from being viewed
RewriteRule ^(?:application|modules|system)\b.* index.php/$0 [L]
# Allow any files or directories that exist to be displayed directly
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Rewrite all other URLs to index.php/URL
RewriteRule .* index.php/$0 [PT]
And, in spite of the presence of
SetEnvIfNoCase Referer ^360Spider$ block_them
Deny from env=block_them
this bot still tries to kill my VPS and keeps showing up in the access logs.
In your .htaccess file, simply add the following:
RewriteCond %{REMOTE_ADDR} ^(182\.118\.2)
RewriteRule ^.*$ http://182.118.25.209/take_a_hike_moron [R=301,L]
This will catch ALL the bots being launched from the 182.118.2xx.xxx range and send them back to themselves...
The crappy 360 bot is being fired from servers in China... so as long as you don't mind saying bye-bye to crappy Chinese traffic from that IP range, this is guaranteed to make those puppies disappear from reaching any files on your web site.
The following two lines in your .htaccess file will also pick it off, simply because it is stupid enough to proudly put 360Spider in its user agent string. This could be handy for when they use other IP ranges than the 182.118.2xx.xxx:
RewriteCond %{HTTP_USER_AGENT} .*(360Spider) [NC]
RewriteRule ^.*$ http://182.118.25.209/take_a_hike_moron [R=301,L]
And yes... I hate them too!
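If you would rather not bounce them anywhere, a minimal variant of the same idea simply refuses the range with a 403 (a sketch, same IP range as above):
RewriteCond %{REMOTE_ADDR} ^182\.118\.2
RewriteRule ^ - [F,L]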
Your robots.txt seems right, but some bots just ignore it (malicious bots crawl from any IP address, from botnets of hundreds to millions of infected devices all around the globe). In that case you can limit the number of requests per second using the mod_security module for Apache 2.x.
Config example here: http://blog.cherouvim.com/simple-dos-protection-with-mod_security/
[EDIT] On Linux, iptables also allows restricting tcp:port connections per (x) second(s) per IP, provided conntrack capabilities are enabled in your kernel. See: https://serverfault.com/questions/378357/iptables-dos-limit-for-all-ports
You can put the following rules into your .htaccess file (the bot announces itself in the User-Agent header, as your log lines show, so match on that rather than on Referer):
RewriteEngine On
RewriteBase /
SetEnvIfNoCase User-Agent "360Spider" block_them
Deny from env=block_them
Note: Apache module mod_setenvif should be enabled in your server configuration
The person running the crawler might be ignoring robots.txt. You could block them by IP in .htaccess:
Order Deny,Allow
Deny from 216.86.192.196
or match on the user agent and deny that instead:
SetEnvIfNoCase User-Agent "360Spider" blocked
Deny from env=blocked
I have lines in my .htaccess file like this to block bad bots:
# serve forbidden.php as the body of the 402 response
ErrorDocument 402 /forbidden.php
RewriteEngine On
RewriteCond %{ENV:bad} 1
RewriteCond %{REQUEST_URI} !/forbidden.php
RewriteRule (.*) - [R=402,L]
# the SetEnvIf lines can come after the rewrite block; mod_setenvif runs first anyway
SetEnvIf Remote_Addr "^38\.99\." bad=1
SetEnvIf Remote_Addr "^210\.195\.45\." bad=1
SetEnvIf Remote_Addr "^207\.189\." bad=1
SetEnvIf Remote_Addr "^69\.84\.207\." bad=1
# ...
SetEnvIf Remote_Addr "^221\.204\." bad=1
SetEnvIf User-Agent "360Spider" bad=1
It will send the status code 402 Payment Required to all blacklisted IPs / user-agents.
You can put anything that you want displayed to the bot in forbidden.php.
It's quite effective.
I just had to block 360Spider. Solved with StreamCatcher on IIS (IIS7), which fortunately was already installed so only a small configuration change was needed. Details at http://needs-be.blogspot.com/2013/02/how-to-block-spider360.html
I use the following, and it helps a lot! It checks HTTP_USER_AGENT for bad bots:
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_URI} !^/robots\.txt$
RewriteCond %{REQUEST_URI} !^/error\.html$
RewriteCond %{HTTP_USER_AGENT} EasouSpider [NC,OR]
RewriteCond %{HTTP_USER_AGENT} YisouSpider [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Sogou\ web\ spider [NC,OR]
RewriteCond %{HTTP_USER_AGENT} 360Spider [NC]
RewriteRule ^.*$ - [F,L]
</IfModule>
<IfModule mod_setenvif.c>
SetEnvIfNoCase User-Agent "EasouSpider" bad_bot
SetEnvIfNoCase User-Agent "YisouSpider" bad_bot
SetEnvIfNoCase User-Agent "LinksCrawler" bad_bot
SetEnvIfNoCase User-Agent "360Spider" bad_bot
Order Allow,Deny
Allow from All
Deny from env=bad_bot
</IfModule>
I'm trying to deny all and allow only a single IP, but I would also like the following .htaccess to keep working for that single IP. I'm not finding a way to have both: the deny-all-and-allow-one part, plus the following options:
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
#Removes access to the system folder by users.
#Additionally this will allow you to create a System.php controller,
#previously this would not have been possible.
#'system' can be replaced if you have renamed your system folder.
RewriteCond %{REQUEST_URI} ^system.*
RewriteRule ^(.*)$ /index.php?/$1 [L]
#When your application folder isn't in the system folder
#This snippet prevents user access to the application folder
#Submitted by: Fabdrol
#Rename 'application' to your applications folder name.
RewriteCond %{REQUEST_URI} ^application.*
RewriteRule ^(.*)$ /index.php?/$1 [L]
#Checks to see if the user is attempting to access a valid file,
#such as an image or css document, if this isn't true it sends the
#request to index.php
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php?/$1 [L]
</IfModule>
<IfModule !mod_rewrite.c>
# If we don't have mod_rewrite installed, all 404's
# can be sent to index.php, and everything works as normal.
# Submitted by: ElliotHaughin
ErrorDocument 404 /index.php
</IfModule>
Is there a way to make this work?
order deny,allow
deny from all
allow from <your ip>
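For completeness, a sketch of how this combines with the question's file: the access-control lines can simply sit at the top of the same .htaccess, before the <IfModule> sections, since access control is evaluated in a separate phase from mod_rewrite (203.0.113.7 is a placeholder address):
order deny,allow
deny from all
allow from 203.0.113.7
# ...the <IfModule mod_rewrite.c> block from the question follows unchanged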
I know this question already has an accepted answer, but the Apache documentation says:
The Allow, Deny, and Order directives, provided by mod_access_compat,
are deprecated and will go away in a future version. You should avoid
using them, and avoid outdated tutorials recommending their use.
So, a more future-proof answer would be:
<RequireAll>
Require ip xx.xx.xx.xx yy.yy.yy.yy
</RequireAll>
Hopefully, I've helped prevent this page from becoming one of those "outdated tutorials". :)
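As a usage note: Require ip also accepts multiple addresses and CIDR ranges in a single directive, so an allow-list needs nothing more than this (placeholder addresses):
Require ip 203.0.113.7 198.51.100.0/24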
The accepted answer can be improved by using the directive designed for that task, ErrorDocument:
ErrorDocument 403 /specific_page.html
Order Allow,Deny
Allow from 111.222.333.444
Where 111.222.333.444 is your static IP address.
When using the "Order Allow,Deny" directive, a request must match either Allow or Deny; if neither is matched, the request is denied.
http://httpd.apache.org/docs/2.2/mod/mod_authz_host.html#order
Slightly modified version of the above, including a custom page to be displayed to those who get denied access:
ErrorDocument 403 /specific_page.html
order deny,allow
deny from all
allow from 111.222.333.444
...and that way those requests not coming from 111.222.333.444 will see specific_page.html
(Posting this as a comment looked terrible because new lines get lost.)
Improving a bit more on the previous answers: a maintenance page can be shown to your users while you perform changes to the site:
ErrorDocument 403 /maintenance.html
Order Allow,Deny
Allow from #.#.#.#
Where:
#.#.#.# is your IP address (see "What Is My IP Address?")
For maintenance.html there is a nice example here: "Simple Maintenance Page"
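One caveat, hedged: if denied visitors get Apache's default 403 page with a note that an ErrorDocument also failed, the deny rules are probably catching maintenance.html itself. A sketch that exempts the error page:
ErrorDocument 403 /maintenance.html
Order Allow,Deny
Allow from #.#.#.#
# make the maintenance page itself reachable by everyone
<Files "maintenance.html">
Order Allow,Deny
Allow from all
</Files>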
Add the following directives to your .htaccess file, and place that file in your htdocs folder.
Order Deny,Allow
Deny from all
Allow from <your ip>
Allow from <another ip>
Just in addition to @David Brown's answer: if you want to block an IP, you must first allow all and then block the IPs, as such:
<RequireAll>
Require all granted
Require not ip 10.0.0.0/255.0.0.0
Require not ip 172.16.0.0/12
Require not ip 192.168
</RequireAll>
First line allows all
Second line blocks from 10.0.0.0 to 10.255.255.255
Third line blocks from 172.16.0.0 to 172.31.255.255
Fourth line blocks from 192.168.0.0 to 192.168.255.255
You may use any of the notations mentioned above to suit your CIDR needs.
I wasn't able to use the 403 method because I wanted the maintenance page and its images in a subfolder on my server, so I used the following approach to redirect everyone but a single IP to a 'maintenance page':
RewriteEngine on
RewriteCond %{REMOTE_ADDR} !**.**.**.*
RewriteRule !^maintenance/ http://www.website.co.uk/maintenance/ [R=302,L]
Source: Creating a holding page to hide your WordPress blog
order deny,allow
deny from all
allow from <your IP>
Source: using htaccess to restrict access by IP
You can use the following in .htaccess to allow and deny access to your site:
SetEnvIf remote_addr ^1\.2\.3\.4\.5$ allowedip=1
Order deny,allow
deny from all
allow from env=allowedip
We first set an env variable, allowedip, if the client IP address matches the pattern; when it matches, allowedip is assigned the value 1.
In the next step, we use the Allow/Deny directives to control access to the site. "Order deny,allow" sets the order in which the deny and allow rules are evaluated; "deny from all" tells the server to deny everyone; the last line, "allow from env=allowedip", grants access only to requests carrying the env variable we set.
Replace 1\.2\.3\.4\.5 with your allowed IP address.
References:
https://httpd.apache.org/docs/2.4/mod/mod_setenvif.html
https://httpd.apache.org/docs/2.4/mod/mod_access_compat.html
You can have more than one IP, or even allow by other criteria such as user or hostname; more info here: https://www.askapache.com/htaccess/setenvif/
SetEnvIf remote_addr ^123\.123\.123\.1$ allowedip=1
SetEnvIf remote_addr ^123\.123\.123\.2$ allowedip=1
SetEnvIf remote_addr ^123\.123\.123\.3$ allowedip=1
SetEnvIf remote_addr ^123\.123\.123\.4$ allowedip=1
Order deny,allow
deny from all
allow from env=allowedip
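The four SetEnvIf lines can also be collapsed into a single alternation, a sketch using the same example addresses:
SetEnvIf remote_addr "^123\.123\.123\.(1|2|3|4)$" allowedip=1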
ErrorDocument 403 /maintenance.html
Order Allow,Deny
Allow from #:#:#:#:#:#
For me, this seems to work using IPv6 rather than IPv4. I don't know if this differs for some websites, but for mine it works.
If you want to use mod_rewrite for access control, you can use conditions on the user agent, HTTP referer, remote address, etc.
Example:
# replace x.x.x.x with your IP address
RewriteCond %{REMOTE_ADDR} !=x.x.x.x
RewriteRule ^ - [F]
References:
https://httpd.apache.org/docs/2.4/rewrite/access.html
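For instance, a sketch of the referer variant mentioned above (badsite.example is a made-up domain):
RewriteCond %{HTTP_REFERER} badsite\.example [NC]
RewriteRule ^ - [F]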