An unknown bot (GbPlugin) is URL-encoding the URLs of my images and causing 404 errors.
I tried to block the bot with this at the bottom of my .htaccess, but it didn't work.
Options +FollowSymlinks
RewriteEngine On
RewriteBase /
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_USER_AGENT} ^$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^GbPlugin [NC]
RewriteRule .* - [F,L]
The log shows this:
201.26.16.9 - - [10/Sep/2011:00:06:05 -0300] "GET /wp%2Dcontent/themes/my_theme%2Dpremium/scripts/timthumb.php%3Fsrc%3Dhttp%3A%2F%2Fwww.example.com%2Fwp%2Dcontent%2Fuploads%2F2011%2F08%2Fmy_image_name.jpg%26w%3D100%26h%3D65%26zc%3D1%26q%3D100 HTTP/1.1" 404 1047 "-" "GbPlugin"
Sorry for my language mistakes.
Here's what you can put in your .htaccess file:
Options +FollowSymlinks
RewriteEngine On
RewriteBase /
# flag requests with an empty Referer or a known bad User-Agent
SetEnvIfNoCase Referer "^$" bad_user
SetEnvIfNoCase User-Agent "^GbPlugin" bad_user
SetEnvIfNoCase User-Agent "^Wget" bad_user
SetEnvIfNoCase User-Agent "^EmailSiphon" bad_user
SetEnvIfNoCase User-Agent "^EmailWolf" bad_user
SetEnvIfNoCase User-Agent "^libwww-perl" bad_user
# refuse flagged requests with a 403
Deny from env=bad_user
This will return:
HTTP request sent, awaiting response... 403 Forbidden
2011-09-10 11:15:48 ERROR 403: Forbidden.
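If your server runs Apache 2.4 or later, where the old Allow/Deny access directives are deprecated, the same idea can be expressed with Require. A minimal sketch, assuming mod_setenvif and mod_authz_core are available:
SetEnvIfNoCase User-Agent "^GbPlugin" bad_user
<RequireAll>
    # admit everyone, then subtract the flagged requests
    Require all granted
    Require not env bad_user
</RequireAll>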
May I recommend this method?
Put this in the .htaccess file in the root of your site:
ErrorDocument 503 "Your connection was refused"
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^(Mozilla.*537.36|Mozilla.*UCBrowser\/9.3.1.344)$ [NC]
RewriteRule .* - [R=503,L]
Where
^(Mozilla.*537.36|Mozilla.*UCBrowser\/9.3.1.344)$
are the two user agents I wanted to block in this example.
You can use regex, so a user agent like
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:40.0) Gecko/20100101 Firefox/40.0
could be matched with
Mozilla.*Firefox\/40.0
^ means match from the beginning and $ to the end, so you could block just one user agent with:
ErrorDocument 503 "Your connection was refused"
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*Firefox\/40.0$ [NC]
RewriteRule .* - [R=503,L]
Or add several, using the | character to separate them inside ( and ), like in the first example:
RewriteCond %{HTTP_USER_AGENT} ^(Mozilla.*537.36|Mozilla.*UCBrowser\/9.3.1.344)$ [NC]
You can test it by putting your own user agent in the code and then trying to access the site. You can find your user agent at http://whatsmyuseragent.com/
To block empty referers, you can use the following rule:
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^$
RewriteRule ^ - [F,L]
This will forbid all requests to your site if the HTTP_REFERER value is empty (^$).
To block user agents, you can use:
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} opera|firebox|foo|bar [NC]
RewriteRule ^ - [F,L]
This will forbid all requests to your site if the HTTP_USER_AGENT matches the condition's pattern.
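Both patterns can also be combined into one block. A minimal sketch that would have caught GbPlugin from the first question, which sends no Referer at all (that is why the !^$ condition in the original attempt never matched):
RewriteEngine on
# forbid requests that have no Referer AND an empty or blacklisted User-Agent
RewriteCond %{HTTP_REFERER} ^$
RewriteCond %{HTTP_USER_AGENT} ^$ [OR]
RewriteCond %{HTTP_USER_AGENT} GbPlugin [NC]
RewriteRule ^ - [F,L]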
In .htaccess, how can I block every visitor except those who come from a specific domain?
I tried this but without any success:
# serve everyone from specific-domain or specific-user-agent
RewriteCond %{HTTP_REFERER} ^https?://www.specific-domain.com
RewriteRule ^ - [L]
# everybody else receives a forbidden
RewriteRule ^ - [F]
ErrorDocument 403 /forbidden.html
Update: I had some success with the code below, BUT it broke my webpage, apparently because the directives that follow it override or disturb the appearance. Does anyone have a clue how to order it the right way?
<IfModule mod_rewrite.c>
RewriteEngine On
# allow anyone arriving from the authorized referer
RewriteCond %{HTTP_REFERER} ^https://authorizedreferer.com
RewriteRule ^ - [L]
# redirect everyone else to the unprotected site
RewriteRule ^ https://unprotected.mydomain.com/ [R,L]
# standard WordPress front-controller rules
RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# serve everyone from specific-domain or specific-user-agent
RewriteCond %{HTTP_REFERER} ^https?://www.specific-domain.com
RewriteRule ^ - [L]
# everybody else receives a forbidden
RewriteRule ^ - [F]
This will indeed allow requests that link from specific-domain.com (i.e. this domain is the HTTP Referer) and block everything else. However, it will also block all requests for your static resources that originate from your own site, where your domain is the Referer. So you need to also allow requests from your own domain.
You should also probably allow an empty Referer header, i.e. direct requests, such as when a user types the URL into the browser's address bar. Also note that the Referer header can be suppressed in other ways depending on the referrer policy set by the originating website, and users themselves can override the Referer header, so relying on it is not fully reliable.
Try the following:
# Serve everyone from specific-domain (and internal requests)
RewriteCond %{HTTP_REFERER} ^https?://www\.your-domain\.com/ [OR]
RewriteCond %{HTTP_REFERER} ^https?://www\.specific-domain\.com/
RewriteRule ^ - [L]
# everybody else receives a forbidden
RewriteRule ^ - [F]
And to allow an empty Referer, include an additional condition:
# Serve everyone from specific-domain (and internal requests and empty referer)
RewriteCond %{HTTP_REFERER} ^$ [OR]
RewriteCond %{HTTP_REFERER} ^https?://www\.your-domain\.com/ [OR]
RewriteCond %{HTTP_REFERER} ^https?://www\.specific-domain\.com/
RewriteRule ^ - [L]
Note that you are currently allowing http or https in the Referer. If it is always https then be specific and remove the ? (optional quantifier), i.e. ^https://www\.specific-domain\.com/. And remember to backslash-escape the literal dots.
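If the main goal is to stop other sites from triggering your pages without ever breaking your own static resources, an alternative sketch is to exempt common asset extensions from the rule entirely (the extension list is only an example):
# forbid page requests whose Referer is neither empty, your own site,
# nor the allowed external domain; static assets are never touched
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://www\.your-domain\.com/ [NC]
RewriteCond %{HTTP_REFERER} !^https?://www\.specific-domain\.com/ [NC]
RewriteRule !\.(css|js|png|jpe?g|gif|ico|svg|woff2?)$ - [F]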
I tried to block bad bots via .htaccess with this code:
I know these are two ways to do so, but neither of them is working; I still see the bots in the access log. What am I doing wrong?
RewriteCond %{HTTP_USER_AGENT} ^BLEXBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^SemrushBot [NC,OR]
SetEnvIfNoCase User-Agent "BLEXBot" rotbot
SetEnvIfNoCase User-Agent "SemrushBot" rotbot
<Limit POST GET HEAD PUT>
Order Allow,Deny
Allow from all
Deny from env=rotbot
</Limit>
The entries in the access log look like this:
domain.org:443 46.229.168.142 - - [22/Jul/2019:08:56:26 +0200] "GET /path/to/page/ HTTP/1.1" 403 3801 "-" "Mozilla/5.0 (compatible; SemrushBot/3~bl; +http://www.semrush.com/bot.html)"
domain.org:443 94.130.219.232 - - [22/Jul/2019:08:56:24 +0200] "GET /path/to/page/ HTTP/1.1" 403 760 "-" "Mozilla/5.0 (compatible; BLEXBot/1.0; +http://webmeup-crawler.com/)"
Fix these rules to:
<IfModule mod_rewrite.c>
RewriteEngine On
# match the bot names anywhere in the UA string; the logged agents start
# with "Mozilla/5.0 (compatible; ...", so a leading ^ anchor would never match
RewriteCond %{HTTP_USER_AGENT} BLEXBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} SemrushBot [NC]
RewriteRule ^.* - [F,L]
</IfModule>
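Alternatively, if you want to keep the SetEnvIf variant from your second attempt, a sketch without the <Limit> wrapper (restricting it to POST GET HEAD PUT would leave every other HTTP method unprotected):
SetEnvIfNoCase User-Agent "BLEXBot" rotbot
SetEnvIfNoCase User-Agent "SemrushBot" rotbot
# no <Limit> wrapper: apply the denial to every HTTP method
Order Allow,Deny
Allow from all
Deny from env=rotbot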
Currently I use this code in my .htaccess file to trigger my site's maintenance page.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{DOCUMENT_ROOT}/maintenance.html -f
RewriteCond %{DOCUMENT_ROOT}/maintenance.enable -f
RewriteCond %{SCRIPT_FILENAME} !maintenance.html
RewriteRule ^.*$ /maintenance.html [R=503,L]
ErrorDocument 503 /maintenance.html
Header Set Cache-Control "max-age=0, no-store"
</IfModule>
How would I make it so I could still access a directory I'd be working on, and not restrict access just to my own IP address, in case my friend wishes to access it too?
I have done some searching but haven't found anything as of yet.
Cheers
Tom
I'd say you can simply implement some exception rules prior to your handling of the maintenance mode:
<IfModule mod_rewrite.c>
RewriteEngine On
# immediately end all rewriting for specific IPV4 addresses
RewriteCond %{REMOTE_ADDR} ^123\.123\.123\.123$ [OR]
RewriteCond %{REMOTE_ADDR} ^321\.321\.321\.321$
RewriteRule ^ - [END]
# for everyone else: check for maintenance mode
RewriteCond %{DOCUMENT_ROOT}/maintenance.html -f
RewriteCond %{DOCUMENT_ROOT}/maintenance.enable -f
RewriteCond %{SCRIPT_FILENAME} !maintenance.html
RewriteRule ^.*$ /maintenance.html [R=503,L]
ErrorDocument 503 /maintenance.html
Header Set Cache-Control "max-age=0, no-store"
</IfModule>
Alternatively you could add the negated conditions as additional conditions to the maintenance mode rewriting logic. But I think the above is easier to read and maintain.
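To also keep a specific directory reachable while maintenance mode is on, the same early-exit trick works. A minimal sketch, assuming /dev/ is a hypothetical directory you are working in; place it right before the maintenance conditions:
# end all rewriting for the work-in-progress directory
RewriteRule ^dev/ - [END]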
I've got .htaccess blocking all user agents, allowing only the ones I need.
I need to allow Cloudflare access. How can I allow it without matching on (Mozilla)?
This is the user agent I got:
Mozilla/5.0 (compatible; CloudFlare-AlwaysOnline/1.0; +http://www.cloudflare.com/always-online)
RewriteEngine on
AuthType Basic
AuthName "private"
AuthUserFile "/home/example/.htpasswds/public_html/exemple/passwd"
require valid-user
#-only allow-#
SetEnvIf User-Agent .0011 0011
Order deny,allow
Deny from all
Allow from env=0011
#-index only open for 0011-#
Options +Indexes
RewriteCond %{REQUEST_FILENAME} -d
RewriteCond %{HTTP_USER_AGENT} !0011 [NC]
RewriteRule . - [F]
You can use:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} !CloudFlare-AlwaysOnline [NC]
RewriteRule ^ - [F]
But do not do that, because for all normal requests Cloudflare passes along the visitor's own browser user agent, so this rule would block your real visitors.
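If the goal is only to let the Always Online crawler through the allow-list you already have, a sketch that reuses the 0011 environment variable from your question:
# flag Cloudflare's Always Online crawler so the existing
# "Allow from env=0011" line lets it through
SetEnvIfNoCase User-Agent "CloudFlare-AlwaysOnline" 0011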
How can I ban requests from Pingdom Tools?
Their hostnames look like this:
s464.pingdom.com
So how can I ban all hostnames ending with pingdom.com?
You could use a rewrite rule.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTP_REFERER} ^https?://([^.]+\.)*tumblr\.com [NC]
RewriteRule .* - [F]
</IfModule>
http://davidwalsh.name/block-domain
Note: This blocks based on referer, which can be spoofed or left out entirely.
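Adapted to this question's domain, assuming you wanted to match pingdom.com as a Referer (though Pingdom's probes typically send no Referer at all, which is why the update below is more useful):
RewriteEngine On
RewriteCond %{HTTP_REFERER} ^https?://([^.]+\.)*pingdom\.com [NC]
RewriteRule .* - [F]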
Update: On servers that do reverse DNS you can try:
deny from .pingdom.com
https://httpd.apache.org/docs/2.0/mod/mod_access.html#allow
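Spelled out in full in Apache 2.2 syntax (again assuming your server performs the double-reverse DNS lookup this relies on):
Order Allow,Deny
Allow from all
Deny from .pingdom.com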
RewriteEngine on
# Options +FollowSymlinks
RewriteCond %{HTTP_REFERER} somesite\.com [NC]
RewriteRule .* - [F]
Well, since it's not easy to ban the hostname, I just banned the user agent:
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^Pingdom.com
RewriteRule ^(.*)$ http://go.away/
Works fine.
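If you'd rather answer with a 403 than redirect the bot to a nonexistent host, a minimal variant of the same user-agent match:
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^Pingdom\.com [NC]
RewriteRule ^ - [F]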