I have added the following code to my .htaccess but am still seeing referrers from semalt, such as:
74.semalt.com
89.semalt.com
The code:
# Block visits from semalt.com
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://([^.]+\.)*semalt\.com [NC]
RewriteRule .* - [F]
Any idea how these referrers are bypassing this rule (which I found online) and how I can fully prevent them?
Your code looks fine and the syntax checks out. I have used these mod_rewrite conditions (each is an alternative pattern to pair with a blocking rule; see the sketch below):
RewriteCond %{HTTP_REFERER} ^http(s)?://(www\.)?semalt\.com.*$ [NC]
RewriteCond %{HTTP_REFERER} ^http(s)?://(.*\.)?semalt\..*$ [NC]
RewriteCond %{HTTP_REFERER} ^https?://([^.]+\.)*semalt\.com [NC]
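Each of those conditions is an alternative; whichever you use needs to be paired with a blocking rule. A minimal sketch using the subdomain-aware pattern, which also matches referrers like 74.semalt.com and 89.semalt.com:
# deny any request whose referrer is semalt.com or one of its subdomains
RewriteEngine On
RewriteCond %{HTTP_REFERER} ^https?://([^.]+\.)*semalt\.com [NC]
RewriteRule ^ - [F,L]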
or, using mod_setenvif:
SetEnvIfNoCase Referer semalt.com spambot=yes
SetEnvIfNoCase REMOTE_ADDR "217\.23\.11\.15" spambot=yes
SetEnvIfNoCase REMOTE_ADDR "217\.23\.7\.144" spambot=yes
Order allow,deny
Allow from all
Deny from env=spambot
I even created an Apache, Nginx & Varnish blacklist plus a Google Analytics segment to prevent referrer spam traffic; you can find it here:
https://github.com/Stevie-Ray/referrer-spam-blocker/
Here is updated code covering many spam referral sites, using regular expressions:
<IfModule mod_rewrite.c>
RewriteEngine On
Options +FollowSymLinks
RewriteCond %{HTTP_REFERER} (?:o-o-6-o-o|bestwebsitesawards|s.click.aliexpress|simple-share-buttons|see-your-website-here|forum.topic55198628.darodar|hulfingtonpost|ilovevitaly|priceg|blackhatworth|semalt.semalt|kambasoft|buttons-for-website|BlackHatWorth|7makemoneyonline)\.com [NC,OR]
RewriteCond %{HTTP_REFERER} (?:lomb|lombia|econom|lumb)\.co [NC,OR]
RewriteCond %{HTTP_REFERER} (?:cenoval|Iskalko)\.ru [NC,OR]
RewriteCond %{HTTP_REFERER} (?:smailik|humanorightswatch)\.org [NC,OR]
RewriteCond %{HTTP_REFERER} (?:ranksonic|savetubevideo)\.info [NC]
RewriteRule ^ - [F]
</IfModule>
Hope someone finds this useful.
Ok, this is how I got it to work:
# Block visits from semalt.com
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^http://([^.]+\.)*semalt\.com [NC]
RewriteRule (.*) http://www.semalt.com [R=301,L]
I tried all manner of these sample snippets from all over the web. None of them worked, and Semalt kept changing their domains and paths.
I suggest the following, which works well for me and has sane syntax. It applies to any referrer that contains the string semalt.com. Note that you need Apache 2.4 (for the <If> directive) for this to work. It can go in your .htaccess, or in theory in your main Apache config.
<If "%{HTTP_REFERER} =~ /semalt.com/">
Deny from all
</If>
Good luck!
Update: if this causes a 500 error, you need to allow overrides for your .htaccess in your main Apache config. In this example my .htaccess is in my web server root of /var/www/wordpress, so my .conf contains:
<Directory /var/www/wordpress>
Options +FollowSymLinks
AllowOverride all
Require all granted
</Directory>
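One more note: Deny from all is the old Apache 2.2 syntax and only works on 2.4 through mod_access_compat. If that module is not loaded, a pure 2.4 version of the same block would be (a sketch along the same lines, not something I have needed myself):
<If "%{HTTP_REFERER} =~ /semalt\.com/">
Require all denied
</If>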
Here's another approach for blocking the ever-growing list of botnet hosts:
# Block Common Botnets
SetEnvIfNoCase Referer fbdownloader.com spambot=yes
SetEnvIfNoCase Referer descargar-musicas-gratis.com spambot=yes
SetEnvIfNoCase Referer baixar-musicas-gratis.com spambot=yes
SetEnvIfNoCase Referer savetubevideo.com spambot=yes
SetEnvIfNoCase Referer srecorder.com spambot=yes
SetEnvIfNoCase Referer kambasoft.com spambot=yes
SetEnvIfNoCase Referer semalt.com spambot=yes
SetEnvIfNoCase Referer ilovevitaly.com spambot=yes
SetEnvIfNoCase Referer 7makemoneyonline.com spambot=yes
SetEnvIfNoCase Referer buttons-for-website.com spambot=yes
SetEnvIfNoCase Referer econom.co spambot=yes
SetEnvIfNoCase Referer acunetix-referrer.com spambot=yes
SetEnvIfNoCase Referer yougetsignal.com spambot=yes
SetEnvIfNoCase Referer darodar.com spambot=yes
Order allow,deny
Allow from all
Deny from env=spambot
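On Apache 2.4, where Order/Allow/Deny only work through mod_access_compat, the same blacklist can be expressed with mod_authz_core instead (a sketch using the spambot variable set above):
<RequireAll>
# allow everyone except requests flagged as spambot
Require all granted
Require not env=spambot
</RequireAll>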
I'm writing a script that will alter the root's .htaccess file to block user agents (Yandex in this example). I wrote it to use SetEnvIf, but it wasn't working. In another post here I was told this was because a redirect earlier in the file was causing it to fail, due to the .htaccess file being reloaded. I added a RewriteCond statement to do the block instead and it worked. If I place the SetEnvIf statement at the top of the file, it also works correctly.
While researching this, others have stated that mod_rewrite is handled before mod_alias, but I can't find any mention of when mod_setenvif is handled. There also seems to be little information about the order in which things in an .htaccess file are processed in general. Below is an example of what is in my .htaccess file. Even as a novice I can see that it is not ordered well.
But before I deal with that, I wanted to ask: where is the proper place for the SetEnvIf statement?
And am I correct in assuming that any SetEnvIf, RewriteCond, or Deny statement that causes a block or a redirect to another site should be placed at the top?
Options +FollowSymLinks -Indexes
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} "WOW32" [NC,OR]
RewriteCond %{HTTP_USER_AGENT} "WOW64" [NC]
RewriteRule ^(.*)$ - [F]
RewriteCond %{SERVER_PORT} 80
RewriteRule ^(.*)$ https://example.com/$1 [R,L]
redirect 301 /about_us.php https://example.com/about.html
<Files test.html>
order allow,deny
</Files>
<FilesMatch "\.(inc|tpl|h|ihtml|sql|ini|conf|class|bin|spd|theme|module|exe)$">
</FilesMatch>
<IfModule mod_expires.c>
ExpiresActive On
ExpiresByType image/jpg "access plus 1 year"
</IfModule>
<IfModule mod_deflate.c>
AddOutputFilterByType DEFLATE application/javascript
BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
Header append Vary User-Agent
</IfModule>
ErrorDocument 400 /server_error.php?id=400
order allow,deny
deny from 1.2.3.4/32
allow from all
SetEnvIfNoCase User-Agent "^yandex$" my_block
<IfModule !authz_core_module>
Order Allow,Deny
Allow from ALL
Deny from env=my_block
</IfModule>
RewriteCond %{HTTP_USER_AGENT} ^.*(yandex).*$ [NC,OR]
I have a WordPress installation in the root of my server, and I wanted to use the "opensourcepos" script in a new directory called "ospos".
If I try to access the link
www.mysite.it/ospos/
I am redirected to
www.mysite.it/public
which returns a 404 error.
To access the ospos installation I have to go to
www.mysite.it/ospos/public
only that way does everything work correctly.
Maybe there is some instruction in the ospos folder's .htaccess file
that does not redirect correctly.
.htaccess code of the WordPress root:
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
.htaccess code of root/ospos:
# redirect to public page
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{REQUEST_URI} !^public$
RewriteCond %{REQUEST_URI} !^/.well-known/acme-challenge [NC]
RewriteRule "^(.*)$" "/public/" [R=301,L]
</IfModule>
# disable directory browsing
# For security reasons, Option all cannot be overridden.
Options +ExecCGI +Includes +IncludesNOEXEC +SymLinksIfOwnerMatch -Indexes
# prevent folder listing
IndexIgnore *
# Apache 2.4
<IfModule authz_core_module>
# secure htaccess file
<Files .htaccess>
Require all denied
</Files>
# prevent access to PHP error log
<Files error_log>
Require all denied
</Files>
# prevent access to LICENSE
<Files LICENSE>
Require all denied
</Files>
# prevent access to csv, txt and md files
<FilesMatch "\.(csv|txt|md|yml|json|lock)$">
Require all denied
</FilesMatch>
</IfModule>
# Apache 2.2
<IfModule !authz_core_module>
# secure htaccess file
<Files .htaccess>
Order allow,deny
Deny from all
Satisfy all
</Files>
# prevent access to PHP error log
<Files error_log>
Order allow,deny
Deny from all
Satisfy all
</Files>
# prevent access to LICENSE
<Files LICENSE>
Order allow,deny
Deny from all
Satisfy all
</Files>
# prevent access to csv, txt and md files
<FilesMatch "\.(csv|txt|md|yml|json|lock)$">
Order allow,deny
Deny from all
Satisfy all
</FilesMatch>
</IfModule>
.htaccess code of root/ospos/public:
RewriteEngine On
# To redirect a subdomain to a subdir because of https not supporting wildcards
# replace values between <> with your ones
# RewriteCond %{HTTP_HOST} ^<OSPOS subdomain>\.<my web domain>\.com$ [OR]
# RewriteCond %{HTTP_HOST} ^www\.<OSPOS subdomain>\.<my web domain>\.com$
# RewriteRule ^/?$ "https\:\/\/www\.<my web domain>\.com\/<OSPOS path>" [R=301,L]
# To rewrite "domain.com -> www.domain.com" uncomment the following lines.
# RewriteCond %{HTTPS} !=on
# RewriteCond %{HTTP_HOST} !^www\..+$ [NC]
# RewriteCond %{HTTP_HOST} (.+)$ [NC]
# RewriteRule ^(.*)$ http://www.%1/$1 [R=301,L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# if in web root
# RewriteRule ^(.*)$ index.php?/$1 [L]
# if in subdir comment above line, uncomment below one and replace <OSPOS path> with your path
RewriteRule ^(.*)$ /ospos/public/index.php?/$1 [L]
# disable directory browsing
# For security reasons, Option all cannot be overridden.
#Options All -Indexes
Options +ExecCGI +Includes +IncludesNOEXEC +SymLinksIfOwnerMatch -Indexes
# prevent folder listing
IndexIgnore *
# Apache 2.4
<IfModule authz_core_module>
# secure htaccess file
<Files .htaccess>
Require all denied
</Files>
# prevent access to PHP error log
<Files error_log>
Require all denied
</Files>
</IfModule>
# Apache 2.2
<IfModule !authz_core_module>
# secure htaccess file
<Files .htaccess>
Order allow,deny
Deny from all
Satisfy all
</Files>
# prevent access to PHP error log
<Files error_log>
Order allow,deny
Deny from all
Satisfy all
</Files>
</IfModule>
<IfModule mod_expires.c>
<FilesMatch "\.(jpe?g|png|gif|js|css)$">
ExpiresActive On
ExpiresDefault "access plus 1 week"
</FilesMatch>
</IfModule>
I have read many questions and run many tests, but I still haven't been able to figure out where and what to change to make it redirect correctly.
Does anyone have any suggestions?
Thank you
PHP version: 7.3.13
MySQL version: 5.6.44-86.0
OS and version: CentOS 7
WebServer is: Apache 2.4
How should I edit:
RewriteCond %{REQUEST_URI} !^public$
RewriteCond %{REQUEST_URI} !^/.well-known/acme-challenge [NC]
RewriteRule "^(.*)$" "/public/" [R=301,L]
I need to protect a specific route/URL with a password on my CodeIgniter site.
The base URL looks like this:
https://staging.mysite.com/site-name
I want to protect this URL with a password using .htaccess:
https://staging.mysite.com/site-name/export
The routes look like this:
$route['export']['get'] = 'ExportController/index';
$route['export']['post'] = 'ExportController/export';
I've tried many different answers to similar problems, but I just can't get it to work properly:
either the password is requested everywhere, or it isn't requested at all.
Here is my .htaccess:
RewriteEngine on
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php/$1 [L,QSA]
</IfModule>
SetEnvIfNoCase Request_URI ^/export SECURED
AuthName "Restricted Area"
AuthType Basic
AuthUserFile "/home/something/path/to/.htpasswd"
AuthGroupFile /
Require valid-user
Satisfy any
Order Allow,Deny
Allow from all
Deny from env=SECURED
I think that the problem might be in this part:
SetEnvIfNoCase Request_URI ^/export SECURED
Because I just can't target the URL I want. Here are some other things I've tried:
SetEnvIfNoCase Request_URI ^/site-name/export SECURED
SetEnvIfNoCase Request_URI "^/site-name/export" SECURED
SetEnvIfNoCase Request_URI "^/export" SECURED
SetEnvIfNoCase Request_URI ^(.*)/export SECURED
SetEnvIfNoCase Request_URI .*/export$ SECURED
SetEnvIfNoCase Request_URI .*/export SECURED
Edit:
I've also tried to protect the entire ExportController like this; the password prompt does not appear anywhere.
<Files ExportController>
AuthName "ExportController"
AuthType Basic
AuthUserFile "/home/something/path/to/.htpasswd"
require valid-user
</Files>
I ended up doing this:
<If "%{THE_REQUEST} =~ m#^GET /site-name/export#">
AuthType Basic
AuthName "Password Required"
AuthUserFile /home/path/to/.htpasswd
Require valid-user
</If>
<IfModule mod_rewrite.c>
RewriteEngine on
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^(.*)$ index.php/$1 [L,QSA]
</IfModule>
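One note, since the routes map both GET and POST to the ExportController: the expression above only matches GET requests. If the POST route should be protected as well, a variation (untested sketch) would be:
<If "%{THE_REQUEST} =~ m#^(GET|POST) /site-name/export#">
AuthType Basic
AuthName "Password Required"
AuthUserFile /home/path/to/.htpasswd
Require valid-user
</If>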
I have an .htaccess that blocks all user agents and only allows the ones I need.
I want to allow Cloudflare access; how can I allow it without matching on (Mozilla)?
This is the user agent I see:
Mozilla/5.0 (compatible; CloudFlare-AlwaysOnline/1.0; +http://www.cloudflare.com/always-online)
RewriteEngine on
AuthType Basic
AuthName "private"
AuthUserFile "/home/example/.htpasswds/public_html/exemple/passwd"
require valid-user
#-only allow-#
SetEnvIf User-Agent .0011 0011
Order deny,allow
Deny from all
Allow from env=0011
#-index only open for 0011-#
Options +Indexes
RewriteCond %{REQUEST_FILENAME} -d
RewriteCond %{HTTP_USER_AGENT} !0011 [NC]
RewriteRule . - [F]
You can use:
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} !CloudFlare-AlwaysOnline [NC]
RewriteRule ^ - [F]
But do not do that: for normal requests Cloudflare passes along the visitor's own browser user agent, so a rule like this would block regular visitors as well.
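If the goal is simply to let the Always Online crawler through alongside your own client, one option is to flag its user agent with the same environment variable you already use (a sketch based on your SetEnvIf block; whether matching the AlwaysOnline substring is acceptable here is an assumption on my part):
# your existing rule
SetEnvIf User-Agent .0011 0011
# also flag Cloudflare's Always Online crawler
SetEnvIf User-Agent "CloudFlare-AlwaysOnline" 0011
Order deny,allow
Deny from all
Allow from env=0011
Note that the Basic Auth section will still challenge the crawler for a password; Satisfy Any would let anything matched by the Allow line in without one, so consider whether that trade-off is acceptable.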
A strange bot (GbPlugin) is encoding the URLs of my images and causing 404 errors.
I tried to block the bot with the following at the bottom of my .htaccess, but it didn't work.
Options +FollowSymlinks
RewriteEngine On
RewriteBase /
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_USER_AGENT} ^$ [OR]
RewriteCond %{HTTP_USER_AGENT} ^GbPlugin [NC]
RewriteRule .* - [F,L]
The log is below.
201.26.16.9 - - [10/Sep/2011:00:06:05 -0300] "GET /wp%2Dcontent/themes/my_theme%2Dpremium/scripts/timthumb.php%3Fsrc%3Dhttp%3A%2F%2Fwww.example.com%2Fwp%2Dcontent%2Fuploads%2F2011%2F08%2Fmy_image_name.jpg%26w%3D100%26h%3D65%26zc%3D1%26q%3D100 HTTP/1.1" 404 1047 "-" "GbPlugin"
Sorry for my language mistakes
Here's what you can put in your .htaccess file:
Options +FollowSymlinks
RewriteEngine On
RewriteBase /
SetEnvIfNoCase Referer "^$" bad_user
SetEnvIfNoCase User-Agent "^GbPlugin" bad_user
SetEnvIfNoCase User-Agent "^Wget" bad_user
SetEnvIfNoCase User-Agent "^EmailSiphon" bad_user
SetEnvIfNoCase User-Agent "^EmailWolf" bad_user
SetEnvIfNoCase User-Agent "^libwww-perl" bad_user
Deny from env=bad_user
This will return:
HTTP request sent, awaiting response... 403 Forbidden
2011-09-10 11:15:48 ERROR 403: Forbidden.
May I recommend this method?
Put this in the .htaccess in the root of your site:
ErrorDocument 503 "Your connection was refused"
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^(Mozilla.*537.36|Mozilla.*UCBrowser\/9.3.1.344)$ [NC]
RewriteRule .* - [R=503,L]
where
^(Mozilla.*537.36|Mozilla.*UCBrowser\/9.3.1.344)$
matches the two user agents I wanted to block in this example.
You can use a regex, so a user agent like
Mozilla/5.0 (Windows NT 6.1; WOW64; rv:40.0) Gecko/20100101 Firefox/40.0
could be matched with
Mozilla.*Firefox\/40.0
^ means match from the beginning and $ means match to the end, so you could block just one user agent with:
ErrorDocument 503 "Your connection was refused"
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^Mozilla.*Firefox\/40.0$ [NC]
RewriteRule .* - [R=503,L]
Or add several, using the | character to separate them inside ( and ), as in the first example.
RewriteCond %{HTTP_USER_AGENT} ^(Mozilla.*537.36|Mozilla.*UCBrowser\/9.3.1.344)$ [NC]
You can test it by putting your own user agent in the code and then trying to access the site. You can find your user agent at http://whatsmyuseragent.com/
To block empty referers, you can use the following rule:
RewriteEngine on
RewriteCond %{HTTP_REFERER} ^$
RewriteRule ^ - [F,L]
This will forbid all requests to your site if the HTTP_REFERER value is empty (^$ matches an empty string).
To block user agents, you can use:
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} opera|firebox|foo|bar [NC]
RewriteRule ^ - [F,L]
This will forbid all requests to your site if HTTP_USER_AGENT matches the condition pattern.
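If you want both checks in a single block, the two conditions can be combined with [OR] (a sketch; opera, firebox, foo and bar are just placeholder strings, as above):
RewriteEngine on
# block requests with an empty referer or a blacklisted user agent
RewriteCond %{HTTP_REFERER} ^$ [OR]
RewriteCond %{HTTP_USER_AGENT} opera|firebox|foo|bar [NC]
RewriteRule ^ - [F,L]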