Hi, so I have the below working:
SetEnvIf Referer "^http://sub\.site1\.com/yvvl/Portal/" local_referral
SetEnvIf Referer "^http://sub\.site2\.com/yvvl/Portal/" auth_referral
SetEnvIf Referer "^http://sub\.site3\.com/yvvl/Portal/" authC_referral
Order Deny,Allow
Deny from all
Allow from env=local_referral
Allow from env=auth_referral
Allow from env=authC_referral
What I don't know how to do is wildcard it so that anything from those three domains will be accepted; my regex is not good at all.
Thanks
Just remove everything after the .com:
SetEnvIf Referer "^http://sub\.site1\.com/" local_referral
SetEnvIf Referer "^http://sub\.site2\.com/" auth_referral
SetEnvIf Referer "^http://sub\.site3\.com/" authC_referral
Since there's no fence-post for the end of the Referer (indicated by the `$` character), that will match anything that starts with http://sub.site1.com/ etc.
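For comparison, a sketch using the site1 host from the question shows the difference the trailing anchor makes:

```apache
# Anchored at both ends: matches only this exact Referer
SetEnvIf Referer "^http://sub\.site1\.com/yvvl/Portal/$" local_referral
# No trailing anchor: matches any Referer starting with this host
SetEnvIf Referer "^http://sub\.site1\.com/" local_referral
```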
How do I block User-Agents and IPs at the same time?
Currently I'm using this:
SetEnvIfNoCase User-Agent "Chrome/80" good_ua
SetEnvIfNoCase User-Agent "Chrome/81" good_ua
SetEnvIfNoCase User-Agent "Chrome/82" good_ua
SetEnvIfNoCase User-Agent "Chrome/83" good_ua
order deny,allow
deny from all
allow from env=good_ua
That whitelists those UAs. But when I try adding this code:
deny from 1.1.1.1
deny from 1.0.0.1
only the UA whitelisting works; I cannot make them both work at the same time. I need to block IPs and allow certain UAs.
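One approach that should make both work together (a sketch in the same Apache 2.2 syntax as the question; `Remote_Addr` is the standard SetEnvIf attribute for the client IP, and the IPs are the ones from the question): since `allow from env=good_ua` overrides the denies, clear `good_ua` again for the blocked IPs instead of adding separate `deny from` lines:

```apache
SetEnvIfNoCase User-Agent "Chrome/80" good_ua
SetEnvIfNoCase User-Agent "Chrome/81" good_ua
SetEnvIfNoCase User-Agent "Chrome/82" good_ua
SetEnvIfNoCase User-Agent "Chrome/83" good_ua
# Unset good_ua when the request comes from a blocked IP,
# so the Allow below no longer applies to it
SetEnvIf Remote_Addr "^1\.1\.1\.1$" !good_ua
SetEnvIf Remote_Addr "^1\.0\.0\.1$" !good_ua
order deny,allow
deny from all
allow from env=good_ua
```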
I need to add basic auth for the backend URIs but need to exclude some of them. I tried adding SetEnvIf:
SetEnvIf REQUEST_URI "(login|admin)" PROTECTED
SetEnvIf REQUEST_URI "^/api/*" !PROTECTED
The idea is to protect URIs that contain login or admin but to allow admin/api/*.
But it doesn't work; can you give me a hint?
Your regex is problematic here:
`^/api/*`
This matches /api at the start of the URI, and the `*` only makes the preceding `/` optional or repeatable; it is not a wildcard for the rest of the path.
You can do this in single SetEnvIf rule by using negative lookahead:
SetEnvIf REQUEST_URI "(login|admin(?!/api/))" PROTECTED
(?!/api/) is a negative lookahead: admin only matches when it is not immediately followed by /api/, so /admin/api/ is skipped by this rule.
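To tie the env variable into basic auth, the usual Apache 2.2 pattern is `Satisfy Any`: unprotected URIs pass via the Allow, while PROTECTED ones fall through to the password check. A sketch (the AuthUserFile path is an assumption):

```apache
SetEnvIf Request_URI "(login|admin(?!/api/))" PROTECTED

AuthType Basic
AuthName "Restricted"
# Assumed path; point this at your actual htpasswd file
AuthUserFile /path/to/.htpasswd
Require valid-user

Order Allow,Deny
Allow from all
# Host-based access fails for PROTECTED URIs, forcing the auth check
Deny from env=PROTECTED
Satisfy Any
```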
I have to set a different env variable for different subdomains. For example, domain/subdomain1 should get MAGE_RUN_CODE=mobile_en, but domain/subdomain2 should get MAGE_RUN_CODE=global.
This code works:
SetEnvIf Host .*mydomain.net.* MAGE_RUN_CODE=mobile_en
But this code doesn't work:
SetEnvIf Host .*mydomain.net/ahava-m1-mobile.* MAGE_RUN_CODE=mobile_en
How should I change the second rule to make it work?
As explained in the comments above, the Host keyword matches the HTTP Host header, e.g. example.com. Since your URL contains a path segment, /ahava-m1-mobile, you need to match against the Request_URI variable instead.
SetEnvIf Request_URI ^/ahava-m1-mobile MAGE_RUN_CODE=mobile_en
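Extending that to the second mapping from the question (a sketch; `subdomain2` is a placeholder for the actual path prefix):

```apache
# Each rule keys MAGE_RUN_CODE off the leading path segment
SetEnvIf Request_URI "^/ahava-m1-mobile" MAGE_RUN_CODE=mobile_en
SetEnvIf Request_URI "^/subdomain2" MAGE_RUN_CODE=global
```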
I use Cloudflare for my site, so the usual .htaccess rules won't work. I need to block certain IPs from accessing my website.
I found this one, but it doesn't work:
SetEnvIf X-FORWARDED-FOR 1.1.1.1 deny
order allow,deny
allow from all
deny from env=deny
I also tried this, with the same result:
RewriteEngine On
SetEnvIf X-FORWARDED-FOR 109.100.238.188 deniedip
order allow,deny
deny from env=deniedip
"I use Cloudflare for my site so usual .htaccess rule won't work."
These rules should still work fine as long as you have something set up to restore the visitor's real IP in your logs/server (otherwise Cloudflare's own proxy IPs will show up, which could throw things off a little bit).
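On Apache 2.4+, one way to restore the visitor IP is mod_remoteip (a sketch; the trusted-proxy list below is an illustrative subset of Cloudflare's published ranges, so use their full current list):

```apache
# Take the real client IP from the header Cloudflare adds
RemoteIPHeader CF-Connecting-IP
# Only trust that header when the request comes from Cloudflare itself
RemoteIPTrustedProxy 173.245.48.0/20
RemoteIPTrustedProxy 103.21.244.0/22
```

With the real client IP restored, ordinary `Deny from <ip>` rules work again without matching on X-Forwarded-For.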
In my .htaccess file I have:
<Files ~ "\.(tpl|txt)$">
Order deny,allow
Deny from all
</Files>
This denies any text file from being read, but the Google search engine gives me the following error:
robots.txt Status
http://mysite/robots.txt
18 minutes ago 302 (Moved temporarily)
How can I modify .htaccess to permit Google to read robots.txt while prohibiting everyone else from accessing text files?
Use this:
<Files ~ "\.(tpl|txt)$">
Order deny,allow
Deny from all
SetEnvIfNoCase User-Agent "Googlebot" goodbot
Allow from env=goodbot
</Files>