Blocking a City or Region from website - .htaccess

I know it is easy to block an individual IP address or a whole country from viewing my website via .htaccess, but I need to block just one UK city and have visitors from that city redirected to another external URL.
Here is some code I already have in my .htaccess file, but I have been searching everywhere for how to block just a UK city or region. Where would I find the range of IPs for a specific UK city? Or is there a better way of doing this?
# BAN USER BY IP
<Limit GET POST>
order allow,deny
allow from all
deny from (an individual IP address or range)
</Limit>
# A full URL here makes Apache send the denied visitor a redirect instead of a plain 403 page
ErrorDocument 403 http://www.google.com

You have to use an IP-to-location service such as IP2Location (http://www.ip2location.com/). Once you can look up a visitor's city or region, you can block (or redirect) that city or region from your website.
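For example, a minimal PHP sketch of that approach; lookup_city() here is a hypothetical stand-in for whatever IP2Location database or web-service query you end up using, and the city name and redirect URL are placeholders:

<?php
// Hypothetical helper: resolve an IP address to a city name using your
// chosen IP2Location database or web service.
function lookup_city(string $ip): ?string {
    // ... query IP2Location here ...
    return null; // placeholder
}

// Redirect visitors from the blocked city to an external URL.
if (lookup_city($_SERVER['REMOTE_ADDR']) === 'Manchester') {
    header('Location: http://www.example.com/', true, 302);
    exit;
}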

There are several ways to do this. First, you can use any geoIP API to query the visitor's location; just google "geoIP API" to see what's available. There are online solutions as well as downloadable databases.
You can also "wire in" the banned IP address blocks into your webapp. You can query block information at http://location2ipaddress.com/, for example. If you want to use the .htaccess file for this, all you need is the IP range data: put it into the deny list and you're done.
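For the .htaccess route, a minimal sketch of that deny list; the ranges below are RFC 5737 documentation placeholders for the city's real blocks, and the ErrorDocument URL is wherever you want denied visitors sent:

# Deny the target city's published IP ranges, allow everyone else
order allow,deny
allow from all
deny from 192.0.2.0/24
deny from 198.51.100.0/24
# A full URL makes Apache redirect the denied visitor there
ErrorDocument 403 http://www.example.com/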
Whatever you do, blocking is a delicate topic.
It is easy to get around a block by using proxy servers.
Blocking whole ranges is risky; you may be preventing innocent users from accessing your website. There is no harmless, 100% safe way to do this.

Related

How to dynamically deny access using .htaccess

I am familiar with denying access based on an IP, a block of IPs, a browser, a URL, etc., but my problem here is how to deny access when some IP address hits the site at least 3 times a second for a long period of time. You don't know the IP address in advance unless you look at the access log, and by the time you have found it, it has already drawn too much bandwidth.
You can't do this with .htaccess. The out-of-the-box Apache directives can't track per-client request rates. You'll need to install some sort of log parser, maybe something like fail2ban? Or you can probably cook up a set of iptables rules for blocking lots of connections from one IP: https://askubuntu.com/questions/437059/linux-command-to-prevent-dos-attack-by-using-netstat-and-iptables
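If you go the iptables route, a rough sketch using the recent match module; the 3-requests-per-second threshold mirrors the question and should be tuned and tested before use so you don't lock out legitimate users:

# Track each source IP opening new connections to port 80
iptables -A INPUT -p tcp --dport 80 -m state --state NEW -m recent --set
# Drop sources exceeding 3 new connections within one second
iptables -A INPUT -p tcp --dport 80 -m state --state NEW -m recent --update --seconds 1 --hitcount 4 -j DROP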

How do I include only traffic from a certain area of Germany?

I want to block traffic coming from IP addresses far outside my region in Germany, so that people from cities more than 300 km away cannot access the site (unless they use a proxy).
How do I find the IP-Addresses to include/exclude?
You cannot do this kind of dynamic restriction with .htaccess alone.
If your website is built in PHP, you can use the $_SERVER['REMOTE_ADDR'] or $_SERVER['REMOTE_HOST'] variables (on your site's homepage, for instance) to find out the IP of the visiting client, and then redirect them if they are beyond your particular radius.
To find out the distance between two IP addresses, please see Get the geographic distance between 2 IP addresses?
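A rough PHP sketch of that redirect logic; geolocate_ip() is a hypothetical stand-in for whatever geoIP lookup you choose, and the home coordinates and redirect URL are placeholders (the 300 km threshold comes from the question):

<?php
// Hypothetical helper: return [latitude, longitude] for an IP address
// from whatever geoIP database or API you choose.
function geolocate_ip(string $ip): array {
    // ... query your geoIP source here ...
    return [0.0, 0.0]; // placeholder
}

// Great-circle (haversine) distance in kilometres between two points.
function distance_km(float $lat1, float $lon1, float $lat2, float $lon2): float {
    $r = 6371.0; // mean Earth radius in km
    $dLat = deg2rad($lat2 - $lat1);
    $dLon = deg2rad($lon2 - $lon1);
    $a = sin($dLat / 2) ** 2
       + cos(deg2rad($lat1)) * cos(deg2rad($lat2)) * sin($dLon / 2) ** 2;
    return 2 * $r * asin(sqrt($a));
}

[$lat, $lon] = geolocate_ip($_SERVER['REMOTE_ADDR']);

// Placeholder coordinates for your own location in Germany.
if (distance_km(52.52, 13.405, $lat, $lon) > 300) {
    header('Location: http://www.example.com/', true, 302);
    exit;
}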
To allow only a particular IP, you can use .htaccess as follows (note the order: with order deny,allow the allow directives are evaluated last, so they override the deny from all):
order deny,allow
deny from all
# replace with the IP address you want to allow
allow from 192.0.2.1
Hope this helps.

Will Google be able to access my website after blocking all US IPs?

I'm going to block all US IPs using .htaccess this way:
<Limit GET HEAD POST>
order allow,deny
deny from 3.0.0.0/8
deny from 4.0.0.0/25
deny from 4.0.0.128/26
deny from 4.0.0.192/28
deny from 4.0.0.208/29
....
allow from all
</Limit>
Will Google be able to access and index my website after blocking all US IPs?
EDIT: Sorry for the ambiguity, but I DO want Google to index my website.
Although Google has servers spread across the whole world, it would be quite hard to say where the search engine's bots mostly originate. What I suggest is to block the IP ranges but add an exclusion that matches the User-Agent of search bots, like:
SetEnvIfNoCase User-Agent (googlebot|bingbot|yahoo!\sslurp) is_search_bot
<Directory /docroot>
Order Deny,Allow
Deny from 3.0.0.0/8
Deny from 4.0.0.0/25
Deny from 4.0.0.128/26
Deny from 4.0.0.192/28
Deny from 4.0.0.208/29
Allow from env=is_search_bot
</Directory>
I don't think so, but if you really don't want Google to index it, then use a robots.txt file so it doesn't index it. The robots.txt would be:
User-agent: googlebot
Disallow: /directory/
If it's just a matter of blocking US IPs and that's it, then you're probably good: Google has data centers in many different locations, not just the United States, so it will most likely still index your site.
Although Google has many data centers, all of its bots crawl from the US, so no: Google will not be able to scan your website if you block US IPs.
If you can't access your domain's root directory, just use this meta tag to block Googlebot from indexing specific page(s):
<meta name="googlebot" content="noindex">
If your site has already been indexed by Google's crawler, follow the guide "Remove your own content from Google search results".
Go to https://www.google.com/webmasters/
All the information you need is there.
Here Google explains how you can block Googlebot from indexing your site:
https://support.google.com/webmasters/answer/93708
As for your question, I think that if you block all US IP addresses, Google's crawlers in other countries should still be able to access and index your site, and then sync with Google US.

Use htaccess mod_rewrite to hide domain name

I've read lots about what can be done with mod_rewrite, but I haven't found anything to solve my problem. Maybe it can't be done?
I have a sub-domain on my primary domain that a customer directs users to in order to use one of my programs. The customer doesn't want his customers to see that they are on my domain, and he doesn't want to use an iframe.
So, is it possible for the user to only see www.subdomain/program.php instead of www.subdomain.mydomain.com/program.php?
If you want the browser to show www.subdomain/program.php in its location bar, you need to register the www.subdomain domain name. There is no way to remove bits of the domain name using anything in the .htaccess file. For example, if you've registered example.com and you have a server at foo.example.com, and you want to be able to go to http://foo/some/path/index.html, you're out of luck: the browser is going to attempt a DNS lookup of foo, and it will most likely fail unless there happens to be a "foo" server under the DNS search domain. Browsers put a great deal of effort into preventing spoofing of the domain name, since it would be really bad if I were able to make my website show a bank's domain in the browser's location bar while actually serving an entirely different site.

.htaccess deny all by IP address except those in the United States?

I have a local website that I would like to restrict to visitors within the United States, or perhaps only within Florida. It's a WordPress site that has gotten hacked due to some weak code. I've seen two sources of IP address lists for .htaccess allow/deny control by IP address.
IP by Country/Continents:
http://www.countryipblocks.net/continents/
Wizcrafts List:
http://www.wizcrafts.net/htaccess-blocklists.html
What is the best approach for blocking everything except United States traffic? How would you approach the deny/allow? Would you deny other countries or try to allow only the U.S.?
Thanks for any comments, Jeff
Add this list to the .htaccess file located in the root folder of your server. It will only allow connections from the US.
Example .htaccess file:
order deny,allow
deny from all
allow from 203.31.234.0/24
allow from 129.230.176.0/20
etc...
The deny from all forbids access to your site for everyone else; on countryipblocks you can download all the IP ranges for the area you want and add an allow from line for each, so only those IPs can access your site.
Edit: Remember you can add an IP range instead of a single IP. I downloaded the .htaccess list from that site, and it worked fine.
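Alternatively, if your host offers the legacy mod_geoip Apache module, an allow-list by country code avoids maintaining thousands of raw ranges. A sketch, assuming the module and its country database are installed:

<IfModule mod_geoip.c>
GeoIPEnable On
# Tag requests whose GeoIP country code is US, then allow only those
SetEnvIf GEOIP_COUNTRY_CODE US allow_country
order deny,allow
deny from all
allow from env=allow_country
</IfModule>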