Normally, we restrict access to our website from certain countries at the edge of our network. We are about to deploy our application to Azure and need to block certain countries from accessing it. This is due to U.S. embargo policies... What is the best way to block certain countries so we meet U.S. laws and regulations?
Thanks,
Mike
One idea is to build a simple proxy that attempts to filter on region. My teammate Ricardo Villalobos co-wrote an MSDN article explaining how someone might do that with Node.js on Azure, filtering against the MaxMind geolocation database. It's probably not an exact fit, as it deals with media stream filtering, but you should be able to use the basic technique and proxy described as a starting point.
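For a rough idea of what such a proxy could look like, here is a minimal sketch (not the article's code) using the maxmind and http-proxy npm packages. The database path, backend URL, blocked-country list, and header handling are placeholder assumptions you would need to adapt:

```typescript
// Minimal country-blocking proxy sketch. The .mmdb path, backend URL and
// blocked-country list are placeholders, not values from the article.
import * as http from 'http';
import * as httpProxy from 'http-proxy';
import { open, CountryResponse } from 'maxmind';

const BLOCKED = new Set(['CU', 'IR', 'KP', 'SD', 'SY']); // example ISO codes only
const BACKEND = 'http://localhost:8080';                 // your real application

async function main() {
  // GeoLite2/GeoIP2 country database downloaded from MaxMind.
  const lookup = await open<CountryResponse>('./GeoLite2-Country.mmdb');
  const proxy = httpProxy.createProxyServer({ target: BACKEND });

  http.createServer((req, res) => {
    // Behind a load balancer the client IP usually arrives in X-Forwarded-For;
    // fall back to the socket address otherwise.
    const forwarded = (req.headers['x-forwarded-for'] as string | undefined)
      ?.split(',')[0]
      .trim();
    const ip = forwarded || req.socket.remoteAddress || '';

    let country: string | undefined;
    try {
      country = lookup.get(ip)?.country?.iso_code;
    } catch {
      country = undefined; // unparsable address: fail open here, adjust to your policy
    }

    if (country && BLOCKED.has(country)) {
      res.writeHead(403, { 'Content-Type': 'text/plain' });
      res.end('Access from your region is not permitted.');
      return;
    }
    proxy.web(req, res); // pass everything else through to the app
  }).listen(80);
}

main().catch(err => { console.error(err); process.exit(1); });
```

Keep in mind the accuracy caveats discussed further down: IP geolocation is approximate, so treat this as a compliance control at the edge, not a guarantee.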
Related
I have one G Suite account with three domains (and a lot of email).
Now I would like to move one of the domains to a separate, fresh G Suite account, of course with its historical email.
Does anybody know how to do this? Is it possible without the G Suite support team?
There are different ways of doing this and yes, you can do it yourself. Please consider that it can be tricky depending on the amount of data you want to move.
The main steps are:
Create a new account with a temporary domain (the same domain cannot be in two consoles at the same time)
Migrate all the contents from your current account to the new one. You have different options for doing this. The cheapest one is the Data Migration Service (DMS), which only allows you to migrate email, using the IMAP protocol (so you need to know the users' passwords). Google support for the DMS is best effort, so if you have time and budget I recommend using a commercial tool (my tool of choice here is Cloud Migrator) that is also able to migrate calendar items and Google Drive files using the Google APIs (so it is transparent for the users).
On a cut-off date that you agree with your users, remove the domain from the original console, add it to the new one, and perform a mass rename (my tool of choice here is GAM); a sketch of the rename step follows this list.
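The answer's tool of choice for the mass rename is GAM. As a rough illustration of what that step boils down to, here is a hedged sketch using the Admin SDK Directory API (which GAM wraps) from Node/TypeScript; the domains, user list, and credential setup are placeholders, not anything from the original answer:

```typescript
// Sketch of a mass rename: change each user's primaryEmail to the new domain.
// Assumes credentials for a super-admin or a domain-wide delegated service
// account with the admin.directory.user scope.
import { google } from 'googleapis';

async function renameUsers(users: string[], oldDomain: string, newDomain: string) {
  const auth = new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/admin.directory.user'],
  });
  const admin = google.admin({ version: 'directory_v1', auth });

  for (const user of users) {
    const newEmail = user.replace(`@${oldDomain}`, `@${newDomain}`);
    // Changing primaryEmail renames the account; Google keeps the old
    // address as an alias as long as the old domain is still in the console.
    await admin.users.update({
      userKey: user,
      requestBody: { primaryEmail: newEmail },
    });
    console.log(`${user} -> ${newEmail}`);
  }
}

// Placeholder users and domains for illustration only.
renameUsers(
  ['alice@old-domain.example', 'bob@old-domain.example'],
  'old-domain.example',
  'new-domain.example'
).catch(err => { console.error(err); process.exit(1); });
```

GAM does essentially the same API calls in bulk from a CSV, which is why it is the usual choice for large domains.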
There are many variables that can make the process much more complex and that are difficult to describe in a single answer; this kind of activity usually requires a dedicated project and a (small) team, so I really suggest you get some help.
A word of warning:
I did this on Bluehost (transferring G Suite ownership of a domain from one Bluehost account to another) through Bluehost customer support.
It took 3 separate calls to Bluehost to get everything fully moved over. Make sure that your hosting provider sends you an email confirmation that everything happened successfully, because as a reseller of a Google product, they may not have complete authority over transferring ownership of Google's product.
Best regards.
Also, with regard to the domain itself, go to Google Domains and you can see all the options you have for your domain, website, etc. It is very self-explanatory.
wc.
I have searched the web and been unable to find a solution. I have an Umbraco site hosted with IIS on a Windows server. Any ideas on an approach to block users accessing the site from outside the UK? An .htaccess approach would be too slow... thank you in advance!
That's quite hard to do accurately, as you could have someone based in the UK using a European network provider, which means that they might appear to come from, say, Holland instead of the UK. It's also possible for people to spoof their location fairly easily if they really want to get at your site.
As Lex Li mentions there are plenty of commercial databases and tools for looking up a user's location, but the accuracy of these varies considerably, not to mention the fact that some of them only support IPv4. Any of these options are going to be slow though, as you'll have to check on every request. You also have to make sure you keep the databases up to date.
Another option would be to proxy your site through something like CloudFront or CloudFlare which both support blocking traffic by country.
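If you go the CloudFront route, country blocking is a property of the distribution (its "geo restriction"). As a rough sketch only, assuming the AWS SDK for JavaScript v3 and a placeholder distribution ID, restricting a distribution to UK traffic might look like this (the same fields can be set in the console):

```typescript
// Sketch: whitelist only GB traffic on a CloudFront distribution.
import {
  CloudFrontClient,
  GetDistributionConfigCommand,
  UpdateDistributionCommand,
} from '@aws-sdk/client-cloudfront';

async function allowOnlyUK(distributionId: string) {
  const client = new CloudFrontClient({});

  // UpdateDistribution requires the full current config plus its ETag.
  const { DistributionConfig, ETag } = await client.send(
    new GetDistributionConfigCommand({ Id: distributionId })
  );
  if (!DistributionConfig || !ETag) throw new Error('Could not load distribution config');

  DistributionConfig.Restrictions = {
    GeoRestriction: {
      RestrictionType: 'whitelist', // only the listed countries may connect
      Quantity: 1,
      Items: ['GB'],
    },
  };

  await client.send(new UpdateDistributionCommand({
    Id: distributionId,
    IfMatch: ETag,
    DistributionConfig,
  }));
}

allowOnlyUK('EDFDVBD6EXAMPLE') // placeholder distribution ID
  .catch(err => { console.error(err); process.exit(1); });
```

This keeps the lookup cost off your IIS box entirely, since the CDN edge makes the country decision before the request ever reaches your origin.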
We have a website which provides services for people based in a particular city.
We want to scale and offer it in more cities, but we want to keep the IT separated per city: one web host, cloud service, database, etc. per location. This not only lets us scale each city individually (some cities are several times bigger than others), but most importantly it keeps our code base and DB queries free of per-city predicates, even though it is more expensive in general.
At the same time, we do not want to use subdomains. The user can switch city through a dropdown, and the request should go to the appropriate VM without the URL changing, so the routing should work seamlessly.
Based on the Azure documentation we are still not sure which solution would meet our needs: Traffic Manager, Load Balancer, or custom redirects.
How you accomplish this is ultimately up to you, but from an Azure-specific perspective, the only multi-region built-in load-balancing service is Traffic Manager. This operates in one of three routing modes:
Primary/failover
Round-robin
Closest (based on latency, not physical distance)
For any other type of routing (such as letting the user choose location, per your question), you'd need to implement this on your own or via a 3rd-party service (and how to accomplish that would be a matter of opinion/debate/discussion, which is off-topic for StackOverflow).
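To make the "implement this on your own" option concrete, here is a minimal sketch of a thin reverse proxy that keeps the public URL unchanged and forwards each request to a per-city deployment, based on a cookie the city dropdown would set. The cookie name, backend hostnames, and the http-proxy package choice are my assumptions, not anything prescribed by Azure:

```typescript
// Sketch: route requests to a per-city backend chosen from a "city" cookie,
// without changing the public URL. Backend hostnames are placeholders.
import * as http from 'http';
import * as httpProxy from 'http-proxy';

const CITY_BACKENDS: Record<string, string> = {
  london: 'http://london-app.internal.example:8080',
  manchester: 'http://manchester-app.internal.example:8080',
};
const DEFAULT_BACKEND = CITY_BACKENDS.london;

const proxy = httpProxy.createProxyServer({});

// Very small cookie parser, just enough for the sketch.
function cityFromCookie(cookieHeader: string | undefined): string | undefined {
  return cookieHeader
    ?.split(';')
    .map(part => part.trim().split('='))
    .find(([name]) => name === 'city')?.[1];
}

http.createServer((req, res) => {
  const city = cityFromCookie(req.headers.cookie);
  const target = (city && CITY_BACKENDS[city]) || DEFAULT_BACKEND;
  proxy.web(req, res, { target }, err => {
    res.writeHead(502);
    res.end('Upstream unavailable: ' + err.message);
  });
}).listen(80);
```

Note the trade-off: the proxy itself still has to live somewhere (a single region, or duplicated behind Traffic Manager), so it reintroduces a shared hop in front of your otherwise independent city deployments.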
Since you're looking to have a separate DB, cloud role and web host per city, I do not see how you can get away from using subdomains.
Do you not want subdomains because of SEO? If so, it'd be easier to find another way to solve the SEO problem.
But whatever Traffic Manager or other DNS-based routing solution you use, it'll be splitting users by where they come FROM and not where they're going TO.
The destination problem is solved through separate subdomains.
I am sure many of you have found fake referral traffic in your Google Analytics reports/views. This makes it difficult for low- to medium-traffic sites to have accurate data for marketing. I am wondering what others are doing to exclude this traffic from their analytics reports.
If you go to your analytics account and go to Acquisition -> All Traffic -> Referrals, you will see sites like floating-share-buttons.com. These are the sites I want to filter out, which you can do by setting up a custom filter for the view as described at the bottom of this page. I have done this and it works.
I would rather block these bots from hitting the site altogether. Just a note: my sites are running as web apps in Azure.
I am not sure if setting up the URL rewrite rules described here will work in Azure web apps, or if they will mess with the existing URL rewrite functions of the content management system I am using (DotNetNuke/DNN Platform 7).
I am really just looking to hear what others have done to block bots, rather than setting up filters in the analytics view's settings.
Thanks
PS
For those who are interested, this is the current filter list I am using:
webmonetizer\.net|trafficmonetizer\.org|success-seo\.com|event-tracking\.com|Get-Free-Traffic-Now\.com|buttons-for-website\.com|4webmasters\.org|floating-share-buttons\.com|free-social-buttons\.com|e-buyeasy\.com
With regard to this issue, there are a number of things that you can do. You are going the route that I see most commonly used, and that is to block the traffic using the filters in Google Analytics.
You can go the route of an IIS filter as well, just like you have linked. DNN's friendly URLs will not necessarily be impacted by this, as the IIS rewrite rules are processed BEFORE DNN gets the request. There is a marginal performance impact from having two things process rewrites, but nothing to be concerned about until incredibly high user volume.
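For what it's worth, the check an IIS rewrite rule would perform is just a regular-expression match on the Referer header. The sketch below is not the IIS rule itself (and not DNN-specific); it only illustrates that matching logic in a small Node/TypeScript handler, using the filter list from the question:

```typescript
// Illustration only: block requests whose Referer matches the spam list.
// An IIS rewrite rule would apply the same pattern to {HTTP_REFERER}.
import * as http from 'http';

const SPAM_REFERRERS = new RegExp(
  'webmonetizer\\.net|trafficmonetizer\\.org|success-seo\\.com|event-tracking\\.com|' +
  'Get-Free-Traffic-Now\\.com|buttons-for-website\\.com|4webmasters\\.org|' +
  'floating-share-buttons\\.com|free-social-buttons\\.com|e-buyeasy\\.com',
  'i'
);

http.createServer((req, res) => {
  const referrer = req.headers.referer ?? '';
  if (SPAM_REFERRERS.test(referrer)) {
    // A rewrite rule would typically answer 403 or abort the request here.
    res.writeHead(403);
    res.end();
    return;
  }
  res.writeHead(200);
  res.end('normal request handling goes here');
}).listen(8080);
```

As the next answer explains, this only ever catches crawler spam; ghost spam never sends a request to your server in the first place.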
This is also a great collection of options.
First, you need to know that there are mainly two types of spam affecting GA right now: ghosts and crawlers.
The first (ghosts) never interact with your page, so any server-side solution like HTTP rules or an .htaccess file won't have any effect and will only fill your config files with rules that never get used.
The crawlers, as the name implies, do access your website and can be blocked this way, but there are only a few of them compared with the ghosts. To give you an idea, there are around 8 active crawlers, while there are more than 100 ghosts, and the number increases each week.
This is because the ghost method is easier to implement for the spammers.
From your expression, only success-seo is a crawler; the rest should be filtered as ghosts. Now there is a better way to get rid of all the ghosts with just one filter based on your valid hostnames, instead of creating or updating one every week.
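As a small illustration of the valid-hostname approach (configured in GA as an include filter on the Hostname field), here is a sketch with made-up hostnames; the pattern would need to list every hostname that legitimately serves your tracking code (your site, payment pages, a translate proxy, and so on):

```typescript
// Illustration of a hostname include filter: real hits carry one of your
// valid hostnames, ghost hits carry a fake or empty one. Hostnames below
// are placeholders.
const VALID_HOSTNAMES = /^(www\.)?yoursite\.com$|^checkout\.payment-provider\.example$/i;

const sampleHits = [
  'www.yoursite.com',           // real visit: kept
  'yoursite.com',               // real visit: kept
  'floating-share-buttons.com', // ghost hit with a fake hostname: dropped
  '(not set)',                  // ghost hit with no hostname: dropped
];

for (const hostname of sampleHits) {
  console.log(hostname, VALID_HOSTNAMES.test(hostname) ? 'keep' : 'filter out');
}
```

One include filter like this stays valid no matter how many new ghost referrer domains appear, which is why it beats maintaining an ever-growing exclude list.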
You can find more information about ghost spam and the solution here:
https://stackoverflow.com/a/28354319/3197362
https://moz.com/ugc/stop-ghost-spam-in-google-analytics-with-one-filter
Hope it helps.
I have a blog site, a WP 3.0 install. I've dropped Google Analytics' tracker code into the footer (a recommended technique, I believe). I also have two different types of web statistics software available on the virtual server, through the hosting company. However, the web statistics vary greatly. Why such variation?
Statistics --
http://pastebin.com/Nc10iGaA
Thanks a million!
Google removes bot hits from its traffic. You might be seeing Google bots in the other hit counts.
I would guess that the different software packages count visits differently. You should look at the documentation to figure out what qualifies as a "visit," which may exclude/include crawlers, certain user agents, certain access patterns, etc.