How do I get a short URL for my website?

These days I've seen a lot of websites using short URLs for their entire site instead of just as single-page references.
Examples:
http://bo.lt
http://ge.tt
Just wondering how to get this type of custom URL for my website. Can anyone let me know how this works?

Two-letter top-level domains such as .lt and .tt belong to individual countries: .lt is Lithuania and .tt is Trinidad and Tobago. To get a domain under one of these country-code top-level domains, just google for a domain name registrar for that country and register it just as you would a .com, .net, .org, etc.

Related

Same website on same domain name with different extensions - i.e. .com and .co.uk

What is best practice for doing this? Should I have duplicate content at each domain, or should I redirect from one to the other, i.e. all traffic to the .co.uk domain redirected to the .com domain?
Best practice is to send them all to one web server.
By default the server will not care which domain is pointed at it, and will show the home page as domainx.com if you go to it from domainx.com.
However there are two possible issues with this that come to mind:
The person who created the website hopefully used only relative links (the Contact Us button points to contactus.htm instead of http://domainx.com/contactus.htm). If not, some links might move the user from domainx.co.uk to domainx.com.
Search Engine Optimisation: it's better, SEO-wise, if all the links to your site point to one domain name rather than being split across several less popular sites.
You can get everyone on the same site by using a RewriteRule or 301 redirect to the primary site. Or you can make every hyperlink on the site absolute and point it to the primary domain.
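A minimal sketch of that 301 redirect, written in PHP rather than as the answer's .htaccess RewriteRule (an assumption about the stack), with domainx.com standing in as the primary domain per the question's placeholder names:

    <?php
    // Send every alternate domain (domainx.co.uk, etc.) to the
    // primary domain with a permanent (301) redirect, preserving
    // the requested path and query string.
    $primary = 'domainx.com';
    $host = strtolower($_SERVER['HTTP_HOST'] ?? '');

    if ($host !== $primary && $host !== 'www.' . $primary) {
        header('Location: https://' . $primary . ($_SERVER['REQUEST_URI'] ?? '/'), true, 301);
        exit;
    }

Because the redirect is permanent, search engines consolidate the alternate domains' link value onto the primary one.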

Strange Google Results after Website Server Change

I have two eCommerce websites, for example:
1. www.abc.com
2. www.abc.co.uk
Previously, both sites were on a US server. We moved them to an Indian server and redesigned both websites. At the start we didn't redirect all the URLs to the new website; after two days, the redirection of the old URLs to the new URLs was complete for www.abc.com.
Now the problem is that when I search for my domain name on Google.co.uk, it shows the www.abc.co.uk website in the search results but WITH the META TITLE and DESCRIPTION of the www.abc.com website. How is that possible? Does Google consider both websites to be the same? How can one website's title appear on another website? Can changing servers or bad URL redirection create such issues? Also, I have restricted the UK website to the UK region only, and I redirect the .com website to .co.uk if anyone opens it from the UK region, and vice versa.
My .co.uk site was ranking for every related keyword, but now its rankings have been disturbed badly.
Please share your answers.
Changing servers is surely not the issue. Bad redirects are a possible cause, and so are wrong canonical links. The redesign could explain the ranking disturbance, and the redirects can explain a temporary (not permanent) ranking disturbance.
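To illustrate the wrong-canonical-links point: after a redesign it is easy for both sites to emit the .com URL as canonical, which would produce exactly this symptom. A minimal sketch of a self-referencing canonical tag, assuming the pages are generated by PHP (abc.com and abc.co.uk are the question's placeholder names):

    <?php
    // Each domain should declare ITS OWN URL as canonical. If the
    // .co.uk templates emit the .com URL here, Google may fold the
    // two sites together and show the .com title on .co.uk results.
    $host = $_SERVER['HTTP_HOST'] ?? 'www.abc.co.uk';
    $path = strtok($_SERVER['REQUEST_URI'] ?? '/', '?'); // drop the query string
    echo '<link rel="canonical" href="https://' . htmlspecialchars($host . $path) . '">';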
Failing that, report this in Google's Webmaster Central Help Forum.

Blocking USA traffic without blocking search engine bots?

Dear friends, I need some advice from you.
I have a website for which I don't want any traffic from the USA (the website contains only local content). Since most of my visitors come from search engines, I don't want to block the search engine bots.
I know how to:
block IP addresses from .htaccess;
redirect users based on their geolocation.
I think that if I block US IPs, my website won't be indexed by Google or Yahoo. So even though I don't want any US traffic, I still need my web pages to be indexed in Google and Yahoo.
Depending on $_SERVER['HTTP_USER_AGENT'], I could allow bots to crawl my web pages.
One of my friends told me that if I block US visitors except for bots, Google will blacklist my website for disallowing US visitors access to pages Google has indexed.
Is this true? If so, what should I do about this problem? Any advice is greatly appreciated. Thanks.
Use a JS redirect for US users. This will allow most search engine bots to visit your website.
Use robots.txt to tell Google where and what to read:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449
There is a way to add Googlebot's IP addresses (or just its name: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=80553) as an exception.
Use geotargeting and block the pages with a JS div, or just add a banner that tells your users they can't use the website from their location.
Hope this helps,
Cheers,
I'm only answering this here because someone was smart enough to spam Google's "Webmaster Central Help Forum" with a link drop. I know this answer is more of a comment, but I blame the question: had it been asked on Webmasters SE, this answer would be more on-topic.
1. Why block US visitors? I understand there can be some legal reasons (e.g. gambling). But you could just disable those features for US visitors and let them see the content along with a banner explaining that the site is not available for US visitors. Search engines won't have any issue with that (they're incapable of gambling or purchasing stuff anyway), and there's no cloaking either.
2. Victor's answer contains a few issues, IMO:
Use a JS redirect for US users. This will allow most search engine bots to visit your website.
This was probably correct at the time of writing, but these days Google (and probably some other search engines as well) is capable of running the JavaScript and will therefore also follow the redirect.
Use robots.txt to tell Google where and what to read.
I'd suggest using the robots meta tag or X-Robots-Tag header instead, or responding with a 451 status code to all US visitors.
There is a way to add Googlebot's IP addresses as an exception.
Cloaking.
Use geotargeting and block the pages with a JS div, or just add a banner that tells your users they can't use the website from their location.
Totally agree, do this.
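A minimal PHP sketch of that banner approach, with the 451 response from point 2 shown as the stricter alternative. The visitor_is_from_us() helper is hypothetical, an assumption standing in for whatever GeoIP lookup you already use:

    <?php
    // Hypothetical geolocation helper -- an assumption, not a real
    // API. Back it with a GeoIP database lookup of your choice.
    function visitor_is_from_us(string $ip): bool {
        return false; // placeholder
    }

    if (visitor_is_from_us($_SERVER['REMOTE_ADDR'])) {
        // Option A (recommended above): keep serving the content and
        // just show a banner explaining the restriction. Bots and
        // humans see the same page, so there is no cloaking.
        echo '<div class="geo-banner">This service is not available in your location.</div>';

        // Option B: refuse outright with the status code reserved for
        // legally restricted content -- again the same for everyone.
        // http_response_code(451);
        // exit('Unavailable for legal reasons.');
    }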

Geotargeting using TLDs

I wanted to ask what the best strategy is for this situation:
We have a site, example.de, and we are launching a dedicated version of it for the Austrian market. Since both the .de and .at sites use the German language, for the first few months we are going to show the same content on both sites (both domains point to the same servers, which choose what to show dynamically). Will this penalize our rankings because of duplication, and how can we tell Google that the .at site is a copy of the .de site?
In a month or two, .at users will start to see exclusive content for their region (though some parts of the site will stay the same).
Since we are not trying to cheat or anything like that, how can we ensure Google doesn't falsely penalize us?
Thanks
If you are going to show the same content on both domains, then I would suggest redirecting one domain to the other; otherwise it will be considered duplicate content and eventually you will lose something.
You can use a 302 redirect, which does a temporary redirection, or
you can use a 301 redirect for a permanent redirection, which tells Google that the content has moved permanently to the other domain.
But if you are planning to put different content on the two domains after a few months, then I would say go with the 302 redirect.
And if you are putting the same content on example.de and example.at, that will not cause any serious problems, because you have different TLDs with the same second-level domain.
But if you put the same content on example.com and something.com, then there will be a serious duplicate content problem.
A much better approach, though, would be to use something like example.com/us, example.com/au, example.com/uk, etc. for countries, or example.com/en, example.com/fr, etc. for languages.
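A minimal PHP sketch of the 302 option recommended above, assuming both domains are served by the same PHP entry point (example.de and example.at are the question's own names):

    <?php
    // While the .at site still mirrors the .de content, send .at
    // visitors to the .de site with a temporary (302) redirect.
    // Switch the status code to 301 only once the move is permanent.
    $host = strtolower($_SERVER['HTTP_HOST'] ?? '');

    if ($host === 'example.at' || $host === 'www.example.at') {
        header('Location: https://example.de' . ($_SERVER['REQUEST_URI'] ?? '/'), true, 302);
        exit;
    }

Once the exclusive Austrian content launches, drop the redirect and serve the .at site directly.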

Is there a way I can find the canonical domain name for a list of websites?

I'm working on a page-tracking web app and I'd like to get the canonical domain for a list of sites. As far as I know, there is no good way of telling where a site's ownership of subdomains and top-level domains starts and ends. I'm not sure of the best way to describe that, so here is an example:
If I own a personal domain, mysite.com, I am able to set up subdomains such as www.mysite.com, cdn.mysite.com, and so forth.
If my "group" has a website at a university, such as computerscience.myuni.edu, I might also have control over www.computerscience.myuni.edu, but not myuni.edu.
If I am a huge business and need to spread web traffic out, I might even have www.acme.com, ww2.acme.com, ww3.acme.com, etc.
So nothing is certain, but given a URL I can probably strip off www., ww2., and cdn., and maybe secure., from the front. Are there any other "subdomains" I'm not thinking of that are fairly common and generally not used to serve up a different website?
I guess I'm just trying to figure out the best way to get the real "canonical" domain name for a site.
First of all, you should make the distinction between Domain Names and Websites/URLs.
I don't think there is any efficient way to easily identify a website's owner, but as for the domain name, its parts can be deduced from its structure.
Roughly, a Fully Qualified Domain Name is composed of the subdomain(s), the name, and the suffix; in your case, you are looking for the canonical domain name (name + suffix).
Since the Domain Name System is hierarchical, an FQDN like www.example.com. should be read from end to beginning, .com.example.www, and can be decomposed this way:
Suffix: com
Name: example
Subdomain(s): www
For your identification, you should proceed in the same order:
Suffix: Find the suffix under which the name has been registered (.com, .net, .co.uk, .com.es)
Name: Identify the label immediately after the suffix
Subdomain(s): strip the rest of the string.
There is no official database listing all the public suffixes; however, at the initiative of the Mozilla Foundation, an unofficial one has been created. The project is named Public Suffix, and its aim is to record the suffixes under which people can register domain names; several implementations exist to parse its database.
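A minimal PHP sketch of the suffix-first procedure above, assuming the list has been downloaded from publicsuffix.org as public_suffix_list.dat (this naive version ignores the list's wildcard and exception rules):

    <?php
    // Naive registrable-domain extraction against the Public Suffix
    // List. Tries the longest candidate suffix first, then prepends
    // one more label to obtain the canonical (name + suffix) domain.
    function canonicalDomain(string $host, string $listPath): ?string {
        $suffixes = [];
        foreach (file($listPath, FILE_IGNORE_NEW_LINES) as $line) {
            $line = trim($line);
            if ($line === '' || str_starts_with($line, '//')) {
                continue; // skip comments and blank lines
            }
            $suffixes[$line] = true;
        }

        $labels = explode('.', strtolower(rtrim($host, '.')));
        // For www.example.co.uk: test "www.example.co.uk", then
        // "example.co.uk", then "co.uk" (a match), then stop.
        for ($i = 0; $i < count($labels); $i++) {
            $candidate = implode('.', array_slice($labels, $i));
            if (isset($suffixes[$candidate])) {
                // Name + suffix: one label more than the public suffix.
                return $i > 0 ? $labels[$i - 1] . '.' . $candidate : null;
            }
        }
        return null; // no known suffix found
    }

    // canonicalDomain('www.example.co.uk', 'public_suffix_list.dat')
    // returns "example.co.uk"; whatever was stripped ("www") is the
    // subdomain part.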
If you are interested, I wrote an article on my personal blog introducing the domain name system, where I describe the domain name structure in more detail: What's a domain name and what's behind the scene
