googlebot
msnbot
yahoo
bingbot
googlebot-image
Are these all the big search engine spiders, or does anyone know of more well-known ones? And are the spider names I wrote above still current?
Thanks!
Here is a list of all sorts of user agents (bots, spiders, and others): User agents. You can use the exact string names to verify them.
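A rough PHP sketch of checking the current request against strings like the ones listed above (the token list is illustrative and not exhaustive, and user-agent strings can be spoofed, so treat this only as a first-pass check):

    <?php
    // Illustrative token list based on the spiders named above; not exhaustive.
    $botTokens = array('googlebot', 'googlebot-image', 'bingbot', 'msnbot', 'yahoo');

    // Return true if the user-agent string contains any of the known crawler tokens.
    function isKnownBot($userAgent, array $tokens)
    {
        $userAgent = strtolower($userAgent);
        foreach ($tokens as $token) {
            if (strpos($userAgent, $token) !== false) {
                return true;
            }
        }
        return false;
    }

    $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    if (isKnownBot($ua, $botTokens)) {
        // Treat the request as coming from a search engine crawler.
    }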
I'm working with a client to improve their site search results through the Magento search functionality. We have set up redirects for the top searched terms. My question is: how can I track conversions/revenue for these terms now that I no longer have search query parameters in the URL?
The client wants to be able to see the effect these search changes have on conversion rate/revenue, but I can't figure out how to set this up in GA, and Magento doesn't seem to have a report that provides this data. Any help is appreciated.
Magento 1.7.0.2
You can try two different approaches:
Either get the search term from the URL rewrite table using your current URL,
or use session variables: set the search terms in the session and read them back where required (see the sketch below).
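A minimal sketch of the session approach in Magento 1.x. The key name is illustrative, and $queryText stands for however you capture the query before issuing the redirect:

    <?php
    // Before issuing the redirect: store the original search query in the session.
    // 'original_search_term' is just an illustrative key name.
    Mage::getSingleton('core/session')->setData('original_search_term', $queryText);

    // On the landing (redirected) page: read it back, e.g. to hand it to your
    // Google Analytics snippet as a virtual pageview or custom variable.
    $searchTerm = Mage::getSingleton('core/session')->getData('original_search_term');
    if ($searchTerm) {
        // Pass $searchTerm to your tracking code here.
    }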
Hope this will help you!
I am going to register a 24-character-long domain name. Are long domain names considered spam by Google, and are they bad for SEO?
Long domain names are not considered spam by Google and other search engines, nor are they bad for SEO in themselves. A long domain could hurt if it is completely irrelevant to your site's content, though: you wouldn't want a corporate site with a domain name of whycatslovecatnipandpurr.com, so stick to using your primary keyword(s) close to the beginning of the domain name. At the end of the day, search engines are merely interested in your site's content and the quality thereof. There have been rumours that Google dislikes long domains and domains with multiple hyphens, but this is all speculation, and there is no documentation authored by Google that confirms it.
So, to sum it up: No, long domain names are not considered spam or bad for SEO.
No, unless you wish to use the domain for spamming :)
Dear friends, I need some big advice from you.
I have a website for which I don't want any traffic from the USA (the website contains only local content). Since most of the visitors come to my website from search engines, I don't want to block the search engine bots.
I know how to:
block IP addresses with .htaccess, and
redirect users based on geolocation.
I think that if I block USA IPs, my website won't be indexed in Google or Yahoo. So even though I don't want any USA traffic, I still need my web pages to be indexed in Google and Yahoo.
Depending on $_SERVER['HTTP_USER_AGENT'], I can allow bots to crawl my web pages.
One of my friends told me that if I block USA visitors except for bots, Google will blacklist my website for denying USA visitors access to pages that Google has indexed.
Is this true? If so, what should I do about this problem? Any advice is greatly appreciated. Thanks!
Use a JS redirect for US users. This will allow most of the search engine bots to visit your website.
Use robots.txt to tell Google where and what to read:
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449
There is a way to add Googlebot's IP addresses (or just the name: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=80553) as an exception.
Use geotargeting and block the pages with a JS div, or just add a banner that tells your users they can't use the website from their location.
Hope this helps,
cheers,
I'm only answering this here because someone was smart enough to spam Google's "Webmaster Central Help Forum" with a link drop. I know this answer is more of a comment, but I blame the question: had it been asked on Webmasters SE, this answer would be more on-topic.
1. Why block US visitors? I understand there can be some legal reasons (e.g. gambling). But you could just disable those features for US visitors and allow them to see the content plus a banner that explains that the site is not available for US visitors. Search engines won't have any issue with that (they're incapable of gambling or purchasing stuff anyway), and there's no cloaking either.
2. Victor's answer contains a few issues IMO:
Use a JS redirect for US users. This will allow most of the search engine bots to visit your website.
This was probably correct at the time of writing, but these days Google (and probably some other search engines as well) is capable of running the JavaScript and will therefore also follow the redirect.
Use robots.txt to tell Google where and what to read.
I'd suggest using the robots meta tag or an X-Robots-Tag header instead, or responding with a 451 status code to US visitors.
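For instance, either of these can be sent from PHP before any output (a minimal sketch; pick one or the other):

    <?php
    // (a) Serve the page but keep it out of the index:
    header('X-Robots-Tag: noindex, nofollow');

    // (b) Or refuse US visitors outright with a 451 status:
    // header('HTTP/1.1 451 Unavailable For Legal Reasons');
    // exit;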
There is a way to add Googlebot's IP addresses as an exception.
Cloaking.
Use geotargeting and block the pages with a JS div, or just add a banner that tells your users they can't use the website from their location.
Totally agree, do this.
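A rough PHP sketch of that geotargeting-plus-banner approach. Here lookupCountry() is a placeholder for whatever GeoIP lookup you use (the PECL geoip extension, a MaxMind database, etc.); it is not a built-in function:

    <?php
    // Placeholder: resolve an IP to a two-letter ISO country code with your GeoIP library.
    function lookupCountry($ip)
    {
        // ... real lookup goes here ...
        return 'US';
    }

    $isUsVisitor = lookupCountry($_SERVER['REMOTE_ADDR']) === 'US';

    if ($isUsVisitor) {
        // Same content for everyone (no cloaking), but US visitors see a notice
        // and the restricted features are switched off further down the page.
        echo '<div class="geo-notice">This service is not available in your location.</div>';
    }

    // ...render the rest of the page as normal for all visitors, bots included...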
In my opinion publishing a detailed robots.txt like this is not wise.
E.g. check these:
http://edition.cnn.com/robots.txt
http://www.bbc.co.uk/robots.txt
http://www.guardian.co.uk/robots.txt
Judging from this:
http://www.joomla.org/robots.txt
Joomla.org hasn't changed the default administration folder :D
E.g. the PrestaShop page has a blank robots.txt file, which is not perfect, but at least better in my opinion:
http://www.prestashop.com/robots.txt
Are these people stupid, or do they think it is OK for everyone to know what their web structure looks like?
Why are they not using .htaccess to deny access to robots, etc.?
The problem is that .htaccess can't reliably tell that a visitor is a search engine bot.
Most bots will identify themselves in the user-agent string, but some won't.
Robots.txt is accessed by all the bots looking to index the site, and unscrupulous bots are not going to:
identify themselves as a bot, or
pay any attention to robots.txt (they may even deliberately disobey it).
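When you do need to trust a self-identified bot, the usual approach is the forward/reverse DNS check Google documents (linked earlier in this thread). A rough PHP sketch:

    <?php
    // Verify a visitor claiming to be Googlebot: reverse-resolve the IP,
    // check that the hostname ends in googlebot.com or google.com, then
    // forward-resolve that hostname and confirm it maps back to the same IP.
    function isVerifiedGooglebot($ip)
    {
        $host = gethostbyaddr($ip);
        if ($host === false || !preg_match('/\.(googlebot|google)\.com$/i', $host)) {
            return false;
        }
        return gethostbyname($host) === $ip;
    }

    $ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
    if (stripos($ua, 'googlebot') !== false && isVerifiedGooglebot($_SERVER['REMOTE_ADDR'])) {
        // Genuine Googlebot; safe to treat specially.
    }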
These days I've seen a lot of websites using short URLs for their whole website instead of just as a single-page reference.
Examples:
http://bo.lt
http://ge.tt
Just wondering how to get this type of custom URL for my website. Can anyone let me know how this works?
Two-letter top-level domains such as .lt and .tt belong to individual countries: .lt is Lithuania and .tt is Trinidad and Tobago. To get a domain under one of these country-code top-level domains, just google for a domain name registrar for that country and register the domain just like you would a .com, .net, .org, etc.