My website does not get visited by Google bots? - web

I am trying to understand why my website does not get visited by Google bots.
http://www.nateiss.com/
I used Site-Analyser to analyse my site - you can see the website report.
http://www.site-analyzer.com/en/audit/http://www.nateiss.com#report-page-6
What should I do to make the major bots (Google, Yahoo, Bing) visit it?
Thanks

I was facing the same problem a year ago, because my website http://www.silkyquote.com/ was not listed in Google. But now Googlebot visits my website daily.
If your website or blog is new, Googlebot takes time to crawl it.
If your website is not updated regularly, Googlebot will not visit it regularly.
So update your website daily. In a short time Googlebot will be crawling your site.

Related

Multilingual Umbraco Website cannot be scraped?

I have created a multilingual Umbraco website which has 3 domain names pointing to it for each language. The site has gone live and people are starting to share links to it on LinkedIn and other social media. I have metadata in the website which should be picked up when these links are shared. On LinkedIn when the link is shared it has 'coming soon' as the strap-line, which is what was in the holding page months ago suggesting the site isn't being re-scraped.
I used the Facebook link debugging tool and that was returning a run-time error with a 500 response code.
My co-worker insists that there is nothing wrong with the DNS and there aren't any errors in the code of the website so I am wondering if anyone has any ideas why the website cannot be scraped?
It also has another issue where one of the domains sometimes doesn't redirect to its www. version despite having a redirect on the DNS, which may be related.
Is there some specific Umbraco configuration that I may have missed? Or a bug within Umbraco that may cause this?
Aside from this issue the website is working fine, it is just these scrapers seem to be unable to hit the website successfully.
Do you have metadata set for the encoding? See https://www.w3.org/International/questions/qa-html-language-declarations. Probably a long shot.
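Following the W3C page linked above, a minimal sketch of what those declarations look like; the values here are illustrative examples, not taken from the site in question. Since LinkedIn and Facebook scrapers read Open Graph tags, those are included as well:

```html
<!DOCTYPE html>
<!-- Declare the page language on the root element -->
<html lang="en">
<head>
  <!-- Declare the character encoding early in <head> -->
  <meta charset="utf-8">
  <title>Example page</title>
  <!-- Open Graph tags are what LinkedIn/Facebook use for shared-link previews -->
  <meta property="og:title" content="Example page">
  <meta property="og:description" content="An up-to-date description, not the old holding-page text">
</head>
<body></body>
</html>
```

If the scrapers still show stale data, the 500 error from the Facebook debugger suggests the problem is the server's response to those user agents rather than the markup itself.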

How to fix Google search SEO after transfer old website and hosting to new framework website and hosting? [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 7 years ago.
I had an ecommerce website on Magento with Nexcess hosting, but I have now moved the website to the Kohana framework with Arxive hosting. The website has the SAME DOMAIN NAME. I moved my website to the new framework and hosting about a month ago, but Google still caches old SEO information from my old website, even though it also caches some information from the new one. When I search for my website on Google and click the links in the results, errors occur that prevent users from viewing my website or products.
For example: https://www.website.com -> Google says the shop has gone away, but my shop is still available on the new framework.
Another example: search for a product name, e.g. zippo lighter, and click on the result -> the link fails with a Kohana framework error message (but if I view that product from my website.com directly, it displays fine; there is no error message).
FYI: My new website's SEO works very well; there are titles, meta keywords, meta descriptions, and so on.
So how do I clear or delete Google's caches of the old website (I already deleted the website from the old hosting)? Or what can I do to make my website work well with Google after the transfer?
Thank you so much
Maybe Google has not reprocessed your website completely and is confused. You can create a sitemap.xml with all the URLs of your website and a recent lastmod date, then submit it in Google Webmaster Tools and let the crawlers revisit your site. Make sure you don't block those URLs with your robots.txt.
Remember that the Google Cache lags behind the index. It is often out of sync and cannot be used as reliable information about indexed content.
If your issue persists, report it to Google's Webmaster Forum.
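A minimal sitemap.xml along these lines would do; the URLs and dates below are placeholders, not the asker's real pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.website.com/</loc>
    <lastmod>2016-05-01</lastmod>
  </url>
  <url>
    <loc>https://www.website.com/products/zippo-lighter</loc>
    <lastmod>2016-05-01</lastmod>
  </url>
</urlset>
```

Serve it from the site root (e.g. https://www.website.com/sitemap.xml) and submit that URL in Webmaster Tools. A recent lastmod signals to Google that the pages have changed since its cached copy.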

Google does not index my posts

I'd like to know why Google does not index the posts on my blog, which is written in NodeJS.
Link of a post : http://icecom.fr/articles-icecom/9
Anthony
There are several reasons why Google isn't indexing your website.
There are no links to your website. Google follows links on the internet to find other pages; if there are no links to your website, it won't find it.
You are denying access to Google through the robots meta-tag or robots.txt.
You haven't waited long enough yet, Google may take some time before it has indexed your website.
Of course you can supply Google with the proper URLs with a [sitemap](https://support.google.com/webmasters/answer/156184?hl=en). If you're new to this, a good place to create one is [xml-sitemaps.com](http://www.xml-sitemaps.com/).
#szenbalu already mentioned that you can upload this sitemap.xml to Google Webmaster Tools; that way Google can index your site without the need for links, and it is usually faster too.
Another way to get your website indexed through Google Webmaster Tools is the 'Fetch as Google' tool. There you can tell Google to fetch and index your website. This is especially useful if you change content and want it reindexed.
About your specific case:
* You do not block Google with the meta robots tag
* I cannot find a robots.txt file
* I cannot find any links to your articles in [OpenSiteExplorer](http://www.opensiteexplorer.org/)
I think that uploading a sitemap to Google Webmaster Tools plus using the Fetch as Google tool will get your site indexed in no time.
If you have any questions left, feel free to ask. :)
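To check the robots.txt point yourself, here is a quick sketch using Python's standard library. It parses an in-memory example file rather than fetching a live one, so the check is reproducible; the disallowed path is made up for illustration:

```python
from urllib import robotparser

# Build a parser from example robots.txt lines and ask whether
# Googlebot is allowed to fetch given URLs. can_fetch() only
# inspects the URL path against the parsed rules.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /admin/",
])

print(rp.can_fetch("Googlebot", "http://icecom.fr/articles-icecom/9"))  # True
print(rp.can_fetch("Googlebot", "http://icecom.fr/admin/secret"))       # False
```

To test a live site, replace `parse(...)` with `rp.set_url("http://icecom.fr/robots.txt")` followed by `rp.read()`. If the file does not exist at all (as in this question), crawling is allowed by default.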
Do you have a robots.txt file and a Webmaster Tools account connected to your page?
With Webmaster Tools you can upload a sitemap that Google will use to index your pages.

Search engine robots.txt

I want to add a robots.txt so my web page can be found...
I have heard that putting a robots.txt with meta tags in the root of my site can do this.
Is this true? If so,
what would be the steps to add or generate this robots.txt?
I have found this
Robots.txt is more for telling the crawlers where to go and where not to go once they've already reached your site.
A better way to get crawlers onto your site is to build a sitemap for your site, then use Google Webmaster Tools to submit this sitemap to Google. You'll also want to host the sitemap at your site's root URL and tell Google where it is (all of this can be done in Google Webmaster Tools, linked to above).
No, it won't make your webpage suddenly visible. It just instructs web crawlers on how to index your site.
http://www.robotstxt.org/
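For reference, a typical robots.txt placed at the site root (i.e. served at /robots.txt) looks like this; the disallowed path and sitemap URL are placeholders:

```text
# Allow all crawlers everywhere except /private/
User-agent: *
Disallow: /private/

# Tell crawlers where the sitemap lives
Sitemap: https://www.example.com/sitemap.xml
```

Note that this file only constrains crawlers that already found your site; it is the sitemap submission, not the robots.txt, that helps your pages get discovered.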

SEO Website with links only

I have a website that contains only links to other sites.
They link to an image gallery or a video.
Does Google accept this, or will it penalize my site because I don't have any real content?
Thanks
Google does look for human-generated content, and it also looks at the links pointing to your site. I would add real content to your site and also guest blog etc. to get traffic to your site.
