Closed. This question is off-topic. It is not currently accepting answers.
Closed 12 years ago.
Say I register a domain and develop it into a complete website. From where, and how, does Googlebot learn that the new domain is up? Does it always start with the domain registry?
If it starts with the registry, does that mean anyone can have complete access to the registry's database? Thanks for any insight.
Google will find your website on its own if some existing website has a link to it.
You can jump-start the process: http://www.google.com/addurl/.
You may also be interested in Google's Webmaster Tools.
Google needs to find you. That is, if there is no link to your site from another web site, it'll never find it.
Google finds pages to crawl by following links from other pages. If no site links to your site, Google will likely never find it.
You should look at the Google website; they have some good information there, including a link to add your site to their crawl list.
Google works largely on PageRank, which accounts for its success. If nobody cares about your site, neither does Google.
Usually I submit a new domain directly to Google; I have found it takes less time for the Google crawler to find it that way than waiting for Googlebot to come across a link to the new domain on some other website: http://www.google.com/addurl/
Another way is to create a free Webmaster Tools account on Google and add the new domain/website there (ideally with a sitemap); this will take Google even less time to crawl your domain and get it indexed.
As far as I know, the registry information for most domains is public; you can read it using a simple whois service.
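For reference, a minimal sitemap.xml following the sitemaps.org protocol looks like this (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2012-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

You can submit the sitemap's URL in Webmaster Tools or reference it from robots.txt with a `Sitemap:` line.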
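That whois lookup can also be scripted directly over the whois protocol (RFC 3912, plain TCP on port 43). A minimal sketch, assuming the `.com` registry whois host; the function names are my own:

```python
import socket

def build_whois_query(domain):
    # The whois protocol (RFC 3912) expects the query terminated by CRLF.
    return domain.encode("ascii") + b"\r\n"

def whois_lookup(domain, server="whois.verisign-grs.com", port=43):
    # Open a plain TCP connection to the whois server and read the
    # whole reply until the server closes the connection.
    with socket.create_connection((server, port), timeout=10) as sock:
        sock.sendall(build_whois_query(domain))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

# Example (requires network access):
#   print(whois_lookup("example.com"))
```

Different TLDs use different whois servers, so a real client first asks a root server (e.g. whois.iana.org) which server is authoritative for the TLD.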
Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 5 years ago.
So I'm making a website for a restaurant in a village in France. The restaurant is called Le Cantou, so I've registered www.lecantou.net. I want to make sure it is easy to find with Google. Now people obviously are not going to type in the name of the restaurant in Google, they will write "restaurant a saint cirq lapopie", because that's the name of the village. So I've also registered http://restaurant-a-saint-cirq-lapopie.com in the hopes that that will make it clear to the visitor that this is the restaurant they want.
Now my question is, I have one website with two domains: is there a way to handle the two domains so I get maximum SEO? I think duplicating the website is a bad idea. But setting a redirect from the long domain name to the original domain name also doesn't work, because then the long domain name will never show up in Google results, isn't that right?
What do you guys recommend?
I recommend giving up on the longer domain. Since Google's EMD update, having a domain which includes the same keywords as popular search queries won't help you rank better.
You should work more on the content, interaction with your visitors, and getting links from local websites. That will help you improve your rankings.
Adding multiple domains to one website is tricky, because Google (and other search engines) can easily detect duplicate domains and duplicate links with their algorithms. The safe technique is to point the secondary domain at the primary one with a 301 redirect; visitors who type the secondary domain are then sent straight to the primary site.
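A 301 redirect of the secondary domain can be set up, for example, in an nginx server block (Apache's `Redirect permanent` works too). A sketch using the two domains from the question:

```nginx
# Permanently redirect the keyword domain to the canonical domain,
# preserving the requested path.
server {
    listen 80;
    server_name restaurant-a-saint-cirq-lapopie.com www.restaurant-a-saint-cirq-lapopie.com;
    return 301 http://www.lecantou.net$request_uri;
}
```

With a 301, search engines consolidate ranking signals onto the canonical domain instead of splitting them between the two.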
As the other guys have said - you need to shift your focus from old-school SEO thinking.
Instead try and create an awesome website on your main domain, awesome content, etc.
Don't go for a superlong keyword heavy domain. It probably won't work.
Closed 9 years ago.
For Google Analytics, I had to prove that I owned my domain. I added a TXT record to do this. I also had to prove to Microsoft that I owned my domain by uploading a file (BingSiteAuth.xml) to my site.
Now that I'm up and running with Analytics and Webmaster tools for Google and Bing, can I remove these verification records, or will that break analytics? Does leaving the record and file there pose any kind of security risk?
No, you shouldn't remove any of the verification files or DNS records. Google will periodically recheck your site and if it doesn't succeed you will lose access to WMT, for example. See this WMT support page:
Removing the record from your server can cause your site to become unverified, and you will need to go through the verification process again.
I'm not 100% sure, but I think Bing does the same. It makes sense: the domain owner or the administrators might change, and you don't want anyone who ever had access to your site's data to keep that access forever.
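For context, a Google site-verification TXT record in a DNS zone file looks roughly like this (the domain and token are placeholders; the actual token is issued by Google during verification):

```
example.com.  3600  IN  TXT  "google-site-verification=<token-from-google>"
```

Leaving it in place is harmless: the token only proves ownership to Google, it grants no access by itself.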
Closed 9 years ago.
I have a forum where users can post comments; they can also enable/disable and approve comments. However, I can't always trust users to disapprove comments linking to bad domains, like these: http://www.mywot.com/en/forum/3823-275-bad-domains-to-blacklist
My question is two part:
If a user does hyperlink to a 'bad domain' like those in the link above, will my forum/forum-category/forum-category-thread be penalised for it? And if so, does adding nofollow to the thread's links prevent that?
Is there a free API service out there that I can make a request to, to get a list of bad domains, so I can filter them out of users' posts?
I may just be paranoid, but that's probably because I'm not too SEO savvy.
The actual algorithms aren't public, but this is what I found looking around the 'net.
1) Google's Webmaster Guidelines say that it may lower rankings for sites that participate in link schemes. As an example of a link scheme, they give "links to web spammers or bad neighborhoods on the web". NoFollow may or may not have an impact on this, but the consensus seems to be that it doesn't.
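To nofollow user-submitted links, you can rewrite anchor tags when rendering posts. A minimal regex-based sketch in Python (the function name is mine; a production forum should use a proper HTML parser rather than a regex):

```python
import re

def add_nofollow(html):
    # Insert rel="nofollow" into <a> tags that don't already declare
    # a rel attribute; tags with an existing rel are left untouched.
    def patch(match):
        tag = match.group(0)
        if 'rel=' in tag:
            return tag
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r'<a\s[^>]*>', patch, html)

# Example:
#   add_nofollow('<a href="http://spam.example">x</a>')
#   -> '<a href="http://spam.example" rel="nofollow">x</a>'
```

Most forum engines (phpBB, vBulletin, etc.) already have a setting for this, so check your software's options before rolling your own.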
2) You can use either of Google's two safe browsing APIs to check if sites have been found to be phishing and/or malware sites.
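As a sketch of how the current Safe Browsing Lookup API (v4) is called: you POST a JSON body listing the URLs to check, and an empty match list means the URLs are clean. The client ID and helper names below are my own, and you need your own API key:

```python
import json
import urllib.request

API_ENDPOINT = "https://safebrowsing.googleapis.com/v4/threatMatches:find"

def build_request_body(urls):
    # Request payload shape for the Safe Browsing Lookup API (v4).
    return {
        "client": {"clientId": "my-forum", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": u} for u in urls],
        },
    }

def check_urls(urls, api_key):
    # POST the lookup; a response without "matches" means no threats found.
    req = urllib.request.Request(
        "%s?key=%s" % (API_ENDPOINT, api_key),
        data=json.dumps(build_request_body(urls)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read() or b"{}")
```

Note the lookup API covers malware/phishing, not generic "spammy" domains, so it complements rather than replaces moderation.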
If your website links to bad domains, that will definitely harm it, but again, it depends on the ratio of outgoing links.
I strongly recommend recruiting forum moderators from your active members; they can manually moderate forum posts and help protect you from spam.
I am not sure about your software, but many forums allow various restrictions, such as:
- Only members with a certain number of posts can include links in forum replies
- Only members whose accounts are a specified number of days/months old can share links
- Only a particular number of links is allowed per forum post.
Check whether such facilities are available to help you restrict users.
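Those restrictions are straightforward to enforce in code if your software lacks them. A sketch in Python; the thresholds are illustrative, not from the answer:

```python
from datetime import datetime, timedelta

# Illustrative thresholds -- tune these for your community.
MIN_POSTS_FOR_LINKS = 10
MIN_ACCOUNT_AGE = timedelta(days=30)
MAX_LINKS_PER_POST = 2

def may_post_links(post_count, joined_at, link_count, now=None):
    """Apply the three restrictions from the list above."""
    now = now or datetime.utcnow()
    if post_count < MIN_POSTS_FOR_LINKS:
        return False                      # too few posts
    if now - joined_at < MIN_ACCOUNT_AGE:
        return False                      # account too young
    return link_count <= MAX_LINKS_PER_POST
```

Run this check before accepting a post containing links; rejected posts can be held for moderator review instead of being dropped.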
Closed 10 years ago.
I am building a question and answering site by myself.
I want this site to be indexed by Google as a Q&A site or forum, so that it can be retrieved with Google's "Discussions" search. In my experience, Google Discussion Search is a pretty useful feature when I want to get others' real opinions or experiences.
However, I have no idea how Google determines that a site or page is a Q&A site or forum. I searched a lot on Google, but there is little related information on this issue. Do you have any ideas or references?
Thanks!
Best,
Jing
Use rich snippets, and let Google recognize your traffic by using Webmaster Tools or Analytics. Use a sitemap.xml to invite revisits and fast indexing, and disable archiving (e.g. the Google Cache) with a meta robots noarchive tag. If you have high traffic and fast content growth, search engines will then recognize the site type by themselves.
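One way to add such rich-snippet markup today is schema.org QAPage structured data, together with the noarchive robots tag. A sketch; the question and answer text are placeholders:

```html
<!-- Disable archiving, as suggested above -->
<meta name="robots" content="noarchive">
<!-- Mark the page up as Q&A content -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "QAPage",
  "mainEntity": {
    "@type": "Question",
    "name": "How do I ...?",
    "answerCount": 1,
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "You can ..."
    }
  }
}
</script>
```

Google's structured-data testing tool can verify that the markup is parsed the way you intend.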
Closed 13 years ago.
I just discovered that one of my sites is considered malware by Chrome.
Here's what Google Chrome is showing:
The website at ___ contains elements from the site ___, which appears to host malware – software that can hurt your computer or otherwise operate without your consent. Just visiting a site that contains malware can infect your computer.
My site uses Joomla 1.5 as a CMS and had security issues: one of the templates that ships with Joomla ("beez", I think) contained a virus. I have since updated Joomla and removed the template, and I thought that fixed the problem.
Chrome, however, still considers my site malware. Any ideas how I can fix this?
Thanks!
Google uses this organization's database for malware reports:
http://www.stopbadware.org/home/reportsearch
Check if you're in there; if you are, go to Google Webmaster Tools, where you should be able to request a revalidation.
http://www.stopbadware.org/home/reviewinfo
You might still have a problem with content that contains malware. When a WordPress site of mine got hacked, there was a stray piece of malware in the static HTML. You need to check every file that is served :(
Good luck.