One website on two domains [closed] - iis

I have 2 domains:
www.first.com
www.second.com
Let's assume that on the first one I have an online store, and on the second one I have only the products of that store (separate applications running on the server).
The product links are
www.second.com/firstProduct
www.second.com/secondProduct
www.second.com/thirdProduct
and so on.
I want to redirect users to the first website when someone hits www.second.com directly, i.e. the bare domain rather than a full product path.
What redirect should I use? A 301? In terms of SEO, what is the best approach?
Thanks.

Yes, 301 Moved Permanently is the status code you want to return for this redirect. Search engines typically queue up 301s for updates to their results, since a 301 indicates that the resource is now found at the new URL and that the old one will soon be obsolete.
In your case, since you never want www.second.com/ to be accessed directly, the 301 is exactly what you want.
You might also consider adding a robots.txt file with Allow and Disallow rules, as most of the bots you actually care about for SEO will honor it.
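Since the site runs on IIS, a rule along these lines in the second site's web.config would do it. This is a minimal sketch assuming the URL Rewrite module is installed; the host names are the placeholders from the question, and the `^$` pattern matches only the site root so the product paths keep working.

```xml
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- 301 only the bare domain; deeper paths such as /firstProduct are left alone -->
        <rule name="Root of second.com to first.com" stopProcessing="true">
          <match url="^$" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^(www\.)?second\.com$" />
          </conditions>
          <action type="Redirect" url="http://www.first.com/" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>
```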

Related

How to tell Google change the results to the new domain of my blogger website? [closed]

I have just moved my site to a new domain (from math2it.blogspot.com to math2it.com). However, when I search on Google, the results still show the old URLs on the "blogspot.com" domain. How can I tell Google to change the results to "math2it.com"?
I have already made the change in Webmaster Tools. I also used a redirect-checking site to verify the redirect; the result is "Type of redirect: 301 Moved Permanently. Redirected to: math2it.com".
It looks like you did everything needed to let Google know about the change.
Give it some time. Google will update its records, but not instantly.
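If you want to double-check the redirect yourself rather than relying on a third-party checker, here is a quick sketch using Python's requests library; it only uses the domains mentioned in the question and simply inspects the raw response.

```python
import requests

# Fetch the old blogspot address without following redirects,
# so we can see the raw status code and Location header.
resp = requests.get("http://math2it.blogspot.com/", allow_redirects=False)

print(resp.status_code)               # a permanent redirect should report 301
print(resp.headers.get("Location"))   # should point at the math2it.com address
```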

(How) can I add a second domain to a website to improve SEO? [closed]

So I'm making a website for a restaurant in a village in France. The restaurant is called Le Cantou, so I've registered www.lecantou.net. I want to make sure it is easy to find with Google. Now people obviously are not going to type the name of the restaurant into Google; they will search for "restaurant a saint cirq lapopie", because that's the name of the village. So I've also registered http://restaurant-a-saint-cirq-lapopie.com in the hope that this will make it clear to visitors that this is the restaurant they want.
Now my question is: I have one website with two domains. Is there a way to handle the two domains so I get maximum SEO? I think duplicating the website is a bad idea. But setting up a redirect from the long domain name to the original domain name also doesn't seem to work, because then the long domain name will never show up in Google results, isn't that right?
What do you guys recommend?
I recommend you give up on the longer domain. Since Google's EMD (exact-match domain) update, having a domain that contains the same keywords as popular search queries won't help you rank better.
You should work more on your content, on interaction with your visitors, and on getting links from local websites. That will help you improve your rankings.
Adding multiple domains to a website is a tricky procedure, as Google (or any other search engine) can easily detect the duplicate domains and duplicate links with its algorithms. Pointing secondary domains at the primary one through 301 redirects is a safe technique: visitors who arrive on the secondary domain are simply forwarded to the primary one.
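As a sketch of that 301 approach, assuming the site is hosted on IIS with the URL Rewrite module (as in the first question on this page; other servers have equivalent directives), the whole secondary domain can point at the primary one while preserving whatever path was requested:

```xml
<rule name="Secondary domain to primary" stopProcessing="true">
  <match url="(.*)" />
  <conditions>
    <add input="{HTTP_HOST}" pattern="^(www\.)?restaurant-a-saint-cirq-lapopie\.com$" />
  </conditions>
  <!-- {R:1} carries over whatever path the visitor requested -->
  <action type="Redirect" url="http://www.lecantou.net/{R:1}" redirectType="Permanent" />
</rule>
```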
As the other guys have said, you need to shift your focus away from old-school SEO thinking.
Instead, try to create an awesome website on your main domain, with awesome content, etc.
Don't go for a super-long, keyword-heavy domain. It probably won't work.

Does Google Care About The Structure / Organization of Sitemap Index Files, Sitemap Files and URLs [closed]

I am building a sitemap file generator and have been reading about the various limits (50,000 URLs per sitemap file and 50,000 sitemap files per index file).
I have already been building this with the strategy of organizing my sitemap files similarly to how the links are organized on the actual site. However, I am noticing that in time I will likely need to restructure due to the limits mentioned above.
So, I am now thinking that, alternatively, I will store every possible link/URL in a database table and then run a cron job that generates XML files, one per every 50,000 URLs I have. This approach is more easily scalable but lacks any organization. I am curious whether any SEO experts out there know if this matters to Google, or if the URLs are all seen in the same light regardless of how they are grouped.
The purpose of a sitemap is simply to help Google fully understand the structure and layout of your website.
That said, as long as you are using a technique that effectively communicates the layout of your site to Google, you should be fine. Since you still seem to be communicating the right message about the structure of your site, this technique appears OK.
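A minimal sketch of the cron-job approach described in the question: chunk the URLs pulled from the database into files of at most 50,000 entries and list them in a sitemap index. The base URL and file names are made up for illustration, and the URLs are assumed to already be XML-safe.

```python
from datetime import date

MAX_URLS = 50_000  # per-sitemap limit from the sitemaps protocol

def write_sitemaps(urls, base_url="https://www.example.com"):
    """Split `urls` into sitemap files and write a sitemap index referencing them."""
    sitemap_names = []
    for i in range(0, len(urls), MAX_URLS):
        name = f"sitemap-{i // MAX_URLS + 1}.xml"
        sitemap_names.append(name)
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in urls[i:i + MAX_URLS]:
                f.write(f"  <url><loc>{url}</loc></url>\n")
            f.write("</urlset>\n")

    # The index file simply points at each generated sitemap file.
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for name in sitemap_names:
            f.write(f"  <sitemap><loc>{base_url}/{name}</loc>"
                    f"<lastmod>{date.today().isoformat()}</lastmod></sitemap>\n")
        f.write("</sitemapindex>\n")
```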

SEO - moving subdomain to folder [closed]

I am moving my blog from a sub-domain (blog.example.com) to a sub-folder (example.com/blog/).
The URLs and the content are staying exactly the same.
What would be the best action to take SEO-wise? I was thinking of the following:
Add rel="canonical" to sub-domain URLs and let the spiders crawl my pages to become aware of the new links.
Add a 301 redirect from sub-domain to sub-folder.
I understand that there's no point in having canonical if there's a 301 redirect.
Any help would be highly appreciated, thank you in advance!
The URLs are not staying the same. They are changing, so you need to tell search engines and users where to find the content. 301 redirects are exactly what you want: they tell search engines where to find the new content and to update their indexes (and Google will transfer PageRank), and when users go to an old URL they are automatically redirected to the new one, which canonical URLs do not do.
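A sketch of that redirect, again assuming IIS with the URL Rewrite module (example.com is the placeholder domain from the question); the `{R:1}` back-reference keeps each post's path intact so every old URL lands on its new sub-folder equivalent:

```xml
<rule name="Blog subdomain to subfolder" stopProcessing="true">
  <match url="(.*)" />
  <conditions>
    <add input="{HTTP_HOST}" pattern="^blog\.example\.com$" />
  </conditions>
  <!-- blog.example.com/some-post -> example.com/blog/some-post, permanently -->
  <action type="Redirect" url="http://example.com/blog/{R:1}" redirectType="Permanent" />
</rule>
```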

How to get the domain country? [closed]

I'm having trouble understanding how certain websites use a separate domain for each country. In a nutshell, how does, say, Myspace have uk.myspace.com, fr.myspace.com, etc.? Do they put the main files above the root and then have individual sub-domains for each country, or do they have something weird going on in terms of country detection? I can't find anything about this online.
Thanks.
There is unlikely to be a single server involved, so talking about "files above the root" is meaningless. You'll be talking about some kind of fairly advanced routing infrastructure hiding dozens of different servers across many different locations. The routing logic is the part that decides which group of machines will be responsible for handling a given request.
The forwarding part is indeed "weird country detection", in the sense that some machine is responsible for performing an IP lookup and redirecting the user to an appropriate (possibly-geographically-closer) host. This might be done for performance reasons, or it might be done for content localisation and SEO reasons (e.g., the default language).
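As an illustration of that IP-lookup-and-redirect step, here is a minimal Flask sketch using a local GeoLite2 country database. The packages, the database file, and the country-to-subdomain mapping are assumptions for the example, not how Myspace actually implements it.

```python
from flask import Flask, redirect, request
import geoip2.database
from geoip2.errors import AddressNotFoundError

app = Flask(__name__)
reader = geoip2.database.Reader("GeoLite2-Country.mmdb")  # assumed local MaxMind database

# Hypothetical mapping of ISO country codes to country subdomains.
COUNTRY_SUBDOMAINS = {"GB": "uk", "FR": "fr"}

@app.route("/")
def route_by_country():
    try:
        country = reader.country(request.remote_addr).country.iso_code
    except AddressNotFoundError:
        country = None
    sub = COUNTRY_SUBDOMAINS.get(country)
    target = f"https://{sub}.myspace.com/" if sub else "https://www.myspace.com/"
    # Temporary (302) redirect so each visit can be re-evaluated by location.
    return redirect(target, code=302)
```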
