Replacing sitemap.xml under the storefront URL https://b.local:9002/sitemap.xml
The client has provided two .xml files, a_20221229.xml and b_20221229.xml, for two sites, i.e. site a and site b. They want the content served at https://a.local:9002/sitemap.xml replaced with a_20221229.xml and the content at https://b.local:9002/sitemap.xml replaced with b_20221229.xml.
I have created a CatalogUnawareMedia for a_20221229.xml and associated it with the sitemap for the given site.
Can anyone please confirm whether this is the correct way to do it, or does it need to be done some other way?
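To make that concrete, here is a simplified ImpEx sketch of what I mean (the media code and site uid are only illustrative, and the siteMaps attribute on CMSSite is assumed from the standard accelerator item definition, so it may need adjusting for your setup):
# create the media that holds the uploaded sitemap file for site a
INSERT_UPDATE CatalogUnawareMedia; code[unique=true]; realfilename; mime[default='application/xml']
; sitemap-a-20221229; a_20221229.xml;
# attach it to the site's sitemap media (attribute name assumed from the accelerator definition)
UPDATE CMSSite; uid[unique=true]; siteMaps(code)
; a; sitemap-a-20221229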
I have an Umbraco database/back-end that has two sites configured on it (SiteA and SiteB for this purpose).
I wish to create a third site/repository in which I can keep nodes common to both sites, and have the two sites reference it.
I have set up a third site with some nodes/document types in it, but when I try to add them as targets to the SiteA and SiteB pages, I get an error saying "The site cannot be reached" in the browser.
The new site does have a hostname set in Umbraco - does this need to be a site that is available within IIS? Why can Umbraco not just serve up content within the context of SiteA or SiteB?
What version of Umbraco are you using? On very old versions of Umbraco, what you are trying to do is possible, but in newer builds, if Site A and Site B have hostnames set, you won't be able to access the content from the other site using out-of-the-box functionality.
Probably the easiest way to do what you want would be to build a custom URL provider and content finder. The custom URL provider can be used to check which domain the site is on (A or B) and, if the node the URL is being generated for is on A or B, include the correct domain in the returned URL. The custom content finder would then look for the content in site C if it's not found in site A or B.
Just be aware that you may have to be careful calling things like parents, etc., as technically content on site C does not sit in the tree for A or B, so calling the parent on a node from C will pull in its parent from C, and not content from site A or B.
One other consideration: if the content is nodes with content in them, Google penalises sites for duplicate content. So if you have identical content on site A and site B, there is a chance that your SEO rankings may be adversely affected.
First off, apologies for not knowing the nomenclature for what I'm looking for, I'm not typically a Windows web admin.
I have a SharePoint website which contains several subsites. We also have several alternate URLs that point to specific pages, and some of those alternate URLs have friendly URLs which also redirect to other specific pages. We're in the process of migrating from a SharePoint 2007 site to this one, and as part of that I'm trying to remove our reliance on our registrar for handling some of this redirection, because it is apparently not a free service.
Currently our registrar does the following redirects:
http://alias1.tld/* redirects to http://subsite1.ca/page1
http://alias1.tld/friendly redirects to http://subsite1.ca/page2
http://alias2.tld/ redirects to http://subsite1.ca/page3
I know I can accomplish the first and third by setting the sites up in IIS and using the HTTP Redirect function, but I'm not sure how I can do the second one. In Apache this would be easy, but I'm not sure what I'm looking for here.
Is this something that should be handled within SharePoint, and have that take care of redirecting alias1.tld/friendly to the specific page, or is this something I need to setup in IIS? Is this what URL rewrite is for, or is there a different IIS way to do this?
I'm not sure that this is the best way to do it, but I got things working how I wanted them. Here's what I ended up doing:
1) Create a new subsite on subsite1 to give me the URL subsite1.ca/subsubsite.
2) Create a redirect from alias1.tld to subsite1.ca/subsubsite.
3) Create 2 pages for the new subsite: one for the default page and one to use to redirect to page2. Both pages are redirects; Default points to Page1, the second points to Page2.
4) Set the subsite to use Managed Navigation for global and current through Site Settings > Navigation, and create a default term set by selecting the new subsite in the list, clicking Create Term Set, then clicking OK.
5) Create a term for the one page that needs to be handled differently by going to Site Settings > Term Store Management. Click on the term set created in the previous step, then select New Term. On the Term-Driven Pages tab, create the friendly URL and then select the target page, which is the redirect page created in step 3, then click Save.
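If you would rather keep the friendly-URL redirect in IIS itself, the URL Rewrite module (a separate install on top of IIS) can handle it with a rule roughly like the following in the site's web.config; the rule name, patterns and target URL below are only illustrative:
<configuration>
  <system.webServer>
    <rewrite>
      <rules>
        <!-- redirect http://alias1.tld/friendly to the target page; other requests fall through -->
        <rule name="FriendlyRedirect" stopProcessing="true">
          <match url="^friendly/?$" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^alias1\.tld$" />
          </conditions>
          <action type="Redirect" url="http://subsite1.ca/page2" redirectType="Permanent" />
        </rule>
      </rules>
    </rewrite>
  </system.webServer>
</configuration>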
I have a Prestashop store at www.theafricantouch.com.
I also have multiple domains pointing to the same DNS and same folder: .us, .fr, .de, .es, .co.uk, .net...
In the SEO and URLs configuration page of my prestashop backend, I set www.theafricantouch.com as the main store domain.
My goal is that when a user from France uses theafricantouch.fr to visit my store, the browser always keeps the .fr extension of the domain in the URL bar.
Right now, no matter where a user enters from, the .fr, .es... extension is always replaced with .com.
Is there any way to keep the extension?
Thanks,
Jonatan.
Check out the "Multi Store" function, I'm sure you gonna find what you're looking for.
http://doc.prestashop.com/display/PS15/Managing+Multiple+Shops
Please note that all your domains should be with the same host, or this solution won't work.
Suppose I have a new version of a website:
http://www.mywebsite.com
and I would like to keep the older site in a sub-directory and treat it separately:
http://www.mywebsite.com/old/
My new site has a link to the old one on the main page, but not vice-versa.
1) Should I create 2 sitemaps? One for the new and one for the old?
2) When my site gets crawled, how can I limit the path of the crawler? In other words, since the new site has a link to the old one, the crawler will reach the old site. If I do the following in my robots.txt:
User-agent: *
Disallow: /old/
I'm worried that it won't crawl the old site (using the 2nd sitemap) since it's blocked. Is that correct?
1) You could include all URLs in one file, or you could create separate files. One could understand a sitemap as "per (web) site", e.g. see http://www.sitemaps.org/:
In its simplest form, a Sitemap is an XML file that lists URLs for a site along with additional metadata about each URL
Since you now have two sites, you may create two sitemaps. But again, I don't think that it is strictly defined that way.
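For illustration, each sitemap is just a plain list of URLs; a minimal file for the old site could look like this (the URLs below are examples):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.mywebsite.com/old/</loc>
  </url>
  <url>
    <loc>http://www.mywebsite.com/old/page1.html</loc>
  </url>
</urlset>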
2) Well, if you block the URLs in robots.txt, these URLs won't be visited by conforming bots. It doesn't mean that these URLs will never be indexed by search engines, but the pages (= the content) will not.
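For what it's worth, you can also reference both sitemaps from robots.txt using the standard Sitemap: directive while still blocking /old/; listing a sitemap does not override the Disallow rule, it only tells search engines where the files are (the file names below are examples):
User-agent: *
Disallow: /old/
Sitemap: http://www.mywebsite.com/sitemap-new.xml
Sitemap: http://www.mywebsite.com/sitemap-old.xml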
I want to set up my website. It has many user profiles, which are kind of dynamic,
e.g. http://test.com?profile=2, http://test.com?profile=3.
What steps do I need to take so that all profiles show up in search engines dynamically?
1) I have a Google Webmaster Tools account.
2) I added a sitemap and robots.txt for the site.
After a month or so (indexing is done, as I can see in the Webmaster Tools account),
if I search for a profile (say by name), I don't see the user profile in the search results.
I have added the URL parameters as well, e.g. profile here.
Am I missing anything?
Can you get to a profile from the home page by basic links alone?
Search engines like to be able to find your pages on their own.
Do a more specific search first, e.g. add site:test.com to your search so only your site is competing.
Check you have not blocked the pages in the robots.txt file or via the robots meta tag on the page.
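As a concrete check, these are the kinds of directives that would keep the profile pages out (the profile path here is taken from your example URLs):
In robots.txt, a rule like this stops conforming bots from crawling the profile URLs:
User-agent: *
Disallow: /?profile=
And on the page itself, a meta tag like this keeps an otherwise crawlable page out of the index:
<meta name="robots" content="noindex">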