I'm implementing Google Custom Search on a site I'm working on.
The website is divided into subsites under a parent site, like this:
- parentsite.com
- parentsite.com/childsite1
- parentsite.com/childsite2
- parentsite.com/childsite3
- parentsite.com/childsite4
When searching the site, I want users to be able to filter the results by child site. I've done this with refinement labels (so when you click the Childsite 1 label, you only get results from parentsite.com/childsite1/*).
My question is: I also want a label that gives the user results from the parent site only, not from any of the child sites. Can I somehow make a refinement label exclude certain URL patterns?
You can exclude URLs in advanced site management by using a list of URLs, one URL per line. There are further options, but each URL is treated like a wildcard, so any URL containing the partial URL you list will be excluded.
1. Start here and choose the CSE you own and wish to modify:
https://www.google.com/cse/
2. At the bottom of the "Basics" tab, under the "Sites to add" heading, you should find an "Advanced" link. It's fairly intuitive from there: bulk or one at a time...
Check here for more info on URL patterns:
https://support.google.com/customsearch/bin/answer.py?hl=en&answer=71826
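As a concrete sketch for your case, the exclusion list (one URL per line, each treated as a wildcard pattern) could look like this, using the structure from your question:
parentsite.com/childsite1/*
parentsite.com/childsite2/*
parentsite.com/childsite3/*
parentsite.com/childsite4/*
Pair that with parentsite.com/* as the included pattern and you should be left with parent-site pages only; whether the exclusions can be scoped to a single refinement label or apply to the whole engine is worth verifying in the control panel.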
I am having quite a bit of difficulty. On our company site https://temp-quitlogixbase.quitlogix.org I set up the Smart Search functionality, setting up indexes for each of the sites within the application
(i.e.
https://colorado.quitlogix.org, https://arkansas.quitlogix.org, https://idaho.quitlogix.org
).
I even made sure to limit each site to the index meant for it. The problem is that the smart search results either change for each site, or, if I include multiple indexes, I get all the results for all the indexes, not just for the site I am working with. Can someone help me figure out what I am doing wrong?
From your description, it sounds like you want each site's search to work independently, so that, for example, Idaho results are not served in the Colorado search results.
To do this, you'll need (and it sounds like you have) a Smart Search index set up for each site, with the allowed content in the index limited to the site in question.
What I would look at is the template you're using for the search results. It looks very similar on the three sites you've listed, which makes me think they are the same template. If that is the case and you're using a web part for the search results, you'll need a macro or some other logic to tell the page which index to look at. The template is effectively global, so each time you set the index on the Smart search results web part, it will override the previous value, even if you're switching between sites.
A way to do the switching can be to set the Indexes field to something like the following macro:
{% if (CurrentSite.CodeName == "QuitLogix_Arkansas") { "ArkansasSiteIndex" } else if (CurrentSite.CodeName == "QuitLogix_Colorado") { "ColoradoSiteIndex" } else { "IdahoSiteIndex" } #%}
If you've done all that or are using separate templates, you will not need the macro. Other options include using multiple Smart search results web parts with their visibility set based upon the current site, or having a different template for each sub-domain.
I have a custom page type (content only) for Locations. Then I have a landing page (/company/locations/) with a repeater to list all locations and their details. Things work well so far. Now, after adding smart search, I notice that if I search for a location name like "san francisco", the landing page doesn't show up in the search results, but the content-only page does, with a URL like /company/locations/san-francisco. The thing is, this URL results in a 404 since that page doesn't really exist. What should I do? Should I re-create the page type and change it to a regular page instead of content only before it's too late? Or is there a way to make an individual location URL (/company/locations/san-francisco) work, considering we can't specify a page template to go with a content-only page type? Thanks!
There are multiple types of Search indexes in Kentico.
"Pages" scans the data of a document, such as any webparts+properties, editable text, form data, etc. They do NOT scan the rendering on the page though, it doesn't catch any Repeaters (what you're using).
"Page Crawler" will literally load the page, and scan all the content in the page. This will catch Repeaters and dynamic content like that.
Knowing this, you have a couple options.
Use Pages, then modify the Smart search results web part and add some transformation logic along the lines of the sketch after this list.
Use Page Crawler and tell it specifically to index only /company/locations.
Use Page Crawler together with a custom smart search indexer so you can exclude the header/footer or other areas from the indexed content (this is a bit more advanced).
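For the first option, here is a rough sketch of what that transformation logic could look like. It is only an illustration: it assumes the repeater on /company/locations/ renders an anchor (id) per location, and it uses NodeAlias and DocumentName as macro placeholders, so swap in whatever fields your Locations page type actually has.
<a href="/company/locations/#{% NodeAlias %}">{% DocumentName %}</a>
If fragments are awkward, another variant is to pass the alias as a query string (for example /company/locations/?location={% NodeAlias %}) and have the repeater's WHERE condition pick it up, so every search result points at a page that actually exists.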
If you don't want that URL to show, simply exclude those page types from the search index. But if you do want them to show, create a detail or selected-item transformation for the /company/locations repeater so it displays a single location when someone navigates to it from the search. This will also be good for Google and other search engines if you plan to have specific content for each location.
I cannot for the life of me work out how to return home pages within SharePoint Online search.
I have a single site collection with a number of sub-sites, each with a home page set as the default page. However, when I create a query result source in SharePoint Online, I cannot retrieve any of the home pages. They seem to be excluded.
Any ideas or thoughts to why they would be excluded?
Ideally, I just want to return all homepages for each sub-site within the site collection.
Many thanks.
You need to make sure that all your home pages have the same content type you are using in your filter. You might also use the name of the content type in the filter instead of its ID:
ContentType:"your content type name"
You also need to make sure all these pages are checked in and published in order to be picked up in search. If you are sure of all that, then try to reindex the whole site collection from site settings and recheck after a while; normally it takes a couple of hours for the crawl to finish and surface your results, but it sometimes takes longer depending on the search index load in the cloud.
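As an example of the filter: sub-site home pages on publishing sites typically use the "Welcome Page" content type, so a result source query along these lines should return only the home pages within the site collection (the tenant and site names below are placeholders; swap in whatever content type your home pages actually use):
{searchTerms} Path:"https://yourtenant.sharepoint.com/sites/yoursitecollection" ContentType:"Welcome Page"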
I am currently working on a project which requires content to be published onto a view or page depending on search result criteria. For example: I search through my content for the word dog, and this word appears in 4 of 20 pieces of content. I want to view all of those items on a page that is not the search results page, but rather one that displays all the content found, so I can print each piece of content.
I apologize if this post is awkwardly worded. At this moment it is just an idea, and I am trying to get a better picture of how to publish content to a certain area based on search results.
Thank you for your time -- and if anyone wishes to ask follow up questions, I'd be more than willing to help clarify.
You can use a view with an exposed filter. Create a view, add a filter criterion there, then in its settings check "Expose this filter to visitors, to allow them to change it". The user will see a filter form in the view, which you can also separate from the view by setting "exposed form" to "yes" and putting it in a separate block.
I've read a bit on the matter of friendly urls and I'm a little unsure as to what is better.
I currently have my website using a structure of http://www.domain.com/page.php?id=2
I am using the record ID to determine the content of the page. My record IDs are numeric and increment as new pages are added. The content of existing pages can change completely over time but still uses the same record ID (this is a CMS, so the client may do this).
The way I understand it I have two options for friendly urls:
http://www.domain.com/page/2
http://www.domain.com/some-text-describing-the-page
Now because I identify the content by the record id, I would assume the first option would make more sense.
My client seems to want option two.
After some reading I found two conflicting points.
Tim Berners-Lee (the architect of the WWW) states that you want a URI which has the potential to remain the same 2 months, 2 years, 200 years from now. So you do NOT want to use a page title or something similar in your URIs: if you change a page's content, you are either forced to leave the URI alone even though it no longer matches, or to change the URI and be stuck with dangling links. You can read his article here: http://www.w3.org/Provider/Style/URI
However, a number of other people on the internet (with no known authority, as far as I can tell) clearly state that you need a descriptive yet short URI for the best SEO value. From what I read, this is mostly for the sake of backlinks and having keywords in the anchor text, since people often use the link itself as the anchor text. So having keywords in the link itself helps search engines know what the link is about even without a custom title.
It seems to me the difference has to do with long term VS short term.
Am I grasping this correctly?
If I am to use a slug-style URI defined by the user, do I just have to allow the user to type whatever they want into a field and check it against the database to see if it already exists? If so, am I supposed to anticipate static links by running a query for the known record ID and then using the result to generate the URL, which would just be rewritten back to the format http://www.domain.com/page.php?id=2?
It seems to me that would be a lot of extra overhead.
I would suggest something in the middle of those two:
http://www.domain.com/page/2/some-text-describing-the-page
or without page:
http://www.domain.com/2/some-text-describing-the-page
You can still get the page ID from the URL, and there is a title in it as well. Even more important, you're still able to serve the correct content even when the page title changes later.
So think about a situation like this: a user creates a page, it receives Id=3, and its title is "My great title". From that information the URL is generated, e.g. http://www.domain.com/page/3/my-great-title. After 2 months the user changes the title to "This title is better than the last one!". The URL changes as well, to http://www.domain.com/page/3/this-title-is-better-than-the-last-one. However, the 3 is still in the URL, so you're able to show the right content. You can also check whether the rest of the URL is current and redirect (a 301 would be best) to the new one, to let search engines know that the URL changed.
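To make that concrete, here is a rough PHP sketch of the lookup-and-redirect logic. It assumes a rewrite rule that maps /page/{id}/{slug} onto page.php?id={id}&slug={slug}, and load_page() is a hypothetical helper standing in for your own database lookup by record ID.
Assumed .htaccess rewrite rule:
RewriteEngine On
RewriteRule ^page/([0-9]+)/([a-z0-9-]*)/?$ page.php?id=$1&slug=$2 [L,QSA]
And page.php:
<?php
// The record ID decides the content; the slug is only cosmetic.
$id   = (int) (isset($_GET['id']) ? $_GET['id'] : 0);
$slug = isset($_GET['slug']) ? $_GET['slug'] : '';

$page = load_page($id); // hypothetical helper: fetch the row for this record ID
if (!$page) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

$currentSlug = $page['slug']; // slug generated from the current title
if ($slug !== $currentSlug) {
    // The title (and so the slug) changed since this link was made:
    // 301 to the current URL so visitors and search engines pick it up.
    header('Location: http://www.domain.com/page/' . $id . '/' . $currentSlug, true, 301);
    exit;
}

// ...render the page content as before...
This way old links with a stale slug keep working, and the 301 tells search engines the canonical URL is now the one built from the current title.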