How to apply a custom Examine search index to nested Umbraco content?

I have a doctype Home which contains a nested doctype Product, and I want to be able to search Products.
I have also created an Examine index set in ExamineIndex.config as follows:
<IndexSet SetName="ProductsIndexSet"
          IndexPath="~/App_Data/TEMP/ExamineIndexes/Products/">
  <IndexAttributeFields>
    <add Name="id" />
    <add Name="nodeName" />
    <add Name="productName" />
    <add Name="nodeTypeAlias" />
  </IndexAttributeFields>
  <IncludeNodeTypes>
    <add Name="homeProduct" />
    <add Name="product" />
  </IncludeNodeTypes>
</IndexSet>
I have created an Examine indexer in ExamineSettings.config as follows:
<add name="ProductIndexer" type="UmbracoExamine.UmbracoMemberIndexer, UmbracoExamine"
     supportUnpublished="true"
     supportProtected="true"
     analyzer="Lucene.Net.Analysis.Standard.StandardAnalyzer, Lucene.Net"
     indexSet="ProductsIndexSet"/>
I have created a product searcher in ExamineSettings.config as follows:
<add name="ProductSearcher"
     type="UmbracoExamine.UmbracoExamineSearcher, UmbracoExamine"
     supportUnpublished="false"
     supportProtected="true"
     indexSet="ProductsIndexSet"
     analyzer="Lucene.Net.Analysis.WhitespaceAnalyzer, Lucene.Net"/>
But when I run Rebuild Index from Developer → Examine Management → ProductIndexer, I get 0 documents in the index.
I am really not sure how to proceed with Examine over nested content.
Can anyone help me set up an Examine search index on nested content?

If your home node alias is "home", then you need to add that to the Included Node Types in your index configuration. product doesn't need to be included unless it's also a content node in its own right.
You may also want to take a look at the article here which outlines an approach to indexing nested content etc.:
https://youritteam.com.au/blog/indexing-content-in-complex-umbraco-data-types
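The approach in that article boils down to hooking the indexer's GatheringNodeData event and flattening the nested content values into an extra searchable field. A minimal sketch for Umbraco 7 / Examine v1 (the "ProductIndexer" name comes from the question; the "products" property alias is a hypothetical placeholder for your nested content property):

```csharp
// Sketch only: subscribes to Examine's GatheringNodeData event so nested
// product values become searchable text on the parent node's index document.
public class ExamineEvents : Umbraco.Core.ApplicationEventHandler
{
    protected override void ApplicationStarted(
        Umbraco.Core.UmbracoApplicationBase umbracoApplication,
        Umbraco.Core.ApplicationContext applicationContext)
    {
        Examine.ExamineManager.Instance.IndexProviderCollection["ProductIndexer"]
            .GatheringNodeData += (sender, e) =>
        {
            // "products" is a hypothetical nested-content property alias
            if (e.Fields.ContainsKey("products"))
            {
                // Strip JSON punctuation so the nested values index as plain text
                var raw = e.Fields["products"];
                e.Fields["productsSearch"] = raw
                    .Replace("\"", " ").Replace("{", " ")
                    .Replace("}", " ").Replace(":", " ");
            }
        };
    }
}
```

You would then search against the productsSearch field rather than the raw JSON property.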

Related

Use NLog to populate pre-exisiting log tables in SQLDB

I have been doing a bunch of searching on this and have come up blank. My research has shown me that I can use NLog to write logs to the database, but the examples always involve creating a new table with specific columns and stored procedures.
I have about 10 different systems, each with its own custom-built logging functions and database tables. What I am looking to do is replace our current custom-built logging logic with NLog, but I need NLog to use my existing tables and schema to store the logs. I cannot modify the log tables in any way, as they are tightly coupled to other functions that I cannot change at this time. So I need NLog to conform to my log table schema.
Is this even possible with NLog?
Any details would be greatly appreciated.
https://github.com/nlog/NLog/wiki/Database-target
Your configuration would look something like:
<target name="db"
        xsi:type="Database"
        connectionStringName="NLogConn"
        commandText="INSERT INTO MyExistingTable (FieldA, FieldB, FieldC) VALUES (#a, #b, #c)">
  <parameter name="#a" layout="${machinename}" />
  <parameter name="#b" layout="${date}" />
  <parameter name="#c" layout="${exception:tostring}" />
</target>
The layout can be anything from https://github.com/nlog/nlog/wiki/Layout-Renderers, or a combination thereof. You can also target multiple schemas by adding different targets with different names.
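One detail worth spelling out: the target only receives log events once a logging rule routes to it, so the Database target has to sit inside a full nlog.config. A minimal skeleton might look like this (the target body is elided here, and the minlevel is just an example):

```xml
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <targets>
    <!-- the Database target shown above goes here -->
  </targets>
  <rules>
    <!-- route everything at Info and above into the existing table -->
    <logger name="*" minlevel="Info" writeTo="db" />
  </rules>
</nlog>
```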

Sitecore: Best practices for setting up Solr core for multisite

I understand that we can create any number of cores in Solr, but I'm a bit unclear on best practices for setting up Solr cores for a Sitecore multisite implementation.
Scenario:
We have 5 sites in one Sitecore instance. Each site has a search capability requirement.
Best practice questions:
Is it OK for all sites to share the core and master indexes, and to create site-specific cores for the web index?
Do we need to set up a configuration file (or similar) to establish the relationship between each site and its Solr core? I guess each site then needs its own Sitecore Solr index configuration file, e.g. Sitecore.ContentSearch.Solr.SiteA.Index.Web (a copy of the out-of-the-box index file with a custom index name)?
Do we need to write any smartness (site context) so that search uses the correct Solr core?
The answer to this question is really opinion-based; there is no one solution that is ideal in every situation.
There was even a discussion on https://sitecorechat.slack.com about it.
First of all, you need to think about whether you want to reuse any content between the sites. If yes, then you can use the Sitecore out-of-the-box indexes; you will not gain that much from separate indexes. Of course you can create separate indexes, but some of the content will then have to be crawled multiple times.
If you want to create separate indexes:
Leave sitecore_core_index, sitecore_master_index and sitecore_web_index as they are - they will be used by Sitecore to run Content Editor search and other Sitecore background searches.
Create Solr master and web cores for every site, e.g. site1_master_index and site1_web_index.
For every site, duplicate the configuration for sitecore_master_index and sitecore_web_index and set proper locations of the content roots.
E.g. create a copy of Sitecore.ContentSearch.Solr.Index.Master.config file (Site1.ContentSearch.Solr.Index.Master.config), and change its content to:
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <contentSearch>
      <configuration type="Sitecore.ContentSearch.ContentSearchConfiguration, Sitecore.ContentSearch">
        <indexes hint="list:AddIndex">
          <index id="site1_master_index" type="Sitecore.ContentSearch.SolrProvider.SolrSearchIndex, Sitecore.ContentSearch.SolrProvider">
            <param desc="name">$(id)</param>
            <param desc="core">$(id)</param>
            <param desc="propertyStore" ref="contentSearch/indexConfigurations/databasePropertyStore" param1="$(id)" />
            <configuration ref="contentSearch/indexConfigurations/defaultSolrIndexConfiguration" />
            <strategies hint="list:AddStrategy">
              <strategy ref="contentSearch/indexConfigurations/indexUpdateStrategies/syncMaster" />
            </strategies>
            <locations hint="list:AddCrawler">
              <content type="Sitecore.ContentSearch.SitecoreItemCrawler, Sitecore.ContentSearch">
                <Database>master</Database>
                <Root>/sitecore/content/site1</Root>
              </content>
              <media type="Sitecore.ContentSearch.SitecoreItemCrawler, Sitecore.ContentSearch">
                <Database>master</Database>
                <Root>/sitecore/media library/site1</Root>
              </media>
            </locations>
          </index>
        </indexes>
      </configuration>
    </contentSearch>
  </sitecore>
</configuration>
This configuration tells Sitecore to index only /sitecore/content/site1 and /sitecore/media library/site1 locations in this index.
Use convention to get proper index search, e.g.:
ISearchIndex SearchIndex
{
    get
    {
        return ContentSearchManager.GetIndex(
            Sitecore.Context.Site.Name + "_" + Sitecore.Context.Database.Name + "_index");
    }
}
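For context, a hedged sketch of how such a property might be consumed with the ContentSearch LINQ provider (requires using System.Linq; the template-name filter is purely illustrative):

```csharp
// Sketch: querying whichever per-site index the naming convention resolved.
using (var context = SearchIndex.CreateSearchContext())
{
    var results = context.GetQueryable<Sitecore.ContentSearch.SearchTypes.SearchResultItem>()
        .Where(i => i.TemplateName == "Product")   // illustrative filter only
        .ToList();
}
```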

Sitecore search with custom index

I'm attempting to use Sitecore Search with a custom index to filter and search items.
The items are product reviews and are all stored in a single folder with a ProductReview template.
+ Reviews
- Sample Review 1
- Sample Review 2
- Sample Review 3
The users will be able to filter items by category, subcategory and search by product name. So the form will look similar to this:
Category: [ Drop Down ]
Sub Category: [ Drop Down ]
Product name: [ Single line of Text ]
[ Search Button ]
I'm finding the documentation for defining indexes very thin. I'm trying to set up the index with the following properties:
It should index the web database
It should include only those three fields, as they're all I'll need
Only items based on the review template will be indexed
The two category fields don't need to be tokenised
You can filter on the category fields
I'm not sure if I need a custom Analyzer or DatabaseCrawler and I haven't looked into making one at all.
This is what I have so far, however I haven't produced a working index yet:
<index id="reviews" type="Sitecore.Search.Index, Sitecore.Kernel">
  <param desc="name">$(id)</param>
  <param desc="folder">reviews</param>
  <Analyzer ref="search/analyzer" />
  <include hint="list:IncludeField">
    <!-- Category -->
    <fieldId>Category</fieldId>
    <!-- Sub Category -->
    <fieldId>Sub Category</fieldId>
    <!-- Product Name -->
    <fieldId>Product Name</fieldId>
  </include>
  <locations hint="list:AddCrawler">
    <web type="Sitecore.Search.Crawlers.DatabaseCrawler, Sitecore.Kernel">
      <Database>web</Database>
      <!-- {GUID} -->
      <Root>{GUID}</Root>
      <Tags>web reviews</Tags>
      <IndexAllFields>false</IndexAllFields>
      <templates hint="list:AddTemplate">
        <!-- Product Review -->
        <reviews>Product Review</reviews>
      </templates>
    </web>
  </locations>
</index>
Any pointers would be greatly appreciated.
Edit
The two main things I'm looking for are:
How to index the category field without tokenizing it.
How to filter using that field with the Lucene.net API.
Using the SitecoreSearchContrib (aka Advanced Database Crawler) library will make this much easier for you, both in indexing and searching. The library includes example configs that will make it more obvious to you how you should set things up.
Some initial pointers, even if you don't use SitecoreSearchContrib:
You'll want to index master as well, so that this functionality works in Preview mode. The above library will automatically search the correct database, based on the context the code is running in.
Your template inclusion in the index should be a template GUID.
Your field inclusions should be GUIDs as well.
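Applying those last two pointers to the config in the question, the include and template entries would reference item IDs rather than names. A sketch of what those sections might become (the braced values are placeholders for your own item GUIDs, and the element names are kept from the question's config):

```xml
<include hint="list:IncludeField">
  <!-- Category: use the field definition item's ID, not its name -->
  <fieldId>{CATEGORY-FIELD-GUID}</fieldId>
</include>
...
<templates hint="list:AddTemplate">
  <!-- Product Review: use the template item's ID -->
  <reviews>{PRODUCT-REVIEW-TEMPLATE-GUID}</reviews>
</templates>
```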

Pulling useful info from a MOSS 2007 Person or Group Field dumped via SSIS package

I’ve got a list defined that, as one of the fields (actually a few of the fields), has a Person or Group picker. I’m extracting the entire list through an SSIS package using the SharePoint List Source data flow source dumping into a SQL table. The PoG picker field dumps its data like so (each line being a single data item):
163;#Jones, Danny S.
179;#Smith, Sandra M.
164;#Thomas, Rob Y.
161;#Ross, Danny L.
2064;#Smith, Michael D.
I would guess that the number preceding the ;# is some sort of user ID that SharePoint keeps with the user, instead of something helpful like an ADS GUID. Can I use SSIS to pull SharePoint's user profiles so I can match the ID shown to an ADS GUID or ADS username, and if so, how? I've tried using the Web Service task within SSIS to call the User Profile service (http://www.my.site/_vti_bin/UserProfileService.asmx), but I get an error about the WSDL being a wrong version.
Unfortunately, the ID shown in site fields is local to the list of users for that site.
Each user is uniquely identified by the site and list GUIDs along with the ID field, but the ID is not unique across user lists, and so cannot be used for anything other than indexing into that table.
The other issue with this data is that the profile display name is updated regularly by one of the UserProfileSynchronization service timer jobs. I have seen times when the display name of the user is not updated correctly and is set to the account name from Active Directory.
To get an idea of what is going on under the hood, have a look at the All_UserData table in a content database.
In Summary
Only the name part of the field is usable in a meaningful way, and even that is not completely reliable, but it is perhaps good enough.
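If the name part is all you need, splitting the raw lookup value on its ";#" delimiter is straightforward; a minimal sketch in plain C# with no SharePoint dependency:

```csharp
using System;

class LookupParser
{
    // Splits a SharePoint lookup value such as "163;#Jones, Danny S."
    // into the site-local user ID and the display name.
    public static void Parse(string raw, out int id, out string name)
    {
        string[] parts = raw.Split(new[] { ";#" }, StringSplitOptions.None);
        id = int.Parse(parts[0]);
        name = parts[1];
    }
}
```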
Can you modify the fields that are exported from SharePoint? Can you add a calculated person field based on this field? If so, then you can have that Person field store a different form of person data like their username or e-mail address which are far more useful in interacting with other systems.
Nat's answer is close. It's actually the UserInfo table; the numbers correspond to that table's tp_ID column. Unfortunately, I still can't figure out how to pull this info using SSIS, so I'm resorting to writing a console app that pulls the table's data through the SharePoint web service and dumps it into a database table, scheduled with Windows Task Scheduler. Also, because of how SharePoint works, each root site collection has different IDs for each person, so I'll need to make separate pulls for each root site collection. Here's the method I'm using:
private static XElement GetUserInfo(string siteCollectionListsSvc)
{
    SharepointListsSvc.ListsSoapClient ws = new SharepointListsSvc.ListsSoapClient();
    ws.Endpoint.Address = new System.ServiceModel.EndpointAddress(siteCollectionListsSvc);
    ws.ClientCredentials.Windows.AllowedImpersonationLevel = System.Security.Principal.TokenImpersonationLevel.Impersonation;
    ws.ClientCredentials.Windows.AllowNtlm = true;
    ws.ClientCredentials.Windows.ClientCredential = (System.Net.NetworkCredential)System.Net.CredentialCache.DefaultCredentials;
    XElement userInfo = ws.GetListItems("UserInfo", String.Empty, null, null, "4000", null, null);
    return userInfo;
}
The method argument would be something like "http://www.my.site/_vti_bin/lists.asmx". My app config that sets up the binding and endpoint:
<configuration>
  <system.serviceModel>
    <bindings>
      <basicHttpBinding>
        <binding name="ListsSoap" closeTimeout="00:01:00" openTimeout="00:01:00"
                 receiveTimeout="00:10:00" sendTimeout="00:01:00" allowCookies="false"
                 bypassProxyOnLocal="false" hostNameComparisonMode="StrongWildcard"
                 maxBufferSize="5000000" maxBufferPoolSize="524288" maxReceivedMessageSize="5000000"
                 messageEncoding="Text" textEncoding="utf-8" transferMode="Buffered"
                 useDefaultWebProxy="true">
          <readerQuotas maxDepth="32" maxStringContentLength="8192" maxArrayLength="16384"
                        maxBytesPerRead="4096" maxNameTableCharCount="16384" />
          <security mode="TransportCredentialOnly">
            <transport clientCredentialType="Ntlm" proxyCredentialType="None" realm="" />
            <message clientCredentialType="UserName" algorithmSuite="Default" />
          </security>
        </binding>
      </basicHttpBinding>
    </bindings>
    <client>
      <endpoint address="http://www.my.site/_vti_bin/lists.asmx"
                binding="basicHttpBinding" bindingConfiguration="ListsSoap"
                contract="SharepointListsSvc.ListsSoap" name="ListsSoap" />
    </client>
  </system.serviceModel>
</configuration>
Notice that I increased the //binding/@maxBufferSize and //binding/@maxReceivedMessageSize attributes from the default of 65536 to 5000000. We've got about 3000 records that could be returned, and the default size wasn't nearly big enough. Since these are all internal calls, I'm not worried about network lag. The other changes from the default binding are in the //security element, specifically the @mode and //transport/@clientCredentialType attributes.
When you get the XML back, the number (stored in the PoG field) is in the //z:row/@ows_ID attribute, and the corresponding ADS login is in the //z:row/@ows_Name attribute. You also get the email address back in the //z:row/@ows_EMail attribute.
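Those rows could then be flattened into an ID-to-login lookup; a sketch using LINQ to XML (requires using System.Linq, System.Xml.Linq and System.Collections.Generic; "#RowsetSchema" is the standard rowset namespace used by the SharePoint list web services):

```csharp
// Sketch: building an ID → ADS login map from the GetListItems response.
XNamespace z = "#RowsetSchema";
Dictionary<string, string> idToLogin = GetUserInfo("http://www.my.site/_vti_bin/lists.asmx")
    .Descendants(z + "row")
    .ToDictionary(
        r => (string)r.Attribute("ows_ID"),
        r => (string)r.Attribute("ows_Name"));
```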
Hope this helps others get through the same issue!

Default Content Type and Content type order using Folders in a Sharepoint List

I have a custom list definition (schema.xml). I have set up site columns (through a feature) and numerous content types, also through a feature, and this all works fine. In the list definition I have put the content types at the top in the order I want them to appear, assuming that the top content type will be the default for the list, which is what I want.
NOTE: I have set
EnableContentTypes="true"
Here is an excerpt from my schema.xml:
<ContentTypes>
  <!-- Folder-based content type -->
  <ContentTypeRef ID="0x0120006ad66a4924644ac98d371a0e069c5d99" />
  <!-- Item-based content type -->
  <ContentTypeRef ID="0x0100a18ddd58b9384567bc776a3c5889ea77" />
  <!-- ..... more content types ... -->
</ContentTypes>
The problem I have is that when a list is provisioned, the folder-based content type is always second in the list, and as a result is never the default. The only way I can make it the default is to remove all the other content type declarations, which rather defeats the object of having multiple content types. Is this ordering because of the way the list is provisioned, or am I missing something tucked away deep in the SDK? Any help gratefully accepted.
Using a FeatureReceiver, you can set the list's RootFolder.UniqueContentTypeOrder to an ordered list of content types, which will then determine the button order.
You can also leave out content types that you don't want to be available even though they are defined on the list. It's common to combine this with an event receiver that sets the button order (UniqueContentTypeOrder) of added folders, to guide people into a certain structure, like only folders at the top level and no nested folders (or only x levels deep), or whatever you like.
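A minimal sketch of such a FeatureReceiver, assuming a web-scoped feature (the list and content type names here are placeholders, not from the question):

```csharp
// Sketch: reorder the list's content types so the folder-based one is default.
// The first entry in UniqueContentTypeOrder becomes the default (top New-button item).
public class ContentTypeOrderReceiver : Microsoft.SharePoint.SPFeatureReceiver
{
    public override void FeatureActivated(
        Microsoft.SharePoint.SPFeatureReceiverProperties properties)
    {
        var web = (Microsoft.SharePoint.SPWeb)properties.Feature.Parent;
        var list = web.Lists["My List"];              // placeholder list name
        var folder = list.RootFolder;

        var order = new System.Collections.Generic.List<Microsoft.SharePoint.SPContentType>
        {
            list.ContentTypes["My Folder Content Type"],  // placeholder names
            list.ContentTypes["My Item Content Type"]
        };
        folder.UniqueContentTypeOrder = order;
        folder.Update();
    }
}
```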
Check the property bag. The list you're working on probably has a property called "vti_contenttypeorder", with the content type IDs in the order they will show up when you go reordering them through the UI.
...building on @Renan's answer: set your default content type declaratively.
Create a module, name it "Property bags", and add it to the web-scoped feature containing your list instance. The Elements.xml should contain the following:
<?xml version="1.0" encoding="utf-8"?>
<Elements xmlns="http://schemas.microsoft.com/sharepoint/">
  <PropertyBag Url="<!--URL of your list instance here-->" ParentType="Folder" RootWebOnly="FALSE" AlwaysCreateFolder="TRUE" xmlns="http://schemas.microsoft.com/sharepoint/">
    <Property Name="vti_contenttypeorder" Value="<!--Your custom Content Type ID here-->" Type="string" />
  </PropertyBag>
</Elements>
