Use NLog to populate pre-existing log tables in SQLDB

I have been doing a bunch of searching on this and have come up blank. My research has shown me that I can use NLog to write logs to the database, but this always involves creating a new table with specific columns and stored procedures.
I have about 10 different systems, each with its own custom-built logging functions and database tables. What I am looking to do is replace our current custom-built logging logic with NLog, but I need NLog to then use my existing tables and schema to store the logs. I cannot modify the log tables in any way, as they are tightly coupled to other functions that I cannot change at this time, so I need NLog to conform to my log table schema.
Is this even possible with NLog?
Any details would be greatly appreciated.

https://github.com/nlog/NLog/wiki/Database-target
Your configuration would look something like:
<target name="db"
xsi:type="Database"
connectionStringName="NLogConn"
commandText="INSERT INTO MyExistingTable (FieldA, FieldB, FieldC) VALUES (#a, #b, #c)">
<parameter name="#a" layout="${machinename}" />
<parameter name="#b" layout="${date}" />
<parameter name="#c" layout="${exception:tostring}" />
</target>
The layout is anything from https://github.com/nlog/nlog/wiki/Layout-Renderers or a combination thereof. You can also define multiple schemas by adding different targets with different names.
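On the application side nothing special is required beyond a rule that routes events to this target, e.g. <logger name="*" minlevel="Error" writeTo="db" /> in the <rules> section of NLog.config. As a rough sketch of the calling code (the class and message below are illustrative, not from the question):
using System;
using NLog;

public class OrderService
{
    private static readonly Logger Logger = LogManager.GetCurrentClassLogger();

    public void Process()
    {
        try
        {
            // ... existing business logic ...
        }
        catch (Exception ex)
        {
            // ${machinename}, ${date} and ${exception:tostring} are rendered by the
            // database target's parameters, so only the exception is passed here.
            Logger.Error(ex, "Order processing failed");
            throw;
        }
    }
}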

Related

How to remove OOTB attributes from Backoffice

I need to remove out-of-the-box attributes from Backoffice; these attributes are defined in multiple OOTB extensions.
I am not sure whether you want to hide these attributes from Backoffice or remove them completely from the database.
If you just want to hide them, you can add the following custom property for these attributes in the items.xml:
<custom-properties>
  <property name="hiddenForUI">
    <value>Boolean.TRUE</value>
  </property>
</custom-properties>
Please check core-items.xml for some examples.
Alternatively, you can import the following ImpEx:
INSERT_UPDATE AttributeDescriptor;qualifier[unique=true];enclosingType(code)[unique=true];hiddenForUI
;the-attribute-to-be-hidden;the-itemtype-to-which-the-attribute-belongs;TRUE
If you want them to be removed completely from the database, you can do the following things:
Identify the extensions in which they have been defined, and if any of those extensions are not required, remove them from localextensions.xml.
Remove these attributes from their respective items.xml files.
Then execute ant clean all updatesystem. However, the columns corresponding to these attributes will still persist in the database, because a system update does not remove or drop any table or column (it can only add new tables/columns and add/update/remove data). In most cases a system initialization is not an option either, so you are left with only one choice: deleting the columns from the database with SQL queries.

Sitecore: Best practices for setting up Solr core for multisite

I understand that we can create N number of cores in Solr, but I'm a bit unclear on the best practices for setting up Solr cores for a Sitecore multisite implementation.
Scenario:
We have 5 sites in one Sitecore instance. Each site has a search capability requirement.
Best practice questions:
1. Is it OK for all the sites to share the core and master indexes?
2. And should we create site-specific cores for the web index?
3. Do we need to set up a configuration file or similar in order to establish the relationship between a site and its Solr core?
3.1. And I guess each site needs to have its own Sitecore Solr index configuration file?
e.g. Sitecore.ContentSearch.Solr.SiteA.Index.Web (a copy of the out-of-the-box index file containing a custom index name?)
4. Do we need to write any logic (site context) so that search will use the correct Solr core?
The answer to this question is really opinion-based. There is no single solution that would be ideal in every situation.
There was even a discussion on https://sitecorechat.slack.com about it.
First of all, you need to think about whether you want to reuse any content between the sites. If so, you can use the Sitecore out-of-the-box indexes; you will not get that much from separate indexes. Of course you can still create separate indexes, but some of the content will have to be crawled multiple times.
If you want to create separate indexes:
Leave sitecore_core_index, sitecore_master_index and sitecore_web_index as they are - they will be used by Sitecore to run Content Editor search and other Sitecore background searches.
Create Solr master and web cores for every site, e.g. site1_master_index and site1_web_index.
For every site, duplicate the configuration for sitecore_master_index and sitecore_web_index and set proper locations of the content roots.
E.g. create a copy of the Sitecore.ContentSearch.Solr.Index.Master.config file (Site1.ContentSearch.Solr.Index.Master.config) and change its content to:
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/">
  <sitecore>
    <contentSearch>
      <configuration type="Sitecore.ContentSearch.ContentSearchConfiguration, Sitecore.ContentSearch">
        <indexes hint="list:AddIndex">
          <index id="site1_master_index" type="Sitecore.ContentSearch.SolrProvider.SolrSearchIndex, Sitecore.ContentSearch.SolrProvider">
            <param desc="name">$(id)</param>
            <param desc="core">$(id)</param>
            <param desc="propertyStore" ref="contentSearch/indexConfigurations/databasePropertyStore" param1="$(id)" />
            <configuration ref="contentSearch/indexConfigurations/defaultSolrIndexConfiguration" />
            <strategies hint="list:AddStrategy">
              <strategy ref="contentSearch/indexConfigurations/indexUpdateStrategies/syncMaster" />
            </strategies>
            <locations hint="list:AddCrawler">
              <content type="Sitecore.ContentSearch.SitecoreItemCrawler, Sitecore.ContentSearch">
                <Database>master</Database>
                <Root>/sitecore/content/site1</Root>
              </content>
              <media type="Sitecore.ContentSearch.SitecoreItemCrawler, Sitecore.ContentSearch">
                <Database>master</Database>
                <Root>/sitecore/media library/site1</Root>
              </media>
            </locations>
          </index>
        </indexes>
      </configuration>
    </contentSearch>
  </sitecore>
</configuration>
This configuration tells Sitecore to index only /sitecore/content/site1 and /sitecore/media library/site1 locations in this index.
Use a convention to get the proper search index, e.g.:
ISearchIndex SearchIndex
{
    get
    {
        return ContentSearchManager.GetIndex(Sitecore.Context.Site.Name + "_" + Sitecore.Context.Database.Name + "_index");
    }
}
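A hedged sketch of how that property can then be consumed (the SearchResultItem query below is illustrative, not part of the original answer; it assumes the Sitecore.ContentSearch, Sitecore.ContentSearch.SearchTypes, System.Collections.Generic and System.Linq namespaces):
using (IProviderSearchContext context = SearchIndex.CreateSearchContext())
{
    // The query runs against whichever site-specific index the naming convention resolved.
    List<SearchResultItem> results = context.GetQueryable<SearchResultItem>()
        .Where(item => item.Content.Contains("keyword"))
        .ToList();
}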

How to count the query result on multiple solr cores at once

Using the multicore functionality of Solr, we implemented multiple cores in one Solr instance, with each core holding specific information: one core stores hotel details along with the city, another stores details of events happening in the city, and a third stores details about restaurants in the city. There are other parameters as well, which is why we created separate cores.
My application offers search by keyword, and I want to show a list like:
hotel(numFound)
Events(numFound)
Restaurants(numFound)
Then the user can drill down into whatever interests them.
How can we achieve this by querying all cores at once and getting the number of records found (numFound) for each core?
Solr.xml:
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<solr sharedLib="lib" persistent="true">
  <cores adminPath="/admin/cores" hostPort="${jetty.port:8983}" hostContext="${hostContext:solr}">
    <core default="true" instanceDir="hotelDetails" name="hotelDetails"/>
    <core default="false" instanceDir="Events" name="Events"/>
    <core default="false" instanceDir="Restaurants" name="Restaurants"/>
  </cores>
</solr>
You need to use Solr's faceted search mechanism; the Solr faceting documentation will guide you through it.
There might be many ways to achieve what you need. Here's my solution:
First, create a custom field called "type". In each core, set its value accordingly in every document (e.g. for hotelDetails, type:hotel).
Then do a distributed search using faceting. Add the following parameters to your search query and you'll get the result you expect.
For distributed search add
/select?collection=hotelDetails,Events,Restaurants
For faceting
facet=true&facet.field=type
Try the following queries; both of them should work.
http://localhost:8983/solr/hotelDetails/select?shards=localhost:8983/solr/hotelDetails,localhost:8983/solr/Events,localhost:8983/solr/Restaurants&q=some_query
and
http://localhost:8983/solr/hotelDetails/select?collection=hotelDetails,Events,Restaurants&q=some_query
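As an illustration (not from the original answer), here is a minimal C# sketch that issues the second query with rows=0 and wt=json added and prints the per-type counts from the facet response; the host, core names, q=some_query and the type field all come from the examples above:
using System;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

class SolrFacetCounts
{
    static async Task Main()
    {
        using var client = new HttpClient();
        string url = "http://localhost:8983/solr/hotelDetails/select"
                   + "?collection=hotelDetails,Events,Restaurants"
                   + "&q=some_query&rows=0&facet=true&facet.field=type&wt=json";

        string json = await client.GetStringAsync(url);
        using JsonDocument doc = JsonDocument.Parse(json);

        // facet_fields.type is a flat array alternating value and count,
        // e.g. ["hotel", 12, "event", 7, "restaurant", 3].
        JsonElement counts = doc.RootElement
            .GetProperty("facet_counts")
            .GetProperty("facet_fields")
            .GetProperty("type");

        for (int i = 0; i < counts.GetArrayLength(); i += 2)
        {
            Console.WriteLine($"{counts[i].GetString()}({counts[i + 1].GetInt32()})");
        }
    }
}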

Setting created and modified dates for Sharepoint 2010 list items

I'm working on a Powershell script to migrate data from an existing SQL Server database into Sharepoint 2010 lists. Some of the records in the existing database have created and modified dates which the client would like to carry over into Sharepoint.
My migration script uses the UpdateListItems method on the SharePoint Lists web service to upload batches of CAML that create the new items. I have updated my CAML to set the "Created" column; however, the value seems to get ignored and is just set to the current date.
Is it possible to manually set these dates, either through the web services or through the ProcessBatchData method on the SPWeb object? I've seen examples online which imply that it can be done for individual items by disabling system update and tweaking versioning settings. However, working with individual items is not an option since we have about 800,000 list items to import.
The solution was to change from using the web services to using SPWeb.ProcessBatchData to import my data (thanks Andreas).
The XML passed into this method looks like:
<ows:Batch OnError="Continue">
  <Method ID="1">
    <SetList />
    <SetVar Name="ID">New</SetVar>
    <SetVar Name="Cmd">Save</SetVar>
    <SetVar Name="urn:schemas-microsoft-com:office:office#ColumnName1">Value</SetVar>
    <SetVar Name="urn:schemas-microsoft-com:office:office#Modified">2009-09-03T15:05:00Z</SetVar>
    <SetVar Name="urn:schemas-microsoft-com:office:office#Created">2004-01-15T13:48:00Z</SetVar>
  </Method>
  <Method ID="2">
    ...
  </Method>
  ...
</ows:Batch>
The "SetList" element should contain the Guid for the list to add the data too. In my example XML above this is empty because the XML is pre-generated before importing the data into SharePoint and we can't guarantee that the Sharepoint list Guid on the target server would be the same so we fill this in just before importing.
I also had to ensure that the dates being passed in where in the correct format by passing them into the SPUtility.CreateISO8601DateTimeFromSystemDateTime method.
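For reference, a minimal server-side sketch of the call (the class, method and parameter names are mine; it assumes batchXml holds the <ows:Batch> document shown above with the list GUID filled into SetList):
using System;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Utilities;

class BatchImporter
{
    // Runs on the SharePoint server and submits the pre-generated batch in one call.
    static void ImportBatch(string siteUrl, string batchXml)
    {
        using (SPSite site = new SPSite(siteUrl))
        using (SPWeb web = site.OpenWeb())
        {
            // ProcessBatchData returns an XML fragment with a result code per Method element.
            string results = web.ProcessBatchData(batchXml);
            Console.WriteLine(results);
        }
    }

    // The Created/Modified SetVar values embedded in the batch should be formatted
    // with SPUtility so SharePoint accepts them.
    static string ToSharePointDate(DateTime value)
    {
        return SPUtility.CreateISO8601DateTimeFromSystemDateTime(value);
    }
}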

Pulling useful info from a MOSS 2007 Person or Group Field dumped via SSIS package

I’ve got a list defined that, as one of the fields (actually a few of the fields), has a Person or Group picker. I’m extracting the entire list through an SSIS package using the SharePoint List Source data flow source dumping into a SQL table. The PoG picker field dumps its data like so (each line being a single data item):
163;#Jones, Danny S.
179;#Smith, Sandra M.
164;#Thomas, Rob Y.
161;#Ross, Danny L.
2064;#Smith, Michael D.
I would guess that the number preceding the ;# is some sort of user ID that SharePoint keeps for the user instead of something helpful like an ADS GUID. Can I use SSIS to pull SharePoint's user profiles so I can match the ID shown to an ADS GUID or ADS username, and if so, how? I've tried using the Web Service task within SSIS to call the User Profile service (http://www.my.site/_vti_bin/UserProfileService.asmx), but I get an error about the WSDL being a wrong version.
Unfortunately, the ID shown in site fields is local to the list of users against that site.
Each user is uniquely identified by the site and list GUIDs along with the ID field, but the ID is not unique across user lists, so it cannot be used for anything other than indexing into that table.
The other issue with this data is that the profile display name is updated regularly by one of the UserProfileSynchronization service timer jobs. I have experienced times when the display name of the user is not updated correctly and is set to the account name from Active Directory.
To get an idea of what is going on under the hood, have a look at the All_UserData table in a content database.
In Summary
Only the name part of the field is usable in a meaningful way and even that is not completely reliable, but good enough perhaps.
Can you modify the fields that are exported from SharePoint? Can you add a calculated person field based on this field? If so, then you can have that Person field store a different form of person data like their username or e-mail address which are far more useful in interacting with other systems.
Nat's answer is close. It's actually the UserInfo table, and the numbers correspond to that table's tp_ID column. Unfortunately, I still can't figure out how to pull this info using SSIS, so I'm resorting to writing a console app that pulls the table's data through the SharePoint web service, dumps it into a database table, and is scheduled with Windows Task Scheduler. Also, because of how SharePoint works, each root site collection has different IDs for each person, so I'll need to make separate pulls for each root site collection. Here's the method I'm using:
private static XElement GetUserInfo(string siteCollectionListsSvc)
{
    SharepointListsSvc.ListsSoapClient ws = new SharepointListsSvc.ListsSoapClient();
    ws.Endpoint.Address = new System.ServiceModel.EndpointAddress(siteCollectionListsSvc);
    ws.ClientCredentials.Windows.AllowedImpersonationLevel = System.Security.Principal.TokenImpersonationLevel.Impersonation;
    ws.ClientCredentials.Windows.AllowNtlm = true;
    ws.ClientCredentials.Windows.ClientCredential = (System.Net.NetworkCredential)System.Net.CredentialCache.DefaultCredentials;
    XElement userInfo = ws.GetListItems("UserInfo", String.Empty, null, null, "4000", null, null);
    return userInfo;
}
The method argument would be something like "http://www.my.site/_vti_bin/lists.asmx". My app config that sets up the binding and endpoint:
<configuration>
  <system.serviceModel>
    <bindings>
      <basicHttpBinding>
        <binding name="ListsSoap" closeTimeout="00:01:00" openTimeout="00:01:00"
                 receiveTimeout="00:10:00" sendTimeout="00:01:00" allowCookies="false"
                 bypassProxyOnLocal="false" hostNameComparisonMode="StrongWildcard"
                 maxBufferSize="5000000" maxBufferPoolSize="524288" maxReceivedMessageSize="5000000"
                 messageEncoding="Text" textEncoding="utf-8" transferMode="Buffered"
                 useDefaultWebProxy="true">
          <readerQuotas maxDepth="32" maxStringContentLength="8192" maxArrayLength="16384"
                        maxBytesPerRead="4096" maxNameTableCharCount="16384" />
          <security mode="TransportCredentialOnly">
            <transport clientCredentialType="Ntlm" proxyCredentialType="None"
                       realm="" />
            <message clientCredentialType="UserName" algorithmSuite="Default" />
          </security>
        </binding>
      </basicHttpBinding>
    </bindings>
    <client>
      <endpoint address="http://www.my.site/_vti_bin/lists.asmx"
                binding="basicHttpBinding" bindingConfiguration="ListsSoap"
                contract="SharepointListsSvc.ListsSoap" name="ListsSoap" />
    </client>
  </system.serviceModel>
</configuration>
Notice that I increased the //binding/@maxBufferSize and //binding/@maxReceivedMessageSize from the default of 65536 to 5000000. We've got about 3000 records that could be returned, and the default size wasn't nearly big enough. Since these are all internal calls, I'm not worried about network lag. Other changes from the default binding are in the //security element, specifically the @mode and //transport/@clientCredentialType attributes.
When you get the XML back, the number (stored in the PoG field) is in the //z:row/@ows_ID attribute, and the corresponding ADS login is in the //z:row/@ows_Name attribute. You also get the email address back in the //z:row/@ows_EMail attribute.
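If it helps, here is a short sketch of turning that XML into an ID-to-login lookup that can be joined against the ;# values (the method and dictionary names are mine; #RowsetSchema is the namespace SharePoint uses for the z:row elements):
using System.Collections.Generic;
using System.Linq;
using System.Xml.Linq;

// Maps the numeric user ID (the part before ;# in the PoG field) to the ADS login name.
private static Dictionary<int, string> MapIdsToLogins(XElement userInfo)
{
    XNamespace z = "#RowsetSchema"; // namespace of the z:row elements
    return userInfo
        .Descendants(z + "row")
        .ToDictionary(
            row => (int)row.Attribute("ows_ID"),
            row => (string)row.Attribute("ows_Name"));
}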
Hope this helps others get through the same issue!
