In some instances, I prefer working with custom objects instead of strongly typed datasets and data rows. However, it seems like Microsoft Reporting (included with VS2005) requires strongly typed datasets.
Is there a way to use my custom objects to design and populate reports?
I found the answer. Yes, it's possible. You just have to add the custom object as a data source in Visual Studio:
http://www.gotreportviewer.com/objectdatasources/index.html
I could never choose one of my own POCOs from my project as the report's model in the Report Data setup; the 'global' option mentioned in the walkthrough simply wasn't there. So I ended up editing the RDLC XML by hand to define the type and an imitation data source (which does not actually exist in my project).
I assign the data (of type Aies.Core.Model.Invoice.MemberInvoice) to the report in code:
reportViewer.LocalReport.DataSources.Add(new ReportDataSource("MemberInvoice", new[] { invoice1 }));
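For context, the surrounding setup looks roughly like this (the embedded-resource name and the refresh call are assumptions about the rest of my form code, not something prescribed by the walkthrough):
// Hypothetical surrounding setup; the .rdlc resource name is made up.
reportViewer.LocalReport.ReportEmbeddedResource = "MyApp.Reports.MemberInvoice.rdlc";
reportViewer.LocalReport.DataSources.Clear();

// The data source name must match the DataSet name defined in the RDLC below.
reportViewer.LocalReport.DataSources.Add(
    new ReportDataSource("MemberInvoice", new[] { invoice1 }));
reportViewer.RefreshReport();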
And the custom definition is:
<DataSources>
  <DataSource Name="MemberInvoice">
    <ConnectionProperties>
      <DataProvider>System.Data.DataSet</DataProvider>
      <ConnectString>/* Local Connection */</ConnectString>
    </ConnectionProperties>
    <rd:DataSourceID>3fe04def-105a-4e9b-99db-630c1f8bb2c9</rd:DataSourceID>
  </DataSource>
</DataSources>
<DataSets>
  <DataSet Name="MemberInvoice">
    <Fields>
      <Field Name="MemberId">
        <DataField>MemberId</DataField>
        <rd:TypeName>System.Int32</rd:TypeName>
      </Field>
      <Field Name="DateOfIssue">
        <DataField>DateOfIssue</DataField>
        <rd:TypeName>System.DateTime</rd:TypeName>
      </Field>
      <Field Name="DateDue">
        <DataField>DateDue</DataField>
        <rd:TypeName>System.DateTime</rd:TypeName>
      </Field>
      <Field Name="Amount">
        <DataField>Amount</DataField>
        <rd:TypeName>System.Decimal</rd:TypeName>
      </Field>
    </Fields>
    <Query>
      <DataSourceName>MemberInvoice</DataSourceName>
      <CommandText>/* Local Query */</CommandText>
    </Query>
    <rd:DataSetInfo>
      <rd:DataSetName>Aies.Core.Model.Invoice</rd:DataSetName>
      <rd:TableName>MemberInvoiceData</rd:TableName>
      <rd:ObjectDataSourceSelectMethod>GetInvoices</rd:ObjectDataSourceSelectMethod>
      <rd:ObjectDataSourceSelectMethodSignature>System.Collections.Generic.IEnumerable`1[Aies.Core.Model.Invoice.MemberInvoice] GetInvoices()</rd:ObjectDataSourceSelectMethodSignature>
      <rd:ObjectDataSourceType>Aies.Core.Model.Invoice.MemberInvoiceData, Aies.Core, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null</rd:ObjectDataSourceType>
    </rd:DataSetInfo>
  </DataSet>
</DataSets>
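For reference, the POCO shape implied by those field definitions would be something like the following (the class body is inferred from the DataField/rd:TypeName pairs above, not copied from the project):
using System;

namespace Aies.Core.Model.Invoice
{
    public class MemberInvoice
    {
        public int MemberId { get; set; }         // System.Int32
        public DateTime DateOfIssue { get; set; } // System.DateTime
        public DateTime DateDue { get; set; }     // System.DateTime
        public decimal Amount { get; set; }       // System.Decimal
    }
}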
I believe you can set up SSRS to read data values from a more or less arbitrary object. This link describes the IDataReaderFieldProperties object in the API, which (IIRC) allows you to specify the getter method to invoke to get a value.
I am integrating from CRM to an Azure SQL DB, and I want to use the last ModifiedOn as the net-change watermark. This will help to update/insert only those records modified since the previous run. I can fetch the latest ModifiedOn date-time from SQL, but I am unable to pass it into the FetchXML query for the CRM source. The query I am using is as follows:
<fetch>
<entity name="msdyn_project">
<all-attributes />
<filter type="and">
<condition attribute="modifiedon" operator="on-or-after"
value="#{activity(\'LookupOldWaterMarkactivity\').output.firstRow.Prop_0}"/>
</filter>
</entity>
</fetch>
Please suggest if there is some other workaround for this.
In my experience, you can't put @{activity(...)} inside the FetchXML query text directly; it is only evaluated as dynamic content. So you could try using @concat to build the query:
@concat(
    '<fetch><entity name="">.....<condition attribute="modifiedon" operator="on-or-after" value="',
    activity('LookupOldWaterMarkactivity').output.firstRow.Prop_0,
    '" />....</fetch>')
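Note that if Prop_0 comes back from the lookup as a datetime rather than a string, you may need to convert it before concatenating, e.g. formatDateTime(activity('LookupOldWaterMarkactivity').output.firstRow.Prop_0, 'yyyy-MM-dd HH:mm:ss'); the exact format the on-or-after operator accepts is something to verify against CRM.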
I have been doing a bunch of searching on this and have come up blank. My research has shown me that I can use NLog to record logs to the DB, but this always involves creating a new table with specific columns and stored procedures.
I have about 10 different systems, each with its own custom-built logging functions and database tables. What I am looking to do is replace our current custom logging logic with NLog, but I need NLog to use my existing tables and schema to store the log. I cannot modify the log tables in any way, as they are tightly strapped to other functions that I cannot change at this time. So I need NLog to conform to my log table schema.
Is this even possible with NLog?
Any details would be greatly appreciated.
https://github.com/nlog/NLog/wiki/Database-target
Your configuration would look something like:
<target name="db"
        xsi:type="Database"
        connectionStringName="NLogConn"
        commandText="INSERT INTO MyExistingTable (FieldA, FieldB, FieldC) VALUES (@a, @b, @c)">
  <parameter name="@a" layout="${machinename}" />
  <parameter name="@b" layout="${date}" />
  <parameter name="@c" layout="${exception:tostring}" />
</target>
The layout can be anything from https://github.com/nlog/nlog/wiki/Layout-Renderers, or a combination thereof. You can also support multiple schemas by adding different targets with different names.
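As a quick usage sketch (the class and message here are made up, and it assumes a rules entry in NLog.config routes errors to the "db" target; the Error(Exception, string) overload is the NLog 4.x style):
using System;
using NLog;

public class OrderService
{
    // Logger named after this class; a rule such as
    // <logger name="*" minlevel="Error" writeTo="db" /> sends it to the DB target.
    private static readonly Logger Log = LogManager.GetCurrentClassLogger();

    public void Process()
    {
        try
        {
            // ... existing business logic ...
        }
        catch (Exception ex)
        {
            // The ${exception:tostring} layout on @c picks up "ex" from this call.
            Log.Error(ex, "Order processing failed");
            throw;
        }
    }
}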
I am trying to model my DB using this example from the Solr wiki.
I have a table called item and a table called features with id, featureName, description.
Here is the updated XML (I added featureName):
<dataConfig>
  <dataSource driver="org.hsqldb.jdbcDriver" url="jdbc:hsqldb:/temp/example/ex" user="sa" />
  <document>
    <entity name="item" query="select * from item">
      <entity name="feature" query="select description, featureName as features from feature where item_id='${item.ID}'"/>
    </entity>
  </document>
</dataConfig>
Now I get two separate lists in the XML element:
<doc>
  <arr name="featureName">
    <str>number of miles in every direction the universal cataclysm was gathering</str>
    <str>All around the Restaurant people and things relaxed and chatted. The</str>
    <str>- Do we have... - he put up a hand to hold back the cheers, - Do we</str>
  </arr>
  <arr name="description">
    <str>to a stupefying climax. Glancing at his watch, Max returned to the stage</str>
    <str>air was filled with talk of this and that, and with the mingled scents of</str>
    <str>have a party here from the Zansellquasure Flamarion Bridge Club from</str>
  </arr>
</doc>
But I would like the values to be paired together (using XML attributes) so that I don't have to join them afterwards.
Is it possible?
I wanted to suggest the ScriptTransformer, which gives you the flexibility to alter the data as needed, but it won't work in your case since it operates at the row level.
You can always define an aggregation function for string concatenation in SQL (example), but you may run into performance issues.
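For instance, a MySQL-style GROUP_CONCAT would collapse the features for each item into a single row; the exact syntax varies by database, and the HSQLDB instance from the config above would need its own aggregate, so treat this as a sketch:
-- Collapse all features for an item into one concatenated value,
-- so the sub-entity query returns a single row per item.
SELECT item_id,
       GROUP_CONCAT(featureName SEPARATOR ', ') AS features
FROM feature
GROUP BY item_id;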
If you were using an HTTP/XML data source, the solution would be the flatten attribute.
Nevertheless, the search functionality will work as expected even if you end up with multi-valued fields. The downside is on the client, where you will have to pair the values up before the presentation layer, which is not really a problem if you use some sort of pagination.
I am indexing a collection of XML documents with the following structure:
<mydoc>
  <id>1234</id>
  <name>Some Name</name>
  <experiences>
    <experience years="10" type="Java"/>
    <experience years="4" type="Hadoop"/>
    <experience years="1" type="Hbase"/>
  </experiences>
</mydoc>
Is there any way to build the Solr index so that it supports the following query:
find all docs with experience type "Hadoop" and years>=3
So far my best idea is to put delimited years||type values into a multiValued string field, search for all docs with type "Hadoop", and then iterate through the results to select years>=3. Obviously this is very inefficient for a large set of docs.
I think there is no obvious solution for indexing data that comes from a many-to-many relationship. In this case I would go with dynamic fields: http://wiki.apache.org/solr/SchemaXml#Dynamic_fields
Field definition in schema.xml:
<dynamicField name="experience_*" type="integer" indexed="true" stored="true"/>
So, using your example you would end up with something like this:
<mydoc>
  <id>1234</id>
  <name>Some Name</name>
  <experience_Java>10</experience_Java>
  <experience_Hadoop>4</experience_Hadoop>
  <experience_Hbase>1</experience_Hbase>
</mydoc>
Then you can use a range filter query; for the example above ("Hadoop" and years>=3) it would be fq=experience_Hadoop:[3 TO *] (note that the TO in Solr range queries must be uppercase).
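Put together, a full request might look like this (host, port, and handler path are assumptions): http://localhost:8983/solr/select?q=*:*&fq=experience_Hadoop:[3 TO *] (with the spaces and brackets URL-encoded in practice).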
I’ve got a list where one of the fields (actually a few of the fields) is a Person or Group picker. I’m extracting the entire list through an SSIS package using the SharePoint List Source data flow source, dumping into a SQL table. The PoG picker field dumps its data like so (each line being a single data item):
163;#Jones, Danny S.
179;#Smith, Sandra M.
164;#Thomas, Rob Y.
161;#Ross, Danny L.
2064;#Smith, Michael D.
I would guess that the number preceding the ;# is some sort of user ID that SharePoint keeps with the user instead of something helpful like an ADS GUID. Can I use SSIS to pull SharePoint's user profiles so that I can match the ID shown to an ADS GUID or ADS username, and if so, how? I've tried using the Web Service task within SSIS to call the User Profile service (http://www.my.site/_vti_bin/UserProfileService.asmx), but I get an error about the WSDL being a wrong version.
Unfortunately, the ID shown in site fields is local to the list of users against that site.
Each user is uniquely identified by the site and list GUIDs along with the ID field, but the ID is not unique across user lists and so cannot be used for anything other than indexing into that table.
The other issue with this data is that the profile display is updated regularly by one of the UserProfileSynchronization service timer jobs. I have experienced times when the display name of the user is not updated correctly and ends up set to the account name from Active Directory.
To get an idea of what is going on under the hood, have a look at the All_UserData table in a content database.
In Summary
Only the name part of the field is usable in a meaningful way, and even that is not completely reliable, but it may be good enough.
Can you modify the fields that are exported from SharePoint? Can you add a calculated person field based on this field? If so, then you can have that Person field store a different form of person data like their username or e-mail address which are far more useful in interacting with other systems.
Nat's answer is close. It's actually the UserInfo table; the numbers correspond to that table's tp_ID column. Unfortunately, I still can't figure out how to pull this info using SSIS, so I'm resorting to a console app that pulls the table's data through the SharePoint web service, dumps it into a database table, and is scheduled with Windows Task Scheduler. Also, because of how SharePoint works, each root site collection has different IDs for each person, so I'll need a separate pull for each root site collection. Here's the method I'm using:
private static XElement GetUserInfo(string siteCollectionListsSvc)
{
    SharepointListsSvc.ListsSoapClient ws = new SharepointListsSvc.ListsSoapClient();

    // Point the generated client at the given site collection's lists.asmx.
    ws.Endpoint.Address = new System.ServiceModel.EndpointAddress(siteCollectionListsSvc);

    // Authenticate as the calling Windows user via NTLM.
    ws.ClientCredentials.Windows.AllowedImpersonationLevel = System.Security.Principal.TokenImpersonationLevel.Impersonation;
    ws.ClientCredentials.Windows.AllowNtlm = true;
    ws.ClientCredentials.Windows.ClientCredential = (System.Net.NetworkCredential)System.Net.CredentialCache.DefaultCredentials;

    // "UserInfo" is the hidden user information list; "4000" is the row limit.
    XElement userInfo = ws.GetListItems("UserInfo", String.Empty, null, null, "4000", null, null);
    return userInfo;
}
The method argument would be something like "http://www.my.site/_vti_bin/lists.asmx". My app config that sets up the binding and endpoint:
<configuration>
  <system.serviceModel>
    <bindings>
      <basicHttpBinding>
        <binding name="ListsSoap" closeTimeout="00:01:00" openTimeout="00:01:00"
                 receiveTimeout="00:10:00" sendTimeout="00:01:00" allowCookies="false"
                 bypassProxyOnLocal="false" hostNameComparisonMode="StrongWildcard"
                 maxBufferSize="5000000" maxBufferPoolSize="524288" maxReceivedMessageSize="5000000"
                 messageEncoding="Text" textEncoding="utf-8" transferMode="Buffered"
                 useDefaultWebProxy="true">
          <readerQuotas maxDepth="32" maxStringContentLength="8192" maxArrayLength="16384"
                        maxBytesPerRead="4096" maxNameTableCharCount="16384" />
          <security mode="TransportCredentialOnly">
            <transport clientCredentialType="Ntlm" proxyCredentialType="None"
                       realm="" />
            <message clientCredentialType="UserName" algorithmSuite="Default" />
          </security>
        </binding>
      </basicHttpBinding>
    </bindings>
    <client>
      <endpoint address="http://www.my.site/_vti_bin/lists.asmx"
                binding="basicHttpBinding" bindingConfiguration="ListsSoap"
                contract="SharepointListsSvc.ListsSoap" name="ListsSoap" />
    </client>
  </system.serviceModel>
</configuration>
Notice that I increased the //binding/@maxBufferSize and //binding/@maxReceivedMessageSize from the default of 65536 to 5000000. We've got about 3000 records that could be returned, and the default size wasn't nearly big enough. Since these are all internal calls, I'm not worried about network lag. Other changes from the default binding are in the //security element, specifically the @mode and //transport/@clientCredentialType attributes.
When you get the XML back, the number stored in the PoG field is in the //z:row/@ows_ID attribute, and the corresponding ADS login is in the //z:row/@ows_Name attribute. You also get the email address back in the //z:row/@ows_EMail attribute.
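As a hypothetical follow-on (the variable names are made up), once you have that XElement you can build an ID-to-login map with LINQ to XML and resolve the PoG values:
// Rows come back in the "#RowsetSchema" namespace (conventionally prefixed z).
XNamespace z = "#RowsetSchema";
var idToLogin = GetUserInfo("http://www.my.site/_vti_bin/lists.asmx")
    .Descendants(z + "row")
    .ToDictionary(
        r => (int)r.Attribute("ows_ID"),
        r => (string)r.Attribute("ows_Name"));

// "163;#Jones, Danny S." -> 163 -> the ADS login for that user
string pogValue = "163;#Jones, Danny S.";
int userId = int.Parse(pogValue.Split(new[] { ";#" }, StringSplitOptions.None)[0]);
string adsLogin = idToLogin[userId];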
Hope this helps others get through the same issue!