Combining data from Project Server and SharePoint into a single report

I need to combine data from the Project Server reporting database with data from custom lists in SharePoint workspaces. The results need to be displayed within a single report. How should this be done? Options I've thought of:
Extend the reporting database with the custom list data (if this is possible). Use Reporting Services to display the output.
Query the reporting database and the SharePoint workspaces and combine results in memory. Write custom code to display the output.
Any other ideas? I have the skills to develop this but am very open to purchasing a product if it solves the problem.

I've had this sort of problem as well. My approach:
Create a custom reporting DB.
Run regular jobs from SQL Server to query SharePoint (via web services) and store the results in the DB.
I use GetListItemChangesSinceToken in Lists.asmx to improve efficiency (see the sketch at the end of this answer). I also make use of the SPSiteDataQuery tool set; I wrote a really simple interface onto it so that a site data query can be called remotely and returned as a DataTable.
Use Reporting Services / any tool to extract and report on the data.
The reasons I opted for a staging DB were:
Performance - the web service calls are pretty slow.
Service continuity - if SharePoint is down or slow for any reason, the queries will fail.
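For illustration, here is a minimal sketch of the change-token call, assuming Python with the requests and requests_ntlm packages; the site URL, list name and service account are placeholders:

```python
# Sketch: incrementally pull changed list items via Lists.asmx and stage them.
# The site URL, list name and credentials below are placeholders.
import requests
from requests_ntlm import HttpNtlmAuth

SOAP_ENVELOPE = """<?xml version="1.0" encoding="utf-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetListItemChangesSinceToken xmlns="http://schemas.microsoft.com/sharepoint/soap/">
      <listName>{list_name}</listName>
      <changeToken>{change_token}</changeToken>
    </GetListItemChangesSinceToken>
  </soap:Body>
</soap:Envelope>"""

def fetch_changes(site_url, list_name, change_token=""):
    """Call GetListItemChangesSinceToken; an empty token returns all items plus a new token."""
    response = requests.post(
        f"{site_url}/_vti_bin/Lists.asmx",
        data=SOAP_ENVELOPE.format(list_name=list_name, change_token=change_token),
        headers={
            "Content-Type": "text/xml; charset=utf-8",
            "SOAPAction": "http://schemas.microsoft.com/sharepoint/soap/GetListItemChangesSinceToken",
        },
        auth=HttpNtlmAuth("DOMAIN\\svc_reporting", "***"),  # placeholder service account
    )
    response.raise_for_status()
    return response.text  # contains the changed rows plus the new change token

changes_xml = fetch_changes("http://sharepoint/sites/pmo", "Project Risks")
```

The scheduled job then parses the returned rows, upserts them into the staging tables, and persists the new change token for the next run.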
Hope this helps.

I also found the tool SharePoint Data Miner, which appears to do much the same as DJ's answer describes.

Related

Bringing an MS Graph Search Custom Connector into working mode

Microsoft recently published the Microsoft Search API (beta), which makes it possible to index external systems by creating an MS Graph Search custom connector.
I created such a connector and it has worked so far. I also pushed a few items to the index, and in the MS admin center I created a result type and a vertical. I can now find those external items in the SharePoint Online modern search center, in a dedicated tab belonging to the search vertical created earlier. So far so good.
But now I wonder:
How can I ensure that the external data is continuously pushed to the MS Search index? (How can this be implemented? Is there any tutorial or sample project? What is the underlying architecture?)
Is there a concept of full / incremental / continuous crawls for a Search custom connector at all? If so, how can I "hook" into a crawl in order to push changed data to the index?
Or do I have to implement it all on my own? And if so, what would be a suitable approach?
Thank you for trying out the connector APIs. I am glad to hear that you are able to get items into the index and see the results.
Regarding your questions, the logic for determining when to push items, and your crawl strategy is something that you need to implement on your own. There is no one best strategy per se, and it will depend on your data source and the type of access you have to that data. For example, do you get notifications every time the data changes? If not, how do you determine what data has changed? If none of that is possible, you might need to do a periodic full recrawl, but you will need to consider the size of your data set for ingestion.
We will look into ways to reduce the amount of code you have to write in the future, but right now, this is something you have to implement on your own.
-James
I recently implemented incremental crawling for Graph connectors using Azure Functions. I created a timer-triggered function that fetches the items updated in the data source since the time of the last function run and then updates the search index with those items.
I also wrote a blog post about this approach, using a SharePoint list as the data source. The entire source code can be found at https://github.com/aakashbhardwaj619/function-search-connector-crawler. Hope it is useful.
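As a rough illustration of that approach (this is not the code from the linked blog post), a timer-triggered Azure Function in Python might look like the sketch below; the connection ID, the data-source query and the token acquisition are hypothetical placeholders:

```python
# Sketch of a timer-triggered incremental crawl for a Graph connector.
# The connection ID, data-source query and token acquisition are hypothetical placeholders.
import datetime
import requests
import azure.functions as func

CONNECTION_ID = "contosoexternaldata"  # placeholder external connection ID

def get_graph_token() -> str:
    """Placeholder: acquire an app-only token with ExternalItem.ReadWrite.All (e.g. via MSAL)."""
    raise NotImplementedError

def fetch_items_changed_since(since: datetime.datetime) -> list[dict]:
    """Placeholder: query your data source for items modified after 'since'."""
    return []

def main(mytimer: func.TimerRequest) -> None:
    since = datetime.datetime.utcnow() - datetime.timedelta(minutes=15)  # or load a stored high-water mark
    token = get_graph_token()
    for item in fetch_items_changed_since(since):
        resp = requests.put(
            f"https://graph.microsoft.com/beta/external/connections/{CONNECTION_ID}/items/{item['id']}",
            headers={"Authorization": f"Bearer {token}"},
            json={
                "acl": [{"type": "everyone", "value": "everyone", "accessType": "grant"}],  # placeholder ACL
                "properties": {"title": item["title"], "url": item["url"]},  # must match your connection schema
            },
        )
        resp.raise_for_status()
```

The key design point is the high-water mark: each run only pushes items changed since the previous run, which keeps the crawl incremental rather than a periodic full re-ingestion.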

Application Insights -> export -> Power BI Data Warehouse Architecture

Our team has recently started using Application Insights to add telemetry data to our Windows desktop application. This data is sent almost exclusively in the form of events (rather than page views etc.). Application Insights is useful only up to a point; to answer anything other than basic questions we are exporting to Azure storage and then using Power BI.
My question is one of data structure. We are new to analytics in general and have just been reading about star/snowflake structures for data warehousing. This looks like it might help in providing the answers we need.
My question is quite simple: is this the right approach? Have we over-complicated things? My current feeling is that a better approach would be to pull the latest data and transform it into a SQL database of facts and dimensions for Power BI to query. Does this make sense? Is this what other people are doing? We have realised that this is more work than we initially thought.
Definitely pursue Michael Milirud's answer; if your source product has suitable analytics, you might not need a data warehouse.
Traditionally, a data warehouse has three advantages: it integrates information from different data sources, both internal and external; data is cleansed and standardised across sources; and the history of change over time ensures that data is available in its historic context.
What you are describing is becoming a very common case in data warehousing, where star schemas are created for access by tools like Power BI, Qlik or Tableau. In smaller scenarios the entire warehouse might be held in the Power BI data engine, but larger data sets might need pass-through queries.
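As a toy illustration of what a star schema for event telemetry can look like (Python's built-in sqlite3 stands in for SQL Server here; all table and column names are made up):

```python
# Toy star schema for event telemetry: one fact table surrounded by dimension tables.
# sqlite3 stands in for SQL Server; the table and column names are illustrative only.
import sqlite3

con = sqlite3.connect("telemetry_dw.db")
con.executescript("""
CREATE TABLE IF NOT EXISTS DimDate      (DateKey INTEGER PRIMARY KEY, FullDate TEXT, Year INTEGER, Month INTEGER);
CREATE TABLE IF NOT EXISTS DimEventType (EventTypeKey INTEGER PRIMARY KEY, EventName TEXT);
CREATE TABLE IF NOT EXISTS DimUser      (UserKey INTEGER PRIMARY KEY, UserId TEXT, Country TEXT);
CREATE TABLE IF NOT EXISTS FactEvent (
    DateKey      INTEGER REFERENCES DimDate(DateKey),
    EventTypeKey INTEGER REFERENCES DimEventType(EventTypeKey),
    UserKey      INTEGER REFERENCES DimUser(UserKey),
    EventCount   INTEGER,
    DurationMs   REAL
);
""")
con.commit()
```

The reporting tool then queries the single fact table and slices it by the surrounding dimensions, which is what makes the star shape convenient for the tools mentioned above.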
In your scenario, you might be interested in some tools that appear to handle at least some of the migration of Application Insights data:
https://sesitai.codeplex.com/
https://github.com/Azure/azure-content/blob/master/articles/application-insights/app-insights-code-sample-export-telemetry-sql-database.md
Our product Ajilius automates the development of star schema data warehouses, speeding the development time to days or weeks. There are a number of other products doing a similar job, we maintain a complete list of industry competitors to help you choose.
I would continue with Power BI - it actually has a very sophisticated and powerful data integration and modeling engine built in. Historically I've worked with SQL Server Integration Services and Analysis Services for these tasks - Power BI Desktop is superior in many aspects. The design approaches remain consistent - star schemas etc, but you build them in-memory within PBI. It's way more flexible and agile.
Also, are you aware that Application Insights can be connected directly to the Power BI web service? This connects to your AI data in minutes and gives you ready-to-use Power BI content (dashboards, reports, datasets). You can customise these and build new reports from the datasets.
https://powerbi.microsoft.com/en-us/documentation/powerbi-content-pack-application-insights/
What we ended up doing was not sending events from our WinForms app directly to Application Insights, but to an Azure Event Hub.
We then created a job that reads from the Event Hub and sends the data to:
Application Insights, using the SDK
Blob storage, for later processing
Azure Table storage, to create Power BI reports
You can of course add more destinations.
So basically all events are sent to one destination and from there stored in many destinations, each for its own purpose. We definitely did not want to be restricted to 7 days of raw data; storage is cheap, and blob storage can be used by many of the Azure and Microsoft analytics solutions.
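A rough sketch of that fan-out job, assuming the azure-eventhub and azure-storage-blob Python SDKs; the connection strings and names are placeholders, and the Application Insights and Table storage destinations are only indicated by comments:

```python
# Sketch of the fan-out job: read telemetry events from the Event Hub and copy each one
# to several destinations. Connection strings and names are placeholders.
import uuid
from azure.eventhub import EventHubConsumerClient
from azure.storage.blob import BlobServiceClient

EVENTHUB_CONN = "<event-hub-connection-string>"
STORAGE_CONN = "<storage-connection-string>"

blob_service = BlobServiceClient.from_connection_string(STORAGE_CONN)

def on_event(partition_context, event):
    payload = event.body_as_str()
    # 1. Raw copy to blob storage for later processing / other analytics tools.
    blob_service.get_blob_client("raw-events", f"{uuid.uuid4()}.json").upload_blob(payload)
    # 2. Forward to Application Insights via its SDK, and
    # 3. write a row to Azure Table storage for the Power BI reports (both omitted here).
    partition_context.update_checkpoint(event)

client = EventHubConsumerClient.from_connection_string(
    EVENTHUB_CONN, consumer_group="$Default", eventhub_name="telemetry")
with client:
    client.receive(on_event=on_event, starting_position="-1")  # read from the start of each partition
```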
The Event Hub can be linked to Stream Analytics as well.
More information about eventhubs can be found at https://azure.microsoft.com/en-us/documentation/articles/event-hubs-csharp-ephcs-getstarted/
You can start using the recently released Application Insights Analytics feature. In Application Insights we now let you write any query you would like, so that you can get more insight out of your data. Analytics runs your queries in seconds, lets you filter / join / group by any property, and you can also run these queries from Power BI.
More information can be found at https://azure.microsoft.com/en-us/documentation/articles/app-insights-analytics/

Getting data from SharePoint to another server

I am currently working on a mobile concept.
We are running a SharePoint 2010 intranet solution, which is ONLY accessible within the company.
We want to make a mobile solution (for people outside), with data from the Sharepoint server.
I would like to have the data moved, e.g. every 10-15 minutes through a cron job, to an external database which the mobile solution can access.
What is the easiest way to move the data? Using the web services, or are there any other ways to do this?
Thanks in advance,
Jens
A possible solution is to code a timer job, which is a cron-like job scheduled by SharePoint. You could set it to run every night with some SharePoint object model code that extracts all the data and sends it to the other server, using ADO.NET or any equivalent technology (ORMs etc.). This method pushes the data to the other server.
If you have connectivity limitations, such as firewalls that only allow HTTP traffic, then you will definitely need to use either the web services or the client object model. This method pulls the data from the server.
The client object model is preferred over the web services because, among other features, it batches requests to make them more efficient, its API is better for manipulating data, and so on. A rough sketch of the pull approach is below.
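For example, a minimal pull-style sketch in Python, using listdata.svc (the REST interface SharePoint 2010 exposes); the site URL, list name, columns and credentials are placeholders, and sqlite3 stands in for the external database:

```python
# Sketch: pull list items from SharePoint 2010 via listdata.svc and copy them to an external DB.
# Site URL, list name, columns and credentials are placeholders; sqlite3 stands in for the real database.
import sqlite3
import requests
from requests_ntlm import HttpNtlmAuth

resp = requests.get(
    "http://intranet/_vti_bin/listdata.svc/News",     # hypothetical list called "News"
    headers={"Accept": "application/json"},
    auth=HttpNtlmAuth("DOMAIN\\svc_mobile", "***"),
)
resp.raise_for_status()
items = resp.json()["d"]["results"]

con = sqlite3.connect("mobile.db")
con.execute("CREATE TABLE IF NOT EXISTS news (id INTEGER PRIMARY KEY, title TEXT, modified TEXT)")
con.executemany(
    "INSERT OR REPLACE INTO news VALUES (?, ?, ?)",
    [(i["Id"], i["Title"], i["Modified"]) for i in items],
)
con.commit()
```

Run something like this from a cron job or scheduled task every 10-15 minutes and the external database stays close enough to current for the mobile solution.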
Another option is to use SSIS to do the job, as described in this article:
http://msdn.microsoft.com/en-us/library/hh368261.aspx

Are SharePoint Lists evil in an SOA based enterprise?

My company is moving from Client/Server applications (thick client apps that makes database calls directly) to a Service Oriented Architecture (SOA) (thin or thick clients that call a web service that then does business logic and calls the database).
Part of this includes using SharePoint as our client (not our only client type, but the major one). I have been watching the Pluralsight training on SharePoint and I am starting to see a lot about SharePoint Lists.
SharePoint Lists seem to be a core part of SharePoint. However, they also seem to be a huge step backward, architecturally speaking. These are my concerns:
Using these lists, I will have my SharePoint webparts hitting the data directly again (much like where we were with 2 tier client/server apps).
This confuses the data layer big time. Do I store my list of clients in the SQL Server database? Or a SharePoint list? Or both? (say it ain't so!) If both, how do I keep them in Sync?
If I store the data in SharePoint Lists do I then have to have my Web Services using the SharePoint Client Object Model to get at the lists (for non-SharePoint clients)?
Basically SharePoint Lists seem like a very very bad idea. But what I hear is that it is one of the big benefits of SharePoint. (Though I know that there are things like resource management and permissions that are also useful in SharePoint.)
SharePoint Lists seem like an attempt at low-grade data storage (without all the benefits of a full data management solution like SQL Server).
So here are my questions: What are the right/best practice reasons why would I use SharePoint Lists over web services that access a SQL Server? And can SharePoint even work normally using web services to get and update data? (Basically, if I don't use lists, do I lose a lot of functionality?)
SharePoint lists are not a one-size-fits-all solution to data storage. There are plenty of scenarios where you'll want to use data available from an external system, like an existing CRM database, inside of SharePoint.
SharePoint 2007 used a concept called Business Data Catalog to address some of these scenarios, allowing a read-only view of external system data in SharePoint lists.
SharePoint 2010 greatly expands on the SharePoint 2007 capabilities with Business Connectivity Services, allowing for full read/write from SharePoint lists, with API access allowing custom connectors to be implemented in code for whatever backend system you may be trying to access (a SQL Server provider is provided out of the box). Here's a pretty thorough primer on the BCS, and there's a lot more information to be found on MSDN.
Be wary of trying to use SharePoint lists as traditional tables in an RDBMS; that isn't their purpose, and it will only lead to intense headaches down the road.
While I agree with the answer by OedipusPrime, I feel your question "Why would I use lists" warrants a more detailed answer.
The short version is, you probably shouldn't. What SharePoint gives you are lists which are a bit 'database-like', but simple enough that your ordinary user can cope with them. They're quite flexible for users. It also gives you a user interface to interact with the lists and data.
You're not using the UI, and you're probably quite happy with SQL, so SQL should probably be your choice. There's nothing you can do in SharePoint that you can't do yourself in SQL (often faster) - but SQL isn't as friendly for non-technical users to set things up in. SharePoint isn't a "full data management solution" like SQL - it's more like ASP.NET on steroids, and it has different advantages. (That's why its back end is ... SQL.)
So, where would you store your data? SQL or Lists. One or the other - don't do both, that never works out well. If your data is in SQL, you can expose it in SharePoint with the BCS as mentioned already.
If your data is in a SharePoint list, yes, you can use the Client Object model. Or you can use Web Services directly. Or the REST API. All those are valid options.
Or, you could expose data from your database via your own Web Services, and then consume those via SharePoint's BCS, allowing you to present your data in SharePoint (with full CRUD, if you want) without your application becoming dependent upon it.
You are partially right. Regarding your options, here is the route you should go:
You should store the data only in lists, not in SQL Server. The data in lists is ultimately stored in the SharePoint content database in SQL Server anyway, so there is no point syncing it.
To have your clients access the data, your web services can call the out-of-the-box web services exposed by SharePoint, which can operate on list data.
See these articles for an overview of the web services exposed by SharePoint:
http://msdn.microsoft.com/en-us/library/ee538665.aspx
http://www.sharepointmonitor.com/2007/01/sharepoint-web-service/
This is one of the big questions that faces someone new to SharePoint - should I store X in a SharePoint list, or in a SQL Server table?
SharePoint lists have similarities to database tables in that they have rows (items) and columns and the equivalent of triggers (event receivers). Data management pages are part of SharePoint so there is no need to build pages for updating and adding items to the table, and additionally the lists can be exposed as RSS feeds or through web services. They can also be versioned and participate in a workflow. Another advantage is that the contents of the lists are automatically included in content backups, so there is no need to manage a separate backup and restore process – everything is in the content database. There may not necessarily be a performance impact because there are several caching mechanisms which come into play.
A SharePoint list should certainly be considered as a storage mechanism, even for large datasets with appropriate treatment. In one sense the SharePoint list is acting as an effective data access layer, bearing in mind that ultimately the data is being stored in a SQL Server database anyway. What you do not get is the rigour of relational modelling, normalisation, referential integrity, optimisation of the execution plan, and all the other tools of the DBA’s craft. If the efficient management of the data is dependent on those skills then storing the data directly in its own database is probably a better choice. You can still get at the data through BCS, as well as through custom code.
A word of warning: on no account be tempted to interact with the SharePoint content databases directly.

IIS 7 Logs Vs Custom

I want to log some information about my visitors. Is it better to use the IIS-generated logs or to create my own in a SQL Server 2008 database?
I know I should probably provide more information about my specific scenario, but I'd just like, generally, the pros and cons of either proposal.
You can add additional information to the IIS logs from ASP.NET using HttpResponse.AppendToLog. Additionally, you could use the Advanced Logging module to create your own logs with custom filters and custom data, including data from performance counters and more.
It all depends on what information you want to analyse.
If you're doing aggregations and rollups then you'd want to pull this data into a database for analysis. Pulling your data into a database will give you access to indexes and better querying tools.
If you're doing infrequent, one-off, simple queries then LogParser might be sufficient for your needs. However, you'll be constantly scanning unindexed flat files looking for data, which is I/O intensive.
But as you say, without knowing more about your specific scenario it's hard to say what would be best.