I'd like to create a shadow copy of my main organization to play around in without risking damage to the data. The schema is easy to move over using solutions, but what about the data?
Moving the data using the export/import functions would kill a lot of time. I'd like a utility that does it for me, so I can move the data over daily, or at the very least weekly.
Any suggestions on method? Do I need to go to third-party products, or do I have to code one myself?
The Instance Adapter for CRM allows you to keep two organizations in sync.
You will also need the Connector for Microsoft Dynamics, which is available on PartnerSource or CustomerSource. The Connector is a tool that lets you integrate and sync the different Dynamics products, such as CRM, GP, or NAV. With the Instance Adapter for CRM you should be able to set up an integration between two CRM orgs.
http://www.microsoft.com/en-us/download/details.aspx?id=35385
If you are using CRM 2013, you should read this blog post, as only certain versions of the Connector support it.
http://blogs.msdn.com/b/dynamicsconnector/archive/2013/10/17/microsoft-dynamics-crm-2013-is-supported-by-connector-for-microsoft-dynamics.aspx
Our team has just recently started using Application Insights to add telemetry data to our Windows desktop application. This data is sent almost exclusively in the form of events (rather than page views, etc.). Application Insights is useful only up to a point; to answer anything other than basic questions, we are exporting to Azure storage and then using Power BI.
My question is one of data structure. We are new to analytics in general and have just been reading about star/snowflake structures for data warehousing. This looks like it might help in providing the answers we need.
My question is quite simple: is this the right approach? Have we overcomplicated things? My current feeling is that a better approach would be to pull the latest data and transform it into a SQL database of facts and dimensions for Power BI to query. Does this make sense? Is this what other people are doing? We have realised that this is more work than we initially thought.
Definitely pursue Michael Milirud's answer; if your source product has suitable analytics, you might not need a data warehouse.
Traditionally, a data warehouse has three advantages: it integrates information from different data sources, both internal and external; data is cleansed and standardised across sources; and the history of change over time ensures that data is available in its historic context.
What you are describing is becoming a very common case in data warehousing, where star schemas are created for access by tools like Power BI, Qlik or Tableau. In smaller scenarios the entire warehouse might be held in the Power BI data engine, but larger data might need pass-through queries.
In your scenario, you might be interested in some tools that appear to handle at least some of the migration of Application Insights data:
https://sesitai.codeplex.com/
https://github.com/Azure/azure-content/blob/master/articles/application-insights/app-insights-code-sample-export-telemetry-sql-database.md
Our product Ajilius automates the development of star schema data warehouses, cutting development time to days or weeks. There are a number of other products doing a similar job; we maintain a complete list of industry competitors to help you choose.
I would continue with Power BI - it actually has a very sophisticated and powerful data integration and modeling engine built in. Historically I've worked with SQL Server Integration Services and Analysis Services for these tasks - Power BI Desktop is superior in many aspects. The design approaches remain consistent - star schemas etc, but you build them in-memory within PBI. It's way more flexible and agile.
Also, are you aware that Application Insights can be connected directly to Power BI on the web? This connects to your AI data in minutes and gives you ready-to-use Power BI content (dashboards, reports, datasets). You can customize these and build new reports from the datasets.
https://powerbi.microsoft.com/en-us/documentation/powerbi-content-pack-application-insights/
What we ended up doing was not sending events from our WinForms app directly to AI, but to an Azure Event Hub.
We then created a job that reads from the Event Hub and sends the data to:
AI, using the SDK
Blob storage, for later processing
Azure Table storage, to create Power BI reports
You can of course add more destinations.
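For a sense of scale, the publishing side can be tiny. Here is a minimal sketch of what it might look like using the classic Microsoft.ServiceBus.Messaging SDK; the connection string and hub name are placeholders, not our real values:

```csharp
using System.Text;
using Microsoft.ServiceBus.Messaging;
using Newtonsoft.Json;

public static class TelemetryPublisher
{
    // Placeholder connection string and hypothetical hub name.
    private static readonly EventHubClient Client =
        EventHubClient.CreateFromConnectionString(
            "Endpoint=sb://yournamespace.servicebus.windows.net/;SharedAccessKeyName=send;SharedAccessKey=...",
            "telemetry");

    // One envelope format for every destination; the consuming job
    // decides whether an event goes to AI, blob storage or table storage.
    public static void Publish(string eventName, object properties)
    {
        string payload = JsonConvert.SerializeObject(
            new { name = eventName, props = properties });
        Client.Send(new EventData(Encoding.UTF8.GetBytes(payload)));
    }
}
```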
So basically all events are sent to one destination and from there stored in many destinations, each for its own purpose. We definitely did not want to be restricted to 7 days of raw data; storage is cheap, and blob storage can be used by many Azure and Microsoft analytics solutions.
The Event Hub can also be linked to Stream Analytics.
More information about eventhubs can be found at https://azure.microsoft.com/en-us/documentation/articles/event-hubs-csharp-ephcs-getstarted/
You can start using the recently released Application Insights Analytics feature. In Application Insights we now let you write any query you like, so you can get more insights out of your data. Analytics runs your queries in seconds, lets you filter/join/group by any property, and you can also run these queries from Power BI.
More information can be found at https://azure.microsoft.com/en-us/documentation/articles/app-insights-analytics/
"A plug-in is custom business logic (code) that you can integrate with Microsoft Dynamics CRM 2011 to modify or augment the standard behavior of the platform."
My question is: what are the ideal scenarios/conditions for using plugins in Dynamics CRM? What types of conditions should be considered before using a plugin?
You should use plugins under any of the following conditions:
When you need to enforce business logic in your database that you cannot accomplish reasonably with built-in tools like workflows
When your business logic must be executed synchronously
When you need to integrate with external services (address verification or payment processing for example)
When you have a multi-tiered solution where you want to inherit business logic
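To make the first two conditions concrete, here is a minimal sketch of a synchronous plugin enforcing a validation rule on Create. The attribute name is purely illustrative, and a real plugin would be registered against a specific message, entity and stage with the Plugin Registration Tool:

```csharp
using System;
using Microsoft.Xrm.Sdk;

// Sketch: a synchronous pre-operation plugin enforcing a rule that a
// workflow can't easily express. "creditlimit" is an example attribute.
public class ValidateAccountPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));

        if (!context.InputParameters.Contains("Target"))
            return;

        var target = (Entity)context.InputParameters["Target"];

        // Runs inside the platform transaction; throwing here blocks the save.
        if (target.Contains("creditlimit") &&
            ((Money)target["creditlimit"]).Value < 0)
        {
            throw new InvalidPluginExecutionException(
                "Credit limit cannot be negative.");
        }
    }
}
```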
We write a plugin when the out-of-the-box MS CRM tools cannot do the work we need.
For example:
Any integration with other systems, auto-number generation, executing complex business logic, etc.
Plugins are custom logic (.dll assemblies) that the platform calls out to in order to implement business logic. They can be registered on CRUD events in three stages - Pre-Validation, Pre-Operation and Post-Operation - which lets you capture the data going to, or coming from, the database during the transaction between the UI and the back end.
By running custom code on these events, developers can perform business operations in between. There are many examples to quote, but mostly plugins are used when the built-in workflows/processes cannot implement the business requirement.
Plug-ins have many uses, including the following:
Performing complex platform level data validation
Performing auto-number generation
Providing integration with other applications
Executing complex business logic
I've been researching how to extract data from an MS Dynamics CRM 2011/3 Online instance so that I can replicate entire CRM entities in a target database.
I've looked at the Retrieve and RetrieveMultiple operations of the Organisation web service. These are able to extract data from a single CRM entity (entity type) at a time.
There's also the FetchXML interface, that can retrieve data using a complex query, from multiple entities.
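For example, a single FetchXML call can join related entities. A minimal sketch against the Organisation service might look like this (the entity and attribute names are just illustrative):

```csharp
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

static EntityCollection FetchAccountsWithContacts(IOrganizationService service)
{
    // One FetchXML query spanning two entities (accounts joined to contacts).
    string fetchXml = @"
    <fetch>
      <entity name='account'>
        <attribute name='name' />
        <link-entity name='contact' from='parentcustomerid' to='accountid'>
          <attribute name='fullname' />
        </link-entity>
      </entity>
    </fetch>";

    return service.RetrieveMultiple(new FetchExpression(fetchXml));
}
```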
It's possible that there will be no quiet time - when no data changes are being made by users or via web services - that I could use to extract data from the system in order to get a consistent snapshot of the data.
If I was able to access the SQL Server database directly I would be able to set an isolation level for a transaction and extract all data within that transaction, and get a consistent view of data.
I think FetchXML would give me a consistent snapshot, but only of the data queried by each call to it.
I could use FetchXML to query all the entities I'd like to replicate in a single call, and then renormalise the data with some ETL code on my target database. That query wouldn't be nice, though: complex, possibly non-performant, and with an impact on system performance.
So, basically, my problem is this: if I extract from each entity in turn while the database is changing, I'm highly likely to get an inconsistent data set in my target database.
How can I get a consistent snapshot of data to access?
You can contact support through the Support Portal and request a database backup. Then you can just restore that database to your On-Premise installation through Deployment Manager.
EDIT
After your comments below, I suggest a "push" model instead of a "pull" model. You'll need to create plugins for Create/Update/Delete on all the entities in CRM Online that you are interested in. These plugins will push those updates to your database (probably through your own web service). Since these plugins run inside the transaction, if your web service throws an error you can cancel the source action in CRM, thus guaranteeing transactional consistency.
Once you get these plugins up and running, you can do a one-time export and your plugins will keep it up to date from there.
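A rough sketch of such a plugin, assuming a hypothetical push endpoint (the URL and the serialization are placeholders):

```csharp
using System;
using System.Net;
using Microsoft.Xrm.Sdk;

// Sketch: a synchronous plugin that pushes the record to an external
// endpoint. Throwing InvalidPluginExecutionException rolls back the
// CRM transaction, keeping CRM and the replica consistent.
public class ReplicationPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider
            .GetService(typeof(IPluginExecutionContext));
        var target = (Entity)context.InputParameters["Target"];

        try
        {
            using (var client = new WebClient())
            {
                // "https://replica.example.com/push" is a placeholder URL.
                client.UploadString("https://replica.example.com/push",
                    Serialize(target));
            }
        }
        catch (WebException ex)
        {
            // Cancels the source Create/Update/Delete in CRM.
            throw new InvalidPluginExecutionException(
                "Replication failed; the operation was rolled back.", ex);
        }
    }

    private static string Serialize(Entity e)
    {
        // Minimal placeholder serialization for the sketch.
        return e.LogicalName + ":" + e.Id;
    }
}
```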
My company is moving from Client/Server applications (thick client apps that makes database calls directly) to a Service Oriented Architecture (SOA) (thin or thick clients that call a web service that then does business logic and calls the database).
Part of this includes using SharePoint as our client (not our only client type, but the major one). I have been watching the Pluralsight training on SharePoint and I am starting to see a lot about SharePoint Lists.
SharePoint Lists seem to be a core part of SharePoint. However, they also seem to be a huge step backward, architecturally speaking. These are my concerns:
Using these lists, I will have my SharePoint webparts hitting the data directly again (much like where we were with 2 tier client/server apps).
This confuses the data layer big time. Do I store my list of clients in the SQL Server database? Or a SharePoint list? Or both? (say it ain't so!) If both, how do I keep them in Sync?
If I store the data in SharePoint Lists do I then have to have my Web Services using the SharePoint Client Object Model to get at the lists (for non-SharePoint clients)?
Basically SharePoint Lists seem like a very very bad idea. But what I hear is that it is one of the big benefits of SharePoint. (Though I know that there are things like resource management and permissions that are also useful in SharePoint.)
SharePoint Lists seem like an attempt at low-grade data storage (without all the benefits of a full data management solution like SQL Server).
So here are my questions: What are the right/best practice reasons why would I use SharePoint Lists over web services that access a SQL Server? And can SharePoint even work normally using web services to get and update data? (Basically, if I don't use lists, do I lose a lot of functionality?)
SharePoint lists are not a one-size-fits-all solution to data storage. There are a great many scenarios where you'll want to use data from an external system, like an existing CRM database, inside of SharePoint.
SharePoint 2007 used a concept called Business Data Catalog to address some of these scenarios, allowing a read-only view of external system data in SharePoint lists.
SharePoint 2010 greatly expands on the SharePoint 2007 capabilities with Business Connectivity Services, allowing for full read/write from SharePoint lists, with API access allowing custom connectors to be implemented in code for whatever backend system you may be trying to access (a SQL Server provider is provided out of the box). Here's a pretty thorough primer on the BCS, and there's a lot more information to be found on MSDN.
Be wary of trying to use SharePoint lists as traditional tables in an RDBMS; that isn't their purpose, and it will only lead to intense headaches down the road.
While I agree with the answer by OedipusPrime, I feel your question "Why would I use lists" warrants a more detailed answer.
The short version is, you probably shouldn't. What SharePoint gives you are lists which are a bit 'database-like', but simple enough that your ordinary user can cope with them. They're quite flexible for users. It also gives you a user interface to interact with the lists and data.
You're not using the UI, and you're probably quite happy with SQL, so SQL should probably be your choice. There's nothing that you can do in SharePoint that you can't do yourself in SQL (often faster) - but SQL isn't as user friendly for non-techies to set things up. SharePoint isn't a "full data management solution" like SQL - it's more like ASP.NET on steroids, and it has different advantages. (That's why its back end is ... SQL)
So, where would you store your data? SQL or Lists. One or the other - don't do both, that never works out well. If your data is in SQL, you can expose it in SharePoint with the BCS as mentioned already.
If your data is in a SharePoint list, yes, you can use the Client Object Model. Or you can use the web services directly. Or the REST API. All of those are valid options.
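For instance, reading a list through the managed Client Object Model looks roughly like this (the site URL and list name are placeholders):

```csharp
using System;
using Microsoft.SharePoint.Client;

// Sketch: reading list items from outside SharePoint via CSOM.
using (var ctx = new ClientContext("https://intranet.example.com/sites/team"))
{
    List clients = ctx.Web.Lists.GetByTitle("Clients"); // hypothetical list
    ListItemCollection items =
        clients.GetItems(CamlQuery.CreateAllItemsQuery(100));

    ctx.Load(items);
    ctx.ExecuteQuery(); // single round trip to the server

    foreach (ListItem item in items)
        Console.WriteLine(item["Title"]);
}
```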
Or, you could expose data from your database via your own Web Services, and then consume those via SharePoint's BCS, allowing you to present your data in SharePoint (with full CRUD, if you want) without your application becoming dependent upon it.
You are partially right. Regarding your options, here is the route you should go:
You should store the data only in lists, not in SQL Server. The data in lists is ultimately stored in the SharePoint content database in SQL Server anyway, so there is no point syncing the two.
To give your clients access to the data, your web services can call the out-of-the-box web services exposed by SharePoint, which can operate on list data.
See these articles for an overview of the web services exposed by SharePoint:
http://msdn.microsoft.com/en-us/library/ee538665.aspx
http://www.sharepointmonitor.com/2007/01/sharepoint-web-service/
This is one of the big questions facing someone new to SharePoint: should I store X in a SharePoint list or in a SQL Server table?
SharePoint lists have similarities to database tables in that they have rows (items) and columns and the equivalent of triggers (event receivers). Data management pages are part of SharePoint so there is no need to build pages for updating and adding items to the table, and additionally the lists can be exposed as RSS feeds or through web services. They can also be versioned and participate in a workflow. Another advantage is that the contents of the lists are automatically included in content backups, so there is no need to manage a separate backup and restore process – everything is in the content database. There may not necessarily be a performance impact because there are several caching mechanisms which come into play.
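As a quick illustration of the trigger comparison, an event receiver is just a class attached to the list. A minimal sketch (the validation rule is made up):

```csharp
using Microsoft.SharePoint;

// Sketch: the list equivalent of a trigger. Registered against a list
// or content type; ItemAdding runs synchronously before the save.
public class ClientListReceiver : SPItemEventReceiver
{
    public override void ItemAdding(SPItemEventProperties properties)
    {
        if (string.IsNullOrEmpty(properties.AfterProperties["Title"] as string))
        {
            properties.Cancel = true; // block the add
            properties.ErrorMessage = "Title is required.";
        }
    }
}
```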
A SharePoint list should certainly be considered as a storage mechanism, even for large datasets with appropriate treatment. In one sense the SharePoint list is acting as an effective data access layer, bearing in mind that ultimately the data is being stored in a SQL Server database anyway. What you do not get is the rigour of relational modelling, normalisation, referential integrity, optimisation of the execution plan, and all the other tools of the DBA’s craft. If the efficient management of the data is dependent on those skills then storing the data directly in its own database is probably a better choice. You can still get at the data through BCS, as well as through custom code.
A word of warning: on no account be tempted to interact with the SharePoint content databases directly.
We need to build a couple applications that require fairly advanced workflow functionality. The plan is to store the data in SQL Server, use Windows Workflow Foundation as the workflow engine, and build the frontend using an RIA technology such as Flex or Silverlight.
We already have Sharepoint 2007 set up, and some of us (including me) have a little bit of experience creating custom Sharepoint workflows that work with data in Sharepoint lists.
My question is, would it make sense to use Sharepoint for the workflow, while the actual data is stored outside of Sharepoint in a separate database? We need the task, authentication, and email functionality of Sharepoint, but our data model is a bit complex so we'd rather not store the data in Sharepoint. We'd rather not start from scratch with Workflow Foundation, because Sharepoint already gives us 90% of the functionality we need.
Any thoughts / advice?
I think this is a great example of using SharePoint as a platform; I don't see any conceptual problems with using it in the way you describe. I see SharePoint as a development platform. One thing you might want to keep in mind: if you want the workflow to continue in response to events happening in the separate database, you might have to update, for instance, the workflow task items from an external program.
Your use case is a perfect fit and one that SharePoint adds great value to. I would highly recommend using SharePoint to host your workflows.
I have developed many SharePoint-hosted WF workflows, and the only real problem I ever experienced was making calls to long-running web services (asynchronous operations), as SharePoint's WF host has some limitations on the types of external providers it can listen for events from.
The solution that I developed (which was a bit of a hack at first but ended up being of some value to my customers) was to create a service proxy (WCF) that sat outside of SharePoint and would route calls to remote services and wait for their responses. In parallel to making that asynchronous call, a parallel activity would create a SharePoint task associated with the asynchronous operation. Then the WF would stop on an OnTaskCompleted activity, which causes the WF resources to be released and the state to be persisted to SQL. As the long-running operation evented back status updates or completion notifications, the external service would update the related SharePoint task. Once the task is marked completed, the WF is rehydrated and continues executing. The neat thing about this approach was that I could then create a dashboard showing the status of all the long-running processes going on outside of SharePoint. Lastly, I packaged all of this up into a composite activity so that it didn't clutter up my pretty workflow diagrams.
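The external update side of that pattern is small. Something like this sketch, using the server-side object model (the URL, list name and item id are placeholders):

```csharp
using Microsoft.SharePoint;

// Sketch: an external process marking the related SharePoint task
// complete so the persisted workflow wakes up and continues.
using (var site = new SPSite("http://sharepoint/sites/workflows"))
using (SPWeb web = site.OpenWeb())
{
    SPListItem task = web.Lists["Workflow Tasks"].GetItemById(42);
    task["Status"] = "Completed";
    task["PercentComplete"] = 1.0; // 100%
    task.Update(); // raises the task-changed event the workflow waits on
}
```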
SharePoint is ideally suited for this scenario. I would suggest using the Business Data Catalog (BDC) to access external data sources. It provides tremendous benefit, primarily by making your data source searchable, as well as providing out-of-the-box web parts to display the data with master-child relationships, filtering and a rich API.
I would caution against making workflows too complex; instead, break the process up into stages using smaller workflows, InfoPath and user actions to facilitate the entire process. This is where SharePoint really shines, as you can give others in the organization visibility into the process stages using dashboards (if that makes sense for your scenario), as well as collaboration, approvals ... the list goes on.
I agree that SP can provide a nice WF engine, but let me ask this... are you storing anything IN SharePoint? (tasks, data sources, etc)
I ask because it may be as easy (and more appropriate) to run your own WF engine. If you are running all native WF functionality, and just need an engine, you can write a quick console app that can start workflows.
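For example, a bare-bones WF 3.x console host is only a few lines; MyWorkflow below is a placeholder for your own workflow type:

```csharp
using System;
using System.Threading;
using System.Workflow.Runtime;

// Sketch: self-hosting the WF 3.x runtime in a console app,
// no SharePoint required.
class Program
{
    static void Main()
    {
        using (var runtime = new WorkflowRuntime())
        {
            var done = new AutoResetEvent(false);
            runtime.WorkflowCompleted += (s, e) => done.Set();
            runtime.WorkflowTerminated += (s, e) => done.Set();

            // MyWorkflow is a hypothetical workflow type in your project.
            WorkflowInstance instance =
                runtime.CreateWorkflow(typeof(MyWorkflow));
            instance.Start();

            done.WaitOne(); // wait for the workflow to finish
        }
    }
}
```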
If you are using SP for anything beyond WF, then I absolutely agree to use SP.