Why do we use plugins in CRM 2011? - dynamics-crm-2011

"A plug-in is custom business logic (code) that you can integrate with Microsoft Dynamics CRM 2011 to modify or augment the standard behavior of the platform."
My question is:
What are the ideal scenarios/conditions for using plugins in Dynamics CRM? Before writing a plugin, what types of conditions should be considered?

You should use plugins under any of the following conditions (a minimal plugin sketch follows the list):
When you need to enforce business logic in your database that you cannot reasonably accomplish with built-in tools like workflows
When your business logic must be executed synchronously
When you need to integrate with external services (address verification or payment processing, for example)
When you have a multi-tiered solution where you want to inherit business logic
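To make that concrete, here is a minimal sketch of a synchronous validation plugin against the CRM 2011 SDK; the attribute name and threshold are hypothetical:

using System;
using Microsoft.Xrm.Sdk;

// A minimal synchronous plugin. Register it, for example, on the
// Pre-Operation stage of the Create/Update messages of an entity.
public class CreditLimitValidationPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        IPluginExecutionContext context = (IPluginExecutionContext)
            serviceProvider.GetService(typeof(IPluginExecutionContext));

        // The record being written travels in the "Target" input parameter.
        if (context.InputParameters.Contains("Target") &&
            context.InputParameters["Target"] is Entity)
        {
            Entity target = (Entity)context.InputParameters["Target"];

            // Hypothetical rule: reject credit limits above a threshold.
            if (target.Contains("creditlimit") &&
                ((Money)target["creditlimit"]).Value > 1000000m)
            {
                // Throwing here aborts the operation and surfaces the
                // message to the user synchronously.
                throw new InvalidPluginExecutionException(
                    "Credit limit may not exceed 1,000,000.");
            }
        }
    }
}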

We write a plugin when the MS CRM tools do not provide the required functionality out of the box.
For example: integrations with other systems, auto-number generation, executing complex business logic, etc.

Plugins are call-outs to custom logic (a .dll) that implement business logic. They can be registered on CRUD events in three stages (Pre-Validation, Pre-Operation and Post-Operation), which lets you capture the data going to or coming from the database during the transaction between the UI and the back end.
By running custom code on these events, developers can perform business operations in between. There are many examples to quote, but mostly plugins are used when the built-in system workflows/processes cannot implement the business requirements.
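To illustrate the three stages, a plugin can inspect IPluginExecutionContext.Stage to see where in the pipeline it is running; the numeric values below are the ones the CRM 2011 SDK defines, the rest is a sketch:

using System;
using Microsoft.Xrm.Sdk;

public class StageAwarePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        IPluginExecutionContext context = (IPluginExecutionContext)
            serviceProvider.GetService(typeof(IPluginExecutionContext));

        switch (context.Stage)
        {
            case 10: // Pre-Validation: runs outside the database transaction
                break;
            case 20: // Pre-Operation: inside the transaction, before the write
                break;
            case 40: // Post-Operation: after the write has been committed
                break;
        }
    }
}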

Plug-ins have many uses, including the following (the auto-numbering case is sketched after the list):
Performing complex platform-level data validation
Performing auto-number generation
Providing integration with other applications
Executing complex business logic
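As an illustration of the auto-numbering case, here is a sketch of a Pre-Operation Create plugin; the attribute name and counter source are placeholders:

using System;
using Microsoft.Xrm.Sdk;

// Sketch: registered on the Pre-Operation stage of the Create message,
// so the number is stamped onto the record before it is written.
public class AutoNumberPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        IPluginExecutionContext context = (IPluginExecutionContext)
            serviceProvider.GetService(typeof(IPluginExecutionContext));

        if (context.MessageName != "Create" ||
            !context.InputParameters.Contains("Target"))
        {
            return;
        }

        Entity target = (Entity)context.InputParameters["Target"];

        // "new_ticketnumber" is a hypothetical attribute.
        target["new_ticketnumber"] = "TKT-" + GetNextNumber().ToString("D6");
    }

    private static int GetNextNumber()
    {
        // Placeholder only: a real implementation would read and increment
        // a counter record (with retry on conflict) so numbers stay unique.
        return 1;
    }
}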

CRM 2011 Plugin development best practice

I am inheriting a set of plugins that appear to have been developed by different people. Some of them follow the pattern of one master plugin with many different steps. In this plugin none of the steps are cohesive or related in functionality; the author simply put them all in the same plugin, with code internal to the plugin (if/else madness) that handles the various entities, CRM messages (Update, Create, Delete, etc.) and stages (Pre-Validation, Post-Operation, etc.).
The other developer seems to make a plugin for every entity type and/or related feature grouping. This results in multiple smaller plugins with fewer steps.
My question is this: assuming I have architected a way out of the if/else hell that the previous developer created in the 'one-plugin-to-rule-them-all' design, which approach is preferable from a CRM performance and long-term maintenance (as in fewer side effects, difficulties with deployment, etc.) perspective?
I usually follow a model driven approach and design one plugin class per entity. On this class steps can be registered for the pre-validation, pre- and post-operation and asynchronous stages on the Create, Update, Delete and other messages, but always for only one entity at a time.
Doing so I can keep a clear oversight of the plugin logic that is triggered on an entity's events and also I do not need to bother about the order in which plugin steps are triggered.
Following this approach, of course, means I need a generic pattern for handling all supported events. For this purpose I designed a plugin base class responsible for the event routing. My deriving plugin classes only need to implement (override) the event handler methods (PreUpdate, PostCreate etc.).
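A minimal sketch of such a routing base class, assuming the PreUpdate/PostCreate naming used above; everything else is illustrative:

using System;
using Microsoft.Xrm.Sdk;

// Base class that routes plugin execution to overridable event handlers.
public abstract class PluginBase : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        IPluginExecutionContext context = (IPluginExecutionContext)
            serviceProvider.GetService(typeof(IPluginExecutionContext));

        // Route on message and stage; 20 = Pre-Operation, 40 = Post-Operation.
        if (context.MessageName == "Update" && context.Stage == 20)
        {
            PreUpdate(context);
        }
        else if (context.MessageName == "Create" && context.Stage == 40)
        {
            PostCreate(context);
        }
        // ... remaining message/stage combinations routed the same way.
    }

    // Deriving entity-specific plugin classes override only what they need.
    protected virtual void PreUpdate(IPluginExecutionContext context) { }
    protected virtual void PostCreate(IPluginExecutionContext context) { }
}

// Example: one plugin class per entity, as described above.
public class AccountPlugin : PluginBase
{
    protected override void PreUpdate(IPluginExecutionContext context)
    {
        // Prepare the data and call into separate business-logic classes here.
    }
}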
In my opinion plugin classes should only be used to glue system events to the business logic. Therefore the code performing the desired actions should be placed in separate classes. Plugin classes only route the events, prepare the data and call the business logic.
Some developers tend to design one plugin class per step, or even per implemented requirement. Doing so keeps your plugin classes terse (which is positive), but when logic gets complicated you can easily lose track of what is going on for a single entity. (Recently I worked with a CRM implementation that had 21 plugin classes registered for one entity. Understanding what was going on and adding new behaviour to this entity proved to be very tricky and time consuming.)

Nested Azure Logic Apps

Is this even possible? I have a logic app that is a unit of functionality (a mixture of card connectors and API apps I created) that I would like to share among other logic apps. From what I can see, this doesn't appear to be allowed. If it is not possible, I am going to have to recreate the same cards in every logic app that needs this unit of functionality.
You can definitely factor your Logic Apps into smaller flows that can be orchestrated, and with the support for conditions and iterations you can do lots of powerful stuff.
Unfortunately, today it is not as straightforward to leverage the "Action of Type Workflow" that we have in the system, since the user interface does not expose the capabilities you would want, such as adding actions of type Workflow, editing the access keys, and the outputs.
I wrote a quick blog post that shows how you can do that today, using ARMClient to edit the access keys and make it work:
http://blogs.msdn.com/b/carlosag/archive/2015/05/31/using-nested-azure-logic-apps-or-invoking-flows-from-another-logic-app.aspx
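For completeness, a plainly different workaround (not the Workflow action type the blog post covers) is to give the child Logic App an HTTP request trigger and post to its callback URL, SAS signature included, from the parent or any other client. A hypothetical sketch:

using System;
using System.Net.Http;
using System.Text;

class InvokeChildLogicApp
{
    static void Main()
    {
        // Placeholder: the child Logic App's HTTP request-trigger callback
        // URL, which includes its SAS access signature.
        string triggerUrl = "https://<child-logic-app-callback-url>";

        using (HttpClient client = new HttpClient())
        {
            StringContent payload = new StringContent(
                "{ \"orderId\": 42 }", Encoding.UTF8, "application/json");

            HttpResponseMessage response =
                client.PostAsync(triggerUrl, payload).GetAwaiter().GetResult();

            Console.WriteLine(response.StatusCode);
        }
    }
}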
In my notes from the Logic Apps talk at Ignite I wrote that it is possible to nest logic apps. I don't have an example handy, but it might be worth skimming through the slides/video from that talk.

Synchronizing data between two organizations

I'd like to create a shadow copy of my main organization to play around in without risking damage to the data. The schema is easy to move over using solutions, but what about the data?
Moving the data using the export/import functions would kill a lot of time. I'd like a utility that does this for me, so I can move the data over daily, or at the very least weekly.
Any suggestions on method? Do I need to go to third-party products, or do I have to code one myself?
The Instance Adapter for Microsoft Dynamics CRM allows you to keep two organizations in sync.
You will also need the Connector for Microsoft Dynamics, which is available on PartnerSource or CustomerSource. The Connector is a tool that allows you to integrate and sync the different Dynamics products, like CRM, GP, or NAV. With the Instance Adapter for CRM you should be able to set up an integration between two CRM orgs.
http://www.microsoft.com/en-us/download/details.aspx?id=35385
If you are using CRM 2013, you should read this blog post, as support was dropped after a certain version of the Connector.
http://blogs.msdn.com/b/dynamicsconnector/archive/2013/10/17/microsoft-dynamics-crm-2013-is-supported-by-connector-for-microsoft-dynamics.aspx

Should we use the SharePoint WF host for workflows that include external (to SharePoint) data sources?

We need to build a couple applications that require fairly advanced workflow functionality. The plan is to store the data in SQL Server, use Windows Workflow Foundation as the workflow engine, and build the frontend using an RIA technology such as Flex or Silverlight.
We already have SharePoint 2007 set up, and some of us (including me) have a little experience creating custom SharePoint workflows that work with data in SharePoint lists.
My question is: would it make sense to use SharePoint for the workflow while the actual data is stored outside SharePoint in a separate database? We need the task, authentication, and email functionality of SharePoint, but our data model is a bit complex, so we'd rather not store the data in SharePoint. We'd rather not start from scratch with Workflow Foundation, because SharePoint already gives us 90% of the functionality we need.
Any thoughts / advice?
I think this is a great example of using SharePoint as a platform. I don't see any conceptual problems with using it in the way you describe. I see SharePoint as a development platform. One thing you might want to keep in mind: if you want the workflow to continue on events happening in the separate database, you may have to update, for instance, the workflow task items from an external program.
Your use case is a perfect fit and one that SharePoint adds great value to. I would highly recommend using SharePoint to host your workflows.
I have developed many SharePoint-hosted WF workflows, and the only real problem I ever experienced was making calls to long-running web services (asynchronous operations), as SharePoint's WF host has some limitations on the type of external providers it can listen for events from.
The solution that I developed (which was a bit of a hack at first, but ended up being of some value to my customers) was to create a service proxy (WCF) that sat outside of SharePoint and would route calls to remote services and wait for their response. In parallel to making that asynchronous call, a parallel activity would create a SharePoint task associated with the asynchronous operation. Then the WF would stop on an OnTaskCompleted activity, which causes the WF resources to be released and the state to be persisted to SQL. As the long-running operation evented back status updates or a completion notification, the external service would update the related SharePoint task. Once the task is marked completed, the WF is rehydrated and continues executing.
The neat thing about this approach was that I could then create a dashboard that showed the status of all the long-running processes going on outside of SharePoint. Lastly, I packaged all of this up into a composite activity so that it didn't clutter up my pretty workflow diagrams.
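A hedged sketch of the hand-off point described above: when the long-running operation completes, the external proxy marks the related SharePoint task complete, which wakes the workflow persisted on OnTaskCompleted. The list name and field choices are illustrative:

using System;
using Microsoft.SharePoint;

// Hypothetical completion callback inside the external WCF proxy service.
public static class TaskCompleter
{
    public static void CompleteTask(string siteUrl, int taskId)
    {
        using (SPSite site = new SPSite(siteUrl))
        using (SPWeb web = site.OpenWeb())
        {
            SPList tasks = web.Lists["Workflow Tasks"]; // illustrative list name
            SPListItem task = tasks.GetItemById(taskId);

            // Marking the task completed raises the event the workflow's
            // OnTaskCompleted activity is waiting on; SharePoint rehydrates
            // the workflow from SQL and execution resumes.
            task[SPBuiltInFieldId.TaskStatus] = "Completed";
            task[SPBuiltInFieldId.PercentComplete] = 1.0;
            task.Update();
        }
    }
}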
SharePoint is ideally suited for this scenario. I would suggest using the Business Data Catalog (BDC) to access external data sources. It provides tremendous benefit, primarily by making your data source searchable, as well as providing out-of-the-box web parts to display the data with master-child relationships, filtering, and a rich API.
I would caution against making workflows too complex; instead, break the process into stages using smaller workflows, InfoPath and user actions to facilitate the entire process. This is where SharePoint really shines, as you can give others in the organization visibility into the process stages using dashboards (if that makes sense for your scenario), as well as collaboration, approvals... the list goes on.
I agree that SP can provide a nice WF engine, but let me ask this... are you storing anything IN SharePoint? (tasks, data sources, etc)
I ask because it may be as easy (and more appropriate) to run your own WF engine. If you are running all native WF functionality, and just need an engine, you can write a quick console app that can start workflows.
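As a minimal sketch of such a console host (WF 3.x; MyWorkflow is a placeholder for your own workflow type):

using System;
using System.Threading;
using System.Workflow.Activities;
using System.Workflow.Runtime;

// Placeholder workflow; substitute your own workflow type.
public class MyWorkflow : SequentialWorkflowActivity { }

class Program
{
    static void Main()
    {
        using (WorkflowRuntime runtime = new WorkflowRuntime())
        {
            AutoResetEvent done = new AutoResetEvent(false);
            runtime.WorkflowCompleted += (sender, e) => done.Set();
            runtime.WorkflowTerminated += (sender, e) =>
            {
                Console.WriteLine(e.Exception.Message);
                done.Set();
            };

            WorkflowInstance instance = runtime.CreateWorkflow(typeof(MyWorkflow));
            instance.Start();

            done.WaitOne(); // block until the workflow completes or terminates
        }
    }
}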
If you are using SP for anything beyond WF, then I absolutely agree to use SP.

Unit/Automated Testing in a workflow system

Do you do automated testing on a complex workflow system like K2?
We are building a system with extensive integration between SharePoint 2007 and K2. I can't even imagine where to start with automated testing, as the workflow involves multiple users interacting with SharePoint, K2 workflows and custom web pages.
Has anyone done automated testing on a workflow server like K2? Is it more effort than it's worth?
I'm having a similar problem testing a workflow-heavy MOSS-based application. Workflows in our case are based on Windows Workflow Foundation.
My idea is to mock pretty much everything that you can't control from unit tests: document storage, authentication, user rights and actions, and the SharePoint-specific parts of the workflows (these mocks should be thoroughly tested to mirror the behavior of the real components).
You use inversion of control to make the code choose which component to use at runtime: real or mock.
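A minimal sketch of that inversion-of-control seam; the interface and class names are hypothetical:

using System.Collections.Generic;

// Hypothetical seam: workflow code depends on an abstraction instead of
// talking to SharePoint directly.
public interface IDocumentStore
{
    byte[] Load(string documentId);
    void Save(string documentId, byte[] content);
}

public class ApprovalStep
{
    private readonly IDocumentStore store;

    // The real or mock implementation is chosen at runtime by the IoC
    // container (or handed in directly by a test).
    public ApprovalStep(IDocumentStore store)
    {
        this.store = store;
    }

    public void Archive(string documentId)
    {
        byte[] content = store.Load(documentId);
        store.Save("archive/" + documentId, content);
    }
}

// In-memory fake used by unit tests in place of SharePoint storage.
public class InMemoryDocumentStore : IDocumentStore
{
    private readonly Dictionary<string, byte[]> docs =
        new Dictionary<string, byte[]>();

    public byte[] Load(string documentId) { return docs[documentId]; }
    public void Save(string documentId, byte[] content) { docs[documentId] = content; }
}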
Then you can write system-wide tests of workflow behavior: setting up your own environment and checking how the workflow engine reacts. These tests are too big to call unit tests, but they are still automated testing.
This approach seems to work in trivial cases, but I still have to prove it is worth using in real-world workflows.
Here's the solution I use. It is a simple wrapper around the runtime that allows executing a single activity, simplifies passing the parameters, blocks the invoking thread until the workflow or activity is done, and translates/rethrows exceptions, if any. Since my workflow only sends or waits for messages through a custom workflow service, I can mock out the service to expect certain messages from the workflow and post certain messages to it, and with that I have real unit tests for my WF! The credit for the technique goes to Michael Kennedy.
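A hedged sketch of such a wrapper for WF 3.x, using ManualWorkflowSchedulerService so the activity runs and blocks on the calling thread; the names are illustrative:

using System;
using System.Collections.Generic;
using System.Workflow.Runtime;
using System.Workflow.Runtime.Hosting;

public static class WorkflowTestRunner
{
    // Runs a single workflow (or an activity wrapped in a workflow) to
    // completion on the calling thread, rethrowing any failure.
    public static void Run(Type workflowType, Dictionary<string, object> parameters)
    {
        using (WorkflowRuntime runtime = new WorkflowRuntime())
        {
            var scheduler = new ManualWorkflowSchedulerService();
            runtime.AddService(scheduler);

            Exception failure = null;
            runtime.WorkflowTerminated += (sender, e) => failure = e.Exception;

            WorkflowInstance instance =
                runtime.CreateWorkflow(workflowType, parameters);
            instance.Start();

            // Blocks until the instance is done; no background threads involved.
            scheduler.RunWorkflow(instance.InstanceId);

            if (failure != null)
            {
                throw new InvalidOperationException("Workflow failed.", failure);
            }
        }
    }
}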
If you are going to do unit testing, Typemock Isolator is the only tool that can currently mock SharePoint objects.
And by the way, Richard Fennell is working on a workflow mocking solution here.
We've just today written an application that monitors our K2 worklist, picks up certain tasks from it, fills in some data, and submits the tasks for completion. This allows us to perform automated testing, find regressions, and run through many different paths of the workflow in a fraction of the time it would take people to do it. I'd imagine a similar program could be written to pretend to be SharePoint.
As for the unit testing of the workflow items themselves, we have a DLL referenced from K2 which contains all of our line rules and processing logic. We don't have any code in the K2 workflows themselves; it is all referenced from these DLLs. This allows us to easily write unit tests against all of the individual line rules.
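A hedged sketch of such a worklist driver against the K2 blackpearl client API (SourceCode.Workflow.Client); the server name, data field and action name are placeholders:

using System;
using SourceCode.Workflow.Client;

class WorklistDriver
{
    static void Main()
    {
        using (Connection conn = new Connection())
        {
            conn.Open("k2-test-server"); // placeholder server name

            Worklist worklist = conn.OpenWorklist();
            foreach (WorklistItem item in worklist)
            {
                item.Open(); // allocate the item to this user

                // Fill in some data, then action the task to move it along.
                item.ProcessInstance.DataFields["Approved"].Value = true; // placeholder field
                item.Actions["Approve"].Execute(); // placeholder action name
            }
        }
    }
}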
I've done automated integration testing on K2 workflows using the K2ROM API (probably SourceCode.Workflow.Client if you're using K2 blackpearl).
Basically you start a process on a test server with a known folio (I generate a GUID), then use the management API to delete it afterwards. I wrote helper methods like AssertAtClientActivity (basically calls ProvideWorkItem with criteria).
Use the IsSynchronous parameter to StartProcessInstance, WorklistItem.Finish, etc., so that the relevant method calls do not return until the process instance has reached a stable state.
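A hedged sketch of that test pattern; the server and process names are placeholders:

using System;
using SourceCode.Workflow.Client;

class K2IntegrationTest
{
    static void Main()
    {
        // A known folio lets the test (and cleanup) find this instance later.
        string folio = Guid.NewGuid().ToString();

        using (Connection conn = new Connection())
        {
            conn.Open("k2-test-server"); // placeholder server name

            ProcessInstance instance =
                conn.CreateProcessInstance(@"TestProject\ApprovalProcess"); // placeholder
            instance.Folio = folio;

            // Passing true for the sync flag blocks until the instance
            // reaches a stable state (e.g., waiting at a client activity).
            conn.StartProcessInstance(instance, true);

            // Assertion helpers like AssertAtClientActivity would query the
            // worklist here; afterwards, the management API deletes the
            // instance by folio.
        }
    }
}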
Expect tests to be slow and to occasionally fail. These are not unit tests.
If you want to write unit tests against other systems, you'll probably want to wrap the K2 API.
Consider looking at Windows Workflow 4 and the new workflow features in SharePoint 2010. You may not need K2.
