Dynamics CRM: Scheduling workflows - cron

In my CRM I have applications that should be checked and processed by a workflow once a minute.
I was wondering if there is any way to automate this using some sort of cron task or scheduler. I'm relatively new to CRM.
What should I do to accomplish the above using standard CRM tools or third-party plugins?
Sultan.

CRM doesn't have a good way of handling this. Here are the options generally available inside CRM:
1. Create a workflow that runs, checks what you need it to do, waits for a period of time, and calls itself recursively. If the interval you needed to check at were longer than a minute, this might work; however, CRM has loop detection built into workflows, and running one every minute will definitely trigger it.
2. Create an entity that represents one of your processes. Create a workflow that kicks off on create of this entity, waits one minute, and then creates a new record of your entity. This way the workflow isn't calling itself recursively, so it shouldn't trigger CRM's loop detection. However, in this scenario you're creating a lot of dummy records and workflow instances that you'll need to clean up.
I think both of these are kind of hacky. I would say that if you need to check something once every minute, I'd put it outside CRM in a Windows Service or a Scheduled Task. CRM just doesn't have this capability built in.
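To give an idea of the external approach: a small console app, run every minute by Windows Task Scheduler, could look roughly like the sketch below. It assumes the CRM 2011 SDK, and the entity, field, and workflow names are placeholders for whatever your "applications" actually are.

// Sketch only: poll CRM for unprocessed records and kick off an
// on-demand workflow for each one. Names are hypothetical.
using System;
using System.ServiceModel.Description;
using Microsoft.Crm.Sdk.Messages;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;
using Microsoft.Xrm.Sdk.Query;

class PollApplications
{
    static void Main()
    {
        var credentials = new ClientCredentials();
        credentials.Windows.ClientCredential =
            System.Net.CredentialCache.DefaultNetworkCredentials;

        using (var service = new OrganizationServiceProxy(
            new Uri("https://crm.example.com/MyOrg/XRMServices/2011/Organization.svc"),
            null, credentials, null))
        {
            // Find records that still need processing (field name is made up).
            var query = new QueryExpression("new_application");
            query.Criteria.AddCondition("new_processed", ConditionOperator.Equal, false);

            foreach (var app in service.RetrieveMultiple(query).Entities)
            {
                // Run an existing on-demand workflow against each record.
                service.Execute(new ExecuteWorkflowRequest
                {
                    WorkflowId = new Guid("00000000-0000-0000-0000-000000000000"), // your workflow's id
                    EntityId = app.Id
                });
            }
        }
    }
}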

Related

Document-centric event scheduling on Azure

I'm aware of the many different ways of scheduling system-centric events in Azure. E.g. Azure Scheduler, Logic Apps, etc. These can be used for things like backups, sending batch emails, or other maintenance functions.
However, I'm less clear on what technology is available for events relating to a large volume of documents or records.
For example, imagine I have 100,000 documents in Cosmos and some of the datetime properties on those documents relate to events: e.g. expiry, reminders, escalations, timeouts, etc. Each record has a different set of dates and times.
What approaches are there to fire off code whenever one of those datetimes is reached?
Stuff I've thought of so far:
- Have a scheduled task that runs once per minute, looks for anything relating to that particular minute in Cosmos, and then does "stuff".
- Schedule tasks on Service Bus queues with a future date as-and-when the Cosmos records are created, and then have something receive those messages and do "stuff".
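To illustrate the second idea, here's roughly what I have in mind, assuming the Azure.Messaging.ServiceBus SDK (queue name and payload are made up):

using System;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

class ScheduleDemo
{
    static async Task Main()
    {
        var client = new ServiceBusClient("<service-bus-connection-string>");
        ServiceBusSender sender = client.CreateSender("document-events");

        // As each Cosmos record is created, enqueue a message that only
        // becomes visible at the record's due time.
        var message = new ServiceBusMessage("{\"documentId\":\"doc-123\",\"event\":\"expiry\"}");
        long sequenceNumber = await sender.ScheduleMessageAsync(
            message, new DateTimeOffset(2024, 1, 1, 10, 5, 0, TimeSpan.Zero));

        // If the record's dates change, the scheduled message can be
        // cancelled by sequence number and re-scheduled.
        await sender.CancelScheduledMessageAsync(sequenceNumber);
    }
}

A queue-triggered function or receiver would then pick up each message when it fires and do the "stuff".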
But are there better ways of doing this? Is there a ready-made Azure service that would take away much of the background infrastructure work and just let me schedule a single one-off event at a particular point in time and hit a webhook or something like that?
Am I mis-categorising Azure Scheduler as something that you'd use for a handful of regularly scheduled tasks rather than the mixed bag of dates and times you'd find in 100,000 Cosmos records?
FWIW, in my use-case there isn't really a precision issue - stuff scheduled for 10:05:00 happening at 10:05:32 would be perfectly acceptable, for example.
Appreciate your thoughts.
First of all, Azure Scheduler will be replaced by Azure Logic Apps:
Azure Logic Apps is replacing Azure Scheduler, which is being retired. To schedule jobs, follow this article for moving to Azure Logic Apps instead.
(source)
That said, Azure Logic Apps is one of your options, since you can define a logic app that starts a one-time job by using a delay activity. See the docs for details.
It scales very well and you can pay for what you use (or use a fixed pricing model).
Another option is using a Durable Azure Function with a timer in it. Once the timer elapses, you do your thing. You can use a consumption plan, so you pay only for what you use, or a fixed pricing model. It also scales very well, so hundreds of those instances won't be a problem.
In both cases you have to trigger the function or logic app when the Cosmos records are created. Put the due time as context in the trigger and there you go.
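A sketch of the Durable Functions option (assumes Durable Functions 2.x; the function and activity names are mine):

using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public static class DocumentEventOrchestration
{
    [FunctionName("DocumentEventOrchestrator")]
    public static async Task Run([OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        // The due time is passed as input when the orchestration is
        // started, e.g. from a Cosmos DB change feed trigger.
        DateTime dueTime = context.GetInput<DateTime>();

        // Durable timer: the function is unloaded while it waits, so on a
        // consumption plan you only pay when it actually runs.
        await context.CreateTimer(dueTime, CancellationToken.None);

        // Do your thing once the due time is reached.
        await context.CallActivityAsync("ProcessDocumentEvent", context.InstanceId);
    }
}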
Now, given your statement
I'm aware of the many different ways of scheduling system-centric events in Azure. E.g. Azure Scheduler, Logic Apps, etc. These can be used for things like backups, sending batch emails, or other maintenance functions.
That is up to you; you can do anything you want. You don't specify in your question what work needs to be done when the due time is reached, but I doubt it is something you can't do with those services.

Azure hosting Workflow Activity with Persistence for Windows8 Push Notification

I'm completely new to the Windows Azure and Windows Workflow scope of things.
But basically, what I'm trying to implement is a cloud web app that's responsible for pushing tile updates, badges, and toast notifications down to my Windows 8 application.
The code to run to send down the tile notification etc is fine, but needs to be executed every hour or so.
I decided the most straightforward approach was to make an MVC application with a WebAPI; this WebAPI is responsible for receiving the ChannelURI sent by the Modern application and storing it in SQL Azure.
There will then be a class that has a static method which does the logic for gathering the new data and generating a new Tile/Badge/Toast.
I've created a simple Activity workflow that has a Sequence with a DoWhile(true) activity. The body of this DoWhile contains a Sequence with an InvokeMethod and a Delay; the InvokeMethod calls my class containing the static method, and the Delay is set to one hour.
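In code, the composed activity looks roughly like this (a sketch; MyNotifier stands in for my class with the static method):

using System;
using System.Activities;
using System.Activities.Expressions;
using System.Activities.Statements;

public sealed class NotificationActivity : Activity
{
    public NotificationActivity()
    {
        Implementation = () => new DoWhile
        {
            Condition = new Literal<bool>(true), // loop forever
            Body = new Sequence
            {
                Activities =
                {
                    new InvokeMethod
                    {
                        TargetType = typeof(MyNotifier),   // my static class
                        MethodName = "SendNotifications"   // my static method
                    },
                    new Delay { Duration = TimeSpan.FromHours(1) }
                }
            }
        };
    }
}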
So that seems to be all okay. I then start this Activity via Application_Start in Global.asax with the following lines:
// InvokeAsync returns immediately; the workflow keeps running on a background thread.
this.ActivityInvoker = new WorkflowInvoker(new NotificationActivity());
this.ActivityInvoker.InvokeAsync();
So I just tested it with that and it seems to be running my custom static method at the set interval.
That's all good, but now I have three questions in relation to this way of handling it:
1. Is this the correct/best approach? If not, what are some other ways I should look into?
2. If a new instance is spun up on Azure, how do I ensure that the running workflows on the two instances won't step on each other's toes? I.e., how do I make sure the InvokeMethod won't run twice? I only want it to run once an hour, regardless of how many instances there are.
3. How do I ensure that if the instances crash or go down, the workflow's state is maintained?
Any help, guidance, etc is much appreciated.
A couple of good questions that I would love to answer, though doing them justice on a forum like this is difficult. But let's give it a crack, to start with at least.
1) There is nothing wrong with your approach for implementing a scheduled task. I can think of a few other ways of doing it, like running a simple Worker Role with a Do { Thread.Sleep(...); ... } loop: simple, but effective. There are more complex/elegant ways too, including using external libraries and frameworks for scheduling tasks in Azure.
2) You would need to implement some sort of singleton pattern in your workflow/job processing engine. You could, for instance, acquire a lease on a 1 KB blob when your job starts and not allow another instance to start while the lease is held.
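A rough sketch of that blob-lease idea, using the classic Microsoft.WindowsAzure.Storage SDK (container and blob names are made up):

using System;
using System.Threading;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class SingletonWorker
{
    public void Run()
    {
        CloudStorageAccount account = CloudStorageAccount.Parse("<connection-string>");
        CloudBlobContainer container = account.CreateCloudBlobClient()
            .GetContainerReference("locks");
        container.CreateIfNotExists();

        CloudBlockBlob blob = container.GetBlockBlobReference("scheduler-lock");
        if (!blob.Exists())
            blob.UploadText(string.Empty); // the blob must exist before it can be leased

        while (true)
        {
            try
            {
                // Non-infinite leases last 15-60 seconds; only one instance
                // can hold the lease, so only one does the work. (Renew the
                // lease if the work can take longer than the lease duration.)
                string leaseId = blob.AcquireLease(TimeSpan.FromSeconds(60), null);
                try
                {
                    DoScheduledWork(); // your notification logic
                }
                finally
                {
                    blob.ReleaseLease(AccessCondition.GenerateLeaseCondition(leaseId));
                }
            }
            catch (StorageException)
            {
                // Another instance holds the lease; skip this cycle.
            }
            Thread.Sleep(TimeSpan.FromHours(1));
        }
    }

    private void DoScheduledWork() { /* ... */ }
}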
For more detailed answers I suggest we take this offline and have a Skype call and discuss in detail your requirements. You know how to get hold of me via email :) look forward to it.

Scheduling tasks in Microsoft CRM 2011

I need to schedule the import/ export of contacts in Microsoft CRM 2011 (online and on premises).
I plan to create a custom entity to store the scheduled tasks, and a form to set them up (similar to Windows Task Scheduler).
I am not sure how can I actually execute the scheduled tasks. Does CRM 2011 have a service or API I could use to schedule tasks? The solution must work in CRM 2011 online and on premises. Thank you so much.
Directly from a former Microsoft product team member (Gonzalo Ruiz),
there is no Out-of-the-Box scheduling engine in CRM.[1]
So the answer is no. I recently asked a similar question, and for several reasons, our team decided the best way to go was solution 1: an external task manager (Windows has a few native solutions for this), which would work for both on-premise and online versions. Drawback: you should probably have a reliable server-type machine that you could host the task manager on.
As linked, you can use solution 2, recurring workflows, to achieve a similar result, but there are some drawbacks to this route as well, some of which are mentioned in Gonzalo's blog.
As Peter mentioned, recurring workflows can help here. Setting up a workflow as a child workflow that calls itself after a suitable timeout can create the required conditions.
You can potentially have a configuration entity within CRM that stores the "time to next run", with the workflow triggered to run on update of this attribute (this is useful if the scheduling period is likely to be non-linear). If the timescales are linear, you can just implement the required timescales in the workflow, or have the workflow update the aforementioned attribute before completion so that the child invocation waits for the appropriate period.
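As an illustration, bumping such an attribute from code so the update re-triggers the workflow might look like this (the entity and field names are hypothetical):

using System;
using Microsoft.Xrm.Sdk;

public static class ScheduleHelper
{
    // Updating new_nextruntime fires the workflow registered on that attribute.
    public static void ScheduleNextRun(IOrganizationService service, Guid configId, TimeSpan interval)
    {
        var config = new Entity("new_schedulerconfig") { Id = configId };
        config["new_nextruntime"] = DateTime.UtcNow.Add(interval);
        service.Update(config);
    }
}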

Should we use the SharePoint WF host for workflows that include external (to SharePoint) data sources?

We need to build a couple applications that require fairly advanced workflow functionality. The plan is to store the data in SQL Server, use Windows Workflow Foundation as the workflow engine, and build the frontend using an RIA technology such as Flex or Silverlight.
We already have SharePoint 2007 set up, and some of us (including me) have a little experience creating custom SharePoint workflows that work with data in SharePoint lists.
My question is, would it make sense to use SharePoint for the workflow while the actual data is stored outside SharePoint in a separate database? We need the task, authentication, and email functionality of SharePoint, but our data model is a bit complex, so we'd rather not store the data in SharePoint. We'd rather not start from scratch with Workflow Foundation, because SharePoint already gives us 90% of the functionality we need.
Any thoughts / advice?
I think this is a great example of using SharePoint as a development platform, and I don't see any conceptual problems with using it in the way you describe. One thing you might want to keep in mind: if you want the workflow to continue based on events happening in the separate database, you may have to update the workflow task items from an external program.
Your use case is a perfect fit and one that SharePoint adds great value to. I would highly recommend using SharePoint to host your workflows.
I have developed many SharePoint-hosted WF workflows, and the only real problem I ever experienced was making calls to long-running web services (asynchronous operations), as SharePoint's WF host has some limitations on the type of external providers it can listen for events from.
The solution that I developed (which was a bit of a hack at first but ended up being of some value to my customers) was to create a service proxy (WCF) that sat outside of SharePoint and would route calls to remote services and wait for their responses. In parallel to making that asynchronous call, a Parallel activity would create a SharePoint task associated with the asynchronous operation. The WF would then stop on an OnTaskCompleted activity, which causes the WF's resources to be released and its state to be persisted to SQL.
As the long-running operation sent back status updates or a completion notification, the external service would update the related SharePoint task. Once the task was marked completed, the WF was rehydrated and continued executing. The neat thing about this approach was that I could then create a dashboard showing the status of all the long-running processes going on outside of SharePoint. Lastly, I packaged all of this up into a composite activity so that it didn't clutter my pretty workflow diagrams.
SharePoint is ideally suited for this scenario. I would suggest using the Business Data Catalog (BDC) to access external data sources. It provides a tremendous benefit, primarily by making your data source searchable, and it provides OOB web parts to display the data with master-child relationships, filtering, and a rich API.
I would caution against making workflows too complex; instead, break the process into stages using smaller workflows, InfoPath, and user actions to facilitate the entire process. This is where SharePoint really shines, as you can give others in the organization visibility into the process stages using dashboards (if that makes sense for your scenario), as well as collaboration, approvals... the list goes on.
I agree that SP can provide a nice WF engine, but let me ask this... are you storing anything IN SharePoint? (tasks, data sources, etc)
I ask because it may be as easy (and more appropriate) to run your own WF engine. If you are running all native WF functionality, and just need an engine, you can write a quick console app that can start workflows.
If you are using SP for anything beyond WF, then I absolutely agree to use SP.

Unit/Automated Testing in a workflow system

Do you do automated testing on a complex workflow system like K2?
We are building a system with extensive integration between Sharepoint 2007 and K2. I can't even imagine where to start with automated testing as the workflow involves multiple users interacting with Sharepoint, K2 workflows and custom web pages.
Has anyone done automated testing on a workflow server like K2? Is it more effort than it's worth?
I'm having a similar problem testing a workflow-heavy MOSS-based application. Workflows in our case are based on WF.
My idea is to mock pretty much everything you can't control from unit tests: document storage, authentication, user rights and actions, and the SharePoint-specific parts of the workflows. These mocks should be thoroughly tested to mirror the behavior of the real components.
You use inversion of control to let the code choose which component to use at runtime: real or mock.
Then you can write system-wide tests to test workflow behavior: setting up your own environment and checking how the workflow engine reacts. These tests are too big to call unit tests, but it is still automated testing.
This approach seems to work on trivial cases, but I still have to prove it is worth using in real-world workflows.
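For illustration, the shape of the idea (all interface and class names here are made up):

using System;
using System.Collections.Generic;

// The workflow code only sees this interface, resolved at runtime
// from an IoC container.
public interface IDocumentStore
{
    string GetDocumentStatus(Guid documentId);
    void ApproveDocument(Guid documentId, string approver);
}

// Production implementation would call into SharePoint here.
public class SharePointDocumentStore : IDocumentStore
{
    public string GetDocumentStatus(Guid documentId)
    {
        throw new NotImplementedException("calls the SharePoint object model");
    }
    public void ApproveDocument(Guid documentId, string approver)
    {
        throw new NotImplementedException("calls the SharePoint object model");
    }
}

// Scripted fake used by the automated tests; thoroughly tested to
// mirror the behavior of the real component.
public class FakeDocumentStore : IDocumentStore
{
    private readonly Dictionary<Guid, string> statuses = new Dictionary<Guid, string>();
    public string GetDocumentStatus(Guid documentId)
        => statuses.TryGetValue(documentId, out var s) ? s : "Unknown";
    public void ApproveDocument(Guid documentId, string approver)
        => statuses[documentId] = "Approved";
}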
Here's the solution I use. It is a simple wrapper around the runtime that allows executing a single activity, simplifies passing parameters, blocks the invoking thread until the workflow or activity is done, and translates/rethrows exceptions, if any. Since my workflow only sends or waits for messages through a custom workflow service, I can mock out the service to expect certain messages from the workflow and post certain messages to it, and there I have real unit tests for my WF! The credit for the technique goes to Michael Kennedy.
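Conceptually, such a wrapper boils down to something like this (a simplified sketch, not the original code; ApproveOrder is a hypothetical activity):

using System;
using System.Activities;
using System.Collections.Generic;

// Runs a single activity with inputs, blocks until it completes,
// and surfaces its out-arguments.
public static class ActivityTestHarness
{
    public static IDictionary<string, object> Run(Activity activity,
        IDictionary<string, object> inputs)
    {
        // WorkflowInvoker.Invoke blocks the calling thread until the
        // activity finishes and rethrows any unhandled exception.
        return WorkflowInvoker.Invoke(activity, inputs);
    }
}

// Usage in a unit test:
// var outputs = ActivityTestHarness.Run(new ApproveOrder(),
//     new Dictionary<string, object> { { "OrderId", 42 } });
// Assert.AreEqual(true, outputs["Approved"]);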
If you are going to do unit testing, Typemock Isolator is the only tool that can currently mock SharePoint objects.
And by the way, Richard Fennell is working on a workflow mocking solution here.
We've just today written an application that monitors our K2 worklist, picks up certain tasks from it, fills in some data, and submits the tasks for completion. This allows us to perform automated testing, find regressions, and run through many different paths of the workflow in a fraction of the time it would take people to do it manually. I'd imagine a similar program could be written to pretend to be SharePoint.
As for unit testing the workflow items themselves, we have a DLL referenced from K2 that contains all of our line rules and processing logic. We don't have any code in the K2 workflows themselves; it is all referenced from these DLLs. This allows us to easily write unit tests covering all of the individual line rules.
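For example, a line rule living in such a DLL can be tested without K2 at all (NUnit-style; the rule itself is hypothetical):

using NUnit.Framework;

// Hypothetical line rule from the shared DLL, testable without K2.
public static class ApprovalRules
{
    public static bool RequiresManagerApproval(decimal amount) => amount >= 10000m;
}

[TestFixture]
public class ApprovalRulesTests
{
    [Test]
    public void AmountsOfTenThousandOrMoreRequireManagerApproval()
    {
        Assert.IsTrue(ApprovalRules.RequiresManagerApproval(10000m));
        Assert.IsFalse(ApprovalRules.RequiresManagerApproval(9999m));
    }
}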
I've done automated integration testing on K2 workflows using the K2ROM API (probably SourceCode.Workflow.Client if you're using K2 blackpearl).
Basically you start a process on a test server with a known folio (I generate a GUID), then use the management API to delete it afterwards. I wrote helper methods like AssertAtClientActivity (basically calls ProvideWorkItem with criteria).
Use the IsSynchronous parameter to StartProcessInstance, WorklistItem.Finish, etc. so that relevant method calls will not return until the process instance has reached a stable state.
Expect tests to be slow and to occasionally fail. These are not unit tests.
If you want to write unit tests against other systems, you'll probably want to wrap the K2 API.
Consider looking at Windows Workflow 4 and the new workflow features in SharePoint 2010. You may not need K2.
