This might seem like a silly thing to say, so I'll clarify: the problem is with the final branch in a parallel activity. It's a parallel activity with three branches, each containing a simple Create Task, OnTaskChanged and Complete Task. The branch containing the task that is last to complete seems to break; every task works in its own right, but the last one to finish encounters a problem.
Say the user clicks the final task's link to open the attached InfoPath form and submits it. Execution reaches the event handler for that OnTaskChanged, where a taskCompleted variable gets set to true, which should exit the while loop. I've successfully hit a breakpoint on that line, so I know that happens. However, the final activity in that branch, the CompleteTask, never gets hit.
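For reference, a minimal sketch of the pattern being described; handler and field names here are illustrative, not the actual project's:

```csharp
// Minimal sketch of the described pattern; handler and field names are illustrative.
using System;
using System.Workflow.Activities;

public partial class ApprovalWorkflow : SequentialWorkflowActivity
{
    private bool taskCompleted;

    // Wired to the branch's OnTaskChanged activity (its Invoked event).
    private void onTaskChanged_Invoked(object sender, ExternalDataEventArgs e)
    {
        taskCompleted = true;   // the flag the while loop tests
    }

    // Wired to the WhileActivity's CodeCondition.
    private void whileTaskNotCompleted_Condition(object sender, ConditionalEventArgs e)
    {
        e.Result = !taskCompleted;   // keep looping until the task is completed
    }
}
```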
When submit is clicked in the final form, the "operation in progress" screen stays up for quite a while before returning to the workflow status page. The task that was opened and submitted still says "Not Started".
I can disable any of the branches to leave only two, but the same problem happens with the last one to be completed. Earlier in the workflow I do essentially the same thing: I have another three-branch parallel activity with each branch containing a task. That one works correctly, which leads me to believe it might be a problem with having two parallel activities in the same sequential workflow.
I've considered the possibility that it might be a correlation token problem. The token that each task branch uses is unique to that branch, and its OwnerActivityName is set to that of the branch. It stands to reason that if the taskCompleted variable is indeed getting set to true but the while loop isn't being exited, then there's a crossed wire with the variable somewhere. However, I'd still have thought that the task status back on the workflow status page would at least say the task is In Progress.
This is a frustrating show-stopper of a bug for me. Any thoughts or suggestions would be much appreciated so I can investigate them.
My workflow scenario is to reassign a task to its originator after the task's due date expires, by firing a delay activity.
In my workflow I have a parallel ReplicatorActivity which is used to assign (create) different tasks to different users at the same time. Inside the replicator I use a ListenActivity: in the left branch there is an OnTaskChanged activity + ... + completeTask1, and in the right branch of the ListenActivity there is a DelayActivity followed by a completeTask2 activity and a Code activity to reassign the task to the task originator. I'm sure the correlation tokens on both CompleteTask activities are right. Everything works fine on the left branch, but an error occurs in the right branch, the one containing the DelayActivity followed by the CompleteTask.
Let's say we have two tasks assigned to two users and they have one hour to complete their tasks, but they don't, so the DelayActivity fires for both tasks. The first task then gets completed in the workflow, but the second task produces an error.
I think the problem is with the TaskId property of the CompleteTask: it doesn't get updated with the second task's id, so it tries to complete a task which has already been completed.
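If that is the cause, one hedged sketch of keeping the TaskId separate per replicator child (activity and handler names are illustrative) is to generate the GUID in the CreateTask's MethodInvoking and bind the CompleteTask's TaskId to that same CreateTask in the designer, rather than to a shared workflow-level field:

```csharp
// Hedged sketch; the activity name "createTask" and handler wiring are illustrative.
// In the designer, completeTask2.TaskId would be bound (ActivityBind) to this
// CreateTask's TaskId rather than to a Guid field shared by both replicator children.
using System;
using System.Workflow.Activities;
using Microsoft.SharePoint.WorkflowActions;

public partial class ReassignWorkflow : SequentialWorkflowActivity
{
    // MethodInvoking fires once per replicator child, so each task gets its own GUID.
    private void createTask_MethodInvoking(object sender, EventArgs e)
    {
        CreateTask createTask = (CreateTask)sender;
        createTask.TaskId = Guid.NewGuid();
    }
}
```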
I'm trying to find a release task that I can place inside a Release Pipeline Stage's workflow to intentionally stop the Release.
I have a large provisioning pipeline with 50 tasks per Stage (and 6 Stages). Barring a few exceptions, the Stages have identical Tasks and differ only in their variables.
After the 10th task, if a variable is true, I want to stop the Release. This is not a failed Release (so I don't want to mark the Release as failed); it just means that after the 10th task there is legitimately nothing more to do.
I see lots of information saying create a condition on an existing Task so that the Task only runs when the condition evaluates to true.
This Microsoft documentation suggests to me that on Tasks 11 to 50 I would need a custom condition that says "only run if variable = true". I might have misunderstood the behaviour and there might be another way to achieve the same result.
Why do I want a Task and not a Condition?
Conditions seem to cater for pre-condition scenarios, not post-condition scenarios. If it has to be a condition, I'd rather say "stop the release successfully after the 10th task has completed successfully and the variable = XYZ" using a post-condition such as:
eq(variables['RunTasks11To50'], 'True')
It's a pain to do this 40 times as a pre-condition (11th task onwards), and it is also error-prone, as the condition is not obvious without drilling into the task (unlike a disabled task, which is greyed out).
If there was a "Stop Release" task that allows the Release to legitimately stop then I wouldn't need to add conditions on Tasks 11 to 50.
Alternatively maybe if there was a "Gate" task that allowed the release to pause and require a confirmation to continue that might work too.
My concern is that I'm going to need to write a condition eq(variables['RunTasks11To50'], 'True') on 40 tasks multiplied by 6 stages (6 environments).
What have I considered?
Writing a PowerShell task to call the Azure DevOps REST API to cancel my own Release
Somehow disabling Tasks 11 to 50 at runtime (again probably requiring a PowerShell task calling the DevOps REST API)
Wondering if I'm looking for a complicated answer when there's something obvious and simple I've missed.
Thanks for any advice.
Unfortunately, there is no "Gate" task to stop the release.
Alternatively maybe if there was a "Gate" task that allowed the release to pause and require a confirmation to continue that might work too.
We could set Post-deployment conditions or Pre-deployment conditions to add Approvers.
We recommend using a condition on Tasks 11 to 50 in the release pipeline, and using a condition on Stage 2 to Stage 6 to skip those stages' tasks.
Also, we could add a PowerShell task to call the REST API to cancel the release.
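As a rough illustration of that last suggestion (shown in C# rather than PowerShell; the endpoint shape, api-version and use of System.AccessToken are assumptions to check against the current Azure DevOps REST documentation):

```csharp
// Rough sketch only: cancel the current release environment via the Release REST API.
// The endpoint, api-version and use of System.AccessToken are assumptions to verify.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class CancelRelease
{
    static async Task Main()
    {
        // In a real pipeline task these would come from predefined variables
        // (Release.ReleaseId, Release.EnvironmentId, System.AccessToken).
        string baseUrl = "https://vsrm.dev.azure.com/{organization}/{project}";
        string releaseId = "{releaseId}";
        string environmentId = "{environmentId}";
        string accessToken = Environment.GetEnvironmentVariable("SYSTEM_ACCESSTOKEN");

        string url = $"{baseUrl}/_apis/release/releases/{releaseId}" +
                     $"/environments/{environmentId}?api-version=7.1";

        using HttpClient client = new HttpClient();
        client.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);

        // "canceled" is intended to stop the in-progress deployment of this environment.
        var body = new StringContent("{\"status\":\"canceled\"}", Encoding.UTF8, "application/json");
        HttpResponseMessage response = await client.PatchAsync(url, body);
        Console.WriteLine($"{(int)response.StatusCode} {response.ReasonPhrase}");
    }
}
```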
I have a Windows Delphi application that receives events. On each of these events I'd like to run a task in parallel (so I can be ready for the following event). There are many ways to do this through the OmniThreadLibrary's abstractions.
The issue is that part of my code needs to be executed immediately after the reception of the event (basically to "decode" the event's params), and another part needs to be executed a few seconds later, only under the condition that nothing new has happened for the same context.
This behaviour should amount to "only store this new value if it lasts longer than 3000 ms, otherwise just cancel it".
So what I need is a way to "cancel" a running task (the one waiting 3000 ms) if a new event arrives with the same context.
I cannot use the pipeline abstraction because when the first stage ends, it automatically fills the second stage's queue without asking me whether I want to cancel it or not.
Is that possible?
Thank you.
Sounds like you need a Dictionary<Context, Event> where the events also carry a "created" timestamp property, and a background thread which continuously checks whether there are event entries in this dictionary with elapsed time > 3000 ms.
Incoming events update the timestamp and event params, until the thread detects an entry which matches the condition and then extracts the entry from the dictionary.
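Here is a minimal sketch of that idea, written in C# for illustration since it is easier to show compactly than the OmniThreadLibrary equivalent; all type and member names are made up:

```csharp
// Minimal illustration of the dictionary + background-thread debounce described above.
using System;
using System.Collections.Generic;
using System.Threading;

class DebounceStore<TContext, TEvent>
{
    private sealed class Entry
    {
        public TEvent Value;
        public DateTime LastSeen;   // reset whenever a new event arrives for the context
    }

    private readonly Dictionary<TContext, Entry> _pending = new Dictionary<TContext, Entry>();
    private readonly object _gate = new object();
    private readonly TimeSpan _quietPeriod = TimeSpan.FromMilliseconds(3000);
    private readonly Action<TContext, TEvent> _store;   // runs only for values that survived 3000 ms

    public DebounceStore(Action<TContext, TEvent> store)
    {
        _store = store;
        new Thread(Poll) { IsBackground = true }.Start();
    }

    // Call this right after decoding the event params; it overwrites (i.e. cancels)
    // any not-yet-stored value for the same context.
    public void Push(TContext context, TEvent evt)
    {
        lock (_gate)
        {
            _pending[context] = new Entry { Value = evt, LastSeen = DateTime.UtcNow };
        }
    }

    private void Poll()
    {
        while (true)
        {
            var matured = new List<KeyValuePair<TContext, TEvent>>();
            lock (_gate)
            {
                var expiredKeys = new List<TContext>();
                foreach (var pair in _pending)
                {
                    if (DateTime.UtcNow - pair.Value.LastSeen >= _quietPeriod)
                    {
                        matured.Add(new KeyValuePair<TContext, TEvent>(pair.Key, pair.Value.Value));
                        expiredKeys.Add(pair.Key);
                    }
                }
                foreach (var key in expiredKeys) _pending.Remove(key);
            }
            foreach (var pair in matured) _store(pair.Key, pair.Value);   // commit outside the lock
            Thread.Sleep(100);
        }
    }
}
```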
I am new to SharePoint. Sorry if the answer to my questions is obvious.
First question: I have a state machine workflow which creates about 30 tasks (some of them are created after a previous one completes, using an OnTaskChanged activity). I have to log task changes and complete the workflow when all tasks are completed. I see two ways to do it:
1) I can create an EventDrivenActivity for each of the 30 tasks:
OnTaskChanged
--Code (log changes)
--If (allTasksCompleted()) //not code but an IfElse activity, which uses
----then SetState(Completed); //the allTasksCompleted() condition from code (a sketch follows below)
But I don't think this is a good way, because I can't reuse the code and I'd have to repeat the same steps 30 times.
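A hedged sketch of what the allTasksCompleted() check could look like, assuming the designer-generated workflowProperties field and that completion is reflected in the task items' Status column:

```csharp
// Hedged sketch of the allTasksCompleted() check referenced above; it assumes the
// designer-generated 'workflowProperties' field and that completion is reflected
// in the task items' Status column.
using System;
using Microsoft.SharePoint.Workflow;

public partial class MyStateMachineWorkflow
{
    public SPWorkflowActivationProperties workflowProperties =
        new SPWorkflowActivationProperties();   // normally generated by the project template

    private bool AllTasksCompleted()
    {
        // Enumerate only the tasks that belong to this workflow instance.
        SPWorkflow workflow = new SPWorkflow(workflowProperties.Item, workflowProperties.WorkflowId);
        foreach (SPWorkflowTask task in workflow.Tasks)
        {
            if (!string.Equals((string)task["Status"], "Completed", StringComparison.OrdinalIgnoreCase))
            {
                return false;   // at least one task is still open
            }
        }
        return true;
    }
}
```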
2) I can do the logging in code and then, if needed, complete the workflow from code, but I don't know how to do that. I can cancel the workflow from code:
SPWorkflowManager.CancelWorkflow(itemWorkflow);
but I can't find any information on how to complete it (or set its state to "Completed"). Maybe I am doing something wrong and the workflow is supposed to complete itself when all tasks are completed, but that does not happen (it stays "In Progress").
Second question: Is there any way to run some code after every change to the workflow's tasks (as far as I understood, OnWorkflowChanged and OnWorkflowModified are not suitable for my needs), or to programmatically add a handler to my 30 tasks (not to the Tasks list as a whole, but only for my tasks)?
Thank you in advance.
Best regards
Mikhail.
PS: sorry for my writing. English is not my native language.
I have a state machine workflow that I created in VS2010 for SP2010.
I would like the WF to utilize two different task lists: one task list for tracking approvals and one for tracking assigned deliverables. However, it seems like workflowProperties.TaskList is locked down and I am unable to modify the task list "location" as the workflow progresses from state to state.
Is it possible to have a custom workflow utilize multiple task lists? How do I do it?
My main reason for doing it this way is that I want simplified status fields for my tasks. On the approval task list the only status values will be Accepted, Rejected, Escalated and Pending, while on my deliverable task list I would like the default choices of Not Started, In Progress, Complete, Waiting and Deferred. If there is a better way than using two different customized task lists, I am open to suggestions.
Same problem here. I'm creating a state machine workflow.
It is possible to create a task by just adding an item to the list. I mean not using Microsoft.SharePoint.WorkflowActions.CreateTask(), but just adding an SPListItem to an SPList.
The advantage is that you can add a task to any list at any site level. But is it possible to tie these tasks to an OnTaskChanged()? If not, is it a good idea to use an EventReceiver which triggers my workflow steps?
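To illustrate the plain-list-item approach being described (site URL, list name and field values below are made-up examples):

```csharp
// Illustrative only: creating a "task" as a plain SPListItem instead of using
// Microsoft.SharePoint.WorkflowActions.CreateTask(). Site URL, list name and
// field values are made-up examples.
using Microsoft.SharePoint;

class Program
{
    static void Main()
    {
        using (SPSite site = new SPSite("http://server/sites/demo"))
        using (SPWeb web = site.OpenWeb())
        {
            SPList taskList = web.Lists["Deliverable Tasks"];
            SPListItem task = taskList.Items.Add();
            task["Title"] = "Review the specification";
            task["AssignedTo"] = web.EnsureUser(@"DOMAIN\someone");
            task["Status"] = "Not Started";
            task.Update();
        }
        // Such an item is not correlated to any workflow instance, so OnTaskChanged()
        // will not fire for it; an SPItemEventReceiver on the list is the usual way
        // to react to changes instead.
    }
}
```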
I am trying to create the following scenario:
a task gets assigned to the user to complete
a task gets created for the manager to reassign the user's task if necessary (don't ask, they wanted it this way)
an email reminder needs to be sent when the task is nearing its due date
So, I thought of using EventHandlingScope for this:
I am listening for a task change on the main branch of the EventHandlingScope activity,
listening for the reassign-task change in an event-driven branch, and if the reassign task gets activated, reassigning the first task to the specified user,
and in another event-driven branch, using a delay activity to periodically check whether the user's task is nearing its due date and send an email reminder.
So, I thought an EventHandlingScope would be good for this, and it mostly is, except for the problem with the DelayActivity.
If I put a delay activity in one of the event handler branches it fires once, but not again.
Whereas if I put an OnTaskChanged activity there, it fires every time somebody changes that task.
So, is this the expected behaviour? Why doesn't DelayActivity loop?
How could I do this differently? My thought is to use a CAG (ConditionedActivityGroup), but this looks a bit more complex...
Update: the problem with the CAG is that the whole thing blocks until the delay activity fires, even if the onChange event has already fired. This makes sense, but makes it a bit trickier to use.
Update 2: I've reworded the text to hopefully make it clearer.
The Solution
The fundamental activity arrangement that solves this problem is a WhileActivity containing a ListenActivity.
The ListenActivity is given three EventDrivenActivity branches. The first is your "User Task Completed" branch, the second is the "Manager Changed the Assigned User" branch, and the third contains a DelayActivity followed by your emailing logic.
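In outline, the arrangement looks like this (a sketch only; activity and handler names are illustrative, and in practice you would lay this out in the workflow designer rather than build it in code):

```csharp
// Illustrative outline of the activity tree described above; normally this is
// laid out in the workflow designer rather than built in code.
using System.Workflow.Activities;

static class ActivityTreeSketch
{
    static WhileActivity Build()
    {
        WhileActivity whileTaskNotDone = new WhileActivity();
        whileTaskNotDone.Condition = new CodeCondition();  // handler would set e.Result = !userTaskCompleted

        ListenActivity listen = new ListenActivity();
        listen.Activities.Add(new EventDrivenActivity());  // 1: OnTaskChanged - user completed the task
        listen.Activities.Add(new EventDrivenActivity());  // 2: OnTaskChanged - manager changed the assigned user
        listen.Activities.Add(new EventDrivenActivity());  // 3: DelayActivity + email reminder logic

        whileTaskNotDone.Activities.Add(listen);           // the Listen is re-executed on every loop iteration
        return whileTaskNotDone;
    }
}
```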
In a ListenActivity any of the branches can complete the Listen activity, and when one does, the activities in the other branches are cancelled.
You will need to ensure that the "User Task Completed" sequence sets some value that can be tested by the while loop, so that the loop exits when a user completes the task.
When a branch other than the "User Task Completed" branch is responsible for completing the ListenActivity, the workflow will loop back to the ListenActivity and re-execute all three event-driven branches, including the one containing the DelayActivity.
Note that this is slightly different from the EventHandlingScope approach, because the "listen for user task completed" branch will get cancelled and re-executed, whereas with the EventHandlingScope that wouldn't happen. IMO this is a better arrangement, since it means that the user who was selected to do the task at the start of the Listen activity is guaranteed to be unchanged at the end (because if it is changed, the whole activity is discarded and a new one started).
Why the Delay only fired once in the EventHandlingScope
Effectively, what you had set up is a scope that is listening for two events: one was your manager's "change assigned user" event, the other was a "timer fired" event.
Now, the way it's described in the documentation, it sounds like some loop is involved, as if once one of these activities completes it is restarted. However, it's not quite like that; the scope actually just continues to listen for the original event and will re-run the contents if another such event is fired.
In the case of the DelayActivity there is some internal "timer fired" event that is being listened for. When the Delay is first entered, the timeout is set up so that the timer will fire at the appropriate time, and the scope then listens for that event. Once it has fired, the scope returns to listening for a "timer fired" event; however, there is no re-running of the initial code that set up the timeout, hence no further "timer fired" event is forthcoming.
I know you don't want to hear this, but you would be better off creating a workflow in place of the handler, as workflows are designed to handle the time dimension much better: they are "long running". Event handlers are scoped more for a moment in time: an event triggers them and then they complete an action. Not only that, but judging from what you write, if the requirements are that simple you could create a SharePoint Designer workflow, so you wouldn't even have to crack open Visual Studio.
Also, not sure if you know this, but SharePoint tasks do send out emails; they will send daily reminders when the task is late, so you might be able to address your delay activity using out-of-the-box functionality.
Finally, if you are running in debug mode and you have hard-coded your task ID, you can only run one task per debug session; otherwise your event handler will stop when another item with the same ID is added to SharePoint. This might explain why your delay activity is blocked.