Azure Durable Function repeating function calls

The question below stems from my work using the fan-out/fan-in pattern to gather license data for 130k users and load it into our database for reporting and license management. I have this piece functioning correctly in a different project, but it exceeds the timeout of a standard Azure Function. I ported it to a Durable Function and ran into this issue, then backtracked to the pre-generated template code in an attempt to isolate the problem, but it happens no matter what I do.
The code below is from a fresh project using the pre-generated Durable Functions template, with no changes made to it at all.
When I run the code, it calls the same function multiple times with the same timestamp and different IDs. It then continuously calls Function1_Hello.
I have done some research to try to determine what is causing this, but have not been able to find an answer or resolution.
There is a similar question here and some deep research done here, but I have not seen an actual resolution to this issue. Any help is greatly appreciated.
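For reference, the fan-out/fan-in shape I am trying to build looks roughly like this (a simplified sketch; GetUserBatches and GetLicenseData are placeholder activity names, not the real ones from my project):

[FunctionName("LicenseOrchestrator")]
public static async Task RunLicenseOrchestrator(
    [OrchestrationTrigger] IDurableOrchestrationContext context)
{
    // Fan out: schedule one activity per batch of users, all in parallel.
    var batches = await context.CallActivityAsync<List<string>>("GetUserBatches", null);

    var tasks = new List<Task<int>>();
    foreach (var batch in batches)
    {
        tasks.Add(context.CallActivityAsync<int>("GetLicenseData", batch));
    }

    // Fan in: wait for every batch to finish before continuing.
    await Task.WhenAll(tasks);
}
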
Code
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

namespace DurableFunctionTest
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static async Task<List<string>> RunOrchestrator(
            [OrchestrationTrigger] IDurableOrchestrationContext context)
        {
            var outputs = new List<string>();

            // Replace "hello" with the name of your Durable Activity Function.
            outputs.Add(await context.CallActivityAsync<string>("Function1_Hello", "Tokyo"));
            outputs.Add(await context.CallActivityAsync<string>("Function1_Hello", "Seattle"));
            outputs.Add(await context.CallActivityAsync<string>("Function1_Hello", "London"));

            // returns ["Hello Tokyo!", "Hello Seattle!", "Hello London!"]
            return outputs;
        }

        [FunctionName("Function1_Hello")]
        public static string SayHello([ActivityTrigger] string name, ILogger log)
        {
            log.LogInformation($"Saying hello to {name}.");
            return $"Hello {name}!";
        }

        [FunctionName("Function1_HttpStart")]
        public static async Task<HttpResponseMessage> HttpStart(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequestMessage req,
            [DurableClient] IDurableOrchestrationClient starter,
            ILogger log)
        {
            // Function input comes from the request content.
            string instanceId = await starter.StartNewAsync("Function1", null);
            log.LogInformation($"Started orchestration with ID = '{instanceId}'.");
            return starter.CreateCheckStatusResponse(req, instanceId);
        }
    }
}
Console results from running the code:
For detailed output, run func with --verbose flag.
[2022-09-28T16:54:28.492Z] Executing 'Function1_Hello' (Reason='(null)', Id=76fc4193-219e-4347-ab08-880785a2d73e)
[2022-09-28T16:54:28.492Z] Executing 'Function1_Hello' (Reason='(null)', Id=022d09eb-0960-427a-8777-c99062c50e34)
[2022-09-28T16:54:28.492Z] Executing 'Function1_Hello' (Reason='(null)', Id=69633782-4904-4ee8-b88a-228e96a4b8de)
[2022-09-28T16:54:28.492Z] Executing 'Function1_Hello' (Reason='(null)', Id=374d38c5-b2b7-4729-9dc0-7ea3e508a564)
[2022-09-28T16:54:28.492Z] Executing 'Function1_Hello' (Reason='(null)', Id=48cd6a04-ee9d-4f30-aff8-59af54233bb6)
[2022-09-28T16:54:29.435Z] Saying hello to Tokyo.
[2022-09-28T16:54:29.456Z] Saying hello to Tokyo.
[2022-09-28T16:54:29.620Z] Saying hello to Tokyo.
[2022-09-28T16:54:29.761Z] Saying hello to Tokyo.
[2022-09-28T16:54:29.761Z] Saying hello to Tokyo.
[2022-09-28T16:54:29.781Z] Host lock lease acquired by instance ID '00000000000000000000000042B5BCB0'.
[2022-09-28T16:54:29.862Z] Executed 'Function1_Hello' (Succeeded, Id=69633782-4904-4ee8-b88a-228e96a4b8de, Duration=2281ms)
[2022-09-28T16:54:29.862Z] Executed 'Function1_Hello' (Succeeded, Id=76fc4193-219e-4347-ab08-880785a2d73e, Duration=2438ms)
[2022-09-28T16:54:29.862Z] Executed 'Function1_Hello' (Succeeded, Id=374d38c5-b2b7-4729-9dc0-7ea3e508a564, Duration=2417ms)
[2022-09-28T16:54:29.862Z] Executed 'Function1_Hello' (Succeeded, Id=48cd6a04-ee9d-4f30-aff8-59af54233bb6, Duration=2437ms)
[2022-09-28T16:54:29.862Z] Executed 'Function1_Hello' (Succeeded, Id=022d09eb-0960-427a-8777-c99062c50e34, Duration=2297ms)
[2022-09-28T16:54:59.644Z] Executing 'Function1' (Reason='(null)', Id=0eaa8de6-d050-4d50-91f9-96c2f019cfeb)
[2022-09-28T16:54:59.644Z] Executing 'Function1' (Reason='(null)', Id=56461ac1-0b65-4797-8d93-38e3b6184db9)
[2022-09-28T16:54:59.644Z] Executing 'Function1' (Reason='(null)', Id=64540679-8f5a-4507-a656-8a011874fe9a)
[2022-09-28T16:54:59.644Z] Executing 'Function1' (Reason='(null)', Id=641d8c52-cce8-4076-a384-b05adc05ab89)
[2022-09-28T16:55:00.672Z] Executing 'Function1' (Reason='(null)', Id=7ab59722-55b1-4fbd-a3e6-c957e3da99f9)
[2022-09-28T16:55:00.743Z] Executed 'Function1' (Succeeded, Id=0eaa8de6-d050-4d50-91f9-96c2f019cfeb, Duration=1572ms)
[2022-09-28T16:55:00.743Z] Executed 'Function1' (Succeeded, Id=64540679-8f5a-4507-a656-8a011874fe9a, Duration=1572ms)
[2022-09-28T16:55:00.795Z] Executed 'Function1' (Succeeded, Id=7ab59722-55b1-4fbd-a3e6-c957e3da99f9, Duration=125ms)
[2022-09-28T16:55:00.795Z] Executed 'Function1' (Succeeded, Id=641d8c52-cce8-4076-a384-b05adc05ab89, Duration=1625ms)
[2022-09-28T16:55:00.795Z] Executed 'Function1' (Succeeded, Id=56461ac1-0b65-4797-8d93-38e3b6184db9, Duration=1624ms)
[2022-09-28T16:55:01.291Z] Executing 'Function1_Hello' (Reason='(null)', Id=069b5928-0947-4c33-842e-befd8d38c9d4)
[2022-09-28T16:55:01.340Z] Saying hello to Seattle.
[2022-09-28T16:55:01.367Z] Executing 'Function1_Hello' (Reason='(null)', Id=346e37d5-fb8d-4ab9-9340-272eee8219a0)
[2022-09-28T16:55:01.413Z] Executed 'Function1_Hello' (Succeeded, Id=069b5928-0947-4c33-842e-befd8d38c9d4, Duration=75ms)
[2022-09-28T16:55:01.417Z] Saying hello to Seattle.
[2022-09-28T16:55:01.442Z] Executed 'Function1_Hello' (Succeeded, Id=346e37d5-fb8d-4ab9-9340-272eee8219a0, Duration=76ms)
[2022-09-28T16:55:01.442Z] Executing 'Function1_Hello' (Reason='(null)', Id=68f9d521-3178-40a3-9b7b-e8dea8c29f19)
[2022-09-28T16:55:01.617Z] Saying hello to Tokyo.
[2022-09-28T16:55:01.619Z] Executing 'Function1_Hello' (Reason='(null)', Id=8dac2a24-0796-40b1-9fc5-97fc14fc7d7c)
[2022-09-28T16:55:01.729Z] Executed 'Function1_Hello' (Succeeded, Id=68f9d521-3178-40a3-9b7b-e8dea8c29f19, Duration=268ms)
[2022-09-28T16:55:01.783Z] Saying hello to Tokyo.
[2022-09-28T16:55:01.787Z] Executing 'Function1_Hello' (Reason='(null)', Id=94bc6a3c-9aba-4857-bc17-01092dacc8ce)
[2022-09-28T16:55:01.944Z] Executed 'Function1_Hello' (Succeeded, Id=8dac2a24-0796-40b1-9fc5-97fc14fc7d7c, Duration=326ms)
[2022-09-28T16:55:02.026Z] Saying hello to Tokyo.
[2022-09-28T16:55:02.053Z] Executed 'Function1_Hello' (Succeeded, Id=94bc6a3c-9aba-4857-bc17-01092dacc8ce, Duration=266ms)
[2022-09-28T16:55:06.515Z] Executing 'Function1' (Reason='(null)', Id=8de1ae6e-b19d-49b0-9559-3817218f81de)
[2022-09-28T16:55:06.515Z] Executing 'Function1' (Reason='(null)', Id=1707f89b-dc2d-466d-ace2-68dadfebf991)
[2022-09-28T16:55:06.535Z] Executed 'Function1' (Succeeded, Id=8de1ae6e-b19d-49b0-9559-3817218f81de, Duration=20ms)
[2022-09-28T16:55:06.547Z] Executed 'Function1' (Succeeded, Id=1707f89b-dc2d-466d-ace2-68dadfebf991, Duration=32ms)
[2022-09-28T16:55:06.909Z] Executing 'Function1_Hello' (Reason='(null)', Id=239bdb01-e2cb-42de-84a3-22ec9ce582c8)

Based on the discussion in the comments, it seems that activity start messages had been sent to the work-item queue in previous runs, and when the function host was restarted, those activities executed.
A really easy way to ensure previous data is not used is to change the task hub name.
You can specify one in the host.json:
{
  "extensions": {
    "durableTask": {
      "hubName": "MyHubName"
    }
  }
}
Note, however, that if you deploy this change to Azure, it will effectively clear the state there as well, since the app will start using a fresh task hub.
Alternatively, you can delete the queues and tables created for your local testing and Durable Functions will recreate them on startup.
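If you would rather script that cleanup than do it by hand, something along these lines should work against local storage (a rough sketch using the Azure.Storage.Queues and Azure.Data.Tables packages; the connection string and hub name are assumptions, the v2.x default hub name being TestHubName):

using Azure.Data.Tables;
using Azure.Storage.Queues;

// Rough cleanup sketch: deletes the Durable Task queues and tables for one task hub.
const string connection = "UseDevelopmentStorage=true";   // local storage emulator
const string hubName = "TestHubName";                      // match your host.json

var queues = new QueueServiceClient(connection);
foreach (var queue in queues.GetQueues())
{
    // e.g. testhubname-workitems, testhubname-control-00 .. -03
    if (queue.Name.StartsWith(hubName.ToLowerInvariant()))
        queues.DeleteQueue(queue.Name);
}

var tables = new TableServiceClient(connection);
foreach (var table in tables.Query())
{
    // e.g. TestHubNameInstances, TestHubNameHistory
    if (table.Name.StartsWith(hubName))
        tables.DeleteTable(table.Name);
}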

Related

Azure durable function running multiple times on startup when running locally

I have an HTTP-triggered Azure Durable Function with an orchestration trigger called "ExecuteWork" and two activities, "HandleClaimsForms" and "HandleApplicationForms". I will add their definitions below. The function is used to process PDFs in a blob storage container. When running locally, it executes "HandleClaimsForms" four or five times on startup without it being called.
Here are the logs that it is producing:
Functions:
Function1: [GET,POST] http://localhost:7071/api/Function1
ExecuteWork: orchestrationTrigger
HandleApplicationForms: activityTrigger
HandleClaimsForms: activityTrigger
[2022-06-07T12:39:44.587Z] Executing 'HandleClaimsForms' (Reason='(null)', Id=c45878fe-35c8-4a57-948e-0b43da969427)
[2022-06-07T12:39:44.587Z] Executing 'HandleClaimsForms' (Reason='(null)', Id=0fb9644d-6748-4791-96cf-a92f6c161a97)
[2022-06-07T12:39:44.587Z] Executing 'HandleClaimsForms' (Reason='(null)', Id=9a39a169-a91d-4524-b5e5-63e6226f70ec)
[2022-06-07T12:39:44.587Z] Executing 'HandleClaimsForms' (Reason='(null)', Id=b3697f6b-7c96-4497-826f-3894359ff361)
[2022-06-07T12:39:44.587Z] Executing 'HandleClaimsForms' (Reason='(null)', Id=3ca3bbce-1657-453b-a5b3-e9dbdb940302)
Here are the Function definitions:
Function entrypoint
[FunctionName("Function1")]
public async Task<IActionResult> Run(
[HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)] HttpRequest req,
[DurableClient] IDurableOrchestrationClient starter,
ILogger log)
{
string instanceID = await starter.StartNewAsync("ExecuteWork");
return starter.CreateCheckStatusResponse(req, instanceID);
}
Orchestration trigger
[FunctionName("ExecuteWork")]
public async Task<bool> ProcessForms(
[OrchestrationTrigger] IDurableOrchestrationContext context,
ILogger log)
{
bool success = true;
try
{
await context.CallActivityAsync("HandleClaimsForms", success);
await context.CallActivityAsync("HandleApplicationForms", success);
return success;
}
catch (Exception err)
{
log.LogInformation($"The following error was thrown: {err}");
success = false;
return success;
}
}
HandleClaimsForm Activity
[FunctionName("HandleClaimsForms")]
public async Task<bool> ProcessClaimsForms(
    [ActivityTrigger] bool success)
{
    await _docHandler.Handle();
    return success;
}
HandleApplicationForm activity
[FunctionName("HandleApplicationForms")]
public async Task<bool> ProcessApplicationForms(
[ActivityTrigger]bool success)
{
await _appHandler.HandleJsonApplicationFormAsync();
return success;
}
One workaround you can follow to resolve the above issue is suggested by the Microsoft documentation on reliability:
Durable Functions uses event sourcing transparently. Behind the scenes, the await (C#) or yield (JavaScript/Python) operator in an orchestrator function yields control of the orchestrator thread back to the Durable Task Framework dispatcher.
Whenever an orchestration is given more work to do (for instance, when a response message is received or a durable timer expires), the orchestrator wakes up and re-executes the entire function from scratch to rebuild its local state. If the code tries to invoke a function or do any other async work during this replay, the Durable Task Framework consults the orchestration's execution history. When it finds that the activity function has already run and produced a result, it replays that result and the orchestrator code keeps running. Replay continues until the function code terminates or until new async work has been scheduled.
Another approach is to use dependency injection.
For more information, please refer to these similar SO threads: "HTTP Trigger on demand azure function calling itself multiple times" and "Azure Durable Function Activity seems to run multiple times and not complete".
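A practical consequence of that replay model is that orchestrator code must be deterministic: anything that would produce a different value on each replay should come from the orchestration context rather than from the environment. A small illustrative sketch (hypothetical code, not taken from the question):

[FunctionName("DeterministicOrchestrator")]
public static async Task RunDeterministic(
    [OrchestrationTrigger] IDurableOrchestrationContext context)
{
    // Replay-safe: these return the same values on every replay of this instance.
    DateTime startedAt = context.CurrentUtcDateTime;   // instead of DateTime.UtcNow
    Guid correlationId = context.NewGuid();            // instead of Guid.NewGuid()

    // I/O and other non-deterministic work belongs in activities, which run once
    // and whose results are replayed from the history table afterwards.
    await context.CallActivityAsync("HandleClaimsForms", correlationId);
}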

Azure Durable Functions, unexpected value of 'IsReplaying' flag

I'm trying to understand Azure Durable Functions behavior. Specifically about how the Orchestrator function gets replayed. I thought I was getting the hang of it until I found one value of the Context.IsReplaying flag that didn't make sense to me.
My code is very "hello world"-ish. It has an Orchestrator function that calls two Activity functions one after the other.
[FunctionName("OrchestratorFn")]
public static async Task<object> Orchestrator(
[OrchestrationTrigger] IDurableOrchestrationContext context,
ILogger log
) {
log.LogInformation($"--------> Orchestrator started at {T()}, isReplay={context.IsReplaying}");
string name = context.GetInput<string>();
string name1 = await context.CallActivityAsync<string>("A_ActivityA", name);
string name2 = await context.CallActivityAsync<string>("A_ActivityB", name1);
log.LogInformation($"--------> Orchestrator ended at {T()}, isReplay={context.IsReplaying}");
return new {
OutputFromA = name1,
OutputFromB = name2
};
}
[FunctionName("A_ActivityA")]
public static async Task<object> ActivityA(
[ActivityTrigger] string input,
ILogger log
) {
log.LogInformation($"--------> Activity A started at {T()}");
await Task.Delay(3000);
log.LogInformation($"--------> Activity A ended at {T()}");
return input + "-1";
}
[FunctionName("A_ActivityB")]
public static async Task<object> ActivityB(
[ActivityTrigger] string input,
ILogger log
) {
log.LogInformation($"--------> Activity B started at {T()}");
await Task.Delay(3000);
log.LogInformation($"--------> Activity B ended at {T()}");
return input + "-2";
}
In the console output (I've cut out everything except the output where I log time), this is what I see:
[1/26/2020 12:56:40 PM] ------> DurableClient Function Running at 56.40.8424.
[1/26/2020 12:56:49 PM] ------> DurableClient Function END at 56.49.5029.
[1/26/2020 12:57:03 PM] ------> Orchestrator started at 57.03.7915, isReplay=False
[1/26/2020 12:57:04 PM] ------> Activity A started at 57.04.1905
[1/26/2020 12:57:07 PM] ------> Activity A ended at 57.07.2016
[1/26/2020 12:57:24 PM] ------> Orchestrator started at 57.24.8029, isReplay=True
[1/26/2020 12:57:40 PM] ------> Activity B started at 57.40.4136
[1/26/2020 12:57:43 PM] ------> Activity B ended at 57.43.4258
[1/26/2020 12:57:53 PM] ------> Orchestrator started at 57.53.1490, isReplay=True
[1/26/2020 12:57:59 PM] ------> Orchestrator ended at 57.59.0736, isReplay=False
It's the 'isReplay=False' on the very last line that I can't explain. Why is this ? Shouldn't isReplay be 'True' ?
I'm using Microsoft.Azure.WebJobs.Extensions.Durable v2.1.1
No, it should not be isReplay=true, because that line really is executed only once. Whenever the orchestrator awaits a call, it stops its own execution right there and waits for that call to finish. When the call completes, the orchestrator runs through all of its code again up to the point where it left off, without making the outbound calls again.
Since there is no further await after your last logging statement, that line is only reached once.
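If the duplicated log lines during replay are the annoyance, the Durable Task extension also offers a replay-safe logger, so the same orchestrator could be written roughly like this (a minimal sketch):

[FunctionName("OrchestratorFn")]
public static async Task<object> Orchestrator(
    [OrchestrationTrigger] IDurableOrchestrationContext context,
    ILogger log)
{
    // The wrapped logger only writes when context.IsReplaying is false.
    ILogger safeLog = context.CreateReplaySafeLogger(log);
    safeLog.LogInformation("--------> Orchestrator started");

    string name = context.GetInput<string>();
    string name1 = await context.CallActivityAsync<string>("A_ActivityA", name);
    string name2 = await context.CallActivityAsync<string>("A_ActivityB", name1);

    safeLog.LogInformation("--------> Orchestrator ended");
    return new { OutputFromA = name1, OutputFromB = name2 };
}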

Azure Functions host.json: maxPollingInterval

If I read the documentation for maxPollingInterval:
The maximum interval between queue polls
From here: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-queue#hostjson-settings
I have set it to 00:01:00 in my host.json. The way I understand it, I have told the Functions runtime that it may not poll only every other minute, say: it has to poll at least every minute. But if the queue it is reading from is "hot", that is, there is a consistent flow of messages, it will poll more often than the one minute I have specified. If the flow of messages drops, the runtime will begin to check less often, until it hits the one-minute cap.
Is this correctly understood?
I have tried to find the code for this polling mechanism in the WebJobs SDK (https://github.com/Azure/azure-webjobs-sdk) and the Azure Functions runtime (https://github.com/Azure/Azure-Functions), but I don't know what to look for (and I haven't spent that much time on it yet).
The queue trigger implements a random exponential back-off algorithm: if there are no messages on the queue, the SDK backs off and starts polling less frequently.
MaxPollingInterval lets you configure this behavior: when a queue remains empty, it is the longest period of time the runtime will wait before checking for a message again.
As an example, if you set it to 00:01:00 and then send a message to the queue, the function is triggered and the runtime then waits 2 seconds before checking again; if there is no message it waits 4 seconds, then 8 seconds, and so on, up to 1 minute, and it keeps checking every 1 minute until the queue trigger receives a message, at which point the back-off starts over at 2 seconds.
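In rough pseudo-code the idea looks like this (an illustration of the back-off only, not the SDK's actual implementation; pollOnceAsync stands in for the real queue check):

// Illustration only: how the polling delay grows while a queue stays empty.
static async Task PollLoopAsync(Func<Task<bool>> pollOnceAsync)
{
    TimeSpan delay   = TimeSpan.FromSeconds(2);   // assumed starting delay
    TimeSpan maximum = TimeSpan.FromMinutes(1);   // maxPollingInterval from host.json

    while (true)
    {
        bool gotMessage = await pollOnceAsync();
        if (gotMessage)
        {
            delay = TimeSpan.FromSeconds(2);      // message found: reset the back-off
        }
        else
        {
            await Task.Delay(delay);              // empty: wait, then increase the delay
            delay = TimeSpan.FromTicks(Math.Min(delay.Ticks * 2, maximum.Ticks));
            // 2s -> 4s -> 8s -> ... -> 1 minute, then stays at 1 minute
        }
    }
}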
This is covered in the official doc, and below is a test I did. Have a look at the time intervals in my test and try it yourself.
2019-10-15T07:53:32.884 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=aead524a-cdba-4e43-a956-1a8558dea037)
2019-10-15T07:53:32.884 [Information] Trigger Details: MessageId: e8e9fa50-cda3-4653-bddb-c446807ec986, DequeueCount: 1, InsertionTime: 10/15/2019 7:53:32 AM +00:00
2019-10-15T07:53:32.886 [Information] C# Queue trigger function processed: 1
2019-10-15T07:53:32.886 [Information] Executed 'Function1' (Succeeded, Id=aead524a-cdba-4e43-a956-1a8558dea037)
2019-10-15T07:56:27.105 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=eb01dce6-23fd-4128-bccd-8c26c104361f)
2019-10-15T07:56:27.105 [Information] Trigger Details: MessageId: 9aa0a4ff-d823-4703-a8b4-dca0ff7b61cb, DequeueCount: 1, InsertionTime: 10/15/2019 7:55:54 AM +00:00
2019-10-15T07:56:27.106 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=d5b92a82-5159-4c28-91b0-c84668916f13)
2019-10-15T07:56:27.106 [Information] Trigger Details: MessageId: b9445cc1-8d0e-4f37-9de2-aafd74df4b82, DequeueCount: 1, InsertionTime: 10/15/2019 7:55:55 AM +00:00
2019-10-15T07:56:27.106 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=86d6abb4-9a41-4bac-9259-17d85afd031f)
2019-10-15T07:56:27.106 [Information] Trigger Details: MessageId: f019251e-6d79-448c-baba-13bfd8401494, DequeueCount: 1, InsertionTime: 10/15/2019 7:55:57 AM +00:00
2019-10-15T07:56:27.106 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=01d94b5c-230d-4ae0-adb6-70a949ec97b5)
2019-10-15T07:56:27.106 [Information] Trigger Details: MessageId: a00e1340-e0f7-4aee-a4a7-addb9a210831, DequeueCount: 1, InsertionTime: 10/15/2019 7:55:58 AM +00:00
2019-10-15T07:56:27.107 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=fd928cf3-c0a7-4fa1-88a1-e31eecf069bd)
2019-10-15T07:56:27.107 [Information] Trigger Details: MessageId: 6a8cc1b9-c45c-42e2-b7bf-34f7e96f5471, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:01 AM +00:00
2019-10-15T07:56:27.107 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=8e6de5fc-6696-4030-a2cc-748b5da44ec7)
2019-10-15T07:56:27.107 [Information] Trigger Details: MessageId: 0b100915-7de1-44b9-a706-2fd866169563, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:02 AM +00:00
2019-10-15T07:56:27.107 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=9f1a6e13-93e7-443c-a021-c0b9d5f4dcfa)
2019-10-15T07:56:27.108 [Information] Trigger Details: MessageId: 713159e1-99cd-4727-9793-7540297eb6d0, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:04 AM +00:00
2019-10-15T07:56:27.108 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=564ca50f-58a7-4475-865d-03706298c5fd)
2019-10-15T07:56:27.108 [Information] Trigger Details: MessageId: c9bc6646-15c4-42b8-ac48-44deeb056844, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:06 AM +00:00
2019-10-15T07:56:27.113 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=2d5083ee-e839-4bd1-a59d-3f13a7f04ae2)
2019-10-15T07:56:27.114 [Information] Trigger Details: MessageId: 4a92ad18-3e53-443d-a721-1e84ae77c112, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:08 AM +00:00
2019-10-15T07:56:27.114 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=b512c3bf-2d80-4e6c-ac6d-fef77f2c93ef)
2019-10-15T07:56:27.114 [Information] Trigger Details: MessageId: 5eace6a2-98e4-4aca-8747-ff2a2d9c5198, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:10 AM +00:00
2019-10-15T07:56:27.114 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=0febda35-e777-4449-b8dd-a0895daa403e)
2019-10-15T07:56:27.114 [Information] Trigger Details: MessageId: 5e7b9f35-afa7-4edd-bb8a-cb98c12c25f5, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:12 AM +00:00
2019-10-15T07:56:27.115 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=78e66fcb-45a5-4612-8dde-ff2c31be1585)
2019-10-15T07:56:27.115 [Information] Trigger Details: MessageId: 623a077d-4bfb-4321-b6c0-2ef475ad11f4, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:15 AM +00:00
2019-10-15T07:56:27.115 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=01afae82-451e-42a7-9ed8-346737efdd01)
2019-10-15T07:56:27.115 [Information] Trigger Details: MessageId: 243a1af2-238d-48b3-930f-0ba629aeb293, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:17 AM +00:00
2019-10-15T07:56:27.115 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=b6aa18b7-d52c-4b33-83ec-d380a5431de8)
2019-10-15T07:56:27.115 [Information] Trigger Details: MessageId: 71c5d964-215a-4a62-a12c-b0d2c2a15756, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:20 AM +00:00
2019-10-15T07:56:27.116 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=3d1c2ed3-d79e-4492-8ce2-3f6980bef396)
2019-10-15T07:56:27.116 [Information] Trigger Details: MessageId: 23121a01-ef82-4339-8df3-0d4fed9f0cab, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:23 AM +00:00
2019-10-15T07:56:27.116 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=72e3bd65-4c14-4b21-ad1a-9055ec0c7491)
2019-10-15T07:56:27.116 [Information] Trigger Details: MessageId: df15e792-fecb-45c9-91d3-73d493faea4f, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:24 AM +00:00
2019-10-15T07:56:27.123 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.123 [Information] Executed 'Function1' (Succeeded, Id=eb01dce6-23fd-4128-bccd-8c26c104361f)
2019-10-15T07:56:27.123 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.124 [Information] Executed 'Function1' (Succeeded, Id=d5b92a82-5159-4c28-91b0-c84668916f13)
2019-10-15T07:56:27.124 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.124 [Information] Executed 'Function1' (Succeeded, Id=86d6abb4-9a41-4bac-9259-17d85afd031f)
2019-10-15T07:56:27.126 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.126 [Information] Executed 'Function1' (Succeeded, Id=01d94b5c-230d-4ae0-adb6-70a949ec97b5)
2019-10-15T07:56:27.132 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.132 [Information] Executed 'Function1' (Succeeded, Id=fd928cf3-c0a7-4fa1-88a1-e31eecf069bd)
2019-10-15T07:56:27.133 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.133 [Information] Executed 'Function1' (Succeeded, Id=8e6de5fc-6696-4030-a2cc-748b5da44ec7)
2019-10-15T07:56:27.134 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.134 [Information] Executed 'Function1' (Succeeded, Id=9f1a6e13-93e7-443c-a021-c0b9d5f4dcfa)
2019-10-15T07:56:27.135 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.135 [Information] Executed 'Function1' (Succeeded, Id=564ca50f-58a7-4475-865d-03706298c5fd)
2019-10-15T07:56:27.142 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.142 [Information] Executed 'Function1' (Succeeded, Id=2d5083ee-e839-4bd1-a59d-3f13a7f04ae2)
2019-10-15T07:56:27.143 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.143 [Information] Executed 'Function1' (Succeeded, Id=b512c3bf-2d80-4e6c-ac6d-fef77f2c93ef)
2019-10-15T07:56:27.144 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.144 [Information] Executed 'Function1' (Succeeded, Id=0febda35-e777-4449-b8dd-a0895daa403e)
2019-10-15T07:56:27.150 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.150 [Information] Executed 'Function1' (Succeeded, Id=78e66fcb-45a5-4612-8dde-ff2c31be1585)
2019-10-15T07:56:27.152 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.152 [Information] Executed 'Function1' (Succeeded, Id=01afae82-451e-42a7-9ed8-346737efdd01)
2019-10-15T07:56:27.153 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.153 [Information] Executed 'Function1' (Succeeded, Id=b6aa18b7-d52c-4b33-83ec-d380a5431de8)
2019-10-15T07:56:27.159 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.160 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.160 [Information] Executed 'Function1' (Succeeded, Id=72e3bd65-4c14-4b21-ad1a-9055ec0c7491)
2019-10-15T07:56:27.161 [Information] Executed 'Function1' (Succeeded, Id=3d1c2ed3-d79e-4492-8ce2-3f6980bef396)
2019-10-15T07:57:25.420 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=a0ba0110-b590-4770-856b-dcd0b9b2fe7e)
2019-10-15T07:57:25.420 [Information] Trigger Details: MessageId: 5203a3c7-30c6-41f3-95af-dafac74512c8, DequeueCount: 1, InsertionTime: 10/15/2019 7:57:21 AM +00:00
2019-10-15T07:57:25.420 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=cb9f04e9-931d-4e09-89b5-0f346043c637)
2019-10-15T07:57:25.421 [Information] Trigger Details: MessageId: 2210c791-0f56-4eae-809d-2060591a5a9b, DequeueCount: 1, InsertionTime: 10/15/2019 7:57:23 AM +00:00
2019-10-15T07:57:25.421 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=0be83ef6-6872-4ed1-bea9-a89ade1aa031)
2019-10-15T07:57:25.421 [Information] Trigger Details: MessageId: 8ec52c1d-584a-4c95-ae74-93beca0ea8dc, DequeueCount: 1, InsertionTime: 10/15/2019 7:57:25 AM +00:00
2019-10-15T07:57:25.422 [Information] C# Queue trigger function processed: 1
2019-10-15T07:57:25.422 [Information] Executed 'Function1' (Succeeded, Id=a0ba0110-b590-4770-856b-dcd0b9b2fe7e)
2019-10-15T07:57:25.423 [Information] C# Queue trigger function processed: 1
2019-10-15T07:57:25.423 [Information] Executed 'Function1' (Succeeded, Id=cb9f04e9-931d-4e09-89b5-0f346043c637)
2019-10-15T07:57:25.429 [Information] C# Queue trigger function processed: 1
2019-10-15T07:57:25.429 [Information] Executed 'Function1' (Succeeded, Id=0be83ef6-6872-4ed1-bea9-a89ade1aa031)
2019-10-15T07:57:27.424 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=14583d91-ec35-42aa-83c8-0727388b663d)
2019-10-15T07:57:27.424 [Information] Trigger Details: MessageId: 0c55402f-b47b-476e-ae5c-f64d9756b65f, DequeueCount: 1, InsertionTime: 10/15/2019 7:57:26 AM +00:00
2019-10-15T07:57:27.425 [Information] C# Queue trigger function processed: 1
2019-10-15T07:57:27.425 [Information] Executed 'Function1' (Succeeded, Id=14583d91-ec35-42aa-83c8-0727388b663d)
Do you understand now?
@BowmanZhu, just to complete/update your answer: it seems that in Microsoft.Azure.WebJobs version 3.0.16 the polling is triggered differently.
The checking mechanism starts from QueuePollingIntervals.Minimum, which is 100 ms (not 2 seconds), and increases it exponentially until it reaches the maximum.
Note: in a Development environment the MaxPollingInterval default value is not 1 minute but 2 seconds.
With MaxPollingInterval set to 15 seconds (15,000 ms):
dbug: Microsoft.Extensions.Hosting.Internal.Host[2]
Hosting started
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 100 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 199 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 336 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 548 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 916 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 1851 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 3199 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 6222 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 13030 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 15000 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 15000 ms before polling queue 'stackoverflow'.
...

Azure function visibilityTimeout

When I read the documentation about visibilityTimeout (https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-queue#host-json), it says "The time interval between retries when processing of a message fails." The way I understand this, if the timeout is set to 30 seconds and my function runs for 1 minute but doesn't fail in that 1-minute period, the message doesn't become visible to others in the queue. But other sources (Stack Overflow, for example) tell me the opposite: when the execution time of the function exceeds the timeout, the message becomes visible EVEN though the function is still processing it.
What is the truth? Is the timeout only relevant when the function is no longer running (and may have failed), or can the message become visible again even though the function is still running?
Something else that doesn't make sense, if we assume the message becomes visible when the timeout is reached, is that the default timeout is 00:00:00, which would imply the message is visible the moment it is dequeued. That contradicts what the third-party sources say.
I am a bit confused by this.
It appears there are actually two different visibility timeout values used here. Both are set by the Azure WebJobs SDK but only one is configurable.
When the function fails
The queues.visibilityTimeout configuration option would be more aptly named retryDelay.
When the function throws an exception or fails for some other kind of error, the message gets returned to the queue to be retried. The message is returned with the configured visibilityTimeout (see here), which delays when the function will next attempt to run.
This allows your application to cope with transient errors. For example, if an email API or other external service is temporarily down. By delaying the retry, there is a chance that the service may be back online for the next function attempt.
Retry is limited to maxDequeueCount attempts (5 default) before a message is moved to the Poison Queue.
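Those poisoned messages can then be handled by a second function bound to the automatically created "<queuename>-poison" queue, for example (a minimal sketch):

[FunctionName("SlowJobPoisonHandler")]
public static void HandlePoison(
    [QueueTrigger("slow-job-queue-poison")] string poisonMessage,
    ILogger log)
{
    // The runtime moves a message here after maxDequeueCount failed attempts.
    log.LogError($"Giving up on message: {poisonMessage}");
}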
While the function is running
When the QueueTrigger binding runs the function, it dequeues the message with a visibility timeout of 10mins (hard-coded here). It then sets a timer to extend the visibility window when it reaches half-time as long as the function is running (see the timer and visibility update in the source).
Ordinarily you don't need to worry about this as long as your functions use CancellationTokens correctly. This 10min timeout only matters if the Azure Function/WebJob host doesn't get to shut down gracefully. For example:
someone "pulls the plug" on the web host
if the function doesn't respond to the CancellationToken in time during scale-in or other Azure shutdown events
So, as long as the function is still running, the message will remain hidden from the queue.
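Honouring the CancellationToken usually just means accepting it as a parameter and passing it to any long-running waits, roughly like this (a minimal sketch):

[FunctionName("SlowJobCancellable")]
public static async Task RunCancellable(
    [QueueTrigger("slow-job-queue")] string message,
    ILogger log,
    CancellationToken cancellationToken)   // signalled when the host shuts down
{
    for (var i = 0; i < 20; i++)
    {
        cancellationToken.ThrowIfCancellationRequested();
        log.LogInformation($"Step {i} of {message}");
        await Task.Delay(TimeSpan.FromSeconds(30), cancellationToken);
    }
}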
Verification
I did a similar experiment to check:
[FunctionName("SlowJob")]
public async Task Run(
[QueueTrigger("slow-job-queue")] CloudQueueMessage message,
ILogger log)
{
for (var i = 0; i < 20; i++)
{
log.LogInformation($"Next visible {i}: {message.NextVisibleTime}");
await Task.Delay(60000);
}
}
Output:
Next visible 0: 5/11/2020 7:49:24 +00:00
Next visible 1: 5/11/2020 7:49:24 +00:00
Next visible 2: 5/11/2020 7:49:24 +00:00
Next visible 3: 5/11/2020 7:49:24 +00:00
Next visible 4: 5/11/2020 7:49:24 +00:00
Next visible 5: 5/11/2020 7:54:24 +00:00
Next visible 6: 5/11/2020 7:54:24 +00:00
Next visible 7: 5/11/2020 7:54:24 +00:00
Next visible 8: 5/11/2020 7:54:24 +00:00
Next visible 9: 5/11/2020 7:54:24 +00:00
Next visible 10: 5/11/2020 7:59:24 +00:00
Next visible 11: 5/11/2020 7:59:24 +00:00
...
I have tested this with
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Queue;

namespace WorkerFunctions
{
    public static class WorkerFunctions
    {
        [FunctionName("WorkerFunction1")]
        public static async Task Function1(
            [QueueTrigger("outputQueue")] CloudQueueMessage item,
            [Queue("outputQueue")] CloudQueue outputQueue,
            DateTimeOffset nextVisibleTime,
            DateTimeOffset expirationTime,
            DateTimeOffset insertionTime,
            ILogger log)
        {
            log.LogInformation("########## Function 1 ###############");
            log.LogInformation($"NextVisibleTime: {nextVisibleTime}");
            log.LogInformation($"NextVisibleTime: {(nextVisibleTime - insertionTime).TotalSeconds}");
            log.LogInformation($"C# Queue trigger function processed: {item.AsString}");
            Thread.Sleep(TimeSpan.FromMinutes(20));
        }

        [FunctionName("WorkerFunction2")]
        public static async Task Function2(
            [QueueTrigger("outputQueue")] CloudQueueMessage item,
            [Queue("outputQueue")] CloudQueue outputQueue,
            DateTimeOffset nextVisibleTime,
            DateTimeOffset expirationTime,
            DateTimeOffset insertionTime,
            ILogger log)
        {
            log.LogInformation("########## Function 2 ###############");
            log.LogInformation($"NextVisibleTime: {nextVisibleTime}");
            log.LogInformation($"NextVisibleTime: {(nextVisibleTime - insertionTime).TotalSeconds}");
            log.LogInformation($"C# Queue trigger function processed: {item.AsString}");
            Thread.Sleep(TimeSpan.FromMinutes(20));
        }
    }
}
With this host.json file:
{
  "version": "2.0",
  "extensions": {
    "queues": {
      "maxPollingInterval": "00:00:02",
      "visibilityTimeout": "00:00:10",
      "batchSize": 16,
      "maxDequeueCount": 5,
      "newBatchThreshold": 8
    }
  }
}
And when I put a simple message on the queue and let it run, I see the following:
the function that grabs the message doesn't release it before the sleep is over
I can't see in the logs that the lease is renewed, but it seems to happen under the hood
What this tells me:
if neither the function nor the host fails, the lease is auto-renewed, in line with https://stackoverflow.com/a/31883806/21199
when the visibility timeout is reached while the function is still running, the message does not get re-added to the queue
the documentation about visibilityTimeout is accurate: "The time interval between retries when processing of a message fails." (from https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-queue#hostjson-settings)
I haven't saved any links to the third-party sources that contradicted this (sorry), but they exist. I hope someone will answer this so I can get clarification.

setImmediate() function not being called after process.nextTick() function

For this snippet:
const foo = [1, 2];
const bar = ['a', 'b'];

foo.forEach(num => {
  console.log(`setting setImmmediate ${num}`);
  setImmediate(() => {
    console.log(`running setImmediate ${num}`);
    bar.forEach(char => {
      console.log(`setting nextTick ${char}`);
      process.nextTick(() => {
        console.log(`running nextTick ${char}`);
      });
    });
  });
});
The output is
$ node scratch.js
setting setImmmediate 1
setting setImmmediate 2
running setImmediate 1
setting nextTick a
setting nextTick b
running setImmediate 2
setting nextTick a
setting nextTick b
running nextTick a
running nextTick b
running nextTick a
running nextTick b
From the docs
the nextTickQueue will be processed after the current operation completes, regardless of the current phase of the event loop.
As I understand it, process.nextTick() adds the callback to the current event's nextTickQueue, and that queue is executed immediately after the current event, no matter what phase the event loop is in.
Shouldn't the output therefore be the following?
setting setImmmediate 1
setting setImmmediate 2
running setImmediate 1
setting nextTick a
setting nextTick b
running nextTick a
running nextTick b
running setImmediate 2
setting nextTick a
setting nextTick b
running nextTick a
running nextTick b
the nextTickQueue will be processed after the current operation completes, regardless of the current phase of the event loop.
I misunderstood the event loop documentation: I thought "current operation" meant the currently processing event, when it actually means the currently processing phase.
See Daniel Khan's "What you should know to really understand the Node.js Event Loop".
