Azure Functions host.json: maxPollingInterval

If I read the documentation for maxPollingInterval:
The maximum interval between queue polls
From here: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-queue#hostjson-settings
I have set it to 00:01:00 in my host.json. The way I understand it, I have told the Functions runtime that it may not wait as long as two minutes between polls: it has to poll at least every minute, but if the queue it is reading from is "hot", that is, there is a consistent flow of messages, it will poll more often than the one minute I have specified. If the flow of messages drops, the runtime will begin to check less often, until it hits the one-minute maximum.
Is this correctly understood?
I have tried to find the code for this polling mechanism in the WebJobs SDK: https://github.com/Azure/azure-webjobs-sdk and the Azure Functions runtime: https://github.com/Azure/Azure-Functions, but I don't know what to look for (and I haven't spent that much time on it yet).

The queue trigger implements a random exponential back-off algorithm. As a result, if there are no messages on the queue, the SDK backs off and starts polling less frequently.
maxPollingInterval lets you configure this behavior: when a queue remains empty, it is the longest period of time to wait before checking for a message again.
As an example, say you set it to 00:01:00 and then send a message to the queue. The function is triggered and then waits 2 seconds before checking again; if there is no message, it waits 4 seconds, then 8 seconds, and so on. If you send another message and the function is triggered, the back-off starts over: 2 seconds, 4 seconds, 8 seconds, ..., 1 minute, 1 minute, 1 minute ... until the queue trigger receives a message.
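For reference, setting this value in host.json (using the Functions v2+ schema, where queue settings live under extensions.queues, as shown later in this thread) looks roughly like this:
{
  "version": "2.0",
  "extensions": {
    "queues": {
      "maxPollingInterval": "00:01:00"
    }
  }
}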
This is described in the official documentation, and below is a test I ran. Have a look at the time intervals in my log and try it yourself.
2019-10-15T07:53:32.884 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=aead524a-cdba-4e43-a956-1a8558dea037)
2019-10-15T07:53:32.884 [Information] Trigger Details: MessageId: e8e9fa50-cda3-4653-bddb-c446807ec986, DequeueCount: 1, InsertionTime: 10/15/2019 7:53:32 AM +00:00
2019-10-15T07:53:32.886 [Information] C# Queue trigger function processed: 1
2019-10-15T07:53:32.886 [Information] Executed 'Function1' (Succeeded, Id=aead524a-cdba-4e43-a956-1a8558dea037)
2019-10-15T07:56:27.105 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=eb01dce6-23fd-4128-bccd-8c26c104361f)
2019-10-15T07:56:27.105 [Information] Trigger Details: MessageId: 9aa0a4ff-d823-4703-a8b4-dca0ff7b61cb, DequeueCount: 1, InsertionTime: 10/15/2019 7:55:54 AM +00:00
2019-10-15T07:56:27.106 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=d5b92a82-5159-4c28-91b0-c84668916f13)
2019-10-15T07:56:27.106 [Information] Trigger Details: MessageId: b9445cc1-8d0e-4f37-9de2-aafd74df4b82, DequeueCount: 1, InsertionTime: 10/15/2019 7:55:55 AM +00:00
2019-10-15T07:56:27.106 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=86d6abb4-9a41-4bac-9259-17d85afd031f)
2019-10-15T07:56:27.106 [Information] Trigger Details: MessageId: f019251e-6d79-448c-baba-13bfd8401494, DequeueCount: 1, InsertionTime: 10/15/2019 7:55:57 AM +00:00
2019-10-15T07:56:27.106 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=01d94b5c-230d-4ae0-adb6-70a949ec97b5)
2019-10-15T07:56:27.106 [Information] Trigger Details: MessageId: a00e1340-e0f7-4aee-a4a7-addb9a210831, DequeueCount: 1, InsertionTime: 10/15/2019 7:55:58 AM +00:00
2019-10-15T07:56:27.107 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=fd928cf3-c0a7-4fa1-88a1-e31eecf069bd)
2019-10-15T07:56:27.107 [Information] Trigger Details: MessageId: 6a8cc1b9-c45c-42e2-b7bf-34f7e96f5471, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:01 AM +00:00
2019-10-15T07:56:27.107 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=8e6de5fc-6696-4030-a2cc-748b5da44ec7)
2019-10-15T07:56:27.107 [Information] Trigger Details: MessageId: 0b100915-7de1-44b9-a706-2fd866169563, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:02 AM +00:00
2019-10-15T07:56:27.107 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=9f1a6e13-93e7-443c-a021-c0b9d5f4dcfa)
2019-10-15T07:56:27.108 [Information] Trigger Details: MessageId: 713159e1-99cd-4727-9793-7540297eb6d0, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:04 AM +00:00
2019-10-15T07:56:27.108 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=564ca50f-58a7-4475-865d-03706298c5fd)
2019-10-15T07:56:27.108 [Information] Trigger Details: MessageId: c9bc6646-15c4-42b8-ac48-44deeb056844, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:06 AM +00:00
2019-10-15T07:56:27.113 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=2d5083ee-e839-4bd1-a59d-3f13a7f04ae2)
2019-10-15T07:56:27.114 [Information] Trigger Details: MessageId: 4a92ad18-3e53-443d-a721-1e84ae77c112, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:08 AM +00:00
2019-10-15T07:56:27.114 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=b512c3bf-2d80-4e6c-ac6d-fef77f2c93ef)
2019-10-15T07:56:27.114 [Information] Trigger Details: MessageId: 5eace6a2-98e4-4aca-8747-ff2a2d9c5198, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:10 AM +00:00
2019-10-15T07:56:27.114 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=0febda35-e777-4449-b8dd-a0895daa403e)
2019-10-15T07:56:27.114 [Information] Trigger Details: MessageId: 5e7b9f35-afa7-4edd-bb8a-cb98c12c25f5, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:12 AM +00:00
2019-10-15T07:56:27.115 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=78e66fcb-45a5-4612-8dde-ff2c31be1585)
2019-10-15T07:56:27.115 [Information] Trigger Details: MessageId: 623a077d-4bfb-4321-b6c0-2ef475ad11f4, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:15 AM +00:00
2019-10-15T07:56:27.115 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=01afae82-451e-42a7-9ed8-346737efdd01)
2019-10-15T07:56:27.115 [Information] Trigger Details: MessageId: 243a1af2-238d-48b3-930f-0ba629aeb293, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:17 AM +00:00
2019-10-15T07:56:27.115 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=b6aa18b7-d52c-4b33-83ec-d380a5431de8)
2019-10-15T07:56:27.115 [Information] Trigger Details: MessageId: 71c5d964-215a-4a62-a12c-b0d2c2a15756, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:20 AM +00:00
2019-10-15T07:56:27.116 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=3d1c2ed3-d79e-4492-8ce2-3f6980bef396)
2019-10-15T07:56:27.116 [Information] Trigger Details: MessageId: 23121a01-ef82-4339-8df3-0d4fed9f0cab, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:23 AM +00:00
2019-10-15T07:56:27.116 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=72e3bd65-4c14-4b21-ad1a-9055ec0c7491)
2019-10-15T07:56:27.116 [Information] Trigger Details: MessageId: df15e792-fecb-45c9-91d3-73d493faea4f, DequeueCount: 1, InsertionTime: 10/15/2019 7:56:24 AM +00:00
2019-10-15T07:56:27.123 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.123 [Information] Executed 'Function1' (Succeeded, Id=eb01dce6-23fd-4128-bccd-8c26c104361f)
2019-10-15T07:56:27.123 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.124 [Information] Executed 'Function1' (Succeeded, Id=d5b92a82-5159-4c28-91b0-c84668916f13)
2019-10-15T07:56:27.124 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.124 [Information] Executed 'Function1' (Succeeded, Id=86d6abb4-9a41-4bac-9259-17d85afd031f)
2019-10-15T07:56:27.126 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.126 [Information] Executed 'Function1' (Succeeded, Id=01d94b5c-230d-4ae0-adb6-70a949ec97b5)
2019-10-15T07:56:27.132 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.132 [Information] Executed 'Function1' (Succeeded, Id=fd928cf3-c0a7-4fa1-88a1-e31eecf069bd)
2019-10-15T07:56:27.133 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.133 [Information] Executed 'Function1' (Succeeded, Id=8e6de5fc-6696-4030-a2cc-748b5da44ec7)
2019-10-15T07:56:27.134 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.134 [Information] Executed 'Function1' (Succeeded, Id=9f1a6e13-93e7-443c-a021-c0b9d5f4dcfa)
2019-10-15T07:56:27.135 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.135 [Information] Executed 'Function1' (Succeeded, Id=564ca50f-58a7-4475-865d-03706298c5fd)
2019-10-15T07:56:27.142 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.142 [Information] Executed 'Function1' (Succeeded, Id=2d5083ee-e839-4bd1-a59d-3f13a7f04ae2)
2019-10-15T07:56:27.143 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.143 [Information] Executed 'Function1' (Succeeded, Id=b512c3bf-2d80-4e6c-ac6d-fef77f2c93ef)
2019-10-15T07:56:27.144 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.144 [Information] Executed 'Function1' (Succeeded, Id=0febda35-e777-4449-b8dd-a0895daa403e)
2019-10-15T07:56:27.150 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.150 [Information] Executed 'Function1' (Succeeded, Id=78e66fcb-45a5-4612-8dde-ff2c31be1585)
2019-10-15T07:56:27.152 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.152 [Information] Executed 'Function1' (Succeeded, Id=01afae82-451e-42a7-9ed8-346737efdd01)
2019-10-15T07:56:27.153 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.153 [Information] Executed 'Function1' (Succeeded, Id=b6aa18b7-d52c-4b33-83ec-d380a5431de8)
2019-10-15T07:56:27.159 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.160 [Information] C# Queue trigger function processed: 1
2019-10-15T07:56:27.160 [Information] Executed 'Function1' (Succeeded, Id=72e3bd65-4c14-4b21-ad1a-9055ec0c7491)
2019-10-15T07:56:27.161 [Information] Executed 'Function1' (Succeeded, Id=3d1c2ed3-d79e-4492-8ce2-3f6980bef396)
2019-10-15T07:57:25.420 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=a0ba0110-b590-4770-856b-dcd0b9b2fe7e)
2019-10-15T07:57:25.420 [Information] Trigger Details: MessageId: 5203a3c7-30c6-41f3-95af-dafac74512c8, DequeueCount: 1, InsertionTime: 10/15/2019 7:57:21 AM +00:00
2019-10-15T07:57:25.420 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=cb9f04e9-931d-4e09-89b5-0f346043c637)
2019-10-15T07:57:25.421 [Information] Trigger Details: MessageId: 2210c791-0f56-4eae-809d-2060591a5a9b, DequeueCount: 1, InsertionTime: 10/15/2019 7:57:23 AM +00:00
2019-10-15T07:57:25.421 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=0be83ef6-6872-4ed1-bea9-a89ade1aa031)
2019-10-15T07:57:25.421 [Information] Trigger Details: MessageId: 8ec52c1d-584a-4c95-ae74-93beca0ea8dc, DequeueCount: 1, InsertionTime: 10/15/2019 7:57:25 AM +00:00
2019-10-15T07:57:25.422 [Information] C# Queue trigger function processed: 1
2019-10-15T07:57:25.422 [Information] Executed 'Function1' (Succeeded, Id=a0ba0110-b590-4770-856b-dcd0b9b2fe7e)
2019-10-15T07:57:25.423 [Information] C# Queue trigger function processed: 1
2019-10-15T07:57:25.423 [Information] Executed 'Function1' (Succeeded, Id=cb9f04e9-931d-4e09-89b5-0f346043c637)
2019-10-15T07:57:25.429 [Information] C# Queue trigger function processed: 1
2019-10-15T07:57:25.429 [Information] Executed 'Function1' (Succeeded, Id=0be83ef6-6872-4ed1-bea9-a89ade1aa031)
2019-10-15T07:57:27.424 [Information] Executing 'Function1' (Reason='New queue message detected on 'myqueue-items'.', Id=14583d91-ec35-42aa-83c8-0727388b663d)
2019-10-15T07:57:27.424 [Information] Trigger Details: MessageId: 0c55402f-b47b-476e-ae5c-f64d9756b65f, DequeueCount: 1, InsertionTime: 10/15/2019 7:57:26 AM +00:00
2019-10-15T07:57:27.425 [Information] C# Queue trigger function processed: 1
2019-10-15T07:57:27.425 [Information] Executed 'Function1' (Succeeded, Id=14583d91-ec35-42aa-83c8-0727388b663d)
Do you understand now?

@BowmanZhu, just to complete/update your answer: it seems that in Microsoft.Azure.WebJobs version 3.0.16 the polling is triggered differently.
The checking mechanism starts from QueuePollingIntervals.Minimum (100 ms, not 2 seconds) and increases the interval exponentially until it reaches the maximum.
Note: in the Development environment the default MaxPollingInterval is not 1 minute but 2 seconds.
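For a WebJobs SDK 3.x console host, one way to set this value is through the options pattern. The sketch below assumes the Microsoft.Azure.WebJobs.Extensions.Storage package and the QueuesOptions type, so treat it as an illustration rather than the exact setup used for the log that follows:
using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host.Queues;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

class Program
{
    static void Main()
    {
        var host = new HostBuilder()
            .ConfigureWebJobs(b =>
            {
                b.AddAzureStorageCoreServices();
                b.AddAzureStorage(); // registers the Storage queue/blob trigger bindings
            })
            .ConfigureServices(services =>
            {
                // Cap the exponential back-off for empty queues at 15 seconds.
                services.Configure<QueuesOptions>(o => o.MaxPollingInterval = TimeSpan.FromSeconds(15));
            })
            .Build();

        host.Run();
    }
}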
MaxPollingInterval set to 15 seconds (15 000ms):
dbug: Microsoft.Extensions.Hosting.Internal.Host[2]
Hosting started
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 100 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 199 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 336 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 548 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 916 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 1851 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 3199 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 6222 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 13030 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 15000 ms before polling queue 'stackoverflow'.
dbug: Microsoft.Azure.WebJobs.Host.Queues.Listeners.QueueListener[2]
Function 'ProcessQueueMessage' will wait 15000 ms before polling queue 'stackoverflow'.
...

Related

terratest to test SQS

I am using Terratest to test my Terraform code.
My Terraform code sets up an SQS queue that is connected to a Lambda, and the Lambda is the consumer of the SQS messages.
I would like to test the whole flow, so I thought of sending a dummy message to SQS and then reading the message back using Terratest. However, I can't read the message because the Lambda has already consumed it! I can see in the CloudWatch logs that the Lambda consumed the message.
Can anyone suggest how to test this complete flow with Terratest?
The test code for SQS looks like this:
ack_queue_url := terraform.Output(t, terraformOptions, "acknowledgment_queue_url")
time_out_sec := 120
test_message := fmt.Sprintf("terratest-test-message-%s", uniqueId)
aws.SendMessageToQueue(t, awsRegion, ack_queue_url, test_message)
response := aws.WaitForQueueMessage(t, awsRegion, ack_queue_url, time_out_sec)
assert.NoError(t, response.Error)
fmt.Println("###Message Body####:", response.MessageBody)
aws.DeleteMessageFromQueue(t, awsRegion, ack_queue_url, response.ReceiptHandle)
delete_response := aws.WaitForQueueMessage(t, awsRegion, ack_queue_url, time_out_sec)
assert.Error(t, delete_response.Error, aws.ReceiveMessageTimeout{QueueUrl: ack_queue_url, TimeoutSec: time_out_sec})
The output looks like this:
logger.go:66: "https://sqs.us-east-1.amazonaws.com/1234567/tst-queue"
sqs.go:150: Sending message terratest-test-message-DkKAvt to queue https://sqs.us-east-1.amazonaws.com/1234567/tst-queue
sqs.go:170: Message id b9b0a000-1d71-4821-8659-21aebe33cdc0 sent to queue https://sqs.us-east-1.amazonaws.com/1234567/tst-queue
sqs.go:234: Waiting for message on https://sqs.us-east-1.amazonaws.com/1234567/tst-queue (0s)
sqs.go:234: Waiting for message on https://sqs.us-east-1.amazonaws.com/1234567/tst-queue (20s)
sqs.go:234: Waiting for message on https://sqs.us-east-1.amazonaws.com/1234567/tst-queue (40s)
sqs.go:234: Waiting for message on https://sqs.us-east-1.amazonaws.com/1234567/tst-queue (60s)
sqs.go:234: Waiting for message on https://sqs.us-east-1.amazonaws.com/1234567/tst-queue (80s)
sqs.go:234: Waiting for message on https://sqs.us-east-1.amazonaws.com/1234567/tst-queue (100s)
Error Trace: /Users/xxxx/Projects/dummy_test/terratest/complete_test.go:83
Error: Received unexpected error:
Failed to receive messages on https://sqs.us-east-1.amazonaws.com/1234567/tst-queue within 120 seconds
Test: complete_test
###Message Body####:
sqs.go:125: Deleting message from queue https://sqs.us-east-1.amazonaws.com/1234567/tst-queue()
sqs.go:119: MissingParameter: The request must contain the parameter ReceiptHandle.
status code: 400, request id: 4421cdfc-4326-5755-9623-91ca3414ed6f

Azure Durable Function Repeating Function calls

The below question stems from my work trying to use the fan out/fan in pattern to gather license data for 130k users to input to our database for reporting and license management. I have this piece functioning correctly in a different project but it is exceeding the timeout of a standard Azure function. I ported it to the durable function and ran into this issue, then backtracked to the pre-generated code in an attempt to find the issue but it exists no matter what I do.
The code below is from a fresh project using the pre-generated Durable Functions code, with no changes made to it at all.
When I run it, the same function is called multiple times with the same timestamp and different IDs. It then continuously calls Function1_Hello.
I have done some research to try and determine what is causing this but have not been able to find an answer or resolution.
Similar question here and some deep research done here but I have not seen an actual resolution to this issue. Any help is greatly appreciated.
Code
using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

namespace DurableFunctionTest
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static async Task<List<string>> RunOrchestrator(
            [OrchestrationTrigger] IDurableOrchestrationContext context)
        {
            var outputs = new List<string>();

            // Replace "hello" with the name of your Durable Activity Function.
            outputs.Add(await context.CallActivityAsync<string>("Function1_Hello", "Tokyo"));
            outputs.Add(await context.CallActivityAsync<string>("Function1_Hello", "Seattle"));
            outputs.Add(await context.CallActivityAsync<string>("Function1_Hello", "London"));

            // returns ["Hello Tokyo!", "Hello Seattle!", "Hello London!"]
            return outputs;
        }

        [FunctionName("Function1_Hello")]
        public static string SayHello([ActivityTrigger] string name, ILogger log)
        {
            log.LogInformation($"Saying hello to {name}.");
            return $"Hello {name}!";
        }

        [FunctionName("Function1_HttpStart")]
        public static async Task<HttpResponseMessage> HttpStart(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post")] HttpRequestMessage req,
            [DurableClient] IDurableOrchestrationClient starter,
            ILogger log)
        {
            // Function input comes from the request content.
            string instanceId = await starter.StartNewAsync("Function1", null);

            log.LogInformation($"Started orchestration with ID = '{instanceId}'.");

            return starter.CreateCheckStatusResponse(req, instanceId);
        }
    }
}
Console results from running the code are below.
For detailed output, run func with --verbose flag.
[2022-09-28T16:54:28.492Z] Executing 'Function1_Hello' (Reason='(null)', Id=76fc4193-219e-4347-ab08-880785a2d73e)
[2022-09-28T16:54:28.492Z] Executing 'Function1_Hello' (Reason='(null)', Id=022d09eb-0960-427a-8777-c99062c50e34)
[2022-09-28T16:54:28.492Z] Executing 'Function1_Hello' (Reason='(null)', Id=69633782-4904-4ee8-b88a-228e96a4b8de)
[2022-09-28T16:54:28.492Z] Executing 'Function1_Hello' (Reason='(null)', Id=374d38c5-b2b7-4729-9dc0-7ea3e508a564)
[2022-09-28T16:54:28.492Z] Executing 'Function1_Hello' (Reason='(null)', Id=48cd6a04-ee9d-4f30-aff8-59af54233bb6)
[2022-09-28T16:54:29.435Z] Saying hello to Tokyo.
[2022-09-28T16:54:29.456Z] Saying hello to Tokyo.
[2022-09-28T16:54:29.620Z] Saying hello to Tokyo.
[2022-09-28T16:54:29.761Z] Saying hello to Tokyo.
[2022-09-28T16:54:29.761Z] Saying hello to Tokyo.
[2022-09-28T16:54:29.781Z] Host lock lease acquired by instance ID '00000000000000000000000042B5BCB0'.
[2022-09-28T16:54:29.862Z] Executed 'Function1_Hello' (Succeeded, Id=69633782-4904-4ee8-b88a-228e96a4b8de, Duration=2281ms)
[2022-09-28T16:54:29.862Z] Executed 'Function1_Hello' (Succeeded, Id=76fc4193-219e-4347-ab08-880785a2d73e, Duration=2438ms)
[2022-09-28T16:54:29.862Z] Executed 'Function1_Hello' (Succeeded, Id=374d38c5-b2b7-4729-9dc0-7ea3e508a564, Duration=2417ms)
[2022-09-28T16:54:29.862Z] Executed 'Function1_Hello' (Succeeded, Id=48cd6a04-ee9d-4f30-aff8-59af54233bb6, Duration=2437ms)
[2022-09-28T16:54:29.862Z] Executed 'Function1_Hello' (Succeeded, Id=022d09eb-0960-427a-8777-c99062c50e34, Duration=2297ms)
[2022-09-28T16:54:59.644Z] Executing 'Function1' (Reason='(null)', Id=0eaa8de6-d050-4d50-91f9-96c2f019cfeb)
[2022-09-28T16:54:59.644Z] Executing 'Function1' (Reason='(null)', Id=56461ac1-0b65-4797-8d93-38e3b6184db9)
[2022-09-28T16:54:59.644Z] Executing 'Function1' (Reason='(null)', Id=64540679-8f5a-4507-a656-8a011874fe9a)
[2022-09-28T16:54:59.644Z] Executing 'Function1' (Reason='(null)', Id=641d8c52-cce8-4076-a384-b05adc05ab89)
[2022-09-28T16:55:00.672Z] Executing 'Function1' (Reason='(null)', Id=7ab59722-55b1-4fbd-a3e6-c957e3da99f9)
[2022-09-28T16:55:00.743Z] Executed 'Function1' (Succeeded, Id=0eaa8de6-d050-4d50-91f9-96c2f019cfeb, Duration=1572ms)
[2022-09-28T16:55:00.743Z] Executed 'Function1' (Succeeded, Id=64540679-8f5a-4507-a656-8a011874fe9a, Duration=1572ms)
[2022-09-28T16:55:00.795Z] Executed 'Function1' (Succeeded, Id=7ab59722-55b1-4fbd-a3e6-c957e3da99f9, Duration=125ms)
[2022-09-28T16:55:00.795Z] Executed 'Function1' (Succeeded, Id=641d8c52-cce8-4076-a384-b05adc05ab89, Duration=1625ms)
[2022-09-28T16:55:00.795Z] Executed 'Function1' (Succeeded, Id=56461ac1-0b65-4797-8d93-38e3b6184db9, Duration=1624ms)
[2022-09-28T16:55:01.291Z] Executing 'Function1_Hello' (Reason='(null)', Id=069b5928-0947-4c33-842e-befd8d38c9d4)
[2022-09-28T16:55:01.340Z] Saying hello to Seattle.
[2022-09-28T16:55:01.367Z] Executing 'Function1_Hello' (Reason='(null)', Id=346e37d5-fb8d-4ab9-9340-272eee8219a0)
[2022-09-28T16:55:01.413Z] Executed 'Function1_Hello' (Succeeded, Id=069b5928-0947-4c33-842e-befd8d38c9d4, Duration=75ms)
[2022-09-28T16:55:01.417Z] Saying hello to Seattle.
[2022-09-28T16:55:01.442Z] Executed 'Function1_Hello' (Succeeded, Id=346e37d5-fb8d-4ab9-9340-272eee8219a0, Duration=76ms)
[2022-09-28T16:55:01.442Z] Executing 'Function1_Hello' (Reason='(null)', Id=68f9d521-3178-40a3-9b7b-e8dea8c29f19)
[2022-09-28T16:55:01.617Z] Saying hello to Tokyo.
[2022-09-28T16:55:01.619Z] Executing 'Function1_Hello' (Reason='(null)', Id=8dac2a24-0796-40b1-9fc5-97fc14fc7d7c)
[2022-09-28T16:55:01.729Z] Executed 'Function1_Hello' (Succeeded, Id=68f9d521-3178-40a3-9b7b-e8dea8c29f19, Duration=268ms)
[2022-09-28T16:55:01.783Z] Saying hello to Tokyo.
[2022-09-28T16:55:01.787Z] Executing 'Function1_Hello' (Reason='(null)', Id=94bc6a3c-9aba-4857-bc17-01092dacc8ce)
[2022-09-28T16:55:01.944Z] Executed 'Function1_Hello' (Succeeded, Id=8dac2a24-0796-40b1-9fc5-97fc14fc7d7c, Duration=326ms)
[2022-09-28T16:55:02.026Z] Saying hello to Tokyo.
[2022-09-28T16:55:02.053Z] Executed 'Function1_Hello' (Succeeded, Id=94bc6a3c-9aba-4857-bc17-01092dacc8ce, Duration=266ms)
[2022-09-28T16:55:06.515Z] Executing 'Function1' (Reason='(null)', Id=8de1ae6e-b19d-49b0-9559-3817218f81de)
[2022-09-28T16:55:06.515Z] Executing 'Function1' (Reason='(null)', Id=1707f89b-dc2d-466d-ace2-68dadfebf991)
[2022-09-28T16:55:06.535Z] Executed 'Function1' (Succeeded, Id=8de1ae6e-b19d-49b0-9559-3817218f81de, Duration=20ms)
[2022-09-28T16:55:06.547Z] Executed 'Function1' (Succeeded, Id=1707f89b-dc2d-466d-ace2-68dadfebf991, Duration=32ms)
[2022-09-28T16:55:06.909Z] Executing 'Function1_Hello' (Reason='(null)', Id=239bdb01-e2cb-42de-84a3-22ec9ce582c8)
Based on the discussion in the comments, it seems that activity start messages had been sent in previous runs to the work item queue, and when the Function was restarted, those activities executed.
A really easy way to ensure previous data is not used is to change the task hub name.
You can specify one in the host.json:
{
  "extensions": {
    "durableTask": {
      "hubName": "MyHubName"
    }
  }
}
Note however that if you commit this change to Azure, it'll also clear the state there.
Alternatively, you can delete the queues and tables created for your local testing and Durable Functions will recreate them on startup.
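As a rough sketch of that cleanup (assuming the default Azure Storage backend, the local default task hub name "TestHubName", and the Azure.Storage.Queues and Azure.Data.Tables packages; adjust the hub name and connection string to your setup):
using Azure.Data.Tables;
using Azure.Storage.Queues;

var connectionString = "UseDevelopmentStorage=true"; // local storage emulator
var hubName = "TestHubName";                          // use your actual task hub name here

// Delete the control and work-item queues; their names start with the lower-cased hub name.
var queueService = new QueueServiceClient(connectionString);
foreach (var queue in queueService.GetQueues(prefix: hubName.ToLowerInvariant()))
{
    queueService.DeleteQueue(queue.Name);
}

// Delete the Instances and History tables, which are prefixed with the hub name.
var tableService = new TableServiceClient(connectionString);
foreach (var table in tableService.Query())
{
    if (table.Name.StartsWith(hubName))
    {
        tableService.DeleteTable(table.Name);
    }
}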

Azure ServiceBus AbandonMessageAsync releasing message at inconsistent times

I need to inspect a dead-letter queue and, if some condition is met (like the message being older than 30 days), archive the message to some data store rather than just removing it. So I was going to grab the messages, and if a message meets this condition, save it to the store and complete/delete it; if not, abandon it. I have a console app where I'm grabbing the messages from the DLQ, and it seems to work, but if I run it over and over again I see inconsistent results in the number of messages that get returned. It returns all of them for a few iterations (in my example that would be 7), but then it starts getting only 6, 0 or 1, and eventually goes back to the full amount that's in the DLQ (roughly 30 seconds later, which I think is the default lock period for peek lock). I would assume that every time I run this I should get all of the messages, because I abandoned them on the previous run.
I'm using Azure.Messaging.ServiceBus 7.8.1, and it seems like you just pass the message object to the abandon method. If anyone has any suggestions, that would be great!
Code in github: https://github.com/ndn2323/bustest
using Azure.Messaging.ServiceBus;
using System.Text;

namespace BusReceiver
{
    public class TaskRunner
    {
        public TaskRunner() { }

        public async Task Run()
        {
            const string DLQPATH = "/$deadletterqueue";
            var maxMsgCount = 50;
            var connectionString = "[ConnectionString]";
            var topicName = "testtopic1";
            var subscriberName = "testsub1";
            var subscriberDlqName = subscriberName + DLQPATH;

            var client = new ServiceBusClient(connectionString);
            var options = new ServiceBusReceiverOptions();
            options.ReceiveMode = ServiceBusReceiveMode.PeekLock;

            var receiver = client.CreateReceiver(topicName, subscriberName, options);
            var receiverDlq = client.CreateReceiver(topicName, subscriberDlqName, options);

            Log("Starting receive from regular queue");
            var msgList = await receiver.ReceiveMessagesAsync(maxMsgCount, TimeSpan.FromMilliseconds(500));
            Log(msgList.Count.ToString() + " messages found");
            foreach (var msg in msgList)
            {
                await receiver.DeadLetterMessageAsync(msg);
            }

            Log("Starting receive from dead letter queue");
            var msgListDlq = await receiverDlq.ReceiveMessagesAsync(maxMsgCount, TimeSpan.FromMilliseconds(500));
            Log(msgListDlq.Count.ToString() + " messages found in dlq");
            foreach (var msg in msgListDlq)
            {
                Log("MessageId: " + msg.MessageId + " Body: " + Encoding.ASCII.GetString(msg.Body));
                // if some condition, archive message to some data store, else abandon it to be picked up again
                // for this test I'm abandoning all messages
                await receiverDlq.AbandonMessageAsync(msg);
            }

            await receiver.CloseAsync();
            await receiverDlq.CloseAsync();
        }

        private void Log(string msg)
        {
            Console.WriteLine(DateTime.Now.ToString() + ": " + msg);
        }
    }
}
Example of output:
C:\GitHub\ndn2323\bustest\BusReceiver\bin\Debug\net6.0>BusReceiver.exe
5/29/2022 11:45:36 PM: Starting receive from regular queue
5/29/2022 11:45:37 PM: 0 messages found
5/29/2022 11:45:37 PM: Starting receive from dead letter queue
5/29/2022 11:45:37 PM: 7 messages found in dlq
5/29/2022 11:45:37 PM: MessageId: 9e9f390655af44a8b93866920a6de77c Body: TestMessage
5/29/2022 11:45:37 PM: MessageId: 3aacffe40ab5473fb34412684bcd1907 Body: TestMessage
5/29/2022 11:45:37 PM: MessageId: a47f83d4a12845088ade427e084d8e39 Body: TestMessage
5/29/2022 11:45:37 PM: MessageId: 47ff6dd4f4134661a3616a9210670be5 Body: TestMessage
5/29/2022 11:45:37 PM: MessageId: d10b3602f57047f1bf613675e35793e0 Body: TestMessage
5/29/2022 11:45:37 PM: MessageId: 08a45405375e46ffb99db9812c3e3d78 Body: TestMessage
5/29/2022 11:45:37 PM: MessageId: d21cff4ae5b6453f9077b3805ace4e09 Body: TestMessage
C:\GitHub\ndn2323\bustest\BusReceiver\bin\Debug\net6.0>BusReceiver.exe
5/29/2022 11:45:42 PM: Starting receive from regular queue
5/29/2022 11:45:43 PM: 0 messages found
5/29/2022 11:45:43 PM: Starting receive from dead letter queue
5/29/2022 11:45:43 PM: 7 messages found in dlq
5/29/2022 11:45:43 PM: MessageId: 9e9f390655af44a8b93866920a6de77c Body: TestMessage
5/29/2022 11:45:43 PM: MessageId: 3aacffe40ab5473fb34412684bcd1907 Body: TestMessage
5/29/2022 11:45:43 PM: MessageId: a47f83d4a12845088ade427e084d8e39 Body: TestMessage
5/29/2022 11:45:43 PM: MessageId: 47ff6dd4f4134661a3616a9210670be5 Body: TestMessage
5/29/2022 11:45:43 PM: MessageId: d10b3602f57047f1bf613675e35793e0 Body: TestMessage
5/29/2022 11:45:43 PM: MessageId: 08a45405375e46ffb99db9812c3e3d78 Body: TestMessage
5/29/2022 11:45:43 PM: MessageId: d21cff4ae5b6453f9077b3805ace4e09 Body: TestMessage
C:\GitHub\ndn2323\bustest\BusReceiver\bin\Debug\net6.0>BusReceiver.exe
5/29/2022 11:45:48 PM: Starting receive from regular queue
5/29/2022 11:45:49 PM: 0 messages found
5/29/2022 11:45:49 PM: Starting receive from dead letter queue
5/29/2022 11:45:49 PM: 1 messages found in dlq
5/29/2022 11:45:49 PM: MessageId: d21cff4ae5b6453f9077b3805ace4e09 Body: TestMessage
C:\GitHub\ndn2323\bustest\BusReceiver\bin\Debug\net6.0>BusReceiver.exe
5/29/2022 11:46:03 PM: Starting receive from regular queue
5/29/2022 11:46:04 PM: 0 messages found
5/29/2022 11:46:04 PM: Starting receive from dead letter queue
5/29/2022 11:46:04 PM: 1 messages found in dlq
5/29/2022 11:46:04 PM: MessageId: d21cff4ae5b6453f9077b3805ace4e09 Body: TestMessage
Due to variations in the network, service, and your application, it is normal to see batches of inconsistent size returned when calling ReceiveMessagesAsync.
When receiving, there is no minimum batch size. The receiver will add enough credits to the link to allow maxMessageCount to flow from the service but will not wait in an attempt to build a batch of that size. Once any messages have been transferred from the service, they will be returned as the batch. Because you specified a maxWaitTime, if no messages were available on the service within that time, an empty batch will be returned.
Setting a PrefetchCount in your ServiceBusReceiverOptions can help to smooth out the batch sizes. That said, it is important to be aware that locks are held for messages in the prefetch queue and are not automatically renewed, so setting the prefetch count too high will result in seeing expired locks.
In your example, the best approach may be to perform your receive loop repeatedly until you see one or more consecutive empty batches; that is a strong indicator that the queue is empty.
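A rough sketch of that loop, reusing client, topicName, and subscriberDlqName from the code above (the 30-day check and the archive step are placeholders for your actual condition and data store):
using System;
using System.Collections.Generic;
using System.Linq;
using Azure.Messaging.ServiceBus;

// ...inside an async method, with 'client', 'topicName' and 'subscriberDlqName' set up as in the question...
var options = new ServiceBusReceiverOptions
{
    ReceiveMode = ServiceBusReceiveMode.PeekLock,
    PrefetchCount = 50 // smooths out batch sizes; keep it low enough that locks don't expire before processing
};
var receiverDlq = client.CreateReceiver(topicName, subscriberDlqName, options);

var seen = new HashSet<string>();
while (true)
{
    var batch = await receiverDlq.ReceiveMessagesAsync(maxMessages: 50, maxWaitTime: TimeSpan.FromSeconds(2));

    // Stop once the DLQ is drained, or once every message in the batch is one we already abandoned.
    if (batch.Count == 0 || batch.All(m => seen.Contains(m.MessageId)))
        break;

    foreach (var msg in batch)
    {
        seen.Add(msg.MessageId);
        if (msg.EnqueuedTime < DateTimeOffset.UtcNow.AddDays(-30))
        {
            // archive the message body to your data store here, then remove it from the DLQ
            await receiverDlq.CompleteMessageAsync(msg);
        }
        else
        {
            await receiverDlq.AbandonMessageAsync(msg); // becomes visible again almost immediately
        }
    }
}

await receiverDlq.CloseAsync();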

Azure function visibilityTimeout

When I read the documentation about visibilityTimeout (https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-queue#host-json), it says "The time interval between retries when processing of a message fails." How I understand this is that if the timeout is set to 30 seconds and my function runs for 1 minute but doesn't fail during that minute, the message does not become visible to other consumers of the queue. But when I read about it from other sources (Stack Overflow, for example), they tell me the opposite: that when the execution time of the function exceeds the timeout, the message becomes visible EVEN though the function is still processing it.
What is the truth? Is the timeout only relevant when the function is no longer running (and may have failed), or can the message become visible again even though the function is still running?
What doesn't make sense either, if we assume that the message becomes visible when the timeout is reached, is that the default timeout is 00:00:00, which would imply that the message is visible the moment it is dequeued. This contradicts what the third-party sources are saying.
I am a bit confused by this.
It appears there are actually two different visibility timeout values used here. Both are set by the Azure WebJobs SDK but only one is configurable.
When the function fails
The queues.visibilityTimeout configuration option would be more aptly named retryDelay.
When the function throws an exception or fails for some other kind of error, the message gets returned to the queue to be retried. The message is returned with the configured visibilityTimeout (see here), which delays when the function will next attempt to run.
This allows your application to cope with transient errors. For example, if an email API or other external service is temporarily down. By delaying the retry, there is a chance that the service may be back online for the next function attempt.
Retry is limited to maxDequeueCount attempts (5 default) before a message is moved to the Poison Queue.
While the function is running
When the QueueTrigger binding runs the function, it dequeues the message with a visibility timeout of 10mins (hard-coded here). It then sets a timer to extend the visibility window when it reaches half-time as long as the function is running (see the timer and visibility update in the source).
Ordinarily you don't need to worry about this as long as your functions use CancellationTokens correctly. This 10min timeout only matters if the Azure Function/WebJob host doesn't get to shut down gracefully. For example:
someone "pulls the plug" on the web host
if the function doesn't respond to the CancellationToken in time during scale-in or other Azure shutdown events
So, as long as the function is still running, the message will remain hidden from the queue.
Verification
I did a similar experiment to check:
[FunctionName("SlowJob")]
public async Task Run(
[QueueTrigger("slow-job-queue")] CloudQueueMessage message,
ILogger log)
{
for (var i = 0; i < 20; i++)
{
log.LogInformation($"Next visible {i}: {message.NextVisibleTime}");
await Task.Delay(60000);
}
}
Output:
Next visible 0: 5/11/2020 7:49:24 +00:00
Next visible 1: 5/11/2020 7:49:24 +00:00
Next visible 2: 5/11/2020 7:49:24 +00:00
Next visible 3: 5/11/2020 7:49:24 +00:00
Next visible 4: 5/11/2020 7:49:24 +00:00
Next visible 5: 5/11/2020 7:54:24 +00:00
Next visible 6: 5/11/2020 7:54:24 +00:00
Next visible 7: 5/11/2020 7:54:24 +00:00
Next visible 8: 5/11/2020 7:54:24 +00:00
Next visible 9: 5/11/2020 7:54:24 +00:00
Next visible 10: 5/11/2020 7:59:24 +00:00
Next visible 11: 5/11/2020 7:59:24 +00:00
...
I have tested this with the following code:
using System;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;
using Microsoft.WindowsAzure.Storage.Queue;

namespace WorkerFunctions
{
    public static class WorkerFunctions
    {
        [FunctionName("WorkerFunction1")]
        public static async Task Function1(
            [QueueTrigger("outputQueue")] CloudQueueMessage item,
            [Queue("outputQueue")] CloudQueue outputQueue,
            DateTimeOffset nextVisibleTime,
            DateTimeOffset expirationTime,
            DateTimeOffset insertionTime,
            ILogger log)
        {
            log.LogInformation("########## Function 1 ###############");
            log.LogInformation($"NextVisibleTime: {nextVisibleTime}");
            log.LogInformation($"NextVisibleTime: {(nextVisibleTime - insertionTime).TotalSeconds}");
            log.LogInformation($"C# Queue trigger function processed: {item.AsString}");
            Thread.Sleep(TimeSpan.FromMinutes(20));
        }

        [FunctionName("WorkerFunction2")]
        public static async Task Function2(
            [QueueTrigger("outputQueue")] CloudQueueMessage item,
            [Queue("outputQueue")] CloudQueue outputQueue,
            DateTimeOffset nextVisibleTime,
            DateTimeOffset expirationTime,
            DateTimeOffset insertionTime,
            ILogger log)
        {
            log.LogInformation("########## Function 2 ###############");
            log.LogInformation($"NextVisibleTime: {nextVisibleTime}");
            log.LogInformation($"NextVisibleTime: {(nextVisibleTime - insertionTime).TotalSeconds}");
            log.LogInformation($"C# Queue trigger function processed: {item.AsString}");
            Thread.Sleep(TimeSpan.FromMinutes(20));
        }
    }
}
With this host.json file:
{
  "version": "2.0",
  "extensions": {
    "queues": {
      "maxPollingInterval": "00:00:02",
      "visibilityTimeout": "00:00:10",
      "batchSize": 16,
      "maxDequeueCount": 5,
      "newBatchThreshold": 8
    }
  }
}
And when I put a simple message on the queue and let it run, I see the following:
the function that grabs the message doesn't release it before the sleep is over
I can't see in the logs that the lease is renewed, but it seems to happen under the hood
What this tells me:
if neither the function nor the host fails, the lease is auto-renewed, in line with https://stackoverflow.com/a/31883806/21199
when the visibility timeout is reached while the function is still running, the message does not get re-added to the queue
the documentation about visibilityTimeout is correct: "The time interval between retries when processing of a message fails." (from https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-queue#hostjson-settings)
I haven't saved any links to the third-party sources that contradicted this (sorry), but they exist. I hope someone will answer this, so I can get clarification.

setImmediate() function not being called after process.nextTick() function

For this snippet:
const foo = [1, 2];
const bar = ['a', 'b'];

foo.forEach(num => {
  console.log(`setting setImmmediate ${num}`);
  setImmediate(() => {
    console.log(`running setImmediate ${num}`);
    bar.forEach(char => {
      console.log(`setting nextTick ${char}`);
      process.nextTick(() => {
        console.log(`running nextTick ${char}`);
      });
    });
  });
});
The output is
$ node scratch.js
setting setImmmediate 1
setting setImmmediate 2
running setImmediate 1
setting nextTick a
setting nextTick b
running setImmediate 2
setting nextTick a
setting nextTick b
running nextTick a
running nextTick b
running nextTick a
running nextTick b
From the docs
the nextTickQueue will be processed after the current operation completes, regardless of the current phase of the event loop.
As I understand it, process.nextTick() adds the callback to the current event's nextTickQueue, and it is executed immediately after the current event completes, no matter what phase the event loop is in.
Shouldn't the output therefore be the following?
setting setImmmediate 1
setting setImmmediate 2
running setImmediate 1
setting nextTick a
setting nextTick b
running nextTick a
running nextTick b
running setImmediate 2
setting nextTick a
setting nextTick b
running nextTick a
running nextTick b
the nextTickQueue will be processed after the current operation completes, regardless of the current phase of the event loop.
I misunderstood the event loop documentation: I took "current operation" to mean the currently processing event, when it actually means the currently processing phase. Both setImmediate callbacks are queued in the same check phase, so the loop runs both of them before moving on and draining the nextTickQueue, which is why all four nextTick callbacks run last.
From Daniel Khan's "What you should know to really understand the Node.js Event Loop":

Resources