Update TimerTrigger expression dynamically in Azure Functions

The user enters the CRON expression through an interface, and the function app should update its app settings to reflect the user input.
My current approach
The TimerTrigger function with the schedule app setting:
[FunctionName("Cleanup")]
public static async Task Run([TimerTrigger("%schedule%")]TimerInfo myTimer, ILogger log)
{
    // Get the connection string from app settings and use it to create a connection.
    var str = Environment.GetEnvironmentVariable("db_connection");
    log.LogInformation($"db_connection : {str}");
}
Setting the schedule app setting via an environment variable:
[FunctionName("SetConfig")]
public static async Task<HttpResponseMessage> SetConfig([HttpTrigger(AuthorizationLevel.Function, "post", Route = null)]HttpRequestMessage req)
{
    HistoryLogDeleteConfigDto data = await req.Content.ReadAsAsync<HistoryLogDeleteConfigDto>();
    Environment.SetEnvironmentVariable("schedule", data.Schedule);
    return req.CreateResponse(HttpStatusCode.OK);
}
local.settings.json file:
"Values": {
    "db_connection": "Server=DESKTOP-DFJ3PBT;Database=CovalentLogger;Trusted_Connection=True;MultipleActiveResultSets=true",
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "schedule": "*/20 * * * * *"
}
Postman request body to update the schedule app setting:
{
    "Schedule": "*/30 * * * * *"
}
But no luck. After sending the request from Postman to update the setting, the function app settings in the Azure portal still show the old value.
However, if I query the environment variable like below
Environment.GetEnvironmentVariable("schedule", EnvironmentVariableTarget.Process)
I can see the new expression. But in the Azure portal the function app setting still shows the old value, so the job still executes on the old schedule.
Where have I gone wrong?
Thank you

I don't think updating the environment variable like that will work, as the schedule is only read when the Function host initializes, or when the app setting is updated on the App Service. However, it should still be fairly doable. This does basically exactly what you are looking for: https://stackoverflow.com/a/50116234/1537195
Just package this in your HTTP-triggered Function (and I'd probably use Managed Identity) and you are good to go.
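For illustration, here is a rough sketch of that approach using the Microsoft.Azure.Management.Fluent SDK inside the HTTP-triggered function. The resource group, app name, and service principal credentials are placeholders, not values from the question; with Managed Identity you would build the credentials differently:

// Sketch only: updating the "schedule" app setting on the function app itself
// restarts the host, so the TimerTrigger picks up the new CRON expression.
var credentials = SdkContext.AzureCredentialsFactory.FromServicePrincipal(
    "<client-id>", "<client-secret>", "<tenant-id>", AzureEnvironment.AzureGlobalCloud);

var azure = Microsoft.Azure.Management.Fluent.Azure
    .Authenticate(credentials)
    .WithDefaultSubscription();

var functionApp = azure.AppServices.FunctionApps
    .GetByResourceGroup("<resource-group>", "<function-app-name>");

await functionApp.Update()
    .WithAppSetting("schedule", data.Schedule)
    .ApplyAsync();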

Related

Azure Durable Function - load balance Activity Functions over multiple Function apps?

I've tried to build a solution in which I have multiple Function Apps deployed (in multiple Azure regions) which all share the same Task Hub (set via host.json) and storage account. My idea was that since each of them should listen for new work for the Activity Functions, the load should get somewhat distributed. But that's not happening, at least with what I have tried so far. It looks like the Function App for the Activity Functions is already determined by which Orchestrator is picked. (I had to deploy the Orchestrator together with the Activity Functions or else it wouldn't even kick off.)
So my question is: Would such a scenario be possible to achieve using Durable Functions?
host.json
{
    "version": "2.0",
    "extensions": {
        "durableTask": {
            "hubName": "MyTaskHub",
            "storageProvider": {
                "connectionStringName": "DurableStorage",
                "useAppLease": false
            }
        }
    }
}
Thank you Anand Sowmithiran. Posting your suggestion as an answer so that it will be helpful for other community members who face similar issues.
Now my idea was that since each of them should listen for new work for the Activity Functions, the load should get somewhat distributed. But that's not happening, at least with what I have tried so far. It looks like the Function App for the Activity Functions is already determined by which Orchestrator is picked.
As per the Microsoft documentation, the task hub should be different for each durable function app; each app should have its own task hub name even when they share a storage account. Below is a sample host.json that reads the task hub name from an app setting:
{
    "version": "2.0",
    "extensions": {
        "durableTask": {
            "hubName": "%MyTaskHub%"
        }
    }
}
In addition to host.json, task hub names can also be configured in orchestration client binding metadata.
[FunctionName("HttpStart")]
public static async Task<HttpResponseMessage> Run(
    [HttpTrigger(AuthorizationLevel.Function, methods: "post", Route = "orchestrators/{functionName}")] HttpRequestMessage req,
    [DurableClient(TaskHub = "%MyTaskHub%")] IDurableOrchestrationClient starter,
    string functionName,
    ILogger log)
{
    // Function input comes from the request content.
    object eventData = await req.Content.ReadAsAsync<object>();
    string instanceId = await starter.StartNewAsync(functionName, eventData);
    log.LogInformation($"Started orchestration with ID = '{instanceId}'.");
    return starter.CreateCheckStatusResponse(req, instanceId);
}
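Because %MyTaskHub% is resolved from app settings, each function app can point at its own task hub while sharing the same storage account. A minimal sketch of the corresponding local.settings.json entry for one of the apps (the values are placeholders):

{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet",
        "MyTaskHub": "TaskHubApp1"
    }
}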

CosmosDBTrigger not getting invoked when CosmosDB binding is also present

I'm trying to make a CosmosDBTriggered function in my precompiled C# CI/CD deployed project.
Here's the function implementation, which gets deployed with no complaints. I've tried static and instance methods.
There are no errors but also no invocations as reported by the monitoring/Insights tools even though the watched Collection has items and changes while it's deployed.
The portal shows the function as enabled and with a CosmosDB trigger.
I've tried adding these dependencies individually, but no changes:
<PackageReference Include="Microsoft.Azure.WebJobs.Extensions.CosmosDB" Version="3.0.10" />
<PackageReference Include="Microsoft.Azure.WebJobs.Script.ExtensionsMetadataGenerator" Version="1.2.1" />
This function DOES NOT appear in the Triggers of any CosmosDB Collection as I might expect, but I think that's possibly for a different kind of Trigger.
What configuration step am I missing??
UPDATE
When I comment out this [CosmosDB] DocumentClient binding (and anything that relies on it), the function is invoked. So I guess it's a problem with those bindings being used together?
Are you sure you set the CosmosDbConnection setting in your function app's configuration on Azure?
For example, this is my function app:
Function1.cs
using System;
using System.Collections.Generic;
using Microsoft.Azure.Documents;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

namespace FunctionApp109
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static void Run([CosmosDBTrigger(
            databaseName: "testbowman",
            collectionName: "testbowman",
            ConnectionStringSetting = "str",
            CreateLeaseCollectionIfNotExists = true,
            LeaseCollectionName = "lease")]IReadOnlyList<Document> input, ILogger log)
        {
            if (input != null && input.Count > 0)
            {
                log.LogInformation("Documents modified " + input.Count);
                log.LogInformation("First document Id " + input[0].Id);
            }
        }
    }
}
local.settings.json
{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=0730bowmanwindow;AccountKey=xxxxxx;EndpointSuffix=core.windows.net",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet",
        "str": "AccountEndpoint=https://testbowman.documents.azure.com:443/;AccountKey=xxxxxx;"
    }
}
But when you deploy the function app to Azure, local.settings.json is not used; you need to set the connection string in the function app's Configuration (Application settings) in the Azure portal instead.
The function app on Azure will not surface an error for this; the trigger simply doesn't fire.
Based on your updated post, the issue is that the Functions runtime is not initializing your Function because of some configuration issue in your bindings.
Normally, the actual error should be in your Application Insights logs.
The [CosmosDB] DocumentClient binding you are using is missing the databaseName and collectionName properties. See the official samples (https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-cosmosdb-v2-input?tabs=csharp#http-trigger-get-multiple-docs-using-documentclient-c), and assuming CosmosDBConnection is a constant that holds the name of the app setting containing the connection string:
[CosmosDB(
    databaseName: "<your-db-name>",
    collectionName: "<your-collection-name>",
    ConnectionStringSetting = CosmosDBConnection)] DocumentClient dbClient,
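Putting the two together, a minimal sketch of a function signature with both the CosmosDBTrigger and a [CosmosDB] DocumentClient binding; the database, collection, lease, and setting names are illustrative, not taken from the original post:

[FunctionName("Function1")]
public static void Run(
    [CosmosDBTrigger(
        databaseName: "<your-db-name>",
        collectionName: "<your-collection-name>",
        ConnectionStringSetting = "str",
        CreateLeaseCollectionIfNotExists = true,
        LeaseCollectionName = "lease")] IReadOnlyList<Document> input,
    [CosmosDB(
        databaseName: "<your-db-name>",
        collectionName: "<your-collection-name>",
        ConnectionStringSetting = "str")] DocumentClient dbClient,
    ILogger log)
{
    // Both bindings resolve their connection string from the "str" app setting,
    // which must also exist in the function app's configuration on Azure.
    log.LogInformation($"Received {input?.Count ?? 0} changed documents.");
}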

How do I pass in the storage account connection string for a CosmosDBTrigger?

I'm trying to figure out the proper way to pass a storage account connection string to a CosmosDBTrigger function. I have a function that runs when there is a change in a Cosmos DB container; it copies image blobs from one container to another. In the code below, I have commented out the line where I am trying to specify the storage account that I want to connect to. The function runs when that line is commented out; it does not run when it is un-commented. Why?
public static class Function1
{
    [FunctionName("ImageCopier")]
    public static async Task Run([CosmosDBTrigger(
        databaseName: "MyDatabase",
        collectionName: "Orders",
        ConnectionStringSetting = "databaseConnection",
        CreateLeaseCollectionIfNotExists = true,
        LeaseDatabaseName = "TriggerLeases",
        LeaseCollectionName = "TriggerLeases",
        LeaseCollectionPrefix = "ImageCopier")]IReadOnlyList<Document> input,
        //[StorageAccount("MyStorageAccount")]string storageConnectionString,
        ILogger log)
    {
I have MyStorageAccount defined in my local.settings.json file and I also have it in my Azure Function Configuration settings. I copied the connection string directly from the storage account keys panel.
When you set up a CosmosDB trigger, the information that is supplied in that trigger is specific to the trigger. If you need a setting or configuration not related to the trigger in your code, you can use the Environment.GetEnvironmentVariable method.
In your local environment, you can set these variables by editing the local.settings.json file, specifically the Values section. For example:
{
    "IsEncrypted": false,
    "Values": {
        "JobUri": "https://yourapiendpointurl.com",
        "BlobStorageConnectionString": "the connection string",
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet"
    }
}
In your method, you may grab that value like so:
public static class Function1
{
    [FunctionName("ImageCopier")]
    public static async Task Run([CosmosDBTrigger(
        databaseName: "MyDatabase",
        ...
        ILogger log)
    {
        var connectionString =
            Environment.GetEnvironmentVariable("BlobStorageConnectionString");
    }
}
The local.settings.json file will not be used when the function runs in Azure.
I am not sure whether publishing the function migrates the settings from local.settings.json to your Azure Function app's configuration, so I would check that your settings are in there after publishing.
Side note: be careful when committing code to repositories; you don't want secrets in your repository in case someone gets into it and discovers them.
While you can access raw configuration values using GetEnvironmentVariable, a more robust/idiomatic approach with .NET in particular is to leverage the built-in dependency injection of configuration.
Using this, you can accept an IConfiguration or strongly-typed IOptions through the function's constructor and use the values in your code. For example:
public class Function1
{
    private readonly IConfiguration configuration;

    public Function1(IConfiguration configuration)
    {
        this.configuration = configuration;
    }

    [FunctionName("ImageCopier")]
    public async Task Run([CosmosDBTrigger(/* trigger params */)] IReadOnlyList<Document> input)
    {
        var connectionString = configuration["MyStorageAccount"];
        // Use connection string
    }
}
You can take this further and inject services like an "ImageBlobService" into your functions that have already been configured in a common Startup class, just like in ASP.NET Core, as sketched below. That way the individual functions don't need to know anything about configuration and simply ask for the relevant service.
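A minimal sketch of such a Startup class, assuming a hypothetical ImageBlobService that is not part of the original post:

using Microsoft.Azure.Functions.Extensions.DependencyInjection;
using Microsoft.Extensions.DependencyInjection;

[assembly: FunctionsStartup(typeof(MyFunctionApp.Startup))]

namespace MyFunctionApp
{
    // Hypothetical service used for illustration only.
    public class ImageBlobService { }

    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            // Register the hypothetical service once; functions receive it
            // (and IConfiguration) through constructor injection.
            builder.Services.AddSingleton<ImageBlobService>();
        }
    }
}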

Azure Timer Function - Microsoft.WindowsAzure.Storage: Settings must be of the form "name=value"

I'm able to create http trigger functions and execute them no problem, but when I create a timer function, I get the following error:
The listener for function 'Functions.CheckForWinnersOnTimer' was unable to start. Microsoft.WindowsAzure.Storage: Settings must be of the form "name=value".
This is with no alteration of the template code when I create the function.
module.exports = async function (context, myTimer) {
    var timeStamp = new Date().toISOString();
    if (myTimer.IsPastDue)
    {
        context.log('JavaScript is running late!');
    }
    context.log('JavaScript timer trigger function ran!', timeStamp);
};
Why is that? I appreciate any help!
That error points at a malformed connection string setting. Given that there is nothing wrong with your code, the problem is most likely caused by local.settings.json; note the format of the value of AzureWebJobsStorage.
If you are using the local storage emulator, it should be UseDevelopmentStorage=true. If you are using a Storage Account on Azure, copy the connection string from the storage account's Access keys blade into AzureWebJobsStorage.
An example of local.settings.json using a local storage emulator:
{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "node"
    }
}
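For comparison, a local.settings.json pointing at a real storage account would look roughly like this (account name and key are placeholders); note the name=value;name=value format that the error message is complaining about:

{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net",
        "FUNCTIONS_WORKER_RUNTIME": "node"
    }
}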

Azure Queue Trigger Function - Local Integration

I am creating a simple queue-triggered Azure Function using Visual Studio. I am connecting it to my storage account, but for some reason it's not working. Any help is appreciated.
This is my code: (auto-generated by VS)
[FunctionName("QueueTest")]
public static void Run([QueueTrigger("my-queue", Connection = "")]string myQueueItem, TraceWriter log)
{
    log.Info($"C# Queue trigger function processed: {myQueueItem}");
}
This is my local.settings.json
{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=accountname;AccountKey=accountkey"
    }
}
The queue trigger uses the AzureWebJobsStorage account by default. All you need to do is remove the Connection parameter from the attribute:
[FunctionName("QueueTest")]
public static void Run([QueueTrigger("my-queue")]string myQueueItem, TraceWriter log)
{
    log.Info($"C# Queue trigger function processed: {myQueueItem}");
}
Ideally, if you are a Windows user, use the Azure Storage Emulator to connect to local queues, and change the connection string in your local.settings.json file accordingly:
{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "AzureWebJobsDashboard": "UseDevelopmentStorage=true"
    }
}
If you are not a Windows user, you must connect to queues hosted on the Azure platform. To do this, find the storage account linked to your function app and copy the connection string from its settings (Storage Account -> Access keys -> Connection string).
So I figured out the issue. All the configuration was fine. The problem was that the Azure Functions version of my function app was 1, and, probably because of the latest SDK/WebJobs packages, version 1 was not working correctly. I had to create another function app targeting Azure Functions version 2 (see the csproj sketch below) and everything worked fine.
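For reference, the target runtime version of a precompiled C# Functions project is controlled by the AzureFunctionsVersion property in the csproj. A minimal sketch (the target framework shown is an assumption for a v2 project):

<PropertyGroup>
  <TargetFramework>netcoreapp2.1</TargetFramework>
  <AzureFunctionsVersion>v2</AzureFunctionsVersion>
</PropertyGroup>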
You need to add the connection string of your queue storage account to local.settings.json and then supply the name of that setting as the Connection parameter of the QueueTrigger, e.g. in local.settings.json:
"Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=accountname;AccountKey=accountkey",
    "MyStorage": "DefaultEndpointsProtocol=https;AccountName=accountname2;AccountKey=accountkey2;EndpointSuffix=core.windows.net"
}
and in your code
[QueueTrigger("my-queue", Connection = "MyStorage")]string myQueueItem
