'No storage connection string found.' when trying to list durable function instances from Azure Functions Core Tools

I am currently trying to list all instances of an activity function and the orchestrator function using Azure Functions Core Tools. The application synchronizes data from different sources into a centralized location.
The setup is as follows:
TimerTrigger -> Durable Orchestrator -> Multiple Activity Functions
In my concrete example, it is like this:
Start Synchronization -> Orchestrate Synchronizations -> Synchronize Source
So we start the synchronization process, which starts the orchestrator. The orchestrator then starts multiple different synchronizations, one for each source. The problem, though, is that I cannot seem to get Azure Functions Core Tools to list all instances of the functions I am interested in.
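For context, the overall shape of the app looks roughly like this (a minimal sketch; the function, activity, and source names are illustrative placeholders, not the actual code):

using System.Linq;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public static class SynchronizationFunctions
{
    [FunctionName("StartSynchronization")]
    public static async Task StartSynchronization(
        [TimerTrigger("0 0 * * * *")] TimerInfo timer,
        [DurableClient] IDurableOrchestrationClient starter)
    {
        // Kick off one orchestration per timer tick.
        await starter.StartNewAsync("OrchestrateSynchronizations");
    }

    [FunctionName("OrchestrateSynchronizations")]
    public static async Task OrchestrateSynchronizations(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        // Fan out: one activity per source, awaited together.
        var sources = new[] { "SourceA", "SourceB" };
        var tasks = sources.Select(s => context.CallActivityAsync("SynchronizeSource", s));
        await Task.WhenAll(tasks);
    }

    [FunctionName("SynchronizeSource")]
    public static Task SynchronizeSource([ActivityTrigger] string source)
    {
        // Pull data from the given source into the centralized store (omitted).
        return Task.CompletedTask;
    }
}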
Unfortunately, I would really prefer not to have to use the REST API to query for this information. The setup really complicates things with IP restrictions and managed identity authentication. I think I could adjust the setup to make it work from my network and user, if really needed, but I think that would take far longer than it should.
I have tried running the following command:
func durable get-instances
in a directory with a file called host.json with the following contents:
{
  "version": "2.0",
  "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=Name;AccountKey=Key;EndpointSuffix=core.windows.net"
}
I have also tried where the contents of the file are as follows:
{
  "version": "2.0",
  "extensions": {
    "durableTask": {
      "storageProvider": {
        "connectionStringName": "DefaultEndpointsProtocol=https;AccountName=Name;AccountKey=Key;EndpointSuffix=core.windows.net"
      }
    }
  }
}
I have tried calling func durable get-instances with and without the --connection-string-setting parameter, with the values 'AzureWebJobsStorage' and 'extensions:durableTask:storageProvider:connectionStringName', but nothing seems to work. I keep getting the error 'No storage connection string found.'
I know that the connection string is correct. I have pulled it directly from the storage account 'Access keys' blade.
Is there anything I am missing? What am I doing wrong?

Thanks to @juunas, I got it to work. I edited the host.json file to have the following content:
{
  "version": "2.0"
}
and created another file called local.settings.json with the following contents:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=Name;AccountKey=Key;EndpointSuffix=core.windows.net"
  }
}
Running func durable get-instances now works and returns a continuation token, but an empty list. I was not expecting that, but now I can start exploring from here to understand what is going on.
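If anyone else hits the same empty list: the Core Tools durable commands query the default task hub unless told otherwise, so a mismatched task hub name is one thing worth checking (an assumption on my part, not something I have confirmed for this app). The command accepts targeting and filtering flags, for example (MyTaskHub is a placeholder):

func durable get-instances --connection-string-setting AzureWebJobsStorage --task-hub-name MyTaskHub --top 10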

Related

How to configure Serilog settings in Azure Function?

I’m in the process of creating my first production Azure Function, and I’m trying to write log information to a local file and then eventually to Blob Storage. The local file is more for development troubleshooting; ultimately I would like production information stored in Blob Storage. I’m not only new to Azure Functions but also new to Serilog. I’ve used NLog in all my other applications but couldn’t get it to work with Azure Functions.
Currently I’m trying to get the local log working. I actually seem to have it working, but I’m not understanding how I can tweak a couple of things.
The first thing I’m trying to change is the amount of information that is getting logged. It seems to be logging a whole bunch of system-type information, like request info to the blob storage. There is so much being logged that the entries I add in code get lost. It looks like all the system entries are marked as Information, which is probably why they show up in my log. However, I would like it to only log data from when I specifically call logger.Information(“some text”) in my code. Is there a way to suppress all of the Microsoft system information?
The second thing is how I can make the Serilog configuration come from my local.settings.json file. Below is a sample of my file, and I’m not sure whether I should add the configuration information inside the Values property or outside it as its own property. I’m assuming it would be its own property, but so far all my custom settings have come from the Values property.
Do I need to add the Serilog.Settings.Configuration NuGet package? If so, I’m not understanding how to configure my Startup.cs file to get the information from the local settings file instead of configuring the settings directly in code. Eventually, I would like to add it to Dependency Injection so I can use the logger in other classes as well.
Startup.cs
public override void Configure(IFunctionsHostBuilder builder)
{
    builder.Services.AddTransient<IDataManager, DataManager>();
    ConfigureServices(builder.Services).BuildServiceProvider(true);
}

private IServiceCollection ConfigureServices(IServiceCollection services)
{
    services
        .AddLogging(loggingBuilder =>
            loggingBuilder.AddSerilog(
                new LoggerConfiguration()
                    .MinimumLevel.Information()
                    .WriteTo.Console()
                    .WriteTo.File(@"D:\logs\AzureFunction\log_.txt", rollingInterval: RollingInterval.Day)
                    .CreateLogger())
        );
    return services;
}
local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "ProcessLookBackDays": "90",
    "SqlConnection": "connection info",
    "StorageConnection": "connection info",
    "AzureWebJobsStorage": "connection info",
    "InputContainer": "test-files",
    "InputFolder": "input",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  },
  "Serilog": {
    "Using": [ "Serilog.Sinks.File" ],
    "MinimumLevel": {
      "Default": "Information"
    },
    "WriteTo": [
      {
        "Name": "File",
        "Args": {
          "path": "D:\\logs\\AzureFunction\\log_.log",
          "rollingInterval": "Day",
          "outputTemplate": "[{Timestamp:yyyy-MM-dd HH:mm:ss.fff zzz} {CorrelationId} {Level:u3}] {Username} {Message:lj}{NewLine}{Exception}"
        }
      }
    ]
  }
}
Configuration placed in local.settings.json will not be reflected in the deployed Azure Function App. As the name suggests, local.settings.json is for local use only; in Azure you need to use app settings and read from those.
Just add a new setting in the app settings, and you can then use the code below to read it.
var appsettings = Environment.GetEnvironmentVariable("Name of the setting");
When you want to use external configuration with Serilog, you can use the Serilog.Settings.Configuration package.
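A minimal sketch of what that can look like, assuming the Serilog section lives in a JSON file that gets loaded into an IConfiguration (the file name and base path here are illustrative):

using System;
using Microsoft.Extensions.Configuration;
using Serilog;

// Build an IConfiguration that includes the "Serilog" section,
// then let Serilog.Settings.Configuration read it.
var configuration = new ConfigurationBuilder()
    .SetBasePath(Environment.CurrentDirectory)
    .AddJsonFile("local.settings.json", optional: true, reloadOnChange: false)
    .AddEnvironmentVariables()
    .Build();

Log.Logger = new LoggerConfiguration()
    .ReadFrom.Configuration(configuration) // reads the "Serilog" section by default
    .CreateLogger();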
You can also configure the minimum level of log events, so that all events less important than the specified minimum level will not be logged.
Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Debug()
    .WriteTo.Console(restrictedToMinimumLevel: LogEventLevel.Information)
    .CreateLogger();
Here we specify two things: .MinimumLevel.Debug() sets the global minimum level, and the restrictedToMinimumLevel argument dictates the minimum level for that particular sink. A sink is just a destination where events get logged; sinks are configured using WriteTo. In the code above, the sink is the console; there are other sinks too.
The levels, in increasing order of importance, are Verbose, Debug, Information, Warning, Error, and Fatal.
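To the first question above, about suppressing the Microsoft system entries: a common approach (a sketch, not taken from the original answer) is a per-namespace override, which keeps your own Information entries while raising the bar for framework noise:

using Serilog;
using Serilog.Events;

// Keep our own Information-level entries, but only log framework
// namespaces when they reach Warning or above.
Log.Logger = new LoggerConfiguration()
    .MinimumLevel.Information()
    .MinimumLevel.Override("Microsoft", LogEventLevel.Warning)
    .MinimumLevel.Override("System", LogEventLevel.Warning)
    .WriteTo.File(@"D:\logs\AzureFunction\log_.txt", rollingInterval: RollingInterval.Day)
    .CreateLogger();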
References:
Read configuration from app settings, by Ashish Patel
Use Serilog to filter the logs
Serilog settings configuration

Azure Blob Trigger function app: running same instance for multiple blob uploads

I have created a blob-triggered function app in Python. My requirement is to run a separate instance for each blob upload (for parallel processing), but that is not happening, even though I have modified host.json as below, per this link:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob
{
  "version": "2.0",
  "extensions": {
    "blobs": {
      "maxDegreeOfParallelism": "4"
    }
  }
}
Still, the same instance is running and processing files one by one. Am I missing something here?
I'm afraid we can't implement this requirement. As far as I know, we can only set the function app to scale out to a maximum of n instances (4 in your case), but we can't scale out instances manually.
When you modify the configuration to allow the function app to scale out to multiple instances, it will only scale out automatically when lots of requests are coming in. If there are only 4 requests, only one instance will be started in most cases.
For your reference, here is another post I researched in the past with a problem similar to this case.

Output binding and generation of function.json

I'm trying to create an Azure Function that will output to a table. I'm using the Azure Function App and so, as I currently understand it, the function.json is generated for me by the SDK. My function definition is as follows:
public static HttpResponseMessage Run(
    [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequestMessage req,
    TraceWriter log,
    [StorageAccount("table_storage")] ICollector<TableItem> outputTable)
I've defined TableItem as a class that inherits from TableEntity. When I deploy this and look at the generated function.json, it doesn't mention the output parameter binding:
{
  "generatedBy": "Microsoft.NET.Sdk.Functions.Generator-1.0.7",
  "configurationSource": "attributes",
  "bindings": [
    {
      "type": "httpTrigger",
      "methods": [
        "post"
      ],
      "authLevel": "function",
      "name": "req"
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/FunctionApp5.dll",
  "entryPoint": "FunctionApp5.DeliveryComplete.Run"
}
If I run this from Visual Studio, I get the following error:
Cannot bind parameter 'outputTable' to type ICollector`1
I have a few questions about this behaviour. The first and main one is: why is function.json not showing the output binding? Secondly, I understand why this can't be edited when you deploy from VS, but is there a way to manage the bindings without guesswork? (I came across using ICollector in this post, but I can't find anywhere else that says it should or shouldn't be there.)
Finally, how does running this from the desktop interact with the published function (if at all): does it connect to the published version of the function, or does it generate the function.json locally?
That's a common source of confusion: input and output bindings are not visible in the generated function.json, only the trigger is. They will still work normally.
If you are trying to write to Table Storage, you should use the Table attribute instead of StorageAccount. ICollector is mentioned in Azure Table storage bindings for Azure Functions.
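As a hedged sketch of what that could look like (the table name here is a placeholder, and table_storage is assumed to be an app setting holding the connection string):

public static HttpResponseMessage Run(
    [HttpTrigger(AuthorizationLevel.Function, "post", Route = null)] HttpRequestMessage req,
    TraceWriter log,
    // Table attribute: the first argument is the table name; Connection names an app setting.
    [Table("TableItems", Connection = "table_storage")] ICollector<TableItem> outputTable)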
When running locally, the files stay local and run in the local runtime, without deployment to Azure. They might still interact with real Azure services via bindings.

Implementing Application Insights in an Azure Function App

Anyone know of a good guide for this?
First I created an Application Insights resource and put:
APPINSIGHTS_INSTRUMENTATIONKEY = "INSTRUMENTATION KEY"
in the Function App's application settings.
I have tried implementing the NuGet package for the function app like this,
creating a project.json file and pasting this:
{
  "frameworks": {
    "net46": {
      "dependencies": {
        "Microsoft.ApplicationInsights": "2.1.0"
      }
    }
  }
}
It installed the NuGet package (I could see it in the log; everything went well).
After that I put these snippets in my code to use the telemetry.TrackException(exception) functionality.
First...
using Microsoft.ApplicationInsights;
using Microsoft.ApplicationInsights.Extensibility;
Then:
var telemetry = new TelemetryClient(new TelemetryConfiguration("INSTRUMENTATION KEY"));
and in my catch:
telemetry.TrackException(e);
and when I try to save my Function App I get this error:
error CS1729: 'TelemetryConfiguration' does not contain a constructor that takes 1 arguments
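(As an aside: that single-argument constructor does not exist in that SDK version. In Application Insights SDKs of that era the key is typically set via a property instead; a hedged sketch:)

// Hedged sketch: set the instrumentation key via a property rather than
// a constructor argument (works in older Application Insights SDKs).
var telemetry = new TelemetryClient { InstrumentationKey = "INSTRUMENTATION KEY" };
telemetry.TrackException(e);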
You don't need to reference the Application Insights library to use it with Functions. If you've already set the APPINSIGHTS_INSTRUMENTATIONKEY application setting, you can simply add the ILogger interface as a parameter to your function and it will automatically send the log data to your Application Insights instance.
Full documentation can be found here: https://learn.microsoft.com/en-us/azure/azure-functions/functions-monitoring
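For illustration, a minimal sketch of that pattern (the function name and trigger here are hypothetical):

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class ProcessOrder
{
    [FunctionName("ProcessOrder")]
    public static void Run(
        [QueueTrigger("orders")] string message,
        ILogger log)
    {
        // With APPINSIGHTS_INSTRUMENTATIONKEY set, this flows to Application Insights.
        log.LogInformation("Processing queue message: {Message}", message);
    }
}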
Addition to @ChrisGillum's answer, for a .NET Core 3.1 Azure Function:
If you create a new Azure Function with an HTTP trigger from Visual Studio, the following line will exist in the example:
log.LogInformation("C# HTTP trigger function processed a request.");
Add "APPINSIGHTS_INSTRUMENTATIONKEY" to local.settings.json Values.
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "APPINSIGHTS_INSTRUMENTATIONKEY": "<YOUR_GUID>"
  }
}
Note that the key must be in an app setting named APPINSIGHTS_INSTRUMENTATIONKEY and nothing else.
Logging is then added automatically.
Complete guide for hosted values:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-monitoring?tabs=cmd#enable-application-insights-integration

How to bind to ICloudBlob or some other (not string) type

I've been trying to create an Azure Function triggered when I add an image to a container on my blob storage account.
The only thing that seems to work is when I have a string parameter, but the files are images, so I have no use for a string containing the image data.
So I've been trying each and every example I can find online (not that many), and now I've tried the samples from the Azure WebJobs SDK; this isn't working either. So either I'm stupid, which is how I feel right now, or I'm missing something obvious.
These are some of the errors I get:
Microsoft.Azure.WebJobs.Host: Error indexing method 'Functions.thumbnailgenerator'. Microsoft.Azure.WebJobs.Host: Can't bind BlobTrigger to type 'Microsoft.WindowsAzure.Storage.Blob.ICloudBlob'.
Microsoft.Azure.WebJobs.Host: Error indexing method 'Functions.thumbnailgenerator'. Microsoft.Azure.WebJobs.Host: Can't bind BlobTrigger to type 'Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob'.
Right now the function I'm trying out is the one given in the sample above, and like so many others I've tried, it's not working with anything but strings.
So how should I create the function (with C#) and the function.json file to make it work with a blob in, and preferably a string with the name of the blob? Either that, or a blob in and one out, where the out blob is in a different container and its name is prefixed with a hardcoded string.
This is what I got now, and it's not running:
function.json
{
  "bindings": [
    {
      "type": "blobTrigger",
      "name": "blob",
      "direction": "in",
      "path": "kitimages/{name}.{ext}"
    },
    {
      "type": "blob",
      "name": "output",
      "direction": "inout",
      "path": "thumbnails/{name}_300_200.{ext}"
    }
  ],
  "disabled": false
}
run.csx
#r "Microsoft.WindowsAzure.Storage"
using System;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.WindowsAzure.Storage.Blob;
public static void Run(CloudBlockBlob blob, CloudBlockBlob output, TraceWriter log)
{
log.Info($"C# Blob trigger function processed a blob. Blob={blob.Name}");
}
EDIT: Take a look here for the final solution to my question: Getting work done in the cloud
We need to improve the template here; this is a common pitfall you've run into (sorry about that!). We're fixing it, see the GitHub issue: Make it easier for users to get started with binary blob triggers.
There's a built-in template that binds to streams. Go to New Function and select C# for the language and Samples for the scenario.
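Roughly, the stream-based shape looks like this (a sketch assuming the container name from the question, written in attribute style; in the portal the same binding goes in function.json, and this is not the exact template code):

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;

public static void Run(
    [BlobTrigger("kitimages/{name}")] Stream blob,
    string name,
    TraceWriter log)
{
    // The blob content arrives as a Stream; {name} binds to the blob name.
    log.Info($"Processed blob: {name}, size: {blob.Length} bytes");
}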
For a more advanced sample that uses CloudBlockBlob bindings (which requires the InOut binding direction that is not yet documented), see the Functions sample in ContosoMoments: DeleteImages Function.
Note that you can browse all the templates in the GitHub repo: https://github.com/Azure/azure-webjobs-sdk-templates.
For anyone else stumbling upon this while seemingly having a correct setup as per above:
I got this message because I had a reference to WindowsAzure.Storage in my project.json file, perhaps because it was referring to an older version (8.1.1) of the library; I don't know. Removing it made my function work. Since it's a supported DLL, you should just import it using #r.
I found my solution here (the last reply by Baudine).
I had a project that referenced the WindowsAzure.Storage NuGet package directly, and a function project that referenced that project but also referenced WindowsAzure.Storage indirectly (through the Microsoft.Azure.WebJobs.Extensions.Storage NuGet package). After reading Baudine's answer, I saw that the versions were off (v9.3.3 vs v9.3.1).
So my fix was as Baudine suggested: I removed the WindowsAzure.Storage NuGet package from the project and added Microsoft.Azure.WebJobs.Extensions.Storage. My trigger looks like this:
public async Task Run([BlobTrigger("/files/{fileName}", Connection = "StorageConnectionString")]ICloudBlob blob, string fileName)