How to trigger a blob function with a user-assigned identity - Azure

I have created a blob-triggered Azure Function which uses a connection string at the moment. The connection string is configured as "AzureWebJobsStorage" in local.settings.json, and the function looks like this:
public static class BlobTrigger_Fun
{
    [FunctionName("BlobTrigger_Fun")]
    public static void Run([BlobTrigger("democontainerazure/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob, string name, ILogger log)
    {
        log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
    }
}
I want to use a managed identity to avoid the use of a connection string in the code.

No, you can't.
A managed identity (MSI) is not intended for this usage; it is only used for authenticating to Azure services that support Azure AD authentication. The AzureWebJobsStorage setting is used by the Azure Functions runtime, and in a function app this property must be specified as an app setting in the site configuration.
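For reference, a minimal sketch of what "specified as an app setting" looks like during local development; the account name and key below are placeholders:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=<account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet"
  }
}
In Azure, the same value goes into the function app's Application settings instead.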

Related

Azure Function with .NET Core 3.1 not triggering from Queue storage

I am trying to trigger an Azure Function when a new queue message is added. Both the storage account and the Azure Function are in the same region.
For my Azure Function, I clicked Add, chose Azure Queue Storage Trigger, gave my function a name, and set the Queue name to the same name as my queue. I tried adding a new queue message, but nothing is triggered.
I then tried modifying the code as follows:
using System;

[FunctionName("QueueTrigger")]
[StorageAccount("storagetestaccount1")]
public static void Run(
    [QueueTrigger("queue1")] string myQueueItem,
    ILogger log)
{
    log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
}
But still no success. Any idea what might be causing this?
This is my first Azure Function, so I'm not sure what's correct and what's not.
I think the correct code is this:
public static class Function1
{
    [FunctionName("Function1")]
    public static void Run([QueueTrigger("queueName", Connection = "connectString")] string myQueueItem, ILogger log)
    {
        log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
    }
}
Note
If you develop locally, you should configure your Azure Storage connection string in local.settings.json.
If you develop in the Azure portal, you need to configure the connection string in the function app's Application settings.
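For example, a minimal local.settings.json sketch that defines the connectString setting referenced by the QueueTrigger attribute above (the connection string value is a placeholder):
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "connectString": "DefaultEndpointsProtocol=https;AccountName=<account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net"
  }
}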

Azure Storage Blob Trigger is not awakening a sleeping function

This question is similar to Azure Blob Storage trigger Function not firing.
However, their problem was that their Azure Function wasn't waking immediately, giving the impression it wasn't processing triggers from Azure Blob Storage when in fact it was, after 10 minutes, which is exactly what the MS docs claim:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob-trigger?tabs=csharp
My problem is different. My blob has been sitting in the container for 9 hours now and it still hasn't been processed.
All the function does is post a message onto Service Bus.
[FunctionName("IncomingFileDetected")]
[return: ServiceBus("incoming-file-received", EntityType = Microsoft.Azure.WebJobs.ServiceBus.EntityType.Topic)]
public static IncomingFile Run(
[BlobTrigger("incoming-files/{filename}", Connection = "ConnectionStrings:MutableStorage")]
Stream contents,
string filename,
ILogger log)
{
log.LogInformation($"Detected new blob file: {filename}");
return new IncomingFile(filename);
}
No messages have appeared in the service bus.
Now, after 9 hours, I have restarted the function app and the blob was processed within about 10 minutes.
Update:
Thanks to Peter Morris for sharing: the problem came from the service plan being D1. So first make sure you are on one of the three kinds of plans: Consumption plan, Premium plan, or App Service plan. When we use Azure Functions, even just for testing, we should use a Consumption plan. The smallest production tier is S1, which is normally used for testing.
Original Answer:
The code below works fine on my side, even on the Consumption plan.
using System;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

namespace FunctionApp35
{
    public static class Function1
    {
        [FunctionName("Function1")]
        [return: ServiceBus("test", Connection = "ServiceBusConnection")]
        public static string Run([BlobTrigger("samples-workitems/{name}", Connection = "str")] Stream myBlob, string name, ILogger log)
        {
            log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
            string a = "111111111111111";
            return a;
        }
    }
}
These are my local settings (local.settings.json):
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=0730bowmanwindow;AccountKey=lti/ThmF+mw9BebOacp9gVazIh76Q39ecikHSCkaTcGK5hmInspX+EkjzpNmvCPWsnvapWziHQHL+kKt2V+lZw==;EndpointSuffix=core.windows.net",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "str": "DefaultEndpointsProtocol=xxxxxx",
    "ServiceBusConnection": "Endpoint=sb://bowmantestxxxxxx"
  }
}
The str value is the connection string of the storage account (copied from the storage account's Access keys blade). The ServiceBusConnection value is the connection string of the Service Bus namespace (copied from its Shared access policies blade).
Please note that the blob will not be removed from the container after the Azure Function is triggered. Also, don't forget to create at least one subscription in your Service Bus topic.
All of the above also works fine after the function is deployed to Azure. (The difference from local development is that the settings need to be added to the function app's Application settings instead of local.settings.json.)
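Regarding the note above about needing at least one subscription on the topic: if you prefer to create it from code rather than in the portal, here is a minimal sketch using the Azure.Messaging.ServiceBus.Administration client. The subscription name "all-messages" is just a placeholder; the topic name matches the "test" topic from the sample above.
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus.Administration;

public static class CreateSubscriptionSample
{
    public static async Task EnsureSubscriptionAsync(string serviceBusConnection)
    {
        var admin = new ServiceBusAdministrationClient(serviceBusConnection);
        // Create a subscription on the "test" topic if it does not exist yet.
        var exists = await admin.SubscriptionExistsAsync("test", "all-messages");
        if (!exists.Value)
        {
            await admin.CreateSubscriptionAsync("test", "all-messages");
        }
    }
}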

Free Text search in Azure Function App - Application Insights

For example, I have this string "CTASK0220892" and I want to search for it in Application Insights for an Azure Function App.
What would be the query to search for this string?
It depends on which method you're using to send the string "CTASK0220892" to Application Insights.
As an example, if you're using the ILogger.LogInformation method, like below:
[FunctionName("Function1")]
public static void Run([BlobTrigger("samples-workitems/{name}", Connection = "AzureWebJobsStorage")]Stream myBlob, string name, ILogger log)
{
//use ILogger.LogInformation method to send the string to application insights.
log.LogInformation("CTASK0220892");
log.LogInformation($"C# Blob trigger function Processed blob\n Name:{name} \n Size: {myBlob.Length} Bytes");
}
Then in Application Insights you can find this string in the traces table. Navigate to the Azure portal -> the Application Insights resource connected to your Azure Function -> Logs, then use the query below:
traces
| where message contains "CTASK0220892"
Note that there are many operators besides contains, such as ==, !=, startswith, etc. Use the operator that fits your need.
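For instance, a variant of the same query using different operators (the timestamp filter and projected columns here are just illustrative additions, not part of the original query):
traces
| where message startswith "CTASK"
| where timestamp > ago(24h)
| project timestamp, message, operation_Name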

Where to put my Azure storage account queue connection string (used by my Azure function)

I have an Azure storage account with a queue in it.
Now in my function (created in Visual Studio) I have this:
public IActionResult Run(
    [HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]
    HttpRequest req,
    [Queue("temperatuurmeting"), StorageAccount("AzureWebJobsStorage")]
    ICollector<string> messages,
    ILogger log)
{
I host the code in Bitbucket, and when I commit something it gets deployed to Azure, and it works.
But... the AzureWebJobsStorage connection string (automatically created when I created my function in Visual Studio) is in local.settings.json, which I just read should not be in the code repository.
But when I do not include local.settings.json in the code repo and then deploy to Azure, where should I put the AzureWebJobsStorage connection string so that my running function can find it?
Or is AzureWebJobsStorage a default name already created somewhere in Azure by something other than me, so I can just remove local.settings.json?
Locally, in local.settings.json:
"Connection": "Endpoint=sb://****.servicebus.windows.net/;SharedAccessKeyName=****;SharedAccessKey=*********"
and in the function:
public static void Run([ServiceBusTrigger("attended", Connection = "Connection")] string myQueueItem, ILogger log)
When you publish, go to the portal, open your function app's settings (Configuration > Application settings), and add the same variable you used locally.
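For completeness, a minimal sketch of where that fragment sits inside local.settings.json (all values are placeholders); when you publish, the same key/value pair goes into the function app's application settings:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "<storage-connection-string>",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "Connection": "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<key-name>;SharedAccessKey=<key>"
  }
}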

Generic Azure Functions - Dynamic Connections

I want to create and host an Azure Function that takes "azure-storage-account-name" and "path" as input, runs some common logic, and then returns a list of processed blobs in that storage account at that path. I have 20 storage accounts, and I was thinking of writing a single Azure Function in the same subscription to have listing capability across all of them.
I went through the Azure Functions documentation but couldn't figure out whether this is even possible in the current offering. Any pointers would be helpful.
You can use the imperative bindings feature of Azure Functions. Here is sample code:
using System;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.WindowsAzure.Storage.Blob;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, Binder binder, TraceWriter log)
{
    // Build the binding attributes at runtime, so the target account and path
    // can come from the request instead of being fixed at compile time.
    var attributes = new Attribute[]
    {
        new StorageAccountAttribute("your account"),   // name of an app setting holding the storage connection string
        new BlobAttribute("your folder name")          // blob path in the form "container/folder"
    };
    var directory = await binder.BindAsync<CloudBlobDirectory>(attributes);
    log.Info(directory.ListBlobs().Count().ToString());
    return new HttpResponseMessage(HttpStatusCode.OK);
}
Alternatively, if you have the correct credentials, you can use the Azure Storage REST API to get a list of containers.
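As a rough sketch of that REST approach (assuming you already have an account-level SAS token with list permission; the helper name here is made up for illustration):
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class ListContainersSketch
{
    public static async Task<string> ListContainersAsync(string accountName, string sasToken)
    {
        using var client = new HttpClient();
        // "comp=list" is the Blob service "List Containers" operation; it returns an XML document.
        var url = $"https://{accountName}.blob.core.windows.net/?comp=list&{sasToken.TrimStart('?')}";
        var response = await client.GetAsync(url);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync(); // raw XML container listing
    }
}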
