Azure Function App Service Bus subscription trigger not consuming messages

I implemented an Azure Function App with a Service Bus subscription trigger. It works well on my laptop when debugging it from Visual Studio, being triggered every time a message is pushed to the Service Bus topic. However, after deploying it to Azure, it is not triggered when a message is published to the Service Bus topic.

After some debugging and research, I found that it was working locally because it was using an emulated storage account; in the cloud, however, it needs a real storage account. In my case, the problem was that the configuration settings were missing the storage account details. This needs to be either a connection string setting (if you are using access keys or SAS tokens) or, as in my case, the two following entries, since I use managed identities instead (why it needed both representations is still unclear to me, though the colon form is the .NET hierarchical configuration key while the double underscore is its environment-variable equivalent, which is required on Linux):
{
    "name": "AzureWebJobsStorage:accountName",
    "value": "yourstorageaccountname",
    "slotSetting": false
},
{
    "name": "AzureWebJobsStorage__accountName",
    "value": "yourstorageaccountname",
    "slotSetting": false
}
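If the Function App runs under a user-assigned managed identity rather than a system-assigned one, additional settings are reportedly needed to tell the host which identity to use. A sketch of what those entries might look like, following the identity-based-connection setting convention (the client ID value is a placeholder):

```json
[
    {
        "name": "AzureWebJobsStorage__credential",
        "value": "managedidentity",
        "slotSetting": false
    },
    {
        "name": "AzureWebJobsStorage__clientId",
        "value": "<user-assigned-identity-client-id>",
        "slotSetting": false
    }
]
```

The identity also needs a data-plane role on the storage account (e.g. Storage Blob Data Owner) for the Functions host to start.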

Related

Azure Bot transcripts not being saved

I have a bot developed in Bot Framework Composer and have implemented blob transcript storage. Transcript storage works when I run the bot locally, but once I publish the bot to Azure, no transcripts are saved.
I presume there is some error in the Azure bot accessing the blob storage, but I don't see any errors generated in Azure. The blob storage does not show any access attempts, indicating to me that the request never reaches blob storage.
I updated CORS on the blob storage to allow all origins and methods, but this had no effect.
Any suggestions on what to look for or what to try next?
The issue was that there are two steps to adding transcripts to an existing bot.
In Composer, settings:
Add the blob storage settings in the runtimeSettings > components > features section:
"blobTranscript": {
"connectionString": "DefaultEndpointsProtocol=https;AccountName=bottranscripts;AccountKey=<your key here>;EndpointSuffix=core.windows.net",
"containerName": "transcripts"
}
At this point, running the bot locally should store transcripts in blob storage in Azure.
Again, in Composer, check the publish settings for publishing to Azure. There should be a setting:
"blobStorage": {
"connectionString": "<ConnectionString>",
"container": "transcripts",
"name": "<myBotName>"
}
Make sure that the connection string matches what you entered in the runtimeSettings section. The bot in Azure will use the publish settings, not the runtimeSettings for transcripts.

How can I log subscriptions auto-delete events on the Azure service bus diagnostics logs?

We have an ASP.NET Core 2.2 application using Azure Service Bus topics and subscriptions.
We have configured the auto-delete-on-idle feature on the subscriptions: after an idle time of 10 minutes, the subscriptions are automatically deleted by the Azure infrastructure.
We have enabled diagnostics logs for our Service Bus namespace in the Azure portal as explained here. We have verified that we are actually collecting logs and are able to see them in the Azure portal. So far, so good.
Our problem is that we are not able to find, among the collected logs, the auto-delete of the subscriptions after the configured idle time. Based on this documentation, we would expect to see a log entry having an EventName_s with the value AutoDelete Delete Subscription, but we don't. We are sure that during our test at least one subscription was deleted by the configured auto-delete-on-idle rule.
Are the subscription auto-delete events actually logged, as stated in the docs?
Is there any configuration required to actually see the subscription auto-delete events in the Service Bus diagnostics logs?
@Enrico Massone The information in the document is correct: when you enable diagnostic logs, you should see auto-delete event logs. I performed a test on my end and was able to see the event below for the configured diagnostic setting, streaming the logs to a Log Analytics workspace and a storage account.
{ "Environment": "PROD", "Region": "South India", "ScaleUnit": "XXXX", "ActivityId": "249d60c9-780d-4fce-a3dc-69688b576d65", "EventName": "AutoDelete Delete Subscription", "resourceId": "/SUBSCRIPTIONS/XXXXX/RESOURCEGROUPS/XXXX/PROVIDERS/MICROSOFT.SERVICEBUS/NAMESPACES/XXX", "SubscriptionId": "XXXX", "EventTimeString": "11/25/2020 8:01:01 AM +00:00", "EventProperties": "{"SubscriptionId":"XXXXX","Namespace":"XXXX","Via":"AutoDelete","TrackingId":"249d60c9-780d-4fce-a3dc-69688b576d65_B4"}", "Status": "Succeeded", "Caller": "AutoDelete", "category": "OperationalLogs"}
Auto-delete has a minimum idle value of 5 minutes and will trigger only when all of the below conditions are fulfilled:
No receives
No updates to the subscription
No new rules added to the subscription
No Browse/Peek
If you want, I can help you verify from the backend whether auto-delete on idle was triggered or not.
You can also perform another test to see if you observe the same behavior.
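For reference, auto-delete on idle is a property of the subscription itself. In an ARM template it might be configured along these lines (a sketch; the apiVersion and resource names are placeholders to adapt):

```json
{
    "type": "Microsoft.ServiceBus/namespaces/topics/subscriptions",
    "apiVersion": "2017-04-01",
    "name": "<namespace>/<topic>/<subscription>",
    "properties": {
        "autoDeleteOnIdle": "PT10M"
    }
}
```

PT10M is an ISO 8601 duration (10 minutes), matching the idle time described in the question.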

Securing an Azure Function

I'm trying to apply the least-privilege principle to an Azure Function. What I want is to make a Function App have only read access to, for example, a storage queue. What I've tried so far is:
Enable managed identity in the FunctionApp
Create a role that only allows read access to the queues (role definition below)
Go to the storage queue IAM permissions, and add a new role assignment, using the new role and the Function App.
But it didn't work. If I try to write to that queue from my function (using an output binding), the item is written, when I expected a failure. I've tried using the built-in role "Storage Queue Data Reader (Preview)" with the same result.
What's the right way to add/remove permissions of a Function App?
Role definition:
{
    "Name": "Reader WorkingSA TestQueue Queue",
    "IsCustom": true,
    "Description": "Read TestQueue queue on WorkingSA storage account.",
    "actions": ["Microsoft.Storage/storageAccounts/queueServices/queues/read"],
    "dataActions": [
        "Microsoft.Storage/storageAccounts/queueServices/queues/messages/read"
    ],
    "notActions": [],
    "notDataActions": [],
    "AssignableScopes": [
        "/subscriptions/XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX/resourceGroups/TestAuth-dev-rg"
    ]
}
@anirudhgarg has pointed in the right direction.
The managed identity and RBAC you set make a difference only when the Function App uses a managed identity access token to reach the Storage service. Those settings have no effect on function bindings, as a binding internally connects to Storage using a connection string. If you haven't set the connection property on the output binding, it uses the AzureWebJobsStorage app setting by default.
To be more specific, a connection string has nothing to do with the Azure Active Directory authentication process, so it can't be influenced by AAD configuration. Hence, if a function takes advantage of a storage account connection string (e.g. uses a Storage-related binding), we can't limit its access with those settings. Likewise, no connection string usage means no access.
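To point a binding at a different, more limited connection, the binding's connection property can name a custom app setting instead of relying on the AzureWebJobsStorage default. A sketch of a queue output binding in function.json (the setting name MyLimitedStorageConnection and the queue name are hypothetical):

```json
{
    "type": "queue",
    "name": "outputQueueItem",
    "direction": "out",
    "queueName": "testqueue",
    "connection": "MyLimitedStorageConnection"
}
```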
Update: using a SAS token
If the queue mentioned is used in a queue trigger or input binding, we can restrict the function to read-and-process (get message, then delete) access; this is where a SAS token comes in.
Prerequisites:
The queue is located in a storage account other than the one specified by the AzureWebJobsStorage app setting. AzureWebJobsStorage requires a connection string offering full access with the account key.
The Function App runs on 2.0. Check it under Function app settings > Runtime version: 2.xx (~2). In 1.x, more permissions are required, as with AzureWebJobsStorage.
Then generate a SAS token in the portal and put it in the app settings.
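A sketch of what the resulting app setting might look like, using a SAS-based connection string scoped to the queue service (the setting name and all SAS parameters are placeholders; ss=q targets the queue service and sp=rp grants read and process permissions):

```json
{
    "name": "MyLimitedStorageConnection",
    "value": "QueueEndpoint=https://<account>.queue.core.windows.net;SharedAccessSignature=sv=2018-03-28&ss=q&srt=o&sp=rp&sig=<signature>",
    "slotSetting": false
}
```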

Azure function ApiHubFileTrigger doesn't execute with consumption plan when not monitoring in portal

I'm using the apiHubFileTrigger with the OneDrive for Business connector (onedriveforbusiness_ONEDRIVEFORBUSINESS).
{
    "bindings": [
        {
            "type": "apiHubFileTrigger",
            "name": "myFile",
            "direction": "in",
            "path": "/InputFile/{file}",
            "connection": "onedriveforbusiness_ONEDRIVEFORBUSINESS"
        }
    ],
    "disabled": false
}
It works well while I am monitoring the script inside the Azure portal. But as soon as I close the editor and wait for some time, the function is no longer triggered for new files copied to OneDrive for Business. There is no error or anything in the invocation logs (no invocations at all).
The function is written in C#. It uses the input file and performs some operations based on the file. Since it works when I am inside the portal monitoring it, the issue is not related to the code.
I'm running on the Consumption plan, so the problem has nothing to do with "Always On".
To summarize the discussion in the comments, there is an outstanding bug with Consumption Plan functions where triggers sometimes get out of sync and don't fire unless the portal is open if they are published in certain ways (details here: https://github.com/Azure/Azure-Functions/issues/210)
In those cases, hitting the "refresh" button next to the function app in the portal will sync the trigger.
In this case, there was an issue with storage account connection string parsing. The workaround is to switch to an App Service Plan function (making sure that "Always On" is "On").

How to write azure function that can get triggered when an item appears in azure service bus queue using C#?

The new Azure Functions preview contains a few templates for C#, but there is no Service Bus queue template for C#. There is a trigger template for Node with Service Bus, but on close inspection it only supports Notification Hubs, not Service Bus queues. Is it even possible to write an Azure Function that is triggered only when an item appears in an Azure Service Bus queue? If it is not possible now, will there be such a template in the near future?
Update: The below steps and information still hold, however we now have a "ServiceBusQueueTrigger - C#" template live in the portal, so the workaround steps are no longer necessary :)
ServiceBus IS supported already for C#, we just need to add a template for it (we'll add very soon). In general, templates are just starting points - you can always modify templates by adding additional bindings, or start from the empty template and build up your own Function.
Until we get the template uploaded, you can get this working yourself by starting from the C# Empty template. For example, you can enter binding info like the following in the Advanced Editor on the Integrate tab:
{
    "bindings": [
        {
            "type": "serviceBusTrigger",
            "name": "message",
            "direction": "in",
            "queueName": "samples-input",
            "connection": "myServiceBus"
        }
    ]
}
Make sure your Function App has an app setting matching the name of the connection property, containing your Service Bus connection string. It looks like we currently have some issues with the connection string picker for Service Bus (which will also be fixed very soon), but you can use "Function app settings" > "Go to App Service Settings" > "Application Settings" to add this app setting. Then you can use the corresponding function code:
using System;
using Microsoft.Azure.WebJobs.Host;

public static void Run(string message, TraceWriter log)
{
    log.Verbose($"C# ServiceBus Queue function processed message: {message}");
}
This function will then be invoked whenever new messages are added to the Service Bus queue samples-input.
Per https://azure.microsoft.com/en-us/documentation/articles/functions-reference/, there is no binding for Service Bus.
Rather than building something that may already be in progress with the product group, the best approach is to submit your idea on UserVoice - https://feedback.azure.com/forums/355860-azure-functions.
