Azure Bot transcripts not being saved

I have a bot developed in Bot Framework Composer and have implemented Blob transcript storage. Transcript storage works when I run the bot locally, but once I publish the bot to Azure, no transcripts are saved.
I presume there is some error in the Azure bot accessing the blob storage, but I don't see any errors generated in Azure. The blob storage does not show any access attempts, which indicates to me that the request never reaches blob storage.
I updated CORS on the blob storage to allow all origins and methods, but this had no effect.
Any suggestions on what to look for or what to try next?

The issue was that there are two steps to adding transcripts to an existing bot.
In Composer settings, add the blob storage settings in the runtimeSettings > components > features section:
"blobTranscript": {
"connectionString": "DefaultEndpointsProtocol=https;AccountName=bottranscripts;AccountKey=<your key here>;EndpointSuffix=core.windows.net",
"containerName": "transcripts"
}
At this point, running the bot locally should store transcripts in blob storage in Azure.
Next, still in Composer, check the publish settings used for publishing to Azure. There should be a setting:
"blobStorage": {
"connectionString": "<ConnectionString>",
"container": "transcripts",
"name": "<myBotName>"
}
Make sure that the connection string matches what you entered in the runtimeSettings section; the bot in Azure uses the publish settings, not the runtimeSettings, for transcripts.
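For reference, here is a minimal sketch of what that setting enables at runtime, assuming the Microsoft.Bot.Builder.Azure.Blobs package (this is not the Composer runtime's exact code, and the helper name and parameters are only illustrative):

using Microsoft.Bot.Builder;
using Microsoft.Bot.Builder.Azure.Blobs;

public static class TranscriptSetup
{
    // Attach blob transcript logging to a Bot Framework adapter.
    // connectionString and containerName correspond to the blobTranscript settings above.
    public static void AddBlobTranscripts(BotAdapter adapter, string connectionString, string containerName)
    {
        // BlobsTranscriptStore persists conversation transcripts to the given container.
        var store = new BlobsTranscriptStore(connectionString, containerName);

        // TranscriptLoggerMiddleware records every incoming and outgoing activity to the store.
        adapter.Use(new TranscriptLoggerMiddleware(store));
    }
}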

Related

Azure Function App Service Bus subscription trigger not consuming messages

I implemented an Azure Function App with a Service Bus subscription trigger. It works well on my laptop when debugging it from Visual Studio, getting triggered every time a message is pushed to the Service Bus topic. However, after deploying it to Azure, it is not triggered when a message is published to the topic.
After some debugging and research, I found that it worked locally because it was using an emulated storage account; in the cloud, however, it needs a real storage account. In my case, the problem was that the configuration settings were missing the details for the storage account. It needs either a connection string setting (if you are using SAS tokens) or, as in my case, the two following entries, since I use managed identities instead (why it needed both representations is still unclear to me):
{
  "name": "AzureWebJobsStorage:accountName",
  "value": "yourstorageaccountname",
  "slotSetting": false
},
{
  "name": "AzureWebJobsStorage__accountName",
  "value": "yourstorageaccountname",
  "slotSetting": false
}
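For completeness, a minimal sketch of a Service Bus subscription-triggered function using the in-process model; the topic, subscription, and connection setting names (mytopic, mysubscription, ServiceBusConnection) are placeholders, not taken from the question:

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class TopicProcessor
{
    // Fires for each message delivered to the subscription. In Azure, the trigger
    // will not run until the host's AzureWebJobsStorage configuration (a connection
    // string or the accountName entries above) is in place.
    [FunctionName("TopicProcessor")]
    public static void Run(
        [ServiceBusTrigger("mytopic", "mysubscription", Connection = "ServiceBusConnection")] string message,
        ILogger log)
    {
        log.LogInformation("Received message: {Message}", message);
    }
}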

How do I access an Azure Storage container from Azure Cognitive Services

I am having an issue with transcribing (Speech-To-Text) an audio file hosted in an Azure Storage container from the Cognitive Services API.
The services are under the same resource (I created a VNet and they are part of the same subnet).
I then take the contentUrl from the response.
The error I get is:
{
  "successfulTranscriptionsCount": 0,
  "failedTranscriptionsCount": 1,
  "details": [
    {
      "source": "https://{service-name}.blob.core.windows.net/meetingnotes/Meeting82035.wav",
      "status": "Failed",
      "errorMessage": "Error when downloading the recording URI. StatusCode: Conflict.",
      "errorKind": "DownloadRecordingsUrisUnknownError"
    }
  ]
}
I tested in my environment and was getting the same error as you.
To resolve the issue, you need to append a SAS token to the blob URL in the contentUrls field.
When generating the SAS token, I allowed all the permissions. With the SAS appended, the transcription report was generated and the contentUrl opened successfully.
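As a rough sketch, a read-only SAS URL for the blob in the error above could be generated with the Azure.Storage.Blobs SDK and then placed in contentUrls (the container and blob names are taken from the error message; the helper itself is only illustrative):

using System;
using Azure.Storage.Blobs;
using Azure.Storage.Sas;

public static class TranscriptionSource
{
    // Builds a SAS-protected URL to put in the batch transcription "contentUrls" field.
    // The BlobClient must be created from a connection string (shared key) for
    // GenerateSasUri to work.
    public static Uri GetContentUrl(string storageConnectionString)
    {
        var blob = new BlobClient(storageConnectionString, "meetingnotes", "Meeting82035.wav");

        // Read permission for a limited window is enough for the Speech service to download the file.
        return blob.GenerateSasUri(BlobSasPermissions.Read, DateTimeOffset.UtcNow.AddHours(2));
    }
}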
I contacted Azure support and they provided the correct solution, which is to assign the role “Storage Blob Data Contributor” to the Speech service resource:
Go to IAM of your storage account.
Go to Role Assignments.
Click "Add", then add your Speech service under Managed Identities.
That should fix it.

Azure Logic App with Azure Blob Storage Action: Getting 429 statusCode error

I am using an Azure Logic App with an Azure Blob Storage trigger.
When a blob is created or modified in Azure Storage, I pull the content of that blob from Storage, do some transformations on the data, and push it back to Azure Storage as new blob content using the Create Content - Azure Blob Storage action of the Logic App.
With a large number of blobs inserted (for example 10,000 files) or updated in blob storage, the Logic App gets triggered with multiple runs as expected for these blobs, but the subsequent Azure Blob actions fail with the following error:
{
  "statusCode": 429,
  "message": "Rate limit is exceeded. Try again in 16 seconds."
}
Has anyone faced a similar issue in Logic Apps? If so, can you suggest what the possible reason and probable fix could be?
Thanks
It seems like you are hitting the rate limits on the Azure Blob managed API.
Please refer to Jörgen Bergström's blog about this: http://techstuff.bergstrom.nu/429-rate-limit-exceeded-in-logic-apps/
Essentially, he says you can set up multiple API connections that do the same thing and then randomize the connection in the Logic App code view so that each run randomly uses one of them, which eliminates the rate-limit issue.
As an example (I was using SQL connectors), see below the API connections I set up for my logic app. You can do the same with a blob storage connection and use a similar naming convention, e.g. blob_1, blob_2, blob_3, and so on. You can create as many as you like; I created 10 for mine:
You would then, in your logic app code view, replace all your current blob connection references, e.g.
@parameters('$connections')['blob']['connectionId']
where "blob" is your current blob API connection, with the following (rand is inclusive only at the starting end, so rand(1,11) picks 1 through 10):
@parameters('$connections')[concat('blob_',rand(1,11))]['connectionId']
And then make sure to add all your "blob_" connections at the end of your code:
"blob_1": {
"connectionId": "/subscriptions/.../resourceGroups/.../providers/Microsoft.Web/connections/blob-1",
"connectionName": "blob-1",
"id": "/subscriptions/.../providers/Microsoft.Web/locations/.../managedApis/blob"
},
"blob_2": {
"connectionId": "/subscriptions/.../resourceGroups/.../providers/Microsoft.Web/connections/blob-2",
"connectionName": "blob-2",
"id": "/subscriptions/.../providers/Microsoft.Web/locations/.../managedApis/blob"
},
...
The Logic App would then randomize which connection to use during each run, eliminating the 429 rate-limit error.
Please check this doc: https://learn.microsoft.com/en-us/azure/azure-resource-manager/resource-manager-request-limits
For each Azure subscription and tenant, Resource Manager allows up to 12,000 read requests per hour and 1,200 write requests per hour.
You can check the usage by:
response.Headers.GetValues("x-ms-ratelimit-remaining-subscription-reads").GetValue(0)
or
response.Headers.GetValues("x-ms-ratelimit-remaining-subscription-writes").GetValue(0)
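For example, here is a small sketch of reading those headers from an HttpResponseMessage returned by a management-plane (ARM) call; the helper name is only illustrative:

using System;
using System.Net.Http;

public static class RateLimitCheck
{
    // Prints the remaining ARM read/write quota reported on a management-plane response.
    public static void PrintRemainingQuota(HttpResponseMessage response)
    {
        if (response.Headers.TryGetValues("x-ms-ratelimit-remaining-subscription-reads", out var reads))
        {
            Console.WriteLine($"Remaining subscription reads: {string.Join(",", reads)}");
        }

        if (response.Headers.TryGetValues("x-ms-ratelimit-remaining-subscription-writes", out var writes))
        {
            Console.WriteLine($"Remaining subscription writes: {string.Join(",", writes)}");
        }
    }
}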

How to use an Azure blob-only storage account with Azure Functions - Trying to create blob snapshot

I'm trying to set up a function to take a snapshot of a blob container every time a change is pushed to it. There is some pretty simple functionality in Azure Functions to do this, but it only works for general purpose storage accounts. I'm trying to do this with a blob only storage account. I'm very new to Azure so I may be approaching this all wrong, but I haven't been able to find much helpful information. Is there any way to do this?
As @joy-wang mentioned, the Azure Functions runtime requires a general purpose storage account.
A general purpose storage account is required to configure the AzureWebJobsStorage and the AzureWebJobsDashboard settings (local.settings.json or the Application settings blade in the Azure portal):
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "my general purpose storage account connection string",
    "AzureWebJobsDashboard": "my general purpose storage account connection string",
    "MyOtherStorageAccountConnectionstring": "my blob only storage connection string"
  }
}
If you want to create a BlobTrigger function, you can specify another connection string and create a snapshot every time a blob is created or updated:
[FunctionName("Function1")]
public static async Task Run(
    [BlobTrigger("test-container/{name}", Connection = "MyOtherStorageAccountConnectionstring")] CloudBlockBlob myBlob,
    string name, TraceWriter log)
{
    log.Info($"C# Blob trigger function Processed blob\n Name:{name}");
    await myBlob.CreateSnapshotAsync();
}
In Visual Studio:
I tried to create a snapshot for a blob-only storage account named joyblobstorage, but it failed. I suppose you get the same error as in the screenshot.
The error information says: Microsoft.Azure.WebJobs.Host: Storage account 'joyblobstorage' is of unsupported type 'Blob-Only/ZRS'. Supported types are 'General Purpose'.
In the portal:
I tried to create a Function App and use the existing storage, but it could not find my blob-only storage account. The Azure Function setup in the portal does not allow you to select a blob-only storage account. Please refer to the screenshot.
Conclusion:
It is not possible to use a blob-only storage account as the Function App's main (AzureWebJobsStorage) account. In the official documentation, you can see the storage account requirements.
When creating a function app in App Service, you must create or link to a general-purpose Azure Storage account that supports Blob, Queue, and Table storage.
Also, in the App settings reference, you could see
AzureWebJobsStorage
The Azure Functions runtime uses this storage account connection string for all functions except for HTTP triggered functions. The storage account must be a general-purpose one that supports blobs, queues, and tables.
AzureWebJobsDashboard
Optional storage account connection string for storing logs and displaying them in the Monitor tab in the portal. The storage account must be a general-purpose one that supports blobs, queues, and tables.
Here is the feedback item where the Azure App Service team has explained the requirements on the storage account; you can refer to it.

Issue while copying existing blob to Azure Media Services

We are trying to copy an existing blob to AMS but it is not getting copied. The blob resides in storage account 1 and AMS is associated with storage account 2. All the accounts, including AMS, are in the same location.
await destinationBlob.StartCopyAsync(new Uri(sourceBlob.Uri.AbsoluteUri + signature));
When visualizing the AMS storage account using a blob storage explorer, asset folders are getting created but with no blobs in them. Also, within the Media explorer, we can see the assets listed in AMS, but when one is clicked, a not-found exception is thrown. Basically, they are not getting fully copied into AMS.
However, when we use the same code and attach a new AMS to the blob storage account (storage account 1) where the actual blob resides, the copy works fine.
I have not reproduced your issue, but there is a code sample for copying an existing blob to Azure Media Services via the .NET SDK. Please try to copy the blob using StartCopyFromBlob or StartCopyFromBlobAsync (Azure Storage client library 4.3.0). Below is the code snippet from that sample:
destinationBlob.StartCopyFromBlob(new Uri(sourceBlob.Uri.AbsoluteUri + signature));

while (true)
{
    // The StartCopyFromBlob is an async operation,
    // so we want to check if the copy operation is completed before proceeding.
    // To do that, we call FetchAttributes on the blob and check the CopyStatus.
    destinationBlob.FetchAttributes();
    if (destinationBlob.CopyState.Status != CopyStatus.Pending)
    {
        break;
    }
    // It's still not completed, so wait for some time.
    System.Threading.Thread.Sleep(1000);
}
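If you prefer to stay async, as in the StartCopyAsync call from the question, here is a rough sketch using a later version of the same classic Microsoft.WindowsAzure.Storage client library (the helper name is mine):

using System;
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;

public static class BlobCopyHelper
{
    // Starts a server-side copy and polls until it leaves the Pending state.
    // sourceUri should already include the SAS signature, as in the snippet above.
    public static async Task CopyAndWaitAsync(CloudBlockBlob destinationBlob, Uri sourceUri)
    {
        await destinationBlob.StartCopyAsync(sourceUri);

        // The copy runs on the service side; poll CopyState until it completes.
        do
        {
            await Task.Delay(TimeSpan.FromSeconds(1));
            await destinationBlob.FetchAttributesAsync();
        }
        while (destinationBlob.CopyState.Status == CopyStatus.Pending);

        if (destinationBlob.CopyState.Status != CopyStatus.Success)
        {
            throw new InvalidOperationException(
                $"Copy finished with status {destinationBlob.CopyState.Status}.");
        }
    }
}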
