How do I access an Azure Storage container from Azure Cognitive Services?

I am having an issue transcribing (Speech-to-Text) an audio file hosted in an Azure Storage container via the Cognitive Services API.
The services belong to the same resource (and I created a VNet; they are part of the same subnet).
When I fetch the transcription report from the contentUrl in the response, the error I get is:
{
  "successfulTranscriptionsCount": 0,
  "failedTranscriptionsCount": 1,
  "details": [
    {
      "source": "https://{service-name}.blob.core.windows.net/meetingnotes/Meeting82035.wav",
      "status": "Failed",
      "errorMessage": "Error when downloading the recording URI. StatusCode: Conflict.",
      "errorKind": "DownloadRecordingsUrisUnknownError"
    }
  ]
}

I tested in my environment and got the same error.
To resolve the issue, you need to append a SAS token to the blob URL in the contentUrls field.
When generating the SAS token, grant the permissions the Speech service needs (read access at minimum), as in the sketch below.
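As a minimal sketch of generating such a URL, assuming the azure-storage-blob Python package and placeholder account details, a read-only SAS appended to the blob URL could look like this:

from datetime import datetime, timedelta
from azure.storage.blob import BlobSasPermissions, generate_blob_sas

account_name = "mystorageaccount"      # hypothetical account
account_key = "<storage-account-key>"
container_name = "meetingnotes"
blob_name = "Meeting82035.wav"

# Read permission is enough for the Speech service to download the recording.
sas_token = generate_blob_sas(
    account_name=account_name,
    container_name=container_name,
    blob_name=blob_name,
    account_key=account_key,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(hours=2),
)

# Use this full URL in the "contentUrls" field of the transcription request.
content_url = (
    f"https://{account_name}.blob.core.windows.net/"
    f"{container_name}/{blob_name}?{sas_token}"
)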

I contacted Azure support and they provided the correct solution, which is to assign the role "Storage Blob Data Contributor" to the Speech service resource:
Go to Access control (IAM) on your storage account.
Go to Role assignments.
Click "Add", then add your Speech service under Managed identities.
That should fix it.
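If you prefer to script that role assignment, here is a minimal sketch with the azure-identity and azure-mgmt-authorization Python packages (the subscription, resource group, account, and principal IDs are placeholders; ba92f5b4-2d11-453d-a403-e96b0029c9fe is the built-in role-definition GUID for Storage Blob Data Contributor):

import uuid
from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient
from azure.mgmt.authorization.models import RoleAssignmentCreateParameters

subscription_id = "<subscription-id>"
scope = (
    f"/subscriptions/{subscription_id}/resourceGroups/<resource-group>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)
# Built-in role: Storage Blob Data Contributor.
role_definition_id = (
    f"/subscriptions/{subscription_id}/providers/Microsoft.Authorization"
    "/roleDefinitions/ba92f5b4-2d11-453d-a403-e96b0029c9fe"
)

client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)
client.role_assignments.create(
    scope,
    str(uuid.uuid4()),  # every role assignment needs a fresh GUID name
    RoleAssignmentCreateParameters(
        role_definition_id=role_definition_id,
        principal_id="<speech-resource-managed-identity-object-id>",
        principal_type="ServicePrincipal",
    ),
)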

Related

Azure Bot transcripts not being saved

I have a bot developed in Bot Framework Composer and have implemented blob transcript storage. Transcript storage works when I run the bot locally, but once I publish the bot to Azure, no transcripts are saved.
I presume there is some error in the Azure bot accessing the blob storage, but I don't see any errors generated in Azure. The blob storage does not show any access attempts, indicating to me that the request never reaches blob storage.
I updated CORS on the blob storage to allow all origins and methods, but this did not have any effect.
Any suggestions what to look for or what to try next?
The issue was that there are two steps to adding transcripts to an existing bot.
In Composer settings, add the blob storage settings in the runtimeSettings > components > features section:
"blobTranscript": {
"connectionString": "DefaultEndpointsProtocol=https;AccountName=bottranscripts;AccountKey=<your key here>;EndpointSuffix=core.windows.net",
"containerName": "transcripts"
}
At this point, running the bot locally should store transcripts in blob storage in Azure.
Again, in Composer, check the publish settings for publishing to Azure. There should be a setting:
"blobStorage": {
"connectionString": "<ConnectionString>",
"container": "transcripts",
"name": "<myBotName>"
}
Make sure that the connection string matches what you entered in the runtimeSettings section. The bot in Azure will use the publish settings, not the runtimeSettings, for transcripts.
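To confirm that transcripts are actually landing in the container after publishing, a quick check with the azure-storage-blob Python package (same connection string and container name as in the settings above) might look like:

from azure.storage.blob import ContainerClient

# Same connection string and container name as in the Composer settings.
container = ContainerClient.from_connection_string(
    "<ConnectionString>", container_name="transcripts"
)
for blob in container.list_blobs():
    print(blob.name, blob.last_modified)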

Azure Data Factory: Access token from MSI failed for Data Factory

Details of the Error: Get access token from MSI failed for Datafactory XXXX, region XXXX. Please verify resource url is valid and retry. Details: Accquire MI token from MI store V1 failed.
Error Code: 2403
Failure type: User Configuration issue
I used a Web activity in Azure Data Factory to access an Azure Function app using MSI.
I also had these kinds of issues, and it took me some time to figure out the right resource ID for the token I needed.
First of all, the Web activity in ADF or Azure Synapse works quite well for performing Azure REST API calls.
But we have to understand that "access token" is not always the same "access token". Azure AD provides different access tokens depending on the resource provider you want to access.
Here is a list of Resource IDs you can use:
https://learn.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/services-support-managed-identities#azure-services-that-support-azure-ad-authentication
Unfortunately it doesn't seem to be up to date; in my case I'm using https://dev.azuresynapse.net, which is not listed in the docs yet.
As an alternative, there is the Azure Function activity in Azure Data Factory that you can try: https://learn.microsoft.com/en-us/azure/data-factory/control-flow-azure-function-activity
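To illustrate the point that each resource provider expects its own token, here is a minimal sketch with the azure-identity Python package (run inside a resource that has a managed identity; appending /.default turns a resource ID into an OAuth scope):

from azure.identity import ManagedIdentityCredential

credential = ManagedIdentityCredential()

# Each resource ID yields a different token; a token issued for ARM
# will not be accepted by Synapse, and vice versa.
arm_token = credential.get_token("https://management.azure.com/.default")
synapse_token = credential.get_token("https://dev.azuresynapse.net/.default")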

Why is Azure Storage API permission not listed in azure portal?

While trying to build an app that accesses the Azure Storage API using AAD credentials, I noted that some API permissions are not listed in the Azure portal, and the only way to set them is by editing the app registration manifest directly. See my answer here for more details.
My question is simple:
Did I miss the permission for user_impersonation on Azure Storage in the portal, or is it truly not listed?
If so, where can we find a complete list of all available permissions with their GUIDs so that configuring the manifest manually is easier?
For instance, I found the correct GUID for user_impersonation on Azure Storage somewhere on an internet forum and added it to my manifest as shown below.
"requiredResourceAccess": [
{
"resourceAppId": "e406a681-f3d4-42a8-90b6-c2b029497af1",
"resourceAccess": [
{
"id": "03e0da56-190b-40ad-a80c-ea378c433f7f",
"type": "Scope"
}
]
}
]
The result, however, can be seen in the portal after modifying the manifest.
As pointed out in the comments, the entry should be visible under "APIs my organization uses".
However, the search there does not return anything, which seems to be a temporary bug.
You have to manually scroll through the list and hit "load more"; from there you'll find Azure Storage as an API permission.
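If you would rather enumerate the permission GUIDs programmatically than scroll through the portal, one option is to read the resource's service principal from Microsoft Graph. A minimal sketch with the requests Python package (assuming you already hold a Graph access token; the appId is Azure Storage's well-known application ID from the manifest above):

import requests

graph_token = "<access-token-for-graph.microsoft.com>"

# Fetch the Azure Storage service principal by application ID and
# list the delegated scopes (GUID + name) that it exposes.
url = (
    "https://graph.microsoft.com/v1.0/"
    "servicePrincipals(appId='e406a681-f3d4-42a8-90b6-c2b029497af1')"
    "?$select=displayName,oauth2PermissionScopes"
)
resp = requests.get(url, headers={"Authorization": f"Bearer {graph_token}"})
resp.raise_for_status()

for scope in resp.json()["oauth2PermissionScopes"]:
    print(scope["id"], scope["value"])  # e.g. 03e0da56-... user_impersonation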

Azure Logic App with Azure Blob Storage Action: Getting 429 statusCode error

I am using an Azure Logic App with an Azure Blob Storage trigger.
When a blob is created or modified in Azure Storage, I pull the content of that blob, do some transformations on the data, and push it back to Azure Storage as new blob content using the Create Content action of the Azure Blob Storage connector in the Logic App.
With a large number of blobs inserted or updated (for example, 10,000 files), the Logic App is triggered with multiple runs as expected, but the subsequent Azure Blob actions fail with the following error:
{
  "statusCode": 429,
  "message": "Rate limit is exceeded. Try again in 16 seconds."
}
Has anyone faced a similar issue in Logic Apps? If yes, can you suggest the possible reason and a probable fix?
Thanks
Seems like you are hitting the rate limits on the Azure Blob Managed API.
Please refer to Jörgen Bergström's blog about this: http://techstuff.bergstrom.nu/429-rate-limit-exceeded-in-logic-apps/
Essentially, he says you can set up multiple API connections that do the same thing and then randomize which one is used in the logic app code view, which spreads requests across connections and eliminates the rate-exceeding issue.
As an example (I was using SQL connectors), I set up several API connections for my logic app. You can do the same with a blob storage connection, using a similar naming convention, e.g. blob_1, blob_2, blob_3, and so on. You can create as many as you would like; I created 10 for mine:
You would then, in your logic app code view, replace all your current blob connection references, e.g.
@parameters('$connections')['blob']['connectionId']
Where "blob" is your current blob API connection, replace it with the following (note that rand's upper bound is exclusive, so rand(1, 11) selects blob_1 through blob_10):
@parameters('$connections')[concat('blob_', rand(1, 11))]['connectionId']
And then make sure to add all your "blob_" connections at the end of your code:
"blob_1": {
"connectionId": "/subscriptions/.../resourceGroups/.../providers/Microsoft.Web/connections/blob-1",
"connectionName": "blob-1",
"id": "/subscriptions/.../providers/Microsoft.Web/locations/.../managedApis/blob"
},
"blob_2": {
"connectionId": "/subscriptions/.../resourceGroups/.../providers/Microsoft.Web/connections/blob-2",
"connectionName": "blob-2",
"id": "/subscriptions/.../providers/Microsoft.Web/locations/.../managedApis/blob"
},
...
The logic app would then randomize which connection to use during each run, eliminating the 429 rate limit error.
Please check this doc: https://learn.microsoft.com/en-us/azure/azure-resource-manager/resource-manager-request-limits
For each Azure subscription and tenant, Resource Manager allows up to 12,000 read requests per hour and 1,200 write requests per hour.
You can check the remaining quota from the response headers:
response.Headers.GetValues("x-ms-ratelimit-remaining-subscription-reads").GetValue(0)
or
response.Headers.GetValues("x-ms-ratelimit-remaining-subscription-writes").GetValue(0)

Azure rest apis to ListKeys of classic storage account

I wanted to retrieve the access keys of a classic storage account.
I found this online:
POST https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Storage/storageAccounts/{accountName}/listKeys?api-version=2016-12-01
But this is not applicable to classic storage accounts. When I replace Microsoft.Storage with Microsoft.ClassicStorage, it throws the following error:
{
  "error": {
    "code": "InvalidRequestUri",
    "message": "The request uri is invalid. The requested path '/subscriptions/{subscriptionID}/resourceGroups/{myresourcegroup}/providers/Microsoft.ClassicStorage/storageAccounts/{myStorageAccount}/listKeys' is not found."
  }
}
NOTE: I am using application permissions, not delegated.
For classic storage accounts, the documented way to list keys is to use the Service Management API (unfortunately, I am not able to find the documentation).
You can get the keys for a classic storage account through the ARM API as well; however, this is not supported, and Microsoft may remove that API at any time. Note that the only difference from the URL you tried is the older api-version, which Microsoft.ClassicStorage requires. To do so, simply use the following URL:
https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.ClassicStorage/storageAccounts/{accountName}/listKeys?api-version=2015-06-01
It is also recommended that you convert your classic storage accounts to ARM storage accounts if possible.
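For reference, a minimal sketch of that unsupported call in Python (using the requests and azure-identity packages; the subscription, resource group, and account names are placeholders):

import requests
from azure.identity import DefaultAzureCredential

subscription_id = "<subscription-id>"
resource_group = "<resource-group>"
account_name = "<classic-storage-account>"

# Acquire an ARM token; DefaultAzureCredential also works with a
# service principal, matching the application-permissions setup.
token = DefaultAzureCredential().get_token(
    "https://management.azure.com/.default"
).token

url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}"
    f"/providers/Microsoft.ClassicStorage/storageAccounts/{account_name}"
    "/listKeys?api-version=2015-06-01"
)
resp = requests.post(url, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()
print(resp.json())  # primary and secondary access keys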
