How to access emulated Azure Storage on a local PC

I am beginning development with Azure Functions. I've been able to connect to my actual Azure Storage account queue for testing how to program with Azure Functions. Now my next step is to use Microsoft Azure Storage Explorer with the local storage account so I do not have to be connected to Azure. I saw how to do it in this article: https://learn.microsoft.com/en-us/azure/storage/storage-configure-connection-string#create-a-connection-string-to-the-storage-emulator
In appsettings.json I changed my values to exactly this:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==",
    "AzureWebJobsDashboard": "",
    "StorageConnectionString": "UseDevelopmentStorage=true"
  }
}
When I start up the Azure Functions CLI using Visual Studio, I get this error message:
ScriptHost initialization failed Microsoft.WindowsAzure.Storage: The remote server returned an error: (403) Forbidden.
Has anyone encountered this?

Please change the following line of code:
"AzureWebJobsStorage": "DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw=="
to either:
"AzureWebJobsStorage": "UseDevelopmentStorage=true"
or:
"AzureWebJobsStorage": "DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;
AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;
BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;
TableEndpoint=http://127.0.0.1:10002/devstoreaccount1;
QueueEndpoint=http://127.0.0.1:10001/devstoreaccount1;"
That should take care of the 403 error.
Basically, the storage emulator has different endpoints than a cloud storage account. For example, the default blob endpoint for a cloud storage account is http://[youraccount].blob.core.windows.net, while the blob endpoint for the storage emulator is http://127.0.0.1:10000. When you specify only the storage account name and key for the storage emulator in your connection string, the storage client library treats it as a cloud storage account and tries to connect to http://devstoreaccount1.blob.core.windows.net using the account key you provided. Since the key for devstoreaccount1 in the cloud is not the one you provided, you get the 403 error.
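To see this for yourself, here is a minimal sketch (assuming the Microsoft.WindowsAzure.Storage package named in the error message) that parses both connection strings and prints the blob endpoint each one resolves to:

using System;
using Microsoft.WindowsAzure.Storage;

class EmulatorEndpoints
{
    static void Main()
    {
        // The shortcut string resolves to the local emulator endpoints.
        var emulator = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
        Console.WriteLine(emulator.BlobEndpoint); // http://127.0.0.1:10000/devstoreaccount1

        // Account name and key alone are treated as a cloud account,
        // which is what produces the 403 against the real service.
        var cloud = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;" +
            "AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==");
        Console.WriteLine(cloud.BlobEndpoint); // http://devstoreaccount1.blob.core.windows.net/
    }
}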

Related

Azure Bot transcripts not being saved

I have a bot developed in Bot Framework Composer and have implemented blob transcript storage. Transcript storage works when I run the bot locally, but once I publish the bot to Azure, no transcripts are saved.
I presume there is some error in the Azure bot accessing the blob storage, but I don't see any errors generated in Azure. The blob storage does not show any access attempts, indicating to me that the request never gets to blob storage.
I updated CORS on the blob storage to allow all origins and methods, but this did not have any effect.
Any suggestions what to look for or what to try next?
The issue was that there are two steps to adding transcripts to an existing bot.
In Composer settings, add the blob storage settings in the runtimeSettings > components > features section:
"blobTranscript": {
"connectionString": "DefaultEndpointsProtocol=https;AccountName=bottranscripts;AccountKey=<your key here>;EndpointSuffix=core.windows.net",
"containerName": "transcripts"
}
At this point, running the bot locally should store transcripts in blob storage in Azure.
Again, in Composer, check the publish settings for publishing to Azure. There should be a setting like this:
"blobStorage": {
"connectionString": "<ConnectionString>",
"container": "transcripts",
"name": "<myBotName>"
}
Make sure that the connection string matches what you entered in the runtimeSettings section. The bot in Azure will use the publish settings, not the runtimeSettings, for transcripts.

Azure ADF using Azure Batch throws Shared Access Signature generation error

I am working on a simple Azure Data Factory pipeline where I have added a Batch Service and specified the Batch Service account (which I created through a linked service; I have tested that the connection works). In the command I am just running a simple "ls" command, and when I do a debug run I get this error: "Cannot create Shared Access Signature unless Account Key credentials are used." I have the following linked services: "Azure Batch", "Azure Blob Storage", and Key Vault (where we store the access key). All linked service connections are working properly.
Any help on how to fix this error: "Cannot create Shared Access Signature unless Account Key credentials are used."
(Screenshots: Azure Batch linked service, Azure Storage linked service, Azure Data Factory pipeline.)
The issue happens because you used "Managed Identity" to connect ADF to the storage account. The connection test on the linked service will report success, but when that storage account is used for a Batch activity, the linked service needs the "Account Key" authentication type (see here).
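For illustration, a hypothetical Azure Blob Storage linked service definition that uses account key authentication could look like the sketch below (the service name, account name, and key are placeholders):

{
  "name": "AzureBlobStorageLinkedService",
  "properties": {
    "type": "AzureBlobStorage",
    "typeProperties": {
      "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"
    }
  }
}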

Error calling the Azure Function endpoint from Azure Data Factory

I have linked an Azure Function in a Data Factory pipeline which writes a text file to blob storage.
The Azure Function works fine when executed independently and writes the file to blob storage, but I am facing the error below when I run the Azure Function from Data Factory:
{
  "errorCode": "3600",
  "message": "Error calling the endpoint.",
  "failureType": "UserError",
  "target": "Azure Function1"
}
I have configured the Azure Function to access the blob with a blob endpoint and shared access signature, as shown below:
"AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=XYZ;AccountKey=XYZ;BlobEndpoint=ABC;SharedAccessSignature=AAA"
Please let me know if I need to make some additional property changes in blob storage to call the Azure Function successfully from Data Factory.
What is the trigger in your Azure Function? An HTTP trigger?
Also, how is your Azure Function protected?
If it is protected using AAD, you need a Bearer token.
If you are using keys, you need the x-functions-key header.
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-http-webhook#authorization-keys
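For example, here is a minimal sketch of calling a key-protected function from C# (the URL and key are placeholders):

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class CallFunctionDemo
{
    static async Task Main()
    {
        var client = new HttpClient();
        // Key-protected functions expect the key in the x-functions-key header
        // (or as a ?code= query-string parameter).
        client.DefaultRequestHeaders.Add("x-functions-key", "<your function key>");
        var response = await client.PostAsync(
            "https://<your-app>.azurewebsites.net/api/<your-function>",
            new StringContent("{}", Encoding.UTF8, "application/json"));
        Console.WriteLine(response.StatusCode);
    }
}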
Here is a video from channel9 that might help:
Run Azure Functions from Azure Data Factory pipelines
https://channel9.msdn.com/Shows/Azure-Friday/Run-Azure-Functions-from-Azure-Data-Factory-pipelines
The Azure Function Activity in the ADF pipeline expects the Azure Function to return a JSON object instead of an HttpResponseMessage.
Here is how we solved it:
https://microsoft-bitools.blogspot.com/2019/01/introducing-azure-function-activity-to.html
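As an illustration, here is a minimal sketch (assuming the v2 C# programming model; the function name and payload are hypothetical) of returning a JSON object the Azure Function activity can parse:

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class WriteFileFunction
{
    [FunctionName("WriteFileFunction")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        ILogger log)
    {
        log.LogInformation("Called from the ADF pipeline.");
        // ... write the text file to blob storage here ...

        // Return a JSON object rather than a raw HttpResponseMessage so that
        // the Azure Function activity in ADF can parse the response body.
        return new OkObjectResult(new { status = "succeeded" });
    }
}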

How to use an Azure blob-only storage account with Azure Functions - Trying to create blob snapshot

I'm trying to set up a function to take a snapshot of a blob container every time a change is pushed to it. There is some pretty simple functionality in Azure Functions to do this, but it only works for general-purpose storage accounts, and I'm trying to do this with a blob-only storage account. I'm very new to Azure, so I may be approaching this all wrong, but I haven't been able to find much helpful information. Is there any way to do this?
As joy-wang mentioned, the Azure Functions runtime requires a general-purpose storage account.
A general-purpose storage account is required to configure the AzureWebJobsStorage and the AzureWebJobsDashboard settings (local.settings.json or the Application settings blade in the Azure portal):
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "my general purpose storage account connection string",
    "AzureWebJobsDashboard": "my general purpose storage account connection string",
    "MyOtherStorageAccountConnectionstring": "my blob only storage connection string"
  }
}
If you want to create a BlobTrigger function, you can specify another connection string and create a snapshot every time a blob is created or updated:
[FunctionName("Function1")]
public static async Task Run([BlobTrigger("test-container/{name}",
Connection = "MyOtherStorageAccountConnectionstring")]CloudBlockBlob myBlob,
string name, TraceWriter log)
{
log.Info($"C# Blob trigger function Processed blob\n Name:{name}");
await myBlob.CreateSnapshotAsync();
}
In Visual Studio:
I have tried to create a snapshot for a blob-only storage account named joyblobstorage, but it failed. I suppose you should get the same error as in the screenshot.
The error information says: Microsoft.Azure.WebJobs.Host: Storage account 'joyblobstorage' is of unsupported type 'Blob-Only/ZRS'. Supported types are 'General Purpose'.
In the portal:
I tried to create a Function App and use the existing storage, but it could not find my blob-only storage account. The Azure Function setup in the portal does not allow you to select a blob-only storage account. Please refer to the screenshot.
Conclusion:
It is not possible to create a snapshot for a blob-only storage account. In the official documentation, you can see the storage account requirements.
When creating a function app in App Service, you must create or link to a general-purpose Azure Storage account that supports Blob, Queue, and Table storage.
Also, in the App settings reference, you can see:
AzureWebJobsStorage
The Azure Functions runtime uses this storage account connection string for all functions except for HTTP triggered functions. The storage account must be a general-purpose one that supports blobs, queues, and tables.
AzureWebJobsDashboard
Optional storage account connection string for storing logs and displaying them in the Monitor tab in the portal. The storage account must be a general-purpose one that supports blobs, queues, and tables.
Here is the feedback item where the Azure App Service team has explained the storage account requirements; you can refer to it.

Azure Functions: Configuration file for referenced assembly

We have a referenced project in an Azure Function App project. The referenced assembly is a data service project, which is also referenced by a Web API project.
When referenced in the Web API project, the data service project automatically refers to the web.config file for connection strings and app settings, while in the Azure Functions app the data service project is not able to locate the connection strings stored in the local.settings.json file.
How do we address this issue locally?
How do we address the issue in production?
NOTE: We would like to have a DRY approach here.
As Jan V said, you can add a connection string in the json file. Besides, you can set a breakpoint to see whether you get the 'str' value (Debug).
var str = ConfigurationManager.ConnectionStrings["ConnectionStringName"].ConnectionString;
Code in the local.settings.json file:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "your storage account connection string",
    "AzureWebJobsDashboard": "your storage account connection string"
  },
  "ConnectionStrings": {
    "ConnectionStringName": "Data Source=tcp:database server name,1433;Initial Catalog=database name;Integrated Security=False;User Id=user name;Password=your password;Encrypt=True;TrustServerCertificate=False;MultipleActiveResultSets=True" // Refer to Azure portal > SQL database > connection strings
  }
}
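For completeness, here is a minimal sketch (assuming an Azure Functions v1 / .NET Framework project, where the ConnectionStrings section of local.settings.json is surfaced through ConfigurationManager) of how the referenced data service assembly can read that value; the class and method names are hypothetical:

using System.Configuration; // requires a reference to System.Configuration

public static class DataServiceConfig
{
    public static string GetConnectionString()
    {
        // Resolves from web.config when hosted by the Web API project and from
        // local.settings.json when hosted by the Functions v1 runtime, which
        // keeps the shared data service project DRY across both hosts.
        return ConfigurationManager.ConnectionStrings["ConnectionStringName"]?.ConnectionString;
    }
}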
For more details about how to use Azure Functions to connect to an Azure SQL Database, you can read this article.
