I am attempting to deploy a small piece of Python code to Azure. I want a function that makes a request to an API, manipulates the response, and writes the output to a CSV file, all in Azure. The API keys are stored in and retrieved from Azure Key Vault.
I am unable to run my code because the Function is stuck in read-only mode, so it cannot create the new CSV file it needs.
Here's what I've done so far:
Created a new resource group for the resources.
Created a new function app running on Linux, set to the Python runtime.
Created dedicated App Service Plan and Application Insights resources for the function app.
Created a Key Vault and stored the API keys as secrets.
Created a system-assigned managed identity for the Azure Function and added a Key Vault access policy allowing the Function to 'get' secrets.
Linked the Key Vault and the Function by adding a Key Vault reference (I believe that's the term) to the Function's application settings; a sketch follows below.
In the Function's application settings, set FUNCTION_APP_EDIT_MODE to 'readwrite'. Also set WEBSITE_RUN_FROM_PACKAGE to '0' (I also tried '1', to no avail).
For context, I am deploying the function itself using the Azure Functions extension in VS Code.
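For reference, a minimal sketch of what that Key Vault reference looks like as an application setting; the setting name, vault name, and secret name are all placeholders:

API_KEY = @Microsoft.KeyVault(SecretUri=https://<vault-name>.vault.azure.net/secrets/<secret-name>/)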
Any suggestions on how to allow writing a new CSV file would be appreciated. Alternatively, suggestions for alternative ways to run this code in Azure would also be welcome.
Edit: Could the problem here actually be that the code is not outputting to the attached storage account? The function being read-only might not be the true problem here...
My function.json file:
{
    "scriptFile": "__init__.py",
    "bindings": [
        {
            "name": "mytimer",
            "type": "timerTrigger",
            "direction": "in",
            "schedule": "0 0 4 * * *"
        },
        {
            "name": "outputFile",
            "direction": "out",
            "type": "file",
            "path": "output/csv",
            "connection": "AzureWebJobsStorage"
        }
    ]
}
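For comparison, a blob output binding (the binding type the storage extension provides for writing blobs) might look like the sketch below; the container and blob names are assumptions, not taken from the setup above:

{
    "name": "outputFile",
    "direction": "out",
    "type": "blob",
    "path": "output/non_exempt_devices.csv",
    "connection": "AzureWebJobsStorage"
}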
A snippet of my __init__.py file, which attempts to write a CSV file and place it in storage:
with open("non_exempt_devices.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Host Name", "Metadata"])
    writer.writerows(output[0])
outputFile.set(f)
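For reference, here is a minimal sketch of building the CSV entirely in memory instead, which sidesteps the read-only file system; it assumes a blob-style output binding named outputFile typed as func.Out[str], and rows is a placeholder for the data assembled from the API response:

import csv
import io
import azure.functions as func

def main(mytimer: func.TimerRequest, outputFile: func.Out[str]) -> None:
    rows = [["host-1", "metadata-1"]]  # placeholder for the processed API data
    # Build the CSV in memory; the function's file system is read-only,
    # so avoid writing to the local disk entirely.
    buffer = io.StringIO()
    writer = csv.writer(buffer, lineterminator="\n")
    writer.writerow(["Host Name", "Metadata"])
    writer.writerows(rows)
    # The output binding uploads the CSV text to the configured storage path.
    outputFile.set(buffer.getvalue())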
Related
I'm new to Azure Functions, so bear with me.
I'm working on a microservice that will use a blob storage trigger and input/output bindings to process data and write to a database, but I am having trouble with the basic blob storage trigger function. For reference, I am developing in Visual Studio Code using Python with the V2 programming model described in the Azure documentation. I have installed the Azure Functions extension and the Azurite extension, and in my local.settings.json I have added the connection string as the value of AzureWebJobsStorage and set the feature flag and secret storage type.
{
    "IsEncrypted": false,
    "Values": {
        "FUNCTIONS_WORKER_RUNTIME": "python",
        "AzureWebJobsStorage": "DefaultEndpointsProtocol=REDACTED;AccountName=REDACTED;AccountKey=REDACTED;EndpointSuffix=core.windows.net",
        "AzureWebJobsFeatureFlags": "EnableWorkerIndexing",
        "AzureWebJobsSecretStorageType": "files",
        "FUNCTIONS_EXTENSION_VERSION": "~4",
        "APPINSIGHTS_INSTRUMENTATIONKEY": "REDACTED",
        "APPLICATIONINSIGHTS_CONNECTION_STRING": "REDACTED",
        "user": "REDACTED",
        "host": "REDACTED",
        "password": "REDACTED",
        "dbname": "REDACTED",
        "driver": "REDACTED"
    }
}
I had our architect create the necessary resources for me (a gen2 storage account and a function app). Our company has security protocols in place, so network access is disabled for the storage account and a private endpoint is configured; this wasn't done for the function app because the deployment process wouldn't work.
In my function_app.py I have this for the blob trigger:
import logging
import azure.functions as func

app = func.FunctionApp()

@app.function_name(name="BlobTrigger1")
@app.blob_trigger(arg_name="myblob", path="samples-workitems/{name}",
                  connection="")
def test_function(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
host.json:
{
    "version": "2.0",
    "logging": {
        "applicationInsights": {
            "samplingSettings": {
                "isEnabled": true,
                "excludedTypes": "Request"
            }
        }
    },
    "extensions": {
        "blobs": {
            "maxDegreeOfParallelism": 4
        }
    },
    "extensionBundle": {
        "id": "Microsoft.Azure.Functions.ExtensionBundle",
        "version": "[3.15.0, 4.0.0)"
    },
    "concurrency": {
        "dynamicConcurrencyEnabled": true,
        "snapshotPersistenceEnabled": true
    }
}
When I run the app locally using Azurite: Start and then func host start, it spits out the output shown in a screenshot that isn't included here.
It is also worth noting that I have a util folder with simple scripts. They DO NOT FOLLOW THE BLUEPRINT SCHEMA. I don't know if this is the problem or if I can keep doing this, but the functions in them aren't meant to be Azure Functions; they're more like helper functions.
(Screenshots: storage account networking, container view, and the Azure Function in the cloud. I haven't deployed to this function because it didn't work.)
It is very frustrating because I don't know whether the problem lies with the way the resource was configured, with the code I wrote, with the way I set this up, or with a simple issue in the settings of one of the JSON files.
I want to create an Azure Storage queue triggered Azure Function. I went through the following tutorial: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-queue-trigger
I basically want to trigger the function whenever a message is pushed onto the queue, and push the result to another queue once the function finishes.
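For context, a minimal sketch of what the matching __init__.py could look like for that flow; the processing step is a placeholder:

import azure.functions as func

def main(input: func.QueueMessage, output: func.Out[str]) -> None:
    # Read the incoming queue message, process it, and push the result
    # to the receiver queue through the output binding.
    payload = input.get_body().decode("utf-8")
    result = payload.upper()  # placeholder for the real processing
    output.set(result)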
function.json
{
    "bindings": [
        {
            "name": "input",
            "type": "queueTrigger",
            "direction": "in",
            "queueName": "queue-trigger",
            "connection": ""
        },
        {
            "type": "queue",
            "direction": "out",
            "name": "output",
            "queueName": "queue-receiver",
            "connection": ""
        }
    ]
}
When I deployed the function, I got the following error in the logs under Monitor:
2022-09-10T12:16:53.412 [Error] The 'QueueTrigger' function is in error:
Microsoft.Azure.WebJobs.Host: Error indexing method 'Functions.QueueTrigger'.
Microsoft.Azure.WebJobs.Extensions.Storage.Queues: Storage account connection string 'AzureWebJobs<storage account name>_STORAGE' does not exist. Make sure that it is a defined App Setting.
You can see I have defined three application settings:
AzureWebJobs<storage account>_STORAGE
AzureWebJobsStorage
<storage account>_STORAGE
According to the documentation, if connection is empty in function.json, then AzureWebJobsStorage will be used.
I even tried setting connection to "<storage account>_STORAGE", but that raised the same error.
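For what it's worth, the runtime resolves the connection property by looking up an app setting with that exact name (for the storage extensions, the AzureWebJobs-prefixed variant is also checked). A sketch with a placeholder setting name MyQueueStorage:

{
    "name": "input",
    "type": "queueTrigger",
    "direction": "in",
    "queueName": "queue-trigger",
    "connection": "MyQueueStorage"
}

with an application setting named MyQueueStorage that holds the storage account's connection string.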
I have reproduced this in my environment, and the workaround below worked for me.
I created a storage account, copied its connection string, and created a queue in it (screenshots omitted).
I then pasted the connection string into the local.settings.json file in Visual Studio and published the app to Azure.
In the published function app, open Configuration and click on Application settings. If a setting with the connection string's name is already present, open it and paste in the connection string. If not, create a new application setting and paste in the same connection string taken from the storage account.
Click OK, then save.
Then send a message to the queue, and the function triggers.
This workaround worked for me.
I have created some Azure Functions in a C# project that are working fine locally. An example of the definition of a function is the following:
[FunctionName("createBankTransactionFromServiceBus")]
public async Task Run(
    [ServiceBusTrigger("vspan.sbus.xerobanktransaction.requests", "requests",
        Connection = "AccountingServiceBusConnection")] string myQueueItem)
{
}
Nothing different than usual. The problem is that when I deploy this function to Azure, it can't find the connection string. So I added a new one to local.settings.json, but now I have two AccountingServiceBusConnection entries with the same value, one for my local machine and one for Azure.
{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet",
        "AccountingServiceBusConnection": "connectionString"
    },
    "AccountingServiceBusConnection": "connectionString"
}
I tried to replace the connection in the signature of the function like this:
[FunctionName("createBankTransactionFromServiceBus")]
public async Task Run(
    [ServiceBusTrigger("vspan.sbus.xerobanktransaction.requests", "requests",
        Connection = "%Values:AccountingServiceBusConnection%")] string myQueueItem)
{
}
but locally I get a warning (with or without the %):
Warning: Cannot find value named
'Values:AccountingServiceBusConnection' in local.settings.json that
matches 'connection' property set on 'serviceBusTrigger' in
'C:\Projects\fun\bin\Debug\netcoreapp3.1\createBankTransactionFromServiceBus\function.json'.
You can run 'func azure functionapp fetch-app-settings
' or specify a connection string in
local.settings.json.
Also, I tried moving AccountingServiceBusConnection under ConnectionStrings, with the same result.
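For comparison, here is a minimal local.settings.json in which the trigger can resolve the connection; the key point is that the setting lives inside Values and the attribute uses the bare setting name (Connection = "AccountingServiceBusConnection", with no Values: prefix and no % tokens):

{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "UseDevelopmentStorage=true",
        "FUNCTIONS_WORKER_RUNTIME": "dotnet",
        "AccountingServiceBusConnection": "connectionString"
    }
}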
Update: screenshots of Kudu with local.settings.json and of the Azure Functions configuration (not included here).
How can you configure a pipeline in DevOps? How do you get configuration from DevOps into your Azure Functions configuration?
There's no local.settings.json on Azure; you must add the settings to your Azure App Service's application settings:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-how-to-use-azure-function-app-settings
EDIT:
For Key Vault integration, you must assign a managed identity to your function (screenshot omitted). Then use a Key Vault reference:
@Microsoft.KeyVault(SecretUri={theSecretUri})
More info:
https://medium.com/statuscode/getting-key-vault-secrets-in-azure-functions-37620fd20a0b
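As an example, the Service Bus connection setting could point at the secret like this; the vault and secret names are placeholders:

AccountingServiceBusConnection = @Microsoft.KeyVault(SecretUri=https://<vault-name>.vault.azure.net/secrets/<secret-name>/)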
By default, the local.settings.json file is NOT deployed with your code to an Azure Function. See the documentation here:
By default, these settings are not migrated automatically when the
project is published to Azure. Use the --publish-local-settings switch
when you publish to make sure these settings are added to the function
app in Azure. Note that values in ConnectionStrings are never
published.
You have a few options:
Explicitly publish your local.settings.json file with the aforementioned command-line switch.
Add this setting (and any other settings needed) to your Azure Function's Configuration. Those values defined in your app settings in the Azure Portal take precedence over everything else.
I don't recommend option #1, because it requires you to place production values in your source code, which is generally a bad idea.
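For completeness, option #1 uses the switch mentioned in the quoted documentation; the app name is a placeholder:

func azure functionapp publish <APP_NAME> --publish-local-settings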
Updated - how to configure with Azure DevOps
For Azure DevOps, we've taken a two-pronged approach.
We place the bare minimum key/value pairs in the Azure Function configuration. These are added in our YAML deployment pipeline. Some variable values (like connection strings) are read from other resources at deploy time so that sensitive info isn't checked into revision control with our YAML script. Here's an example ARM template fragment, referenced from the pipeline, for deploying an Azure Function:
{
    "apiVersion": "2016-03-01",
    "type": "Microsoft.Web/sites",
    "name": "FooBarFunction",
    "location": "[resourceGroup().location]",
    "kind": "functionapp",
    "dependsOn": [
        "[resourceId('Microsoft.Web/serverfarms', 'YourHostingPlanName')]"
    ],
    "properties": {
        "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', 'YourHostingPlanName')]",
        "siteConfig": {
            "appSettings": [
                {
                    "name": "WEBSITE_CONTENTSHARE",
                    "value": "FooBarFunctionContentShare"
                },
                {
                    "name": "FUNCTIONS_WORKER_RUNTIME",
                    "value": "dotnet"
                }
            ]
        }
    }
}
We use the Azure App Configuration service to hold all of our other app settings. This gives us the advantage of defining different config profiles and of hot-reloading app settings without recycling the Azure Function. It also plays nicely with Key Vault for sensitive settings.
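As a rough sketch of what reading a setting from App Configuration at runtime can look like (shown here in Python; the key name and connection string are placeholders, and the azure-appconfiguration package is assumed):

from azure.appconfiguration import AzureAppConfigurationClient

# Connect using the App Configuration store's connection string
# (in practice this would itself come from an app setting).
client = AzureAppConfigurationClient.from_connection_string("<app-config-connection-string>")

# Fetch a single configuration setting by key and read its value.
setting = client.get_configuration_setting(key="FooBar:SomeSetting")
print(setting.value)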
I am trying to create an Azure Function with a Cosmos DB trigger. My Cosmos DB is in a different resource group than my Azure Function, and my function is not getting triggered.
Is there any restriction that the Azure Function and the Cosmos DB account must be in the same resource group? If not, is there any additional setting needed when they are in different resource groups?
My Azure Function is in Python, running on a Linux App Service plan. From the Azure documentation I learned that I cannot mix Windows and Linux app services, which is a current limitation.
Azure Documentation on Current Limitation
I need to use an Azure Function in Python to check the Azure Cosmos DB change feed.
Here is my function.json used to connect a trigger to a Cosmos DB collection:
{
    "scriptFile": "__init__.py",
    "bindings": [
        {
            "type": "cosmosDBTrigger",
            "name": "documents",
            "direction": "in",
            "leaseCollectionName": "leases1",
            "connectionStringSetting": "devcosmosdb_DOCUMENTDB",
            "databaseName": "devcosmosdb",
            "collectionName": "testCollection",
            "createLeaseCollectionIfNotExists": "true"
        }
    ]
}
There is no such limitation.
Please check the databaseName, collectionName and connectionStringSetting again.
If you have already deployed the function to Azure, you need to add the connectionStringSetting value to the Application Settings. In your scenario, you should add the connection string as shown below.
You can find the connection string under the Keys section of your Cosmos DB account.
Also, please check the firewall settings.
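For example, the application setting's name must match the connectionStringSetting value, with the connection string copied from the portal; the account name and key below are placeholders:

devcosmosdb_DOCUMENTDB = AccountEndpoint=https://<account-name>.documents.azure.com:443/;AccountKey=<account-key>;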
I would like to add some more useful information on this topic. You might be like the old me, staring at the Monitor logs waiting for something to appear while using the default __init__.py template to get updates on documents inserted into your Cosmos DB.
May I refer you to this link - https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-configure-cosmos-db-trigger#enabling-logging
You need to edit the host.json file to enable logging for CosmosDB before anything useful will appear in your logs.
{
    "version": "2.0",
    "logging": {
        "fileLoggingMode": "always",
        "logLevel": {
            "Host.Triggers.CosmosDB": "Trace"
        }
    }
}
I had, for some reason, changed the name in the bindings section of function.json. If this doesn't match the parameter name in your function definition, the trigger will not fire.
For example, the name here:
{
    "scriptFile": "__init__.py",
    "bindings": [
        {
            "type": "cosmosDBTrigger",
            "name": "documents",   <<<<<<<<<<<<<
            "direction": "in",
            "leaseCollectionName": "leases",
            "connectionStringSetting": "CosmosDBConnectionString",
            "databaseName": "dbname",
            "collectionName": "collname",
            "createLeaseCollectionIfNotExists": true
        }
    ]
}
must match the parameter name in the main method of __init__.py when using Python as the runtime (this should be similar for other languages):
import azure.functions as func

# The parameter name 'documents' must match the "name" value in function.json.
def main(documents: func.DocumentList):
    # do something smart with the list
    pass
I am trying to get an Azure Function to trigger when a blob file is uploaded. The function was deployed from an Azure DevOps release.
Azure deploy steps (screenshot shows the most relevant information).
Storage account:
- has a folder [blob folder] where I upload files to.
Azure function code:
public async static Task Run(
    [BlobTrigger("[blob folder]/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob,
    string name, ILogger log)
{
    // any breakpoint here is never hit.
}
function.json:
{
    "generatedBy": "Microsoft.NET.Sdk.Functions-1.0.24",
    "configurationSource": "attributes",
    "bindings": [
        {
            "type": "blobTrigger",
            "connection": "AzureWebJobsStorage",
            "path": "[blob folder]/{name}",
            "name": "myBlob"
        }
    ],
    "disabled": false,
    "scriptFile": "../bin/[dllname].dll",
    "entryPoint": "[namespace].[function].[command]"
}
The storage account and the function are part of the same resource group. The Function app settings contain the values AzureWebJobsDashboard and AzureWebJobsStorage; I read in this post that these should be available in the function settings. I also turned off the function's "Always On" setting and made sure the function is running.
Running and debugging the function locally (with the Azure Storage Emulator and Storage Explorer) works fine: the function is triggered after I upload a file to the blob folder.
In Azure, it seems like nothing happens. I'm not super familiar with the Azure environment, so any help is appreciated.
The issue was caused by the FUNCTIONS_EXTENSION_VERSION setting in the application settings; I needed to update it to the runtime version that matches the deployed function.
I managed to get debugging running using Jason Roberts' blog post and debugged the trigger event of my function.