I am trying to get an Azure Function to trigger when a blob file is uploaded. The function was deployed from an Azure DevOps release.
Azure deploy steps (screenshot shows the most relevant information):
Storage account:
- has a folder [blob folder] that I upload files to.
Azure function code:
public static async Task Run(
    [BlobTrigger("[blob folder]/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob,
    string name, ILogger log)
{
    // Any breakpoint here is never hit.
}
function.json:
{
  "generatedBy": "Microsoft.NET.Sdk.Functions-1.0.24",
  "configurationSource": "attributes",
  "bindings": [
    {
      "type": "blobTrigger",
      "connection": "AzureWebJobsStorage",
      "path": "[blob folder]/{name}",
      "name": "myBlob"
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/[dllname].dll",
  "entryPoint": "[namespace].[function].[command]"
}
The storage account and function are part of the same resource group. The function app settings contain the values AzureWebJobsDashboard and AzureWebJobsStorage, which I read (in another post) need to be present in the function's settings. I also turned off the function setting "Always On", and made sure the function is running.
Running and debugging the function locally (with the Azure storage emulator and storage explorer) works fine. The function is triggered after I upload a file to the blob folder.
In Azure, it seems like nothing happens. I'm not super familiar with the Azure environment, so any help is appreciated.
The issue was caused by the FUNCTIONS_EXTENSION_VERSION setting in the application settings. This setting pins the major version of the Functions runtime (for example ~1 or ~2), and it has to match the version the deployed code was built against; I needed to update it to the correct version.
I managed to get debugging running using Jason Robert's blog post and debugged the trigger event of my function.
Related
I am attempting to deploy a small piece of Python code to the Azure cloud. I want a function that makes a request to an API, manipulates the response, and puts the output into a CSV file, all in Azure. It stores and retrieves the API keys from Azure Key Vault.
I am unable to run my code because the function is stuck in read-only mode, so it cannot create the new CSV file it needs.
Here's what I've done so far:
Created a new resource group for the resources.
Created a new function app running on Linux. It is set to python runtime.
Created dedicated App Service Plan and Application Insights resources for the function app.
Created a KeyVault and stored the API keys as a KV Secret.
Created a system assigned managed identity for the Azure Function, and created a new set of Keyvault policies to allow the Function access to 'get' KV secrets.
Linked the Keyvault and Function by adding the Keyvault identity(?) to the Function's application settings.
In the Function's application settings, set FUNCTION_APP_EDIT_MODE to 'readwrite'. Also set WEBSITE_RUN_FROM_PACKAGE to '0' (also tried this as '1' but to no avail).
For context, I am deploying the function itself using the Azure Functions extension in VS Code.
Any suggestions about how to allow writing a new CSV file would be appreciated. Alternatively, any suggestions for alternative methods to run this code in Azure would also be welcome.
Edit: Could the problem here actually be that the code is not outputting to the attached storage account? That the function is read-only might not be the true problem here...
My function.json file:
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "mytimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 4 * * *"
    },
    {
      "name": "outputFile",
      "direction": "out",
      "type": "file",
      "path": "output/csv",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
A snippet of my __init__.py file which attempts to write a CSV file and place it in storage:
with open("non_exempt_devices.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Host Name", "Metadata"])
    writer.writerows(output[0])
    outputFile.set(f)
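For anyone comparing notes: writing to a relative path fails on the read-only filesystem when the app runs from a package, and even where it succeeds, passing the file object itself to outputFile.set() is unlikely to upload the text. A common pattern is to build the CSV in memory and hand the resulting string to the output binding. A minimal sketch, assuming the binding above is switched to a blob output binding exposed as func.Out[str] with a concrete blob path; the entry-point signature and the sample rows are assumptions, not taken from the question:
import csv
import io

import azure.functions as func

def main(mytimer: func.TimerRequest, outputFile: func.Out[str]) -> None:
    # Hypothetical rows standing in for output[0] in the question.
    rows = [["host-1", "some metadata"], ["host-2", "other metadata"]]

    # Build the CSV in memory instead of on the read-only filesystem.
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    writer.writerow(["Host Name", "Metadata"])
    writer.writerows(rows)

    # Handing the text to the output binding writes it to the bound storage path.
    outputFile.set(buffer.getvalue())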
I'm new to Azure Functions, so bear with me.
I'm working on a microservice that will use a blob storage trigger and input/output bindings to process data and write to a database, but I am having trouble with the basic blob storage trigger function. For reference, I am developing in Visual Studio Code using Python with the V2 programming model, as described in the Azure documentation. I have installed the Azure Functions extension and the Azurite extension, and in my local.settings.json I have added the connection string to the value AzureWebJobsStorage and set the feature flag and secret storage type:
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=REDACTED;AccountName=REDACTED;AccountKey=REDACTED;EndpointSuffix=core.windows.net",
    "AzureWebJobsFeatureFlags": "EnableWorkerIndexing",
    "AzureWebJobsSecretStorageType": "files",
    "FUNCTIONS_EXTENSION_VERSION": "~4",
    "APPINSIGHTS_INSTRUMENTATIONKEY": "REDACTED",
    "APPLICATIONINSIGHTS_CONNECTION_STRING": "REDACTED",
    "user": "REDACTED",
    "host": "REDACTED",
    "password": "REDACTED",
    "dbname": "REDACTED",
    "driver": "REDACTED"
  }
}
I had our architect create the necessary resources for me (a gen2 storage account and a function app). Our company has security protocols in place, meaning network access is disabled for the storage account and a private endpoint is configured for it, but not for the function app, because the deployment process wouldn't work otherwise.
In my function_app.py I have this for the blob trigger.
@app.function_name(name="BlobTrigger1")
@app.blob_trigger(arg_name="myblob", path="samples-workitems/{name}",
                  connection="")
def test_function(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
host.json:
{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true,
        "excludedTypes": "Request"
      }
    }
  },
  "extensions": {
    "blobs": {
      "maxDegreeOfParallelism": 4
    }
  },
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[3.15.0, 4.0.0)"
  },
  "concurrency": {
    "dynamicConcurrencyEnabled": true,
    "snapshotPersistenceEnabled": true
  }
}
When I run the app locally using
Azurite: Start
func host start
it spits out the console output shown in the screenshot.
It is also worth noting that I have a util folder with simple scripts. They DO NOT FOLLOW THE BLUEPRINT SCHEMA. I don't know if this is the problem or if I can keep doing this, but the functions in them aren't meant to be Azure Functions, more like helper functions (see the sketch below).
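For what it's worth, only code registered through the FunctionApp (or a Blueprint) object gets indexed as a function; plain helper modules don't need to follow the blueprint schema at all. A sketch of what that layout implies (file and function names are hypothetical):
# util/helpers.py - plain Python, never registered with the Functions runtime
def describe_blob(name: str, length: int) -> str:
    return f"Name: {name}\nBlob Size: {length} bytes"

# function_app.py can then use it with a normal import:
# from util.helpers import describe_blob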
(Screenshots: storage account networking, container view, and the Azure Function in the cloud. I haven't deployed to this function because it didn't work.)
It is very frustrating because I don't know if the problem lies with the way the resource was configured, with the code I wrote, with the way I set this up, or with a simple issue in the settings in one of the JSON files.
I want to create an Azure Storage queue triggered Azure Function. I went through the following tutorial: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-queue-trigger
I basically want to trigger the function whenever a message is pushed into the queue, and push the result to another queue once the function is finished.
function.json
{
  "bindings": [
    {
      "name": "input",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "queue-trigger",
      "connection": ""
    },
    {
      "type": "queue",
      "direction": "out",
      "name": "output",
      "queueName": "queue-receiver",
      "connection": ""
    }
  ]
}
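For illustration, a handler matching these bindings might look like the following (the question doesn't show the script itself, so the language and the processing step are assumptions):
import azure.functions as func

def main(input: func.QueueMessage, output: func.Out[str]) -> None:
    # Parameter names must match the binding names from function.json.
    # Placeholder processing: echo the incoming message in upper case.
    message = input.get_body().decode("utf-8")
    # Whatever is set on the output binding is pushed to queue-receiver.
    output.set(message.upper())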
When I deployed the function, I got the following error in the logs in the Monitor view:
2022-09-10T12:16:53.412 [Error] The 'QueueTrigger' function is in error:
Microsoft.Azure.WebJobs.Host: Error indexing method 'Functions.QueueTrigger'.
Microsoft.Azure.WebJobs.Extensions.Storage.Queues: Storage account connection string 'AzureWebJobs<storage account name>_STORAGE' does not exist. Make sure that it is a defined App Setting.
You can see I have defined three application settings:
AzureWebJobs<storage account>_STORAGE
AzureWebJobsStorage
<storage account>_STORAGE
According to the documentation, if connection is empty in function.json, then AzureWebJobsStorage will be used. (Note the prefix in the error message: for a connection value of <name>, the runtime also probes the app setting AzureWebJobs<name>.)
I even tried setting "connection": "<storage account>_STORAGE", but that raised the same error.
I have reproduced this in my environment, and the workaround below worked for me.
I created a storage account, copied its connection string, and created a queue in it (see the screenshots).
I then pasted the connection string into the local.settings.json file in Visual Studio and published the project to Azure.
In the published function app, open Configuration under App settings. If an application setting with the connection name used by the bindings already exists, open it and paste in the connection string; if not, create a new application setting with that name and the connection string taken from the storage account.
Click OK, then Save.
Then send a message to the queue.
This workaround worked for me.
I have created some Azure Functions in a C# project that work fine locally. An example of the definition of a function is the following:
[FunctionName("createBankTransactionFromServiceBus")]
public async Task Run(
    [ServiceBusTrigger("vspan.sbus.xerobanktransaction.requests", "requests",
        Connection = "AccountingServiceBusConnection")] string myQueueItem)
{
}
Nothing different from usual. The problem is when I deploy this function to Azure: there, the function can't find the connection string. So I added a new entry to local.settings.json, but now I have two AccountingServiceBusConnection keys with the same value, one for my local machine and one for Azure.
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "AccountingServiceBusConnection": "connectionString"
  },
  "AccountingServiceBusConnection": "connectionString"
}
I tried to replace the connection in the signature of the function like:
[FunctionName("createBankTransactionFromServiceBus")]
public async Task Run(
    [ServiceBusTrigger("vspan.sbus.xerobanktransaction.requests", "requests",
        Connection = "%Values:AccountingServiceBusConnection%")] string myQueueItem)
{
}
but locally I have a warning (with or without %).
Warning: Cannot find value named
'Values:AccountingServiceBusConnection' in local.settings.json that
matches 'connection' property set on 'serviceBusTrigger' in
'C:\Projects\fun\bin\Debug\netcoreapp3.1\createBankTransactionFromServiceBus\function.json'.
You can run 'func azure functionapp fetch-app-settings
' or specify a connection string in
local.settings.json.
Also, I tried to move AccountingServiceBusConnection under ConnectionStrings with the same result.
Update
Screenshot of Kudu and local.settings.json
Screenshot of Azure Functions configuration
How can you configure a pipeline in DevOps? How do you store configuration from DevOps in your Azure Functions configuration?
There's no local.settings.json on Azure; you must add the settings to your function app's application settings:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-how-to-use-azure-function-app-settings
EDIT:
For Key Vault integration you must assign a managed identity to your function, then use a Key Vault reference as the setting value:
@Microsoft.KeyVault(SecretUri={theSecretUri})
More info:
https://medium.com/statuscode/getting-key-vault-secrets-in-azure-functions-37620fd20a0b
By default, the local.settings.json file is NOT deployed with your code to an Azure Function. See the documentation here:
By default, these settings are not migrated automatically when the
project is published to Azure. Use the --publish-local-settings switch
when you publish to make sure these settings are added to the function
app in Azure. Note that values in ConnectionStrings are never
published.
You have a few options:
Explicitly publish your local.settings.json file with the aforementioned command line arg.
Add this setting (and any other settings needed) to your Azure Function's Configuration. Those values defined in your app settings in the Azure Portal take precedence over everything else.
I don't recommend option #1, because it requires you to place production values in your source code, which is in general a bad idea.
Updated: how to configure with Azure DevOps
For Azure DevOps, we've taken a two-pronged approach.
We place the bare minimum key/value pairs in the Azure Function configuration. These are added in our YAML deployment pipeline. Some variable values (like connection strings) are read from other resources at deploy time so that sensitive info isn't included in the YAML script that is checked into revision control. Here's an example ARM template snippet (deployed from the pipeline) for an Azure Function:
{
  "apiVersion": "2016-03-01",
  "type": "Microsoft.Web/sites",
  "name": "FooBarFunction",
  "location": "[resourceGroup().location]",
  "kind": "functionapp",
  "dependsOn": [
    "[resourceId('Microsoft.Web/serverfarms', 'YourHostingPlanName')]"
  ],
  "properties": {
    "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', 'YourHostingPlanName')]",
    "siteConfig": {
      "appSettings": [
        {
          "name": "WEBSITE_CONTENTSHARE",
          "value": "FooBarFunctionContentShare"
        },
        {
          "name": "FUNCTIONS_WORKER_RUNTIME",
          "value": "dotnet"
        }
      ]
    }
  }
}
We use Azure App Configuration service to hold all of our other app settings. This gives us the advantage of defining different config profiles, and also having hot reload of our app settings without having to recycle the Azure Function. It also plays nicely with Keyvault for sensitive settings.
Is there a way to trigger an Azure Function without defining a concrete container?
I'm expecting that the function below will be triggered for any file in any container, with the name of the file and the container available in variables.
function.json:
{
  "bindings": [
    {
      "type": "blobTrigger",
      "name": "myBlob",
      "path": "{container}/{name}",
      "connection": "AzureWebJobsStorage",
      "direction": "in"
    }
  ],
  "disabled": false
}
run.csx:
public static void Run(Stream myBlob, string name, string container, TraceWriter log, out string outputSbMsg)
{
    log.Info("C# Blob trigger function processed blob");
    log.Info(name);
    log.Info(container);
    // An out parameter must be assigned before the method returns.
    outputSbMsg = null;
}
However, nothing is triggered. Any idea what is wrong?
Is there a way to trigger an Azure Function without defining a concrete container?
I assume that there is currently no way to trigger an Azure Function without a defined concrete container. According to the Azure Functions documentation, the blob trigger monitors a specific storage container:
The Azure Storage blob trigger lets you monitor a storage container for new and updated blobs and run your function code when changes are detected
Based on my experience, we need to create multiple Azure Functions, one per container, to monitor the blobs as a workaround.
Update:
As mathewec mentioned, this is an open issue; please refer to it for more details.