Timer-Trigger not showing in Azure

I am new to Azure so sorry if this does not make too much sense.
I created a function app in Azure that works when I test it. I am trying to use a Timer Trigger to make this app run each morning. The issue I have is that the Timer Trigger template is not available when I click Add. I am going to
Home > Function App > "Choose My App Here" > Events
But the Timer Trigger template does not show. Is it not possible to add a timer trigger to an existing function app?
I also tried
Home > Function App > "Choose My App Here" > Functions
but no luck here either.

First, only function apps that use a scripting language and run on Windows let you create functions directly in the Azure portal; Linux-hosted function apps do not.
Second, I don't recommend creating functions directly in the portal. That is fine for a quick test, but if you want to develop Azure Functions properly, I recommend developing in VS Code and publishing to the function app when you are done.
Developing an Azure Function with VS Code needs four things: Azure Functions Core Tools (I recommend v3), the Azure Functions extension for VS Code, the language runtime, and the related language debug extension.
You can follow this doc:
https://learn.microsoft.com/en-us/azure/azure-functions/create-first-function-vs-code-node
To deploy, use the Deploy to Function App button of the Azure Functions extension in VS Code, or use the Core Tools command:
func azure functionapp publish <FunctionAppName>
Or use zip deploy, FTP, or Git; any method is fine, as long as the local function app structure is uploaded to the function app's physical path.
The main structure of a timer trigger:
index.js
module.exports = async function (context, myTimer) {
    var timeStamp = new Date().toISOString();
    if (myTimer.isPastDue)
    {
        context.log('JavaScript is running late!');
    }
    context.log('JavaScript timer trigger function ran!', timeStamp);
};
function.json
{
  "bindings": [
    {
      "name": "myTimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "* * * * * *"
    }
  ]
}
This link explains the format of the schedule expression:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-timer?tabs=csharp#ncrontab-expressions
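Since you want the function to run each morning, note that the * * * * * * schedule above fires every second and is only useful for testing. A minimal sketch, assuming for example a 7:00 AM run every day (the six NCRONTAB fields are {second} {minute} {hour} {day} {month} {day-of-week}; the hour is just an example):
"schedule": "0 0 7 * * *"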
local.settings.json:
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=https;AccountName=0730bowmanwindow;AccountKey=xxxxxx;EndpointSuffix=core.windows.net",
    "FUNCTIONS_WORKER_RUNTIME": "node"
  }
}
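As a side note (my addition, not part of the original answer): for purely local testing you can point AzureWebJobsStorage at the local storage emulator (Azurite) instead of a real storage account:
"AzureWebJobsStorage": "UseDevelopmentStorage=true"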
host.json
{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true,
        "excludedTypes": "Request"
      }
    }
  },
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[1.*, 2.0.0)"
  }
}

Related

Azure Functions Blob Trigger not working locally with Python V2 model

I'm new to azure functions so bear with me.
I'm working on a microservice that will use a blob storage trigger and input/output bindings to process data and write to a database, but I am having trouble with the basic blob storage trigger function. For reference, I am developing in Visual Studio Code using Python with the V2 programming model, as described in the Azure documentation. I have installed the Azure Functions extension and the Azurite extension, and in my local.settings.json I have added the connection string to the AzureWebJobsStorage value and set the feature flag and secret storage type.
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsStorage": "DefaultEndpointsProtocol=REDACTED;AccountName=REDACTED;AccountKey=REDACTED;EndpointSuffix=core.windows.net",
    "AzureWebJobsFeatureFlags": "EnableWorkerIndexing",
    "AzureWebJobsSecretStorageType": "files",
    "FUNCTIONS_EXTENSION_VERSION": "~4",
    "APPINSIGHTS_INSTRUMENTATIONKEY": "REDACTED",
    "APPLICATIONINSIGHTS_CONNECTION_STRING": "REDACTED",
    "user": "REDACTED",
    "host": "REDACTED",
    "password": "REDACTED",
    "dbname": "REDACTED",
    "driver": "REDACTED"
  }
}
I had our architect create the necessary resources for me (a gen2 storage account and a function app). Our company has security protocols in place, so network access is disabled for the storage account and a private endpoint is configured, but not for the function app, because the deployment process wouldn't work that way.
In my function_app.py I have this for the blob trigger.
import logging
import azure.functions as func

app = func.FunctionApp()

@app.function_name(name="BlobTrigger1")
@app.blob_trigger(arg_name="myblob", path="samples-workitems/{name}",
                  connection="")
def test_function(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
host.json
{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true,
        "excludedTypes": "Request"
      }
    }
  },
  "extensions": {
    "blobs": {
      "maxDegreeOfParallelism": 4
    }
  },
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[3.15.0, 4.0.0)"
  },
  "concurrency": {
    "dynamicConcurrencyEnabled": true,
    "snapshotPersistenceEnabled": true
  }
}
When I run the app locally using
Azurite: Start
func host start
the output it spits out never shows the blob trigger firing.
It is also worth noting that I have a util folder with simple scripts. They DO NOT FOLLOW THE BLUEPRINT SCHEMA. I don't know if this is the problem or if I can keep doing this, but the functions in that folder aren't meant to be Azure Functions; they're more like helper functions.
Screenshots: storage account networking, container view, and the Azure Function in the cloud (I haven't deployed to this function because it didn't work).
It is very frustrating because I don't know whether the problem lies with the way the resource was configured, a mistake in the code I wrote, the way I set this up, or a simple issue with the settings in one of the JSON files.

Is it possible to Log data for a particular function in a Function app containing multiple functions?

I want to save cost on Log Analytics.
I have a function app with 4 functions in it. I want logging for only one particular function, not for all of them.

I created three Azure Functions in an Azure Function App in Visual Studio:
Then I used this in my host.json file:
{
  "version": "2.0",
  "logging": {
    "logLevel": {
      "Function.TimerTrigger1.User": "Information",
      "Function": "Error"
    }
  }
}
TimerTrigger1 is the function name.
After using this, I am able to see only the logs of the TimerTrigger1 function.
Then you can publish your function app to Azure, and from then on you will see only the logs of that one function.
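For reference, the Function.<FunctionName>.User category matches log statements written by your own code inside that function, while the Function category applies to all functions. A hedged variant of the same host.json (the function name is a placeholder) that also sets an explicit default level:
{
  "version": "2.0",
  "logging": {
    "logLevel": {
      "default": "Warning",
      "Function": "Error",
      "Function.TimerTrigger1.User": "Information"
    }
  }
}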

C# Azure Functions project with ServiceBusTrigger connection issue

I have created some Azure Functions in a C# project that are working fine locally. An example of the definition of a function is the following:
[FunctionName("createBankTransactionFromServiceBus")]
public async Task Run(
    [ServiceBusTrigger("vspan.sbus.xerobanktransaction.requests", "requests",
        Connection = "AccountingServiceBusConnection")] string myQueueItem)
{
}
Nothing out of the ordinary. The problem is when I deploy this function to Azure: there, the function can't find the connection string. So I added a new entry in local.settings.json, but now I have two AccountingServiceBusConnection entries with the same value, one for my local machine and one for Azure.
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "AccountingServiceBusConnection": "connectionString"
  },
  "AccountingServiceBusConnection": "connectionString"
}
I tried to replace the connection in the signature of the function like:
[FunctionName("createBankTransactionFromServiceBus")]
public async Task Run(
    [ServiceBusTrigger("vspan.sbus.xerobanktransaction.requests", "requests",
        Connection = "%Values:AccountingServiceBusConnection%")] string myQueueItem)
{
}
but locally I have a warning (with or without %).
Warning: Cannot find value named 'Values:AccountingServiceBusConnection' in local.settings.json that matches 'connection' property set on 'serviceBusTrigger' in 'C:\Projects\fun\bin\Debug\netcoreapp3.1\createBankTransactionFromServiceBus\function.json'. You can run 'func azure functionapp fetch-app-settings <functionAppName>' or specify a connection string in local.settings.json.
Also, I tried to move AccountingServiceBusConnection under ConnectionStrings with the same result.
Update
Screenshot of Kudu and local.settings.json
Screenshot of Azure Functions configuration
How can you configure a pipeline in DevOps? How do you get the configuration from DevOps into your Azure Functions configuration?
There's no local.settings.json on Azure; you must add the settings to your function app's application settings (App Service settings):
https://learn.microsoft.com/en-us/azure/azure-functions/functions-how-to-use-azure-function-app-settings
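For example, in the portal under Configuration > Application settings > Advanced edit, the entry would look roughly like this (a sketch; the value is a placeholder, not your real connection string):
[
  {
    "name": "AccountingServiceBusConnection",
    "value": "<your Service Bus connection string>",
    "slotSetting": false
  }
]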
EDIT:
For Key Vault integration you must assign a managed identity to your function.
Then use a Key Vault reference as the setting value:
@Microsoft.KeyVault(SecretUri={theSecretUri})
More info:
https://medium.com/statuscode/getting-key-vault-secrets-in-azure-functions-37620fd20a0b
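A concrete sketch, assuming the secret lives in a vault you control (the vault and secret names are placeholders), would be an application setting whose value is the Key Vault reference:
"AccountingServiceBusConnection": "@Microsoft.KeyVault(SecretUri=https://<your-vault>.vault.azure.net/secrets/AccountingServiceBusConnection/)"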
By default, the local.settings.json file is NOT deployed with your code to an Azure Function. See the documentation here:
By default, these settings are not migrated automatically when the
project is published to Azure. Use the --publish-local-settings switch
when you publish to make sure these settings are added to the function
app in Azure. Note that values in ConnectionStrings are never
published.
You have a few options:
Explicitly publish your local.settings.json file with the aforementioned command line arg.
Add this setting (and any other settings needed) to your Azure Function's Configuration. Those values defined in your app settings in the Azure Portal take precedence over everything else.
I don't recommend option #1, because it requires you to place production values in your source code, which is in general a bad idea.
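One point worth spelling out (my reading, not something stated explicitly above): the Connection property of the trigger holds the name of an app setting, not the connection string itself. The generated function.json therefore ends up with something like the sketch below, and the runtime looks up AccountingServiceBusConnection in the Azure app settings (or in the Values section of local.settings.json when running locally):
{
  "bindings": [
    {
      "type": "serviceBusTrigger",
      "connection": "AccountingServiceBusConnection",
      "topicName": "vspan.sbus.xerobanktransaction.requests",
      "subscriptionName": "requests",
      "name": "myQueueItem"
    }
  ]
}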
Updated - how to configure with Azure DevOps
For Azure DevOps, we've taken a two-pronged approach.
We place the bare minimum key/value pairs in the Azure Function configuration. These are added by our YAML deployment pipeline. Some variable values (like connection strings) are read from other resources at deploy time so that sensitive info isn't included in the YAML script that is checked into revision control. Here's an example ARM template snippet for deploying an Azure Function:
{
  "apiVersion": "2016-03-01",
  "type": "Microsoft.Web/sites",
  "name": "FooBarFunction",
  "location": "[resourceGroup().location]",
  "kind": "functionapp",
  "dependsOn": [
    "[resourceId('Microsoft.Web/serverfarms', 'YourHostingPlanName')]"
  ],
  "properties": {
    "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', 'YourHostingPlanName')]",
    "siteConfig": {
      "appSettings": [
        {
          "name": "WEBSITE_CONTENTSHARE",
          "value": "FooBarFunctionContentShare"
        },
        {
          "name": "FUNCTIONS_WORKER_RUNTIME",
          "value": "dotnet"
        }
      ]
    }
  }
}
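To tie this back to the question, the Service Bus connection setting could be added to that appSettings array as well; in this sketch the value comes from a hypothetical serviceBusConnectionString template parameter rather than being hard-coded:
{
  "name": "AccountingServiceBusConnection",
  "value": "[parameters('serviceBusConnectionString')]"
}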
We use the Azure App Configuration service to hold all of our other app settings. This gives us the advantage of defining different config profiles, and also of hot-reloading our app settings without having to recycle the Azure Function. It also plays nicely with Key Vault for sensitive settings.

Azure Function Blob trigger not responding to manual import

I am trying to get the Azure Function to trigger when a blob file is uploaded. The function was deployed from an Azure DevOps release.
Azure deploy steps (shows most relevant information):
Storage account:
- has a folder [blob folder] where I upload files to.
Azure function code:
public async static Task Run([BlobTrigger("[blob folder]/{name}", Connection = "AzureWebJobsStorage")] Stream myBlob, string name, ILogger log)
{
    // any breakpoint here is never hit.
}
function.json:
{
  "generatedBy": "Microsoft.NET.Sdk.Functions-1.0.24",
  "configurationSource": "attributes",
  "bindings": [
    {
      "type": "blobTrigger",
      "connection": "AzureWebJobsStorage",
      "path": "[blob folder]/{name}",
      "name": "myBlob"
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/[dllname].dll",
  "entryPoint": "[namespace].[function].[command]"
}
The storage account and the function are part of the same resource group. The function app settings contain the values AzureWebJobsDashboard and AzureWebJobsStorage; I read in this post that these should be available in the function settings. I also turned off the "Always On" setting for the function, and made sure the function is running.
Running and debugging the function locally (with the Azure storage emulator and storage explorer) works fine. The function is triggered after I upload a file to the blob folder.
In Azure, it seems like nothing happens. I'm not super familiar with the Azure environment, so any help is appreciated.
The issue was caused by the FUNCTIONS_EXTENSION_VERSION setting in the application settings; I needed to update it to the correct version.
I managed to get debugging running using Jason Roberts' blog post and debugged the trigger event of my function.
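For reference, FUNCTIONS_EXTENSION_VERSION is just another application setting on the function app. A hedged sketch of how it appears in the Configuration Advanced edit view (the ~3 value is only an example; the right value depends on the runtime your project targets):
[
  {
    "name": "FUNCTIONS_EXTENSION_VERSION",
    "value": "~3",
    "slotSetting": false
  }
]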

I can't see proxies for Azure Function App

I have created an Azure Function App in VS 2017.
I've included a proxies.json file and published the app.
However I can't see the proxies in the Azure portal.
All it says is:
Proxies (preview) (Read Only)
I have gone into the function settings, and there are no settings to enable proxies. I think that was the old method (not for developing the function in Visual Studio).
proxies.json looks like this:
{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "user": {
      "matchCondition": {
        "methods": [ "GET" ],
        "route": "/user/{user}"
      },
      "backendUri": "https://<mycontainer>.blob.core.windows.net/html/test.html/{user}"
    }
  }
}
OK, I figured it out: the proxies.json file should have Copy to Output Directory set to "Copy always" in the Properties panel in VS.
