Azure TimerTrigger Multiple instances with different configuration - python-3.x

Using VS Code + the "Azure Functions" extension, I generated the default Python 3.7 timerTrigger function with the following settings:
// function.json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "mytimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 */6 * * *"
    }
  ]
}
I have also set up two environment variables, "USER" and "PASSWORD", in the Configuration of the App Service (shown here as they appear in local.settings.json):
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "****************",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "USER": "********",
    "PASSWORD": "*********"
  }
}
Goal:
I want to run two instances of the same function, but with two different configurations, i.e. two different user/password pairs.
Problem:
I believe the Configuration/App Settings alone might not be sufficient for this. I can't find a way to run the function twice with different parameters.
Question: What options do I have to reach my goal? One idea I had was to put the user/password into function.json, but I could not figure out how to access that information from within the function.

You have two options:
Read a custom JSON file (not necessarily function.json itself): you can add a custom JSON file to the function app, read the values you need from it according to the file's structure, and then use those values in the trigger (see the sketch after this answer).
Use deployment slots. (This is the official method, and I think it fully covers your current needs.)
In a newly created slot you can use completely different environment variables in the Configuration settings.
This is the doc:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-deployment-slots
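A rough sketch of the first option, assuming a custom file named config.json (a made-up name) is deployed next to __init__.py and holds the per-instance values:
import json
import pathlib

import azure.functions as func

# config.json sits in the same folder as __init__.py, e.g.
# {"user": "********", "password": "*********"}
CONFIG_FILE = pathlib.Path(__file__).parent / "config.json"


def main(mytimer: func.TimerRequest) -> None:
    config = json.loads(CONFIG_FILE.read_text())
    user = config["user"]
    password = config["password"]
    # ... use the values in the trigger logic ...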

I'd probably do it by having a single setting that holds a JSON array, viz
"Credentials": "[{'username':'***','password':'***'},{'username':'******','password':'******'}]"
Then, assuming you want to process them all at the same time, make a single function that parses the array and iterates over each username and password.
If you need to run them on different schedules, create a shared Python function DoTheThing(credentialIndex) that actually does the work and then multiple Azure Functions that simply call DoTheThing(0), DoTheThing(1), ...
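A minimal sketch of that approach, assuming the app setting is named Credentials as above; do_the_thing stands in for the actual work:
import json
import os

import azure.functions as func


def do_the_thing(username: str, password: str) -> None:
    ...  # the actual work for one credential pair goes here


def main(mytimer: func.TimerRequest) -> None:
    # "Credentials" is the app setting holding the JSON array shown above.
    for entry in json.loads(os.environ["Credentials"]):
        do_the_thing(entry["username"], entry["password"])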
(Security note: not immediately relevant to the problem at hand, but secrets are best kept in a secret store such as Key Vault rather than directly in the settings)

EDIT/SOLUTION:
I ended up with the following keys in my environment variables:
"USERS": "[\"UserA\", \"UserB\"]"
"UserA_USER": "Username1"
"UserA_PW": "Password1"
"UserB_USER": "Username2"
"UserB_PW": "Password2"
Then I iterated over the USERS array and retrieved the keys for each user like so:
import os
import json

users = json.loads(os.environ["USERS"])
for u in users:
    user = os.environ[u + "_USER"]
    pw = os.environ[u + "_PW"]
    doStuff(user, pw)  # doStuff does the actual work for one user

Related

Returning identity value using Azure Functions SQL extension

I am working on a simple REST API using the SQL extension for Azure Functions as described here.
Now I have a table in my Azure SQL database with a primary-key identity column. I am using an output binding in an HTTP-triggered Azure Function to insert a row into the table:
...
{
  "name": "aRow",
  "type": "sql",
  "direction": "out",
  "commandText": "[dbo].[MYTABLE]",
  "commandType": "Text",
  "connectionStringSetting": "..."
},
...
This is working perfectly and the identity column is being incremented as expected. Now, naturally, I want to return the identity ID of the inserted row to the client. I don't think this feature is built in at the moment, therefore I was hoping that someone out there has solved this in a neat way or can at least point me in a direction.
I have tried adding an input binding to the above function that queries the row where ID = @@IDENTITY, but it seems like the bindings aren't executed in the same scope or in the correct order. Either way, there is no match for this query. That would've been a neat solution though.
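One possible direction (just a sketch, and outside the SQL binding): perform this particular insert through a driver such as pyodbc and read the generated key back with an OUTPUT clause in the same call. The connection-setting, table, and column names below are made up:
import os

import pyodbc  # assumes pyodbc and the ODBC driver are available to the function app


def insert_and_return_id(name: str) -> int:
    # Connection string stored in an app setting (hypothetical name).
    with pyodbc.connect(os.environ["SqlConnectionString"]) as conn:
        cursor = conn.cursor()
        # OUTPUT INSERTED.Id returns the identity value of the new row with the insert.
        cursor.execute(
            "INSERT INTO dbo.MYTABLE (Name) OUTPUT INSERTED.Id VALUES (?)",
            name,
        )
        return int(cursor.fetchone()[0])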

Azure Data Factory not interpreting well an array global parameter

We have an Azure Data Factory using Global Parameters. It's working fine in our Dev environment, but when we try to deploy it to the QA environment using an Azure DevOps pipeline, it seems it's not understanding the only Global Parameter with type = array, even though all of the other parameters are good.
This is the guide we're using to build the CI/CD pipelines.
We have something similar to this in the Global Parameters JSON file:
{
  "FilesToProcess": {
    "type": "array",
    "value": [
      "VALUE01",
      "VALUE02",
      "VALUE03",
      "VALUE04",
      "VALUE05",
      "VALUE06",
      "VALUE07",
      "VALUE08",
      "VALUE09",
      "VALUE10",
      "VALUE11",
      "VALUE12",
      "VALUE13",
      "VALUE14",
      "VALUE15",
      "VALUE16",
      "VALUE17",
      "VALUE18",
      "VALUE19",
      "VALUE20",
      "VALUE21",
      "VALUE22",
      "VALUE23",
      "VALUE24",
      "VALUE25",
      "VALUE26",
      "VALUE27"
    ]
  },
  "EmailLogicAppUrl": {
    "type": "string",
    "value": "URL"
  }
}
All of the parameters are deployed fine, except for the array one, which fails with an error.
We have debugged the PS script that updates the Global Parameters, and it seems to handle the array correctly, so it has to be something else.
Any help will be highly appreciated.
Thanks!

Azure ARM Template - conditional input parameter

In Short:
Is it possible to have a condition field (or something else that achieves this functionality) inside a parameter, so that the user is prompted for this parameter only based on another parameter's value?
i.e., only if the user selects Create New on one parameter should the input for the corresponding name parameter be requested.
In Detail:
I have a parameter virtualNetworkCreateNewOrUseExisting which will accept two values - Create New and Use Existing.
"parameters": {
"virtualNetworkCreateNewOrUseExisting": {
"type": "string",
"defaultValue": "Create New",
"allowedValues": [
"Create New",
"Use Existing"
]
},
// Other parameters
}
I am trying to create a Virtual Network based on input from the user.
If the user selects Create New, it will be created; if they select Use Existing, it will be skipped. I see that this is achievable by using a condition field inside the resource.
"resources": [
{
"condition": "[equals(parameters('virtualNetworkCreateNewOrUseExisting'), 'Create New')]",
// Other fields
}
// Other resources
]
Now, my question is:
similar to the above, is it possible to have a condition field inside another parameter, so that the user is prompted for that parameter only based on the previous parameter's value?
i.e., only if the user selects Create New should the input for the Virtual Network name be requested.
Something like this, by using a condition field:
"parameters": {
"virtualNetworksName": {
"condition": "[equals(parameters('virtualNetworkCreateNewOrUseExisting'), 'Create New')]",
"defaultValue": "vn-1",
"type": "string"
},
// Other parameters
}
But I see that the condition field is not supported inside parameters (at least as of now).
Or is it at least possible to have if statements like this (so that the value of the field defaults to empty if the user selects Use Existing)?
"parameters": {
"virtualNetworksName": {
"defaultValue": "[if(equals(parameters('virtualNetworkCreateNewOrUseExisting'),'Create New'), 'vn-1', '')]",
"type": "string"
},
// Other parameters
}
Or, any other way to achieve this goal?
You can't do this natively in the template file, but in the portal user experiences for deploying the template you can... See:
https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/template-specs-create-portal-forms
and
https://learn.microsoft.com/en-us/azure/azure-resource-manager/managed-applications/create-uidefinition-overview
You don't have to use a templateSpec or managedApp (see the "Deploy to Azure" button here: https://github.com/Azure/azure-quickstart-templates/tree/master/demos/100-marketplace-sample),
but templateSpec or managedApp will give you a better experience if that's an option.
As per the current Azure documentation, we cannot add conditions to parameters in the parameters block.
As a side note, you can use PowerShell inline functions for parameters while deploying the template.
We see that there is already a feature request in place, "Support functions within the definition of parameters..." on Community (azure.com); we would suggest that you comment on and upvote the existing feedback request.

Azure core tools - how to specify a different key name when fetching connection string

Is there a way to tell the tool what field name in my local.settings.json file I want it to write when fetching a connection string? Specifically, when I run this command:
func azure storage fetch-connection-string $STORAGE_ACCOUNT_NAME
by default, it grabs the name of my storage account (mystorage123) and appends "_STORAGE" to it. So you end up with something like this:
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "mystorage123_STORAGE": "DefaultEndpointsProtocol=https;AccountName=mystorage123;AccountKey=<key>"
  },
  "ConnectionStrings": {}
}
I would like it to create a JSON field/key that simply matches the actual account name, in this case "mystorage123".
Is there a way to do this?
I know I can write PowerShell code to update my JSON file manually, but it'd be better if I could just tell the command what to call the field.
Thanks.
I don't think there is such a way. The setting name is spliced together by the tool itself (the account name plus "_STORAGE").
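If renaming the key afterwards is acceptable, the clean-up doesn't have to be PowerShell; here is a small sketch in Python that moves the auto-generated "<account>_STORAGE" entry to a key named after the account (file path and account name as in the question):
import json
import pathlib

ACCOUNT = "mystorage123"
SETTINGS = pathlib.Path("local.settings.json")

data = json.loads(SETTINGS.read_text())
values = data["Values"]
# Move the value from "<account>_STORAGE" to a key matching the account name.
values[ACCOUNT] = values.pop(f"{ACCOUNT}_STORAGE")
SETTINGS.write_text(json.dumps(data, indent=2))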

Is it possible to create a CompositeIndex of CosmosDB with ARM template

I can find instructions for using ARM templates to create or make changes to Cosmos DB, but none of them explain how to add a CompositeIndex in the template. I have also heard it is not supported in the template and has to be done with a PowerShell or Azure CLI script, but I have not been able to find any supporting content on the net. Can someone please shed light on this?
I've not tested this but according to the Microsoft.DocumentDB resource provider docs / template reference there is a Microsoft.DocumentDB/databaseAccounts/apis/databases/containers resource which may give you what you need.
Every container has an IndexingPolicy in the template schema, which has an array of IncludedPath objects which themselves have an array of Index objects as follows:
"includedPaths": [
{
"path": "string",
"indexes": [
{
"dataType": "string",
"precision": "integer",
"kind": "string"
}
]
}
]
It's treated as a separate resource from the database / account altogether. You may want to add this resource to your template with an appropriate dependsOn value to ensure it's deployed after your database.
You can add multiple paths, thereby making a composite index.
Full schema is here:
https://learn.microsoft.com/en-us/azure/templates/microsoft.documentdb/2015-04-08/databaseaccounts/apis/databases/containers
If this doesn't do it, you may want to look at this too as the schema docs may be out of date and compositeIndexes may be supported:
https://learn.microsoft.com/en-us/azure/cosmos-db/how-to-manage-indexing-policy#composite-indexing-policy-examples
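Outside ARM, a composite index can also be defined when creating the container from code; a rough sketch with the azure-cosmos Python SDK (account, database, container, and index paths below are made up):
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient("https://myaccount.documents.azure.com:443/", credential="<key>")
database = client.get_database_client("mydb")

# Indexing policy with a composite index on /name (ascending) and /age (descending).
indexing_policy = {
    "indexingMode": "consistent",
    "includedPaths": [{"path": "/*"}],
    "compositeIndexes": [
        [
            {"path": "/name", "order": "ascending"},
            {"path": "/age", "order": "descending"},
        ]
    ],
}

database.create_container_if_not_exists(
    id="mycontainer",
    partition_key=PartitionKey(path="/id"),
    indexing_policy=indexing_policy,
)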
