In a given Azure Function, I can have one or more output bindings. For example, I might have a blob storage output (writing a blob to storage) and a queue output (pushing a message onto a queue).
For example, if I have this very simple Azure function (written in Node.js)...
module.exports = function (context, req) {
    context.log('START: Multi-output function.');
    context.bindings.outputBlob = "blob-contents";
    context.bindings.outputQueueItem = "{'message': 'hello!'}";
    context.done();
};
... with the output bindings set up in function.json as follows...
{
"bindings": [
{
"authLevel": "function",
"type": "httpTrigger",
"direction": "in",
"name": "req"
},
{
"type": "http",
"direction": "out",
"name": "res"
},
{
"type": "blob",
"name": "outputBlob",
"path": "outcontainer/{rand-guid}",
"connection": "AzureWebJobsDashboard",
"direction": "out"
},
{
"type": "queue",
"name": "outputQueueItem",
"queueName": "outqueue",
"connection": "AzureWebJobsDashboard",
"direction": "out"
}
],
"disabled": false
}
... when do the two output bindings actually fire, and in which order?
For the when part of the question:
Do they fire at the point where the function sets the output binding? (e.g. the line of code that sets context.bindings.outputBlob)
Do they fire at/after context.done()?
For the order part of the question:
Do they fire in the order they're seen in the code?
Do they fire in the order they're seen in function.json?
Output bindings fire after the function execution completes, i.e. after context.done() is called.
The order in which you set them in your code has no influence on when the bindings execute.
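For example, in this sketch neither assignment sends anything at the line where it runs; both outputs are flushed together once the function signals completion (with an async function, completion is the returned promise resolving rather than the context.done() call):

module.exports = function (context, req) {
    // Nothing is written to the queue here...
    context.bindings.outputQueueItem = "first";
    // ...and nothing is written to blob storage here...
    context.bindings.outputBlob = "second";
    // ...both bindings execute only after this call marks the function complete.
    context.done();
};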
If you can, treat the actual execution order as an implementation detail and do not rely on it. Having said that, if I'm not mistaken, the actual order will be:
Execute all non-queue bindings in order of function.json
Then, execute all queue bindings in order of function.json
UPDATE: based on this issue and this issue, I conclude that the order is not guaranteed at the moment.
Building on an earlier question: the following code is an HTTP trigger that listens to edits and updates on a GIS layer and writes the URL payload to a queue. I don't want the payload queued; I want a specific, repeated message that is overwritten every time, because I don't want to dequeue every now and then. How can I go about this?
import logging
import azure.functions as func
def main(req: func.HttpRequest, msg: func.Out[str]) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')
    input_msg = req.params.get('message')
    logging.info(input_msg)
    msg.set(req.get_body())
    return func.HttpResponse(
        "This is a test.",
        status_code=200
    )
function.json
{
"scriptFile": "__init__.py",
"bindings": [
{
"authLevel": "anonymous",
"type": "httpTrigger",
"direction": "in",
"name": "req",
"methods": [
"get",
"post"
]
},
{
"type": "http",
"direction": "out",
"name": "$return"
},
{
"type": "queue",
"direction": "out",
"name": "msg",
"queueName": "outqueue1",
"connection": "AzureStorageQueuesConnectionString"
}
]
}
I do not want the payload queued but a specific repetitive message, so that it is overwritten every time, because I do not want to dequeue every now and then.
No. When you put in the same message, it will not be overwritten; it is simply queued as another message in the queue storage.
If you want to process the messages in the queue, use QueueClient directly, or use the queue trigger of Azure Functions (the function queue trigger is built on QueueClient; they are basically the same).
This is the API reference of the queue client:
https://learn.microsoft.com/en-us/python/api/azure-storage-queue/azure.storage.queue?view=azure-python
You can use it to process messages in the queue with Python code.
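If the goal is just to keep the queue from growing, you can periodically drain it with the queue client. A minimal sketch (shown in Node.js with @azure/storage-queue to match this document's other examples; the Python package linked above exposes the same operations, and the connection string setting and queue name below are this sketch's assumptions taken from the question):

const { QueueClient } = require('@azure/storage-queue');

async function drainQueue() {
    // Assumed connection string setting and queue name from the question.
    const queueClient = new QueueClient(process.env.AzureStorageQueuesConnectionString, 'outqueue1');
    const response = await queueClient.receiveMessages({ numberOfMessages: 32 });
    for (const msg of response.receivedMessageItems) {
        // Process msg.messageText, then remove the message from the queue.
        await queueClient.deleteMessage(msg.messageId, msg.popReceipt);
    }
}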
And this is the queue trigger of Azure Functions (this is already integrated and can be used directly):
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-queue-trigger?tabs=python
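A minimal queue trigger for the queue in the question could look like this (a sketch; the handler is shown in Node.js for consistency with the other examples here):

function.json:

{
  "bindings": [
    {
      "type": "queueTrigger",
      "direction": "in",
      "name": "queueItem",
      "queueName": "outqueue1",
      "connection": "AzureStorageQueuesConnectionString"
    }
  ]
}

index.js:

// Runs once per message; the runtime dequeues the message and deletes it on success.
module.exports = async function (context, queueItem) {
    context.log('Queue item received:', queueItem);
};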
I have a container where there are many sub-folders (around 3,000), and a file can land in any of them. I need to react to a blob being added to a sub-folder, but I can't figure out how to create a blob trigger that fires when files are added to sub-folders.
Example:
Excerpt from function.json:
{
"name": "myblob",
"type": "blobTrigger",
"direction": "in",
"path": "rootContainer/{name}"
}
OK: the function is triggered if I receive a blob in the rootContainer folder.
Excerpt from function.json:
{
"name": "*/myblob",
"type": "blobTrigger",
"direction": "in",
"path": "rootContainer/{name}"
}
or
{
"name": "myblob",
"type": "blobTrigger",
"direction": "in",
"path": "rootContainer/*/{name}"
}
NOT OK: the function isn't triggered in either case.
There aren't many questions about this problem, and the existing ones don't provide a proper answer. I can't find any info in the documentation either.
Thanks!
I notice you use */myblob as the name, but this has no effect: name is only the parameter name your code receives, and wildcards are not supported in the trigger path.
For example, if you want the function to be triggered when something is sent to a folder such as test under rootContainer, you need to use this function.json:
{
"bindings": [
{
"name": "myBlob",
"type": "blobTrigger",
"direction": "in",
"path": "rootContainer/test/{name}",
"connection": "AzureWebJobsStorage"
}
]
}
The path needs to be fixed when the function is defined; it cannot contain wildcards.
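For completeness, a matching handler might look like this (a sketch in Node.js; with the binding above, the blob content arrives as a Buffer and the value matched by {name} is available on context.bindingData):

module.exports = async function (context, myBlob) {
    // myBlob is a Buffer holding the blob contents.
    context.log(`Blob "${context.bindingData.name}" received, ${myBlob.length} bytes`);
};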
I started tinkering with Azure SignalR and ran into a problem with the negotiate trigger.
I followed this official Microsoft guide:
Here's my code:
local.settings.json
{
"IsEncrypted": false,
"Values": {
"AzureSignalRConnectionString": "Endpoint=https://my.service.signalr.net;AccessKey=myKey=;Version=1.0;",
"FUNCTIONS_WORKER_RUNTIME": "node"
},
"Host": {
"LocalHttpPort": 7071,
"CORS": "*",
"CORSCredentials": true
}
}
function.json
{
"disabled": false,
"bindings": [
{
"authLevel": "anonymous",
"type": "httpTrigger",
"direction": "in",
"methods": [
"get"
],
"name": "req",
"route": "negotiate"
},
{
"type": "http",
"direction": "out",
"name": "res"
},
{
"type": "SignalRConnectionInfo",
"name": "connectionInfo",
"hubName": "jitsi",
"ConnectionStringSetting": "Endpoint=https://my.service.signalr.net;AccessKey=myKey;Version=1.0;",
"direction": "in"
}
]
}
index.js
module.exports = async function (context, req, connectionInfo) {
    context.res.body = connectionInfo;
};
It works fine locally (unfortunately that's where the guide ends). But if I visit the URL of the deployed negotiate HTTP trigger, I get "Internal Server Error 500". The logs contain the following output:
2020-04-23T08:47:32 Welcome, you are now connected to log-streaming service. The default timeout is 2 hours. Change the timeout with the App Setting SCM_LOGSTREAM_TIMEOUT (in seconds).
2020-04-23T08:47:52.070 [Information] Executing 'Functions.jitsiNegotiate' (Reason='This function was programmatically called via the host APIs.', Id=2b791d95-3775-47bb-ade1-ac9005929f61)
2020-04-23T08:47:52.238 [Error] Executed 'Functions.jitsiNegotiate' (Failed, Id=2b791d95-3775-47bb-ade1-ac9005929f61)
Unable to resolve the value for property 'SignalRConnectionInfoAttribute.ConnectionStringSetting'. Make sure the setting exists and has a valid value.
As you can see in my code, I did provide the ConnectionStringSetting.
Some people suggested it's due to a lower/upper case 'C' in ConnectionStringSetting.
Others said to edit local.settings.json.
None of that had any effect for me, and I can't find any useful information on the issue.
EDIT 1:
I set "hubName": "jitsi", with jitsi being the name of my SignalR service, as in 'jitsi.service.signalr.net'. I'm not sure whether that's correct. Perhaps that's part of the issue?
EDIT 2:
I tried with no value set for ConnectionStringSetting (so that it falls back to the default).
That gave me the same error. I also completely deleted the contents of local.settings.json and then re-deployed to see what would happen.
Same behaviour as before.
My guess is that the service only uses the file for local execution (hence the name).
So with local.settings.json empty, there is nowhere else I have defined the value for AzureSignalRConnectionString.
I did some digging, and apparently (according to this thread) you should define it under
'Configuration' -> 'Application Settings'.
So I created a new setting with
name: Azure__SignalR__ConnectionString
value: myMaskedConnectionString
Which resulted in the following error:
The SignalR Service connection string must be set either via an 'AzureSignalRConnectionString' app setting, via an 'AzureSignalRConnectionString' environment variable, or directly in code via SignalROptions.ConnectionString or SignalRConnectionInfoAttribute.ConnectionStringSetting.
I found a resolution to this issue:
I was confused at first and thought local.settings.json would serve as configuration for the live/non-local version of the function. That's not the case: it is only used for local execution (as the file name suggests).
So the question remained: where/how can I edit the required settings in the Azure portal?
Answer: Home -> All Services -> Function App -> MyFunctionApp -> Platform Features -> Configuration -> Application Settings -> Create New Application Setting
name: AzureSignalRConnectionString
value: MyMaskedConnectionString
Then reference it in function.json like this:
{
"disabled": false,
"bindings": [
{
"authLevel": "anonymous",
"type": "httpTrigger",
"direction": "in",
"methods": [
"get"
],
"name": "req",
"route": "negotiate"
},
{
"type": "http",
"direction": "out",
"name": "res"
},
{
"type": "SignalRConnectionInfo",
"name": "connectionInfo",
"hubName": "jitsi",
"direction": "in",
"connectionStringSetting": "AzureSignalRConnectionString"
}
]
}
With those settings it's working for me now.
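For reference, the same application setting can also be created from the Azure CLI (the function app and resource group names below are placeholders):

az functionapp config appsettings set \
  --name MyFunctionApp \
  --resource-group MyResourceGroup \
  --settings "AzureSignalRConnectionString=<MyMaskedConnectionString>"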
The goal is to receive a multipart form (containing a text file) as a stream in an Azure HTTP trigger and pipe it to Azure Blob Storage. While processing, if the file exceeds SIZE_LIMIT (20 megabytes), uploading should be aborted.
I tried to set up function.json like this:
{
"bindings": [
{
"authLevel": "anonymous",
"type": "httpTrigger",
"direction": "in",
"name": "req",
"methods": ["post"],
"dataType": "stream",
"route": "myroute"
},
{
"type": "http",
"direction": "out",
"name": "res"
}
],
"disabled": false
}
But when debugging, I see that the variable req.body is of type Buffer.
What am I doing wrong? Is it even possible to receive a stream in Azure Functions?
For now, it's by design that Node.js (non-C#) functions read the incoming content into a Buffer.
Here's the thread tracking stream support, but it doesn't seem to be in progress. We may have to operate on the buffer (convert it to a stream and so on) based on our requirements.
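As a sketch of that workaround (the 413 response and the SIZE_LIMIT constant are this example's assumptions; note that because the host has already buffered the whole body, the size check happens after the upload is received, so a true mid-upload abort isn't possible with this binding):

const { Readable } = require('stream');
const SIZE_LIMIT = 20 * 1024 * 1024; // 20 megabytes, per the question

module.exports = async function (context, req) {
    // req.body is a Buffer (see above), so its size is already known.
    if (req.body.length > SIZE_LIMIT) {
        context.res = { status: 413, body: 'File exceeds the 20 MB limit.' };
        return;
    }
    // Wrap the Buffer in a Readable stream to pipe it onward, e.g. to blob storage.
    const stream = Readable.from(req.body);
    // ... pipe `stream` to Azure Blob Storage here ...
    context.res = { status: 200, body: 'Upload accepted.' };
};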
I have the following function definition.
Message type:
type MailboxItem = {
    CustomerID: int
    AssetID: int
}
Code:
let Run(item: MailboxItem, userNames: string, log: TraceWriter) =
    log.Verbose("F# function executing for " + item.AssetID.ToString())
And function.json:
{
"bindings": [
{
"type": "eventHubTrigger",
"name": "item",
"direction": "in",
"path": "eventhubpath",
"connection": <connection>,
"consumerGroup": "$Default"
},
{
"type": "blob",
"name": "userNames",
"path": "blobpath/{CustomerID}-{AssetID}",
"connection": <connection>,
"direction": "in"
}
],
"disabled": false
}
As you can see, I'm using properties of the incoming message to bind an input blob from Blob Storage.
Now I need to extend my function to access some metadata of the incoming message via the EventData class (e.g. the sequence number). Is it possible to add an EventData parameter but also keep the binding to the properties of the message body?
No, not currently, unfortunately, though this is a common ask and something we're tracking in our repo here and will hopefully get to soon. Until we do, it's an either/or: you can bind to EventData or to your custom POCO.