I have a container with many sub-folders (around 3,000), and a file can land in any of them. I need to react when a blob is added to a sub-folder, but I can't figure out how to create a blob trigger that fires for files added to sub-folders.
Example:
Excerpt from function.json:
{
  "name": "myblob",
  "type": "blobTrigger",
  "direction": "in",
  "path": "rootContainer/{name}"
}
OK: the function is triggered if the blob lands directly in the rootContainer folder.
Excerpt from function.json:
{
  "name": "*/myblob",
  "type": "blobTrigger",
  "direction": "in",
  "path": "rootContainer/{name}"
}
or
{
  "name": "myblob",
  "type": "blobTrigger",
  "direction": "in",
  "path": "rootContainer/*/{name}"
}
NOT OK: the function isn't triggered.
There are few questions about this problem, and none of them provide a proper answer. I can't find anything in the documentation either.
Thanks!
I notice you use */myblob as the name, but that won't help: the name property is only the variable name your code uses to refer to the blob, not a path filter.
For example, if you want the function to be triggered when something is sent to a folder such as test under rootContainer, you need to use this function.json:
{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "rootContainer/test/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
The path needs to be defined at compile time.
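If the files always land exactly one level below the container, a path with a second binding expression may be an option. This is a sketch I haven't verified at the scale of 3,000 sub-folders; {folder} and {name} are just binding parameter names, and the connection setting is assumed to be AzureWebJobsStorage:
{
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "rootContainer/{folder}/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}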
I have a csv file with 'n' records stored in blob storage. I want to read new records added to the csv file, process them, and store the result in another container in blob storage. I want to achieve this flow using Python Azure Functions, but I'm unable to write the code for the inbound and outbound bindings.
Please help. Thanks!
Below is the code that worked for me. I'm using an HTTP trigger to achieve your requirement.
function.json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "blob",
      "direction": "in",
      "name": "inputblob",
      "path": "input/input.csv",
      "connection": "storageacc_STORAGE"
    },
    {
      "type": "blob",
      "direction": "out",
      "name": "outputblob",
      "path": "output/output.csv",
      "connection": "storageacc_STORAGE"
    }
  ]
}
__init__.py
import csv
import logging

import azure.functions as func


def main(req: func.HttpRequest, inputblob: func.InputStream, outputblob: func.Out[bytes]) -> func.HttpResponse:
    # Read the whole blob from the input binding
    input_file = inputblob.read()

    # Process the csv contents here as needed

    # Write the (processed) bytes to the output binding
    outputblob.set(input_file)

    return func.HttpResponse("Copied input.csv to output.csv")
local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "storageacc_STORAGE": "<Your_Connection_String>"
  }
}
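The processing step is left as a comment in __init__.py above. A minimal sketch of what it could look like, assuming the blob is UTF-8 CSV and using only the standard library (the row transform is a placeholder):
import csv
import io

# Parse the bytes read from the input binding
rows = list(csv.reader(io.StringIO(input_file.decode("utf-8"))))

# ... filter or transform rows here ...

# Re-serialize and hand the result to the output binding
out = io.StringIO()
csv.writer(out).writerows(rows)
outputblob.set(out.getvalue().encode("utf-8"))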
I started tinkering with Azure SignalR and ran into a problem with the negotiate trigger.
I followed this official Microsoft guide:
Here's my code:
local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "AzureSignalRConnectionString": "Endpoint=https://my.service.signalr.net;AccessKey=myKey=;Version=1.0;",
    "FUNCTIONS_WORKER_RUNTIME": "node"
  },
  "Host": {
    "LocalHttpPort": 7071,
    "CORS": "*",
    "CORSCredentials": true
  }
}
function.json
{
  "disabled": false,
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "methods": [
        "get"
      ],
      "name": "req",
      "route": "negotiate"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "SignalRConnectionInfo",
      "name": "connectionInfo",
      "hubName": "jitsi",
      "ConnectionStringSetting": "Endpoint=https://my.service.signalr.net;AccessKey=myKey;Version=1.0;",
      "direction": "in"
    }
  ]
}
index.js
module.exports = async function (context, req, connectionInfo) {
    context.res.body = connectionInfo;
};
It works fine locally (unfortunately, that's where the guide ends). But if I visit the URL of the deployed negotiate HTTP trigger, I get "Internal Server Error 500". The logs contain the following output:
2020-04-23T08:47:32 Welcome, you are now connected to log-streaming service. The default timeout is 2 hours. Change the timeout with the App Setting SCM_LOGSTREAM_TIMEOUT (in seconds).
2020-04-23T08:47:52.070 [Information] Executing 'Functions.jitsiNegotiate' (Reason='This function was programmatically called via the host APIs.', Id=2b791d95-3775-47bb-ade1-ac9005929f61)
2020-04-23T08:47:52.238 [Error] Executed 'Functions.jitsiNegotiate' (Failed, Id=2b791d95-3775-47bb-ade1-ac9005929f61)
Unable to resolve the value for property 'SignalRConnectionInfoAttribute.ConnectionStringSetting'. Make sure the setting exists and has a valid value.
As you can see in my code I did provide the ConnectionStringSetting.
Some people suggested it's due to the lower/upper-case 'C' in connectionStringSetting.
Others said to edit local.settings.json.
None of that had any effect for me, and I can't find any useful information on the issue.
EDIT 1:
I set "hubName":"jitsi". With jitsi being the name of my SignalR Service.
As in 'jitsi.service.signalr.net'. I'm not sure if that's correct or not.
Perhaps thats part of the issue?
EDIT 2:
I tried with no value set for connectionStringSetting (so that it falls back to the default). That gave me the same error. I also completely deleted the contents of local.settings.json and re-deployed to see what would happen.
Same behaviour as before.
My guess is the service only uses that file for local execution (hence the name), so with local.settings.json empty there is nowhere else I have defined the value for AzureSignalRConnectionString.
I did some digging and apparently (according to this thread) you should define it under
'Configuration' -> 'Application Settings'
So I created a new setting with
name: Azure__SignalR__ConnectionString
value: myMaskedConnectionString
Which resulted in the following error:
The SignalR Service connection string must be set either via an 'AzureSignalRConnectionString' app setting, via an 'AzureSignalRConnectionString' environment variable, or directly in code via SignalROptions.ConnectionString or SignalRConnectionInfoAttribute.ConnectionStringSetting.
I found a resolution to this issue:
I got confused at first and thought local.settings.json would serve as configuration for the live/non-local version of the function. That's not the case: it's only for local execution (I could have guessed from the file name).
So the question became: where/how can I edit the required settings in the Azure Portal?
Answer: Home -> All Services -> Function App -> MyFunctionApp -> Platform Features -> Configuration -> Application Settings -> Create New Application Setting
name: AzureSignalRConnectionString
value: MyMaskedConnectionString
Then reference it in function.json like this:
{
  "disabled": false,
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "methods": [
        "get"
      ],
      "name": "req",
      "route": "negotiate"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "SignalRConnectionInfo",
      "name": "connectionInfo",
      "hubName": "jitsi",
      "direction": "in",
      "connectionStringSetting": "AzureSignalRConnectionString"
    }
  ]
}
With those settings it's working for me now.
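For completeness, the same application setting can also be created from the command line with the Azure CLI (a sketch; the function app and resource group names are placeholders):
az functionapp config appsettings set --name MyFunctionApp --resource-group MyResourceGroup --settings "AzureSignalRConnectionString=<your-connection-string>"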
The goal is to receive a multipart form (with a text file) as a stream in an Azure HTTP trigger and pipe it to Azure Blob Storage. While processing, if the file exceeds SIZE_LIMIT (20 megabytes), abort the upload.
I tried to set up function.json like this:
{
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["post"],
      "dataType": "stream",
      "route": "myroute"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ],
  "disabled": false
}
But when debugging, I see that req.body is a Buffer.
What am I doing wrong? Is it even possible to receive a stream in Azure Functions?
For now, it's by design that Node.js (non-C#) functions read the incoming content as a Buffer.
Here's the thread tracking stream support, but it doesn't seem to be in progress. We may have to operate on the buffer (convert it to a stream, and so on) based on our requirements.
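For example, here is a minimal sketch of working with the buffer (Readable.from requires Node 12+; the 20 MB limit mirrors the question, and the response shapes are placeholders):
const { Readable } = require('stream');

const SIZE_LIMIT = 20 * 1024 * 1024; // 20 MB, per the question

module.exports = async function (context, req) {
    // req.body arrives as a Buffer, so a size check is straightforward
    if (req.body.length > SIZE_LIMIT) {
        context.res = { status: 413, body: 'File too large' };
        return;
    }
    // If a downstream API expects a stream, wrap the buffer in one
    const stream = Readable.from(req.body);
    // ... pipe the stream to blob storage here ...
    context.res = { status: 200, body: 'Accepted' };
};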
In a given Azure Function, I can have 1 or more output bindings. For example, I might have a blob storage output (writing a file blob to storage) and a queue output (pushing a message into a queue).
For example, if I have this very simple Azure function (written in Node.js)...
module.exports = function (context, req) {
    context.log('START: Multi-output function.');
    context.bindings.outputBlob = "blob-contents";
    context.bindings.outputQueueItem = "{'message': 'hello!'}";
    context.done();
};
... with the output bindings set up in function.json as follows...
{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "blob",
      "name": "outputBlob",
      "path": "outcontainer/{rand-guid}",
      "connection": "AzureWebJobsDashboard",
      "direction": "out"
    },
    {
      "type": "queue",
      "name": "outputQueueItem",
      "queueName": "outqueue",
      "connection": "AzureWebJobsDashboard",
      "direction": "out"
    }
  ],
  "disabled": false
}
... when do the two output bindings actually fire, and in which order?
For the when part of the question:
Do they fire at the point where the function sets the output binding? (e.g. the line of code that sets context.bindings.outputBlob)
Do they fire at/after context.done()?
For the order part of the question:
Do they fire in the order they're seen in the code?
Do they fire in the order they're seen in function.json ?
Output bindings fire after the function execution is completed, i.e. after context.done().
The order in which you set them in code has no influence on binding execution.
If you can, treat the actual execution order as an implementation detail and do not rely on it. Having said that, if I'm not mistaken, the actual order will be:
Execute all non-queue bindings in order of function.json
Then, execute all queue bindings in order of function.json
UPDATE: based on this issue and this issue I conclude that order is not guaranteed at the moment.
I'm trying to create an Azure Function that executes PowerShell with a Storage Queue trigger. For testing purposes, I want this function to manipulate a file in my OneDrive for Business account: to copy the file at aapdftoimage/ThreePages.pdf to aapdftoimage/output_ThreePages.pdf.
When OneDrive for Business is integrated as an input, I get errors any time the function is triggered by a new message in the queue. If I disconnect OneDrive as an input, I don't get any errors and $triggerInput contains the message.
The errors are:
2017-05-25T22:24:38.484 Function started (Id=a0c37fdf-ed3c-473c-9c79-236d63531e7e)
2017-05-25T22:24:38.499 Function completed (Failure, Id=a0c37fdf-ed3c-473c-9c79-236d63531e7e, Duration=1ms)
2017-05-25T22:24:38.562 Exception while executing function: Functions.QueueTriggerPowerShell1. Microsoft.Azure.WebJobs.Host: No value for named parameter 'file'.
Here's my PowerShell:
$inData = Get-Content $triggerInput
$inFile = Get-Content $inputFile
Write-Output "PowerShell script processed queue message '$inData'"
Write-Output "inFile: $inFile"
Here's function.json:
{
  "bindings": [
    {
      "name": "triggerInput",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "samples-powershell-pdftoimage",
      "connection": "<storageaccount>_STORAGE"
    },
    {
      "type": "apiHubFile",
      "name": "inputFile",
      "path": "aapdftoimage/{file}",
      "connection": "onedriveforbusiness1_ONEDRIVEFORBUSINESS",
      "direction": "in"
    }
  ],
  "disabled": false
}
As I'm writing this, I realize part of my confusion is over the input and output integrations of OneDrive for Business (output is not connected in my test).
I know what $triggerInput is. It's the content of the message. But what is $inputFile? And where does {file} come from?
I thought maybe I should do the following, but it doesn't work either (same errors):
$file = Get-Content $triggerInput
I thought this might define $inputFile as "aapdftoimage/$file" but it does nothing of the sort.
Needless to say, I'm at a standstill. Can anyone give me some guidance and straighten me out?
@Henry Hamid Safi is correct: using C#, you can leverage the Binder object to dynamically name the file.
In your use-case, the only way to dynamically provide the name of the file is to pass it as a JSON object in your trigger payload. Here is a sample setup that worked for me.
function.json:
{
  "bindings": [
    {
      "name": "triggerInput",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "samples-powershell",
      "connection": "AzureWebJobsStorage"
    },
    {
      "type": "apiHubFile",
      "name": "inputFile",
      "path": "aapdftoimage/{file}",
      "connection": "onedriveforbusiness_ONEDRIVEFORBUSINESS",
      "direction": "in"
    },
    {
      "type": "apiHubFile",
      "name": "outputFile",
      "path": "aapdftoimage/output_{file}",
      "connection": "onedriveforbusiness_ONEDRIVEFORBUSINESS",
      "direction": "out"
    }
  ],
  "disabled": false
}
run.ps1:
$in = Get-Content $triggerInput
Write-Output "PowerShell script processed queue message '$in'"
Copy-Item $inputFile $outputFile
Request body (if using Test pane in Portal) or Queue trigger payload:
{
  "file": "ThreePages.pdf"
}
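If you want to drop a test message into the queue from the command line instead of the portal, the Azure CLI can do it (a sketch; the connection string is a placeholder, and note that depending on the Functions host configuration, queue messages may need to be base64-encoded):
az storage message put --queue-name samples-powershell --content '{ "file": "ThreePages.pdf" }' --connection-string "<storage-connection-string>"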
Log entries:
2017-05-26T22:27:53.984 Function started (Id=032c4469-8378-44ce-af9e-5a941afb0d82)
2017-05-26T22:27:54.875 PowerShell script processed queue message '{ "file":"ThreePages.pdf" }'
2017-05-26T22:27:54.891 Function completed (Success, Id=032c4469-8378-44ce-af9e-5a941afb0d82, Duration=899ms)
Working example
function.json:
{
  "bindings": [
    {
      "name": "triggerInput",
      "type": "queueTrigger",
      "direction": "in",
      "queueName": "test",
      "connection": "AzureWebJobsDashboard"
    },
    {
      "type": "apiHubFile",
      "name": "inputFile",
      "path": "aapdftoimage/ThreePages.pdf",
      "connection": "onedrive_ONEDRIVE",
      "direction": "in"
    },
    {
      "type": "apiHubFile",
      "name": "outputFile",
      "path": "aapdftoimage/output_ThreePages.pdf",
      "connection": "onedrive_ONEDRIVE",
      "direction": "out"
    }
  ],
  "disabled": false
}
run.ps1:
$in = Get-Content $triggerInput
Write-Output "PowerShell script processed queue message '$in'"
Copy-Item $inputFile $outputFile