When I deploy an Azure Function from Visual Studio, the function.json file is always incorrect. For a queue-triggered function, for example, the generated function.json looks like this:
{
  "generatedBy": "Microsoft.NET.Sdk.Functions-1.0.12",
  "configurationSource": "attributes",
  "bindings": [
    {
      "type": "queueTrigger",
      "connection": "AzureWebJobsStorage",
      "queueName": "queue",
      "name": "myQueueItem"
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/x.dll",
  "entryPoint": "x"
}
The correct function.json, for the function to work in Azure, is:
{
  "bindings": [
    {
      "type": "queueTrigger",
      "connection": "AzureWebJobsStorage",
      "direction": "in",
      "queueName": "queue",
      "name": "myQueueItem"
    }
  ],
  "disabled": false,
  "scriptFile": "../bin/x.dll",
  "entryPoint": "x"
}
Is there any way to make automated deployments or Visual Studio deployments fix this automatically? Currently I am editing all the function.json files after every deployment. Any solutions or workarounds would be appreciated.
I agree with @Thomas. I have tested the v1 queue trigger template with Microsoft.NET.Sdk.Functions-1.0.12 and the latest Microsoft.NET.Sdk.Functions-1.0.22, and the function.json generated by VS does work.
Actually, both function.json files work on Azure. The two lines below simply indicate that the function.json was generated by VS; the file is not meant to be modified after deployment.
"generatedBy": "Microsoft.NET.Sdk.Functions-1.0.22",
"configurationSource": "attributes",
Regarding "the first one would not work":
Function execution results may not show up instantly. You can go to https://functionappname.scm.azurewebsites.net/DebugConsole and navigate to D:\home\LogFiles\Application\Functions\function\{FunctionName} to check the log files.
You can also visit D:\home\LogFiles\Application\Functions\Host for detailed host logs.
If you are still having trouble, please elaborate on what "would not work" means and show us your code.
I have a container with many sub-folders (around 3000), and a file can land in any of them. I need to react to a blob that is added to a sub-folder, but I can't figure out how to create a blob trigger that fires for files added to sub-folders.
Example:
Excerpt from function.json:
{
  "name": "myblob",
  "type": "blobTrigger",
  "direction": "in",
  "path": "rootContainer/{name}"
}
OK: the function is triggered if the blob lands directly in the rootContainer container.
Excerpt from function.json:
{
  "name": "*/myblob",
  "type": "blobTrigger",
  "direction": "in",
  "path": "rootContainer/{name}"
}
or
{
  "name": "myblob",
  "type": "blobTrigger",
  "direction": "in",
  "path": "rootContainer/*/{name}"
}
NOT OK: the function isn't triggered.
There are not many questions about this problem, and those that exist don't provide a proper answer. I can't find any information in the documentation either.
Thanks!
I notice you use */myblob as the name, but that has no effect; name is only the parameter name exposed to your function.
For example, if you want the function to be triggered when something is sent to a specific folder such as test under rootContainer, you need to use this function.json:
{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "rootContainer/test/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
The path needs to be defined at compile time.
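For reference, a minimal index.js to go with that binding could look like the following. This is only a sketch: the myBlob parameter name matches the binding above, and the logged details are illustrative.
// Minimal blob trigger handler sketch; "myBlob" matches the binding name above
// and arrives as a Buffer containing the blob contents.
module.exports = async function (context, myBlob) {
    // context.bindingData.name holds the {name} token resolved from the path,
    // e.g. "file.txt" for a blob at rootContainer/test/file.txt.
    context.log(`Blob trigger fired for "${context.bindingData.name}", ${myBlob.length} bytes`);
};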
I started tinkering with Azure SignalR and ran into a problem with the negotiate trigger. I followed the official Microsoft guide.
Here's my code:
local.settings.json
{
  "IsEncrypted": false,
  "Values": {
    "AzureSignalRConnectionString": "Endpoint=https://my.service.signalr.net;AccessKey=myKey=;Version=1.0;",
    "FUNCTIONS_WORKER_RUNTIME": "node"
  },
  "Host": {
    "LocalHttpPort": 7071,
    "CORS": "*",
    "CORSCredentials": true
  }
}
function.json
{
  "disabled": false,
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "methods": [
        "get"
      ],
      "name": "req",
      "route": "negotiate"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "SignalRConnectionInfo",
      "name": "connectionInfo",
      "hubName": "jitsi",
      "ConnectionStringSetting": "Endpoint=https://my.service.signalr.net;AccessKey=myKey;Version=1.0;",
      "direction": "in"
    }
  ]
}
index.js
module.exports = async function (context, req, connectionInfo) {
    context.res.body = connectionInfo;
};
It works fine locally (unfortunately, that's where the guide ends). But if I visit the URL of the negotiate HTTP trigger, I get "Internal Server Error 500". The logs contain the following output:
2020-04-23T08:47:32 Welcome, you are now connected to log-streaming service. The default timeout is 2 hours. Change the timeout with the App Setting SCM_LOGSTREAM_TIMEOUT (in seconds).
2020-04-23T08:47:52.070 [Information] Executing 'Functions.jitsiNegotiate' (Reason='This function was programmatically called via the host APIs.', Id=2b791d95-3775-47bb-ade1-ac9005929f61)
2020-04-23T08:47:52.238 [Error] Executed 'Functions.jitsiNegotiate' (Failed, Id=2b791d95-3775-47bb-ade1-ac9005929f61)
Unable to resolve the value for property 'SignalRConnectionInfoAttribute.ConnectionStringSetting'. Make sure the setting exists and has a valid value.
As you can see in my code, I did provide the ConnectionStringSetting.
Some people suggested it's due to the lower/upper case 'C' in ConnectionStringSetting.
Others said to edit local.settings.json.
None of that had any effect for me, and I can't find any useful information on the issue.
EDIT 1:
I set "hubName":"jitsi". With jitsi being the name of my SignalR Service.
As in 'jitsi.service.signalr.net'. I'm not sure if that's correct or not.
Perhaps thats part of the issue?
EDIT 2:
I tried with no value set for ConnectionStringSetting (so that it falls back to the default).
That gave me the same error. I also completely deleted the contents of local.settings.json and re-deployed to see what would happen.
Same behaviour as before.
My guess is the service only uses that file for local execution (hence the name).
So with local.settings.json being empty, there is no other place where I have defined the value for AzureSignalRConnectionString.
I did some digging and apparently (according to this thread) you should define it under
'Configuration'->'Application Settings'
So I created a new setting with
name: Azure__SignalR__ConnectionString
value: myMaskedConnectionString
which resulted in the following error:
The SignalR Service connection string must be set either via an 'AzureSignalRConnectionString' app setting, via an 'AzureSignalRConnectionString' environment variable, or directly in code via SignalROptions.ConnectionString or SignalRConnectionInfoAttribute.ConnectionStringSetting.
I found a resolution to this issue:
I got confused at first and thought local.settings.json would serve as the configuration for the live/non-local version of the function. That's not the case; it's only used for local execution (as the file name suggests).
So the question remains: Where/How can I edit the required settings in the Azure Portal?
Answer: Home -> All Services -> Function App -> MyFunctionApp -> Platform Features -> Configuration -> Application Settings -> Create New Application Setting
name: AzureSignalRConnectionString
value: MyMaskedConnectionString
Then reference it in function.json like this:
{
  "disabled": false,
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "methods": [
        "get"
      ],
      "name": "req",
      "route": "negotiate"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "SignalRConnectionInfo",
      "name": "connectionInfo",
      "hubName": "jitsi",
      "direction": "in",
      "connectionStringSetting": "AzureSignalRConnectionString"
    }
  ]
}
With those settings it's working for me now.
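If you want to double-check that the value is actually reaching the deployed Function App, application settings surface as environment variables at runtime, so a throwaway HTTP-triggered function like the following (just a sanity-check sketch, not part of the guide) can confirm it:
// Sanity-check sketch: app settings are exposed to the function as environment variables.
module.exports = async function (context, req) {
    const hasSetting = !!process.env.AzureSignalRConnectionString;
    context.log(`AzureSignalRConnectionString present: ${hasSetting}`);
    context.res = { body: { hasSetting: hasSetting } };
};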
I have a simple Azure Functions Cosmos DB trigger set up, like so:
{
  "bindings": [
    {
      "type": "cosmosDBTrigger",
      "name": "documents",
      "direction": "in",
      "leaseCollectionName": "leases",
      "connectionStringSetting": "DbConnectionString",
      "databaseName": "mydb",
      "collectionName": "mycollection",
      "createLeaseCollectionIfNotExists": "true"
    }
  ],
  "scriptFile": "../dist/TestCosmosTrigger/index.js"
}
When I run the function I get an error:
The 'TestNotifier' function is in error: The binding type(s) 'cosmosDBTrigger' are not registered. Please ensure the type is correct and the binding extension is installed.
I'm not sure what I'm doing wrong; perhaps it's a bug in the Node.js Azure Functions runtime?
Edit: after updating the Azure Functions tools on my machine, a whole bunch of other bindings have started failing with the same error, such as SignalR:
{
  "type": "signalRConnectionInfo",
  "name": "connectionInfo",
  "hubName": "chat",
  "userId": "{headers.authorization}",
  "direction": "in"
}
signalRConnectionInfo binding extension is not installed.
After updating Azure Functions Core Tools with:
npm i -g azure-functions-core-tools#core --unsafe-perm true
and then running:
func extensions install -p Microsoft.Azure.WebJobs.Extensions.SignalRService -v 1.0.0
the above issue was resolved. A new problem opened up, but that's a separate issue.
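Once the extensions are installed, a minimal handler matching the cosmosDBTrigger binding above can confirm the trigger fires. This is only a sketch of what the compiled index.js might look like; the documents parameter name comes from the binding, everything else is illustrative.
// Minimal Cosmos DB trigger handler sketch; "documents" matches the binding name above
// and contains the batch of changed documents from the change feed.
module.exports = async function (context, documents) {
    if (documents && documents.length > 0) {
        context.log(`Processing ${documents.length} changed document(s)`);
        context.log(`First document id: ${documents[0].id}`);
    }
};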
The goal is to receive a multipart form (containing a text file) as a stream in an Azure HttpTrigger function and pipe it to Azure Blob Storage. While processing, check whether the file exceeds SIZE_LIMIT (20 megabytes) and, if so, abort the upload.
I tried to set up function.json like this:
{
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["post"],
      "dataType": "stream",
      "route": "myroute"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ],
  "disabled": false
}
But when debugging, I see that the req.body variable is of type Buffer.
What am I doing wrong? Is it even possible to receive a stream in Azure Functions?
For now, it's by design that Node.js (non-C#) functions read the incoming content as a Buffer.
Here's the thread tracking stream support, but it doesn't appear to be in progress. We may have to operate on the buffer (convert it to a stream and so on) depending on our requirements.
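Given that limitation, one workaround is to enforce the size limit on the buffer first and only then wrap it in a stream for the upload. This is only a sketch under the function.json above; uploadToBlob is a hypothetical placeholder for whatever Blob SDK call you actually use.
const { Readable } = require("stream");

const SIZE_LIMIT = 20 * 1024 * 1024; // 20 MB, per the requirement above

module.exports = async function (context, req) {
    const body = req.body; // arrives as a Buffer in Node.js functions

    // Reject oversized payloads before doing any upload work.
    if (!body || body.length > SIZE_LIMIT) {
        context.res = { status: 413, body: "File exceeds the 20 MB limit" };
        return;
    }

    // Wrap the buffer in a readable stream so downstream code can pipe it.
    const stream = Readable.from(body);
    // await uploadToBlob(stream, body.length); // hypothetical upload helper
    context.res = { status: 200, body: "Accepted" };
};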
In a given Azure Function, I can have 1 or more output bindings. For example, I might have a blob storage output (writing a file blob to storage) and a queue output (pushing a message into a queue).
For example, if I have this very simple Azure function (written in Node.js)...
module.exports = function (context, req) {
    context.log('START: Multi-output function.');
    context.bindings.outputBlob = "blob-contents";
    context.bindings.outputQueueItem = "{'message': 'hello!'}";
    context.done();
};
... with the output bindings set up in function.json as follows...
{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "blob",
      "name": "outputBlob",
      "path": "outcontainer/{rand-guid}",
      "connection": "AzureWebJobsDashboard",
      "direction": "out"
    },
    {
      "type": "queue",
      "name": "outputQueueItem",
      "queueName": "outqueue",
      "connection": "AzureWebJobsDashboard",
      "direction": "out"
    }
  ],
  "disabled": false
}
... when do the two output bindings actually fire, and in which order?
For the when part of the question:
Do they fire at the point where the function sets the output binding? (e.g. the line of code that sets context.bindings.outputBlob)
Do they fire at/after context.done()?
For the order part of the question:
Do they fire in the order they're seen in the code?
Do they fire in the order they're seen in function.json ?
Output bindings fire after the function execution is completed - after context.done().
The order that you set them in the code has no influence on binding executions.
If you can, treat the actual execution order as an implementation detail and do not rely on it. Having said that, if I'm not mistaken, the actual order will be:
Execute all non-queue bindings in order of function.json
Then, execute all queue bindings in order of function.json
UPDATE: based on this issue and this issue I conclude that order is not guaranteed at the moment.
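For illustration, because the outputs are only flushed after the function completes, the original example can also pass the outputs through context.done()'s optional property bag with the same end result. This is just a sketch of the equivalent form, not a recommendation about ordering.
module.exports = function (context, req) {
    context.log('START: Multi-output function.');
    // Both bindings are still written only after the function finishes;
    // the property bag sets the output binding values by name.
    context.done(null, {
        outputBlob: "blob-contents",
        outputQueueItem: "{'message': 'hello!'}"
    });
};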