Could someone please tell me whether it is possible to assign the same value to both the PartitionKey and the RowKey in an Azure Function App?
Many thanks in advance.
Based on your description, I created an HTTP-triggered Node.js function to check this.
function.json:
{
    "bindings": [
        {
            "authLevel": "function",
            "type": "httpTrigger",
            "direction": "in",
            "name": "req"
        },
        {
            "type": "http",
            "direction": "out",
            "name": "res"
        },
        {
            "type": "table",
            "name": "outputTable",
            "tableName": "emails",
            "connection": "AzureWebJobsDashboard",
            "direction": "out"
        }
    ],
    "disabled": false
}
index.js:
module.exports = function (context, req) {
    // Use the same generated value for both partitionKey and rowKey
    var date = Date.now();
    var key = date + '-' + Math.ceil(Math.random() * 1000);
    context.bindings.outputTable = {
        "partitionKey": key,
        "rowKey": key,
        "GPIOPin": 2,
        "status": true
    };
    context.done();
};
You can use Azure Storage Explorer to verify that the entities were written to the table.
For more details about the output sample for the Table storage binding, you could refer to the Azure Functions Table storage bindings documentation.
It is possible. The design consequence is that every partition will contain exactly one entity. Remember that batch operations and transaction support are limited to entities in the same partition.
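To illustrate that limitation, here is a minimal sketch (not from the original answer; it assumes the legacy azure-storage npm package and a storage connection string in the AzureWebJobsStorage app setting). Every entity in a TableBatch must share the same PartitionKey, so entities keyed with a unique value per entity, as above, can never be grouped into one batch or transaction:
const azure = require('azure-storage');
const entGen = azure.TableUtilities.entityGenerator;

const tableSvc = azure.createTableService(process.env.AzureWebJobsStorage);
const batch = new azure.TableBatch();

// Two entities sharing a PartitionKey can be committed as one transaction...
batch.insertEntity({ PartitionKey: entGen.String('p1'), RowKey: entGen.String('r1') }, {});
batch.insertEntity({ PartitionKey: entGen.String('p1'), RowKey: entGen.String('r2') }, {});

// ...but adding an entity with a different PartitionKey would make executeBatch fail.
tableSvc.executeBatch('emails', batch, function (error, result) {
    if (error) { console.error(error); } else { console.log('Batch committed'); }
});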
Related
I am currently making an Azure Function that reads a blob's schema whenever a new file is uploaded. I use a Blob Trigger function so it runs whenever a new file is uploaded, but from there I'm stuck. Can anyone help?
Below is sample code that reads blob data through a blob input binding (triggered here by a queue message). Compare your code with it; that might help you resolve the issue.
{
    "bindings": [
        {
            "queueName": "myqueue-items",
            "connection": "MyStorageConnectionAppSetting",
            "name": "queuemsg",
            "type": "queueTrigger",
            "direction": "in"
        },
        {
            "name": "inputblob",
            "type": "blob",
            "dataType": "binary",
            "path": "samples-workitems/{queueTrigger}",
            "connection": "MyStorageConnectionAppSetting",
            "direction": "in"
        },
        {
            "name": "$return",
            "type": "blob",
            "path": "samples-workitems/{queueTrigger}-Copy",
            "connection": "MyStorageConnectionAppSetting",
            "direction": "out"
        }
    ],
    "disabled": false,
    "scriptFile": "__init__.py"
}
Here is the sample Python code.
import logging
import azure.functions as func

# The input binding field inputblob can be either 'bytes' or 'str',
# depending on whether dataType in function.json is 'binary' or 'string'.
def main(queuemsg: func.QueueMessage, inputblob: bytes) -> bytes:
    logging.info(f'Python Queue trigger function processed {len(inputblob)} bytes')
    return inputblob
For complete information, check the Blob storage trigger documentation and the Blob storage input binding documentation.
I have a function with an httpTrigger, and I'd like to use the domain and the path (the route in the trigger binding) to look up a value in a table store. The function app will have multiple domains pointing to it, so I want to use the request domain as part of the filter.
Filtering based on the route/path works fine, but I can't find any information about using other parts of the request.
How can I provide the request host or URL to the table binding as part of a filter?
Here's my function.json:
{
    "bindings": [
        {
            "authLevel": "anonymous",
            "type": "httpTrigger",
            "route": "{path}",
            "direction": "in",
            "name": "req",
            "methods": [
                "get"
            ]
        },
        {
            "name": "redirectRule",
            "type": "table",
            "take": "1",
            "filter": "Host eq '{????}' and Path eq '{path}'",
            "tableName": "RedirectRules",
            "connection": "StorageConnectionAppSetting",
            "direction": "in"
        },
        {
            "type": "http",
            "direction": "out",
            "name": "res"
        }
    ]
}
Where I have {????}, I'd like to use something like req.headers.host (which doesn't work). Is this possible?
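I'm not aware of a binding expression that exposes the request host to a declarative table filter, so one workaround sketch (this is an assumption, not a confirmed Functions feature; it uses the legacy azure-storage npm package, the StorageConnectionAppSetting app setting from the binding above, and a hypothetical RedirectTo column) is to drop the declarative table binding and query the table from the function body, where req.headers.host is available:
const azure = require('azure-storage');

module.exports = function (context, req) {
    const tableSvc = azure.createTableService(process.env.StorageConnectionAppSetting);
    const query = new azure.TableQuery()
        .top(1)
        .where('Host eq ? and Path eq ?', req.headers.host, context.bindingData.path);

    tableSvc.queryEntities('RedirectRules', query, null, function (error, result) {
        if (error || result.entries.length === 0) {
            context.res = { status: 404, body: 'No matching redirect rule' };
        } else {
            // RedirectTo is a hypothetical column on the RedirectRules table.
            context.res = { status: 302, headers: { Location: result.entries[0].RedirectTo._ } };
        }
        context.done();
    });
};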
When I run the command
func new --name getTodo
I get the following code in my function.json file:
{
    "bindings": [
        {
            "authLevel": "function",
            "type": "httpTrigger",
            "direction": "in",
            "name": "req",
            "methods": [
                "get"
            ]
        },
        {
            "type": "http",
            "direction": "out",
            "name": "res"
        }
    ]
}
So I have two objects in the array. What is the difference between the first one and the second one? What is the meaning of "direction": "in" versus "direction": "out"?
One is the trigger and the other is an output binding; both of them are HTTP.
The trigger supplies input data from the HTTP request, and the output binding returns data in the HTTP response.
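As a minimal sketch (not part of the original answer), the two bindings show up in index.js as the req parameter, which the trigger fills in, and context.res, which feeds the output binding:
// "req" is populated by the httpTrigger binding ("direction": "in");
// assigning to context.res goes out through the http binding ("direction": "out").
module.exports = async function (context, req) {
    const name = (req.query && req.query.name) || 'world';
    context.res = {
        status: 200,
        body: 'Hello, ' + name
    };
};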
I have Azure Functions written in Node.js. I can't find a way to get data from, for example, my Azure Cosmos DB. I know there is an Azure Cosmos SDK, but I don't want to go that route; I want to learn to do it through Azure Functions bindings, because it is possible with them as well.
I tried this:
function.json
{
    "bindings": [
        {
            "authLevel": "anonymous",
            "type": "httpTrigger",
            "direction": "in",
            "name": "req",
            "methods": [
                "get"
            ]
        },
        {
            "type": "http",
            "direction": "out",
            "name": "res"
        },
        {
            "type": "cosmosDB",
            "name": "inputDocument",
            "databaseName": "dbtodos",
            "collectionName": "items",
            "connectionStringSetting": "todos_DOCUMENTDB",
            "partitionKey": "/all",
            "direction": "in"
        }
    ],
    "disabled": false
}
index.js:
module.exports = async function (context, req) {
    context.res = {
        // status: 200, /* Defaults to 200 */
        body: context.bindings.inputDocument
    };
};
After deploying, when I visit the automatically generated URL I can't even open the link; no response comes back.
If I deploy a basic example that doesn't try to pull data from the database, the URL works after deployment.
How can I get the data?
The data in my local.settings.json was wrong: the connection setting pointed at the storage for a different table, not the one I wanted to query. The code itself works perfectly fine.
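For reference, a minimal local.settings.json sketch (the setting name todos_DOCUMENTDB comes from the function.json above; the endpoint, key, and storage values are placeholders) looks something like this:
{
    "IsEncrypted": false,
    "Values": {
        "AzureWebJobsStorage": "<storage-connection-string>",
        "FUNCTIONS_WORKER_RUNTIME": "node",
        "todos_DOCUMENTDB": "AccountEndpoint=https://<your-account>.documents.azure.com:443/;AccountKey=<your-key>;"
    }
}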
I want to create an Azure Function that is bound to Cosmos DB.
Whenever an insertion happens in collection "A", I want to update collection "B".
Collection "B" has a stored procedure that I want to call after the insertion into collection "A".
I am new to Azure and Cosmos DB.
Please suggest what needs to be done to accomplish this requirement.
So far I have created the Azure Function and updated function.json with the code below.
{
    "bindings": [
        {
            "type": "cosmosDBTrigger",
            "name": "input",
            "direction": "in",
            "leaseCollectionName": "leases",
            "connectionStringSetting": "cdb-swm-dev-001_DOCUMENTDB",
            "databaseName": "admin",
            "collectionName": "aomsorders",
            "createLeaseCollectionIfNotExists": true
        },
        {
            "type": "documentDB",
            "name": "inputDocument",
            "databaseName": "admin",
            "collectionName": "aomsorders",
            "connection": "cdb-swm-dev-001_DOCUMENTDB",
            "direction": "in"
        },
        {
            "type": "documentDB",
            "name": "outputDocument",
            "databaseName": "admin",
            "collectionName": "test",
            "createIfNotExists": true,
            "connection": "cdb-swm-dev-001_DOCUMENTDB",
            "direction": "out"
        }
    ],
    "disabled": false
}
I have also updated the Integrate section in the portal accordingly.
Any suggestion would be appreciated.
I find your question not very specific, but broadly you have at least two options:
Insert into both collections from the same Azure Function (a minimal sketch of this follows below)
Insert into collection 1 from the first Azure Function, then have a second Azure Function with a Cosmos DB trigger listening to changes in collection 1 and updating collection 2
I'm sure there are other options too.
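For option 1, a minimal sketch (not from the original answer; docsToCollectionA and docsToCollectionB are hypothetical names of two documentDB output bindings that would be declared in function.json) assigns the same document to both output bindings from a single HTTP-triggered function:
module.exports = function (context, req) {
    var doc = req.body;

    // Each assignment goes to its own documentDB output binding,
    // so one invocation writes to both collections.
    context.bindings.docsToCollectionA = doc;
    context.bindings.docsToCollectionB = doc;

    context.res = { status: 201, body: doc };
    context.done();
};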
For option 2, here is an example of an Azure Function that is triggered by a CosmosDBTrigger and uses a DocumentDB output binding to write to a second collection:
function.json
{
    "bindings": [
        {
            "type": "cosmosDBTrigger",
            "name": "input",
            "direction": "in",
            "databaseName": "your-database",
            "collectionName": "your-collection1",
            "connectionStringSetting": "name-of-connectionstring-setting-for-collection1",
            "leaseCollectionName": "your-lease-collection"
        },
        {
            "type": "documentDB",
            "direction": "out",
            "name": "docsToSave",
            "databaseName": "your-database2",
            "collectionName": "your-collection2",
            "connection": "name-of-connectionstring-setting-for-collection2",
            "createIfNotExists": false
        }
    ]
}
run.csx (C#)
#r "Microsoft.Azure.Documents.Client"
using Microsoft.Azure.Documents;
using System.Collections.Generic;
using System;
public static async Task Run(IReadOnlyList<Document> input, IAsyncCollector<Document> docsToSave)
{
foreach(var doc in input){
// Do something here, process the document or do your compute
// Here I am saving the same document to the second collection but you could send a new document created within the processing logic or send the same document modified by some logic
await docsToSave.AddAsync(doc);
}
}
index.js (NodeJS)
module.exports = function (context, input) {
    if (!!input && input.length > 0) {
        context.bindings.docsToSave = [];
        for (var i = 0, len = input.length; i < len; i++) {
            var doc = input[i];
            // Do something here with the doc or create a new one
            context.bindings.docsToSave.push(doc);
        }
    }
    context.done();
}
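The original question also mentions calling a stored procedure on collection "B" after the insert. The output binding above only writes documents; invoking a stored procedure needs a client call from the function body. Here is a minimal sketch assuming the @azure/cosmos npm package, a runtime that supports async functions, a placeholder COSMOS_CONNECTION app setting, a stored procedure named your-sproc on your-collection2, and a hypothetical partition key property pk on the documents (none of these names come from the original answer):
const { CosmosClient } = require('@azure/cosmos');

module.exports = async function (context, input) {
    const client = new CosmosClient(process.env.COSMOS_CONNECTION);
    const container = client.database('your-database2').container('your-collection2');

    // Execute the stored procedure once per changed document,
    // passing the document as the stored procedure's argument.
    for (const doc of input) {
        const { resource } = await container
            .scripts.storedProcedure('your-sproc')
            .execute(doc.pk, [doc]);
        context.log('Stored procedure result:', resource);
    }
};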