Cosmos DB Trigger for Azure Functions Application

We are developing applications using Azure Functions (Python). I have two questions regarding Azure Functions applications.
Can we monitor two collections using a single Cosmos DB trigger?
--- I have looked through the documentation, and it seems this isn't supported. Did I miss anything?
If there are two functions monitoring the same collection, will only one of the functions be triggered?
--- I observed this behaviour today: I was running two instances of the function app, and the data from the Cosmos DB trigger was sent to only one of them. I am trying to find out the reason for it.

EDIT:
1 - I had never used multiple input bindings, but according to the official wiki it's possible (note that each function still has exactly one trigger); just add them to the function.json file:
{
  "bindings": [
    {
      "type": "queueTrigger",
      "name": "queueItem",
      "direction": "in",
      "queueName": "image-resize"
    },
    {
      "type": "blob",
      "name": "original",
      "direction": "in",
      "path": "images-original/{name}"
    },
    {
      "type": "blob",
      "name": "resized",
      "direction": "out",
      "path": "images-resized/{name}"
    }
  ]
}
PS: I know you're using Cosmos DB; the sample above is just to illustrate.
2 - I assume it's due to the way it's implemented (e.g. topic vs. queue): the first function locks the event/message, so the second one is not aware of it. At the moment, Durable Functions for Python is still under development and should be released next month (03/2020). It will allow you to chain the execution of functions, just as is already available for C#/Node:
https://learn.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-overview?tabs=csharp#chaining
What you can do is output to a queue, which will trigger your second function after the first one completes (pretty much what Durable Functions offers).
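For the queue hand-off, the first (Cosmos DB-triggered) function can declare a queue output binding in its function.json, and the second function then uses a queueTrigger on the same queue. A minimal sketch of the first function's bindings; the database, collection, queue, and connection-setting names are placeholders:

```json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "type": "cosmosDBTrigger",
      "name": "documents",
      "direction": "in",
      "connectionStringSetting": "CosmosDBConnection",
      "databaseName": "mydb",
      "collectionName": "items",
      "leaseCollectionName": "leases",
      "createLeaseCollectionIfNotExists": true
    },
    {
      "type": "queue",
      "name": "outqueue",
      "direction": "out",
      "queueName": "processed-items",
      "connection": "AzureWebJobsStorage"
    }
  ]
}
```

The Python handler then writes each changed document to `outqueue`, and whatever the second function does runs only after a message lands on the queue.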

Related

Nested Http Route in Azure Function returns 404 not found

I have multiple functions handling GET, PUT & POST APIs with different paths in my Azure Function app, and they all work fine. Each function has one index.js file handling one HTTP method only, and I have a routePrefix of "api" in host.json. The function.json for one of the functions can be found below:
{
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "route": "bike/{id}/like",
      "methods": [
        "put"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}
The problem I am facing is that the two PUT requests below don't work, and I get a 404 Not Found. However, they seem to work when I run the function locally in VS Code and check using Postman.
http://<MyAzureFuncName>.azurewebsites.net/api/bike/63d51c0bb593aec8734638e5/like
http://<MyAzureFuncName>.azurewebsites.net/api/bike/63d51c0bb593aec8734638e5/dislike
UPDATE 1
I have just found out that calling the above APIs using the test/run feature on the Azure Function in the portal works, but not when I try using Postman. Any ideas why this is the case?
I also have a GET and an update method for PUT following a similar signature (shown below), which both work locally, but on Azure the PUT returns the GET result when checking with Postman. Could this PUT overload be the cause of the mix-up?
http://<MyAzureFuncName>.azurewebsites.net/api/bike/63d51c0bb593aec8734638e5
UPDATE 2
I have found that all the above PUT requests work if I use https instead of http. The other POST, DELETE & GET requests work with both. This is all the more baffling; can anyone shed some light on this?
I would appreciate any help & guidance in spotting the obvious.
UPDATE 3
I do not have the TLS/SSL settings (classic) option in the Settings for my function app, or in any of my App Services for that matter. I am assuming this was something on the old portal.
Thanks
Since you are able to reach the endpoint with https and not with http, I assume the error is caused by the HTTPS Only setting.
Navigate to your Function App, and go to TLS/SSL settings (classic) in the Settings section and switch HTTPS Only to Off.
When it comes to this section:
I also have GET & an update method for PUT following a similar
signature (shown below) which both work locally but on Azure the PUT
is returning the GET result when checking with Postman. Could this PUT
overload be the cause for the mix up.
I would say it's normal to return the resource when making a PUT request. For example, if you update the resource, you would then return the data with the updated information.
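That convention can be sketched with a tiny in-memory handler (hypothetical data, not the poster's actual code): apply the update, then return the updated resource as the response body.

```python
# Hypothetical in-memory store standing in for the bike database.
bikes = {"63d51c0bb593aec8734638e5": {"id": "63d51c0bb593aec8734638e5", "likes": 0}}

def put_like(bike_id):
    """Handle PUT /bike/{id}/like: apply the update, then return the updated resource."""
    bike = bikes[bike_id]
    bike["likes"] += 1
    return bike

updated = put_like("63d51c0bb593aec8734638e5")  # updated["likes"] == 1
```

So a PUT echoing back data that looks like a GET result is usually by design, not a routing mix-up.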
Hope it helps.

How to monitor execution time per endpoint when using Express with Google Cloud Functions?

I have a Cloud Function (actually a Firebase Function) running on the Node.js runtime that serves an API based on the Express framework. Looking at the function logs, I see outputs like this (real data omitted):
{
  "insertId": "000000-00000000-0000-0000-0000-000000000000",
  "labels": {
    "execution_id": "000000000000"
  },
  "logName": "projects/project-00000/logs/cloudfunctions.googleapis.com%2Fcloud-functions",
  "receiveTimestamp": "2000-01-01T00:00:00.000000000Z",
  "resource": {
    "labels": {
      "function_name": "api",
      "project_id": "project-00000",
      "region": "us-central1"
    },
    "type": "cloud_function"
  },
  "severity": "DEBUG",
  "textPayload": "Function execution took 5000 ms, finished with status code: 200",
  "timestamp": "2000-01-01T00:00:00.000000000Z",
  "trace": "projects/project-00000/traces/00000000000000000000000000000000"
}
The relevant data I want to extract is the execution time and response code, found in the textPayload attribute. However, I want to create a metric that breaks the data down per API endpoint, to identify which endpoints are slow. This is an HTTP function, but I don't have any request detail in the logs.
I could probably achieve what I want by adding logging code to the function. However, I was wondering if I can extract the info directly from Google Cloud without touching the function code.
Is there a way to create a log-based metric that shows execution times split by endpoint?
References:
https://firebase.google.com/docs/functions/monitored-metrics
https://cloud.google.com/functions/docs/monitoring/metrics
I don't believe this is possible without writing code. If you want to collect information about running code, folks typically turn to Stackdriver and use its APIs to collect specific information for analysis.
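As a sketch of the code you would have to add: record the duration per endpoint yourself and emit it as a structured log that a log-based metric can pick up. The question's API runs on Express/Node.js, but the idea is language-agnostic; here it is as a Python decorator with invented endpoint names:

```python
import time
from collections import defaultdict

# Durations in milliseconds, keyed by endpoint path (names here are made up).
timings = defaultdict(list)

def timed(endpoint):
    """Decorator that records each call's duration under the endpoint name."""
    def wrap(handler):
        def inner(*args, **kwargs):
            start = time.perf_counter()
            try:
                return handler(*args, **kwargs)
            finally:
                # In a real function you would log this instead of keeping it in memory.
                timings[endpoint].append((time.perf_counter() - start) * 1000.0)
        return inner
    return wrap

@timed("/users")
def list_users():
    return ["alice", "bob"]

result = list_users()
```

Logging one line per request with the path and duration is exactly the kind of payload a custom log-based metric can then aggregate per endpoint.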

Which action(s) can I use to create a folder in SharePoint Online via Azure Logic App?

As the question title states, I am looking for the proper action in Logic Apps to create a folder. This action will be executed several times -- once per directory, as per the business rule. There will be no files created in these folders, because the intent of the Logic App is to prepare a template folder structure for the users' needs.
In the official documentation I see that there are create file, create item, and list folder actions. They suggest that there might be an action to create a folder too (which I can't find).
If such action does not exist, I may need to use some SharePoint Online API, but that will be a last resort solution.
I was able to create a directory by means of the SharePoint Create File action. Creating a directory as a side effect of the file creation action is definitely a dirty hack (btw, inspired by a comment on the MS suggestion site). This bug/feature is not documented, so relying on it in a production environment is probably not a good idea.
More than that, if your problem requires creating a directory in SharePoint without any files in it whatsoever, an extra step in the Logic App is needed: make sure to delete the file, using the Id provided by the Create File action.
Here's what your JSON might look like if you were trying to create a directory called folderCreatedAsSideEffect under a preexisting TestTarget document library:
"actions": {
"Create_file": {
"inputs": {
"body": "#triggerBody()?['Name']",
"host": { "connection": { "name": "#parameters('$connections')['sharepointonline']['connectionId']" } },
"method": "post",
"path": "/datasets/#{encodeURIComponent(encodeURIComponent('https://MY.sharepoint.com/LogicApps/'))}/files",
"queries": {
"folderPath": "/TestTarget/folderCreatedAsSideEffect",
"name": "placeholder"
}
},
"runAfter": {},
"type": "ApiConnection"
},
"Delete_file": {
"inputs": {
"host": { "connection": { "name": "#parameters('$connections')['sharepointonline']['connectionId']" } },
"method": "delete",
"path": "/datasets/#{encodeURIComponent(encodeURIComponent('https://MY.sharepoint/LogicApps/'))}/files/#{encodeURIComponent(body('Create_file')?['Id'])}"
},
"runAfter": {
"Create_file": [
"Succeeded"
]
},
"type": "ApiConnection"
}
},
Correct: so far the SharePoint connector does not support folder management tasks.
So your best option currently is to use the SharePoint API or client libraries in an API App or Function App.
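If you do fall back to the SharePoint REST API, folder creation is a POST to the site's /_api/web/folders endpoint. A minimal sketch of building that request in Python; the site URL and folder path are placeholders, and authentication (e.g. a bearer token header) is omitted:

```python
import json

def build_create_folder_request(site_url, server_relative_path):
    """Return the URL, headers, and JSON body for a SharePoint create-folder POST."""
    url = site_url.rstrip("/") + "/_api/web/folders"
    headers = {
        "Accept": "application/json;odata=verbose",
        "Content-Type": "application/json;odata=verbose",
    }
    body = json.dumps({
        "__metadata": {"type": "SP.Folder"},
        "ServerRelativeUrl": server_relative_path,
    })
    return url, headers, body

url, headers, body = build_create_folder_request(
    "https://contoso.sharepoint.com/sites/demo",
    "/sites/demo/TestTarget/NewFolder")
```

The Logic App's HTTP action (or a Function App) can then send this request, so no placeholder file needs to be created and deleted.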

Will a Logic App retry inserting a failed record?

I have a Logic App that triggers whenever a record is created in Salesforce CRM; after that, a SQL Server Insert action inserts the Salesforce CRM record into an Azure SQL database.
My question is: if my Azure SQL database is down or the connection fails, what will happen to the record that failed? Will the Logic App retry inserting it or not?
Not by default.
But you have Do-Until loops, where you define a condition for repeating an action. In your condition you can simply evaluate the result of the SQL insert.
I use, for example, the following expression to make a reliable call to a REST API:
"GetBerlinDataReliable": {
"actions": {
"GetBerlinData": {
"inputs": {
"method": "GET",
"uri": "http://my.rest.api/path?query"
},
"runAfter": {},
"type": "Http"
}
},
"expression": "#and(equals(outputs('GetBerlinData').statusCode, 200),greaterOrEquals(body('GetBerlinData').query?.count, 1))",
"limit": {
"count": 100,
"timeout": "PT30M"
},
"runAfter": {},
"type": "Until"
},
It depends on whether the HTTP status code returned is retry-able or not. If it is, Logic Apps will by default retry 4 times with 30 seconds in between (you can change that in the Settings of the given action as well). If it is not, then no retry will happen.
There are multiple ways to handle errors, depending on what error you expect and how it occurs: a do-until loop as mentioned above is one way, or you can consider a try (insert) / catch (save to blob) pattern and have another Logic App check the blob and retry the insert.
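The per-action retry settings mentioned above show up in code view as a retryPolicy object inside the action's inputs. A sketch of what that fragment might look like; the action, connection, and table names are made up:

```json
"Insert_row": {
  "type": "ApiConnection",
  "inputs": {
    "host": { "connection": { "name": "@parameters('$connections')['sql']['connectionId']" } },
    "method": "post",
    "path": "/datasets/default/tables/@{encodeURIComponent('MyTable')}/items",
    "body": "@triggerBody()",
    "retryPolicy": {
      "type": "fixed",
      "count": 4,
      "interval": "PT30S"
    }
  },
  "runAfter": {}
}
```

Setting "type" to "none" disables retries entirely; "exponential" is also available if you want backoff instead of a fixed interval.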

Get the route URL for an HTTP-triggered function in an ARM template

I'm trying to figure out how to get the route for an HTTP triggered Azure Function within an ARM template.
Thanks to a blog post I managed to find out about the listsecrets command, but when executing this action via PowerShell, the output doesn't give me the trigger_url I was expecting. The URL does not comply with the configured route of the function, and instead shows the default trigger URL, as if no route had been configured.
Is there any way I can get hold of the configured route instead, since I can't seem to use the trigger_url?
My configured route has parameters in the path as well, e.g.:
{
  "name": "req",
  "type": "httpTrigger",
  "direction": "in",
  "authLevel": "function",
  "methods": [
    "POST"
  ],
  "route": "method/{userId}/{deviceId}"
}
The output of listsecrets is:
trigger_url: https://functionapp.azurewebsites.net/api/method?code=hostkey
Is there any other way to extract the host key and route?
Try playing with the API version, but I suspect that this is not possible as of now.
Currently, the only way to get the route is by reading the function.json file and parsing that information out, which you can do by using Kudu's VFS API.
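A sketch of that approach in Python: fetch function.json through Kudu's VFS endpoint (deployment credentials required; the app and function names are placeholders), then pull the route out of the httpTrigger binding. Only the parsing step is shown actually running:

```python
import json

# With deployment credentials, function.json can be fetched from:
#   https://<app>.scm.azurewebsites.net/api/vfs/site/wwwroot/<function>/function.json

def extract_route(function_json_text):
    """Return the configured route of the httpTrigger binding, or None if absent."""
    config = json.loads(function_json_text)
    for binding in config.get("bindings", []):
        if binding.get("type") == "httpTrigger":
            return binding.get("route")
    return None

sample = json.dumps({"bindings": [
    {"name": "req", "type": "httpTrigger", "direction": "in",
     "route": "method/{userId}/{deviceId}"}
]})
route = extract_route(sample)
```

The extracted route can then be combined with the host key to assemble the real invocation URL.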
For the keys, I would actually recommend using the key management APIs instead of listsecrets. The latter is meant to address a small set of scenarios (primarily to enable some internal integrations), whereas the key management API is more robust and will continue to work with different secret storage providers (e.g. Azure Storage, which is what is used when slots are enabled and will eventually become the default).
