Hi, I'm trying to find information on how an Azure Function running on a Consumption plan would scale with a custom trigger. This article - https://learn.microsoft.com/en-us/azure/azure-functions/functions-scale#how-the-consumption-plan-works - seems to imply there's a custom scaling implementation per trigger, but it doesn't explain how that works with custom triggers (if at all).
Custom triggers are not supported for Azure Functions. I think the main reason for that is indeed the lack of Scaling Controller hooks.
Based on what is done in Durable Functions, you might be able to define your own triggers on top of other existing triggers (the Orchestration Trigger, for example, is built on Storage Queues) to add your specific semantics while reusing the existing scaling logic.
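As a rough sketch of that idea (isolated worker model; the CustomEvent envelope and queue name are made up for illustration), you can wrap your own semantics around a plain queue trigger so the Consumption plan's queue-based scale controller keeps working for you:

```csharp
using System.Text.Json;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

public static class CustomEventDispatcher
{
    // Hypothetical envelope carrying the custom trigger's payload and metadata.
    public record CustomEvent(string Kind, string Payload);

    // The scale controller only sees a regular queue trigger, so the
    // built-in queue-length-based scaling applies unchanged.
    [Function("CustomEventDispatcher")]
    public static void Run(
        [QueueTrigger("custom-events")] string message,
        FunctionContext context)
    {
        var log = context.GetLogger("CustomEventDispatcher");
        var evt = JsonSerializer.Deserialize<CustomEvent>(message);
        log.LogInformation("Dispatching custom event of kind {Kind}", evt?.Kind);
        // Your custom trigger semantics (routing, filtering, ...) go here.
    }
}
```

The point of the shim is that whatever produces your "custom" events only ever writes queue messages, so you inherit the queue trigger's scale-out behavior for free.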
I have a .NET isolated function with a queue trigger.
When triggering that queue from Storage Explorer or from another function using a QueueServiceClient, a new operationId is generated, so it seems I cannot correlate the calls.
Is it possible to do distributed tracing using the W3C standard for an Azure Function queue trigger? I cannot find any documentation on this.
If so, how?
Currently not supported.
The Azure Functions team will evaluate (at some unspecified point in time) whether this scenario can or will be supported. This has to do with their dependency on the team that builds the Azure.Storage.Queues SDK.
https://github.com/Azure/azure-functions-dotnet-worker/issues/1126
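Until it is supported natively, one hand-rolled workaround (a sketch under assumptions, not an official pattern) is to embed the W3C traceparent in the message body yourself and restore it in the consumer via System.Diagnostics.Activity. The envelope shape and the ActivitySource name below are hypothetical:

```csharp
using System;
using System.Diagnostics;
using System.Text.Json;
using System.Threading.Tasks;
using Azure.Storage.Queues;

public static class QueueTracing
{
    private static readonly ActivitySource Source = new("MyApp.Queue"); // hypothetical name

    // Sender: capture the current Activity's W3C traceparent and ship it
    // inside the message body (the envelope shape is made up for this sketch).
    public static async Task SendAsync(QueueServiceClient service, string payload)
    {
        var queue = service.GetQueueClient("work-items");
        var traceparent = Activity.Current?.Id; // W3C form: 00-<trace-id>-<span-id>-<flags>
        await queue.SendMessageAsync(JsonSerializer.Serialize(new { traceparent, payload }));
    }

    // Receiver (called from the queue-triggered function): restore the parent
    // context so spans created here correlate with the sender.
    public static void Process(string message)
    {
        var envelope = JsonSerializer.Deserialize<JsonElement>(message);
        ActivityContext.TryParse(
            envelope.GetProperty("traceparent").GetString(),
            traceState: null,
            out var parentContext);

        // StartActivity returns null if nothing is listening
        // (e.g., no OpenTelemetry exporter wired up).
        using var activity = Source.StartActivity(
            "ProcessWorkItem", ActivityKind.Consumer, parentContext);
        var payload = envelope.GetProperty("payload").GetString();
        // ... handle the payload ...
    }
}
```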
I have a requirement to transfer files (20-150 MB) between two systems. For this requirement, is it better to use a Durable Function instead of Azure Data Factory (ADF)? As per my understanding, ADF execution will be costlier compared to Durable Functions. Note: the Durable Function trigger is an Event Grid trigger. Any suggestions will be helpful. The file transfer is a simple pass-through; no transformation is involved.
Also, for my requirement, would a simple Azure Function work instead of a Durable Function? There is no need for function orchestration, as files are not processed in batches; each file is processed based on the event trigger.
In my experience, I would recommend using Azure Functions over ADF for the following reasons:
Cost: Azure Data Factory is expensive; ADF costs considerably more than Azure Functions.
Custom logic: ADF is not built to perform cleansing logic or run custom code. Its primary goal is data integration from external systems using its vast connector pool.
Latency: ADF has much higher latency due to the large overhead of its job framework.
Durable Functions mainly matter here because of the maximum execution time of a single call. For "out of the box" functions on the Consumption plan, that timeout defaults to 5 minutes and can be raised to at most 10 minutes (functionTimeout in host.json); for durable orchestrations this limitation is effectively removed. In your case, where you simply need to copy the data, a large file might hit the timeout, so you can consider a Durable Function for that reason; otherwise a simple function should work fine. Moreover, Durable Functions and normal functions share the same billing model.
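For the simple pass-through case, a minimal sketch (isolated worker model; the connection setting, container name, and event payload shape are placeholders) could be an Event Grid-triggered function that asks the storage service to copy the blob server-side, so the 20-150 MB payload never streams through the function itself:

```csharp
using System;
using System.IO;
using System.Text.Json.Serialization;
using System.Threading.Tasks;
using Azure.Messaging.EventGrid;
using Azure.Storage.Blobs;
using Microsoft.Azure.Functions.Worker;

public class PassThroughCopy
{
    // Minimal shape of the "Blob Created" event payload we care about.
    private record BlobCreatedData([property: JsonPropertyName("url")] string Url);

    [Function("PassThroughCopy")]
    public static async Task Run([EventGridTrigger] EventGridEvent evt)
    {
        var sourceUri = new Uri(evt.Data.ToObjectFromJson<BlobCreatedData>()!.Url);

        // "DEST_CONNECTION" and "incoming" are placeholders; the destination
        // account must be able to read the source (e.g., via SAS).
        var destination = new BlobClient(
            Environment.GetEnvironmentVariable("DEST_CONNECTION"),
            "incoming",
            Path.GetFileName(sourceUri.LocalPath));

        // Server-side copy: storage moves the bytes blob-to-blob, so the
        // call returns quickly regardless of file size.
        await destination.StartCopyFromUriAsync(sourceUri);
    }
}
```

Because the copy runs inside the storage service, even a large file is unlikely to push the function anywhere near the Consumption plan timeout.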
For more details: https://learn.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-overview?tabs=csharp
We have created an Azure Durable Function with a timer trigger. We are not using Azure Front Door. Currently, the function is deployed to the East US 2 and Central US regions in an Active-Active configuration.
The problem is that both functions execute and process the same data, so everything is handled twice, which is incorrect. I want to set this up as Active-Passive, but how should it be architected?
There is no out-of-the-box solution for this. You have to synchronize the work yourself, for example by setting a lock that prevents the other timer from doing the work when it has already been done. Alternatively, you can disable the timer-triggered function in the failover region, which means that in case of a failover you need to enable it yourself; of course, you could script that.
See also this blog post describing the solution contained in this answer.
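A hedged sketch of the lock idea, using a blob lease as a cross-region mutex (isolated worker model; the app setting, container, and blob names are placeholders, and both regions must point at the same storage account):

```csharp
using System;
using System.Threading.Tasks;
using Azure;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized;
using Microsoft.Azure.Functions.Worker;

public class RegionalTimerJob
{
    [Function("RegionalTimerJob")]
    public static async Task Run([TimerTrigger("0 */5 * * * *")] TimerInfo timer)
    {
        // Both regions target the same lock blob in a shared storage account.
        var blob = new BlobClient(
            Environment.GetEnvironmentVariable("SHARED_STORAGE"), // placeholder
            "locks", "timer-job-lock");
        if (!await blob.ExistsAsync())
            await blob.UploadAsync(BinaryData.FromString(string.Empty), overwrite: true);

        var lease = blob.GetBlobLeaseClient();
        try
        {
            // Only one region can hold the lease; the other gets a 409
            // (LeaseAlreadyPresent) and skips this run entirely.
            await lease.AcquireAsync(TimeSpan.FromSeconds(60));
        }
        catch (RequestFailedException ex) when (ex.Status == 409)
        {
            return; // the other region is the active one for this run
        }

        try
        {
            // ... do the actual work exactly once ...
        }
        finally
        {
            await lease.ReleaseAsync();
        }
    }
}
```

This assumes both timers fire within the 60-second lease window; for longer-running work you would renew the lease while the job is in progress.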
I've recently started experimenting with Azure Functions and I'd like to get some info on whether the following makes sense and is possible.
How feasible is it to build a normal .NET Core Web API following DDD patterns, but replacing the API endpoints with durable Azure Functions? Is this a possible use case for Azure Functions, i.e. making the web API "serverless"?
And how would the whole thing be structured? Does each Azure function need its own project or can they all be placed in one project?
Thanks
As I wrote in the comments: why not?
You can define bounded contexts and deploy each one to its own Azure Function app as a microservice: for instance, one service responsible for orders, another Azure Function app for delivery, and so on.
Use Durable Functions when you need to orchestrate actions: for instance, a buy flow where the first action locks the products, then payment is taken, and then the products are unlocked, so the steps depend on each other.
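That buy flow could look roughly like this as an orchestration (isolated worker model; the activity names and OrderDto are made up for this sketch):

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.Functions.Worker;
using Microsoft.DurableTask;

public static class BuyFlow
{
    public record OrderDto(string OrderId, string[] ProductIds); // hypothetical

    [Function(nameof(BuyFlowOrchestrator))]
    public static async Task BuyFlowOrchestrator(
        [OrchestrationTrigger] TaskOrchestrationContext context)
    {
        var order = context.GetInput<OrderDto>();

        // The steps depend on each other, which is what justifies
        // an orchestration rather than three independent functions.
        await context.CallActivityAsync("LockProducts", order);
        try
        {
            await context.CallActivityAsync("TakePayment", order);
        }
        finally
        {
            // Compensate: unlock whether or not payment succeeded.
            await context.CallActivityAsync("UnlockProducts", order);
        }
    }
}
```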
You can use Azure Functions with Service Bus or Azure Queue Storage for event processing.
One thing to keep in mind when you design a function is the time limitation: by default 5 minutes on the Consumption plan (configurable up to 10). So when you design a newsletter sender, for instance, keep in mind that you would need to send emails in batches.
I am looking for a simple scheduler to execute a task inside my Java web application deployed in the Azure cloud. I am evaluating Azure Functions with a TimerTrigger for this requirement. Here, I am planning to define an Azure Function with a callback API URL that invokes my application to execute the task.
I have some queries about this approach. Can anyone familiar with Azure Functions help?
1) Is it possible to initiate/reschedule/cancel an Azure TimerTrigger function from a Java application through an API at runtime?
2) If yes, is it possible to pass a callback URL to the timer trigger?
3) Is there any known drawback in using Azure functions?
Thanks!
TimerTriggers don't have an API to control this (you could try to hack one in by uploading a new function.json with the schedule you want and whether or not the timer is disabled, but I don't recommend that at all).
Instead, I'd suggest using a QueueTrigger. This lets you pass the function any data it needs in the queue item (e.g., the callback URL), and you can add items to the queue with a visibility timeout to create your schedule. If you need to cancel pending executions, just remove the item(s) from the queue. This approach is also more durable: if a queue item fails, it is automatically retried (unlike timers).
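A sketch of that idea (shown in C# for brevity; the queue name is a placeholder, and the same calls exist in the Java storage SDK): enqueue the callback URL with a visibility timeout equal to the desired delay, and let a queue-triggered function invoke it once the message becomes visible.

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;
using Azure.Storage.Queues;
using Microsoft.Azure.Functions.Worker;

public static class CallbackScheduler
{
    // Schedule: the message stays invisible until 'delay' elapses, so the
    // queue trigger fires at (roughly) the requested time. Note that the
    // visibility timeout is capped at 7 days.
    public static Task ScheduleAsync(QueueClient queue, string callbackUrl, TimeSpan delay)
    {
        return queue.SendMessageAsync(callbackUrl, visibilityTimeout: delay);
    }

    // To cancel a pending execution, delete the message before it becomes
    // visible, using the MessageId/PopReceipt returned by SendMessageAsync.

    private static readonly HttpClient Http = new();

    [Function("InvokeCallback")]
    public static Task Run([QueueTrigger("scheduled-callbacks")] string callbackUrl)
    {
        // Failed invocations are retried automatically and eventually
        // moved to the poison queue (unlike a missed timer run).
        return Http.GetAsync(callbackUrl);
    }
}
```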
3) is way too broad of a question to have an answer.