How to test Azure Function scalability?

We're new to Azure Function Apps, and we've heard that one of their great features is scalability. How does Azure Function scaling actually work? Does it scale automatically behind the scenes, or is there a mechanism we can configure, for example a maximum scale-out limit?
When we debug an Azure Function locally (we've tried ServiceBusTrigger, EventHubTrigger, QueueTrigger and CosmosDBTrigger), the same function instance seems to be called over and over while we keep sending messages, which is not the parallel scaling we expected. Is there a good way to test the scaling behavior locally?

The scaling of Azure Functions is determined by the scale controller.
The scale controller only runs in the cloud, so it is not possible to test the scaling locally. The inner workings of this controller are also not disclosed.
The best way to test the scaling is to do a proof of concept in the cloud and make sure you configure Application Insights. Once you have load tested your function app, you can run a Log Analytics query such as the following one to see whether multiple instances of your function app have been provisioned:
requests
| project timestamp, id, operation_Id, operation_Name, duration, cloud_RoleName, cloud_RoleInstance
| where cloud_RoleName =~ 'FUNCTION_APP_NAME'
| order by timestamp desc
| take 100
The cloud_RoleInstance column contains the ID of the instance that handled the request. When that column contains multiple values, you know that scaling has occurred.
To be honest, testing whether Azure Functions autoscales should not be a primary concern for you, since that is Azure's responsibility. You probably need the autoscaling to handle both small and large workloads, and you might have time constraints within which the processing should be finished. If that is your real concern, then you might be better off measuring the end-to-end performance/timings.
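One way to measure that end-to-end timing is to log, per message, how long it sat in the queue before a function picked it up. Below is a minimal sketch for an in-process C# Storage Queue function (the queue name work-items and the function name are made up; insertionTime is standard queue trigger binding metadata):

using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class LatencyProbe
{
    [FunctionName("LatencyProbe")]
    public static void Run(
        [QueueTrigger("work-items")] string message,   // hypothetical queue name
        DateTimeOffset insertionTime,                   // trigger metadata: when the message was enqueued
        ILogger log)
    {
        // How long the message waited before an instance picked it up.
        var queueWait = DateTimeOffset.UtcNow - insertionTime;
        var started = DateTimeOffset.UtcNow;

        // ... the actual processing would go here ...

        var processing = DateTimeOffset.UtcNow - started;
        log.LogInformation("Queue wait: {WaitMs} ms, processing: {ProcMs} ms",
            queueWait.TotalMilliseconds, processing.TotalMilliseconds);
    }
}

Both values end up as traces in Application Insights, so you can chart them against load and see whether scale-out keeps the queue wait flat.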

The scalability of an Azure Function depends on its hosting plan, and there are three types of hosting plan: the Consumption plan, the Premium plan (still in preview, so we can ignore it for now), and the Dedicated (App Service) plan.
On the Consumption plan, the function app scales automatically based on the number of incoming events.
On an App Service plan, you can manually scale out by adding more VM instances, or you can enable autoscale. For more details, refer to this article.
When you run the function locally there is no hosting plan involved, so you will not see this scaling behavior.
Hope this helps.

Related

Azure Functions scalability issue

I am using Azure Functions on the App Service plan. My understanding is that for every new execution the Azure Function will create a new App Service, execute the function and then shut down the App Service, and that nothing is shared between the multiple App Services spawned by multiple requests.
However, when I test my Function (which does video processing), a single request takes around 2-3 minutes, while multiple simultaneous requests take 10-15 minutes. My questions are: is my understanding above correct? If not, what resource is shared among these App Services? And how should I decide my scaling options (manual vs auto)?
"My understanding is for every new execution the Azure Function will create a new App Service" Nope it will not run new instance each time. Generally if there is no load on AF it will stop all instances.
When the first request/event comes in, the first instance is started; this is why serverless has cold starts. After that, the scale controller measures the instance's performance (memory and CPU consumption) and decides whether it needs to scale, but that is not instant. So if you send N requests to do something with a video, they can all land on that same first instance and increase its load. The function app will then scale out because of the CPU spike, but that won't help the requests already being handled by the first instance. Keep in mind that for non-HTTP triggers, new instances are allocated at most once every 30 seconds, which means your function app needs a sustained CPU spike of at least 30 seconds before a new instance is added: https://learn.microsoft.com/en-us/azure/azure-functions/event-driven-scaling
I am not sure Azure Functions are a good option for video processing. Azure Functions should be used for quick work, usually not more than 30 seconds I would say, and there are execution-time limits that depend on how you run it: https://learn.microsoft.com/en-us/azure/azure-functions/functions-premium-plan?tabs=portal
I'm not sure what type of video processing you are doing, but I would have a look at Azure Media Services.
The other option, as you mentioned, is Batch jobs with low-priority VMs: https://azure.microsoft.com/en-au/blog/announcing-public-preview-of-azure-batch-low-priority-vms/ - your use case (media processing and transcoding, rendering and so on) is actually a good fit for that.
A small addition to Vova's answer: if you're running your Function in an App Service (also known as a Dedicated plan), it will only scale within the limits of the App Service plan you defined. By default that means all executions of your Function App run on the same virtual machine, which is most probably why you're seeing request times increase with more requests.
If you want your Functions to scale beyond the capabilities of that plan, you will need to manually scale or enable autoscaling for the App Service plan.
An App Service plan defines a set of compute resources for an app to run. These compute resources are analogous to the server farm in conventional hosting.
and
Using an App Service plan, you can manually scale out by adding more VM instances. You can also enable autoscale, though autoscale will be slower than the elastic scale of the Premium plan. [...] You can also scale up by choosing a different App Service plan.
If you run your Function App on the Consumption plan (the true serverless hosting option, since it enables scaling to zero):
The Consumption plan scales automatically, even during periods of high load.
If you need longer execution times than the Consumption plan allows, but an App Service plan doesn't seem like the best hosting environment for your Functions, there's also the Premium plan.
The Azure Functions Elastic Premium plan is a dynamic scale hosting option for function apps.
Premium plan hosting provides the following benefits to your functions:
Avoid cold starts with perpetually warm instances
Virtual network connectivity.
Unlimited execution duration, with 60 minutes guaranteed.
Premium instance sizes: one core, two core, and four core instances.
More predictable pricing, compared with the Consumption plan.
High-density app allocation for plans with multiple function apps.
More info on all the different Azure Functions hosting options.

Azure App Service Plan: Function vs App Service?

When hosting an Azure Function in an App Service Plan, are there any significant differences compared with using an App Service (EDIT: for a restful API) associated with the same App Service Plan? I assume the only difference is that the Function offers additional out of the box triggers. Any differences I'm missing that would lead to preferring one over the other?
I'd also like to confirm that hosting Azure Functions in an App Service Plan can actually limit scalability if scaling is not configured on the App Service Plan. As I understand it, Functions automatically scale as needed when using Consumption or Premium hosting without additional configuration.
When hosting an Azure Function in an App Service Plan, are there any significant differences compared with using an App Service associated with the same App Service Plan? I assume the only difference is that the Function offers additional out of the box triggers. Any differences I'm missing that would lead to preferring one over the other?
Well, an Azure Function is a different beast than an App Service. An Azure Function is triggered by an external event or a timer and then executes the code of the function. When hosted on a Consumption plan, this execution is allowed to run for 5 minutes by default (10 minutes at most). When you need a longer execution time, you need to run it on an App Service plan.
An App Service can host any app you've created, such as a website (which runs continuously and doesn't need to be triggered before it starts doing something) or an API.
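As a minimal illustration of that trigger model (in-process C#; the schedule and names are made up), a function like this only runs when its timer fires and is otherwise idle:

using System;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class NightlyCleanup
{
    [FunctionName("NightlyCleanup")]
    public static void Run(
        [TimerTrigger("0 0 1 * * *")] TimerInfo timer,  // CRON: every day at 01:00 (UTC by default)
        ILogger log)
    {
        log.LogInformation("Cleanup triggered at {Now}", DateTime.UtcNow);
        // Short-lived work goes here; on the Consumption plan it must finish within the timeout.
    }
}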
I'd also like to confirm that hosting Azure Functions in an App Service Plan can actually limit scalability if scaling is not configured on the App Service Plan. As I understand it, Functions automatically scale as needed when using Consumption or Premium hosting without additional configuration.
Correct: when hosting Azure Functions in an App Service plan, you are responsible for making sure the App Service is scaled so the function performs well under load. That is exactly what the Consumption plan handles for you, so the developer can focus on the functionality and does not need to worry about the infrastructure.
So for integration scenarios, Azure Functions are a very natural fit. For websites, an App Service might be the better solution.
To address your comment:
I should have mentioned that this question was in the context of hosting a restful API and not a UI application. In this scenario, I'm not seeing much difference between a Function and App Service, but please correct me if I'm missing something
A couple of things. For one, there is a certain sweet spot: if traffic is heavy enough, a Consumption-plan-based Azure Function might be more costly than a dedicated App Service plan. That depends, of course, on a lot of factors (CPU usage, request duration, etc.). Also, you won't be able to use things like ASP.NET Core middleware out of the box.
Finally, I'd argue that once your API grows large enough, managing a single ASP.NET Core solution may be easier than managing lots of small Azure Function apps, or one Azure Functions project with lots and lots of functions. But hey, that's just my opinion (I haven't actually dealt with it, to be honest).
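For reference, this is roughly what a REST-style endpoint looks like as an HTTP-triggered function (in-process C#; the route and names here are made up). Cross-cutting concerns you would normally put in ASP.NET Core middleware have to be handled inside each function or through Functions-specific extension points instead:

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class OrdersApi
{
    [FunctionName("GetOrder")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = "orders/{id}")] HttpRequest req,
        string id,          // bound from the route
        ILogger log)
    {
        log.LogInformation("Fetching order {Id}", id);
        // No middleware pipeline: auth level, routing and response shaping are declared per function.
        return new OkObjectResult(new { id, status = "sample" });
    }
}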
Some resources to consider:
https://www.taztopia.com/single-post/2019/07/28/azure-function-vs-web-app-aka-serverless-vs-paas
https://dasith.me/2018/01/20/using-azure-functions-httptrigger-as-web-api/
The main difference is in how you pay for it:
With the Azure Functions Consumption plan, you pay per execution.
With Azure Functions in an App Service (Dedicated) plan, you pay for the allocated hardware per minute.
You also have control of which VNET your functions run in when you use an app service plan. You may have security requirements that make this important.
You are correct that if you run it in an app service that is not configured to scale, then throughput will be limited by the allocated hardware.
For details see: https://learn.microsoft.com/en-us/azure/azure-functions/functions-scale
If you have a limited and predictable workload, deploy the Azure Function in an App Service plan, which supports VNET integration for private compute. Otherwise, go for the Premium plan, which provides autoscaling of your compute environment.

Pricing for deploying multiple Azure Functions to the same App Service Plan (Elastic Premium Plan - EP1)

I'm planning to promote a set of Azure Functions (~15) to production. For development purposes we're currently using the Consumption plan, but the cold-start nature of this plan might impact the overall experience.
I've been searching for the best cost-benefit plan for deploying the application and found that the Elastic Premium plan EP1 would fit our needs (e.g. no cold starts, rapid scale out, "... share an App Service Plan across multiple function apps ...", etc.).
The problem is that I didn't find precisely how this Plan would be charged...
In the scenario described, would I be charged 388.67 BRL for each of the approx. 15 functions deployed to the plan? Or is the charge for the plan itself, with the functions sharing the plan's resources?
And also, would all the Function Apps on the plan be pre-warmed?
EDIT:
Even though I couldn't find the answers clearly in the official documentation, I created the EP1 plan and deployed the Functions.
I found that for a given App Service plan (e.g. Elastic Premium EP1) I can deploy many Function Apps to it, and they share the resources of that plan, increasing the "app density".
I still don't have an answer to the cold-start question: if I deploy the 15 Function Apps to that plan, will they all be pre-warmed? I found that I can set "pre-warmed=1" on every Function App, but I'm still experiencing cold starts.
No. If you don't set the pre-warmed instance count, only one instance will be pre-warmed by default. You need to go to the Platform Features tab to change the number of pre-warmed instances.
If you only care about the cost of the premium plan, you can use this calculator to get the cost:
https://azure.microsoft.com/en-us/pricing/calculator/?service=functions
And these are the documents about the Premium plan:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-premium-plan#features
https://learn.microsoft.com/en-us/azure/azure-functions/functions-scale#premium-plan
Let me know whether this answered your question. :)

Saving on Azure billing cost with App Services?

I have a .NET Core application currently running as an Azure App Service, and I need it to do a lot of 'work' only about a few times a day. In order to save on the hourly billing, this is the solution I developed:
Using a runbook (Azure Automation): scale the App Service Plan to the 'Free' tier at 7:00 PM
Using a runbook (Azure Automation): scale the App Service Plan back up to the premium tier at 8:00 AM
Hard-code my .NET Core application to ensure it only does the heavy 'work' between 8:00 AM and 7:00 PM
This is fine as it saves me a significant portion of cost, as I'm only paying for the hours in which the App Service Plan is scaled up to the premium tier. However it is definitely not ideal.
My question is - what design pattern should I implement in order to accomplish what I'm trying to do? I need a lot of compute resources but only for a few hours out of the day. I know AWS has 'spot' instances that you can configure - is there a similar mechanism in Azure?
Ideally I could implement a solution that involves me only paying for those heavy compute resources when I actually need it (e.g.: a few times a day, while the sun is up)
Thank you for any insight and help!
EDIT: Regarding the type of computation, it is essentially a few ML.NET trainers running in parallel, plus some moderate Elasticsearch document writing.
It is pretty tough to answer this with the whole description of your workload being a "lot" of "heavy compute".
If you can put your "compute" into Azure Functions, going serverless with a consumption plan will probably be the nicest solution. However, individual function executions have a given timeout, so you need to see if your app fits the bill.
As an alternative, you can put your application into an Azure Container Instance, and spin that up on demand.
If you have a REALLY high workload, you can use Azure Batch. If your current workload can be done on an App Service plan, this may be "overkill".
The equivalent to AWS spot instances is called Azure Spot Virtual Machines. You can also use them with Azure Batch.
Yes, you can switch to serverless: host the front end on a Storage Account and move the back end to Azure Functions (Consumption plan).
PS: If it involves long-running processing, that may not be the best solution unless you use Durable Functions.
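A minimal Durable Functions sketch (C#, Durable Functions 2.x; all names here are hypothetical) of that pattern: the orchestrator only coordinates, while the heavy work is fanned out to activity functions so no single execution runs up against the timeout:

using System.Linq;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.DurableTask;

public static class HeavyWorkOrchestration
{
    [FunctionName("HeavyWorkOrchestrator")]
    public static async Task RunOrchestrator(
        [OrchestrationTrigger] IDurableOrchestrationContext context)
    {
        // Split the workload, then fan out one activity per chunk and wait for all of them.
        var chunks = await context.CallActivityAsync<string[]>("SplitWork", null);
        await Task.WhenAll(chunks.Select(c => context.CallActivityAsync("ProcessChunk", c)));
    }

    [FunctionName("SplitWork")]
    public static Task<string[]> SplitWork([ActivityTrigger] object input)
    {
        // Hard-coded here for the sketch; in reality this would inspect the job.
        return Task.FromResult(new[] { "part-1", "part-2", "part-3" });
    }

    [FunctionName("ProcessChunk")]
    public static Task ProcessChunk([ActivityTrigger] string chunk)
    {
        // The heavy compute for one chunk goes here.
        return Task.CompletedTask;
    }
}

The orchestration still needs a starter function (HTTP- or queue-triggered) that calls IDurableOrchestrationClient.StartNewAsync, which is omitted here.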

Azure Functions not Running Fast Enough

I have an Azure Function that reads jobs from a storage queue, executes them, and then grabs more. Lately I have been getting more jobs for it to run and have noticed that the queue is building up.
What can I do from an Azure Perspective to get better performance out of this? Each job runs in its own little world so adding a new instance or adding threads or attaching to a "better" machine would all work fine.
Things come to mind with the information provided:
For more pure power: Host your Azure Function in a dedicated App Service plan instead of using the consumption plan. You can scale up (better hardware) or out (more hardware). Be aware that this could also be worse in theory. I would give it a try. Or try the "premium consumption plan" mentioned by Ken.
More parallelism: If your queue builds up even though you are not using most of your resources, try playing with the configuration parameters batchSize and newBatchThreshold (see the sketch after this list).
Changed execution logic: Depending on where most of your time is spent during function execution, Durable Functions might help. Based on your comments, you might also try to cache the external data using a static field or Azure Cache for Redis.
Look at the most common performance considerations
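To make the "more parallelism" point concrete: with a Storage Queue trigger, each instance pulls messages in batches, so a single instance already runs many jobs concurrently. A rough sketch (in-process C#; queue name and delay are placeholders):

using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class JobWorker
{
    // Per instance, the queue listener grabs up to batchSize messages at once and fetches the next
    // batch when the number still in flight drops to newBatchThreshold, so one instance can run
    // roughly batchSize + newBatchThreshold jobs in parallel. batchSize defaults to 16; both
    // settings are configured in host.json under extensions > queues.
    [FunctionName("JobWorker")]
    public static async Task Run(
        [QueueTrigger("jobs")] string jobMessage,   // hypothetical queue name
        ILogger log)
    {
        log.LogInformation("Processing {Job}", jobMessage);
        await Task.Delay(1000);                     // stand-in for the real work
    }
}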
Premium plan (Preview)
The Azure Functions Premium plan provides the same features and scaling mechanism used on the Consumption plan (based on the number of events), with enhanced performance and VNET access. The Premium plan is billed on a per-second basis, based on the number of vCPU-s and GB-s your premium functions consume.
In order to use the Azure Functions Premium Plan private preview your subscription needs to be added to an allowlist. Please apply for access via http://aka.ms/functionspremium.
More Info:
https://github.com/Azure/Azure-Functions/blob/master/functions-premium-plan/overview.md
