How to monitor IOPS for an Azure Storage Account

Having used Azure for some time now, I'm well aware of the default 20,000 IOPS limit of an Azure Storage Account. What I've yet to find, however, is up-to-date documentation on how to monitor an account's IOPS in order to determine whether or not it's being throttled. This is important when debugging performance issues for applications, VMs, and ASR replication - to name but three possible uses.
If anyone knows the correct way to keep track of an account's total IOPS, and/or whether it's being throttled at any point in time, I'd appreciate it. If there's a simple solution for monitoring this over time, all the better; otherwise, if all that exists is an API or PowerShell cmdlet, I guess I'll have to write something that saves the data periodically.

You can monitor your storage account for throttling using Azure Monitor | Metrics. There are three metrics relevant to your question:
AnonymousThrottlingError
SASThrottlingError
ThrottlingError
These metrics exist for each of the four storage services (blob, file, table, queue). If you're unsure how your storage account is being used, monitor these metrics for all four services. Things like ASR, Backup, and VMs are going to be using the blob service.
To configure this, go to the Azure Monitor | Metrics blade in the portal and select the storage account(s) you want to monitor. Then check off the metrics you're interested in. The image below shows the chart with these three metrics configured for the blob service.
You can also configure an alert based on these metrics to alert you when any of these throttling events occur.
As for measuring the IOPS for the storage account, you could monitor the Transactions metric. This isn't really measuring IOPS, but it does give you some visibility into the number of transactions (which roughly correlates with IOPS) across the storage account. You can configure this from the storage account blade by clicking Metrics in the Monitoring section, as shown below.
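If you'd rather capture this programmatically and save it over time, as the question suggests, here is a minimal Python sketch using the azure-monitor-query package. It assumes the newer Azure Monitor metrics, where throttling surfaces as the ResponseType dimension of the Transactions metric rather than as the three classic metrics above; the resource ID is a placeholder.

```python
# Minimal sketch: count throttled requests on a storage account.
# Assumes `pip install azure-monitor-query azure-identity`; the
# resource ID below is a placeholder you must replace.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricsQueryClient, MetricAggregationType

STORAGE_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Storage/storageAccounts/<account>"
)

client = MetricsQueryClient(DefaultAzureCredential())
result = client.query_resource(
    STORAGE_ID,
    metric_names=["Transactions"],
    timespan=timedelta(hours=24),
    granularity=timedelta(minutes=5),
    aggregations=[MetricAggregationType.TOTAL],
    # Throttled requests show up under the ResponseType dimension.
    filter="ResponseType eq 'ServerBusyError'",
)

for metric in result.metrics:
    for series in metric.timeseries:
        for point in series.data:
            if point.total:
                print(f"{point.timestamp}: {point.total:.0f} throttled requests")
```

Dropping the filter gives you the total transaction count, which is the closest available proxy for the account's IOPS.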

Related

Which logs/metrics should be enabled as part of the diagnostic settings for Azure Storage Accounts?

As part of a client requirement, I've been asked to set up a central log repository for different Azure workloads, including storage accounts and databases. I see a default diagnostic setting in place, but all of the settings are disabled. To enable them, we need to enable certain logs/metrics, which will then be ingested into the workspace. I want to make a cost-effective and accurate selection of the logs/metrics for storage accounts. Can someone with deeper knowledge of this domain enlighten me?
Similarly, I have to make the same decision for PostgreSQL and Cosmos DB databases. Please help me with this.
Please check the points and references below.
Selection:
Select logs only for the operations you actually need; the right selection depends on your requirements.
A good practice is to go through your agents and monitoring settings and see exactly what you are logging. Capture only the logs that matter for your monitoring purposes.
Choose the cheapest region in which to create and store your Log Analytics workspace.
If you have a very high volume of log ingestion, it is prudent to opt for an Azure commitment tier.
If you need to export Log Analytics data, filter it and send only the relevant log data rather than exporting everything.
The points above can significantly reduce your Azure bill and help you use Azure Monitor cost-effectively. See Understand Azure Monitor and Log Analytics Pricing and Cost Optimization (azurelib.com). A sketch of scripting the log selection follows below.
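If you want to script the enablement rather than click through the portal, here is a hedged sketch using the azure-mgmt-monitor Python package. All IDs are placeholders; the category names shown (StorageRead/StorageWrite/StorageDelete and the Transaction metric) are the ones exposed for the blob service.

```python
# Sketch: send selected blob-service logs and the Transaction metric
# to a Log Analytics workspace. Assumes `pip install azure-mgmt-monitor
# azure-identity`; every ID below is a placeholder.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import (
    DiagnosticSettingsResource, LogSettings, MetricSettings,
)

STORAGE_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Storage/storageAccounts/<account>"
)
WORKSPACE_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.OperationalInsights/workspaces/<workspace>"
)

client = MonitorManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Blob-service diagnostic settings live on the blobServices/default
# sub-resource, not on the storage account itself.
client.diagnostic_settings.create_or_update(
    resource_uri=f"{STORAGE_ID}/blobServices/default",
    name="central-log-repo",
    parameters=DiagnosticSettingsResource(
        workspace_id=WORKSPACE_ID,
        # Enable only the operation types you actually need to audit.
        logs=[
            LogSettings(category="StorageRead", enabled=True),
            LogSettings(category="StorageWrite", enabled=True),
            LogSettings(category="StorageDelete", enabled=True),
        ],
        metrics=[MetricSettings(category="Transaction", enabled=True)],
    ),
)
```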
Storing:
Log data accumulates in your account over time, which increases storage costs.
If you only need log data for a short period, you can reduce costs by shortening the log retention period.
Use a lifecycle policy to move data between access tiers (see the sketch after this list).
Data ingested into a Log Analytics workspace is retained at no additional charge for the first 31 days. Review the design considerations and lower the retention if you don't need more than that; see Monitoring Azure Blob Storage | Microsoft Docs.
Storage Insights is a dashboard on top of Azure Storage metrics and logs. You can use it to examine the transaction volume and used capacity of all your accounts, which can help you decide which accounts you might want to retire.
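For the lifecycle-policy point above, a minimal sketch with the azure-mgmt-storage package might look like the following; the 30- and 90-day thresholds and all names are illustrative assumptions.

```python
# Sketch: lifecycle rule that tiers block blobs to Cool after 30 days
# and deletes them after 90. Assumes `pip install azure-mgmt-storage
# azure-identity`; names and thresholds are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import (
    DateAfterModification, ManagementPolicy, ManagementPolicyAction,
    ManagementPolicyBaseBlob, ManagementPolicyDefinition,
    ManagementPolicyFilter, ManagementPolicyRule, ManagementPolicySchema,
)

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

rule = ManagementPolicyRule(
    name="tier-then-delete-logs",
    type="Lifecycle",
    definition=ManagementPolicyDefinition(
        filters=ManagementPolicyFilter(blob_types=["blockBlob"]),
        actions=ManagementPolicyAction(
            base_blob=ManagementPolicyBaseBlob(
                tier_to_cool=DateAfterModification(
                    days_after_modification_greater_than=30),
                delete=DateAfterModification(
                    days_after_modification_greater_than=90),
            )
        ),
    ),
)

# The policy name must be "default"; the rule list replaces any
# existing policy on the account.
client.management_policies.create_or_update(
    "<resource-group>", "<account>", "default",
    ManagementPolicy(policy=ManagementPolicySchema(rules=[rule])),
)
```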
Analyze:
Analyze the used capacity and monitor the use of the containers (a query sketch follows below).
You can reduce the total cost by exporting logs to a storage account and then using a serverless query solution on top of the log data; see blob storage monitoring/optimize cost for infrequent queries.
Organize data into access tiers. Log Analytics has Commitment Tiers, which can save you as much as 30 percent compared to the Pay-As-You-Go price. You should periodically review this information to determine whether you can reduce your charges by moving to another tier.
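Once the logs land in a Log Analytics workspace, you can also run the analysis from code. A hedged sketch with the azure-monitor-query package; StorageBlobLogs is the standard table for blob diagnostics, and the workspace ID is a placeholder.

```python
# Sketch: summarize blob transactions by operation over the last day.
# Assumes `pip install azure-monitor-query azure-identity`.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(
    workspace_id="<workspace-guid>",
    query=(
        "StorageBlobLogs"
        " | summarize Count = count() by OperationName"
        " | order by Count desc"
    ),
    timespan=timedelta(days=1),
)
for table in response.tables:
    for row in table.rows:
        print(row)
```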
References:
Plan and manage costs for Azure Blob Storage | Microsoft Docs
Azure Monitor Logs pricing details - Azure Monitor | Microsoft Docs
Azure Monitor Log Analytics too Expensive? Part 2 - Save Some Money | Thomas Stringer (trstringer.com)

Is it possible to see when my Azure Resources are idling?

I want to see when my resources are idling (e.g. certain resources might only be used during business hours and not used for any other background process). I'd like to do that preferably through an API call.
It all depends on the type of resource and what you want to do. You could use the Azure Monitor API or the Azure Data Explorer API with Kusto to query out specific metrics for your different services. Depending on the type of data, this may require you to have more analytics enabled.
Here are some examples based on types of services.
Azure App Service - You could query for CPU, Memory, HTTP Requests, etc. This would give you an idea of activity. These same metrics tie into the auto-scaling.
Azure VMs - CPU, Memory, Disk IO, etc. You could determine your baseline, and then you would know whether it is idle or not.
Azure Storage - Transactions, Ingress, Egress, Requests, etc. You could use that to determine if there is activity in your storage account.
As you can see, it all depends on what you want to define as idling. If the goal is to reduce costs, then that will be difficult with many of these services. You could scale your App Services up and down with some scripts, or scale in/out based on metrics. The same can be done with your Azure VMs, or by stopping and starting them. Storage cannot be adjusted in the same way, but you are only charged for storage and egress, so that cost is dictated by activity.
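As a concrete example of the "define a baseline, then check against it" approach, here is a hedged Python sketch using the azure-monitor-query package; the VM resource ID is a placeholder and the 5% threshold is an assumption you should tune per workload.

```python
# Sketch: flag hours where a VM's average CPU fell below an assumed
# idle threshold. Assumes `pip install azure-monitor-query azure-identity`.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricsQueryClient, MetricAggregationType

VM_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Compute/virtualMachines/<vm>"
)
IDLE_CPU_PERCENT = 5.0  # hypothetical "idle" baseline; tune per workload

client = MetricsQueryClient(DefaultAzureCredential())
result = client.query_resource(
    VM_ID,
    metric_names=["Percentage CPU"],
    timespan=timedelta(days=1),
    granularity=timedelta(hours=1),
    aggregations=[MetricAggregationType.AVERAGE],
)

for metric in result.metrics:
    for series in metric.timeseries:
        for point in series.data:
            if point.average is not None and point.average < IDLE_CPU_PERCENT:
                print(f"{point.timestamp}: idle (avg CPU {point.average:.1f}%)")
```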
Hope this helps.
No, this is not possible. How do you define "idling"? How would Azure know whether your service is doing anything or not? Besides, most PaaS resources cannot be stopped, so what would be the use of that?
You can use Azure Advisor to get cost optimization advice, or Azure Monitor directly to gather performance data and then analyze it, but it's not going to be trivial.

Azure functions - Unexplained storage account cost related to files

We are making use of Azure Functions (v2) extensively to fulfill a number of business requirements.
We have recently introduced a durable function to handle a more complex business process which includes both fanning out, as well as a chain of functions.
Our problem is related to how much the storage account is being used. I made a fresh deployment on an account we use for dev testing on Friday, and left the function idling over the weekend to monitor what happens. I also set a budget to alert me if the cost start shooting up.
Less than 48 hours later, I received an alert that I was at 80% of my budget, and saw that the storage account was single-handedly responsible for the entire bill. The most baffling part is that it's mostly egress and ingress on file storage, which I'm not using at all in the application! So it must be something internal to the Azure Functions implementation. I've dug around and found this. In that case the issue seems to have been solved by switching to an App Service plan, but this is not an option in our case; we must stick with the Consumption plan. I also double-checked and made sure that I don't have the AzureWebJobsDashboard setting.
Any ideas what we can try next?
Below are some interesting charts from the storage account. Note how file egress and ingress make up most of the activity on the entire account.
A ticket for this issue has also been opened on GitHub.
The link you provided actually points to AzureWebJobsDashboard as the culprit. AzureWebJobsDashboard is an optional storage account connection string for storing logs and displaying them in the Monitor tab in the portal. The storage account must be a general-purpose one that supports blobs, queues, and tables.
For performance and experience, it is recommended to use APPINSIGHTS_INSTRUMENTATIONKEY and App Insights for monitoring instead of AzureWebJobsDashboard.
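If you need to make that switch from code rather than in the portal, here is a hedged sketch with the azure-mgmt-web package; all resource names are placeholders.

```python
# Sketch: drop AzureWebJobsDashboard and point monitoring at App Insights.
# Assumes `pip install azure-mgmt-web azure-identity`.
from azure.identity import DefaultAzureCredential
from azure.mgmt.web import WebSiteManagementClient

SUB, RG, APP = "<subscription-id>", "<resource-group>", "<function-app>"

client = WebSiteManagementClient(DefaultAzureCredential(), SUB)

# Application settings are replaced wholesale, so read-modify-write.
settings = client.web_apps.list_application_settings(RG, APP)
settings.properties.pop("AzureWebJobsDashboard", None)  # drop the dashboard sink
settings.properties["APPINSIGHTS_INSTRUMENTATIONKEY"] = "<instrumentation-key>"
client.web_apps.update_application_settings(RG, APP, settings)
```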
When creating a function app in App Service, you must create or link to a general-purpose Azure Storage account that supports Blob, Queue, and Table storage. Internally, Functions uses Storage for operations such as managing triggers and logging function executions. Some storage accounts do not support queues and tables, such as blob-only storage accounts, Azure Premium Storage, and general-purpose storage accounts with ZRS replication. These accounts are filtered out of the Storage Account blade when creating a function app.
When using the Consumption hosting plan, your function code and binding configuration files are stored in Azure File storage in the main storage account. When you delete the main storage account, this content is deleted and cannot be recovered.
If you use the legacy "General Purpose V1" storage accounts, you may see your costs drop by up to 95%. I had a similar use case where my storage account costs exploded after the accounts were upgraded to "V2". In my case, we just went back to V1 instead of changing our application.
Although V1 is now legacy, I don't see Azure dropping it any time soon. You can still create one using the Azure Portal, so it could be a medium-term solution.
Some alternatives to save costs:
Try the "premium" performance tier (V2 only). It is cheaper for such workloads.
Try LRS or ZRS as the redundancy setting, depending on how critical this orchestration data is.
PS: Our use case was some Event Hubs processors that used the storage accounts for coordination and checkpointing.
PS2: Regardless of the storage account configuration, there may well be a way to reduce the traffic towards the storage account. That is just one more thing to try to reduce costs.

Is it possible to generate a custom email report for Azure metrics?

My Aim is this:
My company has 5 web apps and some other resources running on Microsoft Azure. A daily e-mail is to be sent containing:
CPU and memory utilization of the 5 webapps
DTU percentage of SQL DB
Observed capacity of Autoscale
Currently this is done manually by taking screenshots of the metrics. Could this be automated via an API or something else? I looked into the Application Insights API, but couldn't find the info for SQL databases and autoscale metrics.
If someone could just point me down the right path, that would be great. Thanks.
You can use "Azure Monitor REST API".
All metrics: contains DTU/CPU percentage, DTU/CPU limit, physical data read percentage, log write percentage, successful/failed/blocked-by-firewall connections, sessions percentage, workers percentage, storage, storage percentage, and XTP storage percentage.
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-metrics-diag-logging
and also the Azure Monitor REST API reference
https://learn.microsoft.com/en-us/rest/api/monitor/
Then you can simply use a Logic App to send the email every day.
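For example, here is a hedged Python sketch against the Azure Monitor REST metrics endpoint; the database resource ID is a placeholder, dtu_consumption_percent is the DTU-percentage metric name, and you should check the current api-version.

```python
# Sketch: read the DTU percentage of an Azure SQL database over the
# last hour (the API's default timespan). Assumes `pip install requests
# azure-identity`; the resource ID is a placeholder.
import requests
from azure.identity import DefaultAzureCredential

DB_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Sql/servers/<server>/databases/<db>"
)

token = DefaultAzureCredential().get_token("https://management.azure.com/.default")
resp = requests.get(
    f"https://management.azure.com{DB_ID}/providers/Microsoft.Insights/metrics",
    params={
        "api-version": "2018-01-01",
        "metricnames": "dtu_consumption_percent",
        "interval": "PT5M",
        "aggregation": "Average",
    },
    headers={"Authorization": f"Bearer {token.token}"},
)
resp.raise_for_status()
for metric in resp.json()["value"]:
    for series in metric["timeseries"]:
        for point in series["data"]:
            print(point["timeStamp"], point.get("average"))
```

The same endpoint works against the web apps for CPU/memory metrics, and the autoscale setting resource exposes observed-capacity metrics as well; a timer-triggered Logic App (or any scheduler plus a mail step) can then format the results into the daily e-mail.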

Azure Cloud Role included storage = extra costs?

I'm currently working out the cost analysis for my upcoming Azure project. I am tempted to use an Azure Cloud Role, because it has a certain amount of storage included in the offer. However, I have the feeling that it is too good to be true.
Therefore, I was wondering: do you have to pay transaction costs / storage costs on this "included" storage? I can't find any information about this on the Azure website, and I want to be as accurate as possible (even if the cost of transactions is almost nothing).
EDIT:
To clarify, I specifically want to know about the transaction costs on the storage. Do you have to pay a small cost per transaction on the storage (like with Blob/Table storage), or is this included in the offer as well?
EDIT 2:
I am talking about the storage included with Cloud Services (web/worker roles), not separate Table/Blob storage.
Can you clarify which offer you're referring to?
With Cloud Services (web/worker roles), each VM instance has some local storage associated with it, which is free of charge and, because it's a local disk, there are no transactions or related fees associated with this storage. As Rik pointed out in his answer, that data is not durable: it's on a single disk and will be gone forever if, say, the disk crashes.
If you're storing data in Blobs, Tables, or Queues (Windows Azure Storage), then you pay per GB ($0.095 per GB per month for geo-redundant storage, or $0.07 per GB per month for locally-redundant storage), plus a penny per 100,000 transactions. And as long as your storage account is in the same data center as your Cloud Service, there are no data egress fees.
Now we come back to the question of which offer you're referring to. The free 90-day trial, for instance, comes with 70GB of Windows Azure Storage, and 50M transactions monthly included. MSDN subscriptions come with included storage and transactions as well. If you're just working with a pay-as-you-go subscription, you'll pay for storage plus transactions.
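To make the arithmetic concrete, here is a toy estimate in Python using exactly the prices quoted above; the data volume and transaction count are hypothetical, and real prices vary by region and over time.

```python
# Toy estimate using the prices quoted above ($0.07/GB/month locally
# redundant, a penny per 100,000 transactions).
gb_stored = 250                      # hypothetical data volume
transactions_per_month = 40_000_000  # hypothetical workload

storage_cost = gb_stored * 0.07
transaction_cost = transactions_per_month / 100_000 * 0.01
print(f"storage: ${storage_cost:.2f}/mo, transactions: ${transaction_cost:.2f}/mo")
# -> storage: $17.50/mo, transactions: $4.00/mo
```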
The storage is included, but not guaranteed to be persistent. Your role could be shut down and started in a different physical location, which has no impact on the availability of your role, but you'll lose whatever you have in storage, i.e. the included storage is very much temporary.
As for transaction costs, you only pay for outgoing data, not incoming data or data within Azure (one role to another).
You pay per GB, and $0.01 per 100,000 transactions.
