Looking for a method to export data out of the Azure Cost Centre into a storage account. This can't be done directly because of network restrictions on the storage account (i.e. firewall rules allowing only selected networks prevent the Azure Cost Centre from exporting data to it).
What is the workaround? Can a Data Factory do this instead with APIs? Can an Azure Function do this? What are some of the options available?
Can a Data Factory do this instead with APIs?
No. Neither Logic Apps nor Azure Data Factory has a connector for the Azure Cost Centre.
Can an Azure Function do this?
Yes, you can use a Function to do this. You can make an HTTP request to get data from the Azure Cost Centre API and then upload a file with that data to Blob Storage.
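A minimal sketch of that approach, assuming the Microsoft.CostManagement query endpoint and a storage connection string exposed as an environment variable (the subscription scope, container name, blob name, and variable name are illustrative):

```python
import json
import os

import requests
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient

# Scope of the cost query; the subscription ID here is a placeholder.
SCOPE = "/subscriptions/<subscription-id>"
URL = f"https://management.azure.com{SCOPE}/providers/Microsoft.CostManagement/query?api-version=2021-10-01"

# Acquire a token for the Azure management plane.
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

# Ask for month-to-date cost, aggregated daily.
body = {
    "type": "ActualCost",
    "timeframe": "MonthToDate",
    "dataset": {
        "granularity": "Daily",
        "aggregation": {"totalCost": {"name": "Cost", "function": "Sum"}},
    },
}

resp = requests.post(URL, json=body, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()

# Upload the raw JSON result to the target container.
blob = BlobClient.from_connection_string(
    os.environ["STORAGE_CONNECTION_STRING"],  # assumed app setting
    container_name="cost-exports",            # assumed container
    blob_name="cost-month-to-date.json",
)
blob.upload_blob(json.dumps(resp.json()), overwrite=True)
```

The Function's own outbound traffic still has to be allowed through the storage account firewall, for example by running the Function in a VNet-integrated plan or allowing its outbound IPs.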
I have some e-mail attachments being saved to Azure Blob.
I am now trying to write an Azure Functions app that would connect to that blob storage, run some scripts, and re-save the file.
However, when selecting a storage account for the function, I couldn't select my blob storage account.
I went to the website and it said this:
When creating a function app, you must create or link to a general-purpose Azure Storage account that supports Blob, Queue, and Table storage. Some storage accounts don't support queues and tables. These accounts include blob-only storage accounts and Azure Premium Storage.
I'm wondering, is there any workaround for this? And if not, perhaps any other suggestions? I'm becoming a little lost in all the options and which one to actually choose.
Thanks!
EDIT: Might I add, I am writing the function in Python.
I think you are overlooking the fact that you can have multiple storage accounts. In order for an Azure Function to work you need a storage account. That storage account is used to store runtime information of the Azure Function for internal purposes like state management. This storage account is subject to restrictions as you already found out. There is no workaround for that.
However, if the function you are writing needs to access another storage account, it is free to do so. You just have to provide the details to connect to that specific storage account. That way you also have a clear separation between the storage account used by the Azure Function for its internal operations and the storage account your application connects to, over which you have total control, without having to worry that you break things by deleting internally used blobs/tables/queues.
You can have a blob-triggered function that fires when changes occur in your specific blob storage. That doesn't need to be the storage account that the Azure Function uses internally, which is the one created/selected when creating the Azure Function.
Here is a sample that shows how to add a blob-triggered Azure Function in Python. MyStorageAccountAppSetting refers to an app setting that holds the connection string to the storage account that you use for storage.
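A minimal sketch of such a function using the v2 Python programming model (the container path samples-workitems/{name} is illustrative; MyStorageAccountAppSetting must exist as an app setting holding the connection string):

```python
import logging

import azure.functions as func

app = func.FunctionApp()

# Fires whenever a blob is created or updated under the given path in the
# storage account referenced by the MyStorageAccountAppSetting app setting.
@app.blob_trigger(arg_name="myblob",
                  path="samples-workitems/{name}",
                  connection="MyStorageAccountAppSetting")
def process_attachment(myblob: func.InputStream):
    logging.info("Blob trigger fired for %s (%d bytes)", myblob.name, myblob.length)
    # Run your scripts against myblob.read() here.
```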
The snippet from the website you are quoting is for storing the function app code itself and any related modules. It does not pertain to what your function can access when the code of your function executes.
When your function executes, it will need to use the Azure Blob Storage SDK/modules to connect to your blob storage account and read the e-mail attachments. Here's a quickstart guide for using Azure Storage with Python: Quickstart with Azure Storage Blobs SDK for Python
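As a sketch of that read-process-resave flow with the azure-storage-blob package (the container and blob names, and the connection-string setting, are assumptions):

```python
import os

from azure.storage.blob import BlobServiceClient

# Connection string for the account that holds the attachments,
# assumed to be available as an app setting / environment variable.
service = BlobServiceClient.from_connection_string(
    os.environ["ATTACHMENTS_STORAGE_CONNECTION"]
)
container = service.get_container_client("email-attachments")

# Download the attachment, run your processing, and save the result back.
data = container.get_blob_client("invoice.pdf").download_blob().readall()
processed = data  # placeholder for your actual script/transformation
container.get_blob_client("processed/invoice.pdf").upload_blob(
    processed, overwrite=True
)
```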
General-purpose v2 storage accounts support the latest Azure Storage features and incorporate all of the functionality of general-purpose v1 and Blob storage accounts.
There are more integration options with GPv2 accounts including Azure Function Triggers. See: Azure Blob storage bindings for Azure Functions
For further reference: Types of storage accounts
If it's a Blob storage account, you can choose an access tier based on the frequency of access for the data (e-mail attachments); see Access tiers for Azure Blob Storage - hot, cool, and archive. If it's a general-purpose storage account, it uses the standard performance tier.
Does anyone know if there is a cost associated with Azure API egress calls? We are calling the Azure API from external sources outside Azure to extract data from Azure resources. I wanted to know if there would be a cost to pulling the data from Azure via the API.
We are not using Azure API Management; this is a custom solution that pulls data from Azure.
Thanks very much for the help.
Cheers
I have a requirement to transfer files from one blob to another through VNets deployed in different geographies and connected to each other. As I am very new to the Azure platform, I tried researching on the web but could not find a proper solution. I got a suggestion that I can achieve this by programming an App Service. Please let me know how I can achieve this.
Depending on your scenario, here are some options:
To perform a backup of the storage account across different regions, you can just set the replication parameter (while creating a new storage account) to one of these values (a minimal sketch follows after the list):
Geo-redundant storage
Read-access geo-redundant storage
Another article on HA applications:
Designing Highly Available Applications using RA-GRS
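As a sketch with the azure-mgmt-storage management SDK (the subscription ID, resource group, account name, and location are placeholders), the replication choice is just the SKU name passed at creation time:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import Sku, StorageAccountCreateParameters

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Standard_GRS = geo-redundant; Standard_RAGRS = read-access geo-redundant.
poller = client.storage_accounts.begin_create(
    "my-resource-group",
    "mystorageaccount",
    StorageAccountCreateParameters(
        sku=Sku(name="Standard_RAGRS"),
        kind="StorageV2",
        location="westeurope",
    ),
)
print(poller.result().provisioning_state)
```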
If you want to copy files from one storage account to another yourself, you can use Azure Storage events, which push an event to Event Grid every time a blob is created.
Reacting to Blob storage events
You can then use a Logic App or a Function App to copy blobs to another storage account.
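For instance, a minimal sketch of an Event Grid-triggered Python function that copies each newly created blob into a second account (the destination connection setting and container name are assumptions):

```python
import os

import azure.functions as func
from azure.storage.blob import BlobClient

app = func.FunctionApp()

@app.event_grid_trigger(arg_name="event")
def copy_blob(event: func.EventGridEvent):
    # Microsoft.Storage.BlobCreated events carry the blob URL in data.url.
    source_url = event.get_json()["url"]

    # Start a server-side copy into the destination account/container.
    dest = BlobClient.from_connection_string(
        os.environ["DEST_STORAGE_CONNECTION"],  # assumed app setting
        container_name="copies",                # assumed container
        blob_name=source_url.rsplit("/", 1)[-1],
    )
    dest.start_copy_from_url(source_url)
```

Note that if the source blob is not publicly readable, the copy URL needs a SAS token appended so the destination service can read it.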
I was monitoring microsoft.compute using a REST API client and I was hoping that I could do the same for microsoft.storage. Unfortunately, I get an error response while trying to do so:
{
"code": "ResourceNotSupported",
"message": "Resource provider not supported: microsoft.storage"
}
The REST API call I make is similar to this:
https://management.azure.com/subscriptions/xxxxxxxx/resourceGroups/xxxxx/providers/Microsoft.Storage/storageAccounts/xxxxx/providers/microsoft.insights/metricdefinitions?api-version=2016-03-01
Is there any way to get storage metrics from a REST API client?
Storage metrics are stored in a table called $MetricsTransactionsBlob. You will need to use the data plane APIs described in this link.
You simply need to access and query the table at https://<accountname>.table.core.windows.net/Tables("$MetricsTransactionsBlob")
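As a sketch, that table can be queried with the azure-data-tables package and the account key (the account name, key, and PartitionKey filter value are placeholders):

```python
from azure.core.credentials import AzureNamedKeyCredential
from azure.data.tables import TableClient

# Shared-key auth against the Table service of the monitored account.
client = TableClient(
    endpoint="https://<accountname>.table.core.windows.net",
    table_name="$MetricsTransactionsBlob",
    credential=AzureNamedKeyCredential("<accountname>", "<account-key>"),
)

# Rows are keyed by a timestamp-based PartitionKey; filter a time range.
for entity in client.query_entities("PartitionKey ge '20240101T0000'"):
    print(entity["PartitionKey"], entity.get("TotalRequests"))
```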
There is no Azure Insights for Azure Storage. You can use Storage Analytics to get at the monitoring data that's stored in the associated Table storage of the monitored account, but this will fail for blob-only and Premium storage accounts. This is a big oversight from the Azure API perspective. For Standard Azure Storage, you can get at the metric tables via the link that @Sercan provided.
If you're trying to monitor utilization of your VM disks on Premium or blob-only accounts, you can use Physical Disk performance counters on the actual VMs to measure the throughput, IO requests, etc. This is what we advise our CloudMonix users to do when they have this need.
I'm trying to enable the monitoring tiles in my Azure Storage account, but for the life of me I can't get it to work. It keeps coming up with the error "monitoring may not be enabled. Please turn on diagnostics".
I've tried ticking all the checkboxes under Settings->Diagnostics->Monitoring but nothing seems to work.
Is there a trick I'm missing?
Are you trying to monitor a Premium Azure Storage account? I don't believe those can be monitored at this time, as monitoring/analytics data for them is stored in Table Storage, and Premium Storage accounts do not support Table Storage or Queue Storage.