Azure function storage file isn't deleted with function - azure

When I create a new Azure function, the specified storage account also gets logs and files such as host locks written into it. On the Consumption plan, the storage account uses a File Share to store the whole function app by default.
When I delete my Azure function, nothing is deleted in the storage account.
Storage account after deletion:
Is that correct for consumption plan?
Should I delete it manually?

On either a Consumption plan or an App Service plan, a function app requires a general Azure Storage account, which supports Azure Blob, Queue, Files, and Table storage. This is because Functions relies on Azure Storage for operations such as managing triggers and logging function executions; note that some storage account types (blob-only accounts, for example) do not support queues and tables.
These resources are part of a resource group; if you don't delete the whole resource group, you have to delete each item separately (a sketch of that cleanup follows below).
reference:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-scale
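A minimal sketch of the manual cleanup, assuming the azure-mgmt-storage and azure-identity packages; the subscription ID, resource group, and account names are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Deleting the function app does not touch the storage account;
# remove it explicitly (or delete the whole resource group instead).
client.storage_accounts.delete("my-function-rg", "myfunctionstorage")
```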

Related

Blob storage compatibility with Azure Functions

I have some e-mail attachments being saved to Azure Blob.
I am now trying to write an Azure Functions app that would connect to that blob storage, run some scripts, and re-save the file.
However, when selecting a storage account for the function, I couldn't select my blob storage account.
I went on the website and it said this:
When creating a function app, you must create or link to a general-purpose Azure Storage account that supports Blob, Queue, and Table storage. Some storage accounts don't support queues and tables. These accounts include blob-only storage accounts and Azure Premium Storage.
I'm wondering, is there any workaround for this? And if not, perhaps any other suggestions? I'm becoming a little lost in all the options and which one to actually choose.
Thanks!
EDIT: Might I add, I am writing the function in Python.
I think you are overlooking the fact that you can have multiple storage accounts. In order for an Azure Function to work, you need a storage account. That storage account is used to store runtime information of the Azure Function for internal purposes like state management. This storage account is subject to the restrictions you already found out about. There is no workaround for that.
However, if the function you are writing needs to access another storage account, it is free to do so. You just have to provide the details to connect to that specific storage account. That way you also have a clear separation between the storage account that the Azure Function uses for its internal operations and the storage account your application needs to connect to, which you have total control over, without having to worry that you break things by deleting internally used blobs/tables/queues.
You can have a blob-triggered function that fires when changes occur in your specific blob storage. That doesn't need to be the storage account that the Azure Function uses internally, which is created/selected when creating the Azure Function.
Here is a sample that shows how to add a blob-triggered Azure Function in Python. MyStorageAccountAppSetting refers to an app setting that holds the connection string to the storage account that you use for storage.
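A minimal sketch, assuming the Python v2 programming model; the container path and function name are illustrative:

```python
import logging
import azure.functions as func

app = func.FunctionApp()

# "MyStorageAccountAppSetting" must exist in the function app's
# application settings and hold the storage account's connection string.
@app.blob_trigger(arg_name="myblob",
                  path="attachments/{name}",  # illustrative container/path pattern
                  connection="MyStorageAccountAppSetting")
def process_attachment(myblob: func.InputStream):
    logging.info("Blob trigger fired for %s (%s bytes)", myblob.name, myblob.length)
```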
The snippet from the website you are quoting is for storing the function app code itself and any related modules. It does not pertain to what your function can access when the code of your function executes.
When your function executes it will need to use the Azure Blob Storage SDK/modules to connect to your blob storage account and read the email attachments. Here's a quickstart guide for using Azure Storage with Python: Quickstart with Azure Storage Blobs SDK for Python
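As a rough sketch of that read/re-save flow with the azure-storage-blob package; the container, blob name, and run_scripts helper are placeholders:

```python
from azure.storage.blob import BlobServiceClient

# Connection string for YOUR blob storage account (not the one the
# function uses internally), e.g. taken from MyStorageAccountAppSetting.
service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client(container="attachments", blob="mail-001.pdf")

data = blob.download_blob().readall()        # read the attachment
processed = run_scripts(data)                # placeholder for your processing
blob.upload_blob(processed, overwrite=True)  # re-save the file
```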
General-purpose v2 storage accounts support the latest Azure Storage features and incorporate all of the functionality of general-purpose v1 and Blob storage accounts.
There are more integration options with GPv2 accounts including Azure Function Triggers. See: Azure Blob storage bindings for Azure Functions
Further reference: Types of storage accounts.
If you use Blob storage, you can choose an access tier based on the frequency of access for the data (the e-mail attachments); see Access tiers for Azure Blob Storage - hot, cool, and archive. A general-purpose storage account uses the standard performance tier.
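For example, with azure-storage-blob a blob's tier can be set (or changed later) without re-uploading; the names below are placeholders:

```python
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client(container="attachments", blob="mail-001.pdf")

# Move a rarely-read attachment to the Cool tier; "Hot" and "Archive"
# are the other standard options.
blob.set_standard_blob_tier("Cool")
```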

Azure Event Hub Storage Container Configuration for Azure Function with Input Binding

When handling an Event Hub event with an input-bound Azure Function, is it possible to change the storage account configured for Event Hub partition checkpointing?
Is it possible to do this with a Premium Storage account and in isolation (ie a different Storage account than the account selected for the Azure Function during set up)?
It seems that this is possible with the EventProcessorHost but the Function doesn't seem to expose the EventProcessorHost configuration.
I did a similar hack for some other purpose.
First, stop your Azure Function.
Then create a new Premium storage account and copy the existing blobs and containers (all Event Hub checkpointing files, with the same folder structure).
Then change the Azure Function's storage account connection string to the new Premium storage account.
Finally, start your Azure Function again.
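A rough sketch of the copy step with azure-storage-blob, assuming the default checkpoint container name that Functions uses (azure-webjobs-eventhub); both connection strings are placeholders:

```python
from azure.storage.blob import BlobServiceClient

src = BlobServiceClient.from_connection_string("<old-account-connection-string>")
dst = BlobServiceClient.from_connection_string("<new-premium-connection-string>")

container = "azure-webjobs-eventhub"  # default checkpoint container (assumption)
src_container = src.get_container_client(container)
dst_container = dst.get_container_client(container)
dst_container.create_container()

# Blob names encode the namespace/eventhub/consumer-group/partition
# folder structure, so copying by name preserves it.
for props in src_container.list_blobs():
    data = src_container.download_blob(props.name).readall()
    dst_container.upload_blob(props.name, data)
```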

Azure Functions storage account: Used capacity is constantly rising

I use a storage account to host 3 simple Azure Functions, which perform read, write and delete operations on a database. Surprisingly the Used Capacity value under Metrics is constantly increasing (please see screenshot below). Why? Those functions don't write anything to this storage. Since one pays for capacity used - am I going to pay more and more, if it continues to increase like that? I am a "Pay-as-you-go" customer by the way.
Edit#1: If I check the folder size of File Shares it says 2MB (see below). No clue where the values in Metrics are coming from...
Edit#2: Below are the Application Settings. Just the default values + Link to the MongoDB Atlas cluster. Could it be related to AzureWebJobsStorage?
Every Azure Function requires a storage account.
Storage account requirements
When creating a function app in App Service, you must create or link to a general-purpose Azure Storage account that supports Blob, Queue, and Table storage. Internally, Functions uses Storage for operations such as managing triggers and logging function executions.
Note: When using the Consumption hosting plan, your function code and binding configuration files are stored in Azure File storage in the main storage account. When you delete the main storage account, this content is deleted and cannot be recovered.
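To see where the used capacity actually lives, here is a small sketch that sums blob sizes per container in the AzureWebJobsStorage account; Functions' internal containers (e.g. azure-webjobs-hosts) often grow over time from these logs:

```python
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<AzureWebJobsStorage connection string>")

# Sum blob sizes per container to find what is consuming capacity.
for container in service.list_containers():
    client = service.get_container_client(container.name)
    total = sum(b.size for b in client.list_blobs())
    print(f"{container.name}: {total / 1024 / 1024:.2f} MiB")
```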

Transferring files from one blob to another through vnet in azure

I have a requirement to transfer files from one blob to another through VNets deployed in different geographies and connected to each other. As I am very new to the Azure platform, I tried researching on the web but could not find a proper solution. I got a suggestion that I can achieve this by programming an App Service. Please let me know how I can achieve this.
It depends on your scenario; here are some options:
To perform a backup of the storage account across different regions, you can just set the replication parameter (while creating a new storage account) to one of these values:
Geo-redundant storage
Read-access geo-redundant storage
Another article on HA applications:
Designing Highly Available Applications using RA-GRS
If you want to copy files from one storage account to another manually, you can use Azure Storage events: an event is pushed to Event Grid every time a blob is created.
Reacting to Blob storage events
You can then use a Logic App or a Function App to copy blobs to another storage account.
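A minimal sketch of such a function in Python (v2 programming model); the destination connection string and container name are placeholders, and the server-side copy requires the source URL to be reachable (public or carrying a SAS):

```python
import azure.functions as func
from azure.storage.blob import BlobClient

app = func.FunctionApp()

@app.event_grid_trigger(arg_name="event")
def copy_blob(event: func.EventGridEvent):
    # Microsoft.Storage.BlobCreated events carry the blob URL in the payload.
    source_url = event.get_json()["url"]
    blob_name = source_url.rsplit("/", 1)[-1]

    dest = BlobClient.from_connection_string(
        "<destination-connection-string>",
        container_name="backup",           # placeholder destination container
        blob_name=blob_name)
    dest.start_copy_from_url(source_url)   # server-side, asynchronous copy
```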

Azure Storage account consistently only adding Blob storage, missing Table/Queue/Files

Whenever I create a new Storage (classic) account through the Azure portal I consistently have issues whereby the Table/Queue/File storage is not created at all, leaving the account with only Blob storage, like this:
Instead of like this (separate account):
I have tried this multiple times, all with the same result. I don't see how I can be getting this wrong, as there are only four options on the form to create the account, and none of them govern the content of the account.
When I then attempt to create a new Table or Queue in this new account I get a 502 Bad Gateway error.
Am I missing something here? Can anyone tell me how I can add the required storage types to the account.
Not sure what's up with the portal, but a storage account always comprises blob, table, queue, and file storage (unless you create a Premium storage account - that's strictly blobs).
You should be able to confirm this by creating an app to, say, create, write, and read from a queue or table.
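For instance, a quick round-trip through a queue with the azure-storage-queue package (names are placeholders) should succeed on any non-premium general-purpose account:

```python
from azure.storage.queue import QueueClient

queue = QueueClient.from_connection_string("<connection-string>", queue_name="healthcheck")
queue.create_queue()           # fails on blob-only / premium accounts
queue.send_message("ping")
msg = queue.receive_message()  # read the message back
print(msg.content if msg else "no message")
```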
EDIT I see you edited your question, showing that you did try to create a table/queue. If this is a non-premium account, I suggest reaching out to support, as this makes no sense.
EDIT 4/2017: Aside from Premium storage accounts (which only have page blobs), there is another type of general (non-premium) storage account, specific to blobs only, where you won't be able to create tables and queues. It's not available via the "Classic" deployment model, only via the "Resource Manager" deployment model:
In my case the issue was due to selecting Zone Redundant Storage (ZRS).
Since ZRS accounts only support Block Blobs, you will not see the table, queue or file endpoints listed on the portal for the new account.
https://blogs.msdn.microsoft.com/windowsazurestorage/2014/08/01/introducing-zone-redundant-storage/
Recreating the storage account using Geo-Redundant Storage (GRS) worked.
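For reference, a hedged sketch of creating a GRS account via the Resource Manager SDK (azure-mgmt-storage; all names are placeholders), which avoids the classic ZRS limitation described above:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

poller = client.storage_accounts.begin_create(
    "my-rg",          # resource group (placeholder)
    "mystorageacct",  # account name (placeholder)
    {
        "location": "westeurope",
        "kind": "StorageV2",
        "sku": {"name": "Standard_GRS"},
    },
)
print(poller.result().primary_endpoints)  # blob/queue/table/file endpoints
```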
