Azure Event Hub Storage Container Configuration for Azure Function with Input Binding

When handling an Event Hub event with an input-bound Azure Function, is it possible to change the Storage account configured for Event Hub partition checkpointing?
Is it possible to do this with a Premium Storage account and in isolation (i.e., a different Storage account than the one selected for the Azure Function during setup)?
It seems this is possible with the EventProcessorHost, but the Function doesn't seem to expose the EventProcessorHost configuration.

I did a similar hack for another purpose.
First, stop your Azure Function.
Then create a new Premium Storage account and copy the existing containers and blobs (all the Event Hub checkpointing files, preserving the same folder structure); a sketch of this copy step follows below.
Then change the Azure Function's storage account connection string to the new Premium Storage account.
Then start your Azure Function.
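For reference, a minimal sketch of the copy step using the azure-storage-blob (v12) Python SDK. The connection string app settings are placeholders, and the container name is an assumption (Functions commonly checkpoints Event Hubs into a container named azure-webjobs-eventhub; verify the actual name in your account). Newer checkpoint stores keep offsets in blob metadata, so the metadata is copied along with the content:

import os
from azure.storage.blob import BlobServiceClient

# Placeholder connection strings for the old account and the new Premium account.
OLD_CONN = os.environ["OLD_STORAGE_CONNECTION"]
NEW_CONN = os.environ["NEW_PREMIUM_STORAGE_CONNECTION"]
# Checkpoint container; "azure-webjobs-eventhub" is an assumption, check yours.
CONTAINER = "azure-webjobs-eventhub"

src = BlobServiceClient.from_connection_string(OLD_CONN).get_container_client(CONTAINER)
dst = BlobServiceClient.from_connection_string(NEW_CONN).get_container_client(CONTAINER)
if not dst.exists():
    dst.create_container()

for item in src.list_blobs():
    blob = src.get_blob_client(item.name)
    props = blob.get_blob_properties()
    data = blob.download_blob().readall()
    # Re-uploading under the same name preserves the "folder" structure, and
    # copying the metadata preserves checkpoints stored as blob metadata.
    dst.get_blob_client(item.name).upload_blob(
        data, overwrite=True, metadata=props.metadata
    )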

Related

Blob storage compatibility with Azure Functions

I have some e-mail attachments being saved to Azure Blob storage.
I am now trying to write an Azure Functions app that would connect to that blob storage, run some scripts, and re-save the file.
However, when selecting a storage account for the function, I couldn't select my blob storage account.
I went on the website and it said this:
When creating a function app, you must create or link to a general-purpose Azure Storage account that supports Blob, Queue, and Table storage. Some storage accounts don't support queues and tables. These accounts include blob-only storage accounts and Azure Premium Storage.
I'm wondering, is there any workaround for this? And if not, perhaps any other suggestions? I'm becoming a little lost in all the options and which one to actually choose.
Thanks!
EDIT: Might I add, I'm writing the function in Python.
I think you are overlooking the fact that you can have multiple storage accounts. In order for an Azure Function to work you need a storage account. That storage account is used to store runtime information of the Azure Function for internal purposes like state management. This storage account is subject to restrictions as you already found out. There is no workaround for that.
However, if the function you are writing needs to access another storage account, it is free to do so. You just have to provide the details to connect to that specific storage account. That way you also have a clear separation between the storage account the Azure Function uses for its internal operations and the storage account your application needs to connect to, over which you have total control, without having to worry that you break things by deleting internally used blobs/tables/queues.
You can have a blob-triggered function that gets triggered when changes occur in your specific blob storage. That doesn't need to be the storage account that the Azure Function internally uses, which is created/selected when creating the Azure Function.
Here is a sample that shows how to set up a blob-triggered Azure Function in Python; MyStorageAccountAppSetting refers to an app setting that holds the connection string to the storage account you use for storage.
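A minimal sketch, assuming the Functions Python v1 programming model (a function.json next to __init__.py); the container name samples-workitems is a placeholder:

function.json:

{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "samples-workitems/{name}",
      "connection": "MyStorageAccountAppSetting"
    }
  ]
}

__init__.py:

import logging
import azure.functions as func

def main(myblob: func.InputStream):
    # Fires for every blob created or updated in the samples-workitems container.
    logging.info("Blob trigger processed %s (%s bytes)", myblob.name, myblob.length)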
The snippet from the website you are quoting is for storing the function app code itself and any related modules. It does not pertain to what your function can access when the code of your function executes.
When your function executes it will need to use the Azure Blob Storage SDK/modules to connect to your blob storage account and read the email attachments. Here's a quickstart guide for using Azure Storage with Python: Quickstart with Azure Storage Blobs SDK for Python
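As a rough sketch (assuming the azure-storage-blob v12 package; the app setting name and container name are placeholders), reading and re-saving each attachment could look like this:

import os
from azure.storage.blob import BlobServiceClient

# Connection string for YOUR blob storage account, not the function's own account.
service = BlobServiceClient.from_connection_string(
    os.environ["ATTACHMENTS_STORAGE_CONNECTION"]
)
container = service.get_container_client("email-attachments")

for item in container.list_blobs():
    blob = container.get_blob_client(item.name)
    data = blob.download_blob().readall()
    processed = data  # run your scripts over the attachment bytes here
    blob.upload_blob(processed, overwrite=True)  # re-save the file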
General-purpose v2 storage accounts support the latest Azure Storage features and incorporate all of the functionality of general-purpose v1 and Blob storage accounts.
There are more integration options with GPv2 accounts including Azure Function Triggers. See: Azure Blob storage bindings for Azure Functions
For further reference, see: Types of storage accounts
If you use a Blob storage account, you can choose an access tier based on how frequently the data (the e-mail attachments) is accessed; see Access tiers for Azure Blob Storage - hot, cool, and archive. If you use a general-purpose storage account, it uses the standard performance tier.

Azure function storage file isn't deleted with function

When I create a new Azure Function, logs and files such as host locks are also created in the specified storage account. For the Consumption plan, storage uses a File Share to store the whole function app by default.
When I want to delete my Azure Function, nothing is deleted in the storage account.
Storage account after the function was deleted:
Is that correct for the Consumption plan?
Should I delete it manually?
On either a Consumption plan or an App Service plan, a function app requires a general-purpose Azure Storage account, which supports Azure Blob, Queue, Files, and Table storage. This is because Functions relies on Azure Storage for operations such as managing triggers and logging function executions, and some storage accounts do not support queues and tables.
They are part of a resource group; if you don't delete the whole resource group, you have to delete each item separately.
Reference: https://learn.microsoft.com/en-us/azure/azure-functions/functions-scale

Copy data from Azure Service Bus Topic to a Storage Account inside Azure

I need to move data received in my Service Bus Topic to a Storage Account inside Azure.
I believe Azure Function is one of the good ways to achieve this.
Please share suitable example if you have one.
Regards,
Surya
Fire up an Azure Functions project, create a function with a Service Bus trigger binding (if you're in the portal or Visual Studio then it'll offer you a template) and add a storage output binding for blob, queue or table as appropriate. Then in the function code just copy the relevant data from your trigger parameter to your output parameter.
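For example, in Python (the topic, subscription, container path, and app setting names below are placeholders), the trigger and output are declared in function.json and the function body just moves the data across:

function.json:

{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "msg",
      "type": "serviceBusTrigger",
      "direction": "in",
      "topicName": "mytopic",
      "subscriptionName": "mysubscription",
      "connection": "ServiceBusConnection"
    },
    {
      "name": "outputblob",
      "type": "blob",
      "direction": "out",
      "path": "topic-messages/{rand-guid}.txt",
      "connection": "StorageConnection"
    }
  ]
}

__init__.py:

import azure.functions as func

def main(msg: func.ServiceBusMessage, outputblob: func.Out[str]):
    # Copy the message body from the trigger parameter to the blob output binding.
    outputblob.set(msg.get_body().decode("utf-8"))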

Transferring files from one blob to another through vnet in azure

I have a requirement where I need to transfer files from one blob to another through VNets deployed in different geographies and connected to each other. As I am very new to the Azure platform, I tried researching on the web but could not find a proper solution. I got a suggestion that I could achieve this by programming an App Service. Please let me know how I can achieve this.
It depends on your scenario; here are some options:
To perform a backup of the storage account across different regions, you can just set the replication parameter (while creating a new storage account) to one of these values:
Geo-redundant storage
Read-access geo-redundant storage
Another article on HA applications:
Designing Highly Available Applications using RA-GRS
If you want to manually copy files from one storage account to another, you can use Azure Storage events; an event is pushed to Event Grid every time a blob is created.
Reacting to Blob storage events
You can then use a Logic App or a Function App to copy blobs to another storage account.
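A sketch of the Function App variant in Python, assuming an Event Grid trigger subscribed to the source account's "Blob Created" events; the destination connection string and container name are placeholders, and the server-side copy assumes the source blob URL is readable (e.g. via a SAS token):

import os
import azure.functions as func
from azure.storage.blob import BlobServiceClient

def main(event: func.EventGridEvent):
    # A "Blob Created" event carries the URL of the new blob in its data payload.
    source_url = event.get_json()["url"]
    blob_name = source_url.rsplit("/", 1)[-1]

    dest = BlobServiceClient.from_connection_string(
        os.environ["DEST_STORAGE_CONNECTION"]
    ).get_blob_client(container="copied-blobs", blob=blob_name)

    # Start a server-side copy into the destination account; append a SAS
    # token to source_url if the source container isn't publicly readable.
    dest.start_copy_from_url(source_url)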

Message from Azure Blob Storage to Azure Service Bus

I'm trying to figure out if Azure Blob Storage has functionality similar to Amazon S3. An S3 bucket can be configured so that when a new object is created, the bucket sends a message to SQS. I'm wondering if Azure Blob Storage can do the same with Azure Service Bus (which is somewhat similar to SQS, correct?).
The only resource I've found so far, which mentions something similar is https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-event-overview, but there is no Azure Service Bus on the right side. I know I can use Functions as a proxy, but I'm interested in direct connection.
Any ideas?
Service Bus (I think you're comparing Service Bus with SQS and SNS in AWS) doesn't have the ability to subscribe to Blob storage events. Event Grid (the link that you referred to) has Service Bus support on the roadmap, but no date is confirmed.
I think your best choice is Azure Functions (or a Logic App if you don't want to write code) with a Blob storage trigger to catch events and do action X; a sketch follows after the links below.
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-storage-blob-triggered-function
Or wait a little for Event Grid, but you still get that "proxy" part.
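A sketch of the Functions "proxy" in Python, assuming a blob trigger plus the azure-servicebus (v7) SDK; the queue name and connection app setting are placeholders:

import os
import azure.functions as func
from azure.servicebus import ServiceBusClient, ServiceBusMessage

def main(myblob: func.InputStream):
    # Forward a "blob created" notification to a Service Bus queue.
    with ServiceBusClient.from_connection_string(
        os.environ["SERVICEBUS_CONNECTION"]
    ) as client:
        with client.get_queue_sender("blob-events") as sender:
            sender.send_messages(ServiceBusMessage(f"Blob created: {myblob.name}"))

Alternatively, a serviceBus output binding in function.json avoids the SDK call entirely.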
One option is to use Logic Apps / Event Grid, where you can add a trigger directly from Azure Blob storage (https://azure.microsoft.com/it-it/blog/azure-service-bus-now-integrates-with-azure-event-grid/). Another option would be to add a blob trigger with Azure Functions and write the code to do whatever action you are looking for.
