Copy data from Azure Service Bus Topic to a Storage Account inside Azure

I need to move data received in my Service Bus Topic to a Storage Account inside Azure.
I believe Azure Functions is one good way to achieve this.
Please share a suitable example if you have one.
Regards,
Surya

Fire up an Azure Functions project, create a function with a Service Bus trigger binding (if you're in the portal or Visual Studio then it'll offer you a template) and add a storage output binding for blob, queue or table as appropriate. Then in the function code just copy the relevant data from your trigger parameter to your output parameter.
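As an illustration, here is a minimal sketch of that pattern using the Python programming model for Azure Functions (the same idea applies in C# or any other supported language). The topic, subscription, container and connection-setting names below are placeholders you would replace with your own:

```python
import azure.functions as func

app = func.FunctionApp()

# Placeholder topic, subscription, container and connection-setting names.
@app.service_bus_topic_trigger(arg_name="msg",
                               topic_name="incoming-topic",
                               subscription_name="archive-sub",
                               connection="ServiceBusConnection")
@app.blob_output(arg_name="outputblob",
                 path="archive/{rand-guid}.json",
                 connection="StorageConnection")
def archive_message(msg: func.ServiceBusMessage, outputblob: func.Out[str]) -> None:
    # Copy the message body from the trigger parameter to the blob output parameter.
    outputblob.set(msg.get_body().decode("utf-8"))
```

The Service Bus trigger hands each topic message to the function, and the blob output binding writes the copied payload to a new blob named by the {rand-guid} binding expression.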

Related

Can azure event hub ingest json events from azure blob storage without writing any code?

Is it possible to use some ready-made construct in the Azure cloud environment to ingest the events (in JSON format) that are currently stored in Azure Blob storage and submit those events directly to Azure Event Hub without writing any (however small) custom code? In other words, I would like to use a configuration-driven approach only.
Sure. You can use Azure Logic Apps to achieve this without any code, or with just a few function expressions; please refer to the official documentation of Azure Logic Apps for more details.
The logic flow is shown in the figure below.
You can refer to my sample below to make it work.
Here is my sample: it receives an event from my Event Hub and creates a new blob in Azure Blob Storage to store the event data.
Create an Azure Logic App instance in the Azure portal; that part is straightforward.
Open the Logic app designer tab to configure the logic flow.
Click the Save and Run buttons. Then use Service Bus Explorer (downloaded from https://github.com/paolosalvatori/ServiceBusExplorer/releases) to send an event message, and check with Azure Storage Explorer whether a new blob was created. In my test it worked after a few minutes.

Read records from large xlsx/csv files and post it to azure service bus topic

We receive large feed files (Excel/CSV) of roughly 5 GB every night, and we need to read them and post the records one after another to an Azure Service Bus topic. Is this possible using Azure Data Factory, and if yes, how? Or is there another, better Azure-based solution available? Please suggest.
Data Factory doesn't support Azure Service Bus topics, so this isn't possible with Azure Data Factory alone.
For more details, please see: Supported data stores and formats.
I think Azure Functions can help you achieve it.
You can reference:
Azure Blob storage bindings for Azure Functions. This article explains how to work with Azure Blob storage bindings in Azure Functions. Azure Functions supports trigger, input, and output bindings for blobs; the article includes a section for each: Blob trigger, Blob input binding, and Blob output binding.
Azure Service Bus bindings for Azure Functions. This article explains how to work with Azure Service Bus bindings in Azure Functions. Azure Functions supports trigger and output bindings for Service Bus queues and topics.
Here is a blog about Copy data from Azure Service Bus Topic to a Storage Account inside Azure.
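As a rough sketch of that Functions approach for the CSV case, assuming the nightly file is dropped into Blob storage first, a blob-triggered Python function could parse the file and publish one Service Bus message per record. The container path, topic name and connection settings below are placeholders, and for an Excel (.xlsx) feed you would need a spreadsheet library instead of the csv module:

```python
import csv
import io
import os

import azure.functions as func
from azure.servicebus import ServiceBusClient, ServiceBusMessage

app = func.FunctionApp()

# Placeholder container path and connection-setting names.
@app.blob_trigger(arg_name="feed",
                  path="feeds/{name}",
                  connection="StorageConnection")
def publish_feed_records(feed: func.InputStream) -> None:
    # For brevity this reads the whole blob; for a real ~5 GB file you would
    # stream it in chunks (e.g. with the azure-storage-blob SDK) instead.
    reader = csv.DictReader(io.StringIO(feed.read().decode("utf-8")))

    client = ServiceBusClient.from_connection_string(
        os.environ["ServiceBusConnection"])
    with client:
        with client.get_topic_sender(topic_name="feed-records") as sender:
            for row in reader:
                # One message per record, as the question asks; batching the
                # sends would be considerably faster in practice.
                sender.send_messages(ServiceBusMessage(str(row)))
```

Note that for a 5 GB file a single execution like this can easily exceed the Consumption plan's time limit, which is what the plan discussion below is about.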
We don't know how much time this will take. Azure Functions has several hosting plans, including the Consumption plan, the Premium plan, and the App Service plan.
Azure Functions in a Consumption plan are limited to 10 minutes for a single execution. In the Premium plan, the run duration defaults to 30 minutes to prevent runaway executions, but you can modify the host.json configuration to make it unbounded for Premium plan apps.
For more details, please reference: Azure Functions Premium plan.
A single function execution may not be a good fit for such a long-running process, but Azure Functions can still help you execute it.
Hope this helps.

How to make code on an Azure VM trigger from storage blob change (like Functions do)

I've got some image processing code that I need to run in Azure. It's perfect for an Azure Function, but unfortunately requires a component with a complex installation procedure and therefore will need to run in a VM.
However, I'd like to make it behave much like an Azure Function, and trigger whenever new items arrive in blob storage.
My question is: Does Azure provide me with any handy way of doing this, or do I have to write code that polls the blob storage looking for new items?
Have a look at the Azure WebJobs SDK. It shares its API model with Functions, but you can host it in any .NET application; see its Blob Trigger support.

Message from Azure Blob Storage to Azure Service Bus

I'm trying to figure out whether Azure Blob Storage has functionality similar to Amazon S3. An S3 bucket can be configured so that, when a new object is created, the bucket sends a message to SQS. I'm wondering if Azure Blob Storage can do the same with Azure Service Bus (which is roughly similar to SQS, correct?).
The only resource I've found so far, which mentions something similar is https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-event-overview, but there is no Azure Service Bus on the right side. I know I can use Functions as a proxy, but I'm interested in direct connection.
Any ideas?
Service Bus (I think you are comparing Service Bus with SQS and SNS in AWS) doesn't have the ability to subscribe to Blob storage events. Event Grid (the link that you referred to) has Service Bus support on the roadmap, but no date is confirmed.
I think your best choice is an Azure Function (or a Logic App if you don't want to write code) with a Blob storage trigger to catch the events and perform action X.
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-storage-blob-triggered-function
Or wait a little for Event Grid, but you still get that "proxy" part.
One option is to use Logic Apps / Event Grid; you can add a trigger directly from Azure Blob storage (https://azure.microsoft.com/it-it/blog/azure-service-bus-now-integrates-with-azure-event-grid/). Another option would be to add a blob trigger with Azure Functions and write the code to do whatever action you are looking for.
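If you go the Functions route, a minimal Python sketch of that blob-trigger-to-Service-Bus "proxy" might look like the following; the container, queue and connection-setting names are invented for the example, and a topic output works the same way:

```python
import json

import azure.functions as func

app = func.FunctionApp()

# Placeholder container, queue and connection-setting names.
@app.blob_trigger(arg_name="blob",
                  path="uploads/{name}",
                  connection="StorageConnection")
@app.service_bus_queue_output(arg_name="notification",
                              queue_name="blob-created",
                              connection="ServiceBusConnection")
def notify_blob_created(blob: func.InputStream,
                        notification: func.Out[str]) -> None:
    # Forward a small "blob created" notification to Service Bus,
    # roughly the S3 -> SQS pattern the question describes.
    notification.set(json.dumps({
        "name": blob.name,
        "uri": blob.uri,
        "length": blob.length,
    }))
```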

Azure Functions - Table Storage Trigger with Azure Functions

I need a way to trigger Azure Functions when an entity is added to Azure Table storage. Is there a way to do this? When I tried to add a new Azure function, I did not see any Azure Table storage trigger; I only see Queue and Blob triggers available.
If there is no support for an Azure Table storage trigger, should I use an HTTP trigger and have Azure Table storage as an input binding?
Thanks
There is no trigger binding for Table Storage.
Here's a detailed view on what is supported by the different bindings available today:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings#overview
If there is no support for an Azure Table storage trigger, should I use an HTTP trigger and have Azure Table storage as an input binding?
Yes, this approach would work and would allow you to pass the table data as an input while relying on a separate trigger. Depending on the type of clients you are working with and your requirements, using a queue trigger is another good option.
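As an illustration of the HTTP-trigger approach, here is a small Python sketch that reads the requested entity with the Azure Tables SDK rather than a declarative input binding; the route, table name and connection setting are assumptions made for the example:

```python
import json
import os

import azure.functions as func
from azure.core.exceptions import ResourceNotFoundError
from azure.data.tables import TableClient

app = func.FunctionApp()

# Placeholder route, table name and connection setting.
@app.route(route="items/{partition}/{row}", methods=["GET"])
def get_item(req: func.HttpRequest) -> func.HttpResponse:
    table = TableClient.from_connection_string(
        os.environ["AzureWebJobsStorage"], table_name="items")
    try:
        entity = table.get_entity(
            partition_key=req.route_params["partition"],
            row_key=req.route_params["row"])
    except ResourceNotFoundError:
        return func.HttpResponse("Not found", status_code=404)
    # default=str handles non-JSON types such as datetimes.
    return func.HttpResponse(json.dumps(dict(entity), default=str),
                             mimetype="application/json")
```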
#venki What Fabio Cavalcante said is correct: Azure Functions doesn't have a trigger option for Table storage. But if your business needs the data stored in Table storage and you, as a developer, decide to use Azure Functions in your architecture, you can configure your function to use the data coming from Table storage as an input to your function. This works really well.
There is also another way to give your function an "automagic" trigger: use a Storage queue (for smaller workloads) or Service Bus (when you need a more robust mechanism), as in the sketch below.
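Here is a minimal sketch of that queue-based pattern, assuming whatever inserts the table entity also drops a message with the entity's keys onto a Storage queue (queue name and connection setting are placeholders):

```python
import azure.functions as func

app = func.FunctionApp()

# Placeholder queue name and connection setting; the writer that inserts the
# table entity is assumed to also enqueue "PartitionKey|RowKey" for it.
@app.queue_trigger(arg_name="msg",
                   queue_name="table-rows-added",
                   connection="AzureWebJobsStorage")
def on_row_added(msg: func.QueueMessage) -> None:
    partition_key, row_key = msg.get_body().decode("utf-8").split("|", 1)
    # Look up the new entity (e.g. with a Table input binding or the
    # Tables SDK) and process it here.
```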
