How to move events from Azure Event Hub to Azure Blob Storage using a Logic App?

I want to move events from Azure Event Hub to Azure Blob Storage using a Logic App. Can anyone suggest how to use the Logic App connectors, triggers, and actions, with a designer example?
Note: the events are JSON and need to be stored in Blob Storage.

You can start with a blank Logic App and use the search assistant to find what you're looking for.
Typing "event hub" brings up the Event Hubs trigger, where you can set up the connection by providing the Event Hub name.
Save the event content in a variable.
You can then use a blob action (here named SaveInitialJsonToBlobStorage) to store this JSON in Blob Storage.

Related

Automatically pick up an uploaded text file from Blob Storage and import the data to Azure SQL

I have a Blob Storage account and an Azure SQL DB.
When I upload a text file to my Blob Storage, say users.txt, which contains a list of users, I need to import it into the User table in my SQL DB.
Is there a way that whenever a file arrives in Blob Storage, it triggers an event, and that event triggers something else to import the data into the SQL DB (maybe an Azure Function, a Logic App...)? That way I don't need to write any code. Is that possible? If so, could you please let me know step by step how to do it?
Any help would be highly appreciated!
Thanks!
Take a look at Azure Blob storage trigger for Azure Functions, which describes how you can use a "blob added" event to trigger an Azure Function. You can do something like below.
[FunctionName("SaveTextBlobToDb")]
public static void Run(
[BlobTrigger("container-with-text-files/{name}", Connection = "StorageConnectionAppSetting")] Stream streamWithTextFile)
{
// your logic for handling new blob (streamWithTextFile)
}
In the implementation, you can save the blob content to your SQL database. If you want to make sure that the blob is not lost due to transient errors (like issues with DB connectivity), you can first put the info about the new blob into an Azure Storage queue, and then have a separate Azure Function take each blob-info message from the queue and transfer the content to the database, as sketched below.
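A minimal sketch of that queue-based variant, using the in-process C# model and assuming hypothetical names for the queue (new-blobs) and the connection setting:

using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

// Step 1: only enqueue the blob name, so a transient DB failure cannot
// lose the event (the queue message will simply be retried).
[FunctionName("EnqueueNewBlob")]
public static void EnqueueNewBlob(
    [BlobTrigger("container-with-text-files/{name}", Connection = "StorageConnectionAppSetting")] Stream blob,
    string name,
    [Queue("new-blobs", Connection = "StorageConnectionAppSetting")] out string queueMessage)
{
    queueMessage = name;
}

// Step 2: read the blob named in the queue message and import it.
// After repeated failures the message lands on the poison queue.
[FunctionName("ImportBlobToDb")]
public static async Task ImportBlobToDb(
    [QueueTrigger("new-blobs", Connection = "StorageConnectionAppSetting")] string blobName,
    [Blob("container-with-text-files/{queueTrigger}", FileAccess.Read, Connection = "StorageConnectionAppSetting")] Stream blob)
{
    using var reader = new StreamReader(blob);
    string content = await reader.ReadToEndAsync();
    // parse the users and insert them into the User table here
}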
One solution that comes to mind, other than the options you already know, is Azure Data Factory. It is a kind of ETL tool for the cloud. It allows you to set up pipelines for data processing with defined inputs and outputs. In your scenario the input would be a blob and the output would be a SQL database record.
You can trigger the pipeline to be executed whenever a new blob is added. The docs even have an example showing just that; you can find it here.
In your case you can probably use the Copy Activity to copy the data from the blob to SQL Server. A tutorial titled "Copy data from Azure Blob storage to a SQL Database by using the Copy Data tool" is found here.
An Azure Function will do the job as well, but will involve coding. A Logic App is also a good option.
You answered your own question: an Azure Function or a Logic App. You can declaratively bind to your blob within an Azure Function, and you can use the blob trigger on a Logic App as well. Someone suggested Data Factory (this would likely be the most expensive option).

Saving JSON data from an Azure Function

I have integrated an Azure Service Bus and an Azure Function to receive a message and then update a SQL DB.
I want to save JSON created from a query against the same database to an Azure Blob.
My questions are:
1. I can save the JSON by calling the Azure Blob REST API. Is it a cloud-native pattern to call one service from another service?
2. Is sending the JSON to the Azure Service Bus and having another Azure Function save the data to the Blob an optimal approach?
3. Is there a resource other than Azure Blob for saving the JSON data from an Azure Function that would make the integration easier?
There are many ways of saving a file in Azure Blob Storage. If you want to save over HTTP, use the Azure Blob REST API. You can also use the Microsoft Azure Storage SDK, which you can integrate into your application (there are storage clients for many languages: .NET, Python, JavaScript, Go, etc.). Or, if you are using an Azure Function, you can use an output binding.
It depends... Blob Storage is not the only location where you can save JSON; you can also save JSON straight into a SQL database, for instance.
The easiest way to save from an Azure Function is to use the Azure Blob storage output binding for Azure Functions.
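A minimal sketch of the output-binding approach, assuming hypothetical names for the queue (my-queue), container (json-output), and connection settings:

using System.IO;
using Microsoft.Azure.WebJobs;

// Receives a Service Bus message and writes a JSON blob; the runtime
// performs the storage call, so no explicit REST/SDK code is needed.
[FunctionName("SaveJsonToBlob")]
public static void Run(
    [ServiceBusTrigger("my-queue", Connection = "ServiceBusConnection")] string message,
    [Blob("json-output/{rand-guid}.json", FileAccess.Write, Connection = "StorageConnection")] out string jsonBlob)
{
    // build the JSON from your database query here; assigning it to the
    // output parameter writes the blob when the function completes
    jsonBlob = message;
}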

Can Azure Event Hub ingest JSON events from Azure Blob Storage without writing any code?

Is it possible to use some ready-made construct in the Azure cloud environment to ingest the events (in JSON format) that are currently stored in Azure Blob Storage and have it submit those events directly to Azure Event Hub, without writing any (however small) custom code? In other words, I would like to use a configuration-driven approach only.
Sure. You can try to use Azure Logic Apps to realize your needs without any code, or just with some function expressions; please refer to the official documentation for Azure Logic Apps to learn more.
You can refer to my sample below to make it work.
Here is my sample to receive an event from my Event Hub and transfer it to Azure Blob Storage, creating a new blob to store the event data.
1. Create an Azure Logic App instance in the Azure portal; it should be easy for you.
2. Move to the Logic app designer tab to configure the logic flow.
3. Click the Save and Run buttons. Then use ServiceBusExplorer (downloaded from https://github.com/paolosalvatori/ServiceBusExplorer/releases) to send an event message, and check whether a new blob was created using Azure Storage Explorer. It works fine after a few minutes.

Copy data from Azure Service Bus Topic to a Storage Account inside Azure

I need to move data received in my Service Bus Topic to a Storage Account inside Azure.
I believe Azure Function is one of the good ways to achieve this.
Please share a suitable example if you have one.
Regards,
Surya
Fire up an Azure Functions project, create a function with a Service Bus trigger binding (if you're in the portal or Visual Studio, it'll offer you a template), and add a storage output binding for blob, queue, or table as appropriate. Then in the function code, just copy the relevant data from your trigger parameter to your output parameter, as in the sketch below.
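A minimal sketch of that trigger-plus-output-binding pattern, assuming hypothetical topic, subscription, and container names:

using System.IO;
using Microsoft.Azure.WebJobs;

// Copies each message arriving on the topic subscription into a new blob.
[FunctionName("TopicToBlob")]
public static void Run(
    [ServiceBusTrigger("my-topic", "my-subscription", Connection = "ServiceBusConnection")] string message,
    [Blob("topic-messages/{rand-guid}.txt", FileAccess.Write, Connection = "StorageConnection")] out string blobContent)
{
    blobContent = message; // trigger input copied straight to the output binding
}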

Message from Azure Blob Storage to Azure Service Bus

I'm trying to figure out if Azure Blob Storage has functionality similar to Amazon S3. An S3 bucket can be configured in such a way that, when a new object is created, the bucket sends a message to SQS. I'm wondering if Azure Blob Storage is able to do the same with Azure Service Bus (which is kind of similar to SQS, correct?).
The only resource I've found so far that mentions something similar is https://learn.microsoft.com/en-us/azure/storage/blobs/storage-blob-event-overview, but there is no Azure Service Bus on the right side. I know I can use Functions as a proxy, but I'm interested in a direct connection.
Any ideas?
Service Bus (I think you are comparing Service Bus with SQS and SNS in AWS) doesn't have the ability to subscribe to Blob Storage events. Event Grid (the link that you referred to) has Service Bus support on the roadmap, but no date is confirmed.
I think your best choice is Azure Functions (or a Logic App if you don't want to write code) with a Blob Storage trigger to catch events and do action X; see the sketch below.
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-storage-blob-triggered-function
Or wait a little for Event Grid, but you still get that "proxy" part.
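A minimal sketch of the Functions-as-proxy approach, assuming hypothetical container (uploads) and queue (blob-events) names:

using System.IO;
using Microsoft.Azure.WebJobs;

// Sends a Service Bus message whenever a new blob appears in the container,
// mimicking the S3 -> SQS notification pattern.
[FunctionName("BlobToServiceBus")]
public static void Run(
    [BlobTrigger("uploads/{name}", Connection = "StorageConnection")] Stream blob,
    string name,
    [ServiceBus("blob-events", Connection = "ServiceBusConnection")] out string message)
{
    message = name; // send the blob name; add any metadata you need
}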
One option is to use Logic Apps/Event Grid; you can add a trigger directly from Azure Blob Storage (https://azure.microsoft.com/it-it/blog/azure-service-bus-now-integrates-with-azure-event-grid/). Another option would be to add a blob trigger with Azure Functions and write the code to do whatever action you are looking for.
