Automatically pick up an uploaded text file from Blob Storage and import the data to Azure SQL

I have a Blob Storage account and an Azure SQL DB.
When I upload a text file to my Blob Storage, say users.txt, it contains a list of users that I need to import into a User table in my SQL DB.
Is there a way that whenever a file arrives in Blob Storage, it triggers an event, and that event in turn imports the data into the SQL DB (perhaps via an Azure Function, Logic App...), so that I don't need to write any code? Is that possible? If so, could you please let me know step by step how to do it?
Any help would be highly appreciated!
Thanks!

Take a look at the Azure Blob storage trigger for Azure Functions, which describes how you can use a "blob added" event to trigger an Azure Function. You can do something like the following:
[FunctionName("SaveTextBlobToDb")]
public static void Run(
    [BlobTrigger("container-with-text-files/{name}", Connection = "StorageConnectionAppSetting")] Stream streamWithTextFile)
{
    // your logic for handling the new blob (streamWithTextFile)
}
In the implementation, you can save the blob content to your SQL database. If you want to make sure that the blob is not lost due to transient errors (like issues with DB connectivity), you can first put the info about the new blob into an Azure storage queue, and then have a separate Azure Function take each blob-info message from the queue and transfer the content to the database.
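For illustration, here is a minimal sketch of the direct approach, assuming users.txt holds one user name per line, a Users table with a Name column, and a "SqlConnectionString" app setting (these names are placeholders, not from the original post); it extends the blob-triggered function above to write each line to the database:

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Data.SqlClient;
using Microsoft.Extensions.Logging;

public static class SaveTextBlobToDb
{
    [FunctionName("SaveTextBlobToDb")]
    public static async Task Run(
        [BlobTrigger("container-with-text-files/{name}", Connection = "StorageConnectionAppSetting")] Stream streamWithTextFile,
        string name,
        ILogger log)
    {
        // Read the uploaded text file line by line and insert each user into the database.
        using var reader = new StreamReader(streamWithTextFile);
        using var conn = new SqlConnection(Environment.GetEnvironmentVariable("SqlConnectionString"));
        await conn.OpenAsync();

        string line;
        while ((line = await reader.ReadLineAsync()) != null)
        {
            if (string.IsNullOrWhiteSpace(line)) continue; // skip blank lines

            using var cmd = new SqlCommand("INSERT INTO Users (Name) VALUES (@name)", conn);
            cmd.Parameters.AddWithValue("@name", line.Trim());
            await cmd.ExecuteNonQueryAsync();
        }

        log.LogInformation($"Imported users from blob {name}");
    }
}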

One solution that comes to mind, other than the options you already know, is Azure Data Factory. It is a kind of ETL tool for the cloud. It allows you to set up pipelines for data processing with defined inputs and outputs. In your scenario the input would be a blob and the output would be a SQL Server database record.
You can trigger the pipeline to be executed whenever a new blob is added. The docs even have an example showing just that; you can find it here.
In your case you can probably use the Copy Activity to copy the data from the blob to SQL Server. A tutorial titled "Copy data from Azure Blob storage to a SQL Database by using the Copy Data tool" can be found here.
An Azure Function will do the job as well but will involve coding. A Logic App is also a good option.

You answered your own question: an Azure Function or a Logic App. You can declaratively bind to your blob within an Azure Function, and you can use the blob trigger in a Logic App as well. Someone suggested Data Factory (this would likely be the most expensive option).

Related

How to get and parse .xlsx file in Azure Function called from Azure Logic App?

I am working on a Logic App that gets a file from Azure File Shares, adds it to Azure Blob Storage, and then calls an Azure Function that receives that blob (an .xlsx file).
After the blob is created, the Azure Function will parse the data in the blob and insert it into an MS Dynamics CRM entity.
My question is: how can I access the blob in the Azure Function and parse it so I can get the data that will be stored in the entity?
I have successfully created the Logic App that performs the steps mentioned above.
My question is: how can I access the blob in the Azure Function and parse it so I can get the data that will be stored in the entity?
From your screenshot, it seems your Logic App calls the function via an HTTP trigger. So you can use the Azure Functions blob input binding to access the blob in storage:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob-input?tabs=csharp
(The blob name comes in as part of the JSON payload, so the binding can pick up the name directly.)
Regarding parsing, there is plenty of sample code you can use for reference, so there is no need to include it here. If you need it, let me know (and please tell me which language you are using).
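As a rough sketch (the container name "uploads" and the JSON property "blobName" are assumptions, not taken from the question), an HTTP-triggered function can resolve the blob binding path from the request body and receive the .xlsx content as a stream:

using System.IO;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public static class ParseUploadedBlob
{
    [FunctionName("ParseUploadedBlob")]
    public static IActionResult Run(
        // The Logic App posts JSON such as {"blobName": "report.xlsx"} (assumed shape)
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
        // {blobName} is resolved from the JSON body, so the binding loads the matching blob
        [Blob("uploads/{blobName}", FileAccess.Read, Connection = "AzureWebJobsStorage")] Stream blobStream,
        ILogger log)
    {
        // blobStream now holds the .xlsx content; parse it with a library such as EPPlus
        // or ClosedXML and push the resulting rows to the Dynamics CRM entity.
        log.LogInformation("Received blob for parsing");
        return new OkResult();
    }
}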

Can Azure Event Hub ingest JSON events from Azure Blob Storage without writing any code?

Is it possible to use some ready-made construct in the Azure cloud environment to ingest the events (in JSON format) that are currently stored in Azure Blob Storage and have it submit those events directly to Azure Event Hub without writing any (however small) custom code? In other words, I would like to use a configuration-driven approach only.
Sure. You can use Azure Logic Apps to meet your needs without any code, or with just some function expressions; please refer to the official documentation of Azure Logic Apps for more details.
The logic flow is shown in the figure below.
You can refer to my sample below to make it work.
Here is my sample that receives an event from my Event Hub and transfers it to Azure Blob Storage, creating a new blob to store the event data.
Create an Azure Logic App instance in the Azure portal; it should be easy for you.
Move to the Logic App Designer tab to configure the logic flow.
Click the Save and Run buttons. Then use Service Bus Explorer (downloaded from https://github.com/paolosalvatori/ServiceBusExplorer/releases) to send an event message, and check whether the new blob was created using Azure Storage Explorer. It worked fine after a few minutes.

Logic Apps - Get Blob Content Using Path

I have an event-driven Logic App (blob event) which reads a block blob using the path and uploads the content to Azure Data Lake. I noticed the Logic App fails with 413 (RequestEntityTooLarge) when reading a large file (~6 GB). I understand that Logic Apps has a limit of 1024 MB - https://learn.microsoft.com/en-us/connectors/azureblob/ - but is there any workaround to handle this type of situation? The alternative solution I am working on is moving this step to an Azure Function and getting the content from the blob there. Thanks for your suggestions!
If you want to use an Azure Function, I would suggest you have a look at this article:
Copy data from Azure Storage Blobs to Data Lake Store
There is a standalone version of the AdlCopy tool that you can deploy with your Azure Function.
Your Logic App would then call this function, which runs a command to copy the file from Blob storage to your Data Lake Store. I would suggest using a PowerShell function.
Another option would be to use Azure Data Factory to copy file to Data Lake:
Copy data to or from Azure Data Lake Store by using Azure Data Factory
You can create a job that copies the file from Blob storage:
Copy data to or from Azure Blob storage by using Azure Data Factory
There is a connector to trigger a Data Factory run from a Logic App, so you may not need an Azure Function, but it seems there are still some limitations:
Trigger Azure Data Factory Pipeline from Logic App w/ Parameter
You should consider using the Azure Files connector: https://learn.microsoft.com/en-us/connectors/azurefile/
It is currently in preview; the advantage it has over Blob is that it doesn't have a size limit. The above link includes more information about it.
For the benefit of others who might be looking for a solution of this sort:
I ended up creating an Azure Function in C#, as my design dynamically parses the blob name and creates the ADL folder structure based on it. I used chunked memory streaming for reading the blob and writing it to ADL, with multi-threading to address the Azure Functions timeout of 10 minutes.
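As a rough illustration of chunked copying (not the poster's actual code; this sketch targets ADLS Gen2 via the Azure.Storage.Blobs and Azure.Storage.Files.DataLake SDKs, and all connection strings, container, and file-system names are placeholders):

using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Files.DataLake;

public static class BlobToDataLakeCopy
{
    public static async Task CopyAsync(
        string blobConnectionString, string container, string blobName,
        string adlsConnectionString, string fileSystem, string targetPath)
    {
        var blob = new BlobClient(blobConnectionString, container, blobName);
        var file = new DataLakeFileClient(adlsConnectionString, fileSystem, targetPath);
        await file.CreateIfNotExistsAsync();

        const int chunkSize = 8 * 1024 * 1024; // 8 MB chunks keep memory usage flat
        var buffer = new byte[chunkSize];
        long offset = 0;

        using Stream source = await blob.OpenReadAsync();
        int read;
        while ((read = await source.ReadAsync(buffer, 0, buffer.Length)) > 0)
        {
            // Stage this chunk at the current offset, then advance
            using var chunk = new MemoryStream(buffer, 0, read, writable: false);
            await file.AppendAsync(chunk, offset);
            offset += read;
        }

        // Commit everything that was staged
        await file.FlushAsync(offset);
    }
}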

SQL to Azure Blob to Logic App

I am new to Azure Functions. One of my tasks is to read data from a SQL database and upload that data as a CSV file to Azure Blob storage using Azure Functions, and then use Logic Apps to retrieve it. I am stuck on the SQL-to-file-to-Azure-Blob part.
I would start with the Azure Functions documentation. I did a quick internet search and found this article on how to access a SQL database from an Azure Function: https://learn.microsoft.com/en-us/azure/azure-functions/functions-scenario-database-table-cleanup
Here is another article which shows how to upload content to blob storage: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob#output
Apply your learnings from both and you should be able to accomplish this task.
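Putting the two together, a minimal sketch could look like the following (the "Users" table and its columns, the timer schedule, the output blob path, and the "SqlConnectionString" app setting are all assumptions for illustration); it queries SQL and writes a CSV blob through an output binding:

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Data.SqlClient;
using Microsoft.Extensions.Logging;

public static class ExportUsersToCsv
{
    [FunctionName("ExportUsersToCsv")]
    public static async Task Run(
        [TimerTrigger("0 0 * * * *")] TimerInfo timer,                 // runs hourly (assumed schedule)
        [Blob("exports/users.csv", FileAccess.Write)] Stream csvBlob,  // output blob in the default storage account
        ILogger log)
    {
        using var writer = new StreamWriter(csvBlob);
        using var conn = new SqlConnection(Environment.GetEnvironmentVariable("SqlConnectionString"));
        await conn.OpenAsync();

        using var cmd = new SqlCommand("SELECT Id, Name FROM Users", conn);
        using var reader = await cmd.ExecuteReaderAsync();

        await writer.WriteLineAsync("Id,Name"); // header row
        while (await reader.ReadAsync())
        {
            await writer.WriteLineAsync($"{reader.GetInt32(0)},{reader.GetString(1)}");
        }

        log.LogInformation("Exported Users table to exports/users.csv");
    }
}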
What if, instead, you create a trigger to start the Logic App when something happens in your DB? There is an interesting article here: https://flow.microsoft.com/en-us/blog/introducing-triggers-in-the-sql-connector/
You can then pass the information to a function to process the data and push the new CSV file to storage: https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet?tabs=windows
Optionally, you might need to transform what the SQL trigger returns; for that you can use the Logic Apps transform capability: https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-enterprise-integration-transform

Azure Functions - Table Storage Trigger with Azure Functions

I need a way to trigger an Azure Function when an entity is added to Azure Table storage. Is there a way to do this? When I tried to add a new Azure Function, I did not see any Azure Table storage trigger; I see there are Queue and Blob triggers available.
If there is no support for an Azure Table storage trigger, should I instead have an HTTP trigger and use Azure Table storage as an input binding?
Thanks
There is no trigger binding for Table Storage.
Here's a detailed view on what is supported by the different bindings available today:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings#overview
If there is no support for an Azure Table storage trigger, should I instead have an HTTP trigger and use Azure Table storage as an input binding?
Yes, this approach would work and would allow you to pass the table data as an input while relying on a separate trigger. Depending on the type of clients you are working with, and your requirements, using a queue trigger is also another good option.
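For example, a minimal sketch (the "Users" table, the route, and the POCO shape are assumptions) of an HTTP trigger with a Table storage input binding could look like this:

using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.Extensions.Logging;

public class UserEntity
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public string Name { get; set; }
}

public static class GetUser
{
    [FunctionName("GetUser")]
    public static IActionResult Run(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = "users/{partitionKey}/{rowKey}")] HttpRequest req,
        // The binding loads the entity whose keys match the route values; it is null when no match exists
        [Table("Users", "{partitionKey}", "{rowKey}")] UserEntity user,
        ILogger log)
    {
        return user is null ? (IActionResult)new NotFoundResult() : new OkObjectResult(user);
    }
}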
#venki What Fabio Cavalcante said is true: Azure Functions doesn't have a trigger option for Table storage. But if your business needs to store data in Table storage and you, as a developer, decide to use Azure Functions in your architecture, you can configure your Function to use data coming from Table storage as an input to the Function. This works really well.
There is, however, another way to give your Function an "automagic" trigger: use a Storage Queue (for small workloads) or Service Bus (for a business that needs a more robust mechanism).
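As a sketch of that second pattern (the queue name, table name, and message shape are assumptions): whichever component writes the entity also drops a small queue message, and a queue-triggered function picks it up and loads the entity through a Table input binding:

using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

// Assumed shape of the queue message: {"PartitionKey":"...","RowKey":"..."}
public class EntityKeyMessage
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
}

// Assumed shape of the row stored in the "Users" table
public class UserRow
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public string Name { get; set; }
}

public static class OnEntityAdded
{
    [FunctionName("OnEntityAdded")]
    public static void Run(
        [QueueTrigger("new-entities", Connection = "AzureWebJobsStorage")] EntityKeyMessage key,
        // {PartitionKey}/{RowKey} are resolved from the JSON queue message above
        [Table("Users", "{PartitionKey}", "{RowKey}")] UserRow user,
        ILogger log)
    {
        log.LogInformation($"Entity {user?.RowKey} was added to the Users table");
    }
}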
