SQL to Azure Blob to Logic App - azure

I am new to Azure Functions. One of my tasks is to read data from a SQL database and upload that data as a CSV file to Azure Blob storage using Azure Functions, and then use Logic Apps to retrieve it. I am stuck on the SQL-to-file-to-Azure-Blob part.

I would start with the Azure Functions documentation. I did a quick internet search and found this article on how to access a SQL database from an Azure Function: https://learn.microsoft.com/en-us/azure/azure-functions/functions-scenario-database-table-cleanup
Here is another article which shows how to upload content to blob storage: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob#output
Apply your learnings from both and you should be able to accomplish this task.
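To make that concrete, here is a rough sketch of what combining the two articles might look like with the in-process C# model: a timer-triggered function queries SQL and writes CSV text through a blob output binding. The schedule, connection string setting, table, container, and column names are all placeholders, not anything from the question.

using System;
using System.Data.SqlClient;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static class ExportSqlToCsv
{
    [FunctionName("ExportSqlToCsv")]
    public static async Task Run(
        [TimerTrigger("0 0 6 * * *")] TimerInfo timer, // runs every day at 06:00 UTC
        [Blob("exports/orders.csv", FileAccess.Write, Connection = "AzureWebJobsStorage")] TextWriter csvBlob)
    {
        var connectionString = Environment.GetEnvironmentVariable("SqlConnectionString");
        var csv = new StringBuilder();
        csv.AppendLine("Id,Name,Amount"); // header row; columns are illustrative

        using (var connection = new SqlConnection(connectionString))
        {
            await connection.OpenAsync();
            using (var command = new SqlCommand("SELECT Id, Name, Amount FROM dbo.Orders", connection))
            using (var reader = await command.ExecuteReaderAsync())
            {
                while (await reader.ReadAsync())
                {
                    csv.AppendLine($"{reader.GetInt32(0)},{reader.GetString(1)},{reader.GetDecimal(2)}");
                }
            }
        }

        // The blob output binding persists whatever is written to the TextWriter.
        await csvBlob.WriteAsync(csv.ToString());
    }
}

The Logic App can then pick up exports/orders.csv with the Blob connector whenever it needs it.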

What if, instead, you create a trigger to start the Logic App when something happens in your DB? There's an interesting article here: https://flow.microsoft.com/en-us/blog/introducing-triggers-in-the-sql-connector/
You can then pass the information to a function that processes the data and pushes the new CSV file to storage (a rough sketch of such a function follows below): https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet?tabs=windows
Optionally, you might need to transform what the SQL trigger returns; for that, you can use Logic Apps to transform the input: https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-enterprise-integration-transform
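If you go that route, the function the Logic App calls could be something along these lines, using the Azure.Storage.Blobs SDK from the quickstart above. The HTTP trigger, the container name "csv-exports", and the assumption that the Logic App posts ready-made CSV content in the request body are all illustrative choices, not requirements.

using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;

public static class PushCsvToBlob
{
    [FunctionName("PushCsvToBlob")]
    public static async Task<IActionResult> Run(
        [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req)
    {
        // The body is expected to be the CSV content prepared by the Logic App.
        string csvContent = await new StreamReader(req.Body).ReadToEndAsync();

        var containerClient = new BlobContainerClient(
            Environment.GetEnvironmentVariable("AzureWebJobsStorage"), "csv-exports");
        await containerClient.CreateIfNotExistsAsync();

        // Date-stamped blob name so each run produces a new file.
        var blobClient = containerClient.GetBlobClient($"export-{DateTime.UtcNow:yyyyMMddHHmmss}.csv");
        using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(csvContent)))
        {
            await blobClient.UploadAsync(stream, overwrite: true);
        }

        return new OkObjectResult(blobClient.Uri.ToString());
    }
}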

Related

Automatically pick up uploaded text file from Blob Storage and import data to Azure SQL

I have a Blob Storage, and an Azure SQL DB.
When I upload a text file to my Blob Storage, say users.txt, which contains a list of users, I need to import them into the User table in my SQL DB.
Is there a way that whenever a file arrives in Blob Storage, it triggers an event, and that event triggers something else (maybe an Azure Function, a Logic App...) to import the data into the SQL DB, so that I don't need to write any code? Is that possible? If so, could you please let me know step by step how to do it?
Any help would be highly appreciated.
Thanks!
Take a look at Azure Blob storage trigger for Azure Functions, which describes how you can use a "blob added" event to trigger an Azure Function. You can do something like the following.
[FunctionName("SaveTextBlobToDb")]
public static void Run(
    [BlobTrigger("container-with-text-files/{name}", Connection = "StorageConnectionAppSetting")] Stream streamWithTextFile)
{
    // your logic for handling the new blob (streamWithTextFile)
}
In the implementation, you can save the blob content to your SQL database. If you want to make sure that the blob is not lost due to transient errors (like issues with DB connectivity), you can first put the info about the new blob onto an Azure Storage queue, and then have a separate Azure Function take each blob-info message from the queue and transfer the content to the database.
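A minimal sketch of that queue-based variant, assuming a simple one-user-per-line users.txt and placeholder queue, container, table, and connection setting names, could look like this:

using System;
using System.Data.SqlClient;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static class ImportUsersFromBlob
{
    // Step 1: react to the new blob and only hand its name off to a queue.
    [FunctionName("EnqueueNewUserFile")]
    public static void EnqueueNewUserFile(
        [BlobTrigger("users-files/{name}", Connection = "StorageConnectionAppSetting")] Stream blob,
        string name,
        [Queue("users-to-import", Connection = "StorageConnectionAppSetting")] out string queueMessage)
    {
        queueMessage = name; // only the blob name goes onto the queue
    }

    // Step 2: process each queued file; failed messages are retried and eventually poison-queued.
    [FunctionName("ImportQueuedUserFile")]
    public static async Task ImportQueuedUserFile(
        [QueueTrigger("users-to-import", Connection = "StorageConnectionAppSetting")] string name,
        [Blob("users-files/{queueTrigger}", FileAccess.Read, Connection = "StorageConnectionAppSetting")] Stream blob)
    {
        using (var reader = new StreamReader(blob))
        using (var connection = new SqlConnection(Environment.GetEnvironmentVariable("SqlConnectionString")))
        {
            await connection.OpenAsync();
            string line;
            while ((line = await reader.ReadLineAsync()) != null)
            {
                // Assumes one user name per line in users.txt.
                using (var command = new SqlCommand("INSERT INTO [User] (Name) VALUES (@name)", connection))
                {
                    command.Parameters.AddWithValue("@name", line);
                    await command.ExecuteNonQueryAsync();
                }
            }
        }
    }
}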
One solution that comes to mind, other than the options you already know, is Azure Data Factory. It is a kind of ETL tool for the cloud. It allows you to set up pipelines for data processing with defined inputs and outputs. In your scenario the input would be a blob and the output would be a SQL Server database record.
You can trigger the pipeline to be executed in the event a new blob is added. The docs even have an example showing just that, you can find it here.
In your case you can probably use the Copy Activity to copy the data from the blob to sql server. A tutorial titled "Copy data from Azure Blob storage to a SQL Database by using the Copy Data tool" is found here
An Azure Function will do the job as well but will involve coding. A Logic App is also a good option.
You answered your own question: an Azure Function or a Logic App. You can declaratively bind to your blob within an Azure Function, and you can use the blob trigger on a Logic App as well. Someone suggested Data Factory (this would necessarily be the most expensive option).

Saving JSON data from an Azure Function

I have integrated an Azure Service Bus and an Azure Function to receive a message, and then update a SQL DB.
I want to save a JSON document, created from a query against the same database, to an Azure Blob.
My questions are:
I can save the JSON by calling the Azure Blob REST API. Is it a cloud-native pattern to call one service from another service?
Is sending the JSON to Azure Service Bus and having another Azure Function save the data to the Blob an optimal approach?
Is there a resource other than Azure Blob for saving the JSON data from an Azure Function that would make the integration easier?
There are many ways of saving a file to Azure Blob storage. If you want to save over HTTP, use the Azure Blob REST API. You can also use the Microsoft Azure Storage SDK, which you can integrate into your application; there are storage clients for many languages (.NET, Python, JavaScript, Go, etc.). Or, if you are using an Azure Function, you can use an output binding.
It depends... Blob Storage is not the only location where you can save JSON; you can also save JSON straight into a SQL database, for instance.
The easiest way to save from an Azure function is to use Azure Blob storage output binding for Azure Functions.
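For example, a sketch of the Service Bus trigger plus a blob output binding on the return value; the queue name, container path, connection setting names, and the JSON-building helper are all hypothetical:

using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static class SaveJsonToBlob
{
    [FunctionName("SaveJsonToBlob")]
    [return: Blob("json-out/{rand-guid}.json", Connection = "AzureWebJobsStorage")]
    public static async Task<string> Run(
        [ServiceBusTrigger("incoming-messages", Connection = "ServiceBusConnection")] string message)
    {
        // ... update the SQL DB and build the JSON from the follow-up query here ...
        string json = await BuildJsonFromDatabaseAsync(message); // hypothetical helper

        return json; // the return value is written to the blob by the output binding
    }

    private static Task<string> BuildJsonFromDatabaseAsync(string message) =>
        Task.FromResult("{ \"placeholder\": true }"); // stand-in for the real query logic
}

With the output binding, the function never has to call the Blob REST API or handle storage credentials directly.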

I want to import text file (CSV) data to Azure SQL by using Logic Apps

I tried using an HTTP request. I am able to send the data from the HTTP request to Azure SQL, but I am manually sending the data through Postman, and that is not my requirement.
Requirement: I need to use a scheduler so that, at a particular time, the data from the text file is read and stored into an Azure SQL DB.
If you have any resources or examples, please let me know.
For how to import text file (CSV) data to Azure SQL by using a Logic App, you could reference this tutorial: Quick, easy and cheap way to automate data loading from CSV file into Azure SQL:
Check out how to leverage Azure Blob Storage and Logic Apps for a simple scenario of data loading from CSV into Azure SQL in less than 30 minutes and with almost no coding.
Just about any developer out there has, at some point or another, had to automate an ETL process for data loading. This article will present a fast and convenient way to create a data loading workflow for CSVs using Azure SQL and blob storage.
It also introduces other approaches you could reference:
How to import text file (CSV) data to Azure SQL by using Data Factory.
How to import text file (CSV) data from Blob storage to Azure SQL by using T-SQL.
You could also reference:
Upload Flat File on Azure SQL Database using Azure Logic App
Using Azure Logic Apps to Import CSV to SQL Server
I agree with @Mandar Dharmadhikari: a Logic App is not the best way to do it.
If your CSV file contains a lot of data, I also suggest you use Data Factory; once the copy activity pipeline is created, you can trigger the pipeline to execute on a schedule.
Hope this helps.
I would suggest you use Azure Data Factory, as it is better suited to the task you want to perform. The following post gives an idea of how to move CSV data to SQL.
http://normalian.hatenablog.com/entry/2017/09/04/233320

Logic Apps - Get Blob Content Using Path

I have an event-driven Logic App (blob event) which reads a block blob using the path and uploads the content to Azure Data Lake. I noticed the Logic App is failing with 413 (RequestEntityTooLarge) when reading a large file (~6 GB). I understand that Logic Apps has a limitation of 1024 MB - https://learn.microsoft.com/en-us/connectors/azureblob/ - but is there any workaround to handle this type of situation? The alternative solution I am working on is moving this step to an Azure Function and getting the content from the blob. Thanks for your suggestions!
If you want to use an Azure Function, I would suggest you have a look at this article:
Copy data from Azure Storage Blobs to Data Lake Store
There is a standalone version of the AdlCopy tool that you can deploy to your Azure function.
So your Logic App will call this function, which will run a command to copy the file from blob storage to your Data Lake Store. I would suggest using a PowerShell function.
Another option would be to use Azure Data Factory to copy file to Data Lake:
Copy data to or from Azure Data Lake Store by using Azure Data Factory
You can create a job that copies the file from blob storage:
Copy data to or from Azure Blob storage by using Azure Data Factory
There is a connector to trigger a Data Factory run from a Logic App, so you may not need an Azure Function, but it seems there are still some limitations:
Trigger Azure Data Factory Pipeline from Logic App w/ Parameter
You should consider using the Azure Files connector: https://learn.microsoft.com/en-us/connectors/azurefile/
It is currently in preview; the advantage it has over Blob is that it doesn't have a size limit. The above link includes more information about it.
For the benefit of others who might be looking for a solution of this sort:
I ended up creating an Azure Function in C#, as my design dynamically parses the blob name and creates the ADL structure based on the blob name. I used chunked memory streaming for reading the blob and writing it to ADL, with multithreading to address the Azure Functions timeout of 10 minutes.
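For anyone curious what that looks like, below is a heavily simplified, single-threaded sketch of the chunked copy using the Azure.Storage.Blobs and Azure.Storage.Files.DataLake SDKs. The container and file system names, the 8 MB chunk size, and the connection setting names are assumptions; the original solution also derives the ADL folder structure from the blob name and parallelises the work.

using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Azure.Storage.Files.DataLake;

public static class BlobToDataLakeCopier
{
    private const int ChunkSize = 8 * 1024 * 1024; // 8 MB per append

    public static async Task CopyAsync(string blobName)
    {
        var blobClient = new BlobClient(
            Environment.GetEnvironmentVariable("BlobConnectionString"), "source-container", blobName);

        var fileSystem = new DataLakeFileSystemClient(
            Environment.GetEnvironmentVariable("AdlsConnectionString"), "landing");
        // In the real solution the target path/folder structure is derived from the blob name.
        DataLakeFileClient file = fileSystem.GetFileClient(blobName);
        await file.CreateAsync(); // creates (or truncates) the target file

        using (Stream source = await blobClient.OpenReadAsync())
        {
            var buffer = new byte[ChunkSize];
            long position = 0;
            int read;
            while ((read = await source.ReadAsync(buffer, 0, buffer.Length)) > 0)
            {
                using (var chunk = new MemoryStream(buffer, 0, read))
                {
                    await file.AppendAsync(chunk, offset: position); // stage this chunk
                }
                position += read;
            }
            await file.FlushAsync(position); // commit all staged chunks
        }
    }
}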

Data export from SQL Azure Database

We have a requirement to provide data in the form of a text file from our database to different vendors. The file should be generated on a daily basis. Is there any resource or application in Azure that we can leverage in order to accomplish this?
Regards,
Lolek
You can use Azure Functions to read from a SQL Azure database as explained on the following resource:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-scenario-database-table-cleanup
The following resource shows how you can write from an Azure Function to a Blob storage account.
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/azure-functions/functions-reference-csharp.md#binding-at-runtime-via-imperative-bindings
The following article shows you how to schedule the Azure Function or automate its execution.
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-scheduled-function
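Putting those three links together, a sketch of the scheduled export might look like the following. The timer trigger handles the "daily" part, and the imperative Binder from the second link lets you pick a date-stamped blob name at runtime; the container name, schedule, and the SQL query/formatting helper are placeholders for illustration.

using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;

public static class DailyVendorExport
{
    [FunctionName("DailyVendorExport")]
    public static async Task Run(
        [TimerTrigger("0 0 5 * * *")] TimerInfo timer, // every day at 05:00 UTC
        Binder binder)
    {
        string text = await BuildVendorFileAsync(); // hypothetical: query SQL Azure and format the rows

        // Bind imperatively to a blob whose name is only known at runtime.
        var attribute = new BlobAttribute($"vendor-exports/{DateTime.UtcNow:yyyy-MM-dd}.txt", FileAccess.Write);
        using (TextWriter writer = await binder.BindAsync<TextWriter>(attribute))
        {
            await writer.WriteAsync(text);
        }
    }

    private static Task<string> BuildVendorFileAsync() =>
        Task.FromResult("id|name|amount"); // placeholder content
}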
Hope this helps.
Use Azure Data Factory to export the desired data from SQL Azure and schedule the job to put the file in Blob storage.
