Trying to send files from Azure File Storage and did not find the Stored Procedure connector in the Logic App workflow
Azure stored procedure connector in logicapps
Refer to this link for the Stored Procedure connector.
Trying to send files to Azure File Storage and did not find the Stored Procedure connector in the Logic App workflow
If you want to send files to Azure File Storage, then you need to search for the Azure File Storage connector, not the Stored Procedure connector.
As shown in the image below, if you search for Stored procedure you will see the following connectors.
If instead you want to send data from File Storage to SQL Server, you can choose the SQL Server connector, where you will find the Execute stored procedure action, as shown below.
If you want to send files to Azure File Storage, search for that connector and choose whichever actions fit your requirement, as shown in the image below.
I have 5 types of EDI files, namely *.DAT, *.XML, *.TXT, *.CSV, and image files, which contain data that is not in a standard format.
I need to parse them, extract the required data, and persist it in a SQL database.
Currently, I'm spending time writing parser class libraries for each type of EDI file, which is not scalable.
I need to know if there are any Azure services that can do the parsing work for me and scale.
Can I expect a solution in this regard?
I need to parse them, extract the required data, and persist it in a SQL database.
Yes, you can use Azure Functions to process files like CSV and import the data into Azure SQL. Azure Data Factory is also helpful for reading or copying many file formats and storing them in a SQL Server database in the specified format; there is a practical example provided by Microsoft, please refer here.
To do this with Azure Functions, follow these steps:
Create an Azure Function (stack: .NET Core 3.1) of type Blob Trigger and define the local storage account connection string in local.settings.json, like below:
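A minimal local.settings.json for local development might look like the following sketch; the setting name BlobConnectionString is an assumption, and the account name and key are placeholders for your own values:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "BlobConnectionString": "DefaultEndpointsProtocol=https;AccountName=<your-account>;AccountKey=<your-key>;EndpointSuffix=core.windows.net"
  }
}
```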
In the function's .cs file, there will be some boilerplate code that logs the uploaded blob's name and size.
In the Run method, you can define your parsing logic for the uploaded blob files, as in the sketch below.
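A rough sketch of the boilerplate plus a parsing stub is shown here; the container name edi-files, the connection setting name, and the extension-based branching are assumptions for illustration, not part of the template:

```csharp
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

namespace EdiParser
{
    public static class ParseEdiBlob
    {
        // Fires whenever a new blob lands in the (hypothetical) "edi-files" container.
        // "BlobConnectionString" must match the setting defined in local.settings.json.
        [FunctionName("ParseEdiBlob")]
        public static void Run(
            [BlobTrigger("edi-files/{name}", Connection = "BlobConnectionString")] Stream myBlob,
            string name,
            ILogger log)
        {
            // Template boilerplate: log the uploaded blob name and its size.
            log.LogInformation($"C# Blob trigger function processed blob\n Name: {name}\n Size: {myBlob.Length} bytes");

            // Your parsing logic goes here, e.g. branch on the file extension
            // and extract the fields you need before persisting them to SQL.
            var extension = Path.GetExtension(name).ToLowerInvariant();
            if (extension == ".csv" || extension == ".txt")
            {
                using var reader = new StreamReader(myBlob);
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    // TODO: split/validate the line and map it to your table columns.
                }
            }
        }
    }
}
```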
Create the Azure SQL database and configure the server with the location, pricing tier, and required settings. After that, select Set server firewall on the database overview page, click Add client IP to add your IP address, and save. Test whether you're able to connect to the database.
Deploy the project to Azure Functions App from Visual Studio.
Open your Azure SQL Database in the Azure portal and navigate to Connection Strings. Copy the connection string for ADO.NET.
Paste that connection string into the Function App's application settings in the portal.
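As a rough sketch of how the function can then use that setting (the setting name SqlConnectionString and the table dbo.EdiRecords are hypothetical), the parsed rows could be written like this:

```csharp
using System;
using Microsoft.Data.SqlClient; // NuGet package: Microsoft.Data.SqlClient

public static class SqlWriter
{
    // Reads the ADO.NET connection string pasted into the Function App's
    // application settings (the setting name is an assumption).
    private static readonly string ConnectionString =
        Environment.GetEnvironmentVariable("SqlConnectionString");

    public static void InsertRecord(string fileName, string parsedValue)
    {
        using var connection = new SqlConnection(ConnectionString);
        connection.Open();

        // Hypothetical target table dbo.EdiRecords(FileName, Value).
        using var command = new SqlCommand(
            "INSERT INTO dbo.EdiRecords (FileName, Value) VALUES (@fileName, @value)",
            connection);
        command.Parameters.AddWithValue("@fileName", fileName);
        command.Parameters.AddWithValue("@value", parsedValue);
        command.ExecuteNonQuery();
    }
}
```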
Test the function app from the portal. The remaining steps for uploading files from storage to the SQL database are available in this GitHub documentation.
Also, for parsing files like CSV to JSON format through Azure Functions, please refer here.
Consider using Azure Data Factory. It supports a range of file types.
I want to take an archive (backup) of an Azure SQL table to Azure Blob Storage. I have done the backup to Azure Blob Storage via a pipeline, in CSV file format. From Azure Blob Storage, I have successfully restored the data into the Azure SQL table using the Bulk Insert process.
But now I want to retrieve the data from this CSV file using some kind of filter criteria. Is there any way that I can apply a filter query on Azure Blob storage to retrieve the data?
Is there any other way to take a backup differently and then retrieve the data from Azure Storage?
My end goal is to take a backup of the Azure SQL table in Azure Storage and retrieve the data directly from Azure Storage with a filter.
Note
I know that I can take a backup using SSMS, but that is not what I'm after; I want to do this through some kind of pipeline or using a SQL command.
AFAIK, there is no such filtering option available when restoring the data. But as you are asking for another way of backing up and restoring, SQL Server Management Studio (SSMS) is one of the most convenient platforms for almost all SQL Server related activities.
You can use SSMS to access the Azure SQL database using the server name and login password.
See this official tutorial from Microsoft on how to take a backup of your Azure SQL database, store it in a storage account, and then restore it.
I have 20 files of type Excel/PDF located on different HTTPS servers. I need to validate these files and load them into Azure Storage using Data Factory. I then need to apply some business logic to this data and load it into an Azure SQL database. I need to know whether I have to create a pipeline that stores this data in Azure Blob Storage and then loads it into the Azure SQL database.
I have tried creating a Copy Data activity in Data Factory.
My idea is as below:
No.1
Step 1: Use a Copy Activity to transfer data from the HTTP connector source into the Blob Storage connector sink.
Step 2: Meanwhile, configure a blob storage trigger to execute your logic code so that the blob data is processed as soon as it arrives in blob storage.
Step 3: Use a Copy Activity to transfer data from the Blob Storage connector source into the SQL Database connector sink.
No.2:
Step 1: Use a Copy Activity to transfer data from the HTTP connector source into the SQL Database connector sink.
Step 2: Meanwhile, you could configure a stored procedure to add your logic steps. The logic will be executed on the data before it is inserted into the table.
I think both methods are feasible. With No.1, the business logic is freer and more flexible. No.2 is more convenient, but it is limited by the syntax of stored procedures. You can pick whichever solution suits you.
Excel and PDF are not supported yet. Based on the link, only the formats below are supported by ADF directly:
I tested with a CSV file and got the random characters below:
You could refer to this case on reading Excel files in ADF: How to read files with .xlsx and .xls extension in Azure data factory?
I am new to Azure Functions. One of my tasks is to read data from a SQL database and upload that data as a CSV file to Azure Blob Storage using Azure Functions, and then use Logic Apps to retrieve it. I am stuck on the SQL-to-file-to-Azure-Blob part.
I would start with the Azure Functions documentation. I did a quick internet search and found this article on how to access a SQL database from an Azure Function: https://learn.microsoft.com/en-us/azure/azure-functions/functions-scenario-database-table-cleanup
Here is another article which shows how to upload content to blob storage: https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob#output
Apply your learnings from both and you should be able to accomplish this task.
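Putting the two articles together, a minimal sketch could look like the following; the schedule, table and column names, blob path, and connection-string setting names are all assumptions for illustration:

```csharp
using System;
using System.IO;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Data.SqlClient;
using Microsoft.Extensions.Logging;

public static class ExportToCsv
{
    // Runs once a day at 01:00 UTC; adjust the CRON expression as needed.
    [FunctionName("ExportToCsv")]
    public static async Task Run(
        [TimerTrigger("0 0 1 * * *")] TimerInfo timer,
        [Blob("exports/orders.csv", FileAccess.Write, Connection = "AzureWebJobsStorage")] Stream outputBlob,
        ILogger log)
    {
        var csv = new StringBuilder();
        csv.AppendLine("Id,Name,CreatedOn"); // header row

        // "SqlConnectionString" is assumed to be defined in the Function App settings.
        var connectionString = Environment.GetEnvironmentVariable("SqlConnectionString");
        using (var connection = new SqlConnection(connectionString))
        {
            await connection.OpenAsync();
            // Hypothetical source table dbo.Orders.
            using var command = new SqlCommand("SELECT Id, Name, CreatedOn FROM dbo.Orders", connection);
            using var reader = await command.ExecuteReaderAsync();
            while (await reader.ReadAsync())
            {
                csv.AppendLine($"{reader["Id"]},{reader["Name"]},{reader["CreatedOn"]}");
            }
        }

        // Write the CSV text to the blob output binding; a Logic App can then pick it up.
        var bytes = Encoding.UTF8.GetBytes(csv.ToString());
        await outputBlob.WriteAsync(bytes, 0, bytes.Length);
        log.LogInformation("Exported dbo.Orders to exports/orders.csv");
    }
}
```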
What if, instead, you create a trigger to start the Logic App when something happens in your DB? Interesting article here: https://flow.microsoft.com/en-us/blog/introducing-triggers-in-the-sql-connector/
You can then pass the information to a function that processes the data and pushes the new CSV file to storage: https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-dotnet?tabs=windows
Optionally, you might need to transform what the SQL trigger returns; there you can use Logic Apps to transform the input: https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-enterprise-integration-transform
We have a requirement to provide data in the form of a text file from our database to different vendors. The file should be generated on a daily basis. Is there any resource or application in Azure that we can leverage in order to accomplish this?
Regards,
Lolek
You can use Azure Functions to read from a SQL Azure database as explained on the following resource:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-scenario-database-table-cleanup
The following resource shows how you can write from an Azure Function to a BLOB storage account.
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/azure-functions/functions-reference-csharp.md#binding-at-runtime-via-imperative-bindings
The following article shows you how to schedule the Azure Function or automate its execution.
https://learn.microsoft.com/en-us/azure/azure-functions/functions-create-scheduled-function
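Since the file has to be generated on a daily basis, the relevant piece is the timer trigger's CRON expression; a minimal skeleton (the function name and schedule are assumptions) could look like this:

```csharp
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class DailyVendorExport
{
    // "0 0 6 * * *" = every day at 06:00 UTC; adjust to the delivery window you need.
    [FunctionName("DailyVendorExport")]
    public static Task Run([TimerTrigger("0 0 6 * * *")] TimerInfo timer, ILogger log)
    {
        log.LogInformation("Generating the daily vendor text file...");
        // Query the database and write the text file to blob storage here,
        // following the two resources linked above.
        return Task.CompletedTask;
    }
}
```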
Hope this helps.
Use Azure Data Factory to export the desired data from SQL Azure and schedule the job to put the file in Blob Storage.