Azure Function App to pull files from other project/repo

I've got a straightforward scenario but I can't figure out how to do it.
I have a SQL Database Project which has a bunch of simple SQL files and some of these drop/create summary tables.
I am trying to use an Azure Function App on a timer to pick up certain SQL files from this project and run them (every night). However, I am confused about how to get these SQL files into my function to run.
I've tried referencing the SQL project and picking up the files from bin/debug, but once the function app is deployed I can't see how it will be able to build the SQL project.
Any ideas appreciated

If you choose Azure Functions for this, you can use REST API calls as the logic for operating on the database, whether fetching or updating data. This approach is good for short-running work.
Refer to the C# Corner article for information and code on making database calls from Azure Functions.
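On the asker's specific problem of getting the .sql files into the deployed function: a common pattern is to copy the scripts into the function app's content (e.g. "Copy to Output Directory" in the project, or a folder next to the function code) and read them at runtime, rather than building the SQL project inside the function. Below is a rough, hypothetical sketch of that idea as a Python timer-triggered function; the folder name, connection-string app setting and naive GO splitting are assumptions, not the asker's actual setup.

```python
# Hypothetical sketch: a timer-triggered Azure Function (Python v1 model) that
# runs .sql files deployed alongside the function code against Azure SQL.
# Assumes the scripts sit in a "sql" folder inside the function folder and a
# connection string is supplied via the SQL_CONNECTION_STRING app setting.
import logging
import os
import pathlib

import azure.functions as func
import pyodbc  # needs the Microsoft ODBC Driver for SQL Server on the host


def main(mytimer: func.TimerRequest) -> None:
    sql_dir = pathlib.Path(__file__).parent / "sql"     # deployed with the function
    conn_str = os.environ["SQL_CONNECTION_STRING"]      # app setting, not hard-coded

    with pyodbc.connect(conn_str, autocommit=True) as conn:
        cursor = conn.cursor()
        for sql_file in sorted(sql_dir.glob("*.sql")):
            logging.info("Running %s", sql_file.name)
            # Naive split on GO batch separators; a real runner may need more care.
            for batch in sql_file.read_text().split("\nGO\n"):
                if batch.strip():
                    cursor.execute(batch)
```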
For a reliable long-term solution to this requirement, I would recommend an Azure DevOps project:
Create an Azure DevOps project for the SQL project.
Schedule it so that it deploys on a timer, and also deploys whenever a check-in happens on that project.
Refer to this MS Doc for more information on using this approach.

Related

What is the best way to automate creating SQL .bak or .bacpac backup files and saving them to an Azure cloud storage container?

Currently I am tasked with researching a solution for easily copying data from one environment to another (QA to DEV, for example), as well as having the flexibility of going back to different points in time to compare our data. It is an easy task to do locally with SSMS, and I am looking for the best ways to do it using Azure and its tools.
These are the options that I found so far:
Backup Service and Backup Vault (The MS solution that I am not asking for. They don't generate .bak files)
Azure Function to execute generate and transfer SQL (flexible but the code needs to be maintained + manage authentication)
PowerShell process with Azure Automation (flexible too, but needs to be maintained)
Datafactory/SSIS (Still learning and researching)
Anyone got any tools/methods that are worth looking into before I dive deeper with a solution?
For Azure SQL Database, SQL Data Sync is a feature for syncing data between Azure SQL and SQL Server (on-premises). Among its limits: the Azure SQL database must be the hub, and each table must have a primary key. That may not suit you.
In my experience, Data Factory is the best fit for you. You can copy data between different environments, and in the sink settings you can use an upsert (insert or update) operation to sync the data.
If you only want to schedule automatic backups of the SQL database, a third-party tool could also meet your needs: SQL Backup and FTP.
Since you have already searched and found almost all the options in Azure, any of these approaches can achieve it. You need to be clear about your real requirement: data sync, or an automatic backup that creates the .bacpac file in storage. As posed, the question cannot single out one best way; the way you like best is the best one.
I went with writing an Azure Automation PowerShell script. Including cmdlets like New-AzureRmSqlDatabaseExport and passing in the parameters was tricky, but it finally did the job.
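For comparison, here is a rough Python sketch of the "generate and transfer" option from the list above, using the SqlPackage command-line tool to export a .bacpac and azure-storage-blob to upload it (a different route from the Automation/PowerShell script described here). The server, credentials and container names are placeholders.

```python
# Hypothetical sketch: export an Azure SQL database to a .bacpac with SqlPackage
# and upload it to a blob container. SqlPackage must be installed wherever this
# runs; all connection details below are placeholders.
import subprocess

from azure.storage.blob import BlobClient

bacpac_path = "nightly_export.bacpac"

# SqlPackage /Action:Export produces a .bacpac from a live database.
subprocess.run(
    [
        "sqlpackage",
        "/Action:Export",
        "/SourceConnectionString:Server=tcp:myserver.database.windows.net;"
        "Database=mydb;User ID=sql_admin;Password=<password>;Encrypt=True;",
        f"/TargetFile:{bacpac_path}",
    ],
    check=True,
)

# Push the .bacpac to storage so other environments can restore or compare it.
blob = BlobClient.from_connection_string(
    conn_str="<storage-connection-string>",
    container_name="sql-backups",
    blob_name=bacpac_path,
)
with open(bacpac_path, "rb") as fh:
    blob.upload_blob(fh, overwrite=True)
```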

Which Azure products are needed for a staging database?

I have several external data APIs that I access using some Python scripts. My scripts run from an on-premises server, transform the data, and store it in a SQL Server database on the same server. I suppose it's a rudimentary ETL system run with Python and T-SQL.
The system is about to grow quite a bit with new APIs and will require more complex data pipelines (for example, some of the API data will be spun off to more than one table). I think this would be a good time to move the system onto Azure (we are heavily integrated with Microsoft so it will have to be Azure!).
I have spent a few days researching the Azure products that would let me run Python scripts to access data from web APIs and store the processed data in a cloud database. I'm looking for advice on what sort of Azure products other people have used for similar jobs. At the moment it seems I will need:
Azure SQL Database to hold the processed data that can be accessed by various colleagues.
Azure Data Factory to manage, log, and schedule the pipeline jobs and to run my custom Python scripts (is this even possible?).
Azure Batch to run the aforementioned Python scripts but I'm not sure about this.
I want to put together a proposal basically and start thinking about costs but it would be good to hear from someone who has done something similar - am I on the right track or completely off? Should I just stay on-premises? Thank you in advance.
Azure SQL Database and Azure SQL Data Warehouse are good for relational data. If you want NoSQL, you could go with Azure Cosmos DB, and if you want to store data as files, you could use Azure Data Lake.
For Python scripts, you could use a Custom Activity or Databricks with Azure Data Factory.
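As a hypothetical illustration of that suggestion, this is the kind of script an ADF Custom Activity (typically run on an Azure Batch pool) could execute: pull JSON from a web API and load it into Azure SQL Database. The API URL, table and column names are placeholders.

```python
# Hypothetical ETL step for an ADF Custom Activity: fetch API data and load it
# into Azure SQL. Endpoint, table and columns are placeholders.
import os

import pyodbc
import requests

API_URL = "https://api.example.com/v1/readings"  # placeholder endpoint

rows = requests.get(API_URL, timeout=30).json()

conn = pyodbc.connect(os.environ["SQL_CONNECTION_STRING"])
cursor = conn.cursor()
cursor.fast_executemany = True
cursor.executemany(
    "INSERT INTO dbo.Readings (SensorId, ReadingTime, Value) VALUES (?, ?, ?)",
    [(r["sensorId"], r["readingTime"], r["value"]) for r in rows],
)
conn.commit()
conn.close()
```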
Azure SQL Data Warehouse should be used if the amount of data you want to load is in the petabyte range. Also, Azure SQL Data Warehouse is not meant for complex transformations; I would recommend it for plain data loads with PolyBase.

How to manage Cosmos DB stored procedures, functions and triggers like a SQL DB project

I am developing a SaaS-based application which has a hybrid DB architecture (Azure SQL Server and Azure Cosmos DB).
To manage SQL Server tables, stored procedures, triggers and functions, we create a SQL database project (.sqlproj). We can also generate a .dacpac and deploy it to the SQL server.
As with SQL, we will have collections, stored procedures, triggers and functions in Azure Cosmos DB.
How do we manage the Cosmos DB collections, procedures and triggers? Is there any project template available for this? Please suggest a way to proceed.
Based on my experience with Cosmos DB, I believe there is no equivalent of project templates available for it, because it is not as straightforward as a SQL DB project.
I suggest storing them as JSON/JavaScript files in your solution's version control and versioning them accordingly.
You could write the necessary programming logic to execute these scripts/Cosmos DB logic using the SQL API for .NET or another platform. This way you control the collections, UDFs, triggers, etc. from your code, and you can version that code accordingly.
More references here: https://learn.microsoft.com/en-us/azure/cosmos-db/programming
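A minimal sketch of that idea using the azure-cosmos Python SDK (the question mentions .NET, but the pattern is the same): keep each stored procedure as a file under version control and push it to the container from code. The account settings, database/container names and folder layout are placeholders.

```python
# Hypothetical deployment script: each .js file under ./sprocs is one stored
# procedure whose id is the file name. Account URL/key, database and container
# are placeholders supplied via environment variables.
import os
import pathlib

from azure.cosmos import CosmosClient
from azure.cosmos.exceptions import CosmosResourceNotFoundError

client = CosmosClient(os.environ["COSMOS_URL"], credential=os.environ["COSMOS_KEY"])
container = client.get_database_client("SaasDb").get_container_client("Orders")

for js_file in pathlib.Path("sprocs").glob("*.js"):
    sproc = {"id": js_file.stem, "body": js_file.read_text()}
    try:
        # Replace keeps an existing procedure in sync with source control.
        container.scripts.replace_stored_procedure(sproc=js_file.stem, body=sproc)
    except CosmosResourceNotFoundError:
        container.scripts.create_stored_procedure(body=sproc)
```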
Azure Cosmos DB can also be managed through ARM templates. You can use these to version your databases, collections, etc. See the Microsoft.DocumentDB resource types documentation.

Azure Unzip automation

I am looking to do the following in Azure. However, I should point out that on my local machine I have no Visual Studio, no admin rights, no IT support and no tools (except SSMS), but I have a VERY strong drive to complete this work if it's possible.
I have created an Azure blob which receives a file each day (zipped) from a 3rd party. I am looking to do the following:
1) Unzip the data in an automated fashion
2) Get the data into an Azure SQL database (already created) in an automated fashion
What I want to know is whether this is possible to do using Azure alone, or am I going to need admin rights / Visual Studio? If it is possible, any directions you could point me in would be gratefully received!
Thanks
Dave
Based on your description, one approach would be to create a Blob Triggered Azure Function through the Azure Portal (Visual Studio is not required), unzip/process the file and save the desired data into Azure SQL. Moreover, considering that there is only one new file per day, prefer the Consumption Plan to optimize cost.
Find more details about Azure Function Blob Binding at https://learn.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-blob.
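A hypothetical sketch of such a blob-triggered function in Python (the blob binding itself is configured in function.json); the CSV layout, target table and connection-string app setting are placeholders.

```python
# Hypothetical sketch: unzip the daily blob and bulk-insert its CSV rows into
# Azure SQL. Assumes each archive member is a headered CSV matching the
# placeholder dbo.DailyFeed table.
import csv
import io
import os
import zipfile

import azure.functions as func
import pyodbc


def main(dailyzip: func.InputStream) -> None:
    archive = zipfile.ZipFile(io.BytesIO(dailyzip.read()))

    with pyodbc.connect(os.environ["SQL_CONNECTION_STRING"]) as conn:
        cursor = conn.cursor()
        for name in archive.namelist():
            reader = csv.reader(io.TextIOWrapper(archive.open(name), encoding="utf-8"))
            next(reader, None)  # skip the header row
            cursor.executemany(
                "INSERT INTO dbo.DailyFeed (Col1, Col2, Col3) VALUES (?, ?, ?)",
                [row for row in reader],
            )
        conn.commit()
```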
Spin up a Data Factory in Azure; unzip functionality is available in ADF.

How to execute multiple rows in web service Azure Machine Learning Studio

I created a model in Azure ML Studio and deployed the web service.
Now, I know how to check one record at a time, but how can I load a CSV file and make the algorithm go through all the records?
If I click on Batch Execution, it asks me to create an Azure Storage account.
Is there any way to execute multiple records from a CSV file without creating any other accounts?
Yes, there is a way, and it is simple. What you need is the Excel add-in; you don't need to create any other account.
You can either read the Excel Add-in for Azure Machine Learning web services doc or watch the Azure ML Excel Add-in video.
If you search for videos on the Excel add-in for Azure ML, you will find other useful videos too.
I hope this is the solution you are looking for.
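If you would rather script it than use Excel, the classic Studio Request-Response endpoint also accepts multiple rows per call, so a whole CSV can be scored without creating a storage account. A hedged sketch; the endpoint URL, API key, column names and the input/output names ("input1"/"output1") are placeholders and must match your service's API help page.

```python
# Hedged sketch: score every row of a CSV in a single call to a classic Azure
# ML Studio Request-Response endpoint. URL, key, and the input1/output1 names
# are placeholders taken from the web service's API help page.
import csv

import requests

ENDPOINT = "https://<region>.services.azureml.net/workspaces/<ws>/services/<id>/execute?api-version=2.0&details=true"
API_KEY = "<web-service-api-key>"

with open("records.csv", newline="") as fh:
    reader = csv.reader(fh)
    columns = next(reader)            # header row becomes ColumnNames
    values = [row for row in reader]  # every remaining row gets scored

payload = {
    "Inputs": {"input1": {"ColumnNames": columns, "Values": values}},
    "GlobalParameters": {},
}

resp = requests.post(
    ENDPOINT,
    json=payload,
    headers={"Authorization": f"Bearer {API_KEY}"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["Results"]["output1"])
```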
