Make REST call to API and save the result to Azure SQL every hour - azure

I'm using this very useful SQLCLR script to make a REST call to an API and save the data to SQL Server on the fly.
I have created a stored procedure that pulls new data every hour so my data is always up to date.
I would like to have all this on Azure so I can then create a Power BI data visualization.
THE PROBLEM:
As soon as I try to transfer the database to Azure I receive this error:
TITLE: Microsoft SQL Server Management Studio
------------------------------
Could not import package.
Warning SQL0: A project which specifies SQL Server 2019 or Azure SQL Database Managed Instance as the target platform may experience compatibility issues with Microsoft Azure SQL Database v12.
Error SQL72014: .Net SqlClient Data Provider: Msg 40517, Level 16, State 1, Line 4 Keyword or statement option 'unsafe' is not supported in this version of SQL Server.
Error SQL72045: Script execution error. The executed script:
CREATE ASSEMBLY [ClrHttpRequest]
AUTHORIZATION [dbo]
FROM 0x4D5A90000300000004000000FFFF0000B800000000000000400000000000000000000000000000000000000000000000000000000000000000000000800000000E1FBA0E00B409CD21B8014CCD21546869732070726F6772616D2063616E6E6F742062652072756E20696E20444F53206D6F64652E0D0D0A2400000000000000504500004C0103006D85475F0000000000000000E00022200B0130000026000000060000000000007E45000000200000006000000000001000200000000200000400000000000000060000000000000000A00000000200004C1E01000300608500001000001000000000100000100000000000001000000000000000000000002C4500004F00000000600000FC03000000000000000000000000000000000000008000000C000000F44300001C0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000200000080000000000000000000000082000004800000000000000000000002E7465787400000084250000002000000026000000020000000000000000000000000000200000602E72737263000000FC030000006000000004000000280000000000000000000000000000400000402E72656C6F6300000C000000008000000002000000
(Microsoft.SqlServer.Dac)
------------------------------
BUTTONS:
OK
------------------------------
This happens because Azure SQL Database has some features stripped out, such as SQLCLR and SQL Server Agent (for obvious security reasons).
Is there any alternative to SQLCLR on Azure?
Is there any alternative to SQL Server Agent on Azure?
Basically: how to automate a REST call to an API every hour and save the result to SQL Server on Azure?

I do not think there is a straightforward replacement for SQLCLR. However, there are some Azure offerings that might be interesting.
One alternative is a scheduled Azure Function that calls the API and stores the result in the Azure SQL Database.
Keep in mind that if the process takes longer than 10 minutes you cannot use a Consumption plan for the Azure Function, which is probably the most cost-effective option.
Depending on the scenario, Azure Data Factory can also provide a solution. You can create a pipeline that calls the API and copies the data to SQL Server as outlined here, based on a schedule trigger.

Even though Azure Functions is great, you could solve this with almost no code using Azure Logic Apps: a schedule (Recurrence) trigger, the HTTP action, and the SQL Server connector.
https://azure.microsoft.com/de-de/services/logic-apps/

Related

Cross Database Insert in Azure?

Is it possible for me to insert some data from one database to another in Azure sql?
Let's say I have a trigger in db1 that updates some values in db2.
I read about elastic queries but it seems like they are read-only so they don't solve my problem.
You can't use cross-database queries in Azure SQL Database because the databases can't see each other physically. You could use elastic queries, but they are read-only.
One solution is to move your databases to SQL Managed Instance, which supports cross-database queries, but it is expensive.
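For the read-only case, an elastic query can at least let db1 read from db2. A minimal sketch, assuming a vertically partitioned setup; the server, database, credential, and table names (and the secrets) are all placeholders:

```sql
-- Run in db1. All names and secrets below are placeholders.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

CREATE DATABASE SCOPED CREDENTIAL Db2Cred
    WITH IDENTITY = '<sql login>', SECRET = '<password>';

CREATE EXTERNAL DATA SOURCE Db2Source WITH (
    TYPE = RDBMS,
    LOCATION = 'myserver.database.windows.net',
    DATABASE_NAME = 'db2',
    CREDENTIAL = Db2Cred
);

-- Column definitions must match the remote table in db2.
CREATE EXTERNAL TABLE dbo.RemoteValues (
    Id INT,
    Val NVARCHAR(100)
) WITH (DATA_SOURCE = Db2Source);

SELECT * FROM dbo.RemoteValues;  -- SELECT works; INSERT/UPDATE/DELETE do not
```

This only covers reading from the other database; writes still need Managed Instance or an application-level solution.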
There was some previous discussion here about doing similar:
C# Azure Function trigger when SQL Database has a new row added without polling
There are also the Azure SQL bindings for Azure Functions, but they are input bindings, not triggers, and they're still in preview and limited to C#, JavaScript, and Python.
Azure SQL bindings for Azure Functions overview (preview)
There was a new announcement last week after MS Build, however, for Azure SQL Database External REST Endpoints Integration (hopefully they don't refer to it as ASDEREI), but this is currently in preview under the Early Adoption Program (EAP).
Announcing the “Azure SQL Database External REST Endpoints Integration” Early Adoption Program
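As a sketch of what that integration looks like (the feature was in preview at the time of writing, and the endpoint URL below is a placeholder), the new `sp_invoke_external_rest_endpoint` procedure lets T-SQL call an HTTP endpoint directly:

```sql
-- Preview feature; the endpoint URL below is a placeholder.
DECLARE @response NVARCHAR(MAX);

EXEC sp_invoke_external_rest_endpoint
    @url      = N'https://api.example.com/data',
    @method   = 'GET',
    @response = @response OUTPUT;

SELECT @response;  -- JSON envelope containing the HTTP status and body
```

Note that it is restricted to an allow-list of Azure service endpoints, so it is aimed at calling things like Azure Functions or Logic Apps rather than arbitrary APIs.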

Solution architecture for data transfer from SQL Server database to external API and back

I am looking for a proper solution architecture for a data transfer scenario from SQL Server to an external API and then from the API back to SQL Server. We are thinking of using Azure technologies.
We have a database hosted on an Azure VM. When the author value in the book table changes, we would like to get all the data for that book from the related tables and transfer it to an external API. The quantity of rows to be transferred (the select-join) is huge, so the select-join query takes a long time to execute. After this data is read, it is transformed and then sent to an external API (over which we have no control). The transfer of the data to the API can take up to an hour. After the data is written into this API, we read some reports from it and write these reports back into the original database.
We must repeat this process more than 50 times per day.
We are thinking of using a Logic App to detect the change in SQL Server (as it is hosted on an Azure VM), publish this event to Azure Event Grid, and then use Azure Durable Functions to handle reading the SQL data, transforming it, and sending it to the external API.
Does this make sense? Does anybody have any better ideas?
Thanks in advance
At this moment, the Logic App SQL connector can't detect when a particular row changes: it performs a select (which you provide) and then checks for changes every X interval (which you specify).
In other words, SQL Database doesn't offer a change feed like Cosmos DB, where you can subscribe to events and trigger an Azure Function.
Things you can do:
1. Add a trigger on SQL after insert/update which inserts the new/changed row into a separate table; you can then use Logic Apps / Azure Functions to query this table and retrieve the data.
2. Migrate to Cosmos DB and use the change feed + Azure Functions.
3. Change your code so that, after inserting into the SQL Database, it also adds a message with the identifier of the row you're about to insert/update to a queue, which is then consumed by an Azure Function.
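Option 1 can be sketched roughly like this (the table and column names are assumptions):

```sql
-- A side table that records which rows changed.
CREATE TABLE dbo.BookChanges (
    BookId    INT       NOT NULL,
    ChangedAt DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);
GO
-- After every insert/update on dbo.Book, log the affected keys.
CREATE TRIGGER trg_Book_Change
ON dbo.Book
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.BookChanges (BookId)
    SELECT Id FROM inserted;
END;
```

The Logic App or Azure Function then polls dbo.BookChanges on a schedule, processes the logged keys, and deletes the rows it has handled.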

Alternative to trigger Azure Functions or AppService based on a table row insert into Azure SQL MI

Is it possible to trigger Azure Functions or AppService webapp whenever an insert operation is performed against a table on Azure SQL MI?
If not, is there a way to trigger applications outside Azure SQL other than using a Logic App? I want to avoid Logic Apps because it requires one more application, and it still uses polling.
The link below says it is not supported for Azure Functions:
https://feedback.azure.com/forums/355860-azure-functions/suggestions/16711846-sql-azure-trigger-support
The link below suggests using a Logic App:
Trigger Azure Function by inserting (adding) new row into table, SQL Server Database
Today, in Azure SQL, there is no such possibility. The closest option is to create a Timer Trigger Azure Function that checks if there have been any changes in the table you want to monitor (using Change Tracking, for example).
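A rough sketch of the Change Tracking approach (the database and table names are assumptions):

```sql
-- One-time setup: enable change tracking on the database and the table.
ALTER DATABASE MyDb
    SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON);

ALTER TABLE dbo.MyTable ENABLE CHANGE_TRACKING;

-- In the timer-triggered function: read changes since the last stored version.
DECLARE @last_sync_version BIGINT = 0;  -- persist this between runs

SELECT ct.Id, ct.SYS_CHANGE_OPERATION
FROM CHANGETABLE(CHANGES dbo.MyTable, @last_sync_version) AS ct;

-- Store this as the new @last_sync_version for the next run.
SELECT CHANGE_TRACKING_CURRENT_VERSION();
```

The function keeps the last sync version somewhere durable (a table, blob storage, etc.) and processes only the rows CHANGETABLE reports.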
If you are using Azure SQL MI instead, you could create a SQLCLR procedure that calls an Azure Function via an HTTP request or, as another option, via Azure Event Hubs or Azure Event Grid.
There have been several feature requests for triggering Azure Functions based on changes to an Azure SQL database. For example:
https://github.com/Azure/azure-functions-python-worker/issues/365
It seems that they are not prioritizing it, since it is possible to implement this functionality using logic apps.

Why is there no execution plan in Azure SQL Data Warehouse?

I am working on storing data in Azure SQL Data Warehouse.
I am trying to look at my indexes usage and see execution plans but none of them are shown in SSMS.
Question
Why is there no execution plan in Azure SQL Data Warehouse?
Update
I am using SSMS version 13.0.16106.4 (SQL Server 2016).
There are at least three methods of viewing execution plans for Azure SQL Data Warehouse:
Use the EXPLAIN command before any SQL statement to view the text execution plan for that command, e.g.
EXPLAIN
SELECT * FROM yourTable;
For an example of interpreting these plans see here.
Versions of SQL Server Management Studio (SSMS) from 17.5 onwards support visual execution plans.
Download version 17.x or later. More details here.
Through the Azure portal, which is really a wrapper for DBCC PDW_SHOWEXECUTIONPLAN.
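For the third method, a minimal sketch (the request ID and both DBCC arguments below are placeholders):

```sql
-- Find the per-distribution SPIDs for a running request.
SELECT request_id, step_index, distribution_id, spid
FROM sys.dm_pdw_sql_requests
WHERE request_id = 'QID1234';

-- View the execution plan on one distribution: (distribution_id, spid).
DBCC PDW_SHOWEXECUTIONPLAN (1, 55);
```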
Armed with these three methods, you are now no doubt well equipped to view execution plans for Azure SQL Data Warehouse.
As far as I can tell, actual execution plans within SSMS do not work in Azure SQL DW (which was renamed to Azure Synapse, and then to dedicated SQL pools). Please prove me wrong.

I need to push data from various select statments to Azure SQL Database, best way to do so?

I have some T-SQL scripts which generate some data, and we manually update an Excel spreadsheet with the results. We need a way to push this into an Azure SQL database from a job, so that we can access the data there and remove the manual process of uploading the information every time. What is the best way to do this?
I assume you are trying to move data from an on-premises server to Azure. The simplest method may be Azure SQL Data Sync.
You could load your data from your queries into an on-premises table which syncs to Azure.
On all your SQL Server instances, you can create a linked server to one Azure SQL Database. Once the linked server is created, you can insert directly into the Azure SQL Database from your on-premises SQL Server instances.
(The original answer included screenshots showing how to create the linked server and how to insert data through it.)
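A minimal sketch of that setup; the server, database, login, and table names are all placeholders:

```sql
-- On the on-premises instance: create the linked server to Azure SQL Database.
EXEC sp_addlinkedserver
    @server     = N'AzureDb',
    @srvproduct = N'',
    @provider   = N'MSOLEDBSQL',
    @datasrc    = N'myserver.database.windows.net',
    @catalog    = N'MyAzureDb';

EXEC sp_addlinkedsrvlogin
    @rmtsrvname  = N'AzureDb',
    @useself     = 'FALSE',
    @rmtuser     = N'<sql login>',
    @rmtpassword = N'<password>';

-- Then insert directly into the Azure database:
INSERT INTO [AzureDb].[MyAzureDb].[dbo].[TargetTable] (Col1)
SELECT Col1 FROM dbo.LocalResults;
```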
For detailed steps, you can visit this tutorial.
I think you can consider Azure Data Factory.
The Azure Data Factory Copy Activity can help you use T-SQL scripts to move data to another Azure SQL database.
For more details, please see the Azure tutorial: Copy multiple tables in bulk by using Azure Data Factory.
Once the pipeline is created, you can trigger and monitor the pipeline runs.
Trigger the pipeline on a schedule:
You can create a scheduler trigger to schedule the pipeline to run periodically (hourly, daily, and so on). In this procedure, you create a trigger to run every minute until the end date and time that you specify.
Please see: Trigger the pipeline on a schedule.
This can help you push the data to Azure SQL Database automatically.
Hope this helps.
You can try an SSIS package, which automates the process of uploading data into an Azure SQL database. I have not used SSIS for Azure, but I have used it to sink data from CSV/XLS/XLSX files into a SQL Server database. I referred to this article, which may be helpful.
