We are moving from Microsoft SQL Server 2012 (SP1) - 11.0.3128.0 (X64) to Microsoft SQL Azure (RTM) - 12.0.2000.8.
Previously we sent an email via Database Mail if a particular table had not been updated within a particular time interval.
I tried to do the same on Azure SQL Database, but it seems this functionality is not available there.
You can write a stored procedure that queries the table and returns the results, read those results in an Azure WebJob, Logic App or Azure Function, schedule the execution, and send the results via email. Here you can find an example using a Logic App, and here you can find how to schedule it.
You can use your own Exchange/SMTP server, or you can choose SendGrid to send out the email, as shown here.
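As a rough illustration, here is a minimal sketch of the WebJob/Function approach in Python, run on a schedule; the table name, the LastUpdated column, the staleness threshold, the connection string and the SendGrid key/addresses are all placeholders for your own values.

# Minimal sketch: check when the table was last updated and email an alert if it is stale.
# Assumes pyodbc and sendgrid are installed and secrets are provided via environment variables.
import os
import pyodbc
from sendgrid import SendGridAPIClient
from sendgrid.helpers.mail import Mail

STALE_MINUTES = 60  # alert if no update within this interval (placeholder)

conn = pyodbc.connect(os.environ["SQLAZURE_CONN_STR"])
row = conn.cursor().execute(
    "SELECT DATEDIFF(MINUTE, MAX(LastUpdated), SYSUTCDATETIME()) FROM dbo.MyMonitoredTable"
).fetchone()

if row[0] is None or row[0] > STALE_MINUTES:
    message = Mail(
        from_email="alerts@example.com",
        to_emails="dba-team@example.com",
        subject="dbo.MyMonitoredTable has not been updated recently",
        plain_text_content=f"Minutes since last update: {row[0]} (threshold {STALE_MINUTES}).",
    )
    SendGridAPIClient(os.environ["SENDGRID_API_KEY"]).send(message)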
After migrating data to Azure SQL Database I need to send an email alert, which I did not find in Azure Data Factory
AFAIK sending an email alert is not possible directly from Azure Data Factory, but you can achieve this using Azure Logic Apps by following this MS document.
In the Logic App, create a 'When a HTTP request is received' trigger, supplying your ADF JSON schema as shown below.
Then, in the Office 365 Outlook connector, choose the 'Send an email' action; there you can reference your ADF details such as the data factory name, pipeline name and message.
The email then gets triggered and the recipient receives the alert.
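If you want to verify the Logic App endpoint before wiring it into the ADF Web activity, you could post a test payload with a short Python script like the one below; the URL and the field names (dataFactoryName, pipelineName, message, receiver) are placeholders and must match the JSON schema you configured in the trigger and the fields used in the Send an email action.

# Sketch: post a test payload to the Logic App's HTTP trigger URL (placeholder values throughout).
import json
import urllib.request

LOGIC_APP_URL = "https://prod-00.westeurope.logic.azure.com/workflows/<workflow-id>/triggers/manual/paths/invoke?<signature>"

payload = {
    "dataFactoryName": "my-adf",
    "pipelineName": "CopyToAzureSqlPipeline",
    "message": "Data migrated to Azure SQL Database successfully.",
    "receiver": "alerts@example.com",
}

req = urllib.request.Request(
    LOGIC_APP_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # 200/202 means the Logic App run was triggered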
I have been working on a requirement to send an email notification that lists the files transferred to a file share using Azure Data Factory.
I have also been instructed not to use Logic Apps or SendGrid, and I cannot use Log Analytics, as the team wants no additional charges applied to that subscription, among other reasons.
I have been trying to do this using AKS or Databricks.
Can anyone guide me through the process? I only have the details of the SMTP server and port (no credentials required). Please share any pseudocode if available.
Thanks in advance.
Unlike SSIS, Azure Data Factory and Azure Synapse pipelines do not have an out-of-the-box activity to send an email.
Don't fret: you can create a Logic App to send the email. That app can be called via a Web activity from either of the above services. Here are the details:
https://learn.microsoft.com/en-us/azure/data-factory/how-to-send-email
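For completeness, since the question rules out Logic Apps and SendGrid and only an anonymous SMTP host and port are available, a minimal sketch of sending the notification straight from a Databricks notebook (or any Python job) with the standard library's smtplib could look like this; the host, port, addresses and file list are placeholders.

# Sketch: email a list of transferred files through an internal SMTP relay (no authentication).
# smtplib and email are in the Python standard library, so no extra services are required.
import smtplib
from email.message import EmailMessage

SMTP_HOST = "smtp.mycompany.internal"  # placeholder relay host
SMTP_PORT = 25                         # placeholder relay port

transferred_files = ["sales_2023.csv", "customers_2023.csv"]  # e.g. collected after the copy step

msg = EmailMessage()
msg["Subject"] = "ADF file share transfer completed"
msg["From"] = "adf-alerts@mycompany.com"
msg["To"] = "data-team@mycompany.com"
msg.set_content("Files transferred:\n" + "\n".join(transferred_files))

with smtplib.SMTP(SMTP_HOST, SMTP_PORT) as server:
    server.send_message(msg)  # no login() needed because the relay requires no credentials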
I am looking for a proper solution architecture for a data transfer scenario from SQL Server to an external API and then from the API back to SQL Server. We are thinking of using Azure technologies.
We have a database hosted on an Azure VM. When the author value in the book table changes, we would like to gather all the data for that book from the related tables and transfer it to an external API. The number of rows to transfer is huge, so the select-join query takes a long time to execute. After the data is read, it is transformed and sent to an external API (over which we have no control); the transfer to the API alone can take up to an hour. Once the data has been written into the API, we read some reports from it and write those reports back into the original database.
We must repeat this process more than 50 times per day.
We are thinking of using a Logic App to detect the change in SQL Server (as it is hosted on an Azure VM), publish this event to Azure Event Grid, and then use Azure Durable Functions to read the SQL data, transform it, and send it to the external API.
Does this make sense? Does anybody have any better ideas?
Thanks in advance
At this moment the Logic App SQL connector can't detect when a particular row changes; it performs a select (which you provide) and then checks for changes every X interval (which you specify).
In other words, SQL Database doesn't offer a change feed like Cosmos DB, where you can subscribe to events and trigger an Azure Function.
Things you can do:
1- Add a trigger on SQL after insert/update that writes the new/changed row into a separate table; a Logic App or Azure Function can then query this table and retrieve the data.
2- Migrate to Cosmos DB and use the change feed + Azure Functions.
3- Change your code so that, after inserting into SQL Database, you also add a message with the identifier of the row you just inserted/updated to a queue, which is consumed by an Azure Function (see the sketch after this list).
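A minimal sketch of option 3, assuming an Azure Storage queue (azure-storage-queue package) on the producer side and a queue-triggered Azure Function on the consumer side; the queue name, connection string and message format are placeholders.

# Producer side: after writing the row to SQL Database, enqueue its identifier.
import os
from azure.storage.queue import QueueClient, TextBase64EncodePolicy

queue = QueueClient.from_connection_string(
    os.environ["STORAGE_CONN_STR"],
    "book-changes",
    message_encode_policy=TextBase64EncodePolicy(),  # Functions queue triggers expect Base64 by default
)
queue.send_message("12345")  # placeholder: id of the book row just inserted/updated

# Consumer side: a queue-triggered Azure Function (bound to the same queue in function.json)
# reads the id, runs the select-join, transforms the data and calls the external API.
import azure.functions as func

def main(msg: func.QueueMessage) -> None:
    book_id = msg.get_body().decode("utf-8")
    # read the related rows for book_id, transform them and push to the external API here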
I'm using this very useful SQLCLR script to make a REST call to an API and save the data to SQL Server on the fly.
I have created a stored procedure that pulls new data every hour, so my data is always up to date.
I would like to have all of this on Azure so I can then create a Power BI data visualization.
THE PROBLEM:
As soon as I try to transfer the database to Azure I receive this error:
TITLE: Microsoft SQL Server Management Studio
------------------------------
Could not import package.
Warning SQL0: A project which specifies SQL Server 2019 or Azure SQL Database Managed Instance as the target platform may experience compatibility issues with Microsoft Azure SQL Database v12.
Error SQL72014: .Net SqlClient Data Provider: Msg 40517, Level 16, State 1, Line 4 Keyword or statement option 'unsafe' is not supported in this version of SQL Server.
Error SQL72045: Script execution error. The executed script:
CREATE ASSEMBLY [ClrHttpRequest]
AUTHORIZATION [dbo]
FROM 0x4D5A90000300000004000000FFFF0000B800000000000000400000000000000000000000000000000000000000000000000000000000000000000000800000000E1FBA0E00B409CD21B8014CCD21546869732070726F6772616D2063616E6E6F742062652072756E20696E20444F53206D6F64652E0D0D0A2400000000000000504500004C0103006D85475F0000000000000000E00022200B0130000026000000060000000000007E45000000200000006000000000001000200000000200000400000000000000060000000000000000A00000000200004C1E01000300608500001000001000000000100000100000000000001000000000000000000000002C4500004F00000000600000FC03000000000000000000000000000000000000008000000C000000F44300001C0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000200000080000000000000000000000082000004800000000000000000000002E7465787400000084250000002000000026000000020000000000000000000000000000200000602E72737263000000FC030000006000000004000000280000000000000000000000000000400000402E72656C6F6300000C000000008000000002000000
(Microsoft.SqlServer.Dac)
------------------------------
BUTTONS:
OK
------------------------------
This happens because Azure SQL Database has some features stripped out, such as SQLCLR and SQL Server Agent (for some obvious security reasons).
Is there any alternative to SQLCLR on Azure?
Is there any alternative to SQL Server Agent on Azure?
Basically: how can I automate a REST call to an API every hour and save the result to SQL Server on Azure?
I do not think there is a straightforward replacement for SQLCLR. However, there are some Azure offerings that might be interesting.
One alternative is a scheduled Azure Function that calls the API and stores the result in the Azure SQL Database.
Do mind that if the process takes longer than 10 minutes you cannot use a Consumption plan for the Azure Function, which is probably the most cost-effective option.
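A minimal sketch of such a scheduled function (Python worker, timer trigger) is below; the API URL, table, columns and connection string are placeholders, and the hourly schedule itself lives in function.json as a CRON expression such as 0 0 * * * *.

# Timer-triggered Azure Function: call a REST API every hour and persist the rows in Azure SQL.
# Assumes the requests and pyodbc packages and a connection string stored as an app setting.
import os
import pyodbc
import requests
import azure.functions as func

def main(mytimer: func.TimerRequest) -> None:
    data = requests.get("https://api.example.com/items", timeout=30).json()  # placeholder API

    conn = pyodbc.connect(os.environ["SQLAZURE_CONN_STR"])
    with conn:  # commits on success, rolls back on error
        cursor = conn.cursor()
        for item in data:
            cursor.execute(
                "INSERT INTO dbo.ApiItems (Id, Name, FetchedAt) VALUES (?, ?, SYSUTCDATETIME())",
                item["id"], item["name"],  # placeholder fields
            )
    conn.close()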
Depending on the scenario, Azure Data Factory can also provide a solution. You can create a pipeline that calls the API and copies the data to SQL Server, as outlined here, based on a schedule trigger.
Even though Azure Functions is great, you could also solve this without much code using Azure Logic Apps: a schedule trigger, the HTTP action and the SQL Server connector.
https://azure.microsoft.com/de-de/services/logic-apps/
Use case
We have an on-premise Hadoop setup and we are using Power BI as the BI visualization tool. What we currently do to get data into Power BI is as follows:
1. Copy data from on-premise to Azure Blob storage (our on-premise scheduler does this once the data is ready in Hive).
2. Data from Azure Blob is then copied to Azure SQL Data Warehouse / Azure SQL.
3. The cube is refreshed on Azure AAS; AAS pulls data from Azure SQL Data Warehouse / SQL.
To do steps 2 and 3 we currently run a web server on Azure, and its endpoints are configured to take a few parameters such as the table name, the Azure file location, cube information and so on.
Sample http request:
http://azure-web-server-scheduler/copydata?from=blob&to=datawarehouse&fromloc=myblob/data/today.csv&totable=mydb.mytable
Here the web server extracts the values from the variables (from, fromloc, to, totable) and then performs the copy activity. We did this because we had a lot of tables and they could all reuse the same function.
Now we have use cases piling up (retries, control flow, email alerts, monitoring) and we are looking for a cloud alternative to do the scheduling for us; we would still like to hit an HTTP endpoint like the one above.
One of the alternatives we have checked so far is Azure Data Factory, where we would create pipelines to achieve the steps above and trigger the ADF pipelines using HTTP endpoints.
Problems
How can we take parameters from the HTTP POST call and make them available as custom variables [1]? This is required within the pipeline so that we can still write one function for each of steps 2 and 3 and have that function take these parameters; we don't want to create an ADF pipeline for each table.
How can we detect failures in ADF steps and send email alerts when they occur?
What are the other options apart from ADF to do this in Azure?
[1] https://learn.microsoft.com/en-us/azure/data-factory/control-flow-system-variables
You could trigger the copy job from Blob to SQL DW via a Get Metadata activity. It can be used in the following scenarios:
- Validate the metadata information of any data
- Trigger a pipeline when data is ready/ available
For email notification you can use a Web Activity calling a Logic App. See the following tutorial on how to send an email notification.
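On the first question (getting the values from the HTTP call into the pipeline), one commonly used pattern is to define pipeline parameters (from, to, fromloc, totable) and pass them in the body of the Data Factory createRun REST call, so a single parameterised pipeline can serve all tables. A rough sketch in Python, with the subscription, resource group, factory and pipeline names as placeholders:

# Sketch: start a parameterised ADF pipeline run via the Azure management REST API.
# Assumes the azure-identity and requests packages; all resource names are placeholders.
import requests
from azure.identity import DefaultAzureCredential

SUB, RG, FACTORY, PIPELINE = "<subscription-id>", "<resource-group>", "<factory-name>", "CopyBlobToDw"

token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token
url = (
    f"https://management.azure.com/subscriptions/{SUB}/resourceGroups/{RG}"
    f"/providers/Microsoft.DataFactory/factories/{FACTORY}"
    f"/pipelines/{PIPELINE}/createRun?api-version=2018-06-01"
)

# The request body becomes the pipeline's parameter values, mirroring the old query-string contract.
params = {
    "from": "blob",
    "to": "datawarehouse",
    "fromloc": "myblob/data/today.csv",
    "totable": "mydb.mytable",
}

run = requests.post(url, json=params, headers={"Authorization": f"Bearer {token}"})
print(run.json())  # contains the runId you can poll for status and use for failure alerts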