Sending email alert - Azure

After migrating data to an Azure SQL database, I need to send an email alert, which I could not find a way to do in Azure Data Factory.
In Azure Logic Apps I am also unable to see the Azure SQL Database connector.

AFAIK, sending an email alert is not possible directly from Azure Data Factory, but you can achieve it with Azure Logic Apps by following this MS document.
In the Logic App, create a "When a HTTP request is received" trigger using your ADF JSON schema.
Then, with the Office 365 Outlook connector, choose the "Send an email" action; in it you can reference your ADF details such as the data factory name, pipeline name, and message.
When the pipeline calls the Logic App, the email is triggered and the recipient receives the alert.
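The body that the ADF Web activity posts to the Logic App trigger can look like the following sketch. The `@{pipeline().DataFactory}` and `@{pipeline().Pipeline}` expressions are ADF system variables; the property names themselves are illustrative, not a fixed contract, and must match the JSON schema you give the HTTP trigger:

```json
{
    "dataFactoryName": "@{pipeline().DataFactory}",
    "pipelineName": "@{pipeline().Pipeline}",
    "message": "Copy to Azure SQL completed"
}
```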

Related

How to Send an Email Notification from Azure Data Factory

I have been working on a requirement to send an email notification listing the files that were transferred to a file share using Azure Data Factory.
I have been instructed not to use Logic Apps or SendGrid, and I cannot use Log Analytics, because the team wants no additional charges applied to that subscription, among other reasons.
I have been trying to do this via AKS or Databricks.
Can anyone guide me through the process? I only have the details of the SMTP server and port (no credentials required). Please share pseudocode if possible.
Thanks in advance.
Unlike SSIS, Azure Data Factory and Azure Synapse pipelines do not have an out-of-the-box activity to send an email.
Don't fret: you can create a Logic App to send the email, and call it via a Web activity from either service. Here are the details:
https://learn.microsoft.com/en-us/azure/data-factory/how-to-send-email
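If Logic Apps and SendGrid really are off the table, and you only have an SMTP host and port with no credentials (as the question states), the notification can be sketched in plain Python with the standard library `smtplib`, runnable from the Databricks cluster the asker mentioned. The host, port, and addresses below are placeholders:

```python
import smtplib
from email.message import EmailMessage


def build_notification(sender: str, recipient: str, files: list[str]) -> EmailMessage:
    """Build an email listing the files transferred to the file share."""
    msg = EmailMessage()
    msg["Subject"] = f"ADF transfer complete: {len(files)} file(s)"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content(
        "The following files were copied to the file share:\n"
        + "\n".join("  - " + f for f in files)
    )
    return msg


def send_notification(host: str, port: int, msg: EmailMessage) -> None:
    """Send via a plain SMTP relay that requires no authentication."""
    with smtplib.SMTP(host, port, timeout=30) as smtp:
        smtp.send_message(msg)


if __name__ == "__main__":
    # Placeholder relay and addresses -- replace with your own values,
    # then call send_notification("smtp.internal.example", 25, msg).
    msg = build_notification(
        "adf@example.com", "team@example.com",
        ["sales_2023.csv", "inventory_2023.csv"],
    )
    print(msg["Subject"])
```

The file list itself would come from the pipeline (for example, a Get Metadata activity's output passed to the job as a parameter); how it is plumbed through depends on your setup.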

How to configure mail on Azure SQL Database

We are moving from Microsoft SQL Server 2012 (SP1) - 11.0.3128.0 (X64) to Microsoft SQL Azure (RTM) - 12.0.2000.8.
Previously we sent mail via Database Mail whenever a particular table had not been updated within a particular time interval.
I tried to do the same on Azure SQL, but this functionality does not appear to be available there.
You can write a stored procedure to query the table and return results, read those results in an Azure WebJob, Logic App, or Azure Function, schedule the execution, and send the results via email. Here you can find an example using a Logic App, and here you can find how to schedule it.
You can use your own Exchange/SMTP server, or you can choose to use SendGrid to send the email, as shown here.
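The core check that the stored procedure and the scheduled function perform together ("has this table been updated in the last N minutes?") can be sketched as a small pure function. The table name, query, and threshold below are illustrative; in the real function the `last_updated` value would come from something like `SELECT MAX(updated_at) FROM dbo.MyTable`:

```python
from datetime import datetime, timedelta


def table_is_stale(last_updated: datetime, now: datetime,
                   max_age: timedelta) -> bool:
    """True when the table's newest row is older than the allowed interval,
    i.e. when the scheduled job should send the alert email."""
    return now - last_updated > max_age


# Example: alert when the table has not been touched for over an hour.
last = datetime(2023, 5, 1, 10, 0)
now = datetime(2023, 5, 1, 12, 0)
print(table_is_stale(last, now, timedelta(hours=1)))  # prints True
```

A timer-triggered Azure Function would run this check on a schedule and, when it returns `True`, hand off to the Logic App or SMTP/SendGrid step described above.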

Azure Batch: Send email if job fails

As stated in the title, similar to Azure Data Factory, I want a feature where I get an email if my job fails.
How do I configure that in Azure Batch?

Taking parameters from manual triggers in ADF

Usecase
We have an on-premises Hadoop setup and use Power BI as our BI visualization tool. To get data into Power BI we currently do the following:
1. Copy data from on-premises to Azure Blob storage (our on-premises scheduler does this once the data is ready in Hive).
2. Copy the data from Azure Blob storage to Azure SQL Data Warehouse / Azure SQL.
3. Refresh the cube on Azure AAS; AAS pulls the data from the data warehouse / SQL database.
For steps 2 and 3 we currently run a web server on Azure whose endpoints take a few parameters, such as the table name, the Azure file location, and cube information.
Sample HTTP request:
http://azure-web-server-scheduler/copydata?from=blob&to=datawarehouse&fromloc=myblob/data/today.csv&totable=mydb.mytable
The web server extracts the values of the variables (from, fromloc, to, totable) and then performs the copy activity. We did it this way because we have many tables, and they can all reuse the same function.
Now use cases are piling up (retries, control flow, email alerts, monitoring) and we are looking for a cloud alternative to do the scheduling for us; we would still like to hit an HTTP endpoint like the one above.
One alternative we have checked so far is Azure Data Factory, where we would create pipelines to achieve the steps above and trigger ADF via HTTP endpoints.
Problems
How can we take parameters from the HTTP POST call and make them available as custom variables [1]? This is required within the pipeline so that we can still write one function for each of steps 2 and 3 that takes these parameters; we don't want to create an ADF per table.
How can we detect failures in ADF steps and send email alerts on failure?
What are the other options, apart from ADF, to do this in Azure?
[1] https://learn.microsoft.com/en-us/azure/data-factory/control-flow-system-variables
You could trigger the copy job from Blob to SQL DW via a Get Metadata activity. It can be used in the following scenarios:
- Validate the metadata of any data
- Trigger a pipeline when data is ready/available
For email notification you can use a Web activity calling a Logic App. See the following tutorial on how to send an email notification.
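On passing parameters per table instead of creating one data factory per table: ADF pipelines accept parameters at run time, and the "Pipelines - Create Run" REST call lets a caller keep hitting an HTTP endpoint much like the existing web server. A sketch that builds the request, with the subscription, resource group, factory, and pipeline names as placeholders (the real POST also needs an Azure AD bearer token in the `Authorization` header):

```python
import json


def create_run_request(subscription: str, resource_group: str,
                       factory: str, pipeline: str,
                       parameters: dict) -> tuple[str, str]:
    """Build the URL and JSON body for the ADF 'Pipelines - Create Run' call.

    The body carries the per-table parameters, so a single parameterized
    pipeline can serve every table, mirroring the old query string.
    """
    url = (
        "https://management.azure.com"
        f"/subscriptions/{subscription}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.DataFactory"
        f"/factories/{factory}/pipelines/{pipeline}"
        "/createRun?api-version=2018-06-01"
    )
    return url, json.dumps(parameters)


# from/fromloc/to/totable become pipeline parameters, referenced inside
# the pipeline as e.g. @pipeline().parameters.totable
url, body = create_run_request(
    "my-sub", "my-rg", "my-factory", "CopyBlobToDw",
    {"fromloc": "myblob/data/today.csv", "totable": "mydb.mytable"},
)
```

Failure detection then hangs off the same pipeline: wire the Web activity that calls the Logic App to the failure path of the copy activity.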

I need to send an email from an Azure Data Factory .NET activity (C# code)

I am running a .NET activity against Azure Data Lake Store using C#, and I need to send an email based on whether my job fails or passes, or to write some logging information.
I know that on-premises we have options such as Database Mail and SMTP, but what about in Data Factory?
Regards,
Manish
