Azure Batch: Send email if job fails

As stated in the title, and similar to Azure Data Factory, I want to receive an email when my job fails.
How do I configure that in Azure Batch?

Related

Sending email alert

After migrating data to an Azure SQL database I need to send an email alert, which I did not find a way to do in Azure Data Factory.
In Azure Logic Apps I am unable to see the Azure SQL Database connector.
AFAIK, sending an email alert is not possible directly from Azure Data Factory, but you can achieve this using Azure Logic Apps by following this MS document.
In the Logic App, create a "When a HTTP request is received" trigger, using your ADF JSON schema.
Then, in the Office 365 Outlook connector, choose the "Send an email" action; there you can reference your ADF details such as the data factory name, pipeline name and message.
The email is then triggered and the recipient gets the alert.
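For reference, the pipeline side of this pattern is usually a Web activity that POSTs the failure details to the Logic App's HTTP trigger URL. A minimal sketch of that payload, sent from Python here purely for illustration; the callback URL and the field names (dataFactory, pipelineName, message) are placeholders that must match the JSON schema you define on the trigger:

```python
import requests

# Placeholder: the callback URL generated by the Logic App's
# "When a HTTP request is received" trigger.
LOGIC_APP_URL = "https://<your-logic-app-trigger-url>"

# Assumed field names; they must match the JSON schema configured on the
# trigger and are then available to the "Send an email" action as dynamic content.
payload = {
    "dataFactory": "myDataFactory",
    "pipelineName": "copyPipeline",
    "message": "Activity Copy_to_SQL failed",
}

response = requests.post(LOGIC_APP_URL, json=payload, timeout=30)
response.raise_for_status()
```

In an actual pipeline, the same JSON would typically be supplied in the body of a Web activity pointed at this URL, for example on an activity's failure dependency path.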

How to Send an Email Notification from Azure Data Factory

I have been working on a requirement to send an email notification listing the files that were transferred to a file share using Azure Data Factory.
I have also been instructed not to use Logic Apps or SendGrid, and I cannot use Log Analytics because the team wants no additional charges applied to that subscription, among other reasons.
I have been trying to do this via AKS or Databricks.
Can anyone guide me through the process? I only have the details of the SMTP server and port (no credentials required). Please share pseudocode if possible.
Thanks in advance.
Unlike SSIS, Azure Data Factory and Azure Synapse pipelines do not have an out-of-the-box activity to send an email.
Don't fret: you can create a Logic App to send the email, and the Logic App can be called via a Web activity from either of the above services. Here are the details.
https://learn.microsoft.com/en-us/azure/data-factory/how-to-send-email
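If Logic Apps and SendGrid really are off the table and only an anonymous SMTP host and port are available, as the question describes, one option is to send the mail directly from a Databricks notebook (or any Python host) after the transfer finishes. A minimal sketch; the host, port, addresses and file list are placeholders:

```python
import smtplib
from email.message import EmailMessage

# Placeholders: an internal SMTP relay that accepts unauthenticated mail.
SMTP_HOST = "smtp.internal.example.com"
SMTP_PORT = 25

# e.g. collected from the file share or the copy activity output
transferred_files = ["sales_2023.csv", "inventory_2023.csv"]

msg = EmailMessage()
msg["Subject"] = "ADF transfer complete"
msg["From"] = "adf-noreply@example.com"
msg["To"] = "data-team@example.com"
msg.set_content("Files transferred to the file share:\n" + "\n".join(transferred_files))

# No login() call, since the relay requires no credentials.
with smtplib.SMTP(SMTP_HOST, SMTP_PORT, timeout=30) as smtp:
    smtp.send_message(msg)
```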

HTTP requests in Azure Data Factory to retrieve XML data and store it in an Azure Storage blob

Previously we made a Logic App in Azure where we used an HTTP request to retrieve an XML file from our client's system.
It goes like this:
HTTP request --> response body is XML data --> we save that XML data in Azure Blob Storage as an XML file.
My question is how, and whether, it is possible to do the same thing in Azure Data Factory.
The reason for us to move this process over to Data Factory is that we also need to execute SQL Server stored procedures there, and in the Logic App there is that 2-minute timeout while some of our procedures run longer than 2 minutes.
If you're looking for a way to manually trigger an Azure Data Factory pipeline, you can run your pipeline on demand by using one of the following methods:
.NET SDK
Azure PowerShell module
REST API
Python SDK
The following sample command shows you how to run your pipeline by using the REST API manually:
POST
https://management.azure.com/subscriptions/mySubId/resourceGroups/myResourceGroup/providers/Microsoft.DataFactory/factories/myDataFactory/pipelines/copyPipeline/createRun?api-version=2017-03-01-preview
More information: Manual execution (on-demand) with JSON
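As a rough illustration of that REST call from Python, assuming the azure-identity and requests packages are available; the subscription, resource group, factory and pipeline names are the placeholders from the sample URL above:

```python
import requests
from azure.identity import DefaultAzureCredential

# Placeholder values copied from the sample URL above.
url = (
    "https://management.azure.com/subscriptions/mySubId/resourceGroups/myResourceGroup"
    "/providers/Microsoft.DataFactory/factories/myDataFactory"
    "/pipelines/copyPipeline/createRun?api-version=2017-03-01-preview"
)

# Acquire an Azure AD token for the ARM endpoint (service principal,
# managed identity or developer login, depending on the environment).
token = DefaultAzureCredential().get_token("https://management.azure.com/.default").token

response = requests.post(url, headers={"Authorization": f"Bearer {token}"}, json={})
response.raise_for_status()
print(response.json()["runId"])  # ID of the new pipeline run
```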
There are more questions to be answered, however, such as "can we increase the timeout for the Logic App" (yes, see HTTP request limits - Timeout duration), "does the Logic App need to wait for the stored procedures to complete" and "is Data Factory the best tool for the job". The best answer to your question depends on the answers to all of these.
Based on the information you provided, running the logic in a different way, such as a Logic App in an Integration Service Environment or an Azure Function, feels like the best option.

Azure Data Factory and Calling an Azure Batch Job

I am new to Azure Data Factory pipelines.
I would like guidance on how to call an Azure Batch job via an Azure Data Factory pipeline and monitor the batch job for failure/completion - is this possible?
Regards
I found the following article, which I am working through:
https://learn.microsoft.com/en-us/azure/data-factory/v1/data-factory-data-processing-using-batch
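Tying this back to the headline question about reacting to Batch job failures: once the pipeline has submitted work to Batch (typically via a Custom Activity, as in the linked article), the job can also be inspected directly with the azure-batch Python SDK. A rough sketch only; the account name, key, URL and job ID are placeholders, and class or parameter names may differ slightly between SDK versions:

```python
from azure.batch import BatchServiceClient
from azure.batch.batch_auth import SharedKeyCredentials

# Placeholders for the Batch account and the job submitted by ADF.
credentials = SharedKeyCredentials("mybatchaccount", "<account-key>")
client = BatchServiceClient(
    credentials, batch_url="https://mybatchaccount.westeurope.batch.azure.com"
)
job_id = "adfv2-custom-activity-job"

# A task has failed when its execution info reports a 'failure' result.
failed_tasks = [
    task.id
    for task in client.task.list(job_id)
    if task.execution_info is not None and task.execution_info.result == "failure"
]

if failed_tasks:
    # Here you could reuse one of the email approaches above (Logic App or SMTP).
    print(f"Job {job_id} has failed tasks: {failed_tasks}")
else:
    print(f"No failed tasks in job {job_id}.")
```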

Taking parameters from manual triggers in ADF

Use case
We have an on-premises Hadoop setup and we use Power BI as our BI visualization tool. What we currently do to get data into Power BI is as follows.
Copy data from on-premises to Azure Blob Storage (our on-premises scheduler does this once the data is ready in Hive)
Data from Azure Blob Storage is then copied to Azure SQL Data Warehouse / Azure SQL
The cube refreshes on Azure AAS; AAS pulls data from Azure SQL Data Warehouse / SQL
To do steps 2 and 3 we are currently running a web server on Azure, and its endpoints are configured to take a few parameters such as the table name, the Azure file location, cube information and so on.
Sample http request:
http://azure-web-server-scheduler/copydata?from=blob&to=datawarehouse&fromloc=myblob/data/today.csv&totable=mydb.mytable
Here the web server extracts the values of the parameters (from, fromloc, to, totable) and then does the copy. We did this because we had a lot of tables and they could all reuse the same function.
Now we have use cases piling up (retries, control flow, email alerts, monitoring) and we are looking for a cloud alternative to do the scheduling for us; we would still like to hit an HTTP endpoint like the one above.
One of the alternatives we have checked so far is Azure Data Factory, where we create pipelines to achieve the steps above and trigger ADF using HTTP endpoints.
Problems
How can we take parameters from the HTTP POST call and make them available as custom variables [1]? This is required within the pipeline so that we can still write one function for each of steps 2 and 3, have the function take these parameters, and avoid creating an ADF pipeline for each table (see the sketch below).
How can we detect failures in ADF steps and send email alerts when they happen?
What are the other options apart from ADF to do this in Azure?
[1] https://learn.microsoft.com/en-us/azure/data-factory/control-flow-system-variables
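On the first problem: the createRun REST call shown earlier accepts a JSON body of pipeline parameters, and the pipeline references them with @pipeline().parameters.<name>, so a single parameterized pipeline can serve all tables. A hedged sketch reusing the placeholder names from the sample request in the question; the parameter names must be declared on the pipeline:

```python
import requests

# Same createRun endpoint as in the earlier example; placeholders throughout.
url = (
    "https://management.azure.com/subscriptions/mySubId/resourceGroups/myResourceGroup"
    "/providers/Microsoft.DataFactory/factories/myDataFactory"
    "/pipelines/copyPipeline/createRun?api-version=2017-03-01-preview"
)

# The request body is a flat map of pipeline parameters, mirroring the
# query-string parameters the current web server accepts.
parameters = {
    "fromloc": "myblob/data/today.csv",
    "totable": "mydb.mytable",
}

token = "<bearer token, acquired as in the earlier createRun example>"
response = requests.post(url, headers={"Authorization": f"Bearer {token}"}, json=parameters)
response.raise_for_status()
```

Inside the pipeline, a Copy activity (or stored procedure activity) would then refer to these values with expressions such as @pipeline().parameters.fromloc and @pipeline().parameters.totable.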
You could trigger the copy job from Blob to SQL DW via a Get Metadata Activity. It can be used in the following scenarios:
- Validate the metadata information of any data
- Trigger a pipeline when data is ready/ available
For email notification you can use a Web Activity calling a Logic App. See the following tutorial on how to send an email notification.
