Azure Data Factory and Calling an Azure Batch Job

I am new to Azure Data Factory pipelines.
I would like guidance on how to call an Azure Batch job from an Azure Data Factory pipeline and monitor the batch job for failure/completion. Is this possible?
Regards

I found the following article, which I am working through:
https://learn.microsoft.com/en-us/azure/data-factory/v1/data-factory-data-processing-using-batch
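Yes, this is possible. The article above covers ADF v1 (the DotNetActivity); in ADF v2 the equivalent is the Custom activity, which submits a task to an Azure Batch pool. ADF itself polls the Batch task and marks the activity Succeeded or Failed, so completion/failure monitoring shows up directly in the pipeline run view. A minimal sketch, assuming the linked service and folder names below (they are placeholders, not from the question):

```json
{
    "name": "RunBatchJob",
    "type": "Custom",
    "linkedServiceName": {
        "referenceName": "AzureBatchLinkedService",
        "type": "LinkedServiceReference"
    },
    "typeProperties": {
        "command": "cmd /c python process.py",
        "resourceLinkedService": {
            "referenceName": "BlobStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "folderPath": "adfjobs/scripts"
    },
    "policy": {
        "timeout": "0.02:00:00",
        "retry": 1
    }
}
```

ADF stages the contents of `folderPath` onto the Batch node, runs `command` there, and surfaces the task's exit code as the activity outcome, so a non-zero exit fails the pipeline activity.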

Related

How to sequence pipeline execution trigger in Azure Data Factory

I am working on a migration project where a few SQL Server Integration Services projects will be moved to Azure Data Factory. We currently have several jobs scheduled via SQL Server Agent, each with multiple steps. If we were to replicate this using Azure Data Factory triggers, is there a way to group multiple pipelines together and sequence their execution, like the multiple job steps in SQL Server Agent?
For instance:
Load all of the lookup tables
Load all of the staging tables
Load all of the dimension tables
Load Fact table
Please guide in the right direction.
You can use the Execute Pipeline activity to build a master pipeline that runs your other pipelines in sequence.
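For example, a master pipeline could chain the four loads with `dependsOn` conditions so each stage runs only after the previous one succeeds (the pipeline names below are placeholders):

```json
{
    "name": "MasterPipeline",
    "properties": {
        "activities": [
            {
                "name": "LoadLookupTables",
                "type": "ExecutePipeline",
                "typeProperties": {
                    "pipeline": { "referenceName": "PL_LoadLookupTables", "type": "PipelineReference" },
                    "waitOnCompletion": true
                }
            },
            {
                "name": "LoadStagingTables",
                "type": "ExecutePipeline",
                "dependsOn": [
                    { "activity": "LoadLookupTables", "dependencyConditions": [ "Succeeded" ] }
                ],
                "typeProperties": {
                    "pipeline": { "referenceName": "PL_LoadStagingTables", "type": "PipelineReference" },
                    "waitOnCompletion": true
                }
            },
            {
                "name": "LoadDimensionTables",
                "type": "ExecutePipeline",
                "dependsOn": [
                    { "activity": "LoadStagingTables", "dependencyConditions": [ "Succeeded" ] }
                ],
                "typeProperties": {
                    "pipeline": { "referenceName": "PL_LoadDimensionTables", "type": "PipelineReference" },
                    "waitOnCompletion": true
                }
            },
            {
                "name": "LoadFactTable",
                "type": "ExecutePipeline",
                "dependsOn": [
                    { "activity": "LoadDimensionTables", "dependencyConditions": [ "Succeeded" ] }
                ],
                "typeProperties": {
                    "pipeline": { "referenceName": "PL_LoadFactTable", "type": "PipelineReference" },
                    "waitOnCompletion": true
                }
            }
        ]
    }
}
```

Setting `waitOnCompletion` to `true` is what makes this behave like sequential SQL Server Agent job steps: the master pipeline blocks on each child pipeline before the next dependency fires.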

Is there any way to run python script from Azure File Shares using Batch Service activity in Azure Data Factory?

I know how to run a Python script using the Batch Service in Azure Data Factory if the script is located in a Blob Storage container.
But I need to put the script in Azure File Shares and execute it using a Batch Service custom activity.
Unfortunately, it is not possible to run a Python script from an Azure File share using the Batch Service activity in ADF, as the Azure Batch connector is linked to Blob storage and therefore cannot establish a connection to File storage.
I would recommend logging a feature request here: https://feedback.azure.com/d365community/forum/1219ec2d-6c26-ec11-b6e6-000d3a4f032c
All the feedback shared in this forum is actively monitored and reviewed by the ADF product team.

Use Azure Functions as custom activity in ADFv2

Is it possible to somehow package and execute an already written Azure Function as a custom activity in Azure Data Factory?
My workflow is as follows:
I want to use an Azure Function (which does some data processing) in an ADF pipeline as a custom activity. This custom activity is just one of the activities in the pipeline, but it is key that it gets executed.
Is it possible to somehow package and execute an already written Azure Function as a custom activity in Azure Data Factory?
As far as I know, there is no way to do that so far. In my opinion, you do not need to package the Azure Function. I suggest using the Web activity to invoke the endpoint of your Azure Function, which fits nicely into your existing pipeline.
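A Web activity calling an HTTP-triggered Function might look like the sketch below; the function app URL, key placeholder, and the `RunDate` pipeline parameter are illustrative assumptions, not from the question:

```json
{
    "name": "InvokeProcessingFunction",
    "type": "WebActivity",
    "typeProperties": {
        "url": "https://myfunctionapp.azurewebsites.net/api/ProcessData?code=<function-key>",
        "method": "POST",
        "headers": { "Content-Type": "application/json" },
        "body": { "runDate": "@{pipeline().parameters.RunDate}" }
    }
}
```

Note that later versions of ADF v2 also ship a native Azure Function activity, which keeps the function key in a linked service rather than embedding it in the URL.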

Fetch on-demand data from Azure Data Factory Pipeline

I have searched for on-demand data fetch but only found details about scheduling an ADF pipeline.
I want to know how to achieve an on-demand data load from an ADF pipeline.
Documentation for One-time pipelines is here: https://learn.microsoft.com/en-us/azure/data-factory/data-factory-scheduling-and-execution#onetime-pipeline
You can use this for example with PowerShell (https://learn.microsoft.com/en-us/azure/data-factory/data-factory-copy-activity-tutorial-using-powershell) to script one-time execution.

Scheduling a U-SQL Job

I am trying to schedule a U-SQL job. Please let me know whether I can schedule a U-SQL job, and if so, how.
Thanks,
Vinoth
To my mind, the best way to orchestrate your U-SQL job along with the surrounding data management, such as fetching source data and pushing output data, is Azure Data Factory V2. ADF has a rich API. Basically, you can run your jobs using PowerShell, C#, or a trigger.
See my very simple example of the job and how to add a trigger below. In this example, I process the documents with my U-SQL job and then push the output file (CSV or Avro) into Azure SQL Server.
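The answerer's original example is not reproduced here; as a rough sketch under assumed names, an ADF v2 pipeline runs a U-SQL script through the `DataLakeAnalyticsU-SQL` activity (all linked service, path, and pipeline names below are placeholders):

```json
{
    "name": "USQLPipeline",
    "properties": {
        "activities": [
            {
                "name": "ProcessDocuments",
                "type": "DataLakeAnalyticsU-SQL",
                "linkedServiceName": {
                    "referenceName": "AdlaLinkedService",
                    "type": "LinkedServiceReference"
                },
                "typeProperties": {
                    "scriptPath": "scripts/ProcessDocuments.usql",
                    "scriptLinkedService": {
                        "referenceName": "StorageLinkedService",
                        "type": "LinkedServiceReference"
                    },
                    "degreeOfParallelism": 2
                }
            }
        ]
    }
}
```

To schedule it, attach a ScheduleTrigger that references this pipeline with a recurrence (for example daily), then start the trigger, e.g. with the `Start-AzDataFactoryV2Trigger` PowerShell cmdlet.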
You could use Azure Automation (with the help of the Azure Data Lake Analytics Cmdlets) or Azure Data Factory to schedule a U-SQL script in the cloud.
You can get some guidance on creating an ADF pipeline here:
https://azure.microsoft.com/en-us/documentation/articles/data-factory-build-your-first-pipeline-using-editor/