How to run Azure CLI commands from an Azure Data Factory (ADF) pipeline?

I have a set of Azure CLI commands ready that append data to an existing Azure Data Lake file.
We need to run all of these commands from an ADF (Azure Data Factory) pipeline. Does anyone have any idea how we can run Azure CLI commands from an ADF pipeline?

You can create an Azure Function and call it from ADF with the Azure Function activity: https://learn.microsoft.com/en-us/azure/data-factory/control-flow-azure-function-activity
Here is a tutorial on running Azure CLI commands in Azure Functions: https://learn.microsoft.com/en-us/azure/azure-functions/scripts/functions-cli-create-serverless
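For the append step itself, the Azure CLI has an az storage fs file append command for Data Lake Storage Gen2 files, which the Function (or any scripted step) can call. A minimal sketch, assuming a hypothetical storage account mydatalake, filesystem myfs, target file logs/data.csv, and a service principal whose credentials are available as environment variables; replace these with your own names:

    # Non-interactive login, suitable for a Function or pipeline step
    az login --service-principal -u "$APP_ID" -p "$CLIENT_SECRET" --tenant "$TENANT_ID"

    # Append content to an existing Data Lake Storage Gen2 file
    az storage fs file append \
        --account-name mydatalake \
        --file-system myfs \
        --path logs/data.csv \
        --content "new,row,of,data" \
        --auth-mode login

The Azure Function activity in ADF then only has to invoke the Function's endpoint; the CLI calls run inside the Function.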
Hope this helps!

Related

How to run Python scripts in Synapse in production environments

How can we run Python scripts in Synapse in production environments? I want to know the best practices.
You can use custom activities in an Azure Data Factory or Azure Synapse Analytics pipeline to run Python scripts.
For more details, refer to the links below; a sketch of such an activity follows them:
Use custom activities in an Azure Data Factory or Azure Synapse Analytics pipeline
Tutorial: Run Python scripts through Azure Data Factory using Azure Batch
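To make this concrete, a Custom activity points at an Azure Batch linked service and runs an arbitrary command, such as python main.py, on the Batch pool. A minimal sketch of the activity definition, assuming hypothetical linked services named AzureBatchLinkedService and BlobStorageLinkedService and a script stored under a scripts folder in blob storage:

    {
        "name": "RunPythonScript",
        "type": "Custom",
        "linkedServiceName": {
            "referenceName": "AzureBatchLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "command": "python main.py",
            "resourceLinkedService": {
                "referenceName": "BlobStorageLinkedService",
                "type": "LinkedServiceReference"
            },
            "folderPath": "scripts"
        }
    }

All names here are placeholders; the exact schema is covered in the custom activities article linked above.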

Create Azure Synapse Pipeline using Terraform

I'm new to Azure and was wondering if we can create a Synapse pipeline using Terraform. I searched the HashiCorp website but could not find anything that specifically provides instructions on how to create an Azure Synapse pipeline using Terraform.
As of today, it is not possible to create an Azure Synapse pipeline using Terraform.
You can create an Azure Synapse pipeline using the Azure CLI.
For more details, refer to az synapse pipeline.
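A minimal sketch of creating a Synapse pipeline from a JSON definition with the Azure CLI, assuming a hypothetical workspace mysynapsews and a local definition file pipeline.json:

    # Create a Synapse pipeline from a JSON definition file
    az synapse pipeline create \
        --workspace-name mysynapsews \
        --name MyPipeline \
        --file @"pipeline.json"

Terraform can still provision the workspace itself (for example with azurerm_synapse_workspace); only the pipeline definition needs to go through the CLI, the REST API, or Synapse Studio.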

Back up an Azure SQL database using Azure DevOps

We are using Azure DevOps to deploy changes to an Azure SQL database using a DACPAC. I want to add a step in the build or release pipeline to take a backup of the database, but I didn't find any task for this in the DevOps Marketplace. Can anyone suggest a way to take a DB backup in a DevOps pipeline? It would be very helpful. Thanks.
The Azure CLI's az sql db commands can be used to manage your Azure SQL database.
Azure DevOps provides an Azure CLI task for calling the Azure CLI in Azure DevOps pipelines.
Here are two documents that you can refer to, followed by a quick example:
Use CLI to backup an Azure SQL single database to an Azure storage container
Use CLI to restore a single database in Azure SQL Database to an earlier point in time
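For example, az sql db export produces a BACPAC (a logical backup) in a storage container. A minimal sketch, assuming a hypothetical resource group myrg, server myserver, database mydb, and a storage account key available as a pipeline secret:

    # Export the database to a BACPAC file in blob storage
    az sql db export \
        --resource-group myrg \
        --server myserver \
        --name mydb \
        --admin-user "$SQL_ADMIN_USER" \
        --admin-password "$SQL_ADMIN_PASSWORD" \
        --storage-key-type StorageAccessKey \
        --storage-key "$STORAGE_KEY" \
        --storage-uri "https://mystorageaccount.blob.core.windows.net/backups/mydb.bacpac"

Wrapped in an Azure CLI task, this runs as an ordinary step in your build or release pipeline.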

Is it possible to fail over Azure Functions using a VSTS/Azure DevOps release deployment task?

I have a timer-triggered function that I'm looking to deploy to two different regions in an active-passive pattern. In a disaster recovery scenario, I want to disable the active instance and then activate the passive instance, in such a way that I can also keep a record of this activity.
I know this can be done via PowerShell/the Azure CLI, but I think doing it via an Azure release pipeline would be better for auditing purposes. Does anyone know if this is doable?
I don't know much about Azure Functions, but since this can be done via PowerShell/the Azure CLI, you can use the Azure PowerShell task or the Azure CLI task to run those scripts in an Azure release pipeline.
To run PowerShell/Azure CLI scripts in a release pipeline, you first need to connect Azure DevOps to your Microsoft Azure subscription via a service connection. Check here to create a service connection in Azure DevOps; this service connection is needed for the azureSubscription parameter of the Azure PowerShell task/Azure CLI task.
Then you can create a release pipeline and add an Azure PowerShell task or Azure CLI task to run the scripts.
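For the failover itself, the script inside the Azure CLI task can be as simple as stopping the active function app and starting the passive one; the release run history then gives you the audit trail you mentioned. A minimal sketch, assuming hypothetical app and resource group names:

    # Disable the active (primary) function app
    az functionapp stop --name myfunc-primary --resource-group rg-primary

    # Activate the passive (secondary) function app
    az functionapp start --name myfunc-secondary --resource-group rg-secondary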

Azure DevOps pipelines for Azure Databricks

I have no idea what parts of Azure Databricks can be driven from an Azure DevOps pipeline. We are planning to use GitHub as the repository.
For example, can Azure Databricks code be kept in files that I then manage in a Git repo?
Can we use an Azure DevOps CD pipeline for deployment to Azure Databricks?
Can we use an Azure DevOps CD pipeline for deployment to Azure Databricks?
The short answer is yes.
You can configure an Azure Databricks workspace to use Azure DevOps, and there is a task, Databricks Script Deployment Task by Data Thirst, which gives you the option of deploying scripts, secrets, and notebooks to Databricks.
For more details, you can refer to the following document:
CI/CD with Databricks and Azure DevOps
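Under the hood these tasks call the Databricks REST API, and you can do the same thing yourself from a bash or Azure CLI step with the databricks-cli. A minimal sketch, assuming a hypothetical workspace URL, a personal access token stored as a pipeline secret, and a notebook kept in the Git repo:

    # Install the (legacy) Databricks CLI; it authenticates via environment variables
    pip install databricks-cli
    export DATABRICKS_HOST="https://adb-1234567890123456.7.azuredatabricks.net"
    export DATABRICKS_TOKEN="$DATABRICKS_PAT"   # personal access token from a pipeline secret

    # Deploy a notebook from the repo into the workspace
    databricks workspace import ./notebooks/etl.py /Shared/etl --language PYTHON --overwrite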
Hope this helps.