How to run Python scripts in Synapse in production environments - Azure

How can we run Python scripts in Synapse in production environments? I want to know the best practices.

You can use custom activities in an Azure Data Factory or Azure Synapse Analytics pipeline to run Python scripts.
For more details, refer to the links below:
Use custom activities in an Azure Data Factory or Azure Synapse Analytics pipeline
Tutorial: Run Python scripts through Azure Data Factory using Azure Batch
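To make this concrete, the Custom activity essentially runs a command line (for example, python process_data.py --run-date 2021-06-01) on an Azure Batch pool node, so the script itself is ordinary Python. A minimal sketch, where the script name and parameter are made up for illustration:

```python
# process_data.py - a hypothetical script uploaded to a storage container the Batch pool can reach.
# The Custom activity's command would be something like:
#   python process_data.py --run-date 2021-06-01
import argparse
import sys

def main() -> int:
    parser = argparse.ArgumentParser(description="Sample job for an ADF/Synapse Custom activity")
    parser.add_argument("--run-date", required=True, help="Value passed in from the pipeline")
    args = parser.parse_args()

    # Real work (read from ADLS, transform, write results) would go here.
    print(f"Processing slice for {args.run_date}")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```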

Related

Create Azure Synapse Pipeline using Terraform

I'm new to Azure and was wondering if we can create a Synapse pipeline using Terraform. I tried searching the HashiCorp website but could not find anything that specifically provides instructions on how to create an Azure Synapse pipeline using Terraform.
As of today, it is not possible to create an Azure Synapse pipeline using Terraform.
You can create an Azure Synapse pipeline using the Azure CLI.
For more details, refer to az synapse pipeline
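To illustrate, here is a rough sketch of driving that CLI command from Python; the workspace name, pipeline name, and the single Wait activity in the definition are placeholders, and it assumes you have already run az login:

```python
# Hypothetical example of creating a Synapse pipeline with the Azure CLI (az synapse pipeline create).
# Workspace and pipeline names are placeholders; `az login` is assumed to have been run.
import json
import subprocess

workspace = "my-synapse-workspace"   # placeholder
pipeline_name = "DemoPipeline"       # placeholder
definition_path = "pipeline.json"

# A minimal pipeline definition with a single Wait activity, just to show the shape of the file.
definition = {
    "activities": [
        {"name": "WaitOneMinute", "type": "Wait", "typeProperties": {"waitTimeInSeconds": 60}}
    ]
}
with open(definition_path, "w") as f:
    json.dump(definition, f)

# The CLI expands the @-prefixed argument into the file's contents.
subprocess.run(
    ["az", "synapse", "pipeline", "create",
     "--workspace-name", workspace,
     "--name", pipeline_name,
     "--file", f"@{definition_path}"],
    check=True,
)
```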

Backup Azure SQL Database using Azure DevOps

We are using Azure DevOps to deploy changes to an Azure SQL database using a DACPAC. I want to add a step in the build or release pipeline to take a backup of the database, but I didn't find any such task in the DevOps Marketplace. Can anyone suggest a way to take a DB backup in a DevOps pipeline? It would be very helpful. Thanks.
The Azure CLI's az sql db commands can be used to manage your Azure SQL database.
Azure DevOps provides an Azure CLI task to call the Azure CLI in Azure DevOps pipelines.
Here are two documents that you can refer to:
Use CLI to backup an Azure SQL single database to an Azure storage container
Use CLI to restore a single database in Azure SQL Database to an earlier point in time
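As a sketch of what such a backup step could run, the following exports the database to a .bacpac file in blob storage with az sql db export; every name and credential below is a placeholder, and in a real pipeline the secrets would come from pipeline variables or a key vault:

```python
# Hypothetical backup step: export an Azure SQL database to a .bacpac in blob storage.
# All names, credentials and URLs are placeholders.
import subprocess

subprocess.run(
    ["az", "sql", "db", "export",
     "--resource-group", "my-rg",
     "--server", "my-sql-server",
     "--name", "my-database",
     "--admin-user", "sqladmin",
     "--admin-password", "<secret-from-pipeline-variable>",
     "--storage-key-type", "StorageAccessKey",
     "--storage-key", "<storage-account-key>",
     "--storage-uri", "https://mystorage.blob.core.windows.net/backups/my-database.bacpac"],
    check=True,
)
```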

Can we use Azure DevOps Test Plans for Data Lake testing

I'm working on a Data Lake project. I'm using Azure Databricks (writing PySpark code) for ETL and Azure DevOps for CI/CD and source control. I have noticed Test Plans in Azure DevOps; my question is, can I use Test Plans for Data Lake testing? I browsed the internet for related test plans but did not find anything about plans related to Data Lake, database, or data warehousing testing.
You can use Azure DevOps Test Plans for Data Lake testing. If your Data Lake test cases are written in C#, you can use Azure DevOps Test Plans for Data Lake testing just like any other C# test project.
1. Here is an example of setting up a Data Lake test case: Test your Azure Data Lake Analytics code.
2. Then create test work items in your Azure DevOps Boards.
3. After the work items are created in Azure DevOps, associate them with the test cases in your Visual Studio test project. Please check the detailed steps in Associate automated tests with test cases.
4. In Test Plans in your Azure DevOps project, create test plans for your test work items.
5. Then you can run the automated tests from the Test hub in Azure DevOps.

How to run Azure CLI command from azure ADF pipeline?

I have a set of Azure CLI commands ready which append data to an existing Azure Data Lake file.
We need to run all these commands from an ADF (Azure Data Factory) pipeline. Does anyone have any idea how we can run Azure CLI commands from an ADF pipeline?
You can create an Azure function and call it from ADF with the Azure Function Activity: https://learn.microsoft.com/en-us/azure/data-factory/control-flow-azure-function-activity
Here is a tutorial to run azure-cli commands in Azure Functions: https://learn.microsoft.com/en-us/azure/azure-functions/scripts/functions-cli-create-serverless
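If the end goal is appending data to a Data Lake file, one option (a sketch only, using the ADLS Gen2 SDK inside the function instead of shelling out to the CLI) looks roughly like this; the account URL, filesystem, and file path are placeholders, and the Function's identity is assumed to have write access to the storage account:

```python
# __init__.py of an HTTP-triggered Azure Function (Python v1 programming model).
# Sketch only: appends the request body to an existing ADLS Gen2 file.
# requirements.txt would need: azure-functions, azure-identity, azure-storage-file-datalake
import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_URL = "https://mydatalake.dfs.core.windows.net"   # placeholder
FILESYSTEM = "raw"                                        # placeholder
FILE_PATH = "logs/append-target.csv"                      # placeholder

def main(req: func.HttpRequest) -> func.HttpResponse:
    data = req.get_body()
    if not data:
        return func.HttpResponse("Empty request body", status_code=400)

    # Assumes the Function's managed identity has data access on the storage account.
    service = DataLakeServiceClient(ACCOUNT_URL, credential=DefaultAzureCredential())
    file_client = service.get_file_system_client(FILESYSTEM).get_file_client(FILE_PATH)

    # Append at the current end of the file, then flush to commit the new bytes.
    offset = file_client.get_file_properties().size
    file_client.append_data(data, offset=offset, length=len(data))
    file_client.flush_data(offset + len(data))

    return func.HttpResponse(f"Appended {len(data)} bytes", status_code=200)
```

ADF would then call this function from the pipeline via the Azure Function activity.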
Hope this helped!!

Azure DevOps pipelines for Azure Databricks

I have no idea what parts of Azure Databricks can be driven from an Azure DevOps pipeline. We are planning to use GitHub as the repository.
For example, can Azure Databricks code be kept in files that I then manage in the Git repo?
Can we use an Azure DevOps CD pipeline for deployment to Azure Databricks?
Can we use an Azure DevOps CD pipeline for deployment to Azure Databricks?
The short answer is yes.
You can configure an Azure Databricks workspace to use Azure DevOps, and there is a marketplace task, Databricks Script Deployment Task by Data Thirst, which gives you the option of deploying scripts, secrets, and notebooks to Databricks.
For more details, you can refer to the following document:
CI/CD with Databricks and Azure DevOps
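To make the deployment step concrete, here is a rough sketch of pushing a notebook from the repo into a Databricks workspace with the Workspace Import REST API (the kind of call such a deployment task makes); the host, token, and paths are placeholders and would come from pipeline variables and secrets in practice:

```python
# Hypothetical release-pipeline step: import a notebook source file from the Git repo
# into a Databricks workspace with the Workspace Import REST API.
# Host, token and paths are placeholders; the token would come from a pipeline secret.
import base64
import requests

DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "<databricks-pat-from-pipeline-secret>"                          # placeholder

with open("notebooks/etl_job.py", "rb") as f:   # file tracked in the Git repo
    content = base64.b64encode(f.read()).decode("ascii")

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "path": "/Shared/etl_job",   # target notebook path in the workspace
        "language": "PYTHON",
        "format": "SOURCE",
        "content": content,
        "overwrite": True,
    },
    timeout=30,
)
resp.raise_for_status()
```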
Hope this helps.
