Missing argument 'target_path' while deploying Databricks notebooks using Azure DevOps CI/CD - databricks

I am trying to deploy databricks notebooks using cicd. But I am facing an error in the release pipeline.
Error: Missing argument 'target_path'
The command I am using to deploy the notebooks to the target workspace:
databricks workspace import_dir --overwrite _databricks/databricks "//$(notebook_folder)"
Here the Shared folder is my target path, and for the Shared folder I used the variable (notebook_folder). I am using a self-hosted agent for the pipeline; there are some restrictions that mean I am not allowed to use Microsoft-hosted agents.
Any lead will be helpful.
I tried to deploy to the Users folder instead, but that did not solve the issue.
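For reference, a minimal sketch of the documented form of that command, assuming the legacy Databricks CLI and that $(notebook_folder) resolves to Shared - import_dir takes a source path followed by a target path, and the target must be a single absolute workspace path:

    # Legacy Databricks CLI: import_dir SOURCE_PATH TARGET_PATH
    # e.g. copies everything under _databricks/databricks into /Shared
    databricks workspace import_dir --overwrite _databricks/databricks "/$(notebook_folder)"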

Related

How can I do CI/CD of a Databricks Notebook in Azure DevOps?

I want to do CI/CD of my Databricks notebook. Steps I followed:
I have integrated my Databricks workspace with Azure Repos.
Created a build artifact using a YAML script, which holds my notebook.
Deployed the build artifact into the Databricks workspace in YAML.
Now I want to:
Execute and schedule the Databricks notebook from the Azure DevOps pipeline itself.
Set up multiple environments like Stage, Dev, and Prod using YAML.
Have my notebook call other notebooks - can I do this?
How can I solve this?
It's doable, and with Databricks Repos you really don't need to create a build artifact & deploy it - it's better to use the Repos API or the databricks repos CLI commands to update another checkout that will be used for tests.
For testing notebooks I always recommend the Nutter library from Microsoft, which simplifies testing of notebooks by allowing you to trigger their execution from the command line.
You can include other notebooks using the %run directive - it's important to use relative paths instead of absolute paths. You can organize dev/staging/prod either as folders inside the Repos or as fully separated environments - it's up to you.
I have a demo of notebook testing & Repos integration with CI/CD - it contains all the necessary instructions on how to set up dev/staging/prod plus an Azure DevOps pipeline that tests the notebook & triggers the release pipeline.
The only thing that I want to mention explicitly - for Azure DevOps you will need to use an Azure DevOps personal access token, because identity passthrough doesn't work with the APIs yet.
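As a rough sketch of what that can look like from a pipeline script (the repo path, branch variable and cluster id below are illustrative placeholders, not values from this answer):

    # Point the staging checkout in the workspace at the branch under test
    # (legacy Databricks CLI, authenticated via DATABRICKS_HOST / DATABRICKS_TOKEN)
    databricks repos update --path /Repos/Staging/my-project --branch "$(Build.SourceBranchName)"

    # Run the notebook tests with Nutter against an existing cluster
    # (flag names from memory - check `nutter run --help` for the exact syntax)
    pip install nutter
    nutter run /Repos/Staging/my-project/tests/ --cluster_id <cluster-id> --recursive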

Automated way to add & deploy SPFX sppkg solution package file to SharePoint 2019 (on-premises) app catalog

I'm looking for an automated way of adding & deploying an SPFX solution package (*.sppkg) into the SharePoint 2019 (NOT Online) app catalog. This is because I am doing the deployment using an Azure DevOps (CI/CD) release pipeline.
I found the two tasks below for adding this package:
The first one worked fine for adding the file to the app catalog, but it did not deploy it, so I had to do that manually.
The second task has the option to write a PnP script against SharePoint, but the problem is that most of the scripts I found are for SharePoint Online, not on-premises.
I would appreciate your support if you have faced such a situation and solved it using PnP or something else.
Microsoft's official documentation describes an approach to continuous deployment using Azure DevOps. You can click Implement Continuous Integration and Continuous Deployment using Azure DevOps for detailed information. There is no need to write PnP scripts using this method.
Setting up Azure DevOps for Continuous Deployments with a SharePoint Framework solution requires the following steps:
Creating the Release Definition
Linking the Build Artifact
Creating the Environment
Installing NodeJS
Installing the CLI for Microsoft 365
Connecting to the App Catalog
Adding the Solution Package to the App Catalog
Deploying the Application
Setting the Variables for the Environment
If you just want to use Azure DevOps CD and don't want to use CI, you can skip the second step and upload your build artifacts directly to the repository, and then use them in the release pipeline.
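As a rough illustration of the "Connecting to the App Catalog", "Adding the Solution Package" and "Deploying the Application" steps, the CLI for Microsoft 365 calls behind them look roughly like this (file names are placeholders, and the documented flow targets SharePoint Online, so verify it against an on-premises farm):

    # Sign in (pipelines typically use a certificate or device-code login)
    m365 login
    # Upload the package to the app catalog
    m365 spo app add --filePath ./sharepoint/solution/my-solution.sppkg --overwrite
    # Deploy it so it can be installed on sites
    m365 spo app deploy --name my-solution.sppkg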
There are some basic scenarios for using pipelines:
You use a cloud pipeline (Azure, GitHub, etc.) with its capabilities and for its purpose.
You create your own environment with self-hosted pipelines.
You add your own runner (pipeline agent) to your cloud (Azure) environment.
So Azure allows you to add your own pipeline agent to the environment, for example a self-hosted Windows agent.
So, I think the solution in your case will be:
Install a self-hosted agent
Configure the agent environment - install SharePointPnPPowerShell2019
Add the agent to your Azure environment
Add a step to the pipeline that deploys your solution using the self-hosted agent
This scenario allows you to deploy sppkg solutions without publishing your app catalog to the internet, because your self-hosted agent will be on the same network as your SharePoint farm.
Azure Pipelines deployment steps allow you to run PowerShell on target machines:
Prerequisites: This task uses Windows Remote Management (WinRM) to access on-premises physical computers or virtual computers that are domain-joined or workgroup-joined.

Azure DevOps CD Pipeline to Deploy Library to Databricks DBFS 403 Forbidden Error

I'm following the tutorial Continuous integration and delivery on Azure Databricks using Azure DevOps to automate the process of deploying and installing a library on an Azure Databricks cluster. However, I'm stuck at the step "Deploy the library to DBFS", which uses the Databricks files to DBFS task from the Databricks Script Deployment Task extension by Data Thirst.
It continuously gives me this error:
##[error]The remote server returned an error: (403) Forbidden.
The configuration of this task is shown below:
I've checked that my token works fine when I upload the libraries manually through the Databricks CLI, so the problem shouldn't be due to the token's permissions.
Can anyone suggest any solution to this? Or is there any alternative way to deploy libraries to clusters on Azure Databricks via the release CD pipelines on Azure DevOps?
Did you check your Azure region in Databricks? If you don't use the same Azure region in Azure DevOps, you will get a 403 error.
After trying multiple times, it turns out that if you skip the extension and use the Databricks CLI in the pipeline to upload the files directly, the upload works smoothly. Hope this helps if someone runs into the same problem.
I also faced a similar problem while using the Databricks Script Deployment Task created by Data Thirst, then switched to DevOps for Azure Databricks created by Microsoft DevLabs. Below are the steps I used to work with the Databricks CLI to achieve what I wanted as part of an Azure release pipeline:
First, added a Use Python version task, referring to Python 3.7.
Then added Configure Databricks CLI, providing the workspace URL, e.g. adb-1234567890123456.12.azuredatabricks.net, and the personal access token by referring to a secret variable.
Added a Command Line Script task with the Databricks CLI commands as inline code, appending --profile AZDO to each command since that profile was configured in the previous step. E.g., dbfs cp $(System.DefaultWorkingDirectory)/abcd dbfs:/mytempfiles --recursive --overwrite --profile AZDO
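For comparison, a minimal sketch of the same upload done without a named profile, authenticating the legacy Databricks CLI through environment variables inside the command-line task (the host and the secret variable name are placeholders):

    # Install the legacy Databricks CLI on the agent
    pip install databricks-cli
    # Authenticate via environment variables instead of `databricks configure`
    export DATABRICKS_HOST=https://adb-1234567890123456.12.azuredatabricks.net
    export DATABRICKS_TOKEN=$(DATABRICKS_PAT)   # $(DATABRICKS_PAT) is a secret pipeline variable
    # Copy the build output to DBFS
    dbfs cp "$(System.DefaultWorkingDirectory)/abcd" dbfs:/mytempfiles --recursive --overwrite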

How to deploy a Python Flask application on a Linux web app service through the Azure portal?

I am trying to deploy my Flask application on a Linux web app.
I want to set up an Azure pipeline for my code, which is pushed to an Azure repository.
I have made all the configuration changes in my Python code and created a web app with a runtime stack of Python 3.7.
As soon as I go to the Deployment Center to deploy my code and select the Azure repository as the source, I am redirected to an Azure Pipelines option where I have to configure the build settings.
But the build does not give any option for Python. It just gives me four build options: Node, Ruby, ASP.NET, and PHP.
I cannot use :
- Docker
- Git
With such limitations I have found no suitable tutorial for doing this.
Can someone tell me a way to set up the pipeline for my Python project?
Azure DevOps CI/CD works with any language, platform, and cloud. For a Python application you may just need to add some additional steps to achieve the deployment from Azure DevOps CI/CD.
CI
Since Python is an interpreted language, no compilation is needed. If there are no other steps, such as tests, you only need two tasks in the CI pipeline: an Archive Files task and a Publish Build Artifacts task.
The Archive Files task packs the Python application source folder into a zip package for use in CD, and the Publish Build Artifacts task publishes this zip package to the release pipeline.
BUT,
If your project contains tests and needs to run them, add another Command line task to run the tests using pytest.
In Azure DevOps, you need to configure the Python environment with some tasks if you want to use Python components like pytest.
Please refer to this blog.
Note: since the stack you are using is Python 3.7, specify the Python version as 3.x in the Use Python version task.
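A minimal sketch of what such a Command line task could run for the test step, assuming the tests live in a tests/ folder and dependencies are listed in requirements.txt (both names are illustrative):

    # Install the project's dependencies plus pytest on the build agent
    pip install -r requirements.txt pytest
    # Run the test suite; a non-zero exit code fails the pipeline step
    python -m pytest tests/ --junitxml=test-results.xml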
CD
Since you have already created the app service in the Azure portal, just skip step 4 (Add Azure CLI task) in Exercise 3: Configure Release pipeline shown in this blog, because step 4 is only used to create new Azure resources.
1. To deploy the Python application, first add the Azure App Service manage task to install the corresponding Python version site extension in the release pipeline:
It installs a set of tools to support managing your app service.
2. Next, use the Azure App Service deploy task to deploy the zip package created in the build pipeline to the app service you configured in the Azure portal.
After you specify the subscription in this task, the app service will automatically appear in the App Service name drop-down list:
Then specify the path you configured in the publish task of the build pipeline. Replace $(Build.ArtifactStagingDirectory) with $(System.DefaultWorkingDirectory), and replace $(Build.BuildId) with * so the zip package is found via a fuzzy search.
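If you need to verify the package outside the release pipeline, roughly the same zip deployment that the Azure App Service deploy task performs can also be done by hand with the Azure CLI (resource group, app name and zip name below are placeholders):

    # Push a zip package to an existing Linux App Service (Kudu zip deploy)
    az webapp deployment source config-zip \
      --resource-group my-resource-group \
      --name my-flask-app \
      --src flask-app.zip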

Azure DevOps: Application Error If you are the application administrator, you can access the diagnostic resources

I am trying to publish a Hello World app to an Azure App Service.
I am using the code below from my Git repo:
https://github.com/biswajeetbehera/java-hello-world-with-maven/tree/master/src/main/java/hello
I am using an Azure DevOps CI/CD pipeline to build my code from the above URL, with Maven as the build tool.
After successfully building my package, I am trying to deploy it on the Azure App Service (it's a Linux server; I have installed Java inside it).
But I am getting the error below:
:( Application Error
If you are the application administrator, you can access the diagnostic resources.
Here are a few things you can take a look at:
1) Did you try to build on your local machine, and did it compile successfully,
using the same command that is triggered in Azure DevOps?
2) Do you have all the dependencies installed on the build agent?
3) Here you can find the relevant link for the Maven task for an application:
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/build/maven?view=azure-devops
Please check that and see if it helps.
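For point 1, a quick local check with the same default goal the Azure DevOps Maven task runs might look like this:

    # Build and run unit tests the same way the pipeline's Maven task does by default
    mvn -B clean package
    # Confirm the packaged artifact was actually produced
    ls target/*.jar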
