Azure Databricks notebook missing Repos - databricks

I am a beginner with Azure Databricks notebooks. I read in the docs that there should be a Repos item in the Azure Databricks sidebar, but in one of my workspaces I can't find it. Do you know why? Is it disabled by some setting on purpose?

This happens when Repos isn't enabled in your Databricks workspace. Ask your administrator to enable it in the Admin Console under Workspace Settings.
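If the administrator prefers to script the change, the workspace configuration REST API can flip the flag; a minimal sketch, assuming a workspace URL and an admin PAT, and assuming enableProjectTypeInWorkspace is still the key that controls Repos:

```
# Hypothetical workspace URL; the token must belong to a workspace admin.
curl -X PATCH "https://adb-1234567890123456.12.azuredatabricks.net/api/2.0/workspace-conf" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -d '{"enableProjectTypeInWorkspace": "true"}'
```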

Related

How can I do CI/CD of a Databricks notebook in Azure DevOps?

I want to do CI/CD of my Databricks notebook. Steps I followed:
I have integrated my Databricks workspace with Azure Repos.
Created a build artifact using a YAML script which will hold my notebook.
Deployed the build artifact into the Databricks workspace in YAML.
Now I want to:
Execute and schedule the Databricks notebook from the Azure DevOps pipeline itself.
Set up multiple environments like Dev, Stage, and Prod using YAML.
My notebook itself calls other notebooks - can I do this?
How can I solve this?
It's doable, and with Databricks Repos you really don't need to create a build artifact & deploy it - it's better to use the Repos API or the databricks repos CLI command to update another checkout that will be used for tests, as in the sketch below.
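For example, a release stage can point a staging checkout at the branch under test; a minimal sketch, where the /Repos/Staging/project path, branch name, and repo ID are illustrative assumptions:

```
# Switch an existing workspace checkout to the branch under test
databricks repos update --path /Repos/Staging/project --branch release

# Equivalent REST call (the repo ID 123 is hypothetical)
curl -X PATCH "https://<workspace-url>/api/2.0/repos/123" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -d '{"branch": "release"}'
```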
For testing of notebooks I always recommend the Nutter library from Microsoft, which simplifies testing of notebooks by letting you trigger their execution from the command line; see the sketch below.
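A minimal test-notebook sketch, assuming Nutter is installed on the cluster; the fixture name, notebook path, and table name are illustrative:

```python
from runtime.nutterfixture import NutterFixture

class SampleTest(NutterFixture):
    def run_row_count(self):
        # run the notebook under test (hypothetical relative path);
        # dbutils and spark are provided by the Databricks notebook runtime
        dbutils.notebook.run("../etl_notebook", 600)

    def assertion_row_count(self):
        # hypothetical table the notebook is expected to write
        assert spark.table("default.output").count() > 0

result = SampleTest().execute_tests()
print(result.to_string())
result.exit(dbutils)  # report results back when triggered as a job
```

From the DevOps agent, the same tests can then be triggered with the Nutter CLI, e.g. nutter run /Repos/Staging/project/tests/ --cluster_id <cluster-id> --recursive.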
You can include other notebooks using the %run directive - it's important to use relative paths instead of absolute paths (see the example below). You can organize dev/staging/prod either as folders inside the Repos, or as fully separated environments - it's up to you.
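For instance, a notebook can include a sibling helper notebook with a relative path (the path here is an illustrative assumption), so the same cell works unchanged whether it runs from /Repos/Dev/project or /Repos/Staging/project:

```
%run ./utils/common_functions
```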
I have a demo of notebook testing & Repos integration with CI/CD - it contains all the necessary instructions on how to set up dev/staging/prod, plus an Azure DevOps pipeline that will test the notebook & trigger the release pipeline.
The one thing that I want to mention explicitly - for Azure DevOps you will need to use an Azure DevOps personal access token, because identity passthrough doesn't work with the APIs yet.
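Registering that PAT with the workspace can also be scripted; a minimal sketch against the Git credentials API (the username and placeholder values are illustrative assumptions):

```
curl -X POST "https://<workspace-url>/api/2.0/git-credentials" \
  -H "Authorization: Bearer $DATABRICKS_TOKEN" \
  -d '{
        "git_provider": "azureDevOpsServices",
        "git_username": "user@example.com",
        "personal_access_token": "<azure-devops-pat>"
      }'
```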

Azure DevOps CD Pipeline to Deploy Library to Databricks DBFS 403 Forbidden Error

I'm following the tutorial Continuous integration and delivery on Azure Databricks using Azure DevOps to automate the process of deploying and installing a library on an Azure Databricks cluster. However, I'm stuck at the step "Deploy the library to DBFS", which uses the Databricks files to DBFS task from the Databricks Script Deployment Task extension by Data Thirst.
It continuously gives me this error:
##[error]The remote server returned an error: (403) Forbidden.
I've checked that my token works fine when I upload the libraries manually through the Databricks CLI, so the problem shouldn't be due to the token's permissions.
Can anyone suggest a solution? Or is there an alternative way to deploy libraries to clusters on Azure Databricks via a release CD pipeline on Azure DevOps?
Did you check the Azure region configured in the task? If it doesn't match the Azure region of your Databricks workspace, you will get a 403 error.
After trying multiple times, it turns out that if you don't use the extension and instead use the Databricks CLI in the pipeline to upload the files directly, the upload works smoothly (a sketch follows). Hope this helps if someone hits the same problem.
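A minimal inline-script sketch of that approach; the host, secret variable name, and paths are illustrative assumptions, and the $(...) macros are expanded by Azure DevOps before the shell runs:

```
pip install databricks-cli
export DATABRICKS_HOST="https://adb-1234567890123456.12.azuredatabricks.net"
export DATABRICKS_TOKEN="$(databricksToken)"   # hypothetical secret pipeline variable
dbfs cp "$(System.DefaultWorkingDirectory)/libs" dbfs:/libraries --recursive --overwrite
```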
I also faced a similar problem while using the Databricks Script Deployment Task created by Data Thirst, then switched to DevOps for Azure Databricks created by Microsoft DevLabs. Below are the steps I used to drive the Databricks CLI as part of an Azure release pipeline:
First, added a Use Python version task, set to Python 3.7.
Then, added Configure Databricks CLI. Provided the workspace URL, e.g. adb-1234567890123456.12.azuredatabricks.net, and provided the personal access token by referring to a secret variable.
Added a Command Line Script task with the Databricks CLI calls as inline code, appending --profile AZDO to each command, since that profile is configured by the previous step. E.g., dbfs cp $(System.DefaultWorkingDirectory)/abcd dbfs:/mytempfiles --recursive --overwrite --profile AZDO. A YAML sketch of these steps follows.
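The same three steps expressed as pipeline YAML; a sketch only - the configuredatabricks task name comes from the Microsoft DevLabs extension, and the secret variable name is hypothetical:

```yaml
steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '3.7'

  # From the "DevOps for Azure Databricks" extension; sets up the AZDO profile
  - task: configuredatabricks@0
    inputs:
      url: 'https://adb-1234567890123456.12.azuredatabricks.net'
      token: '$(databricksToken)'   # secret pipeline variable (hypothetical name)

  - script: dbfs cp "$(System.DefaultWorkingDirectory)/abcd" dbfs:/mytempfiles --recursive --overwrite --profile AZDO
    displayName: 'Copy files to DBFS'
```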

Run an Azure Databricks notebook (Python) via Terraform

Please help me with a Terraform script to run an Azure Databricks notebook (Python) in another environment. Thank you.
You should synchronise notebooks into the workspace via the databricks_notebook resource, and schedule them with a quartz_cron_expression through a databricks_job with a notebook_task, as sketched below. See the example configuration here.
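A minimal sketch with the Databricks Terraform provider; the notebook source path, workspace path, cluster sizing, and cron expression are all illustrative assumptions:

```hcl
resource "databricks_notebook" "etl" {
  source = "${path.module}/notebooks/etl.py" # local Python notebook source
  path   = "/Shared/etl"                     # workspace path to deploy to
}

resource "databricks_job" "nightly" {
  name = "nightly-etl"

  notebook_task {
    notebook_path = databricks_notebook.etl.path
  }

  new_cluster {
    num_workers   = 1
    spark_version = "11.3.x-scala2.12"
    node_type_id  = "Standard_DS3_v2"
  }

  schedule {
    quartz_cron_expression = "0 0 2 * * ?" # run daily at 02:00
    timezone_id            = "UTC"
  }
}
```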
The supported developer tools that help you develop Azure Databricks applications include the Databricks REST API, Databricks Utilities, the Databricks CLI, and tools outside the Azure Databricks environment.
Reference: Azure Databricks - Developer Tools.
Hope this helps.

Share notebooks across Azure Machine Learning service notebook VMs

Is there any way to share an Azure notebook across multiple users who use different notebook VMs? It seems the VMs themselves are not shareable across users.
An Azure Machine Learning Notebook VM is part of the Azure Machine Learning service, whereas Jupyter notebooks on Azure Machine Learning Studio are part of the Notebook service that runs on Ubuntu 14.04.02 under Docker. With Jupyter in Azure ML Studio you have the full Anaconda 64-bit distribution available to you.
Thus, if you want to share an Azure ML Studio notebook, you will need to add a user to your workspace with owner rights.
Notebook VMs have their own Jupyter environment, and you don't need to use notebooks.azure.com. The former can be used in enterprise scenarios within a team to share resources, while the latter is open, similar to Google Colab. When a user logs in to their notebook VM, there is a top-level folder named after their alias, and all their notebooks are stored under it. This lives in Azure storage, and each user's notebook VM mounts the same storage. Hence, if I want to view another person's notebook, I need to navigate to their alias folder in the Jupyter environment on my notebook VM.
If you have a look at this example, there is a Clone button. So when, say, a Microsoft data scientist shares their code, all the others may clone the notebook to their own workspace.
After they clone it, the URL is no longer
https://notebooks.azure.com/ms-ai/projects/Text-Lab/html/Text%20Lab%20-%20workflow%20and%20embedding.ipynb
but
https://notebooks.azure.com/another-user-workspace/projects/Text-Lab/html/Text%20Lab%20-%20workflow%20and%20embedding.ipynb
Does this solve your issue?

How to version control Jupyter notebook files hosted in Azure with GitHub?

Folks,
I'm using Azure Notebooks.
I created a new library and linked it to my GitHub account.
I can see the files hosted on my GitHub in the Azure Notebooks library.
However, if I amend an .ipynb file in an Azure notebook, I'm not sure what is required for the respective GitHub repo to be updated.
Any pointers will be of great help.
