Databricks command not found in Azure DevOps pipeline

I am trying to copy a file to Azure Databricks DBFS through an Azure DevOps pipeline. The following is a snippet from the YAML file I am using:
stages:
- stage: MYBuild
  displayName: "My Build"
  jobs:
  - job: BuildwhlAndRunPytest
    pool:
      vmImage: 'ubuntu-16.04'
    steps:
    - task: UsePythonVersion@0
      displayName: 'Use Python 3.7'
      inputs:
        versionSpec: '3.7'
        addToPath: true
        architecture: 'x64'
    - script: |
        pip install pytest requests setuptools wheel pytest-cov
        pip install -U databricks-connect==7.3.*
      displayName: 'Load Python Dependencies'
    - checkout: self
      persistCredentials: true
      clean: true
    - script: |
        echo "y
        $(databricks-host)
        $(databricks-token)
        $(databricks-cluster)
        $(databricks-org-id)
        8787" | databricks-connect configure
        databricks-connect test
      env:
        databricks-token: $(databricks-token)
      displayName: 'Configure DBConnect'
    - script: |
        databricks fs cp test-proj/pyspark-lib/configs/config.ini dbfs:/configs/test-proj/config.ini
I get the following error at the stage where I am invoking the databricks fs cp command:
/home/vsts/work/_temp/2278f7d5-1d96-4c4e-a501-77c07419773b.sh: line 7: databricks: command not found
However, when I run databricks-connect test, that command executes successfully. Kindly help if I am missing a step somewhere.

The databricks command is located in the databricks-cli package, not in databricks-connect, so you need to change your pip install command accordingly.
Also, for the databricks command you can just set the environment variables DATABRICKS_HOST and DATABRICKS_TOKEN and it will work, like this:
- script: |
    pip install pytest requests setuptools wheel
    pip install -U databricks-cli
  displayName: 'Load Python Dependencies'
- script: |
    databricks fs cp ... dbfs:/...
  env:
    DATABRICKS_HOST: $(DATABRICKS_HOST)
    DATABRICKS_TOKEN: $(DATABRICKS_TOKEN)
  displayName: 'Copy artifacts'
P.S. Here is an example of how to do CI/CD on Databricks + notebooks. You might also be interested in the cicd-templates project.

Related

How to set variables, defined in Azure variable group, as env variables in Azure pipeline?

We run integration tests, written in Python, in an Azure pipeline.
The run is triggered in this way:
- script: |
    pdm run pytest \
      --variables "$VARIABLE_FILE" \
      --napoleon-docstrings \
      --doctest-modules \
      --color=yes \
      --junitxml=junit/test-results.xml \
      integration
  env:
    <different_environment_variables>
Some integration tests will make a connection to a database, and the properties needed for connecting to the database are stored in a variable group in Azure. I can get them via:

- powershell: |
    az pipelines variable-group variable list --group-id <some_group_id> --output table

but I do not know how to set them as environment variables later and use them in the Python code.
You can reference the variables from your variable group directly in your Python YAML pipeline like below:
# Python package
# Create and test a Python package on multiple Python versions.
# Add steps that analyze code, save the dist with the build record, publish to a PyPI-compatible index, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/python

trigger:
- master

pool:
  vmImage: ubuntu-latest
strategy:
  matrix:
    Python27:
      python.version: '2.7'
    Python35:
      python.version: '3.5'
    Python36:
      python.version: '3.6'
    Python37:
      python.version: '3.7'

variables:
- group: SharedVariables

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use Python $(python.version)'

- script: |
    echo $(databaseservername)

- script: |
    python -m pip install --upgrade pip
    pip install -r requirements.txt
  displayName: 'Install dependencies'

- script: |
    pip install pytest pytest-azurepipelines
    pytest
  displayName: 'pytest'
The relevant parts are the variable group reference and the step that reads the variable:

variables:
- group: SharedVariables

- script: |
    echo $(databaseservername)
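To answer the environment-variable part of the question directly: you can map the variable group value into a step's env: block and read it with os.environ in Python. A minimal sketch (databaseservername comes from the group above; the env name DATABASE_SERVER_NAME and the inline script are illustrative assumptions):

variables:
- group: SharedVariables

steps:
- script: |
    python -c "import os; print(os.environ['DATABASE_SERVER_NAME'])"
  env:
    # secret variables are not exposed to scripts automatically,
    # so map the value into the step's environment explicitly
    DATABASE_SERVER_NAME: $(databaseservername)
  displayName: 'Read the variable group value from the environment'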
When the pipeline first runs, authorize it to access the variable group when prompted (Permit access). After that, the pipeline runs successfully and prints the variable.
You can also pass the variables to a Python task, either as arguments or via an inline script, like below:
variables:
- group: variableGroup

steps:
- task: PythonScript@0
  displayName: 'Run a Python script'
  inputs:
    scriptPath: 'Test.py'
    arguments: --variableInScript $(variableInVariableGroup)
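Test.py then has to parse that argument itself; a minimal sketch using argparse, shown here as an inline variant so it is self-contained (the parameter name matches the snippet above):

- task: PythonScript@0
  inputs:
    scriptSource: 'inline'
    script: |
      # parse the value passed in via the task's 'arguments' input
      import argparse
      parser = argparse.ArgumentParser()
      parser.add_argument('--variableInScript')
      args = parser.parse_args()
      print('value received:', args.variableInScript)
    arguments: --variableInScript $(variableInVariableGroup)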
Inline Python task:

variables:
- group: SharedVariables

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
  displayName: 'Use Python $(python.version)'

- script: |
    echo $(databaseservername)

- task: PythonScript@0
  inputs:
    scriptSource: 'inline'
    script: 'print(''databaseservername: $(databaseservername)'')'
Output: the databaseservername value appears masked (encrypted) in the log after being passed as an argument to the inline Python script. You can also link the variable group in a release pipeline and call it from multiple pipelines across different stages.
Similarly, you can add an Azure CLI task to your YAML pipeline to call az pipelines variable-group variable list from a shell script, like below:
- task: AzureCLI@2
  inputs:
    azureSubscription: '<Subscription-name>(sub-id)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az pipelines variable-group variable list \
        --org 'https://dev.azure.com/org/' \
        --project project --group-id id \
        --only-show-errors --output json
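If you do go the CLI route, the JSON output can be turned into pipeline variables with the ##vso[task.setvariable] logging command, so later steps see them as environment variables. A sketch, assuming the group contains databaseservername and that the az devops extension is authorized on the agent (for example via AZURE_DEVOPS_EXT_PAT):

- task: AzureCLI@2
  inputs:
    azureSubscription: '<Subscription-name>(sub-id)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      # read one value out of the JSON map and promote it to a pipeline variable
      value=$(az pipelines variable-group variable list \
        --org 'https://dev.azure.com/org/' --project project --group-id id \
        --only-show-errors --output json | jq -r '.databaseservername.value')
      echo "##vso[task.setvariable variable=databaseservername]$value"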
References:
Access variables from Variable Groups inside Python script task in Azure DevOps YAML pipeline, by Bo Soborg Perterson
Pass Variable Group as Dictionary To Python Script, by Joy Wang

How to publish python unittest results to azure pipeline

I am trying to build a pipeline in Azure DevOps. I built a basic Flask website and wrote a unittest script for it. It basically all works perfectly. When I commit to Azure Repos, the pipeline does its thing and the tests run. The thing I want is to see the test results; I see all these tutorials for pytest but not for unittest.
trigger:
- Development

jobs:
- job: 'Test'
  pool:
    vmImage: 'ubuntu-latest' # other options: 'macOS-latest', 'windows-latest'
  strategy:
    matrix:
      Python37:
        python.version: '3.7'

  steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '$(python.version)'
  - script: |
      python -m pip install --upgrade pip
      python -m pip install -e .
    displayName: 'Install dependencies'
  - script: |
      python -m unittest discover -p "*.py" > results.txt
    displayName: unittesting
This is my YAML file that runs the pipeline. This is how my results look when running the pipeline (screenshot: pipeline results). Is there a way to publish these results with unittest and have them appear in the Azure pipeline?
You'll first need to make your test script generate results in a format that DevOps can understand, i.e. JUnit XML.
There's an example in MS docs (which includes coverage as well):
- script: |
    pip install pytest pytest-azurepipelines
    pip install pytest-cov
    pytest --doctest-modules --junitxml=junit/test-results.xml --cov=. --cov-report=xml
  displayName: 'pytest'
Alternatively, using unittest-xml-reporting should give you results in JUnit XML format as well.
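A sketch of that route, assuming unittest-xml-reporting's command-line runner (python -m xmlrunner) and its -o output-directory option; it mirrors unittest discover but writes JUnit-style XML (by default to files named like TEST-*.xml, so adjust the publish pattern below if needed):

- script: |
    pip install unittest-xml-reporting
    # same test discovery as 'python -m unittest discover', but emits JUnit XML
    python -m xmlrunner discover -p "*.py" -o junit/
  displayName: 'Run unittest with JUnit XML output'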
Once you have that, you can use the Publish Test Results task to upload the results and make them visible in the DevOps UI:
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/test-*.xml'
If your unit tests are written using the unittest module, don't worry: pytest can run unittest tests as well. Add the YAML snippet below to your Azure pipeline; it runs your unittest tests with pytest, generates JUnit XML output, and publishes that output back to the Azure pipeline.
- script: |
    cd $(Build.Repository.LocalPath)
    python -m pytest $(Build.Repository.LocalPath)/<unit_tests_path>/*.py --junitxml=test-unit.xml
  displayName: 'Run Unit Tests'

- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: 'test-unit.xml'

Use python environment as an Artifact in Azure pipelines

I'm trying to create a Python environment in Azure Pipelines and publish it as an artifact in order to use it across jobs. I can publish the artifact, and it is downloaded in the next job; the problem is that "environment/bin/activate" doesn't seem to work. What am I doing wrong? Is there another way to achieve this?
Build and publish the artifact:

jobs:
- job: BuildPythonArtifact
  workspace:
    clean: all
  pool:
    vmImage: $(UBUNTU_DEFAULT_VERSION)
  steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: "3.9"
    displayName: "Use Python 3.9"
  - script: |
      python -m venv env
      source env/bin/activate
      python -m pip install --upgrade pip
      pip install -r requirements.txt
    workingDirectory: $(Build.SourcesDirectory)
    displayName: "Install requirements"
  - task: PublishPipelineArtifact@1
    inputs:
      targetPath: $(Build.SourcesDirectory)/env
      artifactName: pythonenv
Download the artifact:

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: "3.9"
  displayName: "Use Python 3.9"
- task: DownloadPipelineArtifact@2
  inputs:
    artifact: pythonenv
- script: |
    source $(Pipeline.Workspace)/bin/activate
The files are there, but it seems that the env activate is ignored. What I expect is to be able to activate or reuse the Python env, to avoid installing requirements in every job.
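A possible explanation and workaround, offered as a sketch rather than a confirmed fix: pipeline artifacts do not reliably preserve Unix file permissions or symlinks, and a venv's activate script hardcodes the absolute path the venv was created at, so a venv rarely survives being moved between jobs as loose files. Archiving the venv with tar and unpacking it at the same location in the next job works around both issues (step placement and names are assumptions, and both jobs must resolve $(Build.SourcesDirectory) to the same path, as hosted agents typically do):

# build job: archive the venv with permissions and symlinks intact
- script: |
    tar -czf $(Build.ArtifactStagingDirectory)/pythonenv.tar.gz env
  workingDirectory: $(Build.SourcesDirectory)
  displayName: 'Archive venv'
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: $(Build.ArtifactStagingDirectory)/pythonenv.tar.gz
    artifactName: pythonenv

# consuming job: restore to the same location before activating
- task: DownloadPipelineArtifact@2
  inputs:
    artifact: pythonenv
    targetPath: $(Pipeline.Workspace)
- script: |
    tar -xzf $(Pipeline.Workspace)/pythonenv.tar.gz
    source env/bin/activate
    python --version
  workingDirectory: $(Build.SourcesDirectory)
  displayName: 'Restore and activate venv'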

Deployment of Python web app fails when all packages are not included in the file requirements.txt (even though they are installed in the YAML tasks)

I deploy Azure Functions using YAML scripts.
For various reasons, I do not put all my package requirements in the file requirements.txt, as that file is used by some other processes.
Yet, when deploying my web app through a YAML pipeline, I want to install additional Python packages. However, the deployed app crashes with errors saying that it does not have those packages installed.
My pipeline:

# Python to Linux Web App on Azure
# Build your Python project and deploy it to Azure as a Linux Web App.
# Change python version to one that's appropriate for your application.
# https://learn.microsoft.com/azure/devops/pipelines/languages/python

trigger:
- master

variables:
  # Azure Resource Manager connection created during pipeline creation
  azureServiceConnectionId: .........................
  # Web app name
  webAppName: .........................
  # Agent VM image name
  vmImageName: 'ubuntu-latest'
  # Environment name
  environmentName: .........................
  # Project root folder. Point to the folder containing manage.py file.
  projectRoot: $(System.DefaultWorkingDirectory)
  # Python version: 3.8
  pythonVersion: '3.8'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: BuildJob
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: UsePythonVersion@0
      inputs:
        versionSpec: '$(pythonVersion)'
      displayName: 'Use Python $(pythonVersion)'
    - script: |
        python -m venv antenv
        source antenv/bin/activate
        python -m pip install --upgrade pip
        pip install setup
        pip install -r requirements.txt
        pip install pyodbc
      workingDirectory: $(projectRoot)
      displayName: "Install requirements"
    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: '$(projectRoot)'
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        replaceExistingArchive: true
    - upload: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      displayName: 'Upload package'
      artifact: drop

- stage: Deploy
  displayName: 'Deploy Web App'
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: DeploymentJob
    pool:
      vmImage: $(vmImageName)
    environment: $(environmentName)
    strategy:
      runOnce:
        deploy:
          steps:
          - task: UsePythonVersion@0
            inputs:
              versionSpec: '$(pythonVersion)'
            displayName: 'Use Python version'
          - task: AzureWebApp@1
            displayName: 'Deploy Azure Web App : .........................'
            inputs:
              azureSubscription: $(azureServiceConnectionId)
              appName: $(webAppName)
              package: $(Pipeline.Workspace)/drop/$(Build.BuildId).zip
              startUpCommand: 'startup-nocontainer.sh'
Here, pyodbc is installed separately in the pipeline, and not present in the file requirements.txt.
Checking the logs of the pipeline, I can see that the installation was performed successfully.
Yet when the app starts, it crashes at the first import of pyodbc, as if the app that is eventually deployed relied only on requirements.txt.
Any idea?
I think pyodbc is not being installed in the venv you create; the venv ends up containing only the packages you specify in requirements.txt.
Check the solutions proposed in this old post.
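If the deployed app really does resolve its dependencies from requirements.txt alone, one pragmatic workaround (a sketch, assuming you cannot change the committed requirements.txt) is to append the extra packages to the file in the pipeline, before the ArchiveFiles step:

- script: |
    # make the extra package visible to whatever later installs from requirements.txt
    echo "pyodbc" >> requirements.txt
  workingDirectory: $(projectRoot)
  displayName: 'Append extra packages to requirements.txt'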

Unable to run Python Selenium Tests on Self Hosted Azure VM

I am trying to execute Selenium tests on my self-hosted VM from Azure Pipelines. I am getting the error message below.
##[error]Version spec 3.8 for architecture x64 did not match any version in Agent.ToolsDirectory.
Below is my YAML:

trigger:
- main

pool: default

strategy:
  matrix:
    Python38:
      python.version: '3.8'
      addToPath: true

steps:
- task: UsePythonVersion@0
  inputs:
    versionSpec: '$(python.version)'
    addToPath: true
    architecture: 'x64'
  displayName: 'Use Python $(python.version)'
- script: |
    python -m pip install --upgrade pip
    pip install -r requirements.txt
  displayName: 'Install dependencies'
- script: |
    pip install pytest-azurepipelines
    pytest tests\test_smoke\test_suite_SmokeTest.py -s -v --browser Chrome --html=report.html --reruns 2
  displayName: 'pytest'
- task: DownloadPipelineArtifact@2
  inputs:
    buildType: 'specific'
    project: '8801f21d-f4e9-4b65-b01e-68baf825747c'
    definition: '5'
    buildVersionToDownload: 'latest'
    targetPath: '$(Pipeline.Workspace)'
I have configured the agent on my VM. I have also installed Python 3.8.6 on the VM and configured the Python path for it.
Unable to run Python Selenium Tests on Self Hosted Azure VM
To use this task on the private agent, we need to add the desired Python version to the tool cache on the self-hosted agent in order for the task to use it.
Normally the tool cache is located under the _work/_tool directory of the agent or you could use the command line task to echo the variable $(Agent.ToolsDirectory) to get the path.
Then, under that directory, create the following directory structure based off of your Python version:
$AGENT_TOOLSDIRECTORY/
Python/
{version number}/
{platform}/
{tool files}
{platform}.complete
The version number should follow the format of 1.2.3. The platform
should either be x86 or x64. The tool files should be the unzipped
Python version files. The {platform}.complete should be a 0 byte file
that looks like x86.complete or x64.complete and just signifies the
tool has been installed in the cache properly.
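On a Linux self-hosted agent, that layout can be created with a few shell commands; a sketch, assuming Python 3.8.6 is already installed under /opt/python-3.8.6 (a hypothetical path):

# replicate the tool-cache layout the UsePythonVersion task expects
mkdir -p "$AGENT_TOOLSDIRECTORY/Python/3.8.6/x64"
cp -a /opt/python-3.8.6/. "$AGENT_TOOLSDIRECTORY/Python/3.8.6/x64/"
# the zero-byte marker signals that the tool is fully installed
touch "$AGENT_TOOLSDIRECTORY/Python/3.8.6/x64.complete"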
My test directory structure:

$AGENT_TOOLSDIRECTORY/
    Python/
        3.9.0/
            x64/
                python-3.9.0-amd64.exe
            x64.complete
Now, it works fine on my self-hosted agent.
You can refer to the FAQ in this document for some more details.
