Use a Python environment as an artifact in Azure Pipelines - python-3.x

So I'm trying to create a Python environment in Azure Pipelines and publish it as an artifact in order to use it across jobs. I can publish the artifact, and it is downloaded in the next job; the problem I have is that "environment/bin/activate" doesn't seem to work. What am I doing wrong? Is there another way to achieve this?
Build and publish the artifact:
jobs:
  - job: BuildPythonArtifact
    workspace:
      clean: all
    pool:
      vmImage: $(UBUNTU_DEFAULT_VERSION)
    steps:
      - task: UsePythonVersion@0
        inputs:
          versionSpec: "3.9"
        displayName: "Use Python 3.9"
      - script: |
          python -m venv env
          source env/bin/activate
          python -m pip install --upgrade pip
          pip install -r requirements.txt
        workingDirectory: $(Build.SourcesDirectory)
        displayName: "Install requirements"
      - task: PublishPipelineArtifact@1
        inputs:
          targetPath: $(Build.SourcesDirectory)/env
          artifactName: pythonenv
Download the artifact:
steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: "3.9"
    displayName: "Use Python 3.9"
  - task: DownloadPipelineArtifact@2
    inputs:
      artifact: pythonenv
  - script: |
      source $(Pipeline.Workspace)/bin/activate
The files are there, but it seems that the env activation is ignored.
What I expect is to be able to activate or reuse the Python environment, to avoid installing the requirements in every job.
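One point worth noting (my own observation, not from the original thread): a venv's bin/activate script hard-codes the absolute path the venv was created at, so a venv published from one job typically breaks when activated from a different path in another job. A common workaround is to cache pip downloads with the built-in Cache@2 task instead of shipping the venv itself; a minimal sketch, assuming requirements.txt sits at the repository root:

steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: "3.9"
  - task: Cache@2
    inputs:
      key: 'pip | "$(Agent.OS)" | requirements.txt'
      path: $(Pipeline.Workspace)/.pip_cache
    displayName: "Cache pip downloads"
  - script: |
      # Reuses previously downloaded wheels, so installs are fast in every job.
      pip install --cache-dir $(Pipeline.Workspace)/.pip_cache -r requirements.txt
    displayName: "Install requirements (cache-backed)"

This still runs pip install in each job, but the cache avoids re-downloading packages, which is usually the slow part.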

Related

Azure function app - Linux consumption plan - unable to import modules

The deployment with VS Code runs 100% fine; in the log I see it uses Oryx.
The header of my function:
import datetime
import logging
import adal
import requests
import json
I want to upload the code using Azure Pipelines though, for the sake of automation.
Here is my code:
steps:
  - bash: |
      if [ -f extensions.csproj ]
      then
          dotnet build extensions.csproj --output ./bin
      fi
    displayName: 'Build extensions'
  - task: UsePythonVersion@0
    displayName: 'Use Python 3.9'
    inputs:
      versionSpec: '3.9'
  - bash: |
      python3.9 -m venv worker_venv
      source worker_venv/bin/activate
      pip3.9 install setuptools
      pip3.9 install -r requirements.txt
    displayName: 'Install application dependencies'
  - task: ArchiveFiles@2
    displayName: "Archive files"
    inputs:
      rootFolderOrFile: "$(System.DefaultWorkingDirectory)/functions"
      includeRootFolder: false
      archiveType: zip
      archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      replaceExistingArchive: true
  - publish: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
    artifact: drop
  - task: AzureFunctionApp@1
    displayName: 'Deploy functions to Function App'
    inputs:
      azureSubscription: Service-Conn
      appType: functionAppLinux
      appName: 'pythontest'
      package: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
      runtimeStack: 'Python|3.9'
      deploymentMethod: 'zipDeploy'
      resourceGroupName: $(resourcegroup_name_app)
But I end up with a module-not-found error (in the function's monitor in the Azure portal):
Result: Failure Exception: ModuleNotFoundError: No module named 'adal'.
The uploaded zip does contain site packages, and there is no error in the pipeline.
What am I missing? Any ideas?
pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt is the command you want to run if you need to ship your libraries in the deployment zip file instead of running pip on the Function App service.
Your pipeline installed the libraries into a virtual environment (worker_venv), and the Function App runtime will not use that folder.
refs:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-how-to-azure-devops?tabs=dotnet-core%2Cyaml%2Cpython#build-your-app
https://learn.microsoft.com/en-us/azure/azure-functions/deployment-zip-push#deployment-customization
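For concreteness, a minimal sketch of that fix as a pipeline step (assuming the same working directory and archive step as in the question), replacing the venv-based install:

  - bash: |
      # Install directly into the folder the Functions runtime reads from the zip.
      pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt
    displayName: 'Install application dependencies'

The Python Functions worker adds .python_packages/lib/site-packages from the deployed package to its import path, which is why installing there works while installing into a venv does not.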
I believe you have to run a task to install pip in the pipeline console. See the link below for more clarity: https://learn.microsoft.com/en-us/azure/devops/pipelines/ecosystems/python?view=azure-devops

Uninstall packages in an Azure function

My Azure function app (Python) is throwing an exception: module 'typing' has no attribute '_ClassVar'. A fix for this would be to uninstall the dataclasses package. How do I uninstall this package in a Python Azure function using pip?
If I run pip uninstall dataclasses, will this be reflected in the deployment?
If you are using Python version 3.7 or greater, you need to uninstall the dataclasses library with that same pip uninstall dataclasses, since the dataclasses package is a backport of the Python 3.7 dataclasses functionality.
Or, if you still want to keep dataclasses, you can downgrade your Python version to 3.6.
For more information please refer to the links below:
Blog | AttributeError: module 'typing' has no attribute '_ClassVar' with Tune
Similar GitHub issue
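A related note (my suggestion, not from the answer above): if requirements.txt has to keep working across Python versions, a pip environment marker installs the backport only where it is actually needed, so Python 3.7+ never gets the conflicting package. A single requirements.txt line, assuming the 0.6 backport release:

dataclasses==0.6; python_version < "3.7"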
I was also having a lot of trouble trying to deploy Azure functions from an Azure DevOps pipeline with a Python 3.7 environment, so I decided to place this here as it might help someone else with the same problem.
You need to prepare the following YAML file with your respective variables:
trigger:
- {{ branch }}

variables:
  # Azure Resource Manager connection created during pipeline creation
  azureSubscription: '{{ azureRmConnection.Id }}'
  # Function app name
  functionAppName: '{{ functionAppName }}'
  # Agent VM image name
  vmImageName: 'ubuntu-latest'
  # Working Directory
  workingDirectory: '{{ workingDirectory }}'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - bash: |
        if [ -f extensions.csproj ]
        then
            dotnet build extensions.csproj --runtime ubuntu.16.04-x64 --output ./bin
        fi
      workingDirectory: $(workingDirectory)
      displayName: 'Build extensions'
    - task: UsePythonVersion@0
      displayName: 'Use Python 3.6'
      inputs:
        versionSpec: 3.6 # Functions V2 supports Python 3.6 as of today
    - bash: |
        pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt
        rm -rf ./.python_packages/lib/site-packages/dataclasses-0.6*
        rm ./.python_packages/lib/site-packages/dataclasses.py
      workingDirectory: $(workingDirectory)
      displayName: 'Install application dependencies'
    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: '$(workingDirectory)'
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        replaceExistingArchive: true
    - publish: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      artifact: drop

- stage: Deploy
  displayName: Deploy stage
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: Deploy
    displayName: Deploy
    environment: 'development'
    pool:
      vmImage: $(vmImageName)
    strategy:
      runOnce:
        deploy:
          steps:
          - task: AzureFunctionApp@1
            displayName: 'Azure functions app deploy'
            inputs:
              azureSubscription: '$(azureSubscription)'
              appType: functionAppLinux
              appName: $(functionAppName)
              package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'
The key lines come right after installing requirements.txt; they remove the package from the site-packages folder:
rm -rf ./.python_packages/lib/site-packages/dataclasses-0.6*
rm ./.python_packages/lib/site-packages/dataclasses.py
pip uninstall dataclasses will not work here because you are not in the right folder.
Hope this helps!

How to install packages on a Linux Azure App Service

I want to run my .NET 5 app on a Linux App Service that has specific libraries installed (for example libnss3-dev).
I have a pipeline for the build:
trigger:
- main

pool:
  vmImage: ubuntu-20.04

variables:
  buildConfiguration: 'Release'
  wwwrootDir: 'Web/wwwroot'
  dotnetSdkVersion: '5.0.x'

steps:
- script: |
    sudo apt-get update
    sudo apt-get install -y libnss3-dev
  displayName: 'Dep install'
- task: UseDotNet@2
  displayName: 'Use .NET Core SDK $(dotnetSdkVersion)'
  inputs:
    version: '$(dotnetSdkVersion)'
- script: 'echo "$(Build.DefinitionName), $(Build.BuildId), $(Build.BuildNumber)" > buildinfo.txt'
  displayName: 'Write build info'
  workingDirectory: $(wwwrootDir)
- task: DotNetCoreCLI@2
  displayName: 'Restore project dependencies'
  inputs:
    command: 'restore'
    projects: '**/Web.csproj'
- task: DotNetCoreCLI@2
  displayName: 'Build the project - $(buildConfiguration)'
  inputs:
    command: 'build'
    arguments: '--no-restore --configuration $(buildConfiguration)'
    projects: '**/Web.csproj'
- task: DotNetCoreCLI@2
  displayName: 'Publish the project - $(buildConfiguration)'
  inputs:
    command: 'publish'
    projects: '**/Web.csproj'
    publishWebProjects: false
    arguments: '--no-build --configuration Release --output $(Build.ArtifactStagingDirectory)/Release'
    zipAfterPublish: true
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
  condition: succeeded()
And a release with Azure App Service deploy (task version 4).
How should I do it? I tried the following solutions, but none of them works:
Release post-deployment action with sudo (kuduPostDeploymentScript.sh: sudo: not found)
Release post-deployment action without sudo ([error]E: List directory /var/lib/apt/lists/partial is missing. - Acquire (13: Permission denied))
Adding a script step in the pipeline (the 'Dep install' step shown above)
I can run the install commands manually via SSH, but I'm looking for an automated method:
apt-get update
apt-get install -y libnss3-dev
Please check this question. As you already discovered, you should replace your startup command dotnet Web.dll with a shell script that first installs the dependencies and then runs your Web.dll with the dotnet CLI.
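A minimal sketch of such a startup script (assumptions on my part: the published assembly is named Web.dll and sits in /home/site/wwwroot, and this script is configured as the App Service startup command, where it runs with root privileges so sudo is not needed):

#!/bin/sh
# Install the native dependency inside the App Service container, then start the app.
apt-get update
apt-get install -y libnss3-dev
dotnet /home/site/wwwroot/Web.dll

Note that the container filesystem does not survive a restart, so the install runs on every startup; that is the trade-off that makes this approach automated.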

How to avoid creating new folder in Azure Agent for Pipeline in Environment

I have created an Azure agent environment on a virtual machine, to which I download builds. I have the following YAML file:
# Python package
# Create and test a Python package on multiple Python versions. OK
# Add steps that analyze code, save the dist with the build record, publish to a PyPI-compatible index, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/python

trigger:
  paths:
    include:
    - Flytteordre

pool:
  vmImage: 'ubuntu-latest'

name: Azure Pipelines

variables:
  python.version: '3.6'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: build
    displayName: build
    steps:
    - task: UsePythonVersion@0
      displayName: 'Use Python $(python.version) copy'
      inputs:
        versionSpec: '$(python.version)'
    # We actually don't need to install these dependencies here. It needs to happen in the deploy yaml file.
    - task: CmdLine@2
      inputs:
        script: |
          python -m pip install --upgrade pip
          python -m pip install selenium
          python -m pip install pdfplumber
          python -m pip install pandas
      displayName: 'Install dependencies'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: dist'
      inputs:
        PathtoPublish: Flytteordre
        ArtifactName: dist

- stage: Deployment
  displayName: Deployment stage
  jobs:
  - deployment: deploymentJob
    displayName: Deploy
    environment:
      name: Production
      resourceType: VirtualMachine
    strategy:
      runOnce:
        deploy:
          steps:
          - download: none
          - downloadBuild: none
          - task: DownloadBuildArtifacts@0
            inputs:
              buildType: 'current'
              downloadType: 'single'
              artifactName: 'dist'
              downloadPath: 'C:/azagent/A1/_work/myfolder/'
My problem is that each time I run the pipeline, it creates a folder inside the _work directory called 1, 2, etc.
I would like to avoid this, so that I am in full control of which folders are created. As you can see, I have indicated that my artifact should be downloaded to the folder path C:/azagent/A1/_work/myfolder/; however, this creates two folders. One is the folder I indicated, but the other is a folder whose name is just a number. I know that this is the default, but I would like to know if there is a way to turn off this default behavior, or at the very least to change the predefined path variables Pipeline.Workspace or Agent.BuildDirectory.
How to avoid creating new folder in Azure Agent for Pipeline in Environment
According to the document Agent variables, each build definition goes into its own directory within the agent's working directory.
As we know, a pipeline has input and output. Whenever we create a new pipeline, an incrementing number is used to name the newly created work folder. The advantage of this is that multiple running builds never share the same copy of the repository, so they are guaranteed not to step on each other.
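To illustrate the layout this produces (illustrative paths only, using the agent root from the question):

C:/azagent/A1/_work/
    1/          first pipeline's build directory (Agent.BuildDirectory)
    2/          second pipeline's build directory
    myfolder/   the custom downloadPath used by DownloadBuildArtifacts above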
I would prefer the option of being able to name the directory myself
in case I need to find a particular project and open it on the local
agent.
If you want to open it on the local agent, you could just use the agent variables to show the path:
Agent.BuildDirectory
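For instance, a minimal sketch of a step that prints the numbered folder for the current run (assuming it is added to the deploy steps that run on the VM):

- script: echo "Build directory: $(Agent.BuildDirectory)"
  displayName: 'Show agent build directory'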

Deployment of Python web app fails when all packages are not included in the file requirements.txt (even though they are installed in the YAML tasks)

I deploy Azure functions using YAML scripts.
For some reasons, I do not put all my package requirements in the file requirements.txt, as this file is used by some other processes.
Yet, when deploying my web app through a YAML pipeline, I want to install additional Python packages. However, the resulting app that is deployed crashes, with errors saying that those packages are not installed.
My pipeline:
# Python to Linux Web App on Azure
# Build your Python project and deploy it to Azure as a Linux Web App.
# Change python version to one that's appropriate for your application.
# https://learn.microsoft.com/azure/devops/pipelines/languages/python

trigger:
- master

variables:
  # Azure Resource Manager connection created during pipeline creation
  azureServiceConnectionId: .........................
  # Web app name
  webAppName: .........................
  # Agent VM image name
  vmImageName: 'ubuntu-latest'
  # Environment name
  environmentName: .........................
  # Project root folder. Point to the folder containing manage.py file.
  projectRoot: $(System.DefaultWorkingDirectory)
  # Python version: 3.8
  pythonVersion: '3.8'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: BuildJob
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: UsePythonVersion@0
      inputs:
        versionSpec: '$(pythonVersion)'
      displayName: 'Use Python $(pythonVersion)'
    - script: |
        python -m venv antenv
        source antenv/bin/activate
        python -m pip install --upgrade pip
        pip install setup
        pip install -r requirements.txt
        pip install pyodbc
      workingDirectory: $(projectRoot)
      displayName: "Install requirements"
    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: '$(projectRoot)'
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        replaceExistingArchive: true
    - upload: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      displayName: 'Upload package'
      artifact: drop

- stage: Deploy
  displayName: 'Deploy Web App'
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: DeploymentJob
    pool:
      vmImage: $(vmImageName)
    environment: $(environmentName)
    strategy:
      runOnce:
        deploy:
          steps:
          - task: UsePythonVersion@0
            inputs:
              versionSpec: '$(pythonVersion)'
            displayName: 'Use Python version'
          - task: AzureWebApp@1
            displayName: 'Deploy Azure Web App : .........................'
            inputs:
              azureSubscription: $(azureServiceConnectionId)
              appName: $(webAppName)
              package: $(Pipeline.Workspace)/drop/$(Build.BuildId).zip
              startUpCommand: 'startup-nocontainer.sh'
Here, pyodbc is installed separately in the pipeline and is not present in the file requirements.txt.
Checking the logs of the pipeline, I can see that the installation was performed successfully.
Yet when the app starts, it crashes at the first import of pyodbc, as if the app that is eventually deployed relied only on requirements.txt.
Any idea?
I think pyodbc is not being installed in the venv you create; the venv contains only the packages you specify in requirements.txt.
Check the solutions proposed in this old post.
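One workaround consistent with that diagnosis (my sketch, not from the thread): when App Service builds a Python app with Oryx, it installs dependencies from requirements.txt, so extra packages can be appended to that file during the pipeline build, before the archive step:

    - script: |
        # Assumption: requirements.txt is shared with other processes,
        # so append the extra package at build time rather than editing it by hand.
        echo "pyodbc" >> requirements.txt
      workingDirectory: $(projectRoot)
      displayName: 'Append extra packages to requirements.txt'

This keeps the checked-in requirements.txt untouched while ensuring the deployed app sees every package it needs.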
