Unable to download packages to Azure Artifacts from Azure Pipelines

Requesting your help with the Azure Artifacts connection from Azure Pipelines.
My Azure Pipeline builds an image from a Dockerfile, and a 'requirements' file lists the packages to be pip installed. In the pipeline I authenticate to my Azure Artifacts feed using the PipAuthenticate@1 task; the authentication is successful and the URL is passed as a build argument to the Dockerfile.
However, I can see that the packages are installed from external links and are not downloaded to my Artifacts feed.
The artifact feed 'testartifact' is currently empty, so it is correct that pip goes to the external link to download the package. But I was expecting the package to then be saved in the 'testartifact' feed, so that on the next Docker build the package is taken directly from the testartifact feed. Is my assumption correct?
If so, could you help me find what I am missing in the code that prevents the package from being saved to my feed?
Here are the Azure Pipelines YAML file and the Dockerfile, along with the log of the package download.
Thanks for your time!
pool:
  vmImage: 'ubuntu-latest'

# Set variables
variables:
  imageversion: 1.0
  artifactFeed: testartifact

stages:
- stage: DevDeploy
  jobs:
  - job: DevBuildandPushImage
    steps:
    - bash: echo DevDeploy
    - task: PipAuthenticate@1
      displayName: 'Pip Authenticate'
      inputs:
        artifactFeeds: $(artifactFeed)
        onlyAddExtraIndex: true
    - bash: echo "##vso[task.setvariable variable=artifactoryUrl;]$PIP_EXTRA_INDEX_URL"
    - bash: echo $PIP_EXTRA_INDEX_URL
    - task: Docker@2
      inputs:
        containerRegistry: 'testcontaineregistry'
        repository: 'testrepository'
        command: 'build'
        Dockerfile: '**/dockerfile'
        arguments: '--build-arg PIP_EXTRA_URL=$(PIP_EXTRA_INDEX_URL)'
Part of the Dockerfile:
ARG PIP_EXTRA_URL
ENV PIP_EXTRA_INDEX_URL=$PIP_EXTRA_URL
RUN echo 'PIP_EXTRA_INDEX_URL'$PIP_EXTRA_INDEX_URL
# Install Python Packages & Requirements
COPY requirements requirements
RUN pip3 install -r requirements --extra-index-url $PIP_EXTRA_URL
Part of the log:
2020-07-16T17:39:05.0301632Z Step 8/28 : RUN echo 'PIP_EXTRA_INDEX_URL'$PIP_EXTRA_INDEX_URL
2020-07-16T17:39:05.4787725Z PIP_EXTRA_INDEX_URLhttps://build:***@XXXXXXX.pkgs.visualstudio.com/_packaging/testartifact/pypi/simple
2020-07-16T17:39:06.1264997Z Step 9/28 : COPY requirements requirements
2020-07-16T17:39:07.0309036Z Step 10/28 : RUN pip3 install -r requirements --extra-index-url $PIP_EXTRA_URL
2020-07-16T17:39:08.3873873Z Collecting pypyodbc (from -r requirements (line 1))
2020-07-16T17:39:08.7139882Z Downloading https://files.pythonhosted.org/packages/ea/48/bb5412846df5b8f97d42ac24ac36a6b77a802c2778e217adc0d3ec1ee7bf/pypyodbc-1.3.5.2.zip
2020-07-16T17:39:08.9900873Z Collecting pyodbc (from -r requirements (line 2))
2020-07-16T17:39:09.2421266Z Downloading https://files.pythonhosted.org/packages/81/0d/bb08bb16c97765244791c73e49de9fd4c24bb3ef00313aed82e5640dee5d/pyodbc-4.0.30.tar.gz (266kB)
2020-07-16T17:39:09.4960835Z Collecting xlrd (from -r requirements (line 3))
2020-07-16T17:39:09.6500787Z Downloading https://files.pythonhosted.org/packages/b0/16/63576a1a001752e34bf8ea62e367997530dc553b689356b9879339cf45a4/xlrd-1.2.0-py2.py3-none-any.whl (103kB)
2020-07-16T17:39:09.6782714Z Collecting pandas (from -r requirements (line 4))
2020-07-16T17:39:10.2506552Z Downloading https://files.pythonhosted.org/packages/c0/95/cb9820560a2713384ef49060b0087dfa2591c6db6f240215c2bce1f4211c/pandas-1.0.5-cp36-cp36m-manylinux1_x86_64.whl (10.1MB)
2020-07-16T17:39:11.4371150Z Collecting datetime (from -r requirements (line 5))
2020-07-16T17:39:11.6083120Z Downloading https://files.pythonhosted.org/packages/73/22/a5297f3a1f92468cc737f8ce7ba6e5f245fcfafeae810ba37bd1039ea01c/DateTime-4.3-py2.py3-none-any.whl (60kB)
2020-07-16T17:39:11.6289946Z Collecting azure-storage-blob (from -r requirements (line 6))

From the task log, the Python packages are being restored from external links (files.pythonhosted.org). To have them end up in your feed, they need to be installed through the feed's upstream source; a package pulled through an upstream source is saved in the feed after installation.
Here are the steps:
Step 1: Add the Python (PyPI) upstream source to the feed.
Step 2: Use the PipAuthenticate task to get $PIP_EXTRA_INDEX_URL.
Step 3: Use $PIP_EXTRA_INDEX_URL to install the packages from the feed:
pip install -r requirements.txt --index-url $PIP_EXTRA_INDEX_URL
Note: Steps 2 and 3 already exist in your YAML file, but the pip install command has an issue: you need to pass the feed URL directly with the --index-url parameter instead of --extra-index-url.
The packages are then installed through the feed's upstream source, and as a result they will also exist in the feed.
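Applied to the Dockerfile from the question, the fix is a one-flag change; a minimal sketch, keeping the existing PIP_EXTRA_URL build argument:
ARG PIP_EXTRA_URL
# Use the feed as the primary index so pip resolves through the feed's upstream source,
# which keeps a copy of each package it serves
RUN pip3 install -r requirements --index-url $PIP_EXTRA_URL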

Related

Installing packages from an Azure DevOps Artifact feed in an Azure Pipeline

I have two repositories.
The first is built by Azure DevOps Pipelines into a whl file and published to an Azure DevOps Artifact feed. (This works.)
The second should also be built by Azure DevOps Pipelines and published to Azure DevOps Artifacts, but it depends on the first one and needs to install it from the Azure DevOps Artifact feed during the build process. (This does not work.)
I can install it locally, but the pipeline of the second package fails with the following error:
401 Client Error: Unauthorized for url:
https://pkgs.dev.azure.com/<company>/<some-numbers>/_packaging/<some-numbers>/pypi/download/<mypackage>/0.0.1.9/<mypackage>-0.0.1.9-py3-none-any.whl#sha256=<some-numbers>
---------------------------------- SETUP ----------------------------------
I added the feed as a secondary source to the pyproject.toml of my second repository; this allows me to successfully install the first package with poetry add <firstpackage> and poetry install in my local IDE:
[[tool.poetry.source]]
name = "azure"
url = "https://pkgs.dev.azure.com/<company>/<some-numbers>/_packaging/<feed-name>/pypi/simple/"
secondary = true
YAML script to install packages via poetry. This works for the first repository, but not for the second repository, which needs to install the first package from the Azure DevOps Artifacts feed (the first installs everything from pypi.org):
- script: |
    python -m pip install -U pip
    pip install poetry==1.1.3  # Install poetry via pip to pin the version
    poetry install
  displayName: Install software
YAML script to publish a package to an Azure DevOps artifact feed (with a personal access token for authentication). This works:
- script: |
    poetry config repositories.azure https://pkgs.dev.azure.com/<company>/<somenumbers>/_packaging/<feed-name>/pypi/upload/
    poetry config http-basic.azure usernamedoesnotmatter $(pat)
    poetry publish --repository azure
    exit 0
  displayName: Publish package
I am not checking my Personal Access Token (PAT) into my repository.
pyproject.toml (partial):
[[tool.poetry.source]]
name = "azure"
url = "https://pkgs.dev.azure.com/<company>/<some-numbers>/_packaging/<feed-name>/pypi/simple/"
secondary = true
I added the PipAuthenticate@1 task, which sets the PIP_EXTRA_INDEX_URL environment variable containing a PAT. In the script, I extract the PAT and use it to configure poetry.
azure-pipelines.yaml (partial):
- task: PipAuthenticate@1
  displayName: 'Pip Authenticate'
  inputs:
    artifactFeeds: '<some-numbers>/<feed-name>'
    onlyAddExtraIndex: True
- script: |
    python -m pip install --upgrade pip
    pip install poetry
    export PAT=$(echo "$PIP_EXTRA_INDEX_URL" | sed 's/.*build:\(.*\)@pkgs.*/\1/')
    poetry config http-basic.azure build "$PAT"
    poetry install
  displayName: "Install dependencies"
It turns out I just needed to configure poetry in the pipeline before the install for the second repository, the same way I had done locally a long time ago (and had forgotten about it).
- script: |
    python -m pip install -U pip
    pip install poetry==1.1.3  # Install poetry via pip to pin the version
    # Configure the feed as a secondary source for poetry
    poetry config repositories.azure https://pkgs.dev.azure.com/<company>/<some-numbers>/_packaging/<feed-name>/pypi/simple/
    poetry config http-basic.azure userNameDoesntMatter $(pat)
    poetry install
  displayName: Install software

Dependency caching in Python CI pipeline in Azure DevOps?

I followed the pip section of the Azure documentation on pipeline caching to speed up my Azure DevOps CI pipeline (in particular the dependency installation step). However, the packages are still installed every time I execute the pipeline. How can I cache the installation itself?
The Azure DevOps documentation is a bit lackluster here. Following the pip section only caches the downloaded wheels, not the installed packages (which you can also cache to further reduce pipeline execution time). To achieve that, you need to work with a virtual environment (such as venv or a conda environment) and cache the entire environment.
Below is a code example with conda that caches an entire installed environment:
variables:
  CONDA_ENV_NAME: "unit_test"
  # set the $(CONDA) environment variable to your conda path (pre-populated on 'ubuntu-latest' VMs)
  CONDA_ENV_DIR: $(CONDA)/envs/$(CONDA_ENV_NAME)

steps:
- script: echo "##vso[task.prependpath]$CONDA/bin"
  displayName: Add conda to PATH
- task: Cache@2
  displayName: Use cached Anaconda environment
  inputs:
    key: 'conda | "$(Agent.OS)" | requirements.txt'
    path: $(CONDA_ENV_DIR)
    cacheHitVar: CONDA_CACHE_RESTORED
- bash: conda create --yes --quiet --name $(CONDA_ENV_NAME)
  displayName: Create Anaconda environment
  condition: eq(variables.CONDA_CACHE_RESTORED, 'false')
- bash: |
    source activate $(CONDA_ENV_NAME)
    pip install -r requirements.txt
  displayName: Install dependencies
  condition: eq(variables.CONDA_CACHE_RESTORED, 'false')
# Optional step here: install your package (do not cache this step)
- bash: |
    source activate $(CONDA_ENV_NAME)
    pip install --no-deps .
    pytest .
  displayName: Install package and execute unit tests
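The same pattern should work with a plain venv instead of conda. A rough sketch, not taken from the original answer; the .venv location and variable names are assumptions:
variables:
  VENV_DIR: $(Pipeline.Workspace)/.venv

steps:
- task: Cache@2
  displayName: Use cached virtual environment
  inputs:
    key: 'venv | "$(Agent.OS)" | requirements.txt'
    path: $(VENV_DIR)
    cacheHitVar: VENV_CACHE_RESTORED
- bash: |
    # Only rebuild the environment when the cache was not restored
    python -m venv $(VENV_DIR)
    source $(VENV_DIR)/bin/activate
    pip install -r requirements.txt
  displayName: Create venv and install dependencies
  condition: eq(variables.VENV_CACHE_RESTORED, 'false')
- bash: |
    source $(VENV_DIR)/bin/activate
    pip install --no-deps .
    pytest .
  displayName: Install package and execute unit tests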

How to fix: GitLab pipeline - failed to start

Introduction: I am new to creating GitLab pipelines.
Details:
The executor type I am using for the runner is Shell.
(I am not sure whether this can be used, or whether a new runner needs to be registered with a different executor.)
GitLab Runner 13.11.0
When the pipeline tries to execute the code below, which I have written in the .gitlab-ci.yml file, it throws an error.
image: "ruby:2.6"
test:
script:
- sudo apt-get update -qy
- sudo apt-get -y install unzip zip
- gem install cucumber
- gem install rspec-expectations
##TODO grep on all folders searching for .feature files
- find . -name "*.feature"
The error I am receiving is as follows.
Output from the GitLab pipeline execution:
Could you please help me fix this so the pipeline runs successfully?
Thanks.

How to install a package in GitLab runner container?

I am trying to implement some CI/CD using GitLab Runner.
I am very new to containers and am trying to install the zip package in the container.
I was able to install awscli using pip, but I am not able to install the zip package, which is required for my shell script.
Following is the .gitlab-ci.yml file:
stages:
  - build

build:
  image: python:latest
  stage: build
  script:
    - pip install awscli
    - yum install zip
    - bash cicdScript.sh
I'm using the python container because my script requires awscli, but it also needs the zip package.
I tried the following:
1)
script:
  - pip install awscli
  - yum install zip
  - bash cicdScript.sh
gives -
/bin/bash: line 82: yum: command not found
2)
script:
  - pip install awscli
  - apt-get install zip unzip
  - bash cicdScript.sh
gives -
Reading package lists...
Building dependency tree...
Reading state information...
Package zip is not available, but is referred to by another package.
This may mean that the package is missing, has been obsoleted, or
is only available from another source
E: Package 'zip' has no installation candidate
Try running apt-get update first and adding the -y flag:
apt-get update
apt-get install -y zip unzip
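Putting the answer together with the job from the question, the fixed file could look roughly like this (a sketch; cicdScript.sh is the script name from the question):
stages:
  - build

build:
  image: python:latest
  stage: build
  script:
    - pip install awscli
    # python:latest is Debian-based, so use apt-get; refresh the package index and install non-interactively
    - apt-get update
    - apt-get install -y zip unzip
    - bash cicdScript.sh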

GitLab CI runner doesn't build

I have just installed gitlab-ci-multi-runner by following the documentation: https://gitlab.com/gitlab-org/gitlab-ci-multi-runner/blob/master/docs/install/linux-repository.md
I use the public server ci.gitlab.com, and the registration of the runner seems OK (the runner appears with a green light).
With debug activated I can see that the runner regularly polls the CI server.
But when a new commit is pushed, no build is run.
Everything is green (https://ci.gitlab.com/projects/4656), but no test is executed...
My .gitlab-ci.yml is pretty simple:
before_script:
  - apt install python3-pip
  - pip3 install -q -r requirements.txt

master:
  script: "make test"
  only:
    - master

script:
  - python setup.py test
By the way, I can't find any error message and I don't know where to look.
I am pretty new to CI and there is perhaps an obvious point I am missing.
Give this a try; it assumes your pyunit tests are in a file called runtests.py in the working directory.
before_script:
  - apt install python3-pip
  - pip3 install -q -r requirements.txt

master:
  script: "python runtests.py"
  only:
    - master
