I'm trying to run a Helm deployment via Azure DevOps. The problem is that the variable I set in my Bash step is not being read in the actual upgrade step. When I run the command stand-alone from my CLI it works fine.
So it is actually about this line:
arguments: "--reuse-values --version $(helmChartVersion)"
The full thing below:
- task: Bash@3
name: repoAdd
displayName: Add repo and deploy
inputs:
targetType: 'inline'
script: |
# Add the repo
helm repo add \
stapp \
https://$(containerRegistry)/helm/v1/repo \
--username $(registryUsername) \
--password '$(registryPassword)'
# Extra version file
export helmChartVersion=$(jq .helmChartVersion $(pipeline.workspace)/ci-pipeline/build-artifact/variables.json -r)
cat $(pipeline.workspace)/ci-pipeline/build-artifact/variables.json
# Lets update the repo
helm repo update
- task: HelmDeploy@0
inputs:
connectionType: 'Azure Resource Manager'
azureSubscription: 'Microsoft Azure(1fafaf-8012-4035-b8f3-fafaffa)'
azureResourceGroup: 'production-rg'
kubernetesCluster: 'production'
namespace: 'stapp-test'
command: 'upgrade'
chartType: 'Name'
chartName: 'stapp/stapp'
releaseName: 'stapp'
install: false
arguments: "--reuse-values --version $(helmChartVersion)"
Best,
Pim
An export in the Bash step only exists inside that script's process. In Azure DevOps you must explicitly set the variable with a logging command (the ##vso prefix is a legacy label from Visual Studio Online) so that subsequent tasks can read it:
# Extra version file
helmChartVersion=$(jq .helmChartVersion $(pipeline.workspace)/ci-pipeline/build-artifact/variables.json -r)
echo "##vso[task.setvariable variable=helmChartVersion]$helmChartVersion"
Alessandro Segala has written a great article about this (https://medium.com/microsoftazure/how-to-pass-variables-in-azure-pipelines-yaml-tasks-5c81c5d31763)
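Putting it together with the tasks from the question (the repo add/update lines and the HelmDeploy connection inputs are unchanged and omitted here for brevity), a minimal sketch looks like this; the value set in the Bash step is then resolved as $(helmChartVersion) in the upgrade step:
- task: Bash@3
  name: repoAdd
  displayName: Add repo and set chart version
  inputs:
    targetType: 'inline'
    script: |
      # Read the chart version from the build artifact
      helmChartVersion=$(jq .helmChartVersion $(pipeline.workspace)/ci-pipeline/build-artifact/variables.json -r)
      # Make it available to the following tasks, not just this script
      echo "##vso[task.setvariable variable=helmChartVersion]$helmChartVersion"
- task: HelmDeploy@0
  inputs:
    command: 'upgrade'
    chartType: 'Name'
    chartName: 'stapp/stapp'
    releaseName: 'stapp'
    install: false
    # Resolved at runtime from the variable set above
    arguments: "--reuse-values --version $(helmChartVersion)"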
We run integration tests, written in Python, in an Azure pipeline.
The run is triggered in this way:
- script: |
pdm run pytest \
--variables "$VARIABLE_FILE" \
--napoleon-docstrings \
--doctest-modules \
--color=yes \
--junitxml=junit/test-results.xml \
integration
env:
<different_environment_variables>
Some integration tests connect to a database, and the properties needed for the connection are stored in a variable group in Azure DevOps. I can list them via:
- powershell: |
az pipelines variable-group variable list --group-id <some_group_id> --output table
but I do not know how to set them as environment variables afterwards and use them in the Python code.
You can reference the variables from your variable group directly in your Python YAML pipeline like below:
# Python package
# Create and test a Python package on multiple Python versions.
# Add steps that analyze code, save the dist with the build record, publish to a PyPI-compatible index, and more:
# https://docs.microsoft.com/azure/devops/pipelines/languages/python
trigger:
- master
pool:
vmImage: ubuntu-latest
strategy:
matrix:
Python27:
python.version: '2.7'
Python35:
python.version: '3.5'
Python36:
python.version: '3.6'
Python37:
python.version: '3.7'
variables:
- group: SharedVariables
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: '$(python.version)'
displayName: 'Use Python $(python.version)'
- script: |
echo $(databaseservername)
- script: |
python -m pip install --upgrade pip
pip install -r requirements.txt
displayName: 'Install dependencies'
- script: |
pip install pytest pytest-azurepipelines
pytest
displayName: 'pytest'
The relevant additions are the variable group reference and the script step that echoes the variable:
variables:
- group: SharedVariables
- script: |
echo $(databaseservername)
Authorize the pipeline to access the variable group when prompted (Permit access); the pipeline then runs successfully and prints the variable.
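If you need the value inside your Python code itself, map it onto an environment variable on the step and read it with os.environ; the explicit env: mapping is also what you need for secret variables, which are not exposed to scripts automatically. A minimal sketch, assuming the variable group variable is called databaseservername (DB_SERVER is just a name chosen for this sketch):
variables:
- group: SharedVariables
steps:
- script: |
    # DB_SERVER is populated from the variable group at runtime
    python -c "import os; print(os.environ['DB_SERVER'])"
  displayName: 'Read variable from the environment'
  env:
    DB_SERVER: $(databaseservername)  # explicit mapping, required for secrets
The same env: block can be added to the pytest step from the question so the tests can read the connection properties with os.environ.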
You can also pass the variables to a Python task as arguments, or use them in an inline script, like below:
variables:
- group: variableGroup
steps:
- task: PythonScript@0
displayName: 'Run a Python script'
inputs:
scriptPath: 'Test.py'
arguments: --variableInScript $(variableInVariableGroup)
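On the receiving side the argument shows up in sys.argv of the script; a sketch using an inline script for brevity (this assumes the arguments input is forwarded to inline scripts the same way as to a script file):
- task: PythonScript@0
  displayName: 'Read an argument from a variable group'
  inputs:
    scriptSource: 'inline'
    script: |
      # The task appends the arguments to the python command line
      import argparse
      parser = argparse.ArgumentParser()
      parser.add_argument("--variableInScript")
      args = parser.parse_args()
      print("variableInScript:", args.variableInScript)
    arguments: --variableInScript $(variableInVariableGroup)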
Inline Python task:
variables:
- group: SharedVariables
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: '$(python.version)'
displayName: 'Use Python $(python.version)'
- script: |
echo $(databaseservername)
- task: PythonScript@0
inputs:
scriptSource: 'inline'
script: 'print(''databaseservername: $(databaseservername)'')'
Output:
The databaseservername value appears masked in the output after being passed as an argument to the inline Python script.
You can also link the variable group in a release pipeline and reuse it in multiple pipelines across different stages.
Similarly, you can add an Azure CLI task for your shell script to call az pipelines variable-group variable list in your YAML pipeline and reference it like below:
- task: AzureCLI@2
  inputs:
    azureSubscription: '<Subscription-name>(sub-id)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az pipelines variable-group variable list \
        --org 'https://dev.azure.com/org/' \
        --project project --group-id id \
        --only-show-errors --output json
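Note that az pipelines commands have to authenticate against the organization; one common approach (an assumption here, not part of the original answer) is to hand the job access token to the Azure DevOps CLI extension via its AZURE_DEVOPS_EXT_PAT environment variable:
- task: AzureCLI@2
  inputs:
    azureSubscription: '<Subscription-name>(sub-id)'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az pipelines variable-group variable list \
        --org 'https://dev.azure.com/org/' \
        --project project --group-id id \
        --only-show-errors --output json
  env:
    # The Azure DevOps CLI extension reads this token for authentication
    AZURE_DEVOPS_EXT_PAT: $(System.AccessToken)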
References:
Access variables from Variable Groups inside Python script task in Azure DevOps Yaml pipeline, by Bo Soborg Perterson
Pass Variable Group as Dictionary To Python Script, by Joy Wang
I'm working with an Azure pipeline to check out source code from an Azure repo and run the built-in setup script provided by webMethods (SAG). Using build.yaml I am able to build my application, but I am not able to publish the artifacts.
cat build.yaml
trigger:
- devops-build
pool:
name: CICD
steps:
# Create Target Directory to keep git repo for later use
- bash: |
mkdir -p /home/user/cicd_source/dev/packages/packages
displayName: 'create directory'
- bash: |
echo "webname=${{parameters.projectName}}" > $(Build.ArtifactStagingDirectory)/devpackagename.properties
echo "BuildNumber=$(Build.BuildNumber)" > $(Build.ArtifactStagingDirectory)/devBuildNumber.txt
The above script creates devpackagename.properties and devBuildNumber.txt in the following path inside my self-hosted agent's work directory.
pwd
/home/user/agent/CICD/_work/1/a
ls -lrt
devpackagename.properties
devBuildNumber.txt
cat devpackagename.properties
webname=package
cat devBuildNumber.txt
BuildNumber=20221004.83
After the pipeline runs successfully, I don't see any artifacts published for my pipeline.
After your build steps, add the task below:
- task: PublishPipelineArtifact@1
inputs:
targetPath: '$(Build.ArtifactStagingDirectory)'
artifact: 'drop'
publishLocation: 'pipeline'
You will then see the artifact published on the pipeline run.
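If a later stage or a separate pipeline needs those files, they can be downloaded again with a download task; a minimal sketch, assuming the artifact name 'drop' used above:
- task: DownloadPipelineArtifact@2
  inputs:
    buildType: 'current'        # take the artifact from the current run
    artifactName: 'drop'
    targetPath: '$(Pipeline.Workspace)/drop'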
This is the reference doc I have followed to set up the Azure pipeline
https://medium.com/adessoturkey/owasp-zap-security-tests-in-azure-devops-fe891f5402a4
Below I am sharing a screenshot of the failed pipeline:
Could you please help me resolve the issue? I followed the Medium article exactly to implement the task.
If you are aware of this, could you please share your thoughts?
This is the pipeline script I am using:
trigger: none
stages:
  - stage: 'buildstage'
    jobs:
      - job: buildjob
        pool:
          vmImage: ubuntu-latest
        steps:
          - checkout: self
          - checkout: owasap-zap
          - bash: "docker run -d -p 80:80 nginx:1.14.2"
            displayName: "App Container"
          - bash: |
              chmod -R 777 ./
              docker run --rm -v $(pwd):/zap/wrk/:rw -t owasp/zap2docker-stable zap-full-scan.py -t http://$(ip -f inet -o addr show docker0 | awk '{print $4}' | cut -d '/' -f 1):80 -x xml_report.xml
              true
            displayName: "Owasp Container Scan"
          - powershell: |
              $XslPath = "owasp-zap/xml_to_nunit.xslt"
              $XmlInputPath = "xml_report.xml"
              $XmlOutputPath = "converted_report.xml"
              $XslTransform = New-Object System.Xml.Xsl.XslCompiledTransform
              $XslTransform.Load($XslPath)
              $XslTransform.Transform($XmlInputPath, $XmlOutputPath)
            displayName: "PowerShell Script"
          - task: PublishTestResults@2
            displayName: "Publish Test Results"
            inputs:
              testResultsFiles: converted_report.xml
              testResultsFormat: NUnit
          # task: PublishTestResults@2
According to the YAML file, you want to check out multiple repositories in your pipeline, but it seems you haven't defined a repository resource as mentioned in the document you shared.
resources:
repositories:
- repository: <repo_name>
type: git
name: <project_name>/<repo_name>
ref: refs/heads/master
And according to the screenshot you shared, you only checked out one repo, which causes the location of the file xml_to_nunit.xslt to differ from owasp-zap/xml_to_nunit.xslt. If you only check out one repo, xml_to_nunit.xslt should be in the current directory, so just define $XslPath in the PowerShell script as "xml_to_nunit.xslt".
Edit
If the repository that contains the "xml_to_nunit.xslt" file is in the same organization as the repository your pipeline runs from, you need to check it out using the inline checkout syntax like below, or define a repository resource.
- checkout: git://MyProject/MyRepo # Azure Repos Git repository in the same organization
You could also add an ls command before the PowerShell script to list the files in the current directory, to figure out where "xml_to_nunit.xslt" actually is.
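Putting the two suggestions together, a sketch of how the multi-repo checkout could look (repository and file names are taken from the question; with multiple checkout steps each repository ends up in its own sub-folder of $(Build.SourcesDirectory), and the checkout alias must match the repository resource name exactly):
resources:
  repositories:
    - repository: owasp-zap
      type: git
      name: <project_name>/owasp-zap
      ref: refs/heads/master
steps:
  - checkout: self
  - checkout: owasp-zap
  # List the workspace to confirm where xml_to_nunit.xslt ended up
  - bash: ls -R $(Build.SourcesDirectory)
    displayName: "List checked-out files"
  - powershell: |
      # With multiple checkouts, each repo lives in its own sub-folder
      $XslPath = "$(Build.SourcesDirectory)/owasp-zap/xml_to_nunit.xslt"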
I am having issues passing parameters defined in an Azure Pipelines YAML file to the Azure CLI in a bash script. I found the following solution on Stack Overflow, but it doesn't seem to work:
Pass parameter to Azure CLI Task in DevOps
Here is my YAML file:
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
trigger:
- master
pool:
vmImage: 'ubuntu-latest'
steps:
- task: AzureCLI@2
displayName: Azure CLI
inputs:
azureSubscription: templateresourceconnection
scriptType: bash
scriptPath: ./deployment.sh
arguments:
-resourceGrp 'TEST-RG'
-location 'westeurope'
I would expect to be able to access the arguments in my deployment.sh, but it fails:
#!/bin/bash
# Create Resource Group
az group create -n $(resourceGrp) -l $(location)
If I don't pass any arguments and hardcode the values in deployment.sh it all works fine.
Any ideas what could cause this issue? I also tried with UPPERCASE and just without brackets.
I get the following error message:
Do you have any idea what else I could try to make it work? It seems the documentation doesn't contain any example for the Azure CLI, just how to define parameters but not how to pass them afterwards.
Thank you.
Do you have any idea what else I could try to make it work
You could try to use environment variables, which can be accessed from bash script files.
First of all, you need to define pipeline variables instead of task arguments.
variables:
- name: one
value: initialValue
Here is my example, used on a Linux system.
az group create -n $RESOURCEGRP -l $LOCATION
Note: the environment variable names are the pipeline variable names in upper case.
e.g.: resourceGrp -> $RESOURCEGRP
YAML sample:
variables:
- name: resourceGrp
value: TEST-RG
- name: location
value: westeurope
pool:
vmImage: 'ubuntu-latest'
steps:
- task: AzureCLI@2
displayName: 'Azure CLI deploy.sh'
inputs:
azureSubscription: kevin0209
scriptType: bash
scriptPath: ./deployment.sh
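For reference, the same call can also be kept entirely in the pipeline with an inline script; a sketch combining the variables above with the service connection name from the question (pipeline variables are exposed to the script as upper-cased environment variables):
variables:
  - name: resourceGrp
    value: TEST-RG
  - name: location
    value: westeurope
steps:
  - task: AzureCLI@2
    displayName: 'Azure CLI deploy (inline)'
    inputs:
      azureSubscription: templateresourceconnection  # service connection from the question
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        # Pipeline variables arrive here as upper-cased environment variables
        az group create -n "$RESOURCEGRP" -l "$LOCATION"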
I am new to Azure DevOps and am trying to use it for one of my C++ projects. My requirements are below.
1: Build the C++ code and publish it to the artifact feed (Azure). This I am able to do; the binary is published.
2: I want to use that artifact (the binary file) while building a Docker image. This I am unable to achieve in Azure DevOps; locally I am able to build the Docker image with the binary and run it.
As a summary:
I need to create a simple release pipeline using the build artifacts from the previous task.
● The release pipeline should build a Docker image with the following requirements:
○ It must contain the build artifacts from the build pipeline.
Please find the code below:
trigger:
- master
jobs:
- job: gcctest
pool:
vmImage: 'ubuntu-16.04'
steps:
- script: sudo apt-get update && sudo apt-get install libboost-all-dev
- script: g++ -std=c++11 -I/usr/include/boost/asio -I/usr/include/boost -o binary.out main.cpp
connection.cpp connection_manager.cpp mime_types.cpp reply.cpp request_handler.cpp request_parser.cpp server.cpp -lboost_system -lboost_thread -lpthread
- powershell: |
$commitId= "$env:BUILD_SOURCEVERSION"
$definitionName= "$env:BUILD_DEFINITIONNAME"
$shortId= $commitId.Substring(1, 8)
$buildName="$definitionName.$shortId"
Write-Host $buildName
Write-Output "$buildName">readme.txt
# echo "##vso[task.setvariable variable=text]$buildName"
- task: CopyFiles@2
inputs:
sourceFolder: '$(Build.SourcesDirectory)'
contents: '?(*.out|*.txt)'
targetFolder: $(Build.ArtifactStagingDirectory)
- task: PublishBuildArtifacts@1
inputs:
pathToPublish: $(Build.ArtifactStagingDirectory)
artifactName: result
- task: Docker@1
displayName: 'Build using multi-stage'
inputs:
containerregistrytype: 'Container Registry'
dockerRegistryEndpoint: 'My Container Registry'
arguments: '--build-arg ARTIFACTS_ENDPOINT=$(ArtifactFeed)'
Dockerfile:
FROM ubuntu:18.04
RUN apt-get update
RUN apt-get install -y libboost-all-dev
COPY . /app
EXPOSE 80
CMD /app/binary.out 0.0.0.0 80 .
Expected result:
The Docker image should be built using the artifact and published to a Docker Hub repo.