I have come across a scenario where I want to build source code depending on the source directory.
I have two languages in the same git repository (dotnet & Python).
I want to build the source code using a single Azure Pipeline.
If a commit touches both (dotnet & Python), all the tasks should be executed,
and if a commit only touches a specific directory, only the respective language's tasks should run.
Please let me know how I can achieve this using condition, or whether there are any other alternatives.
Below is my azure-pipelines.yml:
# Trigger and pool name configuration
variables:
  name: files
steps:
- script: $(files) = "git diff-tree --no-commit-id --name-only -r $(Build.SourceVersion)"
  displayName: 'display Last Committed Files'
  ## Here I am getting changed files:
  ## Dotnet/Server.cs
  ## Py/Hello.py
- task: PythonScript@0 ## It should only get called when there are changes in /Py
  inputs:
    scriptSource: 'inline'
    script: 'print(''Hello, FromPython!'')'
  condition: eq('${{ variables.files }}', '/Py')
- task: DotNetCoreCLI@2 ## It should only get called when there are changes in /Dotnet
  inputs:
    command: 'build'
    projects: '**/*.csproj'
  condition: eq('${{ variables.files }}', '/Dotnet')
Any help will be appreciated.
I don't think it's possible to do directly what you want. All these task conditions are evaluated at the beginning of the pipeline execution, so if you set pipeline variables in any particular task, even the first one, it's already too late.
If you really want to do this, you probably have to go scripting all the way. So you set the variables in the first script using the logging-command syntax from here:
(if there are C# files) echo '##vso[task.setvariable variable=DotNet]true'
(if there are Py files) echo '##vso[task.setvariable variable=Python]true'
And then in the other scripts you evaluate them, e.g.:
if [ "$(DotNet)" = "true" ]; then dotnet build; fi
Something along these lines. That will probably be quite brittle, so maybe it would make sense to reconsider the flow at some higher level, but without extra context it's hard to say.
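To make that concrete, here is a minimal sketch of the scripted approach, assuming the two folders are named Dotnet/ and Py/ as in the question (folder names, variable names, and display names are placeholders):

variables:
  DotNet: 'false'  # defaults, overridden by the detection step below
  Python: 'false'
steps:
- bash: |
    # List the files changed in the triggering commit
    CHANGED=$(git diff-tree --no-commit-id --name-only -r $(Build.SourceVersion))
    echo "$CHANGED"
    # Flag each language whose folder contains changes
    case "$CHANGED" in *Dotnet/*) echo '##vso[task.setvariable variable=DotNet]true';; esac
    case "$CHANGED" in *Py/*) echo '##vso[task.setvariable variable=Python]true';; esac
  displayName: 'Detect changed folders'
- bash: |
    # Only runs the Python part when the Py/ folder changed
    if [ "$(Python)" = "true" ]; then python Py/Hello.py; fi
  displayName: 'Python build'
- bash: |
    # Only runs the dotnet part when the Dotnet/ folder changed
    if [ "$(DotNet)" = "true" ]; then dotnet build; fi
  displayName: 'DotNet build'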
I am currently migrating some build components to Azure Pipelines and am attempting to set some environment variables for all Golang-related processes. I wish to execute the following command within the pipeline:
CGO_ENABLED=0 GOOS=linux GOARCH=amd64 go build [...]
When using the provided Golang integrations, it is easy to add arguments for Go-related processes, but setting an environment variable for all (or for each individual) Go process does not seem possible. Neither GoTool nor the default Go task seems to support it, and performing a script task with a shell execution in it does not seem to be supported either.
I have also tried adding an environment variable to the entire pipeline process that defines the desired flags, but it appears to be ignored by the Go task provided by Azure Pipelines itself.
Would there be a way to add these flags to each (or a single) Go process, as in the following code block (in which the flags input line was made up by me)?
- task: Go@0
  inputs:
    flags: 'CGO_ENABLED=0 GOOS=linux GOARCH=amd64'
    command: 'build'
    arguments: '[...]'
    workingDirectory: '$(System.DefaultWorkingDirectory)'
  displayName: 'Build the application'
Based on the information I was able to find and many hours of debugging, I ended up using a workaround in which I ran the Go commands in a CmdLine@2 task instead. Due to how GoTool@0 sets up the pipeline and environment, this is possible.
Thus, the code snippet below worked for my purposes.
steps:
- task: GoTool@0
  inputs:
    version: '1.19.0'
- task: CmdLine@2
  inputs:
    script: 'CGO_ENABLED=0 GOOS=linux GOARCH=amd64 go build'
    workingDirectory: '$(System.DefaultWorkingDirectory)'
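If you prefer not to prefix the command itself, script-style tasks also accept an env mapping, so the same flags can be set declaratively. A variant sketch of the same workaround (untested against GoTool's setup, so treat the exact placement as an assumption):

steps:
- task: GoTool@0
  inputs:
    version: '1.19.0'
- task: CmdLine@2
  displayName: 'Build the application'
  inputs:
    script: 'go build'
    workingDirectory: '$(System.DefaultWorkingDirectory)'
  env:
    # Cross-compilation flags exported for this step only
    CGO_ENABLED: '0'
    GOOS: 'linux'
    GOARCH: 'amd64'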
We are migrating from Azure DevOps to GitHub, and we have build validations set up so that if you make a change in a specific folder, the respective CI pipeline runs when a PR is created.
I am trying to make use of the PR triggers in my YAML file; however, when I open a PR, it doesn't seem to work.
My pipeline is:
trigger: none
pr:
  branches:
    include:
    - develop
    - release/*
    - ProductionSupport/*
  paths:
    include:
    - cicd/pipelines/common/pre-commit-ci.yaml
    - src
    - cicd
pool:
  vmImage: ubuntu-latest
variables:
  PRE_COMMIT_HOME: $(Pipeline.Workspace)/pre-commit-cache
steps:
- bash: echo "##vso[task.setvariable variable=PY]`python -V`"
  displayName: Get python version
- task: Cache@2
  inputs:
    key: pre-commit | .pre-commit-config.yaml | "$(PY)"
    path: $(PRE_COMMIT_HOME)
- bash: |
    pip install --quiet pre-commit
    pre-commit run
  displayName: 'Run pre-commit'
As a test, to make sure my branches/paths were correct, I updated the trigger section to:
trigger:
  branches:
    include:
    - develop
    - release/*
    - ProductionSupport/*
  paths:
    include:
    - cicd/pipelines/common/pre-commit-ci.yaml
    - src
    - cicd
Then, when I made a change to one of the files in these folders, the pipeline was successfully triggered. Am I specifying my PR validation incorrectly?
Your YAML definition seems correct.
Since you mentioned that the CI trigger works fine, and that you are migrating from Azure DevOps to GitHub, this brings to mind a situation that exactly reproduces what you're experiencing and that you might not expect:
PR Trigger Override
For example, if your pipeline is the same one as before (you only changed the pipeline source) and you didn't delete the previous build validation (or the previous pipeline has the same name as the current one), then the pr section in your GitHub YAML file will be overridden, and only the build validation on the DevOps side will take effect.
I suggest you investigate whether some build validation settings still apply to the pipeline (if your project structure is complex, this may be difficult to find), or simply create a completely new pipeline from the new YAML file.
I am working on a multi-stage pipeline that builds and deploys some C# code from staging to production.
Everything works just fine, but I wanted to customise the pipeline a bit more so I can see the actual version being built and deployed as part of the stage name.
At the moment, this is my multi-stage pipeline:
trigger:
  batch: true
  tags:
    include:
    - '*'
  branches:
    exclude:
    - main
    - staging
pool:
  vmImage: 'ubuntu-latest'
variables:
  buildNumber: "$[variables['Build.BuildNumber']]"
  DOCKER_BUILDKIT: 1
  dockerRegistryServiceConnectionStaging: '<My-Connection-String>'
  imageRepositoryStaging: '<My-Repo-Name>'
  containerRegistryStaging: '<My-Container-Name>'
  dockerRegistryServiceConnectionProd: '<My-Connection-String>'
  imageRepositoryProd: '<My-Repo-Name>'
  containerRegistryProd: '<My-Container-Name>'
  dockerfilePath: 'pathTo/Dockerfile'
  solution: 'path/To/Solution.csproj'
  tag: '$(Build.BuildNumber)'
stages:
- stage: 'Build_Staging'
  displayName: 'Build_Staging'
  jobs:
  - job: buildStaging
    displayName: 'DotNet Core publish and dockerize'
    steps:
    - powershell: |
        # Write your PowerShell commands here.
        Write-Host "Update Build.BuildNumber"
        cd $(System.DefaultWorkingDirectory)
        $Latesttag = $(git describe --tags $(git rev-list --tags --max-count=1))
        Write-Host "The latest git tag is $Latesttag"
        Write-Host "##vso[build.updatebuildNumber]$Latesttag"
    - task: DotNetCoreCLI@2
      displayName: 'DotNet - Restore'
      inputs:
        command: 'restore'
        projects: $(solution)
        noCache: true
        versioningScheme: 'off'
        vstsFeed: '<Feed>'
    - task: DotNetCoreCLI@2
      name: 'DotnetPublish'
      displayName: 'dotnet - Publish'
      inputs:
        command: 'publish'
        projects: $(solution)
        arguments: '-o publish/solution -c release'
        modifyOutputPath: false
        zipAfterPublish: false
        publishWebProjects: false
    - task: Docker@2
      name: 'dockerBuildAndPush'
      displayName: 'docker - Build & Push $(tag)'
      inputs:
        repository: $(imageRepositoryStaging)
        Dockerfile: $(dockerfilePath)
        containerRegistry: ${{ variables.dockerRegistryServiceConnectionStaging }}
        buildContext: ${{ variables.buildContext }}
        tags: |
          $(Build.BuildNumber)
          latest
- stage: 'Deploy_Staging'
  jobs:
  - deployment: 'Deploy'
    environment: 'Staging'
    variables:
      EnvironmentName: 'Staging'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: AzureRmWebAppDeployment@4
            displayName: 'Deploy Azure App Service To Staging'
            inputs:
              azureSubscription: '<Azure-Subscription>'
              appType: 'webAppContainer'
              DockerNamespace: '<container-namespace>'
              DockerRepository: '<Repository>'
              DockerImageTag: '$(Build.BuildNumber)'
              WebAppName: '<WebAppName>'
The PowerShell step is there to override Build.BuildNumber with the tag I push to GitHub.
When I run this pipeline in Azure DevOps, I see the stage name Build_Staging_$(Build.BuildNumber) rendered as a literal string.
What I would really like to see, if I push the tag v1.0.0 for example, is a stage name like:
Build_Staging_v1.0.0
I tried to use displayName, and the output is not the one I was looking for; if I try name instead of displayName, I get the error unexpected value name.
Can anyone please help me understand what I am doing wrong and how I can achieve this?
If my question is not 100% clear or is missing any important detail, just let me know.
UPDATE:
I have updated the post with my entire pipeline.
This pipeline used to be a single-job process, and everything was working fine. But to get my hands dirty, I wanted to add stages to split the workflow based on resources and environment.
The process still works, and this is what I am expecting:
In GitHub, when I create a tag on the main branch, it triggers my build stage. Thanks to the PowerShell script that updates the build number with the tag, I am able to build the Docker image in my container registry in the following format:
docker-image-name:v1.0.1
That version is visible at that level as well.
This updated build number (now the tag) is used by the Azure Pipelines Slack app to report the version that has been pushed.
So far everything is good.
But I am facing a problem with the deployment job: at that level I am not able to add any PowerShell script to update that same build number with the tag. I checked the documentation, and nothing is mentioned about how I can add another job or step there; when I tried, I got errors that the value was unexpected.
To fully explain the issue: assuming I am deploying the Docker image v1.0.1, everything works perfectly, but the build number in the deployment stage is not updated, so in the Slack channel I see the plain build number instead.
Instead of the build number, I would like to see my tag.
Any help here, please?
Unfortunately, you won't be able to set a stage name from a dynamic variable that is set within one of its child steps. You'll have to use a pipeline-level variable or a predefined variable.
Variable evaluation goes top-down from stages to tasks:
stages
jobs
tasks
To explain exactly why, let's talk about how variable evaluation works in general with regard to this structure:
VARIABLE EVALUATION: Using stages as an example, you can set a stage name using any dynamic value that is present when the stage is evaluated. This means the variable is only read when the stage is initially "rendered". Azure DevOps requires that the variable be present before evaluation and will not retroactively update the UI if that variable is changed within a child step.
Let's look at each level and its respective limitations on which variables you can use in its name:
STAGES: pipeline-level variables, parameters (in the case of templates), or predefined variables
JOBS: stage-level variables, pipeline-level variables, parameters (in the case of templates), or predefined variables
TASKS: job-level variables, stage-level variables, pipeline-level variables, parameters (in the case of templates), or predefined variables
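For illustration, a template parameter is resolved at compile time, so it can safely appear in a stage name. This is a minimal sketch, assuming the version is supplied as a runtime parameter (the parameter name is made up; the dots are swapped out for the stage name itself, since stage names only allow alphanumerics and underscores, while displayName has no such restriction):

parameters:
- name: releaseTag
  type: string
  default: 'v0.0.0'
stages:
- stage: Build_Staging_${{ replace(parameters.releaseTag, '.', '_') }}
  displayName: 'Build_Staging_${{ parameters.releaseTag }}'
  jobs:
  - job: buildStaging
    steps:
    - script: echo Building ${{ parameters.releaseTag }}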
I did something similar by setting my build number from a repo tag. Here is the PowerShell function that sets the Build.BuildNumber variable to the tag value. You can call it directly, or gate it behind a parameter if you have other version-number logic.
function getTagVersion() {
    # Most recent tag reachable from HEAD (--always falls back to a commit hash)
    $tag = iex "git describe --long --tags --always"
    # Pull a four-part version number out of the tag
    $a = [regex]"\d+\.\d+\.\d+\.\d+"
    $b = $a.Match($tag)
    $b = $b.Captures[0].value
    $b = $b -replace '-', '.'
    $b = $b -replace 'v', ''
    Write-Host "Version found: $b"
    # Swap the version portion of the current build number for the tag's version
    $newBuildNumber = '$(Build.BuildNumber)' -replace $a, $b
    Write-Host "##vso[build.updatebuildnumber]$newBuildNumber"
    return $b
}
I can't claim credit for this code, as I found it on someone's blog, but it works and I use it for my release builds. You just call the function and it resets Build.BuildNumber to the latest tag in your repo. It's important to note that the tag should be in a normal version-number format.
Example:
Tag Name: 10.1.100.0
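To wire it into a pipeline, the function can be dot-sourced and called from an inline PowerShell step before any step that consumes $(Build.BuildNumber). A sketch, assuming the function is saved at scripts/Get-TagVersion.ps1 (a hypothetical path):

- powershell: |
    # Load the helper and rewrite Build.BuildNumber from the latest tag
    . "$(System.DefaultWorkingDirectory)/scripts/Get-TagVersion.ps1"
    getTagVersion
  displayName: 'Set build number from git tag'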
I need to pass a file path to a trigger job, where the file path is found within a specified JSON file in a separate job. Something along the lines of this...
stages:
  - run_downstream_pipeline

variables:
  FILE_NAME: default_file.json

.get_path:
  stage: run_downstream_pipeline
  needs: []
  only:
    - schedules
    - triggers
    - web
  script:
    - apt-get install jq
    - FILE_PATH=$(jq '.file_path' $FILE_NAME)

run_pipeline:
  extends: .get_path
  variables:
    PATH: $FILE_PATH
  trigger:
    project: my/project
    branch: staging
    strategy: depend
I can't seem to find any workaround to do this, as using extends won't work: GitLab won't allow a script section in a trigger job.
I thought about using the GitLab API trigger method, but I want the status of the downstream pipeline to actually show up in the pipeline UI, and I want the upstream pipeline to depend on the status of the downstream pipeline, which, from my understanding, is not possible when triggering via the API.
Any advice would be appreciated. Thanks!
You can use artifacts:reports:dotenv to set variables dynamically for subsequent jobs.
stages:
  - one
  - two

my_job:
  stage: "one"
  script:
    - FILE_PATH=$(jq '.file_path' $FILE_NAME) # In the script, extract the path from the JSON file.
    - echo "FILE_PATH=${FILE_PATH}" >> variables.env # Add the value to a dotenv file.
  artifacts:
    reports:
      dotenv: "variables.env"

example:
  stage: two
  script: "echo $FILE_PATH"

another_job:
  stage: two
  trigger:
    project: my/project
    branch: staging
    strategy: depend
Variables in the dotenv file will automatically be present for jobs in subsequent stages (or for jobs that declare needs: on the producing job).
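A job can also declare needs: on my_job to receive the dotenv variables explicitly and start as soon as my_job finishes, regardless of stage ordering. A minimal sketch reusing my_job from above (the job name is a placeholder):

early_consumer:
  stage: two
  needs:
    - job: my_job
      artifacts: true # pulls in the dotenv report, so FILE_PATH is set
  script: "echo $FILE_PATH"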
You can also pull artifacts into child pipelines, in general.
But be warned you probably don't want to override the PATH variable, since that's a special variable used to help you find builtin binaries.
I am trying to improve the CI/CD process of an old funky project whose code is not open to refactoring at the moment. I just can't get this to work following the Azure documentation, or even tell whether it is possible.
I have been able to improve the current state with an azure-pipelines file that runs unit tests before merging into the releases/dev branch. But I want to go further.
Every PR into releases/dev should:
script: npm run test:unit
script: npm run build:dev
copy/publish the contents of the .div/ folder to an Azure blob store configured for a static site
Any PR or merge into releases/staging should:
script: npm run build:staging
copy/publish the contents of the .div/ folder to an Azure blob store configured for a static site
Any PR or merge into master should:
script: npm run test:unit
script: npm run build:production
copy/publish the contents of the .div/ folder to an Azure blob store configured for a static site
I have 3 questions:
Is this possible within a single YAML file?
How do I run different tasks for different branches? I've defined jobs/stages but can't get them to be conditional.
Is there some magic anyone can direct me to that lets me copy the contents of a directory to a blob store? Or must it be zipped -> copied -> unzipped?
Thanks in advance from a new sleep-deprived dad.
Is this possible within a single YAML file? How do I run different tasks for different branches? I've defined jobs/stages but can't get them to be conditional.
Of course. You can add these stages in a single YAML file; you then need to define the condition field for each stage or each job.
Here is an example for stages:
trigger:
- '*'
pool:
  vmImage: 'ubuntu-latest'
stages:
- stage: Test1
  condition: or(eq(variables['Build.SourceBranch'], 'refs/heads/master'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/master'))
  jobs:
  - job: BuildJob
    steps:
    - script: echo Build Stage1!
- stage: Test2
  condition: or(eq(variables['Build.SourceBranch'], 'refs/heads/dev'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/dev'))
  jobs:
  - job: BuildJob
    steps:
    - script: echo Build Stage2!
- stage: Test3
  condition: or(eq(variables['Build.SourceBranch'], 'refs/heads/staging'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/staging'))
  jobs:
  - job: BuildJob
    steps:
    - script: echo Build Stage3!
You set the required branches as triggers, and then use Build.SourceBranch and System.PullRequest.TargetBranch to determine whether to run the current stage:
Build.SourceBranch -> for branch merges.
System.PullRequest.TargetBranch -> for pull requests.
Here are the docs about conditions and variables.
Is there some magic anyone can direct me to that lets me copy the contents of a directory to a blob store? Or must it be zipped -> copied -> unzipped?
Since you need to publish files to Azure Blob storage, you can use the Azure File Copy task directly.
Here is an example:
- task: AzureFileCopy@4
  displayName: 'AzureBlob File Copy'
  inputs:
    SourcePath: xxx
    azureSubscription: xxx
    Destination: AzureBlob
    storage: xxx
    ContainerName: '$web'
Hope this helps.