Can we reference a YAML template from a Nuget package? - azure

I need to update a couple of pipelines to use the same logic and thought about using templates to split out the common YAML steps.
All the projects share a dependency on an in-house NuGet package that is used for deployment. My initial idea is to use that package to hold the template YAML and reference it in each project, but after reading the documentation I am not entirely sure this is supported.
To give you a better idea, I'll illustrate with the same sample code as the documentation.
This would be our YAML in the Nuget package, to be inserted as a template reference on each project.
# File: templates/include-npm-steps.yml
steps:
- script: npm install
- script: yarn install
- script: npm run compile
This would be the YAML in each project that needs to consume the above YAML from the NuGet package.
# File: azure-pipelines.yml
jobs:
- job: Linux
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - template: templates/include-npm-steps.yml # Template reference
- job: Windows
  pool:
    vmImage: 'windows-latest'
  steps:
  - template: templates/include-npm-steps.yml # Template reference
Presumably the pipeline resolves the referenced templates when it is compiled, so templates shipped inside a NuGet package would not be found, since the package would need to be restored first. Any ideas?

As suspected, you cannot consume YAML from inside a NuGet package; you have to reference the template YAML from a repository instead.
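For completeness, a minimal sketch of the repo-based approach (the repository name and ref here are placeholders): keep the shared steps file in its own Git repository and pull it in with a repository resource.
# azure-pipelines.yml in each consuming project (sketch)
resources:
  repositories:
  - repository: templates                  # alias used after the @ sign below
    type: git
    name: MyTeamProject/pipeline-templates # hypothetical project/repo holding the shared YAML
    ref: refs/heads/main

jobs:
- job: Linux
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - template: templates/include-npm-steps.yml@templates # file path inside the template repo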

Related

How do I set up PR validations in Azure DevOps/GitHub?

We are migrating from Azure DevOps to GitHub and we have Build Validations set up where if you make a change in a specific folder, the respective CI pipeline will run when a PR is created.
I am trying to make use of the PR triggers in my YAML file, however when I open a PR it doesn't seem to work.
My pipeline is:
trigger: none
pr:
  branches:
    include:
      - develop
      - release/*
      - ProductionSupport/*
  paths:
    include:
      - cicd/pipelines/common/pre-commit-ci.yaml
      - src
      - cicd
pool:
  vmImage: ubuntu-latest
variables:
  PRE_COMMIT_HOME: $(Pipeline.Workspace)/pre-commit-cache
steps:
  - bash: echo "##vso[task.setvariable variable=PY]`python -V`"
    displayName: Get python version
  - task: Cache@2
    inputs:
      key: pre-commit | .pre-commit-config.yaml | "$(PY)"
      path: $(PRE_COMMIT_HOME)
  - bash: |
      pip install --quiet pre-commit
      pre-commit run
    displayName: 'Run pre-commit'
As a test to make sure my branches/paths were correct I updated the triggers section to:
trigger:
  branches:
    include:
      - develop
      - release/*
      - ProductionSupport/*
  paths:
    include:
      - cicd/pipelines/common/pre-commit-ci.yaml
      - src
      - cicd
Then when I made a change in one of the files in these folders, the pipeline was successfully triggered. Am I specifying my PR validation incorrectly?
Your YAML definition seems correct.
Since you mention that the CI trigger works fine and that you are migrating from Azure DevOps to GitHub, one situation comes to mind that exactly reproduces what you're experiencing and that you might not expect:
PR Trigger Override
For example, if your pipeline is the same one as before (only the pipeline source was changed) and you didn't delete the previous build validation policy (or the previous pipeline has the same name as the current one), then the pr section in your GitHub YAML file will be overridden; only the build validation on the DevOps side will take effect.
I suggest you investigate whether any build validation settings are still attached to the pipeline (if your project structure is complex, this may be difficult to find), or simply create a completely new pipeline from the new YAML file.
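If the old pipeline was attached to an Azure Repos repository, one way to spot leftover build validation policies is the Azure DevOps CLI; a rough sketch, assuming the azure-devops CLI extension is installed and the organization/project placeholders are filled in:
az repos policy list --org https://dev.azure.com/<organization> --project <project> --output table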

AzureDevops Task name NuGetCommand is ambiguous

I am getting the following error when trying to run my pipeline
Job Job: Step task reference is invalid. The task name NuGetCommand is ambiguous.
Specify one of the following identifiers to resolve the ambiguity:
.NuGetCommand, .NuGetCommand
Below is my .yml file for my .Net Standard library
trigger:
- main
pool:
  vmImage: 'windows-latest'
variables:
  solution: '**/MyLibrary.sln'
  buildPlatform: 'Any CPU'
  buildConfiguration: 'Release'
steps:
- task: NuGetToolInstaller@1
- task: NuGetCommand@2
  inputs:
    command: 'restore'
    restoreSolution: '**/MyLibrary.sln'
    feedsToUse: 'config'
    nugetConfigPath: './nuget.config'
- task: VSBuild@1
  inputs:
    solution: '**\MyLibrary.sln'
    vsVersion: '15.0'
    restoreNugetPackages: true
It seems to be complaining about NuGetCommand@2, which doesn't make sense given that it comes built in?
Edit
I have also tried using - task: 333b11bd-d341-40d9-afcf-b32d5ce6f23b@2 instead of - task: NuGetCommand@2
One of the tasks may have been pushed straight into the account using tfx build tasks upload. Run tfx build tasks list to find out which ones are installed and, if needed, delete the non-official one using tfx build tasks delete.
C:\Users\jesse>npm install tfx-cli -g
C:\Users\jesse>tfx build tasks list
TFS Cross Platform Command Line Interface v0.8.3
Copyright Microsoft Corporation
> Service URL: https://dev.azure.com/jessehouwing-dev
> Personal access token:
The one you want to keep is:
id : 333b11bd-d341-40d9-afcf-b32d5ce6f23b
name : NuGetCommand
friendly name : NuGet
visibility :
description : Restore, pack, or push NuGet packages, or run a NuGet command. Supports NuGet.org and authenticated feeds like Azure Artifacts and MyGet. Uses NuGet.exe and works with .NET Framework apps. For .NET Core and .NET Standard apps, use the .NET Core task.
version : 2.179.0
If there is one with a different GUID, delete it with:
C:\Users\jesse>tfx build tasks delete --task-id the-task-id-guid-to-delete
It may also have been pushed as part of a privately shared custom extension. The marketplace will block tasks with the same GUID, but it will happily allow installing a task with the same name through an extension. Check your installed extensions, especially privately shared ones.
AzureDevops Task name NuGetCommand is ambiguous
According to the error message, it seems that this error can happen when two tasks/extensions with the same name exist in your Azure DevOps organization.
You could use this REST API:
https://dev.azure.com/{organisationName}/_apis/distributedtask/tasks?visibility%5B%5D=Build
to check for possible duplicates.
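For example, the task list can be pulled with a personal access token and filtered for the duplicate name (a sketch; the organization name and PAT are placeholders, and jq is only used for readability):
curl -s -u :<PAT> "https://dev.azure.com/<organisationName>/_apis/distributedtask/tasks?visibility%5B%5D=Build" | jq '.value[] | select(.name == "NuGetCommand") | {id, name, version}'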
To resolve this issue, you could use the command line to invoke nuget.exe and restore the solution:
nuget.exe restore a.sln -source "xx" -PackagesDirectory
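In a YAML pipeline that could look roughly like the following sketch (the solution name and nuget.config are taken from the question above; NuGetAuthenticate@1 is only needed when the config points at an authenticated Azure Artifacts feed):
steps:
- task: NuGetToolInstaller@1
- task: NuGetAuthenticate@1   # only if the feed in nuget.config requires authentication
- script: nuget.exe restore MyLibrary.sln -ConfigFile nuget.config
  displayName: 'Restore with nuget.exe instead of the ambiguous NuGetCommand task'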

Dependencies getting copied to private feed in Azure DevOps

I have set up a build pipeline for a model library that's shared between several of my projects. I'm accessing it through a private feed in Azure DevOps, and it works just fine. I can retrieve the library in Visual Studio and my projects all get the most up-to-date version. However, in the feed are all the dependency libraries used within the model library (e.g. Microsoft.Azure.Storage.Blob, System.Threading, Microsoft.AspNetCore, etc.). I haven't been able to find any guidance on why this is happening, if it's the expected behavior, or if I'm screwing something up. My YAML file for the build pipeline is below:
Also, does anyone know a better way to handle package versioning? This seems really hacky, but it was the only way I could get auto-incrementing versions to work.
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml
name: $(projectName)-$(majorMinorVersion).$(semanticVersion)
pool:
  vmImage: 'windows-latest'
# pipeline variables
variables:
  majorMinorVersion: 1.1
  # semanticVersion counter is automatically incremented by one in each execution of pipeline
  # second parameter is seed value to reset to every time the referenced majorMinorVersion is changed
  semanticVersion: $[counter(variables['majorMinorVersion'], 0)]
  projectName: 'MyProject.Models'
  buildConfiguration: 'Release'
  projectPath: 'Shared/MyProject.Models.csproj'
  fullVersion: '$(majorMinorVersion).$(semanticVersion)'
steps:
# show version number on start
- task: Bash@3
  inputs:
    targetType: 'inline'
    script: |
      echo Building $(projectName)-$(fullVersion)
- task: DotNetCoreCLI@2
  inputs:
    command: 'pack'
    packagesToPack: $(projectPath)
    versioningScheme: 'byEnvVar'
    versionEnvVar: 'fullVersion'
- task: DotNetCoreCLI@2
  inputs:
    command: 'push'
    packagesToPush: '$(Build.ArtifactStagingDirectory)/MyProject.Models*.nupkg'
    nuGetFeedType: 'internal'
    publishVstsFeed: '<feed GUID>'
Dependencies getting copied to private feed in Azure DevOps
That is because your private NuGet feed sets nuget.org as an upstream source by default if you enable Packages from public sources when you create the feed.
Go to Settings -> Upstream sources and you will find three public sources listed.
When a package is downloaded from an upstream source, it is cached in the feed, which is why you see it there. These cached packages do not need to be downloaded from the upstream sources again the next time they are used, and the included upstream sources are all approved by Microsoft, so you do not need to worry about them.
If you would still rather not see them, you can disable the upstream sources, but in that case you need to publish all the dependencies to your private feed yourself; otherwise restoring your model library package will fail because its dependencies cannot be found.
Hope this helps.

Is it possible to reference files inside Azure DevOps pipeline templates when these templates reside in a standalone repo?

I'm setting up several pipelines in Azure DevOps. To make my team's life easier, I'm using job templates.
These job templates are in a proper repository, just for them.
For every pipeline I define the repository to get the templates from.
Some tasks in these templates run powershell code, and I want this code to be in a script file, to be reusable and stored in the same repo as the template.
When the pipeline runs, the template is embedded, and it then tries to locate the PowerShell script inside the project repo actually being built/deployed.
How can I achieve this?
The workaround is to have inline code which I really don't want to have.
Any constructive answer will be very appreciated.
Thanks
After some digging I couldn't find any way to specify a script file as the source for a PowerShell task in a template.
Inside pipeline definition:
resources:
  repositories:
  - repository: templates
    type: git
    name: deploy-templates
variables:
  artifactName: 'Trade Data ETL - $(Build.SourceBranchName)'
stages:
- stage: Build
  displayName: Build
  variables:
  - group: DEV-Credential-Group
  - group: COMMON-Settings-Group
  jobs:
  - template: ssis/pipelines/stage-build.yml@templates # Template reference
    parameters:
      artifactName: '$(artifactName)'
Inside template file:
- task: PowerShell@2
  inputs:
    filePath: ssis/pipelines/scripts/build-ssis-project.ps1
    arguments: '-ProjectToBuild "tradedata-ldz-ssis/tradedata-ldz-ssis.dtproj"'
    pwsh: true
Update 2021
According to learn.microsoft.com, you can now also check out multiple repositories without custom scripting.
If you check out more than one repository, a separate folder containing the repository is created below $(Build.SourcesDirectory).
You can define multiple repositories like this:
resources:
  repositories:
  - repository: devops
    type: git
    name: DevOps
    ref: main
  - repository: infrastructure
    type: git
    name: Infrastructure
    ref: main
And in the steps simply check them out as follows:
steps:
- checkout: self
- checkout: devops
- checkout: infrastructure
# List all available repositories
- script: ls
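With multiple checkouts, each repository is placed in a folder named after the repository under $(Build.SourcesDirectory), so a script from the devops repository could then be run like this (the folder and script names are assumptions):
- pwsh: ./DevOps/scripts/foo.ps1
  workingDirectory: $(Build.SourcesDirectory)
  displayName: Run a script from the checked-out DevOps repo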
Original Answer
Currently, the repository resource only supports YAML files in other repositories. However, you could simply check out the repository in a task and then run the desired PowerShell script.
steps:
- task: PowerShell@2
  inputs:
    targetType: inline
    script: |
      git clone -b <your-desired-branch> https://azuredevops:$($env:token)@dev.azure.com/<your-organization>/<your-project>/_git/<your-repository> <target-folder-name>
      ./<target-folder-name>/foo.ps1
  env:
    token: $(System.AccessToken)
This script checks out an arbitrary branch and executes a script foo.ps1 in the root of the target repository.
Call - checkout: templates inside the template file. This might only work when you insert a template, but it successfully sees the repository resource and pulls it down.
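In that spirit, a rough sketch of how the template's steps could pin the checkout location and reference the script by an absolute path (the path value is an assumption):
- checkout: templates
  path: templates-repo   # checked out under $(Pipeline.Workspace)/templates-repo
- task: PowerShell@2
  inputs:
    filePath: $(Pipeline.Workspace)/templates-repo/ssis/pipelines/scripts/build-ssis-project.ps1
    arguments: '-ProjectToBuild "tradedata-ldz-ssis/tradedata-ldz-ssis.dtproj"'
    pwsh: true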
You can run the script files from the source directory. Currently, you have not specified the root folder in
ssis/pipelines/scripts/build-ssis-project.ps1
Assuming you are building from a repo where the PowerShell script resides, try:
- task: PowerShell@1
  inputs:
    scriptName: '$(ScriptsDir)/ssis/pipelines/scripts/build-ssis-project.ps1'
Pass in the value of ScriptsDir, which could be the build sources directory or the build working directory.
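For example, a variable definition along these lines (which value is right depends on where the repository containing the script is checked out):
variables:
  ScriptsDir: $(Build.SourcesDirectory)   # or $(System.DefaultWorkingDirectory)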

Do we have a standard pipeline option in Azure DevOps to share a library across pipelines, like in Jenkins?

I would like to have one standard pipeline defined and use it as a shared library in all my jobs that have the common steps.
Yes, it's called Template:
Use templates to define your logic once and then reuse it several times. Templates combine the content of multiple YAML files into a single pipeline. You can pass parameters into a template from your parent pipeline.
For example, Job reuse:
First yaml:
# File: templates/jobs.yml
jobs:
- job: Build
  steps:
  - script: npm install
- job: Test
  steps:
  - script: npm test
Second yaml:
# File: azure-pipelines.yml
jobs:
- template: templates/jobs.yml # Template reference
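And because templates accept parameters, the shared definition can be made configurable; a small sketch with an illustrative parameter name:
# File: templates/jobs.yml
parameters:
- name: buildCommand     # hypothetical parameter
  type: string
  default: 'npm install'

jobs:
- job: Build
  steps:
  - script: ${{ parameters.buildCommand }}

# File: azure-pipelines.yml
jobs:
- template: templates/jobs.yml # Template reference
  parameters:
    buildCommand: 'npm ci'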
