Is it possible to copy build artifact from pipeline into repo - azure

I'm trying to copy my published zip build into my Azure repo for easy access. I have the following YAML; see the inline comment with my question:
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(build.artifactstagingdirectory)\MyFolder'
    includeRootFolder: true
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)\zipfolder\$(Build.BuildId).zip'
    replaceExistingArchive: true
- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)\zipfolder'
    TargetPath: '$(Build.SourcesDirectory)\MyFolder' # Here I would expect the code to copy the zip into my repo.

You could download the artifact and use git commands in a command line task to push it to the repo. Refer to the sample below; it works for me.
# 'Allow scripts to access the OAuth token' was selected in pipeline. Add the following YAML to any steps requiring access:
#   env:
#     MY_ACCESS_TOKEN: $(System.AccessToken)
# Variable Group 'vargroup1' was defined in the Variables tab
resources:
  repositories:
  - repository: self
    type: git
    ref: refs/heads/testb2
jobs:
- job: Job_1
  displayName: Agent job 1
  pool:
    vmImage: ubuntu-20.04
  steps:
  - checkout: self
    persistCredentials: True
  - task: ArchiveFiles@2
    displayName: Archive README.md
    inputs:
      rootFolderOrFile: README.md
      archiveFile: $(Build.ArtifactStagingDirectory)/zipfolder/$(Build.BuildId).zip
  - task: PublishBuildArtifacts@1
    displayName: 'Publish Artifact: drop'
    inputs:
      PathtoPublish: $(Build.ArtifactStagingDirectory)/zipfolder
  - task: DownloadBuildArtifacts@1
    displayName: Download Build Artifacts
    inputs:
      artifactName: drop
  - task: CmdLine@2
    displayName: Command Line Script
    inputs:
      script: |
        cd $(System.ArtifactsDirectory)/zipfolder
        git config --global user.email "xxxxx"
        git config --global user.name "xxxxx"
        git init
        git add .
        git commit -m "123"
        git remote add origin https://$(System.AccessToken)@dev.azure.com/orgname/testpro1/_git/testpro4
        git push https://$(System.AccessToken)@dev.azure.com/orgname/testpro1/_git/testpro4
...

The alternative is to push to the repo and then sync from the remote git - basically an automated way of pulling the file over.
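If you prefer to keep everything inside the build job instead of downloading the artifact first, a minimal sketch of committing the zip straight from the checked-out repo could look like the one below; the target folder, user details, and the [skip ci] commit message are placeholder assumptions, not taken from the answer above:
steps:
- checkout: self
  persistCredentials: true
- task: CopyFiles@2
  displayName: Copy zip into the checked-out repo
  inputs:
    SourceFolder: '$(Build.ArtifactStagingDirectory)/zipfolder'
    Contents: '*.zip'
    TargetFolder: '$(Build.SourcesDirectory)/MyFolder'
- script: |
    # Placeholder identity; [skip ci] avoids re-triggering the pipeline on this commit
    git config user.email "build@example.com"
    git config user.name "Build Agent"
    git add MyFolder
    git commit -m "Add build $(Build.BuildId) zip [skip ci]"
    git push origin HEAD:$(Build.SourceBranchName)
  displayName: Commit zip back to the repo
  workingDirectory: $(Build.SourcesDirectory)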

Related

Azure Pipeline- Copy files from one Repo to another Repo using YAML

There is a folder in one of the repositories (Source Repo) that I'd like to copy to another repository (Destination Repo) using an Azure Pipeline (as they need to be kept in sync).
So far I can copy a folder within the same repository using:
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.Repository.LocalPath)\MyFolder\'
    Contents: |
      **
      !**\obj\**
      !**\bin\**
    TargetFolder: '$(Build.Repository.LocalPath)\DestFolder'
    flattenFolders: false
    CleanTargetFolder: true
    OverWrite: true
    preserveTimestamp: true
This is how I connect to another repository:
resources:
  repositories:
  - repository: SourceRepo
    type: git
    name: MyCollection/SourceRepo
But I don't know how to get files from the source repo and place them in the Destination Repo.
After a lot of searching, this is the answer:
resources:
  repositories:
  - repository: SourceRepo
    type: git
    name: MyCollection/SourceRepo
steps:
- checkout: SourceRepo
  clean: true
- checkout: self
  persistCredentials: true
  clean: true
- task: DotNetCoreCLI@2
  displayName: "restore DestRepo"
  inputs:
    command: 'restore'
    projects: '$(Build.Repository.LocalPath)/DestRepo/**/*.csproj'
    feedsToUse: 'select'
- task: DotNetCoreCLI@2
  displayName: "build DestRepo"
  inputs:
    command: 'build'
    projects: '$(Build.Repository.LocalPath)/DestRepo/DestRepo/**/*.csproj'
    configuration: Release
# configurations for using git command
- task: CmdLine@2
  inputs:
    script: |
      cd $(Agent.HomeDirectory)\externals\git\cmd
      git config --global user.email ""
      git config --global user.name "$(Build.RequestedFor)"
- task: CmdLine@2
  displayName: checkout
  inputs:
    script: |
      git -C DestRepo checkout $(Build.SourceBranchName)
- task: CmdLine@2
  displayName: pull
  inputs:
    script: |
      git -C DestRepo pull
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.Repository.LocalPath)\SourceRepo\SourceFolder'
    Contents: |
      **
      !**\obj\**
      !**\bin\**
    TargetFolder: '$(Build.Repository.LocalPath)\DestRepo\DestFolder'
    flattenFolders: false
    CleanTargetFolder: true
    OverWrite: true
    # preserveTimestamp: true
- task: CmdLine@2
  displayName: add
  inputs:
    script: |
      git -C DestRepo add --all
- task: CmdLine@2
  displayName: commit
  continueOnError: true
  inputs:
    script: |
      git -C DestRepo commit -m "Azure Pipeline Repository Integration"
- task: CmdLine@2
  displayName: push
  inputs:
    script: |
      git -C DestRepo push -u origin $(Build.SourceBranchName)
I was trying to find a solution to this problem, but instead of using a copy-files task I found a better way: you can use any number of repositories as resources in the build pipeline, and you don't need to check them all out.
This is how my build pipeline looks.
As you can see, I have used two variables:
$(System.AccessToken) - this variable is available in Azure DevOps and works like a PAT (Personal Access Token).
$(Build.Repository.Uri) - the URL of the repository (this could be the URL of any repo in resources).
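The screenshot of that pipeline is not reproduced here, so the following is only a rough sketch of the idea on a Linux agent, with placeholder organization, project, and repository names rather than the answerer's actual pipeline:
steps:
- checkout: self
- script: |
    # Clone the other repo directly with the job access token (org/project/repo names are placeholders)
    git clone https://$(System.AccessToken)@dev.azure.com/MyOrg/MyProject/_git/SourceRepo source-repo
    # Copy the folder that needs to stay in sync into the checked-out destination repo
    cp -r source-repo/SourceFolder '$(Build.SourcesDirectory)/DestFolder'
  displayName: Clone source repo and copy folder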

Excluding Certain Files in Azure CI Pipeline (YAML)

I have a CI pipeline (YAML) that builds a repo that is deployed into an existing Azure Function. The CI pipeline is doing its job. However, after it is done, I go to Function App -> App files and I can see that azure-pipelines.yml is included in there (or I think it was included in the build process). I have tried using paths and exclude but they don't work. My question is: how do I exclude only that azure-pipelines.yml so that after the pipeline has finished building, azure-pipelines.yml is not in App files in the Function App? Below is my YAML:
trigger:
  branches:
    include:
    - master
  paths:
    exclude:
    - README.md
    - azure-pipelines.yml
variables:
  # Azure Resource Manager connection created during pipeline creation
  azureSubscription: 'DevOps-Test'
  # Function app name
  functionAppName: 'test'
  # Agent VM image name
  vmImageName: 'vs2017-win2016'
  # Working Directory
  workingDirectory: '$(System.DefaultWorkingDirectory)/'
stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - powershell: |
        if (Test-Path "extensions.csproj") {
          dotnet build extensions.csproj --output ./$(workingDirectory)/bin
        }
      displayName: 'Build extensions'
    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: $(workingDirectory)
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        replaceExistingArchive: true
    - publish: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      artifact: drop
    - task: DownloadBuildArtifacts@0
      inputs:
        buildType: 'current'
        downloadType: 'single'
        artifactName: 'drop'
        downloadPath: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    - task: AzureFunctionApp@1
      displayName: 'Azure functions app deploy'
      inputs:
        azureSubscription: '$(azureSubscription)'
        appType: functionApp
        appName: $(functionAppName)
        package: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
The following syntax means that changes to README.md or azure-pipelines.yml won't trigger the build. It doesn't mean that README.md or azure-pipelines.yml are excluded from the working directory.
trigger:
  branches:
    include:
    - master
  paths:
    exclude:
    - README.md
    - azure-pipelines.yml
I've noticed you tried archiving the folder $(workingDirectory), and workingDirectory is defined as a variable that is actually $(System.DefaultWorkingDirectory)/. System.DefaultWorkingDirectory is the local path on the agent where your source code files are downloaded.
Obviously, README.md and azure-pipelines.yml are in the source code, so they are archived too. You could add a CopyFiles task before the ArchiveFiles task to copy the files you need from a source folder to a target folder using match patterns, then archive the target folder. For example:
- task: CopyFiles@2
  displayName: 'Copy Files to: $(Build.ArtifactStagingDirectory)'
  inputs:
    SourceFolder: '$(workingDirectory)'
    Contents: |
      **/*
      !*.md
      !*.yml
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
- task: ArchiveFiles@2
  displayName: 'Archive files'
  inputs:
    rootFolderOrFile: '$(Build.ArtifactStagingDirectory)'
Take a look here:
- powershell: |
    if (Test-Path "extensions.csproj") {
      dotnet build extensions.csproj --output ./$(workingDirectory)/bin
    }
  displayName: 'Build extensions'
- task: ArchiveFiles@2
  displayName: 'Archive files'
  inputs:
    rootFolderOrFile: $(workingDirectory)
    includeRootFolder: false
    archiveType: zip
    archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
    replaceExistingArchive: true
You produce your output here, ./$(workingDirectory)/bin, but you zipped rootFolderOrFile: $(workingDirectory). Please change it to rootFolderOrFile: $(workingDirectory)/bin.
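For clarity, the archive step with just that change applied would look like this (everything else unchanged from the question's YAML):
- task: ArchiveFiles@2
  displayName: 'Archive files'
  inputs:
    # Archive only the published output, assuming the build wrote to $(workingDirectory)/bin as above
    rootFolderOrFile: $(workingDirectory)/bin
    includeRootFolder: false
    archiveType: zip
    archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
    replaceExistingArchive: true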
EDIT
Please add this before calling archive:
- script: |
    rm README.md
    rm azure-pipelines.yml
  workingDirectory: $(workingDirectory)
So you will remove them from the folder which you later archive.

WARN unable to find package.json for plottable

When running a build for Storybook via an Azure DevOps pipeline I get the above-mentioned error.
I've tried to completely remove this package (plottable) from my project, but I keep getting this error and it causes my webpack build to get stuck.
This error doesn't occur locally.
My pipeline:
trigger:
  batch: true
  branches:
    include:
    - master
stages:
- stage: develop_build_deploy_stage
  pool:
    name: Default
    demands:
    - msbuild
    - visualstudio
  jobs:
  - job: develop_build_deploy_job
    steps:
    - checkout: self
      clean: true
      persistCredentials: true
    - task: NodeTool@0
      displayName: Install Node
      inputs:
        versionSpec: '12.x'
    - task: PowerShell@2
      displayName: 'Install Dependencies'
      inputs:
        targetType: 'inline'
        script: |
          npm install
    - task: PowerShell@2
      displayName: 'Increment version'
      inputs:
        targetType: 'inline'
        script: |
          git checkout master
          git pull origin master
          git config --global user.email "d@gmail.com"
          git config --global user.name "Build Agent"
          npm version patch -m "Increment Version [skip ci]" --force
          git push
    - task: PowerShell@2
      displayName: 'Build Project'
      inputs:
        targetType: 'inline'
        script: |
          npm run build-storybook
          npm run build
    - task: CopyFiles@2
      displayName: 'Copy storybook-static Files'
      inputs:
        sourceFolder: '$(Build.SourcesDirectory)/storybook-static'
        contents: '**'
        targetFolder: '$(Build.ArtifactStagingDirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish storybook-static Files to ArtifactStagingDirectory'
      inputs:
        pathToPublish: '$(Build.ArtifactStagingDirectory)'
        artifactName: Storybook
    - task: S3Upload@1
      displayName: 'Upload storybook-static to S3'
      inputs:
        awsCredentials: 'my-s3'
        regionName: 'us-east-1'
        bucketName: 'my-s3-bucket'
        sourceFolder: $(Build.ArtifactStagingDirectory)
    - task: Npm@1
      displayName: 'Publish to Feed'
      inputs:
        command: 'publish'
        publishRegistry: 'useFeed'
        publishFeed: '#####'
How would I go about resolving this problem?
I still don't know what caused this, but for some reason the changes I had made on my branch weren't being picked up, and an import I had removed was still present on the branch that was being built.
So in the end it was trying to import a package which I had removed as a dependency.
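If you hit something similar, one way to confirm what the agent is actually building (a debugging sketch of my own, not part of the original answer) is to print the checked-out commit and check whether the removed package is still referenced before installing dependencies:
- task: PowerShell@2
  displayName: 'Debug: verify checked-out revision'
  inputs:
    targetType: 'inline'
    script: |
      # Show which commit the agent actually checked out
      git rev-parse HEAD
      # Check whether the removed package is still referenced in package.json
      Select-String -Path package.json -Pattern "plottable" -SimpleMatch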

azure devops build and deploy to app service

I have created an empty project on dev.azure.com.
I have cloned the repository on my local computer.
I have run this command in the main folder:
$ dotnet new mvc
I have created this azure-pipelines.yml file:
trigger:
- master
pool:
  vmImage: 'windows-latest'
steps:
- task: DotNetCoreCLI@2
  inputs:
    command: 'restore'
    feedsToUse: 'select'
- task: DotNetCoreCLI@2
  inputs:
    command: 'build'
- task: DotNetCoreCLI@2
  inputs:
    command: 'publish'
    publishWebProjects: true
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'artifact2'
I have added, committed and pushed the files to dev.azure.com (on the master branch).
I get this warning message:
##[warning]Directory 'd:\a\1\a' is empty. Nothing will be added to build artifact 'artifact2'.
I have created a release pipeline but I get an error:
##[error]Error: No package found with specified pattern: D:\a\r1\a\**\*.zip
Check if the package mentioned in the task is published as an artifact in the build or a previous stage and downloaded in the current job.
I do not understand what is wrong in my azure-pipelines.yml for artifact production...
Thanks
It seems that there is an issue with where the published DLLs are placed. Try the YAML below, in which I have explicitly set the output directory for the published DLLs and zipped the files after publish (that would probably be your next issue). I have also explicitly set in which folder to look for the published output in order to publish the artifact.
trigger:
- master
pool:
  vmImage: 'windows-latest'
steps:
- task: DotNetCoreCLI@2
  inputs:
    command: 'restore'
    feedsToUse: 'select'
- task: DotNetCoreCLI@2
  inputs:
    command: 'build'
- task: DotNetCoreCLI@2
  inputs:
    command: 'publish'
    publishWebProjects: true
    modifyOutputPath: true
    arguments: '--configuration $(BuildConfiguration) --output "$(build.artifactstagingdirectory)"'
    zipAfterPublish: true
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'artifact2'

Does Azure DevOps Pipelines cache some data across runs

I'm new to Azure DevOps and pipelines, and I ran into an issue when running the same pipeline multiple times in a short period.
In brief, I created a pipeline that simply builds a .NET project with MSBuild and generates an artifact. The pipeline triggers on changes to the master branch.
The first time, it worked: I could download the artifact and execute the program without any issue. But if I make a change to the master branch 5 minutes later, adding an option to my program, the pipeline runs successfully; however, when I run the program stored in the generated artifact, my new option is not there.
I'm probably doing something stupid here, but I don't understand why I get this behaviour.
Is there any kind of caching, and how can I get a fresh build every time?
== EDIT ==
Here is my YAML definition, as requested.
Basically, the steps are:
Checkout the solution with all submodules
NuGet restore packages for all required projects
MSBuild task
Archive the output
Publish the artifact.
trigger:
- master
pool:
  demands: azureps
  vmImage: 'windows-latest'
steps:
- checkout: "git://GSS-CMDB-Tools/GSSAM_Code"
  submodules: true
  persistCredentials: true
- task: NuGetCommand@2
  inputs:
    command: 'custom'
    arguments: 'restore ADDMSync/packages.config -SolutionDirectory .'
- task: NuGetCommand@2
  inputs:
    command: 'custom'
    arguments: 'restore GSSAM/packages.config -SolutionDirectory .'
- task: NuGetCommand@2
  inputs:
    command: 'custom'
    arguments: 'restore GSSAM.ADDMRest/packages.config -SolutionDirectory .'
- task: NuGetCommand@2
  inputs:
    command: 'custom'
    arguments: 'restore GSSAM.SNOWRest/packages.config -SolutionDirectory .'
- task: MSBuild@1
  inputs:
    solution: 'ADDMSync/ADDMSync.csproj'
    msbuildArchitecture: 'x64'
    configuration: 'Release'
    msbuildArguments: '/p:PostBuildEvent='
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Write your PowerShell commands here.
      mv ADDMSync/bin/Release ADDMSync/bin/ADDMSync
      rm ADDMSync/bin/ADDMSync/*.pdb
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: 'ADDMSync/bin/ADDMSync'
    includeRootFolder: true
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/ADDMSync.zip'
    replaceExistingArchive: true
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)/ADDMSync.zip'
    ArtifactName: 'ADDMSync'
    publishLocation: 'Container'
Thanks a lot
Rémi
OK, I think I understand what happened.
What I did was commit and push all the submodules required by the build, but I did not commit the corresponding modification of the solution itself (the updated submodule references). Committing that as well makes it work.
I don't understand why yet; I guess it's linked to the way the checkout task works.
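One way to spot this situation (my own suggestion, not from the original answer) is to print the submodule revisions the parent repo actually references right after checkout, so a stale or mismatched pointer shows up in the build log:
- task: PowerShell@2
  displayName: 'Debug: show submodule revisions'
  inputs:
    targetType: 'inline'
    script: |
      # Lists the commit each submodule is pinned to in the parent repo;
      # a '+' prefix means the checked-out submodule differs from the recorded commit
      git submodule status --recursive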
