Why does the Azure DevOps ExtractFiles task sometimes fail?

I have a YAML release pipeline that uses the ExtractFiles@1 task, set up like this:
- task: ExtractFiles@1
  inputs:
    archiveFilePatterns: '$(PIPELINE.WORKSPACE)/MyBuild/MyPackage/$(resources.pipeline.MyBuild.runID)-MyArtifact.zip'
    destinationFolder: $(MyDestinationPath)
    cleanDestinationFolder: true
    overwriteExistingFiles: true
A Windows service uses that folder while processing some stuff, so I have a PowerShell task prior to the one above that stops the service. The ExtractFiles task is inside a deployment step that depends on the PowerShell task (also a deployment step) completing successfully. So, if I've understood correctly, the service should have stopped (and released all holds on the folder) before the cleanDestinationFolder part of the ExtractFiles task kicks in.
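For illustration only, a minimal sketch of that kind of stop-service step, assuming a hypothetical service name 'MyWindowsService'; the explicit wait for the Stopped state is my own addition to make the handle release visible:

- powershell: |
    # Stop the service and wait until it reports Stopped, so that any
    # handles it holds on the destination folder are released before extraction.
    Stop-Service -Name 'MyWindowsService'
    (Get-Service -Name 'MyWindowsService').WaitForStatus('Stopped', '00:01:00')
  displayName: 'Stop service before ExtractFiles'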
However, I sometimes see the following error:
Cleaning destination folder before extraction: my folder path Error:
Failed rmRF: EPERM: operation not permitted, unlink 'my folder path'
I can only assume that cleanDestinationFolder is running up against something that still has a hold on that folder and therefore fails. Could the overwriteExistingFiles flag (probably not necessary, really) be conflicting with this? Is there a way around it? I'm also making a few assumptions (possibly incorrect ones) in this pipeline and in the build pipeline that creates the artifact, namely:
The build pipeline uses the ArchiveFiles@2 task three times to zip three separate folders and then uses PublishBuildArtifacts@1. This assumes that PublishBuildArtifacts@1 doesn't automatically zip the artifact (see the sketch after this list).
The ExtractFiles@1 task has to be used (three times) to unzip the folders in the artifact. I'm assuming here that the download step won't automatically unzip them for me.
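For context, a rough sketch of what one archive-and-publish pair in that build pipeline might look like (folder and artifact names are placeholders, not taken from the question):

- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.SourcesDirectory)/FolderOne'   # one of the three folders
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId)-FolderOne.zip'
    replaceExistingArchive: true
# ...repeated for the other two folders...
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'MyArtifact'

As far as I know, PublishBuildArtifacts@1 uploads those zips as-is rather than re-zipping them, and the download step restores them as-is, so both assumptions above appear to hold and the three ExtractFiles@1 calls are still needed.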
It feels like this should be a fairly common error. When googling, I came across a very similar issue people seemed to be having with the CopyFiles task, but there didn't seem to be a workaround. I could potentially use the CopyFiles task if the zipping and unzipping I'm doing is unnecessary.
Update
It looks like I might have a workaround, albeit not a particularly nice one. There appears to be the ability to retry a task on failure, as stated here. This may well work, as often just re-running the build works, though obviously it would be nicer if this wasn't needed.
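For reference, the retry mechanism documented for YAML tasks is the retryCountOnTaskFailure property (it needs a reasonably recent agent); a minimal sketch applied to the task above:

- task: ExtractFiles@1
  retryCountOnTaskFailure: 3   # re-run the task up to 3 times if it fails
  inputs:
    archiveFilePatterns: '$(PIPELINE.WORKSPACE)/MyBuild/MyPackage/$(resources.pipeline.MyBuild.runID)-MyArtifact.zip'
    destinationFolder: $(MyDestinationPath)
    cleanDestinationFolder: true
    overwriteExistingFiles: true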

Related

How can I conditionally run YAML tasks based on path filters?

I have a number of solutions, each of which have a mixture of applications and libraries. Generally speaking, the applications get built and deployed, and the libraries get published as NuGet packages to our internal packages feed. I'll call these "apps" and "nugets."
In my Classic Pipelines, I would have one build for the apps, and one for the nugets. With path filters, I would indicate folders that contain the nuget material, and only trigger the nuget build if those folders had changes. Likewise, the app build would have path filters to detect if any app code had changed. As a result, depending on what was changed in a branch, the app build might run, the nuget build might run, or both might run.
Now I'm trying to convert these to YAML. It seems we can only have one pipeline set up for CI, so I've combined the stages/jobs/steps for nugets and apps into this single pipeline. However, I can't seem to figure out a good way to only trigger the nuget tasks if the nuget path filters are satisfied and only the app tasks if the app path filters are satisfied.
I am hoping someone knows a way to do something similar to one of the following (or anything else that would solve the issue):
Have two different CI pipelines with their own sets of triggers and branch/path filters such that one or both might run on a given branch change
Set some variables based on which paths have changes so that I could later trigger the appropriate tasks using conditions
Make a pipeline always trigger, but only do tasks if a path filter is satisfied (so that the nuget build could always run, but not necessarily do anything, and then the app build could be triggered by the nuget build completing, and itself only do stuff if path filters are satisfied).
It seems we can only have one pipeline set up for CI
My issue was that this was an erroneous conclusion. It appeared to me that, out of the box, a Pipeline is created for a repo with a YAML file in it, and that you can change which file the Pipeline uses, but you can't add a list of files for it to use. I did not realize I could create an additional Pipeline in the UI, and then associate it to a different YAML file in the repo.
Basically, my inexperience with this topic is showing. To future people who might find this, note that you can create as many Pipelines in the UI as you want and associate each one with a different YAML file.
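To make that concrete, each YAML file can carry its own trigger with branch and path filters; a small sketch for a hypothetical nuget pipeline (branch and folder names are placeholders):

# nuget-pipeline.yml — only runs when files under the library folder change
trigger:
  branches:
    include:
    - main
  paths:
    include:
    - src/Libraries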

What are the permissions issues with regard to stages of a multi-stage Azure DevOps pipeline?

I needed to do some processing (using Python) of configuration files in the source code that were apparently unavailable to the Release stage of my YAML pipeline, so I did the processing in the Build stage. But the Build stage doesn't appear to have access to the linked server (in a Deployment Group), because when I do this step:
- task: ExtractFiles@1
  inputs:
    # archiveFilePatterns: '**\*.zip'
    # not sure why I now have to go up one directory to get to drop\
    archiveFilePatterns: '$(Build.artifactStagingDirectory)\**\*.zip'
    destinationFolder: 'C:\DevOps\$(Build.BuildNumber)'
    overwriteExistingFiles: true
It executes without an error, but no files end up in C:\DevOps\$(Build.BuildNumber). How might I get access to the linked server from the Build stage, or is this impossible?
Short of that, any method of being able to copy files produced by my Python script from the Build stage to the Release stage would be helpful.
I'm not clear on what directories are reachable from the various stages. Is $(Build.artifactStagingDirectory) not reachable from the Release stage? Is a directory on the linked server (in this case C:\DevOps\$(Build.BuildNumber)) not accessible from the Build stage?
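For the "copy files from the Build stage to the Release stage" part, one commonly used pattern is to publish the processed files as a pipeline artifact in the Build stage and let the deployment job download them; a rough sketch, with stage, environment, and artifact names invented for illustration:

stages:
- stage: Build
  jobs:
  - job: BuildJob
    steps:
    # ...Python processing, output copied to the artifact staging directory...
    - publish: '$(Build.ArtifactStagingDirectory)'
      artifact: 'processed-config'

- stage: Release
  dependsOn: Build
  jobs:
  - deployment: Deploy
    environment: 'MyEnvironment.MyServer'   # placeholder environment/VM resource names
    strategy:
      runOnce:
        deploy:
          steps:
          # deployment jobs download pipeline artifacts automatically
          # to $(Pipeline.Workspace)/<artifact name>
          - task: ExtractFiles@1
            inputs:
              archiveFilePatterns: '$(Pipeline.Workspace)/processed-config/**/*.zip'
              destinationFolder: 'C:\DevOps\$(Build.BuildNumber)'
              overwriteExistingFiles: true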

Azure release pipeline cannot find artifact in path

I have a build and release pipeline to build some jar files and deploy them to my Azure cloud subscription/resource group. I have been working on this for a while; it keeps failing and it's exasperating.
This is my build pipeline, specifically the publish artifact task (I understand I must use PublishPipelineArtifact and not PublishBuildArtifacts, as it is the currently preferred option, although the build variant should work too as long as I mark my jars):
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: $(Pipeline.Workspace)/
    artifactName: 'drop'
    artifactType: 'pipeline'
I see in my PublishArtifact logs that my files are being read from: /home/vsts/work/1/s
Unfortunately, no information appears about where they are being copied to.
If I click on the artifacts created I can see they are moved under
https://dev.azure.com/MyOrg/MyProject/_build/results?buildId=XXXXX&view=artifacts&pathAsName=false&type=publishedArtifacts
More concretely, under drop/s, in what I understand is the staging directory.
So far so good. I have my project built on drop and I can see my jars.
Then when going to the release pipeline to deploy to the cloud
The tasks are run and this is my copy task to copy from build to release
But at the end it fails because, after apparently downloading the artifact without problems, it says it cannot find it, even though in the release pipeline UI I can manually select the jar file (as seen in picture 2).
It seems to be a problem copying to the release, or, if it is being copied properly, a problem reading it.
It seems to be a mismatch between path system variables, or even the slashes, as I'm copying a jar from a Linux build pipeline to what seems to be a Windows NTFS path. This is awful. I have been stuck for a while and I have tried everything that came to mind. Can someone give me an idea why the artifact seems to download fine but then it says it cannot find the jar file? Feel free to correct any other misconception you think exists in my post. Thanks a lot!
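One thing that might help narrow this down: publishing $(Pipeline.Workspace)/ uploads the entire workspace, which is why the jars end up nested under drop/s. Publishing only the build output folder gives a flatter artifact; a hedged sketch, where the jar output folder is a placeholder:

- task: PublishPipelineArtifact@1
  inputs:
    # publish only the folder that actually contains the built jars
    # ('target' here stands in for your Maven/Gradle output directory)
    targetPath: '$(Build.SourcesDirectory)/target'
    artifactName: 'drop'

In a classic release, the downloaded artifact then typically sits under $(System.DefaultWorkingDirectory)/<artifact alias>, and the agent handles the forward/backward slash differences.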

Self hosted azure agent - how to configure pipelines to share the same build folder

We have a self-hosted build agent on an on-prem server.
We typically have a large codebase, and in the past followed this mechanism with TFS2013 build agents:
Daily check-ins were built to c:\work\tfs\ (taking about 5 minutes)
Each night a batch file would run that did the same build to those folders, using the same sources (they were already 'latest' from the CI build), built the installers, copied files to a network location, and sent an email to the team detailing the build successes/failures (taking about 40 minutes).
The key thing there is that for the nightly build there would be no need to get the latest sources, and the disk space required wouldn't grow much. Just by the installer sizes.
To replicate this with Azure Devops, I created two pipelines.
One pipeline that did the CI using MSBuild tasks in the classic editor- works great
Another pipeline in the classic editor that runs our existing powershell script, scheduled at 9pm - works great
However, even though my agent doesn't support parallel builds, what's happening is that:
The CI pipeline's folder is c:\work\1\
The Nightly build folder is c:\work\2\
This doubles the amount of disk space we need (10 GB to 20 GB)
They are the same code files, just built differently.
I have struggled to find a way to tell the agent "please use the same sources folder for all pipelines".
What setting is this? Otherwise we have to pay our service provider for extra GB of storage.
Or do I need to change my classic pipelines into YAML and somehow conditionally branch the build so it knows it's being scheduled and does something different?
Or maybe, stop using a Pipeline for the scheduled build, and use task scheduler in Windows as before?
(I did try looking for the same question - I'm sure I can't be the only one).
There is "workingDirectory" directive available for running scripts in pipeline. This link has details of this - https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/command-line?view=azure-devops&tabs=yaml
The number ('1', '2', ... '6') in the work folders c:\work\1\, c:\work\2\, ... c:\work\6\ on your build agent identifies a particular pipeline.
Agent.BuildDirectory
The local path on the agent where all folders for a given build
pipeline are created. This variable has the same value as
Pipeline.Workspace. For example: /home/vsts/work/1
If you have two pipelines, there will also be two corresponding work folders. This is expected behavior; we cannot configure pipelines to share the same build folder. It is by design.
If you need to use less disk space to save cost, I'm afraid that not using a pipeline for the scheduled build and using Windows Task Scheduler as before is the better way.

Upload Azure Pipelines File to Onedrive or similar

In my pipeline on my Azure Devops project, I'm building my project and then zipping up the output into a file. However, at the moment, I have no way to access this archive.
I would like to upload it to a cloud storage service such as OneDrive, Google Drive, Dropbox or similar. But I haven't been able to find a simple way of doing this yet. Can anyone help me?
Several approaches can be used. If you only need "access" to the zip after a build has completed, the Publish Build Artifacts task is actually what you want.
This will publish specified files as artifacts within the completed pipeline job.
From within the detail view of the Build-Job you can then download and use whatever the pipeline produced.
It can be configured as a build step, like this:
- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: '$(Build.ArtifactStagingDirectory)'
    artifactName: 'myCoolZip-$(Date:yyyyMMdd)'
Where pathToPublish is the path to the artifact you want to publish.
Usually the compiled resources are copied to the ArtifactStagingDirectory and then the contents of that directory are published. This makes it easier if there are many steps that all contribute to the final artifact.
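For example, a small CopyFiles step ahead of the publish could look like this (the source folder and pattern are placeholders):

- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/output'   # wherever your build writes its results
    Contents: '**/*.zip'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'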
There are several community supported tasks that allow files to be uploaded to cloud-services:
rclone (supports a lot of providers, including OneDrive)
DropBox
AWS S3
SharePoint
Cloudvault
However, since Azure DevOps tends to update regularly, these might not be so future-proof, depending on the maintainer.
If you still want to upload your zip to the cloud where it is more accessible, the easiest way would be to use the FtpUpload task.
- task: FtpUpload@2
  inputs:
    credentialsOption: inputs
    serverUrl: ftp://someFtpServer.com
    username: yourUsername
    password: yourPassword
    rootDirectory: '$(Build.ArtifactStagingDirectory)'
    remoteDirectory: '/upload/$(Build.BuildId)/'
    # Depending on your certificate you might need to set this to true
    # trustSSL: true
However, don't place passwords and usernames directly in your YAML; use secret variables instead.
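For instance, the credentials can be defined as secret variables (in the pipeline's Variables UI or a linked variable group) and referenced by name; the variable names below are just examples:

- task: FtpUpload@2
  inputs:
    credentialsOption: inputs
    serverUrl: ftp://someFtpServer.com
    username: $(ftpUsername)    # secret variables defined in the pipeline UI
    password: $(ftpPassword)    # or in a linked variable group
    rootDirectory: '$(Build.ArtifactStagingDirectory)'
    remoteDirectory: '/upload/$(Build.BuildId)/'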
It is also possible to upload artifacts to Azure Storage by using the AzureFileCopy task.
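A rough sketch of that, assuming an ARM service connection and a storage account that you would replace with your own (all names below are placeholders):

- task: AzureFileCopy@4
  inputs:
    SourcePath: '$(Build.ArtifactStagingDirectory)/*.zip'
    azureSubscription: 'MyAzureServiceConnection'   # placeholder service connection name
    Destination: 'AzureBlob'
    storage: 'mystorageaccount'                     # placeholder storage account
    ContainerName: 'builds'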
If you want more control, add a script to your repository that does the uploading (example script) and execute it as a Command Line task.
Use the Publish Artifact task to publish your build output. This will attach it to the build results. You will be able to download or consume the file in a downstream process such as a release definition.
Documentation is available on learn.microsoft.com.
