I have a build and release pipeline to build some jar files and deploy them to my Azure subscription/resource group. I have been working on this for a while; it keeps failing and is getting frustrating.
This is my build pipeline, specifically the publish artifact task (I understand I should use PublishPipelineArtifact rather than PublishBuildArtifacts, as it is the currently preferred option, although the build variant should work too as long as I include my jars):
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: $(Pipeline.Workspace)/
    artifactName: 'drop'
    artifactType: 'pipeline'
I see in my PublishArtifact logs that my files are being read from: /home/vsts/work/1/s
Unfortunately, no information appears about where they are being copied to.
If I click on the artifacts created I can see they are moved under
https://dev.azure.com/MyOrg/MyProject/_build/results?buildId=XXXXX&view=artifacts&pathAsName=false&type=publishedArtifacts
More concretely, under drop/s, in what I understand is the staging directory.
So far so good. I have my project built on drop and I can see my jars.
Then when going to the release pipeline to deploy to the cloud
The tasks are run and this is my copy task to copy from build to release
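The actual download and deploy tasks are configured in the classic UI, but roughly, the YAML equivalent of what I am trying to do would be something like this (the task names, service connection, and app name here are just illustrative, not my real configuration):

- task: DownloadPipelineArtifact@2
  inputs:
    artifact: 'drop'
    path: '$(Pipeline.Workspace)/drop'
    # in a classic release the downloaded files land under
    # $(System.DefaultWorkingDirectory)/<source alias>/drop instead

- task: AzureWebApp@1                          # placeholder for my actual deploy task
  inputs:
    azureSubscription: 'MyServiceConnection'   # illustrative service connection name
    appName: 'my-app'                          # illustrative app name
    package: '$(Pipeline.Workspace)/drop/**/*.jar'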
But at the end it fails: after apparently downloading the artifact without problems, it says it cannot find it, even though in the task editor of the release pipeline UI I can manually select the jar file, as seen in picture 2.
It seems to be a problem copying to the release side, or that, even though it is copied properly, it cannot be read.
It seems to be a mismatch between path system variables, or even the slashes, since I'm copying a jar from a Linux build pipeline to what seems to be a Windows NTFS path. This is awful. I have been stuck for a while and have tried everything that came to mind. Can someone give me an idea why the artifact seems to download fine but then the task says it cannot find the jar file? Feel free to correct any other misconceptions in my post. Thanks a lot!
Related
I have a YAML release pipeline that uses the ExtractFiles@1 task, set up like this:
- task: ExtractFiles@1
  inputs:
    archiveFilePatterns: '$(PIPELINE.WORKSPACE)/MyBuild/MyPackage/$(resources.pipeline.MyBuild.runID)-MyArtifact.zip'
    destinationFolder: $(MyDestinationPath)
    cleanDestinationFolder: true
    overwriteExistingFiles: true
A Windows service uses that folder while processing some stuff, so I have a PowerShell task prior to the one above that stops the service. The ExtractFiles task is inside a deployment step that's dependent on the PowerShell task (also a deployment step) completing successfully. So, if I've understood correctly, the service should have stopped (and released all holds on the folder) before the 'cleanDestinationFolder' of the ExtractFiles task kicks in.
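For context, the stop step is roughly the following ('MyService' is a placeholder for the real service name, and the explicit wait is just how I'd sketch it):

- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Stop the service and wait until Windows reports it as stopped,
      # so it should have released its hold on the folder before extraction.
      Stop-Service -Name 'MyService' -Force
      (Get-Service -Name 'MyService').WaitForStatus('Stopped', '00:01:00')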
However, I sometimes see the following error:
Cleaning destination folder before extraction: my folder path Error:
Failed rmRF: EPERM: operation not permitted, unlink 'my folder path'
I can only assume that the cleanDestinationFolder is running up against something that still has a hold on that folder and therefore fails. Could it be the 'overwriteExistingFiles' flag conflicting with this (probably not necessary really)? Is there a way around this? I'm making a few assumptions (possibly incorrectly) as part of this pipeline and the build pipeline that creates the artifact, which are:
The build pipeline uses the ArchiveFiles@2 task 3 times to zip 3 separate folders and then uses PublishBuildArtifacts@1 (a sketch of this is below). This assumes that PublishBuildArtifacts@1 doesn't automatically zip the artifact.
The ExtractFiles@1 task has to be used (3 times) to unzip the folders in the artifact. I'm assuming here that the download step won't automatically unzip them for me.
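A rough sketch of that build-side pattern (the folder and artifact names are made up for illustration):

- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(Build.SourcesDirectory)/FolderA'   # repeated for FolderB and FolderC
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/FolderA.zip'

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'MyArtifact'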
It feels like this should be a fairly common error, and when googling I came across a very similar issue people seemed to be having with the CopyFiles task, but there didn't seem to be a workaround. I could potentially use the CopyFiles task if the zipping and unzipping that I'm doing is unnecessary.
Update
It looks like I might have a workaround, albeit not a particularly nice one. There appears to be the ability to retry a task on failure, as stated here. This may well work, as often just re-running the build works, though obviously it would be nicer if this wasn't needed.
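For reference, the retry appears to be a task-level property, so applied to the extract step above it would look something like this (I believe the property is called retryCountOnTaskFailure and needs a fairly recent agent version):

- task: ExtractFiles@1
  retryCountOnTaskFailure: 3   # retry the extraction if the folder is still locked
  inputs:
    archiveFilePatterns: '$(PIPELINE.WORKSPACE)/MyBuild/MyPackage/$(resources.pipeline.MyBuild.runID)-MyArtifact.zip'
    destinationFolder: $(MyDestinationPath)
    cleanDestinationFolder: true
    overwriteExistingFiles: true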
I needed to do some processing (using Python) of configuration files in the source code that were apparently unavailable to the Release stage of my YAML pipeline, so I did the processing in the Build stage. But the Build stage doesn't appear to have access to the linked server (in a Deployment Group), because when I do this step:
- task: ExtractFiles@1
  inputs:
    #archiveFilePatterns: '**\*.zip'
    #not sure why I now have to go up one directory to get to drop\
    archiveFilePatterns: '$(Build.artifactStagingDirectory)\**\*.zip'
    destinationFolder: 'C:\DevOps\$(Build.BuildNumber)'
    overwriteExistingFiles: true
It executes without an error, but no files end up in C:\DevOps\$(Build.BuildNumber). How might I have access to the linked server from the Build stage, or is this impossible?
Short of that, any method of being able to copy files produced by my Python script from the Build stage to the Release stage would be helpful.
I'm not clear on what directories are reachable from the various stages. Is $(Build.artifactStagingDirectory) not reachable from the Release stage? Is a directory on the linked server (in this case C:\DevOps\$(Build.BuildNumber)) not accessible from the Build stage?
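For what it's worth, the only mechanism I know of for handing files from the Build stage to the Release stage is to publish them as a pipeline artifact in the Build stage and download them again in the Release stage; a minimal sketch (the folder and artifact names are just placeholders):

# Build stage: publish whatever the Python script produced.
- publish: '$(Build.ArtifactStagingDirectory)/config'
  artifact: 'processed-config'

# Release stage (deployment job): download it again; the files land
# under $(Pipeline.Workspace)/processed-config on the target agent.
- download: current
  artifact: 'processed-config'

(As I understand it, a deployment job also downloads the current run's artifacts automatically unless 'download: none' is set.)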
I imported the SmartHotel360 project from azuredevopsdemogenerator.azurewebsites.net and would like to build and release the project; however, I have certain warnings. I successfully ran the build pipeline, but when I want to release it I see this warning: "The version provided for the build artifact source is invalid." I do not know where to get this version from. There is no option in the dropdown.
Using the Azure DevOps Services Demo Generator, we can create the same SmartHotel360 project, as below.
And it will automatically create a build pipeline and release pipeline in SmartHotel360 project. By reviewing the generated release pipeline: SmartHotel360_Website-Deploy, we can see below default primary artifact: Build - _SmartHotel_Petchecker-Web.
Clicking this artifact we will be redirected to a build pipeline which doesn’t exist, so we need to manually delete this artifact and then re-add our own artifact, as below.
Therefore, after successfully running the build pipeline and publishing the artifacts, we can create a new release, as below.
For the source alias you just need to set the build name; the version will be picked automatically if you already have a successful build.
So I'm currently trying to proceed through the release pipeline portion of an Azure DevOps Fundamentals for beginners course on Udemy, and the goal is to deploy the code for a small webapp game called "Flatris" for the purpose of showing how Azure works.
I've been following along with all the steps as per the course, but when I run the release pipeline build, it consistently fails with the message:
"Error: No package found with specified pattern: D:\a\r1\a***.zipCheck if the package mentioned in the task is published as an artifact in the build or a previous stage and downloaded in the current job."
So far, I've double-checked the Web App resource provided through Azure; it's showing traffic when I try to run the pipelines, so it's not the problem (I think). I've double-checked the repositories, and they seem to be functioning well (I think). I'm unsure about the artifacts and whether they may be the source of the problem, but there's not enough specificity in the error logs for me to accurately isolate where the problem is.
I don't know if anyone else has managed to get past this problem, or if it's unique to me with something I'm doing, but any help would be greatly appreciated.
Error: No package found with specified pattern: D:\a\r1\a\**\*.zip
Check if the package mentioned in the task is published as an artifact in the build or a previous stage and downloaded in the current job.
When we use a release pipeline for our application, we have to specify the artifacts that make up the application. An artifact is a deployable component of your application, typically produced through Continuous Integration or a build pipeline. Azure Pipelines releases can deploy artifacts produced by a wide range of artifact sources such as an Azure Pipelines build, Jenkins, or TeamCity.
According to the error message, it seems you are using the build pipeline as the artifact source in the release pipeline.
To resolve this issue, we need to make sure we have used the Publish Build Artifacts task in the build pipeline to publish the artifacts to Azure Pipelines.
Then select that build pipeline when choosing the artifact source type:
In this case, the Azure release pipeline will download the artifact automatically when we execute the release; the artifact will be saved in the default folder D:\a\r1\a. That should resolve your issue.
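A minimal sketch of that publish step in the build pipeline (the staging path and artifact name are the usual defaults, adjust as needed):

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
    publishLocation: 'Container'

With that in place, the release agent downloads the artifact under D:\a\r1\a and the **\*.zip pattern should then find the package.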
In my pipeline on my Azure Devops project, I'm building my project and then zipping up the output into a file. However, at the moment, I have no way to access this archive.
I would like to upload it to a cloud storage service such as OneDrive, Google Drive, Dropbox or similar. But I haven't been able to find a simple way of doing this yet. Can anyone help me?
Several approaches can be used. If you only need "access" to the zip after a build has completed, the Publish Build Artifacts task is actually what you want.
This will publish specified files as artifacts within the completed pipeline job.
From within the detail view of the Build-Job you can then download and use whatever the pipeline produced.
It can be configured as a build step, like this:
- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: '$(Build.ArtifactStagingDirectory)'
    artifactName: 'myCoolZip-$(Date:yyyyMMdd)'
Where the pathToPublish is the path to the artifact you want to publish.
Usually the compiled resources are copied to the ArtifactStagingDirectory and then the contents of that directory are published. This makes it easier if there are many steps that all contribute to the final artifact.
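A typical copy step for that looks something like this (the contents pattern is an assumption, adjust it to whatever your build produces):

- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)'
    Contents: '**/*.zip'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'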
There are several community supported tasks that allow files to be uploaded to cloud-services:
rclone (supports a lot of providers, including OneDrive)
DropBox
AWS S3
SharePoint
Cloudvault
However since Azure tends to update regularly these might not be so future-proof depending on the maintainer.
If you still want to upload your zip to the cloud, where it is more accessible, the easiest way would be to use the FtpUpload task.
- task: FtpUpload@2
  inputs:
    credentialsOption: inputs
    serverUrl: ftp://someFtpServer.com
    username: yourUsername
    password: yourPassword
    rootDirectory: '$(Build.ArtifactStagingDirectory)'
    remoteDirectory: '/upload/$(Build.BuildId)/'
    # Depending on your certificate you might need to set this to true
    # trustSSL: true
However, don't place passwords and usernames directly in your YAML; use secret variables instead.
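For example, the credentials can be read from pipeline variables, with the password defined as a secret variable in the pipeline settings or a variable group (the variable names here are placeholders):

- task: FtpUpload@2
  inputs:
    credentialsOption: inputs
    serverUrl: '$(ftpServerUrl)'
    username: '$(ftpUsername)'
    password: '$(ftpPassword)'   # secret variable, never stored in the YAML itself
    rootDirectory: '$(Build.ArtifactStagingDirectory)'
    remoteDirectory: '/upload/$(Build.BuildId)/'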
It is also possible to upload artifacts to Azure Storage by using the AzureFileCopy task.
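A sketch of that approach (the service connection, storage account, and container names are placeholders; note that the AzureFileCopy task runs on Windows agents):

- task: AzureFileCopy@4
  inputs:
    SourcePath: '$(Build.ArtifactStagingDirectory)/*.zip'
    azureSubscription: 'MyServiceConnection'   # placeholder ARM service connection
    Destination: 'AzureBlob'
    storage: 'mystorageaccount'                # placeholder storage account
    ContainerName: 'builds'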
If you want more control, add a script to your repository that does the uploading (example script) and execute it as a Command Line task.
Use the Publish Artifact task to publish your build output. This will attach it to the build results. You will be able to download or consume the file in a downstream process such as a release definition.
Documentation is available on learn.microsoft.com.