In my pipeline on my Azure DevOps project, I'm building my project and then zipping up the output into a file. However, at the moment, I have no way to access this archive.
I would like to upload it to a cloud storage service such as OneDrive, Google Drive, Dropbox or similar. But I haven't been able to find a simple way of doing this yet. Can anyone help me?
Several approaches can be used. If you only need access to the zip after a build has completed, the Publish Build Artifacts task is actually what you want.
It publishes the specified files as artifacts of the completed pipeline run.
From the detail view of the build job you can then download and use whatever the pipeline produced.
It can be configured as a build step, like this:
- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: '$(Build.ArtifactStagingDirectory)'
    artifactName: 'myCoolZip-$(Date:yyyyMMdd)'
Where pathToPublish is the path to the artifact you want to publish.
Usually the compiled output is copied to $(Build.ArtifactStagingDirectory) and then the contents of that directory are published. This makes things easier when several steps all contribute to the final artifact.
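For example, a typical pair of steps might look like the following; the source folder is a placeholder for wherever your build writes its output:
- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Build.SourcesDirectory)/MyProject/bin/Release'  # placeholder build output path
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'

- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: '$(Build.ArtifactStagingDirectory)'
    artifactName: 'drop'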
There are several community-supported tasks that allow files to be uploaded to cloud services:
rclone (supports a lot of providers, including OneDrive)
Dropbox
AWS S3
SharePoint
Cloudvault
However, since Azure DevOps updates regularly, these might not be so future-proof, depending on the maintainer.
If you still want to upload your zip to the cloud where it is more accessible, the easiest way would be the FtpUpload task.
- task: FtpUpload@2
  inputs:
    credentialsOption: inputs
    serverUrl: ftp://someFtpServer.com
    username: yourUsername
    password: yourPassword
    rootDirectory: '$(Build.ArtifactStagingDirectory)'
    remoteDirectory: '/upload/$(Build.BuildId)/'
    # Depending on your certificate you might need to set this to true
    # trustSSL: true
However, don't place usernames and passwords directly in your YAML; use secret variables instead.
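For example, a minimal sketch using a variable group; the group name and the variable names (ftpUser, ftpPassword) are placeholders for whatever you define under Pipelines > Library with the values marked as secret:
variables:
  - group: ftp-credentials        # hypothetical variable group containing ftpUser and ftpPassword

steps:
  - task: FtpUpload@2
    inputs:
      credentialsOption: inputs
      serverUrl: ftp://someFtpServer.com
      username: $(ftpUser)        # resolved at runtime from the variable group
      password: $(ftpPassword)    # secret values are masked in the logs
      rootDirectory: '$(Build.ArtifactStagingDirectory)'
      remoteDirectory: '/upload/$(Build.BuildId)/'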
It is also possible to upload artifacts to Azure Storage using the AzureFileCopy task.
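A minimal sketch, assuming an ARM service connection, a storage account, and a blob container already exist (all names below are placeholders):
- task: AzureFileCopy@4
  inputs:
    SourcePath: '$(Build.ArtifactStagingDirectory)/*.zip'
    azureSubscription: 'my-azure-service-connection'   # placeholder service connection name
    Destination: 'AzureBlob'
    storage: 'mystorageaccount'                        # placeholder storage account
    ContainerName: 'build-archives'                    # placeholder container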
If you want more control, add a script to your repository that does the uploading (example script) and execute it as a command line task.
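As a rough sketch, with upload.sh standing in for a hypothetical script you keep in the repository (not the example script linked above):
- script: ./scripts/upload.sh "$(Build.ArtifactStagingDirectory)/myArchive.zip"
  displayName: 'Upload archive via custom script'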
Use the Publish Artifact task to publish your build output. This attaches it to the build results, so you can download the file or consume it in a downstream process such as a release definition.
Documentation is available on learn.microsoft.com.
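In YAML, the shorthand for publishing a pipeline artifact looks roughly like this (the zip path is a placeholder):
- publish: '$(Build.ArtifactStagingDirectory)/myArchive.zip'
  artifact: 'drop'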
I apologize for my English; I'm using a translator.
I need to set up a deployment on Azure DevOps using continuous deployment, but I need to exclude a file from the repository (I can't change the repository) and then create a deployment to IIS.
I have a web service file in the repository that I can't ignore or delete from Git. I need Azure DevOps to ignore it and then perform the continuous deployment.
Another approach is to use .artifactignore. By including this file you can describe which files you want to exclude before the artifact package is built. The tricky part is placing the file correctly:
where you save the .artifactignore file depends on the path you have specified for the Publish Pipeline Artifact task in your pipeline definition.
Here is a good example which helped me use it:
https://www.jfe.cloud/control-pipeline-artifacts-with-artifactignore-file/
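As a rough illustration (the entries are just examples; the syntax follows .gitignore-style globs), an .artifactignore placed next to the published path could look like:
# exclude source-control metadata and debug symbols from the published artifact
.git
**/*.pdb
**/node_modules/**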
You can use the Delete Files task, as covered below:
# Delete files
# Delete folders, or files matching a pattern
- task: DeleteFiles@1
  inputs:
    #SourceFolder: # Optional
    #Contents: 'myFileShare'
    #RemoveSourceFolder: # Optional
The SourceFolder input can be a folder such as $(rootFolder) or $(Build.ArtifactStagingDirectory).
I had a similar problem and solved it this way, especially for the .git or .vs folders.
I needed to do some processing (using Python) of configuration files in the source code that were apparently unavailable to the Release stage of my YAML pipeline, so I did the processing in the Build stage. But the Build stage doesn't appear to have access to the linked server (in a deployment group), because when I do this step:
- task: ExtractFiles@1
  inputs:
    #archiveFilePatterns: '**\*.zip'
    # not sure why I now have to go up one directory to get to drop\
    archiveFilePatterns: '$(Build.ArtifactStagingDirectory)\**\*.zip'
    destinationFolder: 'C:\DevOps\$(Build.BuildNumber)'
    overwriteExistingFiles: true
It executes without an error, but no files end up in C:\DevOps\$(Build.BuildNumber). How might I get access to the linked server from the Build stage, or is this impossible?
Short of that, any method of copying the files produced by my Python script from the Build stage to the Release stage would be helpful.
I'm not clear on which directories are reachable from the various stages. Is $(Build.ArtifactStagingDirectory) not reachable from the Release stage? Is a directory on the linked server (in this case C:\DevOps\$(Build.BuildNumber)) not accessible from the Build stage?
I have a build and release pipeline to build some jar files and deploy them to my Azure cloud subscription/resource group. I have been working on this for a while; it keeps failing and it's exasperating.
This is my build pipeline, specifically the publish artifact task (I understand I should use PublishPipelineArtifact and not PublishBuildArtifact, as it is the currently preferred option, although Build should work too as long as I mark my jars).
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: $(Pipeline.Workspace)/
    artifactName: 'drop'
    artifactType: 'pipeline'
I see in my PublishArtifact logs that my files are being read from /home/vsts/work/1/s.
Unfortunately, no information about where they are being copied to appears.
If I click on the artifacts created, I can see they end up under
https://dev.azure.com/MyOrg/MyProject/_build/results?buildId=XXXXX&view=artifacts&pathAsName=false&type=publishedArtifacts
more concretely under drop/s, in what I understand is the staging directory.
So far so good. I have my project built in drop and I can see my jars.
Then I go to the release pipeline to deploy to the cloud.
The tasks run, and this is my copy task to copy from build to release.
But at the end it fails: after apparently downloading the artifact without problems, it says it cannot find it, even though in the task editor of the release pipeline UI I can select the jar file manually, as seen in picture 2.
It seems to be a problem copying to the release stage, or, if it is copied properly, a problem reading it.
It seems to be a mismatch between path system variables, or even the slashes, since I'm copying a jar from a Linux build pipeline to what seems to be a Windows NTFS path. This is awful. I have been stuck for a while and have tried everything that came to mind. Can someone give me an idea of why the artifact seems to download fine but then it says it cannot find the jar file? Feel free to correct any other misconceptions you think exist in my post. Thanks a lot!
I have a template repository for build pipelines, say 'azure-templates-repo', with a Python task template as shown below:
steps:
  - task: PythonScript@0
    inputs:
      scriptSource: 'filePath'
      scriptPath: 'python_test.yml' # this is located in repo 'azure-templates-repo'
My question is about scriptPath: when I use this template in a build pipeline azure-pipeline.yml in the repository my-great-app, Azure attempts to find the file in my-great-app instead of where it is really located, azure-templates-repo.
So, is there a way to specify the repository for the filePath parameter of the PythonScript task?
When you use a template in a pipeline, you are only consuming the .yml template file, not the entire repository that contains the template. So by default, no additional files (besides the template itself) that exist in the template repository will be available when the primary pipeline is composed.
If you need access to scripts or other files that exist in your template repository, you will need to use the checkout task and actually check out the template repository.
- checkout: git://MyProject/MyTemplateRepo
One thing to be aware of if you go down the path of checking out multiple repositories is that it will cause the structure of your $(Build.SourcesDirectory) to change. In practice this can cause pain as you have to update any tasks that expected your primary repository location to be at the root of $(Build.SourcesDirectory).
That mutation of $(Build.SourcesDirectory) might not be a big deal for new pipelines, but it can become a pain if you have lots of existing pipelines that you want to consume a new template in and that require supporting scripts.
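For illustration, a sketch of what the multi-repo checkout looks like with the template repository from the question above (the alias, project/repo names, and script path are placeholders):
resources:
  repositories:
    - repository: templates              # alias used by checkout and template references
      type: git
      name: MyProject/MyTemplateRepo

steps:
  - checkout: self        # now lands in $(Build.SourcesDirectory)/<name of this repo>
  - checkout: templates   # lands in $(Build.SourcesDirectory)/MyTemplateRepo
  - task: PythonScript@0
    inputs:
      scriptSource: 'filePath'
      scriptPath: '$(Build.SourcesDirectory)/MyTemplateRepo/some_script.py'   # hypothetical supporting script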
One option is to package the supporting template scripts and publish them to an internal package feed, then pull the required script down as a package from within your template block. I have used this strategy before with templates that needed supporting PowerShell scripts: we pipeline those scripts, publish them as universal packages, and consume them at the template level.
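A rough sketch of the download side, assuming the scripts were published as a universal package named 'pipeline-scripts' to an internal feed 'my-feed' (those names, and possibly some input names, will differ in your setup and task version):
- task: UniversalPackages@0
  inputs:
    command: 'download'
    downloadDirectory: '$(Build.SourcesDirectory)/scripts'
    feedsToUse: 'internal'
    vstsFeed: 'my-feed'
    vstsFeedPackage: 'pipeline-scripts'
    vstsPackageVersion: '1.0.0'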
I have a release pipeline which I use to deploy my resources to other environments. All works fine, but the problem is that every time I deploy, all the resources are deployed even if no modification was made. Is there a way I can do selective deployment, i.e. deploy only those resources which have been modified? Any help would do. Thanks.
That's a broad question. There is no out-of-the-box feature to select which units to deploy, but you can use variables in the release pipeline:
Define a variable for each resource/unit, give it a default value, and enable the "Settable at release time" property.
For each resource, define a separate deployment task with a custom condition, like: and(succeeded(), eq(variables['Custom.DeployUnit1'], 'YES'))
You can then update these variables at release creation time.
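If you are on YAML pipelines rather than classic releases, runtime parameters give a similar "settable at queue time" experience; a minimal sketch (the unit name and the echo placeholder are hypothetical):
parameters:
  - name: deployUnit1
    displayName: 'Deploy Unit1?'
    type: string
    default: 'NO'
    values:
      - 'YES'
      - 'NO'

steps:
  - script: echo "Deploying Unit1..."   # stand-in for the real deployment task for Unit1
    condition: and(succeeded(), eq('${{ parameters.deployUnit1 }}', 'YES'))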
Is there any way to do selective deployment in Azure DevOps?
There is no such out-of-the-box way to do selective deployment in Azure DevOps.
That is because Azure DevOps Release does not support releasing only changed files: releasing only the changed files is not always meaningful and may not achieve what the project intends to release (for example, when only a config file changed in a commit).
But you could create a PowerShell script to compare timestamps for all files:
Create an XML file that stores the last upload/publish information for each file (e.g. file name, date/time, changeset/commit version).
Create a PowerShell script that compares the files (get the files' metadata and compare it with that XML file) and copies the updated files to a specific folder.
Publish the files in that folder.
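A rough sketch of the idea as an inline PowerShell step (this is not the script from the linked thread; the manifest location and folder names are placeholders, and persisting manifest.xml between runs, e.g. as its own artifact or on the target server, is left out):
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      $source   = "$(Build.SourcesDirectory)"
      $staging  = "$(Build.ArtifactStagingDirectory)/changed"
      $manifest = "$(Build.ArtifactStagingDirectory)/manifest.xml"
      New-Item -ItemType Directory -Force -Path $staging | Out-Null

      # Load the previous manifest (relative path -> hash), if there is one
      $previous = @{}
      if (Test-Path $manifest) {
        ([xml](Get-Content $manifest)).files.file | ForEach-Object {
          $previous[$_.path] = $_.hash
        }
      }

      # Hash the current files and copy only the ones that changed
      $lines = @()
      Get-ChildItem $source -Recurse -File | ForEach-Object {
        $rel  = $_.FullName.Substring($source.Length + 1)
        $hash = (Get-FileHash $_.FullName -Algorithm SHA256).Hash
        $lines += ('  <file path="{0}" hash="{1}" />' -f $rel, $hash)
        if ($previous[$rel] -ne $hash) {
          $dest = Join-Path $staging $rel
          New-Item -ItemType Directory -Force -Path (Split-Path $dest) | Out-Null
          Copy-Item $_.FullName $dest
        }
      }

      # Write the manifest for the next run; the 'changed' folder is what gets published
      Set-Content $manifest ("<files>`n" + ($lines -join "`n") + "`n</files>")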
Check the similar thread for some more details.
Besides, if you deploy via deploy.cmd or MSDeploy.exe, you could also use the -useChecksum WebDeploy flag:
WebDeploy/MSDeploy Quick Tip: Only Deploy Changed Files
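For reference, the flag is simply appended to the msdeploy call; everything below (package name, server URL, credentials) is a placeholder:
msdeploy.exe -verb:sync -source:package="MyApp.zip" -dest:auto,computerName="https://myserver:8172/msdeploy.axd",userName="deployUser",password="..." -useChecksum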
Hope this helps.