In my YAML file I have two tasks: CopyFiles and running a PowerShell script.
If only the PowerShell task is there, then there is no problem. It runs fine. So the ScriptPath is OK.
But with the CopyFiles task preceding it, it isn't able to find the PowerShell file anymore (see screenshot).
Does anyone have an idea?
jobs:
- job: sync_wiki_repo
  displayName: 'Sync wiki'
  steps:
  - checkout: wiki
  - task: CopyFiles@2
    displayName: Copy wiki files
    inputs:
      contents: 'Wiki/Distribution/Client-Welcome-Page/Readme.md'
      targetFolder: '$(System.DefaultWorkingDirectory)/wiki-staging'
      overWrite: true
  - checkout: git://something/set-wiki #checkout specific branch
  - task: AzurePowerShell@5
    displayName: 'Set wiki'
    inputs:
      azureSubscription: 'mysub'
      ScriptType: 'FilePath'
      ScriptPath: '1.0/scripts/sync-wiki-repo.ps1'
      ScriptArguments: '-wikipat $(e2e-pipelines-manage-wiki-secret) -project ${{parameters.project}}'
      azurePowerShellVersion: 6.4.0
      workingDirectory: '$(System.DefaultWorkingDirectory)'
As this is running on an Ubuntu Linux machine, the names of the folders and files are case sensitive. I double-checked that. And I added "E2E-Pipelines" to the ScriptPath:
ScriptPath: 'E2E-Pipelines/1.0/scripts/sync-wiki-repo.ps1'
Now my PowerShell file can be found.
It is solved.
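For anyone who hits the same thing: when a job contains more than one checkout step, the agent checks each repository out into its own subfolder (named after the repository) under the sources directory, so every path has to include that folder name. A minimal sketch of the layout (repository and folder names are illustrative):

steps:
- checkout: self                           # lands in $(Agent.BuildDirectory)/s/<this repo's name>
- checkout: git://MyProject/e2e-pipelines  # lands in $(Agent.BuildDirectory)/s/e2e-pipelines
- script: |
    # with multiple checkouts, the default working directory is the parent 's' folder,
    # so files must be addressed through their repository folder
    ls e2e-pipelines/1.0/scripts/sync-wiki-repo.ps1
  workingDirectory: $(System.DefaultWorkingDirectory)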
Related
I have an ASP.NET MVC application that runs on an Azure server. It has a module that needs a licence file (.lic) to live in a particular solution folder. I'm not allowed to keep the licence file in the dev environment (locally) for development purposes; it should only be added to the specific folder during the build process on Azure.
I found two ways to accomplish that:
Store the content of licence.lic in the Azure Key Vault and use it like:
- task: AzureKeyVault@2
  inputs:
    azureSubscription: 'my resource group'
    KeyVaultName: 'kv-mykeyvault'
    SecretsFilter: 'field1--Licensing--Keys--myModule'
    RunAsPreJob: false
- task: CmdLine@2
  displayName: 'Generate licence.lic'
  inputs:
    script: 'echo $(field1--Licensing--Keys--myModule) > licence.lic'
- task: CopyFiles@2
  displayName: 'Copy licence.lic'
  inputs:
    Contents: licence.lic
    targetFolder: '$(Build.ArtifactStagingDirectory)/path/to/directory/'
or
Simply upload the licence file in Azure DevOps → Library → Secure files, then download the file and copy it into the proper directory (/path/to/directory) as:
- task: DownloadSecureFile@1
  name: licence
  displayName: 'Download licence.lic'
  inputs:
    secureFile: 'licence.lic'
- task: CopyFiles@2
  displayName: 'Copy licence.lic'
  inputs:
    sourceFolder: $(Agent.TempDirectory)
    contents: licence.lic
    targetFolder: '$(Build.ArtifactStagingDirectory)/path/to/directory/'
Is there any significant difference between these two options, other than the first one running three tasks and the second one running two? If yes, which one is preferred?
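Side note on the second option: DownloadSecureFile@1 exposes the downloaded location as an output variable, so the copy can also be a one-line script instead of a CopyFiles task. A minimal sketch, assuming a Linux agent and that the task keeps the name licence:

- task: DownloadSecureFile@1
  name: licence
  inputs:
    secureFile: 'licence.lic'
# $(licence.secureFilePath) resolves to the full path of the downloaded file
- script: cp '$(licence.secureFilePath)' '$(Build.ArtifactStagingDirectory)/path/to/directory/'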
I have an Azure function app that is running on Azure. I am now trying to configure the application based on a PowerShell script. This PowerShell script is used to create Azure resources (e.g. Key Vault, Application Insights, ...) that will then be used by the function app. I created the PowerShell script, and my question here is how I can add it to the YAML file which is responsible for deploying the application. The PowerShell script is located in the repository under the name functionapp.ps1. The idea is to run the PowerShell script once the build is finished, so in the deploy stage. What I did so far is the following:
stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: DotNetCoreCLI@2
      displayName: Build
      inputs:
        command: 'build'
        projects: |
          $(workingDirectory)/*.csproj
        arguments: --output $(System.DefaultWorkingDirectory)/publish_output --configuration Release
    - task: ArchiveFiles@2
      displayName: 'Archive files'
      inputs:
        rootFolderOrFile: '$(System.DefaultWorkingDirectory)/publish_output'
        includeRootFolder: false
        archiveType: zip
        archiveFile: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
        replaceExistingArchive: true
    - publish: $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
      artifact: drop
- stage: Deploy
  displayName: Deploy stage
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: Deploy
    displayName: Deploy
    environment: 'development'
    pool:
      vmImage: $(vmImageName)
    strategy:
      runOnce:
        deploy:
          steps:
          - task: CmdLine@2
            inputs:
              script: |
                echo '$(System.DefaultWorkingDirectory)'
                dir
          - task: PowerShell@2
            inputs:
              targetType: 'filePath'
              filePath: $(System.DefaultWorkingDirectory)/functionapp.ps1
          - task: AzureFunctionApp@1
            displayName: 'Azure functions app deploy'
            inputs:
              azureSubscription: '$(azureSubscription)'
              appType: functionApp
              appName: $(functionAppName)
              package: '$(Pipeline.Workspace)/drop/$(Build.BuildId).zip'
When the stages are run and it arrives at this task, I get the following error:
##[error]Invalid file path 'D:\a\1\s\functionapp.ps1'. A path to a .ps1 file is required.
I tried using the command line task to check the default working directory and I got the following result: D:\a\1\s
I still don't know what I am missing here and why I am getting this error. Can someone please give me a concrete solution to this problem?
I ended up getting the same error. Can someone please tell me what the issue is here?
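As quoted further down on this page (from the deployment-jobs docs), a deployment job doesn't automatically check out the source repo, which would explain why functionapp.ps1 is missing at D:\a\1\s. A minimal sketch of one possible fix, adding checkout: self before the PowerShell task:

strategy:
  runOnce:
    deploy:
      steps:
      - checkout: self   # deployment jobs skip the implicit checkout, so fetch sources explicitly
      - task: PowerShell@2
        inputs:
          targetType: 'filePath'
          filePath: $(System.DefaultWorkingDirectory)/functionapp.ps1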
I have an Azure pipeline that triggers a Python Selenium script that checks that my website works properly.
But there is a stage that keeps failing because I need Selenium to input a specific date, and as I am not sure whether the date being input is in the wrong format (locally it works just fine), I would like to take a screenshot at that stage to fully understand what is happening there.
Locally, this is my configuration to save the screenshot:
try:
    wait.until(EC.element_to_be_clickable((By.XPATH, '//*[@id="root"]/div[2]/main/div[4]/div/button[2]'))).click()
except:
    driver.save_screenshot('error.png')
This works just fine and it does output the PNG image in the local folder,
but running this on the Azure pipeline is not saving the PNG file.
This is my pipeline configuration:
stages:
- stage:
  jobs:
  - job: Configuration
    steps:
    - task: UsePythonVersion@0
      inputs:
        versionSpec: '3.8'
        addToPath: true
    - script: |
        python -m pip install --upgrade pip
        pip install selenium
        printenv
    - task: PythonScript@0
      inputs:
        scriptSource: 'filePath'
        scriptPath: './script1.py'
      env:
        USERNAMEOT: $(usernameot)
        PASSWORDOT: $(passwordot)
  - job: Artifact
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files to: $(build.artifactstagingdirectory)'
      inputs:
        SourceFolder: '$(system.defaultworkingdirectory)'
        Contents: '**.png'
        TargetFolder: '$(build.artifactstagingdirectory)'
        flattenFolders: true
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: screenshots'
      inputs:
        PathtoPublish: '$(build.artifactstagingdirectory)'
        ArtifactName: screenshots
I do have a task to copy the file and publish the artefact, but as this job runs in parallel, it completes before the previous job and returns nothing.
I was wondering how I can save the PNG file I have to the artefact folder even if the Configuration job fails.
Thank you so much for any help you can provide me, guys. I am really struggling with this.
I was wondering how I can save the PNG file I have to the artefact folder even if the Configuration job fails?
You could try to set the dependencies and condition for the job Artifact, like:
jobs:
- job: Artifact
  dependsOn: Configuration
  condition: succeededOrFailed()
With dependsOn: Configuration, the job Artifact will execute after the job Configuration. And condition: succeededOrFailed() will keep the job Artifact executing even when the Configuration job fails.
You can check the documents Specify conditions and Dependencies for more details.
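One caveat, assuming Microsoft-hosted agents: each job can run on a different machine with its own file system, so a PNG saved by the Configuration job may simply not exist on the agent that runs the Artifact job. A sketch of an alternative that keeps the copy and publish steps in the same job, guarded by step-level conditions:

- job: Configuration
  steps:
  - task: PythonScript@0
    inputs:
      scriptSource: 'filePath'
      scriptPath: './script1.py'
  - task: CopyFiles@2
    condition: succeededOrFailed()   # run even if the script step failed
    inputs:
      SourceFolder: '$(System.DefaultWorkingDirectory)'
      Contents: '**.png'
      TargetFolder: '$(Build.ArtifactStagingDirectory)'
  - task: PublishBuildArtifacts@1
    condition: succeededOrFailed()
    inputs:
      PathtoPublish: '$(Build.ArtifactStagingDirectory)'
      ArtifactName: screenshots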
I have the following pipeline
variables:
  azureSubscription: ...
stages:
- stage: Deploy
  displayName: Deploy stage
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: Deploy
    displayName: Deploy
    environment: 'development'
    strategy:
      runOnce:
        deploy:
          steps:
          - task: AzureResourceGroupDeployment@2
            inputs:
              action: 'Create Or Update Resource Group'
              resourceGroupName: '...'
              location: '...'
              templateLocation: 'Linked artifact'
              csmFile: '$(Pipeline.Workspace)/azure-deploy.json'
              deploymentMode: 'Incremental'
The repo has the following files (at the root directory)
azure-pipelines.yaml
azure-deploy.json
and only a master branch.
I have tried:
azure-deploy.json
**azure-deploy.json
**/*azure-deploy.json
$(Build.SourcesDirectory)/azure-deploy.json
$(Pipeline.Workspace)/azure-deploy.json
$(System.DefaultWorkingDirectory)/azure-deploy.json
Having read:
Azure Pipeline Error: Could not find any file matching the template file pattern
VSTS Pipeline Deployment of ARM Error: Could not find any file matching the template file pattern
https://github.com/microsoft/azure-pipelines-tasks/issues/11520
to no avail. Any ideas?
Update: I have added a publish task as suggested by @ShaykiAbramczyk.
Now I get: Template file pattern matches a directory instead of a file: /home/vsts/work/1/azure-deploy.json
- stage: Build
  displayName: Build stage
  jobs:
  - job: Build
    displayName: Build
    pool:
      vmImage: $(vmImageName)
    steps:
    - task: PublishPipelineArtifact@1
      inputs:
        targetPath: '$(Pipeline.Workspace)'
        artifact: 'azure-deploy.json'
        publishLocation: 'pipeline'
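For what it's worth, this second error is consistent with the snippet above: targetPath points at the whole $(Pipeline.Workspace) directory, so what gets downloaded in the deployment job is a folder, not a single template file. A minimal sketch of publishing only the file (artifact name is illustrative):

- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Build.SourcesDirectory)/azure-deploy.json'  # publish just the template file
    artifact: 'arm-template'
    publishLocation: 'pipeline'
# in the deployment job the file then lands at:
#   $(Pipeline.Workspace)/arm-template/azure-deploy.json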
"A deployment job doesn't automatically clone the source repo. You can checkout the source repo within your job with checkout: self."
Source: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/deployment-jobs?view=azure-devops
Example from my setup, I put checkout: self as the first step and now my repository is cloned before before executing the Azure PowerShell:
strategy:
  runOnce:
    deploy:
      steps:
      - checkout: self
      - task: AzurePowerShell@5
        displayName: Setup Network
        inputs:
It is a good strategy to go for multi-stage pipelines for what you are doing.
Build is for composing your artifacts.
Deployment jobs are for the publishing part.
So you are on the right track.
If you need sources during the deployment jobs, use the checkout step to fetch them, ref. the Repo Checkout docs.
Just my two cents:
Because you use a deployment job, the sources from the master branch aren't downloaded onto the agent.
You need to publish the files in the build stage and consume them in the deployment, with pipeline artifacts.
Or just run the AzureResourceGroupDeployment in a regular job; then the .json file will be on the agent.
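A minimal sketch of the publish-and-consume variant (stage and artifact names are illustrative; pipeline artifacts from earlier stages are downloaded automatically in a deployment job):

stages:
- stage: Build
  jobs:
  - job: Build
    steps:
    - publish: azure-deploy.json   # publish just the template file
      artifact: arm
- stage: Deploy
  dependsOn: Build
  jobs:
  - deployment: Deploy
    environment: 'development'
    strategy:
      runOnce:
        deploy:
          steps:
          # the 'arm' artifact is downloaded to $(Pipeline.Workspace)/arm automatically
          - task: AzureResourceGroupDeployment@2
            inputs:
              azureSubscription: '$(azureSubscription)'
              action: 'Create Or Update Resource Group'
              resourceGroupName: '...'
              location: '...'
              templateLocation: 'Linked artifact'
              csmFile: '$(Pipeline.Workspace)/arm/azure-deploy.json'
              deploymentMode: 'Incremental'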
My objective was to create a blank, starter resource group in Azure and nothing else. And this was repeatedly giving me the error
Template file pattern matches a directory instead of a file: /home/vsts/work/1/s
And that's what got me here.
I finally sorted that out with Stringfello's
- checkout: self
step.
My full pipeline is as follows.
variables:
- name: finalBuildArtifactName
  value: 'aspNetCoreDropFromYaml123'
- name: BuildParameters.RestoreBuildProjects
  value: '**/*.csproj'
- name: BuildParameters.TestProjects
  value: '**/*[Tt]ests/*.csproj'
- name: buildConfiguration
  value: 'Release'
trigger:
- master
name: $(date:yyyyMMdd)$(rev:.r)
stages:
- stage: Build ##Basically prints out some vars and copies the template files.
  jobs:
  - job: buildWebApp
    displayName: Build Stage for somepurpose
    pool:
      vmImage: ubuntu-latest
    steps:
    - script: |
        echo build.artifactstagingdirectory and build.buildnumber are as follows.
        echo $(build.artifactstagingdirectory) $(build.buildnumber)
        echo $(projects)
        echo $(BuildConfiguration)
        echo Pipeline.Workspace is $(Pipeline.Workspace)
        echo The current branch is - $(Build.SourceBranchName)!!.
        echo $(finalBuildArtifactName)
        echo "This is the build pipe line. This produces the necessary artifacts for subsequent release pipeline."
      displayName: 'Command Line Script to write out some vars'
    - powershell: |
        # Write your PowerShell commands here.
        Write-Host "This is from power shell command task"
        Write-Host "This writes out the env vars"
        get-childitem -path env:*
      displayName: 'PowerShell script to write out env vars'
    # The following task is needed. This copies the ARM template files.
    # Created these templates from Visual Studio 2019 as follows.
    # Right click your solution and Add -> New Project -> Azure Resource Group and gave the name Vivek_Aks_Rg
    - task: CopyFiles@2
      inputs:
        SourceFolder: 'iac/ArmTemplates/Vivek_Aks_Rg/'
        Contents: 'azuredeploy*.json'
        TargetFolder: '$(build.artifactStagingDirectory)/ArmTemplates'
    - task: PublishBuildArtifacts@1
      displayName: Publish Artifact
      condition: succeededOrFailed()
      inputs:
        PathtoPublish: '$(build.artifactstagingdirectory)'
        ArtifactName: '$(finalBuildArtifactName)'
        PublishLocation: 'Container'
- stage: DeployToDev
  displayName: Deploy to Dev Env
  jobs:
  - deployment:
    pool:
      vmImage: ubuntu-latest
    environment: Dev
    strategy:
      runOnce:
        deploy:
          steps:
          - checkout: self
          - task: AzureResourceManagerTemplateDeployment@3
            displayName: 'Create Azure App Service in a Given Resource Group'
            inputs:
              deploymentScope: 'Subscription'
              azureResourceManagerConnection: 'Pay-As-You-Go(123YourSubscriptionId)'
              subscriptionId: '123YourSubscriptionId'
              action: 'Create Or Update Resource Group'
              resourceGroupName: 'YourResourceGroupName'
              location: 'Central India'
              csmFile: '$(Pipeline.Workspace)/$(finalBuildArtifactName)/ArmTemplates/azuredeploy.json'
              csmParametersFile: '$(Pipeline.Workspace)/$(finalBuildArtifactName)/ArmTemplates/azuredeploy.parameters.json'
              deploymentMode: 'Incremental'
I'm new to Azure DevOps and pipelines, and I ran into an issue running the same pipeline multiple times in a short period.
In brief, I created a pipeline to simply build a .NET project with MSBuild and generate an artifact. The pipeline triggers on changes in the master branch.
The first time, it worked: I could download the artifact and execute the program without any issue. Now if I make a change in the master branch five minutes later, adding an option to my program, the pipeline runs successfully; however, when running the program stored in the generated artifact, my new option is not there.
I'm probably doing something stupid there, but I don't understand why I have this behaviour.
Is there any kind of caching, and how can I have a fresh build every time?
== EDIT ==
Here is my YAML definition, as requested.
Basically, the steps are:
Checkout the solution with all submodules
NuGet restore of the packages for all required projects
MSBuild task
Archive the output
Publish the artifact.
trigger:
- master
pool:
  demands: azureps
  vmImage: 'windows-latest'
steps:
- checkout: "git://GSS-CMDB-Tools/GSSAM_Code"
  submodules: true
  persistCredentials: true
- task: NuGetCommand@2
  inputs:
    command: 'custom'
    arguments: 'restore ADDMSync/packages.config -SolutionDirectory .'
- task: NuGetCommand@2
  inputs:
    command: 'custom'
    arguments: 'restore GSSAM/packages.config -SolutionDirectory .'
- task: NuGetCommand@2
  inputs:
    command: 'custom'
    arguments: 'restore GSSAM.ADDMRest/packages.config -SolutionDirectory .'
- task: NuGetCommand@2
  inputs:
    command: 'custom'
    arguments: 'restore GSSAM.SNOWRest/packages.config -SolutionDirectory .'
- task: MSBuild@1
  inputs:
    solution: 'ADDMSync/ADDMSync.csproj'
    msbuildArchitecture: 'x64'
    configuration: 'Release'
    msbuildArguments: '/p:PostBuildEvent='
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      # Write your PowerShell commands here.
      mv ADDMSync/bin/Release ADDMSync/bin/ADDMSync
      rm ADDMSync/bin/ADDMSync/*.pdb
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: 'ADDMSync/bin/ADDMSync'
    includeRootFolder: true
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/ADDMSync.zip'
    replaceExistingArchive: true
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)/ADDMSync.zip'
    ArtifactName: 'ADDMSync'
    publishLocation: 'Container'
Thanks a lot
Rémi
OK, I think I understand what happens.
What I did was commit and push all the submodules required by the build, but I did not commit the modification of the solution itself. Doing that makes it work.
At first I didn't understand why, but I guess it's linked to the way the checkout task works: the parent repository records a specific commit for each submodule, so until that updated reference is committed, the pipeline keeps checking out the old submodule commits.