Maven local repo cache on Azure Pipelines not working

I'm having problems with the Maven cache in Azure Pipelines: even though the cache is downloaded and I can see that the .m2/repository folder exists inside the work folder, when the Maven task runs the dependencies are downloaded again, which defeats my attempt to use the cache.
My first attempt was using:
MAVEN_CACHE_FOLDER: $(Pipeline.Workspace)/.m2/repository
I got the same error with that, and after some research I found a topic on the Microsoft Developer Community which suggested changing the directory used for the MAVEN_CACHE_FOLDER environment variable.
But that didn't solve my problem.
Below is my script:
pool:
  vmImage: ubuntu-latest
variables:
  MAVEN_CACHE_FOLDER: $(HOME)/.m2/repository
  MAVEN_OPTS: '-Dmaven.repo.local=$(MAVEN_CACHE_FOLDER)'
stages:
- stage: validate
  displayName: VALIDATE
  jobs:
  - job: java_validations
    displayName: Java Validations
    workspace:
      clean: outputs
    steps:
    - checkout: self
      clean: false
    - task: Cache@2
      displayName: 'Cache Maven local repo'
      inputs:
        key: '"funcs" | maven | "$(Agent.OS)" | pom.xml'
        restoreKeys: |
        path: $(MAVEN_CACHE_FOLDER)
    - script: |
        mvn validate $(MAVEN_OPTS)
      displayName: Maven Validate
    - script: |
        mvn checkstyle:check $(MAVEN_OPTS)
      displayName: Maven Checkstyle
    - script: |
        tree $(MAVEN_CACHE_FOLDER)
      displayName: show MAVEN_CACHE_FOLDER tree
  - job: unit_tests
    displayName: Unit Tests
    dependsOn:
    - java_validations
    workspace:
      clean: outputs
    steps:
    - checkout: self
      clean: false
    - task: Cache@2
      displayName: 'Cache Maven local repo'
      inputs:
        key: '"funcs" | maven | "$(Agent.OS)" '
        restoreKeys: |
        path: $(MAVEN_CACHE_FOLDER)
    - script: |
        tree $(MAVEN_CACHE_FOLDER)
      displayName: show MAVEN_CACHE_FOLDER tree
    - task: Maven@3
      inputs:
        mavenPomFile: 'pom.xml'
        options: 'test-compile failsafe:integration-test -Dcheckstyle.skip -Pun-tests $(MAVEN_OPTS)'
        publishJUnitResults: false
        mavenVersionOption: 'Default'
        mavenAuthenticateFeed: true
        effectivePomSkip: false
        sonarQubeRunAnalysis: true
        sqMavenPluginVersionChoice: 'latest'
      env:
        JAVA_HOME: $(JAVA_HOME_17_X64)
        PATH: $(JAVA_HOME_17_X64)/bin:$(PATH)
    - script: |
        tree $(MAVEN_CACHE_FOLDER)
      displayName: show MAVEN_CACHE_FOLDER tree

After contacting the people at Microsoft, we arrived at the solution below. The problem was in the way I was passing the options to the Maven task; the property they must be passed through is:
mavenOptions: '$(MAVEN_OPTS)'
so the task looks like this:
- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    mavenOptions: '-Xmx3072m $(MAVEN_OPTS)'
    options: 'test-compile failsafe:integration-test -Dcheckstyle.skip -Pun-tests $(MAVEN_OPTS)'
    publishJUnitResults: false
    mavenVersionOption: 'Default'
    mavenAuthenticateFeed: true
    effectivePomSkip: false
    sonarQubeRunAnalysis: true
    sqMavenPluginVersionChoice: 'latest'
  env:
    JAVA_HOME: $(JAVA_HOME_17_X64)
    PATH: $(JAVA_HOME_17_X64)/bin:$(PATH)
This solved it for me and the cache started working.
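For reference, here is a minimal sketch of how the restore can be verified. The cacheHitVar input of Cache@2 and the verification script are additions for illustration, not part of the pipeline above:
- task: Cache@2
  displayName: 'Cache Maven local repo'
  inputs:
    key: '"funcs" | maven | "$(Agent.OS)" | pom.xml'
    path: $(MAVEN_CACHE_FOLDER)
    cacheHitVar: MAVEN_CACHE_RESTORED   # set to 'true' when the key matches exactly
- script: |
    # On a cache hit the local repo should already contain artifacts before Maven runs
    echo "Cache restored: $(MAVEN_CACHE_RESTORED)"
    du -sh $(MAVEN_CACHE_FOLDER) || true
  displayName: Verify Maven cache restore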

Related

Publish file content to service bus from CI pipeline

In my CI pipeline I am trying to publish a message to Service Bus, and it works when it's just some hardcoded text or variables, using the "PublishToAzureServiceBus" task.
The problem is when trying to read a file from the repository and then publish that to Service Bus.
I have tried reading the file in a script and putting it into a variable, but that does not work because the variable cannot hold the big JSON file.
Is there any way to read the file directly when publishing the message to Service Bus?
Below is a sample code snippet for debugging:
trigger:
- none
pool:
  vmImage: ubuntu-latest
parameters:
- name: ProjectName
  displayName: Project Name
  type: string
  default: DevOpsDemo
- name: repoName
  displayName: repo Name
  type: string
  default: ProjectCode
- name: branchRef
  displayName: Branch Name
  type: string
  default: main
variables:
- name: jobStatus
  value: "Failed"
- name: projectFile
  value: ""
stages:
- stage: Stage1
  displayName: Stage 1
  jobs:
  - job: CheckOutRepo
    displayName: CheckOut-Repo Display
    steps:
    - script: |
        echo "Checkout for " ${{ parameters.ProjectName}} : ${{ parameters.repoName}} : ${{ parameters.branchRef}}
      name: PrintMessage
    - checkout: git://${{ parameters.ProjectName}}/${{ parameters.repoName}}@refs/heads/${{ parameters.branchRef}}
      name: Checkout
    - task: PythonScript@0
      inputs:
        scriptSource: 'inline'
        script: |
          import json
          import requests
          f = open('project-release.json')
          projectFile = json.load(f)
          print(projectFile)
          f.close()
          print("Afterclosing")
          print(projectFile)
    - script: |
        echo "Project release file" $(cat project-release.json)
      name: TestPrint
    - task: CopyFiles@2
      inputs:
        SourceFolder: 'services'
        Contents: '**'
        TargetFolder: $(Build.ArtifactStagingDirectory)
      name: CopyFiles
    - task: PublishBuildArtifacts@1
      inputs:
        PathtoPublish: $(Build.ArtifactStagingDirectory)
        ArtifactName: 'drop'
        publishLocation: 'Container'
      name: PublishArtifacts
    - bash: |
        echo "##vso[task.setvariable variable=jobStatus]Success"
      name: setVar
    - bash: |
        echo "##vso[task.setvariable variable=jobStatus;isOutput=true]$(jobStatus)"
        echo "##vso[task.setvariable variable=projectFile;isOutput=true]$(cat project-release.json)"
      name: SetStatus
      condition: always()
- stage: Stage2
  displayName: Stage 2
  condition: always()
  jobs:
  - job: Publish
    pool: server
    variables:
      jobStatus: $[ stageDependencies.Stage1.CheckOutRepo.outputs['SetStatus.jobStatus'] ]
      projectFile: $[ stageDependencies.Stage1.CheckOutRepo.outputs['SetStatus.projectFile'] ]
    steps:
    - task: PublishToAzureServiceBus@1
      inputs:
        azureSubscription: 'SBConnection'
        messageBody: |
          {
            "Status": "$(jobStatus)",
            "BuildID": "$(build.buildid)",
            "BuildNumber": "$(build.buildnumber)",
            "projectFile": $(cat project-release.json)
          }
        signPayload: false
        waitForCompletion: false
      condition: always()
I was able to solve this by using setvariable in a bash script, as below:
pool:
  vmImage: ubuntu-latest
stages:
- stage: Stage1
  displayName: Stage 1
  jobs:
  - job: CheckOutRepo
    displayName: CheckOut-Repo Display
    steps:
    - checkout: git://${{ parameters.ProjectName}}/${{ parameters.repoName}}@refs/heads/${{ parameters.branchRef}}
      name: Checkout
    - bash: |
        data=$(cat project-release.json)
        echo "##vso[task.setvariable variable=jobStatus;isOutput=true]$(jobStatus)"
        echo "##vso[task.setvariable variable=data;isOutput=true]"$data
      name: SetStatus
      condition: always()
- stage: Stage2
  displayName: Stage 2
  condition: always()
  jobs:
  - job: Publish
    pool: server
    variables:
      jobStatus: $[ stageDependencies.Stage1.CheckOutRepo.outputs['SetStatus.jobStatus'] ]
      projectFile: $[ stageDependencies.Stage1.CheckOutRepo.outputs['SetStatus.data'] ]
    steps:
    - task: PublishToAzureServiceBus@1
      inputs:
        azureSubscription: 'SBConnection'
        messageBody: |
          {
            "Status": "$(jobStatus)",
            "BuildID": "$(build.buildid)",
            "BuildNumber": "$(build.buildnumber)",
            "projectFile": $(projectFile)
          }
        signPayload: false
        waitForCompletion: false
      condition: always()
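One additional note, as an assumption on my part rather than something from the original answer: the ##vso[task.setvariable] logging command only reads a single line, so if project-release.json spans multiple lines it can help to compact it first, for example with jq (assuming jq is available on the agent):
- bash: |
    # Compact the JSON to one line so the whole document fits into the output variable
    data=$(jq -c . project-release.json)
    echo "##vso[task.setvariable variable=data;isOutput=true]$data"
  name: SetStatus
  condition: always()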

Pipeline Artifact not Downloading with Stages

I have recently upgraded my YAML pipeline to include stages; since then, the build artifacts are not downloading in a later stage's task. For the life of me I can't figure out why.
Please see my YAML code below with an explanation of how my pipeline works.
First, I have the main pipeline that calls the template YAML files.
mainpipeline.yml
pool:
  vmImage: 'ubuntu-latest'
resources:
  repositories:
  - repository: Terraform
    name: VALUE/Terraform
    path:
    - include: /Terraform
    type: git
    ref: main #branch name
  - repository: Website
    name: VALUE/Website
    path:
    - include: /Website
    type: git
    ref: newartifactpipeline #branch name
  - repository: AuthenticationServer
    name: VALUE/AuthenticationServer
    path:
    - include: /AuthenticationServer
    type: git
    ref: VALUE #branch name
trigger:
  branches:
    include:
    - master
variables:
  buildConfiguration: 'Release'
stages:
- stage: build_website_files
  displayName: Building Main Website Files
  jobs:
  - job: build_main_website
    steps:
    - checkout: Website
    - template: buildartifact.yml@website
      parameters:
        #vmImage: 'windows-latest'
        buildConFiguration: $(buildConfiguration)
        project: Website.csproj
        artifactName: Website
- stage: build_authenticationserver_files
  displayName: Building AuthenticationServer Website Files
  jobs:
  - job: build_authenticationserver_website
    steps:
    - checkout: AuthenticationServer
    - template: buildartifact.yml@AuthenticationServer
      parameters:
        # vmImage: 'windows-latest'
        buildConFiguration: $(buildConfiguration)
        project: AuthenticationServer.csproj
        artifactName: AuthenticationServer
- stage: run_terraform_pre_build
  displayName: Building Terraform Applications and Deploying Web Apps
  jobs:
  - job: building_terraform_applications
    steps:
    - checkout: Terraform
    - template: /VALUE/runterraform.yml@Terraform
      parameters:
        terraformWorkingDirectory: '$(System.DefaultWorkingDirectory)/VALUE'
        serviceConnection: 'VALUE'
        azureSubscription: 'VALUE'
        appconnectionname: 'VALUE'
        backendresourcegroupname: 'VALUE'
        backendstorageaccountname: 'VALUE'
        backendcontainername: 'VALUE'
        RG: 'RG_Example'
        azureLocation: 'UK South'
        terraformVersion: '1.0.4'
        artifactName: 'Website'
- stage: run_terraform_post_build
  displayName: Apply Post Build Settings
  jobs:
  - job: apply_post_build_settings
    steps:
    - checkout: Terraform
    - template: /Terraform/PostBuild/runterraformpostbuild.yml@Terraform
      parameters:
        terraformWorkingDirectory: '$(System.DefaultWorkingDirectory)/VALUE/PostBuild'
        serviceConnection: 'VALUE'
        azureSubscription: 'VALUE'
        appconnectionname: 'VALUE'
        backendresourcegroupname: 'VALUE'
        backendstorageaccountname: 'VALUE'
        backendcontainername: 'VALUE'
        RG: 'RG_Example'
        azureLocation: 'UK South'
        terraformVersion: '1.0.4'
        artifactName: 'Website'
The first stage builds and calls this build-artifact template YAML file, which does successfully publish the artifact (the run screenshot shows this as proof); the YAML code is below:
buildartifact.yml
parameters:
- name: buildConfiguration
  type: string
  default: 'Release'
- name: project
  type: string
  default: 'Website.csproj'
- name: artifactName
  type: string
  default: 'Website'
- name: vmImage
  type: string
  default: 'windows-latest'
jobs:
- job: build_website
  pool:
    vmImage: ${{ parameters.vmImage }}
  steps:
  - checkout: Website
  - task: CmdLine@2
    inputs:
      script: |
        echo '$(System.DefaultWorkingDirectory)'
        dir
  - task: DotNetCoreCLI@2
    displayName: dotnet restore
    inputs:
      command: restore
      projects: '**/${{ parameters.project }}'
  # Node.js tool installer
  # Finds or downloads and caches the specified version spec of Node.js and adds it to the PATH
  - task: NodeTool@0
    displayName: 'Install Node.js'
    inputs:
      versionSpec: '14.17.3'
      force32bit: false # Optional
      checkLatest: false # Optional
  - script: |
      npm install -g @angular/cli@12.1.3
      npm install
    displayName: 'npm install'
  - task: Npm@1
    displayName: 'npm run build'
    inputs:
      command: 'custom'
      workingDir: ClientApp
      customCommand: 'build'
  - task: DotNetCoreCLI@2
    displayName: 'Build'
    inputs:
      command: 'build'
      projects: '**/${{ parameters.project }}'
      arguments: '--configuration ${{ parameters.buildConfiguration }}'
  - task: DotNetCoreCLI@2
    displayName: dotnet restore unit tests
    inputs:
      command: restore
      projects: 'UnitTests/UnitTests.csproj'
  - task: DotNetCoreCLI@2
    displayName: dotnet Test
    inputs:
      command: test
      projects: 'UnitTests/UnitTests.csproj'
      arguments: '--configuration Release'
  - task: DotNetCoreCLI@2
    displayName: 'Publish Application'
    inputs:
      command: 'publish'
      publishWebProjects: false
      projects: '**/${{ parameters.project }}'
      arguments: '--configuration ${{ parameters.buildConfiguration }} --output $(Pipeline.Workspace)/website/'
  - task: PublishPipelineArtifact@1
    displayName: 'Publish Artifacts'
    inputs:
      targetPath: '$(Pipeline.Workspace)/website/'
      artifact: ${{ parameters.artifactName }}
      publishLocation: 'pipeline'
With this YAML for the PublishPipelineArtifact task I have tried the following pipeline variables: Pipeline.Workspace and System.DefaultWorkingDirectory.
Neither has worked in the later stage, where the final YAML file tries to download the pipeline artifact; see the YAML code below:
runterraformanddownloadartifact.yml
parameters:
- name: terraformWorkingDirectory
  type: string
  default: $(System.DefaultWorkingDirectory)/Terraform
- name: serviceConnection
  type: string
  default: value
- name: azureSubscription
  type: string
  default: value
- name: appconnectionname
  type: string
  default: value
- name: backendresourcegroupname
  type: string
  default: DevOpsTerraform
- name: backendstorageaccountname
  type: string
  default: value
- name: backendcontainername
  type: string
  default: value
- name: RG
  type: string
  default: RG_Example
- name: azureLocation
  type: string
  default: UK South
- name: terraformVersion
  type: string
  default: 1.0.4
- name: artifactName
  type: string
  default: Website
jobs:
- job: Run_Terraform
  displayName: Installing and Running Terraform
  steps:
  - checkout: Terraform
  - task: TerraformInstaller@0
    displayName: install
    inputs:
      terraformVersion: '${{ parameters.terraformVersion }}'
  - task: CmdLine@2
    inputs:
      script: |
        echo '$(System.DefaultWorkingDirectory)'
        dir
  - task: TerraformTaskV2@2
    displayName: init
    inputs:
      provider: azurerm
      command: init
      backendServiceArm: '${{ parameters.serviceConnection }}'
      backendAzureRmResourceGroupName: '${{ parameters.backendresourcegroupname }}'
      backendAzureRmStorageAccountName: '${{ parameters.backendstorageaccountname }}'
      backendAzureRmContainerName: '${{ parameters.backendcontainername }}'
      backendAzureRmKey: terraform.tfstate
      workingDirectory: '${{ parameters.terraformWorkingDirectory }}'
  - task: TerraformTaskV1@0
    displayName: plan
    inputs:
      provider: azurerm
      command: plan
      commandOptions: '-input=false'
      environmentServiceNameAzureRM: '${{ parameters.serviceConnection }}'
      workingDirectory: '${{ parameters.terraformWorkingDirectory }}'
  - task: TerraformTaskV1@0
    displayName: apply
    inputs:
      provider: azurerm
      command: apply
      commandOptions: '-input=false -auto-approve'
      environmentServiceNameAzureRM: '${{ parameters.serviceConnection }}'
      workingDirectory: '${{ parameters.terraformWorkingDirectory }}'
- job: Put_artifacts_into_place
  displayName: Putting_artifacts_into_place
  dependsOn: Run_Terraform
  steps:
  - checkout: Website
  - checkout: AuthenticationServer
  - task: DownloadPipelineArtifact@2
    displayName: Download Build Artifacts
    inputs:
      artifact: '${{ parameters.artifactName }}'
      patterns: /website/**/*.zip
      path: $(Pipeline.Workspace)/website/
  - task: AzureWebApp@1
    displayName: 'Azure Web App Deploy: VALUE'
    inputs:
      package: $(Pipeline.Workspace)/website/**/*.zip
      azureSubscription: '${{ parameters.azureSubscription }}'
      ConnectedServiceName: '${{ parameters.appconnectionname}}'
      appName: VALUE
      ResourceGroupName: '${{ parameters.RG}}'
  - task: DownloadPipelineArtifact@2
    displayName: Download Build Artifacts
    inputs:
      artifact: '${{ parameters.artifactName}}'
      patterns: /authsrv/**/*.zip
      path: $(Pipeline.Workspace)/authsrv/
  - task: AzureWebApp@1
    displayName: 'Azure Web App Deploy: VALUE'
    inputs:
      package: $(Pipeline.Workspace)/authsrv/**/*.zip
      azureSubscription: '${{ parameters.azureSubscription }}'
      ConnectedServiceName: '${{ parameters.appconnectionname}}'
      appName: VALUE
      ResourceGroupName: '${{ parameters.RG}}'
Essentially the first pipeline calls these two template pipelines, which are wrapped in stages. Before, they were not wrapped in stages and the pipeline worked well. Since moving to stages I have had the issue where the DownloadPipelineArtifact@2 task completes but downloads nothing (see screenshot).
The error I am getting at the end of the pipeline is:
##[error]Error: No package found with specified pattern: /home/vsts/work/1/website/**/*.zip
Check if the package mentioned in the task is published as an artifact in the build or a previous stage and downloaded in the current job.
I have tried the following solutions without success:
File pattern for Publish Pipeline Artifact in Azure DevOps
how to use PublishPipelineArtifact@1 with build script
And consulted the MS doc: https://learn.microsoft.com/en-us/azure/devops/pipelines/artifacts/pipeline-artifacts?view=azure-devops&tabs=yaml
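For reference, here is a minimal sketch (not the pipeline above; the stage, job, and artifact names are invented for illustration) of a publish/download pairing across stages in the same run. DownloadPipelineArtifact@2 defaults to the current run, and the patterns filter is applied to paths inside the downloaded artifact, so it only matches if the artifact actually contains that folder structure:
stages:
- stage: build
  jobs:
  - job: build_job
    steps:
    - task: PublishPipelineArtifact@1
      inputs:
        targetPath: '$(Pipeline.Workspace)/website/'   # contents of this folder become the artifact root
        artifact: 'Website'
- stage: deploy
  dependsOn: build
  jobs:
  - job: deploy_job
    steps:
    - task: DownloadPipelineArtifact@2
      inputs:
        artifact: 'Website'                    # must match the published artifact name exactly
        path: '$(Pipeline.Workspace)/website'  # files land directly under this folder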

How to fix 401 when deploying Maven build to Azure Artifacts via Azure Pipeline?

I've created a feed in Azure Artifacts (azure-maven), added the MavenAuthenticate task to the build pipeline (with artifactsFeeds: azure-maven), added mavenAuthenticateFeed: true to the Maven task, and added the repository to the pom.xml with the same ID, but when deploying, the Maven task fails with 401 (Unauthorized).
Is there a step I have missed?
(I don't want to use a PAT if I don't have to; isn't that the reason to use the MavenAuthenticate task?)
Cheers,
Steve
Hmmm, looks like the issue was actually setting mavenAuthenticateFeed to true, which doesn't make sense to me :(
Here is the working yml:
trigger:
- main
variables:
- name: MAVEN_CACHE_FOLDER
  value: $(Pipeline.Workspace)/.m2/repository
- name: MAVEN_OPTS
  value: -Dmaven.repo.local=$(MAVEN_CACHE_FOLDER)
pool:
  vmImage: 'ubuntu-latest'
steps:
- task: MavenAuthenticate@0
  inputs:
    artifactsFeeds: 'azure-maven'
- task: Cache@2
  inputs:
    key: 'maven | "$(Agent.OS)" | pom.xml'
    path: '$(MAVEN_CACHE_FOLDER)'
    cacheHitVar: 'CacheRestored'
    restoreKeys: |
      maven | "$(Agent.OS)"
      maven
  displayName: Cache Maven local repo
- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'package deploy'
    publishJUnitResults: true
    testResultsFiles: '**/surefire-reports/TEST-*.xml'
    javaHomeOption: 'JDKVersion'
    mavenVersionOption: 'Default'
    mavenOptions: '-Xmx3072m $(MAVEN_OPTS)'
    mavenAuthenticateFeed: false
    jdkVersionOption: '1.8'
    jdkArchitectureOption: 'x64'
    effectivePomSkip: false
    sonarQubeRunAnalysis: false
- task: CopyFiles@2
  inputs:
    Contents: '**/target/*.jar'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
    CleanTargetFolder: true
    flattenFolders: true
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'Test'
    publishLocation: 'Container'
And if supplying your own settings.xml:
<servers>
  <server>
    <id>azure-maven</id>
    <username>AzureDevOps</username>
    <password>${env.SYSTEM_ACCESSTOKEN}</password>
  </server>
</servers>
You need to set up the SYSTEM_ACCESSTOKEN:
- task: DownloadSecureFile@1
  name: mvnSettings
  inputs:
    secureFile: 'settings.xml'
- task: Maven@3
  env:
    SYSTEM_ACCESSTOKEN: $(System.AccessToken)
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'clean deploy'
    publishJUnitResults: true
    testResultsFiles: '**/surefire-reports/TEST-*.xml'
    javaHomeOption: 'JDKVersion'
    mavenVersionOption: 'Default'
    options: '-s $(mvnSettings.secureFilePath)'
    mavenOptions: '-Xmx3072m $(MAVEN_OPTS)'
    mavenAuthenticateFeed: false
    jdkVersionOption: '1.8'
    jdkArchitectureOption: 'x64'
    effectivePomSkip: false
    sonarQubeRunAnalysis: false

Azure: starting 2 self-hosted agents that deliver files to an Azure managed agent, both in the pool, but only 1 location used by the CopyFiles task

I have very strange behavior: I start 2 self-hosted agents from my AWS server to deliver files to the managed agent in Azure, all with different names so they are unique. But in the Azure pipeline I can see that both pipelines take the files from only one place, instead of each taking files from its own place on the self-hosted server.
Here are the commands I use to start the self-hosted Docker agents; they all take the form:
docker run -d --rm -t \
  --name=ios_docker_128_linux_slave_2 \
  -e AZP_WORK=/var/jenkins/jenkins_slave/workspace/run_build_ios@2/128/ios_build_temp/working_dir \
  -v /var/jenkins/jenkins_slave/workspace/run_build_ios@2/128/ios_build_temp/working_dir:/azp \
  -e AZP_URL=https://dev.azure.com/xxx \
  -e AZP_TOKEN=xxxx \
  -e AZP_AGENT_NAME=ios_docker_128_linux_slave_2 \
  xxx.xxx.com:1/azure_self_hosted_agent/agent:latest

docker run -d --rm -t \
  --name=ios_docker_127_linux_slave_2 \
  -e AZP_WORK=/var/jenkins/jenkins_slave/workspace/run_build_ios/127/ios_build_temp/working_dir \
  -v /var/jenkins/jenkins_slave/workspace/run_build_ios/127/ios_build_temp/working_dir:/azp \
  -e AZP_URL=https://dev.azure.com/xxx \
  -e AZP_TOKEN=xxxx \
  -e AZP_AGENT_NAME=ios_docker_127_linux_slave_2 \
  xxx.xxx.com:1/azure_self_hosted_agent/agent:latest
But in the Azure pipeline they both take the files from the same place. Note that they all take the files from /run_build_ios/127 (highlighted in the screenshots):
/var/jenkins/jenkins_slave/workspace/run_build_ios/127/ios_build_temp/working_dir/_temp/
Here is what my pipeline looks like:
pool:
  vmImage: 'macOS 10.14'
parameters:
- name: Folderpath
  type: string
  displayName: 'configure path'
- name: FolderCompile
  type: string
  displayName: 'Compile ios products path'
- name: projectName
  type: string
  displayName: 'ios projectName'
- name: appIdentifier
  type: string
  displayName: 'ios appIdentifier'
- name: versionNumber
  type: string
  displayName: 'ios versionNumber'
- name: buildNumber
  type: string
  displayName: 'ios buildNumber'
- name: plistFileFtpBasePath
  type: string
  displayName: 'ios plistFileFtpBasePath'
- name: fastlaneAppleSession
  type: string
  displayName: 'ios fastlaneAppleSession'
variables:
  scheme: ''
  sdk: 'iphoneos'
  configuration: 'Release'
  CERTIFICATE_PASSWORD: xxxx
  FASTLANE_PASSWORD: xxxx
  FASTLANE_SESSION: '${{parameters.fastlaneAppleSession}}'
jobs:
- job: self_hosted_connect
  timeoutInMinutes: 10
  pool: Default
  steps:
  - task: CopyFiles@2
    inputs:
      SourceFolder: '$(Agent.HomeDirectory)/../${{parameters.Folderpath}}'
      Contents: '**'
      TargetFolder: '$(build.artifactstagingdirectory)'
  - task: PublishBuildArtifacts@1
    inputs:
      pathToPublish: '$(build.artifactstagingdirectory)'
      artifactName: 'ios_artifacts'
- job: mac_agent
  dependsOn: self_hosted_connect
  timeoutInMinutes: 10
  pool:
    vmImage: 'macOS 10.14'
  steps:
  - task: UseRubyVersion@0
    inputs:
      versionSpec: '>= 2.4'
      addToPath: true
  - task: DownloadBuildArtifacts@0
    inputs:
      buildType: 'current'
      downloadType: 'single'
      artifactName: 'ios_artifacts'
      downloadPath: '$(System.ArtifactsDirectory)'
  - script: |
      gem install --no-document bundler
      bundle update --bundler
      bundle install --retry=3 --jobs=4
      gem install --no-document fastlane
      mkdir fastlane
      mv Fastfile fastlane
      mv Appfile fastlane
      pod deintegrate
      gem install cocoapods
      pod install
      pod --version
      fastlane release --verbose projectName:${{parameters.projectName}} appIdentifier:${{parameters.appIdentifier}} versionNumber:${{parameters.versionNumber}} buildNumber:${{parameters.buildNumber}} plistFileFtpBasePath:${{parameters.plistFileFtpBasePath}} ArtifactsDirectory:$(System.ArtifactsDirectory)
    workingDirectory: '$(System.ArtifactsDirectory)/ios_artifacts'
    displayName: 'create_keychain'
  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: '$(Build.ArtifactStagingDirectory)'
      ArtifactName: 'Artifacts'
      publishLocation: 'Container'
- job: copy_back_files_to_self_hosted_connect
  dependsOn: mac_agent
  timeoutInMinutes: 10
  pool: Default
  steps:
  - task: DownloadBuildArtifacts@0
    inputs:
      buildType: 'current'
      downloadType: 'single'
      artifactName: 'Artifacts'
      itemPattern: '**/*.ipa|manifest.plist'
      downloadPath: '$(System.ArtifactsDirectory)'
  - task: CopyFiles@2
    inputs:
      SourceFolder: '$(System.ArtifactsDirectory)'
      Contents: '**/*.ipa|manifest.plist'
      TargetFolder: '$(Agent.HomeDirectory)/../${{parameters.FolderCompile}}'
Also, here you can see I have 3 agents (the names are a bit different because I took the picture later, but there are always 3 up).
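A side note, as an assumption on my part rather than something from the original post: when several self-hosted agents share the Default pool, a job can be pinned to a specific agent with a demand on Agent.Name, so the CopyFiles step runs against the workspace of the machine you actually intend, roughly like this:
- job: self_hosted_connect
  timeoutInMinutes: 10
  pool:
    name: Default
    demands:
    - Agent.Name -equals ios_docker_127_linux_slave_2   # pin this job to one specific agent
  steps:
  - task: CopyFiles@2
    inputs:
      SourceFolder: '$(Agent.HomeDirectory)/../${{ parameters.Folderpath }}'
      Contents: '**'
      TargetFolder: '$(Build.ArtifactStagingDirectory)'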

Trying to deploy an Ionic 5 / Angular 8 PWA to AWS EC2 from Azure DevOps YAML

I am new to Azure DevOps. I am trying to deploy my Ionic 5 Angular application to Amazon EC2 via Azure Pipelines. Whenever I try to run the pipeline it gives this error: "Encountered error(s) while parsing pipeline YAML:
/build-ci.yml (Line: 2, Col: 3): A mapping was not expected".
This is the YAML file code. I am quite stuck here. Please help.
trigger:
- task: CodeDeployDeployApplication@1
  inputs:
    awsCredentials: 'AWS Service Con'
    regionName: 'us-east-1'
    applicationName: 'Dev-erp-Frontend'
    deploymentGroupName: 'Dev-erp-Frontend'
    deploymentRevisionSource: 'workspace'
    revisionBundle: 'Dev-erp-Frontend-Rev'
    bucketName: 'Dev-erp-Frontend'
    fileExistsBehavior: 'OVERWRITE'
  batch: "true"
  branches:
    include:
    - master
    - dev
  paths:
    include:
    - ./*
pr:
  branches:
    include:
    - master
    - dev
  paths:
    include:
    - ./*
jobs:
- job: Build_Job
  displayName: Build
  pool:
    vmImage: 'ubuntu-18.04'
    demands:
    - npm
  steps:
  - checkout: self
    clean: false
  # - powershell: 'npm cache clean --force'
  #   displayName: 'PowerShell Script'
  #   env:
  #     APPDATA: npm-cache
  - task: Npm@1
    displayName: 'Npm Install'
    inputs:
      workingDir: "./"
      command: "ci"
  - task: Npm@1
    displayName: 'Lint Client App'
    inputs:
      workingDir: "./"
      command: "custom"
      customCommand: "run lint"
    continueOnError: true
  - task: Npm@1
    displayName: 'Copy Assets'
    inputs:
      workingDir: "./"
      command: "custom"
      customCommand: "run copy-files"
    continueOnError: false
  - task: Npm@1
    displayName: 'Build Client App'
    inputs:
      workingDir: "./"
      command: "custom"
      customCommand: "run build:prod"
  # Archive files
  - task: ArchiveFiles@2
    inputs:
      rootFolderOrFile: '$(Build.BinariesDirectory)'
      includeRootFolder: true
      archiveType: 'zip' # Options: zip, 7z, tar, wim
      archiveFile: '$(Build.ArtifactStagingDirectory)/www.zip'
      replaceExistingArchive: true
  - task: CopyFiles@2
    inputs:
      Contents: 'www/**'
      TargetFolder: '$(build.artifactstagingdirectory)'
  - task: PublishBuildArtifacts@1
    displayName: 'Publish Artifact'
    inputs:
      PathtoPublish: '$(build.artifactstagingdirectory)'
      ArtifactName: 'titas-ecom-erp'
We cannot set the CodeDeployDeployApplication@1 task at the trigger level; we should add the task at the steps level. You can update your YAML definition as follows:
trigger:
  branches:
    include:
    - master
    - dev
  paths:
    include:
    - ./*
pr:
  branches:
    include:
    - master
    - dev
  paths:
    include:
    - ./*
jobs:
- job: Build_Job
  displayName: Build
  pool:
    vmImage: 'ubuntu-18.04'
    demands:
    - npm
  steps:
  - checkout: self
    clean: false
  # - powershell: 'npm cache clean --force'
  #   displayName: 'PowerShell Script'
  #   env:
  #     APPDATA: npm-cache
  - task: CodeDeployDeployApplication@1
    inputs:
      awsCredentials: 'AWS Service Con'
      regionName: 'us-east-1'
      applicationName: 'Dev-erp-Frontend'
      deploymentGroupName: 'Dev-erp-Frontend'
      deploymentRevisionSource: 'workspace'
      revisionBundle: 'Dev-erp-Frontend-Rev'
      bucketName: 'Dev-erp-Frontend'
      fileExistsBehavior: 'OVERWRITE'
      batch: "true"
  - task: Npm@1
    displayName: 'Npm Install'
    inputs:
      workingDir: "./"
      command: "ci"
  - task: Npm@1
    displayName: 'Lint Client App'
    inputs:
      workingDir: "./"
      command: "custom"
      customCommand: "run lint"
    continueOnError: true
  - task: Npm@1
    displayName: 'Copy Assets'
    inputs:
      workingDir: "./"
      command: "custom"
      customCommand: "run copy-files"
    continueOnError: false
  - task: Npm@1
    displayName: 'Build Client App'
    inputs:
      workingDir: "./"
      command: "custom"
      customCommand: "run build:prod"
  # Archive files
  - task: ArchiveFiles@2
    inputs:
      rootFolderOrFile: '$(Build.BinariesDirectory)'
      includeRootFolder: true
      archiveType: 'zip' # Options: zip, 7z, tar, wim
      archiveFile: '$(Build.ArtifactStagingDirectory)/www.zip'
      replaceExistingArchive: true
  - task: CopyFiles@2
    inputs:
      Contents: 'www/**'
      TargetFolder: '$(build.artifactstagingdirectory)'
  - task: PublishBuildArtifacts@1
    displayName: 'Publish Artifact'
    inputs:
      PathtoPublish: '$(build.artifactstagingdirectory)'
      ArtifactName: 'titas-ecom-erp'
