I'm new to Azure Pipelines. I'm currently working with webMethods (Software AG), a middleware application, and setting up CI/CD for it using Azure. Out of the box, webMethods provides an ANT build script; using it we can build and deploy the application together with its configuration properties files.
I have created an Azure Pipelines YAML file to build and deploy my application. Please find the sample build script below.
trigger:
- none

pool:
  name: default

parameters:
- name: projectName
  displayName: appname?
  type: string

steps:
- bash: |
    mkdir -p /home/user/devops/azure-source/build/packages
  displayName: 'create directory to copy source code'

- task: CopyFiles@2
  displayName: 'Copy Files'
  inputs:
    SourceFolder: '${{ parameters.projectName }}'
    TargetFolder: '/home/user/devops/azure-source/build/packages'

- bash: |
    mkdir -p /home/user/devops/azure-source/build/packages/$(Build.BuildNumber)
    /home/viswa/softwareAg/ant/bin/build.sh
  displayName: 'create build'

- bash: |
    echo "name=packages" > $(Build.ArtifactStagingDirectory)/packagename.properties
    echo "BuildNumber=$(Build.BuildNumber)" > $(Build.ArtifactStagingDirectory)/BuildNumber.txt
  displayName: 'create package'

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact: drop'
After a successful build I can download the artifacts, which contain the compiled source code and the build number. I have attached sample output files.
BuildNumber.txt
BuildNumber=20220929.1
packagename.properties
name=myapp
Now the question: I have to create another YAML file, let's name it rollback.yaml, as a new pipeline, and I want to roll back my dev server to the last build number (20220929.1), which also contains the last package name (myapp).
Can someone please assist me with how we can achieve this easily using Azure artifacts and the build number?
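One possible direction, offered as a sketch rather than a tested answer: pass the build id in as a runtime parameter, pull that run's artifact with DownloadBuildArtifacts@0 (the same task used elsewhere in this thread), and redeploy. The pipeline name placeholder and deploy.sh entry point below are assumptions; substitute your actual build pipeline and SAG ANT deploy target.

# rollback.yaml -- a sketch, assuming the build pipeline published its
# artifact as 'drop' and deploy.sh is a hypothetical ANT deploy wrapper.
parameters:
- name: buildIdToRollback
  displayName: 'Build id of the run to roll back to (e.g. 1234)'
  type: string

trigger: none

pool:
  name: default

steps:
- task: DownloadBuildArtifacts@0
  displayName: 'Download artifact from the old build'
  inputs:
    buildType: 'specific'
    project: '$(System.TeamProject)'
    pipeline: '<build-pipeline-name>'   # replace with your build pipeline
    buildVersionToDownload: 'specific'
    buildId: '${{ parameters.buildIdToRollback }}'
    downloadType: 'single'
    artifactName: 'drop'
    downloadPath: '$(System.ArtifactsDirectory)'

- bash: |
    # Read the package name recorded by the build, then redeploy it.
    source $(System.ArtifactsDirectory)/drop/packagename.properties
    echo "Rolling back package $name"
    /home/viswa/softwareAg/ant/bin/deploy.sh "$name"   # hypothetical deploy entry point
  displayName: 'Redeploy previous package'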
I'm working with an Azure pipeline to check out source code from an Azure repo and execute the inbuilt setup script provided by webMethods (SAG). Using build.yaml I can build my application, but I'm not able to publish the artifacts.
cat build.yaml
trigger:
- devops-build

pool:
  name: CICD

steps:
# Create target directory to keep the git repo for later use
- bash: |
    mkdir -p /home/user/cicd_source/dev/packages/packages
  displayName: 'create directory'

- bash: |
    echo "webname=${{ parameters.projectName }}" > $(Build.ArtifactStagingDirectory)/devpackagename.properties
    echo "BuildNumber=$(Build.BuildNumber)" > $(Build.ArtifactStagingDirectory)/devBuildNumber.txt
The above script creates devpackagename.properties and devBuildNumber.txt at the following path inside my self-hosted agent's work location:
pwd
/home/user/agent/CICD/_work/1/a
ls -lrt
devpackagename.properties
devBuildNumber.txt
cat devpackagename.properties
webname=package
cat devBuildNumber.txt
BuildNumber=20221004.83
After running the pipeline successfully, I don't see any artifacts published on my pipeline.
After your build steps, add the task below:
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Build.ArtifactStagingDirectory)'
    artifact: 'drop'
    publishLocation: 'pipeline'
You should then see the artifact published on the pipeline.
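As a follow-up sketch, if a later job or pipeline needs those files back, a DownloadPipelineArtifact@2 step along these lines would retrieve them (the artifact name 'drop' matches the publish step above):

# Sketch: retrieve the published files in a later job or pipeline.
- task: DownloadPipelineArtifact@2
  inputs:
    buildType: 'current'      # or 'specific' to pull from another run
    artifactName: 'drop'
    targetPath: '$(Pipeline.Workspace)/drop'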
I've been following Microsoft's documentation for the new YAML CI/CD Data Factory deployment. However, I keep getting a validation error when trying to implement CI/CD pipelines for Azure Data Factory with a YAML file, after the installation of npm.
I get the following error message:
Error: Command failed: node /home/vsts/work/1/s/DataFactories/joaov1/build/downloads/main.js validate /home/vsts/work/1/s DataFactories/joaov1/subscriptions/000000-0000-0000-0000-864d437bd294/resourceGroups/rg-hi-joaov1-dev/providers/Microsoft.DataFactory/factories/adf-client-joaov1-dev
Execution finished....
Found npm debug log, make sure the path matches with the one in npm's output: /home/vsts/.npm/_logs/2021-12-01T21_20_55_974Z-debug.log
I'm quite confident that it is an issue with the path, but after hours of research online I still can't get it to work...
My file structure is the following (screenshot omitted), and my code is as follows:
# Sample YAML file to validate and export an ARM template into a build artifact
# Requires a package.json file located in the target repository

trigger:
- master

# parameters:
# - name: 'dev'
# - name: 'test'
# - name: 'prod'

pool:
  vmImage: 'ubuntu-latest'

steps:
# Installs Node and the npm packages saved in your package.json file in the build
- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'

- task: Npm@1
  inputs:
    command: 'install'
    workingDir: '$(Build.Repository.LocalPath)/DataFactories/joaov1/build' # replace with the package.json folder
    verbose: true
  displayName: 'Install npm package'

# Validates all of the Data Factory resources in the repository. You'll get the same validation errors as when "Validate All" is selected.
# Enter the appropriate subscription and name for the source factory.
- task: Npm@1
  inputs:
    command: 'custom'
    workingDir: '$(Build.Repository.LocalPath)/DataFactories/joaov1/build' # replace with the package.json folder
    customCommand: 'run build validate $(Build.Repository.LocalPath)/DataFactories/joaov1/subscriptions/000000-0000-0000-974d-864d437bd294/resourceGroups/rg-hi-joaov1-dev/providers/Microsoft.DataFactory/factories/adf-client-joaov1-dev'
  displayName: 'Validate'

# Validate and then generate the ARM template into the destination folder, which is the same as selecting "Publish" from the UX.
# The ARM template generated isn't published to the live version of the factory. Deployment should be done by using a CI/CD pipeline.
- task: Npm@1
  inputs:
    command: 'custom'
    workingDir: '$(Build.Repository.LocalPath)/DataFactories/joaov1/build' # replace with the package.json folder
    customCommand: 'run build export $(Build.Repository.LocalPath)/DataFactories/joaov1/subscriptions/000000-0000-4aab-974d-864d437bd294/resourceGroups/rg-hi-joaov1-dev/providers/Microsoft.DataFactory/factories/adf-client-joaov1-dev "ArmTemplate"'
  displayName: 'Validate and Generate ARM template'

# Publish the artifact to be used as a source for a release pipeline.
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: '$(Build.Repository.LocalPath)/DataFactories/joaov1/build/ArmTemplate' # replace with the package.json folder
    artifact: 'ArmTemplates'
    publishLocation: 'pipeline'
Any help would be great,
Thank you so much,
Joao
Your ADF build task's Node version should be changed from 10.x to 14.x:
- task: NodeTool@0
  inputs:
    versionSpec: '14.x'
  displayName: 'Install Node.js'
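For context, the package.json that the pipeline's comment refers to typically looks like the sketch below; the version number is an assumption, so check the current @microsoft/azure-data-factory-utilities release:

{
  "scripts": {
    "build": "node node_modules/@microsoft/azure-data-factory-utilities/lib/index"
  },
  "dependencies": {
    "@microsoft/azure-data-factory-utilities": "^1.0.0"
  }
}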
I have created an Azure agent environment on a virtual machine, where I download builds to a folder. I have the following YAML file:
# Python package
# Create and test a Python package on multiple Python versions.
# Add steps that analyze code, save the dist with the build record, publish to a PyPI-compatible index, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/python

trigger:
  paths:
    include:
    - Flytteordre

pool:
  vmImage: 'ubuntu-latest'
  name: Azure Pipelines

variables:
  python.version: '3.6'

stages:
- stage: Build
  displayName: Build stage
  jobs:
  - job: build
    displayName: build
    steps:
    - task: UsePythonVersion@0
      displayName: 'Use Python $(python.version) copy'
      inputs:
        versionSpec: '$(python.version)'

    # We actually don't need to install these dependencies here. It needs to happen in the deploy yaml file.
    - task: CmdLine@2
      inputs:
        script: |
          python -m pip install --upgrade pip
          python -m pip install selenium
          python -m pip install pdfplumber
          python -m pip install pandas
      displayName: 'Install dependencies'

    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: dist'
      inputs:
        PathtoPublish: Flytteordre
        ArtifactName: dist

- stage: Deployment
  displayName: Deployment stage
  jobs:
  - deployment: deploymentJob
    displayName: Deploy
    environment:
      name: Production
      resourceType: VirtualMachine
    strategy:
      runOnce:
        deploy:
          steps:
          - download: none
          - downloadBuild: none
          - task: DownloadBuildArtifacts@0
            inputs:
              buildType: 'current'
              downloadType: 'single'
              artifactName: 'dist'
              downloadPath: 'C:/azagent/A1/_work/myfolder/'
My problem is that each time I run the pipeline, it creates a folder inside the _work directory named 1, 2, etc.
I would like to avoid this so that I am in full control of which folders are created. As you can see, I have indicated that my artifact should be downloaded to the folder path C:/azagent/A1/_work/myfolder/, yet this creates two folders: the one I specified, plus a folder whose name is a number. I know this is the default, but I would like to know whether there is a way to turn this default off, or at the very least to change the predefined path variable Pipeline.Workspace or Agent.BuildDirectory.
How to avoid creating new folder in Azure Agent for Pipeline in Environment
According to the document Agent variables, each build definition gets its own directory within the agent's working directory.
Whenever we create a new pipeline, a number is incremented to name the newly created work folder. The advantage is that multiple running builds are guaranteed not to step on each other, which they sooner or later would if they shared the same copy of the repository.
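For orientation, here is roughly what that layout looks like on a self-hosted Windows agent (subfolder meanings follow the standard agent conventions; the question's own path _work/1/a is the staging folder of pipeline 1):

C:\azagent\A1\_work\
  1\        <- Agent.BuildDirectory / Pipeline.Workspace for one pipeline
    s\      <- Build.SourcesDirectory
    a\      <- Build.ArtifactStagingDirectory
    b\      <- Build.BinariesDirectory
  2\        <- a second pipeline definition gets its own numbered folder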
I would prefer the option of being able to name the directory myself
in case I need to find a particular project and open it on the local
agent.
If you want to open it on the local agent, you can simply use the agent variables to resolve the path:
Agent.BuildDirectory
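If a predictable folder name is the real goal, one workaround (a sketch, assuming a Windows agent and a writable target path; 'myfolder' is the hypothetical target from the question) is to let the task download into the default numbered location and then copy the artifact to a folder you name yourself:

# Sketch: copy the downloaded artifact out of the numbered work folder
# into a stable, self-chosen location.
- task: DownloadBuildArtifacts@0
  inputs:
    buildType: 'current'
    downloadType: 'single'
    artifactName: 'dist'
    downloadPath: '$(Agent.BuildDirectory)'

- task: CopyFiles@2
  inputs:
    SourceFolder: '$(Agent.BuildDirectory)/dist'
    TargetFolder: 'C:/azagent/A1/_work/myfolder'
    OverWrite: true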
Environment
Azure DevOps (code repo and pipeline trigger)
AWS ECR/ECS (target deployment platform)
Docker
.NET Core Web application (v5.0)
Current Situation
We presently build the application using dotnet build (via a PowerShell script) and push the zip file to Azure DevOps Artifacts using azurepipeline.yml. This works out fine. I have added another task for an ECR push, which pushes a generated Docker image to ECR using a Dockerfile in the source code.
Business Problem
We want to be able to choose a specific build (e.g. 0.1.24) in the Azure Artifacts feed (using a variable to provide the version number) and generate a Docker build using the corresponding binaries and the Dockerfile. I am unable to find a way to do so. The specific tasks are as follows:
The deploy user updates the variable "versionNoToDeploy" with the artifact id or name
The deploy user runs a specific pipeline
The pipeline finds the artifact (assuming it's valid, else raises an error) and unzips the package at a temp location (need help on this)
The pipeline runs the Dockerfile to build the image (known & working)
The pipeline pushes this image to ECR (known & working)
The purpose is to keep building the branch until we get stable builds. The build is deployed to a test server manually and tested. Once the build is certified, it needs to be pushed to the production ECR/ECS instances.
Our pipeline (specific code only)
- pwsh: ./build.ps1 --target Clean Protolint Compile --runtime $(runtime)
  displayName: ⚙️ Compile

- task: Docker@2
  displayName: Build
  inputs:
    command: build
    repository: appRepo
    tags: |
      $(Build.BuildId)
      deploy
    addPipelineData: true
    Dockerfile: src\DockerfileNew

- task: ECRPushImage@1
  inputs:
    awsCredentials: 'AWS ECR Connection'
    regionName: 'ap-south-1'
    imageSource: 'imagename'
    sourceImageName: 'myApplication'
    sourceImageTag: 'deploy'
    repositoryName: 'devops-demo'
    outputVariable: 'imageTagOutputVar'

- pwsh: ./build.ps1 --target Test Coverage --skip
  displayName: 🚦 Test

- pwsh: ./build.ps1 --target BuildImage Pack --runtime $(runtime) --skip
  displayName: 📦 Pack

- pwsh: ./build.ps1 --target Publish --runtime $(runtime) --skip
  displayName: 🚚 Publish
Artifact details
Any specific aspects needed can be provided
Finally, after playing a lot with the pipeline and custom-tweaking the individual steps, I came up with the following (excerpted YAML).
It involves storing the build version in a variable, which is referenced in each of the pipeline steps.
The admin has to decide whether they want a general build producing an artifact, or just to deploy a specific build to AWS. The variable holding the build id is evaluated conditionally, and based on that the steps are executed or bypassed.
- pwsh: ./build.ps1 --target Clean Protolint Compile --runtime $(runtime)
  condition: eq(variables['artifactVersionToPush'], '')
  displayName: ⚙️ Compile

- task: DownloadBuildArtifacts@0
  condition: ne(variables['artifactVersionToPush'], '')
  inputs:
    buildType: 'specific'
    project: 'NitinProj'
    pipeline: 'NitinProj'
    buildVersionToDownload: specific
    buildId: $(artifactVersionToPush)
    downloadType: 'single'
    artifactName: 'app'
    downloadPath: '$(System.ArtifactsDirectory)' # this needs to be set, since the default is the build directory

- task: ExtractFiles@1
  displayName: Extract Artifact to temp location
  condition: ne(variables['artifactVersionToPush'], '')
  inputs:
    archiveFilePatterns: '$(System.ArtifactsDirectory)/app/*.zip' # path may need updating
    cleanDestinationFolder: false
    overwriteExistingFiles: true
    destinationFolder: src

- task: Docker@2
  displayName: Docker Build image with compiled code in artifact
  condition: ne(variables['artifactVersionToPush'], '')
  inputs:
    command: build
    repository: myApp
    tags: |
      $(Build.BuildId)
      deploy
    addPipelineData: true
    Dockerfile: src\DockerfileNew

- task: ECRPushImage@1
  displayName: Push built image to AWS ECR
  condition: ne(variables['artifactVersionToPush'], '')
  inputs:
    awsCredentials: 'AWS ECR Connection'
    regionName: 'ap-south-1'
    imageSource: 'imagename'
    sourceImageName: 'myApp'
    sourceImageTag: 'deploy'
    pushTag: '$(Build.BuildId)'
    repositoryName: 'devops-demo'
    outputVariable: 'imageTagOutputVar'

- pwsh: ./build.ps1 --target Test Coverage --skip
  condition: eq(variables['artifactVersionToPush'], '')
  displayName: 🚦 Test

- pwsh: ./build.ps1 --target BuildImage Pack --runtime $(runtime) --skip
  condition: eq(variables['artifactVersionToPush'], '')
  displayName: 📦 Pack

- pwsh: ./build.ps1 --target Publish --runtime $(runtime) --skip
  condition: eq(variables['artifactVersionToPush'], '')
  displayName: 🚚 Publish
I will update this YAML to have the steps organized into jobs, but that's an optimization story. :)
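One detail the excerpt doesn't show is where artifactVersionToPush comes from. A sketch of declaring it as a runtime parameter mapped into a variable (so the conditions above keep working, and the deploy user can set it at queue time) could look like this; the names simply mirror the ones used above:

# Sketch: declare the value the conditions above test.
# Leaving it empty runs the normal build; filling it in at queue time
# (e.g. with a buildId like 1234) switches the run into deploy mode.
parameters:
- name: artifactVersionToPush
  displayName: 'Build id to deploy (leave empty for a normal build)'
  type: string
  default: ''

variables:
  artifactVersionToPush: ${{ parameters.artifactVersionToPush }}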
Since it involves manual intervention here, you may consider splitting the workflow into several jobs, like this:
jobs:
- job: BuildAndDeployToTest
  steps:
  - bash: echo "A"

- job: waitForValidation
  dependsOn: BuildAndDeployToTest   # gate runs only after the test deployment
  displayName: Wait for external validation
  pool: server
  timeoutInMinutes: 4320 # job times out in 3 days
  steps:
  - task: ManualValidation@0
    timeoutInMinutes: 1440 # task times out in 1 day
    inputs:
      notifyUsers: |
        test@test.com
        example@example.com
      instructions: 'Please validate the build configuration and resume'
      onTimeout: 'resume'

- job: DeployToProd
  dependsOn: waitForValidation      # prod deploy waits for the gate
  steps:
  - bash: echo "B"
This is not exactly what you want in terms of involving variables, but you will be able to achieve your goal: wait for validation, and deploy to prod only validated builds.
It relies on the ManualValidation task.
Another approach could be to use a deployment job and approvals:
jobs:
- job: BuildAndDeployToTest
  steps:
  - bash: echo "A"

# Track deployments on the environment.
- deployment: DeployToProd
  displayName: deploy Web App
  pool:
    vmImage: 'Ubuntu-16.04'
  # Creates an environment if it doesn't exist.
  environment: 'PROD'
  strategy:
    # Default deployment strategy, more coming...
    runOnce:
      deploy:
        steps:
        - checkout: self
        - script: echo my first deployment
For this you need to define an environment and configure an approval on it.
Either way, you will get a clear picture of what was delivered to prod, along with information on who approved the PROD deployment.
To make it short, I have a React application that's built on an Azure DevOps build pipeline like so:
trigger:
- Release

queue:
  name: Hosted
  demands: npm

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'

- script: |
    npm ci
    npm run build
  displayName: 'Do npm ci and build'
This does what the build would do locally too. Locally the results of the build go to the build directory (unzipped).
Now when I try to create a release pipeline, there isn't an artifact to be found for the Azure App Service Deploy task.
If I try to add PublishPipelineArtifact@0 to the build pipeline to create a publication, the YAML editor just says string does not match pattern... and doesn't let me save the definition.
I suppose I should zip the contents of the generated build directory, but what would be the right way? Also, is using the Azure App Service Deploy task the right way to deploy an Azure WebApp? It works for ASP.NET Core apps, as it finds the code drop artifact (zipped) and deploys it.
Edit: Adding
- task: PublishPipelineArtifact@0
  inputs:
    artifactName: 'drop'
    targetPath: '$(Build.ArtifactStagingDirectory)/build'
can actually be saved and the build run, though it errors with:
2019-01-25T22:42:27.6896518Z ##[section]Starting: PublishPipelineArtifact
2019-01-25T22:42:27.6898909Z ==============================================================================
2019-01-25T22:42:27.6898962Z Task         : Publish Pipeline Artifact
2019-01-25T22:42:27.6899006Z Description  : Publish Pipeline Artifact
2019-01-25T22:42:27.6899034Z Version      : 0.139.0
2019-01-25T22:42:27.6899062Z Author       : Microsoft Corporation
2019-01-25T22:42:27.6899090Z Help         : Publish a local directory or file as a named artifact for the current pipeline.
2019-01-25T22:42:27.6899137Z ==============================================================================
2019-01-25T22:42:28.0499917Z ##[error]Path not exist: D:\a\1\a\build
2019-01-25T22:42:28.0708878Z ##[section]Finishing: PublishPipelineArtifact
Edit 2: It appears removing the /build does the trick. Feels a tad odd, since this is what's produced locally. It doesn't produce a zip file, which is expected by the release job, namely Azure App Service Deploy.
I'll have to examine this problem later today (it's two o'clock at night here).
There was another problem: the script ran only npm ci and didn't run the build part. Separating it into two different steps made the difference. It appears PublishPipelineArtifact@0 in this case isn't an ideal option if I'd like to have the results zipped in the staging area.
A currently working solution seems to be:
resources:
- repo: self

trigger:
- Release

queue:
  name: Hosted
  demands: npm

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'

- script: |
    npm ci
  displayName: 'npm ci'

- script: |
    npm run build
  displayName: 'npm run build'

- task: ArchiveFiles@2
  displayName: 'Archive files'
  inputs:
    rootFolderOrFile: '$(System.DefaultWorkingDirectory)/build'
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/artifact.zip'
    replaceExistingArchive: true

- task: PublishBuildArtifacts@1
  displayName: 'Publish Artifact'
  inputs:
    PathtoPublish: '$(build.artifactstagingdirectory)'
Maybe this can be simplified, but it works for now and feels flexible.
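One possible simplification, offered as a sketch rather than a tested setup: since ArchiveFiles@2 already produces a single zip, the newer PublishPipelineArtifact@1 can publish that file directly instead of publishing the whole staging directory (the artifact name 'drop' here is an assumption; use whatever your release expects):

# Sketch: publish the zip produced by ArchiveFiles@2 as a pipeline artifact.
- task: PublishPipelineArtifact@1
  displayName: 'Publish zipped build'
  inputs:
    targetPath: '$(Build.ArtifactStagingDirectory)/artifact.zip'
    artifact: 'drop'
    publishLocation: 'pipeline'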