Azure DevOps pipeline CI/CD

I am using the open-source project Magda (https://magda.io/docs/building-and-running) and want to build an Azure CI/CD pipeline for it.
For this project, there are some prerequisites, like having sbt + yarn + docker + java installed.
How can I specify those requirements in the azure-pipelines.yml file?
Is it possible in the azure-pipelines.yml file to just write scripts, without any use of jobs or tasks? And what is the difference between them (tasks, jobs, ...)?
(I'm currently starting with it, so I don't have much experience.)
That's my current azure-pipelines.yml file (if there is something wrong, please tell me):
# Node.js
# Build a general Node.js project with npm.
# Add steps that analyze code, save build artifacts, deploy, and more:
# https://learn.microsoft.com/azure/devops/pipelines/languages/javascript

trigger:
- release

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '10.0.0'
  displayName: 'Install Node.js'

- script: |
    npm install
    npm run build
  displayName: 'npm install and build'

- script: |
    curl -fsSL -o get_helm.sh https://raw.githubusercontent.com/helm/helm/master/scripts/get-helm-3
    chmod 700 get_helm.sh
    ./get_helm.sh
  displayName: 'Install Helm'

- script: |
    yarn global add lerna
    yarn global add @gov.au/pancake
    yarn install
  displayName: 'Install lerna & pancake packages'

- script: |
    # an exported variable only lives for this one step; use the setvariable
    # logging command so that later steps see NODE_OPTIONS too
    echo "##vso[task.setvariable variable=NODE_OPTIONS]--max-old-space-size=8192"
  displayName: 'Set env variable'

- script: |
    lerna run build --stream --concurrency=1 --include-dependencies
    lerna run docker-build-local --stream --concurrency=4 --include-filtered-dependencies
  displayName: 'Build lerna'

I recommend you read Key concepts for new Azure Pipelines users.
It is possible to put all your stuff in one script step, but as it stands you have a logical separation, and that makes the file easier to navigate and read than one really long step.
Here are some basics from the above-mentioned documentation:
A trigger tells a Pipeline to run.
A pipeline is made up of one or more stages. A pipeline can deploy to one or more environments.
A stage is a way of organizing jobs in a pipeline and each stage can have one or more jobs.
Each job runs on one agent. A job can also be agentless.
Each agent runs a job that contains one or more steps.
A step can be a task or script and is the smallest building block of a pipeline.
A task is a pre-packaged script that performs an action, such as invoking a REST API or publishing a build artifact.
An artifact is a collection of files or packages published by a run.
But I really recommend that you go through it.
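To make that hierarchy concrete, here is a minimal sketch of a pipeline that uses every level (the stage and job names are placeholders, not taken from your project):

trigger:            # the trigger tells the pipeline to run (here: on pushes to main)
- main

stages:
- stage: Build              # a stage groups jobs
  jobs:
  - job: BuildJob           # a job runs on one agent
    pool:
      vmImage: 'ubuntu-latest'
    steps:                  # steps are the smallest building blocks
    - task: NodeTool@0      # a task is a pre-packaged script
      inputs:
        versionSpec: '10.x'
    - script: echo "a script step is just an inline shell script"   # a script is also a step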
For this project, there are some prerequisites like having sbt + yarn + docker + java installed. How can I specify those requirements in the azure-pipelines.yml file?
If you are using Microsoft-hosted agents, you cannot specify demands.
Demands and capabilities apply only to self-hosted agents. When using Microsoft-hosted agents, you select an image for the hosted agent. You cannot use capabilities with hosted agents.
So if you need something that is not inside the agent image, you can install it in a step and then use that new piece of software. When your job is finished, the agent is restored to its original state. If you go for a self-hosted agent, you can specify demands, and an agent is assigned to your job based on its capabilities.
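For Magda's prerequisites specifically, here is a hedged sketch of how that install-then-use pattern can look on a hosted ubuntu-latest image. Yarn, Docker and Java are normally preinstalled there, so the sketch only verifies them and installs sbt; the sbt version and download URL are assumptions you should check against https://www.scala-sbt.org/download.html:

steps:
# yarn, docker and java are usually already on ubuntu-latest;
# print their versions so the run fails early and visibly if not
- script: |
    java -version
    yarn --version
    docker --version
  displayName: 'Check preinstalled prerequisites'

# sbt is not guaranteed to be present, so install it explicitly
- script: |
    curl -fsSL https://github.com/sbt/sbt/releases/download/v1.9.9/sbt-1.9.9.tgz | tar xz
    echo "##vso[task.prependpath]$PWD/sbt/bin"
  displayName: 'Install sbt'

- script: sbt --version
  displayName: 'Verify sbt'

If you later move to a self-hosted agent, the same requirements can instead be expressed as demands (the pool and capability names below are placeholders; they must match what your agents actually advertise):

pool:
  name: MyAgentPool
  demands:
  - java
  - yarn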

Related

How to access artifacts in next stage in GitLab CI/CD

I am trying to set up GitLab CI/CD for the first time. I have two stages, build and deploy. The job in the build stage produces artifacts, and the job in the deploy stage wants to upload those artifacts to AWS S3. Both jobs use the same runner but different Docker images.
default:
  tags:
    - dev-runner

stages:
  - build
  - deploy

build-job:
  image: node:14
  stage: build
  script:
    - npm install
    - npm run build:prod
  artifacts:
    paths:
      - deploy/build.zip

deploy-job:
  image: docker.xx/xx/gitlab-templates/awscli
  stage: deploy
  script:
    - aws s3 cp deploy/build.zip s3://mys3bucket
The build-job successfully creates the artifacts. The GitLab documentation says artifacts will be automatically downloaded and available in the next stage, but it does not specify where and how these artifacts will be available to consume in the next stage.
Question
In the deploy-job, will the artifacts be available at the same location, i.e. deploy/build.zip?
The artifacts should be available to the second job in the same location where the first job saved them, as declared in the artifacts directive.
I think this question already has an answer on the gitlab forum:
https://forum.gitlab.com/t/access-artifact-in-next-task-to-deploy/9295
You may also need to make sure the jobs run in the correct order using the dependencies directive, which is also mentioned in the forum discussion accessible via the link above.
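A minimal sketch of that, based on the deploy job from the question (nothing else changes): the dependencies keyword makes it explicit that deploy-job should fetch the artifacts produced by build-job, and the artifact is restored at the same relative path it was saved from:

deploy-job:
  image: docker.xx/xx/gitlab-templates/awscli
  stage: deploy
  dependencies:
    - build-job                                   # fetch artifacts from build-job only
  script:
    - aws s3 cp deploy/build.zip s3://mys3bucket  # same relative path as in build-job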

Azure DevOps Release Pipeline - Protractor UI tests suite

I have been tasked to work on an Azure DevOps implementation of an existing legacy application. The application has a QA team that uses 500+ automated test cases. These test cases have been developed using Protractor.
All test cases are written in JavaScript.
With the existing setup, these are the steps taken:
a. the release pipeline deploys an ASP.NET application to Azure App Services
b. a QA person manually logs on to a VM and initiates the Protractor tests.
Can we use any tasks from an Azure DevOps pipeline to test the same deployed application?
Can we use any tasks from an Azure DevOps pipeline to test the same deployed application?
To run the tests with Protractor, you can use a command line (script) step with the Node tool to run the tests.
If you want to use a Microsoft-hosted agent to run the tests, you need to install the appropriate version of the Node tool and the Protractor package before running the tests.
Here is an example:
steps:
- task: NodeTool@0
  displayName: 'Use Node 10.x'
  inputs:
    versionSpec: 10.x

- task: Npm@1
  displayName: 'npm install'
  inputs:
    workingDir: EndToEndTests/EndToEndTests
    verbose: false

- script: 'node $(build.sourcesdirectory)/EndToEndTests/EndToEndTests/node_modules/protractor/bin/webdriver-manager update --versions.chrome=xxxx.x.x.x'
  displayName: 'Update webdriver-manager'

- script: |
    npm run e2e # run the same script as you do on the VM
  displayName: 'Run Protractor tests'
For more detailed info, you could refer to this blog or this ticket.
On the other hand, if your test cases require more additional configuration, you can also install a self-hosted agent on the VM.
In this case, the QA person doesn't need to log on to the VM; they can run the pipeline tasks directly on the self-hosted agent (created on the VM). This is equivalent to testing on the VM.
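Whichever agent runs the tests, it is usually worth publishing the results back to Azure DevOps so the QA team sees them on the run summary. A hedged sketch, assuming you add a Protractor/Jasmine reporter that writes JUnit XML (the output path below is an assumption, not part of the original setup):

- task: PublishTestResults@2
  displayName: 'Publish Protractor results'
  condition: succeededOrFailed()              # publish even when tests fail
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: '**/test-results/*.xml' # assumed reporter output path
    testRunTitle: 'Protractor E2E'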

Not loading source code for all the jobs in an Azure DevOps pipeline

How can I avoid downloading the source code for every job in an Azure DevOps pipeline? How can I download the source code once and then use it in all jobs? I set up a parallel launch of my jobs in the pipeline, and now I have to spend time loading the code every time. Thanks.
How can I download the source code once, and then use it in all jobs?
If you use Microsoft-hosted agents, it cannot be done: each job in your pipeline gets a fresh virtual machine when you run your pipeline, and the virtual machine is discarded after one use. So the source code downloaded in one job is not available to another job.
However, it is possible on a self-hosted agent. You can try creating a self-hosted agent and running your pipeline on it. See the example below:
I have the pipeline below for testing on my self-hosted agent.
pool: Default  # run the pipeline on the self-hosted agent pool

stages:
- stage: Build
  jobs:
  - job: A
    steps:
    - checkout: self
    - powershell: |
        echo "job1" > job1.txt
        ls

  - job: B
    dependsOn: A
    steps:
    - checkout: none
    - powershell: |
        echo "job2" > job2.txt
        ls
See the output of the second PowerShell task: the source code is loaded only once, in the first job, and the following jobs can use it too.
If you want to skip downloading the source code for your whole pipeline, follow these steps:
Click the 3 dots on your YAML pipeline edit page --> select Triggers --> go to the YAML tab --> go to the Get sources section --> check Don't sync sources.
If you then want to load the source code in some of the jobs, you can add a script task that runs a git clone command to fetch the source in that job (i.e. git clone https://$(System.AccessToken)@dev.azure.com/org/pro/_git/rep).
If you want to skip downloading the source code for only some of your jobs, you can also use the checkout step (i.e. checkout: none):
stages:
- stage: Build
  jobs:
  - job: Job1
    steps:
    - checkout: none   # skip loading the source in this job
  - job: Job2
    steps:
    - checkout: self   # load the source in this job
This is not possible, because each job runs on its own agent:
A stage contains one or more jobs. Each job runs on an agent. A job represents an execution boundary of a set of steps. All of the steps run together on the same agent. For example, you might build two configurations - x86 and x64. In this case, you have one build stage and two jobs.
So technically they run on separate machines.
The question is whether you need the source code in all those jobs. If not, you can disable downloading the source code by adding the step checkout: none.
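If you have to stay on Microsoft-hosted agents, one workaround is to check the code out once and hand it to later jobs as a pipeline artifact. A hedged sketch (job and artifact names are placeholders); note that for large repositories uploading and downloading the artifact can easily take longer than the checkout it replaces:

jobs:
- job: Checkout
  steps:
  - checkout: self
  # publish the freshly checked-out working copy as a pipeline artifact
  - publish: $(Build.SourcesDirectory)
    artifact: source

- job: Consumer
  dependsOn: Checkout
  steps:
  - checkout: none           # skip the normal per-job checkout
  # restore the artifact instead of cloning again
  - download: current
    artifact: source
  - script: ls $(Pipeline.Workspace)/source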

Azure build pipeline

When I run the build pipeline I am getting ##[error]File not found: 'git'. I have an agent running on a server, and I installed Git on that server. The pipeline uses this agent and is tied to an Azure repo. I am using the simple script below. Please advise.
trigger:
- master

pool: 'build agent'
vmImage: 'ubuntu-latest'

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'

- script: |
    echo Add other tasks to build, test, and deploy your project.
    echo See https://aka.ms/yaml
  displayName: 'Run a multi-line script'
Here is all that you have to do to create your Azure pipeline:
Browse to Azure Pipelines and click on New Pipeline.
Select Azure Repos when asked about the source of your codebase.
Select your repository.
Review your pipeline YAML and click on Run.
And voilà, you have your first build running!
For customizing your build pipeline further, please check the various built-in build and release tasks.
Here is the YAML schema for your reference.
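One note on the YAML in the question: pool takes either a self-hosted pool name or a Microsoft-hosted vmImage, not both side by side. A hedged sketch of the two valid forms (the pool name is a placeholder); with a self-hosted agent, Git must also be installed and on the agent's PATH, otherwise checkout fails with exactly the File not found: 'git' error above:

# Option 1: a self-hosted pool (Git must be on the agent's PATH)
pool:
  name: 'build agent'

# Option 2: a Microsoft-hosted image (Git is preinstalled)
# pool:
#   vmImage: 'ubuntu-latest'

steps:
- script: git --version     # quick check that the agent can find git
  displayName: 'Verify git'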

Azure DevOps pipeline build locally with YAML

How can I simulate the build process of an Azure DevOps pipeline on my local machine before pushing to the branch, to test for possible errors?
The solution builds locally with no errors or warnings, and MSBuild also builds the solution with no errors from the VS command line, but on some pushes the pipeline build throws many errors, mostly related to preprocessor definitions and precompiled headers.
I want to know how I can test the same process locally on my machine without pushing to the repo.
azure-pipelines.yml
-------------------
pool:
  vmImage: 'vs2017-win2016'

steps:
- task: MSBuild@1
  displayName: 'Build solution'
  inputs:
    platform: 'Win32'
    configuration: 'release'
    solution: 'mysolution.sln'

- task: VSTest@2
  displayName: 'Run Test'
  inputs:
    platform: 'Win32'
    configuration: 'release'
    testAssemblyVer2: |
      **\*.Test.dll
      !**\*TestAdapter.dll
      !**\obj\**
    runSettingsFile: project.Test/test.runsettings
    codeCoverageEnabled: true
If you are using a Git repository, you can create another branch and make a pull request. As long as the pull request is not set to auto-complete, the code will not get committed to the repository.
If you are using a TFVC repository, you can set up a gated build that is configured to fail. The pipeline should be a copy of your original pipeline, but with a PowerShell task added at the end of the build pipeline that throws a terminating error. Be sure to set up this gated build on a separate branch so it does not block normal development.
Write-Error "Fail here" -ErrorAction 'Stop'
You can now make pull requests or trigger a gated build without the code actually being committed.
You can use AzurePipelinesPS to install an agent on your local machine with the Install-APAgent command if you need another agent.
I'm only a few hours into development with Azure, but I think I found a solution that would work for you; I happen to already have it in place. Use Gradle: then the default YAML just runs Gradle, and you don't have to worry too much about it after the first run. In the Gradle file you could also spin up a Docker image if you want and build on that.
The issue you have is most likely related to the difference between your local environment and the one on the build agent where this YAML pipeline executes the build. Testing it locally (even if that were possible) would not help, as it would run in your environment, where you already know every component required for a successful build is installed. On the environment where the build agent runs the build, on the other hand, there seem to be missing components (or different versions of them), which causes your build to fail. Try comparing the list of installed components and environment variables (like PATH) on your local machine and on the build agent; there might be some difference between them.
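One low-effort way to do that comparison is a throwaway diagnostic step at the start of the pipeline that dumps the environment, so its output can be diffed against the same dump taken locally. A hedged sketch for a Windows agent (which tools to print is an assumption based on the errors described):

steps:
- script: |
    :: dump every environment variable (PATH, INCLUDE, LIB, ...) so the output
    :: can be diffed against the same dump taken on the local machine
    set
    :: msbuild is only on PATH if the image puts it there, hence the fallback
    msbuild -version || echo msbuild not on PATH
  displayName: 'Dump build environment'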
