Using Azure DevOps YAML pipelines to deploy to on-prem servers

When using Azure DevOps pipelines, is it possible to deploy to on-prem servers using a YAML pipeline?
I have deployed to on-premises servers with a Release (classic) pipeline using deployment groups, and I have seen instructions on deploying to Azure infrastructure using YAML pipelines.
However, I can't find any examples of how to deploy to on-prem servers using YAML pipelines. Is this possible yet? If so, are there any examples of how to achieve this?

As already explained in the previous answers, you need to create a new environment and add VMs to the environment (see documentation).
Using a deployment job, you also need to specify the resourceType:
- deployment: VMDeploy
  displayName: Deploy to VM
  environment:
    name: ContosoDeploy
    resourceType: VirtualMachine
  ...
If you have multiple VMs in this environment, the job steps will be executed on all the VMs.
To target specific VMs, you can add tags (see documentation).
jobs:
- deployment: VMDeploy
  displayName: Deploy to VM
  environment:
    name: ContosoDeploy
    resourceType: VirtualMachine
    tags: windows,prod # only deploy to virtual machines with both windows and prod tags
  ...

Yes, you can. In YAML pipelines you can use "Environments" as a replacement for deployment groups. They work in a similar way: you install the agent on the target machine and then specify the environment in your deployment job.
Create a new environment with an appropriate name (e.g. Dev), then add a resource; you'll be able to add either a VM or a Kubernetes cluster. Assuming you choose VM, you will be able to download a script which you can run on the target machines to install the deployment agent. This script installs the agent and registers it in the environment.
Once you have the agent registered in the environment, add a deployment job to your YAML:
- stage: DeployToDev
  displayName: Deploy to Dev
  jobs:
  - deployment: DeployWebSite
    displayName: Deploy web site to Dev
    environment: Dev
    strategy:
      runOnce:
        deploy:
          steps:
          - task: PowerShell@2
            inputs:
              targetType: 'inline'
              script: |
                Write-Host "Deploy my code"

Related

How to use azure devops pipeline terraform to build oracle Cloud infra resources

I want to use an Azure DevOps Terraform pipeline to deploy resources in Oracle Cloud Infrastructure (OCI).
But I don't know whether the OCI provider is supported or not.
I want to store the state file of the OCI resources in an Azure storage account.
Does anyone have a solution?
The task you are using does not support Oracle Cloud; it only supports AWS, Azure and GCP.
If you want to target Oracle Cloud, I suggest using a generic Bash task and running Terraform from the CLI.
With Azure, the plan phase could look like this. With Oracle Cloud, you would have to replace the environment variables with these.
- task: Bash@3
  name: tf_plan
  displayName: 'Terraform plan'
  inputs:
    targetType: 'inline'
    script: |
      terraform init -backend-config=config/backend/${{ parameters.environment }}.json
      terraform plan -detailed-exitcode -out=tfplan -input=false
      exitcode=$?
      echo "##vso[task.setvariable variable=terraform_exitcode;isOutput=true]$exitcode"
      if [ "$exitcode" -eq 1 ]; then
        exit $exitcode
      else
        exit 0
      fi
    workingDirectory: '$(System.ArtifactsDirectory)/Terraform/'
  env:
    ARM_CLIENT_ID: $(ArmClientId)
    ARM_CLIENT_SECRET: $(ArmClientSecret)
    ARM_SUBSCRIPTION_ID: $(ArmSubscriptionId)
    ARM_TENANT_ID: $(ArmTenantId)
    TF_IN_AUTOMATION: true
Documentation on the parameters used can be found here
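For Oracle Cloud, the env block would swap the ARM_* variables for OCI credentials. A minimal sketch, assuming your Terraform configuration declares matching input variables (tenancy_ocid, user_ocid, fingerprint, private_key_path, region) and that the values are stored as secret pipeline variables; the pipeline variable names below are illustrative, not from the original question:

```yaml
# Sketch only: replaces the ARM_* env block on the same Bash task.
# $(Oci...) pipeline variables are assumed names, not from the question.
  env:
    TF_VAR_tenancy_ocid: $(OciTenancyOcid)
    TF_VAR_user_ocid: $(OciUserOcid)
    TF_VAR_fingerprint: $(OciFingerprint)
    TF_VAR_private_key_path: $(OciPrivateKeyPath)
    TF_VAR_region: $(OciRegion)
    TF_IN_AUTOMATION: true
```

The TF_VAR_* convention lets Terraform pick the values up as input variables without hard-coding credentials in the configuration.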

Bash or powershell command for deploying Nodejs in Azure Linux VM using Azure pipelines

I am trying to build CI/CD for a Node.js application using an Azure pipeline, to be deployed to an Azure VM. I can see the build being successful, but the deployment is not.
Can anyone help me with a bash command to start the node server?
Below is the YAML portion of the deploy stage:
- stage: Deploy
  displayName: Deploy stage
  dependsOn: Build
  condition: succeeded()
  jobs:
  - deployment: VMDeploy
    displayName: web
    environment:
      name: VMtest
      resourceType: VirtualMachine
      tags: web
    strategy:
      runOnce:
        deploy:
          steps:
          - script: echo my first deployment
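No answer with the actual start command appears above, so here is a minimal sketch of what the deploy steps could look like in place of the echo placeholder. It assumes the build stage published a pipeline artifact named drop, that the app serves from /var/www/myapp, and that pm2 is installed on the VM; all of these names are illustrative, not from the question:

```yaml
          # Sketch only: artifact name, paths, and pm2 usage are assumptions.
          steps:
          - download: current
            artifact: drop
          - script: |
              # Copy the published app to its serving directory (path is illustrative)
              sudo cp -r "$(Pipeline.Workspace)/drop/." /var/www/myapp
              cd /var/www/myapp
              # Install production dependencies only
              npm ci --omit=dev
              # (Re)start the server under pm2 so it keeps running after the job ends
              pm2 restart myapp || pm2 start server.js --name myapp
            displayName: Start Node server
```

Using a process manager such as pm2 (or a systemd service) matters here: a bare `node server.js` would be killed when the pipeline job's shell exits.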

Running azure powershell script through YAML release pipeline

I have a normal, working release pipeline that, given a certain deployment group, performs some tasks:
Copies a script
Executes that PowerShell script (on the target machines defined in the deployment group)
Deletes the script
I know that YAML doesn't support deployment groups, but (lucky me!) so far my deployment group has only one machine; let's call it MyTestVM.
So what I am trying to achieve is simply executing a PowerShell script on that VM. Normally, what happens with the release pipeline is that you have a tentacle/release agent installed on the VM, your deployment target (which is inside the deployment group) is hooked up to that, and your release pipeline (thanks to the deployment group specification) is able to use that release agent on the machine and do whatever it wants on the VM itself.
I need the same... but through YAML! I know there is a PowerShellOnTargetMachines task available in YAML, but I don't want to use that. It uses PSSession, it requires SSL certificates and many other things. I just want to use the already existing agent on the VM!
What I have in place so far:
pool: 'Private Pool'
steps:
- task: DownloadPipelineArtifact@2
  inputs:
    buildType: 'specific'
    project: 'blahblah'
    definition: 'blah'
    buildVersionToDownload: 'latest'
    targetPath: '$(Pipeline.Workspace)'
- task: CopyFiles@2
  displayName: 'Copy Files to: C:\TestScript'
  inputs:
    SourceFolder: '$(Pipeline.Workspace)/Scripts/'
    Contents: '**/*.ps1'
    TargetFolder: 'C:\TestScript'
    CleanTargetFolder: true
    OverWrite: true
The first part just downloads the artifact containing my script. And to be honest, I am not even sure I need to copy the script in the second part: first, because I don't think it is copying the script to the target VM's workspace, but to the VM where the Azure pipeline agent is installed; and second, because I think I can just reference it from my artifact. But this is not the important part.
How can I make my YAML pipeline use the release agent installed on the VM in the same way that a normal release pipeline does?
I somehow reached a solution. First of all, it's worth mentioning that since deployment groups don't work with YAML pipelines, the way to proceed is to create an environment and add your target VM to it as a resource.
So I didn't need to create my own hosted agent or anything special, since the problem was the target itself and not the agent running the pipeline.
By creating an environment and adding a resource (in my case a VM) to that environment, we also create a new release agent on the target itself. So my target VM will now have two release agents: the old one, which can be used by normal release pipelines, and the new one, attached to the environment resource on Azure DevOps, which can be used by YAML pipelines.
Now I am finally able to hit my VM:
- stage: PerformScriptInVM
  jobs:
  - deployment: VMDeploy
    pool:
      vmImage: 'windows-latest'
    # watch out: this creates an environment if it doesn't exist
    environment:
      name: My Environment Name
      resourceType: VirtualMachine
    strategy:
      runOnce:
        deploy:
          steps:
          - task: DownloadPipelineArtifact@2
            inputs:
              buildType: 'specific'
              project: 'blahblahblah'
              definition: 'blah'
              buildVersionToDownload: 'latest'
              targetPath: '$(Pipeline.Workspace)'
          - task: PowerShell@2
            displayName: 'PowerShell Script'
            inputs:
              targetType: filePath
              filePath: '$(Pipeline.Workspace)/Scripts/TestScript.ps1'
              arguments: 'whatever your script needs..'
To get the job to run on the specific release agent you want, you can do one of two things:
Create a pool and only put your release agent into it:
pool: 'My Pool with only one release agent'
Or use an existing pool, and publish/demand a capability for your agent.
On the agent machine itself, add a system environment variable (for example, MyCustomCapability) and give it a value like 1.
Then your pipeline becomes:
pool:
  name: 'My pool with potentially more than one agent'
  demands: 'MyCustomCapability'
If only this agent has this environment variable set, then only this agent can execute the job.

DevOps Pipeline Automation - Infrastructure and Application deployment

I am facing an implementation issue with the below scenario using Azure DevOps pipelines:
Provision two resources in an Azure subscription using ARM templates:
Azure Container Registry
Azure Kubernetes Service
Deploy the containerized application code to the Kubernetes cluster
I am able to perform both of the steps in individual pipelines. I need some help with combining/integrating the two pipelines into one.
How can I automate both of the above steps without any manual intervention in the process? The process must be robust enough to handle the application deployment in the same ACR and AKS that were created in the previous step.
Both the infrastructure and the application code reside in the same Git/Azure repository.
First of all, I would recommend that you do not tie the two pipelines together.
While infrastructure as code is important and should be used, it is important to decouple infrastructure provisioning from application provisioning, for many good reasons.
Regarding your question, you should have a task that creates the ACR/AKS, and one task that gets the credentials to be used by another task, which would deploy to the pre-created AKS and ACR.
The flow could be:
create ACR, then AKS (if you want to configure your AKS with ACR integration, you need to have them sequential), then docker build/push to ACR, then deploy the container using a shell script with kubectl, leveraging the credentials from the second step.
You can use multiple stages in one YAML pipeline. See the simple example below.
The pipeline below has two stages. The first stage runs the tasks to provision the infrastructure. The second stage depends on the first stage, defines the ACR and AKS names in its variables, and then runs the tasks to deploy to the Kubernetes cluster.
trigger:
- master
stages:
- stage: Infrastructure_Deployment
  pool:
    vmImage: windows-latest
  jobs:
  - job: Infrastructure
    steps:
    - task: AzureResourceManagerTemplateDeployment@3
      inputs:
        .....
- stage: Application_Deployment
  dependsOn: Infrastructure_Deployment
  pool:
    vmImage: windows-latest
  variables:
    ACRName: "ACRName"
    AKSName: "AKSName"
    azureSubscriptionEndpoint: ..
    azureContainerRegistry: ..
    azureResourceGroup: ..
  jobs:
  - job: Application
    steps:
    # - task: Docker@2
    #   inputs:
    #     containerRegistry: $(ACRName)
    #     ...
    - powershell: |
        # Log in to Docker with service principal credentials
        docker login $(ACRName).azurecr.io --username $SP_APP_ID --password $SP_PASSWD
        docker build
        docker push
    - task: Kubernetes@1
      displayName: kubectl apply
      inputs:
        connectionType: Azure Resource Manager
        azureSubscriptionEndpoint: $(azureSubscriptionEndpoint)
        azureResourceGroup: $(azureResourceGroup)
        kubernetesCluster: $(AKSName)
        ....
Update: dynamically capture the ACR and AKS names along with the ACR login credentials.
You can use an Azure PowerShell task to get the above data. In order to use the Azure PowerShell task, you need to create an Azure Resource Manager service connection.
Then you can write custom inline scripts in the task. See the example below:
- task: AzurePowerShell@5
  name: AzurePowershell
  inputs:
    azureSubscription: 'Microsoft Azure Internal Consumption (daaeef3e-d7fe-49e8-baaa-b3df9d072825)'
    ScriptType: InlineScript
    Inline: |
      $json = (Get-Content "$(system.defaultworkingdirectory)\template.json" -Raw) | ConvertFrom-Json
      $AKS = $json.resources | where {$_.type -eq "Microsoft.ContainerService/managedClusters"} | select name
      $ACR = $json.resources | where {$_.type -eq "Microsoft.ContainerRegistry/registries"} | select name
      echo "##vso[task.setvariable variable=ACRName;isOutput=true]$($ACR.name)"
      echo "##vso[task.setvariable variable=AKSName;isOutput=true]$($AKS.name)"
      $ACRCred = Get-AzContainerRegistryCredential -ResourceGroupName "MyResourceGroup" -Name $($ACR.name)
      echo "##vso[task.setvariable variable=UserName;isOutput=true]$($ACRCred.Username)"
      echo "##vso[task.setvariable variable=Password;isOutput=true]$($ACRCred.Password)"
    azurePowerShellVersion: LatestVersion
You can then get these variables in a following stage by referring to stageDependencies.stageName.jobName.outputs['stepName.variableName'].
See here for more on Azure PowerShell.
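As a sketch, consuming those output variables in a later stage could look like this, assuming the task above runs in a job named Infrastructure inside a stage named Infra (both names are illustrative, not from the answer above):

```yaml
# Sketch only: stage and job names are assumptions.
- stage: Deploy
  dependsOn: Infra
  variables:
    # stageDependencies.<stage>.<job>.outputs['<stepName>.<variableName>']
    ACRName: $[ stageDependencies.Infra.Infrastructure.outputs['AzurePowershell.ACRName'] ]
    AKSName: $[ stageDependencies.Infra.Infrastructure.outputs['AzurePowershell.AKSName'] ]
  jobs:
  - job: Application
    steps:
    - script: echo "Deploying to $(AKSName) via registry $(ACRName)"
```

Note that the step must have a `name` (here AzurePowershell) and the variable must be set with `isOutput=true`, as in the inline script above, for the cross-stage reference to resolve.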

How to set up build and release pipeline for NestJS application to Azure using YAML

I am having trouble understanding and getting the build/release pipelines set up for deploying a NestJS application via Azure DevOps (ADO).
I am deploying to a Linux Web App hosted in Azure.
As far as I understand, if I run the app locally using something like npm run start, it creates a dist folder under my project's root directory.
So, when writing the YAML for the build and deployment, my thought process is to:
Run an npm update.
Run npm run build to build the application and generate the dist folder.
Copy the contents of the application (or just the dist folder?) into the target folder (/home/site/wwwroot).
Run npm run start:prod to start the server.
Here is my YAML so far:
trigger:
- master
pool:
  vmImage: 'ubuntu-latest'
steps:
- task: UseNode@1
  inputs:
    version: '14.x'
    checkLatest: true
- task: Npm@0
  displayName: Run NPM Update for NestJS
  inputs:
    cwd: '$(Build.SourcesDirectory)/ProjectName'
    command: update
- task: Npm@0
  displayName: Build NestJS
  inputs:
    cwd: '$(Build.SourcesDirectory)/ProjectName'
    command: run
    arguments: "build"
- task: CopyFiles@2
  inputs:
    Contents: 'dist/**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
The issue is that after the build process completes, I do not see a dist folder in /home/site/wwwroot/ProjectName. Can someone help me out with what I am missing?
Also, a side noob-y question about Azure DevOps: what do $(Build.SourcesDirectory) and $(Build.ArtifactStagingDirectory) refer to, and how and where are those environment variables set?
To deploy your app to the Linux Web App hosted in Azure, you need to use the Azure App Service Deploy task or the Azure Web App task.
Azure DevOps is the tool to build and deploy your app to your server (e.g. the Linux Web App on Azure); it is not for hosting your app.
$(Build.ArtifactStagingDirectory) refers to a folder on the agent machine which runs your pipeline. (When you run your pipeline, it picks up an agent defined in the pool to run your pipeline tasks.)
The mapping to the folders on the agent machine is shown below. Check the predefined variables for more information.
$(Agent.BuildDirectory) is mapped to c:\agent\_work\1
$(Build.ArtifactStagingDirectory) is mapped to c:\agent\_work\1\a
$(Build.BinariesDirectory) is mapped to c:\agent\_work\1\b
$(Build.SourcesDirectory) is mapped to c:\agent\_work\1\s
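A quick way to confirm these mappings on your own agent is to echo the variables in a script step:

```yaml
# Sketch only: a throwaway step for inspecting predefined variables.
steps:
- script: |
    echo "Agent build dir: $(Agent.BuildDirectory)"
    echo "Sources:         $(Build.SourcesDirectory)"
    echo "Staging:         $(Build.ArtifactStagingDirectory)"
    echo "Binaries:        $(Build.BinariesDirectory)"
  displayName: Show predefined directory variables
```

On a Microsoft-hosted Linux agent the paths will look like /home/vsts/work/1/s rather than the Windows paths listed above.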
So, back to the question: how to deploy a NestJS application to Azure?
First you need to create a service connection in Azure DevOps to connect to your Azure subscription. Check here for detailed steps.
Then add the Azure App Service Deploy task or the Azure Web App task to the end of your pipeline. See the example below:
- task: AzureRmWebAppDeployment@4
  inputs:
    ConnectionType: 'AzureRM'
    azureSubscription: 'SubscriptionServiceConnectionName'
    appType: 'webAppLinux'
    WebAppName: 'MyWebAppName'
    Package: '$(Build.ArtifactStagingDirectory)/dist/'
    StartupCommand: 'npm run start:prod'
You can check here for more information.
