Capture current user as variable - Azure DevOps

Background: I want to tag Azure resources created via Bicep, which is orchestrated through my Azure DevOps pipeline(s). The tag I would like is the user who created them, i.e. the person who ran the release pipeline.
I'm not aware of any predefined variables that can capture the current ADO user. I've also tried PowerShell, however the snippet below only captures the build agent user on the Microsoft-hosted agent:
# This returns the identity of the agent process, not the pipeline user.
$currentUserTemp = [System.Security.Principal.WindowsIdentity]::GetCurrent().Name
Write-Host "##vso[task.setvariable variable=currentUser;]$currentUserTemp"

This can help you get the VSID of the user who queued the pipeline:
Build.QueuedById
And these are the official documents:
https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml
https://learn.microsoft.com/en-us/azure/devops/pipelines/build/variables?view=azure-devops&tabs=yaml#how-are-the-identity-variables-set
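For the tagging scenario, the predefined identity variables are exposed to scripts as environment variables, so something along these lines could capture the queueing user instead of the agent identity (a minimal sketch; feeding the value to a Bicep tag parameter is an assumption):
# Read the predefined identity variables from the agent environment.
$queuedBy   = $env:BUILD_QUEUEDBY     # display name of the user who queued the run
$queuedById = $env:BUILD_QUEUEDBYID   # VSID (GUID) of that user

# Expose the display name to later steps, e.g. to pass as a 'createdBy'
# tag parameter to the Bicep deployment (the parameter name is hypothetical).
Write-Host "##vso[task.setvariable variable=currentUser;]$queuedBy"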


How can I export a list of pipelines from Azure DevOps?

I am relatively new to Azure DevOps. I am documenting the pipelines that currently exist within Azure DevOps, and there are hundreds of them. Is there a way to export the names of all the pipelines, and potentially any other information about them (like pipeline description or environment), into Excel or something similar? I have read-only access. I have not been able to figure out an easy way to do this, and I feel like there must be a better solution than manually copying them one by one.
I've looked through questions on Stack Overflow, but none seemed to answer this specific question.
REST
Depending on the technology you want to use, you can use, for example, the REST API of Azure DevOps. For YAML pipelines:
https://learn.microsoft.com/en-us/rest/api/azure/devops/pipelines/pipelines/list?view=azure-devops-rest-7.0
For the classic pipelines:
https://learn.microsoft.com/en-us/rest/api/azure/devops/release/definitions/list?view=azure-devops-rest-7.0&tabs=HTTP
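As a rough sketch of how the list endpoint can be combined with read-only access and a personal access token (the organization/project names and the PAT below are placeholders):
# Build a Basic auth header from a PAT (read scope is enough).
$pat     = "<personal-access-token>"
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

# List the pipelines in a project.
$url = "https://dev.azure.com/myorg/myproject/_apis/pipelines?api-version=7.0"
$pipelines = (Invoke-RestMethod -Uri $url -Headers $headers).value

# Keep the interesting columns and save as CSV, which opens in Excel.
$pipelines | Select-Object id, name, folder |
    Export-Csv -Path pipelines.csv -NoTypeInformation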
PowerShell
Or use PowerShell with VSTeam; check out the Getting started guide in the VSTeam docs.
For (YAML) build:
https://methodsandpractices.github.io/vsteam-docs/docs/modules/vsteam/commands/Get-VSTeamBuildDefinition
Get-VSTeamBuildDefinition -ProjectName Demo | Format-List *
And Classic (release) pipelines:
https://methodsandpractices.github.io/vsteam-docs/docs/modules/vsteam/commands/Get-VSTeamReleaseDefinition
Get-VSTeamReleaseDefinition -ProjectName demo | Format-List *
This result can then easily be opened in Excel with $obj | Export-Excel :
https://github.com/dfinke/ImportExcel
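Putting those pieces together, a sketch along these lines (assuming both modules are installed and a PAT is at hand) writes the definitions straight to an Excel file:
# One-time setup: install the modules and sign in with a PAT.
$pat = "<personal-access-token>"
Install-Module VSTeam, ImportExcel -Scope CurrentUser
Set-VSTeamAccount -Account "https://dev.azure.com/myorg" -PersonalAccessToken $pat

# Export every build definition in the project to Excel.
Get-VSTeamBuildDefinition -ProjectName Demo |
    Select-Object Id, Name, Path |
    Export-Excel -Path pipelines.xlsx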
For your scenario, REST API is a good approach.
To get all the pipelines from a DevOps Project, you could use: Pipeline-List
This lists all pipelines, including pipeline name and pipeline ID.
You could copy the output into Excel and filter for your target info.
With the above pipeline ID, you could use this REST API: Pipelines-Get to retrieve more info related to this specific pipeline by ID.
The response includes the pipeline's variables and repo info. Environment-related APIs are listed in the Environments REST API reference.
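For example, a sketch of fetching one pipeline's details by ID (placeholder names and PAT):
$pat     = "<personal-access-token>"
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

# Fetch full details for pipeline ID 42, including its configuration.
Invoke-RestMethod -Headers $headers `
    -Uri "https://dev.azure.com/myorg/myproject/_apis/pipelines/42?api-version=7.0"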

How to manipulate remote Terraform state files on Azure Blob storage

I'm working with a subscription that has a few different deployed environments (dev, test, staging, etc.). Each environment has its own storage account, containing an associated Terraform state file. These environments get deployed via Azure DevOps Pipelines.
It's easy enough to get at the .tfstate files that have been created this way, through the portal, CLI, etc.
But is it possible to access these state files using the 'terraform state' commands, for example using Azure Cloud Shell? If so, how do you point them at the right location?
I've tried using the terraform state commands in a Cloud Shell, but it's not clear how to point them to the right location or if this is indeed possible.
For this requirement, you can use the AzurePowerShell task.
1. First, if you can achieve your requirement via the PowerShell feature in the Azure portal, then it is possible to achieve the same thing using the AzurePowerShell task (AzurePowerShell runs on the agent based on the service connection/service principal you provide):
- task: AzurePowerShell@5
  inputs:
    azureSubscription: 'testbowman_in_AAD' # service connection backed by a service principal on the Azure side
    ScriptType: 'InlineScript'
    Inline: |
      # Put your logic here.
    azurePowerShellVersion: 'LatestVersion'
2. Second, you can use AzCopy to download the file and then operate on it. The Microsoft-hosted DevOps agents include this tool.
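A minimal sketch of that download step (storage account, container, and SAS token are placeholders):
# Download the state file from blob storage to the agent's working folder.
azcopy copy "https://stdevtfstate.blob.core.windows.net/tfstate/dev.tfstate?<SAS-token>" "dev.tfstate"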
Run terraform state pull > state.tfstate in the Azure Cloud Shell (you can use a name like dev.tfstate; the .tfstate extension is what matters).
All you need to do is move into the directory containing your Terraform files and run:
terraform state pull > dev.tfstate
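To point the state commands at the right backend in the first place, one approach (a sketch, assuming the configuration declares an azurerm backend block and using placeholder names) is to initialize the working directory against that environment's storage account:
# Initialize against one environment's remote state.
terraform init `
  -backend-config="resource_group_name=rg-tfstate" `
  -backend-config="storage_account_name=stdevtfstate" `
  -backend-config="container_name=tfstate" `
  -backend-config="key=dev.terraform.tfstate"

terraform state list                 # inspect resources in the remote state
terraform state pull > dev.tfstate   # download a local copy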

roleAssignment with current user id

I'm using Azure AD app registration principals to deploy resources via Azure Resource Manager from my pipelines.
During the deployment I need to set some permissions for the deployment user to ensure it has enough permission to, for example, upload files.
As I'm using different principals, and I'm not managing those in the code, I would like to know if there is a way to reference the current user principal's ID during the deployment.
Something like:
deployment().properties.xx
or
environment()
https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/template-functions-deployment
https://learn.microsoft.com/en-us/azure/templates/microsoft.authorization/roleassignments?tabs=bicep
Otherwise, I would need to inject this information via a parameter, I think. I could get that information by script, or there may even be a variable available from Azure DevOps.
Any ideas? Help appreciated. Thanks.
Currently, it's not possible to get the objectId of the user deploying the template... we do have a backlog item for it.
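Until that exists, the workaround is the parameter injection mentioned in the question. A sketch with the Azure CLI, where the deployerObjectId template parameter is hypothetical:
# Look up the object ID of the identity running the deployment.
# Interactively (Cloud Shell/local login); the property is 'id' on recent
# CLI versions ('objectId' on older ones):
$objectId = az ad signed-in-user show --query id -o tsv

# In a pipeline running as a service principal (AzureCLI task with
# addSpnToEnvironment: true), something like this instead:
# $objectId = az ad sp show --id $env:servicePrincipalId --query id -o tsv

# Pass it into the Bicep/ARM deployment for the roleAssignment.
az deployment group create `
  --resource-group rg-demo `
  --template-file main.bicep `
  --parameters deployerObjectId=$objectId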

Azure DevOps dynamic Release Pipeline creation

I am currently planning a type of multi-tenant system, where different resource groups with a set of App Services are deployed for customers via ARM templates. Hence, each customer has its own resource group and set of App Services. Currently we use Azure DevOps to deploy to a set of App Services used for development and quality assurance before it gets to production.
I am now trying to incorporate DevOps into the mix, automating pipeline creation of some sort (it would be a copy of an existing pipeline, only changing the target App Services). Which is where my question comes from: is there a way to dynamically create or edit a release pipeline to add the deployment of those new App Services, without the need to manually edit or create a pipeline and add those newly created App Services? I was thinking something along the lines of being able to copy a YAML file template, then replacing the necessary info to point to those App Services after they have been created. But I am not totally sure where I could store the new YAML file so that it is picked up by Azure DevOps, or how I would accomplish this, with the main idea being that all of this continues to be part of an automated process (if possible).
Thanks a lot for any help, any suggestion is appreciated.
EDIT:
The question is not about how to deploy an ARM template through a DevOps release pipeline (I plan on using a PowerShell script/REST API to accomplish that). Instead, it is about what happens when the App Services resources are created: I need to deploy code to those newly created App Services and also update that code when necessary (hopefully through a release pipeline), somehow generating a new release pipeline each time I deploy a new set of resources. That way, when there is a new update, I could easily have that pipeline triggered and that set of App Services updated (created as part of the automation process, "dynamically"). (I already have a similar pipeline that deploys to a "static" set of App Services.)
This is possible, as you alluded to, with YAML pipelines. Based upon the scenario you have described, each repository would have its own pipeline.yml file that defines the trigger, pool, etc. It would also reference a repository that houses your YAML template.
The template would accept whichever parameters you may require (resource group, app service name, etc.). The triggering pipeline associated with each repository would pass this information when consuming the template.
By doing this, CI/CD can be set up to trigger the individual pipelines and deploy the appropriate code, all while leveraging the same YAML template.
The repository reference would be similar to:
resources:
  repositories:
    - repository: YAMLTemplates
      type: git
      name: OrganizationName/YAML Project Name
With the call to the template being similar to:
- template: azure-ARM-template.yml@YAMLTemplates
  parameters:
    appServiceName: 'AppServiceName'
    resourceGroupName: 'ResourceGroupName'
UPDATE
At a high level, the YAML pipeline would consist of the following. If all App Services are similar, as stated, and the ARM templates are similar, this is how it could be constructed and triggered based on a folder path:
Build necessary artifacts
Publish Pipeline
Deploy Azure Resource Group Task
Deploy App Settings Task (if applicable)
Deploy App Service
Release the deployment pieces for each environment in appropriate stages. To help alleviate the amount of copying and pasting, each of the above tasks can be part of a template, either individually at the task level, as a combination of tasks, or all in one. This allows defining the YAML once and referencing it, passing app-specific components as needed via parameters to the templates.
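For the "create the pipeline dynamically" part, one sketch is to commit a customer-specific YAML file (generated from the template above) and register it as a new pipeline via the Pipelines REST API; the names, repository GUID, and PAT below are placeholders:
$pat     = "<personal-access-token>"
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

# Register a new YAML pipeline pointing at the committed file.
$body = @{
    name          = "customer-x-deploy"
    folder        = "\Customers"
    configuration = @{
        type       = "yaml"
        path       = "/pipelines/customer-x.yml"
        repository = @{
            id   = "<repository-guid>"
            type = "azureReposGit"
        }
    }
} | ConvertTo-Json -Depth 5

Invoke-RestMethod -Method Post `
    -Uri "https://dev.azure.com/myorg/myproject/_apis/pipelines?api-version=7.0" `
    -Headers $headers -ContentType "application/json" -Body $body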

Upload File as a parameter to job in Azure DevOps

I have an Azure DevOps pipeline that automates user creation in Salesforce. I am expecting the user details in an Excel file, which is to be fed to the Azure DevOps pipeline as a pre-build parameter. However, I am not able to find a solution for this in Azure DevOps.
I had already implemented this in Jenkins using the File Parameter plugin in my previous projects. Does Azure DevOps have this capability?
After searching through various blogs and posts, I realized that there is no way to get this done directly in VSTS. However, I was able to find a workaround.
I created a VSTS user story and uploaded my attachment there.
Using the work item ID, I used the Work Item API to get the attachment ID.
Using the Attachments API, I was able to write a Python script to download this attachment as a pre-step in the pipeline. It was then available to use throughout my automation script.
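The same workaround sketched in PowerShell rather than Python (work item ID, organization/project names, and file name are placeholders):
$pat     = "<personal-access-token>"
$headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$pat")) }

# Read the work item with its relations to find the attachment.
$wi = Invoke-RestMethod -Headers $headers `
    -Uri "https://dev.azure.com/myorg/myproject/_apis/wit/workitems/123?`$expand=relations&api-version=7.0"

# The AttachedFile relation's URL points at the Attachments API.
$attachment = $wi.relations | Where-Object { $_.rel -eq 'AttachedFile' } | Select-Object -First 1
Invoke-RestMethod -Headers $headers -Uri $attachment.url -OutFile "users.xlsx"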
I don't think you can load a file before the build starts and read the variables, but you can add a task that reads the variables from a file and put it at the beginning (the first step in your pipeline).
There are a few extensions to read variables from a JSON file, for example: Json to Variable.
If you want to read from Excel, I think you should write a script that does it.
Using a self-hosted agent, you can publish an artifact from a local share, then move to (e.g.) a Microsoft-hosted agent and use it normally.
- task: DownloadFileshareArtifacts@1
  inputs:
    filesharePath: '\\myhost\myshare\myfolder'
    artifactName: 'my-artifact'
    downloadPath: '$(System.ArtifactsDirectory)'
    parallelizationLimit: '8'
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/download-fileshare-artifacts?view=azure-devops
