Azure DevOps service connection and central pipeline

I have a requirement to give multiple teams access to a shared resource in Azure. I therefore want to limit how people can publish changes to the shared resource.
The idea is to limit the use of a service connection to a specific pipeline, as per this documentation. However, if the pipeline is stored in the team's own repo, a developer could change it, which would not give me enough control. I then found that it is possible to use a template from a central repo. Using a shared repo would then allow me to have a service connection solely for the template.
The way I imagine doing the above: I grant project X a service connection for my BuildTemplates repo. This is basically just access to the repo, so that the shared templates can be used. Then, in the BuildTemplates repo, I can have a service connection for my template A.
Now a developer in project X creates the deployments and configurations for her pipeline with her own service connection, scoped to her resources. Then she inherits a template from the BuildTemplates repo and passes the relevant parameters to template A.
She cannot alter template pipeline A, and only template pipeline A can publish to the shared resource, because of the scoped service connection. I can therefore build the relevant guards for the shared Azure resource into template pipeline A, restricting how a developer in X can publish to my shared Azure resource.
Does this make sense, and is it viable?
Is it true that the pipeline part in A cannot be edited by a developer in X?
Is it true that the service connection in A will not propagate out, so a developer in X cannot use it in an inappropriate way?
Update
The above solution does not seem to be viable, since the pipeline template is executed in the scope of the source branch that consumes it.
Proposed Solution
The benefits I see in the above suggestion do not seem achievable, because of the issues described. However, one can utilise pipeline triggers as a viable solution. This, however, introduces a new issue: when a pipeline triggered by developer Y in Y's repository succeeds, a trigger fires in the MAIN repository, and the pipeline in MAIN fails, e.g. because the artifacts from Y introduced an issue. How does developer Y get notified about the issues in the MAIN pipeline?

Here is my solution: in the same Azure DevOps organization, we can create a project and then create a repo in it to hold the common pipeline template.
All the repos in the other projects can access this pipeline template.
UserProject/UserRepo/azure-pipelines.yml
trigger:
  branches:
    include:
    - master
  paths:
    exclude:
    - nuget.config
    - README.md
    - azure-pipelines.yml
    - .gitignore

resources:
  repositories:
  - repository: devops-tools
    type: git
    name: PipelineTemplateProject/CommonPipeline
    ref: 'refs/heads/master'

jobs:
- template: template-pipeline.yml@devops-tools
PipelineTemplateProject/CommonPipeline/template-pipeline.yml
Since the inline script of a pipeline has a 5000-character limitation, you can put your script (not only PowerShell, but also other languages) in PipelineTemplateProject/CommonPipeline/scripts/test.ps1:
# Common Pipeline Template
jobs:
- job: Test_Job
  pool:
    name: AgentPoolName
  steps:
  - script: |
      echo "$(Build.RequestedForEmail)"
      echo "$(Build.RequestedFor)"
      git config user.email "$(Build.RequestedForEmail)"
      git config user.name "$(Build.RequestedFor)"
      git config --global http.sslbackend schannel
      echo '------------------------------------'
      git clone -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" -b $(ToolsRepoBranch) --single-branch --depth=1 "https://PipelineTemplateProject/_git/CommonPipeline" DevOps_Tools
      echo '------------------------------------'
    displayName: 'Clone DevOps_Tools'
  - task: PowerShell@2
    displayName: 'Pipeline Debug'
    inputs:
      targetType: 'inline'
      script: 'Get-ChildItem -Path Env:\ | Format-List'
    condition: always()
  - task: PowerShell@2
    displayName: 'Run Powershell Scripts'
    inputs:
      targetType: filePath
      filePath: 'DevOps_Tools/scripts/test.ps1'
      arguments: "$(System.AccessToken)"
Notes:
In Organization Settings > Pipelines > Settings, disable "Limit job authorization scope to current project for release pipelines".
Likewise, disable "Limit job authorization scope to current project for non-release pipelines".
Check the corresponding options in the project settings as well.
This way a normal user can only access their own repo and cannot access the DevOps project, while only the DevOps project owner can edit the template pipeline.
For the notification issue, I use an email extension, "rvo.SendEmailTask.send-email-build-task.SendEmail@1".
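As a rough illustration, a failure-notification step using that extension might look like the sketch below; the input names (To, From, Subject, Body) and the sender address are assumptions about the extension's interface, not taken from the original answer:

- task: rvo.SendEmailTask.send-email-build-task.SendEmail@1
  displayName: 'Notify requester of pipeline failure'
  condition: failed()   # only runs when an earlier step failed
  inputs:
    To: '$(Build.RequestedForEmail)'   # the user who triggered the run
    From: 'devops@example.com'         # hypothetical sender address (assumption)
    Subject: 'Pipeline $(Build.DefinitionName) failed'
    Body: 'See $(System.TeamFoundationCollectionUri)$(System.TeamProject)/_build/results?buildId=$(Build.BuildId)'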

Related

The repository xxx in project xxx could not be retrieved. Verify the name and credentials being used

I am trying to extend a template in an Azure DevOps pipeline; the template exists in a repository hosted on Azure DevOps. The code looks like below.
resources:
  repositories:
  - repository: devops
    type: git
    name: otherProject/repositoryXYZ
    ref: main

parameters:
- name: environment
  type: string
  values:
  - "UAT"
  - "Production"
  default: "UAT"

trigger:
- none
pr: none

extends:
  template: folder/template.yml@devops
  parameters:
    environment: ${{ parameters.environment }}
When I deploy this pipeline in the same project where repositoryXYZ exists, I get a successful run.
For example, I have a project A which holds 5 pipelines. One of them is the above, and it can download the repository and run successfully. This pipeline exists in project A, where repositoryXYZ is located.
When I deploy the same pipeline from a different project, project B, within the same Azure DevOps organization, I get the below error:
/azure-repo.yml: The repository DevOps in project f1809f72 could not be retrieved. Verify the name and credentials being used.
The project id in the logs is that of project B (f1809f72).
I tried to alter the DevOps repository permissions and to grant the Project Build Collection Administrators full access (repositoryXYZ).
Then I tried to place the repository on GitHub and got the same issue (I added a PAT and changed the directories for the repository).
I also tried to edit the project settings and deactivate the limit-jobs options (all "limit job authorization scope" settings have been deactivated for both projects).
Am I missing something? How can I use my pipeline to extend a template downloaded from an Azure DevOps repository in another project within the same organization?
The pipeline cannot start running at all, so I guess something is wrong with the permissions.
My error was the repository reference inside the template.yml file. In the build pipeline I was pointing at the correct repository, but in the template I was pointing at a wrong one that could not be retrieved. I corrected that and was able to trigger the pipeline.
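To illustrate the kind of mismatch, here is a minimal sketch of what the referenced template could look like; the stage, job, and step names are hypothetical, and the point is only that any repository alias used inside the template (here devops) must match one the extending pipeline actually declares:

parameters:
- name: environment
  type: string

stages:
- stage: deploy
  jobs:
  - job: run
    steps:
    - checkout: devops   # must match a repository alias declared by the extending pipeline
    - script: echo "Deploying to ${{ parameters.environment }}"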

Security hole in Azure Pipelines?

I've been researching Azure DevOps and I've come across what looks like a pretty obvious security hole in Azure Pipelines.
So, I'm creating my pipeline as YAML and defining two stages: a build stage and a deployment stage. The deployment stage looks like this:
- stage: deployApiProdStage
  displayName: 'Deploy API to PROD'
  dependsOn: buildTestApiStage
  jobs:
  - deployment: deployApiProdJob
    displayName: 'Deploy API to PROD'
    timeoutInMinutes: 10
    condition: and(succeeded(), eq(variables.isRelease, true))
    environment: PROD
    strategy:
      runOnce:
        deploy:
          steps:
          - task: AzureWebApp@1
            displayName: 'Deploy Azure web app'
            inputs:
              azureSubscription: '(service connection to production web app)'
              appType: 'webAppLinux'
              appName: 'my-web-app'
              package: '$(Pipeline.Workspace)/$(artifactName)/**/*.zip'
              runtimeStack: 'DOTNETCORE|3.1'
              startUpCommand: 'dotnet My.Api.dll'
The Microsoft documentation talks about securing this by adding approvals and checks to an environment; in the above case, the PROD environment. This would be fine if the protected resource here that allows publishing to my PROD web app - the service connection in azureSubscription - were pulled from the PROD environment. Unfortunately, as far as I can tell, it's not. It's associated instead with the pipeline itself.
This means that when the pipeline is first run, the Azure DevOps UI prompts me to permit the pipeline access to the service connection, which is needed for any deployment to happen. Once access is permitted, that pipeline has access to that service connection for evermore. This means that from then on, that service connection can be used no matter which environment is specified for the job. Worse still, any environment name specified that is not recognized does not cause an error, but causes a blank environment to be created by default!
So even if I set up a manual approval for the PROD environment, if someone in the organization manages to slip a change through our code review (which is possible with regular large code reviews) that changes the environment name to 'NewPROD' in the azure-pipelines.yml file, the CI/CD will create that new environment and go ahead and deploy immediately to PROD, because the new environment has no checks or approvals!
Surely it would make sense for the service connection to be associated with the environment instead. It would also make sense to have an option to ban the auto-creation of new environments - I don't really see how that's particularly useful anyway. Right now, as far as I can tell, this is a huge security hole that could allow deployments to critical environments by anyone who has commit access to the repo or manages to slip a change to the azure-pipelines.yml file through the approval process, introducing a major single point of failure/weakness. What happened to the much-acclaimed incremental approach to securing your pipelines? Am I missing something here, or is this security hole as bad as I think it is?
In your example, it seems you created/used an empty environment; there is no deployment target. Currently, only the Kubernetes and virtual machine resource types are supported in an environment.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/environments?view=azure-devops
The resource in your example is a service connection, so you need to go to the service connection and define checks on it.
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/approvals?view=azure-devops&tabs=check-pass

Can we use a single CI/CD pipeline for two different .NET application solutions?

I have checked this link, Azure DevOps: 1 Solution Multiple Projects CI/CD, which is related to one solution with multiple projects.
Can we use multiple solutions in a single CI/CD pipeline, where we have different artifacts for each solution and different app servers to deploy to?
Please advise.
As long as the code is in the same repository, there are no issues with using multiple .NET solutions, or any other type.
You can also publish multiple artifacts from the same pipeline, as sketched below.
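Here is a minimal sketch of that idea; the solution paths, output directories, and artifact names (App1.sln, app1, and so on) are made-up placeholders:

pool:
  vmImage: 'windows-latest'
steps:
# Build each solution into its own staging folder
- task: VSBuild@1
  inputs:
    solution: 'src/App1.sln'
    msbuildArgs: '/p:OutDir=$(Build.ArtifactStagingDirectory)/app1'
- task: VSBuild@1
  inputs:
    solution: 'src/App2.sln'
    msbuildArgs: '/p:OutDir=$(Build.ArtifactStagingDirectory)/app2'
# Publish one artifact per solution so each can be deployed to its own app server
- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: '$(Build.ArtifactStagingDirectory)/app1'
    artifactName: 'app1'
- task: PublishBuildArtifacts@1
  inputs:
    pathToPublish: '$(Build.ArtifactStagingDirectory)/app2'
    artifactName: 'app2'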
If you are using a YAML pipeline, you can check out multiple repositories in your pipeline.
Pipelines often rely on multiple repositories: you can have different repositories with source, tools, scripts, or other items that you need to build your code. By using multiple checkout steps in your pipeline, you can fetch and check out other repositories in addition to the one you use to store your YAML pipeline.
Repository declared using a repository resource:
resources:
  repositories:
  - repository: MyGitHubRepo # The name used to reference this repository in the checkout step
    type: github
    endpoint: MyGitHubServiceConnection
    name: MyGitHubOrgOrUser/MyGitHubRepo
  - repository: MyAzureReposGitRepository
    type: git
    name: MyProject/MyAzureReposGitRepo
steps:
- checkout: MyGitHubRepo
- checkout: MyAzureReposGitRepository
Repository declared using inline syntax:
If your repository doesn't require a service connection, you can declare it inline with your checkout step.
steps:
- checkout: git://MyProject/MyRepo # Azure Repos Git repository in the same organization
For details, please refer to the official document.

Is there a way to trigger pipeline using Azure Artifacts in YML?

When defining a GUI release, I can make it be triggered by an Azure Artifact. Is there a way to replicate this for pipelines in YAML?
I am building in one AZDO tenant and pushing universal packages to another tenant, where the release definitions will be defined; I'm hoping this can be done in YAML, but I don't see an obvious way to do it at the moment.
I see there is a design document that makes mention of packages, but no further details are provided:
https://github.com/microsoft/azure-pipelines-yaml/blob/master/design/pipeline-resources.md
Cheers
Edit -
Is there a way to trigger a pipeline using Azure Artifacts in YAML? But I don't see an obvious way to do this at the moment?
Yes, you are right!
That is because the contents of this document are speculative designs and future features.
If you check the upper level of the design document you provided, there is a statement:
Azure Pipelines YAML - Design Docs
The design docs within this repo are created at different times during the development of Azure Pipelines, to support collaborative contributions to the design process. Design documents are for:
- features considered for implementation but never implemented
- already implemented features
- future ideas for features
The design docs in this repo may not represent the current state of an Azure Pipelines feature.
When you check the officially released document, YAML schema reference - Resources, it only lists:
resources:
  pipelines: [ pipeline ]
  repositories: [ repository ]
  containers: [ container ]
So an Azure Artifacts source in YAML should be a future feature at this moment. Hope MS can ship it one day soon.
Hope this answer clears up your question.
Build completion triggers are not yet supported in YAML syntax. After you create your YAML build pipeline, you can use the classic editor to specify a build completion trigger.
Reference:
https://learn.microsoft.com/en-us/azure/devops/pipelines/build/triggers?view=azure-devops&tabs=yaml#build-completion-triggers
Though I would suggest you use the below mechanisms to trigger the release:
Resource triggers
Resource triggers will be helpful in scenarios like the following (see the sketch after this list):
- I would like to trigger my pipeline when an artifact is published by the 'Helm-CI' pipeline that ran on a releases/* branch.
- I would like to trigger my pipeline when an artifact is published and tested as part of the Helm-CI pipeline and tagged as 'Production'.
- I would like to trigger my pipeline when the 'TFS-Update' pipeline has completed the 'Ring2' stage, so that I can run some diagnostics.
https://github.com/microsoft/azure-pipelines-yaml/blob/master/design/pipeline-triggers.md
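For the first scenario, a pipeline resource trigger might look like this minimal sketch; the alias helmCI is a made-up placeholder, while Helm-CI is the triggering pipeline named in the scenario:

resources:
  pipelines:
  - pipeline: helmCI   # alias used to reference the triggering pipeline
    source: Helm-CI    # name of the pipeline that publishes the artifact
    trigger:
      branches:
        include:
        - releases/*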
Webhook triggers
At the end of the CI pipeline, you can add a task that hits a webhook URL, which can then trigger your CD; that is one way, sketched below.
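A minimal sketch of such a final CI step, assuming the webhook URL is stored in a pipeline variable named CdWebhookUrl (a made-up name) and that the endpoint accepts a JSON payload:

- task: PowerShell@2
  displayName: 'Trigger CD via webhook'
  inputs:
    targetType: 'inline'
    script: |
      # Post the build id to the (hypothetical) CD webhook endpoint
      Invoke-RestMethod -Method Post -Uri "$(CdWebhookUrl)" `
        -ContentType 'application/json' `
        -Body (@{ buildId = "$(Build.BuildId)" } | ConvertTo-Json)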
Hope it helps.
You can use a multi-stage pipeline to achieve this.
One stage would include a task that pushes your artifacts to the feed. The next stage would contain the other jobs that you want executed after pushing the artifacts.
E.g.:
stages:
# Stage for preparing the artifact
- stage: prepare
  jobs:
  - job: prepare
    pool:
      vmImage: xx
    steps:
    - task: PublishBuildArtifacts@1
      inputs:
        pathToPublish: xx
        artifactName: xx
# Next stage in your pipeline
- stage: build
  dependsOn: prepare
  jobs:
  - job: build   # a job entry is required here; steps cannot sit directly under jobs
    steps:
    - task: xx
Note that the second stage, build, dependsOn the stage prepare.
P.S.: multi-stage pipelines are currently in preview. If you enable them from the preview features, you will also be able to see a nice visual representation of the stages.

Cannot authorize variable group in Azure Pipelines

I am constructing a multi-stage Azure Pipeline in Azure DevOps to build and release my product.
I want to use variable groups in this pipeline, so that I can substitute different configuration values for different deployment environments.
I am unable to authorize my variable groups to be used by the pipeline.
When I manually run a build, I see a message on the summary page telling me the variable group is not authorized.
The Azure DevOps documentation says this is to be expected:
When you make changes to the YAML file and add additional resources (assuming that these are not authorized for use in all pipelines, as explained above), then the build fails with a resource authorization error that is similar to the following: Could not find a {RESOURCE} with name {NAME}. The {RESOURCE} does not exist or has not been authorized for use.
In this case, on the Summary page, you will see an option to authorize the resources on the failed build. If you are a member of the User role for the resource, you can select this option. Once the resources are authorized, you can start a new build.
I am a member of the User role for the variable group, and I am seeing the message, but I am presented with no option to authorize. Is there something else I need to do? Or is there another way I can authorize a specific pipeline to use a variable group?
The solution proposed by @hey didn't work for me because I had to use deployment jobs. I've found a hacky way to resolve this error:
1. Go to your pipeline
2. Edit
3. Click on the three dots > Triggers
4. Navigate to the Variables tab
5. Variable groups
6. Add variable groups
Variable groups can only be accessed if they are imported at the "job" level.
Solution:
I have tested this and reproduced your issue. In order to solve it, you need to add the variable group under "job".
Explanation / Analysis:
This is how to reproduce and solve the issue.
First, I tested with the below YAML script, adding the variable group to the stage (literally at the same level as jobs):
stages:
- stage: build
  variables:
  - group: 789
  jobs:
  - job: run_build
    pool:
      vmImage: 'Ubuntu 16.04'
    steps:
    - script: echo Build
With this configuration, I was not able to use the variable group; I got the same issue as you.
I then moved the variable group into the job section of the YAML file:
stages:
- stage: build
  jobs:
  - job: run_build
    pool:
      vmImage: 'Ubuntu 16.04'
    steps:
    - script: echo Build
    variables:
    - group: 789
With the modified configuration, I was able to see and use the "Authorize resources" option in the error message.
I had this issue as well, but it was because, when I created a variable group under Pipelines > Library, the name in the Azure Portal did not match the name in my YAML file (see the snippet below).
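For reference, the group name referenced in YAML must match the library group name exactly; 'My-Variable-Group' below is a placeholder:

variables:
- group: 'My-Variable-Group'   # must exactly match the group's name under Pipelines > Library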
