How can we define different variable groups for different schedule triggers in Azure DevOps?


Do you mean you would like to use variables from different variable groups in different schedule-triggered pipelines? If so, go to the UI -> Library -> choose the specific variable group, then choose Pipeline permissions and add the corresponding scheduled pipelines.
Then add the variable group name in the scheduled pipeline's YAML, and you can output the values of the variables in the group. For example, if there is a variable named 'SomeNumber' in the variable group:
variables:
- group: MyVarGroup
steps:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: Write-Host "$(SomeNumber)"
But if you mean adding variables inside the schedule trigger syntax itself: you can't use pipeline variables when specifying schedules in YAML. See the official docs for more details.
If this is not what you want, please clarify your issue with a sample YAML, and I will investigate further.
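For reference, a schedule trigger must be written with literal values. A minimal sketch (the cron expression and branch name here are assumptions):

```yaml
schedules:
- cron: '0 3 * * *'          # must be a literal string; $(var) is not allowed here
  displayName: Nightly run
  branches:
    include:
    - main
  always: false
```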

Related

Access pipeline A's variables from pipeline B in Azure DevOps

After some internet searching, I wasn't able to find a proper way or suggestion on how to access variables from different pipelines. Let's say, from pipeline A, access the variables of pipeline B.
What I did find is the idea of using Key Vault, which I am not able to use right now. I was wondering if there is a workaround, let's say, with PowerShell.
All of this is happening in an Azure DevOps environment, where I am trying to access/read variables from different pipelines.
Any ideas?
Kind regards
Leo.
You can make use of variable groups to share variables across multiple pipelines within a project.
You just need to reference the variable group in the YAML script or release pipeline and then use its variables in any pipeline. Refer below:
I went to my project > Pipelines > Library > Variable group > and added a variable. You can add multiple variables here, storing your secrets or values.
Using the variable group in a YAML pipeline:
trigger:
- main

pool:
  vmImage: ubuntu-latest

variables:
- group: SharedVariables

steps:
- script: |
    echo $(databaseserverpassword)
Now, when you run the pipeline, it will ask you to permit the use of the variable group for the pipeline.
This will enable access to all the variables in the SharedVariables group.
Output:
We got our databaseserverpassword value masked in the log, since secret variable values are masked in output.
You can also enable this variable group for all pipelines in the project by default.
You can use the same variable group in your Classic pipeline or release pipeline, either for the whole release or for specific stages, like below:
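For a YAML pipeline, scoping the same group to a single stage would look like this (a sketch; the stage and job names are assumptions, the group name follows the example above):

```yaml
stages:
- stage: Deploy
  variables:
  - group: SharedVariables   # scoped to this stage only
  jobs:
  - job: release
    steps:
    - script: echo $(databaseserverpassword)
```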
Reference:
Variable groups for Azure Pipelines - Azure Pipelines | Microsoft Learn
For PowerShell - Azure DevOps: how to manage CI/CD variable groups using PowerShell – Radu Narita (ranari.com)

Protect Tasks from modification in Azure Pipeline

How can I ensure that a particular scan is run at the end of an Azure Pipeline? The user should not be able to delete/modify this task. I cannot use decorators because they inject jobs across all pipelines in the Azure Organization, or I will need to filter using project names, which is not possible. Is there some other way to make this mandatory?
From your requirement, pipeline decorators can directly meet it.
Pipeline decorators also support filtering by project name.
You can define an if expression in the decorator YAML file to filter the projects.
Here is an example:
my-decorator.yml
steps:
- ${{ if in(variables['System.TeamProject'], '123', 'azure', 'ProjectC') }}:
  - task: CmdLine@2
    displayName: 'Run my script (injected from decorator)'
    inputs:
      script: 'echo "test"'
Result:
When the project name matches the filter, the pipeline decorator task runs.
If not, the pipeline decorator task does not run.
For more detailed info, you can refer to this doc: Use a decorator to inject steps into a pipeline

Need to trigger a pipeline to different Azure Cloud resources in different run

We have set up Data Factory pipelines in Azure DevOps. We are planning to deploy two different Data Factories. So, we want that if we run a pipeline for the first time, it uses the first Azure Data Factory's resources (call it DV01), and when any other user triggers that same pipeline in parallel, it uses the other Azure Data Factory's resources (call it DV02).
The two sets of resources can be called DV01 (it contains an ADF, Data Lake, Blob, Virtual Machines, etc.) and DV02 (it will also contain a different ADF, Data Lake, Blob, Virtual Machines, etc.).
How can we achieve this, given that we are using multiple resources (ADF, Data Lake, Blob, Virtual Machines, etc.) in Azure Cloud to perform operations in the pipeline?
We are planning to set up a new environment for different parallel jobs.
Based on your requirement, you want to set up another job to deploy DV02.
Please refer to the following sample:
name: $(Rev:r)

trigger:
- none

pool:
  vmImage: ubuntu-latest

stages:
- stage: A
  jobs:
  - job: A1
    condition: eq(variables['build.buildnumber'], '1')
    steps:
    - checkout: self
  - job: A2
    condition: ne(variables['build.buildnumber'], '1')
    steps:
    - task: PowerShell@2
      inputs:
        targetType: 'inline'
        script: |
          # Write your PowerShell commands here.
          Write-Host "Hello World"
You could define the build number with name: $(Rev:r); build.buildnumber will then count up with each pipeline run.
When the pipeline is triggered for the first time, the value will be 1. Then you can add a condition to each job:
when the value is 1, the first job runs; otherwise, the other job runs.
Here are the docs on conditions and build.buildnumber.

Azure DevOps pipeline trigger variables

We have the below scenario:
Pipeline 1 triggers pipeline 2 and also sends variables $(path1), $(path2), $(path3)... to pipeline 2.
I'm trying to find a way to do a for-each loop over the variables being sent from pipeline 1.
We are using YAML for the setup.
Thanks in advance.
You can use the each expression over an object parameter:
parameters:
- name: pipeline1vars   # object parameter receiving the values from pipeline 1
  type: object
  default: []

steps:
- ${{ each p in parameters.pipeline1vars }}:
  - script: echo ${{ p }}
Currently, we can use the expression "resources.pipeline.<Alias>.<var_name>" only to pass some predefined variables of the resource pipeline to the triggered pipeline (see here).
There is no way to directly pass custom variables from the resource pipeline to the triggered pipeline.
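For the predefined variables, a minimal sketch (the alias "source" and the upstream pipeline name are assumptions):

```yaml
resources:
  pipelines:
  - pipeline: source          # alias used in the expressions below
    source: pipeline-1        # assumed name of the triggering pipeline
    trigger: true

steps:
- script: |
    echo $(resources.pipeline.source.runID)
    echo $(resources.pipeline.source.sourceBranch)
```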
However, you can refer to the article below to try passing the variables through a pipeline artifact.
How to pass variables with pipeline trigger in Azure Pipeline
You can save all the variables you want to pass into the artifact file. And in the triggered pipeline, you can try to use a loop to read each variable saved in the file.
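A sketch of that artifact approach, assuming the variable names from the question and a hypothetical artifact name: pipeline 1 writes the values to a file and publishes it; the triggered pipeline downloads the file and loops over its lines.

```yaml
# Pipeline 1: save the variables to a file and publish it as an artifact
steps:
- script: |
    echo "$(path1)" >  $(Build.ArtifactStagingDirectory)/vars.txt
    echo "$(path2)" >> $(Build.ArtifactStagingDirectory)/vars.txt
    echo "$(path3)" >> $(Build.ArtifactStagingDirectory)/vars.txt
- publish: $(Build.ArtifactStagingDirectory)/vars.txt
  artifact: pipeline1vars

# Triggered pipeline: download the artifact and read each variable in a loop
steps:
- download: source            # assumed resource alias of pipeline 1
  artifact: pipeline1vars
- script: |
    while read -r p; do
      echo "$p"
    done < $(Pipeline.Workspace)/source/pipeline1vars/vars.txt
```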

Cannot authorize variable group in Azure Pipelines

I am constructing a multi-stage Azure Pipeline in Azure DevOps to build and release my product.
I want to use variable groups in this pipeline, so that I can substitute different configuration values for different deployment environments.
I am unable to authorize my variable groups to be used by the pipeline.
When I manually run a build, I see a message on the summary page telling me the variable group is not authorized:
The Azure DevOps documentation says this is to be expected:
When you make changes to the YAML file and add additional resources (assuming that these not authorized for use in all pipelines as explained above), then the build fails with a resource authorization error that is similar to the following: Could not find a {RESOURCE} with name {NAME}. The {RESOURCE} does not exist or has not been authorized for use.
In this case, on the Summary page, you will see an option to authorize the resources on the failed build. If you are a member of the User role for the resource, you can select this option. Once the resources are authorized, you can start a new build.
I am a member of the User role for the variable group, and I am seeing the message, but I am presented with no option to authorize. Is there something else I need to do? Or is there another way I can authorize a specific pipeline to use a variable group?
The solution proposed by @hey didn't work for me because I had to use deployment jobs. I've found a hacky way to resolve this error:
Go to your pipeline
Edit
Click on the three dots > Triggers
Navigate to the variables tab
Variable groups
Add variable groups
Variable groups can only be accessed if they are imported at the "job" level.
Solution:
I have tested and tried to reproduce your issue. In order to solve it, you need to add the variable group under "job".
Explanation / Analysis:
This is how to reproduce and solve the issue:
First, I have tested with the below yaml script, by adding the variable group to the stage (literally at the same level as jobs):
stages:
- stage: build
  variables:
  - group: 789
  jobs:
  - job: run_build
    pool:
      vmImage: 'Ubuntu 16.04'
    steps:
    - script: echo Build
With this configuration, I was not able to use the variable group. I got the same issue as you:
I then moved the variable group into the job section of the yaml file:
stages:
- stage: build
  jobs:
  - job: run_build
    pool:
      vmImage: 'Ubuntu 16.04'
    steps:
    - script: echo Build
    variables:
    - group: 789
With the modified configuration, I was able to see and use the Authorize resources option in the error message:
I had this issue as well, but it was because, when I created a variable group under Pipelines > Library, the name in the Azure Portal did not match the name in my yml file.
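In other words, the group reference in YAML must match the Library group name exactly (the name here is an assumption):

```yaml
variables:
- group: SharedVariables   # must match the name under Pipelines > Library character for character
```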
