We have an existing Azure DevOps CI/CD setup. For every project, we have to set it up manually and add the job agent tasks one by one.
Now we have decided to use IaC to deploy it automatically with Terraform (as we have been using Terraform for other projects as well). But there isn't much documentation available there yet aside from this. It does have information on how to provision the project itself, but not the pipelines and the other pieces, so it is kind of limited at the moment, or I may just not have been able to find the complete documentation of the available resources.
We are keen to use Terraform for automating the creation of our CI/CD infrastructure; I just can't figure out how to create agent jobs and tasks.
We ended up creating our own fork of this last year, and added things that were missing, like service connections.
However, for the pipelines themselves, Azure DevOps expects you to use azure-pipelines.yml for the actual pipeline definition.
To have a build defined by Terraform in Azure DevOps, something like this would work:
resource "azuredevops_build_definition" "build_definition" {
project_id = azuredevops_project.project.id
name = "My Awesome Build Pipeline"
path = "\\"
repository {
repo_type = "TfsGit"
repo_name = azuredevops_azure_git_repository.repository.name
branch_name = azuredevops_azure_git_repository.repository.default_branch
yml_path = "path to your azure-pipelines.yaml file in the repo"
}
}
So, within the repo you are running Terraform from, just have an azure-pipelines.yml describing the pipeline you wish to execute.
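For reference, a minimal azure-pipelines.yml could look something like this (the trigger branch and the single step are illustrative):

trigger:
  - main

pool:
  vmImage: 'ubuntu-latest'

steps:
  - script: echo "Hello from the pipeline"
    displayName: 'Run a one-line script'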
Damian Brady has a good blog on this from October 2018:
https://damianbrady.com.au/2018/10/10/what-yaml-do-i-need-for-azure-pipelines/
There is also a lot of documentation around the supported Azure DevOps YAML Schema:
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema
Once comfortable with the basics, you can start looking towards templates if you find common patterns:
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema#template-references
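For example, a pipeline can pull in a shared steps template from a relative path (the file name and parameter below are illustrative):

# azure-pipelines.yml
steps:
  - template: templates/build-steps.yml
    parameters:
      buildConfiguration: 'Release'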
Hope this helps!
Related
I’ve got a mono repo, which has 10 separate CICD pipelines written in yaml.
I’ve noticed lately that we’ve lost a vast number of runs, and some of them had successful production releases.
Am I right in thinking that the project retention settings apply to all pipelines, rather than to individual ones?
I've been reading the MS website, and I think that in order to retain them going forward, I have to use the API via a PowerShell script.
I assume that script needs to run after a successful deployment to production.
I’m quite surprised that there isn’t a global option to say ‘keep all production releases’
The project retention policy settings are applied to all pipeline runs, not individual ones, so you cannot use this setting to retain specific successful production releases directly.
To achieve this, you can use a PowerShell script to retain these specific runs, gated by a condition. Add the PowerShell script as the last task of your deployment to check whether the run needs to be retained. Refer to this official doc: https://learn.microsoft.com/en-us/azure/devops/pipelines/build/run-retention?view=azure-devops
Here is an example that retains a run (effectively forever) based on a condition:
- powershell: |
    $contentType = "application/json";
    $headers = @{ Authorization = 'Bearer $(System.AccessToken)' };
    $rawRequest = @{ daysValid = 365000; definitionId = $(System.DefinitionId); ownerId = 'User:$(Build.RequestedForId)'; protectPipeline = $false; runId = $(Build.BuildId) };
    $request = ConvertTo-Json @($rawRequest);
    $uri = "$(System.CollectionUri)$(System.TeamProject)/_apis/build/retention/leases?api-version=6.0-preview.1";
    Invoke-RestMethod -uri $uri -method POST -Headers $headers -ContentType $contentType -Body $request;
  displayName: 'PowerShell Script'
  condition: <your custom condition>
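For example, assuming you only want to retain runs that succeeded and were deployed from your main branch (the branch name is illustrative), the condition could be something like:

condition: and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/main'))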
I was hoping to get some feedback on using Azure Pipelines and what the best practices are for my situation.
We have recently migrated from TFS 2017 and we are in the process of re-writing all our pipelines. Prior to the upgrade we were using classic builds and releases with the legacy build tasks. We would like to set up more useful YAML pipelines.
Let me set the stage with what we currently have:
10+ microservices
10 individual builds that trigger from a folder in the repo for each one
10 releases that get created on successful build
3 environments per release (Dev, QA, UAT)
So in summary... a build of a single microservice is triggered by a commit to its folder in the branch. The successful build then triggers a release to Dev. Once Dev completes, a user can go and start a QA deployment by clicking through the release.
In the new Azure Pipelines world, what would be the best approach to this model?
We would like to have all the builds happen in a single pipeline (each stage would be a microservice?)
How do we trigger only on a commit to that folder? (See the sketch after this list for the kind of trigger I mean.)
What would the CD look like? Should it be in the same pipeline and be a new stage?
How can we easily add environments without having to keep copy/pasting all the code for each environment? Ideally I would like to just be able to add a variable and have a new environment deployed to.
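For reference, the kind of folder-scoped trigger I mean would look roughly like this in YAML (the branch and path are just examples):

trigger:
  branches:
    include:
      - main
  paths:
    include:
      - src/ServiceA/*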
I am open to any suggestions here. I am ok if I am way off here, I am looking for the best practices and best approach to this.
TIA
I have had good experience with automating pipeline creation (in the case of a huge number of repos).
For example, a project has 20 similar repos with a Java app (like a microservice), and the pipeline for each of them differs only by repo URL (and a few other minor attributes). The CI/CD process for each of them is the same.
So we can create a separate devops repo with a declarative configuration for our services. We can also create a single pipeline which pulls the devops repo and creates all the needed pipelines for each repo in the configuration (this operation is executed only once at the beginning, and again whenever we want to change the devops configuration).
I have implemented this using Jenkins. Now I am going to do the same using GitLab CI, but I can't figure out how it is possible.
Is it possible to create a pipeline from another one (dynamically)?
Any suggestions?
You can use include and put the generic pipeline in your devops repo.
In your Java repos you can include the devops pipeline and set the variables which are specific to the respective Java repo.
So the pipeline for your Java repos can be as short as this:
include:
  - project: 'your-group/devops-repository'
    file: '.generic-ci.yml'

variables:
  FOO: bar
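The generic pipeline in the devops repo then consumes those variables. A minimal sketch of what .generic-ci.yml could contain (the job and the FOO variable are illustrative):

# .generic-ci.yml in your-group/devops-repository
build:
  stage: build
  script:
    - echo "Building with FOO=$FOO"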
We are building a set of serverless functions in Azure, but are having difficulty deciding how to structure our source (Azure Git) and DevOps to support them.
I am thinking of a single Git repo, with all function apps housed independently within projects. We may have a lot of these function apps; we see great value in small code segments that do utility-type work, and I don't want dozens and dozens of independent repos just because of DevOps deployments.
Is there a way to have a unique build and release process for each project, rather than for the repo as a whole? We aren't clear how this can be done, and searches have come up empty. I thought it was possible to have unique build YAMLs per project across many projects in a single repo, but it's unclear how to implement the DevOps build and release pipelines to support this approach, i.e. only a single function gets updated and we need to deploy it. Any guidance on whether this is possible and how to approach it would be great.
I haven't done this myself, but I'm in a similar situation where I'd like to have multiple functions (and other stuff) in a single Git repo for simplicity, but only build/deploy them as needed when they change. It looks like you can have multiple pipelines on a single repo with a different YAML file for each pipeline. The steps are documented in this link, and summarized below
In Azure DevOps, create a new Pipeline.
For the "Where is your code?" page, at the bottom choose the Use the classic editor option.
Select your source repo and branch.
On the "Select a template" screen, choose the YAML option at the top. Hit Apply.
There is a YAML file path field where you can specify the path and name of your YAML file for the pipeline.
You may want to set the pipeline to run manually if you don't want a build each time there's a commit to the repo.
EDIT: There may be an easier way to do this now. If you go through the New Pipeline wizard and select your source location, on the Configure tab at the bottom you can choose the Existing Azure Pipelines YAML file option. This lets you select a custom YAML file directly.
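With multiple pipelines set up this way, each function's YAML can restrict its trigger to its own folder, so only the function that changed gets built and deployed. A rough sketch, assuming a folder layout like functions/func-a (the paths and the build step are illustrative):

# functions/func-a/azure-pipelines.yml
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - functions/func-a/*

pool:
  vmImage: 'ubuntu-latest'

steps:
  - script: echo "Build and deploy func-a here"
    displayName: 'Build func-a'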
Being a novice to ADF CI/CD, I am currently exploring how we can update pipeline-scoped parameters when we deploy a pipeline from one environment to another.
Here is the detailed scenario:
I have a simple ADF pipeline with a copy activity moving files from one blob container to another.
Example: below there is a copy activity, and the pipeline has two parameters named:
1- SourceBlobContainer
2- SinkBlobContainer
with their default values.
Here is how the dataset is configured to consume these pipeline-scoped parameters.
Since this is the development environment, the default values are fine. But the Test environment will have containers with altogether different names (like "TestSourceBlob" & "TestSinkBlob").
Having said that, the CI/CD process should handle this by updating the default values of these parameters when the pipeline is deployed from one environment to the next.
Reading through the documentation, I found nothing that handles such a use case.
Here are some links which I referred to:
http://datanrg.blogspot.com/2019/02/continuous-integration-and-delivery.html
https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment
Thoughts on how to handle this will be much appreciated. :-)
There is another approach, as opposed to the ARM templates located in the 'adf_publish' branch.
Many companies leverage that workaround and it works great.
I have spent several days and built a brand new PowerShell module to publish the whole Azure Data Factory code from your master branch or directly from your local machine. The module resolves all the pain points that have existed so far in other solutions, including:
replacing any property in a JSON file (ADF object),
deploying objects in an appropriate order,
deploying only a subset of objects,
deleting objects that no longer exist in the source,
stopping/starting triggers, etc.
The module is publicly available in the PS Gallery: azure.datafactory.tools
Source code and full documentation are on GitHub here.
Let me know if you have any questions or concerns.
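As a quick illustration of how it can slot into a pipeline (the resource names and paths below are placeholders; see the documentation for full usage and parameters):

# Placeholder names; see the azure.datafactory.tools documentation for full usage
- powershell: |
    Install-Module -Name azure.datafactory.tools -Scope CurrentUser -Force
    Publish-AdfV2FromJson -RootFolder "$(Build.SourcesDirectory)/adf" `
      -ResourceGroupName 'my-rg' `
      -DataFactoryName 'my-adf' `
      -Location 'West Europe'
  displayName: 'Publish ADF from JSON'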
There is a "new" way to do ci/cd for ADF that should handle this exact use case. What I typically do is add global parameters and then reference those everywhere (in your case from the pipeline parameters). Then in your build you can override the global parameters with the values that you want. Here are some links to references that I used to get this working.
The "new" ci/cd method following something like what is outlined here Azure Data Factory CI-CD made simple: Building and deploying ARM templates with Azure DevOps YAML Pipelines. If you have followed this, something like this should work in your yaml:
overrideParameters: '-dataFactory_properties_globalParameters_environment_value "new value here"'
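For context, that overrideParameters value is passed to the ARM template deployment task. A rough sketch of the task (the service connection, resource group, and artifact paths are placeholders):

- task: AzureResourceManagerTemplateDeployment@3
  inputs:
    deploymentScope: 'Resource Group'
    azureResourceManagerConnection: 'my-service-connection'
    subscriptionId: '$(subscriptionId)'
    resourceGroupName: 'my-rg'
    location: 'West Europe'
    csmFile: '$(Pipeline.Workspace)/adf-artifact/ARMTemplateForFactory.json'
    csmParametersFile: '$(Pipeline.Workspace)/adf-artifact/ARMTemplateParametersForFactory.json'
    overrideParameters: '-dataFactory_properties_globalParameters_environment_value "new value here"'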
Here is an article that goes into more detail on the overrideParameters: ADF Release - Set global params during deployment
Here is a reference on global parameters and how to get them exposed to your ci/cd pipeline: Global parameters in Azure Data Factory