I'd like to deploy additional services (e.g. a certain kind of database) alongside my (review) application when using GitLab Auto DevOps. Customizing Auto DevOps/Custom Helm chart suggests creating a ./chart directory and putting the full chart there.
Does that mean I have to copy the complete default chart to my repository, only to add a ./chart/requirements.yaml file? Is there a possibility to somehow "merge" the charts? I don't need to fully replace the default chart with a custom chart, I only need to add requirements (and override a few settings in ./chart/values.yaml; which can already be done with Customize Helm chart values).
What's the canonical way to add a requirement to the default chart?
Copy the complete default chart and adapt? (feels like overkill and needs to be kept in sync manually)
Override the deployment CI pipeline job to download the default chart and merge it with the custom requirements?
Find a way similar to customizing Helm chart values – does it exist?
Add another CI pipeline job to deploy the requirement separately?
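To illustrate what I mean by "adding requirements": all I'd want to add is a small dependencies file along these lines (a hypothetical PostgreSQL dependency; the chart name, version, and repository URL are just examples):

# chart/requirements.yaml (hypothetical example)
dependencies:
  - name: postgresql
    version: 8.6.4
    repository: https://charts.bitnami.com/bitnami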
I'd like to create some project release versions (released and unreleased). In JIRA, you can find these versions in the following JSON file. I need these versions to associate with work items in Azure Boards, for example, to show that a work item (Type: Bug) was found in a certain version. My project is not built on Azure Pipelines, so I don't have any release pipeline. Is there any way to define these versions and name them with whatever names I want? Thanks.
https://jira.XXXX.com/rest/api/latest/project/XXXXX/versions
Currently, there seems to be no way to directly link work items on Azure Boards to third-party CI/CD pipelines; only Azure Pipelines is supported.
As a workaround, you can try to add a custom field on the work item to show the related release as the field value.
For example, add a text field named "Release" on the Bug item, then you can fill this field with the release version or the URL of the related release.
In the third-party CI/CD pipeline, you can set up a step that calls the "Work Items - Update" REST API to automatically fill the release version into the field.
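For example, the call could look like this (a sketch only; "Custom.Release" is a hypothetical field reference name, and the version value is illustrative):

PATCH https://dev.azure.com/{organization}/{project}/_apis/wit/workitems/{id}?api-version=6.0
Content-Type: application/json-patch+json

[
  {
    "op": "add",
    "path": "/fields/Custom.Release",
    "value": "1.2.0"
  }
]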
We are building a set of serverless functions in Azure, but we are having difficulty deciding how to structure our source (Azure Git) and DevOps pipelines to support them.
I am thinking of a single Git repo, with all function apps housed independently within projects. We may have a lot of these function apps; we see great value in small code segments that do utility-type work, and I don't want dozens and dozens of independent repos just because of DevOps deployments. Is there a way to have a unique build and release process for each project, rather than for the repo as a whole? We aren't clear on how this can be done, and searches have come up empty. I thought it was possible to have unique build YAMLs per project across many projects in a single repo, but it's unclear how to implement the DevOps build and release pipelines to support this approach, i.e., when only a single function gets updated and we need to deploy it. Any guidance on whether this is possible and how to approach it would be great.
I haven't done this myself, but I'm in a similar situation where I'd like to have multiple functions (and other stuff) in a single Git repo for simplicity, but only build/deploy them as needed when they change. It looks like you can have multiple pipelines on a single repo with a different YAML file for each pipeline. The steps are documented in this link, and summarized below
In Azure DevOps, create a new Pipeline.
For the "Where is your code?" page, at the bottom choose the Use the classic editor option.
Select your source repo and branch.
On the "Select a template" screen, choose the YAML option at the top. Hit Apply.
There is a YAML file path field where you can specify the path and name of your YAML file for the pipeline.
You may want to set the pipeline to run manually if you don't want a build each time there's a commit to the repo.
EDIT: There may be an easier way to do this now. If you go through the New Pipeline wizard and select your source location, at the bottom of the Configure tab you can choose the Existing Azure Pipelines YAML file option. This lets you select a custom YAML file directly.
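Once the pipelines exist, each one can also be scoped with path filters so it only triggers when its own function's folder changes. A hedged sketch of what one function's YAML might look like (the folder and file names are hypothetical):

# FunctionA/azure-pipelines.yml (hypothetical layout)
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - FunctionA/*

pool:
  vmImage: ubuntu-latest

steps:
  # Replace this placeholder with the real build/deploy steps for the function
  - script: echo "Build and deploy FunctionA here"
    displayName: Build FunctionA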
Being a novice at ADF CI/CD, I am currently exploring how we can update pipeline-scoped parameters when we deploy a pipeline from one environment to another.
Here is the detailed scenario:
I have a simple ADF pipeline with a copy activity moving files from one blob container to another.
Example: below there is a copy activity, and the pipeline has two parameters with their default values:
1. SourceBlobContainer
2. SinkBlobContainer
Here is how the dataset is configured to consume these pipeline-scoped parameters.
Since this is the development environment, the default values are fine. But the test environment will have containers with altogether different names (like "TestSourceBlob" & "TestSinkBlob").
That said, the CI/CD process should handle this by updating the default values of these parameters during deployment.
Reading the documentation, I found nothing that covers such a use case.
Here are some links I referred to:
http://datanrg.blogspot.com/2019/02/continuous-integration-and-delivery.html
https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment
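From the second (Microsoft) link, my best guess is that a custom parameterization template (an arm-template-parameters-definition.json file in the root of the collaboration branch) could expose pipeline parameter defaults as ARM template parameters, something like this (my untested reading of the docs; "=" should keep the current value as the parameter's default):

{
  "Microsoft.DataFactory/factories/pipelines": {
    "properties": {
      "parameters": {
        "*": {
          "defaultValue": "="
        }
      }
    }
  }
}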
Thoughts on how to handle this will be much appreciated. :-)
There is another approach, as opposed to using the ARM templates located in the adf_publish branch.
Many companies leverage that workaround, and it works great.
I have spent several days and built a brand-new PowerShell module to publish the whole Azure Data Factory code from your master branch or directly from your local machine. The module resolves all the pain points that have existed so far in other solutions, including:
replacing any property in a JSON file (ADF object),
deploying objects in the appropriate order,
deploying only a subset of objects,
deleting objects that no longer exist in the source,
stopping/starting triggers, etc.
The module is publicly available in PS Gallery: azure.datafactory.tools
Source code and full documentation are on GitHub here.
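In case it helps, calling it from an Azure DevOps YAML pipeline could look roughly like this (a minimal sketch using the module's Publish-AdfV2FromJson cmdlet; the service connection, folder, and resource names are placeholders):

steps:
  - task: AzurePowerShell@5
    displayName: Publish ADF from code
    inputs:
      azureSubscription: 'my-service-connection'   # placeholder service connection
      azurePowerShellVersion: LatestVersion
      ScriptType: InlineScript
      Inline: |
        Install-Module azure.datafactory.tools -Scope CurrentUser -Force
        # Publish all ADF objects from the repo folder to the target factory
        Publish-AdfV2FromJson -RootFolder "$(Build.SourcesDirectory)/adf" `
          -ResourceGroupName 'rg-adf-test' `
          -DataFactoryName 'adf-test' `
          -Location 'North Europe'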
Let me know if you have any questions or concerns.
There is a "new" way to do ci/cd for ADF that should handle this exact use case. What I typically do is add global parameters and then reference those everywhere (in your case from the pipeline parameters). Then in your build you can override the global parameters with the values that you want. Here are some links to references that I used to get this working.
The "new" ci/cd method following something like what is outlined here Azure Data Factory CI-CD made simple: Building and deploying ARM templates with Azure DevOps YAML Pipelines. If you have followed this, something like this should work in your yaml:
overrideParameters: '-dataFactory_properties_globalParameters_environment_value "new value here"'
Here is an article that goes into more detail on overrideParameters: ADF Release - Set global params during deployment
Here is a reference on global parameters and how to get them exposed to your ci/cd pipeline: Global parameters in Azure Data Factory
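Putting it together, the deployment step might look roughly like this (a sketch only; the service connection, paths, resource names, and the exact global parameter name are placeholders that depend on your factory):

steps:
  - task: AzureResourceManagerTemplateDeployment@3
    displayName: Deploy ADF ARM template
    inputs:
      deploymentScope: Resource Group
      azureResourceManagerConnection: 'my-service-connection'   # placeholder
      subscriptionId: '$(subscriptionId)'
      action: Create Or Update Resource Group
      resourceGroupName: 'rg-adf-test'
      location: 'North Europe'
      templateLocation: Linked artifact
      csmFile: '$(Pipeline.Workspace)/adf/ARMTemplateForFactory.json'
      csmParametersFile: '$(Pipeline.Workspace)/adf/ARMTemplateParametersForFactory.json'
      overrideParameters: '-dataFactory_properties_globalParameters_environment_value "test"'
      deploymentMode: Incremental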
Suppose we have 100 static websites of a similar type. They will have similar build pipeline tasks. So instead of creating the build and release pipelines one by one using the visual designer, is there a way to automate it so that they get created automatically?
You can do that via the REST API. Also, if the pipelines are in different repos, you can put an azure-pipelines.yaml in the root of each repo, and it will be picked up automatically.
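To keep the 100 YAML files from drifting apart, each repo's azure-pipelines.yaml could be a thin wrapper around one shared template repository (the repository and template names here are hypothetical):

# azure-pipelines.yaml in each website repo (hypothetical shared-template setup)
resources:
  repositories:
    - repository: templates
      type: git
      name: MyProject/pipeline-templates

stages:
  # All build/release steps live once, in the shared template
  - template: static-site-build.yml@templates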
To rename a pipeline: go to Builds > Edit > top right, and on the next screen you can rename it.
I have a build in Azure Pipelines, and one of the steps produces a code metric that I would like to be consumable after the build is done. Ideally, this would take the form of a badge (with text on the left and the metric as a number on the right). I'd like to put such a badge on the README of the repository to make this metric visible on a per-build basis.
Azure DevOps does have a REST API that one can use to access built-in aspects of a given build. But as far as I can tell there's no way to expose a custom statistic or value that is generated or provided during a build.
(The equivalent in TeamCity would be outputting ##teamcity[buildStatisticValue key='My Custom Metric' value='123'] via Console.WriteLine() from a simple C# program, that TeamCity can then consume and use/make available.)
Anyone have experience with this?
One option is to use a combination of adding a build tag via a logging command:
##vso[build.addbuildtag]My Custom Metric.123
Then use the Tags - Get Build Tags API.
GET https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}/tags?api-version=5.0
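For example, a step in your build could compute the metric and emit the tag (the metric name and value here are illustrative), and the Tags API above would then return it as a string you can parse into a badge:

steps:
  - script: |
      # In practice, read the value from your metric tooling's output
      echo "##vso[build.addbuildtag]My Custom Metric.123"
    displayName: Tag build with custom metric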