Azure Pipelines (DevOps): Custom Consumable Statistic/Metric

I have a build on Azure Pipelines, and one of the steps produces a code metric that I would like to be consumable after the build is done. Ideally, this would take the form of a badge (text on the left, the metric as a number on the right). I'd like to put such a badge on the README of the repository to make this metric visible on a per-build basis.
Azure DevOps does have a REST API that one can use to access built-in aspects of a given build. But as far as I can tell there's no way to expose a custom statistic or value that is generated or provided during a build.
(The equivalent in TeamCity would be outputting ##teamcity[buildStatisticValue key='My Custom Metric' value='123'] via Console.WriteLine() from a simple C# program, which TeamCity can then consume and make available.)
Anyone have experience with this?

One option is to combine adding a build tag via a logging command:
##vso[build.addbuildtag]"My Custom Metric.123"
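For instance, mirroring the TeamCity snippet above, a minimal C# sketch could write the logging command to standard output from any build step (the metric value and program name here are just illustrative):

```csharp
using System;

class EmitMetricTag
{
    static void Main()
    {
        // Hypothetical metric value computed earlier in the build.
        int codeMetric = 123;

        // Azure Pipelines parses ##vso logging commands written to stdout
        // and adds the value as a tag on the current build.
        Console.WriteLine($"##vso[build.addbuildtag]My Custom Metric.{codeMetric}");
    }
}
```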
Then use the Tags - Get Build Tags API.
GET https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}/tags?api-version=5.0
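A hedged sketch of reading the tags back afterwards with HttpClient; the PAT environment variable name and the {organization}/{project}/{buildId} placeholders are illustrative:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class GetBuildTags
{
    static async Task Main()
    {
        // Placeholders: fill in your organization, project, and build id.
        var url = "https://dev.azure.com/{organization}/{project}/_apis/build/builds/{buildId}/tags?api-version=5.0";

        using var client = new HttpClient();

        // Basic auth with an empty user name and a PAT is the usual pattern
        // for calling the Azure DevOps REST API from outside a pipeline.
        var pat = Environment.GetEnvironmentVariable("AZURE_DEVOPS_PAT");
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue(
            "Basic", Convert.ToBase64String(Encoding.ASCII.GetBytes($":{pat}")));

        var response = await client.GetAsync(url);
        response.EnsureSuccessStatusCode();

        // Response shape: { "count": n, "value": [ "My Custom Metric.123", ... ] }
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```

From there you can parse the tag value out and render it however you like (for example, feed it to a badge-generating service).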

Related

Testing ARM-Templates with ARM-TTK

I am trying to use ARM-TTK for unit testing my ARM templates and ensuring that the templates follow uniformity. I am only running a few tests.
We are using Azure Repos as our VCS.
I have incorporated this into my Azure DevOps pipeline as a pre-PR-merge task in the form of a branch policy, so that before a PR is merged these tests run and validate all the templates pushed to the main branch.
But the problem is, the tests are returning false positives even though there are no issues with the JSON files.
According to this ARM-TTK link, it seems there has to be one azuredeploy.json or maintemplate.json, and all the other files are tested as linked templates.
I have JSON files with other names pertaining to the function of the template, like win_vm_deploy.json, function_app-deploy.json, etc.
It is not possible for me to have all the files as linked templates to the azuredeploy.json or maintemplate.json as mentioned in the URL.
I would also like to run the selected tests against the files loaded in the repo automatically and not specify a particular file to run the tests against.
So does that mean that in my situation I won't be able to use ARM-TTK and utilize the unit tests?
What is the best way to check the templates in my particular folder and utilize some of the unit tests that I choose from ARM-TTK, without having to have a main template and the other templates as linked templates?
Appreciate any help
When several people are working together on a complex deployment, it is recommended to use separate JSON files linked to an azureDeploy.json or a mainTemplate.json file. But it's not mandatory to do this in every case.
To test one file in that folder, add the -File parameter. However, the folder must still have a main template named azuredeploy.json or maintemplate.json. In your case, all the files would need to be specified in a script; there is no shortcut available for automating this.
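As a rough illustration of "all files need to be specified in a script", here is a minimal C# sketch that loops over every template in a folder and runs Test-AzTemplate against it via PowerShell. The folder paths, the module location, and the copy-each-file-to-azuredeploy.json workaround for the naming requirement are my assumptions, not something ARM-TTK prescribes:

```csharp
using System;
using System.Diagnostics;
using System.IO;

class RunArmTtkPerFile
{
    static void Main()
    {
        // Assumptions: templates live in ./templates and the arm-ttk module
        // has been cloned to ./arm-ttk. Adjust both paths for your repo.
        foreach (var template in Directory.GetFiles("templates", "*.json"))
        {
            // Work around the naming requirement by copying each template
            // into its own temp folder as azuredeploy.json.
            var tempDir = Directory.CreateDirectory(
                Path.Combine(Path.GetTempPath(), Path.GetFileNameWithoutExtension(template))).FullName;
            File.Copy(template, Path.Combine(tempDir, "azuredeploy.json"), overwrite: true);

            var psCommand =
                "Import-Module ./arm-ttk/arm-ttk.psd1; " +
                $"Test-AzTemplate -TemplatePath '{tempDir}'";

            var process = Process.Start(new ProcessStartInfo
            {
                FileName = "pwsh",
                Arguments = $"-NoProfile -Command \"{psCommand}\"",
                UseShellExecute = false,
                RedirectStandardOutput = true
            });

            Console.WriteLine($"=== {template} ===");
            Console.WriteLine(process.StandardOutput.ReadToEnd());
            process.WaitForExit();
        }
    }
}
```

If you only want a subset of tests, Test-AzTemplate also accepts a -Test parameter that takes the names of the tests to run.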
You can customize the default tests or even create your own. You can implement your own set of rules by authoring custom tests. A custom test needs to be placed in the correct directory:
/arm-ttk/testcases/deploymentTemplate
You can check this documentation for more information.
Also try this task for integration with Azure Pipelines.

How to add test result attachments during Azure Devops build or release via a C# application?

I want to add some test case attachments during a build or release, but I'm struggling to find a valid approach to do this. I'm not using MSTest.
I tried creating a custom build/release task but I've found the azure-devops-node-api package to be flaky at best, and seemingly lacking contributors.
This is what I would hope to do...
Use C# if possible
Have the code/task available for either a build or release across multiple repositories and projects (same organization) without code duplication
Automatically authenticate with the currently running build/release without needing PAT tokens or any other form of authentication
Access to both Azure File Storage and Azure Devops
Works with any build or release agent
Is this achievable? I've seen odd articles in various places but nothing like what's described above. For example, this shows promise in terms of validating the current build/release in a C# application, however it is 4 years old now and doesn't explain how to integrate with a pipeline.
Can anyone help?
Thanks,
I've always leveraged MSTest, so within the runner we've had access in C# to the TestContext, which supports adding the attachments directly to the test result.
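For reference, with MSTest that looks roughly like this (a minimal sketch; availability of AddResultFile can depend on the MSTest version and target framework):

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ExampleTests
{
    // MSTest injects the context for the currently running test.
    public TestContext TestContext { get; set; }

    [TestMethod]
    public void SomeTest()
    {
        // Attach a file to this test's result; the published results
        // carry the attachment into Azure DevOps.
        TestContext.AddResultFile(@"C:\temp\screenshot.png");
        Assert.IsTrue(true);
    }
}
```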
It looks like the API for adding attachments to runs is exposed though, so I would think you can create something either in C# or in PowerShell that accomplishes what you are asking. You will likely need to make sure the agent phase has access to the OAuth token.
POST https://dev.azure.com/{organization}/{project}/_apis/test/Runs/{runId}/Results/{testCaseResultId}/attachments?api-version=5.1-preview.1
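A hedged sketch of calling that endpoint from a C# step inside the pipeline, authenticating with the job's OAuth token (available to the step as SYSTEM_ACCESSTOKEN when you map $(System.AccessToken) into the environment, or enable "Allow scripts to access the OAuth token"); the organization, project, ids, and file name are placeholders:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

class AddTestResultAttachment
{
    static async Task Main()
    {
        // Placeholders: these would come from your pipeline/test run context.
        var organization = "myorg";
        var project = "myproject";
        var runId = 123;
        var testCaseResultId = 100000;

        // The job's OAuth token, mapped into the step as SYSTEM_ACCESSTOKEN.
        var token = Environment.GetEnvironmentVariable("SYSTEM_ACCESSTOKEN");

        using var client = new HttpClient();
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", token);

        // Attachment body: file contents go base64-encoded in the "stream" field.
        var bytes = await System.IO.File.ReadAllBytesAsync("screenshot.png");
        var payload = JsonSerializer.Serialize(new
        {
            stream = Convert.ToBase64String(bytes),
            fileName = "screenshot.png",
            comment = "Attached from a pipeline step",
            attachmentType = "GeneralAttachment"
        });
        var body = new StringContent(payload, Encoding.UTF8, "application/json");

        var url = $"https://dev.azure.com/{organization}/{project}/_apis/test/Runs/{runId}/Results/{testCaseResultId}/attachments?api-version=5.1-preview.1";
        var response = await client.PostAsync(url, body);
        response.EnsureSuccessStatusCode();
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}
```

Because it only relies on the REST API and the job's own token, something like this can live in a shared script or custom task and be reused across repositories and projects without a PAT.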

Azure Data Factory V2 multiple environments like in SSIS

I'm coming from a long SSIS background, and we're looking to use Azure Data Factory v2, but I'm struggling to find any clear way of working with multiple environments. In SSIS we would have project parameters tied to the Visual Studio project configuration (e.g. development/test/production); say there were two parameters, SourceServerName and DestinationServerName, these would point to different servers depending on whether we were in development or test.
From my initial playing around I can't see any way to do this in Data Factory. I've searched Google, of course, but any information I've found seems to be about CI/CD, then talks about Git 'branches' and is difficult to follow.
I'm basically looking for a very simple explanation and example of how this would be achieved in Azure Data Factory v2 (if it is even possible).
It works differently. You create one Data Factory instance per environment, and your environments are effectively embedded in each instance.
So here's one simple approach:
Create three data factories: dev, test, prod
Create your linked services in the dev environment pointing at dev sources and targets
Create the same named linked services in test, but of course these point at your test systems
Now when you "migrate" your pipelines from dev to test, they use the same logical name (just like a connection manager)
So you don't designate an environment at execution time or map variables or anything... everything in test just runs against test because that's the way the linked services have been defined.
That's the first step.
The next step is to connect only the dev ADF instance to Git. If you're a newcomer to Git it can be daunting but it's just a version control system. You save your code to it and it remembers every change you made.
Once your pipeline code is in git, the theory is that you migrate code out of git into higher environments in an automated fashion.
If you go through the links provided in the other answer, you'll see how you set it up.
I do have an issue with this approach though - you have to look up all of your environment values in a key store, which to me is silly, because why do we need to designate the test server's hostname every time we deploy to test?
One last thing: if you have a pipeline that doesn't use a linked service (say a REST pipeline), I haven't found a way to make it environment aware. I ended up building logic around the current data factory's name to dynamically change endpoints.
This is a bit of a brain dump, but feel free to ask questions.
Although it's not recommended - yes, you can do it.
Take a look at the Linked Service - in this case, I have a connection to an Azure SQL Database.
You can use dynamic content for both the server name and the database name.
Just add a parameter to your pipeline, pass it to the Linked Service and use it in the required field.
Let me know whether I explained it clearly enough.
Yes, it's possible although not so simple as it was in VS for SSIS.
1) First of all: there is no desktop application for developing ADF, only the browser.
Therefore, developers should make their changes in the DEV environment, and for many reasons the best way to do this is to work with a Git repository connected.
2) Then, you "only" need to:
a) Publish the changes (this creates/updates the adf_publish branch in Git).
b) With Azure DevOps, deploy the code from adf_publish, replacing the required parameters for the target environment.
I know that at the beginning it sounds horrible, but the sooner you set up an environment like this the more time you save while developing pipelines.
How to do these things step by step?
I describe all the steps in the following posts:
- Setting up Code Repository for Azure Data Factory v2
- Deployment of Azure Data Factory with Azure DevOps
I hope this helps.

How to change pipeline badge name

As the standard pipeline badge from GitLab looks like this, you can tell pretty well that those are not really distinguishable.
Is there a way to change the pipeline text manually or programmatically to something else for each badge?
Btw, the badges were added with those links
https://gitlab.com/my-group/my-repository/badges/master/pipeline.svg
https://gitlab.com/my-group/my-repository/badges/dev/pipeline.svg
Additional facts:
The pipeline runs locally on my computer
My repo is private
I know it is a bit of an old post, but I was looking for the same thing and found that this has been available since GitLab 13.1.
The text for a badge can be customized to differentiate between multiple coverage jobs that run in the same pipeline. Customize the badge text and width by adding the key_text=custom_text and key_width=custom_key_width parameters to the URL:
https://gitlab.com/gitlab-org/gitlab/badges/main/coverage.svg?job=karma&key_text=Frontend+Coverage&key_width=130
The example is for the Coverage badge but this also works for Pipelines, so in your case:
https://gitlab.com/my-group/my-repository/badges/master/pipeline.svg?key_text=master&key_width=50
https://gitlab.com/my-group/my-repository/badges/dev/pipeline.svg?key_text=dev&key_width=50
(Found this via https://microfluidics.utoronto.ca/gitlab/help/ci/pipelines/settings.md#custom-badge-text)
There are multiple ways to achieve custom pipeline badges in GitLab.
One way could be to use Shields.io, which provides a way to generate dynamic badges for your GitLab repository via a JSON file. But if your repository is private (only accessible from an internal network), then you will get an "inaccessible" message in your badges.
Otherwise, if your build uses Python Docker images or any other Python installation with dependencies, you can simply install the anybadge package and generate SVG badges to be used in the project from the artifacts directly.
It would be good if, in the future, GitLab offered a cleaner way to customize the badges, but for now I think those are the workarounds.

Is there a tool for manipulating Azure ARM templates

Using Azure, I would like to be able to extract an ARM template from a working, configured Resource Group. I would then like to be able to reproduce the environment using a different Resource Group name. As far as I can tell, this does not work out of the box. Is there a tool that will clean up and parameterize the ARM template so it can be used without so much work? The purpose is to be able to make test and prod copies of our dev environments. Am I just trying to do this the wrong way? Should I be using a specific tool?
There is no such tool, unfortunately. I suppose you are talking about the Automation script feature. It provides you with a baseline template, but it requires refinement (like you mentioned).
I'm afraid you'd have to clean it up manually.
