I am trying to use ARM-TTK to unit test my ARM templates and to ensure that the templates follow a uniform style. I am only running a few of the tests.
We are using Azure Repos as our VCS.
I have incorporated this into my Azure DevOps pipeline as a pre-merge task in the form of a branch policy, so that before a PR is merged, these tests run and validate all the templates that are pushed to the main branch.
But the problem is that the tests are returning false positives even though there is nothing wrong with the JSON files.
According to the ARM-TTK documentation, it seems there has to be one azuredeploy.json or maintemplate.json, and all the other files are tested as linked templates.
My JSON files have other names that reflect the purpose of each template, like win_vm_deploy.json, function_app-deploy.json, and so on.
It is not possible for me to have all the files as linked templates of an azuredeploy.json or maintemplate.json as described in the URL.
I would also like to run the selected tests automatically against the files in the repo, rather than specifying a particular file to run the tests against.
So does that mean that in my situation I won't be able to use ARM-TTK and its unit tests?
What is the best way to check the templates in my folder and use the unit tests I choose from ARM-TTK, without having to keep a main template with all the other templates linked to it?
Appreciate any help
When several people are working together on a complex deployment, it is recommended to use separate JSON files linked to an azureDeploy.json or a mainTemplate.json file. But it's not mandatory to do this in every case.
To test a single file in a folder, add the -File parameter. However, the folder must still contain a main template named azuredeploy.json or maintemplate.json. In your case, each file needs to be passed to the test command from a script; there is no built-in shortcut that automates this.
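For example, here is a minimal sketch of an Azure Pipelines step that loops over every template in a folder and runs only the tests you pick. The templates folder, the test names, and the clone location of arm-ttk are assumptions; adjust them to your repo.

```yaml
steps:
# Assumes the Azure/arm-ttk repo has already been cloned to $(Build.SourcesDirectory)/arm-ttk
- powershell: |
    Import-Module "$(Build.SourcesDirectory)/arm-ttk/arm-ttk/arm-ttk.psd1"
    $failed = $false
    # 'templates' and the two test names below are placeholders
    foreach ($file in Get-ChildItem "$(Build.SourcesDirectory)/templates" -Filter '*.json') {
      $results = Test-AzTemplate -TemplatePath $file.FullName `
        -Test 'Parameters Must Be Referenced', 'Outputs Must Not Contain Secrets'
      foreach ($r in $results | Where-Object { -not $_.Passed }) {
        Write-Host "##vso[task.logissue type=error]$($file.Name): $($r.Name) failed"
        $failed = $true
      }
    }
    if ($failed) { exit 1 }  # fail the build so the branch policy blocks the PR
  displayName: Run selected ARM-TTK tests against every template
```

Since each file is passed individually via -TemplatePath, this should sidestep the azuredeploy.json/maintemplate.json naming convention, because every file is tested on its own rather than as a linked template.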
You can customize the default tests or even create your own. You can implement your own set of rules by authoring custom tests. A custom test needs to be placed in the correct directory:
_/arm-ttk/testcases/deploymentTemplate_
You can check this documentation for more information.
Also try these tasks for integration with Azure Pipelines.
I have a number of solutions, each of which have a mixture of applications and libraries. Generally speaking, the applications get built and deployed, and the libraries get published as NuGet packages to our internal packages feed. I'll call these "apps" and "nugets."
In my Classic Pipelines, I would have one build for the apps, and one for the nugets. With path filters, I would indicate folders that contain the nuget material, and only trigger the nuget build if those folders had changes. Likewise, the app build would have path filters to detect if any app code had changed. As a result, depending on what was changed in a branch, the app build might run, the nuget build might run, or both might run.
Now I'm trying to convert these to YAML. It seems we can only have one pipeline set up for CI, so I've combined the stages/jobs/steps for nugets and apps into this single pipeline. However, I can't seem to figure out a good way to only trigger the nuget tasks if the nuget path filters are satisfied and only the app tasks if the app path filters are satisfied.
I am hoping someone knows a way to do something similar to one of the following (or anything else that would solve the issue):
Have two different CI pipelines with their own sets of triggers and branch/path filters such that one or both might run on a given branch change
Set some variables based on which paths have changes so that I could later trigger the appropriate tasks using conditions
Make a pipeline always trigger, but only do tasks if a path filter is satisfied (so that the nuget build could always run but not necessarily do anything, and then the app build could be triggered by the nuget build completing, and itself only do work if path filters are satisfied).
"It seems we can only have one pipeline set up for CI"
My issue was that this was an erroneous conclusion. It appeared to me that, out of the box, a Pipeline is created for a repo with a YAML file in it, and that you can change which file the Pipeline uses, but you can't add a list of files for it to use. I did not realize I could create an additional Pipeline in the UI, and then associate it to a different YAML file in the repo.
Basically, my inexperience with this topic is showing. To future people who might find this, note that you can create as many Pipelines in the UI as you want, and associate each one to a different YAML file.
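For illustration, a sketch of what one of those YAML files might look like, using a path filter so the pipeline only triggers when its own folder changes (folder names are assumptions):

```yaml
# azure-pipelines-nugets.yml — registered as its own pipeline in the UI;
# a second file (e.g. azure-pipelines-apps.yml) would mirror it for the apps.
trigger:
  branches:
    include:
    - main
  paths:
    include:
    - src/nugets   # prefix match: any change under this folder triggers the pipeline
steps:
- script: echo "pack and push the NuGet packages here"
  displayName: Placeholder for the nuget build steps
```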
I have a repo with atlantis integration. This repo houses many stacks that use the same modules, each stack in its own folder and with distinct tfvars.
I generate new stacks using some automation, which generates the new directories, copies a bunch of *.tf files and sets the tfvars with the correct values. Unfortunately, this integration is brittle as I have no tests that can fail PRs when something changes in the module and is not updated in the template files.
What I want is an integration test which creates a test stack, gets it planned by atlantis, fails if the plan fails, and otherwise passes.
Is this achievable?
It doesn't sound like this problem needs to be solved by Atlantis. Atlantis is more of a tool for humans.
Try this using a pipeline such as a GitHub Actions workflow:
Create an examples/complete root module/stack
Instantiate a reusable module within the new root module
Write a test in terratest or similar that will run terraform init/apply/destroy
Add lots of outputs and check the outputs with the test
Run this test on every pull request to prevent breaking changes
See how the cloudposse GitHub org uses this method to verify that their modules are tested:
examples/complete
examples_complete_test.go
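A minimal sketch of such a workflow, assuming the test lives in a test/ directory with its own Go module and follows the naming above; versions and credentials handling are placeholders:

```yaml
# .github/workflows/terratest.yml — plan/apply the examples/complete stack
# on every pull request and fail the PR if anything breaks.
name: terratest
on:
  pull_request:
jobs:
  examples-complete:
    runs-on: ubuntu-latest
    steps:
    - uses: actions/checkout@v4
    - uses: hashicorp/setup-terraform@v3
      with:
        terraform_wrapper: false  # terratest shells out to the real terraform binary
    - uses: actions/setup-go@v5
      with:
        go-version: '1.22'
    # cloud credentials (e.g. via repository secrets or OIDC) are omitted here
    - name: Run integration test
      working-directory: test
      run: go test -run TestExamplesComplete -timeout 45m -v
```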
We have hundreds of similar projects in GitLab which have the same structure inside.
To build these projects we use one common TeamCity build. We trigger it and pass the project's GitLab URL along with other parameters to the build via the API, so the TeamCity build knows exactly which project needs to be fetched/cloned. The TeamCity VCS root accepts the target URL via a parameter.
The question is how to replace existing TeamCity build with a GitLab pipeline.
I see the general approach is to have a CI/CD configuration file (.gitlab-ci.yml) directly in each project. Since the structure of the projects is the same, duplicating the same CI/CD config file across all projects is not an option.
I'm wondering: is it possible to create a common pipeline for several projects which can accept the target project URL via a parameter?
You can store the full CI/CD config in one repository and put into each of your projects a simple .gitlab-ci.yml which includes the shared file.
With this approach there is no redundant definition of the jobs.
You can still add specific jobs to specific projects (in the respective .gitlab-ci.yml files), or define variables in a project and use some jobs conditionally. You can also include multiple other definition files, e.g. if you have several groups of similar projects.
cf. https://docs.gitlab.com/ee/ci/yaml/#include
With latest GitLab (13.9) there are even more referencing methods possible: https://docs.gitlab.com/ee/ci/yaml/README.html#reference-tags
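As a sketch, each project's .gitlab-ci.yml could then be as small as this; the group, repository, and file names are assumptions:

```yaml
# .gitlab-ci.yml in each of the similar projects
include:
  - project: 'my-group/ci-templates'  # hypothetical shared repo
    ref: main
    file: '/common-pipeline.yml'

# project-specific additions still work alongside the include
extra-check:
  stage: test
  script:
    - echo "only this project runs this job"
```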
As MrTux already pointed out, you can use includes.
You can either use it to include a whole CI file, or to include just certain steps. In "Having Gitlab Projects calling the same gitlab-ci.yml stored in a central location" you can find a detailed explanation with examples of both usages.
We are building a set of serverless functions in Azure, but having difficulty deciding how to structure our source (Azure GIT) and DevOps to support them.
I am thinking of a single Git repo, with all function apps housed independently within projects. We may have a lot of these function apps, since we see great value in small code segments doing utility work, and I don't want dozens and dozens of independent repos just because of DevOps deployments.
Is there a way to have a unique build and release process for each project, rather than for the repo as a whole? We aren't clear how this can be done, and searches have come up empty. I thought it was possible to have unique build YAMLs per project across many projects in a single repo, but it's unclear how to implement the DevOps build and release pipelines to support this approach, i.e. only a single function gets updated and we need to deploy it. Any guidance on whether this is possible and how to approach it would be great.
I haven't done this myself, but I'm in a similar situation where I'd like to have multiple functions (and other things) in a single Git repo for simplicity, but only build/deploy them as needed when they change. It looks like you can have multiple pipelines on a single repo, with a different YAML file for each pipeline. The steps are documented in this link and summarized below:
In Azure DevOps, create a new Pipeline.
For the "Where is your code?" page, at the bottom choose the Use the classic editor option.
Select your source repo and branch.
On the "Select a template" screen, choose the YAML option at the top. Hit Apply.
There is a YAML file path field where you can specify the path and name of your YAML file for the pipeline.
You may want to set the pipeline to run manually if you don't want a build each time there's a commit to the repo.
EDIT: There may be an easier way to do this now. If you go through the New Pipeline wizard and select your source location, on the Configure tab at the bottom you can choose the Existing Azure Pipelines YAML file option. This lets you select a custom YAML file directly.
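Combined with path-filtered triggers, each function app can then get its own small pipeline file. A hypothetical sketch, where the folder name and build command are assumptions:

```yaml
# functions/orders-func/azure-pipelines.yml — registered as its own pipeline;
# it only triggers when files under this function's folder change.
trigger:
  branches:
    include:
    - main
  paths:
    include:
    - functions/orders-func
steps:
- script: dotnet publish functions/orders-func -c Release -o $(Build.ArtifactStagingDirectory)
  displayName: Build just this function app
```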
There are tons of resources online on how to replace JSON configuration values in a release pipeline, like this one. I configured it and it works. However, we also have multiple integration tests which reach the database, and these run at build time. I haven't seen any option to replace config values in the build pipeline. Does it exist? Or do I really have to use this custom task (see screenshot below)?
There is now an out-of-the-box task from Microsoft. It's called File Transform. It's currently in preview, but it works really well! I haven't had any issues whatsoever with it, and it works the same as you would configure it in the release pipeline. I would recommend it any day!
Below you can see my configuration.
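In YAML form it might look like this sketch; the folder path and target file pattern are assumptions, and pipeline variables whose names match JSON keys (e.g. ConnectionStrings.Default) overwrite those values:

```yaml
- task: FileTransform@1
  displayName: Substitute test settings before the integration tests run
  inputs:
    folderPath: '$(System.DefaultWorkingDirectory)/tests'  # hypothetical test folder
    fileType: 'json'
    targetFiles: '**/appsettings.json'
```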
There is no out-of-the-box task solely for replacing tokens/values in files (also, in the release pipeline that task is Azure App Service Deploy, which does more than replace JSON configuration).
You need to use an external extension from here, or write a PowerShell script for that.
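If you go the script route, a minimal sketch of such a step might look like this; the file path, key name, and pipeline variable are all assumptions:

```yaml
- powershell: |
    $path = "$(System.DefaultWorkingDirectory)/tests/appsettings.json"  # hypothetical file
    $json = Get-Content $path -Raw | ConvertFrom-Json
    # overwrite the connection string with a pipeline variable (assumed to be defined)
    $json.ConnectionStrings.Default = "$(TestDbConnectionString)"
    $json | ConvertTo-Json -Depth 10 | Set-Content $path
  displayName: Replace config values before build-time tests
```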