Azure pipeline - terratest - ERROR: Please run 'az login' to setup account - azure

I'm facing what seems to be a recurring problem when running Terratest in an Azure Pipeline.
While resources are created and then destroyed correctly, when I call the azure.ResourceGroupExists function (or any other azure.* function) I get the following error:
--- FAIL: TestTerraform_RM_resource_group (102.30s)
    resourcegroup.go:15:
        Error Trace:    resourcegroup.go:15
                        RM_resource_group_test.go:108
        Error:          Received unexpected error:
                        Invoking Azure CLI failed with the following error: ERROR: Please run 'az login' to setup account.
        Test:           TestTerraform_RM_resource_group
FAIL
According to some forum posts, it seems to be a configuration issue, and I followed all of the recommended configuration:
set the environment variables for Terraform:
-- ARM_CLIENT_ID
-- ARM_CLIENT_SECRET
-- ARM_SUBSCRIPTION_ID
-- ARM_TENANT_ID
run az login in an AzureCLI task outside the Go task for Terratest, as it seems that Terratest needs two different authentications (using the service principal client ID for this az login):
-- for Assert tests, it needs the ARM_CLIENT authentication
-- for Exists tests, it needs the service connection authentication
Here are the links I followed:
https://github.com/gruntwork-io/terratest/issues/454
https://github.com/gruntwork-io/terratest/tree/master/examples/azure#review-environment-variables
https://github.com/gruntwork-io/terratest/blob/master/modules/environment/envvar.go
https://blog.jcorioland.io/archives/2019/09/25/terraform-microsoft-azure-ci-docker-azure-pipeline.html
Below is my pipeline code, where TF_VAR_ARM_CLIENT_SECRET is a secret variable of the pipeline:
runOnce:
  deploy:
    steps:
    - checkout: self
    - task: ms-devlabs.custom-terraform-tasks.custom-terraform-installer-task.TerraformInstaller@0
      displayName: 'Install Terraform $(TERRAFORM_VERSION)'
      inputs:
        terraformVersion: $(TERRAFORM_VERSION)
    - task: GoTool@0
      displayName: 'Use Go $(GOVERSION)'
      inputs:
        version: $(GOVERSION)
        goPath: $(GOPATH)
        goBin: $(GOBIN)
    - task: Go@0
      displayName: 'Install Go Terratest module'
      inputs:
        command: get
        arguments: '$(TF_LOG) github.com/gruntwork-io/terratest/modules/terraform'
    - task: Go@0
      displayName: 'Install Go Assert module'
      inputs:
        command: get
        arguments: '$(TF_LOG) github.com/stretchr/testify/assert'
    - task: Go@0
      displayName: 'Install Go Terratest Azure module'
      inputs:
        command: get
        arguments: '$(TF_LOG) github.com/gruntwork-io/terratest/modules/azure'
    - task: Go@0
      displayName: 'Install Go hashicorp/terraform-json module'
      inputs:
        command: get
        arguments: '$(TF_LOG) github.com/hashicorp/terraform-json'
    - task: Go@0
      displayName: 'Install Go azure-sdk-for-go module'
      inputs:
        command: get
        arguments: '$(TF_LOG) github.com/Azure/azure-sdk-for-go'
    - task: AzureCLI@2
      displayName: Azure CLI
      inputs:
        azureSubscription: $(serviceConnection)
        scriptType: ps
        scriptLocation: inlineScript
        inlineScript: |
          az login --service-principal --username $(TF_VAR_ARM_CLIENT_ID) --password $(TF_VAR_ARM_CLIENT_SECRET) --tenant 'f5ff14e7-93c8-49f7-9706-7beea059bd32'
    # Go test command
    - task: Go@0
      displayName: 'Run Go terratest for resource_Modules'
      inputs:
        command: test
        arguments: '$(TF_LOG) $(pathToTerraformRootModule)\resource_group\'
      env:
        ARM_CLIENT_SECRET: $(TF_VAR_ARM_CLIENT_SECRET) # pipeline secret variable
        ARM_CLIENT_ID: $(TF_VAR_ARM_CLIENT_ID)
        ARM_SUBSCRIPTION_ID: $(TF_VAR_ARM_SUBSCRIPTION_ID)
        ARM_TENANT_ID: $(TF_VAR_ARM_TENANT_ID)
        TF_VAR_SERVICE_PRINCIPAL_ID: $(TF_VAR_ARM_CLIENT_ID)
        TF_VAR_SERVICE_PRINCIPAL_SECRET: $(TF_VAR_ARM_CLIENT_SECRET)
        resource_group_name: $(storageAccountResourceGroup)
        storage_account_name: $(storageAccount)
        container_name: $(stateBlobContainer)
        key: '$(MODULE)-$(TF_VAR_APPLICATION)-${{ parameters.Environment }}.tfstate'
Below is my Go Terratest code:
package RM_resource_group_Test

import (
    "os"
    "testing"

    "github.com/gruntwork-io/terratest/modules/azure"
    "github.com/gruntwork-io/terratest/modules/terraform"
    "github.com/stretchr/testify/assert"
)

var (
    globalBackendConf = make(map[string]interface{})
    globalEnvVars     = make(map[string]string)
)

func TestTerraform_RM_resource_group(t *testing.T) {
    t.Parallel()

    // terraform directory
    fixtureFolder := "./"

    // input values
    inputStage := "demo_we"
    inputEnvironment := "DEMO"
    inputApplication := "DEMO"

    // expected value
    expectedName := "z-adf-ftnd-shrd-dm-ew1-rgp42"

    // getting env vars from environment variables
    ARM_CLIENT_ID := os.Getenv("ARM_CLIENT_ID")
    ARM_CLIENT_SECRET := os.Getenv("ARM_CLIENT_SECRET")
    ARM_SUBSCRIPTION_ID := os.Getenv("ARM_SUBSCRIPTION_ID")
    ARM_TENANT_ID := os.Getenv("ARM_TENANT_ID")
    if ARM_CLIENT_ID != "" {
        globalEnvVars["ARM_USE_MSI"] = "false"
        globalEnvVars["ARM_CLIENT_ID"] = ARM_CLIENT_ID
        globalEnvVars["ARM_CLIENT_SECRET"] = ARM_CLIENT_SECRET
        globalEnvVars["ARM_SUBSCRIPTION_ID"] = ARM_SUBSCRIPTION_ID
        globalEnvVars["ARM_TENANT_ID"] = ARM_TENANT_ID
    }

    // getting backend vars from environment variables
    resource_group_name := os.Getenv("resource_group_name")
    storage_account_name := os.Getenv("storage_account_name")
    container_name := os.Getenv("container_name")
    key := os.Getenv("key")
    if resource_group_name != "" {
        globalBackendConf["use_msi"] = false
        globalBackendConf["resource_group_name"] = resource_group_name
        globalBackendConf["storage_account_name"] = storage_account_name
        globalBackendConf["container_name"] = container_name
        globalBackendConf["key"] = key
    }

    // Use Terratest to deploy the infrastructure
    terraformOptions := terraform.WithDefaultRetryableErrors(t, &terraform.Options{
        // The path to where our Terraform code is located
        TerraformDir: fixtureFolder,
        // Variables to pass to our Terraform code using -var options
        Vars: map[string]interface{}{
            "STAGE":       inputStage,
            "ENVIRONMENT": inputEnvironment,
            "APPLICATION": inputApplication,
        },
        EnvVars: globalEnvVars,
        // backend values to set when initializing Terraform
        BackendConfig: globalBackendConf,
        // Disable colors in Terraform commands so it's easier to parse stdout/stderr
        NoColor: true,
    })

    // At the end of the test, run `terraform destroy` to clean up any resources that
    // were created. Using "defer" runs the command at the end of the test, whether
    // the test succeeds or fails.
    defer terraform.Destroy(t, terraformOptions)

    // Run `terraform init` and `terraform apply` and fail the test if there are any errors
    terraform.InitAndApply(t, terraformOptions)

    actualName := terraform.Output(t, terraformOptions, "tested_name")
    actualReaderName := terraform.Output(t, terraformOptions, "tested_readerName")
    assert.Equal(t, expectedName, actualName)
    assert.Equal(t, expectedName, actualReaderName)

    subscriptionID := terraform.Output(t, terraformOptions, "current_subscription_id")
    exists := azure.ResourceGroupExists(t, expectedName, subscriptionID)
    assert.True(t, exists, "Resource group does not exist")
}
I'm sure I'm missing something in passing my parameters, as I always get the following error after creating and destroying the resources in Azure:
--- FAIL: TestTerraform_RM_resource_group (90.75s)
    resourcegroup.go:15:
        Error Trace:    resourcegroup.go:15
                        RM_resource_group_test.go:108
        Error:          Received unexpected error:
                        Invoking Azure CLI failed with the following error: ERROR: Please run 'az login' to setup account.
        Test:           TestTerraform_RM_resource_group
Please help.

Thank you for answering.
As I figured out earlier, it was a configuration mistake. After digging deep into the Go Terratest Azure module, I found these lines, which give the whole explanation:
https://github.com/gruntwork-io/terratest/blob/master/modules/azure/authorizer.go#L11
leading to
https://learn.microsoft.com/en-us/azure/developer/go/azure-sdk-authorization#use-environment-based-authentication
So I changed my pipeline to this:
# Go test command
- task: Go@0
  displayName: 'Run Go terratest for resource_Modules'
  inputs:
    command: test
    arguments: '$(TF_LOG) $(pathToTerraformRootModule)\...'
  env:
    ARM_SUBSCRIPTION_ID: $(TF_VAR_ARM_SUBSCRIPTION_ID)
    AZURE_CLIENT_ID: $(TF_VAR_ARM_CLIENT_ID)
    AZURE_TENANT_ID: $(TF_VAR_ARM_TENANT_ID)
    AZURE_CLIENT_SECRET: $(TF_VAR_ARM_CLIENT_SECRET)
    resource_group_name: $(storageAccountResourceGroup)
    storage_account_name: $(storageAccount)
    container_name: $(stateBlobContainer)
    key: '$(MODULE)-$(TF_VAR_APPLICATION)-${{ parameters.Environment }}.tfstate'
And my Go code to this (regarding the environment variables used):
// getting env vars from environment variables
ARM_CLIENT_ID := os.Getenv("AZURE_CLIENT_ID")
ARM_CLIENT_SECRET := os.Getenv("AZURE_CLIENT_SECRET")
ARM_TENANT_ID := os.Getenv("AZURE_TENANT_ID")
ARM_SUBSCRIPTION_ID := os.Getenv("ARM_SUBSCRIPTION_ID")

// creating globalEnvVars for the terraform call through Terratest
if ARM_CLIENT_ID != "" {
    //globalEnvVars["ARM_USE_MSI"] = "true"
    globalEnvVars["ARM_CLIENT_ID"] = ARM_CLIENT_ID
    globalEnvVars["ARM_CLIENT_SECRET"] = ARM_CLIENT_SECRET
    globalEnvVars["ARM_SUBSCRIPTION_ID"] = ARM_SUBSCRIPTION_ID
    globalEnvVars["ARM_TENANT_ID"] = ARM_TENANT_ID
}

// getting backend vars from environment variables
resource_group_name := os.Getenv("resource_group_name")
storage_account_name := os.Getenv("storage_account_name")
container_name := os.Getenv("container_name")
key := os.Getenv("key")

// creating globalBackendConf for the terraform call through Terratest
if resource_group_name != "" {
    //globalBackendConf["use_msi"] = true
    globalBackendConf["resource_group_name"] = resource_group_name
    globalBackendConf["storage_account_name"] = storage_account_name
    globalBackendConf["container_name"] = container_name
    globalBackendConf["key"] = key
}
// Use Terratest to deploy the infrastructure
terraformOptions := terraform.WithDefaultRetryableErrors(t, &terraform.Options{
    // The path to where our Terraform code is located
    TerraformDir: fixtureFolder,
    // Variables to pass to our Terraform code using -var options
    Vars: map[string]interface{}{
        "STAGE":       inputStage,
        "ENVIRONMENT": inputEnvironment,
        "APPLICATION": inputApplication,
        //"configuration": inputConfiguration,
    },
    // global variables for the user account
    EnvVars: globalEnvVars,
    // backend values to set when initializing Terraform
    BackendConfig: globalBackendConf,
    // Disable colors in Terraform commands so it's easier to parse stdout/stderr
    NoColor: true,
})
And everything works!
Hope this helps others.
Thanks again.
[EDIT] To be more explicit:
Go and Terraform use two different methods for Azure authentication.
** Terraform authentication is explained here:
https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/guides/service_principal_client_secret#configuring-the-service-principal-in-terraform
** Go authentication is explained here:
https://learn.microsoft.com/en-us/azure/developer/go/azure-sdk-authorization#use-environment-based-authentication
** Terratest uses both authentication methods, depending on the work it has to do:
Azure existence tests use the Go Azure authentication:
https://github.com/gruntwork-io/terratest/blob/master/modules/azure/authorizer.go#L11
Terraform commands use the Terraform authentication:
https://github.com/gruntwork-io/terratest/blob/0d654bd2ab781a52e495f61230cf892dfba9731b/modules/terraform/cmd.go#L12
https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/guides/service_principal_client_secret#configuring-the-service-principal-in-terraform
So both authentication methods have to be implemented.

Related

Terraform Init throwing error while using with Azure DevOps

I am trying to run Terraform Open Source using Azure DevOps.
I have the state file stored in Azure Blob storage.
Below is my pipeline file:
variables:
- group: infra-variables
trigger:
  branches:
    include:
    - master
  paths:
    include:
    - Terraform-Test
    exclude:
    - README.md
stages:
- stage: Validate
  displayName: Validate
  jobs:
  - job: validate
    pool:
      vmImage: ubuntu-latest
    steps:
    - task: ms-devlabs.custom-terraform-tasks.custom-terraform-installer-task.TerraformInstaller@0
      displayName: Install Terraform
      inputs:
        terraformVersion: 'latest'
    # Init
    - task: TerraformCLI@0
      displayName: Initialize Terraform
      env:
        ARM_SAS_TOKEN: $(ARM_ACCESS_KEY)
      inputs:
        command: 'init'
        workingDirectory: '$(System.DefaultWorkingDirectory)/Terraform-Test'
        commandOptions: '-backend-config=storage_account_name=$(TF_STATE_BLOB_ACCOUNT_NAME) -backend-config=container_name=$(TF_STATE_BLOB_CONTAINER_NAME) -backend-config=key=$(ARM_ACCESS_KEY)'
        backendType: 'selfConfigured'
    # Validate
    - task: TerraformCLI@0
      displayName: Validate Config
      inputs:
        command: 'validate'
        workingDirectory: '$(System.DefaultWorkingDirectory)/Terraform-Test'
- stage: Plan
  displayName: Plan
  jobs:
  - job: plan
    pool:
      vmImage: ubuntu-latest
    steps:
    - task: ms-devlabs.custom-terraform-tasks.custom-terraform-installer-task.TerraformInstaller@0
      displayName: Install Terraform
      inputs:
        terraformVersion: 'latest'
    # Init
    - task: TerraformCLI@0
      displayName: Initialize Terraform
      env:
        ARM_SAS_TOKEN: $(ARM_ACCESS_KEY)
      inputs:
        command: 'init'
        workingDirectory: '$(System.DefaultWorkingDirectory)/Terraform-Test'
        commandOptions: '-backend-config=storage_account_name=$(TF_STATE_BLOB_ACCOUNT_NAME) -backend-config=container_name=$(TF_STATE_BLOB_CONTAINER_NAME) -backend-config=key=$(ARM_ACCESS_KEY)'
        backendType: 'selfConfigured'
    # Plan
    - task: TerraformCLI@0
      displayName: Plan Terraform Deployment
      env:
        ARM_SAS_TOKEN: $(ARM_ACCESS_KEY)
        ARM_CLIENT_ID: $(AZURE_CLIENT_ID)
        ARM_CLIENT_SECRET: $(AZURE_CLIENT_SECRET)
        ARM_SUBSCRIPTION_ID: $(AZURE_SUBSCRIPTION_ID)
        ARM_TENANT_ID: $(AZURE_TENANT_ID)
      inputs:
        command: 'plan'
        workingDirectory: '$(System.DefaultWorkingDirectory)/Terraform-Test'
# Approve
- stage: Approve
  displayName: Approve
  jobs:
  - job: approve
    displayName: Wait for approval
    pool: server
    steps:
    - task: ManualValidation@0
      timeoutInMinutes: 60
      inputs:
        notifyUsers: 'pallabcd@hotmail.com'
        instructions: 'Review the plan in the next hour'
- stage: Apply
  displayName: Apply
  jobs:
  - job: apply
    pool:
      vmImage: ubuntu-latest
    steps:
    - task: ms-devlabs.custom-terraform-tasks.custom-terraform-installer-task.TerraformInstaller@0
      displayName: Install Terraform
      inputs:
        terraformVersion: 'latest'
    # Init
    - task: TerraformCLI@0
      displayName: TF Init
      env:
        ARM_SAS_TOKEN: $(ARM_ACCESS_KEY)
      inputs:
        command: 'init'
        workingDirectory: '$(System.DefaultWorkingDirectory)/Terraform-Test'
        commandOptions: '-backend-config=storage_account_name=$(TF_STATE_BLOB_ACCOUNT_NAME) -backend-config=container_name=$(TF_STATE_BLOB_CONTAINER_NAME) -backend-config=key=$(ARM_ACCESS_KEY)'
        backendType: 'selfConfigured'
    # Apply
    - task: TerraformCLI@0
      displayName: TF Apply
      env:
        ARM_SAS_TOKEN: $(ARM_ACCESS_KEY)
        ARM_CLIENT_ID: $(AZURE_CLIENT_ID)
        ARM_CLIENT_SECRET: $(AZURE_CLIENT_SECRET)
        ARM_SUBSCRIPTION_ID: $(AZURE_SUBSCRIPTION_ID)
        ARM_TENANT_ID: $(AZURE_TENANT_ID)
      inputs:
        command: 'apply'
        workingDirectory: '$(System.DefaultWorkingDirectory)/Terraform-Test'
        commandOptions: '-auto-approve'
My main.tf file is given below:
terraform {
  required_version = "~> 1.0"

  backend "azurerm" {
    storage_account_name = var.storage_account_name
    container_name       = var.container_name
    key                  = "terraform.tfstate"
    access_key           = "#{ARM_ACCESS_KEY}#"
    features {}
  }

  required_providers {
    azuread = "~> 1.0"
    azurerm = "~> 2.0"
  }
}

provider "azurerm" {
  tenant_id       = var.tenant_id
  client_id       = var.client_id
  client_secret   = var.client_secret
  subscription_id = var.subscription_id
  features {}
}

data "azurerm_resource_group" "az-rg-wu" {
  name = "Great-Learning"
}

data "azurerm_client_config" "current" {}
When I put the actual storage access key in main.tf, the init is successful, but if I put the ADO variable in the form "#{ARM_ACCESS_KEY}#", the pipeline fails.
This variable is also in my tfvars file, and the value is set in a variable group in Azure DevOps.
So what am I doing wrong here?
The service connection may cause problems. Terraform requires a proper service connection and an Azure DevOps extension as prerequisites. Ensure the Terraform CLI is installed on the pipeline agent.
Here is sample YAML to connect to Azure DevOps.
Step 1: sample YAML file code as below:
variables:
- name: TerraformBackend.ResourceGroup
  value: rg-realworld-staging-001
- name: TerraformBackend.StorageAccount
  value: strwstagingterraform01
- name: TerraformBackend.ContainerName
  value: staging
- group: 'staging'
steps:
- task: AzureCLI@2
  inputs:
    azureSubscription: '[service connection]'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az group create --location eastus --name $(TerraformBackend.ResourceGroup)
      az storage account create --name $(TerraformBackend.StorageAccount) --resource-group $(TerraformBackend.ResourceGroup) --location eastus --sku Standard_LRS
      az storage container create --name staging --account-name $(TerraformBackend.StorageAccount)
      STORAGE_ACCOUNT_KEY=$(az storage account keys list -g $(TerraformBackend.ResourceGroup) -n $(TerraformBackend.StorageAccount) | jq ".[0].value" -r)
      echo "setting storage account key variable"
      echo "##vso[task.setvariable variable=ARM_ACCESS_KEY;issecret=true]$STORAGE_ACCOUNT_KEY"
- task: TerraformInstaller@0
  inputs:
    terraformVersion: '1.3.6'
- task: TerraformTaskV1@0
  displayName: "Terraform Init"
  inputs:
    provider: 'azurerm'
    command: 'init'
    backendServiceArm: '[service connection]'
    backendAzureRmResourceGroupName: $(TerraformBackend.ResourceGroup)
    backendAzureRmStorageAccountName: $(TerraformBackend.StorageAccount)
    backendAzureRmContainerName: '$(TerraformBackend.ContainerName)'
    backendAzureRmKey: 'infrastructure/terraform.tfstate'
    workingDirectory: '$(System.DefaultWorkingDirectory)/stag/'
- task: TerraformTaskV1@0
  displayName: "Terraform Apply"
  inputs:
    provider: 'azurerm'
    command: 'apply'
    workingDirectory: '$(System.DefaultWorkingDirectory)/stag/'
    environmentServiceNameAzureRM: '[service connection]'
    commandOptions: |
      -var "some_key=$(value)"
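The inline script in the AzureCLI step relies on an Azure DevOps "logging command": writing a specially formatted line to stdout makes the agent register ARM_ACCESS_KEY as a secret pipeline variable for subsequent tasks. A minimal sketch of the mechanism (the key value below is a placeholder, not a real key):

```shell
# Anything after the closing bracket becomes the variable's value;
# "issecret=true" masks the value in the pipeline logs.
STORAGE_ACCOUNT_KEY="placeholder-key"  # placeholder, not a real key
echo "##vso[task.setvariable variable=ARM_ACCESS_KEY;issecret=true]$STORAGE_ACCOUNT_KEY"
```

The variable only becomes visible to tasks that run after the step emitting the logging command, which is why the key extraction happens before the Terraform tasks.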
Upon init execution:
Updated main.tf file as below:
terraform {
  required_version = "~> 1.0"

  backend "azurerm" {
    storage_account_name = "tstate6075"
    container_name       = "tstate"
    key                  = "terraform.tfstate"
    access_key           = "********************************=="
    // features {}
  }

  required_providers {
    azuread = "~> 1.0"
    azurerm = "~> 2.0"
  }
}

provider "azurerm" {
  tenant_id       = "*******************"
  //client_id     = var.client_id
  //client_secret = var.client_secret
  subscription_id = "**********************"
  features {}
}

data "azurerm_resource_group" "az-rg-wu" {
  name = "rg-*****"
}

data "azurerm_client_config" "current" {}

Azure Function App infra redeployment makes existing functions in the app fail because of missing requirements and also deletes previous invocation data

I'm facing quite a big problem. I have a function app that I deploy with Azure Bicep in the following fashion:
param environmentType string
param location string
param storageAccountSku string
param vnetIntegrationSubnetId string
param kvName string

/*
This module contains the IaC for deploying the Premium function app
*/

/// Just a single minimum instance to start with and max scaling of 3 for dev, 5 for prd ///
var minimumElasticSize = 1
var maximumElasticSize = ((environmentType == 'prd') ? 5 : 3)
var name = 'nlp'
var functionAppName = 'function-app-${name}-${environmentType}'

/// Storage account for service ///
resource functionAppStorage 'Microsoft.Storage/storageAccounts@2019-06-01' = {
  name: 'st4functionapp${name}${environmentType}'
  location: location
  kind: 'StorageV2'
  sku: {
    name: storageAccountSku
  }
  properties: {
    allowBlobPublicAccess: false
    accessTier: 'Hot'
    supportsHttpsTrafficOnly: true
    minimumTlsVersion: 'TLS1_2'
  }
}

/// Premium app plan for the service ///
resource servicePlanfunctionApp 'Microsoft.Web/serverfarms@2021-03-01' = {
  name: 'plan-${name}-function-app-${environmentType}'
  location: location
  kind: 'linux'
  sku: {
    name: 'EP1'
    tier: 'ElasticPremium'
    family: 'EP'
  }
  properties: {
    reserved: true
    targetWorkerCount: minimumElasticSize
    maximumElasticWorkerCount: maximumElasticSize
    elasticScaleEnabled: true
    isSpot: false
    zoneRedundant: ((environmentType == 'prd') ? true : false)
  }
}

// Create log analytics workspace
resource logAnalyticsWorkspacefunctionApp 'Microsoft.OperationalInsights/workspaces@2021-06-01' = {
  name: '${name}-functionapp-loganalytics-workspace-${environmentType}'
  location: location
  properties: {
    sku: {
      name: 'PerGB2018' // Standard
    }
  }
}

/// Log analytics workspace insights ///
resource applicationInsightsfunctionApp 'Microsoft.Insights/components@2020-02-02' = {
  name: 'application-insights-${name}-function-${environmentType}'
  location: location
  kind: 'web'
  properties: {
    Application_Type: 'web'
    Flow_Type: 'Bluefield'
    publicNetworkAccessForIngestion: 'Enabled'
    publicNetworkAccessForQuery: 'Enabled'
    Request_Source: 'rest'
    RetentionInDays: 30
    WorkspaceResourceId: logAnalyticsWorkspacefunctionApp.id
  }
}

/// App service containing the workflow runtime ///
resource sitefunctionApp 'Microsoft.Web/sites@2021-03-01' = {
  name: functionAppName
  location: location
  kind: 'functionapp,linux'
  identity: {
    type: 'SystemAssigned'
  }
  properties: {
    clientAffinityEnabled: false
    httpsOnly: true
    serverFarmId: servicePlanfunctionApp.id
    siteConfig: {
      linuxFxVersion: 'python|3.9'
      minTlsVersion: '1.2'
      pythonVersion: '3.9'
      use32BitWorkerProcess: true
      appSettings: [
        {
          name: 'FUNCTIONS_EXTENSION_VERSION'
          value: '~4'
        }
        {
          name: 'FUNCTIONS_WORKER_RUNTIME'
          value: 'python'
        }
        {
          name: 'AzureWebJobsStorage'
          value: 'DefaultEndpointsProtocol=https;AccountName=${functionAppStorage.name};AccountKey=${listKeys(functionAppStorage.id, '2019-06-01').keys[0].value};EndpointSuffix=core.windows.net'
        }
        {
          name: 'WEBSITE_CONTENTAZUREFILECONNECTIONSTRING'
          value: 'DefaultEndpointsProtocol=https;AccountName=${functionAppStorage.name};AccountKey=${listKeys(functionAppStorage.id, '2019-06-01').keys[0].value};EndpointSuffix=core.windows.net'
        }
        {
          name: 'WEBSITE_CONTENTSHARE'
          value: 'app-${toLower(name)}-functionservice-${toLower(environmentType)}a6e9'
        }
        {
          name: 'APPINSIGHTS_INSTRUMENTATIONKEY'
          value: applicationInsightsfunctionApp.properties.InstrumentationKey
        }
        {
          name: 'ApplicationInsightsAgent_EXTENSION_VERSION'
          value: '~2'
        }
        {
          name: 'APPLICATIONINSIGHTS_CONNECTION_STRING'
          value: applicationInsightsfunctionApp.properties.ConnectionString
        }
        {
          name: 'ENV'
          value: toUpper(environmentType)
        }
      ]
    }
  }

  /// VNET integration so flows can access storage and queue accounts ///
  resource vnetIntegration 'networkConfig@2022-03-01' = {
    name: 'virtualNetwork'
    properties: {
      subnetResourceId: vnetIntegrationSubnetId
      swiftSupported: true
    }
  }
}

/// Outputs for creating access policies ///
output functionAppName string = sitefunctionApp.name
output functionAppManagedIdentityId string = sitefunctionApp.identity.principalId
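For illustration, the managed-identity output could be consumed in main.bicep roughly like this (a sketch only: the module alias `functionApp`, the API versions, and the use of the `kvName` parameter for an existing Key Vault are my assumptions, not code from the original):

```bicep
// Hypothetical main.bicep fragment: grant the function app's system-assigned
// identity read access to an existing Key Vault, using the module outputs.
resource kv 'Microsoft.KeyVault/vaults@2022-07-01' existing = {
  name: kvName
}

resource kvAccessPolicy 'Microsoft.KeyVault/vaults/accessPolicies@2022-07-01' = {
  parent: kv
  name: 'add'
  properties: {
    accessPolicies: [
      {
        tenantId: subscription().tenantId
        objectId: functionApp.outputs.functionAppManagedIdentityId
        permissions: {
          secrets: [
            'get'
            'list'
          ]
        }
      }
    ]
  }
}
```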
The output is used for giving permissions to blob/queue and some Key Vault stuff. This code is a single module called from a main.bicep module and deployed via an Azure DevOps pipeline.
I have a second repository in which I have some functions, which I also deploy via Azure Pipelines. This one contains three .yaml files for deploying: two templates (CI and CD) and one main pipeline called azure-pipelines.yml pulling it all together:
functions-ci.yml:
parameters:
- name: environment
  type: string

jobs:
- job:
  displayName: 'Publish the function as .zip'
  steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '$(pythonVersion)'
    displayName: 'Use Python $(pythonVersion)'
  - task: CopyFiles@2
    displayName: 'Create project folder'
    inputs:
      SourceFolder: '$(System.DefaultWorkingDirectory)'
      Contents: |
        **
      TargetFolder: '$(Build.ArtifactStagingDirectory)'
  - task: Bash@3
    displayName: 'Install requirements for running function'
    inputs:
      targetType: 'inline'
      script: |
        python3 -m pip install --upgrade pip
        pip install setup
        pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt
      workingDirectory: '$(Build.ArtifactStagingDirectory)'
  - task: ArchiveFiles@2
    displayName: 'Create project zip'
    inputs:
      rootFolderOrFile: '$(Build.ArtifactStagingDirectory)'
      includeRootFolder: false
      archiveType: 'zip'
      archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
      replaceExistingArchive: true
  - task: PublishPipelineArtifact@1
    displayName: 'Publish project zip artifact'
    inputs:
      targetPath: '$(Build.ArtifactStagingDirectory)'
      artifactName: 'functions$(environment)'
      publishLocation: 'pipeline'
functions-cd.yml:
parameters:
- name: environment
  type: string
- name: azureServiceConnection
  type: string

jobs:
- job: worfklowsDeploy
  displayName: 'Deploy the functions'
  steps:
  # Download created artifacts, containing the zipped function codes
  - task: DownloadPipelineArtifact@2
    inputs:
      buildType: 'current'
      artifactName: 'functions$(environment)'
      targetPath: '$(Build.ArtifactStagingDirectory)'
  # Zip deploy the functions code
  - task: AzureFunctionApp@1
    inputs:
      azureSubscription: $(azureServiceConnection)
      appType: functionAppLinux
      appName: function-app-nlp-$(environment)
      package: $(Build.ArtifactStagingDirectory)/**/*.zip
      deploymentMethod: 'zipDeploy'
They are pulled together in azure-pipelines.yml:
trigger:
  branches:
    include:
    - develop
    - main

pool:
  name: "Hosted Ubuntu 1804"

variables:
  ${{ if notIn(variables['Build.SourceBranchName'], 'main') }}:
    environment: dev
    azureServiceConnection: SC-NLPDT
  ${{ if eq(variables['Build.SourceBranchName'], 'main') }}:
    environment: prd
    azureServiceConnection: SC-NLPPRD
  pythonVersion: '3.9'

stages:
# Builds the functions as .zip
- stage: functions_ci
  displayName: 'Functions CI'
  jobs:
  - template: ./templates/functions-ci.yml
    parameters:
      environment: $(environment)
# Deploys .zip workflows
- stage: functions_cd
  displayName: 'Functions CD'
  jobs:
  - template: ./templates/functions-cd.yml
    parameters:
      environment: $(environment)
      azureServiceConnection: $(azureServiceConnection)
So this successfully deploys my function app the first time around, when I have also deployed the infra code. The imports are done well, the right function app is deployed, and the code runs when I trigger it.
But when I redeploy the infra (Bicep) code, all of a sudden the newest version of the functions is gone and is replaced by a previous version.
Also, running this previous version doesn't work anymore, since all the requirements that were installed in the pipeline (CI part) via pip install --target="./.python_packages/lib/site-packages" -r ./requirements.txt suddenly cannot be found anymore, giving import errors (i.e. Result: Failure Exception: ModuleNotFoundError: No module named 'azure.identity'). Mind you, this version did work previously just fine.
This is a big problem for me, since I need to be able to update some infra stuff (like adding an APP_SETTING) without breaking the current deployment of the functions.
I had thought about just redeploying the functions automatically after an infra update, but then I would still lose the previous invocations, which I need to be able to see.
Am I missing something in the above code? I cannot figure out what is going wrong here that causes my functions to change on infra deployment.
Looking at the documentation:
To enable your function app to run from a package, add a WEBSITE_RUN_FROM_PACKAGE setting to your function app settings.
1 indicates that the function app runs from a local package file deployed in the d:\home\data\SitePackages (Windows) or /home/data/SitePackages (Linux) folder of your function app.
In your case, when you deploy your function app code using AzureFunctionApp@1 and zipDeploy, this automatically adds this app setting to your function app. When you redeploy your infrastructure, this setting is removed and the function app host no longer knows where to find the code.
If you add this app setting in your Bicep file, it should work:
{
  name: 'WEBSITE_RUN_FROM_PACKAGE'
  value: '1'
}

Issue installing Terratest using the Go task in a YAML Azure pipeline - issue triggering Terratest tests in a sub-folder

I'm facing this issue while installing Terratest from an Azure YAML pipeline:
C:\hostedtoolcache\windows\go\1.17.1\x64\bin\go.exe install -v github.com/gruntwork-io/terratest@v0.40.6
go: downloading github.com/gruntwork-io/terratest v0.40.6
go install: github.com/gruntwork-io/terratest@v0.40.6: module github.com/gruntwork-io/terratest@v0.40.6 found, but does not contain package github.com/gruntwork-io/terratest
##[error]The Go task failed with an error: Error: The process 'C:\hostedtoolcache\windows\go\1.17.1\x64\bin\go.exe' failed with exit code 1
Finishing: Install Go Terratest module - v0.40.6
My code for the installation is below:
- task: Go@0
  displayName: Install Go Terratest module - v$(TERRATEST_VERSION)
  inputs:
    command: custom
    customCommand: install
    arguments: $(TF_LOG) github.com/gruntwork-io/terratest@v$(TERRATEST_VERSION)
    workingDirectory: $(pipeline_artefact_folder_extract)/$(pathToTerraformRootModule)
But perhaps I made mistakes in my use of Terratest.
Below is a screenshot of my code tree:
I have Terraform code in (for example) the Terraform\azure_v2_X\ResourceModules sub-directory, and Terratest tests in the Terraform\azure_v2_X\Tests_Unit_ResourceModules subdirectories (in the screenshot, app_configuration tests for the app_configuration resource module).
In my Terratest module, I call my resource module as in the following code:
###### test in an isolated Resource Group defined in locals
module "app_configuration_tobetested" {
  source              = "../../ResourceModules/app_configuration"
  resource_group_name = local.rg_name
  location            = local.location
  environment         = var.ENVIRONMENT
  sku                 = "standard"
  // rem: here the app_service_shared prefix and the app_config_shared prefix are the same!
  app_service_prefix = module.app_configuration_list_fortests.settings.frontEnd_prefix
  # stage = var.STAGE
  app_config_list = module.app_configuration_list_fortests.settings.list_app_config
}
And in my Go file, I test my module's result against the result I expect:
package RM_app_configuration_Test

import (
    "os"
    "testing"

    // "github.com/gruntwork-io/terratest/modules/azure"
    "github.com/gruntwork-io/terratest/modules/terraform"
    "github.com/stretchr/testify/assert"
)

var (
    globalBackendConf = make(map[string]interface{})
    globalEnvVars     = make(map[string]string)
)

func TestTerraform_RM_app_configuration(t *testing.T) {
    t.Parallel()

    // terraform directory
    fixtureFolder := "./"

    // backend specification
    strlocal := "RMapCfg_"

    // input values
    inputStage := "sbx_we"
    inputEnvironment := "SBX"
    inputApplication := "DEMO"

    // expected values
    expectedRsgName := "z-adf-ftnd-shrd-sbx-ew1-rgp01"
    // expectedAppCfgPrefix := "z-adf-ftnd-shrd"
    expectedAppConfigReader_ID := "[/subscriptions/f04c8fd5-d013-41c3-9102-43b25880d2e2/resourceGroups/z-adf-ftnd-shrd-sbx-ew1-rgp01/providers/Microsoft.AppConfiguration/configurationStores/z-adf-ftnd-shrd-sbx-ew1-blue-sbx-cfg01 /subscriptions/f04c8fd5-d013-41c3-9102-43b25880d2e2/resourceGroups/z-adf-ftnd-shrd-sbx-ew1-rgp01/providers/Microsoft.AppConfiguration/configurationStores/z-adf-ftnd-shrd-sbx-ew1-green-sbx-cfg01]"

    /*
        Go and Terraform use two different methods for Azure authentication.
        ** Terraform authentication is explained here:
        - https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/guides/service_principal_client_secret#configuring-the-service-principal-in-terraform
        ** Go authentication is explained here:
        - https://learn.microsoft.com/en-us/azure/developer/go/azure-sdk-authorization#use-environment-based-authentication
        ** Terratest uses both authentication methods, depending on the work it has to do:
        - Azure existence tests use the Go Azure authentication:
          - https://github.com/gruntwork-io/terratest/blob/master/modules/azure/authorizer.go#L11
        - Terraform commands use the Terraform authentication:
          - https://github.com/gruntwork-io/terratest/blob/0d654bd2ab781a52e495f61230cf892dfba9731b/modules/terraform/cmd.go#L12
          - https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/guides/service_principal_client_secret#configuring-the-service-principal-in-terraform
        so both authentication methods have to be implemented
    */

    // getting terraform EnvVars from Azure Go environment variables
    ARM_CLIENT_ID := os.Getenv("AZURE_CLIENT_ID")
    ARM_CLIENT_SECRET := os.Getenv("AZURE_CLIENT_SECRET")
    ARM_TENANT_ID := os.Getenv("AZURE_TENANT_ID")
    ARM_SUBSCRIPTION_ID := os.Getenv("ARM_SUBSCRIPTION_ID")
    if ARM_CLIENT_ID != "" {
        globalEnvVars["ARM_CLIENT_ID"] = ARM_CLIENT_ID
        globalEnvVars["ARM_CLIENT_SECRET"] = ARM_CLIENT_SECRET
        globalEnvVars["ARM_SUBSCRIPTION_ID"] = ARM_SUBSCRIPTION_ID
        globalEnvVars["ARM_TENANT_ID"] = ARM_TENANT_ID
    }

    // getting terraform backend from environment variables
    resource_group_name := os.Getenv("resource_group_name")
    storage_account_name := os.Getenv("storage_account_name")
    container_name := os.Getenv("container_name")
    key := strlocal + os.Getenv("key")
    if resource_group_name != "" {
        globalBackendConf["resource_group_name"] = resource_group_name
        globalBackendConf["storage_account_name"] = storage_account_name
        globalBackendConf["container_name"] = container_name
        globalBackendConf["key"] = key
    }

    // Use Terratest to deploy the infrastructure
    terraformOptions := terraform.WithDefaultRetryableErrors(t, &terraform.Options{
        // The path to where our Terraform code is located
        TerraformDir: fixtureFolder,
        // Variables to pass to our Terraform code using -var options
        Vars: map[string]interface{}{
            "STAGE":       inputStage,
            "ENVIRONMENT": inputEnvironment,
            "APPLICATION": inputApplication,
        },
        EnvVars: globalEnvVars,
        // backend values to set when initializing Terraform
        BackendConfig: globalBackendConf,
        // Disable colors in Terraform commands so it's easier to parse stdout/stderr
        NoColor: true,
    })

    // At the end of the test, run `terraform destroy` to clean up any resources that were created
    defer terraform.Destroy(t, terraformOptions)

    // Run "terraform init" and "terraform apply".
// This will run `terraform init` and `terraform apply` and fail the test if there are any errors
terraform.InitAndApply(t, terraformOptions)
// tests the resource_group for the app_configuration
/*
actualAppConfigReaderPrefix := terraform.Output(t, terraformOptions, "app_configuration_tested_prefix")
assert.Equal(t, expectedAppCfgprefix, actualAppConfigReaderPrefix)
*/
actualRSGReaderName := terraform.Output(t, terraformOptions, "app_configuration_tested_RG_name")
assert.Equal(t, expectedRsgName, actualRSGReaderName)
actualAppConfigReader_ID := terraform.Output(t, terraformOptions, "app_configuration_tobetested_id")
assert.Equal(t, expectedAppConfigReader_ID, actualAppConfigReader_ID)
}
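The AZURE_* to ARM_* mapping done above can be isolated and unit-tested on its own. Below is a minimal, hypothetical sketch (the helper buildTerraformEnvVars is not a terratest API; it just mirrors the mapping in the test, taking a getenv function so it can be exercised without touching the real environment):

```go
package main

import (
	"fmt"
	"os"
)

// buildTerraformEnvVars bridges the AZURE_* variables read by the Go SDK
// side of terratest into the ARM_* names the azurerm Terraform provider
// expects. It returns an empty map when AZURE_CLIENT_ID is unset, matching
// the guard in the test above.
func buildTerraformEnvVars(getenv func(string) string) map[string]string {
	envVars := make(map[string]string)
	if clientID := getenv("AZURE_CLIENT_ID"); clientID != "" {
		envVars["ARM_CLIENT_ID"] = clientID
		envVars["ARM_CLIENT_SECRET"] = getenv("AZURE_CLIENT_SECRET")
		envVars["ARM_TENANT_ID"] = getenv("AZURE_TENANT_ID")
		// the subscription id is already provided under its ARM_ name
		envVars["ARM_SUBSCRIPTION_ID"] = getenv("ARM_SUBSCRIPTION_ID")
	}
	return envVars
}

func main() {
	// dummy values for demonstration only
	os.Setenv("AZURE_CLIENT_ID", "client-id")
	os.Setenv("AZURE_CLIENT_SECRET", "client-secret")
	os.Setenv("AZURE_TENANT_ID", "tenant-id")
	os.Setenv("ARM_SUBSCRIPTION_ID", "subscription-id")

	vars := buildTerraformEnvVars(os.Getenv)
	fmt.Println(vars["ARM_CLIENT_ID"], vars["ARM_SUBSCRIPTION_ID"])
}
```

Passing the resulting map as the EnvVars field of terraform.Options, as the test does, keeps both authentication paths fed from one place.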
The fact is that locally, from my main folder Terraform\Azure_v2_X\Tests_Unit_ResourceModules, I can run the following command to trigger all my tests in a row:
(from Go v1.11)
go test ./...
With Go version 1.12, I could set GO111MODULE=auto to get the same results.
But with Go 1.17, I now have to set GO111MODULE=off to trigger my tests.
For now, I have 2 main questions nagging me:
How can I import Terratest (and other) Go modules from an Azure Pipeline?
What do I have to do to correctly use Go modules with terratest?
I have no Go code in my main folder Terraform\Azure_v2_X\Tests_Unit_ResourceModules and would like to trigger all the sub-folder Go tests with a single command line in my Azure Pipeline.
Thank you for any help you could give.
Best regards,
I will once again answer my own question. :D
so, for now, using the following versions:
-- GOVERSION: 1.17.1
-- TERRAFORM_VERSION: 1.1.7
-- TERRATEST_VERSION: 0.40.6
Regarding the terratest tests, the folder hierarchy has changed as follows:
I no longer try to import my terratest modules with a Go import (so point 1 above is answered, obviously).
I now just have to:
go mod each of my terratest modules
trigger each of them individually, one by one, using a script
So my pipeline just became the following:
- task: ms-devlabs.custom-terraform-tasks.custom-terraform-installer-task.TerraformInstaller@0
  displayName: 'Install Terraform $(TERRAFORM_VERSION)'
  inputs:
    terraformVersion: $(TERRAFORM_VERSION)
- task: GoTool@0
  displayName: 'Use Go $(GOVERSION)'
  inputs:
    version: $(GOVERSION)
    goPath: $(GOPATH)
    goBin: $(GOBIN)
- task: PowerShell@2
  displayName: 'run Terratest for $(pathToTerraformRootModule)'
  inputs:
    targetType: 'filePath'
    filePath: $(pipeline_artefact_folder_extract)/$(pathToTerraformRootModule)/$(Run_Terratest_script)
    workingDirectory: $(pipeline_artefact_folder_extract)/$(pathToTerraformRootModule)
  env:
    # see https://learn.microsoft.com/en-us/azure/developer/go/azure-sdk-authorization#use-environment-based-authentication
    # for the Azure authentication with Go
    ARM_SUBSCRIPTION_ID: $(TF_VAR_ARM_SUBSCRIPTION_ID)
    AZURE_CLIENT_ID: $(TF_VAR_ARM_CLIENT_ID)
    AZURE_TENANT_ID: $(TF_VAR_ARM_TENANT_ID)
    AZURE_CLIENT_SECRET: $(TF_VAR_ARM_CLIENT_SECRET) # set as a pipeline secret
    resource_group_name: $(storageAccountResourceGroup)
    storage_account_name: $(storageAccount)
    container_name: $(stateBlobContainer)
    key: '$(MODULE)-$(TF_VAR_APPLICATION)-$(TF_VAR_ENVIRONMENT).tfstate'
    GO111MODULE: 'auto'
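If any variable in this env block is missing, the failure only surfaces later as the confusing "Please run 'az login'" error. A small guard at the start of a test can fail fast with a clearer message instead. A minimal sketch, assuming the variable names from the pipeline above (the missingVars helper is hypothetical, not a terratest API):

```go
package main

import (
	"fmt"
	"os"
)

// requiredVars lists the variables the pipeline step above must provide:
// AZURE_* for the Go SDK side of terratest, ARM_SUBSCRIPTION_ID for both sides.
var requiredVars = []string{
	"ARM_SUBSCRIPTION_ID",
	"AZURE_CLIENT_ID",
	"AZURE_TENANT_ID",
	"AZURE_CLIENT_SECRET",
}

// missingVars returns the names that are unset or empty, so a test can
// report exactly which variables are missing before any Azure call is made.
func missingVars(getenv func(string) string, names []string) []string {
	var missing []string
	for _, n := range names {
		if getenv(n) == "" {
			missing = append(missing, n)
		}
	}
	return missing
}

func main() {
	if m := missingVars(os.Getenv, requiredVars); len(m) > 0 {
		fmt.Println("missing authentication variables:", m)
	} else {
		fmt.Println("all authentication variables set")
	}
}
```

In a real test, the check would sit at the top of the Test function and call t.Fatalf with the missing names.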
And in my main folder for my terratest sub-folders, I have the run_terratests.ps1 script and the Terratests list file as below:
run_terratests.ps1
# this file is based on https://github.com/google/go-cloud/blob/master/internal/testing/runchecks.sh
#
# This script runs all the go terratest suites,
# compatibility checks, consistency checks, Wire, etc.
$moduleListFile = "./Terratests"
# regex filter: keep only the lines that do not begin with '#'
$regexFilter = "^[^#]"
# read the module list file
[object] $arrayFromFile = Get-Content -Path $moduleListFile | Where-Object { $_ -match $regexFilter } | ConvertFrom-String -PropertyNames folder, totest
$result = 0 # no error by default
# remember the current folder
$main_path = Get-Location | Select-Object -ExpandProperty "Path"
# walk the array and run the modules flagged for testing
foreach ($line in $arrayFromFile) {
    # Write-Host $line
    if ($line.totest -eq "yes") {
        $path = $line.folder
        Set-Location $main_path\$path
        $myPath = Get-Location
        # Write-Host $myPath
        # trigger terratest for this module
        go test ./...
    }
    if ($false -eq $?) {
        $result = 1
    }
}
# back to school :D
Set-Location $main_path
if ($result -eq 1) {
    Write-Error "A test exit code indicates a test failure."
    Write-Host "##vso[task.logissue type=error]A test exit code indicates a test failure."
    exit(1)
}
the code
    if ($false -eq $?) {
        $result = 1
    }
is useful to make the pipeline fail on a test error without skipping the remaining tests.
Terratests
# this file lists all the modules to be tested in the "Tests_Unit_ConfigHelpers" repository.
# it is used by the "run_terratests.ps1" powershell script to trigger terratest for each test.
#
# Any line that doesn't begin with a '#' character and isn't empty is treated
# as a path relative to the top of the repository that has a module in it.
# The 'tobetested' field specifies whether this module has to be tested.
#
# this file is based on https://github.com/google/go-cloud/blob/master/allmodules
# module-directory tobetested
azure_constants yes
configure_app_srv_etc yes
configure_frontdoor_etc yes
configure_hostnames yes
constants yes
FrontEnd_AppService_slots/_main yes
FrontEnd_AppService_slots/settings yes
merge_maps_of_strings yes
name yes
name_template yes
network/hostname_generator yes
network/hostnames_generator yes
replace_2vars_into_string_etc yes
replace_var_into_string_etc yes
sorting_map_with_an_other_map yes
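The same list format (comment lines starting with '#', then a folder and a yes/no flag per line) can also be parsed in Go, should the runner ever need to move away from PowerShell. A minimal sketch with a hypothetical parseModuleList helper; only the file format itself comes from the answer above:

```go
package main

import (
	"bufio"
	"fmt"
	"strings"
)

// moduleEntry mirrors one non-comment line of the Terratests file:
// a folder path and a yes/no flag.
type moduleEntry struct {
	Folder string
	ToTest bool
}

// parseModuleList keeps only lines that are non-empty and do not start
// with '#', then splits each into folder and flag, just like the
// PowerShell regex filter and ConvertFrom-String above.
func parseModuleList(contents string) []moduleEntry {
	var entries []moduleEntry
	scanner := bufio.NewScanner(strings.NewReader(contents))
	for scanner.Scan() {
		line := strings.TrimSpace(scanner.Text())
		if line == "" || strings.HasPrefix(line, "#") {
			continue
		}
		fields := strings.Fields(line)
		if len(fields) < 2 {
			continue
		}
		entries = append(entries, moduleEntry{
			Folder: fields[0],
			ToTest: fields[1] == "yes",
		})
	}
	return entries
}

func main() {
	sample := "# module-directory tobetested\nazure_constants yes\nname no\n"
	for _, e := range parseModuleList(sample) {
		fmt.Println(e.Folder, e.ToTest)
	}
}
```

A Go runner would then change into each entry's Folder and invoke go test ./... there, exactly as the script does.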
And the change in each terratest folder is that I now add the go.mod and go.sum files:
$ go mod init mytest
go: creating new go.mod: module mytest
go: to add module requirements and sums:
go mod tidy
and
$ go mod tidy
# links each of the Go modules needed by your terratest module
So, with that, the go test ./... command from the PowerShell script will download the needed Go modules and run the tests for that particular folder.
Thanks for reading, and vote if you think this can help :)

Azure devops pipeline access Json file inputs and perform for(each) loop

I am using a Linux agent, and I need to iterate over the JSON input objects for each project.
I have the JSON file below, and it may have more than 200 charts. For each chart I need to perform the build, lint, and template steps and push to the repository. I could do this using shell/Python, but I thought I would use Azure Pipelines YAML code.
{
  "helm": {
    "charts": [
      {
        "project1": {
          "secretName": "mysecret1",
          "setValues": ["a", "b", "c"]
        }
      },
      {
        "project2": {
          "secretName": "mysecret2",
          "setValues": ["x", "y", "z"]
        }
      }
    ]
  }
}
azure-pipelines.yaml:
trigger:
- '*'

variables:
  buildConfiguration: 'Release'
  releaseBranchName: 'dev'

stages:
- stage: 'Build'
  pool:
    name: 'linux'
  displayName: 'Build helm Projects'
  jobs:
  - job: 'buildProjects'
    displayName: 'Building all the helm projects'
    steps:
    - task: HelmInstaller@0
      displayName: install helm
      inputs:
        helmVersion: 'latest'
        installKubectl: false
    - script: chmod -R 755 $(Build.SourcesDirectory)/
      displayName: 'Set Directory permissions'
    - task: PythonScript@0
      inputs:
        scriptSource: inline
        script: |
          import argparse, json, sys
          parser = argparse.ArgumentParser()
          parser.add_argument("--filePath", help="Provide the json file path")
          args = parser.parse_args()
          with open(args.filePath, 'r') as f:
              data = json.load(f)
          data = json.dumps(data)
          print('##vso[task.setvariable variable=helmConfigs;]%s' % (data))
        arguments: --filePath $(Build.SourcesDirectory)/build/helm/helmConfig.json
        failOnStderr: true
      displayName: setting up helm configs
    - template: helmBuild.yml
      parameters:
        helmCharts: $(HELMCONFIGS)
The JSON input is saved to the HELMCONFIGS variable in Azure Pipelines. As per the Azure documentation, it is a string type, and we cannot convert it to any other type like an array.
helmBuild.yml file:
parameters:
- name: helmCharts
  type: object
  default: {}

steps:
- ${{ each helm in parameters.helmCharts }}:
  - ${{ each charts in helm }}:
    - ${{ each chart in charts }}:
      - task: AzureKeyVault@1
        inputs:
          azureSubscription: 'A-B-C'
          KeyVaultName: chart.secretName
          SecretsFilter: '*'
          RunAsPreJob: true
I am not able to access chart.secretName. How do I access the secretName input?

Referencing SQL Username and Password from KeyVault in YAML

Below I have an Azure CI pipeline written in YAML. I have already created a Key Vault with 2 secrets (Username and Password) and their respective values (admin and password). Now, I have been trying to refer to the secrets in variables in the task: SqlAzureDacpacDeployment@1, but it doesn't work.
If I put $Username and $Password in SqlUsername and SqlPassword, I'd get this error Cannot validate argument on parameter 'Username'.
If I put '$(Username)'and '$(Password)' in SqlUsername and SqlPassword, I'd get this error Login failed for user '***'.
What should I put there or how do I refer them properly? Thanks!
trigger:
- master

pool:
  vmImage: 'windows-latest'

variables:
  $Username: ""
  $Password: ""

steps:
- task: AzureKeyVault@1
  displayName: 'Get credentials from Key-Vault'
  inputs:
    azureSubscription: 'Test-SC'
    KeyVaultName: 'Test-KV'
    SecretsFilter: '*'
    RunAsPreJob: false
- task: SqlAzureDacpacDeployment@1
  displayName: 'Reg Database DDL Script'
  inputs:
    SqlUsername: $Username
    SqlPassword: $Password
    enabled: true
Found the answer: set RunAsPreJob: true, and use SqlUsername: '$(Username)' and SqlPassword: '$(Password)'. Make sure you have the access policies added, and that the name of the KV and the azureSubscription are both correct.

Resources