Automated Deployment of ADF Pipelines using Azure DevOps CI/CD Pipelines

I have automated the Azure ADF pipeline deployment process using Azure DevOps CI/CD pipelines with the help of https://learn.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment, i.e. deploying pipelines from the DEV environment ADF to the PROD environment ADF. I am using the ARM templates of the ADF to deploy pipelines from one environment to another, so I have a separate ARM_Parameter.json corresponding to each environment (Dev/Prod).
The problem is that each ADF pipeline may have a few base parameters that are not parameterized and hence are not available in the parameter.json. Can you help me replace the Dev values with the PROD values in the Base Parameter section of each ADF pipeline in an automated way during this CI/CD deployment process?

I see two options:
If it's only for this RUN_ENVIRONMENT parameter, you could change the parameter to a variable and use the system variable @pipeline().DataFactory to determine which environment you're running in (for example, an expression like @if(equals(pipeline().DataFactory, 'my-prod-adf'), 'PROD', 'DEV'), where 'my-prod-adf' stands in for your production factory name).
Otherwise, you can configure the Data Factory to generate ARM parameters for your pipeline parameter default values, but you'll have to create a custom arm-template-parameters-definition.json file. Check the documentation here

You could use a custom parameter definition with the ARM template.
The custom parameter entry for pipelines could look like this:
"Microsoft.DataFactory/factories/pipelines": {
"properties": {
"parameters": {
"RUN_ENVIRONMENT": "=:-:string"
}
}
},

Replace the Dev Values with the PROD Values in Base Parameter section
Based on your screenshot, RUN_ENVIRONMENT is a pipeline parameter, which means that when it is converted to an ARM template, its format looks like this:
"resources": [
{
....
....
"properties": {
"parameters": {
"RUN_ENVIRONMENT": {
"type": "string",
"defaultValue": "pro"
}
},...
},...
}
]
It cannot be replaced by using Override template parameters in the ARM deploy task, because that will prompt: The template parameters 'environment' in the parameters file are not valid; they are not present in the original template and can therefore not be provided at deployment time.
To work around this error, just install one extension and add the Replace Tokens task to the pipeline before the ARM deploy task. This task replaces the tokenized values in the file content at runtime:
For how to apply this task in your pipeline, you can refer to my answer1 and answer2; a YAML sketch is also shown below.
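A minimal YAML sketch of that ordering, assuming the qetza Replace Tokens extension is installed, the defaultValue in the exported ARM template has been tokenized as #{RUN_ENVIRONMENT}#, and a pipeline variable named RUN_ENVIRONMENT holds the per-environment value (file paths and the service connection name are placeholders):

steps:
  # Replace #{...}# tokens with the values of matching pipeline variables before deploying
  - task: replacetokens@3
    inputs:
      rootDirectory: '$(Pipeline.Workspace)/adf-artifacts'   # placeholder artifact path
      targetFiles: 'ARMTemplateForFactory.json'
      tokenPrefix: '#{'
      tokenSuffix: '}#'

  # Deploy the ADF ARM template after the tokens have been replaced
  - task: AzureResourceManagerTemplateDeployment@3
    inputs:
      deploymentScope: 'Resource Group'
      azureResourceManagerConnection: 'my-service-connection'   # placeholder
      subscriptionId: '$(subscriptionId)'
      resourceGroupName: 'rg-adf-prod'                          # placeholder
      location: 'West Europe'
      csmFile: '$(Pipeline.Workspace)/adf-artifacts/ARMTemplateForFactory.json'
      csmParametersFile: '$(Pipeline.Workspace)/adf-artifacts/ARMTemplateParametersForFactory.json'
      deploymentMode: 'Incremental'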

There is another approach: publish ADF from the master (collaboration) branch.
You can define (replace) the value of every single node (property) in a JSON file (ADF object).
It will resolve your problem, as you can provide a separate CSV config file per environment (stage).
Example of CSV config file (config-stage-UAT.csv):
type,name,path,value
pipeline,PL_CopyMovies,activities[0].outputs[0].parameters.BlobContainer,UAT
Then just run this cmdlet in PowerShell:
Publish-AdfV2FromJson -RootFolder "$RootFolder" -ResourceGroupName "$ResourceGroupName" -DataFactoryName "$DataFactoryName" -Location "$Location" -Stage "stage-UAT"
Check this out:
azure.datafactory.tools (PowerShell module)
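A hypothetical Azure Pipelines step wrapping that cmdlet could look like the following; the service connection and the RootFolder/ResourceGroupName/DataFactoryName/Location variables are placeholders, and Publish-AdfV2FromJson comes from the azure.datafactory.tools module:

steps:
  - task: AzurePowerShell@5
    displayName: Publish ADF from JSON (azure.datafactory.tools)
    inputs:
      azureSubscription: 'my-service-connection'   # placeholder
      azurePowerShellVersion: 'LatestVersion'
      ScriptType: 'InlineScript'
      Inline: |
        # Install the module on the agent, then publish the factory for the chosen stage
        Install-Module -Name azure.datafactory.tools -Force -Scope CurrentUser
        Publish-AdfV2FromJson -RootFolder "$(RootFolder)" `
          -ResourceGroupName "$(ResourceGroupName)" `
          -DataFactoryName "$(DataFactoryName)" `
          -Location "$(Location)" `
          -Stage "stage-UAT"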

Related

Azure devops Pipeline: List of Azure Region Locations as Parameter

I need to create Azure resources. All regions/locations (eastus, westus, etc.) should be displayed as a parameter in the pipeline so the user can select any one location for creating the Azure resource using an Azure DevOps pipeline. Any suggestions, please?
Are you using YAML to define your pipeline? If so, this is possible using runtime parameters. You essentially define a list of values that can be selected when the pipeline is run, and if the user doesn't pick one, a default value is chosen.
Runtime Parameters
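A minimal sketch of such a runtime parameter; the region list here is illustrative, not exhaustive:

parameters:
  - name: location
    displayName: Azure region
    type: string
    default: eastus
    values:
      - eastus
      - eastus2
      - westus
      - westeurope
      - southeastasia

steps:
  # The selected region is available wherever the resource gets created
  - script: echo "Deploying to ${{ parameters.location }}"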

Azure Data Factory V2 Copy activity Mapping deployment issue

Consider the following test mapping for a Data Factory Copy activity:
"translator": {
"columnMappings": "#json('{\"from\":\"to\"}')",
"type": "TabularTranslator"
}
After deploying the pipeline with the help of the Set-AzureRmDataFactoryV2Pipeline PowerShell cmdlet, we get a normally deployed pipeline with the exact columnMappings value specified in the source code. But if you try to be more dynamic:
"translator": {
"columnMappings": "#json(pipeline().parameters.Mapping)",
"type": "TabularTranslator"
}
then after deployment you'll find that the translator element is completely missing from the pipeline. A workaround is to set the translator in the Azure Portal Data Factory pipeline editing UI (either in Designer or JSON mode; both options work). But if, after these manipulations, you save the pipeline JSON to a file and attempt to deploy it via the Set-AzureRmDataFactoryV2Pipeline PowerShell cmdlet, the translator goes missing again. The expected result is that deployment preserves the translator element, because the Portal JSON editor preserves it.
We are doing automated deployment of pipelines (as you already figured out, with the help of Set-AzureRmDataFactoryV2Pipeline), and this bug breaks our automated deployment because it requires manual post-deployment pipeline editing in the Azure Portal UI.
What may be the reason for such buggy behavior? Can you suggest a way to work around this bug in an automated manner, or how to fix the code so it can be properly deployed with Set-AzureRmDataFactoryV2Pipeline?
You could try whether Update-Module -Name AzureRm.DataFactoryV2 helps. The issue might be caused by an out-of-date PowerShell module.

Build arm template in VSTS fails with error about 'artifactsLocation'

Normally, when I deploy through Visual Studio, _artifactsLocation shows up when editing the parameters, so what should this be in VSTS and how do I set it?
2018-02-21T08:49:46.1918199Z ##[error]Deployment template validation failed: 'The value for the template parameter '_artifactsLocation' at line '1' and column '182' is not provided. Please see https://aka.ms/arm-deploy/#parameter-file for usage details.'.
2018-02-21T08:49:46.1919769Z ##[error]Task failed while creating or updating the template deployment.
You can specify it in the parameters file, then specify the file path in the Template parameters input box of the Azure Resource Group Deployment task, if that is what you are using.
The parameters can also be overridden by specifying them in the Override template parameters input box of the Azure Resource Group Deployment task (see the sketch below).
If you are calling the script through an Azure PowerShell task, you can specify it in the arguments: -ArtifactStagingDirectory; related issue: The value for the template parameter '_artifactsLocation' is not provided
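For a YAML pipeline, a hypothetical equivalent of the Override template parameters option could look like this (the service connection, resource group, file names, and the two variables holding the artifacts URL and SAS token are placeholders):

steps:
  - task: AzureResourceGroupDeployment@2
    inputs:
      azureSubscription: 'my-service-connection'   # placeholder
      resourceGroupName: 'my-resource-group'       # placeholder
      location: 'West Europe'
      csmFile: 'azuredeploy.json'
      csmParametersFile: 'azuredeploy.parameters.json'
      # Supply the artifacts parameters the template expects
      overrideParameters: >-
        -_artifactsLocation "$(artifactsLocation)"
        -_artifactsLocationSasToken "$(artifactsLocationSasToken)"
      deploymentMode: 'Incremental'
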
This sounds like you are using the Azure Resource Group deployment template from VS to deploy via VSTS.
It uses MSDeploy as part of the ARM template deployment to deploy your service.
The PowerShell script generated by the VS project template uploads a ZIP file containing your service to Blob storage and puts the URL and other information into _artifactsLocation and the other ARM template parameters.
Instead of doing that, you can remove the artifacts-related parameters and the MSDeploy resource from the ARM template. Then the template contains ONLY infrastructure-related resources.
After this, add a "Deploy to App Service" step in the VSTS release pipeline after the ARM template deployment; that can then be used to deploy your service code (see the sketch below).
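A minimal sketch of that two-step ordering; the task versions, service connection, app name, and package path are assumptions about a typical setup, not taken from the question:

steps:
  # 1. Deploy infrastructure only (ARM template with the MSDeploy resource removed)
  - task: AzureResourceGroupDeployment@2
    inputs:
      azureSubscription: 'my-service-connection'   # placeholder
      resourceGroupName: 'my-resource-group'       # placeholder
      location: 'West Europe'
      csmFile: 'azuredeploy.json'
      deploymentMode: 'Incremental'

  # 2. Deploy the service code to the App Service created above
  - task: AzureWebApp@1
    inputs:
      azureSubscription: 'my-service-connection'   # placeholder
      appType: 'webApp'
      appName: 'my-web-app'                        # placeholder
      package: '$(Pipeline.Workspace)/drop/*.zip'
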
If you are using a separate parameters json file, you'll need to initialise the _artifactsLocation and _artifactsLocationSasToken there. You can give them empty strings, like:
"_artifactsLocation": {
"value": ""
},
"_artifactsLocationSasToken": {
"value": ""
},
They should automatically get their values from a PowerShell script. I'm using the AzureResourceManagerTemplateDeployment@3 task; it would probably work with AzureResourceGroupDeployment@2 as well.

Specifying artifactsLocation in web app deployment to Azure

I am trying to deploy an Azure resource group (with a website in it) via Octopus Deploy.
Hyak.Common.CloudException: InvalidTemplate: Deployment template validation failed: 'The value for the template parameter '_artifactsLocation' at line '40' and column '32' is not provided.
How would I specify this in Visual Studio so that the solution can get deployed to Azure?
Thanks
From an Octopus point of view, you can use variable substitution on both the template and parameter files to sub in whatever values you need into your templates.
If you have a look at the "Template Contained in a Package" section of the Azure Resource Groups documentation, it shows example JSON templates with variable substitution in place.
E.g.:
"databaseName": {
"value": "#{DatabaseName}"
},
So in your project, you'd set up a project variable, then use the variable substitution syntax to reference that variable in your template JSON (contained in your package), and it will get substituted before being executed in Azure.
Hope this helps

Visual Studio Team Services: Raw link to build artifacts

I see several examples of Azure Resource Manager templates referencing artifacts directly in GitHub.
As in the following example, taken from this quick start template:
"modulesUrl": {
"type": "string",
"defaultValue": "https://github.com/Azure/azure-quickstart-templates/raw/master/dsc-extension-azure-automation-pullserver/UpdateLCMforAAPull.zip",
"metadata": {
"description": "URL for the DSC configuration package. NOTE: Can be a Github url(raw) to the zip file (this is the default value)"
}
As an organisation, we can't use free GitHub as the code would be public, and we pay for VSTS already... At the moment, we have to upload artifacts to Azure storage accounts using the VSTS Azure Resource Group Deployment build task and reference them from there. It would be nice if we could remove this step.
So, is there a way to reference artifacts directly from a VSTS repository in a similar way to GitHub? I assume the URI would require some form of authentication, such as a PAT token.
All I can find is this, but I think it is referring to packages. I don't need to create packages for ARM templates and DSC configurations.
There is a task called Azure Resource Group Deployment; we use this to deploy the ARM template.
According to your sample template, it's using publicly accessible http/https URLs on GitHub. I'm afraid this is not accessible via a VSTS URL. In VSTS you need to follow the process below (you need to use a SAS token):
You can provide some extra parameters using the output variables defined in the Azure File Copy task (storageURI, storageToken). These are needed because in the template we use the _artifactsLocation and _artifactsLocationSasToken parameters to build the storage URL to the files (see the sketch below).
For more details, please refer to this blog: Setting up VSTS with ARM Templates
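A hypothetical sketch of that pattern, assuming a version of the Azure File Copy task that lets you name the output variables yourself; the variable names storageURI and storageToken follow the approach above, while the service connection, storage account, container, and paths are placeholders:

steps:
  # Copy the ARM artifacts to blob storage and expose the URL and SAS token
  - task: AzureFileCopy@2
    inputs:
      SourcePath: '$(Build.ArtifactStagingDirectory)/templates'
      azureSubscription: 'my-service-connection'      # placeholder
      Destination: 'AzureBlob'
      storage: 'mystorageaccount'                     # placeholder
      ContainerName: 'artifacts'
      outputStorageUri: 'storageURI'                  # becomes $(storageURI)
      outputStorageContainerSasToken: 'storageToken'  # becomes $(storageToken)

  # Pass the copied location and token into the ARM template parameters
  - task: AzureResourceGroupDeployment@2
    inputs:
      azureSubscription: 'my-service-connection'      # placeholder
      resourceGroupName: 'my-resource-group'          # placeholder
      location: 'West Europe'
      csmFile: '$(Build.ArtifactStagingDirectory)/templates/azuredeploy.json'
      overrideParameters: >-
        -_artifactsLocation "$(storageURI)"
        -_artifactsLocationSasToken "$(storageToken)"
      deploymentMode: 'Incremental'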
