How to manipulate remote Terraform state files on Azure Blob storage

I'm working with a subscription that has a few different deployed environments (dev, test, staging, etc.). Each environment has its own storage account, containing an associated Terraform state file. These environments get deployed via Azure DevOps Pipelines.
It's easy enough to get at the .tfstate files that have been created this way, through the portal, CLI, etc.
But is it possible to access these state files using the 'terraform state' commands, for example using Azure Cloud Shell? If so, how do you point them at the right location?
I've tried using the terraform state commands in a Cloud Shell, but it's not clear how to point them to the right location or if this is indeed possible.

For this requirement, you can use the AzurePowerShell task.
1. First, if you can achieve what you need via PowerShell in the Azure portal, then you can do the same thing with the AzurePowerShell task (AzurePowerShell runs on the agent using the service connection/service principal you provide):
- task: AzurePowerShell@5
  inputs:
    azureSubscription: 'testbowman_in_AAD' # This service connection maps to a service principal on the Azure side.
    ScriptType: 'InlineScript'
    Inline: |
      # Put your logic here.
    azurePowerShellVersion: 'LatestVersion'
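For instance, the Inline logic for pulling a state file down with Az PowerShell might look something like the sketch below; the resource group, storage account, container, and blob names are placeholders for your environment, not values from the question.
- task: AzurePowerShell@5
  inputs:
    azureSubscription: 'testbowman_in_AAD'
    ScriptType: 'InlineScript'
    Inline: |
      # Look up the storage account key, build a storage context, and download the state blob.
      $key = (Get-AzStorageAccountKey -ResourceGroupName 'rg-dev' -Name 'devstateaccount')[0].Value
      $ctx = New-AzStorageContext -StorageAccountName 'devstateaccount' -StorageAccountKey $key
      Get-AzStorageBlobContent -Container 'tfstate' -Blob 'dev.terraform.tfstate' `
        -Destination "$(Build.ArtifactStagingDirectory)/dev.terraform.tfstate" -Context $ctx
    azurePowerShellVersion: 'LatestVersion'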
2. Second, you can use AzCopy to download the state file and then operate on it; this tool is pre-installed on Microsoft-hosted agents.
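A minimal sketch of that AzCopy approach as a pipeline step; the storage account, container, blob, and SAS-token variable are illustrative placeholders.
- script: |
    # azcopy ships on Microsoft-hosted agents; $(stateSasToken) is a hypothetical secret variable holding a SAS token.
    azcopy copy \
      "https://devstateaccount.blob.core.windows.net/tfstate/dev.terraform.tfstate?$(stateSasToken)" \
      "$(Build.ArtifactStagingDirectory)/dev.terraform.tfstate"
  displayName: 'Download remote state with AzCopy'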

Run the command terraform state pull > state.tfstate in Azure Cloud Shell (you can use any name, for example dev.tfstate; the .tfstate extension is what matters). All you need to do is change to the directory containing your Terraform configuration and run terraform state pull > dev.tfstate.
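Putting it together, a minimal sketch of pointing Cloud Shell at one environment's remote state (this assumes your configuration declares an empty azurerm backend block; the resource group, account, and container names are placeholders):
cd ./environments/dev   # directory containing the Terraform configuration

# Initialise against the dev environment's state in Azure Blob storage.
terraform init \
  -backend-config="resource_group_name=rg-dev" \
  -backend-config="storage_account_name=devtfstate" \
  -backend-config="container_name=tfstate" \
  -backend-config="key=dev.terraform.tfstate"

terraform state list                # any 'terraform state' subcommand now targets that backend
terraform state pull > dev.tfstate  # local snapshot of the remote state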

Related

Specify local tf state file to azurerm provider in pipeline

I have been working on deploying a Terraform package using an Azure DevOps pipeline.
We keep our tf state file locally and have no plans to move it to an Azure storage account. Could you please help with how to define the attribute values in the terraform init step of the pipeline?
- task: TerraformTaskV2@2
  displayName: Terraform init
  inputs:
    provider: 'azurerm'
    command: 'init'
    workingDirectory: 'some directory'
    backendServiceArm: 'some service conn'
    backendAzureRmContainerName: ??
    backendAzureRmResourceGroupName: ??
    backendAzureRmStorageAccountName: ??
    backendAzureRmKey: ??
What should the values be for resource group, storage account name, and container name? If I don't specify these values, the pipeline fails with the error below:
##[error]Error: Input required: backendAzureRmStorageAccountName
Any help on this is much appreciated. Thanks in advance.
I'm unsure if you can use the TerraformTaskV2 without utilizing a cloud provider's backend. In the README for said task it doesn't show options for using a local backend, only the following for terraform init:
... AzureRM backend configuration
... Amazon Web Services(AWS) backend configuration
... Google Cloud Platform(GCP) backend configuration
I haven't had experience with this yet, but you could look at the extension Azure Pipelines Terraform Tasks, which does explicitly support a local backend:
The Terraform CLI task supports the following terraform backends
local
...
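If you would rather not depend on a wrapper task at all, a plain script step also works with a local backend. A rough sketch, assuming Terraform is available on the agent and the working directory is illustrative:
- script: |
    # With no remote backend configured, init/plan read and write terraform.tfstate
    # in the working directory, so no backendAzureRm* inputs are needed.
    terraform init -input=false
    terraform plan -input=false -out=tfplan
  displayName: 'Terraform init and plan (local state)'
  workingDirectory: 'some directory'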
Just a note on working in teams:
if you're working in a team deploying infrastructure, using a local backend can lead to undefined state and/or undesirable outcomes. A good remote backend can "...support locking the state while operations are being performed, which helps prevent conflicts and inconsistencies." - docs

Azure infrastructure creation methods

I'm wondering what is the best way to create and manage an Azure infrastructure. By infrastructure, I mean a set of resources used by a project. E.g. An Application Service Plan, a web service, a SQL server etc.
Currently, I see that there are a couple of ways to do this programmatically in a CD fashion:
By uploading a template with the needed resources
By creating each resource using its own PowerShell Module: E.g. Az.Websites, Az.Sql, Az.IotHub etc.
By using Az CLI, which is approximately the same as 2.
What are the pros and cons of each method?
You can try Azure ARM templates. They support deploying all the resources you mentioned using a simple JSON structure. Once you have prepared the ARM template, you can deploy it with an Azure DevOps release pipeline; for more details check the Microsoft documentation.
trigger:
- master

pool:
  vmImage: 'windows-latest'

steps:
- task: AzureFileCopy@4
  inputs:
    SourcePath: 'templates'
    azureSubscription: 'copy-connection'
    Destination: 'AzureBlob'
    storage: 'demostorage'
    ContainerName: 'projecttemplates'
  name: AzureFileCopy
- task: AzureResourceManagerTemplateDeployment@3
  inputs:
    deploymentScope: 'Resource Group'
    azureResourceManagerConnection: 'copy-connection'
    subscriptionId: '00000000-0000-0000-0000-000000000000'
    action: 'Create Or Update Resource Group'
    resourceGroupName: 'demogroup'
    location: 'West US'
    templateLocation: 'URL of the file'
    csmFileLink: '$(AzureFileCopy.StorageContainerUri)templates/mainTemplate.json$(AzureFileCopy.StorageContainerSasToken)'
    csmParametersFileLink: '$(AzureFileCopy.StorageContainerUri)templates/mainTemplate.parameters.json$(AzureFileCopy.StorageContainerSasToken)'
    deploymentMode: 'Incremental'
    deploymentName: 'deploy1'
Basically, when you want to build an infrastructure, all of these tools will do the same job; the difference is speed and convenience. Operations you would do through the portal can be done in a shorter time using the CLI. If you are working with multiple cloud providers (AWS, GCP, Azure), I recommend using Terraform, so you don't need deep knowledge of every provider's tooling to build the infrastructure.
We suggest using ARM templates for a couple of reasons. ARM templates use declarative syntax, which lets you state what you intend to deploy without having to write the sequence of programming commands to create it. In the template, you specify the resources to deploy and the properties for those resources. ARM templates are more consistent and are idempotent. If you rerun a PowerShell or CLI command numerous times you can get different results. More pros can be found here, I am not going to re-write our docs.
The downside of ARM templates is that they can get complex, especially when you start nesting templates or using Desired State Configuration. We have recently released Bicep (Preview) to reduce some of the complexity.
PowerShell and CLI are pretty similar in their pros/cons, but there are times I find one easier to use (e.g. it's easier to configure Web with CLI, but AzureAD needs PowerShell). CLI of course is better if you are running on a non-Windows client, but now you can run PowerShell on Linux, so that is not a hard and fast rule.
The downside with PowerShell or CLI is that you must understand the dependencies of your infrastructure and code the script accordingly. ARM templates can take care of this orchestration and deploy everything in the proper order. This can also make PowerShell/CLI slower at deploying resources, since they are not deployed in parallel where possible unless you code your script in an async manner.
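To make that ordering concern concrete, here is a small sketch with the CLI where the script has to create resources in dependency order itself (names, location, and SKU are illustrative):
# The resource group must exist before the App Service plan, and the plan before the web app;
# with a script, that ordering is your responsibility.
az group create --name demo-rg --location westus
az appservice plan create --name demo-plan --resource-group demo-rg --sku B1
az webapp create --name demo-web-app-12345 --resource-group demo-rg --plan demo-plan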
I would be remiss if I didn't mention Terraform. Terraform is great if you want consistency in deployments across clouds like Azure, AWS and GCP.

How to trigger an AzureML Pipeline from Azure DevOps?

If we have an AzureML Pipeline published, how can we trigger it from Azure DevOps without using Python Script Step or Azure CLI Step?
The AzureML Steps supported natively in Azure DevOps include Model_Deployment and Model_Profiling.
Is there any step in Azure DevOps which can be used to directly trigger a published Azure Machine Learning Pipeline while maintaining capabilities like using Service Connections and passing environmental variables, Gated Release (Deployment)?
Edit:
This process can then be used to run as an agentless job.
I am afraid there are no other steps available in Azure DevOps that can directly trigger a published Azure ML pipeline. You have to use a Python script step or an Azure CLI step in your Azure DevOps pipeline to trigger the Azure ML pipeline.
To trigger the Azure ML pipeline using an Azure CLI task in an Azure DevOps pipeline, check out the steps below.
1. Create an Azure pipeline. See an example here.
2. Create an Azure Resource Manager service connection to connect your Azure subscription to Azure DevOps. See this thread for an example.
3. Add an Azure CLI task to your YAML pipeline and run the scripts below as inline scripts. See the documentation here for more information.
steps:
- task: AzureCLI@2
  displayName: 'Azure CLI'
  inputs:
    azureSubscription: 'azure Resource Manager service connection'
    scriptType: ps
    scriptLocation: inlineScript
    inlineScript: |
      # Install the Machine Learning CLI extension
      az extension add -n azure-cli-ml
      az ml run submit-pipeline --pipeline-id "{id}"
Update:
If you want to avoid using build agents, you can run the Invoke REST API task in an agentless job. See the steps below and the YAML sketch that follows:
1. Create a Generic service connection in Azure DevOps. See here for creating a service connection.
2. Use the published AzureML pipeline's REST endpoint as the Server URL of the generic service connection. See here for more information about that URL.
3. Add an agentless job (server job) to your pipeline and add the Invoke REST API task to it. The pipeline will then execute the Invoke REST API task to trigger the AzureML pipeline without using a build agent.
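A rough YAML sketch of such an agentless job is below; the job name, service connection name, and request body are illustrative, and the exact body and authentication depend on how the AzureML REST endpoint and the generic service connection are set up.
- job: TriggerAzureMLPipeline
  pool: server                     # agentless (server) job
  steps:
  - task: InvokeRESTAPI@1
    displayName: 'Trigger published AzureML pipeline'
    inputs:
      connectionType: 'connectedServiceName'            # generic service connection
      serviceConnection: 'aml-pipeline-rest-endpoint'   # hypothetical connection whose Server URL is the pipeline's REST endpoint
      method: 'POST'
      headers: '{"Content-Type": "application/json"}'
      body: '{"ExperimentName": "devops-triggered-run"}'
      waitForCompletion: 'false'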
You can also set up an Azure Logic App in your Azure subscription.
You can set the Logic App trigger to Azure DevOps events, or use an HTTP request as the trigger (you can then use the Invoke REST API task or an Azure DevOps web hook to call this HTTP request and trigger the Logic App).
Then add an HTTP action that calls the published pipeline's REST endpoint mentioned above. Please see here for more information.
Assumptions:
An AzureML pipeline is published and the REST endpoint is ready (referred to in this answer as <AML_PIPELINE_REST_URI>), and the published pipeline ID is also known (referred to as <AML_PIPELINE_ID>).
You have the Azure Machine Learning extension installed.
To invoke the Azure Machine Learning pipeline, we use the Invoke ML Pipeline step available in Azure DevOps; it is available when running an agentless job.
To trigger it, the workflow is as follows:
1. Create a new pipeline. Using the classic editor, delete the default Agent job 1 stage.
2. Add an agentless job.
3. Add a task to this agentless job: use the AzureML Published Pipeline task.
4. Use the service connection mapped to the AML workspace. You can find more on this in the official documentation.
5. Choose the pipeline to trigger using the <AML_PIPELINE_ID>.
6. Give the experiment name and pipeline parameters, if any.
7. That's it, you can Save and Queue.
Alternatively, you can simply use the following jobs:
- job: Job_2
  displayName: Agentless job
  pool: server
  steps:
  - task: MLPublishedPipelineRestAPITask@0
    displayName: Invoke ML pipeline
    inputs:
      connectedServiceName: <REDACTED-AML-WS-Level-Service_Connection-ID>
      PipelineId: <AML_PIPELINE_ID>
      ExperimentName: experimentname
      PipelineParameters: ''

Is it possible to do continuous deployment CI/CD of an Azure Function through a Linux Environment via Azure DevOps?

When creating a function in Azure through a Linux environment, it seems CI/CD is completely missing from its capabilities, as I can't see any actual files. My VS Code tells me this:
Error: This plan does not support viewing files.
and when I try to deploy my files to the server through the Azure pipeline, everything works except for the Azure App Service Deploy task, which tells me this:
2020-04-21T19:48:37.6676043Z ##[error]Failed to deploy web package to App Service.
2020-04-21T19:48:37.6689536Z ##[error]Error: Error: Failed to deploy web package to App Service. Conflict (CODE: 409)
I did get it working directly through VS Code with a windows environment and didn't notice any of those issues.
Can you confirm whether this is not possible through Linux, or perhaps there is a solution for what I am looking for?
is it possible to do continuous deployment CI/CD of an Azure Function through a Linux Environment via Azure DevOps?
The answer is Yes.
To deploy an Azure Function, you should use the Azure Function App task instead of the Azure App Service Deploy task. See the example below.
steps:
- task: AzureFunctionApp@1
  inputs:
    azureSubscription: '<Azure service connection>'
    appType: functionAppLinux
    appName: '<Name of function app>'
    # Uncomment the next lines to deploy to a deployment slot
    # Note that deployment slots are not supported for the Linux Dynamic SKU
    #deployToSlotOrASE: true
    #resourceGroupName: '<Resource Group Name>'
    #slotName: '<Slot name>'
Please check out this document Continuous delivery by using Azure DevOps for detailed examples.
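For context, the deploy task expects a package produced earlier in the pipeline. A sketch of the archive step that typically precedes it (the build output folder is illustrative and depends on your language/runtime):
- task: ArchiveFiles@2
  displayName: 'Package function app'
  inputs:
    rootFolderOrFile: '$(System.DefaultWorkingDirectory)/publish_output'  # your built function app
    includeRootFolder: false
    archiveType: 'zip'
    archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'
    replaceExistingArchive: true

- task: AzureFunctionApp@1
  inputs:
    azureSubscription: '<Azure service connection>'
    appType: functionAppLinux
    appName: '<Name of function app>'
    package: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'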

Task to Deploy Artifact to a container Storage Outside of my account

I am currently creating a CI for the front end of one of our clients.
We need to copy the files coming from our repo to the storage container of the company that manages the operational part (we only provide the code).
So, the company that will manage the infrastructure has given us the storage account name (testdeploy), the container name (artifact-deply) and the key (securekey).
I have managed to connect to the storage via Azure Storage Explorer, but now I need to deploy the artifact to this container via the CI.
The problem is, I don't know how, and I can't find documentation on how to proceed; every doc talks about deploying to a container in the same subscription.
But I do not have access to this container; I only have its name and key.
Here is the YAML of what I have already set up, in case it helps:
steps:
- task: AzureFileCopy@2
  displayName: 'AzureBlob File Copy'
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/_listes-Azure/buildtest'
    azureSubscription: 'Paiement à l''utilisation(my_subscription)'
    Destination: AzureBlob
    storage: testdeploy
    ContainerName: 'artifact-deploy/front'
    AdditionalArgumentsForBlobCopy: 'securekey'
    outputStorageUri: 'https://testdeply.blob.core.windows.net/'
    outputStorageContainerSasToken: 'securekey'
Of course, when I do this I get this error message:
2019-10-25T10:45:51.1809999Z ##[error]Storage account: fprplistesdeploy not found. The selected service connection 'Service Principal' supports storage accounts of Azure Resource Manager type only.
Since it's not in my subscription scope, it can't access it.
What am I doing wrong?
I am using the AzureFileCopy task; is that the right choice?
How can I set up the AzureFileCopy task for a storage account that is not in my subscription scope, knowing that the only things I have are an account name and a key?
Thanks in advance!
What you basically have to do is to create and use a Shared Access Signature (SAS) to deploy resources into this blob container. Since you have the storage account key you can create a SAS token with Azure Storage Explorer.
Then use Azure Cloud Shell or the Azure CLI on a local machine for testing purposes. Try to copy a file into the blob container using a SAS token for authorization. If you have problems with authorization using a SAS token, you can also test access with Azure Storage Explorer. Such basic tasks are widely known and well documented.
Finally, run the file copy command you used while testing in an Azure Pipelines task. If the Azure File Copy task does not fit your use case, use a more generic task like the Azure CLI task; from reading the docs it may be that Azure File Copy does not support this scenario even though the task name suggests it. Then work out how to access the artifact produced by the build pipeline, copy the files into the storage account, and improve from there.
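A sketch of that testing step with the Azure CLI, using the account and container names from the question; the key, expiry, and file name are placeholders:
# Generate a short-lived SAS token for the target container from the account key you were given.
SAS=$(az storage container generate-sas \
        --account-name testdeploy \
        --account-key "<securekey>" \
        --name artifact-deply \
        --permissions acw \
        --expiry 2030-01-01T00:00:00Z \
        --output tsv)

# Upload a file into that container, authorizing with the SAS token instead of a service connection.
az storage blob upload \
  --account-name testdeploy \
  --container-name artifact-deply \
  --name front.zip \
  --file ./front.zip \
  --sas-token "$SAS"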
So I managed to do it.
Turns out you can't do it via Azure File Copy; this task can't upload to a container outside your subscription.
You must use an Azure CLI task; here is the script I used:
#!/bin/bash
# The storage account name is supplied via a pipeline variable (see note below); only the key is passed here.
az storage blob upload --container-name artifact --file $(System.DefaultWorkingDirectory)/artifact_deply/buildtest/front.zip --name front --account-key securekey
I changed all the variables, but the idea is here (I declared the account name in the variables panel of Azure DevOps).
I used the account key because I had errors with the SAS URL, but I think you can easily use an Azure DevOps variable to pass the SAS token instead.
And I created a task before this one to zip the whole folder, so it's easier to manage.
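For reference, a sketch of how that script could be wired into the pipeline as an Azure CLI task; the service connection and variable names are illustrative, and the key should come from a secret variable rather than being hard-coded:
- task: AzureCLI@2
  displayName: 'Upload artifact to external storage account'
  inputs:
    azureSubscription: 'my_subscription'   # any valid ARM service connection; the upload itself authenticates with the account key
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az storage blob upload \
        --account-name "$(externalStorageAccount)" \
        --container-name artifact \
        --name front \
        --file "$(System.DefaultWorkingDirectory)/artifact_deply/buildtest/front.zip" \
        --account-key "$(externalStorageKey)"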
