Is it possible to do continuous deployment CI/CD of an Azure Function through a Linux Environment via Azure DevOps? - azure

When creating a function in Azure through a Linux environment, it seems CI/CD is completely missing from its capabilities, as I can't see any actual files. VS Code tells me this:
Error: This plan does not support viewing files.
and when I try to deploy my files to the server through the Azure pipeline, everything works except for the Azure App Service Deploy task, which tells me this:
2020-04-21T19:48:37.6676043Z ##[error]Failed to deploy web package to App Service.
2020-04-21T19:48:37.6689536Z ##[error]Error: Error: Failed to deploy web package to App Service. Conflict (CODE: 409)
I did get it working directly through VS Code with a Windows environment and didn't notice any of those issues.
Can you confirm this is not possible through Linux, or is there perhaps a solution for what I am looking for?

is it possible to do continuous deployment CI/CD of an Azure Function through a Linux Environment via Azure DevOps?
The answer is Yes.
To deploy an Azure Function, you should use the Azure Function App task instead of the Azure App Service Deploy task. For example:
steps:
- task: AzureFunctionApp@1
  inputs:
    azureSubscription: '<Azure service connection>'
    appType: functionAppLinux
    appName: '<Name of function app>'
    # Uncomment the next lines to deploy to a deployment slot.
    # Note that deployment slots are not supported for the Linux Dynamic SKU.
    #deployToSlotOrASE: true
    #resourceGroupName: '<Resource Group Name>'
    #slotName: '<Slot name>'
Please check out this document Continuous delivery by using Azure DevOps for detailed examples.
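As a rough sketch, a complete build-and-deploy pipeline for a Linux function app could look like the following. This assumes the function code sits at the repository root and is packaged as a zip before deployment; the service connection and app name are placeholders, so adjust everything to your environment.

trigger:
- master

pool:
  vmImage: 'ubuntu-latest'

steps:
# Package the function code into a zip archive (paths are assumptions: repo root in, staging directory out).
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(System.DefaultWorkingDirectory)'
    includeRootFolder: false
    archiveType: zip
    archiveFile: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'

# Deploy the zip to the Linux function app.
- task: AzureFunctionApp@1
  inputs:
    azureSubscription: '<Azure service connection>'
    appType: functionAppLinux
    appName: '<Name of function app>'
    package: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'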

Related

How to manipulate remote Terraform state files on Azure Blob storage

I'm working with a subscription that has a few different deployed environments (dev, test, staging, etc.). Each environment has its own storage account, containing an associated Terraform state file. These environments get deployed via Azure DevOps Pipelines.
It's easy enough to get at the .tfstate files that have been created this way, through the portal, CLI, etc.
But is it possible to access these state files using the 'terraform state' commands, for example using Azure Cloud Shell? If so, how do you point them at the right location?
I've tried using the terraform state commands in a Cloud Shell, but it's not clear how to point them to the right location or if this is indeed possible.
For this requirement, you can use the AzurePowerShell task.
1. First, if you can achieve your requirement via the PowerShell feature in the Azure portal, then it is possible to achieve the same thing with the AzurePowerShell task (AzurePowerShell runs on the agent, based on the service connection/service principal you provide):
- task: AzurePowerShell@5
  inputs:
    azureSubscription: 'testbowman_in_AAD' # This service connection maps to a service principal on the Azure side.
    ScriptType: 'InlineScript'
    Inline: |
      # Put your logic here.
    azurePowerShellVersion: 'LatestVersion'
2. Second, you can use AzCopy to download the file and then operate on it; the Microsoft-hosted agents in Azure DevOps include this tool. A sketch of the same idea is shown below.
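For illustration, here is a minimal sketch of a pipeline step that downloads a state file with the Azure CLI and inspects it. The storage account, container, and blob names are placeholders, the service principal is assumed to have data-plane access to the container (e.g. Storage Blob Data Reader), and Terraform is assumed to be available on the agent.

- task: AzureCLI@2
  inputs:
    azureSubscription: '<Azure Resource Manager service connection>'
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      # Download the dev environment's state file from blob storage (placeholder names).
      az storage blob download \
        --account-name devstorageaccount \
        --container-name tfstate \
        --name dev.tfstate \
        --file dev.tfstate \
        --auth-mode login
      # Inspect it locally, e.g. list the resources it tracks.
      terraform state list -state=dev.tfstate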
Alternatively, run terraform state pull > state.tfstate in Azure Cloud Shell (you can use any name, such as dev.tfstate; the .tfstate extension is what matters).
All you need to do is move to the directory containing your Terraform configuration
and run terraform state pull > dev.tfstate.

Azure infrastructure creation methods

I'm wondering what is the best way to create and manage an Azure infrastructure. By infrastructure, I mean a set of resources used by a project. E.g. An Application Service Plan, a web service, a SQL server etc.
Currently, I see that there are a couple of ways to do this programmatically in a CD fashion:
1. By uploading a template with the needed resources
2. By creating each resource using its own PowerShell module, e.g. Az.Websites, Az.Sql, Az.IotHub, etc.
3. By using the Az CLI, which is approximately the same as 2.
What are the pros and cons of each method?
You can try Azure ARM templates. They support all the resources you mentioned and describe the deployment in a simple JSON structure. Once you have prepared the ARM template, you can deploy it using an Azure DevOps release pipeline. For more details, check the Microsoft documentation.
trigger:
- master

pool:
  vmImage: 'windows-latest'

steps:
- task: AzureFileCopy@4
  inputs:
    SourcePath: 'templates'
    azureSubscription: 'copy-connection'
    Destination: 'AzureBlob'
    storage: 'demostorage'
    ContainerName: 'projecttemplates'
  name: AzureFileCopy
- task: AzureResourceManagerTemplateDeployment@3
  inputs:
    deploymentScope: 'Resource Group'
    azureResourceManagerConnection: 'copy-connection'
    subscriptionId: '00000000-0000-0000-0000-000000000000'
    action: 'Create Or Update Resource Group'
    resourceGroupName: 'demogroup'
    location: 'West US'
    templateLocation: 'URL of the file'
    csmFileLink: '$(AzureFileCopy.StorageContainerUri)templates/mainTemplate.json$(AzureFileCopy.StorageContainerSasToken)'
    csmParametersFileLink: '$(AzureFileCopy.StorageContainerUri)templates/mainTemplate.parameters.json$(AzureFileCopy.StorageContainerSasToken)'
    deploymentMode: 'Incremental'
    deploymentName: 'deploy1'
Basically, when you want to build an infrastructure, all of these vehicles will do the same job; the difference is speed and convenience. Operations that you would do through the portal interface can be done in less time with the CLI. If you are working with multiple cloud providers (AWS, GCP, Azure) I recommend using Terraform, so you don't need to be knowledgeable about every provider's native tooling to build the infrastructure.
We suggest using ARM templates for a couple of reasons. ARM templates use declarative syntax, which lets you state what you intend to deploy without having to write the sequence of programming commands to create it. In the template, you specify the resources to deploy and the properties for those resources. ARM templates are more consistent and are idempotent; if you rerun a PowerShell or CLI command numerous times you can get different results. More pros can be found here, I am not going to re-write our docs.
The downside of ARM templates is that they can get complex, especially when you start nesting templates or start using Desired State Configuration. We have recently released Bicep (Preview) to reduce some of the complexity.
PowerShell and CLI are pretty similar in the pros/cons, but there are times I find one is easier to use (e.g. it's easier to configure Web with the CLI, but AzureAD needs PowerShell). The CLI of course is better if you are running on a non-Windows client, but now you can run PowerShell on Linux, so that is not a hard-and-fast rule.
The downside with PowerShell or CLI is that you must understand the dependencies of your infrastructure and code the script accordingly. ARM templates can take care of this orchestration and deploy everything in the proper order. This can also make PowerShell/CLI slower at deploying resources, since they are not deployed in parallel where possible unless you code your script in an async manner.
I would be remiss if I didn't mention Terraform. Terraform is great if you want consistency in deployments across clouds like Azure, AWS and GCP.
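To illustrate the ordering point above for the script-based approach, here is a minimal sketch of a CLI deployment inside a pipeline step. The resource names, location, and SKU are made up for the example; the point is simply that each resource must be created before anything that depends on it.

steps:
- task: AzureCLI@2
  inputs:
    azureSubscription: '<Azure Resource Manager service connection>'
    scriptType: bash
    scriptLocation: inlineScript
    inlineScript: |
      # Order matters: the resource group before the plan, the plan before the web app.
      az group create --name demo-rg --location westus
      az appservice plan create --name demo-plan --resource-group demo-rg --sku S1
      az webapp create --name demo-web-app-12345 --resource-group demo-rg --plan demo-plan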

Deploy .NET Core app to Azure Web App with ARM template and GitHub Actions

I'm pretty familiar with Azure DevOps, pipelines, all that stuff, and now I'm trying to dig into GitHub Actions. The question is basically pretty simple: I want to deploy my .NET Core 5 app to Azure. The only problem is that all the examples include this publish profile.
Since I provision the infrastructure with an ARM template, the publish profile is simply not there yet. I could find some examples that deploy the ARM template and a couple of examples that deploy the Web App, but no example combining both. Maybe I'm a little bit polluted by the way Azure DevOps works and the (wonderful) idea of service connections.
So my question is, how do I publish a web app to Azure when I don't have the ability to download a publish profile and store that in my GitHub secrets, using GitHub Actions?
OK, I found this one out myself. Apparently there's an action you can use that will download the publish profile for you. This means that you don't have to have the publish profile up front. The step looks like this:
- name: Get WebApp/FunctionApp publish profile
  id: webapp-dev
  uses: aliencube/publish-profile-actions@v1
  env:
    AZURE_CREDENTIALS: ${{ secrets.AZURE_CREDENTIALS }}
  with:
    resourceGroupName: 'your-resource-group-name'
    appName: 'your-app-name'
This leaves you with an output variable called profile, which can be used in the following steps like so:
- name: 'Run Azure webapp deploy action using publish profile credentials'
  uses: azure/webapps-deploy@v1
  with:
    app-name: 'your-app-name'
    publish-profile: ${{ steps.webapp-dev.outputs.profile }}
    package: './'
This means you can now provision resources using ARM templates, get the publish profile (just-in-time) and use that to deploy your system... Everybody happy...
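For completeness, here is a rough sketch of how the ARM deployment could sit in front of those steps in the same workflow. The azure/login and azure/arm-deploy steps, the template path, and the AZURE_SUBSCRIPTION_ID secret are assumptions for illustration, not part of the original answer.

jobs:
  provision-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2

      # Log in with the same service principal stored in AZURE_CREDENTIALS.
      - uses: azure/login@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}

      # Provision the infrastructure from the ARM template first (template path is an assumption).
      - uses: azure/arm-deploy@v1
        with:
          scope: resourcegroup
          subscriptionId: ${{ secrets.AZURE_SUBSCRIPTION_ID }}
          resourceGroupName: 'your-resource-group-name'
          template: ./azuredeploy.json

      # ...then run the publish-profile and webapp-deploy steps shown above.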

How to trigger an AzureML Pipeline from Azure DevOps?

If we have an AzureML Pipeline published, how can we trigger it from Azure DevOps without using Python Script Step or Azure CLI Step?
The AzureML Steps supported natively in Azure DevOps include Model_Deployment and Model_Profiling.
Is there any step in Azure DevOps which can be used to directly trigger a published Azure Machine Learning Pipeline while maintaining capabilities like using Service Connections and passing environmental variables, Gated Release (Deployment)?
Edit:
This process can then be used to run as an agentless job.
I am afraid there are no other steps available in Azure DevOps that can directly trigger a published Azure ML pipeline. You have to use a Python Script step or an Azure CLI step in your Azure DevOps pipeline to trigger the Azure ML pipeline.
To trigger the Azure ML pipeline using an Azure CLI task in an Azure DevOps pipeline, you can follow the steps below.
1. Create an Azure pipeline. See an example here.
2. Create an Azure Resource Manager service connection to connect your Azure subscription to Azure DevOps. See this thread for an example.
3. Add an Azure CLI task to your YAML pipeline and run the scripts below as inline scripts. See the documentation here for more information.
steps:
- task: AzureCLI@2
  displayName: 'Azure CLI'
  inputs:
    azureSubscription: 'azure Resource Manager service connection'
    scriptType: ps
    scriptLocation: inlineScript
    inlineScript: |
      # Install the Machine Learning CLI extension
      az extension add -n azure-cli-ml
      az ml run submit-pipeline --pipeline-id "{id}"
Update:
If you want to avoid using build agents, you can run the Invoke REST API task in an agentless job. See the steps below:
1. Create a Generic service connection in Azure DevOps. See here for creating a service connection.
2. Add the REST endpoint of the published Azure ML pipeline as the Server URL of the generic service connection. See here for more information about this URL.
3. Add an agentless job (server job) to your pipeline and add the Invoke REST API task to it. The pipeline will then execute the Invoke REST API task to trigger the Azure ML pipeline without using a build agent. A sketch of such a job is shown after this list.
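Here is a rough sketch of what that agentless job could look like in YAML. The service connection name is a placeholder, the $(amlAccessToken) variable is an assumed variable holding a valid AAD bearer token for the workspace, and the request body only approximates the published-pipeline submission format, so check the AzureML REST documentation before relying on it.

jobs:
- job: TriggerAML
  pool: server   # agentless / server job, no build agent required
  steps:
  - task: InvokeRESTAPI@1
    displayName: 'Trigger published AzureML pipeline'
    inputs:
      connectionType: 'connectedServiceName'            # Generic service connection
      serviceConnection: '<generic service connection>' # Server URL = the pipeline REST endpoint
      method: 'POST'
      headers: |
        {
          "Authorization": "Bearer $(amlAccessToken)",
          "Content-Type": "application/json"
        }
      body: |
        { "ExperimentName": "devops-triggered-run" }
      waitForCompletion: 'false'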
You can also set up an Azure Logic App in your Azure subscription.
You can set the Logic App trigger to Azure DevOps events, or use an HTTP request as the trigger (you can then use the Invoke REST API task or an Azure DevOps web hook to call this HTTP request and trigger the Logic App).
Then add an HTTP action that points at the published pipeline's REST endpoint. Please see here for more information.
Assumptions:
An AzureML pipeline is published and its REST endpoint is ready (referred to in this answer as <AML_PIPELINE_REST_URI>), and the published pipeline ID is also known (referred to as <AML_PIPELINE_ID>).
You have the Azure Machine Learning extension installed in your Azure DevOps organization.
To invoke the Azure Machine Learning pipeline, we use the Invoke ML Pipeline task available in Azure DevOps. It is available when running an agentless job.
To trigger it, the workflow is as follows:
1. Create a new pipeline. Using the classic editor, delete the default Agent job 1 stage.
2. Add an agentless job.
3. Add a task to this agentless job: use the AzureML Published Pipeline task.
4. Use the service connection mapped to the AML workspace. You can find more on this in the official documentation.
5. Choose the pipeline to trigger using the <AML_PIPELINE_ID>.
6. Give the experiment name and pipeline parameters, if any.
7. That's it, you can Save and Queue.
Alternatively, you can simply use the following jobs:
- job: Job_2
  displayName: Agentless job
  pool: server
  steps:
  - task: MLPublishedPipelineRestAPITask@0
    displayName: Invoke ML pipeline
    inputs:
      connectedServiceName: <REDACTED-AML-WS-Level-Service_Connection-ID>
      PipelineId: <AML_PIPELINE_ID>
      ExperimentName: experimentname
      PipelineParameters: ''

How to connect Azure DevOps Pipeline to new app service?

I've got an app service on Microsoft Azure. I use Azure DevOps to deploy code. This has been working for a while.
Now I need a Beta version of the app. So I've created a new app service in Azure inside a new Resource Group. That was easy to set up and I'm up and running.
But I'm struggling with deployment in DevOps. I've created a new project for my Beta service in DevOps, but I can't figure out how I connect to my new Beta app service.
Edit:
In DevOps I have created a service connection that is linked to my Azure Beta resource. But I don't see how my new pipeline links to this service connection.
Pipeline looks like this (YAML export)
pool:
  name: Azure Pipelines
  demands: npm

steps:
- task: Npm@1
  displayName: 'npm install'
  inputs:
    verbose: false
- task: Npm@1
  displayName: 'npm build'
  inputs:
    command: custom
    verbose: false
    customCommand: 'run build --scripts-prepend-node-path=auto '
- task: ArchiveFiles@2
  displayName: 'Archive files'
  inputs:
    rootFolderOrFile: dist
    includeRootFolder: false
- task: PublishBuildArtifacts@1
  displayName: 'Publish artifacts: drop'
The strange thing is that the pipeline runs without errors, but neither the Beta site nor the Prod site gets any of the code changes. What am I missing?
To deploy your app to an Azure resource (to an app service or to a virtual machine), you will need an Azure Resource Manager service connection first.
Go to Project settings of the new project --> Service connections (under Pipelines) --> New service connection --> select Azure Resource Manager. See here for more information.
Then you need to create a new pipeline in the new project to build the Beta service. You can check which tasks are used in the pipeline for the other app service and add the same tasks to this new pipeline. Here is an example of creating a YAML pipeline; you can also create a classic UI pipeline.
If you deployed the other app service in a release pipeline, you can create a new release pipeline for the Beta service.
The tasks used in the build and release pipelines can be the same as for your other app service; you might just need to adjust their configuration slightly for the Beta service project. For example, if you use the Azure Web App task to deploy your service, you need to set its azureSubscription field to the Azure Resource Manager service connection created in the very first step, and set the appName field to the new Azure app service. See the tutorials below for more information; a minimal deploy-step sketch follows them.
Deploy an Azure Web App (Yaml Pipeline)
Deploy a web app to Azure App Services (Classic Pipeline)
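As a minimal sketch, appending a deploy step like the following to the build pipeline above would actually push the archive to the Beta app service. The service connection and app name are placeholders, the appType depends on your plan, and the package path assumes the Archive files step writes to its default location.

- task: AzureWebApp@1
  displayName: 'Deploy to Beta app service'
  inputs:
    azureSubscription: '<Beta Azure Resource Manager service connection>'
    appType: 'webApp'   # use 'webAppLinux' for a Linux plan
    appName: '<beta-app-service-name>'
    package: '$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip'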
As I understand the question, you want to set up an Azure DevOps pipeline for your new App Service deployment. For this, you can follow this tutorial. Also, note that it is the same process you followed for your previous service.
