Azure web apps post swap

I'm using a deployment slot called staging on my Azure web app.
My application uses some specific settings files. I would like to replace these files after swapping.
Is there a way to do a post-swap action? I'd like to replace some files automatically when the swap is finished (to put the new settings in place).
Thanks

I think that the path of least resistance for you is going to be to use an Azure DevOps Pipeline. A pipeline is made up of one or more tasks, and there are lots of tasks to choose from.
You can check out source code from a specific branch, build, publish to a slot, swap slots, and copy files using scripted pipelines.
Here is a full list of available tasks. If you would like to copy files from one place to another within a repo, take a look at the command line task. The pipeline runs on a Windows or Linux machine (your choice when you pick a build agent), so you can use normal DOS copy commands to move files around.
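As a rough illustration of that flow, here is a minimal YAML sketch that copies environment-specific settings files over the defaults, deploys to the staging slot, and then swaps. The service connection, app, resource group, and file paths are all placeholders, and the exact task inputs may need adjusting for your setup:

```yaml
steps:
# Overwrite the default settings file with the production one before deploying.
# Assumes earlier build steps publish the site to $(Build.ArtifactStagingDirectory)\site.
- script: copy /Y config\production\appsettings.json $(Build.ArtifactStagingDirectory)\site\appsettings.json
  displayName: 'Copy environment-specific settings'

# Deploy the folder to the staging slot.
- task: AzureRmWebAppDeployment@4
  inputs:
    azureSubscription: 'my-azure-connection'   # placeholder service connection
    appType: 'webApp'
    WebAppName: 'my-web-app'                   # placeholder app name
    deployToSlotOrASE: true
    ResourceGroupName: 'my-rg'                 # placeholder resource group
    SlotName: 'staging'
    packageForLinux: '$(Build.ArtifactStagingDirectory)/site'   # the "Package or folder" input

# Swap staging into production once the deployment succeeds.
- task: AzureAppServiceManage@0
  inputs:
    azureSubscription: 'my-azure-connection'
    action: 'Swap Slots'
    WebAppName: 'my-web-app'
    ResourceGroupName: 'my-rg'
    SourceSlot: 'staging'
```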

Related

Self-hosted Azure agent - how to configure pipelines to share the same build folder

We have a self-hosted build agent on an on-prem server.
We typically have a large codebase, and in the past followed this mechanism with TFS2013 build agents:
Daily check-ins were built to c:\work\tfs\ (taking about 5 minutes)
Each night a batch file would run the same build in those folders, using the same sources (they were already 'latest' from the CI build), build the installers, copy files to a network location, and send an email to the team detailing the build successes/failures. (Taking about 40 minutes.)
The key thing there is that the nightly build had no need to get the latest sources, and the disk space required wouldn't grow much, just by the installer sizes.
To replicate this with Azure Devops, I created two pipelines.
One pipeline that did the CI using MSBuild tasks in the classic editor - works great
Another pipeline in the classic editor that runs our existing powershell script, scheduled at 9pm - works great
However, even though my agent doesn't support parallel builds, what's happening is that:
The CI pipeline's folder is c:\work\1\
The Nightly build folder is c:\work\2\
This doubles the amount of disk space we need (10gb to 20gb)
They are the same code files, just built differently.
I have struggled to find a way to say to the agent "please use the same sources folder for all pipelines"
What setting is this? Otherwise we have to pay our service provider for extra GB of storage.
Or do I need to change my classic pipelines into YAML and somehow conditionally branch the build so it knows it's being scheduled and does something different?
Or maybe, stop using a Pipeline for the scheduled build, and use task scheduler in Windows as before?
(I did try looking for the same question - I'm sure I can't be the only one).
There is "workingDirectory" directive available for running scripts in pipeline. This link has details of this - https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/command-line?view=azure-devops&tabs=yaml
The numbered work folders on your build agent (c:\work\1\, c:\work\2\, ... c:\work\6\) each stand for a particular pipeline.
Agent.BuildDirectory
The local path on the agent where all folders for a given build pipeline are created. This variable has the same value as Pipeline.Workspace. For example: /home/vsts/work/1
If you have two pipelines, there will also be two corresponding work folders. This is expected behaviour: pipelines cannot be configured to share the same build folder; it is by design.
If you need to use less disk space to save cost, I'm afraid that dropping the pipeline for the scheduled build and going back to Windows Task Scheduler, as before, is the better way.

Is there any way to do selective deployment in azure devops?

I have a release pipeline which I use to deploy my resources to other environments. All works fine, but the problem is that every time I deploy, all the resources are deployed even if no modification has been made. Is there a way I can do selective deployment, i.e. deploy only those resources which have been modified? Any help would do. Thanks.
That's a broad question. There is no out-of-the-box feature to select which units to deploy, but you can use variables in the release pipeline:
Define a variable for each resource/unit, give it a default value, and tick the "Settable at release time" property.
For each resource, define a separate deployment task and give it a custom condition, like: and(succeeded(), eq(variables['Custom.DeployUnit1'], 'YES'))
You can then update these variables when you create the release.
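A rough YAML-flavoured sketch of the same idea (the answer above is about the classic release editor, where the condition goes in the task's "Custom condition" field); the service connection, app name and package path are placeholders:

```yaml
# Assumes a pipeline variable Custom.DeployUnit1 defined in the pipeline settings UI
# with "Let users override this value when running this pipeline" checked.
steps:
- task: AzureRmWebAppDeployment@4
  displayName: 'Deploy unit 1'
  # Runs only when earlier steps succeeded AND the variable was set to YES at queue time.
  condition: and(succeeded(), eq(variables['Custom.DeployUnit1'], 'YES'))
  inputs:
    azureSubscription: 'my-azure-connection'   # placeholder service connection
    WebAppName: 'unit1-app'                    # placeholder app name
    packageForLinux: '$(Pipeline.Workspace)/drop/Unit1.zip'
```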
Is there any way to do selective deployment in azure devops?
There is no such out-of-the-box way to do selective deployment in Azure DevOps.
That is because an Azure DevOps release does not support releasing only the changed files: releasing only what changed is not always meaningful and may not achieve what the project intends to release (for example, when a commit only changes a config file).
But you could create a PowerShell script to compare timestamps for all files:
Create an XML file that stores the last upload/publish information for each file (e.g. file name, date/time, changeset/commit version).
Create a PowerShell script that contains the logic to compare files (get the file metadata and compare it with that XML file) and copy updated files to a specific folder.
Publish the files in that folder.
Check the similar thread for some more details.
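A very rough sketch of that idea as an inline PowerShell step; the manifest format, folder names and XML attributes are all made up for illustration, and a real script would also rewrite the manifest after a successful publish:

```yaml
steps:
- powershell: |
    # Hypothetical manifest written by the previous publish, e.g.:
    # <files><file path="css\site.css" lastWriteUtc="2023-01-01T10:00:00Z" /></files>
    $manifest = '$(Build.SourcesDirectory)\lastPublish.xml'
    $source   = '$(Build.ArtifactStagingDirectory)\site'
    $changed  = '$(Build.ArtifactStagingDirectory)\changed'

    # Load the last-publish timestamps, if a manifest exists.
    $previous = @{}
    if (Test-Path $manifest) {
      ([xml](Get-Content $manifest -Raw)).files.file |
        ForEach-Object { $previous[$_.path] = ([datetime]$_.lastWriteUtc).ToUniversalTime() }
    }

    New-Item -ItemType Directory -Force -Path $changed | Out-Null
    Get-ChildItem $source -Recurse -File | ForEach-Object {
      $rel = $_.FullName.Substring($source.Length + 1)
      # Copy the file only if it is new or newer than what was published last time.
      if (-not $previous.ContainsKey($rel) -or $_.LastWriteTimeUtc -gt $previous[$rel]) {
        $dest = Join-Path $changed $rel
        New-Item -ItemType Directory -Force -Path (Split-Path $dest) | Out-Null
        Copy-Item $_.FullName $dest
      }
    }
  displayName: 'Stage only the files changed since the last publish'
```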
Besides, if you deploy via deploy.cmd or MSDeploy.exe, you could also use the -useChecksum WebDeploy flag:
WebDeploy/MSDeploy Quick Tip: Only Deploy Changed Files
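For example, a hand-rolled msdeploy call with that flag might look roughly like this (a sketch only: the package name, server URL, site name and credentials are placeholders, and $(deployPassword) is a hypothetical secret pipeline variable). -useChecksum makes Web Deploy compare file checksums instead of timestamps, so files that have not actually changed are skipped:

```yaml
steps:
- script: >
    "%ProgramFiles%\IIS\Microsoft Web Deploy V3\msdeploy.exe"
    -verb:sync
    -source:package="$(Build.ArtifactStagingDirectory)\WebApp.zip"
    -dest:auto,computerName="https://myserver:8172/msdeploy.axd?site=MySite",userName="deployUser",password="$(deployPassword)",authType="Basic"
    -useChecksum
  displayName: 'Deploy only changed files'
```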
Hope this helps.

Azure functions - deploy by project in single repo?

We are building a set of serverless functions in Azure, but having difficulty deciding how to structure our source (Azure GIT) and DevOps to support them.
I am thinking of a single Git repo, with all function apps housed independently within projects. We may have a lot of these function apps; we see great value in small code segments that do utility-type work, and I don't want dozens and dozens of independent repos just because of DevOps deployments.
Is there a way to have a unique build and release process for each project, rather than for the repo as a whole? We aren't clear how this can be done, and searches have come up empty. I thought it was possible to have unique build YAMLs per project across many projects in a single repo, but it's unclear how to implement the DevOps build and release pipelines to support this approach, i.e. only a single function gets updated and we need to deploy it. Any guidance on whether this is possible and how to approach it would be great.
I haven't done this myself, but I'm in a similar situation where I'd like to have multiple functions (and other stuff) in a single Git repo for simplicity, but only build/deploy them as needed when they change. It looks like you can have multiple pipelines on a single repo, each with a different YAML file. The steps are documented in this link and summarized below:
In Azure DevOps, create a new Pipeline.
For the "Where is your code?" page, at the bottom choose the Use the classic editor option.
Select your source repo and branch.
On the "Select a template" screen, choose the YAML option at the top. Hit Apply.
There is a YAML file path field where you can specify the path and name of your YAML file for the pipeline.
You may want to set the pipeline to run manually if you don't want a build each time there's a commit to the repo.
EDIT: There may be an easier way to do this now. If you go through the New Pipeline wizard and select your source location, the Configure tab offers an Existing Azure Pipelines YAML file option at the bottom. This lets you select a custom YAML file directly.
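For reference, one of those per-pipeline YAML files could look roughly like this (function names, paths and the service connection are placeholders); the paths filter in the trigger means the pipeline only runs when its own function app changes, which is what gives you the "only deploy what changed" behaviour:

```yaml
# e.g. pipelines/orders-function.yml, one of several YAML files kept side by side in the repo
trigger:
  branches:
    include:
    - main
  paths:
    include:
    - src/OrdersFunctionApp    # only changes under this folder trigger this pipeline

pool:
  vmImage: 'windows-latest'

steps:
- task: DotNetCoreCLI@2
  displayName: 'Publish the Orders function app'
  inputs:
    command: 'publish'
    publishWebProjects: false
    projects: 'src/OrdersFunctionApp/OrdersFunctionApp.csproj'   # placeholder project path
    arguments: '--configuration Release --output $(Build.ArtifactStagingDirectory)'

- task: AzureFunctionApp@1
  displayName: 'Deploy to the function app'
  inputs:
    azureSubscription: 'my-azure-connection'   # placeholder service connection
    appType: 'functionApp'
    appName: 'orders-func'                     # placeholder function app name
    package: '$(Build.ArtifactStagingDirectory)/**/*.zip'
```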

Is it possible to create multiple build pipelines of similar type in Azure DevOps?

Suppose we have 100 static websites of a similar type. They will have similar build pipeline tasks. So instead of creating the build and release pipelines one by one in the visual designer, is there a way to automate this so that they get created automatically?
You can do that via the REST API. Also, if all the pipelines are in different repos, you can put an azure-pipelines.yml in the root of each repo and it will be picked up automatically.
To rename a pipeline, go to Builds > Edit > the menu at the top right; on the next screen you can rename it.
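To keep those per-repo azure-pipelines.yml files from drifting apart, one option (not mentioned above, but a standard Azure Pipelines feature) is to keep the shared steps in a YAML template in a central repo and have each site's file just point at it. All names below are placeholders:

```yaml
# azure-pipelines.yml at the root of each static-site repo
resources:
  repositories:
  - repository: templates
    type: git
    name: MyProject/pipeline-templates          # hypothetical central repo holding the shared YAML

trigger:
- main

steps:
- template: static-site-build.yml@templates     # hypothetical shared template with the common tasks
  parameters:
    siteName: 'site-001'                        # the only per-site difference
```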

Build Definition: Copying the build output to the server?

When configuring the build definition, I have the option of copying the build output to the server.
We are using the VSO Hosted Build Controller and Azure's Continuous Integration build template to release to our development environment after every check-in.
Is there any reason why we need to have this value set? How could it ever be useful?
The Copy build output to the server option attaches the output of the build to the build itself as a zip file that can be downloaded later.
In a situation where you don't care about the build output because you have it setup to continuously deploy to Azure (or some other build based deployment) you would not use this option.
If you however needed to download the output, for example a Windows Store App that you need to publish to the Store manually, then you could use this to get the application.
In VSO you most likely don't have a Drop Server (unless you have invested in Azure heavily) so you have 2 choices:
Put them in source control. That only works in TFVC, not Git, and it fills up your repo with large files.
Attach them to the build as a Zip.
The second scenario is exactly where you would use this.
