I am trying to deltafy the deployment of Azure Data Factory pipelines. In other words, I am planning to deploy only the recently modified or recently added pipelines to ADF in the Azure portal. Currently, I use a PowerShell task in my CD pipeline to deploy all the pipelines. How do I get the recently modified pipelines using the timestamp? Any suggestions will help. :)
If you want to build all the files but push/publish only the modified files, that's possible.
You can use a PowerShell task to get the last commit ID (SHA-1 value) in $(Build.SourcesDirectory), then use git show <commit id> to find which files were changed in that commit, copy those files to $(Build.ArtifactStagingDirectory), and finally publish them to the server.
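A minimal sketch of that approach, run as an inline PowerShell task. The pipelines/ folder filter and the file layout are assumptions; adjust them to your repository structure:

```powershell
# List files changed in the last commit and copy only those to the
# artifact staging directory (assumes the task runs in $(Build.SourcesDirectory)).
$commitId = git rev-parse HEAD
$changed = git show --pretty="" --name-only $commitId |
    Where-Object { $_ -like 'pipelines/*' }   # hypothetical folder for ADF pipeline JSON

foreach ($file in $changed) {
    if (Test-Path $file) {   # skip files deleted in this commit
        $target = Join-Path $env:BUILD_ARTIFACTSTAGINGDIRECTORY $file
        New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
        Copy-Item $file $target
    }
}
```

A Publish Artifacts task can then pick up $(Build.ArtifactStagingDirectory), so only the changed pipeline files reach the release.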
You can get changed files by using the REST API.
Simple PowerShell script:
Param(
    [string]$collection,
    [string]$project,
    [string]$repository,
    [string]$commit,
    [string]$token
)
$u = "$($collection)$($project)/_apis/git/repositories/$repository/commits/$commit/changes"
Write-Output $u
$changedFiles = New-Object System.Collections.ArrayList
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f "",$token)))
$result = Invoke-RestMethod -Method Get -Uri $u -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -ContentType "application/json"
Foreach($c in $result.changes)
{
    If($c.item.isFolder -ne $true)
    {
        Write-Output $c.item.path
        $changedFiles.Add($c.item.path) | Out-Null
    }
}
Arguments (check the "Allow scripts to access the OAuth token" option in the build/release definition):
-collection $(System.TeamFoundationCollectionUri) -project $(System.TeamProject) -repository $(Build.Repository.Name) -commit $(Build.SourceVersion) -token $(System.AccessToken)
Related
I have a single file that I need to deploy to an existing App Service (that already contains an application, deployed previously through another pipeline).
I would like to avoid using the FTP task for that.
Is there a way to deploy a single file to a specific folder in an App Service via a DevOps Pipeline?
Actually, FTP seems to be the easiest way to deploy a single file. However, if you don't want to use it, you can use the Kudu API. What you need is to wrap it in a PowerShell script:
$username = '$(username)'
$password = '$(password)'
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username,$password)))
$apiUrl = "https://$(siteName).scm.azurewebsites.net/api/vfs/site/your-file"
$filePath = "$(System.DefaultWorkingDirectory)/your-file"
$headers = New-Object "System.Collections.Generic.Dictionary[[String],[String]]"
$headers.Add("Authorization", ("Basic {0}" -f $base64AuthInfo))
$headers.Add("If-Match", "*")
Invoke-RestMethod -Uri $apiUrl -Headers $headers -Method PUT -InFile $filePath -ContentType "multipart/form-data"
Here you can find info on how to get the credentials.
But if this is a one-time task, you may use drag & drop from the Kudu panel, as mentioned here.
I have an Azure Pipeline which I invoke using the Azure REST API:
https://dev.azure.com/xxx/xxx_connection/_apis/pipelines/2/runs
All is working great: I'm sending the source files from a self-hosted Linux agent to Azure and getting them back, compiled, onto the self-hosted Linux agent.
My question is: how can I know, on the server hosting the self-hosted Linux agent, when the pipeline is done?
Is there any callback, or an API I can invoke to query whether the pipeline job is done?
Or should I just run a while loop on the directory where I expect to receive the compiled products?
That last option I want to avoid.
You can also check out the below ways to know if the pipeline is done.
1. Use the Runs - Get or Latest - Get API to get the build result, as mentioned by Shayki Abramczyk.
2. Set up a service hook; for example, a web hook that is triggered on completion of the selected build pipeline.
3. Check out the CatLight extension. The CatLight app will notify you on build start and completion.
4. Create a custom notification on the build-completed event and subscribe to it. You will be notified by email when the pipeline completes. Check the tutorial here for more information.
You can use the Runs - Get API:
GET https://dev.azure.com/{organization}/{project}/_apis/pipelines/{pipelineId}/runs/{runId}?api-version=6.1-preview.1
And in the response check the result (should be succeeded).
You can get the runId when you invoke the pipeline with the API you mentioned (in the response).
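As a hedged sketch of that flow (assuming a personal access token in $pat, and with {org}, {project}, and the pipeline id as placeholders), you can capture the run id from the queueing response and then query its state:

```powershell
# Queue a run, capture its id, then check its state/result.
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes((":{0}" -f $pat)))
$headers = @{ Authorization = ("Basic {0}" -f $base64AuthInfo) }

$run = Invoke-RestMethod -Method Post `
    -Uri "https://dev.azure.com/{org}/{project}/_apis/pipelines/2/runs?api-version=6.1-preview.1" `
    -Headers $headers -ContentType "application/json" -Body '{}'
$runId = $run.id   # the run id comes back in the queueing response

$status = Invoke-RestMethod -Method Get `
    -Uri "https://dev.azure.com/{org}/{project}/_apis/pipelines/2/runs/$runId?api-version=6.1-preview.1" `
    -Headers $headers
Write-Output "$($status.state) / $($status.result)"
```

While the run is in progress, `state` is reported as inProgress and `result` is absent; once it completes, `result` should be succeeded.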
You can poll the pipeline every 30 seconds to find out the status of your build; the script below exits once the build is no longer queued or in progress.
$user = "$(USER)"
$token = "$(TOKEN)"
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $user,$token)))
$buildInfo = Invoke-RestMethod -Method Get -Uri "https://tfs.com/tfs/Organization/Project/_apis/build/builds/${buildID}?api-version=6.1-preview.6" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}
Write-Host "Build URL : " "https://tfs.com/tfs/Organization/Project/_build/results?buildId=$buildID"
while($buildInfo.status -eq "inProgress" -or $buildInfo.status -eq "notStarted") # keep checking till the build completes
{
    Write-Output "Build is $($buildInfo.status). Sleeping for 30 seconds."
    Start-Sleep -Seconds 30
    $buildInfo = Invoke-RestMethod -Method Get -Uri "https://tfs.com/tfs/Organization/Project/_apis/build/builds/${buildID}?api-version=6.1-preview.6" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}
}
Write-Output "Build Status : $($buildInfo.status)" # print build status
Write-Output "Build Result : $($buildInfo.result)" # print build result
I would like to download the test plan artifacts (files) from an Azure DevOps repository via the command line or a PowerShell script. I used this (Download an application from Azure Devops via command line) as a reference, but I cannot use Git / the repository.
Is there any way I can download the test artifacts (files) for each test case / test suite via the command line or PowerShell?
I tested this REST API. The download parameter didn't work as expected.
So there is another workaround: you can specify the output file path in the Invoke-RestMethod call.
$result = Invoke-RestMethod -Uri $uri -OutFile D:\1212\test1.txt -Method Get -ContentType "application/json" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)}
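Putting that workaround together, a sketch of downloading a single file from an Azure DevOps Git repo with the Items - Get REST API might look like this ({org}, {project}, {repo}, and the file path are placeholders; the PAT variable is an assumption):

```powershell
# Download one file from a repo and write it straight to disk with -OutFile.
$pat = "<your-PAT>"
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes((":{0}" -f $pat)))
$uri = "https://dev.azure.com/{org}/{project}/_apis/git/repositories/{repo}/items" +
       "?path=/TestArtifacts/test1.txt&api-version=6.0"
Invoke-RestMethod -Uri $uri -Method Get `
    -Headers @{ Authorization = ("Basic {0}" -f $base64AuthInfo) } `
    -OutFile "D:\1212\test1.txt"
```

Loop this over the paths of your test case / test suite files to pull them all down.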
In my Azure DevOps pipeline I would like to copy a folder, e.g. Media, from one environment/app service, say TEST, to another environment/app service, say LIVE. The Media folder in TEST may get updated AFTER the CI/CD build has been deployed to the TEST environment; I mention this just to exclude answers that might suggest putting it in Git and including it as a build artifact.
EDIT - Clarification on using the accepted answer.
My repo contains the PowerShell script from the accepted answer at:
azure/Copy-Media-Test-To-Live.ps1
I then add the azure folder as an artifact in the build pipeline i.e.
Edit azure-pipelines.yml and Add:
- task: PublishPipelineArtifact@1
  inputs:
    path: $(System.DefaultWorkingDirectory)/azure/
    artifact: azure
In the release pipeline - reference the script to perform the copy:
steps:
- task: AzurePowerShell@4
  displayName: 'Azure PowerShell script: FilePath'
  inputs:
    azureSubscription: 'Your subscription'
    ScriptPath: '$(System.DefaultWorkingDirectory)/_your-artifact-path/azure/Copy-Media-Test-To-Live.ps1'
    azurePowerShellVersion: LatestVersion
Any application running in an App Service Environment can be managed via Kudu. Kudu has an API for downloading a ZIP compressed archive of any folder currently deployed to an application. It can be accessed with a GET request to:
https://{{YOUR-APP-NAME}}.scm.azurewebsites.net/api/zip/site/{{FOLDER}}
You can use the Invoke-WebRequest cmdlet in PowerShell to pull this content to local storage.
You do need to authenticate to use the Kudu API, which is easy in a browser, but when automating is a little more involved. Please see the following article which details how to retrieve and present a Basic Authorization header, as well as demonstrating how to use the command API to extract a ZIP file using the Invoke-RestMethod cmdlet. Your service principal will need at least contributor access to your applications to get deployment credentials to use in the API calls.
https://blogs.msdn.microsoft.com/waws/2018/06/26/powershell-script-to-execute-commands-in-scm-website-on-all-instances/
EDIT (Include worked example script):
If you have multiple subscriptions and the context has not been set properly in the deployment runtime environment, you may need to use Set-AzContext -Subscription "<SubscriptionName>" to set the context before getting the web apps.
$srcResGroupName = "Test"
$srcWebAppName = "tstest12"
$srcDirectory = "/site/wwwroot/myFolder/"
$dstResGroupName = "Test"
$dstWebAppName = "tstest123"
$dstDirectory = "/site/wwwroot/myFolder/"
# Get publishing profile for SOURCE application
$srcWebApp = Get-AzWebApp -Name $srcWebAppName -ResourceGroupName $srcResGroupName
[xml]$publishingProfile = Get-AzWebAppPublishingProfile -WebApp $srcWebApp
# Create Base64 authorization header
$username = $publishingProfile.publishData.publishProfile[0].userName
$password = $publishingProfile.publishData.publishProfile[0].userPWD
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username,$password)))
$apiBaseUrl = "https://$($srcWebApp.Name).scm.azurewebsites.net/api"
# Download the ZIP file to ./tmp.zip
Invoke-RestMethod -Uri "$apiBaseUrl/zip$($srcDirectory)" `
    -Headers @{UserAgent="powershell/1.0"; `
               Authorization=("Basic {0}" -f $base64AuthInfo)} `
    -Method GET `
    -OutFile ./tmp.zip
# Get publishing profile for DESTINATION application
$dstWebApp = Get-AzWebApp -Name $dstWebAppName -ResourceGroupName $dstResGroupName
[xml]$publishingProfile = Get-AzWebAppPublishingProfile -WebApp $dstWebApp
# Create Base64 authorization header
$username = $publishingProfile.publishData.publishProfile[0].userName
$password = $publishingProfile.publishData.publishProfile[0].userPWD
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $username,$password)))
$apiBaseUrl = "https://$($dstWebApp.Name).scm.azurewebsites.net/api"
# Upload and extract the ZIP file
Invoke-RestMethod -Uri "$apiBaseUrl/zip$($dstDirectory)" `
    -Headers @{UserAgent="powershell/1.0"; `
               Authorization=("Basic {0}" -f $base64AuthInfo)} `
    -Method PUT `
    -InFile ./tmp.zip `
    -ContentType "multipart/form-data"
I'd ideally like to declare in my applications code or .deployment file for particular Site Extensions to be installed or updated when my code is pushed to the scm Kudu site. Is this possible?
Specifically I'd like the Microsoft.ApplicationInsights.AzureWebSites site extension to be automatically installed as part of all the sites I deploy without having to manually browse to the scm site and install it from the gallery.
You cannot install site extensions as part of Kudu git deployment, but you can do it as part of an ARM template. You can find a complete sample here.
This is basically the same as any other site configuration, e.g. setting app settings, the App Service tier, turning on logging, WebSockets, and so on. All those things sit outside of Kudu deployments, but can be achieved using an ARM template that declares all your desired site state.
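As a sketch, the site extension can be declared as a nested siteextensions resource inside the web app's ARM template (the apiVersion and parameter name here are illustrative; check the linked sample for the full context):

```json
{
  "apiVersion": "2018-02-01",
  "name": "Microsoft.ApplicationInsights.AzureWebSites",
  "type": "siteextensions",
  "dependsOn": [
    "[resourceId('Microsoft.Web/sites', parameters('siteName'))]"
  ]
}
```

Because the extension is part of the template, every deployment of the site converges on having it installed, with no manual trip to the Kudu gallery.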
Another option would be to make a WebJob that creates the SiteExtensions folder and then copies your site extension files into that folder. As part of your deployment you would just include the WebJob.
We do something similar with how Stackify APM is installed from a site extension into Azure Web Apps: the site extension creates a WebJob, and the WebJob actually updates the site extension's applicationHost transform based on some conditional items.
It can be done using PowerShell, but it's a bit hacky (example using a staging slot named staging):
Write-Host "Setting appsettings for Stackify on $AzureWebSite"
$hash = @{}
$hash['Stackify.ApiKey'] = "$licenceKey"
$hash['Stackify.Environment'] = "$environment"
$hash['Stackify.AppName'] = "$BaseWebAppName"
if ($loadCertificates -eq 'True')
{
$hash['WEBSITE_LOAD_CERTIFICATES'] = "*"
}
Set-AzureWebsite -Name $AzureWebSite -Slot staging -AppSettings $hash
### Install extension for Azure app ###
Write-Host "Installing Stackify on $AzureWebSite"
$Kudu = "https://" + $AzureWebSite + "-staging.scm.azurewebsites.net/api/extensionfeed" # Here you can get a list of all available extensions
$InstallNRURI = "https://" + $AzureWebSite + "-staging.scm.azurewebsites.net/api/siteextensions" # Install API endpoint
$slot = Get-AzureWebsite $AzureWebSite -Slot staging
$base64AuthInfo = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(("{0}:{1}" -f $slot.PublishingUsername,$slot.PublishingPassword)))
$invoke = Invoke-RestMethod -Uri $Kudu -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -Method Get
$id = ($invoke | ? {$_.id -match "stackify*"}).id ### Search for the Stackify extension ID
try {
    $InstallStackify = Invoke-RestMethod -Uri "$InstallNRURI/$id" -Headers @{Authorization=("Basic {0}" -f $base64AuthInfo)} -Method Put
    $Status = ($InstallStackify.provisioningState).ToString() + "|" + ($InstallStackify.installed_date_time).ToString() ### Status
    Write-Output "Stackify Installation Status : $Status"
}
catch { $_ }
Restart-AzureWebsite $AzureWebSite -Slot staging