I have a Vue SPA that I host in Azure. However, I am not able to get it running as a build pipeline after setting it up in Azure DevOps.
The npm install and npm run build steps work perfectly; however, the script that copies my dist directory to my blob store fails.
Here is what I tried and the results. Does anyone have experience with this?
AzureFileCopy
- task: AzureFileCopy@3
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/dist'
    azureSubscription: '[my subscription details]'
    Destination: 'AzureBlob'
    storage: 'mystorageaccountname'
    ContainerName: '$web'
Result
##[section]Starting: AzureFileCopy
==============================================================================
Task : Azure file copy
Description : Copy files to Azure Blob Storage or virtual machines
Version : 3.1.11
Author : Microsoft Corporation
Help : https://learn.microsoft.com/azure/devops/pipelines/tasks/deploy/azure-file-copy
==============================================================================
##[command]Import-Module -Name C:\Program Files\WindowsPowerShell\Modules\AzureRM\2.1.0\AzureRM.psd1 -Global
##[warning]The names of some imported commands from the module 'AzureRM.Websites' include unapproved verbs that might make them less discoverable. To find the commands with unapproved verbs, run the Import-Module command again with the Verbose parameter. For a list of approved verbs, type Get-Verb.
##[warning]The names of some imported commands from the module 'AzureRM' include unapproved verbs that might make them less discoverable. To find the commands with unapproved verbs, run the Import-Module command again with the Verbose parameter. For a list of approved verbs, type Get-Verb.
##[command]Import-Module -Name C:\Program Files\WindowsPowerShell\Modules\AzureRM.Profile\2.1.0\AzureRM.Profile.psm1 -Global
##[command]Add-AzureRMAccount -ServicePrincipal -Tenant *** -Credential System.Management.Automation.PSCredential -EnvironmentName AzureCloud
##[command] Set-AzureRmContext -SubscriptionId dc6a0ce7-adcd-49fd-ad85-e1c082994145 -TenantId ***
Uploading files from source path: 'D:\a\1\s\dist' to storage account: 'mystorageaccountname' in container: '$web' with blob prefix: ''
##[command] & "AzCopy\AzCopy.exe" /Source:"D:\a\1\s\dist" /Dest:"https://mystorageaccountname.blob.core.windows.net/`$web" /#:"D:\a\_temp\ead7e7cf-0b6e-4b16-928f-c84cf3e3a7ab" /XO /Y /SetContentType /Z:"AzCopy" /V:"AzCopy\AzCopyVerbose_ae491d97-a7a8-44e6-b7b0-4b932a5e6c08.log" /S
[2019/06/13 00:31:08][ERROR] Error parsing source location "D:\a\1\s\dist": Failed to enumerate directory D:\a\1\s\dist\ with file pattern *. The system cannot find the path specified. (Exception from HRESULT: 0x80070003) For more details, please type "AzCopy /?:Source" or use verbose option /V.
##[error]Upload to container: '$web' in storage account: 'mystorageaccountname' with blob prefix: '' failed with error: 'AzCopy.exe exited with non-zero exit code while uploading files to blob storage.' For more info please refer to https://aka.ms/azurefilecopyreadme
##[section]Finishing: AzureFileCopy
Thanks to the comment suggesting I run a "dir": the build actually didn't work. Upon inspection, the following step was not even building the app.
- script: |
    npm install
    npm run build
    dir
  displayName: 'npm install and build'
I presume this is because I changed the agent to run on Windows (I discovered earlier that AzureFileCopy only runs on Windows), and the Windows agent does not handle stacked scripts the way the Ubuntu agent does. So I split the install and build into separate tasks, and now it runs with only verb warnings. Here is the working script:
pool:
  vmImage: 'vs2017-win2016'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '10.x'
  displayName: 'Install Node.js'

- script: |
    npm install
  displayName: 'npm install'

- script: |
    npm run build
  displayName: 'npm run build'

- script: |
    dir
  displayName: 'list cwd contents (verify build)'

- task: AzureFileCopy@3
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/dist'
    azureSubscription: '[my subscription details]'
    Destination: 'AzureBlob'
    storage: 'mystorageaccountname'
    ContainerName: '$web'
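Since AzureFileCopy@3 forces a Windows agent, one alternative worth sketching is uploading the dist folder with the Azure CLI task instead, which also runs on Linux agents. This is a hedged sketch, reusing the question's placeholder service connection and storage account names:

```yaml
# Sketch of an alternative upload step using the Azure CLI task, so the
# pipeline could stay on a Linux agent. '[my subscription details]' and
# 'mystorageaccountname' are the question's placeholders.
- task: AzureCLI@2
  displayName: 'Upload dist to the $web container'
  inputs:
    azureSubscription: '[my subscription details]'
    scriptType: 'bash'
    scriptLocation: 'inlineScript'
    inlineScript: |
      az storage blob upload-batch \
        --account-name mystorageaccountname \
        --destination '$web' \
        --source '$(System.DefaultWorkingDirectory)/dist' \
        --auth-mode login
```

Note that `--auth-mode login` assumes the service connection's principal has a data-plane role (e.g. Storage Blob Data Contributor) on the account.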
I have an Angular project that I want to build and deploy to an Azure Web App.
I created this YAML to build and deploy:
trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: NodeTool@0
  inputs:
    versionSpec: '18.x'
  displayName: 'Install Node.js'

# - task: Npm@1
#   inputs:
#     command: 'install'
#     workingDir: 'schooly-angular'

- script: npm install -g @angular/cli
  displayName: 'npm install -g @angular/cli'

- script: yarn
  displayName: 'yarn install'

- script: ng build --prod
  displayName: 'ng build'

- task: PublishBuildArtifacts@1

- task: AzureRmWebAppDeployment@4
  inputs:
    ConnectionType: 'AzureRM'
    azureSubscription: 'Azure subscription 1 (*********)'
    appType: 'webAppLinux'
    WebAppName: 'marifit-admin'
    packageForLinux: 'dist/'
But I get this error when I try to build. Clicking Authorize doesn't help.
How can I solve this?
It seems you either do not have permission on the subscription Azure subscription 1 (*********), or this subscription does not exist. When editing the YAML pipeline, click Settings to show the assistant, choose your subscription, then click Authorize to create a service connection for this subscription.
If the authorization fails, it means you do not have enough permission on that subscription. You need the Contributor or Owner role on the subscription; to get it, ask the subscription owner for help. See Assign a user as an administrator of an Azure subscription for reference.
I have tried reproducing this in my environment and got the results below, so I am sharing them here as further reference:
- task: AzureRmWebAppDeployment@4
  inputs:
    ConnectionType: 'AzureRM'
    azureSubscription: 'Azure subscription 1 (*********)'
    appType: 'webAppLinux'
    WebAppName: 'marifit-admin'
    packageForLinux: 'dist/'
The error suggests that the service connection used here does not exist. To create a proper service connection, follow the steps below.
Step 1: Create a service principal for the subscription where the application resources are hosted, using the below command in Cloud Shell:
az ad sp create-for-rbac --name <service-principal-name> --scopes /subscriptions/<subscription-id>
Make note of the password, appId, and tenantId.
Step 2: Create a service connection using the service principal from Step 1.
Go to Project settings > Service connections > click Create new service connection.
After filling in all the details, click Verify to confirm it is a valid connection.
Make sure the azureSubscription value in the task above matches the service connection name.
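To make the last point concrete, here is a hypothetical sketch: if the service connection created above were named `my-arm-connection`, the task would reference that exact name (not the subscription's display name):

```yaml
# Hypothetical sketch: 'my-arm-connection' stands in for whatever name you
# gave the service connection; the other values are from the question.
- task: AzureRmWebAppDeployment@4
  inputs:
    ConnectionType: 'AzureRM'
    azureSubscription: 'my-arm-connection'  # must equal the service connection name
    appType: 'webAppLinux'
    WebAppName: 'marifit-admin'
    packageForLinux: 'dist/'
```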
I am using a Let's Encrypt certificate and Azure Key Vault to automate the renewal process, using this repo: https://github.com/brent-robinson/posh-acme-azure-example
I have installed the Az.KeyVault module via YAML in an Azure DevOps pipeline:
# Install the Az PowerShell modules our script will need
- task: PowerShell@2
  displayName: Install PowerShell Modules (Az.Accounts, Az.KeyVault, Az.Resources, Posh-ACME)
  inputs:
    targetType: 'inline'
    script: 'Install-Module Az.Accounts, Az.KeyVault, Az.Resources, Posh-ACME -Force'
    errorActionPreference: 'stop'
    failOnStderr: true
    pwsh: true
But when I run the script, I get the below error:
The 'Get-AzKeyVaultCertificate' command was found in the
| module 'Az.KeyVault', but the module could not be loaded. For
| more information, run 'Import-Module Az.KeyVault'.
When I try to add the import module (Import-Module -Name Az.KeyVault -Force), it gives the below error:
Assembly with the same name is already loaded
There is an issue with the latest version of the Posh-ACME module (4.15.0).
Install version 4.14.0 to resolve this:
Install-Module -Name Posh-ACME -RequiredVersion 4.14.0 -Force
Ref: https://github.com/brent-robinson/posh-acme-azure-example/issues/15#issuecomment-1241850416
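Folding the pin into the pipeline itself might look like the following sketch, which reuses the question's task shape and also installs per-user (combining this answer's pin with the per-user scope suggested in the next answer):

```yaml
# Sketch: pin Posh-ACME to 4.14.0 per the linked issue, and install into the
# current user's scope to sidestep conflicts with preinstalled modules.
# Module names are taken from the question.
- task: PowerShell@2
  displayName: Install pinned PowerShell modules
  inputs:
    targetType: 'inline'
    script: |
      Install-Module Az.Accounts, Az.KeyVault, Az.Resources -Force -Scope CurrentUser
      Install-Module Posh-ACME -RequiredVersion 4.14.0 -Force -Scope CurrentUser
    errorActionPreference: 'stop'
    failOnStderr: true
    pwsh: true
```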
When you install the module in a CI/CD pipeline, you can specify the scope so that it won't conflict with other installed modules.
# Install Az PowerShell modules for a specific user
- task: PowerShell@2
  displayName: Install PowerShell Modules (Az.Accounts, Az.KeyVault, Az.Resources, Posh-ACME)
  inputs:
    targetType: 'inline'
    script: 'Install-Module Az.Accounts, Az.KeyVault, Az.Resources, Posh-ACME -Force -Scope CurrentUser'
The 'Get-AzKeyVaultCertificate' command was found in the
| module 'Az.KeyVault', but the module could not be loaded. For
| more information, run 'Import-Module Az.KeyVault'.
Assembly with the same name is already loaded
The error messages show that Az.KeyVault was already installed but could not be loaded in the pipeline task. You can specify the scope (CurrentUser or AllUsers) to install the required module for a specific user, and then import the required module.
My pipeline build is successful, but I need to publish the results to an .xml file. This works locally but not in DevOps, where it says the TestPester file was not found.
When I use these commands locally, all tests pass and the TestPester file is created automatically.
Command
Invoke-Pester -Script Get-Planet.Tests.ps1 -OutputFile Test-Pester.XML -OutputFormat NUnitXML
The pipeline code:
trigger:
- main

pool:
  vmImage: windows-latest

steps:
- task: PowerShell@2
  inputs:
    filePath: 'Get-Planet.Tests.ps1'
    workingDirectory: '$(System.DefaultWorkingDirectory)'
    script: |
      Install-Module -Name Pester -Force -SkipPublisherCheck
      Import-Module Pester
      Invoke-Pester -Script $(System.DefaultWorkingDirectory)\Get-Planet.Tests.ps1 -OutputFile $(System.DefaultWorkingDirectory)\TestPester.XML -OutputFormat NUnitXML
      Invoke-Pester -CodeCoverage '$(System.DefaultWorkingDirectory)\Get-Planet.Tests.ps1' -CodeCoverageOutputFile '$(System.DefaultWorkingDirectory)\Pester-Coverage.xml' -CodeCoverageOutputFileFormat JaCoCo

- task: PublishTestResults#2
  inputs:
    testResultsFormat: 'NUnit'
    testResultsFiles: '$(System.DefaultWorkingDirectory)\TestPester.XML'
    mergeTestResults: true

- task: PublishCodeCoverageResults@1
  inputs:
    codeCoverageTool: 'JaCoCo'
    summaryFileLocation: '**/Pester-Coverage.xml'
    pathToSources: '$(System.DefaultWorkingDirectory)'
LOG FILE
2021-04-07T09:15:46.0910514Z ##[section]Starting: PublishTestResults
2021-04-07T09:15:46.1265107Z ==============================================================================
2021-04-07T09:15:46.1265611Z Task : Publish Test Results
2021-04-07T09:15:46.1265893Z Description : Publish test results to Azure Pipelines
2021-04-07T09:15:46.1266118Z Version : 2.180.0
2021-04-07T09:15:46.1266332Z Author : Microsoft Corporation
2021-04-07T09:15:46.1266847Z Help : https://learn.microsoft.com/azure/devops/pipelines/tasks/test/publish-test-results
2021-04-07T09:15:46.1267241Z ==============================================================================
2021-04-07T09:15:47.1966443Z [command]"C:\Program Files\dotnet\dotnet.exe" --version
2021-04-07T09:15:48.7472296Z 5.0.201
2021-04-07T09:15:48.7517977Z ##[warning]No test result files matching D:\a\1\s\TestPester.XML were found.
2021-04-07T09:15:48.8193796Z ##[section]Finishing: PublishTestResults
PowerShell logs:
Generating script.
Formatted command: . 'D:\a\1\s\Get-Planet.Tests.ps1'
========================== Starting Command Output ===========================
"C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe" -NoLogo -NoProfile -NonInteractive -ExecutionPolicy Unrestricted -Command ". 'D:\a_temp\3dee7bed-1e4b-4a16-8e09-674f5ac01d78.ps1'"
Starting discovery in 1 files.
Discovery finished in 1.06s.
[+] D:\a\1\s\Get-Planet.Tests.ps1 5.22s (1.21s|3.05s)
Tests completed in 5.36s
Tests Passed: 6, Failed: 0, Skipped: 0 NotRun: 0
Finishing: PowerShell
Have you tried the following?
Add quotes around the path in your first Pester command's output argument; it sounds odd, but it's worth a try.
Add a Test-Path check using the path '$(System.DefaultWorkingDirectory)\TestPester.XML' before your publish task, to make sure the file is actually being created.
I find that most of the time the file has not been created, or the path can't be properly resolved.
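The Test-Path check could be added as its own step before the publish task; a sketch using the question's paths:

```yaml
# Sketch: fail fast if the Pester results file was never written, so the
# cause is obvious before PublishTestResults runs. The path is the one
# used in the question.
- task: PowerShell@2
  displayName: 'Verify Pester output file exists'
  inputs:
    targetType: 'inline'
    script: |
      if (-not (Test-Path '$(System.DefaultWorkingDirectory)\TestPester.XML')) {
        Write-Error 'TestPester.XML was not created'
      }
```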
I had pointed the task at Get-Planet.Tests.ps1, my test file, instead of a PowerShell script file. That is why I was getting the error.
I changed it to:
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      Install-Module -Name Pester -Force
      Invoke-Pester -CodeCoverage $(System.DefaultWorkingDirectory)\Get-Planet.psm1 `
        -CodeCoverageOutputFile $(System.DefaultWorkingDirectory)\Pester-Coverage.xml `
        -CodeCoverageOutputFileFormat JaCoCo `
        -OutputFile $(System.DefaultWorkingDirectory)\test-results.xml `
        -OutputFormat NUnitXml
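One further tweak worth considering (a sketch, not part of the original fix): run the publish step with a `succeededOrFailed()` condition, so that failing test runs still publish their results instead of skipping the task:

```yaml
# Sketch: publish results even when the test step fails, so failed runs
# still show test details. The file path matches the corrected task above.
- task: PublishTestResults@2
  condition: succeededOrFailed()
  inputs:
    testResultsFormat: 'NUnit'
    testResultsFiles: '$(System.DefaultWorkingDirectory)\test-results.xml'
    mergeTestResults: true
```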
I have this task in DevOps: I want to copy a text file from blob storage to a VM.
- task: AzureFileCopy@4
  inputs:
    sourcePath: 'https://storagename.blob.core.windows.net/container/file.txt'
    azureSubscription: 'subscription connection'
    storage: 'a_storage_in_subscription'
    resourceGroup: $(rgName_of_VM)
    destination: 'azureVMs'
    MachineNames: $(ipofVM)
    vmsAdminUserName: $(adminUsername)
    vmsAdminPassword: $(adminPassword)
    targetPath: 'c:\files'
But it fails with Upload to container: '8e107770-69d8-xxx' in storage account: 'a_storage_in_subscription' with blob prefix: '' failed with error: 'AzCopy.exe exited with non-zero exit code while uploading files to blob storage.' For more info please refer to https://aka.ms/azurefilecopyreadme
My understanding is that the task first copies the file into a container in the storage account named in the 'storage' field (a GUID is used to name the container). The task succeeded at that, but then the error happened. What am I doing wrong?
If you re-run this failed pipeline with system.debug set to true, you will see another message that explains in more detail why it failed: failed to perform copy command due to error: cannot start job due to error: cannot scan the path \\?\D:\a\1\s\https://storagename.blob.core.windows.net/container/file.txt.
Now you know why you got that error: the Azure File Copy task does not support an HTTPS URL in sourcePath. The value of sourcePath must be a local path on the agent.
Since an HTTPS URL is not supported here, as a workaround you can first download the file into the build working directory using an Azure CLI command, then upload it to the Azure VM:
- task: AzureCLI@1
  displayName: 'Azure CLI'
  inputs:
    azureSubscription: {subscription}
    scriptLocation: inlineScript
    inlineScript: |
      mkdir $(Build.SourcesDirectory)/File
      az storage blob download --container-name {container name} --file $(Build.SourcesDirectory)/File/{file name} --name {file name} --account-key $(accountkey) --account-name {storage account name}

- task: AzureFileCopy@4
  displayName: 'AzureVMs File Copy'
  inputs:
    SourcePath: '$(Build.SourcesDirectory)/File'
    azureSubscription: {subscription}
    Destination: AzureVMs
    storage: {storage}
    resourceGroup: '{resource group}'
    vmsAdminUserName: {login name}
    vmsAdminPassword: {login password}
    TargetPath: 'xxx'
Note: you can get the access key from the storage account's Access keys blade in the portal.
Now you can see that the file upload to the Azure VM succeeds.
As part of my YAML pipeline definition, I have an AzurePowerShell@4 task; the following is an extract from my pipeline definition.
stages:
- stage: DeployDemoCluster
  jobs:
  - job: 'DeployAKSAndAll'
    pool:
      vmImage: 'windows-latest'
    steps:
    - task: AzurePowerShell@4
      displayName: Store AI instrumentation key for Inbound Processor in central KeyVault
      inputs:
        azureSubscription: 'service-connection'
        azurePowerShellVersion: LatestVersion
        pwsh: true
        ScriptType: 'FilePath'
        ScriptPath: 'AKS/ps/update_kv_firewall.ps1'
The issue is that within my update_kv_firewall.ps1, all the PowerShell commands fail with errors like:
[error]Login-AzureRmAccount : The term 'Login-AzureRmAccount' is not recognized as the name of a cmdlet, function, script file, or operable program.
The script, when executed individually / standalone, works perfectly fine.
What am I missing here?
As per your comment, the command Get-AzKeyVault runs without any errors, while Get-AzureRmVirtualNetwork leads to errors.
That tells me you are using the new Az module of Azure PowerShell, which is why a command like Get-AzKeyVault works.
Since you are using the Az module, please use commands from the Az module exclusively. Almost every AzureRM command has an equivalent Az command; you can find it in the Az command list.
Note: commands like Get-AzureRmVirtualNetwork / Login-AzureRmAccount are from the AzureRM module, which is being retired later this year.
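Also note that the AzurePowerShell task signs in using the service connection before your script runs, so the script should not call Login-AzureRmAccount (or Connect-AzAccount) at all. A minimal sketch of the Az-module equivalent, with a placeholder resource group name:

```yaml
# Sketch: the task authenticates via the service connection itself, so the
# inline script needs no Login-* call. Get-AzVirtualNetwork is the Az
# equivalent of Get-AzureRmVirtualNetwork; 'my-rg' is a placeholder.
- task: AzurePowerShell@4
  inputs:
    azureSubscription: 'service-connection'
    azurePowerShellVersion: LatestVersion
    pwsh: true
    ScriptType: 'InlineScript'
    Inline: |
      Get-AzVirtualNetwork -ResourceGroupName 'my-rg'
```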