Why is Azure DevOps File Copy not copying my static website to the root?

I have a problem deploying my Angular app to an Azure Storage static website. I created a CI/CD pipeline last year and used it for months. There were no deployments in the past few months, and with today's deployment I ran into some problems.
The first problem was about authentication, but I was able to solve it.
The second is the one I'm struggling with. I run the release pipeline and the files are copied, but not to the root of the $web container: the task creates a 'prod' folder and copies the files there. Because of this I cannot open the website, as it looks for index.html in the $web root and obviously doesn't find it, since it is in $web/prod. I can open the file by going to the /prod URL, but then it tries to load all resources from the root (not the prod folder), and obviously the files aren't there.
I looked at some articles about deploying to a static website storage account, and all of them showed YAML similar to what I'm using.
Here's my build pipeline which publishes the artifacts after build:
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(System.DefaultWorkingDirectory)/dist/prod'
    ArtifactName: 'prod'
    publishLocation: 'Container'
  displayName: 'publish prod'
And here is the file copy task:
- task: AzureFileCopy@4
  displayName: 'AzureBlob File Copy'
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/_angularApp CI/prod'
    azureSubscription: '***'
    Destination: AzureBlob
    storage: angularApp
    ContainerName: '$web'
The files are there after the 'PublishBuildArtifacts' task and they do get copied, just not into the correct folder (the $web root). Does anybody have an idea?
Thanks

Usually, when Blob Prefix is not specified, the content of the specified folder is copied to the root of the container by default.
Tested on my side, it works well.
You can check whether your publish source directory itself contains a folder named prod, in which case the task copies that folder as a folder into the container.
In addition, I used version 3 of the task (AzureFileCopy@3) here; you can try the V3 Azure File Copy task.
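Alternatively, if you stay on AzureFileCopy@4 (which uses AzCopy v10 and copies the source folder itself), a common workaround is to point SourcePath at the folder's contents with a trailing wildcard. A minimal sketch, assuming the artifact layout from the question:

- task: AzureFileCopy@4
  displayName: 'AzureBlob File Copy'
  inputs:
    # The trailing /* copies the folder's contents rather than the folder,
    # so index.html lands at the root of $web
    SourcePath: '$(System.DefaultWorkingDirectory)/_angularApp CI/prod/*'
    azureSubscription: '***'
    Destination: AzureBlob
    storage: angularApp
    ContainerName: '$web'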

Related

Azure DevOps FTP Upload - Upload only changed files to Remote Server

The current setup is as below:
Version Control - Git
Repos and Branch hosted on - Azure DevOps
Codebase - External server
The dev team clones the Azure Repo into a local Git project, and any staged changes are committed and pushed to a specific branch on Azure DevOps. In this setup we want to upload the changes to external FTP servers and avoid manual uploads. We are currently using the Azure DevOps FTP Upload task (https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/ftp-upload?view=azure-devops) and can run the pipeline and publish the artifact successfully. However, the task uploads all the files and folders at the specified path, not just the staged changes. The YAML script is below:
trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

variables:
- name: StagingRepo
  value: $(System.DefaultWorkingDirectory)

steps:
- publish: $(StagingRepo)
  artifact: Staging Repo

- task: FtpUpload@2
  displayName: 'FTP Upload'
  inputs:
    credentialsOption: inputs
    serverUrl: 'ftps://00.00.00.00:22'
    username: ftp-username
    password: ftp-password
    rootDirectory: '$(StagingRepo)'
    remoteDirectory: '/home/public_html'
    clean: false
    cleanContents: false
    preservePaths: false
    trustSSL: true
PROBLEM
Is there any way we can upload only the committed changes to FTP instead of uploading the whole repo? From the docs, new build pipelines update only the changed files, but in this case everything is uploaded.
Thanks
As per the task doc, you are uploading the whole working directory (which stores your repo content) to the remote FTP server.
Also, make sure remoteDirectory is a directory such as /home/public_html, not a file.
If you'd like to upload only the committed changes to FTP, you first need to find the changed files; you can find a YAML script for that in the answer here.
The remaining problem is uploading those files. With FtpUpload@2 you can put the changed files into a folder and upload that folder's content to the remote server, though by default the changed files end up together in one folder, which differs from the layout of the Git repo content (a minimal sketch that works around this follows below).
If you'd like to sync the changes to the remote server while keeping the same file paths as in the repo, you need to install git-ftp. Please refer to the links here and here, and use a Git script instead of the task to upload the files.
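A minimal sketch of the FtpUpload approach above, assuming the checkout fetches enough history for HEAD~1 to exist; the diff range and the 'changed' staging folder are illustrative assumptions to adapt:

- script: |
    mkdir -p "$(Build.ArtifactStagingDirectory)/changed"
    # Copy each file touched by the last commit, keeping its repo-relative path
    git diff --name-only HEAD~1 HEAD | while read -r f; do
      d=`dirname "$f"`
      mkdir -p "$(Build.ArtifactStagingDirectory)/changed/$d"
      cp "$f" "$(Build.ArtifactStagingDirectory)/changed/$f"
    done
  displayName: 'Collect files changed by the last commit'

- task: FtpUpload@2
  displayName: 'FTP Upload changed files'
  inputs:
    credentialsOption: inputs
    serverUrl: 'ftps://00.00.00.00:22'
    username: ftp-username
    password: ftp-password
    rootDirectory: '$(Build.ArtifactStagingDirectory)/changed'
    remoteDirectory: '/home/public_html'
    # preservePaths keeps the folder structure below rootDirectory
    preservePaths: true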
Edit:
There is an extension, FTP Uploader, which supports "Ignore unchanged files".

Azure DevOps deployment pipeline failing with [error]Error: Cannot find any file based on /Users/runner/work/1/s/XYZ/**/ABC.ipa

The artifact is downloaded to '/Users/runner/work/1/', but the deployment task is looking for it at '/Users/runner/work/1/s/XYZ/**/ABC.ipa'.
In the build stage, artifacts are published with PathtoPublish: '$(build.artifactstagingdirectory)/${{parameters.env}}', and in the deployment they are accessed using '$(System.DefaultWorkingDirectory)/XYZ/**/ABC.ipa'.
Please help me access the .ipa file correctly.
Two predefined variables are used here, but they point to different folder structures (doc here):
Build.ArtifactStagingDirectory: the local path on the agent where any artifacts are copied to before being pushed to their destination. For example: c:\agent\_work\1\a; note there is no s folder.
System.DefaultWorkingDirectory: the local path on the agent where your source code files are downloaded. For example: c:\agent\_work\1\s.
It's recommended to add a PowerShell task to list all files in the directory and its subdirectories, so we can find where the files are actually stored on the agent. Code sample:
- task: PowerShell@2
  name: listfiles
  inputs:
    targetType: 'inline'
    script: 'Get-ChildItem -Path $(System.DefaultWorkingDirectory) -Recurse -File'
After we confirm where the file resides, we can modify the path for the task so the file can be found.
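For reference, in a YAML deployment job, pipeline artifacts are downloaded under $(Pipeline.Workspace)/<artifact name> by default rather than under the sources directory. A sketch of how to verify and reference the file from there; the artifact name 'drop' is an assumption to adjust once the listing confirms the real layout:

- download: current
  artifact: drop

- script: ls -R "$(Pipeline.Workspace)/drop"
  displayName: 'Confirm where ABC.ipa actually sits'

# The deployment task would then reference:
#   $(Pipeline.Workspace)/drop/**/ABC.ipa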

How to register an AutoIT COM DLL while running from Azure

I am trying to use AutoIT in my automated tests for the project. Locally I am able to register the COM library using regsvr32, but when I try to do the same from my Azure pipeline, the script runs indefinitely.
I have my azure pipeline yml as following:
- job: Tests
  displayName: Automated Tests
  pool:
    vmImage: "windows-latest"
  steps:
  - task: NuGetToolInstaller@1

  - task: DotNetCoreCLI@2
    displayName: Restore Packages
    inputs:
      command: 'restore'
      projects: 'Free.Automation/Free.Automation.csproj'
      feedsToUse: 'config'
      nugetConfigPath: 'Free.Automation/nuget.config'

  - task: BatchScript@1
    displayName: Register AutoIT
    inputs:
      filename: 'Free.Automation/autoit.bat'

  - task: MSBuild@1
    inputs:
      solution: "Free.Automation/Free.Automation.sln"
And this is the bat file I am using:
cd c:\windows\system32
regsvr32 C:\Users\%USERNAME%\.nuget\packages\autoitx.dotnet\3.3.14.5\build\AutoItX3.dll
I verified that the path of the Azure pipeline workspace is something like D:\1\a\s, but I'm not sure how the directory layout works.
Could anyone help me register the COM library on an Azure-hosted pipeline agent?
With the Azure DevOps Microsoft-hosted agents, you can't access your local files directly. So if you want to use COM DLLs, you need to include them in your source code files.
I recommend keeping a lib folder to store your DLLs in. Please make sure the DLLs are referenced with a correct relative path in the .csproj.
Regarding "the path of azure pipeline space is something D:\1\a\s": in Azure DevOps you can use the predefined variable $(System.DefaultWorkingDirectory) to get the local path on the agent where your source code files are downloaded. That's the "Azure pipeline space D:\1\a\s" you mentioned.
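A minimal sketch of that setup, assuming the DLL is committed under a lib folder at the repo root. Note that regsvr32 shows a modal result dialog by default; on a headless hosted agent nobody can dismiss it, which would make the step appear to run forever, so the /s (silent) switch matters here:

- task: CmdLine@2
  displayName: Register AutoIT from the repo
  inputs:
    # /s = silent registration, no modal dialog to hang the hosted agent
    script: regsvr32 /s "$(System.DefaultWorkingDirectory)\lib\AutoItX3.dll"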

What should I specify for the rootDirectory of an Azure Pipeline for a static website?

I manage a static website: no database and no server-side processing like ASP.NET (IIS), PHP, etc. The site is just HTML, CSS, some JavaScript, and a few graphics files. I'm trying to use Azure Pipelines for this, for the first time, and I chose the HTML template to start the YAML pipeline.
I'm using the FTP Upload task. The source code is in Azure DevOps Repos. I'm testing the pipeline by trying to FTP the files to a test folder on the hosting server (not part of Azure). In testing the pipeline I get this error:
##[error]Error: Failed find: ENOENT: no such file or directory, stat '/bin/pyvenv'
I don't know what I should put as the rootDirectory. I thought it appropriate to put '/'. Here's the YAML:
# HTML
# Archive your static HTML project and save it with the build record.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

trigger:
- none

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(build.sourcesDirectory)'
    includeRootFolder: false

- task: FtpUpload@2
  inputs:
    credentialsOption: 'inputs'
    serverUrl: 'ftp://ftp.destination.com'
    username: '$(TheUsername)'
    password: '$(ThePassword)'
    rootDirectory: '/'
    filePatterns: '**'
    remoteDirectory: '/test'
    clean: false
    cleanContents: false
    preservePaths: false
    trustSSL: false
What should I put for the rootDirectory?
Have you tried
rootDirectory: '.'
or
rootDirectory: '$(Pipeline.Workspace)'?
The ArchiveFiles task's archiveFile field defaults to $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip.
According to your YAML build definition, you have archived the folder $(build.sourcesDirectory) into $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip, and the next task is FtpUpload. The rootDirectory field should then be $(Build.ArtifactStagingDirectory), and the task will upload the zip file to the remote machine.
If you want to upload the static website itself rather than an archive, you could instead set rootDirectory to $(Build.SourcesDirectory). Check the build variables doc for more details.
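A minimal sketch of that second option, assuming the site files sit at the repo root and no archive step is needed:

steps:
- task: FtpUpload@2
  inputs:
    credentialsOption: 'inputs'
    serverUrl: 'ftp://ftp.destination.com'
    username: '$(TheUsername)'
    password: '$(ThePassword)'
    # Upload the checked-out sources directly instead of an archive
    rootDirectory: '$(Build.SourcesDirectory)'
    filePatterns: '**'
    remoteDirectory: '/test'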
The root directory is the source directory your compiled code comes from; in your case it should be the checkout path.
I think you should use $(System.DefaultWorkingDirectory)/rootFolderName.
Here the root folder should be your repo name. Try printing the contents of $(System.DefaultWorkingDirectory) to check whether you need $(System.DefaultWorkingDirectory) on its own or a nested folder inside it.
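A quick way to do that check, as a sketch on the ubuntu-latest agent from the question:

- script: ls -R "$(System.DefaultWorkingDirectory)"
  displayName: 'List working directory contents'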

File pattern for Publish Pipeline Artifact in Azure DevOps

I recently built an Azure Pipeline where, in one stage, there are several zip files in the artifact staging directory. What I'm trying to achieve is to publish all the zip files from the staging folder to the drop folder with the PublishPipelineArtifact task.
I have 2 archived zip files in artifact staging directory:
$(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip
$(Build.ArtifactStagingDirectory)/cli_scripts_$(Build.BuildId).zip
In my azure-pipelines.yml file please find the publish task:
- task: PublishPipelineArtifact@0
  displayName: 'Publish pipeline artifacts'
  inputs:
    targetPath: $(Build.ArtifactStagingDirectory)/**
This gives the following error:
[error] Path does not exist: d:\a\1\a**
I have already tried the following as well, but none of them works:
$(Build.ArtifactStagingDirectory)/**
$(Build.ArtifactStagingDirectory)/**/*.zip
$(Build.ArtifactStagingDirectory)/*.zip
Question:
What is the pattern for targetPath to move all the zip files from that folder?
Any help is appreciated!
What finally resolved the issue was specifying the pattern with archiveFilePatterns in the task, rather than combining it with targetPath as I originally tried.
The solution that worked well is the following:
- task: PublishPipelineArtifact@0
  displayName: 'Publish pipeline artifacts'
  inputs:
    targetPath: $(Build.ArtifactStagingDirectory)/
    archiveFilePatterns: '**/*.zip'
The official documentation does not really state this, but it gave me the idea of using the pattern attribute: Publish and download artifacts.
I hope this helps someone in the future.
You can use the .artifactignore file to filter what the PublishPipelineArtifact task can see. Make sure the file is in the folder that you're publishing, as mentioned here:
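For illustration, a minimal .artifactignore that keeps only the zip files might look like this (it lives in the folder being published and follows .gitignore syntax; re-including files in nested folders may need extra negation rules):

**/*
!*.zip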
