The current setup is as follows:
Version Control - Git
Repos and Branch hosted on - Azure DevOps
Codebase - External server
The dev team clones the Azure Repo into a local Git project; staged changes are committed via Git and pushed to a specific branch in Azure DevOps. In this setup we want to upload the changes to external FTP servers automatically and avoid manual uploads. We are currently using the Azure DevOps FTP Upload task (https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/ftp-upload?view=azure-devops) and can run the pipeline and publish the artifact successfully. However, the task uploads all files and folders at the specified path, not just the changed files. The YAML script is below:
trigger:
- main

pool:
  vmImage: 'ubuntu-latest'

variables:
- name: StagingRepo
  value: $(System.DefaultWorkingDirectory)

steps:
- publish: $(StagingRepo)
  artifact: Staging Repo

- task: FtpUpload@2
  displayName: 'FTP Upload'
  inputs:
    credentialsOption: inputs
    serverUrl: 'ftps://00.00.00.00:22'
    username: ftp-username
    password: ftp-password
    rootDirectory: '$(StagingRepo)'
    remoteDirectory: '/home/public_html'
    clean: false
    cleanContents: false
    preservePaths: false
    trustSSL: true
PROBLEM
Is there any way to upload only the committed changes to FTP instead of uploading the whole repo? According to the docs, new build pipelines update only the changed files, but in this case everything is uploaded.
Thanks
As per the task doc:
You are uploading the whole working directory (which stores your repo content) to the remote FTP server.
Also make sure remoteDirectory points to a directory (such as /home/public_html), not a file.
If you'd like to upload only the committed changes to FTP, you first need to find the changed files. You can find a YAML script for that in the answer here.
The remaining problem is uploading those files: with the FtpUpload@2 task you can copy the changed files into a folder and upload that folder's contents to the remote server, but the files then all end up in one folder, which no longer matches the layout of the Git repo.
If you'd like to sync the changes to the remote server while keeping the same file paths as in the repo, you need to install git-ftp. Please refer to the links here and here, and use a Git script instead of the task to upload the files (a sketch of both options follows below).
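As a rough illustration of the first option, the steps below are a minimal sketch (not the exact script from the linked answer): they collect the files touched by the most recent commit into a staging folder and point FtpUpload@2 at that folder. The changed-files folder name and the HEAD~1..HEAD comparison range are assumptions; adjust them to your branching and build model.

steps:
- checkout: self
  fetchDepth: 2   # HEAD~1 must exist for the diff below

- bash: |
    # Copy every file changed by the last commit into a staging folder,
    # preserving relative paths (cp --parents is GNU coreutils, available on ubuntu-latest).
    mkdir -p "$STAGING_DIR"
    git diff --name-only --diff-filter=d HEAD~1 HEAD \
      | xargs -r -I{} cp --parents "{}" "$STAGING_DIR"
  displayName: 'Collect changed files'
  env:
    STAGING_DIR: $(Build.ArtifactStagingDirectory)/changed-files

- task: FtpUpload@2
  displayName: 'FTP upload changed files only'
  inputs:
    credentialsOption: inputs
    serverUrl: 'ftps://00.00.00.00:22'
    username: ftp-username
    password: ftp-password
    rootDirectory: '$(Build.ArtifactStagingDirectory)/changed-files'
    remoteDirectory: '/home/public_html'
    preservePaths: true

For the second option, after installing git-ftp on the agent (for example sudo apt-get install git-ftp), a plain script step can run something like git ftp push -u <user> -p <password> ftps://<host>/<path> (with an initial git ftp init on the first run); git-ftp then uploads only the files changed since the previous push and keeps the repository's directory layout.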
Edit:
This is an extension, FTP Uploader, which supports "Ignore unchanged files".
Related
OK, so the main question here is: How to Sync Github with Azure?
My main reference source was: How to synchronize Azure Repos with External Git Repos
https://faun.pub/how-to-synchronize-azure-repos-with-external-git-repos-70ff92e51c63
And that is a perfect match, but here is the catch: it is kind of abstract, and you need prior experience working with Azure, like knowing what YAML is for and how to use it.
Long story short, it did not work. And then I found this gentleman's video: Merge From Github to Azure DevOps
https://www.youtube.com/watch?v=Kks1pCG51bI
That is super close, yet there were still several errors and bugs, like:
Cannot find path 'D:\a\1\s\copyrepo~' because it does not exist;
remote: Not Found fatal: repository 'https://github.com/***/' not found ##[warning]Git fetch failed with exit code 128, back off 5.443 seconds before retry.
error: unable to create file Filename too long
So, that is a bummer...
I mean, yes, you technically can create and synchronize a GitHub repo with Azure, but you have to create a new GitHub repo; with an existing one there was: error: unable to create file Filename too long.
Please tell me what you think.
Try the following YAML file, which I used to sync GitHub to Azure Repos:
name: Sync Azure with Github

variables:
  REMOTE_ADDR: 'https://github.com/{user}/{repo}.git'

stages:
- stage: syncing_Repos
  displayName: syncing Repos
  jobs:
  - job: run_Git_Commands
    displayName: run_Git_Commands
    continueOnError: false
    steps:
    - checkout: self
      clean: true
      persistCredentials: true
      displayName: run_commands
    - bash: |
        git checkout master
        git remote add repoGithub $(REMOTE_ADDR)
        git fetch repoGithub master
        git reset --hard repoGithub/master
        git pull --rebase repoGithub master
        git push --force origin
The artifact is downloaded to '/Users/runner/work/1/', and the deployment task is looking for it at '/Users/runner/work/1/s/XYZ/**/ABC.ipa'.
In the build stage, artifacts are published with PathtoPublish: '$(build.artifactstagingdirectory)/${{parameters.env}}', and in the deployment they are accessed as '$(System.DefaultWorkingDirectory)/XYZ/**/ABC.ipa'.
Please help me access the .ipa file correctly.
Two predefined variables are used here, but they point to different folder structures (doc here):
build.artifactstagingdirectory: the local path on the agent where any artifacts are copied to before being pushed to their destination, for example c:\agent\_work\1\a; it has no s folder.
System.DefaultWorkingDirectory: the local path on the agent where your source code files are downloaded, for example c:\agent\_work\1\s.
It's recommended to add a PowerShell task that lists all files in the directory and its subdirectories, so that we can find where the files are actually stored on the agent. Code sample:
- task: PowerShell@2
  name: listfiles
  inputs:
    targetType: 'inline'
    script: 'Get-ChildItem -Path $(System.DefaultWorkingDirectory) -Recurse -File'
After we confirm where the file resides, we can modify the path in the deployment task so the file can be found; a sketch follows.
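As a sketch only: in a YAML deployment job, pipeline artifacts are downloaded under $(Pipeline.Workspace)/<artifact name>, not under $(System.DefaultWorkingDirectory). Assuming the artifact was published with ArtifactName 'prod' (derived from ${{parameters.env}} in the question), the reference would typically look like this; the exact sub-path under the artifact folder is whatever the listing step above reveals.

- download: current
  artifact: prod          # assumed artifact name, matching ${{parameters.env}} at publish time

- script: ls -R "$(Pipeline.Workspace)/prod"
  displayName: 'List downloaded artifact contents'

# The .ipa would then be referenced with a pattern rooted at the artifact folder, e.g.:
#   '$(Pipeline.Workspace)/prod/**/ABC.ipa'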
I have a requirement to generate, archive and reuse artifacts between two different repositories:
Repository A: compile Angular code and create an XLF file
Repository B: use the XLF file generated above and create a new XLF file
Repository A: again use the newly generated XLF file to create the final output file
The activities mentioned above should be done via gitlab-ci.yml, and I am not sure how to handle this with GitLab CI.
We can push the artifact from Repo A to Repo B. However, CI on Repo A should wait until Repo B pushes a new artifact back to Repo A to complete the process.
Ideally, you would not push a generated artifact to another Git source repository.
But a GitLab pipeline can retrieve an artifact produced by another one from its URL.
To avoid the back and forth, I would rather have three jobs instead of two:
the first generates the XLF file
the second curls/fetches that file and uses it to generate the new XLF file
the third curls/fetches that file and completes the process (see the sketch below)
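A minimal sketch of that layout, under several assumptions: the artifact path outputs/first.xlf, the job names, the transform.sh/finalize.sh scripts, and the GITLAB_HOST, PROJECT_ID and API_TOKEN variables are all placeholders, and the three jobs are shown in one file for brevity even though the middle one would live in the other repository. The curl calls use GitLab's documented "download a single artifact file" endpoint.

stages:
  - build
  - transform
  - finalize

generate_xlf:
  stage: build
  script:
    - npm run extract-i18n                  # placeholder for the Angular extraction step
  artifacts:
    paths:
      - outputs/first.xlf

transform_xlf:
  stage: transform
  script:
    # Fetch the artifact produced by generate_xlf via the job-artifacts API.
    - 'curl --location --header "PRIVATE-TOKEN: $API_TOKEN" --output first.xlf "https://$GITLAB_HOST/api/v4/projects/$PROJECT_ID/jobs/artifacts/main/raw/outputs/first.xlf?job=generate_xlf"'
    - ./transform.sh first.xlf second.xlf   # placeholder transformation
  artifacts:
    paths:
      - second.xlf

finalize:
  stage: finalize
  script:
    - 'curl --location --header "PRIVATE-TOKEN: $API_TOKEN" --output second.xlf "https://$GITLAB_HOST/api/v4/projects/$PROJECT_ID/jobs/artifacts/main/raw/second.xlf?job=transform_xlf"'
    - ./finalize.sh second.xlf              # placeholder final step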
How to release built artifacts back and forth from one repo to another on GitLab?
Repository A:
Compile Angular code and create an XLF file
Send a hook to repository B that it has just compiled
just use trigger: https://docs.gitlab.com/ee/ci/yaml/#trigger , which works like a charm. It's even nicely visible in the GUI.
or the API: https://docs.gitlab.com/ee/ci/triggers/
pass variables: PARENT_PIPELINE_ID: $CI_PIPELINE_ID to repository B so it can download artifacts from that specific pipeline
Repository B:
Use the 'XLF File' generated above
use needs: https://docs.gitlab.com/ee/ci/yaml/#artifact-downloads-to-child-pipelines to download the artifacts
or the API: add a personal access token from repository A (https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html) to the environment variables and use the API to download the artifacts (https://docs.gitlab.com/ee/api/job_artifacts.html)
create a new XLF file
use trigger: or the API to trigger repository A
but this time trigger a different .gitlab-ci.yml file, like: trigger: - project: repositoryA file: second_stage.gitlab-ci.yml (https://docs.gitlab.com/ee/ci/yaml/#trigger-child-pipeline-with-files-from-another-project)
or use something like variables: SECOND_STAGE: "true" and use that variable to differentiate
Repository A:
run pipeline from the file second_stage.gitlab-ci.yml
download artifacts from repository B - needs: or API
use the newly generated XLF file to create the final output file
Overall, what you need is the rules: and needs: documentation. On older GitLab versions this was done with the API.
CI on Repo A should wait until Repo B pushes a new artifact to Repo A to complete the process
Don't wait. Let the API trigger it.
I tried the following approach and it worked fine, or at least I was able to proceed.
For some reason 'variables' combined with cURL did not work as expected, but I have not analyzed the root cause.
Repo A - Pipeline
trigger-repob:   # trigger Project B of Repo B
  stage: repob
  trigger:
    project: repob-namespace/projectb
    branch: devops

test_job:
  image: $CI_REGISTRY/$CI_PROJECT_PATH/base-image:latest
  stage: test_pipeline
  when: delayed
  start_in: 2 minutes
  needs:         # use artifacts from Repo B / Project B
    - project: repob-namespace/projectb
      job: buildprojectb
      ref: devops
      artifacts: true
  script:
    - do something here
Repo B - Pipeline
buildprojectb:
  image: php:7.4.11
  stage: build
  script:
    - do something here
  artifacts:
    paths:
      - outputs/*.xlf
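One detail these excerpts leave implicit: the custom stage names referenced in the Repo A pipeline (repob, test_pipeline) must be declared in a stages: list, presumably defined elsewhere in the real file. A minimal sketch of that declaration:

stages:
  - repob
  - test_pipeline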
I manage a static website. No database, no server-side processing like ASP.NET (IIS), PHP, etc. The website consists of just HTML, CSS, some JavaScript, and a few graphics files. I'm trying to use Azure Pipelines for this; it's my first time using Azure Pipelines. I chose an HTML template to start the YAML pipeline.
I'm using the FTP Upload task. The source code is in Azure DevOps Repos. I'm testing the Pipeline by trying to FTP the files to a test folder on the hosting server (not a part of Azure). In testing the pipeline, I get this error:
##[error]Error: Failed find: ENOENT: no such file or directory, stat '/bin/pyvenv'
I don't know what I should put as the rootDirectory. I thought it appropriate to put the "/". Here's the YAML:
# HTML
# Archive your static HTML project and save it with the build record.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

trigger:
- none

pool:
  vmImage: 'ubuntu-latest'

steps:
- task: ArchiveFiles@2
  inputs:
    rootFolderOrFile: '$(build.sourcesDirectory)'
    includeRootFolder: false

- task: FtpUpload@2
  inputs:
    credentialsOption: 'inputs'
    serverUrl: 'ftp://ftp.destination.com'
    username: '$(TheUsername)'
    password: '$(ThePassword)'
    rootDirectory: '/'
    filePatterns: '**'
    remoteDirectory: '/test'
    clean: false
    cleanContents: false
    preservePaths: false
    trustSSL: false
What should I put for the rootDirectory?
Have you tried
rootDirectory: '.'
OR
rootDirectory: '$(Pipeline.Workspace)' ?
The ArchiveFiles task's archiveFile field defaults to $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip.
According to your YAML build definition, you have archived the folder $(build.sourcesDirectory) to the path $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip, and the next task is FtpUpload, so the rootDirectory field should be $(Build.ArtifactStagingDirectory)/$(Build.BuildId).zip; the task will then upload the zip file to the remote machine.
If you want to upload the static website files themselves, you could instead set rootDirectory to $(Build.SourcesDirectory). Check the build variables doc for more details; a sketch of that variant follows.
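A minimal sketch of that second variant, reusing the server, credential variables and /test remote folder from the question; the ArchiveFiles step is dropped because the site files are uploaded as-is rather than as a zip:

steps:
- task: FtpUpload@2
  inputs:
    credentialsOption: 'inputs'
    serverUrl: 'ftp://ftp.destination.com'
    username: '$(TheUsername)'
    password: '$(ThePassword)'
    rootDirectory: '$(Build.SourcesDirectory)'   # upload the checked-out site files directly
    filePatterns: '**'
    remoteDirectory: '/test'
    preservePaths: true                          # keep the repo folder structure on the server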
The root directory is the directory your compiled code comes from; in your case it should be the checkout path.
I think you should use $(System.DefaultWorkingDirectory)/rootFolderName.
Here the root folder should be your repo name. Try printing the contents of $(System.DefaultWorkingDirectory) to check whether you need $(System.DefaultWorkingDirectory) on its own or a nested folder inside it.
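For example, a quick listing step along these lines (the displayName is arbitrary) shows what the checkout actually produced:

- script: ls -R "$(System.DefaultWorkingDirectory)"
  displayName: 'List working directory contents'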
I have a problem deploying my angular app to Azure storage static website. I created a CI/CD pipeline last year, I was using it for months. There was no deployment in the past few months, and with today's deployment I ran into some problems.
The first problem was about authentication, but I was able to solve it.
The second is the one I'm struggling with. I run the release pipeline and the files are copied, but not into the $web container root: it creates a 'prod' folder and copies the files there. Because of this I cannot open the website, as it looks for the 'index.html' file in the $web root folder and obviously doesn't find it, since it is in the $web/prod folder. I can open the file by going to the /prod URL, but then it tries to load all resources from the root (not the prod folder), and obviously the files aren't there.
I looked at some articles about deploying to a static website storage account, and all of them showed YAML similar to what I'm using.
Here's my build pipeline which publishes the artifacts after build:
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(System.DefaultWorkingDirectory)/dist/prod'
    ArtifactName: 'prod'
    publishLocation: 'Container'
  displayName: 'publish prod'
And here is the file copy task:
- task: AzureFileCopy@4
  displayName: 'AzureBlob File Copy'
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/_angularApp CI/prod'
    azureSubscription: '***'
    Destination: AzureBlob
    storage: angularApp
    ContainerName: '$web'
The files are there after the 'PublishBuildArtifact' task and they are copied, just not into the correct folder (the $web root). Does anybody have an idea?
Thanks
Usually, without specifying a Blob Prefix, the content of the specified folder is copied to the root of the container by default.
Tested on my side, it works well:
You can check whether your publish source directory contains a nested folder named prod and whether you are copying that folder itself, rather than its contents, to the container.
In addition, I used the AzureFileCopy@3 version here; you can also try the V3 Azure file copy task.
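If the extra prod level does come from SourcePath pointing at the folder rather than its contents, one option (an assumption based on the paths in the question, not verified against this exact setup) is to append a wildcard so that only the files inside prod are copied to the container root:

- task: AzureFileCopy@4
  displayName: 'AzureBlob File Copy (contents of prod only)'
  inputs:
    SourcePath: '$(System.DefaultWorkingDirectory)/_angularApp CI/prod/*'   # wildcard: copy the contents, not the folder
    azureSubscription: '***'
    Destination: AzureBlob
    storage: angularApp
    ContainerName: '$web'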