The scenario is as follows:
I have 2 pipelines.
The first pipeline builds a NuGet package as a tool and publishes it to an artifact feed.
That tool has an appsettings.json file in which I want to reference some folder relative to that path, for example ./TestFolder.
The second pipeline extracts a zip to that TestFolder, installs the tool globally with dotnet tool install -g and runs the tool.
My question is: how can I extract that zip to the folder where the tool is located?
How is this scenario handled in general?
I hope that makes sense.
Based on your situation, you can use the Extract Files task to extract that zip and specify the destinationFolder as the "TestFolder" where the tool is located.
steps:
- task: ExtractFiles@1
  displayName: 'Extract files'
  inputs:
    # archiveFilePatterns is required; this pattern is an assumption about where your zip lands
    archiveFilePatterns: '**/*.zip'
    destinationFolder: TestFolder
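To complete the second pipeline, installing and running the tool would look roughly like the step below. This is only a sketch: the feed URL and the tool/package name (MyTool) are placeholders for your own values, and it assumes the tool resolves ./TestFolder relative to its working directory.

- script: |
    # Install the tool globally from your artifact feed (feed URL and package id are placeholders)
    dotnet tool install -g MyTool --add-source "https://pkgs.dev.azure.com/YourOrg/_packaging/YourFeed/nuget/v3/index.json"
    # Run the tool from the folder that contains TestFolder so the relative path resolves
    mytool
  workingDirectory: '$(Build.SourcesDirectory)'
  displayName: 'Install and run the tool'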
I keep saving a file to $(Build.ArtifactStagingDirectory), but when I check with Get-ChildItem, the directory is empty. Publishing doesn't produce anything either, just an empty directory.
Is using the copy task the only way to save files to that directory?
You can use these variables only in running build or release pipelines. Each build uses its own $(Build.ArtifactStagingDirectory), $(Build.BinariesDirectory) and $(Build.SourcesDirectory) on your build agent. These folders are accessible to any cmd file or PowerShell script in the current build run.
The copy task is not the only way to save files to that directory.
You can also use a bash, cmd, or PowerShell script in the pipeline, as shown in the sketch below.
Possible causes of why you failed to list them:
1. You need to select the folder instead of the file in Source Folder.
(screenshot: pipeline definition)
2. This directory ($(Build.ArtifactStagingDirectory)) is purged before each new build, so the files only exist while the pipeline is running.
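For example, here is a minimal sketch of a script step writing a file into the staging directory and listing it within the same run (the file name is just an illustration), followed by publishing it as an artifact:

steps:
- powershell: |
    # Write a file into the staging directory and list the directory in the same build run
    Set-Content -Path "$(Build.ArtifactStagingDirectory)/hello.txt" -Value "hello"
    Get-ChildItem "$(Build.ArtifactStagingDirectory)"
  displayName: 'Write and list files in the staging directory'

- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'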
Databricks recently added support for "files in repos", which is a neat feature. It gives a lot more flexibility to the projects, since we can now add .json config files and even write custom Python modules that exist solely in our closed environment.
However, I just noticed that the standard way of deploying from an Azure git repo to a workspace does not support arbitrary files. First off, all .py files are converted to notebooks, breaking the custom modules that we wrote for our project. Secondly, it intentionally skips files ending in one of the following: .scala, .py, .sql, .SQL, .r, .R, .ipynb, .html, .dbc, which means our .json config files are missing when the deployment is finished.
Is there any way to get around these issues or will we have to revert everything to use notebooks like we used to?
You need to stop doing deployment the old way, as it depends on the Workspace REST API, which doesn't support arbitrary files. Instead you need to have a Git checkout in your destination workspace and update that checkout to a given branch/tag when doing a release. This can be done via the Repos API or the databricks cli; examples of both, run from a DevOps pipeline, are shown below.
- script: |
    echo "Checking out the releases branch"
    databricks repos update --path $(STAGING_DIRECTORY) --branch "$(Build.SourceBranchName)"
  env:
    DATABRICKS_HOST: $(DATABRICKS_HOST)
    DATABRICKS_TOKEN: $(DATABRICKS_TOKEN)
  displayName: 'Update Staging repository'
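If you prefer the Repos API over the cli, the same update can be done with a plain REST call. This is only a sketch: REPO_ID is a placeholder you would look up first (for example via GET /api/2.0/repos).

- script: |
    # Point the workspace checkout at the branch being released (PATCH /api/2.0/repos/{repo_id})
    curl -s -X PATCH "$(DATABRICKS_HOST)/api/2.0/repos/$(REPO_ID)" \
      -H "Authorization: Bearer $(DATABRICKS_TOKEN)" \
      -d "{\"branch\": \"$(Build.SourceBranchName)\"}"
  displayName: 'Update Staging repository via Repos API'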
I have been tasked to move a repository from Azure DevOps to a Bitbucket server. I'm very new to Azure DevOps. When I went to my repository in Azure, I noticed that I could download the repo as a zip file, but there was no option to clone. I downloaded the zip file and when I tried to unzip it, I got the following message:
unzip: can't find file table
The unzip process won't complete successfully. Can somebody explain why I cannot unzip a file from Azure DevOps, or tell me why the 'clone' option is missing?
As Daniel Mann said in the comments, there is no clone button for a TFVC repo.
To get a copy of the source code, besides using tf.exe (a rough sketch is shown below), you can also map a workspace through Visual Studio. You can map your source control folder to a single local folder.
You can refer to this document.
In addition, downloading the code repo as a zip file and importing it into the new domain is also an option. I tested downloading as a zip and it can be successfully unzipped with 7-Zip.
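For completeness, a rough sketch of getting the TFVC sources with tf.exe from a Developer Command Prompt; the collection URL, server path and local folder are placeholders:

tf workspace /new MyWorkspace /collection:https://dev.azure.com/YourOrg
tf workfold /map "$/YourProject" "C:\src\YourProject" /workspace:MyWorkspace
tf get "C:\src\YourProject" /recursive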
I'm building an Azure DevOps pipeline in which one of the steps is to reference an already prebuilt package and copy it to the current pipeline. I'm following the steps below, which work fine, but I think there should be a way in Azure DevOps to copy directly from Artifactory to $(Build.ArtifactStagingDirectory).
Current Approach:
In the Azure repo, I have listed the prebuilt artifact/package inside
requirements_generic_bash.txt
Now in my pipeline.yml:
- bash: |
    echo PythonV3
    python3 -m venv venv
    source venv/bin/activate
    python --version
    http_proxy="xxxx"
    https_proxy="xxx"
    index_url="https://actory.com/artifactory/api/simple"
    extra_index_url="https://actorycom/artifactory/api/simple"
    python -m pip install -r $(System.DefaultWorkingDirectory)/requirements/requirements_generic_bash.txt --index-url ${index_url} --extra-index-url ${extra_index_url}
    deactivate
  displayName: Install GenericBash from Artifactory

- bash: |
    cp -r venv/lib/python3.7/site-packages/* $(Build.ArtifactStagingDirectory)
  displayName: Copy files to ArtifactStagingDirectory
So my question is: is there any way 'Copy files to ArtifactStagingDirectory' can be done directly instead of going through the virtual env? If so, how?
There is a pipeline task that might meet your need.
The Copy Files task can copy files from a source folder to a target folder using match patterns on file paths.
So you can set the already prebuilt package path as the Source Folder and set the Target Folder accordingly.
If you would like to use a YAML file, you can set it as below.
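Here is a minimal sketch of that Copy Files task in YAML; the source folder ($(System.DefaultWorkingDirectory)/prebuilt) is an assumption, so point it at wherever your prebuilt package actually sits:

- task: CopyFiles@2
  displayName: 'Copy files to ArtifactStagingDirectory'
  inputs:
    SourceFolder: '$(System.DefaultWorkingDirectory)/prebuilt'
    Contents: '**'
    TargetFolder: '$(Build.ArtifactStagingDirectory)'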
For more detailed information about this task, see https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/utility/copy-files?view=azure-devops&tabs=yaml
I've got a working build pipeline in Azure DevOps that essentially installs Python 3.6, sets up a virtual environment (.env) and then executes all unit tests. As its final step, it uses a copy operation to move all files, including the virtual environment, to a drop folder.
My problem arises when creating a release pipeline. I am running a bash script for the release pipeline that essentially installs the Azure Functions Core Tools, and then I activate the Python virtual environment before I call the func azure functionapp publish instruction.
The error I get states that settings are encrypted and that I need to call func setting add to add settings. However, when run locally, the script executes without any error whatsoever.
Does anyone have a working release pipeline in Azure DevOps for a Python-based Azure Function that they'd be able to share with me, so I can perhaps see what I am doing wrong?
Here is the relevant bit of script that executes:
#!/usr/bin/env bash
FUNCTION_APP_NAME="secret"
FUNCTION_APP_FOLDER="evenMoreSecret"
# Install Azure Functions Core Tools
echo "--> Install Azure Functions Core Tools"
wget -q https://packages.microsoft.com/config/ubuntu/16.04/packages-microsoft-prod.deb
sudo dpkg -i packages-microsoft-prod.deb
sudo apt-get update
sudo apt-get install azure-functions-core-tools -y
echo ">>>>>>>> Initialize Python Virtual Environment"
source .env/bin/activate
echo "--> Publish the function app to Azure Functions"
cd $FUNCTION_APP_FOLDER
func azure functionapp publish $FUNCTION_APP_NAME --build-native-deps
The script is executed using the Azure CLI, with a service principal that is tied to the Azure account it is targeting.
Usually with Azure DevOps you create several build steps that result in some build artifacts; these are defined in the azure-pipelines.yml file. You then do a release step to release the artifacts that you have created, which is configured within the UI. This can involve deploying to a test server and then to production, or however you want to configure it. What you are describing is doing the build and release steps all in one YAML file, as the func publish is essentially doing a release and it all seems to be in one script.
In the next release of the az cli there is a new command called az functionapp devops-build that will set up the DevOps pipeline with separate build and release steps. However, in the meantime, we have created a series of beta YAML files that we hope you can just drag and drop to do the build and release steps within the build part alone (as you are doing).
The beta yaml files are here:
https://github.com/Azure/azure-functions-devops-build/wiki/Yaml-Samples
I must disclaim that they are not fully tested, nor are they supported yet.
I will answer myself as I've solved the problem.
@Oliver Dolk: We do NOT want to publish as part of a build pipeline. The only thing I'm interested in there is setting up a virtual environment and then running the unit tests.
The RELEASE stage is where we want to deploy the scripts copied over from the build step. These artifacts are then the basis for releasing into the dev, test and production environments.
I was missing a very important step in my script: creating a local.settings.json file, which contains the encrypted settings for the function app.
In order to solve the problem, I only had to call the following:
func azure functionapp fetch-app-settings $FUNCTION_APP_NAME
This calls the Azure function app and retrieves its settings into an encrypted local.settings.json, which is then used during publishing.
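In the deployment script from the question, the call slots in right before the publish step:

cd $FUNCTION_APP_FOLDER
# Pull the function app's settings into an encrypted local.settings.json before publishing
func azure functionapp fetch-app-settings $FUNCTION_APP_NAME
func azure functionapp publish $FUNCTION_APP_NAME --build-native-deps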
For a complete script reference of both the build YAML script and the bash script that does the deployment, I've put both in an anonymized GitHub repo:
https://github.com/digitaldias/Python-Examples