I have a YAML anchor that deploys logic apps. I want the pipeline to look for logic apps in subdirectories, loop through each one, and deploy it. Here's my deploy-logicapp anchor:
- step: &deploy-logicapp
    name: Deploy logic app
    script:
      - source environment.sh
      - pipe: microsoft/azure-arm-deploy:1.0.2
        variables:
          AZURE_APP_ID: $AZURE_CLIENT_ID
          AZURE_PASSWORD: $AZURE_SECRET
          AZURE_TENANT_ID: $AZURE_TENANT
          AZURE_LOCATION: $AZURE_LOCATION
          AZURE_RESOURCE_GROUP: $AZURE_RESOURCE_GROUP
          AZURE_DEPLOYMENT_TEMPLATE_FILE: 'logic-apps/$DIR/template.$DEPLOYMENT_SLOT.json'
In my pipeline, I loop through all the subdirectories, and this works; it echoes each $DIR:
- step:
    script:
      - cd logic-apps
      - for DIR in $(ls -l | grep '^d' | awk '{print $9}'); do echo $DIR ; done
What I want to do is call my YAML anchor inside this loop, with the $DIR environment variable set for each iteration. I have tried a number of ways. The problem is that the for loop runs in bash, not YAML, so I cannot call the anchor from it.
Any guidance will be much appreciated.
As it turns out, I need to do everything from the Azure command line. Here's the bash script that loops through all the directories and deploys them:
#!/bin/bash
az login --service-principal -u $AZURE_CLIENT_ID -p $AZURE_SECRET --tenant $AZURE_TENANT
for DIR in $(ls -l | grep '^d' | awk '{print $9}'); do
  az deployment group create --resource-group $AZURE_RESOURCE_GROUP --template-file $DIR/template.$DEPLOYMENT_SLOT.json --name $DIR
done
Next is to only deploy those that have changed :)
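For the follow-up of deploying only the apps that changed, a minimal sketch (assuming the runner keeps enough git history for HEAD~1 to exist and the same logic-apps/<dir>/template.<slot>.json layout as above) could diff the previous commit and deploy only the affected directories:
#!/bin/bash
# Sketch: deploy only the logic-app directories touched by the last commit.
az login --service-principal -u "$AZURE_CLIENT_ID" -p "$AZURE_SECRET" --tenant "$AZURE_TENANT"
for DIR in $(git diff --name-only HEAD~1 HEAD -- logic-apps/ | cut -d/ -f2 | sort -u); do
  az deployment group create --resource-group "$AZURE_RESOURCE_GROUP" --template-file "logic-apps/$DIR/template.$DEPLOYMENT_SLOT.json" --name "$DIR"
done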
The Problem: I am trying to concatenate two variables for a cp command in a before_script for a GitLab CI/CD pipeline job.
What I expect: myfile_filesuffix
What I get: _filesuffix
Can anyone see what I am doing wrong? When I run this for loop on my local CLI I have no problems. Thank you!
before_script:
  - rm -rf .terraform
  - terraform --version
  - mkdir ~/.aws
  - echo "[default]" > ~/.aws/credentials
  - echo "aws_access_key_id=$AWS_ACCESS_KEY_ID" >> ~/.aws/credentials
  - echo "aws_secret_access_key=$AWS_SECRET_ACCESS_KEY" >> ~/.aws/credentials
  - mkdir ./deployments
  - ls common
  - common_files=$(find common -type f)
  - echo $common_files
  - prefix_common=$(echo $common_files | cut -d"/" -f 1)
  - echo $prefix_common
  - for f in $common_files;
    do
      common_file="$(basename $f)"
      cp $f ./deployments/""${common_file}"_"${prefix_common}"";
    done
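For reference, the same loop written with conventional quoting and an echo of the expanded values makes it easier to see in the job log whether any of the variables are empty (a debugging sketch only, using the same variables as above):
for f in $common_files; do
  common_file="$(basename "$f")"
  # Show exactly what will be copied so empty variables are visible in the job log.
  echo "copying $f -> ./deployments/${common_file}_${prefix_common}"
  cp "$f" "./deployments/${common_file}_${prefix_common}"
done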
You can use GitLab repo settings -> CI/CD -> Variables to add a FILE type variable and then use the mv command to move it to your folder.
For example, with a FILE type variable named ANSIBLE_VARIABLES_FILE:
script:
  - mv $ANSIBLE_VARIABLES_FILE ./deployments/variables_common.tf
I would like to save all the variables that are in the directory to a separate file, cutting out duplicates.
To begin with, I wrote all the lines containing global variables to a separate file:
grep -rI "\$.*" folder/ >> output.txt
Then I tried to pull the variables out of this file:
cat output.txt | sed /\$.*.[{A-Z}]/p
And the output was not what I expected.
So how can I extract the needed variables, when the file after grep looks like this:
something.text_text.txt: - export IMAGE_NAME=${MY_REGISTY}/$MY_PR/${MY_PRNNN}
something.text_text.txt: - docker build --network host -t ${IMAGE_NAME}:${VERSION} -f $DILE_PATH --build-arg setupfile=$SET_FIL>
something.text_text.txt: - docker push ${IMAGE_NAME}:${VERSION}
something.text_text.txt: - docker tag ${IMAGE_NAME}:${VERSION} ${IMAGE_NAME}:${MY_BUILD_REF_NAME}
something.text_text.txt: - docker push ${IMAGE_NAME}:${MY_BUILD_REF_NAME}
something.text.txt: - /^rel_.*$/
something.text.txt: - eval $(ssh-agent -s)
something.text.txt: - chmod 400 $MY_SSH_KEY
something.text.txt: - ssh-add $MY_KEY
something.text.txt: - git checkout ${MY_BUIL_NAME}
something.text.txt: - git reset --hard origin/${MY_F_NAME}
something.text.txt: - mvn -s MY_settings.xml ${MTS} license:add-third-party
something.text.txt: - cat ${LICENSE_LIST_FILE}
something.text.txt: POM_XML_COMMIT_HASH_LOCAL=$(git log --oneline --follow -- pom.xml | awk '{ print $1 }' | head -n 1) || true
something.text.txt: echo POMIT_HASH_LOCAL=${PCOMMIT_HASH_LOCAL}
something.text.txt: POM_XML_COMMIT_HASH_REMOTE=$(git log --oneline origin/${MY_BUILD_REF_NAME} --follow -- pom.xml | awk '{ print $1 }' | h>
something.text.txt: echo POM_XML_COMMIT_HASH_REMOTE=${POM_OMMIT_HASH_REMOTE}
something.text.txt: if [[ ${POM_XML_COMMIT_HASH_LOCAL} = ${POMMIT_HASH_REMOTE} ]]; then
something.text.txt: echo "File pom.xml is the same for local and origin ${MY_BUILD_REF_NAME} branch."
something.text.txt: echo "New commits are presented in origin/${MY_BUILD_REF} branch for pom.xml file. Skipping."
something.text.txt: - git add -f ${LICENSE_LIST_FILE}
something.text.txt: - export MY_PUSH_URL=`echo $MY_REPOSITORY_URL | perl -pe 's#.*#(.+?(\:\d+)?)/#git#\1:#'`
something.text.txt: - git remote set-url --push origin "${MY_PUSH_URL}"
something.text.txt: - git push -f -o ci.skip origin ${MY_BUILD_REF_NAME}
something.text_tests.txt: - docker login -u $MY_REGISTRY_USER -p $MY_REGISTRY_PASSWORD $MY_REGISTRY
something.text_tests.txt: - export CONFIG_FILE=${HOME}/.docker/config.json
something.text_tests.txt: - export VERSION=$(cat current_version)
something.text_tests.txt: - export MY_PROJECT_NAME_UPPER_CASE=$(echo ${MY_PROJECT_NAME} | tr a-z A-Z)
something.text_tests.txt: - export ${MY_PROJECUPPER_CASE}_IMAGE=${MYISTRY}/${MY_PROJECT_PATH}/${MY_PROJECT_NAME}:${VERSION}
something.text_tests.txt: - docker pull ${MY_REG}/${MY_PR}/${MY_PROJEC}:${VERS}
Try
grep -Po '\$\.*.[{_A-Z}]+' output.txt
-P makes grep use Perl regex syntax
-o outputs only the matching parts
Now, improve your regex. For starters, I have already added _ to it, but it would still find ${X}{Y} (a false positive), not find ${lowercase} (a false negative), and only partly find ${DIR#/} (because of the extra syntax).
You can tell grep to output only matching parts using --only-matching or simply -o.
The real problem here is what actually makes a valid variable identifier. This of course depends on which shell the script was written for and how many different styles the original author of the script used.
Let's assume something sane: identifiers start with a letter and can only contain alphanumeric characters and underscores. You can also reference the same identifier as either $MY_VARIABLE or ${MY_VARIABLE}.
I would go with something like this:
grep -rhIo '\$[a-zA-Z_\{\}]*' directory | sort --unique
But be aware that array syntax and parameter-expansion operations on variables will break this very quickly.
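If you also want to fold ${VAR} and $VAR into one spelling before deduplicating, a possible post-processing step (just a sketch built on the same assumption about identifiers) is:
grep -rhIoE '\$\{?[a-zA-Z_][a-zA-Z0-9_]*\}?' directory | tr -d '${}' | sort --unique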
To get correct results for ${adjacent}text maybe go for
grep -hrEo '\$\{?[A-Za-z_0-9]+\}?' .
This will still not work correctly for
: <<\_
literal $text in a quoted here document
_
echo '$quoted literal text'
echo \$escaped \$dollar \$signs
etc
but for a quick and dirty attempt, maybe just ignore those corner cases, or add some sort of postprocessing to remove them.
Properly solving this requires you to have a sh parser to figure out which dollar signs are quoted etc; and with eval even that won't be complete.
As a comment on other answers here, grep -P is not portable, and requires GNU grep. If you don't have that, and really require Perl regex extensions, maybe simply go with Perl.
perl -lne 'print($&) while m/\$\{?[A-Za-z_0-9]+\}?/go' **/*
The **/* recursive wildcard is not portable either; if you require a POSIX-compatible script, maybe resort to
find . -type f -exec perl -lne \
  'print($&) while m/\$\{?[A-Za-z_0-9]+\}?/go' {} +
though of course Perl isn't at all POSIX either.
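To deduplicate the Perl output as well, it can be piped through sort -u just like the grep variants above (a small usage sketch):
perl -lne 'print($&) while m/\$\{?[A-Za-z_0-9]+\}?/go' **/* | sort -u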
The Linux VM is created using an Azure Pipeline. One folder is created dynamically within the /data/config folder. The requirement is to return the name of this folder to the DevOps pipeline.
The PowerShell code to invoke the shell script is below:
Invoke-AzVMRunCommand -ResourceGroupName $ResourceGroupName -VMName $VirtualMachineName -CommandId RunShellScript -ScriptPath "$($PSScriptRoot)/scripts/$PatchScript" -Parameter @{InputConf=$ReplInputConf}
I have done the following in the shell script and the folder name is written to a.txt
sudo -u $User find /data/config -maxdepth 1 -type d | sed 's/.*\///' >> /log/a.txt
sudo -u $User sed -i "/^\s*$/d" /log/a.txt
How can I return the string in a.txt back to the pipeline?
Thanks
You could try using a script to get the content of the txt file and assign it to a variable $var. Then use logging commands to set another variable in the task context's variable service. This variable is exposed to the following tasks as an environment variable.
echo "##vso[task.setvariable variable=testvar;]$var"
I need to copy a folder with multiple files to another folder in a build pipeline.
I use
cp -R -v pathToSourceFolder pathToDestFolder
cp -R -v /Users/runner/runners/2.166.4/work/1/s/en.lproj/ /Users/runner/runners/2.166.4/work/1/s/platforms/ios/AppName/Resources
and I am getting an error with exit code 126:
usage: cp [-R [-H | -L | -P]] [-fi | -n] [-apvXc] source_file target_file
cp [-R [-H | -L | -P]] [-fi | -n] [-apvXc] source_file ... target_directory
Can anyone help with this, as I am new to Linux/macOS commands?
Note: the pipeline runs on macOS.
Since there is nothing obviously wrong with the cp command, I would just add some safety checks for the presence of the directories:
# Define directories involved
from="pathToSourceFolder"
to="pathToDestFolder"
# Check existence
if [[ -d "$from" ]]
then
  # If the destination directory does not exist, we create it.
  [[ -d $to ]] || mkdir -p "$to"
  if [[ -d $to ]]
  then
    cp -R -v "$from" "$to"
  else
    # If we get here, we will likely have permission problems.
    echo "Cannot create directory $to"
    ls -ld "$to"
  fi
else
  echo "Source directory $from does not exist"
fi
I need to copy the folder with multiple files to another folder in
build pipeline.
1. I assume /Users/runner/runners/2.166.4/work/1/s is the default working folder of your build. So please avoid hard-coding the path; instead you can use $(System.DefaultWorkingDirectory), which represents that path. See the Azure DevOps predefined variables.
2. Since your original purpose is to copy the files in an Azure DevOps pipeline, you don't need to care too much about the corresponding copy syntax on different operating systems (Linux, macOS, or Windows).
You can do what you want easily using the official Copy Files task. This task requires three inputs: the Source folder, the Target folder, and the Contents we want to copy; that's it.
Classic UI format:
You can choose the source folder via the Browse Source Folder option, then use ** as the Contents and $(System.DefaultWorkingDirectory)/platforms/ios/Fixi/Resources as the Target folder.
YAML format:
- task: CopyFiles@2
  displayName: 'My Copy Task'
  inputs:
    SourceFolder: en.lproj
    TargetFolder: '$(System.DefaultWorkingDirectory)/platforms/ios/Fixi/Resources'
We've handled the logic behind the scenes for you so that you can use this task easily on macOS/Linux/Windows.
I know the accepted answer says the cp command is not broken, but it actually is in Azure Pipelines. Microsoft seems to append a "/" at the end by default, which breaks the behaviour of the command.
if you do:
cp -r ./src ./dest
it will copy the files from the src folder into the dest folder.
if you write:
cp -r ./src ./dest/
it will copy the src folder into the destination one, leaving you with ./dest/src/*
Also, my build pipeline failed when I tried to copy a file:
cp ./src/myfile.txt ./dest/myfile.txt
Adding the "/" at the end will cause it to fail as it's attempting to dump the file into a directory that does not exist since the actual command that runs is the following.
cp ./src/myfile.txt ./dest/myfile.txt/
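If you cannot control whether a trailing slash gets appended, one defensive option (a sketch; the paths are placeholders) is to strip it before calling cp:
dest="./dest/"
dest="${dest%/}"   # drop a single trailing slash if present
cp ./src/myfile.txt "$dest/myfile.txt"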
Below is the script from my .gitlab-ci.yml file. This GitLab CI configuration is valid, but when the CI/CD build runs, the job fails. Is it something to do with the FOR loop syntax?
deploy_dv:
  stage: deploy_dv
  variables:
    GIT_STRATEGY: none
  script:
    - echo "Deploying Artifacts..."
    - echo "Configure JFrog CLI with parameters of your Artifactory instance"
    - 'c:\build-tools\JFROG-CLI\jfrog rt config --url %ARTIFACTORY_WEBSITE% --user %ARTIFACTORY_USER% --apikey %APIKEY%'
    - 'cd ..\artifacts'
    - 'SETLOCAL ENABLEDELAYEDEXPANSION'
    - FOR %%i in (*) do (
      'c:\build-tools\curl\bin\curl.exe --header "PRIVATE-TOKEN:%HCA_ACCESS_TOKEN%" --insecure https://code.example.com/api/repository/tags/%CI_COMMIT_TAG% | c:\build-tools\jq\jq-win64.exe ".release.description" > temp.txt'
      'set /p releasenote=<temp.txt'
      'rem del temp.txt'
      'set mydate=%DATE:~6,4%-%DATE:~3,2%-%DATE:~0,2%'
      'c:\build-tools\JFROG-CLI\jfrog rt u "%%i" %ARTIFACTORY_ROOT_PATH%/%PROJECT_NAME%/%%i --build-name=%%i --build-number=%BUILDVERSION% --props releasenote=%releasenote%;releaseversion=%BUILDVERSION%;releasedate=%mydate% --flat=false'
      )
    - '%CURL% -X POST -F token=%REPOSITORY_TOKEN% -F ref=master -F "variables[RELEASE]=false" -F "variables[PROGRAM]=test" --insecure https://code.example.com/api/repository/trigger'
  only:
    - /^(dv-)(\d+\.)(\d+\.)(\d+)$/
I get the error below:
$ echo "Deploying Artifacts..."
"Deploying Artifacts..."
$ echo "Configure JFrog CLI with parameters of your Artifactory instance"
"Configure JFrog CLI with parameters of your Artifactory instance"
$ c:\build-tools\JFROG-CLI\jfrog rt config --url %ARTIFACTORY_WEBSITE% --user %ARTIFACTORY_USER% --apikey %APIKEY%
Artifactory server ID [Default-Server]: $ cd ..\artifacts
$ SETLOCAL ENABLEDELAYEDEXPANSION
$ FOR %%i in (*) do ( 'c:\build-tools\curl\bin\curl.exe --header "PRIVATE-TOKEN:%HCA_ACCESS_TOKEN%" --insecure https://code.example.com/api/repository/tags/%CI_COMMIT_TAG% | c:\build-tools\jq\jq-win64.exe ".release.description" > temp.txt' 'set /p releasenote=<temp.txt' 'rem del temp.txt' 'set mydate=%DATE:~6,4%-%DATE:~3,2%-%DATE:~0,2%' 'c:\build-tools\JFROG-CLI\jfrog rt u "%%i" %ARTIFACTORY_ROOT_PATH%/%PROJECT_NAME%/%%i --build-name=%%i --build-number=%BUILDVERSION% --props releasenote=%releasenote%;releaseversion=%BUILDVERSION%;releasedate=%mydate% --flat=false' )
The filename, directory name, or volume label syntax is incorrect.
ERROR: Job failed: exit status 255
Since there is still no good answer to this question, I will give it a try. I used this snippet to start multiple Docker builds, one for every directory in my repository. Notice the |+ and > characters, which let you put multi-line commands in YAML; they are part of YAML's block scalar syntax.
Linux example:
build:
  stage: loop
  script:
    - |+
      for i in $(seq 1 3)
      do
        echo "Hello $i"
      done
Windows example:
build:
  stage: loop
  script:
    - >
      setlocal enabledelayedexpansion
      for %%a in ("C:\Test\*.txt") do (
        set FileName=%%~a
        echo Filename is: !FileName!
      )
      endlocal
Here is a working example of a job in a .gitlab-ci.yml with a loop, running on a GNU/Linux OS and using an sh/bash shell:
edit:
  stage: edit
  script:
    - for file in $(find ${CI_PROJECT_DIR} -type f -name deployment.yml)
      do
        CURRENT_IMAGE=$(grep "image:" $file | cut -d':' -f2- | tr -d '[:space:]' | cut -d':' -f3)
        sed -ie "s/$CURRENT_IMAGE/$VERSION/g" "$file"
      done
  only:
    - master
I'm not an expert on GitLab Runner on Windows, but Windows Batch is the default shell used; you can also use PowerShell.
In .gitlab-ci.yml, anything you write under "script" is shell, so a for loop works the same way it does in a shell script:
for var in ${NAME_1} ${NAME_2} ${NAME_3} ; do
*----computations----*
done
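For instance, embedded in a job it might look like this (a sketch; the job name, stage, and NAME_* variables are placeholders):
deploy:
  stage: deploy
  script:
    - for var in ${NAME_1} ${NAME_2} ${NAME_3}; do
        echo "Processing $var";
      done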