Argo workflow template execution fails when using a when condition with a withParam loop and an artifact input

I have the following workflow template, which has a when condition (when: "'{{item}}' =~ '^tests/'"), an artifact input (a file path in an AWS S3 bucket), and a withParam loop.
Here is my workflow template:
apiVersion: argoproj.io/v1alpha1
kind: WorkflowTemplate
metadata:
  name: process-wft
spec:
  entrypoint: main
  templates:
    - name: main
      inputs:
        parameters:
          - name: dir-process
            default: true
          - name: dir
        artifacts:
          - name: Code
      dag:
        tasks:
          - name: process-wft-tests
            when: "'{{item}}' =~ '^tests/'"
            templateRef:
              name: tf-wf-rn
              template: main
            arguments:
              parameters:
                - name: dir-process
                  value: "{{inputs.parameters.dir-process}}"
              artifacts:
                - name: Code
                  from: "{{inputs.artifacts.Code}}"
            withParam: "{{inputs.parameters.dir}}"
Here is the extracted result for my input artifact Code, which is passed in from my workflow:
inputs:
  artifacts:
    - archive:
        tar:
          compressionLevel: 9
      archiveLogs: true
      globalName: GitSource
      name: Code
      path: /mnt/out/code
      s3:
        key: process-kfxqf/process-kfxqf-1938174407/GitSource.tgz
When I run my workflow, it gives the following error:
message: failed to resolve {{inputs.artifacts.Code}}
What mistake am I making here? If this approach cannot work, what is an alternative way to get it working?
Note: when I remove the when condition, the workflow runs fine. The issue appears only when the when condition is present.

This is a bug.
It seems that when the condition evaluates to false, some code skips populating the artifact (which makes sense, to save time), but other code doesn't respect the when condition and still expects the artifact to be populated.
Potential workarounds:
- Move the conditional logic into the container:
  - remove the when condition;
  - pass the dir parameter to the main template in your tf-wf-rn WorkflowTemplate;
  - change that main template to run the regex against the dir parameter, and if it doesn't match, just exit 0.
  This could make the workflow much slower, because you'll have to spin up a pod for each iteration of the loop just to determine whether there's any work to do.
- If you can compute all the information about the artifact up front, pass that information as parameters to the main template in your tf-wf-rn WorkflowTemplate, then actually load the artifact in that non-conditioned, non-looped template. (Basically, hopscotch over the problematic code.)
- Try an older version. If you find a working older version, please 1) comment on the bug report, and 2) make sure that older version doesn't have any relevant security vulnerabilities before running it on a production system.
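The first workaround above can be sketched roughly like this (the image and script body are hypothetical; only the dir parameter, Code artifact, and the ^tests/ check come from the question). The regex test moves out of Argo's when and into the template's own script, so the artifact is always resolved:

```yaml
# Sketch of the "conditional logic inside the container" workaround.
# This would be the main template inside the tf-wf-rn WorkflowTemplate.
- name: main
  inputs:
    parameters:
      - name: dir
    artifacts:
      - name: Code
        path: /src
  script:
    image: alpine:3        # hypothetical image
    command: [sh]
    source: |
      # Do the ^tests/ check here instead of in an Argo `when`:
      case "{{inputs.parameters.dir}}" in
        tests/*) echo "processing {{inputs.parameters.dir}}" ;;
        *)       echo "skipping";  exit 0 ;;
      esac
```

The trade-off noted above applies: every loop iteration now starts a pod, even the ones that immediately exit 0.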

Related

How to run a particular stage in GitLab after the execution of child pipelines?

I'm using GitLab, and the CI config has the following stages:
stages:
  - test
  - child_pipeline1
  - child_pipeline2
  - promote-test-reports
At any time, only one of the child pipelines will run after the test stage, i.e. either child_pipeline1 or child_pipeline2, never both.
Now I have added another stage, promote-test-reports, which I would like to run at the end, i.e. after the successful execution of whichever child pipeline ran.
But I'm completely blocked here. The promote-test-reports stage comes from a template that I include in the main CI config file like this:
# Include the template file
include:
  - project: arcesium/internal/vteams/commons/commons-helper
    ref: promote-child-artifacts
    file: 'templates/promote-child-artifacts.yml'
I'm overriding the GitLab project token in the same main file like below:
test:
  stage: promote-test-reports
  trigger:
    include: adapter/child_pipelin1.yml
    strategy: depend
  variables:
    GITLAB_PRIVATE_TOKEN: $GITLAB_TOKEN
In the stage definition above, I'm trying to use strategy: depend to wait for the successful execution of child_pipeline1 before running this stage, but it throws an error (jobs:test config contains unknown keys: trigger). The approach fails because the template's main definition of this stage (promote-test-reports) uses script, and per the documentation, script and strategy cannot be used together.
Following is the definition of this stage in the template:
test:
  stage: promote-test-reports
  image: "495283289134.dkr.ecr.us-east-1.amazonaws.com/core/my-linux:centos7"
  before_script:
    - "yum install unzip -y"
  variables:
    GITLAB_PRIVATE_TOKEN: $GITLAB_TOKEN
  allow_failure: true
  script:
    - 'cd $CI_PROJECT_DIR'
    - 'echo running'
  when: always
  rules:
    - if: $CI_PIPELINE_SOURCE == 'web'
  artifacts:
    reports:
      junit: "**/testresult.xml"
      coverage_report:
        coverage_format: cobertura
        path: "**/**/coverage.xml"
  coverage: '/TOTAL\s+\d+\s+\d+\s+(\d+%)/'
The idea of using the strategy attribute failed, and I cannot remove the script logic from the template either. What is an alternative way to run my promote-test-reports job at the end? Remember, it is an OR condition: it runs after either child_pipeline1 or child_pipeline2.
I would really appreciate your help.
Finally, I was able to do it by putting strategy: depend on the child pipeline trigger jobs. I was doing it incorrectly earlier by putting it on the promote-test-reports stage.
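That arrangement can be sketched like this (the stage and file names are illustrative, not taken from the actual project). Each trigger job carries strategy: depend, so its job only succeeds once the child pipeline finishes, and the promote stage therefore starts after whichever child pipeline ran:

```yaml
# Sketch: strategy:depend sits on the child-pipeline trigger jobs,
# not on the promote stage.
stages:
  - test
  - child_pipelines
  - promote-test-reports

child_pipeline1:
  stage: child_pipelines
  trigger:
    include: adapter/child_pipeline1.yml
    strategy: depend          # wait for the child pipeline to complete

promote-test-reports:
  stage: promote-test-reports
  when: always                # runs after whichever child pipeline ran
  script:
    - echo "promoting test reports"
```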

GitLab runner skips artifacts for failed job with when: always when using "extends"

I have a "parent" job defined in a shared (between projects) yaml "default-ci.yml":
.build_solution:
  stage: exe_builds
  # Declare variables to be overridden
  variables:
    LVSLN: null
    OUTPUT_DIR: null
    # optionally use CONFIGURATION to specify a configuration
  script:
    - . $SCRIPTS_DIR\build_solution.ps1
      -SolutionPath "$LVSLN"
      -OutputDir "$OUTPUT_DIR"
  artifacts:
    when: always
    paths:
      - $ESCAPED_ARTIFACTS_DIR\
    expire_in: 1 week
In the specific project yaml I define values for the variables:
include: "default-ci.yml"

Build My Project:
  extends: .build_solution
  variables:
    LVSLN: build.lvsln
    OUTPUT_DIR: build
With it set up as above, the artifacts are only stored for successful jobs, despite the parent stating when: always. At the end of a failed job's summary, it goes straight from the after-script to Cleaning up project directory and file based variables with no Uploading artifacts for failed job (in other words, it doesn't even try to find or upload artifacts).
But when I move the artifacts section into the child yml, so it becomes
include: "default-ci.yml"

Build My Project:
  extends: .build_solution
  variables:
    LVSLN: build.lvsln
    OUTPUT_DIR: build
  artifacts:
    when: always
    paths:
      - $ESCAPED_ARTIFACTS_DIR\
    expire_in: 1 week
I do get artifacts from failed jobs, with Uploading artifacts for failed job appearing in the summary.
The artifacts section should be the same for all projects extending .build_solution, so I do not want to define it in each child; it should be defined in the parent.
It looks like the parent's artifacts section is not applying or is being overridden, but I can't find why or where that would happen, since the child has no artifacts section. The rest of the .build_solution job works as defined in the parent: I see output from my build_solution.ps1 script, and it was passed the correct parameters.
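One workaround sketch, assuming a GitLab version recent enough to support !reference tags: pull the parent's artifacts section into the child job explicitly, so no merge behavior between extends and the runner's job definition can drop it. This keeps the artifacts definition in the single shared file, which is the stated goal:

```yaml
# Sketch, in the project yaml; assumes GitLab's `!reference` tag
# is available. The artifacts block still lives only in default-ci.yml.
include: "default-ci.yml"

Build My Project:
  extends: .build_solution
  variables:
    LVSLN: build.lvsln
    OUTPUT_DIR: build
  # Explicitly copy the parent's artifacts section instead of
  # relying on extends to merge it.
  artifacts: !reference [.build_solution, artifacts]
```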

Is there a way to extend multiple templates with resources in Azure DevOps?

I have one pipeline that references several templates with resources, and I know you can extend a template with resources by using the extends keyword. I've been looking at the documentation trying to make this work.
azure-pipeline.yaml
trigger: none

resources:
  repositories:
    - repository: repoName
      type: git
      name: project/repoName
    - repository: 'Release Notes'
      name: 'project/Release Notes'
      type: git
  pipelines:
    - pipeline: ooi-adf-ci
      source: ooi-adf-ci

extends:
  - {template: '/cicd/pipelines/templates/stages-deploy-app-registration.yaml'}
  - {template: '/cicd/pipelines/templates/stages-set-app-credentials.yaml'}
  - {template: '/cicd/pipelines/templates/stages-buying-release-apps.yaml'}
  - {template: '/cicd/pipelines/templates/stages-buying-adf.yaml'}
I also tried something like:
extends:
  template:
    [
      '/cicd/pipelines/templates/stages-deploy-app-registration.yaml',
      '/cicd/pipelines/templates/stages-set-app-credentials.yaml',
      '/cicd/pipelines/templates/stages-buying-release-apps.yaml',
      '/cicd/pipelines/templates/stages-buying-adf.yaml',
    ]
template.yaml, where the resource is consumed:
...
resources:
  pipelines:
    - pipeline: ooi-adf-ci
      source: ooi-adf-ci
...
I get an error in my azure-pipeline.yaml file saying "A sequence was not expected". I'm wondering whether this is actually possible, as I haven't been able to find any documentation on it, or whether I'm just doing it incorrectly.
I don't think you can extend multiple templates at the root level, as in your example.
What you can do is compose an azure-pipeline.yaml file where each stage, job, or step comes from a template. I do this a lot and it works perfectly.
The official Azure Pipelines template documentation covers this composition pattern.
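The composition approach can be sketched like this (template paths taken from the question; it assumes each template file defines a list of stages). Instead of extending all the templates at the root, each one is inserted under the stages keyword, while resources stays in the root pipeline:

```yaml
# azure-pipeline.yaml: compose stages from several templates
# rather than extending multiple templates at the root.
trigger: none

resources:
  pipelines:
    - pipeline: ooi-adf-ci
      source: ooi-adf-ci

stages:
  - template: /cicd/pipelines/templates/stages-deploy-app-registration.yaml
  - template: /cicd/pipelines/templates/stages-set-app-credentials.yaml
  - template: /cicd/pipelines/templates/stages-buying-release-apps.yaml
  - template: /cicd/pipelines/templates/stages-buying-adf.yaml
```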

Is there any way to dynamically edit a variable in one job and then pass it to a trigger/bridge job in Gitlab CI?

I need to pass a file path to a trigger job, where the file path is found within a specified JSON file in a separate job. Something along the lines of this:
stages:
  - run_downstream_pipeline

variables:
  FILE_NAME: default_file.json

.get_path:
  stage: run_downstream_pipeline
  needs: []
  only:
    - schedules
    - triggers
    - web
  script:
    - apt-get install jq
    - FILE_PATH=$(jq '.file_path' $FILE_NAME)

run_pipeline:
  extends: .get_path
  variables:
    PATH: $FILE_PATH
  trigger:
    project: my/project
    branch: staging
    strategy: depend
I can't seem to find any workaround for this: using extends won't work, since GitLab won't allow a script section in a trigger job.
I thought about using the GitLab API trigger method, but I want the status of the downstream pipeline to actually show up in the pipeline UI, and I want the upstream pipeline to depend on the status of the downstream pipeline, which from my understanding is not possible when triggering via the API.
Any advice would be appreciated. Thanks!
You can use artifacts:reports:dotenv to set variables dynamically for subsequent jobs.
stages:
  - one
  - two

my_job:
  stage: "one"
  script:
    - FILE_PATH=$(jq -r '.file_path' "$FILE_NAME") # Extract the value (-r emits it without JSON quotes).
    - echo "FILE_PATH=${FILE_PATH}" >> variables.env # Add the value to a dotenv file.
  artifacts:
    reports:
      dotenv: "variables.env"

example:
  stage: two
  script: "echo $FILE_PATH"

another_job:
  stage: two
  trigger:
    project: my/project
    branch: staging
    strategy: depend
Variables in the dotenv file will automatically be available to jobs in subsequent stages (or to jobs that declare needs: on the producing job).
You can also pull artifacts into child pipelines, in general.
But be warned: you probably don't want to override the PATH variable, since that's a special variable the shell uses to find executables.

GitLab CI/CD - Using Both Includes: and Needs:

I'm working off the Auto-DevOps template for my .gitlab-ci.yml, trying to use both include: and needs: as part of a GitLab partner lab. The CI Lint tool says this is valid, but the pipeline fails with "dast: needs 'dast_environment_deploy'". After attempting the code below, I even tried copying the content of the entire dast_environment_deploy template into the file, but still got the same error.
How do I get my pipeline file to use needs: based on an include: template?
image: alpine:latest

stages:
  - build
  - test
  - scan
  - deploy # dummy stage to follow the template guidelines
  - review
  - dast
  - staging
  - canary
  - production
  - incremental rollout 10%
  - incremental rollout 25%
  - incremental rollout 50%
  - incremental rollout 100%
  - performance
  - cleanup

scan:
  stage: scan
  trigger:
    include:
      - template: Security/License-Scanning.gitlab-ci.yml
      - template: Security/Container-Scanning.gitlab-ci.yml
      - template: Security/Dependency-Scanning.gitlab-ci.yml

review:
  needs: ["build"]

dast:
  needs: ["dast_environment_deploy"]

sast:
  needs: []

cache:
  paths:
    - node_modules

include:
  - template: Jobs/Build.gitlab-ci.yml # https://gitlab.com/gitlab-org/gitlab/blob/master/lib/gitlab/ci/templates/Jobs/Build.gitlab-ci.yml
  - template: Jobs/Test.gitlab-ci.yml # https://gitlab.com/gitlab-org/gitlab/blob/master/lib/gitlab/ci/templates/Jobs/Test.gitlab-ci.yml
  - template: Jobs/Code-Quality.gitlab-ci.yml # https://gitlab.com/gitlab-org/gitlab/blob/master/lib/gitlab/ci/templates/Jobs/Code-Quality.gitlab-ci.yml
  - template: Jobs/Code-Intelligence.gitlab-ci.yml # https://gitlab.com/gitlab-org/gitlab/blob/master/lib/gitlab/ci/templates/Jobs/Code-Intelligence.gitlab-ci.yml
  - template: Jobs/Deploy.gitlab-ci.yml # https://gitlab.com/gitlab-org/gitlab/blob/master/lib/gitlab/ci/templates/Jobs/Deploy.gitlab-ci.yml
  - template: Jobs/Deploy/ECS.gitlab-ci.yml # https://gitlab.com/gitlab-org/gitlab/blob/master/lib/gitlab/ci/templates/Jobs/Deploy/ECS.gitlab-ci.yml
  - template: Jobs/Deploy/EC2.gitlab-ci.yml # https://gitlab.com/gitlab-org/gitlab/blob/master/lib/gitlab/ci/templates/Jobs/Deploy/EC2.gitlab-ci.yml
  - template: Jobs/DAST-Default-Branch-Deploy.gitlab-ci.yml # https://gitlab.com/gitlab-org/gitlab/blob/master/lib/gitlab/ci/templates/Jobs/DAST-Default-Branch-Deploy.gitlab-ci.yml
  - template: Jobs/Browser-Performance-Testing.gitlab-ci.yml # https://gitlab.com/gitlab-org/gitlab/blob/master/lib/gitlab/ci/templates/Jobs/Browser-Performance-Testing.gitlab-ci.yml
  - template: Security/DAST.gitlab-ci.yml # https://gitlab.com/gitlab-org/gitlab/blob/master/lib/gitlab/ci/templates/Security/DAST.gitlab-ci.yml
  - template: Security/SAST.gitlab-ci.yml # https://gitlab.com/gitlab-org/gitlab/blob/master/lib/gitlab/ci/templates/Security/SAST.gitlab-ci.yml
  - template: Security/Secret-Detection.gitlab-ci.yml # https://gitlab.com/gitlab-org/gitlab/blob/master/lib/gitlab/ci/templates/Security/Secret-Detection.gitlab-ci.yml
I haven't reviewed each of your included templates, but based on the error and the few I did review, it is most likely caused by the needs keyword referencing a job that isn't added to the pipeline due to a when condition or a rules:if condition.
If a job needs another job, and that other job isn't added to the pipeline (the actual running pipeline instance, not the pipeline definition in .gitlab-ci.yml), the YAML is considered invalid at runtime. You can see all of the requirements and limitations of needs in the docs: https://docs.gitlab.com/ee/ci/yaml/#requirements-and-limitations
Looking at the first included template, Jobs/Build.gitlab-ci.yml, both the build and build_artifact jobs have rules that can result in the job not being added to the pipeline. For example, if the variable $AUTO_DEVOPS_PLATFORM_TARGET is not "EC2", neither job will be added, so any job that needs these jobs will throw a YML error.
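One way to sidestep this, assuming a GitLab version that supports the needs:optional keyword: mark the dependency as optional, so the pipeline stays valid even when rules exclude the needed job from a particular run.

```yaml
# Sketch: an optional need keeps the pipeline valid when
# dast_environment_deploy is not added to this pipeline instance.
dast:
  needs:
    - job: dast_environment_deploy
      optional: true
```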
