How to make a job depend on a job in a previous stage - GitLab

I have the following stages:
stage:
  - A
  - B
job1:
  stage: A
job2:
  stage: A
job3 :
  stage: B
The sequence must be job1 -> job3 -> job2, with each job depending on the previous one.
job1 and job3 work fine, but job2 does not, because it depends on job3 (the previous job in the sequence, which lives in a different stage).
I get the error 'job2 job: need job3 is not defined in current or prior stages'.
Is there a solution to this problem?

You can use the needs keyword to create an acyclic (DAG) relationship between one job and another, which also lets you make a job depend on any job from any earlier stage. In your case, however, you don't need to do that.
Use the needs keyword for more advanced scenarios (like ignoring job failures from the preceding stage).
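For illustration, here is a minimal sketch of that kind of use of needs; the job names below are made up. deploy-job starts as soon as build-job succeeds, even if lint-job (another job in an earlier stage) is still running or fails.

stages:
  - build
  - test
  - deploy

build-job:
  stage: build
  script: echo "build"

lint-job:
  stage: test
  script: echo "lint - may be slow, or may even fail"

deploy-job:
  stage: deploy
  # needs means: start as soon as build-job succeeds, without waiting
  # for the rest of the earlier stages to finish
  needs:
    - build-job
  script: echo "deploy"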
Keep in mind that all jobs that share a stage run in parallel by default. This should work just fine:
stages:
  - first
  - second

job1:
  stage: first
  script: echo "this job will run"

job2:
  stage: first
  script: echo "at the same time as this job"

job3:
  stage: second
  script: echo "this job will run after all jobs in the first stage succeed"
Your question was
"How to make a job depend on a job in a previous stage"
and the above example answers that question. However, in your description you wrote
"...the sequence must be job1 -> job3 -> job2"
which, based on that description, could be achieved like so:
stages:
  - first
  - second
  - third

job1:
  stage: first
  script: echo "this job will run during the first stage"

job2:
  stage: third
  script: echo "this job will run during the third stage"

job3:
  stage: second
  script: echo "this job will run during the second stage"
The above config guarantees that none of your jobs execute in parallel and that they proceed in the sequence you described. You still wouldn't need the needs keyword in this case, and the config is easier to read.
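For comparison, here is a rough sketch of how the same job1 -> job3 -> job2 ordering could be expressed with needs and only two stages; note that a needs relationship between jobs in the same stage requires GitLab 14.2 or later.

stages:
  - first
  - second

job1:
  stage: first
  script: echo "runs first"

job3:
  stage: second
  needs:
    - job1
  script: echo "runs once job1 succeeds"

job2:
  stage: second
  # same-stage needs, supported since GitLab 14.2
  needs:
    - job3
  script: echo "runs once job3 succeeds"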

There is a typo in your code: it has to be stages, not stage.
Below is the working code.
stages:
  - A
  - B

job1:
  stage: A
  script:
    - echo "Job 1"

job2:
  stage: A
  script:
    - echo "Job 2"
  needs:
    - job1

job3:
  stage: B
  script:
    - echo "Job 3"
  needs:
    - job2
I had to remove the tags: keyword as it was specific to my runner.
See, the dependency is shown clearly in the Visualize tab.
And I validated that it triggers job1 first and that job3 waits for it.

Related

whitelist some inherited variables (but not all) in gitlab multi-project pipeline

I'm following the gitlab docs for multi-project pipelines. I'm running on gitlab.com (not enterprise/self-hosted).
I have successfully set up a multi-project pipeline. My question is - is there a way to pass some but not all variables between stages?
Here's a very simple build script for two projects:
Main project:
variables:
  THIS_PROJECT_NAME: trigger-source
  SHARED_ARGUMENT: "hello world!"

stages:
  - build
  - downstream

build-code-job:
  stage: build
  script:
    - echo "${THIS_PROJECT_NAME}"
    - echo "${SHARED_ARGUMENT}"

run-trigger-job:
  stage: downstream
  inherit:
    variables: false
  variables:
    SHARED_ARGUMENT: $SHARED_ARGUMENT
  trigger: my-org/triggers_dest
Triggered project:
variables:
  THIS_PROJECT_NAME: trigger-dest
  SHARED_ARGUMENT: "overwrite me"

stages:
  - test

triggered-job:
  stage: test
  script:
    - echo "${THIS_PROJECT_NAME}"
    - echo "${SHARED_ARGUMENT}"
  only:
    - pipelines
When I run this with inherit: variables: false, the output in the triggered project's build just shows the default values (no variables are passed):
$ echo "${THIS_PROJECT_NAME}"
trigger-dest
$ echo "${SHARED_ARGUMENT}"
overwrite me
However, when I use inherit: variables: true, all variables are passed, except that the value of SHARED_ARGUMENT is actually passed as the literal string "$SHARED_ARGUMENT", which then gets expanded downstream to "overwrite me":
$ echo "${THIS_PROJECT_NAME}"
trigger-source
$ echo "${SHARED_ARGUMENT}"
overwrite me
This is the opposite of what I want! Essentially I want to whitelist variables to pass through, rather than blacklisting them as above. Is there any way to do this?
Found the answer buried in the docs on the inherit: variables keyword. In addition to true/false, you can specify a list of variables to inherit.
Changing the source project's .gitlab-ci.yml to the following:
variables:
  THIS_PROJECT_NAME: trigger-source
  SHARED_ARGUMENT: "hello world!"

stages:
  - build
  - downstream

build-code-job:
  stage: build
  script:
    - echo "${THIS_PROJECT_NAME}"
    - echo "${SHARED_ARGUMENT}"

run-trigger-job:
  stage: downstream
  inherit:
    variables:
      - SHARED_ARGUMENT
  trigger: my-org/triggers_dest
results in the desired output:
$ echo "${THIS_PROJECT_NAME}"
trigger-dest
$ echo "${SHARED_ARGUMENT}"
hello world!

Use $CI_JOB_ID as a constant across multiple stages of Gitlab pipeline

I have the following
stages:
  - stage1
  - stage2

variables:
  MY_ENV_VAR: env_$CI_JOB_ID

stage1_build:
  stage: stage1
  script:
    - echo $MY_ENV_VAR

stage2_build:
  stage: stage2
  script:
    - echo $MY_ENV_VAR
I get different values for $MY_ENV_VAR in the two stages (which means $CI_JOB_ID changes for every job).
What I want is to set $MY_ENV_VAR once, with one value of $CI_JOB_ID, and make it a constant, so that the same value of $MY_ENV_VAR is used across all stages.
Use $CI_PIPELINE_ID instead, which is constant across all jobs in the pipeline.
variables:
  MY_ENV_VAR: env_$CI_PIPELINE_ID
See predefined environment variables for additional reference.
If you really want an environment variable to be created in one job and persist for the rest of the pipeline, you can pass variables between jobs using artifacts:reports:dotenv.
stages:
  - stage1
  - stage2

set_env:
  stage: .pre
  script:
    - echo "MY_ENV_VAR=env_$CI_JOB_ID" > .myenv
  artifacts:
    reports:
      dotenv: .myenv

stage1_build:
  stage: stage1
  script:
    - echo $MY_ENV_VAR

stage2_build:
  stage: stage2
  script:
    - echo $MY_ENV_VAR

how to pass variables between gitlab-ci jobs?

I have a .gitlab-ci.yml like this:
stages:
  - calculation
  - execution

calculation-job:
  stage: calculation
  script: ./calculate_something_and_output_results.sh
  tags:
    - my-runner

execution-job:
  stage: execution
  script: ./execute_something_with_calculation_results.sh foo
  tags:
    - my-runner
The foo argument in execution-job is based on the results of calculation-job. I want to pass the results from one job to the other via variables. How can I do that?
If you're looking to get the results without storing a file anywhere, you can use artifacts:reports:dotenv. This is taken entirely from the DarwinJS shared-variables-across-jobs repo.
stages:
  - calculation
  - execution

calculation-job:
  stage: calculation
  script:
    - |
      # stores new or updates existing env variables, e.g. $OUTPUT_VAR1
      ./calculate_something_and_output_results.sh >> deploy.env
  tags:
    - my-runner
  artifacts:
    reports:
      # propagates variables into the pipeline level, but never stores the actual file
      dotenv: deploy.env

execution-job:
  stage: execution
  script:
    - |
      echo "OUTPUT_VAR1: $OUTPUT_VAR1"
      ./execute_something_with_calculation_results.sh foo
  tags:
    - my-runner
AFAIK it is not possible to pass a variable directly from one job to another. Instead, you have to write the variables into a file and pass that file as an artifact to the receiving job. To make parsing of the file easy, I recommend creating it with bash export statements and sourcing it in the receiving job's script:
calculation-job:
  stage: calculation
  script:
    - ./calculate_something_and_output_results.sh
    - echo "export RESULT1=$calculation_result1" > results
    - echo "export RESULT2=$calculation_result2" >> results
  tags:
    - my-runner
  artifacts:
    name: "Calculation results"
    paths:
      - results

execution-job:
  stage: execution
  script:
    - source ./results
    # You can access $RESULT1 and $RESULT2 now
    - ./execute_something_with_calculation_results.sh $RESULT1 $RESULT2 foo
  tags:
    - my-runner
  needs:
    - calculation-job
Note that the ./ when sourcing results might be necessary with a POSIX-compliant shell that does not source files from the current directory directly, for example a bash started as sh.
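In other words, the receiving job could be written defensively like this (a sketch of the same execution-job as above):

execution-job:
  stage: execution
  script:
    # "." is the portable POSIX equivalent of bash's "source"; the explicit ./
    # makes sure the file is read from the working directory rather than searched on PATH
    - . ./results
    - ./execute_something_with_calculation_results.sh $RESULT1 $RESULT2 foo
  tags:
    - my-runner
  needs:
    - calculation-job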
As a simpler version of what @bjhend answered (no need for export or source statements), since GitLab 13.1 the docs recommend using a dotenv artifact.
stages:
  - calculation
  - execution

calculation-job:
  stage: calculation
  script:
    # Output format must feature one "VARIABLE=value" statement per line (see the docs)
    - ./calculate_something_and_output_results.sh >> calculation.env
  tags:
    - my-runner
  artifacts:
    reports:
      dotenv: calculation.env

execution-job:
  stage: execution
  script:
    # Any variables created above are now in the environment
    - ./execute_something_with_calculation_results.sh
  tags:
    - my-runner
  # The following is technically not needed, but serves as good documentation
  needs:
    - job: calculation-job
      artifacts: true
If you have a job after the calculation stage in which you don't want to use the variables, you can add the following to it:
needs:
  - job: calculation-job
    artifacts: false

Can 2 GitLab jobs save variables in the same .env file?

I have 2 GitLab CI/CD jobs doing some stuff that generates variables.
For example, the first job saves them in a .env file:
job1:
  script:
    - echo "A_VARIABLE=${A_VARIABLE}" >> job.env
  artifacts:
    reports:
      dotenv:
        - job.env
And so does the second job.
job2:
  script:
    - echo "ANOTHER_VARIABLE=${ANOTHER_VARIABLE}" >> job.env
  artifacts:
    reports:
      dotenv:
        - job.env
But can it save to that same job.env file without overwriting the existing data?
I know I could just merge the 2 jobs, but I would prefer not to mix them, to keep things cleaner.
I didn't test it, but from my understanding this should work: saving the file as an artifact will restore it before job2 starts, so it should be possible to append to the file.
I would probably test it first with a simpler structure like this:
job2:
  script:
    - echo "ANOTHER_VARIABLE=${ANOTHER_VARIABLE}" >> job.env
  artifacts:
    paths:
      - job.env
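Putting the idea together, an untested sketch of both jobs might look like the following (the stage names are made up; the key assumption is that job2 runs in a later stage, so job1's job.env artifact is downloaded before job2 appends to it):

stages:
  - one
  - two

job1:
  stage: one
  script:
    - echo "A_VARIABLE=${A_VARIABLE}" >> job.env
  artifacts:
    paths:
      - job.env           # keep the file itself so a later job can append to it
    reports:
      dotenv: job.env

job2:
  stage: two
  script:
    - echo "ANOTHER_VARIABLE=${ANOTHER_VARIABLE}" >> job.env
  artifacts:
    reports:
      dotenv: job.env     # republish the combined file with both variables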

How to fail an Azure DevOps pipeline when test files are not found

I have the following DotNet Test task in my pipeline
- task: DotNetCoreCLI@2
  displayName: 'unit tests'
  inputs:
    command: test
    projects: '**/*Unit*.csproj'
    publishTestResults: true
    arguments: '/p:CollectCoverage=true /p:CoverletOutputFormat=cobertura /p:CoverletOutput=results/'
How can I fail the pipeline if no files match the project pattern '**/*Unit*.csproj'?
Currently, it displays the following warning and moves on to the next task:
##[warning]Project file(s) matching the specified pattern were not found.
Use the Visual Studio Test task. It has a minimumExpectedTests parameter, so if you set it to 1, the task will fail if 0 tests are run.
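A rough sketch of what that could look like (the test assembly pattern below is only an assumption for illustration):

- task: VSTest@2
  displayName: 'unit tests'
  inputs:
    testAssemblyVer2: |
      **\*Unit*.dll
      !**\obj\**
    minimumExpectedTests: 1
    failOnMinTestsNotRun: true   # fail the task when fewer tests than minimumExpectedTests were run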
One could also run a bash script that checks for the existence of the test result file (and that the number of results is greater than 0):
- bash: |
    if [ $(test_results_host_folder)**/*.trx ] && [ $(grep -E "<UnitTestResult" $(test_results_host_folder)**/*.trx -c) -gt 0 ]; then
      echo "##vso[task.setVariable variable=TESTRESULTFOUND]true"
    fi
  displayName: Check if test results file & results >= 1

- script: |
    echo No test result found
    exit 1
  displayName: No test result found
  condition: ne(variables.TESTRESULTFOUND, 'true')
If you have stages in your YAML pipeline, then you can do something like this; the next stage will not run if the previous stage has failed:
stages:
  - stage: unittests
    displayName: 'unit tests'
  - stage: nextstage
    dependsOn: unittests
    displayName: 'unit tests'
As far as I know, you cannot configure the task itself to make the pipeline fail when it cannot find the file.
As a workaround, you could use the Build Quality Checks task from the Build Quality Checks extension.
This task can scan the preceding tasks for warnings; if the number of warnings is greater than the configured upper limit, the pipeline fails.
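A rough sketch of how that could be configured; the task version and input names come from the extension and may differ between extension versions, so treat them as assumptions:

- task: BuildQualityChecks@8
  displayName: 'Check build warnings'
  inputs:
    checkWarnings: true
    warningFailOption: 'fixed'   # compare the warning count against a fixed threshold
    warningThreshold: '0'        # fail the pipeline when more warnings than this are found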
