How to delete the artifacts directory on a GitLab runner after uploading the artifacts to GitLab?

I'm trying to create a GitLab job that shows a metric for test code coverage. To do that, I'm creating a .coverage file and placing it in a directory that uploads artifacts. In a subsequent stage the artifacts are downloaded and consumed by a coverage tool to produce a coverage report. I noticed that the artifacts are not deleted when the GitLab runner finishes the job, and they are bloating my filesystem. How can I remove the artifacts directory after the artifacts are uploaded?
Here's what we currently have:
stages:
  - test
  - build

before_script:
  - export GITLAB_ARTIFACTS_DIR="$(pwd)"/artifacts

[...]

some-test:
  stage: test
  script:
    - [some script that puts something in ${GITLAB_ARTIFACTS_DIR}]
  artifacts:
    expire_in: 4 days
    paths:
      - artifacts/

some-other-test:
  stage: test
  script:
    - [some script that puts something in ${GITLAB_ARTIFACTS_DIR}]
  artifacts:
    expire_in: 4 days
    paths:
      - artifacts/

[...]

coverage:
  stage: build
  before_script:
  script:
    - [our coverage script]
  coverage: '/TOTAL.*\s+(\d+%)$/'
  artifacts:
    expire_in: 4 days
    paths:
      - artifacts/
    when: always

[...]

after_script:
  - sudo rm -rf "${GITLAB_ARTIFACTS_DIR}"
According to https://gitlab.com/gitlab-org/gitlab-runner/issues/4146, after_script does not have access to environment variables exported in before_script or script.
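One way around that limitation (a sketch, not part of the linked issue): use the predefined $CI_PROJECT_DIR variable in after_script instead of a variable exported in before_script, since predefined CI variables are still available there. Note that the runner typically runs after_script before the artifact upload step, so verify on your runner that the artifacts are still uploaded as expected, or move the cleanup into a later job.

after_script:
  # Re-derive the path from a predefined variable; exports made in
  # before_script/script are not visible in after_script.
  - rm -rf "${CI_PROJECT_DIR}/artifacts"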

A solution could be to use cache and artifacts simultaneously.
This config creates a new directory for each job execution, named after the job ID ($CI_JOB_ID):
stages:
  - test

remote:
  stage: test
  script:
    - mkdir cache-$CI_JOB_ID
    - echo hello > cache-$CI_JOB_ID/foo.txt
  cache:
    key: build-cache
    paths:
      - cache-$CI_JOB_ID/
  artifacts:
    paths:
      - cache-$CI_JOB_ID/foo.txt
    expire_in: 1 week
At the next run, the previous cache-$CI_JOB_ID directory will be removed and replaced by a new one (since $CI_JOB_ID will be different). This keeps only one instance of your cached file around until the next job execution.
Note: you need to prefix the directory name with cache-, otherwise the .gitlab-ci.yml is invalid.

Related

Gitlab Ci include local only executes last

I have a lot of different Android flavors of one app to build, so I want to split the build into different YAML files. I currently have my base file .gitlab-ci.yml:
image: alvrme/alpine-android:android-29-jdk11

variables:
  GIT_SUBMODULE_STRATEGY: recursive

before_script:
  - export GRADLE_USER_HOME=`pwd`/.gradle
  - chmod +x ./gradlew

cache:
  key: "$CI_COMMIT_REF_NAME"
  paths:
    - .gradle/

stages:
  - test
  - staging
  - production
  - firebaseUpload
  - slack

include:
  - local: '/.gitlab/bur.yml'
  - local: '/.gitlab/vil.yml'
  - local: '/.gitlab/kom.yml'
I am currently trying to build 3 different flavors, but I don't know why only the last included YAML file gets executed; the first 2 are ignored.
/.gitlab/bur.yml:

unitTests:
  stage: test
  script:
    - ./gradlew testBurDevDebugUnitTest

/.gitlab/vil.yml:

unitTests:
  stage: test
  script:
    - ./gradlew testVilDevDebugUnitTest

/.gitlab/kom.yml:

unitTests:
  stage: test
  script:
    - ./gradlew testKomDevDebugUnitTest
What you observe looks like the expected behavior:
Your three files .gitlab/{bur,vil,kom}.yml contain the same job name unitTests.
So, each include overrides the specification of this job.
As a result, you only get 1 unitTests job in the end, with the specification from the last YAML file.
Thus, the simplest fix would be to change this job name, e.g.:
unitTests-kom:
  stage: test
  script:
    - ./gradlew testKomDevDebugUnitTest
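An alternative sketch (not from the original answer), assuming you want to keep one job per flavor: put a hidden template job in the base .gitlab-ci.yml and extend it with a unique job name in each included file.

# .gitlab-ci.yml (base): hidden template job, never run on its own
.unit-test-template:
  stage: test

# /.gitlab/bur.yml
unitTests-bur:
  extends: .unit-test-template
  script:
    - ./gradlew testBurDevDebugUnitTest

# /.gitlab/vil.yml
unitTests-vil:
  extends: .unit-test-template
  script:
    - ./gradlew testVilDevDebugUnitTest

extends merges the template's keys into each job, so the flavor files stay as small as before while every flavor gets its own job.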

Gitlab pipeline job fails to execute dependency job on branch push but works for master

Below is the CI configuration for my pipeline. Everything works fine when I push to the master branch, but the pipeline fails to execute the dependency job when I push to any branch other than master.
What am I missing?
stages:
  - prep
  - tests

create-users-file:
  stage: prep
  script:
    - ./create_users.sh
  artifacts:
    paths:
      - src/test/resources/data/user.csv
    expire_in: 7 days

AccountSimulation:
  stage: tests
  dependencies:
    - create-users-file
  script:
    - MAVEN_OPTS="-Xms1g -Xmx4g -XX:MaxPermSize=1024m" ./mvnw clean gatling:test -Dgatling.simulationClass=dev.pallet.gatling.simulations.AccountSimulation
  artifacts:
    paths:
      - ./target/gatling/*
    expire_in: 30 days

AnalysisSimulation:
  stage: tests
  dependencies:
    - create-users-file
  script:
    - MAVEN_OPTS="-Xms1g -Xmx4g -XX:MaxPermSize=1024m" ./mvnw clean gatling:test -Dgatling.simulationClass=dev.pallet.gatling.simulations.AnalysisSimulation
  artifacts:
    paths:
      - ./target/gatling/*
    expire_in: 30 days
I figured out what the issue was.
The environment variables were marked as protected, which means they were only available to protected branches, not to all branches.
Unchecking the protected flag on the variables (and masking them to avoid leaking secrets in pipeline logs) resolved the issue.

Get artifacts from previous GIT jobs

I have 3 stages in my pipeline, and each job in all 3 stages creates an XML data file. These jobs run in parallel.
I want to merge all the XML data files in a 4th stage. Below is my YAML code:
stages:
  - deploy
  - test
  - execute
  - artifact

script:
  - XYZ
artifacts:
  name: datafile.xml
  paths:
    - data/
Problem: how can I collect all the XML files from the previous jobs in order to merge them? The file names are unique.
Here is a .gitlab-ci.yml file that collects artifacts into a final artifact (it takes files generated by the earlier stages and puts them all together).
The key is the needs attribute, which pulls in the artifacts from the earlier jobs (with artifacts: true).
stages:
  - stage_one
  - stage_two
  - generate_content

apple:
  stage: stage_one
  script: echo apple > apple.txt
  artifacts:
    paths:
      - apple.txt

banana:
  stage: stage_two
  script: echo banana > banana.txt
  artifacts:
    paths:
      - banana.txt

put_it_all_together:
  stage: generate_content
  needs:
    - job: apple
      artifacts: true
    - job: banana
      artifacts: true
  script:
    - cat apple.txt banana.txt > fruit.txt
  artifacts:
    paths:
      - fruit.txt
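Applied to the question's layout, a sketch might look like the following; the job names deploy_job, test_job, and execute_job are placeholders for the real jobs that write into data/, and cat is only a stand-in for whatever XML merge tool you actually use.

merge_xml:
  stage: artifact
  needs:
    - job: deploy_job      # placeholder names: use your real job names
      artifacts: true
    - job: test_job
      artifacts: true
    - job: execute_job
      artifacts: true
  script:
    # all data/*.xml artifacts from the listed jobs are now in the workspace
    - cat data/*.xml > merged.xml   # replace with a real XML merge step
  artifacts:
    paths:
      - merged.xml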

Gitlab 'exists' rules do not take artifacts into consideration

I am having a problem with some Gitlab CI jobs where I specify a rule to run only if a file exists.
This is my .gitlab-ci.yml:
stages:
  - build
  - test

# Jobs
build:
  stage: build
  script:
    - dotnet restore --no-cache --force
    - dotnet build --configuration Release --no-restore
  artifacts:
    paths:
      - test/
    expire_in: 1 week

unit_tests:
  stage: test
  script: dotnet vstest test/*UnitTests/bin/Release/**/*UnitTests.dll --Blame
  rules:
    - exists:
        - test/*UnitTests/bin/Release/**/*UnitTests.dll

integration_tests:
  stage: test
  script: dotnet vstest test/*IntegrationTests/bin/Release/**/*IntegrationTests.dll --Blame
  rules:
    - exists:
        - test/*IntegrationTests/bin/Release/**/*IntegrationTests.dll
I want to run unit_tests only when there are *UnitTests.dll files under the bin directories in test/, and integration_tests only when there are *IntegrationTests.dll files there.
The problem is that both jobs are completely ignored. GitLab seems to evaluate exists to false, as if the rule were evaluated at the beginning of the pipeline and not at the beginning of the job, even though the paths do exist by then: they are generated in the previous stage and the artifacts are automatically available.
If I remove the rules, unit_tests runs successfully, but integration_tests fails because this particular project has no integration tests.
I've tried replacing exists with changes; same problem.
How can I achieve this conditional job execution?
UPDATE 1: I have an ugly workaround, but the question remains, because exists seems to be evaluated at the beginning of the pipeline and not at the beginning of the job, so anything produced as an artifact is ignored.
The trick works because I assume that wherever there is a .csproj there will be a .dll later on, produced by the build stage.
stages:
  - build
  - test

build:
  stage: build
  script:
    - dotnet restore --no-cache --force
    - dotnet build --configuration Release --no-restore
  artifacts:
    paths:
      - test/
    expire_in: 1 week

unit_tests:
  stage: test
  script: dotnet vstest test/*UnitTests/bin/Release/**/*UnitTests.dll --Blame
  rules:
    - exists:
        - test/*UnitTests/*UnitTests.csproj

integration_tests:
  stage: test
  script: dotnet vstest test/*IntegrationTests/bin/Release/**/*IntegrationTests.dll --Blame
  rules:
    - exists:
        - test/*IntegrationTests/*IntegrationTests.csproj
At the time of writing, it appears that GitLab doesn't support the use of artifact files in rules. This issue confirms that it doesn't work.
My own workaround is to remove the conditional rule and instead write a wrapper script that first checks the presence of the file.
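A minimal sketch of such a wrapper, assuming a shell executor and the same directory layout as the vstest command; it runs the tests only when the build stage actually produced matching DLLs:

unit_tests:
  stage: test
  script:
    - |
      # look for the DLLs produced by the build stage's artifacts
      DLLS=$(find test -path '*UnitTests/bin/Release/*' -name '*UnitTests.dll')
      if [ -n "$DLLS" ]; then
        dotnet vstest $DLLS --Blame
      else
        echo "No *UnitTests.dll found; skipping unit tests."
      fi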

Stop cleanup between two stages in gitlab-runner

Here is my .gitlab-ci.yml
stages:
  - build
  - unit_test_1
  - unit_test_2
  - perf_test

job1:
  stage: build
  script:
    - bash build.sh
  allow_failure: true

job2:
  stage: unit_test_1
  script:
    - bash ./all/deployment/testframwork/unit_test_1.sh
  allow_failure: true
Here build.sh creates a build and stores all the binaries in a build directory, but this directory is deleted after job1 completes.
I need that directory to run my 2nd job.
How can I achieve this?
Use build artifacts. You should set expire_in on the artifacts so the build directory is not stored in your GitLab forever. To control which jobs download which artifacts, use dependencies:
job1:
  artifacts:
    paths:
      - build/
    expire_in: 1 week

job2:
  dependencies:
    - job1

job3:
  dependencies: []
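Applied to the original jobs, a sketch could look like this (assuming build.sh writes its output into build/):

job1:
  stage: build
  script:
    - bash build.sh                # assumed to write its output into build/
  artifacts:
    paths:
      - build/
    expire_in: 1 week
  allow_failure: true

job2:
  stage: unit_test_1
  dependencies:
    - job1                         # downloads job1's build/ artifact into this job's workspace
  script:
    - bash ./all/deployment/testframwork/unit_test_1.sh
  allow_failure: true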
