For a lecture exercise at my university we have to write some code to solve a specific problem. Depending on how well our solution performs, we get assigned a score. This is all achieved by running a single script that prints a result, which makes it perfect to run in CI.
Now, this is a group project, so I'd love to have some way to report the score back in the UI, e.g. on merge requests, so I don't have to scroll through the terminal output each time.
Usually one would use artifact reports for this. While the majority of these seem rather application specific (e.g. junit test reports), it looks like the dotenv report is the closest to what I want.
However, I have found no way to expose the values directly in the UI. To my knowledge, artifact reports should show up as widgets in the merge request view, but I found this not to be the case for me.
What would be a way to expose the value generated when running the CI to quickly see the score a specific branch or commit achieved?
A minimal version of the CI config I currently have:
image: python

run-test:
  script:
    # These are some logs generated by running the validation script
    - echo 1,2,3,4 > some_logs.csv
    # This is the result value I'd like to expose
    - echo result=12.34 > results.env
  artifacts:
    paths:
      - some_logs.csv
    reports:
      dotenv: results.env
My university runs GitLab CE 13.7.
Based on Arty-chan's comment, I updated my CI config to perform a POST request to GitLab's API.
Sample config:
image: python

run-test:
  script:
    # These are some logs generated by running the validation script
    - echo 1,2,3,4 > some_logs.csv
    # This is the result value I'd like to expose
    - echo result=12.34 > results.env
    # Only perform the POST if $CI_MERGE_REQUEST_IID is set
    - 'if [ -n "$CI_MERGE_REQUEST_IID" ]; then curl --request POST --header "PRIVATE-TOKEN: <your API token>" "https://<instance url>/api/v4/projects/$CI_PROJECT_ID/merge_requests/$CI_MERGE_REQUEST_IID/notes?body=<message>"; fi'
  artifacts:
    paths:
      - some_logs.csv
Note that all <...> placeholders need to be replaced with the appropriate values. Also note that $CI_MERGE_REQUEST_IID should be set on merge requests, but I wasn't able to get this to work consistently.
You will want to store <your API token> as a CI/CD variable so as not to leak it. Preferably use a project access token for this.
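A likely reason for the inconsistency: the CI_MERGE_REQUEST_* variables are only populated in merge request pipelines, not in plain branch pipelines. A minimal sketch that restricts the job (the same run-test job as above) to merge request pipelines:
run-test:
  only:
    - merge_requests
  script:
    - echo "Commenting on MR $CI_MERGE_REQUEST_IID"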
Surely many of you have encountered this, so I would like to share my hacky solution. Essentially, during the CI/CD process of a GitLab pipeline, most parameters are passed through variables. There are two issues I've encountered with that:
Those parameters cannot be altered at runtime. Say I want to execute jobs based on information from previous jobs; that information would always need to be saved in the cache, as opposed to being written to CI/CD variables.
Which jobs execute is evaluated before any script runs, so the rules only ever apply to the original parameters. Trouble arises when those are only available at runtime.
For complex pipelines, one would want to pick and choose the tests automatically without having to respecify parameters every time. In my case I delivered a data product, and depending on its content, different steps had to be taken. How do we deal with these issues?
Changing parameters in real time:
The project-level variables API (https://docs.gitlab.com/ee/api/project_level_variables.html) provides a way of interacting with CI/CD variables. This will not work for variables defined at the head of the YML file under the variables keyword; rather, it is a way to access the "custom CI/CD variables" described at https://docs.gitlab.com/ee/ci/variables/#custom-cicd-variables. This way, custom variables can be created, altered, and deleted while a pipeline is running. The only thing needed is a PRIVATE-TOKEN that has access rights to the API (I believe my token has all the rights).
job:
  stage: example
  script:
    - 'curl --request PUT --header "PRIVATE-TOKEN: $ACCESS_TOKEN" "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/variables/VARIABLE_NAME" --form "value=abc"'
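Since a variable has to exist before it can be updated with PUT, it can be created once through the same API. A sketch, assuming the same $ACCESS_TOKEN, with a line like this in a script block:
    # create VARIABLE_NAME once with a default value
    - 'curl --request POST --header "PRIVATE-TOKEN: $ACCESS_TOKEN" "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/variables" --form "key=VARIABLE_NAME" --form "value=default"'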
On to the next problem: altering the variables won't actually let us control downstream jobs like this, because the rules block is evaluated when the pipeline is created, before any script runs. Hence the rule sees the variable's value from before the curl request is sent.
job2:
  stage: after_example
  rules:
    - if: $VARIABLE_NAME == "abc"
  script:
    - env
The way to avoid that is child pipelines. A child pipeline is initialized from inside the parent pipeline and evaluates the variables anew. A full example should illustrate the point.
variables:
  PARAMETER: "Can't be changed"

stages:
  - example
  - after_example
  - finally

job_1:
  # Changes VARIABLE_NAME to "abc" at runtime; VARIABLE_NAME has to exist already
  stage: example
  script:
    - 'curl --request PUT --header "PRIVATE-TOKEN: $ACCESS_TOKEN" "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/variables/VARIABLE_NAME" --form "value=abc"'

job_2.1:
  # This won't be triggered, assuming "abc" was not the value of VARIABLE_NAME before job_1
  stage: after_example
  rules:
    - if: $VARIABLE_NAME == "abc"
  script:
    - env

job_3:
  stage: after_example
  trigger:
    include:
      - local: downstream.yml
    strategy: depend

job_4:
  stage: finally
  script:
    - echo "Make sure to properly clean up your variables to a default value"

# inside downstream.yml
stages:
  - downstream

job_2.2:
  # This will run because the child pipeline is initialized after job_1
  stage: downstream
  rules:
    - if: $VARIABLE_NAME == "abc"
  script:
    - env
This snippet probably won't run as-is, but it illustrates the point rather nicely. job_2 should be executed based on an action that happens in job_1. While the variable will be updated by the time we reach job_2.1, the rules check happens beforehand, so job_2.1 will never be executed. Child pipelines do their rules check during the runtime of the parent pipeline, which is why job_2.2 does run.
This variant is quite hacky and probably really inefficient, but for all intents and purposes it gets the job done.
I'm absolutely new to this whole CI/CD thing, and its documentation is quite extensive, so I'd like to apologize in advance if this is too easy a question.
I have a GitLab repo with many configs for different services, e.g.:
project_folder
--global_conf.json
--service1
----config_folder
------....
--service2
----config_folder
------....
--service3
----config_folder
------....
I'd like to know what I should do (just a general plan and some keywords to search the documentation more precisely would already be fine) to send different command strings, such as "check {service_name} for changes" or "check {global_config} for changes", to the hosts [host1:port, host2:port, host3:port], depending on which file was committed.
I already have a service on each host that can perform different operations for different task strings, so I just need to send them the task.
You'll probably end up using only: changes, as explained here.
Something like this should fit:
stages:
  - trigger

service1:
  stage: trigger
  script:
    - curl http://host1:port1
  only:
    changes:
      - service1/*

service2:
  stage: trigger
  script:
    - curl http://host2:port2
  only:
    changes:
      - service2/*
My use case for this is visualizing the performance of my software, so let me briefly describe what I'm doing and where I'm stuck.
I have my source code in GitLab
Compile and run some tests in the CI for each commit
Measure the time it took for the test run and save it to a file
Upload the file with the time as an artifact
--------- From here on I don't know how to achieve it.
Run some new job that reads all timing files of the previous artifacts
Plot the times, probably with Python and save the image as SVG in the "main" repository
Show the image on the GitLab start page (README.md should probably include it)
Now I see which commits had which impact on my software's performance
No idea whether I'm asking for the impossible or not. I hope someone can help me, as I'm not a CI expert. Maybe a single expression is already enough to google the solution, but I don't even know how to formulate this.
Thanks everyone :)
Committing Images to Main
You can't just save an SVG image to the main repo from a pipeline job. You would need to make a commit. Not only would that pollute your git history and bulk up your repo, but it could also trigger a new pipeline, resulting in an endless loop.
There are ways around the endless loop, e.g. by controlling which sources/branches trigger pipelines or by prefixing the commit message with [skip ci], but it can get complicated and is probably not worth it. The truth is GitLab cannot do exactly what you want, so you will have to compromise somewhere.
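If you do go the commit-from-a-pipeline route, one way to control which sources trigger pipelines is a workflow rule that skips pipelines started by the committing account. A sketch, assuming the image commits are pushed by a hypothetical bot user named metrics-bot:
workflow:
  rules:
    # skip pipelines for commits pushed by the (hypothetical) bot account
    - if: $GITLAB_USER_LOGIN == "metrics-bot"
      when: never
    - when: always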
Generate Metrics Graphs From Artifacts
You can collect metrics from past pipelines in a CSV file and save it as an artifact.
Add this to a reusable script called add_metrics.sh:
#!/bin/bash

HTTP_HEADER="PRIVATE-TOKEN: $YOUR_ACCESS_TOKEN"
URL_START="https://gitlab.example.com/api/v4/projects/$CI_PROJECT_ID/jobs/artifacts"
URL_END="raw/<path/to/artifact>/metrics.csv?job=$CI_JOB_NAME"
COLUMN_HEADERS="Date,Time,Branch,Commit SHA,Test Time (s),Code Coverage (%)"

# download the latest artifact (--fail makes curl report HTTP errors via its exit code)
if curl --fail --location --header "$HTTP_HEADER" --output metrics.csv "$URL_START/$CI_COMMIT_BRANCH/$URL_END"
then echo "Feature branch artifact downloaded."
elif curl --fail --location --header "$HTTP_HEADER" --output metrics.csv "$URL_START/master/$URL_END"
then echo "Master branch artifact downloaded."
else echo "$COLUMN_HEADERS" > metrics.csv
fi

# append a data sample row to the CSV
NOW_DATE=$(date +"%F")
NOW_TIME=$(date +"%T")
echo "$NOW_DATE,$NOW_TIME,$CI_COMMIT_BRANCH,$CI_COMMIT_SHA,$TEST_TIME,$CODE_COVERAGE" >> metrics.csv

# keep only the last 50 lines
echo "$(tail -50 metrics.csv)" > metrics.csv
Then call it from your pipeline in .gitlab-ci.yml:
job_name:
  script:
    - TEST_TIME=10
    - CODE_COVERAGE=85
    - chmod +x add_metrics.sh
    - bash add_metrics.sh
  artifacts:
    paths:
      - metrics.csv
    expire_in: 1 month
Note: You will have to create a personal token and add it to a masked variable. I will also leave it up to you to populate the data metrics, like test time, code coverage, etc.
Explanation of Code
Download the latest artifact for the current branch.
The first commit of a feature branch will not find a "latest" artifact. If that happens, download the latest artifact from master.
The first time the script runs, master won't even have a "latest" artifact, so create a new CSV file.
APPEND the current sample to the end of the CSV file. Delete old samples to keep a fixed number of data points. You can add date, pipeline ID and other metrics.
Store the updated artifact.
To view the graph, download the artifact from the GitLab UI and view it in a spreadsheet app.
Publish to Pages
Using Python (pandas, matplotlib), you can generate an image of the plot and publish it to Gitlab Pages from your master branch pipeline. You can have a static HTML page in your repository referencing the same image filename, and keep replacing the same image from your pipeline. You can also add more useful metrics, such as code coverage.
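A sketch of what that publishing job could look like. The pages job name and the public artifact directory are how GitLab Pages works; plot_metrics.py is a hypothetical script that reads metrics.csv with pandas and writes public/metrics.svg with matplotlib:
pages:
  stage: deploy
  image: python
  script:
    - pip install pandas matplotlib
    - mkdir -p public
    # hypothetical helper: reads metrics.csv, writes public/metrics.svg
    - python plot_metrics.py
    # static page in the repo that references metrics.svg
    - cp index.html public/
  artifacts:
    paths:
      - public
  only:
    - master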
Let's say I have a test block for all my microservices (15-20+). Tests take a long time, since there are so many disparate modules in this monorepo.
Let's say I only want to run one or maybe two at a time, if and only if specific code changes have been made underneath a path. How can I best do this? For assembling I do something like this (not sure if this is terrible or not).
Ultimately, I'm trying to build and test things only if they're relevant (based on whether they, or a related module I can define, have changed).
Module-specific assembles
x:
  stage: build
  image: gradle:6.0-jdk11
  script:
    - gradle :x:assemble
  artifacts:
    paths:
      - x/build/libs
  only:
    changes:
      - x
      - x/*
      - x/*/**

y_build:
  stage: build
  image: gradle:6.0-jdk11
  script:
    - gradle :y:assemble
  artifacts:
    paths:
      - y/build/libs
  only:
    changes:
      - y
      - y/*
      - y/*/**
Current block for testing
test:
  stage: test
  image: gradle:6.0-jdk11
  services:
    - name: gitlab-registry.company.com/nap/dynamodb-local:1252954
      command: [ "-inMemory", "-sharedDb" ]
      alias: dynamodb
  script: gradle check
There are many ways a GitLab CI pipeline can be triggered, but the basic one is a pushed commit, no matter what the change is. Currently, there isn't a way to inspect which parts of the code changed and run some steps but not others based on the result, but you can control which steps run based on the branch or tag name.
So for example, you could have steps that only run when the branch name starts with something like "microservice_3_", and then make sure that when editing microservice 3, you always start the branch name with "microservice_3_", though this could get complicated the more microservices you have to support.
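A sketch of such a branch-name rule, with a hypothetical job for microservice 3 (only also accepts regexes for ref names):
test_microservice_3:
  stage: test
  image: gradle:6.0-jdk11
  script:
    # hypothetical Gradle module path
    - gradle :microservice3:check
  only:
    - /^microservice_3_.*/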
The easier option (in terms of the pipeline definition) is to maintain the microservices in separate repositories, each with their own pipelines and tests. This way each individual pipeline only runs against a specific component and doesn't care about changes to the others. Then if you need to, you can combine the microservices in another repository (included as git submodules) and have a pipeline that only cares about the services as a whole.
This would add extra overhead while developing the project(s), but it makes the pipelines easier to manage.
In my pipeline, I'd like to have a job run only if the merge request's target branch is a certain branch, say master or release.
Is this possible?
I've read through https://docs.gitlab.com/ee/ci/variables/ and unless I missed something, I'm not seeing anything that can help.
Update: 2019-03-21
GitLab has had variables for merge request info since version 11.6 (https://docs.gitlab.com/ce/ci/variables/, see the variables starting with CI_MERGE_REQUEST_). But these variables are only available in merge request pipelines (https://docs.gitlab.com/ce/ci/merge_request_pipelines/index.html).
To configure a CI job for merge requests, we have to set:
only:
  - merge_requests
And then we can use CI_MERGE_REQUEST_* variables in those jobs.
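For example, a minimal job (the job name is hypothetical) that only runs in merge request pipelines and prints the target branch:
mr_info:
  only:
    - merge_requests
  script:
    - echo "MR !$CI_MERGE_REQUEST_IID targets $CI_MERGE_REQUEST_TARGET_BRANCH_NAME"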
The biggest pitfall here is that only: merge_requests has completely different behavior from the normal only/except parameters.
usual only/except parameters:
(https://docs.gitlab.com/ce/ci/yaml/README.html#onlyexcept-basic)
only defines the names of branches and tags for which the job will run.
except defines the names of branches and tags for which the job will not run.
only: merge_requests: (https://docs.gitlab.com/ce/ci/merge_request_pipelines/index.html#excluding-certain-jobs)
The behavior of the only: merge_requests parameter is such that only jobs with that parameter are run in the context of a merge request; no other jobs will be run.
I found it hard to reorganize jobs to make them work as before once only: merge_requests existed on any job. Thus I'm still using the one-liner from my original answer to get MR info in a CI job.
Original answer:
No.
But GitLab has this feature planned for 2019 Q2: https://gitlab.com/gitlab-org/gitlab-ce/issues/23902#final-assumptions
Currently, we can use a workaround to achieve this. The method is as Rekovni's answer described, and it actually works.
Here's a simple one-liner to get the target branch of an MR from the current branch:
script: # in any script section of gitlab-ci.yml
  - 'CI_TARGET_BRANCH_NAME=$(curl -LsS -H "PRIVATE-TOKEN: $AWESOME_GITLAB_API_TOKEN" "https://my.gitlab-instance.com/api/v4/projects/$CI_PROJECT_ID/merge_requests?source_branch=$CI_COMMIT_REF_NAME" | jq --raw-output ".[0].target_branch")'
Explanation:
CI_TARGET_BRANCH_NAME is a newly defined variable which stores the resolved target branch name. Defining a variable is not strictly necessary, depending on your usage.
AWESOME_GITLAB_API_TOKEN is a variable configured in the repository's CI/CD settings. It is a GitLab personal access token (created in User Settings) with api scope.
About the curl options: -L makes curl follow HTTP redirections, -sS makes curl silent (-s) but still show (-S) errors, and -H passes the authorization header for accessing the GitLab API.
The API used can be found at https://docs.gitlab.com/ce/api/merge_requests.html#list-project-merge-requests. We use the source_branch attribute to figure out which MR the current pipeline is running for. Thus, if a source branch has multiple MRs to different target branches, you may want to change the part after | and apply your own logic.
About jq (https://stedolan.github.io/jq/): it's a simple CLI utility for dealing with JSON (which is what the GitLab API returns). You could use node -p or any method you want.
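Once resolved, the variable can drive branching logic within the same script section. A sketch, where run_extra_checks.sh is a hypothetical script:
script:
  - 'CI_TARGET_BRANCH_NAME=$(curl -LsS -H "PRIVATE-TOKEN: $AWESOME_GITLAB_API_TOKEN" "https://my.gitlab-instance.com/api/v4/projects/$CI_PROJECT_ID/merge_requests?source_branch=$CI_COMMIT_REF_NAME" | jq --raw-output ".[0].target_branch")'
  # run extra validation (hypothetical script) only when the MR targets master
  - 'if [ "$CI_TARGET_BRANCH_NAME" = "master" ]; then ./run_extra_checks.sh; fi'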
Because of the new env variables in 11.6, $CI_MERGE_REQUEST_SOURCE_BRANCH_NAME and $CI_MERGE_REQUEST_TARGET_BRANCH_NAME, jobs can be included or excluded based on the source or target branch.
Using the only and except (complex) expressions, we can build rules to filter merge requests. A couple of examples:
Merge request where the target branch is master:
only:
  refs:
    - merge_requests
  variables:
    - $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "master"
Merge request except if the source branch is master or release:
only:
  - merge_requests
except:
  variables:
    - $CI_MERGE_REQUEST_SOURCE_BRANCH_NAME == "master"
    - $CI_MERGE_REQUEST_SOURCE_BRANCH_NAME == "release"
If you want to use multiple refs (let's say merge_requests and tags) and multiple variables, the refs will be OR'd, the variables will be OR'd, and the result will be AND'd:
If any of the conditions in variables evaluates to truth when using only, a new job is going to be created. If any of the expressions evaluates to truth when except is being used, a job is not going to be created.
If you use multiple keys under only or except, they act as an AND. The logic is:
(any of refs) AND (any of variables) AND (any of changes) AND (if kubernetes is active)
Variable expressions are also quite primitive, supporting only equality and (basic) regex. Because the variables will be OR'd, you cannot specify both a source and a target branch as of GitLab 11.6, just one or the other.
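For instance, a sketch that accepts two different target branches: the variables expressions are OR'd with each other and then AND'd with the merge_requests ref, so the job runs on MRs targeting either master or release:
only:
  refs:
    - merge_requests
  variables:
    - $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "master"
    - $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "release"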
As of GitLab 11.6, there is CI_MERGE_REQUEST_TARGET_BRANCH_NAME.
If this is what you're really after, there could be an extremely convoluted (and untested) way to achieve it using the merge requests API and CI variables.
With a workflow / build step something like:
Create merge request from feature/test to master
Start a build
Using the API (in a script), grab all open merge requests from the current project using CI_PROJECT_ID variable, and filter by source_branch and target_branch.
If there is a merge request open with the source_branch and target_branch being feature/test and master respectively, continue with the build, otherwise just skip the rest of the build.
For using the API, I don't believe you can use the CI_JOB_TOKEN variable to authenticate, so you'll probably need to create your own personal access token and store it as a CI variable to use in the build job.
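A sketch of that check as a script step. The instance URL, the MY_API_TOKEN variable, and the jq dependency are assumptions:
check_mr:
  script:
    - 'MR_COUNT=$(curl -sS --header "PRIVATE-TOKEN: $MY_API_TOKEN" "https://gitlab.example.com/api/v4/projects/$CI_PROJECT_ID/merge_requests?state=opened&source_branch=$CI_COMMIT_REF_NAME&target_branch=master" | jq length)'
    # no matching open MR: end the job successfully and skip the rest
    - 'if [ "$MR_COUNT" -eq 0 ]; then echo "No open MR from this branch to master, skipping."; exit 0; fi'
    - echo "Matching MR found, continuing with the build..."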
Hope this helps!
Another example, but using rules:
rules:
  # pipeline should run on merge requests targeting the master branch
  - if: $CI_PIPELINE_SOURCE == 'merge_request_event' && $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == 'master'
    when: always
  # pipeline should run on merge requests targeting the release branch
  - if: $CI_PIPELINE_SOURCE == 'merge_request_event' && $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == 'release'
    when: always
  - when: never
GitLab CI is agnostic of merge requests (for now). Since the pipeline runs on the origin branch, you will not be able to retrieve the destination branch.