gitlab-ci: set up daily builds - gitlab

I really don't understand how I can set up a daily scheduler in GitLab. I have a simple application and I need to build it automatically every day at 8:00 in the morning.
I tried following https://gitlab.com/help/ci/triggers/README.md, but I don't understand how I can run this cron job:
30 0 * * * curl --request POST --form token=TOKEN --form ref=master https://gitlab.example.com/api/v3/projects/9/trigger/builds
This http://cloudlady911.com/index.php/2016/11/02/how-to-schedule-a-job-in-gitlab-8-13/ is also unacceptable,
because I have to run it manually from the pipeline.
Any solutions?

You can now set up schedules natively in GitLab to run any pipeline each day.
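You create one in the project's Pipeline schedules page (under CI/CD), or you can script it. As a sketch of the equivalent API call against the Pipeline Schedules API (assuming project ID 9, a personal access token stored in GITLAB_TOKEN, and gitlab.example.com as the instance; adjust these to your setup):
curl --request POST \
  --header "PRIVATE-TOKEN: $GITLAB_TOKEN" \
  --form description="Daily build at 08:00" \
  --form ref="master" \
  --form cron="0 8 * * *" \
  --form cron_timezone="UTC" \
  --form active="true" \
  "https://gitlab.example.com/api/v4/projects/9/pipeline_schedules"
A schedule created this way shows up in the same Pipeline schedules page, where it can also be edited or run manually.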

Whether you craft a script or just run cURL directly, you can trigger
jobs in conjunction with cron. The example below triggers a job on the
master branch of project with ID 9 every night at 00:30:
30 0 * * * curl --request POST --form token=TOKEN --form ref=master https://gitlab.example.com/api/v3/projects/9/trigger/builds
This triggers the script in your .gitlab-ci.yml. The assumption is that you have your deployment script prepared in this file, so it will execute the stages step by step, and if one of your stages is a deployment, it will deploy your application.
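For the 08:00 build asked about in the question, a minimal crontab sketch would be the following (assuming the newer /api/v4 trigger endpoint, the same project ID 9, and a trigger token in TOKEN):
0 8 * * * curl --request POST --form token=TOKEN --form ref=master https://gitlab.example.com/api/v4/projects/9/trigger/pipeline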

Related

GitLab API for automating a TAG gets an error

I established a CI/CD pipeline with GitLab. I've created a variable called ADD_TAG to make an automated TAG.
I've put in the script below:
REF=$$2
TAG_NAME=$$1
TOKEN=$$ANDROID_CHANGELOG_PRIVATE_TOKEN
URL="$${CI_SERVER_URL}/api/v4/projects/$${CI_PROJECT_ID}/repository/tags"
PARAMS="tag_name=$${TAG_NAME}&ref=$${REF}"
curl --fail --request POST --header "PRIVATE-TOKEN: $${TOKEN}" "$${URL}?$${PARAMS}"
but I got the error below:
curl: (3) URL using bad/illegal format or missing URL
Please help me solve it.
Use a single $ when referencing variables.
https://docs.gitlab.com/ee/ci/variables/index.html#predefined-cicd-variables
For example:
URL="$${CI_SERVER_URL}/api/v4/projects/$${CI_PROJECT_ID}/repository/tags"
should be
URL="${CI_SERVER_URL}/api/v4/projects/${CI_PROJECT_ID}/repository/tags"

Update task status from external application in Databricks

I have a workflow with a task that depends on an external application's execution (the application does not reside in Databricks). After the external application finishes, how can I update the status of the task to complete? Currently, the Jobs API doesn't support status updates.
Runs Cancel is available:
curl --netrc --request POST \
https://<databricks-instance>/api/2.0/jobs/runs/cancel \
--data '{ "run_id": <run-id> }'
More details are in the Databricks Jobs API documentation for Runs Cancel.
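As a sketch of how the external application could look up the active run and cancel it (the host and token variables, the job ID 123, and the jq filter are illustrative assumptions, not part of the original answer):
# find the currently active run of job 123, then cancel it
RUN_ID=$(curl -s --header "Authorization: Bearer $DATABRICKS_TOKEN" \
  "$DATABRICKS_HOST/api/2.0/jobs/runs/list?job_id=123&active_only=true" | jq '.runs[0].run_id')
curl --request POST --header "Authorization: Bearer $DATABRICKS_TOKEN" \
  "$DATABRICKS_HOST/api/2.0/jobs/runs/cancel" \
  --data "{\"run_id\": $RUN_ID}"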

Download the latest artifacts of a failed GitLab pipeline

I want to automatically load test results of my gitlab pipeline (basically one xml file) into my project management application. No direct connection between these two is possible. However the gitlab API offers the following options to download artifacts of a pipeline:
all artifacts of the latest successful pipeline run (selected by job name)
GET /projects/:id/jobs/artifacts/:ref_name/download?job=name
all artifacts of a specific job (selected by job id)
GET /projects/:id/jobs/:job_id/artifacts
a single artifact file (selected by job id)
GET /projects/:id/jobs/:job_id/artifacts/*artifact_path
My current situation is the following:
I have test reports which are saved inside the job artifacts when running the pipeline. The artifacts are created on every run of the pipeline, independent of its outcome:
gitlab-ci.yaml
...
artifacts:
  when: always
...
The artifact I am trying to download has a dynamic name
./reports/junit/test-results-${CI_JOB_ID}.xml
If I now want to download the latest test results to a server other than the GitLab server, I have to realize that I don't know the latest job ID, which means:
I can't access the artifact directly because it has a dynamic name
I can't access the artifacts of a specific job
I can access the artifacts of the latest job, but only if it was successful
This leaves me in the situation that I can only download the latest test results if nothing went wrong while testing. To put it mildly, this is suboptimal.
Is there some way to download the artifacts from the latest job run (without knowing the job ID), independent of its outcome?
Is there some way to download the artifacts from the latest job run (without knowing the job ID), independent of its outcome?
In order to achieve this, we will use the GitLab API in combination with the jq package.
Let's break down this question into components.
Firstly, we need to find out the ID of the last executed pipeline for this project: https://docs.gitlab.com/ee/api/pipelines.html#list-project-pipelines
GET /projects/:id/pipelines
For this call you will need an access token; if you don't have one already, check:
https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html#create-a-personal-access-token
https://docs.gitlab.com/ee/user/project/settings/project_access_tokens.html#create-a-project-access-token
You will also need the project ID:
LAST_PIPELINE_ID=$(curl -s --header "PRIVATE-TOKEN: <access_token>" https://gitlab.com/api/v4/projects/<project_id>/pipelines | jq '.[0].id')
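The pipelines endpoint also accepts query parameters such as ref and per_page, so you can narrow the call to a single branch if needed, for example (a sketch, assuming the branch is main):
LAST_PIPELINE_ID=$(curl -s --header "PRIVATE-TOKEN: <access_token>" "https://gitlab.com/api/v4/projects/<project_id>/pipelines?ref=main&per_page=1" | jq '.[0].id')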
Next, we will retrieve the job ID by providing the job name, using the following API:
https://docs.gitlab.com/ee/api/jobs.html#list-pipeline-jobs
GET /projects/:id/pipelines/:pipeline_id/jobs
In your case, you need to replace the job name in the following example with your own; in this example, let's call it my_job:
JOB_ID=$(curl -s --header "PRIVATE-TOKEN: <access_token>" https://gitlab.com/api/v4/projects/<project_id>/pipelines/$LAST_PIPELINE_ID/jobs | jq '.[] | select(.name=="my_job") | .id')
Now we are ready to actually retrieve the artifacts, with the following API
GET /projects/:id/jobs/:job_id/artifacts
https://docs.gitlab.com/ee/api/job_artifacts.html#get-job-artifacts
wget -U "Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.17 (KHTML,like Gecko) Ubuntu/11.04 Chromium/11.0.654.0 Chrome/11.0.654.0 Safari/534.17" --header "PRIVATE-TOKEN: <access_token>" "https://gitlab.com/api/v4/projects/<project_id>/jobs/$JOB_ID/artifacts" -O artifacts.zip
The artifacts are available as artifacts.zip in the folder you executed wget from.
Combining the steps here for clarity:
LAST_PIPELINE_ID=$(curl -s --header "PRIVATE-TOKEN: <access_token>" https://gitlab.com/api/v4/projects/<project_id>/pipelines | jq '.[0].id')
JOB_ID=$(curl -s --header "PRIVATE-TOKEN: <access_token>" https://gitlab.com/api/v4/projects/<project_id>/pipelines/$LAST_PIPELINE_ID/jobs | jq '.[] | select(.name=="my_job") | .id')
wget -U "Mozilla/5.0 (X11; U; Linux i686; en-US) AppleWebKit/534.17 (KHTML,like Gecko) Ubuntu/11.04 Chromium/11.0.654.0 Chrome/11.0.654.0 Safari/534.17" --header "PRIVATE-TOKEN: <access_token>" "https://gitlab.com/api/v4/projects/<project_id>/jobs/$JOB_ID/artifacts" -O artifacts.zip

How to download artifacts using a URL in a GitLab job?

In the GitLab documentation, some URLs are described for the purpose of downloading artifacts from pipelines. They seem to have forgotten to describe HOW to download artifacts given these URLs.
Can it be done in a simple way? Or do I have to install e.g. wget, create a token, define a token, use a token?
If someone could give an example that would be great.
The documentation referenced in the question is supposed to be a REST API call, using the Job Artifacts API:
GET /projects/:id/jobs/artifacts/:ref_name/download?job=name
To use this in a script definition inside .gitlab-ci.yml, you can use either:
The JOB-TOKEN header with the GitLab-provided CI_JOB_TOKEN variable.
For example, the following job downloads the artifacts of the test job of the main branch.
The command is wrapped in single quotes because it contains a colon (:):
artifact_download:
  stage: test
  script:
    - 'curl --location --output artifacts.zip --header "JOB-TOKEN: $CI_JOB_TOKEN" "https://gitlab.example.com/api/v4/projects/$CI_PROJECT_ID/jobs/artifacts/main/download?job=test"'
Or the job_token attribute with the GitLab-provided CI_JOB_TOKEN variable.
For example, the following job downloads the artifacts of the test job of the main branch:
artifact_download:
  stage: test
  script:
    - 'curl --location --output artifacts.zip "https://gitlab.example.com/api/v4/projects/$CI_PROJECT_ID/jobs/artifacts/main/download?job=test&job_token=$CI_JOB_TOKEN"'
But the artifacts: directive is meant to store data from the job workspace, for a new iteration of the job to get back, in the same folder.
No "download" involved, as illustrated in the article "GitLab CI: Cache and Artifacts explained by example" by Anton Yakutovich.
As such, no curl/wget/TOKEN should be needed to access an artifact stored by a previous job execution.

How to change the schedule of a Kubernetes cronjob or how to start it manually?

Is there a simple way to change the schedule of a kubernetes cronjob like kubectl change cronjob my-cronjob "10 10 * * *"? Or any other way without needing to do kubectl apply -f deployment.yml? The latter can be extremely cumbersome in a complex CI/CD setting because manually editing the deployment yaml is often not desired, especially not if the file is created from a template in the build process.
Alternatively, is there a way to start a cronjob manually? For instance, a job is scheduled to start in 22 hours, but I want to trigger it manually once now without changing the cron schedule for good (for testing or an initial run)?
You can update only the selected field of a resource by patching it:
patch -h
Update field(s) of a resource using strategic merge patch, a JSON merge patch, or a JSON patch.
JSON and YAML formats are accepted.
Please refer to the models in
https://htmlpreview.github.io/?https://github.com/kubernetes/kubernetes/blob/HEAD/docs/api-reference/v1/definitions.html
to find if a field is mutable.
As provided in a comment, for reference:
kubectl patch cronjob my-cronjob -p '{"spec":{"schedule": "42 11 * * *"}}'
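You can check that the patch took effect with (assuming the cronjob lives in the current namespace):
kubectl get cronjob my-cronjob -o jsonpath='{.spec.schedule}'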
Also, in current kubectl versions, to launch a one-time execution of a declared cronjob, you can manually create a job that adheres to the cronjob spec with:
kubectl create job --from=cronjob/mycron
The more recent versions of k8s (from 1.10 on) support the following command:
$ kubectl create job my-one-time-job --from=cronjobs/my-cronjob
Source is this solved k8s github issue.
From @SmCaterpillar's answer above, kubectl patch cronjob my-cronjob -p '{"spec":{"schedule": "42 11 * * *"}}',
I was getting the error: unable to parse "'{spec:{schedule:": yaml: found unexpected end of stream
If someone else is facing a similar issue, replace the last part of the command with:
"{\"spec\":{\"schedule\": \"42 11 * * *\"}}"
I have a friend who developed a kubectl plugin that answers exactly that!
It takes an existing cronjob and just creates a job out of it.
See https://github.com/vic3lord/cronjobjob
Look into the README for installation instructions.
And if you want to patch a k8s cronjob schedule with the Python kubernetes library, you can do it like this:
from kubernetes import client, config
config.load_kube_config()
v1 = client.BatchV1beta1Api()
body = {"spec": {"schedule": "#daily"}}
ret = v1.patch_namespaced_cron_job(
namespace="default", name="my-cronjob", body=body
)
print(ret)
