GitLab pipeline job not being triggered even though it shows up when simulating the pipeline run

I am running a pipeline in one of my projects.
I have encountered a problem where the build job is not run, despite showing up when doing a simulation with the GitLab CI lint tool.
Unfortunately I can't paste the whole .gitlab-ci.yml file, as it contains sensitive information.
Has anybody encountered something similar before?
This is the build job:
build:
  stage: build
  extends: .docker
  rules:
    - if: $CI_PIPELINE_SOURCE != "schedule"
  script:
    - docker build --build-arg NODE_ENV=${NODE_ENV} --build-arg ENV_TAG=${ENV_TAG} -f Dockerfile --cache-from $CI_REGISTRY_IMAGE:$ENV_TAG -t $CI_REGISTRY_IMAGE:$ENV_TAG -t $CI_REGISTRY_IMAGE:$ENV_TAG-$CI_COMMIT_SHA -t $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA .
    - docker push $CI_REGISTRY_IMAGE --all-tags
I have tried changing the rules of the job, but that either made the .gitlab-ci.yml invalid or stopped the job from running even in the simulation.
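For what it's worth, a common next step (not from the original post) is to make the rules explicit about what must never run, and to check whether a top-level workflow:rules block or the real trigger (for example a merge request event rather than the branch push that the lint simulator assumes) is filtering the job out. A sketch, assuming the job should run for every branch pipeline that is not a schedule:
build:
  stage: build
  extends: .docker
  rules:
    # Never add this job to scheduled pipelines
    - if: '$CI_PIPELINE_SOURCE == "schedule"'
      when: never
    # Run for branch pipelines; add a similar rule for merge_request_event if MR pipelines are used
    - if: '$CI_COMMIT_BRANCH'
  script:
    - docker build --build-arg NODE_ENV=${NODE_ENV} --build-arg ENV_TAG=${ENV_TAG} -f Dockerfile --cache-from $CI_REGISTRY_IMAGE:$ENV_TAG -t $CI_REGISTRY_IMAGE:$ENV_TAG -t $CI_REGISTRY_IMAGE:$ENV_TAG-$CI_COMMIT_SHA -t $CI_REGISTRY_IMAGE:$CI_COMMIT_SHA .
    - docker push $CI_REGISTRY_IMAGE --all-tags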

Related

Pipeline does not run when using CI_COMMIT_MESSAGE

I want to run a pipeline that builds a Docker image from an app and deploys it to the GitLab Registry and Docker Hub.
I'd like this pipeline to run only when the main branch gets a commit and when that commit has a message that is a "version". Examples of versions:
1.0.0
3.4.0
10.1.6
I have made the following .gitlab-ci.yml; however, my pipeline never triggers. What am I doing wrong? I've already checked the regex and it's OK, so I'm guessing I'm doing something wrong with the GitLab config?
image: docker:stable
variables:
  PROJECT_NAME: "project"
  BRANCH_NAME: "main"
  IMAGE_NAME: "$PROJECT_NAME:$CI_COMMIT_MESSAGE"
  LATEST_IMAGE_NAME: "$PROJECT_NAME:latest"
services:
  - docker:19.03.12-dind
build_image:
  script:
    # Push to Gitlab registry
    - docker build -t $CI_REGISTRY/username/$PROJECT_NAME/$IMAGE_NAME .
    - docker build -t $CI_REGISTRY/username/$PROJECT_NAME/$LATEST_IMAGE_NAME .
    - docker login $CI_REGISTRY -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD
    - docker push $CI_REGISTRY/username/$PROJECT_NAME/$IMAGE_NAME
    - docker push $CI_REGISTRY/username/$PROJECT_NAME/$LATEST_IMAGE_NAME
    # Push to Docker hub
    - docker build -t username/$IMAGE_NAME .
    - docker build -t username/$LATEST_IMAGE_NAME .
    - docker login -u username -p $DOCKER_HUB_TOKEN
    - docker push username/$IMAGE_NAME
    - docker push username/$LATEST_IMAGE_NAME
  rules:
    - if: $CI_COMMIT_MESSAGE =~ /^(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)$/
    - if: $CI_PIPELINE_SOURCE == "$BRANCH_NAME"
The problem you're having is likely due to the $ anchor in the regex. Commit messages often get formatted with an additional trailing newline, which would keep the pattern from matching.
Alternatively, your pattern may work against CI_COMMIT_TITLE, which only includes the first line of the commit message.
I propose an alternative: tags.
A tag is a label that you attach to a git commit. Examples of tags are 1.0.1, 1.1.2, latest.
To tag the newest commit with a version and have CI see it, issue:
git tag <version> followed by git push origin <version> (the tag is applied to the newest commit by default; pushing it is what triggers the tag pipeline)
In order to trigger your pipeline for any tag, just use this:
rules:
  - if: $CI_COMMIT_TAG
In this solution, versioning is decoupled from the commit message.
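If the pipeline should still be limited to semantic-version tags only, the two ideas can be combined. This is just a sketch that reuses the regex from the question, applied to CI_COMMIT_TAG:
build_image:
  rules:
    # Run only for tags that look like a semantic version, e.g. 1.0.0 or 10.1.6
    - if: '$CI_COMMIT_TAG =~ /^(0|[1-9]\d*)\.(0|[1-9]\d*)\.(0|[1-9]\d*)$/'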

Gitlab CI - run a task with only TAG & specific BRANCH

How can I start a job with GitLab CI only when I create a new tag on a specific branch?
I have tried everything, but it still doesn't work.
stages:
  - shov server
test:
  stage: shov server
  rules:
    - if: '$CI_COMMIT_TAG && $CI_COMMIT_BRANCH == "CI_merge"'
      when: always
  tags:
    - runner
  script:
    - docker-compose -f docker-compose.yml down
    - docker-compose -f docker-compose.yml build
    - docker-compose -f docker-compose.yml up -d
AFAIK this is not possible. When the pipeline runs for a tag, there is no variable defined that indicates the branch ($CI_COMMIT_BRANCH is not set in tag pipelines). You can see this for yourself by listing all available variables with export; see the GitLab documentation on predefined variables.
You could perhaps try to find the branch the commit is on, with something like git branch -a --contains $CI_COMMIT_SHA. However, you probably can't do that in the rules; you would have to do it in the script, or in the before_script, with custom logic to stop the rest of the script from running.
Hope this helps.
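A rough sketch of that script-level check, assuming the branch of interest is CI_merge and that the runner's clone is deep enough to resolve the history (this is not from the original answer):
test:
  stage: shov server
  rules:
    # Run for any tag; the branch check happens inside the job
    - if: '$CI_COMMIT_TAG'
  tags:
    - runner
  before_script:
    # before_script and script run in the same shell, so exit 0 ends the job successfully
    - git fetch origin CI_merge
    - git merge-base --is-ancestor "$CI_COMMIT_SHA" FETCH_HEAD || { echo "Tag is not on CI_merge, skipping"; exit 0; }
  script:
    - docker-compose -f docker-compose.yml down
    - docker-compose -f docker-compose.yml build
    - docker-compose -f docker-compose.yml up -d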

Run 2 gitlab jobs on the same VM

I have the following pipeline in GitLab:
stages: # List of stages for jobs, and their order of execution
  - build
  - test
  - deploy
clone-submodule-job: # This job runs in the build stage, which runs first.
  tags:
    - linuxvm
  stage: build
  script:
    - git submodule update --init --recursive --jobs=10
build-job: # This job runs in the build stage, which runs first.
  tags:
    - linuxvm
  stage: build
  script:
    - cd docker/project_builder && docker build -t docker_development -f Dockerfile .
I added the linuxvm tag so it runs on my Linux VM. The problem is that build-job runs in a separate VM. Is it possible to make it run after clone-submodule-job, but also on the same VM, so it can access the cloned submodules?
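One approach worth sketching (not from the original thread) is to drop the separate clone job and let build-job fetch its own submodules via GIT_SUBMODULE_STRATEGY, which removes the need to share a workspace between runners; this assumes the submodules are reachable with the job's CI credentials:
build-job:
  tags:
    - linuxvm
  stage: build
  variables:
    # The runner updates submodules for this job itself, so it no longer
    # depends on the workspace left behind by clone-submodule-job
    GIT_SUBMODULE_STRATEGY: recursive
  script:
    - cd docker/project_builder && docker build -t docker_development -f Dockerfile .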

How to run a pipeline only on changes in a specific branch?

I have been looking through the documentation back and forth and cannot find how to configure my .gitlab-ci.yml so that the jobs are executed only on a change in the mqtt_based branch and not in the default master.
I was hoping that adding an only entry for each job would be enough (a global setting would have been even better), but this did not help: the pipeline was not started when the mqtt_based branch was changed.
variables:
  BRANCH: "mqtt_based"
stages:
  - build
  - deploy
job:build-back:
  stage: build
  script:
    - cd back
    - docker build --build-arg COMMIT=${CI_COMMIT_SHORT_SHA} --build-arg DATE=$(date --iso-8601=seconds) -t registry.XXX/homemonitor-back:latest -t registry.XXX/homemonitor-back:${CI_COMMIT_SHORT_SHA} -f Dockerfile .
  only:
    - $BRANCH
(...)
You need to use refs under only. Something like this:
only:
  refs:
    - mqtt_based
Documentation: https://docs.gitlab.com/ce/ci/yaml/#onlyexcept-advanced
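As a side note (not part of the original answer), newer GitLab versions recommend rules: instead of only:/except:; a sketch of the equivalent condition:
job:build-back:
  stage: build
  rules:
    # Run only for pushes to the mqtt_based branch
    - if: '$CI_COMMIT_BRANCH == "mqtt_based"'
  script:
    - cd back
    - docker build --build-arg COMMIT=${CI_COMMIT_SHORT_SHA} --build-arg DATE=$(date --iso-8601=seconds) -t registry.XXX/homemonitor-back:latest -t registry.XXX/homemonitor-back:${CI_COMMIT_SHORT_SHA} -f Dockerfile .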

How to utilize Docker to run tests for multiple languages on Travis CI

I am attempting to create a CI/CD pipeline with Travis CI that tests the front-end, tests the back-end, and deploys. The front-end is using Node, the back-end is using Go.
My repository is structured as follows:
- client
  - DockerFile
  - ...(front-end code)
- server
  - DockerFile
  - ...(back-end code)
- .travis.yml
Would I be able to utilize the DockerFiles in some fashion to execute tests for both sides of the application and have Travis report their results properly?
I'm not well versed in either tool, so I was hoping to get some input before I dig myself into a hole. I plan on using a combination of Travis stages and docker build/docker run commands. Something like this:
jobs:
  include:
    - stage: test client side
      before_script:
        - cd client
        - docker build ...
      script:
        - docker run image /bin/sh -c "run node tests"
      after_script:
        - cd ..
    - stage: test server side
      before_script:
        - cd server
      script:
        - docker run image /bin/sh -c "run go tests"
      after_script:
        - cd ..
    - stage: deploy
      script: skip
      deploy:
        - provider: s3
          skip_cleanup: true
          on:
            branch: master
This doc page makes it look promising, but the inclusion of language: ruby and script: - bundle exec rake test throws me off. I am not sure why Ruby is required if the tests are run through Docker (at least that's what it looks like).
Update 1
I believe I got it to work correctly with the client side of the application.
Here is what I got:
services:
  - docker
jobs:
  include:
    - stage: test
      before_script:
        - docker pull node:12
      script:
        - docker run --rm -e CI=true -v $(pwd)/client:/src node:12 /bin/sh -c "cd src; npm install; npm test"
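The server side could presumably be handled the same way with an official Go image. The extra jobs.include entry below is only a sketch; the golang:1.16 tag, the /src mount point, and go test ./... are assumptions rather than something from the original post:
    - stage: test
      before_script:
        - docker pull golang:1.16
      script:
        # Mount the Go back-end and run its tests inside the container
        - docker run --rm -v $(pwd)/server:/src -w /src golang:1.16 /bin/sh -c "go test ./..."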

Resources