This is really bizarre, but I'm sure I'm misunderstanding or missing something simple.
Whilst RDP'd to the Windows GitLab runner, the following command works without issue:
"C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Current\Bin\MSBuild.exe" C:\GitLab-Runner\builds\rS1z3GJ3\0\folder1\client\build\ClientWebsite.msbuild /t:BuildAndPackage /nowarn:CS0618 /p:IisWebAppName=client /p:IncludeSetAclProviderOnDestination=False /p:ProductVersion=1
Going by the MSBuild file, the build files are then created under C:\GitLab-Runner\builds\rS1z3GJ3\0\folder1\client\deploy_files.
But when I run the exact same command via the gitlab-ci.yml file, the deployment files aren't generated.
The output from the GitLab job is:
CMD.EXE "C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Current\Bin\MSBuild.exe" C:\GitLab-Runner\builds\rS1z3GJ3\0\folder1\client\build\ClientWebsite.msbuild /t:BuildAndPackage /nowarn:CS0618 /p:IisWebAppName=client /p:IncludeSetAclProviderOnDestination=False /p:ProductVersion=1
Microsoft Windows [Version 10.0.17763.2628]
(c) 2018 Microsoft Corporation. All rights reserved.
C:\GitLab-Runner\builds\rS1z3GJ3\0\folder1\client>
Cleaning up project directory and file based variables 00:00 Job succeeded
I thought perhaps this was an issue with the user the GitLab Runner service was set to run as, but even when trying with a local admin user it still doesn't work.
I suspected it was an issue with the cleaning up of the project directory, but even when setting the deployment files to be exported to a separate folder on another drive they still don't appear.
What could I be doing wrong?
Any help with this would be greatly appreciated.
EDIT:
GitLab Runner Config:
concurrent = 1
check_interval = 0

[session_server]
  session_timeout = 1800

[[runners]]
  name = "buildserver01"
  url = "https://gitlab.company.co.uk"
  token = "rS1z3GJ392utu-J-CkeM"
  executor = "shell"
  shell = "powershell"
  [runners.custom_build_dir]
  [runners.cache]
    [runners.cache.s3]
    [runners.cache.gcs]
    [runners.cache.azure]
.gitlab-ci.yml file:
variables:
  SERVICE_NAME: "client"
  NODE_VERSION: "14.15.3"

stages:
  - build $SERVICE_NAME package with MSBuild.exe

build_package:
  stage: build $SERVICE_NAME package with MSBuild.exe
  tags:
    - client
  script:
    - echo "Installing NodeJS environment $NODE_VERSION if it's not already installed."
    - nvm install $NODE_VERSION
    - echo "Selecting NodeJS environment $NODE_VERSION using NVM"
    - nvm use $NODE_VERSION
    - echo "Creating the Web Deploy Package"
    - CMD.EXE "C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Current\Bin\MSBuild.exe" C:\GitLab-Runner\builds\rS1z3GJ3\0\folder1\client\build\clientWebsite.msbuild /t:BuildAndPackage /nowarn:CS0618 /p:IisWebAppName=client /p:IncludeSetAclProviderOnDestination=False /p:ProductVersion=1
Thanks to GitLab Support, this has now been resolved.
They advised removing the CMD.EXE wrapper entirely and instead running MSBuild directly via PowerShell:
'& "C:\Program Files (x86)\Microsoft Visual Studio\2022\BuildTools\MSBuild\Current\Bin\MSBuild.exe" $CI_PROJECT_DIR\build\ClientWebsite.msbuild /t:BuildAndPackage /nowarn:CS0618 /p:IisWebAppName=Client /p:IncludeSetAclProviderOnDestination=False /p:ProductVersion=1'
I'm having some trouble defining a job. I'm at an early stage of defining my pipeline and I'm trying to solve my problems with variables :). I've got a few subfolders under gcp/live/development and I'm trying to cd into them and then run Terraform:
.plan-development:
  variables:
    ENVIRONMENT: "development"
    TERRAFORM_DEVELOPMENT_PATH: "gcp/live/development"
    TF_ROOT: ${TERRAFORM_DEVELOPMENT_PATH}/${CI_JOB_NAME%-plan-development}
  stage: plan-development
  script:
    - cd ${TF_ROOT}
    - gitlab-terraform plan
    - gitlab-terraform plan-json
  artifacts:
    name: plan-${ENVIRONMENT}-${CI_JOB_NAME%-plan-development}
    paths:
      - ${TF_ROOT}/plan.cache
    reports:
      terraform: ${TF_ROOT}/plan.json
But according to my failing pipeline, the first script step cd ${TF_ROOT} doesn't enter the expected folder at gcp/live/development/cloud-nat.
I've got some debug output, which tells me the following:
$ cd ${TF_ROOT}
$ pwd && ls
/builds/b79ya_1j/0/devops/terraform/gcp/live/development
cloud-nat
firewall
gke
vpc
$ echo "${CI_JOB_NAME%-plan-development}"
cloud-nat
Am I missing something when combining two variables (TF_ROOT: ${TERRAFORM_DEVELOPMENT_PATH}/${CI_JOB_NAME%-plan-development})?
Hope you can help me out.
Am I missing something when combining two variables (TF_ROOT: ${TERRAFORM_DEVELOPMENT_PATH}/${CI_JOB_NAME%-plan-development})?
The issue is that GitLab does not perform Bash operations in the variables section, like the one you attempt here:
${CI_JOB_NAME%-plan-development}
GitLab literally searches for a variable named CI_JOB_NAME%-plan-development.
That's why, when it creates the TF_ROOT variable, its value ends up as gcp/live/development/, since it could not find a variable named CI_JOB_NAME%-plan-development.
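A common workaround, sketched below under the assumption that you only need TF_ROOT at runtime, is to build the path in the script section, where the shell does evaluate the parameter expansion:

.plan-development:
  variables:
    ENVIRONMENT: "development"
    TERRAFORM_DEVELOPMENT_PATH: "gcp/live/development"
  stage: plan-development
  script:
    # the shell evaluates ${CI_JOB_NAME%-plan-development}; the variables section does not
    - export TF_ROOT="${TERRAFORM_DEVELOPMENT_PATH}/${CI_JOB_NAME%-plan-development}"
    - cd ${TF_ROOT}
    - gitlab-terraform plan
    - gitlab-terraform plan-json

The artifact paths may still need adjusting, since GitLab expands those itself rather than through the shell.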
If a GitLab project is configured on GitLab CI, is there a way to run the build locally?
I don't want to turn my laptop into a build "runner", I just want to take advantage of Docker and .gitlab-ci.yml to run tests locally (i.e. it's all pre-configured). Another advantage of that is that I'm sure that I'm using the same environment locally and on CI.
Here is an example of how to run Travis builds locally using Docker, I'm looking for something similar with GitLab.
As of a few months ago, this is possible using gitlab-runner:
gitlab-runner exec docker my-job-name
Note that you need both docker and gitlab-runner installed on your computer to get this working.
You also need the image key defined in your .gitlab-ci.yml file; otherwise it won't work.
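For reference, a minimal .gitlab-ci.yml that the command above can run might look like this (the job name and image are only examples):

image: alpine:latest   # the image key is required for the docker executor

my-job-name:
  script:
    - echo "running locally via gitlab-runner exec docker"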
Here's the line I currently use for testing locally using gitlab-runner:
gitlab-runner exec docker test --docker-volumes "/home/elboletaire/.ssh/id_rsa:/root/.ssh/id_rsa:ro"
Note: You can avoid passing --docker-volumes with your key by setting it as a default in /etc/gitlab-runner/config.toml. See the official documentation for more details. Also, use gitlab-runner exec docker --help to see all docker-based runner options (like variables, volumes, networks, etc.).
Due to the confusion in the comments, I paste here the gitlab-runner --help result, so you can see that gitlab-runner can make builds locally:
gitlab-runner --help
NAME:
gitlab-runner - a GitLab Runner
USAGE:
gitlab-runner [global options] command [command options] [arguments...]
VERSION:
1.1.0~beta.135.g24365ee (24365ee)
AUTHOR(S):
Kamil Trzciński <ayufan@ayufan.eu>
COMMANDS:
exec execute a build locally
[...]
GLOBAL OPTIONS:
--debug debug mode [$DEBUG]
[...]
As you can see, the exec command is to execute a build locally.
Even though there was an issue to deprecate the current gitlab-runner exec behavior, it ended up being reconsidered and a new version with greater features will replace the current exec functionality.
Note that this process is to use your own machine to run the tests using docker containers. This is not to define custom runners. To do so, just go to your repo's CI/CD settings and read the documentation there. If you want to ensure your runner is executed instead of one from gitlab.com, add a custom and unique tag to your runner, ensure it only runs tagged jobs and tag all the jobs you want your runner to be responsible for.
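For example, pinning a job to your own runner could look like this (the tag name my-own-runner is just a placeholder):

test:
  tags:
    - my-own-runner   # only runners registered with this tag will pick up the job
  script:
    - echo "running on my own runner"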
I use this docker-based approach:
Edit: 2022-10
docker run --entrypoint bash --rm -w $PWD -v $PWD:$PWD -v /var/run/docker.sock:/var/run/docker.sock gitlab/gitlab-runner:latest -c 'git config --global --add safe.directory "*";gitlab-runner exec docker test'
For all Git versions > 2.35.2 you must add safe.directory within the container to avoid fatal: detected dubious ownership in repository at.... This is also true for patched Git versions < 2.35.2. The old command will no longer work.
Details
0. Create a git repo to test this answer
mkdir my-git-project
cd my-git-project
git init
git commit --allow-empty -m"Initialize repo to showcase gitlab-runner locally."
1. Go to your git directory
cd my-git-project
2. Create a .gitlab-ci.yml
Example .gitlab-ci.yml
image: alpine

test:
  script:
    - echo "Hello Gitlab-Runner"
3. Create a docker container with your project dir mounted
docker run -d \
  --name gitlab-runner \
  --restart always \
  -v $PWD:$PWD \
  -v /var/run/docker.sock:/var/run/docker.sock \
  gitlab/gitlab-runner:latest
(-d) run container in background and print container ID
(--restart always) optional; decide whether you want the container to restart automatically.
(-v $PWD:$PWD) Mounts the current directory into the same path inside the container. Note: on Windows you could bind your dir to a fixed location, e.g. -v ${PWD}:/opt/myapp. Also, $PWD only works in PowerShell, not in cmd.
(-v /var/run/docker.sock:/var/run/docker.sock) This gives the container access to the docker socket of the host so it can start "sibling containers" (e.g. Alpine).
(gitlab/gitlab-runner:latest) Just the latest available image from dockerhub.
4. Execute with
Avoid fatal: detected dubious ownership in repository at... More info
docker exec -it -w $PWD gitlab-runner git config --global --add safe.directory "*"
Actual execution
docker exec -it -w $PWD gitlab-runner gitlab-runner exec docker test
# ^ ^ ^ ^ ^ ^
# | | | | | |
# (a) (b) (c) (d) (e) (f)
(a) Working dir within the container. Note: On Windows you could use a fixed location, e.g. /opt/myapp.
(b) Name of the docker container
(c) Execute the command "gitlab-runner" within the docker container
(d)(e)(f) run gitlab-runner with the docker executor and run the job named "test"
5. Prints
...
Executing "step_script" stage of the job script
$ echo "Hello Gitlab-Runner"
Hello Gitlab-Runner
Job succeeded
...
Note: The runner will only work on the committed state of your code base. Uncommitted changes will be ignored. Exception: the .gitlab-ci.yml itself does not have to be committed to be taken into account.
Note: There are some limitations running locally. Have a look at limitations of gitlab runner locally.
I'm currently working on making a gitlab runner that works locally.
Still in the early phases, but eventually it will become very relevant.
It doesn't seem like GitLab wants (or has time) to make this, so here you go.
https://github.com/firecow/gitlab-runner-local
If you are running GitLab using the Docker image from https://hub.docker.com/r/gitlab/gitlab-ce, it's possible to run pipelines by exposing the local docker.sock with a volume option: -v /var/run/docker.sock:/var/run/docker.sock. Adding this option to the GitLab container will allow your workers to access the Docker instance on the host.
The GitLab runner appears to not work on Windows yet and there is an open issue to resolve this.
So, in the meantime I am moving my script code out to a bash script, which I can easily map to a docker container running locally and execute.
In this case I want to build a Docker image in my job, so I create a script called 'build':
#!/bin/bash
docker build --pull -t myimage:myversion .
In my .gitlab-ci.yaml I execute the script:
image: docker:latest

services:
  - docker:dind

before_script:
  - apk add bash

build:
  stage: build
  script:
    - chmod 755 build
    - build
To run the script locally using powershell I can start the required image and map the volume with the source files:
$containerId = docker run --privileged -d -v ${PWD}:/src docker:dind
install bash if not present:
docker exec $containerId apk add bash
Set permissions on the bash script:
docker exec -it $containerId chmod 755 /src/build
Execute the script:
docker exec -it --workdir /src $containerId bash -c 'build'
Then stop the container:
docker stop $containerId
And finally clean up the container:
docker container rm $containerId
Another approach is to have a local build tool that is installed on both your PC and your server.
So basically, your .gitlab-ci.yml will just call your preferred build tool.
Here is an example .gitlab-ci.yml that I use with nuke.build:
stages:
  - build
  - test
  - pack

variables:
  TERM: "xterm" # Use Unix ASCII color codes on Nuke

before_script:
  - CHCP 65001 # Set correct code page to avoid charset issues

.job_template: &job_definition
  except:
    - tags

build:
  <<: *job_definition
  stage: build
  script:
    - "./build.ps1"

test:
  <<: *job_definition
  stage: test
  script:
    - "./build.ps1 test"
  variables:
    GIT_CHECKOUT: "false"

pack:
  <<: *job_definition
  stage: pack
  script:
    - "./build.ps1 pack"
  variables:
    GIT_CHECKOUT: "false"
  only:
    - master
  artifacts:
    paths:
      - output/
And in nuke.build I've defined 3 targets named like the 3 stages (build, test, pack).
In this way you have a reproducible setup (all other things are configured with your build tool) and you can directly test the different targets of your build tool.
(I can call .\build.ps1, .\build.ps1 test and .\build.ps1 pack whenever I want.)
I am on Windows, using VSCode with WSL.
I didn't want to register my work PC as a runner, so instead I'm running my YAML stages locally to test them out before I upload them:
$ sudo apt-get install gitlab-runner
$ gitlab-runner exec shell build
The .gitlab-ci.yml:
image: node:10.19.0 # https://hub.docker.com/_/node/
# image: node:latest

cache:
  # untracked: true
  key: project-name
  # key: ${CI_COMMIT_REF_SLUG} # per branch
  # key:
  #   files:
  #     - package-lock.json # only update cache when this file changes (not working) #jkr
  paths:
    - .npm/
    - node_modules
    - build

stages:
  - prepare # prepares builds, makes build needed for testing
  - test # uses test:build specifically #jkr
  - build
  - deploy

# before_install:
before_script:
  - npm ci --cache .npm --prefer-offline

prepare:
  stage: prepare
  needs: []
  script:
    - npm install

test:
  stage: test
  needs: [prepare]
  except:
    - schedules
  tags:
    - linux
  script:
    - npm run build:dev
    - npm run test:cicd-deps
    - npm run test:cicd # runs puppeteer tests #jkr
  artifacts:
    reports:
      junit: junit.xml
    paths:
      - coverage/

build-staging:
  stage: build
  needs: [prepare]
  only:
    - schedules
  before_script:
    - apt-get update && apt-get install -y zip
  script:
    - npm run build:stage
    - zip -r build.zip build
  # cache:
  #   paths:
  #     - build
  #   <<: *global_cache
  #   policy: push
  artifacts:
    paths:
      - build.zip

deploy-dev:
  stage: deploy
  needs: [build-staging]
  tags: [linux]
  only:
    - schedules
  # # - branches#gitlab-org/gitlab
  before_script:
    - apt-get update && apt-get install -y lftp
  script:
    # temporarily using 'verify-certificate no'
    # for more on verify-certificate #jkr: https://www.versatilewebsolutions.com/blog/2014/04/lftp-ftps-and-certificate-verification.html
    # variables do not work with 'single quotes' unless they are "'surrounded by doubles'"
    - lftp -e "set ssl:verify-certificate no; open mediajackagency.com; user $LFTP_USERNAME $LFTP_PASSWORD; mirror --reverse --verbose build/ /var/www/domains/dev/clients/client/project/build/; bye"
  # environment:
  #   name: staging
  #   url: http://dev.mediajackagency.com/clients/client/build
  #   # url: https://stg2.client.co
  when: manual
  allow_failure: true

build-production:
  stage: build
  needs: [prepare]
  only:
    - schedules
  before_script:
    - apt-get update && apt-get install -y zip
  script:
    - npm run build
    - zip -r build.zip build
  # cache:
  #   paths:
  #     - build
  #   <<: *global_cache
  #   policy: push
  artifacts:
    paths:
      - build.zip

deploy-client:
  stage: deploy
  needs: [build-production]
  tags: [linux]
  only:
    - schedules
  # - master
  before_script:
    - apt-get update && apt-get install -y lftp
  script:
    - sh deploy-prod
  environment:
    name: production
    url: http://www.client.co
  when: manual
  allow_failure: true
The idea is to keep check commands outside of .gitlab-ci.yml. I use a Makefile to run something like make check, and my .gitlab-ci.yml runs the same make commands that I use locally to check various things before committing.
This way you'll have one place with all/most of your commands (the Makefile) and .gitlab-ci.yml will contain only CI-related stuff.
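A minimal sketch of that setup, assuming a Makefile with a check target:

check:
  stage: test
  script:
    - make check   # the same command you run locally before committing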
I have written a tool to run all GitLab CI jobs locally without having to commit or push, simply with the command ci-toolbox my_job_name.
The URL of the project : https://gitlab.com/mbedsys/citbx4gitlab
Years ago I built this simple solution with a Makefile and docker-compose to run the GitLab runner in Docker. You can use it to execute jobs locally as well, and it should work on all systems where Docker works:
https://gitlab.com/1oglop1/gitlab-runner-docker
There are a few things to change in the docker-compose.override.yaml:
version: "3"

services:
  runner:
    working_dir: <your project dir>
    environment:
      - REGISTRATION_TOKEN=<token if you want to register>
    volumes:
      - "<your project dir>:<your project dir>"
Then inside your project you can execute it the same way as mentioned in other answers:
docker exec -it -w $PWD runner gitlab-runner exec <commands>..
I recommend using gitlab-ci-local
https://github.com/firecow/gitlab-ci-local
It's able to run specific jobs as well.
It's a very cool project and I have used it to run simple pipelines on my laptop.
I am new to GitLab Runner and trying to set up an acceptance stage that runs after the unit tests pass.
The tests run on a Windows machine when the job is triggered.
The job exits when the tests pass, but it stalls when a test fails. I only have 1 test right now and I am planning to run multiple tests in parallel once this issue is fixed.
The test is triggered using a Maven command. (I have tweaked the test so that it fails.) The test fails on this command:
mvn verify -P rest-test-run -Denvironment="qa" -Dwebdriver.chrome.driver="C:\tools\chromedriver.exe"
and the job stalls, unable to execute the next command.
Here is the sample YAML file that I am using to trigger the acceptance stage:
stages:
  - test

acceptance_tests:
  stage: test
  tags:
    - playground
  only:
    - ci_cd_test
  before_script:
    - export POM_VERSION=$(mvn help:evaluate -Dexpression=project.version -q -DforceStdout)
    - cd ..
    - git clone "git@gitlab.<project>" "<automation project name>" 2> /dev/null || (cd "<automation project name>" ; git pull)
    - cd <webapp project name>
    - export DB_URL=jdbc url
    - export DB_USER=username
    - export DB_PASSWORD=password
  variables:
    GIT_STRATEGY: none
  script:
    # flyway migrates
    - java -Dfile.encoding=UTF-8 -jar <webapp project name>/target/<webapp project name>-$POM_VERSION.jar flyway migrate
    # Bring up the app
    - ./start.sh
    # Run cucumber test
    - cd ../<automation project name>
    - mvn verify -P rest-test-run -Denvironment="qa" -Dwebdriver.chrome.driver="C:\tools\chromedriver.exe"
    - cd ../<webapp project name>
    # Stop the app
    - ./stop.sh
I have also attached a screenshot of the stalling job.
Some troubleshooting ideas that I tried implementing:
upgraded the Surefire plugin (the version I use is 2.22.0)
added a cleanup_job in yaml file
ran tests not in parallel
None of the above ideas solved the problem.
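One more idea I have not verified yet: since a failing mvn verify means the job never reaches ./stop.sh, moving the stop step into an after_script, which GitLab runs even when the script section fails, might keep the app from holding the job open. Roughly:

acceptance_tests:
  # ... same tags, only, before_script, variables and script as above, with the stop step removed
  after_script:
    # after_script starts a fresh shell in the project directory and runs
    # whether the script section passed or failed; adjust the path if needed
    - ./stop.sh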
I want to deploy a static page using a GitLab repo with plain HTML/CSS (actually SCSS). As far as I've learnt, a static page needs at least a .gitlab-ci.yml file and a /public folder. The .gitlab-ci.yml file will have a minimum requirement like this (an example from the official docs):
pages:
  stage: deploy
  script:
    - mkdir .public
    - cp -r * .public
    - mv .public public
  artifacts:
    paths:
      - public
  only:
    - master
And my question lies in the script lines.
(I assume the script below will create a hidden folder named .public, copy all the files into it, then rename it from .public to public. Please correct me if I'm wrong.)
script:
  - mkdir .public
  - cp -r * .public
  - mv .public public
To me, it's similar to a Linux shell script. It's also confirmed in the GitLab docs that it's run by the runner. But the problem is: how do I know how many shell scripts are installed in GitLab? And is it possible to make one?
I would like to make 2 folders: src and public. The GitLab CI will run the script and compile SCSS from src then move it to public.
I'm using gitlab.com by the way.
So, a few things to consider. Each job in GitLab is run in a container. Generally you specify which one you want to use. Pages is a special case though, so you don't have to care about the image for that container.
The pages job will populate your public folder. But you can alter the gitlab-ci.yml file and add steps.
This would build an app using node:
build_stuff:
  stage: build
  image: node:11
  before_script:
    - npm install
  script:
    - npm run build
  artifacts:
    paths:
      - build

pages:
  stage: deploy
  script:
    - mkdir .public
    - cp -r build/ .public
    - mv .public public
  artifacts:
    paths:
      - public
  only:
    - master
Things to note: the first step runs the build steps to generate all the assets for your output folder. It then stores anything declared in the artifacts block, in this case the build folder, and passes it on to the next job. Adjust this step according to what you need to build your app.
The only thing I altered in the second step is that you copy the contents of the build folder instead of the entire repo into the .public folder. Adjust this to your needs as well.
As for shell scripts, there are none present except for the ones you bring to the repository. The default runner supports Bash so you can execute bash commands just as you would in your terminal.
If you create the file foo.sh in your repo, bash foo.sh will execute the script. If you want to run it directly as ./foo.sh, it needs to be executable; remember to chmod it before pushing.
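In a job, that could look roughly like this (foo.sh is just the example name from above):

run-script:
  script:
    - bash foo.sh    # works even if the executable bit was not committed
    # or, if the file was committed as executable:
    # - ./foo.sh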
There are no "shell-scripts installed in Gitlab". Gitlab supports several shells and the script part in your example are just pure bash commands. Since you are most probably using the default docker runner you can execute bash commands from the script part, run scripts in other languages that are in your repo, install packages on the docker container and even prepare and run your own docker images.