Pipeline failed - GitLab

I am trying to run Cypress test scripts in a GitLab CI/CD pipeline, but the job fails with an error.
Here is my .gitlab-ci.yml file:
image: docker:18.09

stages:
  - test

test:
  stage: test
  script:
    - npm install
    - npm run test

Cypress provides custom Docker images that you can use to avoid dependency issues. You can check for them here: cypress docker images.
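For example, a minimal sketch (the image tag here is only an illustration; pick one from the Cypress docker images list that matches your Node and browser versions):

image: cypress/browsers:node14.17.0-chrome91-ff89   # ships with Node plus the browsers and system libraries Cypress needs

stages:
  - test

test:
  stage: test
  script:
    - npm install
    - npm run test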
I also faced many weird issues when implementing a Cypress job to run in CI. In order to not reinvent the wheel, you can use the cypress run job I shared in this open-source community hub for CI/CD jobs. It's customizable; you just need to include the job URL in your pipeline and override a few variables, as mentioned in the related documentation of the job.
You should have something like this:
include:
  - remote: 'https://api.r2devops.io/job/r/r2devops-bot/cypress_run/latest.yaml'

cypress_run:
  variables:
    BASE_URL: '<your_server_url>'

Related

The cypress npm package is installed, but the Cypress binary is missing (newbie)

I see many instances of this question but nothing that helps me. Apologies if this question gets boring.
I am just starting out with node.js, Cypress and GitLab Pipelines.
I've cobbled together something that has a simple web app, a few simple tests.
It ran fine the first time but, on subsequent commits, it fails at the 'Cypress Tests' step with: The cypress npm package is installed, but the Cypress binary is missing.
There's a lot more to the log but I don't know what is relevant.
Here is my yml file
cypress tests:
  stage: test
  image: cypress/browsers:node14.17.0-chrome91-ff89
  cache:
    key: package-lock.json
    paths:
      - node_modules
  before_script:
    - npm install
    - npm run dev &
    - ./node_modules/.bin/wait-on http://localhost:3000
  script:
    - npm run cypress
  only:
    - merge_requests
    - master
Could you please help with anything that looks like it might be the culprit?
Or at least help me understand how to read the situation better?
I tried reading the docs as much as I can, I just can't see the right way.
I'm also a beginner, but I'll try to answer. First, you can add "cypress:open": "cypress open" to the scripts in package.json. For more details you can watch this video.
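One more thing worth checking, as a hedged aside that goes beyond the answer above: Cypress stores its binary outside node_modules (in ~/.cache/Cypress on Linux), so a cache that restores node_modules but not the binary cache can leave the binary missing on later runs. A sketch of the job above with the binary cache kept inside the project (CYPRESS_CACHE_FOLDER tells Cypress where to put the binary, and cypress install only downloads it when it is missing):

cypress tests:
  stage: test
  image: cypress/browsers:node14.17.0-chrome91-ff89
  variables:
    CYPRESS_CACHE_FOLDER: "$CI_PROJECT_DIR/.cache/Cypress"   # keep the binary inside the project workspace so it can be cached
  cache:
    key: package-lock.json
    paths:
      - node_modules
      - .cache/Cypress                                       # cache the Cypress binary as well
  before_script:
    - npm install
    - npx cypress install                                    # downloads the binary only if it is missing
    - npm run dev &
    - ./node_modules/.bin/wait-on http://localhost:3000
  script:
    - npm run cypress
  only:
    - merge_requests
    - master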

Coverage badge in GitLab is unknown

I am trying to set up a coverage badge for a Python project on GitLab.
I was following this question but it is still not working.
Currently I can see the job results on the CI/CD > Jobs page, but when I go to Settings > CI/CD > General pipelines, the coverage report is still unknown.
This is how I defined coverage run in .gitlab-ci.yml file:
tests:
  stage: test
  only:
    - merge_requests
  script:
    - pip install poetry
    - poetry install
    - poetry run coverage run -m pytest
    - poetry run coverage report
    - poetry run coverage xml
  artifacts:
    paths: [coverage.xml]
Any ideas what might need to be set differently?
Okay, it looks like I also need to add main to the only: part of my .gitlab-ci.yml, and then it works. I was just hoping I could have it without running the tests twice (before the MR to main and after).
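A sketch of the adjusted job (the coverage: line is an addition of mine, not part of the original config: GitLab needs this regex, or the equivalent setting under Settings > CI/CD > General pipelines, to extract the percentage from the job log, and the pattern below matches the TOTAL line printed by coverage report):

tests:
  stage: test
  only:
    - merge_requests
    - main                                  # also run on main so the badge gets a value after merging
  script:
    - pip install poetry
    - poetry install
    - poetry run coverage run -m pytest
    - poetry run coverage report
    - poetry run coverage xml
  coverage: '/TOTAL.+ ([0-9]{1,3}%)/'       # parse the total percentage from the coverage report output
  artifacts:
    paths: [coverage.xml]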

Tox: different test sets during local and gitlab CI runs

A subset of my pytest tests cannot run on GitLab because they depend on locally run services.
How can I exclude them from GitLab CI pipelines while keeping them for local testing? I am not sure whether the filtering should be done in the pytest, tox, or GitLab configuration.
Current configuration:
tox.ini
[testenv]
commands = pytest {posargs}
gitlab-ci.yml
build:
  stage: build
  script:
    - tox
The most convenient way of doing that is dynamically, through pytest:
def test_function():
    if not valid_config():
        pytest.xfail("unsupported configuration")
https://docs.pytest.org/en/latest/skipping.html
You could also use two different tox.ini files.
While tox looks for a tox.ini by default, you can also pass in a separate tox.ini file like...
tox -c tox-ci.ini
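A third variant, sketched here as a combination of the two ideas above (the local_only marker name is made up and would need to be registered in your pytest configuration): keep a single tox.ini and let the CI job deselect the local-only tests through tox's {posargs}.

build:
  stage: build
  script:
    # everything after "--" is forwarded to pytest via {posargs},
    # so CI deselects tests marked with @pytest.mark.local_only
    - tox -- -m "not local_only"

Running plain tox locally still executes the full test suite.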

How to store node modules between jobs and stages in gitlab with continuous integration

I am fairly new to GitLab CI and I've been trying different approaches to use the node_modules directory in my entire pipeline. From what I've read in the official docs, cache and artifacts seem to be valid approaches to pass on files between jobs:
cache is used to specify a list of files and directories which should
be cached between jobs. You can only use paths that are within the
project workspace.
However, my issue with the caching method is that the node_modules would be persisted between pipelines by default:
cache can be set globally and per-job.
from GitLab 9.0, caching is enabled and shared between pipelines and jobs by default.
I do not want to persist the node_modules between pipelines. What I actually want is to trigger a fresh install with npm in my setup stage and then allow all further jobs in the pipeline to use these modules. Hence, I started using artifacts instead of cache, which is described similarly:
artifacts is used to specify a list of files and directories which
should be attached to the job after success. [...]
The artifacts will be sent to GitLab after the job finishes
successfully and will be available for download in the GitLab UI.
The dependency feature should be used in conjunction with artifacts
and allows you to define the artifacts to pass between different jobs.
The artifact-dependency method seems to be usable in my case. However, both cache and artifacts are extremely inefficient and slow. The node_modules are installed and usable, but the entire directory then gets uploaded somewhere and is re-downloaded between each job. (I would really love to know what happens here... Where do the modules go?)
Is there a better approach to run npm install only once at the beginning of the pipeline and then keep the node_modules in the pipeline during its entire runtime? I do not want to keep the node_modules after all jobs are finished so they don't need to be uploaded or downloaded anywhere.
Sample pipeline configuration file to reproduce the behavior:
image: node:lts

stages:
  - setup
  - build
  - test

node:
  stage: setup
  script:
    - npm install
  artifacts:
    paths:
      - node_modules/

build:
  stage: build
  script:
    - npm run build
  dependencies:
    - node

test:
  stage: test
  script:
    - npm run lint
    - npm run test
  dependencies:
    - node
Where do the modules go?
By default artifacts are saved on the main gitlab machine:
/var/opt/gitlab/gitlab-rails/shared/artifacts
Is there a better approach to run npm install only once at the beginning of the pipeline and then keep the node_modules in the pipeline during its entire runtime?
There are some options that you can try:
- Merge the setup and build stages into one stage.
- Use a local npm cache on the builder machines for faster npm install times, or use a private npm proxy registry (for example Nexus or Artifactory).
- Check whether the GitLab main machine and the builders are on the same network, so the upload/download is faster.
- Consider packaging your build in Docker. You will get reusable Docker images between your GitLab stages. (Of course, there is an overhead of uploading the images to a Docker registry.)
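As a rough sketch of the cache route (not part of the answer above, and it assumes your runners share a cache, since caches live on the runner or in its distributed cache storage): keying the cache on $CI_PIPELINE_ID scopes node_modules to a single pipeline, and cache policies let only the setup job upload it while later jobs just pull it.

image: node:lts

stages:
  - setup
  - build
  - test

# cache scoped to one pipeline: a new pipeline starts from a fresh install
cache:
  key: "$CI_PIPELINE_ID"
  paths:
    - node_modules/
  policy: pull              # by default, jobs only download the cache

node:
  stage: setup
  cache:
    key: "$CI_PIPELINE_ID"
    paths:
      - node_modules/
    policy: pull-push       # only this job uploads node_modules
  script:
    - npm install

build:
  stage: build
  script:
    - npm run build

test:
  stage: test
  script:
    - npm run lint
    - npm run test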

Gitlab CI: Cannot find output of build stage

I have my .gitlab-ci.yml file set up in the typical three stages: test, build, deploy. During the build stage, I run a command that compiles my project and puts it in a tarball. The build stage appears to execute successfully because it moves on to the deploy stage, but the deploy stage then says it can't find the tarball. Is it in another directory? What happened to it? Thanks.
For each job, gitlab-ci cleans the build folder, therefore the output files of the build stage are not available in the deploy stage.
You need to rebuild your project in the deploy stage as well.
The "stages" are only useful to order your jobs, i.e. to avoid attempting a deploy if the build failed.
EDIT:
Since GitLab 8.6, this is possible using the dependencies feature.
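A sketch of that approach; the build command, tarball name, and deploy script below are placeholders, since the original .gitlab-ci.yml is not shown:

build:
  stage: build
  script:
    - make dist                         # placeholder build command that produces the tarball
  artifacts:
    paths:
      - myproject.tar.gz                # placeholder tarball name

deploy:
  stage: deploy
  dependencies:
    - build                             # download the build job's artifacts into this job
  script:
    - ./deploy.sh myproject.tar.gz      # placeholder deploy step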
I was surprised to see the same behaviour (on GitLab 8.4).
I use cmake to create makefiles, then make to build, and then make test to run the test. I run all these in a build/ directory.
I don't want to repeat myself, and I want to easily identify which steps are failing. As such, I've created different gitlab-ci stages: cmake, make, test, etc. I then tell gitlab-ci to keep the build directory using the cache option:
cache:
  key: "$CI_BUILD_REF_NAME"
  untracked: true
  paths:
    - build/
I think that the key option will keep the same build directory for all stages acting on the same branch. See the gitlab-ci doc here: http://doc.gitlab.com/ce/ci/yaml/README.html#cache
EDIT: Don't use the cache for this! GitLab implemented reusable artifacts between stages in 8.4: https://gitlab.com/gitlab-org/gitlab-ce/issues/3423
The CI runners will have to be adapted to support this. See: https://gitlab.com/gitlab-org/gitlab-ci-multi-runner/issues/336

Resources