Hey, I have a problem with my Bitbucket Pipelines caching.
In the first step I can see that the cache is downloaded and used:
Cache "cpphp71": Downloading
Cache "cpphp71": Downloaded 3 MB in 4 seconds
Cache "cpphp71": Extracting
Cache "cpphp71": Extracted in 0 seconds
But when the cache should be uploaded again, this message is displayed:
Cache "cpphp71": Skipping upload for existing cache
I don't really know what to change to make it work again.
image: albertcolom/ci-pipeline-php:7.1-alpine

pipelines:
  default:
    - step:
        name: testing
        caches:
          - cpphp71
        script:
          - php --version
    - step:
        name: Package build
        caches:
          - cpphp71
        script:
          - echo "test"

definitions:
  caches:
    cpphp71: /test/.rl/repo
The cache is only uploaded if no cache exists yet. The cache automatically clears itself after a week, and you can also clear it via the Bitbucket UI.
If you have two different dependencies, you should create two caches, like this:
image: albertcolom/ci-pipeline-php:7.1-alpine

pipelines:
  default:
    - step:
        name: testing
        caches:
          - cpphp71_1
        script:
          - php --version
    - step:
        name: Package build
        caches:
          - cpphp71_2
        script:
          - echo "test"

definitions:
  caches:
    cpphp71_1: /test/.rl/repo
    cpphp71_2: /test/.rl/repo
For more information about caching see here: https://confluence.atlassian.com/bitbucket/caching-dependencies-895552876.html
I have a stage in my CI pipeline (gitlab-ci) as follows:
build_node:
  stage: Build Prerequisites
  only:
    - staging
    - production
    - ci
  image: node:15.5.0
  artifacts:
    paths:
      - http
  cache:
    key: "node_modules"
    paths:
      - ui/node_modules
  script:
    - cd ui
    - yarn install --network-timeout 600000
    - CI=false yarn build
    - mv build ../http
The UI, however, is not the only part of the project. Other files have their own build processes, so whenever we commit changes that touch only those files, this stage gets rerun every time, even if nothing in the ui folder changed.
Is there a way to have GitLab cache the result, or otherwise skip rebuilding this stage when nothing under the ui folder has changed? Any change that should trigger a rebuild would be under the ui folder. Can it just reuse the older build?
This is possible in recent GitLab versions using the rules:changes keyword.
rules:
  - changes:
      - ui/**/*
Link: https://docs.gitlab.com/ee/ci/jobs/job_control.html#variables-in-ruleschanges
This checks for changes inside the ui folder (the **/* glob also matches files in subdirectories, which ui/* would not) and runs the job only when something there changed.
Check this link for more info: https://docs.gitlab.com/ee/ci/yaml/#ruleschanges
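For reference, a minimal sketch of how the rule could be attached to the build_node job from the question. Note that rules cannot be combined with only/except in the same job, so the only: list is folded into an if: condition; the branch regex is an assumption based on the question's branch names:

build_node:
  stage: Build Prerequisites
  image: node:15.5.0
  rules:
    # run only on these branches, and only when something under ui/ changed
    - if: '$CI_COMMIT_BRANCH =~ /^(staging|production|ci)$/'
      changes:
        - ui/**/*
  artifacts:
    paths:
      - http
  script:
    - cd ui
    - yarn install --network-timeout 600000
    - CI=false yarn build
    - mv build ../http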
Since migrating our tests from Bitbucket to GitLab, the video is no longer recorded during pipeline runs. Has anyone encountered a similar problem? Cypress version 7.3.0.
stages:
  - build
  - test

variables:
  npm_config_cache: "$CI_PROJECT_DIR/.npm"
  CYPRESS_CACHE_FOLDER: "$CI_PROJECT_DIR/cache/Cypress"

cache:
  key: ${CI_COMMIT_REF_SLUG}
  paths:
    - .cache/*
    - cache/Cypress
    - node_modules
    - build

build: # job name assumed; it is missing from the original snippet
  image: cypress/browsers:node14.15.0-chrome86-ff82
  stage: build
  script:
    - yarn install
    - npx cypress cache path
    - npx cypress cache list

phone-sanity-tests-development:
  image: cypress/browsers:node14.15.0-chrome86-ff82
  stage: test
  parallel: 15
  script:
    - yarn cypress:run-phone-development-sanity
  artifacts:
    paths:
      - cypress/screenshots/**
      - cypress/videos/**
      - cypress/reports/**
      - cypress/projects/phone/puppeteer/videos/**
You need to add --record to yarn cypress:run-phone-development-sanity.
To tell Cypress to record and take screenshots, you need to configure this on the run command in the yml file.
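A minimal sketch of what that could look like, assuming the yarn script forwards extra flags to cypress run and that the project's record key is stored in a CYPRESS_RECORD_KEY CI/CD variable (both are assumptions, not part of the original config):

phone-sanity-tests-development:
  image: cypress/browsers:node14.15.0-chrome86-ff82
  stage: test
  parallel: 15
  script:
    # --record uploads results, videos and screenshots to the Cypress Dashboard;
    # CYPRESS_RECORD_KEY is assumed to be set as a masked CI/CD variable
    - yarn cypress:run-phone-development-sanity --record --key $CYPRESS_RECORD_KEY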
This link is a nice example of how the Cypress team configures their gitlab-ci.yml:
https://github.com/cypress-io/cypress-realworld-app/blob/develop/.gitlab-ci.yml
I would like to use an artifact from a previous pipeline, but checking the documentation I haven't been able to find out how.
I've only seen how to reuse them in the same pipeline (https://confluence.atlassian.com/bitbucket/using-artifacts-in-steps-935389074.html)
How can I reuse an existing artifact from a previous pipeline?
This is my current bitbucket-pipelines.yml:
image: php:7.2.18

pipelines:
  branches:
    delete-me:
      - step:
          name: Build docker containers
          artifacts:
            - docker_containers.tar
          services:
            - docker
          script:
            - docker/build_containers_if_not_exists.sh
            - sleep 30 # wait for docker to start all containers
            - docker save $(docker images -q) -o ${BITBUCKET_CLONE_DIR}/docker_containers.tar
      - step:
          name: Compile styles & js
          caches:
            - composer
          script:
            - docker load --input docker_containers.tar
            - docker-compose up -d
            - composer install
Maybe you can try the Pipelines Caches feature. You should define your own custom cache, for example:
definitions:
  caches:
    docker_containers: /docker_containers
The cache will be saved after the first successful build and will be available to subsequent pipelines for the next 7 days. Here is more info about using caches: https://confluence.atlassian.com/bitbucket/caching-dependencies-895552876.html
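A minimal sketch of how that cache could be wired into the steps from the question (saving the tar into the cached /docker_containers directory instead of ${BITBUCKET_CLONE_DIR} is an assumption, as is adding the docker service to the second step so docker load can run):

pipelines:
  branches:
    delete-me:
      - step:
          name: Build docker containers
          caches:
            - docker_containers
          services:
            - docker
          script:
            - docker/build_containers_if_not_exists.sh
            - sleep 30 # wait for docker to start all containers
            # write the tar into the cached directory so it gets uploaded
            - mkdir -p /docker_containers
            - docker save $(docker images -q) -o /docker_containers/docker_containers.tar
      - step:
          name: Compile styles & js
          caches:
            - docker_containers
          services:
            - docker
          script:
            - docker load --input /docker_containers/docker_containers.tar
            - docker-compose up -d
            - composer install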
I'm trying to set up GitLab CI for a monorepo.
For the sake of argument, let's say I want to process 2 JavaScript packages:
app
cli
I have defined four stages:
install
test
build
deploy
Because I'm reusing the files from previous steps, I use the GitLab cache.
My configuration looks like this:
stages:
  - install
  - test
  - build
  - deploy

install_app:
  stage: install
  image: node:8.9
  cache:
    policy: push
    paths:
      - app/node_modules
  script:
    - cd app
    - npm install

install_cli:
  stage: install
  image: node:8.9
  cache:
    policy: push
    paths:
      - cli/node_modules
  script:
    - cd cli
    - npm install

test_app:
  image: node:8.9
  cache:
    policy: pull
    paths:
      - app/node_modules
  script:
    - cd app
    - npm test

test_cli:
  image: node:8.9
  cache:
    policy: pull
    paths:
      - cli/node_modules
  script:
    - cd cli
    - npm test

build_app:
  stage: build
  image: node:8.9
  cache:
    paths:
      - app/node_modules
      - app/build
  script:
    - cd app
    - npm run build

deploy_app:
  stage: deploy
  image: registry.gitlab.com/my/gcloud/image
  only:
    - master
  environment:
    name: staging
    url: https://example.com
  cache:
    policy: pull
    paths:
      - app/build
  script:
    - gcloud app deploy app/build/app.yaml
      --verbosity info
      --version master
      --promote
      --stop-previous-version
      --quiet
      --project "$GOOGLE_CLOUD_PROJECT"
The problem is in the test stage. Most of the time the test_app job fails because the app/node_modules directory is missing. Sometimes a retry works, but mostly it does not.
Also, I would like to use two caches for the build_app job: pull app/node_modules and push app/build. I can't find a way to accomplish this, which makes me feel like I don't fully understand how the cache works.
Why are my cache files gone? Do I misunderstand how the GitLab CI cache works?
The cache is provided on a best-effort basis, so don't expect it to always be present.
If you have hard dependencies between jobs, use artifacts and dependencies.
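A minimal sketch of that approach for the app package (the expire_in value is an arbitrary choice, not from the answer):

install_app:
  stage: install
  image: node:8.9
  script:
    - cd app
    - npm install
  artifacts:
    paths:
      - app/node_modules
    expire_in: 1 hour # arbitrary; artifacts only need to outlive the pipeline

test_app:
  stage: test
  image: node:8.9
  dependencies:
    - install_app # fetch artifacts from this job only
  script:
    - cd app
    - npm test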
Anyway, if it is just node_modules, I suggest installing it in every job instead of using artifacts; you will not save much time with artifacts.
I have this pipeline file to unit-test my project:
image: jameslin/python-test

pipelines:
  default:
    - step:
        script:
          - service mysql start
          - pip install -r requirements/test.txt
          - export DJANGO_CONFIGURATION=Test
          - python manage.py test
But is it possible to switch to another Docker image for the deployment?
image: jameslin/python-deploy

pipelines:
  default:
    - step:
        script:
          - ansible-playbook deploy
I cannot seem to find any documentation saying either Yes or No.
You can specify an image for each step, like this:
pipelines:
  default:
    - step:
        name: Build and test
        image: node:8.6
        script:
          - npm install
          - npm test
          - npm run build
        artifacts:
          - dist/**
    - step:
        name: Deploy
        image: python:3.5.1
        trigger: manual
        script:
          - python deploy.py
Finally found it:
https://confluence.atlassian.com/bitbucket/configure-bitbucket-pipelines-yml-792298910.html#Configurebitbucket-pipelines.yml-ci_stepstep(required)
step (required): Defines a build execution unit. Steps are executed in the order in which they appear in the pipeline. Currently, each pipeline can have only one step (one for the default pipeline and one for each branch). You can override the main Docker image by specifying an image in a step.
I have not found any information saying yes or no either. Since the image can be configured with all the languages and technology you need, I would suggest this method:
Create your Docker image with all the utilities you need for both the default build and deployment.
Use the branching method they show in their examples: https://confluence.atlassian.com/bitbucket/configure-bitbucket-pipelines-yml-792298910.html#Configurebitbucket-pipelines.yml-ci_branchesbranches(optional)
Use shell scripts or other scripts to run the specific tasks you need, for example:
image: yourusername/your-image

pipelines:
  branches:
    master:
      - step:
          script: # Modify the commands below to build your repository.
            - echo "Starting pipelines for master"
            - chmod +x your-task-configs.sh # necessary to get the shell script to run in BB Pipelines
            - ./your-task-configs.sh
    feature/*:
      - step:
          script: # Modify the commands below to build your repository.
            - echo "Starting pipelines for feature/*"
            - npm install
            - npm install -g grunt-cli
            - npm install grunt --save-dev
            - grunt build