So I have two services, A and B.
I am trying to use A as a GitLab CI service in one of B's jobs:
.gitlab-ci.yml
job1:
  stage: dummy
  services:
    - name: registry.gitlab.com/service_a
      alias: A
  script:
    - echo "hello"
Now B has a file that service A has to access.
Is it possible for service A to access a file in B's repository using the CI_PROJECT_DIR GitLab variable?
I have a solution containing a web application and an Azure Function.
So different pipes should be used for deployment:
microsoft/azure-web-apps-deploy:1.0.3 - for the web app
microsoft/azure-functions-deploy:1.0.2 - for the Azure Function
How should they be combined in one YAML file? Or do I need separate YAML files?
It looks like I can't have multiple deployment steps for a single environment?
Bitbucket Pipelines supports neither multiple files for defining the pipeline nor different deployment steps for the same deployment stage, so: yes, same file, same step.
pipelines:
  default:
    - step:
        deployment: production
        script:
          - pipe: [docker://]microsoft/azure-web-apps-deploy:1.0.3
            variables:
              FOO: bar
          - pipe: [docker://]microsoft/azure-functions-deploy:1.0.2
            variables:
              BAR: foo
So, deploying to "production" will deploy both the webapp and the function.
The docker:// prefix for the pipe is necessary if the pipes aren't registered as such but are arbitrary docker images with smart entrypoints.
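For example, a registered pipe and a custom one referenced as a plain Docker image would look roughly like this (the custom image name below is just a hypothetical placeholder):

script:
  # Registered pipe, as listed in the Bitbucket pipes directory:
  - pipe: microsoft/azure-web-apps-deploy:1.0.3
    variables:
      FOO: bar
  # Arbitrary Docker image with a pipe-style entrypoint (hypothetical name):
  - pipe: docker://myaccount/my-custom-pipe:1.0.0
    variables:
      FOO: bar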
However, you could use custom deployment environments like "Production Webapp" and "Production Function" so you can deploy them separately, like this:
pipelines:
  default:
    - parallel:
        - step:
            deployment: production-webapp
            script:
              - pipe: [docker://]microsoft/azure-web-apps-deploy:1.0.3
                variables:
                  FOO: bar
        - step:
            deployment: production-function
            script:
              - pipe: [docker://]microsoft/azure-functions-deploy:1.0.2
                variables:
                  BAR: foo
and Bitbucket will keep track of the deployments made to each "environment" and also let you redeploy to them individually.
I have a simple pipeline where I just want to navigate to a specific folder on the server hosting the runner.
My runner is online, I added the host in the SSH Keys section, and my step is:
- step:
    name: 'Write log to server'
    services:
      - docker
    runs-on:
      - self.hosted
      - linux
    script:
      - export HOST_PROJECT_PATH=/home/project/myproject
      - ls -a
      - cd $HOST_PROJECT_PATH  # folder not found, but it exists on the server
The ls -a shows the contents of my Bitbucket repository, and I can't cd to an existing directory (the directory exists on the server).
Can I do this using a pipeline and a runner? Do I need to use another service? I can't find any example or documentation.
We have multiple environments (staging, production...) and I don't want to put sensitive information like database passwords inside the codebase. For this, I want to use the environment variables provided by GitLab CI/CD.
However, I don't know how to tell GitLab to use a different set of variables depending on my environment.
What I've done so far:
1. Create environments: via the UI (Project => Operations => Environments), I created two environments, STAGING and PRODUCTION.
2. Create variables: via the UI (Project => Settings => CI/CD => Variables), I created the variable DB_PASSWORD twice (with, of course, different values assigned), one with the environment scope set to STAGING and the other set to PRODUCTION.
Now what I want to do is run my project's pipeline. So I go to Project => CI/CD => Pipelines => Run Pipeline, and here I expect GitLab CI to ask whether I would like to run my pipeline with the STAGING or the PRODUCTION set of variables, but it doesn't.
How am I supposed to tell GitLab that I want to run my pipeline using the DB_PASSWORD value corresponding to the environment I want to target?
You need to specify the environment in your .gitlab-ci.yml file, see here.
Example from the official GitLab docs:
stages:
  - test
  - build
  - deploy

test:
  stage: test
  script: echo "Running tests"

build:
  stage: build
  script: echo "Building the app"

deploy_staging:
  stage: deploy
  script:
    - echo "Deploy to staging server"
  environment:
    name: staging
    url: https://staging.example.com
  only:
    - master
In this example, when deploy_staging runs, the environment is set to staging, so you can access the variables defined for that environment, like so:
deploy_staging:
  stage: deploy
  script:
    - echo "Deploy to staging server"
  environment:
    name: staging
    url: https://staging.example.com
  variables:
    DB_PASS: ${DB_PASSWORD} # which is your defined variable within GitLab CI
  only:
    - master
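For completeness, the production counterpart would follow the same pattern; a rough sketch (the job name and URL are placeholders, and the environment name has to match the scope you set on the variable):

deploy_production:
  stage: deploy
  script:
    - echo "Deploy to production server"
  environment:
    name: production # must match the environment scope set on the DB_PASSWORD variable
    url: https://example.com
  variables:
    DB_PASS: ${DB_PASSWORD} # resolves to the PRODUCTION-scoped value here
  only:
    - master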
I am new to GitLab CI/CD settings. I don't want to put sensitive credentials (like API keys, passwords...) into my branch. For this, GitLab (and other CI/CD services) can provide environment variables.
What I have done so far:
Via UI (Project ⇒ Settings ⇒ CI/CD ⇒ Variables)
First go to Project ⇒ Settings ⇒ CI/CD ⇒ Variables and add the variables there.
Then I try to read the file with all the config values (e.g. with dotenv):
require("dotenv");
module.exports = process.env.NODE_ENV.trim() === "production" ? _config.production : _config.development;
My current .gitlab-ci.yml file is:
image: node:8.9.0

cache:
  paths:
    - node_modules/

stages:
  - ver
  - init
  - test
  - build
  - deploy

ver:
  stage: ver
  script:
    - node -v

init:
  stage: init
  script:
    - npm install
  tags:
    - dev_server
  only:
    - dev
  variables:
    ENV_PRODUCTION: "/builds/AkdiD/8/abcde/projectName/ENV_PRODUCTION"

test:
  stage: test
  script:
    - npm test

build:
  stage: build
  script:
    - echo "BUILD_VERSION=production" >> build.env
  artifacts:
    reports:
      dotenv: build.env

deploy:
  stage: deploy
  script:
    - npm run killcurrent
    - echo $ENV_PRODUCTION
    - echo $BUILD_VERSION
    - npm run staging
  tags:
    - dev_server
  only:
    - dev
Question: where do I need to put this ENV_PRODUCTION file variable (in the YAML file or somewhere else) so that the server picks up its value?
I edited the variable like this, but the server is still not fetching these variables. Should I change or add something in the .gitlab-ci.yml file?
Setting up a custom environment variable of type File (GitLab 11.11+) does not seem to be the way to reference/set a list of variables, including ones with sensitive information.
A variable of type file is generally there to represent, for instance, a certificate.
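For illustration, GitLab saves the value of a File-type variable to a temporary file and puts that file's path into the variable, so a job would consume it roughly like this (SERVICE_CERT is a hypothetical File-type variable):

check_cert:
  script:
    # $SERVICE_CERT holds the path of the temp file containing the certificate
    - openssl x509 -in "$SERVICE_CERT" -noout -subject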
Instead, you should define regular variables, possibly group-level environment variables:
You can define per-project or per-group variables that are set in the pipeline environment.
Group-level variables are stored out of the repository (not in .gitlab-ci.yml) and are securely passed to GitLab Runner, which makes them available during a pipeline run.
For Premium users who do not use an external key store or who use GitLab’s integration with HashiCorp Vault, we recommend using group environment variables to store secrets like passwords, SSH keys, and credentials.
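Once such a variable (say DB_PASSWORD) is defined at the group or project level, any job can read it like an ordinary environment variable; a minimal sketch, where deploy.sh is just a hypothetical script:

deploy:
  stage: deploy
  script:
    # DB_PASSWORD is injected by the runner at job time and never stored in the repo
    - ./deploy.sh "$DB_PASSWORD"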
I am trying to improve the CI/CD process of an old funky project whose code is not open to refactoring at the moment. I just can't get this to work following the Azure documentation, or even tell whether it is possible.
I have been able to improve the current state with an Azure pipeline file that runs unit tests before merging into the releases/dev branch, but I want to go further.
Tasks: every PR into releases/dev will:
script: npm run test:unit
script: npm run build:dev
copy/publish the contents of the .div/ folder to an Azure blob store configured as a static site
Any PR or merge into releases/staging will:
script: npm run build:staging
copy/publish the contents of the .div/ folder to an Azure blob store configured as a static site
Any PR or merge into master will:
script: npm run test:unit
script: npm run build:production
copy/publish the contents of the .div/ folder to an Azure blob store configured as a static site
I have three questions:
Is this possible within a single YAML file?
How do I run different tasks for different branches? I've defined jobs/stages but can't get them to be conditional.
Is there some magic anyone can direct me to that lets me copy the contents of a directory to a blob store? Or must it be zipped -> copied -> unzipped?
Thanks in advance from a new, sleep-deprived dad.
Is this possible within a single YAML file? How do I run different tasks for different branches? I've defined jobs/stages but can't get them to be conditional.
Of course you can: add these stages in a single YAML file, then define the condition field for each stage or each job.
Here is an example for stages:
trigger:
  - '*'

pool:
  vmImage: 'ubuntu-latest'

stages:
  - stage: Test1
    condition: or(eq(variables['Build.SourceBranch'], 'refs/heads/master'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/master'))
    jobs:
      - job: BuildJob
        steps:
          - script: echo Build Stage1!
  - stage: Test2
    condition: or(eq(variables['Build.SourceBranch'], 'refs/heads/dev'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/dev'))
    jobs:
      - job: BuildJob
        steps:
          - script: echo Build Stage2!
  - stage: Test3
    condition: or(eq(variables['Build.SourceBranch'], 'refs/heads/staging'), eq(variables['System.PullRequest.TargetBranch'], 'refs/heads/staging'))
    jobs:
      - job: BuildJob
        steps:
          - script: echo Build Stage3!
You could set the required branches as triggers, then use Build.SourceBranch and System.PullRequest.TargetBranch to determine whether to run the current stage.
Build.SourceBranch -> for builds of the merged branch.
System.PullRequest.TargetBranch -> for pull request builds.
Here are the docs about conditions and variables.
Is there some magic anyone can direct me to that lets me copy the contents of a directory to a blob store? Or must it be zipped -> copied -> unzipped?
Since you need to publish files to Azure Blob storage, you can directly use the Azure File Copy task.
Here is an example:
- task: AzureFileCopy@4
  displayName: 'AzureBlob File Copy'
  inputs:
    SourcePath: xxx
    azureSubscription: xxx
    Destination: AzureBlob
    storage: xxx
    ContainerName: '$web'
Hope this helps.