.gitlab-ci.yml to include multiple shell functions from multiple yml files

I have a GitLab monorepository with some backend Java and frontend Node.js code. To create a CI, I'm working on a shared approach to build both applications.
In the application repository, let's call it "A", I have the source code as well as a .gitlab-ci.yml file, as below:
A
├── .gitlab-ci.yml
├── backendapi
└── ui
.gitlab-ci.yml file,
---
include:
  - project: 'root/B'
    ref: master
    file: 'top-level.yml'
  - project: 'root/B'
    ref: master
    file: 'maven.yml'
  - project: 'root/B'
    ref: master
    file: 'node.yml'
I have another repository called "B", where I have all my CI functionality in three different files.
B
├── maven.yml
├── node.yml
└── top-level.yml
top-level.yml file that has my build stage in it,
---
stages:
  - build

variables:
  GIT_SSL_NO_VERIFY: "1"

.build_script: &build_script
  stage: build
  tags:
    - default
    - docker

java_build:
  <<: *build_script
  image:
    name: maven:latest
  script:
    - backend_build

node_build:
  <<: *build_script
  image:
    name: node:slim
  script:
    - frontend_build
maven.yml, which has the mvn build function:
.maven_build: &maven_build |-
  function backend_build {
    cd backendapi
    mvn clean package -DskipTests
  }

before_script:
  - *maven_build
node.yml, with node function in it,
.node_build: &node_build |-
  function frontend_build {
    cd ui
    npm install
    npm run build
  }

before_script:
  - *node_build
When the .gitlab-ci.yml file in repository "A" runs, it pulls top-level.yml, maven.yml, and node.yml from repository "B", which is good.
The problem: when java_build runs, it cannot find the backend_build function from maven.yml. Instead it appears to load only the frontend_build function from node.yml, as if node.yml were overwriting the backend_build function from maven.yml. node_build works as expected, since it can find the frontend_build function.
Skipping Git submodules setup
Authenticating with credentials from /root/.docker/config.json
Authenticating with credentials from /root/.docker/config.json
Authenticating with credentials from /root/.docker/config.json
$ function frontend_build { # collapsed multi-line command
$ backend_build
/bin/bash: line 90: backend_build: command not found
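The cause is at the YAML level, not the shell level: `include` merges files at the top of the configuration, and a duplicated top-level key such as `before_script:` is not concatenated but replaced, with the file included last winning. After merging, GitLab effectively sees only this (a reconstruction for illustration):

```yaml
# Effective merged configuration: node.yml is included last, so its
# before_script replaces the one from maven.yml, and backend_build
# is never defined in the job's shell
before_script:
  - |
    function frontend_build {
      cd ui
      npm install
      npm run build
    }
```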
I know that I can copy all the functions into one big yml file in repository "B" and include it in the .gitlab-ci.yml in repository "A", but here I'm trying to understand whether the above approach is even possible.
Thanks in advance!

OK, I finally found a hack, though not a complete answer, since YAML files cannot behave the way I described in my question. Instead I took a different approach to solve the problem.
There are no more maven.yml or node.yml; there are only four files in repository B: backend.yml, frontend.yml, hybrid.yml, and top-level.yml.
backend.yml has all the functions (build_app, lint_app, unit_test_app, and so on) that are required, and frontend.yml follows the same pattern with different commands inside the functions.
For example, in the backend.yml build_app function I have the maven command, while in the frontend.yml build_app function I have the npm command. The build_app function name is common to both frontend.yml and backend.yml, but the functionality differs.
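This same-name trick works because redefining a shell function silently replaces the earlier definition, which is also why backend.yml and frontend.yml must never be included together. A minimal illustration in plain shell:

```shell
# The second definition of build_app replaces the first, with no warning
build_app() { echo "maven build"; }
build_app() { echo "npm build"; }
build_app   # prints "npm build"
```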
In the top-level.yml stages, I specified the common function name as build_app in the script key.
stages:
  - build

variables:
  GIT_SSL_NO_VERIFY: "1"

.build_script: &build_script
  stage: build
  tags:
    - default
    - docker

build:
  <<: *build_script
  image: maven:latest
  script:
    - build_app
But in the .gitlab-ci.yml, depending on the build I need, I include that specific yml file. In the example below I want to build the backend, so I include backend.yml; the same applies for the frontend.
include:
  - project: 'root/B'
    ref: master
    file: 'top-level.yml'
  - project: 'root/B'
    ref: master
    file: 'backend.yml'
If I have to build both the backend and frontend, I use hybrid.yml with the same build_app function name, but containing both the maven and npm commands. I know this is not the right approach, but it suffices for the use case I'm trying to solve.
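For illustration, hybrid.yml could then look roughly like this (a sketch; the function bodies are taken from the question):

```yaml
.hybrid_build: &hybrid_build |-
  function build_app {
    cd backendapi
    mvn clean package -DskipTests
    cd ../ui
    npm install
    npm run build
  }

before_script:
  - *hybrid_build
```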
Thank you for helping me with the question!
Happy Automation :)
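As a footnote, newer GitLab versions (13.9+) offer the `!reference` tag, which sidesteps the top-level before_script collision by letting each job pull its fragment from an included file explicitly. A hedged sketch:

```yaml
# maven.yml: keep the function under a hidden job instead of a
# top-level before_script
.maven:
  before_script:
    - |
      function backend_build {
        cd backendapi
        mvn clean package -DskipTests
      }

# top-level.yml: each job references exactly the fragment it needs
java_build:
  before_script: !reference [.maven, before_script]
  script:
    - backend_build
```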

Related

gitlab-ci includes a file from another project that executes a script file

I have two different projects: project1 and project2.
Inside project1, I have the file_project1 file:
apply:
  stage: apply
  script:
    - bash folder/scripts/automation.sh
  rules:
    - if: $CI_PIPELINE_SOURCE == "push"
      when: always
In project2, I have created the .gitlab-ci.yml and I have included the project1 and the file_project1:
include:
  - project: 'namespace/project1'
    ref: main
    file: 'file_project1'
During the execution, project2 does not recognize the folder/scripts/automation.sh. I got the following error:
bash: folder/scripts/automation.sh: No such file or directory
Please, how can the pipeline inside project2 correctly execute the bash script defined in project1?
I would clone project1 from the project2 job; the script is then available at project1/folder/scripts/automation.sh. See GitLab CI/CD job tokens.
my-job:
  script:
    - git clone https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.example.com/<namespace>/<project>
The include keyword is used for importing .yml files into .gitlab-ci.yml, so it may not be useful here.
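Putting the two pieces together, the included job could clone project1 first and then call the script from the cloned path (a sketch; the token needs read access to project1, and the domain is a placeholder):

```yaml
apply:
  stage: apply
  script:
    - git clone https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.example.com/namespace/project1.git
    - bash project1/folder/scripts/automation.sh
  rules:
    - if: $CI_PIPELINE_SOURCE == "push"
      when: always
```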

CircleCI Dynamic Config / Config breakdown

Does anyone know if it's possible to break down the CircleCI config file into smaller files, where each job, command, workflow, etc. lives in its own file/subdirectory, and if so, how would you approach this?
I've been looking around and even attempted to build a Python script that assembles a config from all these YAML files, but with no luck: the anchor/reference names don't exist across the separate files, so the pyyaml library won't load them.
What I'm trying to accomplish is to have this folder structure
configs/
  dependencies.yml
  commands/
    command_1.yml
    command_2.yml
  jobs/
    job_1.yml
    job_2.yml
  workflows/
    workflow_1.yml
    workflow_2.yml
Where dependencies.yml contains a breakdown of what each workflow requires in terms of what is used in each step > job > command. And this file would be hand written.
You can do the following:
Split your config.yml into a structure defined in Packing a config.
Use dynamic configuration, where you first generate the config from step 1 and then call the generated config file from the main file.
Example original config.yml to split (note that orbs require version 2.1):
version: 2.1
orbs:
  sonarcloud: sonarsource/sonarcloud@1.0.3
jobs:
  my-job:
    docker:
      - image: cimg/latest
    steps:
      - checkout
      - run: make
workflows:
  build:
    jobs:
      - my-job
Create the following layout in a new folder called config (output of tree):
.
├── config.yml
└── config
    ├── @orbs.yml
    ├── jobs
    │   └── my-job.yml
    └── @workflows.yml
@orbs.yml contains
version: 2.1
orbs:
  sonarcloud: sonarsource/sonarcloud@1.0.3
@workflows.yml contains
workflows:
  build:
    jobs:
      - my-job
my-job.yml contains
docker:
  - image: cimg/latest
steps:
  - checkout
  - run: make
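For reference, `circleci config pack config` merges the `@`-prefixed files into the top level and turns directories into keys, so generated.yml should reproduce the original config:

```yaml
version: 2.1
orbs:
  sonarcloud: sonarsource/sonarcloud@1.0.3
jobs:
  my-job:
    docker:
      - image: cimg/latest
    steps:
      - checkout
      - run: make
workflows:
  build:
    jobs:
      - my-job
```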
And the main config.yml should look like (the circleci-cli image already ships the CLI, so no install step is needed):
version: 2.1
setup: true
orbs:
  continuation: circleci/continuation@0.3.1
jobs:
  generate-and-run-circleci:
    docker:
      - image: 'circleci/circleci-cli:latest'
    steps:
      - checkout
      - run:
          command: |
            cd .circleci
            circleci config pack config > generated.yml
      - continuation/continue:
          configuration_path: .circleci/generated.yml
workflows:
  build:
    jobs:
      - generate-and-run-circleci

Is there any way to define the artifacts paths dynamically in GitLab CI?

I have a template that I don't want to be modified, as follows:
build:
  script:
    - ...
  artifacts:
    name: $BINFILE
    paths: [$ARTIFACTS_PATH]
and a gitlab-ci.yaml that includes that template and the following variables:
variables:
  BINFILE: demo-bin-files
  ARTIFACTS_PATH: "Build"
The artifacts name is substituted correctly from the BINFILE variable, but ARTIFACTS_PATH is not, and the job throws an error when it starts:
Uploading artifacts...
WARNING: --artifact-format: no matching files
ERROR: No files to upload
What I want here is for the user to pass only the paths they want to upload as artifacts.
Can I do this, or does GitLab not support it? Thanks.
You can't,
but you can achieve this using a shell script:
Set the artifacts: path to a temp folder and, in the last step of your script: or after_script: section, copy all the content you generate dynamically into that temp folder.
It will solve your problem until GitLab adds this feature.
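A sketch of that workaround (the artifacts_tmp folder name is hypothetical):

```yaml
build:
  script:
    - ...
  after_script:
    - mkdir -p artifacts_tmp
    - cp -r "$ARTIFACTS_PATH" artifacts_tmp/   # collect dynamic paths here
  artifacts:
    name: $BINFILE
    paths:
      - artifacts_tmp/
```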
I've had a similar issue and there's a simple fix: use the template as an actual template.
template.yml (note the dot in the job name denoting it as a template)
.build:
  script:
    - ...
.gitlab-ci.yml
build:
  extends: .build
  artifacts:
    name: demo-bin-files
    paths:
      - Build
You can even define common artifact settings in the template like so:
template.yml
.build:
  script:
    - ...
  artifacts:
    when: always
    expire_in: 6 hours
and .gitlab-ci.yml remains the same.

How do I run my CI steps in a specific folder in GitHub Actions

I have a golang repo with a React client. I want to set up CI for the client using GitHub Actions. The React client lives inside the client folder in the workspace.
I have written the following workflow
name: Node.js CI
on: [push, pull_request]
jobs:
  build:
    name: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
        with:
          path: client
      - name: Set up Node.js
        uses: actions/setup-node@v1
        with:
          node-version: 12.x
      - run: yarn install
      - run: yarn build
But on committing it shows the following error
Run yarn build
yarn run v1.21.1
error Couldn't find a package.json file in "/home/runner/work/evential/evential"
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
##[error]Process completed with exit code 1
The snippet
- uses: actions/checkout@v2
  with:
    path: client
doesn't make the following steps run inside the client folder.
Need help. Thanks in advance.
You can use the working-directory keyword in a run step. See the documentation here.
- run: yarn install
  working-directory: client
- run: yarn build
  working-directory: client
Assuming your repository structure looks like this:
.
├── README.md
├── client
│   └── ...   # your source files
└── .github
    └── workflows
        └── example-ci.yml
You can also set the default working directory for multiple steps using:
defaults:
  run:
    working-directory: client # The working directory path
This way you don't need to specify it for each step.
You can also adjust the scope depending on where you put the above snippet, either for:
All steps of all jobs: put it at the base of your workflow.
All steps of one job: put it within the job in the jobs attribute of your workflow so it applies to the steps of that job.
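Putting it together, the full workflow with a workflow-level default could look like this (a sketch based on the workflow in the question; note that the default only applies to run steps, not uses steps):

```yaml
name: Node.js CI
on: [push, pull_request]
defaults:
  run:
    working-directory: client   # applies to every run step in every job
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Set up Node.js
        uses: actions/setup-node@v1
        with:
          node-version: 12.x
      - run: yarn install
      - run: yarn build
```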

Gitlab Pages throw 404 when accessed

I have a group project with the following name (hosted in Gitlab): gitlab.com/my-group/my-project.
I have generated coverage reports during testing and saved them as artifacts using Gitlab CI. Here is Gitlab CI config:
test:
  stage: test
  image: node:11
  before_script:
    - npm install -g yarn
    - yarn
  cache:
    paths:
      - node_modules/
  script:
    - yarn lint
    - yarn test --all --coverage src/
  except:
    - tags
  artifacts:
    paths:
      - coverage/
  coverage: '/Statements\s+\:\s+(\d+\.\d+)%/'

deploy-pages:
  stage: deploy
  dependencies:
    - test
  script:
    - mv coverage/ public/
  artifacts:
    paths:
      - public/
    expire_in: 30 days
  except:
    - tags
When I open the deploy stage job, I can see the artifact being created, with all the files under the /public directory in the artifact.
Now, when I go to: https://my-group.gitlab.io/my-project, I keep getting 404.
I am not sure what step I am missing here. Can someone shed some light on this issue for me?
Thanks!
There are three basic requirements for the project itself:
project must be named group.gitlab.io (if you want it to be the base domain)
job must create artifact in public directory
job must be called pages
Most likely it's the last one that needs fixing since your job is currently called deploy-pages. Simply rename that to pages.
You'll know when you got everything working because under Settings > Pages, it will tell you the link where it's published to.
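Applying that to the config above, the deploy job just needs renaming, with everything else kept as-is:

```yaml
# Same job as before, renamed so GitLab Pages picks it up
pages:
  stage: deploy
  dependencies:
    - test
  script:
    - mv coverage/ public/
  artifacts:
    paths:
      - public/
    expire_in: 30 days
  except:
    - tags
```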
