I am facing a problem with GitLab include and I'm wondering whether it's possible to do what I intend to.
I have 2 GitLab repositories:
my-infrastructure
my-prod-deployment
In my-infrastructure I have a YML file that defines a job whose command references a local file. For example, my-infrastructure contains the following:
A gitlab-infrastructure.yml template
image: amazon/aws-cli

variables:
  FOO: ReplaceMe

stages:
  - one

agent_ui:
  stage: one
  script:
    - aws cloudformation deploy --stack-name sample --template-file templates/aws-template.yml
and I also have a templates/aws-template.yml that contains some CloudFormation code.
Notice that the GitLab template needs access to a local file that exists in the same project, my-infrastructure.
Now in the other project my-prod-deployment I have a .gitlab-ci.yml with
include:
  - project: mycompany/my-infrastructure
    ref: main
    file: gitlab-infrastructure.yml

variables:
  FOO: Bar
When I run this CI/CD pipeline I can see the FOO variable being properly overridden, and I can see that the included job's script is executed. The problem is that I get a
$ aws cloudformation deploy --stack-name sample --template-file templates/aws-template.yml
Invalid template path templates/aws-template.yml
This is probably because the relative path refers to a file in my-infrastructure; that file is not locally available in my-prod-deployment, and therefore it can't be found.
Is there any solution to this?
Maybe a way to include not only gitlab but also other files or similar?
Or maybe some kind of shortcut or link to a different repo folder?
Or maybe a way to temporarily copy a remote folder into the local CI/CD pipeline execution?
Notice that I cannot use an absolute or URL path for that script parameter, since that specific tool (the AWS CLI) does not allow it. Otherwise I wouldn't face this relative path issue.
UPDATE 1: I have tried a workaround with git submodules, separating the GitLab template into a different project and adding my-infrastructure as a submodule:
cd my-prod-deployment
git submodule add git@gitlab.com:mycompany/my-infrastructure.git
so that my .gitlab-ci.yml looks like this
include:
  - project: mycompany/my-gitlab-templates
    ref: main
    file: gitlab-infrastructure.yml

variables:
  CLOUDFORMATION_SUBMODULE: my-infrastructure
  FOO: Bar
and my repo has a local folder my-infrastructure, but I am shocked to find that it still complains about the AWS CloudFormation template path, so I've added the AWS CloudFormation tag to the question and edited it.
This is the error
$ aws cloudformation deploy --stack-name sample --template-file $CLOUDFORMATION_SUBMODULE/templates/aws-template.yml
Invalid template path my-infrastructure/templates/aws-template.yml
Cleaning up project directory and file based variables 00:00
ERROR: Job failed: exit code 1
There is a my-infrastructure/templates/aws-template.yml path under my repo. It's part of the submodule. So I don't understand why this workaround does not work.
Any help would be appreciated.
I fixed the issue with the git submodule approach.
I had to make 2 changes, as per https://docs.gitlab.com/ee/ci/git_submodules.html:
Add the submodule with a relative path:
git submodule add ../my-infrastructure.git
so that .gitmodules shows the relative URL within the same server:
[submodule "my-infrastructure"]
  path = my-infrastructure
  url = ../my-infrastructure.git
and add this variable to .gitlab-ci.yml
variables:
  # submodule behavior
  GIT_SUBMODULE_STRATEGY: recursive
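Putting the two changes together, the consuming project's .gitlab-ci.yml would look roughly like this sketch (project paths and the CLOUDFORMATION_SUBMODULE variable are taken from the question above; everything else follows the standard include/submodule mechanics):

```yaml
# my-prod-deployment/.gitlab-ci.yml (sketch)
include:
  - project: mycompany/my-gitlab-templates
    ref: main
    file: gitlab-infrastructure.yml

variables:
  # make the runner clone the my-infrastructure submodule before the job runs
  GIT_SUBMODULE_STRATEGY: recursive
  CLOUDFORMATION_SUBMODULE: my-infrastructure
  FOO: Bar
```

With the submodule checked out, the relative path $CLOUDFORMATION_SUBMODULE/templates/aws-template.yml resolves inside the job's working directory.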
My current GitLab YML code looks like this:
stage: terraform-apply
script:
  - terragrunt run-all apply -auto-approve --terragrunt-non-interactive --terragrunt-working-dir env/${ENVIRONMENT}
when: manual
Consider I have folders like this
../api/
../infra/
If I'm working in the 'api' folder, I want my CI/CD pipeline to run only for commits that touch the current working directory. How do I edit the GitLab YML to make the pipeline execute only for the current working directory?
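One hedged sketch of how this is usually done, assuming GitLab's rules:changes keyword fits the setup (the api/ path and job name are taken from the folder layout above; the script line is the one from the question):

```yaml
# Run the apply job only when the pushed commits change files under api/.
terraform-apply-api:
  stage: terraform-apply
  script:
    - terragrunt run-all apply -auto-approve --terragrunt-non-interactive --terragrunt-working-dir env/${ENVIRONMENT}
  rules:
    # the job is added to the pipeline only if files under api/ changed,
    # and still requires a manual click as in the original job
    - changes:
        - api/**/*
      when: manual
```

A second job with `changes: - infra/**/*` would cover the other folder the same way.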
I have a project in gitlab and want to deploy this to a specific directory on a Linux server using the .gitlab-ci.yml for which I am facing an issue.
I have set up a GitLab runner for "cms-project" and added .gitlab-ci.yml to the root directory.
When I push to the repository, the runner fetches the commits to the following directory
home/gitlab-runner/builds/ziwwUK3Jz/0/project/cms_project
Now I want the runner to fetch the commit to the dev server which is located in
/var/www/project-cms.com/html
I have tried the changes and below is the .gitlab-ci.yml file
job_main:
  type: deploy
  script: cd /var/www/project-cms.com/html && git pull

deploy:
  stage: deploy
  only:
    - master
  variables:
    BRANCH: master
  script:
    - composer install
but I am getting the following error
Removing modules/contrib/
Removing vendor/
Skipping Git submodules setup
Executing "step_script" stage of the job script
$ cd /var/www/project-cms.com/html && git pull
error: cannot open .git/FETCH_HEAD: Permission denied
The user has root permissions on the directory.
I have already gone through this link "https://stackoverflow.com/questions/37564681/gitlab-ci-how-to-deploy-the-latest-to-a-specific-directory" but that did not help.
Can anyone please help me deploy the project to the "/var/www/project-cms.com/html" directory?
This is because you did not understand the difference between the GitLab runner and your server.
This code runs inside a "machine" called the GitLab runner, which is responsible for executing anything you want. But it is not your web server.
To be able to deploy your code, you need to build a script that will run inside this runner, and you need to understand that your entire code is inside this runner (this is the reason you found your code inside the /builds folder).
So you need to tell your runner to copy your code to your web server, using any protocol/tool like ssh, ftp, sftp, etc.
Edit: if your runner is running inside your web server, you just need to copy the code to your folder; you do not need to pull with git anymore.
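As an illustration of the copy-to-webserver idea, a deploy job could look something like this sketch (the host name and SSH user are placeholders, not values from the question; the target path is the one you mentioned):

```yaml
deploy:
  stage: deploy
  only:
    - master
  script:
    # copy the checked-out code from the runner's build directory
    # to the web server over SSH (assumes an SSH key is already
    # configured for deploy-user on the runner)
    - rsync -az --delete ./ deploy-user@example.com:/var/www/project-cms.com/html/
```

If the runner runs on the web server itself, the rsync target can simply be the local folder and no SSH setup is needed.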
Some examples here: https://docs.gitlab.com/ee/ci/examples/
Docs: https://docs.gitlab.com/ee/ci/
Runner docs: https://docs.gitlab.com/runner/
I have two repositories in GitLab, repositories A and B let's say.
Repo A contains:
read_ci.yml
read_ci.sh
read_ci.yml contains:
stages:
  - initialise

create checksum from pipeline:
  stage: initialise
  script:
    - chmod +x read_ci.sh
    - source ./read_ci.sh
Repo B contains:
gitlab-ci.yml
gitlab-ci.yml contains:
include:
  - project: 'Project/project_name'
    file:
      - '.gitlab-ci.yml'
    ref: main
Obviously, this doesn't do what I intend.
What I want to achieve is for the project B pipeline to run the project A script.
The reason is that I want project A to be called from multiple different pipelines and run there.
(This is the GitLab equivalent of: Azure Pipelines. Run script from resource repo.)
Submodules would absolutely work, as Davide mentions, though it's kind of like using a sledgehammer to hang a picture. If all you want is a single script from the repository, just download it into your container: use the v4 API with your CI_JOB_TOKEN to download the file, then simply run it using sh. If you have many files in your secondary repository and want access to them all, then use submodules as Davide mentions, and make sure your CI job retrieves them by setting the submodule strategy like this:
variables:
  GIT_SUBMODULE_STRATEGY: normal
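A sketch of the single-file approach, assuming the script lives at read_ci.sh on the main branch of Project/project_name (the project path in the URL must be URL-encoded, with %2F replacing the slash; CI_API_V4_URL and CI_JOB_TOKEN are predefined GitLab CI variables):

```yaml
run-remote-script:
  script:
    # download one file from the other repository via the v4 repository files API
    - 'curl --header "JOB-TOKEN: $CI_JOB_TOKEN" --output read_ci.sh "${CI_API_V4_URL}/projects/Project%2Fproject_name/repository/files/read_ci.sh/raw?ref=main"'
    - sh read_ci.sh
```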
If you want to run the project A script in the project B pipeline, you can add repository A as a git submodule in B:
git submodule add -b <branch-A> <git-repository-A> <target-dir>
You also need to add the variable GIT_SUBMODULE_STRATEGY: recursive to the CI job.
I got a requirement to generate, archive and reuse the artifacts between two different repositories
Repository A: Compile Angular code and create an XLF file
Repository B: Use the XLF file generated above and create a new XLF file
Repository A: Again use the newly generated XLF file to create the final output file
The activities mentioned above should be done using gitlab-ci.yml. I am not sure how to handle this using GitLab CI.
We can push the artifact from Repo A to Repo B. However, CI on Repo A should wait until Repo B pushes a new artifact to Repo A to complete the process
Ideally, you would not push a generated artifact to another Git source repository.
But a GitLab pipeline can retrieve an artifact produced by another one from its URL.
To avoid the back and forth, I would rather have three jobs instead of two:
the first generates the XLF file
the second curls/fetches that file and uses it to generate the new XLF file
the third curls/fetches that file and completes the process.
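For instance, a fetching job can pull the artifact straight from the other project's artifact URL (the group, project, job name, and artifact path below are illustrative, not from the question):

```yaml
use-xlf:
  script:
    # fetch the latest artifact produced by job build-xlf on main in Repo A,
    # using GitLab's "download single artifact file" URL scheme
    - curl --location --output messages.xlf "https://gitlab.example.com/mygroup/repo-a/-/jobs/artifacts/main/raw/output/messages.xlf?job=build-xlf"
    - ./generate.sh messages.xlf   # hypothetical script producing the new XLF
```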
How to release built artifacts back-and-forth from one to another repo on GitLab?
Repository A:
Compile Angular code and create an XLF file
Send a hook to repository B that it just compiled
just use trigger: https://docs.gitlab.com/ee/ci/yaml/#trigger , it works like a charm. It's even nicely visible in the GUI.
or the API: https://docs.gitlab.com/ee/ci/triggers/
pass the variable PARENT_PIPELINE_ID: $CI_PIPELINE_ID to repository B so it can download artifacts from the specific pipeline
Repository B:
Use the 'XLF File' generated above
use needs: https://docs.gitlab.com/ee/ci/yaml/#artifact-downloads-to-child-pipelines to download artifacts
or the API: add a personal access token from repository A (https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html) to the environment variables and use the API to download the artifacts: https://docs.gitlab.com/ee/api/job_artifacts.html
create a new XLF file
use trigger: or API to trigger repository A
but this time trigger a different .gitlab-ci.yml file, like:
trigger:
  include:
    - project: repositoryA
      file: second_stage.gitlab-ci.yml
https://docs.gitlab.com/ee/ci/yaml/#trigger-child-pipeline-with-files-from-another-project
or use something like variables: SECOND_STAGE: "true" and use that variable to differentiate
Repository A:
run pipeline from the file second_stage.gitlab-ci.yml
download artifacts from repository B - needs: or API
use the newly generated XLF file to create the final output file
Overall, what you need is the rules: and needs: documentation. On older GitLab versions, this was done with the API.
CI on Repo A should wait until Repo B pushes a new artifact to Repo A to complete the process
Don't wait. Let the API trigger it.
I tried the following approach and it worked fine, or at least I was able to proceed.
For some reason 'variables' along with curl did not work as expected, but I did not analyze the root cause.
Repo A pipeline:
trigger-repob: # trigger Project B of Repo B
  stage: repob
  trigger:
    project: repob-namespace/projectb
    branch: devops

test_job:
  image: $CI_REGISTRY/$CI_PROJECT_PATH/base-image:latest
  stage: test_pipeline
  when: delayed
  start_in: 2 minutes
  needs: # use artifacts from Repo B/Project B
    - project: repob-namespace/projectb
      job: buildprojectb
      ref: devops
      artifacts: true
  script:
    - do something here
Repo B pipeline:
buildprojectb:
  image: php:7.4.11
  stage: build
  script:
    - do something here
  artifacts:
    paths:
      - outputs/*.xlf
Is the list value build:image just a name, like build_image? Or does it have a special usage in either the YAML file or the .gitlab-ci.yml file? If there isn't a special usage, what is the value of using name1:name2 instead of name1_name2?
The :image doesn't seem to be put into a variable. When I run this through the GitLab pipeline, the output is:
Skipping Git submodules setup
Restoring cache
Downloading artifacts
Running before_script and script
$ echo image is $image
image is
.gitlab-ci.yml
stages:
  - build:image
  - tag:image
  - deploy

build:
  stage: build:image
  script:
    - echo image is $image
I don't see anything like this in the GitLab CI/CD Pipeline Configuration Reference.
Where did you see this .gitlab-ci.yml file?
I ran the .gitlab-ci.yml you provided and it seems to work fine; apparently GitLab CI doesn't treat the colon in any special way, and I wouldn't expect it to, as there is no mention of it in the documentation.