Getting pipeline to execute only in current directory - GitLab

My current GitLab CI YAML looks like this:
stage: terraform-apply
script:
  - terragrunt run-all apply -auto-approve --terragrunt-non-interactive --terragrunt-working-dir env/${ENVIRONMENT}
when: manual
Consider that I have folders like this:
../api/
../infra/
If I'm working in the 'api' folder, I want the CI/CD pipeline to run only for commits that touch that working directory. How do I go about doing this? How do I edit the GitLab YAML so the pipeline executes only for the current working directory?
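One common approach, sketched here as an assumption rather than a confirmed answer (the job names and path patterns below are illustrative, not from the question), is to give each directory its own job and guard it with rules:changes so the job is only added when files under that directory change:

terraform-apply-api:
  stage: terraform-apply
  rules:
    # Only add this job when something under api/ changed; keep it manual as before.
    - changes:
        - api/**/*
      when: manual
  script:
    - terragrunt run-all apply -auto-approve --terragrunt-non-interactive --terragrunt-working-dir api

terraform-apply-infra:
  stage: terraform-apply
  rules:
    - changes:
        - infra/**/*
      when: manual
  script:
    - terragrunt run-all apply -auto-approve --terragrunt-non-interactive --terragrunt-working-dir infra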

Related

Not to start a pipeline when only specific files are changed

I am trying not to start a pipeline when only specific files are changed.
In case I change only the README.md, README.adoc, or .gitignore file, the pipeline must not run.
I cannot make it work correctly yet.
image: alpine
stages:
  - test
workflow:
  rules:
    - changes:
        - "**/*.md"
        - "**/*.adoc"
        - "**/.gitignore"
      when: never
    - when: always
test:
  script:
    - "echo Hello Pipeline"
  tags:
    - dev
My test cases are:
Push abc.java and abc.python -> should run the pipeline: success.
Push abc.java and README.md -> should run the pipeline, but it doesn't, because a *.md file is in the push.
Create a branch -> should run the pipeline, but it doesn't, because *.md files are in the source branch.
Merge a branch -> should run the pipeline, but it doesn't, because *.md files are in the source branch.
Create a tag -> should run the pipeline, but it doesn't, because *.md files are in the source branch.
Has anyone succeeded?
I would like to use a blacklist of file extensions that stops the pipeline, instead of a whitelist of file extensions that runs it, because my projects use several programming languages and have many different file extensions.
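Worth noting: rules:changes evaluates to true if any changed file matches any listed pattern, which is exactly why the second test case skips. The README.md in the push satisfies the when: never rule even though abc.java changed too, so a blacklist rule skips the pipeline whenever any blacklisted file is present, not only when all changed files are blacklisted. The direct alternative, sketched below with illustrative patterns (and admittedly the whitelist the question hopes to avoid), is to list the paths that should trigger the pipeline:

workflow:
  rules:
    # Run when any non-documentation file changes; the patterns are examples only.
    - changes:
        - "**/*.java"
        - "**/*.py"
        - "src/**/*"
      when: always
    - when: never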

GitLab submodule complains about AWS CloudFormation templates located in submodule

I am facing a problem with GitLab include and I'm wondering whether it's possible to do what I intend to do.
I have 2 GitLab repositories:
my-infrastructure
my-prod-deployment
In my-infrastructure I have the YAML file that defines a job whose script references a local file. For example, in my-infrastructure I have the following:
A gitlab-infrastructure.yml template
image: amazon/aws-cli
variables:
  FOO: ReplaceMe
stages:
  - one
agent_ui:
  stage: microservices
  script:
    - aws cloudformation deploy --stack-name sample --template-file templates/aws-template.yml
and also a templates/aws-template.yml that contains some CloudFormation code.
Notice that the GitLab template needs access to a local file that exists in the same project, my-infrastructure.
Now, in the other project, my-prod-deployment, I have a .gitlab-ci.yml with:
include:
  - project: mycompany/my-infrastructure
    ref: main
    file: gitlab-infrastructure.yml

variables:
  FOO: Bar
When I run this CI/CD pipeline I can see the FOO variable being properly overridden and I can see that the included job's script is executed. The problem is that I get:
$ aws cloudformation deploy --stack-name sample --template-file templates/aws-template.yml
Invalid template path templates/aws-template.yml
This is probably because the local relative path is in my-infrastructure, but not in my-prod-deployment that file is not locally available in this project and therefore it can't be found.
Is there any solution to this?
Maybe a way to include not only GitLab CI YAML but also other files, or something similar?
Or maybe some kind of shortcut or link to a different repo's folder?
Or maybe a way to temporarily copy a remote folder into the local CI/CD pipeline execution?
Notice that I cannot use an absolute or URL path for that script parameter, since that specific tool (the AWS CLI) does not allow it; otherwise I wouldn't face this relative path issue.
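On the "temporarily copy a remote folder" idea: one hedged sketch is to clone my-infrastructure inside the consuming job before calling the AWS CLI, authenticating with the predefined job token. Whether the job token is allowed to read the other project depends on the GitLab version and that project's job token settings, so treat this as an assumption to verify rather than a confirmed fix:

agent_ui:
  stage: microservices
  before_script:
    # Clone the repo that holds the CloudFormation templates into the job workspace.
    - git clone "https://gitlab-ci-token:${CI_JOB_TOKEN}@gitlab.com/mycompany/my-infrastructure.git"
  script:
    - aws cloudformation deploy --stack-name sample --template-file my-infrastructure/templates/aws-template.yml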
UPDATE 1: I have tried a workaround with git submodules, separating the GitLab template into a different project and adding my-infrastructure as a submodule:
cd my-prod-deployment
git submodule add git@gitlab.com:mycompany/my-infrastructure.git
so that my .gitlab-ci.yml looks like this
include:
  - project: mycompany/my-gitlab-templates
    ref: main
    file: gitlab-infrastructure.yml

variables:
  CLOUDFORMATION_SUBMODULE: my-infrastructure
  FOO: Bar
and my repo has a local folder my-infrastructure, but I am shocked to find that it still complains about the AWS CloudFormation template path, so I've added the AWS CloudFormation tag to the question and edited it.
This is the error:
$ aws cloudformation deploy --stack-name sample --template-file $CLOUDFORMATION_SUBMODULE/templates/aws-template.yml
Invalid template path my-infrastructure/templates/aws-template.yml
Cleaning up project directory and file based variables 00:00
ERROR: Job failed: exit code 1
There is a my-infrastructure/templates/aws-template.yml path under my repo. It's part of the submodule. So I don't understand why this workaround does not work.
Any help would be appreciated.
I fixed the issue with the git submodule approach.
I had to make two changes, as per https://docs.gitlab.com/ee/ci/git_submodules.html:
Add the submodule with a relative path:
git submodule add ../my-infrastructure.git
so that .gitmodules shows the relative URL within the same server:
[submodule "my-infrastructure"]
  path = my-infrastructure
  url = ../my-infrastructure.git
and add this variable to .gitlab-ci.yml:
variables:
  # submodule behavior
  GIT_SUBMODULE_STRATEGY: recursive
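Putting the two changes together, a sketch of what the consuming project's .gitlab-ci.yml ends up looking like, based only on the pieces shown in this question:

include:
  - project: mycompany/my-gitlab-templates
    ref: main
    file: gitlab-infrastructure.yml

variables:
  # Check the my-infrastructure submodule out into the job workspace.
  GIT_SUBMODULE_STRATEGY: recursive
  # Base path of the submodule, consumed by the template's aws cloudformation call.
  CLOUDFORMATION_SUBMODULE: my-infrastructure
  FOO: Bar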

How to run a script from repo A in the pipeline of repo B in GitLab

I have two repositories in GitLab, repositories A and B, let's say.
Repo A contains:
read_ci.yml
read_ci.sh
read_ci.yml contains:
stages:
  - initialise

create checksum from pipeline:
  stage: initialise
  script:
    - chmod +x read_ci.sh
    - source ./read_ci.sh
Repo B contains:
gitlab-ci.yml
gitlab-ci.yml contains:
include:
  project: 'Project/project_name'
  file:
    - '.gitlab-ci.yml'
  ref: main
Obviously, this doesn't do what I intend.
What I want to achieve is to run the project A script in the project B pipeline.
The reason is that I want project A to be called from multiple different pipelines and run there.
(I'm looking for a GitLab equivalent of this: Azure Pipelines - run script from resource repo.)
Submodules would absolutely work, as Davide mentions, though it's kinda like using a sledgehammer to hang a picture. If all you want is a single script from the repository, just download it into your container: use the v4 API with your CI_JOB_TOKEN to download the file, then simply run it using sh. If you have many files in your secondary repository and want access to them all, then use submodules as Davide mentions, and make sure your CI job retrieves them by setting the submodule strategy like this:
variables:
  GIT_SUBMODULE_STRATEGY: normal
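A hedged sketch of that API download, using the repository files endpoint and the question's read_ci.sh; the project ID is a placeholder you would replace with repo A's numeric ID or URL-encoded path:

fetch and run script:
  script:
    # Download read_ci.sh from repo A via the GitLab v4 API, authenticated with the job token.
    - 'curl --header "JOB-TOKEN: $CI_JOB_TOKEN" --output read_ci.sh "$CI_API_V4_URL/projects/<repo-A-id>/repository/files/read_ci.sh/raw?ref=main"'
    - sh ./read_ci.sh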
If you want to run the project A script in the project B pipeline, you can add repository A as a git submodule in B:
git submodule add -b <branch-A> <git-repository-A> <target-dir>
You also need to add the variable GIT_SUBMODULE_STRATEGY: recursive to the CI job.
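A sketch of what repo B's job might then look like, assuming the submodule was added in a directory named repo-a (that path is illustrative):

stages:
  - initialise

variables:
  # Check out the repo A submodule alongside repo B's code.
  GIT_SUBMODULE_STRATEGY: recursive

run repo A script:
  stage: initialise
  script:
    - chmod +x repo-a/read_ci.sh
    - source ./repo-a/read_ci.sh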

gitlab-ci.yml - override a specific job and script execution

I have a .gitlab-ci.yml file that says:
include:
  - project: 'my-proj/my-gitlab-ci'
    ref: master
    file: '/pipeline/gitlab-ci.yml'
Because of some inconvenience, I would like to override a specific job that is defined in the above-mentioned gitlab-ci.yml file and pulled into my top-level .gitlab-ci.yml file. The plan stage I am interested in has the following:
plan-dummy:
  stage: plan
  script:
    - terraform plan -lock=false -var-file=vars/vars.tfvars
What I want to do is override the above in the main .gitlab-ci.yml file, so that only the script is overridden:
plan-dummy:
  stage: plan
  script:
    - terraform refresh # This is the line I want to add as an additional step before the next one
    - terraform plan -lock=false -var-file=vars/dev.tfvars
How do I achieve that without fiddling with the included file? Yes, I know the alternative is a dirty copy-paste from the child file, but I don't want to do that.
Regards,
Simply reuse the same job name and add the configuration you need:
plan-dummy:
  before_script:
    - terraform refresh
You can do the terraform refresh in a before_script part, which will be executed before your script:
plan-dummy:
  extends: plan-dummy
  before_script:
    - terraform refresh
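If you would rather replace the script outright instead of using before_script, a hedged sketch of the top-level .gitlab-ci.yml could look like this; redefining the included job merges key by key, and a locally defined script replaces the included one entirely:

include:
  - project: 'my-proj/my-gitlab-ci'
    ref: master
    file: '/pipeline/gitlab-ci.yml'

plan-dummy:
  script:
    - terraform refresh
    - terraform plan -lock=false -var-file=vars/dev.tfvars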

GitLab CI Pipeline not triggered for events on default branch

I have two branches in my GitLab repo (uat and production). Two deploy jobs are meant to deploy each branch to a specific environment. There are two gitlab-ci.yml files, one in each branch (with the config for that branch), and production is my default branch.
The jobs should run only if files in dir/ changed and not for scheduled pipelines.
Problem: the deploy job for UAT works as expected: it runs if I push directly to the branch or if I accept a merge request. However, although there is no difference except the branch, the deploy job for production is not triggered by any event.
Question: Do you know if I misunderstood something and what would fix this?
Thanks!
gitlab-ci.yml in production
deploy_to_production:
  only:
    refs:
      - production
    changes:
      - dir/*
  except:
    - schedules

script:
  # upload to prod
gitlab-ci.yml in uat
deploy_to_uat:
  only:
    refs:
      - uat
    changes:
      - dir/*
  except:
    - schedules

script:
  # upload to uat
Do you have those empty lines before script: in your file?
This will define script under default, because it is not tied to a job:
default:
  script:
    # upload to uat
The reason it only runs for uat is that, on the second reference, the first one gets overwritten. You can check this in GitLab on your project page under CI/CD > Editor, where you can also view the final YAML after merging.
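A hedged sketch of the corrected production file implied by that answer, with the script nested under the job so it is tied to deploy_to_production rather than sitting as a stray top-level key (the deploy command is a placeholder):

deploy_to_production:
  only:
    refs:
      - production
    changes:
      - dir/*
  except:
    - schedules
  script:
    - echo "upload to prod"  # placeholder for the real deploy command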
