GitLab pipeline rules: multiple changes

I am wondering if there is a way to configure a GitLab pipeline with AND logic in the rules:changes section.
I want a job to run only when there are changes in two directories. I initially thought it would work like this, but it doesn't:
changes:
  - Dockerfile
  - docker/scripts/*
This is expected, as the GitLab docs say:
changes resolves to true if any of the matching files are changed (an OR operation)
So, is there any way to make this work like an AND operation?
Thanks in advance!
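Since rules:changes only ORs its patterns, one workaround is to gate the job on one path and verify the second path inside the script. A sketch, assuming the paths from the question and the standard predefined variables (the job name is illustrative, and CI_COMMIT_BEFORE_SHA can be all zeros on a branch's first push):

```yaml
# Sketch of an AND workaround: rules:changes gates on the first path,
# the script checks the second path and exits early if it didn't change.
build image:
  rules:
    - changes:
        - Dockerfile
  script:
    - |
      # Did docker/scripts/ also change in this push?
      if git diff --name-only "$CI_COMMIT_BEFORE_SHA" "$CI_COMMIT_SHA" | grep -q '^docker/scripts/'; then
        echo "Both Dockerfile and docker/scripts changed - building"
      else
        echo "docker/scripts unchanged - nothing to do"
        exit 0
      fi
```

Note the trade-off: when only the Dockerfile changed, the job still appears in the pipeline and passes, it just does nothing.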

Related

Run gitlab CI pipeline stage ONLY on clone of repo (or when a template was the source)?

I have a use case where I have a template project set up that I then use as a base for new microservices. While this works for building some basic stuff, such as base files, the CI/CD pipeline, etc., it's still the template. I'm going through all the CI/CD variables now to check, but wanted to see if anyone else has had this use case come up. Basically, I want to know if there's something like a "first run on repo creation" trigger that can run as soon as the repo is cloned from the template, but then never run again. This stage would modify some of the files in the repo to change names of things like the service, etc.
Is there any way to currently do this? The only other way I can think of doing it would be to have another project that uses the api or something to get the new repo name then check in the modified files.
Thanks!
You could use a rule that checks for a specific commit message crafted to be the message at the HEAD of the template project. Optionally, you can also check if the project ID is not the project ID of the template project (to avoid the job running in the template project itself).
rules:
  - if: '$CI_COMMIT_MESSAGE == "something very specific" && $CI_PROJECT_ID != "1234"'
When a new project is created, the rule will evaluate to true, but future commits that users make won't (or at least shouldn't, under normal circumstances) match the rule.
Though, to hook into project creation, using a system hook (for self-managed instances) might be a better option.

Can we detect and get specific directory changed in gitlab CI/CD?

I am a beginner learning GitLab CI/CD and doing some projects with it. I ran into trouble using OpenFaaS to build a large number of functions. I want to use CI/CD to automatically build only the functions that have updates and skip the rest, but I don't know how to get the name of the specific folder or directory that changed in my source, so that I can build the corresponding functions.
So I wonder: is there a way to get the name of a changed directory in GitLab CI/CD? Or is there another CI/CD tool that can do that?
only:changes is what you are looking for:
# .gitlab-ci.yml
my job:
  script: echo "I am triggered only when changes to some/dir are applied"
  only:
    changes:
      - some/dir/**/*
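only:changes gates the job, but it does not tell the job which directory changed. To actually get the directory names inside the job, a sketch using git and the predefined commit variables (the job name is illustrative, and CI_COMMIT_BEFORE_SHA can be all zeros on a branch's first push):

```yaml
detect changed functions:
  script:
    # List the files changed by this push, keep only top-level directory names
    - CHANGED_DIRS=$(git diff --name-only "$CI_COMMIT_BEFORE_SHA" "$CI_COMMIT_SHA" | cut -d/ -f1 | sort -u)
    - echo "Changed directories:" $CHANGED_DIRS
    # Build only the functions whose directory changed
    - for d in $CHANGED_DIRS; do echo "would build function in $d"; done
```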

Is there a way to trigger different commit builds on merge requests based on different file changes in GitlabCI?

We are migrating from TFS to GitLab. Ours is a huge repository with multiple technology stacks, and each solution has its own unit tests associated. We are planning to allow merge requests to complete only upon a successful build of the particular solution affected by the changes in the merge request. As we have multiple solutions, is there a way to tell the GitLab CI YAML to trigger a particular job only if the merge request is associated with specific file changes?
E.g. if I have changes from solution A in the merge request, the pipeline should trigger job A.
Currently, we have a YAML snippet similar to this. But it's getting triggered on all merge requests, while we want it to trigger only for merge requests touching files under /docs/UI.
rules:
  - if: $CI_MERGE_REQUEST_IID
  - changes:
      - ./docs/UI
Our Gitlab version: GitLab Enterprise Edition 12.6.6-ee
Any leads would be appreciated. Thanks!!
You are using the keyword rules. Rules are evaluated in order until the first match; when matched, the job is either included or excluded from the pipeline, depending on the configuration. In your snippet, the if clause and the changes clause are two separate rules, so any merge request matches the first rule regardless of which files changed.
Use the keyword only instead. For example like this:
only:
  refs:
    - merge_requests
  changes:
    - docs/UI/**/*
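For what it's worth, on GitLab versions with fuller rules support you can express the same thing with rules directly: when if and changes appear in the same rule entry, both must match (an AND). A sketch; the 12.6 instance from the question may not support this combination yet:

```yaml
rules:
  - if: '$CI_PIPELINE_SOURCE == "merge_request_event"'
    changes:
      - docs/UI/**/*
```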

Gitlab: Issues and Pipelines

I have set up a Git project + CI (using gitlab-runner) on GitLab v12.3.5. I have a question about issues and pipelines. Let's say I create an issue and assign it to myself, which creates a branch/merge request. Then I open the Web IDE to modify some files in an attempt to fix the issue. Now I want to see if my changes will fix the issue. In order to run the pipeline, is it necessary to commit the changes into the branch, or is there some other way?
The scenario I have is that it may take me 20 attempts at fixing the files to make the pipeline 'clean'. In that case, I would have to commit each change to see the results. What is the preferred way to accomplish this? Is it possible to run the pipeline by just staging the changes to see if they work?
I am setting up the .gitlab-ci.yml file, hence it is taking a lot of trial and error to get it working properly.
You should create a branch and push to that. Only pushed changes will trigger pipeline runs. After you're done, you can squash and merge the branch so that the repo's history will be clean.
Usually though, you won't have to do this because you'll have automated tests set up to check whether your code works. You should also try testing the Linux commands (or whichever commands you're running in your GitLab CI scripts) locally first. If you're worried about whether your .gitlab-ci.yml syntax is correct, you can navigate to the file in your repository and check there (there's a button at the top which lints it).

Can we have multiple triggers that execute different jobs in one yaml file?

Is it possible for a pipeline to have multiple triggers in one YAML file that execute different jobs per trigger?
In our pipeline, we pack each project in the solution and push it as a NuGet package to our own Azure DevOps Artifacts feed, and we want to do the packing and pushing depending on the project. I saw that it is possible to specify the branch and path in the trigger, but according to this you can only have one trigger. However, that was only indicated in the question, and the documentation doesn't explicitly state it.
Right now my option is to just configure different pipelines with a YAML file per project, but I want to ask here to confirm whether this is possible or not.
I agree with Jessehouwing: you can add multiple triggers. You can use conditions on tasks, jobs, stages and environments to only run in specific cases.
https://learn.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema#triggers
https://learn.microsoft.com/en-us/azure/devops/pipelines/process/conditions?tabs=yaml&view=azure-devops
Thanks for the input. I studied the docs, but it's not possible to achieve what I wanted with just the built-in tasks for Azure DevOps. I had to make a script that does it and assigns true or false values to the conditionals.
The exact answer I was looking for was in this post
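The script-plus-conditions approach described above can be sketched like this in Azure Pipelines YAML (the job, step, variable, and project names are illustrative):

```yaml
jobs:
  - job: detect
    steps:
      # Inspect the changed paths and publish an output variable
      - bash: |
          if git diff --name-only HEAD~1 HEAD | grep -q '^ProjectA/'; then
            echo "##vso[task.setvariable variable=buildA;isOutput=true]true"
          fi
        name: changes
  - job: pack_project_a
    dependsOn: detect
    condition: eq(dependencies.detect.outputs['changes.buildA'], 'true')
    steps:
      - script: echo "pack and push the ProjectA NuGet package"
```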
