GitLab: How to trigger a script when a file is changed?

I have a repository in GitLab, and I would like a setup in which, whenever a specific file in a specific branch is changed, a script/job is triggered that reads the new version of that file and performs operations based on it.
That script can live on another machine and be reached over SSH, or it can live inside the same repository and be executed somehow.
Is there any way to do this with GitLab CI/CD?
Edit: I'm using GitLab Enterprise Edition 11.2.3-ee aadca99

You can use only:changes / except:changes to do this.
It was introduced in GitLab 11.4 and works with files and directories within your repository. For example:
docker build:
  script: docker build -t my-image:$CI_COMMIT_REF_SLUG .
  only:
    changes:
      - Dockerfile
      - docker/scripts/*
      - dockerfiles/**/*
      - more_scripts/*.{rb,py,sh}
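For the original use case (running a script on another machine over SSH when one file changes), a minimal sketch along the same lines, assuming the target host is reachable over SSH and a private key is stored in a CI/CD variable; the job name, branch, watched file, host and remote script path below are hypothetical:

run-remote-script:
  stage: deploy
  only:
    refs:
      - master                      # only react on the branch you care about
    changes:
      - config/watched-file.yml     # hypothetical file to watch
  script:
    # $SSH_PRIVATE_KEY is assumed to be defined as a CI/CD variable
    - eval $(ssh-agent -s)
    - echo "$SSH_PRIVATE_KEY" | ssh-add -
    # copy the new version of the file over, then run the remote script against it
    - scp -o StrictHostKeyChecking=no config/watched-file.yml user@remote-host:/tmp/watched-file.yml
    - ssh -o StrictHostKeyChecking=no user@remote-host "/opt/scripts/process-file.sh /tmp/watched-file.yml"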

Related

Gitlab runner does not always update environment

I have a GitLab job that does not seem to update the repository before being run. Sometimes it leaves some files in their old state and runs the script anyway... Any idea?
For instance, when I have a job like:
packagePython:
  stage: package
  script:
    - .\scripts\PackagePython.ps1
  tags:
    - myServer
  cache:
    paths:
      - .\python\cache\
  only:
    changes:
      - python/**/*
I finally managed to understand what was happening:
I realised that gitlab-runner did not use exactly the same path for each run on my server, while my script assumed that it did... so I ended up pointing to a build made on the wrong path.
If you think the runner is not updating the repository (like I did), make sure your scripts do not reference hardcoded paths/packages that could refer to previous versions!
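A minimal sketch of that fix, assuming the runner executes jobs in PowerShell and the packaging script only needs to know where the current checkout lives; the -SourceDir parameter is hypothetical, but CI_PROJECT_DIR is a predefined GitLab CI variable that always points at the current job's checkout:

packagePython:
  stage: package
  script:
    # CI_PROJECT_DIR is predefined by GitLab and points at this job's checkout,
    # so the script no longer depends on the runner reusing the same build path
    - .\scripts\PackagePython.ps1 -SourceDir "$env:CI_PROJECT_DIR"
  tags:
    - myServer
  only:
    changes:
      - python/**/*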

Gitlab CI to deliver files to a remote server (rsync)

I'm working with SVN, but I would like to move to Git, and more specifically to GitLab.
I have the following structure:
MyStructure/
  customer/
    client1/
      delivery.sh
      MyFiletoSend.sh
    client2/
      delivery.sh
      MyFiletoSend2.sh
Currently, "delivery.sh" sends the modifications (via rsync) of the file "MyFiletoSend.sh" to the "client1" server.
Can I have GitLab run "delivery.sh" automatically after/before the git push, only for the files modified in that push?
Example:
I have a modification to make to the file "MyFiletoSend.sh" in client1/
I make my change
I commit and push
GitLab runs "delivery.sh" for "client1/"
The file "MyFiletoSend.sh" is sent to the "client1" server without touching "client2"
Yes, it is possible, but first you need to understand how GitLab CI works. Read this article: https://docs.gitlab.com/ee/ci/yaml/
You will create a job in your pipeline that does what you want after you push the code (to master or to any other branch/MR).
As for the job itself, you have to create one; you can use this example as a starting point:
https://gist.github.com/hnlq715/6c222ba0fd868bae7e4dfd3af61bf26e
Assuming your delivery.sh scripts have all the rsync logic required, GitLab has built-in logic to detect changes in files and execute bash commands in response. You can create a separate job for each client, which can run in parallel in the same stage. This approach is also auditable in that it will clearly show you which clients got updated and with which version of the file.
update-client-1:
  stage: update-clients
  only:
    changes:
      # Detect a change only in MyFiletoSend.sh:
      - customer/client1/MyFiletoSend.sh
      # Or detect any change in the client1 folder:
      - customer/client1/*
  script:
    - cd customer/client1
    - ./delivery.sh

update-client-2:
  stage: update-clients
  only:
    changes:
      - customer/client2/*
  script:
    - cd customer/client2
    - ./delivery.sh

# repeat for all remaining clients
For more information: https://docs.gitlab.com/ee/ci/yaml/#onlychangesexceptchanges
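If a delivery.sh does not exist yet, a minimal sketch of what it might contain, assuming SSH access to the client's server is already set up; the user, host name and destination path here are hypothetical:

#!/bin/bash
# delivery.sh - push this client's file to its server (host and path are placeholders)
set -euo pipefail

# Send only MyFiletoSend.sh; -a preserves permissions/times, -z compresses, -v prints what was transferred
rsync -avz MyFiletoSend.sh deploy@client1.example.com:/opt/deliveries/MyFiletoSend.sh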

Gitlab CI Web Deployment

We are currently moving away from our deployment provider, Beanstalk, which is great, but we are on the top tier and keep running out of space or hitting our repository limits. Since we are moving away anyway, please do not suggest another SaaS provider.
I personally use GitLab for my own projects and a few company projects, and it's amazing; we use a self-hosted version on a local server in our company building.
We have CI set up and are currently using the following deployment code (I have trimmed it down to just the development deployment). It uses the shell executor, as we deploy to an existing Linux server.
variables:
  HOSTNAME: '<hostname>'
  USERNAME: '<username>'
  PASSWORD: '<password>'
  PATH_DEV: '/path/to/www'

# Define the stages (we can add as many as we want)
stages:
  # - build
  - deploy

# The code for development deployment
deploy_dev:
  stage: deploy
  script:
    - echo "Deploying to development environment..."
    - rm .gitlab-ci.yml
    - rsync -urltvz --filter=':- .gitignore' --exclude=".git" -e "sshpass -p"$PASSWORD" ssh -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null" * $USERNAME@$HOSTNAME:$PATH_DEV
    - echo "Finished deploying."
  environment:
    name: Development
    url: http://dev.domain.com
  only:
    - envdev
The Problem:
When we use the above code to deploy, it works really well and deploys all the code after optimisation etc., but we have found a little bug.
When you delete a file, the rsync command will not delete it on the server. I did some searching and found the --delete flag you can add, and it worked - but it deleted all the user-uploaded content as well. I then added the .gitignore to the filtering, so rsync would ignore the files listed there (which are usually user-generated), as well as configuration files and/or libraries (npm, etc.). This was fine until users started uploading files through the media manager in our framework, which stores them in a folder that is not in the .gitignore - and it can't be, because it also contains files we ship ourselves so they're editable by the user. So now I am unsure how to manage this.
What we are looking for is a CI setup that uploads only file changes to the server: it would look at the latest commits, find the files that have changed, and push only those files up. I would of course like to keep doing this with GitLab CI, so any ideas, examples or tutorials would be amazing.
Thanks in advance.
~ Danny
Maybe this helps: https://github.com/banago/PHPloy
This tool looks like it was designed for PHP projects, but I think it can be used for other web deployments too.
How it works:
PHPloy stores a file called .revision on your server. This file contains the hash of the commit that you have deployed to that server. When you run phploy, it downloads that file and compares the commit reference in it with the commit you are trying to deploy to find out which files to upload. PHPloy also stores a .revision file for each submodule in your repository.
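The same idea can be sketched directly in a GitLab CI job without PHPloy, assuming the server keeps a .revision file with the last deployed commit and is reachable over SSH; the user, host and paths below are hypothetical:

deploy_changed_files:
  stage: deploy
  script:
    # Read the commit hash that was last deployed (empty on first deploy)
    - LAST_DEPLOYED=$(ssh deploy@dev.example.com "cat /path/to/www/.revision" || echo "")
    # List only files that changed since that commit, excluding deletions (-d);
    # assumes the runner clones enough history for the diff (e.g. GIT_DEPTH set high enough)
    - if [ -n "$LAST_DEPLOYED" ]; then git diff --name-only --diff-filter=d "$LAST_DEPLOYED" HEAD > changed.txt; else git ls-files > changed.txt; fi
    # Upload just those files, then record the newly deployed commit on the server
    - rsync -avz --files-from=changed.txt ./ deploy@dev.example.com:/path/to/www/
    - ssh deploy@dev.example.com "echo $CI_COMMIT_SHA > /path/to/www/.revision"
  only:
    - envdev

Like PHPloy's approach, this only uploads added and changed files; deletions on the server would still need to be handled separately, which is exactly the part that made --delete risky for the user-uploaded content.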

GitLab CI/CD pull code from repository before building ASP.NET Core

I have GitLab running on computer A, the development environment (Visual Studio Pro) on computer B, and Windows Server on computer C.
I set up GitLab-Runner on computer C (Windows Server). I also set up a .gitlab-ci.yml file to build and run tests for the ASP.NET Core application on every commit.
I don't know how to get the code onto computer C (Windows Server) so I can build it (dotnet msbuild /p:Configuration=Release "%SOLUTION%"). It bothers me that not a single example .gitlab-ci.yml I found on the net pulls code from GitLab before building the application. Why?
Is this the correct way to set up CI/CD?
User creates a pull request (a new branch is created)
User writes code
User commits code to the branch from computer B.
GitLab Runner is started on computer C.
It needs to pull the code from the current branch (CI_COMMIT_REF_NAME)
Build, test, deploy ...
Should I use normal git commands to get the code, or is this something GitLab Runner already does? Where is the code?
Why does no one pull code from GitLab in .gitlab-ci.yml?
Edited:
I got the error '"git" is not recognized as an internal or external command'. The solution in my case was to restart GitLab-Runner (source).
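For reference, a one-line sketch of that restart, assuming GitLab-Runner is installed as a Windows service and the command is run from its install directory:

.\gitlab-runner.exe restart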
@MilanVidakovic explained that the source is automatically downloaded (which I didn't know).
I just have one remaining problem: how to get the correct path to my .sln file.
Here is my complete .gitlab-ci.yml file:
variables:
  SOLUTION: missing_path_to_solution #TODO

before_script:
  - dotnet restore

stages:
  - build

build:
  stage: build
  script:
    - echo "Building %CI_COMMIT_REF_NAME% branch."
    - dotnet msbuild /p:Configuration=Release "%SOLUTION%"
  except:
    - tags
I need to set the correct value for SOLUTION. My directory (where GitLab-Runner is located) currently holds these folders/files:
- config.toml
- gitlab-runner.exe
- builds/
  - 7cab42e4/
    - 0/
      - web/ # I think this is the project group in GitLab
        - test/ # I think this is the project name in GitLab
          - .sln
          - AND ALL OTHER PROJECT FILES # based on a first look
- testm.tmp
So, what are 7cab42e4 and 0? Or better, how do I get the correct path to my project structure? Is there any predefined variable?
Edited2:
The answer is CI_PROJECT_DIR.
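A sketch of how the variables block could use it; the exact path to the .sln below is hypothetical and depends on where the solution file sits in the repository:

variables:
  # CI_PROJECT_DIR is predefined by GitLab and points at the root of this job's checkout
  # (the folder ending in ...\web\test in the listing above); the .sln file name is a guess
  SOLUTION: '$CI_PROJECT_DIR\test.sln'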
I'm not sure I follow completely.
On every commit, the GitLab runner fetches your repository to C:\gitlab-runner\builds.. on the local machine (Computer C), and builds/deploys or does whatever you've defined as the action for that stage.
Also, I don't see the need for building the source code again. If you're using Computer C for both the runner and tests/acceptance, just let the runner do the building and add an artifacts item to your .gitlab-ci.yml. The path defined under artifacts will retain your executables on Computer C, which you can then use for whatever purpose.
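A minimal sketch of such an artifacts block added to the build job above; the output path is hypothetical and depends on the project's build output directory:

build:
  stage: build
  script:
    - dotnet msbuild /p:Configuration=Release "%SOLUTION%"
  artifacts:
    paths:
      # keep the Release output around after the job finishes (output path is a guess)
      - test/bin/Release/
    expire_in: 1 week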
Hope it helps.
Edit after comment:
When you push to the repository, GitLab CI/CD automatically checks your root folder for a .gitlab-ci.yml file. If it's there, the runner takes over, parses the file and starts executing the jobs/stages.
As long as the file itself is valid and contains proper jobs and stages, the runner fetches the latest commit (automatically) and does whatever the script items tell it to do.
To verify that everything works correctly, go to your GitLab project -> CI/CD -> Pipelines and check what's going on. You should see a list of pipelines there with their statuses.
Maybe it would be best if you posted your .yml file; there could be a number of reasons your runner is not picking up the code. For instance, maybe your tags don't match the tags the runner was registered to pick up, etc.

Remote trigger for (re)build CI Gitlab

I'm trying to use a remote trigger to (re)build in ci.gitlab. To explain this, I made up this scenario:
2 repositories, "lib" and "app1"
app1 builds successfully only if lib is included (easily solved in .gitlab-ci.yml)
I need to trigger a build of app1 (ideally only for the master branch) on every commit (or merge request) to lib
I tried to figure it out using web hooks, but I wasn't able to find a URL for ci.gitlab.com. Is this possible in a GitLab environment?
You can do this with the newly added triggers functionality.
In your project's CI settings, find the "Triggers" section. Add a trigger and use its token like this:
curl -X POST \
  -F token=TOKEN \
  https://ci.gitlab.com/api/v1/projects/{project_id}/refs/REF_NAME/trigger
(https://about.gitlab.com/2015/08/22/gitlab-7-14-released/)
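To wire this into lib's pipeline so that every build of lib kicks off app1, a sketch along these lines could be added to lib's .gitlab-ci.yml; the job name, stage and placeholder values are hypothetical, and the stage would need to exist in lib's stages list:

trigger-app1:
  stage: trigger
  script:
    # fire the trigger for app1's master branch after lib has built
    - curl -X POST -F token=TOKEN https://ci.gitlab.com/api/v1/projects/{project_id}/refs/master/trigger
  only:
    - master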
Obsolete:
We had the same problem, and the way we solved it was by pushing and subsequently deleting a tag.
The assumption is that you manage the machine running the GitLab-CI runner. First, clone the main repository (app1 in your case) onto it. Then, in lib's .gitlab-ci.yml, add the steps:
- cd /path/to/app1_repository
- git pull
- git tag ci-trigger master
- git push origin ci-trigger
- git push --delete origin ci-trigger
- git tag -d ci-trigger
Make sure that you have the option "Tag push events" checked in your GitLab services settings for GitLab CI.
This solution has drawbacks:
The GitLab-CI runner must have write permissions to the repository, so it won't work with shared runners
The git history gets bloated with all this tagging (especially in the GitLab UI)
I opened an issue for this (https://gitlab.com/gitlab-org/gitlab-ci/issues/223) so let's hope they add this functionality to the API (http://doc.gitlab.com/ci/api/README.html).