How to use specific runners for different GitLab installations?

I am planning to put some personal work in a repo on gitlab.com.
I also have some repos on a GitLab installation on another server, for work.
Now, I would like to use specific runners to enable CI on my repos.
I am learning how to install and register runners, but the registration step says you have to specify your URL and your token.
The thing is that I have two different URLs (gitlab.com and the GitLab server from work) and two different tokens.
So what do I do in this case?
Do I have to create two runners? How? (Which part of the installation process is done only once, and which twice?)
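In case it helps, here is a minimal sketch of how this is commonly handled: install the gitlab-runner binary once per machine, then run `gitlab-runner register` once per GitLab instance. Each registration appends its own `[[runners]]` entry (URL plus token) to the runner's config.toml, so a single runner process can serve both servers. The URLs, tokens, and descriptions below are placeholders:

```sh
# Installation is done once per machine (method depends on your OS).
# Registration is done once per GitLab instance.

# Register against gitlab.com, using the token from your project's
# Settings > CI/CD > Runners page (placeholder value):
sudo gitlab-runner register \
  --non-interactive \
  --url "https://gitlab.com/" \
  --registration-token "TOKEN_FROM_GITLAB_COM" \
  --executor "shell" \
  --description "personal-projects"

# Register again against the work server (hypothetical URL and token):
sudo gitlab-runner register \
  --non-interactive \
  --url "https://gitlab.mycompany.example/" \
  --registration-token "TOKEN_FROM_WORK_SERVER" \
  --executor "shell" \
  --description "work-projects"
```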

Related

Deploy files from branch to folder

My company has recently moved to a self-hosted GitLab instance, and now I'm trying to wrap my head around how we could use its CI/CD features for our cases. For instance:
We have a PHP-based front-end project which consists of several PHP, CSS and JS files that, as of now, are copied to our Apache2 folder in the same structure we use for development.
Is it possible to configure GitLab to do this for us, say, by implementing the following algorithm:
we make a DEPLOY branch in our repository
when we’re done with changes in other branches, we merge those branches into DEPLOY
Gitlab CI/CD detects the new activity and automatically puts the latest version of the files from this branch into our Apache2 directory (connected as a remote folder)
Any advice and guidance is appreciated. I’m currently lost in many manuals that describe Docker deployment etc. which is not yet used in our project.
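For what it's worth, something along these lines is doable with a deploy job restricted to the DEPLOY branch. A minimal sketch, assuming the runner can reach the Apache host over SSH and that an SSH_PRIVATE_KEY CI/CD variable has been set up (the server name and paths are placeholders):

```yaml
# .gitlab-ci.yml (sketch): sync the DEPLOY branch to the Apache2 directory
deploy_to_apache:
  stage: deploy
  rules:
    - if: '$CI_COMMIT_BRANCH == "DEPLOY"'   # only run on the DEPLOY branch
  before_script:
    - eval $(ssh-agent -s)
    - echo "$SSH_PRIVATE_KEY" | ssh-add -
    - mkdir -p ~/.ssh
    - ssh-keyscan web.example.com >> ~/.ssh/known_hosts
  script:
    # copy the checked-out files, preserving the repo's structure
    - rsync -avz --delete --exclude '.git' ./ deploy@web.example.com:/var/www/html/
```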

How to create a common pipeline in GitLab for many similar projects

We have hundreds of similar projects in GitLab which have the same structure inside.
To build these projects we use one common TeamCity build. We trigger the build via the API and pass the project's GitLab URL along with other parameters, so the TeamCity build knows which exact project needs to be fetched/cloned. The TeamCity VCS root accepts the target URL via a parameter.
The question is how to replace existing TeamCity build with a GitLab pipeline.
I see the general approach is to have the CI/CD configuration file (.gitlab-ci.yml) directly in the project. Since the structure of the projects is the same, duplicating the same CI/CD config file across all projects is not an option.
I'm wondering whether it is possible to create a common pipeline for several projects which can accept the target project URL via a parameter?
You can store the full CI/CD config in one repository and put a simple .gitlab-ci.yml in each of your projects which includes the shared file.
With this approach there is no redundant definition of the jobs.
You can still add other, project-specific jobs (in the respective .gitlab-ci.yml files, or define variables in a project and use some jobs conditionally). You can also include multiple other definition files, e.g. if you have several groups of similar projects.
cf. https://docs.gitlab.com/ee/ci/yaml/#include
With latest GitLab (13.9) there are even more referencing methods possible: https://docs.gitlab.com/ee/ci/yaml/README.html#reference-tags
As @MrTux already pointed out, you can use includes.
You can either use it to include a whole CI file or just certain jobs. In Having Gitlab Projects calling the same gitlab-ci.yml stored in a central location you can find a detailed explanation with examples of both usages.
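A minimal sketch of what both answers describe: each project keeps only a stub .gitlab-ci.yml that pulls the shared job definitions from a central repository (the project path, ref, and file name below are hypothetical):

```yaml
# .gitlab-ci.yml in each of the hundreds of similar projects
include:
  - project: 'my-group/ci-templates'   # central repo holding the shared config
    ref: main
    file: '/common-pipeline.yml'

# Project-specific jobs can still be defined alongside the include:
extra_checks:
  stage: test
  script:
    - ./run-project-specific-checks.sh
```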

Good practices for pulling from git repo into production server

I have a DigitalOcean VPS with Ubuntu and a few Laravel projects. For a project's initial setup I do a git clone to create a folder with my application files from my online repository.
I do all development work on my local machine, where I have two branches (master and develop); what I do is merge develop into my local master, then push master to my online repository.
Now, back on my production server, when I want to bring all the changes into production I do a git pull from origin. So far this has resulted in git telling me to stash my changes. Why is this?
What would be the best approach to pull changes into the production server? Bear in mind that my production server has no working directory per se; all I do on my VPS is either clone or pull upgrades into production.
You can take a look at CI/CD (continuous integration / continuous delivery) systems. GitLab, for example, offers a free-to-use plan for small teams.
You can create a pipeline with a manual deploy step (you have to press a button after the code is merged into the master branch) and use whatever tool you like to deploy your code (scp, rsync, ftp, sftp, etc.).
And the biggest benefit is that you can have multiple intermediate steps (even for the working branches) where you can run unit tests, which prevents you from deploying failing builds (whenever you merge non-working code).
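A minimal sketch of such a pipeline, assuming a Laravel project and an rsync-based deploy (the server address and paths are placeholders):

```yaml
stages:
  - test
  - deploy

unit_tests:
  stage: test
  script:
    - composer install
    - ./vendor/bin/phpunit          # failing tests block the deploy stage

deploy_production:
  stage: deploy
  when: manual                      # someone must press the button in the UI
  only:
    - master
  script:
    - rsync -avz --exclude '.git' ./ deploy@prod.example.com:/var/www/app/
```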
For the first problem, do a git status on production to see which files git sees as changed or added, and consider adding them to your .gitignore file (which itself should be part of your repo). Laravel generally has good defaults for these, but you might have added things or deviated from them in the process of upgrading Laravel.
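For example (hypothetical output; the exact files will differ):

```sh
# On the production server:
git status --short
#  M storage/logs/laravel.log    <- runtime files like these belong
# ?? public/uploads/                in .gitignore, not in the repo
```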
For the deployment, the best practice is to have something that is consistent, reproducible, loggable, and revertable. For this, I would recommend choosing a deployment utility. These usually do pretty much the same thing:
You define deployment parameters in code, which you can commit as a part of your repo (not passwords, of course, but things like the server name, deploy path, and deploy tasks).
You initiate a deploy directly from your local computer.
The script/utility SSH's into your target server and pulls the latest code from the remote git repo (authorized via SSH key forwarded into the server) into a 'release' folder.
The script does any additional tasks you define (composer install, npm run prod, systemctl restart php-fpm, soft-linking shared files like .env, etc.).
The script soft-links the document root to your new 'release' folder, which results in an essentially zero-downtime deployment. If any of the previous steps fail, or you find a bug in the latest release, you just soft-link to the previous release folder and your site still works.
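A rough sketch of the release/soft-link flow these utilities automate, run on the server over SSH (all paths and the repo URL are placeholders):

```sh
# 1. Pull the latest code into a fresh, timestamped release folder
RELEASE=/var/www/app/releases/$(date +%Y%m%d%H%M%S)
git clone --depth 1 git@gitlab.com:me/app.git "$RELEASE"
cd "$RELEASE"

# 2. Additional tasks
composer install --no-dev --optimize-autoloader
ln -sfn /var/www/app/shared/.env "$RELEASE/.env"   # soft-link shared files

# 3. Atomically switch the document root to the new release;
#    rolling back means pointing "current" at the previous release folder
ln -sfn "$RELEASE" /var/www/app/current
sudo systemctl restart php-fpm
```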
Here are some solutions you can check out that all do this sort of thing:
Laravel Envoyer: A 1st-party (paid) service that allows you to deploy via a web-based GUI.
Laravel Envoy: A 1st-party (free) package that allows you to connect to your prod server and script deployment tasks. It's very bare-bones in that you have to write all of the commands yourself, but some may prefer that.
Capistrano: This is a (free) tried-and-tested popular Ruby-based deployment utility.
Deployer: The (free) PHP equivalent of Capistrano. Easier to use, has a lot of built-in tasks (including a Laravel one), and doesn't require ruby.
Using these utilities is not necessarily exclusive of doing CI/CD if you want to go that route. You can use these tools to define the CD step in your pipeline while still doing other steps beforehand.

Bitbucket Pipelines access other node repository

I have enabled Bitbucket Pipelines in one of my node.js repositories to have it run the build on every commit. My repository depends on another node.js repository. For development I've linked the one to the other using npm link.
I've tried adding a git clone of that repository to the bitbucket-pipelines.yml file, but the build gets stuck on that command. I guess it's because git is asking for authentication at that point.
Is there a way to allow the container to access other repositories in the same team? Or is there a better way altogether on how to solve this? I'd also be fine with switching to another CI tool if Bitbucket Pipelines aren't capable of this – the only requirement is that it's free for teams < 5 people.
Btw. I'd like to avoid paying for npm private packages if possible.
Thanks!
You can set up access to the other repo with an SSH key, as described in the official docs: https://confluence.atlassian.com/bitbucket/access-remote-hosts-via-ssh-847452940.html
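A minimal sketch of how that could look once the pipeline's SSH key is added as an access key on the other repository (the team/repo names and Node image are placeholders; installing the dependency from the cloned path stands in for npm link):

```yaml
image: node:18

pipelines:
  default:
    - step:
        script:
          # clones without prompting once the SSH key is configured
          - git clone git@bitbucket.org:myteam/shared-lib.git
          - npm install ./shared-lib
          - npm install
          - npm test
```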

Gitlab and CI server

I recently moved a repo from Bitbucket to GitLab. I now want to have a CI (Travis or Drone) working with my repo.
After some reading, I found out that GitLab built their own CI (GitLab CI), but it needs to be self-hosted, and it doesn't seem possible to set it up on Heroku.
I don't want to manage an AWS instance only to get a CI server, as Travis, Drone (and probably some others that I don't know of) already exist and do the job.
Is there something I missed? Is there a (quick and easy) way to have GitLab CI (I repeat that I won't take a self-managed server for this), or will I have to move to GitHub or go back to Bitbucket?
GitLab is really a nice product, but the lack of a hosted CI server is a roadblock!
Thank you
It seems that Drone already does GitLab: http://feedback.gitlab.com/forums/176466-general/suggestions/5675077-integrate-docker-drone-with-gitlab-ci-runner but I haven't tried it.
You might also have a look at https://githost.io/, which manages GitLab and/or CI for you, and you can connect the CI to any GitLab instance: https://githost.io/docs#ci_master Since the CI would already be hosted there, keeping it in-house is not a concern, so you might as well also have the GitLab instance there, or at gitlab.com. It was acquired by GitLab in 2015 Q2: https://twitter.com/gitlab/status/592438051533524993
Travis on the other hand seems to be bound to GitHub and thus not an option: Integrate Gitlab and TravisCi
As mentioned by Dorum, Magnum CI also handles GitLab: https://magnum-ci.com/docs
MagnumCI now supports GitLab and other popular platforms. Also, GitLab has since launched its own CI service with shared runners.
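With shared runners on gitlab.com, enabling CI no longer requires any server of your own: just commit a .gitlab-ci.yml at the repository root. A minimal hypothetical example:

```yaml
test:
  script:
    - echo "Running tests on gitlab.com's shared runners"
    - ./run_tests.sh   # placeholder for your real test command
```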
