Copy existing Jenkins configuration to a new Jenkins - Linux

I'm new to Jenkins and having trouble with some basic tasks.
I have a configured Jenkins on a Linux SUSE VM with several installed plugins and important jobs. I'd like to set up a new VM (also SUSE) with another Jenkins instance that has the same configuration and jobs as the existing one. So basically my goal is to entirely copy the existing Jenkins instance's functionality. I'd also like to automate this process with Ansible later on. What is the easiest way to move the existing Jenkins configuration and jobs so that the new instance behaves like the old one?
What I've tried already:
archived the existing Jenkins home directory
created a new instance of Jenkins on the other VM
transferred the archive, extracted it, and replaced the new instance's home directory with it
I think some steps are missing, as when I start the new Jenkins its status is active (exited).

The backup/restore procedure from the Jenkins documentation works for me. Have you tried it?
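In case it helps with the active (exited) symptom: below is a minimal sketch of the manual copy, assuming a package-based install with JENKINS_HOME at /var/lib/jenkins (the path may differ on SUSE). A common culprit is file ownership after extracting the archive as root.

    # On the old VM: stop Jenkins so the home directory is consistent, then archive it.
    sudo systemctl stop jenkins
    sudo tar -czf jenkins-home.tar.gz -C /var/lib/jenkins .

    # On the new VM: stop the fresh instance, unpack over its home, and fix ownership.
    sudo systemctl stop jenkins
    sudo tar -xzf jenkins-home.tar.gz -C /var/lib/jenkins
    sudo chown -R jenkins:jenkins /var/lib/jenkins   # extracting as root leaves root-owned files
    sudo systemctl start jenkins
    sudo systemctl status jenkins                    # should now report active (running)

Once the manual procedure works, the same steps translate directly into Ansible tasks (the service, unarchive, and file modules).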


boto3 describe_images empty (Jenkins/Packer)

I am running Packer via a Jenkins pipeline and want to delete the AMI afterwards.
I am using a small python3/boto3 script to do that.
However, when calling describe_images I get an empty list, and no errors (with debug on).
If I run the same script via the same Docker-based agent (on an EC2 Jenkins node) but from a different pipeline, it works.
I also have no issues on another project with similar settings.
Sometimes, intermittently, it will work, but seldom.
I can rule out a general configuration issue, as the same script works perfectly on the same systems (just in a different Jenkins pipeline).
I can also rule out a general issue with the Jenkins pipeline, as it will intermittently work, without changes.
What am I missing?
Yikes, this was a stupid mistake on my side. My script to fetch the AMI ID from the Packer manifest.json was not returning the correct AMI ID (I had assumed I'd only find one AMI ID in that file).
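For anyone else who lands here: a minimal sketch of reading the ID from Packer's manifest post-processor output, assuming the default manifest.json format. The point is that builds is a list that grows with every run, so you have to select the entry for the run you care about.

    import json
    import boto3

    # The manifest post-processor appends one entry per build, so the file can
    # contain several AMI IDs; pick only the one from the most recent run.
    with open("manifest.json") as f:
        manifest = json.load(f)

    last_run = manifest["last_run_uuid"]
    build = next(b for b in manifest["builds"] if b["packer_run_uuid"] == last_run)

    # artifact_id looks like "eu-central-1:ami-0123456789abcdef0"
    region, ami_id = build["artifact_id"].split(":")

    ec2 = boto3.client("ec2", region_name=region)
    print(ec2.describe_images(ImageIds=[ami_id])["Images"])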

GitLab - Synology - automatic synchronisation

I'm using GitLab on a Synology NAS.
For safety reasons, I would like to automatically synchronise (mirror) the entire GitLab system on my NAS with my account on gitlab.com.
Is there an easy way to do this?
Your GitLab repositories are stored under /volume1/docker/gitlab/repositories (or a similar directory).
To synchronise, you would need to run a script that pushes each repository (which is stored in bare mode) to gitlab.com. This gist shows how to do it for GitHub, and it should work much the same for gitlab.com:
https://gist.github.com/joahking/780877
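As a rough sketch of what such a script could look like on the NAS (the gitlab.com user name is a placeholder, and the glob may need adjusting for namespaced repository paths):

    #!/bin/sh
    # Push every bare repository on the NAS to a matching project on gitlab.com.
    REPO_ROOT=/volume1/docker/gitlab/repositories
    GITLAB_USER=your-gitlab-username   # placeholder: your gitlab.com account

    for repo in "$REPO_ROOT"/*.git; do
        name=$(basename "$repo" .git)
        # --mirror pushes all branches and tags and prunes refs deleted locally
        git -C "$repo" push --mirror "git@gitlab.com:$GITLAB_USER/$name.git"
    done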
Once configured, you can use the task manager in Synology's control panel to execute the script regularly.
Good luck!

What is GitLab Runner?

I think I'm fundamentally missing something. I'm new to CI/CD and trying to set up my first pipeline ever with GitLab.
The project is a pre-existing PHP project.
I don't want to clean it up just yet; at the moment I've pushed the whole thing into a Docker container and it's running fine, talking to Google Cloud's MySQL databases etc. as it should, both locally and on a remote Google Cloud testing VM.
The dream is to be able to push to the development branch, and then merge the dev branch into the test branch, which then TRIGGERS the automated tests (easy part) and also causes the remote test VM (hosted on Google Cloud) to PULL the newest changes, rebuild the image from the latest Dockerfile (or pull the latest image from the GitLab image registry), and then rebuild the container with the newest image.
I'm playing around with GitLab's runner, but I'm not understanding what it's actually for, despite looking through almost all the online content about it.
Do I just install it on the Google Cloud VM, and then, when I push to GitLab from my development machine, the repo will 'signal' the runner (which is running on the VM) to execute a bunch of scripts (which might include a git pull of the newest changes)?
Because I already pre-package my app into a container locally (and push the image to the image registry), do I need to use docker as my executor on the runner? Or can I just use the shell executor and run the commands directly?
What am I missing?
TLDR and extra:
Questions:
What is the runner actually for,
and where is it meant to be installed?
Does it care which directory it is run in?
If it doesn't care which directory it's run in,
where does it execute its script commands? At root?
If I am locally building my own images and uploading them to GitLab's registry,
do I need to set my executor to docker? Shouldn't I just set it to shell, pull the image, and build it? (Assuming the runner is running on the remote VM.)
What is the runner actually for?
You have your project along with a .gitlab-ci.yml file. .gitlab-ci.yml defines what stages your CI/CD pipeline has and what to do in each stage. This typically consists of build, test, and deploy stages. Within each stage you can define multiple jobs; for example, in the build stage you may have three jobs to build on Debian, CentOS, and Windows (in GitLab terms: build:debian, build:centos, build:windows). A GitLab runner clones the project, reads the .gitlab-ci.yml file, and does what it is instructed to do. So basically, a GitLab runner is a Go process that executes the tasks it is given.
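For illustration, a minimal .gitlab-ci.yml along those lines (job names and commands are placeholders):

    stages:
      - build
      - test
      - deploy

    build:debian:
      stage: build
      script:
        - make build          # placeholder build command

    test:
      stage: test
      script:
        - make test

    deploy:
      stage: deploy
      script:
        - ./deploy.sh         # placeholder deploy script

Jobs in the same stage run in parallel; a stage starts only after the previous stage's jobs have succeeded.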
Where is it meant to be installed?
You can install a runner in any of the environments listed here: https://docs.gitlab.com/runner/install/
or
you can use a shared runner that is already installed on GitLab's infrastructure.
Does it care which directory it is run in?
Yes. Every task executed by the runner runs relative to CI_PROJECT_DIR, defined in https://gitlab.com/help/ci/variables/README, but you can alter this behaviour.
Where does it execute its script commands? At root?
Do I need to set my executor to docker? Shouldn't I just set it to shell, pull the image, and build it?
A runner can have multiple executors, such as docker, shell, and virtualbox, with docker being the most common one. If you use docker as the executor, you can pull any image from Docker Hub or your configured registry, and you can do a lot with Docker images. In a Docker environment you normally run as the root user.
https://docs.gitlab.com/runner/executors/README.html
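The executor is chosen when you register the runner and is recorded in its config.toml. A sketch of what a docker-executor entry looks like (name, token, and image are placeholders):

    # /etc/gitlab-runner/config.toml (excerpt)
    [[runners]]
      name = "my-docker-runner"
      url = "https://gitlab.com/"
      token = "RUNNER-TOKEN"        # placeholder
      executor = "docker"
      [runners.docker]
        image = "alpine:latest"     # default image jobs run in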
See the GitLab access logs: the runner is constantly polling the server.

How to migrate GitLab to a new server?

I am trying to migrate a GitLab setup from 7.8.2 to 7.12.2. I am not really sure how to go about this. I have installed a new box running Ubuntu 14.04.2.
Now I would really like to just export the old user/group database, import it on the new server, copy all the repositories from the old server to the new one, and tell the users to start using the new one.
I do not know which database my new GitLab installation uses, nor which one the old installation uses.
I have been up and down the GitLab documentation but cannot find sufficient information on how to migrate from one server to another.
I followed the instructions on https://about.gitlab.com/downloads/ for Ubuntu, and all seems to work fine. I am looking for a way to export the users/groups from the old GitLab box, import them on the new GitLab box, and then just copy all the repositories from the old box to the new one.
Any assistance? I know next to nothing about gitlab :(
I would take the following steps:
Find out whether GitLab was installed by hand or with the Omnibus package; you need to know this for the exact backup and update steps.
Make a backup of the old version, just to be safe.
Update the current 7.8.2 instance to 7.12.2 by following the update guide.
Back up the newly updated GitLab system.
Restore that backup on the new system (see the command sketch below).
Backup & restore documentation can be found here

How to place an Email-Ext Groovy script on the Jenkins file system

I need to dynamically modify the notification e-mail recipients based on the build, so I'm using a Groovy script. I want this script to be available to all jobs, so I want it to reside on the Jenkins file system and not within each project. It can be used either in the recipients field (using ${SCRIPT,...}) or in the pre-send script. A short (fixed) script that evaluates the master script is also fine, as long as it is the same for all projects.
You should try the Config File Provider plugin. It works together with the Credentials configuration of Jenkins.
Go to Manage Jenkins > Managed files
Click Add a new Config
Select Groovy file
Enter the contents of your script
Your script will now be saved centrally on Jenkins, available to master/slave nodes.
In your Job Configuration:
Under Build Environment
Check Provide Configuration file
Select your configured Groovy file from the dropdown
Set Variable to a name, for example: myscript
Now, anywhere in your job, you can use ${myscript} and it will refer to the absolute location of the file on the filesystem (it will be somewhere in the Jenkins directory).
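For the "short (fixed) script" part of the question, something along these lines in the pre-send script should delegate to the managed file; treat it as an untested sketch, since the exact variable binding depends on your Email-Ext version:

    // Pre-send script (sketch): run the centrally managed Groovy file.
    // 'myscript' is the variable name chosen under "Provide Configuration file"
    // and expands to the absolute path of the file on the node.
    evaluate(new File(build.getEnvironment().get('myscript')))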
My impression is that you would probably want to switch completely to Jenkins pipelines, where the entire job is a Groovy file (Jenkinsfile) in the root of the repository.
Email-Ext already supports this, even if it may be lacking some documentation.
https://jenkins.io/doc/pipeline/steps/email-ext/
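A minimal scripted-pipeline sketch using the emailext step (the recipient address and build command are placeholders):

    // Jenkinsfile (scripted pipeline)
    node {
        try {
            stage('Build') {
                sh 'make build'   // placeholder build step
            }
        } finally {
            // Recipients can be computed in Groovy right here instead of a separate script.
            emailext subject: "${currentBuild.fullDisplayName}: ${currentBuild.currentResult}",
                     body: "See ${env.BUILD_URL}",
                     to: 'team@example.com'   // placeholder recipient
        }
    }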
