Running GitLab Runner on multiple servers - gitlab

Is it possible to use GitLab on one host and GitLab Runner on another host? For example, I have an Ansible server on another host and I want to set up gitlab-runner there.

Yes.
From the documentation:
Ideally, the GitLab Runner should not be installed on the same machine
as GitLab. Read the requirements documentation for more information.
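As a hedged sketch, installing and registering a runner on the separate host might look like this (Debian/Ubuntu shown; the URL, token, and description are placeholders):

```shell
# Install GitLab Runner on the separate host (Debian/Ubuntu shown;
# see the install docs for other distributions).
curl -L "https://packages.gitlab.com/install/repositories/runner/gitlab-runner/script.deb.sh" | sudo bash
sudo apt-get install gitlab-runner

# Register the runner against the GitLab host (placeholder URL/token)
sudo gitlab-runner register \
  --non-interactive \
  --url "https://gitlab.example.com/" \
  --registration-token "REGISTRATION_TOKEN" \
  --executor "shell" \
  --description "ansible-host-runner"
```

After registration the runner picks up jobs from the GitLab server over HTTPS, so the two hosts only need network connectivity, not shared storage.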

Related

Gitlab pipeline running in VM and not in Gitlab server

I have a GitLab server from the company where the project and the pipeline are configured. By default, every time a commit is made, the pipeline executes on the GitLab server.
I have my own VM, which is completely separate from GitLab. I want the pipeline to be executed on my VM instead of on the GitLab server. What should I do so that the pipeline runs on the VM and not on the GitLab server?
I have configured the following runner in config.toml that is located in $MYPROJECT/:
[[runners]]
name = "Project-name"
url = "https://gitlab.server/"
token = "TOKEN ID"
executor = "shell"
shell = "bash"
There are things that I don't understand.
If I want to execute the pipeline on my VM, should I install GitLab Runner in the VM [1]?
Should the project source code be present in the VM so that it can read the config.toml file every time there is a commit?
If I register the runner with the token key on the GitLab server, how does the GitLab server know that the pipeline is to be executed on the VM and not on the server [2]?
Should I use the docker or the shell executor to execute the pipeline on the VM?
[1] https://docs.gitlab.com/runner/install/linux-manually.html
[2] https://docs.gitlab.com/runner/register/#registering-runners
To run a job on a machine you need a GitLab Runner installed on that machine, connected to the GitLab server.
The project source code is fetched automatically before every run.
You can use a tag (e.g. "MyVM") when registering the runner. Then set the same tag on your job so that the job is only executed by this runner. See: https://docs.gitlab.com/ee/ci/runners/configure_runners.html#use-tags-to-control-which-jobs-a-runner-can-run
You need the docker executor if you want to use Docker in your VM (Docker must be installed there first). Otherwise use the shell executor.
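To make the tag routing concrete, here is a hedged sketch of registering the VM's runner with a tag (URL, token, and tag name are placeholders):

```shell
# Register the VM's runner with a tag; jobs carrying the same tag in
# .gitlab-ci.yml are then routed to this runner only.
sudo gitlab-runner register \
  --non-interactive \
  --url "https://gitlab.server/" \
  --registration-token "REGISTRATION_TOKEN" \
  --executor "shell" \
  --tag-list "MyVM"

# In .gitlab-ci.yml, a job opts into that runner via the same tag:
#   build-job:
#     tags:
#       - MyVM
#     script:
#       - ./build.sh
```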

How to create a httpd container in docker through jenkins job?

I need help in creating the container through Jenkins job.
Let me know the steps to be followed: I have already created 3 jobs in Jenkins, and I want to create httpd container through the jobs created.
Should I install any plugins or write any script ?
Assuming we are not talking about Jenkins in Docker, or Jenkins agents in Docker, you need to create your httpd container manually first, without Jenkins.
That means:
validate your SSH access to the remote server
check that it has Docker installed
execute docker commands to run an httpd container, as described in Docker httpd
Once that is working, you can replicate the process in a Jenkins job, provided your remote server (the one with Docker, where you want to run your httpd container) is declared as an agent of the main Jenkins controller.
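The manual steps above can be sketched as follows, assuming SSH access as user@remote-server (host, user, port, and container name are placeholders):

```shell
# 1. Validate SSH access and confirm Docker is installed on the remote server
ssh user@remote-server 'docker --version'

# 2. Run an httpd container from the official image, publishing port 80 as 8080
ssh user@remote-server 'docker run -d --name my-httpd -p 8080:80 httpd:2.4'

# 3. Verify the container is serving
curl -I http://remote-server:8080/
```

Once these commands succeed by hand, the same script can become the body of a Jenkins freestyle job or pipeline stage running on that agent.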

GitLab CI/CD pipeline, deploy to Windows Server

Using the GitLab Runner I have on Linux, I am trying to connect to a Windows Server and run some basic commands there, such as git pull.
Does GitLab Runner provide any capabilities for accessing a Windows server?
What other options are there to get such a requirement done?
I think you have a few options:
Install OpenSSH and configure the server on your Windows machine/VM, then connect from your GitLab runner over SSH ( https://learn.microsoft.com/en-us/windows-server/administration/openssh/openssh_install_firstuse ).
(As Wojciech Wirzbicki commented) install a GitLab Runner instance on Windows: https://docs.gitlab.com/runner/install/windows.html . I think this option is more secure and an easy win.
Connect to the Windows server with WinRM.
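For the SSH option, a minimal sketch of what the Linux runner's job script might run (hostname, user, and repository path are placeholders; it assumes an SSH key is provisioned and OpenSSH Server is enabled on the Windows box):

```shell
# Executed on the Linux runner; opens an SSH session to the Windows
# server and runs git pull in the target repository there.
ssh deploy@windows-host "cd C:\\path\\to\\repo && git pull"
```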

Intershop server in docker

I'm trying to set up a local development environment on a Linux machine. The ideal setup would use a Docker container spinning up the server, with a shared directory to push code changes into.
The question is whether this is an accepted approach. Any tips appreciated.
There is
a public guide on how to set up an environment on ICM 7.9: https://support.intershop.com/kb/index.php/Display/28K663, and
a sample Linux VM on ICM 7.5.5, provided you have a support login: https://support.intershop.com/static/Customer-Support-offers-a-Linux-VM-as-an-example--development-environment-for-Intershop-7.5.html
Neither of them is related to a Docker container, but they may help you with setting up a development environment there.
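As a generic, hedged sketch of the bind-mount idea from the question (the image name, ports, and paths below are placeholders, not an official Intershop image):

```shell
# Spin up a server container with the local source tree bind-mounted,
# so code changes on the host are immediately visible inside the container.
docker run -d --name icm-dev \
  -p 8080:8080 \
  -v "$HOME/projects/intershop-src:/opt/app/src" \
  my-intershop-image:latest
```

Whether hot-reloading works then depends on how the server inside the image watches the mounted directory.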

Is it possible to access Jenkins slave machine's files?

I have a Jenkins master installed in a cloud service on a Linux server. I have also installed Jenkins on my local machine (Windows). The local machine's Jenkins works as a slave. The slave setup is configured on the master. The connection between them works fine.
I have a plugin on the Jenkins master where I need to provide an application directory. This application directory is located only on my slave machine. So I would like to know: is it possible to tell the Jenkins master that the application directory is located on the slave machine? If yes, how is it done?
I have been searching in Google, but not found any solution yet.
The Copy To Slave plugin allows copying files from the master to a slave and from a slave back to the master.