Creating a Docker container on Linux from Jenkins running on Windows

I have a build pipeline running on Windows that I cannot move to Linux, for the simple reason that it uses SQL Server tools not currently available in the RC1 version of SQL Server on Linux. Therefore my only option for running my build pipeline, which needs to spin up SQL Server in containers on a Linux machine, is to keep Jenkins on Windows. My question is this: what is the most elegant way of creating a container on a remote Linux host from a Windows server? I could use remote shells, but this seems like a really clunky way of doing things.

You can do this by installing a Jenkins slave (of the master installed on the Windows host) on your Linux machine and executing a job there that brings up a SQL Server container.
Since you are using a Pipeline job and want to execute a few steps on the master and then run the SQL work on the remote Linux host, you can use the following syntax to achieve that in a single Pipeline job:
node('master') {
    // steps that must run on the Windows master,
    // e.g. the SQL Server tooling that keeps you on Windows
}
node('slave1 && slave2') {
    // '&&' means this block runs on an agent that carries BOTH labels;
    // use a single label such as 'linux' if one agent is enough
    // steps that run on the Linux agent
}
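For example, assuming the Linux agent is labelled linux-docker and has Docker installed (the label, image and password here are illustrative, not from the question), the second block might look like this:
node('linux-docker') {
    // spin up SQL Server in a container on the Linux agent
    sh '''
        docker run -d --name sqlserver \
            -e ACCEPT_EULA=Y \
            -e 'SA_PASSWORD=YourStrong!Passw0rd' \
            -p 1433:1433 \
            microsoft/mssql-server-linux
    '''
}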

Related

GitLab pipeline running in a VM and not on the GitLab server

I have a GitLab server from the company where the project and the pipeline are configured. By default, every time a commit is made, the pipeline executes on the GitLab server.
I have my own VM, which is completely separate from GitLab. I want the pipeline to be executed on my VM instead of the GitLab server. What should I do so that the pipeline runs on the VM and not on the GitLab server?
I have configured the following runner in the config.toml located in $MYPROJECT/:
[[runners]]
name = "Project-name"
url = "https://gitlab.server/"
token = "TOKEN ID"
executor = "shell"
shell = "bash"
There are things that I don't understand.
If I want to execute the pipeline on my VM, should I install GitLab Runner on the VM [1]?
Should I have the project source code on the VM so that it can read the config.toml file every time there is a commit?
If I register the runner with the token key on the GitLab server, how does the GitLab server know that the pipeline is to be executed on the VM and not on the server [2]?
Should I use the docker or shell executor to execute the pipeline on the VM?
[1] https://docs.gitlab.com/runner/install/linux-manually.html
[2] https://docs.gitlab.com/runner/register/#registering-runners
To run a job on a machine you need a GitLab Runner installed on that machine, connected to the GitLab server.
The project source code is fetched automatically before every run.
You can use a tag (e.g. "MyVM") when registering the runner, then set the same tag on your job so that the job is only executed by this runner. See: https://docs.gitlab.com/ee/ci/runners/configure_runners.html#use-tags-to-control-which-jobs-a-runner-can-run
You need the docker executor if you want jobs to run in Docker containers on your VM (Docker needs to be installed there first). Otherwise use shell.
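For example, you could register the runner with a tag and reference that tag from the job (the tag name and script are illustrative):
gitlab-runner register \
    --non-interactive \
    --url https://gitlab.server/ \
    --registration-token TOKEN \
    --executor shell \
    --tag-list MyVM
Then, in .gitlab-ci.yml:
deploy:
  tags:
    - MyVM
  script:
    - ./deploy.sh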

How to connect to a remote deployment Windows server in a Jenkins Pipeline script?

I am using Jenkins Pipeline to set up the build and deployment process. For the build, I am using a Windows server node, but for deployment I have to use separate deployment servers (Windows machines) for the Dev, QA and Production environments, which cannot be added as slave nodes. I have to connect to these deployment servers to execute the deployment code. I have already tried using PowerShell and PsExec to connect to the remote machines, and both work fine, but our requirement is not to use either of these (PowerShell or PsExec) and to do everything in Groovy scripting. I have searched everywhere but have not found a suitable solution for connecting to the remote Windows servers using Groovy in a Jenkins Pipeline and running the deployment commands. Please suggest.

Executing node.js script remotely

I am setting up continuous integration using a Jenkins server for my Node.js application. For deployment I am using a PowerShell script, and for that I have installed the PowerShell plug-in. The script will need to perform the following tasks in order:
# Step1
# Stop all the currently running services on web server. For that I am trying to
# execute maintenance.js remotely from the build server under node
node \\SharedWebServerFolder\Utilities\maintenance.js stopServices
# Step2
# Copies all the resources to the server at \\SharedWebServerFolder
# Step3
# Start the services
node \\SharedWebServerFolder\Utilities\maintenance.js startServices
I have no problem executing Step2. My question is about Step1 and Step3.
Should I execute maintenance.js remotely from the build server? Is this even possible? (Assume I have installed Node.js on the build server.)
Should I have one more PowerShell script on the web server which executes maintenance.js locally? In that case the deployment script (on the build server) would execute the remote PowerShell script (on the web server), and that remote PowerShell script would execute maintenance.js locally. In this scenario I don't have to install Node.js on the build server.
What is recommended?
People generally use ssh to execute commands remotely. I think you want to execute a command from your build server on your production/staging server.
See linux execute command remotely.
This is simple when the remote box is a Linux box.
If your remote is a Windows box then you can still run an SSH server on it, but it's more painful to set up; you might try Bitvise SSH Server.
Once ssh is set up correctly you should be able to execute remote commands from your build server.
I have used the Jenkins SSH Plugin for deployment to my remote staging/production servers. I would recommend executing your maintenance.js on the web server machine instead of your build server. You can just trigger the maintenance.js execution on the staging/production servers from the build server using ssh. I believe executing maintenance.js remotely from the build server is hard to achieve.
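A minimal sketch of that approach, assuming SSH access is already set up (the user, host and path are placeholders):
# run from the Jenkins build server; maintenance.js executes on the web server itself
ssh deploy@webserver "node /opt/app/Utilities/maintenance.js stopServices"
# ... copy the resources to the web server ...
ssh deploy@webserver "node /opt/app/Utilities/maintenance.js startServices"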

Is it possible to access Jenkins slave machine's files?

I have a Jenkins master installed on a cloud service on a Linux server. I have also installed Jenkins on my local machine (Windows). The local Jenkins is working as a slave, and the slave is configured on the master. The connection between them works fine.
I have a plugin on the Jenkins master where I need to provide an application directory. This application directory is located only on my slave machine. So I would like to know: is it possible to tell the Jenkins master that the application directory is located on the slave machine? If yes, how is it done?
I have been searching on Google but have not found a solution yet.
The Copy To Slave plugin allows copying files from the master to a slave and from a slave back to the master.
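If the job is a Pipeline job, an alternative sketch uses the built-in stash/unstash steps instead of the plugin (the label and path below are assumptions):
node('windows-slave') {
    // read the application directory where it actually lives, on the slave
    stash name: 'app-dir', includes: 'path/to/app/**'
}
node('master') {
    // recreate those files in the master's workspace
    unstash 'app-dir'
}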

How to simultaneously deploy Node.js web app on multiple servers with Jenkins?

I'm going to deploy a Node.js mobile web application on two remote servers (Linux OS).
I'm using SVN server to manage my project source code.
To simply and clearly manage the app, I decided to use Jenkins.
I'm new to Jenkins, so installing and configuring it was quite a difficult task.
But I couldn't find out how to set up Jenkins to deploy to the remote servers simultaneously.
Could you help me?
You should look into supervisor. It's language- and application-type-agnostic; it just takes care of (re)starting applications.
So in your Jenkins build:
You update your code from SVN
You run your unit tests (definitely a good idea)
You either launch an svn update on each host or copy the current content to them (I'd recommend the latter, because there are many ways for SVN to fail, and copying lets you embed SVN_REVISION in some .js file, for instance)
On each host you execute fuser -k -n tcp $DAEMON_PORT, which kills the application currently listening on port $DAEMON_PORT (the one your Node.js app uses); see the sketch after this list
And the best part is obviously that supervisor will automatically start your Node.js app at system startup (provided supervisor is correctly installed; apt-get install supervisor on Debian) and restart it in case of failure.
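A minimal sketch of steps 3 and 4 as a Jenkins shell build step (host names, user, and port are assumptions):
# run from the Jenkins build, once per web host
for HOST in web1.example.com web2.example.com; do
    # copy the tested working copy to the host
    rsync -a --delete ./ "deploy@$HOST:/usr/local/share/dir_app/"
    # kill the running app; supervisor restarts it with the new code
    ssh "deploy@$HOST" "fuser -k -n tcp 8080"
done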
A supervisord sub-config for a Node.js app looks like this:
# /etc/supervisor/conf.d/my-node-app.conf
[program:my-node-app]
user = running-user
environment = NODE_ENV=production
directory = /usr/local/share/dir_app
command = node app.js
stderr_logfile = /var/log/supervisor/my-node-app-stderr.log
stdout_logfile = /var/log/supervisor/my-node-app-stdout.log
There are many configuration parameters.
Note: there is also a Node.js package called supervisor; it's not the one I'm talking about, and I haven't tested it.
On Linux, you need to ssh to your hosts to run the commands that update the application:
Work out the application-update workflow as a shell script. In particular, you need to daemonize your Node app so that a finished Jenkins job does not kill your app when it exits. Here's a nice article on how to do this: Running node.js Apps With Upstart; or you can use a pure-Node.js tool like forever. Assume you worked out a script under /etc/init.d/myNodeApp.
ssh to your Linux hosts from Jenkins. You need to make sure the SSH private key file has been copied to /var/lib/jenkins/.ssh/id_rsa and is owned by the jenkins user.
Here's an example shell step in the Jenkins job configuration:
ssh <your application ip> "service myNodeApp stop; cd /ur/app/dir; svn update; service myNodeApp restart"
