GitLab CI/CD run commands on external server

I want to use GitLab CI/CD to deploy my app on an external server. I have the IP, username and password, and I understand I need to connect through SSH. How can I run all the necessary commands on the server side? The server runs on Linux.
Currently I just get the code from the repository and do the npm build:prod and npm serve:prod for the API and npm start for the UI. How can I do the same chain of commands with GitLab CI/CD? Or is this even possible? I basically want it to work similarly to Jenkins. But since the code is already on GitLab, it might be simpler to let GitLab handle this process instead of installing and setting up Jenkins.

To be able to SSH into your machine from within GitLab CI, you should probably set up SSH key authentication, since you can't just type in the password inside the CI job.
When you've got that set up, you have to store the private key in an environment variable so you can use it in the CI job. How to do that can be found here.
The last part is actually executing commands over SSH. That can be done in the following way:
ssh <host> '
command1;
command2;
'
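Putting that together, the commands a deploy job's script section could run might look roughly like this (a sketch only: the SSH_PRIVATE_KEY variable name, the deploy user and host, the app path and the npm script names are assumptions based on the question):
eval "$(ssh-agent -s)"
# load the private key stored in a CI/CD variable (assumed name: SSH_PRIVATE_KEY)
echo "$SSH_PRIVATE_KEY" | ssh-add -
mkdir -p ~/.ssh && chmod 700 ~/.ssh
# trust the deploy host so ssh does not prompt interactively
ssh-keyscan my-server.example.com >> ~/.ssh/known_hosts
# run the build chain on the server itself (path and script names are placeholders)
ssh deploy@my-server.example.com '
cd /srv/my-app;
git pull;
npm install;
npm run build:prod;
'
Keeping the API and UI processes running afterwards is usually better handled by a process manager or a systemd unit on the server than by the CI job itself.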

Related

jenkins.plugins.publish_over.BapPublisherException: Failed to connect and initialize SSH connection Message [Auth fail]

I am learning to use Jenkins to deploy a .NET 5.0 application on an AWS EC2 server. This is the first time I am using a Linux server and Jenkins for .NET (I am a lifelong Windows guy), and I am facing an error while trying to publish my artifacts over SSH to the web server.
My setup:
Jenkins server is an AWS EC2 Linux AMI server.
Web Server is also an AWS EC2 Linux AMI server.
My Jenkins is correctly installed and working. I am able to build and run unit test cases without any issues.
For Deploy, I am using 'Publish Over SSH' plugin, and I have followed all steps to configure this plugin as mentioned here https://plugins.jenkins.io/publish-over-ssh/.
However, when I try 'Test Configuration', I get the below error:
Failed to connect or change directory
jenkins.plugins.publish_over.BapPublisherException: Failed to connect and initialize SSH connection. Message: [Failed to connect session for config [WebServer]. Message [Auth fail]]
I did a ping test from Jenkins server to Web Server, and it is a success.
I'm using the .pem key in the 'Key' section of 'Publish over SSH'. This key is the same key I use to SSH into the web server.
The below link suggests many different solutions, but none is working in my case.
Jenkins Publish over ssh authentification failed with private key
I was looking at the below link which describes the same problem,
Jenkins publish over SSH failed to change to remote directory
However, in my case I have kept 'Remote Directory' empty. I don't know if I have to specify any directory here. Anyway, I tried creating a new directory under the home directory of the user ec2-user as '/home/ec2-user/publish' and then used this path as the Remote Directory, but it still didn't work.
Screenshot of my settings in Jenkins:
I would appreciate if anyone can point me to the right direction or highlight any mistake I'm doing with my configuration.
In my case the following steps solved the problem.
The solution is based on Ubuntu 22.04.
Add these two lines to /etc/ssh/sshd_config:
PubkeyAuthentication yes
PubkeyAcceptedKeyTypes +ssh-rsa
Restart the sshd service:
sudo service sshd restart
You might consider the following:
a. From the screenshot you've provided, it seems that you have checked the Use password authentication, or use different key option, which requires you to add your key and password (the inputs from these fields will be used to connect to your server via SSH). If you use the same SSH key and passphrase/password on all of your servers, you can untick that box and just use the config you have specified above.
b. You might also check whether port 22 of your web server allows inbound traffic from the security group where your Jenkins server/EC2 instance is running (see the CLI sketch below). See reference here.
c. Also, make sure that the remote directory you have specified exists, otherwise the connection may fail.
Here's the sample config
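For point b above, a rough AWS CLI sketch of opening port 22 of the web server's security group to the security group of the Jenkins instance (both group IDs are placeholders):
aws ec2 authorize-security-group-ingress --group-id <web-server-sg-id> --protocol tcp --port 22 --source-group <jenkins-sg-id>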

NPM to pull from private GitLab repository

I have a GitLab domain, project and repo. This project is accessible via a group I am a part of.
I would like for this to be downloaded via npm install in the following ways:
Local computer
GitLab CI job
Inside of a Docker container
I'm guessing the easiest way of doing this is to just make it public. Is there a way to do this so it stays secure? I imagine it must be done with keys.
In my package.json under dependencies I currently have this, but it gives a 401 error of course:
"my-module": "my-domain.com:my-project/my-repo#my-branch",
I do not want hardcoded tokens in the package.json file, if it can be avoided.
You can use SSH keys to access your repository. Add your SSH key to the GitLab server and define the URL to your repository in one of the following forms:
git+ssh://git@git.mydomain.com:Username/Repository#{branch|tag}
or
git+ssh://git@git.mydomain.com/Username/Repository#{branch|tag}
In your package.json it will be something like this: "my-module": "git+ssh://git@my-domain.com:my-project/my-repo#my-branch"
If your SSH key is password protected, then npm will ask for the password.
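For the GitLab CI job and Docker container cases, a minimal sketch of loading a deploy key before npm install (the SSH_PRIVATE_KEY variable name and the host are assumptions; an unencrypted key is assumed, since npm cannot answer a passphrase prompt non-interactively):
eval "$(ssh-agent -s)"
# load the key from a CI/CD variable or a Docker build secret (assumed name)
echo "$SSH_PRIVATE_KEY" | ssh-add -
mkdir -p ~/.ssh
ssh-keyscan my-domain.com >> ~/.ssh/known_hosts
npm install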

How to allow jenkins from local machine to run remote python test scripts

I have Jenkins running on my local CentOS machine.
I have configured my local Jenkins and was able to run a successful local build.
Now, I want to run remote tests, which are Python scripts, on a remote CentOS machine that does not have Jenkins installed. Also, I don't want to install any Jenkins process on the remote Linux system, as it is "like a" production server and I am advised not to install any apps on it.
How do I use my local Jenkins to run a build that executes those remote tests and reports the output on my local Jenkins console?
Do I need to use the Jenkins master-slave architecture? If yes, how do I configure that given my above requirement?
You might want to have a look at this:
https://wiki.jenkins-ci.org/display/JENKINS/Distributed+builds
For your requirement, precisely this part:
https://wiki.jenkins-ci.org/display/JENKINS/Distributed+builds#Distributedbuilds-Launchslaveagentheadlessly
However, I believe you still have to have Java on your slave Unix node to run slave.jar on it.
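For reference, launching the agent headlessly on the remote node looks roughly like this (the Jenkins URL, node name and secret are placeholders; only a Java runtime is needed on the node, not a Jenkins installation):
# download the agent jar from the Jenkins master and start it
wget http://your-jenkins-host:8080/jnlpJars/slave.jar
java -jar slave.jar -jnlpUrl http://your-jenkins-host:8080/computer/remote-node/slave-agent.jnlp -secret <secret-from-node-page>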
This answer assumes the scripts are in GitHub. Maybe it helps to think about your case.
So first you need to install Git on your server machine:
$ sudo apt-get update
$ sudo apt-get install git
Now you need to get the path of Git with $ which git
It will give something like "/usr/local/bin/git".
Copy that path into Manage Jenkins -> Global Tool Configuration -> Git section, and paste it into "Path to Git executable".
This allows Jenkins to access Git sources.
Now you need to provide SSH keys.
Type sudo su - jenkins on the machine running Jenkins. You have to generate an SSH key for the "jenkins" user.
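A rough sketch of those two steps (the key type and the jenkins home directory depend on your installation):
sudo su - jenkins
# generate a key pair for the jenkins user; accept the default location
ssh-keygen -t rsa -b 4096 -C "jenkins"
# this is the public key to add to your GitHub account
cat ~/.ssh/id_rsa.pub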
Now add the public key to your GitHub account (see https://www.youtube.com/watch?v=Vi-WqFKYpnw),
and add the private key to Jenkins:
Go to Credentials
Click Global under the stores scoped to Jenkins
Add Credentials
Kind: SSH Username with private key
Username: your server username
Private Key: paste the private key of the "jenkins" user
ID: specify "jenkins-private-key" or anything else to identify it
Now go to the job configuration, select the credentials that you have created, and copy the SSH URL of the repository (where your scripts are stored). Now you can run the scripts that are stored in Git.
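For example, the SSH URL of a GitHub repository has the form:
git@github.com:username/repository.git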

Running git from node.js as a child process?

I am attempting to write a generic command-runner in Node.js, though that's not massively important.
My setup is as follows:
I have a list of string commands that are executed using child_process.exec one after the other.
I want to run git from one of these commands, specifically a pull.
The location I am pulling from requires SSH authentication. HTTPS is not an option.
My private key is passphrased.
I am currently using keychain to manage ssh-agent.
When running git pull from the command line, it succeeds. When running my application as the logged-in user, it succeeds. However, when running my application using forever, it fails.
The error I receive is Permission denied (publickey).. I have tried calling keychain as part of my command, but I cannot get it to recognise the credentials.
How can I fix this?
My mistake was taking the contents of .bash_profile and using that to set up keychain from my exec.
What I needed to do was:
. $HOME/.keychain/$HOSTNAME-sh; git pull
I found this out by looking up examples of how to use keychain with bash scripts.

Jenkins ignores proxy settings while building a job

I set a proxy under Plugins in Jenkins as suggested online.
I also edited the /etc/environment
bash-3.2$ cat /etc/environment
http_proxy=proxy.company.net:8080
https_proxy=proxy.company.net:8080
HTTP_PROXY=proxy.company.net:8080
HTTPS_PROXY=proxy.company.net:8080
HTTPS_PROXY_REQUEST_FULLURI=false
HTTP_PROXY_REQUEST_FULLURI=false
I verified the variables and they are available on logon.
When I start ant manually as root via SSH, my "composer.phar" script is able to connect and download files. As soon as Jenkins starts the job (I think it's the "jenkins" Linux user), it waits until the timeout and aborts the build. I used "su jenkins -s /bin/bash" to get a shell as "jenkins", and the env vars are set correctly...
What can I do? Why does Jenkins ignore these ENV-Vars?
Thanks.
The http_proxy variables (as seen e.g. on the wget man page) require a "http://" prefix to work properly for many programs.
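So the entries from the question would become something like this (assuming the proxy really listens for plain-HTTP proxy requests on port 8080):
http_proxy=http://proxy.company.net:8080
https_proxy=http://proxy.company.net:8080
HTTP_PROXY=http://proxy.company.net:8080
HTTPS_PROXY=http://proxy.company.net:8080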
Jenkins on the other hand has a proxy configuration at Manage Jenkins > Plugin Manager > Advanced. This configuration overrides the environment variables.
Check Alex' answer to another question for getting around this behavior for individual nodes/builds.
I did not get it solved. After a restart the server fails all Jenkins jobs for some minutes... then suddenly the connection to the proxy succeeds and everything works well.
