How do I access a remote (local gitlab instance on remote server) repository over SSH? - linux

The setup is as follows:
remote private server far far away
remote private server has private gitlab instance on port XXXX
remote private server is configured to allow SSH sign-on via SSH key
gitlab instance on port XXXX of remote private server requires SSH key authentication using different SSH key
How can I clone that repository onto my local machine, and push/pull data remotely given that setup?
This is how I access it locally when I am not far, far away from remote private server:
git clone git@XXX.XXX.XX.X:REPODIR/repo_name.git
In this case, XXX.XXX.XX.X is the IP of the GitLab instance on the remote network.
Is there any way to tunnel into the remote network and access the GitLab instance by proxy (forgive me if I'm likely using the word wrong)?
Thank you.

OK, mostly thanks to @o11c for this, although here are the findings that led me to be able to clone my repo remotely.
Disclaimer: ProxyJump (-J, see the ssh man page) is the shorthand, more modern version of this, but I couldn't get it working -- if anyone wants to update this with their ProxyJump implementation, that would be useful!
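For the record, a minimal ProxyJump sketch of the same route in ~/.ssh/config (the host alias and key paths here are hypothetical, and I haven't verified this in the exact setup above):
Host XXX.XXX.1.146
    User nkunes
    IdentityFile ~/keys/XXX-ssh
Host gitlab-internal
    HostName XXX.XXX.XX.X
    User git
    IdentityFile ~/.ssh/gitlab-key
    ProxyJump XXX.XXX.1.146
With that in place, git clone git@gitlab-internal:REPODIR/repo_name.git should hop through the main server automatically.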
SSH into your remote account on the main server, opening a dynamic SOCKS proxy on a local port, using your main identity (the key can be in ~/.ssh, or you can reference it manually with -i):
ssh -ND 3131 nkunes@XXX.XXX.1.146 -i ../../keys/XXX-ssh &
I then source this bash script in the shell where I intend to run git commands (notice the ProxyCommand usage instead of ProxyJump; this is the old method of doing this, yet it works well for me. Also notice that 127.0.0.1:3131 should match the SOCKS port you forwarded above):
alias ssh="ssh -o ProxyCommand='/usr/bin/nc -X 4 -x 127.0.0.1:3131 %h %p'"
export GIT_SSH=~/Desktop/XXX-eng/ssh-access/ssh-proxy.sh
export PRE_SSH_ALIAS_PROMPT="$PS1"
export PS1="<< SSH ALIAS >>$PS1"
where ssh-proxy.sh is defined as follows (again, swap the port out for your own, and possibly use ProxyJump if you want the modern implementation):
#!/bin/sh
ssh -o ProxyCommand='/usr/bin/nc -X 4 -x 127.0.0.1:3131 %h %p' "$@"
Then, you can clone normally using:
git clone git@XXX.XXX.XX.X:REPODIR/repo_name.git

Related

Git pull a repository to local computer from a remote computer

I have set up an SSH connection on computer B and I am connecting to it properly via SSH. I want to execute a git pull command so that it pulls the repo to computer A instead of B. Is that too much of a hassle, or is there an alternative?
I basically need to copy whatever git pull pulled on computer B to my computer A. The only thing I have is an SSH connection between the two, and the repo is only reachable from computer B.
If I understand correctly, you want to use Git over an SSH tunnel so that computer A can access the repository REPO.git on computer C via computer B.
On computer A, open the SSH tunnel:
ssh -L3333:compC:22 compB
From a second console on computer A:
git clone ssh://git#localhost:3333/REPO.git
It's possible to run git commands over a double SSH tunnel. The accepted answer there is a bit outdated; ssh can now construct a tunnel without external commands like netcat or socat.
Configure in your ~/.ssh/config:
Host server
    HostName git-server
    ProxyCommand ssh -W %h:%p B
This configures ssh to start a connection to the host B and open a tunnel over that connection to the host git-server. Run
git pull ssh://server/path/to/repository
Another possible solution is to use the ext:: remote helper. See the second answer at the linked question. Run
git pull "ext::ssh -t B ssh git-server %S '/path/to/repository'"
I'm not sure it will answer your question, but if it's only for pulling, you can use the scp command after pulling on B:
scp <source> <destination>
It copies like the cp command, but through your SSH connection.
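For example (hypothetical user and paths), to copy the freshly pulled repository from B back to A:
scp -r user@compB:/path/to/REPO ~/projects/REPO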
Another solution (and the most straightforward one in my opinion) is to just do the following (a sketch follows the list):
Connect to the remote machine via SSH
Push the remote changes to a git branch
Pull the changes from the remote branch onto your local machine
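A minimal sketch of that workflow, treating B's clone itself as the branch you pull from (user, host, and paths are hypothetical):
# on computer B: update its clone of the repository
ssh user@compB 'cd /path/to/REPO && git pull'
# on computer A: pull straight from B's clone over SSH
git pull user@compB:/path/to/REPO master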

Configure SSH private key in SmartGit

I'm trying to configure a git repo in SmartGit through an SSH tunnel on Ubuntu 16.04.
I can't configure my private SSH key in SmartGit. I want to use the SmartGit SSH client, but Preferences->Authentication doesn't allow me to add a key to use.
When I pull from the remote, I get a 'permission denied' error.
I found Windows-related topics but nothing on Linux distributions and nothing in the SmartGit documentation.
First, make sure to configure the System SSH client in the SmartGit preferences.
If you have ssh in your path, you can then export the GIT_SSH_COMMAND environment variable to instruct Git to use the ssh command of your choice.
In your case, a command which would directly reference your private key:
export GIT_SSH_COMMAND='ssh -i /path/to/private/key'
Then launch SmartGit again (so it inherits that new environment variable), and try your SSH tunnel again.
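If the repository is only reachable through a jump host, the same variable can carry that hop as well (a sketch assuming a reasonably recent OpenSSH with -J support and a hypothetical jump host):
export GIT_SSH_COMMAND='ssh -i /path/to/private/key -J user@jumphost'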

Issues with using Jump Host

How do I transfer a file from my local machine to a remote host to which I need to get through a jump host? These are the steps I follow to connect to the remote host
1. ssh myname@jump-host
2. enter password
3. sudo su - another-random-name
4. ssh name@remote-host
Now I want to transfer a file from my local machine to the remote-host. How would I achieve this? I have already tried scp -o ProxyCommand, but I don't quite know where step 3 fits into that command.
Use port forwarding to get the third host's SSH port on your localhost, in this way:
ssh -L 2222:remote-host:22 myname@jump-host
then (in another tab/shell on your local machine):
scp -P 2222 file name@localhost:
This will copy the file directly to the remote host.
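If the sudo step on the jump host turns out not to be needed for the hop itself, a single ProxyJump invocation may also work (a sketch, untested against this exact setup):
scp -o ProxyJump=myname@jump-host file name@remote-host: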
On the jump host under another-random-name run
ssh -L 2222:remote-host:22 myname@jump-host
then on your local computer you can run
scp -P 2222 file name@jump-host:
scp will try to connect to jump-host, while in fact the connection will be forwarded to remote-host, and it will use name as it connects to remote-host.
You are probably still facing a problem with credentials for another-random-name. You can create a key pair on your machine for your local user and put the public key into the allowed keys (authorized_keys) of the user on remote-host.
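A minimal sketch of that key setup (the key file name is hypothetical; this assumes the tunnel from the first answer is still listening on port 2222):
ssh-keygen -t ed25519 -f ~/.ssh/remote-host-key
ssh-copy-id -i ~/.ssh/remote-host-key.pub -p 2222 name@localhost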

How to disable ssh-agent forwarding

ssh-agent forwarding can be accomplished with ssh -A ....
Most references I have found state that the local machine must configure ~/.ssh/config to enable AgentForwarding with the following code:
Host <trusted_ip>
    ForwardAgent yes
Host *
    ForwardAgent no
However, with this configuration, I am still able to see my local machine's keys when tunneling into a remote machine with ssh -A user@remote_not_trusted_ip and running ssh-add -l.
From the configuration presented above, I would expect that the ssh-agent forwarding would fail and the keys of the local machine would not be listed by ssh-add -l.
Why is the machine at remote_not_trusted_ip able to access the ssh-agent forwarded keys even though the ~/.ssh/config file states the following?
Host *
    ForwardAgent no
How can I prevent ssh-agent from forwarding keys to machines not explicitly defined in the ~/.ssh/config?
How can I prevent ssh-agent from forwarding keys to machines not explicitly defined in the ~/.ssh/config?
It is the default behavior. If you do not allow it in ~/.ssh/config, it will not be forwarded. But command-line arguments have higher priority, so they override what is defined in the configuration, as explained in the manual page for ssh_config:
ssh(1) obtains configuration data from the following sources in the following order:
command-line options
user's configuration file (~/.ssh/config)
system-wide configuration file (/etc/ssh/ssh_config)
So, as already said, you just need to provide the correct arguments to ssh.
So back to the questions:
Why is the machine at remote_not_trusted_ip able to access the ssh-agent forwarded keys even though the ~/.ssh/config file states the following?
Host *
    ForwardAgent no
Because the command-line argument -A has higher priority than the configuration files.
How can I prevent ssh-agent from forwarding keys to machines not explicitly defined in the ~/.ssh/config?
Do not use the -A command-line option if you do not want to forward your ssh-agent. Use the -a command-line option instead.
You are using the -A option to connect. man ssh says:
-A Enables forwarding of the authentication agent connection.
You should connect without -A, just using:
ssh user@remote_not_trusted_ip
CLI arguments take priority over the ssh config file.
By the way, if you want to connect to your trusted IP without forwarding, you can also use:
ssh -a user@trusted_ip
-a Disables forwarding of the authentication agent connection.
This is over a year old, but I encountered the same issue and landed on a config option that works.
I had a problem where, after I connected from my home computer to my work computer, Git commands no longer worked. I figured out that it was because the connecting home computer's public key was forwarded, and that key was not configured for that GitHub account.
The -a command-line option fixed the problem by not forwarding the authentication agent connection. I also thought that the equivalent ~/.ssh/config option would be this:
ForwardAgent no
When that didn't work I looked for other configuration variables, and finally found that this one worked.
IdentityAgent none
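In config form, a sketch of that setting (narrow the Host pattern if you only want it for some machines; IdentityAgent requires a reasonably recent OpenSSH):
Host *
    IdentityAgent none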
This part of the man-page is crucial:
Since the first obtained value for each parameter is used, more host-specific declarations should be given near the beginning of the file, and general defaults at the end.
Put your Host * with ForwardAgent no at the end, and the specific Host entries with ForwardAgent yes at the start of the .ssh/config (see the example below).
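A sketch of that ordering (the host name is a placeholder):
Host trusted-host
    ForwardAgent yes
Host *
    ForwardAgent no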
Not an answer to the question, and maybe just semantics:
Why is the machine at remote_not_trusted_ip able to access the ssh-agent forwarded keys even though the ~/.ssh/config file states the following?
My understanding is that authentication keys are never "forwarded" to a remote computer. Rather, ssh-agent forwards authentication challenges from a remote server back to the computer that holds the authentication private key, through whatever chain of remote computers the SSH connection is running through.

How do I use SSH in a Jenkins pipeline?

I have some Jenkins jobs defined using a Jenkins Pipeline Model Definition, which build NPM projects. I use Docker containers to build these projects (using a common image with just Node.js + npm + yarn).
The results of the builds are contained in the dist/ folder, which I zipped using a zip pipeline command.
I want to copy this ZIP file to another server using SSH/SCP (with private key authentication). My private key is added to the Jenkins environment (credentials manager), but when I use Docker containers, an SSH connection cannot be established.
I tried to add agent { label 'master' } to use the master Jenkins node for the file transfer, but it seems to create a clean workspace with a new Git fetch, and without my built files.
After I tried the SSH Agent Plugin, I have this output:
Identity added: /srv/jenkins3/workspace/myjob-TFD@tmp/private_key_370451445598243031.key (rsa w/o comment)
[ssh-agent] Started.
[myjob-TFD] Running shell script
+ scp -r dist test@myremotehost:/var/www/xxx
$ docker exec bfda17664965b14281eef8670b34f83e0ff60218b04cfa56ba3c0ab23d94d035 env SSH_AGENT_PID=1424 SSH_AUTH_SOCK=/tmp/ssh-k658r0O76Yqb/agent.1419 ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 1424 killed;
[ssh-agent] Stopped.
Host key verification failed.
lost connection
How do I add a remote host as authorized?
I had a similar issue. I did not use the label 'master', and I found that the file transfer works across slaves when I do it like this:
Step 1 - Create SSH keys on the remote host server, and add the public key to authorized_keys
Step 2 - Create a credential using the SSH keys in Jenkins; use the private key from the remote host
Use the SSH agent plugin:
stage ('Deploy') {
    steps {
        sshagent(credentials: ['use-the-id-from-credential-generated-by-jenkins']) {
            sh 'ssh -o StrictHostKeyChecking=no user@hostname.com uptime'
            sh 'ssh -v user@hostname.com'
            sh 'scp ./source/filename user@hostname.com:/remotehost/target'
        }
    }
}
Use the SSH agent plugin:
SSH Agent Plugin
When using this plugin you can use the global credentials.
To add a remote host to known hosts, and hopefully fix your error, try manually SSHing from the Jenkins host to the target host as the Jenkins user.
Get on the host where Jenkins is installed. Type
sudo su jenkins
Now use ssh or scp like
ssh username@server
You should be prompted like this:
The authenticity of host 'server (ip)' can't be established.
ECDSA key fingerprint is SHA256:some-weird-string.
Are you sure you want to continue connecting (yes/no)?
Type yes. The server will be permanently added as a known host. Don't even bother entering the password; just press Ctrl + C and try running a Jenkins job.
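Alternatively, you can pre-seed known hosts non-interactively with ssh-keyscan (run as the Jenkins user; the hostname is a placeholder):
ssh-keyscan -H server >> ~/.ssh/known_hosts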
Like @haschibaschi recommends, I also use the ssh-agent plugin. I need to use my personal UID credentials on the remote machine, because it doesn't have any UID Jenkins account. The code looks like this (using, for example, my personal UID="myuid" and remote server hostname="non_jenkins_svr"):
sshagent(['e4fbd939-914a-41ed-92d9-8eededfb9243']) {
    // 'myuid@' required for scp (this is from UID jenkins to UID myuid)
    sh "scp $WORKSPACE/example.txt myuid@non_jenkins_svr:${dest_dir}"
}
The ID e4fbd939-914a-41ed-92d9-8eededfb9243 was generated by the Jenkins credentials manager after I created a global domain credentials entry.
After creating the credentials entry, the ID is found under the "ID" column on the credentials page. When creating the entry, I selected the type 'SSH Username with private key' (the 'Kind' field) and copied in the RSA private key I had created for this purpose under the myuid account on host non_jenkins_svr, without a passphrase.
