My main development box uses Linux Mint.
When I am physically at the computer I can do remote operations like git fetch with no problem.
The user I log in as is "jonbri".
> whoami
jonbri
In ~/.ssh (/home/jonbri/.ssh) is my private key (/home/jonbri/.ssh/jonbri) and public key (/home/jonbri/.ssh/jonbri.pub).
But when I am at another computer, for example another Linux Mint machine, and I open an SSH remote shell to my main computer from the command line, operations such as git fetch act as if the keys in ~/.ssh are not being picked up.
Here's what I see (with pwd being the root of the git repo):
> git fetch
Password:
No matter which password I enter, it doesn't work.
To enable remote SSH shells I used apt-get to install openssh-server and openssh-client.
Any ideas why my keys aren't being picked up inside a remote SSH shell?
SSH is likely expecting the standard names of id_dsa for your private key and id_dsa.pub for your public key.
From the github documentation:
Check the directory listing to see if you already have a public SSH key.
The default public key file names are:
id_dsa.pub
id_ecdsa.pub
id_ed25519.pub
id_rsa.pub
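If you want to keep the non-default file name rather than renaming the keys, a minimal ~/.ssh/config sketch (assuming the paths from the question) tells SSH which identity to offer:

# /home/jonbri/.ssh/config -- offer the custom key for every host
Host *
    IdentityFile ~/.ssh/jonbri
    IdentitiesOnly yes

(As an aside, one possible reason it works when you are physically at the machine is that the desktop session's ssh-agent may already have the key loaded, while a bare remote shell does not.)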
On my local Windows dev machine I generated an SSH key using PuTTYgen. I also pasted the public key into the GitLab SSH keys section, so the two are now linked.
I can use SSH correctly from my Windows machine, but I also want to use it on my production server, which runs Ubuntu.
For example, I want to clone a repository over SSH onto my Ubuntu machine. Where and how should I add the SSH keys to my Ubuntu server so I can link it with GitLab?
I used this tutorial to generate SSH keys on Windows with PuTTY.
https://ourcodeworld.com/articles/read/1421/how-to-create-a-ssh-key-to-work-with-github-and-gitlab-using-puttygen-in-windows-10
How should I add the ssh keys to my Ubuntu server so I can link it with GitLab?
Ideally, you would create a dedicated key pair on your Ubuntu server, in order to be able to clone GitLab repositories.
On that Ubuntu server, go to the $HOME folder of your account 'user' (replace user with the actual user name you log in with on that server).
cd
# assuming you do not have a default key yet:
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
# copy ~/.ssh/id_rsa.pub to your GitLab account
# Check the key is working
ssh -Tv git@gitlab.com
# Use your key to clone repositories
git clone git@gitlab.com:me/myRepository
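To get the text to paste into the GitLab SSH keys page, you can print the public key created above (the file name assumes the ssh-keygen call shown):

cat ~/.ssh/id_rsa.pub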
I have some Jenkins jobs defined using a Jenkins Pipeline Model Definition, which build NPM projects. I use Docker containers to build these projects (using a common image with just Node.js + npm + yarn).
The results of the builds are contained in the dist/ folder that I zipped using a zip pipeline command.
I want to copy this ZIP file to another server using SSH/SCP (with private key authentication). My private key is added to the Jenkins environment (credentials manager), but when I use Docker containers, an SSH connection cannot be established.
I tried to add agent { label 'master' } to use the master Jenkins node for the file transfer, but it seems to create a clean workspace with a fresh Git fetch, without my built files.
After I tried the SSH Agent Plugin, I have this output:
Identity added: /srv/jenkins3/workspace/myjob-TFD@tmp/private_key_370451445598243031.key (rsa w/o comment)
[ssh-agent] Started.
[myjob-TFD] Running shell script
+ scp -r dist test@myremotehost:/var/www/xxx
$ docker exec bfda17664965b14281eef8670b34f83e0ff60218b04cfa56ba3c0ab23d94d035 env SSH_AGENT_PID=1424 SSH_AUTH_SOCK=/tmp/ssh-k658r0O76Yqb/agent.1419 ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 1424 killed;
[ssh-agent] Stopped.
Host key verification failed.
lost connection
How do I add a remote host as authorized?
I had a similar issue. I did not use the label 'master', and I identified that the file transfer works across slaves when I do it like this:
Step 1 - Create SSH keys on the remote host server and add the public key to authorized_keys
Step 2 - Create a credential in Jenkins using the SSH keys; use the private key from the remote host
Use the SSH agent plugin:
stage ('Deploy') {
    steps {
        sshagent(credentials: ['use-the-id-from-credential-generated-by-jenkins']) {
            sh 'ssh -o StrictHostKeyChecking=no user@hostname.com uptime'
            sh 'ssh -v user@hostname.com'
            sh 'scp ./source/filename user@hostname.com:/remotehost/target'
        }
    }
}
Use the SSH agent plugin:
SSH Agent Plugin
When using this plugin you can use the global credentials.
To add a remote host to known_hosts and hopefully cope with your error, try to manually SSH from the Jenkins host to the target host as the Jenkins user.
Get on the host where Jenkins is installed. Type
sudo su jenkins
Now use ssh or scp like
ssh username@server
You should be prompted like this:
The authenticity of host 'server (ip)' can't be established.
ECDSA key fingerprint is SHA256:some-weird-string.
Are you sure you want to continue connecting (yes/no)?
Type yes. The server will be permanently added as a known host. Don't even bother entering the password; just press Ctrl + C and try running a Jenkins job.
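If you prefer a non-interactive alternative (my suggestion, not part of the answer above), ssh-keyscan can pre-populate known_hosts for the Jenkins user; myremotehost is a placeholder for your target:

sudo su jenkins
# append the target's host key to the Jenkins user's known_hosts
ssh-keyscan -H myremotehost >> ~/.ssh/known_hosts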
Like @haschibaschi recommends, I also use the ssh-agent plugin. I need to use my personal UID credentials on the remote machine, because it doesn't have a Jenkins UID account. The code looks like this (using, for example, my personal UID="myuid" and remote server hostname="non_jenkins_svr"):
sshagent(['e4fbd939-914a-41ed-92d9-8eededfb9243']) {
// 'myuid#' required for scp (this is from UID jenkins to UID myuid)
sh "scp $WORKSPACE/example.txt myuid#non_jenkins_svr:${dest_dir}"
}
The ID e4fbd939-914a-41ed-92d9-8eededfb9243 was generated by the Jenkins credentials manager after I created a global domain credentials entry.
After creating the credentials entry, the ID is found under the "ID" column on the credentials page. When creating the entry, I selected type 'SSH Username with private key' ('Kind' field), and copied the RSA private key I had created for this purpose under the myuid account on host non_jenkins_svr without a passphrase.
I have updated my system with sudo apt-get update.
There was an update of PAM (the Pluggable Authentication Module). I don't remember the message, but there was something like a pink screen and I decided to choose no (sorry for the poor explanation).
After that the update continued until something like ssh stop/waiting and then nothing happened. I couldn't cancel this and decided to reboot my Ubuntu Server (14.04 LTS).
After that I cannot connect to this machine as a regular user with ssh -X user@host. Only the owner can connect, but no other user.
With ssh -v user@host I get the error
debug1: Authentications that can continue: publickey,password
Permission denied, please try again.
Then I recognized that there are a lot of missing files in my ~/.ssh/ directory.
There is only the file known_hosts. I think there should be also the files: Readme, authorized_keys, bup, deprec, id_dsa, id_dsa.pub.
Do I have to reinstall ssh?
You do not need to reinstall ssh.
Many of those files are generated as you use ssh and related commands.
The most important files in my experience (which you will generate) are:
authorized_keys: contains public keys which are authorized to connect.
id_dsa and id_dsa.pub (or id_rsa, etc.): the private key and the public key (with the .pub suffix) are the keys you offer when attempting a connection. These are generated by executing ssh-keygen.
Also, config is nice to have, but not necessary; see man ssh_config.
Restoring connections from other machines
It appears you've lost the authorized_keys file you had. If you wish to continue connecting via publickey from other machines, you will need to put the public key from the other machine into your authorized_keys file.
Ensure authorized_keys file exists (if not: touch ~/.ssh/authorized_keys)
Copy the public key (id_rsa.pub for example) from the machine[s] you will be connecting from.
Paste the public key[s] into authorized_keys, one per line.
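As a shortcut for these steps (a sketch, assuming OpenSSH and an existing id_rsa pair on the machine you connect from), ssh-copy-id does the copying for you:

# run on the machine you connect FROM; appends the key to ~/.ssh/authorized_keys on the server
ssh-copy-id -i ~/.ssh/id_rsa.pub user@your-server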
I created a new EC2 Amazon Linux instance. I want to allow a developer to SSH into the EC2 instance. To test this, I'm trying it from my windows computer. I have followed the instructions in the link below but I can't get SSH (Putty) to connect using the key pair I'm generating.
I'm following the instructions here as reference
and here
After logging into EC2 as ec2-user using FireSSH and the pem generated by AWS, I use SSH to run the following commands to create a new user, .ssh directory, and permissions.
[ec2-user ~]$ sudo adduser newuser
[ec2-user ~]$ sudo su - newuser
[newuser ~]$ mkdir .ssh
[newuser ~]$ touch .ssh/authorized_keys
[newuser ~]$ chmod 600 .ssh/authorized_keys
[newuser ~]$ vim .ssh/authorized_keys
Then I paste a public key into authorized_keys using vim. I will explain where I get the public key in the next step.
ssh-rsaAAAAB3NzaC1yc2EAAAADAQABAAABAQClKsfkNkuS ....
To create the public key which I pasted in the previous step I followed the steps in this reference starting at "Generating an SSH Key"
I copied the public key from PuTTYgen, which is shown in the box labeled "Public key for pasting into OpenSSH authorized_keys". Then I pasted that into the .ssh/authorized_keys file on my EC2 instance in the newuser directory.
I log out of the SSH client on EC2. Then I try to log in with PuTTY using the newly created private key on my Windows machine. I use the newuser login name. I get this error in PuTTY: server refused our key. There is also a dialog box that says Disconnected: No supported authentication methods available (server sent: publickey).
What am I doing wrong in these steps?
I did two things different and it works now. It's probably the number of bits that made it work.
I generated a new key pair using PuTTYgen, but I specified SSH-2 RSA with 1024 bits instead of the default of 2048 that PuTTYgen was using.
When I logged back into EC2 over SSH, I pasted the public key using nano instead of vim.
Always use the ec2-import-keypair feature to verify whether a key pair is good for an EC2 instance. If the import works, then it is good; otherwise, regenerate a compliant key pair. If you simply copy a key pair that is not compliant, you will run into trouble.
Here is the documentation for importing a key pair:
OpenSSH public key format (the format in ~/.ssh/authorized_keys)
Base64 encoded DER format
SSH public key file format as specified in RFC4716
DSA keys are not supported. Make sure your key generator is set up to create RSA keys.
Supported lengths: 1024, 2048, and 4096.
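For reference, a minimal sketch of such an import with the AWS CLI v2 (the key name and path are placeholders; the answer above refers to the older ec2-import-keypair tool):

# upload an existing RSA public key; EC2 rejects it if the format is not supported
aws ec2 import-key-pair --key-name my-key --public-key-material fileb://~/.ssh/id_rsa.pub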
So I just set up an Amazon EC2 instance and installed Git:
sudo yum install git
I then set up my SSH key with GitHub. Now when I try to clone my repo into the /var/www/html folder I get this error:
fatal: could not create work tree dir 'example.com'.: Permission denied
and when I run as root...
Cloning into 'example.com'...
Permission denied (publickey).
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
But I made sure that my GitHub public key matches my ~/.ssh/id_rsa.pub key. Is there something I'm missing here?
Your first error is because your user does not have access to write to /var/www/html. You could give your user permission to do so.
Your second error, when running as root, is likely because your SSH keys are in your user's home directory rather than in /root/.ssh/, or because your .ssh directory or the ~/.ssh/id_rsa.pub key file has improper permissions. ~/.ssh/ should have the permission bits 0700, and ~/.ssh/id_rsa.pub e.g. 0600.
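A minimal sketch of both fixes, assuming the default ec2-user account and the paths from the question:

# let your user write to the web root (one reasonable approach among several)
sudo chown -R ec2-user:ec2-user /var/www/html
# tighten the SSH directory and key permissions
chmod 700 ~/.ssh
chmod 600 ~/.ssh/id_rsa ~/.ssh/id_rsa.pub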
Note: this fix works for Mac users
In the case of macOS 10.12.2 or later, you will need to modify your ~/.ssh/config file to automatically load keys into the ssh-agent and store passphrases in your keychain.
Host *
AddKeysToAgent yes
UseKeychain yes
IdentityFile ~/.ssh/<your_id_rsa>
Add your SSH private key to the ssh-agent and store your passphrase in the keychain. If you created your key with a different name, or if you are adding an existing key that has a different name, replace id_rsa in the command with the name of your private key file.
ssh-add -K ~/.ssh/<your_id_rsa>
For more information please review
https://help.github.com/en/github/authenticating-to-github/generating-a-new-ssh-key-and-adding-it-to-the-ssh-agent
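Afterwards you can check that the key is actually being offered (assuming GitHub is the remote in question):

ssh -T git@github.com

A successful run greets you by user name instead of printing "Permission denied (publickey)".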
Have you tried this:
git: fatal: Could not read from remote repository
You can specify the username that SSH should send to the remote system as part of your remote's URL. Put the username, followed by an #, before the remote hostname.
git remote set-url website abc@***.com:path/to/repo
Is the private key in ~/.ssh/id_rsa the pair to your public key (~/.ssh/id_rsa.pub)?
If it's not (or you're not sure), I suggest you generate a new private/public key pair with ssh-keygen -t dsa.
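One quick way to check whether the pair matches (my own addition, using a standard ssh-keygen option):

# print the public key derived from the private key and compare it with the .pub file
ssh-keygen -y -f ~/.ssh/id_rsa
cat ~/.ssh/id_rsa.pub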
My solution matches that of nos: adding the root user's public key fixes it. Another option would be to change the permissions of the directory and execute the command as a regular user.