How to set up password-less authentication between two different accounts - linux

Can we set up password-less authentication between two different users on two machines?
E.g.: Computer A has user A, computer B has user B.
Can we set up password-less ssh so that user A on computer A can log into computer B using his own user account (A)?
Thank you!!

If I understand your question, can you set up ssh keys to allow user A and user B to log in to two different computers A & B without providing a password? Sure, but user A can't log into user B's account via ssh any more than user A can log into user B's account on a local machine (the $HOME directory ownerships are different, etc.; that's what su is for).
To create a password-less login, let's take user A on computer A, who has an account on computer B and would like to ssh hostnameB and log in without providing a password.
(1) user A creates a public/private key pair on computer A with ssh-keygen -t ecdsa (for an ecdsa key; dsa keys are no longer supported in current openssh because they are considered insecure). When ssh-keygen is run it creates two files (by default in $HOME/.ssh): id_ecdsa (the private key) and id_ecdsa.pub (the public key).
(2) for user A to log in to computer B without a password, he must first transfer his public key to computer B and add it to his $HOME/.ssh/authorized_keys file on computer B. E.g., from computer A:
$ ssh-keygen -t ecdsa # generate key-pair
$ cd ~/.ssh # verify private and public keys created
$ rsync -a id_ecdsa.pub hostnameB:~/.ssh/id_ecdsa.pub.hostA
password: enter pw
$ ssh hostnameB
password: enter pw
$ cd ~/.ssh
$ cat id_ecdsa.pub.hostA >> authorized_keys # authorized_keys permissions must be 0600
$ exit # exit hostnameB
note: above, you could rsync the public key directly to the ~/.ssh/authorized_keys file on computer B, if you are sure one does NOT already exist, to save time and completely skip the last step of appending the transferred file to it. e.g.
$ rsync -a id_ecdsa.pub hostnameB:~/.ssh/authorized_keys
(you may have to check permissions on computer B afterwards)
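Alternatively, if ssh-copy-id is available on computer A, the transfer-and-append steps collapse into one command (a sketch; it prompts for user A's password on computer B once, creates ~/.ssh there if needed, and appends the key to authorized_keys):
$ ssh-copy-id -i ~/.ssh/id_ecdsa.pub hostnameB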
Now for the test, user A should no longer need a password to log into computer B. From computer A:
$ ssh hostnameB
welcome to hostnameB>
Now you simply repeat the process: create a key pair for each user, transfer the public key to the host you want to access without a password, and add the public key to that host's authorized_keys file. (Note: you can just copy the same private key to everyone's ~/.ssh directory and add the same public key to everyone's ~/.ssh/authorized_keys file, but that rather defeats the purpose of having separate keys.) Note: each authorized_keys file must be owned by the user who owns the $HOME/.ssh directory, and the file permissions must be 0600 (-rw-------), or sshd will not allow a connection.
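If sshd refuses the key, something like the following, run on computer B as user A, usually puts the ownership and permissions right (a sketch; adjust if your layout differs):
$ chown -R "$USER" ~/.ssh           # .ssh and everything in it must belong to the login user
$ chmod 700 ~/.ssh                  # directory readable/writable/searchable by the owner only
$ chmod 600 ~/.ssh/authorized_keys  # file readable/writable by the owner only (-rw-------)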
That's all there is to it (you can check /etc/ssh/sshd_config to ensure the name of the authorized_keys file has not been changed to something else).
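For example, on computer B (assuming the default config location; the default value is .ssh/authorized_keys):
$ grep -i authorizedkeysfile /etc/ssh/sshd_config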
Give it a try and let me know if you have questions. I've done it hundreds of times -- no issues as long as you follow those rules.

Related

Accessing ec2 instance via ftp/ssh from user other than ec2-user

I've created a key pair and have access to a linux instance via FTP and SSH using ec2-user. I have also added the desired user to all groups that ec2-user is in. All of this was tested using the .ppk key, generated by puttygen, which allowed ec2-user onto the instance. I even changed the key type from SSH-2 to SSH-1 via puttygen.
I've followed countless guides, but without any luck. Is there anything else I can do?
From what I've read, I'll have to create a key pair (don't know where) for each user and add one of the keys to a .ssh directory.
Assuming putty and AWS EC2
Step by step
1) The AWS EC2 instance must be running sshd and have port 22 open in its security groups. If you can log in with the default user, then this is already OK for other users connecting from the same address
2) generate a PuTTY key pair using PuTTYgen. During the setup process you will be offered a "public key for pasting". Copy this into your cut-and-paste buffer
Also save the private part of the key
3) log in to the AWS EC2 instance and become root.
4) If you haven't made the user (let's call the user "binky"), make it with adduser or a similar command
5) issue commands like this to add key
cd ~binky
mkdir .ssh
cd .ssh
cat > authorized_keys
paste the public key from step 2 here and press ctrl-D
NB: use the command wc -l authorized_keys to check that your cut-and-paste is one line only. The file has one line per key
6) ensure that file permissions are correct
cd ~binky
chmod 700 .ssh
chmod 644 .ssh/authorized_keys
chown -R binky .ssh
7) back on your PuTTY host, run Pageant. Right-click on the hat logo in the tray/bar thing (I'm not an MS Windows expert) and use "Add Key" to add the private key from step 2
8) on your PuTTY host, open a new, blank connection. Give the IP or domain name in the "Host Name (or IP Address)" box. In the left-hand menu tree find the "Connection" -> "Data" box and give the "Auto-login username" as the user on EC2 (binky)
9) save the settings under a new name using the "Session" box of PuTTY
10) click on "Open" and you should be logged in
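If you later want to use the same key from a plain OpenSSH client rather than PuTTY (not part of the original steps), the command-line puttygen found on Linux can export the private key in OpenSSH format; on Windows, PuTTYgen's "Conversions" menu does the same. A sketch, with binky.pem as a made-up filename:
puttygen binky.ppk -O private-openssh -o binky.pem   # export the .ppk private key as an OpenSSH key
chmod 600 binky.pem
ssh -i binky.pem binky@<ec2-public-ip>               # should log in without a password prompt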

Unable to connect via ssh with public key authentication method

On my Windows 10 machine, I am running into the problem of not being able to connect to my Vagrant virtual machine via ssh with the public key authentication method from git bash, using a command such as
$ ssh -v lauser@127.0.0.1 -p 2222 -i ~/.ssh/id_rsa
I would be prompted for a password, as if the public key I copied into the ~/.ssh/authorized_keys file inside the VM were not seen. Meanwhile, the password authentication method works, as does 'vagrant ssh'.
I have made sure to
create key pairs locally, create a .ssh directory on the remote, and add the pub key string to the remote's .ssh/authorized_keys file; both .ssh and .ssh/authorized_keys are owned by the user (lauser), and set to 700 and 644
edit the /etc/ssh/sshd_config file on the VM to use
RSAAuthentication yes
PubkeyAuthentication yes
and restarted the sshd server (with 'sudo service ssh restart').
verify that the firewall has been disabled temporarily to eliminate any complication.
verify that only one VM is running; all others are either in 'suspend' or 'halt' mode.
confirm the file type with 'file ~/.ssh/authorized_keys', and get the confirmation '~/.ssh/authorized_keys: OpenSSH RSA public key'
verify that the keys match by comparing the output of 'sudo cat ~/.ssh/authorized_keys' in the VM with the output of 'cat ~/.ssh/id_rsa.pub' locally.
But still I get Permission denied (publickey) when trying to connect with public key authentication.
It sounds like you've done everything correctly so far. When I run in to this problem, it's usually due to directory permissions on the target user's home directory (~), ~/.ssh or ~/.ssh/authorized_keys.
See this answer on SuperUser.
I faced the same problem when the home directory on the remote did not have the correct permissions. Changing its permissions from 777 to 744 helped me.
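A sketch of the usual fixes, run on the VM as the target user (lauser); the log path assumes a Debian/Ubuntu guest:
chmod go-w ~                       # the home directory must not be group- or world-writable
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys
sudo tail -f /var/log/auth.log     # watch sshd's complaints while retrying the login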

How to make key based ssh user?

I am new to Ubuntu/Linux. I have to create an ssh user on a remote system, generate a key for it, and access the system with a key file via the command
ssh -i key_file user@host
Can anybody tell me how I can do this?
On the system you are trying to connect to, the public key (usually id_rsa.pub or something similar) needs to be added to the authorized_keys file.
If the user is brand new and the authorized_keys file doesn't exist yet, this command will create it for you.
cp ~/.ssh/id_rsa.pub ~/.ssh/authorized_keys
Next, just make sure sshd is running on the host, and you should be able to connect with the command you posted.
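For example (a sketch; the service is named sshd on some distributions and ssh on others, such as Debian/Ubuntu):
systemctl status sshd        # or: sudo service ssh status
sudo systemctl start sshd    # start it if it is not running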
On the remote server:
ssh-keygen
ssh-copy-id user@host
cd .ssh
Make a copy of the file id_rsa and give it to anybody who wants to access this server/system.
On the other system:
ssh -i id_rsa user@host
If you want to connect to another host as user "user", what you need is the public key of the user that is going to open that connection, i.e. the user you are logged in as on your desktop computer or on whatever server you are coming from, not the user you are logging in as on the remote host.
You can check whether the keys for your current user have already been created in $HOME/.ssh; there you should find something like "id_rsa" and "id_rsa.pub" (for rsa keys). If they don't exist, you create them by calling
ssh-keygen -t rsa
The public key that is generated that way, id_rsa.pub in this example, has to be put in a file ${HOME of user on remote host}/.ssh/authorized_keys on the target host.
If this file does not exist on the remote host or if even .ssh does not exist, you have to create those files with the following permissions:
.ssh 700
.ssh/authorized_keys 600
See http://www.openssh.com/faq.html#3.14 for details.
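A minimal sketch of that transfer, assuming OpenSSH's ssh-copy-id is available (user and host are placeholders):
ssh-copy-id user@host
# or, done by hand:
cat ~/.ssh/id_rsa.pub | ssh user@host 'mkdir -p ~/.ssh && chmod 700 ~/.ssh && cat >> ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys'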
A detailed description of the process can be found here:
https://help.github.com/articles/generating-ssh-keys/

authorized_keys is not present for new user

I want to set up an ssh key on a Linux machine running under AWS in the EC2 cloud.
For that, I first installed cygwin, then followed these steps:
ssh-keygen -t dsa -f ~/.ssh/<key name> -C "<username of remote server>@<ip>"
cat ~/.ssh/<key name>.pub | ssh <username of remote server>@<ip> "cat >> ~/.ssh/authorized_keys"
Now the 1st statement executes successfully but the 2nd statement shows
bash: /home/<username of server>/.ssh/authorized_keys: No such file exists
Prior to this, I connected to the remote machine as root and created the user that I am specifying in commands 1 and 2 (username).
And I saw that the file is not present on the remote server for the user I created explicitly, but it is present for the user root.
When you create a new user, the ~/.ssh directory is not created by default. You will have to create the ~/.ssh/ directory and ~/.ssh/authorized_keys file yourself.
On your server, check whether ~/.ssh or ~/.ssh/authorized_keys exists. Looking at the error you have, it seems that it does not.
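Something like this, run on the server as the new user, creates them with the permissions sshd expects (a sketch; 700 on the directory, 600 on the file):
mkdir -p ~/.ssh
chmod 700 ~/.ssh
touch ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys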
When you create a new Linux instance, you specify a key pair that you want to use. You have a choice of creating a key pair and downloading the private key, or uploading a public key of your own.
In your steps, you never reference the key pair you specified when you created the instance. So the 2nd command should be something like:
cat ~/.ssh/<key name>.pub | ssh -i ~/.ssh/<key specified when launching instance> ec2-user@<public ip> ...
ec2-user may be different depending on what AMI you used to create your instance - ubuntu is the default user for ubuntu instances, for example.

Asking password after command ssh server2

I have two servers:
server1
server2
I want to log in to server2 from server1. I added each server's pub key (ssh_host_rsa_key.pub) to the other server's .ssh/authorized_keys.
When I run cd /etc/ssh; ls -ltr as root, I am able to see the files below:
sshd_config
ssh_config
moduli
ssh_host_key.pub
ssh_host_key
ssh_host_rsa_key.pub
ssh_host_rsa_key
ssh_host_dsa_key.pub
ssh_host_dsa_key
Host keys such as ssh_host_rsa_key.pub are recorded automatically by ssh in known_hosts files; they are not intended to be managed by the user.
The authorized_keys file is intended for user identity keys. What you really want to do is use ssh-keygen to generate an identity key representing you:
ssh-keygen -t ecdsa
Two identity files are generated: the private key id_ecdsa and the public key id_ecdsa.pub. Copy the public key into server2's .ssh/authorized_keys.
If you created a passphrase for your identity files, that's what you will be using from now on. Otherwise, your login will be password-less.
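For example, from server1 (a sketch; ssh-copy-id appends the public key to server2's authorized_keys for you, and server2 stands for the real hostname):
ssh-copy-id -i ~/.ssh/id_ecdsa.pub server2
ssh server2    # should now log in without asking for the account password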
