Different public key for jump host and destination host over SSH - Linux

I have a HOP server (bastion) and an APP server. I can ssh to the HOP server via:
$ ssh HOP
I can ssh to the APP server from the HOP server:
$ ssh APP
Both SSH connections work without a password, only via SSH keys.
During an Ansible deploy I found out that I cannot connect to the APP server via HOP. I am testing it on the command line:
ssh -o ProxyCommand="ssh -W %h:%p HOP" APP
After this, I can connect to the APP server, but it asks for a password. I found out that even though ssh to APP works from HOP, this way it does not pick up the configuration from the HOP server. So I specified the key to use on the command line:
ssh -o ProxyCommand="ssh -W %h:%p HOP" APP -i /etc/ssh/my_ssh_key
Warning: Identity file /etc/ssh/my_ssh_key not accessible: No such file or directory.
But now it tries to locate my_ssh_key on localhost, and it asks for a password again.
How can I force it to use the configuration from the HOP server, or tell it to use the SSH key from HOP rather than from localhost? Is this even possible?

I have achieved this with the following approach:
Ansible inventory (hosts file) changes to use the bastion host for a host group:
[testservers]
192.168.20.140
192.168.10.88
[testservers:vars]
ansible_port = 22 # remote host port
ansible_user = ec2-user # remote user
private_key_file = /Users/laptop-ansibleuser/.ssh/id_rsa # laptop key used to log in to the bastion host
ansible_ssh_common_args='-o StrictHostKeyChecking=no -o ProxyCommand="ssh -o \'ForwardAgent yes\' ansible-remote@<bastion host> -p 2222 \'ssh-add /home/ansible-remote/.ssh/id_rsa && nc %h %p\'"'
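For comparison, the same hop can also be described purely in the local ~/.ssh/config. This is only a sketch, reusing the placeholder bastion address, port, users and key path from the inventory above; it assumes that a key accepted by the target hosts is available to the local ssh client (as a local file or via the forwarded agent), because with ProxyJump or -W the second hop is authenticated from the local machine, not from the bastion:
Host bastion
    HostName <bastion host>
    Port 2222
    User ansible-remote
    ForwardAgent yes
    IdentityFile /Users/laptop-ansibleuser/.ssh/id_rsa
Host 192.168.20.140 192.168.10.88
    User ec2-user
    ProxyJump bastion
With this in place, ssh 192.168.20.140 reaches the target through the bastion in one step, and ansible_ssh_common_args can be reduced accordingly.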

Related

Problems creating an ssh tunnel from Windows to Ubuntu with VNC

I have Windows (client) and Linux (server).
I want to have a VNC access over ssh to Linux.
I use Port 2222 as ssh port.
I managed to install ssh and log in to the Linux server via ssh using a public key, without a password.
I have tried several configurations from the Windows console, which also work:
ssh -l user -L 5901:localhost:5901 xxx.xxx.xxx.xxx -p2222
OR
ssh -L 5901:127.0.0.1:5901 user@xxx.xxx.xxx.xxx -p2222
But when I try to connect via VNC viewer (Windows) to xxx.xxx.xxx.xxx:5901, the connection is interrupted.
I changed the default "sshd_config" to:
AllowTcpForwarding local
X11Forwarding yes
Port 2222 is open in ufw.
What mistake am I making?
Is port 5901 the right one to use?
From the fact that you are doing SSH port forwarding, I understand that xxx.xxx.xxx.xxx:5901 is not directly accessible from your Windows machine.
Can you try to connect via VNC viewer (Windows) to localhost:5901, after setting up ssh as follows?
ssh -X -L 5901:127.0.0.1:5901 user@xxx.xxx.xxx.xxx -p2222
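For example, a minimal sequence, assuming a VNC server is listening on display :1 (TCP port 5901) on the Linux machine:
ssh -L 5901:127.0.0.1:5901 user@xxx.xxx.xxx.xxx -p2222
Then point the Windows VNC viewer at localhost:5901 (not xxx.xxx.xxx.xxx:5901); the tunnel forwards that local port to port 5901 on the server itself.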

How to SSH tunnel from jump server to another server without directly logging in to the jump server

I know this question has been asked a lot, but I still have problems using an ssh proxy.
I have an EC2 instance (running a simple web server) in a private network in AWS, and a jump host to connect to it. The jump host is in a public network. The only way I can log in to the web server instance is through the jump host.
So I have created ~/.ssh/config file in my local computer as below:
Host jumphost
Hostname <Retracted-Public-IP>
user ec2-user
IdentityFile /Users/jananath/.ssh/private-key.pem
I can log in to the jumphost as: ssh jumphost and it works.
And in the jumphost above I have configured ~/.ssh/config as below:
Host my-web-server
Hostname <Retracted-Private-IP>
user ec2-user
IdentityFile ~/.ssh/web-server-private-key.pem
And I can ssh into the web server (from jumphost) as ssh my-web-server and it works.
I don't want to log in to the jumphost every time I need to log into the web server, so I tried proxying.
Therefore, I added another block to my local ~/.ssh/config file as below:
Host jumphost
Hostname <Retracted-Public-IP>
user ec2-user
IdentityFile /Users/jananath/.ssh/private-key.pem
Host my-web-server
ProxyCommand ssh jumphost -W %h:%p
And when I try ssh my-web-server, it gives the output below:
kex_exchange_identification: Connection closed by remote host
Connection closed by UNKNOWN port 65535
Can someone help me fix this?
This should work:
Host my-web-server
ProxyCommand ssh jumphost nc %h %p
You can also try:
ssh -oProxyCommand="ssh -W %h:%p jumphost" my-web-server
A third command worth trying:
ssh -J jumphost my-web-server
Copy the public key of your local machine to ~/.ssh/authorized_keys on the remote machine, not just on the jump server. This will enable passwordless login from the local machine using ssh -J. If your IP is IPv6, make the following modification in the config file on your local machine.
Host jumphost
Hostname Retracted-Public-IPv6
user ec2-user
IdentityFile /Users/jananath/.ssh/private-key.pem
Host my-web-server
ProxyCommand ssh jumphost -W [%h]:%p
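For completeness, here is a sketch of a single local ~/.ssh/config using ProxyJump; the IPs are the same placeholders as above, and it assumes the web server's key has been copied to the local machine (with ProxyJump the second hop is authenticated from the local client, not from the jump host):
Host jumphost
    Hostname <Retracted-Public-IP>
    User ec2-user
    IdentityFile /Users/jananath/.ssh/private-key.pem
Host my-web-server
    Hostname <Retracted-Private-IP>
    User ec2-user
    IdentityFile /Users/jananath/.ssh/web-server-private-key.pem
    ProxyJump jumphost
With this, ssh my-web-server goes through the jump host in a single step.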

Port forwarding on Spinnaker

I am doing port forwarding to connect my local machine to Spinnaker.
Step 1: localhost to AWS instance
ssh -A -L 9000:localhost:9000 -L 8084:localhost:8084 -L 8087:localhost:8087 ec2-user@<aws-instance-ip>
Step 2: AWS instance to Spinnaker cluster
ssh -L 9000:localhost:9000 -L 8084:localhost:8084 -L 8087:localhost:8087 ubuntu@10.100.10.5
This works fine when I open http://localhost:9000.
However, instead of port forwarding from my local machine, I want to set up the tunnel from another AWS instance (e.g. 55.55.55.55) and access it via http://55.55.55.55:9000, so that other team members can access the Spinnaker UI directly.
I have tried following the above steps from the 55.55.55.55 host and then tried
http://55.55.55.55:9000, but it didn't work.
What should I change to make it reachable on the 55.55.55.55 host?
Port forwarding is bound to the IP you give to ssh. If you give localhost (default), it will be accessible only on localhost (127.0.0.1). If you want to access it from outside, you need to give the 55.55.55.55 address instead.
You will also need a -g switch to ssh, which will allow remote hosts to connect to your locally forwarded ports.
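As a sketch, the second hop run on the 55.55.55.55 instance could bind the forwarded ports on all interfaces instead of only loopback (same hosts and ports as above; the instance's firewall/security group must also allow inbound 9000, 8084 and 8087):
ssh -L 0.0.0.0:9000:localhost:9000 -L 0.0.0.0:8084:localhost:8084 -L 0.0.0.0:8087:localhost:8087 ubuntu@10.100.10.5
Team members can then open http://55.55.55.55:9000 directly; passing -g instead of an explicit bind address has the same effect.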

How to access a host port (bind with ssh -R) from a container?

Using Docker 1.12.1, I am seeing strange behaviour when trying to access a host port created with ssh -R.
Basically I try to access a service running on port 12345 on my local machine from a docker container running on a server.
I opened an ssh connection with ssh -R *:12345:localhost:12345 user@server to open port 12345 on the server that forwards to port 12345 on my local machine.
Now when I try curl https://172.17.42.1:12345 inside the container (172.17.42.1 is the IP used to reach the docker host from the docker container), I get:
root@f6873fe1109b:/# curl https://172.17.42.1:12345
curl: (7) Failed to connect to 172.17.42.1 port 12345: Connection refused
But on the server the command curl http://localhost:12345 succeeds (i.e. no Connection refused):
server$ curl http://localhost:12345
curl: (52) Empty reply from server
I don't really understand how the port binding done with ssh differs from a test with nc on the server (which works):
# on server
nc -l -p 12345
# inside a container
root@f6873fe1109b:/# curl http://172.17.42.1:12345
curl: (52) Empty reply from server
NB: the container was started with docker run -it --rm maven:3-jdk-8 bash.
What can I do to allow my container to access the host port corresponding to an ssh binding?
From man ssh:
-R [...]
... Specifying a remote bind_address will only succeed if the server's GatewayPorts option is enabled
And man sshd_config:
GatewayPorts
Specifies whether remote hosts are allowed to connect to ports forwarded for the client. By default, sshd(8) binds remote port forwardings to the loopback address. This prevents other remote hosts from connecting to forwarded ports. GatewayPorts can be used to specify that sshd should allow remote port forwardings to bind to non-loopback addresses, thus allowing other hosts to connect. The argument may be “no” to force remote port forwardings to be available to the local host only, “yes” to force remote port forwardings to bind to the wildcard address, or “clientspecified” to allow the client to select the address to which the forwarding is bound. The default is “no”.
This means that a default sshd installation only allows forwards that bind to the loopback interface. If you want to allow forwards to bind to other interfaces than loopback, you need to set the GatewayPorts option to yes or clientspecified in your /etc/ssh/sshd_config.
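Concretely, a sketch of the change, assuming you can edit the server's sshd configuration:
# on the server, in /etc/ssh/sshd_config
GatewayPorts clientspecified
# reload sshd (the service name may be ssh or sshd depending on the distribution)
sudo systemctl restart sshd
# re-open the tunnel from the local machine; the explicit * bind now takes effect
ssh -R *:12345:localhost:12345 user@server
# inside the container, the forwarded port should now answer
curl http://172.17.42.1:12345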

Server to server copy using custom port and private key with passphrase

While copying a file from server to server with the scp command, using a custom port and a private key with a passphrase, I get an error.
The command is like:
scp -i xxxxxxx.pem -P xxxxx /path/source-file.zip root@example.com:/path/to/destination/file.zip
Error message:
ssh: connect to host example.com port xxxxx: Connection timed out
lost connection
Connecting to example.com using the private key (with the custom port and passphrase) works fine, but it does not work when we use the scp command.
This syntax is not working for me,
so I have used alternative methods such as wget and curl.
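If the interactive ssh login with the same key and port really works, one thing worth trying before switching tools is a transfer that reuses exactly those ssh options, for example with rsync (a sketch, assuming rsync is installed on both hosts and using the same placeholders as above):
rsync -avP -e "ssh -p xxxxx -i xxxxxxx.pem" /path/source-file.zip root@example.com:/path/to/destination/file.zip
If this also times out, the cause is most likely network-level (the custom port being blocked between the two servers) rather than the scp syntax.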
