Tunnel SSH: access a server blocked by firewall through another server - linux

I have 1 pc and 2 servers.
Each device has a user associated with it:
pc (10.0.0.10) -> pc_user
server1 (10.0.0.146) -> server1_user
server2 (192.168.0.3) -> server2_user
There is a firewall blocking everything from "pc" to "server2".
The goal is to access "server2" from "pc" through an SSH tunnel via "server1".
How can I do it?

If using OpenSSH:
TRIVIAL WAY
PC> ssh server1_user@server1
server1> ssh server2_user@server2
PROXY WAY
Get a netcat on server1. If you can't install one, you can try a statically compiled one (check busybox) or download one (find server1's OS and version and check its repositories). If you have python/perl, there are "script implementations" of the command.
On your ~/.ssh/config file add:
Host server1
    HostName 10.0.0.146
    User server1_user
Host server2
    ProxyCommand ssh -C -q server1 /<server1_path_to>/nc 192.168.0.3 22
    User server2_user
ssh server2 will prompt for both passwords, if you're not using key authentication.
Since OpenSSH 5.4, netcat is no longer required for proxying:
Host server2
    HostName 192.168.0.3
    ProxyCommand ssh -W %h:%p server1
    User server2_user
TUNNEL WAY
PC TTY1> ssh -L 2222:192.168.0.3:22 server1_user@server1
PC TTY2> ssh server2_user@localhost -p 2222
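On newer clients the same multi-hop setup can be written even more compactly with ProxyJump (OpenSSH 7.3 and later); a minimal sketch using the hosts above:

```
# One-liner; additional hops would be comma-separated
PC> ssh -J server1_user@10.0.0.146 server2_user@192.168.0.3

# Or as the equivalent ~/.ssh/config entry:
Host server2
    HostName 192.168.0.3
    User server2_user
    ProxyJump server1
```

With the config entry in place, a plain ssh server2 sets up the hop automatically.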

Related

kex_exchange_identification: Connection closed by remote host with ~/.ssh/config file

I have a bastion host where I have configured the ~/.ssh/config file as follows:
Host stage-bastion
    ProxyCommand ssh -q stage-bastion nc -q0 <capped-ip> 22
    Hostname <capped-ip>
    User stage
    IdentityFile /home/abc/Documents/key
    Port 1984
    ServerAliveInterval 15
    ServerAliveCountMax 3
And I try to log in as follows:
ssh stage-bastion
and I get the error:
kex_exchange_identification: Connection closed by remote host
I even did an eval "$(ssh-agent -s)" but no luck.
Then I tried normally as below:
ssh -i /home/abc/Documents/key stage@<capped-ip>
Voila it worked and I was able to ssh.
Can someone help me why my ~/.ssh/config is not working and giving the above error?
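A likely cause, offered as an assumption rather than a confirmed diagnosis: the ProxyCommand inside Host stage-bastion runs ssh stage-bastion again, so every connection attempt recurses into its own proxy instead of reaching the host. Splitting the hop and the target into two Host entries (the alias name below is hypothetical) avoids the loop:

```
# "bastion-hop" is a hypothetical alias for the bastion itself
Host bastion-hop
    Hostname <capped-ip>
    User stage
    IdentityFile /home/abc/Documents/key

# The target now proxies through bastion-hop, not through itself
Host stage-bastion
    Hostname <capped-ip>
    Port 1984
    User stage
    IdentityFile /home/abc/Documents/key
    ProxyCommand ssh -q bastion-hop nc -q0 %h %p
```

If the direct ssh -i key stage@<capped-ip> already reaches the machine, the ProxyCommand may not be needed at all.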

ssh tunnel from azure bastion hosts

Is it possible to do SSH tunneling (SSH port forwarding) from an Azure bastion host?
Like we do normally from a jump box:
ssh -i path/to/private_key -L 127.0.0.1:FORWARD_PORT:VM_IP:APPLICATION_PORT user@jumphost.net
ssh -i path/to/private_key -L 127.0.0.1:8080:10.0.0.1:8080 user@jumphost.net
Do you really need port forwarding? Your use case can work perfectly well with TCP forwarding, using the following SSH config:
Host JumpHost1
    Hostname jumphost1.net
    User user_jh1
Host JumpHost2
    Hostname jumphost2.net
    User user_jh2
    ProxyCommand ssh -W %h:%p JumpHost1
Host AppBox
    Hostname appbox_behind_firewall.net
    User app_user
    ProxyCommand ssh -W %h:%p JumpHost2
Then you can simply run ssh AppBox without issue. You'll need your local public key authorized on each jump host and on the appbox, which you should be able to do easily with ssh-copy-id if you are using OpenSSH.
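Since ssh-copy-id goes through the regular ssh client, it honors the config above, including the ProxyCommand hops; a sketch, assuming a key pair already exists locally:

```
ssh-copy-id JumpHost1   # direct connection
ssh-copy-id JumpHost2   # tunnels through JumpHost1 via the ProxyCommand
ssh-copy-id AppBox      # tunnels through both jump hosts
```

Each step can fall back to password authentication once, after which the installed key takes over for the next hop.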

How could I mount remote directory to local machine through two ssh hops

I can access my serve like this:
(from local) ssh -p5222 name@server1.com
(from server1) ssh name@server2.com
Then I can work on server2.
Now I find I need to mount a folder on server2 to my local machine so that I can use my IDE.
I tried this:
ssh -Nf name@server1.com -p5222 -L 2233:name@server2.com:2233
sshfs -p 2233 localname@localhost:~/ ./target-dir
But I got this error message:
channel 2: open failed: administratively prohibited: open failed read: Connection reset by peer
Why am I getting this error, and how can I mount the remote folder on my local machine?
From the commands you run, it looks like the ssh server on server2.com is listening on the default port 22:
(from server1) ssh name@server2.com
If that's the case, then you need to forward the connection towards this port 22.
Instead of:
ssh -Nf name@server1.com -p5222 -L 2233:name@server2.com:2233
Do:
ssh -Nf name@server1.com -p5222 -L 2233:server2.com:22
(Note that the -L specification takes port:host:hostport, so the user name does not belong in it.)
Also, in your sshfs command, you need to provide the ssh user on server2.com, not your local user.
Instead of:
sshfs -p 2233 localname@localhost:~/ ./target-dir
Do:
sshfs -p 2233 name@localhost:~/ ./target-dir
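As an alternative to the background tunnel, sshfs can delegate the hop to ssh itself, since it passes unrecognized -o options through to the underlying ssh; a sketch, assuming OpenSSH 7.3+ on the local machine:

```
# Single command, no separate tunnel process; an empty remote
# path mounts the remote home directory
sshfs -o ProxyJump=name@server1.com:5222 name@server2.com: ./target-dir
```

The ProxyJump value uses the usual [user@]host[:port] form, so the non-default port 5222 of server1.com goes there rather than in a -p flag.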

Shell script remotely

I have a script running on one server that does some job on another server.
It contains many scp and ssh commands, which is why I have to enter the remote server's password at each remote command.
Is there any way to establish an ssh connection between the servers so that I type the remote password only once?
thanks
I would suggest setting up an ssh config together with ssh keys. In a nutshell, the config holds an alias for one or more remote servers:
ssh remote_server1
ssh remote_server2
While your config file will look something like this:
Host remote_server1
    Hostname 192.168.1.12
    User elmo
    IdentityFile ~/.ssh/keys/remote.key
    ...
If an ssh config file is not for you (although I can highly recommend it), you can use sshpass as well.
sshpass -p 't#uyM59bQ' ssh username@server.example.com
Do note that the above does expose your password. If someone else has access to your account, the history command will show the code snippet above.
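If passwordless login is an option, a one-time key setup removes the prompts entirely; a minimal sketch using the alias and paths from the config above:

```
# Generate a key pair (the empty passphrase here is just for illustration)
ssh-keygen -t ed25519 -f ~/.ssh/keys/remote.key -N ""
# Install the public key on the remote server from the config above
ssh-copy-id -i ~/.ssh/keys/remote.key.pub elmo@192.168.1.12
# Subsequent logins, and the scp/ssh commands in the script, no longer prompt
ssh remote_server1
```

This asks for the remote password exactly once, during ssh-copy-id.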

how to transfer data between local and remote server connected via intermediate server?

I can login by ssh -X servA from local, then ssh -X servB from servA
To copy data from local to servB, I scp files from local to servA, then from servA to servB.
Is it feasible to copy files from local to servB directly and vice versa?
You can use nc (netcat) as a proxy for ssh.
So for your example, edit your ~/.ssh/config file to look like this:
Host servB
    ProxyCommand ssh -q servA nc servB 22
As long as nc is in your path, you should now be able to ssh or scp directly to servB.
If you don't have nc, you can do it with ssh -W if your version is new enough (>= OpenSSH 5.4):
Host servB
    ProxyCommand ssh -W servB:22 servA
Use ProxyCommand in ssh config file.
This is what I usually do (I do it on a Mac; I don't know if it's different on a Windows machine):
Once you have set up the connection with either servA or servB you can do the following.
Copy from local to servA or servB:
$ scp -P <port-number used> <file location to copy from> <username_in servA/servB>@localhost:<file location to copy to>
NOTE: This works from your local machine without ssh-ing into servA/servB; you just need the connection established.
Or from servA to servB:
$ scp -P <port-number used> <username_in servA>@localhost:<file location to copy from> <username_in servB>@localhost:<file location to copy to>
NOTE: I haven't tried this scp from server to server, but it seems fairly straightforward.
Just trying to help here.
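With a ProxyCommand entry like the one in the first answer, scp needs no manual tunnel or intermediate copy at all; a sketch assuming the servB config entry above (file paths are placeholders):

```
# Copy local -> servB directly; ssh resolves the hop from ~/.ssh/config
scp ./localfile servB:/some/remote/path

# Without a config entry, the hop can be given inline (OpenSSH >= 7.3)
scp -o ProxyJump=servA ./localfile servB:/some/remote/path
```

The same forms work in reverse (servB:/path ./localdir) for copying back.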
