Configuring a multi-node Hadoop install - Linux

I'm trying to configure a Hadoop master/slave environment, following this tutorial:
http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-multi-node-cluster/
So far, I have created 2 Vagrant (Ubuntu) boxes and installed Hadoop on both machines; both are up and running.
Now I have assigned a new IP address, 192.168.0.1, to my master machine and am trying to SSH to it, but it does not work:
ssh localhost - works
ssh master - does not work
My /etc/hosts:
127.0.0.1 localhost
127.0.1.1 vagrant
# The following lines are desirable for IPv6 capable hosts
::1 localhost ip6-localhost ip6-loopback
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
#hadoop
192.168.0.1 master

You need to follow these steps to set up passwordless SSH login.
Edit /etc/hosts on all the nodes and add the master and slave entries:
192.168.0.1 master
192.168.0.2 slave
ssh-keygen -t rsa
ssh-copy-id -i ~/.ssh/id_rsa.pub user@slave
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 0600 ~/.ssh/authorized_keys
Now try ssh master and ssh slave
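An optional extra, not part of the original steps: an ~/.ssh/config entry can pin the user, key, and address for each short name so they resolve consistently. The User value below is illustrative; adjust it to your actual account:

```
# ~/.ssh/config (fragment, illustrative)
Host master
    Hostname 192.168.0.1
    User user
    IdentityFile ~/.ssh/id_rsa

Host slave
    Hostname 192.168.0.2
    User user
    IdentityFile ~/.ssh/id_rsa
```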

Did you configure the interface with that IP address? Also, from which machine are you trying to ssh to that master IP? Where did you build these 2 VMs? On your PC? Post the /sbin/ifconfig and netstat -rn output from both the master and slave VMs.
Since you mentioned Vagrant, this might help: https://docs.vagrantup.com/v2/networking/index.html
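Worth checking in particular: a second IP on a Vagrant box normally has to be declared in the Vagrantfile as a private network, otherwise the address never gets configured on any interface. A minimal fragment, assuming a single-machine config block (the IP is the one from the question):

```ruby
# Vagrantfile (fragment) -- assumes the master box's config block
config.vm.network "private_network", ip: "192.168.0.1"
```

After editing, `vagrant reload` applies the network change.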

Related

kex_exchange_identification: Connection closed by remote host with ~/.ssh/config file

I have a bastion host where I have configured the ~/.ssh/config file as follows:
Host stage-bastion
ProxyCommand ssh -q stage-bastion nc -q0 <capped-ip> 22
Hostname <capped-ip>
User stage
IdentityFile /home/abc/Documents/key
Port 1984
ServerAliveInterval 15
ServerAliveCountMax 3
And I try to log in as follows:
ssh stage-bastion
and I get the error:
kex_exchange_identification: Connection closed by remote host
I even did eval "$(ssh-agent -s)" but no luck.
Then I tried normally as below:
ssh -i /home/abc/Documents/key stage@<capped-ip>
Voila it worked and I was able to ssh.
Can someone help me why my ~/.ssh/config is not working and giving the above error?
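One detail stands out in the config above, for what it's worth: the ProxyCommand under Host stage-bastion invokes ssh -q stage-bastion itself, so every attempt re-enters the same Host block instead of reaching the real host, and that kind of loop can surface as kex_exchange_identification: Connection closed by remote host. Since the plain ssh -i ... stage@<capped-ip> worked directly, a fragment without the self-referencing ProxyCommand should reproduce that working command. Values are copied from the question; note that the working manual command used the default port, so the Port 1984 line may also need revisiting:

```
Host stage-bastion
    Hostname <capped-ip>
    User stage
    IdentityFile /home/abc/Documents/key
    Port 1984
    ServerAliveInterval 15
    ServerAliveCountMax 3
```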

Copy file to remote host through intermediate (jump) host

Now I am connecting to endpoint using this command:
ssh my.jumphost.com -t 'export iip=111.22.3.44; bash'
The problem is that I don't have direct access to this IP, I mean I can't just ssh 111.22.3.44 from my jumphost. So basically I can reach destination host only if I export this variable with IP address from jumphost.
I've already looked into the scp command and into SSH tunnels, but it seems like both of them require direct access to the destination host.
I've also tried
cat test.py | ssh my.jumphost.com -t 'export iip=111.22.3.44' 'cat > /home/user/test.py'
but in that case file is being copied to jumphost only.
Any advice or guidance would be greatly appreciated!
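A hedged sketch: if the jump host can open a plain TCP connection to 111.22.3.44 on port 22 (even though an interactive ssh from it is not how you normally get in), OpenSSH's ProxyJump can carry scp straight through. The host alias endpoint below is made up; the other values come from the question:

```
# ~/.ssh/config (fragment; "endpoint" is an invented alias)
Host endpoint
    Hostname 111.22.3.44
    User user
    ProxyJump my.jumphost.com
```

Then `scp test.py endpoint:/home/user/test.py` copies in one step. If the destination really only accepts connections set up through the exported iip variable on the jump host, this won't apply, and you'd need whatever mechanism there consumes that variable.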

ssh tunnel from azure bastion hosts

Is it possible to do ssh tunneling (SSH port forwarding) from azure bastion host?
Like we do normally from a jump box:
ssh -i path/to/private_key -L 127.0.0.1:FORWARD_PORT:VM_IP:APPLICATION_PORT user@jumphost.net
ssh -i path/to/private_key -L 127.0.0.1:8080:10.0.0.1:8080 user@jumphost.net
Do you really need port forwarding? Your use case can work perfectly well with TCP forwarding, using the following SSH config.
Host JumpHost1
Hostname jumphost1.net
User user_jh1
Host JumpHost2
Hostname jumphost2.net
User user_jh2
ProxyCommand ssh -W %h:%p JumpHost1
Host AppBox
Hostname appbox_behind_firewall.net
User app_user
ProxyCommand ssh -W %h:%p JumpHost2
Then you can easily do ssh AppBox without issue. You'll need to have your local public key authorized on each jump host and on the appbox, which you should be able to do easily with ssh-copy-id if you are using OpenSSH.
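And if you do still want the actual tunnel from the original question, the same config style can carry it: a LocalForward line under the AppBox entry makes plain `ssh AppBox` open the forward through both jump hosts. The ports below mirror the 8080 example from the question and are placeholders:

```
Host AppBox
    Hostname appbox_behind_firewall.net
    User app_user
    ProxyCommand ssh -W %h:%p JumpHost2
    LocalForward 127.0.0.1:8080 10.0.0.1:8080
```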

ssh failed when installing hadoop: connect to host master port 22: connection refused

I want to install a Hadoop cluster. The hostname of my computer has been changed to Master. I configured SSH login without a password, but only ssh localhost succeeds; when it comes to ssh Master, it shows ssh: connect to host master port 22: Connection refused. I don't know why.
/etc/hosts
127.0.0.1 localhost
113.*.*.2 Master
113.*.*.31 Slave1
cd ~/.ssh
rm ./id_rsa
ssh-keygen -t rsa
cat ./id_rsa.pub >> ./authorized_keys
ssh Master
Do this :
ssh-keygen -t rsa
ssh-copy-id -i ~/.ssh/id_rsa.pub root@master
ssh-copy-id -i ~/.ssh/id_rsa.pub root@slave1
chmod 0600 ~/.ssh/authorized_keys
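A note on the error itself: "Connection refused" on port 22 means the hostname resolved and the machine answered, but nothing is listening on that port. That usually points to sshd not running (or being firewalled) on Master rather than a key problem; the keygen steps above fix authentication, not reachability. A small bash probe to tell the two apart (the hostname is the one from the question; /dev/tcp is a bash built-in pseudo-device):

```shell
#!/usr/bin/env bash
# Report whether anything is listening on a host's SSH port.
# "refused or unreachable" => start sshd on the target, e.g. on Ubuntu:
#   sudo apt-get install openssh-server && sudo service ssh start
probe_port() {
  local host=$1 port=$2
  if timeout 2 bash -c "cat < /dev/null > /dev/tcp/$host/$port" 2>/dev/null; then
    echo "port $port open on $host"
  else
    echo "port $port refused or unreachable on $host"
  fi
}

probe_port Master 22   # hostname taken from the question
```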

Tunnel SSH: access a server blocked by firewall through another server

I have 1 pc and 2 servers.
Each device has a user associated with it:
pc (10.0.0.10) -> pc_user
server1 (10.0.0.146) -> server1_user
server2 (192.168.0.3) -> server2_user
There is a firewall blocking everything from "pc" to "server2".
The goal is to access "server2" from "pc" through an SSH tunnel to "server1".
How can I do it?
If using openssh:
TRIVIAL WAY
PC> ssh server1_user#server1
server1> ssh server2_user#server2
PROXY WAY
Get a netcat on server1. If you can't install one, you can try to statically compile one (check busybox) or download one (find server1's OS and version and check its repos). If you have python/perl, there are "script implementations" of the command.
On your ~/.ssh/config file add:
Host server1
HostName 10.0.0.146
User server1_user
Host server2
ProxyCommand ssh -C -q server1 /<server1_path_to>/nc 192.168.0.3 22
User server2_user
ssh server2 will prompt for both passwords if you're not using key authentication.
Since OpenSSH 5.4, netcat is not required for proxying:
Host server2
ProxyCommand ssh -W %h:%p server1
User server2_user
TUNNEL WAY
PC TTY1> ssh -L 2222:192.168.0.3:22 server1_user#server1
PC TTY2> ssh server2_user#localhost -p 2222
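A side note on the TUNNEL WAY: once the first command is up, the forwarded port also carries file transfers; scp's port flag is uppercase -P, and the path below is just an example:

```
PC TTY2> scp -P 2222 somefile server2_user@localhost:/tmp/
```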
