.screenrc and logging in to multiple remote servers - linux

I'm looking for a way to log in to multiple remote servers over SSH in multiple tabs while using screen. Right now I start screen with several tabs, each running bash, and then I have to manually SSH to each remote server and enter the passphrase for my key every time. I'd like to enter that passphrase once and log in to all of the servers automatically. All of them can be reached with the same SSH key. Any ideas?

I think what you are looking for is a way to do the same thing on X servers at once. If you are on Ubuntu, you can achieve this with Terminator: open Terminator, split the screen X times, and then enable broadcast mode from the top-left corner of the window. Everything you type is then sent to every pane. This will allow you to work like a Jedi!
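Since the question mentions .screenrc specifically: one way to type the key passphrase only once is to load the key into ssh-agent before starting screen, and let a session file open one window per host. A minimal sketch, assuming an ed25519 key and placeholder hostnames:

```shell
# Load the key into ssh-agent once; every ssh started afterwards reuses it.
eval "$(ssh-agent)"
ssh-add ~/.ssh/id_ed25519      # prompts for the passphrase a single time

# ~/.screenrc-servers opens one named window per host, e.g.:
#   screen -t web1 ssh user@web1.example.com
#   screen -t web2 ssh user@web2.example.com
#   screen -t db1  ssh user@db1.example.com
screen -c ~/.screenrc-servers
```

Because all tabs inherit the agent's environment from the parent shell, each `ssh` authenticates silently with the already-unlocked key.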

Related

How to fix Super PuTTY not opening multiple Telnet nodes in separate tabs when I group-select "console to nodes"

When I group select to console into multiple Eve-NG Telnet nodes within a lab and then select ">_Console To Selected Nodes", Super Putty opens, however, it will only open a tab for a single Telnet node.
I have created a lab in Eve-NG with two vSRX Telnet nodes that are connected via ge-0/0/1 and ge-0/0/2. The lab also has an Eve-NG cloud image to which the vSRX nodes are connected via ge-0/0/3. I added two GUI-enabled Linux VNC hosts, each connected to the aforementioned cloud image via e0.
I set up Super PuTTY in the Windows 11 registry and then set it as the default Telnet application with the "only allow single instance" mode enabled.
Ideally, Super PuTTY should open both vSRX nodes within a single instance and each within a separate tab.
I discovered through troubleshooting that if a Super PuTTY session is already open on the desktop, then both vSRX nodes are opened in separate tabs. Alternatively, if the "only allow single instance" mode is disabled in Super PuTTY, both vSRX nodes open, but in separate Super PuTTY instances. Neither behavior is desirable.
Would anyone be able to offer any tips for solving the above issue?

Is it possible to write a shell script that takes input then will ssh into a server and run scripts on my local machine?

I have several scripts on my local machine. These scripts run install and configuration commands to setup my Elasticsearch nodes. I have 15 nodes coming and we definitely do not want to do that by hand.
For now, let's call them Script_A, Script_B, Script_C and Script_D.
Script_A will be the one to initiate the process; it currently contains:
#!/bin/bash
read -p "Enter the hostname of the remote machine: " hostname
echo "Now connecting to $hostname!"
ssh root@$hostname
This works fine, obviously, and I can get into any server I need to. My confusion is running the other scripts remotely. I have read a few other articles/SO questions, but I'm just not understanding the methodology.
I will have a directory on my machine as follows:
Elasticsearch_Installation
|
|=> Scripts
|
|=> Script_A, Script_B, etc..
Can I run Script_A, which remotes into the server, then come back to my local machine and run Script_B and so on within the remote server, without moving the files over?
Please let me know if any of this needs to be clarified; I'm fairly new to the Linux environment in general, much less running remote installs from scripts over the network.
Yes, you can. Use ssh in non-interactive mode; it will be like launching a command in your local environment.
ssh root@$hostname /remote/path/to/script
Nothing will change on your local system; you will be back at the point from which you launched the ssh command.
NB: this command will ask you for a password. If you want a truly non-interactive flow, set up passwordless login to the host, as explained here:
How to ssh to localhost without password?
You have a larger problem than just setting up many nodes: you have to be concerned with ongoing maintenance and administration of all those nodes, too. This is the space in which configuration management systems such as Puppet, Ansible, and others operate. But these have a learning curve to overcome, and they require some infrastructure of their own. You would probably benefit from one of them in the medium-to-long term, but if your new nodes are coming next week(ish) then you probably want a solution that you can use immediately to get up and going.
Certainly you can ssh into the server to run commands there, including non-interactively.
My confusion is running the other scripts remotely.
Of course, if you want to run your own scripts on the remote machine, they have to be present there first. But this is not a major problem, because if you have ssh then you also have scp, the secure copy program. It can copy files to the remote machine, something like this:
#!/bin/bash
read -p "Enter the hostname of the remote machine: " hostname
echo "Now connecting to $hostname!"
scp Script_[ABCD] "root@${hostname}":./
ssh "root@${hostname}" ./Script_A
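If you would rather not copy the scripts over at all, another option is to feed each local script to a remote shell over stdin with `bash -s`. A sketch, assuming the Scripts directory layout described in the question:

```shell
#!/bin/bash
# Run each local script on the remote host without copying it first:
# "bash -s" makes the remote bash read its script from stdin,
# and ssh connects that stdin to the local file.
read -p "Enter the hostname of the remote machine: " hostname
for script in Script_A Script_B Script_C Script_D; do
    echo "Running $script on $hostname..."
    ssh "root@${hostname}" 'bash -s' < "Scripts/$script"
done
```

The trade-off is that the scripts never exist on the remote disk, so they cannot source each other there; for scripts that are independent of one another, this keeps everything in one place locally.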
I also manage Elasticsearch clusters with multiple nodes. A hack that works for me is to use the Terminator terminal emulator and split it into multiple windows/panes, one for each ES node. Then you can broadcast the commands you type in one window to all of the windows.
This way, you run commands and view their results almost interactively across all nodes in parallel. You can also save this layout of windows in Terminator and then bring the view back quickly with a shortcut.
PS: this approach only works if you have a small number of nodes, and even then only for small tasks. The only thing that will scale with the number of nodes and the number and variety of tasks you need to perform is probably a config management solution like Puppet or Salt.
Fabric is another interesting project that may be relevant to your use case.

Pageant keys not working in crontab

I understand the issue but not sure how to fix it :(
Problem Story:
I've installed Pageant on my Windows 10 machine and added SSH keys (generated with PuTTYgen) to it. I configured a PuTTY session in Windows 10 with agent forwarding so I can access the (Linux) servers without typing credentials.
Whenever I open a PuTTY session to log in to any server, PuTTY talks to Pageant and uses my credentials without my having to enter anything.
The keys are deployed to all of our servers, so when I ssh from one server to another, agent forwarding works fine and I can get access with no issue at all, until Pageant goes inactive. I'm happy up to this point.
But when I use ssh in a cron job, it is unable to reach the keys in Pageant (on Windows 10) from Linux.
How can I make this work under cron on Linux?
Of course not: cron does not run in the context of your SSH session, so it cannot talk to your local Pageant.
Even if cron knew which user created the job, how could it know which of your potentially many open SSH sessions it should query for the keys? And what if you have no SSH session open at all? The cron job should work even when you are not connected to the server.
You have to have the keys stored on the server where cron runs. There is no way around that.
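In practice that means generating a key pair on the Linux server itself and pointing the cron job at it. A sketch (paths, user, and hostname are examples):

```shell
# On the Linux server where cron runs:
# generate a key pair with no passphrase, since cron cannot answer prompts.
ssh-keygen -t ed25519 -N "" -f ~/.ssh/cron_key

# Install the public key on each machine the job connects to.
ssh-copy-id -i ~/.ssh/cron_key.pub user@target-server

# Then reference the key explicitly in the crontab entry, e.g.:
#   0 2 * * * ssh -i ~/.ssh/cron_key user@target-server /path/to/job.sh
```

A passphrase-less key on disk is weaker than one guarded by an agent, so it is worth restricting what that key can do on the target (for example with a `command=` restriction in authorized_keys).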

How to enter and leave an existing screen through script?

I'm fairly new to Linux shell scripting but have several years of (non-hardcore) experience using Linux.
There is a console application running on my server (which I access through SSH/PuTTY). To ensure that it keeps running even if I close my SSH client (PuTTY), I run it inside screen.
In my script, there is a part where I want to enter this specific screen session, "leave a message", and then leave the screen so the script can proceed with the other things it has to do.
How to do it exactly? Thanks!
(the console application is actually a vanilla Minecraft server)
Check out the screen window manager (http://www.gnu.org/software/screen/) with named sessions; see the following article:
http://www.mattcutts.com/blog/a-quick-tutorial-on-screen/
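For the specific "leave a message and leave" part, you do not actually have to attach to the session at all: screen can inject keystrokes into a detached session with `-X stuff`. A sketch, assuming the Minecraft server was started in a named session with `screen -S minecraft`:

```shell
# Send a line to window 0 of the detached "minecraft" session.
# "stuff" types the string into the window's input buffer;
# the carriage return from printf presses Enter for us.
screen -S minecraft -p 0 -X stuff "say Server maintenance in 5 minutes$(printf '\r')"
```

The command returns immediately, so the rest of your script continues without ever attaching to or detaching from the session.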

How to connect to random temporary servers with ssh without a password prompt

I would like to generate a few VMs from a script and then, depending on some variables, connect to them and make changes there (download/untar/run something), all automatically from another script.
The VMs are reachable via IP; the question is how to connect to them with ssh without any password request. Security is not an issue.
The best option for me would be if ssh could take the password from some file.
Most of the answers I found use sshpass or expect, but since I want to be sure these scripts can run from anywhere, I don't want to depend on a "non-standard" application.
Any idea?
Thank you!
Create a key pair whose public key will be common to these temp VMs.
Once a VM is alive, copy the public key to it using ssh-copy-id.
There are tutorials online, I'm sure.
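A sketch of that idea; since these are throwaway VMs whose host keys change on every rebuild, it also disables host-key checking, which is acceptable here only because the question states security is not a concern (the IP is a placeholder):

```shell
# One-time: create a passphrase-less key pair for the provisioning scripts.
ssh-keygen -t ed25519 -N "" -f ~/.ssh/tempvm_key

# Per VM: install the public key (ssh-copy-id asks for the password once);
# all later connections are then prompt-free.
ssh-copy-id -i ~/.ssh/tempvm_key.pub root@192.0.2.10

# Temp VMs get fresh host keys constantly, so skip host-key verification
# to avoid "host key changed" prompts breaking the automation.
ssh -i ~/.ssh/tempvm_key \
    -o StrictHostKeyChecking=no -o UserKnownHostsFile=/dev/null \
    root@192.0.2.10 'hostname'
```

If the VM generator lets you inject files at creation time (cloud-init, image templates), writing the public key into the VM's authorized_keys there removes even the one-time ssh-copy-id password prompt.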