How to send three pipes of data over ssh? - linux

I have a bash script on a remote host that produces a large amount of data on fd=3 as well as some possibly interesting data on stdout and stderr. I want to:
Log stdout and stderr to a file on my local machine.
Write the data on fd=3 to stdout on my local machine.
Here's how it could be done if my big script were local:
exec 3> >(cat)                                      # fd=3 feeds a process substitution that copies it to stdout
./big_script.sh -o /dev/fd/3 >big_script.log 2>&1   # stdout and stderr both go to the log file
exec 3>&-                                           # close fd=3 when done
However, I want to run big_script.sh on a remote machine and have all three pipes (fd=1, fd=2, and fd=3) come out of the ssh program as separate streams. What is the best way to do that?

nc (netcat) and tunnels? You can build a kind of "log radio" on your network this way!

SSH opens just a single tty, so you get a single stream that contains all the data. You cannot tell apart what the other side sent to stdout and what it sent to stderr.

You could log to files on the remote host, and then ssh remote tail -f each of the log files from your local machine.
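For example, a minimal sketch of that approach, assuming big_script.sh can write its fd=3 data to the file given with -o; the /tmp paths and the host name "remote" are placeholders:
# Start the script on the remote host, splitting its output into two files there.
ssh remote './big_script.sh -o /tmp/fd3.out >/tmp/big_script.log 2>&1' &
# Back on the local machine: pull the log into a local file and stream the fd=3 data to local stdout.
ssh remote 'tail -f /tmp/big_script.log' >> big_script.log &
ssh remote 'tail -f /tmp/fd3.out'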

Related

SSH, run process and then ignore the output

I have a command that will SSH and run a script after SSH'ing. The script runs a binary file.
Once the script is done, I can type any key and my local terminal goes back to its normal state. However, since the process is still running in the machine I SSH'ed into, any time it logs to stdout I see it in my local terminal.
How can I ignore this output without monkey-patching it on my local machine by piping it to /dev/null? I want to keep the output on the machine I am SSH'ing into, and I want to leave the SSH session altogether once the process starts. Piping it to /dev/null on the remote machine is fine, however.
This is an example of what I'm running:
cat ./sh/script.sh | ssh -i ~/.aws/example.pem ec2-user@11.111.11.111
The contents of script.sh looks something like this:
# Some stuff...
# Run binary file
./bin/binary &
Solved it with ./bin/binary &>/dev/null &
Copy the script to the remote machine and then run it remotely. The following commands are executed on your local machine.
$ scp -i /path/to/sshkey /some/script.sh user@remote_machine:/path/to/some/script.sh
# Run the script in the background on the remote machine and pipe the output to a logfile. This will also exit from the SSH session right away.
$ ssh -i /path/to/sshkey \
user@remote_machine "/path/to/some/script.sh &> /path/to/some/logfile &"
Note, the logfile will be created on the remote machine.
# View the log file while the process is executing
$ ssh -i /path/to/sshkey user@remote_machine "tail -f /path/to/some/logfile"
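If the remote process needs to keep running even after the remote shell that launched it goes away, nohup is a common safeguard; a sketch using the same placeholder paths as above:
# Same as above, but the script is detached from the session with nohup.
$ ssh -i /path/to/sshkey \
user@remote_machine "nohup /path/to/some/script.sh &> /path/to/some/logfile &"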

How to create one logfile when ssh-ing to multiple servers

I'd like to create a bash script that automatically connects to a bunch of servers, executes some commands there, and saves the output of these commands in one logfile on the server I use to connect to all the other servers.
So far I have only managed to create a logfile on each of the servers I connect to, or to display the output of each command on the console of the server I use to reach all the other servers.
My script currently looks like this (I know about for loops, but I don't want to use them in this case because I need to execute different commands on each server):
#!/bin/bash
ssh server1 <<EOF
hostname
printf '\n'
mount
EOF
printf '\n'
printf '\n'
printf '\n'
ssh server2 <<EOF
hostname
printf '\n'
mount
EOF
...
My idea was to use the &>> operator, because I need to know whether all commands were executed successfully or not. In the end I'd like to have only one logfile, which should look somewhat like this:
server1
output of mount
server2
output of mount
...
So, how can I manage to create only one large logfile that contains the results of all executed commands? Also, will this script still work correctly if I make use of the ssh -T option to get rid of the message "Pseudo-terminal will not be allocated because stdin is not a terminal."? And do I have to escape special characters like / _ - when using mount in my script to mount something?
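Roughly, this is the shape I have in mind for the first two servers (the logfile path is just a placeholder):
#!/bin/bash
LOG=/path/to/all_servers.log    # placeholder: the single combined logfile

ssh -T server1 &>> "$LOG" <<EOF
hostname
printf '\n'
mount
EOF

ssh -T server2 &>> "$LOG" <<EOF
hostname
printf '\n'
mount
EOF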
Thanks in advance!
I suggest using open-source utilities like Logstash or Fluentd.
I would use Fabric, which is a tool for interacting with several servers over ssh. It provides operations for executing remote shell commands.
For your example, the fabfile:
from fabric.api import run

def my_task():
    run('hostname')
    run('mount')
And you can execute it:
fab -H server1,server2 my_task
Output goes to the standard output of the machine you run fab on, so you can easily redirect it to a file:
fab -H server1,server2 my_task > my_task.log

Execute script on remote host - output given in local host

I am trying to execute two scripts that are available as .sh files with 755 permissions on a remote host.
I call them from the client host as below:
REMOTE_HOST="host1"
BOUNCE_SCRIPT="
/code/sys/${ENV}/comp/1/${ENV}/scripts/unix/stopScript.sh ${ENV};
/code/sys/${ENV}/comp/1/${ENV}/scripts/unix/startScript.sh ${ENV};
"
ssh ${REMOTE_HOST} "${BOUNCE_SCRIPT}"
The above lines are in a script on the local host.
When I run that script, the first command on the remote host, i.e. stopScript.sh, executes correctly: it kills the running process it was intended to kill without any error.
However, the output of the second script, i.e. startScript.sh, gets printed to the local host's window, but the process it is intended to start does not actually start on the remote host.
Can anyone please let me know:
Is this the correct way of executing scripts on a remote host?
Should I also see the output of a script running on the remote host locally, i.e. in the window of the local host?
Thanks
You could try the -n flag for ssh:
ssh -n $REMOTE_HOST "$BOUNCE_SCRIPT" >> $LOG
The man page has further information (http://unixhelp.ed.ac.uk/CGI/man-cgi?ssh+1). The following is a snippet:
-n      Redirects stdin from /dev/null (actually, prevents reading from stdin).
Prefacing your startScript.sh line with nohup may help. Oftentimes, commands executed remotely will die when your ssh session ends; nohup allows your process to live on after the session has ended. It would be helpful to know whether your process is not starting at all, or whether it starts and then dies.
I think cyber-monk is right: you should launch the processes with nohup to create a new, independent process. Also check whether your stop script is killing the right process (and not the newly started one as well).
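For example, a sketch of the nohup suggestion applied to the snippet above; the remote log path is just a placeholder:
# nohup detaches startScript.sh from the ssh session; /tmp/startScript.out is a placeholder path on the remote host.
BOUNCE_SCRIPT="
/code/sys/${ENV}/comp/1/${ENV}/scripts/unix/stopScript.sh ${ENV};
nohup /code/sys/${ENV}/comp/1/${ENV}/scripts/unix/startScript.sh ${ENV} > /tmp/startScript.out 2>&1 &
"
ssh -n ${REMOTE_HOST} "${BOUNCE_SCRIPT}"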

how to log all coming messages of SSH Secure Shell client?

I am using SSH Secure Shell client, a nice tool to connect to the server.
However, I am wondering whether it is possible to log all of the messages coming from a program that I run via the SSH Secure Shell client. For example: I run ./test and my program prints debug lines. How can I log those debug lines to a txt file for analysis?
Have you tried?
./test > log.txt
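If the debug lines may also go to stderr, or you want to keep watching them while they are being written, something like this should work:
./test 2>&1 | tee log.txt    # capture stdout and stderr, show them on screen, and save them to log.txt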

How do I execute a Perl program on a remote machine?

I wrote a Perl program to capture a live data stream from a tail command on a Linux machine using the following command in the console:
tail -f xyz.log | myperl.pl
It works fine. But now I have to execute this Perl program on a different machine because the log file is on that different machine. Can anyone tell me how I can do it?
You could say
ssh remotemachine tail -f xyz.log | myperl.pl
I suppose, or maybe mount the remote log directories locally onto your administrative machine and do the processing there.
Or you could even say
ssh remotemachine 'tail -f xyz.log | myperl.pl'
in order to run the whole pipeline on the remote machine (if your script produces some output files and you want them on the remote machine). The quotes make sure the pipe is executed by the remote shell rather than locally.
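If myperl.pl only exists on your local machine, one option is to copy it over first; a sketch assuming a writable /tmp on the remote machine:
scp myperl.pl remotemachine:/tmp/myperl.pl
ssh remotemachine 'tail -f xyz.log | perl /tmp/myperl.pl'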
