How do I execute a Perl program on a remote machine?

I wrote a Perl program to capture a live data stream from a tail command on a Linux machine using the following command in the console:
tail -f xyz.log | myperl.pl
It works fine. But now I have to execute this Perl program against a different machine, because the log file lives on that machine. Can anyone tell me how I can do it?

You could say
ssh remotemachine tail -f xyz.log | myperl.pl
Here tail runs on the remote machine, but the pipe is interpreted by your local shell, so myperl.pl runs locally. Alternatively, you could mount the remote log directories locally onto your administrative machine and do the processing there, as sketched below.
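One way to do that mount is sshfs; a minimal sketch, assuming sshfs is installed and that /var/log is where xyz.log lives (an illustrative path):
# Mount the remote log directory locally over ssh
mkdir -p ~/remotelogs
sshfs remotemachine:/var/log ~/remotelogs
# The original pipeline then works unchanged on the local machine
tail -f ~/remotelogs/xyz.log | myperl.pl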

Or you could even say
ssh remotemachine 'tail -f xyz.log | myperl.pl'
Note the single quotes: they make the whole pipeline run on the remote machine, which is useful if your script produces output files and you want them to stay on the remote machine. Without them the pipe would again be interpreted locally. This does require myperl.pl to exist on the remote machine; a sketch of staging it follows.
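A minimal sketch of staging the script first, assuming it can live in the remote home directory:
# Copy the script to the remote home directory and make it executable
scp myperl.pl remotemachine:
ssh remotemachine chmod +x myperl.pl
# Run the whole pipeline remotely; the single quotes keep the pipe remote
ssh remotemachine 'tail -f xyz.log | ./myperl.pl'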

Related

Linux shell script to download file(s) from server to PC while connected with putty

I am connected to a server through putty, and I want to download (to my PC) certain files on a regular basis using a shell script. Specifically, these are the files listed by:
ls -t ~/backup | head -n2
What is the best strategy for this? I was trying command-line FTP, but I am prompted to log in to something. I'm already logged into the server that has the files I need to download, so I am missing something.
The SSH protocol can be a good way, with the scp command. You can take a look at this thread.
To automate the process in a script, you will need passwordless ssh, i.e. ssh keys.
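A minimal sketch of that key setup, assuming OpenSSH (the username@host placeholder matches the commands below):
# Generate a key pair locally; an empty passphrase allows unattended use
ssh-keygen -t ed25519
# Install the public key on the server so ssh/scp stop prompting for a password
ssh-copy-id username@host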
The first step is to get the list of files to copy:
fils=$(ssh username@host 'ls -t ~/backup | head -n2')
The quotes make both ls and head run on the remote side, so only the two file names travel over the connection.
Then, once we have the file names in the variable fils, we can loop over the entries and run a secure copy for each:
while read fyle
do
    # scp paths are relative to the remote home directory, hence backup/
    scp username@host:"backup/$fyle" "$fyle"
done <<< "$fils"
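Since the downloads should happen on a regular basis, the loop can be saved as a script and scheduled from cron; a sketch, with /home/me/fetch-backups.sh as a purely illustrative path:
# crontab entry: run the fetch script every day at 02:30
30 2 * * * /home/me/fetch-backups.sh >> /home/me/fetch-backups.log 2>&1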

SSH, run process and then ignore the output

I have a command that will SSH and run a script after SSH'ing. The script runs a binary file.
Once the script is done, I can type any key and my local terminal goes back to its normal state. However, since the process is still running in the machine I SSH'ed into, any time it logs to stdout I see it in my local terminal.
How can I ignore this output without monkey patching it on my local machine by passing it to /dev/null? I want to keep the output inside the machine I am SSH'ing to and I want to just leave the SSH altogether after the process starts. I can pass it to /dev/null in the machine, however.
This is an example of what I'm running:
cat ./sh/script.sh | ssh -i ~/.aws/example.pem ec2-user@11.111.11.111
The contents of script.sh looks something like this:
# Some stuff...
# Run binary file
./bin/binary &
Solved it with ./bin/binary &>/dev/null &, which discards both stdout and stderr on the remote side before backgrounding.
Copy the script to the remote machine and then run it remotely. The following commands are executed on your local machine.
$ scp -i /path/to/sshkey /some/script.sh user@remote_machine:/path/to/some/script.sh
# Run the script in the background on the remote machine and send its output to a logfile. This also exits the SSH session right away.
$ ssh -i /path/to/sshkey \
    user@remote_machine "/path/to/some/script.sh &> /path/to/some/logfile &"
Note, the logfile will be created on the remote machine.
# View the log file while the process is executing
$ ssh -i /path/to/sshkey user@remote_machine "tail -f /path/to/some/logfile"
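If the process must also survive the remote shell exiting in every case, a common variant is to add nohup and an explicit stdin redirect; a sketch under the same assumed paths:
$ ssh -i /path/to/sshkey \
    user@remote_machine "nohup /path/to/some/script.sh &> /path/to/some/logfile < /dev/null &"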

Collecting system data from Red Hat Linux

I am planning to write a small program. The input should be the IP, username, and password of a Linux machine, and it should give me the system details of that machine as output.
I am planning to write this in shell, using RSH for the login. I am in no way asking for a solution, but could you please point me towards other options that I have? I am not really comfortable with shell scripts.
Thanks in advance
I have the same need, and what I do is:
First, write a script which will be executed on the target host. Something like this:
> cat check_server.sh
#!/usr/bin/env bash
# Executed on the target host: append each command and its output to a report
all_cmd=(
    "uname -a"
    "lscpu"
    "free -m"
)
function _check {
    for one_cmd in "${all_cmd[@]}"; do
        echo -e "\n\n$one_cmd" >> /tmp/server_info.txt
        eval "$one_cmd" >> /tmp/server_info.txt
    done
}
_check
Then execute it on the target and copy back the result, like this:
_cmd=$(base64 -w0 check_server.sh)
ssh $user@$ip "echo $_cmd | base64 -d | bash"
scp $user@$ip:/tmp/server_info.txt ./
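The base64 round-trip is just there to dodge quoting problems when shipping a multi-line script over ssh. If that is not a concern, feeding the script to a remote shell on stdin works too; a sketch with the same $user/$ip placeholders:
# Run the local script on the remote host via stdin, then fetch the report
ssh $user@$ip 'bash -s' < check_server.sh
scp $user@$ip:/tmp/server_info.txt ./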

How to run a script to remote machine

I have a local machine and a remote machine.
I have a script which does some simple operations (basically it executes 50-odd Linux CLI commands).
This script is present locally on my machine and NOT on the remote machine.
Is there a neat way to run this local script on the remote machine?
As answered here:
https://unix.stackexchange.com/questions/87405/how-can-i-execute-local-script-on-remote-machine-and-include-arguments
you can just do
ssh auser@host "bash -s" < ./myscript.sh
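If the script takes arguments, they can be handed to the remote bash as positional parameters; a sketch (arg1 and arg2 are illustrative):
# -- keeps the arguments from being parsed as bash options
ssh auser@host 'bash -s -- arg1 arg2' < ./myscript.sh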
Use a combination of scp and ssh e.g.
scp myscript.sh auser@host:/home/auser/path \
  && ssh auser@host '/home/auser/path/myscript.sh'
Without copying you can execute it directly:
cat myscript.sh | xargs -0 ssh auser@host
Or, if you don't like pipes:
xargs -a myscript.sh -0 ssh auser@host
With -0 and no NUL bytes in the file, xargs hands the entire script text to ssh as one remote command, so this only suits simple scripts that fit on a single command line.

using a variable in a BASH command?

I have 20 machines, each running a process. The machines are named:
["machine1", "machine2", ...., "machine20"]
To inspect how the process is doing on machine1, I issue the following command from a remote machine:
ssh machine1 cat log.txt
For machine2, I issue the following command:
ssh machine2 cat log.txt
Similarly, for machine20, I issue the following command:
ssh machine20 cat log.txt
Is there a bash command that will allow me to view the output from all machines using one command?
If the machines are nicely numbered like in your example:
for i in {1..20} ; do ssh machine$i cat log.txt; done
If you have the list of machines in a file, you can use:
cat machinesList.txt | xargs -I {} ssh -n {} cat log.txt
(-I {} is the modern spelling of the deprecated -i; -n is a safety net so ssh never swallows the input list.)
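With GNU xargs the same idea can fan out in parallel; a sketch, assuming the same machinesList.txt:
# -a reads the list from a file, -P 20 queries up to 20 machines at once
xargs -a machinesList.txt -P 20 -I {} ssh -n {} cat log.txt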
You could store all your machine names in an array or text file, and loop through it.
declare -a machineList=('host1' 'host2' 'otherHost') # and more...
for machine in "${machineList[@]}"
do
    ssh "$machine" cat log.txt
done
I assume your machines aren't literally named 'machine1', 'machine2', etc.
Some links:
bash Array Tutorial
GNU Bash Array Documentation
Use a loop?
for i in {1..20}
do
ssh machine$i cat log.txt
done
But note that cat runs in a remote shell session, not your current one, so a relative path like log.txt resolves against the remote user's home directory. Try it and see.
Put your hosts in a file and use a while loop as shown below. Note the -n flag on ssh: it stops ssh from reading stdin, which would otherwise consume the rest of hosts-file:
while read host; do ssh -n "$host" cat log.txt; done < hosts-file
Alternatively you can use PSSH:
pssh -h hosts-file -i "cat log.txt"
I would recommend a program called Shmux. Despite the name, it works really well. I've used it with more than 100 machines with good results. It also gracefully handles machine failures for you, which a plain bash for loop does not.
I think the coolest thing about this program is its ability to run your command in multiple threads, so all 20 machines execute it in parallel; a sketch of an invocation follows.
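A hedged sketch of such an invocation, assuming shmux's -c flag takes the command string (check shmux(1) for your version):
# Run the same command on all listed machines in parallel
shmux -c 'cat log.txt' machine1 machine2 machine3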
Aside from the suggestions to use a loop, you might want to take a look at tools like pssh or dsh, which are designed for running commands on multiple clients.
