Is it possible to pipe the output of a command from a server to a local machine? - linux

I have a series of functionally identical servers provided by my school that run various OS and hardware configurations. For the most part, I can use 5 of these interchangeably. Unfortunately, other students tend to bunch up on some machines, and it's a pain to find one that isn't bogged down.
What I want to do is ssh into a machine and run the command:
w | wc -l
to get a rough estimate of the load on that server, and use that information to select the least impacted one. A sort of client-side load balancer.
Is there a way to do this or achieve the same result?

I'd put this in your .bashrc file:
function choose_host(){
    hosts="host1 ... hostn"
    for host in $hosts
    do
        echo "$(ssh "$host" 'w | wc -l') $host"
    done | sort -n | head -1 | awk '{print $2}'
}
function ssh_host(){
    ssh $(choose_host)
}
choose_host should give you the one you're looking for. This is absolutely overkill, but I was feeling playful :D
sort -n will order the output numerically according to the result of w | wc -l, then head -1 takes the first line, and awk just prints the hostname!
You can call ssh_host and it should log you in automatically.

You can use the pdsh command from your desktop, which runs the specified command on the set of machines you specify and returns the results. This way you can find the one that is least loaded, without having to ssh into every single machine and run w | wc -l yourself.
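For example, a minimal sketch (the host names are placeholders; this assumes pdsh is installed and can reach the machines over ssh):
# pdsh prefixes every output line with "host: ", so sort numerically on the count after the colon
pdsh -w host1,host2,host3,host4,host5 'w | wc -l' | sort -t: -k2 -n | head -1
The single surviving line names the least-loaded host.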

Yes. See e.g.:
ssh me@host "ls /etc | sort" | wc -l
The part inside "" is done remotely. The part afterwards is local.
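Applied to the load check in the question (host name assumed), the same split looks like:
# w runs on the remote server; wc -l counts its output lines locally
ssh me@host "w" | wc -l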

How to list the machine name along with the list of running programs in Linux?

I have a shell script to check if Firefox is running on my Linux machine, like:
ps -ef|grep firefox
this will list all the instances of Firefox running on my machine, showing their PIDs, so that I can manually kill them. My question is: is it possible to display the machine name in this list as well? If there are multiple instances, each line should contain the machine name (or IP) too. In my shell script, I did something like:
hostname
ps -ef|grep firefox
which returns the hostname once, with the multiple instances listed below it one by one. How can I print the machine name (or IP) along with each line?
Like this:
ps -ef | egrep '[/ ]firefox' | sed "s/^/$(hostname -s) : /"
This will do it:
ps -ef | grep '[f]irefox' | xargs -I{} echo "$(hostname) {}"
Notice the brackets around 'f' in firefox. This prevents your grep command from showing up in the results.
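To see why the trick works, compare the two (the comments are illustrative):
ps -ef | grep firefox       # matches the grep process itself, since its command line contains "firefox"
ps -ef | grep '[f]irefox'   # no self-match: the regex still matches "firefox", but grep's own command line only contains "[f]irefox"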

referencing stdout in a command that has been piped into

I want to make a simple dmenu command that reads a file of names and commands, displays the names using dmenu, and then takes dmenu's output and runs the associated command by looking it up in the same file.
I got to the point where dmenu displays the names, but I don't really know where to go from there. Learning bash is a really daunting task to me, and I don't really know where to start with this seemingly simple script/command.
Here is the file:
Pushbullet
google-chrome-stable --app=https://www.pushbullet.com
Steam
steam
Chrome
google-chrome-stable
Libre Office
libreoffice
Transmission
transmission-qt
Audio Control Panel
sudo pavucontrol & bluberry
And here is what I have so far for my command:
awk 'NR % 2 != 0' /home/rocco/programlist | dmenu | ??(grep -l "stdout" /home/rocco/programlist....)
My thinking was that I could somehow pipe the name of the application into grep or awk, get its line number, add one, and pipe that line into sh.
Thanks
I have no experience with dmenu, but if I understand how it works correctly, this should do what you want. Wrapping a command in $(…) captures its output as a string, which we can store in a variable and pass on to another command.
#!/bin/bash
plist="/home/rocco/programlist"
# pipe every second line (the names) to dmenu
selected=$(awk 'NR % 2 != 0' "$plist" | dmenu)
# search for the exact selected name, take the command on the line after it
cmd=$(grep -F -x -A1 "$selected" "$plist" | tail -n 1)
# run the command (left unquoted so the command and its arguments are split)
$cmd
It's worth mentioning a mistake in your question: dmenu sends to stdout, or standard output, but the next program in line would be reading stdin, or standard input. In any case, grep doesn't take its search pattern from standard input by default, which is why I've saved it to a variable instead of trying to pipe it somewhere.
Assuming you have programlist.txt in the working directory, you can use:
awk 'NR%2 !=0' programlist.txt |dmenu |awk '{system("grep --no-group-separator -A 1 '"'"'"$0"'"'"' programlist.txt");}' |awk '{if(NR==2){system($0);}}'
Note the quoting of the $0 in the first awk invocation. This is necessary to handle names with spaces in them, like "Libre Office".

How to script bulk nslookups

I have a list of several million domain names and I want to see if they are available or not.
I tried pywhois first but am getting rate limited. As I don't need an authoritative answer, I thought I would just use nslookup. I am having trouble scripting this though.
Basically, what I want to do is: if the domain is registered, echo it. What I'm getting is grep: find”: No such file or directory. I think it's something easy and I've just been looking at this for too long...
#!/bin/bash
START_TIME=$SECONDS
for DOMAIN in `cat ./domains.txt`;
do
if ! nslookup $DOMAIN | grep -v “can’t find”; then
echo $DOMAIN
fi
done
echo ELAPSED_TIME=$(($SECONDS - $START_TIME))
If you have millions to check, you may like to use GNU Parallel to get the job done faster, like this if you want to do, say, 32 lookups at a time in parallel:
parallel -j 32 nslookup < domains.txt | grep "^Name"
If you want to fiddle with the output of nslookup, the easiest way is probably to declare a little function called lkup(), tell GNU Parallel about it, and then use that, like this:
#!/bin/bash
lkup() {
    if ! nslookup "$1" | grep -v "can't find"; then
        echo "$1"
    fi
}
# Make lkup() function visible to GNU parallel
export -f lkup
# Check the domains in parallel
parallel -j 32 lkup < domains.txt
If the order of the lookups is important to you, you can add the -k flag to parallel to keep the order.
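For example:
# -k keeps the output in the same order as domains.txt, even though lookups still run 32 at a time
parallel -k -j 32 lkup < domains.txt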
The error is because you have curly quotes in your script, which are not the proper way to quote command line elements. As a result, they're being treated as part of a filename. Change to:
if ! nslookup $DOMAIN | grep -v "can't find"; then

A script that places the HDD location, e.g. sda, into a variable

What I'm trying to do via the command line is have a script take a newly connected HDD and put its device location, e.g. sda, sdb, sdc, etc., into a variable I can use.
I've tried:
tail -f /var/log/messages | grep GB/
which grabs the lines containing "GB/", which include the device location.
But I cannot manipulate that output using sed or anything equivalent, as I don't know how to exit the command above once it has found the most recent, relevant information, and I also can't get that information into a position where I can manipulate it.
I have tried > and >> to output to a file, but that didn't work; I have also tried putting the above code in brackets and redirecting that, which didn't work either.
It's not clear when you would be doing this, e.g. whether it's just after you "know" that a new HDD has been connected.
What you could do is capture the output of "ls -1d /sys/block/sd*" before, and then again after, and diff them, which would give you the added device.
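A minimal sketch of that idea (assuming bash, and that the new disk shows up under /sys/block/sd* while the script waits):
#!/bin/bash
# snapshot the block devices before the disk is connected
before=$(ls -1d /sys/block/sd* 2>/dev/null)
read -p "Connect the new HDD, then press Enter... "
after=$(ls -1d /sys/block/sd* 2>/dev/null)
# comm -13 prints lines that appear only in the second (sorted) list, i.e. the new device
HDD=$(comm -13 <(echo "$before") <(echo "$after") | xargs -r -n1 basename)
echo "$HDD"   # e.g. sdb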
This works on my machine:
$ HDD=$(dmesg | grep blocks | cut -f3 -d\[ | cut -f1 -d\] | tail -n1)
$ echo $HDD
sdk

shell script to download latest file from FTP

I am writing a shell script for the first time, and I want to download the latest created file from FTP.
I want to download the latest file from a specific folder. Below is my code for that, but it is downloading all the files in the folder, not just the latest one.
ftp -in ftp.abc.com << SCRIPTEND
user xyz xyz
binary
cd Rpts/
mget ls -t -r | tail -n 1
quit
SCRIPTEND
Can you help me with this, please?
Try using the wget or lftp utility instead; it compares file times/dates, and AFAIR its purpose is FTP scripting. Switch to ssh/rsync if possible; you can read a bit about using lftp instead of rsync here:
https://serverfault.com/questions/24622/how-to-use-rsync-over-ftp
Probably the easiest way is to link the latest version on the server side to "current" and always fetch the file it points to. If you're not an admin of the server, you need to list all files with their dates/times, grab that information, parse it, and decide which one is newest; in the meantime the state on the server can change, and you end up with a solution more complicated than it's worth.
The point is that "ls" sorts its output in some way, and time may not be the default. There are switches to sort it, e.g. based on modification time; however, even when the server responds with OK to ls -t, you can't be sure it really supports sorting. It may just ignore all switches and always return the same list. That's why admins usually use a "current" link (ln -s); a sketch of that approach follows the link below. If there's no "current", then to be sure you have the right file you need to parse the list anyway (ls -al).
http://www.catb.org/esr/writings/unix-koans/shell-tools.html
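A minimal sketch of the "current" link approach, reusing the server and credentials from the question (the report filename is hypothetical):
# on the server, inside Rpts/, re-point the link whenever a new report arrives
ln -sf report-2015-06-01.csv current
# on the client, always fetch the same stable name
ftp -in ftp.abc.com << SCRIPTEND
user xyz xyz
binary
cd Rpts/
get current
quit
SCRIPTEND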
Looking at the code, the line
mget ls -t -r | tail -n 1
doesn't do what you think. It actually grabs all of the output of ls -t and then tail processes the output of mget. You could replace this line with
mget $(ls -t -r | tail -n 1)
but I am not sure if ftp will support such a call... Note, too, that since the heredoc delimiter SCRIPTEND is unquoted, the $(…) is expanded by your local shell before ftp ever sees it, so the ls would list your local directory, not the remote one.
Try using an FTP client other than ftp. For example, curlftpfs available at curlftpfs.sourceforge.net is a good candidate as it allows you to mount an FTP to a directory as if it is a local folder and then run different commands on the files there (including find, grep, etc.). Take a look at this article.
This way, since the output comes from a local command, you can be more certain that ls -t returns a properly sorted list.
Btw, it's a bit less convoluted to use ls -t | head -1 than ls -t -r | tail -1. They produce the same result, but why reverse and grab from the tail when you can just grab the head :)
If you use curlftpfs then your script would be something like this (assuming server ftp.abc.com and user xyz with password xyz).
mkdir /tmp/ftpsession
curlftpfs ftp://xyz:xyz@ftp.abc.com /tmp/ftpsession
cd /tmp/ftpsession/Rpts
cp -Rpf $(ls -t | head -1) /your/destination/folder/or/file
cd -
umount /tmp/ftpsession
My solution is this (it assumes the server lists the directory with the newest file last):
curl 'ftp://server.de/dir/'$(curl 'ftp://server.de/dir/' 2>/dev/null | tail -1 | awk '{print $(NF)}')
