OR condition in shell scripting - Unix/Linux

I declare three variables.
$1=`ssh <server_1> cat /etc/passswd|cut -f -d:|grep -e $IID -e $EID`
$2=`ssh <server_2> cat /etc/shadow|cut -f -d:|grep -e $IID -e $EID`
$3=`ssh <server_3> cat /etc/passwd|cut -f -d:|grep -i $CID`
The above three variables are created by sshing to the servers and checking for the presence of the IDs which I give as input. If the ID doesn't exist already, the variable is going to be null.
Now, how do I verify whether all three variables are null? I wanted to use an OR condition within an if.
I tried,
if [ -s "$1" -o -s "$2" -o -s "$3"];then
echo -$1 $2 $3 "already exist(s)"
It didn't work. Please advise.
PS: I have just begun my career in Unix, so correct me if I am wrong anywhere.

Several points.
When you assign to a variable, don't use the dollar sign:
foo=xxx
Variables $1, $2, etc. are already used for your command-line arguments. Pick other names. But not $4, please. :-)
When you specify a command for ssh, and it has arguments, it has to be quoted, because the command needs to be a single argument for ssh. In your case use double quotes, as you want variable expansion for $IID etc.
Most Unix utils are able to open input files themselves, so you don't need to start your pipeline with cat.
foo=`ssh <server_1> "cut -f -d: /etc/passwd | grep -e $IID -e $EID"`
Or something like that.
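Putting those points together, a sketch of what the check could look like (the server names, the cut field number, and the variable names p1/p2/p3 are assumptions, not taken from the original script):
p1=$(ssh server_1 "cut -f1 -d: /etc/passwd | grep -e $IID -e $EID")
p2=$(ssh server_2 "cut -f1 -d: /etc/shadow | grep -e $IID -e $EID")
p3=$(ssh server_3 "cut -f1 -d: /etc/passwd | grep -i $CID")

# -n tests for a non-empty string; chaining [ ... ] || [ ... ] gives the OR
if [ -n "$p1" ] || [ -n "$p2" ] || [ -n "$p3" ]; then
    echo "$p1 $p2 $p3 already exist(s)"
fi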

It was a typo in my question. I had actually declared it as,
1=`ssh <server_1> cat /etc/passswd|cut -f -d:|grep -e $IID -e $EID`
2=`ssh <server_2> cat /etc/shadow|cut -f -d:|grep -e $IID -e $EID` and so on.
And I tried it as:
if [ -s "$1" -o -s "$2" -o -s "$3"];then
echo -e $1 $2 $3 "already exist(s)"
Since I had to deliver my script today, I used the conventional method of:
ssh <server_1> "cat /etc/passswd|cut -f -d:|grep -e $IID -e $EID" > file1
ssh <server_2> "cat /etc/shadow|cut -f -d:|grep -e $IID -e $EID" > file2
ssh <server_3> "cat /etc/passwd|cut -f -d:|grep -ix $CID" > file3
if [ -s file1 -o -s file2 -o -s file3 ]; then
for i in `cat file1 file2 file3`
do
echo $i "already exists"
done
else
And I have now learnt from my first post that -s checks that a file is not empty and -z checks that a string is empty.
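A quick illustration of that distinction (the file and variable names here are made up):
touch empty.txt; echo data > full.txt
[ -s full.txt ]  && echo "full.txt exists and is not empty"   # -s is a FILE test
[ -s empty.txt ] || echo "empty.txt has no content"

var=""
[ -z "$var" ]  && echo "var is an empty string"               # -z is a STRING test
[ -n "data" ]  && echo "-n is the opposite: non-empty string"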

Related

Using ssh inside a script to run another script that itself calls ssh

I'm trying to write a script that builds a list of nodes, then sshes into the first node of that list
and runs a checknodes.sh script, which itself is just a for loop that calls checknode.sh.
The first two lines seem to work OK (the list builds successfully), but then I either get just the echo line of checknodes.sh printed out, or an error saying cat: gpcnodes.txt: No such file or directory.
MYSCRIPT.sh:
#gets the master node for the job
MASTERNODE=`qstat -t -u \* | grep $1 | awk '{print$8}' | cut -d'#' -f 2 | cut -d'.' -f 1 | sed -e 's/$/.com/' | head -n 1`
#builds list of nodes in job
ssh -qt $MASTERNODE "qstat -t -u \* | grep $1 | awk '{print$8}' | cut -d'#' -f 2 | cut -d'.' -f 1 | sed -e 's/$/.com/' > /users/issues/slow_job_starts/gpcnodes.txt"
ssh -qt $MASTERNODE cd /users/issues/slow_job_starts/
ssh -qt $MASTERNODE /users/issues/slow_job_starts/checknodes.sh
checknodes.sh
for i in `cat gpcnodes.txt `
do
echo "### $i ###"
ssh -qt $i /users/issues/slow_job_starts/checknode.sh
done
checknode.sh
str=`hostname`
cd /tmp
time perf record qhost >/dev/null 2>&1 | sed -e 's/^/${str}/'
perf report --pretty=raw | grep % | head -20 | grep -c kernel.kallsyms | sed -e "s/^/`hostname`:/"
When ssh -qt $MASTERNODE cd /users/issues/slow_job_starts/ is finished, the changed directory is lost.
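One way around that (a sketch, reusing the paths from the question) is to run the cd and the script in the same remote shell:
# chain the commands so the directory change is still in effect when the script runs
ssh -qt $MASTERNODE 'cd /users/issues/slow_job_starts && ./checknodes.sh'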
With the backquotes replaced by $(..) (not an error here, but get used to it), the script would be something like
for i in $(cat /users/issues/slow_job_starts/gpcnodes.txt)
do
echo "### $i ###"
ssh -nqt $i /users/issues/slow_job_starts/checknode.sh
done
or better
while read -r i; do
echo "### $i ###"
ssh -nqt $i /users/issues/slow_job_starts/checknode.sh
done < /users/issues/slow_job_starts/gpcnodes.txt
Perhaps you would also like to change your last script (start it with cd /users/issues/slow_job_starts).
You will find more problems, like sed -e 's/^/${str}/' (the ${str} inside single quotes won't be expanded to the hostname), but this should get you started.
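For example, just for that quoting point (qhost here stands in for the real pipeline in checknode.sh):
str=$(hostname)
qhost | sed -e "s/^/${str}: /"   # double quotes: ${str} expands to the hostname
qhost | sed -e 's/^/${str}: /'   # single quotes: the literal text ${str} is prepended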
EDIT:
I added the -n option to the ssh call.
It redirects stdin from /dev/null (in effect, preventing ssh from consuming the loop's stdin).
Without this option only one node is checked.

Running variable string match against grep search?

I've defined the variables here to shorten the logic a little. The wget works fine (it downloads the correct file), and grepping for tar.gz in wget.log works.
The issue is the match against another file!
Basically, if it's on a blacklist, I want it to skip!
var1=https://somewebsite.com/directory
line1=directory
sudo wget -O wget.log https://somewebsite.com/$line1/releases
if grep -q "tar.gz" wget.log | "$var1" -ne grep -q
"https://somewebsite.com/$line1" banned; then
echo "Good Job!"
else
echo "Skip!"
fi
Use && to test whether both of the grep commands succeed:
if grep -q -F 'tar.gz' wget.log && grep -q -F -x "$var1" banned
then
echo "Skip!"
else
echo "Good Job!"
fi
I've used the -F option to grep because none of the strings we're searching for are regular expressions; they're fixed strings. And I used -x in the second grep so that it matches the whole line in the blacklist.
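As a rough usage sketch (the file contents here are invented for illustration):
var1=https://somewebsite.com/directory
printf '%s\n' "$var1" > banned              # blacklist: one full URL per line
echo 'found release-1.2.tar.gz' > wget.log  # pretend wget output mentioning a tarball

if grep -q -F 'tar.gz' wget.log && grep -q -F -x "$var1" banned; then
    echo "Skip!"          # a tarball was found AND the URL is on the blacklist
else
    echo "Good Job!"
fi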

how to use sed to replace a whole line with a string stored in a variable?

#!/bin/bash
ssh $1 ssh-keyscan -t rsa $1 > /tmp/$1
RSA=$(cat /tmp/$1)
echo $RSA
sed -i 's:'^"$1".*':'"$RSA"':' /etc/ssh/ssh_known_hosts
cat /etc/ssh/ssh_known_hosts | grep $1
It is storing the key in RSA but not replacing the line; I'm not sure what's wrong with the sed part.
You can use the following command.
#!/bin/bash
ssh $1 ssh-keyscan -t rsa $1 > /tmp/$1
RSA=$(cat /tmp/$1)
echo $RSA
sed -i -e "/$1/ d" -e "/^$1/ a $RSA" /etc/ssh/ssh_known_hosts
cat /etc/ssh/ssh_known_hosts | grep $1
I modified it as per your requirement: it adds the line if it doesn't exist and replaces it if it does.
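An alternative that spells the two cases out explicitly might look like this (a sketch only; it keeps the $1 hostname argument and the known_hosts path from the question):
#!/bin/bash
host=$1
RSA=$(ssh "$host" ssh-keyscan -t rsa "$host" 2>/dev/null)   # same key fetch as in the question

if grep -q "^$host" /etc/ssh/ssh_known_hosts; then
    # an entry exists: replace the whole line (| as delimiter avoids clashing with / in the key)
    sed -i "s|^$host.*|$RSA|" /etc/ssh/ssh_known_hosts
else
    # no entry yet: append one
    echo "$RSA" >> /etc/ssh/ssh_known_hosts
fi
grep "$host" /etc/ssh/ssh_known_hosts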

command not working as expected if run via /bin/sh -c

I have to concatenate a set of files. Directory structure is like this:
root/features/xxx/multiple_files... -> root/xxx/single_file
What I have written (and it works fine):
for dirname in $(ls -d root/features/*|awk -F/ '{print $NF}');do;mkdir root/${dirname};cat root/features/${dirname}/* > root/${dirname}/final.txt;done
But when I run the same thing via the sh shell
/bin/sh -c "for dirname in $(ls -d root/features/*|awk -F/ '{print $NF}');do;mkdir root/${dirname};cat root/features/${dirname}/* > root/${dirname}/final.txt;done"
it gives me errors:
/bin/sh: -c: line 1: syntax error near unexpected token `201201000'
/bin/sh: -c: line 1: `201201000'
My process always prefixes commands with /bin/sh -c before running them. Any suggestions as to what might be going wrong here? Any alternate ways? I have spent a really long time on this, without making much headway!
EDIT:
ls -d root/features/* | awk -F/ '{print $NF}' returns:
201201
201201000
201201001
201201002
201201003
201201004
201201005
201201006
201201007
201202000
201205000
201206000
201207000
201207001
201207002
Always use sh -c 'cmd1 | cmd2' with single quotes.
Always use sh -eu -xv -c 'cmd1 | cmd2' to debug.
Always use bash -c 'cmd1 | cmd2' if your code is Bash-specific (cf. process substitution, ...).
Remove ; after do in for ... ; do; mkdir ....
Escape possible single quotes within single quotes like so: ' --> '\''.
(And sometimes just formatting your code clarifies a lot.)
Applied to your command, this should look somewhat like this:
# test version
/bin/sh -c '
for dirname in $(ls -d /* | awk -F/ '\''{print $NF}'\''); do
printf "%s\n" "mkdir root/${dirname}";
printf "%s\n" "cat root/features/${dirname}/* > root/${dirname}/final.txt";
echo
done
' | nl
# test version using 'printf' instead of 'ls'
sh -c '
printf "%s\000" /*/ | while IFS="" read -r -d "" file; do
dirname="$(basename "$file")"
printf "%s\n" "mkdir root/${dirname}";
printf "%s\n" "cat root/features/${dirname}/* > root/${dirname}/final.txt";
echo
done
' | nl
I got this to run in the little test environment I set up on my box. It turns out it didn't like the double quotes. The issue I ran into was the quoting around the awk statement: if you wrap it in double quotes, it prints the whole thing. I used cut to get the desired result, but my guess is you'll have to change the -f argument to 3 instead of 2, I think.
/bin/sh -c 'for dirname in $(ls -d sh_test/* | awk -F/ '\''{print $NF}'\''); do mkdir sh_test_root/${dirname}; cat sh_test/${dirname}/* > sh_test_root/${dirname}/final.txt;done'
Edit: I tested the edit proposed by nadu and it works fine. The above reflects that change.
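To see why the double quotes bite here, a small demo (the paths are only illustrative): with double quotes, the calling shell expands $(...) and ${dirname} before /bin/sh -c ever runs, so the already-expanded directory names are pasted into the command text, which is where the syntax error near 201201000 comes from.
/bin/sh -c "echo $(ls /tmp)"   # $(ls /tmp) is expanded by the OUTER shell, before sh -c runs
/bin/sh -c 'echo $(ls /tmp)'   # the literal command is passed through; sh -c expands it itself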

Find and highlight text in linux command line

I am looking for a Linux command that searches for a string in a text file and highlights (colors) every occurrence of it in the file, WITHOUT omitting text lines (like grep does).
I wrote this handy little script. It could probably be expanded to handle args better.
#!/bin/bash
if [ "$1" == "" ]; then
echo "Usage: hl PATTERN [FILE]..."
elif [ "$2" == "" ]; then
grep -E --color "$1|$" /dev/stdin
else
grep -E --color "$1|$" $2
fi
It's useful for stuff like highlighting users running processes:
ps -ef | hl "alice|bob"
Try
tail -f yourfile.log | egrep --color 'DEBUG|'
where DEBUG is the text you want to highlight.
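The trick in both answers above is that the pattern is alternated with something that matches every line ($ or the empty pattern), so no lines are filtered out but only the real pattern gets colored. A tiny demo (the file name is made up):
printf 'one\ntwo\nthree\n' > demo.txt
grep -E --color 'two|$' demo.txt   # prints all three lines, highlighting only "two"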
command | grep -iz -e "keyword1" -e "keyword2" (drop the second -e if you're only searching for a single word; -i ignores case, -z treats the whole input as one line so nothing is filtered out)
Alternatively, when reading from files:
grep -iz -e "keyword1" -e "keyword2" 'filename'
OR
command | grep -A 99999 -B 99999 -i -e "keyword1" -e "keyword2" (drop the second -e for a single word; -i ignores case, -A and -B set the number of lines shown after/before the keyword)
Alternatively, when reading from files:
grep -A 99999 -B 99999 -i -e "keyword1" -e "keyword2" 'filename'
The ack command with the --passthru switch:
ack --passthru pattern path/to/file
I take it you meant "without omitting text lines" (instead of emitting)...
I know of no such command, but you can use a script such as this (this one is a simple solution that takes the filename (without spaces) as the first argument and the search string (also without spaces) as the second):
#!/usr/bin/env bash
ifs_store=$IFS;
IFS=$'\n';
for line in $(cat $1);
do if [ $(echo $line | grep -c $2) -eq 0 ]; then
echo $line;
else
echo $line | grep --color=always $2;
fi
done
IFS=$ifs_store
Save it as, for instance, colorcat.sh, set permissions appropriately (so you can execute it) and call it as:
colorcat.sh filename searchstring
I had a requirement like this recently and hacked up a small program to do exactly this. Link
Usage: ./highlight test.txt '^foo' 'bar$'
Note that this is very rough, but could be made into a general tool with some polishing.
Using dwdiff, output differences with colors and line numbers.
echo "Hello world # $(date)" > file1.txt
echo "Hello world # $(date)" > file2.txt
dwdiff -c -C 0 -L file1.txt file2.txt
