Command works fine at the command prompt but not in a shell script - Linux

I have to get the field count from a particular line in a gzipped file.
When I query it at the command prompt in Linux, it gives me output:
gunzip -c file | grep 'good' | awk -F' ' '{print NF}'
When I execute this on the command line it outputs 10, which is correct.
But when I assign it to a variable in a shell script and execute the .sh, it gives me an error:
cat > find.sh
cnt=`gunzip -c file | grep 'good' | awk -F' ' '{print NF}'`
echo $cnt
sh find.sh
find.sh: 2: find sh: 10: not found
Please help me out with this!

Try this:
cat find.sh
#!/bin/bash
cnt=$(gunzip -c file | awk '/good/ {print NF}')
echo $cnt
./find.sh
10
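As a sanity check, the combined pattern-plus-NF approach from the answer can be exercised on its own; the sample input below is invented, standing in for the gzipped file's contents:

```shell
# "f1 f2 good f4" has four whitespace-separated fields,
# so awk matches the line containing "good" and prints its field count
printf 'bad line\nf1 f2 good f4\n' | awk '/good/ {print NF}'
# prints: 4
```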

Related

Cronjob fails when script is located in other folder

I have a bash script that contains a line like this:
readarray users < <(cat /etc/passwd | grep 'aText' | awk -F':' '{print $1}');
When I run it in my home folder, it works. Same if I run it with sudo. Same success when I run it as a crontab job, adding it with sudo crontab -e and defined like this: */15 * * * * /home/my-home/myscript.sh > /home/my-home/myscript.log 2>&1
When I move the script to the folder /opt/my-org/my-app/utils/myscript.sh I can still run it; however, when I update the cronjob with sudo crontab -e to */15 * * * * /opt/my-org/my-app/utils/myscript.sh > /opt/my-org/my-app/utils/myscript.log 2>&1 I get the following error in the log file:
/opt/my-org/my-app/utils/myscript.sh: line 6: syntax error near unexpected token `<'
/opt/my-org/my-app/utils/myscript.sh: line 6: `readarray users < <(cat /etc/passwd | grep 'aText' | awk -F':' '{print $1}');'
I am using RHEL 7.9
Why is this happening?
My script has the #!/bin/bash line, also bash version is 4.3
Also, I noticed this:
$ readarray users < <(cat /etc/passwd | grep 'aText' | awk -F':' '{print $1}');
$ echo $?
0
$ sudo readarray users < <(cat /etc/passwd | grep 'aText' | awk -F':' '{print $1}');
sudo: readarray: command not found
$ echo $?
1
I am expecting the crontab job to run successfully independently of where the script is located.
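One part of this that is easy to verify independently: readarray is a bash builtin, not an external program, which explains the "sudo: readarray: command not found" error above; and < <(...) (process substitution) is bash syntax that POSIX sh does not parse, which matches the syntax error reported for line 6. A minimal check:

```shell
#!/bin/bash
# readarray exists only inside bash, so sudo (which searches $PATH
# for an executable) cannot find anything to run:
type -t readarray    # prints: builtin
```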

echo text to multiple files in bash script

I am working on a bash script that uses pssh to run external commands, then joins the output of the commands with the IP of each server. pssh has an option -o that writes a file for each server into a specified directory, but if the commands do not run you just get an empty file. What I am having issues with is updating these empty files with something like "Server Unreachable", so that I know there was a connection issue reaching the server and it does not cause problems with the rest of the script.
Here is what I have so far:
#!/bin/bash
file="/home/user/tools/test-host"
now=$(date +"%F")
folder="./cnxhwinfo-$now/"
empty="$(find ./cnxhwinfo-$now/ -maxdepth 1 -type f -name '*' -size 0 -printf '%f%2d')"
command="echo \$(uptime | awk -F'( |,|:)+' '{d=h=m=0; if (\$7==\"min\") m=\$6; else {if (\$7~/^day/) {d=\$6;h=\$8;m=\$9} else {h=\$6;m=\$7}}} {print d+0,\"days\",h+0,\"hours\",m+0,\"minutes\"}'), \$(hostname | awk '{print \$1}'), \$(sudo awk -F '=' 'FNR == 2 {print \$2}' /etc/connex-release/version.txt), \$(lscpu | awk -F: 'BEGIN{ORS=\", \";} NR==4 || NR==6 || NR==15 {print \$2}' | sed 's/ *//g') \$(free -k | awk '/Mem:/{print \$2}'), \$(df -Ph | awk '/var_lib/||/root/ {print \$2,\",\"\$5,\",\"}')"
pssh -h $file -l user -t 10 -i -o /home/user/tools/cnxhwinfo-$now -x -tt $command
echo "Server Unreachable" | tee "./cnxhwinfo-$now/$empty"
ls ./cnxhwinfo-$now >> ./cnx-data-$now
cat ./cnxhwinfo-$now/* >> ./cnx-list-$now
paste -d, ./cnx-data-$now ./cnx-list-$now >>./cnx-data-"$(date +"%F").csv"
I was trying to use find to locate the empty files and write "Server" unavailable using tee with this:
echo "Server Unreachable" | tee "./cnxhwinfo-$now/$empty"
if the folder specified doesn't already exist i get this error:
tee: ./cnxhwinfo-2019-09-03/: Is a directory
And if it does exist (ie, i run the script again), it instead creates a file named after the IP addresses returned by the find command, like this:
192.168.1.2 192.168.1.3 192.168.1.4 1
I've also tried:
echo "Server Unreachable" | tee <(./cnxhwinfo-$now/$empty)
The find command outputs the IP addresses on a single line with a space in between each one, so I thought that would be fine for tee to use, but I feel like I am either running into syntax issues, or am going about this the wrong way. I have another version of this same script that uses regular ssh and works great, just much slower than using pssh.
empty should be an array, assuming none of the file names contain whitespace.
readarray -t empty < <(find ...)
echo "Server unreachable" | (cd ./cnxhwinfo-$now/; tee "${empty[@]}" > /dev/null)
Otherwise, you are building a single file name by concatenating the empty file names.
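A minimal, self-contained sketch of that approach (the scratch directory and file names here are invented; bash and GNU find are assumed):

```shell
#!/bin/bash
dir=$(mktemp -d)                         # stand-in for ./cnxhwinfo-$now
touch "$dir/10.0.0.1" "$dir/10.0.0.2"    # two empty per-host result files
# collect the empty file names into an array, one name per line
readarray -t empty < <(find "$dir" -maxdepth 1 -type f -size 0 -printf '%f\n')
# tee writes the same line into every file named in the array
echo "Server Unreachable" | (cd "$dir" && tee "${empty[@]}" > /dev/null)
cat "$dir/10.0.0.1"                      # prints: Server Unreachable
```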

echo $variable in cron not working

I'm having trouble printing the result of the following when run by cron. I have a script named /usr/local/bin/test:
#!/bin/sh
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
ARAW=`date +%y%m%d`
NAME=`hostname`
TODAY=`date '+%D %r'`
cd /directory/bar/foo/
VARR=$(ls -lrt /directory/bar/foo/ | tail -1 | awk {'print $8'} | ls -lrt `xargs` | grep something)
echo "Resolve2 Backup" > /home/user/result.txt
echo " " >> /home/user/result.txt
echo "$VARR" >> /home/user/result.txt
mail -s "Result $TODAY" email@email.com < /home/user/result.txt
I configured it in /etc/cron.d/test to run every day at 1am:
00 1 * * * root /usr/local/bin/test
When I run it manually on the command line:
# /usr/local/bin/test
I get the complete value. But when I let cron do the work, it never displays the part from echo "$VARR" >> /home/user/result.txt
Any ideas?
VARR=$(ls -lrt /directory/bar/foo/ | tail -1 | awk {'print $8'} | ls -lrt `xargs` | grep something)
ls -ltr /path/to/dir will not include the directory in the filename part of the output. Then you call ls again with this output, and that second ls looks in your current directory, not in /path/to/dir.
In cron, your current directory is likely to be /, and in your manual testing, I bet your current directory is /path/to/dir
Here's another approach to finding the newest file in a directory that emits the full path name:
stat -c '%Y %n' /path/to/dir/* | sort -nr | head -1 | cut -d" " -f 2-
Requires GNU stat, check your man page for the correct invocation for your system.
I think your VARR invocation can be:
latest_dir=$(stat -c '%Y %n' /path/to/dir/* | sort -nr | head -1 | cut -d" " -f 2-)
interesting_files=$(ls -ltr "$latest_dir"/*something*)
Then, no need for a temp file:
{
echo "Resolve2 Backup"
echo
echo "$interesting_files"
} |
mail -s "Result $TODAY" email@email.com
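As a quick illustration of that stat pipeline (scratch directory invented here; GNU stat and GNU touch assumed, per the caveat above):

```shell
#!/bin/bash
dir=$(mktemp -d)
touch -d '2020-01-01' "$dir/older"
touch -d '2021-01-01' "$dir/newer"
# prefix each name with its mtime, sort newest first, strip the timestamp
stat -c '%Y %n' "$dir"/* | sort -nr | head -1 | cut -d" " -f 2-
# prints the full path of "newer"
```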
Thanks for all your tips and responses. I solved my problem. The problem was the output of $8 and $9 in cron; I don't know exactly which fields were being read while it ran under cron. (The environment and locale under cron can differ from an interactive shell, so the date columns in ls -l output may shift, changing what $8 and $9 refer to.) I'm just a newbie at scripting, so sorry for my bad script =)

cat file_name | grep "something" results in "cat: grep: No such file or directory" in shell scripting

I have written a shell script which reads commands from an input file and executes them. I have a command like:
cat linux_unit_test_commands | grep "dmesg"
in the input file. I am getting the below error messages while executing the shell script:
cat: |: No such file or directory
cat: grep: No such file or directory
cat: "dmesg": No such file or directory
Script:
#!/bin/bash
while read line
do
output=`$line`
echo $output >> logs
done < $1
Below is the input file (example_commands):
ls
date
cat linux_unit_test_commands | grep "dmesg"
Execute: ./linux_unit_tests.sh example_commands
Please help me to resolve this issue.
Special characters like | and " are not parsed after expanding variables; the only processing done after variable expansion is word splitting and wildcard expansion. If you want the line to be parsed fully, you need to use eval:
while read line
do
output=`eval "$line"`
echo "$output" >> logs
done < $1
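The difference is easy to see with a throwaway command (the sample line here is invented):

```shell
line='echo hello | tr a-z A-Z'
$line          # the pipe is just another argument word after word splitting:
               # prints: hello | tr a-z A-Z
eval "$line"   # the line is re-parsed, so the pipe really pipes:
               # prints: HELLO
```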
You might be wondering why it's not working with the cat command. Here is the answer to your question:
output=`$line` i.e. output=`cat linux_unit_test_commands | grep "dmesg"`
Here the cat command takes (linux_unit_test_commands | grep "dmesg") all as arguments, i.e. file names.
From the man page:
SYNOPSIS: cat [OPTION]... [FILE]...
Script is OK!
#!/bin/bash
while read line;
do
output=`$line`
echo $output >> logs
done < $1
To make it work you need to change the command in the input file from cat linux_unit_test_commands | grep "dmesg" to grep "dmesg" linux_unit_test_commands. It will work!
cat linux_unit_test_commands
ls
date
grep "dmesg" linux_unit_test_commands

command not working as expected if run via /bin/sh -c

I have to concatenate a set of files. The directory structure is like this:
root/features/xxx/multiple_files... -> root/xxx/single_file
What I have written (and it works fine):
for dirname in $(ls -d root/features/*|awk -F/ '{print $NF}');do;mkdir root/${dirname};cat root/features/${dirname}/* > root/${dirname}/final.txt;done
But when I run the same thing via the sh shell:
/bin/sh -c "for dirname in $(ls -d root/features/*|awk -F/ '{print $NF}');do;mkdir root/${dirname};cat root/features/${dirname}/* > root/${dirname}/final.txt;done"
it gives me errors:
/bin/sh: -c: line 1: syntax error near unexpected token `201201000'
/bin/sh: -c: line 1: `201201000'
My process always prepends /bin/sh -c before running any commands. Any suggestions as to what might be going wrong here? Any alternate ways? I have spent a really long time on this without making much headway!
EDIT:
ls -d root/features/* | awk -F/ '{print $NF}' returns:
201201
201201000
201201001
201201002
201201003
201201004
201201005
201201006
201201007
201202000
201205000
201206000
201207000
201207001
201207002
Always use sh -c 'cmd1 | cmd2' with single quotes.
Always use sh -eu -xv -c 'cmd1 | cmd2' to debug.
Always use bash -c 'cmd1 | cmd2' if your code is Bash-specific (cf. process substitution, ...).
Remove ; after do in for ... ; do; mkdir ....
Escape possible single quotes within single quotes like so: ' --> '\''.
(And sometimes just formatting your code clarifies a lot.)
Applied to your command this should look somewhat like this ...
# test version
/bin/sh -c '
for dirname in $(ls -d /* | awk -F/ '\''{print $NF}'\''); do
printf "%s\n" "mkdir root/${dirname}";
printf "%s\n" "cat root/features/${dirname}/* > root/${dirname}/final.txt";
echo
done
' | nl
# test version using 'printf' instead of 'ls'
sh -c '
printf "%s\000" /*/ | while IFS="" read -r -d "" file; do
dirname="$(basename "$file")"
printf "%s\n" "mkdir root/${dirname}";
printf "%s\n" "cat root/features/${dirname}/* > root/${dirname}/final.txt";
echo
done
' | nl
I got this to run in the little test environment I set up on my box. It turns out it didn't like the double quotes. The issue I ran into was the quoting around the awk statement: if you wrap the whole command in double quotes, the outer shell expands $NF to nothing and awk prints the whole line. I used cut to get the desired result, but my guess is you'll have to change the -f arg to 3 instead of 2, I think.
/bin/sh -c 'for dirname in $(ls -d sh_test/* | awk -F/ '\''{print $NF}'\''); do mkdir sh_test_root/${dirname}; cat sh_test/${dirname}/* > sh_test_root/${dirname}/final.txt;done'
Edit: Tested the edit proposed by nadu and it works fine. The above reflects that change.
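The single-versus-double-quote rule above is the heart of it; a small sketch (variable name invented) shows who does the expanding:

```shell
greeting='expanded by the outer shell'
sh -c "echo $greeting"   # double quotes: $greeting is substituted by the
                         # calling shell BEFORE sh -c ever runs
sh -c 'echo $greeting'   # single quotes: the inner sh expands it, and since
                         # greeting was never exported, this prints an empty line
```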
