Bash exercise: passing multiple inputs to a script from another script - linux

I've come across another exercise in preparation for the exam that I always find tricky because of the input/output redirection.
It asks:
Write a first script named "contaseparatamente.sh" that takes a variable number of arguments, each of which is the name of a file.
The script must write to standard output the total number of rows of the even-numbered arguments, and to standard error the total number of rows of the odd-numbered arguments.
(I have done it like this, and it works):
#!/bin/bash
NUMEVEN=0
NUMODD=0
for ((i=1; i<=$#; i++)); do
    if ((i % 2 == 0)); then
        NUMEVEN=$(( NUMEVEN + $(wc -l < "${!i}") ))
    else
        NUMODD=$(( NUMODD + $(wc -l < "${!i}") ))
    fi
done
echo rows of even ${NUMEVEN}
echo rows of odd ${NUMODD} 1>&2
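The `${!i}` in the loop above is bash's indirect expansion: it expands to the value of the positional parameter whose number is stored in `i`. A minimal sketch of just that mechanism:

```shell
#!/bin/bash
# ${!i} expands to the positional parameter whose index is stored in i.
set -- alpha beta gamma   # fake positional parameters for the demo
i=2
echo "${!i}"              # prints: beta
```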
Then it asks: write a second script that launches and executes the first, giving it as arguments the first 7 lines of the output of ls -S1 /usr/include/*.h. In the end, this second script must also show the output of the first script on standard error.
This is my try:
#!/bin/bash
./contaseparatamente.sh <( ls -S1 /usr/include/*.h | head -n 7 ) 2<&1
but this way the result is 0 rows from the even arguments and 7 from the odd, which is not possible.

I don't like the assignment, but...
To pass the arguments in the simplest way, use an unquoted command substitution. (ugh) Your version fails because <( ... ) is process substitution: it expands to a single filename, so the script sees just one (odd-numbered) argument whose file contains all 7 lines.
#!/bin/bash
./contaseparatamente.sh $( ls -S1 /usr/include/*.h | head -n 7 )
The stderr of the first script will bleed through and show when you run the second script even if you do nothing at all. To put its stdout on standard error as well, as the assignment asks, just redirect:
#!/bin/bash
./contaseparatamente.sh $( ls -S1 /usr/include/*.h | head -n 7 ) 1>&2
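Why unquoted? Command substitution without quotes undergoes word splitting, which is what turns the 7 lines into 7 separate arguments. A quick sketch, using a made-up `count_args` helper:

```shell
#!/bin/bash
count_args() { echo "$#"; }        # hypothetical helper: prints its argument count
count_args $(printf 'a\nb\nc')     # unquoted: splits on whitespace -> prints 3
count_args "$(printf 'a\nb\nc')"   # quoted: stays one argument     -> prints 1
```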

Related

Why doesn't it work when I override IFS and read multiple pieces of data into variables?

I want to read the result of a ps command and the process count into two variables, but all of the output gets assigned to the first variable.
My shell script looks like this:
#!/bin/bash
function status() {
    proc_num=`ps -ef | grep noah.*super | grep -v grep | tee /dev/stderr | wc -l`
    return $proc_num
}
IFS=$'#' read -r -d '' ret proc_num <<< `status 2>&1; echo "#$?"`
echo -e "proc_num: $proc_num\n"
echo -e "ret: $ret"
The result looks like this:
proc_num:
ret: root 7140 21935 0 Jul27 ? 00:00:00 /bin/sh -- /noah/modules/cecb4af2fce3393df49e748f86d7a176/supervise.minos-agent --run
root 8213 7140 0 Jul27 ? 00:00:00 /bin/sh -- /noah/modules/cecb4af2fce3393df49e748f86d7a176/supervise.minos-agent --run
root 8919 21935 0 Jul27 ? 00:00:00 /bin/sh -- /noah/modules/cecb4af2fce3393df49e748f86d7a176/supervise.minos-agent --run
root 18530 1 0 17:04 ? 00:00:00 /bin/sh -- /noah/modules/c0b527e8b1ce71007f8164d07195a8a2/supervise.logagent --run
root 21935 1 0 Jul10 ? 00:00:00 /bin/sh -- /noah/modules/cecb4af2fce3393df49e748f86d7a176/supervise.minos-agent --run
root 32278 32276 0 2019 ? 00:00:00 /bin/sh /noah/modules/f314c3a2b201042b9545e255364e9a9d/bin/supervise.noah-ccs-agent --run
root 34836 1 0 Sep18 ? 00:00:00 /bin/sh /noah/modules/488dddfee9441251c82ea773a97dfcd3/bin/supervise.noah-client --run
root 56155 1 0 Jun07 ? 00:00:00 /bin/sh /noah/modules/11e7054f8e14a30bd0512113664584b4/bin/supervise.server_inspector --run
8
Thanks for your help.
The immediate problem is that you're running into a bug in how earlier versions of bash treat unquoted here-strings (see this question). You can avoid it by double-quoting the here-string:
IFS=$'#' read -r -d '' ret proc_num <<< "`status 2>&1; echo "#$?"`"
...but please don't do that; this whole approach is overcomplicated and prone to problems.
Before I get to the more significant problems, I'll recommend using $( ) rather than backticks for command substitutions; they're easier to read, and avoid some parsing weirdnesses that backticks have.
Quote everything that might be misinterpreted. In grep noah.*super, the shell will try to turn noah.*super into a list of matching filenames. It's unlikely to find any matches, but if it somehow does the script will break in really weird ways. So use grep 'noah.*super' instead.
Do you have the pgrep command available? If so, use it instead of all of the ps | grep | grep stuff.
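As an aside, another common way to drop the grep -v grep step (a sketch, not from the original answer) is the bracket trick: bracketing one character of the pattern stops grep's own command line from matching it. A deterministic demo with printf standing in for ps output:

```shell
#!/bin/bash
# Two fake 'ps -ef' listings; the second shows what ps reports when
# grep is run with the bracketed pattern.
printf 'supervise.noah-super\ngrep noah.*super\n'   | grep -c 'noah.*super'    # 2: grep matches its own line
printf 'supervise.noah-super\ngrep [n]oah.*super\n' | grep -c '[n]oah.*super'  # 1: literal "[n]oah" doesn't match the regex
```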
Exit/return statuses are for reporting status (i.e. success/failure, and maybe what failed), not returning data. Returning the number of processes found, as you're doing, will run into trouble if the number ever exceeds 255 (because the status is just a single byte, so that's the max it can hold). If there are ever 256 processes, the function will return 0. If there are 300, it'll return 44. etc. Return data as output, rather than abusing the return status like this.
Also, it's best to have functions produce output via stdout, rather than stderr as this one's doing. If you need to sneak a copy of the output past something like $( ), redirect it back to stdout afterward. And I'd tend to use something other than stderr anyway, to avoid mixing in any actual errors with the output stream. Here's an example using FD #3 (and BTW use local variables in functions when possible):
{ local proc_num=$(ps -ef | grep 'noah.*super' | grep -v grep | tee /dev/fd/3 | wc -l); } 3>&1
...or just capture the output, then do multiple things with it:
local output="$(ps -ef | grep 'noah.*super' | grep -v grep)"
echo "$output"
local proc_num="$(echo "$output" | wc -l | tr -d ' ')" # tr is to remove spaces from the output
status 2>&1; echo "#$?" is also trouble-prone; here you're taking that return status (which should've been output rather than a return status), and converting it to part of the output (which is what it should've been in the first place). And you're doing it so you can then re-split them back into separate bits of data with read. If you ever actually do need to capture both the output and return status from something, capture them separately:
output="$(status 2>&1)"
return_status=$?
(BTW, the right side of a simple assignment like this is one of the very few places it's safe to omit double-quotes around a process or variable substitution. But using double-quotes doesn't hurt, and it's easier to just reflexively double-quote than remember the list of safe places, so I went ahead and double-quoted it here.)
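For instance (with a hypothetical fake_status standing in for the real function):

```shell
#!/bin/bash
fake_status() { echo "some output"; return 3; }   # hypothetical stand-in

output="$(fake_status)"
return_status=$?              # $? here is fake_status's return status
echo "output=$output"         # output=some output
echo "status=$return_status"  # status=3
```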
Don't use the function keyword; it's nonstandard. Just use funcname() { definition... }.
I'd avoid using echo -e -- different versions of echo (including the bash builtin compiled with different options) will treat -e differently. Some will treat it as meaning to interpret escape sequences in the output, but some will print it as part of the output(!). Either just avoid it:
echo "proc_num: $proc_num"
echo
echo "ret: $ret"
Or use printf and put the escape stuff in the format string:
printf 'proc_num: %s\n\nret: %s\n' "$proc_num" "$ret"
...or...
printf '%s\n' "proc_num: $proc_num" "" "ret: $ret"
So, how would I do this? My first preference would be to move the number-of-processes calculation outside of the status function entirely:
#!/bin/bash
status() {
ps -ef | grep 'noah.*super' | grep -v grep
}
ret="$(status)"
proc_num="$(echo "$ret" | wc -l | tr -d ' ')" # tr -d ' ' to remove spaces from the string
echo "proc_num: $proc_num"
echo
echo "ret: $ret"
If you do need to have the function compute that count, I'd have it also take care of adding that to its output (and probably use process substitution instead of a here-string):
...
status() {
local output="$(ps -ef | grep 'noah.*super' | grep -v grep)"
echo "$output"
printf '#'
echo "$output" | wc -l | tr -d ' '
}
IFS='#' read -r -d '' ret proc_num < <(status)
...
Final note: run your scripts through shellcheck.net -- it'll spot many common problems (like incorrect quoting).

Getting a specific line from a string where the line number I must get is stored in a variable?

I'm trying to get a specific line of a variable. The line I must get is stored in i. My code looks like this right now.
$(echo "$data" | sed '$iq;d')
It looks like I'm putting i in there wrong. Putting a number in for i works fine, but $i gets me the entire string.
I haven't found a solution that works with a variable yet; I'm not too familiar with bash and would appreciate help.
Edit: a bit of context
i=5
data=$(netstat -a | grep ESTAB)
line=$(echo "$data" | sed "${i}p")
echo $line
Use sed -n "${i}p" instead.
Example:
i=4; seq 1 10 | sed -n "${i}p"
Output:
4
Bonus:
i=5
readarray -O 1 -t data < <(exec netstat -a | grep ESTAB) ## Stores data as an array of lines starting at index 1
line=${data[i]}
echo "$line"
# printf '%s\n' "${data[@]}" ## Prints whole data.
Here is a way you can do this in bash itself:
IFS=$'\n' arr=($data)
echo "${arr[$i]}"
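An awk alternative (a sketch, not from the answers above) that sidesteps the quoting question entirely by passing the line number in with -v:

```shell
#!/bin/bash
i=4
seq 1 10 | awk -v n="$i" 'NR == n'   # prints line 4 of the input
```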

Command to count the characters present in the variable

I am trying to count the number of characters in a variable. I used the shell script below, but I am getting the error "command not found" on line 4.
#!/bin/bash
for i in one; do
    n = $i | wc -c
    echo $n
done
Can someone help me with this?
In bash you can just write ${#string}, which will return the length of the variable string, i.e. the number of characters in it.
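For example:

```shell
#!/bin/bash
string="hello"
echo "${#string}"   # prints: 5
```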
Something like this:
#!/bin/bash
for i in one; do
n=$(echo $i | wc -c)
echo $n
done
Assignments in bash cannot have a space before the equals sign. In addition, you want to capture the output of the command you run and assign that to $n, rather than that statement which would probably just assign $i to $n.
Use the following instead:
#!/bin/bash
for i in one; do
    n=`echo $i | wc -c`
    echo $n
done
It can be as simple as this:
str="abcdef"; wc -c <<< "$str"
7
But mind you that the end-of-line character counts too:
str="abcdef"; cat -A <<< "$str"
abcdef$
If you need to remove it:
str="abcdef"; tr -d '\n' <<< "$str" | wc -c
6

Count number of words in file, bash script

How could I go about printing the number of words in a specified file in a bash script? For example, it will be run as
cat test | ./bash_script.sh
cat test
Hello World
This is a test
Output of running cat test | ./bash_script.sh would look like
Word count: 6.
I am aware that it can be done without a script. I am trying to implement wc -w in a bash script that will count the words as shown above. Any help is appreciated! Thank you.
If given a stream of input as shown:
while read -a words; do (( num += ${#words[@]} )); done
echo Word count: $num.
Extending from the link @FredrikPihl gave in a comment: this reads from each file given as an argument, or from stdin if no files are given:
for f in "${@:-/dev/stdin}"; do
    while read -a words; do (( num += ${#words[@]} )); done < "$f"
done
echo Word count: $num.
This should be faster:
for f in "${@:-/dev/stdin}"; do
    words=( $(< "$f") )
    (( num += ${#words[@]} ))
done
echo Word count: $num.
In pure bash:
read -a arr -d $'\004'
echo ${#arr[@]}
Try this:
wc -w *.md | grep total | awk '{print $1}'
#!/bin/bash
word_count=$(wc -w)
echo "Word count: $word_count."
As pointed out by @keshlam in the comments, this can easily be done by executing wc -w from the shell script; I didn't understand what its use case could be. Still, the shell script above will work as per your requirement.
I believe what you need is a function you could add to your .bashrc:
function script1() { wc -w "$1"; }
script1 README.md
335 README.md
Add the function to your .bashrc file, call it whatever you want, and either open a new console or source your .bashrc to load it. From then on you can call the function with a file name, as shown, and it will give you the count.
You could expand the contents of the file as arguments and echo the number of arguments in the script.
$# Expands to the number of script arguments
#!/bin/bash
echo "Word count: $#."
Then execute:
./bash_script.sh $(cat file)
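The same idea in function form (a sketch; count_words is a made-up name), with the caveat that the unquoted expansion also performs globbing on the file's contents:

```shell
#!/bin/bash
count_words() { echo "Word count: $#."; }                # $# is the argument count
count_words $(printf 'Hello World\nThis is a test\n')    # prints: Word count: 6.
```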

Variable scope for bash shell scripts and functions in the script

I'm a little confused about my script regarding functions, variable scope, and possibly subshells.
I saw in another post that pipes spawn a subshell and the parent shell can't access variables from the subshell. Is this also the case for commands run in backticks?
To not bore people, I've shortened my 100+ line script, but I tried to leave in the important elements (i.e. backticks, pipes, etc.). Hopefully I didn't leave anything out.
global1=0
global2=0
start_read=true

function testfunc {
    global1=9999
    global2=1111
    echo "in testfunc"
    echo $global1
    echo $global2
}

file1=whocares
file2=whocares2

for line in `cat $file1`
do
    for i in `grep -P "\w+ stream" $file2 | grep "$line"` # possible but unlikely problem spot
    do
        end=$(echo $i | cut -d ' ' -f 1-4 | cut -d ',' -f 1) # possible but unlikely spot
        duration=`testfunc $end` # more likely problem spot
    done
done

echo "global1 = $global1"
echo "global2 = $global2"
So when I run my script, the last line says global1 = 0. However, in my function testfunc, global1 gets set to 9999, and the debug messages print that, within the function at least, it is 9999.
Two questions here:
Do the backticks spawn a subshell, thus making my script not work?
How do I work around this issue?
You can try something like
global1=0
global2=0
start_read=true

function testfunc {
    global1=9999
    global2=1111
    echo "in testfunc"
    echo $global1
    echo $global2
    duration=something
}

file1=whocares
file2=whocares2

for line in `cat $file1`
do
    for i in `grep -P "\w+ stream" $file2 | grep "$line"` # possible but unlikely problem spot
    do
        end=$(echo $i | cut -d ' ' -f 1-4 | cut -d ',' -f 1) # possible but unlikely spot
        testfunc $end # more likely problem spot
    done
done

echo "global1 = $global1"
echo "global2 = $global2"
Here testfunc is called directly rather than inside backticks, and it sets duration itself, so its variable assignments happen in the parent shell.
Do the backticks spawn a subshell, thus making my script not work?
Yes, they do, and any changes made to variables in a subshell are not visible in the parent shell.
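A minimal sketch of the effect:

```shell
#!/bin/bash
global=0
out=`global=9999; echo "ran in a subshell"`   # the backticks run this in a subshell
echo "$global"                                # prints: 0 -- the assignment didn't survive
```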
How do I work around this issue?
You can probably try this loop, which avoids spawning a subshell for the loop bodies:
while read line
do
    while read i
    do
        end=$(echo $i | cut -d ' ' -f 1-4 | cut -d ',' -f 1)
        duration=$(testfunc "$end")
    done < <(grep -P "\w+ stream" "$file2" | grep "$line")
done < "$file1"
PS: testfunc is still called via command substitution here, so it still runs in a subshell; only the loops themselves stay in the parent shell.
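A sketch of why the < <( ... ) form helps: the while loop runs in the parent shell, so variables it sets survive (unlike cmd | while ..., where the loop runs in a subshell):

```shell
#!/bin/bash
count=0
while read -r line; do
    (( count++ ))
done < <(printf 'a\nb\nc\n')   # process substitution: loop stays in the parent shell
echo "$count"                  # prints: 3
```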
