Too many arguments in for - linux

In both for statements, I am getting the following error:
./count_files.sh: line 21: [: too many arguments
./count_files.sh: line 16: [: too many arguments.
Can anyone help me?
#!/bin/bash
files=($(find /usr/src/linux-headers-3.13.0-34/include/ -type f -name '[aeiou][a-z0-9]*.h'))
count=0
headerfiles=($(find /usr/src/linux-headers-3.13.0-34/include/ -type f -name '[_a-zA-Z0-9]*.h' | grep -v "/linux/"))
for file in "${files[@]}"
do
    if ! [ grep -Fxq "linux/err.h" $file ];
    then
        localcount=0
        for header in "${headerfiles[@]}"
        do
            if [ grep -Fxq $header $file ];
            then
                localcount=$((localcount+1))
                if [ $localcount -eq 3 ];
                then
                    count=$(($count+1))
                    break
                fi
            fi
        done
        localcount=0
    fi
done
echo $count

One of the problem lines is:
if ! [ grep -Fxq "linux/err.h" $file ];
The semicolon at the end is unnecessary unless the then is on the same line; it is, however, harmless.
It looks as though you want to execute the grep command and check whether it produces any output. However, you've simply provided the test (aka [) command with four string arguments (plus the closing ] for 5 in total), the second of which is not one of the options recognized by test.
You might have meant to use this:
if ! [ -n "$(grep -Fxq "linux/err.h" "$file")" ]
(unless you meant -z instead of -n; the negations are confusing me). But if you're interested in whether grep found anything, you can simply test the exit status of grep:
if grep -Fxq "linux/err.h" "$file"
Hmm... the -q option means 'quiet' mode, so in fact the string test won't work, since grep produces no output. You want the direct test of the exit status, possibly preceded by the ! logical-not operator.

You shouldn't use square brackets around the grep.
In shell scripts, square brackets are not used for grouping: [ is a command in its own right (an alias for test), and it is the [ command that is complaining that you've given it too many arguments.
Just make the call without brackets:
if ! grep ....
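For reference, here is a sketch of the question's loops with the brackets dropped, testing grep's exit status directly and quoting the expansions (it assumes files, headerfiles and count are set up exactly as in the question):
for file in "${files[@]}"
do
    # No [ ... ] around grep: the if tests grep's exit status directly.
    if ! grep -Fxq "linux/err.h" "$file"
    then
        localcount=0
        for header in "${headerfiles[@]}"
        do
            if grep -Fxq "$header" "$file"
            then
                localcount=$((localcount+1))
                if [ "$localcount" -eq 3 ]
                then
                    count=$((count+1))
                    break
                fi
            fi
        done
    fi
done
echo "$count"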

Change the for loops to while loops with read:
...
echo "${files}" | while read file ; do
...
echo "${headerfiles}" | while read header ; do
...
done
...
done
...
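If you go that route, a common pattern is to read the find output directly, one file name per line (a sketch; the find invocation is the one from the question, and it assumes no file name contains a newline):
count=0
while IFS= read -r file
do
    if ! grep -Fxq "linux/err.h" "$file"
    then
        # ... the per-file header counting from the question goes here ...
        count=$((count+1))
    fi
done < <(find /usr/src/linux-headers-3.13.0-34/include/ -type f -name '[aeiou][a-z0-9]*.h')
echo "$count"
Feeding the loop with process substitution (< <(...)) rather than a pipe keeps the while body in the current shell, so count is still set after done.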

Related

Check if wc command output is greater than a variable in Bash

I need to check if the wc command output is greater than a variable. Here's my code:
if test wc -w $i -gt $num
then
echo "too great"
fi
If the current file $i contains more words than the $num variable, I print "too great". I've already tried everything but can't get the script working.
You need to take the output of the wc command and use it as an argument to test:
if test "$( wc -w < "$i" )" -gt "$num"
See "Command Substitution" in man bash.
If you don't use redirection <, wc also outputs the file name, which would break the comparison.
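A minimal, self-contained sketch of the fixed test; the file name and threshold below are made-up placeholders:
#!/bin/bash
i="somefile.txt"   # hypothetical input file
num=100            # hypothetical word-count threshold
# Redirect the file into wc so only the number is printed, then compare.
if test "$( wc -w < "$i" )" -gt "$num"
then
    echo "too great"
fi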

Bash scripting: find the size of a directory and, if the size is greater than x, do a task

I have put the following together from a couple of other articles, but it does not seem to be working. What I eventually want it to do is check the directory size and then, if the directory has new content above a certain total size, let me know.
#!/bin/bash
file=private/videos/tv
minimumsize=2
actualsize=$(du -m "$file" | cut -f 1)
if [ $actualsize -ge $minimumsize ]; then
echo "nothing here to see"
else
echo "time to sync"
fi
This is the output:
./sync.sh: line 5: [: too many arguments
time to sync
I am new to bash scripting so thank you in advance.
The error:
[: too many arguments
seems to indicate that either $actualsize or $minimumsize is expanding to more than one argument.
Change your script as follows:
#!/bin/bash
set -x # Add this line.
file=private/videos/tv
minimumsize=2
actualsize=$(du -m "$file" | cut -f 1)
echo "[$actualsize] [$minimumsize]" # Add this line.
if [ $actualsize -ge $minimumsize ]; then
echo "nothing here to see"
else
echo "time to sync"
fi
The set -x will echo commands before attempting to execute them, something which assists greatly with debugging.
The echo "[$actualsize] [$minimumsize]" will assist in trying to establish whether these variables are badly formatted or not, before the attempted comparison.
If you do that, you'll no doubt find that du -m produces a lot of output for some arguments, since it descends into subdirectories and prints one line for each of them.
If you want a single line of output for all the subdirectories aggregated, you have to use the -s flag as well:
actualsize=$(du -ms "$file" | cut -f 1)
If instead you don't want any of the subdirectories taken into account, you can take a slightly different approach, limiting the depth to one and tallying up all the sizes:
actualsize=$(find . -maxdepth 1 -type f -print0 | xargs -0 ls -al | awk '{s += $5} END {print int(s/1024/1024)}')
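Putting the pieces together, a sketch of the script with the -s fix and quoted expansions (the path, the threshold, and the branch messages are kept exactly as in the question):
#!/bin/bash
file=private/videos/tv
minimumsize=2
# -s aggregates the directory into a single line; -m reports the size in MB.
actualsize=$(du -ms "$file" | cut -f 1)
if [ "$actualsize" -ge "$minimumsize" ]; then
    echo "nothing here to see"
else
    echo "time to sync"
fi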

How to use grep with single brackets?

I was looking at an answer in another thread about which bracket pair to use with if in a bash script: [[ is less surprising and has more features, such as pattern matching (=~), whereas [ and test are built in and POSIX compliant, which makes them portable.
Recently, I was attempting to test the result of a grep command and it was failing with [: too many arguments. I was using [. But when I switched to [[ it worked. How would I do such a test with [ in order to maintain portability?
This is the test that failed:
#!/bin/bash
cat > slew_pattern << EOF
g -x"$
EOF
if [ $(grep -E -f slew_pattern /etc/sysconfig/ntpd) ]; then
echo "slew mode"
else
echo "not slew mode"
fi
And the test that succeeded:
#!/bin/bash
cat > slew_pattern << EOF
g -x"$
EOF
if [[ $(grep -E -f slew_pattern /etc/sysconfig/ntpd) ]]; then
echo "slew mode"
else
echo "not slew mode"
fi
if [ $(grep -E -f slew_pattern /etc/sysconfig/ntpd) ]; then
This command will certainly fail for multiple matches. It will throw an error because the grep output gets split on newlines.
Multiple grep matches are separated by newlines, so the test command becomes something like:
[ match1 match2 match3 ... ]
which doesn't make much sense. You will get different error messages depending on the number of matches returned by grep (i.e. the number of arguments given to the test command [).
For example:
2 matches will give you a "unary operator expected" error,
3 matches will give you a "binary operator expected" error, and
more than 3 matches will give you a "too many arguments" error (or similar) in Bash.
You need to quote expansions inside [ to prevent word splitting.
The Bash-specific [[, on the other hand, does not word-split its operands. The grep output therefore doesn't get split on newlines and remains a single string, which is a valid test operand.
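A quick way to see the difference (the variable below is made up purely for illustration):
out="two words"
[ $out ] 2>/dev/null || echo "unquoted: [ sees two arguments and fails"
[ "$out" ] && echo "quoted: [ sees one non-empty argument and succeeds"
[[ $out ]] && echo "[[ does not word-split, so it succeeds even unquoted"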
So the solution is to look only at the exit status of grep:
if grep -E -f slew_pattern /etc/sysconfig/ntpd; then
Or use quote when capturing output:
if [ "$(grep -E -f slew_pattern /etc/sysconfig/ntpd)" ]; then
Note:
You don't really need to capture the output here; simply looking at the exit status will suffice.
Additionally, you can suppress grep's output with the -q option and its error messages with the -s option.
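Putting those notes together, a minimal sketch (the pattern file and path are the ones from the question):
if grep -E -q -s -f slew_pattern /etc/sysconfig/ntpd; then
    echo "slew mode"
else
    echo "not slew mode"
fi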

Linux Bash file Reading Lines and words

I apologize if this is a trivial question. I am learning how to use linux bash and this little task is giving me a headache...
So I need to write a script, let's call it count.sh. I want it to print, for each file in the working directory, the filename, the number of lines, and the number of words to the console:
test.txt 100 1023
someOtherfiles 10 233
So far, I know that the following gives me all the file names in the directory. And thanks to all who helped me, I got this working version:
for f in *; do
echo -n "$f"
cat "$f" | wc -wl
done
I would really appreciate your help! Thanks in advance!
P.S. If you know of good resources (links to tutorials) for learning shell scripting and are willing to share them with me, please do; I think I really need to learn these basics. Thanks again!
If you must have the file name as the first field in your output, try this:
for f in *; do
    if [ -f "$f" ]; then
        echo -n "$f"
        cat "$f" | wc -wl
    fi
done
Or, more robustly (explained below):
for f in *; do
    if [[ -f $f ]]; then
        echo "$f $(wc -wl < "$f")"
    fi
done
[[ -f $f ]] processes only files (it excludes subdirectories) and also handles the case where the directory is empty, in which case * is by default left unexpanded, i.e. assigned to $f as is, so the -f test simply fails.
echo "$f $(wc -wl < "$f")" uses command substitution ($( ... )) to directly include the output from the enclosed command in the output string passed to echo.
Note that < is used to direct the content of file $f to wc via stdin because wc would otherwise append the name of the input file to its output (thanks, @R Sahu).
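For example, with a hypothetical test.txt of 100 lines and 1023 words, the two forms print roughly:
wc -wl test.txt     # ->  100 1023 test.txt
wc -wl < test.txt   # ->  100 1023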

how to compare umask

I am trying to compare the umask of a user. I am getting an error while doing the comparison. The code I am using is:
val=`su - user -c "umask" | tail -2 | sed -n "/[0-9]/p"`
if [ $val -eq 744 ]
then
echo "477 found."
fi
When I execute this I am getting an error like:
sh: ^[H: A test command parameter is not valid.
I've tried with = in the compare command, but it is still not working.
Please give any suggestions.
Regards.
val has been initialised as 0.
I am running this as root, so no login is there.
I've also tried giving quotes.
You should quote the variable name in your test expression:
if [ "$val" -eq 744 ]
Without the quotes, an empty or multi-word $val leaves [ with an unexpected number of arguments, which is what produces errors like the one above.
Your code executes well on my machine. The only suggestion I can offer is a slightly different syntax; different shell versions sometimes complain about one syntax and accept another:
val=`su - user -c "umask" | tail -2 | sed -n "/[0-9]/p"`
if [[ $val -eq 477 ]] ; then
echo "477 found."
fi
Keep in mind that [ cond ] is the POSIX test command, while the Bash-specific [[ cond ]] is a shell keyword with its own parsing rules (it does not word-split unquoted expansions, for example).
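For what it's worth, a minimal sketch of the comparison that strips everything except octal digits from the last line of su's output, so login banners or terminal escape sequences (like the ^[H in the error above) can't leak into $val; the user name and the 477 target follow the question, and a string comparison is used instead of -eq:
# Keep only octal digits from the last line; umask prints something like 0477.
val=$(su - user -c "umask" | tail -1 | tr -cd '0-7')
if [ "$val" = "0477" ]; then
    echo "477 found."
fi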
