How to use grep with single brackets? - linux

I was looking at an answer in another thread about which bracket pair to use with if in a bash script: [[ is less surprising and has more features, such as pattern matching (=~), whereas [ and test are built in and POSIX compliant, which makes them portable.
Recently, I was attempting to test the result of a grep command and it was failing with [: too many arguments. I was using [, but when I switched to [[ it worked. How would I do such a test with [ in order to maintain portability?
This is the test that failed:
#!/bin/bash
cat > slew_pattern << EOF
g -x"$
EOF
if [ $(grep -E -f slew_pattern /etc/sysconfig/ntpd) ]; then
echo "slew mode"
else
echo "not slew mode"
fi
And the test that succeeded:
#!/bin/bash
cat > slew_pattern << EOF
g -x"$
EOF
if [[ $(grep -E -f slew_pattern /etc/sysconfig/ntpd) ]]; then
echo "slew mode"
else
echo "not slew mode"
fi

if [ $(grep -E -f slew_pattern /etc/sysconfig/ntpd) ]; then
This command will certainly fail when grep finds more than one match. The unquoted command substitution is subject to word splitting (on spaces, tabs and newlines), so each matched line becomes one or more separate arguments to [.
Multiple matches from grep are separated by newlines, so after splitting, the test command effectively becomes:
[ match1 match2 match3 ... ]
which doesn't make much sense. You will get different error messages depending on the number of matches returned by grep (i.e. the number of arguments handed to the test command [).
For example:
2 matches will give you a "unary operator expected" error,
3 matches will give you a "binary operator expected" error, and
more than 3 matches will give you a "too many arguments" error (or similar) in Bash.
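As a hedged illustration (the matched lines here are made up), suppose grep printed two single-word matches; after word splitting, [ sees each word as a separate argument:
# hypothetical grep output: two matching lines
#   fudge
#   driftfile
# the unquoted $(...) is word-split, so the test becomes:
[ fudge driftfile ]
# bash: [: fudge: unary operator expected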
You need to quote variables inside [ to prevent word splitting.
The Bash-specific [[, on the other hand, does not perform word splitting on unquoted expansions. The grep output therefore remains a single string, which [[ simply tests for being non-empty.
So the solution is to look only at the exit status of grep:
if grep -E -f slew_pattern /etc/sysconfig/ntpd; then
Or use quote when capturing output:
if [ "$(grep -E -f slew_pattern /etc/sysconfig/ntpd)" ]; then
Note:
You don't really need to capture the output here, simply looking at the exit status will suffice.
Additionally, you can suppress output of grep command to be printed with -q option and errors with -s option.
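Putting the note into practice, a portable version of the test (assuming slew_pattern has already been created as in the question, and checking only grep's exit status) could look like:
#!/bin/sh
# -q suppresses grep's output, -s suppresses error messages about unreadable files
if grep -E -q -s -f slew_pattern /etc/sysconfig/ntpd; then
echo "slew mode"
else
echo "not slew mode"
fi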

Related

Prevent logging of clear command

Suppose the following simple script:
#!/bin/bash
log="${HOME}/bin/test.log"
if [ -r "${log}" ]; then
rm -f "${log}"
fi
{
echo "Start of test"
clear
echo "End of test"
} 2>&1 | tee -a "${log}"
The contents of the generated log file look like the following:
Start of test
<unprintable>[H<unprintable>[2JEnd of test
Is there any way to avoid the extra characters resulting from issuing a clear command using this style of logging?
One possibility is to just filter them out of the stream that goes to the log file.
{
echo "Start of test"
clear
echo "End of test"
} 2>&1 | tee -a >(sed 's/.\[H.\[2J//' > "${log}")
(I'm not sure how to match a literal escape character using portable sed alone. Here, I just use . to match any character and assume that this regular expression will only match the intended sequence. One could "cheat" and use bash to generate a literal escape character in the sed command:
sed $'s/\e\\[H\e\\[2J//'
although it's not cheating too much since we're already using bash-specific process substitution.)
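Another option, assuming the script is run on a terminal, is to redirect clear's output straight to the terminal device, so its control sequences never enter the pipe to tee at all:
{
echo "Start of test"
clear > /dev/tty
echo "End of test"
} 2>&1 | tee -a "${log}"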

About egrep command

How can I create a bash script that accepts a file as a command-line argument and prints on screen all lines that are more than 12 characters long, using the egrep command?
You can use:
egrep '.{13}'
The . will match any character, and the {13} repeats it exactly 13 times. You can put this in a shell script like:
#!/bin/sh
# Make sure the user actually passed an argument. This is useful
# because otherwise grep will try and read from stdin and hang forever
if [ -z "$1" ]; then
echo "Filename needed"
exit 1
fi
egrep '.{13}' "$1"
The $1 refers to the first command-line argument. You can also use $2, $3, etc., and "$@" expands to all of the command-line arguments (useful if you want to run it over multiple files):
egrep '.{13}' "$@"
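Note that GNU grep documents egrep as deprecated in favour of grep -E, so an equivalent invocation that avoids the legacy name is:
grep -E '.{13}' "$1"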

Too many arguments in for

In both for statements, I am getting the following error:
./count_files.sh: line 21: [: too many arguments
./count_files.sh: line 16: [: too many arguments.
Can anyone help me ?
#!/bin/bash
files=($(find /usr/src/linux-headers-3.13.0-34/include/ -type f -name '[aeiou][a-z0-9]*.h'))
count=0
headerfiles=($(find /usr/src/linux-headers-3.13.0-34/include/ -type f -name '[_a-zA-Z0-9]*.h' | grep -v "/linux/"))
for file in "${files[@]}"
do
if ! [ grep -Fxq "linux/err.h" $file ];
then
localcount=0
for header in "${headerfiles[@]}"
do
if [ grep -Fxq $header $file ];
then
localcount=$((localcount+1))
if [ $localcount -eq 3 ];
then
count=$(($count+1))
break
fi
fi
done
localcount=0
fi
done
echo $count
One of the problem lines is:
if ! [ grep -Fxq "linux/err.h" $file ];
The semicolon at the end is unnecessary unless the then is on the same line; it is, however, harmless.
It looks as though you want to execute the grep command and check whether it produces any output. However, you've simply provided the test (aka [) command with four string arguments (plus the closing ] for 5 in total), the second of which is not one of the options recognized by test.
You might have meant to use this:
if ! [ -n "$(grep -Fxq "linux/err.h" "$file")" ]
(unless you meant -z instead of -n; the negations are confusing me). But if you're interested in whether grep found anything, you can simply test the exit status of grep:
if grep -Fxq "linux/err.h" "$file"
Hmmm...the -q is 'quiet' mode; so in fact the string test won't work since grep produces no output. You want the direct test of the exit status, possibly preceded by the ! logical not operator.
You shouldn't use square brackets around the grep.
In shell scripts square brackets are not used for grouping, [ is a command in its own right (an alias for test), and it is the [ command that is complaining that you've given it too many arguments.
Just make the call without brackets
if ! grep ....
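Putting those fixes together, a sketch of the corrected loops (keeping the question's array names, paths, and counting logic) might look like this:
count=0
for file in "${files[@]}"; do
    # only look at files that do not already contain the exact line "linux/err.h"
    if ! grep -Fxq "linux/err.h" "$file"; then
        localcount=0
        for header in "${headerfiles[@]}"; do
            # -Fx matches the header path as a whole, literal line
            if grep -Fxq "$header" "$file"; then
                localcount=$((localcount+1))
                if [ "$localcount" -eq 3 ]; then
                    count=$((count+1))
                    break
                fi
            fi
        done
    fi
done
echo "$count"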
Change the fors to whiles with reads (since files and headerfiles are arrays, print them one element per line rather than echoing them, and use read -r so backslashes are not mangled):
...
printf '%s\n' "${files[@]}" | while read -r file ; do
...
printf '%s\n' "${headerfiles[@]}" | while read -r header ; do
...
done
...
done
...

How to make a non-interactive shell script [duplicate]

This question already has an answer here: Shell script argument parsing (1 answer). Closed 8 years ago.
I want to make a non-interactive shell script where I can give options at the start of execution, and where I can hard-code the various actions to be taken when the user provides different inputs.
for example:
Below should perform some action on target.txt
user#/root>myPlannedScript -p target.txt
Below should perform some other actions on the target.txt
user#/root>myPlannedScript -a target.txt
For example:
The cat tool performs various actions when given different options. I want my script to act like this:
:/root> cat --h
Usage: cat [OPTION] [FILE]...
Concatenate FILE(s), or standard input, to standard output.
-A, --show-all equivalent to -vET
-b, --number-nonblank number nonblank output lines
-e equivalent to -vE
-E, --show-ends display $ at end of each line
-n, --number number all output lines
-r, --reversible use \ to make the output reversible, implies -v
-s, --squeeze-blank never more than one single blank line
-t equivalent to -vT
-T, --show-tabs display TAB characters as ^I
query.sh
#!/bin/bash
if [ $# -eq 0 ]
then echo do one thing
else echo do other thing
fi
"query.sh" => do one thing
"query.sh anythingYouPut" => do other thing ;oP
but if you really want a parameter for each action
#!/bin/bash
if [ -z "$1" ]
then
    echo do nothing
else
    if [ "$1" -eq 1 ]
    then
        echo do one thing
    fi
    if [ "$1" -eq 2 ]
    then
        echo do other thing
    fi
fi
"query.sh" => do nothing
"query.sh 1" => do one thing
"query.sh 2" => do other thing

Bash shell `if` command returns something `then` do something

I am trying to write an if/then statement where, if there is non-empty output from an ls | grep something command, I want to execute some statements. I do not know the syntax I should be using. I have tried several variations of this:
if [[ `ls | grep log ` ]]; then echo "there are files of type log";
Well, that's close, but you need to finish the if with fi.
Also, if just runs a command and executes the conditional code if the command succeeds (exits with status code 0), which grep does only if it finds at least one match. So you don't need to check the output:
if ls | grep -q log; then echo "there are files of type log"; fi
If you're on a system with an older or non-GNU version of grep that doesn't support the -q ("quiet") option, you can achieve the same result by redirecting its output to /dev/null:
if ls | grep log >/dev/null; then echo "there are files of type log"; fi
But since ls also returns nonzero if it doesn't find a specified file, you can do the same thing without the grep at all, as in D.Shawley's answer:
if ls *log* >&/dev/null; then echo "there are files of type log"; fi
You also can do it using only the shell, without even ls, though it's a bit wordier:
for f in *log*; do
# even if there are no matching files, the body of this loop will run once
# with $f set to the literal string "*log*", so make sure there's really
# a file there:
if [ -e "$f" ]; then
echo "there are files of type log"
break
fi
done
As long as you're using bash specifically, you can set the nullglob option to simplify that somewhat:
shopt -s nullglob
for f in *log*; do
echo "There are files of type log"
break
done
Or without if; then; fi:
ls | grep -q log && echo 'there are files of type log'
Or even:
ls *log* &>/dev/null && echo 'there are files of type log'
The if built-in executes a shell command and selects the block based on the return status of that command. ls returns a nonzero status if it does not find the requested files, so there is no need for the grep part. The [[ construct is a bash keyword that evaluates conditional (test) expressions; arithmetic evaluation is done with (( )).
Anyway, if you put all of this together, then you end up with the following command:
if ls *log* > /dev/null 2>&1
then
echo "there are files of type log"
fi
