How can I create a bash script that accepts a file as a command-line argument and prints all lines longer than 12 characters, using the egrep command?
You can use:
egrep '.{13}'
The . matches any character, and the {13} repeats it exactly 13 times, so the pattern matches any line at least 13 characters long (i.e. more than 12). You can put this in a shell script like:
#!/bin/sh
# Make sure the user actually passed an argument. This is useful
# because otherwise grep will try to read from stdin and hang forever
if [ -z "$1" ]; then
echo "Filename needed"
exit 1
fi
egrep '.{13}' "$1"
The $1 refers to the first command-line argument. You can also use $2, $3, etc., and "$@" expands to all of the command-line arguments (useful if you want to run it over multiple files):
egrep '.{13}' "$@"
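For example, if the script (with "$@" in place of "$1" on the last line) is saved as longlines.sh — a hypothetical name — and made executable, you can run it over several files at once:
chmod +x longlines.sh
./longlines.sh notes.txt server.log
and it will print every line longer than 12 characters from both files.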
Related
I have a function, produce, which determines whether a file is present and, if not, runs the given command. This works fine when the command simply writes to stdout. However, in the command below I pipe the output through a second and a third command before it reaches stdout. In this scenario the output is written to the file correctly, but the function does not echo the preceding $@, and the contents of the initial unpopulated outputfile.vcf (which contains the header columns and is generated when the pipeline starts) are also written to stdout. Is there a more appropriate way to evaluate $@ > "${curfile}"?
produce() {
local curfile=$1
#Remove the first element of the list of passed arguments
shift
if [ ! -e "${curfile}" ]; then
#Run the subsequent command as shown in the list of passed arguments
echo $@
$@ > "${curfile}"
fi
}
produce outputfile.vcf samtools view -bq 20 input.bam | samtools mpileup -Egu -t DP,SP -f hs37d5formatted.fa - | bcftools call -cNv -
OK, as I mentioned in my comment, the issue relates to the pipe characters, so I had to evaluate the variable using eval and escape the pipe characters. To ensure the function produce interprets $@ correctly, I fed it the command as follows. Note also that the variables are now all quoted:
produce() {
local curfile="$1"
#Remove the first element of the list of passed arguments
shift
if [ ! -e "${curfile}" ]; then
#Run the subsequent command as shown in the list of passed arguments
echo "$#"
eval "$# > ${curfile}"
fi
}
produce outputfile.vcf samtools view -bq 20 input.bam \| samtools mpileup -Egu -t DP,SP -f hs37d5formatted.fa -\| bcftools call -cNv -
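An alternative to eval (my own suggestion, not part of the original answer) is to pass the whole pipeline as a single quoted string and let a subshell interpret it, which avoids the escaping entirely:
produce() {
local curfile="$1"
shift
if [ ! -e "${curfile}" ]; then
# after the shift, "$1" holds the entire pipeline as one string;
# the subshell parses the pipes, so no escaping or eval is needed
bash -c "$1" > "${curfile}"
fi
}
produce outputfile.vcf 'samtools view -bq 20 input.bam | samtools mpileup -Egu -t DP,SP -f hs37d5formatted.fa - | bcftools call -cNv -'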
You can use >> to append to a file. For example:
echo "line 1" >> filename
echo "line 2" >> filename
Will result in a file containing:
line 1
line 2
I was looking at an answer in another thread about which bracket pair to use with if in a bash script. [[ is less surprising and has more features, such as pattern matching (=~), whereas [ and test are built in and POSIX-compliant, making them portable.
Recently, I was attempting to test the result of a grep command, and it was failing with [: too many arguments. I was using [, but when I switched to [[ it worked. How would I do such a test with [ in order to maintain portability?
This is the test that failed:
#!/bin/bash
cat > slew_pattern << EOF
g -x"$
EOF
if [ $(grep -E -f slew_pattern /etc/sysconfig/ntpd) ]; then
echo "slew mode"
else
echo "not slew mode"
fi
And the test that succeeded:
#!/bin/bash
cat > slew_pattern << EOF
g -x"$
EOF
if [[ $(grep -E -f slew_pattern /etc/sysconfig/ntpd) ]]; then
echo "slew mode"
else
echo "not slew mode"
fi
if [ $(grep -E -f slew_pattern /etc/sysconfig/ntpd) ]; then
This command will certainly fail for multiple matches, throwing an error because the grep output is split on newlines.
Multiple grep matches are separated by newlines, so the test command becomes something like:
[ match1 match2 match3 ... ]
which doesn't make much sense. You will get a different error message depending on the number of matches returned by grep (i.e., the number of arguments given to the test command [).
For example:
2 matches will give you a unary operator expected error,
3 matches will give you a binary operator expected error, and
more than 3 matches will give you a too many arguments error (or similar) in Bash.
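You can reproduce these errors without grep at all; a quick illustration at an interactive Bash prompt:
$ [ one two ]
bash: [: one: unary operator expected
$ [ one two three ]
bash: [: two: binary operator expected
$ [ one two three four ]
bash: [: too many arguments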
You need to quote variables inside [ to prevent word splitting.
On the other hand, the Bash-specific [[ prevents word splitting by default, so the grep output doesn't get split on newlines and remains a single string, which is a valid argument for the test command.
So the solution is to look only at the exit status of grep:
if grep -E -f slew_pattern /etc/sysconfig/ntpd; then
Or use quote when capturing output:
if [ "$(grep -E -f slew_pattern /etc/sysconfig/ntpd)" ]; then
Note:
You don't really need to capture the output here, simply looking at the exit status will suffice.
Additionally, you can suppress grep's output with the -q option and its error messages with the -s option.
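Putting those together, a minimal form of the same test (same pattern file and target as above) needs no command substitution at all:
if grep -qsE -f slew_pattern /etc/sysconfig/ntpd; then
echo "slew mode"
else
echo "not slew mode"
fi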
I am trying to learn bash commands, and some very basic commands are not working as I expect. I was following this page: http://www.tutorialspoint.com/unix/unix-special-variables.htm
Script:
#!/bin/bash
name="john"
other="shawn"
echo $name
echo $other
echo $1
echo $2
echo $#
echo $@
Output:
$ new
john
shawn


0

$
(The blank lines are $1, $2, and $@, which are all empty; $# prints the count 0.)
$1, $2, etc., and $# have special meaning in bash scripts. They refer to the arguments passed to the script, so if you have a script in a file called foo.sh like:
#!/bin/bash
echo "Number of arguments: $#";
echo "First argument: $1";
echo "Second argument: $2";
If you chmod +x foo.sh and then run:
./foo.sh first second
You will see:
Number of arguments: 2
First argument: first
Second argument: second
$1 refers to the first command line argument passed to the script. The script is foo.sh, so anything after the script name will become a command line argument.
Command-line arguments are separated by whitespace, so when you type ./foo.sh first second, bash stores first in $1 and second in $2.
If you typed:
./foo.sh first second third FOURTH fifth
bash would store third in the variable $3, FOURTH in the variable $4, and so on.
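Quoting changes the splitting: a quoted string, spaces and all, is passed as a single argument. For example:
./foo.sh "first second" third
stores first second in $1 and third in $2, so $# is 2.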
Is your script named 'new'? In that case, run it as follows, one by one, and you will get an idea of how this works:
./new
./new a
./new a b
When you ran your script, you did not pass any arguments. The number of arguments passed to the script is shown by echo $#, and your output clearly shows that echo $# printed a count of 0. Pass arguments when you call your script, like below:
./new argument1 argument2
I want to make a non-interactive shell script where I can give options at the start of execution, and where I can hard-code the various actions to be taken when the user provides different inputs.
For example:
Below should perform some action on target.txt:
user#/root>myPlannedScript -p target.txt
Below should perform some other action on target.txt:
user#/root>myPlannedScript -a target.txt
For example:
The cat tool performs different actions depending on which options are given. I want my script to act like this:
:/root> cat --h
Usage: cat [OPTION] [FILE]...
Concatenate FILE(s), or standard input, to standard output.
-A, --show-all equivalent to -vET
-b, --number-nonblank number nonblank output lines
-e equivalent to -vE
-E, --show-ends display $ at end of each line
-n, --number number all output lines
-r, --reversible use \ to make the output reversible, implies -v
-s, --squeeze-blank never more than one single blank line
-t equivalent to -vT
-T, --show-tabs display TAB characters as ^I
query.sh
#!/bin/bash
if [ $# -eq 0 ]
then echo do one thing
else echo do other thing
fi
"query.sh" => do one thing
"query.sh anythingYouPut" => do other thing ;oP
But if you really want a parameter for each action:
#!/bin/bash
if [ -z "$1" ]
then
echo do nothing
else
if [ "$1" -eq 1 ]
then
echo do one thing
fi
if [ "$1" -eq 2 ]
then
echo do other thing
fi
fi
"query.sh" => do nothing
"query.sh 1" => do one thing
"query.sh 2" => do other thing
I wrote a zsh function to help me do some grepping at my job.
function rgrep (){
if [ -n "$1" ] && [ -n "$2" ]
then
exec grep -rnw $1 -r $2
elif [ -n "$1" ]
then
exec grep -rnw $1 -r "./"
else
echo "please enter one or two args"
fi
}
Works great; however, when grep finishes executing I don't get thrown back into the shell. It just hangs at [process complete]. Any ideas?
I have the function in my .zshrc
In addition to getting rid of the unnecessary exec, you can remove the if statement as well.
function rgrep (){
grep -rwn "${1:?please enter one or two args}" -r "${2:-./}"
}
If $1 is not set (or null valued), an error will be raised and the given message displayed. If $2 is not set, a default value of ./ will be used in its place.
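To illustrate (hypothetical invocations):
rgrep                # ${1:?} aborts with the "please enter one or two args" message
rgrep TODO           # ${2:-./} supplies ./ as the default search directory
rgrep TODO src/      # searches src/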
Do not use exec, as it replaces the existing shell process.
exec [-cl] [-a name] [command [arguments]]
If command is supplied, it replaces the shell without creating a new process. If the -l option is supplied, the shell places a dash at the beginning of the zeroth argument passed to command. This is what the login program does. The -c option causes command to be executed with an empty environment. If -a is supplied, the shell passes name as the zeroth argument to command. If no command is specified, redirections may be used to affect the current shell environment. If there are no redirection errors, the return status is zero; otherwise the return status is non-zero.
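A quick way to see that replacement behavior for yourself (a throwaway demonstration, not from the original answer):
bash -c 'exec echo replaced; echo "never reached"'
This prints only replaced: exec replaces the shell with echo, so the second command never runs. The same thing happened to the interactive shell that ran your rgrep function.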
Try this instead:
rgrep ()
{
if [ -n "$1" ] && [ -n "$2" ]
then
grep -rnw "$1" -r "$2"
elif [ -n "$1" ]
then
grep -rnw "$1" -r "./"
else
echo "please enter one or two args"
fi
}
As a completely different approach, I like to build command shortcuts like this as minimal shell scripts, rather than functions (or aliases):
% echo 'grep -rwn "$@"' >rgrep
% chmod +x rgrep
% ./rgrep
Usage: grep [OPTION]... PATTERN [FILE]...
Try `grep --help' for more information.
%
(This relies on a traditional behavior of Unix: executable text files without #! lines are considered shell scripts and are executed by /bin/sh. If that doesn't work on your system, or you need to run specifically under zsh, use an appropriate #! line.)
One of the main benefits of this approach is that shell scripts in a directory in your PATH are full citizens of the environment, not local to the current shell like functions and aliases. This means they can be used in situations where only executable files are viable commands, such as xargs, sudo, or remote invocation via ssh.
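For example, assuming the rgrep script sits in a directory on your PATH, all of these work, whereas a shell function would not:
find . -name '*.log' -print0 | xargs -0 rgrep ERROR
sudo rgrep secret /etc
ssh somehost rgrep TODO src/   # assumes rgrep is also installed on the remote host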
This doesn't provide the ability to give default arguments (or not easily, anyway), but IMAO the benefits outweigh the drawbacks. (And in the specific case of defaulting grep to search PWD recursively, the real solution is to install ack.)