Positional parameters for shell not working as expected - linux

I am trying to learn bash, and some very basic commands are not working as I expect. I am following this reference: http://www.tutorialspoint.com/unix/unix-special-variables.htm
Script:
#!/bin/bash
name="john"
other="shawn"
echo $name
echo $other
echo $1
echo $2
echo $@
echo $#
Output:
$ new
john
shawn
0
$

$1, $2, etc. and $# have special meanings in bash scripts. The numbered parameters refer to the arguments passed to the script, and $# holds their count. So if you have a script in a file called foo.sh like:
#!/bin/bash
echo "Number of arguments: $#";
echo "First argument: $1";
echo "Second argument: $2";
If you chmod +x foo.sh and then run:
./foo.sh first second
You will see:
Number of arguments: 2
First argument: first
Second argument: second
$1 refers to the first command line argument passed to the script. The script is foo.sh, so anything after the script name will become a command line argument.
The default command line argument separator is the "space", so when you type ./foo.sh first second, bash stores first into $1 and second into $2.
If you typed:
./foo.sh first second third FOURTH fifth
bash would store third in the variable $3, FOURTH in the variable $4, and so on.
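Quoting keeps a space from acting as a separator, so an argument that contains spaces still lands in a single positional parameter. A small sketch (the file name demo.sh is just an example):

```shell
#!/bin/bash
# demo.sh - print the first argument and the argument count
echo "First: $1"
echo "Count: $#"
```

Calling ./demo.sh "first second" third prints First: first second and Count: 2, because the quotes keep "first second" together as $1.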

Is your script named 'new'? In that case, run it as follows, one invocation at a time, and you will get an idea of how this works:
./new
./new a
./new a b

When you ran your script, you did not pass any arguments. The number of arguments passed to the script is shown by echo $#, and your output shows that command printing a count of 0. Pass arguments when you call your script, like below:
./new argument1 argument2
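For instance, a stripped-down version of the script (saved as new, matching the question, but only a sketch, not the asker's exact file) fills in $1, $2, and $# once arguments are supplied:

```shell
#!/bin/bash
# new - echo the first two positional parameters and the argument count
echo "$1"
echo "$2"
echo "$#"
```

Running ./new argument1 argument2 now prints argument1, argument2, and 2 instead of blank lines and 0.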

Related

Bash discards command line arguments when passing to another bash shell

I have a big script (call it test) that, after stripping out the unrelated parts, comes down to just this, which I can use to explain my question:
#!/bin/bash
bash -c "$@"
This doesn't work as expected. E.g. ./test echo hi executes only the echo, and the argument disappears!
Testing with various inputs I can see only $1 is passed to bash -c ... and rest are discarded.
But if I use a variable like:
#!/bin/bash
cmd="$@"
bash -c "$cmd"
it works as expected for all inputs.
Questions:
1) I would like to understand why the double quotes don't "pass" the entire command line arguments to bash -c .... What am I missing here (that it works perfectly fine when using an intermediate variable)?
2) Why does bash discard the rest of the arguments (except $1) without any error messages?
For example:
bash -c "ls" -l -a hi hello blah
simply runs ls, and -l -a hi hello blah doesn't result in any errors at all?
(If possible, please refer to the bash grammar where this behaviour is documented).
1) I would like to understand why the double quotes don't "pass" the entire command line arguments to bash -c .... What am I missing here (that it works perfectly fine when using an intermediate variable)?
From info bash, @:
@
($@) Expands to the positional parameters, starting from one. When the expansion occurs within double quotes, each parameter expands to a separate word. That is, "$@" is equivalent to "$1" "$2" ....
Thus, bash -c "$@" is equivalent to bash -c "$1" "$2" .... In the case of the ./test echo hi invocation, the expression expands to
bash -c "echo" "hi"
2) Why does bash discard the rest of the arguments (except $1) without any error messages?
Bash actually doesn't discard anything. From man bash:
If the -c option is present, then commands are read from the first non-option argument command_string. If there are arguments after the command_string, they are assigned to the positional parameters, starting with $0.
Thus, for the command bash -c "echo" "hi", Bash passes "hi" as $0 for the "echo" script.
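You can verify the $0 assignment directly from an interactive shell:

```shell
# Everything after the command string becomes a positional parameter
# of the -c script, starting at $0 rather than $1
bash -c 'echo "zeroth=$0 first=$1"' hi there
# prints: zeroth=hi first=there
```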
bash -c "ls" -l -a hi hello blah
simply runs ls, and -l -a hi hello blah doesn't result in any errors at all?
According to the rules mentioned above, Bash executes the "ls" script and passes the following positional parameters to it:
$0: "-l"
$1: "-a"
$2: "hi"
$3: "hello"
$4: "blah"
Thus, the command actually executes ls, and the positional parameters are simply unused by the script. You can use them by referencing the positional parameters, e.g.:
$ set -x
$ bash -c "ls \$0 \$1 \$3" -l -a hi hello blah
+ bash -c 'ls $0 $1 $3' -l -a hi hello blah
ls: cannot access hello: No such file or directory
You should be using $* instead of $@ to pass the command line as a string. "$@" expands to multiple quoted arguments, whereas "$*" combines all the arguments into a single argument.
#!/bin/bash
bash -c "$*"
The problem is that with your $@ it executes:
bash -c echo hi
But with $* it executes:
bash -c 'echo hi'
When you use:
cmd="$@"
and use: bash -c "$cmd" it does the same thing for you.
Read: What is the difference between "$@" and "$*" in Bash?
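Putting that together, a minimal wrapper that forwards its whole command line to a new shell as one string would use "$*" (the script name run.sh is assumed here):

```shell
#!/bin/bash
# run.sh - join all arguments into a single string and hand it to bash -c
bash -c "$*"
```

./run.sh echo hi there executes bash -c 'echo hi there' and prints hi there. Note that "$*" joins arguments with spaces, so an argument that itself contains a space is not preserved as a single word.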

About egrep command

How can I create a bash script that admits a file as a command line argument and prints on screen all lines that have a length of more than 12 characters using egrep command?
You can use:
egrep '.{13}'
The . will match any character, and the {13} repeats it exactly 13 times. You can put this in a shell script like:
#!/bin/sh
# Make sure the user actually passed an argument. This is useful
# because otherwise grep will try to read from stdin and hang forever
if [ -z "$1" ]; then
    echo "Filename needed"
    exit 1
fi
egrep '.{13}' "$1"
The $1 refers to the first command line argument. You can also use $2, $3, etc., and "$@" refers to all command line arguments (useful if you want to run it over multiple files):
egrep '.{13}' "$@"
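A quick way to check the pattern itself (the sample file lines.txt is made up for the demonstration):

```shell
# Create a sample file with one short and one long line
printf 'short\na much longer line here\n' > lines.txt
# Only lines with 13 or more characters match
egrep '.{13}' lines.txt
# prints: a much longer line here
```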

What is the best way to evaluate two variables representing a single pipeline command in bash?

I have a function produce which determines whether a file is present, and if not, runs the following command. This works fine when the command output simply writes to stdout. However, in the command below I pipe the output to a second command and then to a third command before it outputs to stdout. In this scenario the output is written to the file correctly, but the function does not echo the preceding $@, and the contents of the initial unpopulated outputfile.vcf (which contains the header columns and is generated by the pipeline command on execution) are also written to stdout. Is there a more appropriate way to evaluate $@ > "${curfile}"?
produce() {
    local curfile=$1
    # Remove the first element of the list of passed arguments
    shift
    if [ ! -e "${curfile}" ]; then
        # Run the subsequent command as shown in the list of passed arguments
        echo $@
        $@ > "${curfile}"
    fi
}
produce outputfile.vcf samtools view -bq 20 input.bam | samtools mpileup -Egu -t DP,SP -f hs37d5formatted.fa -| bcftools call -cNv -
OK, as I mentioned in my comment, the issue seems to relate to the pipe characters, so I had to evaluate the variable using eval and escape the pipe characters. So, in order to ensure the function produce interprets $@ correctly, I fed the command as follows. Note also that the variables are all now quoted:
#######
produce() {
    local curfile="$1"
    # Remove the first element of the list of passed arguments
    shift
    if [ ! -e "${curfile}" ]; then
        # Run the subsequent command as shown in the list of passed arguments
        echo "$@"
        eval "$@ > ${curfile}"
    fi
}
produce outputfile.vcf samtools view -bq 20 input.bam \| samtools mpileup -Egu -t DP,SP -f hs37d5formatted.fa -\| bcftools call -cNv -
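The same pattern can be checked with ordinary commands in place of the samtools/bcftools pipeline. This is only a sketch of the escaping idea, with a throwaway output file name:

```shell
# Same shape as the produce function above, demonstrated with simple commands
produce() {
    local curfile="$1"
    shift
    if [ ! -e "${curfile}" ]; then
        # eval re-parses its argument, so the escaped \| becomes a real pipe
        eval "$@ > ${curfile}"
    fi
}

rm -f out.txt   # start clean so produce actually runs
# The \| reaches produce as a literal | and only eval turns it into a pipe
produce out.txt echo hello \| tr a-z A-Z
cat out.txt
# prints: HELLO
```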
You can use >> to append to a file. For example:
echo "line 1" >> filename
echo "line 2" >> filename
Will result in a file containing:
line 1
line 2
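By contrast, a single > truncates the file on every redirection, so repeating it keeps only the last line:

```shell
echo "line 1" > filename   # creates/truncates: file holds "line 1"
echo "line 2" > filename   # truncates again: file now holds only "line 2"
```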

read more than one parameter right after the command in bash

I am making a bash script and I want it to be just one line, meaning it will have no interaction with the user; the parameters will be on the same line as the command. Once the user hits return, it will output the result.
Right now, I have something that looks like this:
#! \bin\bash
read $1 $2
do something with $1 and $2
However, if I name my script "test", then when I type test at the beginning of the command line, I still have to press enter for the rest of the script to be executed. How should I modify it so that I can run the entire thing on just one line?
The standard way to pass parameters to a script is not with read (which actively waits for input from stdin), but just to call your script with the parameters on the same line:
./my_script.sh param1 param2
Then inside the script, you can access these parameters using $1, $2, etc. Example (note also the first line - this describes what shell should be used to run the script, and should be a valid path - ie /bin/bash, not backslashes):
#!/bin/bash
echo "First: $1 Second: $2"
Then call the script:
$ ./my_script.sh Hello There
First: Hello Second: There
What you probably need is this:
Your script, named test.sh, contains the following:
#!/bin/bash
echo "$1 $2"
Then, change permissions so that you can execute the script on the command line:
chmod u+x test.sh
and run the script with arguments (two in this case):
./test.sh tik tak
will return
tik tak

Pass command line arguments to another command in bash

I have a program which takes some command line arguments.
It's executed like this:
user#home $ node server.js SOME ARGUMENTS -p PORT -t THEME OTHER STUFF
Now I want to make a executable launcher for it:
#!/bin/bash
node server.js ARGUMENTS GIVEN FROM COMMAND LINE
How to do this?
I want to pass the same arguments to the command (without knowing how many and what they will be).
Is there a way?
Use the $@ variable in double quotes:
#!/bin/bash
node server.js "$@"
This will provide all the arguments passed on the command line and protect spaces that could appear in them. If you don't use quotes, an argument like "foo bar" (that is, a single argument whose string value has a space in it) passed to your script will be passed to node as two arguments. From the documentation:
When the expansion occurs within double quotes, each parameter expands to a separate word. That is, "$@" is equivalent to "$1" "$2" ....
In light of the other answer, edited to add: I fail to see why you'd want to use $* at all. Execute the following script:
#!/bin/bash
echo "Quoted:"
for arg in "$*"; do
    echo $arg
done
echo "Unquoted:"
for arg in $*; do
    echo $arg
done
With the following (assuming your file is saved as script and is executable):
$ script "a b" c
You'll get:
Quoted:
a b c
Unquoted:
a
b
c
Your user meant to pass two arguments: a b, and c. The "Quoted" case processes it as one argument: a b c. The "Unquoted" case processes it as 3 arguments: a, b and c. You'll have unhappy users.
Depending on what you want, probably using $@:
#!/bin/bash
node server.js "$@"
(Probably with quotes)
The thing to keep in mind is how arguments with e.g. spaces are handled. In this respect, $* and $@ behave differently:
#!/usr/bin/env bash
echo '$@:'
for arg in "$@"; do
    echo $arg
done
echo '$*:'
for arg in "$*"; do
    echo $arg
done
Output:
$ ./test.sh a "b c"
$@:
a
b c
$*:
a b c
