Why would someone use echo to assign values to variables in bash or ksh?

Recently I came across an unusual use of echo to assign variables in a client's ksh scripts.
For example, there are many instances such as the following
a='something'
b='else'
c=`echo "${a} ${b}"`
I have been unable to come up with any reason why someone may have done this.
Could there be some legacy reason for this?
(I've been doing shell for 30+ years, and never before have I seen this)
Or is it just ignorance?

There is no compelling reason whatsoever for this, either in current bash, or its POSIX sh or Bourne predecessors.
c="$a $b"
...is a complete replacement for the code given, and runs far faster (try putting it in a loop; command substitutions, as created by backticks, fork off a new shell as a subprocess and read its stdout -- a high-overhead operation).
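The difference is easy to check for yourself; this sketch just shows that both forms yield the same string, with only the backtick form paying for a fork:

```shell
a='something'
b='else'

# Plain assignment: no fork, no subshell.
c="$a $b"

# Command substitution: forks a subshell and reads its stdout back.
d=`echo "$a $b"`

echo "$c"
echo "$d"
```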

What you saw is an example of a pointless use of echo, since c can be assigned directly:
c="$a $b"
A common use of echo is when you need commands to filter output, for example:
$ line='100090 $100,00 Mary'
$ name=`echo "$line" | grep -Eo "[a-zA-Z]+$"`
$ echo $name
Mary
(Note the single quotes around the string: inside double quotes, the $1 in $100,00 would be expanded by the shell.)
But it would be more efficient if you didn't use echo at all. The same thing as above can be done with read, without creating a new process:
$ line='100090 $100,00 Paul'
$ read -r _ _ name _ <<<"$line"
$ echo $name
Paul

Related

Bash interprets && as an argument

Observe the following
[admin@myVM ~]$ test="echo hello && echo world"
[admin@myVM ~]$ $test
hello && echo world
[admin@myVM ~]$ echo hello && echo world
hello
world
It seems that if you place && into a bash variable and execute it as a command, bash interprets && as a string argument. Does anyone know how to alter this behavior? That is to say, does anyone know how to tell bash “I want to execute double ampersand”.
Variables are not meant to store scripting. Store code in functions, not variables.
$ test() { echo hello && echo world; }
$ test
hello
world
The reason $test doesn't work is because unquoted variable expansions are only subjected to two of bash's many phases of parsing:
Word splitting
Globbing
That means it will split apart $test into separate words, which is why it recognizes the echo command. It will also expand wildcards AKA globs. Those are the only two bits of parsing it does, though. It doesn't look for symbols like &&, |, or ;.
I don't recommend it, but you can get bash to execute your string with bash -c "$test" or eval "$test". These are bad habits that you should avoid unless absolutely necessary. Functions are much better than hacks with eval, and they don't open up potential security holes by allowing arbitrary code execution.
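As a quick sketch of the parsing difference described above: the string form leaves && as a literal word, while a function parses it as the and-operator:

```shell
# The stored string undergoes word splitting and globbing only,
# so && is passed to echo as a literal argument:
cmd='echo hello && echo world'
out=$($cmd)
echo "$out"

# A function parses && as a control operator, as intended:
greet() { echo hello && echo world; }
greet
```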

Shell: Write for ... in arguments into a variable

for i in "a" "a b"; do
echo $i;
done
echoes:
a
a b
How can I write something like for i in $input; do and assign "a" "a b" to input? The whitespace is important. Otherwise $(echo ...) would work.
Edit: The question is not about files, and not about input that can be caught using "$@".
Since you're using bash, you could do this:
input=("a" "a b")
for i in "${input[@]}"; do
echo $i
done
This can only be done portably with the "$@" construct for command-line arguments.
However, if you don't need the actual command line arguments anymore, you can use set to replace their contents:
input='"a" "a b"'
eval set fnord $input
shift
for i in "$@"; do
echo $i
done
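If eval makes you uneasy, here is a sketch of the same positional-parameter trick without it: set -- replaces the arguments directly, with no re-parsing of a quoted string (it still clobbers the current arguments, of course):

```shell
# Replace the positional parameters with the desired words directly.
set -- "a" "a b"
for i in "$@"; do
  printf '%s\n' "$i"
done
```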
You should be aware that merely having asked this question suggests that you are approaching the complexity level where you should switch to a less limited scripting language (Perl and Python are the usual choices).

Why doesn't this bash code work?

x="a=b"
`echo $x`
echo $a
I expect the second line to generate "a=b", and execute it in the context of the main shell, resulting in a new variable a with value b.
However, what I really get (if I enter the commands manually) is the error message after the second line, bash: a=b: command not found
Why is that so?
Try
eval $x
What your first echo line does is run in a subshell and return its value to the caller. The same result is achieved using $(), which is, by the way, easier to use than backticks.
So, what you are doing is first running echo $x (which returns a=b). And, because of the backticks, a=b is returned to the shell that tries to run that line as a command which - obviously - won't work.
Try this in a shell:
$(echo ls)
And you will clearly see what is happening.
It's because of the order in which bash parses the command line. It looks for variable definitions (e.g. a=b) before performing variable and command substitution (e.g. commands in backticks). Because of this, by the time echo $x is replaced by a=b, it's too late for bash to see this as a variable definition and it's parsed as a command instead. The same thing would've happened if you'd just used $x as the command (instead of echo in backticks). As in @mvds's answer, the eval command can be used to force the command to be reparsed from the beginning, meaning that it'll be recognized as a variable definition:
$ x="a=b"
$ `echo $x`
-bash: a=b: command not found
$ $(echo $x) # Exact same thing, but with cleaner syntax
-bash: a=b: command not found
$ $x # This also does the same thing, but without some extra steps
-bash: a=b: command not found
$ eval "$x" # This will actually work
$ echo $a
b
$ a= # Start over
$ eval "$(echo "$x")" # Another way of doing the same thing, with extra steps
$ echo $a
b
Note that when using eval I've put all of the references to $x in double-quotes -- this is to prevent the later phases of bash parsing (e.g. word splitting) from happening twice, since bash will finish its regular parsing process, then recognize the eval command, and then redo the entire parsing process again. It's really easy to get unexpected results from using eval, and this removes at least some of the potential for trouble.
Did you try $x in those funny apostrophes (backticks), without the echo? The backticks execute their contents as a command; echo only displays a string, it doesn't execute commands.

How to detect using of wildcard (asterisk *) as parameter for shell script?

In my script, how can I distinguish when the asterisk wildcard character was used instead of strongly typed parameters?
This
# myscript *
from this
# myscript p1 p2 p3 ... (where parameters are unknown number)
The shell expands the wildcard. By the time a script is run, the wildcard has been expanded, and there is no way a script can tell whether the arguments were a wildcard or an explicit list.
Which means that your script will need help from something else which is not a script. Specifically, something which is run before command-line processing. That something is an alias. This is your alias
alias myscript='set -f; globstopper /usr/bin/myscript'
What this does is set up an alias called 'myscript', so when someone types 'myscript', this is what gets run. The alias does two things: firstly, it turns off wildcard expansion with set -f, then it runs a function called globstopper, passing in the path to your script, and the rest of the command-line arguments.
So what's the globstopper function? This:
globstopper() {
if [[ "$2" == "*" ]]
then echo "You cannot use a wildcard"
return
fi
set +f
"$@";
}
This function does three things. Firstly, it checks to see if the argument to the script is a wildcard (caveat: it only checks the first argument, and it only checks to see if it's a simple star; extending this to cover more cases is left as an exercise to the reader). Secondly, it switches wildcard expansion back on. Lastly, it runs the original command.
For this to work, you do need to be able to set up the alias and the shell function in the user's shell, and require your users to use the alias, not the script. But if you can do that, it ought to work.
I should add that i am leaning heavily on the resplendent Simon Tatham's essay 'Magic Aliases: A Layering Loophole in the Bourne Shell' here.
I had a similar question, but rather than detecting when the user called the script using a wildcard, I simply wanted to prevent the use of the wildcard, and pass the string pre-expansion.
Tom's solution is great if you want to detect, but I'd rather prevent. In other words, if I had a script called findin that looked like
#!/bin/bash
echo "[${1}]"
and ran it using:
$ findin *
I would expect the output to be simply
[*]
To do this, you could just alias findin by
alias findin='set -f; /path/to/findin'
But then you would have the shell option set for the rest of your session. This will likely break many programs that don't expect this (e.g. ls -lh *.py). You could verify this by typing
echo $-
in console. If you see an f, that option is set.
You could manually clear the option by typing
set +f
after every instance of findin, but that would get tedious and annoying.
Since shell scripts spawn subshells and you cannot clear the flag from within the script (set +f), the solution I came up with was the following:
g(){ /usr/local/bin/findin "$@"; set +f; }
alias findin='set -f; g'
Note: 'g' might not be the best name for the function, so you'd be encouraged to change it.
Finally, you could generalize this by doing something like:
reset_expansion(){ CMD="$1"; shift; $CMD "$@"; set +f; }
alias findin='set -f; reset_expansion /usr/local/bin/findin'
That way another script where you would want expansion disabled would only require an additional alias, e.g.
alias newscript='set -f; reset_expansion /usr/local/bin/newscript'
and not an additional wrapper function.
For a much longer than necessary writeup, see my post here.
You can't.
It is one of the strengths (or, in some eyes, weaknesses) of Unix.
See the diatribe(s) in "The UNIX-HATERS Handbook".
$arg_num == *; # detects * (literally anything, since * is a glob wildcard)
$arg_num == *_*; # detects _
here is an example of it working with _
for i in $*
do
if [[ "$i" == *_* ]];
then echo $i;
fi
done
output of ./bash test * test2 _
_
output of ./bash test * test2 with ********* rather than ****
test
bash
pass.rtf
test2
_
NOTE: the * is so global in bash that it printed out files matching that description, or in my case the files on my oh-so-unused desktop. I wish I could give you a better answer, but the best choice is to use something other than * or another scripting language.
Addendum
I found this post while looking for a workaround for my command line calculator:
alias c='set -f; call_clc'
where "call_clc" is the function: "function call_clc { clc "$*"; set +f; }"
and "clc" is the script
#!/bin/bash
echo "$*" | sed -e 's/ //g' >&1 | tee /dev/tty | bc
I need 'set -f' and 'set +f' in order to make inputs such as 'c 4 * 3' work,
i.e. an asterisk with whitespace before and after,
without bash globbing it.
Update: the previous variant 'alias c='set -f; clc "$*"; set+f;'' did not work:
for some reason the correct result was only given after invoking the command 'c 4 * 4' twice.
Anyone have an idea why this is so?
If this is something you feel you must do, perhaps:
# if the number of parms is not the same as the number of files in cwd
# then user did not use *
dir_contents=(*)
if [[ "$#" -ne "${#dir_contents[@]}" ]]; then
used_star=false
else
# if one of the params is not a file in cwd
# then user did not use *
used_star=true
for f; do [[ ! -a "$f" ]] && { used_star=false; break; }; done
fi
unset dir_contents
$used_star && echo "used star" || echo "did not use star"
Pedantically, this will echo "used star" if the user actually used an asterisk or if the user manually entered the directory contents in any order.

Read values into a shell variable from a pipe

I am trying to get bash to process data from stdin that gets piped into, but no luck. What I mean is none of the following work:
echo "hello world" | test=($(< /dev/stdin)); echo test=$test
test=
echo "hello world" | read test; echo test=$test
test=
echo "hello world" | test=`cat`; echo test=$test
test=
where I want the output to be test=hello world. I've tried putting "" quotes around "$test" that doesn't work either.
Use
IFS= read var << EOF
$(foo)
EOF
You can trick read into accepting from a pipe like this:
echo "hello world" | { read test; echo test=$test; }
or even write a function like this:
read_from_pipe() { read "$@" <&0; }
But there's no point - your variable assignments may not last! A pipeline may spawn a subshell, where the environment is inherited by value, not by reference. This is why read doesn't bother with input from a pipe - it's undefined.
FYI, http://www.etalabs.net/sh_tricks.html is a nifty collection of the cruft necessary to fight the oddities and incompatibilities of Bourne shells (sh).
if you want to read in lots of data and work on each line separately you could use something like this:
cat myFile | while read x ; do echo $x ; done
if you want to split the lines up into multiple words you can use multiple variables in place of x like this:
cat myFile | while read x y ; do echo $y $x ; done
alternatively:
while read x y ; do echo $y $x ; done < myFile
But as soon as you start to want to do anything really clever with this sort of thing you're better going for some scripting language like perl where you could try something like this:
perl -ane 'print "$F[0]\n"' < myFile
There's a fairly steep learning curve with perl (or I guess any of these languages) but you'll find it a lot easier in the long run if you want to do anything but the simplest of scripts. I'd recommend the Perl Cookbook and, of course, The Perl Programming Language by Larry Wall et al.
This is another option
$ read test < <(echo hello world)
$ echo $test
hello world
read won't read from a pipe (or possibly the result is lost because the pipe creates a subshell). You can, however, use a here string in Bash:
$ read a b c <<< $(echo 1 2 3)
$ echo $a $b $c
1 2 3
But see @chepner's answer for information about lastpipe.
I'm no expert in Bash, but I wonder why this hasn't been proposed:
stdin=$(cat)
echo "$stdin"
One-liner proof that it works for me:
$ fortune | eval 'stdin=$(cat); echo "$stdin"'
bash 4.2 introduces the lastpipe option, which allows your code to work as written, by executing the last command in a pipeline in the current shell, rather than a subshell.
shopt -s lastpipe
echo "hello world" | read test; echo test=$test
A smart script that can both read data from PIPE and command line arguments:
#!/bin/bash
if [[ -p /dev/stdin ]]
then
PIPE=$(cat -)
echo "PIPE=$PIPE"
fi
echo "ARGS=$@"
Output:
$ bash test arg1 arg2
ARGS=arg1 arg2
$ echo pipe_data1 | bash test arg1 arg2
PIPE=pipe_data1
ARGS=arg1 arg2
Explanation: When a script receives any data via pipe, then the /dev/stdin (or /proc/self/fd/0) will be a symlink to a pipe.
/proc/self/fd/0 -> pipe:[155938]
If not, it will point to the current terminal:
/proc/self/fd/0 -> /dev/pts/5
The bash test [[ -p /dev/stdin ]] checks whether it is a pipe or not.
cat - reads from stdin.
If we use cat - when there is no stdin, it will wait forever, that is why we put it inside the if condition.
The syntax for an implicit pipe from a shell command into a bash variable is
var=$(command)
or
var=`command`
In your examples, you are piping data to an assignment statement, which does not expect any input.
In my eyes the best way to read from stdin in bash is the following one, which also lets you work on the lines before the input ends:
while read LINE; do
echo $LINE
done < /dev/stdin
The first attempt was pretty close. This variation should work:
echo "hello world" | { test=$(< /dev/stdin); echo "test=$test"; };
and the output is:
test=hello world
You need braces after the pipe to enclose the assignment to test and the echo.
Without the braces, the assignment to test (after the pipe) is in one shell, and the echo "test=$test" is in a separate shell which doesn't know about that assignment. That's why you were getting "test=" in the output instead of "test=hello world".
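A minimal sketch of the difference (in bash, with its default of running every pipeline element in a subshell):

```shell
# Without braces: read runs in its own subshell and the variable is lost.
echo "hello world" | read test
echo "without braces: test=$test"

# With braces: read and echo are grouped into the same subshell,
# so echo sees the assignment.
echo "hello world" | { read test; echo "with braces: test=$test"; }
```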
Because I fall for it, I would like to drop a note.
I found this thread, because I have to rewrite an old sh script
to be POSIX compatible.
This basically means to circumvent the pipe/subshell problem introduced by POSIX by rewriting code like this:
some_command | read a b c
into:
read a b c << EOF
$(some_command)
EOF
And code like this:
some_command |
while read a b c; do
# something
done
into:
while read a b c; do
# something
done << EOF
$(some_command)
EOF
But the latter does not behave the same on empty input.
With the old notation the while loop is not entered on empty input,
but in POSIX notation it is!
I think it's due to the newline before EOF,
which cannot be omitted.
The POSIX code which behaves more like the old notation
looks like this:
while read a b c; do
case $a in ("") break; esac
# something
done << EOF
$(some_command)
EOF
In most cases this should be good enough.
But unfortunately this still does not behave exactly like the old notation
if some_command prints an empty line.
In the old notation the while body is executed
and in POSIX notation we break in front of the body.
An approach to fix this might look like this:
while read a b c; do
case $a in ("something_guaranteed_not_to_be_printed_by_some_command") break; esac
# something
done << EOF
$(some_command)
echo "something_guaranteed_not_to_be_printed_by_some_command"
EOF
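The empty-input behavior described above is easy to verify; this sketch counts loop iterations when the substituted command prints nothing:

```shell
# The here-document still contains one empty line after substitution,
# so the loop body runs once even though the command printed nothing.
count=0
while read a b c; do
  count=$((count + 1))
done << EOF
$(true)
EOF
echo "iterations: $count"
```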
Piping something into an expression involving an assignment doesn't behave like that.
Instead, try:
test=$(echo "hello world"); echo test=$test
The following code:
echo "hello world" | ( test=$(< /dev/stdin); echo test=$test )
will work too, but it will open another new sub-shell after the pipe, where
echo "hello world" | { test=$(< /dev/stdin); echo test=$test; }
won't.
I had to disable job control to make use of @chepner's method (I was running this command from a terminal):
set +m;shopt -s lastpipe
echo "hello world" | read test; echo test=$test
echo "hello world" | test="$(</dev/stdin)"; echo test=$test
Bash Manual says:
lastpipe
If set, and job control is not active, the shell runs the last command
of a pipeline not executed in the background in the current shell
environment.
Note: job control is turned off by default in a non-interactive shell and thus you don't need the set +m inside a script.
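Put together as a runnable sketch (bash 4.2+; in a script, job control is already off, so no set +m is needed):

```shell
#!/bin/bash
# With lastpipe on and job control off, read runs in the current shell,
# so the assignment survives the pipeline.
shopt -s lastpipe
echo "hello world" | read -r test
echo "test=$test"
```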
I think you were trying to write a shell script which could take input from stdin,
but while you were trying to do it inline, you got lost trying to create that test= variable.
I think it does not make much sense to do it inline, and that's why it does not work the way you expect.
I was trying to reduce
$( ... | head -n $X | tail -n 1 )
to get a specific line from various input.
so I could type...
cat program_file.c | line 34
so I need a small shell program able to read from stdin. like you do.
22:14 ~ $ cat ~/bin/line
#!/bin/sh
if [ $# -ne 1 ]; then echo enter a line number to display; exit; fi
cat | head -n $1 | tail -n 1
22:16 ~ $
there you go.
The question is how to catch output from a command to save in variable(s) for use later in a script. I might repeat some earlier answers, but I try to line up all the answers I can think of to compare and comment, so bear with me.
The intuitive construct
echo test | read x
echo x=$x
is valid in Korn shell, because ksh has implemented it so that the last command in a piped series is part of the current shell, i.e. the previous pipe commands are subshells. In contrast, other shells run all piped commands as subshells, including the last.
This is the exact reason I prefer ksh.
But having to cope with other shells, bash for example, another construct must be used.
To catch 1 value this construct is viable:
x=$(echo test)
echo x=$x
But that only caters for 1 value to be collected for later use.
To catch more values this construct is useful and works in bash and ksh:
read x y <<< $(echo test again)
echo x=$x y=$y
There is a variant which I have noticed work in bash but not in ksh:
read x y < <(echo test again)
echo x=$x y=$y
The <<< $(...) is a here-document variant (a here-string) which gives all the meta handling of a standard command line. < <(...) is an input redirection from a process substitution.
I use "<<< $(" in all my scripts now because it seems the most portable construct between shell variants. I have a tools set I carry around on jobs in any Unix flavor.
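Both constructs can be sketched side by side (the second is bash-specific):

```shell
# Here-string: feeds the command's output to read in the current shell.
read x y <<< "$(echo test again)"
echo "x=$x y=$y"

# Process substitution with input redirection: same effect in bash.
read p q < <(echo test again)
echo "p=$p q=$q"
```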
Of course there is the universally viable but crude solution:
command-1 | { command-2; echo "x=test; y=again" > file.tmp; chmod 700 file.tmp; }
. ./file.tmp
rm file.tmp
echo x=$x y=$y
I wanted something similar - a function that parses a string that can be passed as a parameter or piped.
I came up with a solution as below (works as #!/bin/sh and as #!/bin/bash)
#!/bin/sh
set -eu
my_func() {
local content=""
# if the first param is an empty string or is not set
if [ -z "${1+x}" ]; then
# read content from a pipe if passed or from a user input if not passed
while read line; do content="${content}$line"; done < /dev/stdin
# first param was set (it may be an empty string)
else
content="$1"
fi
echo "Content: '$content'";
}
printf "0. $(my_func "")\n"
printf "1. $(my_func "one")\n"
printf "2. $(echo "two" | my_func)\n"
printf "3. $(my_func)\n"
printf "End\n"
Outputs:
0. Content: ''
1. Content: 'one'
2. Content: 'two'
typed text
3. Content: 'typed text'
End
For the last case (3.) you need to type, hit enter and CTRL+D to end the input.
How about this:
echo "hello world" | echo test=$(cat)
