Bash doesn't replace $ [duplicate]

This question already has answers here:
Difference between single and double quotes in Bash
(7 answers)
Closed 1 year ago.
I want to write a bash script which creates other bash scripts. But when I do
echo "export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:~/SOME/PATH" >> NEWFILE.sh
it already replaces $LD_LIBRARY_PATH in the first script.
So in NEWFILE.sh I only get:
export LD_LIBRARY_PATH=:~/SOME/PATH
But I want NEWFILE.sh to still contain:
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:~/SOME/PATH
So it only gets replaced when NEWFILE.sh is run. I hope this makes sense. Thank you for your help!

If you don't want the variable interpolated, use single quotes:
echo 'export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:~/SOME/PATH' >> NEWFILE.sh
But even this small snippet has issues. When NEWFILE.sh runs with an empty LD_LIBRARY_PATH, it will create an invalid value that has a leading :. Also, it is bad practice to use ~ in variables. Instead, you should do:
echo 'export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}${LD_LIBRARY_PATH:+:}$HOME/SOME/PATH' >> NEWFILE.sh
Also, this snippet makes NEWFILE.sh not idempotent, and you may wind up with multiple instances of $HOME/SOME/PATH in the final value. This is easy enough to avoid, but takes a bit more logic. Something like:
cat << \EOF >> NEWFILE.sh
pathmunge () {
    case ":${LD_LIBRARY_PATH}:" in
        *:"$1":*) ;;
        *) LD_LIBRARY_PATH="$LD_LIBRARY_PATH${LD_LIBRARY_PATH:+:}$1";;
    esac
}
pathmunge "$HOME/SOME/PATH"
export LD_LIBRARY_PATH
EOF
This uses the common technique of quoting the heredoc delimiter (the backslash before EOF is essential; see how the behavior changes when you remove it) to prevent interpolation, and it allows multi-line output and quotes in the content that is going to be written to NEWFILE.sh.
One small point is that this will wind up putting the expansion of $HOME in LD_LIBRARY_PATH, which is probably what you want. If you really do want $HOME in the path (so that it is expanded when LD_LIBRARY_PATH is used, rather than when it is defined), you could do pathmunge '$HOME/path', but you may wind up with duplicate instances in the final value, since pathmunge will not recognize the unexpanded value as matching the expanded value. Avoiding that duplication is an exercise left for the reader.
Note that, depending on the remaining content of NEWFILE.sh, you may want to ensure that pathmunge is only defined once, and that this definition is not overriding some other definition. pathmunge is a common name for a function used for modifying PATH so you may want to consider a different name, or adding logic to allow it to take the name of the variable to be overridden.
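For illustration, here is a minimal sketch of that last idea, assuming bash 4.3+ so that a nameref (local -n) is available; the name varmunge and its argument order are made up for this example:
varmunge () {
    # $1 = name of the variable to modify, $2 = directory to append.
    # _list is an intentionally unlikely local name, to reduce the chance of a circular nameref.
    local -n _list=$1
    case ":${_list}:" in
        *:"$2":*) ;;                           # already present: do nothing
        *) _list="${_list}${_list:+:}$2";;     # append, adding ":" only if non-empty
    esac
}
varmunge LD_LIBRARY_PATH "$HOME/SOME/PATH"
varmunge PATH "$HOME/bin"
export LD_LIBRARY_PATH PATH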

You should escape the $ as follows:
echo "export LD_LIBRARY_PATH=\$LD_LIBRARY_PATH:~/SOME/PATH" >> NEWFILE.sh

Related

Indirect expansion returns variable name instead of value

I am trying to set up some variables using indirect expansion. According to the documentation I've read, the setup should be simple:
var1=qa
qa_num=12345
varname="${var1}_ci"
echo ${!varname}
I should be getting "12345". Instead, the output is "varname". If I remove the exclamation point, I end up with "qa_ci", not "12345".
This should be a relatively simple solution, so I'm not sure what I'm missing, if anything.
Your code defines qa_num, but the varname assignment references qa_ci. As a result, your echo expands the nonexistent qa_ci and gives an empty result. Changing the varname assignment fixes the problem on my system.
Example: foo.sh:
#!/bin/bash
var1=qa
qa_num=12345
varname="${var1}_num" # <=== not _ci
echo "${!varname}" # I also added "" here as a general good practice
Output:
$ bash foo.sh
12345

"read" command not executing in "while read line" loop [duplicate]

This question already has answers here:
Read user input inside a loop
(6 answers)
Closed 5 years ago.
First post here! I really need help on this one; I looked up the issue on Google, but couldn't find a useful answer. So here's the problem.
I'm having fun coding something like a framework in bash. Everyone can create their own module and add it to the framework. BUT. To know what arguments the script requires, I created an "args.conf" file that must be in every module, which looks something like this:
LHOST;true;The IP the remote payload will connect to.
LPORT;true;The port the remote payload will connect to.
The first column is the argument name, the second defines whether it's required, and the third is the description. Anyway, long story short, the framework is supposed to read the args.conf file line by line and ask the user for a value for every argument. Here's the piece of code:
info "Reading module $name argument list..."
while read line; do
    echo $line > line.tmp
    arg=`cut -d ";" -f 1 line.tmp`
    requ=`cut -d ";" -f 2 line.tmp`
    if [ $requ = "true" ]; then
        echo "[This argument is required]"
    else
        echo "[This argument isn't required, leave a blank space if you don't wan't to use it]"
    fi
    read -p " $arg=" answer
    echo $answer >> arglist.tmp
done < modules/$name/args.conf
tr '\n' ' ' < arglist.tmp > argline.tmp
argline=`cat argline.tmp`
info "Launching module $name..."
cd modules/$name
$interpreter $file $argline
cd ../..
rm arglist.tmp
rm argline.tmp
rm line.tmp
succes "Module $name execution completed."
As you can see, it's supposed to ask the user a value for every argument... But:
1) The read command seems to not be executing. It just skips it, and the argument has no value
2) Despite the fact that the args.conf file contains 3 lines, the loop seems to execute just once. All I see on the screen is "[This argument is required]" a single time, and the module just launches (and crashes because it doesn't have the required arguments...).
Really don't know what to do here... I hope someone has an answer ^^'.
Thanks in advance!
(and sorry for eventual mistakes, I'm french)
Alpha.
As @that other guy pointed out in a comment, the problem is that all of the read commands in the loop are reading from the args.conf file, not from the user. The way I'd handle this is by redirecting the conf file over a different file descriptor than stdin (fd #0); I like to use fd #3 for this:
while read -u3 line; do
...
done 3< modules/$name/args.conf
(Note: if your shell's read command doesn't understand the -u option, use read line <&3 instead.)
There are a number of other things in this script I'd recommend against:
Variable references without double-quotes around them, e.g. echo $line instead of echo "$line", and < modules/$name/args.conf instead of < "modules/$name/args.conf". Unquoted variable references get split into words (if they contain whitespace) and any wildcards that happen to match filenames will get replaced by a list of matching files. This can cause really weird and intermittent bugs. Unfortunately, your use of $argline depends on word splitting to separate multiple arguments; if you're using bash (not a generic POSIX shell) you can use arrays instead; I'll get to that.
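For example, a quick illustration of what unquoted expansion does (the filename here is made up):
f='name with spaces.txt'
ls $f     # ls receives three arguments: "name", "with", "spaces.txt"
ls "$f"   # ls receives one argument: "name with spaces.txt"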
You're using relative file paths everywhere, and cding in the script. This tends to be fragile and confusing, since file paths are different at different places in the script, and any relative paths passed in by the user will become invalid the first time the script cds somewhere else. Worse, you aren't checking for errors when you cd, so if any cd fails for any reason, the entire rest of the script will run in the wrong place and fail bizarrely. You'd be far better off figuring out where your system's root directory is (as an absolute path), then referencing everything from it (e.g. < "$module_root/modules/$name/args.conf").
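For instance, one hedged way to compute such a root, assuming the main framework script itself lives in that root directory (module_root is a name I'm making up here):
module_root="$(cd "$(dirname "$0")" && pwd)" || exit 1
...
done 3< "$module_root/modules/$name/args.conf"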
Actually, you're not checking for errors anywhere. It's generally a good idea, when writing any sort of program, to try to think of what can go wrong and how your program should respond (and also to expect that things you didn't think of will also go wrong). Some people like to use set -e to make their scripts exit if any simple command fails, but this doesn't always do what you'd expect. I prefer to explicitly test the exit status of the commands in my script, with something like:
command1 || {
    echo 'command1 failed!' >&2
    exit 1
}
if command2; then
    echo 'command2 succeeded!' >&2
else
    echo 'command2 failed!' >&2
    exit 1
fi
You're creating temp files in the current directory, which risks random conflicts (with other runs of the script at the same time, any files that happen to have names you're using, etc). It's better to create a temp directory at the beginning, then store everything in it (again, by absolute path):
module_tmp="$(mktemp -dt module-system)" || {
    echo "Error creating temp directory" >&2
    exit 1
}
...
echo "$answer" >> "$module_tmp/arglist.tmp"
(BTW, note that I'm using $() instead of backticks. They're easier to read, and don't have some subtle syntactic oddities that backticks have. I recommend switching.)
Speaking of which, you're overusing temp files; a lot of what you're doing with them can be done just fine with shell variables and built-in shell features. For example, rather than reading lines from the config file, storing them in a temp file, and using cut to split them into fields, you can simply echo to cut:
arg="$(echo "$line" | cut -d ";" -f 1)"
...or better yet, use read's built-in ability to split fields based on whatever IFS is set to:
while IFS=";" read -u3 arg requ description; do
(Note that since the assignment to IFS is a prefix to the read command, it only affects that one command; changing IFS globally can have weird effects, and should be avoided whenever possible.)
Similarly, storing the argument list in a file, converting newlines to spaces into another file, then reading that file... you can skip any or all of these steps. If you're using bash, store the arg list in an array:
arglist=()
while ...
arglist+=("$answer") # or ("$arg=$answer")? Not sure of your syntax.
done ...
"$module_root/modules/$name/$interpreter" "$file" "${arglist[#]}"
(That messy syntax, with the double-quotes, curly braces, square brackets, and at-sign, is the generally correct way to expand an array in bash).
If you can't count on bash extensions like arrays, you can at least do it the old messy way with a plain variable:
arglist=""
while ...
arglist="$arglist $answer" # or "$arglist $arg=$answer"? Not sure of your syntax.
done ...
"$module_root/modules/$name/$interpreter" "$file" $arglist
... but this runs the risk of arguments being word-split and/or expanded to lists of files.
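Putting several of these suggestions together, here's a rough sketch of how the loop might end up looking (this assumes bash, the module_root idea from above, and that $interpreter, $file, and $name are set the way your script already sets them):
arglist=()
while IFS=";" read -u3 arg requ description; do
    if [ "$requ" = "true" ]; then
        echo "[This argument is required]"
    else
        echo "[This argument isn't required, leave it blank to skip it]"
    fi
    read -p " $arg=" answer          # reads from the user, since fd 3 holds the conf file
    arglist+=("$answer")
done 3< "$module_root/modules/$name/args.conf"
"$interpreter" "$module_root/modules/$name/$file" "${arglist[@]}"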

set default values for bash variables only if they were not previously declared

Here's my current process:
var[product]=messaging_app
var[component]=sms
var[version]=1.0.7
var[yum_location]=$product/$component/$deliverable_name
var[deliverable_name]=$product-$component-$version
# iterate on associative array indices
for default_var in "${!var[@]}" ; do
# skip variables that have been previously declared
if [[ -z ${!default_var} ]] ; then
# export each index as a variable, setting value to the value for that index in the array
export "$default_var=${var[$default_var]}"
fi
done
The core functionality I'm looking for is to set a list of default variables that will not overwrite previously declared variables.
The above code does that, but it also creates the issue that these variables can no longer depend on one another. This is because the ordering of the associative array's indices output by "${!var[@]}" is not always the same as the order they were declared in.
Does a simpler solution exist like:
declare --nooverwrite this=that
I haven't been able to find anything akin to that.
Also, I'm aware that this can be done with an if statement. However, using a bunch of if statements would kill the readability of a script with nearly 100 default variables.
From 3.5.3 Shell Parameter Expansion:
${parameter:=word}
If parameter is unset or null, the expansion of word is assigned to parameter. The value of parameter is then substituted. Positional parameters and special parameters may not be assigned to in this way.
So
: ${this:=that}
The : is needed because otherwise the shell would see ${this:=that} as a request to run, as a command, whatever that expanded to.
$ echo "$this"
$ : ${this:=that}
$ echo "$this"
that
$ this=foo
$ echo "$this"
foo
$ : ${this:=that}
$ echo "$this"
foo
You can also do this in the first place you use the variable (instead of on its own line) if that suits things better (but make sure that's clear, because it is easy to mess that up in later edits).
$ echo "$this"
$ echo "${this:=that}"
that
$ echo "$this"
that
Doing this dynamically, however, is less easy and may require eval.
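Applied to the original example, a minimal sketch would simply list the defaults in dependency order, so each one can reference the ones defined before it (variable names taken from the question):
: "${product:=messaging_app}"
: "${component:=sms}"
: "${version:=1.0.7}"
: "${deliverable_name:=$product-$component-$version}"
: "${yum_location:=$product/$component/$deliverable_name}"
export product component version deliverable_name yum_location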

Bash variable defaulting doesn't work if followed by pipe (bash bug?)

I've just discovered a strange behaviour in bash that I don't understand. The expression
${variable:=default}
sets variable to the value default if it isn't already set. Consider the following examples:
#!/bin/bash
file ${foo:=$1}
echo "foo >$foo<"
file ${bar:=$1} | cat
echo "bar >$bar<"
The output is:
$ ./test myfile.txt
myfile.txt: ASCII text
foo >myfile.txt<
myfile.txt: ASCII text
bar ><
You will notice that the variable foo is assigned the value of $1 but the variable bar is not, even though the result of its defaulting is presented to the file command.
If you remove the innocuous pipe into cat from line 4 and re-run it, then both foo and bar get set to the value of $1.
Am I missing something here, or is this potentially a bash bug?
(GNU bash, version 4.3.30)
In the second case, file is a member of a pipeline, and like every pipeline member it runs in its own subshell. When file and its subshell exit, $bar with its new value from $1 no longer exists.
Workaround:
#!/bin/bash
file ${foo:=$1}
echo "foo >$foo<"
: "${bar:=$1}" # Parameter Expansion before subshell
file $bar | cat
echo "bar >$bar<"
It's not a bug. Parameter expansion happens when the command is evaluated, not when it is parsed, but a command that is part of a pipeline is not evaluated until the new process has been started. Changing this, aside from likely breaking some existing code, would require an extra level of expansion before evaluation occurs.
A hypothetical bash session:
> foo=5
> bar='$foo'
> echo "$bar"
$foo
# $bar expands to '$foo' before the subshell is created, but then `$foo` expands to 5
# during the "normal" round of parameter expansion.
> echo "$bar" | cat
5
To avoid that, bash would need some way of marking pieces of text that result from the new first round of pre-evaluation parameter expansion, so that they do not undergo a second
round of evaluation. This type of bookkeeping would quickly lead to unmaintainable code as more corner cases are found to be handled. Far simpler is to just accept that parameter expansions will be deferred until after the subshell starts.
The other alternative is to allow each component to run in the current shell, something that is allowed by the POSIX standard, but is not required, either. bash made the choice long ago to execute each component in a subshell, and reversing that would break too much existing code that relies on the current behavior. (bash 4.2 did introduce the lastpipe option, allowing the last component of a pipeline to execute in the current shell if explicitly enabled.)
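A small illustration of lastpipe, assuming bash 4.2+ in a non-interactive script (job control is off there, which is required for lastpipe to take effect):
#!/bin/bash
shopt -s lastpipe
echo "hello" | read greeting   # read runs in the current shell, not a subshell
echo "greeting >$greeting<"    # prints: greeting >hello<
Note that lastpipe would not help the original example anyway: the ${bar:=$1} expansion happens in the first component of the pipeline, and only the last component gets to run in the current shell.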

Why should eval be avoided in Bash, and what should I use instead?

Time and time again, I see Bash answers on Stack Overflow using eval and the answers get bashed, pun intended, for the use of such an "evil" construct. Why is eval so evil?
If eval can't be used safely, what should I use instead?
There's more to this problem than meets the eye. We'll start with the obvious: eval has the potential to execute "dirty" data. Dirty data is any data that has not been rewritten as safe-for-use-in-situation-XYZ; in our case, it's any string that has not been formatted so as to be safe for evaluation.
Sanitizing data appears easy at first glance. Assuming we're throwing around a list of options, bash already provides a great way to sanitize individual elements, and another way to sanitize the entire array as a single string:
function println
{
# Send each element as a separate argument, starting with the second element.
# Arguments to printf:
# 1 -> "$1\n"
# 2 -> "$2"
# 3 -> "$3"
# 4 -> "$4"
# etc.
printf "$1\n" "${#:2}"
}
function error
{
# Send the first element as one argument, and the rest of the elements as a combined argument.
# Arguments to println:
# 1 -> '\e[31mError (%d): %s\e[m'
# 2 -> "$1"
# 3 -> "${*:2}"
println '\e[31mError (%d): %s\e[m' "$1" "${*:2}"
exit "$1"
}
# This...
error 1234 Something went wrong.
# And this...
error 1234 'Something went wrong.'
# Result in the same output (as long as $IFS has not been modified).
Now say we want to add an option to redirect output as an argument to println. We could, of course, just redirect the output of println on each call, but for the sake of example, we're not going to do that. We'll need to use eval, since variables can't be used to redirect output.
function println
{
eval printf "$2\n" "${@:3}" $1
}
function error
{
println '>&2' '\e[31mError (%d): %s\e[m' "$1" "${*:2}"
exit $1
}
error 1234 Something went wrong.
Looks good, right? The problem is that eval parses the command line twice (in any shell). On the first pass of parsing, one layer of quoting is removed. With the quotes removed, some variable content gets executed.
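A tiny illustration of that second round of parsing (my own example, not from the functions above):
x='$(date)'
echo $x        # prints the literal text: $(date)
eval echo $x   # eval re-parses the expanded line "echo $(date)", so date actually runs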
We can fix this by letting the variable expansion take place within the eval. All we have to do is single-quote everything, leaving the double-quotes where they are. One exception: we have to expand the redirection prior to eval, so that has to stay outside of the quotes:
function println
{
eval 'printf "$2\n" "${@:3}"' $1
}
function error
{
println '>&2' '\e[31mError (%d): %s\e[m' "$1" "${*:2}"
exit $1
}
error 1234 Something went wrong.
This should work. It's also safe as long as $1 in println is never dirty.
Now hold on just a moment: I use that same unquoted syntax that we used originally with sudo all of the time! Why does it work there, and not here? Why did we have to single-quote everything? sudo is a bit more modern: it knows to enclose in quotes each argument that it receives, though that is an over-simplification. eval simply concatenates everything.
Unfortunately, there is no drop-in replacement for eval that treats arguments like sudo does, as eval is a shell built-in; this is important, as it takes on the environment and scope of the surrounding code when it executes, rather than creating a new stack and scope like a function does.
eval Alternatives
Specific use cases often have viable alternatives to eval. Here's a handy list. command represents what you would normally send to eval; substitute in whatever you please.
No-op
A simple colon is a no-op in bash:
:
Create a sub-shell
( command ) # Standard notation
Execute output of a command
Never rely on an external command. You should always be in control of the return value. Put these on their own lines:
$(command) # Preferred
`command` # Old: should be avoided, and often considered deprecated
# Nesting:
$(command1 "$(command2)")
`command "\`command\`"` # Careful: \ only escapes $ and \ with old style, and
# special case \` results in nesting.
Redirection based on variable
In calling code, map &3 (or anything higher than &2) to your target:
exec 3<&0 # Redirect from stdin
exec 3>&1 # Redirect to stdout
exec 3>&2 # Redirect to stderr
exec 3> /dev/null # Don't save output anywhere
exec 3> file.txt # Redirect to file
exec 3> "$var" # Redirect to file stored in $var--only works for files!
exec 3<&0 4>&1 # Input and output!
If it were a one-time call, you wouldn't have to redirect the entire shell:
func arg1 arg2 3>&2
Within the function being called, redirect to &3:
command <&3 # Redirect stdin
command >&3 # Redirect stdout
command 2>&3 # Redirect stderr
command &>&3 # Redirect stdout and stderr
command >&3 2>&1 # idem, but for older bash versions: stdout goes to &3 first, then stderr follows it (order matters)
command 2>&1 >&3 # Not the same: stderr goes to the original stdout, and only stdout goes to &3
command <&3 >&4 # Input and output!
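Tying those pieces together, a small end-to-end sketch (log_to is a made-up function name):
log_to() {
    # Writes its message to whatever file descriptor 3 points at for this call.
    printf '%s\n' "$1" >&3
}
log_to "goes to stderr" 3>&2
log_to "goes to a file" 3>> /tmp/example.log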
Variable indirection
Scenario:
VAR='1 2 3'
REF=VAR
Bad:
eval "echo \"\$$REF\""
Why? If REF contains a double quote, this will break and open the code to exploits. It's possible to sanitize REF, but it's a waste of time when you have this:
echo "${!REF}"
That's right, bash has variable indirection built-in as of version 2. It gets a bit trickier than eval if you want to do something more complex:
# Add to scenario:
VAR_2='4 5 6'
# We could use:
local ref="${REF}_2"
echo "${!ref}"
# Versus the bash < 2 method, which might be simpler to those accustomed to eval:
eval "echo \"\$${REF}_2\""
Regardless, the new method is more intuitive, though it might not seem that way to experienced programmers who are used to eval.
Associative arrays
Associative arrays are implemented intrinsically in bash 4. One caveat: they must be created using declare.
declare -A VAR # Local
declare -gA VAR # Global
# Use spaces between parentheses and contents; I've heard reports of subtle bugs
# on some versions when they are omitted having to do with spaces in keys.
declare -A VAR=( ['']='a' [0]='1' ['duck']='quack' )
VAR+=( ['alpha']='beta' [2]=3 ) # Combine arrays
VAR['cow']='moo' # Set a single element
unset VAR['cow'] # Unset a single element
unset VAR # Unset an entire array
unset VAR[@] # Unset an entire array
unset VAR[*] # Unset each element with a key corresponding to a file in the
# current directory; if * doesn't expand, unset the entire array
local KEYS=( "${!VAR[@]}" ) # Get all of the keys in VAR
In older versions of bash, you can use variable indirection:
VAR=( ) # This will store our keys.
# Store a value with a simple key.
# You will need to declare it in a global scope to make it global prior to bash 4.
# In bash 4, use the -g option.
declare "VAR_$key"="$value"
VAR+=( "$key" )
# Or, if your version is lacking +=
VAR=( "$VAR[#]" "$key" )
# Recover a simple value.
local var_key="VAR_$key" # The name of the variable that holds the value
local var_value="${!var_key}" # The actual value--requires bash 2
# For < bash 2, eval is required for this method. Safe as long as $key is not dirty.
local var_value="`eval echo -n \"\$$var_value\""
# If you don't need to enumerate the indices quickly, and you're on bash 2+, this
# can be cut down to one line per operation:
declare "VAR_$key"="$value" # Store
echo "`var_key="VAR_$key" echo -n "${!var_key}"`" # Retrieve
# If you're using more complex values, you'll need to hash your keys:
function mkkey
{
local key="`mkpasswd -5R0 "$1" 00000000`"
echo -n "${key##*$}"
}
local var_key="VAR_`mkkey "$key"`"
# ...
How to make eval safe
eval can be safely used - but all of its arguments need to be quoted first. Here's how:
This function will do it for you:
function token_quote {
    local quoted=()
    for token; do
        quoted+=( "$(printf '%q' "$token")" )
    done
    printf '%s\n' "${quoted[*]}"
}
Example usage:
Given some untrusted user input:
% input="Trying to hack you; date"
Construct a command to eval:
% cmd=(echo "User gave:" "$input")
Eval it, with seemingly correct quoting:
% eval "$(echo "${cmd[#]}")"
User gave: Trying to hack you
Thu Sep 27 20:41:31 +07 2018
Note you were hacked. date was executed rather than being printed literally.
Instead with token_quote():
% eval "$(token_quote "${cmd[#]}")"
User gave: Trying to hack you; date
%
eval isn't evil - it's just misunderstood :)
I’ll split this answer in two parts, which, I think, cover a large proportion of the cases where people tend to be tempted by eval:
Running weirdly built commands
Fiddling with dynamically named variables
Running weirdly built commands
Many, many times, simple indexed arrays are enough, provided that you take on good habits regarding double quotes to protect expansions while defining the array.
# One nasty argument which must remain a single argument and not be split:
f='foo bar'
# The command in an indexed array (use `declare -a` if you really want to be explicit):
cmd=(
touch
"$f"
# Yet another nasty argument, this time hardcoded:
'plop yo'
)
# Let Bash expand the array and run it as a command:
"${cmd[#]}"
This will create foo bar and plop yo (two files, not four).
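For contrast, a hypothetical flat-string version of the same command shows the problem that tempts people toward eval in the first place:
cmd_str="touch $f 'plop yo'"
$cmd_str         # word-splits blindly: the single quotes stay literal, so you get files named foo, bar, 'plop and yo'
eval "$cmd_str"  # works, but only because the quoting was hand-crafted inside the string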
Note that sometimes it can produce more readable scripts to put just the arguments (or a bunch of options) in the array (at least you know at first glance what you’re running):
touch "${args[#]}"
touch "${opts[#]}" file1 file2
As a bonus, arrays let you easily:
Add comments about a specific argument:
cmd=(
# Important because blah blah:
-v
)
Group arguments for readability by leaving blank lines within the array definition.
Comment out specific arguments for debugging purposes.
Append arguments to your command, sometimes dynamically according to specific conditions or in loops:
cmd=(myprog)
for f in foo bar
do
    cmd+=(-i "$f")
done
if [[ $1 = yo ]]
then
    cmd+=(plop)
fi
to_be_added=(one two 't h r e e')
cmd+=("${to_be_added[#]}")
Define commands in configuration files while allowing for configuration-defined whitespace-containing arguments:
readonly ENCODER=(ffmpeg -blah --blah 'yo plop')
# Deprecated:
#readonly ENCODER=(avconv -bloh --bloh 'ya plap')
# […]
"${ENCODER[#]}" foo bar
Log a robustly runnable command, that perfectly represents what is being run, using printf’s %q:
function please_log_that {
printf 'Running:'
# From `help printf`:
# “The format is re-used as necessary to consume all of the arguments.”
# From `man printf` for %q:
# “printed in a format that can be reused as shell input,
# escaping non-printable characters with the proposed POSIX $'' syntax.”
printf ' %q' "$#"
echo
}
arg='foo bar'
cmd=(prog "$arg" 'plop yo' $'arg\nnewline\tand tab')
please_log_that "${cmd[@]}"
# ⇒ “Running: prog foo\ bar plop\ yo $'arg\nnewline\tand tab'”
# You can literally copy and paste that ↑ to a terminal and get the same execution.
Enjoy better syntax highlighting than with eval strings, since you don’t need to nest quotes or use $-s that “will not be evaluated right away but will be at some point”.
To me, the main advantage of this approach (and conversely disadvantage of eval) is that you can follow the same logic as usual regarding quotation, expansion, etc. No need to rack your brain trying to put quotes in quotes in quotes “in advance” while trying to figure out which command will interpret which pair of quotes at which moment. And of course many of the things mentioned above are harder or downright impossible to achieve with eval.
With these, I never had to rely on eval in the past six years or so, and readability and robustness (in particular regarding arguments that contain whitespace) were arguably increased. You don’t even need to know whether IFS has been tampered with! Of course, there are still edge cases where eval might actually be needed (I suppose, for example, if the user has to be able to provide a full fledged piece of script via an interactive prompt or whatever), but hopefully that’s not something you’ll come across on a daily basis.
Fiddling with dynamically named variables
declare -n (or its within-functions local -n counterpart), as well as ${!foo}, do the trick most of the time.
$ help declare | grep -- -n
-n make NAME a reference to the variable named by its value
Well, it’s not exceptionally clear without an example:
declare -A global_associative_array=(
    [foo]=bar
    [plop]=yo
)
# $1 Name of global array to fiddle with.
fiddle_with_array() {
    # Check this if you want to make sure you’ll avoid
    # circular references, but it’s only if you really
    # want this to be robust.
    # You can also give an ugly name like “__ref” to your
    # local variable as a cheaper way to make collisions less likely.
    if [[ $1 != ref ]]
    then
        local -n ref=$1
    fi
    printf 'foo → %s\nplop → %s\n' "${ref[foo]}" "${ref[plop]}"
}
# Call the function with the array NAME as argument,
# not trying to get its content right away here or anything.
fiddle_with_array global_associative_array
# This will print:
# foo → bar
# plop → yo
(I love this trick ↑ as it makes me feel like I’m passing objects to my functions, like in an object-oriented language. The possibilities are mind-boggling.)
As for ${!…} (which gets the value of the variable named by another variable):
foo=bar
plop=yo
for var_name in foo plop
do
printf '%s = %q\n' "$var_name" "${!var_name}"
done
# This will print:
# foo = bar
# plop = yo
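For the other direction, assigning to a dynamically named variable without eval, here is a small sketch of my own (the variable names are made up; printf -v needs bash 3.1+, and the nameref needs bash 4.3+):
name=counter_total
printf -v "$name" '%d' 42     # assigns 42 to the variable whose name is in $name
echo "$counter_total"         # prints 42
declare -n target=$name       # nameref: target now aliases counter_total
target=$(( target + 1 ))
echo "$counter_total"         # prints 43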
