bash variable in string substitution - linux

I am trying to do string substitution in bash, want to understand it better.
I crafted a success case like this:
a=abc_de_f
var=$a
echo ${var//_/-}
The output is abc-de-f. This works.
However, the following script fails:
a=abc_de_f
echo ${$a//_/-}
The error message is ${$a//_/-}: bad substitution.
It seems to be related to how a variable can be used inside a substitution. Why does this fail? How does bash handle variables in this case?
Also, what is the best practice for handling escape characters in bash string substitution?

In the second case, you don't need the second $, because inside ${...} you refer to the variable by name:
a=abc_de_f
echo ${a//_/-}
If you wanted to add a level of indirection, you can use ! before the variable as in
a=abc_de_f
b=a
echo ${b//_/-}
will output a, while
echo ${!b//_/-}
will output abc-de-f.
See here for a discussion of the art of escaping in Bash.
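As for escaping inside substitutions: the pattern half of ${var//pattern/replacement} is a glob, so characters such as *, ? and the / delimiter must be escaped (or quoted) when you mean them literally. A minimal sketch:
path=/usr/local/bin
echo "${path//\//-}"    # escape the slash to replace it: -usr-local-bin
glob='a*b?c'
echo "${glob//\*/STAR}" # escape * to match it literally: aSTARb?c
echo "${glob//'?'/Q}"   # quoting works too: a*bQc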

Related

How to remove a substring from string in SH not bash

I would like to modify the input parameters of an sh script (it begins with #!/bin/sh). I found some solutions, but they don't work here because they need bash; they give me a bad substitution error. So I am looking for a solution that works in sh (or whatever it is called).
The bash_params could be like "_learn _vil=bar _meet=foo". Here "_learn" acts as a flag. I want to set some variables based on this flag and then remove it so that I can set other variables with eval.
Also, if you know better approaches, please let me know.
case $bash_params in
    *"_learn"*) # learn is enabled
        _learn_sp=True
        tt="_learn"
        bash_params="${bash_params%"$tt"}"    # it doesn't work
        bash_params="${bash_params/_learn//}" # this gives a Bad substitution error
        _lsp=False
        ;;
esac
eval ${bash_params}
Pipe to sed.
bash_params=$(echo "$bash_params" | sed "s/$tt//")
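If you want to stay in pure sh without forking sed, note that ${var#pattern} and ${var%pattern} (and the ## / %% variants) are specified by POSIX, unlike the ${var/pattern/string} replacement, which is a bashism. A minimal sketch, assuming bash_params looks like "_learn _vil=bar _meet=foo":
#!/bin/sh
bash_params="_learn _vil=bar _meet=foo"
case $bash_params in
    *_learn*)
        _learn_sp=True
        # strip the leading "_learn" with the POSIX # expansion
        bash_params="${bash_params#_learn}"
        ;;
esac
echo "$bash_params"   # -> " _vil=bar _meet=foo" (note the leading space)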

pass string with spaces to gcc [duplicate]

This question already has answers here:
How can I store a command in a variable in a shell script?
(12 answers)
Closed 4 years ago.
These work as advertised:
grep -ir 'hello world' .
grep -ir hello\ world .
These don't:
argumentString1="-ir 'hello world'"
argumentString2="-ir hello\\ world"
grep $argumentString1 .
grep $argumentString2 .
Despite 'hello world' being enclosed by quotes in the second example, grep interprets 'hello (and hello\) as one argument and world' (and world) as another, which means that, in this case, 'hello will be the search pattern and world' will be the search path.
Again, this only happens when the arguments are expanded from the argumentString variables. grep properly interprets 'hello world' (and hello\ world) as a single argument in the first example.
Can anyone explain why this is? Is there a proper way to expand a string variable that will preserve the syntax of each character such that it is correctly interpreted by shell commands?
Why
When the string is expanded, it is split into words, but it is not re-evaluated to find special characters such as quotes or dollar signs or ... This is the way the shell has 'always' behaved, since the Bourne shell back in 1978 or thereabouts.
Fix
In bash, use an array to hold the arguments:
argumentArray=(-ir 'hello world')
grep "${argumentArray[#]}" .
Or, if brave/foolhardy, use eval:
argumentString="-ir 'hello world'"
eval "grep $argumentString ."
On the other hand, discretion is often the better part of valour, and working with eval is a place where discretion is better than bravery. If you are not completely in control of the string that is eval'd (if there's any user input in the command string that has not been rigorously validated), then you are opening yourself to potentially serious problems.
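To make the risk concrete, here is a small sketch (the echo stands in for anything an attacker might append to the string):
argumentString="-ir 'hello world' . ; echo INJECTED"
eval "grep $argumentString"   # runs grep, then also runs the injected echo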
Note that the sequence of expansions for Bash is described in Shell Expansions in the GNU Bash manual. Note in particular sections 3.5.3 Shell Parameter Expansion, 3.5.7 Word Splitting, and 3.5.9 Quote Removal.
When you put quote characters into variables, they just become plain literals (see http://mywiki.wooledge.org/BashFAQ/050; thanks @tripleee for pointing out this link).
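A quick sketch to see this for yourself (printf reuses its format for each argument, so every word the shell produces after splitting gets its own line):
args="-ir 'hello world'"
printf '<%s>\n' $args
# <-ir>
# <'hello>
# <world'>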
Instead, try using an array to pass your arguments:
argumentString=(-ir 'hello world')
grep "${argumentString[#]}" .
In looking at this and related questions, I'm surprised that no one brought up using an explicit subshell. For bash, and other modern shells, you can execute a command line explicitly. In bash, it requires the -c option.
argumentString="-ir 'hello world'"
bash -c "grep $argumentString ."
This works exactly as the original questioner desired. There are two restrictions to this technique:
You can only use single quotes within the command or argument strings.
Only exported environment variables will be available to the command
Also, this technique handles redirection and piping, and other shellisms work as well. You can use bash built-in commands, as well as any other command that works at the command line, because you are essentially asking a subshell bash to interpret the string directly as a command line. Here's a more complex example, a somewhat gratuitously complex ls -l variant.
cmd="prefix=`pwd` && ls | xargs -n 1 echo \'In $prefix:\'"
bash -c "$cmd"
I have built command processors both this way and with parameter arrays. Generally, this way is much easier to write and debug, and it's trivial to echo the command you are executing. OTOH, param arrays work nicely when you really do have abstract arrays of parameters, as opposed to just wanting a simple command variant.
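For what it's worth, echoing the command is also easy with the array approach; a sketch using bash's printf %q, which prints each argument with the quoting needed to reproduce it:
args=(-ir 'hello world')
printf 'about to run: grep'; printf ' %q' "${args[@]}"; printf ' .\n'
grep "${args[@]}" .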

Bash Nested Substring Removal (Extraction)?

I have something similar in a script I'm writing:
CMD="/path/to/cmd,there.sh"
TMP="${CMD##*/}"
echo "${TMP%%,*}"
Is there a way to nest the substring removals in line 2 & 3, or produce the same result in one-line, in pure bash, without going out to another program? The length of ${CMD} is not static. To be clear, I want the output to be simply "cmd".
I've tried the below, with various forms of brackets and quotations, but I get a syntax error. This is something that (I think) used to be allowed but isn't in newer versions of Bash.
echo "${${CMD##*/}%%,*}"
Unfortunately, no, it's not possible to combine or nest string operations in bash.
With bash:
[[ $CMD =~ .*/([^,]*) ]] && echo ${BASH_REMATCH[1]}
Shell parameter substitutions are primitive in the sense that they don't provide features like nesting. However, nothing prevents you from reaching for sed here.
cmd="/path/to/cmd,there.sh" # Use lower-case identifiers for user variables
cmd=$(sed -E 's#^.*/([^,]+),.*$#\1#' <<<"$cmd")
The <<< enables the use of herestrings in bash.
I've found that zsh actually supports nested string operations, so I switched my script's interpreter to zsh, and the line below works fine:
echo "${${CMD##*/}%%,*}"
If you want to write the script "in one line", just use ; or && to indicate the end of a line instead of a line-break:
CMD="/path/to/cmd,there.sh"; TMP="${CMD##*/}"; echo "${TMP%%,*}"
or
CMD="/path/to/cmd,there.sh" && TMP="${CMD##*/}" && echo "${TMP%%,*}"
A more elaborate answer about combining commands can be found below this question.
Disclaimer:
I understand that it is debatable whether or not this is a one-liner. But if you are visiting this question looking for a way to do this in bash on a single line, it may answer your question regardless.

How to store command arguments which contain double quotes in an array?

I have a Bash script which generates, stores and modifies values in an array. These values are later used as arguments for a command.
For an MCVE, I thought of an arbitrary command, bash -c 'echo 0="$0" ; echo 1="$1"', which demonstrates my problem. I will call my command with two arguments, -option1=withoutspace and -option2="with space". So it would look like this:
> bash -c 'echo 0="$0" ; echo 1="$1"' -option1=withoutspace -option2="with space"
if the call to the command would be typed directly into the shell. It prints
0=-option1=withoutspace
1=-option2=with space
In my Bash script, the arguments are part of an array. However
#!/bin/bash
ARGUMENTS=()
ARGUMENTS+=('-option1=withoutspace')
ARGUMENTS+=('-option2="with space"')
bash -c 'echo 0="$0" ; echo 1="$1"' "${ARGUMENTS[@]}"
prints
0=-option1=withoutspace
1=-option2="with space"
which still shows the double quotes (because they are interpreted literally?). What works is
#!/bin/bash
ARGUMENTS=()
ARGUMENTS+=('-option1=withoutspace')
ARGUMENTS+=('-option2=with space')
bash -c 'echo 0="$0" ; echo 1="$1"' "${ARGUMENTS[@]}"
which prints again
0=-option1=withoutspace
1=-option2=with space
What do I have to change to make ARGUMENTS+=('-option2="with space"') work as well as ARGUMENTS+=('-option2=with space')?
(Maybe it's even entirely wrong to store arguments for a command in an array? I'm open for suggestions.)
Get rid of the single quotes. Write the options exactly as you would on the command line.
ARGUMENTS+=(-option1=withoutspace)
ARGUMENTS+=(-option2="with space")
Note that this is exactly equivalent to your second option:
ARGUMENTS+=('-option1=withoutspace')
ARGUMENTS+=('-option2=with space')
-option2="with space" and '-option2=with space' both evaluate to the same string. They're two ways of writing the same thing.
(Maybe it's even entirely wrong to store arguments for a command in an array? I'm open for suggestions.)
It's exactly the right thing to do. Arrays are perfect for this. Using a flat string would be a mistake.

Bash Shell - The : Command

The colon command is a null command.
The : construct is also useful in the conditional setting of variables. For example,
: ${var:=value}
Without the :, the shell would try to evaluate $var as a command. <=???
I don't quite understand the last sentence in the statement above. Can anyone give me some details?
Thank you
Try
var=badcommand
$var
you will get
bash: badcommand: command not found
Try
var=
${var:=badcommand}
and you will get the same.
The shell (e.g. bash) always tries to run the first word on each command line as a command, even after doing variable expansion.
The only exception to this is
var=value
which the shell treats specially.
The trick in the example you provide is that ${var:=value} works anywhere on a command line, e.g.
# set newvar to somevalue if it isn't already set
echo ${newvar:=somevalue}
# show that newvar has been set by the above command
echo $newvar
But we don't really even want to echo the value, so we want something better than
echo ${newvar:=somevalue}.
The : command lets us do the assignment without any other action.
I suppose what the man page writers meant was that
: ${var:=value}
can be used as a shortcut instead of, say,
if [ -z "$var" ]; then
var=value
fi
On its own, ${var} as a command line executes the command stored in $var. Adding substitution operators does not change this, so you use : to neutralize it.
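Putting it together, a small sketch:
unset greeting
: "${greeting:=hello}"   # assigns the default; : swallows the expanded value
echo "$greeting"         # hello
greeting=custom
: "${greeting:=hello}"   # already set and non-empty, so nothing changes
echo "$greeting"         # custom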
Try this:
$ help :
: :
Null command.
No effect; the command does nothing.
Exit Status:
Always succeeds.
