linux bash, passing parameters using a variable issue - linux

I am trying to use a variable to store the parameters, here is the simple test:
#!/bin/bash
sed_args="-e \"s/aaaa/bbbb/g\""
echo $sed_args
I expected the output to be
-e "s/aaaa/bbbb/g"
but it gives:
"s/aaaa/bbbb/g"
without the "-e"
I am new to bash, any comment is welcome. Thanks, maybe this is already answered somewhere.

You need an array to construct arguments dynamically:
#!/usr/bin/env bash
sed_args=('-e' 's/aaaa/bbbb/g')
echo "${sed_args[#]}"

When you use the variable without double quotes, it gets word-split by the shell before echo even sees the value(s). Then bash's builtin echo interprets -e as an option for itself (normally used to turn on interpretation of backslash escapes).
When you double quote the variable, it won't be split and will be interpreted as a single argument to echo:
echo "$sed_args"
For strings you don't control, it's safer to use printf, since the data comes after the format string and is never mistaken for an option:
printf %s "$string"

Related

Increment a variable name in ksh [duplicate]

Seems that the recommended way of doing indirect variable setting in bash is to use eval:
var=x; val=foo
eval $var=$val
echo $x # --> foo
The problem is the usual one with eval:
var=x; val=1$'\n'pwd
eval $var=$val # bad output here
(and since it is recommended in many places, I wonder just how many scripts are vulnerable because of this...)
In any case, the obvious solution of using (escaped) quotes doesn't really work:
var=x; val=1\"$'\n'pwd\"
eval $var=\"$val\" # fail with the above
The thing is that bash has indirect variable reference baked in (with ${!foo}), but I don't see any such way to do indirect assignment -- is there any sane way to do this?
For the record, I did find a solution, but this is not something that I'd consider "sane"...:
eval "$var='"${val//\'/\'\"\'\"\'}"'"
A slightly better way, avoiding the possible security implications of using eval, is
declare "$var=$val"
Note that declare is a synonym for typeset in bash. The typeset command is more widely supported (ksh and zsh also use it):
typeset "$var=$val"
In modern versions of bash (4.3+), one should use a nameref:
declare -n ref="$var"
ref=$val
It's safer than eval, but still not perfect.
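A minimal end-to-end sketch of the nameref approach, assuming bash 4.3 or later (the name ref is just illustrative):
var=x; val=foo
declare -n ref="$var"   # ref is now another name for whatever $var names (here: x)
ref=$val                # assigns foo to x through the nameref
echo "$x"               # --> foo
unset -n ref            # removes the nameref itself, leaving x intact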
Bash has an extension to printf that saves its result into a variable:
printf -v "${VARNAME}" '%s' "${VALUE}"
This prevents all possible escaping issues.
If you use an invalid identifier for $VARNAME, the command will fail and return status code 2:
$ printf -v ';;;' '%s' foobar; echo $?
bash: printf: `;;;': not a valid identifier
2
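For contrast, a minimal sketch of the normal case (the variable names are illustrative):
VARNAME=greeting
VALUE=$'multi-word value with a\nnewline'
printf -v "$VARNAME" '%s' "$VALUE"
printf '%s\n' "$greeting"   # the value arrives verbatim, with no escaping issues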
eval "$var=\$val"
The argument to eval should always be a single string enclosed in either single or double quotes. All code that deviates from this pattern has some unintended behavior in edge cases, such as file names with special characters.
When the argument to eval is expanded by the shell, the $var is replaced with the variable name, and the \$ is replaced with a simple dollar. The string that is evaluated therefore becomes:
varname=$val
This is exactly what you want.
Generally, all expressions of the form $varname should be enclosed in double quotes, to prevent accidental expansion of filename patterns like *.c.
There are only two places where the quotes may be omitted since they are defined to not expand pathnames and split fields: variable assignments and case. POSIX 2018 says:
Each variable assignment shall be expanded for tilde expansion, parameter expansion, command substitution, arithmetic expansion, and quote removal prior to assigning the value.
This list of expansions is missing the pathname expansion and the field splitting. Sure, that's hard to see from reading this sentence alone, but that's the official definition.
Since this is a variable assignment, the quotes are not needed here. They don't hurt, though, so you could also write the original code as:
eval "$var=\"the value is \$val\""
Note that the second dollar is escaped using a backslash, to prevent it from being expanded in the first run. What happens is:
eval "$var=\"the value is \$val\""
The argument to the command eval is sent through parameter expansion and unescaping, resulting in:
varname="the value is $val"
This string is then evaluated as a variable assignment, which assigns the following value to the variable varname:
the value is value
The main point is that the recommended way to do this is:
eval "$var=\$val"
with the right-hand side expanded indirectly too. Since eval runs in the same
environment, it still has $val bound, so deferring the expansion works, and by
that point it is just an ordinary variable reference. Since the $val variable has
a known name, there are no issues with quoting, and it could even have been written as:
eval $var=\$val
But since it's better to always add quotes, the former is preferable, or
even this:
eval "$var=\"\$val\""
A better alternative in bash that was mentioned for the whole thing that
avoids eval completely (and is not as subtle as declare etc):
printf -v "$var" "%s" "$val"
Though this is not a direct answer to what I originally asked...
Newer versions of bash support something called "parameter transformation", documented in a section of the same name in bash(1).
"${value#Q}" expands to a shell-quoted version of "${value}" that you can re-use as input.
Which means the following is a safe solution:
eval="${varname}=${value#Q}"
Just for completeness I also want to suggest the possible use of the bash builtin read. I've also made corrections regarding -d '' based on socowi's comments.
But much care needs to be exercised when using read: make sure the input is terminated the way read expects (-d '' reads until a NUL byte, and printf '%s\0' appends that NUL), note that the space in -d '' matters (with -d'' the next word would be taken as the delimiter argument), and make sure read itself is executed in the main shell where the variable is needed, not in a subshell (hence the < <( ... ) syntax).
var=x; val=foo0shouldnotterminateearly
read -d '' -r "$var" < <(printf '%s\0' "$val")
echo $x # --> foo0shouldnotterminateearly
echo ${!var} # --> foo0shouldnotterminateearly
I tested this with \n \t \r spaces and 0, etc it worked as expected on my version of bash.
The -r keeps read from treating backslashes as escape characters, so if your value contains the two characters "\" and "n" rather than an actual newline, x will also contain those two characters.
This method may not be as aesthetically pleasing as the eval or printf solutions, and is more useful when the value comes from a file or another input file descriptor:
read -d '' -r "$var" < <(cat "$file")
And here are some alternative suggestions for the < <() syntax
read -d '' -r "$var" <<< "$val"$'\0'
read -d '' -r "$var" < <(printf '%s' "$val") # Apparently the \0 isn't even needed; the printf process ending is enough for the read to finish.
read -d '' -r "$var" <<< "$(printf '%s' "$val")"
read -d '' -r "$var" <<< "$val"
read -d '' -r "$var" < <(printf '%s' "$val")
Yet another way to accomplish this, without eval, is to use "read":
INDIRECT=foo
read -d '' -r "${INDIRECT}" <<<"$(( 2 * 2 ))"
echo "${foo}" # outputs "4"

Storing escape characters in unix variable

I am extracting a part from an existing file and storing it as a string in a variable. The string looks something like this:
var="*a<br>*b<br>*c"
Now, as * is a special character in unix, it does not work in further operations (like sed, grep) until I put an escape character in front of every *.
That's why I am doing something like this:
echo $var | sed 's/\*/\\*/g'
On running this command in bash we get
echo $var | sed 's/\*/\\*/g'
\*a<br>\*b<br>\*c
which is the desired output, but when I try to store this in a variable, I get the original value back, like so:
var=`echo $var | sed 's/\*/\\*/g'`
echo $var
*a<br>*b<br>*c
I am assuming this happens because the shell interprets the backslashes as escape characters and drops them. How can I retain the backslashes when storing the result in a variable?
The problem is caused by backticks. Use $( ) instead, and it goes away:
var="*a<br>*b<br>*c"
var=$(printf '%s\n' "$var" | sed 's/\*/\\*/g')
printf '%s\n' "$var"
(Why is this problem caused by backticks? Because the only way to nest them is to escape the inner ones with backslashes, so they necessarily change how backslashes behave; whereas $( ), because it uses different starting and ending sigils, can be nested natively).
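A short sketch of that difference in backslash handling (the expected output is shown in comments):
var='*a<br>*b'
with_backticks=`echo "$var" | sed 's/\*/\\*/g'`
with_dollar=$(echo "$var" | sed 's/\*/\\*/g')
printf '%s\n' "$with_backticks"   # *a<br>*b    (the \\ was consumed by the backticks)
printf '%s\n' "$with_dollar"      # \*a<br>\*b  (sed received the script unchanged)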
That said, if your shell is one (like bash) with ksh-inspired extensions, you don't need sed at all here, as the shell can perform simple string replacements natively via parameter expansion:
var="*a<br>*b<br>*c"
printf '%s\n' "${var//'*'/'\*'}"
For background on why this answer uses printf instead of echo, see Why is printf better than echo? at [unix.se], or the APPLICATION USAGE section of the POSIX specification for echo.

bash echo environment variable containing escaped characters

I have a script that echoes the given input into a file as follows:
echo $@ > file.txt
When I pass a string like "\"", I want it to print exactly "\"" to the file; however, it prints ".
My question is how can I print all characters of a variable containing a string without considering escapes?
When I use echo in bash like echo "\"" it only prints ", while when I use echo '"\""' it prints it correctly. I thought the solution might be to put single quotes around the variable; however, I cannot get the value of a variable inside single quotes.
First, note that
echo $@ > file.txt
can fail in several ways. Shellcheck identifies one problem (missing quotes on $@). See the accepted, and excellent, answer to Why is printf better than echo? for others.
Second, as others have pointed out, there is no practical way for a Bash program to know exactly how parameters were specified on the command line. For instance, for all of these invocations
prog \"
prog "\""
prog '"'
the code in prog will see a $1 value that consists of one double-quote character. Any quoting characters that are used in the invocation of prog are removed by the quote removal part of the shell expansions done by the parent shell process.
Normally that doesn't matter. If variables or parameters contain values that would need to be quoted when entered as literals (e.g. "\"") they can be used safely, including passing them as parameters to other programs, by quoting uses of the variable or parameter (e.g. "$1", "$@", "$x").
There is a problem with variables or parameters that require quoting when entered literally if you need to write them in a way that they can be reused as shell input (e.g. by using eval or source/.). Bash supports the %q format specification to the printf builtin to handle this situation. It's not clear what the OP is trying to do, but one possible solution to the question is:
if (( $# > 0 )) ; then
    printf -v quoted_params '%q ' "$@"  # Add all parameters to 'quoted_params'
    printf '%s\n' "${quoted_params% }"  # Remove trailing space when printing
fi >file.txt
That creates an empty 'file.txt' when no positional parameters are provided. The code would need to be changed if that is not what is required.
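A minimal sketch of what %q produces, simulating the positional parameters with set (the values are illustrative):
set -- '"' 'two words'              # pretend these were the script's arguments
printf -v quoted_params '%q ' "$@"
printf '%s\n' "${quoted_params% }"  # prints: \" two\ words  (safe to reuse as shell input)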
If you run echo \", the function of the backslash in bash is to escape the character after it. This actually enables you to use the double quote as an argument. You cannot use a backslash by itself; if you want to pass a backslash as an argument you need another backslash to escape it: echo \\
Now if you want to create a string where these things are not escaped, use single quotes: echo '\'
See for a better explanation this post: Difference between single and double quotes in Bash

BASH: A variable inside a defined variable?

I have the following function in a bash script which does not work:
do_get() {
    cmd='<command version="33" cmd="GETINFO" $3</command>'
    echo $cmd
}
Now, if I echo $3 right before the cmd variable, it echoes out 1234, which I am passing as the third argument when executing this. BUT it shows just $3 when I do an echo $cmd.
I tried a couple of things like the ones below, thinking it was getting stripped out:
'$3' but it then shows blank
'"$3"' same as above
The variable doesn't expand when inside single quotes. You need to use double quotes instead, but since you have double quotes on the inside, you need to make sure you remember to escape those as well.
do_get() {
    cmd="<command version=\"33\" cmd=\"GETINFO\" $3</command>"
    echo $cmd
}
Newer versions of bash add a -v flag to the printf command that makes assignments like this a little easier on the eye, in that quoting is reduced.
printf -v cmd '<command version="33" cmd="GETINFO" %d </command>' "$3"
The single quote in Bash prevents variable substitution. In order for the third parameter to be substituted, you should enclose your string in double quotes. of course, you have the problem of the double quotes that are part of your string, so they need to be escaped with a backslash:
cmd="<command version=\"33\" cmd=\"GETINFO\" $3 </command>"

How to pass the value of a variable to the standard input of a command?

I'm writing a shell script that should be somewhat secure, i.e., does not pass secure data through parameters of commands and preferably does not use temporary files. How can I pass a variable to the standard input of a command?
Or, if it's not possible, how can I correctly use temporary files for such a task?
Passing a value to standard input in Bash is as simple as:
your-command <<< "$your_variable"
Always make sure you put quotes around variable expressions!
Be cautious: this will probably only work in bash and will not work in sh.
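For example, a minimal sketch of searching a variable's content without a temporary file (the variable and pattern are illustrative):
secret='user=admin password=hunter2'
grep -o 'password=[^ ]*' <<< "$secret"   # prints: password=hunter2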
Simple, but error-prone: using echo
Something as simple as this will do the trick:
echo "$blah" | my_cmd
Do note that this may not work correctly if $blah contains -n, -e, -E, etc., or if it contains backslashes (bash's copy of echo preserves literal backslashes in the absence of -e by default, but will treat them as escape sequences and replace them with the corresponding characters even without -e if the optional XSI extensions are enabled).
More sophisticated approach: using printf
printf '%s\n' "$blah" | my_cmd
This does not have the disadvantages listed above: all possible C strings (strings not containing NULs) are printed unchanged.
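A small sketch contrasting the two (cat -A is only used to make the output visible):
blah='-n'
echo "$blah" | cat -A            # prints nothing: bash's echo consumed -n as an option
printf '%s\n' "$blah" | cat -A   # prints: -n$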
(cat <<END
$passwd
END
) | command
The cat is not really needed, but it helps to structure the code better and allows you to use more commands in parentheses as input to your command.
Note that the 'echo "$var" | command operations mean that standard input is limited to the line(s) echoed. If you also want the terminal to be connected, then you'll need to be fancier:
{ echo "$var"; cat - ; } | command
( echo "$var"; cat - ) | command
This means that the first line(s) will be the contents of $var but the rest will come from cat reading its standard input. If the command does not do anything too fancy (try to turn on command line editing, or run like vim does) then it will be fine. Otherwise, you need to get really fancy - I think expect or one of its derivatives is likely to be appropriate.
The command line notations are practically identical - but the second semi-colon is necessary with the braces whereas it is not with parentheses.
This robust and portable way has already appeared in comments. It should be a standalone answer.
printf '%s' "$var" | my_cmd
or
printf '%s\n' "$var" | my_cmd
Notes:
It's better than echo, reasons are here: Why is printf better than echo?
printf "$var" is wrong. The first argument is format where various sequences like %s or \n are interpreted. To pass the variable right, it must not be interpreted as format.
Usually variables don't contain trailing newlines. The former command (with %s) passes the variable as-is. However, tools that work with text may ignore or complain about an incomplete last line (see Why should text files end with a newline?). So you may want the latter command (with %s\n), which appends a newline character to the content of the variable. Non-obvious facts:
Here string in Bash (<<<"$var" my_cmd) does append a newline.
Any method that appends a newline results in non-empty stdin of my_cmd, even if the variable is empty or undefined.
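To make those facts concrete, a minimal sketch of the byte counts involved:
var=''
printf '%s' "$var" | wc -c    # 0  (truly empty stdin)
printf '%s\n' "$var" | wc -c  # 1  (just the appended newline)
wc -c <<< "$var"              # 1  (here strings append a newline too)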
I liked Martin's answer, but it has some problems depending on what is in the variable. This
your-command <<< """$your_variable"""
is better if your variable contains " or !.
As per Martin's answer, there is a Bash feature called Here Strings (which itself is a variant of the more widely supported Here Documents feature):
3.6.7 Here Strings
A variant of here documents, the format is:
<<< word
The word is expanded and supplied to the command on its standard
input.
Note that Here Strings would appear to be Bash-only, so, for improved portability, you'd probably be better off with the original Here Documents feature, as per PoltoS's answer:
( cat <<EOF
$variable
EOF
) | cmd
Or, a simpler variant of the above:
(cmd <<EOF
$variable
EOF
)
You can omit ( and ), unless you want to have this redirected further into other commands.
Try this:
echo "$variable" | command
If you came here from a duplicate, you are probably a beginner who tried to do something like
"$variable" >file
or
"$variable" | wc -l
where you obviously meant something like
echo "$variable" >file
echo "$variable" | wc -l
(Real beginners also forget the quotes; usually use quotes unless you have a specific reason to omit them, at least until you understand quoting.)
