How to pass the value of a variable to the standard input of a command? - security

I'm writing a shell script that should be somewhat secure, i.e., one that does not pass sensitive data through command parameters and preferably does not use temporary files. How can I pass a variable to the standard input of a command?
Or, if it's not possible, how can I correctly use temporary files for such a task?

Passing a value to standard input in Bash is as simple as:
your-command <<< "$your_variable"
Always make sure you put quotes around variable expressions!
Be aware that this is a Bash feature and will not work in plain sh.
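For example, to feed a secret to a checksum tool without the value ever appearing on a command line (sha256sum is just an illustrative consumer; the variable name is made up):
secret='s3cr3t-value'        # illustrative secret
sha256sum <<< "$secret"      # the value never shows up in `ps` output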

Simple, but error-prone: using echo
Something as simple as this will do the trick:
echo "$blah" | my_cmd
Do note that this may not work correctly if $blah contains -n, -e, -E, etc., or if it contains backslashes (bash's copy of echo preserves literal backslashes in the absence of -e by default, but treats them as escape sequences and replaces them with the corresponding characters, even without -e, if the optional XSI extensions are enabled).
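To see the failure mode, try a value that happens to look like an option (a contrived example):
blah='-n'
echo "$blah" | wc -c   # prints 0 in bash: echo swallowed the value as its -n option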
More sophisticated approach: using printf
printf '%s\n' "$blah" | my_cmd
This does not have the disadvantages listed above: all possible C strings (strings not containing NULs) are printed unchanged.
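The same contrived value from above goes through unchanged:
blah='-n'
printf '%s\n' "$blah" | wc -c   # prints 3: the two characters plus the newline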

(cat <<END
$passwd
END
) | command
The cat is not really needed, but it helps to structure the code better and allows you to use more commands in parentheses as input to your command.

Note that echo "$var" | command operations mean that standard input is limited to the line(s) echoed. If you also want the terminal to be connected, then you'll need to be fancier:
{ echo "$var"; cat - ; } | command
( echo "$var"; cat - ) | command
This means that the first line(s) will be the contents of $var but the rest will come from cat reading its standard input. If the command does not do anything too fancy (such as trying to turn on command-line editing, or running full-screen like vim does), it will be fine. Otherwise, you need to get really fancy - I think expect or one of its derivatives is likely to be appropriate.
The command line notations are practically identical - but the second semi-colon is necessary with the braces whereas it is not with parentheses.
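A runnable sketch (the variable content and the cat -n consumer are illustrative; press Ctrl-D to end the terminal input):
var='first line comes from the variable'
{ printf '%s\n' "$var"; cat -; } | cat -n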

This robust and portable way has already appeared in comments. It should be a standalone answer.
printf '%s' "$var" | my_cmd
or
printf '%s\n' "$var" | my_cmd
Notes:
It's better than echo; the reasons are in Why is printf better than echo?
printf "$var" is wrong. The first argument is the format string, in which sequences like %s or \n are interpreted. To pass the variable's value through intact, it must not be used as the format.
Usually variables don't contain trailing newlines. The former command (with %s) passes the variable exactly as it is. However, tools that work with text may ignore or complain about an incomplete final line (see Why should text files end with a newline?). So you may want the latter command (with %s\n), which appends a newline character to the content of the variable. Non-obvious facts:
Here string in Bash (<<<"$var" my_cmd) does append a newline.
Any method that appends a newline results in non-empty stdin of my_cmd, even if the variable is empty or undefined.
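Both facts are easy to verify by counting the bytes my_cmd would receive:
wc -c <<< ''        # 1 -- the here string appended a newline to the empty variable
printf '' | wc -c   # 0 -- a genuinely empty stdin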

I liked Martin's answer, but it has some problems depending on what is in the variable. This
your-command <<< """$your_variable"""
is better if your variable contains " or !.

As per Martin's answer, there is a Bash feature called Here Strings (which itself is a variant of the more widely supported Here Documents feature):
3.6.7 Here Strings
A variant of here documents, the format is:
<<< word
The word is expanded and supplied to the command on its standard
input.
Note that Here Strings would appear to be Bash-only, so, for improved portability, you'd probably be better off with the original Here Documents feature, as per PoltoS's answer:
( cat <<EOF
$variable
EOF
) | cmd
Or, a simpler variant of the above:
(cmd <<EOF
$variable
EOF
)
You can omit ( and ), unless you want to have this redirected further into other commands.

Try this:
echo "$variable" | command

If you came here from a duplicate, you are probably a beginner who tried to do something like
"$variable" >file
or
"$variable" | wc -l
where you obviously meant something like
echo "$variable" >file
echo "$variable" | wc -l
(Real beginners also forget the quotes; usually use quotes unless you have a specific reason to omit them, at least until you understand quoting.)

Related

Increment a variable name in ksh [duplicate]

Seems that the recommended way of doing indirect variable setting in bash is to use eval:
var=x; val=foo
eval $var=$val
echo $x # --> foo
The problem is the usual one with eval:
var=x; val=1$'\n'pwd
eval $var=$val # bad output here
(and since it is recommended in many places, I wonder just how many scripts are vulnerable because of this...)
In any case, the obvious solution of using (escaped) quotes doesn't really work:
var=x; val=1\"$'\n'pwd\"
eval $var=\"$val\" # fail with the above
The thing is that bash has indirect variable reference baked in (with ${!foo}), but I don't see any such way to do indirect assignment -- is there any sane way to do this?
For the record, I did find a solution, but this is not something that I'd consider "sane"...:
eval "$var='"${val//\'/\'\"\'\"\'}"'"
A slightly better way, avoiding the possible security implications of using eval, is
declare "$var=$val"
Note that declare is a synonym for typeset in bash. The typeset command is more widely supported (ksh and zsh also use it):
typeset "$var=$val"
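A quick check that the newline-plus-pwd payload from the question stays inert (reusing the test values from above):
var=x; val=1$'\n'pwd
declare "$var=$val"   # use typeset "$var=$val" in ksh/zsh
echo "${!var}"        # prints 1 and pwd on two lines; pwd was never executed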
In modern versions of bash, one should use a nameref.
declare -n ref="$var"   # ref is now another name for the variable whose name is in $var
ref=$val
It's safer than eval, but still not perfect.
Bash has an extension to printf that saves its result into a variable:
printf -v "${VARNAME}" '%s' "${VALUE}"
This prevents all possible escaping issues.
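For example (the names and the hostile-looking value are made up):
VARNAME=target
VALUE='$(pwd); rm -rf *'               # stored literally, never executed
printf -v "${VARNAME}" '%s' "${VALUE}"
echo "$target"                         # prints: $(pwd); rm -rf *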
If you use an invalid identifier for $VARNAME, the command will fail and return status code 2:
$ printf -v ';;;' '%s' foobar; echo $?
bash: printf: `;;;': not a valid identifier
2
eval "$var=\$val"
The argument to eval should always be a single string enclosed in either single or double quotes. All code that deviates from this pattern has some unintended behavior in edge cases, such as file names with special characters.
When the argument to eval is expanded by the shell, the $var is replaced with the variable name, and the \$ is replaced with a simple dollar. The string that is evaluated therefore becomes:
varname=$value
This is exactly what you want.
Generally, all expressions of the form $varname should be enclosed in double quotes, to prevent accidental expansion of filename patterns like *.c.
There are only two places where the quotes may be omitted since they are defined to not expand pathnames and split fields: variable assignments and case. POSIX 2018 says:
Each variable assignment shall be expanded for tilde expansion, parameter expansion, command substitution, arithmetic expansion, and quote removal prior to assigning the value.
This list of expansions is missing the pathname expansion and the field splitting. Sure, that's hard to see from reading this sentence alone, but that's the official definition.
Since this is a variable assignment, the quotes are not needed here. They don't hurt, though, so you could also write the original code as:
eval "$var=\"the value is \$val\""
Note that the second dollar is escaped using a backslash, to prevent it from being expanded in the first run. What happens is:
eval "$var=\"the value is \$val\""
The argument to the command eval is sent through parameter expansion and unescaping, resulting in:
varname="the value is $val"
This string is then evaluated as a variable assignment, which (assuming val contains the string value) assigns the following to the variable varname:
the value is value
The main point is that the recommended way to do this is:
eval "$var=\$val"
with the RHS deferred as well. Since eval runs in the same environment, $val is still bound there, so deferring its expansion works. And since $val is a plain variable with a known name, there are no issues with quoting, and it could even have been written as:
eval $var=\$val
But since it's better to always add quotes, the former is better, or
even this:
eval "$var=\"\$val\""
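To convince yourself that the deferred \$val is safe, rerun the hostile test case from the question:
var=x; val=1$'\n'pwd
eval "$var=\$val"   # the string evaluated is: x=$val
echo "$x"           # prints 1 and pwd on two lines; pwd was not executed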
A better alternative in bash that was mentioned for the whole thing that
avoids eval completely (and is not as subtle as declare etc):
printf -v "$var" "%s" "$val"
Though this is not a direct answer what I originally asked...
Newer versions of bash support something called "parameter transformation", documented in a section of the same name in bash(1).
"${value#Q}" expands to a shell-quoted version of "${value}" that you can re-use as input.
Which means the following is a safe solution:
eval="${varname}=${value#Q}"
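For example (made-up values; requires a bash new enough to support the @Q transformation):
varname=x
value='two words; $(pwd)'
eval "${varname}=${value@Q}"   # ${value@Q} expands to a safely quoted string
echo "$x"                      # prints: two words; $(pwd)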
Just for completeness I also want to suggest the possible use of the bash built-in read. I've also made corrections regarding -d '' based on socowi's comments.
But much care needs to be exercised when using read to ensure the input is sanitized (-d '' reads until a NUL terminator, and printf '...\0' terminates the value with a NUL; note the space, since -d'' is parsed as a bare -d, which would consume the next argument as the delimiter), and that read itself is executed in the main shell where the variable is needed and not in a subshell (hence the < <( ... ) syntax).
var=x; val=foo0shouldnotterminateearly
read -d '' -r "$var" < <(printf '%s\0' "$val")
echo $x # --> foo0shouldnotterminateearly
echo ${!var} # --> foo0shouldnotterminateearly
I tested this with \n, \t, \r, spaces, 0, etc., and it worked as expected on my version of bash.
The -r will avoid escaping \, so if you had the characters "\" and "n" in your value and not an actual newline, x will contain the two characters "\" and "n" also.
This method may not be as aesthetically pleasing as the eval or printf solutions, and would be more useful if the value is coming from a file or another input file descriptor:
read -d '' -r "$var" < <( cat "$file" )
And here are some alternatives to the < <() syntax:
read -d '' -r "$var" <<< "$val"$'\0'
read -d '' -r "$var" < <(printf '%s' "$val") # apparently the \0 isn't even needed; the printf process ending is enough to trigger the read to finish
read -d '' -r "$var" <<< "$(printf '%s' "$val")"
read -d '' -r "$var" <<< "$val"
read -d '' -r "$var" < <(printf '%s' "$val")
Yet another way to accomplish this, without eval, is to use "read":
INDIRECT=foo
read -d '' -r "${INDIRECT}" <<<"$(( 2 * 2 ))"
echo "${foo}" # outputs "4"

Storing escape characters in unix variable

I am extracting a part from an existing file and storing it as a string in a variable. The string looks something like this:
var="*a<br>*b<br>*c"
Now, as * is a special character in unix, it does not work in further operations (like sed or grep) until I put an escape character in front of every *.
That's why I am doing something like this -
echo $var | sed 's/\*/\\*/g'
On running this command in bash we get
echo $var | sed 's/\*/\\*/g'
\*a<br>\*b<br>\*c
which is the desired output, but when I try to store this in a variable, I get back my original string, like so
var=`echo $var | sed 's/\*/\\*/g'`
echo $var
*a<br>*b<br>*c
I am assuming this happens because the backslashes are interpreted as escape characters. How can I retain the backslashes and store them in the variable?
The problem is caused by backticks. Use $( ) instead, and it goes away:
var="*a<br>*b<br>*c"
var=$(printf '%s\n' "$var" | sed 's/\*/\\*/g')
printf '%s\n' "$var"
(Why is this problem caused by backticks? Because the only way to nest them is to escape the inner ones with backslashes, so they necessarily change how backslashes behave; whereas $( ), because it uses different starting and ending sigils, can be nested natively).
That said, if your shell is one (like bash) with ksh-inspired extensions, you don't need sed at all here, as the shell can perform simple string replacements natively via parameter expansion:
var="*a<br>*b<br>*c"
printf '%s\n' "${var//'*'/'\*'}"
For background on why this answer uses printf instead of echo, see Why is printf better than echo? at [unix.se], or the APPLICATION USAGE section of the POSIX specification for echo.

Writing variables to file with bash

I'm trying to configure a file with a bash script, but the variables in the bash script are not written to the file the way they are written in the script.
Ex:
#!/bin/bash
printf "%s" "template("$DATE\t$HOST\t$PRIORITY\t$MSG\n")" >> /file.txt
exit 0
This results in template('tttn') in the file instead of template("$DATE\t$HOST\t$PRIORITY\t$MSG\n.
How do I write the script so that the result in the configured file is template("$DATE\t$HOST\t$PRIORITY\t$MSG\n?
Is it possible to write a variable to the file exactly as it looks in the script?
Enclose the strings you want to write within single quotes to avoid variable replacement.
> FOO=bar
> echo "$FOO"
bar
> echo '$FOO'
$FOO
>
Using printf in a shell script is uncommon; just use echo with the -e option.
It allows you to use ANSI C escape sequences, like \t or \n. The \n at the end isn't necessary, however, as echo will add one itself.
echo -e "template(${DATE}\t${HOST}\t${PRIORITY}\t${MSG})" >> file.txt
The problem with what you've written is that ANSI C escape sequences like \t can only be used in the first parameter (the format string) of printf.
So it would have to be something like:
printf 'template(%s\t%s\t%s\t%s)\n' ${DATE} ${HOST} ${PRIORITY} ${MSG} >> file.txt
But I hope we both agree, that this is very hard on the eyes.
There are several escaping issues and the power of printf has not been used, try
printf 'template(%s\t%s\t%s\t%s)\n' "${DATE}" "${HOST}" "${PRIORITY}" "${MSG}" >> file.txt
Reasons for this separate answer:
The accepted answer does not fit the title of the question (see comment).
The post with the right answer
contains wrong claims about echo vs printf as of this post and
is not robust against whitespace in the values.
The edit queue is full at the moment.
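With some made-up values, the corrected command produces a single tab-separated line:
DATE=2021-06-01 HOST=web01 PRIORITY=info MSG='disk almost full'
printf 'template(%s\t%s\t%s\t%s)\n' "${DATE}" "${HOST}" "${PRIORITY}" "${MSG}"
# prints: template(2021-06-01   web01   info   disk almost full), with real tabs between the fields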

Get first character of a string SHELL

I want to get the first character of a string, for example:
$>./first $foreignKey
And I want to get "$"
I googled it and found some solutions, but they concern only bash and not sh!
This should work in any POSIX-compatible shell (including sh). printf is not required to be a built-in, but it often is, so this may save a fork or two:
first_letter=$(printf %.1s "$1")
Note: (Possibly I should have explained this six years ago when I wrote this brief answer.) It might be tempting to write %c instead of %.1s; that produces exactly the same result except when the argument "$1" is empty. printf %c "" actually produces a NUL byte, which is not a valid character in a POSIX shell; different shells might treat this case differently. Some allow NULs as an extension; others, like bash, ignore the NUL but print an error message to tell you it has happened. The precise semantics of %.1s is "at most 1 character from the start of the argument", which means that first_letter is guaranteed to be set to the empty string if the argument is the empty string, without raising any error.
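A short demonstration of both cases (the input strings are arbitrary):
first_letter=$(printf %.1s "hello")   # h
printf '<%s>\n' "$first_letter"       # <h>
first_letter=$(printf %.1s "")        # empty string, and no error or NUL byte
printf '<%s>\n' "$first_letter"       # <>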
Well, you'll probably need to escape that particular value to prevent it being interpreted as a shell variable but, if you don't have access to the nifty bash substring facility, you can still use something like:
name=paxdiablo
firstchar=`echo $name | cut -c1-1`
If you do have bash (it's available on most Linux distros and, even if your login shell is not bash, you should be able to run scripts with it), it's the much easier:
firstchar=${name:0:1}
For escaping the value so that it's not interpreted by the shell, you need to use:
./first \$foreignKey
and the following first script shows how to get it:
letter=`echo $1 | cut -c1-1`
echo ".$letter."
Maybe it is an old question.
Recently I had the same problem. According to the POSIX shell manual on parameter expansion, this is my solution without involving any subshell/fork:
a="some string here"
printf 'first char is "%s"\n' "${a%"${a#?}"}"
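Step by step, with the value above:
a="some string here"
rest=${a#?}            # strip the first character: "ome string here"
first=${a%"$rest"}     # remove that suffix from the end, leaving "s"
printf '%s\n' "$first"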
For plain sh:
echo "hello" | cut -b 1 # -b 1 extract the 1st byte
h
echo "hello" |grep -o "." | head -n 1
h
echo "hello" | awk -F "" '{print $1}'
h
you can try this for bash:
s='hello'; echo ${s:0:1}
h
printf -v first_character "%c" "${variable}"

Making a bash script to accept input from file OR piping output

I have the following bash script, which takes tabular data as input, gets the first line, and prints its fields vertically:
#!/bin/bash
# my_script.sh
export LC_ALL=C
file=$1
head -n1 $file |
tr "\t" "\n" |
awk '{print $1 " " NR-1}'
The problem is that I can only execute it this way:
$ myscript.sh some_tab_file.txt
What I want to do is on top of the above capability also allows you to do this:
$ cat some_tab_file.txt | myscript.sh
Namely, take the input from a pipe. How can I achieve that?
I'd normally write:
export LC_ALL=C
head -n1 "$#" |
tr "\t" "\n" |
awk '{print $1 " " NR-1}'
This works with any number of arguments, including none. Using "$@" is important in this and many other contexts. See the Bash manual on special parameters and shell parameter expansion for more information on the many and varied notations available for controlling how shell parameters are handled. Generally, double quotes are a good idea, especially if the file names may contain spaces.
A common idiom is to fall back to - (standard input) if there are no arguments. There is a convenient shorthand for that:
file=${1--}
The substitution ${variable-fallback} evaluates to the variable's value, or fallback if it's unset.
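Putting the idiom into the original script might look like this (a sketch assuming GNU head, which reads standard input when the operand is -):
#!/bin/bash
# my_script.sh -- read the named file, or stdin if no argument is given
export LC_ALL=C
file=${1--}
head -n1 -- "$file" |
tr "\t" "\n" |
awk '{print $1 " " NR-1}'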
I believe your script should work as-is, though; head will read standard input if the (unquoted!) file name you pass in evaluates to the empty string.
Take care to properly double-quote all interpolations of "$file", by the way; otherwise, your script won't work on filenames containing spaces or shell metacharacters. (Then you break the fortunate side effect of not passing a filename to head if your script did not receive one, though.)
