In my bash script, I have a string variable with a $ sign in it, and it isn't escaped. It looks like this:
x="hello $world"
Obviously, when I echo "$x", the output is hello, since $world is being interpreted as a variable. My goal is to modify the string to be hello \$world. I tried several techniques, listed below, but none of them seem to work:
y="$(echo "$x" | sed 's/\$/z/g')" (outputs hello)
y="$(echo "$x" | sed 's/$/z/g')" (outputs hello z, even though I didn't escape \ in sed)
Even tried Bash's native string replacement through:
y=${x//\$/z} (outputs hello)
I realize that I could easily do any of these if the string weren't stored in a variable, but the way my script works, this string will be stored in a variable first, so I need to figure out how to add the \ after that. I don't care if I create a new copy of the string or edit the same string.
The assignment (with $world empty or undefined) is the same as writing
x="hello "
Nothing you do to $x will find a $ in there, unless you add one from outside.
Perhaps you meant instead:
x='hello $world'
You can use Bash itself:
x='hello $world'
echo "${x//\$/\\$}"
hello \$world
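For completeness, a minimal sketch (assuming the assignment uses single quotes, as above, so the $ actually survives) showing that both of the question's approaches then work:
x='hello $world'
y="${x//\$/\\$}"                              # parameter expansion
z="$(printf '%s\n' "$x" | sed 's/\$/\\$/g')"  # sed
echo "$y"   # hello \$world
echo "$z"   # hello \$world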
Is there a way to prevent envsubst from substituting a $VARIABLE? For example, I would expect something like:
export THIS=THAT
echo "dont substitute \\\$THIS" | envsubst
and have it return
dont substitute $THIS
but instead I get
dont substitute \THAT
is there any escape character for doing this?
If you give envsubst a list of variables, it only substitutes those variables, ignoring other substitutions. I'm not exactly sure how it works, but something like the following seems to do what you want:
$ export THIS=THAT FOO=BAR
$ echo 'dont substitute $THIS but do substitute $FOO' | envsubst '$FOO'
dont substitute $THIS but do substitute BAR
Note that $THIS is left alone, but $FOO is replaced by BAR.
export DOLLAR='$'
export THIS=THAT
echo '${DOLLAR}THIS' | envsubst
Or, more clearly:
export THIS=THAT
echo '${DOLLAR}THIS' | DOLLAR='$' envsubst
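For instance, a minimal sketch of that trick applied to a whole line (the line content is made up):
export THIS=THAT
printf '%s\n' 'keep ${DOLLAR}THIS literal, but expand $THIS' | DOLLAR='$' envsubst
# keep $THIS literal, but expand THAT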
My workaround is as follows:
Original template:
$change_this
$dont_change_this
Edited template:
$change_this
§dont_change_this
Now you can process:
envsubst < $template | sed -e 's/§/$/g'
This relies on the character § not occurring anywhere else in your template. Any other character that doesn't appear in the template will do.
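A quick sketch of the whole round trip (file and variable names are made up):
printf '%s\n' '$change_this' '§dont_change_this' > template
change_this=expanded envsubst < template | sed -e 's/§/$/g'
# expanded
# $dont_change_this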
$ echo $SHELL
/bin/bash
$ echo \$SHELL
$SHELL
$ echo \$SHELL | envsubst
/bin/bash
$ echo \$\${q}SHELL | envsubst
$SHELL
So writing $$ lets you get a literal $ character through envsubst. Then just "substitute" a non-existent variable right after it (here I used ${q}, but it can be something more meaningful like ${my_empty_variable}) and you'll end up with what you need.
Just as with the § solution, you need something special here: a non-existent variable, which I like a bit more than running an additional sed pass over the templates.
If there's only one or two variables you don't want to expand, you can sort of whitelist them by temporarily setting them to their own name, like this:
$ echo 'one $two three $four' | four='$four' envsubst
one three $four
Here, the $four variable gets replaced with $four, effectively leaving it unchanged.
In my case I wanted to substitute only the variables that are already defined, leaving any undefined ones untouched. To do so, run:
envsubst "$(env | sed -e 's/=.*//' -e 's/^/\$/g')"
Another way to "escape" some environment variable substitution is to use default value assignment (or any other variable processing) as envsubst will not substitute these:
$ export two=2
$ echo 'one $two three ${four:-}' | envsubst
one 2 three ${four:-}
$four is not substituted, while the default-value processing is still visible in the output. This does not matter, though: processing this line later on will still deliver nothing if the variable is not set, and its value when it is set.
Here's an alternative that I use, as it saves installing the entire gettext package for just one program. I have this awk script, which I call envtmpl; it swaps any environment variable that looks like {{ENV-VAR}} for the value of ENV-VAR:
#! /usr/bin/awk -f
# For each line, replace every {{VAR}} with the value of VAR from the environment.
# (_ is an unset variable, used here purely as an empty "spacer" in the concatenation.)
{ for (a in ENVIRON) gsub("{{" _ a _ "}}",ENVIRON[a]); print }
So
$ echo "My shell '{{SHELL}}' is cool" | envtmpl
My shell '/bin/bash' is cool
As you can see, if {{ and }} aren't what you prefer, it's really easy to change, and the script works fine with busybox's awk.
It's not going to be the world's fastest solution, but it's really easy to implement and I mostly run it to prepare config files, so speed is pretty irrelevant.
WARNING: The only major difference between this and envsubst is that this will NOT alter variables that have no value. That is, {{HAS-NO-VALUE}} will be left exactly as it is, whereas envsubst will remove those (replace them with an empty string).
You can fix this by adding more code into the awk, if you want.
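For example, here is a hedged sketch of one such extension (not part of the original script) that mimics envsubst by blanking any placeholder that is still unmatched after the substitution pass:
#! /usr/bin/awk -f
# First substitute every {{VAR}} that exists in the environment,
# then blank out any {{...}} placeholder that is still left over.
{
  for (a in ENVIRON) gsub("{{" a "}}", ENVIRON[a])
  gsub(/[{][{][A-Za-z0-9_-]+[}][}]/, "")
  print
}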
The way I did it is
export DONT_CHANGE_THIS=\${DONT_CHANGE_THIS}
envsubst < some-template.yml > changed.yml
So envsubst will replace ${var} with the literal text ${var} (the backslash only stops the shell from expanding it at assignment time), and in the output ${var} is printed as it is.
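As a concrete sketch (the variable name is from the example above; the template line is made up):
export DONT_CHANGE_THIS=\${DONT_CHANGE_THIS}
printf '%s\n' 'value: ${DONT_CHANGE_THIS}' | envsubst
# value: ${DONT_CHANGE_THIS}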
I used an escape character for this:
MYENVVAR="\${MYENVVAR}"
export MYENVVAR
envsubst #whatever you want
then reset it to the value I actually want:
MYENVVAR="my value"
export MYENVVAR
I just connected parts of other answers to create a one-liner that substitutes variables prefixed with $, but ignores $$:
echo "\$TEST ; \$\$l" > TEST_FILE
cat TEST_FILE
# $TEST ; $$l
export TEST=1
cat TEST_FILE | sed -e 's/\$\$/§/g' | envsubst | sed -e 's/§/\$/g'
# 1 ; $l
I have a variable in a linux bash ".sh" script
$data="test_1"
now I want to create a new variable ($name) that contains only the part of $data before the underscore, so
$name="test"
I thought of doing this with sed
name=$(echo "$dataset" | sed 's/_.*//');
but this doesn't seem to work. What am I doing wrong?
No need to call an external process (sed). Instead you can use the shell's parameter substitution, like this:
$ data="test_1"
$ echo "${data%%_*}"
test
${var%%Pattern} removes from $var the longest part of Pattern that matches the back end (from the right) of $var.
${var%Pattern} removes the shortest matching part instead.
More info on parameter substitution can be found in the Bash manual under Shell Parameter Expansion.
You can store it in a variable like this:
$ name="${data%%_*}"
$ echo "$name"
test
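In case the part after the underscore is ever needed as well, the mirror-image expansion works the same way (a small sketch, not part of the original question):
data="test_1"
echo "${data%%_*}"   # test  (everything before the first _)
echo "${data#*_}"    # 1     (everything after the first _)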
In order to get eval to work on commands that contain spaces inside one of the parameters, I have only found this to work so far:
eval 'sed 's/foo/foo'" "'bar/g' filename'
In a hypothetical program where users would enter a command and then the command and arguments to be fed to eval, this isn't a very elegant or robust solution. Are there any other ways to run the eval command so that the interface for my_command can be a little more user friendly? The following is an example of how the program accepts arguments now.
my_command 'sed 's/foo/foo'" "'bar/g' filename'
I would like the interface to work something like this:
my_command sed 's/foo/foo bar/g' filename
edit:
I'll try asking a different question:
How do I get bash to read input from the command line literally? I want the exact input to be preserved, so if there are quotes I want to keep them. I can accomplish what I want by using egrep to read from a file and then sanitizing the input, like so:
egrep '/.*/' filename |
sed 's/\(.*\)['"'"']\(.*\) \(.*\)['"'"']\(.*\)/\1'"\'"'\2" "\3'"\'"'\4/g'
with "filename" containing this line
sed 's/foo/foo bar/g' file
this gives me the desired output of:
sed 's/foo/foo" "bar/g' file
Problem here is that I can't echo "$@" because bash interprets the quotes. I want the literal input without having to read from a file.
Original question
For your preferred use-case, you'd simply write (inside my_command):
"$#"
to execute the command as given.
Your eval line is odd:
eval 'sed 's/foo/foo'" "'bar/g' filename'
Because of the way single quotes don't nest, it is equivalent to:
eval 'sed s/foo/foo" "bar/g filename'
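One way to see how such a line is actually tokenized by the shell (a debugging sketch, not from the original answer):
printf '<%s> ' sed 's/foo/foo'" "'bar/g' filename; echo
# <sed> <s/foo/foo bar/g> <filename>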
Revised question
Possible solution:
egrep '/.*/' filename | sh
This feeds what is in filename directly to the shell for interpretation. Given file containing:
Some text containing foo; and bar.
More foo bar?
More text; more foo and bar; more foo bar beyond the possibility of unfooing.
The output is:
Some text containing foo bar; and bar.
More foo bar bar?
More text; more foo bar and bar; more foo bar bar beyond the possibility of unfoo baring.
Fixing quotes is hard!
Note that your complex sed script is not complex enough. Given filename containing:
sed 's/foo/foo bar/g' file
sed 's/foo bar/foo bar baz/g' file
the output from:
egrep '/.*/' filename |
sed 's/\(.*\)['"'"']\(.*\) \(.*\)['"'"']\(.*\)/\1'"\'"'\2" "\3'"\'"'\4/g'
is:
sed 's/foo/foo" "bar/g' file
sed 's/foo bar/foo bar" "baz/g' file
which has not solved all the problems for the eval.
I've spent a lot of time, on and off, working on such issues over quite a long period of time (a quarter century is no exaggeration), and it isn't trivial. You can find one discussion in extenso in How to iterate over arguments in bash script. Somewhere, I have another answer which goes through gyrations about this stuff, but I can't immediately find it (where 'immediately' means an hour or so of distracted searching, where the distractions were sets of duplicate questions, etc). It may have been deleted, or I may have looked in the wrong place.
Your design is flawed. Create a user interface that doesn't let users input commands directly; give them options, or let them enter only the parameters.
At the back end, do your sanitization checks on the parameters before calling sed or whatever other tools you need. You don't have to use eval.
Array quoting
The following keeps spaces in arguments by quoting each element of array:
function token_quote {
local quoted=()
for token; do
quoted+=( "$(printf '%q' "$token")" )
done
printf '%s\n' "${quoted[*]}"
}
Example usage:
$ token_quote token 'single token' token
token single\ token token
Above, note that the space in single token is quoted with a backslash.
$ set $(token_quote token 'single token' token)
$ eval printf '%s\\n' "$@"
token
single token
token
$
This shows that the tokens are indeed kept separate.
Given some untrusted user input:
% input="Trying to hack you; date"
Construct a command to eval:
% cmd=(echo "User gave:" "$input")
Eval it, with seemingly correct quoting:
% eval "$(echo "${cmd[#]}")"
User gave: Trying to hack you
Thu Sep 27 20:41:31 +07 2018
Note you were hacked. date was executed rather than being printed literally.
Instead with token_quote():
% eval "$(token_quote "${cmd[#]}")"
User gave: Trying to hack you; date
%
eval isn't evil - it's just misunderstood :)
It can actually work as you desire. Use "$@" - this will pass all the arguments exactly as they were given on the command line.
If my_command.sh contains:
sed "$#"
Then my_command.sh 's/foo/foo bar/g' filename will do exactly what you expect.
I'm trying to use bash string operators on a constant string. For instance, you can do the following on variable $foo:
$ foo=a:b:c; echo ${foo##*:}
c
Now, if the "a:b:c" string is constant, I would like to have a more concise solution like:
echo ${"a:b:c"##*:}
However, this is not valid bash syntax. Is there any way to perform this?
[The reason I need to do this (rather than hardcoding the result of the substitution, ie. "c" here) is because I have a command template where a "%h" placeholder is replaced by something before running the command; the result of the substitution is seen as a constant by bash.]
That's not possible using parameter expansion.
You could use other commands for this, like sed, awk, or expr, but I don't see the need for it.
You could just do:
tmp=%h
echo ${tmp##*:}
Or if speed is not an issue, and you don't want to clutter the current environment with unneeded variables:
(tmp=%h; echo ${tmp##*:})
Anyway, you'd be better off using the command template to do the string manipulation or using something simple like cut:
# get the third field delimited by :
$ cut -d: -f3 <<<'a:b:c'
c
Or more sophisticated like awk or sed:
# get the last field separated by ':'
$ awk -F: '{print $NF}'<<<'a:b:c'
c
$ sed 's/.*:\([^:]*\)/\1/'<<<'a:b:c'
c
Depends on what you need.
You could use expr to get a similar result:
$ expr match "a:b:c" '.*:\(.*\)'
c
You may be able to use Bash regex matching:
pattern='.*:([^:]+)$'
[[ "a:b:c" =~ $pattern ]]
echo "${BASH_REMATCH[1]}"
But why can't you do your template substitution into a variable assignment, then use the variable in the parameter expansion?
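In other words, something along these lines (the %h placeholder is from the question; whatever replaces it just needs to land in a variable first):
h='a:b:c'        # pretend %h was already replaced with a:b:c
echo "${h##*:}"  # c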
I'm writing a shell script that should be somewhat secure, i.e., does not pass secure data through parameters of commands and preferably does not use temporary files. How can I pass a variable to the standard input of a command?
Or, if it's not possible, how can I correctly use temporary files for such a task?
Passing a value to standard input in Bash is as simple as:
your-command <<< "$your_variable"
Always make sure you put quotes around variable expressions!
Be cautious: this will probably work only in bash and not in sh.
Simple, but error-prone: using echo
Something as simple as this will do the trick:
echo "$blah" | my_cmd
Do note that this may not work correctly if $blah contains -n, -e, -E etc; or if it contains backslashes (bash's copy of echo preserves literal backslashes in absence of -e by default, but will treat them as escape sequences and replace them with corresponding characters even without -e if optional XSI extensions are enabled).
More sophisticated approach: using printf
printf '%s\n' "$blah" | my_cmd
This does not have the disadvantages listed above: all possible C strings (strings not containing NULs) are printed unchanged.
(cat <<END
$passwd
END
) | command
The cat is not really needed, but it helps to structure the code better and allows you to use more commands in parentheses as input to your command.
Note that the echo "$var" | command operations mean that standard input is limited to the line(s) echoed. If you also want the terminal to be connected, then you'll need to be fancier:
{ echo "$var"; cat - ; } | command
( echo "$var"; cat - ) | command
This means that the first line(s) will be the contents of $var but the rest will come from cat reading its standard input. If the command does not do anything too fancy (try to turn on command line editing, or run like vim does) then it will be fine. Otherwise, you need to get really fancy - I think expect or one of its derivatives is likely to be appropriate.
The two command-line notations are practically identical, but the second semicolon is necessary with the braces, whereas it is not needed with the parentheses.
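A concrete sketch of that pattern (the commands are arbitrary, just to show the ordering):
var="header line"
printf '%s\n' 'body line 1' 'body line 2' | { echo "$var"; cat -; }
# header line
# body line 1
# body line 2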
This robust and portable way has already appeared in comments. It should be a standalone answer.
printf '%s' "$var" | my_cmd
or
printf '%s\n' "$var" | my_cmd
Notes:
It's better than echo, reasons are here: Why is printf better than echo?
printf "$var" is wrong. The first argument is format where various sequences like %s or \n are interpreted. To pass the variable right, it must not be interpreted as format.
Usually variables don't contain trailing newlines. The former command (with %s) passes the variable as it is. However, tools that work with text may ignore or complain about an incomplete line (see Why should text files end with a newline?). So you may want the latter command (with %s\n), which appends a newline character to the content of the variable. Non-obvious facts:
Here string in Bash (<<<"$var" my_cmd) does append a newline.
Any method that appends a newline results in non-empty stdin of my_cmd, even if the variable is empty or undefined.
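A small sketch that makes both of those facts visible (wc -c just counts bytes):
var=abc
printf '%s' "$var" | wc -c    # 3: no newline added
cat <<< "$var" | wc -c        # 4: the here string appended a newline
unset novar
cat <<< "$novar" | wc -c      # 1: stdin is a single newline even though the variable is unset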
I liked Martin's answer, but it has some problems depending on what is in the variable. This
your-command <<< """$your_variable"""
is better if your variable contains " or !.
As per Martin's answer, there is a Bash feature called Here Strings (which itself is a variant of the more widely supported Here Documents feature):
3.6.7 Here Strings
A variant of here documents, the format is:
<<< word
The word is expanded and supplied to the command on its standard
input.
Note that Here Strings would appear to be Bash-only, so, for improved portability, you'd probably be better off with the original Here Documents feature, as per PoltoS's answer:
( cat <<EOF
$variable
EOF
) | cmd
Or, a simpler variant of the above:
(cmd <<EOF
$variable
EOF
)
You can omit ( and ), unless you want to have this redirected further into other commands.
Try this:
echo "$variable" | command
If you came here from a duplicate, you are probably a beginner who tried to do something like
"$variable" >file
or
"$variable" | wc -l
where you obviously meant something like
echo "$variable" >file
echo "$variable" | wc -l
(Real beginners also forget the quotes; usually use quotes unless you have a specific reason to omit them, at least until you understand quoting.)