Bash execute string as command without expanding escaped spaces [duplicate]

This question already has answers here:
Why does shell ignore quoting characters in arguments passed to it through variables? [duplicate]
(3 answers)
How can I store a command in a variable in a shell script?
(12 answers)
Closed 2 years ago.
I have an external executable that I need to pass arguments to. In a bash script, I have code that determines these arguments. Some arguments may have escaped spaces.
I then need to execute that string so that each argument is passed intact, without the escaped spaces being split further.
# ... some code that determines argument string
# the following is an example of the string
ARGSTR='./executable test\ file.txt arg2=true'
exec ${ARGSTR}
I need $ARGSTR to be expanded so that the arguments reach ./executable, but the individual arguments should not be split further. I have tried quoting "test file.txt", but it is still not passed as one argument to ./executable.
Is there a way to do something like this?

Use an array instead of a string:
#!/usr/bin/env bash
ARGSTR=('./executable' 'test file.txt' 'arg2=true')
exec "${ARGSTR[#]}"
See:
BashFAQ-50 - I'm trying to put a command in a variable, but the complex cases always fail.
https://stackoverflow.com/a/44055875/7939871
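If the arguments are computed at run time, build the array piece by piece and exec it at the end. A minimal sketch (want_arg2 and the file names are placeholders for whatever your script actually determines):
#!/usr/bin/env bash
want_arg2=true                          # hypothetical flag computed earlier in the script
args=('./executable' 'test file.txt')   # the space survives because it is one array element
if [[ $want_arg2 == true ]]; then
  args+=('arg2=true')                   # append further arguments as separate elements
fi
exec "${args[@]}"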

This may achieve what you want:
ARGSTR='./executable test\ file.txt arg2=true'
exec bash -c "exec ${ARGSTR}"
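Be aware that this hands the whole string to a fresh shell for re-parsing, so it behaves much like eval: the escaped space works because the string is parsed a second time, but any shell metacharacters in ARGSTR are executed too, which is unsafe with untrusted input. A roughly equivalent sketch:
ARGSTR='./executable test\ file.txt arg2=true'
eval "exec ${ARGSTR}"   # same second round of parsing, same caveats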

Related

Bash script throws command error when populating variable [duplicate]

This question already has answers here:
How do I set a variable to the output of a command in Bash?
(15 answers)
Bash variable from command with pipes, quotes, etc
(2 answers)
Variable variable assignment error -"command not found"
(1 answer)
Closed 1 year ago.
I have the following two lines in a bash script
iperf_options=" -O 10 -V -i 10 --get-server-output -P " $streams
$iperf_options=$iperf_options $proto
and
$streams = 2
$proto = -u
But when I run this I get the following error.
./bandwidth: line 116: -O: command not found
I am simply trying to write a string and then append it to a variable, so why does it throw the error on the -O?
I have looked around the web but I just seem to find stuff about spaces around the "=".
Any help gratefully received.
Thank you.
Code block to show the error:
proto=-u
streams=2
iperf_options=" -O 10 -V -i 10 --get-server-output -P " $streams
$iperf_options=$iperf_options $proto
Running this gives the following output:
./test
./test: line 3: 2: command not found
./test: line 4: =: command not found
There are two main mistakes here, in a variety of combinations.
Use $ to get the value of a variable, never when setting the variable (or changing its properties):
$var=value # Bad
var=value # Good
var=$othervar # Also good
Spaces are critical delimiters in shell syntax; adding (or removing) them can change the meaning of a command in unexpected ways:
var = value # Runs `var` as a command, passing "=" and "value" as arguments
var=val1 val2 # Runs `val2` as a command, with var=val1 set in its environment
var="val1 val2" # Sets `var1` to `val1 val2`
So, in this command:
iperf_options=" -O 10 -V -i 10 --get-server-output -P " $streams
The space between iperf_options="..." and $streams means that it'll expand $streams and try to run it as a command (with iperf_options set in its environment). You want something like:
iperf_options=" -O 10 -V -i 10 --get-server-output -P $streams"
Here, since $streams is part of the double-quoted string, it'll be expanded (variables expand inside double quotes, but not inside single quotes), and its value included in the value assigned to iperf_options.
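For example, a minimal check of the corrected assignment (the echo is only there to show the result):
streams=2
iperf_options=" -O 10 -V -i 10 --get-server-output -P $streams"
echo "$iperf_options"   # -> -O 10 -V -i 10 --get-server-output -P 2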
There's actually a third mistake (or at least dubious scripting practice): building lists of options as simple string variables. This works in simple cases, but fails when things get complex. If you're using a shell that supports arrays (e.g. bash, ksh, zsh, etc, but not dash), it's better to use those instead: store each option/argument as a separate array element, and then expand the array with "${arrayname[@]}" to get all of the elements out intact (yes, all those quotes, braces, brackets, etc are actually needed).
proto="-u" # If this'll always have exactly one value, plain string is ok
streams=2 # Same here
iperf_options=(-O 10 -V -i 10 --get-server-output -P "$streams")
iperf_options=("${iperf_options[#]}" "$proto")
# ...
iperf "${iperf_options[#]}"
Finally, I recommend shellcheck.net to sanity-check your scripts for common mistakes. A warning, though: it won't catch all errors, since it doesn't know your intent. For instance, if it sees var=val1 val2 it'll assume you meant to run val2 as a command and won't flag it as a mistake.

Why the assignment of an array string (with brackets) to environment variable is not working [duplicate]

This question already has answers here:
I just assigned a variable, but echo $variable shows something else
(7 answers)
Closed 2 years ago.
Execute the following command in bash shell:
export sz1='"authorities" : ["uaa.resource"]'
Now, try echo $sz1
I expect to see the following output:
"authorities" : ["uaa.resource"]
But instead I get this:
"authorities" : c
The interesting thing is that I have dozens of servers where I can execute this type of variable assignment and it works except on this server. This server has exactly the same OS version, profile, bash version etc. What could be the reason for this behavior?
Always quote your variables. Use
echo "$sz1"
When you don't quote the variable, word splitting and wildcard expansion are performed on the expanded value. The word ["uaa.resource"] is a wildcard (a bracket expression) that will match any of the following single-character filenames:
"
u
a
.
r
e
s
o
c
On that one machine you have a file named c, so the wildcard matches and gets replaced with that filename.
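You can reproduce the behaviour anywhere (a minimal sketch; /tmp/globdemo is just a hypothetical scratch directory):
mkdir -p /tmp/globdemo && cd /tmp/globdemo
touch c
export sz1='"authorities" : ["uaa.resource"]'
echo $sz1     # unquoted: the bracket expression matches the file named c -> "authorities" : c
echo "$sz1"   # quoted: "authorities" : ["uaa.resource"]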

pass string with spaces to gcc [duplicate]

This question already has answers here:
How can I store a command in a variable in a shell script?
(12 answers)
Closed 4 years ago.
These work as advertised:
grep -ir 'hello world' .
grep -ir hello\ world .
These don't:
argumentString1="-ir 'hello world'"
argumentString2="-ir hello\\ world"
grep $argumentString1 .
grep $argumentString2 .
Despite 'hello world' being enclosed by quotes in the second example, grep interprets 'hello (and hello\) as one argument and world' (and world) as another, which means that, in this case, 'hello will be the search pattern and world' will be the search path.
Again, this only happens when the arguments are expanded from the argumentString variables. grep properly interprets 'hello world' (and hello\ world) as a single argument in the first example.
Can anyone explain why this is? Is there a proper way to expand a string variable that will preserve the syntax of each character such that it is correctly interpreted by shell commands?
Why
When the string is expanded, it is split into words, but it is not re-evaluated to find special characters such as quotes or dollar signs or ... This is the way the shell has 'always' behaved, since the Bourne shell back in 1978 or thereabouts.
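A quick illustration of that rule (a minimal sketch):
v="'a b'"
printf '<%s>\n' $v     # unquoted: splits on the space, the quotes stay literal -> <'a> <b'>
printf '<%s>\n' "$v"   # quoted: one intact word -> <'a b'>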
Fix
In bash, use an array to hold the arguments:
argumentArray=(-ir 'hello world')
grep "${argumentArray[#]}" .
Or, if brave/foolhardy, use eval:
argumentString="-ir 'hello world'"
eval "grep $argumentString ."
On the other hand, discretion is often the better part of valour, and working with eval is a place where discretion is better than bravery. If you are not completely in control of the string that is eval'd (if there's any user input in the command string that has not been rigorously validated), then you are opening yourself to potentially serious problems.
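To see why that matters, here is a contrived sketch: an unvalidated string can smuggle in an extra command that eval will happily run:
argumentString="-ir 'hello world' . ; echo this-also-runs"
eval "grep $argumentString"   # after the ';' the injected echo executes as a second command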
Note that the sequence of expansions for Bash is described in Shell Expansions in the GNU Bash manual. Note in particular sections 3.5.3 Shell Parameter Expansion, 3.5.7 Word Splitting, and 3.5.9 Quote Removal.
When you put quote characters into variables, they just become plain literals (see http://mywiki.wooledge.org/BashFAQ/050; thanks @tripleee for pointing out this link).
Instead, try using an array to pass your arguments:
argumentString=(-ir 'hello world')
grep "${argumentString[#]}" .
In looking at this and related questions, I'm surprised that no one brought up using an explicit subshell. For bash, and other modern shells, you can execute a command line explicitly. In bash, it requires the -c option.
argumentString="-ir 'hello world'"
bash -c "grep $argumentString ."
This works exactly as the original questioner desired. There are two restrictions to this technique:
You can only use single quotes within the command or argument strings.
Only exported environment variables will be available to the command
Also, this technique handles redirection and piping, and other shellisms work as well. You can also use bash built-in commands as well as any other command that works at the command line, because you are essentially asking a subshell bash to interpret the string directly as a command line. Here's a more complex example, a somewhat gratuitously complex ls variant.
cmd="prefix=`pwd` && ls | xargs -n 1 echo \'In $prefix:\'"
bash -c "$cmd"
I have built command processors both this way and with parameter arrays. Generally, this way is much easier to write and debug, and it's trivial to echo the command you are executing. OTOH, param arrays work nicely when you really do have abstract arrays of parameters, as opposed to just wanting a simple command variant.

expand unix variable inside sed command [duplicate]

This question already has answers here:
Replace a string in shell script using a variable
(12 answers)
sed substitution with Bash variables
(6 answers)
Closed 4 years ago.
I need to replace the current value in a configuration file with a new value which is assigned to a variable,
like
file_name=abc.txt
needs to be replaced like
file_name=xyz.txt
where $file=xyz.txt
I tried
sed -i 's/file_name=.*/file_name=$file/g' conf_file.conf
however the variable is not getting expanded;
it comes out as file_name=$file.
Any pointers?
This should work, assuming that the variable file has the value xyz.txt assigned to it:
sed "s/file_name=.*/file_name=${file}/g" file_name
Output:
file_name=xyz.txt
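One caveat: if $file can contain a slash (for example a path such as dir/xyz.txt), the slash breaks the s/// expression. Any delimiter that cannot appear in the value works, for instance:
file=dir/xyz.txt
sed "s|file_name=.*|file_name=${file}|g" conf_file.conf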

How to pass arguments/parameters when executing bash/shell script from NodeJS [duplicate]

This question already has an answer here:
Execute an shell program with node.js and pass node variable to the shell command [closed]
(1 answer)
Closed 4 years ago.
I have the following code:
exec('sh cert-check-script-delete.sh', req.body.deletedCert);
console.log(req.body.deletedCert);
The console log correctly shows the req.body.deletedCert is non-empty.
And in cert-check-script-delete.sh I have:
#!/bin/sh
certs.json="./" # Name of JSON file and directory location
echo -e $1 >> certs.json
But it's just writing an empty line to certs.json
I've also tried:
exec('sh cert-check-script-delete.sh' + req.body.deletedCert)
But neither format works.
Use execFile(), and pass your arguments out-of-band:
child_process.execFile('./cert-check-script-delete.sh', [req.body.deletedCert])
That way your string (from req.body.deletedCert) is passed as a literal argument, not parsed as code. Note that this requires that your script be marked executable (chmod +x cert-check-script-delete.sh) and that it start with a valid shebang.
If you can't fix the file permissions to make your script executable, at least pass the arguments out-of-band:
child_process.execFile('/bin/sh', ['./cert-check-script-delete.sh', req.body.deletedCert])
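The script itself also needs attention: certs.json is not a valid shell variable name (dots are not allowed in names), and $1 should be quoted so spaces in the argument survive. A minimal sketch of a corrected cert-check-script-delete.sh (the output path is an assumption):
#!/bin/sh
out="./certs.json"            # plain file path; shell variable names cannot contain dots
printf '%s\n' "$1" >> "$out"  # quote "$1" to keep spaces; printf is more portable than echo -e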
