When I run this from bash:
# su -c "psql -d mapping -c \"INSERT INTO mapping (ip,username) VALUES('1.2.3.4','test');\"" postgres
It works fine, but when I try it from Python:
subprocess.run("su -c \"psql -d mapping -c \"INSERT INTO mapping (ip,username) VALUES('1.2.3.4','test');\"\" postgres")
It fails. I have tried different quotation marks, and they all fail. Could you please help?
There are two solutions, depending on whether or not you use the shell from Python. The trivial but inefficient solution is to pass the string with shell=True and basically just add Python quoting around it.
import subprocess

subprocess.run(r'''su -c "psql -d mapping -c \"INSERT INTO mapping (ip,username) VALUES('1.2.3.4','test');\"" postgres''',
               shell=True,
               # For good measure, you should check its status
               check=True)
More efficiently, and perhaps in fact more readably, you can remove the shell from the equation and split the command into strings yourself.
subprocess.run([
'su', '-c',
# The argument to "su -c" should be one string
# Because we bypassed the shell, the double quotes don't need escaping
'''psql -d mapping -c "INSERT INTO mapping (ip,username) VALUES('1.2.3.4','test');"''',
'postgres'],
check=True)
Notice how with shell=True the first argument is a string which gets passed to the shell, whereas without it, you pass a list of tokens directly to the OS-level exec() or (somewhat less straightforwardly on Windows) CreateProcess(). Notice also how in the first instance I used an r'...' string to avoid having Python meddle with the backslashes in the string.
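If it helps to see how the shell tokenizes the original working command line, here is a small illustrative sketch (not part of the answer's code) that prints each word the shell would hand to su; note how they line up with the list passed to subprocess.run above.
# printf prints one angle-bracketed line per argument it receives
printf '<%s>\n' su -c "psql -d mapping -c \"INSERT INTO mapping (ip,username) VALUES('1.2.3.4','test');\"" postgres
# <su>
# <-c>
# <psql -d mapping -c "INSERT INTO mapping (ip,username) VALUES('1.2.3.4','test');">
# <postgres>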
This question already has answers here:
How do I set a variable to the output of a command in Bash?
(15 answers)
Bash variable from command with pipes, quotes, etc
(2 answers)
Variable variable assignment error -"command not found"
(1 answer)
Closed 1 year ago.
I have the following two lines in a bash script
iperf_options=" -O 10 -V -i 10 --get-server-output -P " $streams
$iperf_options=$iperf_options $proto
and
$streams = 2
$proto = -u
but when I run this I get the following error:
./bandwidth: line 116: -O: command not found
I am simply trying to write a string and then append it to a variable, so why does it throw the error on the -O?
I have looked around the web, but I just seem to find stuff about spaces around the "=".
Any help gratefully received.
Thank you.
Code block to show the error:
proto=-u
streams=2
iperf_options=" -O 10 -V -i 10 --get-server-output -P " $streams
$iperf_options=$iperf_options $proto
Running this will give this output:
./test
./test: line 3: 2: command not found
./test: line 4: =: command not found
There are two main mistakes here, in a variety of combinations.
Use $ to get the value of a variable, never when setting the variable (or changing its properties):
$var=value # Bad
var=value # Good
var=$othervar # Also good
Spaces are critical delimiters in shell syntax; adding (or removing) them can change the meaning of a command in unexpected ways:
var = value # Runs `var` as a command, passing "=" and "value" as arguments
var=val1 val2 # Runs `val2` as a command, with var=val1 set in its environment
var="val1 val2" # Sets `var1` to `val1 val2`
So, in this command:
iperf_options=" -O 10 -V -i 10 --get-server-output -P " $streams
The space between iperf_options="..." and $streams means that it'll expand $streams and try to run it as a command (with iperf_options set in its environment). You want something like:
iperf_options=" -O 10 -V -i 10 --get-server-output -P $streams"
Here, since $streams is part of the double-quoted string, it'll be expanded (variables expand inside double quotes, but not inside single quotes), and its value will be included in the value assigned to iperf_options.
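As a quick illustration of that rule (a separate snippet, not part of the script above):
streams=2
echo "double quotes: -P $streams"    # prints: double quotes: -P 2
echo 'single quotes: -P $streams'    # prints: single quotes: -P $streams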
There's actually a third mistake (or at least dubious scripting practice): building lists of options as simple string variables. This works in simple cases, but fails when things get complex. If you're using a shell that supports arrays (e.g. bash, ksh, zsh, etc, but not dash), it's better to use those instead: store each option/argument as a separate array element, and then expand the array with "${arrayname[@]}" to get all of the elements out intact (yes, all those quotes, braces, brackets, etc. are actually needed).
proto="-u" # If this'll always have exactly one value, plain string is ok
streams=2 # Same here
iperf_options=(-O 10 -V -i 10 --get-server-output -P "$streams")
iperf_options=("${iperf_options[#]}" "$proto")
# ...
iperf "${iperf_options[#]}"
Finally, I recommend shellcheck.net to sanity-check your scripts for common mistakes. A warning, though: it won't catch all errors, since it doesn't know your intent. For instance, if it sees var=val1 val2 it'll assume you meant to run val2 as a command and won't flag it as a mistake.
This question already has answers here:
How can I store a command in a variable in a shell script?
(12 answers)
Closed 4 years ago.
These work as advertised:
grep -ir 'hello world' .
grep -ir hello\ world .
These don't:
argumentString1="-ir 'hello world'"
argumentString2="-ir hello\\ world"
grep $argumentString1 .
grep $argumentString2 .
Despite 'hello world' being enclosed by quotes in the second example, grep interprets 'hello (and hello\) as one argument and world' (and world) as another, which means that, in this case, 'hello will be the search pattern and world' will be the search path.
Again, this only happens when the arguments are expanded from the argumentString variables. grep properly interprets 'hello world' (and hello\ world) as a single argument in the first example.
Can anyone explain why this is? Is there a proper way to expand a string variable that will preserve the syntax of each character such that it is correctly interpreted by shell commands?
Why
When the string is expanded, it is split into words, but it is not re-evaluated to find special characters such as quotes or dollar signs or ... This is the way the shell has 'always' behaved, since the Bourne shell back in 1978 or thereabouts.
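To see that splitting directly, here is a small illustration (not part of the original answer):
argumentString1="-ir 'hello world'"
# printf prints one line per word it receives: the expansion is split on
# whitespace, and the single quotes stay as literal characters.
printf '%s\n' $argumentString1
# -ir
# 'hello
# world'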
Fix
In bash, use an array to hold the arguments:
argumentArray=(-ir 'hello world')
grep "${argumentArray[#]}" .
Or, if brave/foolhardy, use eval:
argumentString="-ir 'hello world'"
eval "grep $argumentString ."
On the other hand, discretion is often the better part of valour, and working with eval is a place where discretion is better than bravery. If you are not completely in control of the string that is eval'd (if there's any user input in the command string that has not been rigorously validated), then you are opening yourself to potentially serious problems.
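As a contrived sketch of that risk (hypothetical input, harmless payload):
# Anything embedded in the string is executed, not just passed to grep.
userInput="'hello' . ; echo INJECTED"   # imagine this arrived from a user
eval "grep -ir $userInput ."
# grep searches for hello as intended, but the injected echo runs as well.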
Note that the sequence of expansions for Bash is described in Shell Expansions in the GNU Bash manual. Note in particular sections 3.5.3 Shell Parameter Expansion, 3.5.7 Word Splitting, and 3.5.9 Quote Removal.
When you put quote characters into variables, they just become plain literals (see http://mywiki.wooledge.org/BashFAQ/050; thanks @tripleee for pointing out this link).
Instead, try using an array to pass your arguments:
argumentString=(-ir 'hello world')
grep "${argumentString[#]}" .
In looking at this and related questions, I'm surprised that no one brought up using an explicit subshell. For bash, and other modern shells, you can execute a command line explicitly. In bash, it requires the -c option.
argumentString="-ir 'hello world'"
bash -c "grep $argumentString ."
This works exactly as the original questioner desired. There are two restrictions to this technique:
You can only use single quotes within the command or argument strings.
Only exported environment variables will be available to the command.
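A tiny sketch of the second restriction (the variable name is made up):
pattern="hello world"
bash -c 'echo "child sees: $pattern"'   # prints: child sees:              (not exported)
export pattern
bash -c 'echo "child sees: $pattern"'   # prints: child sees: hello world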
Also, this technique handles redirection and piping, and other shellisms work as well. You can also use bash built-in commands, as well as any other command that works at the command line, because you are essentially asking a subshell bash to interpret it directly as a command line. Here's a more complex example, a somewhat gratuitously complex ls -l variant.
cmd="prefix=`pwd` && ls | xargs -n 1 echo \'In $prefix:\'"
bash -c "$cmd"
I have built command processors both this way and with parameter arrays. Generally, this way is much easier to write and debug, and it's trivial to echo the command you are executing. OTOH, param arrays work nicely when you really do have abstract arrays of parameters, as opposed to just wanting a simple command variant.
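For instance, the echo-before-you-run debugging step is as simple as this (a sketch with a made-up command string):
cmd="ls | head -n 3"
echo "about to run: $cmd"   # trivial to inspect the exact command line first
bash -c "$cmd"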
I have a Bash script which generates, stores and modifies values in an array. These values are later used as arguments for a command.
For an MCVE I thought of an arbitrary command, bash -c 'echo 0="$0" ; echo 1="$1"', which demonstrates my problem. I will call my command with two arguments, -option1=withoutspace and -option2="with space". So it would look like this:
> bash -c 'echo 0="$0" ; echo 1="$1"' -option1=withoutspace -option2="with space"
if the call to the command were typed directly into the shell. It prints
0=-option1=withoutspace
1=-option2=with space
In my Bash script, the arguments are part of an array. However
#!/bin/bash
ARGUMENTS=()
ARGUMENTS+=('-option1=withoutspace')
ARGUMENTS+=('-option2="with space"')
bash -c 'echo 0="$0" ; echo 1="$1"' "${ARGUMENTS[#]}"
prints
0=-option1=withoutspace
1=-option2="with space"
which still shows the double quotes (because they are interpreted literally?). What works is
#!/bin/bash
ARGUMENTS=()
ARGUMENTS+=('-option1=withoutspace')
ARGUMENTS+=('-option2=with space')
bash -c 'echo 0="$0" ; echo 1="$1"' "${ARGUMENTS[#]}"
which prints again
0=-option1=withoutspace
1=-option2=with space
What do I have to change to make ARGUMENTS+=('-option2="with space"') work as well as ARGUMENTS+=('-option2=with space')?
(Maybe it's even entirely wrong to store arguments for a command in an array? I'm open for suggestions.)
Get rid of the single quotes. Write the options exactly as you would on the command line.
ARGUMENTS+=(-option1=withoutspace)
ARGUMENTS+=(-option2="with space")
Note that this is exactly equivalent to your second option:
ARGUMENTS+=('-option1=withoutspace')
ARGUMENTS+=('-option2=with space')
-option2="with space" and '-option2=with space' both evaluate to the same string. They're two ways of writing the same thing.
(Maybe it's even entirely wrong to store arguments for a command in an array? I'm open for suggestions.)
It's the exact right thing to do. Arrays are perfect for this. Using a flat string would be a mistake.
The command 'which' shows the path to a command.
The command 'less' opens a file.
How can I 'less' the file that 'which' outputs?
I don't want to use two commands like below to do it:
=>which script
/file/to/script/file
=>less /file/to/script/file
This is a use case for command substitution:
less -- "$(which commandname)"
That said, if your shell is bash, consider using type -P instead, which (unlike the external command which) is built into the shell:
less -- "$(type -P commandname)"
Note the quotes: These are important for reliable operation. Without them, the command may not work correctly if the filename contains characters inside IFS (by default, whitespace) or can be evaluated as a glob expression.
The double dashes are likewise there for correctness: Any argument after them is treated as positional (as per POSIX Utility Syntax Guidelines), so even if a filename starting with a dash were to be returned (however unlikely this may be), it ensures that less treats that as a filename rather than as the beginning of a sequence of options or flags.
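To make the quoting point concrete, here is a small sketch with a hypothetical path containing a space (printf stands in for less, so nothing needs to exist on disk):
path="/opt/my tools/script"   # hypothetical result of $(which ...)
printf '<%s>\n' "$path"       # one argument: </opt/my tools/script>
printf '<%s>\n' $path         # unquoted: split into </opt/my> and <tools/script>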
You may also wish to consider honoring the user's pager selection via the environment variable $PAGER, and using type without -P to look for aliases, shell functions and builtins:
cmdsource() {
  local sourcefile
  if sourcefile="$(type -P -- "$1")"; then
    "${PAGER:-less}" -- "$sourcefile"
  else
    echo "Unable to find source for $1" >&2
    echo "...checking for a shell builtin:" >&2
    type -- "$1"
  fi
}
This defines a function you can run:
cmdsource commandname
You should be able to just pipe it over; try this:
which script | less
I'm using the system command in Perl to execute the su command like this:
system("su -");
The above command works fine.
But if my command is this:
su -c "echo hello"
Then how do I embed this command into Perl's system command?
system can work with a list rather than a single string:
system LIST
system PROGRAM LIST
[...] Note that argument processing varies depending on the number of arguments. If there is more than one argument in LIST, or if LIST is an array with more than one value, starts the program given by the first element of the list with arguments given by the rest of the list.
So you could avoid the nested quote problems with this:
system('su', '-c', 'echo hello')
You simply need quote escaping, either by using a different quote set:
system('su -c "echo hello"');
Or by "escaping" the quotes themselves:
system("su -c \"echo hello\"");
Or as individual arguments, as mu is too short points out:
system("su", "-c", "echo hello");