Multiword string as a curl option using Bash

I want to get some data from a HTTP server. What it sends me depends on what I put in a POST request.
What I put in the INPUT_TEXT field is a sequence of words. When I run the following command, I get good looking output.
$ curl http://localhost:59125/process -d INPUT_TEXT="here are some words"
I want a bash script to take some string as a command line argument, and pass it appropriately to curl. The first thing I tried was to put the following in a script:
sentence=$1
command="curl http://localhost:59125/process -d INPUT_TEXT=\"${sentence}\""
$command
I then run the script like so:
$ ./script "here are some words"
But then I get a curl "Couldn't resolve host" error for each of "are", "some", and "words". It would seem that "here" was correctly treated as the INPUT_TEXT, but the rest of the words were then considered to be hosts rather than part of the option.
So I tried:
command=("curl" "http://localhost:59125/process" "-d" "INPUT_TEXT='$sentence'")
${command[@]}
I got the same output as the first script. I finally got what I wanted with:
result=$(curl http://localhost:59125/process -d INPUT_TEXT="${sentence}")
echo $result
I'm still unsure as to what the distinction is. In the first two cases, when I echoed out the contents of command, I got exactly what I had typed at the interactive Bash prompt, which had worked fine. What caused the difference?

The following will work:
command=("curl" "http://localhost:59125/process"
"-d" "INPUT_TEXT=$sentence")
"${command[#]}"
That has two changes from yours:
I removed the incorrect quotes around $sentence since you don't want to send quotes to the server (as far as I can see).
I put double-quotes around the use of "${command[@]}". Without the double quotes, the array's elements are concatenated with spaces between them and then the result is word-split. With double quotes, the individual array elements are used as individual words.
The second point is well-explained in the bash FAQ and a bunch of SO answers dealing with quotes.
The important thing to understand is that quotes only quote when a command is parsed. A quote which is a character in a variable is just a character; it is not reinterpreted when the value of the variable is expanded. Whitespace in the variable is used for word-splitting if the variable expansion is unquoted; the fact that the whitespace was quoted in the command which defined the variable is completely irrelevant. In this sense, bash is just the same as any other programming language.
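Putting both fixes together, a minimal version of the script (assuming the same local endpoint as in the question) would look like:
#!/bin/bash
sentence=$1
# Each array element is exactly one argument; no quote characters are embedded in the values.
command=(curl http://localhost:59125/process -d "INPUT_TEXT=${sentence}")
# Double-quoting the expansion keeps "here are some words" as a single word.
result=$("${command[@]}")
echo "$result"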

Related

How do I pass ">>" or "<<" to my script without the terminal trying to interpret it as me either appending to something or getting stdin?

My python script can take a series of bitwise operators as one of its arguments. They all work fine except for "=<<" which is roll left, and "=>>" which is roll right. I run my script like ./script.py -b +4,-4,=>>10,=<<1, where anything after -b can be any combination of similar operations. As soon as the terminal sees "<<" though, it just drops the cursor to a new line after the command and asks for more input instead of running the script. When it sees ">>", my script doesn't process the arguments correctly. I know it's because bash uses these characters for a specific purpose, but I'd like to get around it while still using "=>>" and "=<<" in my arguments for my script. Is there any way to do it without enclosing the argument in quotation marks?
Thank you for your help.
You should enclose the parameters that contain special symbols into single quotation marks (here, echo represents your script):
> echo '+4,-4,=>>10,=<<1'
+4,-4,=>>10,=<<1
Alternatively, save the parameters to a file (say, params.txt) and read them from the file onto the command line using the backticks:
> echo `cat params.txt`
+4,-4,=>>10,=<<1
Lastly, you can escape some offending symbols:
> echo +4,-4,=\>\>10,=\<\<1
+4,-4,=>>10,=<<1
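Double quotes work as well here, since < and > are not treated as redirection operators inside them:
> echo "+4,-4,=>>10,=<<1"
+4,-4,=>>10,=<<1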

pass string with spaces to gcc [duplicate]

These work as advertised:
grep -ir 'hello world' .
grep -ir hello\ world .
These don't:
argumentString1="-ir 'hello world'"
argumentString2="-ir hello\\ world"
grep $argumentString1 .
grep $argumentString2 .
Despite 'hello world' being enclosed by quotes in the second example, grep interprets 'hello (and hello\) as one argument and world' (and world) as another, which means that, in this case, 'hello will be the search pattern and world' will be the search path.
Again, this only happens when the arguments are expanded from the argumentString variables. grep properly interprets 'hello world' (and hello\ world) as a single argument in the first example.
Can anyone explain why this is? Is there a proper way to expand a string variable that will preserve the syntax of each character such that it is correctly interpreted by shell commands?
Why
When the string is expanded, it is split into words, but it is not re-evaluated to find special characters such as quotes or dollar signs or ... This is the way the shell has 'always' behaved, since the Bourne shell back in 1978 or thereabouts.
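You can watch the splitting happen by printing each resulting word on its own line (a small sketch reusing the question's variable):
argumentString1="-ir 'hello world'"
# Unquoted expansion is word-split; printf repeats its format once per argument.
printf '<%s>\n' $argumentString1
# <-ir>
# <'hello>
# <world'>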
Fix
In bash, use an array to hold the arguments:
argumentArray=(-ir 'hello world')
grep "${argumentArray[#]}" .
Or, if brave/foolhardy, use eval:
argumentString="-ir 'hello world'"
eval "grep $argumentString ."
On the other hand, discretion is often the better part of valour, and working with eval is a place where discretion is better than bravery. If you are not completely in control of the string that is eval'd (if there's any user input in the command string that has not been rigorously validated), then you are opening yourself to potentially serious problems.
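As a harmless sketch of that risk, here is a hypothetical argument string that smuggles in a second command:
argumentString="-ir 'hello world' .; echo INJECTED"
# eval re-parses the whole string, so the grep runs and then so does the injected echo.
eval "grep $argumentString"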
Note that the sequence of expansions for Bash is described in Shell Expansions in the GNU Bash manual. Note in particular sections 3.5.3 Shell Parameter Expansion, 3.5.7 Word Splitting, and 3.5.9 Quote Removal.
When you put quote characters into variables, they just become plain literals (see http://mywiki.wooledge.org/BashFAQ/050; thanks @tripleee for pointing out this link).
Instead, try using an array to pass your arguments:
argumentString=(-ir 'hello world')
grep "${argumentString[#]}" .
In looking at this and related questions, I'm surprised that no one brought up using an explicit subshell. For bash, and other modern shells, you can execute a command line explicitly. In bash, it requires the -c option.
argumentString="-ir 'hello world'"
bash -c "grep $argumentString ."
This works exactly as the original questioner desired. There are two restrictions to this technique:
You can only use single quotes within the command or argument strings.
Only exported environment variables will be available to the command.
Also, this technique handles redirection and piping, and other shellisms work as well. You can also use bash built-in commands as well as any other command that works at the command line, because you are essentially asking a subshell bash to interpret it directly as a command line. Here's a more complex example, a somewhat gratuitously complex ls -l variant.
cmd="prefix=`pwd` && ls | xargs -n 1 echo \'In $prefix:\'"
bash -c "$cmd"
I have built command processors both this way and with parameter arrays. Generally, this way is much easier to write and debug, and it's trivial to echo the command you are executing. OTOH, param arrays work nicely when you really do have abstract arrays of parameters, as opposed to just wanting a simple command variant.
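A small sketch of the second restriction (the variable name is illustrative):
pattern="hello world"
bash -c 'echo "$pattern"'   # prints an empty line; pattern is not exported
export pattern
bash -c 'echo "$pattern"'   # prints: hello world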

How to store command arguments which contain double quotes in an array?

I have a Bash script which generates, stores and modifies values in an array. These values are later used as arguments for a command.
For a MCVE I thought of an arbitrary command bash -c 'echo 0="$0" ; echo 1="$1"' which explains my problem. I will call my command with two arguments -option1=withoutspace and -option2="with space". So it would look like this
> bash -c 'echo 0="$0" ; echo 1="$1"' -option1=withoutspace -option2="with space"
if the call to the command would be typed directly into the shell. It prints
0=-option1=withoutspace
1=-option2=with space
In my Bash script, the arguments are part of an array. However
#!/bin/bash
ARGUMENTS=()
ARGUMENTS+=('-option1=withoutspace')
ARGUMENTS+=('-option2="with space"')
bash -c 'echo 0="$0" ; echo 1="$1"' "${ARGUMENTS[@]}"
prints
0=-option1=withoutspace
1=-option2="with space"
which still shows the double quotes (because they are interpreted literally?). What works is
#!/bin/bash
ARGUMENTS=()
ARGUMENTS+=('-option1=withoutspace')
ARGUMENTS+=('-option2=with space')
bash -c 'echo 0="$0" ; echo 1="$1"' "${ARGUMENTS[@]}"
which prints again
0=-option1=withoutspace
1=-option2=with space
What do I have to change to make ARGUMENTS+=('-option2="with space"') work as well as ARGUMENTS+=('-option2=with space')?
(Maybe it's even entirely wrong to store arguments for a command in an array? I'm open for suggestions.)
Get rid of the single quotes. Write the options exactly as you would on the command line.
ARGUMENTS+=(-option1=withoutspace)
ARGUMENTS+=(-option2="with space")
Note that this is exactly equivalent to your second option:
ARGUMENTS+=('-option1=withoutspace')
ARGUMENTS+=('-option2=with space')
-option2="with space" and '-option2=with space' both evaluate to the same string. They're two ways of writing the same thing.
(Maybe it's even entirely wrong to store arguments for a command in an array? I'm open for suggestions.)
It's the exact right thing to do. Arrays are perfect for this. Using a flat string would be a mistake.
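If you want to verify what the array actually holds, printing one element per line shows that the quotes are gone and the space is preserved:
printf '<%s>\n' "${ARGUMENTS[@]}"
# <-option1=withoutspace>
# <-option2=with space>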

Semicolon on command line in linux

I am running my application on Linux, providing its inputs on the command line. One input field is an argument which contains ";" (semicolon) characters internally (for example: 123;434;5464).
This is parsed using a UTF8String encoding and sent.
But when I run it like this, I immediately get:
bash: 434: command not found
bash: 5464: command not found
And when I capture the traffic, the output contains only 123 instead of 123;434;5464.
But if I give it without semicolons (e.g. 123:434:5464), there is no problem and the output comes through properly as 123:434:5464.
How do I give command line input containing semicolons so that they come through in the output? Is there any particular syntax to use for semicolons?
I am running the script like this:
./runASR.sh -ip 10.78.242.4 -port 3868 -sce 10.78.241.206 -id 85;167838865;1385433280
where the -id field contains the value with the issue.
; is treated as an end-of-command character, so to bash 123;456;5464 is in fact three commands. To pass such metacharacters, escape them with the escape character \.
./command 123\;456\;5464
Or just quote it with single quotes (double quotes evaluate the inner string). (Thanks Triplee, I forgot to mention this.)
./command '123;456;5464'
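Applied to the command line from the question, that would be:
./runASR.sh -ip 10.78.242.4 -port 3868 -sce 10.78.241.206 -id '85;167838865;1385433280'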

Passing quotes and other special characters literally through bash and ssh

I am trying to run an SSH command that will invoke a script on a remote machine that writes some Lua code to a file.
I have this script command that executes under bash:
ssh bob writelua.sh '{version=1,{["foo"]=17}}'
And writelua.sh looks like this:
echo "return $1" > bar.lua
The end result, however, is that bar.lua has the content:
return version=1
I had thought that single quotes prevented all interpretation. How can I edit the scripts and escaping to pass the raw Lua code through unharmed?
The single quotes prevent interpretation on the local host. The remote host sees the command line
writelua.sh {version=1,{["foo"]=17}}
which is subject to brace expansion. You need a second set of quotes so that the first set of single quotes is passed through to the remote host.
ssh bob writelua.sh "'{version=1,{[\"foo\"]=17}}'"
As you can see, the quotes start to get unwieldy. A better solution is to simply copy a script containing
writelua.sh '{version=1,{["foo"]=17}}'
to the remote host and execute that remotely.
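You can reproduce the remote re-parsing locally, with bash -c standing in for the remote shell; brace expansion splits the argument and quote removal strips the double quotes:
bash -c 'printf "<%s>\n" writelua.sh {version=1,{["foo"]=17}}'
# <writelua.sh>
# <version=1>
# <{[foo]=17}>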
An example using $'...' quoting, which lets you embed the literal single quotes the remote shell needs:
ssh bob writelua.sh $'\'{version=1,{["foo"]=17}}\''
Use heredoc and avoid all the excessive quoting:
ssh -T bob << \EOF
writelua.sh '{version=1,{["foo"]=17}}'
EOF
This will send the raw script to the remote host, where it will be interpreted.
When it gets too complex, particularly with lots of escaping, I prefer generating the command on a temporary script and execute it locally or remotely via SSH as required.
But there's an alternative: using echo to store the command in a variable and taking advantage of three things:
Single quotes don't do variable expansion and allow double quotes, so you can include something like "$myvar" without escaping $ or "
Double quotes allow variable expansion and single quotes, which means you can include something like animals='all'; echo love $animals to have $animals replaced by its value, and without escaping the '
Strings of both types, i.e. enclosed by single quotes or double quotes, can be concatenated simply by putting them together.
As an example, if I want something like this executed on a remote machine:
source /my-env.sh; perl -MMYLIB::DB -e 'my $t=db_list("name", 1553786458); print "#$t"'
But instead of 1553786458 I want to pass the value from a local variable:
now=`date +%s`
We could have this:
get_list=`echo 'source /my-env.sh; perl -MMYLIB::DB -e' "'my " '$t=db_list("name", ' "$now" '); print "#$t"' "'"`
You can see that single and double quotes alternate, so we didn't have to do any escaping! They don't need to be separated by spaces, but that improves readability and won't affect the result in this case.
And now we can execute:
ssh user@host $get_list
There's still no guarantee that this approach will always work, so once you've built your command, the safest bet would be to copy it over in a file.
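Before executing it, echoing the assembled string is a cheap sanity check:
echo "$get_list"   # shows the exact string the remote shell will evaluate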
If you can use Perl...
use Net::OpenSSH;
my $ssh = Net::OpenSSH->new("bob");
$ssh->system('writelua.sh', '{version=1,{["foo"]=17}}')
or die $ssh->error;
Net::OpenSSH takes care of quoting everything for you.
