Using printf %q to make a quoted string usable as shell script input - linux

Within a bash script, I am trying to append a command string that is single and double quoted to a file (.profile).
I would like to use echo and then >> the command to .profile. Of course, I am open to any solution that works.
The command I would like to use is
echo "curl -X POST -H "Content-Type: application/json" -d '{"value1":"PHONENUMBER","value2":"MESSAGE"}' https://maker.ifttt.com/trigger/TRIGGER/with/key/KEY &> /dev/null" >> .profile
but clearly this doesn't work within my bash script.
I am not clear on how printf %q works and don't understand how to apply it to my problem.
I have tried this
CMDSTRING='curl -X POST -H "Content-Type: application/json" -d '
CMDSTRING=${CMDSTRING}"'"
CMDSTRING=${CMDSTRING}'{"value1":"+PHONENUMBER","value2":"MESSAGE"}'
CMDSTRING=${CMDSTRING}"'"
CMDSTRING=${CMDSTRING}' https://maker.ifttt.com/trigger/TRIGGER/with/key/KEY &> /dev/null'
echo $CMDSTRING

Using printf '%q' to generate .profile content looks something like the following:
{
  printf '%q ' \
    curl -X POST -H "Content-Type: application/json" \
    -d '{"value1":"PHONENUMBER","value2":"MESSAGE"}' \
    https://maker.ifttt.com/trigger/TRIGGER/with/key/KEY
  printf '%s\n' "&>/dev/null"
} >> .profile
Note that you cannot use the %q format string if you want &>/dev/null to be parsed as syntax, since by its very nature it formats everything it's passed to be parsed as data.
Thus, we use printf '%q ' "command name" "first argument" ... for the actual command itself, and format the redirection out-of-band.
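To get a feel for what %q produces, here is a quick demonstration (the exact escaping style can differ slightly between bash versions):
$ printf '%q\n' 'Content-Type: application/json'
Content-Type:\ application/json
$ printf '%q\n' '{"value1":"PHONENUMBER","value2":"MESSAGE"}'
\{\"value1\":\"PHONENUMBER\",\"value2\":\"MESSAGE\"\}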
That said, note that there's value to the above only if you're substituting variables from an untrusted source (rather than hardcoding them as in the example), and are worried about invalid values being abused for command injection. If you're truly just appending a constant string to the end of a file, a quoted heredoc will let you build more natural-looking shell quoting manually (indeed, as you've already done!), and pass it through verbatim:
cat >>.profile <<'EOF'
curl -X POST -H "Content-Type: application/json" \
-d '{"value1":"PHONENUMBER","value2":"MESSAGE"}' \
https://maker.ifttt.com/trigger/TRIGGER/with/key/KEY &> /dev/null
EOF
Here, everything between the <<'EOF' and the EOF is passed through exactly as given, including quotes and parameter expansions the shell might otherwise try to interpret.
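The quoting of the delimiter is what makes the difference; a small side-by-side (using $HOME purely as an example expansion):
cat <<EOF       # unquoted delimiter: expansions are performed before writing
home is $HOME
EOF

cat <<'EOF'     # quoted delimiter: the text is written out verbatim
home is $HOME
EOF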

Related

curl command can not be formed dynamically due to single quote [duplicate]

I want to have a curl command like below
curl --location --request POST 'https://abcd.com/api/v4/projects/<projectId>/triggers' \
--header 'PRIVATE-TOKEN: <your_access_token>' \
--form 'description="my description"'
Now I wrote a shell script function to generate it dynamically; it needs projectId, token, and description as parameters:
callApi(){
while IFS="," read -r -a users; do
for u in "${users[@]}"
do
url="'https://abcd.com/api/v4/projects/$1/triggers'"
echo $url
header="'PRIVATE-TOKEN: $2'"
echo $header
desc="'description=$u token'"
echo $desc
tk=$(curl --location --request POST $url \
--header $header \
--form $desc)
echo $tk
done
done <<< $(cat $3)
}
callApi "<projectId>" "<token>" ./users.csv
It echoes perfectly, but then it throws an error.
Don't use both double and single quotes like that. You are adding literal single quotes to the url (and other variables) which, as you have discovered, breaks.
Use double quotes if you need to allow command or parameter substitution, single quotes otherwise. Double quote your variables everywhere you dereference them.
Use indentation for readability.
Useless use of cat.
I've captured the function parameters at the start of the function for visibility: I did not even notice the $1 buried in the URL.
callApi() {
    local project=$1 token=$2 userfile=$3
    while IFS="," read -r -a users; do
        for u in "${users[@]}"; do
            url="https://abcd.com/api/v4/projects/${project}/triggers"
            echo "$url"
            header="PRIVATE-TOKEN: ${token}"
            echo "$header"
            desc="description=$u token"
            echo "$desc"
            tk=$(
                curl --location \
                    --request POST \
                    --header "$header" \
                    --form "$desc" \
                    "$url"
            )
            echo "$tk"
        done
    done < "$userfile"
}
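As a quick illustration of why the double quotes matter, compare how an unquoted and a quoted expansion are split into arguments (throwaway values, just for the demo):
header="PRIVATE-TOKEN: abc123"
printf '<%s>\n' $header      # unquoted: word-split into <PRIVATE-TOKEN:> and <abc123>
printf '<%s>\n' "$header"    # quoted: one argument, <PRIVATE-TOKEN: abc123>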

Bash script passing parameter with spaces using variables

I want to run command:
curl -H "X-Auth-Token: $OS_TOKEN" "http://192.168.0.13:8774/v2.1/servers"
And, I made the script for it.
URL="http://192.168.0.13:8774/v2.1/servers"
HEADER="X-Auth-Token: 12345678"
METHOD="GET"
CMD="curl -H $HEADER $URL"
eval "$CMD"
But since $CMD doesn't contain any double quotes, the parameters are split on spaces and it runs the wrong command:
$ bash request.sh
curl: (6) Could not resolve host: 12345678
How can I wrap it? On the command line we can use double quotes to keep a parameter together, but how can I pass a variable containing spaces as a single parameter from within a script, the same way as on the command line?
You should change CMD="curl -H $HEADER $URL" to CMD="curl -H \"$HEADER\" \"$URL\"".
I would try:
URL="http://192.168.0.13:8774/v2.1/servers"
HEADER="X-Auth-Token: 12345678"
CMD="curl -H \"${HEADER}\" ${URL}"
eval "${CMD}"
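For what it's worth, an array avoids eval entirely while still giving you something printable for logging; a minimal sketch along the same lines:
URL="http://192.168.0.13:8774/v2.1/servers"
HEADER="X-Auth-Token: 12345678"
cmd=( curl -H "$HEADER" "$URL" )   # each element stays one argument, spaces included
echo "${cmd[*]}"                   # print it for inspection
"${cmd[@]}"                        # run it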

Arguments inside string in shell script

I'm trying to use a shell script which looks like this:
#!/bin/bash
echo "First arg: $1"
echo "Second arg: $2"
curl -w "\n" -d '{"ssid": "$1", "psk": "$2" }' \
-H "Content-Type: application/json" \
-X POST localhost:8080/connect
The problem is that inside the curl command $1 and $2 are taken as literal strings, not as the arguments. If I try to remove the double quotes, it doesn't work at all.
Does anyone knows how to solve this?
You want to use double quotes (to get the variables expanded) but escape the quotes within the string with backslashes in front of them (and drop -X POST when you use -d, because -d already makes curl send a POST request):
#!/bin/bash
echo "First arg: $1"
echo "Second arg: $2"
curl -w "\n" -d "{\"ssid\": \"$1\", \"psk\": \"$2\" }" \
-H "Content-Type: application/json" localhost:8080/connect

Curl command doesn't work in bash script

I am trying to upload a JSON file into my noSQL database using a bash script, but it doesn't work and I don't understand why.
This is the script :
test='{"evaluation": "none"}'
test="'$test'"
command="curl -XPUT localhost:9200/test/evaluation/$i -d $test"
echo "$command"
$command
This is the error :
curl -XPUT localhost:9200/test/evaluation/0 -d '{"evaluation": "none"}'
{"error":"Content-Type header [application/x-www-form-urlencoded] is not supported","status":406}curl: (3) [globbing] unmatched close brace/bracket in column 7
When I do the command given in my command line it works fine though.
What is the error here? Thank you.
Don't store a command in a variable; if you absolutely must have something usable with logging, put the arguments in an array.
test='{"evaluation": "none"}'
args=( -XPUT localhost:9200/test/evaluation/"$i" -d "$test" )
echo "curl ${args[*]}"
curl "${args[@]}"
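A quick way to confirm that the JSON body survives as a single argument is to print each array element on its own line (same variables as above, with i set to 0):
i=0
test='{"evaluation": "none"}'
args=( -XPUT localhost:9200/test/evaluation/"$i" -d "$test" )
printf '<%s>\n' "${args[@]}"
# <-XPUT>
# <localhost:9200/test/evaluation/0>
# <-d>
# <{"evaluation": "none"}>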

Using Environment Variables in cURL Command - Unix

My question is very simple. I want to use environment variables in a cURL command, something similar to this:
curl -k -X POST -H 'Content-Type: application/json' -d '{"username":"$USERNAME","password":"$PASSWORD"}'
When I run the command, $USERNAME is passed to the command as the literal string "$USERNAME", not the value of the variable. Is there a way to escape this situation?
Thanks.
Single quotes inhibit variable substitution, so use double quotes. The inner double quotes must then be escaped.
... -d "{\"username\":\"$USERNAME\",\"password\":\"$PASSWORD\"}"
Since this answer was written in 2015, it has become clear that this technique is insufficient to properly create JSON:
$ USERNAME=person1
$ PASSWORD="some \"gnarly 'password"
$ echo "{\"username\":\"$USERNAME\",\"password\":\"$PASSWORD\"}"
{"username":"person1","password":"some "gnarly 'password"}
$ echo "{\"username\":\"$USERNAME\",\"password\":\"$PASSWORD\"}" | jq .
parse error: Invalid numeric literal at line 1, column 47
The quoting problems are clear. The (shell) solutions are not.
Current best practice: use a JSON-specific tool to create JSON:
jq
$ jq -n -c --arg username "$USERNAME" --arg password "$PASSWORD" '$ARGS.named'
{"username":"person1","password":"some \"gnarly 'password"}
jo
$ jo "username=$USERNAME" "password=$PASSWORD"
{"username":"person1","password":"some \"gnarly 'password"}
And with curl:
json=$( jq -n -c --arg username "$USERNAME" --arg password "$PASSWORD" '$ARGS.named' )
# or
json=$( jo "username=$USERNAME" "password=$PASSWORD" )
# then
curl ... -d "$json"
For less quoting, read from standard input instead.
curl -k -X POST -H 'Content-Type: application/json' -d @- <<EOF
{ "username": "$USERNAME", "password": "$PASSWORD"}
EOF
-d @foo reads from a file named foo. If you use - as the file name, it reads from standard input. Here, standard input is supplied from a here document, which is treated as a double-quoted string without actually enclosing it in double quotes.
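The two approaches combine naturally: jq builds the JSON and curl reads it from standard input (the URL below is a stand-in, not from the original question):
jq -n -c --arg username "$USERNAME" --arg password "$PASSWORD" '$ARGS.named' |
  curl -k -X POST -H 'Content-Type: application/json' -d @- https://example.com/login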
curl -k -X POST -H 'Content-Type: application/json' -d '{"username":"'$USERNAME'","password":"'$PASSWORD'"}'
Here the variables are placed outside of the single quotes and will be expanded by the shell (just like in echo $USERNAME). For example, assuming that USERNAME=xxx and PASSWORD=yyy, the argv[7] string passed to curl is {"username":"xxx","password":"yyy"}.
And no, this will not work when $USERNAME or $PASSWORD contains space characters.
Or: curl -k -X POST -H 'Content-Type: application/json' -d '{"username":"'"$USERNAME"'","password":"'"$PASSWORD"'"}'
You can wrap the environment variables with "'" and keep the single quotes for the outer JSON object, e.g.
-d '{"username":"'"$USERNAME"'","password":"'"$PASSWORD"'"}'
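To see exactly what argument the shell assembles, printing it with placeholder values makes the quote boundaries visible:
USERNAME=person1
PASSWORD='pass word'
printf '%s\n' '{"username":"'"$USERNAME"'","password":"'"$PASSWORD"'"}'
# {"username":"person1","password":"pass word"}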
