How to run commands using variables which include quotes, in bash?

I'm trying to run a command while injecting a variable that contains quotes, but bash keeps adding extra quotes I don't need, which breaks the command.
This is my example:
set -x
MYVAR=" --name='user should login' "
cucumber $MYVAR
This results in running:
cucumber '--name='\''user' should 'login'\'''
and the command fails. However, when I run cucumber --name='user should login' as is, without using variables, everything goes as expected.
PS. same result with cucumber "${MYVAR}" or cucumber "$MYVAR"

It is better to use bash arrays for storing command lines:
myvar=(--name 'user should login')
cucumber "${myvar[@]}"
I've also lowercased your variable, since all-uppercase names can collide with environment and special shell variables.
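To see what the array actually passes, printf can stand in for cucumber (a quick sketch):
myvar=(--name 'user should login')
printf '<%s>\n' "${myvar[@]}"
# <--name>
# <user should login>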

This works, with no bashisms:
MYVAR=" --name='user should login' "
eval cucumber $MYVAR
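To see how eval re-parses the embedded quotes, here is a small sketch with set -- and printf standing in for cucumber:
MYVAR=" --name='user should login' "
eval set -- $MYVAR
printf '<%s>\n' "$@"
# <--name=user should login>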

Related

Bash wrong spaces, quotes interpretation in variables

I see weird behavior in shell scripts when I pass a variable with parameters to an external ruby script.
For example:
params="--val1=test --val2='test'"
ruby ./script.rb $params
causes ruby to output 'test' for val2 instead of test.
If I just pass params directly without using a variable everything works just fine.
Could you please clarify your question a bit?
From what I understand you have a shell script, something like:
#!/bin/bash
PARAMS="--val1=test --val2='test'"
ruby ./script.rb $PARAMS
And in script.rb you print out the value of the command line parameter val2. In that case it is expected that it prints 'test' instead of test, because the following steps happen:
bash replaces $PARAMS with its value and splits it into words
bash executes ruby with the literal arguments --val1=test and --val2='test'
quote removal only applies to quotes typed on the command line, not to quotes that come from a variable expansion, so ruby / your script sees the literal value 'test'
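The difference is easy to reproduce with printf standing in for the ruby script (a sketch):
PARAMS="--val1=test --val2='test'"
printf '<%s>\n' $PARAMS
# <--val1=test>
# <--val2='test'>   the quotes from the expansion survive
printf '<%s>\n' --val1=test --val2='test'
# <--val1=test>
# <--val2=test>     the quotes typed on the command line are removed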

Escape char in Gitlab Secret Variables

I have a secret var:
But when I do - echo %MySecretVar%, the runner displays foo only.
How can I escape special chars like ! in GitLab secret variables?
I had the same problem with GitLab, with the job running on Windows, but I assume it will reproduce on Linux as well, because it seems to be a GitLab parsing issue or some really odd escaping.
So I have set environment variable
APPPOOL_PWD: 'blabla!foo$bar'
and the output of echo %APPPOOL_PWD% or echo $APPPOOL_PWD was 'blabla'.
GitLab seemed to be eating the exclamation mark ! and the dollar sign $. To avoid this, I used ^^ for the exclamation mark, as proposed in a comment, and $$ for the dollar sign, as proposed in the GitLab variables documentation.
So following variable works well:
APPPOOL_PWD: 'blabla^^!foo$$bar'
and the output of the echo command in this case is 'blabla!foo$bar'.
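On a Linux runner with a bash shell, a quick way to check what the job actually received is to dump every character of the variable (a sketch, using the APPPOOL_PWD variable from above):
# od -c shows each character, so a swallowed ! or $ is easy to spot
echo "$APPPOOL_PWD" | od -c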
I was able to use a value with special characters this way:
Define Gitlab CI variable FOO with special characters in the value, e.g. ?!asdf&%fghjkl
In .gitlab-ci.yml define:
variables:
  bar: '"%FOO%"'
script:
  - echo %bar%
This way the variable will stay exactly the way it is typed in your CI variable field.
I'm using the Windows batch shell. If you use another shell to run scripts, the syntax is a little different from %bar%; check it out in the GitLab CI reference.
I am using GitLab 15.3.3-ee and I don't see any issue with the !; it gets passed through. However, for $ you will have to use an extra $ as an escape character, just as mentioned in the first comment.

Passing multiple variables from local bash to remote bash script without gobbling

I'm having trouble sending multiple variables to a remote bash script without gobbling occurring.
For the sake of this question the variable $timestamp contains 12-12-15 19:45:21
ssh user@serverip "/usr/path/to/script.sh http://www.web.com/$1 http://web.com/$2 $timestamp";
I am sending 3 variables to script.sh
Two URLs with an amended file name in the form of a variable on the end and then my $timestamp variable
But in myscript.sh, when I try to insert $timestamp into a MySQL database, it only sees the first part of the date, before the whitespace:
12-12-15
So my quotes around the command aren't preventing gobbling. Do I need to quote each variable separately?
ssh user@serverip "/usr/path/to/script.sh http://www.web.com/$1 http://web.com/$2 $timestamp";
This is equivalent to locally calling:
/usr/path/to/script.sh http://www.web.com/$1 http://web.com/$2 $timestamp
Try quoting each individual argument passed:
ssh user@serverip "/usr/path/to/script.sh 'http://www.web.com/$1' 'http://web.com/$2' '$timestamp'";
You can also print each argument in the script to see what's being passed... e.g. echo $1, etc.
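For that check, a printf over "$@" shows exactly how the arguments were split, one per line (a sketch to drop at the top of script.sh):
# print every argument received by the script on its own line
printf 'arg: <%s>\n' "$@"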
You can try something like
ssh localhost "printf \"%s %s %s\n\" a b \"last parameter\""
You need to escape the values for the remote host. The correct way of doing this is with printf %q:
ssh user@serverip "/usr/path/to/script.sh \
$(printf "%q " "http://www.web.com/$1" "http://web.com/$2" "$timestamp")"
This works for all variable values. Wrapping them in single quotes would instead result in a syntax error and command injection when the variables themselves contain single quotes.
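To see what %q produces for values containing spaces or quotes, a quick sketch (the second value is made up; the exact quoting style may vary slightly between bash versions):
printf '%q\n' "12-12-15 19:45:21" "it's complicated"
# 12-12-15\ 19:45:21
# it\'s\ complicated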

perl exec screen with parameters

If I run the following:
system("screen -dmS $screenname");
it works as it should be but when I try to run a screen from perl and to execute a command (in this case tcpreplay) with some extra arguments it doesn't run as it's supposed to.
system("screen -dmS $screenname -X stuff \"`printf \"tcpreplay --intf1=eth0 s.cap\\r\"`\" ");
What am I doing wrong here?
Simo A's answer is probably right with regard to the issue, but when working with screen I like to use the following instead of the -X flag: explicitly telling it the command language interpreter.
Why use -c you ask?
If the -c option is present, then commands are read from string. If there are arguments after the string, they are assigned to the positional parameters, starting with $0.
system("screen -dmS $screenname sh -c 'PRETTY MUCH ANYTHING WORKS'");
I figured I'd share this, as I run a lot of Perl system commands and the above always works for screen commands.
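Applied to the tcpreplay command from the question, the shell command inside system() would look roughly like this (a sketch from a shell prompt; the session name is made up):
screenname="replay"
screen -dmS "$screenname" sh -c 'tcpreplay --intf1=eth0 s.cap'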
Try replacing single \" with \\\". That should do the trick.
Consider the same issue here:
system ("echo Quotation marks: \\\"here\\\" but \"not here\". ");
The output from the above line of code is: Quotation marks: "here" but not here.
Taking Simo A's answer as a starting point, I would use q( ) rather than " ".
system ( q(echo Quotation marks: \"here\" but "not here". ));
This means you don't need to escape the quote twice.

Passing quotes and other special characters literally through bash and ssh

I am trying to run an SSH command that will invoke a script on a remote machine that writes some Lua code to a file.
I have this script command that executes under bash:
ssh bob writelua.sh '{version=1,{["foo"]=17}}'
And writelua.sh looks like this:
echo "return $1" > bar.lua
The end result, however, is that bar.lua has the content:
return version=1
I had thought that single quotes prevented all interpretation. How can I edit the scripts and escaping to pass the raw Lua code through unharmed?
The single quotes prevent interpretation on the local host. The remote host sees the command line
writelua.sh {version=1,{["foo"]=17}}
which is subject to brace expansion. You need a second set of quotes so that the first set of single quotes is passed through to the remote host.
ssh bob writelua.sh "'{version=1,{[\"foo\"]=17}}'"
As you can see, the quotes start to get unwieldy. A better solution is to simply copy a script containing
writelua.sh '{version=1,{["foo"]=17}}'
to the remote host and execute that remotely.
An example using the $'...' quotes:
ssh bob writelua.sh $'{version=1,{[\'foo\']=17}}'
Use heredoc and avoid all the excessive quoting:
ssh -T bob << \EOF
writelua.sh '{version=1,{["foo"]=17}}'
EOF
This sends the raw script to the remote host, where it gets interpreted.
When it gets too complex, particularly with lots of escaping, I prefer generating the command in a temporary script and executing it locally or remotely via SSH as required.
But there's an alternative: using echo to store the command in a variable and taking advantage of three things:
Single quotes don't do variable expansion and allow double quotes, so you can include something like "$myvar" without escaping $ or "
Double quotes allow variable expansion and single quotes, which means you can include something like animals='all'; echo love $animals to have $animals replaced by its value, and without escaping the '
Strings of both types, i.e. enclosed by single quotes or double quotes, can be concatenated simply by putting them together.
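A tiny illustration of that concatenation rule, reusing the animals example from above:
animals='all'
echo 'love $animals stays literal, but here: '"love $animals"
# love $animals stays literal, but here: love all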
As an example, if I want something like this executed on a remote machine:
source /my-env.sh; perl -MMYLIB::DB -e 'my $t=db_list("name", 1553786458); print "#$t"'
But instead of 1553786458 I want to pass the value from a local variable:
now=`date +%s`
We could have this:
get_list=`echo 'source /my-env.sh; perl -MMYLIB::DB -e' "'my " '$t=db_list("name", ' "$now" '); print "#$t"' "'"`
You can see that single and double quotes are alternated, so we didn't have to do any escaping! They don't need to be separated by spaces, but it improves readability and won't affect the result in this case.
And now we can execute:
ssh user@host $get_list
There's still no guarantee that this approach will always work, so once you've built your command, the safest bet is to copy it over to a file.
If you can use Perl...
use Net::OpenSSH;
my $ssh = Net::OpenSSH->new("bob");
$ssh->system('writelua.sh', '{version=1,{["foo"]=17}}')
or die $ssh->error;
Net::OpenSSH takes care of quoting everything for you.
