Quotes around quotes on the Windows command line

So I found this Stack Overflow question which relates to what I would like to do; however, I am having trouble with the directory having spaces within it. I tried looking at several other Stack Overflow questions, but either I misunderstood them, or none have really addressed my problem. I've seen stuff on passing arguments as arrays and using %1 or something to address the special characters, but nothing has worked yet.
I tried entering the following into cmd.exe:
schtasks /Create /SC DAILY /TN PythonTask /TR "python "C:\Users\me\stuff with spaces \pythonprogram.py""
However, the quotes appear not to be taken in the correct order. I would like the command python "C:\Users\me\stuff with spaces \pythonprogram.py" to be passed to cmd.exe every day.
How can I use quotes around quotes on the Windows command line?
Answer:
Escape each inner quote by adding a backslash \ before it. I.e.:
do_some_command_in_windows_shell_with_this_given_string "run "something.exe""
is replaced with:
do_some_command_in_windows_shell_with_this_given_string "run \"something.exe\""

Educated guess:
Escape the inner quotes with a backslash.
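Applied to the command from the question, that would look something like this (an untested sketch; the path is the one from the question):
schtasks /Create /SC DAILY /TN PythonTask /TR "python \"C:\Users\me\stuff with spaces \pythonprogram.py\""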

Related

Argument escaping not interpreted correctly when running node.js script from Windows PowerShell

Given the following script:
const yargs = require('yargs');
const argv = yargs
  .usage('Usage: $0 [--whatIf]')
  .alias('d', 'directory')
  .alias('wi', 'whatIf')
  .nargs('d', 1)
  .describe('d', 'alphabetize this directory')
  .describe('whatIf', 'show what would happen if run')
  .demandOption(['d'])
  .argv;
console.log(argv.directory);
If I invoke the script from Windows PowerShell like so: node .\alphabetizer.js -d 'l:\my folder\Files - Some Files In Here\' --whatIf I get the output l:\my folder\Files - Some Files In Here\" --whatIf where I would expect just l:\my folder\Files - Some Files In Here\. It works OK with folder names that require no escaping, but it seems to get confused by the escaping.
If I examine process.argv, I can see the same escaping issue.
I have noticed that if I remove the trailing slash it will work. However, this still points to the node script not handling the input properly, because this should not be necessary with a string set off by single quotes.
Is there a way to make this work?
Both Windows PowerShell (powershell.exe) and PowerShell [Core] v6+ (pwsh) are fundamentally broken with respect to quoting arguments for external programs properly - see this answer for background info.
Generally, PowerShell on Windows has to perform re-quoting behind the scenes in order to ensure that just "..."-quoting is used, given that external programs can't be assumed to understand '...'-quoting too when parsing their command line (which on Windows every program has to do itself).
Windows PowerShell is more broken with respect to arguments that end in \ and have embedded spaces, re-quoting them improperly; e.g.:
PS> foo.exe 'c:\foo \' bar
is translated into the following command line behind the scenes:
foo.exe "c:\ foo \" bar
This is broken, in that most applications - including PowerShell's own CLI - sensibly assume that the \" is an escaped " char. to be taken verbatim, thinking that the argument continues with  bar and then implicitly ends, despite the formal lack of a closing ".
PowerShell [Core] v6+ more sensibly translates the above to foo.exe "c:\foo \\" bar, where the \\ is interpreted as an escaped \, and the following " again has syntactic function.
If you're stuck with Windows PowerShell, your only choices are:
either: if possible, leave off the trailing \
otherwise: manually double it (\\), but only do so if the argument also contains spaces (otherwise, the \\ will be retained as-is, though in the case of filesystem paths that is usually benign).
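For the command from the question, the second option above would look something like this (a sketch; the path is the one from the question, and the trailing backslash is doubled because the argument also contains spaces):
node .\alphabetizer.js -d 'l:\my folder\Files - Some Files In Here\\' --whatIf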

TFS 2017 - Variable substitution with double backslash in shell script

I'm trying to substitute a variable containing a double backslash using a shell script task; my environment variable is configured to use a double backslash.
When it is substituted in the bash script, TFS puts only one backslash.
(The variable definition, the substitution in the script, and the execution log were shown as screenshots in the original post. I'm using double quotes in the substitution because of the white space in the connection string.)
I tried everything: more backslashes, quotes around it, but nothing works.
I'd appreciate some help.
Thanks
You need to type \\\\\\\\ (eight backslashes) in your variable:
CONNECTION_STRING: Data Source=192.168.2.227\\\\\\\\COMMERCE
Yes, I solved it!!!
I just put the backslashes inside double quotes
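As a rough illustration of why so many backslashes can be needed (a hypothetical bash sketch, not the actual TFS substitution mechanism): each layer that interprets backslash escapes halves them, so eight backslashes in the variable can survive as the two the connection string needs.
raw='Data Source=192.168.2.227\\\\\\\\COMMERCE'   # 8 backslashes, as stored in the variable
step1=$(printf '%b' "$raw")    # first layer of escape processing: 8 -> 4
step2=$(printf '%b' "$step1")  # second layer: 4 -> 2
printf '%s\n' "$step2"         # Data Source=192.168.2.227\\COMMERCE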

Multiword string as a curl option using Bash

I want to get some data from an HTTP server. What it sends me depends on what I put in a POST request.
What I put in the INPUT_TEXT field is a sequence of words. When I run the following command, I get good looking output.
$ curl http://localhost:59125/process -d INPUT_TEXT="here are some words"
I want a bash script to take some string as a command line argument, and pass it appropriately to curl. The first thing I tried was to put the following in a script:
sentence=$1
command="curl http://localhost:59125/process -d INPUT_TEXT=\"${sentence}\""
$command
I then run the script like so:
$ ./script "here are some words"
But then I get a curl Couldn't resolve host error for each of "are", "some", and "words". It would seem that "here" got correctly treated as the INPUT_TEXT, but the rest of the words were then considered to be hosts, and not part of the option.
So I tried:
command=("curl" "http://localhost:59125/process" "-d" "INPUT_TEXT='$sentence'")
${command[#]}
I got the same output as the first script. I finally got what I wanted with:
result=$(curl http://localhost:59125/process -d INPUT_TEXT="${sentence}")
echo $result
I'm still unsure as to what the distinction is. In the first two cases, when I echoed out the contents of command, I get exactly what I input from the interactive Bash prompt, which had worked fine. What caused the difference?
The following will work:
command=("curl" "http://localhost:59125/process"
"-d" "INPUT_TEXT=$sentence")
"${command[#]}"
That has two changes from yours:
I removed the incorrect quotes around $sentence since you don't want to send quotes to the server (as far as I can see).
I put double-quotes around the use of "${command[@]}". Without the double quotes, the array's elements are concatenated with spaces between them and then the result is word-split. With double quotes, the individual array elements are used as individual words.
The second point is well-explained in the bash FAQ and a bunch of SO answers dealing with quotes.
The important thing to understand is that quotes only quote when a command is parsed. A quote which is a character in a variable is just a character; it is not reinterpreted when the value of the variable is expanded. Whitespace in the variable is used for word-splitting if the variable expansion is unquoted; the fact that the whitespace was quoted in the command which defined the variable is completely irrelevant. In this sense, bash is just the same as any other programming language.
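A minimal demonstration of that second point, using the sentence and URL from the question (the server doesn't need to be running, since printf only shows how the words are grouped):
sentence="here are some words"
command=(curl http://localhost:59125/process -d "INPUT_TEXT=$sentence")
printf '<%s> ' "${command[@]}"; echo   # quoted: each array element stays one word
printf '<%s> ' ${command[@]}; echo     # unquoted: the whitespace splits INPUT_TEXT apart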

perl exec screen with parameters

If I run the following:
system("screen -dmS $screenname");
it works as it should, but when I try to run a screen from Perl and execute a command (in this case tcpreplay) with some extra arguments, it doesn't run as it's supposed to.
system("screen -dmS $screenname -X stuff \"`printf \"tcpreplay --intf1=eth0 s.cap\\r\"`\" ");
What am I doing wrong here?
Simo A's answer is probably right with regard to the issue, but when working with screen I like to use the following instead of the -X flag: explicitly telling it the command language interpreter.
Why use -c you ask?
If the -c option is present, then commands are read from string. If there are arguments after the string, they are assigned to the positional parameters, starting with $0.
system("screen -dmS $screenname sh -c 'PRETTY MUCH ANYTHING WORKS'");
I figured I'd share, as I run a lot of Perl system commands and the above always works for screen commands.
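For the command from the question, that approach would look roughly like this (a sketch; $screenname and the tcpreplay invocation are taken from the question):
system("screen -dmS $screenname sh -c 'tcpreplay --intf1=eth0 s.cap'");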
Try replacing single \" with \\\". That should do the trick.
Consider the same issue here:
system ("echo Quotation marks: \\\"here\\\" but \"not here\". ");
The output from the former line of code is: Quotation marks: "here" but not here.
Taking Simo A's answer as a starting point, I would use q( ) rather than " ".
system ( q(echo Quotation marks: \"here\" but "not here". ));
This means you don't need to escape the quote twice.

build bash string containing a variable value surrounded with single quotes

I'm having a nightmare from what should be the most trivial of tasks.
My final goal is issue the following command from a bash script:
sqlite3 my_db.db '.read my_file.sql'
There are two catches here:
1. The single-quotes are obligatory, and can't be replaced by, say, double-quotes
2. my_file.sql is a variable known only at run-time.
So what I need is a way to have bash build a string that on one hand contains a variable value, while on the other hand that value should be surrounded by single quotes.
I would also much prefer a solution not relying on additional tools like AWK, Perl or the like. Maybe sed if it's really necessary.
Thanks.
Thanks Jonathan and Nelson.
I tried all three suggestions, but they all failed.
For simplicity I reduced the problem to the following:
I wrote the following script (tst.sh):
#!/bin/bash
file=/tmp/1
ls "'"$file"'"
ls \'$file\'
ls "'$file'"
Then I issued the following commands:
$ touch /tmp/1
$ ls '/tmp/1'
/tmp/1
$ ./tst.sh
'/tmp/1': No such file or directory
'/tmp/1': No such file or directory
'/tmp/1': No such file or directory
It seems the quotes were indeed added, but the resulting command was not the same as when entered manually.
Any ideas ?
Single-quotes are not obligatory. All of the following commands run sqlite3 with exactly the same arguments:
sqlite3 my_db.db '.read my_file.sql'
sqlite3 my_db.db ".read my_file.sql"
sqlite3 my_db.db .read\ my_file.sql
sqlfile="my_file.sql"
sqlite3 my_db.db ".read $sqlfile"
In all cases, the quotes (/escape) are parsed and removed before the arguments are passed to sqlite3. This is what you want. You want sqlite3 to get two arguments: my_db.db and .read my_file.sql. You do not want sqlite3 to see the quotes around the command -- that would be the equivalent of:
$ sqlite3 my_db.db
SQLite version 3.7.7 2011-06-25 16:35:41
Enter ".help" for instructions
Enter SQL statements terminated with a ";"
sqlite> '.read my_file.sql'
...>
...which, as you can see, just confuses sqlite3.
BTW, this is the same as the problem in your ls examples: you're passing single-quotes as part of the argument to ls, so it's looking for a file with single-quotes in the name and not finding it. You want the shell to remove the quotes rather than pass them to the command as part of an argument.
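You can see this directly with the script from the question (a small sketch reusing the same /tmp/1 file):
file=/tmp/1
ls "'$file'"   # the single quotes reach ls as part of the argument -> '/tmp/1': No such file or directory
ls "$file"     # the shell removes the quotes, ls gets /tmp/1 -> works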
This will do what you say you want to do (getting single quotes to the program), but it uses double quotes:
sqlite3 my_db.db "'".read" "my_file.sql"'"
Avoiding double quotes, you can write:
sqlite3 my_db.db \'.read\ my_file.sql\'
For both of these, the second argument will be seen by sqlite3 as a string containing:
'.read my_file.sql'
If the file name is in a variable (file=my_file.sql), then:
sqlite3 my_db.db "'".read" "$file"'"
sqlite3 my_db.db \'.read\ $file\'
These notations are vulnerable to confusion if the file name contains spaces.
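If the file name can contain spaces and you still want the literal single quotes in the argument, one space-safe variant (an untested sketch) is to put the whole thing in double quotes, since single quotes are ordinary characters inside them:
file="my file.sql"
sqlite3 my_db.db "'.read $file'"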
However, I don't think that's likely to be what you really want. The proscription on double quotes is puzzling, and the requirement for single quotes is likewise puzzling.
You can do as follows:
VAR=my_file.sql
VAR2="'.read $VAR'"
sqlite3 my_db.db $VAR2
user1860085, if you check out the documentation for the sqlite3 command and look at how the shell treats quotes and white space, you will probably come to the conclusion that you want double quotes for your case.
but if you really want single quotes, here is a solution:
eval sqlite3 my_db.db \'.read $VARIABLE\'
which on the fly will change to:
sqlite3 my_db.db '.read my_file.sql'
But I don't see why you would want it...
OK, problem solved!!
All that was missing was adding a little 'eval' command before the line; eval makes the shell parse the line a second time, so the embedded single quotes are treated as syntax and removed instead of being passed to the command.
So, in the simple example script I gave, changing:
ls "'$file'" to:
eval ls "'$file'"
did the job.
Thanks to all repliers :-)
