I am working with the graphing tool xmgrace and I am trying to plot multiple datasets. Rewriting the command with all the arguments over and over is becoming a waste of time, so I decided to make a shell script called xmgraceScript.
At the moment my shell script looks like this:
xmgrace dirA/argA dirB/argB dirC/argC
Since the dir paths are pretty long, I would like to have each argument on a new line, just to make the script more readable. I tried to do this by writing:
xmgrace << _XMARGS_
dirA/argA
dirB/argB
dirC/argC
_XMARGS_
This does not work. Can anyone recommend a different way of doing this?
Thank you.
Simply end all lines other than the last with a backslash character:
xmgrace \
dirA/argA \
dirB/argB \
dirC/argC
(The '<<' construct feeds the enclosed content to the standard input of the command, not to its argument list, which is why the heredoc attempt does not work: xmgrace expects its file names as arguments.)
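If you are using bash rather than plain sh, an array also gives you one path per line without trailing backslashes. A minimal sketch, assuming the same file layout as your example:
#!/bin/bash
# Keep one data file per line for readability.
files=(
    dirA/argA
    dirB/argB
    dirC/argC
)
# "${files[@]}" expands to one argument per array element.
xmgrace "${files[@]}"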
In the Linux command line, one can use either of two ways to pass options to commands: the short option format, which uses a single dash followed by a single letter, for example: -o, or the long option format, which uses two consecutive dashes followed by a word, for example: --option. But recently I came across some commands which, to my thinking, use a 'hybrid' of both formats: a single dash followed by a word, for example: -option. Now I'm not talking about commands where you can stick multiple short options together, like ls -lisa; I'm talking about options where the word after the single dash is just one option and not multiple short options strung together.
I don't understand why there is a third format, because what I knew about the Linux command line is that you can have only a short option format or a long option format. Where did the third format come from?
It's actually confusing, because sometimes you cannot be sure whether the third format is really a dash followed by one option or a dash followed by multiple short options.
This is not a bash issue. All programs have their own way of handling the options/flags. There are many different styles:
the single-letter style with a single hyphen, for example:
ls -l
the mnemonic style with double dashes, which seems to be a preference of the GNU tools, for example: ls --size
the variable=value style, for example: dd if=file of=otherfile
options without dashes, as in tar cvzf archive.tgz
You could even use a + instead of a - (as in date +%m).
etcetera.
It is important to understand that bash just passes these options to the programs/commands. So, in the programs you will generally see:
int main(int argc, char *argv[]){
(C code example). In that case, argv[0] will point to the program name (to simplify things a bit) and argv[1] will point to the first argument. Depending on the program, the details may differ.
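You can watch this pass-through behaviour from the shell side with a tiny script (a sketch; show-args is just a hypothetical file name):
#!/bin/bash
# show-args: print each argument exactly as the shell delivered it.
for arg in "$@"; do
    printf '<%s>\n' "$arg"
done
Running ./show-args -l --size if=file prints each token on its own line, untouched: bash separates the words and substitutes variables, but attaches no meaning to the dashes.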
A quick scan through the built-in commands reveals that the built-ins always seem to use a single dash with a single letter (-a) for specifying options.
I think you are confusing which component does which part of the parsing.
The command line you type into bash gets parsed twice. First it is parsed by bash itself. At this stage, spaces separate the different parameters, quotes and escapes are taken into account, wildcards are expanded, and $ variables are substituted.
At the end of this phase, we are left with a command line that has a list of strings, the first of which describes the command to be executed. At this point, bash calls execve, and passes it that list of strings.
The next phase of parsing is optional and is up to each program to carry out. Most programs call getopt_long, a library function that parses options. The one- and two-dash convention you mention is applied by it (as well as by its older sibling, getopt).
It is, however, up to each program to parse its own parameters. Many programs use getopt_long, which is why you feel, correctly, that it is a standard. Some, however, do not, and those go their own way.
That's just how things are.
For your programs, you should try to use either getopt_long or some compatible solution, as that causes the least amount of confusion for users.
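If the program in question is a shell script, the getopts builtin is one such compatible solution for the single-dash, single-letter style. A minimal sketch (note that getopts does not handle GNU-style --long options):
#!/bin/bash
# Accept a flag -a and an option -o that takes a value.
while getopts "ao:" opt; do
    case $opt in
        a) echo "flag -a set" ;;
        o) echo "option -o has value: $OPTARG" ;;
        *) echo "usage: $0 [-a] [-o value]" >&2; exit 1 ;;
    esac
done
shift $((OPTIND - 1))   # drop the parsed options, keep remaining arguments
Called as ./parse.sh -a -o value (parse.sh being whatever you name the script), this prints both messages.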
I'd like to execute with subprocess.Popen() this command containing an ampersand to be interpreted as a batch concatenation operator:
COMMAND="c:\p\a.exe & python run.py"
subprocess.Popen(COMMAND, cwd=wd, shell=False)
The ampersand &, however, is interpreted as an argument of a.exe and not as a batch operator.
Solution 1: Having seen the related question, a solution could be to set shell=True, but that gives the error 'UNC path are not supported', since my working directory is remote. This solution does not work as it is.
Solution 2: The executable a.exe should take a -inputDir parameter to specify the remote location of the files and use a local working directory. I think this solution could work, but I may not have the source code of the executable.
Solution 3: I could instead write c:\p\a.exe & python run.py into command.bat and then use
COMMAND="c:\p\command.bat"
subprocess.Popen(COMMAND, cwd=wd, shell=False)
Could this approach work?
Solution 4: I am trying to solve this by changing only the subprocess.Popen() call. Is that possible? Based on the Python Popen documentation I suspect it is not. Please tell me I am wrong.
See also this related question.
UPDATE:
Solution 5: @mata suggested using PowerShell: Popen(['powershell', '-Command', r'C:\p\a.exe; python run.py']). This actually works, but now I have to deal with slightly different command syntax, so, being lazy, I've decided to use Solution 3.
My favourite solution was Solution 3: create a .bat file and call it:
COMMAND="c:\p\command.bat"
subprocess.Popen(COMMAND, cwd=wd, shell=False)
I would use Solution 3.
The & character is used to separate multiple commands on one command line.
Cmd.exe runs the first command, and then the second command.
In this case you could just write your batch file like this:
@echo off
c:\p\a.exe
python run.py
Also, it's worth noting when using cmd.exe:
The ampersand (&), pipe (|), and parentheses ( ) are special characters
that must be preceded by the escape character (^) or quotation marks
when you pass them as arguments.
So I have this existing command that accepts a single argument, but I need something that accepts the argument over stdin instead.
A shell script wrapper like the following works, but as I will be allowing untrusted users to pass arbitrary strings on stdin, I'm wondering if there's potential for someone to execute arbitrary commands on the shell.
#!/bin/sh
$CMD "`cat`"
Obviously if $CMD has a vulnerability in the way it processes the argument there's nothing I can do, so I'm concerned about stuff like this:
Somehow allow the user to escape the double quotes and pass input into argument #2 of $CMD
Somehow cause another arbitrary command to run
The parameter looks fine to me, but the command might be a bit shaky if it can have a space in it. Also, if you're looking to get just one line from the user, then you might prefer this:
#!/bin/bash
read line
exec "$CMD" "$line"
A lot of code would be broken if "$(cmd)" could expand to multiple words.
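A quick way to convince yourself of that is to let printf count the words; it prints one bracketed line per argument it receives. A sketch:
#!/bin/bash
line='hello "world"; echo injected'
printf '[%s]\n' "$line"   # quoted: one argument, one bracketed line
printf '[%s]\n' $line     # unquoted: split on whitespace into several arguments
The quoted expansion always stays a single word, which is what keeps the "`cat`" wrapper from feeding extra arguments to $CMD.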
Say, for example, a script begins like this:
#!/bin/bash
#$ -S /bin/bash
#$ -l hostname=qn*
and then later down the page the actual script comes into play. My question is what does the "#$" symbol mean or do?
Are you by any chance running on a batch cluster, like Sun Grid Engine? Such lines can have special meanings in scripts intended to run as batch jobs.
https://www.wiki.ed.ac.uk/display/EaStCHEMresearchwik/How+to+write+a+SGE+job+submission+script
Update: the above link blocks requests when followed from stackoverflow.com (it works from google.com). Alternatives:
http://www.cbi.utsa.edu/sge_tutorial
http://web.njit.edu/all_topics/HPC/basement/sge/SGE.html
Lines beginning with # are comments. The first line may begin with #!, but it's still a comment to bash and is merely used to indicate the interpreter to use for the file. All other lines beginning with # are absolutely unimportant to bash, whether the next character is $ or anything else.
They seem to be parameters for the Oracle (formerly Sun) Grid Engine; have a look at this SO question or this one. Its scripts make heavy use of this kind of comment.
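For illustration, a minimal SGE submission script could look like this (a sketch; -N and -cwd are common qsub directives, but what is available depends on the cluster):
#!/bin/bash
# The #$ lines below are read by the qsub submission tool;
# to bash they are ordinary comments.
#$ -S /bin/bash
#$ -N myjob
#$ -cwd
echo "running on $(hostname)"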
Those lines are important for batch queue systems; Slurm's sbatch uses the same trick with #SBATCH lines.
I am trying to add security to a GET query that is passed to the exec function.
If I remove the escapeshellarg() function, it works fine. How do I fix this issue?
ajax_command.php
<?php
$command = escapeshellarg($_GET['command']);
exec("/usr/bin/php-cli " . $command);
?>
Assume $_GET['command'] value is run.php -n 3
What other security checks can I add?
You want escapeshellcmd (escape a whole command, or in your case, sequence of arguments) instead of escapeshellarg (escape just a single argument).
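To see the difference, this is roughly what the shell ends up being asked to run for $_GET['command'] = run.php -n 3 (a sketch):
# escapeshellarg() quotes the whole value into ONE argument:
/usr/bin/php-cli 'run.php -n 3'
# escapeshellcmd() only escapes shell metacharacters, so the words stay separate:
/usr/bin/php-cli run.php -n 3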
Notice that although you have taken special precautions, this code still allows anyone to execute arbitrary commands on your server, by specifying a whole PHP script in a -r option. Note that php.ini cannot be used to restrict this, since its location can be overridden with -c. In short (and with a very small error margin): this code creates a severe security vulnerability.
escapeshellarg returns a quoted value, so if the input contains multiple arguments, it won't work; it ends up looking like a single string-like argument. You should probably look at splitting the command up into several parameters, so that each can be escaped individually.
It will fail unless there's a file literally called "run.php -n 3". You don't want to escape a single argument; you want to escape a filename and its arguments separately.
This is not the proper way to do this. Have a single PHP script run all your commands for you, with everything specified in command-line arguments. Escape the arguments and worry about security inside that PHP file.
Or better yet, communicate through a pipe.