In my Bash CGI script, I take a command passed as a GET parameter and execute it. This could be:
CMD='ls -al'
$CMD
Which works fine and produces expected output. But if I try to pass two commands with
CMD='ls -al; echo hello'
$CMD
or
CMD='ls -al && echo hello'
$CMD
neither command gets executed.
How can I run multiple commands from the same line/variable in my bash CGI?
You can execute the contents of a variable as bash code using bash -c:
# UNSAFE, DO NOT USE
cmd='ls -al; echo hello'
bash -c "$cmd"
Alternatively, depending on the context you want to run it in, you can use eval "$cmd" to run it as if it were a line in your own script, rather than as a separate piece of shell code to execute:
# UNSAFE, DO NOT USE
cmd='ls -al; echo hello'
eval "$cmd"
Both of these methods have serious implications for security and correctness, so I felt I had to add warnings to prevent them from being copied out of context.
For your remote shell or rootkit specifically meant to run insecure user input, you can ignore the warnings.
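To see the practical difference between the two: bash -c runs the string in a child process, so any state it sets disappears with that process, while eval runs it in the current shell. A minimal sketch:
# Still unsafe with untrusted input; for illustration only
cmd='foo=bar'
bash -c "$cmd"
echo "${foo:-unset}"   # prints "unset": the child shell's assignment is gone
eval "$cmd"
echo "${foo:-unset}"   # prints "bar": eval ran in the current shell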
Related
I have a bash shell-script with a function which exports an environment variable.
For sake of argument lets use the following example:
#!/bin/bash
function my_function()
{
export my_env_var=$1
}
Since the whole purpose is to export the variable to the main shell, I source it.
When the main shell is bash this works fine:
<bash-shell>
> source ~/tmp/my_test.sh
> my_function test
> echo $my_env_var
test
But other customers use csh, and there things start to fail when I use the same command with the same script, since csh does not know functions :-(
<csh-shell>
% source ~/tmp/my_test.sh
Badly placed ()'s
I already tried to wrap it in a wrapper-script:
#!/bin/sh
bash -c 'source ~/tmp/my_test.sh; my_function test'
echo my_env_var = $my_env_var
But my_env_var is not exported in this way:
<csh-shell>
% source ~/tmp/my_test2.sh
my_env_var: Undefined variable.
Whereas it is known in the bash shell (as can be seen by changing the 2nd script to the following):
#!/bin/sh
bash -c 'source ~/tmp/my_test.sh; my_function test; echo my_env_var in bash = $my_env_var'
echo my_env_var = $my_env_var
<csh-shell>
% source ~/tmp/my_test2.sh
my_env_var in bash = test
my_env_var: Undefined variable.
What am I missing / doing wrong so the script exports the variable when it is called from bash and when it is called from csh?
The Bourne shell and csh are not compatible; many commands are different, and csh lacks many features (it doesn't have functions at all). Plus, sooner or later you're going to have someone who uses fish, which is different still. The only way to make a non-trivial script work for both is to write it twice.
That said, if you want to set some environment variables then the general strategy is to create a script which outputs the required commands; this can be in any language (shell, Python, C); for example:
#!/bin/sh
# ... do work here ...
var="foo"
# Getting the shell in a cross-platform way isn't too easy. This was only tested
# on Linux. Can add a "-c" or "-f" flag if you need cross-platform support.
shell=$(ps -ho comm $(ps -ho ppid $$))
case "$shell" in
(csh|tcsh) echo "setenv VAR $var" ;;
(fish) echo "set -Ux VAR $var" ;;
(*) echo "export VAR=$var"
esac
And when you run it, it outputs the appropriate commands:
% ./work
export VAR=foo
% tcsh
> ./work
setenv VAR foo
> fish
martin@x270 ~> ./work
set -Ux VAR foo
And to actually set it, eval the output like so:
% eval $(./work)
% echo $VAR
foo
% tcsh
> eval `./work`
> echo $VAR
foo
> fish
martin@x270 ~> eval (./work)
martin@x270 ~> echo $VAR
foo
The downside of this is that informational messages, warnings, etc. will also get eval'd; to avoid this, make sure they always go to stderr:
echo >&2 "warning: foo"
If you don't want to run eval you can also use something slightly more complicated which prints VAR=foo, and then write a Bourne and a csh wrapper script to parse those lines; but "output the variables you want to set, instead of directly setting them" is the general approach to take to make something work in multiple incompatible shells.
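For example, a sketch of that variant (the work-vars name and the VAR=foo line are made up for illustration). The script emits plain NAME=value lines:
#!/bin/sh
# work-vars: print one NAME=value per line, with no shell syntax at all
echo "VAR=foo"
A Bourne-shell wrapper, meant to be sourced:
# Read in the current shell; piping into the loop would run it in a
# subshell and the exports would be lost
while IFS='=' read -r name value; do
    export "$name=$value"
done <<EOF
$(./work-vars)
EOF
And a csh wrapper, also sourced (assumes values contain no spaces):
# Turn each NAME=value word into "setenv NAME value" and eval it
foreach line ( `./work-vars` )
    eval "setenv `echo $line | sed 's/=/ /'`"
end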
I have a test.sh file which takes a bash command as a parameter; it does some logic, i.e. setting and checking some env vars, and then executes that input command.
#!/bin/bash
#Some other logic here
echo "Run command: $#"
eval "$#"
When I run it, here's the output
% ./test.sh echo "ok"
Run command: echo ok
ok
But the issue is, when I pass something like sh -c 'echo "ok"', I don't get the output.
% ./test.sh sh -c 'echo "ok"'
Run command: sh -c echo "ok"
%
So I tried replacing eval with exec, tried to execute $@ directly (without eval or exec), even tried to execute it and save the output to a variable, still no use.
Is there any way to run the passed command in this format and get the output?
Use case:
The script is used as an entrypoint for the Docker container; it receives the parameters from the Docker CMD and executes those to run the container.
As a quick fix I can remove the sh -c and pass the command without it, but I want to make the script reusable and not have to change the commands.
TL;DR:
This is a typical use case (perform some business logic in a Docker entrypoint script before running a compound command, given at the command line) and the recommended last line of the script is:
exec "$#"
Details
To further explain this line, some remarks and hyperlinks:
As per the Bash user manual, exec is a POSIX shell builtin that replaces the shell [with the command supplied] without creating a new process.
As a result, using exec like this in a Docker entrypoint context is important because it ensures that the CMD program that is executed will still have PID 1 and can directly handle signals, including that of docker stop (see also that other SO answer: Speed up docker-compose shutdown).
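Put together, a minimal entrypoint sketch could look like this (the export is a hypothetical placeholder for your business logic, not part of any real script):
#!/usr/bin/env bash
set -e
# ... business logic here: check env vars, wait for dependencies, etc. ...
export APP_MODE="${APP_MODE:-production}"   # hypothetical setup step
# Replace this shell with the CMD passed by Docker, keeping PID 1
exec "$@"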
The double quotes ("$@") are also important to avoid word splitting (namely, ensure that each positional argument is passed as is, even if it contains spaces). See e.g.:
#!/usr/bin/env bash
printargs () { for arg; do echo "$arg"; done; }
test0 () {
echo "test0:"
printargs $@
}
test1 () {
echo "test1:"
printargs "$#"
}
test0 /bin/sh -c 'echo "ok"'
echo
test1 /bin/sh -c 'echo "ok"'
test0:
/bin/sh
-c
echo
"ok"
test1:
/bin/sh
-c
echo "ok"
Finally, eval is a powerful bash builtin that is (1) unneeded for your use case, and (2) actually not advised in general, in particular for security reasons. E.g., if the string argument of eval relies on some user-provided input… For details on this issue, see e.g. https://mywiki.wooledge.org/BashFAQ/048 (which recaps the few situations where one would like to use this builtin, typically the command eval "$(ssh-agent -s)").
In my terminal,
prog="cat"
name=$(which $prog)
echo $name
prints /bin/cat
But in my script:
pro="$1"
prog=$(which $pro)
echo "pro is $pro"
echo "prog is "$prog""
running scriptname cat prints
pro is cat
prog is
How do I make which work? It should print prog is /bin/cat.
which(1) is an external program used to search PATH for an executable. It behaves differently on different systems and you can't rely on a useful exit code; use (from most to least portable) command -v or type -P (to find the path) or hash (to check) instead.
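Applied to the question's script, a sketch:
#!/bin/bash
pro="$1"
# command -v is the portable way to resolve a name to an executable path
if prog=$(command -v "$pro"); then
    echo "pro is $pro"
    echo "prog is $prog"
else
    echo "$pro: not found in PATH" >&2
    exit 1
fi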
Try printf '%s\n' "$PATH" inside your script as well as outside of it. Maybe the command you're looking for is not in the PATH used in the script?
That is almost certainly the cause.
I would like to connect to different shells (csh, ksh, etc.) and execute commands inside each switched shell.
Following is the sample program which reflects my intention:
#!/bin/bash
echo $SHELL
csh
echo $SHELL
exit
ksh
echo $SHELL
exit
Since I am not well versed in shell scripting, I need a pointer on how to achieve this. Any help would be much appreciated.
If you want to execute only one single command, you can use the -c option
csh -c 'echo $SHELL'
ksh -c 'echo $SHELL'
If you want to execute several commands, or even a whole script in a child shell, you can use the here-document feature of bash and pass the -s option (read commands from stdin) to the child shells:
#!/bin/bash
echo "this is bash"
csh -s <<- EOF
echo "here go the commands for csh"
echo "and another one..."
EOF
echo "this is bash again"
ksh -s <<- EOF
echo "and now, we're in ksh"
EOF
Note that you can't easily check the shell you are in with echo $SHELL, because the parent shell expands this variable to the text /bin/bash before the here-document reaches the child shell. If you want to be sure that the child shell works, you should check whether shell-specific syntax works or not.
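For example, quoting the here-document delimiter stops the parent shell from doing the expansion, and csh's own lowercase $shell variable can serve as a check (a sketch, assuming your csh sets $shell as usual):
csh -s <<- 'EOF'
echo "running in: $shell"
EOF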
It is possible to use the command line options provided by each shell to run a snippet of code.
For example, for bash use the -c option:
bash -c "$code"
bash -c 'echo hello'
zsh and fish also use the -c option.
Other shells will state the options they use in their man pages.
You need to use the -c command line option if you want to pass commands on bash startup:
#!/bin/bash
# We are in bash already ...
echo $SHELL
csh -c 'echo $SHELL'
ksh -c 'echo $SHELL'
You can pass arbitrary complex scripts to a shell, using the -c option, as in
sh -c 'echo This is the Bourne shell.'
You will save yourself a lot of headaches related to quotes and variable expansion if you wrap the call in a function reading the script on stdin, as in:
execute_with_ksh()
{
local script
script=$(cat)
ksh -c "${script}"
}
prepare_complicated_script()
{
# Write shell script on stdout,
# for instance by cat-ting a here-document.
cat <<'EOF'
echo ${SHELL}
EOF
}
prepare_complicated_script | execute_with_ksh
The advantage of this method is that it is easy to insert a tee in the pipe, or to break the pipe, to control the script being passed to the shell.
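For instance, to inspect the generated script while still running it:
# Duplicate the script to stderr on its way to ksh
prepare_complicated_script | tee /dev/stderr | execute_with_ksh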
If you want to execute the script on a remote host through ssh you should consider encoding your script in base64 to transmit it safely to the remote shell.
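A sketch of that idea, assuming GNU coreutils base64 on both ends (-w0 disables line wrapping) and remote-host as a placeholder:
# Encode the script locally; base64 output is quote-safe ASCII
encoded=$(base64 -w0 <<'EOF'
echo "hello from $(hostname)"
EOF
)
# Decode and run it on the remote side
ssh remote-host "echo '$encoded' | base64 -d | ksh -s"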
I need to execute the shell command as follows:
ssh <device> "command"
command is invoked as:
$(typeset); <function_name> \"argument_string\"; cd ...; ls ...
How exactly to quote here? Is this correct?
""$(typeset); <function_name> \"arguement_string\""; cd ...; ls ..."
I am confused with this quoting in shell scripts.
Don't try to do the quoting by hand -- ask the shell to do it for you!
command_array=( function_name "first argument" "second argument" )
printf -v command_str '%q ' "${command_array[#]}"
ssh_str="$(typeset); $command_str"
ssh machine "$ssh_str"
You can then build up command_array as you wish -- using logic to conditionally append values, with only the kind of quoting you'd usually prefer to use for those values, and let printf %q add all the additional quoting needed to make the content safe to pass through ssh.
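To see what printf %q actually produces for an argument that would otherwise be re-split:
printf '%q\n' 'echo "ok"'
# prints: echo\ \"ok\"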
If you're trying to incrementally build up a script, you can do that like so:
remote_script="$(typeset)"$'\n'
safe_append_command() {
local command_str
printf -v command_str '%q ' "$#"
remote_script+="$command_str"$'\n'
}
safe_append_command cp "$file" "$destination"
safe_append_command tar -cf /tmp/foo.tar "${destination%/*}"
# ...etc...
ssh machine "$remote_script"
Note that in this case, all expansions take place locally, when the script is being generated, and shell constructs such as redirection operators cannot be used (except by embedding them in a function you then pass to the remote system with typeset). Doing so means that no data passed to safe_append_command can be treated as code -- foreclosing large classes of potential security holes at the cost of flexibility.
I would use a here document:
ssh machine <<'EOF'
hello() {
echo "hello $1!"
}
hello "world"
EOF
Note that I wrapped the starting EOF in single quotes. Doing so prevents bash from interpreting variables or command substitutions in the local shell.