Setting environment variables for multiple commands in bash one-liner - linux

Let's say I have the following command:
$> MYENVVAR=myfolder echo $MYENVVAR && MYENVVAR=myfolder ls $MYENVVAR
I mean that MYENVVAR=myfolder is repeated.
Is it possible to set it once for both "&&"-separated commands while keeping the command on one line?

Assuming you actually need it as an environment variable (even though the example code does not really need one; some shell variables are not environment variables):
(export MYENVVAR=myfolder; echo $MYENVVAR && ls $MYENVVAR)
If you don't need it as an environment variable, then:
(MYENVVAR=myfolder; echo $MYENVVAR && ls $MYENVVAR)
The parentheses create a sub-shell; environment variables (and plain variables) set in the sub-shell do not affect the parent shell. In both commands shown, the variable is set once and then used twice, once by each of the two commands.

Parentheses spawn a new process, in which you can set variables without affecting the parent shell:
( MYENVVAR=myfolder; echo 1: $MYENVVAR; ); echo 2: $MYENVVAR;
1: myfolder
2:

Wrapping the commands into a string and using eval on them is one way not yet mentioned:
a=abc eval 'echo $a; echo $a'
a=abc eval 'echo $a && echo $a'
Or, if you want to use a general-purpose many-to-many mapping between environment variables and commands, without the need to quote your commands, you can use my trap-based function below:
envMulti()
{
    shopt -s extdebug;
    PROMPT_COMMAND="$(trap -p DEBUG | tee >(read -n 1 || echo "trap - DEBUG")); $(shopt -p extdebug); PROMPT_COMMAND=$PROMPT_COMMAND";
    eval "trap \"\
        [[ \\\"\\\$BASH_COMMAND\\\" =~ ^trap ]] \
        || { eval \\\"$@ \\\$BASH_COMMAND\\\"; false; }\" DEBUG";
}
Usage:
envMulti a=aaa b=bbb; eval 'echo $a'; eval 'echo $b'
Note: the eval 'echo ...' commands above have nothing to do with my script; you can never run a=aaa echo $a directly, because the $a gets expanded too early, before the assignment takes effect.
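You can see that early expansion in isolation (a minimal demonstration; run it in a shell where a is unset):
unset a
a=aaa echo "a is: $a"              # prints "a is: " -- $a was expanded before the assignment applied
a=aaa bash -c 'echo "a is: $a"'    # prints "a is: aaa" -- the child shell expands $a itself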
Or use it with env if you prefer (it can actually prefix any command with anything):
echo -e '#!/bin/bash\n\necho $a' > echoScript.sh
chmod +x echoScript.sh
envMulti env a=aaa; ./echoScript.sh; ./echoScript.sh
Note: I created a test script just to demonstrate usage with env, since env can't run shell built-ins like the eval used in the earlier demo.
Oh, and the above were all intended for running your own shell commands by hand. If you do anything other than that, make sure you know all the cautions about using eval -- i.e., make sure you trust the source of the commands, etc.

Did you consider using export, like
export MYENVVAR=myfolder
and then typing your commands, like echo $MYENVVAR, afterwards? That works even in sub-shells.
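For the original one-liner that becomes, e.g.:
export MYENVVAR=myfolder; echo "$MYENVVAR" && ls "$MYENVVAR"
Note that, unlike the subshell approaches above, MYENVVAR stays set in your current shell afterwards.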

Related

Exporting environment variables to both bash and csh using a bash script with functions

I have a bash shell-script with a function which exports an environment variable.
For the sake of argument, let's use the following example:
#!/bin/bash
function my_function()
{
    export my_env_var=$1
}
Since the whole purpose is to export the variable to the main shell, I source it.
When the main shell is bash this works fine:
<bash-shell>
> source ~/tmp/my_test.sh
> my_function test
> echo $my_env_var
test
But other customers use csh, and there things start to fail if I use the same command with the same script, since csh does not support functions :-(
<csh-shell>
% source ~/tmp/my_test.sh
Badly placed ()'s
I already tried to wrap it in a wrapper-script:
#!/bin/sh
bash -c 'source ~/tmp/my_test.sh; my_function test'
echo my_env_var = $my_env_var
But my_env_var is not exported in this way:
<csh-shell>
% source ~/tmp/my_test2.sh
my_env_var: Undefined variable.
It is known in the bash shell, though, as can be seen by changing the 2nd script to:
#!/bin/sh
bash -c 'source ~/tmp/my_test.sh; my_function test; echo my_env_var in bash = $my_env_var'
echo my_env_var = $my_env_var
<csh-shell>
% source ~/tmp/my_test2.sh
my_env_var in bash = test
my_env_var: Undefined variable.
What am I missing / doing wrong so the script exports the variable when it is called from bash and when it is called from csh?
The Bourne shell and csh are not compatible; many commands are different, and csh lacks many features (it doesn't have functions at all). Plus, sooner or later you're going to have someone who uses fish, which is different still. The only way to make a non-trivial script work for both is to write it twice.
That said, if you want to set some environment variables then the general strategy is to create a script which outputs the required commands; this can be in any language (shell, Python, C); for example:
#!/bin/sh
# ... do work here ...
var="foo"
# Getting the shell in a cross-platform way isn't too easy. This was only tested
# on Linux. Can add a "-c" or "-f" flag if you need cross-platform support.
shell=$(ps -ho comm $(ps -ho ppid $$))
case "$shell" in
(csh|tcsh) echo "setenv VAR $var" ;;
(fish) echo "set -Ux VAR $var" ;;
(*) echo "export VAR=$var"
esac
And when you run it, it outputs the appropriate commands:
% ./work
export VAR=foo
% tcsh
> ./work
setenv VAR foo
> fish
martin@x270 ~> ./work
set -Ux VAR foo
And to actually set it, eval the output like so:
% eval $(./work)
% echo $VAR
foo
% tcsh
> eval `./work`
> echo $VAR
foo
> fish
martin@x270 ~> eval (./work)
martin@x270 ~> echo $VAR
foo
The downside of this is that informational messages, warnings, etc. will also get eval'd; to avoid that, make sure to always send them to stderr:
echo >&2 "warning: foo"
If you don't want to run eval, you can also use something slightly more complicated which prints plain VAR=foo lines and then create Bourne-shell and csh wrapper scripts to parse those lines (see the sketch below); but "output the variables you want to set, instead of directly setting them" is the general approach to take to make something work in multiple incompatible shells.
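A minimal sketch of that wrapper idea, assuming a hypothetical ./work-plain generator that prints one KEY=VALUE pair per line (both the name and the output format are illustrative, not from the answer above):
#!/bin/sh
# Source this wrapper to export every KEY=VALUE line printed by the
# (hypothetical) ./work-plain generator into the current shell.
while IFS='=' read -r key value; do
    [ -n "$key" ] && export "$key=$value"
done <<EOF
$(./work-plain)
EOF
A csh wrapper would do the same parsing but call setenv instead of export.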

Executing `sh -c` in a bash script

I have a test.sh file which takes a bash command as a parameter; it does some logic, e.g. setting and checking some env vars, and then executes that input command.
#!/bin/bash
#Some other logic here
echo "Run command: $#"
eval "$#"
When I run it, here's the output
% ./test.sh echo "ok"
Run command: echo ok
ok
But the issue is, when I pass something like sh -c 'echo "ok"', I don't get the output.
% ./test.sh sh -c 'echo "ok"'
Run command: sh -c echo "ok"
%
So I tried changing eval to exec, tried to execute $@ directly (without eval or exec), and even tried to execute it and save the output to a variable, still with no luck.
Is there any way to run the passed command in this format and get the output?
Use case:
The script is used as an entrypoint for the docker container, it receives the parameters from docker CMD and executes those to run the container.
As a quick fix I can remove the sh -c and pass the command without it, but I want to make the script reusable and not have to change the commands.
TL;DR:
This is a typical use case (perform some business logic in a Docker entrypoint script before running a compound command, given at command line) and the recommended last line of the script is:
exec "$#"
Details
To further explain this line, some remarks and hyperlinks:
As per the Bash user manual, exec is a POSIX shell builtin that replaces the shell [with the command supplied] without creating a new process.
As a result, using exec like this in a Docker entrypoint context is important because it ensures that the CMD program that is executed will still have PID 1 and can directly handle signals, including that of docker stop (see also that other SO answer: Speed up docker-compose shutdown).
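As an illustration, a minimal entrypoint sketch along those lines (the setup step and the APP_MODE variable are hypothetical, not from the question):
#!/usr/bin/env bash
# entrypoint.sh -- hypothetical example: do some setup, then hand off to CMD.
set -e
export APP_MODE="${APP_MODE:-production}"   # stand-in for the real business logic
exec "$@"   # replace this shell so the CMD keeps PID 1 and receives signals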
The double quotes ("$@") are also important to avoid word splitting (namely, to ensure that each positional argument is passed as is, even if it contains spaces). See e.g.:
#!/usr/bin/env bash
printargs () { for arg; do echo "$arg"; done; }
test0 () {
    echo "test0:"
    printargs $@
}
test1 () {
    echo "test1:"
    printargs "$@"
}
test0 /bin/sh -c 'echo "ok"'
echo
test1 /bin/sh -c 'echo "ok"'
test0:
/bin/sh
-c
echo
"ok"
test1:
/bin/sh
-c
echo "ok"
Finally, eval is a powerful bash builtin that is (1) unneeded for your use case, and (2) not advised in general, in particular for security reasons, e.g. if the string argument of eval relies on some user-provided input. For details on this issue, see https://mywiki.wooledge.org/BashFAQ/048 (which recaps the few situations where one would want this builtin; typically, the command eval "$(ssh-agent -s)").

How to evaluate the multi-line export command to set environment variables

I have a script that generates some output, just like the echo below does. How do I export the two environment variables a and b?
I tried
echo -e "export a=3\nexport b=4"|bash
or
echo -e "export a=3\nexport b=4"|eval
or
echo -e "export a=3\nexport b=4"|exec
None of them works. Please help.
If you pipe the command to a program, the program runs in a child process, so none of its environment changes affect the original shell.
Use eval and give the string as an argument. Use ; to separate the commands rather than a newline:
eval 'export a=3; export b=4'
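If the commands come from a script rather than a literal string, capture its output with command substitution; the quotes preserve the newlines, and eval then evaluates the multi-line string line by line (a minimal sketch reusing the echo from the question):
eval "$(echo -e 'export a=3\nexport b=4')"
echo "$a $b"    # prints: 3 4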

What's the point of eval/bash -c as opposed to just evaluating a variable?

Suppose you have the following command stored in a variable:
COMMAND='echo hello'
What's the difference between
$ eval "$COMMAND"
hello
$ bash -c "$COMMAND"
hello
$ $COMMAND
hello
? Why is the last version almost never used if it is shorter and (as far as I can see) does exactly the same thing?
The third form is not at all like the other two -- but to understand why, we need to go into the order of operations bash follows when interpreting a command, and look at which of those operations are performed with each method.
Bash Parsing Stages
1. Quote Processing
2. Splitting Into Commands
3. Special Operator Parsing
4. Expansions
5. Word Splitting
6. Globbing
7. Execution
Using eval "$string"
eval "$string" follows all the above steps starting from #1. Thus:
Literal quotes within the string become syntactic quotes
Special operators such as >() are processed
Expansions such as $foo are honored
Results of those expansions are split on IFS characters (whitespace, by default) into separate words
Those words are expanded as globs if they parse as such and have available matches, and finally the command is executed.
Using sh -c "$string"
...performs the same steps as eval does, but in a new shell launched as a separate process; thus, changes to variable state, the current directory, etc. will expire when the new process exits. (Note, too, that the new shell may be a different interpreter supporting a different language; i.e. sh -c "foo" will not support the same syntax that bash, ksh, zsh, etc. do.)
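A quick illustration of that expiry (the paths shown are only examples):
$ pwd
/home/user
$ sh -c 'cd /tmp && pwd'
/tmp
$ pwd
/home/user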
Using $string
...starts at step 5, "Word Splitting".
What does this mean?
Quotes are not honored.
A stored command printf '%s\n' "two words" will thus pass its quote characters through literally, running printf with the arguments '%s\n', "two and words", as opposed to the usual/expected behavior of printf receiving %s\n and two words (with the quotes being consumed by the shell).
Splitting into multiple commands (on ;s, &s, or similar) does not take place.
Thus:
s='echo foo && echo bar'
$s
...will emit the following output:
foo && echo bar
...instead of the following, which would otherwise be expected:
foo
bar
Special operators and expansions are not honored.
No $(foo), no $foo, no <(foo), etc.
Redirections are not honored.
>foo or 2>&1 is just another word created by string-splitting, rather than a shell directive.
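A short transcript showing all three effects at once -- the quotes stay literal, and the redirection is passed to echo as ordinary arguments, so no out.txt file is created:
$ s='echo "hello world" > out.txt'
$ $s
"hello world" > out.txt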
$ bash -c "$COMMAND"
This version starts up a new bash interpreter, runs the command, and then exits, returning control to the original shell. You don't need to be running bash at all in the first place to do this, you can start a bash interpreter from tcsh, for example. You might also do this from a bash script to start with a fresh environment or to keep from polluting your current environment.
EDIT:
As @CharlesDuffy points out, starting a new bash shell in this way will clear shell variables, but environment variables will be inherited by the spawned shell process.
Using eval causes the shell to parse your command twice. In the example you gave, executing $COMMAND directly or doing an eval are equivalent, but have a look at the answer here to get a more thorough idea of what eval is good (or bad) for.
There are at least times when they are different. Consider the following:
$ cmd="echo \$var"
$ var=hello
$ $cmd
$var
$ eval $cmd
hello
$ bash -c "$cmd"
$ var=world bash -c "$cmd"
world
which shows the different points at which variable expansion is performed. It's even clearer if we do set -x first:
$ set -x
$ $cmd
+ echo '$var'
$var
$ eval $cmd
+ eval echo '$var'
++ echo hello
hello
$ bash -c "$cmd"
+ bash -c 'echo $var'
$ var=world bash -c "$cmd"
+ var=world
+ bash -c 'echo $var'
world
We can see here much of what Charles Duffy talks about in his excellent answer. For example, attempting to execute the variable directly prints $var because parameter expansion and those earlier steps had already been done, and so we don't get the value of var, as we do with eval.
The bash -c version only inherits exported variables from the parent shell, and since I didn't export var, it's not available to the new shell.
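That last point is easy to verify (a minimal transcript):
$ var=hello
$ bash -c 'echo "var is: $var"'
var is:
$ export var
$ bash -c 'echo "var is: $var"'
var is: hello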

Linux script: Reinterpret environment variable

I am creating a Bash script which reads some other environment variables:
echo "Exporting configuration variables..."
while IFS="=" read -r k v; do
    key=$k
    value=$v
    if [[ ${#key} > 0 && ${#value} > 0 ]]; then
        export $key=$value
    fi
done < $HOME/myfile
and have the variable:
$a=$b/c/d/e
and want to call $a as in:
cp myOtherFile $a
The result for the destination folder for the copy is "$b/c/d/e", and an error is shown:
"$b/c/d/e" : No such file or directory
because it is interpreted literally as a folder path.
Can this path be reinterpreted before being used in the cp command?
You need eval to do this:
$ var=foo
$ x=var
$ eval $x=another_value
$ echo $var
another_value
I recommend reading this doc before using eval: http://mywiki.wooledge.org/BashFAQ/048
And a safer approach is to use declare instead of eval:
declare "$x=another_value"
Thanks to chepner for the latter.
It sounds like you want $HOME/myfile to support Bash notations, such as parameter-expansion. I think the best way to do that is to modify $HOME/myfile to be, in essence, a Bash script:
export a=$b/c/d/e
and use the source builtin to run it as part of the current Bash script:
source $HOME/myfile
... commands ...
cp myOtherFile "$a"
... commands ...
Try this:
cp myOtherFile `echo $a`
