Exporting a script variable using `sh -cx` - linux

I'm trying to export a variable from a script in the following manner:
sh -xc "<script here>"
But cannot get it to work at all. I've tried several techniques such as:
sh -xc "./xxx.sh" (exporting a variable yyy from the file itself)
sh -xc "./xxx.sh && export yyy=1"
(had xxx.sh exit 0)
sh -xc ". ./xxx.sh"
As well as several permutations of the above, but no dice on any of them.
Unfortunately, I must conform to the sh -xc "<script here>" style. Any script I execute will be placed inside of the quotations, file and/or command(s). There's no way around this.
Is what I'm asking even possible? If so, how?
Thanks!

You can't export a variable to the calling shell from a shell script, because the script runs in a child shell process, and only children of that child shell would inherit the export.
The reason for using source is to have the current shell execute the commands instead.
It's very common to place export commands in a file such as .bashrc, which bash sources on startup (other shells have similar files).
But you can do as follows:
shell$ cat > script.sh
#!/bin/sh
echo export myTest=success
shell$ chmod u+x script.sh
And then have the current shell execute that output
shell$ `./script.sh`
shell$ echo $myTest
success
shell$ /bin/sh
$ echo $myTest
success
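An equivalent form that avoids the word-splitting pitfalls of bare backticks is to eval the script's output. This is a self-contained sketch of the same idea (file and variable names follow the example above):

```shell
# The script prints an export statement instead of running it,
# so the *calling* shell can evaluate that output itself:
cat > script.sh <<'EOF'
#!/bin/sh
echo "export myTest=success"
EOF
chmod u+x script.sh

# Evaluate the script's output in the current shell:
eval "$(./script.sh)"
echo "$myTest"
```

This still fits the required sh -xc "<script here>" shape, e.g. sh -xc 'eval "$(./script.sh)"; echo $myTest', though the variable only lives as long as that sh process.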

Related

shopt -s extdebug in .bashrc not working in script files

I am writing a bash script (echoo.sh) with the intention of echoing each command before it is executed. I source this script (echoo.sh) inside .bashrc, but it does not execute for commands run in a script file (tmp.sh) with the bash shebang. Below is the code I have so far
echoo.sh
#!/usr/bin/env bash
shopt -s extdebug
get_hacked () {
  [ -n "$COMP_LINE" ] && return                      # not needed for completion
  [ "$BASH_COMMAND" = "$PROMPT_COMMAND" ] && return  # not needed for prompt
  local this_command=$BASH_COMMAND
  echo "$this_command"
}
trap 'get_hacked' DEBUG
When I open a shell and run any command - It works. But for stuff in a script file it doesn't work.
SOME FURTHER TRIES:
I tried sourcing the .bashrc file within the script file (tmp.sh) - didn't work.
I sourced echoo.sh inside tmp.sh and it worked.
SO, I am trying to understand
Why doesn't it work if I just source my script in .bashrc for stuff that runs in scripts?
Why doesn't further try #1 work when #2 does?
AND finally, what can I do so that I don't have to source echoo.sh in every script file for this to work? Can I source my script in one place and change some setting so that it works in all scenarios?
I source this script (echoo.sh) inside .bashrc. But it does not execute for commands run in script file(tmp.sh) with the bash shebang
Indeed it won't, because you are invoking the shell non-interactively!
The shell can be spawned either interactively or non-interactively. When bash is invoked as an interactive login shell it first reads and executes commands from the file /etc/profile, if that file exists. After reading that file, it looks for ~/.bash_profile, ~/.bash_login, and ~/.profile, in that order, and reads and executes commands from the first one that exists and is readable.
When an interactive shell that is not a login shell is started, bash reads and executes commands from ~/.bashrc, if that file exists.
When you run a shell script with an interpreter set, it opens a new sub-shell that is non-interactive and does not have the option -i set in the shell options.
Looking into ~/.bashrc closely you will find a line saying
# If not running interactively, don't do anything
[[ "$-" != *i* ]] && return
which means the guard takes effect in the script you are calling. For example, consider the case below, where I spawn a non-interactive shell explicitly using the -c option (-x just enables debug mode):
bash -cx 'source ~/.bashrc'
+ source /home/foobaruser/.bashrc
++ [[ hxBc != *i* ]]
++ return
which means the rest of ~/.bashrc was not executed because of this guard. There is, however, an option for reading a start-up file in such non-interactive cases, defined by the BASH_ENV environment variable. The behavior is as if this line were executed:
if [ -n "$BASH_ENV" ]; then . "$BASH_ENV"; fi
You can define a file and pass its path via that environment variable:
echo 'var=10' > non_interactive_startup_file
BASH_ENV=non_interactive_startup_file bash -x script.sh
Alternatively, run your shell script as if an interactive non-login shell were spawned, by passing the -i flag. Re-using the example above, with -i the ~/.bashrc file is now sourced.
bash -icx 'source ~/.bashrc'
You could also set this option in the interpreter shebang: #!/bin/bash -i
So to answer your questions from the above inferences,
Why doesn't it work if I just source my script in .bashrc for stuff that runs in scripts?
It won't, because the guard in ~/.bashrc returns immediately when the shell is non-interactive. Bypass it by passing -i to the script, i.e. bash -i <script>
Why doesn't further try #1 work when #2 does?
Because try #1 still depends on ~/.bashrc actually being read, and that never happens in a non-interactive shell. When you sourced echoo.sh directly inside tmp.sh, its settings took effect in the shell launched by tmp.sh.
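To illustrate the BASH_ENV route with the asker's setup, here is a self-contained sketch. The file names echoo.sh and tmp.sh follow the question; the trap body is lightly adapted to prefix each command with "+ " so the announcement is distinguishable from the command's own output:

```shell
# A trimmed echoo.sh: announce each command before it runs.
cat > echoo.sh <<'EOF'
shopt -s extdebug
get_hacked () {
  [ -n "$COMP_LINE" ] && return                      # not needed for completion
  [ "$BASH_COMMAND" = "$PROMPT_COMMAND" ] && return  # not needed for prompt
  echo "+ $BASH_COMMAND"
}
trap 'get_hacked' DEBUG
EOF

# A trivial script to run non-interactively.
cat > tmp.sh <<'EOF'
#!/bin/bash
echo hello
EOF
chmod +x tmp.sh

# BASH_ENV makes the non-interactive bash source echoo.sh first,
# so the DEBUG trap is active for the script's commands:
BASH_ENV=./echoo.sh bash tmp.sh
```

Without the BASH_ENV prefix, only "hello" is printed; with it, each command is announced first.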

exporting PATH not working in a bash shell on linux

For example, I type to the following command:
# PATH=$PATH:/var/test
# echo $PATH
........./var/test // working
# export PATH
Next, I open another bash shell session to test if the export works by typing the following command:
# echo $PATH
........ // not working as in I don't see /var/test path
You have set the PATH environment variable only for your current bash session. You need to add the line PATH=$PATH:/var/test to ~/.bashrc so that it takes effect in every bash shell.
Just run the following command to append it to your rc ("run commands") file (rc files contain startup/initialization commands). Note the single quotes: they prevent $PATH from being expanded at write time, so the line stored in ~/.bashrc extends whatever PATH is at shell startup rather than freezing today's value into the file:
echo 'PATH=$PATH:/var/test' >> ~/.bashrc
More info: https://en.wikipedia.org/wiki/Run_commands
https://superuser.com/questions/789448/choosing-between-bashrc-profile-bash-profile-etc
Exporting a variable makes it available only in child processes spawned/started from that bash shell.
As an example:
$ export var=abcd
$ sh
$ echo "$var"
abcd
$ exit
$ echo "$var"
abcd
$
sh is a child process of the bash shell, hence it gets the value of var; since the new bash session you open is a different process altogether, it does not get the updated PATH value.
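A minimal sketch of this inheritance rule (the variable names are illustrative): an exported variable reaches child processes, a plain assignment does not.

```shell
#!/bin/sh
# Exported: visible in any child process of this shell.
export var=abcd
echo "child sees exported var: [$(sh -c 'echo "$var"')]"

# Not exported: a plain shell variable stays in this process only.
unexported=xyz
echo "child sees plain var:    [$(sh -c 'echo "$unexported"')]"
```

The first line prints [abcd]; the second prints [], because the child sh never received the unexported variable in its environment.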

export doesn't work in shell script but does via CLI

I have the following file - test.sh - in .:
#!/bin/sh
export ASDF=test
I do chmod +x test.sh, then ./test.sh and finally echo $ASDF and... nothing. It's as though $ASDF hasn't been set. But if I run the same commands via the CLI instead of a shell script, it works just fine and $ASDF is defined.
Why isn't the shell script working?
It is because:
./test.sh
will create a sub-shell and set the environment variable in that sub-shell. Once the sub-shell exits, the variable isn't available in the parent shell.
Use this form instead, which avoids forking a sub-shell and executes test.sh in the current shell itself:
. ./test.sh
OR:
source ./test.sh
Now the variable ASDF will be available in the current shell as well.
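A self-contained demonstration of the difference, recreating a minimal test.sh:

```shell
#!/bin/sh
cat > test.sh <<'EOF'
#!/bin/sh
export ASDF=test
EOF
chmod +x test.sh

./test.sh
echo "after executing: [$ASDF]"   # []: the export happened in a child shell

. ./test.sh
echo "after sourcing:  [$ASDF]"   # [test]: the current shell ran the export
```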

command not found when executing nested bash scripts

I have a bash script that executes another bash script:
ex:
a script named "rotator" calls a script named "s3-get", like below
!# /bin/bash
...
./s3-get {and params here}
All commands such as "cat", "basename", etc. run correctly here
Within the "s3-get" script there is code as:
!# /bin/bash
cat > /dev/null << EndOfLicense
...
readonly weAreKnownAs="$(basename $0)"
...
main "$#"
So, if I simply execute the s3-get script directly from the shell, it runs perfectly. When I try to execute it from the "rotator" script, I get the error "cat: command not found". I can fix this by changing "cat" to "/bin/cat", but I don't think that's correct since, as stated above, the script runs fine when executed standalone. If I fix the "cat" command that way, the next error raised is "basename: command not found", then "main: command not found".
I am pretty new to shell programming, so any help is appreciated.
Thank you
Try $ echo 'export PATH=$PATH:/root/scripts/RotateVideos' >> ~/.bashrc && source ~/.bashrc in the command line and then just call it using s3-get in your script. Alternatively use cd /root/scripts/RotateVideos && bash s3-get.
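Rather than editing ~/.bashrc, a common alternative is to resolve the calling script's own directory and invoke the sibling by that path, so it works regardless of the caller's current directory. A self-contained sketch (the demo/ layout, file names, and argument are invented for illustration):

```shell
#!/bin/bash
mkdir -p demo

# A stand-in for the real s3-get, so the sketch is runnable:
cat > demo/s3-get <<'EOF'
#!/bin/bash
echo "s3-get ran with: $*"
EOF

# The caller resolves its own directory, then calls the sibling:
cat > demo/rotator <<'EOF'
#!/bin/bash
script_dir="$(cd "$(dirname "$0")" && pwd)"
"$script_dir/s3-get" file.mp4
EOF

chmod +x demo/s3-get demo/rotator
demo/rotator
```

This sidesteps PATH entirely; the design choice is that a script should locate its private helpers relative to itself, not rely on the invoker's environment.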

Why doesn't a shell get variables exported by a script run in a subshell?

I have two scripts 1.sh and 2.sh.
1.sh is as follows:
#!/bin/sh
variable="thisisit"
export variable
2.sh is as follows:
#!/bin/sh
echo $variable
According to what I read, exporting like this should make variables from one shell script accessible in another. But it is not working in my scripts.
If you are executing your file like sh 1.sh or ./1.sh, then you are executing it in a sub-shell.
If you want the changes to be made in your current shell, you could do:
. 1.sh
# OR
source 1.sh
Please consider going through the reference documentation.
"When a script is run using source [or .] it runs within the existing shell, any variables created or modified by the script will remain available after the script completes. In contrast if the script is run just as filename, then a separate subshell (with a completely separate set of variables) would be spawned to run the script."
export puts a variable in the executing shell's environment so it is passed to processes executed by the script, but not to the process calling the script or any other processes. Try executing
#!/bin/sh
FOO=bar
env | grep '^FOO='
and
#!/bin/sh
FOO=bar
export FOO
env | grep '^FOO='
to see the effect of export.
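The two snippets can be combined into one runnable sketch that reports whether FOO is visible to child processes before and after export:

```shell
#!/bin/sh
unset FOO    # start from a clean slate in case FOO is already exported
FOO=bar

# A plain assignment does not reach the environment that env reports:
if env | grep -q '^FOO='; then
  echo "before export: visible"
else
  echo "before export: hidden"
fi

export FOO

# After export, child processes (like env) see FOO=bar:
if env | grep -q '^FOO='; then
  echo "after export: visible"
fi
```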
To get the variable from 1.sh to 2.sh, either call 2.sh from 1.sh, or import 1.sh in 2.sh:
#!/bin/sh
. ./1.sh
echo $variable
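Putting the pieces together as a runnable sketch, recreating both files from the question:

```shell
#!/bin/sh
cat > 1.sh <<'EOF'
#!/bin/sh
variable="thisisit"
export variable
EOF

# 2.sh imports 1.sh into its own shell process, then uses the variable:
cat > 2.sh <<'EOF'
#!/bin/sh
. ./1.sh
echo "$variable"
EOF
chmod +x 2.sh

./2.sh    # prints: thisisit
```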
