Debug bash/ksh script and subscripts - linux

I know that to debug a script I can issue the command
set -x
on the first line. The problem is that when the script launches other scripts, they do not inherit this setting. So my question is whether there is some way to set this flag globally for the shell and all subshells, or for a script and all scripts launched by it?

In Bash you can use export SHELLOPTS. It will make all Bash subshells inherit the -x option (as well as all the other options in SHELLOPTS!).
Example:
export SHELLOPTS
bash -x script1.sh
See bash recursive xtrace
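For illustration, a minimal sketch of how the option propagates, using two hypothetical scripts parent.sh and child.sh (the exact trace output may vary):
$ cat parent.sh
#!/bin/bash
export SHELLOPTS    # child bash processes inherit the currently set options
./child.sh
$ cat child.sh
#!/bin/bash
echo "inside child"
$ bash -x parent.sh
+ export SHELLOPTS
+ ./child.sh
+ echo 'inside child'
inside child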

It depends; you can call your subshells with -x too.

Or put set -x inside the shell script itself:
$ cat shell1.sh
echo "Shell1"
$ cat shell2.sh
#!/bin/bash
set -x
./shell1.sh
echo "shell2.sh"
$ ./shell2.sh
+ ./shell1.sh
Shell1
+ echo shell2.sh
shell2.sh
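If you also want shell1.sh traced internally (not just the call to it), one option, sketched here as an assumption rather than taken from the answer above, is to invoke the child with -x explicitly:
$ cat shell2.sh
#!/bin/bash
set -x
bash -x ./shell1.sh    # trace the child's own commands as well
echo "shell2.sh"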

Related

bash command working from terminal but not from script [duplicate]

a.sh
#! /bin/sh
export x=/usr/local
We can do source ./a.sh on the command line, but I need to do the export through a shell script.
b.sh
#! /bin/sh
. ~/a.sh
No error... but $x on the command line shows nothing, so it didn't get exported.
Any idea how to make it work?
a.sh
#! /bin/sh
export x=/usr/local
-----------
admin#client: ./a.sh
admin#client: echo $x
admin#client: <insert ....>
You can put export statements in a shell script and then use the 'source' command to execute it in the current process:
source a.sh
You can't do an export through a shell script, because a shell script runs in a child shell process, and only children of the child shell would inherit the export.
The reason for using source is to have the current shell execute the commands.
It's very common to place export commands in a file such as .bashrc, which bash will source on startup (or in similar files for other shells).
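As a quick illustration of the difference, using the same a.sh as above (a sketch; output may vary):
$ ./a.sh        # runs in a child shell, so the export dies with that child
$ echo $x

$ . ./a.sh      # runs in the current shell, so x survives
$ echo $x
/usr/local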
Another idea is that you could create a shell script which generates an export command as its output:
shell$ cat > script.sh
#!/bin/sh
echo export foo=bar
^D
chmod u+x script.sh
And then have the current shell execute that output
shell$ `./script.sh`
shell$ echo $foo
bar
shell$ /bin/sh
$ echo $foo
bar
(note above that the invocation of the script is surrounded by backticks, to cause the shell to execute the output of the script)
Answering my own question here, using the answers above: if I have more than one related variable to export which use the same value as part of each export, I can do this:
#!/bin/bash
export TEST_EXPORT=$1
export TEST_EXPORT_2=${1}_2
export TEST_EXPORT_TWICE=${1}_${1}
and save it as e.g. ~/Desktop/TEST_EXPORTING
and finally chmod +x ~/Desktop/TEST_EXPORTING
--
After that, running it with source ~/Desktop/TEST_EXPORTING bob
and then checking with export | grep bob should show what you expect.
Exporting a variable into the environment only makes that variable visible to child processes. There is no way for a child to modify the environment of its parent.
Another way to do it (to steal/expound upon the idea above) is to put the script in ~/bin and make sure ~/bin is in your PATH. Then you can access your variable globally. This is just an example I use to compile my Go source code, which needs the GOPATH variable to point to the current directory (assuming you're in the directory your source code is in):
From ~/bin/GOPATH:
#!/bin/bash
echo declare -x GOPATH=$(pwd)
Then you just do:
#> $(GOPATH)
So you can now use $(GOPATH) from within your other scripts too, such as custom build scripts, which can invoke it and declare the variable on the fly thanks to $(pwd).
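As a hedged sketch, another script could also consume that generator explicitly with eval (the go command and the assumption of a space-free path are mine, not from the answer above):
#!/bin/bash
# hypothetical build script: evaluate the "declare -x GOPATH=..." line that
# the ~/bin/GOPATH helper prints, then build with that environment
eval "$(GOPATH)"
echo "building with GOPATH=${GOPATH}"
go build ./...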
script1.sh
#!/bin/bash
# Build a key from the parent shell's PID and a timestamp derived from
# /proc/<pid>/sched, so scripts started from the same shell agree on it.
shell_ppid=$PPID
shell_epoch=$(grep se.exec_start "/proc/${shell_ppid}/sched" | sed 's/[[:space:]]//g' | cut -f2 -d: | cut -f1 -d.)
now_epoch=$(($(date +%s%N)/1000000))
shell_start=$(( (now_epoch - shell_epoch)/1000 ))
# Hash the key into a per-shell temp directory and write the shared variables there.
env_md5=$(md5sum <<<"${shell_ppid}-${shell_start}" | sed 's/[[:space:]]//g' | cut -f1 -d-)
tmp_dir="/tmp/ToD-env-${env_md5}"
mkdir -p "${tmp_dir}"
ENV_PROPS="${tmp_dir}/.env"
echo "FOO=BAR" > "${ENV_PROPS}"
script2.sh
#!/bin/bash
# Recompute the same per-shell key as script1.sh, then read the shared variables back.
shell_ppid=$PPID
shell_epoch=$(grep se.exec_start "/proc/${shell_ppid}/sched" | sed 's/[[:space:]]//g' | cut -f2 -d: | cut -f1 -d.)
now_epoch=$(($(date +%s%N)/1000000))
shell_start=$(( (now_epoch - shell_epoch)/1000 ))
env_md5=$(md5sum <<<"${shell_ppid}-${shell_start}" | sed 's/[[:space:]]//g' | cut -f1 -d-)
tmp_dir="/tmp/ToD-env-${env_md5}"
mkdir -p "${tmp_dir}"
ENV_PROPS="${tmp_dir}/.env"
source "${ENV_PROPS}"
echo "$FOO"
./script1.sh
./script2.sh
BAR
It persists for the scripts run in the same parent shell, and it prevents collisions.

Bash command option clarification bash -ex

Could you please explain to me what exactly this shell command does?
It is quite difficult to retrieve a description of this -ex option.
sh #!/bin/bash -ex
Thanks in advance
It means you're invoking a new bash shell with the -e and -x shell options.
See shell options here: https://tldp.org/LDP/abs/html/options.html
-e errexit Abort script at first error, when a command exits with non-zero status (except in until or while loops, if-tests, list constructs)
-x xtrace Similar to -v, but expands commands
since -x is similar to -v:
-v verbose Print each command to stdout before executing it
So it's actually dropping into the next shell level:
$ echo $SHLVL
1
$ sh #!/bin/bash -ex
$ echo $SHLVL
2
and in this level-2 shell, the -e and -x options are activated.
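To see the two options in action, here is a small sketch (demo.sh is just a made-up example):
$ cat demo.sh
#!/bin/bash -ex
echo "first"
false               # non-zero exit status: -e aborts the script here
echo "never reached"
$ ./demo.sh
+ echo first
first
+ false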

Unix shell scripting: pass shell options (-x etc.) to nested scripts

How can I run nested shell scripts with the same option? For example,
parent.sh
#!/bin/sh
./child.sh
child.sh
#!/bin/sh
ls
How can I modify parent.sh so that when I run it with sh -x parent.sh, the -x option is effective in child.sh as well and the execution of ls is displayed on my console?
I'm looking for a portable solution which is effective for rare situations such as system users with /bin/false as their registered shell. Will the $SHELL environment variable be of any help?
Clarification: I sometimes want to call parent.sh with -x, sometimes with -e, depending on the situation. So the solution must not involve hard-coding the flags.
If you use bash, I can recommend the following:
#!/bin/bash
export SHELLOPTS
./child.sh
You can propagate it as many levels down as you need; you can also use echo $SHELLOPTS in every script down the line to see what is happening and how the options are propagated, if you need to understand it better.
But for /bin/sh it will fail with /bin/sh: SHELLOPTS: readonly variable, because of how POSIX is enforced on /bin/sh on various systems; more info here: https://lists.gnu.org/archive/html/bug-bash/2011-10/msg00052.html
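For a strictly POSIX /bin/sh, one possible workaround (my own sketch, not taken from the answers here) is to have the parent forward whichever of its current option letters, read from $-, it cares about:
#!/bin/sh
# parent.sh: re-invoke child.sh with the -x / -e flags this shell currently has.
# $- holds the active single-letter options, e.g. it contains "x" under sh -x.
child_opts=""
case $- in *x*) child_opts="$child_opts -x" ;; esac
case $- in *e*) child_opts="$child_opts -e" ;; esac
sh $child_opts ./child.sh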
It looks like a hack and is probably not the best way, but it will do exactly what you want.
One way to do it is to create aliases that act as wrappers for sh:
alias saveShell='cp /bin/sh $some_safe_place'
alias shx='cp $some_safe_place /bin/x_sh; rm /bin/sh; echo "/bin/x_sh -x \$@" > /bin/sh; chmod 755 /bin/sh '
alias she='cp $some_safe_place /bin/e_sh; rm /bin/sh; echo "/bin/e_sh -e \$@" > /bin/sh; chmod 755 /bin/sh '
alias restoreShell='cp $some_safe_place /bin/sh'
How to use:
Run saveShell, then use shx or she. If you want to switch from -x to -e, run restoreShell and then run shx or she again.
Then run the script as usual:
sh ./parent.sh
BE VERY CAREFUL WHEN REPLACING /bin/sh LIKE THIS
Another solution:
replace #!/bin/sh with #!/bin/sh -x or #!/bin/sh -e using sed in all .sh files before running the script.
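A rough sketch of that sed approach (it assumes GNU sed's -i and that every script's first line is exactly #!/bin/sh):
sed -i '1s|^#!/bin/sh$|#!/bin/sh -x|' ./*.sh    # switch all scripts to -x
sh ./parent.sh
sed -i '1s|^#!/bin/sh -x$|#!/bin/sh|' ./*.sh    # revert afterwards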

Shell scripting shell inside shell

I would like to connect to different shells (csh, ksh etc.,) and execute command inside each switched shell.
Following is the sample program which reflects my intention:
#!/bin/bash
echo $SHELL
csh
echo $SHELL
exit
ksh
echo $SHELL
exit
Since I am not well versed in shell scripting, I need a pointer on how to achieve this. Any help would be much appreciated.
If you want to execute only one single command, you can use the -c option
csh -c 'echo $SHELL'
ksh -c 'echo $SHELL'
If you want to execute several commands, or even a whole script in a child-shell, you can use the here-document feature of bash and use the -s (read commands from stdin) on the child shells:
#!/bin/bash
echo "this is bash"
csh -s <<- EOF
echo "here go the commands for csh"
echo "and another one..."
EOF
echo "this is bash again"
ksh -s <<- EOF
echo "and now, we're in ksh"
EOF
Note that you can't easily check which shell you are in by echo $SHELL, because the parent shell expands this variable itself (to something like /bin/bash) before the child shell ever sees it. If you want to be sure that the child shell works, you should check whether some shell-specific syntax works or not.
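A rough illustration of that expansion behaviour, with ksh as the child and a purely hypothetical variable parent_only; quoting the here-document delimiter keeps the parent from expanding it:
#!/bin/bash
parent_only="visible only in this bash process"
# unquoted delimiter: bash substitutes $parent_only before ksh runs the line
ksh -s <<- EOF
echo "unquoted: $parent_only"
EOF
# quoted delimiter: ksh receives the literal text and expands it itself (to nothing)
ksh -s <<- 'EOF'
echo "quoted: $parent_only"
EOF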
It is possible to use the command line options provided by each shell to run a snippet of code.
For example, for bash use the -c option:
bash -c "$code"
bash -c 'echo hello'
zsh and fish also use the -c option.
Other shells will state the options they use in their man pages.
You need to use the -c command line option if you want to pass commands on bash startup:
#!/bin/bash
# We are in bash already ...
echo $SHELL
csh -c 'echo $SHELL'
ksh -c 'echo $SHELL'
You can pass arbitrary complex scripts to a shell, using the -c option, as in
sh -c 'echo This is the Bourne shell.'
You will save yourself a lot of headaches related to quotes and variable expansion if you wrap the call in a function that reads the script on stdin, as in:
execute_with_ksh()
{
local script
script=$(cat)
ksh -c "${script}"
}
prepare_complicated_script()
{
# Write shell script on stdout,
# for instance by cat-ting a here-document.
cat <<'EOF'
echo ${SHELL}
EOF
}
prepare_complicated_script | execute_with_ksh
The advantage of this method is that it is easy to insert a tee in the pipe, or to break the pipe, to inspect the script being passed to the shell.
If you want to execute the script on a remote host through ssh, you should consider encoding your script in base64 to transmit it safely to the remote shell.
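A minimal sketch of that base64-over-ssh idea (the host and user are placeholders, and base64 -d assumes GNU coreutils on the remote side):
# encode locally, decode and run remotely; avoids most quoting issues in transit
prepare_complicated_script | base64 | ssh user@remote.example.com 'base64 -d | ksh -s'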

Piping a shell script to bash and launch interactive bash

Consider the following shell script on example.com
#!/bin/bash
export HELLO_SCOPE=WORLD
eval "$@"
Now, I would like to download and then execute this shell script with parameters in the simplest way and be able to launch an interactive bash terminal with the HELLO_SCOPE variable set.
I have tried
curl http://example.com/hello_scope.sh | bash -s bash -i
But it quits the shell immediately. From what I can understand, that's because curl's stdout, the script, remains the stdin of bash, preventing it from starting interactively (as that would require my keyboard to be stdin).
Is there a way to avoid this without going through the extra step of creating a temporary file with the shell script?
You can source it:
# open a shell
. <(curl http://example.com/hello_scope.sh)
# type commands ...
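For instance, a hedged example of what such a session might look like (the -s flag just silences curl's progress output):
$ . <(curl -s http://example.com/hello_scope.sh)
$ echo $HELLO_SCOPE
WORLD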
You could just download this script (using wget, for example) and source it, couldn't you?
script_name="hello_scope.sh"
[[ -f $script_name ]] && rm -rf "$script_name"
wget "http://example.com/$script_name" -O "$script_name" -o /dev/null \
  && chmod u+x "$script_name" \
  && source "$script_name"
You could use . "$script_name" instead of source "$script_name" if you want (. is POSIX compliant). You could write the previous code in a script and source it to get an interactive shell with the $HELLO_SCOPE variable set.
Finally you could remove the eval line in your remote shell script.
