Bash - File local variables - prompt

I just wrote a small file to set my PS1 variable. This file is sourced from my .bashrc. Now I have a couple of questions regarding this approach.
But first the code:
setprompt:
# Normal variables
BOLD="$(tput bold)"
RESET="$(tput sgr0)"
RED="$(tput setaf 1)"
GREEN="$(tput setaf 2)"
YELLOW="$(tput setaf 3)"
BLUE="$(tput setaf 4)"
PINK="$(tput setaf 5)"
CYAN="$(tput setaf 6)"
GRAY="$(tput setaf 7)"
# Make non-printable variables
PROMPT_BOLD="\[$BOLD\]"
PROMPT_RESET="\[$RESET\]"
PROMPT_RED="\[$RED\]"
PROMPT_GREEN="\[$GREEN\]"
PROMPT_YELLOW="\[$YELLOW\]"
PROMPT_BLUE="\[$BLUE\]"
PROMPT_PINK="\[$PINK\]"
PROMPT_CYAN="\[$CYAN\]"
PROMPT_GRAY="\[$GRAY\]"
# Other variables
USERNAME="\u"
FULL_HOSTNAME="\H"
SHORT_HOSTNAME="\h"
FULL_WORKING_DIR="\w"
BASE_WORKING_DIR="\W"
# Throw it together
FINAL="${PROMPT_RESET}${PROMPT_BOLD}${PROMPT_GREEN}"
FINAL+="${USERNAME}#${SHORT_HOSTNAME} "
FINAL+="${PROMPT_RED}${FULL_WORKING_DIR}\$ "
FINAL+="${PROMPT_RESET}"
# Export variable
export PS1="${FINAL}"
.bashrc:
..
source ~/.dotfiles/other/setprompt
..
My questions:
Will this approach slow down my bash startup? Should I just write one ugly unreadable line of code instead of doing these variable definitions/sourcing?
I noticed that the variables defined in setprompt are still defined in my .bashrc. I don't like this behaviour, since it's not obvious to an editor of .bashrc that sourcing setprompt defines these variables. Is this just how source behaves? What can I do about it?
Edit:
This is the approach I use now (recommended by tripleee):
getPrompt.sh:
#!/bin/bash
getPrompt () {
# Bold/Reset
local PROMPT_BOLD="\[$(tput bold)\]"
local PROMPT_RESET="\[$(tput sgr0)\]"
# Colors
local PROMPT_RED="\[$(tput setaf 1)\]"
local PROMPT_GREEN="\[$(tput setaf 2)\]"
# Miscellaneous
local USERNAME="\u"
local SHORT_HOSTNAME="\h"
local FULL_WORKING_DIR="\w"
# Print for later use
printf "%s%s%s%s" "${PROMPT_RESET}${PROMPT_BOLD}${PROMPT_GREEN}" \
"${USERNAME}#${SHORT_HOSTNAME} " \
"${PROMPT_RED}${FULL_WORKING_DIR}\$ " \
"${PROMPT_RESET}"
}
.bashrc:
source ~/.dotfiles/bash/getPrompt.sh
PS1=$(getPrompt)

Keeping things human-readable is probably a good thing, and if performance is a problem, perhaps you can control whether this gets executed at all if your prompt is already set. As a first step, maybe move the call to .bash_profile instead of .bashrc.
You can either unset all the variables at the end of the script, or refactor the script so that it runs as a function, or as a separate script (i.e. call it instead of source it).
If you put it all in a function, the function will need to declare all the variables local.
If you run this as an external script, you will need to change it so that it prints the final value. Then you can call it like
PS1=$(setprompt)
without any side effects. (Perhaps you would want to do this with a function, too, just to keep it clean and modular.)
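For reference, the first option (unset everything at the end) could look roughly like this; the helper names mirror the original setprompt, and the tput calls are guarded so the snippet also works where TERM is unset:

```shell
# setprompt, alternative version: build PS1 as before, then clean up.
# (tput is guarded with "|| true" so sourcing works even without a TERM)
PROMPT_BOLD="\[$(tput bold 2>/dev/null || true)\]"
PROMPT_RESET="\[$(tput sgr0 2>/dev/null || true)\]"
PROMPT_GREEN="\[$(tput setaf 2 2>/dev/null || true)\]"
PROMPT_RED="\[$(tput setaf 1 2>/dev/null || true)\]"

PS1="${PROMPT_RESET}${PROMPT_BOLD}${PROMPT_GREEN}\u#\h "
PS1="${PS1}${PROMPT_RED}\w\$ ${PROMPT_RESET}"

# one unset naming every helper, so nothing leaks into the sourcing shell
unset PROMPT_BOLD PROMPT_RESET PROMPT_GREEN PROMPT_RED
```

After sourcing this, only PS1 remains set; the intermediate variables are gone.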

Odd, long space in bash when writing commands

I have added some custom parameters to personalize my bash and I am experiencing some unexpected behavior, so I think I might have done it wrong. In the code block below are the custom parameters:
# Custom parameters
tprompt () {
local bold=$(tput bold)
local red=$(tput setaf 1)
local green=$(tput setaf 2)
local magenta=$(tput setaf 5)
local cyan=$(tput setaf 6)
local plain=$(tput sgr0)
printf -v PS1 '%s' \
'\[\033[1;36m\]' \
'\u\[\033[1;31m\]' \
'#\[\033[1;32m\]' \
'\h:\[\033[1;35m\]' \
'\w\[\033[1;31m\]' \
'\$\[\033[0m\] '
}
tprompt
tput () {
printf '\\['
command tput "$@"
printf '\\]'
}
Everything works well, but it misbehaves when the path is too long, as shown in the pic below:
It might also be worth mentioning that I am using ble.sh.
EDIT:
Output of echo $SHELL: /bin/bash
Output of declare -p PS1:
declare -- PS1="\\[\\033[1;36m\\]\\u\\[\\033[1;31m\\]#\\[\\033[1;32m\\]\\h:\\[\\033[1;35m\\]\\w\\[\\033[1;31m\\]\\\$\\[\\033[0m\\] "
Thank you for the report! I'm the author of ble.sh. This was a bug in ble.sh's coordinate calculation. I have fixed it in the latest push. Could you please update ble.sh with the following command?
$ ble-update
The bug was introduced in 0.4.0-devel3 commit 4fa139ad (2021-03-21) and fixed in commit 9badb5f (2021-06-11, i.e. just now). (I actually noticed this problem on 2021-05-16 but somehow forgot to fix it.) The versions between these commits are affected, so everyone using the master branch of ble.sh should update.

Linux shell (Bash/Z shell) - change background color when under a specific directory

I would like to change the background color of a shell (Z shell, but Bash will do as well) every time I go under a specific directory. For example, I would like to change the background color every time I am in /mnt/data to say red and change it back to normal if I go out of /mnt/data/...
To change the background and preserve my current prompt, I do:
export PS1="$PS1 %{$'\e[0;41m'%}"
I am not sure how to hook this up so that it is evaluated (wrapped in an if statement) every time I change working directory.
The trick is to use the command substitution in PS1. The following kind of works for me in Bash:
PS1='$(if [[ $PWD == /mnt/data* ]] ; then printf "\[\e[0;41m\]" ; else printf "\[\e[m\]" ; fi) %'
By "kind of" I mean the fact that the behaviour on the command line immediately after changing to/from the directory is a bit weird (e.g., the background changes after you press Backspace).
You can also use the PROMPT_COMMAND shell variable which is more suitable for code than the prompt itself:
PROMPT_COMMAND='if [[ $PWD == /mnt/data* ]] ; then printf "\e[0;41m" ; else printf "\e[m" ; fi'
It's cleaner to keep the code in a function with all the proper indentation, and just call the function from the variable:
colour_mnt_data () {
if [[ $PWD == /mnt/data* ]] ; then
printf '\e[0;41m'
else
printf '\e[m'
fi
}
PROMPT_COMMAND='colour_mnt_data'
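A sketch of a testable variant of the same idea: the (made-up) helper colour_for_dir takes the directory as an argument instead of reading $PWD itself, so the logic can be exercised outside of a prompt:

```shell
# Hypothetical variant of colour_mnt_data that takes the directory as $1
colour_for_dir () {
    case $1 in
        /mnt/data|/mnt/data/*) printf '\033[0;41m' ;;  # red background
        *)                     printf '\033[m'     ;;  # reset
    esac
}

# wired up exactly like before
PROMPT_COMMAND='colour_for_dir "$PWD"'
```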
Answer for zsh (although the second part can be adapted to bash):
This is a two-part problem:
Acting on directory changes: For zsh you can just use the chpwd hook function. chpwd as well as any function listed in the chpwd_functions array are called each time the current working directory is changed.
So, if you want to react to certain directories you can use something like this
# load helper function to manipulate hook arrays
autoload -Uz add-zsh-hook
# define hook function, decide on action based on $PWD, the new pwd.
chback_on_chdir () {
case $PWD in
/mnt/data/* )
# change background, when entering subdirectories of "/mnt/data"
;;
/home )
# change background, when entering exactly "/home"
;;
/usr | /usr/* )
# change background, when entering "/usr" or a subdirectory thereof
;;
* )
# change background, when entering any other directory
;;
esac
}
# add chback_on_chdir to chpwd_functions
add-zsh-hook chpwd chback_on_chdir
Changing the background color: There are actually two ways to change the background color.
You can change the background for the following printed characters within the colors available within the terminal (which is, what you do in your example). In zsh this could be done like this (shortened example for chdir hook):
# allow for parameter substitution in prompts
setopt PROMPT_SUBST
# add string `$COLOR` to $PS1.
# Note the `\` before `${COLOR}`, which prevents immediate evaluation.
# `${COLOR}` will be substituted each time the prompt is printed
PS1="$PS1\${COLOR}"
chpwd () {
case $PWD in
/mnt/data/* )
# set background color to red
COLOR='%K{red}'
;;
* )
# reset background color to default
COLOR='%k'
# could also be just an empty string
#COLOR=''
# or unset
#unset COLOR
;;
esac
}
In some (many?) terminals you can also redefine the default background color. This will actually change the background color everywhere, even on already printed text and "unprinted" locations. This can be done by utilizing XTerm Control Sequences, which - despite their name - work in other terminal emulators, too. (I tested successfully with xterm, urxvt, gnome-terminal and termite). The control sequence in question is
ESC]11;<color>ST
where ESC is the escape character \e, <color> is a color specification (e.g. red, #ff0000, rgb:ff/00/00, rgbi:1/0/0 - what actually works might depend on the terminal) and ST is the string terminator \e\\ (ESC\). You can send it to the terminal with
printf "\e]11;red\e\\"
You can reset the color to the configured default with the control sequence
ESC]111ST
using the command
printf "\e]111\e\\"
So, if you usually have a black background and want to tint it slightly red when entering /mnt/data or a directory below it, you can use:
chpwd () {
case $PWD in
/mnt/data | /mnt/data/* )
# set background color to a dark red
printf "\e]11;#1f0000\e\\"
;;
* )
# reset the background color to configured default
printf "\e]111\e\\"
;;
esac
}
Note: I found that it does not seem to work on urxvt, if transparency is enabled.
It is possible to retrieve the current value by replacing the color specification with ?:
printf "\e]11;?\e\\" ; sleep 1
The sleep 1 is needed so that the output is not immediately overwritten by the prompt.
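The two sequences can be wrapped in tiny helper functions; the names set_bg and reset_bg are invented for this sketch:

```shell
# Set the terminal's default background via OSC 11: ESC ] 11 ; <color> ST
set_bg () {
    printf '\033]11;%s\033\\' "$1"
}

# Reset it to the configured default via OSC 111: ESC ] 111 ST
reset_bg () {
    printf '\033]111\033\\'
}
```

The chpwd hook above then reduces to calling set_bg '#1f0000' and reset_bg.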
You will need to write a script file with functions that you can call to determine what your PS1 should be because of the directory that you are in.
Then, you source this script file in your .bashrc and set your PS1 so that it calls the function from your script file to set its value.
. ~/.myCleverPS1
export PS1='$PS1 $(myCleverPS1func " (%s)") $ '
An example you can look at is the git-completion script which adds the name of the current branch to the prompt whenever you are in a git repo directory (and can optionally colorize too).
See as example: https://github.com/git/git/tree/master/contrib/completion

Colourful makefile info command

Usually I use echo -e "\e[1;32mMessage\e[0m" to print colourful messages out of the makefile. But now I want to print a message inside an ifndef block, so I am using the $(info Message) style. Is it possible to make this kind of message colourful?
Yes. You can use a tool like tput to output the literal escape sequences needed instead of using echo -e (which isn't a good idea anyway) to do the same thing.
For example:
$(info $(shell tput setaf 1)Message$(shell tput sgr0))
Though that requires two shells to be spawned and two external commands to be run, as opposed to the echo (or similar) methods in a recipe context, so it's comparatively more expensive.
You could (and should if you plan on using the colors in more than one place) save the output from tput in a variable and then just re-use that.
red:=$(shell tput setaf 1)
reset:=$(shell tput sgr0)
$(info $(red)Message$(reset))
$(info $(red)Message$(reset))
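If the make output might be redirected to a file, a common refinement is to emit the escape codes only when stdout is a terminal. A shell sketch of the guard (a Makefile would wrap the tput calls in $(shell ...) as above):

```shell
# Emit colour codes only when stdout is a terminal, so redirected
# output (e.g. make ... > build.log) stays free of escape sequences
if [ -t 1 ]; then
    red=$(tput setaf 1)
    reset=$(tput sgr0)
else
    red=''
    reset=''
fi
printf '%sMessage%s\n' "$red" "$reset"
```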

Unix: What is the difference between source and export?

I am writing a shell script to read a file of key=value pairs and set those variables as environment variables. But I have a doubt: if I source file.txt, will that set the variables defined in that file as environment variables, or should I read the file line by line and set them with the export command?
Is the source command in this case different from export?
When you source the file, the assignments will be set but the variables are not exported unless the allexport option has been set. If you want all the variables to be exported, it is much simpler to use allexport and source the file than it is to read the file and use export explicitly. In other words, you should do:
set -a
. file.txt
(I prefer . because it is more portable than source, but source works just fine in bash.)
Note that exporting a variable does not make it an environment variable of the current shell; it marks the variable to be copied into the environment of any child process the shell starts.
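The distinction is easy to see with a child process; a minimal demonstration:

```shell
FOO=bar                               # plain assignment: shell variable only
r1=$(sh -c 'echo "${FOO:-unset}"')    # a child process does not see it

export FOO                            # mark FOO for export
r2=$(sh -c 'echo "${FOO:-unset}"')    # now the child inherits it

echo "$r1 $r2"                        # -> "unset bar"
```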
source (.) vs export (and also some file lock [flock] stuff at the end):
In short:
source some_script.sh, or the POSIX-compliant equivalent, . some_script.sh, brings variables in from other scripts, while
export my_var="something" pushes variables out to other scripts/processes which are called/started from the current script/process.
Using source some_script.sh or . some_script.sh in a Linux shell script is kind of like using import some_module in Python, or #include <some_header_file.h> in C or C++. It brings variables in from the script being sourced.
Using export some_var="something" is kind of like setting that variable locally, so it is available for the rest of the current script or process, and then also passing it in to any and all sub-scripts or processes you may call from this point onward.
More details:
So, this:
# export `some_var` so that it is set and available in the current script/process,
# as well as in all sub-scripts or processes which are called from the
# current script/process
export some_var="something"
# call other scripts/processes, passing in `some_var` to them automatically
# since it was just exported above!
script1.sh # this script now gets direct access to `some_var`
script2.sh # as does this one
script3.sh # and this one
is as though you had done this:
# set this variable for the current script/process only
some_var="something"
# call other scripts/processes, passing in `some_var` to them **manually**
# so they can use it too
some_var="something" script1.sh # manually pass in `some_var` to this script
some_var="something" script2.sh # manually pass in `some_var` to this script
some_var="something" script3.sh # manually pass in `some_var` to this script
except that the first version, where we called export some_var="something", passes the variable on to sub-processes recursively. If we call script1.sh from our current script, script1.sh gets our exported variables; and if script1.sh calls script5.sh, and script5.sh calls script10.sh, those scripts receive the exported variables automatically as well. This is in contrast to the manual case, where only the scripts explicitly invoked with manually-set variables receive them; their sub-scripts do NOT automatically get any variables from their calling scripts!
How to "un-export" a variable:
Note that once you've exported a variable, calling unset on it will "unexport it", like this:
# set and export `some_var` so that sub-processes will receive it
export some_var="something"
script1.sh # this script automatically receives `some_var`
# unset and un-export `some_var` so that sub-processes will no longer receive it
unset some_var
script1.sh # this script does NOT automatically receive `some_var`
In summary:
source or . imports.
export exports.
unset unexports.
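In bash there is also a middle ground: export -n removes the export attribute but keeps the variable's value in the current shell, unlike unset, which destroys it:

```shell
# bash-only: `export -n` un-exports a variable but keeps its value.
# (Run under bash; plain sh/dash do not support `export -n`.)
export some_var="something"
export -n some_var

echo "$some_var"                      # still "something" in this shell
sh -c 'echo "${some_var:-unset}"'     # prints "unset": no longer inherited
```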
Example:
Create this script:
source_and_export.sh:
#!/bin/bash
echo "var1 = $var1"
var2="world"
Then mark it executable:
chmod +x source_and_export.sh
Now here is me running some commands at the terminal to test the source (.) and export commands with this script. Type in the command you see after the lines beginning with $ (not including the comments). The other lines are the output. Run the commands sequentially, one command at a time:
$ echo "$var1" # var1 contains nothing locally
$ var1="hello" # set var1 to something in the current process only
$ ./source_and_export.sh # call a sub-process
var1 = # the sub-process can't see what I just set var1 to
$ export var1 # **export** var1 so sub-processes will receive it
$ ./source_and_export.sh # call a sub-process
var1 = hello # now the sub-process sees what I previously set var1 to
$ echo "$var1 $var2" # but I can't see var2 from the subprocess/subscript
hello
$ . ./source_and_export.sh # **source** the sub-script to _import_ its var2 into the current process
var1 = hello
$ echo "$var1 $var2" # now I CAN see what the subprocess set var2 to because I **sourced it!**
hello world # BOTH var1 from the current process and var2 from the sub-process print in the current process!
$ unset var1 # unexport (`unset`) var1
$ echo "$var1" # var1 is now NOT set in the current process
$ ./source_and_export.sh # and the sub-process doesn't receive it either
var1 =
$ var1="hey" # set var1 again in the current process
$ . ./source_and_export.sh # if I **source** the script, it runs in the current process, so it CAN see var1 from the current process!
var1 = hey # notice it prints
$ ./source_and_export.sh # but if I run the script as a sub-process, it can NOT see var1 now because it was `unset` (unexported)
var1 = # above and has NOT been `export`ed again since then!
$
Using files as global variables between processes
Sometimes, when writing scripts to launch programs and things especially, I have come across cases where export doesn't seem to work right. In these cases, sometimes one must resort to using files themselves as global variables to pass information from one program to another. Here is how that can be done. In this example, the existence of the file "~/temp/.do_something" functions as an inter-process boolean variable:
# In program A, if the file "~/temp/.do_something" does NOT exist,
# then create it
mkdir -p ~/temp
if [ ! -f ~/temp/.do_something ]; then
touch ~/temp/.do_something # create the file
fi
# In program B, check to see if the file exists, and act accordingly
mkdir -p ~/temp
DO_SOMETHING="false"
if [ -f ~/temp/.do_something ]; then
DO_SOMETHING="true"
fi
if [ "$DO_SOMETHING" == "true" ] && [ "$SOME_OTHER_VAR" == "whatever" ]; then
# remove this global file "variable" so we don't act on it again
# until "program A" is called again and re-creates the file
rm ~/temp/.do_something
do_something
else
do_something_else
fi
Simply checking for the existence of a file, as shown above, works great for globally passing around boolean conditions between programs and processes. However, if you need to pass around more complicated variables, such as strings or numbers, you may need to do this by writing these values into the file. In such cases, you should use the file lock function, flock, to properly ensure inter-process synchronization. It is a type of process-safe (ie: "inter-process") mutex primitive. You can read about it here:
The shell script flock command: https://man7.org/linux/man-pages/man1/flock.1.html. See also man flock or man 1 flock.
The Linux library C command: https://man7.org/linux/man-pages/man2/flock.2.html. See also man 2 flock. You must #include <sys/file.h> in your C file to use this function.
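A minimal sketch of guarding a shared value file with flock(1); the file paths here are placeholders:

```shell
# Sketch: serialize access to a shared value file with flock(1)
LOCKFILE=/tmp/demo.lock
DATAFILE=/tmp/demo.dat

# Writer: hold an exclusive lock on fd 9 while updating the file
(
    flock -x 9
    echo "42" > "$DATAFILE"
) 9>"$LOCKFILE"

# Reader: hold a shared lock while reading, so a writer can't race us
(
    flock -s 9
    cat "$DATAFILE"
) 9>"$LOCKFILE"
```

The lock is tied to file descriptor 9 and is released automatically when the subshell exits and the descriptor is closed.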
References:
https://askubuntu.com/questions/862236/source-vs-export-vs-export-ld-library-path/862256#862256
My own experimentation and testing
I'll be adding the above example to my project on GitHub here, under the bash folder: https://github.com/ElectricRCAircraftGuy/eRCaGuy_hello_world

Shell script for setting environment variable

I am writing a shell script to set the environment variables whose values are available in a file. Below is the shell script I wrote,
VARIABLE_FILE=env-var.dat
if [ -f ${VARIABLE_FILE} ] ; then
. ${VARIABLE_FILE}
if [ ! -z "${TEST_VAR1}" ] ; then
export TEST_VAR1="${TEST_VAR1}"
fi
if [ ! -z "${TEST_VAR2}" ] ; then
export TEST_VAR2="${TEST_VAR2}"
fi
fi
The above code works only in the bash shell, since I have used the export command to set the environment variables, and it fails if I use it with any other shell. Is there any command to set an environment variable that works in any shell?
"Fancier" shells like bash and zsh permit you to set a variable and export it as an environment variable at the same time like so:
export FOO=bar
With a standard POSIX bourne shell, the equivalent is achieved by doing it in two commands:
FOO=bar
export FOO
Note that once you've exported a variable, you can reset it to a different value later in the script and it's still exported (you don't need to export it again). Also, you can export several variables at a time:
FOO=bar
BAZ=quux
export FOO BAZ
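The "exported once, stays exported" point can be checked directly:

```shell
FOO=bar
export FOO
FOO=quux                   # re-assignment: the export attribute sticks
sh -c 'echo "$FOO"'        # child sees the new value: quux
```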
You mentioned tcsh in your comment, but csh and its derivatives are completely different from Bourne-based shells (and not recommended for use!). You can rarely make a shell script compatible with both sh and csh at the same time. For csh, look into setenv.
If you really want this to happen, it can be done, but it's tricky. One way is to use awk to output the correct syntax and evaluate the text coming back from awk. To share a single environment-variable value file between the major sh and csh flavors, the following command in a file will import a variable value file into the environment: (yes, it's one huge line, due to the inflexible way some shells treat backticks; if you didn't mind having a .awk file too, you could use awk -f...)
eval `awk '{ var = $1; $1=""; val=substr($0,2); if ( ENVIRON["SHELL"] ~ /csh$/) { print "setenv", var, " \"" val "\";" } else { print var "=\"" val "\"" ; print "export", var }}' $HOME/env_value_file`
The variable value file is in this format:
FOO value for foo
BAR foo bar
BAZ $BAR plus values $FOO
Design notes for educational purposes:
In awk, there's no easy way to access fields 2 through NF as a group, so if there could be spaces in our variable values, we blank out $1 and take a substring of $0 to recover the value we want.
To get this to work: a SHELL variable is always set by the shell, but not as an environment variable and not with consistent capitalization, so you have to set a SHELL environment variable from the shell's own value (as shown below) before you use the script.
Also, if you want the new environment values to be present after the import script finishes, you need to source the environment script.
If a shell doesn't do eval well, you'll have to tweak the script.
For bourne shell flavors (bash, sh, ksh, zsh):
export SHELL
. import_environment
For csh flavors: (shell variable tends to be lower case in csh shells)
setenv SHELL "$shell"
source import_environment
