Aliasing with variables in bash profile - vim

Here's a very simple question about my bash profile (which I edit in vim).
I would like to create an alias where I type "activate (variable)", and my virtual env immediately gets activated by running this command:
$ source foldername/bin/activate
As you can see, foldername will be the variable in this case, so I figured I should write a function instead of a static one-liner to set this alias. I tried something like this:
activate(something){
source something/bin/activate
}
Ideally, what I would like is to type:
$ activate f1
and this command gets run:
$ source f1/bin/activate
It would also be nice to have a default, so that calling just "activate" would also work.
Thanks for the help.

You could update your shell environment, using a function like this:
function activate () {
    if [ $# -eq 0 ]; then
        # no arguments passed to the function (default case)
        source f1/bin/activate
    elif [ $# -eq 1 ]; then
        # one argument passed to the function
        source "$1"/bin/activate  # argument value read from $1
    fi
}
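If you prefer to keep it shorter, a variant using bash's default-value expansion would work too (just a sketch, assuming your default virtual env really lives in f1):
activate() {
    # Fall back to "f1" when no folder name is given.
    source "${1:-f1}/bin/activate"
}
Put either version in your ~/.bashrc or ~/.bash_profile and open a new shell (or source the file); then both "activate f1" and plain "activate" will work.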

Related

I'm missing output from MOTD, MTU and users from docker group

So, I'm writing a bash script that doesn't give me any output.
The script is:
a) going to detect what operating system is running
b) and know which package manager to use between APT, DNF and Pacman.
Further in the script it is:
a) going to choose the correct package manager to use when installing both Docker and Docker-Compose.
I have written the MOTD function that should show a message on my Ubuntu server.
I'm creating a function that adds users to a docker group.
I'm also configuring the Docker daemon so that it sets a specific MTU value of 1442 and sets up logging.
The problem is that I don't get any output, apart from the MTU value, which is actually 1442 and seems correct in my script.
Further, I should get an empty line (an input prompt) where I can type a user that will then be added to the docker group.
#!/bin/bash
# This script will install Docker and Docker-Compose, configure the Docker daemon,
# and add specified users to the docker group.
# Define default values
MTU=1442
VERBOSE=false
# Function to detect operating system
detect_os() {
    if [ -f /etc/lsb-release ]; then
        os="ubuntu"
        package_manager="apt"
    elif [ -f /etc/redhat-release ]; then
        os="centos"
        package_manager="dnf"
    elif [ -f /etc/arch-release ]; then
        os="arch"
        package_manager="pacman"
    else
        echo "Error: Unable to detect operating system."
        exit 1
    fi
}
# Function to update MOTD
update_motd() {
    local motd_file="/etc/motd"
    echo "$1" > "$motd_file"
    echo "MOTD updated with message: $1"
}
# Function to add users to docker group
add_users() {
    local users="$1"
    local group="docker"
    for user in $users; do
        # Check if user exists
        if ! id "$user" >/dev/null 2>&1; then
            useradd "$user"
            echo "User $user created."
        fi
        # Add user to docker group
        usermod -aG "$group" "$user"
        echo "User $user added to $group group."
    done
}
# Function to install Docker and Docker-Compose
install_docker() {
    local package_manager="$1"
    local packages="docker docker-compose"
    case "$package_manager" in
        apt)
            sudo apt-get update
            sudo apt-get install -y $packages
            ;;
        dnf)
            sudo dnf install -y $packages
            ;;
        pacman)
            sudo pacman -S --noconfirm $packages
            ;;
        *)
            echo "Error: Invalid package manager: $package_manager"
            exit 1
            ;;
    esac
}
# Function to configure Docker daemon
configure_docker() {
    local mtu="$1"
    local config_file="/etc/docker/daemon.json"
    # Create config file if it does not exist
    if [ ! -f "$config_file" ]; then
        sudo touch "$config_file"
        sudo chmod 644 "$config_file"
    fi
    # Update MTU value in config file
    sudo sh -c "echo '{\"mtu\": $mtu}' > $config_file"
    echo "Docker daemon configured with MTU=$mtu."
}
# Parse command line arguments
while [ "$#" -gt 0 ]; do
    case "$1" in
        --motd)
            MOTD="$2"
            shift 2
            ;;
        --users)
            USERS="$2"
            shift 2
            ;;
        --mtu)
            MTU="$2"
            shift 2
            ;;
    esac
done
echo "MOTD: $MOTD"
echo "USERS: $USERS"
echo "MTU: $MTU"
echo "Script is finish"
The output doesn't show me anything more than the MTU=1442; the users and the MOTD are missing.
I'm not sure if I was clear enough, but from my project I thought my script was correct; I'm probably missing some logic somewhere in my script. The project's tasks are described above, but I'm not sure if I'm on the right track here.
Would appreciate any suggestions on the way forward with my script :)
This is not a full fix of your script - since I'm sure you are not trying to cheat on your project, but want to understand why your script doesn't produce the expected output, so you will be able to develop it on your own.
Here I'm pasting a small script that may help you better understand the basic usage of functions in Bash. Hope it will help 🤞.
#!/bin/bash
### Defining functions - Functions are reusable code blocks in the script and can accept arguments when they are called.
# So each time we call an individual function later in the script we may pass different arguments to it (if needed).
my_function1(){
    echo "this is a function that doesn't expect any arguments."
    echo "End of 'my_function1'"
}
my_function2(){
    echo "this is a function that does expect an argument."
    echo "this function expects one argument to print/echo it."
    echo "Hello ${1}" # <-- Numerical variables ($1 $2 .. $N) are reserved variables in Bash whose values are assigned from the relevant argument(s) provided at script runtime and function runtime.
    echo "End of 'my_function2'"
}
my_function3(){
    echo "this is a function that expects one or more arguments."
    echo "this function prints/echoes all arguments passed to it."
    echo "Hi ${@}"
    echo "End of 'my_function3'"
}
### Calling the functions to execute their code - we may pass relevant argument(s) to them.
# This is done by using the function name - any parameter/string added after the function name will be passed to it as the function's argument accordingly.
# Running `my_function1` without providing any arguments - since none are necessary.
my_function1
# Print an empty line to separate outputs
echo ""
# Running `my_function2`, passing it a name as argument, e.g. Vegard
my_function2 Vegard
# Print an empty line to separate outputs
echo ""
# Running `my_function3`, passing it a `name` as first argument and a `LAST_NAME` as second argument, e.g. Vegard YOUR_LASTNAME
my_function3 Vegard YOUR_LASTNAME
# Print an empty line to separate outputs
echo ""
### End of the script.
# Exiting the script with the `0` exit-code.
exit 0
Bonus Update #1
How to provide arguments to a script at run time:
You can provide arguments to the scripts almost in the same way as providing arguments to the functions.
Assuming the script file name is script.sh, it is located in our current working directory, and it is executable:
NAME - as first argument.
LAST_NAME - as second argument.
Run the script as follows:
./script.sh NAME LAST_NAME
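To make that concrete, a minimal script.sh could look like this (just a sketch; the echo lines are only placeholders):
#!/bin/bash
# $1 holds the first argument (NAME), $2 holds the second (LAST_NAME).
echo "First argument: $1"
echo "Second argument: $2"
Running ./script.sh Vegard YOUR_LASTNAME would then print both values on separate lines.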
Bonus Update #2
How to provide dynamic arguments to a function at script run time:
If you need to provide a dynamic argument to a function at runtime instead of having hard-coded argument(s) for that function, you may use the same reserved numeric variables principle.
Simple example
Consider you run your script providing some arguments that can change on every run.
./script.sh firstarg secondarg "last arg"
Note: if a single argument contains a space character, it should be quoted to avoid it being detected as separate arguments - the same applies to providing arguments to functions.
Sum-up: these arguments can be referenced via the $1 $2 .. $<N> variables accordingly anywhere within the script outside of the functions' code blocks.
${@} or ${*} will get all the provided arguments - google to find their difference.
Consider you defined functions that work with one or more arguments.
#!/bin/bash
my_function(){
    # Since this $1 is defined in the function's block itself, it
    # will get its value from the argument provided to the function
    # at run-time, not directly from the arguments provided to the script!
    echo "Argument I got is: ${1}"
}
my_other_function(){
    # Printing the first three arguments provided to the function,
    # delimited by colons.
    echo "Arguments I got are: ${1} : ${2} : ${3}"
}
another_function(){
    # ${@} will get all the arguments provided to the function
    # at run-time, not directly from the arguments provided to the script!
    echo "All arguments got are: ${@}"
}
### Now calling those functions
# Providing a static argument
my_function STATIC_ARGUMENT
# Passing the first argument provided to the script at run-time to the same function.
my_function "${1}"
# Passing the three arguments provided to the script at run-time to this function.
my_other_function "${1}" "${2}" "${3}"
# Passing all the provided arguments of the script to this function at run-time.
another_function "${@}"
Summary
The same reserved numeric variables that are used to refer to the arguments passed to the script can be passed to a function when calling it, and in the same manner the function's arguments can be referred to from within the function block.
Caution
The behavior of a script that deals with arguments containing spaces or other special characters may vary, since Bash treats them differently.
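To see that caution in practice, here is a tiny sketch (the function name count_args is made up for the demo), assuming the script is started as ./script.sh firstarg secondarg "last arg":
#!/bin/bash
count_args(){
    echo "I received $# argument(s)"
}
# Quoted "$@" keeps "last arg" together as a single argument: prints 3.
count_args "$@"
# Unquoted $* gets word-split, so "last arg" becomes two arguments: prints 4.
count_args $*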

Create a directory in a function, then return path as a string to caller in Bash

This is my first time posting here. I don't know if this has already been asked, but I can't find an answer anywhere.
So, I'm trying to make a little script that creates a directory tree, using the root directory I pass as a parameter from the calling function, and then I want to return the full path to the caller function. Here's what I have so far:
#!/bin/bash
function createLogPath(){
    local __BASE="$1/var/log"
    local __fullPath=$2
    mkdir -p $__BASE
    if [ $? -eq 0 ]
    then
        __fullPath=$( sed "$__BASE" )
        eval $__fullPath
    fi
}
As you can possibly tell by my code, there are several things I still don't grasp about Bash, since I've been only working with it for like a week or something, so I'm doing this script for testing and learning purposes mostly. My intention with this little function is to create a path to store log files at a given location, and then return the full path to the caller function, so that I can use that path to create a log file in '../var/log'.
But when I call this function from another script, like this:
#!/bin/bash
. makedir.sh # That's the name of the script where createLogPath() is
BASE=$1
RESULT=''
createLogPath $BASE $RESULT
printf "The path is: %s\n" $RESULT
This is not printing anything... what I'm trying to do is return the full path to '../var/log' as a string, but eval keeps treating $__fullPath as a directory and it's giving me a headache...
As you can see from my code, I have no idea what I'm doing, so please explain what I'm doing wrong as well as the correct way to do what I'm trying to do. Any advice would be appreciated.
Bash function arguments are passed by value, not by reference. That is, assigning a value to the argument inside the function will not affect the variable outside the function.
You have two possibilities to return a string from a bash function:
Echo it to the standard output, for example:
function createLogPath(){
    local __BASE="$1/var/log"
    mkdir -p $__BASE
    if [ $? -eq 0 ]
    then
        __fullPath=$( sed "$__BASE" )
        echo $__fullPath
    fi
}
and retrieve it with the $() operator, like this:
RESULT=$(createLogPath $BASE)
Use export, like this:
function createLogPath(){
    local __BASE="$1/var/log"
    mkdir -p $__BASE
    if [ $? -eq 0 ]
    then
        __fullPath=$( sed "$__BASE" )
        export RESULT=$__fullPath
    fi
}
(Keeping as close to your original code as possible, although it could be simplified.)
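For completeness, a sketch of the calling script for the second (export) variant, reusing the names from the question and assuming makedir.sh now contains that version of the function:
#!/bin/bash
. makedir.sh            # script that defines createLogPath()
createLogPath "$1"      # the function sets RESULT itself
printf "The path is: %s\n" "$RESULT"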

Checking cmd line argument in bash script bypasses the source statement

I have a bash script "build.sh" like this:
# load Xilinx environment settings
source $XILINX/../settings32.sh
cp -r "../../../EDK/platform" "hw_platform"
if [ $# -ne 0 ]; then
    cp $1/system.xml hw_platform/system.xml
fi
echo "Done"
Normally I run it as "./build.sh" and it executes the "source" statement to set the environment variables correctly. Sometimes I need the script to copy a file from an alternative place, so I run it as "./build.sh ~/alternative_path/"; my script checks whether there is a cmd line argument by checking $# against 0.
When I do that, the "source" statement at the beginning of the script somehow gets skipped, and the build fails. I have put two "echo" statements before and after the "source", and I see the echo statements get executed.
Currently I circumvent this issue with "source $XILINX/../settings32.sh; build.sh". However, please advise on what I have done wrong in the script. Thanks.
Try storing the values of your positional parameters in an array variable first, then clear them. "$XILINX/../settings32.sh" may be acting differently when it detects some arguments.
# Store arguments.
ARGS=("$@")
# Reset to 0 arguments.
set --
# load Xilinx environment settings
source "$XILINX/../settings32.sh"
cp -r "../../../EDK/platform" "hw_platform"
if [[ ${#ARGS[@]} -ne 0 ]]; then
    cp "${ARGS[0]}/system.xml" hw_platform/system.xml
fi
echo "Done"

Bash config file or command line parameters

If I am writing a bash script and I choose to use a config file for parameters, can I still pass in parameters via the command line? I guess I'm asking: can I do both on the same command?
The watered-down code:
#!/bin/bash
source builder.conf
function xmitBuildFile {
    for IP in "${SERVER_LIST[@]}"
    do
        echo $1@$IP
    done
}
xmitBuildFile
builder.conf:
SERVER_LIST=( 192.168.2.119 10.20.205.67 )
$bash> ./builder.sh myname
My expected output should be myname@192.168.2.119 and myname@10.20.205.67, but when I do $ echo $#, I am getting 0, even though I passed in 'myname' on the command line.
Assuming the "config file" is just a piece of shell sourced into the main script (usually containing definitions of some variables), like this:
. /etc/script.conf
of course you can use the positional parameters anywhere (before or after ". /etc/..."):
echo "$#"
test -n "$1" && ...
you can even define them in the script or in the very same config file:
test $# = 0 && set -- a b c
Yes, you can. Furthermore, it depends on the architecture of your script: you can overwrite parameters with values from the config and vice versa.
By the way, shflags may be pretty useful in writing such a script.
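As a small illustration of combining both (a sketch based on the question's builder.conf; the NAME default and the override pattern are just assumptions):
#!/bin/bash
source builder.conf        # defines SERVER_LIST (and could define a default NAME)
NAME="${1:-$NAME}"         # a command-line argument overrides the config value
for IP in "${SERVER_LIST[@]}"
do
    echo "$NAME@$IP"
done
Called as ./builder.sh myname, this prints myname@<IP> for each server in SERVER_LIST.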

How can I run a function from a script in command line?

I have a script that has some functions.
Can I run one of the functions directly from the command line?
Something like this?
myScript.sh func()
Well, while the other answers are right - you can certainly do something else: if you have access to the bash script, you can modify it and simply place the special parameter "$@" at the end - it will expand to the arguments of the command line you specify, and since it's "alone" the shell will try to call them verbatim; here you could specify the function name as the first argument. Example:
$ cat test.sh
testA() {
    echo "TEST A $1";
}
testB() {
    echo "TEST B $2";
}
"$@"
$ bash test.sh
$ bash test.sh testA
TEST A
$ bash test.sh testA arg1 arg2
TEST A arg1
$ bash test.sh testB arg1 arg2
TEST B arg2
For polish, you can first verify that the command exists and is a function:
# Check if the function exists (bash specific)
if declare -f "$1" > /dev/null
then
    # call arguments verbatim
    "$@"
else
    # Show a helpful error
    echo "'$1' is not a known function name" >&2
    exit 1
fi
If the script only defines the functions and does nothing else, you can first execute the script within the context of the current shell using the source or . command and then simply call the function. See help source for more information.
The following command first registers the function in the context, then calls it:
. ./myScript.sh && function_name
Briefly, no.
You can import all of the functions in the script into your environment with source (help source for details), which will then allow you to call them. This also has the effect of executing the script, so take care.
There is no way to call a function from a shell script as if it were a shared library.
Using case
#!/bin/bash
fun1 () {
    echo "run function1"
    [[ "$@" ]] && echo "options: $@"
}
fun2 () {
    echo "run function2"
    [[ "$@" ]] && echo "options: $@"
}
case $1 in
    fun1) "$@"; exit;;
    fun2) "$@"; exit;;
esac
fun1
fun2
This script will run functions fun1 and fun2, but if you start it with the option fun1 or fun2, it will only run the given function with its args (if provided) and exit.
Usage
$ ./test
run function1
run function2
$ ./test fun2 a b c
run function2
options: a b c
I have a situation where I need a function from a bash script which must not be executed beforehand (e.g. by source), and the problem with "$@" is that myScript.sh is then run twice, it seems... So I've come up with the idea to get the function out with sed:
sed -n "/^func ()/,/^}/p" myScript.sh
And to execute it at the time I need it, I put it in a file and use source:
sed -n "/^func ()/,/^}/p" myScript.sh > func.sh; source func.sh; rm func.sh
Edit: WARNING - seems this doesn't work in all cases, but works well on many public scripts.
If you have a bash script called "control" and inside it you have a function called "build":
function build() {
...
}
Then you can call it like this (from the directory where it is):
./control build
If it's inside another folder, that would make it:
another_folder/control build
If your file is called "control.sh", that would accordingly make the function callable like this:
./control.sh build
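Note that this only works if control itself dispatches its first argument to a function, for example by ending with the same "$@" trick from the answer above (a sketch):
#!/bin/bash
# control
function build() {
    echo "building..."   # placeholder for the real build steps
}
# Treat the first command-line argument as the function to call, passing the rest as its arguments.
"$@"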
Solved post but I'd like to mention my preferred solution. Namely, define a generic one-liner script eval_func.sh:
#!/bin/bash
source $1 && shift && "$@"
Then call any function within any script via:
./eval_func.sh <any script> <any function> <any args>...
An issue I ran into with the accepted solution is that when sourcing my function-containing script within another script, the arguments of the latter would be evaluated by the former, causing an error.
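For example, with a hypothetical utils.sh defining a greet function, the call would look like this:
# utils.sh (hypothetical)
greet() {
    echo "Hello, $1"
}
# then:
./eval_func.sh utils.sh greet World   # prints: Hello, World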
The other answers here are nice, and much appreciated, but often I don't want to source the script in the session (which reads and executes the file in your current shell) or modify it directly.
I find it more convenient to write a one or two line 'bootstrap' file and run that. Makes testing the main script easier, doesn't have side effects on your shell session, and as a bonus you can load things that simulate other environments for testing. Example...
# breakfast.sh
make_donuts() {
    echo 'donuts!'
}
make_bagels() {
    echo 'bagels!'
}
# bootstrap.sh
source 'breakfast.sh'
make_donuts
Now just run ./bootstrap.sh. The same idea works with your python, ruby, or whatever scripts.
Why useful? Let's say you complicated your life for some reason, and your script may find itself in different environments with different states present. For example, either your terminal session, or a cloud provider's cool new thing. You also want to test cloud things in terminal, using simple methods. No worries, your bootstrap can load elementary state for you.
# breakfast.sh
# Now it has to do slightly different things
# depending on where the script lives!
make_donuts() {
    if [[ $AWS_ENV_VAR ]]
    then
        echo '/donuts'
    elif [[ $AZURE_ENV_VAR ]]
    then
        echo '\donuts'
    else
        echo '/keto_diet'
    fi
}
If you let your bootstrap thing take an argument, you can load different state for your function to chew, still with one line in the shell session:
# bootstrap.sh
source 'breakfast.sh'
case $1 in
    AWS)
        AWS_ENV_VAR="arn::mumbo:jumbo:12345"
        ;;
    AZURE)
        AZURE_ENV_VAR="cloud::woo:_impress"
        ;;
esac
make_donuts # You could use $2 here to name the function you wanna, but careful if evaluating directly.
In terminal session you're just entering:
./bootstrap.sh AWS
Result:
# /donuts
You can call a function from a command-line argument like below:
function irfan() {
    echo "Irfan khan"
    date
    hostname
}
function config() {
    ifconfig
    echo "hey"
}
$1
Once you have defined the functions, put $1 at the end to accept as an argument the name of the function you want to call.
Let's say the above code is saved in fun.sh. Now you can call the functions with ./fun.sh irfan and ./fun.sh config on the command line.
