Create a script that adds lines of code to .bashrc then reloads the terminal - linux

I am trying to create a shell script that, once run, adds a line of code to the end of .bashrc and then reloads the terminal. This is the code I have written in my install.sh:
function addbashaliases() {
    echo 'if [ -f ~/aliases/.bash_aliases ]; then
    . ~/aliases/.bash_aliases
fi' >> ~/.bashrc
    source ~/.bashrc
}
Nothing is happening. How should I write the code so that it runs and adds the text to the .bashrc file?

For the sake of clarity I prefer to append the text to the .bashrc file using the cat command instead of echo; however, this should also work with your echo version.
That said, you should make sure that:
The script actually calls the addbashaliases function
The ~/aliases/.bash_aliases file exists (I would expect a path more like ~/.aliases/.bash_aliases)
You can check that the script ran correctly by inspecting the content of ~/.bashrc, and by printing some environment variable set in .bash_aliases after the source command.
#!/bin/bash

function addbashaliases() {
    # Add sourcing of .bash_aliases to .bashrc
    cat >> ~/.bashrc << EOT
if [ -f ~/aliases/.bash_aliases ]; then
    . ~/aliases/.bash_aliases
fi
EOT

    # Reload the current environment
    source ~/.bashrc
}

# Execute the function
addbashaliases
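To confirm the append worked, you can grep ~/.bashrc for the sourcing line afterwards. Here is a minimal, self-contained sketch of that check, using a temporary file in place of the real ~/.bashrc (the ~/aliases path is taken from the question):

```shell
#!/bin/bash
# Sketch: append the heredoc to a temp file standing in for ~/.bashrc,
# then verify the sourcing line is really there.
rcfile=$(mktemp)

cat >> "$rcfile" << 'EOT'
if [ -f ~/aliases/.bash_aliases ]; then
    . ~/aliases/.bash_aliases
fi
EOT

result="missing"
grep -qF '. ~/aliases/.bash_aliases' "$rcfile" && result="present"
echo "aliases hook: $result"
rm -f "$rcfile"
```

Running the same check against the real ~/.bashrc after install.sh finishes tells you immediately whether the function was ever called.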

Just correcting your script: following your logic, it should look like the one below.
function addbashaliases() {
    if [ -f ~/aliases/.bash_aliases ]; then
        output=$(. ~/aliases/.bash_aliases)
        echo "$output" >> ~/.bashrc
    fi
    source ~/.bashrc
}

Related

'command not found' when passing variable into BASH function, am I quoting incorrectly?

So I have a command that adds a line to a file, but only if that line isn't already in the file. It uses grep to check the file and, if the line isn't there, appends it.
The purpose of this is because I want to import all my aliases into BASH from an installation script that is likely to be executed more than once (and I don't want to fill ~/.bashrc with duplicate lines of the same alias).
The command works fine by itself, but when I try to abstract it away into a function to reuse elsewhere, I get the error: command not found.
So far I've looked at grep and pattern matches (thinking maybe & or ~ was throwing it off), parameter expansion OR command expansion and quoting.
I feel it's the latter i.e. I'm not quoting the alias string or file string correctly and it's trying to execute it as a command instead of a string.
I've been pulling my hair out for a while on this one, would somebody please be able to point me in the right direction?
Any help appreciated!
# Command (which works)
grep -qxF 'alias gs="clear && git status"' ~/.bashrc || echo 'alias gs="clear && git status"' >> ~/.bashrc

# Wrap the command in a function so I can reuse it with different parameters
function append_unique_line {
    grep -qxF $1 $2 || echo $1 >> $2
}

# Doesn't work.. append_unique_line: command not found
append_unique_line 'alias gs="clear && git status"' ~/.bashrc
Try:
function append_unique_line() {
    grep -qxF "$1" "$2" || echo "$1" >> "$2"
}

append_unique_line 'alias gs="clear && git status"' ~/.bashrc
Always wrap your variables in double quotes when expanding them; otherwise the shell word-splits their values.
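To see what the quoting buys you, here is a self-contained sketch (temporary file instead of ~/.bashrc): with the quotes in place the function is idempotent, whereas unquoted, the shell would split $1 so that grep received `alias` as its pattern and the remaining words as extra file arguments.

```shell
#!/bin/bash
# Sketch: the quoted version appends the alias exactly once, no matter
# how many times it is called.
line='alias gs="clear && git status"'
file=$(mktemp)

append_unique_line() {
    grep -qxF "$1" "$2" || echo "$1" >> "$2"
}

append_unique_line "$line" "$file"
append_unique_line "$line" "$file"   # second call is a no-op

count=$(grep -cxF "$line" "$file")
echo "occurrences: $count"
rm -f "$file"
```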

Environment variable is not setting globally

I am using csh terminal.
.cshrc
setenv $files /home/ec2-user/files
.login
if [ -f ~/.cshrc ]; then
. ~/.cshrc
fi
I am trying to echo the $files value from plink.
It shows the error: files: undefined variable
You don't use a dollar sign when setting a variable, you only use it when you refer to that variable.
setenv files /home/ec2-user/files
The test command should be a built-in in most csh/tcsh implementations, and has most of the same functionality you'll see listed under man test.
test -f ~/.cshrc && source ~/.cshrc
Note that normally, csh/tcsh will run your .cshrc or .tcshrc file automatically, before it runs .login.
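Put together, the corrected pair might look like the sketch below. Note the csh-style if in .login: the sh-style `if [ -f ... ]; then` from the question is not valid csh syntax.

```csh
# ~/.cshrc  (csh: no dollar sign when setting a variable)
setenv files /home/ec2-user/files

# ~/.login  (csh if syntax; usually redundant, since csh reads .cshrc first)
if ( -f ~/.cshrc ) then
    source ~/.cshrc
endif
```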

Sourcing files in shell script vs sourcing on command line

I have the problem that my shell script is not acting exactly the same as my manual typing into a console. I am attempting to find and source some setup files in a shell script as follows:
#!/bin/bash
TURTLE_SHELL=bash
# source setup.sh from same directory as this file
_TURTLE_SETUP_DIR=$(builtin cd "`dirname "${BASH_SOURCE[0]}"`" > /dev/null && pwd)
. "$_TURTLE_SETUP_DIR/turtle_setup.sh"
This bash file calls a .sh file:
#!/usr/bin/env sh
_TURTLE_ROS_SETUP_DIR=$_TURTLE_SETUP_DIR/../devel

if [ -z "$TURTLE_SHELL" ]; then
    TURTLE_SHELL=sh
fi

if [ -d "$PX4_FIRMWARE_DIR/integrationtests" ]; then
    if [ -f "$PX4_FIRMWARE_DIR/integrationtests/setup_gazebo_ros.bash" ]; then
        . "$PX4_FIRMWARE_DIR/integrationtests/setup_gazebo_ros.bash" "$PX4_FIRMWARE_DIR"
    fi
fi

if [ "$TURTLE_SHELL" = "bash" ]; then
    if [ -f "$_TURTLE_ROS_SETUP_DIR/setup.bash" ]; then
        source $_TURTLE_ROS_SETUP_DIR/setup.bash
    fi
else
    if [ "$TURTLE_SHELL" = "sh" ]; then
        if [ -f "$_TURTLE_ROS_SETUP_DIR/setup.sh" ]; then
            source $_TURTLE_ROS_SETUP_DIR/setup.sh
        fi
    fi
fi
The line in question is:
. "$PX4_FIRMWARE_DIR/integrationtests/setup_gazebo_ros.bash" "$PX4_FIRMWARE_DIR"
I have made sure that this code actually runs and that my environment variables are correct. If I run this command on the command line, everything works well. However, the same is not true when the file is sourced via the shell script. Why is this? Is the environment of a shell script somehow different from that of an interactive command line? And how can I fix this problem?
Edit:
I am sourcing either the .bash or the .sh file, depending upon which shell I am using.
Edit 2:
I am sourcing this script. Thus, everything is run in my default bash terminal, and it is all run within the same terminal and not a terminal spawned from a child process. Why is the script not sourcing setup_gazebo_ros.bash within the current shell?
It's the same reason you source an env script rather than run it: when you run a script, it executes in a new shell, and variables set there are not transferred back to the parent shell.
To illustrate
$ cat << ! > foo.sh
> export foo='FOO'
> !
$ chmod +x foo.sh
$ ./foo.sh
$ echo $foo
$ source ./foo.sh
$ echo $foo
FOO

How to execute a command on the remote at login with ssh, after .bashrc sourcing?

I am working on different machines where my home is NFS-mounted. I want to switch easily to another machine if the one I am working on is too much loaded.
I often modify my environment in the shell I am working in, and I would like to find the same modified (with respect to the .bashrc) environment when I switch to another machine. I tried the following script, but it does not work, because the .bashrc is sourced after source $HOME/.env-dump.txt.
Is there a clean way to execute some commands when logging to a machine with ssh as if you type them at the prompt after logged?
#!/usr/bin/env bash
if [[ $# != 1 ]]; then
    echo 'sssh USAGE:'
    echo '  sssh remotename'
    exit 1
fi

printenv | sed -e '/_=.*/ d;s/\([^=]\+\)=\(.*\)/export \1="\2"/' > "$HOME/.env-dump.txt"
ssh "$1" -t 'source $HOME/.env-dump.txt; bash -l'
Add the following lines to your ~/.bash_profile:
[ -f "$HOME/.bashrc" ] && . "$HOME/.bashrc"
[ -f "$HOME/.env-dump.txt" ] && source "$HOME/.env-dump.txt"
And create a ~/.bash_logout file with the line:
[ -f "$HOME/.env-dump.txt" ] && rm "$HOME/.env-dump.txt"
Now you can simply call ssh $1 -t 'bash -l' in the last line of your script.
WARNING
The output of printenv contains some variables that are machine-dependent, like GNOME_KEYRING_CONTROL, SESSION_MANAGER, DBUS_SESSION_BUS_ADDRESS ... (these examples are from Ubuntu 12.04). Such variables should be removed from the ~/.env-dump.txt file.
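One way to handle that warning is to filter the machine-dependent names out of the dump before writing it. A sketch, assuming the variable names mentioned above (the filter list is illustrative; extend it for your own machines):

```shell
#!/bin/bash
# Sketch: dump the environment as 'export' lines, dropping session
# variables that only make sense on the local machine.
dump_env() {
    printenv \
        | grep -vE '^(GNOME_KEYRING_CONTROL|SESSION_MANAGER|DBUS_SESSION_BUS_ADDRESS|_)=' \
        | sed -e 's/\([^=]\+\)=\(.*\)/export \1="\2"/'
}

dumpfile=$(mktemp)
# Even with SESSION_MANAGER set, it must not survive into the dump.
SESSION_MANAGER=local/fake dump_env > "$dumpfile"

count=$(grep -c '^export SESSION_MANAGER=' "$dumpfile")
echo "SESSION_MANAGER lines: $count"
rm -f "$dumpfile"
```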

Making Bash modular

I have a bunch of Bash scripts and they each make use of the following:
BIN_CHATTR="/usr/bin/chattr"
BIN_CHKCONFIG="/sbin/chkconfig"
BIN_WGET="/usr/bin/wget"
BIN_TOUCH="/bin/touch"
BIN_TAR="/bin/tar"
BIN_CHMOD="/bin/chmod"
BIN_CHOWN="/bin/chown"
BIN_ECHO="/bin/echo"
BIN_GUNZIP="/usr/bin/gunzip"
BIN_PATCH="/usr/bin/patch"
BIN_FIND="/usr/bin/find"
BIN_RM="/bin/rm"
BIN_USERDEL="/usr/sbin/userdel"
BIN_GROUPDEL="/usr/sbin/groupdel"
BIN_MOUNT="/bin/mount"
Is there a way I could just wget a Bash script with global settings like that and then include them in the script I want to run?
Yes, you can put all those variables in a file like "settings.sh" and then do this in your scripts:
source settings.sh
You can keep your variables in a shell script and then source that file:
source /path/to/variables.sh
You should actually use ., which in bash is the same thing as source but offers better portability:
. /path/to/variables.sh
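The portability point is that `.` is the POSIX sourcing command, while `source` is a bash/zsh extension. A minimal sketch (the lib file path and BIN_WGET variable are illustrative):

```shell
#!/bin/sh
# Sketch: '.' sources a file in plain POSIX sh, where 'source' may not exist.
libfile=$(mktemp)
echo 'BIN_WGET=/usr/bin/wget' > "$libfile"

. "$libfile"
echo "wget=$BIN_WGET"
rm -f "$libfile"
```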
Yes you can. Just add your variables and functions to a file, make it executable and "execute" it at the top of any script that needs to access them. Here's an example:
$ pwd
/Users/joe/shell
$ cat lib.sh
#!/bin/bash
BIN_WGET="/usr/bin/wget"
BIN_MOUNT="/bin/mount"
function test() {
echo "This is a test"
}
$ cat script.sh
#!/bin/bash
. /Users/joe/shell/lib.sh
echo "wget=$BIN_WGET"
test
$ ./script.sh
wget=/usr/bin/wget
This is a test
Are you looking for the source command?
mylib.sh:
#!/bin/bash
JAIL_ROOT=/www/httpd

is_root() {
    [ "$(id -u)" -eq 0 ] && return 0 || return 1
}
test.sh:
#!/bin/bash
# Load mylib.sh using the source command
source mylib.sh

echo "JAIL_ROOT is set to $JAIL_ROOT"

# Invoke is_root() and show a message to the user
is_root && echo "You are logged in as root." || echo "You are not logged in as root."
By the way: use rsync -a to mirror the scripts, since it preserves the +x flag.
