Open a new gnome-terminal from a script and pass vars from the present script - linux

#!/bin/bash
Dpath=/home/$USER/Docker/
IP=`sed -n 1p /home/medma/.medmadoc`
DockerMachine=`sed -n 2p /home/$USER/.medmadoc`
DockerPort=`sed -n 5p /home/$USER/.medmadoc`
DockerUser=`sed -n 3p /home/$USER/.medmadoc`
DockerPass=`sed -n 4p /home/$USER/.medmadoc`
if [ ! -d $Dpath ] ; then
    mkdir -p $Dpath
else
    stat=`wget -O ".dockerid" http://$IP/DOCKER-STAT.txt`
    for ids in `cat .dockerid`
    do
        if [ "$ids" == "$DockerMachine" ] ; then
            gnome-terminal -x sh -c 'sshfs -p$DockerPort $DockerUser@$IP:/var/www/html $Dpath ; bash'
            nautilus $Dpath
            zenity --info --text "Mounted $DockerMachine"
            exit
        else
            :
        fi
    done
    zenity --info --text "No Such ID:$DockerMachine"
fi
gnome-terminal -x sh -c 'sshfs -p$DockerPort $DockerUser@$IP:/var/www/html $Dpath ; bash'
This command opens a new terminal, but it does not load vars like $DockerPort, $DockerUser, $IP and $Dpath from this script.
How do I pass the values of these vars from this script to the newly opened terminal?
Thanks!

As indicated before, you could try to use double quotes instead of single quotes around the sshfs invocation.
Single quotes in Bash delimit verbatim text, in which variables are not expanded. Double quotes, in contrast, allow variable expansion and command substitution ($(...)) to take place.
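A minimal demonstration of the difference (the variable name is chosen just for illustration):
name=world
echo 'hello $name'     # prints: hello $name
echo "hello $name"     # prints: hello world
echo "today: $(date)"  # command substitution also happens inside double quotes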
If you do use double quotes, beware of unintended side effects: your username may contain a space, a dollar sign, a semicolon, or any other shell-special character. A cleaner approach is to export the variables to the environment before calling gnome-terminal (and not forgetting to add double quotes around your variables inside the single quotes), so that your code looks like:
export Docker{Port,User} IP Dpath
gnome-terminal -x sh -c 'sshfs -p"$DockerPort" "$DockerUser@$IP":/var/www/html "$Dpath" ; bash'
You may not want to pollute the environment with variables that will only be used once. If that is the case, instead of exporting them, you can use Bash's declare -p feature to serialize variables before loading them into a new environment (in my opinion, this is the cleanest approach). Here is what it looks like:
set_vars="$(declare -p Docker{Port,User} IP Dpath)"
gnome-terminal -x bash -c "$set_vars;"'sshfs ....'
Using this latest method, the variables are only visible to the shell process that runs the sshfs command, not gnome-terminal itself nor any sub-process run thereafter.
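To illustrate, declare -p prints each named variable as a declare statement that can be evaluated again later (the values below are made up):
$ DockerPort=2222 IP=192.168.0.10
$ declare -p DockerPort IP
declare -- DockerPort="2222"
declare -- IP="192.168.0.10"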
PS: you could read all your variables at once from the ~/.medmadoc file by using the following code instead of repeated sed invocations :
for var in IP Docker{Machine,User,Pass,Port}; do
read $var
done < ~/.medmadoc
This code makes use of the read builtin, which (in its simplest form) reads a line of input into a variable.
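A quick standalone illustration of read (hedged example):
printf 'first line\nsecond line\n' | { read -r a; read -r b; echo "a=$a b=$b"; }
# prints: a=first line b=second line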
PPS: That stat variable probably won't contain any useful information, since the output of wget was redirected to a file by the -O flag. Perhaps you meant to store the exit status of wget into stat, in which case what you meant was:
wget -O .dockerid ...
stat=$?

Related

How do you export local shell variables into a multi-command ssh?

I am trying to ssh to another server in a shell script and run some scripts.
Currently my line looks something like:
ssh user@$SERVER '$(typeset -a >> /dev/null); PROFILE_LOCATION=`locate db2profile| grep -i $INST_NAME| grep -v bak`; . $PROFILE_LOCATION; function1; function2;'
I've tried both ' and " , as well as using a combination of those with \; or ';'
How do I use the variables from my current shell script when I ssh into another server and run multiple commands? Thanks!!
If you want function declarations, and your shell is bash, use typeset -f to dump them (typeset -p dumps variables but not functions, and typeset -a will only provide a textual dump of array variables). Also, you need to actually run these in a context where they'll be locally evaluated (and ensure that your remote shell is something that understands the output, not /bin/sh).
The following hits all those points:
evaluate_db2profile() {
local db2profile
db2profile=$(locate db2profile | grep -i "$INST_NAME" | grep -v bak | head -n 1)
[ -n "$db2profile" ] && . "$db2profile"
}
ssh "user#$SERVER" bash -s <<EOF
$(typeset -p)
evaluate_db2profile
function1
function2
EOF
Because <<EOF is used rather than <<'EOF', the typeset commands are run locally and their output is substituted into the heredoc. (You could also accomplish this by using double rather than single quotes in the one-line formulation, but see below.)
Defining evaluate_db2profile locally as a function ensures that typeset -f will emit it in a format that the remote shell can evaluate, without needing to be concerned about escaping.
Using bash -s on the remote command line ensures that the shell interpreting your functions is bash, not /bin/sh. If your code is written for ksh, run ksh -s to achieve that same effect.
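For reference, typeset -p emits each variable as a declare statement that the remote shell can evaluate again; typeset -f does the same for function definitions. The value below is made up for illustration:
$ INST_NAME=db2inst1
$ typeset -p INST_NAME
declare -- INST_NAME="db2inst1"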

Execute a find command with expression from a shell script [duplicate]

This question already has answers here:
Why does shell ignore quoting characters in arguments passed to it through variables? [duplicate]
I'm trying to write a database call from within a bash script and I'm having problems with a sub-shell stripping my quotes away.
This is the bones of what I am doing.
#---------------------------------------------
#! /bin/bash
export COMMAND='psql ${DB_NAME} -F , -t --no-align -c "${SQL}" -o ${EXPORT_FILE} 2>&1'
PSQL_RETURN=`${COMMAND}`
#---------------------------------------------
If I use an 'echo' to print out the ${COMMAND} variable the output looks fine:
echo ${COMMAND}
screen output:-
#---------------
psql drupal7 -F , -t --no-align -c "SELECT DISTINCT hostname FROM accesslog;" -o /DRUPAL/INTERFACES/EXPORTS/ip_list.dat 2>&1
#---------------
Also if I cut and paste this screen output it executes just fine.
However, when I try to execute the command as a variable within a sub-shell call, it gives an error message.
The error is from the psql client to the effect that the quotes have been removed from around the ${SQL} string.
The error suggests psql is trying to interpret the terms in the sql string as parameters.
So it seems the string and quotes are composed correctly but the quotes around the ${SQL} variable/string are being interpreted by the sub-shell during the execution call from the main script.
I've tried to escape them using various methods: \", \\", \\\", "", \"" '"', \'"\', ... ...
As you can see from my 'try it all' approach I am no expert and it's driving me mad.
Any help would be greatly appreciated.
Charlie101
Instead of storing the command in a string variable, it is better to use a Bash array here:
cmd=(psql "${DB_NAME}" -F , -t --no-align -c "${SQL}" -o "${EXPORT_FILE}")
PSQL_RETURN=$( "${cmd[@]}" 2>&1 )
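To see why the array form works, note that each array element expands to exactly one word, so the SQL string survives as a single argument. A minimal sketch with made-up values:
args=(-c "SELECT 1 FROM foo;")
printf '<%s> ' "${args[@]}"; echo
# prints: <-c> <SELECT 1 FROM foo;>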
Rather than evaluating the contents of a string, why not use a function?
call_psql() {
# optional, if variables are already defined in global scope
DB_NAME="$1"
SQL="$2"
EXPORT_FILE="$3"
psql "$DB_NAME" -F , -t --no-align -c "$SQL" -o "$EXPORT_FILE" 2>&1
}
then you can just call your function like:
PSQL_RETURN=$(call_psql "$DB_NAME" "$SQL" "$EXPORT_FILE")
It's entirely up to you how elaborate you make the function. You might like to check for the correct number of arguments (using something like (( $# == 3 ))) before calling the psql command.
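For instance, a hedged sketch of such a guard (the usage message is purely illustrative):
call_psql() {
    (( $# == 3 )) || { echo "usage: call_psql DB_NAME SQL EXPORT_FILE" >&2; return 1; }
    psql "$1" -F , -t --no-align -c "$2" -o "$3" 2>&1
}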
Alternatively, perhaps you'd prefer just to make it as short as possible:
call_psql() { psql "$1" -F , -t --no-align -c "$2" -o "$3" 2>&1; }
In order to capture the command that is being executed for debugging purposes, you can use set -x in your script. This will print the commands being run, with their variables expanded, when the function (or any other command) is called. You can switch this behaviour off using set +x, or if you want it on for the whole duration of the script you can change the shebang to #!/bin/bash -x. This saves you explicitly echoing throughout your script to find out what commands are being run; you can just turn on set -x for a section.
A very simple example script using the shebang method:
#!/bin/bash -x
ec() {
echo "$1"
}
var=$(ec 2)
Running this script, either directly after making it executable or calling it with bash -x, gives:
++ ec 2
++ echo 2
+ var=2
Removing the -x from the shebang or the invocation results in the script running silently.

bash passing strings to "gnome-terminal -e"

This question looks like Opening multiple tabs in gnome terminal with complex commands from a cycle, but I am looking for a more generic solution.
I have a C program that calls a script "xvi" with arguments. Each argument is originally enclosed within single quotes (') and each quote inside an argument is isolated and backslashed (this format is a prerequisite), e.g.:
xvi 'a file' 'let'\''s try another'
The script xvi must launch gnome-terminal with "-e vim args"
With xterm instead of gnome-terminal, this is easy because xterm assumes that "-e" is the last argument and passes all the tail to the shell, so the following is OK:
exec /usr/bin/xterm -e /usr/bin/vim "$@"
For gnome-terminal, "-e" is an option among others, and we need to package the whole command line in one argument. This is what I have done, which works: enclose each argument within double quotes (\"arg\") and backslash any double quote within an argument:
cmd="/usr/bin/vim"
while [ "$1" != "" ] ; do
arg=`echo "$1" | sed -e 's/\"/\\\"/g'`
cmd="$cmd \"$arg\""
shift
done
exec gnome-terminal --zoom=0.9 --disable-factory -e "$cmd"
Again, this works fine and I am nearly happy with that.
Question: Is there any nicer solution, avoiding the loop?
Thanks
Untested, but you could probably finagle printf '%q' into doing the job:
exec gnome-terminal --zoom=0.9 --disable-factory -e "$(printf '%q ' "$@")"
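As a quick check of what %q produces (output from bash's printf builtin):
$ printf '%q ' 'a file' "let's try another"; echo
a\ file let\'s\ try\ another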
I know this thread is old but recently I had a similar need and I created a bash script to launch multiple tabs and run different commands on each of them:
#!/bin/bash
# Array of commands to run in different tabs
commands=(
'tail -f /var/log/apache2/access.log'
'tail -f /var/log/apache2/error.log'
'tail -f /usr/local/var/postgres/server.log'
)
# Build final command with all the tabs to launch
set finalCommand=""
for (( i = 0; i < ${#commands[#]}; i++ )); do
export finalCommand+="--tab -e 'bash -c \"${commands[$i]}\"' "
done
# Run the final command
eval "gnome-terminal "$finalCommand
You just need to add your commands to the array and execute the script.
Gist link: https://gist.github.com/rollbackpt/b4e17e2f4c23471973e122a50d591602

Shell scripting shell inside shell

I would like to connect to different shells (csh, ksh, etc.) and execute a command inside each switched shell.
Following is the sample program which reflects my intention:
#!/bin/bash
echo $SHELL
csh
echo $SHELL
exit
ksh
echo $SHELL
exit
Since I am not well versed in shell scripting, I need a pointer on how to achieve this. Any help would be much appreciated.
If you want to execute only one single command, you can use the -c option
csh -c 'echo $SHELL'
ksh -c 'echo $SHELL'
If you want to execute several commands, or even a whole script in a child shell, you can use the here-document feature of bash and use the -s option (read commands from stdin) on the child shells:
#!/bin/bash
echo "this is bash"
csh -s <<- EOF
echo "here go the commands for csh"
echo "and another one..."
EOF
echo "this is bash again"
ksh -s <<- EOF
echo "and now, we're in ksh"
EOF
Note that you can't easily check the shell you are in by echo $SHELL, because the parent shell expands this variable to the text /bin/bash before the child shell ever sees it. If you want to be sure that the child shell works, you should check whether some shell-specific syntax works or not.
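One way to check which shell is actually interpreting the commands, assuming a POSIX ps is available:
csh -c 'ps -p $$ -o comm='   # should print: csh
ksh -c 'ps -p $$ -o comm='   # should print: ksh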
It is possible to use the command line options provided by each shell to run a snippet of code.
For example, for bash use the -c option:
bash -c "$code"
bash -c 'echo hello'
zsh and fish also use the -c option.
Other shells will state the options they use in their man pages.
You need to use the -c command line option if you want to pass commands on bash startup:
#!/bin/bash
# We are in bash already ...
echo $SHELL
csh -c 'echo $SHELL'
ksh -c 'echo $SHELL'
You can pass arbitrarily complex scripts to a shell, using the -c option, as in
sh -c 'echo This is the Bourne shell.'
You will save yourself a lot of headaches related to quotes and variable expansion if you wrap the call in a function that reads the script on stdin, as:
execute_with_ksh()
{
local script
script=$(cat)
ksh -c "${script}"
}
prepare_complicated_script()
{
# Write shell script on stdout,
# for instance by cat-ting a here-document.
cat <<'EOF'
echo ${SHELL}
EOF
}
prepare_complicated_script | execute_with_ksh
The advantage of this method is that it is easy to insert a tee in the pipe, or to break the pipe, to inspect and control the script being passed to the shell.
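For example, to watch the generated script on stderr while still executing it:
prepare_complicated_script | tee /dev/stderr | execute_with_ksh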
If you want to execute the script on a remote host through ssh, you should consider encoding your script in base64 to transmit it safely to the remote shell.
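A hedged sketch of that idea (the host is a placeholder; assumes GNU base64 on both ends and bash on the remote side):
encoded=$(prepare_complicated_script | base64 -w0)
ssh user@remotehost "echo '$encoded' | base64 -d | bash -s"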

Triple nested quotations in shell script

I'm trying to write a shell script that calls another script that then executes a rsync command.
The second script should run in its own terminal, so I use a gnome-terminal -e "..." command. One of the parameters of this script is a string containing the parameters that should be given to rsync. I put those into single quotes.
Everything worked fine until one of the rsync parameters was a directory path that contained a space. I tried numerous combinations of ', ", \", \' but the script either doesn't run at all or only the first part of the path is taken.
Here's a slightly modified version of the code I'm using
gnome-terminal -t 'Rsync scheduled backup' -e "nice -10 /Scripts/BackupScript/Backup.sh 0 0 '/Scripts/BackupScript/Stamp' '/Scripts/BackupScript/test' '--dry-run -g -o -p -t -R -u --inplace --delete -r -l '\''/media/MyAndroid/Internal storage'\''' "
Within Backup.sh this command is run
rsync $5 "$path"
where the destination $path is calculated from text in Stamp.
How can I achieve these three levels of nested quotations?
These are some questions I looked at just now (I've tried other sources earlier as well):
https://unix.stackexchange.com/questions/23347/wrapping-a-command-that-includes-single-and-double-quotes-for-another-command
how to make nested double quotes survive the bash interpreter?
Using multiple layers of quotes in bash
Nested quotes bash
I was unsuccessful in applying the solutions to my problem.
Here is an example. caller.sh uses gnome-terminal to execute foo.sh, which in turn prints all the arguments and then calls rsync with the first argument.
caller.sh:
#!/bin/bash
gnome-terminal -t "TEST" -e "./foo.sh 'long path' arg2 arg3"
foo.sh:
#!/bin/bash
echo $# arguments
for i; do # same as: for i in "$@"; do
echo "$i"
done
rsync "$1" "some other path"
Edit: If $1 contains several parameters to rsync, some of which are long paths, the above won't work, since bash either passes "$1" as one parameter, or $1 as multiple parameters, splitting it without regard to contained quotes.
There is (at least) one workaround; you can trick bash as follows:
caller2.sh:
#!/bin/bash
gnome-terminal -t "TEST" -e "./foo.sh '--option1 --option2 \"long path\"' arg2 arg3"
foo2.sh:
#!/bin/bash
rsync_command="rsync $1"
eval "$rsync_command"
This will do the equivalent of typing rsync --option1 --option2 "long path" on the command line.
WARNING: This hack introduces a security vulnerability, $1 can be crafted to execute multiple commands if the user has any influence whatsoever over the string content (e.g. '--option1 --option2 \"long path\"; echo YOU HAVE BEEN OWNED' will run rsync and then execute the echo command).
Did you try escaping the space in the path with "\ " (no quotes)?
gnome-terminal -t 'Rsync scheduled backup' -e "nice -10 /Scripts/BackupScript/Backup.sh 0 0 '/Scripts/BackupScript/Stamp' '/Scripts/BackupScript/test' '--dry-run -g -o -p -t -R -u --inplace --delete -r -l ''/media/MyAndroid/Internal\ storage''' "
