Piping a shell script to bash and launching an interactive bash - Linux

Consider the following shell script on example.com:
#!/bin/bash
export HELLO_SCOPE=WORLD
eval "$@"
Now, I would like to download and then execute this shell script with parameters in the simplest way and be able to launch an interactive bash terminal with the HELLO_SCOPE variable set.
I have tried
curl http://example.com/hello_scope.sh | bash -s bash -i
But it quits the shell immediately. From what I can understand, it's because curl's stdout, the script, remains the stdin of bash, preventing it from starting interactively (as that would require my keyboard to be stdin).
Is there a way to avoid this without going through the extra step of creating a temporary file with the shell script?

You can source it:
# open a shell
. <(curl http://example.com/hello_scope.sh)
# type commands ...
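A quick check, assuming the hello_scope.sh from the question:
. <(curl -s http://example.com/hello_scope.sh)
echo "$HELLO_SCOPE"   # prints WORLD in your current, already-interactive shell
The <( ) process substitution hands the downloaded code to the shell on its own file descriptor, so your keyboard stays connected to stdin.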

You could just download the script (using wget, for example) and source it, couldn't you?
script_name="hello_scope.sh"
[[ -f $script_name ]] && rm -rf "$script_name"
wget "http://example.com/$script_name" -O "$script_name" -o /dev/null &&
    chmod u+x "$script_name" &&
    source "$script_name"
You could use . "$script_name" instead of source "$script_name" if you want (. is POSIX compliant). You could write the previous code in a script and source it to have an interactive shell with the $HELLO_SCOPE variable set.
Finally you could remove the eval line in your remote shell script.
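For example, a small wrapper (the name fetch_and_source.sh is hypothetical) that you source rather than execute, so $HELLO_SCOPE ends up in your interactive shell:
# fetch_and_source.sh -- run as:  . ./fetch_and_source.sh
script_name="hello_scope.sh"
wget "http://example.com/$script_name" -O "$script_name" -o /dev/null &&
    . "./$script_name"
echo "$HELLO_SCOPE"   # WORLD, in the shell that sourced the wrapper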

Related

grab remote shell script then run it with parameter in localhost

I have uploaded a test script remote.sh to a remote webserver like this:
#!/usr/bin/bash
echo "input var is : $1"
and I have a local script local.sh like this:
#!/usr/bin/bash
curl -sS https://remote_host/remote.sh | bash
then I run the local script with an inline parameter:
./local.sh "some input here."
But the remote script I grabbed doesn't seem to see the local inline parameter. How can this be done?
Your code is starting a second copy of bash, and not passing the arguments retrieved to it.
I would generally suggest not starting a second copy of bash at all:
#!/usr/bin/env bash
eval "$(curl -sS https://remote_host/remote.sh)"
...but you could proceed to do so and pass them through. The following passes that code on the command line, leaving stdin free (so the new copy of bash being started can use it to prompt the user):
#!/bin/sh
code=$(curl -sS https://remote_host/remote.sh) || exit
exec bash -c "$code" bash "$@"
...or, to continue using stdin to pass code, bash -s can be used:
#!/bin/sh
curl -sS https://remote_host/remote.sh | bash -s -- "$@"
By the way -- everywhere I use /bin/sh above you could substitute /bin/bash or any other POSIX-compliant shell; the point being made is that the code given above does not depend on behaviors that are unspecified in the POSIX.2 standard.
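For instance, with the remote.sh shown above, all three variants of local.sh should behave the same way:
$ ./local.sh "some input here."
input var is : some input here.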

Different outputs using ./ and sh [duplicate]

I have a line of code that works fine in my terminal:
for i in *.mp4; do echo ffmpeg -i "$i" "${i/.mp4/.mp3}"; done
Then I put the exact same line of code in a script myscript.sh:
#!/bin/sh
for i in *.mp4; do echo ffmpeg -i "$i" "${i/.mp4/.mp3}"; done
However, now I get an error when running it:
$ sh myscript.sh
myscript.sh: 2: myscript.sh: Bad substitution
Based on other questions I tried changing the shebang to #!/bin/bash, but I get the exact same error. Why can't I run this script?
TL;DR: Since you are using Bash specific features, your script has to run with Bash and not with sh:
$ sh myscript.sh
myscript.sh: 2: myscript.sh: Bad substitution
$ bash myscript.sh
ffmpeg -i bar.mp4 bar.mp3
ffmpeg -i foo.mp4 foo.mp3
See Difference between sh and Bash. To find out which sh you are using: readlink -f $(which sh).
The best way to ensure a bash specific script always runs correctly
The best practices are to both:
Replace #!/bin/sh with #!/bin/bash (or whichever other shell your script depends on).
Run this script (and all others!) with ./myscript.sh or /path/to/myscript.sh, without a leading sh or bash.
Here's an example:
$ cat myscript.sh
#!/bin/bash
for i in *.mp4
do
echo ffmpeg -i "$i" "${i/.mp4/.mp3}"
done
$ chmod +x myscript.sh # Ensure script is executable
$ ./myscript.sh
ffmpeg -i bar.mp4 bar.mp3
ffmpeg -i foo.mp4 foo.mp3
(Related: Why ./ in front of scripts?)
The meaning of #!/bin/sh
The shebang tells the system which interpreter to use to run a script. This allows you to specify #!/usr/bin/python or #!/bin/bash so that you don't have to remember which script is written in what language.
People use #!/bin/sh when they only use a limited set of features (defined by the POSIX standard) for maximum portability. #!/bin/bash is perfectly fine for user scripts that take advantage of useful bash extensions.
/bin/sh is usually symlinked to either a minimal POSIX compliant shell or to a standard shell (e.g. bash). Even in the latter case, #!/bin/sh may fail because bash will run in compatibility mode as explained in the man page:
If bash is invoked with the name sh, it tries to mimic the startup behavior of historical versions of sh as closely as possible, while conforming to the POSIX standard as well.
The meaning of sh myscript.sh
The shebang is only used when you run ./myscript.sh, /path/to/myscript.sh, or when you drop the extension, put the script in a directory in your $PATH, and just run myscript.
If you explicitly specify an interpreter, that interpreter will be used. sh myscript.sh will force it to run with sh, no matter what the shebang says. This is why changing the shebang is not enough by itself.
You should always run the script with its preferred interpreter, so prefer ./myscript.sh or similar whenever you execute any script.
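In short, once the shebang is fixed (this just restates the runs shown above):
$ sh myscript.sh      # explicit interpreter wins: still sh, still "Bad substitution"
$ bash myscript.sh    # explicit interpreter wins: bash, works
$ ./myscript.sh       # no interpreter given: the #!/bin/bash shebang decides, works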
Other suggested changes to your script:
It is considered good practice to quote variables ("$i" instead of $i). Quoted variables will prevent problems if the stored file name contains white space characters.
I like that you use advanced parameter expansion. I suggest using "${i%.mp4}.mp3" (instead of "${i/.mp4/.mp3}"), since ${parameter%word} only removes a match at the end of the value, whereas ${i/.mp4/.mp3} replaces the first match anywhere (which goes wrong for a name like foo.mp4.backup.mp4).
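To illustrate with a hypothetical name that contains .mp4 twice:
i='foo.mp4.backup.mp4'
echo "${i/.mp4/.mp3}"   # foo.mp3.backup.mp4  -- replaces the first match
echo "${i%.mp4}.mp3"    # foo.mp4.backup.mp3  -- only strips a trailing .mp4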
The ${var/x/y} construct is not POSIX. In your case, where you just remove a string at the end of a variable and tack on another string, the portable POSIX solution is to use
#!/bin/sh
for i in *.mp4; do
ffmpeg -i "$i" "${i%.mp4}.mp3"
done
or even shorter, ffmpeg -i "$i" "${i%4}3".
The definitive reference for these constructs is the chapter on Parameter Expansion in the POSIX shell specification.

How can I feed input within bash [Executed through the Network]

As the title says, within Linux, how can I feed input to bash when I do sudo bash?
Let's say I have a bash script that reads the name.
The way I execute the script is through sudo using:
cat read-my-name-script.sh | sudo bash
Let's just say this is how I execute the script through the network.
Now I want to fill in the name automatically. Is there a way to feed the input? I tried doing this: cat read-my-name-script.sh < name-input-file | sudo bash, where name-input-file is a file containing the input that the user will be using to feed the script.
I am new to Linux and learning to automate input, and I wanted to create an input file that the user can fill in and feed to my script.
This is convoluted, but might do what you want.
sudo bash -c "$(cat read-my-name.sh)" <name-input-file
The -c option says that the next quoted argument contains the commands to run (so the script is read as a string on the command line, instead of from a file), and the calling shell interpolates the contents of the file inside the double quotes before the sudo command gets evaluated. So if read-my-name.sh contains
#!/bin/bash
read -p "I want your name please"
then the command gets expanded into
sudo bash -c '#!/bin/bash
read -p "I want your name please"' <name-input-file
(where of course at this time the shell has actually removed the outer double quotes altogether; I put in single quotes in their place instead to show how this would look as actually executable, syntactically valid code).
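A hypothetical end-to-end run (the file contents and the extra echo are assumptions, only there to show where the input goes):
$ cat name-input-file
Alice
$ sudo bash -c 'read -p "I want your name please" name; echo "Hello, $name"' <name-input-file
Hello, Alice
The prompt itself is not printed because stdin is not a terminal; read simply consumes the first line of name-input-file.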
I think you need this:
while read -r arg; do sudo bash read-my-name-script.sh "$arg"; done <name-input-file
So each line of name-input-file will be passed as an argument to sudo bash read-my-name-script.sh.
If your argument list is located on an HTTP server, you can do this:
while read -r arg; do sudo bash read-my-name-script.sh "$arg";done < <(wget -q -O- http://some/address/in/internet/name-input-file)
Update:
add [[ -f name-input-file ]] && readarray -t args <name-input-file
to read-my-name-script.sh
and use "${args[@]}" as the arguments of a command in the script.
For example echo "${args[@]}", or cmd "${args[0]}" "${args[1]}" ... "${args[100]}" in any order.
In this case you can use
wget -q -O- http://some/address/in/internet/read-my-name-script.sh | bash
to run your script with the arguments from name-input-file without saving the script to the local machine.
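A sketch of what the modified read-my-name-script.sh could look like (the greeting is just an illustration):
#!/bin/bash
[[ -f name-input-file ]] && readarray -t args <name-input-file
echo "Hello, ${args[0]}"   # or hand "${args[@]}" to any command, in any order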

Shell scripting shell inside shell

I would like to connect to different shells (csh, ksh, etc.) and execute commands inside each switched shell.
Following is the sample program which reflects my intention:
#!/bin/bash
echo $SHELL
csh
echo $SHELL
exit
ksh
echo $SHELL
exit
Since I am not well versed with shell scripting, I need a pointer on how to achieve this. Any help would be much appreciated.
If you want to execute only a single command, you can use the -c option:
csh -c 'echo $SHELL'
ksh -c 'echo $SHELL'
If you want to execute several commands, or even a whole script, in a child shell, you can use the here-document feature of bash and use the -s option (read commands from stdin) on the child shells:
#!/bin/bash
echo "this is bash"
csh -s <<- EOF
echo "here go the commands for csh"
echo "and another one..."
EOF
echo "this is bash again"
ksh -s <<- EOF
echo "and now, we're in ksh"
EOF
Note that you can't easily check which shell you are in by running echo $SHELL, because the parent shell expands this variable to the text /bin/bash before the child shell ever sees it. If you want to be sure that the child shell works, you should check whether some shell-specific syntax works or not.
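For example, expanding a bash-only variable inside a quoted here-document (so the parent bash cannot expand it first) is a more reliable check than $SHELL; this assumes ksh is installed, as in the snippet above:
ksh -s <<- 'EOF'
echo "BASH_VERSION is '${BASH_VERSION:-empty}' here, so this is not bash"
EOF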
It is possible to use the command line options provided by each shell to run a snippet of code.
For example, for bash use the -c option:
bash -c "$code"
bash -c 'echo hello'
zsh and fish also use the -c option.
Other shells will state the options they use in their man pages.
You need to use the -c command line option if you want to pass commands on bash startup:
#!/bin/bash
# We are in bash already ...
echo $SHELL
csh -c 'echo $SHELL'
ksh -c 'echo $SHELL'
You can pass arbitrarily complex scripts to a shell using the -c option, as in
sh -c 'echo This is the Bourne shell.'
You will save yourself a lot of headaches related to quotes and variable expansion if you wrap the call in a function that reads the script on stdin, as in:
execute_with_ksh()
{
local script
script=$(cat)
ksh -c "${script}"
}
prepare_complicated_script()
{
# Write shell script on stdout,
# for instance by cat-ting a here-document.
cat <<'EOF'
echo ${SHELL}
EOF
}
prepare_complicated_script | execute_with_ksh
The advantage of this method is that it is easy to insert a tee in the pipe, or to break the pipe, in order to control the script being passed to the shell.
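For instance, to keep a copy of exactly what the child shell receives (the /tmp path is just an example):
prepare_complicated_script | tee /tmp/script-sent-to-ksh.sh | execute_with_ksh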
If you want to execute the script on a remote host through ssh, you should consider encoding your script in base64 to transmit it safely to the remote shell.
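A rough sketch of that idea (the host is hypothetical, and the -d flag is the GNU coreutils spelling; some platforms differ):
prepare_complicated_script | base64 | ssh user@remote-host 'base64 -d | ksh'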
