How to pass a parameter to a downloaded script - linux

I'm stuck passing a parameter(URL to download) to a script.
My goal is to create a script for deployment that downloads and installs an app.
The script I run:
curl url_GitHub | bash -s url_download_app
The script on GitHub:
#! /bin/sh
url="$2"
filename=$(basename "$url")
workpath=$(dirname $(readlink -f $0))
curl $url -o $workpath/$filename -s
sudo dpkg --install $workpath/$filename
As I understand it, the URL isn't getting passed to the url="$2" variable.
If I run the GitHub script locally, and pass the URL to download the app, it executes successfully.
Something like:
bash install.sh -s url_download_app
Please help=)

-s appears to be an option intended for the downloaded script. However, it is also an option accepted by bash, so what I think you want is
curl url_GitHub | bash -s -- -s url_download_app
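This behavior can be checked without any network access by feeding an inline script to bash -s; the echo one-liner below is just an illustration standing in for the GitHub script:

```shell
# Everything after `--` becomes a positional parameter of the piped script,
# so the second -s is no longer swallowed by bash's own option parsing.
printf 'echo "first arg: $1, second arg: $2"\n' | bash -s -- -s url_download_app
# prints: first arg: -s, second arg: url_download_app
```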

Since the script on GitHub uses $2, we should pass the URL as the second argument:
curl url_GitHub | bash -s _ url_download_app
The arguments _ url_download_app will be passed to the script on GitHub; the placeholder _ occupies $1.
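A minimal sketch of that call, with printf standing in for the curl download so it can be tried offline (the URL is a placeholder):

```shell
# The placeholder `_` lands in $1, so the URL arrives in $2 as the script expects.
printf 'url="$2"; echo "url=$url"\n' | bash -s _ http://example.com/app.deb
# prints: url=http://example.com/app.deb
```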

What about the following (using process substitution):
bash <(curl -Ss url_GitHub) url_download_app
I did a proof of concept with the following script:
$ cat /tmp/test.sh
#!/bin/bash
echo "I got '$1'"
exit 0
and when you run it you get:
$ bash <(cat /tmp/test.sh) "test input argument"
I got 'test input argument'

Related

grab remote shell script then run it with parameter in localhost

I have uploaded a test script remote.sh to a remote webserver, like this:
#!/usr/bin/bash
echo "input var is : $1"
and I have a local script local.sh, like this:
#!/usr/bin/bash
curl -sS https://remote_host/remote.sh | bash
Then I run the local script with an inline parameter:
./local.sh "some input here."
But the remote script I grabbed doesn't seem to see the local inline parameter. How can this be done?
Your code is starting a second copy of bash, and not passing the arguments retrieved to it.
I would generally suggest not starting a second copy of bash at all:
#!/usr/bin/env bash
eval "$(curl -sS https://remote_host/remote.sh)"
...but you could proceed to do so and pass them through. The following passes that code on the command line, leaving stdin free (so the new copy of bash being started can use it to prompt the user):
#!/bin/sh
code=$(curl -sS https://remote_host/remote.sh) || exit
exec bash -c "$code" bash "$@"
...or, to continue using stdin to pass code, bash -s can be used:
#!/bin/sh
curl -sS https://remote_host/remote.sh | bash -s -- "$@"
By the way -- everywhere I use /bin/sh above you could substitute /bin/bash or any other POSIX-compliant shell; the point being made is that the code given above does not depend on behaviors that are unspecified in the POSIX.2 standard.
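The stdin variant can be exercised without a webserver; printf plays the role of curl here, and alpha/beta are stand-in arguments (the remote script body is a made-up one-liner):

```shell
#!/bin/sh
# Stand-in for the downloaded code: a script that reports its arguments.
code='echo "got $# args: $1 $2"'
# In the real local.sh, "$@" would be the arguments given to local.sh itself;
# here we set them explicitly for the demonstration.
set -- alpha beta
printf '%s\n' "$code" | bash -s -- "$@"
# prints: got 2 args: alpha beta
```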

How do I make my bash script on download automatically turn into a terminal command? [duplicate]

Say I have a file at the URL http://mywebsite.example/myscript.txt that contains a script:
#!/bin/bash
echo "Hello, world!"
read -p "What is your name? " name
echo "Hello, ${name}!"
And I'd like to run this script without first saving it to a file. How do I do this?
Now, I've seen the syntax:
bash < <(curl -s http://mywebsite.example/myscript.txt)
But this doesn't seem to work like it would if I saved to a file and then executed. For example readline doesn't work, and the output is just:
$ bash < <(curl -s http://mywebsite.example/myscript.txt)
Hello, world!
Similarly, I've tried:
curl -s http://mywebsite.example/myscript.txt | bash -s --
With the same results.
Originally I had a solution like:
timestamp=`date +%Y%m%d%H%M%S`
curl -s http://mywebsite.example/myscript.txt -o /tmp/.myscript.${timestamp}.tmp
bash /tmp/.myscript.${timestamp}.tmp
rm -f /tmp/.myscript.${timestamp}.tmp
But this seems sloppy, and I'd like a more elegant solution.
I'm aware of the security issues regarding running a shell script from a URL, but let's ignore all of that for right now.
source <(curl -s http://mywebsite.example/myscript.txt)
ought to do it. Alternately, leave off the initial redirection on yours, which is redirecting standard input; bash takes a filename to execute just fine without redirection, and <(command) syntax provides a path.
bash <(curl -s http://mywebsite.example/myscript.txt)
It may be clearer if you look at the output of echo <(cat /dev/null)
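For instance, as a local sketch with no real URL involved, process substitution hands bash a /dev/fd path, and any words after that path become the script's $1, $2, and so on:

```shell
# <(command) expands to a path such as /dev/fd/63:
echo <(true)
# bash treats that path as a script file; later words are its arguments.
bash <(printf 'echo "hi $1"\n') world
# the second command prints: hi world
```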
This is the way to execute a remote script, passing it some arguments (arg1 arg2):
curl -s http://server/path/script.sh | bash /dev/stdin arg1 arg2
For bash, Bourne shell and fish:
curl -s http://server/path/script.sh | bash -s arg1 arg2
The -s flag makes the shell read the script from stdin.
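Both spellings can be compared side by side with a local stand-in for the remote script (the one-liner is an illustration only):

```shell
script='echo "a=$1 b=$2"'
# bash reads the script from the /dev/stdin pseudo-file...
printf '%s\n' "$script" | bash /dev/stdin one two
# ...or from standard input via -s; both print: a=one b=two
printf '%s\n' "$script" | bash -s one two
```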
Use:
curl -s -L URL_TO_SCRIPT_HERE | bash
For example:
curl -s -L http://bitly/10hA8iC | bash
Using wget, which is usually part of a default system installation:
bash <(wget -qO- http://mywebsite.example/myscript.txt)
You can also do this:
wget -O - https://raw.github.com/luismartingil/commands/master/101_remote2local_wireshark.sh | bash
The best way to do it is
curl http://domain/path/to/script.sh | bash -s arg1 arg2
which is a slight change of the answer by @user77115
You can use curl and send it to bash like this:
bash <(curl -s http://mywebsite.example/myscript.txt)
I often find the following is enough:
curl -s http://mywebsite.example/myscript.txt | sh
But on an old system (kernel 2.4) it ran into problems; I tried many alternatives, and only the following worked:
curl -s http://mywebsite.example/myscript.txt -o a.sh && sh a.sh && rm -f a.sh
Examples
$ curl -s someurl | sh
Starting to insert crontab
sh: _name}.sh: command not found
sh: line 208: syntax error near unexpected token `then'
sh: line 208: ` -eq 0 ]]; then'
$
The problem may be caused by a slow network, or a bash version too old to handle a slow network gracefully.
However, the following solves the problem
$ curl -s someurl -o a.sh && sh a.sh && rm -f a.sh
Starting to insert crontab
Insert crontab entry is ok.
Insert crontab is done.
okay
$
Also:
curl -sL https://.... | sudo bash -
Just combining amra and user77115's answers:
wget -qO- https://raw.githubusercontent.com/lingtalfi/TheScientist/master/_bb_autoload/bbstart.sh | bash -s -- -v -v
It executes the remote bbstart.sh script, passing it the -v -v options.
In some unattended scripts I use the following command:
sh -c "$(curl -fsSL <URL>)"
I recommend avoiding the execution of scripts directly from URLs. You should be sure the URL is safe, and check the content of the script before executing it; you can use a SHA256 checksum to validate the file first.
Instead of executing the script directly, first download it and then execute it:
SOURCE='https://gist.githubusercontent.com/cci-emciftci/123123/raw/123123/sample.sh'
curl $SOURCE -o ./my_sample.sh
chmod +x my_sample.sh
./my_sample.sh
This way is good and conventional:
17:04:59#itqx|~
qx>source <(curl -Ls http://192.168.80.154/cent74/just4Test) Lord Jesus Loves YOU
Remote script test...
Param size: 4
---------
17:19:31#node7|/var/www/html/cent74
arch>cat just4Test
echo Remote script test...
echo Param size: $#
If you want the script run using the current shell, regardless of what it is, use:
${SHELL:-sh} -c "$(wget -qO - http://mywebsite.example/myscript.txt)"
if you have wget, or:
${SHELL:-sh} -c "$(curl -Ls http://mywebsite.example/myscript.txt)"
if you have curl.
This command will still work if the script is interactive, i.e., it asks the user for input.
Note: OpenWRT has a wget clone but not curl, by default.
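The ${SHELL:-sh} idiom is plain POSIX parameter expansion: it falls back to sh when SHELL is unset. A quick illustration with a hypothetical variable name:

```shell
# ${VAR:-default} expands to $VAR if set and non-empty, otherwise to "default".
unset MYSHELL
echo "${MYSHELL:-sh}"       # prints: sh
MYSHELL=/bin/bash
echo "${MYSHELL:-sh}"       # prints: /bin/bash
```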
curl http://your.url.here/script.txt | bash
actual example:
juan@juan-MS-7808:~$ curl https://raw.githubusercontent.com/JPHACKER2k18/markwe/master/testapp.sh | bash
Oh, wow im alive
juan@juan-MS-7808:~$

How can I pass arguments to a script executed by sh read from stdin?

I download some shell script from example.com with wget and execute it immediately by streaming stdout of wget via a pipe to stdin of the sh command.
wget -O - http://example.com/myscript.sh | sh -
How can I pass arguments to the script?
You need to use the -s option when invoking bash, to pass an argument to the shell script being downloaded:
wget -O - http://example.com/myscript.sh | bash -s 'arg1' 'arg2'
As per man bash:
-s If the -s option is present, or if no arguments remain after option processing,
then commands are read from the standard input. This option allows
the positional parameters to be set when invoking an interactive shell.
While the accepted answer is correct, it only works on bash and not on sh, as the original poster requested.
To do this in sh you'll have to add --:
curl https://example.com/script.sh | sh -s -- --my-arg
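This can be verified with a plain POSIX sh and no download at all; printf stands in for curl:

```shell
# Without --, sh could try to interpret --my-arg as one of its own options.
printf 'echo "received: $1"\n' | sh -s -- --my-arg
# prints: received: --my-arg
```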

Installing RVM with a specific bash command format using `-s`

I'm trying to install RVM via bash.
According to the installation guide on rvm.io, the command to do that is:
\curl -sSL https://get.rvm.io | bash -s stable --auto-dotfiles
I'm, however, compelled to download the script and then run it separately. So, my bash command would attempt to execute the downloaded file (let's call it rvm-installer).
How can I execute rvm-installer using the format below:
/bin/bash -c './rvm-installer 2>&1'
Obviously, the above command is incorrect. I'm thinking it should look more like this:
/bin/bash -s stable --auto-dotfiles './rvm-installer 2>&1'
I know the above command sets my positional parameters in the correct order because of this:
$ printf "%s\n" "$@"
stable
--auto-dotfiles
./rvm-installer 2>&1
and this:
$ echo "$SHLVL"
2
But I'm confident that rvm-installer is not running, because I replaced its contents with the following to test, and once the command above is run, I see no output:
#!/usr/bin/env bash
gimme a syntax error
echo "$1"
echo "Foo bar"
exit 1
How can I run rvm-installer using -s within the bash command format I need to use? Is there a way around using -s?
-s is a bash argument, not an rvm-installer argument.
The other arguments are the rvm-installer arguments.
So just pass those to rvm-installer yourself.
./rvm-installer stable --auto-dotfiles
Using the (entirely pointless) explicit call to bash, that would look like this:
/bin/bash -c './rvm-installer stable --auto-dotfiles 2>&1'
though that could just as meaningfully be this as well
/bin/bash rvm-installer stable --auto-dotfiles 2>&1
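A sketch of the same idea with a throwaway stand-in for rvm-installer (the script body is made up for the demonstration):

```shell
# Create a hypothetical installer that just echoes its arguments one per line.
cat > /tmp/demo-installer <<'EOF'
#!/usr/bin/env bash
printf '%s\n' "$@"
EOF
# Words after the script path become its positional parameters.
bash /tmp/demo-installer stable --auto-dotfiles
# prints:
# stable
# --auto-dotfiles
```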

Check for existence of wget/curl

Trying to do a script to download a file using wget, or curl if wget doesn't exist in Linux. How do I have the script check for existence of wget?
Linux has a which command that will check for the existence of an executable on your path:
pax> which ls ; echo $?
/bin/ls
0
pax> which no_such_executable ; echo $?
1
As you can see, it sets the return code $? to easily tell if the executable was found, so you could use something like:
if which wget >/dev/null ; then
echo "Downloading via wget."
wget --option argument
elif which curl >/dev/null ; then
echo "Downloading via curl."
curl --option argument
else
echo "Cannot download, neither wget nor curl is available."
fi
wget http://download/url/file 2>/dev/null || curl -O http://download/url/file
One can also use command, type, or hash to check whether wget/curl exists. Another thread here, "Check if a program exists from a Bash script", answers very nicely what to use in a bash script to check if a program exists.
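A sketch of the command -v form, which is specified by POSIX and avoids the portability quirks of which:

```shell
# command -v prints the path (or name) and returns 0 if the program exists,
# and returns nonzero printing nothing otherwise.
if command -v wget >/dev/null 2>&1; then
    echo "Downloading via wget."
elif command -v curl >/dev/null 2>&1; then
    echo "Downloading via curl."
else
    echo "Cannot download, neither wget nor curl is available." >&2
fi
```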
I would do this -
if [ ! -x /usr/bin/wget ] ; then
# some extra check if wget is not installed at the usual place
command -v wget >/dev/null 2>&1 || { echo >&2 "Please install wget or set it in your path. Aborting."; exit 1; }
fi
The first thing to do is to try to install wget with your usual package management system; it should tell you if it's already installed:
yum -y install wget
Otherwise just launch a command like below
wget http://download/url/file
If you receive no error, then it's OK.
A solution taken from the K3S install script (https://raw.githubusercontent.com/rancher/k3s/master/install.sh)
download() {
  url="$1"
  filename="$2"
  if [ -x "$(which wget)" ]; then
    wget -q "$url" -O "$filename"
  elif [ -x "$(which curl)" ]; then
    curl -o "$filename" -sfL "$url"
  else
    echo "Could not find curl or wget, please install one." >&2
  fi
}
# to use in the script:
download https://url /local/path/to/download
# to use in the script:
download https://url /local/path/to/download
Explanation:
It looks for the location of wget and checks for a file to exist there, if so, it does a script-friendly (i.e. quiet) download. If wget isn't found, it tries curl in a similarly script-friendly way.
(Note that the question doesn't specify BASH however my answer assumes it.)
Simply run
wget http://download/url/file
and the statistics will show whether the endpoint is available or not.
