I'm trying to execute a wget command with a variable inside it, but it just ignores the variable. Any idea what I'm doing wrong?
#!/bin/bash
URL=http:://www.myurl.com
echo $(date) 'Running wget...'
wget -O - -q "$URL/something/something2"
Four things:
1. Add quotes around your URL: http:://www.myurl.com ==> "http:://www.myurl.com"
2. Remove the double colon: "http:://www.myurl.com" ==> "http://www.myurl.com"
3. Get rid of the extra flags and hyphen on the wget command: wget -O - -q "$URL/something/something2" ==> wget "$URL/something/something2"
4. Add curly braces around your variable: wget "$URL/something/something2" ==> wget "${URL}/something/something2"
This works:
#!/bin/bash
URL="http://www.google.com"
echo $(date) 'Running wget...'
wget "${URL}"
I use this code in IPython (Colab):
URL = 'http://www.myurl.com'
!wget {URL}
I wrote this answer because I was searching for it myself!
Another handy option in Bash (or other shells) is to create a simple helper function that calls wget with the common options required by nearly all sites and any specific options you generally use. This reduces the typing involved and can also be useful in your scripts. I place the following in my ~/.bashrc to make it available to all shells/subshells. It validates input, checks that wget is available, and then passes all command line arguments to wget with the default options set in the script:
wgnc () {
    if [ -z "$1" ]; then
        printf " usage: wg <url>\t\t(runs wget --no-check-certificate --progress=bar)\n"
    elif ! type wget &>/dev/null; then
        printf " error: 'wget' not found on system\n"
    else
        printf " wget --no-check-certificate --progress=bar %s\n" "$@"
        wget --no-check-certificate --progress=bar "$@"
    fi
}
You can cut down typing even more by aliasing the function further. I use:
alias wg='wgnc'
This reduces the normal wget --no-check-certificate --progress=bar URL to simply wg URL. Obviously, you can set the options to suit your needs, but this is a further way to utilize wget in your scripts. For example, with the function and alias above loaded (see below).
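Assuming the function and alias above are loaded (the URL here is just a placeholder):
wg https://example.com/archive.tar.gz
first echoes, then runs:
wget --no-check-certificate --progress=bar https://example.com/archive.tar.gz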
The script sometimes doesn't continue after wget. Perhaps it is necessary to wait for wget to complete?
#!/usr/bin/env bash
set -Eeuo pipefail
# Installing tor-browser
echo -en "\033[1;33m Installing tor-browser... \033[0m \n"
URL='https://tor.eff.org/download/' # Official mirror https://www.torproject.org/download/, may be blocked
LINK=$(wget -qO- "$URL" | grep -oP -m 1 'href="\K/dist.+?ALL.tar.xz')
URL='https://tor.eff.org'${LINK}
curl --location "$URL" | tar --extract --xz --verbose --preserve-permissions --file=-
sudo mv tor-browser /opt
sudo chown -R "$USER" /opt/tor-browser
cd /opt/tor-browser
./start-tor-browser.desktop --register-app
There are pitfalls associated with set -e (aka set -o errexit). See BashFAQ/105 (Why doesn't set -e (or set -o errexit, or trap ERR) do what I expected?).
If you decide to use set -e despite the problems then it's a very good idea to set up an ERR trap to show what has happened, and use set -E (aka set -o errtrace) so it fires in functions and subshells etc. A basic ERR trap can be set up with
trap 'echo "ERROR: ERR trap: line $LINENO" >&2' ERR
This will prevent the classic set -e problem: the program stops suddenly, at an unknown place, and for no obvious reason.
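Putting the pieces together, a minimal sketch (the failing URL is just an illustration):
#!/usr/bin/env bash
set -Eeuo pipefail
trap 'echo "ERROR: ERR trap: line $LINENO: $BASH_COMMAND" >&2' ERR

fetch () {
    # hypothetical download that fails; the ERR trap still fires inside
    # the function because of set -E
    wget -qO- 'https://no.such.host.invalid/'
}

fetch
When the wget fails, the trap reports the line number and the failing command before the script exits.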
Under set -e, the script stops on any error.
set -Eeuo pipefail
#     ^
Maybe the site is sometimes unavailable, or the fetched page doesn't match the expression grep is searching for.
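If that is the suspicion, a minimal sketch of a guard for the grep step in the script above (reusing its $URL variable) could be:
LINK=$(wget -qO- "$URL" | grep -oP -m 1 'href="\K/dist.+?ALL.tar.xz') || true
if [ -z "$LINK" ]; then
    echo "ERROR: could not extract a download link from $URL" >&2
    exit 1
fi
The || true keeps set -e from aborting the script before the diagnostic message is printed.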
You are doing
wget -qO- $URL
according to the wget man page:
-q
--quiet
Turn off Wget's output.
This is counterproductive for finding the objective cause of the malfunction. By default, wget is verbose and writes information to stderr; if you wish to store that in a file, you can redirect stderr. Consider the following simple example:
wget -O - http://www.example.com 2>>wget_out.txt
This downloads Example Domain and writes its content to standard output (-), while stderr is appended to a file named wget_out.txt. If you run that command, e.g., 3 times, you will have information from 3 runs in wget_out.txt.
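Applied to the tor-browser script above, a sketch of the same idea, dropping -q and logging stderr instead, might be:
LINK=$(wget -O- "$URL" 2>>wget_out.txt | grep -oP -m 1 'href="\K/dist.+?ALL.tar.xz')
So when the link extraction fails, you can inspect wget_out.txt to see whether the download itself was the problem.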
I want to make a shortcut for long commands with default arguments so I can bring the command into the command line and then add or change arguments by myself.
For example with the command wget:
print "wget -O downloaded.file"
will result in:
user@hostname$ wget -O downloaded.file
and then I add the "url" I want to download from:
user@hostname$ wget -O downloaded.file http://example.com/
With
alias my_personal_wget_shortcut="wget -O downloaded.file"
additional arguments are added afterwards:
my_personal_wget_shortcut http://example.com/
With a function or a shell script, you can place the arguments anywhere you like:
Function:
my_personal_wget_shortcut()
{
wget -O downloaded.file "$@"
}
Script:
#!/bin/bash
wget -O downloaded.file "$@"
Don't forget to set the x-bit on the script!
My answer is based on solutions from https://unix.stackexchange.com/a/82716/330217 and https://unix.stackexchange.com/a/325220/330217.
You might also get some ideas from https://unix.stackexchange.com/a/391698/330217
Clean way for bash, but requires pressing cursor-up:
You can put "fake" commands onto bash's command history using
history -s 'some command'
(see https://www.gnu.org/savannah-checkouts/gnu/bash/manual/bash.html#Bash-History-Builtins)
After putting the command to the history you can press the cursor-up key to get this command into the command buffer and edit it as you like.
Of course you can use an alias as a shortcut.
alias myalias="history -s 'wget -O downloaded.file '"
myalias
After pressing cursor-up you should get
wget -O downloaded.file _
Note: The underscore (_) is meant to show the cursor position after the trailing space.
More hacky solution which directly fills the command buffer:
writecmd () {
perl -e 'ioctl STDOUT, 0x5412, $_ for split //, do{ chomp($_ = <>); $_ }' ;
}
# Example usage
echo 'my test cmd' | writecmd
Combined with an alias:
alias myalias="echo 'wget -O downloaded.file '|writecmd"
myalias
should directly result in
wget -O downloaded.file _
Note: This solution has the drawback that it also prints the command to stdout. That means the command also appears before the prompt. (I don't know if it is possible to suppress this. When I redirect stdout to /dev/null, the command will no longer be written to the command buffer.)
In zsh you could use print -z to put something into the command buffer.
alias myalias="print -z 'wget -O downloaded.file '"
Say I have a file at the URL http://mywebsite.example/myscript.txt that contains a script:
#!/bin/bash
echo "Hello, world!"
read -p "What is your name? " name
echo "Hello, ${name}!"
And I'd like to run this script without first saving it to a file. How do I do this?
Now, I've seen the syntax:
bash < <(curl -s http://mywebsite.example/myscript.txt)
But this doesn't seem to work like it would if I saved to a file and then executed. For example readline doesn't work, and the output is just:
$ bash < <(curl -s http://mywebsite.example/myscript.txt)
Hello, world!
Similarly, I've tried:
curl -s http://mywebsite.example/myscript.txt | bash -s --
With the same results.
Originally I had a solution like:
timestamp=`date +%Y%m%d%H%M%S`
curl -s http://mywebsite.example/myscript.txt -o /tmp/.myscript.${timestamp}.tmp
bash /tmp/.myscript.${timestamp}.tmp
rm -f /tmp/.myscript.${timestamp}.tmp
But this seems sloppy, and I'd like a more elegant solution.
I'm aware of the security issues regarding running a shell script from a URL, but let's ignore all of that for right now.
source <(curl -s http://mywebsite.example/myscript.txt)
ought to do it. Alternately, leave off the initial redirection on yours, which is redirecting standard input; bash takes a filename to execute just fine without redirection, and <(command) syntax provides a path.
bash <(curl -s http://mywebsite.example/myscript.txt)
It may be clearer if you look at the output of echo <(cat /dev/null), which prints a process-substitution path such as /dev/fd/63.
This is the way to execute a remote script, passing it some arguments (arg1 arg2):
curl -s http://server/path/script.sh | bash /dev/stdin arg1 arg2
For bash, Bourne shell and fish:
curl -s http://server/path/script.sh | bash -s arg1 arg2
Flag "-s" makes shell read from stdin.
Use:
curl -s -L URL_TO_SCRIPT_HERE | bash
For example:
curl -s -L http://bit.ly/10hA8iC | bash
Using wget, which is usually part of a default system installation:
bash <(wget -qO- http://mywebsite.example/myscript.txt)
You can also do this:
wget -O - https://raw.github.com/luismartingil/commands/master/101_remote2local_wireshark.sh | bash
The best way to do it is
curl http://domain/path/to/script.sh | bash -s arg1 arg2
which is a slight change of the answer by @user77115
You can use curl and send it to bash like this:
bash <(curl -s http://mywebsite.example/myscript.txt)
I often use the following, and it is enough:
curl -s http://mywebsite.example/myscript.txt | sh
But on an old system (kernel 2.4) it ran into problems, and the following solved them. I tried many other approaches; only the following worked:
curl -s http://mywebsite.example/myscript.txt -o a.sh && sh a.sh && rm -f a.sh
Examples
$ curl -s someurl | sh
Starting to insert crontab
sh: _name}.sh: command not found
sh: line 208: syntax error near unexpected token `then'
sh: line 208: ` -eq 0 ]]; then'
$
The problem may be caused by a slow network, or by a bash version too old to handle a slow network gracefully.
However, the following solves the problem
$ curl -s someurl -o a.sh && sh a.sh && rm -f a.sh
Starting to insert crontab
Insert crontab entry is ok.
Insert crontab is done.
okay
$
Also:
curl -sL https://.... | sudo bash -
Just combining amra and user77115's answers:
wget -qO- https://raw.githubusercontent.com/lingtalfi/TheScientist/master/_bb_autoload/bbstart.sh | bash -s -- -v -v
It executes the remote bbstart.sh script, passing it the -v -v options.
In some unattended scripts I use the following command:
sh -c "$(curl -fsSL <URL>)"
I recommend avoiding executing scripts directly from URLs. You should be sure the URL is safe, and you should check the content of the script before executing it; you can use a SHA256 checksum to validate the file first.
Instead of executing the script directly, first download it and then execute it:
SOURCE='https://gist.githubusercontent.com/cci-emciftci/123123/raw/123123/sample.sh'
curl $SOURCE -o ./my_sample.sh
chmod +x my_sample.sh
./my_sample.sh
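A minimal sketch of the SHA256 check mentioned above; the expected hash is a placeholder you would obtain from a trusted source beforehand:
SOURCE='https://gist.githubusercontent.com/cci-emciftci/123123/raw/123123/sample.sh'
EXPECTED_SHA256='<paste the trusted sha256 here>'   # placeholder, not a real hash
curl -s "$SOURCE" -o ./my_sample.sh
# sha256sum -c expects "HASH  FILE" (two spaces) and fails on mismatch
echo "${EXPECTED_SHA256}  ./my_sample.sh" | sha256sum -c - || { echo "checksum mismatch" >&2; exit 1; }
chmod +x my_sample.sh
./my_sample.sh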
This way is good and conventional:
17:04:59#itqx|~
qx>source <(curl -Ls http://192.168.80.154/cent74/just4Test) Lord Jesus Loves YOU
Remote script test...
Param size: 4
---------
17:19:31#node7|/var/www/html/cent74
arch>cat just4Test
echo Remote script test...
echo Param size: $#
If you want the script run using the current shell, regardless of what it is, use:
${SHELL:-sh} -c "$(wget -qO - http://mywebsite.example/myscript.txt)"
if you have wget, or:
${SHELL:-sh} -c "$(curl -Ls http://mywebsite.example/myscript.txt)"
if you have curl.
This command will still work if the script is interactive, i.e., it asks the user for input: because the script text is passed as an argument to -c, standard input stays connected to the terminal, so read works as expected.
Note: OpenWRT has a wget clone but not curl, by default.
curl http://your.url.here/script.txt | bash
actual example:
juan@juan-MS-7808:~$ curl https://raw.githubusercontent.com/JPHACKER2k18/markwe/master/testapp.sh | bash
Oh, wow im alive
juan@juan-MS-7808:~$
I'm trying to write a shell script that calls another script that then executes an rsync command.
The second script should run in its own terminal, so I use a gnome-terminal -e "..." command. One of the parameters of this script is a string containing the parameters that should be given to rsync. I put those into single quotes.
Up until here, everything worked fine until one of the rsync parameters was a directory path that contained a space. I tried numerous combinations of ',",\",\' but the script either doesn't run at all or only the first part of the path is taken.
Here's a slightly modified version of the code I'm using
gnome-terminal -t 'Rsync scheduled backup' -e "nice -10 /Scripts/BackupScript/Backup.sh 0 0 '/Scripts/BackupScript/Stamp' '/Scripts/BackupScript/test' '--dry-run -g -o -p -t -R -u --inplace --delete -r -l '\''/media/MyAndroid/Internal storage'\''' "
Within Backup.sh this command is run
rsync $5 "$path"
where the destination $path is calculated from text in Stamp.
How can I achieve these three levels of nested quotations?
These are some questions I looked at just now (I've tried other sources earlier as well):
https://unix.stackexchange.com/questions/23347/wrapping-a-command-that-includes-single-and-double-quotes-for-another-command
how to make nested double quotes survive the bash interpreter?
Using multiple layers of quotes in bash
Nested quotes bash
I was unsuccessful in applying the solutions to my problem.
Here is an example. caller.sh uses gnome-terminal to execute foo.sh, which in turn prints all the arguments and then calls rsync with the first argument.
caller.sh:
#!/bin/bash
gnome-terminal -t "TEST" -e "./foo.sh 'long path' arg2 arg3"
foo.sh:
#!/bin/bash
echo $# arguments
for i; do # same as: for i in "$@"; do
echo "$i"
done
rsync "$1" "some other path"
Edit: If $1 contains several parameters to rsync, some of which are long paths, the above won't work, since bash either passes "$1" as one parameter, or $1 as multiple parameters, splitting it without regard to contained quotes.
There is (at least) one workaround; you can trick bash as follows:
caller2.sh:
#!/bin/bash
gnome-terminal -t "TEST" -e "./foo.sh '--option1 --option2 \"long path\"' arg2 arg3"
foo2.sh:
#!/bin/bash
rsync_command="rsync $1"
eval "$rsync_command"
This will do the equivalent of typing rsync --option1 --option2 "long path" on the command line.
WARNING: This hack introduces a security vulnerability, $1 can be crafted to execute multiple commands if the user has any influence whatsoever over the string content (e.g. '--option1 --option2 \"long path\"; echo YOU HAVE BEEN OWNED' will run rsync and then execute the echo command).
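If you control both scripts, a sketch of a safer alternative is to pass each rsync option as its own argument and forward them with "$@", avoiding eval entirely (file names and options here are illustrative). This relies on gnome-terminal's -e doing its own shell-style splitting, which honors single quotes:
caller3.sh:
#!/bin/bash
gnome-terminal -t "TEST" -e "./foo3.sh --dry-run --delete 'long path'"
foo3.sh:
#!/bin/bash
# every argument is already a separate word, so no re-parsing is needed
rsync "$@" "some other path"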
Did you try escaping the space in the path with "\ " (no quotes)?
gnome-terminal -t 'Rsync scheduled backup' -e "nice -10 /Scripts/BackupScript/Backup.sh 0 0 '/Scripts/BackupScript/Stamp' '/Scripts/BackupScript/test' '--dry-run -g -o -p -t -R -u --inplace --delete -r -l ''/media/MyAndroid/Internal\ storage''' "
Trying to do a script to download a file using wget, or curl if wget doesn't exist in Linux. How do I have the script check for existence of wget?
Linux has a which command which will check for the existence of an executable on your path:
pax> which ls ; echo $?
/bin/ls
0
pax> which no_such_executable ; echo $?
1
As you can see, it sets the return code $? to easily tell if the executable was found, so you could use something like:
if which wget >/dev/null ; then
    echo "Downloading via wget."
    wget --option argument
elif which curl >/dev/null ; then
    echo "Downloading via curl."
    curl --option argument
else
    echo "Cannot download, neither wget nor curl is available."
fi
wget http://download/url/file 2>/dev/null || curl -O http://download/url/file
One can also use command or type or hash to check if wget/curl exists or not. Another thread here - "Check if a program exists from a Bash script" answers very nicely what to use in a bash script to check if a program exists.
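For example, a sketch using the POSIX command -v builtin instead of which:
if command -v wget >/dev/null 2>&1; then
    wget http://download/url/file
elif command -v curl >/dev/null 2>&1; then
    curl -O http://download/url/file
else
    echo "Neither wget nor curl is available." >&2
    exit 1
fi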
I would do this -
if [ ! -x /usr/bin/wget ] ; then
    # some extra check if wget is not installed at the usual place
    command -v wget >/dev/null 2>&1 || { echo >&2 "Please install wget or set it in your path. Aborting."; exit 1; }
fi
The first thing to do is to try to install wget with your usual package management system; it should tell you if it is already installed:
yum -y install wget
Otherwise, just launch a command like the one below:
wget http://download/url/file
If you receive no error, then it's OK.
A solution taken from the K3S install script (https://raw.githubusercontent.com/rancher/k3s/master/install.sh)
function download {
    url=$1
    filename=$2
    if [ -x "$(which wget)" ] ; then
        wget -q "$url" -O "$filename"
    elif [ -x "$(which curl)" ]; then
        curl -o "$filename" -sfL "$url"
    else
        echo "Could not find curl or wget, please install one." >&2
    fi
}
# to use in the script:
download https://url /local/path/to/download
Explanation:
It looks for the location of wget and checks that an executable exists there; if so, it does a script-friendly (i.e. quiet) download. If wget isn't found, it tries curl in a similarly script-friendly way.
(Note that the question doesn't specify BASH however my answer assumes it.)
Simply run
wget http://download/url/file
and the statistics will show you whether the endpoint is available or not.