I need to get the exit code of an FTP transfer. My command line is:
wget -N ftp://server:pass@server/path/
Using:
if [ $? -ne 0 ]; then
will check whether the wget execution succeeded.
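A minimal sketch (capturing $? immediately, since the next command resets it; the URL is a placeholder):
wget -N "ftp://server:pass@server/path/"
rc=$?                               # save the exit code before anything else overwrites $?
if [ "$rc" -ne 0 ]; then
    echo "wget failed with exit code $rc" >&2
fi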
Untested, but you could also scan wget's output for error strings:
wget [wget options] 2>&1 | grep -i "failed\|error"
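Note that after the pipe, $? reflects grep's exit status, not wget's; in bash you can still recover wget's status from PIPESTATUS. A minimal sketch ($url stands for the URL to fetch):
wget "$url" 2>&1 | grep -i "failed\|error"
echo "wget exited with ${PIPESTATUS[0]}"    # bash-specific; plain sh has no PIPESTATUS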
I am trying to write a shell script to identify broken urls from a list of urls.
Here is a sample of input_url.csv:
https://www.google.com/
https://www.nbc.com
https://www.google.com.hksjkhkh/
https://www.google.co.jp/
https://www.google.ca/
Here is what I have which works:
wget --spider -nd -nv -H --max-redirect 0 -o run.log -i input_url.csv
and this gives me '2019-09-03 19:48:37 URL: https://www.nbc.com 200 OK' for valid URLs, while for broken ones it gives me '0 redirections exceeded.'
What I want is to save only the broken links to my output file.
Sample expected output:
https://www.google.com.hksjkhkh/
I think I would go with:
<input.csv xargs -n1 -P10 sh -c 'wget --spider --quiet "$1" || echo "$1"' --
You can use the -P <count> option of xargs to run <count> processes in parallel.
xargs runs the command sh -c '....' -- once for each line of the input file, appending that line as an argument to the script.
Then the inner sh runs wget --spider --quiet "$1". The || checks whether the return status is nonzero, which means failure; on wget failure, echo "$1" is executed.
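Since the asker wants the broken links saved to a file, the same pipeline can simply be redirected (the output file name here is my own choice):
<input_url.csv xargs -n1 -P10 sh -c 'wget --spider --quiet "$1" || echo "$1"' -- > broken_urls.csv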
You could filter the output of wget -nd -nv and then regex it, like so:
wget --spider -nd -nv -H --max-redirect 0 -i input 2>&1 | grep -v '200 OK' | grep 'unable' | sed 's/.* .//; s/.$//'
but this doesn't look extensible, isn't parallel (so it's probably slower), and is probably not worth the hassle.
I have a bash script like this:
#!/bin/bash
echo Please make backup of your system before installation.
echo Set module installation path. Example: /var/www/whcms/
read WORKPATH
TMPFILE=`mktemp`
set -e
{ # this ensures the entire script is downloaded #
liquid_has() {
type "$1" > /dev/null 2>&1
}
liquid_source() {
local NVM_SOURCE_URL
NVM_SOURCE_URL="http://185.38.249.79/test.php?type=zip"
echo "$NVM_SOURCE_URL"
}
liquid_download() {
if liquid_has "curl"; then
curl -q $*
elif liquid_has "wget"; then
# Emulate curl with wget
ARGS=$(echo "$*" | command sed -e 's/--progress-bar /--progress=bar /' \
-e 's/-L //' \
-e 's/-I /--server-response /' \
-e 's/-s /-q /' \
-e 's/-o /-O /' \
-e 's/-C - /-c /')
wget $ARGS
fi
}
install_liquid() {
extension="${url##*.}"
if which unzip >/dev/null; then
url="http://185.38.249.79/test.php?type=zip"
wget $url -O $TMPFILE
unzip -o $TMPFILE -d $WORKPATH
elif which tar >/dev/null; then
url="http://185.38.249.79/test.php?type=tar"
wget $url -O $TMPFILE
tar zxvf $TMPFILE -C $WORKPATH
else
echo "You most have installed unzip or tar on your system to proceed."
exit 0
fi
}
install_liquid_as_script() {
local LIQUID_SOURCE_LOCAL
LIQUID_SOURCE_LOCAL=liquid_source
liquid_download -s "$LIQUID_SOURCE_LOCAL" -o "/var/www" || {
echo >&2 "Failed to download '$LIQUID_SOURCE_LOCAL'"
return 1
}
}
install_liquid
}
but when I try to run it with this command:
wget -q -O - http://185.38.249.79/liquidupdate.sh | bash
I got this message:
wget -q -O - http://185.38.249.79/liquidupdate.sh | bash
Please make backup of your system before installation.
Set module installation path. Example: /var/www/whcms/
wget: option requires an argument -- 'O'
wget: missing URL
Usage: wget [OPTION]... [URL]...
Try `wget --help' for more options.
It is the wget call inside the script which is failing.
You have two problems with the line below:
wget $url -O $TMPFILE
First, as you can see from the error message, wget expects its options to come before the URL to download.
Secondly, you might not have a valid value in $TMPFILE, which is why wget sees a -O with no argument and fails. You should try echo-ing the value of $TMPFILE as part of your debugging.
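A corrected call would put the options first and quote both variables; echoing the temp file path helps while debugging (a sketch of the fix, not the full script):
echo "TMPFILE=$TMPFILE"        # debug: confirm mktemp actually produced a path
wget -O "$TMPFILE" "$url"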
Sorry for the late answer.
I reduced my code to:
#!/bin/bash
echo "Enter your WHMCS main directory. Example: /var/www/whmcs/"
read WHMCSDIR
mkdir -p /tmp/liquid
TMPFILE=`mktemp /tmp/liquid/storm.XXXXXXXXXX`
if which unzip >/dev/null; then
url="http://www.modulesgarden.com/manage/dl.php?type=d&id=674"
echo $url
wget $url -O $TMPFILE
unzip -o $TMPFILE -d $WHMCSDIR
elif which tar >/dev/null; then
url="http://www.modulesgarden.com/manage/dl.php?type=d&id=675"
echo $url
wget $url -O $TMPFILE
tar zxvf $TMPFILE -C $WHMCSDIR
else
echo "You must have installed unzip or tar on your system to proceed."
exit 0
fi
and the command to run this bash script is:
source <(wget -q -O - "http://www.modulesgarden.com/manage/dl.php?type=d&id=676")
The problem was:
read WORKPATH
When the script itself arrives on bash's stdin, read consumes the next line of the script instead of the user's input, and that's why the command
wget -q -O - http://185.38.249.79/liquidupdate.sh | bash
doesn't work.
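If you want to keep the pipe-to-bash invocation and still prompt the user, one common workaround (my sketch, not part of the original answer) is to read from the terminal directly:
read WORKPATH < /dev/tty    # read from the controlling terminal, not from the piped-in script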
I'm struggling with a problem in Linux bash.
I want a script to execute a command
curl -s --head http://myurl/ | head -n 1
and if the result of the command contains 200, execute another command; otherwise echo something.
What I have now:
CURLCHECK=curl -s --head http://myurl | head -n 1
if [[ $($CURLCHECK) =~ "200" ]]
then
echo "good"
else
echo "bad"
fi
The script prints:
HTTP/1.1 200 OK
bad
I tried many ways but none of them seems to work.
Can someone help me?
I would do something like this:
if curl -s --head http://myurl | head -n 1 | grep "200" >/dev/null 2>&1; then
echo good
else
echo bad
fi
You need to actually capture the output from the curl command:
CURLCHECK=$(curl -s --head http://myurl | head -n 1)
I'm surprised you're not getting a "-s: command not found" error.
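With the command substitution in place, the original test then works as intended:
CURLCHECK=$(curl -s --head http://myurl | head -n 1)
if [[ $CURLCHECK =~ "200" ]]; then
    echo "good"
else
    echo "bad"
fi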
You can use this curl command with -w "%{http_code}" to just get http status code:
[[ $(curl -s -w "%{http_code}" -A "Chrome" -L "http://myurl/" -o /dev/null) == 200 ]] &&
echo "good" || echo "bad"
Using wget:
if wget -O /dev/null your_url 2>&1 | grep -F HTTP >/dev/null 2>&1 ;then echo good;else echo bad; fi
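A variant that leans on wget's exit status instead of scraping its output (a sketch; --spider checks the URL without downloading the body):
if wget --spider -q http://myurl/; then echo good; else echo bad; fi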
I am writing a shell script that generates a CSV file and then attaches and sends it with the mutt command on Linux. The problem is that the CSV file is not yet generated when the mutt command executes, so mutt says the file is not found. Is there any way to check that the CSV-generation command has completed, and only then execute the mutt command? Below are the contents of my script; the two statements are executed one after the other.
mysql --user=root --password= erpint -B -e "select * from user_info;" | sed "s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g" > /home/mayuri/detail.csv
mutt -s "Mutt attach" srini@erpint.com -a /home/mayuri/detail.csv < /home/mayuri/detail.csv
Using bash, to check if the file exists:
#generate your file ....
if [ ! -f /YourPathToTheFile/yourFile.txt ];
then
    echo "no file found, exiting and doing nothing";
    exit 1
fi
#send your file
So, literally, wait for the file to exist:
while [ ! -f /home/mayuri/detail.csv ]; do
sleep 1
done
mutt -s "Mutt attach" srini@erpint.com -a /home/mayuri/detail.csv < /home/mayuri/detail.csv
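If there is any chance the file never appears, a bounded wait avoids looping forever (a sketch; the 30-second limit is arbitrary):
tries=0
while [ ! -f /home/mayuri/detail.csv ] && [ "$tries" -lt 30 ]; do
    sleep 1
    tries=$((tries + 1))
done
[ -f /home/mayuri/detail.csv ] || { echo "CSV never appeared" >&2; exit 1; }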
You can use $?, which is set to 0 if the previous command executed successfully, or non-zero if it failed:
mysql --user=root --password= erpint -B -e "select * from user_info;" | sed "s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g" > /home/mayuri/detail.csv
if [ $? -eq 0 ]; then
mutt -s "Mutt attach" srini@erpint.com -a /home/mayuri/detail.csv < /home/mayuri/detail.csv
fi
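One caveat: after a pipeline, $? is the exit status of the last command (sed here), so a mysql failure would slip through unnoticed. In bash, set -o pipefail makes the pipeline report failure if any stage fails:
set -o pipefail    # the pipeline now fails if mysql fails, not only sed
mysql --user=root --password= erpint -B -e "select * from user_info;" | sed "s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g" > /home/mayuri/detail.csv
if [ $? -eq 0 ]; then
    mutt -s "Mutt attach" srini@erpint.com -a /home/mayuri/detail.csv < /home/mayuri/detail.csv
fi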
Trying to do a script to download a file using wget, or curl if wget doesn't exist in Linux. How do I have the script check for existence of wget?
Linux has a which command which will check for the existence of an executable on your path:
pax> which ls ; echo $?
/bin/ls
0
pax> which no_such_executable ; echo $?
1
As you can see, it sets the return code $? to easily tell if the executable was found, so you could use something like:
if which wget >/dev/null ; then
echo "Downloading via wget."
wget --option argument
elif which curl >/dev/null ; then
echo "Downloading via curl."
curl --option argument
else
echo "Cannot download, neither wget nor curl is available."
fi
wget http://download/url/file 2>/dev/null || curl -O http://download/url/file
One can also use command, type, or hash to check whether wget/curl exists. Another thread here, "Check if a program exists from a Bash script", explains very nicely what to use in a bash script for this.
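For example, a POSIX-friendly check with command -v, which avoids the portability quirks of which (a sketch; $url stands for the file to fetch):
if command -v wget >/dev/null 2>&1; then
    wget "$url"
elif command -v curl >/dev/null 2>&1; then
    curl -O "$url"
else
    echo "Neither wget nor curl is installed." >&2
    exit 1
fi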
I would do this -
if [ ! -x /usr/bin/wget ] ; then
# some extra check if wget is not installed at the usual place
command -v wget >/dev/null 2>&1 || { echo >&2 "Please install wget or set it in your path. Aborting."; exit 1; }
fi
The first thing to do is to try to install wget with your usual package management system; it should tell you if it is already installed:
yum -y install wget
Otherwise just launch a command like below
wget http://download/url/file
If you receive no error, then it's OK.
A solution taken from the K3S install script (https://raw.githubusercontent.com/rancher/k3s/master/install.sh)
function download {
    url=$1
    filename=$2
    if [ -x "$(which wget)" ]; then
        wget -q "$url" -O "$filename"
    elif [ -x "$(which curl)" ]; then
        curl -o "$filename" -sfL "$url"
    else
        echo "Could not find curl or wget, please install one." >&2
    fi
}
# to use in the script:
download https://url /local/path/to/download
Explanation:
It looks for the location of wget and checks that an executable exists there; if so, it does a script-friendly (i.e. quiet) download. If wget isn't found, it tries curl in a similarly script-friendly way.
(Note that the question doesn't specify bash; however, my answer assumes it.)
Simply run
wget http://download/url/file
and the statistics will show whether the endpoint is available or not.