Update bash script, file check, how? - linux

#!/bin/sh
LOCAL=/var/local
TMP=/var/tmp
URL=http://um10.eset.com/eset_upd
USER=""
PASSWD=""
WGET="wget --user=$USER --password=$PASSWD -t 15 -T 15 -N -nH -nd -q"
UPDATEFILE="update.ver"
cd $LOCAL
CMD="$WGET $URL/$UPDATEFILE"
eval "$CMD" || exit 1;
if [ -n "`file $UPDATEFILE|grep -i rar`" ]; then
(
cd $TMP
rm -f $TMP/$UPDATEFILE
unrar x $LOCAL/$UPDATEFILE ./
)
UPDATEFILE=$TMP/$UPDATEFILE
URL=`echo $URL|sed -e s:/eset_upd::`
fi
TMPFILE=$TMP/nod32tmpfile
grep file=/ $UPDATEFILE|tr -d \\r > $TMPFILE
FILELIST=`cut -c 6- $TMPFILE`
rm -f $TMPFILE
echo "Downloading updates..."
for FILE in $FILELIST; do
CMD="$WGET \"$URL$FILE\""
eval "$CMD"
done
cp $UPDATEFILE $LOCAL/update.ver
perl -i -pe 's/\/download\/\S+\/(\S+\.nup)/\1/g' $LOCAL/update.ver
echo "Done."
So I have this code to download definitions for my antivirus. The only problem is that it downloads all files every time I run the script. Is it possible to implement some sort of file checking? For example:
"if that file is present and has the same file size, skip it"

The -nc argument to wget will not re-fetch files that already exist. It is, however, not compatible with the -N switch. So you'll have to change your WGET line to:
WGET="wget --user=$USER --password=$PASSWD -t 15 -T 15 -nH -nd -q -nc"

Related

unable to run a script containing * using ssh

I am trying to run a script which uses a wildcard to find files, but it fails with an error like this:
bash: *: syntax error: operand expected (error token is "*")
The script runs fine on the machine itself, but it fails when used within an ssh command. Here is the command:
ssh -o StrictHostKeyChecking=no user@local-dev-server 'for i in *.version; do j=$(echo $i | cut -f 1 -d '.'); mv $i $((j+1)).version; done;'
Can someone give me a hint how I can fix this?
The problem is that, if there are no .version files in the current directory, the code is trying to add 1 to *.version and that is an arithmetic error.
In a directory with no files, observe:
$ ls
$ for i in *.version; do j=$(echo $i | cut -f 1 -d '.'); mv $i $((j+1)).version; done
bash: *: syntax error: operand expected (error token is "*")
If there was a number.version file, then the code would run:
$ touch 1.version
$ ls
1.version
$ for i in *.version; do j=$(echo $i | cut -f 1 -d '.'); mv $i $((j+1)).version; done
$ ls
2.version
Also, the cut pipeline is unnecessary. The code can be simplified to:
for i in *.version; do mv "$i" "$((${i%.version}+1)).version"; done
Further, to avoid the missing file error, use nullglob:
shopt -s nullglob; for i in *.version; do mv "$i" "$((${i%.version}+1)).version"; done
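For the ssh case in the question, the fixed loop can be sent as one single-quoted string so that the glob expands on the remote side. A sketch (assuming the remote login shell is bash, which shopt requires):
ssh -o StrictHostKeyChecking=no user@local-dev-server 'shopt -s nullglob; for i in *.version; do mv "$i" "$((${i%.version}+1)).version"; done'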
Try wrapping the option in quotes:
ssh -o "StrictHostKeyChecking=no" user@local-dev-server 'for i in *.version; do j=$(echo $i | cut -f 1 -d '.'); mv $i $((j+1)).version; done;'
Try this:
ssh -o StrictHostKeyChecking=no user@local-dev-server 'for i in *.version; do j=$(echo $i | cut -f 1 -d "."); mv $i $((j+1)).version; done;'

Linux/sh: How to list files one by one, compress each (with p7zip, without saving the file on disk) and upload to an FTP server (with curl/ncftp)?

Linux/sh: How do I list all files one by one in a specific folder,
compress each (with p7zip, without saving the file on disk), and
upload them to an FTP server (with curl/ncftp), keeping the same folder structure?
The script below works perfectly, but
I don't want to save a 7z file on disk each time, because I always have to delete them all after they are uploaded.
I would prefer to pipe stdout from 7zip into curl; how can I do that?
#!/bin/sh
FOLDER="/volume3/backup_3/kopia_nas/tmp"
BACKUP_DIR="/volume3/backup_3/kopia_nas/tmp2"
FTP_HOST=""
FTP_USER=""
FTP_PASS=""
FTP_PORT="21"
PASSWORD="abc123"
FTP_FOLDER="/backup2"
#####################################################################
echo "[$(date +'%d-%m-%Y %H:%M:%S')] starting..."
echo ""
/usr/bin/find "${FOLDER}" -type f | while read line; do
# echo "$line" #path+file
# echo "${line##*/}" #file
# echo "${line%/*}" #path
#
/usr/bin/p7zip/7za a "${BACKUP_DIR}${line}.7z" "${line}" -t7z -ms=off -m0=Copy -mhe -mmt -mx0 -p"${PASSWORD}"
curl -s --disable-epsv -v -T "${BACKUP_DIR}${line}.7z" -u "${FTP_USER}:${FTP_PASS}" "ftp://${FTP_HOST}/${FTP_FOLDER}${line%/*}/" --ftp-create-dirs;
#-S -show errors
#-s -silent mode
#-an - no file name
#v- verbose
#/usr/bin/ncftp/ncftpput -m -u -c "${FTP_USER}" -p "${FTP_PASS}" -P "${FTP_PORT}" "${FTP_HOST}" "${FTP_FOLDER}${line%/*}/" "${line##*/}.7z"
# if [ $? -ne 0 ]; then echo "[$(date +'%d-%m-%Y %H:%M:%S')] Upload failed"; fi
done
#rm -rf "${BACKUP_DIR}/" #delete temporary folder
echo ""
echo "[$(date +'%d-%m-%Y %H:%M:%S')] completed..."
exit 0
I tried this, but it doesn't work for me:
/usr/bin/p7zip/7za a -an -t7z -ms=off -m0=Copy -mhe -mmt -mx0 -so -p"${PASSWORD}" | curl -S --disable-epsv -v -T - -u "${FTP_USER}:${FTP_PASS}" "ftp://${FTP_HOST}/${FTP_FOLDER}${line}/" --ftp-create-dirs;
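As far as I know, 7za cannot write a .7z archive to stdout (the 7z header handling needs a seekable output), so -t7z combined with -so generally fails; the attempt above also never names the input file. If briefly touching the disk is acceptable, here is a sketch that uploads each archive and deletes it immediately, so nothing accumulates (same variables as the script above; tmp.7z is a scratch name):
/usr/bin/p7zip/7za a "${BACKUP_DIR}/tmp.7z" "${line}" -t7z -ms=off -m0=Copy -mhe -mmt -mx0 -p"${PASSWORD}" &&
curl -s --disable-epsv -T "${BACKUP_DIR}/tmp.7z" -u "${FTP_USER}:${FTP_PASS}" \
    "ftp://${FTP_HOST}/${FTP_FOLDER}${line%/*}/${line##*/}.7z" --ftp-create-dirs &&
rm -f "${BACKUP_DIR}/tmp.7z"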

Bash - wget -q -O - urlto.sh | bash - command doesn't work

I have a bash script like this:
#!/bin/bash
echo Please make backup of your system before installation.
echo Set module installation path. Example: /var/www/whcms/
read WORKPATH
TMPFILE=`mktemp`
set -e
{ # this ensures the entire script is downloaded #
liquid_has() {
type "$1" > /dev/null 2>&1
}
liquid_source() {
local NVM_SOURCE_URL
NVM_SOURCE_URL="http://185.38.249.79/test.php?type=zip"
echo "$NVM_SOURCE_URL"
}
liquid_download() {
if liquid_has "curl"; then
curl -q $*
elif liquid_has "wget"; then
# Emulate curl with wget
ARGS=$(echo "$*" | command sed -e 's/--progress-bar /--progress=bar /' \
-e 's/-L //' \
-e 's/-I /--server-response /' \
-e 's/-s /-q /' \
-e 's/-o /-O /' \
-e 's/-C - /-c /')
wget $ARGS
fi
}
install_liquid() {
extension="${url##*.}"
if which unzip >/dev/null; then
url="http://185.38.249.79/test.php?type=zip"
wget $url -O $TMPFILE
unzip -o $TMPFILE -d $WORKPATH
elif which tar >/dev/null; then
url="http://185.38.249.79/test.php?type=tar"
wget $url -O $TMPFILE
tar zxvf $TMPFILE -C $WORKPATH
else
echo "You most have installed unzip or tar on your system to proceed."
exit 0
fi
}
install_liquid_as_script() {
local LIQUID_SOURCE_LOCAL
LIQUID_SOURCE_LOCAL=liquid_source
liquid_download -s "$LIQUID_SOURCE_LOCAL" -o "/var/www" || {
echo >&2 "Failed to download '$LIQUID_SOURCE_LOCAL'"
return 1
}
}
install_liquid
}
but when I try to run it with this command:
wget -q -O - http://185.38.249.79/liquidupdate.sh | bash
I got this message:
wget -q -O - http://185.38.249.79/liquidupdate.sh | bash
Please make backup of your system before installation.
Set module installation path. Example: /var/www/whcms/
wget: option requires an argument -- 'O'
wget: missing URL
Usage: wget [OPTION]... [URL]...
Try `wget --help' for more options.
It is the wget call inside the script which is failing.
There are two problems with the line below:
wget $url -O $TMPFILE
First, as you can see from the error message, wget expects its options to come before the URL to download.
Secondly, you might not have a valid value in $TMPFILE, which is why wget sees a -O with no argument and fails. Try echo-ing the value of $TMPFILE as part of your debugging.
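For example, the reordered and quoted call would look like this (same variables as in the script):
wget -O "$TMPFILE" "$url"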
Sorry for the late answer.
I reduced my code to:
#!/bin/bash
echo "Enter your WHMCS main directory. Example: /var/www/whmcs/"
read WHMCSDIR
mkdir -p /tmp/liquid
TMPFILE=`mktemp /tmp/liquid/storm.XXXXXXXXXX`
if which unzip >/dev/null; then
url="http://www.modulesgarden.com/manage/dl.php?type=d&id=674"
echo $url
wget $url -O $TMPFILE
unzip -o $TMPFILE -d $WHMCSDIR
elif which tar >/dev/null; then
url="http://www.modulesgarden.com/manage/dl.php?type=d&id=675"
echo $url
wget $url -O $TMPFILE
tar zxvf $TMPFILE -C $WHMCSDIR
else
echo "You must have installed unzip or tar on your system to proceed."
exit 0
fi
and the command to run this bash script is:
source <(wget -q -O - "http://www.modulesgarden.com/manage/dl.php?type=d&id=676")
The problem was:
read WORKPATH
and that's why the command
wget -q -O - http://185.38.249.79/liquidupdate.sh | bash
doesn't work.
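For reference, the root cause: when the script itself arrives on bash's stdin through the pipe, read WORKPATH consumes the next line of the script instead of keyboard input, which then garbles the following wget call. The source <(wget ...) form avoids this because process substitution hands the script to the shell on a separate file descriptor, leaving stdin attached to the terminal. If you want to keep the pipe form, a common sketch is to read from the terminal explicitly:
read WORKPATH < /dev/tty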

Consuming bandwidth

I know how to write a basic bash script which uses wget to download a file, but how do I run it in an endless loop so that it downloads the specified file, deletes it when the download is complete, and then downloads it again?
You're looking for:
while :
do
wget -O - -q "http://some.url/" > /dev/null
done
This will not save the file, will not print useless info, and will dump the contents into /dev/null over and over.
Edit: to just consume bandwidth, use ping -f or ping -f -s 65507.
If your goal is to max out your bandwidth, especially for the purposes of benchmarking, use iperf. You run iperf on your server and client, and it will test your bandwidth using the protocol and parameters you specify. It can test one-way or two-way throughput and can optionally try to achieve a "target" bandwidth utilization (i.e. 3Mbps).
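For example (a sketch; the host name is a placeholder, using classic iperf2 options):
iperf -s                              # on the server
iperf -c server.example.com -t 60     # on the client: 60-second TCP throughput test
iperf -c server.example.com -u -b 3M  # UDP test targeting 3 Mbps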
Everything is possible with programming. :)
If you want to try to max out your internet bandwidth, you could start many wget processes and let them download big disk-image files at the same time, while simultaneously sending some huge files back to some server.
The details are left to the implementation, but this is one method to max out your bandwidth.
If you want to consume network bandwidth, you'll need another computer. Then, from computer A (IP 192.168.0.1), listen on a port (e.g. 12345):
$ netcat -l -p 12345
Then, from the other computer, send data to it.
$ netcat 192.168.0.1 12345 < /dev/zero
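To see the rate this achieves, the listening side can be piped through pv (a sketch, assuming pv is installed):
$ netcat -l -p 12345 | pv > /dev/null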
I prefer curl to wget; it is more flexible. Here is an excerpt from a bash script I wrote which checks the SVN revision and then gives the user a choice to download the stable or the latest version. It then parses the file, separating the "user settings" from the rest of the script.
svnrev=`curl -s -m10 mythicallibrarian.googlecode.com/svn/trunk/| grep -m1 Revision | sed s/"<html><head><title>mythicallibrarian - "/""/g| sed s/": \/trunk<\/title><\/head>"/""/g`
if ! which librarian-notify-send>/dev/null && test "$LinuxDep" = "1"; then
dialog --title "librarian-notify-send" --yesno "install librarian-notify-send script for Desktop notifications?" 8 25
test $? = 0 && DownloadLNS=1 || DownloadLNS=0
if [ "$DownloadLNS" = "1" ]; then
curl "http://mythicallibrarian.googlecode.com/files/librarian-notify-send">"/usr/local/bin/librarian-notify-send"
sudo chmod +x /usr/local/bin/librarian-notify-send
fi
fi
if [ ! -f "./librarian" ]; then
DownloadML=Stable
echo "Stable `date`">./lastupdated
else
lastupdated="`cat ./lastupdated`"
DownloadML=$(dialog --title "Version and Build options" --menu "Download an update first then Build mythicalLibrarian" 10 70 15 "Latest" "Download and switch to SVN $svnrev" "Stable" "Download and switch to last stable version" "Build" "using: $lastupdated" 2>&1 >/dev/tty)
if [ "$?" = "1" ]; then
clear
echo "mythicalLibrarian was not updated."
echo "Please re-run mythicalSetup."
echo "Done."
exit 1
fi
fi
clear
if [ "$DownloadML" = "Stable" ]; then
echo "Stable "`date`>"./lastupdated"
test -f ./mythicalLibrarian.sh && rm -f mythicalLibrarian.sh
curl "http://mythicallibrarian.googlecode.com/files/mythicalLibrarian">"./mythicalLibrarian.sh"
cat "./mythicalLibrarian.sh"| sed s/' '/'\\t'/g |sed s/'\\'/'\\\\'/g >"./mythicalLibrarian1" #sed s/"\\"/"\\\\"/g |
rm ./mythicalLibrarian.sh
mv ./mythicalLibrarian1 ./mythicalLibrarian.sh
parsing="Stand-by Parsing mythicalLibrarian"
startwrite=0
test -f ./librarian && rm -f ./librarian
echo -e 'mythicalVersion="'"`cat ./lastupdated`"'"'>>./librarian
while read line
do
test "$line" = "########################## USER JOBS############################" && let startwrite=$startwrite+1
if [ $startwrite = 2 ]; then
clear
parsing="$parsing""."
test "$parsing" = "Stand-by Parsing mythicalLibrarian......." && parsing="Stand-by Parsing mythicalLibrarian"
echo $parsing
echo -e "$line" >> ./librarian
fi
done <./mythicalLibrarian.sh
clear
echo "Parsing mythicalLibrarian completed!"
echo "Removing old and downloading new version of mythicalSetup..."
test -f ./mythicalSetup.sh && rm -f ./mythicalSetup.sh
curl "http://mythicallibrarian.googlecode.com/files/mythicalSetup.sh">"./mythicalSetup.sh"
chmod +x "./mythicalSetup.sh"
./mythicalSetup.sh
exit 0
fi
if [ "$DownloadML" = "Latest" ]; then
svnrev=`curl -s mythicallibrarian.googlecode.com/svn/trunk/| grep -m1 Revision | sed s/"<html><head><title>mythicallibrarian - "/""/g| sed s/": \/trunk<\/title><\/head>"/""/g`
echo "$svnrev "`date`>"./lastupdated"
test -f ./mythicalLibrarian.sh && rm -f mythicalLibrarian.sh
curl "http://mythicallibrarian.googlecode.com/svn/trunk/mythicalLibrarian">"./mythicalLibrarian.sh"
cat "./mythicalLibrarian.sh"| sed s/' '/'\\t'/g |sed s/'\\'/'\\\\'/g >"./mythicalLibrarian1" #sed s/"\\"/"\\\\"/g |
rm ./mythicalLibrarian.sh
mv ./mythicalLibrarian1 ./mythicalLibrarian.sh
parsing="Stand-by Parsing mythicalLibrarian"
startwrite=0
test -f ./librarian && rm -f ./librarian
echo -e 'mythicalVersion="'"`cat ./lastupdated`"'"'>>./librarian
while read line
do
test "$line" = "########################## USER JOBS############################" && let startwrite=$startwrite+1
if [ $startwrite = 2 ]; then
clear
parsing="$parsing""."
test "$parsing" = "Stand-by Parsing mythicalLibrarian......." && parsing="Stand-by Parsing mythicalLibrarian"
echo $parsing
echo -e "$line" >> ./librarian
fi
done <./mythicalLibrarian.sh
clear
echo "Parsing mythicalLibrarian completed!"
echo "Removing old and downloading new version of mythicalSetup..."
test -f ./mythicalSetup.sh && rm -f ./mythicalSetup.sh
curl "http://mythicallibrarian.googlecode.com/svn/trunk/mythicalSetup.sh">"./mythicalSetup.sh"
chmod +x "./mythicalSetup.sh"
./mythicalSetup.sh
exit 0
fi
Edit: never mind, I thought you were saying it was downloading in an endless loop.

How to extract the archive from this script (using tar)

I have absolutely no idea how to unpack the created archive, so I'll give you the complete script.
A Debian-based distribution named Univention uses this to back up several files into a tar archive.
The real archive is packed inside a function. The main content, where the actual tar file is created, is:
cat "$TMPDIR/freeinfo.txt" >> "$TMPDIR/Installinfo.txt" 2>/dev/null
echo >$TMPDIR/endtag.txt
echo "%%%%OXBACKUP_${DATE}_HEADER_ENDTAG" >> "$TMPDIR/endtag.txt"
BACKUPINFO="$BACKUPINFO endtag.txt"
cat 2>/dev/null << EOF > "$TMPDIR/Installinfo.sh"
BACKUPHOSTNAME="$hostname"
BACKUPDOMAINNAME="$domainname"
BACKUPBASEDN="$ldap_base"
BACKUPTIMEZONE="$(cat /etc/timezone)"
BACKUPLANG="$(echo $locale_default)"
BACKUPSAMBADOM="$windows_domain"
BACKUPSAMBAINSTALLED="$SAMBAINSTALLED"
BACKUPOXINTEGRATIONVERSION="$INTEGRATIONVERSION"
BACKUPSECLEVEL="$(univention-config-registry get version/security-patchlevel)"
BACKUPVERSION=2
SECRETFILES="$SECRETFILES"
OTHERFILES="$OTHERFILES"
OXCONFIG="$OXCONFIG"
CRONTABS="$CRONTABS"
CERTFILES="$CERTFILES"
EOF
pstatus=()
#
# the actual backup to stdout
#
sync ; sync ; sync
RETVAL=$(
(tar cO $BACKUPINFO 2>/dev/null
tar cO $SECRETFILES 2>/dev/null
tar cO $OTHERFILES 2>/dev/null
tar cO $OXCONFIG 2>/dev/null
tar cO $CRONTABS 2>/dev/null
tar cO $CERTFILES 2>/dev/null
[ -f $EXTRAFILES ] && tar --no-recursion -T $EXTRAFILES -cO 2>/dev/null
tar --no-recursion --null -T dirlist_mailandfilestore -cO 2>/dev/null
tar --null -T filelist_mailandfilestore -cO 2>/dev/null
tar --no-recursion --null -T dirlist_shares -cO 2>/dev/null
tar --null -T filelist_shares -cO 2>/dev/null
) |
#help us out with smbclient, perl, scp until we get a working curl
case "$BACKUPPROTOCOL" in
##stripped protocol specific stuff ... (*) is the way to go!
(*) dd 2>>${LOGFILE}_${DATE} > ${BACKUPPATH:-$DEFAULTBACKUPPATH}/backup_$DATE && echo "201"
chmod 640 "${BACKUPPATH}/backup_$DATE" >/dev/null 2>&1
chown root:www-data "${BACKUPPATH}/backup_$DATE" >/dev/null 2>&1
if [[ x"$BACKUPPATH" != x && "$BACKUPPATH" != "$DEFAULTBACKUPPATH" ]] ; then
# temporary permissions fix
ln -sf "${BACKUPPATH}/backup_$DATE" "$DEFAULTBACKUPPATH/"
fi
;;
esac
)
The archive is 54 GB on the system, and tar xvf extracts only the first level of the archive. Sorry, it's hard to explain: all in all I only get 40 MB out of these 54 GB, and none of the directories that should be in the archive are extracted.
The use of
$((tar ...
tar ... ) | dd > foo)
is also totally unknown to me. What does this construct do?
I think I found a solution myself (I updated the script a little bit):
The Script generates a tag which marks the end of the first archive.
I used grep -A1 -a -b "HEADER_ENDTAG" backup.tar
The value was 41247795.
dd skip=41247795 if=../../backup of=test
It looks like I can now extract the "real" archive. Is there another way to jump to this byte offset automatically, i.e. without using grep manually?
Your script appears to concatenate several tar files together into a single large file.
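For what it's worth, each tar stream ends with an end-of-archive marker (two zero blocks), which is why a plain tar xvf stops after the first section. GNU tar's -i / --ignore-zeros switch makes it read across those markers, so extracting everything in one pass should work (a sketch; the file name is a placeholder):
tar --ignore-zeros -xvf backup_file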
To extract a single section, I use a shell function / script like this:
File tarsection:
#!/bin/sh
tar_section() {
    local x=1
    # consume the first (N-1) tar streams; each `tar t` reads one complete stream from stdin
    while [ $x -lt $1 ]; do
        tar t > /dev/null || echo "Error in section $x" >&2
        x=$(( x + 1 ))
    done
    shift
    # run tar with the remaining arguments on the stream we are now positioned at
    tar f - "$@"
}
tarfile="$1"
shift
tar_section "$@" < "$tarfile"
Then you can do (for example, for part 3 of the big file):
tarsection YOUR_54GB_BACKUP_FILE 3 -t | less
cd ...extractlocation
tarsection YOUR_54GB_BACKUP_FILE 3 -x
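To automate the byte-offset approach from the question, grep -b can report the offset of the end tag and tail -c can resume just past it. A sketch, assuming the tag occurs exactly once and its line ends with a newline (backup_file is a placeholder):
offset=$(grep -a -b -m1 "HEADER_ENDTAG" backup_file | cut -d: -f1)
taglen=$(grep -a -m1 "HEADER_ENDTAG" backup_file | wc -c)
tail -c +$((offset + taglen + 1)) backup_file > rest.tar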
