Consuming bandwidth - Linux

I know how to write a basic bash script that uses wget to download a file, but how do I run this in an endless loop so that it downloads the specified file, deletes it when the download is complete, and then downloads it again?

You're looking for:
while :
do
wget -O - -q "http://some.url/" > /dev/null
done
This will not save the file, won't print useless info, and dumps the contents into /dev/null over and over again.
Edit: to just consume bandwidth, use ping -f or ping -f -s 65507.

If your goal is to max out your bandwidth, especially for benchmarking purposes, use iperf. You run iperf on your server and on your client, and it will test your bandwidth using the protocol and parameters you specify. It can test one-way or two-way throughput and can optionally try to achieve a "target" bandwidth utilization (e.g. 3 Mbps).
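For example, with the classic iperf2 command line (treat this as a sketch; exact flags vary between iperf versions, and server.example.com is a placeholder):
# on the server
iperf -s
# on the client: 30-second TCP throughput test
iperf -c server.example.com -t 30
# UDP test aiming for a target rate of 3 Mbit/s
iperf -c server.example.com -u -b 3M -t 30
# bidirectional (two-way) test
iperf -c server.example.com -d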

Everything is possible with programming. :)
If you want to try to max out your internet bandwidth, you could start many, many wget processes and let them download some big disk image files at the same time, while also sending some huge files back to some server.
The details are left to the implementation, but this is one way to max out your bandwidth.
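A rough bash sketch of that idea (the URLs are placeholders; you'd point them at real large files and an upload target you control):
#!/bin/bash
DOWNLOAD_URL="http://example.com/big-disk-image.iso"   # placeholder URL
UPLOAD_URL="ftp://example.com/upload/incoming"          # placeholder URL
# ten parallel download loops, discarding the data
for i in $(seq 1 10); do
  while :; do wget -O - -q "$DOWNLOAD_URL" > /dev/null; done &
done
# push data upstream at the same time (assumes the server accepts uploads)
while :; do curl -s -T /dev/zero "$UPLOAD_URL" > /dev/null; done &
wait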

If you want to consume network bandwidth, you'll need another computer. On computer A (IP 192.168.0.1), listen on a port (e.g. 12345):
$ netcat -l -p 12345
Then, from the other computer, send data to it.
$ netcat 192.168.0.1 12345 < /dev/zero
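If pv is installed, you can also watch the throughput on the receiving end:
$ netcat -l -p 12345 | pv > /dev/null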

I prefer curl to wget; it is more flexible to script. Here is an excerpt from a bash script I wrote which checks the SVN revision and then gives the user a choice to download either the stable or the latest version. It then parses the downloaded file, separating the "user settings" from the rest of the script.
svnrev=`curl -s -m10 mythicallibrarian.googlecode.com/svn/trunk/| grep -m1 Revision | sed s/"<html><head><title>mythicallibrarian - "/""/g| sed s/": \/trunk<\/title><\/head>"/""/g`
if ! which librarian-notify-send>/dev/null && test "$LinuxDep" = "1"; then
dialog --title "librarian-notify-send" --yesno "install librarian-notify-send script for Desktop notifications?" 8 25
test $? = 0 && DownloadLNS=1 || DownloadLNS=0
if [ "$DownloadLNS" = "1" ]; then
curl "http://mythicallibrarian.googlecode.com/files/librarian-notify-send">"/usr/local/bin/librarian-notify-send"
sudo chmod +x /usr/local/bin/librarian-notify-send
fi
fi
if [ ! -f "./librarian" ]; then
DownloadML=Stable
echo "Stable `date`">./lastupdated
else
lastupdated="`cat ./lastupdated`"
DownloadML=$(dialog --title "Version and Build options" --menu "Download an update first then Build mythicalLibrarian" 10 70 15 "Latest" "Download and switch to SVN $svnrev" "Stable" "Download and switch to last stable version" "Build" "using: $lastupdated" 2>&1 >/dev/tty)
if [ "$?" = "1" ]; then
clear
echo "mythicalLibrarian was not updated."
echo "Please re-run mythicalSetup."
echo "Done."
exit 1
fi
fi
clear
if [ "$DownloadML" = "Stable" ]; then
echo "Stable "`date`>"./lastupdated"
test -f ./mythicalLibrarian.sh && rm -f mythicalLibrarian.sh
curl "http://mythicallibrarian.googlecode.com/files/mythicalLibrarian">"./mythicalLibrarian.sh"
cat "./mythicalLibrarian.sh"| sed s/' '/'\\t'/g |sed s/'\\'/'\\\\'/g >"./mythicalLibrarian1" #sed s/"\\"/"\\\\"/g |
rm ./mythicalLibrarian.sh
mv ./mythicalLibrarian1 ./mythicalLibrarian.sh
parsing="Stand-by Parsing mythicalLibrarian"
startwrite=0
test -f ./librarian && rm -f ./librarian
echo -e 'mythicalVersion="'"`cat ./lastupdated`"'"'>>./librarian
while read line
do
test "$line" = "########################## USER JOBS############################" && let startwrite=$startwrite+1
if [ $startwrite = 2 ]; then
clear
parsing="$parsing""."
test "$parsing" = "Stand-by Parsing mythicalLibrarian......." && parsing="Stand-by Parsing mythicalLibrarian"
echo $parsing
echo -e "$line" >> ./librarian
fi
done <./mythicalLibrarian.sh
clear
echo "Parsing mythicalLibrarian completed!"
echo "Removing old and downloading new version of mythicalSetup..."
test -f ./mythicalSetup.sh && rm -f ./mythicalSetup.sh
curl "http://mythicallibrarian.googlecode.com/files/mythicalSetup.sh">"./mythicalSetup.sh"
chmod +x "./mythicalSetup.sh"
./mythicalSetup.sh
exit 0
fi
if [ "$DownloadML" = "Latest" ]; then
svnrev=`curl -s mythicallibrarian.googlecode.com/svn/trunk/| grep -m1 Revision | sed s/"<html><head><title>mythicallibrarian - "/""/g| sed s/": \/trunk<\/title><\/head>"/""/g`
echo "$svnrev "`date`>"./lastupdated"
test -f ./mythicalLibrarian.sh && rm -f mythicalLibrarian.sh
curl "http://mythicallibrarian.googlecode.com/svn/trunk/mythicalLibrarian">"./mythicalLibrarian.sh"
cat "./mythicalLibrarian.sh"| sed s/' '/'\\t'/g |sed s/'\\'/'\\\\'/g >"./mythicalLibrarian1" #sed s/"\\"/"\\\\"/g |
rm ./mythicalLibrarian.sh
mv ./mythicalLibrarian1 ./mythicalLibrarian.sh
parsing="Stand-by Parsing mythicalLibrarian"
startwrite=0
test -f ./librarian && rm -f ./librarian
echo -e 'mythicalVersion="'"`cat ./lastupdated`"'"'>>./librarian
while read line
do
test "$line" = "########################## USER JOBS############################" && let startwrite=$startwrite+1
if [ $startwrite = 2 ]; then
clear
parsing="$parsing""."
test "$parsing" = "Stand-by Parsing mythicalLibrarian......." && parsing="Stand-by Parsing mythicalLibrarian"
echo $parsing
echo -e "$line" >> ./librarian
fi
done <./mythicalLibrarian.sh
clear
echo "Parsing mythicalLibrarian completed!"
echo "Removing old and downloading new version of mythicalSetup..."
test -f ./mythicalSetup.sh && rm -f ./mythicalSetup.sh
curl "http://mythicallibrarian.googlecode.com/svn/trunk/mythicalSetup.sh">"./mythicalSetup.sh"
chmod +x "./mythicalSetup.sh"
./mythicalSetup.sh
exit 0
fi
Edit: never mind, I thought you were saying it was downloading in an endless loop.

Related

How to develop a Condition to close program only when log file has been updated in Bash Script [duplicate]

I want to run a shell script when a specific file or directory changes.
How can I easily do that?
You may try the entr tool to run arbitrary commands when files change. Example for files:
$ ls -d * | entr sh -c 'make && make test'
or:
$ ls *.css *.html | entr reload-browser Firefox
or print Changed! when file file.txt is saved:
$ echo file.txt | entr echo Changed!
For directories use -d, but you have to use it in a loop, e.g.:
while true; do find path/ | entr -d echo Changed; done
or:
while true; do ls path/* | entr -pd echo Changed; done
I use this script to run a build script on changes in a directory tree:
#!/bin/bash -eu
DIRECTORY_TO_OBSERVE="js" # might want to change this
function block_for_change {
inotifywait --recursive \
--event modify,move,create,delete \
$DIRECTORY_TO_OBSERVE
}
BUILD_SCRIPT=build.sh # might want to change this too
function build {
bash $BUILD_SCRIPT
}
build
while block_for_change; do
build
done
Uses inotify-tools. Check the inotifywait man page for how to customize what triggers the build.
Use inotify-tools.
The linked GitHub page has a number of examples; here is one of them.
#!/bin/sh
cwd=$(pwd)
inotifywait -mr \
--timefmt '%d/%m/%y %H:%M' --format '%T %w %f' \
-e close_write /tmp/test |
while read -r date time dir file; do
changed_abs=${dir}${file}
changed_rel=${changed_abs#"$cwd"/}
rsync --progress --relative -vrae 'ssh -p 22' "$changed_rel" \
username@example.com:/backup/root/dir && \
echo "At ${time} on ${date}, file $changed_abs was backed up via rsync" >&2
done
How about this script? It uses the 'stat' command to get the last change time of a file and runs a command whenever that time changes (i.e. whenever the file is modified).
#!/bin/bash
while true
do
ATIME=`stat -c %Z /path/to/the/file.txt`
if [[ "$ATIME" != "$LTIME" ]]
then
echo "RUN COMMNAD"
LTIME=$ATIME
fi
sleep 5
done
Check out the kernel filesystem monitor daemon
http://freshmeat.net/projects/kfsmd/
Here's a how-to:
http://www.linux.com/archive/feature/124903
As mentioned, inotify-tools is probably the best idea. However, if you're programming for fun, you can try and earn hacker XPs by judicious application of tail -f.
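A minimal sketch of that idea, assuming the watched file is append-only (a log, say); logfile and ./my_command are placeholders:
tail -n0 -F logfile | while read -r line; do ./my_command; done
This runs the command once for every line appended to the file.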
Just for debugging purposes, when I write a shell script and want it to run on save, I use this:
#!/bin/bash
file="$1" # Name of file
command="${*:2}" # Command to run on change (takes rest of line)
t1="$(ls --full-time $file | awk '{ print $7 }')" # Get latest save time
while true
do
t2="$(ls --full-time $file | awk '{ print $7 }')" # Compare to new save time
if [ "$t1" != "$t2" ];then t1="$t2"; $command; fi # If different, run command
sleep 0.5
done
Run it as
run_on_save.sh myfile.sh ./myfile.sh arg1 arg2 arg3
Edit: the above was tested on Ubuntu 12.04; for Mac OS, change the ls lines to:
"$(ls -lT $file | awk '{ print $8 }')"
Add the following to ~/.bashrc:
function react() {
if [ -z "$1" -o -z "$2" ]; then
echo "Usage: react <[./]file-to-watch> <[./]action> <to> <take>"
elif ! [ -r "$1" ]; then
echo "Can't react to $1, permission denied"
else
TARGET="$1"; shift
ACTION="$#"
while sleep 1; do
ATIME=$(stat -c %Z "$TARGET")
if [[ "$ATIME" != "${LTIME:-}" ]]; then
LTIME=$ATIME
$ACTION
fi
done
fi
}
Quick solution for fish shell users who want to track a single file:
while true
set old_hash $hash
set hash (md5sum file_to_watch)
if [ "$hash" != "$old_hash" ]
command_to_execute
end
sleep 1
end
Replace md5sum with md5 on macOS.
Here's another option: http://fileschanged.sourceforge.net/
See especially "example 4", which "monitors a directory and archives any new or changed files".
inotifywait can do what you need.
Here is a typical example:
inotifywait -m /path -e create -e moved_to -e close_write | # -m is --monitor, -e is --event
while read path action file; do
if [[ "$file" =~ .*rst$ ]]; then # if suffix is '.rst'
echo ${path}${file} ': '${action} # execute your command
echo 'make html'
make html
fi
done
Suppose you want to run rake test every time you modify any ruby file ("*.rb") in app/ and test/ directories.
Just get the most recent modified time of the watched files and check every second if that time has changed.
Script code
t_ref=0; while true; do t_curr=$(find app/ test/ -type f -name "*.rb" -printf "%T+\n" | sort -r | head -n1); if [ $t_ref != $t_curr ]; then t_ref=$t_curr; rake test; fi; sleep 1; done
Benefits
You can run any command or script when the file changes.
It works across any filesystem and virtual machines (shared folders on VirtualBox using Vagrant), so you can use a text editor on your MacBook and run the tests on Ubuntu (VirtualBox), for example.
Warning
The -printf option works well on Ubuntu, but does not work on macOS.
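A possible macOS workaround (untested sketch) is to replace -printf with BSD stat:
t_ref=0; while true; do t_curr=$(find app/ test/ -type f -name "*.rb" -exec stat -f "%m" {} + | sort -rn | head -n1); if [ "$t_ref" != "$t_curr" ]; then t_ref=$t_curr; rake test; fi; sleep 1; done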

Linux/sh: How to list files one by one, compress each (with p7zip, without saving the file to disk) and upload to an FTP server (with curl/ncftp)?

Linux/sh: How can I list all files in a specific folder one by one,
compress each one (with p7zip, without saving the file to disk) and
upload it to an FTP server (with curl/ncftp), keeping the same folder structure?
The script below works fine, but
I don't want to save the 7z file to disk each time, because I always need to delete them all after the upload.
I would prefer to pipe stdio from 7zip straight to curl; how do I do that?
#!/bin/sh
FOLDER="/volume3/backup_3/kopia_nas/tmp"
BACKUP_DIR="/volume3/backup_3/kopia_nas/tmp2"
FTP_HOST=""
FTP_USER=""
FTP_PASS=""
FTP_PORT="21"
PASSWORD="abc123"
FTP_FOLDER="/backup2"
#####################################################################
echo "[$(date +'%d-%m-%Y %H:%M:%S')] starting..."
echo ""
/usr/bin/find "${FOLDER}" -type f | while read line; do
# echo "$line" #path+file
# echo "${line##*/}" #file
# echo "${line%/*}" #path
#
/usr/bin/p7zip/7za a "${BACKUP_DIR}${line}.7z" "${line}" -t7z -ms=off -m0=Copy -mhe -mmt -mx0 -p"${PASSWORD}"
curl -s --disable-epsv -v -T "${BACKUP_DIR}${line}.7z" -u "${FTP_USER}:${FTP_PASS}" "ftp://${FTP_HOST}/${FTP_FOLDER}${line%/*}/" --ftp-create-dirs;
# -S  show errors
# -s  silent mode
# -an no file name
# -v  verbose
#/usr/bin/ncftp/ncftpput -m -u -c "${FTP_USER}" -p "${FTP_PASS}" -P "${FTP_PORT}" "${FTP_HOST}" "${FTP_FOLDER}${line%/*}/" "${line##*/}.7z"
# if [ $? -ne 0 ]; then echo "[$(date +'%d-%m-%Y %H:%M:%S')] Upload failed"; fi
done
#rm -rf "${BACKUP_DIR}/" #delete temporary folder
echo ""
echo "[$(date +'%d-%m-%Y %H:%M:%S')] completed..."
exit 0
I tried this, but it doesn't work for me:
/usr/bin/p7zip/7za a -an -t7z -ms=off -m0=Copy -mhe -mmt -mx0 -so -p"${PASSWORD}" | curl -S --disable-epsv -v -T - -u "${FTP_USER}:${FTP_PASS}" "ftp://${FTP_HOST}/${FTP_FOLDER}${line}/" --ftp-create-dirs;

Shell Script for local file

I am trying to install FreeSWITCH on my CentOS 6.5 machine. I have followed the instructions given at https://confluence.freeswitch.org/display/FREESWITCH/CentOS+6.
While executing the make command, I run into a timeout while downloading a library from files.freeswitch.org in the terminal.
If I paste the URL into a browser, however, I am able to download the file. The download instruction is in a script file.
Now I want to change the script so that instead of doing a wget from the URL it reads the file from the local disk. I am very new to shell scripting, so how should I change the script to get the desired result? The file name of the script is getlib.sh and the script is below.
Please help.
#!/bin/sh
##### -*- mode:shell-script; indent-tabs-mode:nil; sh-basic-offset:2 -*-
TAR=/bin/gtar
ZCAT=/bin/gunzip
BZIP=/usr/bin/bzip2
XZ=/usr/bin/xz
WGET=/usr/bin/wget
CURL=/usr/bin/curl
if [ -f "$WGET" ]; then
DOWNLOAD_CMD=$WGET
elif [ -f "$CURL" ]; then
DOWNLOAD_CMD="$CURL -O"
fi
if [ -n "`echo $1 | grep '://'`" ]; then
base=$1/
tarfile=$2
else
base=http://files.freeswitch.org/downloads/libs/
tarfile=$1
fi
uncompressed=`echo $tarfile | sed 's/\(\(\.tar\.gz\|\.tar\.bz2\|\.tar\.xz\)\|\(\.tgz\|\.tbz2\)\)$//'`
case `echo $tarfile | sed 's/^.*\.//'` in
bz2|tbz2) UNZIPPER=$BZIP ;;
xz) UNZIPPER=$XZ ;;
gz|tgz|*) UNZIPPER=$ZCAT ;;
esac
if [ ! -d $tarfile ]; then
if [ ! -f $tarfile ]; then
rm -fr $uncompressed
$DOWNLOAD_CMD $base$tarfile
if [ ! -f $tarfile ]; then
echo cannot find $tarfile
exit 1
fi
fi
if [ ! -d $uncompressed ]; then
$UNZIPPER -c -d $tarfile | $TAR -xf -
fi
fi
exit 0
if [ -n "`echo $1 | grep '://'`" ]; then
base=$1/
tarfile=$2
Your script is already set up to use your downloaded file. Just run the script, giving it the PATH as $1 and the downloaded filename as $2. For example, if the script name is freeSwitch, the PATH where you downloaded the tarball is /home/user1/files/, and the tarball name is freeswitch.tar.bz2, then simply run:
./freeSwitch /home/user1/files/ freeswitch.tar.bz2
That should start the script using the tarball from your local disk instead of trying to download it from http://files.freeswitch.org/downloads/libs/. Note that you need the trailing / at the end of the PATH. Good luck!

Update bash script, file check, how?

#!/bin/sh
LOCAL=/var/local
TMP=/var/tmp
URL=http://um10.eset.com/eset_upd
USER=""
PASSWD=""
WGET="wget --user=$USER --password=$PASSWD -t 15 -T 15 -N -nH -nd -q"
UPDATEFILE="update.ver"
cd $LOCAL
CMD="$WGET $URL/$UPDATEFILE"
eval "$CMD" || exit 1;
if [ -n "`file $UPDATEFILE|grep -i rar`" ]; then
(
cd $TMP
rm -f $TMP/$UPDATEFILE
unrar x $LOCAL/$UPDATEFILE ./
)
UPDATEFILE=$TMP/$UPDATEFILE
URL=`echo $URL|sed -e s:/eset_upd::`
fi
TMPFILE=$TMP/nod32tmpfile
grep file=/ $UPDATEFILE|tr -d \\r > $TMPFILE
FILELIST=`cut -c 6- $TMPFILE`
rm -f $TMPFILE
echo "Downloading updates..."
for FILE in $FILELIST; do
CMD="$WGET \"$URL$FILE\""
eval "$CMD"
done
cp $UPDATEFILE $LOCAL/update.ver
perl -i -pe 's/\/download\/\S+\/(\S+\.nup)/\1/g' $LOCAL/update.ver
echo "Done."
So I have this code to download definitions for my antivirus. The only problem is that it downloads all the files every time I run the script. Is it possible to implement some sort of file check, for example,
"if the file is present and has the same file size, skip it"?
The -nc argument to wget will not re-fetch files that already exist. It is, however, not compatible with the -N switch. So you'll have to change your WGET line to:
WGET="wget --user=$USER --password=$PASSWD -t 15 -T 15 -nH -nd -q -nc"
