I want to put a file watcher on a directory in my Docker container. I'm using an entrypoint.sh script to launch the script that places the file watcher. The setup looks like this:
#!/bin/sh
# Trigger the script with the file watcher in the background
./bin/watcher.sh &
And the watcher.sh script contains the inotifywait command:
#!/bin/sh
inotifywait \
--event create --event delete \
--event modify --event move \
--format "%e %w%f" \
--monitor --outfile '/var/log/inotifywait.log' \
--syslog --quiet --recursive \
/etc/haproxy |
while read CHANGED;
do
echo "$CHANGED"
haproxy -W -db -f /etc/haproxy/haproxy.cfg -p /var/run/haproxy.pid -sf $(cat /var/run/haproxy.pid) &
done
However, although the watcher shows up when I check with top, and it reports changes in the defined log file, the loop never triggers. I've tried debugging the loop with a simple:
touch /var/log/special.log
echo "${CHANGED}" >> /var/log/special.log
But the file never gets created, and nothing gets echoed into it. What is the right way to use inotifywait with a loop in a shell script?
You are explicitly sending output to a file rather than stdout using the --outfile option. Nothing is ever written to stdout, so the read statement in your while loop never reads any data.
You probably want:
inotifywait \
--event create --event delete \
--event modify --event move \
--format "%e %w%f" \
--monitor \
--syslog --quiet --recursive \
/etc/haproxy |
while read CHANGED;
do
echo "$CHANGED"
haproxy -W -db -f /etc/haproxy/haproxy.cfg -p /var/run/haproxy.pid -sf $(cat /var/run/haproxy.pid) &
done
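If you still want the on-disk log as well, you can keep stdout intact and duplicate it with tee instead of diverting it with --outfile. A minimal sketch, assuming /var/log/inotifywait.log is still the path you want:
inotifywait \
--event create --event delete \
--event modify --event move \
--format "%e %w%f" \
--monitor --quiet --recursive \
/etc/haproxy |
tee -a /var/log/inotifywait.log |
while read CHANGED
do
echo "$CHANGED"
done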
I have multiple files on an SFTP server, from which I need to copy only the latest one. I have written sample code, but in it I am hardcoding the filename. What logic do I need to add so that it identifies the latest file on the SFTP server and copies it to my local machine?
On the SFTP server:
my_data_20220428.csv
my_data_20220504.csv
my_data_20220501.csv
my_data_20220429.csv
The code I am running:
datadir="/script/data"
cd ${datadir}
rm -f ${datadir}/my_data*.csv
rm -f ${logfile}
lftp<<END_SCRIPT
open sftp://${sftphost}
user ${sftpuser} ${sftppassword}
cd ${sftpfolder}
lcd $datadir
mget my_data_20220504.csv
bye
END_SCRIPT
What changes do I need to make so it automatically picks the latest file from the server without hardcoding the filename?
You can try this script, mostly copied from your sample, so it assumes the variables have already been set. It lists the remote directory sorted by modification time (ls -halt puts the newest first), keeps the filename field of the first line matching my_data, and fetches that file:
#!/usr/bin/env bash
datadir="/script/data"
rm -f "$datadir"/my_data*.csv
rm -f "$logfile"
new=$(echo "ls -halt $sftpfolder" | lftp -u "${sftpuser}","${sftppassword}" sftp://"${sftphost}" | sed -n '/my_data/s/.* \(.*\)/\1/p' | head -1)
lftp -u "${sftpuser}","${sftppassword}" sftp://"${sftphost}" << --EOF--
cd "$sftpfolder"
lcd "$datadir"
get "$new"
bye
--EOF--
You could try:
latest=$(lftp "sftp://$sftpuser:$sftppassword@$sftphost" \
-e "cd $sftpfolder; glob rels -1t *.csv; bye" |
head -1)
lftp "sftp://$sftpuser:$sftppassword#myhost" \
-e "cd $sftpfolder; mget $latest; bye"
I'm trying to write a bash script to create a screen (software) session with a specific set of windows, and cd to specific directories on each one.
Here is the script I have so far:
#!/bin/bash
killall screen;
screen -AmdS work;
screen -S work bash -c "cd myDir";
The problem is that I can't seem to change directories on that session. After running this script, I run $ screen -r and the current directory is still my default directory (~/).
(I've tried changing the cd command to touch myFile and the file is there after I run the script)
Try the following; it will open a new screen session with a bash that changes the directory and then execs a new bash with that directory as its working directory:
screen -S work bash -c 'cd myDir && exec bash'
Add -d -m to run it in detached mode. After reattaching you will be in myDir:
screen -S work -d -m bash -c 'cd myDir && exec bash'
Better solution
The following code will create a detached screen session with three windows, each running myCommand1/2/3 in directory myDir1/2/3.
cd myDir1
screen -S work -d -m              # create the detached session; window 0 starts in myDir1
screen -S work -X exec myCommand1 # run the first command in the current window
screen -S work -X chdir myDir2    # default directory for windows created from now on
screen -S work -X screen          # open a new window (it starts in myDir2)
screen -S work -X exec myCommand2
screen -S work -X chdir myDir3
screen -S work -X screen          # open a new window (it starts in myDir3)
screen -S work -X exec myCommand3
cd -
Note the final cd -, which returns you to your original working directory.
Finally, just use screen -r work to attach to your running screen session.
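Each -X invocation above sends a single command to the running work session; chdir only affects windows created afterwards, which is why each chdir comes right before its screen.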
You can save the command line you want to run (including the final newline) into a register and paste it into the screen input:
screen -S work -X register c $'cd myDir\n'
screen -S work -X paste c
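If you would rather type the command straight into the window, screen's stuff command does the same thing in one step (a sketch; window 0 is an assumption):
screen -S work -p 0 -X stuff $'cd myDir\n'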
I have a daemon program running to monitor file changes in a specific directory. At the beginning the program runs normally, but after a period of time inotifywait no longer reacts to file changes. When I restart the program, it goes back to normal again. This is my shell script:
#!/bin/sh
. /etc/puppet/modules/config.sh  # source the config (presumably defines ${puppet_config}, ${log_dir}, ${port})
puppetmaster=`grep -w server ${puppet_config} | awk -F'=' '{print $2}'`
/usr/local/bin/inotifywait -mrq -e modify ${log_dir}| while read D E F
do
/usr/bin/rsync -i -p -H -S -z -r -A -o -g -a --port=${port} \
--timeout=600 --exclude='.svn/' --exclude='.git/' ${log_dir}/ \
rsync://${puppetmaster}/log_dir > /dev/null
done
Can someone please help me? Thanks.
I have a bash script that processes some data, using inotify-tools to know when certain events take place on the filesystem. It works fine if run in the bash console, but when I try to run it as a daemon it fails. I think the reason is that all the output from the inotifywait call goes to a file, so the part after | while never gets called. How can I fix that? Here is my script.
#!/bin/bash
inotifywait -d -r \
-o /dev/null \
-e close_write \
--exclude "^[\.+]|cgi-bin|recycle_bin" \
--format "%w:%&e:%f" \
$1|
while IFS=':' read directory event file
do
#doing my thing
done
So, -d tells inotifywait to run as a daemon, -r to do it recursively, and -o is the file in which to save the output. In my case the file is /dev/null, because I don't really need the output except for the processing in the part after the command (| while...).
You don't want to run inotifywait as a daemon in this case, because you want to continue processing output from the command. You want to replace the -d command line option with -m, which tells inotifywait to keep monitoring the files and continue printing to stdout:
-m, --monitor
Instead of exiting after receiving a single event, execute
indefinitely. The default behaviour is to exit after the
first event occurs.
If you want things running in the background, you'll need to background the entire script.
Here's a solution using nohup (note: in my testing, if I specified -o, the while loop didn't seem to be evaluated):
nohup inotifywait -m -r \
-e close_write \
--exclude "^[\.+]|cgi-bin|recycle_bin" \
--format "%w:%&e:%f" \
"$1" |
while IFS=':' read directory event file
do
#doing my thing
done >> /some/path/to/log 2>&1 &
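To confirm the watcher survived after you log out, a quick sanity check (not part of the original answer; pgrep -a needs procps-ng) is:
pgrep -af inotifywait
tail -f /some/path/to/log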
So I want to download multiple files from RapidShare. This is what I currently have. I created a cookie by running:
wget \
--save-cookies ~/.cookies/rapidshare \
--post-data "login=USERNAME&password=PASSWORD" \
--no-check-certificate \
-O - \
https://ssl.rapidshare.com/cgi-bin/premiumzone.cgi \
> /dev/null
and now I have a shell script which I run that looks like this:
#!/bin/bash
wget -c --load-cookies ~/.cookies/rapidshare http://rapidshare.com/files/219920856/file1.rar
wget -c --load-cookies ~/.cookies/rapidshare http://rapidshare.com/files/393839302/file2.rar
wget -c --load-cookies ~/.cookies/rapidshare http://rapidshare.com/files/398293204/file3.rar
....
I want two things:
The shell script needs to read the files to download from a file.
The shell script should download anywhere from 2 - 8 files at a time.
Thanks!
When you want parallel jobs, think make.
#!/usr/bin/make -f
login:
wget -qO/dev/null \
--save-cookies ~/.cookies/rapidshare \
--post-data "login=USERNAME&password=PASSWORD" \
--no-check-certificate \
https://ssl.rapidshare.com/cgi-bin/premiumzone.cgi
$(MAKEFILES):
%: login
wget -ca$(addsuffix .log,$(notdir $@)) \
--load-cookies ~/.cookies/rapidshare $@
@echo "Downloaded $@ (log in $(addsuffix .log,$(notdir $@)))"
Save this as rsget somewhere in $PATH (make sure you use tabs and not spaces for the recipe indentation), make it executable with chmod +x, and run:
rsget -kj8 \
http://rapidshare.com/files/219920856/file1.rar \
http://rapidshare.com/files/393839302/file2.rar \
http://rapidshare.com/files/398293204/file3.rar \
...
This will log in, then wget each target. -j8 tells make to run up to 8 jobs in parallel, and -k means "keep going even if a target returned failure".
Edit
Tested with GNU Make 3.79 and 3.81.
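Since the question asks to read the downloads from a file, you can also feed a URL list to rsget via xargs (a sketch, assuming GNU xargs and one URL per line in a hypothetical urls.txt):
xargs -a urls.txt rsget -kj8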
Try this. I think it should do what you want:
#! /bin/bash
MAX_CONCURRENT=8
URL_BASE="http://rapidshare.com/files/"
cookie_file=~/.cookies/rapidshare
# do your login thing here...
[ -n "$1" -a -f "$1" ] || { echo "please provide a file containing the stuff to download"; exit 1; }
inputfile=$1
count=0
while read x; do
if [ $count -ge $MAX_CONCURRENT ]; then
count=0
wait
fi
{ wget -c --load-cookies "$cookie_file" "${URL_BASE}$x" && echo "Downloaded $x"; } &
count=$((count + 1))
done < "$inputfile"
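Note the batching behavior: the script launches MAX_CONCURRENT downloads, then wait blocks until the whole batch finishes before the next batch starts, so a single slow file can hold up its batch. Usage would look like this (rsget.sh and urls.txt are hypothetical names; urls.txt holds the part of each URL after /files/, e.g. 219920856/file1.rar, one per line):
./rsget.sh urls.txt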