I am writing a shell script that generates a CSV file and then attaches and sends it with the mutt command on Linux. The problem is that when the CSV file is not generated, the mutt command still executes and reports that the file was not found. Is there a way to make the mutt command run only after the CSV-generation command has completed successfully? Below are the contents of my script; the two statements execute one after the other.
mysql --user=root --password= erpint -B -e "select * from user_info;" | sed "s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g" > /home/mayuri/detail.csv
mutt -s "Mutt attach" srini#erpint.com -a /home/mayuri/detail.csv < /home/mayuri/detail.csv
Using bash, to check if the file exists:
# generate your file ...
if [ ! -f /YourPathToTheFile/yourFile.txt ]; then
    echo "no file found, exiting and doing nothing"
    exit 1
fi
# send your file
So, literally, wait for the file to exist:
while [ ! -f /home/mayuri/detail.csv ]; do
sleep 1
done
mutt -s "Mutt attach" srini#erpint.com -a /home/mayuri/detail.csv < /home/mayuri/detail.csv
You can use $?, which is set to 0 if the previous command executed successfully and to a non-zero value if it failed:
mysql --user=root --password= erpint -B -e "select * from user_info;" | sed "s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g" > /home/mayuri/detail.csv
if [ $? -eq 0 ]; then
mutt -s "Mutt attach" srini#erpint.com -a /home/mayuri/detail.csv < /home/mayuri/detail.csv
fi
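One caveat: after a pipeline, $? holds the exit status of the last command in the pipeline (sed here), so a mysql failure would go unnoticed. A sketch of a stricter version, assuming bash (set -o pipefail and && are the only additions; the mysql and sed arguments are unchanged):
set -o pipefail   # the pipeline now fails if any stage fails, not just the last
mysql --user=root --password= erpint -B -e "select * from user_info;" \
    | sed "s/'/\'/;s/\t/\",\"/g;s/^/\"/;s/$/\"/;s/\n//g" > /home/mayuri/detail.csv \
    && mutt -s "Mutt attach" srini@erpint.com -a /home/mayuri/detail.csv < /home/mayuri/detail.csv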
Linux/sh: How to list all files one by one in a specific folder, compress each (by p7zip, without saving the file on disk) and upload to an FTP server (by curl/ncftp) with the same folder structure?
The script below works perfectly, but I don't want to save the 7z file to disk each time, because I always need to delete them all after they are uploaded. I would prefer to pipe stdout from 7zip into curl; how can I do that?
#!/bin/sh
FOLDER="/volume3/backup_3/kopia_nas/tmp"
BACKUP_DIR="/volume3/backup_3/kopia_nas/tmp2"
FTP_HOST=""
FTP_USER=""
FTP_PASS=""
FTP_PORT="21"
PASSWORD="abc123"
FTP_FOLDER="/backup2"
#####################################################################
echo "[$(date +'%d-%m-%Y %H:%M:%S')] starting..."
echo ""
/usr/bin/find "${FOLDER}" -type f | while read line; do
# echo "$line" #path+file
# echo "${line##*/}" #file
# echo "${line%/*}" #path
#
/usr/bin/p7zip/7za a "${BACKUP_DIR}${line}.7z" "${line}" -t7z -ms=off -m0=Copy -mhe -mmt -mx0 -p"${PASSWORD}"
curl -s --disable-epsv -v -T "${BACKUP_DIR}${line}.7z" -u "${FTP_USER}:${FTP_PASS}" "ftp://${FTP_HOST}/${FTP_FOLDER}${line%/*}/" --ftp-create-dirs;
# curl flags: -S show errors, -s silent mode, -v verbose
# 7za flag: -an no archive name
#/usr/bin/ncftp/ncftpput -m -u -c "${FTP_USER}" -p "${FTP_PASS}" -P "${FTP_PORT}" "${FTP_HOST}" "${FTP_FOLDER}${line%/*}/" "${line##*/}.7z"
# if [ $? -ne 0 ]; then echo "[$(date +'%d-%m-%Y %H:%M:%S')] Upload failed"; fi
done
#rm -rf "${BACKUP_DIR}/" #delete temporary folder
echo ""
echo "[$(date +'%d-%m-%Y %H:%M:%S')] completed..."
exit 0
I tried this, but it doesn't work for me:
/usr/bin/p7zip/7za a -an -t7z -ms=off -m0=Copy -mhe -mmt -mx0 -so -p"${PASSWORD}" | curl -S --disable-epsv -v -T - -u "${FTP_USER}:${FTP_PASS}" "ftp://${FTP_HOST}/${FTP_FOLDER}${line}/" --ftp-create-dirs;
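For the streaming step, a hedged sketch: many 7za builds refuse -so with the .7z container (the format wants seekable output), so this swaps in gzip for the compression and gpg for the password protection that -p provided. It is untested; the availability of gzip and gpg, the .gz.gpg naming, and (on gpg 2.1+) the possible need for --pinentry-mode loopback are all assumptions:
/usr/bin/find "${FOLDER}" -type f | while read line; do
    # stream: compress to stdout, encrypt stdin-to-stdout, upload from stdin; no temp file
    gzip -c "${line}" \
        | gpg --batch --symmetric --passphrase "${PASSWORD}" -o - \
        | curl -s --disable-epsv -T - -u "${FTP_USER}:${FTP_PASS}" \
              "ftp://${FTP_HOST}/${FTP_FOLDER}${line}.gz.gpg" --ftp-create-dirs
done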
I need to get the exit code of the FTP execution. My command line is:
wget -N ftp://server:pass@server/path/
Using:
if [ $? -ne 0 ]; then
will check whether wget succeeded.
Not tested:
wget [wget options] 2>&1 | grep -i "failed\|error"
The reason I am asking this is that I'm running two persistent programs simultaneously; in the child process, a program is running that requires sudo rights.
#!/bin/bash
echo "Name the file:"
read filename
while [[ 1 -lt 2 ]]
do
if [ -f /home/max/dump/$filename.eth ]; then
echo "File already exist."
read filename
else
break
fi
done
#Now calling a new terminal for dumping
gnome-terminal --title="tcpdump" -e "sh /home/max/dump/dump.sh $filename.eth"
ping -c 1 0 > /dev/null # waiting for tcpdump to create the file
#Packet analysis program is being executed
Script dump.sh
#!/bin/bash
filename=$1
echo password | sudo tcpdump -i 2 -s 60000 -w /home/max/dump/$filename -U host 192.168.3.2
# sudo still asks me for my password even though the password is piped into stdin
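One likely fix: by default sudo reads the password from the controlling terminal rather than from stdin; its -S switch tells it to read stdin instead. A sketch (keeping a plaintext password in a script remains risky either way):
echo password | sudo -S tcpdump -i 2 -s 60000 -w /home/max/dump/$filename -U host 192.168.3.2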
I'm working on a bash script that pulls a file from an FTP site only if the timestamp on the remote is different from the local one. After it pulls the file, it copies the file over to 3 other computers via Samba (smbclient).
Everything works, but the file is copied even when wget -N ftp://insertsitehere.com indicates that the file on the remote was not newer. What would be the best way to check the output of the script so that the copy only happens if a new version was pulled from FTP?
Ideally, I'd like the copy to the computers to preserve the timestamp just like the wget -N command does, too.
Here is an example of what I have:
#!/bin/bash
OUTDIR=/cats/dogs
cd $OUTDIR
wget -N ftp://user:password@sitegoeshere.com/filename
if [ $? -eq 0 ]; then
HOSTS="server1 server2 server3"
for i in $HOSTS; do
echo "Uploading to $i..."
smbclient -A /root/.smbclient.authfile //$i/path -c "lcd /cats/dogs; put filename.txt"
if [ $? -eq 0 ]; then
echo "Upload to $i successful..."
else
echo "There was an issue uploading to host $i..."
fi
done
else
echo "There was an issue with the FTP Download...."
exit 1
fi
The return value of wget is different from 0 only if there is an error. If -N is in use and the remote file is older than the local file, wget will still return 0, so you cannot use the return value to check whether the file has been modified.
You could check the mtime of the file to see if it changed, or the content. For example, you could use something like:
md5_old=$( md5sum filename.txt 2>/dev/null )
wget -N ftp://user:password@sitegoeshere.com/filename.txt
md5_new=$( md5sum filename.txt )
if [ "$md5_old" != "$md5_new" ]; then
# Copy filename.txt to SMB servers
fi
Regarding smbclient, unfortunately there is no way to preserve timestamps in either the get or put commands. If you need that, you must use a different tool (scp -p, rsync -t, ...).
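A minimal sketch of such a timestamp-preserving copy, assuming the target hosts accept SSH (the hostnames and remote path are placeholders, not part of the original setup):
for i in server1 server2 server3; do
    scp -p /cats/dogs/filename "$i:/path/" # -p preserves modification times
done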
touch -r foo.txt foo.old
wget -N example.com/foo.txt
if [ foo.txt -nt foo.old ]; then
    echo 'Uploading to server1...'
fi
"Save" the current timestamp into a new empty file
Use wget --timestamping to only download the file if it is newer
If file is newer than the "save" file, do stuff
SOLVED! Add #!/bin/bash at the top of all my scripts in order to make use of bash extensions; otherwise the shell restricts itself to POSIX syntax. Thanks Barmar!
Also, I'll add that I had trouble with gpg decryption not working from cronjob after I got it executing, and the answer was to add the --no-tty option (no terminal output) to the gpg command.
I am fairly new to linux, so bear with me...
I am able to execute a simple script with crontab -e when logged in as ubuntu:
* * * * * /ngage/extract/bin/echoer.sh
and this bash script simply prints output to a file:
echo "Hello" >> output.txt
But when I try to execute my more complex bash script in exactly the same way, it doesn't work:
* * * * * /ngage/extract/bin/superMasterExtract.sh
This script calls into other bash scripts. There are 4 scripts in total, with 3 levels of hierarchy. It goes superMasterExtract > masterExtract > (decrypt, unzip).
Here is the code for superMasterExtract.sh (top level):
shopt -s nullglob # make globs that match nothing expand to an empty list
cd /str/ftp
DIRECTORY='writeable'
for d in */ ; do # for all directories in /str/ftp
if [ -d "$d$DIRECTORY" ]; then # if the directory contains a folder called 'writeable'
files=($d$DIRECTORY/*)
dirs=($d$DIRECTORY/*/)
numdirs=${#dirs[@]}
numFiles=${#files[@]}
((numFiles-=$numdirs))
if [ $numFiles -gt 0 ]; then # if the folder has at least one file in it
bash /ngage/extract/bin/masterExtract.sh /str/ftp ${d:0:${#d} - 1} # execute this masterExtract bash script with two parameters passed in
fi
fi
done
masterExtract.sh:
DATE="$(date +"%m-%d-%Y_%T")"
LOG_FILENAME="log$DATE"
LOG_FILEPATH="/ngage/extract/logs/$2/$LOG_FILENAME"
echo "Log file is $LOG_FILEPATH"
bash /ngage/extract/bin/decrypt.sh $1 $2 $DATE
java -jar /ngage/extract/bin/sftp.jar $1 $2
bash /ngage/extract/bin/unzip.sh $1 $2 $DATE
java -jar /ngage/extract/bin/sftp.jar $1 $2
echo "Log file is $LOG_FILEPATH"
decrypt.sh:
shopt -s nullglob
UPLOAD_FILEPATH="$1/$2/writeable"
DECRYPT_FOLDER="$1/decryptedFiles/$2"
HISTORY_FOLDER="$1/encryptHistory/$2"
DONE_FOLDER="$1/doneFiles/$2"
LOG_FILENAME="log$3"
LOG_FILEPATH="/ngage/extract/logs/$2/$LOG_FILENAME"
echo "DECRYPT_FOLDER=$DECRYPT_FOLDER" >> $LOG_FILEPATH
echo "HISTORY_FOLDER=$HISTORY_FOLDER" >> $LOG_FILEPATH
cd $UPLOAD_FILEPATH
for FILE in *.gpg;
do
FILENAME=${FILE%.gpg}
echo ".done FILE NAME=$UPLOAD_FILEPATH/$FILENAME.done" >> $LOG_FILEPATH
if [[ -f $FILENAME.done ]]; then
echo "DECRYPTING FILE=$UPLOAD_FILEPATH/$FILE INTO $DECRYPT_FOLDER/$FILENAME" >> $LOG_FILEPATH
cat /ngage/extract/.sftpPasswd | gpg --passphrase-fd 0 --output "$DECRYPT_FOLDER/$FILENAME" --decrypt "$FILE"
mv $FILE $HISTORY_FOLDER/$FILE
echo "MOVING FILE=$UPLOAD_FILEPATH/$FILE INTO $HISTORY_FOLDER/$FILE" >> $LOG_FILEPATH
else
echo "Done file not found!" >> $LOG_FILEPATH
fi
done
cd $DECRYPT_FOLDER
for FILE in *
do
mv $FILE $DONE_FOLDER/$FILE
echo "DECRYPTED FILE=$DONE_FOLDER/$FILE" >> $LOG_FILEPATH
done
If anyone has a clue why it refuses to execute my more complicated script, I'd love to hear it. I have also tried setting some environment variables at the beginning of the crontab:
SHELL=/bin/bash
PATH=/sbin:/bin:/usr/sbin:/usr/local/bin:/usr/bin
MAILTO=jgardnerx85@gmail.com
HOME=/
* * * * * /ngage/extract/bin/superMasterExtract.sh
Note, I don't know whether these are the appropriate variables for my installation or my script. I just pulled them off other posts and tried them, to no avail. If these aren't the correct environment variables, can someone tell me how I can deduce the right ones for my particular application?
You need to begin your script with
#!/bin/bash
in order to make use of bash extensions. Otherwise it restricts itself to POSIX shell syntax.
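For example, the top of superMasterExtract.sh would become (only the first line is new):
#!/bin/bash
shopt -s nullglob # bash-specific; one reason /bin/sh cannot run this script
cd /str/ftp
The arrays (files=(...)), the ((...)) arithmetic, and shopt used throughout these scripts are all bash extensions, which is why cron's default shell fails where your interactive session succeeds. The other scripts in the chain (masterExtract.sh, decrypt.sh, unzip.sh) need the same first line.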