Cron job command to not overwrite existing file when executed - cron

I'm using the following command:
mysqldump -hHOST -uUSERNAME -pPASSWORD DATABASE > file.sql
I would like a separate file to be created each time the cron job is executed, rather than overwriting the existing one.

I'm using a shell script to solve this problem on my server. Here is the code if you want to try it. Create a .sh file, e.g. backupdb.sh:
#!/bin/bash
NOW=$(date +"%m%d%Y")
FILE="backupdb.$NOW.tgz"
SQL="backupdb.$NOW.sql"
DBNAME='live'   # name of database
DUMP='/usr/bin/mysqldump'
USER='root'     # user
HOST='10.0.0.8' # IP of the database server
$DUMP -h "$HOST" -u "$USER" -pYOURPASSWORD --routines --events "$DBNAME" > "/home/backupdb/$SQL"
gzip -9 "/home/backupdb/$SQL"
echo "BACKUP DATABASES FROM MYSQL MASTER SUCCESS $SQL" | mail -s "BACKUP DATABASES" your@email.com # change this to your personal email
and save.
Don't forget to make it executable:
chmod +x backupdb.sh
Then add this line to the crontab:
00 04 * * * sh /home/backupdb/backupdb.sh
Save. That's it.
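Note that a date-only stamp like %m%d%Y still overwrites that day's earlier dump if the job ever runs more than once per day. A minimal sketch of the fix, assuming the same placeholder paths and credentials as the script above, is to include the time of day in the filename:

```shell
#!/bin/bash
# Include hours/minutes/seconds so repeated runs on the same day
# produce distinct files instead of overwriting each other.
NOW=$(date +"%m%d%Y_%H%M%S")
SQL="backupdb.$NOW.sql"
# Then dump as in the script above, e.g. (placeholder credentials):
# /usr/bin/mysqldump -h "$HOST" -u "$USER" -pYOURPASSWORD "$DBNAME" > "/home/backupdb/$SQL"
echo "$SQL"
```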

Related

Accessing and transferring files from an FTP server to my FTP server through a cron file

I have an FTP server that uses cron to automate tasks, and I would like to use it to access another FTP server, get a file that starts with 26 and has a .csv extension, transfer it to the FTP server I am running the cron on, and delete the file on the origin FTP server, every Friday of the week. Can somebody help me with the script code?
What I have right now is this:
#!/bin/bash -x
filename="dir/*.csv"
hostname="files.test"
username="testuser"
password="testpassword"
ftp -in $hostname <<EOF
quote USER $username
quote PASS $password
binary
get $filename
quit
EOF
Please help.
#!/bin/bash
USER=user
PASS=password
URL=myIP
PLACE=tmp
#
ftp -v -n > /tmp/xftpb.log <<EOF
open $URL
user $USER $PASS
binary
cd $PLACE
mget 26*.csv
mdel 26*.csv
quit
EOF
To run every Friday at 8:00, add this to the crontab:
0 8 * * 5 /path/mybash.sh
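If lftp is installed on the server, the whole fetch-and-delete can be collapsed into a single crontab line: lftp's mget -E removes each remote file after a successful transfer. A sketch reusing the placeholder host, credentials, and directory from the question:

```shell
# Every Friday at 8:00 -- fetch all 26*.csv files and delete them remotely.
0 8 * * 5 lftp -u testuser,testpassword -e "cd tmp; mget -E 26*.csv; quit" files.test
```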

How to run a cron job for edX notifier digest

I am trying to write a bash script that would be run by a daily cron job as a specific user ('notifier').
Entry in the crontab and the bash script
crontab -u notifier -e
53 09 * * * /edx/app/notifier/not.sh
The contents of the script, which I placed in the home directory of my user ('notifier'), are as follows:
#!/bin/bash
SHELL=/bin/bash
PATH=/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin
cd "$(dirname "$0")"
DATE=`date +%Y-%m-%d -d "tomorrow"`
/edx/app/notifier/virtualenvs/notifier/bin/python /edx/app/notifier/src/manage.py forums_digest --to_datetime=$DATE
This does not work as expected, however.
Below are the individual steps that I successfully run manually
sudo -H -u notifier bash
cd
DATE=`date +%Y-%m-%d -d "tomorrow"`
/edx/app/notifier/virtualenvs/notifier/bin/python /edx/app/notifier/src/manage.py forums_digest --to_datetime=$DATE
How can I run notifier digest as a cron job?
This answer is essentially by the user tripleee (see his replies in the comments above). Indeed, as he suggested, I had to activate the venv before running the script. I am just providing the completed steps, based on his idea, that helped me solve this problem.
1. This is what I put in the crontab of the user 'notifier':
55 13 * * * /edx/app/notifier/not.sh >/dev/null 2>&1
2. And below are contents of the script file not.sh that I created in /edx/app/notifier
#!/bin/bash
source /edx/app/notifier/notifier_env
cd /edx/app/notifier/src
export LANG=en_US.UTF-8
DATE=`date +%Y-%m-%d -d "tomorrow"`
/edx/app/notifier/virtualenvs/notifier/bin/python manage.py forums_digest --to_datetime=$DATE
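One refinement worth considering: the crontab entry above discards all output with >/dev/null 2>&1, which hides any failures. While debugging, redirecting to a log file instead keeps the trace (the log path here is just an example, not part of the original setup):

```shell
55 13 * * * /edx/app/notifier/not.sh >> /tmp/notifier-digest.log 2>&1
```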

How to mail the path of the cron.log file?

I am using a crontab to run my scripts periodically. Here is the entry I added via crontab -e:
* * * * * /usr/bin/python3 /home/mark/WORKSPACE/ep_prac/scripts/main.py > $HOME/test-cron.log 2>&1; mail -s "CronJob is run successfully" abc@gmail.com, xyz@gmail.com < /home/mark/test-cron.log
I want to mail the path of the "test-cron.log" file. I am running short on time; I searched a lot but couldn't find a relevant solution.
Instead of sending the path, I found a way to attach the log file itself and mail it to the user:
mail -A "$HOME/test-cron.log" -s "Cron Job run success" xyz@gmail.com
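If the goal really is to mail just the path, as originally asked, the path string can be piped into mail as the message body instead; a minimal sketch reusing the placeholder address (the message wording is arbitrary):

```shell
LOG="$HOME/test-cron.log"
BODY="Cron log written to: $LOG"
# The message body is simply the path to the log file.
echo "$BODY" | mail -s "CronJob is run successfully" xyz@gmail.com
```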

sh file not running on cron ubuntu

I am trying to run a shell script from crontab on Ubuntu. I have tried Googling and various links, but nothing has helped so far.
This is my crontab:
*/2 * * * * sudo bash /data/html/mysite/site_cleanup.sh
This is the content of my sh file:
#!/bin/sh
# How many days retention do we want ?
DAYS=0
# getting present day
now=$(date +"%m_%d_%Y")
# Where is the base directory
BASEDIR=/data/html/mysite
# Where is the backup directory
BKPDIR=/data/html/backup
# Where is the log file
LOGFILE=$BKPDIR/log/mysite.log
# add to tar
tar -cvzf $now.tar.gz $BASEDIR
mv $now.tar.gz $BKPDIR
# REMOVE OLD FILES
echo `date` Purge Started >> $LOGFILE
find $BASEDIR -mtime +$DAYS | xargs rm
echo `date` Purge Completed >> $LOGFILE
The same script runs from a terminal and gives the desired result.
Generic troubleshooting for noninteractive shell scripts
Put set -x; exec 2>/path/to/logfile at the top of your script to log all subsequent commands to a file as they're run. If this doesn't work, you'll know that your script isn't being run at all; if it does, you'll know where it fails and how.
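Concretely, the top of the script would look like this (the trace path is an arbitrary example):

```shell
#!/bin/bash
set -x                          # print each command before executing it
exec 2>/tmp/site_cleanup.trace  # from here on, the xtrace output (stderr) goes to this file
# ... rest of the script unchanged ...
```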
If this is a personal crontab
If you're running crontab -e as a user (without sudo), then the crontab being modified is one for commands run with that user's permissions. Check that file permissions allow that user to modify the content in question (which, if these files are in a cgi-bin directory, may require being run by the same user as the web server).
If your intent is to have commands run as root, rather than as your own user, be sure you use sudo when editing the crontab to edit the system crontab instead (but please take care as to your script's correctness in this case -- carelessness such as missing quotes or lack of appropriate precautions in xargs usage can cause a script to delete the wrong files if malicious filenames are created):
sudo crontab -e ## to edit the system (root) crontab
...or, if you're cleaning up files owned by the apache user (for example; check which account is correct for your own operating system and web server):
sudo -u apache crontab -e ## to edit the apache user's crontab
Troubleshooting for a system crontab
Do not attempt to put a sudo command within the commands run by cron; with sudo's default configuration, it requires a TTY (a keyboard and screen) to be attached to a session in order to run. Thus, your crontab line should not contain sudo, but instead should look like the following:
*/2 * * * * bash /data/html/mysite/site_cleanup.sh
Your issue is likely coming from the sudo call in your user-level cron. Unless you've configured sudoers to allow that script to run without a password, it will hang every time, since there is no terminal for sudo to read the password from.
So you can either look up how to run a script with no password by adding a rule to the sudoers file, remove the sudo call if you aren't doing something in your script that calls for superuser permissions, or, as a last-ditch, extremely bad idea, you can call your script from root's cron by doing sudo crontab -e (or sudo env EDITOR=nano crontab -e if you prefer nano as your editor).
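For reference, such a passwordless rule is configured in the sudoers file, not in a shell profile. A sketch (the username is a placeholder, and the file should only ever be edited through visudo):

```
# Create with: sudo visudo -f /etc/sudoers.d/site-cleanup
youruser ALL=(ALL) NOPASSWD: /data/html/mysite/site_cleanup.sh
```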
Try adding this line to the crontab of the root user, without the sudo, like this:
*/2 * * * * bash /data/html/mysite/site_cleanup.sh

rdiff-backup bash script and cron trouble

I have this very simple bash script:
#!/opt/bin/bash
/opt/bin/rdiff-backup --force --print-statistics myhost::/var/bkp /volume1/backups/sql 2>&1 > /var/log/rdiff-backup.log;
/opt/bin/rdiff-backup --force --print-statistics myhost::/var/www/vhosts /volume1/backups/vhosts 2>&1 >> /var/log/rdiff-backup.log;
/opt/bin/rdiff-backup --force --print-statistics myhost::/etc /volume1/backups/etc 2>&1 >> /var/log/rdiff-backup.log;
/opt/bin/rdiff-backup --force --print-statistics /volume1/homes /volume1/backups/homes 2>&1 >> /var/log/rdiff-backup.log;
cat /var/log/rdiff-backup.log | /opt/bin/nail -s "rdiff-backup log" me@email.com;
if I run the script from the command line, in this way:
nohup /path/to/my/scipt.sh &
it works fine, appending each rdiff-backup statistics report to rdiff-backup.log and sending this file to my email address, as expected. But if I put the script in the crontab, the script runs only one rdiff-backup job before sending the statistics via email. I cannot understand why the script doesn't work the same way...
Any idea?
this is my cronjob entry:
30 19 * * * /opt/bin/bash /volume1/backups/backup.sh
Via crontab only the last job is executed correctly, I think because it is the only local backup. When I execute the script from the command line I use the root user, and the public key of the root user is in /root/.ssh/authorized_keys on the remote machine. The owner of the crontab file is the root user too; I created it through "crontab -e" using the root account.
First of all, you need to make sure the script used in cron doesn't write anything to stdout or stderr, otherwise:
cron will assume there is an error
you will not see the error, if any
A solution for this is to redirect the output to a log file:
30 19 * * * /opt/bin/bash /volume1/backups/backup.sh >> /var/log/rdiff-backup-cron.log 2>&1
Second of all, it appears you are losing env variables when executing via cron, try adding the env settings to your script
#!/opt/bin/bash
. /root/.profile
/opt/bin/rdiff-backup --force --print-statistics myhost::/var/bkp /volume1/backups/sql > /var/log/rdiff-backup.log 2>&1
If /root/.profile doesn't exist, try adding . /root/.bashrc or . /etc/profile instead.
I hope this helps.
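One more detail worth flagging in the original script: redirection order matters. 2>&1 > file points stderr at the old stdout (usually the terminal) before stdout is redirected into the file, so errors never reach the log; > file 2>&1 captures both streams. A quick demonstration:

```shell
# First form: "err" still goes to the terminal; only "out" lands in the file.
sh -c 'echo out; echo err >&2' 2>&1 > /tmp/a.log
# Second form: both "out" and "err" land in the file.
sh -c 'echo out; echo err >&2' > /tmp/b.log 2>&1
```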
