MAILTO is not sending mail to my email - Linux

What I'm trying to do is:
Install MySQL.
Create a database named "important_files" with tables "data1", "data2", "data3".
Create folders in /opt named "incremental-backup", "full-backup" and "dump_backup".
Back up the database "important_files" into the "dump_backup" folder every 5 minutes (ARCHIVE THE DUMP_BACKUP FOLDER).
Create a script that will:
backup the dump_backup folder using incremental method.
backup the dump_backup folder using full backup method.
Will notify gilroy.toledano@toro.io via e-mail.
Note: The archive folder should only contain the files and no extra directories.
This is my crontab -e configuration:
MAILTO=toro.guinto@gmail.com
*/1 * * * * tar -czf /opt/dump-backup/backup_.tar -C /var/lib/mysql/ important_files
*/1 * * * * tar -czf /opt/full-backup/fullbackup_.tar -C /opt/ dump-backup
I've tried all the possible solutions, including mutt, mailx, etc., but none of them work.
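A detail worth checking for the question above: cron only sends mail when a job actually writes something to stdout or stderr, and delivery additionally requires a working local MTA (postfix, sendmail, msmtp-mta, or similar) on the machine. The tar jobs shown are silent on success, so there may simply be no output to mail. A minimal test crontab (addresses assumed to use @) that always produces output:

```
MAILTO=toro.guinto@gmail.com
# This job prints something every minute, so cron has output to mail.
# If no local MTA is installed, nothing will be delivered regardless of MAILTO.
* * * * * echo "cron mail test"
```

Checking /var/log/syslog (or the mail log) after a run will show whether cron attempted delivery at all.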

Related

Error in crontab while trying to save in CLI mode

I am trying to add a command to crontab so that any folder named "XYZABC" created on my server is deleted, and it should run every hour.
* 1 * * * *  /usr/bin/find /home/ -type d -name "XYZABC" -exec rm -rf {} +
When I try to save the above entry, it shows me an error:
crontab: installing new crontab
"/tmp/crontab.cc9iSz":23: bad command
errors in crontab file, can't install.
Do you want to retry the same edit?
What am I doing wrong?
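One likely problem with the entry above: it has six time fields (`* 1 * * * *`) where a crontab entry takes exactly five before the command. Note also that the error points at line 23 of the temp file, so it is worth checking that line of the crontab as well. An every-hour version of the intended job would be:

```
# Run at minute 0 of every hour
0 * * * * /usr/bin/find /home/ -type d -name "XYZABC" -exec rm -rf {} +
```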

Cronjob is not running in Linux

So I am trying to automate backups to S3 buckets on Linux.
The script I am trying to run is:
TIME=`date +%b-%d-%y`
FILENAME=backup-$TIME.tar.gz
SRCDIR=/opt/nexus
DESDIR=/usr/local/backup
tar -cpzf $DESDIR/$FILENAME $SRCDIR
aws s3 cp /usr/local/backup/backup.tar.gz s3://s3backup
The cronjob to run that script is 44 11 * * * ./backup.sh
However whenever I try to run the backup script (by updating cronjob) it does not seem to be working at all.
Any ideas why it will not work?
You are creating a date-stamped backup file, but attempting to copy a static file name. Try changing the copy command to:
aws s3 cp $DESDIR/$FILENAME s3://s3backup
Do not use relative path names in the cron job or the script.
44 11 * * * ./backup.sh
Instead, use full path of the script.
44 11 * * * <full_path>/backup.sh
In addition, use full path in your script:
<full_path>/tar -cpzf $DESDIR/$FILENAME $SRCDIR
<full_path>/aws s3 cp /usr/local/backup/backup.tar.gz s3://s3backup
Make sure the cron job is added for the user who has the AWS credentials set up correctly.
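Putting the two fixes together, a corrected backup.sh might look like the sketch below. The directories and bucket name are the asker's; the /bin/tar and /usr/bin/aws locations are assumptions (verify with which tar and which aws on the target machine):

```shell
#!/bin/bash
# Date-stamped archive name, e.g. backup-Jan-05-25.tar.gz
TIME=$(date +%b-%d-%y)
FILENAME=backup-$TIME.tar.gz
SRCDIR=/opt/nexus
DESDIR=/usr/local/backup

mkdir -p "$DESDIR"
# Full paths to binaries, since cron runs with a minimal PATH
/bin/tar -cpzf "$DESDIR/$FILENAME" "$SRCDIR"
# Copy the same date-stamped file we just created, not a static name
/usr/bin/aws s3 cp "$DESDIR/$FILENAME" s3://s3backup
```

The crontab entry then calls the script by its absolute path, not ./backup.sh.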

Cron entry does not work

I've made the following entries to back up the PHP files and the Apache configuration files from our web server. But I found there are no files in the backup destination folder. I would like to know if the entries have any incorrect syntax.
# Production Events module php files from /var/events/ folder
00 2 * * * tar cvpzf /var/backups/events/events-backup.`date +\%Y\%m\%d`.tar.gz /var/events/ > /var/backups/events/events-backup.log 2>&1
# Production Survey module php files backup from /var/survey/ folder
00 2 * * * tar cvpzf /var/backups/survey/survey-backup.`date +\%Y\%m\%d`.tar.gz /var/survey/ > /var/backups/survey/survey-backup.log 2>&1
# Apache configuration files under /etc/apache2/ directory
00 2 * * * tar cvpzf /var/backups/apache2/apache2-backup.`date +\%Y\%m\%d`.tar.gz /etc/apache2/ > /var/backups/apache2/apache2-backup.log 2>&1
I fixed this issue today. I had forgotten to add the user field "root" just before the tar command (these entries live in a system crontab, which requires one). The following change fixed the issue:
00 2 * * * root tar cvpzf /var/backups/apache2/apache2-backup.`date +\%Y\%m\%d`.tar.gz /etc/apache2/ > /var/backups/apache2/apache2-backup.log 2>&1
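For reference, the user field is only valid (and required) in the system crontab /etc/crontab and in files under /etc/cron.d/; entries installed with crontab -e must not include it:

```
# /etc/crontab or /etc/cron.d/* : minute hour dom month dow USER command
00 2 * * * root tar cvpzf ...

# crontab -e (per-user crontab)  : minute hour dom month dow command
00 2 * * * tar cvpzf ...
```

Mixing the two formats is a common cause of both "bad command" errors and silently skipped jobs.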

cron job not working properly giving error as "syntax error near unexpected token `)'"

I'm creating a cron job that takes a backup of my entire DB. For that I used the following:
*/5 * * * * mysqldump -u mydbuser -p mypassword mydatabase | gzip > /home/myzone/public_html/test.com/newfolder/dbBackup/backup.sql.gz
But instead of getting a backup I'm getting the error "syntax error near unexpected token `)'". My password contains a round bracket; is this happening because of that? Please help me.
Thanks in advance.
) is a special character for the shell (and crontab uses the shell to execute commands).
Add single quotes around your password:
*/5 * * * * mysqldump -u mydbuser -p 'mypassword' mydatabase | ...
Also try removing the space between the options and their values. Note that -p with a space does not take the next word as the password: mysqldump will prompt for the password and treat mypassword as a database name.
-umydbuser -pmydbpassword becomes:
-umydbuser -pmypassword
As I suggested in my comment, move this into an external script and include the script in cron.daily. I've given below a basic skeleton for such a script. This way you gain a couple of advantages: you can test the script, you can easily reuse it, and it's configurable. I don't know if you're doing this for administration or personal use; my suggestion leans towards "I do it for administration" :)...
#!/bin/bash
# Backup destination directory
DIR_BACKUP=/your/backup/directory
# Timestamp format for filenames
TIMESTAMP=$(date +%Y%m%d-%H%M%S)
# Database name
DB_NAME=your_database_name
# Database user
DB_USER=your_database_user
# Database password
DB_PASSWD=your_database_password
# Database export file name
DB_EXPORT=your_database_export_filename.sql
# Backup file path
BKFILE=$DIR_BACKUP/your-backup-archive-name-$TIMESTAMP.tar
# Format for time recordings
TIME="%E"
###########################################
# Create the parent backup directory if it does not exist
if [ ! -d "$DIR_BACKUP" ]
then
    echo "=== Backup directory not found, creating it ==="
    mkdir -p "$DIR_BACKUP"
fi
# Create the backup tar file
echo "=== Creating the backup archive ==="
touch "$BKFILE"
# Export the database (password is quoted so shell metacharacters are safe)
echo "=== Exporting $DB_NAME database ==="
time bash -c "mysqldump --user $DB_USER --password='$DB_PASSWD' $DB_NAME > $DIR_BACKUP/$DB_EXPORT"
# Add the database export to the tar file, remove after adding
echo "=== Adding the database export to the archive ==="
time tar rvf "$BKFILE" "$DIR_BACKUP/$DB_EXPORT" --remove-files
# Compress the tar file
echo "=== Compressing the archive ==="
time gzip "$BKFILE"
# All done
DATE=$(date)
echo "=== $DATE: Backup complete ==="
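Once the skeleton is filled in and made executable, it can be hooked into cron either by dropping it into /etc/cron.daily/ (as suggested above) or with an explicit crontab entry; the script path and schedule below are hypothetical:

```
# Run the backup every day at 02:30, keeping a log of the script's output
30 2 * * * /usr/local/bin/db-backup.sh >> /var/log/db-backup.log 2>&1
```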

Cronjob or rake task for automate a download, unzip and move to a specified directory

I'm not sure if this should be achieved using a cron job or a rake task.
There is a large zipped file (250MB+) provided by a third party that is updated weekly. I can't keep downloading it to my local machine and uploading it to the server weekly to replace the old data. Is there any way I can script this workflow:
Download a zipped file from this URL: http://download.abc.com/data.zip every Sunday 4am.
Unzip it to data.
Move the folder, its subfolders and contents to public/data and replace old public/data.
Many thanks.
This sounds like a job for a bash script. Install it with crontab -e.
#!/bin/bash
set -e
cd /tmp
# start clean in case a previous run left files behind
rm -rf data
mkdir data
cd data
wget http://download.abc.com/data.zip
# assumes the zip contains a top-level data/ folder
unzip data.zip
rm -rf /public/data/*
mv data/* /public/data/
chown -R www-data:www-data /public/data/
I didn't test it, but it should do what you want.
Don't forget to adapt the owner/group www-data:www-data to your own needs.
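To match the schedule asked for (every Sunday at 4am), the script, saved at a hypothetical /usr/local/bin/fetch-data.sh and made executable, would be installed via crontab -e as:

```
# minute hour day-of-month month day-of-week (0 = Sunday)
0 4 * * 0 /usr/local/bin/fetch-data.sh
```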
