I've made the following entries to back up PHP files and the Apache configuration files from our web server, but I found there are no files in the backup destination folder. I would like to know if the entries have any incorrect syntax.
# Production Events module php files from /var/events/ folder
00 2 * * * tar cvpzf /var/backups/events/events-backup.`date +\%Y\%m\%d`.tar.gz /var/events/ > /var/backups/events/events-backup.log 2>&1
# Production Survey module php files backup from /var/survey/ folder
00 2 * * * tar cvpzf /var/backups/survey/survey-backup.`date +\%Y\%m\%d`.tar.gz /var/survey/ > /var/backups/survey/survey-backup.log 2>&1
# Apache configuration files under /etc/apache2/ directory
00 2 * * * tar cvpzf /var/backups/apache2/apache2-backup.`date +\%Y\%m\%d`.tar.gz /etc/apache2/ > /var/backups/apache2/apache2-backup.log 2>&1
I fixed this issue today. I had missed adding the user "root" just before the tar command (these entries live in a system crontab, whose format includes a user field). The following change fixed the issue:
00 2 * * * root tar cvpzf /var/backups/apache2/apache2-backup.`date +\%Y\%m\%d`.tar.gz /etc/apache2/ > /var/backups/apache2/apache2-backup.log 2>&1
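For completeness, the other two entries need the same fix, since the system crontab format is minute hour day-of-month month day-of-week user command:
00 2 * * * root tar cvpzf /var/backups/events/events-backup.`date +\%Y\%m\%d`.tar.gz /var/events/ > /var/backups/events/events-backup.log 2>&1
00 2 * * * root tar cvpzf /var/backups/survey/survey-backup.`date +\%Y\%m\%d`.tar.gz /var/survey/ > /var/backups/survey/survey-backup.log 2>&1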
I have a simple script that works when I manually put it into the ~/ directory.
#!/bin/bash
source /opt/python/current/env
source /opt/python/run/venv/bin/activate
cd /opt/python/current/app
scrapy crawl myspider
deactivate
exit 0
Since EBS will eventually delete the script and the crontab, I need to create a config file in .ebextensions.
So I've created cron-linux.config, but I can't make it work. It doesn't do anything (I should see changes in a database).
files:
    "/etc/cron.d/crawl":
        mode: "000644"
        owner: root
        group: root
        content: |
            * * * * * /usr/local/bin/crawl.sh

    "/usr/local/bin/crawl.sh":
        mode: "000755"
        owner: ec2-user
        group: ec2-user
        content: |
            #!/bin/bash
            source /opt/python/current/env
            source /opt/python/run/venv/bin/activate
            cd /opt/python/current/app
            scrapy crawl myspider
            deactivate
            exit 0

commands:
    remove_old_cron:
        command: "rm -f /etc/cron.d/crawl.bak"
Do you know where the problem is?
Even when I run eb logs > log.txt and check the file, there is no mention of 'crawl', 'cron', etc.
EDIT
I can even run the script manually: [ec2-user@ip-xxx-xx-x-xx ~]$ /usr/local/bin/crawl.sh
And I see that it works. Just the cron doesn't work.
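One likely culprit, mirroring the fix in the first question above: files in /etc/cron.d use the system crontab format, which requires a user field between the schedule and the command; without it the entry cannot be parsed and the job never runs. A minimal sketch of the corrected content line (running as ec2-user, which owns the script, is an assumption; root would also work):
* * * * * ec2-user /usr/local/bin/crawl.sh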
What I'm trying to do is:
1) Install MySQL
2) Create a database named "important_files" with tables "data1", "data2", "data3"
3) Create folders in /opt named "incremental-backup", "full-backup" and "dump_backup"
4) Back up the database "important_files" into the folder "dump_backup" every 5 minutes (archive the dump_backup folder)
5) Create a script that will:
- back up the dump_backup folder using the incremental method
- back up the dump_backup folder using the full backup method
- notify gilroy.toledano@toro.io via e-mail
Note: The archive folder should only contain the files and no extra directories.
This is my crontab -e configuration:
MAILTO=toro.guinto#gmail.com
*/1 * * * * tar -czf /opt/dump-backup/backup_.tar -C /var/lib/mysql/ important_files
*/1 * * * * tar -czf tar -czf /opt/full-backup/fullbackup_.tar -C /opt/ dump-backup
I've tried all the possible solutions, including mutt, mailx, etc., but none are working.
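For what it's worth, a minimal sketch of what the two entries might look like with the duplicated tar -czf removed, a timestamp added, and the 5-minute schedule from the requirements (inside a crontab, % must be escaped as \%, as in the first question above, because cron treats an unescaped % as a newline):
*/5 * * * * tar -czf /opt/dump-backup/backup_`date +\%Y\%m\%d\%H\%M`.tar -C /var/lib/mysql/ important_files
*/5 * * * * tar -czf /opt/full-backup/fullbackup_`date +\%Y\%m\%d\%H\%M`.tar -C /opt/ dump-backup
Also note that MAILTO only produces mail if the job writes output and a local mail transfer agent (e.g. postfix or sendmail) is installed and working; mutt and mailx alone are not enough.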
So I am trying to automate backups to S3 buckets on Linux. The script I am trying to run is:
TIME=`date +%b-%d-%y`
FILENAME=backup-$TIME.tar.gz
SRCDIR=/opt/nexus
DESDIR=/usr/local/backup
tar -cpzf $DESDIR/$FILENAME $SRCDIR
aws s3 cp /usr/local/backup/backup.tar.gz s3://s3backup
The cron job to run that script is 44 11 * * * ./backup.sh
However, whenever I let the cron job run the backup script, it does not seem to work at all.
Any ideas why it will not work?
You are creating a date-stamped backup file, but attempting to copy a static file name. Try changing the copy command to:
aws s3 cp $DESDIR/$FILENAME s3://s3backup
Do not use relative path names in the cron job or the script.
44 11 * * * ./backup.sh
Instead, use full path of the script.
44 11 * * * <full_path>/backup.sh
In addition, use full paths in your script:
<full_path>/tar -cpzf $DESDIR/$FILENAME $SRCDIR
<full_path>/aws s3 cp $DESDIR/$FILENAME s3://s3backup
Make sure the cron job is added for the user who has the AWS credentials set up correctly.
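Putting the two fixes together, a minimal sketch of the corrected script with absolute paths (the /bin/tar and /usr/bin/aws locations are typical but should be verified with which tar and which aws):
#!/bin/bash
TIME=`date +%b-%d-%y`
FILENAME=backup-$TIME.tar.gz
SRCDIR=/opt/nexus
DESDIR=/usr/local/backup
/bin/tar -cpzf $DESDIR/$FILENAME $SRCDIR
/usr/bin/aws s3 cp $DESDIR/$FILENAME s3://s3backup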
gsutil cp -r gs://mybucetid/stats/installs/installs_com.appdomain_YYYYMM_overview.csv /home/ubuntu/appstats
The above command runs successfully from my terminal. The command copies the statistics file from google cloud to my local directory.
Hence I tried to put the above command in crontab.
below is the line from the crontab
30 00 * * * gsutil cp -r gs://mybucetid/stats/installs/installs_com.appdomain_YYYYMM_overview.csv /home/ubuntu/appstats
The cron executes on time with no errors (checked in the cron log), but the file does not download to the specified location.
Can anybody tell me what is missing in the crontab command and why the file is not copied from my Google Cloud bucket to my specified local directory?
Simply update YYYYMM to an actual date, as shown below:
30 00 * * * gsutil cp -r gs://mybucetid/stats/installs/installs_com.appdomain_201710_overview.csv /home/ubuntu/appstats
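If the intent is to always fetch the current month's file rather than hard-coding 201710, the date can be generated inline; note that % must be escaped as \% inside a crontab line (as in the first question above). A sketch:
30 00 * * * gsutil cp -r gs://mybucetid/stats/installs/installs_com.appdomain_`date +\%Y\%m`_overview.csv /home/ubuntu/appstats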
I am very sorry for asking this again, but I've tried all the advice. I have 2 scripts in /var/TPbackup_script/. This is the first one:
mysqldump -u root -pPASSWORD teampass > /var/TPbackups/TPbackup_$(date +"%Y-%m-%d").sql
Corresponding cronjob in /etc/crontab
20 9 * * * root sudo sh /var/TPbackup_script/TPbackup_script
This script works in crontab. All is good. The second script does not run:
s3cmd sync /var/TPbackups s3://PwdMgmt
Corresponding cronjob in /etc/crontab:
25 9 * * * root sudo sh /var/TPbackup_script/TPsyncS3_script
This one fails. If I run it manually in a terminal:
sudo sh /var/TPbackup_script/TPsyncS3_script
then it works perfectly. What I tried:
1) Adding the shebang #!/bin/sh at the beginning of the script
2) Renaming the script to TPsyncS3_script.sh
3) Adding the script to cron.daily; it showed up in the list of daily cron tasks (I can see it with the command run-parts --test /etc/cron.daily)
No success.
Here is my /etc/crontab file:
# /etc/crontab: system-wide crontab
# Unlike any other crontab you don't have to run the `crontab'
# command to install the new version when you edit this file
# and files in /etc/cron.d. These files also have username fields,
# that none of the other crontabs do.
SHELL=/bin/sh
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
# m h dom mon dow user command
17 * * * * root cd / && run-parts --report /etc/cron.hourly
16 9 * * * root test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.daily )
47 6 * * 7 root test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.weekly )
52 6 1 * * root test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.monthly )
20 9 * * * root sudo sh /var/TPbackup_script/TPbackup_script
25 9 * * * root sudo sh /var/TPbackup_script/TPsyncS3_script.sh > /var/TPbackup_script/sync_log.txt
#
All permissions on scripts were set with sudo chmod 777.
And by the way, sync_log.txt was created by the cron job, but it's empty.
Any help is appreciated
I had the same problem. I solved it by adding the option to specify the location of the .s3cfg file:
--config /root/.s3cfg
e.g.:
s3cmd sync --config /root/.s3cfg /var/TPbackups s3://PwdMgmt
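The likely reason this is needed: cron provides a minimal environment, and s3cmd by default looks for its configuration in $HOME/.s3cfg, so it may not find the file that an interactive sudo sh run picks up. Relatedly, redirecting stderr as well would let sync_log.txt capture the actual error, e.g.:
25 9 * * * root sh /var/TPbackup_script/TPsyncS3_script.sh > /var/TPbackup_script/sync_log.txt 2>&1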
I had a similar problem. Try running your script from root's crontab:
sudo crontab -e
Add your script and try again. It worked for me :)
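One note on format: a personal crontab opened with sudo crontab -e uses five time fields followed directly by the command, with no user column (that column exists only in /etc/crontab and /etc/cron.d, as the header in the question's /etc/crontab points out). So the equivalent entry there would be:
25 9 * * * sh /var/TPbackup_script/TPsyncS3_script.sh > /var/TPbackup_script/sync_log.txt 2>&1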