I am trying to automate backups to S3 buckets on Linux.
The script I am trying to run is:
TIME=`date +%b-%d-%y`
FILENAME=backup-$TIME.tar.gz
SRCDIR=/opt/nexus
DESDIR=/usr/local/backup
tar -cpzf $DESDIR/$FILENAME $SRCDIR
aws s3 cp /usr/local/backup/backup.tar.gz s3://s3backup
The cron job to run that script is:
44 11 * * * ./backup.sh
However, whenever I try to run the backup script (by updating the cron job), it does not seem to work at all.
Any ideas why it will not work?
You are creating a date-stamped backup file, but attempting to copy a static file name. Try changing the copy command to:
aws s3 cp $DESDIR/$FILENAME s3://s3backup
Do not use relative path names in the cron job or the script.
44 11 * * * ./backup.sh
Instead, use the full path of the script.
44 11 * * * <full_path>/backup.sh
In addition, use full paths in your script:
<full_path>/tar -cpzf $DESDIR/$FILENAME $SRCDIR
<full_path>/aws s3 cp $DESDIR/$FILENAME s3://s3backup
Make sure the cron job is added for the user who has the AWS credentials set up correctly.
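Putting those suggestions together, here is a minimal sketch of what backup.sh could look like; the /bin/tar and /usr/local/bin/aws paths are assumptions, so check them on your system with which tar and which aws:
#!/bin/bash
# Date-stamped backup of /opt/nexus, then upload to S3.
TIME=$(date +%b-%d-%y)
FILENAME=backup-$TIME.tar.gz
SRCDIR=/opt/nexus
DESDIR=/usr/local/backup
# Full paths, because cron runs with a minimal PATH.
/bin/tar -cpzf "$DESDIR/$FILENAME" "$SRCDIR"
/usr/local/bin/aws s3 cp "$DESDIR/$FILENAME" s3://s3backup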
There is a shell script in /file/location/azcopy/, and the azcopy binary is also located there.
The below command runs successfully when I run it manually:
./azcopy cp "/abc/def/Goa.csv" "https://.blob.core.windows.net/abc\home\xyz?"
However, when I scheduled it in crontab, the "./azcopy" command didn't execute.
Below is the script:
#!/bin/bash
./azcopy cp "/abc/def/Goa.csv" "https://<Blobaccount>.blob.core.windows.net/abc\home\xyz?<SAS-Token>"
Below is the crontab entry:
00 17 * * * root /file/location/azcopy/script.sh
Is there something I'm doing wrong? Could someone please help me figure out what's wrong?
When you use root to execute /file/location/azcopy/script.sh, your working directory is /root, so you need to add cd /file/location/azcopy/ in your script.sh to change the working directory. You can add pwd in your script to print the current working directory.
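A minimal sketch of the adjusted script.sh, keeping the <Blobaccount> and <SAS-Token> placeholders from the question:
#!/bin/bash
# cron starts in the invoking user's home directory (/root here),
# so change into the directory that contains the azcopy binary first.
cd /file/location/azcopy/ || exit 1
./azcopy cp "/abc/def/Goa.csv" "https://<Blobaccount>.blob.core.windows.net/abc\home\xyz?<SAS-Token>"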
I'm using a virtual machine on GCP, and every day I upload a new file (same name) to Storage, then use the Cloud Shell Terminal to upload the file to the virtual machine using:
gsutil cp gs://my_bucket/my_file .
I want to create a cronjob that will load the file to the VM at a scheduled time.
Here's my cron:
00 13 * * 1-5 /usr/bin/gsutil cp /home/user_name/ gs://mybucket/my file .
When I check the cron syslog, I see it ran:
(CRON) info (No MTA installed, discarding output)
Found the answer, so I'll post it here.
The problem was that I was not providing the right path to gsutil, and the rest of the syntax was not correct either.
Find the correct gsutil path by running:
gsutil version -l
In my case, the corrected cron was:
00 13 * * 1-5 /snap/google-cloud-sdk/147/bin/gsutil cp gs://mybucket/myfile.py ./
Note the ./ puts the file in my home directory.
(Again, what I'm doing above is copying a file from my Google Cloud Storage bucket ("mybucket") to my virtual machine home directory. It can then be run by another cronjob.)
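If a gsutil cron job still seems to do nothing, redirecting its output to a log file is a simple way to see errors that cron would otherwise discard when no MTA is installed (the log path below is just an example):
00 13 * * 1-5 /snap/google-cloud-sdk/147/bin/gsutil cp gs://mybucket/myfile.py ./ >> /home/user_name/gsutil-cron.log 2>&1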
I installed Debian 9 on my VPS. I installed LAMP on the server. I'm logged in as root, I created a new site "/var/www/example.com" and I see that the permissions are "root:root". The web page is displayed in the browser.
I created a cron.php file that writes the current time to a file. In crontab I have /usr/bin/php /var/www/example.com/cron.php. If I run this command from the terminal, everything works. However, the cron job fails because it does not have write permissions, even though cron runs as root. The directory has 777 permissions.
I tried setting /var/www to www-data:www-data and doing the same for crontab (crontab -u www-data -e). The result is the same: cron runs but does not write to the file.
EDIT:
I found that if the script contains file_put_contents('output.txt', 'xxx');, the file created by cron ends up in /root. If I set the full path, everything is fine: file_put_contents('/var/www/example.com/output.txt', 'xxx');. Is there any way to modify this behavior?
You can create a wrapper script like this:
#!/bin/bash
source ~/.bashrc #or use .bash_profile
/usr/bin/php /var/www/example.com/cron.php >>/path/to/output
and add it as a cron record:
0 * * * * /path/to/script.sh
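Regarding the EDIT: cron starts the command in the invoking user's home directory, so a relative path like output.txt is created there. One option (a sketch, not the only way) is to change into the site directory in the cron entry itself:
0 * * * * cd /var/www/example.com && /usr/bin/php cron.php >> /path/to/output 2>&1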
What I'm trying to do is:
Installation of MySQL
Create a database named "important_files" with tables "data1" "data2" "data3"
Create folders in /opt named "incremental-backup", "full-backup" and "dump_backup"
Back up the database "important_files" into the folder "dump_backup" every 5 minutes (archive the dump_backup folder)
Create a script that will:
back up the dump_backup folder using the incremental method
back up the dump_backup folder using the full backup method
notify gilroy.toledano#toro.io via e-mail
Note: The archive folder should only contain the files and no extra directories.
This is my crontab -e configuration:
MAILTO=toro.guinto#gmail.com
*/1 * * * * tar -czf /opt/dump-backup/backup_.tar -C /var/lib/mysql/ important_files
*/1 * * * * tar -czf tar -czf /opt/full-backup/fullbackup_.tar -C /opt/ dump-backup
I've tried all the possible solutions, including mutt, mailx, etc., but none are working.
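As a hedged sketch only: the second crontab line repeats tar -czf, and MAILTO needs a real @ for mail to be delivered. The date stamp in the archive names is an assumption, added so each run does not overwrite the previous file; note that % must be escaped as \% inside crontab:
MAILTO=toro.guinto@gmail.com
*/1 * * * * tar -czf /opt/dump-backup/backup_$(date +\%F-\%H\%M).tar.gz -C /var/lib/mysql/ important_files
*/1 * * * * tar -czf /opt/full-backup/fullbackup_$(date +\%F-\%H\%M).tar.gz -C /opt/ dump-backup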
gsutil cp -r gs://mybucetid/stats/installs/installs_com.appdomain_YYYYMM_overview.csv /home/ubuntu/appstats
The above command runs successfully from my terminal. It copies the statistics file from Google Cloud Storage to my local directory.
Hence I tried to put the above command in crontab.
Below is the line from the crontab:
30 00 * * * gsutil cp -r gs://mybucetid/stats/installs/installs_com.appdomain_YYYYMM_overview.csv /home/ubuntu/appstats
The cron job executes on time with no errors (checked in the cron log), but the file is not downloaded to the specified location.
Can anybody suggest what is missing in the crontab command and why the file is not copied from my Google Cloud bucket to the specified local directory?
Simply update YYYYMM to the actual date, as shown below:
30 00 * * * gsutil cp -r gs://mybucetid/stats/installs/installs_com.appdomain_201710_overview.csv /home/ubuntu/appstats
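If the goal is to pick up the current month automatically instead of hard-coding 201710, one possible variant (an assumption, not part of the original answer) is to let date build the YYYYMM portion; remember that % has to be escaped in crontab:
30 00 * * * gsutil cp -r gs://mybucetid/stats/installs/installs_com.appdomain_$(date +\%Y\%m)_overview.csv /home/ubuntu/appstats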