crontab gsutil command executes but no output

gsutil cp -r gs://mybucetid/stats/installs/installs_com.appdomain_YYYYMM_overview.csv /home/ubuntu/appstats
The above command runs successfully from my terminal; it copies the statistics file from Google Cloud Storage to my local directory.
Hence I tried to put the same command in crontab.
Below is the line from the crontab:
30 00 * * * gsutil cp -r gs://mybucetid/stats/installs/installs_com.appdomain_YYYYMM_overview.csv /home/ubuntu/appstats
The cron job executes on time with no errors (checked in the cron log), but the file is not downloaded to the specified location.
Can anybody tell me what is missing from the command in the crontab, and why the file is not copied from my Google Cloud bucket to the specified local directory?

Simply update YYYYMM to the actual year and month, as shown below:
30 00 * * * gsutil cp -r gs://mybucetid/stats/installs/installs_com.appdomain_201710_overview.csv /home/ubuntu/appstats
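If you would rather not edit the crontab every month, the year-month part can also be generated with date. Note that % has to be escaped as \% inside a crontab entry; a sketch, assuming the file name always follows this YYYYMM pattern:
30 00 * * * gsutil cp -r gs://mybucetid/stats/installs/installs_com.appdomain_`date +\%Y\%m`_overview.csv /home/ubuntu/appstats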

Related

When scheduled with cron, ./azcopy does not run

There is a shell script in the location /file/location/azcopy/, and the AzCopy binary is also located there.
The command below runs successfully when I run it manually:
./azcopy cp "/abc/def/Goa.csv" "https://.blob.core.windows.net/abc\home\xyz?"
However, when I scheduled it in crontab, the ./azcopy command did not execute.
Below is the script:
#!/bin/bash
./azcopy cp "/abc/def/Goa.csv" "https://<Blobaccount>.blob.core.windows.net/abc\home\xyz?<SAS-Token>"
Below is the crontab entry:
00 17 * * * root /file/location/azcopy/script.sh
Is there something I'm doing wrong? Could someone please help me figure it out?
When root executes /file/location/azcopy/script.sh, the working directory is /root, so you need to add cd /file/location/azcopy/ to your script.sh to change the working directory. You can add pwd to the script to see the current working directory.
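A minimal sketch of the adjusted script.sh following that advice (the cd line is the only addition; the placeholders are the same as above):
#!/bin/bash
# Change to the directory that contains the azcopy binary before calling it.
cd /file/location/azcopy/ || exit 1
./azcopy cp "/abc/def/Goa.csv" "https://<Blobaccount>.blob.core.windows.net/abc\home\xyz?<SAS-Token>"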

Crontab to upload file from Google Cloud (gsutil)

I'm using a virtual machine on GCP. Every day I upload a new file (same name) to Cloud Storage, and then use the Cloud Shell terminal to copy the file onto the virtual machine with:
gsutil cp gs://my_bucket/my_file .
I want to create a cronjob that will load the file to the VM at a scheduled time.
Here's my cron:
00 13 * * 1-5 /usr/bin/gsutil cp /home/user_name/ gs://mybucket/my file .
When I check the cron syslog, I see it ran:
(CRON) info (No MTA installed, discarding output)
Found the answer, so I'll post it here.
The problem was that I was not providing the right path to gsutil, and the rest of the syntax was not correct either.
Find the correct gsutil path by running:
gsutil version -l
In my case, the corrected cron was:
00 13 * * 1-5 /snap/google-cloud-sdk/147/bin/gsutil cp gs://mybucket/myfile.py ./
Note that ./ puts the file in my home directory.
(Again, what I'm doing above is copying a file from my Google Cloud Storage bucket ("mybucket") to my virtual machine home directory. It can then be run by another cronjob.)
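Since cron discards the job's output when no MTA is installed, it can also help to redirect stdout and stderr to a log file while debugging; a sketch, where the log path is just an example:
00 13 * * 1-5 /snap/google-cloud-sdk/147/bin/gsutil cp gs://mybucket/myfile.py ./ >> /home/user_name/gsutil-cron.log 2>&1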

Cronjob is not running in Linux

So I am trying to automate backups to S3 buckets on Linux.
The script I am trying to run is:
TIME=`date +%b-%d-%y`
FILENAME=backup-$TIME.tar.gz
SRCDIR=/opt/nexus
DESDIR=/usr/local/backup
tar -cpzf $DESDIR/$FILENAME $SRCDIR
aws s3 cp /usr/local/backup/backup.tar.gz s3://s3backup
The cronjob to run that script is 44 11 * * * ./backup.sh
However, whenever the cron job runs, the backup does not seem to happen at all.
Any ideas why it will not work?
You are creating a date-stamped backup file, but attempting to copy a static file name. Try changing the copy command to:
aws s3 cp $DESDIR/$FILENAME s3://s3backup
Do not use relative path names in the cron job or in the script.
44 11 * * * ./backup.sh
Instead, use the full path of the script:
44 11 * * * <full_path>/backup.sh
In addition, use full paths in your script:
<full_path>/tar -cpzf $DESDIR/$FILENAME $SRCDIR
<full_path>/aws s3 cp /usr/local/backup/backup.tar.gz s3://s3backup
Make sure the cron job is added for the user who has the AWS credentials set up correctly.
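Putting those suggestions together, a sketch of the corrected backup.sh (the tar and aws locations are assumptions; verify them with which tar and which aws):
#!/bin/bash
TIME=`date +%b-%d-%y`
FILENAME=backup-$TIME.tar.gz
SRCDIR=/opt/nexus
DESDIR=/usr/local/backup
# Assumed binary locations; confirm with: which tar / which aws
/bin/tar -cpzf $DESDIR/$FILENAME $SRCDIR
/usr/local/bin/aws s3 cp $DESDIR/$FILENAME s3://s3backup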

tar archiving via cron does not work

I am trying to archive my localhost's root folder with tar and want to automate its execution on a daily basis with crontab. For this purpose, I created a 'backupfolder' in my personal folder. I am running Ubuntu 12.04.
The execution of tar in the command line works fine without problems:
sudo tar -cvpzf backupfolder/localhost.tar.gz /var/www
However, when I schedule the command for a daily backup (let's say at 17:00) in sudo crontab -e, it does not execute, i.e. the backup is not updated. The entry is:
0 17 * * * sudo tar -cpzf backupfolder/localhost.tar.gz /var/www
I already tried the full path /home/user/backupfolder/localhost.tar.gz without success.
/var/log/syslog gives me the following output for the scheduled execution:
Feb 2 17:00:01 DESKTOP-PC CRON[12052]: (root) CMD (sudo tar -cpzfbackupfolder/localhost.tar.gz /var/www)
Feb 2 17:00:01 DESKTOP-PC CRON[12051]: (CRON) info (No MTA installed, discarding output)
/etc/crontab specifies the following path:
SHELL=/bin/sh
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
I assume that crontab is not executing as this is a sudo command.
Is there a way to get this running? What is the recommended, safe way if I don't want to hardcode my root password?
Well, the command that works for you is
sudo tar -cvpzf backupfolder/localhost.tar.gz /var/www
That means you have to run the command with sudo access, and it will not work from within your own crontab.
I would suggest adding the cron job to the root user's crontab.
Basically, do
sudo crontab -e
And add an entry there
0 17 * * * cd /home/user/backupfolder && tar -cpzf localhost.tar.gz /var/www
If that doesn't work, add the full path of tar (like /bin/tar).
Also, while debugging, set the cronjob to run every minute (* * * * *)
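For example, the debugging variant with the full path to tar, running every minute until it works:
* * * * * cd /home/user/backupfolder && /bin/tar -cpzf localhost.tar.gz /var/www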
Basically, the problem is the sudo command, so we will allow sudo to run tar for the user without prompting for the password.
Add the following line in /etc/sudoers file.
user ALL=(ALL) NOPASSWD:/bin/tar
where user is the user installing the crontab.
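With that sudoers rule in place, the user's own crontab can keep calling tar through sudo without a password prompt; a sketch, reusing the paths from the question:
0 17 * * * sudo /bin/tar -cpzf /home/user/backupfolder/localhost.tar.gz /var/www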
I suspect a PATH problem, try to set some variables at the top of sudo crontab -e :
MAILTO=your_email@domain.tld # to get the output if there are errors
PATH=/usr/bin:/bin:/usr/local/bin:/usr/local/sbin:/sbin
You can write your command in a script like run.sh:
#!/bin/sh -l
tar -cvpzf backupfolder/localhost.tar.gz /var/www
then use the crontab to run the script.
IMPORTANT NOTE: the script's first line has the "-l" option, which makes the shell behave as a login shell so your usual PATH is loaded.
Try it.
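The matching crontab entry then just calls the script; a sketch, assuming it is saved as /home/user/run.sh and made executable with chmod +x:
0 17 * * * /home/user/run.sh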

crontab not working

I have this crontab entry:
35 16 * * * mysqldump -h mysql2.alwaysdata.com -u user -ppass --all-databases > ../copias/fichero_`date +%d-%m-%Y-%H:%M:%S`.sql
The command works correctly when run outside of crontab.
The folder has permissions chmod 777 -R.
Thanks.
You should use an absolute path instead of ../copias/fichero....
You don't know what the current directory will be when the command is run by cron.
In the /etc/crontab file you must specify the username as well before the command to run.
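For example, with an absolute output path (adjust /home/user/copias to wherever your copias folder actually lives); note that % is special in crontab entries and has to be escaped as \%:
35 16 * * * mysqldump -h mysql2.alwaysdata.com -u user -ppass --all-databases > /home/user/copias/fichero_`date +\%d-\%m-\%Y-\%H:\%M:\%S`.sql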
