Crontab to upload file from Google Cloud (gsutil) - cron

I'm using a virtual machine on GCP. Every day I upload a new file (same name) to Storage, then use the Cloud Shell Terminal to copy the file to the virtual machine with:
gsutil cp gs://my_bucket/my_file .
I want to create a cronjob that will load the file to the VM at a scheduled time.
Here's my cron:
00 13 * * 1-5 /usr/bin/gsutil cp /home/user_name/ gs://mybucket/my file .
When I check the cron syslog, I see it ran:
(CRON) info (No MTA installed, discarding output)

Found the answer, so I'll post it here.
The problem was that I was not providing the right path to gsutil, and the rest of the syntax was wrong as well.
Find the correct gsutil path by running:
gsutil version -l
In my case, the corrected cron was:
00 13 * * 1-5 /snap/google-cloud-sdk/147/bin/gsutil cp gs://mybucket/myfile.py ./
Note that the ./ puts the file in my home directory (cron runs the job from the user's home directory by default).
(Again, what I'm doing above is copying a file from my Google Cloud Storage bucket ("mybucket") to my virtual machine's home directory, where it can then be run by another cronjob.)
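As an aside, the "(No MTA installed, discarding output)" line means cron had output from the job but nowhere to mail it. If you want to see what the job prints, you can redirect its output to a log file; for example (the log file name here is just an example):
00 13 * * 1-5 /snap/google-cloud-sdk/147/bin/gsutil cp gs://mybucket/myfile.py ./ >> /home/user_name/gsutil-cron.log 2>&1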

Related

When scheduled with cron, ./azcopy does not run

There is a shell script in the location /file/location/azcopy/, and the azcopy binary is also located there.
The command below runs successfully when I run it manually:
./azcopy cp "/abc/def/Goa.csv" "https://.blob.core.windows.net/abc\home\xyz?"
However, when I scheduled it in crontab, the "./azcopy" command didn't execute.
Below is the script:
#!/bin/bash
./azcopy cp "/abc/def/Goa.csv" "https://<Blobaccount>.blob.core.windows.net/abc\home\xyz?<SAS- Token>"
Below is the crontab entry:
00 17 * * * root /file/location/azcopy/script.sh
Is there something I'm doing wrong?
Could someone please help me figure out what's wrong?
When you use root to execute /file/location/azcopy/script.sh, your working directory is /root, so you need to add cd /file/location/azcopy/ to your script.sh to change the working directory. You can add pwd to the script to print the current working directory.
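With that change, the script could look like this (a sketch that keeps the placeholder paths from the question; substitute your real blob account and SAS token):
#!/bin/bash
# cron starts this script with /root as the working directory,
# so change into the directory that contains the azcopy binary first
cd /file/location/azcopy/ || exit 1
./azcopy cp "/abc/def/Goa.csv" "https://<Blobaccount>.blob.core.windows.net/abc\home\xyz?<SAS-Token>"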

Cronjob is not running in Linux

So I am trying to automate backups to S3 buckets through Linux.
The script I am trying to run is
TIME=`date +%b-%d-%y`
FILENAME=backup-$TIME.tar.gz
SRCDIR=/opt/nexus
DESDIR=/usr/local/backup
tar -cpzf $DESDIR/$FILENAME $SRCDIR
aws s3 cp /usr/local/backup/backup.tar.gz s3://s3backup
The cronjob to run that script is 44 11 * * * ./backup.sh
However, whenever I try to run the backup script (by updating the cronjob), it does not seem to work at all.
Any ideas why it will not work?
You are creating a date-stamped backup file, but attempting to copy a static file name. Try changing the copy command to:
aws s3 cp $DESDIR/$FILENAME s3://s3backup
Do not use relative path names in the cron job or in the script.
44 11 * * * ./backup.sh
Instead, use full path of the script.
44 11 * * * <full_path>/backup.sh
In addition, use full paths in your script:
<full_path>/tar -cpzf $DESDIR/$FILENAME $SRCDIR
<full_path>/aws s3 cp /usr/local/backup/backup.tar.gz s3://s3backup
Make sure the cron job is added for the user who has the AWS credentials set up correctly.
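Putting both answers together, the script could look something like this (a sketch; the tar and aws locations are examples, so verify them on your system, e.g. with "which tar" and "which aws"):
#!/bin/bash
# date-stamped archive name, as in the original script
TIME=$(date +%b-%d-%y)
FILENAME=backup-$TIME.tar.gz
SRCDIR=/opt/nexus
DESDIR=/usr/local/backup
# use absolute paths to the binaries so cron's minimal PATH doesn't matter
/bin/tar -cpzf "$DESDIR/$FILENAME" "$SRCDIR"
/usr/local/bin/aws s3 cp "$DESDIR/$FILENAME" s3://s3backup
And the crontab entry with the full path to the script (the path is just an example):
44 11 * * * /home/ubuntu/backup.sh >> /home/ubuntu/backup.log 2>&1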

How can I keep a crontab in a file in Ubuntu Docker

I am using Docker and the OS is Ubuntu.
If I use crontab -e and place the entry in there, then cron runs fine.
* * * * * /var/www/daily.sh
But if I remove the container, that crontab is also gone. I want to somehow place the crontab in a file like crontabs.sh and then mount that inside the container, so that if I recreate the container my cron is still there.
I don't know at what location I need to mount it so that cron runs normally. Something like:
/myhost/code/crontabs.sh: /etc/crons.daily
As mentioned in this answer, you can copy your file, adding to your Dockerfile:
FROM ubuntu:latest
MAINTAINER docker@ekito.fr
# Add crontab file in the cron directory
COPY crontab /etc/cron.d/hello-cron
# Give execution rights on the cron job
RUN chmod 0644 /etc/cron.d/hello-cron
# Create the log file to be able to run tail
RUN touch /var/log/cron.log
# Run the command on container startup
CMD cron && tail -f /var/log/cron.log
(Source: "Run a cron job with Docker" by Julien Boulay.)
That way, your image will always include the right cron definition.
You can initialize the content of 'crontab', the local file you are copying to your image, with cronsandbox.com.
In your case: 0 23 * * *
If you don't want to build a new image at each change, remove the COPY line and mount the file at runtime:
docker run -v "$(pwd)/crontab:/etc/cron.d/hello-cron" --name mycontainer myimage
That way, the local file crontab is mounted in the container as /etc/cron.d/hello-cron (or any other name you want).
Whenever you change it, stop and restart your container.
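For reference, files dropped into /etc/cron.d use the system crontab format, which takes a user field between the schedule and the command, and on Debian/Ubuntu the file name must not contain a dot or cron will ignore it (which is why a name like hello-cron works). A sketch of what the local crontab file could contain for the daily.sh job from the question:
# system crontab format: the user to run as (here root) comes after the schedule
0 23 * * * root /var/www/daily.sh >> /var/log/cron.log 2>&1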

crontab gsutil command executes but no output

gsutil cp -r gs://mybucetid/stats/installs/installs_com.appdomain_YYYYMM_overview.csv /home/ubuntu/appstats
The above command runs successfully from my terminal. The command copies the statistics file from Google Cloud Storage to my local directory.
Hence I tried to put the above command in crontab.
below is the line from the crontab
30 00 * * * gsutil cp -r gs://mybucetid/stats/installs/installs_com.appdomain_YYYYMM_overview.csv /home/ubuntu/appstats
The cron executes on time with no errors (checked in the cron log), but the file does not download to the specified location.
Can anybody suggest what is missing in the crontab command and why the file is not copied from my Google Cloud bucket to the specified local directory?
Simply update YYYYMM to an actual date, as shown below:
30 00 * * * gsutil cp -r gs://mybucetid/stats/installs/installs_com.appdomain_201710_overview.csv /home/ubuntu/appstats
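If you don't want to edit the crontab every month, one option (not part of the original answer) is to build the YYYYMM part with the date command; note that percent signs must be escaped in a crontab entry, and, as in the first question above, you may also need the absolute path to gsutil:
30 00 * * * gsutil cp -r gs://mybucetid/stats/installs/installs_com.appdomain_$(date +\%Y\%m)_overview.csv /home/ubuntu/appstats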

`gcloud compute copy-files`: permission denied when copying files

I'm having a hard time copying files over to my Google Compute Engine. I am using an Ubuntu server on Google Compute Engine.
I'm doing this from my OS X terminal and I am already authorized using gcloud.
local:$ gcloud compute copy-files /Users/Bryan/Documents/Websites/gce/index.php example-instance:/var/www/html --zone us-central1-a
Warning: Permanently added '<IP>' (RSA) to the list of known hosts.
scp: /var/www/html/index.php: Permission denied
ERROR: (gcloud.compute.copy-files) [/usr/bin/scp] exited with return code [1].
insert root@ before the instance name:
local:$ gcloud compute copy-files /Users/Bryan/Documents/Websites/gce/index.php root@example-instance:/var/www/html --zone us-central1-a
The reason this doesn't work is that your username does not have permissions on the GCE VM instance and so cannot write to /var/www/html/.
Note that since this question is about Google Compute Engine VMs, you cannot SSH directly to a VM as root, nor can you copy files directly as root, for the same reason: gcloud compute scp uses scp which relies on ssh for authentication.
Possible solutions:
This solution (also suggested by Faizan in the comments) requires two steps every time:
use gcloud compute scp --recurse to transfer files/directories to a location your user can write to, e.g., /tmp or /home/$USER
log in to the GCE VM via gcloud compute ssh or via the SSH button on the console and copy using sudo to get proper permissions:
# note: sample command; adjust paths appropriately
sudo cp -r $HOME/html/* /var/www/html
This solution is one step, after some prior prep work:
one-time setup: give your username write access to /var/www/html directly; this can be done in several ways; here's one approach:
# make the HTML directory owned by the current user, recursively
sudo chown -R $USER /var/www/html
now you can run the copy in one step:
gcloud compute scp --recurse \
--zone us-central1-a \
/Users/Bryan/Documents/Websites/gce/index.php \
example-instance:/var/www/html
I use a bash script to copy from my local machine to a writable directory on the remote GCE machine; then, using ssh, I move the files.
SRC="/cygdrive/d/mysourcedir"
TEMP="~/incoming"
DEST="/var/my-disk1/my/target/dir"
You also need to set GCE_USER and GCE_INSTANCE
echo "=== Pushing data from $SRC to $DEST in two simple steps"
echo "=== 1) Copy to a writable temp directoy in user home"
gcloud compute copy-files "$SRC"/*.* "${GCE_USER}#${GCE_INSTANCE}:$TEMP"
echo "=== 2) Move with 'sudo' to destination"
gcloud compute ssh ${GCE_USER}#${GCE_INSTANCE} --command "sudo mv $TEMP/*.* $DEST"
In my case I don't want to chown the target dir as this causes other problems with other scripts ...
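One detail the snippet above assumes is that the temp directory already exists on the VM; if it doesn't, you could create it first with something like:
gcloud compute ssh ${GCE_USER}@${GCE_INSTANCE} --command "mkdir -p ~/incoming"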
I had the same problem and didn't get it to work using the methods suggested in the other answers. What finally worked was to explicitly send in my "user" when copying the file, as indicated in the official documentation. The important part being the "USER@" in
gcloud compute scp [[USER@]INSTANCE:]SRC [[[USER@]INSTANCE:]SRC …] [[USER@]INSTANCE:]DEST
In my case I could initially transfer files by typing:
gcloud compute scp instance_name:~/file_to_copy /local_dir
but after I got the permission denied I got it working by instead typing:
gcloud compute scp my_user_name@instance_name:~/file_to_copy /local_dir
where the username in my case was the one I was logged in to Google Cloud with.
UPDATE
gcloud compute copy-files is deprecated.
Use instead:
$ gcloud compute scp --recurse example-instance:~/REMOTE-DIR ~/LOCAL-DIR --zone us-central1-a
More info:
https://cloud.google.com/sdk/gcloud/reference/compute/scp
The updated solution for this exact issue (2020)
For the sake of exposition, we have to break the issue into two parts. The "copy-files" command is officially deprecated and we are to use "scp" instead; however, both the old and the new command are limited to certain folders only.
Since we do have access to the /tmp folder, this means we can easily move our distribution files with the preferred "scp" command, as a staging step.
More importantly we also have access to execute scripts, or commands remotely via SSH on the instance which means the limited access is no longer an issue.
Example Time
The first part is to copy the dist folder, and all its contents recursively, to the /tmp folder, to which gcloud does give access:
gcloud compute scp --recurse dist user_name@instance:/tmp
The second part leverages the fact that we can run commands remotely via ssh:
gcloud compute ssh user_name#instance --command "sudo bash golive"
(or any other command you may need to execute)
More importantly this also means that we can just copy our distribution files to the final destination using sudo and the "cp" copy function:
gcloud compute ssh user_name#instance --command "sudo cp -rlf /tmp/dist/* /var/www/html/"
This completely eliminates the need to set the permissions first through the ssh terminal.
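Chained together, a small deploy script could look like this (a sketch built from the commands above; user_name, instance, and the dist folder are the placeholders already used in this answer):
#!/bin/bash
# 1) stage the build output in /tmp, which gcloud compute scp can write to
gcloud compute scp --recurse dist user_name@instance:/tmp
# 2) copy it into the web root with sudo, avoiding the permission error
gcloud compute ssh user_name@instance --command "sudo cp -rlf /tmp/dist/* /var/www/html/"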
This is to copy files from the remote machine to your machine. Make sure you have SSH set up, because this will use the default SSH keys.
This worked for me:
gcloud compute scp 'username'@'instance_name':~/source_dir /destination_dir --recurse
This is the generic syntax, so if you want to copy files from your machine to remote machine, you can use this.
--recurse : required to copy directories with other files inside
Syntax: gcloud compute scp 'SOURCE' 'DESTINATION'
NOTE: run it without root
