ssh cronjob delete not working - linux

I am new to SSH and cron jobs, so I am probably doing something wrong.
I am using Google Compute Engine to host my Linux instance (accessed over SSH), and I want to delete a snapshot by its name with a cron job (I also have a create-snapshot cron job that works fine).
So I wrote this crontab entry:
1 * * * * sudo gcloud compute snapshots delete my-snapshot-name -q
This entry is meant to delete the snapshot every minute (the one-minute schedule is just so that I can see the result immediately; once I see it works, I will change it accordingly).
The snapshot is not deleted.
If I run the same command directly, outside of cron, it deletes the snapshot:
sudo gcloud compute snapshots delete my-snapshot-name -q
Some more details about how I create the cron job:
In the Google Cloud SSH terminal I run crontab -e.
I write the cron entry shown above.
I press Ctrl+X.
It asks if I want to save the modified buffer; I press Y.
It offers to write the file to /tmp/crontab.xxxxxx/crontab; I press Enter and the cron job is created.
What am I doing wrong? What could cause the delete not to work?

It's better to add this to the root crontab than to use sudo.
I'm not so familiar with GCE, but here are a few things to try:
1) Use the complete path to the gcloud binary.
2) Check /var/log/syslog for 'CRON' entries to see what the error is.
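Putting both suggestions together, the root crontab entry might look like the following. The /usr/bin/gcloud path is an assumption (check yours with which gcloud), and the log file path is just an example; redirecting output gives you something to inspect besides syslog:

```shell
1 * * * * /usr/bin/gcloud compute snapshots delete my-snapshot-name -q >> /var/log/snapshot-delete.log 2>&1
```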

Related

How to use cron on a simple script

I want to use cron to execute a script periodically. I want to try a simple script first, but it does not work.
This is my script (script.sh), whose permissions are 700:
#!/bin/sh
clear
echo "Hello!"
mkdir Hello
And this is the crontab file when I edit it with the command crontab -e:
SHELL=/bin/sh
* * * * * /home/padro/Documents/script.sh
EDIT:
I have that script in the /home/padro/Documents folder. After creating it, I run the command crontab -e to modify the cron file. In this file I put the shell I want, SHELL=/bin/sh, and the cron schedule expression * * * * * /home/padro/Documents/script.sh, which should theoretically run the script every minute. Finally I save the file, but after a minute passes I can't see the echo of the script on the terminal.
EDIT2:
I have added mkdir Hello because I wasn't sure whether the echo of the script would be shown on the terminal. But the Hello directory is never created.
Any output generated by a program called from cron will by default be emailed to the user owning the crontab (assuming local delivery of mail messages is possible). So I'd suggest that you look in your inbox on the local machine.
To save the output into a file, use a redirection in the crontab, or arrange for the script to write its output to a file.
Jobs started by cron do not run with a terminal, so you should not expect to see your terminal being cleared every minute by running this script through cron.
The Hello folder should have been created in the working directory used by the script (possibly your home directory). To make absolutely sure you know where the script's working directory is, use cd in the script to move to the correct location.
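Putting those points together, a crontab along these lines would capture the output in a file and pin down the working directory (the log file path is just an example):

```shell
SHELL=/bin/sh
* * * * * cd /home/padro/Documents && /home/padro/Documents/script.sh >> /tmp/script.log 2>&1
```

With the cd in place, the Hello directory lands in /home/padro/Documents, and the echo output accumulates in /tmp/script.log instead of going to mail.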
I do not have enough reputation to add a comment, so my humble comment would be:
Was the cron file you mentioned created as root?
Because a file with chmod 700 can only be executed by its owner.
If you are using Red Hat Linux, the account you use on first login has ordinary user rights, NOT root.
Reference link to a cheat sheet.
su - root
(the system will prompt for the root password)
crontab -e
* * * * * /home/padro/Documents/script.sh
You can even run a test script in your crontab; I ran into a similar situation when I first learnt scripting:
* * * * * date > /home/padro/Documents/testing.txt
If you can, restart the server.
Check that your directory is correct using the pwd command in Linux/Unix.
I hope my comments, based on my recent learning, have helped you.
Edit 1: Remove clear from your script. Thanks...
Edit 2: I believe your Hello folder is created at the root of the filesystem; try looking for it there, or in the home directory of the user...

How can I back up a MongoDB database regularly at a specific time of day

I want to back up a database regularly on my Linux server (Ubuntu 12.02).
I read some documents saying I should use Linux cron, and fortunately I found this: https://github.com/micahwedemeyer/automongobackup/blob/master/src/automongobackup.sh
I put in my configuration, saved it as mongobackup.sh, and put it in /etc/cron.daily.
That was 3 days ago. Today I checked the backup folder (/var/backups/mongodb), but the backup file does not exist.
Should I delete the extension of mongobackup.sh? Or is there something I missed?
It looks like your mongobackup.sh doesn't have the proper rights to be executed.
chmod 755 /etc/cron.daily/mongobackup.sh should do the trick, but it wouldn't hurt to see what's inside the script and the results of ls -l /etc/cron.daily.
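Separately, on the asker's question about the extension: on Debian/Ubuntu, /etc/cron.daily is executed via run-parts, which by default skips file names containing dots, so mongobackup.sh would be ignored even with the right permissions. You can check which files would actually run with run-parts --test. A small demo using a scratch directory (not the real /etc/cron.daily):

```shell
# run-parts skips names containing dots; only the dot-free
# file in this scratch directory would be run by cron.daily
mkdir -p /tmp/cron-demo
touch /tmp/cron-demo/mongobackup.sh /tmp/cron-demo/mongobackup
chmod 755 /tmp/cron-demo/mongobackup.sh /tmp/cron-demo/mongobackup
run-parts --test /tmp/cron-demo
```

So renaming the file to mongobackup (no extension) is likely part of the fix here.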
Also, you could manually add a task to root crontab (or any other user that has rights to run the script and to work with everything mentioned there):
to start editing the crontab, enter the command crontab -u username -e
at the end of the file insert this: 0 0 * * * /bin/sh /full-path-to-mongobackup.sh >/dev/null 2>&1, then press Esc, type :wq, and press Enter. That will create a task which runs mongobackup.sh every midnight.
And to answer your question about how to run scripts at a specific time of day, I would recommend reading this article about cron and crontab.

What is preventing my cron job from running

I have created a list of cron jobs (see below) using sudo crontab -e in the root crontab file. When I run the commands individually on the command line, they work fine, however none of the jobs are run by cron. Any help would be appreciated. Do I need to add something else into the crontab file?
48 * * * * sudo gzip -k /calcservergc.log.*
49 * * * * for file in /calcservergc.log.*.gz; do sudo mv $file $(hostname).${file:1}; done
50 * * * * sudo rm $(hostname).*.log.*.gz
sudo
The sudo command may not work in a crontab. Generally you need a password to run sudo. There might be a way to have it run without a password in a cron job, but attempting that is not recommended.
cron
You'll need to run the cron job as a user that has access to do what you need to accomplish. Cron runs with a short list of specific paths. By default that list is pretty short; on a Linux box I use, the path is /sbin:/usr/sbin:/bin:/usr/bin.
Also, paths need to be more specific. Cron doesn't run as a normal user, so you have to be explicit with paths and with where the output of those commands goes.
For instance, in the first command, where will the gzip file be placed?
logrotate
It looks like you're trying to zip a log file, then move log files, then remove old log files - this is exactly what logrotate accomplishes. It would be worth installing. Logrotate solves problems like the log file being opened when you run this command - generally the process that has the log file opened doesn't lose the file handle even if you rename it so the log continues to be written to even after you move it. It also handles the problem of keeping an archive of the recent log files, like syslog.1.gz, syslog.2.gz, syslog.x.gz or as many back as you have storage space for or want to keep for posterity.
Summary
Don't use sudo in cron
Be specific in paths when running commands in cron
Use logrotate to accomplish this specific task in your question
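For illustration, a minimal logrotate rule for the log in question might look like the following (installed as a file under /etc/logrotate.d/; the log path is taken from the question, while the retention settings are just an example to adapt):

```
/calcservergc.log {
    daily
    rotate 7
    compress
    missingok
    notifempty
}
```

This replaces all three cron commands: logrotate compresses the current log, keeps the last 7 archives, and deletes older ones.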
I don't have 50 reputation points so I can't comment on your question, so I'll try to say it in one shot.
I see a possible problem with your 3 commands, each called one minute apart. Say the first operation takes more than one minute to run (it shouldn't happen, but in theory it could): your second call won't work, or worse, it could work on half the data. And you don't want to waste time by putting, say, a 5-minute delay between your commands.
What you could do is create a shell script containing the 3 commands. This prevents the operations from overlapping: the 3 commands will be executed one after the other.
Then put your file in a place like /bin (you can also create a symbolic link with ln -s) and call your script with cron. (Be careful with the paths inside the shell script.)
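As a sketch, the combined script could look like this. To keep it self-contained and runnable it works on a demo directory under /tmp with a fabricated log file, so the directory and file names are stand-ins for the real /calcservergc.log.* files:

```shell
#!/bin/sh
# demo setup: scratch dir with one fake log file
# (stand-ins for the real /calcservergc.log.* paths)
LOGDIR=/tmp/logrotate-demo
rm -rf "$LOGDIR" && mkdir -p "$LOGDIR"
echo "demo log line" > "$LOGDIR/calcservergc.log.1"

# step 1: compress, keeping the original (as gzip -k did in the question)
gzip -k "$LOGDIR"/calcservergc.log.*

# step 2: prefix each archive with the hostname
for file in "$LOGDIR"/calcservergc.log.*.gz; do
    mv "$file" "$LOGDIR/$(hostname).$(basename "$file")"
done

# step 3 (removal of old archives) would go here, after the rename
ls "$LOGDIR"
```

Running the steps sequentially in one script means step 2 can never start before step 1 has finished, which was the risk with three separate cron entries.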
Now, for the sudo problem... even if you put it in a shell script, you would still need to enter your sudo password, and cron runs in the background, so you won't be able to enter it.
You could try two solutions: change the rights on the folder containing your files (using chmod -R 777 or chmod 755 on the folder), or move/copy your files to a directory where you have read and write access.

shell script doesn't run fully when run as a cron job

I'm having a peculiar issue with a shell script that I have set to run every minute via crontab.
I use Pelican as a blog platform and wanted to semi-automate the way in which the site updates whenever there's a new post. To do this, I've created a script to look for a file called respawn in the same directory as the content (it syncs via Dropbox so I simply create the file there which syncs to the server).
The script is written so that if the file respawn exists, it rebuilds the blog and deletes the file; if not, it exits instead.
Here's the script called publish.sh
#!/bin/bash
Respawn="/home/user/content/respawn"
if [ -f $Respawn ]
then
    sudo /home/user/sb.sh; rm $Respawn
else
    exit 0
fi
exit 0
Here's the crontab for the shell script
* * * * * /home/user/publish.sh
And finally, here's the contents of sb.sh
make html -C /var/www/site/
Now, if I run the script via SSH and respawn exists, it works perfectly. However, if cron runs it, the sb.sh step doesn't run, but the respawn file is still deleted.
I have one other cron job that runs every 4 hours that simply runs sb.sh which works perfectly (in case I forget to publish something).
I've tried using the user's crontab as well as adding it to root's instead, and I've also added the user to the sudoers file so the script can be run without password intervention. Neither seems to work. Am I missing something?
It must be sudo: cron can't input the password.
Check mail for the user running the cron job to confirm. You'll see something like sudo: no tty present.
Try changing sudo /home/user/sb.sh;rm $Respawn to
/home/user/sb.sh;rm $Respawn
sudo is not necessary to run your command in this context, since it'll be invoked as root anyway.
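A sketch of the whole corrected script, additionally tying rm to the build's exit status so a failed build leaves the respawn flag in place (which also guards against the symptom of the flag disappearing without a rebuild). To keep the sketch self-contained it demos with stand-in paths under /tmp; in the real script these would be /home/user/content/respawn and /home/user/sb.sh:

```shell
#!/bin/sh
# stand-in paths for the demo (real ones: /home/user/content/respawn
# and /home/user/sb.sh)
DEMO=/tmp/publish-demo
RESPAWN="$DEMO/respawn"
BUILD="$DEMO/sb.sh"

# demo setup: create the flag file and a dummy build script
mkdir -p "$DEMO"
touch "$RESPAWN"
printf '#!/bin/sh\nexit 0\n' > "$BUILD"
chmod +x "$BUILD"

# the actual logic: no sudo (cron already runs it as root), and the
# flag is removed only if the build succeeded
if [ -f "$RESPAWN" ]; then
    "$BUILD" && rm "$RESPAWN"
fi
```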

Unable to run a cron job on Ubuntu 10.10 server, EC2 instance

I have an AWS EC2 instance running Ubuntu 10.10 server.
I am trying to add a cron job to the list, but the cron job is not being executed.
I am actually uploading a particular file to AWS S3 using s3cmd (see s3tools.org).
What could the problem be, and what is the solution?
Kindly help me out.
Here is the bash script that has to be run:
s3cmd put file-name s3://bucket_name/foder_name/file-name
Here is the cron job:
bash /path/to/file.sh
Are you aware that the global crontab (/etc/crontab) has a user field:
# m h dom mon dow user command
While the crontab of a user (reachable by running crontab -e as a user) does not?
# m h dom mon dow command
This drove me crazy once: cron was failing relatively silently...
That said, try making a very simple cron entry, maybe written directly inline, that touches a file in a writable folder. That way you'll figure out whether it's your script or cron that's being difficult.
Is cron running? You can test this by adding to your crontab:
* * * * * /bin/date >/tmp/the_time
Is my crontab being called? You can test this by adding to your cron script:
echo "Hello world!" >/tmp/the_hello
Does my cron script get the right path settings when it is invoked? Add to your cron script:
set >/tmp/the_settings
In most cases, scripts called from cron need to have most of their pathnames hardcoded; sometimes even PATH needs to be set or expanded.
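For instance, PATH can be set at the top of the crontab itself so the script can find s3cmd without hardcoding its location. The directories listed are common defaults and the 3 a.m. schedule is just an example; adjust both to your setup:

```shell
PATH=/usr/local/bin:/usr/bin:/bin
# m h dom mon dow command
0 3 * * * bash /path/to/file.sh
```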
I was able to figure out the solution for this. The problem was that I needed to specify the absolute path, that is, /usr/local/bin/s3cmd.
