I only have access to cPanel, and I have two different types of cron jobs that I would like to understand better.
This one seems to suffer the same timeout constraints as a normal web page:
wget --secure-protocol=TLSv1 -O /dev/null "https://www.website.com/code.php" >/dev/null 2>&1
But what about this one?
php -q /home/website/public_html/code.php >/dev/null 2>&1
How do I make sure a cron job completes?
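One way to bound how long either variant is allowed to run (a sketch; it assumes GNU coreutils' timeout is available on the host, which it is on most Linux cPanel servers) is to wrap the cron command, e.g. timeout 600 php -q /home/website/public_html/code.php. The behaviour of timeout itself:

```shell
# timeout runs a command and kills it after the given number of seconds.
timeout 5 sleep 1 && echo "finished in time"

# If the limit is hit, the command is killed and timeout exits with status 124,
# so you can detect (and log) runs that were cut short.
timeout 1 sleep 5 || echo "timed out, exit status 124"
```

Note that the PHP CLI variant is not subject to web-server timeouts in the first place, so a wrapper like this is mainly useful as a safety net against jobs that hang.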
I'm trying to run only one instance of my backup script as a cron job.
I know I can do it with a function that checks if the process is running:
if pgrep -x rclone >/dev/null
then
# rclone is still running, backup is not done yet, exit.
exit
else
# rclone is not running.
# start backup.
/path/to/rclone/script.sh
fi
But after some research, I found out that flock should be used instead of checking for a process ID. In crontab -e, this runs the script every 30 minutes:
*/30 * * * * /usr/bin/flock -n /var/lock/myjob.lock /path/to/rclone/script.sh
Running the command above requires sudo (the lock file lives under /var/lock, which my user can't write to), so the script asks for the sudo password and never runs. (I found this out by running the command manually.)
How exactly do I use flock? Do I put a variable in my script that injects the sudo password when flock asks for it? (I know that's not secure, so there must be a different way to do this.)
I tried to research this subject but couldn't find any good answers that explain how to use it properly.
Thank you.
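A sketch of one common workaround (the only assumption is that your home directory is writable by your user): flock does not need sudo if the lock file lives somewhere your user can write, instead of /var/lock:

```shell
# Keep the lock file in a user-writable location; flock creates it if needed.
LOCKFILE="$HOME/.rclone-backup.lock"

# -n: give up immediately if another instance already holds the lock,
# so overlapping cron runs simply skip this round instead of queueing.
flock -n "$LOCKFILE" echo "backup would start here"  # stand-in for /path/to/rclone/script.sh
```

The same line works in the crontab, with /path/to/rclone/script.sh in place of the echo stand-in.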
I have a cron every two minutes (*/2 * * * *) firing the following command...
wget "http://www.example.com/wp-cron.php?import_key=my_key_stringimport_id=16&action=trigger"
Trouble is, it is emailing me every two minutes, and also creating copious tiny files on the server, one each time.
I have tried several things. I know there is plenty of info out there about suppressing email feedback from cron.
cPanel's Cron page, where my crons are set, makes clear: "If you do not want an email to be sent for an individual cron job, you can redirect the command’s output to /dev/null. For example: mycommand >/dev/null 2>&1"
But when I did it like this...
wget -O "http://www.example.com/wp-cron.php?import_key=my_key_stringimport_id=16&action=trigger" >/dev/null 2>&1
... the cron stopped functioning.
(I believed an -O was necessary to direct the output.)
What is the proper way to formulate this?
To suppress mails from cron, you can set MAILTO before your line in the crontab:
MAILTO=""
*/2 * * * * command
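A minimal crontab sketch putting this together with the output fix (schedule and URL taken from the question; the key points are the MAILTO line at the top and giving -O a filename):

```
MAILTO=""
*/2 * * * * wget -q -O /dev/null "http://www.example.com/wp-cron.php?import_key=my_key_stringimport_id=16&action=trigger"
```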
This seems to do the trick:
wget --quiet -O /dev/null "http://www.example.com/wp-cron.php?import_key=my_key_stringimport_id=16&action=trigger"
i.e. add --quiet, and give -O a filename (/dev/null here) so the downloaded page is discarded instead of piling up on the server.
Answer found elsewhere on Stack Overflow.
--quiet and -O co-exist fine: --quiet suppresses wget's log messages, while -O only controls where the downloaded document itself is written.
When setting up AcySMS, there are a few options for the cron job. "Web cron" runs at a fastest interval of 15 minutes, which is way too slow for me.
I have opted for "manual cron", and am given the following "cron URL" https://www.followmetrading.com/index.php?option=com_acysms&ctrl=cron
Putting that into the cPanel cron job manager just leaves me with an error every time the cron attempts to run:
/usr/local/cpanel/bin/jailshell: http://www.followmetrading.com/index.php?option=com_acysms: No such file or directory
I have discovered that the following command executes the script properly.
curl --request GET 'https://www.followmetrading.com/index.php?option=com_acysms&ctrl=cron&no_html=1' >/dev/null 2>&1
Note: ensure the URL is surrounded by single quotes; otherwise the shell treats & as a command separator and everything from the first & onward is lost.
Note: >/dev/null 2>&1 makes sure there is no email trail.
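Put together as a single crontab entry (the */15 schedule is an illustrative assumption; the single quotes keep the shell from treating & as a command terminator):

```
*/15 * * * * curl --request GET 'https://www.followmetrading.com/index.php?option=com_acysms&ctrl=cron&no_html=1' >/dev/null 2>&1
```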
This is my cron command:
/usr/bin/wget -O /dev/null -o /dev/null https://example.com/file1.php; wget -q -O - https://example.com/file2.php
The first file runs for 4 minutes updating the database (I have logs), but for some reason the second one starts after 1:45. How can it start before the first one has finished?
Thanks!
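With ;, the shell starts the second wget as soon as the first one exits, whether or not it succeeded, and wget can return early if the server closes the connection or a timeout fires while file1.php keeps working in the background. A sketch that at least ties the second request to the first one succeeding (same flags as in the question, with && in place of ;):

```
/usr/bin/wget -O /dev/null -o /dev/null https://example.com/file1.php && wget -q -O - https://example.com/file2.php
```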
I set up a cron job on my Ubuntu server. Basically, I just want this job to call a php page on another server. This php page will then clean up some stuff in a database. So I thought it was a good idea to call this page with wget and then send the result to /dev/null, because I don't care about the output of this page at all; I just want it to do its database-cleaning job.
So here is my crontab:
0 0 * * * /usr/bin/wget -q --post-data 'pass=mypassword' http://www.mywebsite.com/myscript.php > /dev/null 2>&1
(I post a password to make sure no one but me can run the script.) It works like a charm, except that each time wget writes an empty page into my user directory: the result of downloading the php page.
I don't understand why the result isn't sent to /dev/null. Any idea what the problem is here?
Thank you very much!
The output that your redirection catches is only wget's log: the connection attempts, progress, and so on. The downloaded document itself is still saved to a file by default. If you don't want to store the saved file, point the -O option at /dev/null:
/usr/bin/wget -q -O /dev/null --post-data 'pass=mypassword' http://www.mywebsite.com/myscript.php > /dev/null 2>&1
Check out the wget man page. You'll also find the -q option there for completely disabling the log output (though, of course, redirecting it as you do works too).
wget -O /dev/null ....
should do the trick
You can mute wget's output with the --quiet option:
wget --quiet http://example.com