I need to create a cron job in cPanel that requests a URL every minute.
When I open the link in a browser, it automatically generates a backup of the database.
I chose the common setting (once per minute): * * * * *
These are the commands I tried, but none of them worked:
GET http://example.com/backup > /dev/null
wget http://example.com/backup
curl -s http://example.com/backup > /dev/null
wget -q -O /dev/null "http://example.com/backup" > /dev/null 2>&1
These are my references:
https://forums.cpanel.net/threads/cron-job-to-call-a-web-page.60253/
Using CRON jobs to visit url?
CRON command to run URL address every 5 minutes
Use this:
wget -q -O - http://example.com/backup >/dev/null 2>&1
instead of:
wget -q -O /dev/null "http://example.com/backup" > /dev/null 2>&1
Alternatively, try this:
curl -s https://example.com/cron
Don't forget to include the http:// or https:// protocol in the URL.
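Putting it together, the full once-per-minute crontab entry would look roughly like this (a sketch, using the example backup URL from the question; either the wget or the curl form works):
* * * * * wget -q -O - http://example.com/backup >/dev/null 2>&1
* * * * * curl -s http://example.com/backup >/dev/null 2>&1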
You can use the "links" command, like:
links https://www.honeymovies.com
For the cron job to work you just need to hit the URL, without producing any output and without downloading anything.
I use wget with just these two parameters:
wget -q --spider https://example.com/
-q : Turn off Wget's output.
--spider : When invoked with this option, Wget will behave as a Web spider, which means that it will not download the pages, just check that they are there. For example, you can use Wget to check your bookmarks:
wget --spider --force-html -i bookmarks.html
This feature needs much more work for Wget to get close to the functionality of real web spiders.
Documentation: https://www.gnu.org/software/wget/manual/wget.pdf
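As a crontab entry for the once-per-minute case above, that approach would look roughly like this (a sketch; substitute the real backup URL):
* * * * * wget -q --spider http://example.com/backup >/dev/null 2>&1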
The script sometimes stops running after the wget step. Perhaps it is necessary to wait for wget to complete?
#!/usr/bin/env bash
set -Eeuo pipefail
# Installing tor-browser
echo -en "\033[1;33m Installing tor-browser... \033[0m \n"
URL='https://tor.eff.org/download/' # Official mirror https://www.torproject.org/download/, may be blocked
LINK=$(wget -qO- $URL | grep -oP -m 1 'href="\K/dist.+?ALL.tar.xz')
URL='https://tor.eff.org'${LINK}
curl --location $URL | tar xJ --extract --verbose --preserve-permissions
sudo mv tor-browser /opt
sudo chown -R $USER /opt/tor-browser
cd /opt/tor-browser
./start-tor-browser.desktop --register-app
There are pitfalls associated with set -e (aka set -o errexit). See BashFAQ/105 (Why doesn't set -e (or set -o errexit, or trap ERR) do what I expected?).
If you decide to use set -e despite the problems, it's a very good idea to set up an ERR trap to show what has happened, and to use set -E (aka set -o errtrace) so the trap also fires in functions and subshells. A basic ERR trap can be set up with
trap 'echo "ERROR: ERR trap: line $LINENO" >&2' ERR
This will prevent the classic set -e problem: the program stops suddenly, at an unknown place, and for no obvious reason.
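Put together, a script header along those lines might look like this (a sketch, independent of the script in the question):
#!/usr/bin/env bash
set -Eeuo pipefail                                    # -E so the ERR trap also fires in functions and subshells
trap 'echo "ERROR: ERR trap: line $LINENO" >&2' ERR   # report where the script died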
Under set -e, the script stops on any error.
set -Eeuo pipefail
#     ^ errexit
Maybe the site is sometimes unavailable, or the fetched page doesn't match the expression grep is searching for.
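One way to make that case visible (and handled) instead of letting errexit kill the script silently is to check the result of the link extraction explicitly; a sketch against the script above:
LINK=$(wget -qO- "$URL" | grep -oP -m 1 'href="\K/dist.+?ALL.tar.xz') || {
    echo "Could not find a tor-browser download link at $URL" >&2
    exit 1
}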
You are doing
wget -qO- $URL
According to the wget man page:
-q
--quiet
Turn off Wget's output.
This is counterproductive for finding the cause of the malfunction. By default wget is verbose and writes information to stderr; if you wish to keep that information, redirect stderr to a file. Consider the following simple example:
wget -O - http://www.example.com 2>>wget_out.txt
It downloads the Example Domain page and writes its content to standard output (-), whilst stderr is appended to a file named wget_out.txt. If you run that command, say, 3 times, you will have the information from 3 runs in wget_out.txt.
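Applied to the script above, you could temporarily drop -q and keep wget's stderr in a log while debugging (the log path here is just an example):
LINK=$(wget -O- "$URL" 2>>/tmp/wget_out.txt | grep -oP -m 1 'href="\K/dist.+?ALL.tar.xz')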
I have multiple sites, and all of these sites have a cache plugin. I want to set up a crontab entry to clear the cache on all of them, but I don't know how to do it with cron.
I use this command for a single site:
wget -O - "https://domain1/?action=wpfastestcache&type=clearcache&token=ABCD" >/dev/null 2>&1
So there are these sample sites, too :
https://domain1/?action=wpfastestcache&type=clearcache&token=ABCD
https://domain2/?action=wpfastestcache&type=clearcache&token=ABCD
https://domain3/?action=wpfastestcache&type=clearcache&token=ABCD
...
I don't want to set up a separate cron entry for each site; I want to do it for all sites with one cron job.
You can write a bash script to clear the cache for all these sites from a single file:
vim cache-clear.sh
#!/bin/bash
wget -O - "https://domain1/?action=wpfastestcache&type=clearcache&token=ABCD" >/dev/null 2>&1
wget -O - "https://domain2/?action=wpfastestcache&type=clearcache&token=ABCD" >/dev/null 2>&1
wget -O - "https://domain3/?action=wpfastestcache&type=clearcache&token=ABCD" >/dev/null 2>&1
Then run this script from crontab. Suppose you want to clear the cache every 10 minutes:
crontab -e   ---> to edit the crontab
*/10 * * * * bash cache-clear.sh >/dev/null 2>&1
Adding the line above to the crontab runs the script every 10 minutes.
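If you would rather not repeat the wget line for every site, a short loop over the domains does the same job (a sketch; the domains and the token ABCD are the placeholders from the question):
#!/bin/bash
# hit the cache-clear URL of each site in turn
for domain in domain1 domain2 domain3; do
    wget -O - "https://${domain}/?action=wpfastestcache&type=clearcache&token=ABCD" >/dev/null 2>&1
done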
I've made a little bash script to back up my Nextcloud files, including the database, on my Ubuntu 18.04 server. I want the backup to be executed every day. When the job is done I want to receive one mail saying that it ran (and ideally whether it was successful or not). With the current script I receive almost 20 mails and I can't figure out why. Any ideas?
My cronjob looks like this:
* 17 * * * "/root/backup/"backup.sh >/dev/null 2>&1
My bash script:
#!/usr/bin/env bash
LOG="/user/backup/backup.log"
exec > >(tee -i ${LOG})
exec 2>&1
cd /var/www/nextcloud
sudo -u www-data php occ maintenance:mode --on
mysqldump --single-transaction -h localhost -u db_user --password='PASSWORD' nextcloud_db > /BACKUP/DB/NextcloudDB_`date +"%Y%m%d"`.sql
cd /BACKUP/DB
ls -t | tail -n +4 | xargs -r rm --
rsync -avx --delete /var/www/nextcloud/ /BACKUP/nextcloud_install/
rsync -avx --delete --exclude 'backup' /var/nextcloud_data/ /BACKUP/nextcloud_data/
cd /var/www/nextcloud
sudo -u www-data php occ maintenance:mode --off
echo "###### Finished backup on $(date) ######"
mail -s "BACKUP" name@domain.com < ${LOG}
Are you sure about the cron string? * 17 * * * means "at every minute past hour 17", so the script is started 60 times during that hour.
It should probably be more like 0 17 * * *, right?
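Assuming a single daily run at 17:00 is what you want, the complete entry (with the path from the question) would look roughly like:
0 17 * * * /root/backup/backup.sh >/dev/null 2>&1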
I'm facing this problem:
When I execute a cron task on my web server with this command:
wget https://mon-url.com/file/cron.php >/dev/null 2>&1
it creates new files cron.php1, cron.php2, ... in my /root directory, and it takes up a lot of space. How can I prevent this?
Thanks
You can use -O /dev/null, which will write the output file to /dev/null:
wget -O /dev/null https://mon-url.com/file/cron.php >/dev/null 2>&1
Or -O-, which will write the output to stdout:
wget -O- https://mon-url.com/file/cron.php >/dev/null 2>&1
This is the answer:
wget --delete-after
The --delete-after option tells wget to delete every file it downloads, once the retrieval is complete.
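For the cron.php case from the question that would be roughly the following (-q just quiets the progress output):
wget -q --delete-after https://mon-url.com/file/cron.php >/dev/null 2>&1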
Every time I use wget http://www.domain.com, a log file is saved automatically on my server. Is there any way to run this command without logging?
Thanks,
Joel
You could try -o and -q
-o logfile
--output-file=logfile
Log all messages to logfile. The messages are normally reported to standard error.
-q
--quiet
Turn off Wget's output.
So you'd have:
wget ... -q -o /dev/null ...
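With the URL from the question that would be roughly the following (note it still writes the downloaded page itself to disk; the answers below add -O to deal with that):
wget -q -o /dev/null http://www.domain.com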
This will print the site contents to standard output. Is this what you mean when you say that you don't want logging to a file?
wget -O - http://www.domain.com/
I personally found that @Paul's answer was still adding to a log file, regardless of -q suppressing the command-line output.
I added -O /dev/null on top of the -o output-file argument:
wget [url] -q -o /dev/null -O /dev/null