I'm facing this problem:
When I execute a cron task from my web server with this command:
wget https://mon-url.com/file/cron.php >/dev/null 2>&1
It creates new files cron.php.1, cron.php.2, ... in my /root directory, and they take up a lot of space. How can I prevent this?
Thanks
You can use -O /dev/null, which will write the output file to /dev/null:
wget -O /dev/null https://mon-url.com/file/cron.php >/dev/null 2>&1
Or -O-, which writes the output to stdout:
wget -O- https://mon-url.com/file/cron.php >/dev/null 2>&1
The answer is --delete-after, which tells wget to delete each downloaded file as soon as it has been retrieved:
wget --delete-after https://mon-url.com/file/cron.php >/dev/null 2>&1
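Putting it together, a crontab entry such as the following keeps nothing on disk (a sketch; the five-minute schedule is an assumption, since the original schedule was not shown):
*/5 * * * * wget -O /dev/null https://mon-url.com/file/cron.php >/dev/null 2>&1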
Related
I have multiple sites, and all of these sites have a cache plugin. I want to set up a crontab to clear the cache on all of them, but I don't know how to do it with cron.
I use this code for a single site:
wget -O - "https://domain1/?action=wpfastestcache&type=clearcache&token=ABCD" >/dev/null 2>&1
These are the sample URLs for the other sites:
https://domain1/?action=wpfastestcache&type=clearcache&token=ABCD
https://domain2/?action=wpfastestcache&type=clearcache&token=ABCD
https://domain3/?action=wpfastestcache&type=clearcache&token=ABCD
...
I don't want to set up a separate cron job for each site; I want to do it for all sites with one cron entry.
You can write a bash script that clears the cache for all of these sites from a single file:
vim cache-clear.sh
#!/bin/bash
wget -O - "https://domain1/?action=wpfastestcache&type=clearcache&token=ABCD" >/dev/null 2>&1
wget -O - "https://domain2/?action=wpfastestcache&type=clearcache&token=ABCD" >/dev/null 2>&1
wget -O - "https://domain3?action=wpfastestcache&type=clearcache&token=ABCD" >/dev/null 2>&1
Then run this bash script from crontab. Suppose you want to clear the cache every 10 minutes:
crontab -e    # opens the crontab for editing
*/10 * * * * bash cache-clear.sh >/dev/null 2>&1
Adding the line above runs the script every 10th minute.
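If the list of sites keeps growing, a small loop keeps the script down to a single wget line (a sketch; the domains and the ABCD token are the placeholders from the question):
#!/bin/bash
# Hit the WP Fastest Cache clear-cache URL on every listed site.
# The token is assumed to be identical on all sites, as in the question.
TOKEN="ABCD"
for DOMAIN in https://domain1 https://domain2 https://domain3; do
    wget -q -O - "${DOMAIN}/?action=wpfastestcache&type=clearcache&token=${TOKEN}" >/dev/null 2>&1
done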
I've made a little bash script to back up my Nextcloud files, including my database, on my Ubuntu 18.04 server. I want the backup to run every day. When the job is done I want to receive one mail telling me it ran (and ideally whether it was successful or not). With the current script I receive almost 20 mails and I can't figure out why. Any ideas?
My cronjob looks like this:
* 17 * * * "/root/backup/"backup.sh >/dev/null 2>&1
My bash script
#!/usr/bin/env bash
LOG="/user/backup/backup.log"
exec > >(tee -i ${LOG})
exec 2>&1
cd /var/www/nextcloud
sudo -u www-data php occ maintenance:mode --on
mysqldump --single-transaction -h localhost -u db_user --password='PASSWORD' nextcloud_db > /BACKUP/DB/NextcloudDB_`date +"%Y%m%d"`.sql
cd /BACKUP/DB
ls -t | tail -n +4 | xargs -r rm --
rsync -avx --delete /var/www/nextcloud/ /BACKUP/nextcloud_install/
rsync -avx --delete --exclude 'backup' /var/nextcloud_data/ /BACKUP/nextcloud_data/
cd /var/www/nextcloud
sudo -u www-data php occ maintenance:mode --off
echo "###### Finished backup on $(date) ######"
mail -s "BACKUP" name#domain.com < ${LOG}
Are you sure about the cron string? That one means "at every minute past hour 17", so the script is started 60 times during that hour, which would explain the flood of mails.
It should be more like 0 17 * * *, right?
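So the corrected entry for a single daily run at 17:00 would look something like this (a sketch; the path is the one from the question):
0 17 * * * /root/backup/backup.sh >/dev/null 2>&1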
I need to create a cron job that requests a URL from cPanel every minute.
When I open the link in a browser, it automatically generates a backup of the database.
I chose the common setting (once per minute), * * * * *.
These are the commands I tried, but none of them worked:
GET http://example.com/backup > /dev/null
wget http://example.com/backup
curl -s http://example.com/backup > /dev/null
wget -q -O /dev/null "http://example.com/backup" > /dev/null 2>&1
These are my references:
https://forums.cpanel.net/threads/cron-job-to-call-a-web-page.60253/
Using CRON jobs to visit url?
CRON command to run URL address every 5 minutes
Use this:
wget -q -O - http://example.com/backup >/dev/null 2>&1
instead of:
wget -q -O /dev/null "http://example.com/backup" > /dev/null 2>&1
Alternatively, try this:
curl -s https://example.com/cron
Don't forget to include the http:// or https:// protocol in the URL.
You can use "links" command like:
links https://www.honeymovies.com
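Since links is an interactive text browser, from cron it is safer to dump the page and throw it away (a sketch; -dump is supported by links and elinks):
links -dump https://www.honeymovies.com >/dev/null 2>&1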
For a cron job like this you just need to hit the URL, with no output and without downloading anything.
I use wget with just these two parameters:
wget -q --spider https://example.com/
-q : Turn off Wget's output.
--spider : When invoked with this option, Wget will behave as a Web spider, which means that it will not download the pages, just check that they are there. For example, you can use Wget to check your bookmarks:
wget --spider --force-html -i bookmarks.html
This feature needs much more work for Wget to get close to the functionality of real web spiders.
Documentation: https://www.gnu.org/software/wget/manual/wget.pdf
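In a crontab, the --spider approach for the once-a-minute backup URL from the question would look roughly like this (a sketch):
* * * * * wget -q --spider http://example.com/backup >/dev/null 2>&1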
I'm trying to pipe stderr to logger with this code:
/usr/local/bin/Script.py >/dev/null 2>(/usr/bin/logger -t MyScript -p syslog.err)
This runs fine from a bash command line, but there is no output in syslog when it runs from cron. This is my (root) crontab:
0-59/5 * * * * /usr/local/bin/Script.py >/dev/null 2>(/usr/bin/logger -t MyScript -p syslog.err)
Can anybody tell me what is going wrong here?
Thanks!
>/dev/null is redirecting both stdout/stderr to /dev/null before the 2> redirection can pick it up.
Instead, redirect stdout to /dev/null explicitly:
/usr/local/bin/Script.py 1>/dev/null 2>(/usr/bin/logger -t MyScript -p syslog.err)
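If it still fails under cron, note that >(...) is a bash feature and cron normally runs jobs with /bin/sh; a portable sketch that avoids process substitution entirely is to send stderr through an ordinary pipe to logger (2>&1 first duplicates stderr onto the pipe, then >/dev/null silences stdout):
0-59/5 * * * * /usr/local/bin/Script.py 2>&1 >/dev/null | /usr/bin/logger -t MyScript -p syslog.err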
Every time I use wget http://www.domain.com, a log file is automatically saved on my server. Is there any way to run this command without logging?
Thanks,
Joel
You could try -o and -q
-o logfile
--output-file=logfile
Log all messages to logfile. The messages are
normally reported to standard error.
-q
--quiet
Turn off Wget's output.
So you'd have:
wget ... -q -o /dev/null ...
This prints the site contents to standard output. Is this what you mean when you say that you don't want logging to a file?
wget -O - http://www.domain.com/
I personally found that Paul's answer was still adding to a log file, regardless of the -q option.
I added -O /dev/null on top of the -o output-file argument:
wget [url] -q -o /dev/null -O /dev/null