Using wget in a crontab to run a PHP script - linux

I set up a cron job on my Ubuntu server. Basically, I just want this job to call a PHP page on another server. This PHP page will then clean up some stuff in a database. So I thought it was a good idea to call this page with wget and then send the result to /dev/null, because I don't care about the output of this page at all; I just want it to do its database cleaning job.
So here is my crontab:
0 0 * * * /usr/bin/wget -q --post-data 'pass=mypassword' http://www.mywebsite.com/myscript.php > /dev/null 2>&1
(I post a password to make sure no one but me can run the script.) It works like a charm, except that wget writes an empty page to my user directory each time: the result of downloading the PHP page.
I don't understand why the result isn't sent to /dev/null. Any idea what the problem is here?
Thank you very much!

What wget writes to STDOUT is its progress report: making the connection, showing download progress, and so on. The downloaded file itself is saved separately, which is why your redirection doesn't stop it.
If you don't want wget to store the downloaded file, use the -O option to send the document to /dev/null:
/usr/bin/wget -q --post-data 'pass=mypassword' -O /dev/null http://www.mywebsite.com/myscript.php > /dev/null 2>&1
Check out the wget manpage. You'll also find the -q option for completely disabling output to STDOUT (but of course, redirecting the output as you do works too).
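If you later need to debug the job, a variant that keeps wget's errors instead of discarding everything can help (a sketch; the log path is illustrative):

```shell
# -nv (non-verbose) prints only errors and a one-line summary,
# which the redirection appends to a log instead of /dev/null.
0 0 * * * /usr/bin/wget -nv -O /dev/null --post-data 'pass=mypassword' http://www.mywebsite.com/myscript.php >> /home/you/myscript-cron.log 2>&1
```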

wget -O /dev/null ....
should do the trick

You can mute wget's log output with the --quiet option:
wget --quiet http://example.com
Note that --quiet only silences the log; the downloaded file is still saved unless you also pass -O /dev/null.

Related

Concatenate file output text with Crontab

I successfully followed this question, Using CRON jobs to visit url?, to set up the following cron task:
*/30 * * * * wget -O - https://example.com/operation/lazy-actions?lt=SOME_ACCESS_TOKEN_HERE >/dev/null 2>&1
The above Cron task works fine and it visits the URL periodically every 30 minutes.
However, the access token is recorded in a text file at /home/myaccount/www/site/aToken.txt. The aToken file is a very simple one-line text file that just contains the token string.
I have tried to read its contents and pass it, using cat, to the crontab command like the following:
*/30 * * * * wget -O - https://example.com/operation/lazy-actions?lt=|cat /home/myaccount/www/site/aToken.txt| >/dev/null 2>&1
However, the above attempt failed to run the cron job.
I edit cron jobs with crontab -e using nano on Ubuntu 16.04.
This is a quick solution that will do exactly what you want without a complicated one-liner:
Create this file in your myaccount home directory. You may also put it in your bin directory if you desire; just remember where you put it so you can call it from your cron job. Also make sure the user has permission to read/write the directory the .sh file is in.
wget.sh
#!/bin/bash
# Change to the directory that holds the token file
cd /home/myaccount/www/site/
# Grab the token into the variable aToken
aToken=$(cat aToken.txt)
# Move to wherever you want the wget results saved
cd /wherever/you/want/the/wget/results/saved
# The $ prefix tells the shell to expand the variable: $aToken
wget -nv -O /tmp/wget.txt "https://example.com/operation/lazy-actions?lt=$aToken"
# You can write logs etc. afterward here, e.g.:
echo "Job was successful" >> /dir/to/logs/success.log
Then simply call this file with your CRON like you are doing already.
*/30 * * * * sh /home/myaccount/www/site/wget.sh >/dev/null 2>&1
Building on this question, Concatenate in bash the output of two commands without newline character, I got the following simple solution:
wget -O - https://example.com/operation/lazy-actions?lt="$(cat /home/myaccount/www/site/aToken.txt)" >/dev/null 2>&1
This reads the contents of the text file and substitutes it directly into the command's URL.
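Before wiring the command substitution into cron, you can sanity-check the quoting with a throwaway token file (the token value here is made up):

```shell
# Simulate the token file, then build the URL exactly as the cron line does.
printf 'SOME_ACCESS_TOKEN' > /tmp/aToken.txt
url="https://example.com/operation/lazy-actions?lt=$(cat /tmp/aToken.txt)"
echo "$url"
```

One cron-specific caveat: in a crontab line, a literal % character is treated as a newline and must be escaped as \%, though the command above contains none.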

Using WGET to run a cronjob PHP disable notification email

I'm using GoDaddy as a web host and I'd like to disable the email notification that is sent after a cron job is done. Lucky for me, they have been no help, but the cron job area says:
You can have cron send an email every time it runs a command. If you do not want an email to be sent for an individual cron job you can redirect the command’s output to /dev/null like this: mycommand >/dev/null 2>&1
I've tried several variations of this and nothing seems to fix it.
My command:
wget http://example.com/wp-admin/tools.php?page=post-by-email&tab=log&check_mail=1
Any advice is greatly appreciated.
As the cronjob area says, you need to redirect the command’s output to /dev/null.
Your command should look like this:
wget -O /dev/null -o /dev/null "http://example.com/wp-admin/wp-mail.php" &> /dev/null
The -O option makes sure that the fetched content is sent to /dev/null.
If you want the fetched content to be downloaded in the server filesystem, you can use this option to specify the path to the desired file.
The -o option sends wget's log messages to /dev/null instead of stderr.
&> /dev/null is another way to redirect stdout and stderr to /dev/null.
NOTES
For more information on wget, check the man pages: you can type man wget on the console, or use the online man pages: http://man.he.net/?topic=wget&section=all
With both -O and -o pointing to /dev/null, the output redirection ( &> ... ) should not be needed.
If you don't need to download the contents, and only need the server to process the request, you can simply use the --spider argument
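For example, the command from the question could become the following cron entry (note that --spider makes wget issue a HEAD request rather than GET; most PHP endpoints still execute either way, but it's worth verifying yours behaves the same):

```shell
# Runs daily; --spider checks the URL without saving any file, -q silences output.
# The quotes keep the shell from interpreting the & characters in the URL.
0 0 * * * wget -q --spider "http://example.com/wp-admin/tools.php?page=post-by-email&tab=log&check_mail=1"
```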

Cron output to nothing

I've noticed that my cron outputs are creating index.html files on my server. The command I'm using is wget http://www.example.com 2>&1. I've also tried including --reject "index.html*"
How can I prevent the output from creating index.html files?
--2013-07-21 16:03:01-- http://www.example.com
Resolving example.com... 192.0.43.10
Connecting to www.example.com|192.0.43.10|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 0 [text/html]
Saving to: `index.html.9'
0K 0.00 =0s
2013-07-21 16:03:03 (0.00 B/s) - `index.html.9' saved [0/0]
Normally, the whole point of running wget is to create an output file. A URL like http://www.example.com typically resolves to http://www.example.com/index.html, so by creating index.html, the wget command is just doing its job.
If you want to run wget and discard the downloaded file, you can use:
wget -q -O /dev/null http://www.example.com
The -q option suppresses log messages; -O /dev/null discards the downloaded file.
If you want to be sure that anything wget writes to stdout or stderr is discarded:
wget -q -O /dev/null http://www.example.com >/dev/null 2>&1
In a comment, you say that you're using the wget command to "trigger items on your cron controller" using CodeIgniter. I'm not familiar with CodeIgniter, but downloading and discarding an HTML file seems inefficient. I suspect (and hope) that there's a cleaner way to do whatever you're trying to do.

Adding a cronjob into cPanel

I just need to run the following URL via a cron job in my cPanel.
When I try to execute the link
http://www.insidewealth.com.au/index.php?option=com_acymailing&ctrl=cron
the link runs in a browser, but when I tried to add the same URL as-is to the cron jobs I got the following error:
bash/sh/ file not found
and when I edited the cron job to
/usr/bin/php /home/staging/public_html/index.php?option=com_acymailing&ctrl=cron
I got a 404 error.
My cPanel username is staging
Can anybody tell me the syntax for cron jobs in cPanel? The cron job runs every minute and the email report shows these errors.
Use wget with the full URL.
@yannick-blondeau As suggested, you can use wget or curl to make a simple request to your website.
By default wget saves the response to a file; the -O /dev/null flag discards it, and -q silences the log output (neither option alone does both jobs). Also note that the URL must be quoted: an unquoted & makes the shell background the command and truncate the URL. For example:
wget -q -O /dev/null 'http://www.insidewealth.com.au/index.php?option=com_acymailing&ctrl=cron'
You can also use curl for the same effect:
curl --silent 'http://www.insidewealth.com.au/index.php?option=com_acymailing&ctrl=cron'
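Putting it together as a full cPanel cron entry (a sketch; the every-15-minutes schedule is illustrative, and the quotes around the URL are required so the shell doesn't interpret the &):

```shell
*/15 * * * * wget -q -O /dev/null 'http://www.insidewealth.com.au/index.php?option=com_acymailing&ctrl=cron'
```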

Check Whether a Web Application is Up or Not

I would like to write a script to check whether an application is up or not using a Unix shell script.
From googling I found the script wget -O /dev/null -q http://mysite.com, but I'm not sure how it works. Can someone please explain? It would be helpful for me.
Run the wget command
the -O option tells wget where to put the data that is retrieved.
/dev/null is a special UNIX file that discards whatever is written to it. In other words, the data is thrown away.
-q means quiet. Normally wget prints lots of info telling its progress in downloading the data, so we turn that off.
http://mysite.com is the URL of the exact web page that you want to retrieve.
Many programmers create a special page for this purpose that is short and contains status data. In that case, do not discard it but save it to a log file by replacing -O /dev/null with -O - >> mysite.log, which appends each response to the log.
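As a sketch, a cron entry following that status-page pattern might look like this (the status URL and log path are made up):

```shell
# Append each response to a log; 2>&1 also captures any wget errors.
0 * * * * wget -q -O - http://mysite.com/status.php >> /home/you/mysite-status.log 2>&1
```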
Check whether you can connect to your web server:
Connect to the port where your web server listens.
If it connects properly, your web server is up; otherwise it is down.
You can check further (e.g., whether the index page is correct).
See this shell script:
if wget -O /dev/null -q http://shiplu.mokadd.im; then
    echo "Site is up"
else
    echo "Site is down"
fi
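A one-line curl equivalent for cron (a sketch: -s silences progress output, -f turns HTTP errors such as 500 into a nonzero exit status, so the || branch runs only on failure):

```shell
*/5 * * * * curl -sf -o /dev/null http://shiplu.mokadd.im || echo "Site is down"
```

Because cron mails any output a job produces, the echo fires only when the check fails, giving you a minimal down alert.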
