Cron logging in /var/log/syslog - cron

I have a rake task which executes successfully, but cron logs the run in /var/log/syslog. Is this a genuine problem? I've been observing this for more than a month now, and cron always logs the run as an error in /var/log/syslog. Is there a reason behind this?

Check the return value of the executed command and the output it prints to STDOUT and STDERR. A non-zero exit status, or any output the job produces, is what makes cron log it.
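To illustrate: cron considers a job noisy if it writes anything to stdout or stderr, so redirecting those streams keeps cron quiet. A minimal sketch, where the function stands in for the rake task and the file names are illustrative:

```shell
#!/bin/sh
# Why cron logs/mails a job: any bytes on stdout or stderr, or a non-zero
# exit status. This stand-in for the rake task shows how redirection keeps
# cron quiet while still preserving errors.
run_task() {
  echo "task output"          # cron would mail/log this
  echo "task warning" >&2     # and this
}
: > task.err                  # fresh log file for the demo
run_task >/dev/null 2>>task.err
cat task.err                  # only the stderr line was kept
```

In a real crontab entry the same idea looks like `rake my:task >/dev/null 2>>/path/to/task.err` (task name and path assumed).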

Related

How to get to the shell that runs my rc.local script?

I have modified my rc.local script to run some python script at startup.
This python script seems to be started successfully.
As the script runs forever (which is intended) and I want to see what it does, my question is:
Is there a way to access the shell that runs this script?
Yes, to see what is going on, I could log to some file, but what if that script needs to get input from the user via console?
Thanks for your help!
You will not be able to interact with the script run by rc.local. But you can see what it does by logging its output into dedicated files:
python myscript.py > /home/myhome/log/myscript.log 2> /home/myhome/log/myscript.err
where error messages go into a separate log file.
Note that your script will be executed by root, so files it creates will have root's ownership and permissions.
Here's a link to an earlier answer about this with a method to log all outputs of rc.local.
Now you can see in your log file whether execution stops because the script demands input or because it crashes, and then you can fix the script accordingly.
If you don't want to mess with rc.local for testing, you could also first run it through crontab on your or root's account (scheduled execution by user, see man crontab). This might be easier for debugging, and you can start it through rc.local once it works as you want.
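Beyond redirecting a single command, the whole of rc.local can be logged with one exec line near the top. A minimal sketch, assuming /tmp/rc-local.log as an illustrative path (a real rc.local would log under /var/log, which requires root):

```shell
#!/bin/sh
# Sketch of an rc.local that logs everything it prints, since there is no
# terminal to interact with at boot.
exec >>/tmp/rc-local.log 2>&1   # from here on, all output lands in the log
echo "rc.local started"
# python /home/myhome/myscript.py &   # long-running script, backgrounded
exit 0
```

After boot you can follow the file with `tail -f /tmp/rc-local.log` to watch what the script does.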

Where does Perl's Carp library print logs when run from crontab?

I have a Perl script that uses Carp (carp, croak) to print info and errors.
When run directly I can see the output on the console, but when I run the script as a cron job I don't know where the messages go.
I've already checked /var/log/messages and /var/log/cron, and found nothing.
Please help.
man cron:
When executing commands, any output is mailed to the owner of the
crontab (or to the user named in the MAILTO environment variable in
the crontab, if such exists). Job output can also be sent to syslog by
using the -s option.
I think you should redirect stdout, and also stderr (which is where carp and croak write), in your cron job.
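Concretely: carp() and croak() print to STDERR, and without redirection cron mails that output rather than writing it to a log file. A sketch of the fix, with the crontab paths below being assumptions, and a plain-shell stand-in for the Perl script demonstrating the redirection:

```shell
#!/bin/sh
# Carp's carp()/croak() write to STDERR; cron mails whatever a job emits.
# Redirecting both streams into one file makes the messages findable.
# Assumed crontab entry:
#   0 * * * * /usr/bin/perl /opt/app/job.pl >>/var/log/job.log 2>&1
# The same redirection, with a stand-in for the Perl script:
fake_job() { echo "progress info"; echo "carp: lookup failed" >&2; }
: > /tmp/job.log
fake_job >>/tmp/job.log 2>&1
cat /tmp/job.log              # both streams ended up in one file
```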

Cron Logging Issue

I cannot find documentation on how to set up Cron commands for some reason. Here's my current Cron command using GoDaddy's cron job manager tool. It runs twice an hour.
/web/cgi-bin/php5 "$HOME/~path-to-file~/my-function.php" > "$HOME/~path-to-file~/my-function.log"
What I'd like to do is get a thorough log of each time the Cron runs - output, timestamp, etc. I'm trying to debug this script and I'm stuck without some kind of error logging.
Thanks all!
You are directing standard output to the specified file with >, but standard error gets sent into oblivion. Add 2>&1 to the end of the command so any errors go to the same file. Note also that > truncates the log on every run; use >> if you want to keep a history across runs.
/web/cgi-bin/php5 /path/to/php > /path/to/logfile.log 2>&1
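Since the asker also wanted a timestamp per run, a small wrapper can stamp each execution and append rather than truncate. A sketch, with the echo standing in for the php5 invocation from the question and the log path assumed:

```shell
#!/bin/sh
# Per-run timestamps: group the commands, stamp them with date, and append
# (>>) instead of truncating (>) so earlier runs survive in the log.
LOG=/tmp/my-function.log
{
  echo "=== run at $(date '+%Y-%m-%d %H:%M:%S') ==="
  echo "simulated script output"   # stand-in for the php5 command
} >>"$LOG" 2>&1
tail -n 2 "$LOG"                   # show the latest run
```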

Cron job mysteriously stopped running?

I have a cron job on an Ubuntu 10.4 server that stopped running for no apparent reason. (The job ran for months and has not been changed.) I am not a *nix guru so I plead ignorance if this is a simple problem. I can't find any reason or indication why this job would have stopped. I've restarted the server without success. Here's the job:
# m h dom mon dow command
0 * * * * java -jar /home/mydir/myjar.jar >>/home/mydir/crontaboutput.txt
The last line in the output file shows that the program ran on 8/29/2012. Nothing after that.
Any ideas where to look?
There should be something in your system log from when the job ran. The other thing you could try is to add 2>&1 to the job to see any errors in your text file. – Lars Kotthoff
This proved to be the key piece of information: adding 2>&1 allowed me to capture an error that wasn't getting reported anywhere else. Note that the order matters: 2>&1 must come after the file redirection, otherwise stderr is duplicated to the old stdout instead of the file. The completed command line then looked like:
java -jar /home/mydir/myjar.jar >>/home/mydir/crontaboutput.txt 2>&1
Perhaps your cron daemon has stopped, or its configuration has changed (e.g. /etc/cron.deny). I suggest making a shell script and running it from crontab. I also suggest running some other program (just for testing) through your crontab at some other time. You can use the logger command in your shell script to write to syslog. Look into the system log files.
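The logger suggestion can be sketched as a wrapper script around the job, leaving a syslog trail with the exit status. The echo stands in for the `java -jar /home/mydir/myjar.jar` invocation, the tag "myjar" is an assumption, and logger failures are ignored so the sketch also runs where syslog is unavailable:

```shell
#!/bin/sh
# Wrapper sketch: record start/finish and the exit status in syslog via
# logger(1), while the program's own output goes to a file as before.
run() { echo "jar output"; }   # stand-in for: java -jar /home/mydir/myjar.jar
logger -t myjar "cron run starting" 2>/dev/null || true
run >>/tmp/crontaboutput.txt 2>&1
status=$?
logger -t myjar "cron run finished, exit=$status" 2>/dev/null || true
echo "exit=$status"
```

Point the crontab entry at this wrapper instead of the java command, and `grep myjar /var/log/syslog` shows every attempt, whether or not the jar produced output.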
The accepted answer is correct (i.e. check the error logs), which pointed out the error in my case. Besides that, check for the following issues:
include('../my_dir/my_file.php') may work from a URL, but a relative path like this fails when the cron job runs from a different working directory, and will spit out an error.
$_SERVER variables are not set under cron, so if you are using $_SERVER['DOCUMENT_ROOT'] it will not be recognized and you will get an error in the cron job.
Make sure to test the cron job end to end: have it run, send an email, etc., to confirm it actually executes.

How would I go about debugging a cron job that executes the script, but the script seems not to complete?

I have a cron job scrape.sh that looks like this:
#!/bin/bash
touch rage
cd /etc/myproject/scraper
scrapy crawl foosite --set FEED_URI=../feeds/foosite.xml --set FEED_FORMAT=xml
scrapy crawl barsite --set FEED_URI=../feeds/barsite.xml --set FEED_FORMAT=xml
When it executes the file rage does get created and judging from my syslog it does run as root, so permissions shouldn't be a problem.
May 6 17:35:01 server CRON[10233]: (root) CMD (/etc/myproject/scraper/scrape.sh)
May 6 17:40:01 server CRON[17804]: (root) CMD (/etc/myproject/scraper/scrape.sh)
When I run scrape.sh it executes as expected and puts the foosite.xml file in the ../feeds directory, the directory exist and is empty when the cron jobs starts. What can I do to solve this issue?
If I were to guess, the problem is an environment issue (e.g. scrapy is not in the PATH).
To debug, make sure your cron job sends standard output and standard error to a log file and/or syslog.
Maybe the command scrapy is not found? Cron jobs typically get a different shell environment than interactive shells, so perhaps scrapy is missing in your PATH and you should use /some/full/path/to/scrapy.
If that doesn't help, try redirecting stdout and stderr to files so you can see what the output is.
http://tldp.org/HOWTO/Bash-Prog-Intro-HOWTO-3.html
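The PATH hypothesis is easy to check: cron runs jobs with a minimal environment, so a command found in your login shell can be "not found" under cron. A sketch that simulates cron's stripped environment and probes for scrapy:

```shell
#!/bin/sh
# Simulate cron's minimal environment (env -i drops all variables) and see
# whether scrapy is still findable on a cron-like PATH.
env -i PATH=/usr/bin:/bin sh -c 'command -v scrapy || echo "scrapy: not on this PATH"'
# The fix: call scrapy by full path in scrape.sh, or set PATH explicitly at
# the top of the script, e.g. PATH=/usr/local/bin:/usr/bin:/bin (the install
# location is an assumption -- check yours with `which scrapy` interactively).
```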
