I have a Perl script that uses Carp (carp, croak) to print info and error messages.
When I run it directly I can see the output on the console, but when I run the script as a cron job I don't know where the messages are going.
I have already checked /var/log/messages and /var/log/cron; the messages are not there.
Please help.
man cron:
When executing commands, any output is mailed to the owner of the
crontab (or to the user named in the MAILTO environment variable in
the crontab, if such exists). Job output can also be sent to syslog by
using the -s option.
I think you should redirect stdout in your cron job. Note that carp and croak actually write to STDERR, so redirect that stream as well.
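For example, a crontab entry like this (the script path is just a placeholder) captures both streams in one log file:
* * * * * /usr/bin/perl /path/to/script.pl >> /tmp/script.log 2>&1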
Related
I am trying to find a way to record every single command that is executed by any user on the system.
Things that I have come across so far:
It is possible to view shell commands executed from the terminal using the ~/.bash_history file.
There is a catch here: it logs only those commands which were executed interactively from a bash shell/terminal.
This solves one of my problems, but in addition I would also like to log commands that are executed as part of a shell script.
Note: I don't have control over the shell scripts. Therefore, adding verbose mode like #!/bin/bash -xe is not possible.
However, it can be assumed that I have root access as the system administrator.
Eg: Another user has access to the system, and he runs the following shell script from his account:
#!/bin/sh
nmap google.com
and runs it as "sh script.sh".
Now, what I want is that the "nmap google.com" command should be logged somewhere once this file is executed.
Thanks in advance. Even a small help is appreciated.
Edit: I would like to clarify that the users are unaware that they are being monitored, so I need a solution at the system level (maybe an agent running as root). I cannot depend on the users to log suspicious activity themselves; of course anyone doing something fishy or wrong will avoid such tricks or use them to put the blame on someone else.
I am aware that you were asking about Bash and shell scripting and tagged your question accordingly, but with respect to your requirements
Record every single command that is executed by any user on the system
Users are unaware that they are being monitored
A solution something at system level
I am under the assumption that you are looking for Audit Logging.
So you may take advantage of articles like
Log all commands run by Admins on production servers
Log every command executed by a User
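As a quick illustration of the audit-logging approach with auditd (the rule key command-log is my own choice), you can have the kernel record every execve() call:
# record every command executed on a 64-bit system
auditctl -a always,exit -F arch=b64 -S execve -k command-log
# review what was recorded, with syscall data decoded
ausearch -k command-log -i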
You can run the script this way:
execute it with bash (this overrides the shebang),
use ts to prefix every line with a timestamp (ts comes with the moreutils package),
and tee to log both to the terminal and to a file:
bash -x script.sh |& ts | tee -a /tmp/$(date +%F).log
You may ask the other user to create an alias.
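Since a plain alias cannot take the script name as an argument, a small shell function does the job better; a minimal sketch (the name logrun is my own):
logrun() { bash -x "$1" |& ts | tee -a "/tmp/$(date +%F).log"; }
Put it in the user's ~/.bashrc and have them run logrun script.sh instead of sh script.sh.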
Edit:
You may also add this to /etc/profile (sourced when users log in):
exec > >(tee -a /tmp/$(date +%F).log)
Do the same for the error stream if needed; keep the two logs separate.
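A sketch for the error stream, kept in its own file (the .err name is my own choice); the trailing >&2 makes sure errors still reach the terminal as errors:
exec 2> >(tee -a /tmp/$(date +%F).err.log >&2)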
I have modified my rc.local script to run a Python script at startup.
The Python script seems to start successfully.
As the script runs forever (intended) and I want to see what it does, my question is:
Is there a way to access the shell that runs this script?
Yes, to see what is going on I could log to some file, but what if the script needs to get input from the user via the console?
Thanks for your help!
You will not be able to interact with a script run by rc.local, but you can see what it does by logging its output into dedicated files:
python myscript.py > /home/myhome/log/myscript.log 2> /home/myhome/log/myscript.err
where error messages go into a separate log file.
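Since the script is intended to run forever, you will probably also want to background it with & so that rc.local can finish and the boot is not blocked:
python myscript.py > /home/myhome/log/myscript.log 2> /home/myhome/log/myscript.err &
You can then follow the output live with tail -f /home/myhome/log/myscript.log.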
Note that your script will be executed by root, with permissions and file ownership to match.
Here's a link to an earlier answer about this with a method to log all outputs of rc.local.
Now you can see in your log files whether execution stops because the script demands input or indeed crashes, and then you can fix the script accordingly.
If you don't want to mess with rc.local while testing, you could first run it through the crontab of your own or root's account (scheduled execution per user; see man crontab). This might be easier for debugging, and you can start it through rc.local once it works as you want.
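For instance, root's crontab can mimic the boot-time start with an @reboot line (the script location is an assumption):
@reboot python /home/myhome/myscript.py >> /home/myhome/log/myscript.log 2>&1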
I want to log bash commands executed non-interactively.
Example:
on execution of the command ssh user@a.b.c.d ls -l, it should be logged to syslog.
I found a similar question, but it was no help. Is there any way to achieve this by making changes to the bash package source code?
Thanks in advance!
I tried a lot of methods but ended up with nothing in hand. My simple goal is to reset a variable to zero at the end of the day.
I checked the location of php with "which php" and "whereis php",
which resulted in /usr/bin/php.
Here are some of the things I tried:
/usr/local/bin/php -f /home/USERNAME/public_html/developer3/crons/filename.php
/usr/bin/php -f /home/USERNAME/public_html/developer3/crons/filename.php
php -q /home/USERNAME/public_html/developer3/crons/filaname.php
/usr/bin/wget -O http://subdomain.sitename.com/crons/filename.php
For quick results, I set the timing to execute the code every minute. I could successfully execute the code by visiting
http://subdomain.sitename.com/crons/filename.php
Please guide me.
If the path to php on your system is /usr/bin/php then the command you should be running with cron should be
/usr/bin/php /full/path/to/php/script.php
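Since your goal is to reset the variable at the end of each day, a minimal crontab sketch would run the script once at 23:59 (keeping the placeholder path from above):
59 23 * * * /usr/bin/php /full/path/to/php/script.php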
Is the script you're running designed to be invoked from the PHP CLI in that way? If it isn't, and it works correctly when you use a browser, then you can use curl, if it's installed on your server:
curl -A cron-job http://subdomain.sitename.com/crons/filename.php
It would likely be helpful in any event to configure the MAILTO variable (since you're using cPanel, it's just a text box on the appropriate page) so that you get an email with the output of your cron jobs. Having the output emailed to you will help you diagnose why your script doesn't have the desired effect when cron runs it.
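In a raw crontab the whole thing would look something like this (the address is hypothetical; in cPanel you just fill it into the text box):
MAILTO=you@example.com
59 23 * * * /usr/bin/php /full/path/to/php/script.php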
I have a cron job, scrape.sh, that looks like this:
#!/bin/bash
touch rage
cd /etc/myproject/scraper
scrapy crawl foosite --set FEED_URI=../feeds/foosite.xml --set FEED_FORMAT=xml
scrapy crawl barsite --set FEED_URI=../feeds/barsite.xml --set FEED_FORMAT=xml
When it executes, the file rage does get created, and judging from my syslog it runs as root, so permissions shouldn't be a problem.
May 6 17:35:01 server CRON[10233]: (root) CMD (/etc/myproject/scraper/scrape.sh)
May 6 17:40:01 server CRON[17804]: (root) CMD (/etc/myproject/scraper/scrape.sh)
When I run scrape.sh by hand it executes as expected and puts foosite.xml in the ../feeds directory; the directory exists and is empty when the cron job starts. What can I do to solve this issue?
If I were to guess, the problem is an environment issue (e.g. scrapy is not in the PATH).
To debug, make sure your cron job sends standard output and standard error to a log file and/or syslog.
Maybe the command scrapy is not found? Cron jobs typically get a different shell environment than interactive shells, so perhaps scrapy is missing from your PATH and you should use /some/full/path/to/scrapy.
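As a sketch, the script could set up its own environment; the /usr/local/bin location is an assumption, so check it first with which scrapy:
#!/bin/bash
# cron runs with a minimal PATH; add the directory that contains scrapy
export PATH="$PATH:/usr/local/bin"
cd /etc/myproject/scraper || exit 1
scrapy crawl foosite --set FEED_URI=../feeds/foosite.xml --set FEED_FORMAT=xml
scrapy crawl barsite --set FEED_URI=../feeds/barsite.xml --set FEED_FORMAT=xml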
If that doesn't help, try redirecting stdout and stderr to files so you can see what the output is:
http://tldp.org/HOWTO/Bash-Prog-Intro-HOWTO-3.html
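Concretely, in the crontab (the log path is my own choice; the schedule matches the five-minute interval in your syslog excerpt):
*/5 * * * * /etc/myproject/scraper/scrape.sh >> /tmp/scrape.log 2>&1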