Cron run every minute (runs in bash but not in cron) - linux

This has been discussed several times in previous posts. I followed the advice given, but it does not work for me. I have two scripts which are run by the cron service every minute. To my surprise, only one runs each minute (the 1st in the list below); the other (the 2nd in the list below) fails. Most surprising, when run directly from the terminal, both scripts execute fine.
Cron setup:
*/1 * * * * /home/user/Desktop/scripts/generatepattern.sh
*/1 * * * * /home/user/Desktop/scripts/getnextfile.sh
File permissions are:
-rwxr--r-- 1 user user 522 Jul 25 16:18 generatepattern.sh
-rwxr--r-- 1 user user 312 Jul 25 23:02 getnextfile.sh
The code for the script that does not run under cron is:
#!/bin/bash
#Generate a file to be used for the search
cd /home/user/Desktop/scripts
no=`cat filecount.txt`
if test $no -lt 20
then
    #echo "echo less"
    #echo $no
    expr `cat filecount.txt` + 1 > filecount.txt
fi

In the last line you wrote cat filecount.txt instead of cat /home/user/Desktop/scripts/filecount.txt.
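If you follow that advice, the last line of the script would read as below (paths taken from the question):
expr `cat /home/user/Desktop/scripts/filecount.txt` + 1 > /home/user/Desktop/scripts/filecount.txt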

I discovered the main issue was that new cron settings only get installed when the vi editor is closed. Changes have to be made in the editor and a :wq command issued so that the new settings get installed. Just issuing a :w command does not work, since no install happens (this was my mistake). I realised this after issuing :wq in vi and seeing the following output:
# crontab -e
crontab: installing new crontab
Thanks for all the other suggestions.
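(Not from the original answers, but a quick sanity check: crontab -l prints the table cron has actually installed, so after saving with :wq you can confirm both lines are there.)
# crontab -l
*/1 * * * * /home/user/Desktop/scripts/generatepattern.sh
*/1 * * * * /home/user/Desktop/scripts/getnextfile.sh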

Related

Cron.d file not running

I am new to Linux and have a problem that I have tried to look up online without finding a solution.
I have 4 scripts running from cron.d. 3 of them are set to run every minute and they work fine, logging to their output files, but the last one, which should run at 1:00 am, will not.
-rw-r--r-- 1 root root 390 Jul 29 14:03 Test
* * * * * root /usr/sa/dir1/dir2/script1.sh >> /usr/sa/dir1/logs/fileoutput 2>&1
* * * * * root /usr/sa/dir1/dir2/script2.sh >> /usr/sa/dir1/logs/fileoutput 2>&1
* * * * * root /usr/sa/dir1/dir2/script3.sh >> /usr/sa/dir1/logs/fileoutput 2>&1
0 1 * * * root /usr/sa/dir1/dir2/script4.sh >> /usr/sa/dir1/logs/fileoutput 2>&1
I checked the permissions and all seems to be fine, as the other scripts in the same cron.d file are running: I can see their executions logged by cron in /var/log/messages, and the same in the log files.
Things I have tried so far that worked:
If I vim the file and change the 4th script to run every minute, it runs fine.
If I vim the file and change the 4th script to run during the day, it runs.
If I include the script in the crontab of the root user, it runs OK.
If I run the script from the command line, it runs OK.
I cannot figure out why the script only gets executed by cron after I vim the file in cron.d.
Thank you in advance for the help.
I don't see errors in /var/log/messages
Are you committing this crontab by just editing the raw file, or are you using crontab <my-new-crontab>? If you aren't, then try the latter. :)
I ended up adding the scripts to the crontab of the root user and that seems to be OK.
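For the record (a sketch, not stated explicitly in the thread): when an entry is moved from /etc/cron.d into root's own crontab, the user field has to be dropped, since per-user crontabs have only the five time fields plus the command:
0 1 * * * /usr/sa/dir1/dir2/script4.sh >> /usr/sa/dir1/logs/fileoutput 2>&1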

Running a script in crontab

I have a very simple script in my crontab that I want to run every day. It is located in /home:
-rwxr-xr-x 1 root root 40 Apr 15 08:01 kill_slony_stop_sql.sh
It has execute permission and here is the content:
#!/bin/bash
slon_kill;rcpostgresql stop
and here is the cron line for it to run daily:
56 12 * * * /home/kill_slony_stop_sql.sh
But it is not working for some reason. When I type /home/kill_slony_stop_sql.sh on the command line it works fine, but it does not work from the crontab.
Any thoughts?
It is most likely a PATH issue. Have a look at Why is my crontab not running and be sure to set a PATH so that it can call your slon_kill command.
Also, add some debugging output to your cron entry (capturing both stdout and stderr):
56 12 * * * /home/kill_slony_stop_sql.sh >/tmp/errorcron.log 2>&1
And also look at the logs; cron logs its actions via syslog, which (depending on your setup) often go to /var/log/cron or /var/log/syslog.
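A minimal sketch of the PATH fix inside the script itself (the directory below is an assumption; check the real location with command -v slon_kill and adjust):
#!/bin/bash
# NOTE: assumed locations -- replace with the directories command -v reports on your system
PATH=/usr/local/pgsql/bin:/usr/sbin:/usr/bin:/sbin:/bin
export PATH
slon_kill; rcpostgresql stop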
I had the same problem with a daily cron job. I used @daily, but note that this runs at 00:00 every day.
@daily /usr/local/bin/msa70_check.sh
was the crontab line I added; below is the script I run.
#!/bin/bash
# msa70 disk check
/sbin/mdadm --detail /dev/md0 /dev/md1 |
/bin/mailx -s "Disk check on server123 please check" person@domain.com
I also had to edit my script and add /sbin/ and /bin/ in front of mdadm and mailx for the cron job to run.
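(Side note, not from the original answer: from a normal shell you can find the absolute paths to prepend with command -v.)
command -v mdadm     # prints /sbin/mdadm here
command -v mailx     # prints /bin/mailx here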

Linux bash shell script output is different from cronjob vs manually running the script

I wrote a Linux bash shell script which works fine, except that its output when run manually is different from when it runs from a cron job.
The particular command is lftp:
lftp -e "lcd $outgoingpathlocal;mput -O $incomingpathremote *.CSV;exit" -u $FTPUSERNAME,$FTPPASSWORD $FTPSERVER >> ${SCRIPTLOGFILE} 2>&1
When I run the script manually, ${SCRIPTLOGFILE} contains a lot of info, such as how many files/bytes/etc. were transferred. But when I run the same script from a cron job there is no output unless there was an error (such as being unable to connect). I have tried various terminal output configurations but none work for this lftp command. Suggestions?
It's worth reading this:
crontab PATH and USER
In particular, cron won't set the same environment variables you're used to in an interactive shell.
You might want to wrap your entire cron job up in a script, and then you can, for example, temporarily add some code like export >> scriptenvironment.txt and see what the difference is between the cron invoked script and the interactively invoked script.
Try man 5 crontab for details.
Once you know what environment variables you need for your script to run, you can set them in the crontab as necessary, or source them at the start of your own script.
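A minimal wrapper along those lines (the script name below is just a placeholder for your own job):
#!/bin/sh
# dump the environment cron provides, to compare against an interactive shell
export > /tmp/cron-environment.txt
# then run the real job
/path/to/your-lftp-script.sh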
EXAMPLE CRON FILE
# use /bin/sh to run commands, overriding the default set by cron
SHELL=/bin/sh
# mail any output to `paul', no matter whose crontab this is
MAILTO=paul
#
# run five minutes after midnight, every day
5 0 * * * $HOME/bin/daily.job >> $HOME/tmp/out 2>&1
# run at 2:15pm on the first of every month -- output mailed to paul
15 14 1 * * $HOME/bin/monthly
# run at 10 pm on weekdays, annoy Joe
0 22 * * 1-5 mail -s "It's 10pm" joe%Joe,%%Where are your kids?%
23 0-23/2 * * * echo "run 23 minutes after midn, 2am, 4am ..., everyday"
5 4 * * sun echo "run at 5 after 4 every sunday"

Why is my scheduled job not working?

I defined a job with crontab like this:
0 2 * * * dbadmin . /home/dbadmin/back.sh
It is not in root's crontab; I want to run this .sh file as the dbadmin user.
But when I checked, it is not working.
The log gives this:
Feb 22 21:16:01 localhost crond[14634]: (*system*) BAD FILE MODE (/etc/crontab)
Feb 22 21:16:01 localhost crond[14634]: (dbadmin) RELOAD (cron/dbadmin)
Feb 22 21:16:01 localhost crond[28451]: (dbadmin) CMD (dbadmin . /home/dbadmin/back.sh)
How can I fix this? Thanks in advance.
Make a crontab entry as dbadmin without the username in it:
0 2 * * * /home/dbadmin/movefolder.sh > /home/dbadmin/cron.out 2>&1
Each day the logfile /home/dbadmin/cron.out should be replaced by a new one.
When you are confident about the cron+movefolder, replace the outputfile with /dev/null.
If the above fails, check by calling the script as dbadmin:
sh /home/dbadmin/movefolder.sh
When this one works and cron fails, it might be the environment. Try something like
0 2 * * * . /home/dbadmin/.profile; /home/dbadmin/movefolder.sh > /home/dbadmin/cron.out 2>&1
Using the crontab program, you normally have access only to the 5 scheduling fields (minute, hour, day of month, month and day of week). However, with Vixie cron (usually on Linux) by editing the system crontab file (/etc/crontab, as well as files in /etc/cron.d) you can use the 6th field for the username. For example, see How to specify in crontab by what user to run script?
If you use crontab to enter this line
0 2 * * * dbadmin . /home/dbadmin/back.sh
^^^^^^^
the "dbadmin" username is treated as the command to execute. You can (as noted in crontab's manual page) use that line in /etc/crontab. I pointed out that this is Vixie (also known as ISC) crontab. Legacy systems such as Solaris have a less capable crontab which would not allow specifying the user to run under.
According to cron's manual page, it will send output via email. Perhaps there was no email because the command "dbadmin" failed.
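To make the two working variants of that line concrete (a sketch; keeping the path from the question):
# in /etc/crontab or a file in /etc/cron.d -- the 6th field names the user to run as
0 2 * * * dbadmin /home/dbadmin/back.sh
# in dbadmin's own crontab (crontab -e while logged in as dbadmin) -- no user field
0 2 * * * /home/dbadmin/back.sh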
You need to use the sh command to execute the .sh file, like the following:
0 2 * * * dbadmin sh /home/dbadmin/movefolder.sh
Hope it works.
Thank you.

Shell script to log server checks runs manually, but not from cron

I'm using a basic shell script to log the results of top, netstat, ps and free every minute.
This is the script:
/scripts/logtop:
TERM=vt100
export TERM
time=$(date)
min=${time:14:2}
top -b -n 1 > /var/log/systemCheckLogs/$min
netstat -an >> /var/log/systemCheckLogs/$min
ps aux >> /var/log/systemCheckLogs/$min
free >> /var/log/systemCheckLogs/$min
echo "Message Content: $min" | mail -s "Ran System Check script" email#domain.com
exit 0
When I run this script directly it works fine. It creates the files and puts them in /var/log/systemCheckLogs/ and then sends me an email.
I can't, however, get it to work when trying to get cron to do it every minute.
I tried putting it in /var/spool/cron/root like so:
* * * * * /scripts/logtop > /dev/null 2>&1
and it never executes
I also tried putting it in /var/spool/cron/myservername and also like so:
* * * * * /scripts/logtop > /dev/null 2>&1
it'll run every minute, but nothing gets created in systemCheckLogs.
Is there a reason it works when I run it but not when cron runs it?
Also, here's what the permissions look like:
-rwxrwxrwx 1 root root 326 Jul 21 01:53 logtop
drwxr-xr-x 2 root root 4096 Jul 21 01:51 systemCheckLogs
Normally crontabs are kept in /var/spool/cron/crontabs/. Also, you normally update them with the crontab command, as this HUPs crond after you're done and makes sure the file ends up in the correct place.
Are you using the crontab command to create the cron entry? Use crontab <file> to import a file directly, or crontab -e to edit the current crontab with $EDITOR.
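For reference, the usual crontab(1) invocations (standard usage, shown here for completeness):
crontab -l            # list the currently installed crontab
crontab -e            # edit it with $EDITOR; it is installed when you exit
crontab mycronfile    # replace the installed crontab with the contents of mycronfile
crontab -u user -e    # the same for another user, when run as root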
All jobs run by cron need the interpreter listed at the top, so cron knows how to run them.
I can't tell if you just omitted that line or if it is not in your script.
For example,
#!/bin/bash
echo "Test cron jon"
When running from /var/spool/cron/root, it may be failing because cron is not configured to run for root there. On Linux, root cron jobs are typically run from /etc/crontab rather than from /var/spool/cron.
When running from /var/spool/cron/myservername, you probably have a permissions problem. Don't redirect the errors to /dev/null; capture and examine them.
Something else to be aware of, cron doesn't initialize the full run environment, which can sometimes mean you can run it just fine from a fully logged-in shell, but it doesn't behave the same from cron.
In the case above, you don't have a "#!/bin/<shell>" line at the top of your script. If root is configured to use something like a regular Bourne shell or csh, the syntax you use to populate your variables will not work, which would explain why the script runs but does not populate your files. So if you need it to be ksh, use "#!/bin/ksh". It's generally best not to trust the environment to keep these things sane. If you need your profile, run a ". ~/.profile" up front as well. Or a quick and dirty way to get a relatively full environment is to run the job through su, as in:
* * * * * su - root -c "/path/to/script" > /dev/null 2>&1
Just some things I've picked up over the years. You're definitely expecting a ksh-style shell based on your syntax, so you might want to be sure the script is using it.
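As a concrete line for this particular job (a sketch; /scripts/logtop is the path from the question):
* * * * * . $HOME/.profile; /scripts/logtop > /dev/null 2>&1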
Thanks for the tips... used a little bit of each answer to get to the bottom of this.
I did have the interpreter at the top (wasn't shown here), but may have been wrong.
Am using #!/bin/bash now and that works.
Also had to tinker with the permissions of the directory the log files are being dumped in to get things working.
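For anyone landing here with the same symptom, the permission check amounts to something like this (the exact mode/owner to set depends on which user the job runs as):
ls -ld /var/log/systemCheckLogs                # confirm who owns the directory and who may write to it
chown myservername /var/log/systemCheckLogs    # hypothetical fix: give the cron job's user write access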
