I have a script written in csh which I want to run weekly.
I tried using a cron job for this, but it seems that cron is trying to run my script with sh, and hence it is not working properly.
What can be done to make sure that the script runs properly from cron, or is there another way to accomplish what I am trying to do without using cron?
My cron job looks something like this:
0 0 * * 6 source ~/cron_job
~/cron_job looks something like this:
#!/bin/csh
source ~/.cshrc;
source ~/test_setup;
source ~/start_test
Also note that running source ~/cron_job directly in a terminal works as intended, but the cron job does not. I get the following error:
/bin/sh: source ~/cron_job : No such file or directory
0 0 * * 6 source ~/cron_job
This tries to "include" or "source" the file in the current script; this is wrong for a number of reasons:
You are trying to source a csh script;
even if it were an sh script, cron expects you to run a separate program, not source something into the current shell (perhaps it would work, I never tried, but consider two sourced scripts that use the same variable or function names. Oops!)
The correct way would be:
0 0 * * 6 csh -f ~/cron_job
This starts csh; the -f is to prevent loading startup files, which may sometimes interfere with the running of the script.
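If you prefer, you can instead make the script executable and let its own #!/bin/csh (or #!/bin/csh -f) first line pick the interpreter; a minimal sketch, assuming ~/cron_job already carries such a line:
# run once from a normal shell to mark the script executable
chmod +x ~/cron_job
# then the crontab entry can invoke it directly
0 0 * * 6 ~/cron_job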
The most probable cause of that behaviour is that cron is running your job with the Bourne shell.
cron(8) does not use your login shell; it executes your crontab(5) jobs with /bin/sh (or whatever SHELL is set in the crontab), so if you want to use a different shell you have to start it as a subprocess, not use source.
Another way is to switch cron's shell to csh(1) and use it instead of sh(1).
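For example, most cron implementations honor a SHELL line at the top of the crontab; a sketch, assuming csh lives at /bin/csh on your system:
SHELL=/bin/csh
0 0 * * 6 source ~/cron_job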
Set up
I have several bash scripts on my computer which I want to run periodically.
I can run the scripts manually in Terminal (Mac OS), e.g. by cd'ing to the correct folder and then executing
./France_run.txt
gives the desired result.
Problem
I do not want to run the bashfiles manually.
I've created cronjobs in crontab, e.g.
0 0 * * 2 /Users/mypath/France_run.txt
which should run each Tuesday at 00:00. However, nothing happens.
Am I only referring to the file and missing a 'run this script' command? Or is it something else?
You may be only referring to the file, and it's probably logging an error somewhere (usually /var/log/messages, or in the mail file of the root user... which is disabled by default on Macs).
The thing about running scripts through cron is that it runs under a different environment. When you normally log in to a Bash session, certain environment variables get set automatically, so the system knows things like the path (locations in the file system where executables can be found). Different Unix-like systems handle this situation slightly differently... I can't recall the details of how Macs deal with it, but on some systems I've had to explicitly provide the full path to, for example, the Bash executable in order to get things to work.
The location of the executable for the scripts is usually /bin/bash, or /bin/sh, or something like that. So when going through a Bash session, if you call /Users/mypath/France_run.txt and that file is an executable Bash script (e.g. the first line is something like #!/bin/bash and the file's executable bit is set), then the system knows to automatically run something like /bin/bash /Users/mypath/France_run.txt.
In the context of cron, however, you don't automatically get those conveniences, so you may have to spell out just about everything (i.e. specify the full paths to all binaries or executables). Again, this is not always the case. I just looked at a Debian system where I created some cron jobs to run scripts, and I didn't have to call /bin/bash there, but I do recall having to do something like that in the past on a Mac.
So your cron job may just need to specify the full path to the Bash binary:
0 0 * * 2 /bin/bash /Users/mypath/France_run.txt
And if France_run.txt makes any calls to system binaries (like ls), you may need to fully qualify those as well (/bin/ls instead of just ls).
Also, depending on how the script is written, it may even be necessary to cd into the directory of the script, as if you were running it manually:
0 0 * * 2 cd /Users/mypath; /bin/bash ./France_run.txt
(cd is a shell built-in, so there's no path to specify there)
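If you want to see what cron is actually complaining about, one simple approach (the log path here is just an example) is to redirect the job's output to a file and look at it after the next scheduled run:
0 0 * * 2 /bin/bash /Users/mypath/France_run.txt >> /Users/mypath/france_run.log 2>&1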
I'm trying to start a bash script (test.sh) from a second bash script that runs as a cron job (startTest.sh) on Ubuntu 14.04.
Cron is running and both scripts work perfectly if called from command line.
startTest.sh looks like this:
#!bin/bash
SHELL=/bin/bash
PATH=/usr/local/bin:/usr/local/sbin:/sbin:/usr/sbin:/bin:/usr/bin:/home/username/path/to/script
bash /home/username/path/to/script/test.sh
test.sh looks like this:
#!/bin/bash
touch it_works.txt
My crontab entry looks like this:
* * * * * /usr/local/bin/startTest.sh
Best practice is generally not to use relative paths (unless you do an explicit cd) in scripts run as cron jobs.
crond is probably not running from whatever directory you expect it to. Depending on what user this cron job runs as, the script either does not have permission to create it_works.txt in crond's current working directory, or it is creating the file and you're looking in the wrong place.
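A minimal sketch of how test.sh could be changed to avoid that (the paths are only examples): either write to an absolute path, or cd to the script's own directory before touching relative paths:
#!/bin/bash
# option 1: use an absolute path for the file
touch /home/username/path/to/script/it_works.txt
# option 2: change to the script's own directory first
cd "$(dirname "$0")" && touch it_works.txt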
I have my PHP script file in /var/www/html/dbsync/index.php. When I cd to /var/www/html/dbsync/ and run php index.php, it works perfectly.
I want to call the PHP file through an sh file; the location of the sh file is as below:
/var/www/html/dbsync/dbsync.sh
This is the content of the dbsync.sh file:
/usr/bin/php /var/www/html/dbsync/index.php >> /var/www/html/dbsync/myscript.log 2>&1 -q -f
When I cd to /var/www/html/dbsync/ and run ./dbsync.sh, it works perfectly as well.
Now I have set up crontab as below:
1 * * * * /var/www/html/dbsync/dbsync.sh /var/www/html/dbsync
However, this crontab is not working as expected.
What can be wrong?
As seen in the comments, the problem is that you are not defining what program should be used to execute the script. Take into account that a cron job is executed in a tiny environment, where not much can be assumed. This is why we define full paths, etc.
So you need to say something like:
1 * * * * /bin/sh /var/www/html/dbsync/dbsync.sh /var/www/html/dbsync
# ^^^^^^^
/bin/sh being the binary you want to use to execute the script.
Otherwise, you can set execute permission on the script and add a shebang header telling it what interpreter to use:
#!/bin/sh
If you do this, specifying the interpreter binary in the crontab entry is not necessary.
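A sketch of that second option, using the paths from the question:
# make the script executable (run once)
chmod +x /var/www/html/dbsync/dbsync.sh
# with #!/bin/sh as its first line, the crontab entry no longer needs an explicit interpreter
1 * * * * /var/www/html/dbsync/dbsync.sh /var/www/html/dbsync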
From Troubleshooting common issues with cron jobs:
Using relative paths. If your cron job is executing a script of some
kind, you must be sure to use only absolute paths inside that script.
For example, if your script is located at /path/to/script.php and
you're trying to open a file called file.php in the same directory,
you cannot use a relative path such as fopen(file.php). The file must
be called from its absolute path, like this: fopen(/path/to/file.php).
This is because cron jobs do not necessarily run from the directory in
which the script is located, so all paths must be called specifically.
Also, I understand you want to run this every minute. If so, 1 * * * * won't do. Instead, that will run at every 1st minute past every hour. So if you want to run it every minute, say * * * * *.
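In other words:
# runs once per hour, at minute 1:
1 * * * * /bin/sh /var/www/html/dbsync/dbsync.sh /var/www/html/dbsync
# runs every minute:
* * * * * /bin/sh /var/www/html/dbsync/dbsync.sh /var/www/html/dbsync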
It is important to understand what "login shell" and "interactive shell" mean.
Login shell: briefly, the shell you get when you sign in (for example over an ssh session) and receive a terminal window where you can enter shell commands. After login, the system executes the profile file assigned to your account (.bash_profile, .bash_login or .profile) and sets some environment variables, such as the PATH variable, for you.
Interactive shell: after logging in to a system, you can manually start further shell terminals. For these, the system executes .bashrc, which also sets some environment variables and initializes the PATH variable for your manually opened shell session.
Shell scripts started by the OS and cron jobs do not fit either of the above ways of starting a shell. Therefore no such startup files or user profiles are executed, which means the PATH variable is not initialized; shell commands cannot be found because PATH does not point to the right places.
This explains why your script runs successfully if you start it manually but fails when you start it via crontab.
Solution-1:
Use the absolute path of every shell command in your script file(s) instead of only the command name:
instead of "awk" use "/usr/bin/awk"
instead of "sed" use "/bin/sed"
Solution-2: Initialize environment variables and especially the PATH variable before executing shell scripts!
Method 1: add this header to your dbsync.sh:
#!/bin/bash -l
Method 2: add bash -l in your cron file:
1 * * * * bash -l /var/www/html/dbsync/dbsync.sh /var/www/html/dbsync
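Alternatively (a rough sketch; adjust the directories to whatever your commands actually need), you can initialize PATH yourself at the top of dbsync.sh so the script no longer depends on the caller's environment:
#!/bin/sh
# set a sane PATH explicitly, since cron provides almost none
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
export PATH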
I have a shell script that I have scheduled with cron using this entry:
0 10 * * * /directory/Script.sh > /directory/log/output.log
The script is scheduled to run at 10 AM every day. The script executes, but the output files it produces contain only headers; there is no content.
The script produces two output files. When I run the script manually it works fine, but when it is scheduled it does not produce the correct output.
Help me out.
Thanks
Multiple reasons
1> Check the full path of every executable used in the script.
2> Ensure all environment variables are set accordingly.
3> Check the script when it is run as the same user that cron executes it as.
Apart from these differences, there is technically no difference between running a script manually and scheduling it from cron.
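One rough way to test points 2 and 3 from your own terminal (a sketch; cron's real environment also contains a few variables such as HOME and LOGNAME) is to run the script with an almost empty environment:
env -i /bin/sh -c '/directory/Script.sh > /directory/log/output.log'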
I have a script used for zipping a database and site files, then dumps the output into a backup folder on the server. The script runs fine from the command line, but it will not work through cron.
After much research, I am thinking that cron cannot run it in its current form because it runs in a different environment.
Here is the script, saved as file_name.sh
#!/bin/bash
NOW=$(date +"%Y-%m-%d-%H%M")
FILE="website.com.$NOW.tar"
BACKUP_DIR="/backupfolder"
WWW_DIR="/var/www/website/"
DB_USER="dbuser"
DB_PASS="dbpw"
DB_NAME="dbname"
DB_FILE="website.com.$NOW.sql"
WWW_TRANSFORM='s,^var/www/website,www,'
DB_TRANSFORM='s,^backupfolder,database,'
tar -cvf $BACKUP_DIR/$FILE --transform $WWW_TRANSFORM $WWW_DIR
mysqldump -u$DB_USER -p$DB_PASS $DB_NAME > $BACKUP_DIR/$DB_FILE
tar --append --file=$BACKUP_DIR/$FILE --transform $DB_TRANSFORM $BACKUP_DIR/$DB_FILE
rm $BACKUP_DIR/$DB_FILE
gzip -9 $BACKUP_DIR/$FILE
I currently have the script stored in /usr/local/scripts/
Is there something wrong with the above code that does not allow it to run through cron?
Which crontab should it go in? crontab -e from terminal, or /etc/crontab? They are two different files.
Several things come to mind: first, one of the most common problems with cron jobs is that crond generally runs things with a very minimal PATH (usually just /usr/bin:/bin), so if the script uses any commands from some other directory, it'll fail. Where is mysqldump on your system (run which mysqldump if you aren't sure)? If this is the problem, adding PATH=/usr/local/bin:/usr/bin:/bin (or whatever's appropriate in your case) at the beginning of your script should fix it. Alternately, you can set PATH in the crontab file (put the line before the entry that runs your script).
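A sketch of the crontab variant (the PATH value is only an example; use whatever directories 'which mysqldump' and friends report):
PATH=/usr/local/bin:/usr/bin:/bin
1 1 * * * /usr/local/scripts/file_name.sh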
If that's not the problem, my next step would be to capture the script's output, with something like:
1 1 * * * /usr/local/scripts/file_name.sh >/tmp/file_name.log 2>&1
... and see if the output is informative. BTW, as @tripleee mentioned, the format of your cron entry is suitable for the files crontab -e edits, but not for /etc/crontab. The /etc version has an additional field specifying which user to run the job as, e.g.
1 1 * * * eric /usr/local/scripts/file_name.sh >/tmp/file_name.log 2>&1
Best practice is to always use crontab -e (the resulting files are usually kept in /var/spool/cron/), and this works on every Unix and Linux platform I have ever worked on.
Other common issues with cron execution are missing environment variables. Any environment variables set in .bash_profile (or .profile if you use the Korn shell) will not necessarily be present in the cron environment. This can be overcome by including them in your script.
As Gordon said, paths are another suspect. You can always use the full path to your executables in your script (e.g. /bin/mysqldump). Some of the more cynical of us do this anyway, to make sure we are executing what we intended, as opposed to some other file of the same name in the current path.
Since you fixed it by creating /scripts, I can only guess at your specific problem: perhaps the permissions on the /usr/local/scripts directory did not allow execution by the cron user?
I have had to remove the extension (.sh) for cron to run in some instances.
So I fixed it. Not sure what the problem was, but this worked for me.
I originally had the scripts located in /usr/local/scripts/
I created a new directory, /scripts/, and moved the scripts there. The new crontab -e entry looked like this:
1 1 * * * bash /scripts/file_name.sh
Works perfectly. Again, I am not sure what the issue was before, but it works now.