Deal with different directories in bash cron - linux

The problem...
I use trick77's IP blacklist script to configure the firewall of my apache server and am able to run his script in terminal.
However, when assigning the bash script in ipset-blacklist to crontab, it will not run no matter what I do.
Code written in crontab file for root:
@daily /var/bash/update-blacklist.sh
What I think is the culprit...
Since I haven't done this sort of thing before, I believe that the PATH of the bash script isn't set correctly... but again, I'm not sure.
I have seen others using a line such as PATH=/usr/bin:/bin:/usr/sbin:/sbin to resolve problems involving the script's location, but, I don't exactly know what this does.
I set the location of the bash file to /var/bash instead of /usr/bin and I believe that this is throwing things off.
Pardon my lack of understanding. I really am a beginner when it comes to bash.
Any help is greatly appreciated.
What I have done...
Per @EtanReisner:
Added echo here >> /tmp/update-blacklist.out to the top of update-blacklist.sh and set cron to run it every minute (* * * * *).
The file was successfully created.
Added type -p curl grep egrep ipset >> /tmp/update-blacklist.out to the top of update-blacklist.sh, which returned:
-p: not found
curl is /usr/bin/curl
grep is /bin/grep
egrep is /bin/egrep
ipset: not found

The output from type ipset indicated that ipset was not in the cron script PATH which isn't surprising.
The default PATH for cron jobs is fairly limited.
Since ipset is located in /usr/sbin, that is the path that must be added to the cron script's PATH variable.
You touched on this in your question:
I have seen others using a line such as PATH=/usr/bin:/bin:/usr/sbin:/sbin to resolve problems involving the script's location, but, I don't exactly know what this does.
What that does is set the PATH variable to exactly those paths, replacing whatever the default value was.
The PATH variable contains the paths where the shell looks for binaries/scripts/etc. to run when you try to run them as commands.
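As a minimal sketch (assuming the script really does live at /var/bash/update-blacklist.sh and ipset is in /usr/sbin), you could add a PATH line near the top of the script:
# Near the top of /var/bash/update-blacklist.sh (a sketch; adjust to your system):
# cron's default PATH is usually just /usr/bin:/bin, so add the sbin directories
PATH=/usr/sbin:/sbin:/usr/bin:/bin
export PATH
Alternatively, call ipset by its absolute path, /usr/sbin/ipset, wherever the script uses it.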

Related

Import PATH environment variable into Bash script launched with cron

When creating Bash scripts, I have always had a line right at the start defining the PATH environment variable. I recently discovered that this doesn't make the script very portable as the PATH variable is different for different versions of Linux (in my case, I moved the script from Arch Linux to Ubuntu and received errors as various executables weren't in the same places).
Is it possible to copy the PATH environment variable defined by the login shell into the current Bash script?
EDIT:
I see that my question has caused some confusion resulting in some thinking that I want to change the PATH environment variable of the login shell with a bash script, which is the exact opposite of what I want.
This is what I currently have at the top of one of my Bash scripts:
#!/bin/bash
PATH=/usr/local/sbin:/usr/local/bin:/usr/bin:/usr/bin/site_perl:/usr/bin/vendor_perl:/usr/bin/core_perl
# Test if an internet connection is present
wget -O /dev/null google.com
I want to replace that second line with something that copies the value of PATH from the login shell into the script environment:
#!/bin/bash
PATH=$(command that copies value of PATH from login shell)
# Test if an internet connection is present
wget -O /dev/null google.com
EDIT 2: Sorry for the big omission on my part. I forgot to mention that the scripts in question are being run on a schedule through cron. Cron creates its own environment for running the scripts which does not use the environment variables of the login shell or modify them. I just tried running the following script in cron:
#!/bin/bash
echo $PATH >> /home/user/output.txt
The result is as follows. As you can see, the PATH variable used by cron is different to the login shell:
user@ubuntu_router:~$ cat output.txt
/usr/bin:/bin
user@ubuntu_router:~$ echo $PATH
/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
Don't touch the user's PATH at all unless you have a specific reason. Not doing anything will (basically) accomplish what you ask.
You don't have to do anything to get the user's normal PATH since every process inherits the PATH and all other environment variables automatically.
If you need to add something nonstandard to the PATH, the usual approach is to prepend (or append) the new directory to the user's existing PATH, like so:
PATH=/opt/your/random/dir:$PATH
The environment of cron jobs is pretty close to the system's "default" (for some definition of "default") though interactive shells may generally run with a less constrained environment. But again, the fix for that is to add any missing directories to the current value at the beginning of the script. Adding directories which don't exist on this particular system is harmless, as is introducing duplicate directories.
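For example (the directory list here is only illustrative), a script that has to run under cron on more than one distribution might simply start with:
#!/bin/bash
# Append directories that are sometimes missing from cron's minimal PATH.
# Entries that do not exist on a given system, and duplicates, do no harm.
PATH=$PATH:/usr/local/sbin:/usr/local/bin:/usr/sbin:/sbin
export PATH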
I've managed to find the answer to my question:
PATH=$PATH:$(sed -n '/PATH=/s/^.*=// ; s/\"//gp' '/etc/environment')
This command will grab the value assigned to PATH by Linux from the environment file and append it to the PATH used by Cron.
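For illustration (the exact contents vary by system), /etc/environment on Ubuntu typically holds a single quoted PATH assignment:
# Typical /etc/environment on Ubuntu (contents vary by system):
PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games"
The sed expression keeps only the line containing PATH=, deletes everything up to the = sign, strips the double quotes, and prints the result, which is then appended to the PATH that cron already provides.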
I used the following resources to help find the answer:
How to grep for contents after pattern?
https://help.ubuntu.com/community/EnvironmentVariables#System-wide_environment_variables

Why does my crontab not work?

I am planning to run some bash scripts every minute, and I wrote:
* * * * * bash ~/Dropbox/temp_scripts/run_all_scripts
in crontab.
It was supposed to run every minute, but it did not work. Does anyone have any idea why this happens?
Transferring a comment into an answer.
Add I/O redirection to the command line in the crontab entry:
>/tmp/run_all_scripts.out 2>/tmp/run_all_scripts.err
Review the contents of the files after a minute or two has passed. Consider recording the environment to see if that's part of the problem. And consider using bash -x instead of just bash.
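Putting those suggestions together, a diagnostic crontab entry might look something like this (the file names under /tmp are only examples):
# Capture cron's environment, then run the script with tracing and redirected output
* * * * * env > /tmp/run_all_scripts.env; bash -x "$HOME/Dropbox/temp_scripts/run_all_scripts" >/tmp/run_all_scripts.out 2>/tmp/run_all_scripts.err
After a minute or two, /tmp/run_all_scripts.env shows exactly what environment cron provided, and the .out and .err files show the trace and any errors.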
If you still don't get anything (the files in /tmp are not created), then you've got issues with cron; the daemon isn't running, or your user does not have permission to use it (but crontab isn't telling you that), or you've not submitted your crontab to the program (what does crontab -l say?), or … whatever is really wrong.
Note, too, that the output from cron jobs is normally emailed to the user who owns the job (at least on some systems; Mac OS X on a machine I currently use, and Solaris on another I've used previously). You should review that mail on the system.
Thank you! I have already fixed it! The reason it did not work is that I used "ls -a *.sh" in the script, and cron did not find any *.sh files in the directory it was executing from. When I modified it to "ls -a $HOME/Dropbox/temp_scripts/*.sh", everything worked! This debugging technique is quite helpful!
It is, in many ways, the most basic of debugging techniques — make sure you see what is actually happening. If you're not sure why a shell script isn't working, make sure you can see that it is executing and what it is producing in the way of output, and (very often) make sure you can see what it is executing with bash -x or equivalent. (AFAIK, all shells support -x to trace the execution.)

Script runs from terminal, but not cron. What edits to this script do I need to make?

I have a script used for zipping a database and site files, then dumps the output into a backup folder on the server. The script runs fine from the command line, but it will not work through cron.
After much research, I am thinking that cron cannot run it in its current form because it runs in a different environment.
Here is the script, saved as file_name.sh
#!/bin/bash
NOW=$(date +"%Y-%m-%d-%H%M")
FILE="website.com.$NOW.tar"
BACKUP_DIR="/backupfolder"
WWW_DIR="/var/www/website/"
DB_USER="dbuser"
DB_PASS="dbpw"
DB_NAME="dbname"
DB_FILE="website.com.$NOW.sql"
WWW_TRANSFORM='s,^var/www/website,www,'
DB_TRANSFORM='s,^backupfolder,database,'
tar -cvf $BACKUP_DIR/$FILE --transform $WWW_TRANSFORM $WWW_DIR
mysqldump -u$DB_USER -p$DB_PASS $DB_NAME > $BACKUP_DIR/$DB_FILE
tar --append --file=$BACKUP_DIR/$FILE --transform $DB_TRANSFORM $BACKUP_DIR/$DB_FILE
rm $BACKUP_DIR/$DB_FILE
gzip -9 $BACKUP_DIR/$FILE
I currently have the script stored in /usr/local/scripts/
Is there something wrong with the above code that does not allow it to run through cron?
Which crontab should it go in? crontab -e from terminal, or /etc/crontab? They are two different files.
Several things come to mind: first, one of the most common problems with cron jobs is that generally crond runs things with a very minimal PATH (usually just /usr/bin:/bin), so if the script uses any commands from some other binaries directory, it'll fail. Where is mysqldump on your system (run which mysqldump if you aren't sure)? If this is the problem, adding PATH=/usr/local/bin:/usr/bin:/bin (or whatever's appropriate in your case) at the beginning of your script should fix it. Alternately, you can set PATH in the crontab file (put this line before the entry that runs your script).
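Either approach looks roughly like this (adjust the directories to wherever which mysqldump points on your system):
# Option 1: near the top of file_name.sh
PATH=/usr/local/bin:/usr/bin:/bin
export PATH

# Option 2: in the crontab itself, before the entry that runs the script
PATH=/usr/local/bin:/usr/bin:/bin
1 1 * * * /usr/local/scripts/file_name.sh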
If that's not the problem, my next step would be to capture the script's output, with something like:
1 1 * * * /usr/local/scripts/file_name.sh >/tmp/file_name.log 2>&1
... and see if the output is informative. BTW, as @tripleee mentioned, the format of your cron entry is suitable for the files crontab -e edits, but not for /etc/crontab. The /etc version has an additional field specifying which user to run the job as, e.g.
1 1 * * * eric /usr/local/scripts/file_name.sh >/tmp/file_name.log 2>&1
Best practice is to always use crontab -e (the resulting files usually live in /var/spool/cron/); this works on every Unix and Linux platform I have ever worked on.
Other common issues with cron execution are missing environment variables. Any environment variables set in .bash_profile (or .profile if you use korn shell) will not necessarily be present in the cron environment. This can be overcome by including them in your script.
As Gordon said, paths are another suspect. You can always use the full path to your executables in your script (e.g. /bin/mysqldump). Some of the more cynical of us do this anyway, to make sure we are executing what we intended as opposed to some other file of the same name earlier in the search path.
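For instance (the locations shown are typical, not guaranteed), you can look the paths up once and then pin them in the script:
# Find the absolute locations once from an interactive shell, e.g.:
#   which mysqldump    ->  /usr/bin/mysqldump   (location varies by system)
# Then use those full paths inside the script:
/usr/bin/mysqldump -u$DB_USER -p$DB_PASS $DB_NAME > $BACKUP_DIR/$DB_FILE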
I can only guess at your specific problem, since you fixed it by creating /scripts: perhaps the permissions on the /usr/local/scripts directory did not allow execution by the cron user?
I have had to remove the extension (.sh) for cron to run in some instances.
So I fixed it. Not sure what the problem was, but this worked for me.
I originally had the scripts located in /usr/local/scripts/
I created a new directory here - /scripts/ and moved the scripts there. The new crontab -e command looked like this:
1 1 * * * bash /scripts/file_name.sh
Works perfectly. Again, I am not sure what the issue was before, but it works now.

shell script not running via crontab, runs fine manually

I have tried exporting my paths and variables and crontab still will not run my script. I'm sure I am doing something wrong.
I have a shell script which runs a jar file. This is not working correctly.
After reading around I have read this is commonly due to incorrect paths due to cron running via its own shell instance and therefore does not have the same preferences setup as my profile does.
Here is what my script looks like today after several modifications:
#!/bin/bash --
. /root/.bash_profile
/usr/bin/java -jar Pharmagistics_auto.jar -o
...
those are the most important pieces of the script, the rest are straightforward shell based.
Can someone tell me what I am doing wrong?
Try specifying the full path to the jar file:
/usr/bin/java -jar /path/to/Pharmagistics_auto.jar -o
Normally I would just tell you what you have already ruled out: check your path and environment.
Since you have already done this, start debugging. For example, write checkpoints into a logfile to see how far your script gets (or whether it even starts at all), check the cron log file for errors, check your mail (cron sends mail on errors), and so on ...
Not very specific, sorry.
"exporting my paths and variables" won't work since crontab runs in a different shell by a different user.
Also, not sure if this is a typo in how you entered the question, but I see:
usr/bin/java
...and I can't help but notice you're not specifying the fully qualified path. It's looking for a directory named "usr" in the current working directory. Oft times for crontab, the cwd is undefined, hence your reference goes nowhere.
Try specifying the full path from root, like so:
/usr/bin/java
Or, if you want to see an example of relative pathing in action, you could also try:
cd /
usr/bin/java
A few thoughts.
Remove the -- after the #!/bin/bash
Make sure to direct script output seen by cron to mail or somewhere else where you can view it (e.g. MAILTO=desiredUser)
Confirm that your script is running and not blocked by a different long-running script (e.g. on the second line, add touch /tmp/MY_SCRIPT_RAN && exit)
Debug the script using set -x and set -v once you know it's actually running; see the sketch after these points
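A minimal sketch combining these points (the mail address, script path and marker file are placeholders):
# In the crontab: mail any output to a mailbox you actually read
MAILTO=you@example.com
* * * * * /path/to/your_script.sh

# Near the top of your_script.sh while debugging:
set -xv                          # trace commands and echo lines as they are read
touch /tmp/MY_SCRIPT_RAN         # proves cron actually reached this point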
Do you define necessary paths and env vars in your personal .profile (or other script)? Have you tried sourcing that particular file (or is that what you're doing already with /root/.bash_profile?)
Another way of asking this is: are you certain that whatever necessary paths and env vars you expect are actually available?
If nothing else, have you tried echo'ing individual values or just using the "env" command in your script and then reviewing the stdout?
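One simple way to do that comparison (the file names are just examples):
# In the script run by cron:
env | sort > /tmp/cron_env.txt

# From your login shell, for comparison:
env | sort > /tmp/login_env.txt
diff /tmp/cron_env.txt /tmp/login_env.txt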
Provide full paths to your jar file. Also, which user's crontab are you running this in? If you set it up for a normal user, do you think that user has permission to source root's profile?

Cron does not run from /root

If I run a script from /home/<user>/<dir>/script.sh, as root, the cron works pretty well. But If I run the script from /root/<dir>/script.sh (as root, again), the cron does not seem to work.
Having run afoul of various default PATHs in the past when using 'cron', I always spell out the full absolute path for each executable file and each target file. I always assume that 'cron' has NO PATH set and NO current working directory.
In other words, don't use a command like
"myprocess abc*.txt"
but spell it out in full, like
"/usr/local/bin/myprocess /home/jvs/abc*.txt".
Alternatively, create a bash script which does the job, and call that bash script with a full absolute path, such as
"/usr/local/bin/myprocess_abc_txts".
If you need to have some flexibility in the script, use environment variables which are set specifically within the bash script you call with 'cron'.
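A small sketch of such a wrapper (the names and paths are made up for illustration):
#!/bin/bash
# /usr/local/bin/myprocess_abc_txts -- called from cron by its absolute path.
# Set everything the job needs explicitly; assume cron provides no PATH and no cwd.
PATH=/usr/local/bin:/usr/bin:/bin
export PATH
WORKDIR=/home/jvs
/usr/local/bin/myprocess "$WORKDIR"/abc*.txt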
I think you need to add a little more information. I'd guess it is a permissions thing though. Add the permissions of the file, the directories, and the line in your crontab so we can help. Also, if you are putting this in /root, are you running this in root's crontab?
Remember the environment - especially when run by cron rather than by root. When cron runs something, you probably don't have anything much set of your environment, unlike when you run a command via at. It is also not clear what your current directory will be. So, for commands that will be run by cron, use a script (as you're already doing) and make sure it sets enough of the environment for it to run. And make sure your environment setting code is not interactive!
On my machines, I have a mechanism such that the cron entry reads (for example):
23 1 * * 1-5 /usr/bin/ksh /work1/jleffler/bin/Cron/weekday
The weekday script in the Cron directory is a link to a standard script that first sets the environment and then runs the command /work1/jleffler/bin/weekday (in this case - it uses the name of the command to determine what to run).
The actual script in the Cron directory is:
: "$Id: runcron.sh,v 2.1 2001/02/27 00:53:22 jleffler Exp $"
#
# Commands to be performed by Cron (no debugging options)
# Set environment -- not done by cron (usually switches HOME)
. $HOME/.cronfile
base=`basename $0`
cmd=${REAL_HOME:-/real/home}/bin/$base
if [ ! -x $cmd ]
then cmd=${HOME}/bin/$base
fi
exec $cmd ${@:+"$@"}
I've been using it a while now - this version since 2001 - and it works a treat for me. I'm using a basic (Sun Solaris 10) implementation of cron; there may be new features in new versions of cron on other platforms to make some of this unnecessary. (The $REAL_HOME stuff is a weirdness of mine; pretend it says $HOME - though that makes some of the script unnecessary for you.) The .cronfile is responsible for the environment setting - it does quite a lot, but that's my problem, not yours.
It could be because you're looking for relative directories/files in the script, which are found when running it from /home/<user>/ but not from /root, because /root is not /home/root, nor does it look like a user's home folder under /home/.
Can you check and see if it is looking for relative files, or post the script?
On another note, why don't you just set it to run from a user's home folder then?
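If the script does depend on files relative to its own location, a common workaround (a sketch, not specific to this script) is to change into the script's directory first:
#!/bin/bash
# Resolve relative paths against the script's own directory,
# regardless of the working directory cron starts us in.
cd "$(dirname "$0")" || exit 1
ls -l data/input.txt        # hypothetical relative reference that now resolves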
Another way to run the script is to place your bash script in the /usr/bin directory; then you can simply run bash yourscript.sh without spelling out the /usr/bin/ directory.

Resources