Crontab running issue - linux

I have 500 PHP files and I want to make a crontab to run them automatically (they store XML data in my database). My problem is this:
When I make a crontab entry for each PHP file, it works fine. The command looks like this:
* * * * * php /home/username/public_html/codes/php0.php
But when I want to run a shell script that calls all of my PHP files, like this:
* * * * * bash /home/username/public_html/codes/php.sh
it does not run.
php.sh:
#!/bin/sh
php php0.php
echo php0
php php1.php
echo php1
php php2.php
echo php2
.
.
.
Is it possible to wrap PHP files in a shell script like this? And if so, why does it not work? Am I missing something?

You are probably not running in the directory you believe you are.
So temporarily replace #!/bin/sh with #!/bin/sh -vx (so the shell prints each command as it runs it)
and add a
pwd
at the beginning of your script (that is, the 2nd line)
Then perhaps add a
cd /home/username/public_html/codes
or maybe define a variable and use it:
mycodedir=/home/username/public_html/codes
php $mycodedir/php0.php
echo php0
etc...
I suggest reading the Advanced Bash-Scripting Guide (even though it has its weaknesses).
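Putting those suggestions together, a debug version of php.sh might look like this (a sketch only; the -vx flags make the shell print each line as it runs, and the directory is taken from the crontab line above):
#!/bin/sh -vx
pwd                                        # shows which directory cron started us in
cd /home/username/public_html/codes || exit 1
php php0.php
echo php0
php php1.php
echo php1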

At the moment, you have relative paths in your php.sh. The following line
php php0.php
means that php will look for the php0.php file in the current working directory. A cron job does not start in the directory your script lives in; depending on the cron implementation, its working directory is typically your home directory or /. To check what the working directory really is, you can use the pwd command.
So your php.sh script looks for a file named php0.php in whatever directory cron starts it in, but the file is actually in /home/username/public_html/codes/.
If you use absolute paths in your php.sh, the current working directory does not matter. So simply use absolute paths in your php.sh. The file will then look like this:
#!/bin/sh
php /home/username/public_html/codes/php0.php
echo php0
php /home/username/public_html/codes/php1.php
echo php1
php /home/username/public_html/codes/php2.php
echo php2
.
.
.
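With 500 files, a loop avoids writing 500 nearly identical lines. A minimal sketch, assuming the files really are named php0.php through php499.php:
#!/bin/sh
codedir=/home/username/public_html/codes
i=0
while [ "$i" -lt 500 ]; do
    php "$codedir/php$i.php"    # absolute path, so cron's working directory does not matter
    echo "php$i"
    i=$((i + 1))
done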

Related

Linux crontab doesn't appear to be working

I'm trying to use a cron job on Linux to back up my Minecraft server every 24 hours. The crontab looks like this:
00 00 * * * bash /home/pi/wgsanarchy/backup.sh
And backup.sh looks like this:
#!/bin/sh
var=$(date +"%FORMAT_STRING")
now=$(date +"%d_%m_%Y")
tar -zcvf $now-backup.tar.gz /home/pi/wgsanarchy
gupload $now-backup.tar.gz WGSAnarchy
rm /home/pi/$now-backup.tar.gz
(The gupload line pushes the file to my google drive)
I've tried to change the time so I can see if it works, but so far I don't think it does.
Can anyone see any errors?
Thanks!
I do not understand why people are downvoting instead of helping each other.
This is a common error in Linux shell scripts: cron runs with a minimal environment, so binaries such as tar and gupload are not found. To solve the issue, set the PATH variable at the beginning of the script. Just execute echo $PATH in your terminal and copy the result to build a PATH line like this:
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin;
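For example, the top of backup.sh could look like this (a sketch; paste in whatever your own echo $PATH prints, and note that the archive is given an absolute path so cron's working directory does not matter):
#!/bin/sh
# give cron the same search path as the interactive shell
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
export PATH

now=$(date +"%d_%m_%Y")
backup="/home/pi/$now-backup.tar.gz"       # absolute path for the archive
tar -zcvf "$backup" /home/pi/wgsanarchy
gupload "$backup" WGSAnarchy
rm "$backup"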

cronjob does not execute a script that works fine standalone

I have my PHP script file at /var/www/html/dbsync/index.php. When I cd /var/www/html/dbsync/ and run php index.php, it works perfectly.
I want to call the PHP file through an sh file; the location of the sh file is:
/var/www/html/dbsync/dbsync.sh
This is the content of the dbsync.sh file:
/usr/bin/php /var/www/html/dbsync/index.php >> /var/www/html/dbsync/myscript.log 2>&1 -q -f
When I cd /var/www/html/dbsync/ and run ./dbsync.sh it works perfectly as well.
Now if I set up crontab as below:
1 * * * * /var/www/html/dbsync/dbsync.sh /var/www/html/dbsync
However, this crontab is not working as expected.
What can be wrong?
As seen in comments, the problem is that you are not defining what program should be used to execute the script. Take into account that a cronjob is executed in a tiny environment; there, not much can be assumed. This is why we define full paths, etc.
So you need to say something like:
1 * * * * /bin/sh /var/www/html/dbsync/dbsync.sh /var/www/html/dbsync
# ^^^^^^^
/bin/sh being the binary you want to use to execute the script.
Otherwise, you can set execute permission on the script and add a shebang line telling the system which interpreter to use:
#!/bin/sh
If you do this, specifying the interpreter's path in the crontab entry is not necessary.
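For example (the chmod only needs to be run once, then the crontab entry can call the script directly):
chmod +x /var/www/html/dbsync/dbsync.sh
and the crontab entry becomes:
1 * * * * /var/www/html/dbsync/dbsync.sh /var/www/html/dbsync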
From Troubleshooting common issues with cron jobs:
Using relative paths. If your cron job is executing a script of some
kind, you must be sure to use only absolute paths inside that script.
For example, if your script is located at /path/to/script.php and
you're trying to open a file called file.php in the same directory,
you cannot use a relative path such as fopen(file.php). The file must
be called from its absolute path, like this: fopen(/path/to/file.php).
This is because cron jobs do not necessarily run from the directory in
which the script is located, so all paths must be called specifically.
Also, I understand you want to run this every minute. If so, 1 * * * * won't do. Instead, it will run at the 1st minute past every hour. So if you want to run it every minute, say * * * * *.
It is important to understand what "login shell" and "interactive shell" mean.
Login shell: briefly, the shell you get when you sign in (for example over an SSH session) and receive a terminal window where you can enter shell commands. At login the system executes the profile files assigned to your account (.bash_profile, .bash_login, .profile) and sets environment variables such as the PATH variable for you.
Interactive shell: a shell you start manually after logging in, for example by opening another terminal. For these sessions bash reads .bashrc, which can also set environment variables and extend the PATH variable.
Shell scripts started by the OS and cron jobs fit neither of these cases, so none of the profile files or .bashrc are executed. This means the PATH variable is never initialized, and shell commands cannot be found because PATH does not point to the right places.
This explains why your script runs successfully if you start it manually but fails when you start it via crontab.
Solution-1:
Use the absolute path of every shell command in your script file(s) instead of only the command name (you can look the paths up as shown below):
instead of "awk" use "/usr/bin/awk"
instead of "sed" use "/bin/sed"
Solution-2: Initialize environment variables and especially the PATH variable before executing shell scripts!
Method 1: add this shebang line at the top of your dbsync.sh:
#!/bin/bash -l
Method 2: add bash -l in your crontab entry:
1 * * * * bash -l /var/www/html/dbsync/dbsync.sh /var/www/html/dbsync
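Method 3 (a sketch of the same idea): set PATH explicitly inside dbsync.sh so it does not depend on any profile files at all; use whatever echo $PATH prints in your interactive shell:
#!/bin/sh
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
export PATH
/usr/bin/php /var/www/html/dbsync/index.php >> /var/www/html/dbsync/myscript.log 2>&1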

crontab outputs nothing when calling a .sh that produces output

I have the following test.sh file in the /home/me folder
#!/bin/sh
_now=$(date +"%Y_%m_%d")
_file="/home/me/$_now.txt"
speedtest-cli --simple > $_file
Where speedtest-cli is a Python script that reports internet upload and download speeds: https://github.com/sivel/speedtest-cli.
Calling test.sh from /home/me works very well: I get my yyyy_mm_dd.txt output with all the info (download speed, upload speed, etc.).
But when I try to call test.sh from a crontab, I get an empty yyyy_mm_dd.txt file (nothing inside).
Inside crontab-e
20 20 * * * /home/me/test.sh
Did I do something wrong?
I suspect a PATH problem, so pick one of:
add PATH=/usr/local/bin:/bin:/usr/bin at the top of your script
add PATH=/usr/local/bin:/bin:/usr/bin on its own line at the top of crontab -e
source ~/.bashrc at the top of your script
add the full path to each command in your script
Your PATH in an interactive shell is probably different from the one your cron job runs with, so you should specify the full path to speedtest-cli in your script.
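For example, a version of test.sh with the command spelled out in full might look like this (a sketch; /usr/local/bin/speedtest-cli is only a guess at the install location, check it with command -v speedtest-cli first):
#!/bin/sh
_now=$(date +"%Y_%m_%d")
_file="/home/me/$_now.txt"
/usr/local/bin/speedtest-cli --simple > "$_file"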

How to make my BASH script know the directory it is saved in

I'm going to make a script that, when you run it, creates an HTML page with the user's computer details... But I only have one problem...
If the user puts the script in a folder and sets it up as a cron job, then when crontab executes it, the script creates its folder in the home directory, and that is bad. I want the script to create the folder with the HTML docs in the same directory the script is in... What can I do??
thnx ;-)
I would usually do the following:
#!/bin/bash
# sample script to change directory and save a file
WORKDIR=/home/file/n34_panda
cd $WORKDIR
echo "this "> newfile.txt
This is a simple answer without reviewing the rest of your code. To be honest, the simplest method is simply to add the command:
cd /path/to/html/docs
Then you can add the rest of your code.
Try this script from several directories. It prints the directory portion of the path it was invoked with:
#!/bin/bash
echo ${0%/*}
You can simply set up the cron entry as below:
# m h dom mon dow command
* * * * * cd /path-where-you-want-to-save/ && /path/of/yourscript.sh
As suggested in the comments, there are ways for a script to know which directory it is stored in. However, I would recommend a simpler solution. Use absolute file paths in the script. The destination directory could be an argument to the script. This makes the script more flexible; it means that you don't have to keep many copies in separate directories if you want to do the same task in more than one place.
#!/bin/bash
output_dir="$1" # first argument to the script
echo "blah blah" > "$output_dir"/filename
This uses bash parameter expansion (https://www.gnu.org/software/bash/manual/html_node/Shell-Parameter-Expansion.html) to extract the path from the special variable $0 (which stores the name used to invoke the script) when the script is executed from another folder, and falls back to the current directory otherwise ($0 will not contain a path if the script is executed from the current directory by name alone).
#!/bin/bash
if [[ "$0" == */* ]]; then
script_path=${0%/*}
else
script_path=$(pwd)
fi
I am testing whether the special variable $0 contains a /. If so, I use parameter expansion to truncate everything following the last /. If it does not contain a /, then we must be in the current directory, so I store the result of pwd.
From my testing this allowed me to get the same output if executed as /pathTo/script.sh or pushd /pathTo; ./script.sh; popd
I tested this in Cygwin with GNU bash, version 4.3.42
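Another widely used idiom, which also resolves relative invocations such as ./script.sh to an absolute path, is to cd into the directory part of $0 and let pwd print the result (a sketch):
#!/bin/bash
# cd into the script's directory in a subshell and print its absolute path
script_path=$(cd "$(dirname "$0")" && pwd)
echo "$script_path"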

Script runs from terminal, but not cron. What edits to this script do I need to make?

I have a script that zips a database and site files, then dumps the output into a backup folder on the server. The script runs fine from the command line, but it will not work through cron.
After much research, I am thinking that cron cannot run it in its current form because it runs in a different environment.
Here is the script, saved as file_name.sh
#!/bin/bash
NOW=$(date +"%Y-%m-%d-%H%M")
FILE="website.com.$NOW.tar"
BACKUP_DIR="/backupfolder"
WWW_DIR="/var/www/website/"
DB_USER="dbuser"
DB_PASS="dbpw"
DB_NAME="dbname"
DB_FILE="website.com.$NOW.sql"
WWW_TRANSFORM='s,^var/www/website,www,'
DB_TRANSFORM='s,^backupfolder,database,'
tar -cvf $BACKUP_DIR/$FILE --transform $WWW_TRANSFORM $WWW_DIR
mysqldump -u$DB_USER -p$DB_PASS $DB_NAME > $BACKUP_DIR/$DB_FILE
tar --append --file=$BACKUP_DIR/$FILE --transform $DB_TRANSFORM $BACKUP_DIR/$DB_FILE
rm $BACKUP_DIR/$DB_FILE
gzip -9 $BACKUP_DIR/$FILE
I currently have the script stored in /usr/local/scripts/
Is there something wrong with the above code that does not allow it to run through cron?
Which crontab should it go in? crontab -e from terminal, or /etc/crontab? They are two different files.
Several things come to mind: first, one of the most common problems with cron jobs is that generally crond runs things with a very minimal PATH (usually just /usr/bin:/bin), so if the script uses any commands from some other binaries directory, it'll fail. Where is mysqldump on your system (run which mysqldump if you aren't sure)? If this is the problem, adding PATH=/usr/local/bin:/usr/bin:/bin (or whatever's appropriate in your case) at the beginning of your script should fix it. Alternately, you can set PATH in the crontab file (put this line before the entry that runs your script).
If that's not the problem, my next step would be to capture the script's output, with something like:
1 1 * * * /usr/local/scripts/file_name.sh >/tmp/file_name.log 2>&1
... and see if the output is informative. BTW, as @tripleee mentioned, the format of your cron entry is suitable for the files crontab -e edits, but not for /etc/crontab. The /etc version has an additional field specifying which user to run the job as, e.g.
1 1 * * * eric /usr/local/scripts/file_name.sh >/tmp/file_name.log 2>&1
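Putting those two suggestions together, the top of file_name.sh might look like this (a sketch; extend the PATH value to include whatever directory which mysqldump reports on your system, and keep the rest of the script unchanged):
#!/bin/bash
# make cron's minimal environment match the interactive shell's search path
PATH=/usr/local/bin:/usr/bin:/bin
export PATH

NOW=$(date +"%Y-%m-%d-%H%M")
# ... rest of the original script unchanged ...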
Best practice is to always use crontab -e (the resulting files usually live in /var/spool/cron/); this works on every Unix and Linux platform I have ever worked on.
Other common issues with cron execution are missing environment variables. Any environment variables set in .bash_profile (or .profile if you use korn shell) will not necessarily be present in the cron environment. This can be overcome by including them in your script.
As Gordon said, paths are another suspect. You can always use the full path to your executables in your script (e.g. /bin/mysqldump). Some of the more cynical of us do this anyway, to make sure we are executing what we intended as opposed to some other file of the same name in the current path.
I can only guess at your specific problem, since you fixed it by creating /scripts: perhaps the permissions on the /usr/local/scripts directory did not allow execution by the cron user?
I have had to remove the extension (.sh) for cron to run in some instances.
So I fixed it. Not sure what the problem was, but this worked for me.
I originally had the scripts located in /usr/local/scripts/
I created a new directory here - /scripts/ and moved the scripts there. The new crontab -e command looked like this:
1 1 * * * bash /scripts/file_name.sh
Works perfectly. Again, I am not sure what the issue was before, but it works now.
