I have a .sh script that works when executed manually but does not work through cron:
#!/bin/sh
sudo find . -name "*.log" -delete
There are other .sh scripts which run perfectly with cron.
I don't know why this one doesn't work.
I compared the env output from the terminal and from cron, and both are similar apart from the user, which has higher privileges under cron. I captured it using:
env > output.txt
The only difference between this script and the others that work is the location of the file.
If anyone has had a similar problem or knows how to get more precise logs, please share.
Thank you in advance
As mentioned by @Shawn and @user1834428 in the comments, the "." in my script was not pointing where I wanted.
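A sketch of the fix: cron starts jobs in the job owner's home directory, so "." rarely points at the intended location. Making the starting point explicit avoids the problem (the temporary directory below is just a stand-in for the real log directory):

```shell
#!/bin/sh
# Demonstrate in a throwaway directory why an absolute starting point
# for find is safer than "." under cron (placeholder paths).
LOGDIR=$(mktemp -d)
touch "$LOGDIR/a.log" "$LOGDIR/keep.txt"

# Give find an absolute path instead of relying on the working directory:
find "$LOGDIR" -name "*.log" -delete

ls "$LOGDIR"
```

Alternatively, `cd` to the intended directory at the top of the script (and exit if the `cd` fails) before using relative paths.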
I wrote crontab -; by mistake.
I tried to look for such a command online, but there doesn't seem to be any such command.
Now crontab -l isn't listing any of my jobs. What do I do?
I am facing a weird issue with the cron utility.
I have a script which works fine if I run it from the shell.
However, if I run the same script using cron, it results in an error.
The error says it is not able to find a particular command.
I source my user shell file in the script that I am running.
Any possible reasons for this issue?
Probably a PATH issue. Run echo $PATH in the terminal, then take the result and put it at the top of the crontab, like PATH=/RESULT/STUFF
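To see why this happens, a sketch: cron typically runs jobs with a minimal PATH (often just /usr/bin:/bin), so a command your interactive shell resolves via a longer PATH may not be found under cron. You can approximate cron's sparse environment with env -i:

```shell
#!/bin/sh
# Approximate cron's sparse environment with env -i and a minimal PATH.
# If a command can't be resolved here, it likely won't resolve under
# cron either until PATH is declared in the crontab.
env -i PATH=/usr/bin:/bin sh -c 'echo "PATH under cron-like env: $PATH"'
env -i PATH=/usr/bin:/bin sh -c 'command -v ls'
```

If the second line prints nothing for your failing command, add a PATH line (or use the command's absolute path) in the crontab.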
I have a bash script that runs a few commands including rsync and this one below
rm -f $(ls -1t /nas/backups | tail -n +161)
If I execute the script myself on the command line, all commands work. However, if it is run by cron, all commands work except the one above.
No idea why. The files in /nas/backups are owned by root, but cron is running as root.
Any ideas? Thanks
Okay, so I'm a dummy.
My ls command returns a list of file names, not file paths! And cron wasn't running in the correct working directory.
I have a very simple shell script
#!/bin/bash
cp -rf /var/www/ksite/app2/* /var/www/ksite/app
echo "----"
echo "done"
but it seems the cp command fails.
If I execute
cp -rf /var/www/ksite/app2/* /var/www/ksite/app
from the terminal, everything works OK. Can someone tell me how to include cp in a shell script?
Thanks
We seem to have some doubt as to how this script fails. If there is no error message, then this is a strange one. I suggest:
On the command line (which works), do a which cp
Whatever the reply, copy that and use it as the cp in the script (e.g. /bin/cp)
Check the wildcard expansion: run your script with bash -x script-name and see if you get what you expect.
echo $? after the copy in the script - if it is zero then it (thinks it) worked.
Do an ls -ld /var/www/ksite/app from your script; maybe someone set a symbolic link?
If it still fails, source the script from the command line and see if that works: . script-name
Double-check that the copy did actually fail! (Maybe that should be step 1.)
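The steps above can be sketched as a small debug harness (the paths are the ones from the question; adjust as needed):

```shell
#!/bin/bash
# Debug harness for a failing cp in a script.
which cp                         # locate the cp binary
set -x                           # trace expansion, like bash -x
cp -rf /var/www/ksite/app2/* /var/www/ksite/app
status=$?                        # capture the exit code immediately
set +x
echo "cp exit status: $status"   # zero means cp thinks it worked
ls -ld /var/www/ksite/app        # check for a symlink on the target
```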
Make sure you really have bash at /bin/bash. I think a better hashbang is:
#!/usr/bin/env bash
This uses the env command to locate the bash binary via your PATH.
I had a similar problem. What helped me:
I used Windows and PuTTY to write the script, so I had \r\n at the end of lines. Make sure you have only the \n symbol.
I copied files, and the only way it worked for me in the script was cp <source_dir>/fileName <dest_dir>/fileName, whereas on the command line cp <source_dir>/fileName <dest_dir> worked well too.
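To check for and strip the Windows \r\n endings, a sketch (my_script.sh is a placeholder name; sed -i here is the GNU form, and dos2unix, where installed, does the same job):

```shell
#!/bin/sh
# Detect carriage returns left by a Windows editor...
if grep -q "$(printf '\r')" my_script.sh; then
    echo "CRLF line endings found"
fi
# ...and strip them in place (GNU sed).
sed -i 's/\r$//' my_script.sh
```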
Just covering all the bases: do the permissions vary between the executions? I.e., do you execute one with sudo/root privileges and the other as a user? (Unlikely, but I thought I'd ask since we don't know what the exact error is.)
Similar issue to Vladmir's, where the script was created in Windows. I created a new file, my_bash_script.sh, in the Linux environment using Vim, then read the contents of my script into the file:
:r file_made_in_windows.sh
Then I saved, closed, then set the file as executable:
chmod 744 my_bash_script.sh
From there, I ran the script:
./my_bash_script.sh
...and it worked. What a weird issue. I was confounded for a moment.
I have a cron script that I run nightly from the system-wide /etc/crontab like this:
00 01 * * * www-data /root/scripts/image-auto-process.sh 1> /var/data/logs/cron.out 2> /var/data/logs/cron.err
It's a bash script that backs up (with rsync) some directories full of scanned jpegs, runs a php-program to batch process those jpegs (preview/thumbnails), uploads them to a server and upon success cleans out the first mentioned directories.
Everything but the last clean-out step works like a charm. If I run it from the command line as the www-data user, it works. When it runs via cron as the same user, it doesn't.
The last, clean-out step looks like the following:
echo "remove original scans"
for i in `find $SCAN_ORIG_DIR -mindepth 2 -type d -print | grep -v -f $EXCLUDES`; do rm -rvf $i; done
echo "job Done..."
$SCAN_ORIG_DIR is the directory to search. $EXCLUDES is a file (readable by www-data) containing lines with directories to ignore (the same file is used to tell rsync what not to back up). -mindepth 2 is used in find because I only want subdirs of $SCAN_ORIG_DIR that themselves have subdirs, like $SCAN_ORIG_DIR/subdir/subsubdir.
I tried putting the above code into its own script and running it on commandline and via cron just to make sure no prior code in the script was causing the problem.
Results on the command line (not an exact representation, just to illustrate):
remove original scans
removed '$SCAN_ORIG_DIR/subdir1/subsubdir'
removed '$SCAN_ORIG_DIR/subdir2/subsubdir'
removed '$SCAN_ORIG_DIR/subdir3/subsubdir'
job Done...
Results via cron:
remove original scans
job Done...
So, I'm stumped. I sincerely hope anyone can help shine a light on what's wrong here.
Thank you very much for your time and efforts :-)
A common problem with scripts run from cron is that the user's login scripts (.bashrc, .bash_profile) are not executed, so some variables are missing.
BTW, it is not good practice to use the system-wide /etc/crontab. Use crontab -e to add cron jobs instead.