I wrote crontab-; by mistake.
I tried to look for such a command online, but no such command seems to exist.
Now crontab -l isn't listing any of my jobs. What do I do?
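For what it's worth, crontab - reads a replacement crontab from standard input, so the stray command most likely installed an empty crontab over the old one. A hedged recovery sketch: on many distros cron logs every job it runs, so the lost schedule can often be reconstructed from the system log (the log location varies by distro):
# Past cron runs are usually recorded in the system log
grep CRON /var/log/syslog      # Debian/Ubuntu
grep -i cron /var/log/cron     # Red Hat/CentOS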
I have a .sh script that works when executed manually but not through cron:
#!/bin/sh
sudo find . -name "*.log" -delete
There are other .sh scripts that run perfectly with cron, and I don't know why this one doesn't.
I compared the env output from the terminal and from cron, dumping each with the command below, and they are similar apart from the user, which has higher privileges under cron:
env > output.txt
The only difference between this script and the others that work is the location of the file.
Has anyone had a similar problem, or does anyone know how to get more detailed logs?
Thank you in advance.
As mentioned by @Shawn and @user1834428 in the comments, the "." in my script was not pointing where I wanted: cron does not start the script in the directory I tested from.
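For completeness, a minimal sketch of the usual fix, with a hypothetical log directory standing in for the real one:
#!/bin/sh
# Use an absolute path so the script no longer depends on cron's working directory
# (/var/myapp/logs is a placeholder; substitute the real directory)
cd /var/myapp/logs || exit 1
sudo find . -name "*.log" -delete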
On Linux I'm using tee to capture the output of the source command and write it to a log file, but it fails. The command I'm using is:
source ./my_run.sh 2>&1 | tee -i my_run_log
The intention of my_run.sh is to run some make compile jobs, as well as some routine jobs like cd, rm and svn update. The content of my_run.sh is as follows:
make clean
cd ..
rm ./xxx
svn up -r 166
cd ./aaa/
sed -i -e ......
make compile
make run
However, when I run it, tee just does NOT work and does NOT give me the log file at all. To verify that the environment itself is fine, I did a simpler test:
ll 2>&1 | tee -i log
and in this simpler scenario tee works perfectly fine and produces the log file as expected.
Can anyone help me find out where my problem is?
By the way, I'm working on Red Hat Linux (release 5.9), using the bash shell.
Thanks in advance!
SOME MORE COMMENTS:
I did some more tests and found that as long as my_run.sh contains "make xxx" steps, tee fails. It seems tee does not like make. Any solutions?
Problem solved; many thanks go to @thatotherguy for leading me to the solution. The log output was actually being deleted by the make clean step. After fixing the clean target in the makefile, everything is good.
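For anyone who hits the same symptom: writing the log somewhere the build's clean target cannot touch sidesteps the problem entirely. A minimal sketch:
# Keep the log outside the build tree so no clean target can delete it
source ./my_run.sh 2>&1 | tee -i /tmp/my_run_log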
I have a bash script that runs a few commands, including rsync and this one below:
rm -f $(ls -1t /nas/backups | tail -n +161)
If I execute the script myself on the CLI, all commands work. However, if run by cron, all commands work except the one above.
No idea why. The files in /nas/backups are owned by root, but cron is running as root too.
Any ideas? Thanks.
Okay, so I'm a dummy.
My ls command returns a list of file names, not file paths, and cron wasn't running in the correct working directory.
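For reference, a sketch of making the deletion independent of cron's working directory (the path and the +161 cutoff are copied from the question; this assumes file names without spaces, as in the original):
# Run both the listing and the removal inside the directory itself
cd /nas/backups && ls -1t | tail -n +161 | xargs -r rm -f --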
I am not able to execute hadoop/hive commands from crontab. Basically I have scheduled a Perl script in crontab which contains system commands that set PATH before my operations.
I am aware that the environment under cron can differ from your regular shell. That's the reason I am setting paths as below. Is there any other way to make it work?
system(". /home/ciber/.bash_profile");
system("export JAVA_HOME=/usr/lib/jvm/java-6-openjdk-amd64");
system("export HADOOP_INSTALL=~/poc/install/hadoop-1.0.3");
system("export PATH=$PATH:$HADOOP_INSTALL/bin");
system("export HADOOP_HOME=$HADOOP_INSTALL");
system("export HIVE_INSTALL=~/poc/install/hive-0.9.0");
system("export PATH=$PATH:$HIVE_INSTALL/bin");
@Jingguo Yao: Do you have any idea about this?
Note that each system() call spawns its own short-lived shell, so the variables you export there vanish as soon as the call returns; they never reach the hadoop/hive commands. Instead, use absolute paths for commands in the crontab, or declare environment variables directly in the crontab, for example:
foo=bar
If it executes properly from the terminal but not from crontab, loading the user's bash profile inside the script should do the trick, as below:
. ~/.bash_profile
or
. /home/<user>/.bash_profile
Start the script with #!/bin/bash; the sourced bash profile will then supply the user-specific configuration as well.
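A sketch of the crontab-variable approach (install paths copied from the question; the script path and schedule are placeholders). Note that most cron implementations do not expand $VAR references in these assignment lines, so PATH must be written out in full, and ~ is not expanded either:
# Variables declared at the top of a crontab apply to every job below them
JAVA_HOME=/usr/lib/jvm/java-6-openjdk-amd64
HADOOP_INSTALL=/home/ciber/poc/install/hadoop-1.0.3
HADOOP_HOME=/home/ciber/poc/install/hadoop-1.0.3
HIVE_INSTALL=/home/ciber/poc/install/hive-0.9.0
PATH=/usr/local/bin:/usr/bin:/bin:/home/ciber/poc/install/hadoop-1.0.3/bin:/home/ciber/poc/install/hive-0.9.0/bin

# Placeholder script path; point this at the real Perl script
0 2 * * * /home/ciber/scripts/hive_job.pl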
I have a cron script that I run nightly from the system-wide /etc/crontab like this:
00 01 * * * www-data /root/scripts/image-auto-process.sh 1> /var/data/logs/cron.out 2> /var/data/logs/cron.err
It's a bash script that backs up (with rsync) some directories full of scanned JPEGs, runs a PHP program to batch-process those JPEGs (previews/thumbnails), uploads them to a server and, upon success, cleans out the first-mentioned directories.
Everything but the last clean-out step works like a charm. If I run it from the command line as the www-data user, it works. When it runs via cron as the same user, it doesn't.
The last, clean-out step looks like the following:
echo "remove original scans"
for i in `find $SCAN_ORIG_DIR -mindepth 2 -type d -print | grep -v -f $EXCLUDES`; do rm -rvf $i; done
echo "job Done..."
$SCAN_ORIG_DIR is the directory to search. $EXCLUDES is a file (readable by www-data) containing lines with directories to ignore (the same file tells rsync what not to back up). -mindepth 2 is used in find because I only want subdirectories of $SCAN_ORIG_DIR that themselves have subdirectories, like $SCAN_ORIG_DIR/subdir/subsubdir.
I tried putting the above code into its own script and running it from the command line and via cron, just to make sure no earlier code in the script was causing the problem.
Results on the command line (not an exact representation, just an illustration):
remove original scans
removed '$SCAN_ORIG_DIR/subdir1/subsubdir'
removed '$SCAN_ORIG_DIR/subdir2/subsubdir'
removed '$SCAN_ORIG_DIR/subdir3/subsubdir'
job Done...
Results via cron:
remove original scans
job Done...
So, I'm stumped. I sincerely hope someone can shine a light on what's wrong here.
Thank you very much for your time and effort :-)
A common problem with scripts running under cron is that the user's login scripts (.bashrc, .bash_profile) are not executed, so some variables are missing. Here that likely means $SCAN_ORIG_DIR or $EXCLUDES is empty when cron runs the script, so the find/grep pipeline produces nothing to delete.
BTW, it is not good practice to use the system-wide /etc/crontab. Use crontab -e to add cron jobs instead.
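A minimal sketch of making the script self-sufficient, so it no longer depends on anything a login shell would have set (the values are placeholders):
#!/bin/bash
# cron provides almost no environment, so set everything explicitly
PATH=/usr/local/bin:/usr/bin:/bin
SCAN_ORIG_DIR=/var/data/scans            # placeholder; the real scan directory
EXCLUDES=/var/data/scans/excludes.txt    # placeholder; must be readable by www-data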