Error in crontab while trying to save in CLI mode - Linux

I am trying to add a command to crontab so that any folder named "XYZABC" created on my server gets deleted, and it should run every hour.
* 1 * * * *  /usr/bin/find /home/ -type d -name "XYZABC" -exec rm -rf {} +
When I try to save the above entry, it shows me an error:
crontab: installing new crontab
"/tmp/crontab.cc9iSz":23: bad command
errors in crontab file, can't install.
Do you want to retry the same edit?
What am I doing wrong?
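For reference, a crontab time specification has exactly five fields (minute, hour, day of month, month, day of week), while the entry above has six. A minimal sketch of a five-field entry that runs at the top of every hour, assuming that is the intent:
0 * * * * /usr/bin/find /home/ -type d -name "XYZABC" -exec rm -rf {} +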

Related

How to run Linux commands using cron

I have a couple of Linux commands I want to run once every day in the morning using cron. Not sure how to do this though.
I understand that this needs to be done using the shell, but I don't know how to do all this in Linux. I will be able to create the cron job using cPanel though...
These are the commands
rm -rf <directory>
mkdir <directory>
chmod 777 <directory>
You can create a shell script with these commands in a file, script.sh for example:
#!/usr/bin/bash
rm -rf <directory>
mkdir <directory>
chmod 777 <directory>
<other commands or logical instructions>...
In Linux you can add a cron job with the crontab -e command, or by creating a file in the /etc/cron.d directory. The difference is that with crontab -e the cron job runs as the user who executed crontab -e, while in a cron job file placed in /etc/cron.d you have to put the user before the command.
Examples of cron jobs to be executed at 06:00 am:
With crontab -e:
0 6 * * * /usr/bin/bash /path_to_script/script.sh
Creating a file in /etc/cron.d:
0 6 * * * root /usr/bin/bash /path_to_script/script.sh
Alternatively you can just put the commands in your cron job as:
0 6 * * * rm -rf <directory> && mkdir <directory> && chmod 777 <directory>
Attention: remember to use absolute paths for the directories you want to remove or create.
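For example, with absolute paths (the /srv/app/data path here is only a placeholder):
0 6 * * * rm -rf /srv/app/data && mkdir /srv/app/data && chmod 777 /srv/app/data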
PS: you can write your scripts in any language and make shell calls from them, e.g. PHP's shell_exec() function or Perl's system() function.

script not running via crontab

I have created a shell script which deletes the subfolders of the var/cache folder. Please check the script below.
#!/bin/sh
now=$(date +"%Y-%m-%d %T")
if rm -rf var/cache/* ; then
    echo "$now: Deleted"
else
    echo "$now: problem"
fi
When I run this shell file directly with the command sh hello.sh, it works fine.
But when I run this file using crontab, it creates an entry in the log file but doesn't delete the subfolders of var/cache/.
Please check my crontab as well.
*/1 * * * * /bin/sh /www/html/wp/hello.sh >> /www/html/var/log/redis.flush.cron.log 2>&1
Please suggest how I can run that file using crontab.
Try using an absolute path instead of var/cache. When you run it via cron, it will run a) as a specific user, and b) from the home directory of that user. One or both of these might be causing issues for you.
Instead of this:
if rm -rf var/cache/* ; then
Try something like this:
if rm -rf /full/path/to/var/cache/* ; then
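If the relative path is meant to be resolved against the script's own location, another option (a sketch, not taken from the original answer) is to cd to that directory at the top of the script:
#!/bin/sh
# change to the directory this script lives in, so relative paths behave the same
# whether the script is started manually or from cron
# (assumes var/cache sits next to the script)
cd "$(dirname "$0")" || exit 1
rm -rf var/cache/*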

crontab bash script not running

I updated the script with the absolute paths. Also, here is my current cron job entry.
I went and fixed the SSH key issue, so I know it works now, but I might still need to tell rsync which key to use.
The script runs fine when called manually by the user. It looks like not even the rm commands are being executed by the cron job.
UPDATE
I updated my script, but basically it's the same as the one below. Below I have a new cron time and added an error output.
I get nothing. It looks like the script doesn't even run.
crontab -e
35 0 * * * /bin/bash /x/y/z/s/script.sh 2>1 > /tmp/tc.log
#!/bin/bash
# Clean up
/bin/rm -rf /z/y/z/a/b/current/*
cd /z/y/z/a/to/
/bin/rm -rf ?s??/D????
cd /z/y/z/s/
# Find the latest file
FILE=`/usr/bin/ssh user@server /bin/ls -ht /x/y/z/t/a/ | /usr/bin/head -n 1`
# Copy over the latest archive and place it in the proper directory
/usr/bin/rsync -avz -e /usr/bin/ssh user@server:"/x/y/z/t/a/$FILE" /x/y/z/t/a/
# Unzip the zip file and place it in the proper directory
/usr/bin/unzip -o /x/y/z/t/a/$FILE -d /x/y/z/t/a/current/
# Run Dev's script
cd /x/y/z/t/
./old.py a/current/ t/ 5
Thanks for the help.
I figured it out; I'm used to working in CST and the server was in GMT time.
Thanks everybody for the help.
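One side note for anyone debugging a similar case: in the crontab entry shown above, 2>1 redirects stderr to a file literally named 1, not to stdout, so error output never reaches /tmp/tc.log. The usual form for capturing both streams in one log would be:
35 0 * * * /bin/bash /x/y/z/s/script.sh > /tmp/tc.log 2>&1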

Linux script is unable to delete files via crontab, but it works manually

I have a simple script file to copy all files to a remote server and then delete them all. I can run this script manually as "user", but when I add it to the user's crontab, the first part, scp, works fine, while the rm part always fails.
I wonder what I am missing or have set up incorrectly; could somebody help me out with this?
Thanks in advance.
/home/user/bin/test.sh
#!/bin/bash
scp -v -r /var/spool/asterisk/monitor test@xx.xx.xx.xx:/home/test/audio && sudo rm -f /var/spool/asterisk/monitor/*
Access permissions of /var/spool/asterisk/monitor:
drwxr-xr-x. 1 root root 532 Sep 06 11:14 monitor
crontab - user:
* */1 * * * bash /home/user/bin/test.sh
Try this; it will work if sudo does not require a password (and that is possible):
scp -v -r /var/spool/asterisk/monitor test@xx.xx.xx.xx:/home/test/audio && ssh test@xx.xx.xx.xx "sudo rm -f /var/spool/asterisk/monitor/*"
Make sure requiretty is off in /etc/sudoers. It is normally on by default on Red Hat.
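A sketch of the relevant /etc/sudoers lines on the remote server, assuming the remote user is test and rm lives at /bin/rm (always edit with visudo):
Defaults:test    !requiretty
test ALL=(ALL) NOPASSWD: /bin/rm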

tar archiving via cron does not work

I am trying to archive my localhost's root folder with tar and want to automate its execution on a daily basis with crontab. For this purpose, I created a 'backupfolder' in my personal folder. I am running Ubuntu 12.04.
The execution of tar in the command line works fine without problems:
sudo tar -cvpzf backupfolder/localhost.tar.gz /var/www
However, when I schedule the command for a daily backup (let's say at 17:00) in sudo crontab -e, it does not execute, i.e. the backup is not updated, with the following command:
0 17 * * * sudo tar -cpzf backupfolder/localhost.tar.gz /var/www
I already tried the full path /home/user/backupfolder/localhost.tar.gz without success.
/var/log/syslog gives me the following output for the scheduled execution:
Feb 2 17:00:01 DESKTOP-PC CRON[12052]: (root) CMD (sudo tar -cpzfbackupfolder/localhost.tar.gz /var/www)
Feb 2 17:00:01 DESKTOP-PC CRON[12051]: (CRON) info (No MTA installed, discarding output)
/etc/crontab specifies the following path:
SHELL=/bin/sh
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
I assume that crontab is not executing this because it is a sudo command.
Is there a way to get this running? What is the recommended, safe way if I don't want to hardcode my root password?
Well, the command that works for you is
sudo tar -cvpzf backupfolder/localhost.tar.gz /var/www
This means you have to run the command with sudo access, and it will not work from within your own crontab.
I would suggest adding the cron job to the root user's crontab.
Basically, do
sudo crontab -e
And add an entry there
0 17 * * * cd /home/user/backupfolder && tar -cpzf localhost.tar.gz /var/www
If that doesn't work, add the full path of tar (like /bin/tar).
Also, while debugging, set the cron job to run every minute (* * * * *).
Basically the problem is the sudo command, so we will allow sudo to run tar for the "user" without prompting for the password.
Add the following line in /etc/sudoers file.
user ALL=(ALL) NOPASSWD:/bin/tar
where user is the user installing the crontab.
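With that sudoers line in place, the tar command can keep its sudo prefix in the user's own crontab, for example (the backupfolder path follows the question, and /bin/tar is spelled out so it matches the sudoers entry):
0 17 * * * cd /home/user/backupfolder && sudo /bin/tar -cpzf localhost.tar.gz /var/www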
I suspect a PATH problem; try setting some variables at the top of sudo crontab -e:
MAILTO=your_email@domain.tld # to get the output if there are errors
PATH=/usr/bin:/bin:/usr/local/bin:/usr/local/sbin:/sbin
You can write your command in a script like run.sh
#!/bin/sh -l
tar -cvpzf backupfolder/localhost.tar.gz /var/www
then use the crontab to run the script.
IMPORTANT NOTE: the script's first line has the "-l" option.
Try it.
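The matching crontab entry could then be something along these lines; the script location is only an assumption, and run.sh must be made executable (chmod +x) so that the #!/bin/sh -l line actually takes effect:
0 17 * * * /home/user/run.sh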
