Find command not working through crontab

I am using the command below to delete data older than 2 days from a folder. It works fine when I run it directly in the terminal, but it does not work from a crontab entry.
find /backup/DWHPROD/ -type f -mtime +1 -exec rm -r {} \;
Crontab entry:
30 16 * * * find /backup/DWHPROD/ -type f -mtime +1 -exec rm -r {} \;

Please run the command in your terminal:
which find
On my system, find is located at /usr/bin/find. Try using that full path in your crontab entry:
30 16 * * * /usr/bin/find /backup/DWHPROD/ -type f -mtime +1 -exec rm -r {} \;
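If you want to sanity-check the deletion logic itself outside cron, a throwaway rehearsal along these lines works (the scratch directory and file names are invented for the demo; `touch -d` is GNU-specific):

```shell
#!/bin/sh
# Scratch rehearsal of the crontab command: one old file, one fresh file.
dir=$(mktemp -d)
touch -d "3 days ago" "$dir/old.dat"   # well past the -mtime +1 cutoff
touch "$dir/new.dat"                   # modified just now

# Same pattern as the crontab entry, with the absolute path to find
/usr/bin/find "$dir" -type f -mtime +1 -exec rm -r {} \;

ls "$dir"   # only new.dat should remain
```

If this behaves as expected but the cron job still does nothing, the problem is most likely cron's environment rather than the find expression itself.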

Related

How to automatically delete 3-day-old backup files on a server? My backup files look like this:

web_20_10_2022
web_21_10_2022
web_22_10_2022
web_23_10_2022
web_24_10_2022
web_25_10_2022
I need a script, driven by a cron job, that automatically deletes backup files older than 3 days from the server.
Below are some commands I have tried, but they are not working, so please help me out.
find /home -type f -mtime +3 -exec rm -rf {} +
find /home* -mtime +3 -type f -delete
find /home -mtime +3 -type d -exec rmdir {} ;
Kindly run the command below to find the files older than 3 days:
find /path/to/directory/* -mtime +3 -print
This lists the files in that directory that are older than 3 days.
To delete the files older than 3 days:
find /path/to/directory/* -type f -mtime +3 | xargs rm -f
or
find /path/to/directory/* -type f -mtime +3 -exec rm -f {} \;
or
find /path/to/directory/* -type f -mtime +3 -delete
To automate the process:
Copy the below code and save the file in a directory where you are not performing any housekeeping activity.
#!/bin/bash
find /path/to/directory/* -mtime +3 -delete
Make the file executable and set up a cron job:
crontab -e
To run the script every hour, enter:
00 * * * * /path/to/executablefile
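Before installing the cron entry, the whole setup can be rehearsed against a scratch directory (all paths below are throwaway demo paths; `touch -d` is GNU-specific):

```shell
#!/bin/sh
# Rehearse the hourly cleanup script against a scratch directory.
demo=$(mktemp -d)
touch -d "5 days ago" "$demo/web_20_10_2022"   # older than 3 days
touch "$demo/web_25_10_2022"                   # fresh

# Same one-liner as the script above, targeting the demo directory
cat > "$demo/cleanup.sh" <<EOF
#!/bin/bash
find $demo/* -mtime +3 -delete
EOF
chmod +x "$demo/cleanup.sh"

"$demo/cleanup.sh"
ls "$demo"   # the 5-day-old file is gone; the fresh one remains
```

Once this behaves as expected, point the script at the real backup directory and schedule it with crontab -e as described.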

Output from script - exec rm

I have a script that deletes files older than 2 days. It usually works properly, but for some reason it is not working today.
I want the script to produce output explaining why the files were not deleted.
Is there such an option?
script:
#!/bin/bash
#script for cleaning logs from files older than two days
dir_name=/home/albert/scripts/files
file_log=/home/albert/scripts/info.log
{
find ${dir_name} -type f -name '*.log' -mtime +2 -exec rm -v {} \;
} >> ${file_log}
You probably want to add a redirection for standard error too, i.e. in the case you want to send it to the same file:
{
find ${dir_name} -type f -name '*.log' -mtime +2 -exec rm -v {} \;
} >> ${file_log} 2>&1
You could use find's -delete option. From the man page:
-delete - If the removal failed, an error message is issued.
find "${dir_name}" -type f -name '*.log' -mtime +2 -delete >> ${file_log} 2>&1
Example:
$ find /etc -name passwd -delete
find: cannot delete ‘/etc/pam.d/passwd’: Permission denied
find: cannot delete ‘/etc/passwd’: Permission denied
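If you also want the log to record which files were removed, GNU find accepts -print before -delete; a small sketch using scratch paths only:

```shell
#!/bin/sh
# Log removed files and any "cannot delete" errors to one log file.
dir=$(mktemp -d)
log="$dir/cleanup.out"
touch -d "3 days ago" "$dir/a.log"   # old enough to be removed
touch "$dir/b.log"                   # too fresh to match -mtime +2

# -print writes each matched path before -delete removes it;
# 2>&1 sends any deletion errors to the same log.
find "$dir" -type f -name '*.log' -mtime +2 -print -delete >> "$log" 2>&1

cat "$log"   # contains the path of a.log
```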

crontab is not deleting the files in linux

I am trying to delete all PDF files older than 30 days, at 11:30 PM. I added the line below to the crontab:
30 23 * * * find /var/www/html/site/reports/ -name "*.pdf" -type f -mtime +30 | xargs -I {} rm -f {} \;
But it doesn't delete the files.
Can you please check what the issue is?
The crontab details
-rw-r--r--. 1 root root 532 Sep 30 11:14 crontab
One of the files I need to delete:
-rw-r--r-- 1 apache apache 15215 Jul 25 11:24 sales_report.pdf
You are missing the user field and the PATH. This may help:
SHELL=/bin/bash
PATH=/sbin:/bin:/usr/sbin:/usr/bin
MAILTO=root
HOME=/
30 23 * * * root find /var/www/html/site/reports/ \( -name "*.pdf" \) -type f -mtime +30 -exec rm {} \; >> /tmp/debug_cron 2>&1
And then check /tmp/debug_cron
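A quick way to approximate cron's sparse environment without waiting for the job to fire is env -i with just that PATH (a rough approximation, not an exact reproduction of cron):

```shell
#!/bin/sh
# Run a command under an emptied environment with only cron's PATH,
# then check the binary still resolves there.
found=$(env -i PATH=/sbin:/bin:/usr/sbin:/usr/bin SHELL=/bin/bash \
        sh -c 'command -v find')
echo "find resolves to: $found"
```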
I have this working on a Linux box:
30 23 * * * find /var/www/html/site/reports/ \( -name "*.pdf" \) -type f -mtime +30 -exec rm {} \;

Copy files in Unix generated in 24 hours

I am trying to copy files generated in the past day (24 hours). I was told to use the awk command, but I could not find the exact command for doing this. My task is to copy files from /source/path to /destination/path.
find /source/path -type f -mmin -60 -exec ls -al {} \;
I used the above command to list the files generated in the past 60 minutes, but my requirement is to copy the files, not just list their names.
Just go ahead and exec cp instead of ls:
find /source/path -type f -mmin -60 -exec cp {} /destination/path \;
You are really close! Take the file names and use them for the copy. Print just the paths (rather than ls -al output) so the loop reads clean file names:
find /source/path -type f -mmin -60 -print |
while read -r file
do
cp -a "${file}" "/destination/path"
done
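Either variant can be tried safely in a scratch pair of directories first (names below are invented for the demo; `touch -d` is GNU-specific):

```shell
#!/bin/sh
# Demo: copy only the files modified within the last 60 minutes.
src=$(mktemp -d)
dst=$(mktemp -d)
touch "$src/recent.txt"                   # modified just now
touch -d "2 hours ago" "$src/stale.txt"   # outside the -mmin -60 window

find "$src" -type f -mmin -60 -exec cp {} "$dst" \;

ls "$dst"   # only recent.txt was copied
```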

Multiple find -exec commands in one bash script doesn't work?

I have a bash script that needs to be run by cron. It works when the script contains only one command line, but fails when there is more than one.
#!/bin/sh
find /path/to/file1 -name 'abc_*' -type f -mtime +7 -exec rm {} \;
find /path/to/file2 -name 'def*.gz' -type f -mtime +7 -exec rm {} \;
I received a find: missing argument to `-exec' error message. I need to keep only the last 7 days of several different files in several different directories.
Why did I get that error message when all the commands seem correct?
@user1576748
Is there anything that would prevent you from doing this inside one line?
example:
find /path/to/file1 /path/to/file2 \( -name 'abc*' -o -name 'def*.gz' \) -type f -mtime +7 -exec rm {} \;
The above works for me.
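One caveat worth noting: -o binds more loosely than the implied -a, so without \( \) grouping, the -type and -mtime tests would apply only to the second -name pattern. A scratch demonstration of the grouped form (file names invented; `touch -d` is GNU-specific):

```shell
#!/bin/sh
# Demo: one find over two name patterns, grouped so the age and type
# tests apply to both.
dir=$(mktemp -d)
touch -d "10 days ago" "$dir/abc_1" "$dir/def1.gz" "$dir/keep.txt"
touch "$dir/abc_2"   # matches a name pattern but is newer than 7 days

find "$dir" \( -name 'abc*' -o -name 'def*.gz' \) -type f -mtime +7 -exec rm {} \;

ls "$dir"   # keep.txt and abc_2 remain
```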
