Output from script - exec rm - linux

I have a script that deletes files older than 2 days. It usually works properly, but for some reason it isn't working today.
I want a way to get output from the script, with the error explaining why the files were not deleted.
Is there such an option?
script:
#!/bin/bash
#script for cleaning logs from files older than two days
dir_name=/home/albert/scripts/files
file_log=/home/albert/scripts/info.log
{
find ${dir_name} -type f -name '*.log' -mtime +2 -exec rm -v {} \;
} >> ${file_log}

You probably want to add a redirection for standard error too, e.g. in case you want to send it to the same file:
{
find ${dir_name} -type f -name '*.log' -mtime +2 -exec rm -v {} \;
} >> ${file_log} 2>&1
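If you would rather keep errors separate from the deletion log, you can send stderr to its own file instead; the error-log path here is just an example:
{
find ${dir_name} -type f -name '*.log' -mtime +2 -exec rm -v {} \;
} >> ${file_log} 2>> /home/albert/scripts/errors.log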

You could use the find option -delete:
-delete - If the removal failed, an error message is issued.
find "${dir_name}" -type f -name '*.log' -mtime +2 -delete >> ${file_log} 2>&1
Example:
$ find /etc -name passwd -delete
find: cannot delete ‘/etc/pam.d/passwd’: Permission denied
find: cannot delete ‘/etc/passwd’: Permission denied
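If you also want the log to record which files were removed, a -print placed before -delete prints each name as it is matched (a sketch reusing the question's variables; note that GNU find's -delete also implies -depth):
find "${dir_name}" -type f -name '*.log' -mtime +2 -print -delete >> "${file_log}" 2>&1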

Related

log deleted files in linux

I am using the following command to delete files that are older than 10 days, but I also want to store (log) the list of files being deleted by it.
find ./path/delete -type f -name '*' -mtime +10 -exec rm {} \;
If you create a file to log to (touch ./path/to/logfile), just add another -exec to your command. Below is a very basic example, but you can add to it:
find ./path/delete -type f -name '*' -mtime +10 -exec rm {} \; -exec echo {} >> ./path/to/logfile \;
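Note that the >> redirection there applies to find's standard output as a whole, which still works because the echo run by -exec inherits it. An equivalent sketch, using the same hypothetical paths, lets find itself print each name after a successful rm:
find ./path/delete -type f -name '*' -mtime +10 -exec rm {} \; -print >> ./path/to/logfile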

Multiple find -exec commands in one bash script doesn't work?

I have a bash script that needs to be run by cron. It works when the script only contains 1 command line, but fails when it's more than 1 line.
#!/bin/sh
find /path/to/file1 -name 'abc_*' -type f -mtime +7 -exec rm {} \;
find /path/to/file2 -name 'def*.gz' -type f -mtime +7 -exec rm {} \;
I received the error message find: missing argument to `-exec'. I need to keep only the last 7 days of several different files in several different directories.
Why did I get that error when each command seems correct on its own?
@user1576748 Is there anything that would prevent you from doing this in one line?
example:
find /path/to/file1 /path/to/file2 \( -name 'abc_*' -o -name 'def*.gz' \) -type f -mtime +7 -exec rm {} \;
The above works for me.
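The escaped parentheses matter: find's implicit -a binds tighter than -o, so without them the expression would parse as
find /path/to/file1 /path/to/file2 -name 'abc_*' -o \( -name 'def*.gz' -type f -mtime +7 -exec rm {} \; \)
which deletes only the def*.gz files and silently skips the abc_* ones.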

CentOS: delete X-days old files using cron

I want to delete x-days-old .tar and .sql files using a cron job.
I have tried multiple commands, but they didn't work, such as:
find /path/to/files -type f -mtime +10 -delete
find /path/to/folder -name '*.sql' -mtime +30 -delete
find "$FILEDIR" -mtime +14 -delete
[[ $FILEDIR == /home/abc/* ]] && find "$FILEDIR" -mtime +14 -delete
The above commands run without errors but don't delete anything.
Need help.
I use for loops and echo statements for debugging purposes and also to make sure I don't delete anything by mistake.
So try this:
for filename in $(find /path/to/files -type f -mtime +10); do echo "rm $filename"; done
That will list out what it plans to do. Check it, make sure it looks good, then run it for real by removing the echo statement:
for filename in $(find /path/to/files -type f -mtime +10); do rm $filename; done
If it doesn't work, you can write the output to a file and run it as a script:
for filename in $(find /path/to/files -type f -mtime +10); do echo "rm $filename"; done > myremove.sh
Then run it:
sh -x ./myremove.sh
and check the output for errors.
Once it is bug-free, you can add it to cron as a single command or in a script.
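For example, a crontab entry that runs the cleanup nightly at 02:00 could look like this (the schedule, path, and log file are just an illustration):
0 2 * * * find /path/to/files -type f -mtime +10 -delete >> /var/log/cleanup.log 2>&1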

Linux find and delete files but redirect file names to be deleted

Is there a way to write the file names to a file before they are deleted, for later reference, to check what has been deleted?
find <PATH> -type f -name "<filePattern>" -mtime +1 -delete
Just add a -print expression to the invocation of find:
find <PATH> -type f -name "<filePattern>" -mtime +1 -delete -print > log
I'm not sure if this prints the name before or after the file is unlinked, but it should not matter. I suspect -delete -print unlinks before it prints, while -print -delete will print before it unlinks.
Like William said, you can use -print. However, instead of -print > log, you can also use the -fprint flag.
You'd want something like:
find <PATH> -type f -name "<filePattern>" -mtime +1 -fprint "<pathToLog>" -delete
For instance, I use this in a script:
find . -type d -name .~tmp~ -fprint /var/log/rsync-index-removal.log -delete
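One caveat: GNU find truncates (or creates) the -fprint file when it starts, so unlike a >> redirection it does not append across runs.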
You can use -exec and rm -v:
find <PATH> -type f -name "<filePattern>" -mtime +1 -exec rm -v {} \;
rm -v will report what it is deleting.
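For example, with GNU coreutils (the exact wording can vary on other systems):
$ rm -v old.log
removed 'old.log'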
With something like this you can execute multiple commands from the -exec action, e.g. log to a file, remove the file, and whatever else you need. Passing {} to the shell as a positional parameter, rather than splicing it into the command string, keeps file names with spaces or quotes from breaking the command:
find <PATH> -type f -name "<filePattern>" -mtime +1 -exec sh -c 'echo "$1" >> mylog; rm -f "$1"' sh {} \;
Put the following in a shell script named removelogs.sh and run it with sh removelogs.sh in a terminal.
This is the content of the removelogs.sh file:
cd /var/log;
date >> /var/log/removedlogs.txt;
find . -maxdepth 4 -type f -name \*log.old -delete -print >> /var/log/removedlogs.txt
. - run from this location, so make sure you do not run this in the root folder!
-maxdepth - to prevent it from getting out of control
-type - to ensure just files are matched
-name - to ensure just your filtered names are matched
-print - to send the result to stdout
-delete - to delete said files
>> - appends to the file; a single > would overwrite it / create a new one
Works for me on CentOS 7.

nonzero return code although find -exec rm works

I'm on a Linux system and I wonder what is wrong with the following invocation of find:
mkdir a && touch a/b
find . -name a -type d -exec echo '{}' \;
./a
find . -name a -type d -exec rm -r '{}' \;
find: `./a': No such file or directory
The invocation of echo is just for testing purposes. I would expect the last command to remove the directory './a' entirely and return 0. Instead it removes the directory and generates the error message. To repeat, it does remove the directory! What is going on?
rm executes without a problem. The issue is that find is confused: it saw that the directory ./a was there, so it then tries to descend into it to look for more directories named a. However, find cannot enter the directory, since it has already been removed.
One way to avoid this is to do
find . -name a -type d | xargs rm -r
This lets find move along before the rm command is executed. Or you can simply ignore the error from your original command.
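If the matched names can contain spaces or newlines, the null-delimited variant is safer (GNU find and xargs):
find . -name a -type d -print0 | xargs -0 rm -r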
Based on epsalon's comment, the solution is to use the -depth option, which causes the deeper entries to be visited first:
find . -depth -name a -type d -exec rm -r '{}' \;
does the trick. Thanks a bunch!
If performance is an issue, use -prune in order to prevent find from descending into directories named "a":
find . -name a -type d -prune -exec rm -r '{}' \;
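If there are many matches, the + form of -exec batches the names into fewer rm invocations and combines with -prune the same way:
find . -name a -type d -prune -exec rm -r {} +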
