cronjob to remove files older than N days with special characters - linux

I'm trying to create a job to delete files on a linux box older than X days. Pretty straightforward with:
find /path/to/files -mtime +X -exec rm {} \;
The problem is that all my files have special characters because they are pictures from a webcam; most contain parentheses, so the above command fails with "no such file or directory".

Have you tried this:
find /path/to/files -mtime +X -exec rm '{}' \;
Or perhaps:
rm $(find /path/to/files -mtime +X);
Or even this method using xargs instead of -exec:
find /path/to/files -mtime +X | xargs rm -f;
Another twist on xargs is to use -print0, which helps differentiate between spaces within filenames and the separators between entries in the returned list by using the ASCII NUL character as the file separator:
find /path/to/files -mtime +X -print0 | xargs -0 rm -f;
Or as man find explains under -print0:
This primary always evaluates to true. It prints the pathname of
the current file to standard output, followed by an ASCII NUL
character (character code 0).
I would also recommend adding the -maxdepth and -type flags to better control what the script does. So I would use this for a dry-run test:
find /path/to/files -maxdepth 1 -type f -mtime +1 -exec echo '{}' \;
The -maxdepth flag controls how many directory levels deep find will descend, and -type limits the search to regular files (f) so the script is focused on files only. This will simply echo the results. Then, when you are comfortable with it, change the echo to rm, as shown below.
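For example, once the dry run looks right, the same command with rm swapped in for echo would be:
find /path/to/files -maxdepth 1 -type f -mtime +1 -exec rm '{}' \;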

Does
find /path/to/files -mtime +X -print | tr '()' '?' | xargs rm -f
work?

delete sql files older than 15 days

I'm trying to work out a command which deletes sql files older than 15 days.
The find part is working but not the rm.
rm -f | find -L /usr/www2/bar/htdocs/foo/rsync/httpdocs/db_backups -type f \( -name '*.sql' \) -mtime +15
It kicks out a list of exactly the files I want deleted but is not deleting them. The paths are correct.
usage: rm [-f | -i] [-dIPRrvW] file ...
unlink file
/usr/www2/bar/htdocs/foo/rsync/httpdocs/db_backups/20120601.backup.sql
...
/usr/www2/bar/htdocs/foo/rsync/httpdocs/db_backups/20120610.backup.sql
What am I doing wrong?
You are actually piping rm's output to the input of find. What you want is to use the output of find as arguments to rm:
find -type f -name '*.sql' -mtime +15 | xargs rm
xargs is the command that "converts" its standard input into arguments of another program, or, as they more accurately put it on the man page,
build and execute command lines from standard input
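A quick way to see what xargs builds is to put echo in front of the command it runs (hypothetical file names, just for illustration):
printf '%s\n' a.sql b.sql c.sql | xargs echo rm
# prints: rm a.sql b.sql c.sql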
Note that if file names can contain whitespace characters, you should correct for that:
find -type f -name '*.sql' -mtime +15 -print0 | xargs -0 rm
But actually, find has a shortcut for this: the -delete option:
find -type f -name '*.sql' -mtime +15 -delete
Please be aware of the following warnings in man find:
Warnings: Don't forget that the find command line is evaluated
as an expression, so putting -delete first will make find try to
delete everything below the starting points you specified. When
testing a find command line that you later intend to use with
-delete, you should explicitly specify -depth in order to avoid
later surprises. Because -delete implies -depth, you cannot
usefully use -prune and -delete together.
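To illustrate the first warning with a hypothetical path, the position of -delete in the expression matters:
find /tmp/test -delete -name '*.sql'   # wrong: -delete is evaluated first and deletes everything under /tmp/test
find /tmp/test -name '*.sql' -delete   # correct: -delete only applies to files matching the tests before it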
P.S. Note that piping directly to rm isn't an option, because rm doesn't expect filenames on standard input. What you are currently doing is piping them backwards.
find /usr/www/bar/htdocs -mtime +15 -exec rm {} \;
This will select files in /usr/www/bar/htdocs older than 15 days and remove them.
Another, simpler method is to use the locate command and then pipe the result to xargs.
For example,
locate file | xargs rm
Use xargs to pass the arguments, with -d '\n' so that each input line is treated as one name (spaces within names are preserved) and -r so that rm is not run at all when the input is empty:
"${command}" | xargs -rd '\n' rm
Include --force if you want to also remove read-only files.
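For example, with the find command from the question substituted for the ${command} placeholder (GNU xargs assumed, since -d is a GNU extension):
find /usr/www2/bar/htdocs/foo/rsync/httpdocs/db_backups -type f -name '*.sql' -mtime +15 | xargs -rd '\n' rm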
Assuming you aren't in the directory containing the *.sql backup files:
find /usr/www2/bar/htdocs/foo/rsync/httpdocs/db_backups/*.sql -mtime +15 -exec rm -v {} \;
The -v option above is handy: it verbosely reports each file as it is being deleted.
I like to list the files that will be deleted first to be sure. E.g:
find /usr/www2/bar/htdocs/foo/rsync/httpdocs/db_backups/*.sql -mtime +15 -exec ls -lrth {} \;

I want to get the output of the find command in a shell script

I am trying to write a script that finds the files older than 10 hours in the sub-directories listed in "HS_client_list", and sends the output to the file "find.log".
#!/bin/bash
while IFS= read -r line; do
echo Executing cd /moveit/$line
cd /moveit/$line
#Find files more than 600 minutes (10 hours) old.
find $PWD -type f -iname "*.enc" -mmin +600 -execdir basename '{}' ';' | xargs ls > /home/infa91punv/find.log
done < HS_client_list
However, while the script is able to cd into the folders from HS_client_list (this file contains the names of the subdirectories), the find command is not working: the output file is empty. When I run find $PWD -type f -iname "*.enc" -mmin +600 -execdir basename '{}' ';' | xargs ls > /home/infa91punv/find.log directly as a command it works, but from the script it doesn't.
You are overwriting the file in each iteration.
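That is, the > inside the loop truncates find.log on every pass. A minimal sketch of a fix along those lines, emptying the log once up front and then appending:
: > /home/infa91punv/find.log
while IFS= read -r line; do
find /moveit/"$line" -type f -iname "*.enc" -mmin +600 -execdir basename '{}' ';' >> /home/infa91punv/find.log
done < HS_client_list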
You can use xargs to perform find on multiple directories; but you have to use an alternate delimiter to avoid having xargs populate the {} in the -execdir command.
sed 's%^%/moveit/%' HS_client_list |
xargs -I '<>' find '<>' -type f -iname "*.enc" -mmin +600 -execdir basename {} \; > /home/infa91punv/find.log
The xargs ls did not seem to perform any useful functionality, so I took it out. Generally, don't use ls in scripts.
With GNU find, you could avoid the call to an external utility and use the -printf action to print just the part of the path name that you care about.
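For instance, a sketch of the same pipeline using the %f directive, which prints the filename with its leading directories removed:
sed 's%^%/moveit/%' HS_client_list |
xargs -I '<>' find '<>' -type f -iname "*.enc" -mmin +600 -printf '%f\n' > /home/infa91punv/find.log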
For added efficiency, you could invoke a shell to collect the arguments:
sed 's%^%/moveit/%' HS_client_list |
xargs sh -c 'find "$@" -type f -iname "*.enc" -mmin +600 -execdir basename {} \;' _ >/home/infa91punv/find.log
This will run as many directories as possible in a single find invocation.
If you want to keep your loop, the solution is to put the redirection after done. I would still factor out the cd, and take care to quote the variable interpolation.
while IFS= read -r line; do
find /moveit/"$line" -type f -iname "*.enc" -mmin +600 -execdir basename '{}' ';'
done < HS_client_list >/home/infa91punv/find.log

How to write a unix command or script to remove files of the same type in all sub-folders under current directory?

Is there a way to remove all temp files and executables under one folder AND its sub-folders?
All that I can think of is:
$rm -rf *.~
but this removes only the temp files under the current directory; it DOES NOT remove temp files under SUB-folders at all, and it doesn't remove any executables either.
I know there are similar questions which get very well answered, like this one:
find specific file type from folder and its sub folder
but that is Java code; I only need a Unix command or a short script to do this.
Any help please?
Thanks a lot!
Perl from the command line; this should delete a file if it ends with ~ or is executable:
perl -MFile::Find -e 'find(sub{ unlink if -f and (/~\z/ or (stat)[2] & 0111) }, ".")'
You can achieve the result with find:
find /path/to/directory \( -name '*.~' -o \( -perm /111 -a -type f \) \) -exec rm -f {} +
This will execute rm -f <path> for any <path> under (and including) /path/to/directory which:
matches the glob expression *.~
or which has an executable bit set (be it owner, group or world)
The above applies to the GNU version of find.
A more portable version is:
find /path/to/directory \( -name '*.~' -o \( \( -perm -01 -o -perm -010 -o -perm -0100 \) \
-a -type f \) \) -exec rm -f {} +
find . -name "*~" -exec rm {} \;
or whatever pattern is needed to match the tmp files.
If you want to use Perl to do it, use a specific module like File::Remove.
This should do the job
find -type f -name "*~" -print0 | xargs -r -0 rm

delete old files in a directory

Okay, maybe this sounds simple, but it has been a bit challenging for me.
I have a directory called backups and it has (backup files + other files)
backup files:
../backups/backup-2013_03_03.zip
../backups/backup-2013_03_05.zip
../backups/backup-2013_01_01.zip
../backups/backup-2013_08_16.zip
../backups/backup-2013_02_28.zip
../backups/backup-2013_01_21.zip
../backups/backup-2013_03_29.zip
../backups/backup-2013_04_05.zip
I'm trying to delete backup files older than 90 days.
find /var/tmp/stuff -mtime +90 -print | xargs /bin/rm
seems to work, but I'm not able to limit the search to backup files only, i.e. files whose names start with "backup".
I have tried adding the "-iname backup" option to the find command, thinking it would do the trick, but it doesn't seem to work.
Any ideas?
Thank you
You can pipe through grep before calling rm. Something like:
find /var/tmp/stuff -mtime +90 -print | grep 'backup-' | xargs /bin/rm
While the find utility has all kinds of options to do this single-handedly, including the deletion noted in other answers, I can never remember any but the most basic options.
find "stuff" | grep "some_other_stuff" | xargs "do_stuff"
seems much easier for me to remember.
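A null-safe variant of the same pipeline, assuming GNU grep (its -z option switches input and output to NUL-separated records), would be:
find /var/tmp/stuff -mtime +90 -print0 | grep -z 'backup-' | xargs -0 /bin/rm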
The parameter to -iname matches against the full filename, so you need a trailing wildcard:
find /var/tmp/stuff -mtime +90 -iname "backup*" -print | xargs /bin/rm
You could also use find's -exec argument, but personally I find the syntax quite arcane. I prefer xargs.
find /var/tmp/stuff -mtime +90 -iname "backup*" -exec /bin/rm '{}' \;
Or, as damienfrancois points out, GNU find can take a -delete argument. This is the best solution because a) it is shorter and b) it is more efficient, since the deletion happens within the find process itself: -exec with \; spawns one new rm process per file, and xargs, while it batches arguments, still spawns external rm processes (source: GNU findutils manual). However, as wildplasser points out, it can also be dangerous: -delete will remove directories by default. To only delete files, use -type f.
find /var/tmp/stuff -type f -mtime +90 -iname "backup*" -delete
You could use the -exec option of find along with -iname. Since you want to delete only files, you need to specify -type f:
find /var/tmp/stuff -type f -iname 'backup*' -mtime +90 -exec rm {} +
If you prefer xargs, like me:
find /var/tmp/stuff -type f -iname 'backup*' -mtime +90 -print0 | xargs -0 rm
Note: it's recommended to use find -print0 with xargs -0 to avoid caveats with unusual file names.

Linux command for removing all ~ files

What command can I use in Linux to check if there is a file in a given directory (or its subdirectories) whose name ends with a ~?
For example, if I'm in a directory called t which contains many subdirectories, I would like to remove all files that end with a ~.
Watch out for filenames with spaces in them!
find ./ -name "*~" -type f -print0 | xargs -0 rm
With GNU find:
find /path -type f -name "*~" -exec rm {} +
or
find /path -type f -name "*~" -delete
find ./ -name '*~' -print0 | xargs -0 rm -f
Here find will search the directory ./ and all sub directories, filtering for filenames that match the glob '*~' and printing them (with proper quoting courtesy of alberge). The results are passed to xargs to be appended to rm -f and the resulting string run in a shell. You can use multiple paths, and there are many other filters available (just read man find).
You can use a find, grep, rm combination, something like:
find | grep "~" | xargs rm -f
Probably others have better ideas :)
