Linux - Can't recursively delete large directories

I have a pretty big find command that is supposed to delete any files/directories it finds. I just can't get it to work properly.
If I attach -exec rm -fr {} \;, at some point I always get errors like the following:
find: ‘/path/to/dir/file123.local’: No such file or directory
If I replace it with -delete, I get the following error:
find: cannot delete `/path/to/dir': Directory not empty
I looked for suggestions online, but the suggestion is always the other option (replace -exec with -delete and vice versa).
Does anyone happen to know a way to fix this without redirecting stderr to /dev/null?
Thanks in advance!

find doesn't know what your command passed to -exec does. It traverses the directory tree in this order:
find a file
execute a command on that file
if it's a directory, traverse it down
Now if the directory has already been removed by rm -fr, there is nothing left to traverse, so find reports the missing path.
If you supply the -depth option, then the traversal order changes:
find a file
if it's a directory, traverse it down
execute a command on that file
This will eliminate the error message.
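For example (a minimal sketch, with /path/to/dir standing in for your actual tree), the same deletion run with -depth visits children before their parent, so rm never removes a directory that find still has to descend into:
find /path/to/dir -depth -exec rm -fr {} \;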
-delete implies -depth, so ostensibly it should work. However, it is your responsibility to make sure the directories you want to delete are completely cleaned up. If you filter out some files with -mtime etc., you may end up trying to delete a directory which is not completely empty.
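As an illustration (using a 30-day filter as a stand-in for your actual tests), this can fail with exactly the "Directory not empty" error above whenever a directory still contains newer files:
find /path/to/dir -mtime +30 -delete
Restricting the match to regular files sidesteps the problem, at the cost of leaving the (possibly empty) directories in place:
find /path/to/dir -type f -mtime +30 -delete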

You could try wrapping {} in double quotes; there may be spaces in the directory paths.
-exec rm -rf "{}" \;

If I read your question correctly, you want to remove files, but sometimes it happens that they have already been removed by some other process, and you wonder what you should do.
Why do you think you should do anything? You want the file to be gone, and apparently it is gone, so no worries.
Obviously the corresponding error messages might be annoying, but you can handle this by adding 2>/dev/null at the end of your command (redirecting the error output to /dev/null).
So you get:
find ... -exec rm -fr {} \; 2>/dev/null
Edit after comment from user1934428:
It might be a good idea to drop the r switch:
find ... -exec rm -f {} \; 2>/dev/null
In that case, you should have no errors anymore:
find ... -exec rm -f {} \;
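Note that this only holds if the expression matches regular files alone; rm -f on a directory still complains. One hedged way to guarantee that (with an illustrative path) is to restrict the match explicitly:
find /path/to/dir -type f -exec rm -f {} \;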

Related

`find` command core dumps in directory with too many files

I have a folder that contains so many files/folders inside it that even basic commands like du and find are crashing. I'd like to clean up some old files from it... but obviously I can't do that with the find command...
# find /opt/graphite/storage/whisper -mtime +30 -type f -delete
Aborted (core dumped)
What command or trick can I use to delete files from that folder since find isn't working?
I believe the best way to go is a simple for loop: the problem is that find loads all the information about what it finds into memory, and only once that is done does it start deleting.
However, a loop can solve this:
for f in * .*
do
    # skip the . and .. entries matched by the .* glob
    case "$f" in .|..) continue ;; esac
    # one possible check: last modified more than 30 days ago
    if [ -n "$(find "$f" -maxdepth 0 -mtime +30)" ]
    then rm -r -- "$f"
    fi
done
Concerning the last modification date check, there are plenty of ways to do this, as explained here.
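One such check, as a hedged sketch using GNU stat from coreutils (%Y prints the modification time in epoch seconds), could replace the find test in the loop above:
cutoff=$(( $(date +%s) - 30*24*3600 ))   # 30 days ago, in epoch seconds
if [ "$(stat -c %Y "$f")" -lt "$cutoff" ]
then rm -r -- "$f"
fi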
For the find command, using the -exec option worked for me to delete the files.
# find /opt/graphite/storage/whisper -mtime +30 -type f -exec rm -f {} \;
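If spawning one rm process per file turns out to be slow, the + terminator of -exec (standard POSIX find) batches many paths into each rm invocation, something like:
# find /opt/graphite/storage/whisper -mtime +30 -type f -exec rm -f {} +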

Removing some files out of several folders

I have a question about removing some files out of several folders.
To be more specific: there are 5 folders whose names match only in a few characters, for example o1_FolderF_xy and zz_FolderF_34. In each folder whose name contains "FolderF", I want to delete all the files that start with "filename".
Last time, I did it by hand.
Will this work, or do I need a script with a loop?
rm -rf /path/toFolder/*FolderF*/filename*
I'm sorry, I think for most of you it's a stupid question, but I'm new to all this stuff and I just don't want to go wrong with the delete.
Your suggested command will work just fine.
You could use find instead:
find /path -name 'filename*' -exec rm {} \;
Basically, it searches for files matching the filename* pattern under the /path directory and executes rm for each file found.
Or, if you want to just check into those specific directories:
find /path -wholename '*FolderF*/filename*' -exec rm {} \;
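Since rm is unforgiving, it may be worth previewing the matches first: the same expression with -print instead of -exec simply lists what would be removed:
find /path -wholename '*FolderF*/filename*' -print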

Linux large amount of files not being deleted

I have a folder of cache files in a Linux VM that weren't being deleted for some reason.
I'm trying to delete them (or the folder itself), but nothing seems to work.
rm just gives me back Argument list too long
I'm trying now
find ./cache -type f -delete, hitting ls -l every once in a while, but I keep getting the same # of files.
Also tried
find ./cache -type f -exec rm -v {} \; but same thing again.
I would be OK with just deleting the folder, as long as I recreate it afterwards.
Thank you
EDIT: OK, I found out that ls -l does not return the # of files; if, however, I do
ls | wc -l, the system seems not to respond at all.
Use rm -R filename to remove large data files
The length of a command's argument list is limited (ARG_MAX on Linux), so rm with a glob that expands to all those files cannot work.
The find command will work even though your directory is really big. Launch your find command and go to lunch.
EDIT – by the way, make sure you ls the same directory you are removing files from, i.e. ./cache. It is not clear in your question.
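As a hedged aside, counting the files through find avoids the hang: unlike ls, which reads and sorts the whole directory before printing anything, find streams the names as it goes:
find ./cache -type f | wc -l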

Removing files called --exclude=*.xdr

Somehow I must have mistyped a command, because now I have files named --exclude=*.xdr and --exclude=*.h5 in one of my directories. I want to delete them. The only problem is that whenever I do something like:
rm --exclude=*.xdr
it thinks I'm passing an argument to the rm command. I've tried encasing in single and double quotes but it still didn't work. How can I delete these files?
Cheers
Flag interpretation is done based purely on text. Any string that doesn't start with a - is not a flag. The path to a file in the local directory can start with ./ (the . means "current directory").
I'd also recommend reading the man page for rm, as that explicitly lists two different ways of doing exactly this.
rm -- --blah
rm ./--blah
rm -- "--exclude=.xdr"
Use this command for delete that file
What about using find:
find . -type f -name "--exclude*" -exec rm {} \; -print

Bash script to recursively step through folders and delete files

Can anyone give me a bash script or one-line command I can run on Linux to recursively go through each folder from the current folder and delete all files or directories starting with '._'?
Change directory to the root directory you want (or change . to the directory) and execute:
find . -name "._*" -print0 | xargs -0 rm -rf
xargs allows you to pass several parameters to a single command, so it will be faster than the find -exec syntax. Also, you can run the find part once without the pipe to view the files it will delete, to make sure it is safe.
find . -name '._*' -exec rm -Rf {} \;
I've had a similar problem a while ago (I assume you are trying to clean up a drive that was connected to a Mac which saves a lot of these files), so I wrote a simple python script which deletes these and other useless files; maybe it will be useful to you:
http://github.com/houbysoft/short/blob/master/tidy
find /path -name "._*" -exec rm -fr "{}" +;
Instead of deleting the AppleDouble files, you could merge them with the corresponding files. You can use dot_clean.
dot_clean -- Merge ._* files with corresponding native files.
For each dir, dot_clean recursively merges all ._* files with their corresponding native files according to the rules specified with the given arguments. By default, if there is an attribute on the native file that is also present in the ._ file, the most recent attribute will be used.
If no operands are given, a usage message is output. If more than one directory is given, directories are merged in the order in which they are specified.
Because dot_clean works recursively by default, use:
dot_clean <directory>
If you want to turn off the recursive merge, use -f for a flat merge.
dot_clean -f <directory>
find . -name '._*' -delete
A bit shorter, and it performs better in the case of an extremely long list of files.
