Bash script to recursively step through folders and delete files - linux

Can anyone give me a bash script or one-line command I can run on Linux to recursively go through each folder from the current folder and delete all files or directories starting with '._'?

Change directory to the root directory you want (or change . to the directory) and execute:
find . -name "._*" -print0 | xargs -0 rm -rf
xargs allows you to pass several parameters to a single command, so it will be faster than using the find -exec syntax. Also, you can run the find part on its own first to view the files it will delete and make sure it is safe.
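For example, a preview pass (using -print rather than -print0 so the output is readable) might look like:
find . -name "._*" -print
and you only add the | xargs -0 rm -rf part once the list looks right.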

find . -name '._*' -exec rm -Rf {} \;

I've had a similar problem a while ago (I assume you are trying to clean up a drive that was connected to a Mac which saves a lot of these files), so I wrote a simple python script which deletes these and other useless files; maybe it will be useful to you:
http://github.com/houbysoft/short/blob/master/tidy

find /path -name "._*" -exec rm -fr "{}" +

Instead of deleting the AppleDouble files, you could merge them with the corresponding files. You can use dot_clean.
dot_clean -- Merge ._* files with corresponding native files.
For each dir, dot_clean recursively merges all ._* files with their corresponding native files according to the rules specified with the given arguments. By default, if there is an attribute on the native file that is also present in the ._ file, the most recent attribute will be used.
If no operands are given, a usage message is output. If more than one directory is given, directories are merged in the order in which they are specified.
Because dot_clean works recursively by default, use:
dot_clean <directory>
If you want to turn off the recursive merge, use -f for a flat merge.
dot_clean -f <directory>

find . -name '._*' -delete
A bit shorter, and it performs better with an extremely long list of files.

Related

Linux find command explanation

Can someone explain to me what this command does, and if I want to do the same thing for git, how should I modify it?
find . -name CVS -print -exec rm -fr {} \;
This command looks in your current working directory for any directories or files named "CVS" and prints the full path of each one. It then executes a forced recursive removal for each result returned by find.
Since there is no file extension in the name, this command will remove any directory named CVS within your current working directory, including all subdirectories and files housed within.
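To do the same thing for git (assuming the goal is to strip the .git metadata directories, which are git's closest equivalent of the CVS directories), the pattern would presumably just change from CVS to .git:
find . -name .git -print -exec rm -fr {} \;
As with the CVS version, it is worth running it with only -print first to confirm exactly what would be removed.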

Find and remove multiple files using a Linux command

I have a directory named classes which contains a lot of sub-directories -
classes
|-security
|-registration
|-service
....
Each of these directories contains a lot of Java files and their compiled class files. I want to remove all the class files.
From the classes directory I can list all the class files using the find command:
$ find . -name *.class
Is there any command in Linux to remove all the class files under the classes directory?
The usual answer uses the -exec option of find:
find . -name "*.class" -exec rm {} \;
Be sure to quote the wildcard, to ensure that it is passed into find (rather than globbed by the shell, first).
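To see why the quoting matters, suppose (hypothetically) the current directory itself contains a single file Foo.class:
find . -name *.class      # the shell expands this to: find . -name Foo.class
find . -name "*.class"    # find receives the pattern itself and matches every .class file
With the unquoted version, find would silently miss every .class file not named Foo.class (and with several .class files in the current directory it would error out instead).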
For further discussion, see these questions:
Command line: piping find results to rm
Linux why can't I pipe find result to rm?
Use xargs with a pipeline:
$ find . -name "*.class" | xargs rm

Iteratively remove file type

I'm trying to delete the files that visual sourcesafe inserts into various folders. It's this file:
vssver2.scc
Since I have many nested folders, I'd like to do this recursively from the parent folder. What would the Linux command be to delete all files with the .scc extension? (I'm on a Mac.)
Thanks.
Look for them and remove:
find . -name "*.scc" -exec rm {} +
To make sure you are going to delete the correct files, you can first replace rm with ls so that the command just lists the matching files.
Also, you can replace find . with find /your/path to indicate the exact path you want to remove from; with find . it starts from the current directory.
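For example, a dry run along these lines:
find . -name "*.scc" -exec ls {} +
and once the listing shows only the files you expect, switch ls back to rm.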
find . -name "*.scc" -print0 | xargs -0 rm -rf

BASH: Checking if files are duplicates within a directory?

I am writing a house-keeping script and have files within a directory that I want to clean up.
I want to move files from a source directory to another; there are many sub-directories, so there could be files that are the same. What I want to do is either use cmp or md5sum on each file: if there are no duplicates, move them; if two files are the same, only move one.
I have the move part working correctly as follows:
find /path/to/source -name "IMAGE_*.JPG" -exec mv '{}' /path/to/destination \;
I am assuming that I will have to loop through my directory, so I am thinking:
for files in /path/to/source
do
if -name "IMAGE_*.JPG"
then
md5sum (or cmp) $files
...stuck here (I am worried about how this method will be able to compare all the files against each other and how I would filter them out)...
then just do the mv to finish.
Thanks in advance.
find . -type f -exec md5sum {} \; | awk '{print $1}' | sort | uniq -d
That'll spit out all the md5 hashes that have duplicates (the awk keeps just the hash, so differing file names don't hide the duplicates). Then it's just a matter of figuring out which file(s) produced those duplicate hashes.
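If it helps, here is a rough sketch of that second step combined with the move, assuming bash 4+ (for associative arrays), GNU md5sum output (hash, two spaces, path) and filenames without newlines; the paths and pattern are the placeholders from the question:
#!/bin/bash
# Sketch: move IMAGE_*.JPG files, but move only one copy of each set of duplicates.
src=/path/to/source
dest=/path/to/destination

declare -A seen                                  # hash -> already moved
while IFS= read -r line; do
    hash=${line%% *}                             # first field: the md5 hash
    file=${line#*  }                             # rest of the line: the file path
    if [[ -z ${seen[$hash]} ]]; then
        seen[$hash]=1
        mv "$file" "$dest"/                      # first file with this content: move it
    fi                                           # later duplicates are simply left behind
done < <(find "$src" -name "IMAGE_*.JPG" -exec md5sum {} +)
The duplicates that remain in the source tree can then be reviewed or deleted separately.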
There's a tool designed for exactly this purpose, fdupes:
fdupes -r dir/
dupmerge is another such tool...

Removing files called --exclude=*.xdr

Somehow I must have mistyped a command, because now I have files named --exclude=*.xdr and --exclude=*.h5 in one of my directories. I want to delete them. Only problem is whenever I do something like:
rm --exclude=*.xdr
it thinks I'm passing an argument to the rm command. I've tried encasing in single and double quotes but it still didn't work. How can I delete these files?
Cheers
Flag interpretation is done based purely on text. Any string that doesn't start with a - is not a flag. The path to a file in the local directory can start with ./ (the . means "current directory").
I'd also recommend reading the man page for rm, as that explicitly lists two different ways of doing exactly this.
rm -- --blah
rm ./--blah
Use this command to delete that file:
rm -- "--exclude=*.xdr"
What about using find:
find . -type f -name "--exclude*" -exec rm {} \; -print
