delete all files except a pattern list file - linux

I need to delete all the files in the current directory except a list of patterns that are described in a whitelist file (delete_whitelist.txt) like this:
(.*)dir1(/)?
(.*)dir2(/)?
(.*)dir2/ser1(/)?(.*)
(.*)dir2/ser2(/)?(.*)
(.*)dir2/ser3(/)?(.*)
(.*)dir2/ser4(/)?(.*)
(.*)dir2/ser5(/)?(.*)
How can I perform this in one bash line?

Any bash script can fit on one line:
find . -type f -print0 | grep -EzZvf delete_whitelist.txt | xargs -0 printf '%s\n'
Check the output and then, if it's OK:
find . -type f -print0 | grep -EzZvf delete_whitelist.txt | xargs -0 rm
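For reference, here is the same pipeline with each flag annotated (a sketch assuming GNU grep and GNU findutils; these flags are not portable):
# -print0   emit NUL-terminated paths so names with spaces or newlines survive
# -E        treat the whitelist entries as extended regular expressions
# -z        treat input and output lines as NUL-terminated
# -Z        print a NUL after any file name grep outputs (harmless here)
# -v        invert the match, keeping only paths that match no pattern
# -f        read the patterns from delete_whitelist.txt
# xargs -0  split on NUL and hand the surviving paths to rm
find . -type f -print0 | grep -EzZvf delete_whitelist.txt | xargs -0 rm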

Related

How to pipe a list of files returned from find to cat and sort them

I'm trying to find all the files from a folder and then print them but sorted.
I have this so far
find . -type f -exec cat {} \;
and it prints all the files, but I need them sorted too, and when I do
find . -type f -exec sort cat {};
I get the following error:
sort: cannot read: cat: No such file or directory
and if I switch sort and cat like this
find . -type f -exec cat sort {} \;
I get the same error, then it prints the file (I have only one file to print).
It's not clear to me whether you want to display the contents of the files unchanged but in filename order, or to sort the contents of each file. If the latter:
find . -type f -exec sort {} \;
If the former, use BSD find's -s option:
find -s . -type f -exec cat {} \;
If you don't have BSD find, use:
find . -type f -print0 | sort -z | xargs -0 cat
Composing commands using pipes is often the simplest solution.
find . -type f -print0 | sort -z | xargs -0 cat
Explanation: you can sort the filenames after the fact using ... | sort -z, then pass the output (the list of files) to cat using xargs, i.e. ... | xargs -0 cat.
As @arkaduisz points out, when using pipes you should carefully handle filenames containing whitespace (hence -print0, sort -z, and xargs -0).
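To make the two readings concrete, here is a minimal sketch; the files a.txt and b.txt and their contents are invented for illustration:
# Hypothetical setup: contents deliberately unsorted
printf 'm\nk\n' > a.txt
printf 'z\ny\n' > b.txt
# Former reading: concatenate in filename order, contents unchanged
find . -type f -name '*.txt' -print0 | sort -z | xargs -0 cat
# Latter reading: sort the contents of each file individually
find . -type f -name '*.txt' -exec sort {} \;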

Grep regular files in a linux File System and show their content

How do I display the content of regular files matched with the grep command? For example, I grep a directory listing in order to see the regular files it has. I used the following line to see the regular files only:
ls -lR | grep ^-
Then I would like to display the content of the files found there. How do I do it?
I would do something like:
$ cat `ls -lR | egrep "^-" | rev | cut -d ' ' -f 1 | rev`
Use ls to find the files
grep finds your pattern
reverse the whole result
cut out the first space-separated field, which after the rev is the file name (files with spaces are problematic)
reverse the file name back to normal direction
Backticks will execute that and return the list of file names to cat.
Or, as I would probably do it, use vim to look at each file:
$ vim `ls -lR | egrep "^-" | rev | cut -d ' ' -f 1 | rev`
It feels like you are trying to find only the files recursively. This is what I do in those cases:
$ vim `find . -type f -print`
There are multiple ways of doing it. I'll give a few easy and clean ways here. All of them handle filenames with spaces.
$ find . -type f -print0 | xargs -0 cat
-print0 adds a null character '\0' delimiter, and you need to call xargs -0 to recognise the null delimiter. If you don't do that, whitespace in a filename creates problems.
e.g. without -print0, the filenames abc 123.txt and 1.inc would be read as three separate files: abc, 123.txt and 1.inc.
With -print0 this becomes abc 123.txt'\0' and 1.inc'\0', which is read as abc 123.txt and 1.inc.
As for xargs, it accepts input as arguments: command1 | xargs command2 means the output of command1 is passed to command2 as arguments.
cat displays the content of the file.
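Here is a minimal demonstration of that failure mode, assuming an empty scratch directory:
mkdir scratch && cd scratch
touch 'abc 123.txt' 1.inc
# Without NUL delimiters, xargs splits the first name on the space:
find . -type f | xargs cat    # cat: ./abc: No such file or directory
# With -print0/-0 both files are passed intact:
find . -type f -print0 | xargs -0 cat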
$ find . -type f -exec echo {} \; -exec cat {} \;
This is just using the find command. It finds all the files (type f), calls echo to output the filename, then calls cat to display its content.
If you don't want the filename, omit -exec echo {} \;
Alternatively you can use the cat command and pass it the output of find.
$ cat `find . -type f -print`
If you want to scroll through the content of multiple files one by one, you can use:
$ less `find . -type f -print`
When using less, you can navigate with :n and :p for the next and previous file respectively. Press q to quit less.

prevent space from splitting filenames using backticks

Using find to select files to pass to another command using backticks/backquotes, I've noticed that filenames that contain spaces will be split, and therefore not found.
Is it possible to avoid this behaviour? The command I issued looks like this
wc `find . -name '*.txt'`
but for example when there is a file named a b c.txt in directory x it reports
$ wc `find . -name '*.txt'`
wc: ./x/a: No such file or directory
wc: b: No such file or directory
wc: c.txt: No such file or directory
When used with multiple files, wc shows the output for each file plus a final summary line with the totals across all files; that's why I want to execute wc only once.
I tried escaping the spaces with sed, but wc produces the same output (it still splits filenames with spaces).
wc `find . -name '*.txt' | sed 's/ /\\\ /pg'`
Use the -print0 option to find and the corresponding -0 option to xargs:
find . -name '*.txt' -print0 | xargs -0 wc
You can also use the -exec option to find:
find . -name '*.txt' -exec wc {} +
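The + form batches as many paths as possible into each wc invocation, so the combined totals line is preserved. The output below is illustrative, not from a real run:
find . -name '*.txt' -exec wc {} +
#   3   6   24 ./x/a b c.txt
#   5  10   40 ./notes.txt
#   8  16   64 total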
From this very similar question (should I flag my question as a duplicate?) I found another answer using bash's ** expansion:
wc **/*.txt
For this to work I had to run
shopt -s globstar
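Putting those two lines together (bash 4 or later; note that ** does not match hidden files unless dotglob is also set):
shopt -s globstar   # enable recursive ** matching
wc **/*.txt         # one wc invocation, totals line included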

Linux: Find a List of Files in a Directory recursively

I have a text file with one filename per row:
Interpret 1 - Song 1.mp3
Interpret 2 - Song 2.mp3
...
(about 200 filenames)
Now I want to search a folder recursively for these filenames, to get the full path for each filename in Filenames.txt.
How to do this? :)
(Purpose: I copied files to my MP3 player, but some of them are broken and I want to recopy them all without spending hours digging them out of my music folder.)
The easiest way may be the following:
cat orig_filenames.txt | while read -r file ; do find /dest/directory -name "$file" ; done > output_file_with_paths
A much faster way is to run the find command only once and use fgrep:
find . -type f -print0 | fgrep -zFf ./file_with_filenames.txt | xargs -0 -J % cp % /path/to/destdir
You can use a while read loop along with find:
filecopy.sh
#!/bin/bash
# Read each filename from the list and copy every match found
while IFS= read -r line
do
    find . -iname "$line" -exec cp '{}' /where/to/put/your/files \;
done < list_of_files.txt
Where list_of_files.txt is the list of files line by line, and /where/to/put/your/files is the location you want to copy to. You can just run it like so in the directory:
$ bash filecopy.sh
+1 for @jm666's answer, but the -J option doesn't work for my flavor of xargs, so I changed it to:
find . -type f -print0 | fgrep -zFf ./file_with_filenames.txt | xargs -0 -I{} cp "{}" /path/to/destdir/

How to only get file name with Linux 'find'?

I'm using find to get all the files in a directory, so I get a list of paths. However, I need only the file names, i.e. I get ./dir1/dir2/file.txt and I want to get file.txt.
In GNU find you can use the -printf parameter for that, e.g.:
find /dir1 -type f -printf "%f\n"
If your find doesn't have a -printf option you can also use basename:
find ./dir1 -type f -exec basename {} \;
Use -execdir, which automatically holds the current file in {}, for example:
find . -type f -execdir echo '{}' ';'
You can also use $PWD instead of . (on some systems it won't produce an extra dot at the front).
If you still get an extra dot, you can alternatively run:
find . -type f -execdir basename '{}' ';'
-execdir utility [argument ...] ;
The -execdir primary is identical to the -exec primary with the exception that utility will be executed from the directory that holds the current file.
When + is used instead of ;, {} is replaced with as many pathnames as possible for each invocation of utility. In other words, it prints all filenames on one line.
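As a quick sketch of that difference (how the names are grouped depends on your system's argument-length limits):
find . -type f -execdir echo '{}' ';'   # one ./name per line
find . -type f -execdir echo '{}' +     # many ./names per echo invocation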
If you are using GNU find
find . -type f -printf "%f\n"
Or you can use a programming language such as Ruby (1.9+):
$ ruby -e 'Dir["**/*"].each{|x| puts File.basename(x)}'
If you fancy a bash (at least version 4) solution:
shopt -s globstar
for file in **; do echo "${file##*/}"; done
If you want to run some action against the filename only, using basename can be tough.
For example this:
find ~/clang+llvm-3.3/bin/ -type f -exec echo basename {} \;
will just echo basename /my/found/path, which is not what we want if we need to act on the filename.
But you can then pipe the output to xargs. For example, to delete the files in one directory based on names found in another:
cd dirIwantToRMin;
find ~/clang+llvm-3.3/bin/ -type f -exec basename {} \; | xargs rm
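Note that this pipeline splits on whitespace. A NUL-safe sketch of the same idea, relying on GNU find's -printf, would be:
cd dirIwantToRMin
find ~/clang+llvm-3.3/bin/ -type f -printf '%f\0' | xargs -0 rm --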
On macOS (BSD find) use:
find /dir1 -type f -exec basename {} \;
As others have pointed out, you can combine find and basename, but by default the basename program will only operate on one path at a time, so the executable will have to be launched once for each path (using either find ... -exec or find ... | xargs -n 1), which may potentially be slow.
If you use the -a option on basename, then it can accept multiple filenames in a single invocation, which means that you can then use xargs without the -n 1, to group the paths together into a far smaller number of invocations of basename, which should be more efficient.
Example:
find /dir1 -type f -print0 | xargs -0 basename -a
Here I've included the -print0 and -0 (which should be used together), in order to cope with any whitespace inside the names of files and directories.
Here is a timing comparison between the xargs basename -a and xargs -n1 basename versions. (For the sake of a like-with-like comparison, the timings reported here are after an initial dummy run, so that both are done after the file metadata has already been copied to the I/O cache.) I have piped the output to cksum in both cases, just to demonstrate that the output is independent of the method used.
$ time sh -c 'find /usr/lib -type f -print0 | xargs -0 basename -a | cksum'
2532163462 546663
real 0m0.063s
user 0m0.058s
sys 0m0.040s
$ time sh -c 'find /usr/lib -type f -print0 | xargs -0 -n 1 basename | cksum'
2532163462 546663
real 0m14.504s
user 0m12.474s
sys 0m3.109s
As you can see, it really is substantially faster to avoid launching basename every time.
Honestly, the basename and dirname solutions are easier, but you can also check these out:
find . -type f | grep -oP "[^/]*$"
or
find . -type f | rev | cut -d '/' -f1 | rev
or
find . -type f | sed "s/.*\///"
-exec and -execdir are slow, xargs is king.
$ alias f='time find /Applications -name "*.app" -type d -maxdepth 5'; \
f -exec basename {} \; | wc -l; \
f -execdir echo {} \; | wc -l; \
f -print0 | xargs -0 -n1 basename | wc -l; \
f -print0 | xargs -0 -n1 -P 8 basename | wc -l; \
f -print0 | xargs -0 basename | wc -l
139
0m01.17s real 0m00.20s user 0m00.93s system
139
0m01.16s real 0m00.20s user 0m00.92s system
139
0m01.05s real 0m00.17s user 0m00.85s system
139
0m00.93s real 0m00.17s user 0m00.85s system
139
0m00.88s real 0m00.12s user 0m00.75s system
xargs's parallelism also helps.
Funnily enough, I cannot explain the last case, xargs without -n1.
It gives the correct result and it's the fastest ¯\_(ツ)_/¯
(basename is documented to take a single path argument, yet without -n1 xargs passes them all, 5000 here, in one call; this works only on macOS, not on Linux or OpenBSD...)
Some bigger numbers from a Linux system show how -execdir helps, though it is still much slower than a parallel xargs:
$ alias f='time find /usr/ -maxdepth 5 -type d'
$ f -exec basename {} \; | wc -l; \
f -execdir echo {} \; | wc -l; \
f -print0 | xargs -0 -n1 basename | wc -l; \
f -print0 | xargs -0 -n1 -P 8 basename | wc -l
2358
3.63s real 0.10s user 0.41s system
2358
1.53s real 0.05s user 0.31s system
2358
1.30s real 0.03s user 0.21s system
2358
0.41s real 0.03s user 0.25s system
I found a solution (on a makandracards page) that gives just the newest file name:
ls -1tr * | tail -1
(thanks go to Arne Hartherz)
I used it for cp:
cp $(ls -1tr * | tail -1) /tmp/
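A small hardening of the same idea: quoting the command substitution lets it survive a newest file whose name contains spaces (it still breaks on newlines in names):
cp "$(ls -1tr * | tail -1)" /tmp/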
