Recursively count files in all nested subdirectories on Linux [duplicate] - linux

This question already has answers here:
Recursively counting files in a Linux directory
(24 answers)
Closed 4 years ago.
How can I recursively count the number of files in each subdirectory on a Linux system? I know
tree
ncdu
These commands have really nice and informative output, but they do not count files. I was trying
find . -type d -print0 | xargs -0 {HERE I SHOULD DO SOMETHING}
But failed. The output should look like this:
00. Introduction: 6
04. Topic 1: 18
01. Topic 2: 14
02. Topic 3: 10
05. Details 4: 4
03. Conclusion: 6
There are spaces and special characters in the directory names.

After researching and testing, I've got:
find . -maxdepth 1 -type d -print0 | sort -z | \
while IFS= read -r -d '' i ; do \
echo -n "$i: " ; (find "$i" -type f | wc -l) ; done
Explanation
-maxdepth 1 - I need only one level of recursion
-type d - only directories
-print0 | while IFS= read -r -d '' i - I have spaces in directories. The -r option to read prevents backslash interpretation (usually used as a backslash newline pair, to continue over multiple lines or to escape the delimiters). Without this option, any unescaped backslashes in the input will be discarded. You should almost always use the -r option with read.
The most common exception to this rule is when -e is used, which uses Readline to obtain the line from an interactive shell. In that case, tab completion will add backslashes to escape spaces and such, and you do not want them to be literally included in the variable. This would never be used when reading anything line-by-line, though, and -r should always be used when doing so.
By default, read modifies each line read, by removing all leading and trailing whitespace characters (spaces and tabs, if present in IFS). If that is not desired, the IFS variable may be cleared, as in the example above.
The IFS variable is used in shells (Bourne, POSIX, ksh, bash) as the input field separator (or internal field separator). Essentially, it is a string of special characters which are to be treated as delimiters between words/fields when splitting a line of input.
The default value of IFS is space, tab, newline. (A three-character string.) If IFS is unset, it acts as though it were set to this default value. (This is presumably for simplicity in shells that do not support the $'...' syntax for special characters.) If IFS is set to an empty string (which is very different from unsetting it!) then no splitting will be performed.
In the read command, if multiple variable-name arguments are specified, IFS is used to split the line of input so that each variable gets a single field of the input. (The last variable gets all the remaining fields, if there are more fields than variables.)
sort -z - sort the NUL-delimited output of find in alphabetical order
do echo -n "$i: " - print the directory name and a colon, without a trailing newline
find "$i" -type f - find only the files inside each directory
wc -l - count the lines of the second find's output, i.e. the number of files
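Side note (my addition, not part of the original one-liner): if you do not want the top-level directory . itself to appear in the listing, GNU find (and most BSD finds) also accept -mindepth 1:
find . -mindepth 1 -maxdepth 1 -type d -print0 | sort -z | \
  while IFS= read -r -d '' i ; do \
  echo -n "$i: " ; (find "$i" -type f | wc -l) ; done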
Example
~/tmp$ tree
.
├── 1000
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_56c7fa8bb58cca0055b0efc2c5ad303d.jpg
│   ├── 138x116_cropped_c00f5791305b20d52e16e0f7a4c2e3d9.jpg
│   ├── 640x320_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   └── original_thumb.jpg
├── 10000
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_8ae863ac89a31bf9834085414215be36.jpg
│   └── original_thumb.jpg
├── 10001
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_8ae863ac89a31bf9834085414215be36.jpg
│   └── original_thumb.jpg
├── 10005
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_8ae863ac89a31bf9834085414215be36.jpg
│   └── original_thumb.jpg
├── 10006
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 640x320_cropped_8ae863ac89a31bf9834085414215be36.jpg
│   └── original_thumb.jpg
├── 10009
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 640x320_cropped_8ae863ac89a31bf9834085414215be36.jpg
│   └── original_thumb.jpg
├── 1001
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_56c7fa8bb58cca0055b0efc2c5ad303d.jpg
│   ├── 138x116_cropped_c00f5791305b20d52e16e0f7a4c2e3d9.jpg
│   └── original_thumb.jpg
├── 10011
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 640x320_cropped_8ae863ac89a31bf9834085414215be36.jpg
│   └── original_thumb.jpg
├── 10015
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_8ae863ac89a31bf9834085414215be36.jpg
│   └── original_thumb.jpg
├── 10016
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_8ae863ac89a31bf9834085414215be36.jpg
│   ├── 640x320_cropped_d4dcbfaafb98dafcbc594b020ce7c54b.jpg
│   └── original_thumb.jpg
├── 10017
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_8ae863ac89a31bf9834085414215be36.jpg
│   └── original_thumb.jpg
├── 10018
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_8ae863ac89a31bf9834085414215be36.jpg
│   └── original_thumb.jpg
├── 10019
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_8ae863ac89a31bf9834085414215be36.jpg
│   ├── 640x320_cropped_d4dcbfaafb98dafcbc594b020ce7c54b.jpg
│   └── original_thumb.jpg
├── 1002
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_56c7fa8bb58cca0055b0efc2c5ad303d.jpg
│   ├── 138x116_cropped_c00f5791305b20d52e16e0f7a4c2e3d9.jpg
│   ├── 640x320_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   └── original_thumb.jpg
├── 10021
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_8ae863ac89a31bf9834085414215be36.jpg
│   └── original_thumb.jpg
├── 10025
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 640x320_cropped_6450e078c12f532b29ba57eeb58ca8b3.jpg
│   ├── 640x320_cropped_8ae863ac89a31bf9834085414215be36.jpg
│   └── original_thumb.jpg
├── 10028
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_8ae863ac89a31bf9834085414215be36.jpg
│   └── original_thumb.jpg
├── 10029
│   ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
│   ├── 138x116_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_6c8e104b1fc8e31695beb9e950830d64.jpg
│   ├── 640x320_cropped_8ae863ac89a31bf9834085414215be36.jpg
│   └── original_thumb.jpg
└── 1003
    ├── 138x116_cropped_50198398faa3c0d168c176824edd4ff7.jpg
    ├── 138x116_cropped_56c7fa8bb58cca0055b0efc2c5ad303d.jpg
    ├── 138x116_cropped_c00f5791305b20d52e16e0f7a4c2e3d9.jpg
    ├── 640x320_cropped_50198398faa3c0d168c176824edd4ff7.jpg
    └── original_thumb.jpg
19 directories, 89 files
Using the one-liner explained above:
~/tmp$ find . -maxdepth 1 -type d -print0 | sort -z | \
> while IFS= read -r -d '' i ; do \
> echo -n "$i: " ; (find "$i" -type f | wc -l) ; done
.: 89
./1000: 5
./10000: 5
./10001: 4
./10005: 4
./10006: 5
./10009: 5
./1001: 4
./10011: 5
./10015: 4
./10016: 5
./10017: 4
./10018: 4
./10019: 5
./1002: 5
./10021: 5
./10025: 6
./10028: 4
./10029: 5
./1003: 5
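An alternative sketch (my addition, not from the original answer) that relies on GNU find's -printf: print each file's parent directory, bucket by top-level subdirectory, and count. Caveats: subdirectories containing no files will not appear, and file or directory names containing newlines would skew the counts.
find . -mindepth 2 -type f -printf '%h\n' | cut -d/ -f2 | sort | uniq -c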

Related

Bash brace expansion - operand behavior

When using brace expansion with certain commands, the actual behavior differed from what I expected: a member within one brace was evaluated as an argument in the other brace member's expansion.
For instance,
$ mkdir -p {folder1,folder2,folder3}/{folderA,folderB,folderC}
Works as expected -
$ tree .
.
├── folder1
│   ├── folderA
│   ├── folderB
│   └── folderC
├── folder2
│   ├── folderA
│   ├── folderB
│   └── folderC
└── folder3
    ├── folderA
    ├── folderB
    └── folderC
However, if we do
$ cp -r folder1/ folder2/{folderA,folderB}
Instead of folder1 being copied to both folder2/folderA and folder2/folderB, 'folderA' is interpreted as a second source. Thus we get -
.
├── folder1
│   ├── folderA
│   ├── folderB
│   └── folderC
├── folder2
│   ├── folderA
│   ├── folderB
│   │   ├── folder1
│   │   │   ├── folderA
│   │   │   ├── folderB
│   │   │   └── folderC
│   │   └── folderA
│   └── folderC
└── folder3
    ├── folderA
    ├── folderB
    └── folderC
Can anyone explain why this is the case? I would have thought the above to be evaluated as -
$ cp -r folder1/ folder2/folderA
$ cp -r folder1/ folder2/folderB
Brace expansion doesn't result in multiple commands, it's just expanded in place in the original command. So the result is
cp -r folder1/ folder2/folderA folder2/folderB
When you get more than 2 arguments to cp, the last is the destination folder, the rest are source files and folders.
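You can see the in-place expansion for yourself by prefixing the command with echo (just a quick check, not part of the original answer):
$ echo cp -r folder1/ folder2/{folderA,folderB}
cp -r folder1/ folder2/folderA folder2/folderB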
If you want multiple commands, you can use an explicit loop:
for dest in folder2/{folderA,folderB}; do
    cp -r folder1/ "$dest"
done

Copy recursively multiple folders as symbolic links except one

I need to recursively copy multiple folders in a directory, with their files as symbolic links. For example:
[root#tests : /app/dirs]#
|-- dir1
|-- dir2
|-- dir3
|-- dir4
|-- dir5
| |-- a.txt
| |-- b.o
| `-- c.txt
I want to copy the content of /app/dirs to an existing directory in the same folder, with symbolic links. I tried this:
cp -as !(/app/dirs/dir3) ./dir3
I tried to exclude dir3 because you can't copy a folder into itself, but now I get an error saying that the symbolic links can't be created for the files in dir5.
The expected result:
/home/barmar/test.dir
├── dir1
├── dir2
├── dir3
│   ├── dir1
│   ├── dir2
│   ├── dir4
│   └── dir5
│       ├── a.txt -> /home/barmar/test.dir/dir5/a.txt
│       ├── b.o -> /home/barmar/test.dir/dir5/b.o
│       └── c.txt -> /home/barmar/test.dir/dir5/c.txt
├── dir4
└── dir5
    ├── a.txt
    ├── b.o
    └── c.txt
Is there any way to achieve the expected result?
!(/app/dirs/dir3) doesn't expand as you think. echo !(/app/dirs/dir3) shows that it includes dir3. If the current directory is /app/dirs, you can use /app/dirs/!(dir3) to get what you want.
shopt -s extglob # needed for the extended wildcard
cd /app/dirs
cp -as /app/dirs/!(dir3) dir3
This creates
/home/barmar/test.dir
├── dir1
├── dir2
├── dir3
│   ├── dir1
│   ├── dir2
│   ├── dir4
│   └── dir5
│       ├── a.txt -> /home/barmar/test.dir/dir5/a.txt
│       ├── b.o -> /home/barmar/test.dir/dir5/b.o
│       └── c.txt -> /home/barmar/test.dir/dir5/c.txt
├── dir4
└── dir5
    ├── a.txt
    ├── b.o
    └── c.txt
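If you are unsure what the extended glob will match, an echo preview (my addition, using the same extglob the answer relies on) shows the expansion before anything is copied:
shopt -s extglob
cd /app/dirs
echo /app/dirs/!(dir3)    # should list every entry of /app/dirs except dir3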

Linux: Batch rename multiple files to parent dir + suffix in order of name

I need to batch rename multiple images and want to use the parent directory as the base name. To prevent one file overwriting another, a suffix must be added. The renaming must follow the timestamp order of the files, because the 'first' file is used as a featured image for the site.
Tree:
└── main
    ├── white tshirt
    │   ├── IMG_1.jpg
    │   ├── IMG_2.jpg
    │   ├── IMG_3.jpg
    │   └── IMG_4.jpg
    ├── black tshirt
    │   ├── IMG_1.jpg
    │   ├── IMG_2.jpg
    │   ├── IMG_3.jpg
    │   └── IMG_4.jpg
    └── red tshirt
        ├── IMG_1.jpg
        ├── IMG_2.jpg
        ├── IMG_3.jpg
        └── IMG_4.jpg
Goal:
└── main
    ├── white tshirt
    │   ├── white-tshirt-1.jpg
    │   ├── white-tshirt-2.jpg
    │   ├── white-tshirt-3.jpg
    │   └── white-tshirt-4.jpg
    ├── black tshirt
    │   ├── black-tshirt-1.jpg
    │   ├── black-tshirt-2.jpg
    │   ├── black-tshirt-3.jpg
    │   └── black-tshirt-4.jpg
    └── red tshirt
        ├── red-tshirt-1.jpg
        ├── red-tshirt-2.jpg
        ├── red-tshirt-3.jpg
        └── red-tshirt-4.jpg
Replacing spaces with dashes is not required, but preferred. Platform: Debian 8
I think this should do the job:
#!/bin/sh
# Run this inside the main directory; it renames the files of every
# subdirectory to <dirname-with-dashes>-<n>.<ext>, oldest file first.
for dir in *; do
    if [ ! -d "$dir" ]; then
        continue
    fi
    cd "$dir" || continue
    pref=$(echo "$dir" | tr ' ' -)    # replace spaces with dashes for the prefix
    i=1
    ls -tr | while read f; do        # -tr: oldest first, so numbering follows timestamps
        ext=$(echo "$f" | sed 's/.*\.//')
        mv "$f" "${pref}-${i}.$ext"
        i=$(expr $i + 1)
    done
    cd ..
done
Invoke the script inside your main directory and make sure it contains only your target folders. Also make sure your files' names do not contain the backslash character '\'.
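To check the ordering before running the script, you can list one folder oldest-first yourself (a quick sanity check; the path assumes the layout from the tree above):
cd "main/white tshirt"    # drop the main/ prefix if you are already inside main
ls -tr                    # files listed oldest first - this is the numbering order the script follows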

Batch rename images to folder name + sequential number [duplicate]

This question already has answers here:
Batch renaming files with Bash
(10 answers)
Closed 6 years ago.
I want to rename the files in a directory with subdirectories to their parent directory's name + sequential numbers.
For example:
hello-images/
├── first-black
│   ├── full_b200056_m.png
│   ├── full_b200056_x_DSC01973.JPG
│   ├── full_b200056_x_DSC01978.JPG
│   ├── full_b200056_x_DSC01988.JPG
│   ├── full_b200056_x_DSC01994.JPG
│   ├── full_b200056_x_DSC02003.JPG
├── second-atlas
│   ├── full_b200035_m1.png
│   ├── full_b200035_x_3926.JPG
│   ├── full_b200035_x_3928.JPG
│   ├── full_b200035_x_3931.JPG
│   ├── full_b200035_x_3944.JPG
Desired result:
hello-images/
├── first-black
│   ├── first-black_1.png
│   ├── first-black_2.JPG
│   ├── first-black_3.JPG
│   ├── first-black_4.JPG
│   ├── first-black_5.JPG
│   ├── first-black_6.JPG
├── second-atlas
│   ├── second-atlas_1.png
│   ├── second-atlas_2.JPG
│   ├── second-atlas_3.JPG
│   ├── second-atlas_4.JPG
│   ├── second-atlas_5.JPG
From the hello-images directory, do:
for d in */; do i=1; for f in "$d"/*.*; do echo mv -- "$f" "$d${d%/}_${i}.${f##*.}"; ((i++)); done; done
This is a dry run; it will show the mv commands that would be run. If satisfied with the changes, remove echo to actually perform them:
for d in */; do i=1; for f in "$d"/*.*; do mv -- "$f" "$d${d%/}_${i}.${f##*.}"; ((i++)); done; done
Expanded form:
for d in */; do
    i=1
    for f in "$d"/*.*; do
        mv -- "$f" "$d${d%/}_${i}.${f##*.}"
        ((i++))
    done
done
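The two parameter expansions doing the work here, shown in isolation with example values taken from the tree above:
d='first-black/'
f='first-black/full_b200056_m.png'
echo "${d%/}"      # -> first-black   (shortest trailing match of / removed)
echo "${f##*.}"    # -> png           (longest leading match of *. removed, leaving the extension)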

Linux/shell - Remove all (sub)subfolders from a directory except one

I've inherited a structure like the below, a result of years of spaghetti code...
gallery
├── 1
│   ├── deleteme1
│   ├── deleteme2
│   ├── deleteme3
│   └── full
│       ├── file1
│       ├── file2
│       └── file3
├── 2
│   ├── deleteme1
│   ├── deleteme2
│   ├── deleteme3
│   └── full
│       ├── file1
│       ├── file2
│       └── file3
└── 3
    ├── deleteme1
    ├── deleteme2
    ├── deleteme3
    └── full
        ├── file1
        ├── file2
        └── file3
In reality, this folder is thousands of subfolders large. I only need to keep ./gallery/{number}/full/* (i.e. the full folder and all files within it, from each numbered directory within gallery); everything else is no longer required and needs to be deleted.
Is it possible to construct a one-liner to handle this? I've experimented with find/maxdepth/prune but could not find an arrangement which met my needs.
(Update: To clarify, all folders contain files - none are empty)
Using PaddyD's answer, you can first remove the unwanted files and then delete the directories that are left empty:
find . -type f -not -path "./gallery/*/full/*" -exec rm {} + && find . -type d -empty -delete
This can easily be done with bash extglobs, which allow matching all files that don't match a pattern:
shopt -s extglob
rm -ri ./gallery/*/!(full)
How about:
find . -type d -empty -delete
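Before running either destructive command above, it can be worth previewing what find would act on (my addition; -print just lists the matches):
find . -type f -not -path "./gallery/*/full/*" -print    # files that would be removed
find . -type d -empty -print    # directories that are already empty (more will qualify once the files are gone)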
