Remove dirs from parent dir without deleting parent directory (Linux)

I am using the following command to remove all dirs older than 1 minute from the following path:
find /myhome/me/xyz/[0-9][0-9]/[0-9][0-9][0-9][0-9]/my_stats/ -mmin +1 -exec rm -rf {} \;
Folder structure:
/home/myhome/me/xyz/<2 digit name>/<4 digit name>/my_stats/
There could be multiple dirs or a single dir inside my_stats. The issue is that the find command also deletes the my_stats dir itself.
Is there a way to solve this?
Thanks

If I understand your question correctly, you are probably looking for this:
find /myhome/me/xyz/[0-9][0-9]/[0-9][0-9][0-9][0-9]/my_stats/ -mindepth 1 -maxdepth 1 -type d -mmin +1 -exec rm -rf {} \;
The -mindepth 1 parameter is what excludes the my_stats directory from the listing, as it is located at depth 0.
The -maxdepth 1 parameter keeps find from descending into subdirs of subdirs (you are deleting their parents recursively anyway).
The -type d parameter limits the output to directories only, not ordinary files.
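If you want to check what would be removed before wiring in the destructive action, a minimal dry-run sketch with the same tests (GNU find assumed) is:
find /myhome/me/xyz/[0-9][0-9]/[0-9][0-9][0-9][0-9]/my_stats/ -mindepth 1 -maxdepth 1 -type d -mmin +1 -print
Once the listing shows only the directories you expect, replace -print with the -exec rm -rf {} \; action.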

Related

How to move jpg and jpeg files whose size is greater than 10kb [duplicate]

I have some automated downloads in a proprietary linux distro.
They go to a temp scratch disk. I want to move them to the main RAID array when they're finished. The best way I can see to do this is to check the folders on the disk to see if the contents have changed in the last minute. If not, then it's probably finished downloading, and it can be moved.
There could be hundreds of folders, or just one, in this location, and it's all going to the same place. What's the best way to write this?
I can get a list of folder sizes with
du -h directory/name
The folders can contain multiple files anywhere from 1.5mb to 10GB
Temp Loc: /volume2/4TBScratch/Processing
Dest Loc when complete: /volume1/S/00 Landing
EDIT:
Using this:
find /volume2/4TBScratch/Processing -mindepth 1 -type d -not -mmin +10 -exec mv "{}" "/volume1/S/00 Landing" \;
find: `/volume2/4TBScratch/Processing/test': No such file or directory
4.3#
Yet it DOES move the relevant folders and all files. But the error worries me that something might go wrong in the future... Is it because there are multiple files and it's running the same move command for EACH file or folder in the root folder? But since it moves it all on the first iteration, it can't find it on the next ones?
EDIT2:
Using Rsync
4.3# find /volume2/4TBScratch/Processing -mindepth 1 -type d -not -mmin +10 -exec rsync --remove-source-files "{}" "/volume1/S/00 Landing" \;
skipping directory newtest
skipping directory erw
RESOLVED: EDIT3
Resolved with the help in the comments below. Final script looks like this:
find /volume2/4TBScratch/Processing -mindepth 1 -type d -not -mmin +10 -exec rsync -a --remove-source-files "{}" "/volume1/S/00 Landing" \;
find /volume2/4TBScratch/Processing -depth -type d -empty -delete
The rsync command moves the folders and files but leaves empty directories behind in the source; the next command finds those empty folders and removes them.
Thanks all!
You can use GNU find with the -size option to detect files/folders under a certain size, and run mv via the -exec action to move them to the destination directory. The syntax is
find /volume2/4TBScratch/Processing -maxdepth 1 -type d -size -10G -exec mv "{}" "/volume1/S/00 Landing" \;
Using rsync
find /volume2/4TBScratch/Processing -maxdepth 1 -type d -size -10G -exec rsync -a --remove-source-files "{}" "/volume1/S/00 Landing" \;
The size is given with a - sign to indicate less than the mentioned size, which in this case is 10GB. A note on each of the flags used:
-type d -> Match only the folders in the source path.
-maxdepth 1 -> Look only at the top level of the source directory, without recursing.
-exec -> Execute the command that follows it.
Alternatively, if you want to find files that were last modified within a certain time (in minutes), find has a -mmin option which can be set to a value. E.g. -mmin -5 would return files modified within the last five minutes.
So add -mmin with whatever value of x you need and check that the expected directories are listed; then you can add the -exec action for moving the directories:
find /volume2/4TBScratch/Processing -maxdepth 1 -type d -mmin -2 -size -10G
Refer to the GNU find documentation on finding files according to size for how this works.
Note: the double quotes ("") are added to keep Bash from splitting names that contain spaces.
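Once the preview lists the directories you expect, the completed command is the same line with the -exec action appended; a sketch assuming the same placeholder values (-mmin -2 and -size -10G, adjust to your requirement):
find /volume2/4TBScratch/Processing -maxdepth 1 -type d -mmin -2 -size -10G -exec mv "{}" "/volume1/S/00 Landing" \;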

How to loop through multiple folders and subfolders and remove files whose names start with abc.txt and are 14 days old

I have folders and subfolders. I need to loop through each folder and subfolder and remove or move files whose names start with abc.txt and that are 14 days old to a temporary folder. In my folder tree, the file 'abc.txt' may be inside a folder or a subfolder.
I have used the code below, but it is not working.
I took the folder paths into a list file using the command below:
find $_filepath -type d >> folderpathlist.txt
I pass the path list to the code below to search and remove or move files to the temporary folder:
find folderpathlist.txt -name "abc*" -mtime \+14 >>temp/test/
How do I achieve this scenario?
You want to find files: -type f
that start with abc.txt: -name "abc.txt*"
that are 14 days old: -mtime +14
and move them to a dir.: -exec mv {} /tmp \;
and to see what moved: -print
So the final command is:
find . -type f -name "abc.txt*" -mtime +14 -exec mv {} /tmp \; -print
Adjust the directory as required.
Note that mtime is the modification time, so "14 days old" means the last modification was more than 14 days ago.
Note 2: the {} in the -exec is replaced by each filename found.
Note 3: \; indicates the termination of the command inside the -exec
Note 4: find will recurse into sub-directories anyway. No need to list the directories and loop on them again.
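To target the temporary folder from the question instead of /tmp, a sketch under the assumption that temp/test/ already exists and that $_filepath is the top-level path used in the question:
find "$_filepath" -type f -name "abc.txt*" -mtime +14 -exec mv {} temp/test/ \; -print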

How to delete files under subdirectories without deleting the subdirectories themselves in Linux

I have the following directory structure:
/archive/file1.csv
/archive/file2.csv
/archive/myfile/my.txt
/archive/yourfile/your.txt
I want to delete all files under /archive but not its subfolders, so after deletion, the directory structure should look like:
/archive/
/archive/myfile/
/archive/yourfile/
I have tried the following two commands, but the files under the subfolders are not deleted (i.e. my.txt and your.txt). Does anyone know why?
find -L /archive ! -type d -exec rm -rfv {} +
find -L /archive -type f -exec rm -rfv {} +
Use find:
$ find . ! -type d -delete
make sure you're in the right path.
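Applied to the /archive path from the question, a sketch that previews first and only then deletes:
find /archive ! -type d -print
find /archive ! -type d -delete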

Bash - Performing the same command on several directories

I want to create a script that will delete any files older than 7 days on a specified list of directories, but I am wondering what would be the best way to go about it...
I want to perform the following command on all directories specified:
find DIRECTORY_PATH -type f -mtime +7 -exec rm {} \;
Maybe an array holding a list of directories, looping through each element in the array and performing the find command on it?
Any help/advice would be appreciated.
You can store all the directories in a file, say dirs.txt, and loop through it:
while IFS= read -r dir
do
    find "$dir" -type f -mtime +7 -exec rm {} \;
done < dirs.txt
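If you prefer the array approach mentioned in the question, a minimal sketch (the directory paths are placeholders to replace with your own):
dirs=("/path/to/dir1" "/path/to/dir2")
for dir in "${dirs[@]}"; do
    find "$dir" -type f -mtime +7 -exec rm {} \;
done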

How to delete all files older than 3 days when "Argument list too long"?

I've got a log file directory that has 82000 files and directories in it (about half and half).
I need to delete all the files and directories which are older than 3 days.
In a directory that has 37000 files in it, I was able to do this with:
find * -mtime +3 -exec rm {} \;
But with 82000 files/directories, I get the error:
/usr/bin/find: Argument list too long
How can I get around this error so that I can delete all files/directories that are older than 3 days?
To delete all files and directories within the current directory:
find . -mtime +3 -print0 | xargs -0 rm -Rf
Or alternatively, more in line with the OP's original command:
find . -mtime +3 -exec rm -Rf -- {} \;
You can also use:
find . -mindepth 1 -mtime +3 -delete
where -mindepth 1 keeps the target directory itself from being deleted.
Another solution for the original question, especially useful if you want to remove only some of the older files in a folder, would be something like this:
find . -name "*.sess" -mtime +100
and so on. The quotes block shell wildcard expansion, so find itself does the matching and can handle millions of files :)
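Once that preview lists only the files you want gone, the same expression can take the -delete action (GNU find), for example:
find . -name "*.sess" -mtime +100 -delete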
