Can't find a way to delete files that match a date pattern - linux

I'm trying to delete all files in a folder structure (recursively) except the newest one for each month.
In other words, only keep one file from each month in each folder.
On a Linux system (bash) ... ;-) (or, to be more precise, on a Synology NAS).
Many thanks for your help!
Alex

Please be careful! I take no responsibility!
Try find:
Remove files which are older than 7 days:
find . -type f -ctime +7 -delete
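The find above only filters by age; for the keep-one-file-per-month part, a possible sketch (it assumes GNU findutils for -printf plus sort/awk, which a stock Synology may not ship, and filenames without tabs or newlines) is to list, per directory and per calendar month, every file except the newest one, and review that list before deleting anything:
find . -type f -printf '%h\t%TY-%Tm\t%T@\t%p\n' \
  | sort -t$'\t' -k1,1 -k2,2 -k3,3nr \
  | awk -F'\t' '{ key = $1 FS $2; if (seen[key]++) print $4 }'
The sort puts the newest file of each (folder, month) group first; awk skips that one and prints the rest as deletion candidates. If the list looks right, you could append a deletion step, e.g. pipe it into xargs -d '\n' rm --.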

Finding and following symbolic links but without deleting them

The current find command is used to find and delete outdated files and directories. What counts as expired is defined by a given properties file, which also gives the destination.
If the properties file says…
"/home/some/working/directory;.;180"
…then we want files and empty subdirectories deleted after 180 days.
The original command is…
"find ${var[0]} -mtime +${var[2]} -delete &"
…but I now need to modify it, because we've discovered it has been deleting symbolic links that existed in the specified sub-directories once they passed the expiration date given in the properties file. The path and expiration-time variables are taken from the properties file (as shown above).
I have been testing using…
"find -L"
…to follow the symbolic links to make sure this clean up command reaches the destinations, as desired.
I have also been testing using…
"\! -type l"
…to ignore deleting symbolic links, so the command I've been trying is…
"find -L ${var[0]} ! -type l -mtime +${var[2]} -delete &"
…but I haven't achieved the desired results. Help please; I am still fresh to Linux and my research hasn't led me to an answer. Thank you for your time.
Change
\! -type l
to
\! -xtype l
find -L ${var[0]} \! -xtype l -mtime +${var[2]} -delete &
With -L, find reports the type of each link's target, so -type l only matches broken links; -xtype l, on the other hand, is true for the symbolic links themselves, which is exactly what you want to exclude here.
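A quick sanity check before re-adding -delete (the variables come from the properties file, as above): run the expression with -print first, e.g.
find -L ${var[0]} \! -xtype l -mtime +${var[2]} -print | head
and confirm that the symbolic links no longer show up in the list of files that would be deleted.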

bash delete older files

I have a unique requirement: find files that are more than 2 years old and delete them, and not only the files but also the corresponding directories once they are empty. I have written most of the logic; the only thing still pending is this: when I delete a particular file from a directory, how can I then delete that directory once it becomes empty? When I delete the file, the directory's ctime/mtime gets updated accordingly. How do I target those corresponding older directories and delete them?
Any pointers will be helpful.
Thanks in advance.
I would do something like this:
find /path/to/files* -mtime +730 -delete
-mtime +730 finds files which are older than 730 days.
Please be careful with this kind of command, though: run find /path/to/files* -mtime +730 first and check that these are really the files you want to delete!
Edit:
Now that you have deleted the files from the directories, their mtime has changed, so -mtime +730 won't match them.
To delete all empty directories that you have recently altered:
find . -type d -mmin -60 -empty -delete
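If you would rather not depend on that -mmin window, another possible sketch (GNU find assumed; /path/to/files is a placeholder) is to do it in two passes: delete the old files, then prune whatever directories are now empty, regardless of their own timestamps:
find /path/to/files -type f -mtime +730 -delete
find /path/to/files -mindepth 1 -type d -empty -delete
Here -mindepth 1 keeps the starting directory itself from being removed, and because -delete processes a directory's contents before the directory, nested empty directories collapse in the same pass.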

I want to delete all files and directories in Linux

I have to delete all directories and files that are more than 3 years old, counting back from the current date. What would be the specific command for that in Linux?
It depends on how you define "3 years back": created, last modified... If that's last modified, you can do something like this to list those files:
find /directory -mtime +1095
/directory is the starting directory; +1095 means modified more than 1095 days ago (365*3).
If you're okay with the list, then add the -delete option:
find /directory -mtime +1095 -delete
Be careful not to put -delete before -mtime; there's a specific order there. See man find for more information.
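To see why the order matters: find evaluates its expression left to right, and -delete is both an action and a test that always returns true, so a command like
find /directory -delete -mtime +1095
(do not run this) would remove everything under /directory before -mtime is ever checked. Keeping the -mtime test first, as above, makes it act as the filter.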

Is there any way to find files changed after some date in a whole project's code?

See, I am working on one BIG project's source code. Now I want to know which files were modified after some date.
Is there any command or any way to get that?
I have tried
# ls -R -l
but it shows all files with their last-modified date, and I want to filter this output by some date ...
So is there any way to do this in Linux? Is there any tool available for this?
# set a timestamp on a reference file
touch --date "2011-12-31" /tmp/foo
# Find files newer than 2011/Dec/31, in /some/files
find /some/files -newer /tmp/foo
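The same reference file works in the other direction too; for example, to list files not modified since that date:
find /some/files ! -newer /tmp/foo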
Use the find command with the -mtime argument.
For example, to list files changed in the last 7 days:
find / -type f -mtime -7
For a finer-grained search you may try the -mmin argument. See an example discussed on another SE site: Find All files older than x minutes
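For example, to list files whose contents changed within the last 90 minutes (the path and the window are just placeholders):
find /some/files -type f -mmin -90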
You should use find with the -newerXY option. Here X refers to a timestamp of the file being examined and Y to how the reference is interpreted:
m – modification time (used as X)
t – the reference is interpreted directly as a time (used as Y)
All files modified after 2022-12-01 (inclusive):
find . -type f -newermt 2022-12-01
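Two -newermt tests can also be combined into a date range; for example, files modified on or after 2022-12-01 but before 2023-01-01:
find . -type f -newermt 2022-12-01 ! -newermt 2023-01-01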

Two Questions on Rsync - rsync by date and by file name

I have two questions with respect to rsync:
1: I have a bunch of files which are incremented by day of the year. Ex: file.txt.81, file.txt.82, etc. Now, these files are in different directories:
data1/file.txt.81
data1/file.txt.82
data2/file2.txt.81
data2/file2.txt.82
How can I have rsync copy only the *.82 files and not even touch the other files?
2: Now I have a similar directory structure to the one above. How can I rsync all files that have been modified on or after a specific day?
Thanks
Here is the answer for #1:
rsync -avz --include "**/" --include='*.82' --exclude='*' /path/from /path/to
This will recursively (-a) include the directories and search them for anything matching *.82, excluding everything else. Note that the include/exclude patterns should be quoted so the shell doesn't expand them before rsync sees them. You can find more info in man rsync; look for the exclude/filter pattern rules.
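As with the find commands above, it is worth previewing first: rsync's -n (--dry-run) flag shows what would be transferred without copying anything, e.g.
rsync -avzn --include "**/" --include='*.82' --exclude='*' /path/from /path/to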
For #2 I would do it with find and -mmin/-mtime. To find files modified in the past 60 minutes with names matching *.82, this should work:
sudo find /path/from -mmin -60 -type f -name '*.82'
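One way to wire that into rsync for #2 (just a sketch; the date, paths, and list file are placeholders, and GNU find with -newermt is assumed):
cd /path/from
find . -type f -newermt 2024-01-15 > /tmp/modified.list
rsync -avz --files-from=/tmp/modified.list . /path/to
rsync then copies only the listed files, recreating their relative paths under /path/to.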
