Delete directories older than X days - linux

So I have looked at every single script on here regarding deleting directories older than 14 days. The script I wrote works with files, but for some reason it is not deleting the directories. Here is my script:
#!/bin/bash
find /TBD/* -mtime +1 | xargs rm -rf
This code successfully deleted the FILES inside /TBD, but it left two directories. I checked the timestamps on them and both are at least 2 days past their last modification, specifically Dec 16 16:10, so I can't figure this out. The crontab entry I have runs this script every minute and logs the output, and the log only shows:
+ /scripts/deletebackups.sh: :2:BASH_XTRACEFD=3xargs rm -rf
+ /scripts/deletebackups.sh: :2: BASH_XTRACEFD=3find /TBD/contents TBD/contents -mtime +1
I wrote 'contents' since the real names are actually people's names on our PXE server. I checked every file and folder INSIDE these two directories, and their timestamps match the parent directory, as they should, but it's still not deleting.
Could it be a permissions thing? I wrote the script using sudo nano deletebackups.sh
When I run ls on /TBD, the far-left column shows
drwxr-xr-x 3 hscadministrator root 4096 Dec 16 16:10 for each of the two directories that won't delete.
I'm not overly familiar with what all those letters mean.
Other iterations of this command I have already attempted:
find /TBD/* -mtime +1 rm -r {} \;

To delete directories in /TBD older than 1 day:
find /TBD -mtime +1 -type d | xargs rm -f -r
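If any of the directory names contain spaces or other odd characters, the plain pipe can split them incorrectly before rm sees them. A null-delimited variant is safer (a sketch, assuming GNU find and xargs):
find /TBD -mindepth 1 -maxdepth 1 -type d -mtime +1 -print0 | xargs -0 rm -rf
Here -mindepth 1 -maxdepth 1 restricts the match to the immediate children of /TBD, mirroring the original /TBD/* glob.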

Add -exec and -f to your find:
find /TBD/* -mtime +1 -exec rm -rf {} \;
Note: if you're looking to delete files older than 14 days, you need to change the -mtime value:
-mtime +14
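Putting the suggestions together, a minimal corrected version of the script might look like this (the /TBD path and 14-day cutoff come from the question; everything else is the proposed fix, not the poster's exact code):
#!/bin/bash
# Remove directories directly under /TBD not modified in the last 14 days.
# -type d restricts the match to directories; the + terminator batches
# many paths into one rm invocation instead of forking rm per match.
find /TBD -mindepth 1 -maxdepth 1 -type d -mtime +14 -exec rm -rf {} +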


Cron Job email find: missing argument to `-exec'

I'm setting up a web server with Plesk on Ubuntu 18.04, and I would like to use part of the available space to store security footage. I succeeded in automatically uploading the photos and videos to the correct folder, but the problem is that they are not automatically removed, so the server fills up with security images. I upload the footage to a folder on the server that is also available from the internet (secured). I did some research for a cron job that automatically deletes files older than 7 days and found this:
find /var/www/vhosts/path to files/* -mtime +7 -exec rm -f {} \;
I also found that you can put the command in a file, for example delete-files, which can then be scheduled with crontab -e. (Yes, I made it executable ;-)
I added this cron job to run every hour and set it up so that I receive notifications from cron. Now, however, I get the following output: find: missing argument to `-exec'
Is there anything else that I need to share? Like logs?
Change
find /var/www/vhosts/path to files/* -mtime +7 -exec rm -f {} \;
to
find /var/www/vhosts/path to files/ -mtime +7 -exec rm -f {} \;
The * is unnecessary in the path.
Can you try this as well?
find /var/www/vhosts/path to files/ -mtime +7 | xargs rm -f
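If the real path genuinely contains spaces ('path to files' here reads like a placeholder), it also needs quoting, or the shell will split it into three separate arguments before find runs. Something like this sketch, with -type f added so directories are left alone:
find '/var/www/vhosts/path to files/' -type f -mtime +7 -exec rm -f {} \;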

Deleting files after 7 days not working

I am trying to delete all files that are older than 7 days. The command runs, but not correctly.
find '/files/tem_data/' -mtime +7 -exec rm -rf {} \+
It does delete files, but it's not accurate. I'm checking with:
ls -Artl | head -n 2
The find does delete files, but when I then run the ls command, the output still contains files that should have been deleted. For example, today is November 7th, so the find should delete all files from before November 1st. It does not: it leaves files from October 30 and 31. How can I delete files that are older than 7 days?
If I run the find command again about 3 minutes later, it deletes the October 30 files whose times are 3 minutes after the first run.
From man find:
-atime n
File was last accessed n*24 hours ago. When find figures out how many
24-hour periods ago the file was last accessed, any fractional part is
ignored, so to match -atime +1, a file has to have been accessed at least
two days ago.
This means that your command actually deletes files that were accessed 8 or more days ago.
Since the time now is
$ date
Tue Nov 7 10:29:29 PST 2017
find will require files to be older than:
$ date -d 'now - 8 days'
Mon Oct 30 11:29:05 PDT 2017
In other words, leaving some files from Oct 30 is expected and documented behavior.
To account for find rounding down, simply use -mtime +6 instead.
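If you would rather express the cutoff as a calendar point instead of rounded 24-hour periods, GNU find's -newermt test can be negated to the same effect (a sketch, not what the original poster ran):
# delete regular files whose mtime is at or before '7 days ago'
find /files/tem_data/ -type f ! -newermt '7 days ago' -exec rm -f {} +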
This is not the exact answer, but you can try this as a sample:
find /path/to/ -type f -mtime +7 -name '*.gz' -execdir rm -- '{}' \;
An alternative, and also faster, command uses -execdir's + terminator instead of \;:
find /path/to/ -type f -mtime +7 -name '*.gz' -execdir rm -- '{}' +
or
find /path/to/ -type f -mtime +7 -name '*.gz' -delete
find: the Unix command for finding files, directories, links, etc.
/path/to/: the directory to start your search in.
-type f: only find regular files.
-name '*.gz': only list files whose names end in .gz.
-mtime +7: only consider files with a modification time older than 7 days.
-execdir rm -- '{}' \;: for each result found, run rm in the matched file's directory; the {} part is where the find result gets substituted in.
--: marks the end of rm's options, avoiding errors for filenames that start with a hyphen.
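One caution with the -delete form: GNU find evaluates the expression left to right, and -delete also implies -depth, so it must come after the filtering tests. A quick illustration:
# correct: only .gz files older than 7 days are removed
find /path/to/ -type f -mtime +7 -name '*.gz' -delete
# WRONG: -delete would act before the tests and remove everything it can
# find /path/to/ -delete -type f -mtime +7 -name '*.gz'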

Deleting files that are older than one day [duplicate]

I have a server which creates several log files in its log directory. This logging mechanism eats up a lot of disk space on my server. I want to write a script that deletes all the files that are older than one day and keeps the latest ones.
I am able to list the directory in sorted form using the ls -trl command, but I am not able to work out how to remove these files. Please help.
You can use the following command:
/usr/bin/find <Your Log Directory> -mtime +1 | xargs rm -f
-mtime tests against the file's modification time.
+1 matches files modified more than one day ago (find rounds to whole 24-hour periods, so in practice at least two days).
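As a sketch of the whole setup, the command can be scheduled straight from the crontab; the /var/log/myapp path and the 02:30 schedule below are made-up examples, not from the question:
# m h dom mon dow   command
30 2 * * * /usr/bin/find /var/log/myapp -type f -mtime +1 -exec rm -f {} +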
Try using rm with find like this (-mmin counts in minutes, and 60*24 = 1440 minutes is one day, which sidesteps the whole-day rounding of -mtime):
find . -mmin +$((60*24)) -exec rm {} \;
You don't want ls, you want find.
It has a neat argument, -mtime, that limits the results to a specific time delta, and -exec which allows you to provide a command to run on the results.
So for example,
find . -mtime +10 -name "*tmp*" -exec rm {} \;
Does an rm on all files older than 10 days, with tmp in the name.
Oh, and be careful.
Very careful.
find . -mtime +1 -exec rm {} \;
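A simple way to be careful: run the same find without the rm first and inspect what it prints, then add the -exec once the list looks right.
# dry run: print what would be handed to rm
find . -mtime +1 -print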

How to delete all files older than 3 days when "Argument list too long"?

I've got a log file directory that has 82000 files and directories in it (about half and half).
I need to delete all the file and directories which are older than 3 days.
In a directory that has 37000 files in it, I was able to do this with:
find * -mtime +3 -exec rm {} \;
But with 82000 files/directories, I get the error:
/usr/bin/find: Argument list too long
How can I get around this error so that I can delete all files/directories that are older than 3 days?
The "Argument list too long" error comes from the shell, not find: the * glob expands to all 82000 names before find even starts. Passing the directory itself avoids that expansion. To delete all files and directories within the current directory:
find . -mtime +3 | xargs rm -Rf
Or alternatively, more in line with the OP's original command:
find . -mtime +3 -exec rm -Rf -- {} \;
You can also use:
find . -mindepth 1 -mtime +3 -delete
The -mindepth 1 keeps find from deleting the target directory itself.
Another solution to the original question, especially useful if you want to remove only SOME of the older files in a folder, would be something like this:
find . -name "*.sess" -mtime +100
and so on. The quotes block shell wildcard expansion, allowing you to "find" millions of files :)
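Once the listing looks right, the same expression can do the removal, e.g. with GNU find's -delete:
find . -name "*.sess" -mtime +100 -delete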

In Linux shell, how to cp/rm files by time?

In the Linux shell, when I run
ls -al -t
it shows the times of the files.
How do I cp/rm files by time? For example, copy all of the files that were created today or yesterday. Thanks a lot.
Depending on what you actually want to do, find provides the -atime, -ctime and -mtime tests for matching files by access, inode-change and modification time (note that ctime is status change, not creation), along with -newer and the minute-granularity -amin/-cmin/-mmin. You can combine them with -exec to copy, delete, or do whatever you want. For example:
find -maxdepth 1 -mtime +1 -type f -exec cp '{}' backup \;
Will copy all the regular files in the current directory more than 1 day old to the directory backup (assuming the directory backup exists).
Simple Example
find /path/to/folder/ -mtime 1 -exec rm {} \;   # deletes files modified between 24 and 48 hours ago ("yesterday")
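Strictly, -mtime 1 counts whole 24-hour periods back from now, so "yesterday" is approximate. With GNU find, adding -daystart anchors the periods to midnight, which matches the calendar-day reading more closely (a sketch, assuming GNU find):
# files modified yesterday, measured from the start of today
find /path/to/folder/ -daystart -type f -mtime 1 -exec rm {} \;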
For more examples, search the web for "bash find time".
