Move files that are 30 minutes old - linux

I work on a server that does not allow me to store more than 50 gigabytes of files. My application takes 20 minutes to generate each file. Is there any way I can move all files that are more than 30 minutes old from the source to the destination? I tried rsync:
rsync -avP source/folder/ user@destinationIp:dest/folder
but this does not remove the files from my server, so I still run into the storage limit.
Secondly, if I use the mv command, the files that are still being generated also get moved to the destination folder, and the program fails.

You can use find along with -exec for this:
Replace /sourcedirectory and /destination/directory/ with the source and target paths as needed.
find /sourcedirectory -maxdepth 1 -mmin +30 -type f -exec mv "{}" /destination/directory/ \;
What the command does is find regular files (-type f) directly inside the source folder (-maxdepth 1) that were last modified more than 30 minutes ago (-mmin +30) and move them to the target directory. If you want to go by the time the file was last accessed instead, use -amin +30.
Or, if you want to find files modified within a range, you can use something like -mmin +30 -mmin -35, which matches files modified more than 30 but less than 35 minutes ago.
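If the destination is actually a remote host, as in the rsync attempt in the question, another option is to let find select the old files and let rsync both copy and delete them. This is only a sketch (the user, host, and paths are the question's placeholders): --remove-source-files removes each file from the source once it has been transferred, and --files-from=- reads the list produced by find from stdin.
cd /sourcedirectory &&
find . -maxdepth 1 -mmin +30 -type f -printf '%P\n' |
    rsync -avP --remove-source-files --files-from=- . user@destinationIp:dest/folder/
(File names containing newlines would confuse the list; find's -print0 together with rsync's --from0 avoids that.)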
References from the man page:
-amin n
File was last accessed n minutes ago.
-atime n
File was last accessed n*24 hours ago. When find figures out how many 24-hour periods ago the file was last accessed, any fractional part is ignored, so to match -atime +1, a file has to have been accessed at least two days ago.
-mmin n
File's data was last modified n minutes ago.
-mtime n
File's data was last modified n*24 hours ago. See the comments for -atime to understand how rounding affects the interpretation of file modification times.
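That fractional-part rounding is easy to see with a throwaway file (the file name here is made up, and touch -d needs GNU touch):
touch -d '36 hours ago' demo.log
find . -maxdepth 1 -name demo.log -mtime 1     # matches: 36 hours is 1 full 24-hour period
find . -maxdepth 1 -name demo.log -mtime +1    # no match: +1 needs at least 2 full periods (48 hours or more)
find . -maxdepth 1 -name demo.log -mtime -2    # matches: fewer than 2 full periods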

Related

Issue with Linux find command

I have written a script that creates an array of files modified more than 1 day ago, iterates through the array, and deletes them interactively. Somehow the script is not working.
Below is the code:
#!/bin/bash
#take input from user for directory
read -p "Please enter the directory path from which you want to remove unused files" path
#create an array of files modified 2 days back
readarray -t files < <(find "$path" -maxdepth 0 -type f -mtime +1)
#remove the files iteratively and interactively
for file in "${files[@]}"; do
rm -i "$file"
done
This script isn't deleting anything. I have some files that were created on 12 Jan and untouched since then, but they're still there.
Can you please point out what is missing here?
-mtime only considers full days, meaning +1 means "at least 2 days ago". From man find:
-atime n
File was last accessed less than, more than or exactly n*24 hours
ago. When find figures out how many 24-hour periods ago the file was
last accessed, any fractional part is ignored, so to match -atime +1,
a file has to have been accessed at least two days ago.
-mtime n
File's data was last modified less than, more than or exactly n*24
hours ago. See the comments for -atime to understand how rounding
affects the interpretation of file modification times.
You might also want to consider using find's -exec instead of storing the results in an array. This avoids all sorts of problems when your file names contain special characters, such as blanks, newlines, or globbing wildcards:
find "$path" -maxdepth 0 -type f -mmin +1440 -exec rm {} +
(a day has 1440 minutes = 60*24)
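Putting both points together, below is a minimal sketch of an interactive version of the script. It uses the minute-based test and keeps the confirmation prompt via find's -ok, which asks before running the command on each file. It also assumes -maxdepth 1 was intended: -maxdepth 0 examines only $path itself, so with -type f it would never look at the files inside the directory.
#!/bin/bash
# Ask for the directory to clean up (same prompt as the original script).
read -rp "Please enter the directory path from which you want to remove unused files: " path

# -mmin +1440 = strictly older than 24 hours; -ok prompts before each rm,
# preserving the interactive behaviour of rm -i.
find "$path" -maxdepth 1 -type f -mmin +1440 -ok rm {} \;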

Shell script to find recently modified files [duplicate]

E.g., a MySQL server is running on my Ubuntu machine. Some data has been changed during the last 24 hours.
What (Linux) scripts can find the files that have been changed during the last 24 hours?
Please list the file names, file sizes, and modified time.
To find all files modified in the last 24 hours (last full day) in a particular directory and its sub-directories:
find /directory_path -mtime -1 -ls
That should be to your liking.
The - before the 1 is important: it means anything changed one day or less ago.
A + before the 1 would instead mean anything changed more than one day ago, while having nothing before the 1 would mean it was changed exactly one day ago, no more, no less.
Another, more human-friendly way is to use the -newermt option, which understands human-readable time units.
This is unlike -mtime, which requires the user to read the find documentation to figure out what time units it expects and then convert their own units into those, which is error-prone and plain user-unfriendly. -mtime was barely acceptable in the 1980s, but in the 21st century it has the convenience and safety of a stone-age tool.
Example uses of -newermt option with the same duration expressed in different human-friendly units:
find /<directory> -newermt "-24 hours" -ls
find /<directory> -newermt "1 day ago" -ls
find /<directory> -newermt "yesterday" -ls
You can do that with
find . -mtime 0
From man find:
[The] time since each file was last modified is divided by 24 hours and any remainder is discarded. That means that to
match -mtime 0, a file will have to have a modification in the past which is less than 24 hours ago.
On GNU-compatible systems (i.e. Linux):
find . -mtime 0 -printf '%T+\t%s\t%p\n' 2>/dev/null | sort -r | more
This will list files and directories that have been modified in the last 24 hours (-mtime 0). It will list them with the last modified time in a format that is both sortable and human-readable (%T+), followed by the file size (%s), followed by the full filename (%p), each separated by tabs (\t).
2>/dev/null throws away any stderr output, so that error messages don't muddy the waters; sort -r sorts the results by most recently modified first; and | more lists one page of results at a time.
For others who land here in the future (including myself), add a -name option to find specific file types, for instance: find /var -name "*.php" -mtime -1 -ls
This command worked for me
find . -mtime -1 -print
To restrict the results to regular files (-type f) and list them in long format:
find /directory_path -type f -mtime -1 -exec ls -lh {} \;

scp files from yesterday

I want to copy files from a remote server to a local server. The problem is: I only want to copy the files from yesterday.
The remote server writes logfiles, and at 23:59 logrotation compresses each one into a file named [name]_[date].log.gz. At 6:00 in the morning a cronjob on the local server needs to copy the previously created file from the remote server. Does anyone know how to do this?
Regards,
Alex
You can use a script like this:
for i in `find /interface/outbound/Web -type f -ctime -1`
do
    scp $i user@$destination_server:/destination_directory/
done
In particular, the find command offers, for example, the following tests:
find . -ctime -1 # finds files whose status changed less than 1 day ago, under the current folder
find . -ctime +2 # finds files whose status changed more than 2 days ago, under the current folder
where -ctime refers to the file's last status-change time (find has no test for the true creation time). It's also possible to use the modification time with -mtime in this way:
find . -mtime 0 # find files modified between now and 1 day ago
find . -mtime -1 # find files modified less than 1 day ago
find . -mtime 1 # find files modified between 24 and 48 hours ago
find . -mtime +1 # find files modified more than 48 hours ago
More information in man find
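One caveat about the loop above: iterating over unquoted find output breaks as soon as a file name contains spaces. A more defensive sketch, using the same placeholder paths, reads a null-delimited stream instead:
find /interface/outbound/Web -type f -ctime -1 -print0 |
while IFS= read -r -d '' f; do
    scp "$f" "user@$destination_server:/destination_directory/"
done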
Edit:
To have the same behaviour from remote to local you can use something like:
latest_file=`ssh user@destination_server find /pathtoDir -type f -ctime -1`
/usr/bin/scp user@destination_server:$latest_file /local_dir
echo SCP Completed.
At the moment I don't have a Unix environment available to test this.
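Since the rotated file already carries yesterday's date in its name, another option for the 06:00 cronjob is to build the file name from the date instead of searching with find. This is only a sketch: the log directory, file-name pattern, user, and host are made up, and the +%F format has to be adjusted to whatever the logrotation actually produces.
#!/bin/bash
# Yesterday's date in YYYY-MM-DD form (GNU date).
yesterday=$(date -d yesterday +%F)

# Copy only the matching rotated log from the remote server into the local archive directory.
scp "user@remote_server:/var/log/myapp/myapp_${yesterday}.log.gz" /local/archive/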

What is the difference between -{minutes} and {minutes} when using the find command on Linux?

My understanding of the following command is that it looks for files that have been modified in the last {x} minutes.
What does it mean if I exclude the - from the -mmin argument? What is it supposed to return?
COMMAND
find . -maxdepth 1 -mmin -20
-mmin -20 returns files that are modified less than 20 minutes ago.
-mmin 20 returns files that are modified exactly 20 minutes ago.
-mmin +20 returns any file modified 20 minutes ago or older.
From the find(1) man page:
Numeric arguments can be specified as
+n
for greater than n,
-n
for less than n,
n
for exactly n.
-mmin n
File's data was last modified n minutes ago.
i.e., -mmin -n means the data is modified less than n minutes ago and -mmin n means the data is modified exactly n minutes ago.
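A quick way to see the three forms side by side is to create throwaway files in a scratch directory (all names here are made up, and touch -d needs GNU touch):
mkdir -p /tmp/mmin-demo && cd /tmp/mmin-demo
touch -d '10 minutes ago' recent.log
touch -d '20 minutes ago' exact.log
touch -d '40 minutes ago' old.log

find . -maxdepth 1 -type f -mmin -20    # recent.log  (modified less than 20 minutes ago)
find . -maxdepth 1 -type f -mmin 20     # exact.log   (exactly 20 full minutes ago)
find . -maxdepth 1 -type f -mmin +20    # old.log     (more than 20 full minutes ago)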
The minus means 'less than': -mmin 20 will give you files modified exactly 20 minutes ago, while -mmin -20 gives you files modified less than 20 minutes ago.

Recent files in folder

I want to check for files that were recently added to a folder in a Unix environment.
Is there a find test for this, something like:
find -name 'filename' <modified within the last 5 minutes> ?
To locate files modified less than 5 minutes ago
find -name 'filename' -mmin -5
From the man page:
-mmin n
File's data was last modified n minutes ago.
-mtime n
File's data was last modified n*24 hours ago.
-mmin is supported under most recent versions of GNU find.
To find all files modified in the last 24 hours (last full day) in current directory and its sub-directories:
find . -mtime -1 -print
In zsh:
ls *(mm-5) # or 'mh' to detect stuff modified in the last 5 hours etc.
