Find modified files and echo them into a txt file on server root - linux

I need a shell command that lists the last modified and new files on the whole server (recursively) and writes them into a txt file in the server root.
Does anybody have something like this?
I tried
find / -mmtime 30 -printf "%AD %Ar - %p\n" 2> /dev/null | sort -r > /lastmodified.txt
to post the names of all files modified in the last 30 days into a txt file in the root, but it only shows me the server's own files and not the directories where my websites are uploaded.
Thank you in advance. I am not an expert, and this is what I found so far. It is relatively urgent, as I need it to fix hacked files, which happened last week.

From http://content.hccfl.edu/pollock/Unix/FindCmd.htm:
find . -mtime 0 # find files modified within the past 24 hours
find . -mtime -1 # find files modified within the past 24 hours
find . -mtime 1 # find files modified between 24 and 48 hours ago
find . -mtime +1 # find files modified more than 48 hours ago
Make sure that you have only one 'm' and a minus sign in -mtime -30, as suggested in choroba's comment, to get the last 30 days. -mtime 30 would give only files modified exactly 30 days ago.
You may want to use option -daystart to get files of last 30 days starting from midnight instead of just 30*24 hours ago. Use %TD and %Tr instead of %AD and %Ar to get modification times (instead of access times).
The final command would then be:
find / -daystart -mtime -30 -printf "%TD %Tr - %p\n" 2> /dev/null | sort -r > /lastmodified.txt
Note that the sort will break in January, as 12 is sorted before 01. If you want to make sure the dates are always in order, use for example the time definition %T+ (2012-11-29+21:07:41.0000000000) or %Ty/%Tm/%Td %TH:%TM (12/11/29 21:07).
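For example, a date-sortable variant of the final command, using %T+ (a sketch; same paths and options as above):
find / -daystart -mtime -30 -printf "%T+ - %p\n" 2> /dev/null | sort -r > /lastmodified.txt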

What about inotify-tools?
https://github.com/rvoicilas/inotify-tools/wiki#wiki-getting
http://linux.die.net/man/1/inotifywait
inotifywait example 2
#!/bin/sh
EVENT=$(inotifywait --format '%e' ~/file1)
[ $? != 0 ] && exit
[ "$EVENT" = "MODIFY" ] && echo 'file modified!'
[ "$EVENT" = "DELETE_SELF" ] && echo 'file deleted!'
# etc...
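Since the original question is about the whole server, note that inotifywait can also watch a directory tree recursively and log every change as it happens (a sketch; the web root /var/www and the log path /root/changes.log are assumptions, adjust to your setup):
inotifywait -m -r -e modify,create,delete --timefmt '%F %T' --format '%T %w%f %e' /var/www >> /root/changes.log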

Related

Creating cron on RHEL 7 server to delete files older than a week [duplicate]

I want to delete scripts in a folder that are older than 10 days, counting back from the current date.
The scripts look like:
2012.11.21.09_33_52.script
2012.11.21.09_33_56.script
2012.11.21.09_33_59.script
The script will run every 10 days via crontab; that's why I need the current date.
find is the common tool for this kind of task:
find ./my_dir -mtime +10 -type f -delete
EXPLANATIONS
./my_dir your directory (replace with your own)
-mtime +10 older than 10 days
-type f only files
-delete no surprise. Remove it to test your find filter before executing the whole command.
And take care that ./my_dir exists, to avoid bad surprises!
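To run the cleanup automatically, a crontab entry along these lines could be used (a sketch; the daily 1:00 AM schedule and /path/to/my_dir are assumptions, replace them with your own):
0 1 * * * /usr/bin/find /path/to/my_dir -mtime +10 -type f -delete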
Just spicing up the find command above to delete older files, but with logging and calculation of elapsed time:
#!/bin/bash
path="/data/backuplog/"
timestamp=$(date +%Y%m%d_%H%M%S)
filename=log_$timestamp.txt
log=$path$filename
days=7
START_TIME=$(date +%s)
find "$path" -maxdepth 1 -name "*.txt" -type f -mtime +$days -print -delete >> "$log"
echo "Backup :: Script Start -- $(date +%Y%m%d_%H%M)" >> "$log"
... code for backup ... or any other operation .... >> "$log"
END_TIME=$(date +%s)
ELAPSED_TIME=$(( END_TIME - START_TIME ))
echo "Backup :: Script End -- $(date +%Y%m%d_%H%M)" >> "$log"
echo "Elapsed Time :: $(date -ud "@$ELAPSED_TIME" +%Hh:%Mm:%Ss)" >> "$log"
The code adds a few things.
log files named with a timestamp
log folder specified
find looks for *.txt files only in the log folder
-type f ensures you only delete files
-maxdepth 1 ensures you don't descend into subfolders
log files older than 7 days are deleted ( assuming this is for a backup log)
notes the start / end time
calculates the elapsed time for the backup operation...
Note: to test the code, just use -print instead of -print -delete, but do check your path carefully.
Note: ensure your server time is set correctly (check with date and set up the timezone/NTP properly). Additionally, check file times with 'stat filename'.
Note: -mtime can be replaced with -mmin for finer control, as -mtime discards fractional days (older than 2 days, i.e. +2, actually means at least 3 days) when it deals with file timestamps in terms of days:
-mtime +$days ---> -mmin +$((60*24*$days))
If you can afford to work in minutes rather than days, you can do (here for 10 days):
find -mmin +14400 -delete

Shell script to find recently modified files [duplicate]

E.g., a MySQL server is running on my Ubuntu machine. Some data has been changed during the last 24 hours.
What (Linux) scripts can find the files that have been changed during the last 24 hours?
Please list the file names, file sizes, and modified times.
To find all files modified in the last 24 hours (last full day) in a specific directory and its sub-directories:
find /directory_path -mtime -1 -ls
Should be to your liking
The - before 1 is important - it means anything changed one day or less ago.
A + before 1 would instead mean anything changed at least one day ago, while having nothing before the 1 would have meant it was changed exactly one day ago, no more, no less.
Another, more human-friendly way is to use the -newermt option, which understands human-readable time units.
Unlike -mtime, which requires the user to read the find documentation to figure out what time units it expects and then convert their own units into those, -newermt avoids a process that is error-prone and plain user-unfriendly. -mtime was barely acceptable in the 1980s, but in the 21st century it has the convenience and safety of stone-age tools.
Example uses of -newermt option with the same duration expressed in different human-friendly units:
find /<directory> -newermt "-24 hours" -ls
find /<directory> -newermt "1 day ago" -ls
find /<directory> -newermt "yesterday" -ls
You can do that with
find . -mtime 0
From man find:
[The] time since each file was last modified is divided by 24 hours and any remainder is discarded. That means that to match -mtime 0, a file will have to have a modification in the past which is less than 24 hours ago.
On GNU-compatible systems (i.e. Linux):
find . -mtime 0 -printf '%T+\t%s\t%p\n' 2>/dev/null | sort -r | more
This will list files and directories that have been modified in the last 24 hours (-mtime 0). It will list them with the last modified time in a format that is both sortable and human-readable (%T+), followed by the file size (%s), followed by the full filename (%p), each separated by tabs (\t).
2>/dev/null throws away any stderr output, so that error messages don't muddy the waters; sort -r sorts the results by most recently modified first; and | more lists one page of results at a time.
For others who land here in the future (including myself), add a -name option to find specific file types, for instance: find /var -name "*.php" -mtime -1 -ls
This command worked for me
find . -mtime -1 -print
Find the files...
You can use -type f to restrict the search to regular files:
find /directory_path -type f -mtime -1 -exec ls -lh {} \;

scp files from yesterday

I want to copy files from a remote Server to a local server. The problem is: I only want to copy the files from yesterday.
The remote server is writing logfiles, and at 23:59 the log rotation compresses them into a file [name]_[date].log.gz. At 6:00 in the morning a cronjob on the local server needs to copy the previously created file from the remote server. Does anyone know how to do this?
Regards,
Alex
You can use a script like this:
for i in $(find /interface/outbound/Web -type f -ctime -1)
do
    scp "$i" user@$destination_server:/destination_directory/
done
In particular, the find command has the following features, for example:
find . -ctime -1 # finds files whose status changed less than 1 day ago, starting from the current folder
find . -ctime +2 # finds files whose status changed more than 2 days ago, starting from the current folder
where ctime is the inode status change time (often used as an approximation of creation time, which Linux does not track directly). It's also possible to use the modification time mtime in this way:
find . -mtime 0 # find files modified between now and 1 day ago
find . -mtime -1 # find files modified less than 1 day ago
find . -mtime 1 # find files modified between 24 and 48 hours ago
find . -mtime +1 # find files modified more than 48 hours ago
More information in man find
Edit:
To get the same behaviour from remote to local, you can use something like:
latest_file=$(ssh user@destination_server find /pathtoDir -type f -ctime -1)
/usr/bin/scp user@destination_server:$latest_file /local_dir
echo SCP Completed.
At the moment I don't have a Unix environment to run tests.
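Since the rotated file name already contains the date, another option is to build yesterday's file name directly and copy just that one file (a sketch; the remote path /var/log/myapp, the file name pattern and the YYYY-MM-DD date format are assumptions, adjust them to your logrotate setup):
YESTERDAY=$(date -d yesterday +%Y-%m-%d)
scp user@remote_server:/var/log/myapp/myapp_${YESTERDAY}.log.gz /local_dir/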

Finding files modified within an hour of another file

If I have 3 files called 1.txt, 2.txt and 3.txt, for example, and they were created an hour apart, say 1 pm, 2 pm and 3 pm respectively, what I need is a command that finds all files modified within an hour of a specific file.
I'm in the same directory as the files in the terminal, and all the files have the setuid permission set.
I've been trying:
find . -type f -perm -4000 -newer 2.txt -mmin -60 -print
This should return 3.txt, but it doesn't.
What would I use to find files created in the hour before or after 2.txt?
Try this:
touch /tmp/temp -t time1
touch /tmp/ntemp -t time2
find . -newer /tmp/temp -a ! -newer /tmp/ntemp -exec ls -l {} \; 2>/dev/null
where
time1 = time of file creation - 1hr
time2 = time of file creation + 1hr
Ex:
time1 = 201210041500
time2 = 201210041700
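Putting it together with the setuid filter from the question, a concrete run using the example timestamps above might look like this (illustrative values only):
touch /tmp/temp -t 201210041500
touch /tmp/ntemp -t 201210041700
find . -type f -perm -4000 -newer /tmp/temp -a ! -newer /tmp/ntemp -exec ls -l {} \; 2>/dev/null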
Here is my logic -
First, get the time of last access of the file, in seconds since the Epoch, in some variable:
time_in_sec=$(stat -c %X 2.txt)
Get the time one hour earlier (i.e. 3600 seconds back):
one_hr_old_time_in_sec=`expr $time_in_sec - 3600`
Convert it into a format suitable for the find command:
newerthan=$(date -d @$one_hr_old_time_in_sec '+%Y-%m-%d %H:%M:%S')
Convert the time of the original file into the same format:
olderthan=$(date -d @$time_in_sec '+%Y-%m-%d %H:%M:%S')
Get the list of files modified between the two times using the find command:
find . -newermt "$newerthan" ! -newermt "$olderthan" -print
If it works, you can write a small shell script which takes the file name as a parameter, and you can try +3600 as well.
Honestly, I haven't tried it. Hope it works!
The previous solution can be simplified, since -newermt accepts the same formats as date, including @N for N seconds since the Epoch.
Also, for the stat command it is probably better to use %Y (the time of last modification) instead of %X (the time of last access).
So the command to find the files modified within + or - 1 hour of 2.txt is:
N=$(stat -c %Y 2.txt)
find . -newermt @$((N-3600)) ! -newermt @$((N+3600)) -print
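Following the earlier suggestion, this can be wrapped in a small script that takes the file name as a parameter (a sketch; the script name and the fixed one-hour window are assumptions):
#!/bin/sh
# Usage: ./near_in_time.sh reference_file
# Lists files modified within one hour before or after the reference file.
N=$(stat -c %Y "$1")
find . -type f -newermt @$((N - 3600)) ! -newermt @$((N + 3600)) -print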
If you are running this after 4pm, given your example, it makes sense that it wouldn't return 3.txt, as -newer 2.txt -mmin -60 means "modified in the last 60 minutes, and more recently than 2.txt", not "modified less than 60 minutes after 2.txt". I don't think find currently has options to do what you're wanting (at least, the version I have doesn't), but it shouldn't be too hard to script in python or perl.

Deleting files based on CreationTime

In a directory there are files which are generated daily.
The format of the files, if generated on 16th Apr 2012, is TEST_20120416.
So I need to delete all the files which are older than 7 days. I tried doing this
#!/bin/ksh
find /data/Test/*.* -mtime -7 -exec rm -rf {} \;
exit 0
Now the problem is that the above code deletes based on modification time, but according to the requirement files should be deleted based on creation time. Kindly help me out with deleting files based on the filename (the filename has a timestamp).
As you fortunately have the creation date encoded in the filename, this should work:
#!/bin/sh
REFDATE=$(date --date='-7 days' +%Y%m%d)
PREFIX=TEST_
find /data/Test/ -name "$PREFIX*" | while read FNAME; do
    # strip everything up to and including the prefix, leaving the YYYYMMDD part
    if [ "${FNAME##*$PREFIX}" -lt "$REFDATE" ]; then
        rm "$FNAME"
    fi
done
It will print warnings if you have some other files with names starting with TEST_, in which case some more filtering may be needed.
find /data/Test/*.* -ctime -7 -delete
find /data/Test/*.* will match all the files in the /data/Test folder, the argument -ctime -7 limits the search by status-change time (ctime) to the last 7 days, and the -delete option deletes such files. (To target files older than 7 days, as the question asks, use -ctime +7 instead.)
