scp files from yesterday - linux

I want to copy files from a remote server to a local server. The problem is: I only want to copy the files from yesterday.
The remote server writes log files, and at 23:59 log rotation compresses each one into a file named [name]_[date].log.gz. At 6:00 in the morning a cron job on the local server needs to copy the file created the night before from the remote server. Does anyone know how to do this?
Regards,
Alex

You can use a script like this (note the user@host syntax for scp; also note that a for loop over find output breaks on filenames containing whitespace, which log file names usually avoid):
for i in $(find /interface/outbound/Web -type f -ctime -1)
do
    scp "$i" "user@$destination_server:/destination_directory/"
done
In particular, the find command has, for example, the following features:
find . -ctime -1 # finds files whose status changed less than 1 day ago, from the current folder
find . -ctime +2 # finds files whose status changed more than 2 days ago, from the current folder
where ctime is the inode status change time (despite the name, it is not the creation time). It's also possible to use the modification time mtime in the same way:
find . -mtime 0 # find files modified between now and 1 day ago
find . -mtime -1 # find files modified less than 1 day ago
find . -mtime 1 # find files modified between 24 and 48 hours ago
find . -mtime +1 # find files modified more than 48 hours ago
More information in man find
Edit:
To have the same behaviour from remote to local you can use something like:
latest_file=$(ssh user@destination_server find /pathtoDir -type f -ctime -1)
/usr/bin/scp "user@destination_server:$latest_file" /local_dir
echo "SCP completed."
At the moment I don't have a Unix environment at hand to test this.
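Since the rotation job embeds the date in the file name, another approach is to construct yesterday's file name directly rather than relying on ctime. This is only a minimal sketch: it assumes GNU date and a *_YYYY-MM-DD.log.gz naming scheme, and the host and directories are placeholders, not values from the question. Adjust the date format to whatever your logrotate configuration actually produces.

```shell
#!/bin/sh
# Placeholders -- substitute your real host and directories.
remote="user@remote_server"
remote_dir="/interface/outbound/Web"
local_dir="/destination_directory"

# GNU date: yesterday's date, formatted like the rotated file name.
yday=$(date -d yesterday +%Y-%m-%d)

# Copy every log rotated with yesterday's date.
scp "$remote:$remote_dir/*_$yday.log.gz" "$local_dir/"
```

Matching on the file name makes the 6:00 cron job independent of exactly when the rotation finished, which ctime-based selection is not.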

Related

Creating cron on RHEL 7 server to delete files older than a week [duplicate]

I want to delete scripts in a folder that are older than 10 days, counting back from the current date.
The script files look like:
2012.11.21.09_33_52.script
2012.11.21.09_33_56.script
2012.11.21.09_33_59.script
The script will run every 10 days via crontab, which is why I need it relative to the current date.
find is the common tool for this kind of task:
find ./my_dir -mtime +10 -type f -delete
EXPLANATIONS
./my_dir: your directory (replace with your own)
-mtime +10: older than 10 days
-type f: only files
-delete: no surprise. Remove it to test your find filter before executing the whole command
And take care that ./my_dir exists, to avoid bad surprises!
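A safe habit with destructive find commands is to run the same expression with -print first and add -delete only once the listing looks right. A small self-contained sketch of that workflow in a throwaway directory (GNU touch -d assumed; the file names are illustrative):

```shell
#!/bin/sh
set -e
dir=$(mktemp -d)
touch "$dir/fresh.script"
touch -d '12 days ago' "$dir/stale.script"

# Dry run: shows what WOULD be deleted (only stale.script).
find "$dir" -mtime +10 -type f -print

# Same expression with -delete appended actually removes it.
find "$dir" -mtime +10 -type f -delete

ls "$dir"        # fresh.script remains
rm -rf "$dir"
```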
Just spicing up the shell script above to delete older files, but with logging and calculation of the elapsed time:
#!/bin/bash
path="/data/backuplog/"
timestamp=$(date +%Y%m%d_%H%M%S)
filename="log_$timestamp.txt"
log="$path$filename"
days=7
START_TIME=$(date +%s)
find "$path" -maxdepth 1 -name "*.txt" -type f -mtime +"$days" -print -delete >> "$log"
echo "Backup :: Script Start -- $(date +%Y%m%d_%H%M)" >> "$log"
# ... code for the backup, or any other operation, appending to "$log" ...
END_TIME=$(date +%s)
ELAPSED_TIME=$(( END_TIME - START_TIME ))
echo "Backup :: Script End -- $(date +%Y%m%d_%H%M)" >> "$log"
echo "Elapsed Time :: $(date -u -d "@$ELAPSED_TIME" +%Hh:%Mm:%Ss)" >> "$log"
The code adds a few things.
log files named with a timestamp
log folder specified
find looks for *.txt files only in the log folder
-type f ensures you only delete files
-maxdepth 1 ensures you don't descend into subfolders
log files older than 7 days are deleted (assuming this is for a backup log)
notes the start/end time
calculates the elapsed time for the backup operation
Note: to test the code, just use -print instead of -print -delete. But do check your path carefully.
Note: ensure your server time is set correctly via date (set up the timezone/NTP correctly). Additionally, check file times with 'stat filename'.
Note: mtime can be replaced with mmin for finer control, since mtime discards all fractions of a day when comparing file timestamps ("older than 2 days", +2, actually means at least 3 full days):
-mtime +$days ---> -mmin +$((60*24*$days))
If hard-coding the number of minutes is acceptable, 10 days is 14400 minutes:
find -mmin +14400 -delete
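The day-to-minute conversion above is just the number of days times 1440 (minutes per day); a quick sketch for the 10-day case:

```shell
#!/bin/sh
days=10
mins=$((60 * 24 * days))
echo "$mins"    # 14400

# Equivalent to -mtime +10, but with minute resolution:
find . -type f -mmin "+$mins" -print
```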

Deleting files after 7 days not working

I am trying to delete all files that are older than 7 days. The command works, but not correctly.
find '/files/tem_data/' -mtime +7 -exec rm -rf {} \+
It does delete files, but it's not accurate.
ls -Artl | head -n 2
After find deletes files, the ls command still shows files that should have been deleted. For example, today is November 7th, so find should delete all files from before November 1st. It does not: it leaves files from October 30 and 31. How can I delete files that are older than 7 days?
If I run the find command again 3 minutes later, it deletes the files dated October 30 with times up to 3 minutes after the first run.
From man find:
-atime n
File was last accessed n*24 hours ago. When find figures out how many
24-hour periods ago the file was last accessed, any fractional part is
ignored, so to match -atime +1, a file has to have been accessed at least
two days ago.
This means that your command actually deletes files that were accessed 8 or more days ago.
Since the time now is
$ date
Tue Nov 7 10:29:29 PST 2017
find will require files need to be older than:
$ date -d 'now - 8 days'
Mon Oct 30 11:29:05 PDT 2017
In other words, leaving some files from Oct 30 is expected and documented behavior.
To account for find rounding down, simply use -mtime +6 instead.
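The rounding can be checked directly with two files whose ages straddle the boundary. A minimal sketch, assuming GNU touch -d (the file names are illustrative):

```shell
#!/bin/sh
dir=$(mktemp -d)
touch -d '7 days 2 hours ago' "$dir/just_over_7_days"
touch -d '8 days 2 hours ago' "$dir/over_8_days"

# +7 means "strictly more than 7 full 24-hour periods", i.e. 8 days or more:
find "$dir" -type f -mtime +7    # matches only over_8_days

# +6 catches everything older than a full 7 days:
find "$dir" -type f -mtime +6    # matches both files
rm -rf "$dir"
```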
This is not the exact answer, but you can try this as a sample.
find /path/to/ -type f -mtime +7 -name '*.gz' -execdir rm -- '{}' \;
Or, as an alternative and also faster command, use exec's + terminator instead of \;:
find /path/to/ -type f -mtime +7 -name '*.gz' -execdir rm -- '{}' +
or
find /path/to/ -type f -mtime +7 -name '*.gz' -delete
find: the Unix command for finding files, directories, links, etc.
/path/to/: the directory to start your search in.
-type f: only find files.
-name '*.gz': list files whose names end with .gz.
-mtime +7: only consider the ones with a modification time older than 7 days.
-execdir ... \;: for each such result found, run the following command on it.
rm -- '{}': remove the file; the {} part is where the find result gets substituted in from the previous part.
-- marks the end of command options, which avoids errors for files whose names start with a hyphen.

Shell script to find recently modified files [duplicate]

E.g., a MySQL server is running on my Ubuntu machine. Some data has been changed during the last 24 hours.
What (Linux) scripts can find the files that have been changed during the last 24 hours?
Please list the file names, file sizes, and modified time.
To find all files modified in the last 24 hours (last full day) in a particular specific directory and its sub-directories:
find /directory_path -mtime -1 -ls
Should be to your liking
The - before 1 is important - it means anything changed one day or less ago.
A + before the 1 would instead mean anything changed at least one day ago, while having nothing before the 1 would mean it was changed exactly one day ago, no more, no less.
Another, more human-friendly way is to use the -newermt option, which understands human-readable time units.
Unlike the -mtime option, which requires the user to read the find documentation to figure out what units -mtime expects and then convert their time units into those, which is error-prone and plain user-unfriendly. -mtime was barely acceptable in the 1980s, but in the 21st century it has the convenience and safety of stone-age tools.
Example uses of -newermt option with the same duration expressed in different human-friendly units:
find /<directory> -newermt "-24 hours" -ls
find /<directory> -newermt "1 day ago" -ls
find /<directory> -newermt "yesterday" -ls
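-newermt also composes into absolute date ranges by negating a second test; for example, to list files modified during one specific calendar day (GNU find assumed; the dates and file names here are purely illustrative):

```shell
#!/bin/sh
dir=$(mktemp -d)
touch -d '2017-11-03 12:00' "$dir/that_day"
touch -d '2017-11-05 12:00' "$dir/other_day"

# Modified on 2017-11-03: after that midnight, but not after the next one.
find "$dir" -type f -newermt 2017-11-03 ! -newermt 2017-11-04
rm -rf "$dir"
```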
You can do that with
find . -mtime 0
From man find:
[The] time since each file was last modified is divided by 24 hours and any remainder is discarded. That means that to
match -mtime 0, a file will have to have a modification in the past which is less than 24 hours ago.
On GNU-compatible systems (i.e. Linux):
find . -mtime 0 -printf '%T+\t%s\t%p\n' 2>/dev/null | sort -r | more
This will list files and directories that have been modified in the last 24 hours (-mtime 0). It will list them with the last modified time in a format that is both sortable and human-readable (%T+), followed by the file size (%s), followed by the full filename (%p), each separated by tabs (\t).
2>/dev/null throws away any stderr output, so that error messages don't muddy the waters; sort -r sorts the results by most recently modified first; and | more lists one page of results at a time.
For others who land here in the future (including myself), add a -name option to find specific file types, for instance: find /var -name "*.php" -mtime -1 -ls
This command worked for me
find . -mtime -1 -print
Find the files...
You can restrict matches to regular files with -type f:
find /directory_path -type f -mtime -1 -exec ls -lh {} \;

Find modified files and echo them into a txt file on server root

I need a shell command to show the last modified and new files on the whole server (recursively), echoing them into a txt file in the root.
Does anybody have something like this?
I tried
find / - mmtime 30 -printf "%AD %Ar - %p\n" 2> /dev/null | sort -r > /lastmodified.txt
to write the names of all files modified in the last 30 days into a txt file in the root, but it shows me only the files of the server itself and not the directories my websites are uploaded to.
Thank you in advance. I am not an expert, and this is what I have found so far. It is relatively urgent, as I need this to fix hacked files, which happened last week.
From http://content.hccfl.edu/pollock/Unix/FindCmd.htm:
find . -mtime 0 # find files modified within the past 24 hours
find . -mtime -1 # find files modified within the past 24 hours
find . -mtime 1 # find files modified between 24 and 48 hours ago
find . -mtime +1 # find files modified more than 48 hours ago
Make sure that you have only one 'm' and a minus sign in -mtime -30, as suggested in choroba's comment, to get the last 30 days. -mtime 30 would give only files from exactly 30 days ago.
You may want to use the option -daystart to get files of the last 30 days counted from midnight instead of just 30*24 hours ago. Use %TD and %Tr instead of %AD and %Ar to get modification times (instead of access times).
The final command would then be:
find / -daystart -mtime -30 -printf "%TD %Tr - %p\n" 2> /dev/null | sort -r > /lastmodified.txt
Note that the sort will break in January, as 12 is sorted before 01. If you want the dates to always sort in order, use for example the time format %T+ (2012-11-29+21:07:41.0000000000) or %Tu/%Tm/%Td %TH:%TM (12/11/29 21:07).
What about inotify-tools?
https://github.com/rvoicilas/inotify-tools/wiki#wiki-getting
http://linux.die.net/man/1/inotifywait
inotifywait example 2
#!/bin/sh
EVENT=$(inotifywait --format '%e' ~/file1)
[ $? != 0 ] && exit
[ "$EVENT" = "MODIFY" ] && echo 'file modified!'
[ "$EVENT" = "DELETE_SELF" ] && echo 'file deleted!'
# etc...

Recent files in folder

I want to check which files were recently added to a folder in a Unix environment.
Is there any find check like
find -name 'filename' <timestamp within the last 5 mins> ?
To locate files modified less than 5 minutes ago:
find -name 'filename' -mmin -5
From the man page:
-mmin n
File's data was last modified n minutes ago.
-mtime n
File's data was last modified n*24 hours ago.
-mmin is supported under most recent versions of GNU find.
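A quick self-check of -mmin with two files of different ages (a sketch assuming GNU touch -d; file names are illustrative):

```shell
#!/bin/sh
dir=$(mktemp -d)
touch "$dir/new_file"                        # modified just now
touch -d '10 minutes ago' "$dir/old_file"

find "$dir" -type f -mmin -5    # lists only new_file
rm -rf "$dir"
```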
To find all files modified in the last 24 hours (last full day) in current directory and its sub-directories:
find . -mtime -1 -print
Source
In zsh:
ls *(mm-5) # or 'mh' to detect stuff modified in the last 5 hours etc.
