In Linux shell, how to cp/rm files by time?

In Linux shell, when I run
ls -al -t
it shows the times of the files.
How can I cp/rm files by time? For example, copy all the files that were created today or yesterday. Thanks a lot.

Depending on what you actually want to do, find provides -[acm]time options for finding files by access, status-change or modification time, along with -newer and -[acm]min. You can combine them with -exec to copy, delete, or do whatever else you want. For example:
find -maxdepth 1 -mtime +1 -type f -exec cp '{}' backup \;
Will copy all the regular files in the current directory more than 1 day old to the directory backup (assuming the directory backup exists).
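The example above copies files older than one day; for the "created today or yesterday" case from the question, a minimal sketch along these lines should work (assuming the backup directory exists; -newermt is GNU find only):
find . -maxdepth 1 -type f -mtime -1 -exec cp '{}' backup \;   # modified within the last 24 hours
find . -maxdepth 1 -type f -newermt 'yesterday 00:00' -exec cp '{}' backup \;   # GNU find: anything modified since the start of yesterday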

Simple Example
find /path/to/folder/ -mtime 1 -exec rm {} \;  # Deletes all files modified yesterday
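Note that -mtime 1 matches only files whose age, in whole days, is exactly one (modified between 24 and 48 hours ago); -mtime -1 covers everything from the last 24 hours. A quick sketch with the same placeholder path:
find /path/to/folder/ -type f -mtime 1    # modified 24-48 hours ago ("yesterday")
find /path/to/folder/ -type f -mtime -1   # modified within the last 24 hours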
For more examples, search for "bash find time".

Related

Task Scheduler Script copy NEW files only - Synology

Similar to this question, I want to make a Task Scheduler script to copy NEW files (last 24 h) to a new folder.
I tried to use this code:
find /volume1/start/ -mtime -1 -type f -exec cp -r {} /volume1/target/ \;
but it delivers a 1kb filename.pdf#SynoEAStream file instead of the file itself.
How can I fix that?
OK, actually it seems to work as intended; I just had the wrong file modification date on the files where it didn't work. The script additionally copies some "useless" #SynoEAStream files, which I now avoid by looking only for PDF files, which is what I wanted:
find /volume1/TestScan/start/ -mtime -1 -type f -iname '*.pdf' -exec rsync -r {} /volume1/TestScan/ziel/ \;
Maybe it's helpful to someone.
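For completeness, a Task Scheduler job on the Synology just needs that command wrapped in a small script; a sketch along these lines (paths taken from the thread, adjust to your shares):
#!/bin/sh
# copy PDFs modified in the last 24 hours, ignoring #SynoEAStream metadata files
SRC=/volume1/TestScan/start/
DST=/volume1/TestScan/ziel/
find "$SRC" -mtime -1 -type f -iname '*.pdf' -exec rsync {} "$DST" \;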

How to delete files and directories older than n days in linux

I have a directory named repository which has a number of files and subdirectories. I want to find the files and directories which have not been modified in the last 14 days, so that I can delete them.
I have written this script, but it is giving only the directory names:
#!/bin/sh
M2_REPO=/var/lib/jenkins/.m2/repository
echo $M2_REPO
OLDFILES=/var/lib/jenkins/.m2/repository/deleted_artifacts.txt
AGE=14
find "${M2_REPO}" -name '*' -atime +${AGE} -exec dirname {} \; >> ${OLDFILES}
find /path/to/files* -mtime +5 -exec rm {} \;
Note that there are spaces between rm, {}, and \;
Explanation
The first argument is the path to the files. This can be a path, a directory, or a wildcard as in the example above. I would recommend using the full path, and running the command without the -exec rm part first, to make sure you are getting the right results.
The second argument, -mtime, is used to specify the number of days old that the file is. If you enter +5, it will find files older than 5 days.
The third argument, -exec, allows you to pass in a command such as rm. The {} \; at the end is required to end the command.
This should work on Ubuntu, Suse, Redhat, or pretty much any version of linux.
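As suggested above, it is worth previewing the matches before wiring in the rm; a sketch of that two-step workflow with the same example path:
find /path/to/files* -mtime +5                 # default action is -print, so this just lists the matches
find /path/to/files* -mtime +5 -exec rm {} \;  # then delete them once the list looks right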
You can give find the -delete flag to remove the files with it. Just be careful to put it at the end of the command so that the time filter is applied first.
You can first just list the files that the command finds:
find "${M2_REPO}" -depth -mtime +${AGE} -print
The -depth flag makes find do the search depth-first, which is implied by the -delete action.
If you like the results, change the -print to -delete:
find "${M2_REPO}" -mtime +${AGE} -delete
I know this is a very old question, but FWIW I solved the problem in two steps: first find and delete files older than N days, then find and delete empty directories. I tried doing both in one step, but the delete operation updates the modification time of the file's parent directory, and then the (empty) directory no longer matches the -mtime criterion! Here's the solution with shell variables:
age=14
dir="/tmp/dirty"
find "$dir" -mtime "+$age" -delete && find "$dir" -type d -empty -delete

Copy N days old files on Linux

Good morning,
I have many files inside directories and subdirectories, and I'm currently copying everything inside with:
find /tmp/temp/ -name '*files.csv' -type f -exec cp -u {} /home/dir/Desktop/dir1/ \;
I was wondering if there is any way to copy only files whose modification date is within the last two days. I don't want to copy a file if its modification date is more than 2 days before the current date.
You can use mtime within your find command:
find /tmp/temp/ -type f -mtime -2 -name '*files.csv' -exec cp -u {} /home/dir/Desktop/dir1/ \;
This would copy only the files whose modification time is within the last two days (relative to the system time).
-mtime n
File's data was last modified n*24 hours ago
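Since the n / +n / -n forms trip people up, here is a quick sketch of how they behave (standard find whole-day rounding):
find /tmp/temp/ -type f -mtime -2   # modified less than 2*24 hours ago
find /tmp/temp/ -type f -mtime 2    # modified between 2*24 and 3*24 hours ago
find /tmp/temp/ -type f -mtime +2   # modified more than 3*24 hours ago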

Deleting files that are older than one day [duplicate]

This question already has answers here:
find files older than X days in bash and delete
(3 answers)
Closed 7 years ago.
I have a server which creates several log files in the log directory. Due to this logging mechanism it eats up a lot of disk space on my server. I want to write a script that deletes all the files that are older than one day and keep the latest ones.
I am able to list them in sorted form using the ls -trl command, but I am not able to work out how to remove these files. Please help.
You can use the following command:
/usr/bin/find <Your Log Directory> -mtime +1 | xargs rm -f
-mtime - tests the file modification time.
+1 - matches files more than one day old.
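If the log file names might contain spaces, a slightly more defensive variant of the same pipeline (replace <Your Log Directory> as before; GNU find and xargs assumed) would be:
/usr/bin/find <Your Log Directory> -type f -mtime +1 -print0 | xargs -0 rm -f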
Try using rm with the find command, like this:
find . -mmin +$((60*24)) -exec rm {} \;
You don't want ls, you want find.
It has a neat argument, -mtime, that limits the results to a specific time delta, and -exec which allows you to provide a command to run on the results.
So for example,
find -mtime +10 -name "*tmp*" -exec rm {} \;
Does an rm on all files older than 10 days, with tmp in the name.
Oh, and be careful.
Very careful.
find . -mtime +1 -exec rm {} \;
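If you want find itself to ask before each removal, the POSIX -ok action works like -exec but prompts for confirmation, which is a convenient way to be careful:
find . -mtime +1 -ok rm {} \;   # asks y/n before each rm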

How to copy recently updated multiple files into another directory in Solaris

I want to copy the recently updated files (multiple files) into another directory.
I have 1.xml, 2.xml, 3.xml, ... in this directory. Recently someone updated a file or added a new file to the directory, so I want to copy those files into the destination directory. It's like synchronization of two directories.
For that I have tried the command below:
find home/deployment/server/services/ -type f -mtime 1 | xargs cp /home/application/
and also this one:
find home/deployment/server/services/ -type f -mtime 1 -exec cp /home/application/
I am not getting any file in the destination after updating the 1.xml file, so I added a new file 4.xml, but even that does not show up in the destination directory.
How do I process recently updated or newly added files?
Thanks in advance.
Short answer:
pipe the output of find into xargs (or use -exec) to cp the files into another directory.
Long answer: as I recall (not tested), the -exec syntax is
find . -type f -mtime -1 -exec cp -t /destination/path/ {} +
{} is a placeholder that find replaces with the matched paths; with the + form it has to come right before the +, which is why GNU cp's -t option is used to name the destination directory first.
For xargs:
find . -type f -mtime -1 -print0 | xargs -0 -I {} cp {} /destination/path/
I do this often, but use \; instead of + and usually -cnewer rather than -mtime.
\; executes the cp command on each file individually instead of as a group.
+ executes the command on a group of paths, passing as many paths as will fit on one command line; it may do this multiple times if there are a lot of files.
The \ in front of the ; is required, or the shell will think the ; is the end of the command.
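To make the \; versus + difference concrete, a hedged sketch of both forms (the + form needs {} right before the +, so GNU cp's -t option is used to give the destination first):
find . -type f -mtime -1 -exec cp {} /destination/path/ \;     # one cp invocation per file
find . -type f -mtime -1 -exec cp -t /destination/path/ {} +   # one cp per batch of files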
find ./ -mtime -1 -exec cp {} /path/ \; -print
Use the -print at the end to get a list of the files that were copied.
