7zip archiving files that are newer than a specific date - Linux

I create 7zip files like this from command line in Linux:
# 7za a /backup/files.7z /myfolder
After that I want to create another archive that includes all files inside /myfolder that are newer than a given dd-mm-YY date.
Is it possible to archive files with respect to a file's last change time?
(I don't want to update "files.7z"; I need to create another archive that includes only the new files.)

The proposal by Gooseman:
# find myfolder -mtime -10 -exec 7za a /backup/newfile.7z {} \;
adds the whole tree of every directory that received new files (the directory itself also has a recent mtime), and then adds the new files a second time on their own.
The following includes only new files but does not store the path names in the archive:
# find myfolder -type f -mtime -10 -exec 7za a /backup/newfile.7z {} \;
This stores only the new files, with path names preserved:
# find myfolder -type f -mtime -10 > /tmp/list.txt
# tar -cvf /tmp/newfile.tar -T /tmp/list.txt
# 7za a /backup/newfile.7z /tmp/newfile.tar
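If your 7za build supports list files (p7zip does, via the @listfile syntax), the tar round-trip can be skipped; treat this as a sketch to verify against your version, since list-file support varies between builds:
# collect the new files, then feed the list straight to 7za
find myfolder -type f -mtime -10 > /tmp/list.txt
7za a /backup/newfile.7z @/tmp/list.txt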

You could try this command:
find myfolder -mtime -10 -exec 7za a /backup/newfile.7z {} \;
To work out the number to pass to the -mtime option, you could use some of the answers to:
How to find the difference in days between two dates? In your case it would be the difference between the current date and your custom dd-mm-YY (in my example, dd-mm-YY is 10 days back from now).
From man find:
-mtime n
File's data was last modified n*24 hours ago. See the comments for -atime to understand how rounding affects the interpretation of file modification times.
Numeric arguments can be specified as +n (greater than n), -n (less than n), or n (exactly n).
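As a sketch of that date arithmetic (assumes GNU date; the cutoff below is a placeholder, written YYYY-MM-DD because GNU date does not parse dd-mm-YY directly):
# compute whole days between now and the cutoff date
cutoff="2024-01-15"
days=$(( ( $(date +%s) - $(date -d "$cutoff" +%s) ) / 86400 ))
# then select files newer than that many days
find myfolder -type f -mtime -"$days" -print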

Related

How to delete files in Linux which are older than 30 days and have a specific extension *.pdf

I have a peculiar challenge: we have one directory with close to 15000 PDF files, and the file names contain spaces (plus other config files in the same directory which we are not supposed to touch).
I am trying to delete all the PDF files (note that the PDF file names contain spaces) from this directory which are older than 30 days/1 month. How can I achieve this?
To find all PDFs on your Linux system that are more than 30 days old and delete them, you can use this command:
find / -type f -regex '.*\.pdf' -mtime +30 -exec rm {} \;
The / is the path under which the command searches recursively for PDF files.
-regex '.*\.pdf' matches only PDF files.
-type f matches only regular files.
-mtime +30 matches files at least 30 days old (so it also deletes files that are, say, 32 days old).
-exec rm {} \; executes rm for each match, with {} replaced by the full name of the file found; find passes names with spaces safely this way.
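A simpler variant, assuming GNU find (-delete removes matches without spawning an rm per file, and -iname also catches names like report.PDF; /path/to/dir is a placeholder). File names with spaces are handled natively:
find /path/to/dir -type f -iname '*.pdf' -mtime +30 -delete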

Create a ZIP of hundreds of thousands of files newer than one year old on Linux

I have a /folder with over half a million files created in the last 10 years. I'm restructuring the process so that in the future there are subfolders based on the year.
For now, I need to back up all files modified within the last year. I tried
zip -r /backup.zip $(find /folder -type f -mtime -365)
but get the error: Argument list too long.
Is there any alternative to get the files compressed and archived?
Zip has an option to read the file list from stdin. Below is from the zip man page:
-@ file lists. If a file list is specified as -@ [Not on MacOS],
zip takes the list of input files from standard input instead of
from the command line. For example,
zip -@ foo
will store the files listed one per line on stdin in foo.zip.
This should do what you need:
find /folder -type f -mtime -365 | zip -@ /backup.zip
Note that I've removed the -r option because it isn't doing anything - you are explicitly selecting standard files with the find command (-type f)
You'll have to switch from passing all the files at once to piping them one at a time to the zip command:
find /folder -type f -mtime -365 | while IFS= read -r FILE; do zip /backup.zip "$FILE"; done
You can also work with the -exec parameter in find, like this:
find /folder -type f -mtime -365 -exec zip /backup.zip {} \;
(or whatever your command is). For every file found, the given command is executed with the file name passed as the last parameter.
Find the files, then execute the zip command on as many files as possible at once by terminating -exec with + instead of ;:
find /folder -type f -mtime -365 -exec zip /backup.zip '{}' +
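Another way around the argument-list limit, assuming GNU find and xargs: batch the names across several zip runs (zip appends to an existing archive each time), with -print0/-0 keeping names that contain spaces intact:
find /folder -type f -mtime -365 -print0 | xargs -0 zip /backup.zip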

How to loop through multiple folders and subfolders and remove files whose names start with abc.txt and are 14 days old

I have folders and subfolders. I need to loop through each folder and subfolder and remove, or move to a temporary folder, the files whose names start with abc.txt and are 14 days old. My folder tree structure is:
The file may be inside a folder or subfolder as 'abc.txt'.
I have used the code below, but it is not working.
I took the folder paths into a list.txt file using the command below:
find $_filepath -type d >> folderpathlist.txt
I pass the path list to the code below to search and remove or move files to the temporary folder:
find folderpathlist.txt -name "abc*" -mtime \+14 >>temp/test/
How do I achieve this scenario?
You want to find files: -type f
that start with abc.txt: -name "abc.txt*"
that are 14 days old: -mtime +14
and move them to a dir.: -exec mv {} /tmp \;
and to see what moved: -print
So the final command is:
find . -type f -name "abc.txt*" -mtime +14 -exec mv {} /tmp \; -print
Adjust the directory as required.
Note that mtime is the modification time, so "14 days old" means the last modification to the file was at least 14 days ago.
Note 2: the {} in the -exec is replaced by each filename found.
Note 3: \; indicates the termination of the command inside the -exec
Note 4: find will recurse into sub-directories anyway. No need to list the directories and loop on them again.
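A batched variant of the same move, assuming GNU mv (its -t flag names the target directory up front, so find can append many file names per invocation; /tmp/test/ is a placeholder destination):
find . -type f -name "abc.txt*" -mtime +14 -exec mv -t /tmp/test/ {} +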

Copy N days old files on Linux

Good morning,
I have many files inside directories and subdirectories, and I'm currently copying everything with:
find /tmp/temp/ -name '*files.csv' -type f -exec cp -u {} /home/dir/Desktop/dir1/ \;
I was wondering if there is any way to copy only files whose modification date is within the last two days. I don't want to copy files modified more than two days before the current date.
You can use -mtime within your find command:
find /tmp/temp/ -type f -mtime -2 -name '*files.csv' -exec cp -u {} /home/dir/Desktop/dir1/ \;
This copies only files with a modification time within the last two days of the system time.
-mtime n
File's data was last modified n*24 hours ago
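If you ever need an exact calendar cutoff rather than "n*24 hours ago", GNU find's -newermt accepts a date string; a sketch, with a placeholder date:
find /tmp/temp/ -type f -name '*files.csv' -newermt "2024-01-01" -exec cp -u {} /home/dir/Desktop/dir1/ \;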

In the Linux shell, how to cp/rm files by time?

In the Linux shell, when I run
ls -al -t
it shows the times of files.
How do I cp/rm files by time? For example, copy all the files that were created today or yesterday. Thanks a lot.
Depending on what you actually want to do, find provides -[acm]time options for finding files by access, status-change, or modification time, along with -newer and the minute-based -[acm]min variants. You can combine them with -exec to copy, delete, or whatever you want to do. For example:
find -maxdepth 1 -mtime +1 -type f -exec cp '{}' backup \;
Will copy all the regular files in the current directory more than 1 day old to the directory backup (assuming the directory backup exists).
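For the calendar sense of "modified today", a sketch assuming GNU find, whose -daystart option measures ages from the start of today instead of from 24 hours ago:
find -maxdepth 1 -daystart -mtime 0 -type f -exec cp '{}' backup \;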
Simple example:
find /path/to/folder/ -mtime 1 -exec rm {} \;   # deletes files modified yesterday (between 24 and 48 hours ago)
For more examples, search the web for "bash find time".
