find directory older than 3 days and zip all files in it - linux

Can I find directories matching a condition such as "older than 3 days", zip them, and then delete the directories?
There are two things I want to do.
1. Zip all matching directories into one zip file under the working directory.
I tried:
zip -rm "${WORKDIR}/$(date +%Y%m%d -d "${DAY_TO_ZIP} days ago").zip" $(find ${WORKDIR} -daystart -mtime +${DAY_TO_ZIP} -type d ! -name "*.zip")
but this command zips all files, including non-directory files.
2. Create one zip per directory, at the same path as the directory.
Thank you very much.

Execute the command below to find all directories older than 3 days and zip all their files:
# find / -mtime +3 -type d -exec zip -r zipfile.zip {} +
-mtime +3 matches files modified more than 3 days ago.
-mtime -3 matches files modified less than 3 days ago.
-mtime 3 (with no + or -) matches files modified exactly 3 days ago.
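The three forms are easy to check with a scratch directory (GNU `touch -d` and `find` assumed):

```shell
# Demonstrate -mtime matching (GNU coreutils/findutils assumed)
tmp=$(mktemp -d)
touch -d "5 days ago" "$tmp/old.log"   # modified more than 3 days ago
touch -d "1 day ago"  "$tmp/new.log"   # modified less than 3 days ago

find "$tmp" -type f -mtime +3   # prints only old.log
find "$tmp" -type f -mtime -3   # prints only new.log

rm -rf "$tmp"
```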
Finally, if you want to delete the directories, execute the command below (note rm -rf; plain rm -f cannot remove directories):
# find / -mtime +3 -type d -exec rm -rf {} \;
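The question's second variant (one zip per directory, written alongside the directory) is not covered above; a hedged sketch, reusing the question's WORKDIR variable (the default path is a placeholder), deleting each directory only after its zip succeeds:

```shell
#!/bin/bash
# One zip per directory, created at the same path as the directory.
WORKDIR=${WORKDIR:-/path/to/work}   # placeholder default

find "$WORKDIR" -mindepth 1 -daystart -mtime +3 -type d ! -name "*.zip" |
while IFS= read -r dir; do
    # Remove the directory only if zip exited successfully
    zip -qr "$dir.zip" "$dir" && rm -rf "$dir"
done
```

Note that `find` lists parent directories before their children, so once a parent is zipped and removed, the already-queued child paths simply no longer exist and are skipped.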

find ./ -mtime +x -print -exec gzip {} \;

Related

How do I automatically delete 3-day-old backup files on the server? My backup files look like:

web_20_10_2022
web_21_10_2022
web_22_10_2022
web_23_10_2022
web_24_10_2022
web_25_10_2022
I need an auto-delete script, run from a cron job, that removes backups older than 3 days from the server.
Below are some commands I have tried, but they are not working, so please help me out!
find /home -type f -mtime +3 -exec rm -rf {} +
find /home* -mtime +3 -type f -delete
find /home -mtime +3 -type d -exec rmdir {} \;
Kindly run the command below to find the files older than 3 days:
find /path/to/directory/* -mtime +3 -print
This lists the older files in the mentioned directory.
To delete the 3 day old files:
find /path/to/directory/* -type f -mtime +3 | xargs rm -f
or
find /path/to/directory/* -type f -mtime +3 -exec rm -f {} \;
or
find /path/to/directory/* -type f -mtime +3 -delete
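Before running any of the delete variants, it is cheap to preview what will match; swapping the delete action for -print is a dry run, and the identical predicates guarantee the identical file list:

```shell
# Dry run: -print shows exactly what -delete would remove
find /path/to/directory -type f -mtime +3 -print

# Real run: -delete must come last, after all the tests
find /path/to/directory -type f -mtime +3 -delete
```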
To automate the process:
Copy the code below and save the file in a directory where you are not performing any housekeeping activity (so the script cannot delete itself):
#!/bin/bash
find /path/to/directory/* -mtime +3 -delete
Make the file executable (chmod +x) and set up a cron job:
crontab -e
To run the script every hour, enter:
00 * * * * /path/to/executablefile
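A sketch of the full setup; the script path /opt/housekeeping/cleanup.sh is a placeholder, and piping into `crontab -` installs the entry without opening an editor:

```shell
# Create the cleanup script (path is a placeholder; pick any directory
# that the script itself will never delete from)
mkdir -p /opt/housekeeping
cat > /opt/housekeeping/cleanup.sh <<'EOF'
#!/bin/bash
find /path/to/directory -mindepth 1 -mtime +3 -delete
EOF
chmod +x /opt/housekeeping/cleanup.sh

# Append the hourly entry to the current crontab non-interactively
( crontab -l 2>/dev/null; echo '00 * * * * /opt/housekeeping/cleanup.sh' ) | crontab -
```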

Linux project: bash script to archive and remove files

I've been set a mini project: write a bash script that archives and removes files older than 'x' days. The files will be archived in the /nfs/archive directory and need to be compressed (tar) before the originals are removed; e.g. './test.sh 15' would remove files older than 15 days. Moreover, I also need to add some validation checks before removing files...
My code so far:
> #!/bin/bash
>
> # ProjectEssentials:
>
> # TAR: allows you to back up files
> # cronjob: schedule tasks
> # command: find . -mtime +('x') -exec rm {} \;  this will remove files older than 'x' number of days
>
> find /Users/alimohamed/downloads/nfs/CAMERA -type f -name '*.mov' -mtime +10 -exec mv {} /Users/limohamed/downloads/nfs/archive/ \;
>
> # TAR: This will allow for the compression
>
> tar -cvzf doc.tar.gz /Users/alimohamed/downloads/nfs/archive/
>
> # Backup before removing files 'cp filename{,.bak}'?
> find /Users/alimohamed/downloads/nfs/CAMERA -type f -name '*.mov' -mtime +30 -exec rm {} \;
Any help would much appreciated!!
Modified script to fix a few typos. Note that the backup file name includes YYYY-MM-DD, to allow for multiple backups (limited to one backup per day). Using TOP makes the script generic, so it works on any account.
X=15  # Number of days
# Move old files (>= X days) to archive, via a work folder
TOP=~/downloads/nfs
mkdir -p "$TOP/work" "$TOP/archive"
find "$TOP/CAMERA" -type f -name '*.mov' -mtime +"$X" -exec mv {} "$TOP/work" \;
# Create the daily backup (note the YYYY-MM-DD in the file name) from the work folder
tar -cvzf "$TOP/archive/doc.$(date +%Y-%m-%d).tar.gz" -C "$TOP/work" .
# Remove all files that were backed up, if needed
find "$TOP/work" -type f -name '*.mov' -exec rm {} \;

How do I recover files that disappeared after wrong MV command format?

I'm trying to move files from the current directory to another directory by date, but I accidentally used the wrong target format:
find . -maxdepth 1 -mtime +365 -type f -exec mv "{}" "..\folder" \;
instead of
find . -maxdepth 1 -mtime +365 -type f -exec mv "{}" "../folder" \;
Then my files just disappeared.
I can't seem to find them anywhere. I've looked in both the target and source directories, and even in the non-existent directory I accidentally sent the files to.
I would just like to know if I can still recover the files.
They're all gone. When you run:
find . -maxdepth 1 -mtime +365 -type f -exec mv "{}" "..\folder" \;
you are executing, for every file, the command:
mv filename '..\folder'
In other words, you renamed every file to a single file literally named ..\folder (on Linux the backslash is an ordinary filename character, not a path separator). Each file overwrote the one before it. The contents of the ..\folder file are whatever file your command processed last; all the rest are gone.
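Nothing can bring the data back, but GNU mv has two flags that make this class of accident fail loudly instead of silently: -t names the target directory up front, and -n refuses to overwrite existing files:

```shell
# -t requires its argument to be an existing directory, so a malformed
# path like ..\folder makes every mv fail with an error instead of
# renaming files on top of each other; -n prevents any overwriting
find . -maxdepth 1 -mtime +365 -type f -exec mv -n -t ../folder {} +
```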

MOVING files and placing them into folders according to a text file

I need to move files from ORIGIN and place them in DESTINATION according to the information contained in the text file "toto.txt".
I do NOT know how to code the part that says:
"place these files according to the information contained in toto.txt, which states
the sub-folder structure of the DESTINATION folder"
toto.txt contains the folder structure of ORIGIN, and the files must be moved to DESTINATION while keeping their original folder-structure locations.
# My working Paths
MY_DIR1="/media/nss/MBVOL1/TEST/ORIGIN"
MY_DIR2="/media/nss/MBVOL1/TEST/DESTINATION"
# Flag files older than 1 day and list their names (full paths) in the "TOTO" text file
echo "REPORT Created"
cd $MY_DIR1 && find . -mindepth 0 -maxdepth 40 -mtime +1 -type f > toto.txt
cp $MY_DIR1/toto.txt /$MY_DIR2
# Flag files older than 1 day then MOVE the files to the "DESTINATION" folder
echo "FILES Moved"
find $MY_DIR1 -mindepth 0 -maxdepth 400 -type f -mtime +14 -exec mv '{}' $MY_DIR2 \;
Try this:
cd "$MY_DIR1"
# Duplicate directory structure
find . -type d -exec mkdir -p "$MY_DIR2"/{} \;
# move files older than 1 day
find . -type f -mtime +1 -exec mv {} "$MY_DIR2"/{} \;
You can combine them into one command:
find . -type d -exec mkdir -p "$MY_DIR2"/{} \; -o -type f -mtime +1 -exec mv {} "$MY_DIR2"/{} \;
Use something like this...
while IFS= read -r FILE ; do
    mv -v "${MY_DIR1}/${FILE}" "${MY_DIR2}"
done < "${MY_DIR2}/toto.txt"
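If the point of toto.txt is to preserve the ORIGIN layout under DESTINATION, the loop also needs to recreate each file's parent directory before moving it; a sketch using the same variables (toto.txt entries are assumed to look like ./sub/dir/name.ext, as produced by `find .`):

```shell
# Move each listed file, recreating its sub-directory under DESTINATION
while IFS= read -r FILE; do
    mkdir -p "${MY_DIR2}/$(dirname "$FILE")"
    mv -v "${MY_DIR1}/${FILE}" "${MY_DIR2}/${FILE}"
done < "${MY_DIR2}/toto.txt"
```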

How to delete all files older than 3 days when "Argument list too long"?

I've got a log file directory that has 82000 files and directories in it (about half and half).
I need to delete all the file and directories which are older than 3 days.
In a directory that has 37000 files in it, I was able to do this with:
find * -mtime +3 -exec rm {} \;
But with 82000 files/directories, I get the error:
/usr/bin/find: Argument list too long
How can I get around this error so that I can delete all files/directories that are older than 3 days?
To delete all files and directories within the current directory:
find . -mtime +3 | xargs rm -Rf
Or alternatively, more in line with the OP's original command:
find . -mtime +3 -exec rm -Rf -- {} \;
You can also use:
find . -mindepth 1 -mtime +3 -delete
The -mindepth 1 stops find from trying to delete the starting directory itself.
Another solution to the original question, especially useful if you want to remove only SOME of the older files in a folder, would be something like this:
find . -name "*.sess" -mtime +100
and so on. The quotes stop the shell from expanding the wildcard, so find itself does the matching and can handle millions of files :)
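A related safety note for the xargs variants above: null-delimiting the names keeps spaces and newlines in filenames from splitting arguments, and xargs batches the list itself, so the argument-list limit never applies:

```shell
# Null-delimited pipeline: safe for any filename, no "Argument list too long"
find . -mindepth 1 -mtime +3 -print0 | xargs -0 rm -rf --
```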
