How can I exclude a directory but not its content using the find command? - linux

I use the following command in a scheduled Cron task to delete all files older than 30 days in a folder called Downloads:
find /home/orschiro/Downloads -path ./Archiveror -prune -path ./Fairphone -prune -o -type f -mtime +30 -exec rm {} \; && notify-send "Searching for old files..."
With this, I want to exclude:
the folder Archiveror with all of its content (which it does)
and the folder Fairphone itself, while still checking its content and deleting files older than 30 days
How can I tell find not to delete the folder Fairphone itself, but to delete its content if it is older than 30 days?
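One possible approach (a sketch, not tested against this exact layout): because -type f already restricts matches to regular files, the Fairphone directory itself is never passed to rm, so only Archiveror needs to be pruned; note that -path has to match the path as find prints it, which starts with /home/orschiro/Downloads rather than ./ :
find /home/orschiro/Downloads -path /home/orschiro/Downloads/Archiveror -prune -o -type f -mtime +30 -exec rm {} \; && notify-send "Searching for old files..."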

Related

How to delete files in Linux which are older than 30 days and have a specific extension *.pdf

I have a peculiar challenge: we have one directory containing close to 15,000 PDF files, and the file names contain spaces (plus we have other config files which we are not supposed to touch).
I am trying to delete all the PDF files (please note the PDF file names contain spaces) from this directory that are older than 30 days / 1 month. How can I achieve this?
To find all PDF files on your Linux system that are more than 30 days old and delete them, you can use this command:
find / -regex '.*\.pdf' -type f -mtime +30 -exec rm {} \;
The / is the path under which the command recursively searches for PDF files.
The -regex '.*\.pdf' matches only PDF files.
The -type f matches only regular files.
The -mtime +30 matches files that are at least 30 days old (so a file that is, say, 32 days old is also deleted).
The -exec rm {} \; executes rm for each match, with {} replaced by the full file name found.
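As a sketch using a directory instead of the whole filesystem (the path below is just a placeholder), you could preview the matches with -print before deleting, and use -name instead of -regex since the pattern is simple; -exec also handles file names with spaces safely, because each name is passed to rm as a single argument:
find /path/to/pdf/dir -type f -name '*.pdf' -mtime +30 -print
find /path/to/pdf/dir -type f -name '*.pdf' -mtime +30 -exec rm {} \;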

Linux project: bash script to archive and remove files

I've been set a mini project to write a bash script to archive and remove files that are older than 'x' number of days. The files will be archived in the /nfs/archive directory and need to be compressed (tar) or removed... e.g. './test.sh 15' would remove files older than 15 days. Moreover, I also need to add some validation checks before removing files...
My code so far:
#!/bin/bash

# ProjectEssentials:

# TAR: allows you to back up files
# cronjob: schedule tasks
# command: find . -mtime +x -exec rm {} \;  -- this removes files older than 'x' number of days

# Move old .mov files to the archive folder
find /Users/alimohamed/downloads/nfs/CAMERA -type f -name '*.mov' -mtime +10 -exec mv {} /Users/alimohamed/downloads/nfs/archive/ \;

# TAR: this will allow for the compression
tar -cvzf doc.tar.gz /Users/alimohamed/downloads/nfs/archive/

# Backup before removing files 'cp filename{,.bak}'?
find /Users/alimohamed/downloads/nfs/CAMERA -type f -name '*.mov' -mtime +30 -exec rm {} \;
Any help would be much appreciated!!
Here is a modified script that fixes a few typos. Note that the backup file name includes a YYYY-MM-DD date, to allow for multiple backups (limited to one backup per day). TOP is used to make the script generic, so it works on any account.
X=15 # Number of days
TOP=~/downloads/nfs
# Move old files (>= X days) to the archive, via a work folder
mkdir -p "$TOP/work"
find "$TOP/CAMERA" -type f -name '*.mov' -mtime +"$X" -exec mv {} "$TOP/work" \;
# Create the daily backup (note the YYYY-MM-DD in the file name) from the work folder
tar -cvzf "$TOP/archive/doc.$(date +%Y-%m-%d).tar.gz" -C "$TOP/work" .
# Remove all files that were backed up, if needed
find "$TOP/work" -type f -name '*.mov' -exec rm {} \;

How to loop through multiple folders and subfolders and remove or move files whose names start with abc.txt and are 14 days old

I have a folder with subfolders. I need to loop through each folder and subfolder and remove, or move to a temporary folder, the files whose names start with abc.txt and that are 14 days old. My folder tree structure is:
The file 'abc.txt' may be inside the folder or in a subfolder.
I have used the code below, but it is not working.
I collected the folder paths into a list file using the command below:
find $_filepath -type d >> folderpathlist.txt
I then pass the path list to the code below to search for the files and remove or move them to the temporary folder:
find folderpathlist.txt -name "abc*" -mtime \+14 >>temp/test/
How do I achieve this?
You want to find files: -type f
that start with abc.txt: -name "abc.txt*"
that are 14 days old: -mtime +14
and move them to a dir.: -exec mv {} /tmp \;
and to see what moved: -print
So the final command is:
find . -type f -name "abc.txt*" -mtime +14 -exec mv {} /tmp \; -print
Adjust the directory as required.
Note that mtime is the modification time, so a file matches once at least 14 days have passed since it was last modified.
Note 2: the {} in the -exec is replaced by each filename found.
Note 3: \; indicates the termination of the command inside the -exec
Note 4: find will recurse into sub-directories anyway. No need to list the directories and loop on them again.
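If the files should go to the temp/test/ folder from the question instead of /tmp, a possible (untested) variant is to create that folder first and point -exec mv at it:
mkdir -p temp/test
find . -type f -name "abc.txt*" -mtime +14 -exec mv {} temp/test/ \; -print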

Remove all sub-directories and files except for one

I have a folder structure as follows:
/home/user/<individual_user>
In some of the <individual_user> folders there is a .bashrc file that I want to keep, however I want to remove all files and folders under /home/user/<individual_user> except that .bashrc file. All other files and subdirectories under <individual_user> should be deleted. There is an undetermined number of <individual_user> folders.
I would prefer to execute this command as a one-liner under cron.
After your edit, you can use:
find /home/user -mindepth 2 -not -path '*/.bashrc' -print
Once you are satisfied with the output, you can replace -print with -delete to make it:
find /home/user -mindepth 2 -not -path '*/.bashrc' -delete
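Since the question asks for this to run as a one-liner under cron, a sketch of a possible crontab entry (the daily 03:00 schedule is just an example) using this command:
0 3 * * * find /home/user -mindepth 2 -not -path '*/.bashrc' -delete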
How about this:
find /home/user ! -name .bashrc -exec rm -rf {} +
For obvious reasons, I haven't tested it ;)

Linux cron job remove .zip files from folder every 12 hours

I'm looking for help to make a cron job that will delete all .zip files from a subfolder in my HostGator hosting account every 12 hours.
http://prntscr.com/7rr22d
After some Google research I tried this command, but nothing seems to happen:
find /home/username/domain.com -type f -name "*.zip" |xargs rm
What should I put in the "Command:" field ?
Maybe try:
find /home/username/domain.com -name "*.zip" -exec rm -rf {} \;
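If the panel also lets you edit the crontab directly, a sketch of an every-12-hours entry (hedged; -type f is added so only regular files are removed, matching the original attempt) could look like:
0 */12 * * * find /home/username/domain.com -type f -name "*.zip" -exec rm -f {} \;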
