How can I remove folders with crontab? I created a new file called delete_old_file.sh and call it from crontab:
* * * * * . ~/delete_old_files.sh
I already tried with -type d but it is still not working:
find ~/dev/test -type f \( -name "*.gz" -o -name "*.tar" -o -name "*.zip" \) -or -type -d -mtime +7 -exec rm -rf {} +
For now only the files (gz/tar/zip) get removed, not the folders:
find ~/dev/test -type f \( -name "*.gz" -o -name "*.tar" -o -name "*.zip" \) -mtime +7 -exec rm -rf {} +
Did I miss some command?
Thanks.
Use the second form and change the last + to \;
I put them on two separate lines:
find ~/dev/test -type d -mtime +7 -exec rm -rf {} \;
find ~/dev/test -type f \( -name "*.gz" -o -name "*.tar" -o -name "*.zip" \) -mtime +7 -exec rm -rf {} +
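If you want a single command that removes both the old archives and the old folders, here is a minimal sketch assuming GNU find. Note the first attempt has a typo (-type -d instead of -type d), and grouping matters because -mtime and -exec otherwise bind only to the last term; -mindepth 1 protects ~/dev/test itself, and -prune stops find from descending into a folder it is about to remove:
find ~/dev/test -mindepth 1 -mtime +7 \( -type d -prune -o -type f \( -name "*.gz" -o -name "*.tar" -o -name "*.zip" \) \) -exec rm -rf {} +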
I would like to know if there is any way to write a message (from echo) together with the find command output into a log file.
Current:
/mnt/backup/XXXX/Daily/Logs/20210326.log
Code:
logfile=$(date +"%Y%m%d")
find /mnt/backup/XXXX/Daily/Logs -type f -name "*.log" -mtime +6 -print -exec rm {} \; >> /mnt/backup/XXXX/Daily/Logs/$logfile.log
Expected result:
Deleted file - /mnt/backup/XXXX/Daily/Logs/20210326.log
Remark: the "Deleted file - " text comes from echo; the file path comes from the find command output.
Use a second -exec, like so:
find /mnt/backup/XXXX/Daily/Logs -type f -name "*.log" -mtime +6 -exec rm {} \; -exec echo 'Deleted file - {}' \; >> /mnt/backup/XXXX/Daily/Logs/$logfile.log
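If this is GNU find, an alternative minimal sketch logs the message and deletes in one pass, without a separate rm (assumes GNU find's -delete and -printf; the message is printed only when the deletion succeeded):
find /mnt/backup/XXXX/Daily/Logs -type f -name "*.log" -mtime +6 -delete -printf 'Deleted file - %p\n' >> /mnt/backup/XXXX/Daily/Logs/$logfile.log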
I am struggling to list all the files in the current directory with .pdf, .xls, .ser and .csv extensions which are more than 30 days old.
I am using the command:
find $Path -maxdepth 1 -mtime +33 -type f \(-iname "*pdf" -o -iname "*xls" -o -iname "*ser" -o -iname "*csv"\) | xargs ls -ltr >> ${LOG_OUT};
but I am receiving an error:
find: paths must precede expression: (-iname
Usage: find [-H] [-L] [-P] [-Olevel] [-D help|tree|search|stat|rates|opt|exec] [path...] [expression]
Try this:
find $Path -maxdepth 1 -mtime +33 -type f \( -iname "*pdf" -o -iname "*xls" -o -iname "*ser" -o -iname "*csv" \) | xargs ls -ltr >> ${LOG_OUT};
You need a space after \( and before \).
Also, you do not need | xargs; try this:
find $Path -maxdepth 1 -mtime +33 -type f \( -iname "*pdf" -o -iname "*xls" -o -iname "*ser" -o -iname "*csv" \) -exec ls -ltr {} \; >> ${LOG_OUT}
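If you want ls -ltr to sort all of the matches together (with -exec ... \; ls runs once per file, so -t has nothing to sort), the + form of -exec hands the file names to ls in batches; a minimal variant of the same command:
find $Path -maxdepth 1 -mtime +33 -type f \( -iname "*pdf" -o -iname "*xls" -o -iname "*ser" -o -iname "*csv" \) -exec ls -ltr {} + >> ${LOG_OUT}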
I have the following Bash code on Linux; how can I modify it to append the datestamp after gzip has finished its work?
DOMINIO=filenetvers
DATAORA_ATTUALI=$(date +"%Y.%m.%d")
GGZIP=1
GGRM=90
find /work/pr-${DOMINIO}-0[0-2]/servers -name "*.log*[^gz]" -type f -user bea -mtime +${GGZIP} -exec /usr/bin/gzip -9 -f {} "*.gz.$DATAORA_ATTUALI" \;
find /work/pr-${DOMINIO}-0[0-2]/servers -name "*.stdout*[^gz]" -type f -user bea -mtime +${GGZIP} -exec /usr/bin/gzip -9 -f {} "*.gz.$DATAORA_ATTUALI" \;
find /work/pr-${DOMINIO}-0[0-2]/servers -name "*.stderr*[^gz]" -type f -user bea -mtime +${GGZIP} -exec /usr/bin/gzip -9 -f {} "*.gz.$DATAORA_ATTUALI" \;
Use this to preview what find would run (the echo in front prints each cp command instead of executing it):
find ./ -type f -name "nsshow*" -exec echo cp {} /tmp/{}_test \;
cp ./nsshow_SANSW06_FABB /tmp/./nsshow_SANSW06_FABB_test
cp ./nsshow_SANSW02_FABB /tmp/./nsshow_SANSW02_FABB_test
cp ./nsshow_SANSW05_FABA /tmp/./nsshow_SANSW05_FABA_test
cp ./nsshow_SANSW01_FABA /tmp/./nsshow_SANSW01_FABA_test
And this to actually run it:
find ./ -type f -name "nsshow*" -exec cp {} /tmp/{}_test \;
The above "for" loop is nice and simpel for Advanced work:
for f in $(find /work/pr-${DOMINIO}-0[0-2]/servers -name "*.log*[^gz]" -type f -user bea -mtime +${GGZIP})
do
    /usr/bin/gzip -9 -f "$f"
    mv "$f.gz" "$f.gz.$DATAORA_ATTUALI"
done
A simple solution is to use a for loop and do the gzip and rename on one line, like this:
for f in $(find /work/pr-${DOMINIO}-0[0-2]/servers -name "*.log*[^gz]" -type f -user bea -mtime +${GGZIP}) ; do /usr/bin/gzip -9 -f "$f" ; mv "$f.gz" "$f.gz.$DATAORA_ATTUALI"; done
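If any of the log names could contain spaces, here is a sketch that avoids the word splitting of $(find ...) by letting find hand the paths straight to a small shell; the datestamp is passed in as a positional parameter (assumes a POSIX sh):
find /work/pr-${DOMINIO}-0[0-2]/servers -name "*.log*[^gz]" -type f -user bea -mtime +${GGZIP} \
    -exec sh -c 'stamp=$1; shift; for f; do /usr/bin/gzip -9 -f "$f" && mv "$f.gz" "$f.gz.$stamp"; done' sh "$DATAORA_ATTUALI" {} +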
I am trying to create a small utility to collect log files from a remote host by creating a tarball; for simplicity, assume for now that it just displays a list of files based on user input.
This command works fine:
find $LOGS_DIR -maxdepth 1 -type f \( -name 'process1.log*' -o -name 'process2.log*' \) -exec echo 'FOUND_FILES:{}' ';'
If I want to programmatically update the -name clause based on the user input, say for example the user input is process3.log*, process4.log*, process5.log*, then my bash script should generate the find command as:
find $LOGS_DIR -maxdepth 1 -type f \( -name 'process3.log*' -o -name 'process4.log*' -o -name 'process5.log*' \) -exec echo 'FOUND_FILES:{}' ';'
Here is my snippet
...
for pattern in "${file_pattern_to_match[@]}"
do
    if [ -z $final_pattern ]; then
        final_pattern="-name $pattern"
        continue
    fi
    final_pattern="$final_pattern -o -name $pattern"
done
#This will print final_pattern: -name process3.log* -o -name process4.log* -o -name process5.log*
echo "final_pattern:$final_pattern"
find $LOGS_DIR -maxdepth 1 -type f \( $final_pattern \) -exec echo "FOUND_FILES:{}" \;
But the issue is that when the script executes, find is evaluated as:
find /x/path/logs -maxdepth 1 -type f \( -name process3.log.1 process3.log.2 -o -name process4.log.1 process4.log.2 \) -exec echo "FOUND_FILES:{}" \;
But the expected command is:
find /x/path/logs -maxdepth 1 -type f \( -name "process3.log.*" -o -name process4.log.* -o -name process5.log.* \) -exec echo "FOUND_FILES:{}" \;
Because the variable got expanded (word-split and glob-expanded by the shell), find exits with an error.
Can someone please help me get the expected result above?
Use an array to keep each argument properly quoted.
first=
for pattern in "${file_pattern_to_match[@]}"
do
    if [ -z "$first" ]; then
        final_pattern=(-name "$pattern")
        first=1
    else
        final_pattern+=(-o -name "$pattern")
    fi
done

# Hacky
# first=
# for pattern in "${file_pattern_to_match[@]}"
# do
#     final_pattern+=($first -name "$pattern")
#     first=-o
# done

find "$LOGS_DIR" -maxdepth 1 -type f \( "${final_pattern[@]}" \) -exec echo "FOUND_FILES:{}" \;
I have a script which deletes files older than 2 days in specific directories.
I would like to check whether a file with today's date has been created before removing the older files.
This is what I have:
#!/bin/bash
find /var/backups/server1 -type f -mtime +2 -exec rm {} \;
find /var/backups/server2 -type f -mtime +2 -exec rm {} \;
find /var/backups/server3 -type f -mtime +2 -exec rm {} \;
find /var/backups/server4 -type f -mtime +2 -exec rm {} \;
find /var/backups/server5 -type f -mtime +2 -exec rm {} \;
So basically:
1. Check the directory for a file with today's date.
2. If affirmative, run find /var/backups/serverX -type f -mtime +2 -exec rm {} \;
3. If not, execute scriptX (which may be a mail notification).
Thanks!
You could do something like this:
find /var/backups/ -maxdepth 1 -type d -print0 | while read -rd '' dirname
do
    arry=( $(find "${dirname}" -type f -mtime 0) )
    # Check whether there is a file that was modified today.
    [ "${#arry[@]}" -ge 1 ] && find "${dirname}" -type f -mtime +2 -exec rm {} \;
done
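To also cover step 3 (the notification), here is a sketch extending the same loop with an else branch; grep -q . just tests whether the inner find printed anything, -mindepth 1 skips /var/backups itself, and /path/to/scriptX is only a placeholder for your mail notification script:
find /var/backups/ -mindepth 1 -maxdepth 1 -type d -print0 | while read -rd '' dirname
do
    # Is there a file in this directory that was modified today?
    if find "${dirname}" -type f -mtime 0 | grep -q .; then
        find "${dirname}" -type f -mtime +2 -exec rm {} \;
    else
        /path/to/scriptX "${dirname}"   # placeholder: your mail notification
    fi
done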